# [Official] AMD R9 290X / 290 Owners Club



## Arizonian

[Official] AMD R9 290X / 290 Owners Club

Enthusiast GPU



*R9 290X & 290 Club Roster*

*To be added to the member list, please submit the following in your post:*

1. A GPU-Z validation link with your OCN name, a screenshot of the GPU-Z validation tab showing your OCN name, or a photo of your GPU with a piece of paper showing your OCN name.
2. Manufacturer & brand - please don't make me guess.
3. Cooling - stock, aftermarket, or third-party water.

_Please note: if you do not provide manufacturer, brand, or cooling, Sapphire with stock cooling will be chosen for you by default._

https://docs.google.com/spreadsheet/pub?hl=en&hl=en&key=0AgQn1aK9PRbrdENJMWhJcEJBOHg1eENKbmNvY2Q3X0E&single=true&gid=0&output=html&widget=true

*AMD Gaming Facebook*
*AMD Radeon Twitter*
*AMD YouTube*

*Club Signature (Use your specific GPU proudly)*








*[Official] AMD R9 290X Owners Club*









Code:
[CENTER] [URL=http://www.overclock.net/t/1436497/amd-r9-290x-290-owners-club#post_21044503][B] :clock:[Official] AMD R9 290X Owners Club :clock: [/B][/URL] [/CENTER]









*[Official] AMD R9 290 Owners Club*









Code:
[CENTER] [URL=http://www.overclock.net/t/1436497/amd-r9-290x-290-owners-club#post_21044503][B] :clock:[Official] AMD R9 290 Owners Club :clock: [/B][/URL] [/CENTER]

*StarYoshi Edit: The thread is official now. Go bonkers and bask in the wonderment this entails.*



*AMD Drivers Page*

*Latest Catalyst Software Suite*

*How-To Uninstall AMD Catalyst™ Drivers From A Windows® Based System*

*How-To Install AMD Catalyst™ Drivers For A Windows® Based System*

*Introducing the Catalyst™ Omega Driver for Windows®*

*AMD Catalyst™ Display Driver 14.4 WHQL for Windows®*

*AMD Beta Driver History*

*Note*: With the AMD Catalyst™ Software Suite now releasing alongside the new *Radeon™ 300 Series Graphics* cards, this list will no longer receive updates.

*AMD Catalyst™ Driver 14.12 Windows®*


Spoiler: Driver Info!



*Highlights of AMD Catalyst™ 14.11.2 Windows® Beta Driver Performance Improvements*
◾Dragon Age: Inquisition performance optimizations
- Up to 5% performance increase over Catalyst™ 14.11.1 beta in single GPU scenarios with Anti-Aliasing enabled.
- Optimized AMD CrossFire™ profile
◾Far Cry 4 performance optimizations
- Up to 50% performance increase over Catalyst™ 14.11.1 beta in single GPU scenarios with Anti-Aliasing enabled.

*Important Notes*
◾All previous versions of AMD Catalyst™ drivers must be completely uninstalled to receive full benefits of this driver. Please make sure to completely uninstall all previous AMD Catalyst™ display drivers before installing the 14.11.2 Beta.
◾The AMD CrossFire™ profile for Far Cry 4 is currently disabled in this driver while AMD works with Ubisoft to investigate an issue where AMD CrossFire™ configurations are not performing as intended. An update is expected on this issue in the near future through an updated game patch or an AMD driver posting.

*Resolved Issues*
◾[409235]: Small chance of intermittent screen tearing or corruption in Call of Duty®: Advanced Warfare on high settings 4K resolution in AMD CrossFire™ mode.
◾[408892]: World of Warcraft can sometimes exhibit corruption when using CMAA in AMD CrossFire™ configurations.
◾[407431]: Minecraft sometimes produces corruption when changing video settings in windowed mode.
◾[407338]: XDMA Quad CrossFire™ configurations in portrait Eyefinity modes sometimes display tearing or stuttering.

*Known Issues*
◾[408723]: System can sometimes hang when upgrading to Catalyst™ 14.11.2 from Catalyst™ 14.7 in AMD CrossFire™ configurations. As a workaround, please completely uninstall previous Catalyst™ software versions before installing the Catalyst™ 14.11.2 beta.
◾[409518]: Slight performance drops in FIFA 2015 on AMD CrossFire™ configurations.
◾[409502]: Occasional flickering sometimes observed while playing FIFA 2015 in AMD Dual Graphics configurations.
◾[409638]: Slight Battlefield 4 performance drop on AMD Radeon™ R9 290X in AMD CrossFire™ configuration.
◾[409628]: AMD Radeon™ R9 285 intermittently hangs in Hitman Absolution on new game start.
◾[408484]: AMD Radeon™ R9 285 can sometimes exhibit flickering in Assassins Creed Unity.
◾[409613]: Assassins Creed Unity can sometimes experience frame stutter on some AMD CrossFire™ configurations.
◾[408706]: Call of Duty: Advanced Warfare intermittent black screen when loading game in Quad AMD Crossfire™ configurations.
◾[409600]: Civilization: Beyond Earth Mantle users in AMD CrossFire™ configurations may sometimes experience an issue where they cannot change their game resolution. As a workaround, please set "Enable MGPU=1" in the game's configuration .ini file.

AMD is currently working with BioWare to resolve the following issues:
◾Flickering is sometimes observed in Dragon Age: Inquisition on a limited number of surfaces in AMD CrossFire™ configurations.

AMD is currently working with Ubisoft to resolve the following issues:
◾Uneven hair corruption sometimes observed in Assassins Creed Unity when applying "ultra" game settings.
◾Flickering occasionally observed between windows on walls in Assassins Creed Unity.
◾Windows/Doors intermittently flash with black textures in Assassins Creed Unity.
◾Assassins Creed Unity occasionally exits to desktop when "ultra" game settings are applied.



*AMD Catalyst™ Driver 14.9 Windows®*


Spoiler: Driver Info!



*Highlights of AMD Catalyst™ 14.9 Windows Driver*
◾ Support for the AMD Radeon R9 285
◾ Performance improvements (comparing AMD Catalyst 14.9 vs. AMD Catalyst 14.4)
◾3DMark Sky Diver improvements
◾AMD A4-6300 - improves up to 4%
◾Enables AMD Dual Graphics / AMD CrossFire support
◾3DMark Fire Strike
◾AMD Radeon R9 290 Series - improves up to 5% in Performance Preset
◾3DMark11
◾AMD Radeon R9 290 Series / R9 270 Series - improves up to 4% in Entry and Performance Preset
◾BioShock Infinite
◾AMD Radeon R9 290 Series - 1920x1080 - improves up to 5%
◾Company of Heroes 2
◾AMD Radeon R9 290 Series - improves up to 8%
◾Crysis 3
◾AMD Radeon R9 290 Series / R9 270 Series - improves up to 10%
◾Grid Auto Sport
◾AMD CrossFire profile
◾Murdered Soul Suspect
◾AMD Radeon R9 290X (2560x1440, 4x MSAA, 16x AF) - improves up to 50%
◾AMD Radeon R9 290 Series / R9 270 Series - improves up to 6%
◾CrossFire configurations improve scaling up to 75%
◾Plants vs. Zombies (Direct3D performance improvements)
◾AMD Radeon R9 290X - 1920x1080 Ultra - improves up to 11%
◾AMD Radeon R9 290X - 2560x1600 Ultra - improves up to 15%
◾AMD Radeon R9 290X CrossFire configuration (3840x2160 Ultra) - 92% scaling
◾Batman Arkham Origins:
◾AMD Radeon R9 290X (4x MSAA) - improves up to 20%
◾CrossFire configurations see up to a 70% gain in scaling
◾Wildstar
◾Power Xpress profile
◾Performance improvements for application smoothness
◾Performance improves up to 30% on the AMD Radeon R9 and R7 Series of products for both single GPU and Multi-GPU configurations
◾Tomb Raider
◾AMD Radeon R9 290 Series - improves up to 5%
◾Watch Dogs
◾AMD Radeon R9 290 Series / R9 270 Series - improves up to 9%
◾AMD CrossFire - Frame pacing improvement
◾Improved CrossFire performance - up to 20%
◾Assassin's Creed IV
◾Improves CrossFire scaling (3840x2160 High Settings) up to 93% (CrossFire scaling improvement of 25% compared to AMD Catalyst 14.4)
◾Lichdom
◾Improves performance for single GPU and Multi-GPU configurations
◾StarCraft II
◾AMD Radeon R9 290X (2560x1440, AA, 16x AF) - improves up to 20%
◾ AMD Eyefinity enhancements
◾Mixed Resolution Support
◾A new architecture providing brand new capabilities
◾Display groups can be created with monitors of different resolution (including different sizes and shapes)
◾Users have a choice of how surface is created over the display group
◾Fill - legacy mode, best for identical monitors
◾Fit - create the Eyefinity surface using best available rectangular area with attached displays
◾Expand - create a virtual Eyefinity surface using desktops as viewports onto the surface
◾Eyefinity Display Alignment
◾Enables control over alignment between adjacent monitors
◾One-Click Setup
◾Driver detects layout of extended desktop
◾Can create Eyefinity display group using this layout in one click!
◾ New user controls for video color and display settings
◾Greater control over Video Color Management:
◾Controls have been expanded from a single slider for controlling Boost and Hue to per color axis
◾Color depth control for Digital Flat Panels (available on supported HDMI and DP displays)
◾Allows users to select different color depths per resolution and display
◾ AMD Mantle enhancements
◾Mantle now supports AMD Mobile products with Enduro technology
◾Battlefield 4: AMD Radeon HD 8970M (1366x768; high settings) - 21% gain
◾Thief: AMD Radeon HD 8970M (1920x1080; high settings) - 14% gain
◾Star Swarm: AMD Radeon HD 8970M (1920x1080; medium settings) - 274% gain
◾Enables support for Multi-GPU configurations with Thief (requires the latest Thief update)
◾ AMD AM1 JPEG decoding acceleration
◾JPEG decoding acceleration was first enabled on the A10 APU Series in AMD Catalyst 14.1 beta, and has now been extended to the AMD AM1 Platform
◾Provides fast JPEG decompression
◾Provides Power Efficiency for JPEG decompression

*Resolved Issues*
◾60Hz SST flickering has been identified as an issue with non-standard display timings exhibited by the AOC U2868PQU panel on certain AMD Radeon™ graphics cards. A software workaround has been implemented in the AMD Catalyst 14.9 driver to resolve the display timing issues with this display
◾Users seeing flickering issues in 60Hz SST mode are further encouraged to obtain newer display firmware from their monitor vendor that will resolve flickering at its origin.
◾Users are additionally advised to utilize DisplayPort-certified cables to ensure the integrity of the DisplayPort data connection.
◾4K panel flickering issues found on the AMD Radeon R9 290 Series and AMD Radeon HD 7800 Series
◾Screen tearing observed on AMD CrossFire systems with Eyefinity portrait display configurations
◾Instability issues for Grid Autosport when running in 2x1 or 1x2 Eyefinity configurations
◾Geometry corruption in State of Decay

*Known Issues*
◾[404829]: Horizontal flashing lines on second screen in a clone mode with V-Sync on using AMD Mobility Graphics with Switchable Intel Graphics
◾[404508]: Display takes a long time to redraw the screen after an S4 cycle
◾[405432]: Mantle driver will TDR when running Star Swarm on the loading screen
◾[404660]: GPU gets stuck in a low power state after it was previously stressed to max power
◾[403032]: Severe flickering observed on default launch of SimCity 4
◾[403449]: Playing any media sample in full screen in the 2x1/1x2 Fill SLS configuration leads to TDR
◾[400573]: Intermittent application hang observed while launching Aliens vs. Predator
◾[400693]: While running performance test, crash is observed with fault module atidxx32.dll
◾[401386]: Severe corruption and flashing light observed with specific game settings in Grid 2
◾[401289]: Flashing lights in Batman Arkham Origins game main menu



*AMD Catalyst™ 14.7 RC Driver for Windows® Operating System*


Spoiler: Driver Info!



Last Updates: 7/10/2014
*Feature Highlights of the AMD Catalyst™ 14.7 RC Driver for Windows*
◾Includes all improvements found in the AMD Catalyst™ 14.6 RC driver
◾AMD CrossFire™ and AMD Radeon™ Dual Graphics profile update for Plants vs. Zombies 
◾Assassin's Creed IV - improved CrossFire scaling (3840x2160 High Settings) up to 93%
◾Collaboration with AOC has identified non-standard display timings as the root cause of 60Hz SST flickering exhibited by the AOC U2868PQU panel on certain AMD Radeon™ graphics cards. A software workaround has been implemented in AMD Catalyst 14.7 RC driver to resolve the display timing issues with this display.
◾Users are further encouraged to obtain newer display firmware from AOC that will resolve flickering at its origin.
◾Users are additionally advised to utilize DisplayPort-certified cables to ensure the integrity of the DisplayPort data connection. 
*Feature Highlights of the AMD Catalyst™ 14.6 RC Driver for Windows*
◾Plants vs. Zombies (Direct3D performance improvements):
◾AMD Radeon R9 290X - 1920x1080 Ultra - improves up to 11%
◾AMD Radeon R9 290X - 2560x1600 Ultra - improves up to 15%
◾AMD Radeon R9 290X CrossFire configuration (3840x2160 Ultra) - 92% scaling
◾3DMark Sky Diver improvements:
◾AMD A4-6300 - improves up to 4%
◾Enables AMD Dual Graphics/AMD CrossFire support
◾Grid Auto Sport:
◾AMD CrossFire profile
◾Wildstar:
◾Power Xpress profile
◾Performance improvements for application smoothness
◾Performance improves up to 24% at 2560x1600 on the AMD Radeon R9 and R7 Series of products for both single GPU and multi-GPU configurations.
◾Watch Dogs:
◾AMD CrossFire - Frame pacing improvements
◾Battlefield Hardline Beta:
◾AMD CrossFire profile
*Known Issues*
◾Running Watch Dogs with a R9 280X CrossFire configuration may result in the application running in CrossFire software compositing mode
◾Enabling Temporal SMAA in a CrossFire configuration when playing Watch Dogs will result in flickering
◾AMD CrossFire™ configurations with AMD Eyefinity enabled will see instability with BattleField 4 or Thief when running Mantle
◾Catalyst™ Install Manager text is covered by Express/Custom radio button text
◾Express Uninstall does not remove C:\Program Files\(AMD or ATI) folder



*AMD Catalyst™ 14.6 RC Driver for Windows® Operating System*


Spoiler: Driver Info!



Last Updates: 6/23/2014

◾Starting with AMD Catalyst 14.6 Beta, AMD will no longer support Windows 8.0 (and the WDDM 1.2 driver)
◾Windows 8.0 users should upgrade (for free) to Windows 8.1 to take advantage of the new features found in the AMD Catalyst 14.6 Beta
◾AMD Catalyst 14.4 will remain available for users who wish to remain on Windows 8.0
◾A future AMD Catalyst release will allow the WDDM 1.1 (Windows 7) driver to be installed under Windows 8.0 for users unable to upgrade to Windows 8.1

Feature Highlights of The AMD Catalyst™ 14.6 RC Driver for Windows
◾Plants vs. Zombies (Direct3D performance improvements):
◾AMD Radeon R9 290X - 1920x1080 Ultra - improves up to 11%
◾AMD Radeon R9 290X - 2560x1600 Ultra - improves up to 15%
◾AMD Radeon R9 290X CrossFire configuration (3840x2160 Ultra) - 92% scaling
◾3DMark Sky Diver improvements:
◾AMD A4-6300 - improves up to 4%
◾Enables AMD Dual Graphics / AMD CrossFire support
◾Grid Auto Sport:
◾AMD CrossFire profile
◾Wildstar:
◾Power Xpress profile
◾Performance improvements for application smoothness
◾Watch Dogs:
◾AMD CrossFire - Frame pacing improvements
◾Battlefield Hardline Beta:
◾AMD CrossFire profile
Known Issues
◾Running Watch Dogs with a R9 280X CrossFire configuration may result in the application running in CrossFire software compositing mode
◾Enabling Temporal SMAA in a CrossFire configuration when playing Watch Dogs will result in flickering
◾AMD CrossFire configurations with Eyefinity enabled may experience application instability with BattleField 4 or Thief when running Mantle
◾Catalyst Install Manager text is covered by Express/Custom radio button text
◾Express Uninstall does not remove the C:\Program Files\(AMD or ATI) folder



*AMD Catalyst™ 14.6 Beta Driver for Windows®*


Spoiler: Driver Info!



*Feature Highlights of The AMD Catalyst 14.6 Beta Driver for Windows*
*◾Performance improvements*
◾Watch Dogs performance improvements
◾AMD Radeon R9 290X - 1920x1080 4x MSAA - improves up to 25%
◾AMD Radeon R9 290X - 2560x1600 4x MSAA - improves up to 28%
◾AMD Radeon R9 290X CrossFire configuration (3840x2160 Ultra settings, MSAA = 4X) - 92% scaling
◾Murdered Soul Suspect performance improvements
◾AMD Radeon R9 290X - 2560x1600 4x MSAA - improves up to 16%
◾AMD Radeon R9 290X CrossFire configuration (3840x2160 Ultra settings, MSAA = 4X) - 93% scaling
◾AMD Eyefinity enhancements
◾Mixed Resolution Support
◾A new architecture providing brand new capabilities
◾Display groups can be created with monitors of different resolution (including different sizes and shapes)
◾Users have a choice of how surface is created over the display group
◾Fill - legacy mode, best for identical monitors
◾Fit - create the Eyefinity surface using best available rectangular area with attached displays.
◾Expand - create a virtual Eyefinity surface using desktops as viewports onto the surface.
◾Eyefinity Display Alignment
◾Enables control over alignment between adjacent monitors
◾One-Click Setup
◾Driver detects layout of extended desktops
◾Can create Eyefinity display group using this layout in one click!
◾New user controls for video color and display settings
◾Greater control over Video Color Management:
◾Controls have been expanded from a single slider for controlling Boost and Hue to per color axis
◾Color depth control for Digital Flat Panels (available on supported HDMI and DP displays)
◾Allows users to select different color depths per resolution and display

◾AMD Mantle enhancements
◾Mantle now supports AMD Mobile products with Enduro technology
◾Battlefield 4: AMD Radeon HD 8970M (1366x768; high settings) - 21% gain
◾Thief: AMD Radeon HD 8970M (1920x1080; high settings) - 14% gain
◾Star Swarm: AMD Radeon HD 8970M (1920x1080; medium settings) - 274% gain
◾Enables support for Multi-GPU configurations with Thief (requires the latest Thief update)
◾AMD AM1 JPEG decoding acceleration
◾JPEG decoding acceleration was first enabled on the A10 APU Series in AMD Catalyst 14.1 beta, and has now been extended to the AMD AM1 Platform
◾Provides fast JPEG decompression
◾Provides Power Efficiency for JPEG decompression

Known Issues
◾Running Watch Dogs with a R9 280X CrossFire configuration may result in the application running in CrossFire software compositing mode
◾Enabling Temporal SMAA in a CrossFire configuration when playing Watch Dogs will result in flickering
◾AMD CrossFire configurations with Eyefinity enabled may experience application instability with BattleField 4 or Thief when running Mantle
◾Catalyst Install Manager text is covered by Express/Custom radio button text
◾Express Uninstall does not remove C:\Program Files\(AMD or ATI) folder



*AMD Catalyst™ 14.4 Beta V1.0 Driver for Windows® Operating System*


Spoiler: Driver Info!



Feature Highlights of The AMD Catalyst™ 14.4 Release Candidate Driver for Windows
◾Support for the AMD Radeon™ R9 295X2
◾CrossFire™ fixes and enhancements:
◾Crysis 3 - frame pacing improvements
◾Far Cry 3 - 3 and 4 GPU performance improvements at high quality settings, high resolution settings
◾Anno 2070 - Improved CrossFire scaling up to 34%
◾Titanfall - Resolved in game flickering with CrossFire enabled
◾Metro Last Light - Improved Crossfire scaling up to 10%
◾Eyefinity 3x1 (with three 4K panels) no longer cuts off portions of the application
◾Stuttering has been improved in certain applications when selecting mid-Eyefinity resolutions with V-sync Enabled
◾Full support for OpenGL 4.4
◾OpenGL 4.4 supports the following extensions:
◾ARB_buffer_storage
◾ARB_enhanced_layouts
◾ARB_query_buffer_object
◾ARB_clear_texture
◾ARB_texture_mirror_clamp_to_edge
◾ARB_texture_stencil8
◾ARB_vertex_type_10f_11f_11f_rev
◾ARB_multi_bind
◾ARB_bindless_texture
◾ARB_sparse_texture
◾ARB_seamless_cubemap_per_texture
◾ARB_indirect_parameters
◾ARB_compute_variable_group_size
◾ARB_shader_draw_parameters
◾ARB_shader_group_vote
◾Mantle beta driver improvements:
◾BattleField 4: Performance slowdown is no longer seen when performing a task switch/Alt-tab
◾BattleField 4: Fuzzy images when playing in rotated SLS resolution with an A10 Kaveri system



*AMD Catalyst™ 14.3 Beta V1.0 Driver for Windows® Operating System*


Spoiler: Driver Info!



*Feature Highlights of The AMD Catalyst™ 14.3 Beta V1.0 Driver for Windows®*
◾Thief:
◾AMD Mantle and AMD True Audio support
◾Improves stuttering observed in CrossFire mode
◾Call of Duty: Ghosts: QUAD CrossFire profile update - improves level load times
◾Audio issues observed when using CrossFire configurations (and V-sync enabled) have been resolved
◾BattleField 4: V-sync issues observed on CrossFire configurations (with Mantle enabled) have been resolved

*Known Issues*
◾Intermittent driver stability issues when installing/un-installing on Desktop Kaveri platforms that support AMD Enduro technology under Windows 8.1. Please disable Enduro support to resolve the issue
◾Secondary GPUs do not enter low power state on CrossFire configurations; this issue will be addressed in the next AMD Catalyst beta release
◾Thief (DirectX): Lighting flickers on CrossFire configurations only after CrossFire has been enabled then disabled; this issue will be addressed in the next AMD Catalyst beta release
◾Battlefield 4 (DirectX): Quad CrossFire configurations with Eyefinity Display configurations suffer slowdowns and stability issues
◾Titanfall: Flickering occurs under AMD CrossFire configurations



*AMD Catalyst™ 14.2 Beta V1.3 Driver for Windows®*


Spoiler: Driver Info!



*Feature Highlights of The AMD Catalyst™ 14.2 Beta V1.3 Driver for Windows®*
◾Thief: CrossFire profile update and performance improvements for single GPU configurations
◾Mantle: Multi-GPU configurations (up to 4 GPUs) running Battlefield 4 are now supported
◾Frame Pacing for Dual Graphics and non-XDMA configurations above 2560x1600 are now supported with Battlefield 3 and Battlefield 4
◾Dual graphics DirectX 9 application issues have been resolved
◾Minecraft: Missing textures have been resolved
◾3D applications no longer see intermittent hangs or application crashes
◾Resolves corruption issues seen in X-Plane

*Known Issues*
◾Notebooks based on AMD Enduro or PowerXpress™ technologies are currently not supported by the Mantle codepath
◾Thief does not render the right eye when CrossFire and Stereo 3D is enabled



*AMD Catalyst™ Display Driver 14.1 beta for Windows®*


Spoiler: Driver Info!



Feature Highlights of The AMD Catalyst™ 14.1 Beta Driver for Windows

Support for the following new AMD Desktop APU (Accelerated Processors) products:
◾AMD A10-7850K
◾AMD A10-7700K

*Mantle Beta driver*

◾AMD's Mantle is a groundbreaking graphics API that promises to transform the world of game development to help bring better, faster games to the PC

◾Performance gain of up to 45% (versus the DirectX version) for Battlefield 4 on the R9 290 Series

◾Performance gain of up to 200% (versus the DirectX version) for Star Swarm on the R9 290 Series

◾AMD Catalyst 14.1 Beta must be used in conjunction with versions of these applications that support Mantle
◾It is expected that these applications will have future updates to support additional AMD Mantle features
◾AMD Mantle Beta driver is currently supported on:
◾AMD Radeon™ R9 Series GPUs
◾AMD Radeon™ R7 Series GPUs
◾AMD Radeon™ HD 7000 Series GPUs
◾AMD Radeon™ HD 8000 Series GPUs
◾AMD A10-7000 Series and AMD A8-7000 Series APUs
◾ For additional details please see the AMD Mantle Technology FAQ on amd.com

◾Enhanced AMD CrossFire frame pacing - Support for 4K panel and Eyefinity non-XDMA CrossFire solutions (including the AMD Radeon R9 280, 270 Series, 7900 Series, 7800 Series) and Dual Graphics configurations

◾ Frame pacing ensures that frames rendered across multiple GPUs in an AMD CrossFire configuration will be displayed at an even and regular pace
◾Supported on 4K panels and Eyefinity configurations
◾Supported on AMD Dual Graphics configurations
◾Supported on DirectX® 10 and DirectX 11 applications

Resolved Issues

◾Ground texture flickering seen in Total War: Rome 2 with high settings (and below) set in game
◾Flickering texture corruption when playing Call of Duty: Ghosts (multi-player) in the space station level
◾Black screen during Blu-ray playback using PowerDVD in extended mode
◾Streaming VUDU HD/HDX content on the Sharp PN-K321 (DP) causes the right-side half to flicker in and out
◾Black screen after waking the monitor
◾Full-screen issue on rotation in DX9 mode
◾Black screen in the video window when using Samsung Kies to play video
◾Crysis 2 negative scaling in outdoor scenes
◾Crysis 2 has insufficient CrossFire scaling in some scenes
◾Red Faction: The game has no or negative CrossFire scaling with DX9 and DX11
◾Age of Conan has corruption and performance issues with CrossFire enabled
◾Company of Heroes shadows are corrupted when using CrossFire
◾Resident Evil 5's performance is unstable when the display mode is set to windowed mode
◾Total War: Shogun 2 flickering menu/text
◾Frame rate drop when disabling post-processing in 3DMark06
◾Negative CrossFire scaling with The Secret World in DX11 mode
◾F1 2012 crashes to desktop
◾Tomb Raider hair simulation stutters on CrossFire configurations
◾Negative CrossFire scaling experienced in Call of Duty
◾Battlefield 3 performance drop on Haswell systems
◾Choppy playback of 4K video
◾Tearing with V-Sync on in 2x1 Eyefinity SLS CrossFire configurations
◾Far Cry 3 - game flickering while changing resolutions
◾Display corruption and BSOD when extending a display after disabling a multiple-GPU SLS array
◾Flickering seen when enabling three 4Kx2K panels at the same time
◾No video, just a black screen, when setting Chrome to "High Performance" while playing certain video clips
◾Image corruption in StarCraft

Known Issues

◾Mantle performance for the AMD Radeon™ HD 7000/HD 8000 Series GPUs and AMD Radeon™ R9 280X and R9 270X GPUs will be optimized for BattleField 4™ in future AMD Catalyst™ releases. These products will see limited gains in BattleField 4™ and AMD is currently investigating optimizations for them.
◾Multi-GPU support under DirectX® and Mantle will be added to StarSwarm in a future application patch
◾Intermittent stuttering or stability issues may occur when utilizing Mantle with AMD CrossFire™ technology in BattleField 4™ - AMD recommends using the DirectX code path when playing Battlefield 4 with multiple GPUs. A future AMD Catalyst release will resolve these issues
◾Notebooks based on AMD Enduro or PowerXpress™ technologies are currently not supported by the Mantle codepath in Battlefield 4™
◾AMD Eyefinity configurations utilizing portrait display orientations are currently not supported by the Mantle codepath in Battlefield 4™
◾AMD Eyefinity technology is not currently supported in the Star Swarm application
◾AMD testing for the AMD Catalyst™ 14.1 Beta Mantle driver has been concentrated on the following products: AMD Radeon™ R9 290X, R9 290, R9 280, R9 270, R7 260X, R7 260, HD 7000 Series, HD 8000 Series, A10-7850K and A10-7700K. Future AMD Catalyst™ releases will include full test coverage for all AMD products supported by Mantle.
◾Graphics hardware in the AMD A10-7850K and A10-7700K may override the presence of a discrete GPU under the Mantle code path in Battlefield 4™
◾Frame Pacing for Dual Graphics and non-XDMA configurations above 2560x1600 do not currently work with Battlefield 3 and Battlefield 4. An upcoming release will enable support
◾DX9 Dual graphics is not supported in AMD Catalyst 14.1 Beta. An upcoming release will enable support



*AMD Catalyst™ 13.11 Beta 9.5 for Windows®*


Spoiler: Driver Info!



*Resolves the issue of AMD Overdrive missing in the AMD Catalyst Control Center for the AMD Radeon™ R9 290 Series graphics cards
Resolves intermittent flickering seen on some AMD Radeon R9 270X graphics cards
Resolves graphics corruption seen in StarCraft®
Improves frame pacing results in AMD Quad CrossFire™ configurations for the following: Hitman: Absolution and Total War™: Rome 2*



*AMD Catalyst™ 13.11 Beta 9.4 Driver for Windows®*


Spoiler: Driver Info!



*May resolve intermittent black screens or display loss observed on some AMD Radeon™ R9 290X and AMD Radeon R9 290 graphics cards
Improves AMD CrossFire™ scaling in the multi-player portion of Call of Duty®: Ghosts

AMD Enduro Technology Profile updates:
XCOM: Enemy Unknown
Need for Speed Rivals*



*AMD Catalyst™ 13.11 Beta 9.2 for Windows®*


Spoiler: Driver Info!



*Call of Duty®: Ghosts - Improves anti-aliasing performance, and updates the AMD CrossFire™ profile*
*AMD Radeon™ R9 290 Series - PowerTune update to reduce variance of fan speed / RPM
Resolves intermittent crashes seen in legacy DirectX® 9 applications*



*AMD Catalyst™ 13.11 Beta 8 for Windows® *


Spoiler: Driver Info!



*Resolves intermittent crashes experienced with Battlefield 4 on Windows 8 based systems*



*AMD Catalyst™ 13.11 Beta 7 Driver for Windows®*


Spoiler: Driver Info!



*Increases AMD CrossFire™ scaling up to an additional 20% for Battlefield 4*



*AMD Catalyst™ 13.11 Beta 6 Driver for Windows®*


Spoiler: Driver Info!



Includes support for the new products:
*AMD Radeon™ R9 290X
AMD Radeon R9 290*

Performance improvements
* Batman: Arkham Origins - improves performance up to 35% with MSAA 8x enabled
Total War™: Rome 2 - improves performance up to 10%
Battlefield 3 - improves performance up to 10%
GRID 2 - improves performance up to 8.5%
DiRT Showdown - improves performance up to 10%
Formula 1™ 2013 - improves performance up to 8%
DiRT 3 - improves performance up to 7%
Sleeping Dogs - improves performance up to 5%
Automatic AMD Eyefinity Configuration 
Automatic "plug and play" configuration of supported Ultra HD/4K tiled displays*










*USEFUL SOFTWARE & INFO SECTION*









*Nvidia & AMD Driver Uninstall Utility*

*RadeonPro*

*Atiman Uninstaller v.7.0.2.msi*

*TechPowerUp GPU-Z*

*MSI Afterburner*

*Sapphire TriXX*

*ASUS GPU Tweak*

*OCCT*

*Fraps*

*Unigine Valley Benchmark*

*Futuremark 3DMark Benchmark*

*Memory Info Test*

*TechPowerUP Video BIOS Collection 290X*

*TechPowerUP Video BIOS Collection 290*

*TechPowerUp - BIOS Flashing - ATI Winflash 2.6.7*

*Stop Down Clocking by OCN member Sgt. Bilko*



Spoiler: Stop Down Clocking Instructions Using AB Beta 17



Using AB 3.0.0 Beta 17, once you have applied your overclock you must go back into CCC and reset the Power Limit to 50% again. Even though AB reports 50%, it is NOT 50% in CCC.

However, IF CCC is disabled, DO NOT enable Overdrive in CCC and your AB settings will work.

NOTE: Whenever you set a new clock and your screen flashes, you need to bump the power limit back up in CCC as well, as it resets every time the driver does.



*How to Uninstall GPU Drivers by OCN member BradleyW*









*How To Uninstall Your ATI GPU Drivers*

*How To Uninstall Your NVIDIA GPU Drivers*

*Unofficial OC Method by OCN member tsm106*








Do this ONLY if you actually need to



Spoiler: Disabling ULPS!



Quote:


> *Source* - *The AMD How To Thread*
>
> This assumes you have no previous instance of AB (Afterburner) installed. If you do, uninstall it or delete its folder. It is also generally a good idea to uninstall AB before any new driver install when you are CrossFired: AB will automatically run even when "run at Windows start" is disabled, and when it tries to fool with the clocks you will BSOD with a 07E code.
> 
> Run the Afterburner installer. If you choose the default install directory, you will run into security issues later on, so keep that in mind. I would recommend installing it in another folder/drive.
> 
> After the install completes, do not check the option to start Afterburner. If you are CrossFired, follow A- below to disable ULPS if you are using the Unofficial Overclock Method (UOM); you can skip this portion if you are using the Official Overclock Method (-xcl). However, be warned that Afterburner will not be able to see your sleeping cards if ULPS is enabled, so it's probably a good idea to disable ULPS anyway.
> 
> *A- Disabling ULPS*
> 
> Open regedit and go to:
> 
> HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}
> 
> This key, 4D36E968, under CurrentControlSet is the only folder you need to access. Ignore the others; don't search for it, just navigate directly to that folder. It's time to disable ULPS, or ultra low power savings. Inside the folder you will find more folders: 0000/0001/0002 and so on.
> 
> Open each folder and double-click EnableULPS and change it to 0. You do not need to change any other key, or any keys that look similar; just change EnableULPS. Close regedit. End of A-
> 
> At this point if you installed into default directory, you will have a security issue. Open file manager, go to where you installed afterburner and right click on MSIAfterburner.cfg and go to security, click edit, add yourself and give yourself full control, click ok.
> 
> Now go back and open MSIAfterburner.cfg in notepad.
> 
> Scroll to bottom and change this:
> 
> [ATIADLHAL]
> UnofficialOverclockingEULA =
> UnofficialOverclockingMode= 0
> 
> to this:
> 
> [ATIADLHAL]
> UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
> UnofficialOverclockingMode= 1
> 
> **Note on the mode setting. Some new games are causing clock conflicts with Powerplay creating flickering on the desktop. You can avoid this by choosing 2, instead of 1. I am running with Powerplay off. Look Mom, no flickering. Refer to the bottom of Post 2 (_in source 'The AMD How To Thread"_ ) for more info on this.**
> 
> Close and save MSIAfterburner.cfg. At this point download the file from the link above for AMD Clock Control files and run it. Do not reboot.
> 
> Now you can run afterburner for the first time. Afterburner will analyze the gpu asic and ask to reboot when ready. Choose yes to reboot. After reboot, you will have afterburner installed, you can then unlock voltage control and monitoring, etc. Happy overclocking.
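The registry walk in step A- above can be sketched in code. This is a simulation only: the real values live under the `{4D36E968-...}` class key and would be edited with Python's `winreg` module (or regedit) on Windows; a plain dict stands in for the 0000/0001/0002 subkeys here, and the value names shown are the ones the guide mentions.

```python
# Simulated registry: each adapter subkey may or may not carry EnableULPS.
registry = {
    "0000": {"EnableULPS": 1, "EnableUlps_NA": 1},  # look-alike value present
    "0001": {"EnableULPS": 1},
    "0002": {"DriverDesc": "display adapter"},      # no ULPS value at all
}

def disable_ulps(subkeys):
    """Set EnableULPS to 0 in every adapter subkey that has it; leave
    similar-looking values (e.g. EnableUlps_NA) untouched, as the guide
    instructs. Returns the names of the subkeys that were changed."""
    changed = []
    for name, values in subkeys.items():
        if "EnableULPS" in values:
            values["EnableULPS"] = 0
            changed.append(name)
    return changed

print(disable_ulps(registry))  # ['0000', '0001']
```

Only the exact `EnableULPS` value is touched; everything else, including near-namesakes, is left alone, which is the crucial point of the guide.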






*How much PSU power is required to run a 290X by OCN member SonDa5







*

*290X PSU Power Output Tests*

*How to give more volts on MSI Afterburner by OCN member sugarhell*











Spoiler: ADD more volts to MSI AB Guide



*Source*

Just use /wi4,30,8d,10 for 100mV. The offset unit is 6.25 mV and the value is given in hexadecimal, so hex 10 = 16 decimal, and 16 × 6.25 = 100 mV. For 50mV you need 8. For 200mV you need 20 (hex 20 = 32 decimal, and 32 × 6.25 = 200mV).

The easy way to make changes:

Create a txt file on the desktop containing:
CD "C:\Program Files (x86)\MSI Afterburner"
MSIAfterburner.exe /wi4,30,8d,10

and then save it as a .bat file. Every time you start this bat file, MSI Afterburner will start with +100mV.

For 50mv: 8
For 100mv:10
For 125mv:14
For 150mv:18
For 175mv:1C
For 200mv:20

I wouldn't go beyond this point because:
1) You are close to leaving the sweet spot of the reference PCB's VRM efficiency.
2) These commands add 200mV on top of the 100mV offset available through the AB GUI. That means 300mV total.

By default the /wi command applies to the current GPU only, so if you have 2 or more GPUs you must use the /sg command. The command line then looks something like:
ex: MsiAfterburner.exe /sg0 /wi6,30,8d,10 /sg1 /wi6,30,8d,10
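The arithmetic behind the table above can be sketched as follows: the last /wi argument is the desired extra voltage divided by the 6.25 mV step size, written in hexadecimal. `offset_to_hex` is a hypothetical helper name, not part of Afterburner.

```python
def offset_to_hex(extra_mv):
    """Convert a desired voltage offset in mV into the hex value used in
    the /wi command, given the 6.25 mV step size from the guide."""
    steps = extra_mv / 6.25
    if steps != int(steps):
        raise ValueError("offset must be a multiple of 6.25 mV")
    return format(int(steps), "X")

for mv in (50, 100, 125, 150, 175, 200):
    print(mv, offset_to_hex(mv))  # prints 8, 10, 14, 18, 1C, 20 in turn
```

Each result matches the 50mV-200mV table quoted above.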



*Need a good PSU? Check out the Comparison Threads by OCN member shilka*









*700-750 watts comparison thread*
*1000-1050 watts comparison thread*
*1200-1350 watts comparison thread*

*The R9 290(X) "Need to Know" by OCN member Roboyto*









*The R9 290(X) "Need to Know" Post*
*Omega Drivers Comparison by OCN member Roboyto*









*Omega Drivers Comparison*

*Sapphire R9 290 + Arctic Accelero Xtreme IV + VRM mod by OCN member Yuriewitsch*









*Sapphire R9 290 + Arctic Accelero Xtreme IV + VRM mod*

*Random Crashes Playing Fallout Fix by OCN member Alancsalt*











Spoiler: Random Crashes Playing Fallout Fix!



Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Having a big problem with my PC. So the past few days I've been getting random hard crashes while playing Fallout and only Fallout. Other games played just fine so i figured it was a mod conflict or something. But today I got a hard crash and restarted my PC.
> 
> I got a BSoD on boot, immediately after the Windows startup screen. I booted into safemode and did a clean uninstall and reinstall of both 14.9 and Omega drivers but the same thing happens. You might remember me mentioning a different bug with Omega drivers a few months ago, so I figured that I would just reinstall Windows and see if that would help.
> 
> Fast forward an hour and I got a new installation of Windows 7 and all my essential drivers. Then I installed Omega drivers and got a BSoD when rebooting. Clean uninstall and 14.9, same problem.
> 
> The BSoD says atikmdag and some other stuff, but it's too quick to read. I don't think it's a hardware issue because my cards were both working fine in all games and benchmarks besides Fallout. The PC also runs fine without AMD drivers installed Any idea of what to troubleshoot next? And how can I test if it's my videocard(s) if I can't even boot into Windows? I will try booting with just one GPU at a time but I don't think it will work.


Quote:


> Originally Posted by *alancsalt*
> 
> There's a setting in bios about not instantly rebooting on crash that gives you time to read usually OR program like "who crashed" will read your dumps and give you the BSOD number to check against the overclocking BSOD list.
> 
> There's probably a newer one, but:
> BSOD codes for overclocking
> 0x101 = increase vcore
> 0x124 = increase/decrease vcore or QPI/VTT...have to test to see which one it is
> 0x0A = unstable RAM/IMC, increase QPI first, if that doesn't work increase vcore
> 0x1E = increase vcore
> 0x3B = increase vcore
> 0x3D = increase vcore
> 0xD1 = QPI/VTT, increase/decrease as necessary
> 0x9C = QPI/VTT most likely, but increasing vcore has helped in some instances
> 0x50 = RAM timings/Frequency or uncore multi unstable, increase RAM voltage or adjust QPI/VTT, or lower uncore if you're higher than 2x
> 0x109 = Not enough or too Much memory voltage
> 0x116 = Low IOH (NB) voltage, GPU issue (most common when running multi-GPU/overclocking GPU)
> 0x7E = Corrupted OS file, possibly from overclocking. Run sfc /scannow and chkdsk /r
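The overclocking BSOD list above is essentially a code-to-remedy mapping; a quick sketch of using it as a lookup table (entries abbreviated from alancsalt's post; the helper name is made up):

```python
# Abbreviated code -> remedy mapping from the quoted BSOD list.
BSOD_HINTS = {
    0x101: "increase vcore",
    0x124: "increase/decrease vcore or QPI/VTT",
    0x50:  "RAM timings/frequency or uncore unstable",
    0x116: "low IOH (NB) voltage or GPU issue (common with multi-GPU/GPU OC)",
    0x7E:  "possibly corrupted OS file; run sfc /scannow and chkdsk /r",
}

def hint(code):
    """Look up a stop code (as reported by e.g. WhoCrashed) in the list."""
    return BSOD_HINTS.get(code, "not in this list; check a fuller BSOD table")

print(hint(0x116))
```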


Quote:


> Originally Posted by *chiknnwatrmln*
> 
> OK, my PC booted up today and I did a clean install of 14.9 drivers and it's working. I tried out the new MSI AB but I'm having an issue with it. Event though I click the setting to sync my two GPUs I have to manually change the clocks for each one individually. Also I can't increase the power limit. Even though I move the slider up and hit OK, it just resets itself to zero. I have unofficial overclocking mode on.
> 
> Also TY Arizonian, that will be helpful if I have that problem again in the future.






*Water Blocks and Back Plates*

*







R9 290X & 290







*

*EKWB - Water Block* - *LINK*
*EKWB - Back Plate* - *LINK* ( Not a stand alone )

*VID-AR290X Koolance Water Block (AMD Radeon R9 290, 290X)* - *LINK*

*XSPC - Razor R9 290X / 290 Water Block* - *LINK*
*XSPC - Razor R9 290X / 290 Backplate* - *LINK*

*Swiftech - KOMODO-R9-LE R9 290X / 290 Backplate* - *LINK*

*Aqua Computer - Kryographics Black Edition* - *LINK*
*Aqua Computer - Kryographics Black Edition / Nickel Plated* - *LINK*
*Aqua Computer - Kryographics Acrylic Glass Edition* - *LINK*
*Aqua Computer - Kryographics Acrylic Glass Edition / Nickel Plated* - *LINK*

*Aqua Computer - Kryographics Backplate, passive* - *LINK*
*Aqua Computer - Kryographics Backplate, active XCS* - *LINK*

*

*OCN Benchmark Links*

Got a good benchmark? Show your GPU scores with pride and feel free to participate representing the 290 and 290X. Remember to read the first page of each thread and follow its rules when benching, so your submitted scores count as valid.









*Single GPU Firestrike Top 30*

*Top 30 3d Mark 13 Fire Strike Scores in Crossfire / SLI
*
*Firestrike Extreme Top 30*

*Top 30 3DMark11 Scores for Single/Dual/Tri/Quad*

*[OFFICIAL]--- Top 30 --- Unigine 'Valley' Benchmark 1.0*

*Catzilla Top 30 Thread*

*OCN / AMD Related Sites*

*MSI R9 290X Lightning Thread*

*290/290X Black Screen Poll*

*xXCrossXFire ClubXx --Because one's not enough *

*The R9 290 -> 290X Unlock Thread*
*
Official AMD R9 295X2 Owners Club*
*
XFX Black/Double Dissipation Club!*


----------



## Arizonian

It's the beginning of a new era. Radeon™ is gaming.









*Ultra Resolution Gaming*

GPUs built for the demands of ultra-high resolution gaming in single and multi-display configurations: get the most out of every square inch of screen. Ultra resolution (four times higher than HD) gaming with rich details.

Reviewers Guide
Quote:


> Many UltraHD/4K monitors can achieve a 60 Hz refresh rate using a tiled display configuration. AMD Eyefinity technology can be leveraged to support these tiled displays by making two 2Kx2K tiles act as one 4Kx2K monitor. AMD has taken steps to make this easy for end users by providing an Automatic AMD Eyefinity Configuration feature. This feature allows an automatic "plug and play" configuration of supported UltraHD/4K tiled displays when a DisplayPort cable is connected.
> 
> When a user hotplugs a tiled 4K monitor (such as the Sharp PN-K321 or Asus PQ321Q), a 2x1 display group will be automatically created and the two tiles will be combined to act as one monitor. This configuration will be remembered and re-enabled when the display is unplugged or the system is rebooted. It is also possible to manually disable the display group in CCC, and have the two tiles act as independent monitors. It should be noted that once the display group is manually disabled, it can no longer be automatically configured as a tiled display via the Automatic AMD Eyefinity Configuration feature.
> 
> Additionally, AMD is supporting a new industry standard for tiled displays in VESA DisplayID v1.3. The "Tiled Display Topology Data Block" describes additional display capabilities that can be leveraged to enable the plug and play experience with tiled displays. Tiled displays supporting this data block can be supported by the Automatic AMD Eyefinity Configuration with no driver update required.


*TrueAudio Technology*

With the sonic brilliance of AMD TrueAudio1 technology, your games now sound as good as they look. More acoustic bandwidth and capabilities for developers to play with, plus a healthy slug of extra audio performance make for a richer in-game soundscape.



_AMD has implemented a fully programmable audio engine: TrueAudio technology ships on the new-architecture R9 290X & 290 with an onboard audio processor to reduce CPU load. Game developers can use a Wwise audio plugin on top of the AMD TrueAudio DSP. When implemented by game developers, approximately 14% of CPU load can be offloaded to the graphics card's audio processor._

There are multiple audio-optimized DSP cores
Tensilica HiFi2 EP instruction set
Tensilica Xtensa SP float support
The DSPs have 32KB instruction and data caches
8KB of scratch RAM for local operation

*Mantle*

There's optimization, and then there's Mantle. Games enabled with Mantle speak the language of the Graphics Core Next architecture to unlock revolutionary performance and image quality. It's a game-changing innovation developed by AMD.

*The four core principles of AMD's Mantle*


Essential principle #1: Helping developers
Essential principle #2: Helping PC gamers get better performance
Essential principle #3: Bringing innovation back to graphics APIs
Essential principle #4: Don't break games
Quote:


> In short, Mantle is a new and better way to bring the code developers are already writing for next-generation consoles to life on the PC. It achieves this by being similar to, and often compatible with, the code they are already writing for those platforms. The ultimate goal of Mantle is to give gamers the ultimate performance in compatible games, and doing that in such a way that developers are free to put forth whatever effort is required to ensure optimal performance for competing platforms.


*Source*
Quote:


> AMD claims that Mantle can enable up to nine times the number of draw calls issued from the CPU compared to existing APIs. Draw calls are often a bottleneck to graphics performance, with GPUs sometimes able to process more than CPUs can issue, so this can only be a good thing.
> 
> What makes Mantle particularly interesting is, of course, the fact that AMD has GCN hardware in all the new consoles. While the similarity between Mantle and the console APIs remains an unknown, cross-platform developers are already becoming familiar with GCN at a lower level and optimising their code for it, theoretically making ports to PC much easier and more scalable. *With this level of in-built support it's a strong bet Mantle will make some impact.*
> 
> While there will naturally be an overhead for developers to include Mantle support, AMD has garnered support from a key partner, specifically DICE. A free update to Battlefield 4, due in December, will enable Mantle support in the game, *potentially gracing AMD GPU users with a free and significant boost in performance*. While the full ramifications of Mantle thus remain to be seen, you can expect more info, including partner announcements and demos, at AMD's developer summit in November.


*AMD Mantle API poised to revolutionise PC gaming* - *Source* - *My Gaming*

*GCN Architecture*

Designed from the ground up for extraordinary performance with the most demanding games, this is the pinnacle of desktop graphics technology.

_GCN stands for "Graphics Core Next." It is the architecture that drives the Radeon R9 series. It is also the architecture that runs the Playstation 4 and XBox One. Through this architecture, AMD is able to support Mantle and DirectX 11.2. DirectX 11.2 is currently shipping with Windows 8.1._

DirectX 11.2 brings a host of new features to improve performance in your games and graphics apps: HLSL shader linking, an inbox HLSL compiler, GPU overlay support, DirectX tiled resources, and more. Bring new levels of visual realism to gaming on the PC and get top-notch performance.

*Powertune Technology*

Designed with the most advanced power management features, the AMD Radeon HD graphics card is equipped with the intelligence to convert unused power headroom into extra performance by dynamically controlling clock speeds, allowing gamers advanced performance inside of the power envelope they specify.

*Source*
The clock speed is dynamic on the R9 290X and R9 290 via a 2nd-generation serial VID interface (SVI2), a new controller interface designed by AMD. The controller allows 6.25mV voltage-step granularity and 255 voltage steps between 0.00V and 1.55V. There is a dedicated SVI2 telemetry line providing voltage and current feedback at 40KHz sampling and 20Mbps of telemetry bandwidth. The controller supports multiple voltage domains as well. Power, temperature, and application demands determine the frequency the GPU runs at.

AMD has set these GPUs to run at a default maximum temperature of 95°C. _AMD states that the thermal threshold for these GPUs is well above 95°C_.

The default fan speed is capped at 40% in 'quiet' mode and at 55% in 'uber' mode. You can manually set the fan to a higher cap, up to 100%. The fan is dynamic and adjusts automatically with the controller. The core clock will not boost above the set core clock, but may dynamically lower itself if temps rise above 95°C. Higher fan caps can alleviate higher temps and keep the core clock from being lowered dynamically.
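The SVI2 granularity described above is easy to put in numbers: each VID step is 6.25 mV (a value that is exactly representable in binary floating point, so the products below are exact). The step-index encoding itself is an assumption made for illustration, not something the post specifies.

```python
STEP_MV = 6.25  # voltage-step granularity quoted for the SVI2 controller

def vid_step_to_mv(step):
    """Convert a VID step index into millivolts (illustrative encoding)."""
    return step * STEP_MV

print(vid_step_to_mv(16))   # 100.0 mV
print(vid_step_to_mv(248))  # 1550.0 mV, i.e. the 1.55 V ceiling
```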

*Powerplay & AMD Zerocore Technology*

AMD PowerPlay power management technology automatically manages power consumption according to actual GPU loading. AMD ZeroCore Power technology enables lower idle power than any other currently available graphics card. With support for AMD CrossFire technology, AMD ZeroCore Power eliminates wasted power, as well as additional heat and noise from extra cards when you are not using them.

*APP Acceleration*

Powered by a set of innovative hardware and software technologies working in concert behind the scenes, AMD App Acceleration gives you enhanced speed and performance beyond traditional graphics and video processing.


Enjoy beautifully rich and clear video playback when streaming from the web
Take in your favorite movies in stunning, stutter-free HD quality
Run multiple applications smoothly at maximum speed
Enjoy lightning fast game play and realistic physics effects

*HD3D Technology*

AMD HD3D Technology is supported by an advanced and open ecosystem that, in conjunction with specific AMD hardware and software technologies, enables 3D display capabilities for many PC applications and experiences.

*Recommended 3D Displays*


Play your favorite PC games in stereo 3D across 3 displays with AMD Eyefinity 3D technology3
Connect to your display and watch the latest Blu-ray 3D movies with excellent display results with 3Ghz HDMI support1,4
Convert 2D Videos and Movies to stereoscopic 3D1
Make your photos come to life with 2D to 3D conversion support

*Eyefinity*

The next generation of AMD Eyefinity technology features all-new support for stereo 3D, universal bezel compensation and brand new display configurations, bringing you ultimate panoramic computing experience.

*Connections*

HDMI 1.4b
DisplayPort 1.2 with Multi-Streaming
Two DVI-D




*Crossfire*

AMD CrossFire support is built right in, offering you more than one way to elevate your gaming experience. With its compact design, this graphics card is perfect for standard ATX systems.

*AMD Radeon R9 290X To Feature New AMD CrossFireX Technology*

Where are the CrossFire connectors?

The missing CrossFire finger technology is called *Sideport*. This technology supports the CrossFire connection over the PCI Express bus rather than separate connectors. What does this mean? Goodbye CrossFire bridge connectors.









AMD has added a hardware DMA engine (Direct Memory Access) in the AMD CrossFire compositing block. The video cards now have dual DMA engines. This new technology is only present in the R9 290X and R9 290. CrossFire has been simplified in the new series, and as such AMD was able to get frame pacing working.

DMA engines allow for direct access between the GPUs solely over the PCI-Express bus. AMD has said the video cards can saturate PCIe 3.0 x16 bus bandwidth: 16GB/s bi-directional. There is no performance penalty for eliminating the external bridge. Sideport can improve scaling and efficiency between the GPUs, especially as you scale up to triple or quad cards.

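The quoted 16GB/s figure can be sanity-checked against standard PCIe 3.0 parameters (8 GT/s per lane, 128b/130b line encoding; these parameters come from the PCIe spec, not from the post itself):

```python
# Rough sanity check of the PCIe 3.0 x16 bandwidth figure quoted above.
lanes = 16
gt_per_lane = 8.0        # gigatransfers per second per lane (PCIe 3.0)
encoding = 128 / 130     # 128b/130b encoding efficiency
gb_per_s = lanes * gt_per_lane * encoding / 8  # bits -> bytes

print(round(gb_per_s, 2))  # ~15.75 GB/s per direction
```

That works out to roughly 15.75 GB/s each way, consistent with AMD's rounded "16GB/s bi-directional" claim.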

*New AMD OverDrive in Catalyst Control Center*

AMD OverDrive now allows you to customize the power limit and temperature, which in turn affects thresholds.



Spoiler: AMD Overdrive!







*Dual BIOS*

The AMD Radeon R9 290X comes flashed with two separate BIOSes. The BIOS can be toggled with the small black switch located on top of the card, at the end closest to the mounting bracket where you plug in your displays. You must reboot after toggling the switch for changes to take effect.

*Quiet Mode - Switch is in the position closest to where you plug in your displays. Fan is at 40% cap.*

*Uber Mode - Switch is in the position farthest from where you plug in your displays. Fan is at 55% cap.*



Spoiler: Dual Bios Switch Pic









*Introducing the AMD Gaming Evolved App Powered by Raptr*
Quote:


> Have access to in-game tools such as broadcasting live video via Twitch, taking screenshots, web browsing, and chat.
> 
> With a single click in the app's Control Center, gamers will be able to optimize their games based on performance, quality, or a balance of both for their AMD hardware. Optimal game settings are determined using system and game data captured from millions of PCs stored in Raptr's Cloud combined with extensive testing of various combinations of GPUs, CPUs, and resolutions. The Control Center will also be the hub for gamers to access Raptr's tens of thousands of dedicated game communities for the latest discussions, strategy, streams, and more.


Direction to the download page by clicking on the link below
*AMD Gaming Evolved App, powered by Raptr Link*

*Keep your games optimized*

Auto-optimization provides the best quality and performance settings for your rig based on data collected from the AMD community

*Supported Games Link*

*Real rewards just for playing games*

Earn real rewards just for using the Gaming Evolved app while you play. The more you play, the more rewards you'll unlock!

*Broadcast, Watch, and Chat While You Play*

Broadcast live video via Twitch, watch streams, take screenshots, and share them on Raptr, Facebook, and Twitter without ever leaving your game!

*Gamers Come First*

Join AMD's Gaming Evolved program that strives to create the best possible PC gaming experience by delivering innovative technologies, nurturing open industry standards, and helping the gaming industry maintain the PC platform as the world's premier environment.

*Video INFO*

*AMD R9 290X 4-Way Crossfire Benchmarks 7680 x 1600*


----------



## Arizonian

*R9 290X & 290 [Reference] Specifications*





*REFERENCE REVIEWS*

*







R9 290X Reviews







*

*AMD Radeon R9 290X Video Card Review* - *[H]ard|OCP - Oct 23, 2013*
*AMD Radeon R9 290X 4 GB* - *TechPowerUP! - Oct 23, 2013*
*Radeon R9 290X Review: AMD's Back In Ultra-High-End Gaming* - *tom's HARDWARE - Oct 23, 2013*
*The Radeon R9 290X Review* - *AnandTech - Oct 23, 2013*
*AMD Radeon R9 290X CrossFire* - *TechPowerUP! - Oct 23, 2013*
*AMD Radeon R9-290X tested and reviewed* - *Guru3D - Oct 23, 2013*
*Radeon R9-290X Crossfire vs GeForce GTX 780 SLI tested and reviewed* - *Guru3D - Oct 23, 2013*
*AMD Radeon R9 290X 4GB Review* - *Bit-Tech - Oct 23, 2013*
*AMD Radeon R9 290X 4GB Reference Video Card Review* - *Tweak Town - Oct 23, 2013*
*AMD Radeon R9 290X Review* - *TechSpot - Oct 23, 2013*
*AMD Radeon R9 290X Hawaii Review* - *PC Perspective - Oct 23, 2013*
*AMD Radeon R9 290X Review: Welcome To Hawaii* - *Hot Hardware Oct 24, 2013*
*AMD Radeon R9 290X CrossFire Video Card Review* - *[H]ardOCP - Nov 1, 2013*

*







R9 290 Reviews







*

*AMD Radeon R9 290 Video Card Review* - *[H]ardOCP* - *Nov 4, 2013*
*AMD Radeon R9 290 4 GB* - *TechPowerUP! - Nov 4, 2013*
*AMD Radeon R9 290 4GB Review* - *Hardware Canucks - Nov 4, 2013*
*The AMD Radeon R9 290 Review* - *AnandTech - Nov 11, 2013*
*AMD Radeon R9-290 review* - *Guru3D* - *Nov 4, 2013*
*AMD Radeon R9 290 4GB Review* - *PC Perspective - Nov 4, 2013*
*AMD Radeon R9 290 4GB Review* - *Bit-Tech - Nov 5, 2013*
*AMD Radeon R9 290 Review: Hawaii Just Got Cheaper* - *Hot Hardware - Nov 5, 2013*
*AMD Radeon R9 290 Review* - *Tom's Hardware - Nov 5, 2013*
*AMD Radeon R9 290 Review* - *TechSpot - Nov 5, 2013*
*Radeon R9-290 Crossfire Review* - *Guru3D - Nov 5, 2013*

*MISC REVIEWS*

*







R9 290X Reviews







*

*MSI LIGHTNING*

*MSI Radeon R9-290X Lightning Review* - *Guru3D - March 6, 2014*

*ASUS MATRIX*

*ASUS Radeon R9 290X Matrix Platinum Review* - *Ocaholic - March 18, 2014*

*MSI GAMING*

*MSI Radeon R9 290X Gaming Review* - *TechPowerUP - February 24, 2014*
*MSI Radeon R9-290X Gaming OC Review* - *Guru3D - January 28, 2014*
*MSI R9 290 OC Gaming Edition Review (1600p, Ultra HD 4K)* - *Kit Guru - February 27, 2014*
*MSI Radeon R9 290X Gaming Review* - *[H]ard|OCP - February 24, 2014*

*POWERCOLOR PCS+*

*PowerColor R9 290X PCS+ 4GB Review* - *HardwareCanucks - March 13, 2014*
*PowerColor R9 290X PCS+ Review* - *TechPowerUP - March 4, 2014
*
*PowerColor Radeon R9-290X PCS+ Review* - *Guru3D - February 25, 2014*
*PowerColor Radeon R9 290X PCS+ Review* - *Hexus - February 26, 2014*
*Powercolor Radeon R9 290X PCS+ Review* - *[H]ard|OCP - March 18, 2014*

*GIGABYTE WINDFORCE*

*ASUS R9 290X DirectCU II and Sapphire R9 290X Tri-X Video Card Reviews* - *Legit Reviews - January 5, 2014*
*GIGABYTE R9 290X WindForce OC Review* - *Hardware Canucks - February 4, 2014*
*Gigabyte Radeon R9 290X WindForce OC Review* - *Hexus - January 9, 2014*
*Gigabyte Radeon R9-290X WindForce 3X OC Review* - *Guru3D - January 7, 2014*
*GIGABYTE R9 280X OC Edition Video Card Review* - *[H]ard|OCP - November 15, 2013*

*SAPPHIRE TRI-X*

*Sapphire Radeon R9 290X TRI-X Review* - *TweakTown - March 7, 2014*
*Sapphire R9 290X Tri-X OC Review* - *TechPowerUP - February 4, 2014*
*Sapphire R9 290X Tri-X OC Review (1600p, Ultra HD 4K)* - *Kit Guru - December 22, 2013*
*Sapphire Radeon R9 290 Tri-X OC Review* - *AnandTech - December 24, 2013*
*Sapphire Tri-X R9 290X Review* - *Hardware Heaven - February 7, 2014*

*XFX DOUBLE DISSIPATION*

*XFX Radeon R9 290X Double Dissipation Review* - *Hardware Canucks - January 28, 2014*
*XFX R9 290X Double Dissipation Overclocking Review* - *[H]ard|OCP - February 24, 2014*
*XFX R9 290X Black OC Edition Graphics Card Review with Mantle* - *Hardware Heaven - February 3, 2014*

*ASUS DCUII*

*ASUS Radeon R9-290X DirectCU II OC review* - *Guru3D December 18, 2013*
*ASUS Radeon R9 290X DirectCU II Graphics Card Review* - *PC Perspective - December 18, 2013*
*ASUS R9 290X DirectCU II OC Review* - *TechPowerUP - January 21, 2014*
*ASUS R9 290X DirectCU II OC Overclocking Review* - *[H]ard|OCP - January 13, 2014*

*SAPPHIRE VAPOR-X R9 290X TRI-X*

*







R9 290 Reviews







*

*XFX DOUBLE DISSIPATION*

*XFX Radeon R9 290 Double Dissipation Review* - *Benchmark Reviews - January 28, 2014*
*XFX R9 290 Double Dissipation Review* - *Hardware Canucks - March 17, 2014*

*GIGABYTE WINDFORCE*

*Gigabyte R9 290 WindForce OC Review (1600p, Ultra HD 4K)* - *Kit Guru - January 30, 2014*
*Gigabyte Radeon R9 290X OC and R9 290 OC Review* - *TechSpot - January 9, 2014*

*MSI GAMING*

*MSI R9 290 OC Gaming Edition Review (1600p, Ultra HD 4K) Review* - *Kit Guru - March 17, 2014*
*MSI R9 290 Gaming Review* - *[H]ard|OCP - February 17, 2014*

*ASUS DCUII*

*ASUS Radeon R9-290 DirectCU II OC Review* - *Guru3D - January 17, 2014*

*SAPPHIRE TRI-X*

*R9 290 Tri-X Review* - *Hexus - December 20, 2013*

*POWERCOLOR PCS+* - http://www.powercolor.com/Global/products_features.asp?id=523

*SAPPHIRE VAPOR-X R9 290 TRI-X*


----------



## Arizonian

*ARIZONIAN - I7 3770K 4.5 Ghz - R9 290X 1150 CORE / 1350 MEMORY - Stock Cooling - 13.11 Beta 6 Drivers*



Spoiler: Club Entry Pic









Spoiler: Rig Pics









*GPU-Z Validation Link*

*CPU-Z Validation*



Spoiler: MY BENCHMARKS 290X REFERENCE



*3DMark 11 Performance*

http://www.3dmark.com/3dm11/7375771

*3DMark11 Extreme*

http://www.3dmark.com/3dm11/7380927

*Firestrike*

http://www.3dmark.com/3dm/1484478

*Firestrike Extreme*

http://www.3dmark.com/3dm/1488266



*BF4 MULTIPLAYER Ultra Settings - Stock Cooling - 1100 CORE / 1300 MEMORY - U2713HM 2560 X 1440 (Pre-Mantle 13.11 Beta 7)*






*MSI Afterburner - 84C Temp - 76% Manual Fan - 100% GPU Usage - 1100 Core Clock / 1300 Memory Clock (No down clocking)*



Spoiler: AB readings : Proof of 13:25 Time


----------



## VSG

Asus + Sapphire R9-290X, now if the EK blocks finally show up I will finish my build and get to benchmarking/gaming.


----------



## Master__Shake

....


----------



## Slomo4shO

It has almost begun!

Here is something interesting:
Quote:


> GIGABYTE GV-R929D5-4GD-B
> 
> Effective Memory Clock *6.0GHz*
> 
> Cooler *WINDFORCE 3X*


Seems the 290 is going to have a factory-overclocked non-reference card at release.


----------



## DampMonkey

Installed 10-28-2013


----------



## PolyMorphist

May order two for my 3D/gaming hobbies. Maybe when the third party manufacturers start making higher-end variants.


----------



## RocketAbyss

Oh you cleaned up the thread! Nice! Reserved for my card here!


----------



## PapaSmurf6768

Reserving, also...

Wasn't this thread already made a while ago? What'd I miss?

If the rumors are true I'll be ordering one tonight as soon as they go on sale, then actually reading some reviews to see if I should cancel my order or not


----------



## reoneclipse

I got paid last night, so my budget can easily handle cross-fire 290x's. Depending on the price I might grab a second one from Newegg, but for now it's just the one from ShopBLT.

I'll probably just redo the thermal paste as I will be sitting on the stock cooler for a while since it's starting to get cold down here anyway.

Update:
I've actually had the card since Tuesday but was having too much fun to post an update here. I've got some time now, so here goes.


Spoiler: Pics









Still using the stock cooler, but I redid the TIM: around 2C lower under load, and the fan has never gone above 70% in anything. Not planning on doing water for this build, maybe on my next one. Runs great, and I love that I can get 4xSSAA in BF4 and stay above 50FPS at 1080p Ultra; the game looks so crispy!


----------



## Arizonian

Quote:


> Originally Posted by *PapaSmurf6768*
> 
> Reserving, also...
> 
> Wasn't this thread already made a while ago? What'd I miss?
> 
> If the rumors are true I'll be ordering one tonight as soon as they go on sale, then actually reading some reviews to see if I should cancel my order or not


Yes, the NDA is officially lifted tomorrow on the 290X, with the NDA on the 290 lifting Oct 31, 2013.

*AMD Radeon R9 290X arrives October 24th, R9 290 October 31st*



Since the pre-order release originally set for Oct 3, 2013 was cancelled, and the pre-order thread had run to 80+ pages of speculation, we've started the club fresh by member request.

Earlier today *[H]ardForum* posted: *Source*
Quote:


> AMD is launching the Radeon R9 290X today. The R9 290X represents AMD's fastest single-GPU video card ever produced. It is priced to be less expensive than the GeForce GTX 780, but packs a punch on the level of GTX TITAN. We look at performance, the two BIOS mode options, and even some 4K gaming.


So we should start seeing reviews by early tomorrow morning while we sleep. Something I'll be missing out on and real orders are going live as well.


----------



## RocketAbyss

Quote:


> Originally Posted by *Arizonian*
> 
> So we should start seeing reviews by early tomorrow morning while we sleep. Something I'll be missing out on and real orders are going live as well.


Well, I'm wide awake right now and it's almost 10 AM here. Meh















Eagerly waiting for lunch time so I can finally see reviews of the card, and whether cards will go on sale locally. If not, I might have to wait till next week to get one here.


----------



## Slomo4shO

Reviews are up
http://www.hardocp.com/article/2013/10/23/amd_radeon_r9_290x_video_card_review
http://www.techpowerup.com/reviews/AMD/R9_290X/1.html
http://www.tomshardware.com/reviews/radeon-r9-290x-hawaii-review,3650.html
http://www.anandtech.com/show/7457/the-radeon-r9-290x-review
http://www.techspot.com/review/727-radeon-r9-290x/
http://www.legitreviews.com/amd-radeon-r9-290x-video-card-review_126806
http://www.guru3d.com/articles_pages/radeon_r9_290x_review_benchmarks,1.html
http://us.hardware.info/reviews/4901/amd-radeon-r9-290x-review-amds-new-high-end-hawaii-gpu?utm_source=rss-mixed&utm_medium=rssfeed&utm_campaign=hardwareinfo
http://www.pcper.com/reviews/Graphics-Cards/AMD-Radeon-R9-290X-Hawaii-Review-Taking-TITANs
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/63742-amd-radeon-r9-290x-4gb-review.html
http://www.tweaktown.com/reviews/5823/amd-radeon-r9-290x-4gb-reference-video-card-review/index.html
http://www.hardwareheaven.com/reviews/1865/pg1/amd-radeon-r9-290x-graphics-card-review-introduction.html
http://www.expertreviews.co.uk/graphics-cards/1303303/amd-r9-290x
http://hexus.net/tech/reviews/graphics/61505-amd-radeon-r9-290x/

Crossfire review:
http://www.techpowerup.com/reviews/AMD/R9_290X_CrossFire/

MSRP is $549


----------



## Arizonian

Quote:


> Originally Posted by *Slomo4shO*
> 
> Review is up
> http://www.hardocp.com/article/2013/10/23/amd_radeon_r9_290x_video_card_review
> 
> MSRP is $549


Sapphire 290X BF4 Edition $579 Purchased already.









1 *SAPPHIRE 100361BF4SR Radeon R9 290X* 4GB GDDR5 PCI Express 3.0 *BattleField 4 Game Edition* Video Card
Item #: N82E16814202058
VGA Replacement Only Return Policy
$579.99
Subtotal $579.99
Tax $0.00
Newegg Next Day $27.09
Rush Processing (Preferred Account) -$2.99
Rush Processing $2.99
Order Total $607.08

Put in a next-day order. Minus what BF4 costs, this comes out to $519 (excluding next-day shipping) - amazing price / performance ratio. Should have it by Saturday.
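For anyone checking the math on that effective price, here's a quick sketch; the ~$60 standalone price for BF4 is my assumption, not an official bundle credit:

```python
# Effective card cost once you subtract the bundled game's value.
# bf4_value is an assumed standalone price for BF4, for illustration only.
bundle_price = 579.99
bf4_value = 59.99

effective_card_cost = bundle_price - bf4_value
print(f"${effective_card_cost:.2f}")  # about the $519-520 figure quoted above
```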


----------



## Stay Puft

I grabbed 4 Sapphires. They had that stupid limit of 2 per person, so I bought two and the wife bought two


----------



## VSG

Guys, the 290 (blurred out in one of the reviews) is right up there with the 290X. I am so tempted, but my mind is telling me to wait.


----------



## Slomo4shO

Quote:


> Originally Posted by *Stay Puft*
> 
> I grabbed 4 sapphire's. They had that stupid limit 2 per person so I bought two and the wife bought two


The Tom's Hardware piece includes CrossFire results at 7680x1440.


----------



## DampMonkey

Snagged a Sapphire with BF4 for $580. And I thought I was going to pay $720!


----------



## Arizonian

Quote:


> Originally Posted by *Slomo4shO*
> 
> Review is up
> http://www.hardocp.com/article/2013/10/23/amd_radeon_r9_290x_video_card_review
> http://www.techpowerup.com/reviews/AMD/R9_290X/1.html
> http://www.tomshardware.com/reviews/radeon-r9-290x-hawaii-review,3650.html
> http://www.anandtech.com/show/7457/the-radeon-r9-290x-review
> 
> Crossfire review:
> http://www.techpowerup.com/reviews/AMD/R9_290X_CrossFire/
> 
> MSRP is $549


Sweet job collecting them quickly. Added to OP.

So if it's $549 without the game, anyone who gets the bundle and was going to buy the game separately anyway actually pays less overall.









Quote:


> Originally Posted by *geggeg*
> 
> Guys, the 290 (blurred on one of the reviews) is right up there with the 290X. I am so tempted but my mind is telling me to wait.


If you're thinking of the 290 and don't mind roughly a $50 difference without the bundle, assuming the 290 comes in at $499, I'd say do it.


----------



## DampMonkey

More Reviews:

Guru3d
http://www.guru3d.com/news_story/amd_radeon_r9_290x_tested_and_reviewed.html

Guru3d 290x Xfire vs 780 SLI
http://www.guru3d.com/news_story/radeon_r9_290x_crossfire_vs_geforce_gtx_780_sli_tested_and_reviewed.html

Bit-tech.net
http://www.bit-tech.net/hardware/graphics/2013/10/24/amd-radeon-r9-290x-4gb-review/1


----------



## Slomo4shO

Quote:


> Originally Posted by *Arizonian*
> 
> Sweet job collecting them quickly. Added to OP.


Added more, I am sure I missed a few.
Quote:


> Originally Posted by *Arizonian*
> 
> If your thinking of the 290 and don't mind the $50 difference without bundle approximately if the 290 will come in at $499, I'd say do it.


My prediction is still $449 for the 290


----------



## tsm106

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Slomo4shO*
> 
> Review is up
> http://www.hardocp.com/article/2013/10/23/amd_radeon_r9_290x_video_card_review
> 
> MSRP is $549
> 
> 
> 
> Sapphire 290X BF4 Edition $579 Purchased already.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1 *SAPPHIRE 100361BF4SR Radeon R9 290X* 4GB GDDR5 PCI Express 3.0 *BattleField 4 Game Edition* Video Card
> Item #: N82E16814202058
> VGA Replacement Only Return Policy
> $579.99
> Subtotal $579.99
> Tax $0.00
> Newegg Next Day $27.09
> Rush Processing (Preferred Account) -$2.99
> Rush Processing $2.99
> Order Total $607.08
> 
> Put in for next day order. Minus what BF4 cost this comes out to $519 (minus next day shipping) - amazing price / performance ratio. Should have by Saturday.
Click to expand...

Wow mine came out cheaper.

Shipping Method: Newegg 3Day

1 x ($579.99) SAPPHIRE 100361BF4SR Radeon R9 290X 4GB GDDR5 PCI
$579.99

1 x ($-29.00) DISCOUNT FOR PROMOTION CODE
$-29.00

Subtotal:$550.99
Tax:$41.32
Shipping and Handling:$8.50
Total Amount:$600.81

Newegg is 45 mins away, so there's no point in next-day delivery. I got hit with a shipping fee because they don't do ShopRunner over mobile yet, oh well.
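The line items above do check out if the promo is 5% of the card price; the ~7.5% tax rate below is inferred from the numbers, not a quoted rate. A quick sketch:

```python
# Reconstructing the Newegg order total from its line items.
# The 7.5% tax rate is an inference from the posted numbers, not a quoted rate.
card = 579.99
promo = round(card * 0.05, 2)          # 5% mobile promo -> 29.00
subtotal = round(card - promo, 2)      # 550.99
tax = round(subtotal * 0.075, 2)       # 41.32
shipping = 8.50
total = round(subtotal + tax + shipping, 2)
print(promo, subtotal, tax, total)     # 29.0 550.99 41.32 600.81
```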


----------



## RhoSigmaTau

I'm kind of worried, though: based on the reviews it runs at 95 degrees Celsius, which AMD claims is completely safe. I don't buy that, though.

Planning on waiting a few months for non-reference coolers to come out, because running a graphics card at 95 degrees Celsius... that's gonna heat things up.


----------



## tsm106

Gonna wait for non-BF4 stock to get the rest... though still undecided on quantity. I think quad is overkill, even for triple 1080p screens lol. Two is probably perfect.


----------



## Slomo4shO

Quote:


> Originally Posted by *tsm106*
> 
> 1 x ($-29.00) DISCOUNT FOR PROMOTION CODE
> $-29.00


Promo code *MBLEMC10G* for 5% off through the mobile site or the Newegg app


----------



## DampMonkey

Proof:


----------



## tsm106

The mobile app makes it hard to input codes; I couldn't find the field, just FYI. It looks like they have lots of stock, though.


----------



## Arizonian

Quote:


> Originally Posted by *tsm106*
> 
> Wow mine came out cheaper.
> 
> Shipping Method: Newegg 3Day
> 
> 1 x ($579.99) SAPPHIRE 100361BF4SR Radeon R9 290X 4GB GDDR5 PCI
> $579.99
> 
> 1 x ($-29.00) DISCOUNT FOR PROMOTION CODE
> $-29.00
> 
> Subtotal:$550.99
> Tax:$41.32
> Shipping and Handling:$8.50
> Total Amount:$600.81
> 
> Newegg is 45mins away so no point in next day delivery. I got hit with ship fee cuz they dont do shoprunner over mobile yet, oh well.


What discount code did you use? I moved so fast on purchase I missed it.


----------



## VSG

Ugh, couldn't resist and ended up buying the Sapphire BF4 edition using the mobile discount code. I need to talk to a rep and see if I can get my free shoprunner shipping for this.


----------



## tsm106

Quote:


> Originally Posted by *Arizonian*
> 
> What discount code did you use? I moved so fast on purchase I missed it.


Psst!!! As mentioned above 5% off m.newegg

http://slickdeals.net/f/6359848-5-of-entire-purchase-newegg-up-to-50-through-newegg-mobile-app?src=tdw


----------



## Stay Puft

Quote:


> Originally Posted by *Arizonian*
> 
> What discount code did you use? I moved so fast on purchase I missed it.


MBLEMC10G

Had to use the mobile app or site tho for it to work


----------



## reoneclipse

Ouch, those taxes! Glad I don't have to pay that.


----------



## DampMonkey

Quote:


> Originally Posted by *Arizonian*
> 
> What discount code did you use? I moved so fast on purchase I missed it.


I used MBLEMC10G, it was 5% off only if you use the mobile newegg app


----------



## tsm106

Quote:


> Originally Posted by *reoneclipse*
> 
> ouch! those taxes! Glad I don't have to pay that


Hell yeah. We get shafted on release days. I'm gonna wait till Tiger gets non-BF4s in stock to buy the other card(s).


----------



## jrcbandit

Quote:


> Originally Posted by *Stay Puft*
> 
> MBLEMC10G
> 
> Had to use the mobile app or site tho for it to work


Yep, used it for a total of $559 with 3 day shipping.

Now I just need to find a good waterblock; any info on Heatkiller making one for the 290X? Also, at 95°C, will I need a 2nd radiator? I have a single EX360 running a 3570K and 7970. I could add a 240EX radiator, but that requires more tubing/fittings/etc., which will all add up on top of the waterblock + backplate.


----------



## VSG

What are the odds on a decent overclock on CFX 290X's using a corsair 860i? Seems there isn't a whole lot of overclock room using the standard cooler but I plan to go water cooling.


----------



## tsm106

Quote:


> Originally Posted by *geggeg*
> 
> What are the odds on a decent overclock on CFX 290X's using a corsair 860i? Seems there isn't a whole lot of overclock room using the standard cooler but I plan to go water cooling.


Not good if you are gonna be overclocking with some authority.


----------



## LiquidHaus

I'll be an owner of a 290X by the end of November for sure. What I'm waiting for is mini-reviews from you guys once they're watercooled, and how they do with overclocking, because I'll be ordering the EK block at the same time as the card when the time comes. Excited for you guys!!


----------



## VSG

I don't plan on pushing crazy voltages, but let's see. So far reviewers have seen a peak of 650-700 watts system usage without overclocking on the CFX setup.


----------



## Tobiman

Quote:


> Originally Posted by *geggeg*
> 
> What are the odds on a decent overclock on CFX 290X's using a corsair 860i? Seems there isn't a whole lot of overclock room using the standard cooler but I plan to go water cooling.


Yeah, these cards draw up to 720W in CF, so you'll be looking at 1000W minimum.


----------



## DampMonkey

From techpowerup's review:
Quote:


> AMD's Radeon R9 290X shows fantastic clock scaling with GPU voltage, better than any GPU I've previously reviewed. The clocks do not show any signs of diminishing returns, which leads me to believe that the GPU could clock even higher with more voltage and cooling.
> 
> AMD's stock cooler is completely overwhelmed with the heat output of the card during voltage tweaking, though. Even at 100%, it could barely keep the card from overheating and was noisier than any cooler I've ever experienced. My neighbors actually complained, asking why I used power tools that late at night.


This is why I have an EK block en route


----------



## VSG

Quote:


> Originally Posted by *Tobiman*
> 
> Yeah, these cards draw up to 720w in CF so you'll be looking at 1000W min.


I might just stick to a single card then and overclock that to infinity and beyond


----------



## PapaSmurf6768

I've got one on order as of now since the price was just so darn good, but the temps do scare me. Hoping I don't regret this purchase!


----------



## VSG

So any thoughts on which EK block to go for? I haven't bought from them before and I recall them having serious issues with their nickel stuff. Full cover acetal ok?


----------



## Stay Puft

Quote:


> Originally Posted by *geggeg*
> 
> So any thoughts on which EK block to go for? I haven't bought from them before and I recall them having serious issues with their nickel stuff. Full cover acetal ok?


The nickel issue has been resolved


----------



## Mr357

Quote:


> Originally Posted by *geggeg*
> 
> So any thoughts on which EK block to go for? I haven't bought from them before and I recall them having serious issues with their nickel stuff. Full cover acetal ok?


The acetal+copper version was sold out, so I went ahead and ordered the nickel version. Should be fine if I'm just using distilled water and Liquid Utopia, right?


----------



## Tobiman

OCUK's review shows that 290 is just as good as 290X. I may be getting two of those bad boys.


----------



## tsm106

Quote:


> Originally Posted by *Tobiman*
> 
> OCUK's review shows that 290 is just as good as 290X. I may be getting two of those bad boys.


Go look in the HOF and tell me how many 7950s you see still in the top 100.


----------



## DampMonkey

Quote:


> Originally Posted by *geggeg*
> 
> So any thoughts on which EK block to go for? I haven't bought from them before and I recall them having serious issues with their nickel stuff. Full cover acetal ok?


Generally you don't want to mix metals. Is the rest of your loop copper? Get copper. Are your other blocks nickel? Nickel it is! If the block doesn't specify "nickel" in the name, then it's copper-based.

Like Stay Puft said, EK has revised the nickel plating process that brought them a lot of bad press a couple of years ago.


----------



## aznplayer213

Hm... I'm so close to pulling the trigger...


----------



## youra6

I'm surprised they still have them in stock... Ebay is going to be loaded with these things soon.


----------



## aznplayer213

And it's out of stock... I would have picked it up too, at the Newegg HQ in City of Industry...


----------



## DampMonkey

Quote:


> Originally Posted by *aznplayer213*
> 
> and its out of stock...i would have picked it up too at newegg hq in city of industry...


get a sapphire http://www.newegg.com/Product/Product.aspx?Item=N82E16814202058


----------



## aznplayer213

I'm getting conflicting reports now. It's somehow in stock, with a limit of 2 per customer.


----------



## reoneclipse

both sapphire and XFX are in stock on my end. XFX / Sapphire


----------



## caenlen

12 left for sale on TigerDirect's website.

http://www.tigerdirect.com/applications/campaigns/deals.asp?campaignid=2733&cm_re=Homepage-_-Spot%2001a-_-CatId_campaign_amd_r9Launch

R9 290X with BF4.


----------



## c0ld

So why is everyone going with Sapphire? Do they have Hynix chips?

Does MSI have Hynix or Elpida?

BTW, what a hater on that review.


----------



## youra6

I got 50 dollars off my 2x r9 290x orders using this:

Promo code *MBLEMC10G* for 5% off through the mobile site or the Newegg app.

My BillmeLater account right now:

http://www.usdebtclock.org/


----------



## utnorris

Thanks for the promo code, guys. I cancelled my current order and redid it with the promo code, and basically got free next-day shipping. Plus I switched from XFX to Sapphire; hopefully that was a good move.


----------



## staryoshi

Personally, I'll be waiting for the non-reference models to show up before I (possibly) pull the trigger on one. That said, this song is for those of you who will be happy owners soon. (Dog version)






I look forward to seeing plenty of great content in this thread


----------



## Arizonian

Quote:


> Originally Posted by *staryoshi*
> 
> Personally, I'll be waiting for the non-reference models to show up before I (possibly) pull the trigger on one. That said, this song is for those of you who will be happy owners soon. (Dog version)
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I look forward to seeing plenty of great content in this thread


Thanks stary for popping in and making the club [Official]. Love that song through my cans right now.


----------



## ConservingClips

Count me in as a member of the club who forgot to do express shipping or use any discount codes... on my first order! Fixed that one so I could get next day delivery on the house.









Thanks to Arizonian for setting up this club and youra6 for the promo code.


----------



## Arizonian

Quote:


> Originally Posted by *ConservingClips*
> 
> Count me in as a member of the club who forgot to do express shipping or use any discount codes... on my first order! Fixed that one so I could get next day delivery on the house.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks to Arizonian for setting up this club and youra6 for the promo code.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Thanks and no problem.









I've added names to the list, but I'm going to hold off from this point on until we get the right info, so I can also link to the post with your submission as proof. I'm thinking the following:

*1. GPU-Z Link or Screen shot with OCN Name
2. Manufacturer & Brand
3. Cooling - Stock or 3rd Party Water*

If you need an update afterward, a simple post will do, whether you add another GPU, put yours under water, etc. Pics are welcome too; honestly, I like seeing other setups with these cards in them. Congrats to everyone who has one ordered and on the way. It's going to be fun in the coming weeks to see what we can do on air and water cooling. We've also got some hardcore enthusiasts here who love to bench and will really show what these cards have under the hood. Nothing beats the feeling of a new GPU for us geeks.









----------



## Kenshiro 26




----------



## hotrod717

NVM


----------



## Stay Puft

Quote:


> Originally Posted by *hotrod717*
> 
> I got a notification from newegg, but still says coming soon??? Ahhhh!


You have to click on the individual cards. I bought mine hours ago


----------



## DampMonkey

Another review, this time from hexus.net, their caption is "say hi to the fastest gpu in the world"









http://hexus.net/tech/reviews/graphics/61505-amd-radeon-r9-290x/


----------



## Scotty99

Wonder how long till non-reference coolers will be out. Did that ASUS model with "DC II" turn out to be reference?


----------



## Clockster

My cards will be here on Monday.
Will do the usual to join the club. Also just ordered a waterblock... Haven't done watercooling in years, but this card is screaming for a block


----------



## RocketAbyss

Local stores here in Singapore have the PowerColor 290X going for 818 SGD, about 629 USD after tax and shipping, which to me is still compelling value. Anyone have any idea which memory chips are on the PowerColor model? Or should I just wait for a Sapphire model to show up? I'm very tempted to head down after work (in 3.5 hours' time) and grab a 290X


----------



## devilhead

I have bought the Sapphire 290X + BF4 for 590 euro (Norway); next week I'll pick up a waterblock for it.


----------



## Asterra

I went ahead and bit the bullet because Amazon is making me sweat and I didn't want to run the risk of not getting this card for the weekend. Now just gotta pray that Newegg doesn't pull a fast one like _failing to get it shipped out today, the release day._ Amazon has been 100.0% trustworthy in this regard. Can anyone speak for Newegg?

You guys will laugh, but I bought this card for Fallout New Vegas - a game which theoretically gets 80fps at 1080p even on a card like the 4870, but which in _my_ experience doesn't get anything like that unless I enable both GPUs on my current 6990 (which I refuse to do because I loathe microstuttering). I want a rock solid 60fps 1080p from start to finish, with all the bells and whistles I can throw at it (AA, etc.), and no microstuttering. You could argue that this card is "overkill". I beg to differ. Tom's Hardware makes this assertion when they talk about Battlefield 3, yet even that game dips below 60fps in their own chart.


----------



## Arizonian

Quote:


> Originally Posted by *Asterra*
> 
> I went ahead and bit the bullet because Amazon is making me sweat and I didn't want to run the risk of not getting this card for the weekend. Now just gotta pray that Newegg doesn't pull a fast one like _failing to get it shipped out today, the release day._ Amazon has been 100.0% trustworthy in this regard. Can anyone speak for Newegg?
> 
> You guys will laugh, but I bought this card for Fallout New Vegas - a game which theoretically gets 80fps at 1080p even on a card like the 4870, but which in _my_ experience doesn't get anything like that unless I enable both GPUs on my current 6990 (which I refuse to do because I loathe microstuttering). I want a rock solid 60fps 1080p from start to finish, with all the bells and whistles I can throw at it (AA, etc.), and no microstuttering. You could argue that this card is "overkill". I beg to differ. Tom's Hardware makes this assertion when they talk about Battlefield 3, yet even that game dips below 60fps in their own chart.


You're on OCN; no one here is going to tell you it's overkill. We'll congratulate you instead.









If you bought it with BF4 now you have that too.


----------



## Scotty99

So out of curiosity: did anyone decide not to buy this card after seeing the stock-cooling temps, who otherwise would have pulled the trigger if those leaked Chinese results had turned out to be false?


----------



## EvilGnomes

Grabbed the Sapphire 290X, the only one that was still up for sale. Really wanted MSI, but meh, Sapphire is a good one too. Should be here by Monday (free shipping via that ShopRunner program thing)


----------



## Stay Puft

Quote:


> Originally Posted by *Scotty99*
> 
> So out of curiously, did anyone not buy this card now that they see the temps from stock cooling but otherwise would have pulled the trigger had those leaked chinese results been false?


The temp issue is a simple fix. Not buying this card simply because of it would have been foolish


----------



## Scotty99

What's the simple fix you speak of? I hope you're not meaning a waterblock lol.


----------



## caenlen

I wonder if my 7970 dual-fan cooler will fit this. Does it have the tilted chip design?


----------



## Stay Puft

Quote:


> Originally Posted by *Scotty99*
> 
> whats the simple fix you speak of? I hope your not meaning a waterblock lol.


The $39 Arctic Twin Turbo II. You'd probably see 30°C lower temps with it over the reference cooler


----------



## devilhead

We'll see if this Sapphire 290X can hit my Sapphire 7970's under-water clocks (1370/1810)


----------



## Ukkooh

Do you guys think an H100i would be able to cool this enough to allow OCing? Not sure if I should wait for custom cards.


----------



## Koniakki

Quote:


> Originally Posted by *Scotty99*
> 
> Wonder how long til non reference coolers will be out, did that asus model with "DC II" turn out to be reference?


About 1.5-2 months on average, unfortunately. The 280X/270X arriving sooner doesn't count, because they were re-brands and went straight to the vendors.









Btw I wanna congratulate all of you and Team RED. What a BEAST of a card!


----------



## Asterra

Quote:


> Originally Posted by *Arizonian*
> 
> If you bought it with BF4 now you have that too.


Well of course. ;p Newegg is doing what they can to rake it in, by only making the BF4 versions available at this time. Since I really didn't want to risk it, that's-a what I bought. Ironically, I never intended to buy BF4, and I may see if there's a way to transfer the code. ;p


----------



## MrLinky

Purchased my Sapphire BF4 290x... standard shipping because of the stupid Shoprunner + mobile thing.
Quote:


> Originally Posted by *c0ld*
> 
> So why is everyone going with Sapphire? They have Hynx chips?
> 
> Does MSi have Hynx or Elphida?
> 
> BTW what a hater on the review.


Sapphire: Hynix
Asus: Hynix
Gigabyte: Elpida
MSI: Unknown
HIS: Hynix
XFX: Unknown (rumored to be Elpida)

It seems that all 290 cards will have Elpida RAM.


----------



## Scotty99

Quote:


> Originally Posted by *Stay Puft*
> 
> 39 dollar Arctic twin turbo II. Probably see 30C lower temps with it over the reference cooler


Wouldn't exactly call that a simple fix; I thought you were talking about a fan profile or something lol. As for the rest of us, now the wait for non-reference coolers begins, yay?


----------



## Asterra

Quote:


> Originally Posted by *Stay Puft*
> 
> 39 dollar Arctic twin turbo II. Probably see 30C lower temps with it over the reference cooler


Ehh. I went ahead and googled that. For one thing, say goodbye to the 3-year warranty on your famously excessively hot card that just came out. For another... glue? That word disqualifies the procedure from being "simple". ;p


----------



## Stay Puft

Quote:


> Originally Posted by *Scotty99*
> 
> Wouldnt exactly call that a simple fix, i thought you were talking about a fan profile or something lol. Aye for the rest of us, now the wait for non reference coolers, yay?


Nothing is simpler than removing the reference cooler and bolting on an Arctic cooler


----------



## caenlen

Quote:


> Originally Posted by *Stay Puft*
> 
> Nothing is simpler then removing the reference cooler and bolting on an arctic cooler


I have a Twin Turbo II on my reference 780. It does the job fine and was an easy install too. Now the Arctic Hybrid... that is a hard install, but the Twin Turbo II is easy as cake.


----------



## Stay Puft

Quote:


> Originally Posted by *Asterra*
> 
> Ehh. I went ahead and googled that up. For one thing, say goodbye to your 3-year warranty on your famously excessively hot card that just came out. For another.. Glue? That word disqualifies the procedure from being "simple". ;p


I'll have a write-up in a few days. From the pics I've seen, I'll simply leave the stock reference VRM coolers on the card and just bolt on the cooler. I don't waste time with Arctic's glue


----------



## Scotty99

Think you misunderstood me. "Simple" would denote not having to take things apart, meaning a software fix etc. I don't find anything difficult when it comes to PC hardware (been working on cars for 25 years; this stuff is very easy in comparison, not that working on cars is hard either lol).


----------



## Stay Puft

Quote:


> Originally Posted by *Scotty99*
> 
> Think you misunderstood me. Simple would denote not having to take things apart, meaning a software fix etc. I dont find anything difficult when relating to PC hardware (been working on cars for 25 years, this stuff is very easy in comparison, not that working on cars is hard either lol).


No, I understood you. "Simple" is just defined differently between you and me


----------



## RocketAbyss

My question got missed I guess. -_-

Anyone know which memory chips will be on the PowerColor model? My local shops are selling the PowerColor one, but I'm not too sure if they have the Sapphire one available, and I was thinking of picking one up from the shop in 2.5 hours' time after work


----------



## Scotty99

Quote:


> Originally Posted by *Stay Puft*
> 
> No I understood you. Simple is just defined differently between you and I


OK, well, I imagine most people would side with me in assuming "simple" also means no added cost, like a fan curve for example.

Anywho, good luck with your install.


----------



## c0ld

Quote:


> Originally Posted by *MrLinky*
> 
> Purchased my Sapphire BF4 290x... standard shipping because of the stupid Shoprunner + mobile thing.
> Sapphire: Hynix
> Asus: Hynix
> Gigabyte: Elpida
> MSI: Unknown
> HIS: Hynix
> XFX: Unknown (rumored to be Elpida)
> 
> It seems that all 290 cards will have Elpida RAM.


Won't matter much if I'm not gonna put it under water, right?

XFX is the only one available at TigerDirect


----------



## Porter_

Ordered the Sapphire R9 290X BF4 edition! Been a damn fun and gratifying couple of days. Received my ASUS PB278Q monitor this afternoon and ordered the 290X tonight. Damn it feels good to be a (gangster) geek.


----------



## pounced

Hi there guys. I dropped $610 on rush processing and 1-day shipping for this card, so I'll let you guys know how it does. I picked up the Sapphire card from Newegg that comes with BF4.


----------



## Porter_

If you use the mobile app with promo code MBLEMC10G you get 5% off. My total with 1-day shipping came out to $578, so the promo code covered the 1-day shipping for me.


----------



## crun

So... kinda off-topic, but tell me: do you keep the warranty if you watercool the GPU? Does using a custom cooler mean bye-bye to the warranty too?

Anyway, it's almost as hot and loud as Fermi... I wouldn't buy it unless I had water cooling


----------



## LiquidHaus

Here's something for you guys wondering if the Arctic cooler will possibly work:

*7970:*



*290X:*



Hope this helps.


----------



## Scotty99

Yeah, pretty sure it voids your warranty with most manufacturers. XFX may be the only exception (don't quote me on this).


----------



## Asterra

Anyone have any comments on Newegg's reliability, insofar as getting the thing shipped today? I read a post from one of their reps which claims a 90% likelihood of shipping same-day, but in my estimation that is already a high gamble, and frankly sounds more like covering one's rear. Oh, Amazon - why could you not list the thing asap? ;p


----------



## pounced

Hopefully it gets here today, but I'm completely fine with early Friday


----------



## selk22

Just ordered a Sapphire 290X card.

Did the 2-day shipping, and the Cali tax was lame.

I hope it's worth it


----------



## EvilGnomes

Quote:


> Originally Posted by *selk22*
> 
> Just ordered a 290x Sapphire card..
> 
> 
> 
> Did the 2 day shipping and the Cali tax was lame..
> 
> I hope its worth it


Should have signed up for the ShopRunner trial. I have yet to even pay for ShopRunner because they keep giving me free time... Removed my card and it's still active... after 2 years.

Did a bit of research: the Sapphire 290X uses Hynix modules.


----------



## Arizonian

Quote:


> Originally Posted by *lifeisshort117*
> 
> Here's for you guys wondering if the Artic Cooler will possibly work..
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *7970:*
> 
> 
> 
> *290X:*
> 
> 
> 
> 
> 
> hope this helps.


Thank you for those comparisons.
Quote:


> Originally Posted by *Scotty99*
> 
> Ya pretty sure it voids your warranty with most manufacturers. Xfx may be the only exception (dont quote me on this).


We've got VaporX, a Sapphire rep, here on OCN; we should ask him unless he stumbles on this question, as I'm possibly interested in it as well. I like the fact that the reference builds are overbuilt, but I'll make that determination after doing some overclocking and gaming with it.

Quote:


> Originally Posted by *Asterra*
> 
> Anyone have any comments on Newegg's reliability, insofar as getting the thing shipped today? I read a post from one of their reps which claims a 90% likelihood of shipping same-day, but in my estimation that is already a high gamble, and frankly sounds more like covering one's rear. Oh, Amazon - why could you not list the thing asap? ;p


Newegg is pretty darn good about shipping. I ordered mine literally within a few minutes of it going live, with next-day shipping. However, it's still sitting in 'order verification' on my status page. If it ships tomorrow, I'd expect it at my door by Friday. If not, hopefully whoever they use delivers on Saturday, or the next-day shipping will have been a waste of money.

On a side note: I've updated the OP with some info on powertune, crossfire, the new AMD Overdrive, and dual BIOS.

EDIT: Hopefully VaporX can also confirm once and for all for us which memory is used on the Sapphire cards.


----------



## Ukkooh

Quote:


> Originally Posted by *Ukkooh*
> 
> DO you guys think a h100i would be able to cool this enough to allow ocing? Not sure if i should wait for custom cards.


Bump


----------



## Scotty99

So ya, found out why these fans are locked to 55% max:


----------



## leyzar

Quote:


> Originally Posted by *Scotty99*
> 
> So ya, found out why these fans are locked to 55% max:


Oh .. My .. GOD!
No.. NO OK ? just NO! Period.


----------



## EvilGnomes

Quote:


> Originally Posted by *Scotty99*
> 
> So ya, found out why these fans are locked to 55% max:


Jebus... that's really loud.


----------



## Arizonian

Quote:


> Originally Posted by *Scotty99*
> 
> So ya, found out why these fans are locked to 55% max:


Pretty loud. If you don't game with a headset, that can be bothersome. However, at Uber's 55% it was no louder than my 690 at full load.

They're not hard-capped, though. Of the two onboard BIOSes, 'Quiet' caps the fan at 40% and 'Uber' at 55%.

You can raise the fan threshold with AMD OverDrive. The fan will reach its max set speed by the time the card hits 95C, and the card will dynamically lower the core clock once it hits the temp threshold you set. From what I read, as temps come down the fan auto-adjusts lower as well. You can even raise the temp target, as 95C is said to be comfortable. Not sure on the hard temp limit, though, and how much she can really take.
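If it helps to picture the behavior described above, here's a toy sketch of a PowerTune-style controller: the fan ramps toward its cap as the temperature approaches the target, and the core clock only steps down once the fan is pinned at the cap and the card is still at the target. All constants and step sizes are illustrative guesses, not AMD's actual values:

```python
# Toy model of a PowerTune-style fan/clock controller.
# TEMP_TARGET, FAN_CAP, and the 13 MHz step are illustrative guesses.
TEMP_TARGET = 95      # deg C, the throttle point
FAN_CAP = 55          # percent, the user-set max fan speed

def control_step(temp_c, core_mhz, base_mhz=1000, floor_mhz=727):
    # Fan scales linearly up to its cap as temperature approaches the target
    fan = min(FAN_CAP, max(20, FAN_CAP * temp_c / TEMP_TARGET))
    if temp_c >= TEMP_TARGET and fan >= FAN_CAP:
        # Fan is pinned and we're still at the target: step clocks down
        core_mhz = max(floor_mhz, core_mhz - 13)
    elif temp_c < TEMP_TARGET:
        # Temps recovered: step clocks back up toward base, fan eases off too
        core_mhz = min(base_mhz, core_mhz + 13)
    return fan, core_mhz

print(control_step(95, 1000))  # fan pinned at 55, clock steps down to 987
print(control_step(80, 987))   # below target, clock recovers to 1000
```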


----------



## Scotty99

Hmm, I was told wrong then. I was under the assumption you had to modify the BIOS to get the fans over 55%. But yeah, I don't think I'd put the slider above 55% anyway, wow lol.


----------



## MIGhunter

Ordered the Sapphire BF4 edition from Newegg at midnight.


----------



## Asterra

Quote:


> Originally Posted by *leyzar*
> 
> Oh .. My .. GOD!
> No.. NO OK ? just NO! Period.


Pretty much. This is why, right after ordering my 290X, I also put down a few bucks on a longer HDMI cable and an extension cord for my mouse. Gonna keep the PC in a different room. ;p


----------



## pounced

I'm really hoping the card gets here this afternoon (10/24/13), because I ordered it at 1:50 AM Eastern on 10/24/13. I can't wait to play all the games that have had trouble running on my 6850 with this card, lol. It's going to be night and day.


----------



## kot0005

Looking forward ...


----------



## selk22

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *kot0005*
> 
> 
> 
> 
> Looking forward ...






Awesome.... Made me a happy man to read that. Hope this is true


----------



## Sgt Bilko

Sign me up, my 290X gets shipped tomorrow.








Quote:


> Originally Posted by *kot0005*


Guess I better get on upgrading my rig for Xfire then... The missus is so going to murder me.


----------



## LiquidHaus

Quote:


> Originally Posted by *kot0005*
> 
> 
> 
> 
> Looking forward ...


This is what I was expecting, honestly. The VisionTek 7970 I have right now can do 1250 on the core and 1750 on memory, and that's with a waterblock. I'd still like to hit 1300 with a 290X though.

Looking forward to your guys' overclocking attempts!


----------



## tsm106

Quote:


> Originally Posted by *Ukkooh*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ukkooh*
> 
> DO you guys think a h100i would be able to cool this enough to allow ocing? Not sure if i should wait for custom cards.
> 
> 
> 
> Bump

No. Get a proper full-cover block or don't bother, because otherwise you end up with inadequate cooling on the VRMs. It's worth the investment because, as shown, the cards hit 1400MHz when cold on stock voltage. These chips want to be cooled well!

Anyways, I fell asleep in bed. It was 4am when I awoke to a Steam message linking me to this. Quadfire confirmed. Beast silicon confirmed. It's looking just like the 7970s all over again OC-wise, booyah.

http://www.overclock.net/t/1436561/various-amd-radeon-r9-290x-reviews-thread/360_40#post_21046990

While awake, I just barely missed the HIS non-BF4, so I thought to myself, what the hell, they're giving BF4 for 30, so I got another Sapphire.


----------



## cookiesowns

Am I the only one that's worried about power draw from the R9 290X?

Anyone have voltage clock - wattage scaling on other cards?

Was planning on running 3-way with OC on a 1200W PSU but I don't think that's achievable anymore. But then again, who needs to OC with 3 cards LOL, just a small boost on memory and should be good for eyefinity & 4K/2K


----------



## tsm106

Quote:


> Originally Posted by *cookiesowns*
> 
> Am I the only one that's worried about power draw from the R9 290X?
> 
> Anyone have voltage clock - wattage scaling on other cards?
> 
> *Was planning on running 3-way with OC on a 1200W PSU* but I don't think that's achievable anymore. But then again, who needs to OC with 3 cards LOL, just a small boost on memory and should be good for eyefinity & 4K/2K












You need to speak with people who have actually run these setups successfully. Then you would have learned that you couldn't run triple 7970s on a 1200W PSU to begin with, so how would you run cards with an even higher TDP?

Btw, I'm talking real use, not grandma overclocking here.


----------



## fleetfeather

oh you can reserve spots?

fleetfeather - 2x Sapphire R9 290X - Air


Spoiler: Proof





Still cheaper than buying locally, and covered by Sapphire Intl Warranty


----------



## VaporX

Hey guys, I was told there is a question about warranty voiding with aftermarket coolers. This is correct: if you remove the cooler you will be voiding the warranty. In fact, it is safest to assume this is the case for pretty much every manufacturer unless they have a very explicit statement otherwise clearly posted.

As for which memory is used, I have passed this up the ladder to get you guys an answer.


----------



## cookiesowns

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You need to speak with ppl who actually have successfully run the setups. Cuz then you would learned that you couldn't run triple 7970s on 1200w psu to begin with, so how would you have run cards with an even higher tdp?
> 
> Btw, I'm talk real use not grandma overclocking here.


I've run triple 580s on 1200W fine. Didn't have much OC headroom, but it was capable of running it.

There aren't any decent 1.5kW 80+ Platinum PSUs, plus I like Corsair PSUs. I guess I can just run an external 1200W PSU when benching and doing serious OC, and run undervolted triple 290Xs for daily use.


----------



## tsm106

Quote:


> Originally Posted by *cookiesowns*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You need to speak with ppl who actually have successfully run the setups. Cuz then you would learned that you couldn't run triple 7970s on 1200w psu to begin with, so how would you have run cards with an even higher tdp?
> 
> Btw, I'm talk real use not grandma overclocking here.
> 
> 
> 
> I've ran triple 580s on 1200w fine. Didn't have much OC headroom, but was capable of running it.
> 
> There aren't any decent 1.5KW 80+ plat PSU's, plus I like Corsair PSU's. I guess I can just run an external 1200W psu when benching and doing serious OC, and just run undervolted triple 290X's for daily use.

Get a dual-PSU adapter off eBay for 11 bucks. It comes with a guide on how to wire it up; it's simple. Btw, as an FYI, I've pulled well in excess of 1350W at the wall benching Valley at clocks over 1330/1830. Some have had higher draws than me, depending on your CPU and loop size. I eventually moved to dual PSUs too.


----------



## wstanci3

If any of you guys buy from NCIX, they have the Sapphire:
US
http://us.ncix.com/products/?sku=91477&vpn=21226%2D00&manufacture=SAPPHIRE
Ca
http://www.ncix.ca/products/?sku=91477&vpn=21226%2D00&manufacture=SAPPHIRE


----------



## kcuestag

Soon to buy an R9 290X, just sold my 7970's.

2nd one will come in Christmas.


----------



## leyzar

Quote:


> Originally Posted by *VaporX*
> 
> Hey guys I was told there is a question on warranty voiding with after market coolers. This is correct, if you remove the cooler you will be voiding the warranty, in fact it is safest to assume this is the case for pretty much every manufacturer unless they have a very explicate statement of other wise obviously posted.
> 
> As for which memory is used I have passed this up the ladder to get you guys and answer.


Hey man, while you're at it, ask what memory chips are used in the 290 (no "X") as well, because there are conflicting reports about this.


----------



## cookiesowns

Quote:


> Originally Posted by *tsm106*
> 
> Get a dual adapter of ebay for 11 bucks. It comes with a how to wire it up guide, its simple simple. Btw as an fyi, I've pulled well in excess of 1350w at the wall benching Valley at clocks over 1330/1830. Some have had higher draws than me depending on your cpu and loop size. I eventually moved to dual psu too.


But then that would put me into Caselabs territory.

1350W at the wall on what type of system? Triple 7970s? 1350W at the wall at about 88% efficiency = 1188W actual DC usage, which is still within limits.

Obviously, going tri 290X instead of Titans would save me enough money to go out and get another AX1200i and build into a Caselabs. I suppose I'll wait till Christmas to see what's in store for us.
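For what it's worth, the arithmetic in the post above checks out. A quick sketch of the wall-draw-to-DC-load conversion (the 88% efficiency figure is the poster's assumption for that load point, not a measured value):

```python
# PSU efficiency is wall-to-DC, so the power the PSU actually delivers
# to the system is the wall-side draw multiplied by the efficiency.
# The 88% figure here is an assumption from the post above.

def dc_load_w(wall_draw_w: float, efficiency: float) -> float:
    """DC power the PSU must deliver for a given wall-side draw."""
    return wall_draw_w * efficiency

load = dc_load_w(1350, 0.88)   # 1188 W -- just under a 1200 W rating
headroom = 1200 - load         # ~12 W of margin, i.e. effectively none
```

So a 1350W wall reading on a 1200W unit is technically within spec on the DC side, but with essentially zero headroom — which is why the dual-PSU suggestion above is the safer route for serious overclocking.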


----------



## tsm106

Quote:


> Originally Posted by *wstanci3*
> 
> If some of guys buy from NCIX, they have the Sapphire
> US
> http://us.ncix.com/products/?sku=91477&vpn=21226%2D00&manufacture=SAPPHIRE
> Ca
> http://www.ncix.ca/products/?sku=91477&vpn=21226%2D00&manufacture=SAPPHIRE


More expensive than Newegg, and they do foreign CC charges, i.e. you incur exchange-rate fees because they run the CC from Canada since they don't actually have US filings. Make sure your CC doesn't charge exchange fees; it could be a sizable sum on top of their overpriced listings.


----------



## Sgt Bilko

Quote:


> Originally Posted by *VaporX*
> 
> Hey guys I was told there is a question on warranty voiding with after market coolers. This is correct, if you remove the cooler you will be voiding the warranty, in fact it is safest to assume this is the case for pretty much every manufacturer unless they have a very explicate statement of other wise obviously posted.
> 
> As for which memory is used I have passed this up the ladder to get you guys and answer.


Very nice, Thank you very much


----------



## Arizonian

Quote:


> Originally Posted by *VaporX*
> 
> Hey guys I was told there is a question on warranty voiding with after market coolers. This is correct, if you remove the cooler you will be voiding the warranty, in fact it is safest to assume this is the case for pretty much every manufacturer unless they have a very explicate statement of other wise obviously posted.
> 
> As for which memory is used I have passed this up the ladder to get you guys and answer.


Thanks VaporX. That was the question in the preorder thread about using the Arctic Extreme III Cooler on the Sapphire reference card.

The other question was which memory Sapphire is using, for confirmation: Hynix or Elpida? Appreciate you getting back to us. It's nice to have an active rep on board OCN.


----------



## Scotty99

Quote:


> Originally Posted by *VaporX*
> 
> Hey guys I was told there is a question on warranty voiding with after market coolers. This is correct, if you remove the cooler you will be voiding the warranty, in fact it is safest to assume this is the case for pretty much every manufacturer unless they have a very explicate statement of other wise obviously posted.
> 
> As for which memory is used I have passed this up the ladder to get you guys and answer.


I figured as much. Here is a quote from an XFX CS agent on a 7950 review on Newegg:
Quote:


> Manufacturer Response:
> Hello, The VOID warranty stickers do not apply to North America. The 7950 is voltage locked, but the core and memory clocks are not. Remember this is a black edition card to begin with, it is factory overclocked. *Please feel free to modify your cooling solution, just keep the original parts in case we need to RMA the unit in the future*


Not sure if this applies to the 290 series, but just throwing it out there.


----------



## kot0005

Quote:


> Originally Posted by *VaporX*
> 
> Hey guys I was told there is a question on warranty voiding with after market coolers. This is correct, if you remove the cooler you will be voiding the warranty, in fact it is safest to assume this is the case for pretty much every manufacturer unless they have a very explicate statement of other wise obviously posted.
> 
> As for which memory is used I have passed this up the ladder to get you guys and answer.


Wow, why would you not give us warranty when using waterblocks is clearly a better choice...


----------



## sugarhell

I will join soon guys!


----------



## tsm106

When it comes to watercooling, its much like the US Military's policy on alternative um you know what, don't ask, don't tell.

Don't blow up your cards while changing cooling. Return to stock for any warranty work, etc. Just like ya do with your cars.


----------



## c0ld

Well, time for the red side; the last one I had was a Radeon 9600SE. I should get an XFX 290X in the mail soon.


----------



## Scotty99

Quote:


> Originally Posted by *tsm106*
> 
> When it comes to watercooling, its much like the US Military's policy on alternative um you know what, don't ask, don't tell.
> 
> Don't blow up your cards while changing cooling. Return to stock for any warranty work, etc. Just like ya do with your cars.


Some have a sticker though. And it looks to me like XFX doesn't care what you do with the coolers; you guys should email them and find out for sure. (Or is there an XFX guy on these boards?)


----------



## tsm106

Quote:


> Originally Posted by *Scotty99*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> When it comes to watercooling, its much like the US Military's policy on alternative um you know what, don't ask, don't tell.
> 
> Don't blow up your cards while changing cooling. Return to stock for any warranty work, etc. Just like ya do with your cars.
> 
> 
> 
> Some have a sticker tho. And it looks to me xfx doesnt care what you do with the coolers, you guys should email them and find out for sure. (or is there an xfx guy on these boards?)

XFX started with the stickers years ago. It SHOULD be common knowledge by now: XFX has a modders warranty. They don't give a rat's ass what you do as long as you don't self-inflict damage. Only North America gets this warranty unfortunately (that includes you guys in CA); the rest of the world is not so lucky and gets the typical you-touch-it-you-void-it warranty. It sucks they're using Elpida on their 290X though, typical XFX shenanigans lol.

Most US companies won't void for TIM changes because it's a normal service. You don't lose your car warranty for changing your brakes, oil, or tires, now do you? They use fear to keep people in line. If you're terribly concerned about TIM changes or whatever, you can ask your AIB. But the best bet is not to break stuff, and to return the card to stock in case something happens.

Btw, the companies using stickers that I know of are Powercolor, MSI and XFX. Only XFX have the modders warranty.


----------



## Scotty99

I was replying to the Sapphire rep, bro; he said we should assume every company was like this, thus I linked the XFX post...

It should be common knowledge to read words around other words; by doing so you gain something called "context".


----------



## tsm106

Quote:


> Originally Posted by *Scotty99*
> 
> I was replying to the sapphire rep bro, he said we should assume every company was like this, thus i linked the xfx post...
> 
> It should be common knowledge to read words around other words, by doing so you gain something called "context".


I was responding in general but I'll be sure not to reply to you again.


----------



## Scotty99

Solid choice, bro.


----------



## c0ld

Any other retailer than Newegg and TigerDirect that has them in stock? I cancelled my order with TigerDirect; they were being a pain in the ass about not shipping anywhere other than the billing address.


----------



## szeged

So, any word on which companies don't void the warranty for switching to waterblocks?

I have a HIS 290X inbound to me atm. I don't care about the voided warranty, but my gf's brother does care, since he wants to waterblock his also.


----------



## selk22

Well, it seems the Sapphire one does not, which is what I ordered, but honestly I have no worries.

Thanks again, szeged; it seems it was worth the 2-day wait.









As for HIS i have heard nothing about a warranty for modders


----------



## $ilent

Asus - Hynix
Sapphire - Hynix
HIS - Elpida
Gigabyte - Elpida
MSI - Unknown


----------



## selk22

Source?


----------



## $ilent

http://forums.overclockers.co.uk/showpost.php?p=25170878&postcount=184


----------



## selk22

Cheers +rep


----------



## $ilent

No worries. I went with HIS before I saw Gibbo's post, because everywhere I checked said HIS had Hynix... so I had to rush to change my order at 6am this morning!


----------



## Scotty99

I hate to play devil's advocate, but my brain is making me, lol.

What if, two months down the road when all the overclocking reviews come out, the Elpida is performing better than the Hynix? I know, unlikely going by past GPU series, but it could happen!


----------



## moldyviolinist

Are there any of these cards left in stock anywhere? I'm trying to buy in the US. I didn't expect it to be a hard launch at midnight...


----------



## flopper

Quote:


> Originally Posted by *Scotty99*
> 
> I hate to play devils advocate but my brain is making me lol.
> 
> What if two months down the road when all the overclocking reviews come out the elpida is performing better than hynix! I know, unlikely from past GPU series, but it could happen!


Gibbo flashed the Asus BIOS to cards with Elpida chips and they then overclocked better.
So no real difference OC-wise; maybe a tad better for Hynix on average, I expect.


----------



## kcuestag

Just bought a Sapphire R9 290X including Battlefield 4.


----------



## sugarhell

Quote:


> Originally Posted by *Scotty99*
> 
> I hate to play devils advocate but my brain is making me lol.
> 
> What if two months down the road when all the overclocking reviews come out the elpida is performing better than hynix! I know, unlikely from past GPU series, but it could happen!


Elpida chips clock lower than Hynix or Samsung because they use tighter timings. It's a purely technical matter, not magic or something.


----------



## Porter_

still in 'order verification' status. come on newegg give me a tracking number


----------



## selk22

Quote:


> Originally Posted by *kcuestag*
> 
> Just bought a Sapphire R9 290X including Battlefield 4.


Congrats







Me too!

Exciting days....


----------



## kcuestag

Quote:


> Originally Posted by *selk22*
> 
> Congrats
> 
> 
> 
> 
> 
> 
> 
> Me to!
> 
> Exciting days....


Yeah.

I know I'll lose performance versus my current 2x 7970s, but I do plan on a 2nd one at Christmas, plus there's nothing I can't play at 60+ fps with an R9 290X, even at 2560x1440, so I'm not worried.


----------



## $ilent

I got the Sapphire BF4 290X too... I bought it for BF4, can't wait!


----------



## selk22

Quote:


> Originally Posted by *kcuestag*
> 
> Yeah.
> 
> I know I'll lose performance over my current 2x 7970's but I do plan on a 2nd one in Christmas, plus there's nothing I can't play at +60fps with an R9 290X even at 2560x1440 so not worried.


And at your resolution, who's to say you can't Crossfire down the road and get even better performance? This is the 4K king right now, imo.


----------



## jerrolds

Anyone planning on doing an H60 or Antec 620 mod on this thing? It doesn't seem like they overclock well at all. An EK waterblock is just too much for me... wondering if anyone is doing a more ghetto solution.

Either that or I'm waiting for the Black Friday sales to pick up an Accelero Hybrid, assuming they fit on the 290X and it goes on sale; it's about $130 CDN atm.


----------



## $ilent

No, I'm not risking it; last time I tried that I killed my 7870. Gonna go full block on my 290X.


----------



## kcuestag

Quote:


> Originally Posted by *$ilent*
> 
> no im not risking it, last tiome I tried that I killed my 7870. Gonna go full block on my 290x


Good choice, I'm waiting for other manufacturers to announce their waterblocks, I don't like the looks of the EK blocks.


----------



## RocketAbyss

Well, I think I'm the first one here to get hold of a PowerColor 290X. Finally got it setup and running so incoming picture spam!




Spoiler: Warning: Spoiler!







My Asus Matrix 7970 retiring to the new beast!





GPU-Z to come soon once I get the drivers sorted out!


----------



## Porter_

nice pics RocketAbyss


----------



## dade_kash_xD

Random question: anyone know the length in cm or mm of the CrossFire bridge that comes with XFX cards?


----------



## Scotty99

No CrossFire bridge with this GPU; it's done over the PCIe lanes.


----------



## Slomo4shO

Quote:


> Originally Posted by *dade_kash_xD*
> 
> Random question. Anyone know the length in CM or MM the crossfire bridge that come with XFX cards are?


The cards don't require CF bridges.
Quote:


> Originally Posted by *RocketAbyss*
> 
> Well, I think I'm the first one here to get hold of a PowerColor 290X. Finally got it setup and running so incoming picture spam!
> 
> GPU-Z to come soon once I get the drivers sorted out!


Nice, the PowerColor card is actually overclocked by 30MHz compared to the other models


----------



## youra6

Have the Asus BF4 cards been released yet?


----------



## VSG

The non-BF4 edition cards have better looking shrouds, especially the Sapphire one.


----------



## Mr-Mechraven

Quote:


> Originally Posted by *jerrolds*
> 
> Anyone planning on doing a H60 or Antec 620 mod on this thing? It doesnt seem like they overclock well, at all. An EK waterblock is just too much for me...wondering if anyone is doing a more ghetto solution.
> 
> Either that or im waiting for Black Friday sales to pick up an Accelero Hybrid, assuming they fit on the 290X and its on sale, its about $130CDN atm.


Just so you know the Prolimatech MK 26 VGA cooler will fit the R9 290X with either 2x 120mm or 2 x 140mm fans









Just another option for those who want better air cooling


----------



## Mr357

Has anyone received shipping confirmation from Newegg? I paid for rush processing and mine has still shown as "Packaging" since last night.


----------



## Roaches

Quote:


> Originally Posted by *Mr-Mechraven*
> 
> Just so you know the Prolimatech MK 26 VGA cooler will fit the R9 290X with either 2x 120mm or 2 x 140mm fans
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just another option for those who want better air cooling


We just need a proper backplate then







...I've seen cards sag with the MK-26 installed.


----------



## Stay Puft

Quote:


> Originally Posted by *Scotty99*
> 
> Some have a sticker tho. And it looks to me xfx doesnt care what you do with the coolers, you guys should email them and find out for sure. (or is there an xfx guy on these boards?)


As tsm has said. Very easy to get past the sticker

Quote:


> Originally Posted by *youra6*
> 
> Have the Asus BF4 cards been released yet?


Probably today


----------



## VSG

I just received confirmation from AMD Radeon's twitter that the 290X successfully crossfires with the 290.


----------



## Mr357

Quote:


> Originally Posted by *geggeg*
> 
> I just received confirmation from AMD Radeon's twitter that the 290X successfully crossfires with the 290.










Maybe the 290 will unlock to a 290X like with the 6950?


----------



## sterob

Quote:


> Originally Posted by *RocketAbyss*
> 
> Well, I think I'm the first one here to get hold of a PowerColor 290X. Finally got it setup and running so incoming picture spam!
> 
> GPU-Z to come soon once I get the drivers sorted out!


Hi, can you try some cgminer BTC/LTC run ?


----------



## youra6

Quote:


> Originally Posted by *Mr357*
> 
> 
> 
> 
> 
> 
> 
> 
> Maybe the 290 will unlock to a 290X like with the 6950?


AMD cards have always been able to do that, though... I'm sure the shader clusters will be laser-cut, hence impossible to unlock.

If the 290 is $450, I will consider returning the 290X and putting the savings toward waterblocks. I hope 290s under water will be close to on par with 290Xs on air cooling.


----------



## RocketAbyss

I've been trying to get my GPU-Z to show my 290X properly, but it doesn't seem to be working, even on the latest version. Anyone have any clue? This is on Catalyst 13.11 Beta drivers.

So instead, I used the catalyst manager to show the breakdown of the card specs. Enjoy!

EDIT: Pics


----------



## nemm

One Sapphire 290x, EK water block and back plate on pre-order due November 5th as an early Christmas present from the better half. By the time it arrives the driver performance should have progressed making the changeover from the 670 less of a task.

The countdown begins and you never know it may get a partner ordered while I wait ^^


----------



## $ilent

Quote:


> Originally Posted by *Mr357*
> 
> 
> 
> 
> 
> 
> 
> 
> Maybe the 290 will unlock to a 290X like with the 6950?


From Gibbo at OCUK:
Quote:


> By the way, this is an R290 I've flashed with the R290X BIOS. Works a treat — shaders not unlocked, but it can now hit 1220 core and 6600 RAM on the regular R290 as well. At these speeds this also obliterates Titan. Incredible value!


----------



## pounced

I too ordered my card at 1:50 AM Eastern this morning, with rush processing and one-day shipping, and it's still showing up as Packaging for me =(


----------



## Porter_

I didn't include rush processing and mine still shows Order Verification


----------



## mohit9206

Quote:


> Originally Posted by *pounced*
> 
> I too ordered my card at 1:50 East AM this morning and its still showing up as packaging and I put rush processing on it along with one day shipping. Still showing up as packaging for me =(


But it's too noisy and consumes too much power. Wait for custom-cooled cards.


----------



## Stay Puft

My cards are in the "Packaging" phase. If you ordered after 12:30am, I don't believe your cards will ship today.


----------



## Mr357

Quote:


> Originally Posted by *pounced*
> 
> I too ordered my card at 1:50 East AM this morning and its still showing up as packaging and I put rush processing on it along with one day shipping. Still showing up as packaging for me =(


Same here


----------



## Porter_

Quote:


> Originally Posted by *Porter_*
> 
> i didn't include rush processing and mine still shows Order Verification


boom. Packaging.


----------



## Koniakki

Quote:


> Originally Posted by *$ilent*
> 
> From Gibbo at OCUK:


Wow! Just wow!! The R9 290's price/performance just went sky-high..


----------



## YaLu

Which R9 290X brand won't void the warranty if you remove the stock heatsink?


----------



## Stay Puft

Quote:


> Originally Posted by *YaLu*
> 
> Which brand that produces r9 290x will not void the warranty if you remove the stock heatsink?


I don't believe any. You'll just have to not tell them you removed it


----------



## pounced

Quote:


> Originally Posted by *Stay Puft*
> 
> My cards are in the "Packaging" phase. If you ordered after 12:30am i dont believe your cards will ship today


Yea that sucks I guess. I have off work tomorrow though so fun fun fun


----------



## Kokin

If the 290 (non-X) is similar to the MSI 7950 with the 7970 PCB (6+8-pin vs 2x6-pin), that will be some crazy good price/performance.


----------



## VSG

I called up Newegg this morning and I will get a refund for the 3-day shipping charge, since the mobile version did not have ShopRunner. I should have just gone ahead with the 2-day shipping then.


----------



## kcuestag

Anyone know if other manufacturers have mentioned R9 290X waterblocks yet? I can only find EK waterblocks, not anyone else's (like Aquacomputer, Watercool, XSPC, etc.).


----------



## tsm106

Quote:


> Originally Posted by *YaLu*
> 
> Which brand that produces r9 290x will not void the warranty if you remove the stock heatsink?


Read my post above. XFX, but only if you are in North America. Otherwise, don't ask don't tell, don't break your card and it should never be an issue. Well, unless you buy a card with stickers all over it.

Quote:


> Originally Posted by *kcuestag*
> 
> Anyone knows if other manufacturers have mentioned R9 290X waterblocks yet? I can only find EK waterblocks, but not anyone else (Like Aquacomputer, Watercool, XSPC... etc)


They will all have blocks, some really slowly though. I'm in with EK for the VRM cooling.


----------



## Mr357

Quote:


> Originally Posted by *Stay Puft*
> 
> I don't believe any. You'll just have to not tell them you removed it


"Yeah, I haven't tampered with it at all. I'm not sure how that IC Diamond got on there though, must have been those little elves that come out at night."


----------



## Stay Puft

I emailed W1zzard and asked if he'd measure the mounts on the 290X and 280X to see if they're the same.
Quote:


> Originally Posted by *Mr357*
> 
> "Yeah, I haven't tampered with it at all. I'm not sure how that IC Diamond got on there though, must have been those little elves that come out at night."


They wouldn't be able to tell the TIMs apart. Asus sure didn't...


----------



## Slomo4shO

Quote:


> Originally Posted by *Porter_*
> 
> boom. Packaging.










Arrogant Bastard is good, but I prefer the Double Bastard








Quote:


> Originally Posted by *Koniakki*
> 
> wow! Just wow!! R9-290 price/performance went skyhigh now..


Now I am wondering if my 1200W PSU would be sufficient, or if I should be picking up a LEPA G1600 for a quadfire setup with water cooling, since that would be the logical configuration


----------



## Stay Puft

Quote:


> Originally Posted by *Slomo4shO*
> 
> 
> 
> 
> 
> 
> 
> 
> Arrogant bastard is good but I prefer the double bastard
> 
> 
> 
> 
> 
> 
> 
> 
> Now I am wondering if my 1200W PSU would be sufficient or if I should be picking up a LEPA G1600 for a quadfire sup with water cooling since that would be the logical configuration


Lepa for sure


----------



## leyzar

Evening everybody, happy to see you guys enjoying your new cards








Any more info on the 290 non-X?


----------



## kcuestag

Quote:


> Originally Posted by *tsm106*
> 
> Read my post above. XFX, but only if you are in North America. Otherwise, don't ask don't tell, don't break your card and it should never be an issue. Well, unless you buy a card with stickers all over it.
> They all will have blocks, some really slowly though. I'm in with ek for the vrm cooling.


The bad part is all the EK waterblocks are out of stock; I was going to get one, but now they're all gone.


----------



## Arizonian

Quote:


> Originally Posted by *Stay Puft*
> 
> My cards are in the "Packaging" phase. If you ordered after 12:30am i dont believe your cards will ship today


Quote:


> Originally Posted by *Porter_*
> 
> boom. Packaging.


BOOM BOOM - packaging with Next Day Air









Quote:


> Originally Posted by *kcuestag*
> 
> Anyone knows if other manufacturers have mentioned R9 290X waterblocks yet? I can only find EK waterblocks, but not anyone else (Like Aquacomputer, Watercool, XSPC... etc)


FYI to any water coolers here. When you get your blocks please post links so I can get them added to OP for other members to see. EK already added.









Quote:


> Originally Posted by *RocketAbyss*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I've been trying to get my GPU-Z to show my 290X properly, but it doesn't seem to be working, even on the latest version. Anyone have any clue? This is on Catalyst 13.11 Beta drivers.
> 
> So instead, I used the catalyst manager to show the breakdown of the card specs. Enjoy!


Got you added to the list. Congrats.









Remember that the 13.11 beta doesn't support the 290X, and I don't see any update to the drivers yet. Also, GPU-Z probably hasn't added support for it either, I'm assuming.


----------



## Ha-Nocri

http://forums.overclockers.co.uk/showpost.php?p=25171096&postcount=46
Quote:


> Its down to the BIOS on the card and having the right software version.
> 
> Asus works with GPU tweak.
> MSI will no doubt work with Afterburner.
> Sapphire with Trixx.
> 
> But they can do voltage, just software updates needed to go official.


So, it seems that voltage control is there


----------



## Porter_

Quote:


> Originally Posted by *Slomo4shO*
> 
> 
> 
> 
> 
> 
> 
> 
> Arrogant bastard is good but I prefer the double bastard


indeed sir.

Quote:


> Originally Posted by *Arizonian*
> 
> BOOM BOOM - packaging with Next Day Air


same here







Nerdy high-five!


----------



## kcuestag

Do we know if these have dual BIOS? Would be awesome; I'm looking forward to using whatever BIOS allows voltage control with MSI Afterburner.








Quote:


> Originally Posted by *Arizonian*
> 
> FYI to any water coolers here. When you get your blocks please post links so I can get them added to OP for other members to see. EK already added.


Will do.


----------



## RocketAbyss

Quote:


> Originally Posted by *Arizonian*
> 
> Remember that 13.11 beta doesn't support 290X and I don't see any update to drivers yet. Also GPU-Z probably hasn't added support for it either I'm assuming.


Thanks. However, you got the label on the list wrong; I'm using a PowerColor 290X, not a Sapphire one. Here's the physical proof:
Quote:


> Originally Posted by *RocketAbyss*
> 
> Well, I think I'm the first one here to get hold of a PowerColor 290X. Finally got it setup and running so incoming picture spam!
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> My Asus Matrix 7970 retiring to the new beast!
> 
> 
> 
> 
> 
> GPU-Z to come soon once I get the drivers sorted out!


----------



## tsm106

Quote:


> Originally Posted by *kcuestag*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Read my post above. XFX, but only if you are in North America. Otherwise, don't ask don't tell, don't break your card and it should never be an issue. Well, unless you buy a card with stickers all over it.
> They all will have blocks, some really slowly though. I'm in with ek for the vrm cooling.
> 
> 
> 
> The bad part is all the EK waterblocks are out of stock, I was going to get one but now they're all gone.

Restock on ek blocks is next Tues.


----------



## Mr357

Quote:


> Originally Posted by *tsm106*
> 
> Restock on ek blocks is next Tues.


So glad I ordered mine yesterday; that's exactly what I was afraid of.


----------



## Stay Puft

Quote:


> Originally Posted by *Mr357*
> 
> So glad I ordered mine yesterday; that's exactly what I was afraid of.


Me too. I knew if i waited i would never get them


----------



## tsm106

Quote:


> Originally Posted by *Mr357*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Restock on ek blocks is next Tues.
> 
> 
> 
> So glad I ordered mine yesterday; that's exactly what I was afraid of.

I'm gonna wait for the restock. No rush yet. In the meantime I've got a GPU-only block to mess with, extra shims, and a spare Sapphire OC cooler. Let's see how it does with a better air cooler, for instance.


----------



## VSG

Quote:


> **UPDATE**
> 
> Hynix and Elpida are both just as good as each other.
> 
> I took my Elpida card which could only do 5700MHz, flashed the Asus BIOS to it, pumped up the voltage and now the Elpida card is matching with 6600MHz.
> 
> So there you go, Elpida, Hynix, it don't matter. What matters is voltage control.


From OCUK


----------



## tsm106

Quote:


> Originally Posted by *geggeg*
> 
> Quote:
> 
> 
> 
> **UPDATE**
> 
> Hynix and Elpida are both just as good as each other.
> 
> I took my Elpida card which could only do 5700MHz, flashed the Asus BIOS to it, pumped up the voltage and now the Elpida card is matching with 6600MHz.
> 
> So there you go, Elpida, Hynix, it don't matter. What matters is voltage control.
> 
> 
> 
> From OCUK

That's good as long as they are rated the same.










Elpida got a bad rap because on 7970s some AIBs were using much lower-rated Elpida chips instead of the reference-spec chips.


----------



## Arm3nian

My 2 sapphires already shipped









oh sweet my 2 ek blocks also shipped


----------



## Slomo4shO

Quote:


> Originally Posted by *Stay Puft*
> 
> Lepa for sure


It is a shame that the EVGA SuperNOVA NEX 1500 won't fit my HAF XB, the individually sleeved black and red cables would go well with my Maximus VI Extreme.


----------



## pounced

Quote:


> Originally Posted by *Arm3nian*
> 
> My 2 sapphires already shipped


Yeah, mine shows as shipped, so maybe they will arrive later today? It's almost 2 PM here. What are the chances they come today?


----------



## Stay Puft

Quote:


> Originally Posted by *pounced*
> 
> Yeah, mine shows as shipped, so maybe they will arrive later today? It's almost 2 PM here. What are the chances they come today?


None. You'll get them tomorrow


----------



## Arm3nian

I got the cheapest shipping lol, not starting my build till I get the RIVE BE anyway so idc. probably tomorrow or monday.


----------



## Stay Puft

Quote:


> Originally Posted by *Arm3nian*
> 
> I got the cheapest shipping lol, not starting my build till I get the RIVE BE anyway so idc. probably tomorrow or monday.


Same here. I don't expect to have the SR2 up and running till after Halloween

Edit mine have shipped as well


----------



## kcuestag

I didn't get my waterblock earlier because I didn't have this upgrade in mind; I planned on keeping my 7970 CFX until I woke up this morning, saw them in stock, and bought one.









Oh well, hoping I can get one before the BF4 release in Europe (Thursday); I don't want to game with this thing on air.


----------



## c0ld

So sad TigerDirect didn't want to ship anywhere other than my billing address. Anyone know where I can snag one besides Newegg?


----------



## Arm3nian

Anyone know the difference between the EK Nickel and Nickel Original CSQ? I'm guessing it is just the plexiglass that extends to the end of the card. Tbh I previously thought it looked better than the shorter one, but after seeing the temps this card produces, I'm glad I ordered the shorties


----------



## VSG

Why would the shorties be better than the full cover blocks?


----------



## jerrolds

I ordered mine from Newegg.ca, got the payment charged at 11:43PM - still packaging







I guess ill get it next Tuesday/Wed.


----------



## Arm3nian

Quote:


> Originally Posted by *geggeg*
> 
> Why would the shorties be better than the full cover blocks?


You misunderstood; the shorties are also full-cover blocks. Just like the full one, they cool the GPU, RAM and VRM. The difference is that the long one just has extra plexiglass extending over the caps and such to the left and right.


----------



## VSG

Ya, I did misunderstand. Still though, why would that extra piece of plexi be bad? Is it due to limiting case airflow over these parts?


----------



## tsm106

Quote:


> Originally Posted by *geggeg*
> 
> Ya I did misunderstand, still though why would that extra piece of plexy be bad? Is it due to limiting case air cooling over these parts?


It's not bad, it's for looks. Some peeps like the full board-length look. The full length does help in keeping the PCB flat, though.


----------



## Arm3nian

Quote:


> Originally Posted by *geggeg*
> 
> Ya I did misunderstand, still though why would that extra piece of plexy be bad? Is it due to limiting case air cooling over these parts?


Yeah, that is the gist. I actually thought it was going to make no difference, but one of the EK reps said they did some testing showing slightly better temps on the ones not covering the components. Passive component-to-air heat transfer is always good, so in theory the shorty will run cooler.

TSM: backplate ftw?


----------



## tsm106

Quote:


> Originally Posted by *Arm3nian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *geggeg*
> 
> Ya I did misunderstand, still though why would that extra piece of plexy be bad? Is it due to limiting case air cooling over these parts?
> 
> 
> 
> Yeah that is the gist. I actually thought it was going to make no difference but one of the EK reps actually said they did some testing, showing a bit better temps on the ones not covering the components. Passive heat from component to air transfer is always good, so in theory the shorty will run cooler.
> 
> TSM: backplate ftw?

Backplate and full cover... but we don't really have a choice. If you want the half-length block you must go nickel. And if there's anything I've learned about EK, it's that their nickel sucks! Thus it's not like I had a choice in getting the acetal copper.


----------



## Arm3nian

Quote:


> Originally Posted by *tsm106*
> 
> Backplate and fullcover... but we don't really have a choice. You want the half length block u must go nickel. And if there's anything I've learned about ek, it's that their nickel sucks! Thus it's not like I had a choice in getting the acetal copper.


Well there is a non nickel plated shorty: http://www.frozencpu.com/products/21665/ex-blc-1564/EK_Radeon_R9-290X_VGA_Liquid_Cooling_Block_-_Acrylic_EK-FC_R9-290X.html?tl=g57c599s2078&id=WMu5oTBm&mv_pc=671

Also, I think their "fix" for the nickel plating has actually worked; I haven't heard any horror stories since. A non-nickel-plated block would just ruin the look of my build.


----------



## pounced

Haha, kinda silly question, but does the bundle come with Battlefield 4 + Premium or just Battlefield 4?


----------



## Porter_

just BF4 limited edition


----------



## pounced

Quote:


> Originally Posted by *Porter_*
> 
> just BF4 limited edition


That's still pretty good considering that it's a $69.99 value, and I was getting this card solely for Battlefield 4 haha.


----------



## Forceman

Quote:


> Originally Posted by *Porter_*
> 
> just BF4 limited edition


So it should get the China Rising DLC also, right?


----------



## Kokin

Quote:


> Originally Posted by *tsm106*
> 
> Backplate and fullcover... but we don't really have a choice. You want the half length block u must go nickel. And if there's anything I've learned about ek, it's that their nickel sucks! Thus it's not like I had a choice in getting the acetal copper.


They pretty much fixed their Nickel plating issues for products released this year. I haven't read any horror stories lately either.

Just don't go mixing silver with nickel stuff.


----------



## Porter_

Quote:


> Originally Posted by *Forceman*
> 
> So it should get the China Rising DLC also, right?


yessir


----------



## Arizonian

Quote:


> Originally Posted by *pounced*
> 
> That's still pretty good considering that it's a $69.99 value and I was getting this card solely for battlefield 4 haha.


That's a great deal if you paid $579.99 with BF4 on a card normally sold without BF4 at $549.99. That's only $30 for BF4! AMD's way of saying "Thanks".


----------



## Thanos1972

Quote:


> Originally Posted by *Roaches*
> 
> We just need a proper backplate then
> 
> 
> 
> 
> 
> 
> 
> ....I've seen GPU sag with the MK-26 installed on GPUs


Are you sure we need a backplate for the MK-26 in order to fit? I don't think we need a backplate like the 7950/7970 does.
Can someone confirm this?


----------



## Coree

Who would like to be a guinea pig and test whether the S1 Plus heatsink (~30 bucks) fits the 290X? There are several mounting holes for different chips. Combine the heatsink with 2x TY-147 fans and run them at max speed (rated 21 dBA @ 1300 RPM).
I have this mod on my 7870 LE, but it would be interesting to see what the cooling performance would be with the 290X.
This is what it looks like on the 7870LE:


----------



## Forceman

Quote:


> Originally Posted by *Stay Puft*
> 
> My cards are in the "Packaging" phase. If you ordered after 12:30am i dont believe your cards will ship today


I don't think Newegg updates their shipping status until 5PM or so.


----------



## Mr357

Quote:


> Originally Posted by *Forceman*
> 
> I don't think Newegg updates their shipping status until 5PM or so.


5PM East or Pacific?


----------



## Forceman

Quote:


> Originally Posted by *Mr357*
> 
> 5PM East or Pacific?


Pacific.


----------



## Stay Puft

Quote:


> Originally Posted by *Forceman*
> 
> I don't think Newegg updates their shipping status until 5PM or so.


My status says "Shipped".


----------



## Mr357

Quote:


> Originally Posted by *Forceman*
> 
> Pacific.


Guess I'll just have to be patient.


----------



## Raxus

Quote:


> Originally Posted by *Stay Puft*
> 
> My status says "Shipped".


Same


----------



## X-oiL

Sooo how long til we see some nice game bundles like the Never Settle one on these cards?


----------



## Forceman

Quote:


> Originally Posted by *Stay Puft*
> 
> My status says "Shipped".


Which warehouse? Tennessee? I think they don't update the status until UPS/FedEx picks it up, which would depend on the time zone. All my stuff comes from CA, and that status doesn't normally get updated until 5PM. So I shouldn't have generalized from my experience.

And yes, I'm justifying the fact that mine still shows "Packaging" because I want to believe it'll ship today.


----------



## PapaSmurf6768

Well, I went ahead and cancelled my order for the Sapphire BF4 Edition this morning after I thought about it a little more before I went to sleep (although I didn't get much sleep last night...) Maybe I'll reconsider once we get some Mantle benchmarks, but I just couldn't justify spending so much money on such a strange card, especially with the 780Ti coming out soon.


----------



## $ilent

Quote:


> Originally Posted by *Coree*
> 
> Who would like to be a guineapig and test out if the S1 Plus heatsink (~30 bucks) fit the 290X? There are several mounting holes for different chips. Combine the heatsink with 2x TY147 fans, and run them at max speed. (Rated 21dBa @ 1300rpm)
> I have this mod on my 7870LE, but it would be interesting what the cooling performance will be w/ 290X.
> This is what it looks like on the 7870LE:


No, because the 290X heatsink fins go through the heatsink, i.e. there are no gaps on top for air to get in or out. See:



Quote:


> Originally Posted by *X-oiL*
> 
> Sooo how long til we see some nice game bundles like the Never Settle one on these cards?


I got a 3-game golden ticket with my 290X


----------



## Porter_

Quote:


> Originally Posted by *pounced*
> 
> Haha kinda silly question but does the bundle come with Battlefield 4 + Premium or just battlefield 4?


Quote:


> Originally Posted by *Porter_*
> 
> just BF4 limited edition


Quote:


> Originally Posted by *Forceman*
> 
> So it should get the China Rising DLC also, right?


Quote:


> Originally Posted by *Porter_*
> 
> yessir


you know i'm sort of talking out my ass here. i believe the cards ship with BF4 'Limited Edition' but i have no idea what that entails. all i see available for pre-order is the 'Digital Deluxe Edition' and 'Standard Edition'. don't know what we're getting with our cards but i assume it's the standard edition. all pre-orders get China Rising included, not sure if the game codes we get with our cards are considered pre-orders.


----------



## 250179

I'm trying to pull the trigger on one of these. I have one right beside me at my local store on hold, but I just can't comprehend a 95-degree load temp


----------



## devilhead

It would be nice to know the ASIC scores of these cards







Mine arrives next week


----------



## famich

Hi guys, I have a Sapphire R9 290X coming from the UK. Is there a new version of the TriXX software that supports overvolting the new card?


----------



## $ilent

Quote:


> Originally Posted by *famich*
> 
> Hi guys, I have a Sapphire R9 290X coming from the UK. Is there a new version of the TriXX software that supports overvolting the new card?


Gibbo said there will be soon, yes. He also said flashing the ASUS BIOS unlocks voltage too, IIRC.


----------



## Porter_

Quote:


> Originally Posted by *gabsonuro*
> 
> I'm trying to pull the trigger on one of these, I have one right beside me at my local store on hold, I just can't comprehend a 95 degree load temp


it's by design, if that makes you feel any better (it should).


----------



## xxmastermindxx

FWIW, I ordered last night at about 2:30AM CST, and I have a FedEx tracking number and "shipped" status, from CA warehouse.

Also, I got almost $30 off with the Newegg mobile coupon code going around. That went toward overnight shipping.


----------



## jerrolds

Have any sites talked about how well the 290X overclocks? Is it purely a heat thing, meaning a WB or custom cooling solution is needed to unlock its potential - or do they have very little room for OCing? I think I read a couple that said that even at 100% fan speed it made no difference in overclocking.


----------



## Mr357

Quote:


> Originally Posted by *jerrolds*
> 
> Have any sites talked about how well the 290X overclocks? Is it purely a heat thing, meaning a WB or custom cooling solution is needed to unlock its potential - or do they have very little room for OCing? I think I read a couple that said that even at 100% fan speed it made no difference in overclocking.


Keep in mind even "Uber mode" caps at 55% fan speed, so as long as you don't mind the audible equivalent of a jet engine, you should be able to do *some* overclocking on reference cooling.


----------



## $ilent

Made a quick poll thread guys for your thoughts on the 290x here - http://www.overclock.net/t/1436726/poll-are-you-impressed-with-the-r9-290x/0_40

Multiple choice!


----------



## VSG

Got my tracking number a min ago, shipped from CA.


----------



## Mr357




----------



## flopper

Quote:


> Originally Posted by *jerrolds*
> 
> Have any sites talked about how well the 290X overclocks? Is it purely a heat thing, meaning a WB or custom cooling solution is needed to unlock its potential - or do they have very little room for OCing? I think I read a couple that said that even at 100% fan speed it made no difference in overclocking.


1200mhz+ with proper cooling and voltage


----------



## famich

Quote:


> Originally Posted by *$ilent*
> 
> Gibbo said there will be soon, yes. He also said flashing the ASUS BIOS unlocks voltage too, IIRC.


Allright, let us wait and see ..


----------



## 250179

I have one in my hands right now, only $619. Pull the trigger or no?


----------



## wstanci3

Quote:


> Originally Posted by *gabsonuro*
> 
> I have one in my hands right now, only $619. Pull the trigger or no?


Do it for Bertha.


----------



## Slomo4shO

Quote:


> Originally Posted by *gabsonuro*
> 
> I have one in my hands right now, only $619. Pull the trigger or no?


I would recommend waiting for the release of the 290 next Thursday before making a purchase decision.


----------



## Sgt Bilko

Quote:


> Originally Posted by *gabsonuro*
> 
> I have one in my hands right now, only $619. Pull the trigger or no?


Well, i spent a bit more than that











Count me in here!!


----------



## VSG

A store charging for PayPal fees? Wow..


----------



## Forceman

Quote:


> Originally Posted by *Porter_*
> 
> you know i'm sort of talking out my ass here. i believe the cards ship with BF4 'Limited Edition' but i have no idea what that entails. all i see available for pre-order is the 'Digital Deluxe Edition' and 'Standard Edition'. don't know what we're getting with our cards but i assume it's the standard edition. all pre-orders get China Rising included, not sure if the game codes we get with our cards are considered pre-orders.


I'm assuming the 'Limited Edition' part refers to the card/game bundle, and not just the game. But I guess we'll find out in a week. Even if it doesn't include the DLC, bundle price+DLC is still cheaper than the pre-order prices.


----------






## Sgt Bilko

Quote:


> Originally Posted by *Forceman*
> 
> I'm assuming the 'Limited Edition' part refers to the card/game bundle, and not just the game. But I guess we'll find out in a week. Even if it doesn't include the DLC, bundle price+DLC is still cheaper than the pre-order prices.


Ok, Limited Edition:


and Deluxe edition:


Not that it really matters; so far as I can tell it's just the standard BF4.

Quote:


> Originally Posted by *geggeg*
> 
> A store charging for PayPal fees? Wow..


Yeah, they only just started using PayPal so I don't mind. Also, it's usually the cheapest place in Aus for PC stuff


----------



## Slomo4shO

Quote:


> Originally Posted by *geggeg*
> 
> A store charging for PayPal fees? Wow..


Why not? Businesses are charged for every credit card and PayPal transaction, and most incorporate these costs into the product pricing. Why not itemize these fees instead of adding them to the price of the product?


----------



## r3quiem

Just Picked Mine up from Canada Computers - Sapphire R9 290X BF4 Edition


----------



## Arizonian

Quote:


> Originally Posted by *geggeg*
> 
> Got my tracking number a min ago, shipped from CA.


Yea.....just got my tracking # too. It stayed in packaging for quite some time. Will be here tomorrow, guaranteed by end of day. I work anyway, so I'll be staring at the box until I'm off. Backed up my system just in case I need a clean install, coming from Nvidia drivers that give me issues.

PS Newegg sold out of the Sapphire BF4 editions.


----------



## Sgt Bilko

Quote:


> Originally Posted by *gabsonuro*
> 
> I have one in my hands right now, only $619. Pull the trigger or no?


I'm not gonna tell you how to spend your money......do you really want it or do you want to see how the 290 and 780 Ti play out?


----------



## Sgt Bilko

Quote:


> Originally Posted by *r3quiem*
> 
> Just Picked Mine up from Canada Computers - Sapphire R9 290X BF4 Edition
> -snip-


now that is one damn sexy looking card...........argh, why won't it post already!?!?!


----------



## tsm106

Quote:


> Originally Posted by *r3quiem*
> 
> Just Picked Mine up from Canada Computers - Sapphire R9 290X BF4 Edition


Can you crack it open to check the memory ICs?


----------



## DampMonkey

Quote:


> Originally Posted by *tsm106*
> 
> Can you crack it open to check the memory ics?


Yes!!! Void your warranty to soothe my impatience









I kid, I kid. But really, the guy over on OcUK who is playing with overclocks has said that he is seeing no difference between Elpida and Hynix, and that these reference release cards are clocking a lot better than the press pre-release cards


----------



## jerrolds

Quote:


> Originally Posted by *DampMonkey*
> 
> Yes!!! Void your warranty to soothe my impatience
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I kid i kid. But really, the guy over on oc.uk that is playing with overclocks has said that he is seeing no difference between elpida and hynix, and that these reference release cards are clocking a lot better than the press pre-release cards


How far is he getting? +100mhz? +200mhz? Please link


----------



## tsm106

Quote:


> Originally Posted by *jerrolds*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DampMonkey*
> 
> Yes!!! Void your warranty to soothe my impatience
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I kid i kid. But really, the guy over on oc.uk that is playing with overclocks has said that he is seeing no difference between elpida and hynix, and that these reference release cards are clocking a lot better than the press pre-release cards
> 
> 
> 
> How far is he getting? +100mhz? +200mhz? Please link

Dude, it's been posted all over the threads. Ek has seen 1200/1600 on stock volts under water, and easily they add. Gibbo at ocuk is getting 1250mhz core. Sugar, fill in my blanks please. Too many figures to juggle lately. These cards WILL overclock very well. It's just like Tahiti. Keep it cool and it will pay you back in mhz. Cash money!


----------



## $ilent

Tsm, Sapphire has Hynix.


----------



## DampMonkey

Quote:


> Originally Posted by *tsm106*
> 
> Dude, it's been posted all over the threads. Ek has seen 1200/1600 on stock volts under water, and easily they add. Gibbo at ocuk is getting 1250mhz core. Sugar, fill in my blanks please. Too many figures to juggle lately. These cards WILL overclock very well. It's just like Tahiti. Keep it cool and it will pay you back in mhz. Cash money!


To quote techpowerup:
Quote:


> AMD's Radeon R9 290X shows fantastic clock scaling with GPU voltage, better than any GPU I've previously reviewed. The clocks do not show any signs of diminishing returns, which leads me to believe that the GPU could clock even higher with more voltage and cooling.


It should be noted that heavy overclocking will not be for the faint of heart; TPU's system jumped from a 400W system pull to nearly 650W when playing with the voltage. Cooling beyond the reference fan is definitely recommended


----------



## r3quiem

Quote:


> Originally Posted by *tsm106*
> 
> Dude, it's been posted all over the threads. Ek has seen 1200/1600 on stock volts under water, and easily they add. Gibbo at ocuk is getting 1250mhz core. Sugar, fill in my blanks please. Too many figures to juggle lately. These cards WILL overclock very well. It's just like Tahiti. Keep it cool and it will pay you back in mhz. Cash money!


Putting it through its paces now, will update with what OC I can manage shortly


----------



## Arizonian

Quote:


> Originally Posted by *r3quiem*
> 
> Just Picked Mine up from Canada Computers - Sapphire R9 290X BF4 Edition
> 
> 
> Spoiler: Warning: Spoiler!


Added


----------



## Porter_

Yeah, most reviews are showing these as marginal overclockers.

Mine has been shipped


----------



## szeged

Anywhere in USA have more of these in stock?


----------



## Sgt Bilko

Thinking about slapping one of these on instead of the ref cooler.

http://www.pccasegear.com/index.php?main_page=product_info&cPath=207_314&products_id=17525

Opinions anyone?


----------



## flopper

Quote:


> Originally Posted by *Porter_*
> 
> yeah most reviews are showing these as marginal overclockers.
> 
> mine has been shipped


----------



## jerrolds

Oh - forgot to show my club card membership - SAPPHIRE Radeon R9 290X BattleField 4 Game Edition


Spoiler: Warning: Spoiler!


----------



## tsm106

Welp, the only consolation for being taxed in cali is that shipping takes 1 day. Can't wait for TGIF!

Ship date 10/24/2013
Estimated delivery 10/25/2013

Quote:


> Originally Posted by *flopper*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Porter_*
> 
> yeah most reviews are showing these as marginal overclockers.
> 
> mine has been shipped

Is el gappo still not using volts?


----------



## tsm106

Stock alert for MSI w BF4.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127768


----------



## Porter_

Quote:


> Originally Posted by *flopper*


that vaguely confirms my point.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Thinking about slapping one of these on instead of the ref cooler.
> 
> http://www.pccasegear.com/index.php?main_page=product_info&cPath=207_314&products_id=17525
> 
> Opinions anyone?


No-one?

Also, Aus still has 290x BF4 bundles if anyone down under was still looking









EDIT: nevermind all gone, restocks on the 1st Nov


----------



## starmanwarz

Is a 6-year-old Enermax Infinity 650W going to be enough for this card? My CPU is at 4.6GHz. I can't afford to change the PSU as well, so if it's not enough I'm going to have to pass


----------



## korruptedkaos

Quote:


> Originally Posted by *r3quiem*
> 
> Putting it Through its Paces now, Will update with what OC I could Manage shortly


Getting some cookies. Let us see it! Let us see it!


----------



## r3quiem

Just Finished Preliminary OC Test.


http://www.techpowerup.com/gpuz/6ueqq/

So Far:

1. I don't know why, but GPU-Z is reading PCIe 2.0; no matter what test I run, it doesn't seem to want to switch to 3.0.
2. The stock cooler really has to go; it takes a very long time to cool down after testing. Like every review said, it's no joke, this card runs hot, but it's good in the winter =P
3. I think it's easy to hit about 1160/1550 on the stock power limit, but to go beyond that I had to raise the power limit to 35%+
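On point 3, the PowerTune slider is a percentage offset on the card's stock board-power cap, so "+35%" means the card is allowed 1.35x its default power budget before it starts pulling clocks back. A rough sketch of the arithmetic (the 250 W stock cap below is a hypothetical placeholder for illustration, not a figure from this thread):

```python
def powertune_cap_watts(stock_cap_w, slider_pct):
    """PowerTune slider applies a +/- percentage to the stock board-power cap."""
    # Integer-friendly form: stock * (100 + pct) / 100
    return stock_cap_w * (100 + slider_pct) / 100

# Hypothetical 250 W stock cap:
print(powertune_cap_watts(250, 35))  # 337.5 W with the +35% slider
print(powertune_cap_watts(250, 50))  # 375.0 W at the +50% maximum
```

That's why the higher clocks only hold once the limit is raised: at the stock cap the card throttles long before the GPU itself gives out.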


----------



## $ilent

Nice, keep going!!

Also, that memory overclock... almost 7GHz?!
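For anyone doing the mental math on these memory numbers: GDDR5 is quad-pumped, so the "effective" figure reviews quote is four times the base clock GPU-Z shows. A quick sketch (the function name is just illustrative):

```python
def gddr5_effective_mhz(base_clock_mhz):
    """GDDR5 moves data at four times its base clock (quad data rate)."""
    return base_clock_mhz * 4

# r3quiem's 1625 MHz memory overclock:
print(gddr5_effective_mhz(1625))  # 6500 MHz effective -> "almost 7 GHz"
# Reference 290X memory, 1250 MHz:
print(gddr5_effective_mhz(1250))  # 5000 MHz effective
```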


----------



## Stay Puft

Quote:


> Originally Posted by *r3quiem*
> 
> Just Finished Preliminary OC Test. So Far:
> 
> 
> http://www.techpowerup.com/gpuz/6ueqq/


They still havent released a new GPUZ?


----------



## Ha-Nocri

Quote:


> Originally Posted by *r3quiem*
> 
> Just Finished Preliminary OC Test. So Far:
> 
> 
> http://www.techpowerup.com/gpuz/6ueqq/


How did you set the PowerTune settings? Do you have the means to detect throttling?


----------



## r3quiem

Quote:


> Originally Posted by *Stay Puft*
> 
> They still havent released a new GPUZ?


Not as far as I know. AMD doesn't even have the drivers for the 290X on their site; you have to use a disc, and I didn't have a CD drive, so I had to borrow a second PC just to transfer the files.
Quote:


> Originally Posted by *Ha-Nocri*
> 
> How did you set power tune settings? Do you have the means to detect throttling?


Doing it through AMD Overdrive right now. I'm allowing 100% fan speed just to see what I can do atm. It hits about 55-60% fan speed but keeps the card hovering at 87-92°C. However, the rest of my system is watercooled and I'm in a relatively cold basement; ambient temp is 20°C.


----------



## Dominican

Ordered mine, a SAPPHIRE 100361BF4SR:

Subtotal $579.99
Tax $0.00
Newegg 3Day $8.50
Rush Processing (Preferred Account) -$3.99
Rush Processing $3.99
Promo code -$29.00
Order Total $559.49

Mine has been shipped out already, very happy.


----------



## Ha-Nocri

Quote:


> Originally Posted by *r3quiem*
> 
> Not as far as I know, AMD Doesnt even have the drivers for 290x on their site you have to use a Disc and I didnt have a CD Drive had to borrow a second pc just to transfer files.
> Doing it through AMD Overdrive right now, Im allowing 100% Fan Speed just to see what I can do atm. Hits about 55-60% Fan Speed but keeps the Card hovering at 87-92 C. However the Rest of my System is watercooled and Im in relatively cold Basement Ambient Temp is 20 C


It probably isn't throttling, but you can check with Afterburner for example, setting its refresh time low - but you probably know that. Can you control voltage?


----------



## Sgt Bilko

Quote:


> Originally Posted by *r3quiem*
> 
> Just Finished Preliminary OC Test.
> 
> http://www.techpowerup.com/gpuz/6ueqq/
> 
> So Far:
> 
> 1. I dont know why but GPU-Z is reading PCIE 2.0 No Matter What Test I run it doesnt seem to want to switch to 3.0.
> 2. Stock Cooler Really has to go, Takes a very long time to cool down after testing. Like every review said its no joke this card runs hot but its good in the winter =P
> 3. I think its easy to hit about 1160/1550 on stock Power limit but to go beyond that I had to raise power limit to 35%+


I'm guessing GPU-Z hasn't included the 290X in its list yet.

AMD's drivers don't officially support it yet AFAIK.

Other than that.......not looking too bad; needs more cooling though, by the sounds of it


----------



## Stileprojekt

I am now a proud owner of the Limited Edition BF4 GPU. I purchased the MSI R9 290X off TigerDirect; it was $575.99, and I paid an extra $13.00 for next-day delivery.


----------



## r3quiem

So the highest I can push it without artifacting or resetting is 1189/1625; however, this is at the +50% power limit setting, which is the highest it will go. So depending on how well you can cool it, for now I think I will stick to 1160/1600, which runs fine at close to the stock power limit. That's all I got for tonight. Got to run, and good luck to everyone else. I hope you get a better sample than mine

















Quote:


> Originally Posted by *Ha-Nocri*
> 
> It probably isn't throttling, but you can check with After Burner for example, setting it's refresh time low, but you probably know that. Can you control voltage?


No direct control over voltage, just like Nvidia; not sure how it was with AMD's 79xx series. You can set a threshold of up to 50%, but no solid numbers.

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm guessing GPU-Z hasn't included the 290x in it's list yet.
> 
> AMD's driver don't officially support it yet afaik
> 
> other than that.......not looking too bad, needs more cooling though by the sounds of it


Yes, I really wish they did something about the stock cooler; it idles really high. Looks like this card is best liquid cooled or with an aftermarket cooler.


----------



## Arizonian

I was hoping the supporting drivers for the 290X would have been published by release. I'm sure the 13.11 beta isn't even optimized.


----------



## Dominican

Quote:


> Originally Posted by *Arizonian*
> 
> I was hoping by release the supporting drivers for 290X would have been published. I'm sure 13.11 beta isn't even optimized.


That's the problem with AMD drivers; I wish they could have done a better job.


----------



## szeged

My 290X has been stuck in the "packaging" phase all day; if it's that way tomorrow I'm just gonna cancel my order and retry when Newegg isn't scrambling around.

Also, they need to get more in stock, I wanna crossfire bench these.


----------



## Sgt Bilko

Quote:


> Originally Posted by *szeged*
> 
> 290x been stuck in "packaging" phase all day, if its that way tomorrow im just gonna cancel my order and retry again when newegg isnt scrambling around.
> 
> also, they need to get more in stock, i wanna crossfire bench these.


I just called my supplier and they are trying to have them all out the door within the next 2 hours. If they don't, then I have to wait till Wednesday... I hate waiting on things.


----------



## Forceman

Quote:


> Originally Posted by *szeged*
> 
> 290x been stuck in "packaging" phase all day, if its that way tomorrow im just gonna cancel my order and retry again when newegg isnt scrambling around.
> 
> also, they need to get more in stock, i wanna crossfire bench these.


Mine just flipped to Shipped about 30 minutes ago.


----------



## szeged

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I just called my supplier and they are trying to have them all out the door within the next 2 hours. if they don't then i have to wait till Wednesday...........I hate waiting on things


I just don't like paying for overnight shipping only to have to wait over a weekend, and probably another extra day or so, because they weren't prepared for the GPU's launch. If I had just paid for standard shipping, I wouldn't care how long it took.


----------



## Forceman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> No-one?
> 
> Also, Aus still has 290x BF4 bundles if anyone down under was still looking
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: nevermind all gone, restocks on the 1st Nov


It'll work well if it is compatible. But that's an old cooler so I don't know that it would actually fit. Did they ever make a version that supports the 7970? That's probably a better bet.


----------



## wstanci3

Just sold my Lightning(never used), was going to buy a 290x but dat jet engine...
I guess I will wait for 780 to drop down or the 780Ti.


----------



## szeged

Quote:


> Originally Posted by *wstanci3*
> 
> Just sold my Lightning(never used), was going to buy a 290x but dat jet engine...
> I guess I will wait for 780 to drop down or the 780Ti.


If you aren't against buying used cards, expect to see some hefty price drops on used 780s, I'm guessing.


----------



## Sgt Bilko

Quote:


> Originally Posted by *szeged*
> 
> i just dont like paying for overnight shipping only to have to wait over a weekend, and probably another extra day or so because they weren't prepared for the gpu's launch. If i had just paid for standard shipping, i wouldnt care if it took however long because of it.


They are only just getting the cards in store today, and they still have to box, label and then ship them. The capital here posts over weekends, but my local PO doesn't.


----------



## wstanci3

Quote:


> Originally Posted by *szeged*
> 
> if you arent against buying used cards, expect to see some hefty price drops on used 780s im guessing


I sold it on eBay. I think I got what I could on there. Let's hope he is a good buyer.








I thought it was prime time to drop it and not have that bad decision hanging over my head, LOL.
As for used cards, I haven't ever done that. When I want something, I generally buy new. I might take you up on your advice, though, especially since you are the king of ze Titan hunting on eBay.








Did you order a 290x?


----------



## Cool Mike

Liking the 290x.

I will wait for a custom 290x. Toxic, Lightning, Matrix. Hope to see something by Christmas.


----------



## Dominican

Newegg Promo: MBLEMC10G 29.00 Off 290X R9


----------



## wstanci3

Quote:


> Originally Posted by *Cool Mike*
> 
> Liking the 290x.
> 
> I will wait for a custom 290x. Toxic, Lightning, Matrix. Hope to see something by Christmas.


Don't mean to burst your bubble, but Tiny Tim Logan said not to expect custom 290Xs until 2014. Neither Asus nor Club3D is going to come out with any by Christmas, anyway.


----------



## szeged

Quote:


> Originally Posted by *Dominican*
> 
> Newegg Promo: MBLEMC10G 29.00 Off 290X R9


Newegg is teasing because they know they don't have any in stock.


----------



## DampMonkey

Quote:


> Originally Posted by *wstanci3*
> 
> Don't mean to burst your bubble, but Tiny Tim Logan said not to expect custom 290x's until 2014. Asus nor Club3d is not going to come out with any by Christmas, anyway.


Not completely true. A Sapphire rep popped into one of the other 290X threads to say their non-reference designs are definitely going to impress. He said they're coming in December, but because of NDA he couldn't give any more specifics.


----------



## DampMonkey

Quote:


> Originally Posted by *szeged*
> 
> newegg is teasing because they know they dont have any in stock


I used that code when I bought mine last night.


----------



## wstanci3

Quote:


> Originally Posted by *DampMonkey*
> 
> Not completely true. A sapphire rep popped in one of the other 290x threads to say their non-reference designs are definitely going to impress. He said theyre coming in december, but because of NDA couldnt give any more specifics


Ahh, that would be a welcome surprise. Linky Link please?


----------



## szeged

Quote:


> Originally Posted by *DampMonkey*
> 
> I used that code when i bought mine last night


I hope it lasts for a couple of days; I need to buy a second 290X for CrossFire but can't find any in stock.


----------



## Forceman

Quote:


> Originally Posted by *szeged*
> 
> i hope it lasts for a couple days, i need to buy a second 290x for crossfire but cant find any in stock.


It was 72 hours only - I think it expires tomorrow. And it's only if you use the mobile site/app.


----------



## szeged

Quote:


> Originally Posted by *Forceman*
> 
> It was 72 hours only - I think it expires tomorrow. And it's only if you use the mobile site/app.


Well, that's kinda lame. They'd better get some more in stock ASAP.


----------



## Scotty99

I know no one cares about WoW benches, but why has nvidia always performed better in this game:

http://www.techpowerup.com/reviews/AMD/R9_290X/24.html

I'm not in the market for the 290X, but if I find a 290 down the road for 4 bills or less I may pull the trigger. Problem is, WoW is my main game, sigh lol. I play other games, but the majority of my time at my PC is spent playing an MMO.

This is true of other Blizzard games like StarCraft, as well as other MMOs like SWTOR. Very interested to see how nvidia vs AMD plays out in WildStar.


----------



## Bartouille

I wonder if this reference cooler at 100% is louder than a Windforce x3 at 100%... I ran my Windforce x3 at 100% while gaming, and tbh it doesn't bother me. If I remember correctly the Windforce x3 fans were spinning at about 4k RPM; this reference cooler should be about 5k?
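As a rough rule of thumb, the standard fan affinity laws (not anything measured on these specific coolers) say aerodynamic fan noise scales with about the fifth power of RPM, so a 4k-to-5k RPM jump alone is worth a few dB even before you account for the blower vs open-fan design difference the replies mention. A quick sketch:

```python
import math

def fan_noise_delta_db(rpm_old, rpm_new):
    """Approximate change in noise (dB) from a fan speed change,
    using the fan affinity law: sound power ~ rpm^5,
    i.e. delta = 50 * log10(rpm_new / rpm_old)."""
    return 50 * math.log10(rpm_new / rpm_old)

# Hypothetical figures from the post: ~4000 RPM (Windforce x3)
# vs ~5000 RPM (reference blower)
print(round(fan_noise_delta_db(4000, 5000), 1))  # -> 4.8
```

So even ignoring the noisier character of a radial blower, the higher-RPM reference cooler would be expected to measure several dB louder at 100%.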


----------



## rdr09

Quote:


> Originally Posted by *Scotty99*
> 
> I know no one cares about WoW benches but why has nvidia always performed better in this game:
> 
> http://www.techpowerup.com/reviews/AMD/R9_290X/24.html
> 
> Im not in the market for the 290x but if i find a 290 down the road for 4 bills or less i may pull the trigger, problem is wow is my main game sigh lol. I play other games, but majority of the time at my pc im playing an MMO.
> 
> This is true with other blizzard games like starcraft, as well as other mmo's like swtor. Very interested to see how nvidia vs amd plays out in wildstar.


dude, discuss that here . . .

http://www.overclock.net/t/1431287/amd-r9-290x-290-pre-order-discussion-thread


----------



## DampMonkey

Quote:


> Originally Posted by *wstanci3*
> 
> Ahh, that would be a welcome surprise. Linky Link please?


Quote:


> Originally Posted by *VaporX*
> 
> Guys you have no idea how bad I want to give you specifics, I truly do. I am an enthusiast just like all of you so this kind of stuff really gets the blood pumping. SAPPHIRE will be introducing boards based on the R9 290 over the next few months. As for the time frame of seeing those solutions I have no information that I can currently share. What I can tell you is that SAPPHIRE is going to have our traditional custom solutions and these cards are going to rock.


----------



## DampMonkey

Quote:


> Originally Posted by *Bartouille*
> 
> I wonder if this reference cooler at 100% is louder than a windforce x3 at 100%... I ran my windforce x3 at 100% while gaming and tbh it doesn't bother me. If I remember correctly windforce x3 fans were spinning at about 4k rpm, this reference cooler should be about 5k?


From my experience, blowers are typically louder than top down coolers like the windforce, at 100% speeds anyway


----------



## Forceman

Quote:


> Originally Posted by *Bartouille*
> 
> I wonder if this reference cooler at 100% is louder than a windforce x3 at 100%... I ran my windforce x3 at 100% while gaming and tbh it doesn't bother me. If I remember correctly windforce x3 fans were spinning at about 4k rpm, this reference cooler should be about 5k?


I'm going to go out on a very big limb and say yes, yes the reference card is going to be louder at 100%. Those blower fans make a lot of noise.

And is that the right way to mangle that saying? By "a big limb" I mean I think there is no way that statement is wrong, because that's what I was going for: the reference cooler is definitely going to be louder.


----------



## wstanci3

Quote:


> Originally Posted by *DampMonkey*


Very interesting.








But I don't know if a "few months" translates into December. IF they can deliver a custom solution in December, then I'd hold off on my purchase of a 780 or 780 Ti, depending. GPU launches are always exciting.


----------



## youra6

An R9 290 at $450 would be a killer deal. I ordered 2x 290X BF4 Edition, but now I am having second thoughts...

I spent almost $300 more on two cards that are only ~5% faster.


----------



## VSG

Ya, I have a feeling the 290:290X will be similar to 780:Titan. I was thinking of possibly picking up another 290X, but I may well do a 290 and flash the 290X BIOS, given that OCUK says it is possible. I don't have a lot of regrets about getting this first 290X though; between the Newegg discount code and selling off BF4, I will end up having the card for ~$500-520.


----------



## nemm

After reading over on OCUK about the 290 and the BIOS flash, I am also wondering whether a 290 would have been the wiser choice, but we shall see.
My brother-in-law was about to pull the trigger and pre-order, but backed out until 290 info is released. He said that should he order the 290, I'd be able to tinker with it, perhaps even try to pair it with my 290X should CrossFire be possible. If that works, and depending on the performance results, I may be running 290X/290 in the future, with the money saved going to water blocks for the 290.
We shall see in November and I will be sure to post my findings.


----------



## Forceman

Quote:


> Originally Posted by *youra6*
> 
> R9 290 at 450 would be a killer deal. I ordered 2x 290x BF4 Edition but now I am having second thoughts...
> 
> Spent almost 300 more on two cards that is only ~5% faster.


Well, if you factor in $50 for BF4, you effectively paid $529 for the first card, so assuming $459 for the 290 that's a $70 difference. Then you sell the other BF4 code for $40, and the second card is only an $80 difference. So really, you only paid $150 extra for the pair (or $170 if the 290 is $449).
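That back-of-the-envelope math can be sketched as a quick calculation. All prices here are the thread's assumptions (a $579 290X BF4 card, $50 value for the BF4 copy you keep, $40 resale for the spare code), not confirmed figures:

```python
# Extra cost of 2x 290X BF4 over 2x plain R9 290, per Forceman's reasoning.
PRICE_290X_BF4 = 579   # assumed price paid per 290X BF4 edition
BF4_VALUE_KEPT = 50    # value of the BF4 copy you keep yourself
BF4_VALUE_SOLD = 40    # resale value of the second, spare BF4 code

def extra_cost(price_290):
    # Card 1: you keep the BF4 copy, so its effective price drops by $50.
    card1 = (PRICE_290X_BF4 - BF4_VALUE_KEPT) - price_290
    # Card 2: you sell the spare BF4 code, recouping $40.
    card2 = (PRICE_290X_BF4 - BF4_VALUE_SOLD) - price_290
    return card1 + card2

print(extra_cost(459))  # -> 150 (if the 290 lands at $459)
print(extra_cost(449))  # -> 170 (if the 290 lands at $449)
```

Either way, the premium for the pair comes out well under the ~$300 sticker difference once the bundle value is counted.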


----------



## selk22

Mine flipped to shipped sometime today and it says estimated tomorrow as delivery day!

Yes!


----------



## binormalkilla

Ordered an MSI 790X BF4 edition with overnight shipping. According to my tracking info it will arrive tomorrow. I was kind of worried I ordered too late. I wanted to order a second one for Crossfire, but they only had BF4 editions left.

Does anyone have any info on the MSI model? What memory ICs, VRMs, etc.?


----------



## RocketAbyss

Quote:


> Originally Posted by *binormalkilla*
> 
> Ordered an MSI *790X* BF4 edition with overnight shipping.


AMD made a new card so fast? Dangit I should have waited!


----------



## Lord Xeb

Where the hell do you buy these things?


----------



## MIGhunter

I got the email that mine shipped. Where do I see the estimated delivery date? When I click on my tracking number, it says it's not a valid tracking number. Anyway, here's my invoice:

http://s279.photobucket.com/user/botdphotos/media/gpu_zps9350b222.png.html


----------



## selk22

Quote:


> Originally Posted by *Lord Xeb*
> 
> Where the hell do you buy these things?


Newegg for me at midnight


----------



## binormalkilla

Quote:


> Originally Posted by *RocketAbyss*
> 
> AMD made a new card so fast? Dangit I should have waited!


I have my sources


----------



## Porter_

Quote:


> Originally Posted by *MIGhunter*
> 
> I got my email that mine shipped. Where do I see the estimated delivery date? When I click on my tracking number, it says it's not a valid tracking number. Anyway, here's my invoice
> 
> http://s279.photobucket.com/user/botdphotos/media/gpu_zps9350b222.png.html


It takes a bit for FedEx to register your delivery and input it into their system. Give it 20-30 minutes and it should show up.


----------



## Tatakai All

Nice read browsing through. One point I picked up on is that I might want to wait and check out the 290 before going all in on a 290X, especially since the BIOS is flashable.


----------



## Lord Xeb

I gotta wait now.


----------



## MIGhunter

eww, updated, estimated arrival date is Tuesday....


----------



## Dominican

My estimated delivery is on the 29th; wish it was tomorrow :sad smiley


----------



## Dominican

Quote:


> Originally Posted by *MIGhunter*
> 
> eww, updated, estimated arrival date is Tuesday....


Mine too.


----------



## c0ld

At least you got one. I slept through it and couldn't order one.


----------



## fleetfeather

There are options other than Newegg still available, even at the same price. The downside is you'll have to wait longer for shipping.


----------



## Arizonian

Quote:


> Originally Posted by *c0ld*
> 
> At least you got one I slept it through and couldnt order one


Newegg sold out of all the BF4 limited editions except ASUS, which is still pending at 'coming soon'.

The only cards left after that which are still showing '*coming soon'* without the BF4 bundle are ASUS, Sapphire, MSI and Gigabyte, which is on its own separate *list* for some reason. These cards have not been released for sale as of yet. Still a chance.


----------



## c0ld

Quote:


> Originally Posted by *Arizonian*
> 
> Newegg sold out all the BF4 limited editions except ASUS which is still pending to 'coming soon'.
> 
> The only cards left after that which are still showing '*coming soon'* without BF4 bundle is ASUS, Sapphire, MSI and Gigabyte which is on it's separate *list* for some reason. These cards have not been released for sale as of yet. Still a chance.


Yeah, I'm waiting for that one, but tax is gonna kill the price. I got an XFX from TigerDirect, but they didn't want to ship to my alternate address because I am not living at my billing address, so I went ahead and canceled the order.


----------



## leyzar




----------



## kot0005




----------



## stn0092

Are there any left in stock anywhere in the US?


----------



## Arizonian

Quote:


> Originally Posted by *kot0005*
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats, added to the list.









Let us know your thoughts after you've gamed with it.


----------



## TheRoot

Quote:


> Originally Posted by *leyzar*


lol nvi bought this


----------



## kot0005

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats added to list.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Let us know what you think after you game with it and any thoughts


When I get home in 1 hr. I ordered the EK block as well; should get them as soon as next Friday.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kot0005*


Yours come from PCCG?
Quote:


> Originally Posted by *kot0005*
> 
> When i Get home in 1 hr ordered ek block as well should get them as soon as next Friday.


EK Full Cover VGA Block EK-FC R9-290X Acetal Nickel

ETA: 07/11/13


----------



## kot0005

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yours come from PCCG?
> EK Full Cover VGA Block EK-FC R9-290X Acetal Nickel
> 
> ETA: 07/11/13


Yea Gpu from pccg, blocks from EKWB web store.

PORNO


----------



## kcuestag

Waterblocks are actually in stock at EK's website, I might buy the clear version.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kot0005*
> 
> Yea Gpu from pccg, blocks from EKWB web store.
> 
> PORNO
> 
> -Snip-


Damn nice photos, very comprehensive.

Mine ships out on Monday now; waiting for an Accelero Xtreme to ship with it.

If that fails, then... waterblock.


----------



## Arizonian

*@kot0005*

What cable did that come with out of curiosity?

Wish Sapphire sold optional back plates with their reference cards. I've become spoiled with EVGA and like seeing it for aesthetics mostly in my rig.


----------



## kot0005

1.5M High speed HDMI cable


----------



## leyzar

Hey guys, did you get a chance to look at this?
Sounds very promising; I'm getting excited again.


----------



## FlyingSolo

Does anyone know if Sapphire has a UK RMA service, or even one in the EU, just in case I have to RMA my card? Also, if it has to be sent outside the UK/EU for RMA, will I be charged customs fees etc. once I get the card back from outside the EU?


----------



## $ilent

Quote:


> Originally Posted by *FlyingSolo*
> 
> Does anyone know if Sapphire has a UK RMA service or even in EU just in case if i have to RMA my card. Also say if it's outside of UK,EU by sending it back for RMA will i be charged custom fee's etc once i get back the card form outside of EU


I wouldn't have thought you'd get charged customs fees if you send it outside the UK; firstly because I'm not sure you pay customs fees on stuff from inside Europe, just maybe from the likes of the USA. Don't quote me on that.

Secondly, if you did RMA it, you're not buying a whole new GPU; you will have already paid VAT on the GPU when you first bought it.


----------



## MIGhunter

http://www.ekwb.com/shop/kits-cases/kits/ek-kit-h3o-360-hfx.html

Is that a good kit, or is it better to go with individual parts? I remember HeatKiller used to be the CPU block to get. I'm not really going to go water atm, but maybe in the future if temps are too high for my liking. Translation: the wife didn't like the money spent on the GPU, so I have to wait to spend more.


----------



## FlyingSolo

Quote:


> Originally Posted by *$ilent*
> 
> I wouldnt have thought youd get charged customs fees if you send it outside the UK, firstly because im not sure you pay customs fees on stuff form inside Europe, just maybe from like USA? Dont quote me on that.
> 
> Secondly if you did RMA it, your not buying a whole new gpu, you will have already paid VAT on the gpu when you first buy it.


Yeah, you don't get customs charges if you buy anything from the EU. Thanks, plus +rep.


----------



## $ilent

Also, I've asked Gibbo what his policy is on Sapphire warranty. Did you buy a Sapphire too?


----------



## FlyingSolo

Quote:


> Originally Posted by *$ilent*
> 
> Also ive asked Gibbo what his policy is on Sapphire warranty, did you buy a sapphire too?


I still haven't bought one yet. Just thinking whether it's a good idea to wait for the custom coolers to arrive, since I won't be water cooling any time soon.


----------



## leyzar

Quote:


> Originally Posted by *FlyingSolo*
> 
> I still haven't bought one yet. just thinking if it's a good idea to wait for the custom cooler to arrive. Since i wont be water cooling any time soon


ToT and LTT mentioned they will have custom designs very soon. Also, the 290 shows very good OC headroom, which can indicate custom designs. At this point it may be the wise move to wait.


----------



## FlyingSolo

Quote:


> Originally Posted by *leyzar*
> 
> ToT and LTT mentioned they will have custom designs very soon. Also, the 290 shows very good OC headroom, which can indicate custom designs. At this point it may be the wise move to wait.


That's what I'm thinking, since I'm not gonna water cool any time soon. I will be going CrossFire, though not straight away. But then there's the problem of heat in the case with custom coolers, unless someone comes out with a good blower design, if they ever do, that is.


----------



## $ilent

Add me please


----------



## szeged

still no stock of these anywhere in the USA, ughgfuhgfhghuduhsuhhssassasaaa


----------



## FlyingSolo

$ilent, please do some 1440p game benchmarks. Are you gonna water cool the card?


----------



## szeged

Quote:


> Originally Posted by *$ilent*
> 
> Add me please


Do you mind doing a Unigine Valley bench at 1920x1080 on the Extreme HD preset, at stock clocks and with as much of an overclock as you can manage?


----------



## RocketAbyss

Quote:


> Originally Posted by *szeged*
> 
> do you mind doing a unigine valley bench at 1920x1080 on extreme HD preset at stock clocks and with as much of an overclock you can manage?




I don't have a Valley run, but here's a Heaven bench. This was done last night when I first got my PowerColor 290X. Clocks were 1030MHz core / 1250MHz memory out of the box, and I have not tried to OC the card yet. This is with an FX-8350 at 4.8GHz.


----------



## szeged

Quote:


> Originally Posted by *RocketAbyss*
> 
> 
> 
> I don't have a valley one but a heaven bench. This was done last night when I first got my PowerColor 290X. Clocks were 1030Mhz Core /1250Mhz Mem out of the box and I did not try and OC the card yet. This is with an FX 8350 at 4.8Ghz


thanks







make sure to post it in http://www.overclock.net/t/1436635/ocn-gk110-vs-hawaii-bench-off-thread so we can get as many results as possible


----------



## RocketAbyss

Quote:


> Originally Posted by *szeged*
> 
> thanks
> 
> 
> 
> 
> 
> 
> 
> make sure to post it in http://www.overclock.net/t/1436635/ocn-gk110-vs-hawaii-bench-off-thread so we can get as many results as possible


I will get a proper one done later when I'm back from work + dinner, instead of a screenie taken with my phone lol.


----------



## $ilent

Quote:


> Originally Posted by *FlyingSolo*
> 
> $ilent please do some 1440p game benchmarks. Are you gonna water cool the card


Will do some BF3 gaming and yeah gonna WC it.
Quote:


> Originally Posted by *szeged*
> 
> do you mind doing a unigine valley bench at 1920x1080 on extreme HD preset at stock clocks and with as much of an overclock you can manage?


I could do, but I'm at 1440p, so not sure how it would work? Downscale maybe?


----------



## szeged

Quote:


> Originally Posted by *$ilent*
> 
> Will do some BF3 gaming and yeah gonna WC it.
> I could do, but im at 1440p so not sure how it would work? downscale maybe?


There is an option for 1920x1080 in it; it'll set everything for you automatically.

I run 7680x1440, and when I do it at 1920x1080 it just sets it and goes.


----------



## selk22

Well, time for sleep... tomorrow a 290X will be in my hands...

580 1.5GB to 290X, lol. Big jump; should be nice.


----------



## $ilent

Quote:


> Originally Posted by *szeged*
> 
> there is an option for 1920x1080 on it automatically, itll set everything for you.
> 
> i run 7680x1440 and when i do it at 1920x1080 it just sets it and goes


Ok, I'll let you know how I get on.

A slight quirk though: with my 290X installed, whenever a new window pops up in Windows, e.g. a message from UAC about opening a programme, my screen goes black and it takes like 5 seconds to come back. Seems very sluggish to me.


----------



## szeged

Got all your drivers updated?


----------



## kot0005

Now, does anyone know how to make an "AMD Radeon R9 290X" LED logo? I wish they had one like the Nvidia cards do. I am going to use my GPU fan header to power it, since I will be installing a block anyway.


----------



## RocketAbyss

Quote:


> Originally Posted by *$ilent*
> 
> Ok ill let you know how I get on.
> 
> A slight quirk though, with my 290x installed whenever a new window pops up in windows, i.e a message from the UAC about opening a programme, my screen goes black and it takes like 5 seconds to come back. Seems very sluggish to me.


I experienced that as well. A reboot fixed the issue, weirdly, but I dunno if it'll come back. Will find out once I'm back home from work in a few hours. It could be a driver problem, as AMD hasn't released an official/beta driver that actually states it supports the 290X; 13.11 Beta3 only goes up to the 280X iirc.


----------



## Arizonian

Just to make members aware here - we have Titan owners wanting a bench off in the graphic card section.

GK110 vs Hawaii Bench Off Thread
http://www.overclock.net/t/1436635/ocn-gk110-vs-hawaii-bench-off-thread

Personally, as a moderator who has been dealing with members on both sides who can't follow the TOS when posting, it's been the worst few days I've had since Titan released, trying to keep it respectable. It's been tough to even be happy about a new GPU I'm getting when one can't post something without bad sentiment from members who don't share the same enthusiasm.

So for any of the 290X owners that would like to join a civil bench off please feel free to join Alatar and the many Titan owners that enjoy such a challenge there to discuss and bench.

I myself don't care much, as I'm a gamer, not a bencher. I will see what mine has under the hood and try to contribute in OCN spirit. I enjoy watching the scores, not for the competition but to learn how to push these cards and get the most out of them, maximizing FPS for gaming.

This feels like Fermi all over again, except this time I'm running a Hawaii chip. Back then I was on the Fermi side, taking the cracks about cooking eggs and power draw. Ironically, I'm now on the Hawaii side, and the same people who ran Fermi are now sharing that same sentiment; only the shoe has changed feet.

It would be nice if AMD would come out with a final release driver rather than a beta, one that actually supports the 290X; 13.11 currently does not, if you read the specs.

Good luck to our members who do take the challenge seriously. Have fun.


----------



## $ilent

I've tried removing all my nvidia drivers, installing the AMD driver from their website, and installing the driver CD with CCC on it; it's still doing it. Gonna try removing CCC and just installing the website driver.

Is anyone else installing CCC, or just the driver?


----------



## Tobiman

Quote:


> Originally Posted by *MIGhunter*
> 
> http://www.ekwb.com/shop/kits-cases/kits/ek-kit-h3o-360-hfx.html
> 
> Is that a good kit or is it better to go with individual parts? I remember HeatKiller used to be the CPU block to get. I'm not really going to go water atm but maybe in the future if temps are too high for my liking. Translation: wife didn't like the money spent on the GPU so have to wait to spend more


That's a great kit. The best CPU waterblock might only get you an extra 1 or maybe 2 degrees Celsius, so it's not really worth the hassle.


----------



## Arizonian

Quote:


> Originally Posted by *$ilent*
> 
> ive tried removing all my nvidia drivers, installing amd driver from their website, installing the driver CD with CCC on, still doing it. Gonna try removing CCC and just installing the website driver.
> 
> Anyone else installing CCC or just the driver?


Have you tried the how-to-uninstall-Nvidia-drivers thread found in the OP's second post? It has helped others in the past and might still be useful for making sure there are no traces left anywhere.


----------



## FlyingSolo

$ilent, do a clean install of Windows and see if that helps.


----------



## RocketAbyss

Quote:


> Originally Posted by *$ilent*
> 
> ive tried removing all my nvidia drivers, installing amd driver from their website, installing the driver CD with CCC on, still doing it. Gonna try removing CCC and just installing the website driver.
> 
> Anyone else installing CCC or just the driver?


I installed from the CD provided in the box, which turned out to be 13.11 Beta3/5 <-- not too sure if it's 5. I could not get the downloaded one from the AMD site working, so I was forced to use the CD lol. But it worked, so I guess we can only wait for an official one, hopefully soon, before Battlefield 4 launches in 4 days' time.


----------



## $ilent

Quote:


> Originally Posted by *Arizonian*
> 
> Have you tried the how to uninstall Nvidia drivers thread found on the OP second post? It helped others in the past and still might be useful in making sure there is no traces anywhere.


I will try that thanks
Quote:


> Originally Posted by *RocketAbyss*
> 
> I installed from the CD provided in the box, which turned out to be 13.11 Beta3/5<---not too sure if its 5. I could not get a downloaded one from the AMD site working so I was forced to use the CD lol. But it worked so I guess we can only wait for an official one, hopefully soon before Battlefield 4 launches in 4 days time


Does your screen go black and lag whenever you try to do anything remotely taxing?


----------



## RocketAbyss

Quote:


> Originally Posted by *$ilent*
> 
> I will try that thanks
> Does your screen go black and lagg whenever you try do anything remotely taxing?


On the very first reboot after installing the drivers, yes. As soon as I started BF3 or the Heaven bench, the computer would freeze and go black, like you mentioned.

After I did another hard reboot, none of those problems resurfaced, though. I might have to try it again later. I can very much imagine it's just a lack-of-proper-driver problem right now.


----------



## $ilent

Well, a reboot didn't fix it... BF3 still crashes and the screen goes black. Will try removing the NV drivers.


----------



## $ilent

Well, that worked; removing all the NV stuff has allowed me to get into BF3. But I'm still getting the annoying click-any-programme-and-the-screen-blanks-for-5-seconds thing.

Anyway, here are my BF3 results. I did a 5-minute benchmark on 64-player Kharg Island with the Ultra preset, 4x AA. Note that in GPU-Z, all sensor readings show the max values.


----------



## Dominican

Quote:


> Originally Posted by *Porter_*
> 
> it takes a bit for FedEx to register your delivery and input it into their system. give it 20-30 minutes and it should show up.


Did you not use the promo?


----------



## $ilent

Arizonian, am I allowed to join the club?


----------



## RocketAbyss

Quote:


> Originally Posted by *$ilent*
> 
> Well that worked, removing al NV stuff has allowed me to get into BF3. But im still getting the annoying, click any programme your screen crashes for 5 seconds thing.
> 
> Anyway heres my BF3 results, did a 5 min benchmark on 64 player kharg island with Ultra preset, 4AA. Note in gpuz, all sensor results show the max results.


This was on 1440p?


----------



## $ilent

that was on 1440p yes.


----------



## MIGhunter

Quote:


> Originally Posted by *Dominican*
> 
> did you not use promo ?


I didn't know about the promo. Also, I ordered from a PC at work, because mobile is forbidden.


----------



## $ilent

Well, I removed all the ATI/AMD folders and all the NV drivers, rebooted, reinstalled the CD driver, rebooted, and I'm still getting the same issue with my screen hanging every time I do anything. Also, if I'm in BF3 and then minimise to Windows, I get kicked from the game or it crashes.

So for now my 290X is largely unusable.


----------



## Ha-Nocri

Quote:


> Originally Posted by *$ilent*
> 
> Well, I removed all the ATI/AMD folders and all the NV drivers, rebooted, reinstalled the CD driver, rebooted, and I'm still getting the same issue with my screen hanging every time I do anything. Also, if I'm in BF3 and then minimise to Windows, I get kicked from the game or it crashes.
> 
> So for now my 290X is largely unusable.


It looks like a driver problem. Maybe a Windows reinstall would be best...


----------



## $ilent

I'm gonna leave it; can't be arsed reinstalling Windows, to be perfectly honest. I'll just wait for a real driver release and not a beta.


----------



## RocketAbyss

Quote:


> Originally Posted by *$ilent*
> 
> Im gonna leave it, cant be arsed reinstalling windows to be perfectly honest. Ill just wait for a real driver release and not a beta.


Yeah, AMD needs to get an actual driver out now, especially for their new flagship card. It's honestly disappointing that we don't get official day-one drivers for a brand new card on the market.


----------



## $ilent

Yep, true.

Just tried running the Heaven benchmark and it crashes instantly... I assume the reason there are no submitted 290X benchmarks on HWBot is because nobody can actually run them.


----------



## PCModderMike

Hope your experiences so far aren't a sign of things to come...


----------



## sugarhell

You can try the uninstall method on my sig rig.

Also, you can try this one to clean your PC of NVIDIA leftovers:

http://forums.guru3d.com/showthread.php?t=379505


----------



## $ilent

Well, I'm confused...

I got Heaven DX11 running and it was running fine. I left my PC, came back, and the benchmark is gone and it's asking me to take a screenshot. But my Heaven result screen is gone? Is this normal?


----------



## sugarhell

http://forums.guru3d.com/showpost.php?p=4683564&postcount=141

Check this out, guys.


----------



## $ilent

Nvm, I got it: number 1 in 290X for Heaven, haha! It's like 300+ worldwide though, lol.

http://hwbot.org/hardware/videocard/radeon_r9_290x/

Is there a block on us making HWBot submissions for the 290X or something? Nobody else has done any...


----------



## RocketAbyss

Quote:


> Originally Posted by *sugarhell*
> 
> http://forums.guru3d.com/showpost.php?p=4683564&postcount=141
> 
> Check this guys


Awesome news! Thanks for informing us!

Can't wait for the new driver to hit so these launch day issues can be settled


----------



## sugarhell

Also downsampling works again with a different method.


----------



## Scotty99

Wait a second, AMD didn't have a driver for the 290X on release day... seriously?


----------



## sugarhell

http://kingpincooling.com/forum/showthread.php?t=2473

Hmm, GPU Tweak with up to 2V on the core


----------



## devilhead

Quote:


> Originally Posted by *sugarhell*
> 
> http://kingpincooling.com/forum/showthread.php?t=2473
> 
> Hmm, GPU Tweak with up to 2V on the core


and http://www.mediafire.com/?voj4j1rlk0ucfz4 really showed that my 7970 has Hynix







it works!


----------



## RocketAbyss

Quote:


> Originally Posted by *devilhead*
> 
> and http://www.mediafire.com/?voj4j1rlk0ucfz4 really showed that my 7970 has Hynix
> 
> 
> 
> 
> 
> 
> 
> it works!


Gonna try it on my PowerColor when I'm home in an hour or two.


----------



## $ilent

Damn, 2V... that's sick!

Anyone tried using manual fan control? 5000 RPM is crazy; it genuinely can't be safe to run the fan like that. It sounds like a hair dryer.


----------



## narmour

Ordered mine! Will be with me tomorrow. I will be happy to see the 5870 move to one of my lesser systems!


----------



## $ilent

Went with the £500 one, eh? Nice.


----------



## szeged

ughhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhhh USA still doesnt have any more in stock....comeeeeeee onnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnnn


----------



## Arizonian

Quote:


> Originally Posted by *$ilent*
> 
> Well that worked, removing al NV stuff has allowed me to get into BF3. But im still getting the annoying, click any programme your screen crashes for 5 seconds thing.
> 
> Anyway heres my BF3 results, did a 5 min benchmark on 64 player kharg island with Ultra preset, 4AA. Note in gpuz, all sensor results show the max results.
> 
> 
> Spoiler: Warning: Spoiler!


Welcome aboard


----------



## $ilent

Thanks arizonian.

Do you know why nobody is submitting any HWBot benchmarks for the 290X? There's no block on it, is there?


----------



## kcuestag

$ilent that BF3 benchmark can't be right if it's at 1080p, please tell me it's at 1440p.









Also, 94°C, OMG! I'm hoping I get my waterblock soon; I don't want to have to play on the air cooler for more than a day or two.


----------



## sugarhell

Quote:


> Originally Posted by *kcuestag*
> 
> $ilent that BF3 benchmark can't be right if it's at 1080p, please tell me it's at 1440p.


Yeah, it's 1440p.


----------



## $ilent

It's 1440p, alright.









Can't wait for the new driver tomorrow; this screen flashing is doing my head in, heh!

Putting up another HWBot submission. C'mon guys, don't let me do them all by myself.


----------



## RocketAbyss

I'm still out, lol. It's almost 10pm here, but I will be back home soon to play with the 290X more, maybe even try pushing some OC on the card without melting my case.


----------



## skupples

Quote:


> Originally Posted by *$ilent*
> 
> Thanks arizonian.
> 
> Do you know why nobody is submitting any HWBot benchmarks for the 290X? There's no block on it, is there?


Strange, I was under the impression the 290X had already set multiple 4-way CrossFire records on the bot... they were linked in the T.O.C. last night...


----------



## Kaapstad

Can I join the R9 290X club?





Asus R9 290Xs

Using stock coolers for the next couple of weeks until I can get waterblocks on them.


----------



## $ilent

Quote:


> Originally Posted by *skupples*
> 
> Strange, I was under the impression 290x had already set multiple 4way x-fire records on the bot... they were linked in T.O.C. Last night...


There's only me and one other person doing them, about 4 tests out of like 20 done.


----------



## szeged

Quote:


> Originally Posted by *Kaapstad*
> 
> Using stock coolers for the next couple of weeks until I can get waterblocks on them.


house fire waiting to happen


----------



## Nevk

Quote:


> Originally Posted by *Kaapstad*
> 
> Can I join the R9 290X club
> 
> Asus R9 290Xs
> 
> Using stock coolers for the next couple of weeks until I can get waterblocks on them.










.....want


----------



## wstanci3

Oh that beloved 590...

*OT, to the 290x owners, what are your initial impressions so far?*


----------



## jerrolds

I ordered mine at 11:30 PM CST on the 23rd from Newegg.ca and it left LA last night at 8:30 PM.







Express International is 2 days, which means it'll arrive on Monday, argh.


----------



## Arizonian

Quote:


> Originally Posted by *Kaapstad*
> 
> Can I join the R9 290X club
> Quote:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Asus R9 290Xs
> 
> Using stock coolers for the next couple of weeks until I can get waterblocks on them.
Click to expand...

Very nice. Congrats. Added.









I'm sure you're going to be one of the members who squeezes the most juice out of these cards. I've seen what you can do in the benches with the cards that come into your possession. We await your info & feedback. Good luck.

*On a side note:* For members who add water blocks, once you've got them let me know I'll change status from stock to 3rd party water on the roster.


----------



## $ilent

Best I can manage on stock voltage in the 3DMark 11 Performance test.


----------



## stn0092

Could someone raise the fan speed above the 55% max default and report back on temperatures?


----------



## szeged

any luck on memory overclocking?


----------



## $ilent

Quote:


> Originally Posted by *stn0092*
> 
> Could someone raise the fan speed above the 55% max default and report back on temperatures?













Quote:


> Originally Posted by *szeged*
> 
> any luck on memory overclocking?


Will try that now.


----------



## provost

Quote:


> Originally Posted by *Kaapstad*
> 
> Can I join the R9 290X club
> 
> 
> 
> 
> 
> Asus R9 290Xs
> 
> Using stock coolers for the next couple of weeks until I can get waterblocks on them.


You are probably the only person I'm aware of who has four 290Xs and 4-way Titan SLI, as well as quad 690s.
Would be interesting to see some comparative benches, even if the 290Xs are still on air.

At least I know that you can bench, and we will get a better picture than from some of the hopeless reviews I have been seeing.


----------



## rdr09

Quote:


> Originally Posted by *$ilent*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Will try that now.


+rep. I am sure that card is not $ilent at 100%.

Your graphics score is higher than my 7950/7970 CrossFire at stock.


----------



## $ilent

Honestly, it's so loud it's untrue, and when you put your hand at the back of the card's exhaust it genuinely feels like a hair dryer.


----------



## sugarhell

Quote:


> Originally Posted by *$ilent*
> 
> Honestly, it's so loud it's untrue, and when you put your hand at the back of the card's exhaust it genuinely feels like a hair dryer.


It's a hair dryer.


----------



## KingT

Quote:


> Originally Posted by *$ilent*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Best I can manage on stock volt on 3dmark11 performance test


Your 3DMark 11 GPU score with the OC'd 290X is just under my GPU score with two Asus HD 7950 TOP cards @ stock (900/1250MHz).

http://www.3dmark.com/3dm11/7331740

Not bad at all for a single GPU







, but I still get the better price/performance ratio, silence and less heat.









Thanx for posting









CHEERS..


----------



## jerrolds

FYI: I contacted Arctic Cooling to ask if their current Accelero Hybrid will work on the R9 290X. Their rep Eric Abellada [email protected] said no, it wouldn't:
Quote:


> Eric: No it will not because we will not be updating the Hybrid 7970 anymore. We will however, update the existing Accelero Hybrid and Xtreme III together with other new products we will introduce to support the R9-290X. We will keep you posted.


But I get the feeling they want to push more product; I'd love for someone with a Hybrid on hand to at least measure them up if possible.

*Update*: it looks like the Hybrid may actually be compatible after all: http://www.computerbase.de/news/2013-10/arctic-und-prolimatech-unterstuetzen-radeon-r9-290x/
Quote:


> Google Translated: The current top models among both Arctic's and Prolimatech's GPU coolers are compatible with AMD's Radeon R9 290X. The manufacturers confirmed this to ComputerBase on request.


----------



## skupples

At 63°C, temp doesn't seem like your limiting factor... think it's volts or watts?


----------



## $ilent

Best core and memory overclock I can manage thus far on stock voltage:



Quote:


> Originally Posted by *skupples*
> 
> @ 63c temp doesn't seem like your limiting factor.. Think it's volts or watts?


Volts, definitely. Gibbo from OCUK managed to push his memory to 6600MHz and core to 1220MHz with an ASUS BIOS flash.
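Worth noting for anyone comparing numbers: the thread mixes base memory clocks ("1250MHz" in GPU-Z) and GDDR5 effective data rates ("6600MHz" in Gibbo's result). A minimal sketch of the conversion; the 4x GDDR5 multiplier is the only assumption here:

```python
def effective_mhz(base_mhz: int) -> int:
    """GDDR5 effective (data-rate) clock from the base memory clock."""
    # GDDR5 transfers data four times per memory-clock cycle
    return base_mhz * 4

def base_mhz(effective: int) -> int:
    """Base memory clock from a GDDR5 effective clock."""
    return effective // 4

print(effective_mhz(1250))  # stock 290X memory: 5000 MHz effective
print(base_mhz(6600))       # 6600 MHz effective works out to 1650 MHz base
```

So Gibbo's 6600MHz is the same setting a GPU-Z user would report as 1650MHz.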


----------



## Sk1llS




----------



## Newbie2009

LOL, that's insanely loud.


----------



## $ilent

Also power usage for anyone interested:

290x/3770k idle - 70 watts
290x/3770k during benchmark - 345 watts
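A rough way to read those wall figures: the load-minus-idle delta bounds what the GPU and CPU together add under the benchmark. A quick sketch using the whole-system numbers above (PSU losses and per-component split are not knowable from a wall meter):

```python
idle_w = 70    # 290X + 3770K system at idle, measured at the wall
load_w = 345   # same system during the benchmark

delta_w = load_w - idle_w
print(delta_w)  # 275 W extra drawn under load, GPU + CPU combined
```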


----------



## stn0092

Quote:


> Originally Posted by *$ilent*


Thanks! How about a less extreme fan profile, like 75-85%?


----------



## Roaches

Quote:


> Originally Posted by *szeged*
> 
> house fire waiting to happen


Not being a Housefire enthusiast....Stay pleb!









I'll be getting 2 once I see non-reference models... can't wait to save on my gas bill this winter.


----------



## Arizonian

Taking into consideration that most of us put the card in a tower rather than on a test bed, which muffles some of the noise. To put it into perspective, at Uber mode's 55% fan speed....


----------



## RocketAbyss

Ok, I managed to do some Heaven 4.0 benches on my PowerColor 290X. At stock voltage I managed 7% on the core and 12% on the memory, which equates to 1102MHz core and 1400MHz memory. Here are some bench results for Heaven 4.0:

Stock, 1030/1250 (factory OC'ed):


1102/1250:


1030/1400:


1102/1400:


I have a feeling that somewhere along the way my FX-8350 at 4.8GHz might have bottlenecked the scores. I can't be entirely sure, but these scores should be more or less the same as other 290Xs at the same clocks.

Also remember, these scores are at 0% power limit, i.e. stock voltage. I left the 95°C temp target alone and bumped the fan speed to a max of 70%, which I didn't find loud because I was benching, and I didn't notice any throttling. Amazingly, it took the card a while to hit 95°C; it stayed in the 80s for most of the four benches.

More to come!
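A quick sanity check on the percentages quoted above, using the clocks from the post (the rounding is mine):

```python
stock_core, stock_mem = 1030, 1250  # factory-OC'ed PowerColor 290X clocks (MHz)

oc_core = round(stock_core * 1.07)  # +7% on the core
oc_mem = round(stock_mem * 1.12)    # +12% on the memory

print(oc_core, oc_mem)  # 1102 1400, matching the clocks reported above
```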


----------



## Ha-Nocri

Quote:


> Originally Posted by *$ilent*


Your GPU core isn't really running @ 1150. From that graph it seems much lower on average. Not a temp issue, since it's below 95°C. Power, maybe?


----------



## EliteReplay

Quote:


> Originally Posted by *$ilent*
> 
> Also power usage for anyone interested:
> 
> 290x/3770k idle - 70 watts
> 290x/3770k during benchmark - 345 watts


Thanks man, I really needed to know this... this is really good! This card doesn't draw that much power, even OC'ed.


----------



## EliteReplay

Quote:


> Originally Posted by *RocketAbyss*
> 
> Ok I managed to do some Heaven 4.0 Benches on my PowerColor 290X. At stock voltages, I managed 7% on the Core and 12% on the Memory, which equates to 1102Mhz Core and 1400Mhz Memory. Here are some bench results for Heaven 4.0:
> 
> Stock, 1030/1250(Factory OC'ed):
> 
> 
> 1102/1250:
> 
> 
> 1030/1400:
> 
> 
> 1102/1400:
> 
> 
> I have a feeling somewhere along the way, my FX8350 at 4.8Ghz might have bottlenecked the scores, I won't be entirely sure but I can be very certain these scores will be more or less the same as other 290Xs out there at the same clocks.
> 
> Also remember, these scores are at 0% Power limits, aka Stock Voltage. I let the 95C Temp Target remain and bumped the fan speed to a max of 70%, which I didnt find loud cos I was benching and I didn't notice any throttle. Amazingly, it took the card awhile to hit 95C, it remained in the 80s for the longest period of time during the 4 benches.
> 
> More to come!


Would love to see the power consumption, idle and load, with that CPU. Thanks!!


----------



## $ilent

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Your GPU core isn't really working @1150. From that graph it seems much lower on average. Not temp issues since it's bellow 95c. Power maybe?


The temp is low because the fan is blasting away at 100%. Also, I couldn't bench at 1150MHz; the max I can bench the core at is 1125MHz, see below.

Also, I just re-did the benchmark and for some reason my scores have gone up?:



This is what my GPU-Z looked like during the last benchmark, with max figures shown. Does it look normal/in order? Should the GPU clock be bouncing up and down? Actually, the bouncing up and down is me switching between Firefox and the desktop; I can't check the GPU usage during stress testing.



Lastly, power usage maxes out at 373 watts while benching at stock voltage.


----------



## Avant

Does the R9 290X fit in an NZXT Phantom 410?


----------



## Arizonian

Quote:


> Originally Posted by *Avant*
> 
> does r9 290x fit in NZXT Phantom 410 ?


First of all welcome to overclock.net with your first post









I've got the same tower, and if you remove the middle HDD cage you can fit anything in there. I will have pictures of my 290X inside my 410 tower, and you will get a good idea by the end of tonight, around 11 PM Pacific.

You can also add the bottom fan in the 410, as long as your PSU is not too long, which will push air up and help cooling even more, unless you're water cooling.


----------



## Ha-Nocri

Quote:


> Originally Posted by *$ilent*
> 
> This is what my gpuz looked like during the last benchmark, with max figures shown. Does it look normal/in order? Should the gpu clock be bouncing up and down? In fact the bouncing up and down is me switching between firefox and desktop, I cant check the gpu usage during stress testing.
> 
> 
> 
> Lastly, power usage maxes out at 373 watts during benching at stock volt.


No, actually this is the way it should be: a flat line, always at max clock. It means it didn't reach the power/temp limit.


----------



## RocketAbyss

Quote:


> Originally Posted by *EliteReplay*
> 
> would love to see the Power Comsuption? Idle and load... with that CPU. thanks!!


Sorry man, I don't have a Kill-A-Watt to measure wattage at the wall.


----------



## $ilent

Quote:


> Originally Posted by *Ha-Nocri*
> 
> No, actually this is the way it should be, flat line, always max clock. Means it didn't reach power/temp limit


I don't understand; is it correct then or not? I can't check GPU-Z until after stress testing, and the lines don't change if you click min, max or avg.

I have a feeling something is not right... my 3DMark score went up and GPU temps went down even though the fan speed was lower... Can anyone shed some light on this? I thought if it passes 3DMark without crashing it's a success?


----------



## Ha-Nocri

Quote:


> Originally Posted by *$ilent*
> 
> I dont understand is it correct then or not? I cant check the gpuz until after stress testing and the lines done change if you click min, max or avg.


Yes, it is correct. The GPU clock shouldn't be bouncing.

You can monitor it better with Afterburner. Set the refresh time to 100ms to catch every drop if one appears. That way you can understand the card better.


----------



## $ilent

Guess I need to re-do the test. What about my 3DMark results going up with no clock increase?


----------



## Ha-Nocri

Quote:


> Originally Posted by *$ilent*
> 
> Guess i need to re do the test. What about my 3d mark resukts going up with no clock increase?


Post a GPU-Z screenshot with 100% fan speed.


----------



## $ilent

What will that prove?


----------



## Porter_

BAM!

http://s214.photobucket.com/user/Po...C-15571-000005D44A032973_zps7682f16d.jpg.html


----------



## Ha-Nocri

Quote:


> Originally Posted by *$ilent*
> 
> What will that prove?


So we can see why the score was lower. Maybe the GPU core clock wasn't constant for some reason.


----------



## tsm106

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Quote:
> 
> 
> 
> Originally Posted by *$ilent*
> 
> What will that prove?
> 
> 
> 
> To see if we can see why the score was lower. Maybe GPU core clock wasn't constant for some reason.
Click to expand...

It's a waste of time doing this on air. Just like Tahiti, Hawaii's clocks are terribly dependent on heat. If you can't remove the heat, you are not going anywhere fast.


----------



## Arizonian

Quote:


> Originally Posted by *Porter_*
> 
> BAM!
> 
> http://s214.photobucket.com/user/Po...C-15571-000005D44A032973_zps7682f16d.jpg.html
> 
> 
> Spoiler: Warning: Spoiler!


BAM! added.


----------



## Ha-Nocri

Quote:


> Originally Posted by *tsm106*
> 
> Its a waste of time doing this on air. Just like Tahiti, Hawaii is terribly clock dependent on heat. If you can't remove the heat you are not going anywhere fast.


He isn't near 94°C if GPU-Z reads it correctly. Maybe with 100% fan speed he reaches the power limit, although I did read that PowerTune only checks the GPU's power, not the whole card's.


----------



## Porter_

Quote:


> Originally Posted by *Arizonian*
> 
> BAM! added.


thank you sir


----------



## tsm106

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Its a waste of time doing this on air. Just like Tahiti, Hawaii is terribly clock dependent on heat. If you can't remove the heat you are not going anywhere fast.
> 
> 
> 
> He isn't near 94c if GPU-Z reads it correctly. Maybe with 100% fan speed he reaches power limit, altho I did read that power tune only checks GPU's power, not the whole card's
Click to expand...

There is little relevance to the ideal setup under water.

What he should be doing is running Uber mode, 100% fan speed, stock volts, checking the ASIC quality, and then seeing how far it will clock. That will give an indication of clock efficiency vs stock volts. Then, as people get their cards and everyone runs the same methodology, we will start to get a big-picture outlook.

This is what I do on Tahiti to get a relative bearing on which cards have the potential to be golden.


----------



## Ha-Nocri

Quote:


> Originally Posted by *tsm106*
> 
> There is little relevance to the ideal setup under water.
> 
> *What he should be doing is running uber mode, 100% fan speed, stock volts, check asic, and then see how far it will clock*. That will give an indication of clock efficiency vs stock volts. Then as ppl get their cards, everyone runs same methodology we will start to get a big picture outlook.
> 
> This is what I do on Tahiti to get a relative bearing on which card has potential to be gold.


Check this thread, he already did that.


----------



## $ilent

I tried running the tests again but keep getting errors; I think it's related to this piece-of-crap beta driver. Can't wait for tomorrow.

Also, I can't check the ASIC quality on this card; GPU-Z says no.


----------



## tsm106

Support is terrible atm, but we do get a driver today. There should be a new GPU-Z soon too. BTW, what post # has your max clock results with max fan and Uber mode?


----------



## Arizonian

Quote:


> Originally Posted by *tsm106*
> 
> Support is terrible atm but we do get a driver today. There should be a new gpuz soon too. Btw what post # has your max clock results with max fan and uber mode?


That's good news regarding drivers. Mine arrives today and should be in the system late tonight. I'll have my eyes glued to the AMD driver support page for the update. If anyone sees it first, please post so I can add it to the OP. Watching UPS; it's out for delivery right now.


----------



## $ilent

Post 463 is my best so far. It doesn't have GPU temps though.


----------



## Wooojciech1983

Guys, how do you increase the voltage on the card?


----------



## devilhead

Quote:


> Originally Posted by *$ilent*
> 
> Post 463 is my best so far. It doesnt have gpu temps though


Maybe you can run some Valley benches at Extreme HD with your 290X at max OC?


----------



## Arizonian

Quote:


> Originally Posted by *Wooojciech1983*
> 
> Guys, how to increase the voltage on the card?


Haven't tried it yet, but I'm assuming through MSI Afterburner once it supports the 290X. Sapphire owners, have you tried using TriXX to overvolt? ASUS owners, GPU Tweak?


----------



## $ilent

Flashing the ASUS BIOS unlocks voltage too.


----------



## sugarhell

I already posted a GPU Tweak with up to 2 volts on the core, plus a BIOS with no vdroop. You should try that:

http://kingpincooling.com/forum/showthread.php?t=2473


----------



## $ilent

Quote:


> Originally Posted by *devilhead*
> 
> maybe you can make some Valley beches at extreme HD at max your 290x OC?


Extreme HD, eyyy

Got a link to the Valley benchmark? Nothing's showing up on HWBot.

Edit: nvm, downloading now!


----------



## Wooojciech1983

Great, thanks. I will get the card next week and hope for a nice overclock.


----------



## pounced

booooooom


----------



## Blackops_2

Christ almighty, I just watched that vid from NordicHardware with the fan at 100%. I didn't really think it would be that loud; I just figured people like their rigs a lot quieter than I like mine. Truth be told, my 7970 is pretty damn loud at 60% fan speed with the reference blower. Anyhow, this first batch of reference cards might as well have been dubbed water-only. I will actually be using mine on air for a couple of months before I can complete my new build, so this forces me to wait for the custom cards.

Anyhow, nice results $ilent; I couldn't believe it actually hit 94°C in a game benchmark. I was hoping that was just a Furmark result.


----------



## $ilent

Quote:


> Originally Posted by *Ha-Nocri*
> 
> No, actually this is the way it should be, flat line, always max clock. Means it didn't reach power/temp limit


Wait a second... do these cards underclock even if they are not near the temp limit?


----------



## francisco9751

http://forums.overclockers.co.uk/showpost.php?p=25180395&postcount=35

I am waiting for a custom 290.


----------



## Clockster

Man, I am so jealous... still gotta wait 3 more days before mine gets here, and then another week for the waterblock.


----------



## pounced

I have a full-tower Thor v2 case with tons of fans on it, and I have a bottom-mounted 140mm fan that blows straight into the fan on the 290X; it lines up perfectly. So far the temps have been staying pretty low for me while running Tomb Raider on ultra.


----------



## Ha-Nocri

Quote:


> Originally Posted by *$ilent*
> 
> Wait a second...do these cards underclock if they are not near the temp limit?


There is a power limit; dunno when it kicks in. You can increase it by 50% in CCC. That is why I suggested you monitor the card more closely to understand what is going on (MSI Afterburner, for example).


----------



## $ilent

Quote:


> Originally Posted by *Ha-Nocri*
> 
> There is a power limit. Dunno when it kicks in. You can increase it by 50% in CCC. That is why I suggested you to monitor the card better to understand what is going on (MSI AfterBurner for example).


How can I monitor MSI AB when running a full-screen benchmark? I've asked this twice now.


----------



## leyzar

Quote:


> Originally Posted by *$ilent*
> 
> How can I monitor MSI AB when running a full screen benchmark? Ive repeated myself twice now


Use the OSD?...


----------



## Ha-Nocri

Quote:


> Originally Posted by *$ilent*
> 
> How can I monitor MSI AB when running a full screen benchmark? Ive repeated myself twice now


You can see what was going on after you finish benchmarking, just like with GPU-Z.

It would be ideal to get a 2nd monitor though, if you have one nearby.


----------



## Arizonian

Quote:


> Originally Posted by *pounced*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> booooooom


Booooooom added









and Booooom - look what arrived for me sitting on my office desk.


----------



## jerrolds

Quote:


> Originally Posted by *$ilent*
> 
> How can I monitor MSI AB when running a full screen benchmark? Ive repeated myself twice now


There's a monitoring tab in AB; for each thing you want to monitor in real time you can select "In OSD" or something similar to that effect. I know you can monitor core/mem clocks, fan %, fan RPM, mem/GPU usage... voltages should work if supported.

After you set them you should see a new icon in your task bar; I think it looks like a jet plane with numbers on it. You can set intervals as well as hotkeys.


----------



## $ilent

I'm using Overdrive now. How long does the Valley bench take?


----------



## utnorris

Got mine too.


----------



## VSG

Can anyone with 2 cards tell me the total power consumption at stock clocks, overclocked at stock voltage and overclocked at, say, 1.275 or 1.3V? I imagine for the last part you would need to be water cooled. Thanks a lot!


----------



## kpoeticg

Dammit, I'm already starting to regret not ordering the other night. I decided to wait for the non-BF4 Asus or Sapphire cards.









Has this been confirmed yet?
Sapphire R290X BF4 - SK Hynix
Sapphire R290 - Elpida
Asus R290X BF4 - SK Hynix
Gigabyte R290X BF4 - Elpida
AMD Press R290X - SK Hynix
MSI R290X BF4 - Yet to arrive
HIS R290X BF4 - Elpida
HIS R290 - Elpida


----------



## Ha-Nocri

Quote:


> Originally Posted by *jerrolds*
> 
> Theres a monitoring tab in AB, for each thing you want to monitor in real time you can select "In OSD" or something similiar to that effect. I know you can monitor core/mem clock, fan %, fan rpm, mem/gpu usage..voltages should work if its supported.
> 
> After you set them you should see a new icon on your task bar, i think it looks like a jet plan with numbers on it. You can set intervals, as well as hotkeys.


Yes, it has OSD, forgot about that since I never used it (having 2 monitors)


----------



## leyzar

Quote:


> Originally Posted by *geggeg*
> 
> Can anyone with 2 cards tell me the total power consumption at stock clocks, overclocked at stock voltage and overclocked at, say, 1.275 or 1.3V? I imagine for the last part you would need to be water cooled. Thanks a lot!


2x 290X CF?
Stock idle = 118W
Stock load = 640W


----------



## RocketAbyss

Quote:


> Originally Posted by *kpoeticg*
> 
> Damnit, I'm already starting to regret not ordering the other night. I decided to wait for the Non-BF4 Asus or Sapphires
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Has this been confirmed yet?
> Sapphire R290X BF4 - SK Hynix
> Sapphire R290 - Elpida
> Asus R290X BF4 - SK Hynix
> Gigabyte R290X BF4 - Elpida
> AMD Press R290X - SK Hynix
> MSI R290X BF4 - Yet to arrive
> HIS R290X BF4 - Elpida
> HIS R290 - Elpida


You can add the PowerColor 290X BF4 to the list. It uses Hynix memory.


----------



## $ilent

Valley benchy done


----------



## tsm106

Quote:


> Originally Posted by *RocketAbyss*
> 
> You can add the PowerColor 290X BF4 to the list. It uses Hynix Memory


Hey, have you checked it for any warranty-voiding stickers on the back? I wonder if PowerColor is doing that again like they did with the dual-GPU cards.


----------



## devilhead

Quote:


> Originally Posted by *$ilent*
> 
> 
> 
> Valley benchy done


+REP. Valley likes memory overclocks, but that's not bad at all; at 1300/1600 you'd be able to reach ~90 with the ATI cheats.


----------



## Arizonian

Quote:


> Originally Posted by *utnorris*
> 
> Got mine too.
> 
> 
> Spoiler: Warning: Spoiler!


Got you added congrats.


----------



## $ilent

ATI cheats 90?


----------



## sugarhell

If you go to the Valley thread and check my guide for tweaks, you can gain 3 FPS.

Also, can you mention the custom settings? 2x AA?


----------



## $ilent

I didn't put any custom settings on, just used the default settings when you open it.


----------



## RocketAbyss

Quote:


> Originally Posted by *tsm106*
> 
> Hey, have you check it for any warranty voiding stickers on the back? I wonder if PC is doing that again like they did with the dual gpu cards.


Here are some pics of the back of the card. It doesn't seem to have any warranty stickers on the back.


----------



## sugarhell

Oh, try to do an Ultra HD Valley run.


----------



## [email protected]

I just found this









http://www2.ati.com/drivers/beta/amd_catalyst_13.11_betav6.exe


----------



## YaLu

Quote:


> Originally Posted by *kpoeticg*
> 
> Damnit, I'm already starting to regret not ordering the other night. I decided to wait for the Non-BF4 Asus or Sapphires
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Has this been confirmed yet?
> Sapphire R290X BF4 - SK Hynix
> Sapphire R290 - Elpida
> Asus R290X BF4 - SK Hynix
> Gigabyte R290X BF4 - Elpida
> AMD Press R290X - SK Hynix
> MSI R290X BF4 - Yet to arrive
> HIS R290X BF4 - Elpida
> HIS R290 - Elpida


Why not take Elpida? :\


----------



## $ilent

Quote:


> Originally Posted by *sugarhell*
> 
> Oh try to do an ultra HD valley.


What settings should I use?


----------



## RocketAbyss

Quote:


> Originally Posted by *[email protected]*
> 
> I just found this
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www2.ati.com/drivers/beta/amd_catalyst_13.11_betav6.exe


Ooooo...trustworthy?


----------



## [email protected]

Quote:


> Originally Posted by *RocketAbyss*
> 
> Ooooo...trustworthy?


Yeah, it's from AMD's servers and probably for 290X support.

http://forums.guru3d.com/showthread.php?t=382778


----------



## sugarhell

Yeah, I just got them in my email. Every driver comes from www2.


----------



## DampMonkey

Quote:


> Originally Posted by *[email protected]*
> 
> Yea, it's from amd servers and probably for 290x support.
> 
> http://forums.guru3d.com/showthread.php?t=382778


I can't find driver details anywhere. Maybe it's a leak?


----------



## sugarhell

Quote:


> Originally Posted by *DampMonkey*
> 
> I cant find driver details anywhere. Maybe its a leak?


AMD refreshes the site every 5 hours or something. For a changelog you'll need to wait, but this is legit, directly from AMD's site.


----------



## RocketAbyss

Anyone willing to be our guinea pig?


----------



## $ilent

No, I'll wait till tomorrow for that driver.

Back to Valley, does this look right?


----------



## tsm106

Quote:


> Originally Posted by *RocketAbyss*
> 
> Anyone willing to be our guinea pig?


What? Install it already. It's for your card.


----------



## pounced

Quote:


> Originally Posted by *RocketAbyss*
> 
> Anyone willing to be our guinea pig?


Downloading them now lol wish me luck


----------



## tsm106

Quote:


> Originally Posted by *$ilent*
> 
> No ill wait til tomorrow for that driver.
> 
> Back to Valley, does this look right?


That's not very fast.


----------



## sugarhell

Luck? WTH, people. It's directly from AMD's server. First AMD GPU or something?


----------



## RocketAbyss

Quote:


> Originally Posted by *tsm106*
> 
> What? Install it already. It's for your card.


Haha, relax, relax. I was, and am, downloading them now.


----------



## devilhead

Quote:


> Originally Posted by *$ilent*
> 
> No ill wait til tomorrow for that driver.
> 
> Back to Valley, does this look right?


Just set the preset to Extreme HD! I didn't see your picture at first, I just looked at the FPS








because then it's much easier to compare to others: http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0


----------



## $ilent

Mine doesn't have an Extreme HD preset.


----------



## tsm106

Quote:


> Originally Posted by *$ilent*
> 
> Mine doesnt have an extreme hd preset.


There isn't an Extreme HD preset for custom resolutions. You have to set it manually. C'mon, ppl.


----------



## sugarhell

It's okay now. You need to set it manually for 1440p, if you can run it fullscreen.


----------



## $ilent

Nvm, it's me being stupid; found it.


----------



## selk22

My turn!




SO excited..


----------



## tsm106

Quote:


> Originally Posted by *$ilent*
> 
> Nvm its me been stupid, found it.


Don't run windowed, since the res is larger than it needs to be.

And you need to get higher than 38 fps.


----------



## tsm106

What the heck, you're still on the old driver?

I get my card in an hour...


----------



## Newbie2009

Quote:


> Originally Posted by *tsm106*
> 
> What the heck, you're still on the old driver?
> 
> I get my card in an hour...


Patiently waiting. You preorder any block yet?

I downloaded and installed the new drivers, now I need some 290s


----------



## YaLu

SK Hynix or Elpida for this 290X?


----------



## szeged

Newegg still has none of these in stock... whyyyy


----------



## Forceman

Quote:


> Originally Posted by *$ilent*
> 
> How can I monitor MSI AB when running a full screen benchmark? Ive repeated myself twice now


If you leave the monitoring window open in the background while you are testing, it'll update the graphs as you go. Then when you are done you can go back and look at them (if you hover over a spot it'll show you what the values were at that point in time).


----------



## $ilent

New valley test, using extreme HD preset.


----------



## RocketAbyss

Heres my first run on Valley at 1102/1400:


----------



## szeged

65.8 at 1920x1080 on Extreme HD seems rather low for a mild overclock


----------



## $ilent

Look at rocketabyss' tests, similar results.


----------



## RocketAbyss

Quote:


> Originally Posted by *$ilent*
> 
> Look at rocketabyss' tests, similar results.


Looks like I'm slightly held back cos of my CPU?


----------



## sterob

Can anyone try a cgminer run?


----------



## Newbie2009

Quote:


> Originally Posted by *szeged*
> 
> 65.8 at 1920x1080p on extremeHD seems rather low for having a mild overclock




The 290 did really badly in Unigine in the reviews too, for some reason.


----------



## szeged

Hmm, must be a driver thing, or the card is throttling or something; those scores seem low to me.

Quote:


> Originally Posted by *Newbie2009*


Valley, not Heaven


----------



## Arizonian

Quote:


> Originally Posted by *selk22*
> 
> My turn!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> SO excited..


Gotcha. Congrats -









Quote:


> Originally Posted by *[email protected]*
> 
> I just found this
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www2.ati.com/drivers/beta/amd_catalyst_13.11_betav6.exe


Nice







Added this early exe leak to the OP. I'll post all the info when the AMD Drivers & Support page shows it.


----------



## Newbie2009

Quote:


> valley not heaven


----------



## $ilent

I'm guessing it's a driver issue. Has anyone downloaded that "driver" that was linked yet?


----------



## VSG

Quote:


> Originally Posted by *leyzar*
> 
> 2x 290x CF ?
> Stock Idle = 118
> Stock Load = 640


Thanks, now let's see if anyone can provide this info for overvolted cards


----------



## devilhead

Quote:


> Originally Posted by *RocketAbyss*
> 
> Heres my first run on Valley at 1102/1400:


Heh, I'm reaching almost 61 FPS with an OC'd 7970


----------



## sugarhell

Quote:


> Originally Posted by *$ilent*
> 
> Im guessing its a driver issue. Anyone downloaded that "driver" that was linked yet?


Are you still on the old drivers? Get the new one. It's directly from AMD's server.


----------



## szeged

So, when do you guys expect new stock of the 290X to arrive at Newegg? I keep refreshing every 5 mins just in case, only to be disappointed every time.


----------



## $ilent

Why is it not on the AMD website then, when you go to Drivers?


----------



## sugarhell

Quote:


> Originally Posted by *$ilent*
> 
> Why is it not on the amd website then when you go to drivers?


Because they update the site every few hours.


----------



## YaLu

SK Hynix or Elpida for this new 290X?


----------



## Newbie2009

Quote:


> Originally Posted by *$ilent*
> 
> Why is it not on the amd website then when you go to drivers?


Straight from the man himself.

http://forums.guru3d.com/showthread.php?t=382778


----------



## szeged

Quote:


> Originally Posted by *YaLu*
> 
> SK Hynix or Elpida for this new 290X?


Hynix is generally considered better, but someone at OcUK broke it down and showed that, when cooled correctly, it doesn't matter.

Sorta like with the 780 Classified.


----------



## ebduncan

Quote:


> Originally Posted by *devilhead*
> 
> heh, i'm reaching with OC 7970 almoset 61FPS


I get the exact same score with 7950 crossfire at default clocks (900 MHz)

I score much higher with them overclocked to 1200 MHz core / 1575 MHz memory

Same CPU, clocked a bit higher at 5 GHz.


----------



## Newbie2009

Quote:


> Originally Posted by *ebduncan*
> 
> I get the same exact score with 7950 crossfire default clocks (900mhz)
> 
> I score much higher with them overclocked to 1200mhz core/1575mem
> 
> Same cpu, clocked a bit higher at 5ghz.


77 here with stock 7970s (13.11 beta v6)


----------



## $ilent

Well that new driver does *nothing*. In fact it makes things worse. Screen lockups are even longer now and my benchmark results are exactly the same.

Any time I do anything, even just opening GPU-Z, my screen locks, goes black, then comes back 5 seconds later.

Superb

In fact, something is up: my GPU usage is at 100% when my card is doing NOTHING?


----------



## Ukkooh

Any info out on 290x's stock voltage yet?


----------



## RocketAbyss

Bumped my CPU to 5 GHz and managed to squeeze out a little more FPS. But I'm guessing that's about it for a 290X OC'd to 1102/1400 at the 0% power limit setting.


----------



## szeged

Quote:


> Originally Posted by *Ukkooh*
> 
> Any info out on 290x's stock voltage yet?


I believe it was reported at 1.25 V, about the same volts as a 7970 GHz.


----------



## HeadlessKnight

Valley scores for now are pretty underwhelming. I hope this gets sorted out with new drivers. Correct me if I am wrong, guys, but don't 780s get at least 75+ fps with mild overclocks? (No tweaks applied, drivers at default values.)


----------



## $ilent

Anyone else getting 100% GPU usage at idle on the desktop? I've been through my processes and can't see anything, and a quick virus scan with MSE came up with nothing?!


----------



## Stay Puft

Quote:


> Originally Posted by *RocketAbyss*
> 
> 
> 
> Bumped my CPU to 5Ghz, managed to squeeze out alittle more on FPS. But I'm guessing thats about it, for a 290X OC'ed to 1102/1400 on 0% power limit settings


Pretty horrible score compared to overclocked 780s and Titans


----------



## szeged

Quote:


> Originally Posted by *Ukkooh*
> 
> Any info out on 290x's stock voltage yet?


Quote:


> Originally Posted by *HeadlessKnight*
> 
> Valley scores for now are pretty underwhelming. I hope this get sorted out with new drivers. Correct me if I am wrong guys but don't 780s get at least 75 fps+ with mild overclocks? (no tweaks applied, drivers at default values.)


On stock BIOS and stock voltages my Titans can do about 80 fps, something along those lines. Just for comparison.

Give it a custom BIOS and hacked volts and it rips it up hard, lol.

Things I'm hoping for with this card:

dynamic voltage doesn't suck

voltage control doesn't suck

card overclocks like a beast on roids when properly cooled

Newegg USA gets more stock


----------



## RocketAbyss

Quote:


> Originally Posted by *Stay Puft*
> 
> Pretty horrible score compared to overclocked 780s and Titans


Yeah, I'm hoping it's just drivers or something


----------



## Stay Puft

Quote:


> Originally Posted by *RocketAbyss*
> 
> Yeah, I'm hoping its just drivers or something


Which drivers are you using?


----------



## sugarhell

Includes all Feature Highlights of AMD Catalyst 13.11 Beta
Includes support for the new products:
AMD Radeon™ R9 290X
AMD Radeon R9 290
Performance improvements
Batman: Arkham Origins - improves performance up to 35% with MSAA 8x enabled
Total War™: Rome 2 - improves performance up to 10%
Battlefield 3 - improves performance up to 10%
GRID 2 - improves performance up to 8.5%
DiRT Showdown - improves performance up to 10%
Formula 1™ 2013 - improves performance up to 8%
DiRT 3 - improves performance up to 7%
Sleeping Dogs - improves performance up to 5%
Automatic AMD Eyefinity Configuration;
Automatic "plug and play" configuration of supported Ultra HD/4K tiled displays


----------



## kpoeticg

Quote:


> Originally Posted by *YaLu*
> 
> why don't take elpida? :\


I mean, if the 290x was only available with Elpida, I'd still take it. Hynix just has a better track record for OC'ability. If it costs the same price for a card with Hynix or Elpida, I'll take Hynix


----------



## szeged

Quote:


> Originally Posted by *kpoeticg*
> 
> I mean, if the 290x was only available with Elpida, I'd still take it. Hynix just has a better track record for OC'ability. If it costs the same price for a card with Hynix or Elpida, I'll take Hynix


I wish I could get ahold of some Samsung memory chips; I'd do some soldering overnight and have a 290X with Samsung.


----------



## RocketAbyss

Quote:


> Originally Posted by *Stay Puft*
> 
> Which drivers are you using?


Latest 13.11 Beta6 that was linked earlier


----------



## Stay Puft

Quote:


> Originally Posted by *szeged*
> 
> i wish i could get ahold of some samsung memory chips, id do some soldering overnight and have a 290x with samsung


Szeg are you sure you're not going to burst into flames by owning both a 290X and Titan?








Quote:


> Originally Posted by *RocketAbyss*
> 
> Latest 13.11 Beta6 that was linked earlier


That's nuts. CPU bottleneck? Retest it, but use the max PowerTune setting.


----------



## sugarhell

Stock 290x

http://www.3dmark.com/fs/1043392

From ocuk forum


----------



## szeged

Quote:


> Originally Posted by *Stay Puft*
> 
> Szeg are you sure you're not going to burst into flames by owning both a 290X and Titan?


i have a bucket of water ready









I was testing the newest addition to my Titan family last night on air; never realized how annoying the stock BIOS on Titan is when it comes to temps and power target with air clocking. skyn3t and waterblocks do this card real justice, lol.

Also, my setup was definitely not meant for air, lol. I was pushing the mem chips on the card as hard as I could and started getting artifacts at a certain point, until I started hand-fanning them with an envelope I found in my desk; the artifacting went away, rofl.


----------



## jerrolds

Quote:


> Originally Posted by *$ilent*
> 
> anyone else getting 100% gpu usage in idle on desktop? Ive been through my processes cant see anything and did quick virus scan with MSE nothing came up?!


Your rig sounds flaky - in my experience reinstalling windows and essential programs (video drivers/chrome) takes about an hour or so, while trying to debug driver/weird windows problems can take a couple days. I would just do a fresh install, and then load up programs as you need them.

Up to you though.


----------



## Ukkooh

Quote:


> Originally Posted by *szeged*
> 
> i believe it was reported in at 1.25v, about the same as a 7970ghz volts.


Source on this? If that is true I might have to get a 780/titan instead it seems.


----------



## szeged

Quote:


> Originally Posted by *Ukkooh*
> 
> Source on this? If that is true I might have to get a 780/titan instead it seems.


i believe it was posted on overclockersuk

Why would you go with a 780/Titan over the 290X just because of the stock volts? If the reference cooler can (kind of) handle those volts, imagine it under water.


----------



## $ilent

Quote:


> Originally Posted by *jerrolds*
> 
> Your rig sounds flaky - in my experience reinstalling windows and essential programs (video drivers/chrome) takes about an hour or so, while trying to debug driver/weird windows problems can take a couple days. I would just do a fresh install, and then load up programs as you need them.
> 
> Up to you though.


I've removed the beta driver and it's rolled me back to the 13.250.18.0 driver; that stopped the screen hanging and the 100% GPU usage immediately.


----------



## youra6

Quote:


> Originally Posted by *sugarhell*
> 
> Stock 290x
> 
> http://www.3dmark.com/fs/1043392
> 
> From ocuk forum












About 1K higher than what was revealed here. What do a stock 780 and Titan get on Fire Strike?


----------



## $ilent

youra6 how do you do bold?









Also, would you guys say it's normal for the 290X to bounce up to like 90% usage just when opening a new browser?


----------



## tsm106

Two mystery boxes showed up, hmmm?


----------



## sugarhell

I hate you


----------



## VSG

Koolance has no plans for an R9 290/290X water block. So it's essentially down to Swiftech and EK at this point to see who can make the most of this situation.


----------



## szeged

Hope Newegg restocks 290Xs soon; getting tired of checking their site, lol.


----------



## bencher

Quote:


> Originally Posted by *szeged*
> 
> hope newegg restocks 290x's soon, getting tired of checking their site lol.


Auto notify doesn't work?


----------



## pounced

What kind of overclocks are people getting on air with these? I have a 5% GPU and 7% memory OC atm, but I'm not sure how safe it would be to push further.
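For anyone converting those percentages into absolute clocks, here's a quick sketch, assuming the reference 290X stock clocks of 1000 MHz core / 1250 MHz memory (check your own card's stock clocks in GPU-Z before relying on these numbers):

```python
# Quick sketch: turn percentage OC offsets into absolute clocks.
# Assumes reference R9 290X stock clocks: 1000 MHz core, 1250 MHz memory.

STOCK_CORE_MHZ = 1000
STOCK_MEM_MHZ = 1250

def overclocked(stock_mhz: float, percent: float) -> float:
    """Apply a percentage offset to a stock clock and return the new clock in MHz."""
    return stock_mhz * (1 + percent / 100)

print(overclocked(STOCK_CORE_MHZ, 5))  # +5% core  -> 1050.0 MHz
print(overclocked(STOCK_MEM_MHZ, 7))   # +7% memory -> 1337.5 MHz
```

So a "5%/7%" OC on a reference card works out to roughly 1050/1337, still well short of the 1100+ core clocks being posted earlier in the thread.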


----------



## szeged

Quote:


> Originally Posted by *bencher*
> 
> Auto notify doesn't work?


Last time I used auto notify with Newegg, I randomly ended up on their site again and what I wanted was back in stock; I didn't get the email for another 2 hours, and when I checked again it was out of stock.


----------



## szeged

Quote:


> Originally Posted by *pounced*
> 
> what kind of overclocks are people getting on air with these. I have 5% GPU and 7% memory OC atm but not sure how safe it would be to push it further.


You can push it more, but it'll just throttle itself back down.


----------



## Stay Puft

Quote:


> Originally Posted by *tsm106*
> 
> Two mystery boxes showed up, hmmm?


Did you go quad as well?


----------



## utnorris

I got mine installed; I'll play with it in a bit. What I found funny was that the beta drivers wouldn't install. I ended up having to go to the disk for the drivers to get the system to recognize the card. Probably a fluke, as I could use a reinstall of the OS since I have swapped between AMD and Nvidia quite a few times. Anyway, I won't have the EK block for a week or so, depending on when Frozen gets more stock. Until then, I am tempted to try to make something using a Koolance chipset block and some heatsinks on the VRMs.


----------



## utnorris

Double Post


----------



## tsm106

Here we go... though I have to eat something, pick up mah kids from school, take him to basketball practice at 5pm, and it's 2:14pm now. Then crap, wife wants to have dinner at Soup Plantation! Gah, and there are 3 kids' bday parties this weekend. Always when there is stuff I wanna do! I'd like to be home, hermit style. Anyways, gonna stick them into this rig, one with an MCW82 for kicks till my real blocks get here.


----------



## AddictedGamer93

Quote:


> Originally Posted by *tsm106*
> 
> Here we go... though I have to eat something, pick up mah kids from school, take him to basketball practive at 5pm, its 2:14pm now. Then crap wife wants to have dinner at soup plantation! Gah, and there are 3 kids bday parties this weekend. Always when there is stuff I wanna do! Like to be home hermit style. Anyways gonna stick them into this rig, one with mcw82 for kicks till my real blocks get here.


So who's going to get that extra BF4 key? lol


----------



## carlhil2

....


----------



## $ilent

Guys, is it normal for 3DMark to show artifacts/screen flashes during testing, even if it passes OK?


----------



## sugarhell

8 pack review

http://forums.overclockers.co.uk/showthread.php?t=18551642


----------



## carlhil2

Quote:


> Originally Posted by *$ilent*


Nice graphics score with that clock, impressive


----------



## Forceman

Quote:


> Originally Posted by *$ilent*
> 
> Guys is it normal fir 3dmark to get artifacts/screen flases during testing, even if it passes ok?


No, it normally means too high a memory overclock. It could be something else (like driver issues, given your problems so far), but either way it isn't normal.


----------



## $ilent

My memory is only at 5300 MHz though :/

I'm gonna reinstall Windows tomorrow, I think.


----------



## bencher

Quote:


> Originally Posted by *pounced*
> 
> what kind of overclocks are people getting on air with these. I have 5% GPU and 7% memory OC atm but not sure how safe it would be to push it further.


I wouldn't overclock unless the fan speed is at 80% and ear plugs are purchased.


----------



## bencher

Quote:


> Originally Posted by *tsm106*
> 
> Here we go... though I have to eat something, pick up mah kids from school, take him to basketball practive at 5pm, its 2:14pm now. Then crap wife wants to have dinner at soup plantation! Gah, and there are 3 kids bday parties this weekend. Always when there is stuff I wanna do! Like to be home hermit style. Anyways gonna stick them into this rig, one with mcw82 for kicks till my real blocks get here.


Proud of you, man. I can't wait to be able to say I have kids.


----------



## pounced

So I tried to redeem my coupon for BF4 and it says the key is invalid, LOL.

Time to contact their support and get a good key, or one that wasn't already used. T.T


----------



## $ilent

My Fire Strike score. A little over the AMD GPU 2014 presentation numbers, mind.


----------



## Arizonian

Quote:


> Originally Posted by *tsm106*
> 
> Here we go... though I have to eat something, pick up mah kids from school, take him to basketball practive at 5pm, its 2:14pm now. Then crap wife wants to have dinner at soup plantation! Gah, and there are 3 kids bday parties this weekend. Always when there is stuff I wanna do! Like to be home hermit style. Anyways gonna stick them into this rig, one with mcw82 for kicks till my real blocks get here.
> 
> 
> 
> Spoiler: Warning: Spoiler!


Updated on the list, congrats







Enjoy the kids, they grow up quick.


----------



## solitario07777

Quote:


> Originally Posted by *pounced*
> 
> So I try and redeem my coupon for BF4 and it says the key is invalid LOL
> 
> Time to contact support from them and get a good key or one that wasn't used already T.T


It can't be redeemed until the release


----------



## pounced

Quote:


> Originally Posted by *solitario07777*
> 
> not be redeemed until the release


Are you 100% sure?


----------



## Ukkooh

Quote:


> Originally Posted by *szeged*
> 
> i believe it was posted on overclockersuk
> 
> why would you go with a 780/titan over the 290x just because of the stock volts? if the reference cooler can (kind of) handle those volts, imagine it under water


Imagining it under water isn't going to magically help me gather funds for watercooling. A good loop here in good ol' Finland is about 600-700€. And even if the temps were good, I wouldn't dare to use more than 1.35 V for 24/7. I guess I'll hold on a little bit longer, even though I already made a deal to sell my 7970.

Edit: Well, I have plans to mount my 290X with the spare H100i I have (that is, if I decide to get it), but that doesn't count as watercooling, does it?


----------



## solitario07777

Quote:


> Originally Posted by *pounced*
> 
> Are you 100% sure?


50%. According to some pages, you may be able to redeem it on the 29th.

Forgive me, I'm using a translator.


----------



## Bartouille

Meh... so far the OC results have not been very impressive. I was hoping to see some 1.2 GHz results like the 7970 at launch, with stock voltage and the reference cooler. Now we need to see some water results. If you are going to keep this card on air, you might as well pay $100 more and get a 780 that will probably end up being faster once OC'd, and quieter. But maybe that's just me.


----------



## Stay Puft

Quote:


> Originally Posted by *tsm106*
> 
> Here we go... though I have to eat something, pick up mah kids from school, take him to basketball practive at 5pm, its 2:14pm now. Then crap wife wants to have dinner at soup plantation! Gah, and there are 3 kids bday parties this weekend. Always when there is stuff I wanna do! Like to be home hermit style. Anyways gonna stick them into this rig, one with mcw82 for kicks till my real blocks get here.


Mmm, Soup Plantation. Make sure to get the loaded baked potato with bacon soup... Delicious


----------



## $ilent

So based on my Fire Strike score, you'd need a 780 clocked at 1200+ MHz or a Titan at ~1000 MHz to get the same score.


----------



## Stay Puft

Quote:


> Originally Posted by *$ilent*
> 
> So based on my firestrike score youd need a 1200+mhz clocked 780 or a ~1000mhz clocked titan to get the same score.


Any chance you can run valley for us?


----------



## sugarhell

I don't know if you guys saw it, but 8 Pack did a review. I hope you know who 8 Pack is.

http://forums.overclockers.co.uk/showthread.php?t=18551642

The 290X is faster everywhere except Valley, and the difference there is 1 fps


----------



## $ilent

Quote:


> Originally Posted by *$ilent*
> 
> 
> 
> New valley test, using extreme HD preset.


----------



## Stay Puft

Quote:


> Originally Posted by *sugarhell*
> 
> I dont know if you saw that guys but 8 pack did a review. I hope that you know 8 pack
> 
> http://forums.overclockers.co.uk/showthread.php?t=18551642
> 
> 290x is faster everywhere except valley adn the difference is 1 fps


Wow, look at that 3- and 4-way scaling


----------



## sugarhell

In Valley, AMD doesn't have a profile for more than 2 cards. You need to use the 1x1 optimization, like with 7970s.


----------



## Tobiman

Quote:


> Originally Posted by *Ukkooh*
> 
> Imagining it under water isn't going to magically help me gather funds for watercooling. A good loop here in good ol' finland is about 600-700€. And even if the temps were good I wouldn't dare to use more than 1.35V for 24/7. I guess I'll hold on a little bit more even though I already made a deal to sell my 7970.
> 
> Edit: Well I have plans to mount my 290x with the spare h100i I have (that is if I decide to get it) but that doesn't count as watercooling does it?


If you are in finland, why not just take your system outside?


----------



## Stay Puft

Quote:


> Originally Posted by *sugarhell*
> 
> On valley amd doesnt have a profile for more than 2 cards. You need to use 1x1 optimization like 7970s


They still don't have a profile for Valley?


----------



## Ukkooh

Quote:


> Originally Posted by *Tobiman*
> 
> If you are in finland, why not just take your system outside?


It is not cold enough for that yet. Just checked my thermometer and it was at +4°C.


----------



## sugarhell

Quote:


> Originally Posted by *Stay Puft*
> 
> They still dont have a profile for valley?


Nope, even the single-card profile is a beta one. I hope we get a proper one someday.


----------



## rubicsphere

Quote:


> Originally Posted by *Ukkooh*
> 
> It is not cold enough for that yet. Just checked my thermometer and it was at +4°C.


4°C! Brrrrrrrr. It's 22°C here today


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> Two mystery boxes showed up, hmmm?


you might not have enough distilled.


----------



## Arm3nian

Two Sapphire 290x limited bf4 edition
Add me


----------



## FearzUSA

Plz Newegg, restock. =[


----------



## $ilent

Man literally everyone is grabbing sapphire cards!


----------



## szeged

Quote:


> Originally Posted by *$ilent*
> 
> Man literally everyone is grabbing sapphire cards!


I went with HIS like I always do with AMD cards









I might grab a Sapphire or MSI card for my second one, though... or whatever comes in stock first at Newegg.


----------



## King4x4

Subbed to learn more about 290x modding!


----------



## FearzUSA

Use a korean VPN, and buy the game + premium for like 30 euro ?


----------



## Blackops_2

Quote:


> Originally Posted by *$ilent*
> 
> Man literally everyone is grabbing sapphire cards!


Aesthetically they do look "cooler", lol.

I'm wondering where Diamond is. Are they going out of business, switching to Nvidia, or what? Think I'll be going Asus or Sapphire this round.


----------



## Stay Puft

Quote:


> Originally Posted by *$ilent*
> 
> Man literally everyone is grabbing sapphire cards!


They were the primary card available on release night. Has Asus released theirs yet?


----------



## selk22

Well okay, I have the 290X installed and wanted to report in before I go game my face off for a few hours...

I have GPU-Z running on the 13.11 beta drivers, and I wanted to note that GPU-Z reports idle usage at like 3% and seems to be reporting things very accurately, BUT MSI Afterburner says I'm at 100% usage, which is not true. So I think this is an issue with either the beta driver or Afterburner. For now I'm going by GPU-Z.

But yeah I will be back to report on performance soon.


----------



## infranoia

Step 1: Set switch to 'Uber'
Step 2: Put in 290x
Step 3: Seal up P280
Step 4: Laugh at the whinging lilywhites as I hear that it's quieter than my dual 5850s at load.
Step 5: Ignore, for now, that it can go up to eleven.


----------



## szeged

Quote:


> Originally Posted by *infranoia*
> 
> Step 5: Ignore, for now, that it can go up to eleven.


hahahaha


----------



## Arizonian

Quote:


> Originally Posted by *Arm3nian*
> 
> Two Sapphire 290x limited bf4 edition
> Add me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats....added








Quote:


> Originally Posted by *infranoia*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Step 1: Set switch to 'Uber'
> Step 2: Put in 290x
> Step 3: Seal up P280
> Step 4: Laugh at the whinging lilywhites as I hear that it's quieter than my dual 5850s at load.
> Step 5: Ignore, for now, that it can go up to eleven.


Congrats -- added


----------



## utnorris

Quote:


> Originally Posted by *tsm106*
> 
> Here we go... though I have to eat something, pick up mah kids from school, take him to basketball practive at 5pm, its 2:14pm now. Then crap wife wants to have dinner at soup plantation! Gah, and there are 3 kids bday parties this weekend. Always when there is stuff I wanna do! Like to be home hermit style. Anyways gonna stick them into this rig, one with mcw82 for kicks till my real blocks get here.


Yeah, it always seems like when we want to have fun, well sh**, we have a party to go to or some other event for the kids. It probably would not be that big of a deal if it didn't HAPPEN EVERY WEEKEND!!!! But seriously, it just means late nights for me while everyone else sleeps.


----------



## Porter_

Quote:


> Originally Posted by *infranoia*
> 
> http://www.overclock.net/content/type/61/id/1717337/width/500/height/1000
> 
> Step 1: Set switch to 'Uber'
> Step 2: Put in 290x
> Step 3: Seal up P280
> Step 4: Laugh at the whinging lilywhites as I hear that it's quieter than my dual 5850s at load.
> Step 5: Ignore, for now, that it can go up to eleven.


Ha, I agree on all parts. At 55% fan speed it's quieter than my XFX 7970 BEDD (at the fan speed I ran).


----------



## 12Cores

Has anyone overclocked these cards under water yet?


----------



## Newbie2009

Quote:


> Originally Posted by *12Cores*
> 
> Has anyone overclocked these cards under water yet?


No blocks are out yet, I don't think.


----------



## Slomo4shO

AMD Catalyst 13.11 Beta6 Drivers are now available.

http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx


----------



## 12Cores

EK has blocks out, really looking forward to see what this thing can do under water.

http://www.techpowerup.com/192982/ek-debuts-water-block-for-amd-radeon-r9-290x.html


----------



## c0ld

So is newegg ever going to release the ASUS cards?


----------



## Slomo4shO

Quote:


> Originally Posted by *c0ld*
> 
> So is newegg ever going to release the ASUS cards?


Yes, when you are asleep they will release them and you will find the card sold out once you wake up


----------



## c0ld

Quote:


> Originally Posted by *Slomo4shO*
> 
> Yes, when you are asleep they will release them and you will find the card sold out once you wake up


Bah!!! What makes me mad is I had an XFX ordered from TigerDirect, but they didn't want to ship to my alternate address


----------



## tsm106

Quote:


> Originally Posted by *utnorris*
> 
> Yeah, it always seems like when we want to have fun, well sh** we have a party too go to or some other event for the kids. It probably would not be that big of a deal if it didn't HAPPEN EVERY WEEKEND!!!! But seriously, just means late nights for me while everyone else sleeps.


Amen. I had 20 minutes till basketball, lol. I managed to finish the GPU-only block. It's cobbled together from spare parts. It should be good; I just need to watch VRM temps. I might use a temp lead on the RIVF if/when I get a chance. I also have a big 200mm fan which should keep air fed to the rest of the PCB beyond the 80mm fan there.


----------



## Newbie2009

Quote:


> Originally Posted by *tsm106*
> 
> Amen. I had 20 minutes till basketball lol. I managed to finished the gpu only block. Its cobbled together from spare parts. It should be good, just need to watch vrm temps. I might use a temp lead on the rivf if/when I get a chance. I also have a big 200mm fan which should keep air fed to the rest of the pcb beyond the 80mm fan there.


I think I better go to bed. But I will be checking in first thing.


----------



## szeged

Got another one finally!



Well, that was a completely annoying hold-up by Newegg. Also can't wait to tear those lame stickers off the card.


----------



## Slomo4shO

Quote:


> Originally Posted by *c0ld*
> 
> Bah!!! what makes me mad I had an XFX ordered from TigerDirect but they didnt want to ship to my alternate address


The preliminary leaked R9 290 benchmarks suggest that it performs within 2-5% of the 290X, so I would recommend waiting for the official release and the reviews this Thursday. A 290 plus a water block would be a great alternative to a 290X if the 290 comes in at $449.


----------



## c0ld

Quote:


> Originally Posted by *szeged*
> 
> Got another one finally!
> 
> 
> 
> well that was a completely annoying hold up by newegg, also cant wait to tear those lame stickers off the card.


I want one


----------



## c0ld

Quote:


> Originally Posted by *Slomo4shO*
> 
> The preliminary leaked R290 benchmarks suggest that it performs within 2-5% of the 290X so it would be recommend that you wait for the official release and the reviews this Thursday. A 290 + a water block would be a great alternative to a 290X if the 290 comes in as $449


I'm not going to put my video card under water though. It's because I want to play BF4 at launch too.


----------



## Porter_

Quote:


> Originally Posted by *szeged*
> 
> well that was a completely annoying hold up by newegg, also cant wait to tear those lame stickers off the card.


they really are lame looking stickers. the card is already an eyesore IMO, no need to slap stickers on it to make it worse. it's like putting a K&N sticker on a '78 Pinto.


----------



## provost

Quote:


> Originally Posted by *tsm106*
> 
> Amen. I had 20 minutes till basketball lol. I managed to finished the gpu only block. Its cobbled together from spare parts. It should be good, just need to watch vrm temps. I might use a temp lead on the rivf if/when I get a chance. I also have a big 200mm fan which should keep air fed to the rest of the pcb beyond the 80mm fan there.


At least someone is not messing around and wasting time... quite resourceful.

Good to see some experienced benchers weighing in on the side of the 290X.


----------



## tsm106

Last update for a few hours. Bball time and dinner.


----------



## utnorris

Quote:


> Originally Posted by *tsm106*
> 
> Amen. I had 20 minutes till basketball lol. I managed to finished the gpu only block. Its cobbled together from spare parts. It should be good, just need to watch vrm temps. I might use a temp lead on the rivf if/when I get a chance. I also have a big 200mm fan which should keep air fed to the rest of the pcb beyond the 80mm fan there.


Yeah, I am thinking of using a Koolance chipset block to cool the GPU and then heatsinks and fans on the rest, but we will see.

So, non-scientific and just my opinion, my experience for the first few hours has been like this.


In 3DMark Vantage, the 290X pulled a P42538 (47610 GPU) with the CPU at 4.6GHz. My dual HD 7970s with the CPU at 4.6GHz pulled a P49279 (59817 GPU), and my GTX 670 SLI pulled a P49214 (57920 GPU) with the CPU at 4.8GHz.

In 3DMark 11, the 290X pulled a P14788 (16436 GPU), the HD 7970s pulled a P15714 (18802 GPU), and the GTX 670s pulled a P15255 (17334 GPU).

So far, the best overclock I have pulled off has been +11% using Catalyst. I am still playing with it, but it took me a little bit to figure out a few settings in there. As soon as we have a new MSI AB supporting it, I will switch over.

So in one benchmark it gets smashed by my previous dual-card setups, no real surprise there, and in the newer one it comes within spitting distance of the dual setups, which is what I was hoping for. I am also not hearing the noise from the fan unless I purposely crank it to 100%, as it has yet to get there on default settings. So not bad.


----------



## Dominican

Back in stock, everybody.


----------



## Dominican

Quote:


> Originally Posted by *c0ld*
> 
> So is newegg ever going to release the ASUS cards?


around Christmas


----------



## wolfej

Had two in my cart and they got sold before I could finish the checkout. /sigh. I'll get my blocks on Monday or Tuesday with no GPUs to cool.


----------



## szeged

Probably best to wait till Monday anyway to order. Even with next-day shipping and rush processing, if ordering from Newegg at the moment you won't get them till Monday/Tuesday, depending on where you live.


----------



## Arizonian

Quote:


> Originally Posted by *Slomo4shO*
> 
> The preliminary leaked R290 benchmarks suggest that it performs within 2-5% of the 290X so it would be recommend that you wait for the official release and the reviews this Thursday. A 290 + a water block would be a great alternative to a 290X if the 290 comes in as $449


Looking good if the reviews hold true, and if that's the price you're speculating, it would be amazing price/performance. Allowing the 290X and 290 to be crossfired is great for us 290X owners who might like that option. I'm thinking we might have a lot of new members come Oct 31st.


----------



## wolfej

I don't want to order from Newegg because they charge tax and I've had some problems with them before, but they're the only ones with stock. TigerDirect had MSI and XFX, but I refuse to use either of those brands.


----------



## Dominican

Quote:


> Originally Posted by *wolfej*
> 
> Had two in my cart and they got sold before I could finish the checkout. /sigh. I'll get my blocks on Monday or Tuesday with no GPUs to cool.


Sorry, but make sure to use the promo code; it still works, one per account.


----------



## wolfej

Quote:


> Originally Posted by *Dominican*
> 
> sorry but make sure to use promo code still work 1 per account.


Yeah I had the mobile promo code in and everything, unless you're talking about some other promo code.


----------



## gboeds

Quote:


> Originally Posted by *$ilent*


thanks for the benches, $ilent....still waiting for folding numbers


----------



## anubis1127

Quote:


> Originally Posted by *gboeds*
> 
> thanks for the benches, $ilent....still waiting for folding numbers


Oh he'll have them as soon as Stanford updates their GPU.txt for the 290x. He's already tried folding on it.


----------



## kot0005

It's in, installing 13.11 now.


----------



## Porter_

anyone able to get their BF4 code to work? mine comes up as invalid on AMD's site.


----------



## burningrave101

XFX in stock at Amazon $569.99:

http://www.amazon.com/XFX-RADEON-1000MHz-Graphics-R9290XENFC/dp/B00G2OTRMA


----------



## wolfej

Quote:


> Originally Posted by *burningrave101*
> 
> XFX in stock at Amazon $569.99:
> 
> http://www.amazon.com/XFX-RADEON-1000MHz-Graphics-R9290XENFC/dp/B00G2OTRMA


Woot, got the last two.


----------



## youra6

Quote:


> Originally Posted by *$ilent*
> 
> youra6 how do you do bold?
> 
> Also would you guys say its normal for the 290x to bounce up to like 90% just when opening a new browser?


I think only mods and editors can do it.


----------



## Nintendo Maniac 64

Quote:


> Originally Posted by *Ukkooh*
> 
> It is not cold enough for that yet. Just checked my thermometer and it was at +4°C.


Doing the time-zone calculation it would have been about midnight when you checked right? It's about 22:30 here in northeast Ohio and it's only +3°C... we even had full-on (lake effect) snow a couple mornings ago when the temp was about -1°C.

I thought Finland was colder than that?


----------



## Sgt Bilko

Bought mine on the 24th, it came into stock on the 25th..........shipping on the 28th, expected arrival: the 30th.

I hate waiting......


----------



## djriful

Can someone take a picture of the card lit up in the dark? I never actually got a chance to see the lighting on it.


----------



## Porter_

it does not light up. i also watched the linus video that stated that. false info.


----------



## djriful

Quote:


> Originally Posted by *Porter_*
> 
> it does not light up. i also watched the linus video that stated that. false info.


._.

Bummer.


----------



## Arizonian

Ok got it installed. OP page has updated info from AMD site now showing latest 13.11 Beta 6 with support for R9 290X & 290.



The only problem I'm having is that I can't delete the NVIDIA Corporation folder in Program Files (x86) or in Program Files, due to it being open in another program.

I went through the uninstall guide and removed all the Nvidia folders from the registry. I want to figure out why I can't remove those folders before trying any benching etc. Anyone know why this might be happening?

If not, I'm thinking clean install, and it's going to be another hour or so before I'm back up and fully running with the system back to this point.


----------



## Nintendo Maniac 64

You could always boot into a small live Linux distro like Puppy Linux and manually delete the Nvidia folders.
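From a live session the cleanup itself is simple, since Windows isn't running and can't hold the files open. A minimal sketch (the `/mnt/windows/...` mount points are hypothetical, and the chmod-and-retry handler covers the read-only attributes that often block deleting leftover driver folders):

```python
import os
import shutil
import stat

def force_rmtree(path):
    """shutil.rmtree that retries after clearing permission bits,
    which is what usually blocks deleting leftover driver folders."""
    def on_error(func, p, exc_info):
        # Make the entry and its parent writable, then retry once.
        os.chmod(os.path.dirname(p), stat.S_IRWXU)
        os.chmod(p, stat.S_IRWXU)
        func(p)
    shutil.rmtree(path, onerror=on_error)

# Hypothetical mount points for the Windows partition as seen
# from a live Linux session; adjust to wherever you mounted it.
for leftover in ("/mnt/windows/Program Files/NVIDIA Corporation",
                 "/mnt/windows/Program Files (x86)/NVIDIA Corporation"):
    if os.path.isdir(leftover):
        force_rmtree(leftover)
```

The same `force_rmtree` also works from within Windows once whatever process held the handle is gone.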


----------



## Nihaan

Hello

Can someone help us please ?

Every site on the internet says the 290X beats the 780 and Titan easily, so I told my cousin to buy a 290X instead of a second 780.

We spent more than 9 hours figuring out performance issues when the card is overclocked.

Can anyone help us ?

Gonna make a list of what we have been doing.

4930K CPU + Rampage IV Extreme motherboard with latest BIOS + 32GB G.Skill RAM

We used the latest drivers.
Default-clocked CPU.
Overclocked CPU.
OS reinstallation: Windows 7, 8 and 8.1.
No sound card on.
We always installed the 290X and its drivers first, then used Driver Sweeper before installing the 780 and its drivers.

We tested Crysis 3, Battlefield 3, Heaven 4, Valley 1.0, Bioshock, Metro LL and Tomb Raider.

On the same setup his max-overclocked 780 performs way better than his max-overclocked 290X.

Is this normal due to temperature limits, or do we have a crappy card that doesn't overclock well?

Can you share your stock and overclocked Valley, Heaven, Crysis 3, Metro LL, Tomb Raider benchmarks please.

Thank you.


----------



## th3illusiveman

I think these low Valley scores are because the card is throttling. You can't expect to hold an OC when the GPU runs at *95C stock*. Buying a card with the stock cooler is silly unless you plan on changing the cooler later.....


----------



## Tobiman

You'll have to put the 290X under water to get any serious clocks.


----------



## Ha-Nocri

Did you even increase its fan speed?


----------



## Arm3nian

Quote:


> Originally Posted by *Nihaan*
> 
> Hello
> 
> Can someone help us please ?
> 
> Every site on the internet says 290X beats 780 and Titan easily so i told my cousin that to buy 290x instead of buying a second 780.
> 
> We spent more than 9 hours figuring out performance issues when card is overclocked.
> 
> Can anyone help us ?
> 
> Gonna make a list of what we have been doing.
> 
> 4930k Cpu + Rampage IV Extreme Motherboard with latest bios + 32 GB G.Skill Ram
> 
> We used latest drivers.
> Default clocked cpu.
> Overclocked cpu.
> OS reinstallation. Windows 7, 8 and 8.1.
> No sound card on.
> We always installed 290x and its drivers first then used driver sweeper to install 780 and its drivers.
> 
> We tested it on Crysis 3, Battlefield 3, Heaven 4, Valley 1.0, Bioshock, Metro LL and Tomb Raider.
> 
> On the same setup his max overclocked 780 performs way better than his max overclocked 290x.
> 
> Is this normal due to temperature limits or do we have a crappy card that doesn't overclock well ?
> 
> Can you share your stock and overclocked Valley, Heaven, Crysis 3, Metro LL, Tomb Raider benchmarks please.
> 
> Thank you.


The card is most likely throttling. Reference card basically needs to be watercooled.


----------



## Porter_

Quote:


> Originally Posted by *Tobiman*
> 
> You'll have to put the 290X under water to get any serious clocks.


this is not necessarily true. even water might not release any further overclocking potential. i believe LN2 guys are getting ~1350 MHz which is not exactly stellar. this might just be a low-clocking generation. performance out of the box is freaking great though.


----------



## Arm3nian

Quote:


> Originally Posted by *Porter_*
> 
> this is not necessarily true. even water might not release any further overclocking potential. i believe LN2 guys are getting ~1350 MHz which is not exactly stellar. this might just be a low-clocking generation. performance out of the box is freaking great though.


The ln2 results are on stock volts, and the highest was 1450.


----------



## Porter_

ah well stock volts changes the landscape. thanks for the info Arm3nian


----------



## Nihaan

Of course, the fan was at 100% and we had to use cotton ear plugs (not kidding).

No, we are not planning on watercooling his rig, since he needs to buy a new monitor and can't afford both.

So is what we are seeing here normal and not an issue? How come none of the review sites mentioned this? The 290X seemed so superior in the reviews, and I checked more than 10 of them yesterday.

Are you sure it is normal to see this card behind an overclocked 780?

Do you know if there is a review out there where a max-overclocked 290X is compared to a max-overclocked 780?

At least then we can check the numbers and see if they are close or equal.

Thank you.


----------



## King PWNinater

I would just love to know if it is possible to crossfire a 290X with an older card such as a 7970, etc...


----------



## Porter_

it is not


----------



## Arm3nian

Quote:


> Originally Posted by *Nihaan*
> 
> Ofc, fan was at %100 and we had to use cotton ear plugs ( not kidding )
> 
> No we are not planning to do watercooling on his rig since he needs to buy a new monitor and he can't afford buying both.
> 
> So what we are having here is normal and not an issue ? How come none of the review sites didn't mention this. 290X seemed so superior on the reviews and i checked more than 10 of them yesterday.
> 
> Are you sure it is a normal thing to see this card behind an overclocked 780 ?
> 
> Do you know if there is a review out there where a max overclocked 290x being compared to a max overclocked 780 ?
> 
> At least we can try to check numbers and see if they are close or equal.
> 
> Thank you.


Review sites show the 780 and 290X at stock clocks, and show the 290X > 780; this is true.

A highly overclocked (1300MHz) 780 > a stock 290X, as in your case. The reviews show an overclocked 290X (1100MHz) beating or trading blows with the highly overclocked 780. But since your card is probably throttling, yours is effectively an overclocked 780 against a stock 290X.

Quote:


> Originally Posted by *King PWNinater*
> 
> I would just love to know if it is possible to crossfire a 290x with an older card such a 7970 or ect...


Most likely not going to happen, hasn't really happened before. Also, 290x doesn't have a crossfire bridge


----------



## Scotty99

There was a guy in this thread getting 65C load temps with 100% fan, so I'm not entirely sure that water cooling is gonna be this card's saving grace. Time will tell, I guess.


----------



## Arm3nian

Quote:


> Originally Posted by *Scotty99*
> 
> There was a guy in this thread getting 65c load temps with 100% fan, so im not entirely sure that water cooling is gonna be this cards saving grace. Time will tell i guess.


IMO this card is going to shine under water, or at least a proper cooler. But reference cooling is possible to run, as long as you have good airflow in your case, your card has room to breathe, and you take 2 minutes applying good paste.


----------



## skupples

Quote:


> Originally Posted by *Scotty99*
> 
> There was a guy in this thread getting 65c load temps with 100% fan, so im not entirely sure that water cooling is gonna be this cards saving grace. Time will tell i guess.


I lost track of that information... I was trying to figure out: if he's only hitting 65C under load, then temp isn't/shouldn't really be a limiting factor for low-level overclocking.


----------



## Scotty99

I don't think you read my post properly. If the guy in the thread wasn't being temp-limited when OCing with 100% fan speed, what is the limit? Obviously he didn't change the volts, because I don't think any OC software allows this yet, but at least at stock volts we aren't seeing that great OC potential.


----------



## Arm3nian

Quote:


> Originally Posted by *Scotty99*
> 
> I don't think you read my post properly. If the guy in the thread wasnt being temp limited when OC with 100% fan speed *what is the limit*? Obviously he didn't change the volts cause i dont think any OC software allows this yet but at least at stock volts we arent seeing that great OC potential.


The limit was his OC... if he wasn't reaching high temperatures, that means his OC wasn't high enough. Increasing clock speed also increases heat, not just increasing volts.


----------



## Scotty99

Umm huh? You are just agreeing with me in an odd way lol.

People are not getting good OC results with this card so far; they are simply cranking the fan up to simulate an aftermarket cooling solution. We need to see voltage-increased OC benchmarks to draw real conclusions, but at stock volts it's not very impressive.


----------



## DampMonkey

Quote:


> Originally Posted by *Scotty99*
> 
> There was a guy in this thread getting 65c load temps with 100% fan, so im not entirely sure that water cooling is gonna be this cards saving grace. Time will tell i guess.


The EK guys ran FurMark at 1200/1600 and the card maxed at 54°C. This thing will do just fine under water.


----------



## Scotty99

Quote:


> Originally Posted by *DampMonkey*
> 
> The EK guys ran furmark at 1200/1600 and the card maxed at 54*. This thing will do just fine under water


You misunderstood me...

Of course watercooling will make it cool; no idea how you could read my statement otherwise lol. My point is that the guy has a maxed fan to simulate aftermarket cooling and it still did not overclock well, removing thermals as a factor.


----------



## Arm3nian

Double post


----------



## Arm3nian

Quote:


> Originally Posted by *Scotty99*
> 
> Umm huh? You are just agreeing with me in an odd way lol.
> 
> People are not getting good OC results with this card so far, they are simply cranking the fan up to simulate an aftermarket cooling solution. We need to see voltage increasing OC benchmarks to make real conclusions, but at stock volts its not very impressive.


They are not getting good OC results because they are temperature-limited. Cranking the fan up to 100% changes nothing; it is still an abomination of a cooler at 50% fan speed and at 100% fan speed.
With water and stock volts, people are going to get high overclocks because they will not be temperature-limited.
*Then*, after we see that with water the actual volts are the limit instead of the temperatures, we will see unlocked-voltage overclocks.
Quote:


> Originally Posted by *Scotty99*
> 
> You misunderstood me...
> 
> Of course watercooling will make it cool, no idea how you can take my statement as thinking this lol. My point is the guy has a maxed fan to simulate aftermarket cooling and it still did not overclock well, removing thermals as a factor.


Why didn't he get a good overclock? People have gotten all the way up to 1450MHz with no mods. Are you telling me that he stopped overclocking because the card couldn't go any further? No, he stopped because if he went further the card would melt.


----------



## Scotty99

Eh, explain to me how 65C is a thermally limited OC?

Yes, of course the stock cooler is garbage, but if you crank the fan up it spins at something like 6K RPM, which is insanely loud but DOES cool pretty well at that point. I am astounded at the replies I am getting; my logic is fairly solid here and I'm still getting people arguing about things that seem pretty common sense to me.

Again, none of this is even relevant until we see people that have increased the voltage; stock voltages are just child's play anyway, you all know that.


----------



## Scotty99

Dude, just go find the post; his name was $ilent I think. The max stable OC he could muster was 1125MHz, and temp wasn't his limiting factor; he just couldn't get it stable above that.


----------



## Moustache

Quote:


> Originally Posted by *Nihaan*
> 
> Ofc, fan was at %100 and we had to use cotton ear plugs ( not kidding )
> 
> No we are not planning to do watercooling on his rig since he needs to buy a new monitor and he can't afford buying both.
> 
> So what we are having here is normal and not an issue ? How come none of the review sites didn't mention this. 290X seemed so superior on the reviews and i checked more than 10 of them yesterday.
> 
> Are you sure it is a normal thing to see this card behind an overclocked 780 ?
> 
> Do you know if there is a review out there where a max overclocked 290x being compared to a max overclocked 780 ?
> 
> At least we can try to check numbers and see if they are close or equal.
> 
> Thank you.


Whether the 290X's fan is at 100% or not, it's still running at an average of 900-950MHz or maybe lower, especially when you're playing a demanding game like Crysis 3. The 290X is definitely throttling; there is no doubt about that. The reference cooler is pretty "crappy". It needs to be under water, or you can slap on a VGA cooler such as the MK-26. Otherwise, it's better to wait for the custom 290Xs from Asus, Gigabyte, etc.

The 290X will run faster than the 780/Titan clock for clock, or at max OC, when the card is not throttling.


----------



## Arm3nian

Quote:


> Originally Posted by *Scotty99*
> 
> Dude just go find the post, his name was silent i think Max stable OC he could muster was 1125mhz, temp wasnt his limiting factor he just couldnt get it stable above that.


Almost every review and test got the card over 1125MHz.
Almost every review and test states the card was throttling, even with the fan at 100%.


----------



## DaaQ

Look into *this*, as was posted a few pages back.
The two benchmark programs he's using favor the 780. He is not telling us which benchmarks and settings were used, or what resolutions they are running. He does mention that the next upgrade is a monitor, if I'm not mistaken.
So to basically say it's the temps (which probably are a main component) or something else, like a bad card, is not doing any good in helping him figure out what's going on with his stuff.
What I'm trying to say is..... C'mon man, help a brother out, not just say slap a waterblock on it and you're good to go.









Quote:


> Originally Posted by *Nihaan*
> 
> Hello
> 
> Can someone help us please ?
> 
> Every site on the internet says 290X beats 780 and Titan easily so i told my cousin that to buy 290x instead of buying a second 780.
> 
> We spent more than 9 hours figuring out performance issues when card is overclocked.
> 
> Can anyone help us ?
> 
> Gonna make a list of what we have been doing.
> 
> 4930k Cpu + Rampage IV Extreme Motherboard with latest bios + 32 GB G.Skill Ram
> 
> We used latest drivers.
> Default clocked cpu.
> Overclocked cpu.
> OS reinstallation. Windows 7, 8 and 8.1.
> No sound card on.
> We always installed 290x and its drivers first then used driver sweeper to install 780 and its drivers.
> 
> We tested it on Crysis 3, Battlefield 3, Heaven 4, Valley 1.0, Bioshock, Metro LL and Tomb Raider.
> 
> On the same setup his max overclocked 780 performs way better than his max overclocked 290x.
> 
> Is this normal due to temperature limits or do we have a crappy card that doesn't overclock well ?
> 
> Can you share your stock and overclocked *Valley, Heaven*, Crysis 3, Metro LL, Tomb Raider benchmarks please.
> 
> Thank you.


----------



## Blackops_2

Quote:


> Originally Posted by *Arm3nian*
> 
> Almost every review and test got the card over 1125mhz.
> Almost every review and test states the card throttling with the fan at 100%.


This. Didn't EK already state 1200/1600 was relatively easy on water? Hence temperature is severely limiting this card's OCing ability, as of now anyway. I say we wait for water results, then custom-air results, before writing this card off as having no OCing headroom.


----------



## Scotty99

Ok, well maybe he has a poor sample, but why are the cards throttling at temps where they shouldn't be? If the cards are throttling at 65C, what makes you think they won't throttle under water? Seems like this throttling thing is the real problem that AMD needs to iron out.


----------



## fleetfeather

if cards are throttling based on any factor other than temp, it's not by design. sounds like a firmware bug to me


----------



## Arm3nian

Quote:


> Originally Posted by *DaaQ*
> 
> Look into *this* , as was posted a few pages back.
> The two benchmark programs he's using favor the 780. He is not telling us what his benchmarks and settings used were. Also what resolutions they are using.
> He does mention that next upgrade is a monitor if I'm not mistaken.
> So to basically say, it's the temps (which probably is a main component) or something else like it is a bad card, is not doing any good to help him figure out whats going on with his stuff.
> What I'm trying to say is..... C'mon man, help a brother out, not just to say slap a waterblock on it and your gtg


You can troubleshoot or tell him what to do differently all you want, but I'm giving him the hard truth. This thing isn't meant to be run on air... If you want air wait for a non reference lol. I don't see the point of buying something not designed for that purpose when there are other options.

*However*, as I said in a couple of posts back, you can try and improve case airflow, or try and apply better paste. My point is that you should not be expecting high overclocks with that cooler, especially when every review so far has stated it throttling. If he is going for 1100mhz, kudos to him.


----------



## Arm3nian

Quote:


> Originally Posted by *Scotty99*
> 
> Ok well maybe he has a poor sample, but why are the cards throttling at temps where they shouldnt be doing so? If the cards are throttling at 65c what makes you think they wont throttle under water? Seems like this throttling thing is the real problem that amd needs to iron out.


I never said that his card is throttling at 65C; I said he didn't push it far enough. Why is he the only one I've seen out of many people unable to go past 1125MHz? Did he specifically say he couldn't get it stable no matter what he did, or did he say that any more overclocking resulted in the card throttling due to temps rising?
Quote:


> Originally Posted by *fleetfeather*
> 
> if cards are throttling based on any factor other than temp, it's not by design. sounds like a firmware bug to me


Yeah, the last series clocked well; why would AMD release an updated version of GCN and lose that capability?


----------



## skupples

Sounds like 290x club is going to require a hex editor for custom bios.


----------



## DampMonkey

Quote:


> Originally Posted by *Scotty99*
> 
> You misunderstood me...
> 
> Of course watercooling will make it cool, no idea how you can take my statement as thinking this lol. My point is the guy has a maxed fan to simulate aftermarket cooling and it still did not overclock well, removing thermals as a factor.


You said watercooling would not be this card's saving grace. I said EK has easily done 1200/1600 without voltmodding. I'll let you know on Tuesday, when my block comes in, how well this thing clocks.


----------



## th3illusiveman

Quote:


> Originally Posted by *fleetfeather*
> 
> if cards are throttling based on any factor other than temp, it's not by design. sounds like a firmware bug to me


This. I think it's highly unlikely, but since AMD is relatively new to setting temperature targets in CCC, I wouldn't put it past them to mess it up.


----------



## Arm3nian

Quote:


> Originally Posted by *skupples*
> 
> Sounds like 290x club is going to require a hex editor for custom bios.


Lets bribe skyn3t with +rep if he succeeds lol


----------



## Scotty99

That's exactly what my point here is: people have shown with 100% fan speed on the reference cooler that temps are NOT the limiting factor. Something else is happening here. Firmware was the word I was looking for; I just couldn't think of it lol.


----------



## DaaQ

If I remember right, these cards throttle to stay at 94C. Once they hit 95C they will throttle down to keep at 94C, correct? AMD has said they will run at 94C all day long with no problems, right?
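That temperature-target behavior can be pictured as a simple control loop: back the clock off whenever the die crosses the target, creep back up when there's headroom. A toy sketch (the step size and clock floor are illustrative guesses, not AMD's actual PowerTune firmware logic):

```python
TEMP_TARGET = 94      # °C, the advertised operating target
BASE_CLOCK = 727      # MHz, roughly the observed throttle floor (assumed)
BOOST_CLOCK = 1000    # MHz, the advertised "up to" clock
STEP = 13             # MHz per adjustment tick (made-up granularity)

def next_clock(current_clock, temp):
    """One tick of a temperature-target throttle: back off while the
    die is over target, creep back up once there is headroom."""
    if temp > TEMP_TARGET:
        return max(BASE_CLOCK, current_clock - STEP)
    return min(BOOST_CLOCK, current_clock + STEP)
```

At equilibrium the clock hovers just under whatever the cooling can sustain at 94C, which is why a stronger cooler raises the *average* clock even though the "up to" clock setting never changes.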

Maybe the monitor they are using is 1080p or less??? (hope not). From the OCUK site, and I thought I heard this here as well, maybe not in this thread but on OCN: the ASUS BIOS and GPU Tweak are giving better OC results.

Nihaan, can you post your results for your 290X and the 780? What are his max 780 clocks, and which 780 is it? What resolution are you guys using, and why not add in Fire Strike and 3DMark as well?

Here are some results from the link I put up in my post above (source), for those that don't wish to follow the link: [benchmark screenshots]

Edit: exceptional scaling in CFX


----------



## selk22

Well, after playing a few hours of Bioshock, then Planetside 2, and also some Far Cry 3, it seems that on the stock cooler touching the memory clock at all causes issues and artifacting... but I can play with the core clock all day!

Like I stated before, MSI Afterburner displays everything properly except the GPU usage, which right now only GPU-Z reports accurately for me.

It was rather easy to keep temps under control by setting a custom profile where the fan can reach up to 65% under full load. It is loud, yes, but I mostly play games with headphones, and to be quite honest it is barely louder than my 580 was.

I have it clocked to 1150/1250.

Game temps on the custom profile are 65-80C.

So far I am VERY happy with the performance of this card in games. I am maxing everything out without any issues and can't wait for BF4.
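A custom profile like that is just a temp-to-duty curve. A minimal sketch with linear interpolation between breakpoints (the breakpoints here are made up, capped at 65% in the spirit of selk22's profile):

```python
# Hypothetical breakpoints: quiet at idle, ramping to a 65% cap
# under load. Each pair is (temp in °C, fan duty in %).
CURVE = [(40, 20), (60, 40), (75, 55), (85, 65)]

def fan_percent(temp):
    """Linearly interpolate fan duty between breakpoints,
    clamping below the first point and above the last."""
    if temp <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp <= t1:
            return f0 + (f1 - f0) * (temp - t0) / (t1 - t0)
    return CURVE[-1][1]
```

Tools like MSI Afterburner apply essentially this mapping every polling interval; the cap is what trades a few degrees of temperature for noise.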


----------



## binormalkilla

Well I'm happy to report that so far I don't get the tearing that plagued Eyefinity users that mixed display interfaces. With my 7950s in Crossfire I got horrible tearing on the HDMI monitor (the others were using Displayport). Looks like I won't have to hunt down a Displayport MST hub after all.

I haven't played any games yet, so hopefully they're tear free as well.

BTW anyone tried this card in Linux yet? Looks like AMD released a 'compatible' driver today.


----------



## fleetfeather

Quote:


> Originally Posted by *selk22*
> 
> Well after playing a few hours of bioshock and then planetside 2 and also some Far cry 3 it seems that on the stock cooler playing with the memory clock at all causes issues and artifacting.. but I can play with the core clock all day!
> 
> Like I stated before MSI afterburner displays everything properly except the GPU usage which right now for me only GPUz reports accurate.
> 
> It was rather easy to keep temps under control by setting a custom profile where the fan can reach up to 65% under full load. It is loud yes but I do play games mostly with headphones and to be quite honest it is barely louder than my 580 was.
> 
> I have have it clocked to 1150/1250
> 
> Game temps on custom profile are 65-80c
> 
> SO far I am VERY happy with the performance of this card in games. I am maxing everything out without any issues and cant wait for bf4


how much throttling are you getting with 1150/1250 @ 65% fan speed? I too use headphones (closed back), so noise is less of a factor to me.

E: oh, and you said the memory was holding you back a bit; what brand card do you have? (read: what brand memory do you have)


----------



## Arm3nian

Quote:


> Originally Posted by *Scotty99*
> 
> Thats exactly what my point here is, people have shown with 100% fan speeds on the reference cooler that temps are NOT the limiting factor. Something else is happening here. Firmware was the word i was looking for i just couldnt think of it lol.


Most reports I've seen do say temps are the limiting factor, and I believe they are. But there could also be other factors; I'm not counting that out. Look at Ivy-E: most chips hit a wall at 4.6GHz without being anywhere near their temp limits. Future BIOS updates to the motherboards improved it, but not as much as we would like. It is why everyone is waiting for the RIVE BE.

Drivers and BIOSes are still new for the 290X; give it time. Everything will unfold shortly.


----------



## DampMonkey

Quote:


> Originally Posted by *skupples*
> 
> Sounds like 290x club is going to require a hex editor for custom bios.


I don't think the 3DMark record guys needed a BIOS mod for their clock (1435MHz or whatever it was).


----------



## selk22

Quote:


> Originally Posted by *fleetfeather*
> 
> how much throttling are you getting with 1150/1250 @ 65% fan speed? I too use headphones (closed back), so noise is less of a factor to me..
> 
> E: oh, and you said the memory was holding you back a bit; what brand card do you have? (read: what brand memory do you have)


Thermals are a non-issue, like I stated, so I'm at a constant 1100-1150 depending on what the game needs.

I have a Sapphire card, and the only reason I believe the memory OCs are holding me back is the cooler. I know once it's under water this won't be an issue.

Also, I should mention I'm on 1200p.


----------



## Arm3nian

Quote:


> Originally Posted by *DampMonkey*
> 
> I dont think the 3dmark record guys needed a bios mod for their clock, 1435 or whatever it was


Yeah, Smoke and friends mentioned no sign of any mods. That's why, even with LN2, they only got the ~1450MHz.


----------



## fleetfeather

Quote:


> Originally Posted by *Scotty99*
> 
> Thats exactly what my point here is, people have shown with 100% fan speeds on the reference cooler that temps are NOT the limiting factor. Something else is happening here. Firmware was the word i was looking for i just couldnt think of it lol.


Another thing to note is that $ilent has (unfortunately!) been having a fair few issues with his PC so far (driver issues, strange blank screens, etc.), so it could be a one-off problem...


----------



## Arm3nian

Quote:


> Originally Posted by *fleetfeather*
> 
> another thing to note is $ilent has (unfortunately!) been having a fair few issues with his PC so far (driver issues, strange blank screens etc.), so it could be a one-off problem...


He is the only one I've seen so far who WASN'T temp limited.


----------



## fleetfeather

Quote:


> Originally Posted by *Arm3nian*
> 
> He is the only one i've seen so far that WASN'T temp limited.


Sorry, I'm tired; are you agreeing with me or disagreeing? Hahah


----------



## Arm3nian

Quote:


> Originally Posted by *fleetfeather*
> 
> Sorry i'm tired, are you agreeing with me, or disagreeing? hahah


I meant that most people I've seen run into the thermal barrier, not the voltage barrier. So agreeing, lol.


----------



## DStealth

Quote:


> Originally Posted by *DampMonkey*
> 
> I dont think the *3dmark record* guys needed a bios mod for their clock, 1435 or whatever it was


Sure, disabling tessellation was more than enough.


----------



## DaaQ

Quote:


> Originally Posted by *Arm3nian*
> 
> You can troubleshoot or tell him what to do differently all you want, but I'm giving him the hard truth. This thing isn't meant to be run on air... If you want air wait for a non reference lol. I don't see the point of buying something not designed for that purpose when there are other options.
> 
> *However*, as I said in a couple of posts back, you can try and improve case airflow, or try and apply better paste. My point is that you should not be expecting high overclocks with that cooler, especially when every review so far has stated it throttling. If he is going for 1100mhz, kudos to him.


Really? Lol, if it was not *designed* to run on air, it would ship with a block on it.
Dude was comparing a max-OC 290X vs a max-OC 780 and complaining that the 780 performed better. He never gave out his OC numbers for either card. So what exactly is being argued here, and for what? I put up bench results attained on air with no extra volts. 8Pack said he was doing LN2 this weekend, so keep an eye on that thread if anyone wants to see how well these cards do when cold.


----------



## tsm106

Hmm, let's see... my son's rig does not have the best CPU, and it's holding me back. When I get my real blocks, then we'll really see what it can do. But for now I chose to try the hardest bench atm. My son's loop is on the small side: 35mm-thick rads, 360 + 240. Temps maxed at 55C. Would love to throw these cards into my main rig's loop, muahaha.

[email protected] 1250/[email protected] load = 79.1 fps.


----------



## Arm3nian

Quote:


> Originally Posted by *tsm106*
> 
> Hmm, lets see... my son's rig does not have the best cpu and its holding me back. When I get my real blocks then we'll really see what it can do. But for now, I chose to try the hardest bench atm. My son's loop is on the small side, 35mm rads sized 360+240. Temps maxed at 55c. Would love to throw these cards into my main rig's loop muahaha.
> 
> [email protected] 1250/[email protected] load = 79.1 fps.


Was it not stable if you went higher on the clocks?


----------



## tsm106

The CPU? Yea. Though I could've sworn it was 5GHz stable... but I could have mixed the chip up with my other 3820. Doh. Maybe that's what it is. I have a good feeling it will hit the numbers the Titan guys are doing, easy.


----------



## Scotty99

Ooh, finally a real result, one with volts modified! Is 1250 on 1.3v the highest you could get stable, or didn't you try going past that?


----------



## Arm3nian

Quote:


> Originally Posted by *DaaQ*
> 
> Really? Lol , if it was not *designed* to run on air, it would ship with a block on it.
> Dude was comparing max oc 290x vs max oc 780 and complaining that 780 performed better. He never gave out his oc numbers for either card. So what exactly is being argued here and for what? I put up bench results attained on air with no volts. 8 pack said he was doing Ln2 this weekend so keep an eye on that thread if anyone is wanting to see how well these cards will do when cold.


If it shipped with a block on it, it would not appeal to most buyers and would be expensive. The cooler is cheap, and it's made to be taken off, lol.

He was basically comparing a max-OC 780 vs a stock 290X, because the 290X was most likely throttling. The benches you put up show a stock 780 vs a stock 290X, not an OC'd 780.


----------



## $ilent

Quote:


> Originally Posted by *Arm3nian*
> 
> I meant that most people I've seen run into the thermal barrier, not volt barrier. So agreeing lol.


Your posts don't make much sense to me.

These cards will run at 94C whether you're at 1000MHz or 1150MHz with the fan in quiet mode. It's only when you ramp the fan up to 60%+ that the temperature comes down. My card actually did better with the fan left on auto, running at 94C, than it did when I ran the fan at 100% and temps were only 63C.


----------



## rubicsphere

Quote:


> Originally Posted by *tsm106*
> 
> Hmm, lets see... my son's rig does not have the best cpu and its holding me back. When I get my real blocks then we'll really see what it can do. But for now, I chose to try the hardest bench atm. My son's loop is on the small side, 35mm rads sized 360+240. Temps maxed at 55c. Would love to throw these cards into my main rig's loop muahaha.
> 
> [email protected] 1250/[email protected] load = 79.1 fps.
> 
> 
> 
> Spoiler: Warning: Spoiler!


That temp though! It's over 9000


----------



## Arm3nian

Quote:


> Originally Posted by *tsm106*
> 
> The cpu? Yea. Though I coulda sworn it was 5ghz stable... but I could have mixed the chip with my other 3820. Doh. Maybe that's what it is. I have a good feeling it will hit the numbers the titan guys are doing easy.


I was actually talking about the card, lol. You see, my 290X's are sitting in their box because I have no motherboard, so I don't know the exact stock voltage. My question was: what was stopping you from going higher on the cards?


----------



## th3illusiveman

Quote:


> Originally Posted by *selk22*
> 
> Thermals are 0 issue like I stated so I am at a constant 1150-1100 depending on what the game needs.
> 
> I have a sapphire card and the only reason I believe that the mem OC's are holding me back is because of the cooler. I know once its under water this wont be an issue
> 
> Also should mention im on 1200p


Did you flip the BIOS switch? I remember reading in a review somewhere that they couldn't OC the memory on one of the two BIOS switch settings, but when they changed it they managed to get to ~1500MHz+ (*400GB/s*)
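As a sanity check on that 400GB/s figure: GDDR5 moves four bits per pin per memory clock, so peak bandwidth is just bus width times memory clock times four. A minimal sketch (the 512-bit bus and 1250MHz stock clock are Hawaii's published specs; the helper name is mine):

```python
def gddr5_bandwidth_gbs(bus_width_bits, mem_clock_mhz):
    """Peak memory bandwidth in GB/s for GDDR5 (quad data rate)."""
    bytes_per_transfer = bus_width_bits / 8       # 512-bit bus -> 64 bytes
    transfers_per_sec = mem_clock_mhz * 1e6 * 4   # GDDR5: 4 transfers per clock
    return bytes_per_transfer * transfers_per_sec / 1e9

print(gddr5_bandwidth_gbs(512, 1250))  # stock 290X: 320.0 GB/s
print(gddr5_bandwidth_gbs(512, 1565))  # ~1565MHz OC lands right around 400 GB/s
```

So the "~1500MHz+ for 400GB/s" claim checks out arithmetically: you need roughly 1565MHz on a 512-bit bus to hit 400GB/s.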


----------



## Arm3nian

Quote:


> Originally Posted by *$ilent*
> 
> your posts dont make much sense to me.
> 
> These cards will run at 94C wether your at 1000mhz or 1150mhz with the fan at quiet mode. Its only when you ramp up the fan to 60%+ does that temperature comes down. My card actually did better with the fan left on auto and running at 94C than what it did when I ran it at 100% and temps were only 63C.


My point is that every result I've seen shows temperature being the limit. According to Scotty, temperature was not your limit.


----------



## th3illusiveman

Quote:


> Originally Posted by *tsm106*
> 
> Hmm, lets see... my son's rig does not have the best cpu and its holding me back. When I get my real blocks then we'll really see what it can do. But for now, I chose to try the hardest bench atm. My son's loop is on the small side, 35mm rads sized 360+240. Temps maxed at 55c. Would love to throw these cards into my main rig's loop muahaha.
> 
> [email protected] 1250/[email protected] load = 79.1 fps.


This is with no software mods, right? Very impressive jump in performance there: from ~65fps max OC on OCN on air to ~80fps on water (which proves these beasts are heavily temp limited).

I wonder what Alatar will say when you get over 90fps using the software mods he uses to boost scores.

He'll probably start whining about the power draw at that point, because there will be nothing left, lol.


----------



## Arm3nian

Quote:


> Originally Posted by *th3illusiveman*
> 
> this is with no software mods right? Vey impressive jump in performance there, from ~65fps max OC on OCN on air to ~80fps with water (proves these Beasts are heavily temp limited).
> 
> I wonder what alatar will be saying when you get over 90fps using the software mods to boost scores he does.
> 
> probably start whining about the power draw at that point because there will be nothing left lol.


So his score is with stock volts? I saw a BIOS for unlocked voltage, so I thought he used that. Anyway, awesome news if that is with stock volts. I will be satisfied when my 290X gets 93fps+ in Valley.


----------



## Havolice

I just bought two of these puppies,

and I'm shocked at how freaking loud the fan is on the desktop; it's like a damn wind tunnel blowing.
It also ramps up and down sporadically in speed for no reason, while the tuning panel keeps saying it's at 20% and cannot reach 100%.

I think it reached 100% a few times ><

So something is off on one of the cards.

BTW, noob question: I downloaded the 13.11 beta6 drivers, but they also came with a mobility driver. Did I need that as well?


----------



## $ilent

tsm's score is with a voltage mod: these cards run only 1.25v stock, and he is at 1.3v.

Also, my card is now artifacting like crazy any time I go near the memory.

It was literally doing fine at the start of the day, but as the day went on the artifacts got worse, to the point where I get artifacts in every benchmark I run. Sound like a faulty card?


----------



## Sgt Bilko

Quote:


> Originally Posted by *$ilent*
> 
> tsms score is with voltage mod, these cards only have 1.25v, he is at 1.3v.
> 
> Also my card is artifacting like crazy now anytime I go near the memory.
> 
> Literally the start of the day it was doing fine, but as the day got on the artifacts have got worse, to the point where I get artifacts in every benchmark I do. Sound like a faulty card?


Sounds like it's faulty, yeah.

That's just... annoying.


----------



## CTM Audi

Quote:


> Originally Posted by *tsm106*
> 
> Hmm, lets see... my son's rig does not have the best cpu and its holding me back. When I get my real blocks then we'll really see what it can do. But for now, I chose to try the hardest bench atm. My son's loop is on the small side, 35mm rads sized 360+240. Temps maxed at 55c. Would love to throw these cards into my main rig's loop muahaha.
> 
> [email protected] 1250/[email protected] load = 79.1 fps.


How is that CPU holding it back on a bench that is very GPU-bound? The CPU makes extremely little difference in Heaven.

And that loop isn't on the small side either. A 360+240 loop is overkill for a CPU + 1x GPU setup (assuming that run was with one card).

Still seems like a nice OC, and a nice score.


----------



## SoloCamo

Anyone having an issue where the card isn't ramping down: 100% GPU usage reported, with the core staying at full clock but the memory downclocking? I'm using Afterburner for it. Sometimes a reboot fixes it, and it usually only pops up after a while of gaming. Not sure if that's actually happening or if it's AB playing tricks; then again, the temps from two programs stay warmer, as if the card is under load.

Currently at 1100/1325 without so much as trying, and it's stable with zero signs of throttling. Of course I adjusted the fan profile and sacrificed acoustics a bit, but by no means is it any worse than the XFX Double D 7970 GHz Edition I kept at 1225/1625.

Also, any word on when GPU-Z will report properly?

Using Cat 13.11 beta6 - any recommended drivers at this point?


----------



## Scotty99

IIRC someone in this thread said that's a bug with Afterburner; try a different program to monitor the GPU clocks and see.


----------



## Havolice

None of the GPU programs I use seem able to adjust the second card.

Every program sees it,

but Furmark says you have two GPUs but the second one is not available.

In Afterburner I cannot adjust the fan speed of the second card ><

Great, did I buy a faulty one again like always? Sigh.

NVM, they both work. With the Crossfire method the R9 290X uses, the system and Furmark see it as one solid card with two GPUs.

As soon as you turn Crossfire off, Furmark displays both GPUs separately.

Still odd though: as soon as I play a game or run Furmark the fans go totally silent, and when I turn it off they go into rage mode again, lol -.-

In short, I think the drivers, and maybe the firmware addressing the fans, need some serious work.


----------



## $ilent

Quote:


> Originally Posted by *SoloCamo*
> 
> Anyone having an issue where it's not ramping down and saying 100% gpu usage with the core remaining full but the memory downclocking? Using afterburner for it.. sometimes a reboot fixes it and it usually only pops up after a while of gaming. Not sure if it's actually the case or if'ts AB playing tricks, then again, the temps from two programs are staying warmer as if it's under load.
> 
> Currently at 1100 and 1325 without so much as trying and it's stable with zero signs of throttling.. of course I adjusted the fan profile and sacrificed acoustics a bit but by no means is it any worse then the XFX Double D 7970 GE I kept at 1225/1625
> 
> Also, any word on when gpu-z will report properly?
> 
> Using cat 11 beta 6 - any recommended drivers at this point?


Mine has been doing this pretty much non-stop since I bought it.


----------



## rv8000

So tempted to sell my 780 and pick up a 290X or 290. Keep the benches and tests coming.

Does GPU-Z report VRM temps for these cards properly yet, and if so, what are they around?

Funny side note: so many people are complaining about how hot these cards get, but I honestly don't believe my 780 is any better. At stock settings the thing will insta-throttle after 10 minutes of play and stick at 80C and stock core. A stock cooler is a stock cooler.


----------



## $ilent

Quote:


> Originally Posted by *rv8000*
> 
> So tempted to sell my 780 and pick up a 290x or 290. Keep the benchs and tests coming
> 
> Does gpuz report vrm temps for these cards properly yet, and if so what are they around?
> 
> Funny side note, so many people complaining about how hot these cards get, i honestly dont believe my 780 is any better, stock settings the thing will insta throttle after 10 mins of play and stick at 80c and stock core. Stock cooler is a stock cooler


The only things GPU-Z reports correctly are the GPU temperature, core clock, and memory clock. Literally everything else is wrong.

My gut feeling is the drivers are just so bad that this card won't be normal/stable for a while to come yet. With that being said, how long are we honestly expected to wait for a decent full driver, not a beta? There's no way I can play BF4 in a week's time on this driver, not a cat's chance in hell.


----------



## th3illusiveman

Quote:


> Originally Posted by *$ilent*
> 
> tsms score is with voltage mod, these cards only have 1.25v, he is at 1.3v.
> 
> Also my card is artifacting like crazy now anytime I go near the memory.
> 
> 
> 
> Literally the start of the day it was doing fine, but as the day got on the artifacts have got worse, to the point where I get artifacts in every benchmark I do. Sound like a faulty card?


Return it for a refund and wait until mid-November, when your options expand:

> More info on the GTX 780 Ti (it may have launched by then)
> A price cut on the GTX 780, in case you want to get that instead
> Custom-cooled R9 290Xs will start saturating the market
> The R9 290 will have been released

Those make for some excellent reasons to wait, imo.

_Or you could return it for a new one._ *Either way your card is faulty;* it sucks, but it happens.

Whatever you choose, good luck.








Quote:


> Originally Posted by *rv8000*
> 
> So tempted to sell my 780 and pick up a 290x or 290. Keep the benchs and tests coming
> 
> Does gpuz report vrm temps for these cards properly yet, and if so what are they around?
> 
> Funny side note, so many people complaining about how hot these cards get, i honestly dont believe my 780 is any better, stock settings the thing will insta throttle after 10 mins of play and stick at 80c and stock core. Stock cooler is a stock cooler


Hmm, that's a tough one... You either:


*Sell it now and make enough money to completely pay for a new R9 290X while the GTX 780 still has an MSRP of $650*

Negative side of this is that you deal with the launch issues all cards go through; currently that seems to be a terrible stock cooler, which induces significant throttling, and/or potentially faulty drivers.

*Keep your card and wait for the R9 290X drivers to mature and for custom-cooled cards to hit the market, so that when you buy, there are no issues.*

Negative of this is that by then you won't be able to sell your GTX 780 for as much, because it will likely have gone through a price cut.

*Keep your card, which is the third-fastest single GPU available.*

Negative of this is that a card $100 cheaper runs faster than it and has 1GB more VRAM.

Decisions, decisions, lol.


----------



## $ilent

But it only seems to do it when I touch the memory. Does that still sound faulty?


----------



## Scotty99

Don't think it's a question of being faulty or not; cards are only guaranteed to work at stock speeds. But then you need to ask yourself if you are OK with having to run stock memory clocks, and there is where the answer lies.


----------



## Sgt Bilko

Well......I wasn't expecting an ASUS version to cost this much: ASUS Radeon R9 290X, 4GB + Battlefield 4 Coupon - $849.00

Just for comparison's sake, from the same site: Gigabyte Radeon R9 290X, 4GB + Battlefield 4 Coupon - $699.00

This is Australia, mind you, but still.......$150 extra for a differently branded reference card?


----------



## rubicsphere

Quote:


> Originally Posted by *rv8000*
> 
> So tempted to sell my 780 and pick up a 290x or 290. Keep the benchs and tests coming
> 
> Does gpuz report vrm temps for these cards properly yet, and if so what are they around?
> 
> Funny side note, so many people complaining about how hot these cards get, i honestly dont believe my 780 is any better, stock settings the thing will insta throttle after 10 mins of play and stick at 80c and stock core. Stock cooler is a stock cooler


I had 780s in SLI a while back and had the exact same issues, except my top card would reach 86C. I was very disappointed with that setup for the $1300 it cost.


----------



## froyang

From OCUK: the ASUS BIOS does indeed help OC. Try it, please.


----------



## th3illusiveman

Quote:


> Originally Posted by *$ilent*
> 
> but it only seems to do it when I touch the memory. Still sound faulty?


Did you flip the BIOS switch and retry your OC? Are there any penalties for returning the card?


----------



## anticommon

Quote:


> Originally Posted by *th3illusiveman*
> 
> hmm that's a tough one... You either:
> 
> 
> *Sell it now and make enough money to completely pay for a new R9-290X while the GTX780 still has an MSRP of $650*
> 
> Negative side of this is that you deal with the launch issues all cards go through, currently it seems to be a terrible stock cooler which induces significant throttling and/or potentially faulty drivers
> 
> *Keep your card and wait for the R9-290X drivers to mature as well as custom cooled cards to hit the market, that way when you buy it, there are no issues.*
> 
> Negative of this is that by then you won't be able to sell your GTX780 for as much because it would have likely gone through a price cut
> 
> *You keep your card which is the third fastest single GPU available*
> 
> Negative of this is that a card that is $100 cheaper runs faster then it and has 1GB more Vram
> 
> decisions.. decisions lol


I don't entirely see how the 780 would be considered the third-fastest card. The two cards seem to trade blows, and the 780 surely surpasses it when both are overclocked. If anything, I'd say going from a 780 to a 290X, or vice versa, is pretty much the definition of a side-grade, unless you are looking to get either one specifically for its special features.


----------



## $ilent

Quote:


> Originally Posted by *th3illusiveman*
> 
> Did you flip the bios switch and retry your OC? Are there any penalties in returning the card?


This is on the uber BIOS, yeah. I would try the ASUS BIOS, but the latest version of GPU-Z doesn't let you back up the 290X BIOS, so I'm reluctant to try it.


----------



## Arizonian

Well, played some games at default to see where the baseline is. Set a fan profile using AB. 2560 x 1440 resolution, in quiet mode.

Crysis 2 / 104 FPS - 120 FPS / 84C / 70% fan - Advanced settings, 16x
Crysis 3 / 45 FPS - 51 FPS / 84C / 70% fan - Very High settings, AA disabled

Tried to crank up BF3; neither campaign nor multiplayer works. Both show the AB OSD starting up, then a black screen, then white/not responding.

I have a feeling my Nvidia drivers are in the way. It's bugging me, so I'm going to have to do a clean install. Luckily I've backed up all my stuff in anticipation of this very thing.

I've not tried overclocking just yet. Anyone else try BF3 and have issues? If not, then it's a fresh install of Windows 8.1 for me.
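For anyone copying the custom fan profile idea: Afterburner's fan curve is just piecewise-linear interpolation between (temperature, fan%) points. A rough sketch of that logic in Python; the curve points below are made up for illustration, not anyone's actual profile:

```python
# Hypothetical custom fan curve: (temp C, fan %) points, as in Afterburner's editor.
CURVE = [(40, 20), (65, 40), (80, 65), (94, 100)]

def fan_speed(temp_c):
    """Linearly interpolate the fan % between curve points, clamping at both ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(72.5))  # halfway between the 65C and 80C points -> 52.5%
```

The trade-off discussed in the thread is exactly the steepness of this curve: steeper points mean lower temps and less throttling at the cost of noise.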


----------



## DBGT

How much time does it usually take for the third-party partners to release their 290X cards?


----------



## Forceman

Quote:


> Originally Posted by *Scotty99*
> 
> Ok well maybe he has a poor sample, but why are the cards throttling at temps where they shouldnt be doing so? If the cards are throttling at 65c what makes you think they wont throttle under water? Seems like this throttling thing is the real problem that amd needs to iron out.


Did you increase the power limit? Maybe you are power throttling - I think Guru3D ran into the same problem with their card.


----------



## djriful

Quote:


> Originally Posted by *Arizonian*
> 
> Well played some games at default to see where the base is. Set a fan profile using AB. 2560 x 1440 resolution in quite mode.
> 
> Crysis 2 / 104 FPS - 120 FPS / 84 C / 70% Fan - Advanced settings 16x
> Crysis 3 / 45 FPS - 51 FPS / 84 C / 70% Fan - Very High Settings AA Disabled
> 
> Tried to crank up BF 3 neither campaign or multiplayer work. Both show AB OSD starting up, then black screen, then white not responding.
> 
> *I have a feeling my Nvidia drivers are in the way. It's bugging me so I'm going to have to do a clean install. Luckily I've backed up all my stuff in anticipation of this very thing.*
> 
> I've not tried over clocking just yet. Anyone else try BF3 and have issues? If not, then it's fresh install of windows 8.1 for me.


Use this tool here: http://goo.gl/uj75kz


----------



## Arm3nian

Quote:


> Originally Posted by *Arizonian*
> 
> Well played some games at default to see where the base is. Set a fan profile using AB. 2560 x 1440 resolution in quite mode.
> 
> Crysis 2 / 104 FPS - 120 FPS / 84 C / 70% Fan - Advanced settings 16x
> Crysis 3 / 45 FPS - 51 FPS / 84 C / 70% Fan - Very High Settings AA Disabled
> 
> Tried to crank up BF 3 neither campaign or multiplayer work. Both show AB OSD starting up, then black screen, then white not responding.
> 
> I have a feeling my Nvidia drivers are in the way. It's bugging me so I'm going to have to do a clean install. Luckily I've backed up all my stuff in anticipation of this very thing.
> 
> I've not tried over clocking just yet. Anyone else try BF3 and have issues? If not, then it's fresh install of windows 8.1 for me.


I wouldn't worry too much about BF3; it is a broken game, imo. When I had my 690s, I could run every single game or bench with my overclock, but not BF3: even with a 10MHz overclock it crashed. No driver helped.


----------



## Scotty99

WoW is that game for me. Everything I did on my PC worked fine with my overclock, but I would actually get artifacts in WoW. Some games are just like that, and which games they are varies by GPU model.


----------



## Arizonian

Quote:


> Originally Posted by *djriful*
> 
> Use this tool here: http://goo.gl/uj75kz


Dude, you got a +1 rep for that, because it removed the Nvidia folder that I couldn't delete due to it being held by an open program. It also removed the entry from Control Panel. Giving BF3 a try now and adding this to the OP.


----------



## MIGhunter

Quote:


> Originally Posted by *Arizonian*
> 
> Dude you got a +1 rep for that because it took away the Nvidia folder that I couldn't do to it being in an open program. Also removed from Control panel. Giving BF3 a try now and adding this to OP page.


Unlocker works too http://www.filehippo.com/download_unlocker/


----------



## Arm3nian

Quote:


> Originally Posted by *Scotty99*
> 
> Wow is that game for me. Everything i did on my PC worked fine with my overclock but i would actually get artifacts in wow. Some games are just like that, and it varies on GPU model on which games that will be.


Yeah, I suspect the problem is that there isn't a specific driver for every card; one driver supports an entire lineup. The driver is designed to operate the same on all the cards, but we all know that isn't the case.

I wasn't the only one with BF3 problems, though; lots of people had that instant crash or the "display driver has stopped responding" nonsense.


----------



## Forceman

Hmmm, for some reason I was thinking BF4 launches on Wednesday instead of Tuesday (or right after midnight Monday). So no pre-loading for me.

On a related note: what's with Newegg restricting their shipments to delivery only? I tried to arrange to pick my card up at the FedEx facility and was told they couldn't release it because the shipper wouldn't allow it.


----------



## Arizonian

Thanks again, djriful, that did the trick. BF3 sparked up in campaign for a quick test run. Again with a custom AB profile at 2560 x 1440:

BF3 / 78C / 54% fan / 52 FPS - 60 FPS - Ultra default settings across the board.

Whew... now I can finally look into some overclocking. I'm going to grab a beer and see what this card has, running some benches. Find stability.

Initial thoughts: "not a Titan killer," but at this price, which I value at $519 after the BF4 code, it's easily the best price/performance right now. Still have a lot to learn about uber and quiet modes, because it's weird. I don't think I need to ramp up the fan with a custom profile to get the same performance, even if the card gets hotter.

Switched to uber mode, as there isn't really any other choice here.


----------



## SonDa5

I got a 290X today.

I could have waited for the 290, but I opted for the 290X because I really felt AMD earned the sale, plus I've been nickel-and-diming rebates for a long time and had some credit card cash rewards I was able to use. A Sapphire 290X for $579.99 with BF4 and a 5%-off Newegg coupon helped make up my mind. This is the first high-end video card I have bought at launch. It looks so neat in its box; I haven't even opened it up yet. The box states it supports TriXX for tweaking, so I'm going to play with that soon. The box also recommends a 750W PSU with one 150W 8-pin connector and one 75W 6-pin connector, quite a high PSU requirement for a single video card.
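Those connector requirements line up with the PCIe power budget for the board itself. A quick sanity check on the numbers; the slot and connector limits are the standard PCIe figures, while the ~290W typical draw is a reviewer ballpark, not an official AMD TDP:

```python
# Standard PCI Express power limits for a reference 290X's connector layout.
SLOT_W = 75        # x16 slot delivers up to 75 W
SIX_PIN_W = 75     # one 6-pin PEG connector
EIGHT_PIN_W = 150  # one 8-pin PEG connector

max_board_power = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(max_board_power)  # 300 W in-spec ceiling for the card alone

typical_gaming_draw = 290  # rough reviewer figure for Uber mode (assumption)
print(max_board_power - typical_gaming_draw)  # very little in-spec headroom left
```

That near-300W board draw, plus an overclocked CPU and the rest of the system, is why the box asks for a 750W unit even for a single card.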


----------



## Scotty99

They overrate those numbers massively; you could probably run these in Crossfire on a 750W.


----------



## Arm3nian

Quote:


> Originally Posted by *Arizonian*
> 
> Initial thoughts - 'not a titan killer'


This is going to come down to overclockability under water with a volt mod. I predict epic bench wars soon.


----------



## selk22

Yeah, so far I am extremely happy with the card. Once you take away the thermal throttling, at least on the chip itself, it seems to perform excellently. I can't wait to cool the VRMs with a block, though, so I can actually OC the memory...

Price/performance, this is a Titan killer. On raw power alone I assume the 780 may take it by a nose, but keep in mind it's $100 cheaper, so no complaints here.


----------



## mboner1

Quote:


> Originally Posted by *th3illusiveman*
> 
> return it for a refund and wait until mid-November when your options expand.
> 
> > More info on the GTX780Ti (maybe it would have launched by then)
> > Price Cut on the GTX780 incase you want to get it
> > Custom Cooled R9-290Xs would start saturating the market
> > The R9-290 will have been released
> 
> Those make for some excellent reasons to wait imo
> 
> _Or you could return it for a new one._ *Either way your card is faulty,* it sucks but it happens.
> 
> Whatever you choose, good luck.
> 
> hmm that's a tough one... You either:
> 
> 
> *Sell it now and make enough money to completely pay for a new R9-290X while the GTX780 still has an MSRP of $650*
> 
> Negative side of this is that you deal with the launch issues all cards go through, currently it seems to be a terrible stock cooler which induces significant throttling and/or potentially faulty drivers
> 
> *Keep your card and wait for the R9-290X drivers to mature as well as custom cooled cards to hit the market, that way when you buy it, there are no issues.*
> 
> Negative of this is that by then you won't be able to sell your GTX780 for as much because it would have likely gone through a price cut
> 
> *You keep your card which is the third fastest single GPU available*
> 
> Negative of this is that a card that is $100 cheaper runs faster then it and has 1GB more Vram
> 
> decisions.. decisions lol


It's not faulty; it's the beta driver. Go back and try 13.9, 13.10, or 13.11 beta2 if the newer betas aren't stable with the 290X.


----------



## Scotty99

Ya, the price is the real win in all of this. Gotta remember the 780 would have been at the $500 mark if AMD had had a card out to compete with it at the time; people have been paying a premium since it came out for that reason.

Can't wait to see how the 290 performs, as that is the card I've always had my eye on and why I'm following this thread. I'm hoping that card proves to be an OC beast :)


----------



## Arizonian

Ok first run - 3DMark11 Performance - only benchmark I own fully outright.

Performance run. 1150 Core / 1350 Memory - 14894 Score - i7 3770K at 4.49 GHz

http://www.3dmark.com/3dm11/7375771


Considering my GTX 690 hit 17011 in the same benchmark - I'm happy. That 690 is going in my kids' rig for the 3D Vision gaming I still enjoy on that second rig. Since I'm a bit of an audiophile, I was glad I heard no fan.


----------



## th3illusiveman

Quote:


> Originally Posted by *mboner1*
> 
> It's not faulty, it's a beta driver. Go back and try 13.9 or 13.10 or 13.11 beta2 if the earlier ones aren't stable with the 290x.


Typically when a card artifacts in all benchmarks and on the Windows desktop, it is faulty. I haven't read a single review where that happened, and after spending $600 on it, I wouldn't want a faulty card for any longer than I had to.

Still though, I guess my "bias" against this loud and hot cooler is seeping into my comments a little more than I would like.
Quote:


> Originally Posted by *Arizonian*
> 
> Ok first run - 3DMark11 Performance - only benchmark I own fully outright.
> 
> Performance run. 1150 Core / 1350 Memory - 14894 Score - i7 3770K at 4.49 Ghz
> 
> http://www.3dmark.com/3dm11/7375771
> 
> 
> Considering my GTX 690 hit 17011 same benchmark - I'm happy. That 690 is going in my kids rig for the 3D Vision gaming I still enjoy on that second rig. Since I'm a bit of an audiophile I heard no fan.


And that's 3DM11, where Nvidia's arch is typically superior... that thing will destroy your 690 in Firestrike if you can keep it from throttling.







Cool benchies bro, keep 'em coming


----------



## Forceman

Quote:


> Originally Posted by *th3illusiveman*
> 
> typically when a card artifacts in all benchmarks and the windows desktop it is faulty. I haven't read a single review where that happened and after spending $600 on it, i wouldn't want a faulty card for any longer than i had to.
> 
> Still though, i guess my "bias" against this loud and hot cooler is seeping into my comments alittle more than i would like.


I agree. If you are getting artifacts on the desktop, you need to either turn down the overclock, or return the card if it's at stock.


----------



## mboner1

Quote:


> Originally Posted by *th3illusiveman*
> 
> typically when a card artifacts in all benchmarks and the windows desktop it is faulty. I haven't read a single review where that happened and after spending $600 on it, i wouldn't want a faulty card for any longer than i had to.
> 
> Still though, i guess my "bias" against this loud and hot cooler is seeping into my comments alittle more than i would like.
> and that's 3DM11 where Nvidia's arch is typically superior.... that thing will destroy your 690 in Firestrike if you can keep it from throttling.
> 
> 
> 
> 
> 
> 
> 
> Cool benchies bro, keep em' coming


Well, my advice would always be to try an earlier driver that is stable, especially when the newest beta was only released a day ago, and especially when one of his main issues is the GPU going to 100% usage, which is a known issue with AMD drivers. I just went through it myself with my 7970, out of the blue, after I updated drivers.


----------



## Newbie2009

Quote:


> Originally Posted by *tsm106*
> 
> Hmm, lets see... my son's rig does not have the best cpu and its holding me back. When I get my real blocks then we'll really see what it can do. But for now, I chose to try the hardest bench atm. My son's loop is on the small side, 35mm rads sized 360+240. Temps maxed at 55c. Would love to throw these cards into my main rig's loop muahaha.
> 
> [email protected] 1250/[email protected] load = 79.1 fps.


Close to my stock 7970 score with a 2500k.


----------



## szeged

You have a stock 7970 that's pushing 80 fps on Valley Extreme HD?


----------



## Arizonian

OK, bought Firestrike - never ran it before - simple 1150 Core / 1350 Memory, i7 3770K at 4.49 GHz - 10749 Score

http://www.3dmark.com/3dm/1484478


Again, the audiophile in me only heard the music.


----------



## selk22

Okay, about to run Valley on my rig - will post results soon


----------



## Euda

Does anyone know why GPU-Z shows this info?:
http://www.imagebam.com/image/aed449284023487


----------



## leoxtxt

Is it possible to see some Heaven 4.0 results? (Extreme - *1080p*), preferably under water so the frequency stays consistent.


----------



## Newbie2009

Quote:


> Originally Posted by *tsm106*
> 
> Hmm, lets see... my son's rig does not have the best cpu and its holding me back. When I get my real blocks then we'll really see what it can do. But for now, I chose to try the hardest bench atm. My son's loop is on the small side, 35mm rads sized 360+240. Temps maxed at 55c. Would love to throw these cards into my main rig's loop muahaha.
> 
> [email protected] 1250/[email protected] load = 79.1 fps.


Quote:


> Originally Posted by *Arizonian*
> 
> Well, played some games at default to see where the baseline is. Set a fan profile using AB. 2560 x 1440 resolution in quiet mode.
> 
> Crysis 2 / 104 FPS - 120 FPS / 84 C / 70% Fan - Advanced settings 16x
> Crysis 3 / 45 FPS - 51 FPS / 84 C / 70% Fan - Very High Settings AA Disabled
> 
> Tried to crank up BF3; neither campaign nor multiplayer works. Both show the AB OSD starting up, then a black screen, then a white "not responding" screen.
> 
> I have a feeling my Nvidia drivers are in the way. It's bugging me so I'm going to have to do a clean install. Luckily I've backed up all my stuff in anticipation of this very thing.
> 
> I've not tried overclocking just yet. Anyone else try BF3 and have issues? If not, then it's a fresh install of Windows 8.1 for me.


Quote:


> Originally Posted by *szeged*
> 
> you have a stock *7970s* thats pushing 80 fps on valley extreme hd?


Need coffee


----------



## selk22

Well that's crazy, you're getting above 70 fps on the Valley benchmark! Good job







I haven't touched my volts yet and was artifacting above 1120, so I ran Valley again at 1100/1250, and here is my result.



This is at 1080p. 3930K at 4.6 GHz

Anyone have similar results?

Also suggestions for taking the card higher?

Should I use the MSI Power Limit slider to adjust the volts? Or AMD control center?

...first AMD card in a long time


----------



## fleetfeather

If any fellow 290X owners want to help me out with an external WC loop setup for my incoming cards, I started a thread HERE asking for some help. It'd be great if those who have been setting up water-cooled 290Xs could give some insight into the rad space I need


----------



## Newbie2009

Quote:


> Originally Posted by *selk22*
> 
> Well thats crazy your getting above 70 fps on the Valley benchmark! good job
> 
> 
> 
> 
> 
> 
> 
> I haven't touched my volts yet and was artifacting above 1120 so I ran the Valley again at 1100/1250 and here is my result.
> 
> This is at 1080p. 3930k at 4.6
> 
> Anyone have similar results?
> 
> Also suggestions for taking the card higher?
> 
> Should I use the MSI Power Limit slider to adjust the volts? Or AMD control center?
> 
> ...first AMD card in a long time


Looks similar to others in this thread. You cannot adjust volts yet, just the power limit of the card. I always put it up to the max. It won't make a huge difference though.


----------



## RocketAbyss

Quote:


> Originally Posted by *selk22*
> 
> Well thats crazy your getting above 70 fps on the Valley benchmark! good job
> 
> 
> 
> 
> 
> 
> 
> I haven't touched my volts yet and was artifacting above 1120 so I ran the Valley again at 1100/1250 and here is my result.
> 
> 
> 
> This is at 1080p. 3930k at 4.6
> 
> Anyone have similar results?
> 
> Also suggestions for taking the card higher?
> 
> Should I use the MSI Power Limit slider to adjust the volts? Or AMD control center?
> 
> ...first AMD card in a long time


I have about the same results at 1102/1400 with 0% on the power limit slider within CCC on the beta 6 drivers. Also on an FX-8350 @ 5.0GHz.

Did you bump your voltage? I'm not using MSI AB.


----------



## selk22

No, and I can't seem to even touch the mem clock without causing some sort of issue.. I believe this is because the VRMs are heating up too much, maybe???

I am using MSI for the OC, but I could try CCC; it just seemed strange to me using percentages.


----------



## HeadlessKnight

Quote:


> Originally Posted by *Arizonian*
> 
> Ok first run - 3DMark11 Performance - only benchmark I own fully outright.
> 
> Performance run. 1150 Core / 1350 Memory - 14894 Score - i7 3770K at 4.49 Ghz
> 
> http://www.3dmark.com/3dm11/7375771
> 
> 
> Considering my GTX 690 hit 17011 same benchmark - I'm happy. That 690 is going in my kids rig for the 3D Vision gaming I still enjoy on that second rig. Since I'm a bit of an audiophile I heard no fan.


Quote:


> Originally Posted by *tsm106*
> 
> Hmm, lets see... my son's rig does not have the best cpu and its holding me back. When I get my real blocks then we'll really see what it can do. But for now, I chose to try the hardest bench atm. My son's loop is on the small side, 35mm rads sized 360+240. Temps maxed at 55c. Would love to throw these cards into my main rig's loop muahaha.
> 
> [email protected] 1250/[email protected] load = 79.1 fps.


Impressive scores guys







@Arizonian: Nice, 1150 MHz on the reference cooler. Is 17,011 with the GTX 690 the GPU score or the overall score?


----------



## mboner1

Quote:


> Originally Posted by *Newbie2009*
> 
> Close to my stock 7970 score with a 2500k.



I find that hard to believe. This is my 7970.


----------



## YaLu

Damn, looking at these benchmark results I'm wondering if it's now a good idea to buy a 780 Classy (watercooled) instead of a 290X (watercooled)...


----------



## Sgt Bilko

I just ordered an Arctic Cooling Accelero Xtreme III for mine since it's been confirmed that it fits,
so I'll do some testing on stock air and then with that. I had to order it from another site though, so I have $90 worth of store credit... time to find some more case fans and pretty blue lights


----------



## Havolice

God, these fans on the stock coolers are driving me crazy


----------



## selk22

How is everyone able to clock their mem above 1300? If I touch mine it seems to artifact and cause problems in benchmarks..

1100 is the highest I can reach using MSI and CCC, so any tips are really appreciated here.


----------



## szeged

are you using the fan at 100%?


----------



## Havolice

Quote:


> Originally Posted by *szeged*
> 
> are you using the fan at 100%?


No, I'm using it at factory default.
I think it's in Uber mode *NOT SURE*. What worries me is that Catalyst says 40% is the max of the fan, but Afterburner says it's at 57% at the hardest


----------



## szeged

Quote:


> Originally Posted by *Havolice*
> 
> no im using it at factory default
> i think its in uber mode * NOT SURE* what worries me catalyst says 40% is the max of the fan but afterburner says its at 57% at hardest


do you have it in a case or open bench?


----------



## Falkentyne

Quote:


> Originally Posted by *Havolice*
> 
> no im using it at factory default
> i think its in uber mode * NOT SURE* what worries me catalyst says 40% is the max of the fan but afterburner says its at 57% at hardest


Uber mode is flipping the switch on the top of the card to the side facing the fan area (away from the "back" of the card). The card comes stock with the switch in quiet mode (facing the backplate of the card).


----------



## Havolice

Quote:


> Originally Posted by *szeged*
> 
> do you have it in a case or open bench?


900d case *stock fans*


----------



## kcuestag

So the Quiet vs Uber mode, does it toggle with a BIOS switch on the PCB itself? I thought it was software-side.

Can anyone confirm? My 290X will arrive on Monday.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kcuestag*
> 
> So the Quiet vs Uber mode, does it toggle with a BIOS switch in the PCB it's self? I thought it was software side.
> 
> Can anyone confirm? My 290X will arrive on Monday.


Don't have mine yet, but from the pics I've seen it's a toggle switch on the side of the PCB.

And you lucky bugger, mine will show up Tuesday at the very best......


----------



## mboner1

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Don't have mine yet but from the pics i've seen it's a toggle switch on the side of the PCB
> 
> And you lucky bugger, mine will show up Tuesday at the very best......


I heard one reviewer say he had to manually reset the fan profile in CCC after switching to Uber mode to force the change; might want to double-check by doing that too.


----------



## Moustache

Quote:


> Originally Posted by *mboner1*
> 
> I find that hard to believe. This is my 7970.


It's actually 7970s


----------



## Ukkooh

Is XFX still the only brand that allows changing the stock cooler to something else without breaking warranty?


----------



## Scotty99

Quote:


> Originally Posted by *Ukkooh*
> 
> Is XFX still the only brand that allows changing the stock cooler to something else without breaking warranty?


I know for sure this was true for the 7900 series; you may wanna get hold of them and make sure it applies to the 290s also.


----------



## RocketAbyss

Quote:


> Originally Posted by *Ukkooh*
> 
> Is XFX still the only brand that allows changing the stock cooler to something else without breaking warranty?


PowerColor might be one as well... if you go back to one of my recent posts, I posted pics where the screws can be seen, and there are no warranty stickers on them


----------



## Ukkooh

Thanks for the fast replies. It seems I'll have to contact XFX and PowerColor support just to be sure.


----------



## Ha-Nocri

Quote:


> Originally Posted by *Arizonian*
> 
> Thanks again djrifu, that did the trick. BF3 sparked up in campaign for a quick test run. Again with custom AB at 2560 x 1440
> 
> BF3 / 78C / 54% Fan / 52 FPS - 60 FPS - Ultra default settings across the board.
> 
> Whew... now I can finally look into some overclocking. I'm going to grab a beer and run some benches to find stability.
> 
> Initial thoughts - 'not a Titan killer', but at this price, which I valued at $519 after the BF4 code, it's easily the best price/performance right now. Still have a lot to learn about Uber and Quiet because it's weird. I don't think I need to ramp up the fan with a custom profile to get the same performance even if this gets hotter.
> 
> Switched to Uber mode, as if there's any other choice here.


Your results, temps especially, seem the most realistic to me. I think everyone else should make sure they *really* cleaned their systems of previous drivers.


----------



## kcuestag

Can't wait to receive mine. I heard about the GTX 780 Ti, but that one will perform pretty much like this one and cost a hell of a lot more.









Also hoping I can grab a 2nd 290X in December.


----------



## EliteReplay

Hi guys, I'm really interested in getting this card to run BF3/BF4 on a 120Hz monitor at a constant 120 fps.

If it's not too much to ask, can you do some BF3 multiplayer benchmarks on LOW SETTINGS / Mesh Ultra / 16x at 1080p please!

and let me know:
MIN FPS
MAX FPS
AVG FPS

+rep if you post it


----------



## szeged

Quote:


> Originally Posted by *EliteReplay*
> 
> Hi guys im really interested on getting this card to run BF3/BF4 on a 120hz Monitor with constant 120fps
> 
> IF is not much to as can u do some bechmark on multiplayers BF3 on LOW SETTINGS/Mesh Ultra/x16 on 1080p please!
> 
> and let me know:
> MIN FPS
> MAX FPS
> AVG FPS
> 
> +rep is u post it


Why low settings? I think this card could probably manage insane fps in BF3 with high settings


----------



## Sgt Bilko

Quote:


> Originally Posted by *kcuestag*
> 
> Can't wait to recieve mine, I heard about the GTX780 Ti but that one will perform pretty much like this one and cost hell lot more.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also hoping I can grab a 2nd 290X in December.


I'll be grabbing a 2nd one sometime around Xmas, hopefully the non-reference cooler models are out by then.

Guess I'll be needing a CPU upgrade as well then







........never ends, does it


----------



## fleetfeather

Quote:


> Originally Posted by *EliteReplay*
> 
> Hi guys im really interested on getting this card to run BF3/BF4 on a 120hz Monitor with constant 120fps
> 
> IF is not much to as can u do some bechmark on multiplayers BF3 on LOW SETTINGS/Mesh Ultra/x16 on 1080p please!
> 
> and let me know:
> MIN FPS
> MAX FPS
> AVG FPS
> 
> +rep is u post it


Just a heads up: BF3 on low settings (or any settings) will give you very little insight into BF4 performance. And any benchmarks are also going to be affected by the CPU and RAM, which will further obscure the data.

Your best bet is to just wait till BF4 releases and ask someone with very similar specs to yours to bench it.


----------



## maarten12100

I guess the fan by itself uses about 15W at full blast, then another 40W because the core runs extremely hot.
This card might be really efficient once you drop the voltage a tad and put it under water.

Voltage should go up to 1.55V on reference, so it might also be a good overclocker once cooled


----------



## Euda

Does anyone else get artifacts @ 1100 MHz in Unigine Valley, too? :/


----------



## Falkentyne

How do you change the memory clocks on this 290X without the entire system crashing (black screening)?
I found that when you FIRST enable Overdrive and set the memory slider, the system works fine (example: 1500 MHz), UNTIL you restart the computer!

After you restart, ANY gpu activity at all will either cause flickers, or (more likely), the system to BLACK SCREEN. Unless you want to run display driver uninstaller in safe mode (to uninstall the driver and thus remove the memory overdrive setting), the only thing you can do is to HOPE you can restart, open the CCC as soon as you possibly can, and press "Defaults" before you black screen.

Changing the core clock doesn't seem to cause any issues.


----------



## froyang

Is it possible the BIOS has power throttling?


----------



## Arizonian

Quote:


> Originally Posted by *HeadlessKnight*
> 
> Impressive scores guys
> 
> 
> 
> 
> 
> 
> 
> .
> @ Arizonian : Nice 1150 MHz on reference cooler. 17,011 with GTX 690 is the GPU score or the overall score?


That was the overall score.

My 690's 3DMark 11 graphics score was just over 20K:
http://www.3dmark.com/3dm11/3778572


----------



## sugarhell

lol


----------



## EliteReplay

Quote:


> Originally Posted by *szeged*
> 
> why low settings? i think this card could probably manage insane fps on bf3 with high settings


It has to be low settings for competitive purposes and testing, man... no one plays BF3 competitively on high or ultra settings, since you are trying to achieve constant maximum and minimum FPS,
and on low settings you are able to see clearly. It gives you an advantage over a person who plays on ULTRA.

Can you please do this for me?


----------



## sugarhell

On bf3 a 7970 gets over 200 fps on low settings.


----------



## EliteReplay

Quote:


> Originally Posted by *sugarhell*
> 
> On bf3 a 7970 gets over 200 fps on low settings.


That's not possible with a single card, unless your machine is ultra overclocked.

I just want a test with a regular 290X.


----------



## sugarhell

Quote:


> Originally Posted by *EliteReplay*
> 
> thats not possible with a single card. unless your machines is ultra overcloocked.
> 
> i just want a test with a regular 290X.


My friend plays BF3 on low all the time with his card clocked at 1200. 200 fps all the time


----------



## y2jrock60

Sorry if this has already been answered; I searched through the forums and this thread to no avail.

I just purchased the Sapphire 290X BF4 bundle on Newegg and was wondering which version of BF4 it comes with? I've heard speculation that it may come with Premium as well. It doesn't specify on Newegg.


----------



## sugarhell

Arizonian, I just checked your score on FS. It matches a 780 at ~1300. Nice


----------



## Nihaan

Quote:


> Originally Posted by *Moustache*
> 
> Whether the 290x is at 100% or not, it's still running at an average of 900MHz-950MHz or maybe lower, especially when you're playing a demanding game like Crysis 3. The 290x is definitely throttling; there is no doubt about that. The reference cooler is pretty "crappy". It needs to be under water, or you can slap on a VGA cooler such as the MK-26. Otherwise, it's better to wait for custom 290x's from Asus, Gigabyte, etc.
> 
> 290x will run faster than 780/Titan at clock for clock or max oc when the card is not throttling.


Well, we ended up returning it a few hours ago.

He wasn't happy with the overclocking results, temperatures and noise level of the cooler. Also, we had something weird: the card was stable for more than an hour in any game, then due to temperatures it started to give artifacts even in Windows; there were random lines all over the screen.

In the end we couldn't pass 1200 MHz on the core (that's not bad at all, I guess?), and compared to his other card that hits 1385 on air with great temps, the 290X was behind. So when you compare both cards at max OC, the 290X wasn't the winner here due to temperatures.

Good thing is they didn't charge us for returning it and we managed to get our money back.

Seems like people are right: if you want a decent OC on this card, you need water cooling or you need to wait for the non-reference models.

How much difference would a decent cooler make on this card? Let's say when MSI releases the Lightning 290X, what do you think temps would be with their cooler? Can we expect a huge difference?

Anyways... Now our options are;

1) Wait until non-reference R9 290X models.
2) Buy another 780 Classified and go SLI
3) Wait for 780TI / 780TI Classified

What should we do ?


----------



## narmour

Arrived this morning. About to hook this baby up! #excited


----------



## formula m

Quote:


> Originally Posted by *narmour*
> 
> Arrived this morning. About to hook this baby up! #excited


Congratulations..


----------



## Sgt Bilko

Quote:


> Originally Posted by *EliteReplay*
> 
> thats not possible with a single card. unless your machines is ultra overcloocked.
> 
> i just want a test with a regular 290X.


I just ran 1080p low settings and got 170+ fps at 1100/1500, a very mild OC.

So 200+ at 1200 would be accurate, and for a 290X? I'm thinking an easy 250+


----------



## sugarhell

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I just ran 1080p Low settings then and got 170fps+ then on 1100/1500 a very mild OC
> 
> So 200+ on 1200 would be accurate and for a 290x? I'm thinking easy 250+


200 is the limit of frostbite.


----------



## Sgt Bilko

Quote:


> Originally Posted by *narmour*
> 
> Arrived this morning. About to hook this baby up! #excited


Grats man, pretty jelly atm









Quote:


> Originally Posted by *sugarhell*
> 
> 200 is the limit of frostbite.


I didn't know that, Thanks for the info.


----------



## EliteReplay

WOW, I'm so jelly of all of you getting this card on release... I'll probably have to wait, working hard 2-3 months saving money to get it lol... if someone could kindly give me a BF4 code


----------



## Euda

Took a video with my Nexus 7 of the card running @ a fixed 80% fan speed, which corresponds to ~4300 RPM - for you guys to get an impression of the "Boeing card". My hovercraft rig is about to lift off the ground; every other fan (2 Arctic F12s, 3 standard fans of the Zalman Z9 Plus), including the Mugen 3 fan, is running at max:




Besides, here are the Results with the card running at 1080/1350MHz, my FX-8350 @ 4.5GHz, NB 2500MHz, HT Link 2750MHz, DDR3-2000 CL 9-10-9-27 2T:

http://www.3dmark.com/3dm/1485954

& Valley:


----------



## szeged

sounds like you have a swarm of angry bees in your house lol


----------



## wstanci3

Kind of sounds like a skateboard on concrete.








I really can't wait to see the 290x results with waterblocks. 1300+mhz perhaps?


----------



## skupples

Quote:


> Originally Posted by *DampMonkey*
> 
> I dont think the 3dmark record guys needed a bios mod for their clock, 1435 or whatever it was


I'm still confused about those records. I think it's borderline marketing from AMD, seeing as they were dropped the day the card released, and he already had blocks for them. The GPU-Zs for both GPUs were all out of whack. Not saying it's fake, just saying it's slightly fishy.


----------



## kcuestag

Mother of god, that noise! Must buy a waterblock NOW before my card arrives!


----------



## Sgt Bilko

That's actually a lot quieter than I thought it would be........


----------



## Euda

Quote:


> Originally Posted by *kcuestag*
> 
> Mother of god, that noise! Must buy a waterblock NOW before my card arrives!


That's 80% fixed RPM + 6 other 120mm fans @ >1k RPM. In quiet mode, with an otherwise silent system, it's okay to play with the reference design - at least temporarily, until I get my Arctic Accelero Hybrid next week.


----------



## 218689

So, is this the right time to sell my two 6970s and go over to a single-card configuration?


----------



## pounced

Not sure if anyone knows whether the keys just aren't activated yet, but the BF4 code that came with my card won't activate when I go to the site and try to redeem it; it comes back as "invalid coupon ID" or something along those lines. I'm wondering if we can't use them until the launch of BF4


----------



## Havolice

Quote:


> Originally Posted by *Falkentyne*
> 
> Uber mode is flicking the switch on the top of the card, to the side facing the fan area (away from the "back" of the card). The card come stock with the switch on quiet mode (facing the backplate of the card).


My god, if this is quiet mode :O

Can't wait for Uber mode, ugh







Gonna test it in a moment, I guess


----------



## kcuestag

Quote:


> Originally Posted by *Euda*
> 
> That's 80% fixed RPM + 6 other 120mm Fans @ >1k RPM - in quiet mode and an otherwise silent system, it's okay to play with the Reference Design - at least temporarily until i'll get my Arctic Accelero Hybrid next week.


How high does Uber mode set your fan to on AUTO?

I can't stand the AMD reference blowers at above 50% or so.


----------



## TheSoldiet

Getting my 290X BF4 edition on Tuesday! So excited ;-)

Fx 8350 4.5
Asus crosshair v formula z
AMD 6970


----------



## Koniakki

Quote:


> Originally Posted by *Arizonian*
> 
> Ok first run - 3DMark11 Performance - only benchmark I own fully outright.
> 
> Performance run. 1150 Core / 1350 Memory - 14894 Score - i7 3770K at 4.49 Ghz
> 
> http://www.3dmark.com/3dm11/7375771


So it takes about a 1359MHz/1800 780 to match a 1150/1350 290X in 3DMark 11?!! Is that really correct??









My run was at 1333/1777. Look at the GPU score and fps numbers.


Quote:


> Originally Posted by *Euda*
> 
> Took a video with my Nexus 7 of the card running @ 80% Fan-Speed fixed - which conforms ~4300 RPM. For you guys to get an impression of the "Boeing card". My Hovercraft-Rig is about to lift offf the ground, every other fan (2 Arctic F12, 3 Standard-Fans of the Zalman Z9 Plus) including the Mugen 3 Fan is running at max:
> 
> video removed----
> 
> Besides, here are the Results with the card running at 1080/1350MHz, my FX-8350 @ 4.5GHz, NB 2500MHz, HT Link 2750MHz, DDR3-2000 CL 9-10-9-27 2T:
> 
> http://www.3dmark.com/3dm/1485954
> 
> & Valley:


I guess Valley is a whole different story from 3DMark 11. My stock [email protected]/1502 was getting the same score as *selk22* below. So can we assume a 1100/1250 290X performs the same as a 1045/1502 780?
Quote:


> Originally Posted by *selk22*
> 
> Well thats crazy your getting above 70 fps on the Valley benchmark! good job
> 
> 
> 
> 
> 
> 
> 
> I haven't touched my volts yet and was artifacting above 1120 so I ran the Valley again at 1100/1250 and here is my result.
> 
> 
> 
> This is at 1080p. 3930k at 4.6
> 
> Anyone have similar results?
> 
> Also suggestions for taking the card higher?
> 
> Should I use the MSI Power Limit slider to adjust the volts? Or AMD control center?
> 
> ...first AMD card in a long time


It's a little strange. The two scores above are inconsistent given their clocks, and I think it's CCC settings/OS differences that affect them.

*P.S:* No bashing or childish comparisons here in any manner, e.g. "my card is better than yours". I do not own a 780 anymore; I'm just comparing from a performance perspective because I'm interested.


----------



## Gunderman456

Quote:


> Originally Posted by *kcuestag*
> 
> How high does Uber mode set your fan to on AUTO?
> 
> I can't stand the AMD reference blowers at above 50% or so.


Uber mode sets it at 55%.


----------



## Moustache

Quote:


> Originally Posted by *Nihaan*
> 
> How much difference would a decent cooler make on this card ? Lets say when msi releases Lightning 290X what do you think temps would be with their cooler ? Can we expect a huge difference ?


Yes, there will be a huge difference, especially with the likes of the Lightning, where the temps will definitely be cooler. Cooler temperatures mean no throttling, and therefore we can expect the card to run at its full potential. Of course, we can also expect better overclocking than what we can achieve on the reference version, thanks to the better cooling system and custom PCB.
Quote:


> Originally Posted by *Nihaan*
> 
> Anyways... Now our options are;
> 
> 1) Wait until non-reference R9 290X models.
> 2) Buy another 780 Classified and go SLI
> 3) Wait for 780TI / 780TI Classified
> 
> What should we do ?


If you guys are going for another 780 Classy, I'd wait for the 780 Ti to show up first. Once the 780 Ti is released, the 780 Classy will drop in price. Besides, a new 780 Ti sounds good as well, if the leaked specs are true, since it could be a Titan rebrand with higher clocks and a cheaper price. Imagine a Titan Classy.







Of course, the first option sounds promising as well. I'd say wait first.


----------



## lordzed83

Me with my card yesterday - been messing around with it since 5am.

And the big problem I've got is the jumping memory clock:



It jumps wildly from 150 to 1250, and eventually the card will crash.
Atm I use 2 profiles in MSI Afterburner with locked clocks, otherwise it's unusable :/

A 2-week-old Windows 8.1 install, tried 3 different drivers with full uninstalls - NOTHING.

Guess I need a new BIOS :/


----------



## Euda

Quote:


> Originally Posted by *kcuestag*
> 
> How high does Uber mode set your fan to on AUTO?
> 
> I can't stand the AMD reference blowers at above 50% or so.


Yep, it's running constantly between 45-55% fan speed and 94°C [. . .]

Is there a solution to add voltage to the GPU yet?


----------



## KyGuy

Can't wait to get my R9 290x, just ordered a Sapphire from Newegg


----------



## kpoeticg

I just tried ordering one. It lets you add it to the cart and then says the item has been removed from the cart due to insufficient quantities


----------



## szeged

Someone probably cancelled their order, so it showed up for half a second before someone else bought it.


----------



## Koniakki

Quote:


> Originally Posted by *Arizonian*
> 
> OK bought Firestrike - never ran it - simple 1150 Core / 1350 Memory i7 3770k 4.49 Ghz - 10749 Score
> 
> http://www.3dmark.com/3dm/1484478


WOW! A 290X at 1150/1350 beating a [email protected]/1777?? Holy smokes!


----------



## rdr09

Quote:


> Originally Posted by *Koniakki*
> 
> WOW! A 290X at 1150/1350 beating a [email protected]/1777?? Holy smokes!
> 
> 
> Spoiler: Warning: Spoiler!


my avatar is not too accurate.


----------



## sugarhell

Quote:


> Originally Posted by *rdr09*
> 
> my avatar is not too accurate.


http://www.overclock.net/t/1437166/vcz-nvidia-geforce-gtx-780-ti-3dmark-score-exposed/40#post_21063469

See my post. The 780 has no hope against a highly OC'd 290X. Let's see against a Titan.


----------



## KyGuy

Quote:


> Originally Posted by *kpoeticg*
> 
> I just tried ordering one. It lets you add to cart and then says the item has been removed from cart due to insufficient quantities


I bought mine right in the nick of time: I ordered it, and right after, it said Out of Stock. Even though it says Auto-Notify, click on the card, because sometimes it says Add To Cart on the actual card page. I found this true on five occasions, but I have been waiting for a Sapphire.


----------



## kpoeticg

Yeah, I've been doing it for the past couple of days. On the pre-release night I was waiting for an Asus to pop up to go with my RIVE BE build. By the time I realized it wasn't happening, it was too late. I've been clicking them


----------



## rdr09

Quote:


> Originally Posted by *sugarhell*
> 
> http://www.overclock.net/t/1437166/vcz-nvidia-geforce-gtx-780-ti-3dmark-score-exposed/40#post_21063469
> 
> See my post. 780 has no hope against a high oced 290x. Lets see with a titan


The 780 is beating the Titan already. Nvidia is going to have to go over the top with Maxwell - can't wait. The performance, not the price. Wonder how the 290 performs in Extreme?

We need Karlitos to do some 290 benching. He he he.


----------



## Moustache




----------



## y2jrock60

Sapphires popped up on Newegg last night. I was contemplating buying one at the time and when I decided to click add to the cart they were sold out. This morning I kept refreshing the page every few minutes and eventually a few came up for sale. I was able to snag a Sapphire around 10:30am EST. Literally, after I ordered it they went out of stock. They've been appearing sporadically ever since. Like the previous poster stated, it's probably due to cancelled orders. Just keep refreshing the page and make sure you have your account setup so you can order it fast.


----------



## Euda

Is there a way to flash the Asus BIOS to the R9 290X? ATI WinFlash keeps saying my video adapter is not valid :/


----------



## kpoeticg

Thanx KyGuy and y2jrock for pointing that out and getting it into my head =P


----------



## rdr09

Quote:


> Originally Posted by *Moustache*


+rep. It seems they picked the slowest-clocked 780 to compare. They did that with the 7950 too.


----------



## Ha-Nocri

The 290 will be faster than the 780 clock-for-clock, that is almost certain. Hope it's $400 and not $450.


----------



## Koniakki

Quote:


> Originally Posted by *rdr09*
> 
> my avatar is not too accurate.


Well, I had a 1306MHz 780 beat my 1359MHz Valley score with 84 fps vs. my 81.3 fps, so taking my scores as a reference might not be the best thing to do.









My scores do not represent all 780 owners' scores.









*P.S:* Nice avatar.


----------



## rdr09

Quote:


> Originally Posted by *Koniakki*
> 
> Well I had a 1306Mhz 780 beat my 1359Mhz Valley score with 84fps vs my 81.3fps, so taking my scores as reference might not be the best thing to do.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My scores do not represent the entire 780 owners scores.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *P.S:* Nice avatar.


Nvidia excels in Valley for some reason, especially in multi-GPU. And thanks, I may have to tweak it.


----------



## KyGuy

Quote:


> Originally Posted by *kpoeticg*
> 
> Thanx KyGuy and y2jrock for pointing that out and getting it into my head =P


NP, can't wait to bench this bad boy!







Good Luck with Buying a card....KyGuy


----------



## jomama22

6 cards arrive Monday and three blocks on Thursday. I'm on vacation visiting the family now and won't be home until Wednesday anyway. Perfect timing!

Anyone overvolting: are you using the Asus BIOS with GPU Tweak (if so, where is the BIOS?), or does AB allow overvoltage?

Cheers!


----------



## Sgt Bilko

Alrighty, so I ordered some VRAM heatsinks to go along with my Accelero Xtreme, and that's all I'm gonna order... no more... hmm, a new CPU would be nice...


----------



## mboner1

Quote:


> Originally Posted by *jomama22*
> 
> 6 cards arrive Monday and three blocks on Thursday. I'm on vacation visiting the family now and won't be home until Wednesday anyway. Perfect timing!
> 
> Anyone overvolting, are you using the Asus bios with gpu tweak (if so, where is the bios?) or does ab allow over voltage?
> 
> Cheers!


What's your address? Just joking lol.

But seriously, no one needs 6 cards; 5 is plenty for anyone, so just send one my way. Cheers dude.


----------



## $ilent

Sup folks!

@Lordzed, my 290X does that too whenever I touch the memory clock. I'm gonna try reinstalling Windows, then go from there.

How's everybody doing? I am seeing a large variance in 290X overclock success... I think it's largely down to the driver, to be honest. On a side note, does anyone have a copy of the stock 290X BIOS? My GPU-Z won't let me save mine, and I'm dying to put the Asus BIOS on for overvolting testing.


----------



## Arizonian

Quote:


> Originally Posted by *Euda*
> 
> Took a video with my Nexus 7 of the card running @ 80% fan speed fixed - which corresponds to ~4300 RPM. For you guys to get an impression of the "Boeing card". My hovercraft rig is about to lift off the ground; every other fan (2 Arctic F12, 3 standard fans of the Zalman Z9 Plus), including the Mugen 3 fan, is running at max:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Besides, here are the Results with the card running at 1080/1350MHz, my FX-8350 @ 4.5GHz, NB 2500MHz, HT Link 2750MHz, DDR3-2000 CL 9-10-9-27 2T:
> 
> http://www.3dmark.com/3dm/1485954
> 
> & Valley:
> 
> 
> Spoiler: Warning: Spoiler!


Quote:


> Originally Posted by *narmour*
> 
> Arrived this morning. About to hook this baby up! #excited
> 
> 
> Spoiler: Warning: Spoiler!


Quote:


> Originally Posted by *lordzed83*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Me with my card yesterday. Been messing around with it since 5 am.
> 
> And a big problem I've got with a jumping memory clock:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It jumps erratically from 150 to 1250 and eventually the card will crash.
> At the moment I use two profiles in MSI Afterburner with locked clocks; otherwise it's unusable :/
> 
> Two-week-old Windows 8.1 install; tried three different drivers with full uninstalls. NOTHING.
> 
> Guess I need a new BIOS :/


Congrats to you three this morning - added


----------



## Cool Mike

ANY WORD OUT ON WHEN A CUSTOM (NON-REFERENCE) 290X WILL BE AVAILABLE? HOPING BY CHRISTMAS...
THANKS


----------



## Arizonian

Quote:


> Originally Posted by *Cool Mike*
> 
> ANY WORD OUT ON WHEN A CUSTOM (NON-REFERENCE) 290X WILL BE AVAILABLE? HOPING BY CHRISTMAS...
> THANKS


Usually it's been two to three months after release with AIB partners.


----------



## provost

Quote:


> Originally Posted by *sugarhell*
> 
> http://www.overclock.net/t/1437166/vcz-nvidia-geforce-gtx-780-ti-3dmark-score-exposed/40#post_21063469
> 
> See my post. 780 has no hope against a high oced 290x. Lets see with a titan


If you are going to compare to a 780, why pick the lame one? Here is a better link from the same thread:
http://www.overclock.net/t/1437166/vcz-nvidia-geforce-gtx-780-ti-3dmark-score-exposed/70


----------



## TheSoldiet

Is it worth waiting for the Arctic Accelero Xtreme IV, or should I just get the Xtreme III and some extra VRAM heatsinks? My card will be here by Tuesday/Wednesday.


----------



## SonDa5

Quote:


> Originally Posted by *Scotty99*
> 
> They overrate those numbers massively, you could probably run these in crossfire on a 750w.


I don't know about that. TPU reported that the card scales well with voltage. With a massive overclock this card may be able to pull quite a load.


----------



## Sgt Bilko

These just got posted in the GK110 vs Hawaii thread
Quote:


> Originally Posted by *Jared Pace*
> 
> Titan at 1802/1953 vs. 290X at 1466/1730.


Sugarhell posted one earlier.


----------



## provost

Quote:


> Originally Posted by *Sgt Bilko*
> 
> These just got posted in the GK110 vs Hawaii thread
> Sugarhell posted one earlier.


This is great. Is this an LN2 score with tess off?
I have not seen any posted on the official Futuremark site, which does not permit tess off: http://www.3dmark.com/hall-of-fame-2/3dmark+11+3dmark+score+performance+preset/version+1.0.5/1+gpu

If this is a water score, it is fantastic


----------



## $ilent

Anyone got any info on flashing the Asus BIOS then?


----------



## TheSoldiet

Isn't the Uber/Quiet switch a BIOS switch? If it is, then you could switch to the Uber BIOS and flash the Asus BIOS onto it. If it fails, you could switch back to Quiet mode.


----------



## Sgt Bilko

Quote:


> Originally Posted by *provost*
> 
> This is great. Is this an Ln2 score with Tess off?
> I have not seen any postedon the official futuremark site which does not permit tess off http://www.3dmark.com/hall-of-fame-2/3dmark+11+3dmark+score+performance+preset/version+1.0.5/1+gpu
> 
> If this is a water score, it is fantastic


I think it might be an LN2 score and tess might be off, not sure really... only just re-learning all this again (last time I looked at this I had 4850's)

I'm hoping it's a high-end water score, but I doubt it.


----------



## szeged

It is LN2 with tessellation turned off. Can't wait to see how it does on water.

LN2 scores never really interested me lol


----------



## Sgt Bilko

Quote:


> Originally Posted by *szeged*
> 
> it is LN2 with tessellation turned off, cant wait to see how it does on water
> 
> 
> 
> 
> 
> 
> 
> ln2 scores never really interested me lol


Thank you for clearing that up for me


----------



## Forceman

Quote:


> Originally Posted by *Nihaan*
> 
> He wasn't happy with the overclocking results, temperatures, and noise level of the cooler. Also, we had something weird: the card was stable for more than an hour in any game, then due to temperatures it started to give artifacts even in Windows; there were random lines all over the screen.


I think you're the second person to report artifacting in Windows. Wonder what's going on.


----------



## RocketAbyss

Is anyone still getting the issue where, when launching a game/benchmark that goes to fullscreen, your whole system chokes up and freezes (the screen blacks in and out for 5 seconds before the program crashes)? It's occurring for me on both the CD drivers and the latest Beta 6 drivers. Personally, it's getting really annoying when I can't even test out the card


----------



## szeged

Quote:


> Originally Posted by *RocketAbyss*
> 
> Anyone still getting the issue where when launching a game/benchmark that goes to fullscreen, your whole system chokes up and freezes(screen blacks in and out for 5 seconds before program crashes). Its occuring to me on both the CD drivers and the latest Beta6 drivers. Personally, its getting real annoying when I can't even test out the card


Hopefully it's just driver issues; I can't see multiple people having hardware issues like this so early on.


----------



## pounced

Quote:


> Originally Posted by *RocketAbyss*
> 
> Anyone still getting the issue where when launching a game/benchmark that goes to fullscreen, your whole system chokes up and freezes(screen blacks in and out for 5 seconds before program crashes). Its occuring to me on both the CD drivers and the latest Beta6 drivers. Personally, its getting real annoying when I can't even test out the card


I also experienced this, even when I open Google Chrome (it asks for admin permission); when that window pops open, the system chokes even from that.

I have a feeling it is just drivers and nothing else; it should be fixed soon, I hope.


----------



## $ilent

Quote:


> Originally Posted by *Forceman*
> 
> I think you're the second person to report artifacting in Windows. Wonder what's going on.


Mine does this when I touch the memory clock.
Quote:


> Originally Posted by *RocketAbyss*
> 
> Anyone still getting the issue where when launching a game/benchmark that goes to fullscreen, your whole system chokes up and freezes(screen blacks in and out for 5 seconds before program crashes). Its occuring to me on both the CD drivers and the latest Beta6 drivers. Personally, its getting real annoying when I can't even test out the card


Yes, mine does this too; Windows is visibly laggy and it constantly flashes from a black screen to the benchmark.


----------



## RocketAbyss

Quote:


> Originally Posted by *szeged*
> 
> hopefully its just driver issues, i cant see multiple people having hardware issues like this so early on.


With Battlefield 4 dropping in 3 days' time (or less), if this issue persists I'm just gonna take out the 290X and reuse my 7970. Sigh, I expected better, AMD.


----------



## EliteReplay

Quote:


> Originally Posted by *$ilent*
> 
> Mine does this when i touch the memory clock.
> Yes mine does this too, windows is visibly laggy and it constantly flashes from black screen to benchmark.


Try the previous beta driver, or you just got unlucky and have to RMA.


----------



## szeged

Having to RMA so soon would just be an absolute damper on the hype of this card - if this is indeed a hardware-level defect that many people are experiencing, that is.


----------



## djriful

To the Canadian fellows: if you want watercooling, keep an eye on the Dazmode website. He just put up the placeholders for the waterblocks.

http://goo.gl/FuWKGP


----------



## zealord

Quote:


> Originally Posted by *szeged*
> 
> having to RMA so soon would just be an absolute damper on the hype of this card, if this is indeed a defect on the hardware level that many people are experiencing that is.


What is going on? I haven't followed everything so far. Driver issues?


----------



## EliteReplay

Quote:


> Originally Posted by *szeged*
> 
> having to RMA so soon would just be an absolute damper on the hype of this card, if this is indeed a defect on the hardware level that many people are experiencing that is.


That doesn't matter; I know people returning Titans and 780s the same way... stop misinforming people.


----------



## EliteReplay

Every electronic device has some failure rate, so he is just being unlucky with the card. But I wouldn't go that far until I install Windows again or try other drivers... what about testing the card in another PC?


----------



## $ilent

It doesn't do the screen artifacting at stock; it only does it when my memory is clocked to 1400+, so I don't think I can RMA it based on that. Windows just feels sluggish: when I shut down my PC, the blue Windows logging-off screen appears in sections, like it's moving at 10 fps instead of 60 fps. I hope this makes sense.

Also, folks, there's a new version of MSI Afterburner out, Beta 16, which came out yesterday: http://www.guru3d.com/files_details/msi_afterburner_beta_download.html

Might be worth giving it a shot.


----------



## Sgt Bilko

Quote:


> Originally Posted by *EliteReplay*
> 
> that doesnt matter, i know people returning Titans and 780 the same way... stop mis informing people.


He wasn't misinforming anyone, just saying that there is a possibility it might be a hardware fault.


----------



## $ilent

Quote:


> Originally Posted by *zealord*
> 
> what is going on? haven't followed everything so far. driver issues?


See post 898 onwards


----------



## jomama22

For those looking for the unlocked Asus BIOS:

https://www.dropbox.com/s/rv79yt18tob7fc8/ASUS.ROM

Noted to unlock voltage to 1410 mV core, 1.7 V mem. It also seems to help with power throttling.

It is also compatible with the 290, though with no unlocked shaders. It does unlock voltage.

May want to add this to the OP.

Credit goes to Gibbo over at OcUK. He found out the Asus BIOS was the only one to unlock voltage.

You do have to use GPU Tweak... for better or worse.


----------



## szeged

Quote:


> Originally Posted by *EliteReplay*
> 
> that doesnt matter, i know people returning Titans and 780 the same way... stop mis informing people.


Of course people had to RMA Titans and 780s. I'm not trying to give any information to people at all; I was giving my opinion that it would suck to have to RMA your card so soon. That is all.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> he wasn't mis-informing anyone, just saying that there is a possibility it might be a hardware fault.


this.


----------



## Sgt Bilko

Quote:


> Originally Posted by *jomama22*
> 
> For those looking for the unlocked Asus bios:
> 
> https://www.dropbox.com/s/rv79yt18tob7fc8/ASUS.ROM
> 
> Noted to unlock voltage to 1410mv core, 1.7v mem. Also helps with power throttling as well it seems.
> 
> It is also compatible with the 290 though no unlocked shaders. Does unlock voltage.
> 
> May want to add this to op.
> 
> Credit goes to gibbo over at ocuk. He found out the Asus bios was the only one to unlock voltage.
> 
> You do have to use gputweak...for better or worse.


OK, here comes the newbie question: how do I flash it?


----------



## SoloCamo

Quote:


> Originally Posted by *$ilent*
> 
> It doesnt do the screen artifacting at stock, it only does it when my memory is clocked like 1400+? so I dont think i can RMA it based on that. Windows just feels sluggish, like when I shut down my pc the windows blue loggong off screen appears in sections, like its moving at 10fps instead of 60fps, I hope this makes sense.
> 
> Also folks theres a new version beta 16 MSI afterburner out, came out yesterday - http://www.guru3d.com/files_details/msi_afterburner_beta_download.html
> 
> Might be worth giving it a shot.


That's 150MHz over the stock memory... perhaps you just got unlucky in the memory overclocking lottery? What brand of card did you get?


----------



## Jared Pace

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ok here comes the newbie question, how do i flash it?


ATI WinFlash.

It's a very simple program you can use from within Windows. Load the BIOS file and hit the Flash/Program button.


----------



## $ilent

Quote:


> Originally Posted by *jomama22*
> 
> For those looking for the unlocked Asus bios:
> 
> https://www.dropbox.com/s/rv79yt18tob7fc8/ASUS.ROM
> 
> Noted to unlock voltage to 1410mv core, 1.7v mem. Also helps with power throttling as well it seems.
> 
> It is also compatible with the 290 though no unlocked shaders. Does unlock voltage.
> 
> May want to add this to op.
> 
> Credit goes to gibbo over at ocuk. He found out the Asus bios was the only one to unlock voltage.
> 
> You do have to use gputweak...for better or worse.


I echo the call on how to flash the BIOS; I know how to flash NV cards, just not AMD ones.

Quote:


> Originally Posted by *SoloCamo*
> 
> that's 150mhz over the stock memory... perhaps you just got unlucky on the memory overclocking lottery? What brand card did you get?


I got Sapphire.


----------



## SoloCamo

Quote:


> Originally Posted by *$ilent*
> 
> I echo the call on how to flash the bios, I know how to flash NV cards just not AMD ones.
> I got Sapphire.


Interesting, I'm using a Sapphire as well, though the highest memory clock I've aimed for was 1350... no issues here, knock on wood. Perhaps I'll test 1400 and report back after I see how 1350 continues to do. After all, I'm only at 1080p, and no game I play, even maxed, requires me to overclock, thankfully.


----------



## $ilent

I'm at 1440p; does that stress benchmarks more? What's your max core clock?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jared Pace*
> 
> ati winflash
> 
> very simple program you can use from within windows. Load the bios file and hit the flash/program button


Thank you very much... now I just need the card to get here lol

Damn you, postal service


----------



## Jared Pace

To help you guys out a little: there is atiflash.exe, which can be used from within DOS like nvflash.exe. In DOS or a command prompt, use similar switches to nvflash, such as -f for forcing, -0 to select the primary adapter, -1 to select the secondary, etc. An easier method is to use the ATI WinFlash program: just load your .rom BIOS file and use the GUI to flash from within Windows. Both ways are as easy as nvflash, but ATI WinFlash is simpler. Shamino posted two 290X BIOSes you can use with Asus GPU Tweak. The second one he posted has Vdroop disabled, which will give you a higher overclock. You're likely still going to throttle if your card is not kept very cool (i.e. waterblock, LN2). You can back up your original BIOS using GPU-Z or download a copy of it from TPU's BIOS database.

The assumption that the BIOS switch for Uber/Quiet is just a dual BIOS is correct: choose position 1 or 2 and test your BIOS on that. Gibbo's Asus BIOS may do 1410 mV or higher; both of Shamino's do 2000 mV. You probably won't need anything more than 1350-1400 mV for now, and never that much at all on the stock cooler. Excited to see an end to the throttling and some high scores from you guys on water.

Good luck, glad to help. Anyone needs assistance, send a PM.

Cheers.
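The backup-then-flash sequence described above can be sketched as a shell dry run. Nothing here actually flashes anything: each atiflash command is echoed for review, the file names (backup.rom, ASUS.ROM) are placeholders, and atiflash is assumed to sit in the working directory.

```shell
#!/bin/sh
# Dry-run sketch of a single-card flash, assuming adapter 0.
# Each command is echoed instead of executed, so nothing is flashed.

flash_sketch() {
    # 1. Save the current BIOS from adapter 0 as a backup (-s).
    echo "atiflash -s 0 backup.rom"
    # 2. Write the new BIOS image to adapter 0 (-p).
    echo "atiflash -p 0 ASUS.ROM"
}

flash_sketch
```

Removing the `echo` in front of each command string would run the real thing; keeping the backup step first is the important part, as skupples notes further down.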


----------



## SoloCamo

Quote:


> Originally Posted by *$ilent*
> 
> Im at 1440p does that test benchmarks more? Whats your max core clock?


Haven't gone for max on either yet; using 1100 core and 1350 memory at the moment. I've been slowly upping it and checking stability after an hour or so of games/benches.


----------



## Arizonian

Again, I don't bench competitively, but here is *Firestrike with Extreme* checked.

i7 3770K 4.5 Ghz - 290X 1150 Core / 1350 Memory



http://www.3dmark.com/3dm/1488266

What I'm finding is I can't go any higher on core or memory without it failing the bench tests. I think with just a tad more voltage I could squeeze out more. Is there a way to tweak voltage yet? Afterburner is denying me.


----------



## $ilent

Quote:


> Originally Posted by *Jared Pace*
> 
> Help you guys out a little. There is atiflash.exe which can be used from within DOS like NVflash.exe. In DOS or command prompt, use similar triggers as NVflash such as -f for forcing. -0 to select primary adapter, -1 to select secondary etc. An easier method is to use the ATI Winflash program, and just load your .rom bios file and use the GUI to flash from within windows. Both ways are as easy as NVflash, but ATI Winflash is simpler. Shamino posted two 290X bioses you can use with Asus GPUTweak. The second one he posted has Vdroop disabled which will give you a higher overclock. You're likely still going to throttle if your card is not very cool (ie. waterblock, ln2). You can backup your original bios using GPUZ or download a copy of it from TPU's bios database.
> 
> The assumption that the bios switch for uber/quiet is just a dual bios is correct. Just choose position 1 or 2 and test your bios on that. Gibbo's Asus bios may do 1410mv or higher, both of Shaminos do 2000mv. You probably wont need anything more than 1350-1400mv for now, and never that much at all on the stock cooler. Excited to see an end to the throttling and some high scores from you guys on water.
> 
> Good luck, glad to help. Anyone needs assistance, send a PM.
> 
> Cheers.


Thanks. Any idea where I can find the stock Sapphire 290X BIOS? My GPU-Z won't let me save mine, and the two listed in the TechPowerUp BIOS database aren't the same BIOS version as mine either.
Quote:


> Originally Posted by *Arizonian*
> 
> Again don't bench competitively but here is *Firestrike with Extreme* checked.
> 
> i7 3770K 4.5 Ghz - 290X 1150 Core / 1350 Memory
> 
> 
> 
> http://www.3dmark.com/3dm/1488266
> 
> What I'm finding is I can't go any higher on Core or Memory without it failing the bench tests. I think if I had just a tad of voltage I could squeeze more. Is there a way to tweak voltage yet? Afterburner is denying me.


See the quoted post above, dude; flash the Asus BIOS.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Arizonian*
> 
> Again don't bench competitively but here is *Firestrike with Extreme* checked.
> 
> i7 3770K 4.5 Ghz - 290X 1150 Core / 1350 Memory
> 
> What I'm finding is I can't go any higher on Core or Memory without it failing the bench tests. I think if I had just a tad of voltage I could squeeze more. Is there a way to tweak voltage yet? Afterburner is denying me.


The only way I know of atm is to flash the BIOS to an Asus version and use GPU Tweak, as Jared said above.

I could have sworn I saw something about the Sapphire card being voltage unlocked out of the box, though. Oh well.


----------



## szeged

Wait wait wait wait, so all the non-Asus cards are voltage locked? Ughhh, I was hoping I'd left BIOS flashing behind with the Titans lol. Oh well, it only takes 5 seconds.


----------



## SniperOct

Quote:


> Originally Posted by *Cool Mike*
> 
> ANY WORD OUT ON WHEN A CUSTOM (NON-REFERENCE) 290X WILL BE AVAILABLE? HOPING BY CHRISMAS...
> THANKS


Some partners have said as soon as November.


----------



## $ilent

Also, ATI WinFlash doesn't work... it just says "cannot find discrete video card".


----------



## sugarhell

Quote:


> Originally Posted by *szeged*
> 
> wait wait wait wait, so all the non asus cards are voltage locked? ughhh was hoping i left bios flashing to the titans lol, oh well it only takes 5 seconds.


They are not voltage locked. GPU Tweak gives voltage control only on the Asus card.

MSI AB still doesn't support the new cards.


----------



## Jared Pace

Quote:


> Originally Posted by *$ilent*
> 
> Thanks, any ideas where I can find the stock sapphire 290x bios? My gpuz wont let me save mine and the 2 listed on techpowerup gpuz database arent the same bios version as mine either.
> See quoted post above dude, flash the asus bios.


Version 015.039.000.007.000000 is here: http://www.techpowerup.com/vgabios/index.php?did=1002-67b0--

Which version is on yours? If GPU-Z doesn't work, ATI WinFlash will save it for you.


----------



## szeged

Quote:


> Originally Posted by *sugarhell*
> 
> They are not voltage locked. Gpu tweak gives voltage only on asus card.
> 
> Msi AB still doesnt support the new cards


ah, so MSI is slacking again


----------



## $ilent

Mine is 015.039.000.006.003515.

Also, I can't use ATI WinFlash; it just says "cannot find discrete video card".


----------



## sugarhell

Quote:


> Originally Posted by *szeged*
> 
> ah, so MSI is slacking again


Say that to Unwinder. He will rage for a week.


----------



## szeged

Quote:


> Originally Posted by *sugarhell*
> 
> Say this to unwinder. He will rage for 1 week


Why did he rage at someone complaining about MSI earlier?


----------



## Jared Pace

Quote:


> Originally Posted by *$ilent*
> 
> Also ati winflash doesnt work...just says cannot find discrete video card.


Use the DOS method then... atiflash4-17.exe is located in here (PT1.rom & PT3.rom are Shamino's 290X BIOSes; PT3 has Vdroop disabled):

290X Custom Bioses.zip 286k .zip file


1 Card:
If you only have ONE card, then use this command:
atiflash -p 0 biosname.rom

Multi Cards: (This will flash ALL video cards)
If you have a multi-card setup, then use this command:
atiflash -pa biosname.rom

-i [NUM] Display information of ATI adapters in the system

-ai [NUM] Display advanced information of ATI adapters [NUM] if specified

-p Write BIOS image to all appropriate adapters

-s [SIZE] Save BIOS image from adapter to file
First [SIZE] kbytes (except for Theater, in bytes) of ROM

-cf [SIZE] [SUM] Calculate 16-bit checksum for file
Checksum for the first [SIZE] kbytes of the file is calculated if [SIZE] is specified

-cb [SIZE] [SUM] Calculate 16-bit BIOS image checksum for adapter
Checksum for the first [SIZE] kbytes of the ROM content is calculated if [SIZE] is specified

-t Test ROM access of adapter

-v Compare ROM content of adapter to file

-f Force flashing regardless of security checking (BIOS file info check OR boot-up card)

-fa Force flashing bypassing already-programmed check

-fm Force flashing bypassing BIOS memory config check

-fs Force flashing bypassing BIOS SSID check

-fp Force flashing bypassing BIOS P/N check

-pcionly Enumerate only PCI adapters

-agp Enumerate only AGP adapters

-pcie Enumerate only PCIE adapters

-reboot Force a reboot of the system after successfully completing the specified operation


----------



## sugarhell

Quote:


> Originally Posted by *szeged*
> 
> why did he rage about someone complaining about msi earlier?


In general he rages a lot if someone tells him something about AB lol


----------



## Jared Pace

You may have to rename 'atiflash4.17.exe' to 'atiflash.exe' if the atiflash command does not work from within DOS/command prompt.

Edit: In case anyone does not know how to use DOS or the prompt:

Start -> Run.
Type cmd.
Press Ctrl+Shift+Enter for admin.

Point the drive letter to the USB stick or directory containing atiflash.exe.

The change directory command is cd.

Follow the commands in my above post for how to flash, using atiflash -f -0
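For anyone with a multi-card rig like jomama22's, the steps above combine with the -pa switch from the earlier flag list roughly as follows. This is only a sketch: the card count (3) and file names are hypothetical, and every command is echoed rather than run, so you can review the sequence before trying it for real.

```shell
#!/bin/sh
# Dry-run sketch of flashing a multi-GPU setup with -pa,
# backing up each adapter's original BIOS first with -s.
# Nothing is executed; commands are only printed.

flash_all_sketch() {
    cards=3   # hypothetical card count; adjust to your rig
    i=0
    # Back up each adapter's original BIOS individually.
    while [ "$i" -lt "$cards" ]; do
        echo "atiflash -s $i original_card$i.rom"
        i=$((i + 1))
    done
    # Then write the new image to every adapter at once.
    echo "atiflash -pa ASUS.ROM"
}

flash_all_sketch
```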


----------



## $ilent

That file doesn't work, Jared Pace; it just says invalid.


----------



## binormalkilla

I couldn't redeem my BF4 coupon code either. It said "invalid UID". I wonder if we have to wait until release day...


----------



## Sgt Bilko

Taken from the AMD/BF4 pdf:

To Receive EA Origin® Product Code: An internet connection is required. Internet connection charges may apply. Upon receipt of the AMD Unique ID Code, Participant should visit www.amd.com/getbattlefield4, sign up for a user account (if user does not already have an account), input the AMD Unique ID (referred to as a "Unique ID" on the website) and provide all other information, as required, and submit the web form. THIS WEBSITE IS ONLY PROVIDED IN ENGLISH, JAPANESE, KOREAN, GERMAN, ITALIAN, FRENCH,
RUSSIAN, PORTUGUESE, AND SPANISH.

Participant will receive an email from [email protected] at the email address provided on the web form which contains the EA Origin® Product Code. If Participant does not receive the EA Origin® Product Code via an email from [email protected] within seven (7) days of supplying the AMD Unique ID Code on www.amd.com/getbattlefield4, the Participant should submit a support request at http://www.amd4u.com/radeonrewards/ticket.

Hope that clears some things up for people.

Might have to add it to the OP, Arizonian; not sure if it's on the card or not.


----------



## Jared Pace

Quote:


> Originally Posted by *$ilent*
> 
> that file doesnt work jared pace, just says invalid.


Did you run it from DOS? Instructions for how to enter DOS mode: http://www.techpowerup.com/forums/showthread.php?t=57750


----------



## $ilent

I mean the whole 290X .zip file. It doesn't let you open it; it just says invalid file.


----------



## King PWNinater

Once more....

Can a 290X and an older card (anything from a 280X to a 7770) run in Crossfire in the same system?
I've already seen people using a 280X and a 7970 together, but I just want to be sure a 290X can do so.

Thank you in advance.


----------



## Jared Pace

Quote:


> Originally Posted by *$ilent*
> 
> I mean the whole 290X .zip file. It doesnt let you open it, just sayd invalid file.


oh, get it here then

http://www.mediafire.com/download/y5j1e8lkhzy8q2f/ggg.rar


----------



## $ilent

Quote:


> Originally Posted by *King PWNinater*
> 
> Once more....
> 
> Can a 290x and an older card (Anything from 280x to 7770.) to crossfire in the same system?
> I've already seen people using a 280x and a 7970 together, but I just want to be sure a 290x can do so.
> 
> Thank you in advance.


The 280X and the 7970 are the same card... the 290X and the 280X are completely different cards...


----------



## Jared Pace

It opens for me using WinRAR on Win 7 x64.


----------



## RocketAbyss

Quote:


> Originally Posted by *King PWNinater*
> 
> Once more....
> 
> Can a 290x and an older card (Anything from 280x to 7770.) to crossfire in the same system?
> I've already seen people using a 280x and a 7970 together, but I just want to be sure a 290x can do so.
> 
> Thank you in advance.


No.

The 290X and 7970 cores are miles apart. You can only CrossFire cards with the same GPU core. The 280X and the 7970 have the exact same Tahiti core, which is why they can CrossFire.


----------



## Arizonian

So did a Unigine Valley Benchmark 1.0 run for those interested.

i7 3770k 4.5 Ghz / 290X 1150 Core 1350 Memory



Will not be flashing the BIOS but will try to add voltage when AB supports the 290X.

Well, I'm done with the benchmarks - I'm a gamer. Going to see what I can push now gaming in Crysis 3, which is the sole game that brings any GPU, or even multiple GPUs, to its knees.


----------



## KingT

Quote:


> Originally Posted by *Arizonian*
> 
> So did a Unigine Valley Benchmark 1.0 run for those interested.
> 
> i7 3770k 4.5 Ghz / 290X 1150 Core 1350 Memory
> 
> 
> 
> Will not be flashing BIOS but will try to add voltage when AB supports 290X.
> 
> Well I'm done with the benchmarks - I'm a gamer. Going to see what I can push now gaming in Crysis 3, which is the sole game to bring any GPU, or even multiple GPUs, to its knees.


That is an identical score to my HD 7950 CrossFire @ stock (900/1250MHz).

With the cards slightly OC'd (stock voltages) @ 1000/1575MHz my system scores 3250 pts.

CHEERS..


----------



## skupples

Sometimes you have to shift + right-click on the folder the .exe is inside of and open the prompt from there.

Always save a backup of the original BIOS. I'm not at fault if you blow your VRMs.
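For what that backup step looks like in practice, here is a rough sketch using the DOS version of ATIFlash. The adapter number and filename are only examples; check the output of `atiflash -i` against your own card first:

```
REM List adapters and confirm your 290X's adapter number (often 0)
atiflash -i

REM Dump the current BIOS of adapter 0 to stock.rom before flashing anything
atiflash -s 0 stock.rom
```

Keep that `stock.rom` somewhere safe off the flash drive so you can always flash back.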


----------



## oblivious45cs

Hello everyone,

Figured I would share some of my thoughts and results, since this forum has been so helpful in providing me information.









I picked up the HIS 290X and flashed it using the Asus BIOS and the ATIFlash method. At stock volts I was only able to do about 1070 without crashing. I bumped the voltage up to 1.312 and was able to get 1160 stable. Below are some images of my results; I have only played around with Firestrike and Metro LL so far.

Metro Results - First Graph is stock, second @ 1160 / 1400


Spoiler: Warning: Spoiler!







Firestrike Results - First is stock, second @ 1160 / 1400


Spoiler: Warning: Spoiler!







At stock I was actually surprised the card was nice and quiet. Yeah, the temps hit 94, but I guess that's supposed to happen, lol. When running at 1160 I set the fan to 70% and that kept temps under 90, but it is extremely loud; 100% on these stock coolers is unbearably loud.

At this point I'm still debating adding it to my H220 loop. I was hoping I could get closer to 1200, but anything past 1160 and it crashes or artifacts. I haven't tried to keep going higher with the voltage yet; still a little gun-shy. I will see what you more experienced guys can do with them and decide.

Forgot to add information on my rig...
MSI Mpower Z87
4770K @ 4.6
16GB @ 2000
290x @ 1160/1400


----------



## jerrolds

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I just ordered a Arctic Cooling Accelero Xtreme III for mine since it's been confirmed that it fits.
> so i'll do some testing on Stock air and then with that, i had to order it from another site though so i have $90 worth of store credit......time to find some more case fans and pretty blue lights


Awesome, let us know how it does - I'm thinking of getting that or the Accelero Hybrid, the closest thing to actual watercooling I'd guess. That, or I was thinking of zip-tying on an H80 (or H100i) and slapping on some RAM sinks.


----------



## Jared Pace

Quote:


> Originally Posted by *oblivious45cs*
> 
> I picked up the HIS 290X and flashed it using the Asus Bios and winflash method.


Ah so ATI Winflash does work. Good to know. Wonder why $ilent couldn't find his adapter using it.


----------



## $ilent

Hmm, not sure. Gonna give it another try.


----------



## Clukos

Arizonian, can you maybe post a screenie of 3DMark 11 Extreme?


----------



## Arizonian

Quote:


> Originally Posted by *KingT*
> 
> That is an identical score to my HD 7950 CrossFire @ stock (900/1250MHz).
> 
> With the cards slightly OC'd (stock voltages) @ 1000/1575MHz my system scores 3250 pts.
> 
> CHEERS..


Thanks, I think you mean - *Forza!*









I'm happy with this card at this price point. Waiting to see what non-reference brings to the table, or perhaps what the Arctic Accelero Xtreme III does. To sum up how I feel in a video.....





Will be selling off a GTX 680 locally to offset the cost too.


----------



## oblivious45cs

Yeah, ATIFlash worked for me. I used the directions from the thread you copied a few posts back.


----------



## Sgt Bilko

Quote:


> Originally Posted by *jerrolds*
> 
> Awesome let us know how it does - im thinking of getting that or the Accelero Hybrid, closest thing to actual watercooling id guess. That or i was thinking of zip-tieing an H80 (or H100i) and slap on some ram sinks


Will do. I think Stay Puft is also getting some Accelero Xtreme IIIs and he might do a small tutorial.

I actually ordered some RAM heatsinks as well with that store credit, along with some Arctic 5 paste. Hopefully I get the packages on Tuesday.......probably more like Wednesday though. So much stuff to look forward to!!









EDIT: I'm going the full air cooler route because I plan on getting another 290/X at some stage for CrossFire, and I want to do my water loop at the same time, so this is for the short-to-mid term and also to find out how well this works on a 290X.

I'll be doing some stock air benches, then some with the Accelero Xtreme III, and then some more with VRAM sinks on (the thermal adhesive takes 5-7 days to dry).


----------



## jerrolds

The local shop has it for $135 CDN; I think I'll wait for Black Friday / for others to try it. I've been talking with Arctic Cooling support, and they say that while the Xtreme III and Hybrid should support the R9 290X, it might need some additional fittings:
Quote:


> Eric: Yes those two existing Accelero products indeed will support R9 290X! We are waiting for the details if there is any refresh on the accessories or mounting design. We will keep you posted.
> 
> Me: Awesome! So if i were to walk into a local shop and picked a hybrid off the shelf - it should work with the 290x?
> 
> Eric: Hold your horses! Sometimes we have to make changes such as adding accessories required for the new GPU and inventory on the shelf right now will not have those additions. However, if we don't need any modification/addition, then you are right. I would suggest wait a little until we have all the proper information.
> 
> We hope this answers your question. Thank you for choosing ARCTIC products!


----------



## BababooeyHTJ

Has anyone tested a card on water? I'm dying to see if these scale as well with temps as Tahiti.

Also, is Sapphire Trixx no longer in development? It hasn't been updated since February by the looks of things. That's too bad.


----------



## $ilent

Maybe it's me being silly, Jared, but do I not just download WinFlash and click Run as admin?


----------



## RocketAbyss

$ilent, and others who are affected by the screen blacking out for >5 seconds when loading up a game/fullscreen program, and whose 290X is constantly pegged at 100% usage even when idling at the desktop, try this for me and for everyone else:

1.) After a reboot of your PC, don't open GPU-Z or any other temp-reading software (not too sure about MSI AB, but having CCC open is fine).
2.) Try to replicate the symptom when opening a game/fullscreen program.
3.) If it happens again, then it's definitely a driver problem.
4.) If it does NOT happen, it's a GPU-Z/other temp-reading software problem.

Because with only CCC open I've managed to play a few rounds of BF3, run the Valley and Heaven benches, try Crysis, etc., and I had no more screen freezes or 100% GPU load at desktop idle. I have not tried reopening GPU-Z to see if the problem reappears after a while, but if it does happen, I'll get back to you all.









Peace, and enjoy your 290X(s)! Cos now that I've kinda tracked down where that problem is, I'm happy gaming as it is.


----------



## Sgt Bilko

Quote:


> Originally Posted by *jerrolds*
> 
> Local shop has it for $135CDN, i think ill wait for black friday/others try it - ive been talking with Artic Cooling support and they say while Xtreme III and Hybrid should support R290X it might need some additional fittings


Ah well, guess I'll find out when it arrives; no big loss if I need some extra bits and pieces for it.


----------



## sugarhell

The latest GPU-Z causes my 7970 to run at 100% too.


----------



## Jared Pace

Quote:


> Originally Posted by *$ilent*
> 
> Maybe its me being silly jared but do i not just download winflash and click run as admin?


That would be my first guess! Maybe you have some sort of conflict... Running Windows 8? That might be it. Perhaps the guy who successfully did it could help you further. If you're familiar with DOS, there's always ATIFlash too.


----------



## SoloCamo

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *RocketAbyss*
> 
> $ilent, and others who are affected by the blackout screen for >5 seconds when loading up a game/fullscreen program, and your 290X constantly pegged at 100% usage even when idling at desktop, try this for me and for everyone else:
> 
> 1.) After a reboot of your PC, don't open GPU-Z or any other temp reading software(not too sure about MSI AB, but having CCC open is fine).
> 2.) Try and replicate the symptom when opening games/fullscreen program.
> 3.) If it happens again, then it's definitely a driver problem.
> 4.) If it does NOT happen, its a GPU-Z/other temp reading software problem.
> 
> Because I've tried only having CCC open, and I've managed to play a few rounds on BF3, try Valley and Heaven Benches, try Crysis etc. And I had no more screen freezes or 100% GPU load on desktop idle. I have not tried reopening GPU-Z to see if the problem will appear after awhile. But if it does happen, I'll get back to you all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Peace and enjoy your 290X(s)! Cos now that I've kinda sourced out where that problem is, I'm happy gaming as it is






This X2 - reinstalling the Beta 6 drivers after a clean uninstall of them fixed it for me


----------



## $ilent

I'm on Win7 64-bit.
Quote:


> Originally Posted by *RocketAbyss*
> 
> $ilent, and others who are affected by the blackout screen for >5 seconds when loading up a game/fullscreen program, and your 290X constantly pegged at 100% usage even when idling at desktop, try this for me and for everyone else:
> 
> 1.) After a reboot of your PC, don't open GPU-Z or any other temp reading software(not too sure about MSI AB, but having CCC open is fine).
> 2.) Try and replicate the symptom when opening games/fullscreen program.
> 3.) If it happens again, then it's definitely a driver problem.
> 4.) If it does NOT happen, its a GPU-Z/other temp reading software problem.
> 
> Because I've tried only having CCC open, and I've managed to play a few rounds on BF3, try Valley and Heaven Benches, try Crysis etc. And I had no more screen freezes or 100% GPU load on desktop idle. I have not tried reopening GPU-Z to see if the problem will appear after awhile. But if it does happen, I'll get back to you all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Peace and enjoy your 290X(s)! Cos now that I've kinda sourced out where that problem is, I'm happy gaming as it is


Mine is hit and miss. I've reinstalled Windows, and since doing that it's not done the 100% thing anymore, but when I run benchmarks it still goes to a black screen every 5 mins.

Although saying that, even when trying to install AB it does it: screen hangs, the next option comes up, click a button... screen hangs. Rinse and repeat.


----------



## oblivious45cs

Quote:


> Originally Posted by *Jared Pace*
> 
> That would be my first guess! Maybe you have some sort of conflict... Running Windows 8? That might be it. Perhaps the guy who successfully did it could help you further. If you're familiar with dos there's always atiflash too.


I think you guys misunderstood me









I used ATIFlash, not WinFlash. I couldn't get WinFlash to run, so I went with ATIFlash: created a bootable USB drive and flashed in DOS.

The process is shown in this link
http://www.techpowerup.com/forums/showthread.php?t=57750
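For anyone following the same route, the DOS-side commands look roughly like this. This is only a sketch based on the TPU guide linked above; the adapter number and the filenames (`backup.rom`, `asus.rom`) are placeholders, so verify them against your own ATIFlash version and card:

```
REM Boot from the USB stick that contains atiflash.exe, then:

REM 1. Identify the adapter number of the 290X (usually 0 in a single-card system)
atiflash -i

REM 2. Back up the current BIOS before touching anything
atiflash -s 0 backup.rom

REM 3. Program the new BIOS image onto adapter 0
atiflash -p 0 asus.rom

REM 4. Power off, then reboot into Windows with the new BIOS active
```

Copy `backup.rom` somewhere safe before you flash, in case you ever need to go back to stock.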


----------



## RocketAbyss

Quote:


> Originally Posted by *$ilent*
> 
> im on win7 64bit.
> Mine is hit and miss, ive reinstalled windows and since doing that its not done the 100% thing anymore, but when I run benchmarks it still goes to black screen every 5 mins.
> 
> Although saying that, even when trying to install AB it does it, screen hangs, next option comes up, click a button...screen hangs. Rinse and repeat


Try just running CCC after a reboot. If you're able to replicate my results, with the black screens and the 100% peg at idle disappearing, we can safely rule out an AMD driver issue for that part; it's more likely a definite issue with current third-party temp/GPU OC software.


----------



## Arizonian

Hmm, what does the power limit do in AB? I've been running it at the default - 0 - does it allow for a higher OC?

BTW, gaming I'm stable without artifacting at 1100 Core / 1300 Memory in Crysis 3. When I bring it up to 1150 / 1350 it artifacts.


----------



## Jared Pace

Quote:


> Originally Posted by *oblivious45cs*
> 
> I picked up the HIS 290X and flashed it using the Asus Bios and winflash method.


Quote:


> Originally Posted by *oblivious45cs*
> 
> I think you guys mis-understood me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I used ATI-Flash, not WinFlash


Yeah, misunderstood. You said "WinFlash method" in your first post, as quoted above. So apparently ATI Winflash _DOES NOT_ work, and $ilent, you must use ATIFlash in DOS. Just follow my earlier posts or the TPU tutorial (which is the source of the instructions in my post).


----------



## $ilent

I think I'm just gonna wait for another AMD driver and wait for MSI AB to update so I can use unlocked voltage without modding the BIOS... hopefully they release it prior to BF4 (you know... the game they have been banging on about so much that uses, nay _REQUIRES_, the 290X to run well? I think, AMD, it might help if you could actually make a driver that works, as opposed to these crappy beta drivers that do nothing?)

If they haven't released a decent driver by the time BF4 is released, I'm off back to my GTX 570; at least that card works. I'm happy with my 290X's performance so far, but honestly, AMD, is it really THAT difficult to pre-emptively make a decent working driver for the card that you claim is the holy grail of GPU gaming?

Christ almighty...


----------



## sugarhell

Quote:


> Originally Posted by *Arizonian*
> 
> Hmm what does power limit do in AB? I've been running it default - 0 - Does it allow for higher OC?
> 
> BTW gaming I'm stable without artifacting at 1100 Core / 1300 Memory in Crysis 3. When I bring it up to 1150 / 1350 it artifacts.


You should use the 50% power limit through CCC. It increases the volts, if I understand the new PowerTune properly.


----------



## jerrolds

Quote:


> Originally Posted by *Arizonian*
> 
> Hmm what does power limit do in AB? I've been running it default - 0 - Does it allow for higher OC?
> 
> BTW gaming I'm stable without artifacting at 1100 Core / 1300 Memory in Crysis 3. When I bring it up to 1150 / 1350 it artifacts.


Doesn't it give it more juice to prevent throttling? I know with the 7970 it's usually maxed at 120%. Or am I thinking of Power Target...


----------



## Jared Pace

Quote:


> Originally Posted by *Arizonian*
> 
> Hmm what does power limit do in AB? I've been running it default - 0 - Does it allow for higher OC?
> 
> BTW gaming I'm stable without artifacting at 1100 Core / 1300 Memory in Crysis 3. When I bring it up to 1150 / 1350 it artifacts.


It adjusts the 208 watts given to the core ASIC in percentages, with a max of -50% or +50% for factory BIOSes. Adjust it slightly up or down and monitor your throttling as you try to achieve higher stable clocks. As each chip has variable leakage and binning, you will find a sweet spot for this power setting that provides the optimum core frequency and minimum throttle, resulting in the longest sustained / highest clock.
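To make those percentages concrete, here is a minimal sketch of the arithmetic. The 208 W ASIC figure and the ±50% factory-BIOS range are taken from the description above; real PowerTune behaviour also depends on temperature and current limits, so treat this as illustration only.

```python
# Illustrative only: PowerTune power-limit arithmetic for a reference R9 290X.
# The 208 W stock ASIC power and the +/-50% slider range come from the post above.

STOCK_ASIC_POWER_W = 208


def power_target(slider_percent: float) -> float:
    """Return the ASIC power target in watts for a given power-limit slider value.

    Factory BIOSes clamp the slider to the -50..+50 range, so out-of-range
    values are clamped here too.
    """
    clamped = max(-50.0, min(50.0, slider_percent))
    return STOCK_ASIC_POWER_W * (1.0 + clamped / 100.0)


print(power_target(0))    # 208.0 W (stock)
print(power_target(50))   # 312.0 W (slider maxed)
print(power_target(-50))  # 104.0 W (slider minimized)
```

In other words, the slider only raises the throttle ceiling; it does not force higher clocks on its own.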


----------



## oblivious45cs

Oops, sorry about that, guys, the names are too similar!


----------



## rdr09

Quote:


> Originally Posted by *oblivious45cs*
> 
> Hello everyone,
> 
> Figured i would share some of my thoughts and results, since this forum has been so helpful in providing me information
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I picked up the HIS 290X and flashed it using the Asus Bios and ATIflash method. Stock volts i was only able to do about 1070 without crashing. I bumped voltage up to 1.312 and was able to get 1160 stable. Below are some images of my results, i have only played around with Firestrike and Metro LL so far.
> 
> Metro Results - First Graph is stock, second @ 1160 / 1400
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Firestrike Results - First is stock, second @ 1160 / 1400
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Stock i was actually surprised the card nice and quiet, yeah the temps hit 94 but i guess that supposed to happen lol. When running at 1160 i set the fan to 70% and that kept temps under 90 but it is extremely loud, 100% on these stock coolers is unbearably loud.
> 
> At this point im still debating on adding it to my H220 loop, was hoping i could get closer to 1200, but anything past 1160 and it crashes or artifacts, i haven't tried to keep going higher with the voltage yet, still a little gun shy. I will see what you more experienced guys can do with them and decide.
> 
> Forgot to add information on my rig...
> MSI Mpower Z87
> 4770K @ 4.6
> 16GB @ 2000
> 290x @ 1160/1400


I was comparing your graph to an OC'd 780 and noticed how smooth it is . . .


----------



## SonDa5

Getting ready to hook up my 290x with some cooling love. 

This will be temporary till my acrylic copper EK block arrives.

The block is a Swiftech MCW80.


----------



## Sgt Bilko

Quote:


> Originally Posted by *SonDa5*
> 
> Getting ready to hook up my 290x with some cooling love.
> 
> This will be temporary till my acrylic copper EK block arrives.


You've reduced it to its bare parts and you didn't even throw a towel on it.......shame on you









Serious time: looks good, looking forward to more tests while I wait.


----------



## $ilent

Well, looks like I'm out of action for a while.

I just reinstalled Windows and went to plug my RAID drives back in. When I switched the PSU on I heard a small "tsst" noise; it happens every time I press the I/O button on the back of the PSU. Now my PSU fan spins for half a second then goes off. I know the motherboard is getting power because the mobo power button is lit up. This happens quite often when I change stuff in my PC, especially when I have cables plugged into the PSU that aren't being used, but I've tried removing all those cables and I've tried reseating all the PSU cables, but nothing... still doing the same thing.

Superb.

Anyone want a free 290X? Might as well bin it along with the rest of this crappy PC I've got.


----------



## SoloCamo

Took a screenshot as proof; I already have the card installed obviously, but I can take a screenshot of the box if needed as well... forgot to request I be added, lol.

Keeping it at 1050/1325 for now, as there is no need to go higher at all yet at 1080p.


----------



## Gunderman456

Quote:


> Originally Posted by *$ilent*
> 
> Well looks like im out of action for a while.
> 
> I just reinstalled windows, went to plug my raid drives back in, switched psu on heard a small "tsst" noise, it happens everytime I press the I/O button on the back of the psu...now my psu span spins for half a second then goes off. I know the motherboard is getting power because the mobo on button is light up. This happens quite often when I change stuff in my pc, especially when I have cables plugged into the psu that arent being use,d but ive tried removing all those cables and ive tried reseatting all psu cables but nothing...still doing same thing.
> 
> Superb.
> 
> Anyone want a free 290x? might aswell bin that along with the rest of thise crappy pc ive got.


I feel your anguish bro!

I hate when that happens. Something deep inside tells you to stop but the enthusiast in you compels to keep pushing.

You almost wish you can restore from a saved game, but there is no such respite.

I've been following the AMD R9 290X launch and I'm not really impressed:

- The hardware can't keep things cool, which causes erratic throttling between 700-900MHz
- I don't really like this boost/uber mode, which is also present on my HD 7990
- Drivers are not where they should be
- It also appears that AMD and EA have colluded in forcing you to buy the BF4 edition, since I've not seen any vanilla versions on offer
- The launch is weak (you could say a paper launch), at least here in Canada
- More waiting required for third-party aftermarket solutions


----------



## Arm3nian

Quote:


> Originally Posted by *Gunderman456*
> 
> I feel your anguish bro!
> 
> I hate when that happens. Something deep inside tells you to stop but the enthusiast in you compels to keep pushing.
> 
> You almost wish you can restore from a saved game, but there is no such respite.
> 
> I've been following the AMD r9 290x launch and I'm not really impressed;
> 
> 
> The hardware can't keep things cool which causes erratic throttling to occur 700-900MHz
> I don't really like this boost/uber mode which is also present on my HD 7990
> Drivers are not where they should be
> It also appears that AMD and EA have colluded in forcing you to buy the BF4 edition since I've not seen any vanilla versions on offer
> Launch is weak (could say a paper launch), at least here in Canada
> More waiting required for third party aftermarket solutions


I don't see what the big deal is. Either go water or wait...

In previous posts I argued this card in its current release is made for water. Everyone seemed to disagree, and now everyone is complaining.
Yet no one who has these under water has any issues. They are just waiting for voltage unlocks to see what the card can do.


----------



## Arizonian

Quote:


> Originally Posted by *SonDa5*
> 
> Getting ready to hook up my 290x with some cooling love.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> This will be temporary till my acrylic copper EK block arrives.
> 
> The block is swiftech mcw80.


Quote:


> Originally Posted by *SoloCamo*
> 
> Took a screenshot as proof, already have the card obviously installed but can a screenshot of the box if needed as well... forgot to request I be added lol
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Keeping it at 1050/1325 for now as there is no need to go higher at all yet at 1080p


Congrats + Added both









I'm going to start having to remember which page I leave the club at because it's moving so fast I don't want to miss anyone.


----------



## TheSoldiet

I'm still waiting for my card to come. It won't be in stock till the 29th :-( I think I will order an Xtreme III tomorrow and then mount it on my card next weekend.

BTW, can anybody confirm that the Sapphire BF4 edition comes with Hynix memory? I would like to know.


----------



## kcuestag

You guys are so lucky to get your cards on a Saturday, mine won't arrive until Monday, and the waterblock sometime between Wednesday and next Monday.


----------



## $ilent

Quote:


> Originally Posted by *TheSoldiet*
> 
> Im still waiting for my card to come. It wont be in stock till the 29th :-( I think I will order a xtreme III tomorrow and then mount it on my card next weekend.
> 
> BTW can anybody confirm that the Sapphire bf4 edition comes with hynix memory? I would like to know


Let me just pull mine out of the bin and I'll tell you...


----------



## kcuestag

Quote:


> Originally Posted by *$ilent*
> 
> Let me just pull mine out of the bin ill tell you...


Send me that 290X, will you?










Just return it and demand a replacement!


----------



## TheSoldiet

Unless the rest of your system is crap too, just get a new PSU. I feel sorry for you :-( You get the most powerful GPU and you can't even use it :-(


----------



## Arm3nian

Quote:


> Originally Posted by *$ilent*
> 
> Let me just pull mine out of the bin ill tell you...


I still don't understand why you're trying to volt mod. It has already been shown that the card can go way past the clock you're at WITHOUT a volt mod. Even if you did it, you realize volt modding is reserved for WC/LN2, right? Especially for a card that runs this hot. There is obviously something wrong with your sample if you can't get it past your clock.


----------



## Tobiman

Quote:


> Originally Posted by *$ilent*
> 
> Well looks like im out of action for a while.
> 
> I just reinstalled windows, went to plug my raid drives back in, switched psu on heard a small "tsst" noise, it happens everytime I press the I/O button on the back of the psu...now my psu span spins for half a second then goes off. I know the motherboard is getting power because the mobo on button is light up. This happens quite often when I change stuff in my pc, especially when I have cables plugged into the psu that arent being use,d but ive tried removing all those cables and ive tried reseatting all psu cables but nothing...still doing same thing.
> 
> Superb.
> 
> Anyone want a free 290x? might aswell bin that along with the rest of thise crappy pc ive got.


I don't get it. Is it your PSU or GPU that kicked the bucket?


----------



## $ilent

Quote:


> Originally Posted by *Arm3nian*
> 
> I still don't understand why you're trying to volt mod. It has already been show that the card can go way past the clock you're at WITHOUT a volt mod. Even if you did it, you realize volt modding is reserved for wc/ln2 right? Especially for a card that runs this hot. There is obviously something wrong with your sample if you can't get it past your clock.


I'm not trying to volt mod; all I've done is try to install ATI Winflash, and that failed, so I gave up. Which clock are you referring to?

I can run a 3DMark bench at 1100/1345, so I don't think I can return it? It artifacts in Windows at idle with like 1450 memory though.

To Tobiman: the PSU, I think; so far I can't test anything else. I might buy a cheap unit to test it, I'll see.


----------



## sugarhell

Don't use AB or GPU-Z; they cause problems. GPU-Z doesn't even show the right specs. Until we get a proper AB and voltage control, just play with CCC.


----------



## Falkentyne

Quote:


> Originally Posted by *$ilent*
> 
> Im not trying to volt mod, all ive done is tried to inatall ati wi flashm that failed so i gave up. Which clock are you referring to?
> 
> I can 3dmark bench at 1100/1345, so i dont think i can return it? It artifacts in windows at idle with like 1450 memory though.
> 
> To tobiman, psu i think so far cant test anything else. I might buy a cheap unit to test it ill see.


Check the OCUK forums. Your problem with the memory has been experienced by others.
The RAM can apparently handle higher clocks; it seems to be the 2D/3D desktop speed switching that causes the problem.

I know because I was able to run at 1100/1500 my first time, looping the Valley benchmark multiple times as well as playing some League of Legends. No problems keeping it there all night.

Then I rebooted the computer.

BAM: once I was on the desktop, the instant I tried to move ANYTHING, I saw those lines you were talking about, then an instant black screen.
Rebooted again, tried to open MSI Afterburner, set the 70% manual fan speed, then, while mousing over something: black screen.

Rebooted again and managed to open CCC and hit "defaults" before the black screen happened.

Changing the core speed only doesn't seem to cause issues.

It's something with the memory clock.
The memory clock should NEVER be going to 3D speeds while scrolling or opening something on the desktop... isn't that what the UVD clocks were for?
Seems to be a bug.

Some people fixed this by using GPU Tweak to raise the memory voltage 0.05 V on the Asus BIOS, and that stopped the desktop from crashing.


----------



## Arm3nian

Quote:


> Originally Posted by *sugarhell*
> 
> Dont use ab or gpu-z. They cause problems. Gpu-z dont even show the right specs. Until we get proper AB and voltage control just play with CCC


This. People are trying to get 1500MHz with new drivers, a stock BIOS, and a stock cooler that underperforms compared to the 780/Titan. Just wait a bit. For now, go play games; the card performs amazingly.


----------



## $ilent

Thanks, Falk, I'll have a read when I get my PSU sorted.


----------



## Jared Pace

Quote:


> Originally Posted by *sugarhell*
> 
> Dont use ab or gpu-z. They cause problems. Gpu-z dont even show the right specs. Until we get proper AB and voltage control just play with CCC


+1. Wait until MSI AB or GPU-Z versions are released that specify compatibility with the 290X; it should be the next version of each program (AB 3.00 B17 or 3.00 Final, or GPU-Z 0.74). If you want R9 290X voltage control right now, you have to use the Asus BIOS + Asus GPU Tweak OC software.


----------



## th3illusiveman

Quote:


> Originally Posted by *$ilent*
> 
> I think im just gonna wait for another AMD driver and wait for MSI AB to update so I can use unlocked voltage without modding the bios...hopefully they release it prior to BF4 (you know...the game they have been banging on about so much that uses, neigh _REQUIRES_ the 290X to run well? I think AMD it might help if you could actually make a driver that works, as apposed to these crappy beta drivers that do nothing?)
> 
> If they havent released a decent driver by the time BF4 is released im off back to my gtx 570, at least that card works? Im happy with my 290x's performance so far, but honestly AMD. Is it really THAT difficult to pre emptively make a decent working driver for the card that you claim is the holy grail of gpu gaming.
> 
> Christ almighty...


You should realize that it is usually unwise to buy cards on the day they launch. This isn't the first time AMD has had day-one driver issues and it won't be the last (Nvidia are in the same boat, btw)... It's just how these things are.

Your frustration is understandable though....


----------



## selk22

Quote:


> Originally Posted by *Koniakki*
> 
> I guess Valley is a lot different story than 3DMark11. My stock 780 @ 1045/1502 was getting the same score as *selk22* below. So we can assume a 1110/1250 290X gets the same score as a 1045/1502 780?
> 
> It's a little strange. The two scores above are inconsistent based on their clocks, and I think it's CCC settings/OS differences that affect them.


I think it's great to see the differences here. I am just excited to hear my 290X, for 550, was matching your stock 780! Cool









I really don't understand how you guys are able to push this mem so far!

Someone please help! I have tried to OC in CCC, and it's successful with the GPU clock, but the mem clock stays the same..

Then I can OC everything I want in MSI Afterburner, but pretty much if I even touch the mem I will see artifacting in benches and games..

So please tell me how you guys are OC'ing this card?


----------



## thestache

So, no word on voltage control? And has anyone got a card under water to compare how it performs with real cooling?

The price is too damn good not to be considering CrossFire at the moment. Even short term, until GTX 780 Ti non-reference cards and G-Sync.

Anyone running Eyefinity and having issues?

And does vsync actually work yet? Lol.


----------



## rdr09

Quote:


> Originally Posted by *thestache*
> 
> So no word on voltage control? And has anyone got a card under water to compare how it performs with real cooling?
> 
> Price is too dam good to not be considering crossfire at the moment. Even short term until GTX 780 TI non reference cards and Gsync.
> 
> Anyone running eyefinity and having issues?
> 
> And does vsync actually work yet ? Lol.


why? is it giving you input lag?


----------



## BababooeyHTJ

Quote:


> Originally Posted by *rdr09*
> 
> why? is it giving you input lag?


I thought vsync had issues in eyefinity. I don't know if that was resolved either.


----------



## Jared Pace

Quote:


> Originally Posted by *thestache*
> 
> So no word on voltage control?


asus bios + gputweak


----------



## jerrolds

Quote:


> Originally Posted by *thestache*
> 
> So no word on voltage control? And has anyone got a card under water to compare how it performs with real cooling?
> 
> Price is too dam good to not be considering crossfire at the moment. Even short term until GTX 780 TI non reference cards and Gsync.
> 
> Anyone running eyefinity and having issues?
> 
> And does vsync actually work yet ? Lol.


GSync looks amazing - but it looks like it will only be supported on 1080p 120/144Hz TN panels, which is a shame. My QNIX will live a little longer, and I'll switch to green once GSync is possible on 1440p IPS/PLS panels, assuming Radeon doesn't beat it in some way.


----------



## rdr09

Quote:


> Originally Posted by *BababooeyHTJ*
> 
> I thought vsync had issues in eyefinity. I don't know if that was resolved either.


never used eyefinity. i read nvidia is having issues with vsync. gsync should resolve that. maybe 2014.


----------



## selk22

Quote:


> Originally Posted by *Jared Pace*
> 
> asus bios + gputweak


Has anyone confirmed this works, and if so, any help on flashing the 290X BIOS? I have only ever flashed a mobo.


----------



## Jared Pace

Quote:


> Originally Posted by *selk22*
> 
> Has anyone confirmed this works and if so any help on flashing the 290x Bios? I have only ever done the Mobo


Yeah, Gibbo @ OcUK did it & someone here at OCN did it a couple pages back. It works - there are instructions a few pages back in this thread. Right now ATI Winflash isn't working, so you'll have to flash from DOS or cmd.
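For anyone new to the DOS route, the usual atiflash sequence looks roughly like this - a sketch only; the adapter index (`0`) and the ROM filename (`asus290x.rom`) are placeholders for your own setup, and you should always save a backup first:

```
:: Save the current BIOS from adapter 0 as a backup
atiflash -s 0 backup.rom

:: Program adapter 0 with the new image (-f forces past the subsystem ID check)
atiflash -p 0 asus290x.rom -f

:: Power down and reboot for the new BIOS to take effect
```

If a flash goes bad, booting with a second GPU and re-flashing the backup usually recovers the card.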


----------



## SoloCamo

Quote:


> Originally Posted by *sugarhell*
> 
> Dont use ab or gpu-z. They cause problems. Gpu-z dont even show the right specs. Until we get proper AB and voltage control just play with CCC


Not sure about AB yet, but the 100% load readings I've seen so far seem to be associated with GPU-Z.


----------



## Arm3nian

Quote:


> Originally Posted by *thestache*
> 
> So no word on voltage control? And has anyone got a card under water to compare how it performs with real cooling?
> 
> Price is too dam good to not be considering crossfire at the moment. Even short term until GTX 780 TI non reference cards and Gsync.
> 
> Anyone running eyefinity and having issues?
> 
> And does vsync actually work yet ? Lol.


One member reported eyefinity working much better than on 7970s. We are still waiting for software to allow voltage control (AB). If it doesn't, we'll need to BIOS flash.


----------



## Arizonian

Quote:


> Originally Posted by *SoloCamo*
> 
> Not sure about AB yet, but the 100% load so far from what I've seen seems to be associated to gpu-z.


I recently had issues with BF3 again not starting up with GPU-Z monitoring. Turned it off and BF3 now loads perfectly. I've uninstalled GPU-Z until I see it support the 290X.

I've been using AB to overclock but am having issues: I can't add more than a 100MHz OC to memory and a 150MHz OC to core. Unless this is a poor overclocker (which is possible) I'm being limited, as I see most GPUs can get a 200MHz average OC on both core and memory just on air.

For gaming this is a beast with a simple 70% fan (no louder than a GTX 690 at full load) - I've got temps well under control, 84C with 100% GPU usage. Will be nice to have final release drivers and a supported AB as well.


----------



## binormalkilla

Quote:


> Originally Posted by *thestache*
> 
> So no word on voltage control? And has anyone got a card under water to compare how it performs with real cooling?
> 
> Price is too dam good to not be considering crossfire at the moment. Even short term until GTX 780 TI non reference cards and Gsync.
> 
> Anyone running eyefinity and having issues?
> 
> And does vsync actually work yet ? Lol.


I'm running 3-portrait eyefinity with 1 of each type of port. Tearing has been completely eliminated. It might still be broken in crossfire though. This is the first time I've run eyefinity on a single card. I'll find out as soon as the non-BF4 editions become available.


----------



## Arm3nian

Quote:


> Originally Posted by *Arizonian*
> 
> I recently had issues with BF3 again not starting up with GPU-Z monitoring. Turned it off and BF3 now loads perfectly. I've uninstalled GPU-Z until I see it support 290X.
> 
> I've been using AB to over clock but having issues not being able to add more than 100 Mhz OC to memory and 150 Mhz OC to Core. Unless this is a poor over clocker (which is possible) I'm being limited, as I see most GPU's can get 200 Mzh average OC on both Core and Memory just on air.
> 
> Gaming this is a beast with simple 70% fan (not louder than a GTX 690 full load) I've got temps well under control 84C with 100% GPU usage. Will be nice to have final release drivers and supported AB as well.


Can you just disable the overclocking thing in CCC? I don't like it to be honest. I prefer the old slider









Also, I had issues with EVGA precision x monitoring on bf3.


----------



## Arizonian

Quote:


> Originally Posted by *Arm3nian*
> 
> Can you just disable the overclocking thing in CCC? I don't like it to be honest. I prefer the old slider
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, I had issues with EVGA precision x monitoring on bf3.


Not that I know of. It's been four years since I've been on the red team. I have 'Enable Graphics Overdrive' unchecked, then go to AB and overclock from there, and it's working - even OSD monitoring in my games for core / mem / temp & FPS.


----------



## Deedot78

Please add to list.


----------



## 250179

what i have learned from this launch - never be the guinea pig of an AMD card. i had a chance of being the first (consumer) to have it, glad i passed lol


----------



## szeged

Quote:


> Originally Posted by *gabsonuro*
> 
> what i have learned from this launch - never be the guinea pig of an AMD card, i had chance of being the first (consumer) to have it, glad i passed lol


early adopters of cards on either side are usually riddled with problems


----------



## CallsignVega

What are these cards maxing out at so far?


----------



## szeged

Quote:


> Originally Posted by *CallsignVega*
> 
> What are these cards maxing out at so far?


it seems 1100 to 1200 is about average for users on air here. memory is iffy - some are getting it to 1450+ while others are artifacting and crashing if they even add +1 on the memory.

on ln2 the highest core clock we've seen was something around 1466 i believe, i don't remember what the memory was at.

the 1200ish core clock scores are with 100% fan btw.


----------



## wolfej

Quote:


> Originally Posted by *CallsignVega*
> 
> What are these cards maxing out at so far?


On air the highest I've seen is ~1200, and the guys on LN2 got to 1435 IIRC. Voltage hasn't been upped much yet from what I've read.


----------



## thestache

Quote:


> Originally Posted by *Jared Pace*
> 
> asus bios + gputweak


Thanks for that and all the replies guys.

What kind of clocks on water are people getting? I'm keen on running 1.35v 24/7 if the higher clocks are worth it.

Quote:


> Originally Posted by *jerrolds*
> 
> GSync looks amazing - but only looks like will be supported on 1080p 120/144hz TN panels - which is a shame. My QNIX will live a little longer and will switch to green once GSync is possible on 1440p IPS/PLS panels, assuming Radeon doesnt beat it in some way.


I'm upgrading to those panels for 120hz surround Gsync for when it drops so no dramas for me.

Quote:


> Originally Posted by *rdr09*
> 
> never used eyeinfinity. i read nvidia is having issues with vsync. gsync should resolve that. maybe 2014.


Vsync has worked flawlessly in surround for years. It's the only reason I never kept my Devil 13 R7990 or R7970 Lightning crossfire setup. At the time vsync simply did not even function in the driver, and when it did with RadeonPro you had massive amounts of input lag and/or terrible frame pacing.

Quote:


> Originally Posted by *binormalkilla*
> 
> I'm running 3 portrait eyefinity with 1 of each type of port. Tearing has been completely eliminated. It might still be broken in crossfire though. This is the first time I've run eyeyfinity on a single card. I'll find out as soon as the non bf4 editions become available.


What settings are you running (vsync in driver or game), what panels, any active adapters, and what ports etc.? Thanks - any information would be a great help, and this is probably the only reason I wouldn't try the 290X out in crossfire.


----------



## szeged

hope afterburner gets updated to support voltage tweaking on these cards before monday, don't wanna spend time messing with more bios tweaking lol.

Also want to know the max safe voltage these cards can handle. If 1.25v is stock, then i'm guessing 1.4 to 1.5 should be no sweat.
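Worth keeping in mind how fast power climbs with voltage before calling 1.4-1.5v "no sweat". Dynamic power scales roughly with f·V², so a quick back-of-the-envelope check (illustrative numbers, not measurements of any actual card) looks like:

```python
# Rough dynamic-power scaling: P is roughly proportional to f * V^2.
# The clocks and voltages below are illustrative, not measured values.
stock_v, target_v = 1.25, 1.45   # volts
stock_f, target_f = 1000, 1200   # MHz

power_ratio = (target_f / stock_f) * (target_v / stock_v) ** 2
print(f"~{power_ratio:.2f}x stock power at {target_v}V / {target_f}MHz")
```

So 1.4-1.5v isn't just "a bit over stock" - it asks the VRMs and cooler for roughly half again the stock power, which is why water and beefy VRMs matter here.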


----------



## Sazz

you guys can add me on that list xD

http://imageshack.us/photo/my-images/62/u1wd.png/

GPU-Z Validation link

Anyway, there are quite a lot of pages in this thread already, but does anyone here know how to unlock voltage control? MSI Afterburner 3.0.0 beta seems to work for overclocking, but even after unlocking the EULA thing it still doesn't let you control the voltage.


----------



## Arm3nian

Quote:


> Originally Posted by *thestache*
> 
> Thanks for that and all the replies guys.
> 
> What kind of clocks on water are people getting? I'm keen on running 1.35v 24/7 if the higher clocks are worth it.


We've had maybe only two OCs on water so far. They aren't that much higher than air because the voltage adjustment isn't really "working" atm lol.

Quote:


> Originally Posted by *szeged*
> 
> hope afterburner gets updated to support voltage tweaking on these cards before monday, dont wanna spend time messing with more bios tweaking lol.
> 
> Also want to know the max safe voltage these cards can handle, if 1.25v is at stock, then im guessing 1.4 to 1.5 should be no sweat.


Yeah, I'd rather not go the bios update route unless there is no other choice for voltage increases. I saw a bios that enabled voltage all the way to 2 volts; maybe they tested it at that, maybe they didn't. I'm guessing 1.4-1.5 should be good for water. TSM on a "low end" wc rig (according to him) got it to 1.3, and temps were low, so there is a lot more room.


----------



## Jpmboy

Just got a 290x EK waterblock... now all i need is the card!


----------



## szeged

im just tired of voiding warranties on expensive cards







but ill do it...for the sake of overclocking science.

the vrms on the 290x seem super beefed up vs titans, try to put 1.35v on the titans reference cooler and...going going goneeeeeeee, kaboom.

guess we will have to give it a few months like with the 780s after voltage hacks and see how long it takes for a vrm to pop







have yet to see a titan blow up other than on ln2.


----------



## Sazz

Quote:


> Originally Posted by *Arm3nian*
> 
> We had maybe only two oc's on water so far. They aren't that much higher than air because the voltage adjustment isn't really "working" atm lol.
> Yeah I'd rather not go the bios update route unless there is no other choice for voltage increases. I saw a bios that enabled voltage all the way 2 volts, maybe they tested it at that, maybe they didn't. I'm guessing 1.4-1.5 should be good for water. TSM on a "low end" wc rig (according to him) got it to 1.3, and temps were low, so there is a lot more room.


Yeah, if GPU Tweak for Asus got their voltage control working I don't see why MSI AB won't do that as well; we just have to wait for the update on AB or even Sapphire TriXX. I don't really like the BIOS flash route. For now imma settle with an 1100/1350MHz overclock. The fans don't even reach 100% and my temps don't really go over 80C at 67% fan speed.


----------



## Arm3nian

Quote:


> Originally Posted by *Sazz*
> 
> Yeah, if GPU Tweak for Asus got their voltage control working I don't see why MSI AB won't do that as well, we just have to wait for the update on AB or even Sappire TrixX. I don't really like the BIOS flash route. for now imma settle with 1100/1350mhz overclock. fans doesn't even reach 100% and my temps don't really reach over 80C at 67% fan speed.


AB has always supported volt modding for amd, I see no reason why they wouldn't this time around. I'm fine with bios flash, but just using software w/o flashing is much easier. Who knows, maybe me and others on water are going to need a bios flash anyway for even higher volts.

Side note: What part of vegas are you in?


----------



## Sazz

Quote:


> Originally Posted by *Arm3nian*
> 
> AB has always supported volt modding for amd, I see no reason why they wouldn't this time around. I'm fine with bios flash, but just using software w/o flashing is much easier. Who knows, maybe me and others on water are going to need a bios flash anyway for even higher volts.
> 
> Side note: What part of vegas are you in?


Yeah, I am waiting on my block as well, I'll prolly receive it this coming Wednesday xD

And I am in the North Las Vegas area.

So far on eyefinity I have played Black Ops 2, Tomb Raider, Need for Speed World and Vindictus, and all are working great. Tomb Raider at Ultra settings averages 45fps during gameplay, dipping to 40ish during busy screens (explosions and whatnot). Black Ops 2 on eyefinity is easy as pie, getting over 87fps on average, and the other two games get a constant 60fps (NFS is v-synced and Vindictus is FPS-locked).

I am so glad that the price for this card is just under 600 bucks. It is expensive, but much more worth it than the 780.


----------



## Arm3nian

Quote:


> Originally Posted by *Sazz*
> 
> Yeah I am waiting on my block as well, ill prolly receive it this coming Wednesday xD
> 
> And I am in North Las Vegas area.
> 
> So far on eyefinity I have played Black Ops 2, Tomb Raider, Need For Speed World and Vindictus and all are working great, Tomb Raider at Ultra settings gets on average 45Fps during gameplay dips down to 40ish during busy screen (explosions and whatnot) Black ops 2 on eyefinity is easy as pie, getting over 87Fps on average and so as the other two games gets constant 60Fps (V-synced NFS and Vindictus is FPS locked)
> 
> I am so glad that the price for this card is just under 600bucks, it is expensive but much much worthy than the 780.


If we get voltage unlocking working and better drivers this is going to kill everything.

I ordered 2 ek blocks from frozencpu on thursday, estimated delivery is Friday... they ship it from NY so on the other side of the country.


----------



## BababooeyHTJ

Quote:


> Originally Posted by *rdr09*
> 
> never used eyeinfinity. i read nvidia is having issues with vsync. gsync should resolve that. maybe 2014.


No, Nvidia had some issues with vsync back when the GTX 680 launched, and that was resolved well over a year ago now.

Gsync is completely new tech. That's completely unrelated.

To this day, afaik, you still can't force vsync through the AMD control panel.


----------



## Arm3nian

Quote:


> Originally Posted by *BababooeyHTJ*
> 
> No, Nvidia had some issues with vsync back when GTX680 launched well over a year ago and was resolved well over a year ago now.
> 
> Gsync is a completely new tech. Thats completely unrelated.
> 
> To this day afaik you still can't force vsync through the amd control panel.


What? I had multiple issues with vsync on the 600 series. Adaptive vsync was completely useless for one, and half the time vsync didn't work - I would have to restart the game 10 times or go in through Inspector.

My friend had a 700 series card and reported the same thing.


----------



## thestache

Quote:


> Originally Posted by *BababooeyHTJ*
> 
> No, Nvidia had some issues with vsync back when GTX680 launched well over a year ago and was resolved well over a year ago now.
> 
> Gsync is a completely new tech. Thats completely unrelated.
> 
> To this day afaik you still can't force vsync through the amd control panel.


Yeah I don't know why they can't take the time to make it work, Nvidia have it working flawlessly.

Quote:


> Originally Posted by *Arm3nian*
> 
> What? I had multiple issues with vsync on 600 series. Adaptive vsync was completely useless for one, and half the time vysnc didn't work. I would have to restart the game 10 times or go in through inspector.
> 
> My friend had 700 series and reported the same thing.


Adaptive vsync is the stupidest thing ever. And vsync has always worked for me in surround or single screen: turn it on in the driver and that's it, leave it off in game.


----------



## jomama22

Quote:


> Originally Posted by *thestache*
> 
> Yeah I don't know why they can't take the time to make it work, Nvidia have it working flawlessly.
> Adaptive vsync is the stupidest thing ever. And vsync has always worked for me in surround or single screen. Turn on in the driver and that's it, leave it off in game.


You can force vsync through the 3d profiles. Can set it globally or just game by game.


----------



## utnorris

Finally hit over 15k in 3DMark11. GPU at 1110MHz and memory at 1450MHz.

http://www.3dmark.com/3dm11/7380809


----------



## pounced

Found this hilarious, enjoy!


----------



## selk22

This is probably going to be the last bench I post until I have mine under water. I also may post some game benchmarks later but for now enjoy









This is my max stable OC with the Mem above 1300..

3930k 4.6 & R9 290x 1100/1350


Here is my fan profile if anyone is interested - this helped a lot with OCing the mem.
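For anyone who wants to reproduce a custom fan profile programmatically, a curve like the ones in Afterburner is just piecewise-linear interpolation over (temperature, fan speed) points. The points below are illustrative only, not selk22's actual profile:

```python
# Illustrative fan curve: (temperature C, fan speed %) points,
# linearly interpolated between them, clamped at the ends.
# These exact points are an example, not an actual posted profile.
CURVE = [(40, 30), (60, 50), (75, 70), (85, 90), (95, 100)]

def fan_speed(temp_c):
    """Return fan % for a temperature via linear interpolation over CURVE."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(70))  # 70C falls between the (60, 50) and (75, 70) points
```

The idea is the same one Afterburner applies: ramp the fan harder through the temperature band where the memory starts to artifact.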


----------



## Arm3nian

I was browsing through the noiseblocker site and found this. I guess this is what AMD used.



Here is the link lol: http://www.blacknoise.com/en/products/industry/14/Noiseblocker-NB_IP55_Serie_1225


----------



## SoloCamo

Quote:


> Originally Posted by *selk22*
> 
> This is probably going to be the last bench I post until I have mine under water. I also may post some game benchmarks later but for now enjoy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This is my max stable OC with the Mem above 1300..
> 
> 3930k 4.6 & R9 290x 1100/1350
> 
> 
> Here is my Fan profile if anyone is interested.. This helped a lot with OC the mem


So how much does CPU play a role here?

Scored 60fps in Valley at the same clocks, but the minimum was 22.9fps. Stock 8350 vs 3930K @ 4.6 - I would assume that even in a GPU-bound situation it must help out here.


----------



## Sazz

Quote:


> Originally Posted by *Arm3nian*
> 
> I was browsing through the noiseblocker site and found this. I guess this is what AMD used.
> 
> 
> 
> Here is the link lol: http://www.blacknoise.com/en/products/industry/14/Noiseblocker-NB_IP55_Serie_1225


I don't know about you, but mine has so far only maxed out at 67% fan speed, and it's still a lot less noisy than the 6870 I was using before I plugged in the 290X.


----------



## Arizonian

Quote:


> Originally Posted by *Sazz*
> 
> I don't know about you, but mine so far only maxed out at 67% fan speed and its still a lot less noisier than the 6870 that I was using before I plugged in the 290x


I agree. Other than benching, with the manual fan setting I have set for gaming I'm hovering at 85C on 70% fan. No louder than my 690 at full load (85% fan), and it reminds me of my 580's temps when I had it overclocked to 950MHz core with no memory OC - it got to 92C most of the time under full load.


----------



## rdr09

Quote:


> Originally Posted by *BababooeyHTJ*
> 
> No, Nvidia had some issues with vsync back when GTX680 launched well over a year ago and was resolved well over a year ago now.
> 
> Gsync is a completely new tech. Thats completely unrelated.
> 
> To this day afaik you still can't force vsync through the amd control panel.


here is another solution - post 353

http://www.overclock.net/t/1397117/tpu-asus-unveil-39inch-4k-va-monitor/350


----------



## ABD EL HAMEED

I wanna ask how this card compares to a 780 clock per clock


----------



## binormalkilla

Quote:


> Originally Posted by *thestache*
> 
> What settings are you running (vsync in driver or game), what panels, any active adapters or anything and what ports etc? Thanks for any information that'd be a great help and this is probably the only reason I wouldn't try the 290X out in crossfire.


I almost always leave VSync set to let the app decide, unless the in-game setting doesn't work for some reason. In that case I'll create a game-specific profile and force it.

So far the only game I've tested is Planetside 2. I'm on Windows 8. I don't experience any tearing on the desktop or in game. Crossfire + Eyefinity might still be broken, but when I get my second card I'll post results.

Also, for those interested, the latest 13.11 beta Linux driver works with Eyefinity in Linux. Tear-free desktop seems to work as well.


----------



## raghu78

Quote:


> Originally Posted by *Arizonian*
> 
> I agree. Other than benching obviously with a manual fan setting gaming I have set, I'm hovering 85C at 70% fan. No louder than my 690 full load (85% fan) and reminds me of my 580 temps when I had it over clocked to 950 Mhz Core and no memory OC where it got to 92C most of the time with full loads.


lots of Nvidia users jumped on the 'too loud' campaign. now that we hear from actual users who are using the card, it seems AMD has done a decent job for the price.


----------



## Arm3nian

Quote:


> Originally Posted by *ABD EL HAMEED*
> 
> I wanna ask how this card compares to a 780 clock per clock


Most benches show that the 290X at 1100MHz equals a 780 at 1300-1350MHz.
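Put as a quick back-of-the-envelope calculation (just the arithmetic implied by those quoted clocks, nothing more):

```python
# If a 290X at 1100MHz matches a 780 at 1300-1350MHz, the 290X is doing
# roughly 18-23% more work per clock. Pure arithmetic on the quoted clocks;
# it says nothing about how either card scales beyond them.
r9_clock = 1100
gtx780_low, gtx780_high = 1300, 1350

low = gtx780_low / r9_clock
high = gtx780_high / r9_clock
print(f"290X per-clock advantage: {low:.2f}x-{high:.2f}x")
```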


----------



## ABD EL HAMEED

Quote:


> Originally Posted by *Arm3nian*
> 
> Most benches show that the 290x at 1100mhz equals a 780 at 1300-1350mhz.


Nice, now how does it compare to a Titan, again clock for clock?


----------



## raghu78

Quote:


> Originally Posted by *Arm3nian*
> 
> Most benches show that the 290x at 1100mhz equals a 780 at 1300-1350mhz.


yeah that's probably correct. if you have an R9 290X which is not throttling at 1.1GHz, perf is on par with GTX 780 custom cards at 1.3GHz. the point is you need to run fan speed at 80-90% to avoid throttling on the stock cooler.


----------



## Tobiman

I was thinking the Prolimatech MK-26 would be good for this card for those that must stay on air.
http://www.prolimatech.com/en/products/detail.asp?id=2444


----------



## selk22

Quote:


> Originally Posted by *SoloCamo*
> 
> So how much does CPU play a role here?
> 
> Scored a 60fps in Valley at the same clocks, but minimum was at 22.9fps. Stock 8350 vs 3930k @ 4.6 I would assume even in a gpu bound situation must help out here


Not sure, but I know RAM timing also has something to do with minimum frames. I am at 1866, 8GB. It never hurts to have a faster CPU or at least a decent OC. If I were you I would try to push that 8350 a little.. This is OCN


----------



## Arm3nian

Quote:


> Originally Posted by *ABD EL HAMEED*
> 
> Nice,now how does it compare to a titan again clock per clock


Most would agree it is 1:1 with the Titan, 1% error + or -.
Watercooled Titans can get to ~1300MHz. With volt mods, the 290X should be able to pass that.
Quote:


> Originally Posted by *raghu78*
> 
> yeah thats probably correct. if you have a R9 290X which is not throttling at 1.1 ghz perf is on par with GTX 780 custom cards at 1.3 Ghz. the point is you need to run fan speed at 80 - 90% to avoid throttling on stock cooler.


Yeah as long as the 290x isn't throttling you're good to go in terms of performance. I'm throwing blocks on mine so I don't need to worry about that. Just waiting for voltage unlocks in AB for now to see how far it can go under water without a bios flash. If it is volt limited rather than temp limited at that point, I'm all in for a bios flash.


----------



## skupples

Quote:


> Originally Posted by *jerrolds*
> 
> GSync looks amazing - but only looks like will be supported on 1080p 120/144hz TN panels - which is a shame. My QNIX will live a little longer and will switch to green once GSync is possible on 1440p IPS/PLS panels, assuming Radeon doesnt beat it in some way.


It's already been confirmed on 1440p as well. A list quoted from NV is in the G-sync thread.

Glad to see bios flashing is now an acceptable way to overclock...


----------



## jjjc_93

I picked up my card yesterday arvo, and people weren't kidding when they said these things were loud and hot, damn! Loving it though, the performance is looking really good, LN2 next











Also has anybody tried to remove the stock cooler? I've taken all my screws out, including the ones on the PCIE bracket and it still feels pretty well fixed to the card.


----------



## utnorris

So the max I can pull is 1100MHz; anything above that I start getting artifacts. I tried moving the power limit up, but that seems to have zero effect on overclocking. Hopefully voltage control will fix that. As far as memory goes, I went to 1450 with no issues; I got tired of benching, otherwise I would have gone higher. While I am still happy with the card, I am hoping voltage control allows it to go into the 1200+ range on the GPU. I will be under water, so temps should not be an issue, and at that overclock I should be able to match my previous dual HD 7970s I had overclocked. They were not the best, but I would like to match them in power now that I am on a mITX board.


----------



## Arm3nian

Quote:


> Originally Posted by *jjjc_93*
> 
> I picked up my card yesterday arvo, and people weren't kidding when they said these things were loud and hot, damn! Loving it though, the performance is looking really good, LN2 next
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Also has anybody tried to remove the stock cooler? I've taken all my screws out, including the ones on the PCIE bracket and it still feels pretty well fixed to the card.


Nice! Looking forward to ln2 results. Especially when we get the volt mods completely figured out.


----------



## jjjc_93

I got the cooler off just now - it just took a good firm pull to get it off the card.

There are hardmods and a modded bios available already, use at your own risk:

http://kingpincooling.com/forum/showthread.php?t=2473


----------



## Iamanerd

Hey guys, so I got my 290X yesterday and I'll post some results shortly, but has anyone tried redeeming the BF4 code yet? I keep getting an invalid UID and I'm wondering if anyone else is running into this issue. I might just buy BF4 if need be, but I'd rather not since I got this code.


----------



## MIGhunter

Quote:


> Originally Posted by *Iamanerd*
> 
> Hey guys, so I got my 290x yesterday and I'll post some results shortly but has anyone tried redeeming the BF4 code yet? I keep getting invalid UID and I'm wondering if anyone else is running in to this issue? I might just buy BF4 if need be but I'd rather not as I got this code.


nobody has gotten their code to work. Maybe try when it's released in a couple of days.


----------



## pounced

Quote:


> Originally Posted by *Iamanerd*
> 
> Hey guys, so I got my 290x yesterday and I'll post some results shortly but has anyone tried redeeming the BF4 code yet? I keep getting invalid UID and I'm wondering if anyone else is running in to this issue? I might just buy BF4 if need be but I'd rather not as I got this code.


No one can redeem it till launch day or the night before


----------



## Arizonian

Quote:


> Originally Posted by *jjjc_93*
> 
> I picked up my card yesterday arvo, and people weren't kidding when they said these things were loud and hot, damn! Loving it though, the performance is looking really good, LN2 next
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Also has anybody tried to remove the stock cooler? I've taken all my screws out, including the ones on the PCIE bracket and it still feels pretty well fixed to the card.


Great score. Congrats. Got you added.









_I got you down as a Sapphire. Let me know if it's any different._


----------



## Arm3nian

Quote:


> Originally Posted by *jjjc_93*
> 
> I got the cooler off just now, just took a good firm grab off the card.
> 
> There are hardmods and a modded bios available already, use at your own risk:
> 
> http://kingpincooling.com/forum/showthread.php?t=2473


Yeah I saw that yesterday, are there any benches using it?

This one was at stock volts lol. 1450MHz+ at stock... crazy. Unless GPU Tweak isn't reporting the voltage correctly.


----------



## modinn

Quote:


> Originally Posted by *Iamanerd*
> 
> Hey guys, so I got my 290x yesterday and I'll post some results shortly but has anyone tried redeeming the BF4 code yet? I keep getting invalid UID and I'm wondering if anyone else is running in to this issue? I might just buy BF4 if need be but I'd rather not as I got this code.


-snip others answered the question-

Looking forward to your test results


----------



## jjjc_93

Quote:


> Originally Posted by *Arm3nian*
> 
> Yeah I saw that yesterday, are there any benches using it?
> 
> This one was at stock volts lol. 1450mhz+ at stock... crazy. Unless Gpu tweak isn't reporting the voltage correctly.


Yeah, he's using hard mods there so the software won't see what volts he's running; the LN2 is helping him too.

It's still a new card, so people are still trying to find them, mod them, and work it all out before really benching, but you'll see some nice results soon. I flashed the BIOS with PT3.rom and it works fine on air.


----------



## SoloCamo

Quote:


> Originally Posted by *selk22*
> 
> Not sure but I know ram timing also has something to do with minimum frames. I am at 1866 8gb. I know it never hurts to have a faster CPU or a least a decent OC. If I was you I would try to push that 8350 a little.. This is OCN


Would, but the motherboard won't support it due to poor power phases. I have an FX-9590 on the way (no, I didn't buy it myself; it was gifted to me, and I asked them to return it but it was too late) and will be ordering a Crosshair V tonight, so I should at least be at 4.7-4.8, which I'd imagine would help somewhat alleviate any bottlenecks.


----------



## Arizonian

Quote:


> Originally Posted by *raghu78*
> 
> yeah thats probably correct. if you have a R9 290X which is not throttling at 1.1 ghz perf is on par with GTX 780 custom cards at 1.3 Ghz. the point is you need to run fan speed at 80 - 90% to avoid throttling on stock cooler.


I opened my case when benching last time and felt the top and bottom of the GPU, and it was *not* hot. All that hot air was being exhausted out the rear bracket, outside the case.

I'm leaning toward getting a Sapphire Toxic w/backplate and putting it on top. I think the combo will work out just fine for air. Overkill for gaming (just as I like it), so I will never see dips under 60 FPS for the next year or two with maxed game settings. Since I won't be benching the crossfire config, with my AX850 watt PSU I can safely run them both at 1100/1350 clocks. Can't wait for the non-reference cards to come... hoping for something under the tree before Christmas.


----------



## Arm3nian

Quote:


> Originally Posted by *jjjc_93*
> 
> Yeah he's using hard mods there so the software won't see what volts he is running at, the LN2 is helping him too.
> 
> It's still a new card so people are still trying to find them and then mod them and work it all out before really benching, but you'll see some nice results soon. I flashed the bios with PT3.rom and it works fine on air.


What kind of voltage does that BIOS allow? Do you think it will be able to provide enough volts for water or not?


----------



## sugarhell

Quote:


> Originally Posted by *jjjc_93*
> 
> Yeah he's using hard mods there so the software won't see what volts he is running at, the LN2 is helping him too.
> 
> It's still a new card so people are still trying to find them and then mod them and work it all out before really benching, but you'll see some nice results soon. I flashed the bios with PT3.rom and it works fine on air.


Can you check the memory voltage on GPU Tweak with the modded BIOS? Is it really 1.8, or is it just buggy?


----------



## jjjc_93

Quote:


> Originally Posted by *Arm3nian*
> 
> What kind of voltage does that bios allow? Do you think it will be able to provide enough volts for water or not.


It allows up to 2v, which you do not want to apply, so more than enough for water cooling.
Quote:


> Originally Posted by *sugarhell*
> 
> Can you check the memory voltage on GPU Tweak with the modded BIOS? Is it really 1.8, or is it just buggy?


I'm soldering on my voltage measuring points now so I'll get back to you in a bit with accurate voltages.


----------



## yawa

Count me in come mid November for an R9 290 (which I will immediately put under water), as if the leaked benches are true, it is so close to a 290X (and, importantly, way closer to my price range, especially after I sell my GTX 670) that I cannot ignore it. Originally I was going to hold off, see what the high-end Steamroller looks like, and use my money to get that and a high-end FM2+ mobo, but these cards are so good I don't think I can wait.

So come November: R9 290, EK water block, and a 360mm radiator for me. In the meantime guys, go give 'em hell in the bench-off thread.

BTW, anyone got theirs under water yet? If so, detailed synopsis please: temps, setup, and benchmarking results ASAP. Thank you.


----------



## sugarhell

Ok thanks man


----------



## KyGuy

Talk about overclocking, this is going to be a big temp jump in my case. Right now, my 680 4GB is overclocked to 1150 on the core and 3305 on the memory. It only maxes at 65C!! The R9s seem to run around 85-90 at 65% fan. I run that right now on my 680 as a custom fan profile. Burning!!!


----------



## djriful

Quote:


> Originally Posted by *selk22*
> 
> This is probably going to be the last bench I post until I have mine under water. I also may post some game benchmarks later but for now enjoy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This is my max stable OC with the Mem above 1300..
> 
> 3930k 4.6 & R9 290x 1100/1350
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here is my Fan profile if anyone is interested.. This helped a lot with OC the mem


Quote:


> Originally Posted by *Arm3nian*
> 
> Most benches show that the 290x at 1100mhz equals a 780 at 1300-1350mhz.


Mmh.... I take it that a 780 can surpass a TITAN at stock clocks.

FPS:

*68.4*

Score:

*2864*

Min FPS:

*31.4*

Max FPS:

*133.1*

Those are my scores, and I run at 4.7GHz with the GPU at 1050-1075MHz core and +200MHz on memory only. So how can a 290X at 1100MHz equal a 780 at 1300MHz?!? Are the 780s not running on hexacore CPUs? GPU BIOS is at stock atm until my water setup is ready to go.


----------



## Sgt Bilko

Quote:


> Originally Posted by *djriful*
> 
> Mmh.... I take it that a 780 can surpass a TITAN at stock clocks.
> 
> FPS:
> *68.4*
> Score:
> *2864*
> Min FPS:
> *31.4*
> Max FPS:
> *133.1*
> 
> Those are my scores, and I run at 4.7GHz with the GPU at 1050-1075MHz core and +200MHz on memory only. So how can a 290X at 1100MHz equal a 780 at 1300MHz?!? Are the 780s not running on hexacore CPUs? GPU BIOS is at stock atm until my water setup is ready to go.


http://www.overclock.net/t/1436635/ocn-gk110-vs-hawaii-bench-off-thread/230


----------



## Arm3nian

Quote:


> Originally Posted by *djriful*
> 
> Mmh.... I take it that a 780 can surpass a TITAN at stock clocks.
> 
> FPS:
> *68.4*
> Score:
> *2864*
> Min FPS:
> *31.4*
> Max FPS:
> *133.1*
> 
> Those are my scores, and I run at 4.7GHz with the GPU at 1050-1075MHz core and +200MHz on memory only. So how can a 290X at 1100MHz equal a 780 at 1300MHz?!? Are the 780s not running on hexacore CPUs? GPU BIOS is at stock atm until my water setup is ready to go.


I'm not sure what you mean. The overclocked 780s surpass both the Titan and the 290X at stock clocks. But a 290X at 1100MHz is not stock; it has an overclock.

It would also be helpful if you listed what you're running...


----------



## MIGhunter

Code:

10/26/2013 03:04:00  ARRIVAL SCAN  INDIANAPOLIS, IN, US

If my card sits in the warehouse 45 minutes away until Monday (meaning I won't get it until Tuesday) I'm going to be pissed!


----------



## SonDa5

Quote:


> Originally Posted by *jjjc_93*
> 
> Also has anybody tried to remove the stock cooler? I've taken all my screws out, including the ones on the PCIE bracket and it still feels pretty well fixed to the card.


Yes. Make sure you get all screws. It is snug but does come off. Be careful.


----------



## Arizonian

Quote:


> Originally Posted by *Deedot78*
> 
> Please add to list.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - Welcome aboard.









Sorry I missed you. If anyone else was missed please PM me.


----------



## Sazz

Here are my runs with 3DMark 11 and Firestrike so far, both stock and overclocked, paired up with my FX-8350 at 5GHz.

3Dmark 11 Stock clocks score

3Dmark 11 OC score

3Dmark Firestrike Stock clocks score

3Dmark Firestrike OC score

Will run unigine benches later on, got to go back on playing games xD


----------



## selk22

Hey Arizonian,

How did you end up OCing your card to 1150 and 1350?

Using AB? Also, did you need to flash a new BIOS or use the Asus volt control?

I am using a Sapphire card, and if I even take the mem over 1300-1325ish I start having issues.

Anyone think a custom BIOS and being on water will fix this issue? I don't want to put a card under water if I won't ever be able to push it past 1350 on the mem.


----------



## wholeeo

How bad would an i3 bottleneck one of these bad boys?


----------



## yawa

The EK people say putting her under water nets 1200+MHz easy, but I'm dying to know this memory clocking info as well. We'll just have to hope some hero of the late-night overclocking card club emerges tonight, with a fully water-blocked card, ready to deliver us from the 95C temperature dragon.


----------



## DampMonkey

Quote:


> Originally Posted by *yawa*
> 
> The EK people say putting her under water nets 1200+MHz easy, but I'm dying to know this memory clocking info as well. We'll just have to hope some hero of the late-night overclocking card club emerges tonight, with a fully water-blocked card, ready to deliver us from the 95C temperature dragon.


Their exact quote was 1200/1600 easy, with 54° load temps in FurMark. It's gonna be a long two days till my block comes in....


----------



## selk22

I really hope so.. Not being able to touch this mem right now is slightly discouraging..

I may try the Asus BIOS and GPU Tweak.. I have never flashed a GPU BIOS, but it can't be too hard, right?


----------



## yawa

Good lord, my selective reading problem missed that one. I am getting so impatient myself; at least you don't have to wait till mid November.

I'll be joining you in that vast sea of overclocking coolness come then, though.


----------



## formula m

It's been three days... Newegg still out of them...!!!!!!


----------



## psyside

Quote:


> Originally Posted by *$ilent*
> 
> but it only seems to do it when I touch the memory. Still sound faulty?


Hmm, not sure, but I think it's an issue with AB. Uninstall it, uninstall any OC software, clean the registry, and then start over.
Quote:


> Originally Posted by *Forceman*
> 
> I think you're the second person to report artifacting in Windows. Wonder what's going on.


I think it's an issue with AB controlling the card's voltage and messing it up.


----------



## jjjc_93

Quote:


> Originally Posted by *sugarhell*
> 
> Ok thanks man


Finally got around to doing it and memory is only using 1.5v, not 1.8.

Link to album of soldering pics:


http://imgur.com/Eez3c


----------



## Forceman

Quote:


> Originally Posted by *MIGhunter*
> 
> Code:
> 
> 
> 10/26/2013 03:04:00  ARRIVAL SCAN  INDIANAPOLIS, IN, US
> 
> If my card sits in the warehouse 45 minutes away until Monday (meaning I won't get it until Tuesday) I'm going to be pissed!


Mine's been sitting at the Fedex facility since Friday evening. I called to see if I could come pick it up, but they said it was going to be in the shipping container until they unpacked it Monday morning.









Can someone who's taken the cooler off confirm if the die is recessed from the support bezel thing around it? So you'd need a shim to mount a universal-type cooler. Or is it flush or raised above it like on Nvidia cards?


----------



## jjjc_93

Quote:


> Originally Posted by *Forceman*
> 
> Mine's been sitting at the Fedex facility since Friday evening. I called to see if I could come pick it up, but they said it was going to be in the shipping container until they unpacked it Monday morning.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can someone who's taken the cooler off confirm if the die is recessed from the support bezel thing around it? So you'd need a shim to mount a universal-type cooler. Or is it flush or raised above it like on Nvidia cards?


It's like Nvidia this time, no shims needed.


----------



## broken pixel

Reserved! :*)


----------



## Forceman

Quote:


> Originally Posted by *jjjc_93*
> 
> It's like Nvidia this time, no shims needed.


Sweet, because the shims I ordered are apparently taking the scenic route from China.


----------



## Sgt Bilko

EDIT: never mind, a mod was doing a clean-out.


----------



## Arizonian

Quote:


> Originally Posted by *Sgt Bilko*
> 
> OK the OCN GK110 vs. Hawaii Bench-off thread got locked, guess i'm not going to be posting results in there anytime soon.


It's going to be re-opened again soon, and we can continue. There are a lot of good members on OCN who are truly in pursuit of performance and who seriously want to compare without the degradation in discussion that took place.

I myself would love to see those who are putting these under water and how they compare to the other cards in the competition between the Titan, 780, 290X and 290.

Stand by.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Arizonian*
> 
> It's going to be re-opened again soon, and we can continue. There are a lot of good members on OCN who are truly in pursuit of performance and who seriously want to compare without the degradation in discussion that took place.
> 
> I myself would love to see those who are putting these under water and how they compare to the other cards in the competition between the Titan, 780, 290X and 290.
> 
> Stand by.


I saw that, thanks for the clean-out. Also looking forward to water results; I want to see if it's worth me going CrossFire and doing a full water loop later on.


----------



## xxmastermindxx

Add me! Card is stock right now, but will be wet sometime next week, depending on how fast the EK block comes in







Where'd you guys order your blocks from?

As you can see, 100% GPU usage while at desktop, but only when GPU-Z is open. Hmm


----------



## broken pixel

Quote:


> Originally Posted by *VaporX*
> 
> Hey guys, I was told there is a question about warranty voiding with aftermarket coolers. This is correct: if you remove the cooler you will be voiding the warranty. In fact, it is safest to assume this is the case for pretty much every manufacturer unless they have a very explicit statement otherwise, obviously posted.
> 
> ^ I have spoken to many MSI techs and they do not void the warranty if you install aftermarket cooling on your GPU. Even if you put a Phillips head through the "warranty void if removed" sticker, it will not void the warranty.
> 
> I have done two RMAs with the warranty-void stickers punctured on GPUs. MSI also upgraded my two 7950s to 7970 Lightning BEs since they could not replace my 7950s with a same-SKU 7950.
> 
> ****! Most of the time factory TIM jobs are horrid and they use way too much.
> 
> Buy MSI if you plan on aftermarket cooling.
> 
> As for which memory is used, I have passed this up the ladder to get you guys an answer.


----------



## Arizonian

Quote:


> Originally Posted by *xxmastermindxx*
> 
> Add me! Card is stock right now, but will be wet sometime next week, depending on how fast the EK block comes in
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As you can see, 100% GPU usage while at desktop, but only when GPU-Z is open. Hmm
> 
> 
> 
> Spoiler: Warning: Spoiler!


Added









Yeah, GPU-Z is also preventing BF3 from starting for me. Until GPU-Z updates with support, I'd suggest leaving it closed.

Once you get your blocks, post it, and for those members who do I will update the list.


----------



## RocketAbyss

Quote:


> Originally Posted by *Arizonian*
> 
> Added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yea GPU-Z is also messing up BF3 from being able to be started with me. Until GPU-Z updates it with support I'd suggest leaving it closed.
> 
> Once you get your blocks, post it and for those members who do, I will update the list.


Ditto. GPU-Z caused my card to be pegged at 100% at idle on the desktop. It also caused every fullscreen application to crash after multiple 5-second blackouts, and this continued to happen even when I had already closed GPU-Z. It seems rebooting will get rid of it (duh) until you reopen GPU-Z again. My result: I rebooted my system, tried not having GPU-Z open, and none of the above-mentioned symptoms reappeared. So for now, GPU-Z is dead to me.


----------



## MIGhunter

Quote:


> Originally Posted by *xxmastermindxx*
> 
> As you can see, 100% GPU usage while at desktop, but only when GPU-Z is open. Hmm


In case you missed it in the past 50 pages, that's a known bug with the current GPU-Z.


----------



## xxmastermindxx

Quote:


> Originally Posted by *MIGhunter*
> 
> In case you missed it in the past 50 pages, that's a known bug with the current cpu-z


Yeah. Thanks. I've skimmed the thread.


----------



## Sazz

Quote:


> Originally Posted by *Sazz*
> 
> you guys can add me on that list xD
> 
> http://imageshack.us/photo/my-images/62/u1wd.png/
> 
> GPU-Z Validation link
> 
> Anyway, there are quite a lot of pages in this thread already, but does anyone here know how to unlock the voltage control? MSI Afterburner 3.0.0 beta seems to work for overclocking, but even after unlocking the EULA thing it still doesn't let you control the voltage.


I just noticed I forgot to mention the manufacturer and cooling info. It's from Sapphire, and for now it's at stock; soon it will be wet xD

Hopefully by Wednesday.


----------



## Arizonian

Quote:


> Originally Posted by *Sazz*
> 
> I just noticed I forgot to mention the manufacturer and cooling info. It's from Sapphire, and for now it's at stock; soon it will be wet xD
> 
> Hopefully by Wednesday.


Added


----------



## lordzed83

OK guys, I flashed my Sapphire with the Asus BIOS.
The memory clock jumping is still here, so it is a Windows 8.1/drivers issue.

At the moment I've run out of ideas on how to fix it, so I just use the lowest possible MSI Afterburner profile while sitting in Windows.


----------



## fleetfeather

I wonder if any users suffering from memory clock issues have the option to dual-boot into Linux or Win7...









Props to all of you early adopters for persisting with the release issues, though. I wish I could be there in the mix with you all, haha.


----------



## lordzed83

Update: uninstalled GPU-Z.
Using GPU Tweak I made a profile at 1100/[email protected].
Card idles at 500/2500 settings.

The memory clock jumping issue is GONE!!!!
Thanks Asus, I should have paid you those extra 20 pounds for giving me an option to fix the issue.


----------



## thestache

Quote:


> Originally Posted by *selk22*
> 
> This is probably going to be the last bench I post until I have mine under water. I also may post some game benchmarks later but for now enjoy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This is my max stable OC with the Mem above 1300..
> 
> 3930k 4.6 & R9 290x 1100/1350
> 
> 
> Here is my Fan profile if anyone is interested.. This helped a lot with OC the mem


That's really low. What other scores are people getting with 290Xs?

Memory plays a huge role in Valley; I'm normally running 7200MHz. My GTX Titan at 1200MHz core and 7200MHz memory is around 82-83 FPS and a 3400-3460 score in Valley. At 1.35v it's a whole other story.


----------



## Scotty99

I see lots of people in this thread having problems with their cards in normal everyday use. Is this ONLY related to memory clocks, or are people experiencing black screens/crashes/artifacts at stock clocks as well? (I've heard of other problems too, but these seem to be the most frequent.)


----------



## Euda

Quote:


> Originally Posted by *RocketAbyss*
> 
> Ditto. GPU-Z caused my card to be pegged at 100% at idle on the desktop. It also caused every fullscreen application to crash after multiple 5-second blackouts, and this continued to happen even when I had already closed GPU-Z. It seems rebooting will get rid of it (duh) until you reopen GPU-Z again. My result: I rebooted my system, tried not having GPU-Z open, and none of the above-mentioned symptoms reappeared. So for now, GPU-Z is dead to me.


Exactly that happened to me


----------



## famich

I got a Sapphire from the UK; so far I'm able to run Heaven at 1200MHz on the silent BIOS. I will be posting my validation info later on.
Guys, any support from Sapphire, e.g. Trixx unlocking volts etc.?


----------



## TheSoldiet

Quote:


> Originally Posted by *famich*
> 
> I got a Sapphire from the UK; so far I'm able to run Heaven at 1200MHz on the silent BIOS. I will be posting my validation info later on.
> Guys, any support from Sapphire, e.g. Trixx unlocking volts etc.?


Are you sure you are not throttling? 1200 seems very high for the ref cooler.


----------



## psyside

One more time: if you want to see whether you have a hardware problem or a software one, there is a simple trick.

Uninstall any OC/monitoring software completely, from the Control Panel, and clean it out of the registry.

Then reboot and run any intensive game benchmark for about 15 minutes, or do the same tasks which produced the problems before. If this doesn't help and you want to make sure, format, then bench and play games with no OC/monitoring tools, using only CCC for OC. That's all.
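The isolation procedure above boils down to a short checklist. Here is a dry-run sketch (the steps are only printed, nothing is uninstalled for real, and the tool names are examples rather than an exhaustive list):

```shell
# Dry-run checklist for isolating hardware vs. software faults.
# Nothing here executes a real uninstall; it only prints the steps.
step() { echo "[$1] $2"; }

step 1 "Uninstall every OC/monitoring tool (Afterburner, GPU Tweak, Trixx...) via Control Panel"
step 2 "Clean their leftover registry entries, then reboot"
step 3 "Run a demanding benchmark for ~15 minutes with no OC tools loaded"
step 4 "Still broken? Format, then bench/game with only CCC handling the overclock"
```

If the symptoms survive step 4, software is effectively ruled out and the card itself becomes the prime suspect.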


----------



## selk22

Quote:


> Originally Posted by *thestache*
> 
> That's really low. What other scores are people getting with 290Xs?
> 
> Memory plays a huge role in valley, I'm normally running 7200mhz. My GTX Titan at 1200mhz core and 7200mhz memory is around 82-83FPS and 3400-3460 score in valley. At 1.35v it's a whole other story.


Yeah, unfortunately until I can tweak the volts it's a crapshoot..

But on the plus side, I got the OC to 1100/1400, and it seems to like 1400 a lot more than anything in the 1300s for some reason? Still not able to push that core until I can get more juice into this baby.


----------



## szeged

flash the asus bios and get voltage control









flashing takes all of 3 seconds and is pretty safe unless you decide you want to falcon punch your card out of the pci-e slot while doing it
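For anyone following this suggestion, the sequence people usually describe with ATIFlash looks roughly like the sketch below. It is a dry run: the commands are only echoed, and the adapter index, switch names, and ROM filenames are assumptions you should verify against your own atiflash output before flashing anything.

```shell
# Dry-run sketch of a typical ATIFlash session (commands are printed, not run).
ADAPTER=0                 # first GPU as listed by `atiflash -i` (assumed)
BACKUP=stock_290x.rom     # always dump the stock BIOS before touching anything
NEWROM=pt3.rom            # the Asus PT3 BIOS discussed in this thread

run() { echo "+ $*"; }    # swap `echo` for real execution once you're sure

run atiflash -s "$ADAPTER" "$BACKUP"   # 1) save the current BIOS as a backup
run atiflash -p "$ADAPTER" "$NEWROM"   # 2) program the new BIOS
run shutdown /r /t 0                   # 3) reboot so the card re-reads the BIOS
```

Keeping the backup ROM around means a bad flash can usually be undone by flashing `stock_290x.rom` back the same way.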


----------



## selk22

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *szeged*
> 
> flash the asus bios and get voltage control
> 
> 
> 
> 
> 
> 
> 
> 
> 
> flashing takes all of 3 seconds and is pretty safe unless you decide you want to falcon punch your card out of the pci-e slot while doing it






Yeah, I downloaded the ROM but was unsure of exactly the safest way to do this on the 290X.. I didn't want to rush anything if MSI AB or Sapphire may release voltage control sometime soon.

Like this?


----------



## Sgt Bilko

Quote:


> Originally Posted by *szeged*
> 
> flash the asus bios and get voltage control
> 
> 
> 
> 
> 
> 
> 
> 
> 
> flashing takes all of 3 seconds and is pretty safe unless you decide you want to falcon punch your card out of the pci-e slot while doing it


That's my first step, unless Sapphire releases a Trixx version that supports volt mods.


----------



## maarten12100

The cheapest webshop can deliver the 290X to me on the 23rd of November, and the quickest can deliver it in 5 working days... (those 5 working days are for delivery to them; packing and shipping will take another 3-5 days).

Does anybody know a place in Germany or the Netherlands that has availability at a competitive price? (I don't need the BF4 ultimate edition, btw.)


----------



## kcuestag

Quote:


> Originally Posted by *maarten12100*
> 
> The cheapest webshop can deliver the 290X to me on the 23rd of November, and the quickest can deliver it in 5 working days... (those 5 working days are for delivery to them; packing and shipping will take another 3-5 days).
> 
> Does anybody know a place in Germany or the Netherlands that has availability at a competitive price? (I don't need the BF4 ultimate edition, btw.)


Check Caseking.de or Hardwareversand.de; those two ship to all of Europe, and so does Alternate.de.


----------



## Ukkooh

Is 515€ for a 290X good, with shipping included in the price? I'm not sure if I should order now or wait for non-reference cards. It is an MSI card though, so I'd lose the warranty if I replaced the reference cooler...


----------



## selk22

Quote:


> Originally Posted by *Ukkooh*
> 
> Is 515€ for a 290X good, with shipping included in the price? I'm not sure if I should order now or wait for non-reference cards. It is an MSI card though, so I'd lose the warranty if I replaced the reference cooler...


If you don't plan to WC it, then just wait for the non-reference cooler.


----------



## Ukkooh

Quote:


> Originally Posted by *selk22*
> 
> If you don't plan to WC it, then just wait for the non-reference cooler.


I'm planning on putting a H100i on it. So should I pull the trigger?


----------



## selk22

Quote:


> Originally Posted by *Ukkooh*
> 
> I'm planning on putting a H100i on it. So should I pull the trigger?


Like with a Dwood mount?

Up to you. I don't regret buying mine.


----------



## maarten12100

Quote:


> Originally Posted by *kcuestag*
> 
> Check Caseking.de or Hardwareversand.de, those two ship to all Europe, even Alternate.de.


Checked them; delivery date unknown, sadly.
That 5 working days was from Alternate NL. Guess I'll just have to wait until next week, when it should be in stock.


----------



## K1llrzzZ

Hey, I ordered 2 R9 290X cards and I want to use them in CF. I just want to ask: will my configuration be enough for these two cards? I have an 850W Corsair PSU. I will not overclock the cards, but I will probably use them in "uber" mode. I have an ASRock Z77 Extreme4 mobo, not the most expensive, but it supports CF in 8x/8x if I'm right (will that hold back the cards?), and I have an i7 3770K CPU clocked at 4.5GHz and a Cooler Master HAF 932 case. So my main concerns are the PSU (will 850W be enough?), the mobo (will the 8x/8x PCIe hold the cards back?), the CPU (will there be a bottleneck?), and the heat. Both cards are reference cards, but I was told they could be better for CF because custom-cooler cards are bigger, which means there is less space between the two cards. Of course I know liquid cooling would be the best, but as I said I'm not going to OC the cards. I know they will be very loud, but that doesn't bother me at all. Or is there anything that I haven't considered that might cause a problem?


----------



## Scotty99

Ya 850 is more than enough, even with them overclocked.


----------



## K1llrzzZ

Thanks for the fast answer. What about the other things? Will the mobo or the CPU bottleneck the cards in any way?


----------



## Arizonian

Quote:


> Originally Posted by *K1llrzzZ*
> 
> Thanks for the fast answer.What about the other things?Will the mobo or the CPU bottleneck the cards in any way?


No more than it would with a crossfire bridge, I'm assuming. I've read that these cards are fully saturating PCIe 3.0 bandwidth, finally being put to use. So it's going to be interesting to see the difference in performance between 2.0 and 3.0, and it may make 4.0 more relevant when it releases. I've read there was no penalty from running one in a 16x slot and the other in an 8x slot, though. So there are going to be some interesting comparisons regarding this soon.

PS: I'm going to be watching you closely with your crossfire reference cards regarding temperatures and overclocks. I was thinking of getting a non-reference cooler for the top slot, but if you turn out okay I may just go double crossfire reference. I put my hand on the top of the card and the heat is not bad, as it's being exhausted out the rear airflow bracket quite well. The bottom plastic shroud was totally touchable, so though the temps are up there, the cooler is doing a good job directing heat out of the tower. In my well-ventilated case it hasn't brought my CPU temps higher.


----------



## sugarhell

http://www.overclock.net/t/1437382/dsogaming-pcars-developer-mantle-will-offer-70-80-of-next-gen-console-performance-regarding-draw-calls

Interesting.
Quote:


> If AMD's claims are legit, then R9 290X may outperform even Nvidia's next-gen GPUs offerings


----------



## K1llrzzZ

Well then, I hope everything will be fine. My cards are arriving next Tuesday. Thanks for your answer.


----------



## Arizonian

Quote:


> Originally Posted by *sugarhell*
> 
> http://www.overclock.net/t/1437382/dsogaming-pcars-developer-mantle-will-offer-70-80-of-next-gen-console-performance-regarding-draw-calls
> 
> Interesting.


Quite interesting, and I am so looking forward to the free performance boosts our cards are going to get in Frostbite-engine games. BF4 will be my main game, so I'm stoked. Crossing fingers more game developers adopt Mantle and make it relevant. We need to shed bloated DirectX.


----------



## utnorris

Quote:


> Originally Posted by *jjjc_93*
> 
> I picked up my card yesterday arvo, and people weren't kidding when they said these things were loud and hot, damn! Loving it though, the performance is looking really good, LN2 next
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also has anybody tried to remove the stock cooler? I've taken all my screws out, including the ones on the PCIE bracket and it still feels pretty well fixed to the card.


How the heck are you hitting 17k in 3DMark11 at those overclocks? I am barely over 15k with 1110/1450 and my CPU at 4.6GHz. I ran 100% fan, so I do not think it is throttling.

On a side note, I checked my wattage at the wall while benching 3DMark11, and even overclocked I did not go over 400 watts total system. Am I missing something? I thought the card alone would pull 300+ watts on its own.


----------



## Jared Pace

Quote:


> Originally Posted by *Scotty99*
> 
> Ya 850 is more than enough, even with them overclocked.


This is debatable. I wouldn't use less than 1200-1500 watts for two. It depends on what your definition of "overclocked" is. AMD recommends 650 for just one, so do the math.
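The "math" here can be sketched with ballpark figures. Every number below is an assumption (rough per-card and CPU draws in line with launch-era reviews), not an official spec:

```shell
# Ballpark PSU math for a CrossFire 290X build; all figures are assumptions.
CARDS=2
CARD_W=300     # ~peak board power per overclocked 290X
CPU_W=180      # heavily overclocked quad/hexacore estimate
REST_W=70      # motherboard, drives, fans
LOAD=$(( CARDS * CARD_W + CPU_W + REST_W ))
REC=$(( LOAD * 12 / 10 ))     # add ~20% headroom so the PSU isn't at its limit
echo "estimated load: ${LOAD} W, suggested PSU: ${REC} W or more"
```

Under these (debatable) assumptions, an overclocked pair already lands right at the raw capacity of an 850W unit before any headroom, which is exactly why the thread disagrees about it.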


----------



## skupples

Sounds like you guys are going to need your own Skyn3t for all those BIOSes that will require flashing for proper overclocking.


----------



## formula m

Quote:


> Originally Posted by *szeged*
> 
> flash the asus bios and get voltage control
> 
> 
> 
> 
> 
> 
> 
> 
> 
> flashing takes all of 3 seconds and is pretty safe unless you decide you want to falcon punch your card out of the pci-e slot while doing it


A buddy & I are watching the pre-game and reading this thread.. and this ^...

.. is so hilarious..


----------



## Jared Pace

Quote:


> Originally Posted by *utnorris*
> 
> How the heck are you hitting 17k in 3DMark11 at those overclocks?


Custom BIOS with throttling disabled; it probably has TOP GDDR5 timings as well. Asus PT3.rom + GPU Tweak. And he has voltage points hardwired. Maybe he has low temps too. Not sure if any of his hard mods make a difference for holding higher clocks.


----------



## szeged

We need to get skyn3t to get a 290X so he can make both teams some custom BIOSes, lol.


----------



## formula m

Quote:


> Originally Posted by *K1llrzzZ*
> 
> Hey, I ordered 2 R9 290X cards and I want to use them in CF. I just want to ask: will my configuration be enough for these two cards? I have an 850W Corsair PSU. I will not overclock the cards, but I will probably use them in "uber" mode. I have an ASRock Z77 Extreme4 mobo, not the most expensive, but it supports CF in 8x/8x if I'm right (will that hold back the cards?), and I have an i7 3770K CPU clocked at 4.5GHz and a Cooler Master HAF 932 case. So my main concerns are the PSU (will 850W be enough?), the mobo (will the 8x/8x PCIe hold the cards back?), the CPU (will there be a bottleneck?), and the heat. Both cards are reference cards, but I was told they could be better for CF because custom-cooler cards are bigger, which means there is less space between the two cards. Of course I know liquid cooling would be the best, but as I said I'm not going to OC the cards. I know they will be very loud, but that doesn't bother me at all. Or is there anything that I haven't considered that might cause a problem?


You are golden, have fun with that rig..!


----------



## Scotty99

Quote:


> Originally Posted by *Jared Pace*
> 
> This is debatable. I wouldn't use less than 1200-1500 watts for two. It depends on what your definition of "overclocked" is. AMD recommends 650 for just one, so do the math.


It's really not debatable at all; you could overclock two of these cards to the hills with a 5+ GHz CPU and be in the 700W load range, maybe 750. Not to mention the person who asked has a Corsair, and those are rated 100-150W below what they can actually output.


----------



## SoloCamo

Quote:


> Originally Posted by *Arizonian*
> 
> Quite interesting, and I am so looking forward to the free performance boosts our cards are going to get in Frostbite-engine games. BF4 will be my main game, so I'm stoked. Crossing fingers more game developers adopt Mantle and make it relevant. We need to shed bloated DirectX.


BF4 will be my main game as well - mantle should really make it shine, even a 5% gain with it will be worth it IMO.


----------



## RocketAbyss

Quote:


> Originally Posted by *Arizonian*
> 
> Quite interesting, and I am so looking forward to the free performance boosts our cards are going to get in Frostbite-engine games. BF4 will be my main game, so I'm stoked. Crossing fingers more game developers adopt Mantle and make it relevant. We need to shed bloated DirectX.


Quote:


> Originally Posted by *SoloCamo*
> 
> BF4 will be my main game as well - mantle should really make it shine, even a 5% gain with it will be worth it IMO.


On a more important note: BF4 hitting here in 12 hours time!


----------



## overclockFrance

Quote:


> Originally Posted by *oblivious45cs*
> 
> Hello everyone,
> 
> Figured i would share some of my thoughts and results, since this forum has been so helpful in providing me information
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I picked up the HIS 290X and flashed it using the Asus BIOS and ATIFlash method. At stock volts I was only able to do about 1070 without crashing. I bumped the voltage up to 1.312 and was able to get 1160 stable. Below are some images of my results; I have only played around with Firestrike and Metro LL so far.
> 
> Metro Results - First Graph is stock, second @ 1160 / 1400
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Firestrike Results - First is stock, second @ 1160 / 1400
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Stock, I was actually surprised the card was nice and quiet; yeah, the temps hit 94, but I guess that's supposed to happen lol. When running at 1160 I set the fan to 70% and that kept temps under 90, but it is extremely loud; 100% on these stock coolers is unbearably loud.
> 
> At this point I'm still debating adding it to my H220 loop; I was hoping I could get closer to 1200, but anything past 1160 and it crashes or artifacts. I haven't tried going higher with the voltage yet, still a little gun-shy. I will see what you more experienced guys can do with them and decide.
> 
> Forgot to add information on my rig...
> MSI Mpower Z87
> 4770K @ 4.6
> 16GB @ 2000
> 290x @ 1160/1400


I own a HIS 290X too, but I hesitate to flash my card: mine is equipped with Elpida memory modules and the ASUS cards use Hynix. I would like to know whether yours has Elpida or Hynix modules. If it's Elpida, then I will flash, no risk. Otherwise, I will wait for Afterburner to unlock voltage. Indeed, I am likely to run into problems if the memory modules are different.


----------



## SoloCamo

Quote:


> Originally Posted by *overclockFrance*
> 
> I own A HIS 290X too but I hesitate flashing my card : mine is equipped with Elpida memory modules and ASUS cards are with Hynix. I would like to know if yours is equipped with Elpida or Hynix modules. If it is with Elpida then I will flash, no risk. Otherwise, I will wait that Afterburner unlocks voltage. Indeed, I am likely to face with problems if the memory modules are different.


Speaking of which, has it been confirmed that the Sapphires are definitely using Hynix?


----------



## Jared Pace

Quote:


> Originally Posted by *Scotty99*
> 
> Its really not debatable at all, you could overclock 2 of these cards to the hills with a 5+ ghz cpu and be in the 700w load range, maybe 750. Not to mention the person who asked has a corsair and they are rated 100-150 less than what they can actually output.


Would it be okay for me to call you a completely uninformed idiot? Probably not. But that is how you are acting. Professional overclockers have been known to use over 900 watts with one card, and 1600 watts with one card and a CPU. This thread is better off without misinformed posts like yours telling a new buyer he will be okay overclocking two top-of-the-line video cards on an 850-watt PSU.

AMD probably 'recommends' more than 850w by default. With 850 watts get ready for 3dmark to shut your computer down. Just a heads up.


----------



## overclockFrance

It depends: the Sapphire cards bundled with BF4 use Hynix, and the ones without BF4 use Elpida.

But in my eyes, Elpida modules are as good as Hynix: I am able to play and bench at 1625 MHz on the memory without upping any voltage.

To avoid the black screen when the memory clock is too high, I use hotkeys: on the Windows desktop the memory runs at 1450 MHz, and in games I switch to 1625 MHz with a hotkey. No more problems.


----------



## devilhead

It's so weird to see that still nobody has an EK waterblock.

I get my 290X and EK waterblock next week


----------



## Scotty99

/facepalm

Why in the world would you assume someone asking such questions on a public forum is going to be using LN2? (With the wattage numbers you are claiming, that is the only scenario you can be referring to.)

850 W is MORE than enough for two OC'd (on air, or even water) 290Xs and a highly OC'd CPU. Manufacturers grossly overestimate their suggested wattage numbers; this is a very well-known fact.

Just a little video to kinda show what I'm talking about:

Four 480s drawing a MAX of 1100 W from the wall. FOUR.


----------



## sugarhell

No. With two 7970s at 1.35 volts and a 5 GHz CPU I can push 900 watts.

With four 7970s OC'd I need two PSUs.

With the power limit at +50%, two 290s are going to push 350 W+.


----------



## Scotty99

Quote:


> Originally Posted by *sugarhell*
> 
> No. With 2 7970 1.35 volt and 5 ghz 7970 i can push 900 watt.
> 
> With 4 7970 oced i need 2 psus
> 
> With 50% powerlimit 2 290s gonna push 350 watt+


Sorry, but I call bull on this. Post a picture of your system with a Kill-A-Watt during a benchmark. 900 watts with two 7970s? There is just no way in hell. Do this and I'll write a two-page apology.


----------



## binormalkilla

Quote:


> Originally Posted by *Scotty99*
> 
> Sorry, but i call bull on this. Post a picture of your system with a kil-a-watt during a benchmark. 900 watts with two 7970's? There is just no way in hell. Do this and ill write a 2 page apology.


Well when my second 290x arrives I'll be able to check my power usage on my UPS. I'm getting 450-500W on my system at full load right now.


----------



## Jared Pace

Quote:


> Originally Posted by *Scotty99*
> 
> /facepalm
> 
> Why in the world would you assume someone asking a public forum such questions is going to be using Ln2? (with those wattage numbers you are claiming that is the only scenario you can be referring to).
> 
> 850w is MORE than enough for two OC'd (on air, or water even) 290x's and a highly OC'd CPU. Manufacturers grossly overestimate their suggested wattage numbers, this is a very well known fact.
> 
> Just a little video to kinda show what im talking about:
> 
> 
> 
> 
> 
> 
> 4 480's drawing MAX 1100w from the wall FOUR.


I'm not here to argue with you about having too little power. If people wish, they can follow your recommendation, as it is equal to or less than the default amount. Once they overclock, they will see. Take a look at the 780 thread and you will see two 780s using 1390 watts. Your recommendation shows your very limited level of experience with overclocking.

I have a dozen AMD cards. I can make two 7970s and a 965 CPU use over 1000 watts with the default BIOS on air cooling. The Titan club thread has plenty of posts where two cards are shutting down AX1200s, and on the Anandtech forums a guy is shutting down a G1300 with two 780s. A six-core Extreme CPU with a full bank of DIMMs, a few case fans and peripherals can approach 650 watts. A GK110 card can approach 600 watts on the slv7 BIOS, and 1000 watts in Kingpin's hands. My 7950s and 7970s can do over 400 watts on air with the stock BIOS. Plenty of 780s and Titans are doing over 500 watts in the club threads.

Not sure what qualifications or relevant experience you have that validates 850 watts taking someone to "the hills and beyond" with a 5 GHz CPU. You seem to be making baseless assumptions from reading the side of a box or something. If you just want to barely get by without crashing on stock settings or a mild overclock, still be cautious squeezing by on a limited amount of power. There is a lot of frustration behind removing a PSU and its wiring from your case when you realize the power source is inadequate or inferior. Only take an 850 if you plan to OC a single card.

That is the worst possible recommendation. Please, anyone reading this, do yourself a favor and get a name-brand, top-of-the-line PSU if you plan to OC multiple cards. AX1200i or a higher level of quality.


----------



## Arizonian

Quote:


> Originally Posted by *sugarhell*
> 
> No. With 2 7970 1.35 volt and 5 ghz 7970 i can push 900 watt.
> 
> With 4 7970 oced i need 2 psus
> 
> With 50% powerlimit 2 290s gonna push 350 watt+


But if one doesn't add voltage, would that be possible? We're talking a 6-pin/8-pin here. I asked Shilka and he too suggested it'd be fine as long as you're not adding voltage. I was contemplating a new PSU but am reluctant, as my current AX850 is only a year old. What if one is only gaming in Crossfire and not benching?


----------



## SoloCamo

Quote:


> Originally Posted by *Scotty99*
> 
> Sorry, but i call bull on this. Post a picture of your system with a kil-a-watt during a benchmark. 900 watts with two 7970's? There is just no way in hell. Do this and ill write a 2 page apology.


better start writing an apology...


----------



## sugarhell

Without too many volts, yeah. It's fine on an 850 W PSU.


----------



## Scotty99

I'm tempted to buy a Corsair 850 and two 290Xs just to prove you wrong lol. I would bet the farm that with 1200 clocks on each and a 5 GHz 4770K I'd max the PSU in the 750 W range in a benchmark.


----------



## SoloCamo

Quote:


> Originally Posted by *Scotty99*
> 
> Im tempted to buy a corsair 850 and two 290x's just to prove you wrong lol. I would bet the farm with 1200 clocks on each and a 5ghz 4770k id max the PSU in 750w range in a benchmark.


Oh, so you just missed my post above then?


----------



## Scotty99

Quote:


> Originally Posted by *SoloCamo*
> 
> better start writing an apology...


Eh, what are you talking about? That chart proves MY point, not his, lmao.


----------



## sugarhell

You are probably a newbie when it comes to Tahiti/Hawaii overclocking.

I am trying to find tsm's power consumption with three 7970s, which was over 1400 W:

http://www.overclock.net/t/1406438/best-1000w-psu-for-3x-7970s/30#post_20329406

Look at the single-card consumption. Multiply by three, add a 5 GHz SB-E and a huge loop, and it's over 1400 watts.


----------



## Euda

Again: can anyone help me flash the ASUS BIOS to my voltage-locked (although the original package says "Voltage Unlocked!!!1111" -.-) XFX R9 290X? Voltage is still locked after installing Afterburner 3 Beta 16. Besides that, I tried to flash my BIOS via "atiflash -p 0 f ASUS.ROM" and it errors with "No valid Video Adapter found!", the same with "atiflash -p 1 -f ASUS.ROM", "atiflash -p 2 -f ASUS.ROM" and so on...

I'd really appreciate it if anyone can help, cuz' I'm going to illustrate the clock scaling (percentage-wise) of the card in several engines


----------



## SoloCamo

Quote:


> Originally Posted by *Scotty99*
> 
> Eh, what are you talking about? That chart proves MY point not his lmao.


You missed the point completely then... Never mind. Sorry to feed into this guy; I don't mean to derail the thread further, but I can't stand power-use misinformation, as it's so vital yet so often brushed to the side.


----------



## Jared Pace

Quote:


> Originally Posted by *Euda*
> 
> Again: can anyone help me flash the ASUS BIOS to my voltage-locked (although the original package says "Voltage Unlocked!!!1111" -.-) XFX R9 290X? Voltage is still locked after installing Afterburner 3 Beta 16. Besides that, I tried to flash my BIOS via "atiflash -p 0 f ASUS.ROM" and it errors with "No valid Video Adapter found!", the same with "atiflash -p 1 -f ASUS.ROM", "atiflash -p 2 -f ASUS.ROM" and so on...
> 
> I'd really appreciate it if anyone can help, cuz' I'm going to illustrate the clock scaling (percentage-wise) of the card in several engines


Can you post your "Asus.rom" file please? Are you booting into DOS with a boot disk and changing directory to your USB stick, which contains the newest atiflash and the ROM file?


----------



## Scotty99

The guy claimed 900 W with two 7970s; the chart you linked shows Crossfire 280Xs with total system draw (including CPU) at 570 W. What point are you trying to make again? You didn't specify whether the CPU or GPUs were overclocked in the chart; you just linked a chart with no further info. Let's just agree to disagree, no point in derailing the thread like you say. Personally, I'd feel 100% safe with a quality 850 W PSU; I'll leave it at that.


----------



## Redwoodz

Quote:


> Originally Posted by *Jared Pace*
> 
> Would be okay for me to call you a completely uniformed idiot? Probably not. But that is how you are acting. Professional overclockers have been known to use over 900 watts with 1 card, and 1600 watts with 1 card and a cpu. This thread is better off without misinformed posts like yours telling a new buyer he will be okay overclocking two top of the line video cards on an 850 watt PSU.
> 
> AMD probably 'recommends' more than 850w by default. With 850 watts get ready for 3dmark to shut your computer down. Just a heads up.


900w with one card......um no.
1600w with one card and CPU.....um no.

Quote:


> Originally Posted by *Jared Pace*
> 
> I'm not here to argue with you about having too little power. If people wish, they can follow your recommendation, as it is equal or less than the default amount. Once they overclock, they will see. Take a look at the 780 thread and you will see two 780's using 1390 watts. Your recommendation shows your very limited level of experience with overclocking. I have a dozen AMD cards. I can make two 7970s and a 965 CPU use over 1000 watts with the default bios on air cooling. The Titan club thread has plenty of posts where 2 cards are shutting down AX1200's, Anandtech forums has a guy shutting down a G1300 with 2 780's. A six core extreme cpu with a full bank of dimms, few case fans and periphreals can approach 650 watts. A GK110 card can approach 600 watts on slv7 bios, and 1000 watts in Kingpins hands. My 7950s and 7970s can do over 400 watts on air + stock bios. Plenty of 780s & titans are doing over 500 watts in the club threads. Not sure what qualifications or relevant experience you have that validates 850 watts taking someone to "the hills and beyond" with a 5ghz cpu. You seem to be making baseless assumptions from reading the side of a box or something. If you just want to barely get by without crashing on stock settings or a mild overclock, still be cautious squeezing by on a limited amount of power. There is a lot of frustration behind removing a PSU and wiring from your case when you realize the power source is inadequate or inferior. Only take a 850 if you plan to OC a single card.
> 
> That is the worst possible recommendation. Please anyone reading this do yourself a favor and take a name brand top of the line PSU if you plan to OC multiple cards. AX1200i or higher level of quality.


Quote:


> Originally Posted by *SoloCamo*
> 
> better start writing an apology...


You are the misinformed one, my friend. The chart just proves he was right: R9 280X Crossfire pulling 570 W and 290X Crossfire 725 W, which means 850 W will do just fine. A quality PSU can also put out more than its rated capacity. I suggest you read our resident PSU guru's post about PSU requirements and download his PSU calc, which is the best there is... period.
http://www.overclock.net/t/1140534/psu-calc-final-release/0_20

Also, name brands don't mean anything; that's the worst advice ever. Do your homework and pick the right PSU for your application, and OCN has all you need to do this.
http://www.overclock.net/t/183810/faq-recommended-power-supplies/0_20


----------



## binormalkilla

Quote:


> Originally Posted by *Scotty99*
> 
> The guy claimed 900w with two 7970's, the chart you linked shows crossfire 280x's with total system draw (including CPU) at 570w. What point are you trying to make again? You didnt specify if the CPU or GPU's were overclocked in the chart, you just linked a chart with no further info. Lets just agree to disagree, no point in derailing the thread like you say. Personally, id feel 100% safe with an 850w quality psu, ill leave it at that.


You would be fine at stock vGPU, but you would quickly hit a power limit after increasing the voltage. I have an 850 W PSU but ordered an AX1200i in preparation for a second 290X coming into stock.


----------



## sugarhell

But he talked about OC'd 290Xs. At stock, even a quality 750 W is enough.


----------



## Scotty99

This will be my last post on the matter. The person who originally asked has a Corsair 850 W; I accounted for this fact and included overclocking in my response. 850 W Corsairs can output 1000 W or more; they even come with spec sheets showing their actual output, which is usually 150-200 W above what they're rated at.

Not making a new post, just adding to this one so I'm not making this stuff up:

http://www.overclock3d.net/reviews/power_supply/corsair_hx850w_850w_atx_psu/4

OC3D shows an HX850 running at 90% efficiency with a max temp of 28C at *1084 W*. To get the unit to kick off, he had to crank it up to *91* amps, 21 more than what it's rated at.
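For anyone sanity-checking those rail figures, amps convert to watts as P = V x I. A minimal Python sketch (assuming, as the OC3D review implies, that essentially the whole load sits on a single 12 V rail; the 70 A rating is inferred from "91 amps, 21 more than rated"):

```python
def rail_watts(amps: float, volts: float = 12.0) -> float:
    # P = V * I on a single 12 V rail, which is how the HX850's capacity is quoted
    return amps * volts

print(rail_watts(70))  # 840.0 W: roughly the HX850's rated 12 V capacity
print(rail_watts(91))  # 1092.0 W: around the load where OC3D's unit finally tripped
```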


----------



## Euda

Quote:


> Originally Posted by *Jared Pace*
> 
> *Can you post your "Asus.rom" file please? Are you booting into dos with a bootdisk & changing directory to your usb stick which contains the newest atiflash & rom file?*


I tried atiwinflash and also a bootable Windows ME MS-DOS "8.0" stick. I made sure to be in the same directory as the ASUS.ROM file, located in the USB stick's main dir (same dir as atiflash.exe)...

I can't even "atiflash -i 0"; it keeps telling me there's no valid video adapter. This is the Asus.Rom:
http://www.mediafire.com/?5gewoc5e0jv4552

Am I using the wrong version of atiflash, maybe? This is the atiflash.exe:
http://www.mediafire.com/?90yaopb6e86r2mk

Thanks, and I appreciate your help


----------



## K1llrzzZ

I asked this question. I talked about two R9 290Xs in "uber" mode, but not OC'd, with an 850 W Corsair PSU and an OC'd i7 3770K at 4.5 GHz in an ASRock Z77 Extreme4 mobo. So will this be enough if I DO NOT overclock?


----------



## skupples

Quote:


> Originally Posted by *overclockFrance*
> 
> I own A HIS 290X too but I hesitate flashing my card : mine is equipped with Elpida memory modules and ASUS cards are with Hynix. I would like to know if yours is equipped with Elpida or Hynix modules. If it is with Elpida then I will flash, no risk. Otherwise, I will wait that Afterburner unlocks voltage. Indeed, I am likely to face with problems if the memory modules are different.


I would also hesitate in that scenario. It's proven to be a bad idea to mix BIOSes when the memory is different.


----------



## ABD EL HAMEED

Quote:


> Originally Posted by *K1llrzzZ*
> 
> I asked this question,i talked about two R9 290X in "uber" mode,but not OC-d with a 850W Corsair PSU,and an OC-d i7 3770K 4.5 Ghz in an Asrock Z77 Extreme 4 Mobo.So will this be enough if i DO NOT overclock?


They already said it's enough


----------



## Falkentyne

Quote:


> Originally Posted by *Euda*
> 
> I tried atiwinflash and with a bootable, Windows ME MS-DOS "8.0"-Stick. I made sure to be in the same directory as the ASUS.ROM-file located on the USB-Stick main dir (same dir as the atiflash.exe)...
> 
> I can't even "atiflash -i 0", it keeps telling me there's no valid video adapter; this is the Asus.Rom:
> http://www.mediafire.com/?5gewoc5e0jv4552
> 
> Am I using the wrong version of atiflash maybe? This it the atiflash.exe:
> http://www.mediafire.com/?90yaopb6e86r2mk
> 
> Thanks and I appreciate your help


The command is atiflash (NOT atiwinflash): atiflash -f -p 0 (filename.bin).
Like that.

No idea why that -i is in there.


----------



## K1llrzzZ

Yeah, but there was a bit of an argument, and I just asked it again to let everybody know that I'm not talking about overclocked cards.


----------



## SoloCamo

Quote:


> Originally Posted by *Redwoodz*
> 
> 900w with one card......um no.
> 1600w with one card and CPU.....um no.
> 
> You are the misinformed one my friend.The chart just proves he was right-R9 280X Xfire pulling 570w and 290X XFire 725...which means 850W will do just fine.A quality PSU can pull for more than it's rated capacity also.I suggest you read our resident PSU Guru's post about PSU requirements and download his PSU calc which is the best there is...period.
> http://www.overclock.net/t/1140534/psu-calc-final-release/0_20


Obviously there are some misunderstandings here... When two 290Xs are pulling 725 W on a bench system with a relatively mildly OC'd i7 (4.2 GHz), it leaves you with VERY little headroom. So much as a remotely mild OC on those 290Xs and you are pushing the 850 with ease. I guess I'm just not one to risk $1100 in GPUs alone by being near my PSU's limit constantly... Also, please keep in mind that was Crysis 3, not a full-on 100% load by any means for both CPU and GPU. I just like having headroom. Not misinformed at all; just different approaches to what I consider acceptable, especially when $2k or more in hardware is being powered by it.

Also keep in mind, bench systems usually aren't loaded up with other hardware such as a sound card, extra hard drives, etc.


----------



## tsm106

Quote:


> Originally Posted by *sugarhell*
> 
> Probably you are a newbie when it comes to tahiti/hawaii overclocking.
> 
> I am trying to find tsm power consumption with 3 7970 which was over 1400
> 
> http://www.overclock.net/t/1406438/best-1000w-psu-for-3x-7970s/30#post_20329406
> 
> Look single card consumption. Put 3x a 5ghz sbe and a huge loop and its over 1400 watt


^^That. The 290X draws 50 W more at stock and, with volts, draws power like mad, unlike the other ultra-enthusiast offering.

Quote:


> Originally Posted by *SoloCamo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scotty99*
> 
> Eh, what are you talking about? That chart proves MY point not his lmao.
> 
> 
> 
> You missed the point completely then... Nevermind. Sorry to feed into this guy, don't mean to derail the thread further but I can't stand power use misinformation, as it's so vital yet often brushed to the side
Click to expand...

Dude, I put that guy on ignore since the beginning of the thread. It's obvious he's nothing but a troll.


----------



## K1llrzzZ

So you say it's risky even if i do not overclock the cards?


----------



## szeged

Quote:


> Originally Posted by *tsm106*
> 
> ^^That. 290x draws 50w more stock and with volts draws power like a mad unlike the other ultra enthusiasts offering.
> Dude, I put that guy on ignore since the beginning of the thread. It's obvious he's nothing but a troll.


How are the 290X benches coming along? I gotta wait till midday Monday for my first one to arrive, second one probably Tuesday =\


----------



## Yeroon

Quote:


> Originally Posted by *SoloCamo*
> 
> better start writing an apology...


He asked about uber mode, nothing more; his 850 is perfectly suited for the job.
Anand used an AX1200i (80 Plus Platinum), which has a peak of 92% efficiency.
So 727 W * 0.92 = ~669 W actual power draw from the PSU itself. Plus he's got a quad-core Ivy vs. Anand's 4960, and Anand was running 4 x 8 GB.

The difference between single vs. CFX was 320 W; with inefficiencies, that becomes at worst (92% wall-to-card) about 294 W per 290X.

While I was at it, I did the same for the 7970 GHz, which came out to 232 W/card at stock.

Power supplies are rated at their output, not wall draw; power converters have inefficiencies, and sometimes they can put out more than rated, cleanly.
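Yeroon's wall-to-DC arithmetic is easy to check with a couple of lines of Python (the 0.92 figure is the AX1200i's approximate peak efficiency assumed in the post; rounding lands on 669 W rather than the post's truncated 668 W):

```python
def dc_load(wall_watts: float, efficiency: float) -> float:
    # DC power the PSU actually delivers = AC wall draw * conversion efficiency
    return wall_watts * efficiency

AX1200I_PEAK_EFF = 0.92  # 80 Plus Platinum peak, per the post above

print(round(dc_load(727, AX1200I_PEAK_EFF)))  # 669 W: total DC load of the CFX system
print(round(dc_load(320, AX1200I_PEAK_EFF)))  # 294 W: per-290X share of the CFX delta
```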


----------



## tsm106

Quote:


> Originally Posted by *szeged*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> ^^That. 290x draws 50w more stock and with volts draws power like a mad unlike the other ultra enthusiasts offering.
> Dude, I put that guy on ignore since the beginning of the thread. It's obvious he's nothing but a troll.
> 
> 
> 
> hows the 290x benchs coming along? i gotta wait till mid day monday for my first one to arrive, second one probably tuesday =\
Click to expand...

Faster than Titan at the same core clocks so far. But I reckon you'd want a full-cover block before throwing lots of volts at the card. GPU-only or a mod is not a good idea imo. The memory chips get freaking hot, and I'm generally concerned about cooling everything that isn't on water. Air cooling isn't just holding the core back; it's the rest of the card too.


----------



## SoloCamo

Quote:


> Originally Posted by *Yeroon*
> 
> He asked about uber mode, nothing more - his 850 is perfectly suited for the job.
> Anand used an ax1200i - 80+ platinum - which has a peak of 92% efficiency.
> So 727W * .92 = 668W actual power draw from the PSU itself. Now he's got a quad core ivy vs anands 4960, plus anand was running 4 x 8GB.
> 
> The difference between single vs CFX was 320W - ineffeciencies becomes at worst (92% wall to card) = 294W/ per 290x,


Fair enough. I was initially under the impression he would be overclocking, due to what Scott wrote ("you can OC it to the hills with a 5 GHz CPU", etc.). Anyways, that said, yes, as long as the Corsair isn't old and is a decent model, he should be fine at stock settings.

On the flip side, I absolutely don't trust CFX with these on my NZXT Hale82 850 W with the FX-9590 I will have in here shortly... lol.
Quote:


> Originally Posted by *K1llrzzZ*
> 
> So you say it's risky even if i do not overclock the cards?


You are fine. Sorry for any confusion here; as you can tell, things get out of hand quickly with any sort of misunderstanding.


----------



## Scotty99

Sigh, I really didn't want to have to post this again, but people need to see this and I fear it was lost in my edit:

http://www.overclock3d.net/reviews/power_supply/corsair_hx850w_850w_atx_psu/4

This will be my last post on the matter. The person who originally asked has a Corsair 850 W; I accounted for this fact and included overclocking in my response. 850 W Corsairs can output 1000 W or more; they even come with spec sheets showing their actual output, which is usually 150-200 W above what they're rated at.

Not making a new post, just adding to this one so I'm not making this stuff up:

OC3D shows an HX850 running at 90% efficiency with a max temp of 28C at 1084 W. To get the unit to kick off, he had to crank it up to 91 amps, 21 more than what it's rated at.


----------



## ABD EL HAMEED

Quote:


> Originally Posted by *K1llrzzZ*
> 
> Yeah,but there was a bit of an argument and I just asked it again to let everybody know that i'm not talking about overclocked cards.


Now you know it's enough,now go enjoy those cards


----------



## Euda

Quote:


> Originally Posted by *Falkentyne*
> 
> Command is Atiflash (NOT atiwinflash) -f -p 0 (filename.bin).
> like that.
> 
> No idea why that -I is in there.


Sorry, I tried _both_ Atiflash under DOS and atiwinflash under Win7


----------



## tsm106

Quote:


> Originally Posted by *SoloCamo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Yeroon*
> 
> He asked about uber mode, nothing more - his 850 is perfectly suited for the job.
> Anand used an ax1200i - 80+ platinum - which has a peak of 92% efficiency.
> So 727W * .92 = 668W actual power draw from the PSU itself. Now he's got a quad core ivy vs anands 4960, plus anand was running 4 x 8GB.
> 
> The difference between single vs CFX was 320W - ineffeciencies becomes at worst (92% wall to card) = 294W/ per 290x,
> 
> 
> 
> Fair enough, I was initially under the impression he would be overclocking due to what Scott wrote "you can OC it to the hills with a 5ghz cpu, etc".. Anyways, that said, yes as long as the corsair isn't old and is a decent model he should be fine at stock settings.
> 
> On the flip side, I absolutely don't trust CFX with these on my NZXT Hale-82 850w with the FX-9590 I will have in here shortly... lol.
> Quote:
> 
> 
> 
> Originally Posted by *K1llrzzZ*
> 
> So you say it's risky even if i do not overclock the cards?
> 
> Click to expand...
> 
> You are fine, sorry for any confusion here as you can tell things get out of hand quickly with any sort of misunderstanding
Click to expand...

Don't use reviews to get an approximation of power draw in the real world. Who is going to run stock? The 290X is not locked down and will scale with volts, making it draw ludicrous amounts of power, IF you want. If you want people to tell you to get a puny PSU over and over, you are in the wrong forum. Go to the PSU forum; many guys there love selling you puny PSUs. Except they are not around to foot the bill when you blow your PSU.


----------



## Jared Pace

Quote:


> Originally Posted by *K1llrzzZ*
> 
> So you say it's risky even if i do not overclock the cards?


If you don't overclock it's fine.


----------



## Jared Pace

Quote:


> Originally Posted by *Euda*
> 
> Sorry, I tried _both_ Atiflash under DOS and atiwinflash under Win7


Winflash isn't working right now. If you read back through this thread, there are instructions for using atiflash, and I linked to the proper updated version of atiflash you need for the 290X.

edit: sorry for the DP


----------



## Scotty99

Quote:


> Originally Posted by *Jared Pace*
> 
> If you don't overclock it's fine.


Please, PLEASE stop giving bad advice. I know I said I was done posting on this matter, but I CANNOT let this slide.

I have already proven beyond a shadow of a doubt, with forum regulars backing me, that an 850 W Corsair is perfectly fine for these cards while OVERCLOCKED.

If you missed it, here it is again:
http://www.overclock3d.net/reviews/power_supply/corsair_hx850w_850w_atx_psu/4

It ran like a dream at 1084 DC watts with 90% efficiency at low temps. To get the unit to actually turn off, he had to crank it all the way up to 91 amps, 21 over the rated amperage (he didn't list the wattage needed for this, but I'm sure it was absurd).

Phew, I feel better now.


----------



## Jared Pace

Quote:


> Originally Posted by *tsm106*
> 
> Don't use reviews to get an approximation of power draws in the real world. Who is going to run stock? The 290x is not locked down and will scale with volts making it draw ludicrous amounts of power, IF you want. If you want ppl to tell you to get a puny psu over and over you are in the wrong forum. Go to the psu forum, many guys there love selling you puny psus. Except they are not around to foot the bill when you blow your psu.


Finally someone who knows the true spirit of "Overclock.net". I put that scotty guy on my block list


----------



## Yeroon

Quote:


> Originally Posted by *tsm106*
> 
> Don't use reviews to get an approximation of power draws in the real world. Who is going to run stock? The 290x is not locked down and will scale with volts making it draw ludicrous amounts of power, IF you want. If you want ppl to tell you to get a puny psu over and over you are in the wrong forum. Go to the psu forum, many guys there love selling you puny psus. Except they are not around to foot the bill when you blow your psu.


He said stock uber mode, and a 3770K @ 4.5 GHz, with a PSU he already has. I would stand by the recommendation that he is 100% OK with that 850 W rated PSU if it's a quality unit, not just a quality name.
Low wattage or high wattage makes no difference in whether a PSU will blow up; the quality of its components will. Except few people want to pay good prices for low-wattage power supplies.

I test all my systems with a power meter at stock/OC, full hardware. I run my machines at 100% load 24/7 doing science (OK, POEM GPU work doesn't hit 100%, but HCC did when done properly).
I'd rather buy a better-quality unit with a slightly lower peak rating than a higher-rated one with crappier parts.

Take into account that I never claimed any 850 W unit will support CFX OC'd 290Xs; every situation is different. I was just showing that wall draw != PSU draw.


----------



## jomama22

No reason to cheap out on a PSU honestly. Nothing wrong with overkill as it allows expansion and future power hungry cards.

Thanks Jared for all the links and the thread with all the bios info. Can't wait to throw the vdroop disabled bios on these 6 cards and bin the crap out of them lol.


----------



## Jared Pace

I'm glad to help and excited to see your benches Jomama22. TSM106 has good advice about Power. He does have the highest scores on OCN, so I would take his word.


----------



## mboner1

Well, I ran crossfired 7970s moderately overclocked on my Corsair TX850 for a good few months with no issues. If I pick up the 290X I would imagine crossfiring them at some point. I wouldn't overclock now after reading this thread, but I would imagine it to be fine at stock, based on nothing but my experience with the 7970s and the recommended power supply from AMD. Also, if you do go over 850 W momentarily and not consistently, the Corsair TX850 has an additional 150-200 W to rely on.

Quote:


> Originally Posted by *tsm106*
> 
> Don't use reviews to get an approximation of power draws in the real world. Who is going to run stock? The 290x is not locked down and will scale with volts making it draw ludicrous amounts of power, IF you want. If you want ppl to tell you to get a puny psu over and over you are in the wrong forum. Go to the psu forum, many guys there love selling you puny psus. Except they are not around to foot the bill when you blow your psu.


So you're saying a TX850 wouldn't suffice for crossfired 290Xs at stock, for gaming and not benchmarking???


----------



## $ilent

Hey guys

Quick update: put a borrowed PSU in my system, and everything is working fine, thankfully! Gonna send my Seasonic unit off for RMA. I'm hoping, but it's looking unlikely, that I'll have a new one in time for BF4.


----------



## Iniura

Quote:


> Originally Posted by *Euda*
> 
> I tried atiwinflash and with a bootable, Windows ME MS-DOS "8.0"-Stick. I made sure to be in the same directory as the ASUS.ROM-file located on the USB-Stick main dir (same dir as the atiflash.exe)...
> 
> I can't even "atiflash -i 0", it keeps telling me there's no valid video adapter; this is the Asus.Rom:
> http://www.mediafire.com/?5gewoc5e0jv4552
> 
> Am I using the wrong version of atiflash maybe? This it the atiflash.exe:
> http://www.mediafire.com/?90yaopb6e86r2mk
> 
> Thanks and I appreciate your help
> 
> 
> 
> 
> 
> 
> 
> 


I took this from OCUK; credits go to WalderX, who just posted it there.

Maybe this will help all the people having problems flashing their BIOS.

Following the TechPowerUp guide does work, but it links to the wrong atiflash (an older version), and the command to flash is missing the -f switch.
I have spent a bit of time today finding the correct atiflash and some BIOS files and have put them all together into a zip file you can grab here:

http://sdrv.ms/1c9DpWB

Unzip that and you will have all the correct bits needed so you can follow TPU guide:

http://www.techpowerup.com/forums/sh...ad.php?t=57750

You will need an empty USB stick (the HP tool will format it). The Win98 boot files are included in my zip file; you need to direct the HP tool to use them.

Before performing the flash I went into the AMD OverDrive control panel and set it back to defaults and unticked 'Enable Graphics OverDrive'. This meant I was back to stock and minimised risk of any problems when it booted back up with the new bios.

Once booted into the USB stick with the files copied you can get flashing! Firstly I ran the following command:

atiflash -s 0 backup.rom

This gave me a copy of my Sapphire BIOS in case I want to reflash in future, or in case any problem occurs with flashing the Asus BIOS. I have included this BIOS image in the zip file above (sapphire.rom).

You then are ready to flash the Asus bios as follows:

atiflash -p -f 0 asus.rom

If you don't use the -f switch you will get an SSID check error and it won't go through with the flashing. If you have multiple cards you just need to run the command again, incrementing the number until all your cards are flashed.
Reboot, remove USB stick and you're done!

Now you will need to use the Asus GPU Tweak software to do your overclocking; as far as I am aware, MSI Afterburner doesn't work properly with this card yet - I haven't even tried it myself. You can get it direct from Asus here:

http://support.asus.com/download.asp...pu+tweak&os=30

I've got mine running at 1150 core & 6000 mem @ 1.350v. Make sure you always have the power target slider set at the max of 150%, to ensure that the card can draw the power it needs when required.

Hope this guide helps! I will also post this as a new thread in the graphics cards section (if there isn't one already), as I see many people asking how to flash across many forums. The Asus BIOS ROM is from Gibbo, so many thanks to him for getting hold of it for us
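The per-card steps above (save a backup, then flash with -f, incrementing the adapter index) can be sketched as a small command generator; `asus.rom` is the placeholder ROM name from the guide and the card count is just an example:

```python
# Sketch of the flash sequence described above: for each adapter index,
# back up the current BIOS, then program the new ROM with -f to bypass
# the SSID check. "asus.rom" is the placeholder ROM name from the guide.
def flash_commands(num_cards, rom="asus.rom"):
    cmds = []
    for i in range(num_cards):
        cmds.append(f"atiflash -s {i} backup{i}.rom")  # save current BIOS
        cmds.append(f"atiflash -p -f {i} {rom}")       # program new BIOS
    return cmds

for cmd in flash_commands(2):
    print(cmd)
```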


----------



## fateswarm

For your information, the PSU experts over at RealHardTechX suggest 1000W on CF, 1300W on 3x, and 1600W on 4x.

http://www.realhardtechx.com/index_archivos/Page362.htm

They are known to be conservative, but they also know their PSUs.
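Those figures drop into a trivial lookup; the wattages are RealHardTechX's quoted numbers, the code itself is just a convenience sketch:

```python
# RealHardTechX's suggested minimum PSU wattage per number of 290X cards,
# as quoted above (2 = CrossFire, 3 = tri-fire, 4 = quad-fire).
RECOMMENDED_PSU_W = {2: 1000, 3: 1300, 4: 1600}

def recommended_psu(num_cards):
    """Return the suggested PSU wattage, or None for unlisted configs."""
    return RECOMMENDED_PSU_W.get(num_cards)

print(recommended_psu(2))  # 1000
```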


----------



## Ha-Nocri

Yep, a highly OC'd system with 2x 7970 in CF can consume more than 900W. I researched it when I was thinking of going 580 SLI with my 750W PSU.


----------



## Jared Pace

Something to keep in mind when these 290x's start opening up:

Two cards at only 1.3V; at 1.4V he'd be touching 1500W.
Quote:


> Originally Posted by *Jpmboy*
> 
> here's some system power draw data:
> 
> Killawatt power measurements (at the PSU plug)
> 3930K @5.0(1.523V)
> 2xTitans SLI (svl7v3 bios, softvoltmod, LLC=0)
> 
> Bios = 220W
> Boot = 500W (?)
> Idle = ~ *160-170 watts* to the rig
> Browser = *~300W*
> Super Pi = 340W
> p95 (8G ram) = 600W (597+/-)
> 3Dmk11 @ 875/3005 1.16V = 800-900W
> 3DMk11 @1215/3602 1.3V = 1190-1220W !!!
> Valley @ 1215/3602 1.3V = *950-1050*W (1080P ExHD)
> Firestrike @ 1215/3602 @ 1.3V = 1050-1130W (default)
> 
> so - 2xTitans SLI with only a modest OC (1215/3602) pull ~ 750 or more watts from the PSU.
> 
> Before buying a new PSU, make sure you have all cables connected correctly. But, frankly, I do not think an 850W psu is just not enough for OC/overvolted Titans
> 
> Oh yeah - you can also pick up a second ~600W psu and an "add2psu" for much cheaper.


http://www.overclock.net/t/1363440/nvidia-geforce-gtx-titan-owners-club/16880#post_20992725


----------



## Taint3dBulge

Man, what's up with Newegg and the ASUS cards? I've been out of town for the last few days but have been waiting for my auto-notify. Haven't seen or heard anything yet about when those reference cards come out. Anyone buy/get a reference ASUS card yet? I was gonna wait for a Matrix, but not sure if I can wait another 6+ weeks. I might just break down and get a reference ASUS. But they are nowhere to be found?


----------



## jomama22

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Man whats up with newegg and the asus cards? Have been outa town for the last few days but have been waiting for my auto notify. Havnt seen or herd anything yet about when those reference cards come out. Anyone buy/get a reference asus card yet? Was gonna wait for a matrix but not sure if i can wait another 6+ weeks. I might just break down and get a reference asus. But they are nowhere to be found?


Seems Asus is only available outside the USA ATM.


----------



## scyy

Quote:


> Originally Posted by *Falkentyne*
> 
> *
> Check OCUK forums. Your problem with the memory has been experienced by others.
> The RAM can apparently handle higher clocks. It seems to be the 2d/3d desktop speed switching that causes the problem.
> *
> I Know because I was able to run at 1100/1500 on my first time, looping Valley Benchmark multiple times, as well as playing some league of Legends. No problems keeping it there all night.
> 
> Then I rebooted the computer.
> 
> BAM: once I was on desktop, the instant I tried to move ANYTHING, I saw those lines you were talking about, then an instant black screen.
> Rebooted again, tried to open MSI afterburner, set the 70% manual fan speed, then while mousing over something: black screen.
> 
> Rebooted again and managed to open CCC and hit "defaults" before the black screen happened.
> 
> Changing the core speed only doesn't seem to cause issues.
> 
> It's something with the memory clock.
> The memory clock should NEVER be going to 3d speeds while scrolling or opening something on the desktop...isn't that what the UVD clocks were for?
> Seems to be a bug.
> 
> Some people fixed this by gpu tweaking the memory voltage on the Asus bios 0.05v higher, then that stopped the desktop from crashing.


Wait a second, this is an issue again? That used to happen to me whenever I would try to overclock my old 4830 using dual monitors, it would cause my 2d clocks to go crazy and make my second monitor flip out like that. I'm sorry but come on AMD, they have fixed and broken that at least 3 times now.


----------



## cyenz

Does anyone know if the Arctic Cooling Accelero Hybrid works on the 290X? I've read somewhere that it was compatible; if so, are we talking about the 7970 version of the cooler or the standard one? Is the core of the 290X like the 7970's? The 7970 needs a special block with some elevation.


----------



## skupples

Quote:


> Originally Posted by *cyenz*
> 
> Does any one know if the ARCTIC COOLING ACCELERO HYBRID works on 290x? I´ve read some where that it was compatible, if it is, are we talking about the 7970 version of the cooler or te standard? Is the core of 290x like the 7970, the 7970 need a special block with some elevation.


The makers have stated no, not really, and that a new model will be coming out shortly.


----------



## Euda

Quote:


> Originally Posted by *Iniura*
> 
> I took this from OC UK credits go to WalderX he just posted that there.
> 
> Maybe this will help all the people having problems flashing there bios.
> 
> Following the techpowerup guide does work but it links to the wrong atiflash (older version), and the command to flash is missing the -f switch.
> I have spent a bit of time today finding the correct atiflash and some bios files and have put them all together into a zip file you can grab here:
> 
> http://sdrv.ms/1c9DpWB
> 
> Unzip that and you will have all the correct bits needed so you can follow TPU guide:
> 
> http://www.techpowerup.com/forums/sh...ad.php?t=57750
> 
> You will need an empty USB stick (the HP tool will format it). The win98 boot files are included in my zip file, you need to direct the hp tool to use them.
> 
> Before performing the flash I went into the AMD OverDrive control panel and set it back to defaults and unticked 'Enable Graphics OverDrive'. This meant I was back to stock and minimised risk of any problems when it booted back up with the new bios.
> 
> Once booted into the USB stick with the files copied you can get flashing! Firstly I ran the following command:
> 
> atiflash -s 0 backup.rom
> 
> This gave me a copy of my sapphire bios incase I want to reflash in future, or any problem occurred with flashing the asus bios. I have included this bios image in the zip file above (sapphire.rom).
> 
> You then are ready to flash the Asus bios as follows:
> 
> atiflash -p -f 0 asus.rom
> 
> If you don't use the -f switch you will get an SSID check error and it won't go through with the flashing. If you have multiple cards you just need to run the command again, incrementing the number until all your cards are flashed.
> Reboot, remove USB stick and you're done!
> 
> Now you will need to use Asus GPU Tweak software to do your overclocking, as far as I am aware MSI afterburner doesn't work properly with this card yet - haven't even tried it myself. You can get it direct from Asus here:
> 
> http://support.asus.com/download.asp...pu+tweak&os=30
> 
> I've got mine running at 1150 core & 6000 mem @ 1.350v. Make sure you always have the power target slider set at the max of 150%, to ensure that the card can draw the power it needs when required.
> 
> Hope this guide helps! Will also post this as a new thread in the graphics cards section (if there isn't one already) as I see many people asking how to flash across many forums. The Asus bios rom is from Gibbo so many thanks to him for getting hold of it for us


Now it finally worked flawlessly, thanks to you and the updated atiflash version!







So much appreciation, thank you!
Currently I'm running Crysis 3, and we seem to have similar cards - at 1150 MHz, mine also stops artifacting @ 1.35v; 1.32v was a bit too low at least... :/ So just one question again: are these results average, or is it a comparatively weak chip? Would love to get a nice performance boost from a nice OC with my Arctic Accelero Hybrid next week... And thinking about changing the thermal compound this evening


----------



## Euda

@*Iniuara*:
Thank you so much for your help and the updated atiflash revision; thanks to that, it just worked and my voltage is unlocked! Really nice, much appreciation








_Now, I just kicked the card into Crysis 3 and gave it 1150MHz - the card stops artifacting at something like 1.35v, so we've got very similar chips :/ Is that an average result, or not really great for a Hawaii XT chip?_


----------



## Jared Pace

Quote:


> Originally Posted by *Euda*
> 
> @*Iniuara*:
> Thank you so much for your help and the updated atiflash-Revision, thanks to that it just worked and my voltage is unlocked! Really nice, much appreciation
> 
> 
> 
> 
> 
> 
> 
> 
> _
> _Now, I just kicked the card into Crysis 3 and gave it 1150MHz - the card stops artifacting at something like 1.35v, so we've got very similar chips :/ Is that an average result or not really great for the Hawaii XT-Chip?_


Euda & Iniuara,

Please try this BIOS, named PT3.rom, to see if it helps you overclock >1150MHz and/or increases your benchmark scores. It has LLC disabled, no vdroop, and possibly tighter memory timings.

*Warning:* *Fully unlocked voltage control. Voltage too high can result in permanent damage to your card.*

PT3.zip 42k .zip file


----------



## tsm106

^^Be careful with those BIOSes, you can damage your card. Btw, there's NOTHING GOOD about using them on air! Unlocked card = you are the protection, or not; your choice.


----------



## Jared Pace

Quote:


> Originally Posted by *tsm106*
> 
> ^^Be careful with those bios, you can damage your card. Btw, there's NOTHING GOOD about using them on air! Unlocked card = you are the protection or not, you're choice.


Ah yeah, thanks tsm106. I should have mentioned that as a disclaimer. You will have full control over voltages with this BIOS. Raising them too high can hurt your card.


----------



## binormalkilla

I OCed a little (1100 GPU / 1350 memory) and saw some graphical corruption while running 3DMark11. It finished successfully, though. I'm not going to do much OCing until I get my blocks and second card in.


----------



## $ilent

This is the best test I managed on my card at stock volts:










Anyone got any ideas how much higher I could go with unlocked voltage?


----------



## Iniura

Quote:


> Originally Posted by *Jared Pace*
> 
> Euda & Iniuara,
> 
> Please try this bios named PT3.rom to see if it helps you overclock >1150mhz and/or increases your benchmark scores. It has LLC disabled an no Vdroop and possibly tighter memory timings.
> 
> *Warning:* *Fully unlocked voltage control. Voltage too high can result in permanent damage to your card.*
> 
> PT3.zip 42k .zip file


I don't own a 290X (yet), I just posted that to help people flash their BIOS









Here in the Netherlands the 290X isn't available yet; you can order, but it takes a while before they can deliver them (a couple of weeks), and all the versions come with BF4.
I want a version without BF4 because I already pre-ordered BF4. There is no sign of 290Xs without BF4 here yet; I guess they will become available after the 29th of October when BF4 launches, or perhaps along with the release of the 290.

However, I am following this thread closely because I want to buy a 780, 290X or 780Ti in the upcoming weeks or month. I'm just watching everything closely in anticipation of price drops, the release, price and performance of the 780Ti, the performance of the 290X under water, and maybe waiting until December for the performance boost from Mantle before making the best decision on which GPU to buy.


----------



## Euda

Is your card artifacting at stock voltage with those 1125 MHz?


----------



## binormalkilla

Quote:


> Originally Posted by *Euda*
> 
> Is your card artifacting at stock voltage with those 1125 MHz?


Yes, that's with stock voltage and stock cooling.


----------



## FtW 420

Quote:


> Originally Posted by *$ilent*
> 
> This is the best test i managed on my card at stock volt:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone got any ideas how much higher i could go with unlocked voltage?


Remember to set the other CPU-Z tabs to memory & mobo; years later I still have to double-check my screens since I mess them up half the time...


----------



## Paulenski

So glad I found this thread, so much useful information









Has anyone tried attaching the MK-26 or Accelero Hybrid to it?

Can I join the club? XFX Brand


----------



## $ilent

Gee, I never even noticed it haha, thanks.

Can't remember if it artifacted, but my benchmarks get weird flashes on screen later on when trying to do like 1100/1345, for some reason...


----------



## SoloCamo

Quote:


> Originally Posted by *$ilent*
> 
> Gee i never even noticed it haha thanks.
> 
> Cant remember if it artifacted but my benchmarks get wierd flases on screen later on trying to do like 1100/1345 for some reason...


Try a 25% power limit in Afterburner and adjust the fan curve... my Sapphire is doing 1100/1350 without issue, knock on wood


----------



## $ilent

That was with 50% power limit, I've always had it at 50.

Also, guys, for info: the 290X goes from less than 90 watts at idle to almost 380 watts at full load.


----------



## Euda

We all seem to have similar overclocking chips








Mine is also artifacting at 1100 MHz on stock voltage, mild flashes... but it's benchmark-stable until like 1030 MHz


----------



## devilhead

Quote:


> Originally Posted by *Paulenski*
> 
> So glad I found this thread, so much useful information
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Has anyone tried attaching the MK-26 or Accelero hybrid to it?
> 
> Can I join the club? XFX Brand


Damn, man, is your CPU not overheating? Your rad is full of dust!! (And the whole system; it will be a hard time for the 290X.)


----------



## $ilent

Benchmark stable until 1030? Doesn't it bench at 1100?


----------



## SoloCamo

Quote:


> Originally Posted by *$ilent*
> 
> That was with 50% power limiy, ive always had it at 50.
> 
> Also guys for info the 290x goes from idle less than 90 watts to almost 380 watts at full load.


Try to lower it; I've gotten better results at 25% than I did at 30%... worth a shot.


----------



## $ilent

No psu lol


----------



## _Killswitch_

How do you guys like your 290/290Xs?

I'm looking at new cards for my new build. Just curious how you guys like yours, and what performance/issues you might have had. Curious how Crossfire is working, also.

My choices are SLI 780s (only if the price drops marginally) or Crossfire 290Xs.

But if Aqua Computer makes a waterblock for the 290Xs like this one:
http://www.aquatuning.us/product_info.php/info/p15633_Aquacomputer-kryographics-f-r-GTX-780-acrylic-glass-edition--vernickelte-Ausf-hrung.html

I think I'll be very inclined to go RED again, after so many years of being on the green side =S


----------



## $ilent

After running a very brief BF3 benchmark, I can tell the card will perform.

But at this moment in time it feels like a car with no wheels... you can't use it due to the mismatch of bad drivers and no programs updated to use it, i.e. GPU-Z, MSI AB, Windows in general...


----------



## Euda

Quote:


> Originally Posted by *$ilent*
> 
> Becnhmark stable until 1030? Doesnt it bench at 1100?


Sorry, 1130MHz









And yep I agree, the card feels like a rough diamond at the moment


----------



## Arizonian

Quote:


> Originally Posted by *Paulenski*
> 
> So glad I found this thread, so much useful information
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Has anyone tried attaching the MK-26 or Accelero hybrid to it?
> 
> Can I join the club? XFX Brand
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 


Added to the club as our first XFX entry too. Welcome.

@binormalkilla - submit a pic with OCN name, GPU-Z screen shot w/OCN name and I'll add you as well.

Just need something with name to verify as proof to link it to.

----------



## kcuestag

Quote:


> Originally Posted by *fateswarm*
> 
> For your information, the PSU experts over at realhardtech suggest 1000 on CF, 1300 on 3x and 1600 on 4x.
> 
> http://www.realhardtechx.com/index_archivos/Page362.htm
> 
> They are known to be conservative but they also know their psus.


Good to know, as I do plan on getting a 2nd R9 290X at Christmas and I have a 1000W PSU.

Quote:


> Originally Posted by *Ha-Nocri*
> 
> yep, highly OC'd system with 2x 7970 in CF can consume more than 900W. I did research it when I was thinking to go 580 SLI with my 750W PSU.


With all due respect, I used to run 3x EVGA GTX 580s @ 915MHz core (high OC) and I peaked at 950-1010W on this very same PSU I have right now, so 2x 7970s or 580s have to be far from 900W...


----------



## SoloCamo

My advice to anyone owning one at the moment:

Don't use GPU-Z - only use Afterburner, and try to keep system reporting programs closed for now. As far as gaming and benchmarks go, I've had absolutely no issues; any game I was playing on my overclocked 7970 GE plays the same, just with worlds better minimum and average frames per second.

I can say, considering I've had it essentially since day one, it can only get better from here. Other than some software reporting issues (GPU-Z, I'm looking at you), I could use it as-is, which, considering all the releases I've seen in the past, is a fairly rare thing to say.

Can't wait to see if drivers bring even another 5% out of this thing


----------



## Paulenski

Quote:


> Originally Posted by *devilhead*
> 
> damn man your cpu is not overheating? your rad is full off dust!!(and all system, it will be hard time for 290x)


Haha, yeah I know. I built the system about 11 months ago and I've only gone in a few times since then to change some things. I've done some light cleaning, but I'm planning to take down all the fans and the rad to clean them out. I have the first Zalman AIO, so it has some age on it too; considering upgrading to an H100 or something.

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Paulenski*
> 
> So glad I found this thread, so much useful information
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Has anyone tried attaching the MK-26 or Accelero hybrid to it?
> 
> Can I join the club? XFX Brand
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> Added to the club as our first XFX entry too. Welcome.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @binormalkilla - submit a pic with OCN name, GPU-Z screen shot w/OCN name and I'll add you as well.
> 
> Just need something with name to verify as proof to link it to.
Click to expand...

Here ya go sir, hope it's enough.

http://i.imgur.com/mzFJHrF.jpg
http://i.imgur.com/m1Fc1QY.jpg
http://i.imgur.com/bBwvbc6.png
http://i.imgur.com/XTL8oOA.jpg


----------



## yawa

K, a couple of things so far.

- First, I'm amazed how fast we got volt-modded BIOSes working on these cards; whatever team (or singular person) is responsible deserves a huge standing ovation. And the daredevils currently risking their $550-$600 investment without the safety net of a water block right now deserve another one. So give it up.

- Second, speaking of water blocks, it's amazing to me no one has theirs yet. I hope they come soon and some majorly detailed synopsis posts start popping up. I'm debating ordering one before I even get my R9 290, just to avoid the shipping issues you guys seem to be having.

- Third, I'm hoping that by the time I go to order one, my local brick & mortar Micro Center has a few cards on the shelf, which brings me to this next question: has anyone bought a card at retail yet, and if so, what did the stock look like? I'm anticipating the R9 290s will be much harder to come by by mid-November than the 290Xs, and as such I'm hoping to be able to grab one from a local computer parts store.

Nonetheless, great job guys. When new drivers, Mantle, and water cooling become the norm for these cards, I have a feeling the already impressive performance of these things will go through the roof. Keep it up. And most importantly, have fun while "ridiculing" various overclocked Titans.


----------



## rv8000

Anyone know of any rumors that the 290 will be released with brand custom cooling solutions?


----------



## Lord Xeb

I just found me one. >.>


----------



## tsm106

Quote:


> Originally Posted by *rv8000*
> 
> Anyone know of any rumors that the 290 will be released with brand custom cooling solutions?


The latest thread has the ETA in late Nov.

Quote:


> Originally Posted by *Lord Xeb*
> 
> I just found me one. >.>


Grats dude.


----------



## rv8000

Quote:


> Originally Posted by *tsm106*
> 
> Latest thread has the eta in late Nov.
> Grats dude.










I hate when new tech is released, I'm so impatient







Have to wait and see if the 290 comes in at ~$450 or not as well; if I had the extra cash for WC right now I would've already grabbed a 290X.


----------



## yawa

Quote:


> Originally Posted by *rv8000*
> 
> 
> 
> 
> 
> 
> 
> 
> , I hate when new tech is released, I'm so impatient
> 
> 
> 
> 
> 
> 
> 
> . Have to wait and see if the 290 comes in ~450 or not as well, if I had the extra cash for wc right now wouldve already grabbed a 290x.


Don't feel bad, I'm in the same boat, man. I just hope that by the time it's released and we have the funds, we are able to walk into a local store (like my Micro Center in Cambridge, MA) and find one on the shelves.

If you have some of the funds now, though, you should do what I'm doing and order an R9 290 water block ASAP so you aren't waiting for it to arrive when you get the card. If you're planning on WC'ing, that is.

Either way, I'm here waiting alongside you, so remember you aren't alone.


----------



## Dominican

What driver should I be using?


----------



## rv8000

Quote:


> Originally Posted by *yawa*
> 
> Don't feel bad I'm in the same boat man. Just hope by the time it's released and we have the funds we are able to walk into a local store ( like my Micro center in Cambridge, MA) and find one on the shelves.
> 
> If you have some of the funds now though you should do what I'm doing and order an R290 water block asap so you aren't waiting for it to arrive when you get the card. If you're planning on WC'ing that is.
> 
> Either way I'm here waiting alongside you though, so remember you aren't alone.


All the money I should've spent on custom WC I sunk into case mods and paint







Will have to wait until summer comes around for a custom loop; aftermarket air until then.


----------



## skupples

Those 3DMark benches are weird. Must be tessellation or something. Valley shows much different results than that.


----------



## Lord Xeb

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rv8000*
> 
> Anyone know of any rumors that the 290 will be released with brand custom cooling solutions?
> 
> 
> 
> Latest thread has the eta in late Nov.
> 
> Quote:
> 
> 
> 
> Originally Posted by *Lord Xeb*
> 
> I just found me one. >.>
> 
> Click to expand...
> 
> Grats dude.
Click to expand...

The problem is that I paid about 100 bucks more


----------



## Paul17041993

Subbed, as my next card will be a 290X; have to wait till they are in stock again on PC Case Gear though...

How far have people managed to clock theirs so far? Do we have any duds at all?


----------



## HeadlessKnight

Quote:


> Originally Posted by *skupples*
> 
> Those 3dmark benches are weird. Must be tessalation or something. Valley shows much different results than that .


Benchmarks respond differently to different architectures and drivers. Valley has always been an Nvidia-favored benchmark; 3DMark11 too, but to a lesser extent. Also, if you noticed, the tessellation has not been manipulated in CCC; otherwise 3DMark would be able to detect that.


----------



## szeged

Quote:


> Originally Posted by *Lord Xeb*
> 
> The problem is that i paid about 100 bucks more


Did you grab one off eBay or something? Saw a few on there for $650 lol


----------



## DampMonkey

Quote:


> Originally Posted by *HeadlessKnight*
> 
> Benchmarks act differently to different architectures and drivers. Valley has always been an Nvidia favored benchmark. 3DMark11 too but to a lower extent. Also if you noticed the Tessellation has not been manipulated in CCC, otherwise 3DMark will be able to detect that
> 
> 
> 
> 
> 
> 
> 
> .


You would think that Valley would run OK because it is a Unigine product, and Heaven seems to be an accurate benchmark for both AMD and Nvidia. I've seen Valley benchmark graphs before where the GTX 770 almost hangs with the 290X. Something is definitely up with that bench.


----------



## Mr357

Quote:


> Originally Posted by *DampMonkey*
> 
> You would think that Valley would run ok because it is a Unigine product and Heaven seems to be an accurate benchmark for both AMD and Nvidia. I've seen Valley benchmark graphs before where the GTX770 almost hangs with the 290x in valley. Something is definitely up with that bench


Look how low the 7950 is on that graph.










Something definitely isn't right.


----------



## ABD EL HAMEED

Quote:


> Originally Posted by *Mr357*
> 
> Look how low the 7950 is on that graph.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Something definitely isn't right.


Last time I checked, the 7990 was better than the Titan and the 690... interesting


----------



## Lord Xeb

Quote:


> Originally Posted by *szeged*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Lord Xeb*
> 
> The problem is that i paid about 100 bucks more
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> did you grab one off ebay or something? saw a few on there for $650 lol
Click to expand...

Yeah, I did. I am not waiting for everything to come back in stock.


----------



## Paul17041993

Quote:


> Originally Posted by *DampMonkey*
> 
> You would think that Valley would run ok because it is a Unigine product and Heaven seems to be an accurate benchmark for both AMD and Nvidia. I've seen Valley benchmark graphs before where the GTX770 almost hangs with the 290x in valley. Something is definitely up with that bench


People using the 1x optimization? I think that was something noted about the 7970...


----------



## szeged

Quote:


> Originally Posted by *Lord Xeb*
> 
> Yeah, I did. I am not waiting for everything to come back in stock.


I was going to, but a Sapphire one popped up on Newegg in the middle of the night lol, saved me $100


----------



## kpoeticg

Damnit, I missed it AGAIN!!!!


----------



## jomama22

Next week I'll have 3 for sale. Most likely on Saturday or Sunday.

Back on topic: Valley is only one of a bunch of benchmarks, so I take it as it is in comparison to the others.


----------



## szeged

Quote:


> Originally Posted by *jomama22*
> 
> Next week I'll have 3 for sale. Most likely on Saturday or Sunday.
> 
> Back on topic, valley is only one of a bunch of benchmarks so I tale it as it is in comparison to others.


Selling 290Xs? Why?


----------



## maarten12100

Quote:


> Originally Posted by *Iniura*
> 
> I don't own a 290X (yet), I just posted that to help the people flash there bios
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here in the Netherlands the 290X isn't available yet, you can order but it takes a while before they can deliver them (couple weeks) and all the versions come with BF4.
> I want a version without BF4 because I already pre ordered BF4. There is no sign of 290X's without BF4 here yet, I guess they will become available after the 29th of October when BF4 launched or perhaps along with the release of the 290.
> 
> However I am following this thread closely because I want to buy a 780, 290X or 780Ti in the upcoming weeks, month, just watching everything closely in anticipation of price drops, release, price and performance of 780Ti, Performance of 290X under water, and maybe wait until December on the performance boost gained from Mantle before making the best decision on which GPU to buy.


Same problem, bro.
10+ days. I mean, the Americans complain about availability while the cards are simply unobtainable for us (just like the Titan was, but there we had EVGA).


----------



## Koniakki

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I just ordered a Arctic Cooling Accelero Xtreme III for mine since it's been confirmed that it fits.
> so i'll do some testing on Stock air and then with that, i had to order it from another site though so i have $90 worth of store credit......time to find some more case fans and pretty blue lights


Keep us posted about it. I would love to see what a difference the AC Xtreme III will make..


----------



## tsm106

Quote:


> Originally Posted by *tsm106*
> 
> [email protected] 1250/[email protected] load = 79.1 fps.


Is this still the fastest Valley run? A friend pointed out kaapstad's 74fps on OcUK. C'mon guys, throw up some fps. WTB EK block for 1300MHz!


----------



## szeged

Quote:


> Originally Posted by *tsm106*
> 
> Is this is still the fastest valley run? A friend pointed out kaapstad's 74fps on ocuk. C'mon guys throw up some fps. Wtb ek block for 1300mhz!


Valley loves memory clock bumps. I'm sure if you could push your memory harder you could get 85+ fps pretty easily with the core the same.


----------



## vettefan8

Quote:


> Originally Posted by *Koniakki*
> 
> Keep us posted about it. I would love to see what a difference the AC Xtreme III will make..


I also ordered a 290x which should be here tomorrow and an AC Xtreme III which should be here on Wednesday. Any particular benchmarks or temperature tests that you want me to do with the stock cooler and with the new cooler?


----------



## Arizonian

Well guys the 290X is paying off as I'm preparing for battle.


----------



## Daveleaf

Sorry, debating between the 290 and 290X.

What is better suited for 1080p gaming with the goal of maintaining 120+ fps? (Lightboost monitor)

It will be under water, and will replace a 7870 XT.


----------



## Kokin

I'm excited to see the results. If I cannot afford them, at least I'll be happy to see how they will perform. Keep on sharing pictures and benchmarks!


----------



## pounced

Quote:


> Originally Posted by *Arizonian*
> 
> Well guys the 290X is paying off as I'm preparing for battle.


Same!!!!!!!!! My key finally worked when I got home from work today.

Btw, a question for you guys with more experience: I'm running this card on a system with a 2500K on a Z77 ASRock Extreme6 motherboard, along with 16GB of DDR3 RAM at 1866MHz.

Is my CPU going to bottleneck this card? I know I can only get PCI-E Gen 2 with it, since Sandy Bridge doesn't have PCI-E Gen 3, so I'm wondering if I should invest in a really good CPU that fits this socket (1155).

So far in BF3 I've seen some dips below 60, nothing less than 55, but it has gotten into the high 50s and low 60s a couple of times, and I'm thinking a card of this caliber shouldn't do that.

Edit: I have my 2500K clocked @ 4.4GHz.


----------



## Forceman

Quote:


> Originally Posted by *Arizonian*
> 
> Well guys the 290X is paying off as I'm preparing for battle.


Can you check and see if it includes the China Rising DLC? I'm assuming it would, but it would be nice to be sure.


----------



## SoloCamo

Welp, my coupon definitely isn't going to work considering the letters/numbers are absolutely unreadable... oh well...


----------



## mark_thaddeus

Quote:


> Originally Posted by *Mr357*
> 
> Look how low the 7950 is on that graph.
> 
> 
> Something definitely isn't right.


My 7950 scored 2176 without tweaks or mods, on air; that score is from a stock 7950 at 1920x1080, so the graph seems about right for 2560x1440. I also think (correct me if I'm wrong) that most people who post on that bench recognize that it leans toward the green side, though if you tweak certain settings it gets mitigated.

http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0/1220_20#post_19361022


----------



## KyGuy

Quote:


> Originally Posted by *Daveleaf*
> 
> Sorry, debating between the 290 and 290X.
> 
> What is better suited for 1080p gaming with the goal of maintaining 120+ fps? (Lightboost monitor)
> 
> It will be under water, and will replace a 7870 XT.


I would say that you could get away with a 290. The benchmarks put the 290X at around 35-45 FPS even at Ultra High Definition (UHD). I bought the 290X just because I want to have the latest and greatest, but for a 1080p monitor I would say you could get away with a 290. Even if you are disappointed, you can always overclock. You would probably have to put it under water though... KyGuy


----------



## trippinonprozac

Anyone with a waterblock and custom BIOS on these...

I'm interested to know what the performance is like compared to overclocked Classifieds. I play 90% Battlefield, and it looks as though Mantle alone will be reason enough to change teams.


----------



## Paulenski

So earlier tonight before work I was trying to find the limit of my XFX model on stock voltage with a 50% power limit.

Using Kombustor for basic artifact viewing and then testing in Valley 1.0, I managed to reach 1072 / 1525. You know what's strange: after running Valley and getting a score of 318x, I started up Firestrike and it immediately started artifacting on both monitors and the system crashed. Pretty sure it's the memory clock; I'm surprised it didn't cause artifacting sooner in Kombustor or Valley.

Sent from my SGH-M919 using Tapatalk


----------



## Paulenski

Forgot to mention, I stopped increasing memory because I was maxing out at 92-93C with 80% fan speed. I didn't think I should go much higher, nor could air handle it anymore.

Sent from my SGH-M919 using Tapatalk


----------



## bpmcleod

Have there been any SOLID OCs on water with these yet? I currently own a 780 HC Classy and tbh I haven't seen anything overwhelming with these cards that makes me want to switch over. I have been debating it for the past few days, but when I get to seeing the results they aren't performing anywhere near the 780s OC'd. Anyone able to shed some light on good OC results on water with these? The price difference between these and an HC 780 is still only about $100 once you put them under water, so the difference is negligible, but the only bonus I can see, if they perform close to each other, is that you can quad-fire these and you can't on a 780?


----------



## RocketAbyss

Quote:


> Originally Posted by *bpmcleod*
> 
> Have there been any SOLID OCs on water with these yet? I currently own a 780 HC Classy and tbh I haven't seen anything overwhelming with these cards that makes me want to switch over. I have been debating it for the past few days, but when I get to seeing the results they aren't performing anywhere near the 780s OC'd. Anyone able to shed some light on good OC results on water with these? The price difference between these and an HC 780 is still only about $100 once you put them under water, so the difference is negligible, but the only bonus I can see, if they perform close to each other, is that you can quad-fire these and you can't on a 780?


Not too sure about solid water OCs out there, but I can tell you that if you have an above-average-clocking 780 Classy, it's pointless to switch over to the 290X. The 290X is more viable and aimed towards people who don't have a 780/Titan/7990 etc, imo.


----------



## VSG

Isn't a Hydro Copper 780 Classy $820? With a water block and backplate, the 290X will cost ~$660-680 depending on various options. So that's an average of $150 less per card.


----------



## BababooeyHTJ

Performance pc has blocks in stock, I just ordered two. Can't wait.


----------



## VSG

Thanks for the heads-up. I am going to grab 2 of the acetal blocks + backplates, and was wondering if the EK Terminal is worth it? I have a Maximus VI Formula and the distance between the first 2 PCI-E slots is more than the standard 2 slots.


----------



## Mr357

Not sure if this was mentioned already, but FrozenCPU now has EK 290X backplates. I just ordered mine, and I believe there are now just 11 left.

Link


----------



## Arm3nian

Quote:


> Originally Posted by *Mr357*
> 
> Not sure if this was mentioned already, but FrozenCPU now has EK 290X backplates. I just ordered mine, and I believe there are now just 11 left.
> 
> Link


Meh looks kind of boring, are there other ones?


----------



## Mr357

Quote:


> Originally Posted by *Arm3nian*
> 
> Meh looks kind of boring, are there other ones?


For the time being, no.


----------



## provost

Quote:


> Originally Posted by *Arizonian*
> 
> Well guys the 290X is paying off as I'm preparing for battle.


Drivers are still not mature for the 290X, but how do you like it compared to your 690? Forget about the stock cooler, heat and all those other issues with the 290X, as these can be fixed with water cooling or an aftermarket cooler.


----------



## Arizonian

Quote:


> Originally Posted by *Forceman*
> 
> Can you check and see if it includes the China Rising DLC? I'm assuming it would, but it would be nice to be sure.


It doesn't say anything more than basic BF4, not the Digital Deluxe, sorry. It's grayed out until the launch date. Still a $60 value with our cards.


----------



## RocketAbyss

Quote:


> Originally Posted by *Arizonian*
> 
> It doesn't say anything more than basic BF4, not the Digital Deluxe, sorry. It's grayed out until the launch date. Still a $60 value with our cards.


Yeap, already sold it to my friend on day 1 when I got my 290X.
On a side note: 12hrs to BF4 launch!


----------



## CallsignVega

Hmm, going over a lot of data I am at a crossroads. I am posting this in both the 290X and Titan threads for input. My post is only in reference to Eyefinity/Surround resolutions and 4K data. Virtually all reviews had the 290X at its stock ~1000 MHz and the Titans at stock boost around ~993 MHz, so roughly the same.

Averaging all of those numbers up, the 290X clock for clock is about 6% faster than Titan at these high resolutions. Some of that is due in part to Crossfire on average scaling a tad better than SLI.

I could most likely sell the Titans with waterblocks for around $650 each, which would be pretty much a wash with new 290Xs and water blocks.

So basically I'd need to get my 290Xs up to ~1225 MHz on water in order to match my 1300 MHz Titans at these resolutions. Then it comes down to how smooth the 290Xs are in Crossfire Eyefinity/4K versus SLI. What is the status of the frame pacing drivers?

Basically, I'd be giving up the better power and heat characteristics of the Titans for slightly faster performance. The Titans are already a sunk cost, so that would be a wash with the 290Xs. VRAM I am not worried about; 4GB is fine at these resolutions as I don't run much AA.

NVIDIA has the G-Sync trick up their sleeve, which may make it hard to switch red. Although, there might be a whole new generation of GPUs out before we see that in 4K monitors or anything not 144 Hz TN based. The question of whether I'd run non-Lightboost G-Sync over non-G-Sync Lightboost is also of importance. The 290Xs could run Lightboost but not G-Sync.

If I were buying new cards, it would pretty much be a no-brainer to go with the 290Xs. Since I already have Titans, it makes it much harder. NVIDIA is quite laughable keeping Titans at $1000. Their sales must have dropped to virtually zero over the last few days. One thing I don't like about NVIDIA is they come off to me as more arrogant than their competition, kinda like Apple in their segment. Although, I will say NVIDIA does more R&D in the display field, such as Lightboost and G-Sync, so I guess we must pay a little for that. AMD seems more practical overall.

I guess it can come down to simply whether I want G-Sync or not. Or whether Lightboost, which either can run, would be superior anyways...
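
The back-of-the-envelope math above can be sketched like this (the 6% clock-for-clock figure is an averaged estimate from reviews, not a measured constant):

```python
# Sketch of the clock-for-clock comparison above (all inputs are estimates)
titan_clock_mhz = 1300        # overclocked Titans on water
clock_for_clock_gain = 1.06   # 290X assumed ~6% faster per MHz at high resolutions

# 290X core clock needed to match the overclocked Titans
required_290x_mhz = titan_clock_mhz / clock_for_clock_gain
print(f"290X needs ~{required_290x_mhz:.0f} MHz")  # → 290X needs ~1226 MHz
```

Which lands right on the ~1225 MHz water target mentioned above.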


----------



## sugarhell

Vega, PCPer did a CrossFire review at 4K.

But note the closing thoughts:
Quote:


> As you might be able to tell: we didn't get this second AMD Radeon R9 290X card from AMD. They preferred us to wait a bit for our CrossFire testing and, in particular, our 4K CrossFire testing. But sometimes hardware finds its way to our office; when it does, we test!
> 
> And to be honest, I am not sure why AMD wouldn't have wanted this story out. When I published an article in September that looked at the severe problems that plagued the Radeon HD 7000 series in Eyefinity configurations (and thus tiled 4K ones, by association) that also used CrossFire, we wanted to push the company forward to release a fix sooner rather than later. Today's release of the Radeon R9 290X, based on a new architecture with a completely new CrossFire implementation, proves that the GCN design can be improved! Our testing with the 4K ASUS PQ321Q monitor with a pair of R9 290X cards clearly shows that to be the case.


http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-AMD-Radeon-R9-290X-CrossFire-and-4K-Preview-Testing/Battlefield-


----------



## Blackops_2

Quote:


> Originally Posted by *CallsignVega*
> 
> If I were buying new cards, it would pretty much be a no-brainer to go with the 290Xs. Since I already have Titans, it makes it much harder. NVIDIA is quite laughable keeping Titans at $1000. Their sales must have dropped to virtually zero over the last few days. One thing I don't like about NVIDIA is they come off to me as more arrogant than their competition, kinda like Apple in their segment. Although, I will say NVIDIA does more R&D in the display field, such as Lightboost and G-Sync, so I guess we must pay a little for that. AMD seems more practical overall.
> 
> I guess it can come down to simply whether I want G-Sync or not. Or whether Lightboost, which either can run, would be superior anyways...


Nvidia gives me that vibe as well, or at least when it comes to pricing. Even when their product is beaten, they rarely price it below AMD's counterpart that's beating it. I want the 290 to match the 780 and make the 780 drop to $450, but I think that is overly optimistic knowing Nvidia. It would be great for everyone, but my guess is the 780 drops to 290X pricing and the 780 Ti comes in at $650.
Not that I have a problem going with AMD or the other way. I would just like to see some great pricing on both sides. One of which is already doing so, but I'm still forced to wait for custom cooling.


----------



## CallsignVega

Quote:


> Originally Posted by *sugarhell*
> 
> vegas pcper did a crossfire review at 4k.
> 
> But care on closing thoughts:
> http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-AMD-Radeon-R9-290X-CrossFire-and-4K-Preview-Testing/Battlefield-


Sweet, gonna read it now.

Quote:


> Originally Posted by *Blackops_2*
> 
> Nvidia gives me that vibe as well, or at least when it comes to pricing. Even when their product is beaten, they rarely price it below AMD's counterpart that's beating it. I want the 290 to match the 780 and make the 780 drop to $450, but I think that is overly optimistic knowing Nvidia. It would be great for everyone, but my guess is the 780 drops to 290X pricing and the 780 Ti comes in at $650.
> 
> Not that I have a problem going with AMD or the other way. I would just like to see some great pricing on both sides. One of which is already doing so, but I'm still forced to wait for custom cooling.


Ya, the Titan should have been $799 at launch and should be $599 now. But hey, it's NVIDIA we are talking about!


----------



## Forceman

Quote:


> Originally Posted by *CallsignVega*
> 
> I guess it can come down to simply if I want G-Sync or not. Or whether Lightboost which either can run would be superior anyways...


The other unknown is Mantle. If it is even a 10% boost in the games you play, then that's a pretty big advantage for the 290X. It all comes down to adoption rate though, kind of like G-Sync (plus then you need to buy new monitors).

But if you wanted to sell the Titans, you probably want to do it relatively soon. I imagine the 780 Ti is going to take a lot of value out of the resale market.


----------



## CallsignVega

Reading that PCPer crossfire article, did they lock the 780 at 863 MHz or let it boost up? I hate when they don't explain that, as it makes a big difference.

As for Mantle, I know it can be a good thing. But after reading about all of this new tech and the responses, I think Mantle may be slightly overrated and G-Sync slightly underrated, just from my experience in this business. As for buying new monitors, I never hesitate in that area. Over the last decade I must have owned 30+ monitors.


----------



## Paul17041993

Quote:


> Originally Posted by *CallsignVega*
> 
> -snip-


I would say sell quickly now and hop aboard the red sea, so you get no net loss + a copy of BF4.

As for frame pacing, that was blown out of proportion for the most part; it's now not even noticeable in any scenario. And I've never seen enough tearing (except when fps < 60) to see much point in G-Sync, and I think said tech will only be in new ASUS monitors for now...?


----------



## sugarhell

They just report the base boost. The cards boost normally.


----------



## Scotty99

Quote:


> Originally Posted by *Paul17041993*
> 
> I would say sell quickly now and hop aboard the red sea, so you get no net loss + a copy of BF4.
> 
> As for frame pacing, that was blown out of proportion for the most part; it's now not even noticeable in any scenario. *And I've never seen enough tearing (except when fps < 60) to see much point in G-Sync*, and I think said tech will only be in new ASUS monitors for now...?


That's just a side effect of G-Sync, not the main benefit. I play games rather than benchmark, and G-Sync is the most exciting news I've come across in quite a while. Obviously I haven't seen it in action, but just from hearing people talk about it.

And I assume it's at least going to be ASUS and BenQ to start, with their 144Hz models. Wouldn't surprise me if some older 120Hz panels get G-Sync too.


----------



## VSG

Just bought 2 EK acetal blocks with backplates and a terminal connection. Now I just need to decide between another 290X or a 290 for CFX.


----------



## Arizonian

Quote:


> Originally Posted by *provost*
> 
> Although drivers are still not mature for 290x, but how do you like the 290x compare to your 690? Forgetting about the stock cooler, heat and all those issues with the 290x, as these can be fixed with wc or after market cooler.


Obviously it's a downgrade in performance from my 690, which I had to put into my 2nd rig for 120 Hz 3D Vision gaming, which I still enjoy for playing a new title at least once. That rig sorely needed it more, since FPS gets cut by 45% when gaming in 3D Vision. I still enjoy watching 3D movies too.

I've got an aggressive manual fan setting which goes up to 70% on the 290X while gaming, which keeps full-load temps to 84C running 1100 Core / 1300 Memory and 100% GPU usage.

I'd compare 70% fan on my 290X to being as loud as my 690 at 95% fan. I had to keep temps on the 690 below 72C to keep it from downclocking incrementally by 13 MHz on the core for each 10C step over 72C.

Fan noise does get loud beyond 70%, but unless one is benching there's no reason to have the fan speed any higher. At 100% for benching it's ludicrously loud. Luckily for my ears I'm not a bencher. Real-world gaming is not as bad as people are making it out to be; at 100% fan I can't argue they're right.

On my new 1440p monitor main rig I'm clocked at 1100 Core & 1300 Memory, very stable with no artifacts. I tested Crysis 3 at very high settings with AA disabled and get 45-51 FPS at 84C / 70% fan. Crysis 2 gets 104-120 FPS at the same 84C / 70% fan speed, maxed settings. BF3 Ultra defaults give 52-60 FPS and only reach 78C / 54% fan.

I'm going to see what the 290X Matrix brings and am heavily leaning toward staying with one GPU rather than Xfire. One negative I'm not happy with on this GPU is the power draw, when its performance lies between the 780 and Titan but it draws more juice. With an 850 Watt PSU I feel crossfire isn't going to cut it. I've always wanted to try a Matrix or perhaps a Toxic anyway. If the Toxic is voltage locked then it will be a Matrix, and I'm going to overclock the HELL out of it.
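
The 690's stepped thermal downclocking described above can be modeled roughly like this (the 1019 MHz base clock in the example is an illustrative placeholder, not a figure from the post):

```python
def gtx690_effective_clock(base_mhz, temp_c, threshold_c=72, step_c=10, drop_mhz=13):
    """Rough model of the behavior described above: the core sheds 13 MHz
    for each full 10C step above 72C (simplified; real boost is finer-grained)."""
    if temp_c <= threshold_c:
        return base_mhz
    steps = (temp_c - threshold_c) // step_c
    return base_mhz - steps * drop_mhz

print(gtx690_effective_clock(1019, 71))  # under the threshold: full 1019 MHz
print(gtx690_effective_clock(1019, 85))  # one full 10C step over: 1006 MHz
```

Which is why he kept the 690 under 72C: every 10C past that costs another 13 MHz.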


----------



## skupples

G-Sync, Shield, Lightboost. All things that have helped to jack up the price of products, that's for sure. Now they are working on that epic new PhysX engine. Can't wait to see it in action.
Quote:


> Originally Posted by *geggeg*
> 
> Just bought 2 ek acetal blocks with back plates and a terminal connection. Now I just need to decide between another 290x or a 290 for cfx.


Wouldn't you want another 290X if you already have one? Seems like using a slower GPU would cause the 290X not to hit its true potential... Or is that not an issue with Xfire?


----------



## RocketAbyss

Quote:


> Originally Posted by *Arizonian*
> 
> If Toxic is voltage locked then it will be a Matrix and I'm going to over clock the *HELL* out of it.


Shouldn't it be Volcano?


----------



## GioV

What are everyone's idle temps with the R9 290X? I'm getting anywhere from 45 to 56C when browsing, slightly higher when watching a video. Can anyone confirm that this is normal? All my settings are at default currently, and even though I have a water-cooled computer the radiator fans still provide decent airflow into the case.


----------



## Arizonian

Quote:


> Originally Posted by *GioV*
> 
> What are everyone's idle temps with the R9 290X? I'm getting anywhere from 45 to 56C when browsing, slightly higher when watching a video. Can anyone confirm that this is normal? All my settings are at default currently, and even though I have a water-cooled computer the radiator fans still provide decent airflow into the case.


Idle 45C with 20% fan.
Quote:


> Originally Posted by *RocketAbyss*
> 
> Shouldn't it be Volcano?


Well, the Hawaiian islands were made from volcanoes. I'll OC the volcano out of it. I love vacationing in Hawaii.


----------



## JimmieRustle

Is there a way to unlock the voltage on the Sapphire 290X? MSI AB doesn't let me move the slider even after enabling it, and Trixx doesn't give me the option to.


----------



## Forceman

Quote:


> Originally Posted by *JimmieRustle*
> 
> Is there a way to unlock the voltage in the sapphire 290x? MSI AB doesn't let me move the slider even after enabling it and Trixx doesn't give me the option to.


Not yet. You can flash the Asus BIOS which unlocks voltage control in GPU Tweak, or you can wait for Afterburner to be updated.


----------



## JimmieRustle

Quote:


> Originally Posted by *Forceman*
> 
> Not yet. You can flash the Asus BIOS which unlocks voltage control in GPU Tweak, or you can wait for Afterburner to be updated.


ty


----------



## RocketAbyss

Quote:


> Originally Posted by *GioV*
> 
> What are everyone's idle temps with the R9 290X? I'm getting anywhere from 45 to 56C when browsing, slightly higher when watching a video. Can anyone confirm that this is normal? All my settings are at default currently, and even though I have a water-cooled computer the radiator fans still provide decent airflow into the case.


I'm idling around those temps as well. This is with an ambient room temp of about 27-30C, depending on the time of day and whether the AC is on or not.


----------



## VSG

Quote:


> Originally Posted by *skupples*
> 
> Wouldn't you want another 290X if you already have one? Seems like using a slower GPU would cause the 290x to not hit it's true potential... Or is that not an issue with Xfire?


Nah, this isn't an issue with crossfire. Besides, OCUK has shown you can flash a 290X BIOS onto the 290 and get some more performance from it.


----------



## mboner1

Quote:


> Originally Posted by *Arizonian*
> 
> Obviously it's a downgrade in performance from my 690, which I had to put into my 2nd rig for 120 Hz 3D Vision gaming, which I still enjoy for playing a new title at least once. That rig sorely needed it more, since FPS gets cut by 45% when gaming in 3D Vision. I still enjoy watching 3D movies too.
> 
> I've got an aggressive manual fan setting which goes up to 70% on the 290X while gaming, which keeps full-load temps to 84C running 1100 Core / 1300 Memory and 100% GPU usage.
> 
> I'd compare 70% fan on my 290X to being as loud as my 690 at 95% fan. I had to keep temps on the 690 below 72C to keep it from downclocking incrementally by 13 MHz on the core for each 10C step over 72C.
> 
> Fan noise does get loud beyond 70%, but unless one is benching there's no reason to have the fan speed any higher. At 100% for benching it's ludicrously loud. Luckily for my ears I'm not a bencher. Real-world gaming is not as bad as people are making it out to be; at 100% fan I can't argue they're right.
> 
> On my new 1440p monitor main rig I'm clocked at 1100 Core & 1300 Memory, very stable with no artifacts. I tested Crysis 3 at very high settings with AA disabled and get 45-51 FPS at 84C / 70% fan. Crysis 2 gets 104-120 FPS at the same 84C / 70% fan speed, maxed settings. BF3 Ultra defaults give 52-60 FPS and only reach 78C / 54% fan.
> 
> I'm going to see what the 290X Matrix brings and am heavily leaning toward staying with one GPU rather than Xfire. One negative I'm not happy with on this GPU is the power draw, when its performance lies between the 780 and Titan but it draws more juice. With an 850 Watt PSU I feel crossfire isn't going to cut it. I've always wanted to try a Matrix or perhaps a Toxic anyway. If the Toxic is voltage locked then it will be a Matrix, and I'm going to overclock the HELL out of it.


I'm the same as you, man. I was gonna do 290X crossfire with an 850W PSU, but it seems like the extra 100W over crossfired 7970s makes it a bit more risky. Add in overclocking and it seems even less wise. That makes stock 780s the better option for anyone on an 850W PSU, I think. Which in turn makes me think, why bother? I might as well just grab a 7970, go crossfire, and overclock them both, and I will be laughing; hell, I can pick up a 2nd-hand 7970 for $240 today.
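
The wattage worry is easy to put numbers on; a quick sketch with assumed (not measured) draw figures shows how little headroom an 850W unit leaves:

```python
# Rough PSU budget for 290X crossfire on an 850W unit.
# All draw figures below are illustrative assumptions, not measurements.
psu_watts = 850
per_card_oc_watts = 320      # assumed draw for one overclocked 290X
cpu_board_rest_watts = 200   # assumed CPU + motherboard + drives + fans

total_draw = 2 * per_card_oc_watts + cpu_board_rest_watts
headroom = psu_watts - total_draw
print(total_draw, headroom)  # → 840 10
```

Under those assumptions the whole system would sit almost exactly at the PSU's rating, which is the "isn't going to cut it" concern in a nutshell.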


----------



## GioV

Quote:


> Originally Posted by *Arizonian*
> 
> Idle 45C with 20% fan.


Thanks, can't wait for a waterblock!


----------



## Scotty99

Quote:


> Originally Posted by *mboner1*
> 
> I'm the same as you, man. I was gonna do 290X crossfire with an 850W PSU, but it seems like the extra 100W over crossfired 7970s makes it a bit more risky. Add in overclocking and it seems even less wise. That makes stock 780s the better option for anyone on an 850W PSU, I think. Which in turn makes me think, why bother? I might as well just grab a 7970, go crossfire, and overclock them both, and I will be laughing; hell, I can pick up a 2nd-hand 7970 for $240 today.


You can get a new one for $250:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814131468

Not the best 7970, but still: 3 free games and a warranty.


----------



## tsm106

Quote:


> Originally Posted by *CallsignVega*
> 
> Reading that PcPer crossfire thread, did they lock the 780 at 863 MHz or let it boost up? I hate when they don't explain that as it makes a big difference.
> 
> As for Mantle, I know it can be a good thing. But after reading about all of this new tech and the responses, I think Mantle may be slightly over-rated and G-Sync slightly under-rated. Just from my experience in this business. As for buying new monitors, I never hesitate in that area. Over the last decade I must have owned 30+ monitors.


Cards should boost like normal in the reviews. It would be unusual for them to lock the clocks down, don't you think?

Mantle is going to be around, and a foundation for AMD, for the next decade. I think some want to write it off from the start, but that's partly delusion I would say. Never before has a PC developer had access to a close-to-metal API, and in this instance one that is directly tied to the close-to-metal API they use on the consoles. It's coming; one can wait for it or pretend the future is not going to happen. With its incorporation into Frostbite, there will be a lot of games signed on.

Now, the long game AMD is playing is going to let them not have to operate on the same schedule as Nvidia. Optimistic projections have Mantle giving greater than 30% gains, which would put Hawaii on par with Maxwell. That's crazy! Yes, it sure as hell is, but AMD just released a whole GPU that some mod said could never be done, and for half the price. And in a few months they are going to reward the buyers with Mantle. It's kinda hard to beat that. Personally I have a feeling it will play out as most have supposed it would. And if it does come together like so, what's going to happen to the value of green cards from Xmas to Maxwell?

Or it could fail... anything is possible, but I don't see that happening since that API is the fundamental basis of the two main consoles for the next decade. It's a tough one for you since you already have a silly fast rig. However, that fat bus has massive pixel density written all over it. It's a tough one.


----------



## VSG

Quote:


> Originally Posted by *mboner1*
> 
> I'm the same as you, man. I was gonna do 290X crossfire with an 850W PSU, but it seems like the extra 100W over crossfired 7970s makes it a bit more risky. Add in overclocking and it seems even less wise. That makes stock 780s the better option for anyone on an 850W PSU, I think. Which in turn makes me think, why bother? I might as well just grab a 7970, go crossfire, and overclock them both, and I will be laughing; hell, I can pick up a 2nd-hand 7970 for $240 today.


I was within the Newegg 30-day return period and noticed a good price after discount and rebate on the Corsair 1200i, so I initiated an RMA for my 860i and got the 1200i instead.


----------



## Paul17041993

Quote:


> Originally Posted by *Scotty99*
> 
> That's just a side effect of g-sync, not the main benefit. I play games rather than benchmark and g-sync is the most exciting news ive come across in quite a while, obviously i havent seen it in action but just from hearing people talk about it.
> 
> And i assume its at least going to be asus and benq to start, their 144hz models. Wouldn't surprise me if some older 120hz panels get g-sync too.


< 60 fps isn't ideal anyway. Sure, you could run 30, but input polling takes a hit; the best bet is to keep the framerate >= 60 as much as possible. I get what G-Sync is good for, it's just not really much of a thing to look at yet. For framerates > 60 (or whatever your monitor runs at), if you have tearing in a game a simple fix is to either enable vsync or drop the excess frames (which I think frame pacing does now...?).

Edit: actually, I can see G-Sync being fairly useful on 120-144 Hz screens with cards not able to keep up with them all the time, but this is still a small and experimental market...


----------



## Arizonian

Quote:


> Originally Posted by *GioV*
> 
> Thanks, can't wait for a waterblock!
> 
> 
> Spoiler: Warning: Spoiler!


Added


----------



## Sazz

I have run some Unigine benchmarks on mine (both Valley and Heaven) and I noticed something weird: clocks seem to suddenly drop at random times and go back up to the maximum set clock a split second later, and those dips result in dips in FPS. Maybe it's the drivers not being optimized yet for these benchmarks, or the benchmarks not fully supporting these cards (the benchmarks show temps of 1828378C! LOLOL!), or a combination of both? Even at stock clocks I see those dips; I thought MSI AB was causing it, but it's still there when it's not running.

Doesn't really matter, I don't usually run Unigine benchmarks anyways. I just did it coz someone requested it from me xD


----------



## GioV

Quote:


> Originally Posted by *Sazz*
> 
> I have run some Unigine benchmarks on mine (both Valley and Heaven) and I noticed something weird: clocks seem to suddenly drop at random times and go back up to the maximum set clock a split second later, and those dips result in dips in FPS. Maybe it's the drivers not being optimized yet for these benchmarks, or the benchmarks not fully supporting these cards (the benchmarks show temps of 1828378C! LOLOL!), or a combination of both? Even at stock clocks I see those dips; I thought MSI AB was causing it, but it's still there when it's not running.
> 
> Doesn't really matter, I don't usually run Unigine benchmarks anyways. I just did it coz someone requested it from me xD


The same happened to me; no matter how much you overclock the card, once it starts hitting that 95C and 55% fan speed wall (in Uber mode) it will downclock to maintain a max of 95C.
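
That 95C-target behavior is a throttle loop rather than a fixed clock cut; a toy model of the idea (the step sizes and temperature response below are made-up numbers, purely illustrative):

```python
def throttle_to_target(requested_mhz, temp_c, target_c=95.0,
                       step_mhz=8, cool_per_step_c=0.5, floor_mhz=300):
    """Toy model of temperature-target throttling: shed clock in small
    steps until the (assumed) temperature falls back to the 95C target."""
    clock = requested_mhz
    while temp_c > target_c and clock > floor_mhz:
        clock -= step_mhz          # drop the core clock one step
        temp_c -= cool_per_step_c  # assume each step sheds ~0.5C
    return clock

print(throttle_to_target(1100, 94))  # below target: stays at 1100 MHz
print(throttle_to_target(1100, 96))  # over target: trimmed to 1084 MHz
```

That's why an overclock doesn't help past the wall: the target, not the requested clock, decides what you actually run at.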


----------



## Sazz

Quote:


> Originally Posted by *GioV*
> 
> The same happened to me; no matter how much you overclock the card, once it starts hitting that 95C and 55% fan speed wall (in Uber mode) it will downclock to maintain a max of 95C.


I am actually using a custom fan profile and my temp is maxing out at 78C at 65% fan speed. My clocks are stable and the card is not downclocking itself in other benchmarks like 3DMark 11/Firestrike or while playing games. But with the Unigine benchmarks there are split-second moments where FPS drops to around 20ish and then goes back up as normal, and at those times the clocks dip down randomly; the lowest I've seen it dip is 624MHz, but again those dips are only for a split second.

I'm thinking it's just the driver not working well with Unigine, or Unigine not supporting the 290X well at the moment, or maybe a combination of both? But even with this issue, the performance it puts out is still good.


----------



## szeged

Quote:


> Originally Posted by *Sazz*
> 
> 
> 
> I ran some Unigine benchmarks on mine (both Valley and Heaven) and noticed something weird: the clocks suddenly drop at random times, then go back up to the maximum set clock after a split second, and those dips cause dips in FPS. Maybe it's just the drivers not being optimized yet for these benchmarks, or the benchmarks themselves not fully supporting these cards (they report temps of 1828378C! LOLOL!), or a combination of both? Even at stock clocks I see those dips. I thought MSI AB was causing it, but it's still there when it's not running..
> 
> Doesn't really matter, I don't usually run Unigine benchmarks anyway; I just did it because someone requested it from me xD


Hmm, the 290X really seems to be lagging behind Titans in Valley. Does Valley really prefer Kepler over GCN, or could it be drivers? My Titan without volt softmods can do around 84-85 FPS with the skyn3t BIOS.


----------



## DampMonkey

Quote:


> Originally Posted by *szeged*
> 
> Hmm, the 290X really seems to be lagging behind Titans in Valley. Does Valley really prefer Kepler over GCN, or could it be drivers? My Titan without volt softmods can do around 84-85 FPS with the skyn3t BIOS.


Someone said recently that AMD has never developed a profile for the Valley bench. I don't really know what that means, or if/why/how it would matter. Either way, the graphs are always in Nvidia's favor for Valley, but Heaven seems more even.


----------



## Sazz

Quote:


> Originally Posted by *szeged*
> 
> Hmm, the 290X really seems to be lagging behind Titans in Valley. Does Valley really prefer Kepler over GCN, or could it be drivers? My Titan without volt softmods can do around 84-85 FPS with the skyn3t BIOS.


Do keep in mind I'm using an FX-8350 @ 5GHz; if you're using an Intel i5-2500K or above you'll see better scores. Unigine uses the CPU quite heavily too; for example, if I run it at stock CPU clocks my min and average FPS drop pretty significantly.


----------



## th3illusiveman

Quote:


> Originally Posted by *Scotty99*
> 
> You can get a new one for 250.00:
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814131468
> 
> Not the best 7970 but still, 3 free games and a warranty.


Stay away from that thing! I RMA'd a reference-design card and got that one in return, and it's severely limited by the VRMs they swapped the stock ones for. They look just like the memory chips, and any overclock gets unstable if they go above 70C; the GPU starts throttling when they go above 80C. Oh, and there's a 100mV vdroop issue you need to contend with. Plus the PCB feels extremely cheap and flexes like a thin piece of paper... overall a terrible card for anyone who wants to OC.

The cooler is very quiet though, so if you want to run stock at least it has that going for it....
Quote:


> Originally Posted by *szeged*
> 
> Hmm, the 290X really seems to be lagging behind Titans in Valley. Does Valley really prefer Kepler over GCN, or could it be drivers? My Titan without volt softmods can do around 84-85 FPS with the skyn3t BIOS.


The funny thing is that when Valley released, AMD cards destroyed Nvidia cards because of their wider memory bus. I saw overclocked 7950s easily outscoring overclocked GTX 680s. It seems the tables have turned though, lol.


----------



## DStealth

Quote:


> Originally Posted by *DampMonkey*
> 
> Someone said recently that AMD has never developed a profile for the Valley bench. I don't really know what that means, or if/why/how it would matter. Either way, the graphs are always in Nvidia's favor for Valley, but Heaven seems more even.


And they shouldn't, because there's no tessellation in Valley to manipulate... nothing to be "optimized".








Although the card seems well positioned price/performance-wise, especially for those who can withstand the noise/heat.







No flaming here, just giving my honest opinion... these cards shouldn't have been produced before the 20nm process was available.


----------



## Falkentyne

Quote:


> Originally Posted by *GioV*
> 
> The same happened to me: no matter how much you overclock the card, once it starts hitting that 95C / 55% fan speed wall (in Uber mode) it will downclock to maintain a max of 95C.


Guys, I said it before:
OVERRIDE THE FAN SPEED with Afterburner (even Trixx works)! Then you can set the fan speed just like on 7970's, up to 100%.

No problems with any sort of throttling at 1100MHz and 70% fan speed (max temps in Valley were 86C).
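For anyone rolling their own profile, a custom fan curve boils down to linear interpolation between temperature/speed points. Here's a minimal sketch of the idea; the curve points below are made up for illustration and this is not Afterburner's actual internals:

```python
# Hypothetical fan curve: (temp_C, fan_percent) points, low to high.
# Tune these to your own card; they're example values only.
CURVE = [(40, 20), (60, 45), (78, 65), (90, 85), (95, 100)]

def fan_percent(temp_c: float) -> float:
    """Linearly interpolate the fan speed between adjacent curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # Fraction of the way between t0 and t1, applied to fan range
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]

print(fan_percent(78))   # 65.0
print(fan_percent(100))  # 100
```

The point of the steeper segment near 90C is to ramp hard before the 95C throttle point, which is exactly what the manual overrides above are doing by hand.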


----------



## fleetfeather

What sort of fan is in the reference cooler (dimensions, blade type)? Has anyone looked into replacing it with an aftermarket one that might be quieter?


----------



## Sazz

Quote:


> Originally Posted by *fleetfeather*
> 
> What sort of fan is in the reference cooler (dimensions, blade type)? Has anyone looked into replacing it with an aftermarket one that might be quieter?


TBH I don't believe it's the fan. Any blower-style cooler tends to be loud, and the fan on this one seems similar to the one used on the 7970, which is OK noise-wise, but the fins on the cooler itself might generate more noise than the 7970's as air passes through them. Really though, I don't see the problem with the "noise": in my experience I max out at 78C at 65% fan speed at stock clocks, and at that level it's still quieter than my XFX HD6870 Core Edition at 100% fan speed, which is, I can't really say loud, but more audible than the 290X fan at 65%.


----------



## fleetfeather

ahh okay, fair enough!









What are your ambient temps currently?


----------



## Sgt Bilko

Well, my 290X is finally in the post and I'm expecting it within the next 2-3 days. I have my GPU in one parcel, my Accelero Xtreme III in another, and in a third my VRAM heatsinks and thermal paste... any bets on which one turns up first?


----------



## Sazz

Quote:


> Originally Posted by *fleetfeather*
> 
> ahh okay, fair enough!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What are your ambient temps currently?


Ambient temp of 25C


----------



## Paul17041993

Quote:


> Originally Posted by *DStealth*
> 
> And they shouldn't, because there's no tessellation in Valley to manipulate... nothing to be "optimized".
> 
> 
> 
> 
> 
> 
> 
> 
> Although the card seems well positioned price/performance-wise, especially for those who can withstand the noise/heat.
> 
> 
> 
> 
> 
> 
> 
> No flaming here, just giving my honest opinion... these cards shouldn't have been produced before the 20nm process was available.


Consider that the 290X only uses a little more power than a slightly overclocked 7970 while giving much more performance, especially in Eyefinity and 4K, which is what the card is built for. And 28nm was chosen on purpose for scalability anyway; one fact many don't realize is that while smaller nodes slightly increase efficiency, they drastically reduce integrity and overall scalability...


----------



## fleetfeather

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well, my 290X is finally in the post and I'm expecting it within the next 2-3 days. I have my GPU in one parcel, my Accelero Xtreme III in another, and in a third my VRAM heatsinks and thermal paste... any bets on which one turns up first?


Did you order both through PCCG? If so, whichever order you put through first.








Quote:


> Originally Posted by *Sazz*
> 
> Ambient temp of 25C


nice! my ambient air intake is 25C as well, so hopefully I'm in the same boat.


----------



## Sgt Bilko

Quote:


> Originally Posted by *fleetfeather*
> 
> did you order both through PCCG? If so, whichever order you put through first
> 
> 
> 
> 
> 
> 
> 
> 
> nice! my ambient air intake is 25C as well, so hopefully I'm in the same boat.


290X, VRAM heatsinks and the thermal paste/adhesive through PCCG; the Accelero Xtreme III had to come through Scorptech, seeing as PCCG only goes up to the Xtreme II.

The 290X was ordered first, but I got all 3 e-mails within the space of 10 minutes, so I kind of expect AusPost to screw up somewhere... they usually do. Pity it didn't ship on Friday like all the others.


----------



## Paul17041993

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 290x, ... through PCCG, ...


steal my chance of getting one soon why don't you...


----------



## Arizonian

I just did a test in the BF3 campaign at ultra settings with 1100/1300 and left the fan on auto in the Uber BIOS.

100% GPU usage - 90-92C - 55% fan - core hovered at 1058-1100MHz - memory pegged at 1300MHz the whole time. 58-60 FPS average at 1440p.

Temps are not causing downclocks. If I use my manual fan profile with a 70% cap, it brings temps down to 84C. At 70% it sounds no worse than my 690 or 680 at full load.

Again, it idles at 45C with the fan at 20% on the desktop.

Consistent performance while gaming regardless of temps. You guys getting WBs are going to take this wide open. Interested in seeing what you'll get.


----------



## selk22

This is my plan for the expansion of the h220 to cool a 3930k 4.6 1.32 and the 290x


Spoiler: Warning: Spoiler!



http://www.frozencpu.com/products/15684/ex-res-359/Aquacomputer_Aquabox_Professional_525_Bay_Reservoir_-_Black_Delrin.html?tl=g30c97s168#blank 50$

http://www.swiftech.com/truflextubing.aspx 8$

http://www.swiftech.com/Helix140mmfan.aspx 11$

http://www.frozencpu.com/products/16094/ex-rad-403/Swiftech_MCRx40_Quiet_Power_Single_140mm_Radiator_-_Black_MCR140-QP.html?tl=g30c95s929#blank 50$

http://www.swiftech.com/3-8x5-8inch-lokseal-compression-fitting.aspx x6 30$

http://www.frozencpu.com/products/21663/ex-blc-1567/EK_Radeon_R9-290X_VGA_Liquid_Cooling_Block_-_Acetal_Nickel_EK-FC_R9-290X_-_AcetalNickel.html?tl=g30c309s2073 120$



I think I should have this as soon as I sell my GTX 580


----------



## kcuestag

Quote:


> Originally Posted by *Arizonian*
> 
> I just did a test in BF3 campaign ultra settings with 1100/1300 and left the fan on auto in uber BIOS.
> 
> 100% GPU usage - 90-92C - 55% fan - hovered 1058-1100 MHz Core - pegged 1300 MHz Memory the whole time. 58-60 FPS average on 1440p.
> 
> Temps are not causing down clocks. If I use my manual fan settings 70% fan cap it brings temps to 84C. At 70% its no worse sounding than my 690 or 680 full load.
> 
> Again, idle 45C at 20% fan on desktop.
> 
> Consistent performance while gaming regardless of temps. You guys getting WBs are going to take this wide open. Interested in seeing what you'll get.


My card is on delivery right now, and waterblock should be here in 2-3 days!


----------



## jjjc_93

Anybody using the Asus PT3 BIOS should keep in mind that the voltage actually runs higher than what you're setting. To combat vdroop, a set 1.3V actually runs at 1.32-1.33V; this is from monitoring voltages with a DMM. PT1 should be a better BIOS for daily users in the sense that the card won't prop up voltages.


----------



## Mas

Just picked up R9 290X on the way home from work and installed it a couple of minutes ago. Shop only had one card so will have to wait for crossfire. When I get the second card I'll put in the order for some water blocks. Already have a 480 rad waiting, going to do a loop just for the GPUs. Will fit easily in the 900D I have ordered which should arrive by end of week.

1. http://www.techpowerup.com/gpuz/fucez/

2. Gigabyte R9 290X

3. Stock cooling for the moment, will put it under water soon.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Paul17041993*
> 
> steal my chance of getting one soon why don't you...


I managed to snag one of the last Sapphire BF4 editions last Thursday... I take it you didn't get an ASUS one today then?

Scorptech had some Gigabyte and PowerColor cards coming back into stock, IIRC.

Quote:


> Originally Posted by *jjjc_93*
> 
> Anybody using the Asus PT3 BIOS should keep in mind that the voltage actually runs higher than what you're setting. To combat vdroop, a set 1.3V actually runs at 1.32-1.33V; this is from monitoring voltages with a DMM. PT1 should be a better BIOS for daily users in the sense that the card won't prop up voltages.


Thanks for the heads up. I'll probably run PT1 on mine; I just want to try it out, then go back to Sapphire's BIOS for gaming.


----------



## fleetfeather

Quote:


> Originally Posted by *Sgt Bilko*
> 
> snip


Scorptech are bloody quick. I bought a UPS through them and got it the next day. I'll put 5 bucks on Scorptech's package arriving before PCCG's.









I'm starting to regret buying my cards in the States now. The ETA on my two 290X's + my PSU is the 8th of Nov, haha...


----------



## fleetfeather

Quote:


> Originally Posted by *Paul17041993*
> 
> steal my chance of getting one soon why don't you...


mwave and umart are in stock iirc


----------



## famich

Quote:


> Originally Posted by *Arizonian*
> 
> I just did a test in BF3 campaign ultra settings with 1100/1300 and left the fan on auto in uber BIOS.
> 
> 100% GPU usage - 90-92C - 55% fan - hovered 1058-1100 MHz Core - pegged 1300 MHz Memory the whole time. 58-60 FPS average on 1440p.
> 
> Temps are not causing down clocks. If I use my manual fan settings 70% fan cap it brings temps to 84C. At 70% its no worse sounding than my 690 or 680 full load.
> 
> Again, idle 45C at 20% fan on desktop.
> 
> Consistent performance while gaming regardless of temps. You guys getting WBs are going to take this wide open. Interested in seeing what you'll get.


Couldn't agree more: I tried the ASUS BIOS + ASUS GPU Tweak, and my Sapphire was able to pass Heaven Extreme @ 1160MHz @ 1.3V, but I had to put the fan at 70%: it was like a hairdryer.








But here in the EU, EK has sold out of all waterblocks.

IMHO even the GTX 780 Ti, which would supposedly surpass the Titan by 5% (rumored), will not overcome this GPU. But it's really asking for WC and more voltage control.
BTW, an awesome thread with lots of info. Thanks!!!


----------



## Fezlakk

Hey fleetfeather, I am a fellow Aussie. PCCG do their best, but when I look at the prices our American cousins pay for hardware it makes me cry a little. Do you buy your components from NCIX CA or Amazon, or do you use an import service?

I ask because I'm looking into a new rig when non-reference 290X's start appearing (a pair of Toxics would fit the bill nicely). Do you get smashed by import tax if it's over $1k in a single purchase? That thought is troubling me.

Thanks


----------



## fleetfeather

Quote:


> Originally Posted by *Fezlakk*
> 
> Hey fleetfeather, I am a fellow Aussie. PCCG do their best, but when I look at the prices our American cousins pay for hardware it makes me cry a little. Do you buy your components from NCIX CA or Amazon, or do you use an import service?


Heya mate. I've bought tech gear from Amazon before, but I bought my 290X's and PSU from BLT and Provantage because they matched Newegg's prices (in fact my PSU was cheaper than Newegg). I split the purchases across two different stores to make sure I wouldn't get hit with import tax. It can work out well to purchase things from Canada and the States, but the shipping and exchange rates can add up quickly. I've found importing only works out better if you're a) making a sizable purchase, and b) buying tech gear that's covered under international warranty.


----------



## jjjc_93

Quote:


> Originally Posted by *Fezlakk*
> 
> Hey fleetfeather, I am a fellow Aussie. PCCG do their best, but when I look at the prices our American cousins pay for hardware it makes me cry a little. Do you buy your components from NCIX CA or Amazon, or do you use an import service?


PCCG are actually one of the more expensive places from what I've seen; I get most stuff cheaper locally, and that's without shipping factored in. Even CPUs are slightly cheaper local. You'd think with their higher buying/selling volume they'd be more competitive.


----------



## fleetfeather

Quote:


> Originally Posted by *jjjc_93*
> 
> PCCG are actually one of the more expensive places from what I have seen, I get most stuff locally cheaper and that's without shipping factored in. Even CPUs are slightly cheaper local, you'd think with their higher buying/selling volume they would be more competitive.


Are you buying at wholesale or something? I knew PCCG wasn't the cheapest (I think they feel the customer service they provide is worth the extra mark-up), but I thought Umart + Scorptech was as cheap as it got for us.


----------



## Chickenman

One of the local stores will do cost + $1 here, unfortunately for me they are absolutely useless and have never successfully sold me anything in spite of me trying to throw money at them.


----------



## Sgt Bilko

Quote:


> Originally Posted by *fleetfeather*
> 
> Scorptech are bloody quick. I bought a UPS through them and got it the next day. I'll put 5 bucks on Scorptech's package arriving before PCCG's.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm starting to regret buying my cards in the States now. The ETA on my two 290X's + my PSU is the 8th of Nov, haha...


I was thinking about the States, but two things made me buy local:

1. RMA... just in case; shipping back to the US is a pain and probably wouldn't be covered.

2. By the time I factored in shipping (even with a US address) it was only going to be around $30-$50 extra to buy local, so I didn't mind.
Quote:


> Originally Posted by *fleetfeather*
> 
> mwave and umart are in stock iirc


Umart, PCCG, Scorptech and Mwave are all out of stock; Scorptech is the only one with an actual restock date.
Quote:


> Originally Posted by *Fezlakk*
> 
> Hey fleetfeather, I am a fellow Aussie. PCCG do their best, but when I look at the prices our American cousins pay for hardware it makes me cry a little. Do you buy your components from NCIX CA or Amazon, or do you use an import service?
> 
> I'm looking into a new rig when non-reference 290X's start appearing (a pair of Toxics would fit the bill nicely). Do you get smashed by import tax if it's over $1k in a single purchase? That thought is worrying me.


You're better off going with a local retailer IMO, even if it's just for the warranty side of things. I've had to return RAM overseas and that was a nightmare, never again... unless the prices are just completely stupid (like Razer's AUS store, for example).


----------



## Fezlakk

Thanks heaps mate, REP+. Yeah, I was worried about the dreaded import tax. When some non-reference 290X's show their pretty (and hopefully much cooler) heads, I'll look into a new rig with a pair of them.


----------



## fleetfeather

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 1. RMA.....just in case, shipping back to the US is a pain and probably wouldn't be covered


I forgot to add that I arranged with both Provantage and BLT to have the cards and PSU tested before shipping, to make sure nothing shipped already dead in the box. Provantage covers the cost of my DOA shipping (they're sending one of my 290X's and the PSU); BLT won't (but then again, how do they know which 290X is theirs, hehe...). My warranty will be void once I slap a block on anyway, so for me it's more just a matter of shipping damage.

@ Fezlakk no problem mate, all the best with whichever route you take


----------



## Akula

Anyone got their 290X under water yet? If so, head over to the GK110 vs Hawaii Bench thread and get the ball rolling with some serious clocks!


----------



## Fezlakk

Quote:


> Originally Posted by *Akula*
> 
> Anyone got their 290x underwater yet? if so head over to the GK110 v Hawaii Bench thread and get the ball rolling with some serious clocks!


I'm also keen to see this. I wonder if it has what it takes (once it's settled in a bit, obviously) to beat Alatar's 1400MHz Valley run...


----------



## Sgt Bilko

Quote:


> Originally Posted by *fleetfeather*
> 
> I forgot to add that I arranged with both Provantage and BLT to have the cards and PSU tested before shipping, to make sure nothing shipped already dead in the box. Provantage covers the cost of my DOA shipping (they're sending one of my 290X's and the PSU); BLT won't (but then again, how do they know which 290X is theirs, hehe...). My warranty will be void once I slap a block on anyway, so for me it's more just a matter of shipping damage.
> 
> @ Fezlakk no problem mate, all the best with whichever route you take


Well that's different then









Good call on your part. I had my first Crosshair V turn up DOA... and I mean dead: no POST, no voltage... nothing. That was the last time I bought through Megaware (that and the stupidly high prices).

And PCCG is actually cheaper than Scorptech from what I'm seeing, on average by around $10-20. And Scorptech uses StarTrack to post... never had one good experience with them.

Austin Computers in Perth might have some 290X's: http://www.austin.net.au/video-graphics-cards/amd-r9-290-series.html
It just says not available online, so you might be able to call them and get lucky?

On topic: I read somewhere a few pages back that only the Sapphire BF4 290X's have Hynix memory; can anyone confirm this?

Also, anyone on water yet?


----------



## Paul17041993

Quote:


> Originally Posted by *Arizonian*
> 
> I just did a test in BF3 campaign ultra settings with 1100/1300 and left the fan on auto in uber BIOS.
> 
> 100% GPU usage - 90-92C - 55% fan - hovered 1058-1100 MHz Core - pegged 1300 MHz Memory the whole time. 58-60 FPS average on 1440p.
> 
> Temps are not causing down clocks. If I use my manual fan settings 70% fan cap it brings temps to 84C. At 70% its no worse sounding than my 690 or 680 full load.
> 
> Again, idle 45C at 20% fan on desktop.
> 
> Consistently perfoming regardless of temps gaming. You guys getting WB's are going to take this wide open. Interested in seeing what we'll see.


Theoretically they shouldn't throttle anyway, provided you set the max to 95C, which you seemed to stay under. Of course, lowering the max temp will/should cause throttling unless it's under water/sub-zero...
Quote:


> Originally Posted by *Sgt Bilko*
> 
> I managed to snag one of the last Sapphire BF4 editions last Thursday........i take it you didn't get an ASUS one today then?


Yeah, I missed that before I even saw it, but I wouldn't have spent the extra 60-70 bucks on it anyway, especially after my previous experience with ASUS graphics support...


----------



## Iniura

Quote:


> Originally Posted by *selk22*
> 
> This is my plan for the expansion of the h220 to cool a 3930k 4.6 1.32 and the 290x
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://www.frozencpu.com/products/15684/ex-res-359/Aquacomputer_Aquabox_Professional_525_Bay_Reservoir_-_Black_Delrin.html?tl=g30c97s168#blank 50$
> 
> http://www.swiftech.com/truflextubing.aspx 8$
> 
> http://www.swiftech.com/Helix140mmfan.aspx 11$
> 
> http://www.frozencpu.com/products/16094/ex-rad-403/Swiftech_MCRx40_Quiet_Power_Single_140mm_Radiator_-_Black_MCR140-QP.html?tl=g30c95s929#blank 50$
> 
> http://www.swiftech.com/3-8x5-8inch-lokseal-compression-fitting.aspx x6 30$
> 
> http://www.frozencpu.com/products/21663/ex-blc-1567/EK_Radeon_R9-290X_VGA_Liquid_Cooling_Block_-_Acetal_Nickel_EK-FC_R9-290X_-_AcetalNickel.html?tl=g30c309s2073 120$
> 
> 
> 
> I think I should have this as soon as I sell my GTX 580


I don't know 100% for sure, but I wouldn't take that Swiftech tubing, as it might plasticize. The tubing watercoolers prefer at the moment is Primochill Advanced LRT; it won't plasticize and is probably the best on the market right now. You could also go with EK-Tube ZMT Matte Black or Tygon A-60-G Norprene tubing; those are made of EPDM and Norprene (a sort of rubber that cannot plasticize).

Also, are you planning to cool your CPU and GPU with one 140mm rad? If so, that won't cool enough. Take a look at the watercooling club thread; the general rule of thumb is 120mm of rad space to begin with, plus another 120mm for every component you add (CPU or GPU). If you'd like to keep your fan RPM low and still have good cooling, I'd advise at least 2x 240mm radiators, or 1x 360mm + 1x 120mm. Ask in the WC thread how much rad space you need; they know exactly what's required.

You also need a pump, and you've got to plan your loop so you know how many and what kind of fittings to buy. You also need to think about which coolant you're going to use and everything that comes with it, like distilled water + biocide or distilled water + a kill coil. Or you could go with Mayhems products for cooling.

I suggest you take a look in this thread; there's a lot of information to be gained, and I'm sure the guys there are willing to help you out with your loop and any questions you have.

http://www.overclock.net/t/584302/ocn-water-cooling-club-and-picture-gallery/51650_50
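To make the rule of thumb above concrete, here's a quick back-of-the-envelope sketch (a hypothetical helper, just illustrating the 120mm-baseline-plus-120mm-per-component guideline; real loops also depend on fin density, fan speed and airflow):

```python
def min_rad_space_mm(num_components: int) -> int:
    """Rough minimum radiator space in mm of 120mm-fan equivalents:
    120mm baseline plus 120mm per water-cooled component (CPU or GPU)."""
    return 120 + 120 * num_components

# CPU + one GPU -> 360mm total, e.g. one 360 rad or a 240 + 120
print(min_rad_space_mm(2))  # 360
```

By that yardstick, a single 140mm rad falls well short of a CPU + 290X loop, which is the point being made above.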


----------



## Arizonian

Quote:


> Originally Posted by *Mas*
> 
> Just picked up R9 290X on the way home from work and installed it a couple of minutes ago. Shop only had one card so will have to wait for crossfire. When I get the second card I'll put in the order for some water blocks. Already have a 480 rad waiting, going to do a loop just for the GPUs. Will fit easily in the 900D I have ordered which should arrive by end of week.
> 
> 1. http://www.techpowerup.com/gpuz/fucez/
> 
> 2. Gigabyte R9 290X
> 
> 3. Stock cooling for the moment, will put it under water soon.


Added the first Gigabyte....welcome aboard.


----------



## selk22

Quote:


> Originally Posted by *Iniura*
> 
> I don't know 100% for sure, but I wouldn't take that Swiftech tubing, as it might plasticize. The tubing watercoolers prefer at the moment is Primochill Advanced LRT; it won't plasticize and is probably the best on the market right now. You could also go with EK-Tube ZMT Matte Black or Tygon A-60-G Norprene tubing; those are made of EPDM and Norprene (a sort of rubber that cannot plasticize).
> 
> Also, are you planning to cool your CPU and GPU with one 140mm rad? If so, that won't cool enough. Take a look at the watercooling club thread; the general rule of thumb is 120mm of rad space to begin with, plus another 120mm for every component you add (CPU or GPU). If you'd like to keep your fan RPM low and still have good cooling, I'd advise at least 2x 240mm radiators, or 1x 360mm + 1x 120mm. Ask in the WC thread how much rad space you need; they know exactly what's required.
> 
> You also need a pump, and you've got to plan your loop so you know how many and what kind of fittings to buy. You also need to think about which coolant you're going to use and everything that comes with it, like distilled water + biocide or distilled water + a kill coil. Or you could go with Mayhems products for cooling.
> 
> I suggest you take a look in this thread; there's a lot of information to be gained, and I'm sure the guys there are willing to help you out with your loop and any questions you have.
> 
> http://www.overclock.net/t/584302/ocn-water-cooling-club-and-picture-gallery/51650_50


I don't think you read my post... I'm going to expand the H220, which already includes a 220 rad, pump and res. I'll be adding the components listed to the loop I already have, but I appreciate the help. The Swiftech tubing is the same tubing that's on my H220 right now, just in white, and if I can afford to I'll definitely go with the Primochill.


----------



## skupples

Quote:


> Originally Posted by *DampMonkey*
> 
> You would think that Valley would run OK because it's a Unigine product, and Heaven seems to be an accurate benchmark for both AMD and Nvidia. I've seen Valley benchmark graphs before where the GTX 770 almost hangs with the 290X in Valley. Something is definitely up with that bench.
> 
> 
> Spoiler: Warning: Spoiler!


Also, this is a 1600p comparison, for those scratching your heads over some of the info.

It could be many things: NVCP tweaks (some of them not so allowed, like LOD bias), and drivers are a huge issue with some of these benches, especially in tri/quad.


----------



## kcuestag

My Sapphire just arrived!















Add me to the club!









Also, how do I know if mine's set to Uber or Quiet? The switch was on the left side and I flipped it to the right; I'm not quite sure which mode my card is in right now.


----------



## maarten12100

Quote:


> Originally Posted by *kcuestag*
> 
> My Sapphire just arrived!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Add me to the club!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, how do I know if mine's set to Uber or Quiet? My switch was on the left side, I flipped it to the right side, not quite sure what mode my card is on right now.


Towards the back of the card (towards the power connectors) is Uber; towards the front of the card (towards the display connectors) is Quiet.


----------



## kcuestag

Quote:


> Originally Posted by *maarten12100*
> 
> Towards the back of the card (towards the power connectors) is Uber; towards the front of the card (towards the display connectors) is Quiet.


Thanks, I'm on Uber then.









Going to do some gaming on it now.


----------



## RocketAbyss

Quote:


> Originally Posted by *kcuestag*
> 
> Thanks, I'm on Uber then.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Going to do some gaming on it now.


Battlefield 4 in 40mins here! Not too sure about you guys but I'm gonna push this 290X!


----------



## vettefan8

Looks like Nvidia is dropping their prices tomorrow.
GTX 780 goes from $649 to $499
GTX 770 2GB goes from $399 to $329

The GTX 780 Ti comes out November 7 for $699.

http://www.pcper.com/news/Graphics-Cards/NVIDIA-Drops-GTX-780-GTX-770-Prices-Announces-GTX-780-Ti-Price

The 780 and 770 2GB also come with the Assassin's Creed IV: Black Flag, Tom Clancy's Splinter Cell: Blacklist Deluxe Edition and Batman: Arkham Origins game bundle.

http://www.geforce.com/get-the-ultimate-bundle

My 290x arrives today and I have no regrets.


----------



## Arizonian

Quote:


> Originally Posted by *kcuestag*
> 
> My Sapphire just arrived!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Add me to the club!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, how do I know if mine's set to Uber or Quiet? My switch was on the left side, I flipped it to the right side, not quite sure what mode my card is on right now.


Awesomeness. Got you added to the club. Great pics.









On OP page.

Quiet Mode - Switch is in position closest to where you plug in your displays. Fan is at 40% cap.

Uber Mode - Switch is in position furthest away to where you plug in your displays. Fan is at 55% cap.


----------



## jerrolds

Quote:


> Originally Posted by *CallsignVega*
> 
> Hmm, going over a lot of data I am at a crossroads. I am posting this in both the 290X and Titan threads for input. My post is only in reference to Eyefinity/Surround resolution and 4K data. Virtually all reviews had the 290X at it's stock ~1000 MHz and Titans stock boost around ~993 MHz, so roughly the same.
> 
> Averaging all of those numbers up, the 290X clock-for-clock is about 6% faster than Titan at these high resolutions. Some of that is due in part to Crossfire on average scaling a tad better than SLI.
> 
> I could most likely sell the Titan's with waterblocks for around $650 each, which would be pretty much a wash with new 290X's and water blocks.
> 
> So basically I'd need to get my 290X's up to ~1225 MHz on water in order to match my 1300 MHz Titans at these resolutions. Then it comes down to how smooth 290X's are in Crossfire Eyefinity/4K versus SLI. What is the status of the frame pacing drivers?
> 
> Basically, I'd be giving up the better power and heat characteristics of the Titans for slightly faster performance. The Titans are already a sunk cost, so that would be a wash with the 290X's. VRAM I am not worried about; 4GB is fine at these resolutions as I don't run much AA.
> 
> NVIDIA has the G-Sync trick up their sleeve, which may make it hard to switch red. Although, there might be a whole new generation of GPU's out before we see that in 4K monitors or anything non 144 Hz TN based. The question of whether I'd run non-Lightboost G-Sync over non G-Sync Lightboost is also of importance. The 290X's could run Lightboost but not G-Sync.
> 
> If I were buying new cards, it would be pretty much a no-brainer to go with the 290Xs. Since I already have Titans, it makes it much harder. NVIDIA is quite laughable keeping Titans at $1000; their sales must have dropped to virtually zero over the last few days. One thing I don't like about NVIDIA is they come off to me as more arrogant than their competition - kinda like Apple in their segment. Although, I will say NVIDIA does more R&D in the display field, such as Lightboost and G-Sync, so I guess we must pay a little for that. AMD seems more practical overall.
> 
> I guess it can come down to simply if I want G-Sync or not. Or whether Lightboost which either can run would be superior anyways...


Not worth the switch imo - Crossfire is still not as good as SLI; much better than before, but still a bit flaky in some games. PCI-e apparently helps with Eyefinity tearing as well, but for a 6% boost - assuming you can hit 1225MHz on both cards under water (probably a coin toss) - plus buying and waiting for waterblocks... I dunno, to me it's not worth the hassle.

Are you running Lightboost or 120Hz IPS/PLS nowadays? I'd hold off and see what G-Sync has to offer. Unless you're really REALLY into BF4 and Mantle really lives up to the hype... then it might be worth it.

Then again - 290Xs with waterblocks might have more resale value if Mantle takes off... plus if you're the one selling, people will buy from Vega.


----------



## Arizonian

Quote:


> Originally Posted by *vettefan8*
> 
> Looks like Nvidia is dropping their prices tomorrow.
> GTX 780 goes from $649 to $499
> GTX 770 2GB goes from $399 to $329
> 
> GTX 780TI comes out November 7 for $699
> 
> http://www.pcper.com/news/Graphics-Cards/NVIDIA-Drops-GTX-780-GTX-770-Prices-Announces-GTX-780-Ti-Price
> 
> The 780 and 770 2GB also include Assassin's Creed IV Black Flag, Tom Clancy's Splinter Cell Blacklist Deluxe Edition, and Batman: Arkham Origins gaming bundle
> 
> http://www.geforce.com/get-the-ultimate-bundle
> 
> My 290x arrives today and I have no regrets.


This is a surprise because I didn't think Nvidia had it in them. They must have finally felt threatened.

It's a bit of a bummer since one can run two 780s on an 850 watt PSU, whereas I'd have to upgrade my 850 watt PSU if I'd like to Crossfire, and I don't want to spend an extra $330 on an AX1200i. Synthetic benchmarks aside, the 290X seems faster, so the $50 lower price tag balanced that out.

I have a feeling AMD will drop another $50 to match it on the 290X, and we might see the 290 launch at $50 less than originally expected right off the bat.

Wish this power draw wasn't so high.


----------



## Slomo4shO

Quote:


> Originally Posted by *Arizonian*
> 
> Wish this power draw wasn't so high.


Competition is a beautiful thing









Hopefully this will bring the top tier cards at $450 or lower by the end of the year as they should have been to begin with.


----------



## kcuestag

Quote:


> Originally Posted by *Slomo4shO*
> 
> Competition is a beautiful thing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hopefully this will bring the top tier cards at $450 or lower by the end of the year as they should have been to begin with.


Agreed, that way it'll be easier for me to get my 2nd 290X at Christmas.


----------



## VSG

The 1200i is $300 on rebate and even less with the newegg mobile promo codes. It ended up being $275 for me after rebate when I bought it yesterday.


----------



## CallsignVega

Quote:


> Originally Posted by *tsm106*
> 
> Cards should boost like normal in the reviews. It would be unusual for them to lock the clocks down don't you think?


I've actually seen review sites lock dynamic clock cards down for comparisons. Why even publish the low non-boost frequency on their charts when, under virtually all benchmark conditions, the cards would be fully boosted?
Quote:


> Originally Posted by *jerrolds*
> 
> Not worth the switch imo - Crossfire is still not as good as SLI, much better than before but still a bit flaky in some games. PCI-e helps Eyefinity tearing apparently as well, but for 6% boost - assuming you can hit 1225mhz on both cards underwater (probably a coin toss) ...buying and waiting for waterblocks..i dunno - to me not worth the hassle.
> 
> Are you running Lightboost or 120hz IPS/PLS nowadays? Id hold off, and see what GSync has to offer. Unless your really REALLY into BF4 and Mantle really lives up to the hype..then it might be worth it.
> 
> Then again - 290X with waterblocks might have more of a resale value if Mantle takes off...plus if your the one selling, people will buy from Vega.


You say SLI is better? Where do you think Crossfire falls short; smoothness, game compatibility?

I am running a 3x Lightboost setup, but NVIDIA has broken it in the last driver. Ironically, you can still get it to work with AMD cards and NVIDIA invented the silly thing! I'd REALLY love to test out a G-Sync 4K monitor, but I fear that may still be a bit away. I hate always waiting for things.


----------



## jerrolds

Quote:


> Originally Posted by *CallsignVega*
> 
> You say SLI is better? Where do you think Crossfire falls short; smoothness, game compatibility?


All of the above - microstutter is still _measurably_ more pronounced than with SLI - http://www.tomshardware.com/reviews/radeon-r9-290x-hawaii-review,3650-26.html http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-AMD-Radeon-R9-290X-CrossFire-and-4K-Preview-Testing/Bioshock-Inf While the reviewers say it's visually hard to tell the difference - to me... I dunno, just knowing it's there, and knowing another technology handles it better, would bug me.

Plus - I believe frame pacing still doesn't work in DX9 and Eyefinity, but don't quote me on that. I guess AMD thought that Crossfire through PCI-e would fix everything, but it doesn't quite do it - I dunno why they couldn't just put in a hardware frame pacer like GeForces have had for years.

Does SLI work in windowed mode? Cuz I'm pretty sure CF needs fullscreen - so any custom ICC profiles get thrown out the window.
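For what it's worth, here's a rough sketch of what those frame-time charts actually measure - this is just my own illustration of the idea, not PCPer's actual metric or tooling:

```python
# Rough illustration only: microstutter shows up as large frame-to-frame
# swings in a frame-time trace, so one crude metric is the average
# absolute delta between consecutive frame times.

def frame_time_deltas(frame_times_ms):
    """Absolute difference between consecutive frame times, in ms."""
    return [abs(b - a) for a, b in zip(frame_times_ms, frame_times_ms[1:])]

def stutter_score(frame_times_ms):
    """Mean frame-to-frame variation; higher means more visible microstutter."""
    deltas = frame_time_deltas(frame_times_ms)
    return sum(deltas) / len(deltas) if deltas else 0.0
```

A smooth trace like [16.7, 16.7, 16.7, ...] and a juddery [10, 30, 10, 30] one average out to the same fps, which is exactly why plain fps charts hide microstutter and the frame-time plots don't.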


----------



## sugarhell

4K is Eyefinity atm.

Also, stop spreading misinformation. Nvidia hasn't had a hardware frame pacing fix for years; they only got it with Kepler. And a hardware fix means nothing without the drivers - look at how Metro and GRID 2 still stutter in SLI.


----------



## rdr09

Quote:


> Originally Posted by *CallsignVega*
> 
> I've actually seen review sites lock dynamic clock cards down for comparisons. Like why even publish the low non-boost frequency on their charts when under virtually all benchmark conditions the cards would be fully boosted.
> You say SLI is better? Where do you think Crossfire falls short; smoothness, game compatibility?
> 
> I am running a 3x Lightboost setup, but NVIDIA has broken it in the last driver. Ironically, you can still get it to work with AMD cards and NVIDIA invented the silly thing! I'd REALLY love to test out a G-Sync 4K monitor, but I fear that may still be a bit away. I hate always waiting for things.


You are asking the wrong person. You've got more experience than all of us combined, for sure.


----------



## jerrolds

Quote:


> Originally Posted by *sugarhell*
> 
> 4k is eyefinity atm.
> 
> Also keep the misinformations. Nvidia doesnt have hardware frame pacing fix for years. Only on kepler they got it. Also hardware fix means nothing without the drivers. Look how on metro and grid 2 they have stutter on sli.


Didn't they have it on the 5 series? I dunno, 780 SLI is MUCH tighter than 290X CF - http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-AMD-Radeon-R9-290X-CrossFire-and-4K-Preview-Testing/Metro-Last-L

I'm not saying there's absolutely no microstutter with SLI, I'm saying it's much better than what AMD can do atm.


----------



## sugarhell

No, they don't. Fermi had awful microstutter, especially on 480s.

Please read what you just linked. 780 SLI doesn't scale at all there, which means only one GPU is working - that's why the frame latency looks like a single GPU's. Also, if you check the closing thoughts, AMD asked them not to test Crossfire yet because it's not quite ready. This is more of a preview.


----------



## evensen007

Amazon finally added the 290x to its site, although it shows OOS right now.

http://www.amazon.com/Sapphire-PCI-Express-Graphics-Edition-21226-00-50G/dp/B00FZN2LO0/ref=sr_1_3?ie=UTF8&qid=1382976259&sr=8-3&keywords=r9+290x


----------



## tsm106

Quote:


> Originally Posted by *jerrolds*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> 4k is eyefinity atm.
> 
> Also keep the misinformations. Nvidia doesnt have hardware frame pacing fix for years. Only on kepler they got it. Also hardware fix means nothing without the drivers. Look how on metro and grid 2 they have stutter on sli.
> 
> 
> 
> Didnt they have it on the 5 series? I dunno 780 SLI is MUCH tighter than 290X CF http://www.pcper.com/reviews/Graphics-Cards/Frame-Rating-AMD-Radeon-R9-290X-CrossFire-and-4K-Preview-Testing/Metro-Last-L
> 
> I'm not saying theres absolutely no microstutter with SLI, im saying its much better than what AMD can do atm.
Click to expand...

Give it a rest. Thanks.


----------



## CallsignVega

Quote:


> Originally Posted by *jerrolds*
> 
> Does SLI work in windowed mode? Cuz i'm pretty sure CF needs fullscreen - so any custom icc profiles get thrown out the window.


Yes, SLI does work in windowed mode and Crossfire does not. I forgot about that. It can be quite annoying.


----------



## jerrolds

Quote:


> Originally Posted by *sugarhell*
> 
> No they dont. Fermi had awful microstutter especially on 480s.
> 
> Please read what you just link. The 780 sli dont scale at all thats means only 1 gpu work. Thats why the frame latency is like single gpu. Also if you check the closing thoughts amd said to not test crossfire yet because its not quite ready. This is like a preview


What chart are you looking at?

See http://cdn.pcper.com/files/imagecache/article_max_width/review/2013-10-23/MetroLL_2560x1440_STUT_0.png

The 780 SLI frametime line is clearly more consistent and a lot smoother than 290X CF, and it seems to scale just fine in the game you think it does poorly in.



I guess by scaling you mean the 4K charts? But... no one's really considering 4K now; even Vega said he thinks it'll be a ways off. GPU scaling works fine at 1440p on the 780s.
Quote:


> Give it a rest. Thanks.


Sorry, but give what a rest? Are we giving AMD a pass on CF now all of a sudden? It's still broken on DX9 afaik. Don't get me wrong, I like Radeons - I've had 6970 CF and a Matrix 7970, and now have a 290X on the way... I don't see why we can't discuss CF microstutter on the 290X.


----------



## szeged

have we gotten proper drivers for the 290x yet? 290x should be here in a few hours


----------



## tsm106

Quote:


> Originally Posted by *jerrolds*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> No they dont. Fermi had awful microstutter especially on 480s.
> 
> Please read what you just link. The 780 sli dont scale at all thats means only 1 gpu work. Thats why the frame latency is like single gpu. Also if you check the closing thoughts amd said to not test crossfire yet because its not quite ready. This is like a preview
> 
> 
> 
> What chart are you looking at?
> 
> and http://cdn.pcper.com/files/imagecache/article_max_width/review/2013-10-23/MetroLL_2560x1440_STUT_0.png
> 
> The 780SLI frametime line is clearly more consistent and a lot smoother than 290XCF and it seems to scale just fine, under the game you think it does poor in
> 
> 
> 
> I guess by scaling you mean the 4k charts? But..no ones really considering 4K now, even Vega said he thinks itll be a ways off. GPU scaling works fine at 1440 on the 780s
> Quote:
> 
> 
> 
> Give it a rest. Thanks.
> 
> Click to expand...
> 
> Sorry but give what a rest? Are we giving AMD a pass on CF now all of a sudden? Its still broken on DX9 afaik. Dont get me wrong i like Radeons, ive had 6970 CF and a Matrix 7970, and now a 290X on the way...i dont see why we cant discuss CF microstutter on 290X
Click to expand...

Go make your own thread not in here if you want to expound on frametimes. Thanks.


----------



## sugarhell

Meh. Vega doesn't use a simple 1440p monitor. Before you comment, go search more.


----------



## Slomo4shO

In case the 290X comes in stock at Newegg:

$40 off $499 or more, $100 off $999 or more at Newegg Business
http://www.neweggbusiness.com/product/productlist.aspx?Submit=Property&Subcategory=48&Description=&Type=&N=0&IsNodeId=1&IsPowerSearch=1&srchInDesc=&MinPrice=&MaxPrice=&PropertyCodeValue=679%3A446074&PropertyCodeValue=679%3A446068

XB2B1028E40 for $40 off

XB2B1028E100 for $100 off

Promo expires 11/01/2013 or while funds last. This may be a targeted promo so YMMV.


----------



## wolfej

Got both my 290x's from Amazon this morning.

Proof

Also on Amazon it didn't say anything about it being a BF4 edition, but both cards come with a BF4 code so that's a nice little bonus I didn't expect.


----------



## GioV

Quote:


> Originally Posted by *selk22*
> 
> I don't think you read my post.. I am going to expand the h220 which is already a 220 rad and pump with res. I will be adding the components listed to the current loop I already have but I appreciate the help. The swiftech tubing is the same tubing on my h220 right now but in white and if I can afford to I will deff go with the primochill


I used to have an H220 with the 240 rad it came with and an Alphacool UT60 240 rad, along with a 7970 cooled by a Swiftech Komodo and an overclocked Ivy Bridge. I quickly upgraded because the H220 pump simply cannot provide enough water flow. I now use the Swiftech rad as my reservoir, which feeds into a standalone water cooling pump with an EK waterblock, and temperatures are 20C lower.

To give you an example of how bad the H220 pump is: when I was filling up my loop it took about 8 min to fill a 120mm reservoir (used for aesthetics) and it was a HUGE pain to bleed. With the standalone pump it took about 6 seconds to fill the reservoir and 20 min to fully bleed the system.


----------



## jerrolds

Quote:


> Originally Posted by *tsm106*
> 
> Go make your own thread not in here if you want to expound on frametimes. Thanks.


Vega specifically asked about SLI vs CF - and one of the main points is microstutter; I did not bring it up out of nowhere - I was trying to help him make an informed decision. I understand that Vega is hardcore and knows way more about some of this stuff than a lot of us, including me - but not everyone knows everything, and he was admittedly rusty when it comes to the current state of Radeons.
Quote:


> Meh. Vegas doesnt use a simple 1440p monitor. Before you comment go search more


Are you serious? I asked if he was using Lightboost or IPS - we've all seen his 5x1P Samsung builds, then his 3x1L Catleap 2B setup... but I wasn't sure if he went with PLS or Lightboost lately. He's using 3x1 1080p Lightboost atm, and you're the one talking about 4K scaling. He even said he's not considering it... maybe you should be the one searching before commenting...


----------



## sugarhell

I am done with you


----------



## r0l4n

Quote:


> Originally Posted by *CallsignVega*
> 
> Yes, SLI does work windowed mode and crossfire does not. I forgot about that. It can be quite annoying.


Does SLI work in windowed mode???


----------



## tsm106

Quote:


> Originally Posted by *r0l4n*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CallsignVega*
> 
> Yes, SLI does work windowed mode and crossfire does not. I forgot about that. It can be quite annoying.
> 
> 
> 
> Does SLI work in windowed mode???
Click to expand...

In supported games. Support isn't universal in windowed mode; it has to be coded for in the driver.


----------



## burningrave101

A few XFX 290x in stock at Amazon for $553.49:

http://www.amazon.com/gp/product/B00G2OTRMA/ref=oh_details_o00_s00_i00?ie=UTF8&psc=1


----------



## jrcbandit

Quote:


> Originally Posted by *Slomo4shO*
> 
> Competition is a beautiful thing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hopefully this will bring the top tier cards at $450 or lower by the end of the year as they should have been to begin with.


290Xs are selling out at $579 and will at $550 for the non-BF4 edition - the Nvidia price drop will have zero effect on that because the 290X is the hot (literally and figuratively...) new thing. Of course the 780 Ti is coming, but that is priced at $700, so again no incentive for AMD to drop pricing. However, the price drops might influence AMD's pricing on the 290, which would be nice. They can't have the 290 selling for $500 now, and even $450 might be too much considering the new price of the 780.


----------



## Roaches

Quote:


> Originally Posted by *jrcbandit*
> 
> 290X are selling out at $579 and will at $550 for the non BF4 edition - the Nvidia price drop will have 0 effect on that because the 290x is the hot (literally and figuratively...) new thing. Of course the 780 Ti is coming, but that is priced at $700 so again no incentive for AMD to drop pricing. However, the price drops might influence AMD's pricing on the 290 which would be nice. Can't have the 290 selling for $500 now and even $450 might be too much considering the new price of the 780.


Could be true!

http://www.fudzilla.com/home/item/32962-radeon-r9-290x-sold-out-in-hours

I still can't find any in stock right now


----------



## szeged

Quote:


> Originally Posted by *burningrave101*
> 
> A few XFX 290x in stock at Amazon for $553.49:
> 
> http://www.amazon.com/gp/product/B00G2OTRMA/ref=oh_details_o00_s00_i00?ie=UTF8&psc=1


Does XFX still suck? I was thinking about picking up a couple more 290Xs to do quadfire yumminess, but I'm hesitant about XFX.


----------



## burningrave101

Quote:


> Originally Posted by *Roaches*
> 
> Could be true!
> 
> http://www.fudzilla.com/home/item/32962-radeon-r9-290x-sold-out-in-hours
> 
> I still can't find any in stock right now


Then you obviously aren't looking very hard. Read two posts up and hurry before they sell out.


----------



## burningrave101

Quote:


> Originally Posted by *szeged*
> 
> does xfx still suck? i was thinking about picking up a couple more 290xs to do quadfire yummyness but hesitant about xfx.


They're reference cards so brand doesn't really matter outside of the warranty and XFX is about as good as you get from AMD. XFX allows you to install a water block on the card without voiding the warranty unlike other brands like Sapphire.


----------



## Roaches

Quote:


> Originally Posted by *burningrave101*
> 
> Then you obviously aren't looking very hard. Read two posts up and hurry before they sell out.


I only shop Newegg, Ebay is listing them for $600+ average for the BF4/Standard edition...


----------



## burningrave101

Quote:


> Originally Posted by *Roaches*
> 
> I only shop Newegg, Ebay is listing them for $600+ average for the BF4/Standard edition...


Then you'll have to be Johnny on the spot because Newegg sells out a lot faster than Amazon unless they receive a large number.


----------



## Arizonian

Quote:


> Originally Posted by *szeged*
> 
> have we gotten proper drivers for the 290x yet? 290x should be here in a few hours


I've got the first drivers that support the 290X on the OP - the 13.11 beta 6 driver.


----------



## szeged

Quote:


> Originally Posted by *burningrave101*
> 
> They're reference cards so brand doesn't really matter outside of the warranty and XFX is about as good as you get from AMD. XFX allows you to install a water block on the card without voiding the warranty unlike other brands like Sapphire.


Yeah, I figured as much. I've just heard horror stories about XFX trying to deny warranties even though it was their shoddy cooling solutions that caused the problems.


----------



## kcuestag

I'm about to try some Battlefield 4 Multiplayer on the 290X, still haven't tried any game on it.


----------



## $ilent

Guys is bf4 available? Someone posted in here saying it would be available in like 4 hours but it's still the 28th today and it's only 6:30pm


----------



## utnorris

Ok, so a question to those that have played with both the 290X and 780: is the $50 worth it? Better yet, is getting a used one for say $100 less than a 290X a better deal? I am kinda on the fence right now with mine, mainly because of the lack of water blocks available. I am being told the ones Performance supposedly has are actually pre-orders, which would really piss me off since they do not list them that way. Anyways, just thought I would get some insight from users that have played with both (szeged).

Thanks

P.S. Not trying to start a big debate, but since they dropped the price on the 780 it is making me wonder.


----------



## Arizonian

Quote:


> Originally Posted by *$ilent*
> 
> Guys is bf4 available? Someone posted in here saying it would be available in like 4 hours but it's still the 28th today and it's only 6:30pm


I thought 9PM today, but I can't recall if it was Eastern or Pacific for the US, or what that relates to in other countries.


----------



## Bitemarks and bloodstains

Midnight EST, 11PM Central, 9PM PST


----------



## utnorris

So just an FYI, got an email from Performance just now and no, they do not have the blocks yet. They are saying they should have them today or tomorrow.


----------



## VSG

Frozen CPU will have them in stock on Friday, so my blocks won't be here till next week. Oh well, at least that gives me time to get the second 290X or 290 if they are ever available!


----------



## Slomo4shO

Quote:


> Originally Posted by *jrcbandit*
> 
> 290X are selling out at $579 and will at $550 for the non BF4 edition - the Nvidia price drop will have 0 effect on that because the 290x is the hot (literally and figuratively...) new thing. Of course the 780 Ti is coming, but that is priced at $700 so again no incentive for AMD to drop pricing. However, the price drops might influence AMD's pricing on the 290 which would be nice. Can't have the 290 selling for $500 now and even $450 might be too much considering the new price of the 780.


Say what you might, but 780s with aftermarket coolers are already available for $500 with 3 games. Unless you are going under water, we have already seen overclocked 780s keep pace with the limited overclock potential of the reference 290X. Since there is no plan for aftermarket coolers until the end of November for the 290X and 290, I can't help but recommend going with a 780 if you are purchasing a card at or before Black Friday, unless AMD actually drops prices and adds a good gaming bundle.


----------



## Forceman

Quote:


> Originally Posted by *Slomo4shO*
> 
> Say what you might but the 780 with aftermarket coolers are already available for $500 with 3 games. Unless you are going under water, we have already seen overclocked 780s keep pace with the limited overclock potential of the reference 290X. Since there is no plan for aftermarket coolers until the end of November for the 290X and 290, I can't help but recommend going with a 780 if you are purchasing a card at or before Black Friday unless AMD actually drops the prices and add a good gaming bundle.


The EVGA with ACX cooler is $519 on Newegg right now, and with the 3 brand-new games that's a fantastic deal. Kudos to AMD for bringing competition back to the high-end (finally).


----------



## $ilent

Thanks guys so that's 4am GMT. I'll be in bed by then lol.

Also, anyone see the 780 prices? Man, £399 is so tempting, but then I just remember I got my £480 290x with £90 worth of games, so in effect my 290x was even cheaper than the 780, since there are no games or BF4 with that.


----------



## Paul17041993

Quote:


> Originally Posted by *burningrave101*
> 
> They're reference cards so brand doesn't really matter outside of the warranty and XFX is about as good as you get from AMD. XFX allows you to install a water block on the card without voiding the warranty unlike other brands like Sapphire.


no... just... no...


----------



## Arizonian

There's a huge discussion about the Nvidia price drops, and I'd prefer we take it there since we still have members here asking questions about their new 290X cards. This way we don't have to sift through many pages to find relevant information.

http://www.overclock.net/t/1437655/vc-nvidia-drops-prices-on-geforce-gtx-780-and-gtx-770-graphics-cards

Thank you.


----------



## vettefan8

Feel free to officially add me to the owners club. Sapphire BF4 edition. I'm on the stock cooler until Wednesday; I'll update the pic once I get my Arctic Xtreme III installed.


----------



## $ilent

Do we have any definitive date for the new amd driver, gpuz or msi ab?


----------



## Arizonian

Quote:


> Originally Posted by *vettefan8*
> 
> Feel free to officially add me to the owner's club. Sapphire BF4 edition. I'm going to be on the stock cooler until Wednesday, i'll update the pic once I get my Arctic Xtreme III installed on Wednesday.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats and added


----------



## jerrolds

Quote:


> Originally Posted by *Slomo4shO*
> 
> Say what you might but the 780 with aftermarket coolers are already available for $500 with 3 games. Unless you are going under water, we have already seen overclocked 780s keep pace with the limited overclock potential of the reference 290X. Since there is no plan for aftermarket coolers until the end of November for the 290X and 290, I can't help but recommend going with a 780 if you are purchasing a card at or before Black Friday unless AMD actually drops the prices and add a good gaming bundle.


Whoa, how long have the prices been like this? A 780 with an ACX cooler was $650+ last time I checked... coulda sworn that was a week or so ago. Didn't think the 7xx price drops were till the 7th.


----------



## Arm3nian

Quote:


> Originally Posted by *burningrave101*
> 
> They're reference cards and brands do matter and Sapphire is about as good as you get from AMD.


Fixed.


----------



## Paulenski

The ASUS BIOS is sooo much better than my stock XFX one; just having voltage control and a working GPU-Z is nice. I'm planning on getting an MK-26, it should definitely run smoother.

I can confirm that XFX is using Elpida memory.

There is also very bad vdroop with the stock BIOSes; they throttle the output so hard. I was benching the TPU Valley benchmark with voltage set to 1.375V and it would average 1.273V, with only one scene causing a random spike to 1.320V. I'm thinking of switching to the PT3 BIOS for better control and more consistent voltage.

Here's my result of the Valley bench at 1101 GPU / 1450 Mem @ 1.375v

Score: 3275
http://i.imgur.com/uJme5Uf.png


----------



## szeged

i crack up every time i see valley's reports on the 290x's temperatures lol.


----------



## sugarhell

New drivers

http://www2.ati.com/drivers/beta/AMD_Catalyst_13.11_BetaV7.exe


----------



## szeged

any info on what the new drivers improve?


----------



## sugarhell

We need to wait for the site update. Probably BF4 drivers.


----------



## Newbie2009

Quote:


> Originally Posted by *sugarhell*
> 
> New drivers
> 
> http://www2.ati.com/drivers/beta/AMD_Catalyst_13.11_BetaV7.exe


DAMN. Finally an AMD I can get on board with; drivers coming thick and fast.


----------



## Paulenski

Quote:


> Originally Posted by *szeged*
> 
> i crack up every time i see valley's reports on the 290x's temperatures lol.


Haha, that doesn't even compare to the highest I've seen.

2.1 million °C - surface of the sun going on here.


----------



## Paulenski

Quote:


> Originally Posted by *sugarhell*
> 
> We need to wait for site update. Probably bf4 drivers


Isn't there usually a changelog with the files? I thought they included a PDF of one like the webpage has.


----------



## BusterOddo

Quote:


> Originally Posted by *Newbie2009*
> 
> DAMN. Finally an AMD I can get on board with, drivers coming thick and fast.


This has to be for BF4 right???


----------



## maarten12100

Seems like they shouldn't have changed how cards function in terms of boost states.
They could've pulled a total Nvidia greenlight though.


----------



## Arizonian

Is there any function to clean previous drivers during the install with AMD drivers? Or do you just install right on top of the previous?

PS Thanks Sugarhell added that link to OP.


----------



## Clukos

Quote:


> Originally Posted by *Arizonian*
> 
> Is there any function to clean previous drivers during the install with AMD drivers? Or do you just install right on top of the previous?


You could always use Display Driver Uninstaller, does the job for me.

Link: http://www.wagnardmobile.com/DDU/


----------



## Roaches

Quote:


> Originally Posted by *Arizonian*
> 
> Is there any function to clean previous drivers during the install with AMD drivers? Or do you just install right on top of the previous?


When I switched from AMD to Nvidia with my 680, I just uninstalled normally through the control panel and used Regedit to delete the registry entries.
I'm not sure if it's the same in reverse.


----------



## szeged

Quote:


> Originally Posted by *maarten12100*
> 
> Seems like they shouldn't have changed how cards function in terms of boost states.
> They could've pulled a total Nvidia greenlight though.


nvidia greenlight is the spawn of the devil and should be thrown into the pits of a volcano along with whoever came up with the idea.


----------



## sugarhell

If you want, you can follow the process in my sig guide. Otherwise, uninstall from Device Manager, reboot into Safe Mode, and use DDU.


----------



## $ilent

Still no word on gpuz or AB updates though?


----------



## Newbie2009

Quote:


> Originally Posted by *Arizonian*
> 
> Is there any function to clean previous drivers during the install with AMD drivers? Or do you just install right on top of the previous?
> 
> PS Thanks Sugarhell added that link to OP.


I usually just run the uninstall through the new drivers and then install the new set (each pack offers you a choice to uninstall or install).


----------



## sugarhell

Quote:


> Originally Posted by *$ilent*
> 
> Still no word on gpuz or AB updates though?


http://forums.guru3d.com/showpost.php?p=4685929&postcount=33


----------



## Arizonian

I guess I've been spoiled with the Nvidia process of Clean Install. Thanks for the answers.


----------



## jerrolds

Quote:


> Originally Posted by *Arizonian*
> 
> Is there any function to clean previous drivers during the install with AMD drivers? Or do you just install right on top of the previous?
> 
> PS Thanks Sugarhell added that link to OP.


There have been issues in the past with the AMD Uninstall Utility on Windows 8.1 - so beware; I believe there was a thread here about it. It messes up Power options, amongst other things. This was back in the 8.1 beta; not sure if it's been fixed for the release last week.

I normally uninstall, delete the temp folders (Windows\Temp and User\Temp), delete the driver from Device Manager, reboot, and reinstall.
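The temp-folder part of that routine could be scripted roughly like this - purely a hypothetical sketch, not an official AMD or DDU tool, and it simply skips any files a running process still has locked:

```python
# Hypothetical sketch of the "delete temp folders" step above.
# Skips anything a running process still has locked.
import os
import shutil

def clear_temp(path):
    """Delete the contents of a temp directory; returns the names removed."""
    removed = []
    for name in os.listdir(path):
        full = os.path.join(path, name)
        try:
            if os.path.isdir(full):
                shutil.rmtree(full)
            else:
                os.remove(full)
            removed.append(name)
        except OSError:
            pass  # in use by a running process - leave it
    return removed

# e.g. clear_temp(os.path.join(os.environ["WINDIR"], "Temp")) on Windows
```

Either way, the Safe Mode + DDU route others mentioned is the thorough option.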


----------



## Drakenxile

Has anyone tried mining with these cards? I plan on buying one for gaming + mining on the side, just wanted a heads-up.


----------



## dajez

Would a 360 rad (Swiftech H320) be enough to cool a 3770K and a 290X?


----------



## $ilent

Quote:


> Originally Posted by *sugarhell*
> 
> http://forums.guru3d.com/showpost.php?p=4685929&postcount=33


He seems to be referring to receiving an MSI card (290X), as opposed to speaking about the AB programme?


----------



## Drakenxile

Quote:


> Originally Posted by *dajez*
> 
> would a 360 rad(swiftech h320) be enough to cool a 3770k and 290x?


It shouldn't be a problem; the usual rule of thumb is about 120mm of radiator per water block.
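
That 120mm-per-block figure is a community rule of thumb (for moderate fan speeds and heat loads), not a spec, but it makes the arithmetic easy to sketch:

```python
def rad_needed_mm(num_blocks, per_block_mm=120):
    """Rule-of-thumb radiator length: roughly 120 mm per water block."""
    return num_blocks * per_block_mm

# A 3770K plus one 290X is two blocks: 240 mm needed,
# so a 360 mm rad (e.g. the H320) leaves some headroom.
print(rad_needed_mm(2))  # 240
```

An overclocked 290X dumps a lot more heat than a stock one, so treat the headroom above the rule-of-thumb number as what buys you quiet fans.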


----------



## sugarhell

Quote:


> Originally Posted by *$ilent*
> 
> He seems to be referring to him recieving a MSI card (290x) as apposed to speaking about the AB programme?


Alexey Nicolaychuk aka Unwinder, RivaTuner creator


----------



## Slomo4shO

Quote:


> Originally Posted by *jerrolds*
> 
> Whoa how long have the prices been like this? 780 with a ACX cooler was $650+ last time i checked..coulda swore that was a week or so ago. Didnt think 7xx price drops were till the 7th.


The official price drop is tomorrow but EVGA cards were dropped today.


----------



## skupples

Quote:


> Originally Posted by *szeged*
> 
> nvidia greenlight is the spawn of the devil and should be thrown into the pits of a volcano along with whoever came up with the idea.


Greenlight is going to do more damage than good. I really don't understand how they can think it's a good idea for anyone but themselves (and their RMA budget). Its only purpose is to reduce RMA numbers, and it will switch me over to being an AMD customer if it's in place on the 6,144 core Maxwell Titan.


----------



## burningrave101

Quote:


> Originally Posted by *Drakenxile*
> 
> HAs anyone tried mining with these cards? i plan on buying one for gaming + mining on the side just wanted a heads up


http://www.tomshardware.com/reviews/radeon-r9-290x-hawaii-review,3650-34.html

Unless you have watercooling I wouldn't use a 290x for mining.


----------



## sugarhell

Cgminer doesn't support the 290X yet.


----------



## maarten12100

Quote:


> Originally Posted by *Drakenxile*
> 
> HAs anyone tried mining with these cards? i plan on buying one for gaming + mining on the side just wanted a heads up


Going by the fact that it doesn't do much better at Bitcoin hashing, I doubt it will do well at scrypt.
But the extra memory bandwidth might help a lot, since scrypt leans heavily on bandwidth.

I'm getting one once it is available here in NL so I can keep you up to speed.


----------



## Arizonian

Do we know what 13.11 beta7 improves upon?


----------



## evensen007

Quote:


> Originally Posted by *$ilent*
> 
> Guys is bf4 available? Someone posted in here saying it would be available in like 4 hours but it's still the 28th today and it's only 6:30pm


People in the EU and US are logging into Battlelog using a free international proxy service (spoofing Asian timezones). That's how they're logging in early.

On topic, I cannot WAIT to see water results with these! I was able to get my 7970 to 1300 core/1700ish mem with an EK block. On the 7970, memory O/C didn't really affect performance too much. Memory bandwidth on the 290X seems to correlate strongly with framerates, so I'm looking forward to pushing core and mem to the max. Crossing my fingers for another 1300 core under water!


----------



## Drakenxile

Quote:


> Originally Posted by *burningrave101*
> 
> http://www.tomshardware.com/reviews/radeon-r9-290x-hawaii-review,3650-34.html
> 
> Unless you have watercooling I wouldn't use a 290x for mining.


Thanks for the link. I assume after optimization you can get better than what they're getting.


----------



## sugarhell

Tsm score

http://www.3dmark.com/3dm11/7381362


----------



## c0ld

Well I guess its time to order BF4 separately. Anyone wanna sell me their BF4 code?

I am gonna have to wait for the 290 and how the 780Ti stacks up.


----------



## battleaxe

I find it really annoying that as soon as ATI launches these cards, Nvidia drops their prices so drastically. This just proves they've been severely taking advantage of consumers. I have Nvidia cards right now and a few older ATI cards too, but honestly this makes me want to completely switch over to AMD/ATI.

If I were a customer that had bought a 780 a few weeks ago, I'd be seething mad.


----------



## kcuestag

So far I'm really impressed with the 290X. I've been playing Battlefield 4 for an hour at 2560x1440, Ultra (all maxed except MSAA, which is off; I'm using post AA on Medium), and I get anywhere from 50 to 90fps, most of the time well above 60fps. And this is with the card downclocking, of course, since I'm on the auto fan profile and it hovers anywhere from 800MHz to 1GHz.









Can't wait to get this beast on water and see what it is capable of.


----------



## Newbie2009

Quote:


> Originally Posted by *sugarhell*
> 
> Tsm score
> 
> http://www.3dmark.com/3dm11/7381362


You know if that's with a proper block?


----------



## sugarhell

Quote:


> Originally Posted by *Newbie2009*
> 
> You know if that's with a proper block?


Universal block, air-cooled VRMs, 1.3V, no-vdroop BIOS.


----------



## $ilent

Quote:


> Originally Posted by *kcuestag*
> 
> So far really impressed with the 290X, been playing Battlefield 4 for an hour and at 2560x1440, Ultra (All maxed except MSAA which is OFF, I'm using AA post on Medium) I play anywhere from 50 to 90fps, but most of the time well above 60fps, and this is with card downclocking of course since I'm on AUTO fan and it hovers anywhere from 800 to 1GHz.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can't wait to get this beast on water and see what it is capable of.


Good stuff, why aren't you running full aa though? Please don't tell me you get under 60fps with it on full.


----------



## Arm3nian

Quote:


> Originally Posted by *$ilent*
> 
> Good stuff, why aren't you running full aa though? Please don't tell me you get under 60fps with it on full.


Full AA is demanding at 2560x1440. You most likely will need 2 290x's or a watercooled one to get above 60fps average.


----------



## Slomo4shO

Quote:


> Originally Posted by *Durvelle27*
> 
> Just a few quick runs
> 
> System Specs:
> 
> FX-8350 @5GHz
> Gigabyte 990FXA-UD3 Rev. 1.1
> 8GB(2x4GB) DDR3 1866MHz
> *PowerColor R9 290X*
> Corsair TX850 850w
> 
> Stock (1000/1250)


The stock speed of your PowerColor card is 1030MHz core and 1250MHz memory.
Quote:


> Originally Posted by *Sazz*
> 
> Cool story dude, those are my scores...
> 
> Why are you claiming my scores are yours? are you pulling a Matthew Reed on me?


Explains why he can't figure out the stock clocks of his GPU.


----------



## OrcishMonkey

Waybill: 9720011663
Signed for by: R SKIDMORE
Get Signature Proof of Delivery External Link
Monday, October 28, 2013 at 12:46
Origin Service Area: LJUBLJANA - Ljubljana - SLOVENIA
Destination Service Area: SOUTH SAN DIEGO, CA - OCEANSIDE - USA
Monday, October 28, 2013 Location Time
16 Delivered - Signed for by : R SKIDMORE OCEANSIDE 12:46

God damnit, my blocks are at home, stupid work. Inc water numbers soon =)


----------



## utnorris

Quote:


> Originally Posted by *OrcishMonkey*
> 
> Waybill: 9720011663
> Signed for by: R SKIDMORE
> Get Signature Proof of Delivery External Link
> Monday, October 28, 2013 at 12:46
> Origin Service Area: LJUBLJANA - Ljubljana - SLOVENIA
> Destination Service Area: SOUTH SAN DIEGO, CA - OCEANSIDE - USA
> Monday, October 28, 2013 Location Time
> 16 Delivered - Signed for by : R SKIDMORE OCEANSIDE 12:46
> 
> God damnit, my blocks are at home, stupid work. Inc water numbers soon =)


I hate you.


----------



## Iniura

Don't know if this was posted here already but if not, you can try beta 7 out

AMD Catalyst 13.11 BetaV7 (13.250.18.0 October 25)

Desktop:
http://www2.ati.com/drivers/beta/AMD....11_BetaV7.exe

Mobility:
http://www2.ati.com/drivers/beta/AMD...ity_BetaV7.exe

Build Info:
DriverVer=10/25/2013, 13.250.18.0000
13.25.18-131025a-164191E-ATI
Catalyst: 13.11
CCC: 2013.1025.1143.19184
3D: 9.14.10.01001
OGL: 6.14.10.12614
OCL: N/A

Supported OS:
Windows 7
Windows 8
Windows 8.1

Many thanks to Espionage724 from Guru3D for spotting these.


----------



## kcuestag

Quote:


> Originally Posted by *$ilent*
> 
> Good stuff, why aren't you running full aa though? Please don't tell me you get under 60fps with it on full.


Of course you get under 60fps with it on full; even in BF3 you will on a 64-player server.









The main reason is FPS. The second reason is that I don't need or want MSAA at 2560x1440. I'm already using post AA on Medium (it's FXAA), which is more than enough, and it's better than MSAA in this game anyway; MSAA doesn't work on most textures.

If you want to MAX it at 1440p, you'll want a second 290X to keep 60fps at ALL times in multiplayer.


----------



## lordzed83

Finished the whole BF4 campaign an hour ago. It's 5 hours long on normal difficulty. I was running 1080/[email protected]; above 1080 I get artifacts :/
Anyhow, no crashes or anything; the card kept me warm with no problems.
Thank god I got the Asus Vulcan ANC headset.


----------



## DampMonkey

Look what i found on my doorstep. First?


----------



## Sazz

Quote:


> Originally Posted by *lordzed83*
> 
> Finished the whole BF4 campaign an hour ago. It's 5 hours long on normal difficulty. I was running 1080/[email protected]; above 1080 I get artifacts :/
> Anyhow, no crashes or anything; the card kept me warm with no problems.
> Thank god I got the Asus Vulcan ANC headset.


Wait, BF4 is playable now? Mine is still not letting me get on it.


----------



## hotrod717

Quote:


> Originally Posted by *Slomo4shO*
> 
> The stock speed of your powercolor card is 1030MHz core and 1250MHz memory.
> Explains why he can't figure out the stock clocks of his GPU.


It's the same guy who started the 280x thread that has never posted proof of owning one. He did say he was going to flash his 7970 with a 280x bios in the 7970 thread though.


----------



## vettefan8

Quote:


> Originally Posted by *Sazz*
> 
> wait, BF4 is playable now? mine is still not letting me get on it.


I think that some people are using VPNs to trick Origin into thinking that they are in Asia (where it is already Tuesday).


----------



## Newbie2009

Quote:


> Originally Posted by *DampMonkey*
> 
> Look what i found on my doorstep. First?










Looking forward to seeing results. I believe we don't have water results yet, just TSM's makeshift cooling.


----------



## CallsignVega

Wait, how are people playing BF4 I thought it doesn't release until tomorrow? Mine still says pre-load.


----------



## Slomo4shO

Quote:


> Originally Posted by *CallsignVega*
> 
> Wait, how are people playing BF4 I thought it doesn't release until tomorrow? Mine still says pre-load.


http://linustechtips.com/main/topic/69222-get-battlefield-4-pc-14-hours-early-100-legal/


----------



## Shoggy

Quote:


> Originally Posted by *DampMonkey*
> 
> Look what i found on my doorstep. First?


I have one too but for some reason it looks different than yours


----------



## Tobiman

Quote:


> Originally Posted by *Shoggy*
> 
> I have one too but for some reason it looks different than yours


Damn son, that thing is pure sex !!


----------



## Hattifnatten

Quote:


> Originally Posted by *Slomo4shO*
> 
> http://linustechtips.com/main/topic/69222-get-battlefield-4-pc-14-hours-early-100-legal/


I won't get mine for another 3 days -.- Unless the store I preordered from decides not to care about the release dates in the EU.


----------



## rdr09

Quote:


> Originally Posted by *Shoggy*
> 
> I have one too but for some reason it looks different than yours


Nice view of Hawaii.


----------



## Ukkooh

Quote:


> Originally Posted by *Shoggy*
> 
> I have one too but for some reason it looks different than yours
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: HW Pr0n!


Why do I never encounter random item spawns on my doorstep?


----------



## DampMonkey

Quote:


> Originally Posted by *Shoggy*
> 
> I have one too but for some reason it looks different than yours


Oh my, the copper Hawaii is a nice touch! What a beautiful piece of engineering! Good job


----------



## pioneerisloud

Quote:


> Originally Posted by *Tobiman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Shoggy*
> 
> I have one too but for some reason it looks different than yours
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Damn son, that thing is pure sex !!
Click to expand...

What kind of sex are you having?









Personally, I like that EK block posted earlier. When I get my 290x and water (I'm still torn between that or a 7990 for tri fire), I'll be getting the EK block because it gets rid of all that excess "flash". I like things sleek and elegant looking.


----------



## Slomo4shO

Quote:


> Originally Posted by *hotrod717*
> 
> It's the same guy who started the 280x thread that has never posted proof of owning one. He did say he was going to flash his 7970 with a 280x bios in the 7970 thread though.


Durvelle27 is an AMD viral marketer, as indicated by his signature at AnandTech. He seems to get free hardware through the AMD Test Drive program, but it appears that he likes to use other people's work instead of contributing his own.
Quote:


> The AMD A-Series Test Drive program *targets AMD's most vocal fans who are active in the component community.* AMD provides the components and software (OS, apps) necessary to build an A-Series-based system and asks in turn that participants build a system, then comment/post their experiences (videos, photos, comments, etc.), ultimately helping to guide other enthusiasts to become advocates of AMD.


Source


----------



## Clukos

Quote:


> Originally Posted by *Slomo4shO*
> 
> Durvelle27 is a AMD viral marketer as indicated by his signature at anandtech. He seems to get free hardware through the AMD Test Drive members but it appears that he likes to use other peoples work instead of contributing his own.
> .
> Source


Pretty dumb to post the same thing in the thread you found it tho lol


----------



## skupples

Quote:


> Originally Posted by *Slomo4shO*
> 
> http://linustechtips.com/main/topic/69222-get-battlefield-4-pc-14-hours-early-100-legal/


VPN.

Really? We have viral marketers posting fake/stolen info? That's pretty whack, OCN mods.


----------



## Paul17041993

Quote:


> Originally Posted by *Shoggy*
> 
> I have one too but for some reason it looks different than yours


200% cooler


----------



## Arizonian

Quote:


> Originally Posted by *skupples*
> 
> VPN.
> 
> Really? We have Viral marketer's posting fake/stolen info? That's pretty whack OCN mods.


It's being looked into. Let's continue our normal club discussions.


----------



## Duvar

Hi guys,

here are new results (CF R9 290)
Might be interesting.


----------



## pioneerisloud

Let's please keep this thread on topic guys.


----------



## rdr09

Quote:


> Originally Posted by *sugarhell*
> 
> Tsm score
> 
> http://www.3dmark.com/3dm11/7381362


What's the max for a 780?

Here is one at 1398/3600:

http://www.3dmark.com/3dm11/7387225


----------



## Arizonian

13.11 Beta 7

Feature highlights of the AMD Catalyst 13.11 Beta7 driver for Windows:

Includes all feature highlights of the AMD Catalyst 13.11 Beta6
Increases AMD CrossFire™ scaling by up to an additional 20% in Battlefield 4

Feature highlights of the AMD Catalyst 13.11 Beta6 driver for Windows:

Includes all feature highlights of AMD Catalyst 13.11 Beta
Includes support for the new products:
- AMD Radeon™ R9 290X
- AMD Radeon R9 290

Performance improvements:
- Batman: Arkham Origins - up to 35% with 8x MSAA enabled
- Total War™: Rome 2 - up to 10%
- Battlefield 3 - up to 10%
- GRID 2 - up to 8.5%
- DiRT Showdown - up to 10%
- Formula 1™ 2013 - up to 8%
- DiRT 3 - up to 7%
- Sleeping Dogs - up to 5%

Performance improvements for the AMD APU Series (AMD Catalyst 13.11 Beta6 vs. AMD Catalyst 13.9):
- Luxmark (OpenCL) - up to 10%
- WinZip 17.5 (OpenCL) - up to 20%
- GRID 2 - up to 4%
- Battlefield 4 - up to 5%
- Left 4 Dead - up to 10%

Automatic AMD Eyefinity configuration:
- Automatic "plug and play" configuration of supported Ultra HD/4K tiled displays

I'm envious of you guys that have two cards with the extra support now in crossfire for battlefield for tonight.


----------



## tsm106

Anyone get a fullcover on yet?


----------



## skupples

Quote:


> Originally Posted by *Arizonian*
> 
> It's being looked into. Let's continue our normal club discussions.




Quote:


> Originally Posted by *Duvar*
> 
> Hi guys,
> 
> here are new results (CF R9 290)
> Might be interesting.
> 
> https://www.facebook.com/photo.php?fbid=170679869806374&set=a.149311011943260.1073741827.136447289896299&type=1&theater


Hello! Welcome to OCN! You can use the little image icon to upload photos directly to the page! It would be great for the minority of people who refuse to interact with Facebook.


----------



## Jared Pace

An OCUK poster has one, but he's on the stock BIOS. He's got VRM temperature measurements too.

http://forums.overclockers.co.uk/showpost.php?p=25200569&postcount=822


----------



## Yeroon

Quote:


> Originally Posted by *tsm106*
> 
> Anyone get a fullcover on yet?


Someone in the benchoff thread (prob here too) is getting a cover on. I'm guessing if all goes well we'll prob see results soon?


----------



## tsm106

Quote:


> Originally Posted by *Jared Pace*
> 
> OCuk poster has one, but hes on stock bios. He's got VRM temperature measurements too.
> 
> http://forums.overclockers.co.uk/showpost.php?p=25200569&postcount=822


Doh, that run wasn't stable, the bench lost time synch.

Here's one of mine.

http://www.3dmark.com/fs/1050519


----------



## bencher

Quote:


> Originally Posted by *skupples*
> 
> 
> Hello! Welcome to OCN! You can use the little  to upload photo's directly to the page! It would be great for the minority of people who refuse to interact with facebook.


What happened to the Titan thread?


----------



## selk22

Quote:


> Originally Posted by *GioV*
> 
> I used to have an H220 with the 240 rad it came with and an Alphacool UT60 240 rad, along with a 7970 cooled by a Swiftech Komodo and an overclocked Ivy Bridge. I quickly upgraded because the H220 pump simply cannot provide enough water flow. I now use the Swiftech rad as my reservoir, which feeds into a standalone water cooling pump, with an EK waterblock, and temperatures are 20C lower.
> 
> To give you an example of how bad the H220 pump is, when I was filling up my loop it took about 8 minutes to fill a 120mm reservoir (used for aesthetics) and it was a HUGE pain to bleed. With the standalone pump it was about 6 seconds to fill the reservoir and 20 minutes to fully bleed the system.


Thanks for the reply +rep. I agree with you and have a few questions. First were you running the pump at full speed when you did this or did you still have it hooked up to the CPU fan header? How easy do you find it to use the h220 res as your main fill res?

I will most likely end up buying a pump/res combo and adding it to the loop along with the h220 pump.

Thanks again


----------



## BababooeyHTJ

Quote:


> Originally Posted by *tsm106*
> 
> Anyone get a fullcover on yet?


I know. I'm half tempted to throw an HR-03 GT that I have kicking around onto my card for now, just to see how much temps affect overclocking headroom.


----------



## anubis1127

Quote:


> Originally Posted by *BababooeyHTJ*
> 
> I know. I'm half tempted to throw an HR-03gt that I have kicking around onto my card for now just to see how much temps effect overclocking headroom.


Do it!


----------



## xxmastermindxx

Those of you who've received your EK blocks already, who did you order from?


----------



## DampMonkey

Been doing a furmark burn in for 15 minutes now at stock clocks. Maxing at 38*C......


----------



## sugarhell

I found the best review for understanding the 290X, and the best review in general:

http://www.hardware.fr/articles/910-4/powertune-evolue.html

It's the best article for understanding the new PowerTune.

PS: It's not in English.


----------



## selk22

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *DampMonkey*
> 
> Been doing a furmark burn in for 15 minutes now at stock clocks. Maxing at 38*C......






Porn.... Omg man awesome haha I cant wait for my block

E: Also 13.11 v7 is out for you Crossfire people


----------



## BababooeyHTJ

I regret not ordering backplates now. I may have to order two.


----------



## DampMonkey

By the way, my sapphire had elpida memory, no sign of hynix


----------



## selk22

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *DampMonkey*
> 
> By the way, my sapphire had elpida memory, no sign of hynix






Was it the BF4 edition or the standard?


----------



## anubis1127

Quote:


> Originally Posted by *DampMonkey*
> 
> Been doing a furmark burn in for 15 minutes now at stock clocks. Maxing at 38*C......


Looks nice, and nice temps. Guess Gibbo at 57C just didn't have enough radiator, haha.


----------



## jamaican voodoo

I'm so jelly of you guys right now. I won't be able to get my 290X's, all water-cooled, until tax season... I'm aiming for three of them in trifire. Good luck to you guys who already have one; have fun.


----------



## jezzer

Quote:


> Originally Posted by *kcuestag*
> 
> So far really impressed with the 290X, been playing Battlefield 4 for an hour and at 2560x1440, Ultra (All maxed except MSAA which is OFF, I'm using AA post on Medium) I play anywhere from 50 to 90fps, but most of the time well above 60fps, and this is with card downclocking of course since I'm on AUTO fan and it hovers anywhere from 800 to 1GHz.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can't wait to get this beast on water and see what it is capable of.


What FPS do you get when you max the settings out?


----------



## GioV

Quote:


> Originally Posted by *selk22*
> 
> Thanks for the reply +rep. I agree with you and have a few questions. First were you running the pump at full speed when you did this or did you still have it hooked up to the CPU fan header? How easy do you find it to use the h220 res as your main fill res?
> 
> I will most likely end up buying a pump/res combo and adding it to the loop along with the h220 pump.
> 
> Thanks again


Yes I was; you always bleed a system with the motherboard disconnected from the PSU, so the pump was connected straight to the PSU, which makes it run at full speed. As for the H220 res, it's as easy as any other reservoir; just make sure one of the outlets feeds straight into the pump.

You can't use two pumps in a system, and any other pump you buy will be stronger than the H220's, so use that one. Like another user said, read the water cooling thread on these forums. Do not jump into watercooling unless you know the risks; you wouldn't want anything happening to your R9 290X!

More relevant to the thread: should I bite the bullet and get the EK waterblock, or should I wait? I don't know much about their performance and there is nothing else to compare it to. I really want to get rid of this stock cooler :/


----------



## fleetfeather

Those of you under water, how much rad space would you recommend per 290X?


----------



## utnorris

Aquacomputer has one coming out that looks pretty sweet, and right now I don't think anyone has the EK blocks in stock in the U.S. Europe may be different.


----------



## skupples

Quote:


> Originally Posted by *GioV*
> 
> Yes I was, you always bleed a system with the motherboard disconnected from the PSU so the pump was connected straight to the PSU which makes it run at full speed. As for the H220 res, its as easy as any other reservoir, just make sure one of the outlets feeds straight into the pump.
> 
> You cannot use 2 pumps in a system and any other pump you buy will be stronger than the H220 so use that. Like another user said, read the water cooling thread on these forums. Do not jump into watercooling unless you know the risks, you wouldn't want anything happening to your R9 290X!
> 
> More relevant to the thread, Should I bite the bullet and get the EK waterblock or should I wait? I do not know too much about their performance and there is nothing else to compare it to. I really want to get rid of this stock cooler :/


EK's are baller status for sure, especially with the active VRM cooling... As I said earlier, all waterblocks cool the chip about the same (+/- 1-3C); the big difference between blocks is whether they make VRM contact.


----------



## EvilGnomes

Cant find the memory info program, could somebody post a link?


----------



## FtW 420

Quote:


> Originally Posted by *EvilGnomes*
> 
> Cant find the memory info program, could somebody post a link?


It's in the first post http://kingpincooling.com/forum/showthread.php?t=2473


----------



## Arm3nian

Quote:


> Originally Posted by *DampMonkey*
> 
> By the way, my sapphire had elpida memory, no sign of hynix


Did you get the bf4 edition? I heard the bf4 editions come with hynix, non bf4 editions come with elpida.


----------



## Sazz

Quote:


> Originally Posted by *DampMonkey*
> 
> Been doing a furmark burn in for 15 minutes now at stock clocks. Maxing at 38*C......


The heck really? what CPU are you using again in that loop and at what clocks/voltage?

Quote:


> Originally Posted by *Arm3nian*
> 
> Did you get the bf4 edition? I heard the bf4 editions come with hynix, non bf4 editions come with elpida.


Just a noob question: what's the difference between the two? Is one a better overclocker than the other?

Right now I can't seem to get my clocks over 1100/1350 stable; I just get a black screen whenever I try to run benchmarks.


----------



## Arm3nian

Quote:


> Originally Posted by *Sazz*
> 
> The heck really? what CPU are you using again in that loop and at what clocks/voltage?
> Just a noob question, what's the difference between the two? is one better overclocker than the other?
> 
> Right now I don't get to seem to increase my clocks over 1100/1350 stable I just get black screen whenever I try to run benchmarks.


Yeah, Hynix is usually the better quality one. Some did mention, though, that if you flash another BIOS onto cards with the Elpida memory, you can get higher clocks out of them.


----------



## cookiesowns

Is there a database of which GDDR ICs each brand uses?


----------



## Arm3nian

Quote:


> Originally Posted by *cookiesowns*
> 
> IS there a database of what GDDR IC's each brand uses?


Sapphire R9 290X BF4 - SK Hynix
Sapphire R9 290 - Elpida
Asus R9 290X BF4 - SK Hynix
Gigabyte R9 290X BF4 - Elpida
AMD press R9 290X - SK Hynix
HIS R9 290X - SK Hynix / Elpida
XFX is also Elpida, I think
PowerColor is Elpida as well
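
These reports are anecdotal (and contested later in the thread, since at least one BF4 Sapphire turned up with Elpida), but as a quick lookup they might be sketched like this — card names and values are taken straight from the list above, not independently verified:

```python
# Memory-IC reports collected in this thread -- anecdotal, varies by batch
MEMORY_IC = {
    "Sapphire R9 290X BF4": "SK Hynix",
    "Sapphire R9 290": "Elpida",
    "Asus R9 290X BF4": "SK Hynix",
    "Gigabyte R9 290X BF4": "Elpida",
    "AMD Press R9 290X": "SK Hynix",
    "HIS R9 290X": "SK Hynix / Elpida (mixed)",
    "XFX R9 290X": "Elpida (unconfirmed)",
    "PowerColor R9 290X": "Elpida",
}

def memory_ic(card):
    """Look up the reported memory IC for a card, defaulting to unknown."""
    return MEMORY_IC.get(card, "unknown (check with the memory info tool)")
```

The only way to know what your specific card has is to run the memory info tool linked earlier in the thread.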


----------



## Arizonian

Quote:


> Originally Posted by *Arm3nian*
> 
> Sapphire R290X BF4 - SK Hynix
> Sapphire R290 - Elpida
> Asus R290X BF4 - SK Hynix
> Gigabyte R290X BF4 - Elpida
> AMD Press R290X - SK Hynix
> HIS R290X - SK Hynix / Elpida
> XFX is also Elpida I think
> Powercolor is Elpida as well


I'm going to add that memory info to OP.


----------



## armartins

Please let us know water results... We need to know what clocks these cards can hold when temperature is not a problem. But please, not only max benchmark results: show us the max OC at stock voltage, then at least two more voltage increments, and finally the results at max voltage.


----------



## DampMonkey

Quote:


> Originally Posted by *Arizonian*
> 
> I'm going to add that memory info to OP.


That's wrong. My BF4 Sapphire 290X had Elpida.


----------



## selk22

So damp monkey you had it at 1200Core and 6000Mem? Very nice if so!


----------



## DampMonkey

Quote:


> Originally Posted by *selk22*
> 
> So damp monkey you had it at 1200Core and 6000Mem? Very nice if so!


Still playing around with it to see what voltages work best. 1220/1500 (6000 effective) was relatively easy. Loads have been around 40*C in Heaven. My room must be cold or something.


----------



## Arm3nian

Quote:


> Originally Posted by *DampMonkey*
> 
> Thats wrong. My bf4 sapphire 290x had elpida


It seems mixed. Some HIS also have hynix.

Did you check with the memory tool?
If so, I guess Sapphire is also hynix/elpida on both editions


----------



## wolfej

I posted this earlier, but I'm guessing you missed it so here's my two 290x's.



Also I got both of these off of Amazon and it didn't list them as being BF4 editions, but both cards came with a code so yay for that.


----------



## selk22

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *DampMonkey*
> 
> still playing around with it to see what voltages work best. 1220/1500(6000) was relatively easy. Loads have been around 40*C in heaven. my room must be cold or something






Awesome well that's my exact card.. did you need to flash it to Asus Bios to Volt tweak?

Or is this at stock volts?

I am getting my WC setup ASAP..

just sold my gtx 580 to buy the parts


----------



## th3illusiveman

Quote:


> Originally Posted by *DampMonkey*
> 
> still playing around with it to see what voltages work best. 1220/1500(6000) was relatively easy. Loads have been around 40*C in heaven. my room must be cold or something


Have you managed to fix the issue you had with Valley? 61fps is extremely low for the clocks you're reporting; some very highly clocked 7970s were beating that by 2-3fps a few months back. You should be in the mid-70fps range if tsm's score is anything to go by.

Did you check for vdroop, throttling, artifacts, and performance loss from unstable memory overclocks?


----------



## Arizonian

Quote:


> Originally Posted by *wolfej*
> 
> I posted this earlier, but I'm guessing you missed it so here's my two 290x's.
> 
> 
> 
> Also I got both of these off of Amazon and it didn't list them as being BF4 editions, but both cards came with a code so yay for that.


Ooops sorry I did miss that. Congrats on TWO. Added to list









Are you water cooling those, out of curiosity? If not, what's your PSU wattage?

Also OCN'ers love to see peoples rig specs you should list them.









*"How to put your Rig in your Sig"*


----------



## evensen007

Quote:


> Originally Posted by *DampMonkey*
> 
> still playing around with it to see what voltages work best. 1220/1500(6000) was relatively easy. Loads have been around 40*C in heaven. my room must be cold or something


This.... is promising. My daily gaming o/c on my 7970's is 1250 on the core. If the 290x handles 1225-1250 without issue the card is going to absolutely murder BF4 and everything else. Keep us posted!


----------



## armartins

Please, guys, post voltage, clocks, temps, and PowerTune settings when talking about OC results, water or air, so we can start estimating the average OC these cards will hit!


----------



## wolfej

Yes I'm WCing them (blocks should get here Wednesday) and I've got a kilowatt PSU.

Weird, my rig used to be in my sig. Did they wipe them or something? I've got a 7970 rig now and I had it in there. Hmmm.


----------



## Arizonian

Quote:


> Originally Posted by *wolfej*
> 
> Yes I'm WCing them (blocks should get here Wednesday) and I've got a kilowatt PSU.
> 
> Weird, my rig used to be in my sig. Did they wipe them or something? I've got a 7970 rig now and I had it in there. Hmmm.


Thanks was just curious.

Quote:


> Originally Posted by *evensen007*
> 
> This.... is promising. My daily gaming o/c on my 7970's is 1250 on the core. If the 290x handles 1225-1250 without issue the card is going to absolutely murder BF4 and everything else. Keep us posted!


Mine on air does 1100/1300 at stock voltage. I can run it in Uber mode; it hits 92C but will throttle from 1100 to 1058MHz on the core. However, with a manual 70% fan setting (still very tolerable audibly) I stay pegged at 1100 on the core the whole time. Both scenarios are at 100% GPU usage.

Speaking of playing: only 11 minutes as of my typing before I log off and hit the BF4 campaign.


----------



## Paulenski

Are there backplates that have holes to attach an aftermarket cooler?

I want some back support for the mk-26 I'm getting

Sent from my SGH-M919 using Tapatalk


----------



## infranoia

Quote:


> Originally Posted by *Arm3nian*
> 
> It seems mixed. Some HIS also have hynix.
> 
> Did you check with the memory tool?
> If so, I guess Sapphire is also hynix/elpida on both editions


So both HIS and Sapphire are confirmed to be mixed. My Sapphire is Hynix:



Oh, dear. Look at the time. Gotta go. Won't come up for air until I'm fragged.


----------



## jrcbandit

So disappointed, this arrived today:


But my video card is sitting at UPS until delivery tomorrow! Well, I have no idea how I will dismantle my loop since I have never done maintenance, so it might take me a while anyway...

Also, any idea on what kind of power draw to expect when upping the voltage?


----------



## Sazz

In other matters, BF4 is officially open for the US! Logging on right now xD


----------



## Arm3nian

Quote:


> Originally Posted by *infranoia*
> 
> So both HIS and Sapphire are confirmed to be mixed. My Sapphire is Hynix:
> 
> 
> 
> Oh, dear. Look at the time. Gotta go. Won't come up for air until I'm fragged.


Yeah we have already seen both Hynix and Elpida HIS, but he is the only one so far that claims his BF4 Sapphire is Elpida. Either he misread, since he hasn't posted a screen of it, or Sapphire screwed up lol.


----------



## tsm106

Quote:


> Originally Posted by *jrcbandit*
> 
> So disappointed, this arrived today:
> 
> 
> But my video card is sitting at UPS until delivery for tomorrow! Well I have no idea how I will dismantle my loop since I have never done maintenance so it might take me awhile anyway....
> 
> Also, any idea on what kind of power draw to expect when upping the voltage?


Play with the stock Asus BIOS first. Do not jump to the unlocked BIOS without FIRST getting a feel for how volts affect things and the associated droop. Imo, most people would be better off with the stock Asus BIOS; it's much safer. The unlocked BIOSes are really for seasoned overclockers who acknowledge they are the only impediment to a bad outcome. Get to know the stock BIOS before moving on, and don't ramp the memory up too fast either.
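A rough way to ballpark the power-draw question above is the standard dynamic-power approximation, with power scaling as frequency times voltage squared. This is a sketch only: the baseline wattage below is a made-up assumption, not a measured figure, and real draw adds static leakage and VRM losses on top.

```python
# Rough dynamic-power scaling estimate: P scales roughly with f * V^2.
# Baseline numbers here are hypothetical, not measurements of any card.

def scaled_power(p_base, f_base, v_base, f_new, v_new):
    """Estimate new GPU power draw after a clock/voltage change."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# Assume ~250 W at 1000 MHz / 1.20 V, then push to 1100 MHz / 1.30 V
est = scaled_power(250, 1000, 1.20, 1100, 1.30)
print(round(est))  # roughly 323 W under this approximation
```

So even a modest voltage bump compounds quickly, which is why easing into it on the stock BIOS first is the safer path.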


----------



## DampMonkey

Quote:


> Originally Posted by *Arm3nian*
> 
> Yeah we have already seen both Hynix and Elpida HIS, but he is the only one so far that claims his BF4 Sapphire is Elpida. Either he misread, since he hasn't posted a screen of it, or Sapphire screwed up lol.


It was definitely Elpida. I tried taking pictures of it, but the text was so faint it needed to be in the right light to see. The Hynix I've seen in pictures was clearly visible. This was just readable enough to see "elpida".


----------



## pounced

With this card bf4 runs like a wet dream!


----------



## Arm3nian

Quote:


> Originally Posted by *DampMonkey*
> 
> It was definitely Elpida. I tried taking pictures of it but the text was so faint and it needed to be in the right light too see. The hynix ive seen in pictures was clearly visible. This was just readable enoug to see "elpida"


Try using the memory tool to be certain. Chips are labeled during manufacturing, but still lol. I haven't taken my blocks off yet so idk what mine are. Where did you order yours from?


----------



## tsm106

Memory type doesn't really matter outside of preference or OCD, since you will be flashing it anyway, so it will run at whatever timings are written in the BIOS. The only diff between the two is the timings, in essence, since they are both rated the same.


----------



## Arm3nian

Quote:


> Originally Posted by *tsm106*
> 
> Memory type doesn't really matter outside of preference or ocd, since you will be flashing it anyways, so it will run at whatever timing is written on the bios. The only diff between the two are the timings in essence since they are both rated the same.


They will run the same once flashed, but there is no telling whether the cheaper one will clock as high. The best Elpida could be binned higher than a crappy Hynix and allow for higher clocks, but I'm speaking in general terms.


----------



## EvilGnomes

Bummer. Sapphire should have just used straight Hynix chips instead of the cheaper and worse Elpida.


----------



## jjjc_93

Why so much hate on the Elpida memory? Can somebody tell me the performance difference, or is everyone just going with the flock on this one?









I have an Elpida card benching 1650 MHz memory no problems, same as on most 780s. 780s have a mix of the two now that Samsung is unavailable, and they both clock similarly with a ~1700 wall.

EDIT: Also, these vendors buy bundles, so to speak; memory and core come together, so it might not be an issue of cheaping out but of availability instead. At least that's how it works with Nvidia.


----------



## fleetfeather

A few questions:

1. BF4 Benchmarks?
2. Radiators needed per 290X?
3. Average OC on both air and water?


----------



## tsm106

Quote:


> Originally Posted by *fleetfeather*
> 
> A few questions:
> 
> 1. BF4 Benchmarks?
> 2. Radiators needed per 290X?
> 3. Average OC on both air and water?


2- One 360mm per 290X; generally, the more the merrier.
3- If you can get 1125 on stock volts, it's a good start; on water we don't know yet. I was able to hit 1250 with a less-than-stellar GPU-only setup. A full-cover block with better cooling on the RAM and VRM should achieve a bit more.


----------



## skupples

I just need to point out, backplates are 99.96% aesthetics.
Quote:


> Originally Posted by *jjjc_93*
> 
> Why so much hate on the epilda memory, can somebody tell me the performance difference or is everyone just going with the flock on this one?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have an epilda card benching 1650mhz memory no problems, same as on most 780s. 780s have a mix of the 2 now that Samsung is unavailable and they both clock similarly with a ~1700 wall.
> 
> EDIT: Also these vendors buy bundles so to speak, it comes with memory and core together and might not be an issue of cheaping out but instead availability. At least that's how it works with Nvidia.


*As of recent,* the anti-Elpida sentiment started with the 780 Classifieds.


----------



## DampMonkey

Quote:


> Originally Posted by *skupples*
> 
> I just need to point out, back plates are 99.96% aesthetic's.
> *as of recent,* the anti-elpida started with 780 classis.


While they are mainly for looks, mine does get quite warm when the card is loaded, so that's definitely a good thing. This EK backplate comes with thermal pads for the back of the GPU and VRMs, so I know it's taking heat away from the PCB. Whether it has a measurable effect on anything, I doubt it. I'd buy it again though!! Probably keeps the card from warping as well.


----------



## fleetfeather

Quote:


> Originally Posted by *tsm106*
> 
> 2- 360mm per 290x, generally the more the merrier.
> 3- if you can get 1125 on stock volts, its a good start, on water we don't know yet. I was able to hit 1250 with a less than stellar gpu only setup. Fullcover with better cooling on ram and vrm should achieve a bit more.


So I'm looking at a phobya 1080 for CF 290X + CPU







Yeah when i get the cash together for a loop I'll go for EK blocks


----------



## tsm106

No way in hell I would do a block w/o a backplate. It's a first line of defense against dropped screws on your PCB and any other environmental debris, screwdriver heads, etc. It provides some cooling as well. It also helps to prevent the dreaded sagging butt syndrome.

Elpida got a bad rap on the AMD GPU side too cuz 7950s got the slower-rated Elpida, and that bad rep stuck.

Quote:


> Originally Posted by *fleetfeather*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> 2- 360mm per 290x, generally the more the merrier.
> 3- if you can get 1125 on stock volts, its a good start, on water we don't know yet. I was able to hit 1250 with a less than stellar gpu only setup. Fullcover with better cooling on ram and vrm should achieve a bit more.
> 
> 
> 
> So I'm looking at a phobya 1080 for CF 290X + CPU
> 
> 
> 
> 
> 
> 
> 
> Yeah when i get the cash together for a loop I'll go for EK blocks

It is hard to beat a Nova for price vs perf. Awesome way to start a loop. Never worry about water temp with that setup. Don't forget to plan for some QDCs (quick disconnects) too; they will make life sooo much easier with external rads.


----------



## Arm3nian

Quote:


> Originally Posted by *tsm106*
> 
> No way in hell I would do a block w/o a backplate. It's a first line of defense against dropped screws on your pcb and any other element/environmental debris, screwdriver head, etc etc. It provides some cooling as well. IT also helps to prevent the dreaded sagging butt syndrome.
> 
> Elpida got a bad rap on amd gpu side too cuz 7950s got the slower rated elpida and that bad rep stuck.
> It is hard to beat a nova for price vs perf. Awesome way to start a loop. Never worry about water temp with that setup. Don't forget to plan for some qdc too, will make life sooo much easier with external rads.


Hmm, so much rad.

Think I need more than three SR1 480s? Probably running 1000-1400 rpm fans. Going to be aiming for the highest overclock possible on a 4930K and two 290Xs.

Now that I think of it, I'm going to be running 2 pumps and watercooling the mobo chipset, mosfets, and quad-channel RAM, hmm... good thing my case supports 14x 480 rads internally w/o any modding.


----------



## DampMonkey

Got a 3dmark11 performance run in. I need more voltage! This asus rom maxes out at 1410mV.
This score was done with 1228 / 1568 (6276) at 1.4V. maxed out at 47*C. CPU is a 4.9ghz 8350

http://www.3dmark.com/3dm11/7392783


----------



## Arm3nian

Quote:


> Originally Posted by *DampMonkey*
> 
> Got a 3dmark11 performance run in. I need more voltage! This asus rom maxes out at 1410mV.
> This score was done with 1228 / 1568 (6276) at 1.4V. maxed out at 47*C. CPU is a 4.9ghz 8350
> 
> http://www.3dmark.com/3dm11/7392783


That's an awful lot of volts for the clock. TSM got 1250 with 1.3 volts, I think. You can try the other BIOS at the Kingpin site; it goes to 2 volts. Don't kill your card though.


----------



## Taint3dBulge

Need to see some temps of a 290X under water during a FurMark test. Anyone got any screenies of these ~40C temps I'm reading about?


----------



## RocketAbyss

Quote:


> Originally Posted by *Arm3nian*
> 
> Sapphire R290X BF4 - SK Hynix
> Sapphire R290 - Elpida
> Asus R290X BF4 - SK Hynix
> Gigabyte R290X BF4 - Elpida
> AMD Press R290X - SK Hynix
> HIS R290X - SK Hynix / Elpida
> XFX is also Elpida I think
> Powercolor is Elpida as well


PowerColor is Hynix, not Elpida. That memory info tool I used showed my PowerColor having Hynix. I will post a screenshot when I'm back from work in a few hours.


----------



## Arm3nian

Quote:


> Originally Posted by *RocketAbyss*
> 
> PowerColor is Hynix not Elpida. That memory info thing I used showed my PowerColor having Hynix. I will post a screenshot when im back from work in a few hours


Almost every review said PowerColor uses Elpida. I guess if yours uses Hynix then we can conclude that basically every brand uses both. Was yours a BF4 edition, or does PowerColor not have that?


----------



## Taint3dBulge

Here's a question, since I'm going to be using an aftermarket air cooler: to get temps down on the RAM and the VRMs, would it be OK to use thermal tape on the backside, as long as it touches the backplate? Or would that be a waste of time and do nothing?


----------



## alancsalt

Top 30 3DMark11 Scores for Single/Dual/Tri/Quad

Hoping to see some of the benchers here in that thread..


----------



## RocketAbyss

Quote:


> Originally Posted by *Arm3nian*
> 
> Almost every review said powercolor uses elpida. I guess if yours uses hynix then we can conclude that basically every brand uses both. was yours a bf4 edition or does powercolor not have that.


See my post very early in the thread. Bf4 edition


----------



## Arizonian

Ok guys - here is my first time playing BF4, so be nice to me - I was trying to find the action on a new map I've not seen before. FRAPS is running in the upper left. Saw 45 FPS lows, 72 FPS highs, and I'd say an average of 55-65 FPS on a single 290X on air. For those interested in how it's going to do, I made a video. I'm excited that after Mantle, where we know 100% that DICE is on board, we'll see even higher gains come mid-November, not counting what drivers will do as they move forward.









*BF4 MULTIPLAYER Ultra Settings - Stock Cooling - 1100 CORE / 1300 MEMORY - U2713HM 2560 x 1440 (Pre-Mantle 13.11 Beta 7)*






*MSI Afterburner - 83C Temp - 70% Manual Fan - 100% GPU Usage - 1100 Core Clock / 1300 Memory Clock (No downclocking) - 2388 MB VRAM Usage*



Spoiler: AB readings : Proof of 13:25 Time


----------



## Arm3nian

Quote:


> Originally Posted by *RocketAbyss*
> 
> See my post very early in the thread. Bf4 edition


Luck of the draw then for all brands.


----------



## Forceman

Well, bollocks. Got my 290X this afternoon, popped it in and ran a Metro LL bench just to get warmed up. Everything looked good, so I kicked it to 1100/1300 and ran it again. Still good. Ran Heaven and it locked up in the middle with a driver reset. Rebooted and tried again, and the computer hard rebooted, no warning at all. Dropped the clock to stock - same thing, hard reboot about 10 seconds in. Lowered clocks to 800/1000 just to see what would happen - same thing. Next time I started, the MB just hung at GPU Initialization. Tried different slots, different power connections, nothing. Dead card. I don't know what went wrong - based on the reboots I'm guessing it may be something in power delivery, but for now it's RMA time. Since there's no chance they'll have stock to replace the card, I guess I'll get a refund - so now I'm thinking maybe a 290 makes more sense.


----------



## fleetfeather

Quote:


> Originally Posted by *Forceman*
> 
> Well bullocks. Got my 290X this afternoon, popped it in and ran a Metro LL bench just to get warmed up. Everything looked good so I kicked it to 1100/1300 and ran it again. Still good. Ran Heaven and it locked up in the middle with a driver reset. Rebooted and tried again and the computer hard rebooted, no warning at all. Dropped clock to stock - same thing, hard reboot about 10 seconds in. Lower clocks to 800/1000 just to see what would happen - same thing. Next time I started the MB just hung at GPU Initialization. Tried different slots, different power connections, nothing. Dead card. I don't know what went - based on the reboots I'm guessing it may be something in power delivery, but for now it's RMA time. Since there's no chance they'll have stock to replace the card I guess I'll get a refund - so now I'm thinking maybe a 290 makes more sense.


that's rough buddy..

Sapphire?


----------



## Forceman

Quote:


> Originally Posted by *fleetfeather*
> 
> that's rough buddy..
> 
> Sapphire?


Yeah, Sapphire BF4 edition. The first stock Metro LL run was nice though - 36% higher than my GTX 680.


----------



## fleetfeather

Quote:


> Originally Posted by *Forceman*
> 
> Yeah, Sapphire BF4 edition. The first stock Metro LL run was nice though - 36% higher than my GTX 680.


ugh, making me nervous about my sapphires which are due in a week haha... Are you able to confirm the memory IC for your card?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Forceman*
> 
> Well bullocks. Got my 290X this afternoon, popped it in and ran a Metro LL bench just to get warmed up. Everything looked good so I kicked it to 1100/1300 and ran it again. Still good. Ran Heaven and it locked up in the middle with a driver reset. Rebooted and tried again and the computer hard rebooted, no warning at all. Dropped clock to stock - same thing, hard reboot about 10 seconds in. Lower clocks to 800/1000 just to see what would happen - same thing. Next time I started the MB just hung at GPU Initialization. Tried different slots, different power connections, nothing. Dead card. I don't know what went - based on the reboots I'm guessing it may be something in power delivery, but for now it's RMA time. Since there's no chance they'll have stock to replace the card I guess I'll get a refund - so now I'm thinking maybe a 290 makes more sense.


I feel for you.....making me nervous now
Quote:


> Originally Posted by *fleetfeather*
> 
> ugh, making me nervous about my sapphires which are due in a week haha... Are you able to confirm the memory IC for your card?


Mine should be due tomorrow... Star Track refuses to deliver to my door, so it's sitting in a PO about 120 km from me... I have to arrange for AusPost to take it the rest of the way. Not the first time Star Track has done this... last time I'll use them though. Door-to-door couriers, my ass.


----------



## selk22

My Sapphire BF4 edition is running at 1100/1400 no problem right now on air, with a custom fan profile that maxes at 70%.

Not sure what the mem I have is but I will know when I WC it









Running the BF4 campaign at 1920x1200 at max settings without a problem at all... 60 fps nearly constant except for a few parts with insane effects. Took off vsync and in some parts was getting over 110 FPS.

Very pleased









Edit: I can't wait for Mantle... it's going to get even better! I can't believe it.


----------



## Forceman

Quote:


> Originally Posted by *fleetfeather*
> 
> ugh, making me nervous about my sapphires which are due in a week haha... Are you able to confirm the memory IC for your card?


No, didn't think to check it.


----------



## Arizonian

Quote:


> Originally Posted by *Forceman*
> 
> No, didn't think to check it.


I came from Nvidia too and had issues until I used a utility to remove the drivers cleanly, and it has been working like a charm since. That was even after I did a manual removal from the registry.

On OP:

*Nvidia Drivers Un-install Utility*


----------



## alancsalt

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> Well bullocks. Got my 290X this afternoon, popped it in and ran a Metro LL bench just to get warmed up. Everything looked good so I kicked it to 1100/1300 and ran it again. Still good. Ran Heaven and it locked up in the middle with a driver reset. Rebooted and tried again and the computer hard rebooted, no warning at all. Dropped clock to stock - same thing, hard reboot about 10 seconds in. Lower clocks to 800/1000 just to see what would happen - same thing. Next time I started the MB just hung at GPU Initialization. Tried different slots, different power connections, nothing. Dead card. I don't know what went - based on the reboots I'm guessing it may be something in power delivery, but for now it's RMA time. Since there's no chance they'll have stock to replace the card I guess I'll get a refund - so now I'm thinking maybe a 290 makes more sense.
> 
> 
> 
> I feel for you.....making me nervous now
> Quote:
> 
> 
> 
> Originally Posted by *fleetfeather*
> 
> ugh, making me nervous about my sapphires which are due in a week haha... Are you able to confirm the memory IC for your card?
> 
> 
> Mine should be due tomorrow.....Star-track refuse to deliver to my door so it's sitting in a PO about 120 kms from me......have to arrange for AusPost to take it the rest of the way, Not the first time Star-Track has done this.....last time i'll use them though, Door to Door couriers my ass.

The Post Office bought Star Track out, so you're really dealing with them anyway. Where I am, the last Star Track PO is 30 km away, so I follow the tracking and usually pick it up rather than wait a full day more for the PO to cover the last 30 km.


----------



## Forceman

Quote:


> Originally Posted by *Arizonian*
> 
> I came from Nvidia too and had issues until I used a utility to remove the drivers clean and have been working like a charm since. That was even after I did a manual removal from registry.
> 
> On OP:
> 
> *Nvidia Drivers Un-install Utility*


What kind of issues? It seemed to work fine at first, but after the third reboot it wouldn't even boot up, so I don't think it was a driver issue. Motherboard would just hang with a 62 code (GPU Initialization). I think something must have fried. Unfortunately I had already redeemed the BF4 code, so I guess Newegg will dock me the full price of that. Maybe it's for the best, now I can "downgrade" to a 290 instead.

Was really looking forward to some 290X powered BF4 action tomorrow though.


----------



## Sazz

I am running BF4 at 6020x1080 (triple-monitor Eyefinity), and with everything maxed out including in-game AA I am getting 30 fps on average, while with in-game AA off I am getting around 45-60 fps. Not bad for a single-card config, not to mention I was playing 64-player games, and during busy scenes the lowest I've seen it dip is 42 fps.


----------



## Sgt Bilko

Quote:


> Originally Posted by *alancsalt*
> 
> The Post Office bought Star Track out, so you're really dealing with them anyway. Where I am, the last Star Track PO is 30K away, so I follow the tracking and usually pick it up rather than wait a full day more for the PO to cover the last 30K.


Well, that's just stupid. I heard Star Track bought out AAE (Australian Air Express), but I ordered a courier service, not regular post. Complete waste of money if it's going to turn up at the same time as regular mail and I still need to go to the PO to pick it up...

Tomorrow hopefully; then I pick up BF4 on Thursday.


----------



## alancsalt

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *alancsalt*
> 
> The Post Office bought Star Track out, so you're really dealing with them anyway. Where I am, the last Star Track PO is 30K away, so I follow the tracking and usually pick it up rather than wait a full day more for the PO to cover the last 30K.
> 
> 
> 
> well thats just stupid, I heard the Star Track bought out AAE (Australian Air Express), But i ordered a Courier service.....not regular post, complete waste of money if it's going to turn up at the same time as regular mail and i still need to go to the PO to pick it up.....
> 
> Tomorrow hopefully, then i pick up BF4 on Thursday.

Ah, on 2 October 2012, Qantas announced it would acquire Australia Post's 50% interest in AaE, in return for Australia Post acquiring Qantas' 50% interest in Star Track Express. Confusing?


----------



## Arizonian

Quote:


> Originally Posted by *Forceman*
> 
> What kind of issues? It seemed to work fine at first, but after the third reboot it wouldn't even boot up, so I don't think it was a driver issue. Motherboard would just hang with a 62 code (GPU Initialization). I think something must have fried. Unfortunately I had already redeemed the BF4 code, so I guess Newegg will dock me the full price of that. Maybe it's for the best, now I can "downgrade" to a 290 instead.
> 
> Was really looking forward to some 290X powered BF4 action tomorrow though.


I couldn't get BF3 to spark up; it would freeze. Just thinking you might be having driver conflicts, and it might at least rule that out.


----------



## Sgt Bilko

Quote:


> Originally Posted by *alancsalt*
> 
> Ah, on 2 October 2012, Qantas announced it would acquire Australia Post's 50% interest in AeA, in return for Australia Post acquire Qantas' 50% interest in Star Track Express. Confusing?


Argh, this makes my head hurt. Next time I'm just gonna drive the 600-something km and get it myself.

I could use a good drive anyway.


----------



## overclockFrance

I installed the EK copper waterblock on my HIS 290X yesterday and tested the overclock:

- On air with 100% fan at stock volts, the card was able to run every benchmark at 1130/1650 MHz (Elpida memory modules). Games were stable at 1100/6400 MHz.
- Under water:
- Stock volts: 1130/1650 MHz. No overclocking improvement, but what silence!
- 1410 mV: 1260/1650 MHz, stable in every benchmark, but I have several remarks:

- I suspect the GPU Tweak voltage readout is inaccurate. I check real-time temperatures, voltages, and frequencies with the Logitech G19 display and AIDA64: at a 1410 mV setting, the actual GPU voltage is around 1.28-1.29 V and it fluctuates depending on the scene. I think the AIDA64 voltages are accurate, since GPU and VRM temperatures are not very high: 43°C and 62°C respectively in 3DMark Vantage.

- At 1260 MHz, whatever the RAM frequency, the display is corrupted, not artifacted: it fluctuates between normal and blurred. If I lower the GPU frequency to 1230, I have the same problem. Tested in every benchmark and Crysis 3: the characters are blurry and unreadable, as if drawn in 640x480 mode. If I then reboot the computer, the image is crisp again. I tested every bench and Crysis 3 without overclocking: no problem. I moved the driver from 13.11 Beta 6 to Beta 7: no change. Since changing the GPU frequency does not solve the problem, I assume the memory or the memory controller is responsible. Would it be possible to increase the memory voltage? Otherwise, would you have an idea?

I flashed my HIS card with the ASUS BIOS without any problems, which is strange since my card has Elpida modules whereas the ASUS ones are said to have Hynix memory.

Scores at 1260/6400 MHz:
3DMark Vantage Extreme: 32416 (and 274 W consumed!)
3DMark 2013: 6373
Unigine Heaven 4.0 2560x1600, everything maxed, AA 0x: 1155
3DMark 11: X5514
Crysis 3 2560x1600, everything maxed except AA 0x: 35-40 fps at the beginning of the game, in the rain. Same as my previous GTX 690.
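The gap between the 1410 mV set in GPU Tweak and the ~1.28-1.29 V that AIDA64 reports under load is the vdroop mentioned earlier in the thread; the percentage is simple to compute (the readings are from the post above, the midpoint is my assumption):

```python
# Quick vdroop calculation: requested voltage vs. loaded voltage actually read.

def vdroop_percent(v_set, v_actual):
    """Percentage drop from the requested voltage to the measured one."""
    return (v_set - v_actual) / v_set * 100

# 1410 mV requested; ~1.285 V taken as the midpoint of the 1.28-1.29 V reading
print(round(vdroop_percent(1.410, 1.285), 1))  # about 8.9 %
```

Nearly 9% droop would explain why clocks dialed in at a given software voltage behave like a noticeably lower real voltage.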


----------



## psyside




----------



## Paul17041993

Quote:


> Originally Posted by *psyside*


my god I lol-ed hard... though the upper pic needs cropping...


----------



## psyside

Don't know why the [H] pics show up so small; I set them up for 500px...


----------



## Sgt Bilko

Arctic Cooling replied to my e-mail (sent 3 days ago) about what products they recommend for the 290x.

Already been posted in this thread but they recommend the Accelero Xtreme III - http://www.arctic.ac/en/p/cooling/vga/554/accelero-xtreme-iii.html?c=2182 and the Accelero Hybrid - http://www.arctic.ac/en/p/cooling/vga/569/accelero-hybrid.html?c=2182

Hope that helps someone out.


----------



## TheSoldiet

I think BF4 has a benchmarking tool. Go to the Test Range (game mode) and then video settings; it should be there.


----------



## pharma57

Quote:


> Originally Posted by *Arizonian*
> 
> I came from Nvidia too and had issues until I used a utility to remove the drivers clean and have been working like a charm since. That was even after I did a manual removal from registry.
> 
> On OP:
> 
> *Nvidia Drivers Un-install Utility*


I definitely recommend this tool for really getting your PC clean of any AMD or Nvidia drivers.







Does wonders (it found stuff from 2 years ago when I was running a 560 Ti). The direct link to get Wagnard's latest version and ask any questions is below:

http://forums.guru3d.com/showthread.php?t=379506


----------



## Paul17041993

Quote:


> Originally Posted by *psyside*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Dont know why the [H] pics show so small, i set it up for 500x....


I think it's because of the width, hence why I mentioned cropping...


----------



## moeqawama

So, I'm having issues with Battlefield 4... in fact, it's the only game that has been underwhelming performance-wise since I got the 290X. At Ultra settings, when I do a performance test, it tells me that my settings are bad and that I need to knock them down to get better FPS. I honestly have no idea what could be causing this game to run so poorly. It's definitely not my CPU (i7 3770K OC'd to 4.5 GHz) and not my RAM (G.Skill Ripjaws-X 16 GB @ 2133 CL 9-9-10-27 1T). I have Beta 7 installed. Any suggestions, guys? I thought this game of all games would sing on my 290X. I might either roll back to Beta 6 or re-install Beta 7. Any suggestions/help much appreciated.


----------



## provost

Quote:


> Originally Posted by *Forceman*
> 
> Well bullocks. Got my 290X this afternoon, popped it in and ran a Metro LL bench just to get warmed up. Everything looked good so I kicked it to 1100/1300 and ran it again. Still good. Ran Heaven and it locked up in the middle with a driver reset. Rebooted and tried again and the computer hard rebooted, no warning at all. Dropped clock to stock - same thing, hard reboot about 10 seconds in. Lower clocks to 800/1000 just to see what would happen - same thing. Next time I started the MB just hung at GPU Initialization. Tried different slots, different power connections, nothing. Dead card. I don't know what went - based on the reboots I'm guessing it may be something in power delivery, but for now it's RMA time. Since there's no chance they'll have stock to replace the card I guess I'll get a refund - so now I'm thinking maybe a 290 makes more sense.


Which 290x is it? Can you post a pic


----------



## Sgt Bilko

Quote:


> Originally Posted by *moeqawama*
> 
> So, having issues with Battlefield 4... in fact it's the only game that has been underwhelming performance-wise since I got the 290X. At ultra settings, when I do a performance test, it tells me that my settings are bad and that I need to kick them down to get better FPS. I honestly have no idea what it can be causing this game to run crappy. It's definitely not my CPU (i7 3770k OC'd to 4.5ghz), not my RAM (G.Skill ripjaws-x 16 GB @2133 CL 9-9-10-27 1T). I have the Beta 7 installed. Any suggestions, guys? I thought this game of all the games would sing on my 290X. I might either roll back to Beta 6 or re-install Beta 7. Any suggestions/help are/is much appreiated


Clean wipe of the Drivers and then re-install Beta-7?

If that fails then rinse and repeat with beta 6.

Only things i can think of atm.


----------



## $ilent

So what's up, my BF4 is saying preload for a November 1st release?

I thought it was out at midnight...

Well, nvm, just seen it's only the US release. What a joke.


----------



## fleetfeather

diff timezones = diff release, maybe? Idk. It's approaching 12:01am 10/30/13 here


----------



## Sgt Bilko

Quote:


> Originally Posted by *$ilent*
> 
> So whats up, my BF4 is saying preload for November 1st release?
> 
> I thought it was out at midnight...


http://help.ea.com/article/battlefield-4-worldwide-release-schedule

You can pre-load 24 hours before release in your region


----------



## TheSoldiet

Just use an American VPN and you will be set. Working for me in Norway!


----------



## $ilent

What's a VPN? (Well, I know it's a virtual private network, but what actually is it?)

Also, anyone got that link to check your memory type? And I just updated to the v7 beta, and for some reason it's changed all my fonts on OCN?


----------



## Sgt Bilko

Quote:


> Originally Posted by *$ilent*
> 
> whats a vpn (well I know its virtual private network) but what actually is it?


I don't know much, but you use a VPN and it changes your IP address to a country of your choosing. I've used it for Netflix (not available in Aus).


----------



## $ilent

So what if Origin/PB ban you?


----------



## Sgt Bilko

Quote:


> Originally Posted by *$ilent*
> 
> so what if origin/pb ban you


To be honest, unless you really, really can't wait a couple more days, go for it. Not sure what Origin would do; probably ban you for a set amount of time, though I've heard of some rare cases where it was a perma-ban.

BF4 will still be there in a couple of days, Not worth going for a VPN for it unless you are really desperate for the MP.


----------



## $ilent

Yeah, I'm just gonna wait. My PSU died this week, and I borrowed one just so I could play BF4 today. But seeing as it's not available now, I guess it doesn't really matter.

I sent the PSU to Seasonic yesterday; they should have it today. What do you reckon the chances are of them getting a replacement to me by Friday?


----------



## Sgt Bilko

I have no experience with Seasonic so I honestly couldn't say, but seeing as it's only early Tuesday there, I'd say your chances are pretty good.

Here's hoping you'll be flying through BF4 on the weekend with your 290X leading the charge









At least you get to play the MP; the way my net is, I won't even be able to log in to a server lol


----------



## $ilent

lol, thanks.

What's up with it?


----------



## fleetfeather

In this instance, a VPN is just a way of routing data through a different path. It's bloody difficult for EA to track your origin (bad pun) when you use a VPN. However, it's somewhat easy to tell you're using one, because lots of people will be pushing traffic through the same IP address / gateway, but that in itself is not against ToS.

I've used VPNs (such as Hotspot Shield's free trial, and Battleping) to buy geoblocked game deals before







It doesn't always work (some stores will check the country of origin your credit card / paypal account is linked to, at which point they cancel the order request), but it's always worth a shot








Quote:


> Originally Posted by *Sgt Bilko*
> 
> I have no experience with Seasonic so i honestly couldn't say but seeing as it's only early Tuesday there I'd say that your chances are pretty good.
> 
> here's hoping you will be flying through BF4 on the weekend with your 290x leading the charge
> 
> 
> 
> 
> 
> 
> 
> 
> 
> At least you get to play the MP, they way my net is i won't even be able to login to a server lol


you should go to Uni like me

http://www.speedtest.net/my-result/3064853760

I'm currently using a Cat5 Ethernet cable at 100Mb full duplex, so I'm limited to a theoretical 100Mbps up/down, but when I'm on Cat6 + 1Gbps full duplex, I can pull up to 450Mbps up/down
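For anyone wondering where those ceilings come from, it's just the link speed divided by eight. A quick sketch (it ignores Ethernet/TCP framing overhead, so real transfers land a bit under these numbers):

```python
def link_cap_mbytes_per_s(link_mbps: float) -> float:
    """Convert a raw link speed in megabits/s to megabytes/s (8 bits per byte)."""
    return link_mbps / 8

# 100Mb full duplex: at most 12.5 MB/s each way, simultaneously
print(link_cap_mbytes_per_s(100))   # 12.5

# 1Gb full duplex: 125 MB/s ceiling; pulling ~450Mbps (~56 MB/s) in practice
# means the upstream link is the limit, not the cable
print(link_cap_mbytes_per_s(1000))  # 125.0
```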


----------



## Sgt Bilko

Quote:


> Originally Posted by *fleetfeather*
> 
> In this instance, a VPN is just a way of routing data through a different path. It's bloody difficult for EA to track your origin (bad pun) when you use a VPN. However, it's somewhat easy to tell you're using one, because lots of people will be pushing traffic through the same IP address / gateway, but that in itself is not against ToS.
> 
> I've used VPN's (such as Hotspot Shield's free trial, and Battleping) to buy geoblocked game deals before
> 
> 
> 
> 
> 
> 
> 
> It doesn't always work (some stores will check the country of origin your credit card / paypal account is linked to, at which point they cancel the order request), but it's always worth a shot
> 
> 
> 
> 
> 
> 
> 
> 
> you should go to Uni like me
> 
> http://www.speedtest.net/my-result/3064853760
> 
> I'm currently using a Cat5 ethernet cable and 100mb full duplex, so I'm limited to a theoretical 100mbps U/D, but when I'm on a Cat6 + 1gbps full duplex, I can pull up to 450mbps U/D


Best explanation I've seen for one









And ha......I win


----------



## fleetfeather

Duuuuuuude, I can't even imagine... Not even my parents' wifi is that bad D:

Did someone say NBN?


----------



## Arizonian

Quote:


> Originally Posted by *overclockFrance*
> 
> I installed the EK copper waterblock on my HIS 290X yesterday and I tested the overclock :
> 
> - on air with 100 % fan stock volts , the card was able to run every benchmark at 1130/1650 Mhz (Elpdia memory modules). Games were stable at 1100/6400 Mhz.
> - under water :
> - stock volts : 1130/1650 Mhz : no overclocking improvement but what a silence !
> - 1410 mV : 1260/1650 Mhz stable on every benchmark but I have several remarks :
> 
> - I suspect the GPU tweak voltage inaccurate : I check realtime temperatures, voltages, frequencies with the Logitech G19 display and AIDA64 :at 1410 mV, the actual GPU voltage is around 1.28 - 1.29 V and it fluctuates, depending on the display scene. I think that AIDA voltages are accurate since GPU and VRM temperatures are not very high, 43°C and 62°C respectively on 3DMark Vantage.
> 
> - at 1260 Mhz and whatever the RAM frequency, the display is corrupted, not artifacted : the display fluctuates between normal and blurred. If I lower the GPU frequency, at 1230, I have the same problem. Tested on every benchmark and Crysis 3. The characters are blurry, unreadable, the drawings are in 640x480 mode. Then If I reboot the computer, the image is crisp again. I tested evey bench and Crysis 3 without overclocking, no problem. I migrated the driver from 13.11beta 6 to beta7 and no change. Since the GPU frequency change does not solve my problem, I assume that memory is responsible for or the memory controller. Would it be possible to increase memory voltage ? Otherwise, would you have an idea ?
> 
> I flashed my HIS card with ASUS bios without any problems which is strange since my card has Elpida modules whereas ASUS ones are said to have Hynix memory.
> 
> In terms of scores at 1260/6400 Mhz :
> 3DMark Vantage Extreme : 32416 (and 274 W consumed !)
> 3D MArk 2013 : 6373
> Unigine Heaven 4.0 2560x1600, everything maxed, AA 0X : 1155
> 3DMark 11 : X5514
> Crysis 3 2560x1600, everything maxed except AA 0X : 35-40 fps at the beginning of the game, in the rain. Same as my previous GTX690
> 
> 
> 
> Spoiler: Warning: Spoiler!


Nice. Welcome aboard with the first HIS on the list. Added


----------



## Sgt Bilko

Quote:


> Originally Posted by *fleetfeather*
> 
> duuuuuuude, i can't even imagine... Not even my parents wifi is that bad D:
> 
> Did someone say NBN?


Ahh, hate to break it to you, but this is the NBN









Just not the NBN you know. I'm on satellite, and thanks to the downgrades this ain't getting any faster till 2015..........yeah

Used to be 6/1 with a ping of 700 or less, perfectly fine for co-op PvE games.


----------



## EvilGnomes

So I'm pretty happy with my results on air:
15% PowerTune increase (could probably bump it down a bit)
1100MHz/1350MHz stable

So far loving this card... BF4 is kinda terrible with all the bugs and everything atm though, sadly.


----------



## lordzed83

What bugs?? Finished BF4 and had only one crash on a loading screen. Have not noticed a single bug.


----------



## fleetfeather

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ahh hate to break it to you, but this is NBN
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just not the NBN you know, I'm on Satelitte and thanks to the downgrades this ain't getting any faster till 2015..........yeah
> 
> Used to be 6/1 with a ping of 700 or less, perfectly fine for co-op PvE games.


Ahhh, I always wondered about Satellite performance... That's a total bummer man, Aus' broadband infrastructure is really shameful, especially when you consider Canada suffers from many of the same issues we have, yet their speeds/connectivity are substantially better than ours :/


----------



## $ilent

For anyone into folding, I'm putting up some PPD numbers here: http://www.overclock.net/t/1436884/r9-290x-f-h-ppd-results/0_40

Looks very promising!


----------



## fleetfeather

Quote:


> Originally Posted by *$ilent*
> 
> For anyone into folding im putting up some ppd numbers here -http://www.overclock.net/t/1436884/r9-290x-f-h-ppd-results/0_40
> 
> Looks very promising!


Given how well BF4 seems to have been optimized, I might dedicate one of my 290Xs to folding purposes







(can you fold on one gpu, and play games on the other?)


----------



## sugarhell

From hexus on fb
Quote:


> The graphics war is heating up, folks. In an effort to combat Nvidia's recent price cuts, we're hearing a new AMD driver is imminent. Performance improvements are thought to be the key enhancement, and though there are no specifics just yet, we wouldn't be surprised to see R9 290X frame rates jump by up to 10 per cent or so in certain games.


----------



## $ilent

Quote:


> Originally Posted by *fleetfeather*
> 
> given how well bf4 seems to have been optimized, i might dedicate one of my 290X's to folding purposes
> 
> 
> 
> 
> 
> 
> 
> (can you fold on one gpu, and play games on the other?)


That's what I have been doing for months: gamed on my GTX 570 (I was at 1080p) and folded on my GTX 670 24/7


----------



## szeged

290x's arrived last night







Fixing the lighting in my house, then pics inc so you don't have awful camera glare.

Also, I was looking at Amazon again to see if any more were in stock so I can test tri/quad.

saw this and had a little laugh


----------



## SpewBoy

Just ordered two HIS cards. Has anyone received cards that *haven't* come with BF4? I ask because the place I purchased them from didn't mention BF4 for this particular brand, and the picture (unlike the pictures for other brands) didn't show a box with BF4 on it, just the reference picture of the card itself. Just wondering if the stock around the stores at the moment is a mix of BF4 versions and regular versions, or whether they are all BF4 versions atm.

Still gotta order my EK blocks... one step at a time









And thanks for all the amazing info guys. Read every inch of this thread and it helped me make my mind up.


----------



## selk22

Well this is amazing I am so happy with my 290x tonight









In BF4 at 1100/1400 I'm getting 50-60+ FPS on 64-player maps with everything ultra, 4xMSAA, no post AA (I don't like the way it looks) and 120% res scale, and it's looking beautiful!!
1920x1200, 3930K @ 4.6, rig in my sig.

I can't wait for Mantle... I am hoping to see 150% res scale at my current frames when Mantle hits! One can only hope...

Also want to mention that at 200% res scale at 1920x1200 it plays at about 27-40 FPS, which I am impressed by
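Those resolution-scale numbers are easier to judge as raw pixel counts. A small sketch, assuming the scale multiplies each axis (which is how BF4's slider is commonly understood; treat that as an assumption, not something confirmed here):

```python
def scaled_pixels(width: int, height: int, scale_pct: int) -> int:
    """Pixels rendered per frame if the scale multiplies each axis."""
    s = scale_pct / 100
    return round(width * s) * round(height * s)

base = scaled_pixels(1920, 1200, 100)          # 2,304,000 px
print(scaled_pixels(1920, 1200, 120) / base)   # 1.44x the shading work
print(scaled_pixels(1920, 1200, 200) / base)   # 4.0x, roughly matching the
                                               # drop from ~60 FPS to ~30
```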


----------



## rdr09

Quote:


> Originally Posted by *selk22*
> 
> Well this is amazing I am so happy with my 290x tonight
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In BF4 at 1100/1400 and getting 50-60+ FPS on 64 player maps at ultra everything 4xMSAA no AA post (I don't like the way it looks) and 120% Res scale and its looking beautiful!!
> 1920x1200 3930k 4.6 Rig in my sig..
> 
> I cant wait for mantle.. I am hoping to see 150% res scale with my current frames when mantle hits! One can only hope..
> 
> Also want to mention at 200% res scale at 1920x1200 it plays about 27-40FPS which I am impressed by


With your comment, we might see a Titan at $650. I hope.


----------



## BusterOddo

Quote:


> Originally Posted by *moeqawama*
> 
> So, having issues with Battlefield 4... in fact it's the only game that has been underwhelming performance-wise since I got the 290X. At ultra settings, when I do a performance test, it tells me that my settings are bad and that I need to kick them down to get better FPS. I honestly have no idea what it can be causing this game to run crappy. It's definitely not my CPU (i7 3770k OC'd to 4.5ghz), not my RAM (G.Skill ripjaws-x 16 GB @2133 CL 9-9-10-27 1T). I have the Beta 7 installed. Any suggestions, guys? I thought this game of all the games would sing on my 290X. I might either roll back to Beta 6 or re-install Beta 7. Any suggestions/help are/is much appreiated


Although I only have a 7970, I got the low-performance suggestion from the in-game performance tester at all graphics settings I tried last night. I think it's not working properly yet.


----------



## $ilent

Quote:


> Originally Posted by *szeged*
> 
> 290x's arrived last night
> 
> 
> 
> 
> 
> 
> 
> fixing the lighting in my house then pics inc so you dont have awful camera glare.
> 
> also, was looking at amazon again to see if any more were in stock so i can test tri/quad also
> 
> saw this and had a little laugh


haha nice spot!


----------



## raghu78

Quote:


> Originally Posted by *szeged*
> 
> 290x's arrived last night
> 
> 
> 
> 
> 
> 
> 
> fixing the lighting in my house then pics inc so you dont have awful camera glare.


Are you watercooling? Would love to see some game benchmarks and personal feedback on in-game performance from you: Titan voltage OC vs R9 290X voltage OC.


----------



## szeged

Quote:


> Originally Posted by *raghu78*
> 
> are you watercooling ? would love to see some game benchmarks and personal feedback on in game performance from you on Titan voltage OC vs R9 290X voltage OC.


Gonna test them on air; will have EK blocks in soon. Performance-PCs isn't that far away from me.


----------



## kot0005

Can someone link the download for MemoryInfo please?


----------



## K1llrzzZ

I got my cards today







2 Sapphire R9 290X Battlefield 4 editions. My Corsair HX850 can handle them fine so far, but as I said a few days ago, I did not overclock them.


----------



## raghu78

Quote:


> Originally Posted by *szeged*
> 
> gonna test them on air, will have EK blocks in soon, performance pcs isnt that far away from me


More than benchmarks, tell us about in-game performance in the big titles like Crysis 3, BF4, Metro LL. Try to run max settings with 4x MSAA and 2x SSAA if possible.









Btw, did you get the BF4 edition or the normal one?


----------



## $ilent

Quote:


> Originally Posted by *kot0005*
> 
> can some 1 link the download link for MemoryInfo plz.


I would also like this link to check mine.


----------



## kot0005

Quote:


> Originally Posted by *$ilent*
> 
> I would also like this link to check mine.


Google doesn't work haha.


----------



## sugarhell

Quote:


> Originally Posted by *kot0005*
> 
> Google doesnt work haha.


http://www.overclock.net/t/1437170/290x-voltage-control-vdroop-mods-custom-bios-thread


----------



## szeged

Quote:


> Originally Posted by *raghu78*
> 
> more than benchmarks tell us about in game performance in the big titles like Crysis 3, BF4, Metro LL . try and run max settings with MSAA 4x and SSAA 2x if possible.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> btw did you get BF4 edition or normal


Yeah, I'll test them in games also.









The two I have currently are BF4 editions. Gonna grab another Sapphire BF4 from a user selling his, and gonna try to get one more BF4 edition from Amazon or something; then all my friends can get BF4 for free lol.


----------



## kot0005

Quote:


> Originally Posted by *sugarhell*
> 
> http://www.overclock.net/t/1437170/290x-voltage-control-vdroop-mods-custom-bios-thread


thanks mate, +rep


----------



## DampMonkey

Memory Info Tool:
http://www.overclock.net/attachments/17642


----------



## kot0005

Bam, mine is Elpida... omg qqq.

Really should have waited for the Asus ones...


----------



## sugarhell

Quote:


> Originally Posted by *kot0005*
> 
> Bam, mine is elpida...omg qqq


It's okay. From what I've seen, they generally use high-spec Elpida now, with the same OC headroom as Hynix.


----------



## raghu78

Quote:


> Originally Posted by *szeged*
> 
> yeah ill test them in games also
> 
> 
> 
> 
> 
> 
> 
> 
> 
> the two i have currently are bf4 editions, gonna grab another sapphire bf4 from a user selling his, and gonna try to get one more bf4 edition from amazon or something, then all my friends can get bf4 for free lol.


Whoa man, what are you going to do with 4 cards? Binning for a 1300MHz R9 290X?


----------



## DampMonkey

Quote:


> Originally Posted by *kot0005*
> 
> Bam, mine is elpida...omg qqq.
> 
> Really should have waited for Asus ones ..


If you flash to an Asus BIOS, it will use different timings and the memory manufacturer won't matter anymore.


----------



## szeged

Quote:


> Originally Posted by *raghu78*
> 
> more than benchmarks tell us about in game performance in the big titles like Crysis 3, BF4, Metro LL . try and run max settings with MSAA 4x and SSAA 2x if possible.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> btw did you get BF4 edition or normal


Quote:


> Originally Posted by *raghu78*
> 
> whoa man what are you going to do with 4 cards. binning for a 1300 mhz R9 290X


Bin them and use the highest one for the bench rig, the other three for folding.


----------



## battleaxe

Quote:


> Originally Posted by *szeged*
> 
> bin them and use the highest one for bench rig, other three for folding


LOL... you're ridiculous. That's awesome.


----------



## kot0005

Quote:


> Originally Posted by *DampMonkey*
> 
> If you flash to an Asus bios, it will use different timings and manufacturer wont matter anymore


We'll see lol. Getting my blocks tomorrow.


----------



## DampMonkey

Has anyone had luck with the PT3 BIOS? The Asus BIOS seemed more stable, but I think I'll give PT3 another try tonight. I wish there was another overclocking program to use; GPU Tweak doesn't seem to be controlling voltage very well. I went from 1.3V to 1.4V and my temps didn't change at all...


----------



## jomama22

Asus has Elpida as well as Hynix. At this point memory type seems moot.


----------



## GioV

For reference, the MSI R9 290X BF4 edition uses Elpida chips.


----------



## jomama22

Quote:


> Originally Posted by *GioV*
> 
> For reference, MSI R9 290X BF4 edition uses Elpidia chips.


Or Hynix, as others have reported. Every AIB has both, it seems.


----------



## Havolice

Well, after a few days of testing on stock,

I've got to say those stock coolers are damned awful.

They're spinning up and down in idle like crazy; I can literally hear them upstairs IN IDLE.

I'm really close to just plain reselling them -.- which is meh, because the cards themselves are good.
Do any of you have issues with the fans being as loud in idle as they are in-game or in benchmarks?

*PS: I have 2 in crossfire*


----------



## selk22

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Havolice*
> 
> Well, after a few days of testing on stock,
> 
> I've got to say those stock coolers are damned awful.
> 
> They're spinning up and down in idle like crazy; I can literally hear them upstairs IN IDLE.
> 
> I'm really close to just plain reselling them -.- which is meh, because the cards themselves are good.
> Do any of you have issues with the fans being as loud in idle as they are in-game or in benchmarks?
> 
> *PS: I have 2 in crossfire*






Did you plan to buy two reference cooled cards and not water cool them?


----------



## Arizonian

Quote:


> Originally Posted by *EvilGnomes*
> 
> So im pretty happy with my results on air.
> 15% powertune increase (could probably bump it down a bit)
> 1100mhz/1350mhz stable
> 
> so far loving this card...BF4 is kinda terrible with all the bugs and everything atm though sadly.


What bugs? I played 2.5 hrs last night, multiplayer and campaign and thought it was smooth, saw no glitches.

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/1540#post_21081473


----------



## Jared Pace

Quote:


> Originally Posted by *DampMonkey*
> 
> Has anyone had luck with the PT3 bios? The Asus bios seemed more stable, but I think ill give PT3 another try tonight. I wish there was another overclocking program to use, GPU Tweak doesnt seem to be controllling voltage very well. I went from 1.3V to 1.4V and my temps didn't change at all....


Which version of GPUTweak do you have?


----------



## battleaxe

Quote:


> Originally Posted by *Arizonian*
> 
> What bugs? I played 2.5 hrs last night, multiplayer and campaign and thought it was smooth, saw no glitches.
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/1540#post_21081473


I'm wondering if some of these installs need a complete wipe of the old drivers and then a reinstall. Some are having no trouble while others are seeing bugs and artifacts. Doesn't make sense.


----------



## Havolice

Quote:


> Originally Posted by *selk22*
> 
> 
> Did you plan to buy two reference cooled cards and not water cool them


Yes, I did, but it seems it's terrible to get the blocks here, and well, our warranties do not cover switching the stock cooler.

And well, people in the house are REALLY annoyed with the sound.


----------



## Arizonian

Quote:


> Originally Posted by *battleaxe*
> 
> I'm wondering if some of these installs need to do a complete wipe of the old drivers then reinstall. Some are having no trouble while others are seeing bugs and artifacts. Doesn't make sense.


I couldn't start BF3 originally, even after a manual clean of the registry and files, until I used the Nvidia uninstaller that's linked in the OP. After that all was great.

Only issue I've experienced is that after gaming the cards idle higher. I'm chalking that up to a driver issue, because my 690 did the same thing when it first released, until the next driver came out.


----------



## overclockFrance

It seems that mine does not overclock very high: as soon as I apply more than 1213 mV, whatever the GPU and RAM frequencies, the benchmarks get blurred. And when I return to the desktop, 2D is blurred too.

I flashed my HIS card with the ASUS then PT3 BIOSes, I tested on Windows 7 and 8, I changed the driver; same problem.

While the GPU can bench at 1260 MHz, in practice it is only usable up to 1180-1190 MHz. With 7970 cards the limit was similar; generally speaking it was difficult to go beyond 1180 MHz, as the benchmarks became unstable. The symptoms are different with 290X cards but the limit seems similar.

What is your max overclock under water? Did you hit a wall or run into problems?

I'm contemplating getting rid of it: under water, I wanted 1250 MHz fully stable at minimum.


----------



## Jared Pace

overclockFrance: Could be because your memory is different. Could also be a different scenario with an HIS OC BIOS and Afterburner with 290X voltage control. Or it could be the case that the 290X on water doesn't like going >1250MHz. I will wait a few weeks to see lots of results before deciding that is fact.


----------



## jomama22

If any of you are using GPU-Z... stop this instant! GPU-Z is a major cause of problems ATM.


----------



## battleaxe

Quote:


> Originally Posted by *Arizonian*
> 
> I couldn't spark BF3 originally even after a manual clean of registry of files until I used the Nvidia un-intaller that's linked on OP. After that all was great.
> 
> Only issue I've experienced is after gaming the cards idle higher. I'm chalking that up to driver issue because my 690 did the same thing when it first released until next driver came out.


So that supports my theory then. Seems some here need to find a way to get the old drivers completely off their PC's before making any final conclusions.


----------



## Arizonian

Quote:


> Originally Posted by *sugarhell*
> 
> http://www.overclock.net/t/1437170/290x-voltage-control-vdroop-mods-custom-bios-thread


Added to the 2nd post of info on the OP, if you guys want to refer members to that link when asked.

*By OCN member Jared Pace*









*290X Voltage Control, Vdroop Mods, Custom Bios Thread*

EDIT: If anyone sees any wrong info regarding your entry on the roster, please let me know.

Still missing links of proof from ConservingClips, youra6, jrcbandit, and Stay Puft. If you submitted, please PM me and I'll update.


----------



## jomama22

I'll have water #s for 6 cards come Saturday. Once tsm gets his up, we can see how far these can really be pushed. Looks like we're shooting for 1300+/1700 @ 1.35-1.4V. Have yet to see a stable 1300 core on anything but LN2.

At 1250 core you will be matching a 1300-1350 Titan for the most part. A 1300 core would be almost untouchable by just about all Titans.


----------



## DampMonkey

Quote:


> Originally Posted by *jomama22*
> 
> I'll have water #s for 6 cards come Saturday. Once tsm gets his up, we can see how far these can really be pushed. Looks like were shooting for 1300+/1700 @ 1.35-1.4v. Have yet to see a stable 1300 core on anything but ln2.
> 
> @ 1250 core, you will be matching a 1300-1350 titan for the most part. 1300 core would almost be untouchable by just about all titans.


What do you use for voltage tweaking? The PT3 rom and the Hawaii GPU Tweak?


----------



## tsm106

Quote:


> Originally Posted by *overclockFrance*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I installed the EK copper waterblock on my HIS 290X yesterday and I tested the overclock :
> 
> - on air with 100 % fan stock volts , the card was able to run every benchmark at 1130/1650 Mhz (Elpdia memory modules). Games were stable at 1100/6400 Mhz.
> - under water :
> - stock volts : 1130/1650 Mhz : no overclocking improvement but what a silence !
> - 1410 mV : 1260/1650 Mhz stable on every benchmark but I have several remarks :
> 
> - I suspect the GPU tweak voltage inaccurate : I check realtime temperatures, voltages, frequencies with the Logitech G19 display and AIDA64 :at 1410 mV, the actual GPU voltage is around 1.28 - 1.29 V and it fluctuates, depending on the display scene. I think that AIDA voltages are accurate since GPU and VRM temperatures are not very high, 43°C and 62°C respectively on 3DMark Vantage.
> 
> - at 1260 Mhz and whatever the RAM frequency, the display is corrupted, not artifacted : the display fluctuates between normal and blurred. If I lower the GPU frequency, at 1230, I have the same problem. Tested on every benchmark and Crysis 3. The characters are blurry, unreadable, the drawings are in 640x480 mode. Then If I reboot the computer, the image is crisp again. I tested evey bench and Crysis 3 without overclocking, no problem. I migrated the driver from 13.11beta 6 to beta7 and no change. Since the GPU frequency change does not solve my problem,
> 
> 
> I assume that memory is responsible for or the memory controller. Would it be possible to increase memory voltage ? Otherwise, would you have an idea ?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I flashed my HIS card with ASUS bios without any problems which is strange since my card has Elpida modules whereas ASUS ones are said to have Hynix memory.
> 
> In terms of scores at 1260/6400 Mhz :
> 3DMark Vantage Extreme : 32416 (and 274 W consumed !)
> 3D MArk 2013 : 6373
> Unigine Heaven 4.0 2560x1600, everything maxed, AA 0X : 1155
> 3DMark 11 : X5514
> Crysis 3 2560x1600, everything maxed except AA 0X : 35-40 fps at the beginning of the game, in the rain. Same as my previous GTX690


There is no memory voltage adjustment yet, so I wouldn't push memory too high. I would assume an updated Afterburner will bring us memory voltage control. Fingers crossed.


----------



## K1llrzzZ

Seems like I have a problem... I tested the Heaven bench, Crysis 3, 3DMark 11, and Hitman Absolution, and everything was fine until I started a Metro Last Light test... screen froze, I restarted, and BSOD... Could it be a driver error, or is the PSU not enough? Or something else? This PSU is 3 years old; I bought it around the summer of 2010.


----------



## Taint3dBulge

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Arctic Cooling replied to my e-mail (sent 3 days ago) about what products they recommend for the 290x.
> 
> Already been posted in this thread but they recommend the Accelero Xtreme III - http://www.arctic.ac/en/p/cooling/vga/554/accelero-xtreme-iii.html?c=2182 and the Accelero Hybrid - http://www.arctic.ac/en/p/cooling/vga/569/accelero-hybrid.html?c=2182
> 
> Hope that helps someone out.


The Accelero Xtreme III will work, but one thing I noticed: it only has 12 memory module coolers. The 290X has 16 modules, so you will need to order 4 more; might as well order some good copper ones. I cannot figure out what the height is on the heatsinks that come with the AXIII. These are 14x14x9:
http://www.newegg.com/Product/Product.aspx?Item=N82E16835708009

Not quite sure about VRM coolers... those are the main ones I would want in copper to get the best cooling, but I cannot find any aftermarket ones.


----------



## Havolice

Quote:


> Originally Posted by *K1llrzzZ*
> 
> Seems like i have a problem...I tested heaven bench,crysis 3,3D mark 11,hitman absolution,everything was fine until i started a metro last light test...screen froze,i restarted,and BSOD...Could be a driver error,or the PSU is not enough?Or something else?This PSU is 3 years old i bought it around the summer of 2010.


I use a 1-year-old Corsair AX860i atm; on crossfire at stock it did Metro just fine, as a reference point for you.


----------



## Arizonian

Quote:


> Originally Posted by *Havolice*
> 
> i use a 1 year old corsair AX860i atm on crossfire stock it did metro just fine as a reference for you


Quote:


> Originally Posted by *K1llrzzZ*
> 
> Seems like i have a problem...I tested heaven bench,crysis 3,3D mark 11,hitman absolution,everything was fine until i started a metro last light test...screen froze,i restarted,and BSOD...Could be a driver error,or the PSU is not enough?Or something else?This PSU is 3 years old i bought it around the summer of 2010.


Just so everyone knows, you're using an 850 watt PSU.

I was under the assumption it's 75 watts from the PCI-E slot, plus 75 watts per 6-pin and 150 watts per 8-pin.

So unless the 290X has two 8-pins, it shouldn't break 300 watts. This of course assumes we're not adding voltage.

For gamers who don't need to overclock and just want overkill FPS, Xfire should be fine theoretically.

But now I'm in a conundrum. I was originally thinking to crossfire, but now I am heavily leaning towards waiting for the Asus Matrix and just overclocking the hell out of it, since FPS is good in all the games I play anyway. I'm really just after no dips under 60 FPS.
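The wattage reasoning above can be written out as a quick budget check. The per-connector limits are the PCI-E spec figures quoted in the post, and the one-6-pin-plus-one-8-pin layout is the reference 290X's; overvolted cards can and do exceed these in-spec numbers:

```python
# PCI-E power limits in watts: slot, 6-pin connector, 8-pin connector
SLOT, SIX_PIN, EIGHT_PIN = 75, 75, 150

def board_power_limit(six_pins: int, eight_pins: int) -> int:
    """Max in-spec power draw for a card with the given connector mix."""
    return SLOT + six_pins * SIX_PIN + eight_pins * EIGHT_PIN

print(board_power_limit(1, 1))      # 300 -> one reference 290X at stock
print(2 * board_power_limit(1, 1))  # 600 -> two in CrossFire, leaving ~250W
                                    # of an 850W unit for CPU and the rest
```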


----------



## sugarhell

PCI-E can give more than 75 watts.


----------



## K1llrzzZ

Some info on the BSOD; this might help to find out what the problem was: "- EventData

BugcheckCode 0
BugcheckParameter1 0x0
BugcheckParameter2 0x0
BugcheckParameter3 0x0
BugcheckParameter4 0x0
SleepInProgress false
PowerButtonTimestamp 0
"
"The system has rebooted without first shutting down properly. This error could be caused if the system stopped responding, crashed, or lost power unexpectedly." Not sure if this is accurate; I'm from Hungary, so the message was in Hungarian and I just used Google Translate.


----------



## Jared Pace

Anyone else able to confirm AIDA64 shows 290X VRM temperatures, as reported by overclockFrance?


----------



## DampMonkey

Quote:


> Originally Posted by *Jared Pace*
> 
> anyone else confirm aida64 shows 290x VRM temperatures as reported by overclockFrance?


I will confirm this tonight if nobody gets to it before I can. When benchmarking at the highest clocks I could get, I felt the back of the VRMs to get a rough idea of their temps. They were definitely pretty warm, but not what I would call hot. Nowhere near as hot as my motherboard VRMs; that heatsink will take skin off.


----------



## skupples

Quote:


> Originally Posted by *Arm3nian*
> 
> Hmm so much rad.
> 
> Think I need more than three sr1 480s? Probably running 1000-1400rpm fans. Going to be aiming for highest overclock possible on a 4930k and two 290x.
> 
> Now that I think of it, I'm going to be running 2 pumps and watercooling the mobo chipset and mofsets and watercooling quad channel ram hmm...good thing my case supports 14x 480 rads internally w/o any modding.


I would think 3 480s would be good for all that... What case supports 14x 480s? A double-ped STH10?


----------



## maarten12100

Quote:


> Originally Posted by *skupples*
> 
> I would think 3 480's Would be good for all that... What case supports 14x 480s? Double ped sth10?


Rule of thumb is 150W per 120x120 radiator section (for a 20°C delta over ambient); how thick the rad is doesn't matter much, so it's best to go with thin rads and decent fans.
3x 480 or equivalent is overkill and, with decent blocks and paste, would yield something like 5°C over ambient.
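The rule of thumb above as a one-liner. The 150W-per-120mm-section figure comes from the post; the component wattages in the example are rough hypothetical numbers, not measurements:

```python
import math

def rad_sections_needed(total_watts: float, watts_per_120mm: float = 150) -> int:
    """120mm radiator sections needed for roughly a 20C delta over ambient."""
    return math.ceil(total_watts / watts_per_120mm)

# hypothetical loop: heavily overclocked 4930K (~250W), two overvolted
# 290Xs (~350W each), chipset/VRM/RAM blocks (~50W)
load = 250 + 2 * 350 + 50           # 1000W
print(rad_sections_needed(load))    # 7 -> two 480s (8 sections) would cover it
```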


----------



## K1llrzzZ

Ok, now my PC restarted randomly in Windows. I was benching 3DMark before without a problem, and it restarts AFTER that, in Windows? That makes no sense... Maybe it was a BSOD again but it just wasn't displayed. Under Metro I could understand it, but in Windows... that can't be because of the PSU. Or am I wrong?


----------



## Arm3nian

Quote:


> Originally Posted by *skupples*
> 
> I would think 3 480's Would be good for all that... What case supports 14x 480s? Double ped sth10?


TH10 with pedestal and top cover. 2 in pedestal, 2 on top cover, 6 on the chassis, 4 on the doors.

I still think 3x 480 might be pushing it.


----------



## Forceman

Quote:


> Originally Posted by *provost*
> 
> Which 290x is it? Can you post a pic


My card that died was a Sapphire BF4. I took a pic of the box for the owners club, but I didn't take any pics of the card itself. No apparent damage I could see though.
Quote:


> Originally Posted by *K1llrzzZ*
> 
> Seems like i have a problem...I tested heaven bench,crysis 3,3D mark 11,hitman absolution,everything was fine until i started a metro last light test...screen froze,i restarted,and BSOD...Could be a driver error,or the PSU is not enough?Or something else?This PSU is 3 years old i bought it around the summer of 2010.


Hmm. I was running Metro LL bench when my card died also. Locked up in Metro, then a couple of hard reboots (no BSOD, just black and then BIOS screen), and then it wouldn't even boot. I thought power initially, but I have single card on a year old HX750 so I can't imagine that's a problem. Put my GTX 680 back in and everything is working fine. Which card?


----------



## Mr357

Got mine today











Once my block arrives later this week I will have this baby under water.


----------



## K1llrzzZ

Quote:


> Originally Posted by *Forceman*
> 
> My card that died was a Sapphire BF4. I took a pic of the box for the owners club, but I didn't take any pics of the card itself. No apparent damage I could see though.
> Hmm. I was running Metro LL bench when my card died also. Locked up in Metro, then a couple of hard reboots (no BSOD, just black and then BIOS screen), and then it wouldn't even boot. I thought power initially, but I have single card on a year old HX750 so I can't imagine that's a problem. Put my GTX 680 back in and everything is working fine. Which card?


Sapphire R9 290X both cards.


----------



## Forceman

Quote:


> Originally Posted by *K1llrzzZ*
> 
> Sapphire R9 290X both cards.


Have you tried each card individually to see if you can identify the dodgy card?


----------



## utnorris

Quote:


> Originally Posted by *szeged*
> 
> gonna test them on air, will have EK blocks in soon, performance pcs isnt that far away from me


Just an FYI, even though they say they have them on their site, they do not have them. I was told they just cleared customs today, so make sure you call before driving there. I was pissed when they told me, plus they will not upgrade the shipping to get them to me sooner than Saturday assuming they get them today.


----------



## utnorris

Quote:


> Originally Posted by *Arizonian*
> 
> Just so everyone knows your using an 850 watt PSU.
> 
> I was under the assumption 75 watts for the PCI-E plus 75 watts per 6 pin and 150 watts for the 8 pin/pins
> 
> So unless the 290x has two 8 pin it shouldnt break 300 watts. This of course is taken into consideration that we're not adding voltage
> 
> For gamers we don't need to overclock and just want overkill on FPS in Xfire it should be fine theoretically.
> 
> But now I'm in a conundrum. I was originally thinking to crossfire but now I am heavily leaning to waiting for the Asus matrix and just over-clock the hell out of it since FPS is good in all the games I play anyway. I'm just seeking no low dips under 60 FPS really.


I am going to say it again: under benchmarking I have yet to go over 400 watts from the wall according to my UPS, so I still think a solid 850 watt PSU will be fine for CF.
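The spec figures quoted above can be totted up in a few lines. A sketch assuming the 75W slot / 75W 6-pin / 150W 8-pin limits; note these are spec ceilings, and actual draw can exceed them once you add voltage:

```python
# PCI-E spec power limits quoted earlier in the thread: 75 W from the
# slot, 75 W per 6-pin connector, 150 W per 8-pin connector. These are
# spec ceilings, not measurements; overvolted cards routinely pull more.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def spec_budget(six_pins: int, eight_pins: int) -> int:
    """Spec-rated power available to one card."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

# A reference 290X has one 6-pin and one 8-pin.
per_card = spec_budget(1, 1)
print(per_card)        # 300 W by spec
print(2 * per_card)    # 600 W for CrossFire, within an 850 W PSU at stock
```

So at stock the spec math and the ~400W wall readings both point the same way for an 850W unit; heavy overvolting is where the margin shrinks.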


----------



## Paul17041993

Quote:


> Originally Posted by *K1llrzzZ*
> 
> Some info on the BSOD,this might help to find out what was the problem:"- EventData
> 
> BugcheckCode 0
> BugcheckParameter1 0x0
> BugcheckParameter2 0x0
> BugcheckParameter3 0x0
> BugcheckParameter4 0x0
> SleepInProgress false
> PowerButtonTimestamp 0
> "
> "The system has rebooted without first shutting down properly. This error can occur if the system stopped responding, crashed, or lost power unexpectedly." Not sure if this is accurate; I'm from Hungary, so the message was in Hungarian and I just used Google Translate.


Null bugcheck...? Are you sure you grabbed the right event?


----------



## K1llrzzZ

Quote:


> Originally Posted by *Forceman*
> 
> Have you tried each card individually to see if you can identify the dodgy card?


I haven't started Metro since that BSOD; when Catalyst didn't see my second card it scared the ***** out of me. Maybe I'll disable CrossFire and try to run Metro again.
@Paul17041993 Pretty sure, the time was correct.


----------



## Arizonian

Quote:


> Originally Posted by *Mr357*
> 
> Got mine today
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Once my block arrives later this week I will have this baby under water.


Congrats - added. When blocks arrive, post it and I'll update list









Quote:


> Originally Posted by *utnorris*
> 
> I am going to say it again, under benchmarking I have yet to go over 400watts from the wall according to my UPS, so still think a solid 850watt PSU will be fine for CF.


Fair enough. It's not that I don't believe it, I'm just questioning from what I thought and what's being said. Guess we'll find out once K1llrzzZ gets his issue solved.

On a side note - did you guys see this....3DMark11 Performance bench.

*Smoke Smashes 3DMark 11 Record With Quad R9 290X CrossFireX* - Oct 28, 2013
Quote:


> With a new Rampage IV Black Edition pushing the Intel Core i7 4960X to 6GHz and DDR3 to 2,666MHz, while packing four new R9 290X graphics cards in CrossFireX at 1,435MHz core, 1650MHz (6,600MHz effective) memory, Russian overclocker Smoke smashed the 3DMark 11 Performance record by nearly 2,000 points to *41,531*


----------



## K1llrzzZ

Well, I disabled CrossFire and ran Metro again... It finished, but as I recall it was right around the end when it crashed last time, so I wouldn't draw any conclusions from this one test.


----------



## DampMonkey

Quote:


> Originally Posted by *Arizonian*
> 
> On a side note - did you guys see this....3DMark11 Performance bench.
> 
> *Smoke Smashes 3DMark 11 Record With Quad R9 290X CrossFireX* - Oct 28, 2013


Yea, this was posted a few days back. The rebuttal from the green team was "It was posted on HWbot, it doesnt count"


----------



## Mr357

Quote:


> Originally Posted by *DampMonkey*
> 
> Yea, this was posted a few days back. The rebuttal from the green team was "It was posted on HWbot, it doesnt count"


That's just immature. If not HWbot, then where?


----------



## DampMonkey

Quote:


> Originally Posted by *Mr357*
> 
> That's just immature. If not HWbot, then where?


I think it's just because it's not as official as if it were posted in the 3DMark Hall of Fame maintained by Futuremark.


----------



## szeged

Quote:


> Originally Posted by *DampMonkey*
> 
> Yea, this was posted a few days back. The rebuttal from the green team was "It was posted on HWbot, it doesnt count"


I don't think they were trying to say it didn't count; they were just saying it was run with tessellation off, which HWbot rules allow. Those rules are different from most, so comparing it on the 3DMark site to the HWbot site would have been a bad comparison.


----------



## provost

Quote:


> Originally Posted by *DampMonkey*
> 
> I think its just because its not as official as if it were to be posted in the 3dmark hall of fame, maintained by futuremark.


I think you are right; it's not allowed on the official Futuremark site because tessellation was off.
http://www.3dmark.com/hall-of-fame-2/3dmark+11+3dmark+score+performance+preset/


----------



## DampMonkey

Quote:


> Originally Posted by *szeged*
> 
> i dont think they were trying to say it didnt count, they were just saying it was with tesselation off, which is the HWbot rules, which are different from most, so to compare it on the 3dmark site to hwbot site would have been a bad comparison.


Wouldn't that mean everyone else had tessellation off too?

http://hwbot.org/benchmark/3dmark11_-_performance/


----------



## Koniakki

Guys, I remember someone here ordered an AC Xtreme III for their 290X. Has anyone received/installed it yet?


----------



## provost

Quote:


> Originally Posted by *DampMonkey*
> 
> Wouldn't that mean everyone else had tesselation off too?
> 
> http://hwbot.org/benchmark/3dmark11_-_performance/


Apparently you can't turn tessellation off with Nvidia cards, but if there is a way, it would be great to find out how, for giggles.


----------



## DampMonkey

Quote:


> With a new Rampage IV Black Edition pushing the Intel Core i7 4960X to 6GHz and DDR3 to 2,666MHz, while packing four new R9 290X graphics cards in CrossFireX at 1,435MHz core, 1650MHz (6,600MHz effective) memory, Russian overclocker Smoke smashed the 3DMark 11 Performance record by nearly 2,000 points to 41,531


I suddenly don't feel as bad about my jerry-rigged 1228/1568 (6276) overclock


----------



## Arizonian

Quote:


> Originally Posted by *K1llrzzZ*
> 
> Well I disabled Crossfire and run metro again...It finished,but as I see it was around the end last time,so I wouldn't draw any conclusions from this one test.


Keep us posted because I'm thinking that it's possible without overclocking but there are two sides of this debate. Have you tried running both cards one at a time to see if one is faulty?

Quote:


> Originally Posted by *DampMonkey*
> 
> Yea, this was posted a few days back. The rebuttal from the green team was "It was posted on HWbot, it doesnt count"










Ah....


----------



## DampMonkey

Quote:


> Originally Posted by *provost*
> 
> Apparently you can't tess off with Nvidia cards, but if there is a way, it would be great to find out how for giggles


When this benchmark was first brought up, everyone was claiming the better score was because CrossFire scaled better with 4 cards than SLI; I don't remember hearing that he had tessellation off.


----------



## kcuestag

Anyone with the 290X under water? If so, does it still run hot (Compared to other cards) or does it run cool like the rest?









I'm getting my block on Thursday most likely, can't wait.


----------



## provost

Quote:


> Originally Posted by *DampMonkey*
> 
> When this benchmark was first brought up, everyone was claiming the better score because crossfire scaled better with 4 cards then SLI, i dont remember hearing that he had tesselation off


I don't care; that guy won fair and square by HWbot rules.









But if it does not show up on Futuremark, it's usually because it has been disallowed.


----------



## anubis1127

Quote:


> Originally Posted by *kcuestag*
> 
> Anyone with the 290X under water? If so, does it still run hot (Compared to other cards) or does it run cool like the rest?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm getting my block on Thursday most likely, can't wait.


A few guys do. Temps are pretty much in line with what you would expect for the TDP of the card, 45-60°C depending on the size of your radiators.


----------



## skupples

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I don't know much but you use a VPN and it changes your IP address to a country of your choosing, I've used it for Netflix (not avaliable in Aus).


Man, the more I hear, the more it sounds like Aus has an all-out assault against the interwebs.

Netflix is currently under fire stateside too, just by the ISPs, not the government (yet). Comcast, Verizon, AT&T and the rest want to start charging the end user for using Netflix.

Xfire scaling > SLI scaling, as per usual. Hell, Nvidia has even dropped driver support for quad-SLI, which makes me sad.


----------



## solitario07777

Quote:


> Originally Posted by *kcuestag*
> 
> Anyone with the 290X under water? If so, does it still run hot (Compared to other cards) or does it run cool like the rest?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm getting my block on Thursday most likely, can't wait.


http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/1480


----------



## K1llrzzZ

Quote:


> Originally Posted by *Arizonian*
> 
> Keep us posted because I'm thinking that it's possible without overclocking but there are two sides of this debate. Have you tried running both cards one at a time to see if one is faulty?
> 
> 
> 
> 
> 
> 
> 
> Ah....


Not yet. Today I'll keep testing to see if the problem occurs again; if it does, then tomorrow I'll do a Windows reinstall and try both cards without CF.


----------



## DampMonkey

Quote:


> Originally Posted by *provost*
> 
> I don't care that guy won fair and square by Hwbot rules
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But, if does not show up on Futuremark, its usually because it has been disallowed


I get what you're saying now. The average scores on HWbot are much higher than in Futuremark's Hall of Fame; there must be tweaks going on behind the scenes that Futuremark would throw out.


----------



## anubis1127

Well AMD hasn't even released a WHQL driver for the r9 290X yet have they? That automatically hinders it from getting on the Futuremark HOF.


----------



## FtW 420

Quote:


> Originally Posted by *DampMonkey*
> 
> I get what you're saying now. The average scores on HWbot are much higher than on Futuremarks hall of fame, things must be going on behind the scenes that futuremark can throw out


Different rules: at Futuremark it has to be an approved driver and they disallow tessellation-tweaked scores, while at HWbot any driver is OK (as long as the driver itself isn't modified) and tessellation/LOD-tweaked scores are allowed.

I generally trust the scores at HWbot more. In the Futuremark Hall of Fame, bugged scores and misread multi-GPU systems can stay in the list for quite a while, whereas bugged high scores generally get caught pretty quickly at the bot.


----------



## DampMonkey

Quote:


> Originally Posted by *anubis1127*
> 
> Well AMD hasn't even released a WHQL driver for the r9 290X yet have they? That automatically hinders it from getting on the Futuremark HOF.


All of my benches have said "Graphics driver not approved"


----------



## sugarhell

Yeah, Futuremark only validates WHQL drivers.


----------



## FtW 420

Quote:


> Originally Posted by *DampMonkey*
> 
> All of my benches have said "Graphics driver not approved"


They don't have any approved driver for the 290X yet; Catalyst 13.9 is the last approved for AMD, and that's for the 7000 series.
Quote:


> Originally Posted by *sugarhell*
> 
> Yeah futuremark only valid whql drivers


They take some time for new ones as well; 331.69 is the latest WHQL, but 331.58 is the latest approved for Nvidia.

Futuremark's approved driver info is here: http://www.futuremark.com/support/approved-drivers


----------



## vettefan8

Quote:


> Originally Posted by *Koniakki*
> 
> Guys I remember some or someone here ordered an AC Xtreme III for their 290X. So anyone received/installed it yet?


Yea I ordered one, it's supposed to be here tomorrow. Anything you want me to test or benchmark on the stock cooler vs new cooler? Looks like we might be short 4 heat sinks for the memory though. I'll have to wait until it gets here tomorrow. A drive to Microcenter might be necessary for me to get everything going.
Quote:


> Originally Posted by *Taint3dBulge*
> 
> The Accelero Xtreme III will work, but one thing i noticed. It only has 12 memory module coolers.. The 290x has 16 modules. So that means you will need to order 4 more, might as well order some good copper ones. Cannot figure out what the hight is on the coolers that come with the AXIII. These are 14*14*9
> http://www.newegg.com/Product/Product.aspx?Item=N82E16835708009
> 
> Not quite sure about VRM coolers... Those are the main ones i would want to have copper to get the best cooling... But cannot find any aftermarket ones..


----------



## Forceman

Quote:


> Originally Posted by *vettefan8*
> 
> Yea I ordered one, it's supposed to be here tomorrow. Anything you want me to test or benchmark on the stock cooler vs new cooler? Looks like we might be short 4 heat sinks for the memory though. I'll have to wait until it gets here tomorrow. A drive to Microcenter might be necessary for me to get everything going.


I doubt you really need to sink the memory chips - GDDR5 doesn't get all that hot, and with the fans blowing straight down on the chips I don't think you'll have any problems. I suspect, but have no way of proving, that actively cooled non-sinked memory chips are probably cooler than reference cooled memory chips anyway - those thermal pillows they use can't be all that effective and they just sink to a big block of uncooled metal anyway.


----------



## tsm106

Quote:


> Originally Posted by *vettefan8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Koniakki*
> 
> Guys I remember some or someone here ordered an AC Xtreme III for their 290X. So anyone received/installed it yet?
> 
> 
> 
> Yea I ordered one, it's supposed to be here tomorrow. Anything you want me to test or benchmark on the stock cooler vs new cooler? Looks like we might be short 4 heat sinks for the memory though. I'll have to wait until it gets here tomorrow. A drive to Microcenter might be necessary for me to get everything going.

Make sure the memory is cooled properly. These ICs get damn hot to the touch. I personally don't like the thermal tape because it gets gummy when heated up, falls off, etc.


----------



## jerrolds

My Sapphire 290X finally got delivered to my parents' house. I'll be running on air tonight, and there's about a 50/50 shot that I'll be grabbing an Accelero Hybrid. I'll put up benchmarks for air vs. Hybrid vs. Matrix 7970 if that happens.


----------



## Dominican

How do you switch between Uber Mode and Quiet Mode?


----------



## Forceman

Quote:


> Originally Posted by *Dominican*
> 
> how do you switch between Uber mode and Quiet Mode ?


Flip the little switch on top. Toward the power connectors is Uber mode.


----------



## Arizonian

Quote:


> Originally Posted by *Dominican*
> 
> how do you switch between Uber mode and Quiet Mode ?


On OP:

Quiet Mode - Switch is in position closest to where you plug in your displays. Fan is at 40% cap.

Uber Mode - Switch is in position furthest away to where you plug in your displays. Fan is at 55% cap.


----------



## jerrolds

Does that switch also change the BIOS? I'll probably be flashing the Sapphire to the Asus BIOS at some point. Is the general consensus that the Asus BIOS is more stable, even at stock?


----------



## skupples

When was the last time AMD dropped a WHQL? Was the legendary frame pacing driver even WHQL?

(not a flame post, in case anyone gets that impression)


----------



## DampMonkey

Quote:


> Originally Posted by *jerrolds*
> 
> Does that switch also change the BIOS? I'll probably be flashing the Sapphire to ASUS bios at some point. General consensus is that the Asus bios is more stable, even at stock?


At stock, I don't *think* you will see a difference between the Asus and Sapphire BIOSes. If you are doing any kind of overclocking, though, you will want the Asus BIOS for voltage tweaking.


----------



## rdr09

Quote:


> Originally Posted by *skupples*
> 
> When was the last time AMD dropped a WHQL? Was the legendary frame pacing driver even WHQL?
> 
> (not a flame post, in case anyone get's that impression)


Dude, not flaming, but there is nothing more legendary than the three-eighteen-point-twenty.









or is it

320.18?


----------



## Arizonian

Quote:


> Originally Posted by *skupples*
> 
> When was the last time AMD dropped a WHQL? Was the legendary frame pacing driver even WHQL?
> 
> (not a flame post, in case anyone get's that impression)


13.9 WHQL - 9/18/2013

I'm not positive but it was right about the beginning of the 13.11 beta drivers where frame pacing started to be addressed. _TSM correct me if I'm wrong._

I was on the green side when that was going down.


----------



## Arm3nian

Quote:


> Originally Posted by *rdr09*
> 
> Dude, not flaming but there is nothing more legendary that the three-eighteen-point-twenty.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> or is it
> 
> 320.18?


320.x didn't even deserve to be labeled as a driver, lol.

The betas were even worse; I installed one and couldn't even boot into Windows, I had to format my drive.


----------



## tsm106

13.9 is WHQL. The frame pacing drivers are betas, i.e. not WHQL yet.


----------



## vettefan8

Quote:


> Originally Posted by *tsm106*
> 
> Make sure the memory is cooled properly. These ic get damn hot to the touch. I personally don't like the thermal tape cuz it gets gummy when heated up, fall off, etc.


I've never had to install an aftermarket cooler on a graphics card before. Would you recommend thermal adhesive over the tape? Will I be able to remove the heatsinks later on if I use thermal adhesive? If something fails on my card and I have to reinstall the factory cooler for warranty purposes I'd like to be able to get the heatsinks off.


----------



## sugarhell

We have 13.10 WHQL.

Also found this:

http://www.guru3d.com/articles_pages/radeon_r9_290x_review_benchmarks,12.html thermal review


----------



## tsm106

They copy the French now?

Quote:


> Originally Posted by *vettefan8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Make sure the memory is cooled properly. These ic get damn hot to the touch. I personally don't like the thermal tape cuz it gets gummy when heated up, fall off, etc.
> 
> 
> 
> I've never had to install an aftermarket cooler on a graphics card before. Would you recommend thermal adhesive over the tape? Will I be able to remove the heatsinks later on if I use thermal adhesive? If something fails on my card and I have to reinstall the factory cooler for warranty purposes I'd like to be able to get the heatsinks off.

When I was using a GPU-only block on my 290X, the heatsinks still got hot, to the point where a fan is mandatory. With temps like that, the glue in the tape starts to get gummy. Hmm, I'm not sure which is the lesser of two evils, tape or glue...


----------



## Arizonian

AMD has added a DMA (Direct Memory Access) engine to the AMD CrossFire compositing block, so the cards now have dual DMA engines. This is only present in the R9 290X and R9 290. CrossFire has been simplified in the new series, and that's what got frame pacing working. Also on the OP.









Hooray for a working sideport. It's one of the reasons I decided to try AMD and switched to the Red team.


----------



## tsm106

Quote:


> Originally Posted by *Arizonian*
> 
> Since AMD has added DMA engine (Direct Memory Access) in the AMD CrossFire compositing block. The video cards now have dual DMA engines. Only present in the R9 290X and R9 290. CrossFire has been simplified in the new series, and able to get frame pacing working. Also on OP.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hooray for working sideport. One of the reasons why I decide to try AMD and switched to the Red team.


Cards can directly communicate with each other ftw.


----------



## Darlinangel

Good cards and good overclocking so far Subbed

"edit" looking to see them overclocked under water and how they perform.


----------



## jomama22

Flying back to Denver tonight, grabbing the 6 290xs from my friend holding my packages for me tomorrow morning. Will have 6 290x air tests tomorrow night. Max 1.3v.

Water tests will happen Thursday/Friday night. Max 1.42v.

I'll be using the pt03 bios for all testing that allows voltage control. Will also use stock bios to give an idea of what out-of-the-box performance to expect on air/water.

I'm hoping for at least 1 to hit 1300+ on water @ 1.4v or lower. If I can get 3 out of 6....mmmmmm delicious.


----------



## sugarhell

I hate text walls. Pls dont


----------



## Jared Pace

Quote:


> Originally Posted by *jomama22*
> 
> Will have 6 290x air tests tomorrow night. Max 1.3v.
> 
> Water tests will happen Thursday/Friday night. Max 1.42v.
> 
> I'll be using the pt03 bios for all testing that allows voltage control. Will also use stock bios to give an idea of what out-of-the-box performance to expect on air/water.
> 
> I'm hoping for at least 1 to hit 1300+ on water @ 1.4v or lower. If I can get 3 out of 6....mmmmmm delicious.












The best frame of reference for 290X OC on water shall be from you.









Just wondering - How in the world do you swap 6 waterblocks from a loop? Sounds time consuming.


----------



## jerrolds

Quote:


> Originally Posted by *jomama22*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Flying back to Denver tonight, grabbing the 6 290xs from my friend holding my packages for me tomorrow morning. Will have 6 290x air tests tomorrow night. Max 1.3v.
> 
> Water tests will happen Thursday/Friday night. Max 1.42v.
> 
> I'll be using the pt03 bios for all testing that allows voltage control. Will also use stock bios to give an idea of what out-of-the-box performance to expect on air/water.
> 
> I'm hoping for at least 1 to hit 1300+ on water @ 1.4v or lower. If I can get 3 out of 6....mmmmmm delicious.


6 290X's?! Are you gonna test and keep the best 2 or something?


----------



## K1llrzzZ

I took a deep breath and ran Metro Last Light again with 2 cards; no BSOD this time.


----------



## DampMonkey

Quote:


> Originally Posted by *Jared Pace*
> 
> Just wondering - How in the world do you swap 6 waterblocks from a loop? Sounds time consuming.


The person with 50 processors in their avatar talking about something being time consuming









What were all those for anyway?


----------



## Raxus

Add me please


----------



## youra6

Refused delivery of my cards today.

Waiting for the 290s.


----------



## jerrolds

If I were to slap on an Accelero Hybrid... can I NOT glue the RAM sinks and get away with just the fan circulating air over the RAM? I'd like the option to change cooling, or sell the card and cooler separately.

I can also get Zalman RAM sinks with thermal tape; are those removable?


----------



## Arizonian

Quote:


> Originally Posted by *Raxus*
> 
> Add me please
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - Added


----------



## Raxus

Hmm, my H80 is louder. I plan on putting my rig on water soon anyhow, though.


----------



## Dominican

Has anyone been able to play BF4?


----------



## skupples

Quote:


> Originally Posted by *Arizonian*
> 
> Since AMD has added DMA engine (Direct Memory Access) in the AMD CrossFire compositing block. The video cards now have dual DMA engines. Only present in the R9 290X and R9 290. CrossFire has been simplified in the new series, and able to get frame pacing working. Also on OP.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hooray for working sideport. One of the reasons why I decide to try AMD and switched to the Red team.


Makes me wonder if Nvidia is tinkering with the same stuff in the lab. God knows they need to increase SLI capability.


----------



## pharma57

Nvidia already had a solution a while ago by increasing the bandwidth of the SLI bridge connector. The DMA engine will only be a solution for people with R9 290s and up; anyone with another AMD card in CrossFire is out of luck unless AMD can do something similar to Nvidia's solution.


----------



## jjjc_93

Quote:


> Originally Posted by *jomama22*
> 
> Flying back to Denver tonight, grabbing the 6 290xs from my friend holding my packages for me tomorrow morning. Will have 6 290x air tests tomorrow night. Max 1.3v.
> 
> Water tests will happen Thursday/Friday night. Max 1.42v.
> 
> I'll be using the pt03 bios for all testing that allows voltage control. Will also use stock bios to give an idea of what out-of-the-box performance to expect on air/water.
> 
> I'm hoping for at least 1 to hit 1300+ on water @ 1.4v or lower. If I can get 3 out of 6....mmmmmm delicious.


I used -50°C and 1.35v for 1350MHz; not sure I like your chances on water.


----------



## Paul17041993

Quote:


> Originally Posted by *Arizonian*
> 
> Since AMD has added DMA engine (Direct Memory Access) in the AMD CrossFire compositing block. The video cards now have dual DMA engines. Only present in the R9 290X and R9 290. CrossFire has been simplified in the new series, and able to get frame pacing working. Also on OP.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hooray for working sideport. One of the reasons why I decide to try AMD and switched to the Red team.


With the modifications to the design, similar to what they did in the Xbox One and PS4, it makes me wonder what they will do for Steamroller and/or the next lines of mobo chipsets. Maybe some linking components inside the CPU for better CPU and GPU synergy over the motherboard (apart from a HyperTransport increase)?

Now what would be hilarious is an 8350 + 290X combined as a one-chip APU...


----------



## Roaches

Quote:


> Originally Posted by *Paul17041993*
> 
> now what would be hilarious, a 8350 + 290X combined as a one-chip APU...


The TDP would be through the roof









Thankfully there's watercooling and huge heatsinks.


----------



## tsm106

Quote:


> Originally Posted by *jjjc_93*
> 
> I used -50c and 1.35v for 1350, not sure I like your chances on water


You are a harbinger of bad news!


----------



## VSG

Seriously! Geez man, here I was looking forward to water cooling and overclocking.


----------



## DampMonkey

New personal Best 1270/5972 - 1.318v, peaked at 48*C

http://www.3dmark.com/3dm11/7397431


----------



## BababooeyHTJ

I'm really debating selling my r9 290x. Not going to lie. I'll miss 3d vision.


----------



## Kenshiro 26




----------



## tsm106

Quote:


> Originally Posted by *DampMonkey*
> 
> New personal Best 1270/5972 - 1.318v, peaked at 48*C
> 
> http://www.3dmark.com/3dm11/7397431


That card is crying for a different CPU; no justice.


----------



## anubis1127

Quote:


> Originally Posted by *tsm106*
> 
> That card is crying for a diff cpu, no justice.


Don't let @dmfree88 see you posting that, xD


----------



## Arizonian

Quote:


> Originally Posted by *BababooeyHTJ*
> 
> I'm really debating selling my r9 290x. Not going to lie. I'll miss 3d vision.


You need a second rig you can hand things down to. My 680 was replaced by my 690 so I can 3D Vision game every now and again.







Only, for everything else, there's no way I prefer 1080p over 1440p. It did wonderfully last night with my single 290X.

I will be playing the BF4 campaign in 3D Vision after I complete it in 2D, though. I still love watching 3D movies. One thing AMD hasn't figured out with HD3D: it uses the third-party software TriDef, which is really a 2D-to-3D parlor trick.

Anyway I digress. Good to see an old 3D vision club buddy here.









Quote:


> Originally Posted by *Kenshiro 26*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Added congrats.


----------



## $ilent

Quote:


> Originally Posted by *anubis1127*
> 
> Don't let @dmfree88
> see you posting that, xD


You seem tempted by the 290X now, Dan?









@dampmonkey, that score seems low. I think what tsm is saying is right; the CPU is holding it back.

Also, anyone with a WC setup would know: would a 1.3v overclocked GPU draw more watts running at 95°C versus the same card with full-load temps of around 45°C with a WC block on?


----------



## anubis1127

Quote:


> Originally Posted by *$ilent*
> 
> You seem tempted now Dan by the 290x?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @dampmonkey, that score seems low. I think what tsm is saying is right, the cpu is holding it back.
> 
> Alao anyone with a wc setup will know, woukd a 1.3v overclocked gpu draw more watta running at 95c versus the same card with full load temps of like 45c wc block on?


Naw, not so much the 290X, but probably a non-reference-cooler 290 around Christmas time hopefully (just in time to try Mantle).

Watercooling will draw more wattage because of the extra fans/pump/maybe a fan controller.


----------



## Forceman

Quote:


> Originally Posted by *$ilent*
> 
> Alao anyone with a wc setup will know, woukd a 1.3v overclocked gpu draw more watta running at 95c versus the same card with full load temps of like 45c wc block on?


Yes, higher temps increase resistance and will increase power draw. I don't know how much of a practical difference it makes though.
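To put rough numbers on that (purely illustrative; the leakage doubling interval and wattages below are assumptions, not measured Hawaii figures), silicon leakage grows roughly exponentially with die temperature, so a card at 95C leaks noticeably more than the same card at 45C under water:

```python
# Rough illustration only: leakage current grows roughly exponentially
# with temperature. The doubling interval and wattages are assumed
# ballpark figures, not measured Hawaii characteristics.

def leakage_power(p_leak_ref, temp_c, temp_ref_c=45.0, doubling_c=30.0):
    """Estimate leakage power at temp_c, given leakage at temp_ref_c."""
    return p_leak_ref * 2 ** ((temp_c - temp_ref_c) / doubling_c)

dynamic_w = 220.0   # assumed switching power, roughly unchanged by temperature
leak_45_w = 40.0    # assumed leakage at 45C with a waterblock
leak_95_w = leakage_power(leak_45_w, 95.0)

print(round(dynamic_w + leak_45_w))   # total draw at 45C -> 260
print(round(dynamic_w + leak_95_w))   # total draw at 95C -> 347
```

So under these assumed numbers the hotter card pulls tens of watts more for the same work, which is why watercooled cards often draw less at the wall despite the pump and extra fans.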


----------



## tsm106

Quote:


> Originally Posted by *$ilent*
> 
> Quote:
> 
> 
> 
> Originally Posted by *anubis1127*
> 
> Don't let @dmfree88
> see you posting that, xD
> 
> 
> 
> You seem tempted now Dan by the 290x?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @dampmonkey, that score seems low. I think what tsm is saying is right, the cpu is holding it back.
> 
> Also, anyone with a wc setup will know: would a 1.3v overclocked gpu draw more watts running at 95c versus the same card at full load temps of like 45c with a wc block on?
Click to expand...

It's got nothing to do with being right or wrong, just putting out the best effort thru the whole process.

With my son's lil 3820 at 4.6 with 1225/1500 clocks, it scores 15659. You can see DM's card is working faster due to the graphics score and clock advantage, but everything else just gets hammered. His cpu is not doing the gpu justice.

http://www.3dmark.com/3dm11/7381362


----------



## surrealalucard

So just got my 290x today, slapped it in and played some bf4. Everything on ultra, 4x msaa, and all that...my avg fps is 43, on stock everything with the fan raised to 70%. I have an AMD Phenom II X6 1100T, 8 gigs of 1600mhz ram, running on an ssd. So is that really normal for this card? 43 fps???


----------



## sugarhell

Get a better cpu for bf4 or oc more


----------



## Arizonian

Quote:


> Originally Posted by *surrealalucard*
> 
> So just got my 290x today, slapped it in and played some bf4. Everything on ultra, 4x msaa, and all that...my avg fps is 43, on stock everything with the fan raised to 70%. I have an AMD Phenom II X6 1100T, 8 gigs of 1600mhz ram, running on an ssd. So is that really normal for this card? 43 fps???


Hello, I see it's your first post on OCN - welcome aboard.









For your CPU and GPU combo it probably is.

On my i7 3770K 4.5 Ghz / 290X OC 1100 / 1300 I saw 45 FPS lows - 72 FPS highs and would say average 55-65 FPS. Temps were 84C with 70% fan. Same ULTRA default setting.

PS Post back with a picture of your rig and OCN name and I'll add you to our roster.


----------



## rdr09

Quote:


> Originally Posted by *surrealalucard*
> 
> So just got my 290x today, slapped it in and played some bf4. Everything on ultra, 4x msaa, and all that...my avg fps is 43, on stock everything with the fan raised to 70%. I have an AMD Phenom II X6 1100T, 8 gigs of 1600mhz ram, running on an ssd. So is that really normal for this card? 43 fps???


what resolution and cpu oc?

why the thuban? lol


----------



## SpewBoy

To my Australian brethren: HIS R9 290X's are in stock over at JW Computers for $669 (as cheap as I could find in stock). I didn't know how long PCCG was going to take to get more stock in so I ordered mine from there last night and they confirmed stock and payment / postage details over the phone this morning. Note that they do not come with a BF4 coupon.


----------



## $ilent

Quote:


> Originally Posted by *anubis1127*
> 
> Naw, not so much the 290X, but probably a non-reference cooler 290 around Christmas time hopefully. (Just in time to try Mantle).
> 
> Watercooling will draw more wattage because of extra Fans / Pump / maybe a fan controller.


Awesome. i kinda wish i'd waited for the 290 but im glad i didnt. Firstly i got free bf4 and 3 games, second i dont think i could have waited another week or more, and lastly the 290x is slightly better, so if i'd returned my games + 290x for a 290 i'd have ended up paying the same amount.

@surreal your cpu is holding your 290x back im sorry to say


----------



## surrealalucard

Quote:


> Originally Posted by *rdr09*
> 
> what resolution and cpu oc?
> 
> why the thuban? lol


Resolution is 1920x1080, and cpu is at 4 ghz, and I grabbed it thinking I was getting a decent deal, and it had two more cores. Got it before the am3 cores were out.

I will have to get a picture of my rig later, probably tomorrow.


----------



## VSG

Wait how did you get 3 other games? Vendor promo?

I too am very happy I got the 290X- between the newegg mobile discount and selling off the BF4 code to a fellow OCNer, it cost me $510. I should have just bought 2 when it launched! Oh well, now I get to decide if I should get a 2nd 290X or a 290 instead.


----------



## surrealalucard

Quote:


> Originally Posted by *$ilent*
> 
> Awesome. i kinda wish i'd waited for the 290 but im glad i didnt. Firstly i got free bf4 and 3 games, second i dont think i could have waited another week or more, and lastly the 290x is slightly better, so if i'd returned my games + 290x for a 290 i'd have ended up paying the same amount.
> 
> @surreal your cpu is holding your 290x back im sorry to say


Yea, I thought that's what it most likely was.

I was thinking about upgrading to an AMD FX-8350 Vishera.. Heard good things about it.


----------



## broken pixel

Quote:


> Originally Posted by *Arm3nian*
> 
> Full AA is demanding at 2560x1440. You most likely will need 2 290x's or a watercooled one to get above 60fps average.


I do 82 fps capped with ultra settings, post-process medium, AA 4x @ 2560x1440 with a 7970 Lightning and an R7950 @ 1150MHz each.

2x 290x should do 120 fps


----------



## rdr09

Quote:


> Originally Posted by *surrealalucard*
> 
> Resolution is 1920x1080, and cpu is at 4 ghz, and I grabbed it thinking I was getting a decent deal, and it had two more cores. Got it before the am3 cores were out.
> 
> I will have to get a picture of my rig later, probably tomorrow.


it could be that the final release is a lot heavier than the beta. i was able to play the beta smoothly with the 7950 stock and thuban at 4GHz maxed out 1080 in MP 64. i did not measure my fps, though. do you mind running 3DMark11?

i've dl'ed BF4 in my intel rig and will do it on my amd. i'll play with them soon. crossfire this weekend.


----------



## BababooeyHTJ

Quote:


> Originally Posted by *Arizonian*
> 
> You need a second rig you can hand things down to. My 680 was replaced with my 690 so I can 3D Vision game every now and again.
> 
> 
> 
> 
> 
> 
> 
> For everything else though, there's no way I prefer 1080p over 1440p. Did wonderfully last night with my single 290X.
> 
> Will be playing the BF4 campaign in 3D Vision though after I complete it in 2D. Still love watching 3-D movies. One thing AMD hasn't figured out with HD3D: it relies on the third-party software Tri-Def, which is really a 2D-to-3D parlor trick.
> 
> Anyway I digress. Good to see an old 3D vision club buddy here.
> 
> 
> 
> 
> 
> 
> 
> 
> Added congrats.


Yeah, I'm really impressed with what a single 290x can do with 2560x1440. I should have my second tomorrow. Should be great for my 2b catleap.


----------



## surrealalucard

Quote:


> Originally Posted by *rdr09*
> 
> it could be that the final release is a lot heavier than the beta. i was able to play the beta smoothly with the 7950 stock and thuban at 4GHz maxed out 1080 in MP 64. i did not measure my fps, though. do you mind running 3DMark11?


Yea, I can run it tomorrow when I get home from work, will bookmark the thread!

see you guys later and thanks for all the quick replies!


----------



## $ilent

Quote:


> Originally Posted by *geggeg*
> 
> Wait how did you get 3 other games? Vendor promo?
> 
> I too am very happy I got the 290X- between the newegg mobile discount and selling off the BF4 code to a fellow OCNer, it cost me $510. I should have just bought 2 when it launched! Oh well, now I get to decide if I should get a 2nd 290X or a 290 instead.


Yeah, vendor continued the golden ticket offer for first day of 290x sales! Any idea how it works?


----------



## VSG

You mean the Never Settle Forever Bundle? Go to http://sites.amd.com/us/promo/never-settle/Pages/nsreloadedforever.aspx


----------



## $ilent

No its golden ticket, pick 3 free games.


----------



## jerrolds

Quote:


> Originally Posted by *$ilent*
> 
> Awesome. i kinda wish i'd waited for the 290 but im glad i didnt. Firstly i got free bf4 and 3 games, second i dont think i could have waited another week or more, and lastly the 290x is slightly better, so if i'd returned my games + 290x for a 290 i'd have ended up paying the same amount.


Wait - you got bf4 AND 3 games? I only got the coupon for bf4 with my Sapphire..


----------



## Sazz

Quote:


> Originally Posted by *DampMonkey*
> 
> New personal Best 1270/5972 - 1.318v, peaked at 48*C
> 
> http://www.3dmark.com/3dm11/7397431


your CPU score seems lower than it should be; I got my 8350 at 5Ghz too and it scores about 1k more than yours
Quote:


> Originally Posted by *tsm106*
> 
> That card is crying for a diff cpu, no justice.


FX-8350 is way more than good enough to run games. Fact of the matter is, on non-biased sites that reviewed it, it actually performs on par with and sometimes surpasses Intel's high-end CPUs in gaming. I'd rather spend 180 bucks on the CPU than 500-1k USD on a CPU that performs 2-10fps better in games that are already over 60fps.

And before you pull up the power consumption card, teksyndicate already did the math.


----------



## Sgt Bilko

In about 5 hours ill have my 290x, VRM heatsinks, Accelero Xtreme III cooler and the thermal paste and adhesive. So ill run some stock air tests tonight, then the Accelero on Friday, and then ill put the heatsinks on during next week (5 days for the adhesive to dry).

Heres hoping it all goes well


----------



## psyside

Quote:


> Originally Posted by *sugarhell*
> 
> Get a better cpu for bf4 or oc more


Quote:


> Originally Posted by *surrealalucard*
> 
> So just got my 290x today, slapped it in and played some bf4. Everything on ultra, 4x msaa, and all that...my avg fps is 43, on stock everything with the fan raised to 70%. I have an AMD Phenom II X6 1100T, 8 gigs of 1600mhz ram, running on an ssd. So is that really normal for this card? 43 fps???


----------



## anubis1127

Quote:


> Originally Posted by *Sazz*
> 
> FX-8350 is way more than good enough to run games. Fact of the matter is, on non-biased sites that reviewed it, it actually performs on par with and sometimes surpasses Intel's high-end CPUs in gaming. I'd rather spend 180 bucks on the CPU than 500-1k USD on a CPU that performs 2-10fps better in games that are already over 60fps.
> 
> And before you pull up the power consumption card, teksyndicate already did the math.


I'd rather spend less money on an Intel quad core and get better multi-GPU performance, but to each their own. Nothing wrong with a FX quad core for one GPU though.


----------



## $ilent

Quote:


> Originally Posted by *jerrolds*
> 
> Wait - you got bf4 AND 3 games? I only got the coupon for bf4 with my Sapphire..


Yeah i got both, where'd you buy from?


----------



## Sgt Bilko

Was a UK only deal iirc


----------



## tsm106

Quote:


> Originally Posted by *Sazz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> That card is crying for a diff cpu, no justice.
> 
> 
> 
> FX-8350 is way more than good enough to run games. Fact of the matter is, on non-biased sites that reviewed it, it actually performs on par with and sometimes surpasses Intel's high-end CPUs in gaming. I'd rather spend 180 bucks on the CPU than 500-1k USD on a CPU that performs 2-10fps better in games that are already over 60fps.
> 
> And before you pull up the power consumption card, teksyndicate already did the math.
Click to expand...

Who mentioned anything about him buying a 500-1k usd CPU? Eh? Come again? Don't come in with that chip-on-shoulder attitude. Now if you look at the numbers, my son's rig, which runs a cheap 200 buck 3820 clocked at a low 4.6ghz, put out a 15600-ish pscore to his 13892. The clocks on my run were much lower all around. The fact is his scores could be a lot higher. This same quad core powered my 7970 quadfire rig and put out some ludicrous numbers. Yea, all from an omg intel chip that cost me 200 bucks.

http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores/640_40#post_17525750

Also read post 941.


----------



## M125

*---Sapphire, stock HSF for now.*

I just got this system going again (no overclocking yet (New Batch 4770K!)), and the 290X does seem to have a bit more punch than a 7950/7970. I'm just wondering if its thermal footprint is too much for my system.

It's tucked inside a Node 304 with a 4770K using the only real place to mount a radiator (with a Corsair H90). Other options include non-water aftermarket HSFs, moving to a larger case (I'm against this), or selling off the R9 290X and getting a cooler-running GTX 780.

I use my systems 24/7 for BOINC projects, and so far the 290X looks to be performing 20-30% faster than a 7970 non-GE in Milkyway while consuming ~25% more power. All in "quiet" mode too.

*Edit: "Looks to be;" AMD capped Hawaii at 1/8 SP for double precision, the R9 290X performs ~30% lower than a 7970/R9 280X with DP*

*Gripes:*
Bad Coil Whine with [email protected], Milkyway.
For some reason, every utility I use either reports the card running at 1000+°C or 0°C.
Noisy, even in quiet mode.
Cheesy plastic heat-sink.
Stupid hot and power hungry.








This die should have been at 20nm.


----------



## Arizonian

Quote:


> Originally Posted by *M125*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> *---Sapphire, stock HSF for now.*
> 
> I just got this system going again (no overclocking yet (New Batch 4770K!)), and the 290X does seem to have a bit more punch than a 7950/7970. I'm just wondering if its thermal footprint is too much for my system.
> 
> It's tucked inside a Node 304 with a 4770K using the only real place to mount a radiator (with a Corsair H90). Other options include non-water aftermarket HSFs, moving to a larger case (I'm against this), or selling off the R9 290X and getting a cooler-running GTX 780.
> 
> I use my systems 24/7 for BOINC projects, and so far the 290X looks to be performing 20-30% faster than a 7970 non-GE in Milkyway while consuming ~25% more power. All in "quiet" mode too.
> 
> *Gripes:*
> Bad Coil Whine with [email protected], Milkyway.
> For some reason, every utility I use either reports the card running at 1000+°C or 0°C.
> Noisy, even in quiet mode.
> Cheesy plastic heat-sink.
> Stupid hot and power hungry.
> 
> 
> 
> 
> 
> 
> 
> 
> This die should have been at 20nm.


Sorry to hear yours has coil whine. You're the first here to have that. It does happen to GPUs. You're added to the owners list.


----------



## Sazz

Quote:


> Originally Posted by *tsm106*
> 
> Who mentioned anything about him buying a 500-1k usd CPU? Eh? Come again? Don't come in with that chip-on-shoulder attitude. Now if you look at the numbers, my son's rig, which runs a cheap 200 buck 3820 clocked at a low 4.6ghz, put out a 15600-ish pscore to his 13892. The clocks on my run were much lower all around. The fact is his scores could be a lot higher. This same quad core powered my 7970 quadfire rig and put out some ludicrous numbers. Yea, all from an omg intel chip that cost me 200 bucks.
> 
> http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores/640_40#post_17525750
> 
> Also read post 941.


See, this is what I meant.

How does that translate to actual game FPS, and not some synthetic benchmark pts? I go by actual GAME FPS, not synthetic benchmark points; those points mean nothing in the actual games. Heck, some hardware performs very well on these synthetic benchmarks but it doesn't translate when it comes to actual game FPS.

I've used intel before but I cannot justify the cost of their CPUs and mobos right now. FX-8350 + Gigabyte UD3 board costs about 300 bucks right now and it keeps up with intel when it comes to actual game FPS (again, not some synthetic benchmark pts).

anyways /endofdiscussion about this and move on to 290x discussion, don't wanna give you a heart attack intel. LOL


----------



## tsm106

If I had a nickel for each time the debate gets directed to the oft used "but bench scores are meaningless..." How is a bench defined? It's a measurement of work by the system. If you want to ignore benches, that's great. Don't pop into a situation with benching as the topic, and when things get tough cry about games, ok? You can do away with the trolling attitude too. Thanks.

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *M125*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> *---Sapphire, stock HSF for now.*
> 
> I just got this system going again (no overclocking yet (New Batch 4770K!)), and the 290X does seem to have a bit more punch than a 7950/7970. I'm just wondering if its thermal footprint is too much for my system.
> 
> It's tucked inside a Node 304 with a 4770K using the only real place to mount a radiator (with a Corsair H90). Other options include non-water aftermarket HSFs, moving to a larger case (I'm against this), or selling off the R9 290X and getting a cooler-running GTX 780.
> 
> I use my systems 24/7 for BOINC projects, and so far the 290X looks to be performing 20-30% faster than a 7970 non-GE in Milkyway while consuming ~25% more power. All in "quiet" mode too.
> 
> *Gripes:*
> Bad Coil Whine with [email protected], Milkyway.
> For some reason, every utility I use either reports the card running at 1000+°C or 0°C.
> Noisy, even in quiet mode.
> Cheesy plastic heat-sink.
> Stupid hot and power hungry.
> 
> 
> 
> 
> 
> 
> 
> 
> This die should have been at 20nm.
> 
> 
> 
> Sorry to hear yours has coil whine. You're the first here to have that. It does happen to GPUs. You're added to the owners list.
Click to expand...

Wow, that is a big card for a lil itty bitty Node. I have some coil whine on my Sapphire 290x also. It's not annoying for the most part.


----------



## DampMonkey

Quote:


> Originally Posted by *Arizonian*
> 
> Sorry to hear yours has coil whine. You're the first here to have that. It does happen to GPUs. You're added to the owners list.


Ive got coil whine too. About as much as my sapphire 7950 used to have, but less ear piercing


----------



## raghu78

Quote:


> Originally Posted by *tsm106*
> 
> If I had a nickel for each time the debate gets directed to the oft used "but bench scores are meaningless..." How is a bench defined? It's a measurement of work by the system. If you want to ignore benches, that's great. Don't pop into a situation with benching as the topic, and when things get tough cry about games, ok? You can do away with the trolling attitude too. Thanks. Wow, that is a big card for a lil itty bitty Node. I have some coil whine on my Sapphire 290x also. It's not annoying for the most part.


tsm106 have you put your card under water ? what are you clocking at ? how is the performance in games like BF4, Crysis 3 ?


----------



## binormalkilla

Has anyone seen any 3x Crossfire 290x results? When 2 more of the non-BF4 cards come into stock I'll be going this way probably.


----------



## tsm106

I had the 290x running with a gpu only block but have since gone back to the stock cooler. I'm waiting for ek blocks to arrive before torturing the cards more.

The highest overclock I managed was 1250/1650. I benched valley at 1250/1650 and got 79.1fps. When the blocks get here I will swap out the quads for two blocked 290x. Any day now fcpu, any day now!


----------



## DampMonkey

290x at 1260/5800 matching a 1411/7406 Classified
Quote:


> Originally Posted by *strong island 1*
> 
> GTX 780 Classified - Core 1411 - memory 7406 - 4930k 4.6ghz - Graphics score 13426
> 
> http://www.3dmark.com/3dm/1512722


R290X 1260 \ 5800 @ 1.318V with a 5ghz 8350 - GPU Score 13,457. Topped out at 49*C

http://i.imgur.com/mujk5UH.jpg
details: http://i.imgur.com/mujk5UH.jpg


----------



## raghu78

Quote:


> Originally Posted by *tsm106*
> 
> I had the 290x running with a gpu only block but have since gone back to the stock cooler. I'm waiting for ek blocks to arrive before torturing the cards more.
> 
> The highest overclock I managed was 1250/1650. I benched valley at 1250/1650 and got 79.1fps. When the blocks get here I will swap out the quads for two blocked 290x. Any day now fcpu, any day now!


1250 / 1650 mhz awesome







let us know of game performance when the full cover blocks arrive


----------



## Sazz

Quote:


> Originally Posted by *DampMonkey*
> 
> 290x at 1260/5800 matching a 1411/7406 Classified
> R290X 1260 \ 5800 @ 1.318V with a 5ghz 8350 - GPU Score 13,457. Topped out at 49*C
> 
> http://i.imgur.com/mujk5UH.jpg
> details: http://i.imgur.com/mujk5UH.jpg


anyone that has used the Asus BIOS flash route, can you guys post the stock voltages on your cards as well? I wanna get an idea how low the voltage is on average at stock settings.


----------



## VSG

290 launch delayed by a week


----------



## DampMonkey

Quote:


> Originally Posted by *Sazz*
> 
> anyone that has used the Asus BIOS flash route, can you guys post the stock voltages on your cards as well? I wanna get an idea how low the voltage is on average at stock settings.


Stock voltage is 1.25 and it clocks down to 300mhz when not in use


----------



## th3illusiveman

Quote:


> Originally Posted by *DampMonkey*
> 
> 290x at 1260/5800 matching a 1411/7406 Classified
> R290X 1260 \ 5800 @ 1.318V with a 5ghz 8350 - GPU Score 13,457. Topped out at 49*C
> 
> http://i.imgur.com/mujk5UH.jpg
> details: http://i.imgur.com/mujk5UH.jpg


Are you turning Tessellation off in your 3DM11 benchmarks or is the card actually getting 18K with it on? If you are turning it off then what are your scores with it on?


----------



## Sazz

Quote:


> Originally Posted by *th3illusiveman*
> 
> Are you turning Tessellation off in your 3DM11 benchmarks or is the card actually getting 18K with it on? If you are turning it off then what are your scores with it on?


I am getting 16.6k at 1100/1350 OC with FX-8350 @5Ghz so I think 18k is very achievable with further OC.


----------



## DampMonkey

Quote:


> Originally Posted by *th3illusiveman*
> 
> Are you turning Tessellation off in your 3DM11 benchmarks or is the card actually getting 18K with it on? If you are turning it off then what are your scores with it on?


Its set to "use application settings"


----------



## psyside

Quote:


> Originally Posted by *geggeg*
> 
> 290 launch delayed by a week


?


----------



## Arizonian

Quote:


> Originally Posted by *psyside*
> 
> ?


November 5th

http://www.techpowerup.com/mobile/193497/radeon-r9-290-non-x-launch-pushed-back-a-week.html
Quote:


> AMD reportedly developed a new driver that significantly improves performance on the R9 290. AMD pushed this driver onto reviewers at the last minute, and asked them to re-bench their R9 290 samples from scratch


In short, they have new drivers and they want to push them out at the same time as the release date, so expect a performance boost on the 290X as well. They've asked reviewers to re-bench their scores ready for the new NDA lift on release date. So despite a six-day delay, this means better performance for us. I'm excited. I have a feeling it's not a coincidence that it happens to be the same time the 780 Ti is coming out.


----------



## Sgt Bilko

Finally!!!

Count me in


----------



## Roaches

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Finally!!!
> 
> Count me in


Show us results when you get that cooler installed









I wanna know how beastly the temps on that card are with 3 fans


----------



## Sgt Bilko

Quote:


> Originally Posted by *Roaches*
> 
> Show us results when you get that cooler installed
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I wanna know how beastly the temps on that card are with 3 fans


I'll be putting the cooler on Friday afternoon, i want the whole weekend to play around with this









I'm hoping for it to drop to 60 max


----------



## Arizonian

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Finally!!!
> 
> Count me in
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Welcome to the show. I'm very interested in seeing your numbers with the Arctic Accelero Xtreme III cooler added on.

Added to roster


----------



## KnightVII

My question:

-Is it loud when playing BF4?
-Auto shutdown - hot?
-Stuttering?
-Is Stock cooler ok for 290x?


----------



## Sazz

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'll be putting the cooler on Friday afternoon, i want the whole weekend to play around with this
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm hoping for it to drop to 60 max


are you using the shim? I've seen people say it fits even without the shim.

And you might wanna do it Thursday night coz I believe the thermal adhesive it came with needs about 5-8hrs to cure.

I've researched it before coz I was gonna use the 7970 version of it; some bought the separate arctic cooling thermal adhesive, and one guy mixed in AS5 (40/40/20% AS5) to lessen the adhesiveness. He said it's still sticking perfectly but it made it easily removable.

Quote:


> Originally Posted by *KnightVII*
> 
> My question:
> 
> -Is it loud when playing BF4?
> -Auto shutdown - hot?
> -Stuttering?
> -Is Stock cooler ok for 290x?


-to me, nope. Again, this always depends on the user; some may tolerate more noise than others, but imo at 65% fan speed it never goes over 78C and I can barely hear it over my speakers (volume set at 25% in Windows; they're 25-watt Sony speakers, loud enough to fill the room and a little more, but not so loud they can be heard beyond the walls of my room).

-nope, it will never get to that point, coz when it reaches 95C it will auto-downclock itself to use less power and maintain 95C (in real-world usage it's 94C); it is designed to run daily at 95C.
-nope, haven't seen any of it. In fact, with every other GPU I have handled, most of them stutter to show instability when overclocking; the 290x almost never showed me that during OC testing. When an OC is unstable it either won't run the benchmark at all (it won't load through) or you'll get a black screen (memory OC causes this; it means you're unstable, but all you need is a force restart and you're back in business). If an unstable OC does get through a benchmark, you'll see some artifacts appear, but not to the point of stuttering.
-stock cooler is "good enough" to cool it, but it could have been way better. I suggest getting an aftermarket cooler if you want it really quiet (i.e. the Arctic Cooling Accelero Xtreme coolers)
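The throttle-to-95C behaviour described above can be sketched as a toy control loop. To be clear, this is NOT AMD's actual PowerTune logic (the clock figures and step size are made up); it just illustrates the idea of shedding clocks at the temperature target instead of shutting down:

```python
# Toy sketch of throttle-to-target behaviour: at the 95C target the card
# sheds clock speed; below it, clocks recover. Constants are illustrative.

TARGET_C = 95
BASE_MHZ = 1000   # assumed full boost clock
MIN_MHZ = 300     # assumed idle/floor clock
STEP_MHZ = 25     # assumed adjustment per control tick

def next_clock(current_mhz, temp_c):
    if temp_c >= TARGET_C:
        return max(MIN_MHZ, current_mhz - STEP_MHZ)  # shed clocks to cool
    return min(BASE_MHZ, current_mhz + STEP_MHZ)     # recover headroom

clock = BASE_MHZ
for temp in [80, 90, 95, 96, 95, 94, 90]:
    clock = next_clock(clock, temp)
    print(temp, clock)
```

Running it shows the clock dipping while the simulated die sits at 95-96C and climbing back once it cools, which matches the "downclocks to maintain 95C, never shuts down" behaviour described above.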


----------



## Forceman

Quote:


> Originally Posted by *Sazz*
> 
> are you using the shim? I've seen people say it fits even without the shim.
> 
> And you might wanna do it Thursday night coz I believe the thermal adhesive it came with needs about 5-8hrs to cure.
> 
> I've researched it before coz I was gonna use the 7970 version of it; some bought the separate arctic cooling thermal adhesive, and one guy mixed in AS5 (40/40/20% AS5) to lessen the adhesiveness. He said it's still sticking perfectly but it made it easily removable.


It shouldn't need a shim, the die sticks up past the edge of the surrounding structure.


----------



## Arizonian

Quote:


> Originally Posted by *KnightVII*
> 
> My question:
> 
> -Is it loud when playing BF4?
> -Auto shutdown - hot?
> -Stuttering?
> -Is Stock cooler ok for 290x?


MSI Afterburner - 83C Temp - 70% Manual Fan - 100% GPU Usage - 1100 Core Clock / 1300 Memory Clock (No down clocking) - 2388 MB VRAM Memory Usage

Recorded a video so members can see for themselves. I felt like it was just as good as my Nvidia cards playing smooth as butter. I was expecting all kinds of issues and had not one.

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/1540#post_21081473

BF4 Ultra default settings @ 2560 x 1440p - 45 FPS lows - 72 FPS highs and would say average 55-65 FPS

Check it out and let me know what you think. Seeing is believing. I recorded it live rather than with FRAPS to show exactly how it plays.


----------



## Sgt Bilko

Quick Valley run, 1120/1380, 80% fan, Temp 71 stock air and stock volts


----------



## RocketAbyss

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quick Valley run, 1120/1380, 80% fan, Temp 71 stock air and stock volts
> 
> 
> Spoiler: Warning: Spoiler!


Seems your CPU is holding you back.

Heres my run at 1102/1400 on my FX8350 5GHz:


----------



## Sazz

Quote:


> Originally Posted by *RocketAbyss*
> 
> Seems your CPU is holding you back.
> 
> Heres my run at 1102/1400 on my FX8350 5GHz:


I've ran Valley and heaven from stock clocks to 1100/1350 OC and I think the CPU plays a bigger role on these benchmarks than others. 2-5Fps better than my 7970 runs at 1330/1860 OC.



(First column from the left is a custom Heaven run at 1920x1080, ultra settings, extreme tessellation and 8x AA; 2nd is the preset Extreme HD settings in Heaven; and third is the Valley Extreme HD run, paired with FX-8350 @ 5Ghz)


----------



## Sgt Bilko

Quote:


> Originally Posted by *RocketAbyss*
> 
> Seems your CPU is holding you back.
> 
> Heres my run at 1102/1400 on my FX8350 5GHz:


Thats what im thinking as well. Just have to find out if the 9370 will be compatible with my motherboard; i know it's fine for the Formula-Z but i have the non-Z version, bought it one month before ASUS announced it......grrrrr

Oddly enough, it works out cheaper to buy the 9370 and use the included water cooler than to buy an 8350 and then watercool it


----------



## selk22

Here is pretty much my final bench.. managed to squeeze .2-.5 better than previously









Valley Extreme HD


Heaven Extreme 1080p


Thats the best I can do; the card is at 1100/1400


----------



## MIGhunter

Stupid UPS, got ready for work tonight and found their sticker on my door. The guy never knocked. I'm still baffled as to how he got to my door to put the sticker on it without my dogs going crazy. He musta been all ninja like! Guess there's always today.


----------



## Sazz

LOL My best Valley run so far at 1100/1375 OC, pretty evil score xD


----------



## Cool Mike

waiting on customs. Within one month


----------



## Sazz

Quote:


> Originally Posted by *Cool Mike*
> 
> waiting on customs. Within one month


funny coz that's whats holding Performance-pcs and Frozencpu up with the 290x waterblocks... -_-


----------



## Sgt Bilko

was just watching a movie and glanced down to see the card idling at 93 degrees on auto..........just set it to 50% and dropped to 57 now, I need to get that Accelero Xtreme on this fast.......


----------



## $ilent

Quote:


> Originally Posted by *Sgt Bilko*
> 
> was just watching a movie and glanced down to see the card idling at 93 degrees on auto..........just set it to 50% and dropped to 57 now, I need to get that Accelero Xtreme on this fast.......


Nice

thats probably down to the 100% gpu usage at idle bug; you need to close gpu-z and msi ab to get rid of that. Also bear in mind the fan immediately drops to 20% following gaming, so it takes ages to cool down too.
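One workaround for the snap-to-20% behaviour is a custom fan curve in Afterburner or Trixx so the fan tracks temperature on the way down too. A hypothetical curve, sketched in Python (the points are purely illustrative, not a recommendation):

```python
# Hypothetical custom fan curve (temp C -> fan %), the kind you would
# set in Afterburner/Trixx. The curve points below are made-up examples.

CURVE = [(40, 20), (60, 40), (75, 55), (85, 70), (95, 100)]

def fan_percent(temp_c):
    """Linearly interpolate fan duty between the nearest curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(83))  # temps right after a BF4 session -> 67.0
```

With a curve like this the fan eases down as the card cools rather than dropping straight to 20% the moment the game closes.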


----------



## maarten12100

It seems that the cooler is good but the block makes a lot of resonance. Has anybody tested the fan without the block?


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> was just watching a movie and glanced down to see the card idling at 93 degrees on auto..........just set it to 50% and dropped to 57 now, I need to get that Accelero Xtreme on this fast.......


I saw your Valley run. Hawaii really needs some work on this bench. You might want to hold off 'cause some 7950s beat that score.

Edit: my bad, just saw Sazz's score at only 1100.


----------



## Sgt Bilko

Quote:


> Originally Posted by *$ilent*
> 
> Nice
> 
> That's probably down to the 100% GPU usage at idle bug; you need to close GPU-Z and MSI AB to get rid of that. Also bear in mind the fan immediately drops to 20% following gaming, so it takes ages to cool down too.


Yeah, I haven't opened GPU-Z due to that bug; wasn't aware of AB doing it though. Guess I'd better download TriXX until this gets fixed. Thanks for the heads-up









Very curious to see what the temps/noise are like after I put that aftermarket cooler on it, though.


----------



## skupples

Quote:


> Originally Posted by *Sazz*
> 
> See, this is what I meant.
> 
> How does that translate to actual game FPS and not some synthetic benchmark pts? I consider actual GAME FPS and not these synthetic benchmark pts, these points mean nothing in the actual games. heck some hardware seems to perform very well on these synthetic benchmarks but it doesn't translate well when it comes to actual game FPS.
> 
> I've used intel before but I cannot justify the cost of their CPU's and mobo's and their costs right now, FX-8350+Gigabyte UD3 board cost about 300bucks right now and it keeps up with intel when it comes to actual game FPS (again, not some synthetic benchmark pts).
> 
> anyways /endofdicussion about this and move on to 290x discussion, don't wanna give you a heart attack intel. LOL


I only remember Tek comparing it to the 3770K, not the E series. It was neck and neck, and sometimes better, in the tests they ran. It romps all over the 3770K when streaming.

OK, no more Intel vs. AMD derail; we had plenty of NV derailing, which I HAD NOTHING TO DO WITH.








Quote:


> Originally Posted by *MIGhunter*
> 
> Stupid UPS, got ready for work tonight and found their sticker on my door. The guy never knocked. I'm still baffled as to how he got to my door to put the sticker on it without my dogs going crazy. He musta been all ninja like! Guess there's always today.


A trained master in the art of the sneak attack. An ancient form of Ninjutsu.


----------



## Clockster

Cards have finally arrived.
Will collect them in the morning









Can someone say 8x Gigabyte R9 290X







lol


----------



## $ilent

I would just use CCC for now, bud


----------



## fleetfeather

Quote:


> Originally Posted by *Clockster*
> 
> Cards have finally arrived.
> Will collect them in the morning
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can someone say *8x Gigabyte R9 290X*
> 
> 
> 
> 
> 
> 
> 
> lol


----------



## $ilent

Quote:


> Originally Posted by *Clockster*
> 
> Cards have finally arrived.
> Will collect them in the morning
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can someone say 8x Gigabyte R9 290X
> 
> 
> 
> 
> 
> 
> 
> lol


ate....eigh....no, I can't do it.


----------



## bond32

Don't mean to stir the hornets' nest, but those Valley scores aren't exactly what I hoped for. I'm still undecided between this card and the 780 Lightning. Those of you on water, how are your clocks?

Edit: where are you guys finding these in stock?


----------



## DampMonkey

Quote:


> Originally Posted by *bond32*
> 
> Don't mean to stir the hornets' nest, but those Valley scores aren't exactly what I hoped for. I'm still undecided between this card and the 780 Lightning. Those of you on water, how are your clocks?


Go look at any formal review, AMD always lags behind in Valley. Heaven is more even for some reason though


----------



## bond32

Quote:


> Originally Posted by *DampMonkey*
> 
> Go look at any formal review, AMD always lags behind in Valley. Heaven is more even for some reason though


Thanks for the clarification. Figured it was something along those lines.

Going to read through this thread later today. What are your framerates in games like?


----------



## jomama22

Quote:


> Originally Posted by *Clockster*
> 
> Cards have finally arrived.
> Will collect them in the morning
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can someone say 8x Gigabyte R9 290X
> 
> 
> 
> 
> 
> 
> 
> lol



And I thought my 6 was a good number, lol.

Almost picked up another from Amazon this morning, but I'll just wait.


----------



## $ilent

Quote:


> Originally Posted by *bond32*
> 
> Don't mean to stir the hornets' nest, but those Valley scores aren't exactly what I hoped for. I'm still undecided between this card and the 780 Lightning. Those of you on water, how are your clocks?
> 
> Edit: where are you guys finding these in stock?


How do the 3dmark11 scores grab you?


----------



## evensen007

Stepped away for a bit and lots of posts:

Quick question. Have we seen the limits being pushed on water yet, or is the early driver/afterburner-voltage lock holding it back?

I would expect once AB unlocks true core and memory voltage controls, we should be able to hit 1250 on the core easily on water. Is this the consensus? Can someone also confirm/deny whether memory overclocking on the 290x has a more significant impact on FPS than it did on the 7970's?

Thanks!

Chris


----------



## DampMonkey

Quote:


> Originally Posted by *evensen007*
> 
> Stepped away for a bit and lots of posts:
> 
> Quick question. Have we seen the limits being pushed on water yet, or is the early driver/afterburner-voltage lock holding it back?
> 
> I would expect once AB unlocks true core and memory voltage controls, we should be able to hit 1250 on the core easily on water. Is this the consensus? Can someone also confirm/deny whether memory overclocking on the 290x has a more significant impact on FPS than it did on the 7970's?
> 
> Thanks!
> 
> Chris


1250 core can be had relatively easily on water. I'm having a lot of trouble getting stable at 1300 though. I'm still pretty new to GPU overclocking and haven't figured out what the RAM likes yet. I need to run some benches to see if RAM clocking even affects game performance, because I think it's causing my issues.

Temperatures do not seem to be the issue; I have yet to break 50°C on the core. No word on VRM temps yet.


----------



## szeged

Quote:


> Originally Posted by *DampMonkey*
> 
> 1250 core can be had relatively easily on water. I'm having a lot of trouble getting stable at 1300 though. I'm still pretty new to GPU overclocking and haven't figured out what the RAM likes yet. I need to run some benches to see if RAM clocking even affects game performance, because I think it's causing my issues.
> 
> Temperatures do not seem to be the issue; I have yet to break 50°C on the core. No word on VRM temps yet.


VRAM doesn't affect games nearly as much as core does


----------



## raghu78

If the R9 290 has been delayed due to a new driver that brings performance improvements, wouldn't that help the R9 290X too? They are the same chip and architecture. AMD is really bringing a good fight this holiday. BF4 Mantle will be the icing on the cake in December.


----------



## DampMonkey

Quote:


> Originally Posted by *szeged*
> 
> VRAM doesn't affect games nearly as much as core does


I was thinking that was the case too, because the card has quite a lot of bandwidth anyway from being 512-bit. I may hold off on finding max RAM overclocks until I get the core sorted out and better memory voltage management is available.


----------



## seabiscuit68

Anyone have any advice for me?

Currently running 6870's (Crossfire) @ 1920 x 1080

I will eventually upgrade to a higher resolution monitor, but I don't suspect that will be for another year.

Is it worth upgrading to 7970s (Crossfire) for $560, or going straight for a single 290X ($560)?

Benchmarks show the 7970s would be faster, but buying a single 290X would allow me to Crossfire eventually...

Or is it not worth it right now, and I should wait until I am using a higher resolution monitor?

Thanks


----------



## SLK

Is it just me, or does the 290X have no UEFI GOP support? I'm getting boot loops on Windows 8.1 because I can't set up Secure Boot properly. It works fine on the Intel integrated graphics.

Also, does anyone's fan rattle a little at around 40% fan speed?


----------



## raghu78

Quote:


> Originally Posted by *seabiscuit68*
> 
> Anyone have any advice for me?
> 
> Currently running 6870's (Crossfire) @ 1920 x 1080
> 
> I will eventually upgrade to a higher resolution monitor, but I don't suspect that will be for another year.
> 
> Is it worth upgrading to 7970's (Crossfire) for $560 or going straight for a single 290x ($560)
> 
> Benchmarks show the 7970's would be faster, but buying a single 290x would allow me to crossfire eventually...
> 
> Or is it not worth it right now, and I should wait until I am using a higher resolution monitor?
> 
> Thanks


Wait for the R9 290 launch on Nov 5th. At the same clocks, the R9 290 is going to be 6-8% slower than the R9 290X. Wait for reviews and then jump on the card.









http://www.techpowerup.com/193497/radeon-r9-290-non-x-launch-pushed-back-a-week.html


----------



## rdr09

Quote:


> Originally Posted by *seabiscuit68*
> 
> Anyone have any advice for me?
> 
> Currently running 6870's (Crossfire) @ 1920 x 1080
> 
> I will eventually upgrade to a higher resolution monitor, but I don't suspect that will be for another year.
> 
> Is it worth upgrading to 7970's (Crossfire) for $560 or going straight for a single 290x ($560)
> 
> Benchmarks show the 7970's would be faster, but buying a single 290x would allow me to crossfire eventually...
> 
> Or is it not worth it right now, and I should wait until I am using a higher resolution monitor?
> 
> Thanks


just an estimate . . .

7970 = 2.5 * 6870
290 = 5.0 * 6870


----------



## jomama22

Here they are, boys and girls. All 6.

3x MSI, 2x XFX, 1x Sapphire.


----------



## raghu78

Quote:


> Originally Posted by *jomama22*
> 
> Here they are boys and girls. All 6.
> 
> 3x MSI 2x xfx 1x sapphire.


Wow, why do you need 6? Are you binning?


----------



## alancsalt

Quote:


> Originally Posted by *raghu78*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jomama22*
> 
> Here they are boys and girls. All 6.
> 
> 3x MSI 2x xfx 1x sapphire.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wow, why do you need 6? Are you binning?
Click to expand...

I'd guess he was Bitcoin mining....


----------



## jomama22

Quote:


> Originally Posted by *raghu78*
> 
> Wow, why do you need 6? Are you binning?


Mayyyybbbeeeee, lol. I have work, then my favorite band (Streetlight Manifesto) is going on their last tour ever and I'll be going to their Denver concert tonight. Then I'll spend all night air benching. I get the blocks in tomorrow and will have water numbers by Friday night/Saturday.

And I just got back from vacation... What a good 2 weeks haha.


----------



## cyenz

I have the option to buy a new Titan for €600. Is it a good buy, or should I wait for the 290X?


----------



## EmZkY

I'm juggling between waiting for the 290 or just buying a 290X right now for my newly built rig =). Is it regular Battlefield 4 that comes with the R9 290X, or Battlefield 4 Premium?


----------



## VSG

lol why are the XFX boxes so small in comparison? No accessories included?


----------



## Arizonian

Quote:


> Originally Posted by *EmZkY*
> 
> I'm juggling between waiting for the 290 or just buy a 290x right now for my newly built rig =). Is it regular Battlefield 4 that comes with the r9 290x or is it Battlefield 4 Premium?


Regular edition.


----------



## Okt00

Quote:


> Originally Posted by *jomama22*
> 
> Mayyyybbbeeeee, lol. I have work, then my favorite band (Streetlight Manifesto) is going on their last tour ever and I'll be going to their Denver concert tonight. Then I'll spend all night air benching. I get the blocks in tomorrow and will have water numbers by Friday night/Saturday.
> 
> And I just got back from vacation... What a good 2 weeks haha.


Looking forward to some numbers, especially comparisons water vs air!


----------



## winkyeye

Quote:


> Originally Posted by *DampMonkey*
> 
> Go look at any formal review, AMD always lags behind in Valley. Heaven is more even for some reason though


Really? In the last generation the GTX 680 lagged severely behind 7970 in Valley. My GTX 680 at 1520 core struggled to keep up with a 1300 core 7970.


----------



## seabiscuit68

Quote:


> Originally Posted by *raghu78*
> 
> wait for R9 290 launch on Nov 5th. at the same clocks R9 290 is going to be 6- 8% slower than R9 290X. wait for reviews and then jump on the card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/193497/radeon-r9-290-non-x-launch-pushed-back-a-week.html


Thanks for the reminder. I completely forgot about the 290 (long day). I'll start looking for buyers on my 6870's - maybe get $120 for them (combined) and put that towards the 290.


----------



## TheRoot

Quote:


> Originally Posted by *jomama22*
> 
> Here they are boys and girls. All 6.
> 
> 3x MSI 2x xfx 1x sapphire.


$3500


----------



## battleaxe

Quote:


> Originally Posted by *TheRoot*
> 
> $3500


The bright side is that this would have barely covered 3 Titans.


----------



## raghu78

Quote:


> Originally Posted by *seabiscuit68*
> 
> Thanks for the reminder. I completely forgot about the 290 (long day). I'll start looking for buyers on my 6870's - maybe get $120 for them (combined) and put that towards the 290.


good decision.







you can sell the two HD 6870 cards for $150 - $160.


----------



## wolfej

Well, got my waterblocks today, but one of my GPUs has ridiculously loud coil whine. I can hear it even with my headphones on. Back to Amazon that one goes.


----------



## jomama22

Quote:


> Originally Posted by *wolfej*
> 
> Well, got my waterblocks today, but one of my GPUs has ridiculously loud coil whine. I can hear it even with my headphones on. Back to Amazon that one goes.


Quick question: when did Amazon have them in stock, and for how much? I never seemed to catch them.

Kind of a side note:

The serial # on the cards could probably be traced to having Hynix or Elpida.

Also, it's clear from the serial #s that there are at least 20,000 made so far. My #s all fall within batch 40-xxxxxxx or 41-xxxxxxx, with the x's running 1-9999 it seems.


----------



## wolfej

They're in stock right now and for 569 bucks with BF4.

Edit: And now they're not haha. They were in stock like an hour ago.


----------



## jomama22

Quote:


> Originally Posted by *wolfej*
> 
> They're in stock right now and for 569 bucks with BF4.
> 
> Edit: And now they're not haha. They were in stock like an hour ago.


haha no problem. I saw they had a bf4 sapphire for $600 as well.


----------



## Havolice

No store has the waterblocks here, and I've got no time to tinker on my PC due to work.

Needless to say, I took my 290Xs out and put in a 670 Mini -.- Those really loud fan speeds on the basic Windows desktop were making my life a living hell.

I still don't get it: on the desktop the fans make a lot of noise, yet in games the fan magically slows down and then it's bearable.


----------



## VSG

Waterblocks are due to come in Friday to Frozen CPU/PPC and will be shipped out on Monday or Tuesday for orders placed so far. So I am looking at late next week for mine


----------



## Scorpion667

Can someone please post minimum and average FPS for a 1100Mhz 290x in BF4 Multiplayer?
Please use the following settings:

All Ultra except for shadows, keep these on mid.

AA disabled
Ambient Occlusion disabled
AF at 4x
1920x1080


----------



## tsm106

I ran this on air the other day, but I goofed and forgot to up the memory speed past 1375MHz. Doh. Still, she ran 1225MHz core on air, lol.

http://www.3dmark.com/fs/1060554


----------



## jrcbandit

For the EK waterblock, should I use MX-4 for the VRM/MOSFET thermal pads as suggested in the instructions? I assume they are talking about the thermal pads labeled 2 in the manual, since 1 appears to be for the memory? Does it matter how well you cut the thermal pads to fit the VRMs? Only the memory pads are cut to size.


----------



## Sazz

Quote:


> Originally Posted by *geggeg*
> 
> Waterblocks are due to come in Friday to Frozen CPU/PPC and will be shipped out on Monday or Tuesday for orders placed so far. So I am looking at late next week for mine


Looks like PPCS has received their stock of the waterblocks; my order was marked shipped earlier, so I may get mine Thursday (tomorrow) at the earliest.


----------



## SLK

Is anyone's fan a little "clicky" sounding at around 40-50%? I remember my 7970's fan sounding pretty smooth, but this one sounds like that crap fan they used on the short GTX 670 versions.


----------



## skupples

Quote:


> Originally Posted by *szeged*
> 
> vram doesnt effect games nearly as much as core does


Very true, though if you are a multi-monitor gamer it will help smooth things out more than it boosts FPS.

Quote:


> Originally Posted by *jrcbandit*
> 
> For the EK water block, should I use MX-4 for the VRM/Mosfet thermal pads as suggested in the instructions? I assume they are talking about thermal pads labeled 2 in the manual since 1 appears to be memory? Does it matter how well you cut the thermal pads to fit the VRMs? Only the memory pads are cut to size.


EK normally suggests using a dot of TIM sandwiched with a thermal pad on both the VRMs and VRAM.


----------



## GioV

Quote:


> Originally Posted by *Sazz*
> 
> Looks like PPCS has received their stock of the waterblocks; my order was marked shipped earlier, so I may get mine Thursday (tomorrow) at the earliest.


Thanks for the heads up! Just ordered. I called, and even though they received a huge shipment, the $106 waterblock is still unavailable due to high demand, so I got the slightly more expensive one. Should get it tomorrow; Performance PCs is only 45 min from me!


----------



## jerrolds

Finally got my 290X yesterday. Proof:


----------



## skupples

Quote:


> Originally Posted by *GioV*
> 
> Thanks for the heads up! Just ordered. I called, and even though they received a huge shipment, the $106 waterblock is still unavailable due to high demand, so I got the slightly more expensive one. Should get it tomorrow; Performance PCs is only 45 min from me!


I have driven there many times! Appointment only.


----------



## Sazz

Quote:


> Originally Posted by *GioV*
> 
> Thanks for the heads up! Just ordered. I called, and even though they received a huge shipment, the $106 waterblock is still unavailable due to high demand, so I got the slightly more expensive one. Should get it tomorrow; Performance PCs is only 45 min from me!


I ordered the $106 one; even FrozenCPU said that specific block is the one in demand. They got the plexi+nickel and plexi+copper in stock, but none of the acetal ones yet.

I guess I am one of the lucky ones who got an order in for the acetal+copper one first.

FrozenCPU expects to get their shipment on Monday, so they may start sending those out on Tuesday.


----------



## Taint3dBulge

WOWZERS. Look at those heat pipes on this 290X... This is at the BF4 launch party...

Guess these are VTX3D cards. Never heard of that brand before...


----------



## Coree

Quote:


> Originally Posted by *Taint3dBulge*
> 
> WOWZERS. Look at those heat pipes on this 290X... This is at the BF4 launch party...
> 
> Guess these are VTX3D cards. Never heard of that brand before...


I REALLY hope they didn't forget to put heatsinks on both VRMs. My VTX3D 7950 had a heatsink only on VRM #2; #1 had none and was running hot, reaching 95°C+ at stock volts (993mV). It crippled my OC potential so much on air.








Otherwise I love the colors on the card.


----------



## Paul17041993

Quote:


> Originally Posted by *SLK*
> 
> Is anyone's fan a little "clicky" sounding around 40-50%? I remember my 7970 fan sounding pretty smooth but this one sounds like that crap fan they used on the GTX 670 short versions.


Did you get a good look around the fan? Is the cover slightly too close to it? How much does the fan move on its bearing? I think there might have been an iffy batch of these, from what I've heard from some people here...
Quote:


> Originally Posted by *Taint3dBulge*
> 
> WOWZERS. Look at those heat pipes on this 290X... This is at the BF4 launch party...
> 
> Guess these are VTX3D cards. Never heard of that brand before...


mate, those are 280Xs...


----------



## Iniura

Guys, apparently there's a guy on OcUK who got a Sapphire 290 delivered instead of a 290X. They made a big mistake, lol. He's running and posting benchmarks at the moment; thought a lot of people would be interested, so I'm sharing it here. Don't pin me on whether it's legit or not, but it sure looks like it is: http://forums.overclockers.co.uk/showthread.php?t=18551717&highlight=r9+290x&page=33


----------



## MrTOOSHORT

Here's some R9 290 pics from another forum:






*http://forums.overclockers.co.uk/showthread.php?t=18551717&highlight=r9+290x&page=35*

A guy bought an R9 290X but received an R9 290 instead.


----------



## Slomo4shO

AMD Catalyst 13.11 v8 beta drivers are out.

Windows 64:
http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx

Linux 64 is still v6:
http://support.amd.com/en-us/kb-articles/Pages/Latest-LINUX-Beta-Driver.aspx


----------



## MarvinDessica

Can't wait to get my 290x


----------



## Arizonian

Quote:


> Originally Posted by *jerrolds*
> 
> Finally got my 290X yesterday proof:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Dang - Sapphire BF4 edition with Elpida memory, hopefully it OCs well.


Congrats bud. Added









Quote:


> Originally Posted by *MrTOOSHORT*
> 
> Here's some R9 290 pics from another forum:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *http://forums.overclockers.co.uk/showthread.php?t=18551717&highlight=r9+290x&page=35*
> 
> Guy bought a R9 290x, but received a R9 290 instead.


That's pre-new-driver performance for the 290; AMD delayed the NDA lift so they could get the new drivers to reviewers to re-bench. However, when we compare those numbers it will give us some insight into approximately how much of a boost the wait brought.


----------



## surrealalucard

Alright, got 3DMark 11 and ran the Extreme preset. Everything on the GFX card is stock; fan running at 70%, and the temps barely went over 65°C.

Comp specs:
AMD Phenom II X6 1100T @ 4GHz (tomorrow my FX-8350 gets here)
R9 290X of course! @ stock
8GB DDR3 memory @ stock

Here is my score; let me know if it's under par, spot on, or what


----------



## Tobiman

Quote:


> Originally Posted by *surrealalucard*
> 
> Alright, got 3dmark11 and ran the extreme test preset for benchmarks, everything on the gfx card is stock, fan running at 70% and the temps barely went over 65c.
> 
> Comp Specs:
> amd phenom II x6 1100t (tomorrow my fx8350 gets here.) @ 4ghz
> R9 290x of course! @ stock
> 8gb gddr 3 memory @ stock
> 
> Here is my score: let me know if its under par, spot on or what


Yeah, the Phenom is definitely holding it back. The FX should let it stretch its legs much better, though.


----------



## C-BuZz

Quote:


> Originally Posted by *Scorpion667*
> 
> Can someone please post minimum and average FPS for a 1100Mhz 290x in BF4 Multiplayer?
> Please use the following settings:
> 
> All Ultra except for shadows, keep these on mid.
> 
> AA disabled
> Ambient Occlusion disabled
> AF at 4x
> 1920x1080


Every setting maxed @ 1600p, 64-player map, Vsync on. Did not see it drop below 55 FPS @ 1100MHz.


----------



## tsm106

Let me drop this here...

Single @ 1200/1500 air

http://www.3dmark.com/fs/1069617



Trifire @ 1200/1500 air

http://www.3dmark.com/fs/1069571


----------



## szeged

Very nice run on tri; that's top-10 territory on just air.

Do you have any LN2 pots?


----------



## tsm106

Quote:


> Originally Posted by *szeged*
> 
> very nice run on tri, thats top 10 territory on just air.
> 
> Do you have any ln2 pots?


Nope, just a regular 24/7 rig. With LN2... I'd imagine the scores would be crazy. WTB EK blocks!


----------



## szeged

Quote:


> Originally Posted by *tsm106*
> 
> Nope, just a regular 24/7 rig. With ln2... I'd imagine the scores to be crazy. WTB ek blocks!


Man, if we lived near each other I would definitely hook you up with some LN2 so you could smash some records.


----------



## psyside

Quote:


> Originally Posted by *Taint3dBulge*
> 
> WOWZERS. Look at those Heat pipes on this 290X.. This is at the BF4 launch party..


Are you sure, those are 290X?


----------



## Hattifnatten

Those are 280X's.


----------



## skupples

Seems to be confirmed that the 780 Ti will be a 3GB, 2880-core GK110, i.e. about 200 more cores than Titan, @ $700.


----------



## psyside




----------



## battleaxe

Quote:


> Originally Posted by *psyside*
> 
> Are you sure, those are 290X?


I love how the top one is visibly sagging down. LOL


----------



## szeged

Quote:


> Originally Posted by *battleaxe*
> 
> I love how the top one is visibly sagging down. LOL


Yeah, I LOLed at that; that poor PCI-E slot is pulling some serious overtime there.


----------



## Forceman

Quote:


> Originally Posted by *skupples*
> 
> Seems to be confirmed that the 780 Ti will be a 3GB, 2880-core GK110, i.e. about 200 more cores than Titan, @ $700.


Confirmed where? Hopefully not that questionable GPU-z shot.


----------



## surrealalucard

Woo! Got a picture of my box for the 290X, and my OCN name is surrealalucard


----------



## tsm106

Quote:


> Originally Posted by *szeged*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Nope, just a regular 24/7 rig. With ln2... I'd imagine the scores to be crazy. WTB ek blocks!
> 
> 
> 
> man if we lived near each other i would definitely hook you up with some ln2 so you could smash some records.
Click to expand...

I've tossed around the idea of at least a CPU pot. Hmm... Anyway, here's a trifire 3DM11.

Trifire @ 1200/1500 air

http://www.3dmark.com/3dm11/7402935


----------



## Arizonian

Quote:


> Originally Posted by *skupples*
> 
> Seems to be confirmed that the 780 Ti will be a 3GB, 2880-core GK110, i.e. about 200 more cores than Titan, @ $700.


When I first read that for a brief second I thought I was in the Titan or 780 club. Had to re-look at the title.


----------



## wstanci3

Quote:


> Originally Posted by *tsm106*
> 
> I've tossed around the idea of at least a CPU pot. Hmm... Anyway, here's a trifire 3DM11.
> 
> Trifire @ 1200/1500 air
> 
> http://www.3dmark.com/3dm11/7402935


How does it compare to your tri-fire 7970s in 3dmark?


----------



## Sgt Bilko

Damn......that tri-fire


----------



## battleaxe

Quote:


> Originally Posted by *tsm106*
> 
> I've tossed around the idea of at least a CPU pot. Hmm... Anyway, here's a trifire 3DM11.
> 
> Trifire @ 1200/1500 air
> 
> http://www.3dmark.com/3dm11/7402935


Wow.... IDK what to say. What's the record at right now?


----------



## surrealalucard

So I noticed something on my 3DMark 11 results: it's showing my core clock at 300MHz, memory clock at 150MHz, and memory at only 3GB... what's up with that?


----------



## MrTOOSHORT

Quote:


> Originally Posted by *surrealalucard*
> 
> So I noticed something on my 3DMark 11 results: it's showing my core clock at 300MHz, memory clock at 150MHz, and memory at only 3GB... what's up with that?


When 3DMark 11 was reading the final info of your run, your card downclocked at that moment.

Totally normal.

Oh... nice TSM!


----------



## surrealalucard

Ahh, OK, good. Just making sure they didn't send me a box with a 280X in it, haha.


----------



## FtW 420

Quote:


> Originally Posted by *surrealalucard*
> 
> So I noticed something on my 3DMark 11 results: it's showing my core clock at 300MHz, memory clock at 150MHz, and memory at only 3GB... what's up with that?


In the Futuremark link? That's SystemInfo, which scans the system before the test runs; it frequently gets info wrong. It's handy to take a screenshot with the GPU-Z and CPU-Z tabs open to show the right CPU, memory, and GPU clocks for the bench.


----------



## spitty13

It has been a week since the 290X came out. Have we confirmed which ones have Hynix memory and which don't?


----------



## szeged

In stock at Newegg


----------



## tsm106

Quote:


> Originally Posted by *spitty13*
> 
> It has been a week since the 290X came out. Have we confirmed which ones have Hynix memory and which don't?


Memory type doesn't matter; once flashed, they all run the same speed. I have three cards, Hynix and Elpida, clocking up to 1600 at stock volts no problem on air.


----------



## $ilent

Tsm, is that 1200/1500 on air with the ASUS BIOS?

Also, anyone know what the v8 driver does update-wise? Do we have any word on a new Afterburner yet to provide voltage control?


----------



## tsm106

No other BIOS has voltage control, man, so yeah, it's the ASUS BIOS. Using the stock ASUS BIOS for now. The unlocked BIOS can wait for water.


----------



## Koniakki

Quote:


> Originally Posted by *Sgt Bilko*
> 
> In about 5 hours I'll have my 290X, VRM heatsinks, Accelero Xtreme III cooler, and the thermal paste and adhesive. So I'll run some stock air tests tonight, fit the Accelero on Friday, and then put the heatsinks on during next week (5 days for the adhesive to dry).
> 
> Here's hoping it all goes well


Hey Sgt, did you fit that Accelero? Or did anybody, guys? Sorry for asking again; I searched but didn't find anything.


----------



## psyside




----------



## szeged

Lots in stock at Newegg atm. Just grabbed a third; saving the 4th to buy from a user here.


----------



## Falkentyne

Quote:


> Originally Posted by *tsm106*
> 
> No other BIOS has voltage control, man, so yeah, it's the ASUS BIOS. Using the stock ASUS BIOS for now. The unlocked BIOS can wait for water.


TSM, sorry to interrupt, but can you get 1200/1500 on the stock (Sapphire) BIOS on air, or only on the ASUS BIOS? How are they different if you are not adjusting voltages?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Koniakki*
> 
> Hey Sgt did you fitted that accelero? Or did anybody guys? Sorry for asking again. I searched but didn't find anything.


Currently at work... yay for mobile data







I'll put my 7970 back in tonight, fit the Accelero to the 290X, and test it out tomorrow. In roughly 30 or so hours I'll have an answer for you


----------



## Paul17041993

Quote:


> Originally Posted by *psyside*
> 
> Are you sure, those are 290X?


Quote:


> Originally Posted by *Paul17041993*
> 
> mate, those are 280Xs...


----------



## $ilent

Quote:


> Originally Posted by *Falkentyne*
> 
> TSM, sorry to interrupt, but can you get 1200/1500 on the STOCK (sapphire) bios, on air, or only on the asus bios? How are they different if you are not adjusting voltages?


I'm curious to know this too


----------



## KyGuy

Hey guys, R9s in stock: PowerColor, MSI, and HIS. Hurry up and get one if you don't have one!!! They're at Newegg


----------



## VSG

Powercolor and MSI in stock now, is MSI a better choice?


----------



## VSG

Grrrrr out of stock as I pressed check out.


----------



## KyGuy

I would buy the MSI. Just personal preference.


----------



## Arizonian

Quote:


> Originally Posted by *surrealalucard*
> 
> Woo! Got a picture of my box for the 290X, and my OCN name is surrealalucard
> 
> 
> Spoiler: Warning: Spoiler!


Congrats added









Side note: the PowerColor 290X is available on Newegg.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131522


----------



## KyGuy

Quote:


> Originally Posted by *geggeg*
> 
> Grrrrr out of stock as I pressed check out.


Sorry Dude.....I bought my Sapphire 290x 4 days ago. Coming tomorrow


----------



## $ilent

Damn, these things go out of stock as soon as they appear! Huge demand


----------



## OrcishMonkey

haha sorry, I stole one; that ShopBLT preorder never panned out for my second card.


----------



## Arm3nian

Quote:


> Originally Posted by *spitty13*
> 
> It has been a week since 290x have been out. Have we confirmed which one have hynix memory and which ones don't?


We have shown that basically every brand uses both Elpida and Hynix. The BF4 Editions also do not guarantee that you get a specific brand of memory on the card; however, getting a BF4 Edition might increase your chances. I'm not sure if the non-BF4 editions also have Hynix on some cards, but we do know that the BF4 Editions use both.

On another note, it has been shown that the brand doesn't matter currently. It might on water/ln2, not sure, but on air they reach the same clocks.


----------



## szeged

rofl newegg auto notify is so awful, i already got another HIS 290x, and went and packed up another card i just sold, and newegg just now emailed me with their auto notify crap.


----------



## Arm3nian

Quote:


> Originally Posted by *szeged*
> 
> rofl newegg auto notify is so awful, i already got another HIS 290x, and went and packed up another card i just sold, and newegg just now emailed me with their auto notify crap.


I've been emailed 3 days after I snatched the item. If you want a product, don't use that notification tool lol.


----------



## jomama22

Quote:


> Originally Posted by *szeged*
> 
> rofl newegg auto notify is so awful, i already got another HIS 290x, and went and packed up another card i just sold, and newegg just now emailed me with their auto notify crap.


They are horrible tbh... I was lucky to be refreshing Newegg and TD at midnight on the 25th to grab 6.

I'll say, Newegg did let me order 3 cards over 2 different checkouts (as it's max 2 per order) and didn't say a word to me about it.

I just want Amazon or TD to get more, as I get 5% back from each and have a free Amazon GC to burn.

Though 6 should be enough...lol.


----------



## battleaxe

Is that promo code still working or is it dead? (newegg)


----------



## Arizonian

Quote:


> Originally Posted by *battleaxe*
> 
> Is that promo code still working or is it dead? (newegg)


The mobile promo is over. At least two days ago when I entertained the thought of a new PSU in my cart. I scrapped it.


----------



## battleaxe

Quote:


> Originally Posted by *Arizonian*
> 
> The mobile promo is over. At least two days ago when I entertained the thought of a new PSU in my cart. I scrapped it.


Bummer. Thanks


----------



## Slomo4shO

Quote:


> Originally Posted by *battleaxe*
> 
> Bummer. Thanks


$40 off $499 or more, $100 off $999 or more at Newegg Business
www.neweggbusiness.com

XB2B1028E40 for $40 off

XB2B1028E100 for $100 off

Promo expires 11/01/2013

Codes do work for the card assuming you can find them when they are in stock. I was hoping to use them for the 290 release but it has been pushed back to the 5th


----------



## MIGhunter

Quote:


> Originally Posted by *battleaxe*
> 
> I love how the top one is visibly sagging down. LOL


Quote:


> Originally Posted by *szeged*
> 
> yeah i loled at that, that poor pci-e slot is pullin some serious overtime there.


Is there a good solution to keep a card from sagging?


----------



## hotrod717

Newegg shows MSI and PowerColor in stock, but when you put one in the cart, it says insufficient stock.


----------



## Taint3dBulge

The Asus 290X has been taken off completely. Freakin' bummed out. I emailed them about it. Let's see if I get a reply..


----------



## hotrod717

Quote:


> Originally Posted by *Taint3dBulge*
> 
> The Asus 290x has been taken off completely. Freakn bummed out. I emailed them about it. lets see if i get a reply..


A few days ago another site had some Sapphires coming in 11/6 as well as a few other brands showing. They've removed most of them except Gigabyte. This is one befuddled launch.


----------



## kot0005

More pron! Unfortunately I won't be able to get it under water before the 13th of Nov as I have exams


----------



## VSG

Is that a ROG etch on the backplate?


----------



## tsm106

3dmark 11 Extreme, same settings as other benches

http://www.3dmark.com/3dm11/7403486


----------



## skupples

Quote:


> Originally Posted by *hotrod717*
> 
> A few days ago another site had some Sapphires coming in 11/6 as well as a few other brands showing. They've removed most of them except Gigabyte. This is one befuddled launch.


I think it likely has to do with the coming "epic" driver launch people are now talking about.


----------



## Taint3dBulge

Quote:


> Originally Posted by *hotrod717*
> 
> A few days ago another site had some Sapphires coming in 11/6 as well as a few other brands showing. They've removed most of them except Gigabyte. This is one befuddled launch.


Ya, very confusing as to what's going on.... Wish there would be some NEWS from some of these companies as to when they are going to resupply, and whether there are any problems getting their products to vendors.


----------



## Arizonian

OP updated with new 13.11 beta 8 driver update which fixes the crashing bug reported by Battlefield 4 players with Windows 8.


----------



## kot0005

Quote:


> Originally Posted by *geggeg*
> 
> Is that a ROG etch on the backplate?


Nah mate, it's a metallic sticker. Purchased it from modsticker.com


----------



## staryoshi

Quote:


> Originally Posted by *Taint3dBulge*
> 
> The Asus 290x has been taken off completely. Freakn bummed out. I emailed them about it. lets see if i get a reply..


They will continue to go in and out of stock, sometimes becoming unavailable (e.g. no longer on the website) in between. They'll be back.

Also, this is my post # 7777. I should go buy a lottery ticket.


----------



## Arm3nian

Quote:


> Originally Posted by *staryoshi*
> 
> They will continue to go in and out of stock, sometimes becoming unavailable (EG no longer on the website) in between. They'll be back.
> 
> Also, this is my post # 7777. I should go buy a lottery ticket.


Play the silicon lottery also.


----------



## Taint3dBulge

Quote:


> Originally Posted by *staryoshi*
> 
> They will continue to go in and out of stock, sometimes becoming unavailable (EG no longer on the website) in between. They'll be back.
> 
> Also, this is my post # 7777. I should go buy a lottery ticket.


That's what I'm hoping. I've been checking 10 times a day since launch though, and it's been on their page since then. Funny that it all of a sudden fell off, unless they get put on at midnight. Someone posted that they're all supposed to go on at midnight...


----------



## Sazz

Quote:


> Originally Posted by *Taint3dBulge*
> 
> The Asus 290x has been taken off completely. Freakn bummed out. I emailed them about it. lets see if i get a reply..


Remember, there are only so many of these BF4 Editions on sale worldwide (8k units if I remember right? Or 10k?), so if they completely removed the BF4 versions it may mean they really are gone and only the regular version is left.


----------



## Taint3dBulge

Quote:


> Originally Posted by *Sazz*
> 
> Remember there is only so many of these BF4 edition that will be on sale worldwide (8k units if I remember it right? or 10k?) so if they completely removed the BF4 versions it means that they may really be gone and the regular version would be left.


I've seen non-BF4 versions too. From what I was told there should be 40,000 more cards up for grabs. No idea if that's true; I might need to go back and find that post.


----------



## wolfej

I'm having the worst luck. One of my EK waterblocks cracked when I was taking the top off. I would have understood if I overtightened it but I was taking the damn thing off. Soooooo, now I get to wait for a replacement block on top of waiting for a replacement 290x.


----------



## tsm106

Quote:


> Originally Posted by *wolfej*
> 
> I'm having the worst luck. One of my EK waterblocks cracked when I was taking the top off. I would have understood if I overtightened it but I was taking the damn thing off. Soooooo, now I get to wait for a replacement block on top of waiting for a replacement 290x.


Was it acrylic? I couldn't imagine acetal cracking.

I was running some game benches and this thing absolutely flies in grid 2. Maxed out 16xeq ultra at 5760x1080 at stock, stock fan setting and all.



Spoiler: Warning: Spoiler!


----------



## wolfej

Quote:


> Originally Posted by *tsm106*
> 
> Was it acrylic? I couldn't imagine acetal cracking.
> 
> I was running some game benches and this thing absolutely flies in grid 2. Maxed out 16xeq ultra at 5760x1080 at stock, stock fan setting and all.
> 
> 
> 
> Spoiler: Warning: Spoiler!


Yeah it was acrylic. I had taken the two outside screws out and was unscrewing the middle one and heard a pop noise. Ugh.


----------



## tsm106

That could be a long delay. You should ask for an acetal replacement if you can swing it, so it doesn't happen again in the future. There's a reason why acetal is so popular besides the color. Good luck on your RMA.


----------



## jerrolds

Quote:


> Originally Posted by *tsm106*
> 
> Was it acrylic? I couldn't imagine acetal cracking.
> 
> I was running some game benches and this thing absolutely flies in grid 2. Maxed out 16xeq ultra at 5760x1080 at stock, stock fan setting and all.
> 
> 
> Spoiler: Warning: Spoiler!


Hey tsm - would you mind posting your settings on pushing your 290x that far with the stock cooler? 70% fan/PT3 bios/1.3v/+30 power limit/etc?

I'm currently at 1100/1400 on stock everything w/ uber bios, getting corruption at 1125MHz. I'll probably flash to the PT1 bios soon and try to hit 1150/1500MHz using a bit of a voltage/power limit bump.

Can't afford to put it under water atm









Thanks


----------



## Arm3nian

Quote:


> Originally Posted by *wolfej*
> 
> I'm having the worst luck. One of my EK waterblocks cracked when I was taking the top off. I would have understood if I overtightened it but I was taking the damn thing off. Soooooo, now I get to wait for a replacement block on top of waiting for a replacement 290x.


Sad news. Next time, try loosening all the screws little by little; this keeps the pressure more even.


----------



## Paulenski

Here's a graph I put together reading voltages and temps from AIDA64, using the Asus ROM with some overclocking. Some issues with vdroop though; I wish it would behave better to allow for more precision in voltage control and stability.

I'll have my MK-26 tomorrow, can't wait to run more benchmarks


----------



## Alexbo1101

So, a question for you guys who own a 290x, would it be worth selling my 7950 and get one, or should i just get a second 7950? (prices in Denmark are 4090 DKK for 290x and 1850 DKK for 7950)

Sorry for hijacking the thread









TIA
Alex


----------



## Sazz

Quote:


> Originally Posted by *Alexbo1101*
> 
> So, a question for you guys who own a 290x, would it be worth selling my 7950 and get one, or should i just get a second 7950? (prices in Denmark are 4090 DKK for 290x and 1850 DKK for 7950)
> 
> Sorry for hijacking the thread
> 
> 
> 
> 
> 
> 
> 
> 
> 
> TIA
> Alex


If you're playing at 1080p then I think the 7950 will still do you good. The only thing I don't like about crossfire is that it can have performance problems, especially in new games, so you'll be at the mercy of AMD until they fix the driver; but when crossfire works properly, it works well.

Another thing to consider is your power supply (will it be able to handle it?) and noise: will you be able to tolerate both cards at 100% on their stock coolers? That goes hand in hand with heat, too.

Also, crossfire doesn't double your performance. You'll probably get a ~95% boost from two-way crossfire in some games; others only get 70-80% of extra performance from the second card.

For me, I always opt for a single powerful card instead of two cards in crossfire, to eliminate the potential problem of drivers not working right in games (especially newly released ones).
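Those scaling numbers work out to a quick back-of-the-envelope estimate. Here's a minimal Python sketch; the 95% and 70-80% figures are the rough claims from this post, not measured data:

```python
# Rough 2-way crossfire FPS estimate: the second card adds only a
# fraction ("scaling") of a full card's performance, per-game.
def crossfire_fps(single_fps, scaling=0.95):
    """Estimate two-card FPS; scaling is the realized fraction of the
    second card's performance (roughly 0.70-0.95 depending on the game)."""
    return single_fps * (1 + scaling)

# A 7950 averaging 65 fps in BF3:
best = crossfire_fps(65, 0.95)   # a well-scaling title
worst = crossfire_fps(65, 0.70)  # a poorly-scaling title
print(best, worst)
```

So even in the best case you're looking at roughly 1.95x a single card, never a clean 2x.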


----------



## Alexbo1101

Quote:


> Originally Posted by *Sazz*
> 
> if you're playing on 1080p resolution then I think 7950 will still do you good, only thing I don't like on crossfire is that it can have some problems with performance, specially on new games. so you will be at mercy of AMD if there is a problem with the driver until they fix that problem but when crossfire works properly it works good.
> 
> another thing you may consider is your power supply, will it be able to handle it, and noise too. Are you gonna be able to tolerate both cards at 100% if you are using the stock coolers that it came with and that comes hand in hand with heat too.
> 
> Another thing is when you crossfire, you don't double your performance, you probably will get 95%ish or so on two way crossfire at some games, some games can only get upto 70-80% of extra boost from the second card.
> 
> For me I always opt in for a single powerful card instead of two cards crossfire so that I can eliminate the potential problem of drivers not being able to work right on games (specially newly released games).


Yup, I'm playing at 1080p, but on a 144hz screen and I'd really like to get more than 50-80 fps in BF3









My 750W power supply should be enough for two 7950s or a single 290X, and I intend to go down the wet road of water cooling sometime in the future.

Thanks!









EDIT: just ordered a Sapphire R9 290X with BF4...


----------



## petedread

Anyone else not been able to redeem their AMD BF4 code? I'm really fed up with not being able to play, especially as I paid more for a BF4-bundled GPU.


----------



## Alexbo1101

EU price gouging FTW, 4141,77 DKK is $761,95 USD


----------



## Paulenski

Quote:


> Originally Posted by *petedread*
> 
> Any one not been able to redeem their AMD BF4 code? I'm really fed up with not being able to play especially as I payed more for a BF4 bundled GPU.


They started working on Monday; it worked for me.


----------



## HornexPC

Quote:


> Originally Posted by *MIGhunter*
> 
> Is there a good solution to keep a card from sagging?


If you have a case with cable management, you can plug in the power cables, pull them over the top of the card and round the back of the case, then cable-tie them to the back/side of the case. This pulls the card up and prevents sagging; letting the cables simply hang from the card can pull it down.

Hope you get what I mean.

If not I can post a photo.


----------



## Sgt Bilko

Ok so I got my Accelero Xtreme III on, and I have good news and bad news.

Good news is I did a 15 min burn-in run in Furmark and maxed out at 71 degrees.

Bad news is a couple of my heatsinks fell off in the process -_- (I used the adhesive that came with it for those; the rest I mixed myself)

Apologies for the crappy pics; trying to take a photo one-handed with a smartphone is harder than you would think











http://imgur.com/a


----------



## flopper

Quote:


> Originally Posted by *Alexbo1101*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EU price gouging FTW, 4141,77 DKK is $761,95 USD


My 7970 runs 100fps in BF4 at 5040x1050 with my settings, better than BF3.
No rush to buy a 290 in the next few months, so I'll save up and wait for a deal


----------



## Alexbo1101

Quote:


> Originally Posted by *flopper*
> 
> my 7970 runs 100fps in bf4 at 5040x1050 with my settings, better than BF3
> No rush to buy a 290 the next few months so I save up and wait for the deal


Well... My 7950 doesn't want to overclock at all, so on high I usually get 60-70 fps in BF3; far from 144, hell... far from 120.


----------



## fleetfeather

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ok so i got my Accelero Xtreme III on and i have good news and bad news,
> 
> Good news is i did a 15 min Burn in run on Furmark and maxed out at 71 Degrees
> 
> Bad news is a couple of my Heatsinks fell of in the process -_- (I used the Adhesive that came with it for them, otherwise the rest i mixed myself)
> 
> Apoligies for the crappy pics, trying to take a photo one-handed with a smartphone is harder than you would think
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://imgur.com/a


Nice one and Bad luck









The Xtreme 3 is more than 2-slots tall, right? And can I ask, what is the effective length of the card when it has the Xtreme on it?


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ok so i got my Accelero Xtreme III on and i have good news and bad news,
> 
> Good news is i did a 15 min Burn in run on Furmark and maxed out at 71 Degrees
> 
> Bad news is a couple of my Heatsinks fell of in the process -_- (I used the Adhesive that came with it for them, otherwise the rest i mixed myself)
> 
> Apoligies for the crappy pics, trying to take a photo one-handed with a smartphone is harder than you would think
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://imgur.com/a


No, heatsinks for the VRMs? They get hot. Hotter than the core. Well, not really, but they do get hot.


----------



## rdr09

Quote:


> Originally Posted by *Alexbo1101*
> 
> Well... My 7950 doesn't want to overclock at all, so with high i usually get 60-70 fps in BF3, far from 144, hell... far from 120.


Here is BF3 MP, 1080p Ultra with 4xMSAA, on an i7 @ 4.5 and a stock 7950. I forgot what map; I think it was Caspian, which usually gets higher fps . . .

HT off and on



Not much difference in single, but crossfired . . . with HT off the chip was pegged at 100% usage.


----------



## Sgt Bilko

Quote:


> Originally Posted by *fleetfeather*
> 
> Nice one and Bad luck
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The Xtreme 3 is more than 2-slots tall, right? And can I ask, what is the effective length of the card when it has the Xtreme on it?


It takes up 3 slots, and as for the length, AC's website quotes these dimensions for the cooler itself: 288 (L) x 104 (W) x 54 (H) mm,
so on the 290X it's probably around 31-33cm long all up...........my messy cables don't help anything though

Pros: this is better than the stock cooler for sure: 34 degrees idle (100% fan speed and I can barely hear a thing)

Cons: there aren't enough heatsinks in the package for all the VRAM modules, so I needed to buy some extra, and it makes the card flex (hence the ninja wire)

All in all, it's not worth the money for a 290X. It's a way better option to put the cash towards a water cooling set-up with full cover blocks and backplates.
Quote:


> Originally Posted by *rdr09*
> 
> no, heatsinks for the vrms? they get hot. hotter than the core.





http://imgur.com/a


It's an album, not just a single pic









EDIT: Mine is a Sapphire BF4 Edition with Elpida mem, in case that adds to the confusion


----------



## Alexbo1101

Price doesn't hurt so much anymore; managed to sell my 7950 for $220









----------



## fleetfeather

Quote:


> Originally Posted by *Alexbo1101*
> 
> Price dosen't hurt so much anymore, managed to sell my 7950 for $220
> 
> 
> 
> 
> 
> 
> 
> .


I hope that was a golden card or something, otherwise your buyer paid too much







(great outcome for you though regardless!)


----------



## Alexbo1101

Quote:


> Originally Posted by *fleetfeather*
> 
> I hope that was a golden card or something, otherwise your buyer paid too much
> 
> 
> 
> 
> 
> 
> 
> (great outcome for you though regardless!)


It's actually $130 below MSRP in Denmark


----------



## fleetfeather

Quote:


> Originally Posted by *Alexbo1101*
> 
> It's actually $130 below MSRP in Denmark


ahhh, that makes sense then


----------



## Alexbo1101

Quote:


> Originally Posted by *fleetfeather*
> 
> ahhh, that makes sense then


We pay roughly 37% more in Denmark


----------



## fleetfeather

Quote:


> Originally Posted by *Alexbo1101*
> 
> We pay roughly 37% more in Denmark


that's a bummer









---

Aussies, XFX 290X is in stock: http://www.pccasegear.com/index.php?main_page=product_info&products_id=25612

Also... Have you guys seen the benchmarks for BF4 yet? Not only is the 290X pulling considerably more frames than our brothers in green, but it also has lower frame times! :O


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> it takes up 3 slots, and as for the length? AC's website quotes this for the Cooler itself: Dimensions: 288 (L) x 104 (W) x 54 (H) mm,
> so on the 290x it's probably in the vicinity of being around 31-33cm long all up...........my messy cables don't help anything though
> 
> Pros: This is better than the stock cooler for sure idle 34 degrees (100% fan speed and i can barely hear a thing)
> 
> Cons: there isn't enough heatsinks in the package for all the Vram mods so i needed to buy some extra and it makes the card flex (hence the Ninja wire)
> 
> All in all, it's not worth the money for a 290x, It's a way better option to put the cash towards a water cooling set-up with full cover blocks and backplates.
> 
> 
> http://imgur.com/a
> 
> 
> It's an album, not just a single pic
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: Mine is a Sapphire BF4 Edition with Elphida Mem in case that adds to the confusion


so you did put heatsinks on the highlighted area? i did not see, sorry.


----------



## Alexbo1101

Quote:


> Originally Posted by *fleetfeather*
> 
> that's a bummer
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ---
> 
> Aussies, XFX 290X is in stock: http://www.pccasegear.com/index.php?main_page=product_info&products_id=25612


Lol, now you aussies can't complain about hardware prices anymore


----------



## fleetfeather

Quote:


> Originally Posted by *Alexbo1101*
> 
> Lol, now you aussies can't complain about hardware prices anymore


hehe









---

Guys... we have a problem

http://pclab.pl/art55318-10.html

As someone who'll be playing with a 3570k + CF 290X, with windows 7, and 1600mhz RAM... I'm kinda annoyed right now lol


----------



## LazarusIV

Quick question for you all. I will be getting the Qnix 1440p monitor and I am not quite sure about the gfx card. I will be overclocking the monitor as much as I can but I don't care if I don't get 120Hz out of it. I'm perfectly happy with 96Hz or 108Hz too. Should I get 2 R9 280Xs, one R9 290 and maybe a second one down the road, or 2 R9 290s immediately? What do you guys think? I also will be watercooling the cards and my 8350 in the next couple of months. Thanks for the input all!


----------



## Alexbo1101

Quote:


> Originally Posted by *fleetfeather*
> 
> hehe
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ---
> 
> Guys... we have a problem
> 
> http://pclab.pl/art55318-10.html
> 
> As someone who'll be playing with a 3570k + CF 290X, with windows 7, and 1600mhz RAM... I'm kinda annoyed right now lol



What the hell?


----------



## fleetfeather

Quote:


> Originally Posted by *Alexbo1101*
> 
> 
> What the hell?


Sorry, I linked to the very last page of the review. I trust all of you can navigate back to the first page of the review and check it all out









Yeah, hopefully AMD's upcoming "awesome drivers" I've been hearing about bring some big changes


----------



## $ilent

Quote:


> Originally Posted by *LazarusIV*
> 
> Quick question for you all. I will be getting the Qnix 1440p monitor and I am not quite sure about the gfx card. I will be overclocking the monitor as much as I can but I don't care if I don't get 120Hz out of it. I'm perfectly happy with 96Hz or 108Hz too. Should I get 2 R9 280Xs, one R9 290 and maybe a second one down the road, or 2 R9 290s immediately? What do you guys think? I also will be watercooling the cards and my 8350 in the next couple of months. Thanks for the input all!


Depends on the game. On BF3 with everything maxed and 4xAA my average fps is around 60. If you want max fps with max settings you'd need two 290s. Turn AA off and you'd prob get 90fps with a single 290.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> so you did put heatsinks on the highlighted area? i did not see, sorry.


yes i did, i wasn't very comprehensive in my pics, sorry

Just played the first two missions in BF4 and was holding 60+ fps all the way in Ultra 1080p, very happy right now.

Will try out some more overclocking on the weekend and see if the new cooler makes any difference with it.

Quote:


> Originally Posted by *fleetfeather*
> 
> Sorry, I linked to the very last page of the review. I trust all of you can navigate back to the first page of the review and check it all out
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, hopefully AMD's upcoming "awesome drivers" I've been hearing about bring some big changes


Chrome won't translate......any chance you can give a quick summary?
Quote:


> Originally Posted by *Alexbo1101*
> 
> Lol, now you aussies can't complain about hardware prices anymore


Of course we can









You guys get HDDs and monitors cheaper than us on average. I know; my wife is Danish


----------



## fleetfeather

Quote:


> Originally Posted by *Sgt Bilko*
> 
> yes i did, i wasn't very comprehensive in my pics, sorry
> 
> Just played the first two missions in BF4 and was holding 60+ fps all the way in Ultra 1080p, very happy right now.
> 
> Will try out some more overclocking on the weekend and see if the new cooler makes any difference with it.
> Chrome won't translate......any chance you can give a quick summary?


I wasn't using translate either, but rather just looking at the graphs. In either case, these particular graphs stand out:


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> yes i did, i wasn't very comprehensive in my pics, sorry
> 
> Just played the first two missions in BF4 and was holding 60+ fps all the way in Ultra 1080p, very happy right now.
> 
> Will try out some more overclocking on the weekend and see if the new cooler makes any difference with it.
> Chrome won't translate......any chance you can give a quick summary?
> Of course we can
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You guys get HDD's and Monitors cheaper than us on average, i know, my wife is Danish


cool. about BF4, op already did test it with his 290X. also, we have this . . .

http://www.overclock.net/t/1430640/battlefield-4-beta-fps-database

created by $ilent. +rep to him.


----------



## fleetfeather

Quote:


> Originally Posted by *rdr09*
> 
> cool. about BF4, op already did test it with his 290X. also, we have this . . .
> 
> http://www.overclock.net/t/1430640/battlefield-4-beta-fps-database
> 
> created by $ilent. +rep to him.


indeed a solid effort by Silent, +rep

The data in the spreadsheet, however, is based on the beta, not the launch client as discussed in the review. Some people have already started calling the article out as misleading though. I guess we'll know for real in a few hours (it's actually the 1st of Nov here now)


----------



## rdr09

Quote:


> Originally Posted by *fleetfeather*
> 
> indeed a solid effort by Silent, +rep
> 
> the data in the spreadsheet however is based on the Beta, not the Launch client as discussed in the review. Some people have already starting calling the article out as misleading though. I guess we'll know for real in a few hours (it's actually the 1st Nov here now)


True. I wonder if he plans to continue? Anyway, I played it last night with my 7950 at stock, maxed out at 1080p, and it was lovely. The MP64 map I went to was totally different from the beta but looked more like the old BF3 with more eye candy. It used all cores of my Thuban @ 4GHz.


----------



## $ilent

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Just played the first two missions in BF4 and was holding 60+ fps all the way in Ultra 1080p, very happy right now.


I should think so, you should be able to hold 60fps at 1440p


----------



## cjp4eva

Quote:


> Originally Posted by *fleetfeather*
> 
> I wasn't using translate either, but rather just looking at the graphs. In either case, these particular graphs stand out:


From what I have seen in reviews and BF4 benches, the 780 & Titan beat the 290X at 1080p and lower resolutions. But above 1080p, be it multiple monitors or 1440p, the 290X starts gaining ground on Nvidia's cards and even beats them; the higher the resolution you play at, the more performance you get compared to Nvidia's cards.


----------



## sugarhell

Unwinder got his 290, so expect a new MSI AB soon, ~a week


----------



## jerrolds

Does WinFlash work? I haven't flashed a GPU's bios since I flash-modded a 6950 to a 6970. I can't find a USB stick I can use for atiflash


----------



## VSG

Quote:


> Originally Posted by *sugarhell*
> 
> Unwinder got his 290. So expect new msi ab soon ~a week


That's great news, I might well have a working afterburner by the time I set everything up.


----------



## tsm106

Quote:


> Originally Posted by *fleetfeather*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> yes i did, i wasn't very comprehensive in my pics, sorry
> 
> Just played the first two missions in BF4 and was holding 60+ fps all the way in Ultra 1080p, very happy right now.
> 
> Will try out some more overclocking on the weekend and see if the new cooler makes any difference with it.
> Chrome won't translate......any chance you can give a quick summary?
> 
> 
> 
> I wasn't using translate either, but rather just looking at the graphs. In either case, these particular graphs stand out:
> 
> 
> Spoiler: Warning: Spoiler!

Yea... ugh those charts are totally believable lol.


----------



## sugarhell

Quote:


> Originally Posted by *tsm106*
> 
> Yea... ugh those charts are totally believable lol.


I believe them


----------



## tsm106

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Yea... ugh those charts are totally believable lol.
> 
> 
> 
> I believe them

I'm selling my 3930k for an i5 tomorrow. Anyone want to trade for a 780 while we're at it?


----------



## anubis1127

Quote:


> Originally Posted by *tsm106*
> 
> I'm selling my 3930k for an i5 tomorrow. Anyone want to trade for a 780 while we're at it?


I just traded my 3930k + RIVF last weekend for an i5 3570k + z77 WS. I think you were being sarcastic, but I'm happy with the move.


----------



## tsm106

Quote:


> Originally Posted by *anubis1127*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I'm selling my 3930k for an i5 tomorrow. Anyone want to trade for a 780 while we're at it?
> 
> 
> 
> I just traded my 3930k + RIVF last weekend for an i5 3570k + z77 WS. I think you were being sarcastic, but I'm happy with the move.

What are you doing here? You forgot to trade for a 780!


----------



## anubis1127

Quote:


> Originally Posted by *tsm106*
> 
> What are you doing here? You forgot to trade for a 780!


Meh, I sold my 780 earlier this month, before the price drop. 

Waiting for a 290 non-reference now.


----------



## battleaxe

Quote:


> Originally Posted by *tsm106*
> 
> I'm selling my 3930k for an i5 tomorrow. Anyone want to trade for a 780 while we're at it?


Personally, I think you should move on up to the awesome i3. C'mon, if you're gonna do it, do it right.


----------



## $ilent

Quote:


> Originally Posted by *anubis1127*
> 
> Meh, I sold my 780 earlier this month, before the price drop.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Waiting for a 290 non-reference now.


Good choice, the 290 is gonna be stellar. I would have been so annoyed if I'd bought a 780 6 months ago. In fact I contemplated it at one point... but as soon as the 290X is out, Nvidia drop their prices in a flash. Just goes to show how comfortable they were taking the mick with their prices.


----------



## sugarhell

$ilent, new GPU-Z:

http://www.techpowerup.com/downloads/SysInfo/GPU-Z/


----------



## $ilent

Quote:


> Originally Posted by *sugarhell*
> 
> Silent new gpu-z
> 
> http://www.techpowerup.com/downloads/SysInfo/GPU-Z/


Nice post, +rep. What updates have we got? In fact nvm, found 'em:
Quote:


> Added support for AMD R9 290X, R9 290, R9 270, HD 7310, HD 8280
> Added support for NVIDIA GTX 780 Ti, GT 635, Quadro K3100M
> Added release date for AMD R7 260X, R7 250, R7 240
> Fixed release date for AMD R9 280X
> Fixed die size for AMD Tahiti
> Fixed ROP count on Ivy Bridge and Haswell
> Fixed BIOS saving not working on AMD cards without driver
> Fixed some rare crashes on systems with Intel VGA
> Render Test can be paused by left-clicking into window


We just need a new MSI AB now!


----------



## Arizonian

Quote:


> Originally Posted by *LazarusIV*
> 
> Quick question for you all. I will be getting the Qnix 1440p monitor and I am not quite sure about the gfx card. I will be overclocking the monitor as much as I can but I don't care if I don't get 120Hz out of it. I'm perfectly happy with 96Hz or 108Hz too. Should I get 2 R9 280Xs, one R9 290 and maybe a second one down the road, or 2 R9 290s immediately? What do you guys think? I also will be watercooling the cards and my 8350 in the next couple of months. Thanks for the input all!


*FPS must equal the Hz refresh rate* or you won't get the extra fluidity while gaming; instead you'll see stutter. A single 290X is just enough for a 1440p 60 Hz monitor at ultra settings in BF4, so you will need a two-GPU solution.

Your other problem is that with two GPUs you may have a bottleneck with your 8350 and not get full 100% GPU usage. Maybe someone with a bit more knowledge of AMD CPUs can chime in on this.

So if you're at 96 Hz and see dips that go 10 FPS or so below that, expect stuttering in game. It can be compensated for by lowering in-game settings. It takes quite a bit of GPU power to drive a 120 Hz 1080p monitor alone, and the added resolution requires even more.

Side note: remember that as you overclock your Qnix you will experience darkening and saturation of colors. There is an ICC profile you can download at the OCN Qnix club that compensates for this at both 96Hz and 120Hz, depending on how far you take it.
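For what it's worth, the frame-time arithmetic behind this FPS-vs-refresh advice can be sketched like this (a minimal illustration with made-up numbers, not benchmarks of any card):

```python
# Sketch of the frame-time budget behind matching FPS to refresh rate.
# The rates below are illustrative, not measurements.

def frame_time_ms(rate_hz: float) -> float:
    """Time budget per frame at a given rate, in milliseconds."""
    return 1000.0 / rate_hz

def misses_refresh_budget(fps: float, refresh_hz: float) -> bool:
    """True when frames take longer to render than the display's refresh
    interval, which is when judder/stutter can become visible."""
    return frame_time_ms(fps) > frame_time_ms(refresh_hz)

for hz in (60, 96, 120):
    print(f"{hz} Hz -> {frame_time_ms(hz):.2f} ms per frame")

# A dip to 80 FPS on a 96 Hz panel blows the budget; 120 FPS does not.
print(misses_refresh_budget(80, 96))   # True
print(misses_refresh_budget(120, 96))  # False
```

This is why a dip well below the refresh rate shows up as stutter: each frame overruns the interval the panel expects.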

Quote:


> Originally Posted by *Sgt Bilko*
> 
> it takes up 3 slots, and as for the length? AC's website quotes this for the Cooler itself: Dimensions: 288 (L) x 104 (W) x 54 (H) mm,
> so on the 290x it's probably in the vicinity of being around 31-33cm long all up...........my messy cables don't help anything though
> 
> Pros: This is better than the stock cooler for sure idle 34 degrees (100% fan speed and i can barely hear a thing)
> 
> Cons: there isn't enough heatsinks in the package for all the Vram mods so i needed to buy some extra and it makes the card flex (hence the Ninja wire)
> 
> All in all, it's not worth the money for a 290x, It's a way better option to put the cash towards a water cooling set-up with full cover blocks and backplates.
> 
> 
> http://imgur.com/a
> 
> 
> It's an album, not just a single pic
> 
> EDIT: Mine is a Sapphire BF4 Edition with Elphida Mem in case that adds to the confusion


Looks great.

Even matches the black 'n' white color scheme in my rig. Will be watching to see what kind of OC you get. Impressed with the fan noise you described.

Might be going this route over a non-reference. I'm just not sure which way I'm going to go until I see the non-reference cooling numbers and OC potential.

Quote:


> Originally Posted by *sugarhell*
> 
> Silent new gpu-z
> 
> http://www.techpowerup.com/downloads/SysInfo/GPU-Z/


Sweetness. Anyone experience any ill effects let us know. Seems to be reporting correctly so far.


----------



## jerrolds

Quote:


> Originally Posted by *Arizonian*
> 
> *FPS must equal the Hz refresh rate* or you won't get the extra fluidity while gaming; instead you'll see stutter. A single 290X is just enough for a 1440p 60 Hz monitor at ultra settings in BF4, so you will need a two-GPU solution.
> 
> Your other problem is that with two GPUs you may have a bottleneck with your 8350 and not get full 100% GPU usage. Maybe someone with a bit more knowledge of AMD CPUs can chime in on this.
> 
> So if you're at 96 Hz and see dips that go 10 FPS or so below that, expect stuttering in game. It can be compensated for by lowering in-game settings. It takes quite a bit of GPU power to drive a 120 Hz 1080p monitor alone, and the added resolution requires even more.


Don't think this is entirely true. I have a QNIX at 120Hz, and while I cannot get a solid 120fps in BF4, when it dips down to 80-90ish fps it's still a lot better than 60fps, as long as vsync is off. 120fps is ideal obviously, but 80-90fps still takes advantage of the monitor.

Currently playing the campaign with settings in between high and ultra with no AA/HBAO, and post AA set to low. It's really surprising how demanding BF4 is; hoping Mantle gives us a good 10+ fps bump.


----------



## devilhead

So, share your ASIC with the new GPU-Z.


----------



## $ilent

Can anyone confirm whether the new GPU-Z supports reading ASIC quality? I'd need to pull parts from another PC to check (my PC has no PSU atm).


----------



## Arizonian

Well I'm sure we'll be comparing all day on these. Still not sure how much weight to put into the numbers. I've seen some OC better than others which conflict with the ASIC scores.


----------



## anubis1127

On Tahiti it didn't seem to matter much, unless it was just terribly low.

On Kepler I've definitely seen a direct correlation.


----------



## $ilent

brb lemme just check mine heh


----------



## tsm106

Quote:


> Originally Posted by *anubis1127*
> 
> Tahiti it didn't seem to matter much, unless it was just terribly low.
> 
> Kepler I've definitely seen a direct correlation.


For Tahiti, ASIC mattered a lot, but not for determining overclocks; that is still down to luck. The ASIC rating determines how your silicon reacts to voltage, not just with droop but in hitting voltage walls, how well it scales clocks versus voltage added, etc.


----------



## M125

Meh.

One of my 7950s was 94%, the other was 73%. Never saw a difference between the two besides maybe 5-10 °C, and more than likely that was due to top card/bottom card anyway. ASIC quality probably carries a lot more weight once you reach OC levels in the 95th percentile and/or non-sustainable voltages (i.e. you're cooling with LN2). Its effect seems to be minimal with pedestrian day-to-day overclocks.


----------



## Mr357

I still have hope though


----------



## $ilent

Don't worry guys, these will fly on H2O!


----------



## Clockster

OK, so I got to the office this morning, only to find a box marked R9 290X and an invoice marked R9 290X. I opened the box and, to my surprise and dismay, it was full of Gigabyte R9 290 cards.
Naturally I was really upset and asked them what the hell. First they tried to convince us they were 290X cards, but after getting the CFO of the main supplier to come have a look, he confirmed that they were indeed R9 290 cards. So after hours of fighting I was told we won't have any R9 290X cards in SA until the end of November.

Guess I'll end up with R9 290s then.


----------



## Forceman

Pop one open and do some tests for us!


----------



## kcuestag

Got my waterblock today:

So far it dropped idle temperatures from 50-60ºC to 28-34ºC, let's try load temperatures now.


----------



## Slomo4shO

Asus 290X BF editions are in stock at Newegg


----------



## jomama22

Quote:


> Originally Posted by *kcuestag*
> 
> Got my waterblock today:
> 
> So far it dropped idle temperatures from 50-60ºC to 28-34ºC, let's try load temperatures now.


My 3 will be here today! While I'm stuck at work


----------



## TheSoldiet

Finally got my 290X!


----------



## Clockster

Quote:


> Originally Posted by *Forceman*
> 
> Pop one open and do some tests for us!


They were immediately removed from my office and I was told I could collect them on Monday morning.
Thing that pisses me off the most is they charged me for R9 290X cards and are now telling me those prices are correct...*** lol

Edit: You guys in the States have no idea how lucky you are, lol. We are paying roughly $730 incl. VAT, excl. BF4.
Just crazy...


----------



## Taint3dBulge

WOOT!!!!!!!

The Asus 290x is out at Newegg. Just got mine ordered.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814121819

*FINALLY*


----------



## Arizonian

Quote:


> Originally Posted by *TheSoldiet*
> 
> Finally got my 290X!
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - Added

Quote:


> Originally Posted by *Taint3dBulge*
> 
> WOOT!!!!!!!
> 
> The Asus 290x is out at newegg.. Just got myn ordered..
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814121819
> 
> *FINALLY*


Wow that went fast! Will await your pic for submission to roster.


----------



## utnorris

Plenty in stock.


----------



## VSG

Why are they $10 more?


----------



## Forceman

Quote:


> Originally Posted by *geggeg*
> 
> Why are they $10 more?


Asus tax.


----------



## Sazz

Anyone tried MSI AB beta 15 yet? Currently downloading it right now; let's see if this has voltage control for 290Xs.


----------



## Slomo4shO

XFX RADEON R9 290 is listed on Amazon for $469.99 hmm...


----------



## VSG

I am now debating if I should just wait till Tuesday for the 290 launch and see if I should crossfire my 290X with it instead.


----------



## utnorris

So tempted to get a second to play with CF.

Because they are Asus and Newegg knows they will sell.


----------



## VSG

Eh, I ended up buying it; the extra year of warranty sorta countered the $10 more than what I paid for the Sapphire version.


----------



## AndresR

Not sure if I should get the Asus right now, wait for the 290, or get a Classy 780. I don't care about the temps; I'll swap the stock cooler for a waterblock... decisions, decisions


----------



## VSG

Wait it out if you aren't already invested in either camp. I had already decided on CrossFire 290-series, so I had bought water blocks/backplates already. Anyway, I am finally done purchasing everything and can finally finish up my build.


----------



## Jpmboy

Where are you guys buying 290x's from?


----------



## VSG

Newegg currently has the Asus BF4 version available


----------



## Arthedes

Quote:


> Originally Posted by *DampMonkey*
> 
> Installed 10-28-2013


dat rig tho


----------



## formula m

Quote:


> Originally Posted by *Slomo4shO*
> 
> Asus 290X BF editions are in stock at Newegg


I bought two, about an hour ago..


----------



## skupples

Quote:


> Originally Posted by *Sazz*
> 
> Anyone tried the MSI AB beta 15 yet? currently dling it right now. let's see if this got voltage control for 290x's


I'm pretty sure AB is up to beta 16 @ this point, still w/o 290X support as far as I know.


----------



## Slomo4shO

Quote:


> Originally Posted by *formula m*
> 
> I bought two, about an hour ago..


You're welcome? The post you are quoting was up for an hour and 12 minutes before you posted, so I am unsure what message you are trying to convey...

The Asus cards are now sold out at Newegg.


----------



## formula m

Quote:


> Originally Posted by *Slomo4shO*
> 
> You're welcome? The post you are quoting was up for an hour and 12 minutes before you posted, so I am unsure what message you are trying to convey...
> 
> The Asus cards are now sold out at Newegg.


NewEgg app for my WP...

Hit refresh/search every so often.. Bingo!


----------



## kcuestag

I can say this card is not as hot as it looks.

The stock cooler is crap, but once under water, I took it to 1125/1400 (default BIOS) and played some Battlefield 4, it maxed at 51ºC with fans on very low (800-900RPM).

Very happy with the performance.


----------



## $ilent

Quote:


> Originally Posted by *fleetfeather*
> 
> indeed a solid effort by Silent, +rep
> 
> the data in the spreadsheet however is based on the Beta, not the Launch client as discussed in the review. Some people have already starting calling the article out as misleading though. I guess we'll know for real in a few hours (it's actually the 1st Nov here now)


Quote:


> Originally Posted by *rdr09*
> 
> true. i wonder if he plans to continue? anyway, i played it last night with my 7950 stock maxed out 1080 and it was lovely. MP64 i went to was totally different from the beta but looked more like the old BF3 with more eye candy. used all cores of my thuban @ 4GHz.


I've updated that BF4 database, guys, so you're now OK to put your full BF4 results in there. I have highlighted the beta results.

ASIC quality on mine is 71.8%... lowest on here so far! Yeehaw.


----------



## TheSoldiet

I'm having some trouble with my card. The GPU usage is weird: it goes up and down while in Windows, and in BF4 it is even worse! The card is throttling even at 68 °C! I only use MSI AB (3.0). My GPU usage in BF4 jumps from 99 percent to 1 percent every half second. Getting 40-60 FPS on ultra.

AMD fx 8350 4.5 GHz


----------



## Tobiman

Quote:


> Originally Posted by *TheSoldiet*
> 
> Im having some troubles with my card. The gpu usage is weird, it goes up and down while in Windows and in bf4 it is even worse! The card is throttling even at 68c! I only use msi ab (3.0) my gpu usage in bf is like 99 percent to 1 percent every half second. Getting 40-60 fps ultra.
> 
> AMD fx 8350 4.5 GHz


Don't use MSI ab.


----------



## $ilent

I get an error message about the memory checker file being invalid whenever I try to open it, anyone got any ideas?


----------



## LazarusIV

Quote:


> Originally Posted by *$ilent*
> 
> Depends on the game. On bf3 with everything max and 4xAA my average fps is around 60. If you want max fps with max settings youd need 2 290. Turn aa off and you prob get 90fps with a single 290.


I've heard that AA on a 1440p monitor is not quite as important as at lower resolutions, but maybe that's just for 4K. Regardless, I usually can't tell the difference too much with that kind of stuff; I'd just have to play some games both ways and see if I notice. Looks like I'll get a couple of 290s after the price has normalized... I'm going full water too, so I might as well wait...
Quote:


> Originally Posted by *Arizonian*
> 
> *FPS must equal the Hz refresh rate* or you won't get the extra fluidity while gaming; instead you'll see stutter. A single 290X is just enough for a 1440p 60 Hz monitor at ultra settings in BF4, so you will need a two-GPU solution.
> 
> Your other problem is that with two GPUs you may have a bottleneck with your 8350 and not get full 100% GPU usage. Maybe someone with a bit more knowledge of AMD CPUs can chime in on this.
> 
> So if you're at 96 Hz and see dips that go 10 FPS or so below that, expect stuttering in game. It can be compensated for by lowering in-game settings. It takes quite a bit of GPU power to drive a 120 Hz 1080p monitor alone, and the added resolution requires even more.
> 
> Side note: remember that as you overclock your Qnix you will experience darkening and saturation of colors. There is an ICC profile you can download at the OCN Qnix club that compensates for this at both 96Hz and 120Hz, depending on how far you take it.


I'd much rather get 2 for the horsepower, better safe than sorry I suppose. Plus with blocks on both and the CPU I should see some nice OCs. If SLI 780s doesn't bottleneck an 8350 then I don't think Crossfire 290s will too much, if at all. I guess we'll find out!

Quote:


> Originally Posted by *jerrolds*
> 
> Don't think this is entirely true. I have a QNIX at 120Hz, and while I cannot get a solid 120fps in BF4, when it dips down to 80-90ish fps it's still a lot better than 60fps, as long as vsync is off. 120fps is ideal obviously, but 80-90fps still takes advantage of the monitor.
> 
> Currently playing the campaign with settings in between high and ultra with no AA/HBAO, and post AA set to low. It's really surprising how demanding BF4 is; hoping Mantle gives us a good 10+ fps bump.


I haven't noticed any stuttering on my 1200p monitor when frame rates drop in BF3 or other games so I don't think it'll be much of an issue on a 1440p monitor. I just need to have enough horsepower to crush those extra pixels. I never play with VSync on, it looks like crap to me for some reason.

Thanks everyone for the input, I appreciate it!


----------



## GioV

Quote:


> Originally Posted by *kcuestag*
> 
> I can say this card is not as hot as it looks.
> 
> The stock cooler is crap, but once under water, I took it to 1125/1400 (default BIOS) and played some Battlefield 4, it maxed at 51ºC with fans on very low (800-900RPM).
> 
> Very happy with the performance.


Just added a WB as well! Achieved the same clocks and temps as you; let me know how far you can push the card on stock voltage. Did you use the TIM provided with the EK WB? I feel that with a custom loop we should be able to get this card well below 50 °C, as other users have reported.

ASIC Quality is 74.7%


----------



## Mas

Quote:


> Originally Posted by *Tobiman*
> 
> Don't use MSI ab.


Is there anything available at the moment that does support the 290X?


----------



## TheSoldiet

Quote:


> Originally Posted by *Tobiman*
> 
> Don't use MSI ab.


So you are suggesting that I should use catalyst control center?


----------



## $ilent

Quote:


> Originally Posted by *TheSoldiet*
> 
> So you are suggesting that I should use catalyst control center?


yes


----------



## Paul17041993

Quote:


> Originally Posted by *TheSoldiet*
> 
> So you are suggesting that I should use catalyst control center?


Why not? lol. You only need to apply the registry patch and you can set the clocks in there to whatever you want. If you want voltage control, the only option atm is to use GPU Tweak with the ASUS BIOS (compatible with any brand of card).

You could try TriXX if you want; not sure how effective it is, but I've heard people have been using it...


----------



## Mr357

Quote:


> Originally Posted by *TheSoldiet*
> 
> So you are suggesting that I should use catalyst control center?


For the time being, yes.


----------



## TheSoldiet

Thx guys. I will try the Asus GPU Tweak and Catalyst.

Btw, does anybody know how to get temps, GPU usage, etc. in the OSD without AB?


----------



## jomama22

Quote:


> Originally Posted by *TheSoldiet*
> 
> Thx guys. I will try the asus gpu tweak and catalyst.
> 
> Btw does anybody know how to get temps, gpu usage etc....in osd whitout AB?


GPU Tweak should only be used if you have flashed one of the three Asus BIOSes out in the wild. That will also grant you voltage control.

I got the computer set up for the air tests of six 290Xs this morning and will start the tests tonight when I get home. I will have stock OC vs Asus BIOS (1.3V) results in the early hours.

Water blocks come today as well; I hope to have all six water numbers (no-vdroop BIOS/1.42V) by Saturday night at the latest.


----------



## wolfej

Well, FrozenCPU just lost any future business from me. I was trying to get an RMA from my waterblock that cracked ( I mentioned it earlier in the thread) and couldn't find what to do on
FrozenCPU's website. Well after sending like 3 or 4 emails with no reply I finally get a reply from the freaking owner of the business and I'll just quote it here for you.

"This is ridiculous - who just sends things back like with no permission or procedure ?

Stop his new order and I want the other REFUSED"

-Mark FrozenCPU

What's ridiculous is not just that he sent something so disrespectful and rude to a customer; they didn't even read what I had sent them about requesting an RMA. Instead this jerk jumps to conclusions and wants my order cancelled, when their own website says you can order a new one as a replacement and get reimbursed once the old one is received by them.

I've never had worse customer service in my entire life.

Edit: HOLY CRAP Here's another email.

"Not with VGA return policy, you don't bust your block yourself order a freshie and dump your broken stuff on my dock. That's not it works.

We get a photo of what you have and get ek to send you a replacement. Ya don't make up your scenario that works for you the best. "

I haven't even sent it to them yet.

Edit2: Here's another goodie

"You have a VGA return, you did not follow any rules regarding.

You deal with ek direct on a broken top that you cracked, you dont like the way the rules shop somewhere else, were sick and tired of you guys breaking your own stuff and just returning it without even telling us"

Once again, the box is literally sitting right next to me. I have no idea what he is talking about.


----------



## Mas

Quote:


> Originally Posted by *wolfej*
> 
> Well, FrozenCPU just lost any future business from me. I was trying to get an RMA from my waterblock that cracked ( I mentioned it earlier in the thread) and couldn't find what to do on
> FrozenCPU's website. Well after sending like 3 or 4 emails with no reply I finally get a reply from the freaking owner of the business and I'll just quote it here for you.
> 
> "This is ridiculous - who just sends things back like with no permission or procedure ?
> 
> Stop his new order and I want the other REFUSED"
> 
> -Mark FrozenCPU
> 
> What is ridiculously rude about this is he should not have sent this to a customer which is extremely disrespectful and rude, but they didn't even read what I had sent them about requesting an RMA. Instead this jerk jumps to conclusions and wants my order cancelled when it says you can order a new one as a replacement and get reimbursed once the old one is received by them on their website.
> 
> I've never had worse customer service in my entire life.
> 
> Edit: HOLY CRAP Here's another email.
> 
> "Not with VGA return policy, you don't bust your block yourself order a freshie and dump your broken stuff on my dock. That's not it works.
> 
> We get a photo of what you have and get ek to send you a replacement. Ya don't make up your scenario that works for you the best. "
> 
> I haven't even sent it to them yet.
> 
> Edit2: Here's another goodie
> 
> "You have a VGA return, you did not follow any rules regarding.
> 
> You deal with ek direct on a broken top that you cracked, you dont like the way the rules shop somewhere else, were sick and tired of you guys breaking your own stuff and just returning it without even telling us"
> 
> Once again, the box is literally sitting right next to me. I have no idea what he is talking about.


Yeah that seems pretty off to me.

OCN boycott FrozenCPU IMO


----------



## wolfej

I'm at a loss for words. I didn't even do anything other than request an RMA and order a new one which they say you can do on their website.

"How do I get my replacement quicker?

If you would like to receive a replacement quicker, then you may purchase the identical item again. Then you can ship us the original item with a copy of both receipts and we will refund you for the price of said item if the return is received within 30 days."


----------



## Falkentyne

Quote:


> Originally Posted by *wolfej*
> 
> Well, FrozenCPU just lost any future business from me. I was trying to get an RMA from my waterblock that cracked ( I mentioned it earlier in the thread) and couldn't find what to do on
> FrozenCPU's website. Well after sending like 3 or 4 emails with no reply I finally get a reply from the freaking owner of the business and I'll just quote it here for you.
> 
> "This is ridiculous - who just sends things back like with no permission or procedure ?
> 
> Stop his new order and I want the other REFUSED"
> 
> -Mark FrozenCPU
> 
> What is ridiculously rude about this is he should not have sent this to a customer which is extremely disrespectful and rude, but they didn't even read what I had sent them about requesting an RMA. Instead this jerk jumps to conclusions and wants my order cancelled when it says you can order a new one as a replacement and get reimbursed once the old one is received by them on their website.
> 
> I've never had worse customer service in my entire life.
> 
> Edit: HOLY CRAP Here's another email.
> 
> "Not with VGA return policy, you don't bust your block yourself order a freshie and dump your broken stuff on my dock. That's not it works.
> 
> We get a photo of what you have and get ek to send you a replacement. Ya don't make up your scenario that works for you the best. "
> 
> I haven't even sent it to them yet.
> 
> Edit2: Here's another goodie
> 
> "You have a VGA return, you did not follow any rules regarding.
> 
> You deal with ek direct on a broken top that you cracked, you dont like the way the rules shop somewhere else, were sick and tired of you guys breaking your own stuff and just returning it without even telling us"
> 
> Once again, the box is literally sitting right next to me. I have no idea what he is talking about.


^^
This is why I stopped caring about the human race and most people in this world.
People are EVIL, SATANIC and downright FLESH maggots, in the vast majority of cases.


----------



## battleaxe

Quote:


> Originally Posted by *wolfej*
> 
> Well, FrozenCPU just lost any future business from me. I was trying to get an RMA from my waterblock that cracked ( I mentioned it earlier in the thread) and couldn't find what to do on
> FrozenCPU's website. Well after sending like 3 or 4 emails with no reply I finally get a reply from the freaking owner of the business and I'll just quote it here for you.
> 
> "This is ridiculous - who just sends things back like with no permission or procedure ?
> 
> Stop his new order and I want the other REFUSED"
> 
> -Mark FrozenCPU
> 
> What is ridiculously rude about this is he should not have sent this to a customer which is extremely disrespectful and rude, but they didn't even read what I had sent them about requesting an RMA. Instead this jerk jumps to conclusions and wants my order cancelled when it says you can order a new one as a replacement and get reimbursed once the old one is received by them on their website.
> 
> I've never had worse customer service in my entire life.
> 
> Edit: HOLY CRAP Here's another email.
> 
> "Not with VGA return policy, you don't bust your block yourself order a freshie and dump your broken stuff on my dock. That's not it works.
> 
> We get a photo of what you have and get ek to send you a replacement. Ya don't make up your scenario that works for you the best. "
> 
> I haven't even sent it to them yet.
> 
> Edit2: Here's another goodie
> 
> "You have a VGA return, you did not follow any rules regarding.
> 
> You deal with ek direct on a broken top that you cracked, you dont like the way the rules shop somewhere else, were sick and tired of you guys breaking your own stuff and just returning it without even telling us"
> 
> Once again, the box is literally sitting right next to me. I have no idea what he is talking about.


That's enough for me; I'm not doing business with someone like that. Thanks for the heads up. There are plenty of companies out there that actually want our business. Go away, and I'm betting he will, with an attitude like that.


----------



## Forceman

Quote:


> Originally Posted by *wolfej*
> 
> Well, FrozenCPU just lost any future business from me. I was trying to get an RMA from my waterblock that cracked ( I mentioned it earlier in the thread) and couldn't find what to do on
> FrozenCPU's website. Well after sending like 3 or 4 emails with no reply I finally get a reply from the freaking owner of the business and I'll just quote it here for you.
> 
> "This is ridiculous - who just sends things back like with no permission or procedure ?
> 
> Stop his new order and I want the other REFUSED"
> 
> -Mark FrozenCPU
> 
> What is ridiculously rude about this is he should not have sent this to a customer which is extremely disrespectful and rude, but they didn't even read what I had sent them about requesting an RMA. Instead this jerk jumps to conclusions and wants my order cancelled when it says you can order a new one as a replacement and get reimbursed once the old one is received by them on their website.
> 
> I've never had worse customer service in my entire life.
> 
> Edit: HOLY CRAP Here's another email.
> 
> "Not with VGA return policy, you don't bust your block yourself order a freshie and dump your broken stuff on my dock. That's not it works.
> 
> We get a photo of what you have and get ek to send you a replacement. Ya don't make up your scenario that works for you the best. "
> 
> I haven't even sent it to them yet.
> 
> Edit2: Here's another goodie
> 
> "You have a VGA return, you did not follow any rules regarding.
> 
> You deal with ek direct on a broken top that you cracked, you dont like the way the rules shop somewhere else, were sick and tired of you guys breaking your own stuff and just returning it without even telling us"
> 
> Once again, the box is literally sitting right next to me. I have no idea what he is talking about.


Wow. I was just about to drop ~$450 on a custom water loop from them. Like, literally tonight. Maybe I'll shop around a little more now.


----------



## VSG

Wow that sucks man. I will call them up and request them to reconsider your situation. If other people call up and complain, they may yet do the right thing.


----------



## wolfej

Well if I can save someone from having to deal with a person like that then I guess it was worth it. My blood is boiling right now after reading the emails he's sent me. I can't believe that is the owner of a company.


----------



## battleaxe

Quote:


> Originally Posted by *wolfej*
> 
> Well if I can save someone from having to deal with a person like that then I guess it was worth it. My blood is boiling right now after reading the emails he's sent me. I can't believe that is the owner of a company.


Not for long if he keeps that kind of garbage up.


----------



## $ilent

Dude, you should post them on every forum. Then drop him a snotty email in a week or two and ask if his sales are down.


----------



## scyy

Wow, I was planning on ordering my loop from them.


----------



## VSG

I still don't understand what those emails mean. In this context they make no sense.


----------



## alancsalt

Profanity now edited out. Please do not quote posts full of asterisks.


----------



## tsm106

Quote:


> Originally Posted by *wolfej*
> 
> Well if I can save someone from having to deal with a person like that then I guess it was worth it. My blood is boiling right now after reading the emails he's sent me. I can't believe that is the owner of a company.


He saved you some time, actually. Though I agree they handled it extremely poorly, you did break the acrylic. You chose to take the top off for whatever reason... when dealing with acrylic under suspension/pressure, you have to be very careful not to crack it. Imagine that it had gone through the proper channels and the case got to EK: you would have promptly been denied warranty. Shrugs, seems rather obvious to me.

Did you loosen the screws all at the same time, little by little, or one at a time?


----------



## Darlinangel

From a business standpoint he is correct about the product: he has the right to refuse a refund if the item was in good condition when received, and if it was bad quality, he is right that it should be taken up with the manufacturer.

However, from the standpoint of being a customer, being abused like that is a huge no-no. You do not insult or abuse someone; it makes no business sense at all to give yourself a bad reputation for customer support, and it makes people afraid to purchase just in case they do have legitimate problems.


----------



## KyGuy

Hey guys, just got my Sapphire 290x from UPS. Don't have time to put it in now, but do you think it will fit in an Antec 300 Illusion? I read that the card is 10.8in long, and I believe the Illusion can fit an 11 inch card. Is this correct? I only have one drive, so the HDD bay issue won't be a problem.....KyGuy


----------



## BroJin

I would call your credit card company and request a stop payment and let them take care of this issue. If you used AMEX, they would get your money back quick.


----------



## PillarOfAutumn

Any ideas when the non-BF4 edition of the 290x will go on sale?
Specifically talking about newegg.


----------



## BackwoodsNC

Quote:


> Originally Posted by *wolfej*
> 
> Well, FrozenCPU just lost any future business from me. I was trying to get an RMA from my waterblock that cracked ( I mentioned it earlier in the thread) and couldn't find what to do on
> FrozenCPU's website. Well after sending like 3 or 4 emails with no reply I finally get a reply from the freaking owner of the business and I'll just quote it here for you.
> 
> "This is ridiculous - who just sends things back like with no permission or procedure ?
> 
> Stop his new order and I want the other REFUSED"
> 
> -Mark FrozenCPU
> 
> What is ridiculously rude about this is he should not have sent this to a customer which is extremely disrespectful and rude, but they didn't even read what I had sent them about requesting an RMA. Instead this jerk jumps to conclusions and wants my order cancelled when it says you can order a new one as a replacement and get reimbursed once the old one is received by them on their website.
> 
> I've never had worse customer service in my entire life.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: HOLY CRAP Here's another email.
> 
> "Not with VGA return policy, you don't bust your block yourself order a freshie and dump your broken stuff on my dock. That's not it works.
> 
> We get a photo of what you have and get ek to send you a replacement. Ya don't make up your scenario that works for you the best. "
> 
> I haven't even sent it to them yet.
> 
> Edit2: Here's another goodie
> 
> "You have a VGA return, you did not follow any rules regarding.
> 
> You deal with ek direct on a broken top that you cracked, you dont like the way the rules shop somewhere else, were sick and tired of you guys breaking your own stuff and just returning it without even telling us"
> 
> Once again, the box is literally sitting right next to me. I have no idea what he is talking about.


Dude that sucks.

I had to RMA something with them once. It was a EK CPU block that had corroded. They accepted it no problem, maybe their CS has changed since then.


----------



## djriful

Quote:


> Originally Posted by *wolfej*
> 
> Well, FrozenCPU just lost any future business from me. I was trying to get an RMA from my waterblock that cracked ( I mentioned it earlier in the thread) and couldn't find what to do on
> FrozenCPU's website. Well after sending like 3 or 4 emails with no reply I finally get a reply from the freaking owner of the business and I'll just quote it here for you.
> 
> "This is ridiculous - who just sends things back like with no permission or procedure ?
> 
> Stop his new order and I want the other REFUSED"
> 
> -Mark FrozenCPU
> 
> What is ridiculously rude about this is he should not have sent this to a customer which is extremely disrespectful and rude, but they didn't even read what I had sent them about requesting an RMA. Instead this jerk jumps to conclusions and wants my order cancelled when it says you can order a new one as a replacement and get reimbursed once the old one is received by them on their website.
> 
> I've never had worse customer service in my entire life.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: HOLY CRAP Here's another email.
> 
> "Not with VGA return policy, you don't bust your block yourself order a freshie and dump your broken stuff on my dock. That's not it works.
> 
> We get a photo of what you have and get ek to send you a replacement. Ya don't make up your scenario that works for you the best. "
> 
> I haven't even sent it to them yet.
> 
> Edit2: Here's another goodie
> 
> "You have a VGA return, you did not follow any rules regarding.
> 
> You deal with ek direct on a broken top that you cracked, you dont like the way the rules shop somewhere else, were sick and tired of you guys breaking your own stuff and just returning it without even telling us"
> 
> Once again, the box is literally sitting right next to me. I have no idea what he is talking about.


File a complaint here: http://www.bbb.org/upstate-new-york/business-reviews/computers-supplies-and-parts/frozencpu-com-inc-in-east-rochester-ny-30001156/


----------



## Paul17041993

Quote:


> Originally Posted by *wolfej*
> 
> Well, FrozenCPU just lost any future business from me. I was trying to get an RMA from my waterblock that cracked ( I mentioned it earlier in the thread) and couldn't find what to do on
> FrozenCPU's website. Well after sending like 3 or 4 emails with no reply I finally get a reply from the freaking owner of the business and I'll just quote it here for you.
> 
> "This is ridiculous - who just sends things back like with no permission or procedure ?
> 
> Stop his new order and I want the other REFUSED"
> 
> -Mark FrozenCPU
> 
> What is ridiculously rude about this is he should not have sent this to a customer which is extremely disrespectful and rude, but they didn't even read what I had sent them about requesting an RMA. Instead this jerk jumps to conclusions and wants my order cancelled when it says you can order a new one as a replacement and get reimbursed once the old one is received by them on their website.
> 
> I've never had worse customer service in my entire life.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: HOLY CRAP Here's another email.
> 
> "Not with VGA return policy, you don't bust your block yourself order a freshie and dump your broken stuff on my dock. That's not it works.
> 
> We get a photo of what you have and get ek to send you a replacement. Ya don't make up your scenario that works for you the best. "
> 
> I haven't even sent it to them yet.
> 
> Edit2: Here's another goodie
> 
> "You have a VGA return, you did not follow any rules regarding.
> 
> You deal with ek direct on a broken top that you cracked, you dont like the way the rules shop somewhere else, were sick and tired of you guys breaking your own stuff and just returning it without even telling us"
> 
> Once again, the box is literally sitting right next to me. I have no idea what he is talking about.


holy crap, I have to keep asking myself if that's even real; that doesn't sound anything like customer support...

what are they even talking about...? A returned graphics card...? An RMA needs an RMA number to start with or it just gets sent back through postage, so I don't even get what they mean...


----------



## pioneerisloud

If I'm not mistaken, HE broke the acrylic on his block himself, and then decided to try to defraud frozencpu into a refund. That's what I understood of the situation, and to be frank, I don't blame Mark one bit.


----------



## wolfej

Quote:


> Originally Posted by *pioneerisloud*
> 
> If I'm not mistaken, HE broke the acrylic on his block himself, and then decided to try to defraud frozencpu into a refund. That's what I understood of the situation, and to be frank, I don't blame Mark one bit.


Defraud? I was asking for an RMA for the block because I followed the directions and it cracked. I have emailed ek about it now and he should have just said to do so to begin with. Literally cussing me out for inquiring about an RMA is ridiculous. I did what his website said to do for getting returns quicker. I didn't see anything about water blocks being a special case.

I will forward anyone the email gladly so you can see what was going on and accusing me of defrauding someone is ridiculous.

I was taking the top off to put on the ek bridge. If ek turns me down then I'll just try to fix it myself but I didn't do anything wrong as I emailed them to get an RMA and I have not shipped it to them like he kept saying. I literally asked what I should do for an RMA.


----------



## Slomo4shO

Does this discussion really belong in this thread? You cracked your block less than 24 hours ago, sent multiple emails, and provided a one-sided account of the events... There are probably better means of destroying the retailer's reputation than derailing this thread...


----------



## PillarOfAutumn

Sorry, I'm very new to watercooling, but is it normal to take the acrylic top off? I'm thinking the guy probably thought you were trying to modify it somehow. You should send him an email to tell him that what you were doing was necessary, per the manufacturer's instructions, and that it caused the crack. But I disagree with that guy's wording in his email to you. It sounded like some uneducated hick who was angry that his wife bought two lamps and overran his credit limit and decided to take the anger out on you.


----------



## VSG

His 290X wasn't working as well as he would have liked, but it already had the water block installed. So in order to RMA the GPU, he had to take off the block and put the stock cooler back on. While removing the block, he cracked the acrylic part, and that's where this secondary issue began.


----------



## Arm3nian

Quote:


> Originally Posted by *wolfej*
> 
> Defraud? I was asking for an RMA for the block because I followed the directions and it cracked. I have emailed ek about it now and he should have just said to do so to begin with. Literally cussing me out for inquiring about an RMA is ridiculous. I did what his website said to do for getting returns quicker. I didn't see anything about water blocks being a special case.
> 
> I will forward anyone the email gladly so you can see what was going on and accusing me of defrauding someone is ridiculous.
> 
> I was taking the top off to put on the ek bridge. If ek turns me down then I'll just try to fix it myself but I didn't do anything wrong as I emailed them to get an RMA and I have not shipped it to them like he kept saying. I literally asked what I should do for an RMA.


The RMA directions are for something that arrives DOA. Physical damage isn't covered under any warranty; EK would have rejected it also. Acrylic is not the strongest material: it can be cut with a razor blade and cracks under pressure. What you should have done is loosen the screws little by little in turn, not remove them one by one. Removing them one at a time gives the same effect as overtightening. You should have known that.


----------



## Sgt Bilko

The XFX 290x (non-BF4) was on sale at PCCG... for about $6 cheaper than the BF4 edition... that's just stupid.

So regular 290x's are appearing now, for those that asked.


----------



## wolfej

Quote:


> Originally Posted by *Arm3nian*
> 
> The RMA directions are for something that arrives DOA. Physical damage isn't covered under any warranty. EK would have rejected it also. Acrylic is not the strongest material, it can be cut with a razor blade and cracks under pressure. What you should have done is untighten the screws a little by little, not remove them one by one. Removing them one by one gives the same effect as over tightening. You should have known that.


Like I said, if that is the case then fine I'll try to fix it myself. I didn't put the waterblock on the GPU yet. I was taking the top off to get the EK bridge ready.

He could have easily just told me to contact the manufacturer instead of cussing me out. Like I said I'm not giving a one sided account, I will gladly forward anyone the entire series of emails.

I'm just trying to warn people about that, as I think my block had a defect because taking the screws out should not have broken it, in my opinion. If EK says it's my fault then oh well. Like I said, he could have easily said that in the emails.

I have not lied or hidden anything about what happened and was just showing it to people that might buy waterblocks from them.


----------



## Arm3nian

Quote:


> Originally Posted by *wolfej*
> 
> Like I said, if that is the case then fine I'll try to fix it myself. I didn't put the waterblock on the GPU yet. I was taking the top off to get the EK bridge ready.
> 
> He could have easily just told me to contact the manufacturer instead of cussing me out. Like I said I'm not giving a one sided account, I will gladly forward anyone the entire series of emails.
> 
> I'm just trying to warn people about that as I think my block had a defect because taking the screw out should not have broke it in my opinion. If EK says it's my fault then oh well. Like I said, he could have easily said that in the emails.
> 
> I have not lied or hidden anything about what happened and was just showing it to people that might buy waterblocks from them.


He got mad because you tried to screw his company. As you said, if you think it is a manufacturer defect, then you contact EK and send it to them. Instead, you sent it to Frozen CPU without contacting them first. They had nothing to do with it and your action was going to cost them money.

I have two acrylic water blocks on their way from Frozen CPU. I know how to use them; if I see that there is something wrong with a block, I will contact EK and tell them they have a badly designed product and need to send me a new one, not try to get one from the retailer who had nothing to do with the product. Frozen CPU's job is to get the item to you in good shape, and it seems they did that.

If you do not know what the correct procedure is, you should always send both companies emails and *wait* for their response.


----------



## wolfej

Quote:


> Originally Posted by *Arm3nian*
> 
> He got mad because you tried to screw his company. As you said, if you think it is a manufacturer defect,then you contact EK and send it to them. Instead, you sent it to Frozen CPU w/o contacting them first. They had nothing to do with it and your action was going to cost them money.
> 
> I have two acrylic water blocks on their way from Frozen CPU. I know how to use it, if I see that there is something wrong with the block, I will contact EK and tell them that they have a badly designed product and they need to send me a new one, not try and and get one from the retailer who had nothing to do with the product. Frozen CPU's job is to get the item to you in good shape, and it seems they did that.


I DIDN'T SEND ANYTHING, for like the fourth time. All I did was email them. That's it.

Edit: Most of the time the seller will deal with replacements within like 30 days of purchase. That's why I contacted them.

Edit2: Also, I didn't have a single problem with the other block and I did the exact same thing. The waterblock is literally sitting in my living room.


----------



## DampMonkey

Hey guys. How about we start talking about 290X's now and stop arguing about frozencpu/ek?
I'll start:

new personal best

3dmark gpu score- 19365
290x 1320/5972 @ 1.4V

full res: http://i.imgur.com/tmjSLpk.jpg


----------



## wolfej

Quote:


> Originally Posted by *DampMonkey*
> 
> Hey guys. How about we start talking about 290X's now and stop arguing about frozencpu/ek
> ill start
> 
> new personal best
> 
> 3dmark gpu score- 19365
> 290x 1320/5972 @ 1.4V
> 
> full res: http://i.imgur.com/tmjSLpk.jpg


Nice clocks! Is it 100% stable or have you not tested it yet?


----------



## sugarhell

You can't do more? Artifacts or freeze?


----------



## Arm3nian

Quote:


> Originally Posted by *wolfej*
> 
> I DIDN't SEND ANYTHING for like the fourth time. All I did was email them. That's it.
> 
> Edit: Most of the time the seller will deal with replacements within like 30 days of purchase. That's why I contacted them.
> 
> Edit2: Also, I didn't have a single problem with the other block and I did the exact same thing. The waterblock is literally sitting in my living room.


The retailer will do something in 30 days, they will contact the manufacturer for you and maybe speed up the process. Also, in your original post you said that you already ordered a new one, so you were planning to send in your old one.


----------



## $ilent

Quote:


> Originally Posted by *DampMonkey*
> 
> Hey guys. How about we start talking about 290X's now and stop arguing about frozencpu/ek
> ill start
> 
> new personal best
> 
> 3dmark gpu score- 19365
> 290x 1320/5972 @ 1.4V
> 
> full res: http://i.imgur.com/tmjSLpk.jpg


Man that cpu is such a bottleneck.


----------



## wolfej

Quote:


> Originally Posted by *Arm3nian*
> 
> The retailer will do something in 30 days, they will contact the manufacturer for you and maybe speed up the process. Also, in your original post you said that you already ordered a new one, so you were planning to send in your old one.


I did that because it said you could do that on the website. He said it doesn't work that way for waterblocks, which is fine, and he should have said so originally.


----------



## Arm3nian

Quote:


> Originally Posted by *wolfej*
> 
> I did that because it said you could do that on the website. He said it doesn't work that way for waterblocks, which is fine, and he should have said so originally.


It doesn't work that way for anything that YOU damaged. If they sent it and it was damaged, then that process applies.


----------



## skupples

Quote:


> Originally Posted by *wolfej*
> 
> Defraud? I was asking for an RMA for the block because I followed the directions and it cracked. I have emailed ek about it now and he should have just said to do so to begin with. Literally cussing me out for inquiring about an RMA is ridiculous. I did what his website said to do for getting returns quicker. I didn't see anything about water blocks being a special case.
> 
> I will forward anyone the email gladly so you can see what was going on and accusing me of defrauding someone is ridiculous.
> 
> I was taking the top off to put on the ek bridge. If ek turns me down then I'll just try to fix it myself but I didn't do anything wrong as I emailed them to get an RMA and I have not shipped it to them like he kept saying. I literally asked what I should do for an RMA.


This is why you should be doing business outside of New York. Might I recommend to you Performance-Pc's for your next acquisition.


----------



## PillarOfAutumn

Is there a specific brand for the 290x that's better? I've been told to stay away from PowerColor and XFX, so that more or less leaves Sapphire, Asus, and MSI? Amongst those three, which is better? I was leaning towards Sapphire.


----------



## wolfej

Quote:


> Originally Posted by *Arm3nian*
> 
> It doesn't work that way for anything that YOU damaged. If they sent it and it was damaged, then that process applies.


Then once again, he should have said so. I will gladly forward you the emails so you can draw your own conclusions about how he handled it. If I did something incorrect he could have said so and told me what to do, and we could have avoided this entire "thing".


----------



## Arm3nian

Quote:


> Originally Posted by *skupples*
> 
> This is why you should be doing business outside of New York. Might I recommend to you Performance-Pc's for your next acquisition.


They're in the east also, nothing in the west I think







Have to wait 5 days for everything.
Quote:


> Originally Posted by *sugarhell*
> 
> You cant do more? Artifacts or freeze?


That is a pretty good clock, avg. titan clock in the club under water is 1300.


----------



## Mas

Regardless, the response was uncalled for.

Customer doesn't properly understand the RMA process, sends an email to the retailer asking what to do, and gets verbally abused?

No.

The retailer should reply with RMA instructions, even if those instructions are simply "Contact the manufacturer for this type of situation."


----------



## wolfej

Quote:


> Originally Posted by *skupples*
> 
> This is why you should be doing business outside of New York. Might I recommend to you Performance-Pc's for your next acquisition.


Definitely, I've done business with them before without problem. FrozenCPU was the same until this evening.
Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Is there a specific brand for the 290x that's better? I'be been told to stay away from powercolor and xfx, so that more or less leaves sapphire, Asus, and msi? Amongst those three, which is better? I was leaning towards sapphire.


Since they're all reference right now there isn't much difference. The only actual difference I've seen is the RAM manufacturer and the stickers on the outside.


----------



## VSG

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Is there a specific brand for the 290x that's better? I'be been told to stay away from powercolor and xfx, so that more or less leaves sapphire, Asus, and msi? Amongst those three, which is better? I was leaning towards sapphire.


Why were you told to stay away from those 2 brands? As a reference design with vram type apparently not a factor anymore, it comes down to warranty and customer support. I personally have heard both good and bad experiences from all brands so just look for warranty length (2 vs 3 years) if you plan to hold onto these cards for a while.


----------



## Taint3dBulge

I know someone on another forum who is having problems with his 290x; he says he's getting 40fps at 1080p. Any ideas?
Quote:


> Hi there,
> 
> after having read a lot of raving reviews about AMD's new flagship card, the Radeon R9 290x, I went ahead and ordered the Gigabyte BF4 Edition.
> 
> I've installed it in my rig, running an Intel [email protected] with 8 GB of RAM and the latest Catalyst Beta for Windows 8.1. My resolution is 1080p.
> 
> Now, the performance I'm getting in most games pretty much sucks!
> 
> For example in Battlefield 4 I'm getting an average of 40 FPS, with all settings cranked to Ultra. And even when I set everything to Low, there's hardly any noticable improvement.
> 
> In Arham Origins it's almost the same, until I turn off MSAA.
> 
> I've already checked the CPU, which remains around 50-60% during a game of BF4 and the same goes for the memory.
> 
> So, what's the deal here?
> 
> Does anybody else have this card? How's the performance for you? Is it the drivers or could it be a bottleneck somewhere?
> 
> I mean, for me this is even worse than the BF4-beta, which I played on an GTX 660 TI... so it's a huge disappointment!
> 
> Can somebody help? Thanks!


His PSU is 600w, and he has also OC'd his CPU to 4.5GHz and still gets the same performance...

http://forums.guru3d.com/showthread.php?t=382959


----------



## Arm3nian

Quote:


> Originally Posted by *wolfej*
> 
> Then once again he should have said so. I will gladly forward you the email so you can make your own assumptions of how he handled it. If I did something incorrect then he could have said so then he could have told me what to do and we could have avoided this entire "thing".


I don't want the email lol, he should've handled it more professionally. But I hope you see what he might have thought. You broke the block, ordered a new one, and then would demand a refund, and if they didn't refund it, you might have sued or something.


----------



## wolfej

Quote:


> Originally Posted by *Mas*
> 
> Regardless, the response was uncalled for.
> 
> Customer doesn't properly understand RMA process, sends email to retailer asking what to do, and get's verbally abused?
> 
> No.
> 
> Retailer should reply with RMA instructions, even if those instructions are a simple "Contact the manufacturer for this type of situation"


That's really my only problem with the whole situation. I make enough money that I'm not upset about being out 100 bucks if that is the case, but he could have easily helped me along with the process. I understand that he got mad and I even tried to calm him down, but he was obviously angry about something, or he is someone who flies off the handle rather easily.
Quote:


> Originally Posted by *Arm3nian*
> 
> I don't want the email lol, he should've handled it more professionally. But I hope you see what he might have thought. You broke the block, ordered a new one, and then would demand a refund, and if they didn't refund it, you might have sued or something.


I agree somewhat. If he had actually read my emails then he would have seen that I wasn't trying to rip him off.


----------



## _Killswitch_

You guys are killing me with this RMA thing (sorry for your issues, Wolf). I can't decide between SLI 780's or CF 290X's, so I've been stalking this thread trying to see how the 290X is doing so far from people who have them.


----------



## wolfej

Quote:


> Originally Posted by *_Killswitch_*
> 
> You guy's are killing me with this RMA thing (Sorry for your issues Wolf) Can't decided between SLI 780's or CF 290X's. So been stalking this thread trying see how 290X is doing so far from people who have them.


Sorry, this turned into a whole big thing. I was just trying to share my very bad experience. As soon as I get my replacement 290x I'll have them setup and do some benchmark runs.

Edit: Are you wanting water-cooled results or on air? I can do both when I get the replacement if you want.


----------



## Arm3nian

Quote:


> Originally Posted by *wolfej*
> 
> That's really my only problem with the whole situation. I make enough money that I'm not upset about being out 100 bucks if that is the case, but he could have easily helped me along with the process. I understand that he got mad and I even tried to calm him down, but he was obviously angry about something or he is someone who goes off the handle rather easily.
> I agree somewhat. If he had actually read my emails then he would have seen that I wasn't trying to rip him off.


The point I'm trying to make is that you took an action without waiting for the reply. You might be trustworthy, you might not be; he doesn't know that, and he can't really trust you either. That is why there is a policy: if you are unclear, wait for their response. Don't order a new one; that is probably what set him off.

I'm not sure how the other employees described the situation to him, which might also have been a factor. Either way, they should have treated you with more respect.


----------



## wolfej

I get that he could be irritated, but he could have just explained what to do. He literally could have avoided this whole thing with like one sentence.


----------



## Mas

Quote:


> Originally Posted by *Taint3dBulge*
> 
> I know someone on another forum that is having problems with his 290x, says hes getting 40fps on 1080p. Any ideas
> His psu is 600w, he also has oced his cpu to 4.5ghz and is still in the same performance...
> 
> http://forums.guru3d.com/showthread.php?t=382959


Did he properly uninstall previous drivers? (Note: "properly" does not necessarily mean using the supplied uninstaller or simply running uninstall from control panel)

Downsampling?

Running GPUz or MSI AB? I've heard they aren't playing nice with R9 290x


----------



## pioneerisloud

Quote:


> Originally Posted by *Taint3dBulge*
> 
> I know someone on another forum that is having problems with his 290x, says hes getting 40fps on 1080p. Any ideas
> Quote:
> 
> 
> 
> Hi there,
> 
> after having read a lot of raving reviews about AMD's new flagship card, the Radeon R9 290x, I went ahead and ordered the Gigabyte BF4 Edition.
> 
> I've installed it in my rig, running an Intel [email protected] with 8 GB of RAM and the latest Catalyst Beta for Windows 8.1. My resolution is 1080p.
> 
> Now, the performance I'm getting in most games pretty much sucks!
> 
> For example in Battlefield 4 I'm getting an average of 40 FPS, with all settings cranked to Ultra. And even when I set everything to Low, there's hardly any noticable improvement.
> 
> In Arham Origins it's almost the same, until I turn off MSAA.
> 
> I've already checked the CPU, which remains around 50-60% during a game of BF4 and the same goes for the memory.
> 
> So, what's the deal here?
> 
> Does anybody else have this card? How's the performance for you? Is it the drivers or could it be a bottleneck somewhere?
> 
> I mean, for me this is even worse than the BF4-beta, which I played on an GTX 660 TI... so it's a huge disappointment!
> 
> Can somebody help? Thanks!
> 
> 
> 
> His psu is 600w, he also has oced his cpu to 4.5ghz and is still in the same performance...
> 
> http://forums.guru3d.com/showthread.php?t=382959
Click to expand...

I'm thinking PSU. He says it's "600w" but there's no mention of which unit. Somebody made note it's a Corsair; it's possible he has a CX600, which is realistically a 500w-at-best unit, and mediocre at that. Could just be that he needs a better quality PSU. Not necessarily more wattage, but better quality.
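For anyone wanting to sanity-check their own build the same way, the headroom math is easy to script. The wattages below are ballpark assumptions (roughly 290W for a 290X under load, ~130W for an overclocked quad-core, ~50W for board/drives/fans), not measured figures, and the 80% derating reflects the point above that a budget unit often can't comfortably deliver its label rating:

```python
# Rough PSU headroom estimate for a single-290X build.
# All wattages are ballpark assumptions, not measurements.
def psu_headroom(psu_watts, gpu_w=290, cpu_w=130, misc_w=50, derate=0.8):
    """Return (estimated system draw, usable capacity after derating).

    derate=0.8 assumes a budget unit only comfortably delivers about
    80% of its label; a quality unit can run closer to its rating.
    """
    draw = gpu_w + cpu_w + misc_w
    usable = psu_watts * derate
    return draw, usable

draw, usable = psu_headroom(600)
print(f"Estimated draw: {draw} W, usable from a derated 600 W unit: {usable:.0f} W")
```

On these assumptions a mediocre 600w unit has almost no margin left under a heavy load, which would fit the symptoms described.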


----------



## sugarhell

Quote:


> Originally Posted by *Arm3nian*
> 
> They're in the east also, nothing in the west I think
> 
> 
> 
> 
> 
> 
> 
> Have to wait 5 days for everything.
> That is a pretty good clock, avg. titan clock in the club under water is 1300.


Nothing to do with the Titan. I am trying to find the average temps at which the 290x becomes unstable. Pretty much I want to see if it's the VRMs or the core limiting his OC, or just improper voltage control. I bet the GPU needs around 35-40C for 1400.


----------



## _Killswitch_

Quote:


> Originally Posted by *wolfej*
> 
> Sorry, this turned into a whole big thing. I was just trying to share my very bad experience. As soon as I get my replacement 290x I'll have them setup and do some benchmark runs.
> 
> Edit: Are you wanting water-cooled results or on air? I can do both when I get the replacement if you want.


Watercooled is fine, since it doesn't matter whether it's SLI 780's or CF 290X's; whichever I go with will be watercooled.


----------



## Arm3nian

Quote:


> Originally Posted by *sugarhell*
> 
> Nothing to do with the titan. I am trying to find the average temps that 290x becames unstable. Pretty much i want to see if its the vrms or the core is limiting his oc or just non proper voltage control. I bet the gpu needs around 35-40C for 1400


The Titan and 290x are comparable clock for clock, so I think it is a good OC.


----------



## sugarhell

Quote:


> Originally Posted by *Arm3nian*
> 
> Titan and 290x are comparable clock for clock so I think it is a good is a oc.


Still not as an architecture. Tahiti and now Hawaii are temp limited: at 50C you can do 1300, at 60C you can't. His temps are quite high for water but he can still do 1320. With proper voltage control and lower temps, 1400 is quite possible.


----------



## Arm3nian

Quote:


> Originally Posted by *sugarhell*
> 
> Still not as an architecture. Tahiti and now hawaii are temp limited. 50C and you can do 1300, 60C and you cant do that.His temps are quite high for water but still he can do 1320. With proper voltage control and lower temps 1400 is quite possible.


I'm not saying 1320 is the limit, I personally think this card is capable of much more. But for raw performance, 1320 is good compared to what green can get.

I'm just proving the green fanboys wrong, who say this card can't reach titan performance. It already has, and will continue onwards, just like you said.


----------



## utnorris

I have dealt with FrozenCPU several times and I have never gotten a response like that, so I'm not sure what is up with that. Yes, they are correct that you should deal with EK directly, but it could have been a simple reply saying as much, since you had not shipped the original block yet.

That being said, if I am understanding correctly, you were installing the bridge to connect two blocks together, which does call for the removal of the top three screws, and unless they changed something, it does not say to remove all three at the same time. I have never had an acrylic block break due to loosening or removing the screws, and I have done plenty, not just EK. It does sound like there is an issue with the manufactured part.

Also, back to FrozenCPU: while they may not want to deal with the issue directly, having them replace it through EK is easier for the buyer considering the cost to ship it back to EK. FrozenCPU probably ships a bulk RMA to them and, heck, probably gets a shipping credit against the next order, whereas an end buyer won't be able to do that. My point is, FrozenCPU should help the customer get it resolved, especially if the user was following the install instructions provided by EK, since the customer bought it from FrozenCPU and not EK directly. It's no different than if I buy a GPU from Newegg and it explodes in my case within the first 30 days. Newegg takes care of it and deals with the company to get it replaced.

Anyways, back on topic, did anyone update to the latest beta drivers? I did not notice any improvements, did anyone else?


----------



## wolfej

Quote:


> Originally Posted by *Arm3nian*
> 
> I'm not saying 1320 is the limit, I personally think this card is capable of much more. But for raw performance, 1320 is good compared to what green can get.
> 
> I'm just proving the green fanboys wrong, who say this card can't reach titan performance. It already has, and will continue onwards, just like you said.


I would be extremely happy with 1320 honestly. I'm hoping I can get 1400, but anything higher than 1300 is going to make me happy.


----------



## Arm3nian

Quote:


> Originally Posted by *wolfej*
> 
> I would be extremely happy with 1320 honestly. I'm hoping I can get 1400, but anything higher than 1300 is going to make me happy.


Yeah it's what I originally said, 1320 is a good clock. Still, it is capable of much more.

Personally, I'll be satisfied when I kill Alatar's 93fps on valley. If only the RIVBE would stop getting delayed I might have a computer...


----------



## tsm106

Quote:


> Originally Posted by *wolfej*
> 
> I get that he could be irritated, but he could have just explained what to do. He literally could have avoided this whole thing with like one sentence.


Contact ek. They will sell you a replacement top. They most likely won't have the replacement parts listed yet since the block is so new. Choose acetal.


----------



## wolfej

Quote:


> Originally Posted by *tsm106*
> 
> Contact ek. They will sell you a replacement top. They most likely won't have the replacement parts listed yet since the block is so new. Choose acetal.


I did, but would I be able to replace the acrylic top with the acetal? Didn't think about them being universal with their 290x blocks like that.


----------



## tsm106

Quote:


> Originally Posted by *wolfej*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Contact ek. They will sell you a replacement top. They most likely won't have the replacement parts listed yet since the block is so new. Choose acetal.
> 
> 
> 
> I did, but would I be able to replace the acrylic top with the acetal? Didn't think about them being universal with their 290x blocks like that.
Click to expand...

Yes, the blocks themselves are all the same for the 290X/290; the difference is the top: either acrylic or acetal, and full PCB or short. Get acetal in whatever flavor. Acetal is very tough and you have to try hard to damage it. Acrylic, on the other hand, you can destroy just by tightening it wrong.


----------



## modinn

Quote:


> Originally Posted by *sugarhell*
> 
> Still not as an architecture. Tahiti and now Hawaii are temp limited: at 50C you can do 1300, at 60C you can't. His temps are quite high for water but he can still do 1320. With proper voltage control and lower temps, 1400 is quite possible.


Edit: Of course, this isn't taking into account why LN2 and Dice runs achieve better overclocks. So I'll probably just refactor this post anyways as I realize what I've just posted.


----------



## skupples

Quote:


> Originally Posted by *utnorris*
> 
> I
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> have dealt with FrozenCPU several times and I have never gotten a response like that, not sure what is up with that. Yes, they are correct you should deal with EK directly, but it could have been a simple reply stating that since you had not shipped the original block. That being said, if I am understanding correctly, you were installing the bridge to connect two blocks together, which does call for the removal of the top three screws and unless they changed something, it does not say to remove all three at the same time. I have never had an acrylic block break due to loosening or removing the screws and I have done plenty, not just EK. It does sound like there is an issue with the manufacturing part. Also, back to FrozenCPU, while they may not want to deal with the issue directly, having them replace it through EK is easier for the buyer considering the cost to ship it back to EK. FrozenCPU probably ships a bulk RMA to them and heck, probably gets a shipping credit against the next order, where as an end buyer won't be able to do that. My point is, FrozenCPU should help the customer get it resolved, especially if the user was following the install instructions provided by EK since the customer bought it from FrozenCPU and not EK directly. It's no different than if I buy a GPU from Newegg and it explodes in my case within the first 30 days. Newegg takes care of it and deals with the company to get it replaced.
> 
> 
> 
> Anyways, back on topic, did anyone update to the latest beta drivers? I did not notice any improvements, did anyone else?


Sounds like you are either the straw that broke Mark's back, or he has you confused with someone else who has been hitting them up for a replacement on user-error damage.


----------



## tsm106

Dampmonkey, nice clocks dude.

You should really think about a better cpu man. Your FS at 1300mhz vs my FS at 1225.

1300mhz
GS=13839
http://cdn.overclock.net/6/6a/6ace33f5_8MOaWrq.jpeg

1225mhz
GS=13414
http://www.3dmark.com/fs/1050519


----------



## sugarhell

What did I just read? Please don't post that crap ever again


----------



## wolfej

Quote:


> Originally Posted by *sugarhell*
> 
> What did I just read? Please don't post that crap ever again


Huh? Who are you talking to? Lol


----------



## sugarhell

Quote:


> Originally Posted by *wolfej*
> 
> Huh? Who are you talking to? Lol


He knows


----------



## wolfej

Quote:


> Originally Posted by *sugarhell*
> 
> He knows


Haha ok.


----------



## modinn

Quote:


> Originally Posted by *sugarhell*
> 
> He knows


Me. I made a stupid post. Sugarhell made me realize what I had posted. I'm not afraid of admitting I was wrong.

Anyways, I thought I'd add the part back of what I wanted to say originally. I'm happy with the way the 290X is turning out so far in terms of watercooling performance. It makes my watercooling adventure seem far more useful this time around compared to Kepler GK104.


----------



## sugarhell

What's the point of watercooling a 680? It doesn't scale under water or with temps at all


----------



## Arm3nian

Quote:


> Originally Posted by *sugarhell*
> 
> What's the point of watercooling a 680? It doesn't scale under water or with temps at all


Noise is enough to wc imo lol. Maybe a slightly better oc, but you're not going to really get anything out of 600 series under water. Only worth it if you had the $ at the time.


----------



## modinn

Quote:


> Originally Posted by *sugarhell*
> 
> What's the point of watercooling a 680? It doesn't scale under water or with temps at all


Which is exactly why I said that I think watercooling this time around would be better for me. I bought one during the first wave of 680's when they arrived on Newegg, I stayed up all night and ordered a waterblock to go along with it. This was long before anyone had done any testing under water, and so I was taking the risk of being the guinea pig.

Not this time around though, I'm letting people like DampMonkey be my guinea pig now


----------



## _Killswitch_

These 290X's pull a lot of power. If I go CF 290X, I'm glad I went overkill on my PSU with a 1300W =S


----------



## Arizonian

Quote:


> Originally Posted by *_Killswitch_*
> 
> These 290X's pull a lot of power. If I go CF 290X, I'm glad I went overkill on my PSU with a 1300W =S


That's my only gripe too. I only have myself to blame for going with a conservative, high-quality 850 watt on my Ivy system on release day. I'd love to go overkill crossfire but don't think it'd be wise. Wonder how much less of a power draw non-ref 290 Xfire might be? So I'm contemplating my next move: a single non-ref 290X, and sell off the reference card.


----------



## Sazz

This sucks. The DisplayPort on my card stopped putting out display, so now I need to send it to Newegg for an RMA replacement. What scares me is that since the Sapphire one is out of stock, they may end up changing it to a refund, but the representative I talked to said that if it gets changed to a refund I just need to contact them and they will help me accordingly. I told them I just really want a working card, I don't really want a refund xD


----------



## _Killswitch_

Arizo, I have an AX850 in my current rig, and an EVGA SuperNOVA 1300 in the 900D case behind me that is my future build. The AX850 is a good PSU, hasn't given me an ounce of trouble. It's just that the new PC is aiming for overkill (for me anyways), hence the 1300 watt PSU, getting SLI 780s or CF 290X's, all watercooled, etc. etc.


----------



## Arm3nian

Quote:


> Originally Posted by *Sazz*
> 
> This sucks, the Display port in my card stopped putting out display, now I need to send it for RMA for replacement with newegg, thing that scares me since the Sapphire one is out-of-stock they will end up changing it to refund but the representative that I talked to said if it gets changed to refund I just need to contact them and they will help me accordingly, told them I just really want a working card don't really want a refund xD


Did you try troubleshooting? Reinstall drivers, reseat card. Could be monitor as well. Don't think a display output going dead is that common.


----------



## Sazz

Quote:


> Originally Posted by *Arm3nian*
> 
> Did you try troubleshooting? Reinstall drivers, reseat card. Could be monitor as well. Don't think a display output going dead is that common.


I rotated my displays and the monitor works properly; it's just the DisplayPort that is not putting out display. Both DVI ports are putting out display. Dunno about the HDMI, don't have an HDMI to DVI adapter to try, but yeah... this sucks -_-

And I am about to get the block tomorrow -_-


----------



## TheSoldiet

Is there anybody that uses the AMD Catalyst uninstall utility 1.2.1.0? Works great for me on W7 Ultimate.


----------



## Asrock Extreme7

No, but I will try. Just used Fraps and it says 112fps on BF4 ultra settings. Love this card


----------



## psyside

Quote:


> Originally Posted by *TheSoldiet*
> 
> Is there anybody that uses the AMD Catalyst uninstall utility 1.2.1.0? Works great for me on W7 Ultimate.


I'm using it, it's great. But I think a better alternative is this, because it's properly updated. The AMD utility hasn't been updated in several months, and it's bad on Windows 8 as well.


----------



## Arizonian

Just saw this posted elsewhere on OCN new review up by [H]ardOCP Nov 1, 2013 - updated to review section on OP as well.

*AMD Radeon R9 290X CrossFire Video Card Review*


Quote:


> In Crysis 3 we are running at 5760x1200 with SMAA 1X and "High" settings. The R9 290X gave us a playable average of 43.6 FPS, but *adding a second R9 290X it went up to 77 FPS. This is a performance advantage of near 77% by adding a second R9 290X.*




Pretty darn good scaling going on here.
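For anyone wanting to sanity-check that scaling figure, the arithmetic is just the ratio of the two FPS numbers quoted from the review (a quick back-of-the-envelope sketch, not part of the [H]ardOCP article):

```python
# CrossFire scaling check using the FPS figures quoted from the
# [H]ardOCP review above.
single_fps = 43.6     # one R9 290X, Crysis 3 @ 5760x1200
crossfire_fps = 77.0  # two R9 290X in CrossFire

# Percentage gain from adding the second card.
scaling_gain = (crossfire_fps / single_fps - 1) * 100
print(f"Scaling gain: {scaling_gain:.1f}%")  # ~76.6%, i.e. "near 77%"
```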


----------



## PillarOfAutumn

Quote:


> Originally Posted by *geggeg*
> 
> Why were you told to stay away from those 2 brands? As a reference design with vram type apparently not a factor anymore, it comes down to warranty and customer support. I personally have heard both good and bad experiences from all brands so just look for warranty length (2 vs 3 years) if you plan to hold onto these cards for a while.


Well, I've been seeing XFX and PowerColor usually get the lowest reviews on most cards. And apparently, a few members have already sent in XFX cards for coil whine. So that also leads me to believe that these companies aren't too good. And I've also heard people say that if you want a quality card that can OC, go with Asus or Sapphire. And I've heard that Sapphire has amazing customer support, so I should go with them. Does any of this have any merit?


----------



## Slomo4shO

Sapphire cards are in stock at Newegg

Also, $25 Off Your Purchase of $250+ at Newegg.com using code *NAFSAVE25NOV1R* at checkout.

Edit:
Card is no longer in stock.


----------



## Shoggy

Do not try this at home


----------



## Arizonian

Quote:


> Originally Posted by *Slomo4shO*
> 
> Sapphire cards are in stock at Newegg
> 
> Also, $25 Off Your Purchase of $250+ at Newegg.com using code *NAFSAVE25NOV1R* at checkout.
> 
> Edit:
> Card is no longer in stock.


That was quick, they're gone as soon as you add one to the cart. People must be camping the 290X's right now.


----------



## Slomo4shO

Quote:


> Originally Posted by *Arizonian*
> 
> That was quick, they're gone as soon as you add one to the cart. People must be camping the 290X's right now.


Not sure why, the factory OC EVGA GTX 780 SC ACX is currently $485 after rebate and the 780 Lightening is currently $525 after rebate and both come with 3 games.


----------



## Sazz

Quote:


> Originally Posted by *Arizonian*
> 
> That was quick, they're gone as soon as you add one to the cart. People must be camping the 290X's right now.


I just hope they have saved one for my replacement T_T


----------



## Alexbo1101

Screw these Danish retailers. Yesterday they said that they'd get some 290X's in stock today, but now it says the 22nd instead...


----------



## Arizonian

Quote:


> Originally Posted by *Sazz*
> 
> I just hope they have saved one for my replacement T_T


Trust me - if they have to refund or replace yours - I'm sure they pulled one before putting any others up for sale. They'll do whatever it takes to avoid RMA shipping cost. I'd call them in the morning and let them know you saw it go available and gone in seconds, and that you're hoping yours was set aside and on its way.


----------



## Forceman

Quote:


> Originally Posted by *Sazz*
> 
> I just hope they have saved one for my replacement T_T


You'll probably get back the one I returned and I'll get yours. Newegg has been known to ship dead cards back as "replacements". At this point I would rather have a refund and grab a card off Amazon. I'm tired of dealing with Newegg's restrictive return policies.


----------



## Sazz

Quote:


> Originally Posted by *Arizonian*
> 
> Trust me - if they have to refund or replace yours - I'm sure they pulled one before putting it any others for sale. They'll do whatever it takes to avoid RMA shipping cost. I'd call them in morning and let them know you saw it go available and gone in seconds and hoping yours was set aside and on it's way.


Yeah, I'm gonna tell them that. Another problem is I already used the BF4 code, but the representative said that they will find something to work things out, so in the end I probably just have to send back the BF4 coupon from the replacement they send me, if they do get it back in stock.

There was one case where they ended up letting me keep both coupon codes because of a similar case with a different product. But I just hope I get a replacement; I don't really want to bother trying to buy another one again because they are flying off the shelves quicker than you can blink -_-


----------



## Paul17041993

Quote:


> Originally Posted by *Shoggy*
> 
> Do not try this at home


omg why is there no protective tape on there...


----------



## fleetfeather

Quote:


> Originally Posted by *Shoggy*
> 
> Do not try this at home


Hold my beer...


----------



## maarten12100

http://www.4launch.nl/shop/get/p-4-productid-146964

6 to 10 days so I might get the 290 by that time as well unless it takes just as long to become available.
It is cheap, though. I don't get why you would get a 780 over this, since the Classified is more expensive and the reference one needs all the gimmicks to get rid of the Greenlight BS.


----------



## Alexbo1101

Quote:


> Originally Posted by *maarten12100*
> 
> http://www.4launch.nl/shop/get/p-4-productid-146964
> 
> 6 to 10 days so I might get the 290 by that time as well unless it takes just as long to become available.
> It is cheap, though. I don't get why you would get a 780 over this, since the Classified is more expensive and the reference one needs all the gimmicks to get rid of the Greenlight BS.


I'm with you. The shop where I ordered mine was supposed to get some today, but instead they first get some on the 22nd, so I'll wait on the 290 and see if that's worth it.


----------



## maarten12100

Quote:


> Originally Posted by *Alexbo1101*
> 
> I'm with you. The shop where I ordered mine was supposed to get some today, but instead they first get some on the 22nd, so I'll wait on the 290 and see if that's worth it.


Story of living anywhere in Europe that isn't Poland or Germany (well and the UK)


----------



## flopper

Quote:


> Originally Posted by *Arizonian*
> 
> That was quick, they're gone as soon as you add one to the cart. People must be camping the 290X's right now.


all selling out.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> Yes, the blocks themselves are all the same 290x/290, the difference is the top, either you get acrylic or acetal and full pcb or short. Get acetal in whatever flavor. Acetal is very tough and you have to try hard to damage it. Acrylic on the other hand you can destroy it just by tightening it wrong.


i just ordered the Acrylic. Should I call them and ask to replace with Acetal?

edit: I'll call them as soon as they open. hopefully the CS is in a good mood.

Update: Got it replaced to Acetal.


----------



## fleetfeather

Hmmm, both my orders should've started shipping today. Neither BLT or Provantage appear to have received their stock yet, so we'll see how this goes down







If they don't manage to secure any stock, I may have to look elsewhere for some GPU horsepower haha...


----------



## Sgt Bilko

My ASIC Score: 70.4%


----------



## kcuestag

Mine is 75.4% ASIC Quality, is that good or bad?


----------



## maarten12100

Quote:


> Originally Posted by *kcuestag*
> 
> Mine is 75.4% ASIC Quality, is that good or bad?


It depends a bit per card, but high is good for air while low is good for LN2, normally


----------



## kcuestag

Quote:


> Originally Posted by *maarten12100*
> 
> It depends a bit per card, but high is good for air while low is good for LN2, normally


I can do 1125/1400 on default voltages but raising the clock above 1135 will cause flickering and weird lighting which I guess means it's unstable, haven't tried higher memory than that.

Is that a good OC for default voltage? average? bad?


----------



## selk22

Just did a 3DMark basic run and here are my results! I'm pretty happy with 10k in Firestrike









On air 1100/1400 no volt mods


http://www.3dmark.com/3dm/1528722

And @kcuestag

yes that's average from what I have seen so far! Better than some


----------



## Newbie2009

Still no real water benchmarks. Come on guys!


----------



## selk22

Quote:


> Originally Posted by *Newbie2009*
> 
> Still no real water benchmarks. Come on guys!


Rad and tubes come this Thursday, block incoming the week after most likely...

I am still contemplating waiting for the Swiftech block


----------



## MIGhunter

Any suggestions on installing my card? I plugged it in, turned on my pc, everything is working but when I try to install the beta drivers, it says they aren't compatible. My CD drive is acting up and I can't get the disc in to install from the disc. When I try the auto detect, it tells me it can't find my card type. I know the card is working because I have a screen.


----------



## selk22

Have you tried just installing the 13.11 beta 8 drivers from AMD's website?

Have you tried an earlier version, like maybe 6? That's what I am on


----------



## maarten12100

Quote:


> Originally Posted by *kcuestag*
> 
> I can do 1125/1400 on default voltages but raising the clock above 1135 will cause flickering and weird lighting which I guess means it's unstable, haven't tried higher memory than that.
> 
> Is that a good OC for default voltage? average? bad?


Seems like an OK OC. Did you max the fan to reduce the leakage even further?
I want to get my hands on one of those cards ASAP so I can try for myself. (Yes, it is Titan all over again; tweaking a card is hard if you never actually got the card







)
Quote:


> Originally Posted by *MIGhunter*
> 
> Any suggestions on installing my card? I plugged it in, turned on my pc, everything is working but when I try to install the beta drivers, it says they aren't compatible. My CD drive is acting up and I can't get the disc in to install from the disc. When I try the auto detect, it tells me it can't find my card type. I know the card is working because I have a screen.


Tried driver sweeper in safe mode?
Usually does the trick for me though I must say I had that problem with mobile Nvidia drivers once


----------



## MIGhunter

When I tried the beta drivers it said they weren't compatible. I got my CD drive open so I'm installing now, hopefully no problems after that. Without it, I couldn't extend my desktop. It's odd because I'm coming from a Sapphire card with Catalyst installed and I didn't uninstall it; I'd think it would have had a problem with that


----------



## kcuestag

Quote:


> Originally Posted by *maarten12100*
> 
> Seems like an ok OC did you max the fan to reduce the leakage even further?
> I want to get my hands on one of those cards asap so I can try for myself. (yes it is Titan all over again tweaking a card is hard if you never actually got the card
> 
> 
> 
> 
> 
> 
> 
> )
> Tried driver sweeper in safe mode?
> Usually does the trick for me though I must say I had that problem with mobile Nvidia drivers once


My card is on water. No fan.









I haven't done any real benchmarking as I do not want to tweak the BIOS, I'd rather wait for proper voltage support via MSI AB.


----------



## MIGhunter

Btw here's my card


----------



## selk22

Quote:


> Originally Posted by *kcuestag*
> 
> My card is on water. No fan.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I haven't done any real benchmarking as I do not want to tweak the BIOS, I'd rather wait for proper voltage support via MSI AB.


Well then I'd say that's great without any voltage tweaking, but I feel like under water you can push that mem higher if you'd like, even on stock volts. Up to you









I am also waiting for a proper release from sapphire or MSI for power control.

Temps? Pics?







How many rads?


----------



## kcuestag

Quote:


> Originally Posted by *selk22*
> 
> Well then I'd say that's great without any voltage tweaking, but I feel like under water you can push that mem higher if you'd like, even on stock volts. Up to you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am also waiting for a proper release from sapphire or MSI for power control.


Okay, I'll try 1450 and 1500MHz on the memory, let's see how that works for Battlefield 4. Although, with a 512-bit bus, does memory OC really matter for game FPS?

Anyhow, here are some decent pictures from the rig:


----------



## selk22

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *kcuestag*
> 
> Okay, I'll try 1450 and 1500MHz on the memory, let's see how that works for Battlefield 4. Although, with a 512-bit bus, does memory OC really matter for game FPS?
> 
> Anyhow, here are some decent pictures from the rig:






Wow looks great
















If you manage 1450 on the core without volts I will be impressed! I am not sure about games; it helped some in the Heaven and Valley benchmarks I ran


----------



## maarten12100

Quote:


> Originally Posted by *kcuestag*
> 
> My card is on water. No fan.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I haven't done any real benchmarking as I do not want to tweak the BIOS, I'd rather wait for proper voltage support via MSI AB.


I hope we won't need to flash like we did with Titan to get stable voltage; Boost 2.0, as part of Greenlight, was really really bad for enthusiasts.
The only reason they are pushing this PowerTune variable algorithm is to look better in terms of power consumption and to limit OC RMAs. They should just keep their hands off enthusiast cards (not that AMD is really limiting that much; it is more the vendors that do, especially XFX)


----------



## kcuestag

Quote:


> Originally Posted by *selk22*
> 
> 
> Wow looks great
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you manage 1450 on the core without volts I will be impressed! I am not sure about games it helped some in the heaven and valley benchmarks I ran


You got it wrong: the max I can do on the core with no voltage tweaking is 1135MHz; the memory is at 1450MHz right now.


----------



## selk22

Awesome congrats









How is the bf4 performance at 1440p? Able to max it out? AA?


----------



## kcuestag

Quote:


> Originally Posted by *selk22*
> 
> Awesome congrats
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How is the bf4 performance at 1440p? Able to max it out? AA?


I'm playing it on Ultra, only thing I've disabled is MSAA, and instead I'm using FXAA (AA Post) on Medium, just like I did on Battlefield 3, as I prefer it over MSAA.

I play it anywhere from 60 to 120fps, with some drops to the 50s but they are rare. Really happy with the performance so far.


----------



## selk22

Awesome good to hear! thanks for quick replies +rep

I tried to rep a mod again


----------



## devilhead

Quote:


> Originally Posted by *Alexbo1101*
> 
> Screw these Danish retailers. Yesterday they said that they'd get some 290X's in stock today, but now it says the 22nd instead...


Same in Norway!!! They were supposed to send my card today, but changed the date to the 22nd of next month. What a disaster!!!!


----------



## $ilent

My replacement PSU arrived, I'm back in business!


----------



## TacticalAce42

Hello guys, I'm worried about my 290X. It started when playing Battlefield 4: after about an hour or so of play the computer would freeze and show red and yellow vertical lines on the left half of my screen and red and green vertical lines on the right half. So I went ahead and ran FurMark for about half an hour and it was fine, I played Crysis 3 for about 2 hours and it was fine, and I played a match of Battlefield 3 and it was fine.

But I tried Skyrim, absolutely bare bones with no mods or anything installed, and when I leave the cave after you escape Helgen and make my way to Riverwood, the game crashes with no error or anything, right near the first standing stones. Upon further investigation, when I alt-tab out of the game with CCC open, I noticed the GPU core clock is still 300-400MHz. It's almost as if the GPU is still at idle when I play Skyrim. So I took the 290X out and put my 6950 in, and it ran through those sections multiple times. So it clearly is the 290X.

The sad part is the nature of the R9 290X: there is no way to have a fixed clock speed, because it might go above 95C if it stays at 1000MHz. So is there anything I can do? Do I have a faulty GPU? Should I return it before it's too late? If I keep it, who knows which games will be recognized and which won't. The GPU might not even do what it's supposed to do in certain games, making them unplayable. I really don't want to throw away $550, so please help!

Rig

Cpu: i7 4770k @ 4.6 GHZ 1.261V, Gpu: R9 290X, Memory: 16GB Corsair Vengeance Pro 1866, PSU: Corsair RM850, SSD: Corsair Force Gt 128GB, 5TB WD Black (2tb+3tb)


----------



## $ilent

Sounds like a dodgy 290X to me. Which driver are you on?


----------



## TacticalAce42

13.11 Beta 8. I'll try Beta 7 to see if it makes a difference with Skyrim.


----------



## sugarhell

Without rig and specs it's useless to ask for help... just saying


----------



## TacticalAce42

I edited into the original post


----------



## jerrolds

Quote:


> Originally Posted by *selk22*
> 
> Awesome congrats
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How is the bf4 performance at 1440p? Able to max it out? AA?


I only played the campaign and I think it was around 60fps at Ultra with no AA... but I turned down a bunch of settings to get as close to 120fps as possible without sacrificing too much detail. I'm around 90fps outdoors and more indoors at High settings, no AA/HBAO. BF4 is demanding.

This is at 1100/1400, stock everything, 55% fan. Hoping to slap on an H80i and some ramsinks and push it to 1200/1500+ with the PT1 BIOS at under 1.35v. Right now I get corruption at 1140MHz.


----------



## SonDa5

There she blows in all her glory.


----------



## Arizonian

Quote:


> Originally Posted by *MIGhunter*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Btw here's my card


Congrats - Added


----------



## szeged

finally got some free time, about to open these newegg boxes that have been staring at me and giving my titans the evil eye


----------



## jomama22

Hey, a question for those using the Asus BIOS: what are you using to track voltage when it's set in GPU Tweak? GPU-Z and GPU Tweak both report pretty much the same as without the added voltage.

I was able to increase my overclock, so I am assuming voltage control is working, but I can't be sure.


----------



## DampMonkey

Quote:


> Originally Posted by *SonDa5*
> 
> There she blows in all her glory.


The VRMs can get pretty warm, get active cooling on them if you can


----------



## PillarOfAutumn

This is a very noob question. On some overclocks, people are reporting 1300/1500. Is that core clock and memory clock? In some cases, I'm also seeing memory clocks as high as 5000. Can someone please explain that?


----------



## SonDa5

Quote:


> Originally Posted by *DampMonkey*
> 
> The VRMs can get pretty warm, get active cooling on them if you can


It's aluminum heat sinks on memory and VRMs with a big active fan blowing on them.


----------



## Mr357

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> This is a very noob question. On some overclocks, people are reporting 1300/1500. Is that core clock and memory clock? In some cases, I'm also seeing memory clocks as high as 5000. Can someone please explain that?


In that case, 1300MHz would be the core clock, and 1500MHz would be the memory clock. There's also an "effective memory clock" which is simply the actual memory clock multiplied by four.

So, at stock the 290X has a core clock of 1000MHz, a memory clock of 1250MHz, and an effective memory clock of 5000MHz.
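The multiplication itself is trivial; a minimal sketch using the stock numbers above:

```python
# GDDR5 moves four data transfers per clock cycle, so tools that
# report an "effective" memory clock multiply the actual clock by 4.
GDDR5_TRANSFERS_PER_CLOCK = 4

def effective_clock_mhz(actual_mhz):
    return actual_mhz * GDDR5_TRANSFERS_PER_CLOCK

print(effective_clock_mhz(1250))  # stock 290X memory -> 5000
print(effective_clock_mhz(1500))  # a 1500MHz memory OC -> 6000
```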


----------



## hotrod717

Was about to say the HIS 290X (non-Battlefield) was in stock at the Egg, but it just sold out. Was in stock for approx. 30 min. Think I'm going to wait and see how the new drivers turn out and for a preferred brand to come into stock.


----------



## wolfej

Has anyone heard anything about an afterburner update? They're usually pretty quick with updates so I'm surprised nothing has been said yet.


----------



## Arizonian

Quote:


> Originally Posted by *szeged*
> 
> finally got some free time, about to open these newegg boxes that have been staring at me and giving my titans the evil eye


Submit a pic of them when you get them out of the box to flex their legs. It's going to be interesting to watch your scores on the same system, because a lot of times different systems get different scores, and we can get a really good comparison of how they do against your Titans.
Quote:


> Originally Posted by *hotrod717*
> 
> Was about to say HIS 290x (non-Battlefeild) was in stock at the Egg, but just sold out . Was in stock for approx. 30 min. Think I'm going to wait and see how the new drivers are going to turn out and for a preferred brand to come into stock.


From a couple of different sources I've now read online, it looks like AIBs will be allowed to release in the latter part of November.


----------



## fleetfeather

Well, BLT have revised their shipping date to 11/21, so that won't be happening... sigh









Edit: and Provantage isn't shipping til 11/18...


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Mr357*
> 
> In that case, 1300MHz would be the core clock, and 1500MHz would be the memory clock. There's also an "effective memory clock" which is simply the actual memory clock multiplied by four.
> 
> So, at stock the 290X has a core clock of 1000MHz, a memory clock of 1250MHz, and an effective memory clock of 5000MHz.


What is the effective clock for? Why multiply by 4?


----------



## Mr357

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> What is the effective clock for? Why multiply by 4?


It's because the RAM used is GDDR5, which is derived from DDR3 SDRAM. GDDR5 transfers four data words per command clock (it's double-pumped on a write clock that itself runs at twice the command clock), so the effective clock is the actual clock multiplied by four. Classic DDR transfers two words per clock, so there the multiplier is only two.

I may be wrong about the details though. I'm afraid I'm not exactly an expert.
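The arithmetic is easy to sanity-check. A minimal sketch, assuming the commonly published 290X figures (1250MHz memory clock, GDDR5's 4 transfers per clock, 512-bit bus):

```python
def effective_clock_mhz(memory_clock_mhz, transfers_per_clock=4):
    """GDDR5 moves four data words per command clock, so effective = actual * 4."""
    return memory_clock_mhz * transfers_per_clock

def peak_bandwidth_gb_s(memory_clock_mhz, bus_width_bits):
    """Peak bandwidth in GB/s = effective rate (MT/s) * bus width in bytes / 1000."""
    return effective_clock_mhz(memory_clock_mhz) * (bus_width_bits // 8) / 1000

# Stock R9 290X: 1250MHz memory clock on a 512-bit bus
print(effective_clock_mhz(1250))       # 5000 -- the "5GHz effective" figure
print(peak_bandwidth_gb_s(1250, 512))  # 320.0 GB/s
```

Plugging in 1250MHz and 512 bits reproduces the 5GHz effective clock and the 320GB/s peak bandwidth from AMD's spec sheet.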


----------



## lordzed83

ASIC 79% :]

Water block arriving tomorrow :]


----------



## armartins

Can't wait for the air/water results from the insane guy with 6 cards... please, man, do your homework! You'll be a very nice reference: 6 cards, same rig, same WC... hope you dig up some golden cards!


----------



## vettefan8

Feel free to upgrade my cooler for the record book







I got it on Thursday and put it on last night. I haven't had time to run any benchmarks yet but it's definitely quieter and idling a lot cooler.


----------



## raghu78

Quote:


> Originally Posted by *szeged*
> 
> finally got some free time, about to open these newegg boxes that have been staring at me and giving my titans the evil eye


http://www.hardocp.com/article/2013/11/01/amd_radeon_r9_290x_crossfire_video_card_review

If you have a multi-monitor setup (seems you are at 3x 1440p), the R9 290X cards in CF should be a force to be reckoned with. Slap the waterblocks on and let us know about your gaming experience in the most demanding games - BF4, Crysis 3, Metro: Last Light, Far Cry 3.


----------



## TacticalAce42

Has anyone tried Skyrim? Are you getting CTDs? Bad microstuttering? My Skyrim was bare-bones with no mods or anything and I had these problems; pop in my 6950s and the crashing goes away. So please, can someone else test a new game of Skyrim to see if it's the driver or my GPU? Thanks.


----------



## Arizonian

@vettefan8 - updated to Aftermarket cooling









Let us know some numbers. There are some members who are interested in the same thing and would love any info on how it affects temps.


----------



## nemm

Well, Mr. Royal Mail just handed me a parcel. I thought it might be my card, but instead it is my water block and backplate, so I still have to wait for the fittings to arrive, which should be on Monday if I am lucky.



I am tempted to remove packaging but I will wait until the card arrives.


----------



## Raephen

Quote:


> Originally Posted by *vettefan8*
> 
> Feel free to upgrade my cooler for the record book
> 
> 
> 
> 
> 
> 
> 
> I got it on Thursday and put it on last night. I haven't had time to run any benchmarks yet but it's definitely quieter and idling a lot cooler.


Glad to know it fits the PCB!

I'm kinda eyeballing the 290 non-X, to see what it brings to the table.

In my digging on the net, I've already considered the Xtreme III as maybe a good replacement for the stock cooler.


----------



## evensen007

Quote:


> Originally Posted by *TacticalAce42*
> 
> Has anyone tried Skyrim? Are you getting CTDs? Bad microstuttering? My Skyrim was bare-bones with no mods or anything and I had these problems; pop in my 6950s and the crashing goes away. So please, can someone else test a new game of Skyrim to see if it's the driver or my GPU? Thanks.


All of the official reviews tested Skyrim and they didn't have those issues. Did you try uninstalling and reinstalling Skyrim? I don't think it's a widespread issue.


----------



## Mr357

Quote:


> Originally Posted by *nemm*
> 
> Well, Mr. Royal Mail just handed me a parcel. I thought it might be my card, but instead it is my water block and backplate, so I still have to wait for the fittings to arrive, which should be on Monday if I am lucky.
> 
> 
> 
> I am tempted to remove packaging but I will wait until the card arrives.


Definitely keep that backplate wrapped up until you're ready to install it.


----------



## Deisun

I probably sound real stupid here, but:

Where are you guys ordering your R9 290Xs from? I've had zero luck getting one ordered from Newegg. What am I missing here?


----------



## Mr357

Quote:


> Originally Posted by *Deisun*
> 
> I probably sound real stupid here, but:
> 
> Where are you guys ordering your R9 290Xs from? I've had zero luck getting one ordered from Newegg. What am I missing here?


I got mine from Newegg on launch night. I have no clue where they can be found now.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Mr357*
> 
> Definitely keep that backplate wrapped up until you're ready to install it.


Why is that? I'll be getting my block and backplate next week, and given how the current 290X stock situation is, it looks like I won't be getting the card for a month or so. I feel like I'd want to open up the package and just inspect the block/backplate for any damage.
Quote:


> Originally Posted by *Deisun*
> 
> I probably sound real stupid here, but:
> 
> Where are you guys ordering your R9 290Xs from? I've had zero luck getting one ordered from Newegg. What am I missing here?


lol, Newegg is the only site I've seen that's selling the 290X at the recommended price of $550. Everyone else is at $600+ for the non-BF4 edition. So I would assume the demand at Newegg is loads higher than anywhere else, and as soon as they get something in stock, it's immediately sold out.


----------



## GioV

R9 290X crossfire review was posted today, seems like 2 cards need 780W. Great performance by the R9 290X when crossfired, it beat SLI Titans and SLI 780s in every benchmark.

http://www.hardocp.com/article/2013/11/01/amd_radeon_r9_290x_crossfire_video_card_review/1#.UnP3_nCkpH4


----------



## TacticalAce42

Yes, multiple times, and it's just Skyrim with no mods or anything. I RMA'd the card today to be on the safe side, but it would be terrible luck if I just had a bad card.


----------



## Deisun

Quote:


> Originally Posted by *hotrod717*
> 
> Was about to say the HIS 290X (non-Battlefield) was in stock at the Egg, but it just sold out. It was in stock for approx. 30 min. I think I'm going to wait and see how the new drivers turn out and for a preferred brand to come back into stock.


Which are the preferred brands? HIS and Sapphire?


----------



## Taint3dBulge

Quote:


> Originally Posted by *Raephen*
> 
> Glad to know it fits the PCB!
> 
> I'm kinsda eyeballing the 290 non X, to see what it brings to the table.
> 
> In my digging on the net, I already considered the Xtreme III as maybe a good replacement to the stock cooler.


It's a hell of a nice cooler. I have the Xtreme Plus II on my 6950 and saw a 30-degree swing between the factory cooler and this thing. I'll be getting an Xtreme III soon; I'll probably order it around the 15th. I'm going to do some studying up on the memory module coolers though, since the ones supplied are aluminum. I'm going to try to find some good copper ones, and also see if I can find VRM heatsinks. I kinda hope there will be a kit out soon for the 290X so you can just order the heatsinks themselves. Also, I was looking at Arctic Cooling's website and it only showed 12 memory heatsinks, when there are 16 modules. But the pictures from vettefan8 show all 16 modules have the coolers...


----------



## ABD EL HAMEED

Quote:


> Originally Posted by *Taint3dBulge*
> 
> It's a hell of a nice cooler. I have the Xtreme Plus II on my 6950 and saw a 30-degree swing between the factory cooler and this thing. I'll be getting an Xtreme III soon; I'll probably order it around the 15th. I'm going to do some studying up on the memory module coolers though, since the ones supplied are aluminum. I'm going to try to find some good copper ones, and also see if I can find VRM heatsinks. I kinda hope there will be a kit out soon for the 290X so you can just order the heatsinks themselves. Also, I was looking at Arctic Cooling's website and it only showed 12 memory heatsinks, when there are 16 modules. But the pictures from vettefan8 show all 16 modules have the coolers...


Isn't that cooler a bit heavy for the PCB, given it's held via only one bracket on the GPU core? Doesn't that mean it may cause PCB bending or something like that?


----------



## Taint3dBulge

You know what, I might just try this thing out... I'll have to see if it will work with the 290X first. But I wonder how much cooler the GPU would run with the Hybrid vs. the Xtreme III.


----------



## ABD EL HAMEED

Quote:


> Originally Posted by *Taint3dBulge*
> 
> You know what, I might just try this thing out... I'll have to see if it will work with the 290X first. But I wonder how much cooler the GPU would run with the Hybrid vs. the Xtreme III.


I think this would be a bit better


----------



## Taint3dBulge

Quote:


> Originally Posted by *ABD EL HAMEED*
> 
> Isn't that cooler a bit heavy for the PCB, given it's held via only one bracket on the GPU core? Doesn't that mean it may cause PCB bending or something like that?


There is no bending. You can see what it looks like in my avatar picture; just click it and it will show a full view. I do have a metal backplate that stiffens it.


----------



## esqueue

Quote:


> Originally Posted by *Deisun*
> 
> I probably sound real stupid here, but:
> 
> Where are guys ordering your R9 290X's from? I've had zero luck getting one ordered from Newegg. What am I missing here?


I got mine from Newegg. I used this site, http://www.nowinstock.net/computers/amdr9290x/, and enabled the alarm on the page to beep when it comes in stock. It refreshes every minute and will warn you once it's in, even in a background tab. Just make a Newegg account beforehand if you plan on getting it through them. I already had one but needed to recover my password and update the address. It took 25 minutes for me.
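For the curious, the kind of checker that site runs is simple to sketch: poll the product page once a minute and beep when the out-of-stock marker disappears. This is a minimal illustration only; the URL and marker string below are placeholders, not a real endpoint, so adjust them for whatever retailer page you're actually watching.

```python
import time
import urllib.request

# Hypothetical product page and out-of-stock wording -- substitute the real ones.
URL = "https://www.example.com/product/r9-290x"
MARKER = "OUT OF STOCK"

def page_shows_stock(html, marker=MARKER):
    """True when the out-of-stock marker is absent from the page text."""
    return marker not in html.upper()

def in_stock(url):
    """Fetch the page and check it for the marker."""
    html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "replace")
    return page_shows_stock(html)

def watch(url, interval_s=60):
    """Poll once per interval (nowinstock refreshes every minute) and beep on stock."""
    while not in_stock(url):
        time.sleep(interval_s)
    print("\aIn stock! Go:", url)
```

Keying off a marker string is fragile (retailers change their page wording), which is exactly why a maintained service like nowinstock is the easier route.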


----------



## vettefan8

Quote:


> Originally Posted by *Taint3dBulge*
> 
> It's a hell of a nice cooler. I have the Xtreme Plus II on my 6950 and saw a 30-degree swing between the factory cooler and this thing. I'll be getting an Xtreme III soon; I'll probably order it around the 15th. I'm going to do some studying up on the memory module coolers though, since the ones supplied are aluminum. I'm going to try to find some good copper ones, and also see if I can find VRM heatsinks. I kinda hope there will be a kit out soon for the 290X so you can just order the heatsinks themselves. Also, I was looking at Arctic Cooling's website and it only showed 12 memory heatsinks, when there are 16 modules. But the pictures from vettefan8 show all 16 modules have the coolers...


I had read that the kit only includes 12 memory heatsinks and that the 290x needs 16 so I ordered an 8 pack of additional heatsinks online. I see that someone used 4 of the long skinny heatsinks instead of ordering additional heatsinks. They spanned 2 long skinny heatsinks side by side to cover 2 of the memory chips in close proximity. You can see it at the top left of the picture included. I have also included a link of their cooler installation.
http://hardforum.com/showthread.php?p=1040332902#post1040332902


----------



## MIGhunter

What programs are people using to bench their cards? I was surprised that while playing FFXIV on max settings I was getting dips down into the 30 FPS range. I think I was averaging about 55, and this wasn't even a highly congested area.


----------



## Alexbo1101

Hm... Found another retailer who has some ASUS 290x's in-stock, so bye bye snailmail Sapphire!









If I'm lucky it should be here Monday or Tuesday!


----------



## maarten12100

Quote:


> Originally Posted by *GioV*
> 
> R9 290X crossfire review was posted today, seems like 2 cards need 780W. Great performance by the R9 290X when crossfired, it beat SLI Titans and SLI 780s in every benchmark.
> 
> http://www.hardocp.com/article/2013/11/01/amd_radeon_r9_290x_crossfire_video_card_review/1#.UnP3_nCkpH4


Luckily for us, power consumption will drop once the core is kept cooler, so it'll be about 250W per card once they're at a decent temp and that Delta fan is removed.


----------



## jerrolds

Quote:


> Originally Posted by *Taint3dBulge*
> 
> You know what, i might just try this thing out... Have to see if it will work witht he 290x first.. But i wonder how much cooler the gpu would run with the hybrid vs xtreme III


Please let us know how this goes. I've been flip-flopping between the Hybrid, the Xtreme III, and the Red Mod with an H60 or H80, but price, PCB support (bending), and modding are my reasons against them ATM. A Hybrid would probably be best outside of water cooling, but it's kinda pricey at $140ish; that, and you won't be able to remove the glued ramsinks easily.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Deisun*
> 
> Which are the preferred brands? HIS and Sapphire?


Many people are saying that for reference, it doesn't really matter.


----------



## maarten12100

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Many people are saying that for reference, it doesn't really matter.


Sapphire is the best board partner for AMD, but there is no such thing as an EVGA for AMD cards. (XFX used to be the choice for AMD, but they went to poo.)


----------



## FtW 420

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Many people are saying that for reference, it doesn't really matter.


Yep, reference cards are all the same reference design until they start changing revisions. A few posts have said the Elpida and Hynix memory on them clock about the same, so price and customer service of the manufacturer are the things to look at.


----------



## PillarOfAutumn

Thanks! I'm aiming for a Sapphire card, but I guess I will grab whichever I can.

btw, I found this:

http://www.neweggbusiness.com/product/product.aspx?item=9b-14-121-806

It looks like Asus has a DCUII version in the works, or soon to be released.


----------



## Arizonian

Quote:


> Originally Posted by *Deisun*
> 
> Which are the preferred brands? HIS and Sapphire?


Our club roster shows Sapphire cards are the preference among club members.
Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Thanks! I'm aiming for a Sapphire card, but I guess I will grab whichever I can.
> 
> btw, I found this:
> 
> http://www.neweggbusiness.com/product/product.aspx?item=9b-14-121-806
> 
> It looks like Asus has a DCUII version in the works, or soon for release.


When AMD lifts the restriction, the AIB partners will be able to release their non-reference versions all on the same day. This timeframe is slated for the latter part of November. Would be nice if we were surprised with an earlier release.


----------



## Taint3dBulge

Does anyone know if there are backplates available for these? I wonder if a backplate from a 6950 would fit... doubt it but hmmm.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Does anyone know if there are backplates available for these? I wonder if a backplate from a 6950 would fit... doubt it but hmmm.


EK has a backplate for it, but IDK if it's a universal one that will work with anything out there.


----------



## Mr357

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Does anyone know if there are backplates available for these? I wonder if a backplate from a 6950 would fit... doubt it but hmmm.


There's no way a 6950 backplate would fit properly. They're specialized according to the components on the back side of the PCB.


----------



## Dart06

I'm really hoping the 290s come non-reference right away...


----------



## sugarhell

I got something for you guys

www2.ati.com/drivers/beta/amd_catalyst_13.11_betaV9.exe

Hmm, one sec, now it's not working


----------



## Arizonian

Quote:


> Originally Posted by *Dart06*
> 
> I'm really hoping the 290s come non-reference right away...


That would be nice. However, looking at Newegg, the pictures we're seeing, and the leaks, they're most likely reference, and if they are, we'll probably have to wait at least another month or two before we see non-reference 290s.


----------



## Raephen

Quote:


> Originally Posted by *Taint3dBulge*
> 
> There is no bending. You can see what it looks like in my avatar picture; just click it and it will show a full view. I do have a metal backplate that stiffens it.


Nice backplate.

I might have to consider doing something like that for a 290 should I get a stock one.

That, or wait a bit for something like an Asus DCII version to come along.

The game I play most is Skyrim at 1080p. My 7870 GHz ed. seems more than capable for that game maxed out, but you never know what's over the horizon...

That, and my e-peen likes some lovin' once in a while









PS - While they might not really compare, my 7870 is cooled by Arctic's Twin Turbo II, and that doesn't even have the small backplate to distribute the load. It's really big and hefty compared to the Sapphire Dual-X stock cooler, but there's no flexing of the PCB, so I'm not too worried about that.


----------



## kot0005

Dammit, I knew I should have waited for the GTX 780Ti, its clocks are looking so good.


----------



## rdr09

Quote:


> Originally Posted by *kot0005*
> 
> Dammit, I knew I should have waited for the GTX 780Ti, its clocks are looking so good.


3GB for $600? Nah, thanks. Will they come with a higher amount of VRAM?


----------



## Taint3dBulge

Quote:


> Originally Posted by *Mr357*
> 
> There's no way a 6950 backplate would fit properly. They're specialized according to the components on the back side of the PCB.


I know that, but I'm willing to take it off the 6950 and look, lol, or at least eyeball it when both are side by side. I really hate the way bare PCBs look. Need a smooth cover.


----------



## Taint3dBulge

Quote:


> Originally Posted by *jerrolds*
> 
> Please let us know how this goes. I've been flip-flopping between the Hybrid, the Xtreme III, and the Red Mod with an H60 or H80, but price, PCB support (bending), and modding are my reasons against them ATM. A Hybrid would probably be best outside of water cooling, but it's kinda pricey at $140ish; that, and you won't be able to remove the glued ramsinks easily.


I emailed Arctic to see if it works with a 290X, and also asked if there are any new products coming out for the 290X...

But in the meantime, it looks like you can find the coolers for not a whole lot.

$85

http://www.rakuten.com/prod/arctic-accelero-accessories-accelero-hybrid/233968403.html?listingId=303402659

$105

http://www.newegg.com/Product/Product.aspx?Item=N82E16835186067&nm_mc=KNC-GoogleAdwords&cm_mmc=KNC-GoogleAdwords-_-pla-_-VGA+Cooling-_-N82E16835186067&gclid=CNKUpNDOxLoCFbFxQgodlHEACg


----------



## maarten12100

Quote:


> Originally Posted by *sugarhell*
> 
> I got something for you guys
> 
> www2.ati.com/drivers/beta/amd_catalyst_13.11_betaV9.exe
> 
> Hmm a sec now its not working


Bring it on! Oh, and once you find it again, please post it in the news section so everybody can enjoy it.


----------



## Neo_Morpheus

Quote:


> Originally Posted by *vettefan8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Taint3dBulge*
> 
> It's a hell of a nice cooler. I have the Xtreme Plus II on my 6950 and saw a 30-degree swing between the factory cooler and this thing. I'll be getting an Xtreme III soon; I'll probably order it around the 15th. I'm going to do some studying up on the memory module coolers though, since the ones supplied are aluminum. I'm going to try to find some good copper ones, and also see if I can find VRM heatsinks. I kinda hope there will be a kit out soon for the 290X so you can just order the heatsinks themselves. Also, I was looking at Arctic Cooling's website and it only showed 12 memory heatsinks, when there are 16 modules. But the pictures from vettefan8 show all 16 modules have the coolers...
> 
> 
> 
> I had read that the kit only includes 12 memory heatsinks and that the 290x needs 16 so I ordered an 8 pack of additional heatsinks online. I see that someone used 4 of the long skinny heatsinks instead of ordering additional heatsinks. They spanned 2 long skinny heatsinks side by side to cover 2 of the memory chips in close proximity. You can see it at the top left of the picture included. I have also included a link of their cooler installation.
> http://hardforum.com/showthread.php?p=1040332902#post1040332902

I had that same problem with the lack of VRM cooling on my Accelero Xtreme back in the 5000 series. I would wait for non-reference coolers.


----------



## skupples

Quote:


> Originally Posted by *kot0005*
> 
> Dammit, I knew I should have waited for the GTX 780Ti, its clocks are looking so good.


----------



## Arizonian

Quote:


> Originally Posted by *kot0005*
> 
> Dammit, I knew I should have waited for the GTX 780Ti, its clocks are looking so good.


Quote:


> Originally Posted by *rdr09*
> 
> 3GB for $600? nah thanks. will they come with higher amount of vram?


Not $600 but $700 for reference.









http://videocardz.com/47508/videocardz-nvidia-geforce-gtx-780-ti-2880-cuda-cores

However, let's keep this from turning into a 780 Ti discussion. The club thread has been hijacked enough times already, and most of us 290X owners don't want to sift through off-topic posts. The News section has a huge thread on it now if anyone is interested. I'm sure once it's officially released and reviewed we'll compare, but for now it's speculation and not relevant.

We're just days away from the 290 NDA lift, and driver 13.11 beta 9, which may contain that speculated performance boost, is just about upon us. Sugarhell posted a link that's not working ATM.

Have we confirmed 13.11 beta 8 is out in the wild?


----------



## $ilent

yes - http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx


----------



## VSG

The 13.11 beta 8 might well be the "performance boost" driver. Either way, I am very impressed by the rapid turnover of drivers from AMD in the past 3-4 months.


----------



## Paul17041993

As intensive as this thread is to read through compared to the other dozen I'm subbed to, I'm finding it a fun way to pass the time until I can get mine.


----------



## Arizonian

Nice.

Performance improvements
Batman: Arkham Origins - improves performance up to 35% with MSAA 8x enabled
Total War™: Rome 2 - improves performance up to 10%
Battlefield 3 - improves performance up to 10%
GRID 2 - improves performance up to 8.5%
DiRT Showdown - improves performance up to 10%
Formula 1™ 2013 - improves performance up to 8%
DiRT 3 - improves performance up to 7%
Sleeping Dogs - improves performance up to 5% 
Performance improvements for the AMD APU Series (comparing AMD Catalyst 13.11 Beta6 to AMD Catalyst 13.9)
Luxmark (openCL) - improves performance up to 10%
Winzip 17.5 (openCL) - improves performance up to 20%
GRID 2 - improves performance up to 4%
Battlefield 4 - improves performance up to 5%
Left 4 Dead - improves performance up to 10%


----------



## sugarhell

That's the old changelog.

Beta 8 fixes the crashes in BF4.


----------



## Arizonian

Quote:


> Originally Posted by *sugarhell*
> 
> That's the old changelog.
> 
> Beta 8 fixes the crashes in BF4.










not beta 9


----------



## sugarhell

Yeah, I found beta 9 but they pulled it. I don't know anything else; maybe they finally fixed the capital V.


----------



## OrcishMonkey

Going on water tonight =)


----------



## surrealalucard

Just ran through Fire Strike Extreme; here are the results, and I have to say I'm fairly happy. What are you guys getting?

MSI R9 290X, stock BIOS, Core: 1130, Mem: 1400, Fan Speed: 70%, Power Limit: 50%, all on air of course.
AMD FX-8350 OC'd to 4.5GHz

http://www.3dmark.com/3dm/1534021?


----------



## PillarOfAutumn

MSI 290x in stock:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127768


----------



## Raephen

Quote:


> Originally Posted by *Paul17041993*
> 
> As intensive as this thread is to read through compared to the other dozen I'm subbed to, I'm finding it a fun way to pass the time until I can get mine.


Hear, hear!


----------



## tsm106

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> MSI 290x in stock:
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127768


So tempted to get another, but I really don't need one for my current res. Must fight the urge!


----------



## VSG

There is a $25 off $250 code expiring tonight in the online deals forum to make this sweeter.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *tsm106*
> 
> So tempted to get another, but I really don't need one for my current res. Must fight the urge!


You Hobitses know you wants Preciousssss....all that power...that Preciousss can give you...just one click the hobitses needs to do...


----------



## Paul17041993

Quote:


> Originally Posted by *OrcishMonkey*
> 
> 
> Going on water tonight =)


hm, now I'm wondering if I should get acrylic on mine too when I get to water...


----------



## utnorris

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> MSI 290x in stock:
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127768


Thanks, just ordered my second to play with. I'll see which of the two is best and sell the lesser one... maybe.


----------



## maarten12100

Quote:


> Originally Posted by *tsm106*
> 
> So tempted to get another, but I really don't need one for my current res. Must fight the urge!


Must upgrade to a higher res to justify getting another 290X


----------



## utnorris

Quote:


> Originally Posted by *geggeg*
> 
> There is a $25 off $250 code expiring tonight in the online deals forum to make this sweeter.


Thanks, contacted them and got the $25 credit taken off. Now only $570 shipped. By the way, they sold out in probably less than 5 minutes. Mine is already being packed, so there is a good chance I will have it on Monday. Debating on ordering two of the full PCB water blocks or another just like mine. Decisions, decisions.


----------



## utnorris

Quote:


> Originally Posted by *maarten12100*
> 
> Must upgrade to a higher res to justify getting another 290X


I wasn't planning on running two, but after seeing HardOCP's CF review, I will probably run multi-monitors again or maybe just enjoy my single 27" 1440P monitor at ridiculous frame rates.


----------



## Arizonian

I know we already know this, but it's been confirmed.
Quote:


> Hi Arizonian
> 
> It has been confirmed: existing ARCTIC Accelero Hybrid and Xtreme III will both support the new AMD R9 290X factory reference layout VGA! The only minor thing you may need would be additional VR/RAM heatsinks but that was the easy part.
> 
> We hope this answers your question. Thank you for choosing ARCTIC products!
> 
> Best Regards
> 
> Eric Abellada
> Customer Service Team
> ARCTIC Inc.
> 14783 Clark Avenue | City of Industry, CA 91745 | USA
> T. (626) 961-5437
> E. [email protected]


----------



## jbottz

Received my Gigabyte R9-290X (BF4 edition) from Amazon today. $550.06 with free shipping. Will be going under water once I manage to track down somewhere that has full cover blocks in stock!


----------



## utnorris

FrozenCPU is supposed to get some Monday and I think PPC still has some. Don't forget the discount code for overclock.net.


----------



## jbottz

Quote:


> Originally Posted by *utnorris*
> 
> FrozenCPU is supposed to get some Monday and I think PPC still has some. Don't forget the discount code for overclock.net.


Thanks! Now that I know they're going to be in stock relatively soon, I went ahead and put an order in with FrozenCPU.


----------



## PillarOfAutumn

Yup, FCPU said that they were receiving a shipment today and will be filling back orders. They also have 2-3 more shipments scheduled for next week. He said if I ordered by today (I spoke to him yesterday, Thursday), I should have it by the end of next week or the beginning of the week after.


----------



## $ilent

There's ONE Sapphire 290X BF4 edition card in at OCUK, for UK buyers - http://www.overclockers.co.uk/showproduct.php?prodid=GX-335-SP&groupid=701&catid=56&subcat=1752

The price has gone up though: £515 for that card. Who said something about price gouging again for those of us with preorders?


----------



## th3illusiveman

Quote:


> Originally Posted by *rdr09*
> 
> i just ordered the Acrylic. Should I call them and ask to replace with Acetal?
> 
> edit: I'll call them as soon as they open. hopefully the CS is in a good mood.
> 
> Update: Got it replaced to Acetal.


Double account?


----------



## Arizonian

Quote:


> Originally Posted by *jbottz*
> 
> Received my Gigabyte R9-290X (BF4 edition) from Amazon today. $550.06 with free shipping. Will be going under water once I manage to track down somewhere that has full cover blocks in stock!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats and added


----------



## Taint3dBulge

Quote:


> Originally Posted by *Arizonian*
> 
> I know we already know this, but it's been confirmed.


Quote:


> ARCTIC Accelero Hybrid and Xtreme III will both support the new AMD R9 290X factory reference layout VGA! The only minor thing you may need would be additional VR/RAM heatsinks but that was the easy part.


$85

HERE

HERE

I don't like using anything but Newegg, and guess what, it's an Iron Egg item, so they will match the price.

IronEggItem


----------



## Sgt Bilko

Ok, I've got my Accelero Xtreme III on and working properly now; idle temp is 36 degrees.

Just finished a Furmark burn-in test and the max temp was 85 degrees............I need to get water on this


----------



## Roaches

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ok, I've got my Accelero Xtreme III on and working properly now, Idle Temp is 36 degrees
> 
> Just finished a Furmark Burn in test and max temp was 85 degrees............I need to get water on this


Try a more realistic scenario like light and heavy gaming; that's where temps matter most...

My 2 GTX 680s would hit around low to mid 70 degrees Celsius in a 15-minute Furmark burn test.


----------



## $ilent

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ok, I've got my Accelero Xtreme III on and working properly now, Idle Temp is 36 degrees
> 
> Just finished a Furmark Burn in test and max temp was 85 degrees............I need to get water on this


Ouch, that's disappointing


----------



## Sgt Bilko

Quote:


> Originally Posted by *Roaches*
> 
> Try a more realistic scenario like light and heavy gaming, thats where temps matter most....
> 
> My 2 GTX 680s would hit around low to mid 70 degrees Celsius in Furmark 15 minute burn test


Yeah, I'm going to play through the BF4 campaign today, so I'll see what it's like.

Can the VRM temp meter in GPU-Z be trusted?


----------



## Roaches

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah, i'm going to play through the BF4 campaign today so i'll see what it's like.
> 
> Can the VRM temp meter in GPU-Z be trusted?


I trust it. It can hit a little over 45 degrees Celsius when running MSI Kombustor, which seems realistic to me, though I haven't tried my other GPUs yet to see their temps...


----------



## PillarOfAutumn

Any idea when the non-BF4 cards will be released? I already have BF4, so there's no need to spend an additional $30 on the card.


----------



## $ilent

The non-BF4 290Xs are already out; they just go out of stock as soon as they come in.


----------



## th3illusiveman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ok, I've got my Accelero Xtreme III on and working properly now, Idle Temp is 36 degrees
> 
> Just finished a Furmark Burn in test and max temp was 85 degrees............I need to get water on this


*Check your VRM temperatures*. Those coolers have very poor VRM cooling, worse than stock even.

Buuuut, your core temp should be way lower than that; I think you messed up your installation.


----------



## Tobiman

Yeah, from what I've read, it takes a while for the thermal paste to settle and perform optimally.


----------



## selk22

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ok, I've got my Accelero Xtreme III on and working properly now, Idle Temp is 36 degrees
> 
> Just finished a Furmark Burn in test and max temp was 85 degrees............I need to get water on this






Well, my reference GTX 580 would get up to around 80-82°C on a burn test like that, but gaming temps were 60-75°C

I wouldn't be worried about it hurting the card, but those certainly are not OC-friendly temps


----------



## Sgt Bilko

Running the BF4 campaign, the temp is 61 degrees max.

VRMs are running at 50 degrees.

And I've left the compound to set overnight (10-12 hours), so I shouldn't have issues there.

Think I'm just going to go for a water loop in the near future.


----------



## bencher

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ok, I've got my Accelero Xtreme III on and working properly now, Idle Temp is 36 degrees
> 
> Just finished a Furmark Burn in test and max temp was 85 degrees............I need to get water on this


Why are you testing Furmark?

What are your temps in a game?


----------



## Arm3nian

Furmark is useless for basically everything. You're never going to see those temperatures in anything else, and it provides no stability testing.


----------



## beejay

R9 290X on Arctic Accelero Hybrid

Here is my Validation
http://www.techpowerup.com/gpuz/62hus/

Brand is a Powercolor OC version
Cooler is an Accelero Hybrid
core: 1100MHz
mem: 1500MHz (6000MHz effective)
voltage: 1.25V

And the images
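As an aside for anyone puzzled by the two memory figures above: GDDR5 transfers data four times per clock, so the marketed "effective" speed is simply the actual memory clock times four. A trivial sketch, purely illustrative:

```python
def gddr5_effective_mhz(actual_mhz):
    # GDDR5 moves data four times per memory clock cycle (double data
    # rate on a doubled write clock), hence the 4x "effective" figure
    return actual_mhz * 4

# 1500 MHz actual works out to 6000 MHz effective, as listed above
```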


----------



## selk22

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *beejay*
> 
> R9 290X
> 
> Here is my Validation
> 
> http://www.techpowerup.com/gpuz/33ugc/
> 
> Brand is a Powercolor OC version
> 
> Cooler is an Accelero Hybrid
> 
> I'll post pictures next time ^__^






How are temps on the Hybrid?


----------



## Taint3dBulge

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ok, I've got my Accelero Xtreme III on and working properly now, Idle Temp is 36 degrees
> 
> Just finished a Furmark Burn in test and max temp was 85 degrees............I need to get water on this


There has to be an issue, or your cooler isn't fully seated. What thermal paste are you using, the stuff that came with it? Wouldn't hurt to take it off, clean it up, and put some better paste on. I saw a 4°C drop on load going from the stock stuff to IC Diamond 24. Could you try this? Because I'm up in the air between using this or the Hybrid... might do the Hybrid since I can get it for 85 bucks.

Quote:


> Originally Posted by *beejay*
> 
> R9 290X
> 
> Here is my Validation
> 
> http://www.techpowerup.com/gpuz/33ugc/
> 
> Brand is a Powercolor OC version
> 
> Cooler is an Accelero Hybrid
> 
> I'll post pictures next time ^__^


Post some smexy pics







Also, what do your temps look like running Furmark and gaming?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Taint3dBulge*
> 
> There has to be an issue, or your cooler isnt full seated.. What thermal paste you using, the stuff that came on it... Wouldnt hurt to takeit off clean that off of it and put some better paste on.. I saw a 4c drop on load going from the stuff that was on it to ic diamond 24.... Could you try this? Cause im up in the air with using this or the hybrid...... Might do hybrid since i can get it for 85 bucks.........
> Post some smexy pics
> 
> 
> 
> 
> 
> 
> 
> Also whats your temps look like runing furmark and gaming?


Cooler is seated. Don't think I used enough paste on the die (Arctic Silver 5).

It was a pea-sized drop; now I'm starting to think I should have used a bit more...


----------



## psyside

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Running the BF4 campaign the temp is 61 degrees max
> 
> VRM's are running at 50 degrees.
> 
> And i've left the compound to set overnight (10-12 hours) so i shouldn't have issues there.
> 
> Think i'm just going to go a for water loop in the near future.


Sooooo, let me get this straight. You got a 34°C drop, and you're not satisfied? Great


----------



## Sgt Bilko

Quote:


> Originally Posted by *psyside*
> 
> Sooooo, let me get this straight. You got 34c drop, and your not satisfied? great


Not satisfied because the cooler + paste + additional heatsinks cost over $100.

And even with the stock cooler set to 50% I wasn't getting over 70 degrees..........and I have this cooler set to 100% all the time.


----------



## Arizonian

Quote:


> Originally Posted by *beejay*
> 
> R9 290X
> 
> Here is my Validation
> 
> http://www.techpowerup.com/gpuz/33ugc/
> 
> Brand is a Powercolor OC version
> 
> Cooler is an Accelero Hybrid
> 
> I'll post pictures next time ^__^


That GPU-Z link worked for me, but I'd still love to see pics. Edit your original post to include it if you'd like it linked from the roster.

Congrats Added


----------



## th3illusiveman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Not satisfied because the cooler + paste + additional Heatsinks cost over $100.
> 
> And even with the stock cooler set to 50% i wasn't getting over 70 degrees..........and i have this cooler set to 100% all the time.


Yeah, but it's silent; my Accelero at 100% is quieter than the stock cooler at idle.


----------



## Forceman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Can the VRM temp meter in GPU-Z be trusted?


Supposedly AIDA and HWiNFO can also read the VRM temps - maybe compare those three and see if they all agree? It'd be nice to know which ones can be trusted.


----------



## DampMonkey

Quote:


> Originally Posted by *Forceman*
> 
> Supposedly Aida and HWInfo can read the VRM temps - maybe compare those three and see if they all agree? Be nice to know which ones can be trusted.


gpu-z can read vrm temps


----------



## Forceman

Quote:


> Originally Posted by *DampMonkey*
> 
> gpu-z can read vrm temps


Which is why I'd like to see someone compare all the programs that can read them and see if they all agree on the temps. If they all have different temps then you don't know which ones are actually correct (or if any of them are).
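If someone does run that comparison, the bookkeeping is trivial: log one VRM reading per tool at the same load point and check the spread. A throwaway sketch (tool names and temperatures here are placeholders, not real measurements):

```python
def sensors_agree(readings, tolerance_c=3.0):
    """readings maps tool name -> VRM temp in degrees C.
    Returns (agree, spread): agree is True when the hottest and coldest
    readings are within tolerance_c of each other."""
    spread = max(readings.values()) - min(readings.values())
    return spread <= tolerance_c, spread

# Hypothetical readings logged during the same Kombustor run:
agree, spread = sensors_agree({"GPU-Z": 71.0, "HWiNFO": 72.5, "AIDA64": 70.5})
```

Agreement within a couple of degrees still wouldn't prove all three are right (they may poll the same controller), but a large disagreement would flag at least one tool as misreading.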


----------



## Taint3dBulge

Quote:


> Originally Posted by *beejay*
> 
> R9 290X
> 
> Here is my Validation
> 
> http://www.techpowerup.com/gpuz/33ugc/
> 
> Brand is a Powercolor OC version
> 
> Cooler is an Accelero Hybrid
> 
> I'll post pictures next time ^__^


What are your temps like, GPU and VRMs, in-game and in Furmark? And yes, post some pics... love to see them.


----------



## alancsalt

AS5 has a break-in period (longer than most). Temps should improve further after a week (?). It should be written on the packaging, or Google would find it.


----------



## Paul17041993

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ok, I've got my Accelero Xtreme III on and working properly now, Idle Temp is 36 degrees
> 
> Just finished a Furmark Burn in test and max temp was 85 degrees............I need to get water on this


If the fans aren't making any noise, they are too slow...


----------



## RAFFY

Quote:


> Originally Posted by *alancsalt*
> 
> AS5 has a break in period (longer than most). Temps should have a further improvement after a week (?) Should be written on the packaging, or Google would find it..


+1. Last time I used AS5, I saw a 5-7°C drop after its break-in period. Great product.

Add me to the list please. I'll have my new build Monday and will post my proof then. 2 x ASUS R9 290X


----------



## Sgt Bilko

Quote:


> Originally Posted by *alancsalt*
> 
> AS5 has a break in period (longer than most). Temps should have a further improvement after a week (?) Should be written on the packaging, or Google would find it..


Didn't have any packaging unfortunately, just the tube, and not much written on it. Thanks for the info; I'll report back in a week if the temps drop a bit more.

Quote:


> Originally Posted by *Paul17041993*
> 
> if the fans aren't making any noise, they are too slow...


I can't set it any higher than 100%, and I use a headset anyway.

I probably am complaining a bit too much about the Accelero Xtreme III.
It's a great option for those who have a reference 290X and don't want to watercool.
It's near silent, lowers the temp by a good amount, and is easy to install.

The downsides are the length (32cm from the back to the end of the heatpipes), the amount of space it takes up (3 slots), and that it only includes 12 VRAM heatsinks when you need 16.

I don't want to turn anyone off buying this; it's a great cooler even though it has some downsides.


----------



## PillarOfAutumn

Wait... aren't you not supposed to use AS5? It's slightly capacitive.


----------



## Paul17041993

Quote:


> Originally Posted by *Sgt Bilko*
> 
> didn't have any packaging unfortunately, just the tube and not much written on it. Thanks for that info, i'll report back in a week if the temps drop a bit more.
> i can't set it any higher than 100% and i use a headset anyway.
> 
> I probably am complaining a bit too much about the Accelero Xtreme III,
> It's great option for those that have a ref 290x and don't want to watercool.
> It's near silent, lowers the temp by a good amount and is easy to install.
> 
> The downsides are the length (32cm from back to end of the heatpipes), the amount of space it takes up (3 slots). and it only includes 12 Vram heatsinks when you need 16
> 
> I don't want to turn anyone off buying this, it's great cooler even though it has some downsides.


I think it's just because you're using the stock fans; AC doesn't design them for high performance, so it's really only a quiet alternative to the stock cooler...

If you wanted, you could try grabbing 2 or 3 high-power PWM fans and some means of mounting them (a handful of zip ties might be enough), strip the stock fans off the cooler and fit the new ones, hook them up to the PSU, and connect their PWM signal to the card's controller in some way, and enjoy a massive cooling curve. But otherwise, water is always going to have better results...


----------



## TacticalAce42

Why does my 290X core clock not increase when my temperature is lower than 95°C?


----------



## raghu78

Quote:


> Originally Posted by *TacticalAce42*
> 
> Why do my 290x core clock not increase when my temperature was lower then 95c?


Max out power control and push the fan speed to 75%. These are essential for hitting 1100MHz on the stock cooler.


----------



## Arizonian

Quote:


> Originally Posted by *TacticalAce42*
> 
> Why do my 290x core clock not increase when my temperature was lower then 95c?


Quote:


> Originally Posted by *raghu78*
> 
> max out power control and push fan speed to 75% . these are essential for hitting 1100 mhz on stock cooler.


A manual fan setting where 70% = max will keep temps under 83-84°C and keep 1100 Core / 1300 Mem stable in BF4 without any downclocking.

Uber mode will max the fan at 55%; temps will hit 92°C and it will downclock to 1058MHz Core / 1300 Mem.



_The straight line was gaming; the first big drop is where I turned the game off. Steady all the way, except when I inadvertently went to the options screen for a brief second, which caused a brief dip._


----------



## beejay

Quote:


> Originally Posted by *Taint3dBulge*
> 
> There has to be an issue, or your cooler isnt full seated.. What thermal paste you using, the stuff that came on it... Wouldnt hurt to takeit off clean that off of it and put some better paste on.. I saw a 4c drop on load going from the stuff that was on it to ic diamond 24.... Could you try this? Cause im up in the air with using this or the hybrid...... Might do hybrid since i can get it for 85 bucks.........
> Post some smexy pics
> 
> 
> 
> 
> 
> 
> 
> Also whats your temps look like runing furmark and gaming?


Haven't gotten around to checking the thermals yet; still waiting for the sinks to cure. Sexy pics and thermals by Monday night; I'm leaving for a camping trip today.


----------



## TacticalAce42

So do I just increase the fan speed or do I add power as well?


----------



## Arizonian

Quote:


> Originally Posted by *TacticalAce42*
> 
> So do I just increase the fan speed or do I add power as well?


Both. Set manual fan speeds yourself. 50% power limit is what I used.









By 80°C, make sure you hit 70% fan speed in the fan settings. I also raised it to 100% at 90°C, but it never got that far.
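The settings described here amount to a piecewise-linear fan curve, which utilities like Afterburner or Trixx interpolate between the points you set. A sketch of that interpolation (the 40°C/30% idle node is my assumption; the 80°C/70% and 90°C/100% nodes are the ones from this post):

```python
def fan_percent(temp_c, curve=((40, 30), (80, 70), (90, 100))):
    """Linearly interpolate a fan speed (%) from (temp_c, percent) nodes.
    Below the first node the speed clamps low; above the last, high."""
    if temp_c <= curve[0][0]:
        return float(curve[0][1])
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # straight line between the two surrounding nodes
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return float(curve[-1][1])
```

So at 85°C this curve would ask for 85% fan; the point of the steep 80-90°C segment is to catch the card before it reaches the 94°C throttle point.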


----------



## TacticalAce42

Thank you. I RMA'd mine on suspicion of it being defective; I should see a turnaround in 2 weeks, hopefully.


----------



## Forceman

Quote:


> Originally Posted by *TacticalAce42*
> 
> Why do my 290x core clock not increase when my temperature was lower then 95c?


It won't automatically increase at any temp. It'll decrease when the temp hits 94°C, but it starts at 1000MHz (or whatever you have set) and goes down. It's the opposite of the way Nvidia runs their boost, where it starts at the base clock and goes up.
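That behavior can be sketched as a toy model: the card holds whatever clock you set until the temperature target is reached, then sheds clock. The 1%-per-degree slope below is an invented illustration, not AMD's actual PowerTune algorithm:

```python
def r290x_clock_mhz(set_mhz, temp_c, temp_target=94):
    """Toy model of the behavior described above: full set clock below
    the temperature target, progressively lower above it."""
    if temp_c < temp_target:
        return set_mhz          # runs at the configured clock
    over = temp_c - temp_target
    # shed 1% of the set clock per degree over target (illustrative slope)
    return max(0.0, set_mhz * (1 - 0.01 * over))
```

An Nvidia-style boost would be the mirror image: start from the base clock and add bins while thermal and power headroom remain.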


----------



## Sgt Bilko

Is anyone else crashing in BF4 campaign? in the Singapore mission mostly.


----------



## SonDa5

Got an EK block today!


----------



## Sazz

Quote:


> Originally Posted by *SonDa5*
> 
> Got an EK block today!


Got mine too, but sadly my 290X's DisplayPort stopped outputting a display, so I had to RMA it and just sent it out today. Hopefully Newegg does a replacement instead of a refund T_T


----------



## Forceman

Quote:


> Originally Posted by *Sazz*
> 
> Got mine too, but sadly my 290X's display port stopped showing display so I had to RMA it and just sent it out today.. hopefully newegg do a replacement instead of a refund T_T


Sent mine back this week and they just gave me a refund today instead of replacing it. So unless they have stock I think they'll automatically make it a refund. Now I have to play the F5 game again. Unless Tuesday rolls around first and I just grab a launch day 290 instead.

Interesting that the two 290X cards with model numbers that indicate non-reference (the Asus DCII and the Gigabyte Windforce) both now say Discontinued on Newegg.


----------



## Sazz

Quote:


> Originally Posted by *Forceman*
> 
> Sent mine back this week and they just gave me a refund today instead of replacing it. So unless they have stock I think they'll automatically make it a refund. Now I have to play the F5 game again. Unless Tuesday rolls around first and I just grab a launch day 290 instead.
> 
> Interesting that the two 290X cards with model numbers that indicate non-reference (the Asus DCII and the Gigabyte Windforce) both now say Discontinued on Newegg.


They do that: they mark it as "Discontinued," but when they get it in stock, that's when they put it back up. That's how it was back during the HD 6000-series days.

They will receive my return on Monday; hopefully they get some stock by then. I already used the BF4 code, but the representative said I don't have to worry about it, that I'm not obligated to return it, and that I won't be charged for it. I saved all my chat logs with them just in case they end up doing a refund and charging me for the BF4 game. Didn't really want the game anyways, it was just free xD

Anyways, anyone got an idea when the regular 290X comes out? I may just have to wait for those if I do end up getting a refund.


----------



## pioneerisloud

I'm kind of curious which models you guys who have to return them bought. Seems like every day I keep seeing people returning these cards in this thread. It's really turning me off buying 2 in January, lol. Even though I know they should have them all ironed out by then anyway.


----------



## OrcishMonkey

Crossfire on water. I'll flash 'em and bench 'em tomorrow and report my findings =)


----------



## Sazz

Quote:


> Originally Posted by *pioneerisloud*
> 
> I'm kind of curious to know which models you guys that have to return them have bought. Seems like every day I keep seeing people returning those cards in this thread. It's really turning me off buying 2 in January, lol. Even though I know they should have them all ironed out by then anyway.


It happens; I mean, mine is working except for the fact that the DisplayPort stopped showing a display. Probably just needs something fixed on the port itself, maybe some bad soldering? But the card performed great even with the drivers still in their embryonic stages.


----------



## icemanjkh

Could anyone (especially those who upgraded from a 7970) give me some advice on this situation:

I have a single-monitor setup, and I'd like to play BF4 at [email protected]+fps on High/Ultra (or [email protected] on High at the least).
(Second monitor - 2407WFP - is for web browser/movies,etc)

A Gigabyte R290X costs $700 (AUD) here in Australia.
Until Monday, I can buy an MSI 7970 Twin Frozr OC Boost Edition for $300 (AUD). (brand new retail)

Obviously the 290X is an awesome card, yet it's quite a lot of money (and more than double the 7970's special price).
I realise that the 7970 is "old" generation now (aka 280X), however what are people's thoughts on bang-for-buck at its $300 price tag?
Should I simply go for the super deal on the 7970, or pay double and get the R9 290X?

I wasn't sure if this was the right place to ask such a question, but depending on how the replies turn out, I might become an R9 290X owner









Thanks.


----------



## Arizonian

Quote:


> Originally Posted by *OrcishMonkey*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Crossfire on water, ill flash em and bench em tomorrow and report my findings =)


Congrats - looks good. Added









Quote:


> Originally Posted by *SonDa5*
> 
> Got an EK block today!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Updated to water cooling


----------



## Forceman

Quote:


> Originally Posted by *Sazz*
> 
> They do that, put it as "discontinued" but when they get it in stock thats when they put it back up, that's how it is back then during the HD6*** times.
> 
> They will receive my return on monday, hopefully they get some stock by then, I already used the BF4 game but the representative says that I don't have to worry about it and I shouldn't be obligated to return it and nor will I be charged for it. I saved all my chat lags with them just in case they end up doing it as a refund and charging me for the BF4 game, didn't really want the game anyways it was just free xD
> 
> Anyways anyone got an idea when the regular 290X comes out? I may just have to wait for those if I do end up getting a refund.


Yeah, they gave me a full refund even though I used the BF4 coupon also. And the regular 290 (non-X) comes out on Tuesday.


----------



## Moustache

Quote:


> Originally Posted by *OrcishMonkey*
> 
> 
> 
> Crossfire on water, ill flash em and bench em tomorrow and report my findings =)


finally!


----------



## Paul17041993

Quote:


> Originally Posted by *OrcishMonkey*
> 
> 
> 
> Crossfire on water, ill flash em and bench em tomorrow and report my findings =)


purdy, I'm guessing you got UV blue dye/coolant?


----------



## utnorris

I got my EK water block yesterday. Halfway through getting it installed I had to do some wife maintenance, so I will finish the install this morning or this afternoon. Hopefully I will see an increase in my scores due to the lower temps and less throttling. I haven't decided if I will flash it or just wait on AB's voltage control. On a side note, the last beta drivers seemed buggy to me; anyone else feel that way?


----------



## TheSoldiet

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Is anyone else crashing in BF4 campaign? in the Singapore mission mostly.


I also crashed in that mission, but only if I died. So don't die ;-)


----------



## wthenshaw

Does anyone here know if the Arctic Coming Accelero Xtreme 3 will fit the 290X (gigabyte)? Thanks


----------



## Sgt Bilko

Quote:


> Originally Posted by *wthenshaw*
> 
> Does anyone here know if the Arctic Coming Accelero Xtreme 3 will fit the 290X (gigabyte)? Thanks


The phrase "Winter is Coming" just popped into my head after that









And yes, it does fit.......been discussed several times in this thread.


----------



## Arizonian

Quote:


> Originally Posted by *wthenshaw*
> 
> Does anyone here know if the Arctic Coming Accelero Xtreme 3 will fit the 290X (gigabyte)? Thanks


Do you mean the Arctic Accelero Xtreme III? Not sure what you meant by 'Coming'?

Anyway, if it was a typo, then yes, it has been confirmed. Two members have already put it on; proof at the link below.

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/2240#post_21107489

You will need a package of additional VRAM heatsinks. The Arctic Hybrid will also fit; all the reference cards share the same layout across vendors.


----------



## Dominican

What program is everyone using to overclock?


----------



## maarten12100

Quote:


> Originally Posted by *Dominican*
> 
> what program is everyone using to overclock.


Asus GPU Tweak, since MSI Afterburner ain't updated yet.


----------



## overclockFrance

I found the reason the display blurred as soon as I increased the GPU voltage above around 1.225V on my HIS 290X card: the EK copper waterblock was not in contact with a few of the memory modules. I saw that when I removed it. I reinstalled the stock cooler, set the GPU voltage at 1.24V, and had no problem.

Is my waterblock faulty? Has anyone else with the same waterblock encountered the same problem?


----------



## kcuestag

Quote:


> Originally Posted by *overclockFrance*
> 
> I found the reason of the display blurred as soon as I increased the GPU voltage above around 1.225 V on my HIS 290X card : the EK copper waterblock was not in contact with a few memory modules. I saw that when I removed it. I reinstalled the stock cooler, set the GPU voltage at 1.24 V and no problem.
> 
> Is my waterblock faulty ? Did another one with the same waterblock encounter the same problem ?


Make sure you use the correct thermal pads included; there are two types, 0.5mm and 1.0mm versions, if I remember correctly. Check the instruction sheet.

Also, to those watercooling the 290X, can you check whether your VRM1 and VRM2 sensors in GPU-Z read differently? One of them runs like 10°C cooler for me; not sure if that's normal.


----------



## $ilent

Well, my PC just randomly shut down after messing with AMD OverDrive. I tried to swap to Quiet mode after a reboot and timed out from a BF4 server.

What happens if I set the BIOS to Uber mode but put max fan speed at, say, 35% with a temp target of 95°C? Will it just downclock itself? The fan is getting stupidly loud when playing BF4.. Christ, I can't even hear myself think.


----------



## scramz

Just showing my face as I am on the 3DMark 11 leaderboard









i7 3770k @ 4.8GHz
R9 290x @ 1262/1500

Score P15864
Physics Score 11136
Combined Score 10153
Graphics Score 18884

11.13 Beta V7 drivers
Win 7

http://www.3dmark.com/3dm11/7414930

http://www.techpowerup.com/gpuz/dvsku/

http://valid.canardpc.com/4ea55i



Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!







Here's my beauty!


Spoiler: Warning: Spoiler!







This may not have the best overall score due to the CPU, but check out the GFX score; I believe it beat the No. 1 card in that thread


----------



## Koniakki

Quote:


> Originally Posted by *scramz*
> 
> Just showing my face as I am on the 3dmark 11 leader board
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i7 3770k @ 4.8GHz
> R9 290x @ 1262/1500
> 
> Score P15864
> Physics Score 11136
> Combined Score 10153
> Graphics Score 18884
> 
> 11.13 Beta V7 drivers
> Win 7
> 
> http://www.3dmark.com/3dm11/7414930
> 
> http://www.techpowerup.com/gpuz/dvsku/
> 
> http://valid.canardpc.com/4ea55i
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Here's my beauty!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> This may not have the best overall score due to the CPU, but check out the GFX score, I believe it beat the No.1 one card in that thread


Great score. Especially the GPU Score.









One thing, though, that seems weird: my [email protected] was getting about a 13K Physics score. I had my [email protected] 11-12-11-24 1T, but I don't think that would yield me 2K over your Physics score.

Could it be that your CPU was downclocking or something? 11K for a [email protected] doesn't seem normal to me.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Forceman*
> 
> Sent mine back this week and they just gave me a refund today instead of replacing it. So unless they have stock I think they'll automatically make it a refund. Now I have to play the F5 game again. Unless Tuesday rolls around first and I just grab a launch day 290 instead.
> 
> Interesting that the two 290X cards with model numbers that indicate non-reference (the Asus DCII and the Gigabyte Windforce) both now say Discontinued on Newegg.


Did they do their usual BS of charging a 15% restocking fee?


----------



## scramz

Quote:


> Originally Posted by *Koniakki*
> 
> Great score. Especially the GPU Score.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One thing tho as it seems weird. My [email protected] was getting about 13K Physics score. I had my [email protected] 11-12-11-24 1T but I don't think that would yield me 2k over your Physics score.
> 
> Could it be that your CPU was downclocking or something? 11K for a [email protected] I don't think is normal.


As far as I am aware it is not downclocking. I don't know what to suggest.


----------



## alancsalt

Not having your rig in your sig, I can't see your RAM, but fast RAM can help your physics score. Tightening the timings can help a little too. The first thing is that the more stable your CPU overclock, the better. Once that is finalized, then get the RAM as good as possible....


----------



## scramz

Quote:


> Originally Posted by *alancsalt*
> 
> Not having your rig in your sig, I can't see your ram, but fast ram can help your physics score. Tightening the timings can help a little too. The first thing is the stabler your CPU overclock the better. Once that is finalized, then get the ram as good as possible....


I'm fairly new to this overclocking stuff. My CPU is stable at 4.8; I've done many, many tests for this. I guess it is my RAM. Or I just need to get an Extreme Edition CPU, lol.

I just noticed that with my graphics score, with a better CPU I would be about 7th in the 3DMark hall of fame


----------



## glina

Quote:


> Originally Posted by *Koniakki*
> 
> Great score. Especially the GPU Score.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One thing tho as it seems weird. My [email protected] was getting about 13K Physics score. I had my [email protected] 11-12-11-24 1T but I don't think that would yield me 2k over your Physics score.
> 
> Could it be that your CPU was downclocking or something? 11K for a [email protected] I don't think is normal.


I got 12378 points with a 3770K @ 4.88 and RAM at 2040MHz 9-11-11-31 1T.

Apparently the RAM clock IS making a significant difference.


----------



## cjp4eva

Quote:


> Originally Posted by *OrcishMonkey*
> 
> 
> 
> Crossfire on water, ill flash em and bench em tomorrow and report my findings =)


NICE!







OK, question time. Is that Mayhems blueberry coolant? What size tubing is that? The reason I'm asking is that that's exactly what I want my WC loop to look like, ESPECIALLY the color of the coolant.


----------



## kcuestag

Quote:


> Originally Posted by *cjp4eva*
> 
> NICE!
> 
> 
> 
> 
> 
> 
> 
> , OK question time. Is that Mayhems bluberry coolant? What size tubing is that? The reason im asking is because thats exactly what i want my wc loop to look like, SPECIALLY the color of the coolant.


Tubing looks like 16/10 or 16/11mm, and I'm not sure about the coolant, but it definitely looks like Pastel.


----------



## utnorris

Quote:


> Originally Posted by *kcuestag*
> 
> Make sure you use the correct thermal pads included, there are two types, 0.5mm and 1.0mm versions if I remember correctly. Check the instructions paper.
> 
> Also, to those watercooling the 290X, can you check if your VRM1 and VRM2 sensors on GPU-z are different? Because one of them runs like 10ºC cooler for me, not sure if normal.


I am using the EK short block with a backplate, and I'm noticing the backplate is actually getting warm - not hot, but warm - so I may pull my card and check how well the memory is making contact with the block. I have had an EK block before not make good contact on some of the memory modules and had to use thicker pads for those.

To your VRM question, yes, I am showing two, and one goes up in temp while the other barely moves. It would be nice to know what those are tied to.
Quote:


> Originally Posted by *$ilent*
> 
> Well my pc just randomly shut down just now after messing with amd overdrive. I tried to swap to quiet mode after a reboot and timed out from a\ bf4 server.
> 
> What happens if i set bios to uber mode but put max fan speed at say 35% with temp target of 95C? Will it just downclock itself? the fan is getting stupidly loud when playing BF4..Christ I cant even hear myself think.


I had this happen too, where it just froze for no apparent reason. I am thinking there are some software conflicts between the latest beta drivers and AIDA, since if I shut down AIDA it appears not to do it. As far as Uber mode goes, I believe the only difference between it and Quiet is the fan profile, which means if you switch to it and lower the fan speed it will just downclock.
Quote:


> Originally Posted by *scramz*
> 
> Just showing my face as I am on the 3dmark 11 leader board
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i7 3770k @ 4.8GHz
> R9 290x @ 1262/1500
> 
> Score P15864
> Physics Score 11136
> Combined Score 10153
> Graphics Score 18884
> 
> 11.13 Beta V7 drivers
> Win 7
> 
> http://www.3dmark.com/3dm11/7414930
> 
> http://www.techpowerup.com/gpuz/dvsku/
> 
> http://valid.canardpc.com/4ea55i
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Here's my beauty!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> This may not have the best overall score due to the CPU, but check out the GFX score, I believe it beat the No.1 one card in that thread


Nice score, but I believe the best GPU score so far under water has been by DampMonkey. These cards are just begging for better cooling. I have mine up and running and it no longer throttles while benching. My max temp on the GPU has been 48°C and the VRM was 56°C. I also noticed that some of the artifacting that was happening at 1110MHz/1475MHz has stopped, which might be due to the lower temps. I am waiting for AB to allow voltage control, but if that does not happen soon I will flash the BIOS. I forgot to check which memory chips mine has, but if I take the block back off I will check. My ASIC score is 78% and my default voltage is 1.14v, so I am pretty sure I have a lot of headroom for overclocking, but we will see.


----------



## wthenshaw

Quote:


> Originally Posted by *Arizonian*
> 
> Do you mean the Arctic Accelero Extreme III? Not sure what you meant by 'Coming'?
> 
> Anyway if it was a type-o then yes it has been confirmed. Two members have already put it on too . Proof on below link.
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/2240#post_21107489
> 
> You will need a package of additional VRAM heatsinks. The Arctic hybrid will also fit. All the reference cards have the same layout by different vendors.


Ah, sorry, my phone autocorrected me as I was typing; I should have proofread it in the first place.

Thanks for the help!


----------



## DampMonkey

Quote:


> Originally Posted by *scramz*
> 
> Just showing my face as I am on the 3dmark 11 leader board
> 
> i7 3770k @ 4.8GHz
> R9 290x @ 1262/1500
> 
> Score P15864
> Physics Score 11136
> Combined Score 10153
> Graphics Score 18884
> 
> 11.13 Beta V7 drivers
> Win 7
> 
> http://www.3dmark.com/3dm11/7414930
> 
> http://www.techpowerup.com/gpuz/dvsku/
> 
> http://valid.canardpc.com/4ea55i
> 
> 
> Here's my beauty!
> 
> This may not have the best overall score due to the CPU, but check out the GFX score; I believe it beat the No. 1 card in that thread


Almost as high as mine


----------



## Arizonian

Quote:


> Originally Posted by *scramz*
> 
> Just showing my face as I am on the 3dmark 11 leader board
> 
> i7 3770k @ 4.8GHz
> R9 290x @ 1262/1500
> 
> Score P15864
> Physics Score 11136
> Combined Score 10153
> Graphics Score 18884
> 
> 11.13 Beta V7 drivers
> Win 7
> 
> 
> http://www.3dmark.com/3dm11/7414930
> 
> http://www.techpowerup.com/gpuz/dvsku/
> 
> http://valid.canardpc.com/4ea55i
> 
> 
> Here's my beauty!
> 
> This may not have the best overall score due to the CPU, but check out the GFX score; I believe it beat the No. 1 card in that thread


Thanks for taking my invitation to the club and coming here from the benchmarks section. Congrats and welcome - added to the roster.

This is the place to be with your card; together we've been learning how to fine-tune these cards and we're just tapping into what's under the hood.

EDIT: I put you down for Sapphire, but let me know if it's anything else.


----------



## maarten12100

Quote:


> Originally Posted by *DampMonkey*
> 
> Almost as high as mine


Wow, 1320, he just nailed it. Should we raise that PowerTune limit to 200%?


----------



## scramz

Quote:


> Originally Posted by *DampMonkey*
> 
> Almost as high as mine


1320!!! Nice going! I can't go above 1260 :/


----------



## Ha-Nocri

Quote:


> Originally Posted by *maarten12100*
> 
> Wow, 1320, he just nailed it. Should we raise that PowerTune limit to 200%?


Did anybody hit the power limit (150%) yet?
Quote:


> Originally Posted by *scramz*
> 
> 1320!!! Nice going! I can't go above 1260 :/


What voltage?


----------



## Dominican

I tried 1100 and 1300 with MSI Afterburner, but it freezes when I try to run 3DMark11.


----------



## Ukkooh

What are these 290xs reaching for stable gaming clocks under water? Is it still too early to ask?


----------



## scramz

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Did anybody hit power limit yet (150%)?
> What voltage?


I lied; I just got 1270. It's set at 1.4 volts, but it is only pulling 1.36.


----------



## scramz

Quote:


> Originally Posted by *DampMonkey*
> 
> Almost as high as mine


Have you flashed the BIOS? I am on the stock one that came with it.


----------



## Ha-Nocri

Quote:


> Originally Posted by *scramz*
> 
> I lied; I just got 1270. It's set at 1.4 volts, but it is only pulling 1.36.


So far it seems the average 290X OC is ~1300MHz on water. I'm guessing 1200-1250MHz on aftermarket air coolers.
Quote:


> Originally Posted by *scramz*
> 
> Have you flashed the BIOS? I am on the stock one that came with it.


Yes he did, the ASUS BIOS. What card do you have? I thought only the ASUS 290X comes with unlocked voltage?! Can you change the memory voltage?


----------



## scramz

Quote:


> Originally Posted by *Ha-Nocri*
> 
> So far it seems on avg. 290x OC is ~1300MHz on water. I'm guessing 1200-1250MHz on aftermarket air coolers.
> 
> Yes he did, asus bios. What card do you have? I thought only ASUS 290x comes with unlocked voltage?! Can you change memory voltage?


I have the ASUS card; my problem is I set ASUS GPU Tweak to 1.4v but it never pulls that much :/


----------



## Jared Pace

Quote:


> Originally Posted by *scramz*
> 
> I have the Asus card, my problem is I set the Asus Tweak to 1.4v but it never pulls that many :/


The ASUS BIOS has a lot of Vdroop. The limit is 1.299v / 468W. If you see ~1.35v, it's either because your card is very cool and/or a low-leakage ASIC with more headroom before the TDP limits kick in, or it's just a spike and you're actually averaging out around 1.299v. It's unlikely you'll be able to maintain 1410mv on even a golden sample with the ASUS ROM; the LLC is so extreme that the Vdroop doesn't help overclocking. Colder temperatures would help you. 1260MHz is in line with Gibbo's 1250MHz, also done on the ASUS ROM.
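A rough way to sanity-check that droop: setting 1.40v and reading ~1.36v under load works out to about a 2.9% sag. A minimal sketch, assuming the 1.40/1.36 figures quoted above (the helper name is mine, just for illustration):

```python
def vdroop_percent(set_v: float, measured_v: float) -> float:
    """How far (in percent) the card sags below the requested voltage under load."""
    return (set_v - measured_v) / set_v * 100

# scramz set 1.40v in GPU Tweak, but GPU-Z showed ~1.36v under load
print(round(vdroop_percent(1.40, 1.36), 1))  # -> 2.9
```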


----------



## $ilent

Why is my fan speed going over the max I set in CCC to meet the temp target, though?


----------



## Newbie2009

Quote:


> Originally Posted by *$ilent*
> 
> why is my fan speed going over the max I set in cc to meet the temp target though?


Safety purposes?


----------



## PillarOfAutumn

I'm very new to overclocking and overvolting a GPU. I want a card that doesn't need me to flash anything to get its maximum potential. It seems like most people are flashing to the ASUS BIOS, so should I just get an ASUS card? I was actually deciding between ASUS, MSI, and Sapphire, but leaning towards Sapphire. Again, I don't want to make any BIOS changes and risk screwing up the flashing process and voiding my warranty.


----------



## Gunderman456

You can flash any of the reference cards to the ASUS BIOS. If you don't want to flash, on the other hand, wait until the non-reference cards come out; they will have better cooling solutions and should overclock better.


----------



## PillarOfAutumn

Well, I'm going to be watercooling. I've already ordered the waterblocks, and my system is almost complete. I just need to buy a 290x when I can find one on sale.

Now, I see most people here are flashing their BIOS to squeeze out more performance, but I don't want to void my warranty, at least not for the first year. What I'm asking is: which manufacturer's card will allow the most OC without any BIOS flashing?


----------



## RAFFY

Thursday around 2:30pm Central time, Newegg posted the ASUS 290Xs. I was in the process of piecing together my build and saw a banner in the bottom right saying in stock. So I clicked, and sure enough they were in stock. Then I ordered right away. So they are getting them in; it's just dumb luck, I think.


----------



## Forceman

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Did they do their usual BS of charging a 15% restocking fee?


Nah, defective card so no restocking fee.


----------



## armartins

Quote:


> Originally Posted by *icemanjkh*
> 
> Could anyone (especially those who upgraded from a 7970) give me some advice on this situation:
> 
> I have a single monitor setup and I'd like to play BF4 @ [email protected]+fps on High/Ultra (or [email protected] on High at the least).
> (Second monitor - 2407WFP - is for web browser/movies,etc)
> 
> A Gigabyte R290X costs $700 (AUD) here in Australia.
> Until Monday, I can buy an MSI 7970 Twin Frozr OC Boost Edition for $300 (AUD). (brand new retail)
> 
> Obviously the 290X is an awesome card, yet it's quite a lot of money (and more than double the 7970's special price).
> I realise that the 7970 is "old" generation now (aka 280X), however what are people's thoughts on bang-for-buck at its $300 price tag?
> Should I simply go for the super deal on the 7970 or pay double and get the R290x?
> 
> I wasn't sure if this is the right place to ask such a question, but depending on how the replies turn out, I might become an R290X owner.
> 
> Thanks.


Probably this really isn't the right place... nonetheless, I guess the answer to your question is that ATM the 7970 is the better deal (at the prices you've posted). For us international buyers it's always better to wait a little for things to settle down. And just FYI: my 7970 @ 1230 core / 1750 mem, [email protected] HT ON, 16GB 2133 CL10 RAM, in BF4 @ 1440p with everything on ultra (just disable MSAA and Ambient Occlusion, HBAO -> OFF) stays between 70 and 114FPS most of the time, and that's capping the FPS with GameTime at 114FPS. If you leave Frostbite's default 200FPS cap, it often hits 135FPS, but my monitor is OC'd to 114Hz so I just cap it at that frame rate. You'll see rare dips into the 50s when underwater or on the Paracel Storm map, but that's it. My WC'd single 7970 is doing a great job in BF4, much better than I'd expected, since in the beta it was much worse.
Quote:


> Originally Posted by *Ha-Nocri*
> 
> So far it seems on avg. 290x OC is ~1300MHz on water. I'm guessing 1200-1250MHz on aftermarket air coolers.
> Yes he did, asus bios. What card do you have? I thought only ASUS 290x comes with unlocked voltage?! Can you change memory voltage?


Idk where you're getting that 1300 average from (no offense); I still think it's too early to tell. I guess the real average on water, for a gamer who pushes like 4h every day, will be 1200ish on the core in the long run, and 1250 for a really good card. I mean that as the OC people will hit at 5-10 steps below max voltage, something like 1.25v for 7970s (I know for the 290X the voltage should be higher).


----------



## Ukkooh

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Well, I'm going to be watercooling. I've already ordered the waterblocks, and my system is almost complete. I just need to buy a 290x when I can find one on sale.
> 
> Now, I see most people here are flashing their bios to squeeze more performance, but I don't want to void my warranty, at least not for the first year. What I'm asking for is which manufacturer's card will allow the most OC without doing any bios flashing?


FYI, putting a waterblock on it already voids the warranty. So every manufacturer's card OCs just as well, because increasing voltage doesn't help with the reference cooler anyway.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Ukkooh*
> 
> FYI putting a waterblock on it voids the warranty already. So every manufacturer OCs just as well because increasing voltage doesn't help with the reference cooler.


Well the thing with the waterblock is that unless you tell the manufacturer, they're not going to know that you had a waterblock on there. However if you flash a new bios and in the process you somehow brick it, there are ways for them to find out what happened. Unless I'm misunderstanding something.


----------



## Siezureboy

Quote:


> Originally Posted by *Ukkooh*
> 
> FYI putting a waterblock on it voids the warranty already. So every manufacturer OCs just as well because increasing voltage doesn't help with the reference cooler.


From what I've read, as long as you include the stock cooler when you RMA, you'll be fine. And to be honest, I don't think the manufacturers would mind too much, since these cards run at about 70-90°C on the stock coolers alone.


----------



## jomama22

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Well the thing with the waterblock is that unless you tell the manufacturer, they're not going to know that you had a waterblock on there. However if you flash a new bios and in the process you somehow brick it, there are ways for them to find out what happened. Unless I'm misunderstanding something.


Just an FYI: out of the six 290X cards I have (3 MSI, 2 XFX, 1 Sapphire), the XFXs and MSIs have small "void if removed" stickers on the GPU die retention screws/bracket, impossible to remove cleanly. The Sapphire has no such stickers anywhere.

Though I'm sure you could just remove them, clean off the screws and such, and they wouldn't know either way.


----------



## $ilent

Quote:


> Originally Posted by *Newbie2009*
> 
> Safety purposes?


but it still goes over what I set even if I put the card in quiet mode...


----------



## PillarOfAutumn

Whoa! What are you doing with 6 cards?

And thank you for letting me know about that! I've crossed off MSI from my list, so I'm only choosing between Asus and Sapphire. Do you know if Asus has those "warranty void if removed" stickers on them?


----------



## overclockFrance

On the ASUS GTX 780, one sticker was on one screw.

I own a HIS 290X and there is no sticker. Nevertheless, inside the graphics card there is a loose "void if removed" sheet. If you need to mount the stock cooler again, you must not forget it.


----------



## cjp4eva

Quote:


> Originally Posted by *kcuestag*
> 
> Tubing looks like 16/10 or 16/11mm and I'm not sure about the coolant, but definitely looks like Pastel.


Yeah, it looks like the blueberry pastel but brighter. I wanna know if he added any other dye to it; can't wait until I get my water loop up and two 290Xs.


----------



## Ukkooh

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Well the thing with the waterblock is that unless you tell the manufacturer, they're not going to know that you had a waterblock on there. However if you flash a new bios and in the process you somehow brick it, there are ways for them to find out what happened. Unless I'm misunderstanding something.


You've got it completely wrong. Actually, they'll know you had a block on there just by looking at the TIM. And swapping the cooler is against their warranty terms and conditions, so it is technically void even if they had no way to know you changed the cooler. Anyway, the 290X is a dual-BIOS card, so you can boot with the other BIOS and then reflash the one that got bricked. So breaking the card by only flashing the BIOS is near impossible.
Quote:


> Originally Posted by *jomama22*
> 
> Just an FYI: out of the six 290X cards I have (3 MSI, 2 XFX, 1 Sapphire), the XFXs and MSIs have small "void if removed" stickers on the GPU die retention screws/bracket, impossible to remove cleanly. The Sapphire has no such stickers anywhere.
> 
> Though I'm sure you could just remove them, clean off the screws and such, and they wouldn't know either way.


Thanks for the info! I guess my 290x will be a sapphire one then.


----------



## OrcishMonkey

Quote:


> Originally Posted by *OrcishMonkey*
> 
> 
> 
> Crossfire on water, ill flash em and bench em tomorrow and report my findings =)


Hey guys, just for those asking: it's Mayhems Pastel Blueberry, and it's got about 100ml of extra water in it to brighten it up a little. I also have an NZXT Hue system in there, which adds some sick lighting effects, and it's 1/2 3/4 tubing. PM me if anyone else has questions; it's my first custom loop =) Here's the very short build log with parts and a couple of pics:

http://www.overclock.net/t/1436443/build-log-my-phantom-is-kinda-heavy-now-three-rads-og-phantom

That pic was taken with a Note 3 btw; not too shabby, but I'll go steal my bro's DSLR today.


----------



## utnorris

Just an FYI on those little stickers on the screws: heat them up with a blow dryer and gently, gently use a razor to remove them. Be very careful, as they are made to pull apart, but if you heat them up just enough, the glue becomes super soft and will let you take them off. That being said, I doubt any manufacturer actually looks for those during an RMA; I say that because I have gotten several cards of the same model from the same manufacturer, and one would have the sticker and the other would not. As far as overclocking goes, if you do not want to flash the BIOS, either get an ASUS or wait for the Afterburner version that supports voltage control on the 290X. And I am going to bet that once we get enough entries, we will see the average overclock under water or good cooling land around 1200MHz, which quite frankly will be good enough to beat the majority of the overclocked 780s out there; maybe not the Ti or the Titan, but the 780s.


----------



## inastris

I was installing the EK water block on my R9 290X and I realized that in the installation guide there is no backplate shown around where the GPU is. Do I just put the four screws right into the PCB, or do I need to put the stock backplate on there and it just doesn't show properly in the installation guide?


----------



## Ukkooh

Quote:


> Originally Posted by *inastris*
> 
> I was installing the EK water block on my R9 290X and I realized that in the installation guide there is no backplate shown around where the GPU is. Do I just put the four screws right into the PCB, or do I need to put the stock backplate on there and it just doesn't show properly in the installation guide?


Just screws and washers.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *inastris*
> 
> I was installing the EK water block on my R9 290X and I realized that in the installation guide there is no backplate shown around where the GPU is. Do I just put the four screws right into the PCB, or do I need to put the stock backplate on there and it just doesn't show properly in the installation guide?


I haven't put on the block, so I wouldn't know from experience, but I'm pretty sure you just screw it in. I don't think there is a retention bracket or anything. I know the Gigabyte 7970's Windforce cooler was pretty hefty but was secured to the PCB with just four screws.

And as for the backplate that's sold separately, that's optional. It's for aesthetics, though they say it also helps with some heat dissipation; it's not necessary.


----------



## xxmastermindxx

Quote:


> Originally Posted by *inastris*
> 
> I was installing the EK water block on my R9 290X and I realized that in the installation guide there is no backplate shown around where the GPU is. Do I just put the four screws right into the PCB, or do I need to put the stock backplate on there and it just doesn't show properly in the installation guide?


Quote:


> Originally Posted by *Ukkooh*
> 
> Just screws and washers.


Like he said, it's just the screws and washers. It does show this properly in the installation guide. You're screwing the block to the card using the provided screws/washers, all from the backside of the PCB.


----------



## scramz

My best so far

i7 3770k @ 4.8GHz
R9 290x @ 1270/1485

Score P15912
Physics Score 11135
Combined Score 10193
Graphics Score 18958

11.13 Beta V7 drivers
Win 7

http://www.3dmark.com/3dm11/7419550

http://gpuz.techpowerup.com/13/11/02/h5z.png

http://gpuz.techpowerup.com/13/11/02/6qw.png

http://www.techpowerup.com/gpuz/fx5kf/


----------



## reoneclipse

I actually have had the card since Tuesday, but I've been having too much fun to post an update here. I've got some time now, so here goes.


Spoiler: Pics
Still using the stock cooler, but I redid the TIM: around 2°C lower under load, and the fan has never gone above 70% in anything. Not planning on doing water for this build, maybe on my next one. It runs great, and I love that I can get 4xSSAA in BF4 and stay above 50FPS at 1080p Ultra; the game looks so crispy!


----------



## Mr357

Here's some ugly pics just to prove I got mine block'd.


----------



## Arizonian

Quote:


> Originally Posted by *Mr357*
> 
> Here's some ugly pics just to prove I got mine block'd.
> 


Updated to Water - post back results of OC'ing


----------



## Mr357

Quote:


> Originally Posted by *Arizonian*
> 
> Updated to Water - post back results of OC'ing


I think I'm going to hold off on that until AB gets an update.


----------



## inastris

add me up!


----------



## utnorris

Ok, so the best I can pull from my 290x is:

GPU - 1209MHz
Memory - 1475MHz (I can go higher, but it did not seem to matter)
Voltage is at the max setting and shows a max of 1.367v in GPU-Z

3DMark11 Performance: 15969

http://www.3dmark.com/3dm11/7419925

Graphics Score
18263

Physics Score
12179

Combined Score
10825

3DMark11 Extreme: X5432

http://www.3dmark.com/3dm11/7420176

Graphics Score
5004

Physics Score
12018

Combined Score
6295

3DMark - Free Edition

Icestorm: 160659

http://www.3dmark.com/results/is

Graphics Score
372010

Physics Score
53760

Cloudgate: 27358

http://www.3dmark.com/cg/961772

Graphics Score
78840

Physics Score
8327

Firestrike: 11146

http://www.3dmark.com/fs/1084748

Graphics Score
13167

Physics Score
11301

Combined Score
5133

My Specs:

4770k - 4.7Ghz
Memory - 2132Mhz

Everything is water cooled.

Let me know if I missed anything.


----------



## Newbie2009

Looks like you have a turkey


----------



## Arizonian

Quote:


> Originally Posted by *inastris*
> 
> add me up!
> 


Congrats - Added


----------



## jrcbandit

Prepped my card with waterblock so ready to install:


Can change me to watercooled!


----------



## Arizonian

Quote:


> Originally Posted by *jrcbandit*
> 
> Prepped my card with waterblock so ready to install:
> 
> 
> 
> Can change me to watercooled!


Right on - updated

Missing Stay Puft, ConservingClips, and Kenshiro 26, whom I added to the list on order night but who have not submitted their proof yet. You should have your cards by now. Did I miss you guys?

Also, I've decided to add members only after they've submitted proof; it's easier to keep track.

EDIT: Since I'm still the last post: turns out my Sapphire has Elpida memory.


----------



## $ilent

Has anyone with water submitted any unlocked-voltage OC results yet?


----------



## MIGhunter

Here's some pics of mine stock. I haven't had a day off to do anything but install it.
http://s279.photobucket.com/user/bo...0-49fe-9e60-cffdc4d13b82_zps28e277ec.jpg.html
http://s279.photobucket.com/user/botdphotos/media/Mobile Uploads/IMAG0032_zps714fcc13.jpg.html
http://s279.photobucket.com/user/bo...e-4068-93e7-53b333482818_zpsa8cc8bc3.jpg.html
http://s279.photobucket.com/user/botdphotos/media/Mobile Uploads/IMAG0035_zpsaa117887.jpg.html
http://s279.photobucket.com/user/botdphotos/media/Mobile Uploads/IMAG0034_zps87960c74.jpg.html


----------



## inastris

Change me to water as well


----------



## utnorris

Quote:


> Originally Posted by *Newbie2009*
> 
> Looks like you have a turkey


Wouldn't exactly call it a turkey, just not an extreme performer.

In reality I am pretty happy with being able to beat my previous GTX 670 SLI and HD 7970 CF scores with just one card. I have an MSI coming in that I will try out to see how well it does. If it ends up being an extreme performer, then I will sell the Sapphire and grab another 290X when they come back in stock. If it performs similarly, I will just keep them both, run CF, and call it a day.


----------



## Arizonian

Quote:


> Originally Posted by *inastris*
> 
> Change me to water as well


Updated









Quote:


> Originally Posted by *MIGhunter*
> 
> Here's some pics of mine stock. I haven't had a day off to do anything but install it.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s279.photobucket.com/user/bo...0-49fe-9e60-cffdc4d13b82_zps28e277ec.jpg.html
> http://s279.photobucket.com/user/botdphotos/media/Mobile Uploads/IMAG0032_zps714fcc13.jpg.html
> http://s279.photobucket.com/user/bo...e-4068-93e7-53b333482818_zpsa8cc8bc3.jpg.html
> http://s279.photobucket.com/user/botdphotos/media/Mobile Uploads/IMAG0035_zpsaa117887.jpg.html
> http://s279.photobucket.com/user/botdphotos/media/Mobile Uploads/IMAG0034_zps87960c74.jpg.html


Congrats welcome aboard - added


----------



## kcuestag

Quote:


> Originally Posted by *utnorris*
> 
> Wouldn't exactly call it a turkey, just not an extreme performer.
> 
> In reality I am pretty happy with being able to beat my previous GTX 670 SLI and HD 7970 CF scores with just one card. I have an MSI coming in that I will try out to see how well it does. If it ends up being an extreme performer, then I will sell the Sapphire and grab another 290X when they come back in stock. If it performs similarly, I will just keep them both, run CF, and call it a day.


You sure it's beating your old 7970 CF? Because I moved from 7970 CF to a single R9 290X and I don't think it beats them. It's close for a single card, but it doesn't beat them.

I'm still very happy with it though; waiting on MSI AB for voltage control, then I'll push it further than 1125/1450.


----------



## utnorris

Hey Arizonian, if you haven't already updated mine, I am under water with the ASUS BIOS.


----------



## Arizonian

Quote:


> Originally Posted by *inastris*
> 
> Change me to water as well


Quote:


> Originally Posted by *utnorris*
> 
> Hey Arizonia, if you haven't already updated mine, I am under water with the Asus bios.


Missed ya bud - updated

More under water than I thought. Compared to my 1100 core clock / 1300 memory stable on air?

I'm a bit bummed my memory is only allowing a 50MHz bump, which results in a 4% OC. My 100MHz core bump gives me a 10% OC.

http://www.techpowerup.com/gpuz/e2gqd/

It's stable across all my games: Crysis 3 (the toughest to keep stable), and the others like BF3, BF4 and Crysis 2. I'm definitely looking forward to non-reference for a bit more without having to water cool.
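For anyone checking the math on those bumps: the percentages assume the reference 290X defaults of 1000MHz core and 1250MHz memory (my assumption; the post doesn't restate the stock clocks). A quick sketch:

```python
def oc_percent(stock_mhz: float, new_mhz: float) -> float:
    """Overclock expressed as a percentage over the stock clock."""
    return (new_mhz - stock_mhz) / stock_mhz * 100

# Assumed reference R9 290X defaults: 1000 MHz core, 1250 MHz memory
print(round(oc_percent(1000, 1100), 1))  # core at 1100 MHz -> 10.0
print(round(oc_percent(1250, 1300), 1))  # memory at 1300 MHz -> 4.0
```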


----------



## icemanjkh

Quote:


> Originally Posted by *armartins*
> 
> Probably this really isn't the right place... nonetheless, I guess the answer to your question is that ATM the 7970 is the better deal (at the prices you've posted). For us international buyers it's always better to wait a little for things to settle down. And just FYI: my 7970 @ 1230 core / 1750 mem, [email protected] HT ON, 16GB 2133 CL10 RAM, in BF4 @ 1440p with everything on ultra (just disable MSAA and Ambient Occlusion, HBAO -> OFF) stays between 70 and 114FPS most of the time, and that's capping the FPS with GameTime at 114FPS. If you leave Frostbite's default 200FPS cap, it often hits 135FPS, but my monitor is OC'd to 114Hz so I just cap it at that frame rate. You'll see rare dips into the 50s when underwater or on the Paracel Storm map, but that's it. My WC'd single 7970 is doing a great job in BF4, much better than I'd expected, since in the beta it was much worse.


Noted. Thanks for taking the time to reply in any case.

While it seems the GTX 780 Ti is about to release (and allegedly performs better than the 290X), I don't think I'll see any price drops on the 290X soon enough for me not to buy that $300 7970.


----------



## HoZy

XFX Core Edition 290x

Waiting on waterblock, Currently stock cooling.


----------



## utnorris

Quote:


> Originally Posted by *$ilent*
> 
> has anyone with water submitted any unlocked voltage OC results yet?


Mine are, max voltage of 1.36v
Quote:


> Originally Posted by *kcuestag*
> 
> You sure it's beating your old 7970 CF? Coz I moved from 7970 CF to a single R9 290X and I don't think it beats them. It's close, and single card, but doesn't beat them.
> 
> I'm still very happy with them though, waiting on MSI AB for voltage control then I'll push it further than 1125/1450.


Yeah, I was thinking of the overall score, but when I look at the GPU score, at 1200MHz they scored around 20k, though barely stable. Here is their score:

http://www.3dmark.com/3dm11/4067299

My 670's definitely scored lower on the GPU side at just over 17k:

http://www.3dmark.com/3dm11/6148748


----------



## Arizonian

Quote:


> Originally Posted by *HoZy*
> 
> XFX Core Edition 290x
> 
> Waiting on waterblock, Currently stock cooling.
> 
> 


Nice system. Congrats - Added


----------



## MIGhunter

Does the Nvidia driver uninstaller in the OP work for AMD? I tried to install the beta drivers this morning and my screen flashed black and then got stuck, so I had to hard reboot my system. Everything is working fine, but I was thinking about erasing my drivers and reinstalling them tomorrow when I get a day off.

Also, when my son tried to play BF4 online, it said it needed PunkBuster. When I installed everything from the CD, I installed PunkBuster. What would be causing that?


----------



## Arizonian

Quote:


> Originally Posted by *MIGhunter*
> 
> Does the Nvidia driver uninstaller in the OP work for AMD? I tried to install the beta drivers this morning and my screen flashed black and then got stuck, so I had to hard reboot my system. Everything is working fine, but I was thinking about erasing my drivers and reinstalling them tomorrow when I get a day off.
> 
> Also, when my son tried to play BF4 online, it said it needed PunkBuster. When I installed everything from the CD, I installed PunkBuster. What would be causing that?


Yes, I used it and it worked like a dream. I had issues loading BF3 and getting rid of the Nvidia folders on my drive, along with the control panel. Used it and boom, gone.

http://www.wagnardmobile.com/DDU/


----------



## Sazz

Quote:


> Originally Posted by *Forceman*
> 
> Yeah, they gave me a full refund even though I used the BF4 coupon also. And the regular 290 (non X) comes out on Tuesday.


I meant the regular version of 290X (non BF4 version)

Anyone got an idea when the non-BF4 version will be released?


----------



## $ilent

^I think it's already out, it just goes out of stock so fast.


----------



## Sazz

Quote:


> Originally Posted by *$ilent*
> 
> ^I think its already out, just goes out of stock so fast.


What I mean is, across the manufacturers, the only non-BF4 edition that has become available is from HIS, and I don't like HIS. So far the top three manufacturers (ASUS, Sapphire and MSI) are still in the "coming soon" phase, and everything else except HIS is a BF4 edition.


----------



## Arizonian

My Sapphire reference card, which we all thought had Hynix memory, actually has Elpida. I've always had horrible luck with overclocking, and I'm comparing with others on air, not under water, btw. Nvidia cards, which were EVGA, and now this Sapphire - I can never get a good memory OC. I see some crazy high OCs, but that's how it goes, I guess. Some day I'll get a golden chip, or at least a decent OC.


----------



## HoZy

You do know the Non-BF4 versions are not any cheaper?


----------



## Sazz

Quote:


> Originally Posted by *HoZy*
> 
> You do know the Non-BF4 versions are not any cheaper?


But if they become available sooner than the BF4 version, I'll get one if I end up with a refund instead of a replacement for my card. Besides, I already used the coupon code for BF4, and they said they won't charge me anything for the game if it does indeed turn out to be a refund. But I am still crossing my fingers, hoping they get some stock in on Monday so they will just send me a replacement instead of a refund.


----------



## PillarOfAutumn

From all the retailers I've seen, most have only been selling the BF4 edition. Since I already have BF4, it doesn't make sense for me to spend the extra $30 on it. If anything, I'm just going to buy the card and game, make sure everything is working, and then sell the code for $40. I know they said they have a limited number of keys, so I guess they'll wait till they sell out the BF4 editions and then ship the non-BF4 cards.


----------



## HoZy

My 290x Was a Non-Bf4 XFX.


----------



## $ilent

Quote:


> Originally Posted by *Arizonian*
> 
> My Sapphire reference card, which we all thought had Hynix memory, actually has Elpida. I've always had horrible luck with overclocking, and I'm comparing with others on air, not under water, btw. Nvidia cards, which were EVGA, and now this Sapphire - I can never get a good memory OC. I see some crazy high OCs, but that's how it goes, I guess. Some day I'll get a golden chip, or at least a decent OC.


Where did you download the mem tool from? Jared's link doesn't work for me; it just says invalid file.

Also, do we have any word on an Afterburner update yet?


----------



## utnorris

Hey Arizonian, don't feel bad. I have mine under water, and the most I can get with a full 1.41V is 1221MHz on the GPU. My memory is Hynix (Sapphire card), and while the memory does overclock well (1475), it doesn't seem to help much with benchmarks. I have an MSI one coming early next week; maybe I'll get lucky and get a card like DampMonkey's, which seems to be unstoppable.


----------



## Arizonian

Quote:


> Originally Posted by *$ilent*
> 
> Where did you download the mem tool from? Jared's link doesn't work for me; it just says invalid file.
> 
> Also, do we have any word on an Afterburner update yet?


On the OP there is a link to Jared's thread - he's our BIOS modder helping 290X owners push volts. His page has all the info, which I'm grateful for, as I've got a lot on my plate with the club and moderating. Thank you Jared for the very in-depth BIOS thread.


----------



## $ilent

Please refer to my previous post; Jared's link doesn't work.


----------



## Arizonian

Quote:


> Originally Posted by *$ilent*
> 
> Please refer to my previous post; Jared's link doesn't work.


Try this

http://www.mediafire.com/download/voj4j1rlk0ucfz4/MemoryInfo+1005.rar

Sorry, trying to bench right now and not getting any better results off this default BIOS. I might have to flash this one; I'm just reluctant, as all we need is a working AB for overvolting, and I'm most likely selling this card when non-reference comes out.


----------



## $ilent

thanks, got it downloaded but mine just looks like this lol


----------



## utnorris

Quote:


> Originally Posted by *Arizonian*
> 
> Try this
> 
> http://www.mediafire.com/download/voj4j1rlk0ucfz4/MemoryInfo+1005.rar
> 
> Sorry, trying to bench right now and not getting any better results off this default BIOS. I might have to flash this one; I'm just reluctant, as all we need is a working AB for overvolting, and I'm most likely selling this card when non-reference comes out.


When you flash it, you can save the current BIOS first. Just keep the flash drive intact, and all you'll need to do is put it back in and flash back to the factory BIOS.


----------



## Emmett

Quote:


> Originally Posted by *scramz*
> 
> My best so far
> 
> i7 3770k @ 4.8GHz
> R9 290x @ 1270/1485
> 
> Score P15912
> Physics Score 11135
> Combined Score 10193
> Graphics Score 18958


Wow, this is pretty identical to my score with CF 7970's with a 3770K at 4.7 and cards at 1225/1700! Nice!

----------



## Emmett

Quote:


> Originally Posted by *Emmett*
> 
> Quote:
> 
> 
> 
> Originally Posted by *scramz*
> 
> My best so far
> 
> i7 3770k @ 4.8GHz
> R9 290x @ 1270/1485
> 
> Score P15912
> Physics Score 11135
> Combined Score 10193
> Graphics Score 18958
> 
> Wow, this is pretty identical to my score with CF 7970's with a 3770K at 4.7 and cards at 1225/1700! Nice!


Oops, never mind, my CF 7970 score is 18324.


----------



## rdr09

Quote:


> Originally Posted by *Emmett*
> 
> Quote:
> 
> 
> 
> Originally Posted by *scramz*
> 
> My best so far
> 
> i7 3770k @ 4.8GHz
> R9 290x @ 1270/1485
> 
> Score P15912
> Physics Score 11135
> Combined Score 10193
> Graphics Score 18958
> 
> Wow, this is pretty identical to my score with CF 7970's with a 3770K at 4.7 and cards at 1225/1700! Nice!


wut? here are 7950s, lower clocked (1160) . . .

http://www.3dmark.com/3dm11/7414877


----------



## Emmett

Quote:


> Originally Posted by *rdr09*
> 
> wut? here are 7950s, lower clocked (1160) . . .
> 
> http://www.3dmark.com/3dm11/7414877


Oops, I corrected it above: 18325, with a 24583 graphics score. My bad..

I have two 290X cards inbound. XFX, but it was all I could get..


----------



## Jared Pace

Quote:


> Originally Posted by *Arizonian*
> 
> Thank you Jared for the very in depth BIOS thread.


Yes sir, happy to help out; I'm just the messenger for all the helpful info out there. Thank sham for PT3 - and thanks for including a link to that thread in your OP. One reason utnorris' card tops out at ~1200 is his BIOS version. DampMonkey is on a different BIOS at 1320: his voltage registers 1387mV and is actually ~1420mV, while utnorris reads ~1360mV and is actually ~1299mV. GPU-Z's VDDCI reading is incorrect, so there's a chance what it says for VRM temp could be incorrect as well. The VRM temp is probably accurate, though - we just need somebody to verify with an IR reading. Also, GPU-Z doesn't show all stats on all cards (a BIOS version mismatch bug, dunno why). And I don't know why $ilent's Memtool isn't working, but $ilent seems to have trouble with every 290X link he downloads, hahah. ATI Winflash wasn't his friend, none of my zip/rar files open for him, and now Memtool shows no info for him. There are alt/backup links for everything there, just in case.


----------



## rdr09

Quote:


> Originally Posted by *Emmett*
> 
> Oops. I corrected above 18325 with 24583 graphics score. my bad..
> 
> I have two 290x cards inbound. XFX but was all I could get..


i ordered the waterblock but still waiting for the 290. tell us about your cards.


----------



## $ilent

I will be picking up a Gelid Icy Vision Rev 2 cooler on Monday, hopefully. I'll let you know how I get on, but for info, my current one in use on my GTX 670 keeps it at a 48C max temp at 1.21V, with the fans inaudible at 1700rpm.


----------



## Emmett

Quote:


> Originally Posted by *rdr09*
> 
> i ordered the waterblock but still waiting for the 290. tell us about your cards.


I was checking Newegg. All cards, including XFX, showed OOS, but the XFX card also said "add to cart" even though it was showing OOS.
I tried to set the quantity to two, and it would revert to one, so I went ahead with the order. The second time it let me add 2 cards (oddly) but wouldn't allow 3, so I just ordered one more for a total of two. Fully expecting the OOS email, I was surprised to get a tracking #..

I was thinking of staying on air this time, but I'm not sure. I can get blocks now, but acrylic. I last used acrylic on my 1900XTs,
Danger Den blocks on my GTX 8800s.. ran my GTX 480s on air (<--- yes, lol, and I had blocks but was too lazy to put em on..), and put EK acetal on my 7970s...

Hmmm... might.... just..... order... the..... acrylic..... must resist the urge.... and wait....

I'll post a pic of the cards so I can join...


----------



## skupples

Quote:


> Originally Posted by *MIGhunter*
> 
> Does the Nvidia driver uninstaller in the OP work for AMD? I tried to install the beta drivers this morning and my screen flashed black and then got stuck, so I had to hard-reboot my system. Everything is working fine, but I was thinking about erasing my drivers and reinstalling them tomorrow when I get a day off.
> 
> Also, when my son tried to play BF4 online, it said it needed PunkBuster. When I installed everything from the CD, I installed PunkBuster. What would be causing that?


You should be able to find a direct update for PunkBuster inside your BF folders somewhere.


----------



## Arizonian

Well, no matter how much I tried to improve, I've found the top for a reference 290X on air - on my system, anyway.

i7 3770K 4.6Ghz / 290X 1150 Mhz Core - 1300 Mhz Memory / no voltage bumps or BIOS / 13.11 beta 8

Firestrike 10642 Score / Graphic Score 12557



http://www.3dmark.com/3dm/1543398

http://valid.canardpc.com/8v6c0a

http://www.techpowerup.com/gpuz/3h4u4/

Barely beat my previous score, but it's the most I can push on the air reference Sapphire BIOS. One thing I'm lacking is a proper DRAM overclock - I'm running my GSKILL Trident X 2400Mhz at defaults - but you can figure it's not much more overall. Got my CPU to 4.6Ghz, which is nice, as I'm going to leave it there 24/7. Thank goodness for the Intel tuning program.
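Out of curiosity, those clocks work out to roughly +15% core and +4% memory over the 290X's reference clocks. A quick sketch of the math - the 1000MHz core / 1250MHz memory reference clocks are AMD's launch specs, not numbers from this thread:

```python
# Overclock headroom vs. the R9 290X reference clocks
# (1000 MHz core, 1250 MHz memory, per AMD's launch specs).
REF_CORE_MHZ = 1000
REF_MEM_MHZ = 1250

def headroom_pct(actual_mhz: float, reference_mhz: float) -> float:
    """Percentage gain of an overclocked frequency over its reference clock."""
    return (actual_mhz / reference_mhz - 1.0) * 100.0

core_gain = headroom_pct(1150, REF_CORE_MHZ)  # 15.0
mem_gain = headroom_pct(1300, REF_MEM_MHZ)    # 4.0
print(f"core +{core_gain:.1f}%, memory +{mem_gain:.1f}%")
```

Which is why the memory OC "doesn't feel like much" - it's a far smaller relative bump than the core.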

I'm more than happy gaming, and I look forward to seeing what non-reference has to offer in single-GPU performance. I'm still contemplating the MSI Lightning, ASUS Matrix, or Sapphire Toxic. We'll see. But as for this card, I'm done testing it tonight.....


----------



## HoZy

Could I get page 1 updated? My card's an XFX, not a Sapphire.

Thanks


----------



## Sazz

Here's my block; now it's just waiting for my replacement card. Really crossing my fingers that Newegg gets some stock of the Sapphire 290X on Monday so they won't have to send me a refund instead. T_T

Anyway, does anyone have any idea when the non-BF4 versions will officially start selling?


----------



## Arizonian

Quote:


> Originally Posted by *HoZy*
> 
> Could I get page 1 updated? My card's an XFX, not a Sapphire.
> 
> Thanks


No problem - updated.

Remember everyone, if I don't get the info I need, I'm going to default to Sapphire / Air - a lot of these entries lack all the info I need.

While we're on this topic, and not directed at you bud, I'd like to reiterate.....

*PER OP*
******* To be added on the member list please submit the following in your post *******

1. GPU-Z Link or Screen shot with OCN Name
2. Manufacturer & Brand
3. Cooling - Stock, Aftermarket or 3rd Party Water

WOW - we're looking at 50 members with 57 confirmed 290X's, and three people who have yet to submit proof / quantity in 10 days.


----------



## VSG

You can add my 2 cards as well; I'll have pictures in my post on page 1 when my second card arrives tomorrow.


----------



## Arizonian

Quote:


> Originally Posted by *Sazz*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> Here's my block, now its just waiting for my replacement card. really crossing my fingers that newegg gets some stock of the sapphire 290X on monday so that they won't have to send me a refund instead. T_T
> 
> Anyways has anyone have any idea when is the non-BF4 versions will officially start selling?


Updated - good luck on the replacement.

Quote:


> Originally Posted by *geggeg*
> 
> You can add my 2 cards as well, will have pictures in my post on page 1 when my second card arrives tomorrow.


Congrats geggeg - will take your word for it.

Sapphire, right?


----------



## RAFFY

Quote:


> Originally Posted by *Arizonian*
> 
> No problem - updated.
> 
> Remember everyone, if I don't get the info I need, I'm going to default to Sapphire / Air - a lot of these entries lack all the info I need.
> 
> While we're on this topic, and not directed at you bud, I'd like to reiterate.....
> 
> *PER OP*
> ******* To be added on the member list please submit the following in your post *******
> 
> 1. GPU-Z Link or Screen shot with OCN Name
> 2. Manufacturer & Brand
> 3. Cooling - Stock, Aftermarket or 3rd Party Water
> 
> WOW - we're looking at 50 members with 57 confirmed 290X's, and three people who have yet to submit proof / quantity in 10 days.


Monday evening your wishes will be granted and proof shall be submitted!!!


----------



## lordzed83

OK, I am under water.


Sadly, my 290x is a poor clocker like I had a feeling it would be. The max I can get out of it with no artifacts is 1170mhz/[email protected]
After running Furmark on a 15-minute burn-in test, the max GPU temperature was 62c on the core. I can live with that :]

Maybe when MSI Afterburner comes out I can mess around with the Sapphire BIOS and get something different, but not on the Asus one.


----------



## Strata

I know this is a bit out of place, but I trust you guys much more than the reviewers online. I will be buying one card only, and very soon. I have to use Newegg, as I have a massive credit there. For the time being it looks like they'll be out of 290Xs; while I can wait until mid-November, I can't really wait much longer. So the choice is either a 290X (if it's available - I'd need some brand recommendations here), a 780 Lightning, or a 780 Classified.

Running a 3570k @ 4.5, 8GB 2133 RAM, 650W Seasonic X650.

Any thoughts? You guys are the current experts here as far as I can see.


----------



## pioneerisloud

Quote:


> Originally Posted by *Strata*
> 
> I know this is a bit out of place, but I trust you guys much more than the reviewers online. I will be buying one card only, and very soon. I have to use newegg as I have a massive credit there. For the time being it looks like they will be out of 290Xs, while I will wait until mid November I cannot really wait too much longer. So the choice is either 290X (if its available, would need some brand recommendation here), 780 Lightning, or 780 Classified.
> 
> Running 3570k @ 4.5, 8gb 2133 ram, 650w seasonic x650
> 
> Any thoughts, you guys are the current experts here as far as I can see.


290x non-reference cooler cards should be out by the end of November if you're going to be running on air. I'd personally vote for the 290x, definitely, and Sapphire, MSI, or Asus gets my vote (in that order) on brand.

The GTX 780 Ti is supposed to be coming out soon as well to meet the 290x. However, it's rumored to be around $700. If that holds true, I'd still opt for the 290x based on price.


----------



## Arizonian

Quote:


> Originally Posted by *lordzed83*
> 
> OK i am under water
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Sadly, my 290x is a poor clocker like I had a feeling it would be. The max I can get out of it with no artifacts is 1170mhz/[email protected]
> After running Furmark on a 15-minute burn-in test, the max GPU temperature was 62c on the core. I can live with that :]
> 
> Maybe when MSI Afterburner comes out I can mess around with the Sapphire BIOS and get something different, but not on the Asus one.


and so you shall be - updated


----------



## raghu78

some nice feedback on R9 290 from Brent justice of hardocp . looks like R9 290X will also get a bump up due to the new performance drivers.

http://hardforum.com/showthread.php?t=1789068


----------



## Strata

Quote:


> Originally Posted by *pioneerisloud*
> 
> 290x non-reference cooler cards should be out by the end of November if you're going to be running on air. I'd personally vote for the 290x, definitely, and Sapphire, MSI, or Asus gets my vote (in that order) on brand.
> 
> The GTX 780 Ti is supposed to be coming out soon as well to meet the 290x. However, it's rumored to be around $700. If that holds true, I'd still opt for the 290x based on price.


Yeah, the Ti sounds great, but that cost hurts. I also assume the drivers it will use are heavily based on the Titan/780 ones, which means there shouldn't be any massive performance gains, right? Whereas the 290X is on early drivers, which could very well be optimized over time to greatly improve performance.


----------



## Paul17041993

Quote:


> Originally Posted by *Strata*
> 
> Yeah, the Ti sounds great, but that cost hurts. I also assume the drivers it will use are heavily based on the Titan/780 ones, which means there shouldn't be any massive performance gains right? Whereas the 290X is on early drivers which could very well be optimized with some more time to greatly improve performance?


I think they would be the same drivers; pretty sure the 780 Ti didn't have any architectural changes bar maybe a few more cores. Though, this is Nvidia...


----------



## Sgt Bilko

Finally got Metro 2033 to work, and I did a couple of runs.


----------



## rdr09

Good Morning!

Just ordered my R9 290. If you are interested . . . here you GO!

http://www.provantage.com/msi-r9-290-4gd5~7MSTI0J1.htm

edit: I guess I am the first owner of this GPU on OCN. It might arrive ahead of the waterblock.


----------



## HeadlessKnight

Quote:


> Originally Posted by *rdr09*
> 
> Good Morning!
> 
> Just ordered my R9 290. If you are interested . . . here you GO!
> 
> http://www.provantage.com/msi-r9-290-4gd5~7MSTI0J1.htm
> 
> edit: I guess I am the first owner of this gpu in OCN. It might arrive ahead of the waterblock.


I'm afraid you just ordered a 9700 Pro. Is this what your card looks like?

jk


----------



## rdr09

Quote:


> Originally Posted by *HeadlessKnight*
> 
> I'm afraid you just ordered a 9700 Pro. Is this what your card looks like?
> 
> jk


oh, noes. i might get mine either tuesday or wednesday. provantage is fast. i ordered from them before using regular mail and as soon as i hit the submit button . . . a few seconds later . . . the item was at my front doorstep. jk. the item came the next day.


----------



## Paul17041993

Quote:


> Originally Posted by *HeadlessKnight*
> 
> I'm afraid you just ordered a 9700 Pro. Is this what your card looks like?
> 
> jk




edit: I'm actually liking that heatsink design, though; I wonder how effective it was... (I only ever had AGP 9250s and a 9550...)


----------



## Newbie2009

Quote:


> Originally Posted by *Arizonian*
> 
> My Sapphire reference card, which we all thought had Hynix memory, actually has Elpida. I've always had horrible luck with overclocking, and I'm comparing with others on air, not under water, btw. My Nvidia cards were EVGA, and now this Sapphire - I can never get a good memory OC. I see some crazy high OCs, but that's how it goes, I guess. Some day I'll get a golden chip, or at least a decent OC.


In my experience, I learned to avoid the bigger names, for both Nvidia and AMD, because I had the exact same experience until the 6950s. My HIS 6950s were great overclockers.

I tried a HIS 7970 and bingo. I then ordered an OEM part and ended up with a Sapphire card, which was just as good.

Maybe it's all luck really, but I feel I have a better chance with smaller companies. I wouldn't hesitate to buy an unknown name for a reference card; only go by the price. My 2c.


----------



## Sgt Bilko

Firestrike Performance,
Graphics score: 11505
Physics score: 7447

http://www.3dmark.com/fs/1087050

Firestrike Extreme,
Graphics Score: 5322
Physics Score: 7446

http://www.3dmark.com/fs/1087087

Both tests were done at 1110 core / 1300 memory, stock volts, Sapphire BIOS,
and the max temp was 59 degrees with an Accelero Xtreme III air cooler.

I'm pretty sure my CPU is holding me back.


----------



## VSG

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats geggeg - will take your word for it.
> 
> Sapphire, right?


1 Sapphire, 1 Asus for now. Blocks and backplates coming next week.


----------



## utnorris

Quote:


> Originally Posted by *Jared Pace*
> 
> Yes sir, happy to help out; I'm just the messenger for all the helpful info out there. Thank sham for PT3 - and thanks for including a link to that thread in your OP. One reason utnorris' card tops out at ~1200 is his BIOS version. DampMonkey is on a different BIOS at 1320: his voltage registers 1387mV and is actually ~1420mV, while utnorris reads ~1360mV and is actually ~1299mV. GPU-Z's VDDCI reading is incorrect, so there's a chance what it says for VRM temp could be incorrect as well. The VRM temp is probably accurate, though - we just need somebody to verify with an IR reading. Also, GPU-Z doesn't show all stats on all cards (a BIOS version mismatch bug, dunno why). And I don't know why $ilent's Memtool isn't working, but $ilent seems to have trouble with every 290X link he downloads, hahah. ATI Winflash wasn't his friend, none of my zip/rar files open for him, and now Memtool shows no info for him. There are alt/backup links for everything there, just in case.


Hey Jared, I am using the PT1 BIOS. I was able to squeeze out just a bit more at 1221MHz; however, that is with me putting the voltage up to 1.41V according to Aida and GPU Tweak. I know it may be lower, so do you know what the vdroop is? From reading your thread, it looks like Aida is supposed to be accurate. VRM temps are getting into the 60s at that voltage. I will probably pull my card and check that it is making good contact, now that it has been on for a day and has had time to settle. Any suggestions?


----------



## Dominican

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Finally got Metro 2033 to work and i did a couple of runs.


Do you have the stock cooler or water?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dominican*
> 
> Do you have the stock cooler or water?


I'm using the water cooler that came with my 8150; it's based off an Asetek model, IIRC.


----------



## kcuestag

Looks like my Sapphire R9 290X also has Elpida memory. But that's not bad, is it? I remember seeing the OcUK staff trying them out, and they overclocked pretty much the same on the 290 and 290X, didn't they?


----------



## scramz

Not quite up to monkey's score, but I broke the 19K gfx score.

i7 3770k @ 4.8GHz
R9 290x @ 1271/1500

Score P15926
Physics Score 11100
Combined Score 10108
Graphics Score 19044

13.11 Beta V7 drivers
Win 7

http://www.3dmark.com/3dm11/7424256
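For anyone wondering what those memory clocks mean in bandwidth terms: GDDR5 transfers at 4x the reported clock, and Hawaii has a 512-bit bus, so the math is simple. A quick sketch - the bus width and data-rate multiplier are the card's published specs, not numbers from this thread:

```python
# Peak theoretical memory bandwidth for a GDDR5 card with the
# R9 290X's 512-bit bus. GDDR5 moves data at 4x the reported clock.
BUS_WIDTH_BITS = 512

def bandwidth_gbps(mem_clock_mhz: float, bus_width_bits: int = BUS_WIDTH_BITS) -> float:
    """Peak bandwidth in GB/s from the GPU-Z-style memory clock in MHz."""
    effective_rate = mem_clock_mhz * 4                 # effective transfer rate, MT/s
    return effective_rate * bus_width_bits / 8 / 1000  # MB/s -> GB/s

print(bandwidth_gbps(1250))  # stock 290X: 320.0 GB/s
print(bandwidth_gbps(1500))  # the 1500 MHz overclock above: 384.0 GB/s
```

So a 1500MHz memory OC is a 20% bandwidth bump over stock, which is why it helps at high resolutions even when core clock gets the attention.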


----------



## esqueue

I just got my XFX R9 290X from Newegg on November 1st using the alarm feature on http://www.nowinstock.net/computers/amdr9290x/. I'm not sure if this is the correct place to ask, but what is the best site to get a waterblock for my card? I'm afraid this card will turn out like most of the older Xbox 360s: while the GPU can handle the temps, the heat warped the board and the lead-free solder didn't hold.

As of now, I know of the kryographics Hawaii and EK-FC R9-290X cooling blocks and would like to know which you guys recommend. I would also like to know of a recommended vendor that has one in stock. I have a watercooling setup connected to my Xbox 360 that I haven't powered on in a while: a Pontiac Bonneville heater core and an acrylic reservoir mounted on a Swiftech MCP355 pump. All I'm planning on doing is adding a CPU block and throwing that in my PC.


----------



## lordzed83

utnorris, try Furmark, mate.. After a 15-minute burn-in at 1080p, my VRM temps on water maxed out at 97c.

60 is looooowwww

On the standard cooler etc. they are 80, AFAIK.


----------



## overclockFrance

I recommend the kryographics Hawaii instead of the EK waterblock: I ordered the EK one and there was no contact with a few memory modules, which made the display blurry when the card was strongly overclocked (memory overheating).

Today I ordered the Aquacomputer, since the memory modules are directly in contact with the waterblock. Even if it is more expensive, Aquacomputer quality is better (I owned the GTX690 Aquacomputer waterblock).

In my experience, the two best waterblock suppliers are Aquacomputer and Watercool, in terms of quality.


----------



## Arizonian

Question for those of you with two 290X's - are you experiencing microstutter?

I've been following this thread and haven't heard any of you with CrossFire mention it, yet it's being echoed elsewhere on the net that it exists with these cards, as badly as with the rest of the lower-end line.

I was under the assumption that with the sideport using PCIe for CrossFire, things were smooth. In fact, I do recall hearing "smooth as butter".

Thanks


----------



## lordzed83

overclockFrance, mine works fine.
I did have some VRM temperature problems ---> I used the wrong pads; I didn't know they came in 2 different sizes.
Anyhow, I've got the thicker pad on everything now.


----------



## beejay

Just tested Furmark on my 290x. I'm using the Arctic Accelero Hybrid. I got up to 80c in about 10 minutes. Funny though, the fan speed only goes up to 24%, then it throttles at about 80c. I have the fan speed set to 100% in Overdrive and the target temp set to 96c. I know it would be cooler if the fan speed went higher. Any thoughts?

(I'll upload the picture tomorrow)


----------



## maarten12100

Quote:


> Originally Posted by *Arizonian*
> 
> Question for those of you with two 290X's - are you experiencing microstutter?
> 
> I've been following this thread and haven't heard any of you with CrossFire mention it, yet it's being echoed elsewhere on the net that it exists with these cards, as badly as with the rest of the lower-end line.
> 
> I was under the assumption that with the sideport using PCIe for CrossFire, things were smooth. In fact, I do recall hearing "smooth as butter".
> 
> Thanks


According to the reviews, it is smooth as butter at high resolutions; [H]OCP actually stated it's smoother than SLI.


----------



## kcuestag

Quote:


> Originally Posted by *beejay*
> 
> Just tested furmark on my 290x. I'm using the Arctic Accelero Hybrid. I got up to 80c in about 10 minutes. Funny though, the fan speed only goes up to 24%, then it throttles at about 80c. I have the fanspeed setting to 100% in overdrive, target temp set to 96c. I know it would be cooler if the fanspeed goes higher. Any thoughts?
> 
> (I'll upload the picture tomorrow)


Use MSI AB to set the Accelero fans to 100%; the difference will be quite dramatic.

A few years ago I used an Accelero on my HD5970, and I remember I no longer used the AUTO fan profile as it was stupid; the Accelero was quiet even at 100%, so I kept it at a manual speed at all times.
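If 100% is louder than you'd like, Afterburner's user-defined fan profile amounts to linear interpolation between temperature/speed points. A hypothetical sketch of that mapping - these breakpoints are made up for illustration, not anyone's actual profile:

```python
# Hypothetical fan curve: linear interpolation between (temp C, fan %) breakpoints,
# the same idea as MSI Afterburner's user-defined fan profile.
CURVE = [(40, 30), (60, 50), (75, 80), (85, 100)]

def fan_speed(temp_c: float, curve=CURVE) -> float:
    """Fan duty (%) for a GPU temperature, clamped at the curve's endpoints."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    # Find the segment containing temp_c and interpolate along it.
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(50))  # halfway between the 40C and 60C points -> 40.0
print(fan_speed(90))  # past the last point -> 100
```

Steeper segments near the throttle temperature keep the card quiet at idle while still ramping hard before it hits the target temp.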


----------



## beejay

Yep, installed Afterburner now. I'm running the full 1030mhz and stock mem clocks, 10 minutes in. No throttling; temps are at 70C.
Quote:


> Originally Posted by *kcuestag*
> 
> Use MSI AB to set the Accelero fans to 100%; the difference will be quite dramatic.
> 
> A few years ago I used an Accelero on my HD5970, and I remember I no longer used the AUTO fan profile as it was stupid; the Accelero was quiet even at 100%, so I kept it at a manual speed at all times.


----------



## beejay

Wow, did a minor OC of 1080 core / 1300 memory. Burn-in temp at 70C @ 5 minutes.


----------



## sugarhell

Don't test with Furmark. People, you need to learn not to use Furmark.


----------



## beejay

I know ^__^

But sometimes a bit of a quick fur burn can help check stability at high temps. I'm not benchmarking yet; I'll bench tomorrow. Just checking stability.
Quote:


> Originally Posted by *sugarhell*
> 
> Don't test with Furmark. People, you need to learn not to use Furmark.


----------



## Slomo4shO

I might have missed them, but have there been any overclocked benchmarks of 2-, 3-, or 4-way CrossFire? I want to know how my current setup compares before the launch of the 290s this Tuesday.


----------



## skupples

Quote:


> Originally Posted by *beejay*
> 
> Just tested furmark on my 290x. I'm using the Arctic Accelero Hybrid. I got up to 80c in about 10 minutes. Funny though, the fan speed only goes up to 24%, then it throttles at about 80c. I have the fanspeed setting to 100% in overdrive, target temp set to 96c. I know it would be cooler if the fanspeed goes higher. Any thoughts?
> 
> (I'll upload the picture tomorrow)


Furmark is good for two things: blowing up your card, and testing its max power draw and temps...


----------



## rdr09

Quote:


> Originally Posted by *Slomo4shO*
> 
> I might have missed them but has there been an overclock benchmarks of a 2, 3, or 4 way crossfire? I want to know how my current setup compares before the launch of the 290s this Tuesday


ordered one this AM

http://www.provantage.com/msi-r9-290-4gd5~7MSTI0J1.htm


----------



## beejay

Well, that was my goal: max temp at max load. I had to run Furmark to test the Accelero Hybrid.

Anyway, going to flash the BIOS tomorrow and see what I can get in proper benchmarks.


----------



## rdr09

Quote:


> Originally Posted by *beejay*
> 
> Well, that was my goal: max temp at max load. I had to run Furmark to test the Accelero Hybrid.
> 
> Anyway, going to flash the BIOS tomorrow and see what I can get in proper benchmarks.


Use the GPU-Z render test: click the ? mark and start the render test. Have HWiNFO64 running in the background to see temps for the core and VRMs.


----------



## beejay

Quote:


> Originally Posted by *Arizonian*
> 
> That GPU-Z worked for me but still love to see pics. Put it in your original post too by editing it if you'd like for it to be linked from roster.
> 
> Congrats Added


Added images from when I was working on the installation. I'll update the GPU-Z link with the OC tomorrow.

Cheers!


----------



## esqueue

Quote:


> Originally Posted by *overclockFrance*
> 
> I recommend the kryographics Hawaii instead of the EK waterblock: I ordered the EK one and there was no contact with a few memory modules, which made the display blurry when the card was strongly overclocked (memory overheating).
> 
> Today I ordered the Aquacomputer, since the memory modules are directly in contact with the waterblock. Even if it is more expensive, Aquacomputer quality is better (I owned the GTX690 Aquacomputer waterblock).
> 
> In my experience, the two best waterblock suppliers are Aquacomputer and Watercool, in terms of quality.


First, thanks for responding to my question. If the EK has issues with the memory, then I will definitely look into the Aquacomputer. Where do you recommend I purchase it from? All the places I've checked online seem to be sold out.


----------



## tsm106

Quote:


> Originally Posted by *overclockFrance*
> 
> I recommend the kryographics Hawaii instead of the EK waterblock: I ordered the EK one and there was no contact with a few memory modules, which made the display blurry when the card was strongly overclocked (memory overheating).
> 
> Today I ordered the Aquacomputer, since the memory modules are directly in contact with the waterblock. Even if it is more expensive, Aquacomputer quality is better (I owned the GTX690 Aquacomputer waterblock).
> 
> In my experience, the two best waterblock suppliers are Aquacomputer and Watercool, in terms of quality.


User error, or a single-instance defect? You sound like more of a noob for implying it's representative of all EK 290X blocks.


----------



## kcuestag

No issues here with the EK-FC R9 290X - Nickel; my VRMs are well below 55ºC at all times.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *tsm106*
> 
> User error or a single instance defect? You sound more noob for implying it is representative of all ek 290x blocks.


I ordered the EK 290X blocks myself, and no one else has made the same complaints he has. DampMonkey is pushing higher and higher numbers, as are other members, and they all seem to have EK blocks.


----------



## Paul17041993

Gaps over the memory will occur if you use the wrong pads and/or screw it on wrong; the only other cause is if the block and/or PCB has a severe warp, which would be noticeable before you even tried to mount them...


----------



## skupples

EK is the #1 VRM cooler on the market. All waterblocks can cool the GPU, but not many of them focus on the VRMs like EK does!

I have only ever used EK on all my GPUs and never had an issue with this supposed poor memory module contact. Did you stretch your thermal pads too thin? Most companies are shipping them perforated these days, so that should be a non-issue.

Those other companies may have a "higher build quality", but they tend to totally ignore the VRMs, which are the #1 concern when pushing volts.


----------



## TheSoldiet

1140MHz core clock and 1400MHz memory clock, stable on stock cooling.


----------



## tsm106

Quote:


> Originally Posted by *skupples*
> 
> EK is the #1 VRM cooler on the market. All waterblocks can cool the GPU, but not many of them focus on the VRMs like EK does!
> 
> I have only ever used EK on all my GPUs and never had an issue with this supposed poor memory module contact. Did you stretch your thermal pads too thin? Most companies are shipping them perforated these days, so that should be a non-issue.
> 
> Those other companies may have a "higher build quality", but they tend to totally ignore the VRMs, which are the #1 concern when pushing volts.


I don't really care for a palm-tree badge on my full-cover block. I'll take the fattest, widest VRM channel design instead.


----------



## Forceman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Finally got Metro 2033 to work and i did a couple of runs.


What was going on at the end there? It looked choppy as heck.


----------



## sugarhell

Quote:


> Originally Posted by *Forceman*
> 
> What was going on at the end there - that looks choppy as heck.


8150


----------



## Forceman

Quote:


> Originally Posted by *rdr09*
> 
> Good Morning!
> 
> Just ordered my R9 290. If you are interested . . . here you GO!
> 
> http://www.provantage.com/msi-r9-290-4gd5~7MSTI0J1.htm
> 
> edit: I guess I am the first owner of this gpu in OCN. It might arrive ahead of the waterblock.


Did you notice this in the shopping cart page?

"Direct from the Manufacturer. Place your order with us and we will have the manufacturer ship this item directly to you. See Availability. 100% Satisfaction Guaranteed."

I just put one in the cart to see the shipping charges and noticed that - so I'm guessing it won't ship before Tuesday.

Edit: And now it went from 22 In Stock to Out Of Stock in about 5 seconds.


----------



## rdr09

Quote:


> Originally Posted by *Forceman*
> 
> Did you notice this in the shopping cart page?
> 
> "Direct from the Manufacturer. Place your order with us and we will have the manufacturer ship this item directly to you. See Availability. 100% Satisfaction Guaranteed."
> 
> I just put one in the cart to see the shipping charges and noticed that - so I'm guessing it won't ship before Tuesday.
> 
> Edit: And now it went from 22 In Stock to Out Of Stock in about 5 seconds.


When I checked for availability it said "in stock". Anyway, I checked again and now it's OOS. There were only 22 to begin with.

edit: I was hoping to get the card before the block, so I can gauge how it does on air.


----------



## PillarOfAutumn

What exactly is TrueAudio? Does it sidestep the soundcard? If you have a sound card, will TrueAudio add to the experience, or will it just not make a difference?


----------



## SonDa5

After a week I have finally got my 290x up and running.









Stock heat sink and settings.

http://www.techpowerup.com/gpuz/b4623



ASIC score 72%. I was hoping for 90%+, but maybe it doesn't matter as much as it did for the 7900 series.


----------



## overclockFrance

Quote:


> Originally Posted by *skupples*
> 
> EK makes the #1 VRM cooler on the market. Any waterblock can cool the GPU, but not many of them focus on the VRMs like EK does!
> 
> I have only ever used EK on all my GPUs and never had an issue with this supposed poor memory module contact. Did you stretch your thermal pads too thin? Most companies are shipping them perforated these days, so that should be a non-issue.
> 
> Those other companies may have a "higher build quality", but they tend to totally ignore the VRMs, which is the #1 concern when pushing volts.


No, I did not stretch the thermal pads. They were intact and pre-cut. I applied the thicker thermal pad to the VRMs and tightened the screws sufficiently. I don't see the cause of the poor contact. To err is human and I might have made a mistake, but this is not the first waterblock I have installed.

I may have received a bad waterblock, since none of you are facing the same problem. But the "quality control" sticker should certify that the waterblock is not defective.


----------



## skupples

Quote:


> Originally Posted by *Forceman*
> 
> Did you notice this in the shopping cart page?
> 
> "Direct from the Manufacturer. Place your order with us and we will have the manufacturer ship this item directly to you. See Availability. 100% Satisfaction Guaranteed."
> 
> I just put one in the cart to see the shipping charges and noticed that - so I'm guessing it won't ship before Tuesday.
> 
> Edit: And now it went from 22 In Stock to Out Of Stock in about 5 seconds.


Drop shipping is the way of the future!
Quote:


> Originally Posted by *overclockFrance*
> 
> No, I did not stretch the thermal pads. They were intact and pre-cut. I applied the thicker thermal pad to the VRMs and tightened the screws sufficiently. I don't see the cause of the poor contact. To err is human and I might have made a mistake, but this is not the first waterblock I have installed.
> 
> I may have received a bad waterblock, since none of you are facing the same problem. But the "quality control" sticker should certify that the waterblock is not defective.


In Utopia, maybe. The QC sticker is a joke, and it's on every product you buy. Though it is strange, as these blocks are machined by computers. Unless someone bumped the machine with a forklift while it was being made...


----------



## $ilent

Quote:


> Originally Posted by *SonDa5*
> 
> After a week I have finally got my 290x up and running.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Stock heat sink and settings.
> 
> http://www.techpowerup.com/gpuz/b4623
> 
> 
> 
> ASIC score 72%. Was hoping for a 90%+ but maybe it doesn't matter as much as it did for 7900 series.


Good stuff, virtually the same ASIC score as me. I don't think it matters much; I've not seen one over ~75% in here yet, let alone 90% lol.


----------



## Mr357

Quote:


> Originally Posted by *SonDa5*
> 
> After a week I have finally got my 290x up and running.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Stock heat sink and settings.
> 
> http://www.techpowerup.com/gpuz/b4623
> 
> 
> 
> ASIC score 72%. Was hoping for a 90%+ but maybe it doesn't matter as much as it did for 7900 series.


You just made my day. Hadn't seen an ASIC rating lower than my 73.1% until now; I guess anything above 70% is decent.


----------



## Faksnima

Just ordered the R9 290 from Amazon for $390 after $115 in GC/promotions. I am hoping it's close to the Titan's performance... can't go wrong at $390 for Titan-level perf. I heard the reason it was held back was a driver release to increase performance 10%? Any truth to this? Also, has anyone managed to unlock the missing shaders (should they exist) on the R9 290?


----------



## $ilent

Quote:


> Originally Posted by *Faksnima*
> 
> Just ordered the R9 290 from Amazon for $390 after $115 in GC/promotions. I am hoping it's close to the Titan's performance... can't go wrong at $390 for Titan-level perf. I heard the reason it was held back was a driver release to increase performance 10%? Any truth to this? Also, has anyone managed to unlock the missing shaders (should they exist) on the R9 290?


They held it back because of the drivers; they wanted reviews done on the new driver so that it looks like the card has better performance. From what I've read, nobody seems to think unlocking shaders is possible, but you never know!


----------



## scramz

I am new to GPU overclocking and watercooling. If one of the memory modules does not have good contact, is there any way to detect this within Windows?


----------



## Mr357

Quote:


> Originally Posted by *scramz*
> 
> I am new to GPU overclocking and watercooling. If one of the memory modules does not have good contact, is there any way to detect this within Windows?


I believe the "extreme" version of AIDA64 can read memory IC temperatures.


----------



## scramz

Quote:


> Originally Posted by *Mr357*
> 
> I believe the "extreme" version of AIDA64 can read memory IC temperatures.


So even if one memory module is overheating, it will show this?


----------



## Mr357

Quote:


> Originally Posted by *scramz*
> 
> So even if one memory module is overheating, it will show this?


I really don't know. I haven't used it.


----------



## Arizonian

Quote:


> Originally Posted by *$ilent*
> 
> They held it back because of the drivers; they wanted reviews done on the new driver so that it looks like the card has better performance. From what I've read, nobody seems to think unlocking shaders is possible, but you never know!


I'm sure the shaders will be laser-cut, which makes unlocking them a dead end. No confirmation though.


----------



## utnorris

Quote:


> Originally Posted by *scramz*
> 
> Not quite up to monkey's score but I broke the 19K gfx score
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i7 3770k @ 4.8GHz
> R9 290x @ 1271/1500
> 
> Score P15926
> Physics Score 11100
> Combined Score 10108
> Graphics Score 19044
> 
> 11.13 Beta V7 drivers
> Win 7
> 
> http://www.3dmark.com/3dm11/7424256


Your overall score seems a little low. I got this at 1271/1475MHz with my CPU at 4.7GHz:
http://www.3dmark.com/3dm11/7421025

P16038

Graphics Score
18381

Physics Score
12162

Combined Score
10853
Quote:


> Originally Posted by *$ilent*
> 
> Good stuff, virtually the same ASIC score as me. I don't think it matters much; I've not seen one over ~75% in here yet, let alone 90% lol.


Mine is at 78% and I have seen a few more around that, but none above 80%.


----------



## jomama22

A preview of my six-card findings (which will be up a bit later... most likely after I sell the cards).

The ASICs:

72.6%
72.6%
72.3%
70.4%
74%
77.2%

Nice spread. And as luck would have it...

3 are Elpida, 3 are Hynix; the memory manufacturer doesn't make a difference. Clock-wise, they are the same as well.
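To put a number on that spread, here is a small illustrative snippet (the `asics` list simply copies the six values above):

```python
# Quick stats on the six reported ASIC quality values,
# just to quantify the "nice spread".
asics = [72.6, 72.6, 72.3, 70.4, 74.0, 77.2]

mean = sum(asics) / len(asics)
spread = max(asics) - min(asics)

print(f"mean {mean:.1f}%, spread {spread:.1f} points")  # mean 73.2%, spread 6.8 points
```

So the whole batch sits within about seven points of each other, centered just above 73%.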


----------



## Mr357

Quote:


> Originally Posted by *jomama22*
> 
> A preview of my six-card findings (which will be up a bit later... most likely after I sell the cards).
> 
> The asics:
> 
> 72.6%
> 72.6%
> 72.3%
> 70.4%
> 74%
> 77.2%
> 
> Nice spread. And as luck would have it...
> 
> 3 are Elpida, 3 are Hynix; the memory manufacturer doesn't make a difference. Clock-wise, they are the same as well.


If I'm not mistaken, 77.2% is the highest posted here thus far. Nice!


----------



## Durquavian

So far, who has the best card on air and who has the best under water? Came here hoping to see what TSM has gotten.


----------



## Falkentyne

Quote:


> Originally Posted by *$ilent*
> 
> Good stuff, virtually the same ASIC score as me. I don't think it matters much; I've not seen one over ~75% in here yet, let alone 90% lol.


Mine is 77.9%, but I can only do 1100MHz core on stock volts/cooling. I haven't tried values in between, but 1125MHz will black-screen in Heaven in less than 30 seconds (not even at 70°C when it happens).


----------



## jomama22

Quote:


> Originally Posted by *Mr357*
> 
> If I'm not mistaken, 77.2% is the highest posted here thus far. Nice!


It also has the highest stable stock-BIOS clocks on air of the six (1150 core) and the lowest power usage/voltage under load (<1.1V), but things change once the Asus BIOS is put on... I've said too much!


----------



## SonDa5

Quote:


> Originally Posted by *Mr357*
> 
> You just made my day. Hadn't seen an ASIC rating lower than my 73.1% until now; I guess anything above 70% is decent.


Great card. Stock heatsink and stock settings on the card.



http://www.3dmark.com/3dm11/7425979

I have yet to learn how to overclock it, and I have an EK block to put on it next.


----------



## M125

Wow, these things are worthless in [email protected] Nowhere in the launch-day coverage did I read about a change from double precision being 1/4 of SP. AMD dropped it to 1/8 of SP in an attempt to lower TDP on Hawaii. The end result is that Tahiti is more powerful than Hawaii in double-precision calculations. My 290X is 30% slower than a 7970 GHz.

This, combined with the immense heat and noise of the reference cooler (I'm sort of stuck with it; it's in a Node 304, with no room for enough radiator surface area to cool 400+ watts effectively), has me debating selling it. A 7970/R9 280X or a GTX 780 should do until 20nm GPUs arrive, with a single 2560x1440 display, I think.

If you are watercooling or have great aftermarket cooling and only play games, Hawaii is excellent.
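The 1/8 vs. 1/4 ratio alone accounts for roughly that gap. A back-of-the-envelope peak-FLOPS sketch, assuming reference clocks and the commonly cited shader counts and DP:SP ratios (illustrative numbers, not a benchmark):

```python
# Theoretical peak throughput, assuming reference clocks and the
# commonly cited DP:SP ratios (1/8 on Hawaii, 1/4 on Tahiti).

def peak_gflops(shaders, clock_ghz, dp_ratio):
    """2 FLOPs (one fused multiply-add) per shader per cycle."""
    sp = 2 * shaders * clock_ghz
    return sp, sp * dp_ratio

# R9 290X (Hawaii): 2816 shaders at up to 1.0 GHz, DP = 1/8 SP
hawaii_sp, hawaii_dp = peak_gflops(2816, 1.0, 1 / 8)
# HD 7970 GHz Edition (Tahiti): 2048 shaders at 1.05 GHz, DP = 1/4 SP
tahiti_sp, tahiti_dp = peak_gflops(2048, 1.05, 1 / 4)

print(hawaii_sp, hawaii_dp)  # 5632.0 704.0
print(tahiti_sp, tahiti_dp)  # 4300.8 1075.2
print(f"Hawaii peak DP deficit: {1 - hawaii_dp / tahiti_dp:.0%}")  # 35%
```

So despite a ~30% SP advantage, Hawaii's theoretical peak DP is about 35% below the 7970 GHz, which lines up with the observed DP results.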


----------



## $ilent

@M125, on [email protected] my card managed around 185k PPD, which is the same as or better than a Titan and much more than a 780 (150k IIRC).

My ASIC is just under 72%; the best I've managed thus far is 1125/1450 in 3DMark 11 at stock voltage.


----------



## Arizonian

Quote:


> Originally Posted by *Durquavian*
> 
> So far, who has the best card on air and who has the best under water? Came here hoping to see what TSM has gotten.


Not sure - keeping track is hard with the thread moving so fast. The bench-off in the General Graphics section says mine's pretty far up there. I'm not sure if all of DampMonkey's entries were on water or if he submitted on air. My benchmarks are in the spoiler.



Spoiler: BENCHMARKS



*3DMark 11 Performance*

http://www.3dmark.com/3dm11/7375771

*3DMark11 Extreme*

http://www.3dmark.com/3dm11/7380927

*Firestrike*

http://www.3dmark.com/3dm/1484478

*Firestrike Extreme*

http://www.3dmark.com/3dm/1488266



Alatar hasn't sorted it from highest to lowest or by air vs. water yet.


----------



## maarten12100

Quote:


> Originally Posted by *$ilent*
> 
> @M125, on [email protected] my card managed around 185k PPD, which is the same as or better than a Titan and much more than a 780 (150k IIRC).
> 
> My ASIC is just under 72%; the best I've managed thus far is 1125/1450 in 3DMark 11 at stock voltage.


[email protected] is single precision, but that is indeed a fair number of points.


----------



## $ilent

I wish we knew when Afterburner is getting an update. In fact, are we even sure there is going to be one?


----------



## Sgt Bilko

Quote:


> Originally Posted by *$ilent*
> 
> I wish we knew when afterburner is getting an update. In fact are we even sure there is going to be an update?


There has to be..... I don't want to have to flash my card.


----------



## skupples

Quote:


> Originally Posted by *$ilent*
> 
> I wish we knew when afterburner is getting an update. In fact are we even sure there is going to be an update?


I'm sure it will be part of the next beta; he may be waiting for the 290 to come out so he can do it all at once.


----------



## $ilent

Hmm, I hope it's soon. I want to overvolt by the time I get my WC stuff, hopefully mid-November.


----------



## Taint3dBulge

Quote:


> Originally Posted by *$ilent*
> 
> I wish we knew when afterburner is getting an update. In fact are we even sure there is going to be an update?


There will be. You will see it first at Guru3D, since they are partnered with AB, or so I believe.


----------



## kcuestag

Quote:


> Originally Posted by *$ilent*
> 
> I wish we knew when afterburner is getting an update. In fact are we even sure there is going to be an update?


http://forums.guru3d.com/showthread.php?p=4691038#post4691038


----------



## Sgt Bilko

Quote:


> Originally Posted by *kcuestag*
> 
> http://forums.guru3d.com/showthread.php?p=4691038#post4691038


Sweet. Thanks for the link


----------



## $ilent

Quote:


> Originally Posted by *kcuestag*
> 
> http://forums.guru3d.com/showthread.php?p=4691038#post4691038


Ah thats good news, sounds like its not too far away!


----------



## skupples

Good ol' Unwinder, hard at work!


----------



## M125

Quote:


> Originally Posted by *$ilent*
> 
> @M125, on [email protected] my card managed around 185k ppd, which is same or better than Titan and much more than 780 (150k IIRC).
> 
> My Asic is just under 72%, best test ive managed thus far is 1125/1450 on 3dmark11, stock volt.


Oh yeah, if you only use single precision, then yes, an R9 290X decimates all. I've tested with [email protected] (SP), Milkyway (DP), Einstein (SP), and [email protected] (SP), and only [email protected] impressed me. [email protected] is an oddball because of its CPU/RAM reliance, so I'm not too concerned about performance there (my 7750 with a Xeon E3 performs about two-thirds as well as the R9 290X with a 4770K). Einstein needs an app update, as I'm running on par with a 7950.

I'm just disappointed in the reduction of double-precision capability. Milkyway is one of my main projects and has been for a while. I wonder how much a 2816-stream FirePro would cost?


----------



## $ilent

Ah, I don't really read into that precision stuff too much.


----------



## rdr09

Quote:


> Originally Posted by *Faksnima*
> 
> Just ordered the R9 290 from Amazon for $390 after $115 in GC/promotions. I am hoping it's close to the Titan's performance... can't go wrong at $390 for Titan-level perf. I heard the reason it was held back was a driver release to increase performance 10%? Any truth to this? Also, has anyone managed to unlock the missing shaders (should they exist) on the R9 290?


got mine for $460.









Quote:


> Originally Posted by *scramz*
> 
> I am new to GPU overclocking and watercooling. If one of the memory modules does not have good contact, is there any way to detect this within Windows?


New? Your graphics score is the highest on the single-card list of the Top 30 in the 3DMark 11 benchmarking thread. Congrats. May I join you in savoring the moment? The Tis are coming.


----------



## skupples

"Unlocking" the missing shaders should theoretically be impossible, since they are lasered to not function. It's not like the good ol' days when you could do that with some CPUs.


----------



## Tobiman

Quote:


> Originally Posted by *M125*
> 
> Oh, yeah, If you only use single precision, then yes, an R9 290X decimates all. I've tested with [email protected] (SP), Milkyway (DP), Einstein (SP), and [email protected] (SP), and only [email protected] impressed me. [email protected] is an oddball because of its CPU/RAM reliance, so I'm not too concerned about performance there (my 7750 with a Xeon E3 is 2/3rds as performing as the R9 290X and a 4770K). Einstein needs an app update, as I'm running on par with a 7950.
> 
> I'm just disappointed in the reduction of double precision capability. Milkyway is one of my main projects and has been for a while. I wonder how much a 2816 stream FirePro would cost?


I dunno. Maybe a 7990 would be a good choice, if you can still find one for cheap.


----------



## Asrock Extreme7

My ASIC is 69.6 (Elpida), but it overclocks OK: first try was 1250 core / 1400 mem at 1.3V. Going to put this card under water, then see how high I can go.


----------



## $ilent

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> My ASIC is 69.6 (Elpida), but it overclocks OK: first try was 1250 core / 1400 mem at 1.3V. Going to put this card under water, then see how high I can go.


Good to know ASIC apparently doesn't make much difference! Mine is lower than others' too.


----------



## Sgt Bilko

Very good to know. I was a little worried with mine being 70.4%. Going to get an 8350 within the next two weeks and see what gains I get from it.


----------



## Ukkooh

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> My ASIC is 69.6 (Elpida), but it overclocks OK: first try was 1250 core / 1400 mem at 1.3V. Going to put this card under water, then see how high I can go.


Are you saying you got 1250 on the reference cooler? :O
It must be a heck of a beast under water.


----------



## Arizonian

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> My ASIC is 69.6 (Elpida), but it overclocks OK: first try was 1250 core / 1400 mem at 1.3V. Going to put this card under water, then see how high I can go.


Amazing OC... stable gaming?

Also, I noticed you haven't joined us here. Submit a pic with your OCN name or a GPU-Z validation and I'll add you to the roster. That goes for any other 290X owners following this thread who haven't joined - please do.


----------



## xxmastermindxx

Playing with clocks a little today. On the plus side, my load temps top out around 41°C, and I can do multiple runs of 3DMark/3DMark11 at 1100 core. The bad side is that my memory doesn't seem to like anything over 1375; if I go over, I start getting artifacts just on the desktop. It has Hynix modules.

I'm going to play with Asus GPU Tweak and PT3 bios later.

Change me to water


----------



## Arizonian

Quote:


> Originally Posted by *xxmastermindxx*
> 
> Playing with clocks a little today. On the plus side, my load temps top out around 41C, and I can do multiple runs of 3DMark/3DMark11 at 1100 core. Bad side is my memory doesn't seem to like anything over 1375. If I go over, I start getting artifacts just on the desktop. It's got Hynix modules.
> 
> I'm going to play with Asus GPU Tweak and PT3 bios later.
> 
> Change me to water
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Looks good. Nice temps. You should be able to do better on that core, because I'm getting 1100MHz core / 1300MHz memory stable gaming on air. We just need AB to support voltage now so we can open them up. Keep us updated.

Updated to water.


----------



## Sgt Bilko

Agreed. I'm running 1110/1300 stable on air. You should be able to push it to 1200/1400 at least.

Hopefully the bios tweaking works out.

Nice temps btw, good to see the heat being tamed


----------



## HoZy

Phwoar, I've been chasing a 290X EK waterblock & backplate for a week now in Aus. Everywhere that had a couple has sold out, and no one has ETAs.


----------



## M125

Quote:


> Originally Posted by *Tobiman*
> 
> I dunno. Maybe a 7990 would be a good choice, if you can still find one for cheap.


Yeah, I'm in a real tight spot. I try to build no-compromise ITX systems for the challenge, and prefer minuscule, quiet systems over towers whenever possible. I'm building in a Fractal Node 304 and cannot feasibly watercool (one 120/140mm rad, one 92mm rad - that's it for a 400W power target), and on air a blower-type card is needed for such a tight case. The R9 290X's reference blower is far too loud under load and/or cannot cool the thing effectively, and the 7990 is not a blower type. I've had 7950 CrossFire before (_well_ before the frame-pacing fix) and do not wish to go back even with the fix. SLI with a 690 is not an option either. Single cards for me. I just want one BIG die with a blower-type heatsink that performs on par with Nvidia's reference blower solution for the Titan-770. That, and 1/4 SP double precision. Seems like a pipe dream.









And looking at it now, there are no R9 280X (7970) cards with better-than-reference blower-type heatsinks, are there? It's either the loud reference blower or a quiet dual-fan design. I see a small hole in the market where you could fit something like the HIS 7870 IceQ blower (redesigned for 200W+) on an R9 280X or greater. Is it an issue of design? Are AMD's reference blowers as quiet as they can be while moving the air they do? The R9 290X is quiet up to around 1600 RPM. Drawing inspiration from dual-fan designs, what if the vapor chamber were staggered and two blower fans were fitted in succession, one thicker than the other? Run them both at 1600 RPM and you have a cooler-running card. I know it is much more complicated than that, but there seems to be little to no interest in blower-type aftermarket cooling solutions on AMD's side.

If Nvidia's reference heatsink for the Titan can keep it cool AND _relatively_ quiet with 250W to dissipate, why can't AMD GPUs have a similar heatsink? What makes Nvidia's heatsink more efficient? A different vapor chamber design, a different fan? Both?


----------



## the9quad

Quote:


> Originally Posted by *Arizonian*
> 
> Amazing OC...stable gaming?
> 
> Also noticed you haven't joined us here. Submit a pic with OCN name or GPU-Z validation and I'll add you to roster. That goes for any other 290X owners following this thread that haven't joined, please do.


Is this what you need? I am new here.

http://www.techpowerup.com/gpuz/w6zz8/

Also, my ASIC is 72.2. The most I've messed with so far is 1120/1425 on air, and it resulted in 10692 in Firestrike with 12169 for graphics. http://www.3dmark.com/fs/1090085

Might see if it will go higher, but not until I figure out why it randomly black-screens at stock clocks. Anyone else having this issue? It seems like I can play a game all day just fine, then maybe the next day after 3 minutes of gaming it will black-screen. I'll reboot and it will run fine, then maybe it randomly black-screens again later. It really is random and frustrating. I flashed the BIOS to the Asus one today and am trying a custom fan profile, thinking it might be temp-related. I hope it works.


----------



## Arizonian

Quote:


> Originally Posted by *the9quad*
> 
> Is this what you need? I am new here.
> 
> http://www.techpowerup.com/gpuz/w6zz8/
> 
> Also, my ASIC is 72.2. The most I've messed with so far is 1120/1425 on air, and it resulted in 10692 in Firestrike with 12169 for graphics. http://www.3dmark.com/fs/1090085
> 
> Might see if it will go higher, but not until I figure out why it randomly black-screens at stock clocks. Anyone else having this issue? It seems like I can play a game all day just fine, then maybe the next day after 3 minutes of gaming it will black-screen. I'll reboot and it will run fine, then maybe it randomly black-screens again later. It really is random and frustrating. I flashed the BIOS to the Asus one today and am trying a custom fan profile, thinking it might be temp-related. I hope it works.


Welcome aboard OCN with your first post.









Yes, that's perfect - I'm assuming Sapphire? If it's different, let me know and I'll update it. Congrats on your 290X; you're added to the club roster.









The only thing I can think of that's causing your black screen is possibly your PSU, since you're at stock. I doubt it's temps, because the card can handle stock temperatures in either quiet mode or uber mode. Sounds like the PSU might be going, or being pushed to the edge possibly.....

Would love to see your system specs if you post them. *"How to put your Rig in your Sig"* It will help others know what you're dealing with and possibly help you more easily.


----------



## alancsalt

Quote:


> Originally Posted by *HoZy*
> 
> Phwoar, I've been chasing a 290x EK waterblock & backplate for a week now in Aus, Everywhere that had a couple have sold out. And no one has ETA's


Just go online and order them direct from EKWB. They only take about a week to arrive. The only risk is whether customs inspects them; if they do, you'll get charged an extra $50 for the privilege.

I've bought all my EK blocks that way.


----------



## HoZy

Quote:


> Originally Posted by *alancsalt*
> 
> Just go online and order them direct from EKWB. They only take about a week to arrive. The only risk is whether customs inspects them; if they do, you'll get charged an extra $50 for the privilege.
> 
> I've bought all my EK blocks that way.


EKWB = "This product is not available in the requested quantity. 2 of the items will be backordered."

Might just bite the bullet and order. At least it's 3-day DHL shipping. They should have it in stock before anyone locally, I assume.

Cheers
Mat

EDIT:

ORDERED


----------



## SpewBoy

Quote:


> Originally Posted by *alancsalt*
> 
> Just go online and order them direct from EKWB. They only take about a week to arrive. The only risk is whether customs inspects them; if they do, you'll get charged an extra $50 for the privilege.
> 
> I've bought all my EK blocks that way.


At the moment, at least, there's no stock. It's also $50 postage to Australia, which is a bit steep, but I guess if you want the block, you want the block.

EDIT: I got beaten... lol. Also didn't notice it was 3-day DHL Express... More tempting.

EDIT 2: Wooh my 290Xs just arrived!



My bathroom has decent lighting...


----------



## tsm106

Metro 2033 @ 1200/1500 air



Metro LL @1150/1500 air



This is the most benching on air I've ever done lol.


----------



## Deedot78

Where are the comparisons of stock-cooler vs. watercooled overclocks? Trying to decide on a waterblock to add and seeing what kind of performance is out there.


----------



## Arizonian

Quote:


> Originally Posted by *SpewBoy*
> 
> At least at the moment, there's no stock. It's also $50 postage to Australia, which is a bit steep but I guess if you want the block, you want the block.
> 
> EDIT: I got beaten... lol. Also didn't notice it was 3 days DHL Express... More tempting.
> 
> EDIT 2: Wooh my 290Xs just arrived!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> My bathroom has decent lighting...


Nice - congrats on two HIS - added.









Quote:


> Originally Posted by *tsm106*
> 
> Metro 2033 @ 1200/1500 air
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Metro LL @1150/1500 air
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> This is the most benching on air I've ever done lol.


That's one hell of a memory OC on air!

The core seems to be game-dependent for overclocking. My best OC across all games is 1100MHz / 1300MHz, but I test in Crysis 3 for stability. If it's good there, it's safe across all other games.


----------



## raghu78

Quote:


> Originally Posted by *tsm106*
> 
> Metro 2033 @ 1200/1500 air
> 
> Metro LL @1150/1500 air
> 
> This is the most benching on air I've ever done lol.


tsm106, are you waiting for your waterblocks, or have you already put the R9 290X under water?


----------



## tsm106

Yea, I'm stuck in the fcpu queue and it's soo annoying. I got a sheet of Ultra Extremes too, for the mem and VRMs. My patience is really being tested with all this waiting.


----------



## MasterT

Been waiting a while for some of you guys to receive two or more of these cards. My question: I have 3 1440p Korean monitors, and I want to get 2x 290Xs. Can I connect 2 monitors to one card and the third to the second card? Or does all the connecting have to be done on one card?


----------



## VSG

For eyefinity, you have to use a single card.


----------



## MasterT

Quote:


> Originally Posted by *geggeg*
> 
> For eyefinity, you have to use a single card.


Damn. So I have to get a converter then.


----------



## raghu78

Quote:


> Originally Posted by *tsm106*
> 
> Yea, I'm stuck in the fcpu queue and its soo annoying. I got a sheet of ultra extremes too, for the mem and vrms. My patience is really getting tested with all this waiting.


There is some more performance waiting for you. It will be revealed on Tuesday.









http://forums.overclockers.co.uk/showthread.php?t=18551534&page=9

gibbo of ocuk

"The blurred out card is the new card coming Tuesday 5am from AMD. *A card which I've now managed scores even beating the R290X OC with, due to newer driver*s. "

"*Of course if I re-tested the X it would be faster again*, but I am short on time. Shall post up what an R290 Pro can do maxed out 5am Tuesday. "


----------



## SpewBoy

For any potential HIS buyers who may want to watercool: the HIS 290X box explicitly states that the warranty is voided if you disassemble the "fan cooler" or change the "card's configuration". That said, there are no "warranty void if removed" stickers on the PCB or screws, so it appears that one can easily take the cooler off and put it back on without raising suspicion (apart from the fact that the thermal paste has changed, but I have never heard of a manufacturer actually checking this).


----------



## DampMonkey

Quote:


> Originally Posted by *raghu78*
> 
> there is some more performance waiting for you. will get revealed on tuesday.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://forums.overclockers.co.uk/showthread.php?t=18551534&page=9
> 
> gibbo of ocuk
> 
> "The blurred out card is the new card coming Tuesday 5am from AMD. *A card which I've now managed scores even beating the R290X OC with, due to newer driver*s. "
> 
> "*Of course if I re-tested the X it would be faster again*, but I am short on time. Shall post up what an R290 Pro can do maxed out 5am Tuesday. "


This is all very exciting! I can tell from using the 290X for a week that the drivers are not taking full advantage of this card. Performance is all over the board, and it's very hard to compare to GK110 because some applications are scoring great while others are suffering; the drivers are throwing up a huge flag for me. Everybody is drawing early conclusions about the 290 series' performance, but I feel we are only seeing the tip of the iceberg. Can't wait to see what's in store.


----------



## tsm106

Quote:


> Originally Posted by *raghu78*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Yea, I'm stuck in the fcpu queue and its soo annoying. I got a sheet of ultra extremes too, for the mem and vrms. My patience is really getting tested with all this waiting.
> 
> 
> 
> there is some more performance waiting for you. will get revealed on tuesday.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://forums.overclockers.co.uk/showthread.php?t=18551534&page=9
> 
> gibbo of ocuk
> 
> "The blurred out card is the new card coming Tuesday 5am from AMD. *A card which I've now managed scores even beating the R290X OC with, due to newer driver*s. "
> 
> "*Of course if I re-tested the X it would be faster again*, but I am short on time. Shall post up what an R290 Pro can do maxed out 5am Tuesday. "

That driver is going to make some ppl cry, mods included lol.


----------



## spitty13

Can't seem to get my hands on a 290X. Thinking of just settling for a 290. It's not like they are voltage locked or crippled too much compared to the 290X, right?


----------



## raghu78

Quote:


> Originally Posted by *DampMonkey*
> 
> This is all very exciting! I can tell from using the 290x for a week that drivers are not taking full advantage of this card. Performance is all over the board, and it's very hard to compare to the GK110 because some applications are scoring great while others are suffering, and the drivers are throwing up a huge flag for me. Everybody is drawing early conclusions as to the 290 series' performance, but I feel we are only seeing the tip of the iceberg. Can't wait to see what's in store.


I am of the same opinion. The R9 290 series has only just gotten started; AMD is going to surprise us more with this beast.








Quote:


> Originally Posted by *tsm106*
> 
> That driver is going to make some ppl cry, mods included lol.


well said







And this is without Mantle. December is going to be a tough month as AMD piles the hurt on Nvidia in BF4 with Mantle.


----------



## the9quad

Quote:


> Originally Posted by *Arizonian*
> 
> Welcome aboard OCN with your first post.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes that's perfect - I'm assuming Sapphire? If it's different let me know I'll update it. Congrats on your 290X, your added to the club roster.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Only thing I can think of that's causing your black screen is your PSU possibly since your at stock. Doubt it's temps because it can handle it and your not pushing higher temps at stock either in quite mode or uber mode that it can't take. Sounds like it might be going or being pushed to the edge possibly.....
> 
> Love to see your system specs if you post them. *"How to put your Rig in your Sig"* It will help others know what your dealing with and possibly help your easier.


Updated my rig. It's a brand new PC Power and Cooling platinum-rated 1200 watt power supply... but you may be right, as it does act kind of funny: I have to mess around with the cord for it to power up, which is a problem. Oh well, I will RMA it and see if I can get another one. Thanks.


----------



## beejay

Quote:


> Originally Posted by *Deedot78*
> 
> where are the comparison for stock vs water cooler overclock. Tryna decide on a water block to add and seeing what kinda performance is out there.


Well, from stock (1030MHz core, slight factory overclock from PowerColor) and the reference cooler, I was throttling to 700ish in a 15-minute FurMark burn-in at 95°C. On an Accelero Hybrid, at 1085MHz core, 1300MHz mem, I'm stable at 70°C on 100% fan speed (not loud, definitely quieter than the reference cooler at 100%). Haven't pushed clocks as I have not flashed a custom BIOS yet.

Come to think of it, are we Accelero Hybrid users classified as water or 3rd party?


----------



## RAFFY

Quote:


> Originally Posted by *SpewBoy*
> 
> At least at the moment, there's no stock. It's also $50 postage to Australia, which is a bit steep but I guess if you want the block, you want the block.
> 
> EDIT: I got beaten... lol. Also didn't notice it was 3 days DHL Express... More tempting.
> 
> EDIT 2: Wooh my 290Xs just arrived!
> 
> 
> 
> My bathroom has decent lighting...


Hope their not crappy!!! AHHHHH see what I did!


----------



## kot0005

Not bad for my first etching attempt lolz.


----------



## kot0005

Boom!


----------



## esqueue

Quote:


> Originally Posted by *the9quad*
> 
> Is this what you need, I am new here?
> 
> http://www.techpowerup.com/gpuz/w6zz8/
> 
> Also my asic is 72.2, most i've messed with so far is 1120/1425 on air, and it resulted in 10692 in firestrike with 12169 for graphics. http://www.3dmark.com/fs/1090085
> 
> Might see if it will go higher, but not until I figure out why it randomly will black screen at stock clocks. Anyone else having this issue? It seems like I can play a game all day just fine then maybe the next day after 3 minutes of gaming it will black screen, I will reboot it will run fine, than maybe randomly black screen again later. It really is random and frustrating. I flahed the bios to asus today and am trying a custom fan profile, thinking it might be temp related, hope it works.


Just so you know, I had MAJOR issues with my screens completely going black for 30+ seconds at times. It would happen if I had GPU-Z running and made any display changes, such as opening a game, switching from the game screen to the desktop, changing resolution, etc. I would have to reset and make sure that I didn't open GPU-Z. I use the AMD OverDrive settings area to see temps, clocks and fan speeds for now.

I hope that this is your issue.


----------



## Arm3nian

New drivers, more oc potential, and mantle all coming soon. Looking good for us


----------



## jomama22

Soon....


Water testing to be commenced!!!


----------



## PillarOfAutumn

Quote:


> Originally Posted by *jomama22*
> 
> Soon....
> 
> 
> Water testing to be commenced!!!


Are you the reason why there's a shortage of 290x and the EK water blocks?


----------



## Mr357

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Are you the reason why there's a shortage of 290x and the EK water blocks?


Definitely. He bought 6 of them, and at least 3 blocks.


----------



## Sazz

Quote:


> Originally Posted by *MasterT*
> 
> Damn. So I have to get a converter then.


The R9 290 and 290X don't need an active display adapter, but if your monitors don't have HDMI or DisplayPort input besides the DVI that most monitors have, then you will only need a basic cable adapter, either HDMI to DVI or DisplayPort to DVI. I personally bought an HDMI-to-DVI cable by Rosewill from Newegg.


----------



## skupples

The new drivers will be interesting, to say the least... Without trying to sound too pessimistic, it will likely only be a few percentage points different from what we are currently seeing. It will be more like "ok, this is how it's supposed to run" instead of "uhhh, why am I getting 30 FPS in a game that should only require a third of this GPU to get 60+ FPS".

I wonder if they will include the Unigine fix people are looking for.


----------



## Arizonian

Quote:


> Originally Posted by *skupples*
> 
> The new drivers will be interesting, to say the least... Without trying to sound too pessimistic, it will likely only be a few percentage points different from what we are currently seeing. It will be more like "ok, this is how it's supposed to run" instead of "uhhh, why am I getting 30 FPS in a game that should only require a third of this GPU to get 60+ FPS".
> 
> I wonder if they will include the Unigine fix people are looking for.


Well, since you're not a 290X owner, I realize your expectations aren't as high as ours; hope you're wrong.


----------



## DampMonkey

Quote:


> Originally Posted by *skupples*
> 
> The new drivers will be interesting, to say the least... Without trying to sound too pessimistic... *It will likely only be a few percentage points different from what we are currently seeing.* It will be more like "ok, this is how it's supposed to run" instead of "uhhh, why am I getting 30 FPS in a game that should only require a third of this GPU to get 60+ FPS".
> 
> I wonder if they will include the Unigine fix people are looking for.


That's all I'll need to top the charts.


----------



## Arm3nian

Quote:


> Originally Posted by *Arizonian*
> 
> Well, since you're not a 290X owner, I realize your expectations aren't as high as ours; hope you're wrong.


Quote:


> Originally Posted by *DampMonkey*
> 
> That's all I'll need to top the charts.


Skupples isn't an Nvidia troll; he goes for whatever performs best. Be nice.









Anyway, I think these drivers will be quite good. Drivers are still in their infancy; it can only get better. The 780/Titan have had 9 months of driver work.


----------



## Arizonian

Quote:


> Originally Posted by *jomama22*
> 
> Soon....
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Water testing to be commenced!!!


I have to apologize jomama22 because I missed your submission post.









Slot you in where you should have been at 41st member with 3 MSI 290X and now updated under water. Congrats.









Let us know how that tri-crossfire scaling works out.


----------



## s0up2up

Quote:


> Originally Posted by *Arizonian*
> 
> Let us know how that tri-crossfire scaling works out.


Very much this. I would be interested too!


----------



## Forceman

Quote:


> Originally Posted by *tsm106*
> 
> Yea, I'm stuck in the fcpu queue and its soo annoying. I got a sheet of ultra extremes too, for the mem and vrms. My patience is really getting tested with all this waiting.


Did you check Performance PCs? They had some in stock on Friday.


----------



## th3illusiveman

Nice benchmarks dudes, I'll be buying one of these cards next year when the dust settles but it's been a very interesting 2 weeks so far and it should only get more chaotic this week









tsm, what's taking so long dude? Those benchmark charts are waiting for ya.


----------



## jomama22

Quote:


> Originally Posted by *Arizonian*
> 
> I have to apologize jomama22 because I missed your submission post.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Slot you in where you should have been at 41st member with 3 MSI 290X and now updated under water. Congrats.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Let us know how that tri-crossfire scaling works out.


It's 3 MSI, 2x XFX, 1x Sapphire, for a total of 6 lol.


----------



## Arizonian

Quote:


> Originally Posted by *jomama22*
> 
> It's 3 MSI, 2x XFX, 1x Sapphire, for a total of 6 lol.


Oh wow - how did I miss that. Just looked at that pic again.









Multiple rigs all getting 290X's I'm gathering. Updated.


----------



## Adglu

Did anyone test this at PCIe 3.0 x4? I wonder if we really have a bandwidth bottleneck scenario here.


----------



## Taint3dBulge

My card needs to hurry up and get here. I did 3-day shipping on Thursday, but it won't show up till Tuesday because UPS doesn't ship anything on Sunday, I guess... blah. I want this card, some new drivers, and some benching to see what this thing does at 1200-1300 core (once I get different cooling), and to compare it to the leaked benches of the 780 Ti. Can't believe the new thread where they compare a heavily OC'd 780 Ti to a stock 290X. All the Nvidia fanboys are saying "oh man, it blows the 290X out of the water". Well, of course it will; they have 9 months' worth of driver updates for the GK110, whereas the 290X has 2 weeks of driver updates. We need some more benches, guys.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Arizonian*
> 
> Oh wow - how did I miss that. Just looked at that pic again.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Multiple rigs all getting 290X's I'm gathering. Updated.


He's using the other 3 to prop up his desk XD

Quote:


> Originally Posted by *Taint3dBulge*
> 
> My card needs to hurry up and get here. I did 3-day shipping on Thursday, but it won't show up till Tuesday because UPS doesn't ship anything on Sunday, I guess... blah. I want this card, some new drivers, and some benching to see what this thing does at 1200-1300 core (once I get different cooling), and to compare it to the leaked benches of the 780 Ti. Can't believe the new thread where they compare a heavily OC'd 780 Ti to a stock 290X. All the Nvidia fanboys are saying "oh man, it blows the 290X out of the water". Well, of course it will; they have 9 months' worth of driver updates for the GK110, whereas the 290X has 2 weeks of driver updates. We need some more benches, guys.


I liked how Nvidia's recent chart showed the 780 Ti with a performance boost of 33% compared to a 290X. Funny thing is that they were testing the 290X in its quiet mode.


----------



## Paul17041993

Quote:


> Originally Posted by *SpewBoy*
> 
> At least at the moment, there's no stock. It's also $50 postage to Australia, which is a bit steep but I guess if you want the block, you want the block.
> 
> EDIT: I got beaten... lol. Also didn't notice it was 3 days DHL Express... More tempting.
> 
> EDIT 2: Wooh my 290Xs just arrived!
> 
> 
> 
> My bathroom has decent lighting...











Quote:


> Originally Posted by *MasterT*
> 
> Damn. So I have to get a converter then.


Pretty sure you can use the HDMI alongside both the DVIs, like the 280X...
Edit: oh, though the screens don't support 1440p via HDMI/SL-DVI, do they...?


----------



## PillarOfAutumn

Is it possible to run bf4 with this card at 1440p, everything ultra, and at 200% resolution?


----------



## rubicsphere

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Is it possible to run bf4 with this card at 1440p, everything ultra, and at 200% resolution?


Sure, it will run, but probably at <10 FPS.


----------



## Sazz

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Is it possible to run bf4 with this card at 1440p, everything ultra, and at 200% resolution?


I ran BF4 in triple-monitor Eyefinity (6020x1080) with everything on ultra EXCEPT anti-aliasing. I turned anti-aliasing off, and it doesn't really show any big difference (if there is any at all) between it at max and it being off. I am getting 45-60 FPS on average, and the lowest dip I've seen is 32 FPS during an extreme firefight in a 64-player lobby. (Although I am using an FX-8350 @ 4.8GHz daily clock, so if you've got an Intel rig, i5-3570K or above, you will probably get a higher minimum FPS.)

Quote:


> Originally Posted by *rubicsphere*
> 
> Sure, it will run, but probably at <10 FPS.


Do you own the card? If not, I don't think you're entitled to answer the question.


----------



## beejay

Yep, it's quite smooth at 1440p, with or without anti-aliasing (4x). See my rig for my configuration.

Also, I've only tested BF4 on stock clocks and cooler. Will play it later and see what the Accelero Hybrid and new BIOS can bring to the table.

Edit:
"Quite smooth" means a perceived average of 45-65 fps, all maxed.
Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Is it possible to run bf4 with this card at 1440p, everything ultra, and at 200% resolution?


----------



## Sgt Bilko

How are the temps with the Accelero Hybrid, Beejay?


----------



## beejay

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Hows the Temps with the Accelero Hybrid Beejay?


Burned for 20 minutes, 1085mhz core, 1300 mem, 100% Fan. 69-70c.

Hows the Accelero Xtreme?


----------



## Sgt Bilko

Quote:


> Originally Posted by *beejay*
> 
> Burned for 20 minutes, 1085mhz core, 1300 mem, 100% Fan. 69-70c.
> 
> Hows the Accelero Xtreme?


So far it's maxing out at 70°C in long game sessions, and in FurMark I'm hitting 70°C in less than a minute.

But gaming is all I'm really concerned with, so with a max of 70°C I'm happy.


----------



## kcuestag

There's no way you can run 1440p + 200% resolution on a single 290X; it will run well below 20-30 FPS for sure.


----------



## beejay

Quote:


> Originally Posted by *Sgt Bilko*
> 
> So far it's maxing out at 70°C in long game sessions, and in FurMark I'm hitting 70°C in less than a minute.
> 
> But gaming is all I'm really concerned with, so with a max of 70°C I'm happy.


Do you experience throttling? Are your fans going over 25%? I thought I would give OverDrive a chance, but I had this problem last night where OverDrive did not want my fan speed to go above 24%, even though it was set to max. I burned for 10 minutes and was at 85°C, with the core clock going down to ~800MHz.

So I updated my AB, created a custom profile, and it's all pretty.

After the fan profile adjustments, the burn test never dropped below 1085MHz in that 20 minutes.
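For anyone rebuilding their profile from scratch: an Afterburner-style fan curve like the one described above is just linear interpolation between (temperature, fan %) points. A minimal sketch, with hypothetical curve points (not beejay's actual profile):

```python
# Hypothetical (temp °C, fan %) points, sorted by temperature.
CURVE = [(40, 20), (60, 40), (75, 70), (85, 100)]

def fan_percent(temp_c):
    """Interpolate fan speed linearly between curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # Linear interpolation within the segment [t0, t1].
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # clamp above the last point

print(fan_percent(85))  # -> 100.0
```

The point of a steeper curve is exactly what the post describes: keep the fan ahead of the temperature so the core never hits its throttle point.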


----------



## Paul17041993

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Is it possible to run bf4 with this card at 1440p, everything ultra, and at 200% resolution?


Quote:


> Originally Posted by *Sazz*
> 
> I ran BF4 at triple monitor eyefinity (6020x1080) everything ultra EXCEPT anti-aliasin, I turned off anti-aliasing and it doesn't really show any big difference (If there is any at all) between it at max and it being off. and I am getting 45-60fps on average and lowest dip I've seen is 32fps during extreme firefight on a 64player lobby. (Altho I am using FX-8350 @ 4.8Ghz daily clock based system, so if you got a intel rig i5-3570k and above you will probably get higher minimum FPS)


Double res minus AA for high-ish/playable fps? That's pretty reassuring for my needs, if not impressive at the least for a single GPU...

Mind you, I only set AA and AF to low-ish values unless it's a light game; they're not exactly needed to be very high...


----------



## beejay

Quote:


> Originally Posted by *kcuestag*
> 
> There's no way you can run 1440p + 200% resolution on a single 290X, it will run well below 20-30fps for sure.


Ah, never saw the 200% resolution part. That's like supersampling, am I right? Yep, with that enabled, you would, like, halve the fps.
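For reference, BF4's resolution scale multiplies each axis, so the pixel-count math is easy to sketch. A rough illustration only: real fps rarely scales perfectly with pixel count, so the actual hit is usually smaller than the ratio suggests:

```python
# Resolution scale multiplies each axis, so 200% renders 4x the pixels.
def pixels(width, height, scale_pct):
    s = scale_pct / 100.0
    return int(width * s) * int(height * s)

native = pixels(2560, 1440, 100)  # 3,686,400 px
scaled = pixels(2560, 1440, 200)  # 14,745,600 px
print(scaled / native)  # 4.0 -- a quarter of the fps if GPU-bound and perfectly pixel-limited
```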


----------



## psyside

Quote:


> Originally Posted by *Sgt Bilko*
> 
> So far it's maxing out at 70°C in long game sessions, and in FurMark I'm hitting 70°C in less than a minute.
> 
> But gaming is all I'm really concerned with, so with a max of 70°C I'm happy.


Sorry, not following this thread; did you just say that replacing the TIM makes the card run like 25°C cooler?


----------



## Sgt Bilko

Quote:


> Originally Posted by *beejay*
> 
> Do you experience throttling? Are your fans going over 25%? I thought I would give OverDrive a chance, but I had this problem last night where OverDrive did not want my fan speed to go above 24%, even though it was set to max. I burned for 10 minutes and was at 85°C, with the core clock going down to ~800MHz.
> 
> So I updated my AB, created a custom profile, and it's all pretty.
> 
> After the fan profile adjustments, the burn test never dropped below 1085MHz in that 20 minutes.


Yeah, running at 100% fan all the time due to this thing being so quiet; my case fans are louder, if I'm honest.


----------



## beejay

Quote:


> Originally Posted by *psyside*
> 
> Sorry, not following this thread; did you just say that replacing the TIM makes the card run like 25°C cooler?


He's using an aftermarket cooler, the Accelero Xtreme III; it should cool it well.

Those using water have cooler temps when gaming.


----------



## Paul17041993

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah, running at 100% fan all the time due to this thing being so quiet; my case fans are louder, if I'm honest.


Still reckon you should turbocharge that cooler...


----------



## Sgt Bilko

Quote:


> Originally Posted by *Paul17041993*
> 
> Still reckon you should turbocharge that cooler...


Oh I will; I don't like white anyway










I want a better CPU first though, then I will get some high-RPM fans. I also have to work out what is going on with one of the VRM temps; it's hitting some scary numbers sometimes.....


----------



## psyside

Quote:


> Originally Posted by *beejay*
> 
> He's using an aftermarket cooler, the Accelero Xtreme III; it should cool it well.
> 
> Those using water have cooler temps when gaming.


So with that cooler, -25°C? Amazing result!


----------



## Sgt Bilko

Quote:


> Originally Posted by *psyside*
> 
> So with that cooler -25c? amazing result!


Yeah, idle is 36°C atm and in game it hits a max of 70°C; mostly it hangs around 60°C or so.

VRMs hit around 50°C and 60°C. They will be even cooler once I get some high-speed fans on it, though.


----------



## Sgt Bilko

So close to 5k......

http://www.3dmark.com/fs/1091552

Core: 1111Mhz

Memory: 1314Mhz


----------



## kcuestag

Quote:


> Originally Posted by *beejay*
> 
> He's using an aftermarket cooler, Accelero Extreme III, it should cool it well.
> 
> Those using water has cooler temps when gaming.


Yeah, mine is under water, and even with fans at lowest speed (800-900rpm) the card has topped so far at 52ºC with an OC of 1125/1450 and +50% Power.

If I push the fans to 1000-1200RPM I think I could keep it well below 48ºC.


----------



## $ilent

Quote:


> Originally Posted by *kcuestag*
> 
> Yeah, mine is under water, and even with fans at lowest speed (800-900rpm) the card has topped so far at 52ºC with an OC of 1125/1450 and +50% Power.
> 
> If I push the fans to 1000-1200RPM I think I could keep it well below 48ºC.


stock bios?


----------



## the9quad

It seems a few people on Overclockers UK and HardForum are having the same random black screens as me. I heard a potential fix is going to a fixed vcore and bumping it up, instead of using offset. What was stable before isn't stable with the 290X, I guess. I'm going to try it today and see if that helps, and also check my power supply first to see if this brand new power supply is messed up. This is my first venture into doing anything other than adjusting the multiplier, as I'm really new at this. I guess I'll just set it at 1.35 or something and hope for the best?


----------



## Connolly

Hello,

I'm thinking about upgrading to the R9 290x but am unsure how badly my current rig would bottle neck.

I have:

- Intel Core i7 2600k 3.4GHz Socket 1155 8MB Cache
- Corsair 8GB (2x4GB) DDR3 1600MHz
- Asus P8Z68-V PRO Z68 Socket 1155 8 Channel HD Audio ATX Motherboard
- Coolermaster Centurion 5 II SPECIAL EDITION Case with Red Interior - With Coolermaster 650W GX PSU

The components are a couple of years old and I'm wondering if I'd need to replace most of them to realise anything like the full potential of the GPU. I'm assuming I'd need a new PSU, but that's relatively inexpensive, it's more the CPU and motherboard I'm concerned about.

Obviously I'm not a bleeding edge tech guy, so any advice would be greatly appreciated.

Cheers,

Matt.


----------



## HeadlessKnight

Quote:


> Originally Posted by *Connolly*
> 
> Hello,


Hello Connolly.
While the 2600K is a bit old, it still has enough power to easily drive a 290X; the newer mainstream processors are not much faster than it anyway.
But I think if you plan to overclock the GPU and CPU, you might push the PSU to its limits. (I am probably wrong here.)

Good Luck


----------



## Sgt Bilko

Quote:


> Originally Posted by *Connolly*
> 
> Hello,
> 
> I'm thinking about upgrading to the R9 290x but am unsure how badly my current rig would bottle neck.
> 
> I have:
> 
> - Intel Core i7 2600k 3.4GHz Socket 1155 8MB Cache
> - Corsair 8GB (2x4GB) DDR3 1600MHz
> - Asus P8Z68-V PRO Z68 Socket 1155 8 Channel HD Audio ATX Motherboard
> - Coolermaster Centurion 5 II SPECIAL EDITION Case with Red Interior - With Coolermaster 650W GX PSU
> 
> The components are a couple of years old and I'm wondering if I'd need to replace most of them to realise anything like the full potential of the GPU. I'm assuming I'd need a new PSU, but that's relatively inexpensive, it's more the CPU and motherboard I'm concerned about.
> 
> Obviously I'm not a bleeding edge tech guy, so any advice would be greatly appreciated.
> 
> Cheers,
> 
> Matt.


I think you will be fine with the 2600K. You will bottleneck, but I don't think it will be that bad; I'm running an FX-8150 atm and gaming performance is fine. It bottlenecks a bit, but not enough to hurt it majorly.

IMO, get the 290 instead of the 290X, and later on you will still have a kick-ass graphics card when you do upgrade the rest of your rig... and yeah, new PSU for sure.

Hopefully someone else comments, as I'm no expert on Intel processors.


----------



## TomiKazi

Quote:


> Originally Posted by *Connolly*
> 
> Hello,
> 
> I'm thinking about upgrading to the R9 290x but am unsure how badly my current rig would bottle neck.
> 
> I have:
> 
> - Intel Core i7 2600k 3.4GHz Socket 1155 8MB Cache
> - Corsair 8GB (2x4GB) DDR3 1600MHz
> - Asus P8Z68-V PRO Z68 Socket 1155 8 Channel HD Audio ATX Motherboard
> - Coolermaster Centurion 5 II SPECIAL EDITION Case with Red Interior - With Coolermaster 650W GX PSU
> 
> The components are a couple of years old and I'm wondering if I'd need to replace most of them to realise anything like the full potential of the GPU. I'm assuming I'd need a new PSU, but that's relatively inexpensive, it's more the CPU and motherboard I'm concerned about.
> 
> Obviously I'm not a bleeding edge tech guy, so any advice would be greatly appreciated.
> 
> Cheers,
> 
> Matt.


Nothing wrong with your CPU; you just need to overclock it. After all, that's what the 'K' in 2600K stands for








Can't tell much about the PSU. As far as I know, most Cooler Master PSUs aren't considered to be that great...

Not sure if it has been asked here before, but how is the coil whine on the 290X?


----------



## the9quad

Some have coil whine, some don't. I got lucky and don't have it; I guess it's luck of the draw.


----------



## beejay

Quote:


> Originally Posted by *Connolly*
> 
> Hello,
> 
> I'm thinking about upgrading to the R9 290x but am unsure how badly my current rig would bottle neck.
> 
> I have:
> 
> - Intel Core i7 2600k 3.4GHz Socket 1155 8MB Cache
> - Corsair 8GB (2x4GB) DDR3 1600MHz
> - Asus P8Z68-V PRO Z68 Socket 1155 8 Channel HD Audio ATX Motherboard
> - Coolermaster Centurion 5 II SPECIAL EDITION Case with Red Interior - With Coolermaster 650W GX PSU
> 
> The components are a couple of years old and I'm wondering if I'd need to replace most of them to realise anything like the full potential of the GPU. I'm assuming I'd need a new PSU, but that's relatively inexpensive, it's more the CPU and motherboard I'm concerned about.
> 
> Obviously I'm not a bleeding edge tech guy, so any advice would be greatly appreciated.
> 
> Cheers,
> 
> Matt.


Your system is quite fast. Ivy Bridge (3770K) has about a 15-20% improvement over Sandy Bridge, but Sandy can overclock way higher. In my opinion, your power supply may not cut it: although 650 watts might be enough, Cooler Master is not making the best PSUs currently. Correct me if I'm wrong.

Also, as a boost to your confidence, I'm also running a 2600K and I haven't felt the urge to upgrade yet; check my rig. I overclock, though.


----------



## SpewBoy

I was running Heaven not long ago with my 290Xs and the coil whine was pretty extreme. Far noisier than the fans for the first 45 seconds or so and then the temp built up and the fans started taking off as they do. Coil whine may have gone away or may have just been drowned out by the fans but it can get reasonably loud. For some reason I can kinda tune it out (don't even mind the noise tbh), but I can't tune out fan noise.

Speaking of Heaven, I got the dreaded black screen requiring a reset after the first run-through. Haven't tried again. Got a lot on my mind right now and no time for serious benches (plus the cards are in my older 2600K rig while I wait for their water blocks before putting them into my 4770K rig, so they aren't going to perform at their best due to temps and PCIe2.0).

If I black screen without overclocking at all, does that signify a hardware issue or a driver issue? I'm hoping it's drivers. I'm running 13.11b8.


----------



## TomiKazi

Quote:


> Originally Posted by *SpewBoy*
> 
> I was running Heaven not long ago with my 290Xs and the coil whine was pretty extreme. Far noisier than the fans for the first 45 seconds or so and then the temp built up and the fans started taking off as they do. Coil whine may have gone away or may have just been drowned out by the fans but it can get reasonably loud. For some reason I can kinda tune it out (don't even mind the noise tbh), but I can't tune out fan noise.
> 
> Speaking of Heaven, I got the dreaded black screen requiring a reset after the first run-through. Haven't tried again. Got a lot on my mind right now and no time for serious benches (plus the cards are in my older 2600K rig while I wait for their water blocks before putting them into my 4770K rig, so they aren't going to perform at their best due to temps and PCIe2.0).
> 
> If I black screen without overclocking at all, does that signify a hardware issue or a driver issue? I'm hoping it's drivers. I'm running 13.11b8.


Quote:


> Originally Posted by *the9quad*
> 
> Some have coil whine, some don't, I got lucky and don't have it. I guess it's luck of the draw


I guess that's part of the game then








Sucks though, certainly when planning on a waterblock.

I do believe overclocking that 2600K will be necessary, at least for multiplayer intensive games like BF3/BF4. But on the bright side, overclocking a 2600K is about as easy as it gets.


----------



## HoZy

www.ekwb.com/shop has minimal stock of the 290x blocks & backplates!

Get on it ASAP


----------



## scramz

Quote:


> Originally Posted by *utnorris*
> 
> Your score seems to be a little low. I got this off of 1271/1475Mhz with my CPU at 4.7Ghz:
> http://www.3dmark.com/3dm11/7421025
> 
> P16038
> 
> Graphics Score
> 18381
> 
> Physics Score
> 12162
> 
> Combined Score
> 10853
> Mine is at 78% and I have seen a few more around that, but none above 80%.


Your RAM is better than mine; that is why you are achieving a higher physics score.


----------



## moa.

Hi,
I am currently running a 6990 + 6970 and thinking of upgrading. I can run BF4 smoothly at ultra at 1080p; however, at high settings at 5760x1080 it is quite choppy, despite the fact that the in-game fps meter says 40 FPS. I am wondering whether upgrading to a single 290X will give me a noticeable bump in smoothness.


----------



## kcuestag

Quote:


> Originally Posted by *$ilent*
> 
> stock bios?


Yes, stock BIOS.


----------



## scramz

So I have reached over a 19K graphics score in 3DMark 11 on the ASUS default BIOS out of the box, and vdroop is causing my max volts to peak at 1.35V.

Should I now flash the BIOS to PT3 to reach 1.4V+, or is it not worth it?


----------



## SpewBoy

Quote:


> Originally Posted by *TomiKazi*
> 
> I guess that's part of the game then
> 
> 
> 
> 
> 
> 
> 
> 
> Sucks though, certainly when planning on a waterblock.
> 
> I do believe overclocking that 2600K will be necessary, at least for multiplayer intensive games like BF3/BF4. But on the bright side, overclocking a 2600K is about as easy as it gets.


I won't be grabbing BF4 until after I have my new rig up and running. This was just to test the cards to make sure they worked and weren't complete dogs.

The new rig will be using a Corsair AX860, so in the coming weeks once everything is put together I'll be able to see if 860W is enough for two 290Xs overclocked under water. I know I'll be cutting it close, but I've got to try it. I read somewhere here that a reviewer measured a 780W load with two 290Xs, so factoring in overclocks it is likely going to exceed 860W, but the AX860 apparently has a bit of a buffer and should be able to handle more than its rating, so we will just have to wait and see.
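As a back-of-envelope check of that budget: the 780W figure is the load reported in the thread, but the per-card overclock overhead below is purely an assumption for illustration, not a measurement:

```python
psu_rating = 860           # Corsair AX860 continuous rating (W)
stock_load = 780           # thread's reported system load, two 290Xs at stock (W)
oc_overhead_per_card = 60  # assumed extra draw per overclocked card (W)

est_load = stock_load + 2 * oc_overhead_per_card
print(est_load, est_load <= psu_rating)  # 900 False -- over the label rating
```

Which is exactly the gamble described: whether the unit's headroom above its rating covers the overshoot.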


----------



## Connolly

Thanks for the replies guys, certainly helped make the decision. I've just bookmarked the overclock process for my CPU and will do it when I get home.
Just seen this: http://www.scan.co.uk/products/4gb-vtx3d-radeon-r9-290x-x-edition-28nm-5000mhz-gddr5-gpu-1030mhz-2816-streams-dport-dvi-hdmi-plus-b?utm_source=google+shopping&utm_medium=cpc&gclid=CJi2it3OxboCFaZf2wodcUYAYQ
Seems very cheap; anyone have any experience with VTX3D products?


----------



## scramz

Quote:


> Originally Posted by *kcuestag*
> 
> Yeah, mine is under water, and even with fans at lowest speed (800-900rpm) the card has topped so far at 52ºC with an OC of 1125/1450 and +50% Power.
> 
> If I push the fans to 1000-1200RPM I think I could keep it well below 48ºC.


I reached 1271/1500 under water; max temp was 49 degrees with a few SP120 Quiet Editions. This card was made for watercooling lol


----------



## Moustache

Quote:


> Originally Posted by *moa.*
> 
> Hi,
> I am currently running a 6990 + 6970 and thinking of upgrading. I can run BF4 smoothly at ultra at 1080p; however, at high settings at 5760x1080 it is quite choppy, despite the fact that the in-game fps meter says 40 FPS. I am wondering whether upgrading to a single 290X will give me a noticeable bump in smoothness.


Must be due to the vram and bandwidth limit. A single 290x should be smoother.


----------



## HoZy

Quote:


> Originally Posted by *scramz*
> 
> I reached 1271/1500 under water, max temp was 49 degrees with a few SP120 Quiet Editions. This card was made for watercooling lol


That's all well and good, But what Radiator setup? dual 120? Dual 140? 2x Dual 120 rads? Explain captain.

Cheers
Mat


----------



## beejay

Gonna go home and flash my R9 290x to a custom bios ^__^ oh and change me to aftermarket (or water, I don't know how to classify a hybrid closed loop)


----------



## scramz

Quote:


> Originally Posted by *HoZy*
> 
> That's all well and good, But what Radiator setup? dual 120? Dual 140? 2x Dual 120 rads? Explain captain.
> 
> Cheers
> Mat


I have EK full water block and plate, 1x EX360, 1x EX240, XSPC D5 pump


----------



## Sgt Bilko

Quote:


> Originally Posted by *beejay*
> 
> Gonna go home and flash my R9 290x to a custom bios ^__^ oh and change me to aftermarket (or water, I don't know how to classify a hybrid closed loop)


I would put it under Hybrid Water/Air


----------



## HeadlessKnight

Quote:


> Originally Posted by *beejay*
> 
> Ivy bridge 3770k has about *15-20%* improvement over sandy bridge, but sandy can overclock way higher


Not even close to that, you're exaggerating mate. More like 5-7% clock-for-clock, and max OC vs max OC they will perform much closer than that. The i7 3770K supports PCI-E 3.0 though, but still, for single cards you will hardly see a difference between 2.0 & 3.0


----------



## beejay

Quote:


> Originally Posted by *HeadlessKnight*
> 
> Not even close to that, you're exaggerating mate. More like 5-7% clock-for-clock, and max OC vs max OC they will perform much closer than that. The i7 3770K supports PCI-E 3.0 though, but still, for single cards you will hardly see a difference between 2.0 & 3.0


Yup, maybe I was. Bottom line, if you have a second gen i7, you're pretty much set for more than 2 years. Unless Broadwell gives a tremendous IPC boost, which I doubt, Sandys are good to go, except if new PCIe devices saturate current PCIe lanes, which a move to Ivy may solve.

And I doubt broadwell would oc much on 14nm, plus if they do go bga, well... I don't know.

All these are speculations.

Back to the 290x


----------



## $ilent

Installed a Gelid Icy Vision rev2 cooler on my 290x. How high should I let the VRM temps go? Is gpuz accurate for VRM temps?


----------



## Tobiman

~1300mhz on core. Mother of God. Run some benchmarks!!


----------



## sugarhell

90 max


----------



## Ha-Nocri

Quote:


> Originally Posted by *$ilent*
> 
> 
> 
> Installed a Gelid Icy Vision rev2 cooler on my 290x. How high should I let the VRM temps go? Is gpuz accurate for VRM temps?


45c GPU temp under load? So this thing is much better than Accelero coolers?


----------



## $ilent

sorry guys, the gpuz on the left is my gtx 670, the 2 on the right are my 290x. But all temps are at full load [email protected]

So 74C and 57C is ok for the VRMs? Do we know if gpuz is accurate for VRM temps? I tried hwinfo but it doesn't show vrm temps on the 290x.


----------



## the9quad

Just hit 10966 in Firestrike - 1250/5800 on the stock cooler. Scared to go any further. Good news is bumping my vcore up to 1.37 and the multiplier to 43 fixed my black screens (fingers crossed)








http://www.3dmark.com/3dm/1552701


----------



## $ilent

Just swapped my 290x into the second pcie slot and moved my gtx 670 to pcie slot 1. These temps can't be right for the 290x at full load, surely?


----------



## the9quad

Silent, something ain't right: your gpu load says 1%. Just checked mine and it reads 100% when I'm at stock clocks at full load


----------



## Arizonian

Quote:


> Originally Posted by *the9quad*
> 
> Silent, something ain't right: your gpu load says 1%. Just checked mine and it reads 100% when I'm at stock clocks at full load


That is good news.









What are you using for voltage control out of curiosity?


----------



## $ilent

Quote:


> Originally Posted by *the9quad*
> 
> Silent, something ain't right: your gpu load says 1%. Just checked mine and it reads 100% when I'm at stock clocks at full load


It's just gpuz that says 0% load; check out CCC alongside my results here:


----------



## PillarOfAutumn

If anyone's interested, I just got off the phone with FrozenCPU regarding the 290x waterblock:

They said there was some holiday over the weekend where EK or their factories are based, so there have been some delays in shipment. But he said that my order will be shipped by this Thursday/Friday once they get their shipment. My order includes the EK 290x Acetal/Nickel waterblock as well as the EK backplate. I put my order in at approximately 7:00 last Thursday.


----------



## tsm106

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> If anyone's interested, I just got off the phone with FrozenCPU regarding the 290x waterblock:
> 
> They said there was some holiday over the weekend where EK or their factories are based, so there have been some delays in shipment. But he said that my order will be shipped by this Thursday/Friday once they get their shipment. My order includes the EK 290x Acetal/Nickel waterblock as well as the EK backplate. I put my order in at approximately 7:00 last Thursday.


Friday, then Monday, now Thurs... what next?

I really hate having to use fcpu what with their business practices and their policies. The blocks are on preorder, and they charge your card for the full amount before they have any stock. This gets them your money upfront, forcing you to stay with them. If there's a problem they issue store credit for your money. Fcpu...


----------



## $ilent

So who said Hawaii runs hot again?


----------



## the9quad

Quote:


> Originally Posted by *Arizonian*
> 
> That is good news.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What are you using for voltage control out of curiosity?


Not sure what you mean?
Here is a pic of GPU Tweak and CPU-Z

I think on my CPU it's like 1.37 Vcore and 1.3 VCCSA


----------



## PillarOfAutumn

Quote:


> Originally Posted by *tsm106*
> 
> Friday, then Monday, now Thurs... what next?
> 
> I really hate having to use fcpu what with their business practices and their policies. The blocks are on preorder, and they charge your card for the full amount before they have any stock. This gets them your money upfront, forcing you to stay with them. If there's a problem they issue store credit for your money. Fcpu...


When did you place your order?


----------



## $ilent

Hey all you 290x owners, particularly those with water blocks, I've got a question. I've put this air cooler on my 290x and am just checking I've got all the necessary areas covered on the pcb.

What I've got squared in red I have covered with heatsinks/fans.



Is this right? I don't need to cover anything else?


----------



## Arizonian

Quote:


> Originally Posted by *the9quad*
> 
> Not sure what you mean?
> Here is a pic of GPU Tweak and CPU-Z
> 
> I think on my CPU it's like 1.37 Vcore and 1.3 VCCSA


I don't have GPU voltage control on GPU Tweak. Version 2.9.4.2


----------



## nemm

Just had notification that OverclockersUK have dispatched my Sapphire 290x, so I will finally join the club tomorrow. For those in the UK interested in buying, they have a little over 10 in stock.


----------



## the9quad

Quote:


> Originally Posted by *Arizonian*
> 
> I don't have GPU voltage control on GPU Tweak. Version 2.9.4.2


I'm running 2.4.9.2, got it from the Asus link in the bios flashing guide; I used global instead of the d/l service or p2p. Want me to share the one I have when I get up? I work nights, so I am off to bed, but I get up around 6pm central and can upload it somewhere if you wish.


----------



## Slomo4shO

Quote:


> Originally Posted by *Sazz*
> 
> I ran BF4 at triple monitor eyefinity (6020x1080) with everything ultra EXCEPT anti-aliasing; I turned anti-aliasing off and it doesn't really show any big difference (if there is any at all) between it at max and it being off. I am getting 45-60fps on average and the lowest dip I've seen is 32fps during an extreme firefight in a 64-player lobby. (Altho I am using an FX-8350 @ 4.8Ghz daily clock based system, so if you got an intel rig i5-3570k and above you will probably get higher minimum FPS)


I assume that the extra horizontal resolution is for bezel correction. Do you mind running a few more eyefinity benchmarks?
Quote:


> Originally Posted by *Tobiman*
> 
> ~1300mhz on core. Mother of God. Run some benchmarks!!


That is the GTX 670, the 290X is at 998MHz and is running at 69C on load.


----------



## Arizonian

Quote:


> Originally Posted by *the9quad*
> 
> I'm running 2.4.9.2, got it from the Asus link in the bios flashing guide; I used global instead of the d/l service or p2p. Want me to share the one I have when I get up? I work nights, so I am off to bed, but I get up around 6pm central and can upload it somewhere if you wish.


No it's cool - get some sleep. I know the link. Did you flash to ASUS BIOS? Would explain it as I'm on Sapphire BIOS still.

Thanks.


----------



## tsm106

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Friday, then Monday, now Thurs... what next?
> 
> I really hate having to use fcpu what with their business practices and their policies. The blocks are on preorder, and they charge your card for the full amount before they have any stock. This gets them your money upfront, forcing you to stay with them. If there's a problem they issue store credit for your money. Fcpu...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> When did you place your order?

The Sunday before, not this past weekend but last. What's most annoying is that I could have gotten them from PPCS last week but Fcpu said they'd be in last Friday. That never happened obviously. I also need some thermal pads that only fcpu carry.


----------



## devilhead

Today I canceled my EK waterblock order and placed a new one for the Aquacomputer waterblock


----------



## Connolly

Does anyone have any idea what the difference is between this GPU from different companies? From what I'm reading, you get reference and non-reference cards. The reference cards are the ones that are made by AMD and then have another company's logo stuck on them, while non-reference cards are ones that the third-party company has been allowed to make hardware modifications to, for better or worse. Surely as these cards are only being shipped unmodified at the moment, it's best to go with whoever the cheapest supplier is? From what I can see, that's VTX3D. I can see that lots of you are buying Sapphire etc, just can't see why.


----------



## $ilent

Does anyone know about my 290x PCB cooling question above?

Also my sapphire bf4 card has elpida.


----------



## rdr09

Quote:


> Originally Posted by *$ilent*
> 
> Does anyone know about my 290x PCB cooling question above?
> 
> Also my sapphire bf4 card has elpida.


you got it all covered. i watercooled a 7950 and a 7970. that pcb is similar to the tahitis.


----------



## tsm106

Quote:


> Originally Posted by *devilhead*
> 
> Today I canceled my EK waterblock order and placed a new one for the Aquacomputer waterblock


Why would...? Look at the puny vrm channel. You can see it under the palm tree badge.


----------



## Arizonian

Quote:


> Originally Posted by *Connolly*
> 
> Does anyone have any idea what the difference is between this GPU from different companies? From what I'm reading, you get reference and non-reference cards. The reference cards are the ones that are made by AMD and then have another company's logo stuck on them, while non-reference cards are ones that the third-party company has been allowed to make hardware modifications to, for better or worse. Surely as these cards are only being shipped unmodified at the moment, it's best to go with whoever the cheapest supplier is? From what I can see, that's VTX3D. I can see that lots of you are buying Sapphire etc, just can't see why.


Reference cards are like you said - specifications from AMD that the other manufacturers have to copy to a tee. All the same. The only difference is customer service and warranty support, which can vary from company to company.

From what I've learned about AMD, they have strong cards, as they overbuild their reference versions, and they're great with water cooling.

At first we thought Sapphire was using exclusively Hynix memory over Elpida. Really no difference but more of a preference as both can OC just as well as each other. Turns out Sapphire is using both anyway and it's not standard. Sapphire has had a good track record with RMA and customer service so a lot of people trust them.


----------



## $ilent

Quote:


> Originally Posted by *rdr09*
> 
> you got it all covered. i watercooled a 7950 and a 7970. that pcb is similar to the tahitis.


thanks man


----------



## Connolly

Quote:


> Originally Posted by *Arizonian*
> 
> Reference cards are like you said - specifications from AMD that the other manufacturers have to copy to a tee. All the same. The only difference is customer service and warranty support, which can vary from company to company.
> 
> From what I've learned about AMD, they have strong cards, as they overbuild their reference versions, and they're great with water cooling.
> 
> At first we thought Sapphire was using exclusively Hynix memory over Elpida. Really no difference but more of a preference as both can OC just as well as each other. Turns out Sapphire is using both anyway and it's not standard. Sapphire has had a good track record with RMA and customer service so a lot of people trust them.


I thought so, thanks for clearing it up.


----------



## devilhead

Quote:


> Originally Posted by *tsm106*
> 
> Why would...? Look at the puny vrm channel. You can see it under the palm tree badge.


I think Aquacomputer has always made quality products, we'll see how it performs later







Aesthetically it looks nice to me


----------



## rdr09

Quote:


> Originally Posted by *$ilent*
> 
> thanks man


you did cover the highlighted, too, right?



check this thread out . . .

http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x


----------



## tsm106

Quote:


> Originally Posted by *Connolly*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Arizonian*
> 
> Reference cards are like you said - specifications from AMD that the other manufacturers *have to copy to a tee*. All the same. The only difference is customer service and warranty support, which can vary from company to company.
> 
> From what I've learned about AMD, they have strong cards, as they overbuild their reference versions, and they're great with water cooling.
> 
> At first we thought Sapphire was using exclusively Hynix memory over Elpida. Really no difference but more of a preference as both can OC just as well as each other. Turns out Sapphire is using both anyway and it's not standard. Sapphire has had a good track record with RMA and customer service so a lot of people trust them.
> 
> 
> 
> I thought so, thanks for clearing it up.

They don't copy anything, actually. They buy a kit (that includes all the parts and a serialized pcb) and manufacture it if they have that capability; otherwise they buy the card whole. And in the case that they buy the card, it is produced by a chosen AIB.


----------



## $ilent

Quote:


> Originally Posted by *rdr09*
> 
> you did cover the highlighted, too, right?
> 
> 
> 
> check this thread out . . .
> 
> http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x


Yeah I did that, the 2 silver squares below that red section, all memory chips and then the big vertical line of VRMs on the right side.


----------



## tobitronics

Got myself an ASUS Limited edition R9 290X with the reference cooler, but I changed it for an EK-FC Full Cover Waterblock. Looks hell nice









Here is the GPU-Z link: http://www.techpowerup.com/gpuz/cxvsn/ , so you can count me in for your little club









Achieved an OC of 1025 core and 1400 on the memory.

However I read that lots of reviewers managed to easily push the memory past 6000MHz (so 1500 effective), yet my card just quits past 1400... Do I have a bad card or are those reviewers extremely lucky?


----------



## utnorris

There are three in that area. Looks like a triangle. At least EK has you cover it with thermal pads for their water block.


----------



## maarten12100

Quote:


> Originally Posted by *tobitronics*
> 
> Got myself an ASUS Limited edition R9 290X with the reference cooler, but I changed it for an EK-FC Full Cover Waterblock. Looks hell nice
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here is the GPU-Z link: http://www.techpowerup.com/gpuz/cxvsn/ , so you can count me in for your little club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Achieved an OC of 1025 core and 1400 on the memory.
> 
> However I read that lots of reviewers managed to easily push the memory past 6000MHz (so 1500 effective), yet my card just quits past 1400... Do I have a bad card or are those reviewers extremely lucky?


Check if your mem makes good contact and btw OC your core


----------



## jerrolds

Quote:


> Originally Posted by *tobitronics*
> 
> Got myself an ASUS Limited edition R9 290X with the reference cooler, but I changed it for an EK-FC Full Cover Waterblock. Looks hell nice
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here is the GPU-Z link: http://www.techpowerup.com/gpuz/cxvsn/ , so you can count me in for your little club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Achieved an OC of 1025 core and 1400 on the memory.
> 
> However I read that lots of reviewers managed to easily push the memory past 6000MHz (so 1500 effective), yet my card just quits past 1400... Do I have a bad card or are those reviewers extremely lucky?


1025 core under water? You should be able to hit 1100 on air pretty easily (mine is at 1120 at stock voltage/bios) - I would think under water you can hit 1200MHz+ most of the time


----------



## jrcbandit

I got a stable overclock at 1200 core, 1400 memory on water, max 48 °C. Voltage at 1.299 on Asus bios (1350 with vdroop). Sapphire card with Hynix memory and 73.9 ASIC. Need higher voltage on both gpu and memory. My memory didn't go very high, wonder if EK block does enough cooling.


----------



## Paul17041993

Quote:


> Originally Posted by *$ilent*
> 
> Hey all you 290x owners, particularly those with water blocks, I've got a question. I've put this air cooler on my 290x and am just checking I've got all the necessary areas covered on the pcb.
> 
> What I've got squared in red I have covered with heatsinks/fans.
> 
> 
> 
> Is this right? I don't need to cover anything else?


pretty sure the controllers don't need cooling so yea, that's all of it...


----------



## evensen007

Quote:


> Originally Posted by *tobitronics*
> 
> Got myself an ASUS Limited edition R9 290X with the reference cooler, but I changed it for an EK-FC Full Cover Waterblock. Looks hell nice
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here is the GPU-Z link: http://www.techpowerup.com/gpuz/cxvsn/ , so you can count me in for your little club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Achieved an OC of 1025 core and 1400 on the memory.
> 
> However I read that lots of reviewers managed to easily push the memory past 6000MHz (so 1500 effective), yet my card just quits past 1400... Do I have a bad card or are those reviewers extremely lucky?


Tob,

You have to increase the voltage to get higher O/C's. Right now, that means using Asus' tool to overclock it.


----------



## $ilent

Quote:


> Originally Posted by *Paul17041993*
> 
> pretty sure the controllers don't need cooling so yea, that's all of it...


Cool cheers!

I think I'm ready for a voltage unlock, max temps with this air cooler: 56C core, VRM1 66C, VRM2 48C. Just hope we get an MSI AB update soon, I don't want to mess about flashing the asus bios.


----------



## Ukkooh

FFS it is still impossible to get a hold of a 290x here in Finland and even harder to get a ek block for it as the postage from ek's store pretty much doubles the price. Haven't seen any 290x cards in stock after the launch day. (Not counting those with a price of over 700€)


----------



## tobitronics

Quote:


> Originally Posted by *evensen007*
> 
> Tob,
> 
> You have to increase the voltage to get higher O/C's. Right now, that means using Asus' tool to overclock it.


Do I also have to increase the voltage to overclock the memory?


----------



## $ilent

Quote:


> Originally Posted by *Ukkooh*
> 
> FFS it is still impossible to get a hold of a 290x here in Finland and even harder to get a ek block for it as the postage from ek's store pretty much doubles the price. Haven't seen any 290x cards in stock after the launch day. (Not counting those with a price of over 700€)


Why not order from overclockers.co.uk? they got it in stock

Theyve got 10+ in stock of the sapphire 290X's and they only cost 531 Euros - http://www.overclockers.co.uk/showproduct.php?prodid=GX-336-SP&groupid=701&catid=56&subcat=1752


----------



## Ukkooh

Quote:


> Originally Posted by *$ilent*
> 
> Why not order from overclockers.co.uk? they got it in stock
> 
> Theyve got 10+ in stock of the sapphire 290X's and they only cost 531 Euros - http://www.overclockers.co.uk/showproduct.php?prodid=GX-336-SP&groupid=701&catid=56&subcat=1752


The one with the preinstalled EK waterblock seems to be the only way to not break the warranty when watercooling, so I think I'll get one of those. Though that means I have to wait until December to get my loop funds from tax returns. I had no idea OCUK shipped to Finland, so +rep for you.


----------



## $ilent

ah no worries, sorry I didn't know you were waiting for that specific type of 290x.


----------



## Ukkooh

Actually I just wanted a reference 290x before I get the loop, so I can check it works properly before putting it under water. I've never seen a GPU with a preinstalled waterblock for sale in Finland.


----------



## $ilent

I'd buy the Sapphire one if I was you, the price seems decent and you could put a Gelid Icy Vision on it like I have for a month or two if you want better temps.


----------



## jerrolds

The Gelid Icy 2 install looks more involved than I'd like - I don't think the Accelero Xtreme 3 or Hybrid are as complicated.


----------



## $ilent

Quote:


> Originally Posted by *jerrolds*
> 
> The Gelid Icy 2 install looks more involved than I'd like - I don't think the Accelero Xtreme 3 or Hybrid are as complicated.


To be fair it's actually a really simple install, it just takes ~30mins or so. All you do is remove the original stock cooler, put small stickers on all the VRMs & memory chips, remove the sticker cover, apply the heatsinks, put the actual cooler unit on the gpu core, screw it down and you're done. It's knocked ~40C off my 290x load temps and the fans are inaudible now.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *$ilent*
> 
> Why not order from overclockers.co.uk? they got it in stock
> 
> Theyve got 10+ in stock of the sapphire 290X's and they only cost 531 Euros - http://www.overclockers.co.uk/showproduct.php?prodid=GX-336-SP&groupid=701&catid=56&subcat=1752


man...I'm sitting here refreshing the Newegg page here in the US every 30-60 minutes, and there's a UK site that has it...I'm tempted to buy from them and have it shipped here.

crap...should I just buy from overclockers co uk?


----------



## $ilent

It's $720 if you buy it from OCUK, that's like an extra $170 man. I'd wait it out...I know it's crap.


----------



## Porter_

That and the 290s are releasing tomorrow (I believe), so there should be some availability of those. Assuming the performance rumors are true, I'd just snag a 290.


----------



## Raephen

Quote:


> Originally Posted by *$ilent*
> 
> To be fair it's actually a really simple install, it just takes ~30mins or so. All you do is remove the original stock cooler, put small stickers on all the VRMs & memory chips, remove the sticker cover, apply the heatsinks, put the actual cooler unit on the gpu core, screw it down and you're done. It's knocked ~40C off my 290x load temps and the fans are inaudible now.


This is good to hear, but I'm curious: at what speed do the fans run on your Icy Vision?

The comparable Accelero Twin Turbo II I have on my HD7870 has gpu-pwm, but the Gelid cooler just has 3 pins to work with, as far as I know.

I know the two cards - my HD 7870 and an r9 290 series card - aren't really comparable in terms of heat output (the r9 series is at least 75W higher in TDP spec, and the 290x seems to use even more than that, whereas my grandma-overclocked 7870 (1100/1300/+5%) seems to draw slightly less than TDP), and perhaps in terms of fan control with the new powertune features.

But in a 1920x1080 15min FurMark test my HD 7870 never broke a sweat, topping out at 66C with the fans never passing 40%.

It might hold its own on a 290(x), but I think I'd rather go with an Xtreme III.

Or full water... I've been thinking about how to expand my CPU only loop (mcp35x, RayStorm CPU block and an Alphacool ut30-360rad)... I imagine I'd place it between the cpu-out and the return to my res: res -> pump -> rad -> cpu -> gpu -> res.

The EK block looks good, but let's see how the pricing goes for the 290.

There is one listed on info.hardware.nl, but it's priced higher than the cheapest 290x, lol!

Let's just wait and see what happens.


----------



## Zavis

Hello guys,

So I just got my R9 290X, and now I want to upgrade my Deer PSU, because it will have a hard time pushing the new setup I am building









I did make a thread in the PSU section, but I would like to get hold of a person that knows the "facts" of how much power I need for 2 x R9 290X

from my Thread
Quote:


> Hello boys.
> 
> So i just got myself a r9 290x (Awesome card btw)
> 
> so now I am ready to build my final rig. So for the last part I need a PSU that's able to power 2x 290X
> 
> Any suggestions on a good solid power supply?
> I was about to buy the Corsair AX1200, but found some bad reviews about it, so decided not to (mistake?)
> 
> I would like the PSU to be fully modular, and to be able to buy red sleeved cables
> 
> Thanks in advance


----------



## PillarOfAutumn

Quote:


> Originally Posted by *$ilent*
> 
> It's $720 if you buy it from OCUK, that's like an extra $170 man. I'd wait it out...I know it's crap.


oh wow. I was doing my own calculations and it was coming out to $598. But when I put it in the cart, and went to checkout just to see the price, it was reading $733 or something. I backed the eff up lol.


----------



## $ilent

I have the fans at ~1650RPM by using an old scythe fan controller I had laying around. The fans on the Gelid only get noisy at around 2000RPM. These are my idle temps with the fans as they are now: 33C core, 28C VRM1, 32C VRM2.

Quote:


> Originally Posted by *Zavis*
> 
> Hello guys,
> 
> So I just got my R9 290X, and now I want to upgrade my Deer PSU, because it will have a hard time pushing the new setup I am building
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I did make a thread in the PSU section, but I would like to get hold of a person that knows the "facts" of how much power I need for 2 x R9 290X
> 
> from my Thread


My system with an r9 290x and a 3770k drew ~370 watts from the plug whilst benchmarking, measured with a kill-a-watt.
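For anyone using wall readings like this to size a PSU: a kill-a-watt measures AC draw at the plug, which includes the PSU's conversion losses, so the DC load the supply actually delivers is lower. A minimal sketch of that conversion, assuming ~90% efficiency (a typical 80 Plus Gold figure at this kind of load, not a measured value):

```python
# Estimate the DC load a PSU delivers from a wall (AC) power reading.
# The 0.90 efficiency here is an assumed 80 Plus Gold figure, not measured.
def estimated_dc_load(wall_watts: float, efficiency: float = 0.90) -> float:
    """AC draw at the plug times PSU efficiency gives the DC-side load."""
    return wall_watts * efficiency

# ~370 W at the plug while benchmarking a single-290X system:
print(round(estimated_dc_load(370)))  # ~333 W of actual DC load
```

So a ~370 W wall reading stresses the PSU noticeably less than the raw number suggests, which is why single-card systems get by on much smaller units.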

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> oh wow. I was doing my own calculations and it was coming out to $598. But when I put it in the cart, and went to checkout just to see the price, it was reading $733 or something. I backed the eff up lol.


yeah the UK price is £450, which converts to over $700 US.


----------



## Arizonian

Quote:


> Originally Posted by *Zavis*
> 
> Hello guys,
> 
> So I just got my R9 290X, and now I want to upgrade my Deer PSU, because it will have a hard time pushing the new setup I am building
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I did make a thread in the PSU section, but I would like to get hold of a person that knows the "facts" of how much power I need for 2 x R9 290X
> 
> from my Thread


A Coolermaster V1000 would be an inexpensive yet well built PSU. Better than that would be the Antec HCP 1000W Platinum, another solid unit.

I own a Corsair AX850, and if I decide to go crossfire I won't be able to add more than a mild overclock without pushing it to the max, with no way to add voltage.

In the PSU section there are some members who have deep knowledge on PSU's that visit regularly so I'd post this there as well.


----------



## tsm106

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Zavis*
> 
> Hello guys,
> 
> So I just got my R9 290X, and now I want to upgrade my Deer PSU, because it will have a hard time pushing the new setup I am building
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I did make a thread in the PSU section, but I would like to get hold of a person that knows the "facts" of how much power I need for 2 x R9 290X
> 
> from my Thread
> 
> 
> 
> A Coolermaster V1000 would be an inexpensive yet well built PSU. Better than that would be the Antec HCP 1000W Platinum, another solid unit.
> 
> I own a Corsair AX850, and if I decide to go crossfire I won't be able to add more than a mild overclock without pushing it to the max, with no way to add voltage.
> 
> In the PSU section there are some members who have deep knowledge on PSU's that visit regularly so I'd post this there as well.

I would add that it depends on how serious of an overclocker you are. If running near stock on the gpu and a slight overclock on any cpu, the V1000 is good; the HCP1000 is best, however the Antec is a ripoff price-wise. I would most definitely look at the EVGA G2 1000 - I have one too. However, if you are going to watercool and add volts, you will want to add a few hundred more watts for padding.


----------



## $ilent

I would also recommend a 900 or 1000W PSU for 2x 290X. Since a single one draws almost 400W from the wall whilst benching, if you're planning on upping the voltage the power draw goes up a lot, from what I remember reading.
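As a rough sanity check on that 900-1000W recommendation, here is a back-of-the-envelope budget. Every per-component wattage below is an assumption for illustration (an overvolted 290X can plausibly pull around 350 W each), not a measurement:

```python
# Rough DC power budget for a crossfire 290X rig; every figure is an
# assumption for illustration, not a measurement.
components = {
    "290x_overvolted": 350 * 2,  # two cards, assumed ~350 W each when pushed
    "cpu_overclocked": 150,      # assumed overclocked quad-core
    "rest_of_system": 100,       # board, RAM, drives, fans, pump (assumed)
}
total = sum(components.values())
recommended = total * 1.2  # ~20% headroom so the PSU isn't running flat out
print(total, round(recommended))  # 950 1140 -> a 1000-1200 W unit fits
```

Under these assumptions the load alone lands near 950 W, which is why the 1 kW units keep coming up in this thread once overclocking enters the picture.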


----------



## jerrolds

Quote:


> Originally Posted by *$ilent*
> 
> To be fair its actually a really simple install, just takes ~30mins or so. All you do is remove original stock cooler, put small stickers on all the VRMS & memory chips, remove the sticker cover, apply heatsinks, put the actual cooler unit on the gpu core, screw it down and your done. Its knocked off ~40C load temps on my 290x and the fans are inaudible now.


Oh really? You have the Gelid Icy 2 on your 290X? I guess the install guide looks more complicated than it really is: http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x

Currently on sale at newegg.ca and I have a $10 coupon code - worth getting? I was originally thinking of red modding with an H80i and zalman heatsinks, or just going out and buying the Accelero Hybrid.


----------



## PillarOfAutumn

I don't know why they're charging me VAT, though. That's what's pushing the price way up.


----------



## Sgt Bilko

290x is $700 AUD, I was thinking about getting another but then remembered PCIe 2.0


----------



## $ilent

Quote:


> Originally Posted by *jerrolds*
> 
> Oh really? You have the gelid icy 2 on your 290X? I guess the install guide looks more complicated than it really is http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x
> 
> Currently on sale at newegg.ca and i have a $10 coupon code - worth getting? I was originally thinking of red modding with a H80i and zalman heatsinks, or just go out and buy the Accelero Hybrid.


yeah I made a thread here - www.overclock.net/t/1439731/bored-of-waiting-for-none-ref-290x-check-this-simple-cheap-fix/0_40


----------



## VSG

Finally got my second 290X, I must say the unboxing of the Asus was a better experience than the Sapphire one.


----------



## VSG

Also, since the other Asus owners have apparently absconded: there are no stickers anywhere on the Asus regarding voiding the warranty by removing the stock cooler, just like Sapphire.


----------



## Ha-Nocri

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> I don't know why they're charging me the VAT tax, though. That's whats pushing the price way up.


Because the country you live in tries to be socially responsible (cheaper education, health care, etc...)


----------



## Slomo4shO

Quote:


> Originally Posted by *tsm106*
> 
> I would add that it depends on how serious of an overclocker you are. If you're running near stock on the GPU with a slight overclock on the CPU, the V1000 is good; the HCP1000 is best, but the Antec is a ripoff price-wise. I would most definitely look at the EVGA G2 1000. I have one too. However, if you are going to watercool and add volts, you will want a few hundred more watts for padding.


I am not sure what you are getting at here. The V1000 is a Sea Sonic-built model and is one of the best available at this capacity. The only better models in this range would be the SeaSonic Platinum-1000 or the XFX ProSeries 1000W, and even then you are paying a hefty premium for marginally better efficiency. The V1000 is hands down the best option for both performance and price.


----------



## Paul17041993

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 290x is $700 AUD, was thinking about getting another but then remembered PCIe 2.0


The CHV means you get dual x16 if you use the 1st and 3rd x16 slots, but yeah, you'll need space at the lower end of your case unless you go water or just use the stock cooler...

and yeah, two 290Xs (in an overclocked rig) will need 1kW to be stable. 800W *may* be enough, but I wouldn't trust it to be stable; I got my Seasonic for this very reason...


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Ha-Nocri*
> 
> b/c country you live in tries to be socially responsible (cheaper education, health care etc...)


Well, I'm in the US. And I can tell you that the US is none of that


----------



## Ha-Nocri

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Well, I'm in the US. And I can tell you that the US is none of that


I know it's none of that, and I thought you didn't have VAT.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Paul17041993*
> 
> CHV means you get dual x16 if you use the 1st and 3rd x16 slots, but yea you'll need space on the lower end of your case unless you water or just use the stock cooler...
> 
> and yea, two 290X's (in an overclocked rig) will need 1KW to be stable, 800W *may* be enough but I wouldn't trust it to be stable, I got my seasonic for this very reason...


I have a 1200W Silverstone PSU, so that's covered. And with the cooler I have on, I don't think I have room for a second card. I'll go for watercooling for next year's upgrade.


----------



## Arizonian

Sorry but no political discussions per TOS. Posts removed.


----------



## maarten12100

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Looking forward to seeing what these 290's are capable of. Think we can crossfire 290/290x?
> 
> And drivers AMD......need them. Some games are so choppy atm.


I figure they can be crossfired with one another, just like you could CF a 6950 and 6970, or a 7950 and 7970.
Any idea whether the new drivers will increase performance by ~10% (going by TPU's graph), making the 290 outperform a 290X on old drivers?
I wonder how the 290X with the new drivers will stack up against the 780, since it already makes the 780 look pathetic even with the price drop, unless you get a Classified.

EDIT: quote removed


----------



## RAFFY

I know this post is worthless without proof of my purchases but I CAN NOT WAIT to go pick up my 290X's after work today! I have been reading through this thread and can not wait to dominate BF4 and COD:Ghosts with these GPU's!!!


----------



## Arizonian

Quote:


> Originally Posted by *RAFFY*
> 
> I know this post is worthless without proof of my purchases but I CAN NOT WAIT to go pick up my 290X's after work today! I have been reading through this thread and can not wait to dominate BF4 and COD:Ghosts with these GPU's!!!


Always a good feeling getting a new graphics card.









Getting two huh? Sweet. Post pics when you get them.


----------



## utnorris

My second 290X should be here Wednesday and my second block on Thursday. By the way, PPCS still has a couple of blocks left in stock, though not the acetal versions.


----------



## Sgt Bilko

Quote:


> Originally Posted by *maarten12100*
> 
> I figure they can be crossfired with one another just like you could CF 6950 and 6970 and you could CF 7950 and 7970.
> Any idea if the drivers that will increase performance like 10% (going by the graph of TPU) making the 290 outperform a 290x on old drivers.
> I wonder how the 290X with the new drivers will stack up against the 780 since it already makes it look pathetic even with the price drop unless you get a classified.
> 
> EDIT: quote removed


Well, I'm roughly hitting the same scores as 780 ref owners on stock volts, so with some decent drivers and a voltage unlock it should be gg.


----------



## Forceman

Well, heck. I ordered an XFX 290X from Amazon last week and it turned out they were out of stock, so over the weekend I put in an order for the 290 they had listed, and now both of them shipped today with 2nd Day Air. So I guess I'll have both a 290X and a 290 to choose from on Wednesday.

Edit: Hmm. I'm guessing the EK 290X blocks will work on a reference 290 also?


----------



## PillarOfAutumn

So can someone please tell me if this is enough: I have a 750W Seasonic X PSU. Connected to it I have 2 SSDs, a Z77 MPower mobo, an i5 3570K OCed to 4.5GHz, 7 Cougar Vortex fans, and one Swiftech D5 PWM pump. Will 750 watts be enough to power this whole system, with the 290X potentially overclocked to 1300/1500, and still have some safety headroom?


----------



## xxmastermindxx

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> so can someone please tell me if this is enough: I have a 750 watt SeaSonicX psu. To it I have 2 SSDs, a z77 MPower mobo, an i5 3570k OCed to 4.5ghz, 7 cougar vortex fans, and one swiftech D5 pwm pump. Will 750 watts be enough to power this whole system with the 290x potentially overclocked to 1300/1500 and still have some safety headroom?


The 290X box shows a 750W minimum requirement. An overclock to 1300MHz, even if your card reaches it, will require quite a bit of power. Add your overclocked CPU along with the other minor parts, and I'd say no safety headroom at all.


----------



## Forceman

Quote:


> Originally Posted by *xxmastermindxx*
> 
> The 290X box shows 750W minimum requirement. An overclock of 1300, even if your card reaches it, will require a bit of power. That and your overclocked CPU, along with the other minor parts, and I'd say no safety headroom at all.


Meh. They overestimate those requirements to account for people with crappy PSUs that can't deliver anywhere near their rated power. A couple of people have Kill-a-Watted their systems and showed ~400W draw with 290X-equipped systems. Unless you are pushing crazy volts, I don't think you'd run into trouble with a 750W.
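For anyone wanting to redo this math themselves, the headroom check being debated here is just a subtraction; a minimal sketch, where every component draw is an illustrative assumption rather than a measurement:

```python
# Back-of-envelope PSU headroom estimate for a single-290X rig.
# All wattage figures below are ballpark assumptions for illustration.
def psu_headroom(psu_watts, component_draws):
    """Return the watts left over after summing estimated component draws."""
    return psu_watts - sum(component_draws.values())

draws = {
    "r9_290x_oc": 350,    # assumed draw for a heavily overclocked 290X
    "i5_3570k_oc": 120,   # assumed draw for a 4.5 GHz i5
    "mobo_ram_ssds": 50,  # assumed motherboard/RAM/SSD draw
    "fans_pump": 30,      # assumed fans + D5 pump draw
}
print(psu_headroom(750, draws))  # 200 W left on a 750 W unit with these numbers
```

With those assumptions a 750W unit has some margin at stock, but pushing extra voltage into the card quickly eats the remainder, which matches the caution in the replies above.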


----------



## HoZy

Depending on your OC, I've seen 290Xs pulling 450 watts alone!


----------



## maarten12100

Quote:


> Originally Posted by *HoZy*
> 
> Depending on your O/C I've seen 290x's pulling 450watt alone!


Not that bad, since a Titan pulls 500W OCed.


----------



## Forceman

Quote:


> Originally Posted by *HoZy*
> 
> Depending on your O/C I've seen 290x's pulling 450watt alone!


Where, and what overclock?


----------



## xxmastermindxx

Quote:


> Originally Posted by *Forceman*
> 
> Meh. They overestimate those requirements to account for people with crappy PSUs that can't deliver anything near their rated power. A couple of people have Kill-a-Watted their systems and showed ~400W draw with 290X equipped systems. Unless you are pushing crazy volts I don't think you'd run into trouble with a 750W.


That's probably true, but he asked if he'd be fine with safety headroom. On an already overclocked system, and with the high volts he's probably going to be using (he's asking about a 1300MHz GPU clock), I wouldn't feel comfortable saying yes, there's plenty of safety headroom.


----------



## Forceman

Quote:


> Originally Posted by *xxmastermindxx*
> 
> That's probably true, but he asked if he was fine with safety headroom. On an already overclocked system, and with the high volts he's probably going to be using (with asking about 1300 GPU speed), I wouldn't feel comfortable saying yes, there's plenty of safety headroom.


True, it's all going to depend on the volts.


----------



## HoZy

A stock 290X in Uber mode will pull 400 watts in Furmark. Test it yourself if you like, or just have a Google around for 290X power consumption figures; plenty of people have put them out there.

It's why 1200W is recommended for a Crossfire setup + watercooling etc.


----------



## maarten12100

Quote:


> Originally Posted by *HoZy*
> 
> A stock 290x in Uber mode will pull 400watt on furmark. Test it yourself if you like, But just need to have a google around for 290x power consumption figures and plenty of people have put them out there.
> 
> It's why 1200w is recommended for a Crossfire setup + watercooling etc.


Furmark...
1600W for quadfire is plenty


----------



## Forceman

Quote:


> Originally Posted by *HoZy*
> 
> A stock 290x in Uber mode will pull 400watt on furmark. Test it yourself if you like, But just need to have a google around for 290x power consumption figures and plenty of people have put them out there.
> 
> It's why 1200w is recommended for a Crossfire setup + watercooling etc.


That's total system power. The cards themselves aren't drawing that much, even in Uber mode. With voltage mods maybe, but not on stock BIOSes.


----------



## Mas

Quote:


> Originally Posted by *HoZy*
> 
> A stock 290x in Uber mode will pull 400watt on furmark. Test it yourself if you like, But just need to have a google around for 290x power consumption figures and plenty of people have put them out there.


Err... sorry for the stupid question, but do you mean total system draw, or just the card itself without factoring in the rest of the system?


----------



## King PWNinater

Quote:


> Originally Posted by *HoZy*
> 
> A stock 290x in Uber mode will pull 400watt on furmark. Test it yourself if you like, But just need to have a google around for 290x power consumption figures and plenty of people have put them out there.
> 
> It's why 1200w is recommended for a Crossfire setup + watercooling etc.


You do realize that is Total System consumption, right?


----------



## xxmastermindxx

Quote:


> Originally Posted by *Forceman*
> 
> That's total system power. The cards themselves aren't drawing that much, even in Uber mode. With voltage mods yes, but not on stock BIOSes.


Again, he said "290x potentially overclocked to 1300/1500". That means volts. I'm running a quick test right now at 1.3V and 1050MHz core (this card is a bad clocker), and pulling close to 400W. Card only.


----------



## Mas

Quote:


> Originally Posted by *Mas*
> 
> Just picked up R9 290X on the way home from work and installed it a couple of minutes ago. Shop only had one card so will have to wait for crossfire. When I get the second card I'll put in the order for some water blocks. Already have a 480 rad waiting, going to do a loop just for the GPUs. Will fit easily in the 900D I have ordered which should arrive by end of week.
> 
> 1. http://www.techpowerup.com/gpuz/fucez/
> 
> 2. Gigabyte R9 290X
> 
> 3. Stock cooling for the moment, will put it under water soon.


Still no cards available, but pre-order just opened up for the waterblocks, so I put in an order. Should have them 16th or 17th of Nov. Going to put this card (and future 2nd card) on its own 480 rad.


----------



## Mas

Quote:


> Originally Posted by *xxmastermindxx*
> 
> Again, he said "290x potentially overclocked to 1300/1500". That means volts. I'm running a quick test right now at 1.3V and 1050MHz core (this card is a bad clocker), and pulling close to 400W. Card only.


Did he? Not what I'm reading

Quote:


> Originally Posted by *HoZy*
> 
> A stock 290x in Uber mode will pull 400watt on furmark. Test it yourself if you like, But just need to have a google around for 290x power consumption figures and plenty of people have put them out there.
> 
> It's why 1200w is recommended for a Crossfire setup + watercooling etc.
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> That's total system power. The cards themselves aren't drawing that much, even in Uber mode. With voltage mods maybe, but not on stock BIOSes.
Click to expand...


----------



## xxmastermindxx

Quote:


> Originally Posted by *Mas*
> 
> Did he? Not what I'm reading


Read a little further back to see who and what I'm responding to, because you're a little behind.


----------



## Forceman

Quote:


> Originally Posted by *xxmastermindxx*
> 
> Again, he said "290x potentially overclocked to 1300/1500". That means volts. I'm running a quick test right now at 1.3V and 1050MHz core (this card is a bad clocker), and pulling close to 400W. Card only.


You and I are on the same page - overclocked that high it is going to depend on the volts. My other comment was directed toward HoZy saying the cards were drawing 400W stock.

What BIOS do you have on your card? Vdroop is apparently pretty bad with the Asus one - might be holding back your core clock. Or else you have one of the worst clocking cards around.


----------



## Sgt Bilko

In short... a good quality 1000W PSU will have you covered for an overclocked and watercooled rig.

But I wouldn't try to run an overclocked system on a 750W PSU.


----------



## xxmastermindxx

Quote:


> Originally Posted by *Forceman*
> 
> You and I are on the same page - overclocked that high it is going to depend on the volts. My other comment was directed toward HoZy saying the cards were drawing 400W stock.
> 
> What BIOS do you have on your card? Vdroop is apparently pretty bad with the Asus one - might be holding back your core clock. Or else you have one of the worst clocking cards around.


I'm using PT3 right now. I'm running a few tests now with my spare PSU, an XFX 850W. The card is the only thing hooked up to it, and the PSU to a Kill-a-Watt. 1.3V and 1050MHz peaks just over 400W.


----------



## Mr357

It's been fine so far for me. I've got a 2700K at 4.8GHz and my 290X at 1200/1500 on a single rail 750 watt.


----------



## Mas

Quote:


> Originally Posted by *xxmastermindxx*
> 
> Read a little further back to see who and what I'm responding to, because you're a little behind


Then I think you may have quoted the wrong post in your response


----------



## PillarOfAutumn

Thanks for your help, guys! I guess I will either buy the 290 or not OC the 290X too much. I will also push my CPU back down to its stock clocks. The only thing is that I bought this PSU just back in January and I want to keep it for a while. Do driver updates improve power usage in any way?


----------



## tsm106

Two cards on the PSU with my CPU and nothing else connected, i.e. no drives, no loop, nothing else. Running 3DM11 Extreme at stock with +50 on air, so it's throttling a bit as expected. 903W at the wall x ~0.89 efficiency at that load = *803W* of DC output. This number should blow up with blocks and serious overclocking. You need to remember that on the stock BIOS, PowerTune is trying to keep the TDP of the cards very, very low.
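The arithmetic above (wall draw multiplied by PSU efficiency gives the DC load the components actually pull) can be sketched as follows; the ~89% efficiency figure is the assumption from the post, not a measured spec:

```python
# Convert wall (AC) power draw to the DC load delivered by the PSU,
# given an assumed efficiency at that load point.
def dc_load(wall_watts, efficiency):
    """DC watts supplied to components for a given AC wall reading."""
    return wall_watts * efficiency

# 903 W at the Kill-a-Watt, assumed ~89% efficient at that load:
print(round(dc_load(903, 0.89)))  # 804, i.e. the ~803 W figure quoted above
```

Note the direction of the conversion: the components see less power than the wall meter reads, so a Kill-a-Watt reading overstates the load the PSU's rating has to cover.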


----------



## skupples

Quote:


> Originally Posted by *Forceman*
> 
> Well, heck. I ordered a XFX 290X from Amazon last week and it turned out they were out of stock so over the weekend I put in an order for the 290 they had listed, and now both of them shipped today with 2nd Day Air. So I guess I'll have both a 290X and 290 to choose from on Wednesday.
> 
> Edit: Hmm. I'm guessing the EK 290X blocks will work on a reference 290 also?


How does that work with AMD exactly? I've always been confused about how GCN can mix GPU types... Can the 290X only run as fast as the 290? Or is that a non-issue with AMD products?


----------



## tsm106

Quote:


> Originally Posted by *skupples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> Well, heck. I ordered a XFX 290X from Amazon last week and it turned out they were out of stock so over the weekend I put in an order for the 290 they had listed, and now both of them shipped today with 2nd Day Air. So I guess I'll have both a 290X and 290 to choose from on Wednesday.
> 
> Edit: Hmm. I'm guessing the EK 290X blocks will work on a reference 290 also?
> 
> 
> 
> How does that work with AMD exactly? Iv'e always been confused about how GCN can mix gpu types... Can the 290X only run as fast as the 290? Or is that non issue with AMD products?
Click to expand...

They are not locked clocks-wise, if that's what you're wondering. The cards will run at whatever speed you set them to, but too much difference can cause problems with them syncing. In theory it should not be as big a deal with these cards, given their CFX scaling system.


----------



## PillarOfAutumn

Just ordered the Sapphire 290x from Newegg!!


----------



## s0up2up

Does anyone think that a Seasonic X-1250 should be able to handle a 4.5GHz 4930K and three 290Xs?


----------



## Forceman

Quote:


> Originally Posted by *skupples*
> 
> How does that work with AMD exactly? Iv'e always been confused about how GCN can mix gpu types... Can the 290X only run as fast as the 290? Or is that non issue with AMD products?


I'm not planning to crossfire them, just to choose which one to keep. I didn't really intend to buy both; I just wanted to hedge my out-of-stock 290X order. Ended up with both, but at least Amazon is cool with no restocking fees (Newegg, I'm looking at you).


----------



## Kaapstad

These cards really use the watts. I tried running three 290Xs in a system with a Corsair 1200i PSU and it kept crashing.

I did manage to keep it running long enough to do this

http://www.3dmark.com/fs/1093552

Score 24450

AMD Radeon R9 290X(3x) @1180/1625 on air

Intel Core i7-3970X Extreme Edition Processor @4.9

Graphics Score 36299

Physics Score 17328

Combined Score 8635

But even then the system was not flat out.

When I get all four 290Xs running I will be using 2 x 1200watt PSUs to take care of things.


----------



## esqueue

Quote:


> Originally Posted by *geggeg*
> 
> Also, since the other Asus owners have apparently absconded, there are no stickers anywhere on the Asus regarding warranty voidage upon removal of the stock cooler- just like Sapphire


Companies want to be frugal and put warranty-void stickers on a card that I feel needs extra cooling, so I'll use a hair dryer to remove and save the sticker.


----------



## Mas

Quote:


> Originally Posted by *s0up2up*
> 
> Does anyone think that an Seasonic X-1250 should be able to handle a 4.5Ghz 4930K and three 290X's?


TBH I think that would be pushing it; I wouldn't do it personally.


----------



## Kaapstad

Quote:


> Originally Posted by *s0up2up*
> 
> Does anyone think that an Seasonic X-1250 should be able to handle a 4.5Ghz 4930K and three 290X's?


No, see my post above.

If you are going to be overclocking, it is way too tight.


----------



## supermi

Quote:


> Originally Posted by *scramz*
> 
> I reached 1271/1500 under water, max temp was 49 degrees with a few SP120 Quiet Editions. This card was made for watercooling lol


What voltage did you need? Is this game stable?

If you already answered this later on in the thread, oops, I am a few pages behind (catching up though)!


----------



## sdf258sefkljs

I'll be waiting for the non-reference models to show up before I (possibly) pull the trigger on one.


----------



## HoZy

Quote:


> Originally Posted by *sdf258sefkljs*
> 
> I'll be waiting for the non-reference models to show up before I (possibly) pull the trigger on one.


Good luck. AMD pushed a reference-only launch, and from what everyone's saying you won't be seeing a pre-fitted non-reference cooler until early 2014.


----------



## Sgt Bilko

Quote:


> Originally Posted by *HoZy*
> 
> Goodluck, AMD pushed a reference only launch & you won't be seeing a pre-fitted non-reference cooler until early 2014 from what everyone's saying.


We have heard late November at the earliest, with December looking more likely... maybe to coincide with the BF4 Mantle launch?


----------



## PillarOfAutumn

So I noticed that as soon as I bought my 290X from Newegg, the site said "out of stock." I was using the website nowinstock.com to track this, and as soon as I heard the alarm I quickly bought one, and it went out of stock immediately. So I'm thinking there was probably 1, maybe 2 in stock. Now, what are the chances that what Newegg sends me is a card that has already been returned? Is it common to have only 1 or 2 in stock?


----------



## Yvese

Quote:


> Originally Posted by *Connolly*
> 
> Hello,
> 
> I'm thinking about upgrading to the R9 290x but am unsure how badly my current rig would bottle neck.
> 
> I have:
> 
> - Intel Core i7 2600k 3.4GHz Socket 1155 8MB Cache
> - Corsair 8GB (2x4GB) DDR3 1600MHz
> - Asus P8Z68-V PRO Z68 Socket 1155 8 Channel HD Audio ATX Motherboard
> - Coolermaster Centurion 5 II SPECIAL EDITION Case with Red Interior - With Coolermaster 650W GX PSU
> 
> The components are a couple of years old and I'm wondering if I'd need to replace most of them to realise anything like the full potential of the GPU. I'm assuming I'd need a new PSU, but that's relatively inexpensive, it's more the CPU and motherboard I'm concerned about.
> 
> Obviously I'm not a bleeding edge tech guy, so any advice would be greatly appreciated.
> 
> Cheers,
> 
> Matt.


I would recommend OCing that 2600K to at least 4.2GHz if you can, 4.5 ideally. You can hit up the Intel subforum here for help with that.

As for everything else, it all looks good.


----------



## Forceman

Amazon just dropped the 290 price to $419 (from $469) so looks like it will be around $400. That's a heck of a price, and now I guess I'll have a 290X to sell Wednesday night, if anyone wants one.


----------



## Sazz

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Just ordered the Sapphire 290x from Newegg!!


How can you order if they aren't in stock? I've been checking almost every hour for the past 3 days.


----------



## DampMonkey

I just realized that the R9 290 is releasing in a few hours. Hopefully this means we can treat our 290Xs to a new set of drivers as well.









http://www.bit-tech.net/news/hardware/2013/10/31/amd-r9-290-delay/1


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Sazz*
> 
> How can you order if they aren't in stock, I've been checking almost every hour for the past 3days.


I was using the website BuyItInStock.com.

Quote:


> Originally Posted by *Forceman*
> 
> Amazon just dropped the 290 price to $419 (from $469) so looks like it will be around $400. That's a heck of a price, and now I guess I'll have a 290X to sell Wednesday night, if anyone wants one.


I'm starting to get buyer's remorse. I should probably sell the 290x. Will Newegg cancel my order? It hasn't even shipped yet.


----------



## Roaches

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> I'm starting to get buyer's remorse. I should probably sell the 290x. Will Newegg cancel my order? It hasn't even shipped yet.


If it hasn't shipped yet, you might as well cancel it before it gets all complicated...


----------



## esqueue

Quote:


> Originally Posted by *Sazz*
> 
> How can you order if they aren't in stock, I've been checking almost every hour for the past 3days.


http://www.nowinstock.net/computers/amdr9290x/ is how I ordered mine from Newegg a few days ago. Set the alarm and have your account made and ready. It took me over 30 minutes because I had to change a password I'd made years back, and my bank was asking for additional info due to the high price.


----------



## Sazz

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> So I noticed that as soon as I bought my 290x from Newegg, the site said "out of stock." I was using the website nowinstock.com to track this and as soon as I heard the alarm, I quickly bought, and it went out of stock immediately. So I'm thinking that there was probably 1, maybe 2 in stock. Now, what are the chances that what Newegg is going to send me has been a returned card already? Is it common to have only 1 or 2 in stock?


I find this intriguing. They received my RMA today, and instead of sending a replacement they gave me in-store credit (I was going to be sent a refund, but the CSR told me an in-store credit could be done as soon as today, so IF the item goes in stock I can immediately buy it with the credit).

If they'd had stock today they would have sent me a replacement instead of a refund/in-store credit, so I don't know how you guys were able to get one. I've been watching the site and each product page almost every hour for the past 3 days since I sent my RMA, and I haven't seen one come in stock so far.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Roaches*
> 
> If it hasn't shipped yet you mind as well cancel it before it gets all complicated...


Yup! Just cancelled it! So there's one in stock now. I think I might as well just go for the 290: lower TDP (hopefully), and it can probably handle games at 1080p/1440p. I think I'm going to be on 1080p for a while.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Sazz*
> 
> I find this intriguing. They received my RMA today, and instead of sending a replacement they gave me in-store credit (I was going to be sent a refund, but the CSR told me an in-store credit could be done as soon as today, so IF the item goes in stock I can immediately buy it with the credit).
> 
> If they'd had stock today they would have sent me a replacement instead of a refund/in-store credit, so I don't know how you guys were able to get one. I've been watching the site and each product page almost every hour for the past 3 days since I sent my RMA, and I haven't seen one come in stock so far.


Dude!

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202058

Buy it now before its too late! I cancelled my order!


----------



## Sazz

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Dude!
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202058
> 
> Buy it now before its too late! I cancelled my order!


GOT IT! Thanks for the PM! saved me!!


----------



## Roaches

Its already out of stock.


----------



## Sazz

Quote:


> Originally Posted by *Roaches*
> 
> Its already out of stock.


yeah I was able to get it xD


----------



## Roaches

Quote:


> Originally Posted by *Sazz*
> 
> yeah I was able to get it xD


Sticky fingers you have there.


----------



## Sazz

Quote:


> Originally Posted by *Roaches*
> 
> Sticky fingers you have there.




I just find this weird, because today they received my returned R9 290X, and then today this one R9 290X goes available on their site?

Hmmmm... let's see if it has the same S/N as the one I returned. I wish I had marked the card in some little way that wasn't obvious, so I'd know whether they are just sending me back the same card.

Well, if the DisplayPort doesn't work, it's the same card xD


----------



## the9quad

When do you think these will be available in normal quantities? Next month, maybe? I'd like to get another one and a better monitor sometime around Thanksgiving. One card handles 1080p at 60fps in BF4, but anything more than that and I start seeing it dip below 60 during intense situations, so I definitely think Crossfire is needed for anything above 1080p.


----------



## Sazz

Quote:


> Originally Posted by *the9quad*
> 
> When do you think these will be available in normal quantities? You think next month? I'd like to get another one and a better monitor, sometime around thanksgiving. One card handles 1080p at 60fps in bf 4, but anything more than that and I start seeing it dip below 60 during intense situations, so I definitely think crossfire is needed for anything more than 1080p.


They should have more of them during the holiday sales... I mean, it would be stupid not to xD

I ran my R9 290X, back when it was working properly, with a 3-monitor setup at 6020x1080, everything maxed out except AA, which I turned off in-game so CCC could use its own AA settings. I was getting 45-60fps on average, and during intense situations (when there are a lot of players with explosions everywhere! LOL) the lowest I saw it dip was 32fps, and even then it still felt smooth. You know how sometimes you see FPS drop to 30 and you can really feel it, like the screen isn't as smooth as normal? Even during those dips I didn't sense any sluggishness.

1440p should still be alright with everything maxed including AA, staying right around a constant 60fps, but anything over that is probably either a 4K monitor or an Eyefinity setup, and you would need to turn off AA and maybe turn tessellation down a notch to keep the FPS up, or get a 2nd card.

One consideration though: I am using an FX-8350 with this card, so if you are using an i5-3570K or above, you will see a higher minimum FPS than I do, probably around 5, maybe up to 10fps above my minimum of 32, which isn't bad.
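As a rough sanity check on why anything above 1080p pushes toward a second card, compare raw pixel counts at the resolutions mentioned above (illustrative only; frame rate does not scale perfectly with pixels, and the 6020x1080 figure is the bezel-compensated triple-monitor span from the post):

```python
# Relative GPU pixel load versus a 1080p baseline.
def pixel_ratio(width, height, base=(1920, 1080)):
    """Pixels at width x height relative to the 1080p baseline."""
    return (width * height) / (base[0] * base[1])

for name, (w, h) in {"1440p": (2560, 1440), "Eyefinity": (6020, 1080)}.items():
    print(f"{name}: {pixel_ratio(w, h):.2f}x the pixels of 1080p")
```

1440p is roughly 1.78x the pixels of 1080p and the Eyefinity span is over 3x, which lines up with 1440p being borderline for one card while triple-screen setups lean on Crossfire.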


----------



## RAFFY

Quote:


> Originally Posted by *Arizonian*
> 
> Always a good feeling getting new graphics card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Getting two huh? Sweet. Post pics when you get them.


SEE! I'm NOT a liar lol. Sorry I don't have a GPU-Z screenshot up yet, as I'm waiting for my roommate (she's a female) to finish watching The Princess Diaries. I need the TV as a monitor since my ASUS PB278Q doesn't come in until tomorrow.


2 * ASUS R9 290X


----------



## Arizonian

Quote:


> Originally Posted by *DampMonkey*
> 
> I just realized that the R9 290 is releasing in a few hours. Hopefully this means we can treat our 290x's to a new set of drivers as well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.bit-tech.net/news/hardware/2013/10/31/amd-r9-290-delay/1


I wonder what the power draw is going to be on a 290 compared to the 290X? I hope the non-reference cards come sooner than 2014. I also read end of November on the ASUS ROG forum; I'll try to dig it up, but it was speculation.

I'm looking at the 290X ASUS Matrix, Sapphire Toxic or MSI Lightning.

Also, unlike last time, I will add 290 owners with a pic and OCN name on submission. Had a little too much to track before.


----------



## Sazz

Quote:


> Originally Posted by *Forceman*
> 
> Amazon just dropped the 290 price to $419 (from $469) so looks like it will be around $400. That's a heck of a price, and now I guess I'll have a 290X to sell Wednesday night, if anyone wants one.


I'm guessing that once the dust settles we may see the 290X (regular, non-BF4 version) in the low $500s. Thinking it may go as low as $519; with the 290 at $419, a $100 difference seems about right.

But that's always the price of getting it first: you pay the extra bucks. Same deal when I got the 7970 at launch, $559 for the first month or so, then down to $499 after the dust settled.


----------



## Arizonian

Quote:


> Originally Posted by *Sazz*
> 
> I'm guessing that once the dust settles we may see 290X (regular nonBF4 version) at low 500's mark, (thinking it may go as low as 519bucks considering 290 is 419, 100bucks difference should be right)
> 
> But that's always the price of getting it first, you pay the extra bucks. same case when I got the 7970 first, 559 on the first month or so and drops down to 499 after the dust settled.


+1 Early adopters should know it comes with the territory. Anything that's bleeding-edge tech costs through the nose too, aka 4K monitors.









Quote:


> Originally Posted by *RAFFY*
> 
> SEE! I'm NOT a liar lol. Sorry I don't have a GPU-Z score up as I'm waiting for my roommate (she's a female) to finish watching The Princess Diaries. Need the TV as a monitor since my ASUS PB278Q doesn't come in until tomorrow.
> 
> 
> 2 * ASUS R9 290X


Congrats, you're on board now


----------



## VSG

If the 290X price drops to $499 within even a month, I can't say I will be happy.


----------



## utnorris

Well, if AMD does lower the price to $519 I am wondering how retailers will handle that considering it is less than 30 days. IIRC, Nvidia did that with the 580 and people got refunds, but we will see what happens. I wouldn't mind if they did, but I also understood that I would be paying a premium upfront for these.

As far as power goes, I am going to say it again: most of you have quoted the high power consumption with X79 setups, which eat a lot of power compared to Z77 systems. Even running high volts in benchmarks I was not going past 400 watts total system. Granted, I was not using Furmark, but that's an unrealistic way to test power consumption. I will have my second 290X in a few days and will test it out using my 1200 watt Antec PSU to see what the pull is. If it never goes above 800 watts I'll switch back to my Seasonic 860 Platinum, as it is a nicer PSU and fully modular.


----------



## Clockster

Well, still no stock of the R9 290X here in SA, so I'm getting 2x R9 290s today








Will post pics tonight, my waterblocks are also here...Just hoping they fit the 290...


----------



## skupples

Quote:


> Originally Posted by *s0up2up*
> 
> Does anyone think that an Seasonic X-1250 should be able to handle a 4.5Ghz 4930K and three 290X's?


Seems with the results people are providing under full load, you may really be pushing it. May I recommend this toy called the Add2PSU!
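For anyone wanting to put rough numbers on that, here's a back-of-the-envelope sketch in Python. Every wattage figure below is an assumption pulled from typical review numbers of the era, not a measurement, so treat the result as a sanity check only:

```python
# Rough PSU headroom estimate for an overclocked 4930K plus multiple 290Xs.
CPU_OC_W = 200            # 4.5 GHz 4930K under load (assumed)
GPU_W = 300               # one reference 290X under sustained load (assumed)
BOARD_DRIVES_FANS_W = 75  # motherboard, drives, fans, pump (assumed)
PSU_RATED_W = 1250        # Seasonic X-1250 rating

def headroom(n_gpus: int) -> int:
    """Return remaining watts of headroom (negative means over budget)."""
    total = CPU_OC_W + n_gpus * GPU_W + BOARD_DRIVES_FANS_W
    return PSU_RATED_W - total

print(headroom(3))  # 1250 - (200 + 900 + 75) = 75 W left
```

With only ~75 W of estimated slack at three cards, any overvolting pushes the build past the rating, which is why a second PSU (e.g. via Add2PSU) keeps coming up in this thread.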

Quote:


> Originally Posted by *esqueue*
> 
> Since companies want to be frugal and put void warranty stickers on a card that I feel need extra cooling. I'll use a hair drier to remove and save the sticker.


Yeah... It's an annoying practice that's been going on for as long as I can remember. It's why I always get my NV products from EVGA. They have one of the more liberal policies when it comes to tweaking their cards. For one, the warranty sticker is almost never over the screws, they don't seem to check for BIOS flashing, & they totally understand that we <3 putting water blocks on them & cranking the settings to max.

The hair dryer should work pretty well, just remember to leave it on the card so you don't lose it!


----------



## Sgt Bilko

If they reduce the price when the non-ref models come out then I'm fine with that.

Arizonian, pretty sure a rep mentioned the possible late-Nov date for non-ref cards in another thread


----------



## ZombieJon

Quote:


> Originally Posted by *Arizonian*
> 
> I wonder what the power draw is going to be on a 290 compared to 290X? I hope the non-reference cards are sooner than 2014. I read also end of November in ASUS Rog forum. Will try to dig it up but it was speculation.
> 
> I'm looking at 290X ASUS Matrix, Sapphire Toxic or MSI Lightning.
> 
> Also unlike last time will add 290 owners with pic and OCN name for submission. Got a little too much to track before.


I'm curious about this as well. I'm currently waiting for a SigmaCool mount to arrive before I decide between a 290 and 290X.


----------



## Arizonian

Also, some other good news in case you guys haven't heard: the Mantle Graphics API has been adopted by industry-leading game developers Cloud Imperium, Eidos-Montreal and Oxide.

Let the Mantle train begin.









http://phx.corporate-ir.net/mobile.view?c=74093&v=203&d=1&id=1871714


----------



## skupples

Quote:


> Originally Posted by *Arizonian*
> 
> Also some other good news I don't know if you guys heard but Mantle Graphics API Adopted by Industry Leading Game Developers Cloud Imperium, Eidos-Montreal and Oxide.
> 
> Let the Mantle train begin.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://phx.corporate-ir.net/mobile.view?c=74093&v=203&d=1&id=1871714


Hrrm... interesting, though not surprised. I expect all "evolved"-type engines to adopt Mantle.

So, Mantle will only be 100% active when the system runs both an AMD CPU & GPU, correct?

Ohhh, Star Citizen is even looking to adopt it... THAT is interesting... & may have me jumping ship when 20nm comes to market...


----------



## Arizonian

No, just GCN GPUs. Have a read:

http://www.amd.com/us/products/technologies/gcn/Pages/gcn-architecture.aspx


----------



## youra6

any newegg promo codes?


----------



## tsm106

Iirc, almost all the engines are accounted for and on board.


----------



## Kriant

R9 290 seems like a wild "win" for 400$ delivering same performance as 290x according to TPU 0_0


----------



## Slomo4shO

$400 for the 290. Now to decide whether to wait for non-reference or pick up the reference cards now


----------



## Arizonian

Quote:


> Originally Posted by *Kriant*
> 
> R9 290 seems like a wild "win" for 400$ delivering same performance as 290x according to TPU 0_0


Yes, that's taking the new drivers into consideration, which also boost the 290X, which will stay ahead of it.


----------



## VSG

Those new drivers are apparently just a boost in stock fan profiles.


----------



## Forceman

Quote:


> Originally Posted by *Arizonian*
> 
> Yes that's taking the new drivers into consideration which also boost the 290 X which will stay ahead of it.


The new drivers aren't new, they are 13.11 Beta 8, and they just increase the default fan profile on the 290. So no gains for the 290X, according to the reviews I read.


----------



## tsm106

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Kriant*
> 
> R9 290 seems like a wild "win" for 400$ delivering same performance as 290x according to TPU 0_0
> 
> 
> 
> Yes that's taking the new drivers into consideration which also boost the 290 X which will stay ahead of it.
Click to expand...

780/titan level performance for 400 bucks lol. It's a good time for single 1080p gamers who are in the market for one card.


----------



## battleaxe

Quote:


> Originally Posted by *Kaapstad*
> 
> These cards really use the watts, I tried running three 290Xs in a system with a 1200i corsair PSU and it kept crashing.
> 
> I did manage to keep it running long enough to do this
> 
> http://www.3dmark.com/fs/1093552
> 
> Score 24450
> 
> AMD Radeon R9 290X(3x) @1180/1625 on air
> 
> Intel Core i7-3970X Extreme Edition Processor @4.9
> 
> Graphics Score 36299
> 
> Physics Score 17328
> 
> Combined Score 8635
> 
> But even then the system was not flat out.
> 
> When I get all four 290Xs running I will be using 2 x 1200watt PSUs to take care of things.


Geez, nice!


----------



## jerrolds

Finally able to install GPU-Z - running 75% ASIC on the dot. My Gelid Icy 2 should arrive before the weekend; hopefully I can push this to 1200MHz with increased volts. Right now at 1100/1400, stock everything.


----------



## Kriant

Quote:


> Originally Posted by *tsm106*
> 
> 780/titan level performance for 400 bucks lol. It's a good time for single 1080p gamers who are in the market for one card.


I was thinking more along the lines of : watercooled quadfire


----------



## Arizonian

Quote:


> Originally Posted by *Forceman*
> 
> The new drivers aren't new, they are 13.11 Beta 8, and they just increase the default fan profile on the 290. So no gains for the 290X, according to the reviews I read.


I'm expecting 13.11 Beta 9, which are the drivers that delayed this launch in the first place, so reviewers could review it with Beta 9 when the NDA lifted. We should be seeing that performance boost promised along with the 290X.


----------



## Slomo4shO

Quote:


> Originally Posted by *Arizonian*
> 
> Yes that's taking the new drivers into consideration which also boost the 290 X which will stay ahead of it.


I am not sure; all the reviews thus far mention AMD Catalyst 13.11 Beta v8 as the drivers for the card. Well, TechPowerUp didn't specify the version, but AnandTech and Tom's both mention v8.


----------



## Forceman

Quote:


> Originally Posted by *Arizonian*
> 
> I'm expecting 13.11 Beta 9, which are the drivers that delayed this launch in the first place, so reviewers could review it with Beta 9 when the NDA lifted. We should be seeing that performance boost promised along with the 290X.


No, the delay was for Beta 8, which changed the fan profiles.
Quote:


> *What has changed is the default fan speed.* As you might recall from our 290X review, the 290X can't actually sustain its 1000MHz boost clock at its default fan limit of 40%. The amount of heat generated at those clockspeeds and voltages is just too great for the cooler, and as a result the card has to pull back, significantly at times, in order to keep itself within tolerances with the amount of cooling provided at a 40% fan speed. Like the 290X, the 290 as originally specified would also have a default fan speed of 40%, and like the 290X it too would throttle under just about all sustained workloads. Or as AMD likes to put it, the 40% fan speed on the original 290 would have left "untapped performance headroom."
> 
> So for the new 290 as will be reviewed and shipping, AMD has turned up the default fan speed from 40% to 47%, essentially making uber mode the default mode on the 290. Consequently with improved cooling performance the 290 throttles less (if at all), thereby improving its performance despite the other specifications technically remaining the same. Or to put this another way, AMD was able to significantly increase their performance merely by turning up the fan speed and reducing the thermal throttling that was holding back the card's performance.


----------



## tsm106

Quote:


> Originally Posted by *Kriant*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> 780/titan level performance for 400 bucks lol. It's a good time for single 1080p gamers who are in the market for one card.
> 
> 
> 
> I was thinking more along the lines of : watercooled quadfire
Click to expand...

If for gaming hell yea. But if you wanna bench too, it will always be behind all things being equal.


----------



## Arizonian

Wasn't the performance supposed to change along with those drivers though?


----------



## Forceman

Quote:


> Originally Posted by *Arizonian*
> 
> Wasn't the performance supposed to change along with those drivers though?


Performance did change for the 290 - it is up to 10% faster with the new drivers. But it doesn't affect the 290X, it is only applicable to the 290. And it only matters in Quiet mode, in Uber mode (on the 290X) the difference goes away.

Edit: Strange. Just noticed that the [H] review says there is no Quiet and Uber mode on the 290, but the Guru3D review says there is, and their picture shows a switch on top. Wonder what's up with that.


----------



## Snyderman34

Just paid for 2 290s on NewEgg! Hate the tax (in TN), but love that it basically gets one-day shipped to me for nothing extra (ships from Memphis, so 3-day shipping turns into one).


----------



## SpewBoy

I dunno if this is a unique problem for me or whether it affects all people in my situation but I finally decided to install GPU-Z and fired up Valley and Heaven and both benchmarks have the same issue.

I'm running 2-way crossfire on a Sabertooth Z77 with a 2600K and my second (bottom) card is not clocking properly. In-game at stock clocks when it should be doing 1000MHz it is doing approximately 730-750MHz and the memory clock is rock solid on 150MHz. The top card is fine.

I'm thinking it probably has something to do with the fact that I'm still on PCIe 2.0, and the new CrossFire tech needs more bandwidth than the 2x8 PCIe 2.0 lanes available provide, causing the second card to downclock (and causing the memory clock to revert to its idle clock). If I exit Valley and look at the credits, both GPUs run at their full 1000MHz core clocks, but that goes back to idle as soon as I completely exit the benchmark.

I'll swap the cards around when they aren't too hot to touch to be sure it is a crossfire issue (seriously, I literally burned my hand while touching the top of the top card last night).

Oh and I'd just like to point out that this is only a temporary situation. When I get my blocks (or maybe sooner if I can't fix this issue), I'll be chucking these into my Z87 / 4770K rig. I just don't have the time to swap out an entire system in my Switch 810 just for testing purposes right now, only to swap it back into my 900D when the blocks arrive (exams!!!).



EDIT: Also not sure what to make of that GPU load (and the fact that card 2 doesn't even report a GPU load .___. is it meant to?).


----------



## Arizonian

Reviews are going up on OP review section - six so far.


----------



## xxmastermindxx

Did a little testing on a single card just to show the power consumption once you start raising voltage. The only thing connected to the PSU was the 290X. Stock clocks, 1.3V. I have yet to test this under water with much cooler temps. Anyway, 1.3V and a slightly higher clock speed is roughly 400W.
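As a rough illustration of why raising voltage balloons the draw: dynamic power scales roughly with f·V². A minimal sketch, where the stock voltage and stock board power are assumed placeholder values (not measured on this card), and leakage growth with temperature is ignored:

```python
# Dynamic power scales roughly with clock * voltage^2.
STOCK_V = 1.15  # assumed typical reference 290X load voltage
STOCK_W = 250   # assumed stock board power under load

def scaled_power(volts: float, clock_ratio: float = 1.0) -> float:
    """Estimate dynamic power at a new voltage (and optional clock bump)."""
    return STOCK_W * clock_ratio * (volts / STOCK_V) ** 2

est = scaled_power(1.3, clock_ratio=1.05)
print(round(est))  # 335 W from scaling alone
```

That ~335 W from the quadratic term alone, before extra leakage at higher temperatures, makes a measured ~400 W at 1.3V look plausible.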


----------



## VSG

I never thought the 290X would face the "Titan" dilemma this early. I'm still not regretting getting the 290X over the 290, but it would be sweet if the 290X got cut to $499 and we all got refunds


----------



## ZombieJon

Odd... AnandTech shows the 290 drawing more total system wattage than the 290X.

Tom's and Guru3D both have it at 200-250W (VGA only).


----------



## spitty13

I was so tempted to buy 2 290s, but I need to see if my 850 watt PSU can handle it first. Both are going to be overclocked and under water, so I'm leaning towards crossfire not working out.


----------



## esqueue

I wonder if I'm the only one who plans on watercooling and not overclocking. No matter what AMD says, I can't see this thing lasting at 95C. The first Xbox 360s treated 95C as fine, and while it didn't affect the chip at all, it messed with the PCB and lead-free solder, resulting in a nearly 100% eventual failure rate.


----------



## Taint3dBulge

Any word of new drivers?


----------



## Forceman

Quote:


> Originally Posted by *ZombieJon*
> 
> Odd...Anandtech shows the 290 drawing more total system W than the 290X.
> 
> Toms and Guru3d both have it at 200-250W (VGA only).


The 290X was throttling. The 290 also had a higher VID.


----------



## Arizonian

Quote:


> Originally Posted by *esqueue*
> 
> I wonder if I'm the only one who plans on watercooling and not overclocking. No matter what AMD says, I can't see this thing lasting at 95c. The first xbox 360's thought that 95c was fine and while it didn't affect the chip at all, it messed with the pcb and lead free solder resulting in a nearly 100% eventual failure rate.


Yet we still have Fermis in the wild today. I don't see why not. Still see people with GTX 480s in their rigs.

EDIT - 10 reviews now up on the OP review section.


----------



## Xtreme21

Just ordered my Sapphire 290 for next business day shipping


----------



## tsm106

Quote:


> Originally Posted by *ZombieJon*
> 
> Odd...Anandtech shows the 290 drawing more total system W than the 290X.
> 
> Toms and Guru3d both have it at 200-250W (VGA only).


It's cuz it's not throttling, probably not much at all.


----------



## fleetfeather

Quote:


> Originally Posted by *Arizonian*
> 
> Yet we still have Fermis in the wild today. I don't see why not. Still see people with GTX 480s in their rigs.
> 
> EDIT - 10 reviews now up on the OP review section.


There are still _some_ Fermis around, but whether the rest of them were casualties of the upgrade path or just good ol' casualties is not clear







That said, at a retail price of $550, AMD can't afford a high RMA rate on these cards. Time will tell, but I'm confident that AMD hasn't shot themselves in the foot regarding temps and failure rates


----------



## Jared Pace

Quote:


> Originally Posted by *xxmastermindxx*
> 
> Did a little testing on a single card just to show the power consumption once you start raising voltage. The only thing connected to the PSU was the 290X. Stock clocks, 1.3V. I have yet to test this under water with much cooler temps. Anyway, 1.3V and slightly higher clock speed is roughly 400W.


400W from the 8+6 pin connectors, 100W from the PCIe slot, at only 1.3V.


----------



## Arizonian

Looks like the 290 has the same temps - 12 watts per card lower power - but louder fan noise at full load than the 290X.


----------



## esqueue

Quote:


> Originally Posted by *fleetfeather*
> 
> There are still _some_ fermi's around, but whether or not the rest of them were casualties of the upgrading path or just good ol' casualties is not clear
> 
> 
> 
> 
> 
> 
> 
> That said, at a retail price of 550, AMD can't afford to have a high RMA quota on these cards. Time will tell, but I'm confident that AMD hasn't shot themselves in the foot regarding temps and failure rates


That's nice to hear. I am only familiar with the 360 and read of some other cards having the same issue, so that's where my paranoia stems from. I have no clue what Fermi is but I'm sure a Google search could remedy that.

Either way, I have an EK block that shipped out today from Florida. It should be here in California on Friday. Watercooling will not hurt, and I plan on doing the same for my PC while I'm at it.

http://valid.canardpc.com/ap8rs2

It seems that I'd have to upgrade my 620W Antec High Current Gamer PSU


----------



## Slomo4shO

Add me to the list, I bit and ordered two Asus R9 290s


----------



## Arizonian

I'm waiting for everyone to get their cards, with pics and OCN name, before adding to the roster. Last time I had to enter names and then go back and enter proof links. It was twice the work, and besides, wouldn't you want your link to be a pic of your card vs the invoice?

Hope it's ok guys. *_Arizonian hides in fear of being pelted_*

I promise I'll go in order of your invoices though.


----------



## Lord Xeb

Reporting in!


----------



## skupples

I have two 480s (Fermis); they are almost as old as they can get, still solid as a rock.


----------



## ImJJames

Where are people ordering the r9 290 from????


----------



## Forceman

Quote:


> Originally Posted by *ImJJames*
> 
> Where are people ordering the r9 290 from????


Newegg had Asus and Sapphire in stock earlier.


----------



## ImJJames

I am going crazy I can't find any in stock!


----------



## HoZy

Quote:


> Originally Posted by *ImJJames*
> 
> I am going crazy I can't find any in stock!


Calm down, there seem to be price drops happening. Wait a week; it's not "the most amazing card out there," so waiting a week will not kill you.

I am severely underwhelmed with my R9 290X compared to my previous OC'd 6970 Crossfire.

Cheers
Mat


----------



## Arm3nian

Quote:


> Originally Posted by *HoZy*
> 
> Calm down, There seems to be price drops happening. Wait a week, It's not "The most amazing card out there" So waiting a week will not kill you.
> 
> *I am severely underwhelmed with my R9-290x over my previous OC 6970 Crossfire*.
> 
> Cheers
> Mat


I don't... even... what?


----------



## ImJJames

Quote:


> Originally Posted by *HoZy*
> 
> Calm down, There seems to be price drops happening. Wait a week, It's not "The most amazing card out there" So waiting a week will not kill you.
> 
> *I am severely underwhelmed with my R9-290x over my previous OC 6970 Crossfire.
> *
> Cheers
> Mat


You never heard of price protection? lol, and of course going from CF 6970 to a 290X you didn't feel the difference. What did you expect? CF 6970 sits somewhere between a 280X and a 290X. Did you really expect to feel any difference outside synthetic benchmarks?


----------



## Lord Xeb

I knew I was getting a performance downgrade going from my SLI 670s, but when I get a second one...


----------



## Kriant

Ordered three Asus 290s. Almost pulled the trigger on four, but then...I can always add it later on.

Now to sell my quad 7970 set with Koolance blocks.

And thankfully I got a screenshot of the Newegg customer rep -> I can return those cards if I don't open them. So if the 780 Ti makes a "whoosh" sound and runs miles around everyone else, then be my guest.


----------



## ImJJames

Quote:


> Originally Posted by *Kriant*
> 
> Ordered three Asus 290s. Almost pulled the trigger on four, but then...I can always add it later on.
> 
> Now to sell my quad 7970 set with Koolance blocks.
> 
> And thankfully I got a screenshot of the Newegg customer rep -> I can return those cards if I don't open them. So if the 780 Ti makes a "whoosh" sound and runs miles around everyone else, then be my guest.


So you're buying them without opening them? I don't understand. And where did you order?! I have a feeling they will have to reduce the 780 Ti to around the $550 range, and bring the 780 down to the $430 range, for Nvidia to even be relevant.


----------



## jjjsong

Quote:


> Originally Posted by *ImJJames*
> 
> I am going crazy I can't find any in stock!


Are you talking about the 290? They're in stock at Newegg. The Newegg listing shows them "coming soon," but once you click into the items, they're in stock.


----------



## ImJJames

Quote:


> Originally Posted by *jjjsong*
> 
> Are you talking about 290? they're in stock at newegg. newegg listing shows them "coming soon" but once you click into the items, they're in stock.


Which brand is best out all these 290's?


----------



## hotrod717

Yep, the 290s are in stock at Newegg. jjjsong is right: click on them and they show as in stock, add to cart.

MSI and Asus have a 3-year warranty.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ImJJames*
> 
> Which brand is best out all these 290's?


I'm assuming they are all ref cards, so very little difference apart from customer support if you ever need it. I'd go with Sapphire imo, always a good choice


----------



## jjjsong

Quote:


> Originally Posted by *ImJJames*
> 
> Which brand is best out all these 290's?


I think right now they're all pretty much the same (all reference everything). I personally like Asus and Sapphire more but that's just me.


----------



## Kriant

Quote:


> Originally Posted by *ImJJames*
> 
> Which brand is best out all these 290's?


Asus -> 3-year warranty and they can honor it overseas etc....Sapphire's warranty is "so-so".

And yes, I bought them because they might just go "out of stock" just like the 290X, but I won't open them until the 780 Ti comes out. If it's that much better, then it will be reasonable to return them, grab the money and re-spend on the green camp.


----------



## ImJJames

Quote:


> Originally Posted by *Kriant*
> 
> Asus -> 3 years warranty and they can honor it overseas etc....Sapphire's warranty is "so-so".
> 
> And yes, I bought it because they might just go "Out of stock" just like 290x, but I won't open them until 780ti comes out. If it's that much better, than it will be reasonable to return, grab the money and re-spend on green camp.


I never had a problem returning items, even ones that had no problems and were opened.


----------



## Arm3nian

Quote:


> Originally Posted by *ImJJames*
> 
> You never heard of price protection? lol, and of course going from CF 6970 to 290x you didn't feel the difference what did you expect? CF 6970 is on par of between a 280x and 290x. Did you really expect to feel any difference besides synthetic benchmarks?


CF 6970 and a 290x aren't even in the same league.
Quote:


> Originally Posted by *Lord Xeb*
> 
> I knew I was getting a performance downgrade going from my SLI 670s, but when I get a second one...


How are you getting a performance downgrade? Are you comparing oc'd 670 sli vs a throttling 290x? 600 series is a bad overclocker and will never match a 290x under water if comparing max oc. If on stock cooling you probably should've waited for non reference designs or invested in a loop.


----------



## hotrod717

Careful, Newegg is replacement only.


----------



## Sgt Bilko

Sapphire R9 290:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814202043

Asus R9 290:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814121807

MSI R9 290:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814127758

They are available right now.


----------



## Kriant

Quote:


> Originally Posted by *hotrod717*
> 
> Careful, Newegg is replacement only.


I've spoken with the rep -> you can return for a refund if you don't open them. Just in case, I saved the convo as a screenshot and a printout.


----------



## ImJJames

Quote:


> Originally Posted by *hotrod717*
> 
> Careful, Newegg is replacement only.


maybe I should wait for Amazon then haha


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Sazz*
> 
> GOT IT! Thanks for the PM! saved me!!


Glad you got it!! Hopefully it's not the same one you returned, otherwise I may have dodged a bullet lol.
Quote:


> Originally Posted by *Arizonian*
> 
> Also some other good news I don't know if you guys heard but Mantle Graphics API Adopted by Industry Leading Game Developers Cloud Imperium, Eidos-Montreal and Oxide.
> 
> Let the Mantle train begin.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://phx.corporate-ir.net/mobile.view?c=74093&v=203&d=1&id=1871714


If I'm not mistaken, didn't Cloud Imperium have some partnership with AMD? Wasn't CI at a few high-profile AMD events?

Also, I'm so glad I cancelled the 290X. I think I'd much rather pick up the 290, as this way I will feel less crappy: spending $430 instead of the $640 I would have spent on the 290X. And I'm assuming (hoping!) that an OC'd 290 won't be too far behind the 290X for 1080p/1440p gaming? I'm going to wait for you awesome people to benchmark the living crap out of these cards before I drop my order in, hopefully by next Monday/Tuesday. And hopefully AB will be fully compatible with all 290 manufacturers so that I don't have to go just with Asus.


----------



## ImJJames

Quote:


> Originally Posted by *Arm3nian*
> 
> CF 6970 and a 290x aren't even in the same league.
> How are you getting a performance downgrade? Are you comparing oc'd 670 sli vs a throttling 290x? 600 series is a bad overclocker and will never match a 290x under water if comparing max oc. If on stock cooling you probably should've waited for non reference designs or invested in a loop.


Well, I said in between because, from what I remember, 2x 6970 surpasses the 7970, which is the R9 280X, and next in line after the 280X is the 290X. That's why I said in between, but obviously much closer to the 280X than the 290X lol


----------



## PillarOfAutumn

Pretty strange these numbers...: http://www.anandtech.com/show/7481/the-amd-radeon-r9-290-review/15

The 290 is drawing more power than the 290x??


----------



## fleetfeather

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Pretty strange these numbers...: http://www.anandtech.com/show/7481/the-amd-radeon-r9-290-review/15
> 
> The 290 is drawing more power than the 290x??


290X was throttling, thus lower power draw


----------



## ImJJames

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Pretty strange these numbers...: http://www.anandtech.com/show/7481/the-amd-radeon-r9-290-review/15
> 
> The 290 is drawing more power than the 290x??


The 290 has higher voltage at stock and boost compared to the 290X's boost.


----------



## SonDa5

The 290 looks like an incredible deal for $399.99.

My 290X is getting prepped for an EK block!











CL Ultra TIM on the GPU die.
Fujipoly Extreme thermal pads on the memory ICs and Fujipoly Ultra on the VRMs.


----------



## Arm3nian

Quote:


> Originally Posted by *ImJJames*
> 
> Well I said in between, because from what I remember 2x 6970 surpasses 7970 which is a r9 280x, and next in line to 280x is 290x. Thats why I said in between, but obviously much closer to 280x than 290x lol


Can't really say between the 280X and 290X. The 280X is a rebrand, the 290X is totally new. There is a giant gap between them.

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Pretty strange these numbers...: http://www.anandtech.com/show/7481/the-amd-radeon-r9-290-review/15
> 
> The 290 is drawing more power than the 290x??


290 is running at more volts at the same clock.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Arm3nian*
> 
> 290 is running at more volts at the same clock.


God damnit... I was hoping that the 290 would be less of a burden on my 750 watt PSU... either way, if it performs similar to the 290X at 1080p or 1440p, I'll be more than happy to jump on it at $200 less than the 290X. Can't wait to see water temps.


----------



## ImJJames

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> God damnit... I was hoping that the 290 would be less of a burden to my 750 watt PSU... either way, if it performs similar to the 290x at 1080p or 1440p, I'll be more than happy to jump on it for less than $200 vs the 290x. Can't wait to see water temps.


Your 750 watt Seasonic is more than enough for a single 290.


----------



## jerrolds

$439CDN for 290? Damn..regretting my 290X for $599 w/ BF4...


----------



## ImJJames

Quote:


> Originally Posted by *jerrolds*
> 
> $439CDN for 290? Damn..regretting my 290X for $599 w/ BF4...


This is why you never jump the gun on new releases


----------



## Arm3nian

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> God damnit... I was hoping that the 290 would be less of a burden to my 750 watt PSU... either way, if it performs similar to the 290x at 1080p or 1440p, I'll be more than happy to jump on it for less than $200 vs the 290x. Can't wait to see water temps.


Depends on your system and overclocks, really. 750 should be enough for an air-cooled build. The 290 is the best price-to-performance; it's a no-brainer.


----------



## nemm

Sorry to be lazy by not doing research before asking this, but time is against me this morning and I need a quick response.

With the release of the 290 non-X, the saving over the 290X for little loss in performance has got me thinking. Do I reject the 290X due today and order a 290? At a later date I planned to order a second 290X, but the saving from going 290 nearly allows a 3rd 290 - will the 3rd be worth it in the long run? If I were to run trifire it would be on an X79/99 platform.

Also, I was unable to find out whether the 290 is voltage locked or not - which is it? I know overclocking software is the limiting factor at present.

Much appreciative of any help.

Thank you


----------



## Forceman

Quote:


> Originally Posted by *nemm*
> 
> Sorry to be lazy by not doing research and asking this but time is against me this morning and in need a quick response.
> 
> With the release of the 290 nonX it seems the saving over 290x for little loss in performance has got me thinking. Do I reject the 290x due today and order a 290? At a later date I planned to order a second 290x but the saving over going 290 nearly allows a 3rd 290, will the 3rd be worth it in the long run? If I was to run trifire it would be on a x79/99 platform.
> 
> Also I was unable to find out whether the 290 is voltage locked or not, which is it and I know overclocking software is limiting factor at present?
> 
> Much appreciative of any help.
> 
> Thank you


No, they aren't voltage locked, and once the new AB comes out you should be able to overvolt them with it. And yes, you should reject the 290X and get the 290.

But that second part is just my opinion.


----------



## jerrolds

If I had the option I'd get the 290 over the X. All I'm hoping for now is that my card can hit 1200MHz+ once the Gelid comes in; then it'll be a bit more worth it.

Damn, I wish the 290X was a bit cheaper and/or faster, or the 290 more expensive and/or slower.









Price performance on the 290 is incredible.


----------



## c0ld

Crap this 290 is making me wanna return the 780 Lightning after it gets here....


----------



## scramz

Quote:


> Originally Posted by *c0ld*
> 
> Crap this 290 is making me wanna return the 780 Lightning after it gets here....


Plus Mantle is due soon; if it gives as much of a boost as they say, it could be worth it.


----------



## scramz

Quote:


> Originally Posted by *jerrolds*
> 
> If i had the option id get the 290 over the X - all im hoping for now is that my card can hit 1200mhz+ once the Gelid comes in, then itll make it a bit more worth it.
> 
> Damn i wish the 290X was a bit cheaper and/or faster, or the 290 more expensive and/or slower
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Price performance on the 290 is incredible.


Wait for the benchmarks to be released for the 290X on the new drivers. Gibbo over at OcUK said that when the 290X was tested with the new drivers, its performance rose again in line with the 290's boost. Either way, both cards are great value for money.


----------



## Paul17041993

Quote:


> Originally Posted by *c0ld*
> 
> Crap this 290 is making me wanna return the 780 Lightning after it gets here....



(joke)

Though what was the reason for the 780 anyway? Not sure how different they really are from each other...


----------



## HeadlessKnight

http://www.techpowerup.com/reviews/Powercolor/R9_290X_OC/29.html

http://www.techpowerup.com/reviews/AMD/R9_290/29.html

290 OC vs 290X OC: looks like the 290X has about a ~4.5-5% clock-for-clock advantage. The max-OC gap should be slightly larger in the 290X's favor, as the best chips are most of the time binned as 290Xs.


----------



## c0ld

Quote:


> Originally Posted by *scramz*
> 
> Plus mantle due soon, if it makes as much of a boost as they say then it could be worth it.


Sigh... The temps put me off though.
Quote:


> Originally Posted by *Paul17041993*
> 
> 
> (joke)
> 
> though what was the reason for the 780 anyway? not sure really how much different they are between each other...


Well, seeing as I didn't snag a 290X when they came out, the Lightning dropped in price and became a more attractive choice. I wasn't expecting the 290 to be priced this low.

Don't know if I wanna refund the $540 for the Lightning and get a 290. Save myself $140 for similar performance.


----------



## nemm

Thank you for the replies. I managed to order 2x 290s for £160 more than I paid for 1x 290X, so I am sure that even after the new performance figures on the new drivers I won't regret my decision. If I do, then so be it; life goes on.


----------



## Ukkooh

Am I the only one in here who is still going to buy a 290x?


----------



## beejay

Hmmm, checking GPU-z, after 10 minutes of heaven, my VRM temp 1 goes to 88c, and I think it would go further. After stopping heaven it goes down rapidly to 43c. My VRM temp 2 never goes over 68, and stays at 64c when idle. Is 88c VRM temp normal for this thing?

Here are the details:
BIOS: PT3
Cooler: Accelero Hybrid
Core: 1100
Mem: 6000
GPU Voltage: 1.3V
Power target: 150%
Fan speed: 100% @ 70c

GPU:
Idle: 45c
Load: 65c
VRM1:
Idle: 42c
Load: 88c
VRM2:
Idle: 62c
Load: 67c

And which part of the board is vrm1 and vrm 2? There are 3 regulator modules on the top left hand side of the card, and of course the row of VRMs at the right hand side.

Any thoughts? Is the Accelero Hybrid just weak on VRM cooling? Are these still safe operating numbers?


----------



## Forceman

Quote:


> Originally Posted by *Ukkooh*
> 
> Am I the only one in here who is still going to buy a 290x?


Yes.

Just kidding, but the $150 price difference puts the 290X in a bad spot. The normal $100 would be okay, but that extra $50 makes a difference.


----------



## Paul17041993

Quote:


> Originally Posted by *c0ld*
> 
> Well seeing I didn't snag a 290X when they came out the lightning dropped price and it became a more attractive choice, I wasn't expecting for the 290 to be this low.
> 
> Dont know if I wanna refund the $540 for the lightning and get a 290. Save myself a $140, for similar performance.


Probably might as well; the extra $140 could mean water or a custom cooler...
Quote:


> Originally Posted by *Ukkooh*
> 
> Am I the only one in here who is still going to buy a 290x?


no


----------



## c0ld

Quote:


> Originally Posted by *Paul17041993*
> 
> probably might as well, and the extra 140 could mean water or a custom cooler...


Does the Accelero Xtreme fit the 290?


----------



## Ukkooh

Quote:


> Originally Posted by *c0ld*
> 
> Does the Accelero Xtreme fit the 290?


If it fits the 290x it fits the 290 too.


----------



## Hattifnatten

Oh my, I'm so tempted to buy a 290 right now; a store here has hundreds (!) of them in stock, and the price is just a tiny bit above the 770. Even I could buy it. The only thing stopping me is the fact that there is no X behind it.


----------



## scramz

Quote:


> Originally Posted by *c0ld*
> 
> Sigh... The temps put me off though.
> Well seeing I didn't snag a 290X when they came out the lightning dropped price and it became a more attractive choice, I wasn't expecting for the 290 to be this low.
> 
> Dont know if I wanna refund the $540 for the lightning and get a 290. Save myself a $140, for similar performance.


Unless you wait for the non ref coolers to be released. I put my 290x under water.


----------



## Dominican

290 R9

299

Looks like it matches the GTX 780?


----------



## TheSoldiet

Hmmm, i may just return my 290X and get a 290 + a Xtreme III.

Not sure


----------



## Dominican

Quote:


> Originally Posted by *TheSoldiet*
> 
> Hmmm, i may just return my 290X and get a 290 + a Xtreme III.
> 
> Not sure


Did you already open it? The R9 290X is sold out everywhere.


----------



## lordzed83

beejay
Normal temps, mate.

The highest I got VRM1 on my 290X was 119C, when I had bad contact on the waterblock.




The card started stuttering but did not blow.
They go up to 97C under water in a 15-minute FurMark test and looked stable.

VRM2 are those 3 small ones in the top left by the memory chips. They stay cooler; the max I've seen was like 61C.

Anyhow, 99% of VRMs are rated for a 105C maximum operating temperature.


----------



## SLADEizGOD

Quote:


> Originally Posted by *Arizonian*
> 
> Only gripe I have too. I only have myself to blame when I went for conservative high quality 850 watt on my Ivy system on release day. Love to go over kill crossfire but don't think it'd be wise. Wonder how much less of a power draw non-ref 290 Xfire might be? So contemplating my next move is signal 290X non-ref and sell off the reference card.


Now I'm a little nervous. Will my XFX Black Edition 1050W Hybrid do the job with two 290Xs?


----------



## SLADEizGOD

Quote:


> Originally Posted by *Ukkooh*
> 
> Am I the only one in here who is still going to buy a 290x?


Nope. I'm looking into getting 2 with Waterblocks


----------



## beejay

Quote:


> Originally Posted by *lordzed83*
> 
> beejay
> Normal temps mate..
> 
> Highest i got VRM1 on my 290x to was 119c when i had bad contact on waterblock
> 
> 
> 
> 
> 
> 
> 
> Card started stuttering but did not blow.
> They go up to 97c under water on furmark 15 minute test and looked stable
> 
> VRM2 are those 3 small ones in top left by memory chip. They stay cooler max i seen was like 61c
> 
> Anyhow 99% of vrms are rated 105c maximum operating temperature.


I found the problem. The heat sink on the 3 regulator modules was not in line with the airflow. I was so sleepy when I installed this.

Anyway, because of the surrounding components and the small space, I can only fit one heat sink, and I used a thermal pad from the stock ones. I now level out at 88C in Heaven on VRM1.

I need to find big sinks with small footprints and glue them to those components.

And it seems like VRM1 and VRM2 may be swapped depending on the card? VRM1 on mine are the small ones. Also, HWiNFO displays two GPU0 entries, with VRM 1 and 2 swapped between the first and second GPU0.


----------



## Newbie2009

Quote:


> Originally Posted by *SLADEizGOD*
> 
> Now im a little nervous. Will my XFX black edition 1050 Watt Hybrid do the job with 2 290x's?


Depends on how mad you go with overclocking. You'll be pushing the limits with heavy overclocks, I'd say. Personally, if I were you, I'd risk it.


----------



## Clockster

http://www.techpowerup.com/gpuz/kskf7/



Will take pics tonight and add the other card in.


----------



## mam72

How is the 290x (or 290 if you have it) for overclocking with custom cooling like the arctic cooling X3, the CLC mod or a water block?

Does the card have coil whine? I know some AMD cards are known for that, I had one in the past doing it.


----------



## kantxcape

Jesus Christ, so I can't CrossFire two 290Xs with a Platinum 1000W PSU? System is watercooled.


----------



## esqueue

Quote:


> Originally Posted by *mam72*
> 
> How is the 290x (or 290 if you have it) for overclocking with custom cooling like the arctic cooling X3, the CLC mod or a water block?
> 
> Does the card have coil whine? I know some AMD cards are known for that, I had one in the past doing it.


I'm not too sure what coil whine is, but if it sounds like a high-pitched squeal that changes tone, then yes. I noticed that sound from my 290X when playing Batman, but I thought it was coming from my speakers. I heard it once when I muted the game.


----------



## Newbie2009

Quote:


> Originally Posted by *kantxcape*
> 
> Jesus christ, so i cant crossfire 2 290x with a platimum 1000w PSU? System is watercooled.


You can. I would. You will be maxing your PSU.


----------



## mam72

Quote:


> Originally Posted by *kantxcape*
> 
> Jesus christ, so i cant crossfire 2 290x with a platimum 1000w PSU? System is watercooled.


Well, one 290X consumes 245W (TechPowerUp).

So two will be 490W.

I am assuming you will be OCing that 3930K, so say 300W for the rest of the system.

That is 790W, so let's say 800W. In other words, you are fine.
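If anyone wants to redo this napkin math with their own parts, it's just addition. The wattages below are the rough figures from this post, not measurements, and they assume stock GPU clocks (overvolted cards draw more):

```python
# Rough PSU sizing estimate. Figures are this post's assumptions:
# 245 W per stock 290X (TechPowerUp) and ~300 W for an OC'd 3930K
# plus the rest of the system.
gpu_draw_w = 245
cpu_and_rest_w = 300
psu_rating_w = 1000  # the Platinum 1000 W unit in question

total_w = 2 * gpu_draw_w + cpu_and_rest_w
headroom_w = psu_rating_w - total_w

print(total_w, headroom_w)  # 790 210
```

Plug in your own card count and CPU estimate; if headroom_w gets close to zero once you use overclocked numbers, you're cutting it fine.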


----------



## alancsalt

Quote:


> well one 290x consumes 245w


At stock clocks?


----------



## Ukkooh

Quote:


> Originally Posted by *mam72*
> 
> well one 290x consumes 245w (TechPowerUp)
> 
> So two will be 490
> 
> I am assuming you will be OC that 3930K so say 300w.
> 
> That is 790w so lets say 800w in other words you are fine


I'm pretty sure they didn't overvolt the card for that chart. Better keep that in mind.


----------



## alancsalt

Yeah, it's quite possible that if you overclock everything across the board you could reach your power limit.


----------



## Hattifnatten

Tom's has a very nice "new" way to measure.
Quote:


> We're using a current clamp to measure power consumption at the external PCIe power cable and, using a special PCB, directly at the PCIe slot. These measurements are recorded in parallel and in real time, added up for each second, and logged using multi-channel monitoring along with the respective voltages.


Quote:


> The curve isn't just representative; it's also exact. Measuring system power introduces bias, since a number of factors can affect consumption other than the graphics card. A faster GPU might cause the CPU's power consumption to go up as well, for example, since a limiting factor holding it back is gone.


Tom's


----------



## kantxcape

Quote:


> Originally Posted by *mam72*
> 
> well one 290x consumes 245w (TechPowerUp)
> 
> So two will be 490
> 
> I am assuming you will be OC that 3930K so say 300w.
> 
> That is 790w so lets say 800w in other words you are fine


Yeah, I OC my 3930K to 4.7GHz. Hope they ship my first 290X today.


----------



## SLADEizGOD

I'll just take the plunge and get them when I can. Need to upgrade my cards, plus I can't wait to see the difference when Mantle is released. But does anyone have a recommendation on a sexy water block for the 290X?


----------



## PillarOfAutumn

Where exactly does TrueAudio play into this? Does the fact that someone has a sound card affect the TrueAudio?


----------



## lordzed83

beejay: Interesting... Maybe I am wrong and VRM1 is those 3 small ones? Can anyone clarify?
I am checking temperatures using the new GPU-Z.

All I know is that the backplate gets warm where it contacts the VRMs.


----------



## anubis1127

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Where exactly does TrueAudio play into this? Does the fact that someone has a sound card affect the TrueAudio?


It shouldn't, AFAIK. TrueAudio isn't a sound card; it's not going to show up in your Device Manager as a sound device. It's just a way for game devs to offload sound processing to the GPU rather than having the CPU do it.


----------



## dade_kash_xD

Add me to the club, gents! Just ordered 2x Sapphire 290s. RMA'd my crappy XFX 280Xs!


----------



## Arizonian

Quote:


> Originally Posted by *dade_kash_xD*
> 
> Add me to club gents! Just ordered 2x sapphire 290's. Rma my crappy xfx 280xs!


When I see a proper submission. I'm asking for a simple pic of the GPU with your OCN name, or a GPU-Z screenshot with your OCN name. In your post, please specify the brand as well. If you go water, show the blocks in the pic. If you add a water block later, a simple pic of the block (no OCN name needed) will do and I'll update.









Congrats to those who scored theirs tonight. Looking forward to it.


----------



## Clockster

Stock Run with sig rig and single Gigabyte R9 290.

http://www.3dmark.com/3dm11/7433981


----------



## skupples

Quote:


> Originally Posted by *ImJJames*
> 
> So you're buying it without opening it? I don't understand. And where did you order! I have a feeling they will have to reduce that 780ti to around $550 range, and bring the 780 down to $430 dollar range for Nvidia to even be relevant.


The lowest I see the 15-SMX GK110 (780 Ti) going is $650.


----------



## battleaxe

Quote:


> Originally Posted by *Clockster*
> 
> 
> 
> Stock Run with sig rig and single Gigabyte R9 290.


Can't read the graphics score...?


----------



## Sammyboy83

The graphics score is 13902


----------



## Koniakki

Quote:


> Originally Posted by *battleaxe*
> 
> Cant' read the graphics score...?


Click on the image. If you still can't read it, click the Magnifier Icon with the word Original next to it, below the image on the right.


----------



## HeadlessKnight

Quote:


> Originally Posted by *battleaxe*
> 
> Cant' read the graphics score...?


13,902


----------



## battleaxe

Quote:


> Originally Posted by *Koniakki*
> 
> Click on the image. If you still can't read it, click the Magnifier Icon with the word Original next to it, below the image on the right.


Thanks for that. I'll take the 'tard of the week' award now. +1 to you


----------



## ZombieJon

Quote:


> Originally Posted by *Hattifnatten*
> 
> Tom's has a very nice "new" way to measure.
> 
> Tom's


Definitely very interesting. If those results are valid, I should be able to run a CF 290X at stock clock, or a hybrid 290X+290.


----------



## moa.

Just ordered two Sapphire 290's from Overclockers.co.uk... with waterblocks!







So excited! Do you know if they come with BF4?


----------



## jomama22

I'll just leave this here:
http://www.3dmark.com/fs/1095810

First of 6 underwater, fire strike graphics score: 14101

290x
1315/1500


----------



## VSG

No BF4 with the 290, I am afraid.


----------



## Clockster

Quote:


> Originally Posted by *moa.*
> 
> Just ordered two Sapphire 290's from Overclockers.co.uk... with waterblocks!
> 
> 
> 
> 
> 
> 
> 
> So excited! Do you know if they come with BF4?


Only got BF4 because my supplier decided to bundle it with the cards, seeing as we won't have 290X cards till the last week of November.


----------



## flopper

Quote:


> Originally Posted by *jomama22*
> 
> I'll just leave this here:
> http://www.3dmark.com/fs/1095810
> 
> First of 6 underwater, fire strike graphics score: 14101
> 
> 290x
> 1315/1500


nice.
24/7?


----------



## moa.

Quote:


> Originally Posted by *Clockster*
> 
> Only got BF4 because my supplier decided to bundle with the cards seeing as we won't have 290X cards till the last week of November.


So normally BF4 is only for 290x?


----------



## Kriant

Guys, any point in getting a backplate for the 290?
I mean, will it actually help the VRMs at all?


----------



## Slomo4shO

Quote:


> Originally Posted by *c0ld*
> 
> Does the Accelero Xtreme fit the 290?


Yes it does.


----------



## sugarhell

Quote:


> Originally Posted by *jomama22*
> 
> I'll just leave this here:
> http://www.3dmark.com/fs/1095810
> 
> First of 6 underwater, fire strike graphics score: 14101
> 
> 290x
> 1315/1500


Nice. Can't you push it more?


----------



## Moustache

Quote:


> Originally Posted by *jomama22*
> 
> I'll just leave this here:
> http://www.3dmark.com/fs/1095810
> 
> First of 6 underwater, fire strike graphics score: 14101
> 
> 290x
> 1315/1500


Is that the highest clock you can get? I hope you can overclock it a bit higher, and the memory clock as well.
Then just run Fire Strike Extreme and beat Alatar's score. Would love to see his reaction and probably hear another excuse from him.


----------



## Kriant

Quote:


> Originally Posted by *tsm106*
> 
> If for gaming hell yea. But if you wanna bench too, it will always be behind all things being equal.


True, but considering that OC is a lottery and the cards are essentially almost identical, I'm fine with getting those and pushing to 1100-1150 under water.

For competitive OCing I need more experience and funds lol


----------



## ExGreyFox

Hey gents. I am considering going dual 290x crossfire. I have just sold my dual 7970 ghz setup. Will my AX1200i be able to handle two 290x at stock with an overclocked 3770k at 4.6? Seeing the TDP am I correct to say that dual 290x at stock will burn about 600-700 watts? I'm doing 3240 x 1920 3x monitor portrait.


----------



## Arizonian

Quote:


> Originally Posted by *Clockster*
> 
> 
> 
> Stock Run with sig rig and single Gigabyte R9 290.
> 
> http://www.3dmark.com/3dm11/7433981


Congrats added









How did you score your 290 so quickly?


----------



## Connolly

Just been reading some reviews and came across this

http://www.tomshardware.com/reviews/radeon-r9-290-review-benchmark,3659-6.html

Has anyone else received a duff card, as this review seems to suggest is possible? If you get a good one, it seems the 290 is practically as good as the 290X, as long as you're just running at 1920x1080.


----------



## Slomo4shO

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How did you score your 290 so quickly?


He ordered a 290X and was shipped a 290









Clockster, did you ever get the difference back from the retailer?


----------



## Arizonian

Quote:


> Originally Posted by *Connolly*
> 
> Just been reading some reviews and came across this
> 
> http://www.tomshardware.com/reviews/radeon-r9-290-review-benchmark,3659-6.html
> 
> Has anyone else received a duff card as this review seems to suggest is possible? If you get a good one, it seems that the 290 is practically as good as the 290x, as long as you're just running in 1920x1080


It's not really a performance issue. They have the 290's fan capped at 47% and the 290X's at 40%, so the 290 is being cooled better than the 290X. Manual fan profiles will take care of that. Spec-wise, OC'd vs OC'd, the difference is 10% and will remain that way.

Price-wise, for a $150 difference, a 10% performance gap isn't so bad.


----------



## Clockster

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How did you score your 290 so quickly?


I've actually got 10 of them lying at my office. Gigabyte accidentally sent us 290 cards instead of 290X cards.
So still no 290X cards here; I just grabbed the 290s today because I don't wanna wait anymore.









@Slomo4shO: I paid about $560 each for these, whereas the 290X would have been about $800.
We pay so much for hardware in this country, it's crazy lol. Also, to make up for the mistake they bundled BF4 with my 290s.


----------



## MrTOOSHORT

The R9 290 is the card to get at its $400 price point. The other guy's GTX 770 needs to be at $299 now.

Nice score Clockster!


----------



## jomama22

Quote:


> Originally Posted by *Moustache*
> 
> is that the highest clock you can get? I hope you that you can overclock it a bit higher and also the memory clock as well.
> and then, just run firestrike extreme and beat alatar's score. would love to see his reaction and probably hearing another excuse from him.


Quote:


> Originally Posted by *sugarhell*
> 
> Nice. You cant push more?


This is my highest-ASIC card of the 6 (77.2%), so I wasn't expecting much. This was at 1.4V actual (1.37V on the PT3 BIOS); 1300 is attained at a much lower 1.32V.

The memory starts giving me fewer points over 1500MHz. Though I won't see artifacts, ECC is kicking in hard. I suspect many people will experience the same thing.

I'll have all 6 tested by tonight. I believe I will be able to get over 1350 on at least one.

Just like Tahiti, after ~1.32-1.35V you need gobs of volts to get higher clocks.


----------



## Connolly

Ok, thanks. So which of the current 290x's don't end up with a voided warranty if you mod the cooling system?


----------



## Connolly

Quote:


> Originally Posted by *Arizonian*
> 
> It's not really performance wise. They have the 290 fan on 47% and the 290X on 40% cap. So the 290 is being cooled better than the 290X. Manual fans will take care of that. Spec wise when OC'd vs OC'd the difference is 10% and will reamin that way.
> 
> Price wise for $150 difference in price - 10% performance difference isn't so bad.


There is a massive difference between the 290X press and retail versions; that seems a little strange to me. Also, which of the current crop of 290X cards don't have their warranty voided if you add a different cooling system?


----------



## tsm106

Quote:


> Originally Posted by *mam72*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kantxcape*
> 
> Jesus christ, so i cant crossfire 2 290x with a platimum 1000w PSU? System is watercooled.
> 
> 
> 
> well one 290x consumes 245w (TechPowerUp)
> 
> So two will be 490
> 
> I am assuming you will be OC that 3930K so say 300w.
> 
> That is 790w so lets say 800w in other words you are fine

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> Two cards on psu with my cpu and nothing else connected, ie. no drives, no loop, nothing else. Running 3dm11 extreme at stock with +50 on air, so it's throttling a bit as expected. 903w x .89 at that wattage = *803w.* This number should blow up with blocks and serious overclocking. Ya need to remember that on the stock bios PT is trying to keep the tdp of the cards very very low.


These will consume a lot more when you start to really overclock.

Quote:


> Originally Posted by *Kriant*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> If for gaming hell yea. But if you wanna bench too, it will always be behind all things being equal.
> 
> 
> 
> True, but considering that OC is a lottery, and the cards are essentially almost identical, I'm fine with getting those, and pushing to 1100-1150 under water.
> 
> For copetitive OCing I need more experience and funds lol

But overall you will have the fastest setup for the least money spent, and with AMD scaling it will walk away from a GK110 setup. Four of these at a modest 1150/1500 will kick some serious tail. Can you get a high overclock on your CPU?


----------



## RAFFY

Finally started playing BF4 with these 290x's in crossfire and hot damn is it awesome!


----------



## Kriant

Quote:


> Originally Posted by *tsm106*
> 
> These will consume a lot more when you start to really overclock.
> But overall you will have the fastest setup for the least money spent. And with AMD scaling, will walk away from gk110 setup. Four of these at a modest 1150/1500 will kick some serious tail. Can you get a high overclock on your cpu?


Ordering EK blocks now. For 3 cards so far ( will wait and see how it scales past 2 cards before dropping in a 4th one).

As for CPU - ehh, it's uncertain, I'm having some strange problems with OC right now ( conveniently appeared with latest catalyst drivers for some reason).

I'll aim at 4.6-4.7


----------



## moldyviolinist

Hey, what temperature are most people's 290xs idling at?
Mine seems to be excessively hot. It's 17.5 C in my room, and the GPU stays around 52 C at 20% fan speed. My old 7970 would generally stay below 45 at idle. I suppose it doesn't really matter, but it's a little concerning.

Thanks. I'll be posting proof soon to join the club


----------



## ExGreyFox

Quote:


> Originally Posted by *RAFFY*
> 
> Finally started playing BF4 with these 290x's in crossfire and hot damn is it awesome!


Hey RAFFY, how are those dual 290Xs running with your PSU? I'm considering buying dual 290Xs myself but am unsure if my AX1200i will be enough with a CPU OC.


----------



## maarten12100

Quote:


> Originally Posted by *jomama22*
> 
> I'll just leave this here:
> http://www.3dmark.com/fs/1095810
> 
> First of 6 underwater, fire strike graphics score: 14101
> 
> 290x
> 1315/1500


You just burned Alatar's score


----------



## Newbie2009

Quote:


> Originally Posted by *maarten12100*
> 
> You just burned Alatar's score


No way will the 290X be faster than a Titan; I was there when Alatar said it, so it must be true. If it is, it's because you are using a Titan with a 290X cover.


----------



## $ilent

tsm can you link me to a decent but cheap thermometer I can use to measure my vrm temps?


----------



## psyside

Don't get me wrong, but benches without video proof can't be considered valid, on either side.


----------



## Newbie2009

Quote:


> Originally Posted by *tsm106*
> 
> nvalatar's scores aren't even valid.


They don't have to be, the man is a legend.


----------



## tsm106

Quote:


> Originally Posted by *$ilent*
> 
> tsm can you link me to a decent but cheap thermometer I can use to measure my vrm temps?


http://www.newegg.com/Product/Product.aspx?Item=N82E16896268001


----------



## sugarhell

LOL. I will throw my rad inside my refrigerator and do some 1450 290X tests. Then I will claim that the 290X can do 1450 on average.


----------



## psyside

Quote:


> Originally Posted by *Newbie2009*
> 
> They don't have to be, the man is a legend.


A legend who fakes scores cannot be a legend, no matter what side he's on.


----------



## raghu78

Quote:


> Originally Posted by *Newbie2009*
> 
> No way will the 290x be faster than a titan, I was there when Alatar said it so it must be true. If it is it's because you are using a titan with a 290x cover
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Don't forget gsync, u don't have gsync man. looser. G sync improves Alatar score by 50% as a default, so you will always be behind, in theory.
> Yeah but Alatar has had his card for 8 months and is really happy with it. Alatar will just get a 780ti and teach everyone.


well said.







He is approaching legend status on OCN. In fact, he is already one, I guess.


----------



## fleetfeather

Quote:


> Originally Posted by *ExGreyFox*
> 
> Hey RAFFY how are those dual 290x running with your PSU? Im considering buying dual 290x myself but am unsure if my Ax1200i will be enough with a CPU oc.


Ease up turbo, a 1200w PSU will be plenty for CF 290X's









Maximum possible power draw of a 6+8-pin graphics card (such as the 290X) is 300W by spec (75W from the slot, 75W from the 6-pin, 150W from the 8-pin). Therefore, if you had two cards, the maximum they could possibly draw is 600W (half your PSU's capacity). Now, as opposed to GPU power draw, a CPU's power draw is substantially smaller. Your overclocked CPU is probably pulling 150-180W at 100% load, maybe spiking up to 220W in crazy situations (which is fine).

When you add all this up, you're still only using roughly 66% of your PSU's rated capacity, and most high-end PSU's (like the AX1200i) can sustain loads much higher than the 1200w rating that the manufacturer suggests.

All in all, you'll be more than fine. You're barely approaching your PSU's efficiency sweet spot.
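For the curious, the spec-level worst case works out like this. The 300W ceiling per 6+8-pin card is the PCIe spec total (75W slot + 75W 6-pin + 150W 8-pin); the 220W CPU spike is just my pessimistic guess from above, not a measurement:

```python
# Worst-case draw for CF 290X on an AX1200i, by connector spec.
slot_w, six_pin_w, eight_pin_w = 75, 75, 150
per_card_max_w = slot_w + six_pin_w + eight_pin_w  # 300 W ceiling per card

cpu_peak_w = 220       # pessimistic OC'd-CPU spike (a guess)
psu_rating_w = 1200    # AX1200i rated capacity

load_w = 2 * per_card_max_w + cpu_peak_w
load_fraction = load_w / psu_rating_w

print(load_w, round(load_fraction, 2))  # 820 0.68
```

Even at this worst case you're under 70% of the PSU's rating, which is right around the typical efficiency sweet spot. Heavy overvolting can push real draw past the spec ceiling, so treat this as a floor on your margin, not a guarantee.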


----------



## skupples

Quote:


> Originally Posted by *Kriant*
> 
> Guys, any points in getting backplates for 290 ?
> I mean, will it actually help VRM at all?


It's a slightly debated topic. They are considered to be mostly aesthetic by a lot of pros; some even say they can affect things. Though, a backplate serves as great protection for the back of the card.
Quote:


> Originally Posted by *Newbie2009*
> 
> They don't have to be, the man is a legend.


Hater's goin' hate...


----------



## DampMonkey

Quote:


> Originally Posted by *fleetfeather*
> 
> Ease up turbo, a 1200w PSU will be plenty for CF 290X's
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Maximum possible power draw of a 6+8pin graphics card (such as the 290X) is 300w. Therefore, if you had two cards, the maximum they could possibly draw is 600w (half your PSU's capacity). Now, as opposed to GPU power draw, a CPU's power draw is substantially smaller.


It should be noted that you are talking about stock speeds. The 290X easily blows past 300W when you start adding voltage for overclocking.


----------



## jomama22

I wasn't going for overall score on these runs, merely just maxing out each card to find the best ones. Once that is done, I will throw the CPU to 5.2(it was 4.7 in the link before).

I can run Valley no problem @ 1.32V, 1315/1500. BF4 is stable at the same settings as Valley. Fire Strike seems to kick these in the balls lol. Do remember, we don't have memory voltage adjustment quite yet either.


----------



## tsm106

Quote:


> Originally Posted by *Newbie2009*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> nvalatar's scores aren't even valid.
> 
> 
> 
> They don't have to be, the man is a legend.

Chuck Norris only fears one bencher.

Quote:


> Originally Posted by *psyside*
> 
> Don't get me wrong but valid benches without video proof cannot be done, on either side.


They should at least not have asterisks or multiple asterisks next to them.


----------



## sugarhell

Jomama, can you do a Valley run and an FS Extreme?







Pls?


----------



## tsm106

Quote:


> Originally Posted by *sugarhell*
> 
> Jomama you can do a valley run and a fs extreme?
> 
> 
> 
> 
> 
> 
> 
> Pls?


Want to see that fse! Valley, you must beat 79fps!


----------



## raghu78

Quote:


> Originally Posted by *jomama22*
> 
> I wasn't going for overall score on these runs, merely just maxing out each card to find the best ones. Once that is done, I will throw the CPU to 5.2(it was 4.7 in the link before).
> 
> I can run valley no problem @ 1.32v-1315/1500. Bf4 is stable @ the same as valley. Fire strike seems to kicks these in the balls lol. Do remember, we don't have memory voltage adjustment quite yet either.


I am looking forward to you taking the 3DMark Fire Strike Extreme no. 1 spot from Alatar.


----------



## fleetfeather

Quote:


> Originally Posted by *DampMonkey*
> 
> It should be noted that you are talking about stock speeds. The 290x easily blows passed 300W when you start adding voltage for overclocking


Whoa, do I have my facts crossed somewhere? How can a GPU actually pull more than 300W through those 8+6-pin power cables if 300W is the limit they can supply? I wouldn't have thought any substantial power was being fed through the mobo?

E: maybe you're referring to TDP...?


----------



## sugarhell

pcie can give up to 300 watt.


----------



## Newbie2009

Quote:


> Originally Posted by *skupples*
> 
> It's a slightly debated topic. They are considered to be mostly aesthetic by allot of pro's, some even say it can Affect thing's. Though, it serves as great protection for the back of the unit.
> *Hater's goin' hate*...


How am I a hater? Just a bit of fun.

Quote:


> Originally Posted by *tsm106*
> 
> Want to see that fse! Valley, you must beat 79fps!


What's the deal with Valley anyway? The 290X seems to not do so well in Valley.


----------



## fleetfeather

Quote:


> Originally Posted by *sugarhell*
> 
> pcie can give up to 300 watt.


as in, through the ATX cable -> Mobo traces -> PCIe slot? O__O


----------



## jerrolds

What kinds of numbers do you think a 290X can hit using aftermarket coolers like the Accelero Hybrid/Xtreme 3 or Gelid Icy 2? Assuming it drops load temps by 20C? My 290X stock everything can do 1115/1400 Elpida memory and 75% ASIC

...is 1200+ attainable with better air cooling and more volts?

I won't get my Icy 2 till Friday at the earliest, so I'm just kinda wondering what I should expect.


----------



## ImJJames

Quote:


> Originally Posted by *jerrolds*
> 
> What kinds of numbers do you think a 290X can hit using aftermarket coolers like the Accelero Hybrid/Xtreme 3 or Gelid Icy 2? Assuming it drops load temps by 20C? My 290X stock everything can do 1115/1400 ...is 1200+ attainable with better air cooling and more volts?
> 
> I wont get my Icy 2 till Friday at the earliest so im just kinda wondering what i should expect.


http://www.tomshardware.com/reviews/radeon-r9-290-review-benchmark,3659-19.html

Here is xtreme 3 with r9 290.


----------



## tsm106

Quote:


> Originally Posted by *jerrolds*
> 
> What kinds of numbers do you think a 290X can hit using aftermarket coolers like the Accelero Hybrid/Xtreme 3 or Gelid Icy 2? Assuming it drops load temps by 20C? My 290X stock everything can do 1115/1400 ...is 1200+ attainable with better air cooling and more volts?
> 
> I wont get my Icy 2 till Friday at the earliest so im just kinda wondering what i should expect.


The problem with custom air cooling is that the VRMs/memory will generally be in a worse cooling situation than stock. This is ironic, because the core cooling is so much better than stock with a big custom cooler. It's a bit of a catch-22.


----------



## fleetfeather

Quote:


> Originally Posted by *jerrolds*
> 
> What kinds of numbers do you think a 290X can hit using aftermarket coolers like the Accelero Hybrid/Xtreme 3 or Gelid Icy 2? Assuming it drops load temps by 20C? My 290X stock everything can do 1115/1400 ...is 1200+ attainable with better air cooling and more volts?
> 
> I wont get my Icy 2 till Friday at the earliest so im just kinda wondering what i should expect.


The Hybrid/X3 solutions don't seem to be dropping VRM temps as much as the GPU core itself. I'd estimate a 50MHz boost to OC potential, but certainly nothing like what a full-cover waterblock would allow.


----------



## utnorris

So what's the consensus so far? Get rid of the 290x (refuse delivery) and get a 290? It looks like you could get 3 x 290's for the price of 2 x 290x's.


----------



## RAFFY

Quote:


> Originally Posted by *ExGreyFox*
> 
> Hey RAFFY how are those dual 290x running with your PSU? Im considering buying dual 290x myself but am unsure if my Ax1200i will be enough with a CPU oc.


Honest to god, my PSU's fan hasn't even turned on yet. I was just playing COD: Ghosts with no PSU fan, before that BF4 with no PSU fan, and before that I had BF4 open and minimized while playing COD: Ghosts and still no PSU fan. With that being said, my PSU doesn't make any of the whining noises that people have reported with the Platinums. This was all at 1080p with all graphics maxed to ultra. I'm about to hop in game at 2560x1440 since my new monitor just arrived. I'll report back again about this PSU.


----------



## jerrolds

Quote:


> Originally Posted by *utnorris*
> 
> So what's the consensus so far? Get rid of the 290x (refuse delivery) and get a 290? It looks like you could get 3 x 290's for the price of 2 x 290x's.


I would definitely get a 290. 5% more performance at a 30% price premium? No thanks.

Someone buy my 290X
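jerrolds' value math can be sanity-checked with a quick sketch. The dollar figures here are illustrative assumptions built from the thread's rough "5% faster, 30% dearer" framing, not retailer quotes:

```python
def perf_per_dollar(relative_perf: float, price_usd: float) -> float:
    """Relative performance per dollar, normalized to the cheaper card."""
    return relative_perf / price_usd

# Illustrative launch-era numbers: 290 as the 1.00 baseline,
# 290X ~5% faster at roughly a 30% price premium.
r9_290 = perf_per_dollar(1.00, 400)
r9_290x = perf_per_dollar(1.05, 520)

print(r9_290 > r9_290x)  # True: the 290 wins on value
```

Under any premium much larger than the performance gap, the cheaper card comes out ahead on this metric.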


----------



## $ilent

Quote:


> Originally Posted by *tsm106*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16896268001


Got any UK recommendations? They don't sell those Rosewill ones in the UK.









Wait, that Rosewill one only goes from 0-50C?


----------



## beejay

Quote:


> Originally Posted by *jerrolds*
> 
> What kinds of numbers do you think a 290X can hit using aftermarket coolers like the Accelero Hybrid/Xtreme 3 or Gelid Icy 2? Assuming it drops load temps by 20C? My 290X stock everything can do 1115/1400 Elpida memory and 75% ASIC
> 
> ...is 1200+ attainable with better air cooling and more volts?
> 
> I wont get my Icy 2 till Friday at the earliest so im just kinda wondering what i should expect.


I'm using an Accelero Hybrid. GPU core is 1100, and it normally hovers around 60c on load using the stock fan. My only gripe with the Hybrid is that the trio of regulator modules on the left side of the GPU chip is not being cooled enough and goes to 80c on load. I'm at 1.28 volts. I'll try a mod on that particular part, maybe another 80mm fan. If I can cool that part, I'll try to push more.


----------



## pastuch

Is Chris Angelini at Toms on the NV payroll or are all of you ignoring this bit of the Toms review?

"The card that AMD sent to me is a stallion. Even if you get it nice and hot before running a test, bringing it down off of that 1000 MHz "wishful thinking" spec, it's still faster than GeForce GTX 780, and oftentimes GeForce GTX Titan. But the Radeon R9 290X I bought from Newegg is a dud. It'll drop to 727 MHz and stay there…and the reference cooler still can't cool it fast enough. The result is that it violates its 40% fan speed ceiling as well. The craziness, then, is that my R9 290 press board is typically faster than my R9 290X retail card. In the benchmarks, you're going to see numbers for all three."

I hope he's wrong, but seeing 10% lower numbers on retail cards vs. the review cards doesn't give me warm fuzzies.


----------



## jomama22

Quote:


> Originally Posted by *sugarhell*
> 
> Jomama you can do a valley run and a fs extreme?
> 
> 
> 
> 
> 
> 
> 
> Pls?


Here's the FSE:
http://www.3dmark.com/3dm/1560550?



6600 graphics score...


----------



## tsm106

Quote:


> Originally Posted by *jomama22*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> Jomama you can do a valley run and a fs extreme?
> 
> 
> 
> 
> 
> 
> 
> Pls?
> 
> 
> 
> heres the fse:
> http://www.3dmark.com/3dm/1560550?
> 
> 
> 
> 6600 graphics score...
Click to expand...

Nice start. How high can you raise the mem speed? Are you tapped out at 1500?


----------



## lordzed83

Hmm, I must have a very bad clocker if I can't get past 1170 without artifacts...

Any update on the VRM1 and VRM2 situation? Are the three small ones VRM1 or VRM2 in GPU-Z?


----------



## rdr09

Quote:


> Originally Posted by *pastuch*
> 
> Is Chris Angelini at Toms on the NV payroll or are all of you ignoring this bit of the Toms review?
> 
> "The card that AMD sent to me is a stallion. Even if you get it nice and hot before running a test, bringing it down off of that 1000 MHz "wishful thinking" spec, it's still faster than GeForce GTX 780, and oftentimes GeForce GTX Titan. But the Radeon R9 290X I bought from Newegg is a dud. It'll drop to 727 MHz and stay there…and the reference cooler still can't cool it fast enough. The result is that it violates its 40% fan speed ceiling as well. The craziness, then, is that my R9 290 press board is typically faster than my R9 290X retail card. In the benchmarks, you're going to see numbers for all three."
> 
> I hope he's wrong but seeing 10% lower numbers on retail cards vs the review cards doesn't give me warm fuzzys.


Ignore.

WEI > tom's


----------



## pastuch

Quote:


> Originally Posted by *rdr09*
> 
> Ignore.
> 
> WEI > tom's


Pardon my ignorance: WEI?


----------



## tsm106

Quote:


> Originally Posted by *pastuch*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> Ignore.
> 
> WEI > tom's
> 
> 
> 
> Pardon my ignorance: WEI?
Click to expand...

Ignore Tom's lol. Did you not see the retail card benches from jomama? The first experienced-user bench posted and it's already at top-ten level on the HoF. You think there's a problem still?


----------



## ImJJames

Quote:


> Originally Posted by *pastuch*
> 
> Pardon my ignorance: WEI?


Windows Experience Index?


----------



## rdr09

Quote:


> Originally Posted by *ImJJames*
> 
> Windows Experience Index?


Yes.


----------



## nemm

1x 290X returned and 2x 290s on the way instead (I couldn't help myself when ordering), but I have a problem: I only have one nickel/acetal block in my possession and am unable to source another, with no ETA for more stock. Should I grin and bear it by ordering a copper/acetal version?


----------



## skupples

Wuhhhh? anything>wei.

Not all GPUs are created equal. It's completely within reason for different sources to produce different results.


----------



## jerrolds

Newegg wouldn't let me request a return for my opened 290X... so, yeah: either keep it and hope for a great overclock once the Icy 2 comes in, or sell it to someone looking for an X that's in stock.


----------



## hotrod717

Newegg has XFX 290 with lifetime warranty for $20 more.








Worth it?


----------



## 290XGoneSwimmin

Anybody else having issues with artifacting? Trying to play a game like Arkham City occasionally results in a full-screen millisecond long checkerboard and it's pissing me off.


----------



## Tobiman

Quote:


> Originally Posted by *290XGoneSwimmin*
> 
> Anybody else having issues with artifacting? Trying to play a game like Arkham City occasionally results in a full-screen millisecond long checkerboard and it's pissing me off.


Best thing is to report it on AMD's new website so that they can sort it out in the next driver release.


----------



## FtW 420

Quote:


> Originally Posted by *skupples*
> 
> Wuhhhh? anything>wei.
> 
> Not all GPU's are created equally. It's completely within reason for different sources to produce different results.


tom's < wei is accurate enough, neither actually tell you anything about overclocked performance...


----------



## tsm106

Quote:


> Originally Posted by *FtW 420*
> 
> Quote:
> 
> 
> 
> Originally Posted by *skupples*
> 
> Wuhhhh? anything>wei.
> 
> Not all GPU's are created equally. It's completely within reason for different sources to produce different results.
> 
> 
> 
> tom's < wei is accurate enough, neither actually tell you anything about overclocked performance...
Click to expand...

Eh, it's user error. Suddenly this rumor of gold cards for reviewers is spreading like wildfire. Coincidence...?

http://www.overclock.net/t/1439991/swe-radeon-r9-290x-under-fire-for-golden-samples-sweclockers-investigates/0_40#post_21129835


----------



## EmZkY

I'm in the market for a 290X, and I wonder which one ends up on top between the 780 and 290X under water and fully overclocked. I'm also considering waiting for a 780 Ti, but I'm using a 1440p monitor... Aargh, so many choices.


----------



## rdr09

Quote:


> Originally Posted by *EmZkY*
> 
> I'm in the market for a 290x, and I wonder which one ends up at the top between 780 and 290x under water and fully overclocked. Im also considering to wait for a 780 ti, but I'm using a 1440p monitor.. Aargh so many choices.


Are you currently playing or planning to play BF4?


----------



## Deisun

I just ordered the R9 290. Couldn't resist...

BF4 here I come!
I'm hoping my Core i7 920 @ 4GHz, 6GB memory and the R9 290 will push me into ULTRA territory.

If my i7 920 can last me until SKYLAKE in 2015, I will be a very happy man.


----------



## jomama22

Quote:


> Originally Posted by *tsm106*
> 
> Nice start. How high can u raise the mem speed? Are you tapped at 1500?


Anything over 1500 starts lowering my score, so I know I'm at the brink. We need memory voltage so bad!

The highest I can get in Valley is 79.4, even with the CPU @ 5.1 and 1340/1500 clocks. Valley is insanely memory-clock dependent, it seems. I will try a lower core and higher mem when I get back home.


----------



## $ilent

Quote:


> Originally Posted by *EmZkY*
> 
> I'm in the market for a 290x, and I wonder which one ends up at the top between 780 and 290x under water and fully overclocked. Im also considering to wait for a 780 ti, but I'm using a 1440p monitor.. Aargh so many choices.


Don't wait for the 780 Ti. It's going to cost $700; for an extra $100 you could get 2x R9 290, which would trounce a single 780 Ti.


----------



## rdr09

Quote:


> Originally Posted by *Deisun*
> 
> I just ordered the 290. Couldn't resist...
> 
> BF4 here I come!
> I'm hoping my Core i7 @ 4GHz and 6gb memory and the R9 290 will push me into ULTRA territory.


If 1080p, even a 7950 can do that for you. Yes, maxed.


----------



## EmZkY

Quote:


> Originally Posted by *rdr09*
> 
> Are you currently playing or planning to play BF4?


Yes. I'm planning to play it once I swap out my 5870 for a newer card.


----------



## tsm106

Quote:


> Originally Posted by *jomama22*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Nice start. How high can u raise the mem speed? Are you tapped at 1500?
> 
> 
> 
> Anything over 1500 starts lowering my score so I know I'm at the brink. We need mem voltage so bad!
> 
> highest I can get on valley is 79.4 even with the CPU @ 5.1 and 1340/1500 clocks. Valley is insanely mem clock dependent it seems. I will try a lower core and higher mem when I get back home.
Click to expand...



1250/1650 with a 3820 at 4.6.

Here's my run from right when I got the first card. It was too abusive on the mem/VRMs on air cooling. Sitting tight till I get blocks.


----------



## ImJJames

Quote:


> Originally Posted by *rdr09*
> 
> if 1080, even a 7950 can do that for you. yes, max.


My 7850 OC'ed runs BF4 maxed ultra at 50+ FPS average, no problem, at 1080p. When I tested crossfire with another 7850 I reached 80+ FPS average on ultra and 95+ FPS average on high. Not sure why people think you need a $400+ card to run BF4, but hey, it's their money.


----------



## Paul17041993

Quote:


> Originally Posted by *beejay*
> 
> I found the problem. The heat sink on the 3 regulator modules was not in line with the airflow. I was so sleepy when I installed this.
> 
> Anyway, because of the surrounding components and the small space, I can only fit one heat sink, and used a thermal pad from the stock ones. I now level out at 88c in Heaven on VRM1.
> 
> I need to find big sinks with a small footprint and glue them to those components.
> 
> And it seems like VRM1 and VRM2 may be swapped depending on the card? VRM1 on mine are the small ones. Also, HWiNFO displays two GPU0 entries with VRM 1 and 2 swapped between the first GPU0 and the second GPU0.


Likely they use the same ID and will appear in whatever order they get found in...

Quote:


> Originally Posted by *kantxcape*
> 
> Jesus christ, so I can't crossfire 2 290Xs with a Platinum 1000W PSU? System is watercooled.


Of course you could, just like 7970 trifire; just don't go too overboard with your voltage (not that I think you would need to).


----------



## jomama22

Quote:


> Originally Posted by *tsm106*
> 
> Nice start. How high can u raise the mem speed? Are you tapped at 1500?


Quote:


> Originally Posted by *tsm106*
> 
> Ignore Toms lol. Did you not see the retail card benches from jomama? First experienced user bench posted and its already in the top ten level on HoF. You think there's a problem still?


Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> 1250/1650 with a 3820 at 4.6.
> 
> Here's my run right when i got the first card. It was too abusive on the mem/vrm on aircooling. Sitting tight till I get blocks.


Here we go, new high.



1330/1600 - any more memory and I start to crash.

I could probably get it to 1340, but honestly, the benefits are low.

And lol, our screens are almost on the exact same frame.


----------



## rdr09

Quote:


> Originally Posted by *EmZkY*
> 
> Yes. Im planning to play it once i change out my 5870 with a newer card


You have no choice but to stick with AMD.


----------



## Deisun

It's not really that I felt I needed R9 290. More so that I just couldn't resist the price per performance that the 290 gives you and I was already needing a new card.


----------



## $ilent

I just measured VRM temps with gpuz and hwinfo, both reporting same temperatures within 1C of each other. So are they wrong tsm or are my VRMs really this cool?


----------



## ImJJames

Quote:


> Originally Posted by *Deisun*
> 
> It's not really that I felt I needed R9 290. More so that I just couldn't resist the price per performance that the 290 gives you and I was already needing a new card.


Yeah, the 290 is a steal. I'm going to wait though; I really think patience will pay off, because Nvidia and AMD are having a war right now and in the end customers will win.


----------



## Ukkooh

Quote:


> Originally Posted by *Deisun*
> 
> It's not really that I felt I needed R9 290. More so that I just couldn't resist the price per performance that the 290 gives you and I was already needing a new card.


It's ok bro. I mostly play cod4 and smite nowadays and I still got the unbeatable itch for a custom loop and a 290x. I guess I need that 1100 fps.


----------



## Forceman

Quote:


> Originally Posted by *jomama22*
> 
> The memory starts giving me less points over 1500mhz. Though I won't see artifacts, ecc is kicking in hard. I suspect many people will experience the same thing.


One of the reviews mentioned that there was a BIOS bug limiting memory overclocks, so maybe that's the issue. Feeling ballsy enough to flash a 290X BIOS on it and see what happens?
Quote:


> Originally Posted by *fleetfeather*
> 
> Woah, do I have my facts crossed somewhere? How can a gpu actually pull more than 300W through those 8+6pin power cables if 300W is the limit they can supply? I wouldn't of thought any substantial power was being fed through the mobo?
> 
> E: maybe you're referring to TDP..?


Quote:


> Originally Posted by *sugarhell*
> 
> pcie can give up to 300 watt.


The standard is 300W (of which 75W is supplied by the PCIe slot itself). So technically they aren't supposed to make a card that exceeds 300W, which is why they power throttle at default settings. But the actual power delivery capacity of the 8+6-pin connectors is much higher than that.
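As a back-of-the-envelope check on those figures (a sketch; the per-connector wattages are the standard PCIe rated limits, not measured draw):

```python
# Rated power budget for a reference 8+6-pin card under the PCIe spec:
# the slot itself plus the two auxiliary connectors.
RATED_LIMITS_W = {
    "pcie_slot": 75,  # fed through the motherboard
    "6_pin": 75,
    "8_pin": 150,
}

total_w = sum(RATED_LIMITS_W.values())
print(total_w)  # 300 -> the "spec" ceiling the card throttles against
```

The cables themselves can physically carry well beyond these rated numbers, which is why an overvolted card can blow past 300W without tripping anything.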


----------



## tsm106

Quote:


> Originally Posted by *$ilent*
> 
> I just measured VRM temps with gpuz and hwinfo, both reporting same temperatures within 1C of each other. So are they wrong tsm or are my VRMs really this cool?


Not sure, but generally, going from my experience with 7970s, the reference cooler cannot cool VRMs below 60c. And custom coolers have worse VRM cooling than the stock reference cooler, since the stocker uses a giant monoblock frame, giving a ton of material with which to transfer heat. Simple stick-on heatsinks do not compare. I haven't had a chance to confirm temps, but I will when I get my blocks.

Also, check this out and remember the VR stages of the 7970 and 290x are very similar.

http://www.swiftech.com/hd7900-hsf.aspx

Scroll down to see the massive unisink, then imagine tiny stick-on heatsinks in comparison...


----------



## $ilent

I've ordered an infrared thermometer, so I'll post in here when it arrives too.


----------



## esqueue

I couldn't find any backplates in stock, so I ended up going without. My EK block will arrive here on Friday; if there were a way to get a backplate here before then, I would still get one.
Quote:


> Originally Posted by *ImJJames*
> 
> Windows Experience Index?


I nearly fell out of my chair when they explained what WEI meant.


----------



## Arizonian

Well gentlemen - since I'm not water cooling and not willing to upgrade my 1-yr-old AX850 PSU until my next rig build, I've sold my reference 290X. I sold it to my cousin's work friend, who wanted one so badly he offered me what I paid for it, without the BF4 code, this afternoon. It's so hard to obtain he was ecstatic to hand me the money.

I got to try this beast, and now I'm waiting for non-reference coolers to see what they bring to the table for someone in my position. Originally I intended to keep it until non-reference cards came out, but his offer was only good until he could score one somewhere else. Getting to keep BF4 was the icing on the cake, as I theoretically made out ahead of what I paid.

Slapped my old 680 in the main rig (since my 690 is already in the second 3D Vision rig) and going back to waiting. Happy / sad day, I guess.


----------



## VSG

Just called up FrozenCPU and the shipment of EK blocks/backplates got delayed again. Now they are expecting a big shipment coming in Friday/Monday so expect shipment on Monday/Tuesday at the earliest


----------



## $ilent

Why did you not just put a gelid icy vision on it before selling it?


----------



## VSG

Quote:


> Originally Posted by *Arizonian*
> 
> Well gentlemen - since I'm not water cooling, not willing to upgrade my 1 yr old AX850 PSU until next rig build, I've sold my reference 290X. Sold it to my cousins work friend who wanted one so bad offered me what I paid for it without the BF4 code this afternoon. It's so hard to obtain he was ecstatic to hand me the money.
> 
> I got to try this beast and now waiting for non-reference coolers and see what they bring to the table for someone in my position. Originally I intended to keep it until non-reference cards came out but his offer was only good until he could score one somewhere else. Getting to keep BF4 was the icing on the cake as I made out ahead of what I paid theoretically.
> 
> Slapped my old 680 in main rig (since my 690 is already in the second 3D Vision rig) and going back to waiting. Happy / sad day I guess.


Well darn I guess


----------



## jomama22

Alright, I squeezed her a bit harder in Valley...



1320/1700!!! So the memory can indeed get up there. It seems to be very dependent on core speed and voltage.

82.1 FPS... the highest 290X I have seen on air/water.


----------



## Arizonian

Quote:


> Originally Posted by *jomama22*
> 
> Alright, I squeezed her a bit harder in valley...
> 
> 
> 
> 1320/1700!!! So the memory can indeed get up there. Seems to be very dependent on core speed and voltage.
> 
> 82.1 FPS...Highest 290x i have seen on air/water.


I hope you're posting your scores in the benchmark threads and representing us.


----------



## devilhead

Quote:


> Originally Posted by *jomama22*
> 
> Alright, I squeezed her a bit harder in valley...
> 
> 
> 
> 1320/1700!!! So the memory can indeed get up there. Seems to be very dependent on core speed and voltage.
> 
> 82.1 FPS...Highest 290x i have seen on air/water.


That's a nice score you have - and you didn't use any ATI tweaks?


----------



## jomama22

Quote:


> Originally Posted by *Arizonian*
> 
> I hope you're posting your scores and the benchmark threads and represent us.


Once I get them all benched I will. Don't want to sandbag myself lol. This is just the first of 6 for water testing. It also has the highest ASIC @ 77.2%; I expected it to be one of the worst water clockers, as Tahiti was better in the 68-74% range for water.


----------



## $ilent

jomama, what voltage is that at? Any idea what the stock voltage is for the 290X?


----------



## jomama22

Quote:


> Originally Posted by *$ilent*
> 
> jomamma what volt is that at? Any ideas what the stock voltage is for the 290x?


That is on the PT3 BIOS @ 1.35v. It has droop disabled and a high LLC, so it's most likely around 1.375v, if not a little less because of the high ASIC.

The stock voltage for this card (77.2% ASIC) was running @ <1.12v under load. It maxed at 1150 on the stock BIOS.


----------



## Gilgam3sh

Does anyone know if the Sapphire 290 is voltage locked and needs to be flashed with another BIOS? Also, what about the Asus 290 voltage?


----------



## Arm3nian

Quote:


> Originally Posted by *jomama22*
> 
> That is on the pt3 bios @ 1.35v, it has droop disabled and a high llc, so its most likely around 1.375v if not a little less because of the high asic.
> 
> The stock voltage for this card (77.2% asic) was running @ <1.12 under load. This maxed at 1150 on the stock bios


Can you push more volts currently without a hardmod?


----------



## RAFFY

Quote:


> Originally Posted by *RAFFY*
> 
> Honest to god my PSU's fan hasn't even turned on yet. I was just playing COD: Ghosts no PSU fan, then before that BF4 no PSU fan, then before that I had BF4 open and minimized while playing COD: Ghosts and no PSU fan. With that being said my PSU doesn't make any whining noises that people have reported with the platinum's. This was all at 1080p resolution with all graphics maxed to ultra. I'm about to hop in game at 2560x1440p since my new monitor just arrived. I'll report back again about this PSU.


OK, I'm quoting myself because I'm a liar. The PSU fan does come on, but only for 10-30 seconds max, then it turns off. This is while the PSU is in hybrid mode. Either way, this PSU is great so far. And two of these cards at 2560x1440, even at stock clocks, are rocking my socks. Can't wait to get a nice WC setup come December and get some overclocking going.


----------



## tsm106

Quote:


> Originally Posted by *Arm3nian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jomama22*
> 
> That is on the pt3 bios @ 1.35v, it has droop disabled and a high llc, so its most likely around 1.375v if not a little less because of the high asic.
> 
> The stock voltage for this card (77.2% asic) was running @ <1.12 under load. This maxed at 1150 on the stock bios
> 
> 
> 
> Can you push more volts currently without a hardmod?
Click to expand...

Yea you can if you are crazy cool. Keyword, cool, keep it cool enough... it's harder than it sounds though.


----------



## $ilent

Quote:


> Originally Posted by *jomama22*
> 
> That is on the pt3 bios @ 1.35v, it has droop disabled and a high llc, so its most likely around 1.375v if not a little less because of the high asic.
> 
> The stock voltage for this card (77.2% asic) was running @ <1.12 under load. This maxed at 1150 on the stock bios


Wow, 1.12v sounds low - how did you check that? If in GPU-Z, which one was it? They have a couple, like VDDC etc.


----------



## Arm3nian

Quote:


> Originally Posted by *tsm106*
> 
> Yea you can if you are crazy cool. Keyword, cool, keep it cool enough... it's harder than it sounds though.


Well, keeping it cool with water is just going to come down to TIM and ambient temps; adding more and more rads isn't going to help, I think. You should be able to go cooler on the core if Liquid Ultra is used; I don't really want the hassle of it though, since I killed my 690 with Liquid Pro (the only probable cause).


----------



## tsm106

Yea I've got a tube of CLU and Fuji Ultra Extremes.... Gdamn fcpu.


----------



## Gilgam3sh

Anyone tried flashing a 290X BIOS on the 290?


----------



## jrcbandit

Quote:


> Originally Posted by *290XGoneSwimmin*
> 
> Anybody else having issues with artifacting? Trying to play a game like Arkham City occasionally results in a full-screen millisecond long checkerboard and it's pissing me off.


Are you overclocking? I only had this issue when overclocked. I was at 1200 core, 1400 memory and had to drop to 1190 core and 1375 memory to eliminate any artifacting. This was with the voltage at 1.299v (the max on the Asus BIOS: 1.35v set, minus vdroop, gives 1.299v). I'm hoping an Afterburner gets released soon that allows higher voltages and voltage changes for the memory.
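For anyone puzzled by the 1.35v → 1.299v figures, the loaded voltage is just the set point minus vdroop. A minimal sketch (the ~51mV droop value is inferred from jrcbandit's numbers, not a published spec):

```python
def loaded_voltage(set_point_v: float, droop_v: float) -> float:
    """Estimate the voltage actually seen under load after vdroop."""
    return round(set_point_v - droop_v, 3)

# 1.35v set on the Asus BIOS, ~51mV of droop under load:
print(loaded_voltage(1.35, 0.051))  # 1.299
```

This is also why "droop disabled + high LLC" BIOSes (like PT3, discussed earlier in the thread) deliver close to the full set point under load.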


----------



## Arm3nian

Quote:


> Originally Posted by *tsm106*
> 
> Yea I've got a tube of CLU and Fuji Ultra Extremes.... Gdamn fcpu.


I snatched my blocks the first day; they're still sitting there waiting for the stupid RivBE. Have you seen any difference between those Fuji pads and the ones that come with the EK blocks?


----------



## tsm106

Quote:


> Originally Posted by *Arm3nian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Yea I've got a tube of CLU and Fuji Ultra Extremes.... Gdamn fcpu.
> 
> 
> 
> I snatched my blocks the first day, still sitting there waiting for the stupid RivBE. Have you seen any difference between those fuju pads and the ones that come with the ek blocks?
Click to expand...

They are over three times more conductive. Hmm, results... my benches are pretty fast?


----------



## outofmyheadyo

When will we see aftermarket coolers like the Asus DCII, Windforce 3, and others on these cards? The 290 is amazing value for the money, if only the cards had a similar price and a decent cooler. As of now I can't buy them because of the ref cooler.


----------



## Arizonian

Quote:


> Originally Posted by *outofmyheadyo*
> 
> When will we see aftermarket coolers like asus DCII, Windforce3, and others on these cards ? 290 is an amazing value for money if only the cards had a similar price and a decent cooler. But as of now i cant buy them because of the ref cooler.


Fingers crossed, but it may be the latter part of November. It's been a cat-and-mouse game between Nvidia and AMD. I have a feeling AMD is waiting to see the hand that Nvidia plays before making their decision on release and pricing.


----------



## VSG

Quote:


> Originally Posted by *tsm106*
> 
> They are over three times more conductive. Hmm, results... my benches are pretty fast?


I was considering the Fuji ultra pads but they are so expensive! Between my 2 blocks and back plates, it would cost almost $170. I will stick with the stock pads and see how the performance is.


----------



## Forceman

Quote:


> Originally Posted by *tsm106*
> 
> Yea I've got a tube of CLU and Fuji Ultra Extremes.... Gdamn fcpu.


What thickness fujipoly are you using?


----------



## SonDa5

The tweaked Asus BIOSes that are out and about are not working that great for my water-cooled Sapphire 290X. I'm having a hard time stabilizing 1200MHz. Hoping the new Sapphire TriXX comes out soon.


----------



## rdr09

Quote:


> Originally Posted by *SonDa5*
> 
> The Asus tweaked bios that are out and about are not working that great for my Water cooled Sapphire 290x. Having a hard time stabilizing 1200mhz. Hoping new Sapphire TRIXX comes out soon.


PM jomama if you like - posts #2973 and #2978.


----------



## outofmyheadyo

I think the prices on 290 cards can only go lower with time, and for that reason I think it will be very hard to justify buying a 780 Ti for the price of almost 2x 290 cards that will surely kick out some sick scores.

I think AMD really hit the jackpot with the pricing of the 290; if the stock cooler weren't such a turnoff I'd run to the store right now and grab a pair. Here's hoping I don't have to swap out my Seasonic X850 - 2x GTX 580 @ 1000 core were giving me random reboots and shutdowns even though each card did 1000 core no problem on its own, perhaps a power supply issue, and I'm pretty sure the 290s don't consume much less power when OC'd.


----------



## jomama22

Quote:


> Originally Posted by *SonDa5*
> 
> The Asus tweaked bios that are out and about are not working that great for my Water cooled Sapphire 290x. Having a hard time stabilizing 1200mhz. Hoping new Sapphire TRIXX comes out soon.


The Asus BIOS has a max of 1.299v under load, so be sure to just put the slider up to 1.412 to get the most out of it without worrying about damage. Set PowerTune to 150%.

Try from there. The two I have tested on the Asus BIOS hit 1245/1255 stable in Fire Strike, more in Valley/games.


----------



## SonDa5

Quote:


> Originally Posted by *jomama22*
> 
> The asus bios has max 1.299v under load. so be sure to just put the slider up to 1.412 to get the most out of it without worrying about damage. Set powertune to 150%.
> 
> Try from there. The two I have tested on the asus bios hit 1245/1255 stable in firestrike. More for valley/games.


I'm running the PT3 BIOS, which allows voltage all the way to 2v. My card gets unstable with anything over 1.35v on that BIOS. I am using Furmark to test stability.

I am holding off on pushing this card till Sapphire TRIXX comes out or some new BIOS updates are put out from Sapphire.


----------



## SonDa5

Quote:


> Originally Posted by *Forceman*
> 
> What thickness fujipoly are you using?


For the EK block I have Fujipoly Ultra Extreme: 1mm on the VRMs and 0.5mm on the memory ICs.

CL Ultra TIM.


----------



## Raephen

It looks like I'll be joining this club a lot sooner than I had expected.

I received a price alert I had set for the Sapphire R9 290 today. I had set the bar at anything below €410.

It turned out to be €380-ish, so I almost pulled the trigger there and then, but I decided to wait some more (made some coffee, baked some potatoes, etc.) and reconsidered. So I went back to the site offering the lowest price in my neck of the woods and saw it for €354.

That was an offer I couldn't refuse. So, including shipping, I got an R9 290 for € 360,65.

Damn glad I waited that little bit, but even more that I didn't wait longer: it's back to € 379 as we speak (@ www.afuture.nl ).

While I have no doubt my Seasonic X-560 would be enough for the 290 at stock speeds, I took this as an excuse to also order a new PSU: another Seasonic, but the platinum 860 Watt version. It should be a bit more efficient at low loads (desktop) and give me some overclocking headroom.


----------



## pk7677

Anyone have a guide on how to install the Accelero Xtreme III onto an R9 290? Just got 2 R9 290s and 2 Accelero coolers. Tom's Hardware did it in their review but didn't say how.


----------



## Sgt Bilko

Quote:


> Originally Posted by *pk7677*
> 
> Any one have a guide on how to install the accelero xtreme 3 on to a r9 290? Just got 2 r9 290s and 2 accelero coolers. Toms hardware did it in their review but didn't say how.


Just follow the instructions included.
The only bit of advice I can give is to make sure you have good coverage of the VRMs, as they throw off a lot of heat.


----------



## pk7677

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Just follow the instructions included.
> Only bit of advice I can give is to make sure that you have good coverage of the vrm's as they will throw off alot of heat.


I'm doing my research right now, and earlier in this thread you said something about them giving you 12 VRAM heatsinks when you need 16. Where can I buy the extra heatsinks? Anything else I should know?


----------



## Gilgam3sh

Quote:


> Originally Posted by *pk7677*
> 
> I'm doing my research right now and earlier in this thread you said something about they give you 12 vram heatsinks and you need 16, so where can I buy extra vrm heatsinks?. Anything else i should know?


Some guys in Sweden have fitted the Accelero Xtreme III on their 290X and said the VRAM heatsinks that come with the cooler are enough. I will get my cooler in a couple of days with the R9 290.


----------



## PillarOfAutumn

So AMD said they will be including Never Settle with the new R9 cards. If I were to buy an R9 290 now, will AMD give me some of those free games after they release the new Never Settle bundle? Due to time constraints, I'm almost 95% certain I'm never going to play any of those games, but I like to add them to my Steam library so I can look at them from time to time and then not play them.


----------



## Sgt Bilko

Quote:


> Originally Posted by *pk7677*
> 
> I'm doing my research right now and earlier in this thread you said something about they give you 12 vram heatsinks and you need 16, so where can I buy extra vrm heatsinks?. Anything else i should know?


I'm in Aus, so I bought mine from pccasegear.com.au.

You only need a couple extra, unless you want to take a hacksaw to some of the heatsinks that come with it. I would just get extras, tbh.

I also found the compound that came with it was a little underwhelming. I bought some Phobya thermal adhesive and mixed it with a little Arctic Silver 5 for the heatsinks.

Overall it's a very good cooler. It drops the temps nicely and it's silent, but I'm planning to add some higher-RPM fans to it to get it a little cooler.


----------



## Snyderman34

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> So AMD said that they will be including the never settle in the new r9 cards. So if I were to buy an r9 290 now, will AMD give me some of those free games after they release the never settle? Due to time constraints, I'm almost 95% certain I'm not going to play any of those games or ever, but I like to include in my steam library so that I can look at it from time to time and then not play it.


Really now? I hope they do! Though I would play them.... for 15 minutes. Then I'd look at them.


----------



## Sgt Bilko

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> So AMD said that they will be including the never settle in the new r9 cards. So if I were to buy an r9 290 now, will AMD give me some of those free games after they release the never settle? Due to time constraints, I'm almost 95% certain I'm not going to play any of those games or ever, but I like to include in my steam library so that I can look at it from time to time and then not play it.


The Never Settle bundle doesn't interest me that much anymore. I got most of the games when I bought my 7970, and the others through Steam sales.

I did ask about it when I bought my 290X, but they told me it wouldn't be available. The 290 might have it included, but I thought AMD were waiting for a couple more games to be released before they brought it to the R9 series. I'm probably wrong though.


----------



## Taint3dBulge

Perfect day to get new parts... first snowfall of the year.

You can finally add me to the list


----------



## tobitronics

Does anyone know if it is wise to flash the AMD 015.039.000.007.000000 BIOS on my ASUS 290X reference card? I currently run the 015.039.000.006.003515 version. Will this give me benefits or will it destroy my card?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Perfect day to get new parts.. First snow fall of the year..
> 
> You can finally add me to the list


Looks good


----------



## Arizonian

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Perfect day to get new parts.. First snow fall of the year..
> 
> You can finally add me to the list
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> [IMG]http://www.overclock.net/content/type/61/id/1732332/width/500/height/1000[/IMG]


Congrats - added


----------



## PillarOfAutumn

Has anyone gotten their paws on the 290??


----------



## Forceman

Quote:


> Originally Posted by *SonDa5*
> 
> For EK block I have Ultra on VRM 1mm and .5mm on memory IC of the xtreme.
> 
> CL Ultra TIM.


Thanks. This is my first time using a full-cover block; what's the reason for not machining the block to make direct contact with the VRMs and memory? Card flex issues?


----------



## sch010

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Has anyone gotten their paws on the 290
> 
> 
> 
> 
> 
> 
> 
> ??


I'm supposedly getting one from Amazon on Friday. Although it's been stuck on "shipping now" since shortly after midnight... haha. I would've let the dust settle some more, but my Korean monitor doesn't play nicely with the 3570K iGPU.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *sch010*
> 
> I'm supposedly getting one from Amazon on Friday. Although it's been stuck on "shipping now" since shortly after midnight... haha. I would've let the dust settle some more, but my Korean monitor doesn't play nicely with the 3570K iGPU.


lol, just ordered mine. I know I should have waited, because I'm not going to get time to work with this GPU until next Thursday/Friday. And I won't be gaming or testing it out until Sunday lol.


----------



## KaiserFrederick

Just pre-ordered a Sapphire 290, EK water block and backplate. Should arrive on the 15th


----------



## esqueue

Never mind, I was mixing up the EK passive backplate with the Aquacomputer active one. I'll pass, as the card doesn't look bad.


----------



## Taint3dBulge

I have an issue; wondering if you guys might know. First off, I have a really bad screeching noise, which I'm thinking is coil whine? It's super loud when I start up a game. Second, in Battlefield my FPS goes from 120 to 45, up and down over and over. I've tried quiet mode and uber. I've set a fan profile and I've also left it on auto.

here is a screen shot..


----------



## esqueue

Quote:


> Originally Posted by *Taint3dBulge*
> 
> I have an issue. Wondering if you guys might know.. First off is i have a really bad screeching noise, thinking its coil whine? Its super loud when i start up a game. second is in battlfeild my fps go from 120fps to 45 up and down up and down.. Iv tryed quiet mode and uber.. Iv set the fan profile and iv also left it on auto..
> 
> here is a screen shot..


Does the Asus card actually allow you to control the fan? Since my avatar is still default, my posts tend to get ignored, so I haven't bothered asking and have been searching for my XFX card instead.


----------



## Ha-Nocri

@Taint3dBulge: do you have enough RAM?


----------



## Arizonian

@Taint3dBulge

By 75°-80° you should have that fan manually pumping ('user define') at 70%, and at 90° you should have it at 75%. That should keep you from passing 92°, which is where it will downclock. Your fan is on auto in your graph. What you do below those temps is up to you, as those would be desktop temps and you're not gaming then.
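For anyone scripting their own profile, the curve described above can be sketched as a simple interpolation (a hypothetical helper, not any vendor tool's actual API; the temperature/duty points come straight from the post):

```python
# Illustrative user-defined fan curve based on the advice above:
# 70% duty by 75-80 °C and 75% by 90 °C, aiming to stay under the
# 92 °C downclock point. The point values are from the post, not an
# official AMD or vendor profile.

CURVE = [(75, 70), (80, 70), (90, 75)]  # (temp °C, fan duty %)


def fan_speed(temp_c: float) -> float:
    """Linearly interpolate fan duty (%) for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]   # below the curve: idle/desktop temps
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]  # hold the top duty above 90 °C
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
```

Tools like MSI Afterburner or GPU Tweak let you enter points like these directly in their fan curve editors, so the sketch is only to make the shape of the profile concrete.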


----------



## Slomo4shO

The Humble Bundle currently has Batman: Arkham Asylum GOTY for $1.00 (Steam key) for anyone wanting a cheap game for benching. The bundle also includes F.E.A.R. 2: Project Origin, F.E.A.R. 3, and Lord of the Rings: War in the North, and the Beat the Average tier (currently $4.56) adds Scribblenauts Unlimited and Batman: Arkham City GOTY.


----------



## utnorris

Just in case there were any doubts:

http://www.techpowerup.com/193891/existing-ek-fc-r9-290x-series-water-blocks-compatible-with-amd-radeon-r9-290.html

The 290X EK water block does fit the 290, which means the backplate fits too.

OK, so besides the slightly cut-down stream processor and TMU counts, I cannot see why the 290 won't overclock and perform very close to the 290X. Anyone disagree?


----------



## Arm3nian

Quote:


> Originally Posted by *utnorris*
> 
> Just in case there was any doubts:
> 
> http://www.techpowerup.com/193891/existing-ek-fc-r9-290x-series-water-blocks-compatible-with-amd-radeon-r9-290.html
> 
> The 290x EK water block does fit the 290. It also means the backplate fits too.
> 
> Ok, so besides being slightly cut down in the stream processors and the TMU count is down, I cannot see why the 290 will not overclock and perform very close to the 290x. Anyone disagree?


Depends where and how much they cut corners. The 290X might be able to OC more. The 290 is still the better deal if you care about price to performance, and results will be close for the majority of people. The 290X might gain some room with high-end water or LN2.


----------



## MrStick89

So I have had my 290X since Monday now, and I'm wondering if I should get rid of it before it's too late. In all these benchmarks I see, the 290 is putting out nearly the same level of performance for $150 less. So what's the point of the 290X? Are we expecting better drivers to utilize this card better? I knew the specs were close, but I figured there would be a bigger step in performance.


----------



## utnorris

Well, I am switching back over to my tech bench to see how well the 290X does on my full-blown water cooling. I am debating refusing my second 290X tomorrow and ordering two 290s, since the waterblocks fit and they seem to be the exact same power delivery and chip. For the difference between my 2x 290Xs and the price of 2x 290s, I could almost get a third 290. It really is a hard decision, especially since my current 290X tops out at 1221MHz on the GPU. If I had one that did 1300MHz like some of the others around here, I might be inclined to keep it. Decisions, decisions.


----------



## Taint3dBulge

Quote:


> Originally Posted by *esqueue*
> 
> Does the asus card actually allow you to control the fan? Since my avatar is still default my posts tend to get ignored so I haven't bothered asking and have been searching for my XFX card.


Yes, in GPU Tweak I can set up a fan profile. OK, I did just remember my PCIe power extensions are both 6-pin, with one having an extra 2 pins added. I took those out, put in the actual 8-pin cable, and plugged it back in. Gonna go try Battlefield again and see what happens.


----------



## xxmastermindxx

Quote:


> Originally Posted by *utnorris*
> 
> Just in case there was any doubts:
> 
> http://www.techpowerup.com/193891/existing-ek-fc-r9-290x-series-water-blocks-compatible-with-amd-radeon-r9-290.html
> 
> The 290x EK water block does fit the 290. It also means the backplate fits too.
> 
> Ok, so besides being slightly cut down in the stream processors and the TMU count is down, I cannot see why the 290 will not overclock and perform very close to the 290x. Anyone disagree?


Nope. I will have 2 on the way to me shortly


----------



## Arm3nian

Quote:


> Originally Posted by *utnorris*
> 
> Well I am switching back over to my tech bench to see how well the 290x does on my full blown water cooling. I am debating on refusing my second 290x tomorrow and ordering two 290's since the waterblocks fit and they seem to be the exact same power and chip. For the difference of my 2 x 290x's and the price of 2 x 290's I could almost get a third 290. It really is a hard decision, especially since my current 290x tops out at 1221Mhz on the GPU. If I had one that did 1300Mhz like some the others around here I might be inclined to keep it. Decisions, decisions.


1221 on water? Seems low. If it is the same chip, the 290X is most likely binned better; as we saw before, the 290 needed more volts to reach the clocks the 290X had.


----------



## Taint3dBulge

Quote:


> Originally Posted by *Arizonian*
> 
> @Taint3dBulge
> 
> By 75°- 80° you should have that fan manually pumping 'user define' at 70% and at 90° you should have it at 75% fan. They should keep you from passing 92° which will down clock. Your fan is at auto from you graph. What you do below those temps is up to you as they'd be desktop temps and your not gaming then.


I played for about 30 min with the fan in manual mode at 60% and the temps never went above 80c. Gonna try the game with the fan on auto and see if it does it, then I'll try 75% and see.


----------



## sch010

Quote:


> Originally Posted by *Arm3nian*
> 
> 1221 on water? Seems low. If it is the same chip, 290x is most likely binned better, as we saw before the 290 needed more volts to reach the clocks the 290x had.


This. It depends on how marginal the harvested chips in the 290 are. Hopefully leakage won't be as much of a factor under water.


----------



## utnorris

Quote:


> Originally Posted by *Arm3nian*
> 
> 1221 on water? Seems low. If it is the same chip, 290x is most likely binned better, as we saw before the 290 needed more volts to reach the clocks the 290x had.


I agree it is low, which is why I'm swapping to my tech bench and switching to my ROG Gene to see if there is something else holding the card back. I literally took the voltage to 1.41v and still could not get above 1221MHz without artifacts. The GPU never went over 48c either, so I do not think it was a cooling issue, but we will find out. Personally I think I just got a dud, which is why I am hesitant about refusing my second 290X, as it would let me know if something else was up. Of course, if I open it, then I am stuck with it and would need to eBay it to get my money back.


----------



## golemite

Just picked up from will call..


----------



## Strata

Just scored a Sapphire 290X on Newegg (thank you Auto Notify), cannot wait...


----------



## Jared Pace

Quote:


> Originally Posted by *utnorris*
> 
> I literately took the voltage to 1.41v and still could not get above 1221Mhz with artifacts.


Because your actual voltage isn't 1.41v, it's 1.29v.


----------



## Taint3dBulge

Yeah, there is something totally wrong with my card. Gonna have to RMA it. The coil whine just screams; you can hear it downstairs. The GPU usage is all over the board, never staying at 100%, just up and down, and temps are totally fine: at 75% fan it never goes over 90c. I put BF3 on medium settings at 1680x1050 and it goes from 150fps to 45 and just bounces all around. Latest drivers, fresh install of Windows. Even my 6950 at high settings would run 70fps constantly, with maybe a blip into the lower 50s, never 45. I'm not looking forward to trying to find another Asus...


----------



## Arm3nian

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Ya there is something totally wrong with my card. Gonna have to rma it.. The coil whine just screams. You can hear it downstairs... The gpu usage is all over the board never does it stay at 100% its just up and down and temps are totally fine. At 75% it never goes over 90c... I put bf3 to medium settings at 1680x1050 and it goes from 150fps to 45 and it just bounces all around.. Latest drivers, fresh install of windows. Even my 6950 at high settings would run 70fps constantly with maybe a blurp in the lower 50's never 45.. Im not looking forward to trying to find another asus...


Sounds like it might have been damaged during shipment?
Quote:


> Originally Posted by *Jared Pace*
> 
> Because your actual voltage isn't 1.41v, it's 1.29v.


If he was on ASUS bios then yeah. Still should be able to push a little more imo.


----------



## Sazz

Anyone who wants a Sapphire 290X: there's stock on Newegg at this moment, at least 2 are available.


----------



## spork8655

Hey, I'm having a lot of issues with my pair of Gigabyte Radeon R9 290Xs. I made a thread about it over here:

I'm having some major issues with my Crossfired Gigabyte Radeon R9 290x setup. 


Somebody suggested it may be a power issue, and that I should ask over here. I would greatly appreciate any help or suggestions.


----------



## esqueue

Quote:


> Originally Posted by *Sazz*
> 
> anyone wants Sapphire 290X theres stock in newegg at this moment, atleast 2 is available.


Interesting that the site doesn't refresh and says OOS, but when you click through, you can purchase it. That's why those who weren't using http://www.nowinstock.net/computers/amdr9290x/ had such a hard time finding the card.


----------



## esqueue

Quote:


> Originally Posted by *spork8655*
> 
> Hey, I'm having a lot of issues with my pair of Gigabyte Radeon R9 290x'es. I made a thread about it over here:
> 
> I'm having some major issues with my Crossfired Gigabyte Radeon R9 290x setup.
> 


> Somebody suggested it may be a power issue, and that I should ask over here. I would greatly appreciate any help or suggestions.

If you are using GPU-Z, make sure it is at least version 0.7.4. Oddly enough, I had major freezing issues when I ran an earlier version; I would have to reset the PC for it to behave normally. Once I opened GPU-Z it would go crazy, even after I closed it. I don't know which version, but it didn't even have VRM temps.

Hopefully that's your issue, though it's very doubtful.


----------



## the9quad

Quote:


> Originally Posted by *spork8655*
> 
> Hey, I'm having a lot of issues with my pair of Gigabyte Radeon R9 290x'es. I made a thread about it over here:
> 
> I'm having some major issues with my Crossfired Gigabyte Radeon R9 290x setup.
> 


> Somebody suggested it may be a power issue, and that I should ask over here. I would greatly appreciate any help or suggestions.

Well, I had a similar issue on one card. I had to flash the BIOS to the Asus one, run a custom fan profile to make sure temps never got above 80, and up my vcore a tad. That fixed my random lockups. I think 90-95 is still too hot for these cards, and when they get stressed in a benchmark or game at those temps, they crash. I'm not exactly a genius, but that's what I think is going on: they either don't get enough juice or they overheat. 55% fan speed is not enough to keep these things cool in my HAF X, but 60-65% is. Maybe we both got bum cards, I don't know.


----------



## PillarOfAutumn

Alright, so after ordering my 290, I'm just sitting here at my desk staring at my monitor and just waiting...



Spoiler: Warning: Spoiler!


----------



## Deisun

Hilarious! I'm feeling the same way; I ordered an R9 290 earlier today.


----------



## MrStick89

One $550 gfx card just isn't enough these days. Got my 290X running at 1110MHz on stock volts; is that decent? ASIC score of 77.7%.


----------



## Mr357

Quote:


> Originally Posted by *MrStick89*
> 
> One $550 gfx card just isn't enough these days.. Got my 290x running at 1110mhz on stock volts is that decent? asic score of 77.7%


From what I've seen, every 290X can do 1100 on stock volts. See if you can do 1150, or better yet 1200.


----------



## Arizonian

Quote:


> Originally Posted by *MrStick89*
> 
> One $550 gfx card just isn't enough these days.. Got my 290x running at 1110mhz on stock volts is that decent? asic score of 77.7%


1100 MHz / 1300 MHz stable in any game on stock volts, yes. Some games will allow 1150 / 1350 and others won't, and benching will vary too. I'm sure when we get voltage control we'll be able to add a tad more to get a little higher. Stock air has its limits; aftermarket and non-reference coolers will open the door a bit more, but nothing compared to what our water coolers will be seeing. They are just starting to show us numbers and fine-tune things.


----------



## _Killswitch_

Been fighting with myself over SLI 780s or CF 290Xs, but after seeing the performance of the 290, which beats the 780 in most games (and where it doesn't, the 780 doesn't beat it by much), I have come to the conclusion that, as much as I like Nvidia cards, CF 290s are the better choice for price and performance.

Just hope the new Aquacomputer waterblock for the 290X will fit the 290s as well, or I'll be a sad panda.


----------



## TamaDrumz76

Any confirmation on which RAM chips the 290s are using (and whether it varies by manufacturer)? Earlier in the thread someone posted findings for the 290X (though MSI was "unknown"), but it seemed like only a guess that the whole 290 series would use Elpida. Any confirmation whether that's true or not?


----------



## PillarOfAutumn

Aren't the 290 and the 290x similar in layout?


----------



## Arizonian

Quote:


> Originally Posted by *_Killswitch_*
> 
> Been fighting with myself over SLI 780's or CF 290X's..but after seeing the performance of 290 that beats the 780 in most games except a few but 780 doesn't beat the 290 by a lot or any thing...I have come to conclusion that as much as I like Nvidia card's..CF 290's are better choice price and performance.
> 
> Just hope the new Aquacomputer waterblock for the 290x will fit the 290's as well or I'm be a sad panda


Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Aren't the 290 and the 290x similar in layout?


Should be same exact reference boards minus some stream processors and texture units. I believe EK water blocks are selling the same one for both. Can anyone who looked into water cooling confirm this for a 290 owner?


----------



## tsm106

Quote:


> Originally Posted by *TamaDrumz76*
> 
> Any confirmation on which RAM chips the 290's are using (and if it varies by manufacturer)? Earlier in thread someone posted the findings on 290x (though MSI was "unknown"), but only seemed like a guess that all 290 series would use Elpida... Any confirmation whether that's true or not?


Mem chips don't matter, and it's not like you have control over it anyway.

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Aren't the 290 and the 290x similar in layout?


C'mon ppl, pay attention. The 290 and 290X are the same PCB, just like with Cayman, minus the intended spec differences.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Arizonian*
> 
> Should be same exact reference boards minus some stream processors and texture units. I believe EK water blocks are selling the same one for both. Can anyone who looked into water cooling confirm this for a 290 owner?


Yup. TechPowerUp recently published an article stating that the 290x EK block is also compatible with the 290.


----------



## MrStick89

http://www.3dmark.com/3dm11/7437792

Looks like this is all I can do for now on stock volts: 1110MHz/1498MHz. I haven't really messed with the memory clock, just guessed and did a run.


----------



## ImJJames

Quote:


> Originally Posted by *MrStick89*
> 
> http://www.3dmark.com/3dm11/7437792
> 
> Looks like this is all I can do for now on stock volts. 1110mhz/1498mhz haven't really messed with mem clock just guessed and did a run.


You should use the latest 3DMark.


----------



## Tom1121

Got my two 290s coming; will post the picture Thursday to enter the club. BTW, does anyone know if the 290s are eligible for the Never Settle bundle?


----------



## Arizonian

Quote:


> Originally Posted by *golemite*
> 
> Just picked up from will call..
> 
> 
> Spoiler: Warning: Spoiler!


Congrats man - added


----------



## Forceman

Quote:


> Originally Posted by *Tom1121*
> 
> Btw, does anyone know if the 290's are eligible for the Never Settle bundle?


No, they aren't.


----------



## SonDa5

Good to see some new 290 owners. I think you guys are getting a great card at a great price. Good luck with the overclocking.


----------



## utnorris

Quote:


> Originally Posted by *Jared Pace*
> 
> Because your actual voltage isn't 1.41v, it's 1.29v.


I thought, based on your thread, that AIDA showed pretty close to the actual voltage. I kept raising it until it showed me at 1.41v. Are you saying I have even more headroom?
Quote:


> Originally Posted by *Arm3nian*
> 
> Sounds like it might have been damaged during shipment?
> If he was on ASUS bios then yeah. Still should be able to push a little more imo.


I am on the Asus BIOS, but it is the PT2 BIOS, which I thought was a voltage tweak without vdroop. It is no longer up, so I cannot confirm, but if I still have headroom to overclock I will give it a try.


----------



## HighTemplar

Has anyone tried "The Mod" on one of these GPUs? Do they need a shim? I might just slap a 620 on it and blast it until I get a proper block.

Probably will be ordering 2 shortly to replace my 780 Classifieds.


----------



## Forceman

Quote:


> Originally Posted by *HighTemplar*
> 
> Has anyone tried "The Mod" on one of these GPUs? Do they need a shim? I might just slap a 620 on it and blast it until I get a proper block.
> 
> Probably will be ordering 2 shortly to replace my 780 Classifieds.


No shim required. Not sure why you'd want to replace two classys though.


----------



## Sgt Bilko

So I'm thinking about replacing the fans on my Accelero Xtreme III, any suggestions?

I should add that noise isn't a big concern for me.


----------



## Forceman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> So im thinking about replacing the fans on my accelero xtreme III, any suggestions?
> 
> i should add that noise isn't a big concern for me


Can you change the fans on that? I thought the fans were integrated.

Looking at the pictures it doesn't look like they are interchangeable.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Forceman*
> 
> Can you change the fans on that? I thought the fans were integrated.
> 
> Looking at the pictures it doesn't look like they are interchangeable.


They are, but I would need to take the whole fan assembly off; I don't have an issue with that.

I just want these VRMs to stay a bit colder... hmm, might go for water cooling...


----------



## Clockster

Guys can we use the latest MSI afterburner to set fan profile with these cards?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Clockster*
> 
> Guys can we use the latest MSI afterburner to set fan profile with these cards?


Yes you can


----------



## Connolly

Can anyone tell me which company has the better warranty service between XFX and Sapphire? And also, do you void your warranty for both if you add a water system or change the stock cooler in any way? I can get the Sapphire 290 for £319 or the XFX 290 for £335.


----------



## Sazz

Quote:


> Originally Posted by *Connolly*
> 
> Can anyone tell me which company has the better warranty service between XFX and Sapphire? And also, do you void your warranty for both if you add a water system or change the stock cooler in any way? I can get the Sapphire 290 for £319 or the XFX 290 for £335.


Technically, with Sapphire you do void your warranty if you use any other cooler on it, but they don't put any seal on it, so if you have problems with it, just pop the stock cooler back on and they won't know a thing. Unless, of course, you tell them you used a waterblock/aftermarket cooler on it, LOL.

XFX, on the other hand, does have a "Void" sticker seal on the X-bracket screws, but I believe someone told me you can register your product and ask their permission to remove the cooler to fit an aftermarket one. From what I remember, he told them specifically that he was just going to replace the thermal pads and grease because it was getting hot, and XFX let him do that.

So it's up to you to decide which way to go. XFX does have a lifetime warranty, but so far I've seen more misses from their cards than hits; on the other hand, I see people complaining about Sapphire's customer support. I've never needed to contact their support, so I can't really comment on that.

But anyway, is it just me, or is Gigabyte missing out on the action with the new AMD cards? Why the heck are they MIA on these cards? I know they are going to release the Windforce version, but I wonder why they are not getting involved with the reference release.


----------



## hotrod717

FYI, Newegg has both Battlefield and non-Battlefield Sapphire 290Xs in stock right now. Better hurry if you want one; they've been selling out quick.


----------



## Paul17041993

Gigabyte's there, you just haven't seen them for some reason...

Oh look, the 290s are on preorder at PCCaseGear for 500 AUD too...

Now I have to decide whether I want a 290X or to go 300 bucks more for two 290s, for 3x1080p that is...


----------



## Alexbo1101

Woo! My 290x is ready for pickup! Will have photo proof in 3-4 hours, when I'm done with work.


----------



## Connolly

Quote:


> Originally Posted by *Sazz*
> 
> Technically with Sapphire you do void your warranty if you use any other cooler on it, but they don't put any "seal" on it so if you have problems with it, just pop the stock cooler back and they wouldn't know a thing. unless of course you tell them you used a waterblock/aftermarket cooler on it LOL.
> 
> XFX on the other hand does have a sticker "Void" seal on the X-Bracket screws but I believe someone told me before that you can register your product and ask permission from them to remove the cooler to use an aftermarket cooler on it, but from what I've remembered he told them specifically that he was just gonna replace the thermal pads and grease on it coz its getting hot and XFX let him do that.
> 
> So its upto you now to decide which way to go, XFX do have lifetime warranty but so far I've seen more "miss" from their cards than hit on the other hand I see people complaining about Sapphire's customer support, I never needed to contact their support so I can't really comment about that.
> 
> But anyways, is it just me or is Gigabyte is missing on this action with the new AMD cards... and why the heck are they MIA on this cards? I know they are gonna release the windforce version of it but I wonder why they are not getting involved with the reference release..


Is the XFX lifetime warranty available in all countries? I'm in the U.K. and can't find anything about it, when I view the warranty info on sites such as ebuyer, scan, aria & overclockers it just says "2 years". I'm assuming this is a warranty that you register with them directly, but I can't find any information about it on their site. The warranty information section just seems standard.


----------



## Spectre-

Hi guys,
I'm currently running my 7950 and am thinking of picking up the R9 290, considering it's $490 in Australia.

Anyway, would anyone be able to tell me whether an Arctic Accelero Xtreme II will fit on the R9 290, or do I have to grab the Arctic Accelero Xtreme III?

Much appreciated for your time.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Spectre-*
> 
> hi guys
> i am currently running my 7950 and am, thinking of picking up the R9 290 considering its $ 490 in Australia
> 
> anyways would anyone be able to tell me if i can fit a Arctic Accelero 2 Extreme on the R9 290 or do i have to grab Arctic Accelero 3 extreme
> 
> much appreciated for your time


Only the Accelero Xtreme III and the Accelero Hybrid will fit the R9 290 and 290X.


----------



## Arm3nian

I thought XFX changed their policy and no longer offers a lifetime warranty?


----------



## hotrod717

They appear to have it on the 290, for $419 vs. $399 from other manufacturers. Wondering how good that warranty is, though. They had been a top seller. Seems they dropped the ball on the 79** series, though, and may want to win some business back.


----------



## Spectre-

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Only the Accelero Xtreme III and the Accelero Hybrid will fit the R9 290 and 290x


Alright, cheers mate.

Seems like it's all sold out at PC Case Gear and Mwave; any other stores anyone could recommend?


----------



## Pfortunato

From the experience I had with XFX, simply using the component you bought will void your warranty xD The brand that should make GPUs is Corsair xD

Sent from my GT-I9505 using Tapatalk


----------



## Sgt Bilko

Quote:


> Originally Posted by *Spectre-*
> 
> Alrite cheers mate
> 
> seems like its all run out at pc case gear and mwave any other stores any one could recommend


I got mine through Scorptec, but it looks like they're out at the moment as well.

http://www.scorptec.com.au/computer/45783-ac-xtrmiii


----------



## Spectre-

Quote:


> Originally Posted by *Sgt Bilko*
> 
> i got mine through scorp-tech but it looks like they are out atm as well
> 
> http://www.scorptec.com.au/computer/45783-ac-xtrmiii


Probably will have to eBay it then.

I hope I'll get them before it gets too hot.


----------



## MIGhunter

Is it going to be viable to use a 290X with a 290 in CrossFire? I know they're similar models, but will the 290 hold the 290X back?

Also, I didn't see a post after the PSU debate earlier. The guy with the 800W PSU who was planning on CrossFiring two 290Xs: did it pan out, or did you end up needing a new PSU?


----------



## Clockster

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yes you can


Yeah, I've been messing with it; the noise is bad, but my temps have dropped dramatically while gaming.







Thank God for HDJ 2000's lol


----------



## Sgt Bilko

Quote:


> Originally Posted by *Clockster*
> 
> Yeah messing with it, the noise is bad but my temps have dropped dramatically while gaming
> 
> 
> 
> 
> 
> 
> 
> Thank God for HDJ 2000's lol


I hear that. People on the other end of my mic were asking if I had a blender running next to my PC when I still had the stock cooler on, lol.


----------



## Clockster

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I hear that, People on the other end of my mic were asking me if i had a blender running next to my PC when i still had the stock cooler on lol


Yeah, I work from home a lot, as my office is quite a drive away, so I had my business partner on Skype and he was like, "why are you vacuuming while talking to me?"








Really impressed with the card, can't wait to run CF under water on Friday


----------



## Clockster

Did a quick run with a stock CPU, the GPU core @ 1050, and memory at stock.

http://www.3dmark.com/3dm11/7438661


----------



## jtjoetan

Hey guys! In my country the R9 290 sells for about $534 and the R9 290X for $723 (with a BF4 game code and a Sapphire CPU cooler, which is useless). It's freaking overpriced. So which one should I get? Or should I consider another GPU? I'd want to run BF4 on Ultra at at least 60 FPS average. Besides that, I'm concerned about the R9 290/290X heat issues; the third-party non-reference versions won't arrive until next year, and I won't be able to wait. So, yeah.
below are my specs:
CPU: AMD FX8350
Mobo: ASUS Sabertooth 990FX R2.0
RAM: Gskill trident X 4x2gb (2400mhz)
HDD: WD 1TB Caviar Blue HDD
SSD: Plextor M5S 128GB SSD
ODD: Liteon 24x Sata DVDRW
PSU: Seasonic X650
Cooler: Noctua D-14
Chassis: NZXT Phantom 410 White
GPU:


----------



## Clockster

Quote:


> Originally Posted by *jtjoetan*
> 
> hey guys! my country sells the r9 290 for about 534$ and the r9 290x for 723$ ( bf4 game and saphire cpu cooler which is useless).. its freaking overpriced.. so which 1 should i get? or should i consider getting other GPU? i would want to run ultra in bf4 at least 60fps average.. and besides that, im concern that the r9 290/290x heat issues.... the third party reference cooler will be next year.. which i wont be able to wait.. so yea..
> below are my specs:
> CPU: AMD FX8350
> Mobo: ASUS Sabertooth 990FX R2.0
> RAM: Gskill trident X 4x2gb (2400mhz)
> HDD: WD 1TB Caviar Blue HDD
> SSD: Plextor M5S 128GB SSD
> ODD: Liteon 24x Sata DVDRW
> PSU: Seasonic X650
> Cooler: Noctua D-14
> Chasis: NZXT Phantom 410 White
> GPU:


Grab an R9 290 and set a custom fan profile for it if you don't mind the noise. Alternatively, you could fit your own aftermarket cooler or go water cooling like I am. I'm not going to lie and pretend the card doesn't get hot, because it really does, but if you take the time and effort to set a decent profile and the noise doesn't bug you, it's an utterly amazing card.

I'm playing BF4 with a single card at the moment and it's as smooth as butter.
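The "custom fan profile" advice above boils down to mapping temperature points to fan duty cycles. As a rough illustration (the temperature/speed pairs are made up rather than AMD's defaults, and `fan_speed` is a hypothetical helper, not any real tool's API), a profile is just linear interpolation between user-chosen points:

```python
# Hypothetical fan-curve sketch: (deg C, % fan duty) pairs, interpolated
# linearly. This is the same shape of curve you would draw in a tuning tool.
CURVE = [(40, 20), (60, 40), (75, 60), (85, 85), (94, 100)]

def fan_speed(temp_c):
    """Return the fan duty (%) for a temperature by interpolating over CURVE."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # linear interpolation between the two surrounding points
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pin to the last point's speed above the curve

print(fan_speed(70))  # lands between the 60C and 75C points
```

A steeper curve at the top end trades noise for thermal headroom, which is exactly the trade-off being discussed in this thread.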


----------



## Spectre-

Quote:


> Originally Posted by *jtjoetan*
> 
> hey guys! my country sells the r9 290 for about 534$ and the r9 290x for 723$ ( bf4 game and saphire cpu cooler which is useless).. its freaking overpriced.. so which 1 should i get? or should i consider getting other GPU? i would want to run ultra in bf4 at least 60fps average.. and besides that, im concern that the r9 290/290x heat issues.... the third party reference cooler will be next year.. which i wont be able to wait.. so yea..
> below are my specs:
> CPU: AMD FX8350
> Mobo: ASUS Sabertooth 990FX R2.0
> RAM: Gskill trident X 4x2gb (2400mhz)
> HDD: WD 1TB Caviar Blue HDD
> SSD: Plextor M5S 128GB SSD
> ODD: Liteon 24x Sata DVDRW
> PSU: Seasonic X650
> Cooler: Noctua D-14
> Chasis: NZXT Phantom 410 White
> GPU:


R9 290

Easily the best card for its price, being only ~1-5% slower than its bigger brother.


----------



## Sgt Bilko

Agreed, the R9 290 is the way to go for you: custom cooler, small overclock, and bam........


----------



## jtjoetan

Thanks for the replies, everyone!
OK, I'll grab the non-X version. Just FYI guys, I'm not assembling the parts myself; I'm buying them at a local store and asking the staff to help me assemble them. To be honest, I'd want a quiet setup. I've read about adding aftermarket coolers, and I'm currently asking the staff whether they can do it for me.

I'm new to hardware and just don't have the confidence to build my first rig! :X I'm afraid I'll break something.


----------



## Sgt Bilko

Quote:


> Originally Posted by *jtjoetan*
> 
> Thanks for the replies everyone!
> Ok i'll grab the non-x ver. just FYI guys, im not assembling the parts myself, im buying my parts in a local store and asking the staff to help me assemble the parts.. Umm to be honest, i'd want a quiet set up.. i've read about adding the aftermarket fans, im currently asking the staff if its possible to do it for me..
> 
> Im new to hardware and i just have no confident in building my first rig! :X im afraid i'll spoil something..


Get them to install an Accelero Xtreme III or an Accelero Hybrid; much quieter and cooler than the reference cooler.


----------



## jtjoetan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> get them to install an Accelero Xtreme III or an Accelero Hybrid, much quieter and cooler than the Ref cooler


I'm not sure if my local store has them.
Btw guys, the R9 290X is too overpriced in my country and not worth my money; is that why you guys suggested I go for the R9 290?


----------



## Alexbo1101

Add please!











E: ASIC is 75.6%


----------



## Sgt Bilko

Quote:


> Originally Posted by *jtjoetan*
> 
> Im not sure if my local store has them..
> Btw guys, the r9 290x is too overpriced in my country and not worth my money, is it why u guys suggested me to go for r9 290?


If I'm honest, I would pick up 2 x 290s if I didn't get a 290X; they offer the best price/performance on the market atm.

On another note, I've only just noticed that my card is still throttling even though I've set the power limit to 50% in CCC. Any ideas about this, guys?

Quote:


> Originally Posted by *Alexbo1101*
> 
> Add please!


Congrats, how much did that cost in Denmark?


----------



## Alexbo1101

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Congrats, how much did that cost in Denmark?


Roughly $750 USD shipped


----------



## Sgt Bilko

Quote:


> Originally Posted by *Alexbo1101*
> 
> Roughly $750 USD shipped


That's not as bad as I thought it would be; it's $700 AUD for a 290X here.

I have some friends in DK and I occasionally take some parts over to them due to the stupid customs taxes and higher prices.


----------



## tobitronics

You can add me as well









Here is my gpu-z:


----------



## brazilianloser

Just ordered my 290 with next-day shipping... hopefully I'll have my little beast tomorrow.


----------



## DizzlePro

I guess I'm going to get an R9 290; any ETA on when the aftermarket coolers will come?


----------



## brazilianloser

Quote:


> Originally Posted by *DizzlePro*
> 
> i guess im gonna get a R9 290, any eta on when the aftermarket coolers like come?


There's no known news on the ETA of those yet. I don't think even non-reference cooling will be enough for these things, though... but it sure will help. I just ordered a reference one since I'll be putting it in a water loop anyway.


----------



## anubis1127

Quote:


> Originally Posted by *DizzlePro*
> 
> i guess im gonna get a R9 290, any eta on when the aftermarket coolers like come?


In a couple months.


----------



## Rar4f

Would have been nice to get an R9 290, as it's priced around a GTX 770 (when that was released). But they look very ugly and, more importantly, the fan and cooler don't seem great.
When will the Asus, Gigabyte, Sapphire, etc. R9 290s appear?


----------



## anubis1127

Quote:


> Originally Posted by *Rar4f*
> 
> Would have been nice to get a R9 290 as it's priced around a GTX 770 (when it was released). But they look very ugly and more importantly the fans and cooler dont seem great.
> *When will the Asus, Gigabyte , Sapphire etc r9 290s appear?*


They came out yesterday.


----------



## nemm

Time to officially join



Spoiler: Here







Stock air cooling for the moment, just waiting on the 2nd EK water block.

Let the testing begin


----------



## Ponycar

Quote:


> Originally Posted by *Rar4f*
> 
> Would have been nice to get a R9 290 as it's priced around a GTX 770 (when it was released). But they look very ugly and more importantly the fans and cooler dont seem great.
> When will the Asus, Gigabyte , Sapphire etc r9 290s appear?


Aftermarket coolers usually take at least a month.


----------



## Moustache

Quote:


> Originally Posted by *jtjoetan*
> 
> Im not sure if my local store has them..
> Btw guys, the r9 290x is too overpriced in my country and not worth my money, is it why u guys suggested me to go for r9 290?


Afaik, they're mostly available online, so most stores don't have them.

I still believe the 290 is a better choice than the 290X. The 290 is $150 cheaper and performs very close to the 290X; if you overclock it, it could surpass the 290X very easily. Btw, mind telling me what your country is?


----------



## Scotty99

I find it very odd that every review site has basically reviewed the trash heatsink and not the card. Sure, they mention how amazing the 290's price/performance is, but the overarching sentiment is that it's too loud and hot. I'm speaking of AnandTech, Tom's Hardware, etc.; most review sites do give it positive reviews, but they have to go on about the heatsink because that's what they have in front of them. It's AMD's fault, not the reviewers'.

It's almost like doing a car review and, instead of listing all the positives, droning on about the lack of cup holders or something.

Two things I take away from the R9 290s:

1. What in the heck was AMD thinking, releasing these cards with the absolute trash that is the reference cooler? If they had spent just a little more money on R&D, the R9 series would have been an absolute smash success (the only thing left to complain about would be power consumption, which, while a bit high, isn't ridiculous). Look at the Titan cooler: if these cards had something of that caliber, it would have changed the entire feeling of this launch. Way too much time has been spent discussing temps and noise that could have been avoided entirely.

2. Isn't this an absolute slap in the face to people who bought the 290X just a couple of weeks ago? I realize most people went in knowing the 290 would be the better price/performance card, but come on; these are far too close in performance for the price gap to be that large. It's also way too soon for a gap that big: a $400 290 would have made sense six months down the road, but two weeks? I'm sure at least some 290X buyers feel this way; others couldn't care less because they only buy the best regardless of price/performance.


----------



## MrStick89

^^^^ I feel like I wasted a lot of money on the 290X... wish I had waited... I'll probably return it. It does throttle below 800 MHz playing BF4, so...


----------



## Deisun

I have an R9 290 on the way. My question is this: when you guys mention aftermarket coolers, it seems some people are referring to non-reference video cards and some people are referring to an actual cooler you can buy for your reference card.

Is it safe to assume both will exist and I will be able to find a good cooler for my reference 290?


----------



## rdr09

Quote:


> Originally Posted by *Deisun*
> 
> I have a R9 290 on the way. My question is this: When you guys make a reference to after market coolers, it seems some people are referring to non-reference video cards and some people are referring to an actual cooler you can buy for your reference card.
> 
> Is it safe to assume both will exist and I will be able to find a good cooler for my reference 290?


They're already available: Accelero and Gelid for air. Waterblocks are available too... well, not really; they're still hard to get hold of.

I bought a 290 too, and it's on the way. Although I feel good about its price, I sure as hell wish it had the letter X. Fortunately, I don't include letters in my sigs.


----------



## anubis1127

Quote:


> Originally Posted by *Deisun*
> 
> I have a R9 290 on the way. My question is this: When you guys make a reference to after market coolers, it seems some people are referring to non-reference video cards and some people are referring to an actual cooler you can buy for your reference card.
> 
> Is it safe to assume both will exist and I will be able to find a good cooler for my reference 290?


Aftermarket coolers already exist. The Prolimatech MK-26, Gelid Icy whatever, and Arctic Accelero Xtreme III have all been fitted to a 290X/290 PCB. Waterblocks, while in short supply, are being produced by at least EK and Aquacomputer, probably more.


----------



## ABD EL HAMEED

Quote:


> Originally Posted by *Scotty99*
> 
> I find it very odd that every review site has basically reviewed the trash heatsink and not the card. Sure they mention how amazing the 290 price/performance is but the over arcing sentiment is that its too loud/hot. I speak of anand/tomsharware etc, most review sites do give it positve reviews but they have to banter on about the heatsink because thats what they have in front of them. Its amd's fault not the reviewers.
> 
> Its almost like doing a car review and instead of listing all of the positives they drone on about lack of cup holders or something.
> 
> Two things i take away from r9 290's:
> 
> 1. What in the heck is amd thinking releasing these cards with the absolute trash that is the reference cooler? If they would have spent just a little more money on R+D the r9 series would have been an absolute smash bang success (only thing they could complain about is power consumption, while a bit high its not ridiculous high). Look at the titan cooler, if these cards had something on the magnitude of that it would have changed the entire feeling of this launch, way too much time spent discussing temps and noise when it could have been avoided 100%.
> 
> 2. Is this not a absolute slap in the face to people that bought the 290x just a cpl weeks ago? I realize most people went in knowing the 290 would be the better price/performance card but come, these are far too close in performance for the price gap to be that large. Also its way too soon for a price gap that large, a 400.00 dollar 290 would have made sense 6 months down the road but....2 weeks? I am 100% sure that at least some of the 290x buyers feel this way, others could care less cause they only buy the best no matter of price/performance.


Some people just want the latest and greatest; it's really that simple.


----------



## Rar4f

Let me just say, and I don't mean to sound like a fanboy, but AMD > Nvidia as of now.
When the 770 was released, it cost around the same as an R9 290, which beats the GTX 770 by far. Hell, even the R9 280X, which is priced a lot less than the GTX 770, beats the 770 now and then, and at 1440p.

Nvidia needs to stop being so pricey. Probably why they didn't get the deal with the console developers.


----------



## SpewBoy

That stuff I said earlier about my CrossFire setup not working properly (second GPU at virtually idle clocks, etc.) wasn't because I was running PCIe 2.0; it was because I was running the benchmarks in windowed mode. I haven't been with AMD since my unlocked 6950 (and even then I never ran CrossFire, only SLI) and didn't realise CrossFire doesn't work properly in windowed mode.

Both cards now seem to be operating fine, and they don't squeal during gaming, just while benching (though I haven't tested many games yet).

Does anyone know if AMD just kind of drops CrossFire support for older games? I play a lot of Black Ops 1, and despite it having a CrossFire profile (added sometime in late 2010), it seems to have absolutely no CrossFire support. Also, the refraction shader used in explosions and heat effects seems to completely cripple AMD GPUs, dropping from the 125 FPS cap to 40, obviously making the game unplayable for a second due to the stutter.

After exams I'll grab BF4, Ghosts, Crysis 3, the latest Metro title and see how these babies really perform. By then I'll have my blocks (which just shipped today).


----------



## Deisun

Quote:


> Originally Posted by *Rar4f*
> 
> Let me just say, and i dont mean to sound like a fanboy, but AMD> Nvidia as of now.
> When 770 was released it cost around same price as a R9 290, which owns GTX 770 by far. Hell even R9 280X which was priced ALOT LESS than GTX 770 is beating 770 up now and then, and at 1440p.
> 
> Nvidia needs to stop being so pricy. Prolly why they didn't get the deal with console developers.


Competition is a beautiful thing and Nvidia will have to adjust unless they want to continually lose sales. It's a good thing for us as consumers. I will support whichever side gives me the better deal, period. No ties to anybody. In this case, AMD is getting my money.


----------



## altsanity

Quote:


> Originally Posted by *Deisun*
> 
> Competition is a beautiful thing and Nvidia will have to adjust unless they want to continually lose sales. It's a good thing for us as consumers. I will support whichever side gives me the better deal, period. No ties to anybody. In this case, AMD is getting my money.


I totally agree. With two GPU companies slugging it out on price and performance, the only winner is ALL OF US. A few months back, the only way to get Titan-like performance from a single card was to cough up $1000 for a Titan. Now, with the 290X, there are options at a fraction of the cost. Even the 780s are taking a price cut. Everyone wins.

Also, my waterblock from EK is shipping out right now, and my supplier expects to get stock of the 290X on Friday. It's a race between the card and the block... who will win?


----------



## pompss

Guys,
beware of buying the MSI 290X, or any reference card, if you want to keep the stock cooler.

I just removed the stock cooler and found that the chip has some rust spots.








Also, the heatpipe surface where the GPU makes contact is scratched very badly.








For AMD's top reference card, I'm very disappointed; this kind of thing should never happen on a $600 video card.
I come from a GTX 780, and Nvidia's quality is worth every cent.
I contacted MSI about this; let's see what they say.


----------



## brazilianloser

Quote:


> Originally Posted by *pompss*
> 
> guys
> beware from buying msi 290x or any reference card if you wanna keep the stock cooler.
> 
> i just remove the stock cooler and find out that the chip has some rust spots
> 
> 
> 
> 
> 
> 
> 
> 
> Also the heatpipe where the gpu is intouch is scratched very badly
> 
> 
> 
> 
> 
> 
> 
> 
> For the top amd reference card i am very dissapointed. this kind of think should never happen for a 600 dollars video card.
> Come from gtx 780 and nvidia si quality wroth every cent.
> Contacted MSi about this let see what they say


Let's hope they don't pull the "you removed the cooler, so it's your fault and the warranty is void" card... :/

Yeah, when I decided to switch over by trying to get a 280X, I asked friends, and everyone seemed to dislike MSI when it comes to GPUs.
I guess I'll take a peek anyway when my Asus arrives.


----------



## pompss

Quote:


> Originally Posted by *brazilianloser*
> 
> Lets hope they don't pull the you removed the cooler so its your fault and the warranty is void card... :/
> 
> Yeah when I decided to switch over by attempting to get the 280x I asked friends and everyone seem to dislike MSI when it comes to gpu.
> I guess I will take a peak anyways when my Asus arrives.


I always buy EVGA cards.
I didn't give them my part number or serial; I just contacted them and sent a picture.
I'm curious to see what they'll say about this.
For sure, I will never buy an MSI card again.


----------



## anubis1127

Is there rust on the GPU core? Or are you just talking about that bit of oxidation on the reference cooler?


----------



## pompss

Quote:


> Originally Posted by *anubis1127*
> 
> Is there rust on the GPU core? Or are you just talking about that bit of oxidation on the reference cooler?


Yes, on the side of the GPU core, not in the center.
I don't have pics because I've already put the water block on, and maybe it doesn't affect the GPU's performance, but to me they look like rust spots; the color tells me that.


----------



## Rar4f

Anyone have any idea when the non-reference R9 290s will be out? I really need to know, as it's vital to my decision. And what prices can I expect?


----------



## sch010

Quote:


> Originally Posted by *Rar4f*
> 
> Anyone have any idea when non reference R9 290 will be out? I really need to know as its vital to me making a decision. And what prices can i expect?


Like everybody else has said, it'll probably be at least 3 weeks to a month, possibly longer. Pricing is an open question; usually it's not too steep (~$10-20) for a non-ref cooler on the reference PCB.


----------



## anubis1127

Quote:


> Originally Posted by *pompss*
> 
> Yes on the side of gpu core. not on the center
> I dont have the pics because already put the water block and maybe doesnt affect the cpu performance but for me looks like rust spots. the color tells me that.


Hm, that certainly is odd; I've never heard of that. Wish you had a pic, I'd like to see it, but no worries.

Report back with what MSI says, if anything. I really doubt it's the AIB partner's fault; it's probably more AMD's, as it's their reference card and cooler. I don't think the AIB partners really do much with these.


----------



## pompss

Quote:


> Originally Posted by *anubis1127*
> 
> Hm, that certainly is odd, never heard of that. Wish you had a pic, I'd like to see it, but no worries.
> 
> Report back with what MSI says, if anything. I really doubt its the AIB partner's fault, probably more AMD's fault as it's their reference card / cooler. I don't really think the AIB partners really do much with them.


I agree, 100% AMD's fault.


----------



## sugarhell

A scratch on a bare die? That means the chip is dead 99% of the time.


----------



## kantxcape

The 290 performs the same as the 290X? Is this a joke? I just saw the Guru3D review, and the 290X gets 2 FPS more?


----------



## rdr09

Quote:


> Originally Posted by *pompss*
> 
> Yes on the side of gpu core. not on the center
> I dont have the pics because already put the water block and maybe doesnt affect the cpu performance but for me looks like rust spots. the color tells me that.


Did you send them a pic of the rust spot, too? I just bought an MSI 290 based on my pleasant experience with MSI products (4555 and 7950).


----------



## sugarhell

Less throttling. Please, people, don't just check the graphs; read the whole review.


----------



## DampMonkey

Quote:


> Originally Posted by *kantxcape*
> 
> The 290 performs the same as the 290x? Is this a joke? Just saw guru3d review and the 290x gets 2 fps more?


The 290 has a default fan profile that allows it to run without throttling. The 290X, on the other hand, will throttle in most cases unless you increase the fan speed a bit.
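The throttling described above can be pictured as a simple feedback loop: the card holds a temperature target and steps the core clock down whenever it is exceeded, then steps back up when there is headroom. A more aggressive fan profile keeps the card under the target, so the clock never has to drop. A toy sketch (all numbers illustrative, not AMD's actual PowerTune parameters; `next_clock` is not a real driver function):

```python
# Toy model of target-temperature throttling: one control tick sheds clock
# over the target and recovers under it. Target, step and range are made up.
TARGET_C = 94
STEP_MHZ = 13
MIN_MHZ, MAX_MHZ = 727, 1000

def next_clock(clock_mhz, temp_c):
    """Return the core clock after one tick, given the current temperature."""
    if temp_c > TARGET_C:
        return max(MIN_MHZ, clock_mhz - STEP_MHZ)  # shed clock to cool down
    return min(MAX_MHZ, clock_mhz + STEP_MHZ)      # recover toward full clock

print(next_clock(1000, 95))  # over target: steps down to 987
print(next_clock(950, 90))   # under target: recovers to 963
```

Under this kind of scheme, average clocks (and therefore benchmark scores) depend directly on how well the cooler holds the card below the target, which is why the two cards' default fan profiles matter so much.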


----------



## DampMonkey

Quote:


> Originally Posted by *pompss*
> 
> Yes on the side of gpu core. not on the center
> I dont have the pics because already put the water block and maybe doesnt affect the cpu performance but for me looks like rust spots. the color tells me that.


My Sapphire had spots too. Honestly, it's nothing to be worried about.


----------



## psyside

Quote:


> Originally Posted by *kantxcape*
> 
> The 290 performs the same as the 290x? Is this a joke? Just saw guru3d review and the 290x gets 2 fps more?


Clock for clock, around a 5-10% difference.


----------



## pompss

Quote:


> Originally Posted by *DampMonkey*
> 
> My sapphire had spots too. honestly, its nothing to be worried about


I saw your benchmark where you pushed the 290X up to 1300 on the core.
You flashed the 290X with a BIOS, right?
What BIOS do you use, and which BIOS would you suggest?
Thanks


----------



## lordzed83

Well, I couldn't get it off my mind that I had thicker thermal pads on the memory than I was supposed to have. I ordered better ones, some Liquid Ultra and stuff.
I'll strip my loop and drop the 120mm rad out of it to get the pressure up and hopefully the temperatures down.

Still nothing about MSI Afterburner or Sapphire TriXX?


----------



## jerrolds

Quote:


> Originally Posted by *Sgt Bilko*
> 
> They are, but i would need to take the whole fan assembly off, i don't have an issue with that
> 
> I just want these vrm's to stay a bit colder.........hmm, might go for water cooling....


What are the VRMs at?


----------



## nemm

In the process of testing one of the cards for max oc at 1440p with stock bios and air cooling at 100%. GPUz is registering 1.211v max, 73degC with core at 1165. Core 1165 is my max stable clock before artifacts occur so now its time to start memory overclocking. Impressed so far on the overclocking front with the following results.

My 670 core 1398 memory 1856, 3dmark13 gpu score 3908
http://www.3dmark.com/fs/1059152/

Stock Sapphire 290, 3dmark13 gpu score 4772
http://www.3dmark.com/fs/1102395

Sapphire 290 with core 1155 memory 1250, 3dmark13 gpu score 5505
http://www.3dmark.com/fs/1102681

At the time of of post, memory max overclock is at 1425, testing 1500

For those interested the card asic is 79%

No signs of giving up with the memory at 1600, but the scores are showing negligible gains.

3dmark13 fse at max overclock incoming 11xx core / 15xx memory
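For context, a quick back-of-the-envelope on the scores posted above: the stock 290 is roughly 22% ahead of the overclocked 670, and the 1155/1250 overclock adds roughly another 15% on top of stock.

```python
# Relative gains from the 3DMark GPU scores quoted above.
def gain(new, old):
    """Percentage improvement of `new` over `old`."""
    return (new - old) / old * 100

print(round(gain(4772, 3908), 1))  # stock 290 vs overclocked 670 -> 22.1
print(round(gain(5505, 4772), 1))  # 1155/1250 290 vs stock 290  -> 15.4
```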


----------



## rdr09

Quote:


> Originally Posted by *nemm*
> 
> In the process of testing one of the cards for max oc at 1440p with stock bios and air cooling at 100%. GPUz is registering 1.211v max, 73degC with core at 1165. Core 1165 is my max stable clock before artifacts occur so now its time to start memory overclocking. Impressed so far on the overclocking front with the following results.
> 
> my 670 core 1398 memory 1856, 3dmark13 gpu score 3908
> http://www.3dmark.com/fs/1059152/
> 
> stock sapphire 290, 3dmark13 gpu score 4772
> http://www.3dmark.com/fs/1102395
> 
> sapphire 290 with core 1165 memory 1250, 3dmark13 gpu score 5577
> http://www.3dmark.com/fs/1102522
> 
> at the time of of post, memory max overclock is at 1425, testing 1500
> 
> for those interested the card asic is 79%


nemm, here is a GTX 780 @ 1254/3500
GPU score: 5448

http://www.3dmark.com/3dm/1476282


----------



## jerrolds

Is there an equivalent to a 10-run Intel Burn Test that can quickly test for stability with some accuracy? Right now I'm randomly hard crashing in BF4 at 1120/1400, with everything else stock, but I can't tell if it's the game or the card.







Since I have to reboot, I can't check what was happening to the card in AB.

Generally, Heaven/3DMark can run without issue but aren't actually game stable.

If a standard IBT run passes, then it's pretty good stability-wise, in my experience.

I guess for now I'll run Heaven/3DMark in demo mode... or maybe log the monitoring to a file; hopefully if it crashes I can open it up.


----------



## lordzed83

jerrolds, try FurMark at your own risk if you're not under water...


----------



## jtjoetan

Quote:


> Originally Posted by *Moustache*
> 
> Afaik, they're mostly available online so most stores doesn't have them.
> 
> I still believe that the 290 is a better choice than 290X. 290 is $150 cheaper and performs very closely to 290X. If you overclock it, it could surpass the 290X very easily. Btw, mind telling me what is your country?


Mmm, okay... thanks for the advice. I live in Malaysia.


----------



## jomama22

Quote:


> Originally Posted by *rdr09*
> 
> nemm, here is a GTX 780 @ 1254/3500
> GPU score: 5448
> 
> http://www.3dmark.com/3dm/1476282


One of my 290Xs will do 1315/1500 through FSE. GFX score: 6600

http://www.3dmark.com/fs/1098036

Valley will push 82.1 @ 1320/1700.


----------



## Mr357

Quote:


> Originally Posted by *jomama22*
> 
> One of my 290x will do 1315/1500 through fse. Gfx score: 6600
> 
> http://www.3dmark.com/fs/1098036
> 
> Valley will push 82.1 @ 1320/1700.










How many volts?


----------



## tsm106

Quote:


> Originally Posted by *pompss*
> 
> guys
> beware from buying msi 290x or any reference card if you wanna keep the stock cooler.
> 
> i just remove the stock cooler and find out that the chip has some rust spots
> 
> 
> 
> 
> 
> 
> 
> 
> Also the heatpipe where the gpu is intouch is scratched very badly
> 
> 
> 
> 
> 
> 
> 
> 
> For the top AMD reference card I am very disappointed. This kind of thing should never happen on a $600 video card.
> I came from a GTX 780, and Nvidia's quality is worth every cent.
> Contacted MSI about this, let's see what they say


That is the cooler, and it is not polished obviously. Heatpipes what? Where are these scratches on the die?


----------



## sugarhell

No heatpipes. It's a vapor chamber


----------



## tsm106

Heatpipes??


----------



## DampMonkey

Quote:


> Originally Posted by *tsm106*
> 
> Heatpipes??


Definitely don't see any heatpipes


----------



## TheSoldiet

Any word on a new MSI AB version?

When I play BF4, or any other game, my GPU usage is just crazy, going from 100% to 40% and so on all the time! What is this? (MSI AB)


----------



## pompss

Quote:


> Originally Posted by *nemm*
> 
> In the process of testing one of the cards for max oc at 1440p with stock bios and air cooling at 100%. GPUz is registering 1.211v max, 73degC with core at 1165. Core 1165 is my max stable clock before artifacts occur so now its time to start memory overclocking. Impressed so far on the overclocking front with the following results.
> 
> My 670 core 1398 memory 1856, 3dmark13 gpu score 3908
> http://www.3dmark.com/fs/1059152/
> 
> Stock Sapphire 290, 3dmark13 gpu score 4772
> http://www.3dmark.com/fs/1102395
> 
> Sapphire 290 with core 1165 memory 1250, 3dmark13 gpu score 5577
> http://www.3dmark.com/fs/1102522
> 
> At the time of this post, memory max overclock is at 1425, testing 1500
> 
> For those interested, the card's ASIC is 79%
> 
> No signs of giving up with memory at 1600, but the scores are showing negligible gain.
> 
> 3dmark13 fse at max overclock incoming 11xx core / 15xx memory


mine is 1120/1350 mhz get GPU score 5660


----------



## sugarhell

7970 and 290X cooler. No scratches. You need to clean the surface more. Also, it's not polished


----------



## hyrule4927

Hey, I saw this in TechSpot's review and had a quick question for you 290/290X owners.

"Update: Based on your feedback, I took the IceQ X2 cooler off the HIS Radeon R9 280X and stuck it on our R9 290 sample. Cooling was dramatically improved. The FurMark stress test maxed out at 76 degrees while the card never exceeded 63 degrees in Crysis 3 and Battlefield 4. So it seems as expected the board partners will be able to solve the heat issues of the reference card."

Sounds like despite the difference in die size and rotation, you can still put a 280X/7970/7950 heatsink on a 290? Can anyone confirm this? I'd be really tempted to order one now if I could use my Accelero 7970 or the Twin Frozr from my 7950 . . .

Pardon the crappy post formatting, I'm on mobile right now.


----------



## nemm

Sapphire R9 290, stock BIOS @1.211v: I managed 1155 core and 1550 memory in FSE.

3DMark13 FSE gpu score 5677
http://www.3dmark.com/fs/1102719

1165/1600 is fine in Heaven 4 at max settings @1440p, but not in FSE. In the initial test, memory at 1600 compared to 1500 showed negligible gain, but I discovered a background task had been triggered, skewing the results.

Card one tested and happy with, card 2 time.


----------



## ssgwright

well I'm thinking of making the switch, anyone know where I can pick one of these bad boys up?


----------



## $ilent

Quote:


> Originally Posted by *TheSoldiet*
> 
> Any word on a new MSI AB version?
> 
> When i play BF4, or any other game my GPU usage is just crazy. Going from 100% to 40% and so on all the time! What is this? (MSI AB)


This. I only come in here now to see if we have a new version of Afterburner.


----------



## jerrolds

Anyone know what the rated allowable temps are for the VRMs? Rage3d ( http://www.guru3d.com/articles_pages/radeon_r9_290_review_benchmarks,12.html ) pointed a thermal gun at the card and the VRMs only got up to 65C under load (GPU was at 90C+).

I remember the old 7970 VRMs were rated to go up to 120C or something crazy like that... I'm hoping that with the IceQ X2 cooler and smaller RAM sinks the VRMs will still stay below 100C while achieving a better overclock than on stock air/voltage; I'm aiming for 1200-1225MHz on the core.


----------



## kantxcape

Just got a Sapphire 290X. The card makes a really crappy coil whine noise; anyone else with this problem?


----------



## Kriant

Cards arrive on Monday








EK waterblocks arrive god knows when


----------



## Rar4f

Could anyone tell me why there are many R9 290s that look the same but have different labels on them? What makes them different from each other?

Asus r9 290 vs Gigabyte r9 290

Thank you.


----------



## jtjoetan

I'm thinking of getting into watercooling since the R9 290 temps are so damn high and the fan at 47% sounds loud to me... but is watercooling hard to maintain? My biggest fear now is that the liquid will leak and that it's hard to maintain.

2nd question: will an R9 290 be overkill for my monitor? 1920x1080 res.
http://www.amazon.com/Samsung-S22B150N-21-5-Inch-LED-Lit-Monitor/dp/B007ILDVBO/ref=sr_1_18?s=electronics&ie=UTF8&qid=1383756312&sr=1-18&keywords=samsung+led+monitor+syncmaster

tq


----------



## tsm106

Quote:


> Originally Posted by *Rar4f*
> 
> could anyone tell me why there are many r9 290s that look the same but have different labels on them? What makes them different from eachother?
> 
> Asus r9 290 vs Gigabyte r9 290
> 
> Thank you.


AMD releases reference cards first, then custom ones. The reference cards are all technically the same; the only differences are the company selling them and whatever art they put on the card. Along with the difference in AIB brand, each brand carries its own warranty terms.


----------



## $ilent

Quote:


> Originally Posted by *Rar4f*
> 
> could anyone tell me why there are many r9 290s that look the same but have different labels on them? What makes them different from eachother?
> 
> Asus r9 290 vs Gigabyte r9 290
> 
> Thank you.


Just different brands. They are all the same card underneath, all run at same clock speed. Some companies offer different warranty.


----------



## Arizonian

Quote:


> Originally Posted by *Alexbo1101*
> 
> Add please!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> E: ASIC is 75.6%


Quote:


> Originally Posted by *tobitronics*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> You can add me as well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here is my gpu-z:
> 
> 
> Spoiler: Warning: Spoiler!


Quote:


> Originally Posted by *nemm*
> 
> Time to officially join
> 
> 
> 
> Spoiler: Here
> 
> 
> 
> 
> 
> 
> 
> Stock air cooling for the moment, just waiting on the 2nd EK water block.
> 
> Let the testing begin


Wow trifecta before I woke up this morning. Congrats to all three - Added


----------



## deathlikeeric

Hey guys, should I sell my two 7950s and get an R9 290 with a waterblock? Or keep my 7950s?
I play at 2560x1440 and am maybe planning on adding another monitor


----------



## Kriant

Quote:


> Originally Posted by *jtjoetan*
> 
> I'm thinking of getting into watercooling since the R9 290 temps are so damn high and the fan at 47% sounds loud to me... but is watercooling hard to maintain? My biggest fear now is that the liquid will leak and that it's hard to maintain.
> 
> 2nd question: will an R9 290 be overkill for my monitor? 1920x1080 res.
> http://www.amazon.com/Samsung-S22B150N-21-5-Inch-LED-Lit-Monitor/dp/B007ILDVBO/ref=sr_1_18?s=electronics&ie=UTF8&qid=1383756312&sr=1-18&keywords=samsung+led+monitor+syncmaster
> 
> tq


1. WC is not hard to maintain.
2. WC is fun (like Lego 0_o).
3. Not overkill IMHO. Newer games (Thief 4, etc.) will put it to good use.


----------



## Arizonian

Oh, on a side note: everyone is asking, so here is where I read about the non-reference release date possibility.

*Source - TechPowerUP!*
Quote:


> According to the report, AMD's AIB partners will launch R9 290X graphics cards *with custom-design air- and liquid-cooling solutions by late November, 2013.*


Crossing fingers no delay.


----------



## DampMonkey

Quote:


> Originally Posted by *Kriant*
> 
> 1. WC is not hard to maintain.
> 2. WC is fun (like Lego 0_o).
> 3. Not overkill IMHO. Newer games (Thief 4, etc.) will put it to good use.


1. Debatable. Flushing loops frequently, cleaning, and making hardware changes can be a pain. But other than that, it can be simple if you build your loop to accommodate changes.

2. Very fun! The customization that goes into building a loop is incredible. Picking out all the parts and making it your own is probably the best part









Also, leaking is pretty difficult to do unless you really screw up somewhere, and then it's the builder's fault. I don't think I've seen a hardware failure resulting in a leak anywhere in recent years. Can anyone attest to that?


----------



## Exostenza

Any idea on when the R9 290 cards are going to come out with third party coolers? I always wait to buy my cards with custom coolers... specifically Gigabyte WF3!


----------



## jtjoetan

Quote:


> Originally Posted by *Kriant*
> 
> 1. WC is not hard to maintain.
> 2. WC is fun (like Lego 0_o).
> 3. Not overkill IMHO. Newer games (Thief 4, etc.) will put it to good use.


Thanks for the reply. The thing is, I'm a complete newbie to hardware stuff, so I probably won't be the one assembling the parts. I'm just too afraid I'll screw something up..


----------



## Raxus

Quote:


> Originally Posted by *DampMonkey*
> 
> 1. Debateable. Flushing loops frequently, cleaning, and making hardware changes can be a pain. But other than that, it can be simple if you build your loop to accommodate
> 
> 2. Very fun! The customization that goes into building a loop is incredible. Picking out all the parts and making it your own is probably the best part
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, leaking is pretty difficult to do unless you really screw up somewhere. Then it's the builders fault. I don't think ive seen hardware failure resulting in a leak anywhere in recent years. Can anyone attest to that?


I've been gathering parts for a water cooling loop the past couple of weeks for my 290x, this gives me more confidence! Thanks!


----------



## Newbie2009

Quote:


> Originally Posted by *jtjoetan*
> 
> thx for the reply. The thing is, im a complete newbie to hardware stuff.. So probably i wont be the one assembling the parts.. im just too afraid i'll screw up something..


There are a tonne of newbie WC guides online, from loops to fitting GPU blocks. Just make sure you have all the tools needed and take it slowly once you have done your research.


----------



## Alexbo1101

Anyone else crashing like there's no tomorrow in BF4?


----------



## Raxus

Quote:


> Originally Posted by *Alexbo1101*
> 
> Anyone else crashing like there's no tomorrow in BF4?


I did until I updated to the 13.11 beta 8 drivers, seemed way more stable for me.


----------



## nemm

Second card tested; not as good as the first, but still happy.

1st: 1155 core / 1550 memory @1.211v ASIC 79%, 3Dmark13 FSE 5677
http://www.3dmark.com/fs/1102719

2nd: 1115 core / 1550 memory @1.227v ASIC 74.2%, 3DMark13 FSE 5555
http://www.3dmark.com/fs/1102971

Now I wait for the 2nd water block and voltage control :/


----------



## jomama22

Quote:


> Originally Posted by *DampMonkey*
> 
> 1. Debateable. Flushing loops frequently, cleaning, and making hardware changes can be a pain. But other than that, it can be simple if you build your loop to accommodate
> 
> 2. Very fun! The customization that goes into building a loop is incredible. Picking out all the parts and making it your own is probably the best part
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, leaking is pretty difficult to do unless you really screw up somewhere. Then it's the builders fault. I don't think ive seen hardware failure resulting in a leak anywhere in recent years. Can anyone attest to that?


One word: quick disconnects. I have 10 pairs in my 2-loop system. Though you need some pumping power, as they do create restriction, they are a godsend for taking out parts (aka 6 290Xs). I don't even drain the loop while binning: take out the card, drain it, fill up the new card, put the QDCs on the ends to cap it, and refit it. You may need to add 4 oz of water for the tiny bit of air left in the new card.


----------



## Rar4f

Quote:


> Originally Posted by *Arizonian*
> 
> Oh on a side note everyone is asking but here was where I read regarding non-reference release date possibility.
> 
> *Source - TechPowerUP!*
> Crossing fingers no delay.


You think this will apply for R9 290 as well?


----------



## tsm106

Quote:


> Originally Posted by *jomama22*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DampMonkey*
> 
> 1. Debateable. Flushing loops frequently, cleaning, and making hardware changes can be a pain. But other than that, it can be simple if you build your loop to accommodate
> 
> 2. Very fun! The customization that goes into building a loop is incredible. Picking out all the parts and making it your own is probably the best part
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, leaking is pretty difficult to do unless you really screw up somewhere. Then it's the builders fault. I don't think ive seen hardware failure resulting in a leak anywhere in recent years. Can anyone attest to that?
> 
> 
> 
> one word: quick disconnects. i have 10 pairs in my 2 loop system. though you need some pumping power as they do create restriction, they are a godsend for taking out parts (aka 6 290x). I dont even drain the loop while binning, take out the card, drain it, fill up the new card and put the qdc on the ends to cap it, re fit it. *May need to add 4 oz water for the tiny bit of air left in the new card.*

Here's a tip: you just take off a QDC and fill her up. Put the QDC back on and it's only a few drops lost. The amount of air you introduce back into the loop from a block change is then minimal.


----------



## Arizonian

Quote:


> Originally Posted by *Exostenza*
> 
> Any idea on when the R9 290 cards are going to come out with third party coolers? I always wait to buy my cards with custom coolers... specifically Gigabyte WF3!


If it fits the 290X it fits the 290, and third-party coolers are out. Two members here are using Accelero Xtreme IIIs on their 290X. It needs a few more heat sinks from an extra packet, but it's doable.

Quote:


> Originally Posted by *Rar4f*
> 
> You think this will apply for R9 290 as well?


I'm going to bet two weeks after the 290X, to get some more sales. Let early adopters pay for the extra stream processors / texture units, then come out with a lower-priced card and clean up on those who didn't.


----------



## rdr09

Quote:


> Originally Posted by *jomama22*
> 
> One of my 290x will do 1315/1500 through fse. Gfx score: 6600
> 
> http://www.3dmark.com/fs/1098036
> 
> Valley will push 82.1 @ 1320/1700.


it gets lonely at the top of the R9s. tsm and others will be with you shortly.


----------



## Rar4f

Quote:


> Originally Posted by *Arizonian*
> 
> If it fits the 290X it fits the 290, and third-party coolers are out. Two members here are using Accelero Xtreme IIIs on their 290X. It needs a few more heat sinks from an extra packet, but it's doable.
> I'm going to bet two weeks after the 290X, to get some more sales. Let early adopters pay for the extra stream processors / texture units, then come out with a lower-priced card and clean up on those who didn't.


That's quite late for my plans. I guess I may have to go without a GPU.


----------



## $ilent

Arizonian, I posted the 780 Ti overclocking results here - http://videocardz.com/47690/nvidia-geforce-gtx-780-ti-overclocking-exposed


----------



## Alexbo1101

Quote:


> Originally Posted by *Raxus*
> 
> I did until I updated to the 13.11 beta 8 drivers, seemed way more stable for me.


I'm already on beta 8, but I crash almost every 5 minutes in SP and half-way through an MP match.


----------



## pk7677

Hey guys, will an i5 2500K @ 4.5GHz bottleneck 2 290s in crossfire? My resolution is 1440p.


----------



## esqueue

Quote:


> Originally Posted by *Alexbo1101*
> 
> I'm already on beta 8, but i crash almost every 5 minutes in SP and half-way through a MP match.


I crashed once after hours of playing BF4. I don't have any plans on ever playing online, so I don't know how stable it would be there. I've been using the beta 8 drivers with everything at stock speeds.


----------



## nemm

For those Sapphire owners awaiting voltage control with Trixx, I had a reply from tech support regarding the new release to support the 290s: "It should be available by December." I guess mine will get the flash treatment as soon as they're under water.


----------



## jerrolds

Afterburner normally supports reference cards from other manufacturers right? Asus/Sapphire/etc? Hopefully AB beta 16 or whatever will be out soon.


----------



## jrcbandit

From what some people are posting, it seems like 290s can potentially overclock better than the 290X? So why did I pay a difference of $150?? Without altering voltages, I couldn't go higher than 1110 core on my 290X, and putting it at 1.299V (1.35V with vdroop) I can only get 1190 core and 1375 memory, which is rather underwhelming, especially the memory overclock.


----------



## nemm

Quote:


> Originally Posted by *jerrolds*
> 
> Afterburner normally supports reference cards from other manufacturers right? Asus/Sapphire/etc? Hopefully AB beta 16 or whatever will be out soon.


Usually they do, so I'm hoping beta 17 will be the one to support them


----------



## jtjoetan

To the R9 290 Sapphire owners: how are your cards running on BF4 ultra? Does it need 40% to get to an average of 60fps?
Still in a dilemma about getting one; I wouldn't want to waste my precious 2-month school holiday







.. the non-reference cards are estimated to be out by Dec..









btw guys, I've decided to stay away from water cooling, as it is over my budget and can be confusing for a beginner like me....


----------



## $ilent

Quote:


> Originally Posted by *pk7677*
> 
> Hey guys will a i5 2500k @ 4.5ghz bottleneck 2 290s in crossfire? my resolution is 1440p.


I would hazard a guess and say yes unfortunately









Sorry


----------



## Haebyun

I assume the waterblocks of the 290x should fit the 290 right?


----------



## $ilent

A few people have said yes. The cards are identical I think, just with some shaders lasered out


----------



## Gilgam3sh

NEED HELP!

I just installed my Sapphire 290 and I've got BIG PROBLEMS with the drivers and Catalyst Control Center. I cleaned everything using ATIman Uninstaller, everything fine, then installed 13.11 BETA8. The problem is I can't see Catalyst Control Center at all, like it was never installed. I cleaned the drivers again and same thing, no CCC. I'm using Windows 8.1 Pro x64; everything was fine with my HD 7970s in Crossfire.

anyone??


----------



## brazilianloser

Quote:


> Originally Posted by *Haebyun*
> 
> I assume the waterblocks of the 290x should fit the 290 right?


Yes, my good sir. The EK ones at least are made for both cards, since they are basically the same in shape, size, and location of all components, according to the EK guys.


----------



## jerrolds

Quote:


> Originally Posted by *Gilgam3sh*
> 
> NEED HELP!
> 
> I just installed my Sapphire 290, I GOT BIG PROBLEMS with the drivers and Catalyst Control Center, I cleaned everything using ATIman uninstaller, everything fine, then installed 13.11 BETA8, problem is I can't see Catalyst Control Center at all, like it have never been installed, I cleaned the drivers again and same thing, no CCC, I'm using Windows 8.1 Pro x64, everything was fine with my HD7970 in Crossfire.
> 
> anyone??


Uninstall everything, then delete the Temp folders, C:\Windows\Temp and C:\Users\Temp or whatever - ATIman Uninstaller doesn't work with 8.1 (if I remember correctly).

You can also use the Display Driver Uninstaller (DDU) from the GeForce forums - it works on AMD as well (I've used it on Win 8 in Safe Mode with good results): https://forums.geforce.com/default/topic/550192/geforce-drivers/display-driver-uninstaller-ddu-v9-1/
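If you're nervous about the temp-folder step, you can rehearse it as a dry run first: list what each folder contains before deleting anything. A quick Python sketch of that check; nothing here deletes files, and the %WINDIR%/%TEMP% paths only resolve on Windows (elsewhere it falls back to the system temp dir):

```python
# Dry run of the temp-folder cleanup step: show each temp directory and how
# many leftover entries it holds, without deleting anything.
import os
import tempfile

def candidate_temp_dirs():
    """Return the temp directories that exist on this machine, no duplicates."""
    dirs = [
        os.path.expandvars(r"%WINDIR%\Temp"),  # C:\Windows\Temp on Windows
        os.path.expandvars(r"%TEMP%"),         # per-user temp folder on Windows
        tempfile.gettempdir(),                 # portable fallback
    ]
    seen, existing = set(), []
    for d in dirs:
        # %VAR% stays literal on non-Windows systems, so isdir() filters it out
        if os.path.isdir(d) and d not in seen:
            seen.add(d)
            existing.append(d)
    return existing

for d in candidate_temp_dirs():
    print(d, "->", len(os.listdir(d)), "entries")
```

Once you're happy with what's listed, clear the folders by hand as described above.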


----------



## jomama22

Gotta say, the average water OC I'm seeing right now on the asus.rom BIOS (1.299 max (1.412 set), though mostly 1.26-1.29 actual) is ~1230-1270... so 1250 or so. That is after 5 of 6 cards being thoroughly tested. 1300 stable is actually hard to reach, even with 1.4V. At 1.35 (actual volts, not necessarily what GPU Tweak says), 1280 seems to be the number right now.

Memory-wise, 1600-1700 is attainable. Hynix does seem to matter on water, as the 1700s are all Hynix while Elpida runs lower.


----------



## psyside

Quote:


> Originally Posted by *jomama22*
> 
> Gotta say, the average water oc im seeing right now on the asus.rom bios (1.299 max(1.412 set), though mostly 1.26-1.29 actual) is ~1230-1270...so 1250 or so. That is after 5 of 6 being thoroughly tested. 1300 stable is actually hard to reach, even with 1.4v. @ 1.35 (actual volts, not necessarily what gpu tweak says) 1280 seems to be the number right now.
> 
> Memory wise, 1600-1700 is attainable, hynix does matter on water it seems as the 1700s are all hynix while elpidia is lower.


1. Are there any 290X cards with Hynix (100%)?

2. Are there any 290s with Hynix?


----------



## Mr357

Quote:


> Originally Posted by *Gilgam3sh*
> 
> NEED HELP!
> 
> I just installed my Sapphire 290, I GOT BIG PROBLEMS with the drivers and Catalyst Control Center, I cleaned everything using ATIman uninstaller, everything fine, then installed 13.11 BETA8, problem is I can't see Catalyst Control Center at all, like it have never been installed, I cleaned the drivers again and same thing, no CCC, I'm using Windows 8.1 Pro x64, everything was fine with my HD7970 in Crossfire.
> 
> anyone??


Maybe try using Driver Sweeper in safe mode instead of ATIman? Probably won't change anything, but it's worth a shot.


----------



## Gilgam3sh

Quote:


> Originally Posted by *jerrolds*
> 
> Uninstall everything, delete Temp folders C:\Windows\Temp and C:\Users\Temp or whatever - ATI Man uninstaller doesnt work with 8.1 (if i remember correctly)
> 
> You can also use the Geforce Uninstall Utility - it works on AMD as well (ive used it on Win 8 in Safe Mode with good results) https://forums.geforce.com/default/topic/550192/geforce-drivers/display-driver-uninstaller-ddu-v9-1/


THANKS!!!!!!!!! It works now! Removed ATIman and used Display Driver Uninstaller!

FINALLY!!!!!!!!!!


----------



## Rar4f

Is the temperature on the 290 really normal?


----------



## vettefan8

Well, since I've had my Accelero Xtreme III on my 290X for about 5 days now, I figured it was time to check out the temps in FurMark. I'm not sure what the normal procedure is, so I just did the "Burn-in test" at 2560x1440 with AA off for 30 minutes. At idle my GPU = 34C, VRM1 = 34C, and VRM2 = 36C with the fan speed at 20%. I set CCC to a max fan speed of 100% (since this cooler is extremely quiet) and a target GPU temperature of 50C. While running FurMark, the GPU got up to 55C and just stayed locked right at 55C for the entire 30 minutes. My VRM1 got up to 67C and VRM2 got up to 57C. If there is a better way that people usually run FurMark, let me know.


----------



## ImJJames

Quote:


> Originally Posted by *Rar4f*
> 
> Is the temperature on 290 really normal?


It's designed to handle 90C in normal usage.


----------



## Rar4f

Quote:


> Originally Posted by *ImJJames*
> 
> It's designed to handle 90C in normal usage.


Define normal usage?


----------



## RAFFY

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Ya, there is something totally wrong with my card. Gonna have to RMA it. The coil whine just screams; you can hear it downstairs... The GPU usage is all over the board, never staying at 100%, just up and down, and temps are totally fine. At 75% fan it never goes over 90C... I put BF3 on medium settings at 1680x1050 and it goes from 150fps to 45 and just bounces all around. Latest drivers, fresh install of Windows. Even my 6950 at high settings would run 70fps constantly, with maybe a blip into the lower 50s, never 45. I'm not looking forward to trying to find another Asus...


Quote:


> Originally Posted by *Rar4f*
> 
> Define normal usage?


At the end of the day, whether we like it or not, the R9 290X is designed to run at 95C with a low fan speed to keep the noise down. Nothing more, nothing less; that's just how it is.


----------



## $ilent

Quote:


> Originally Posted by *Gilgam3sh*
> 
> NEED HELP!
> 
> I just installed my Sapphire 290, I GOT BIG PROBLEMS with the drivers and Catalyst Control Center, I cleaned everything using ATIman uninstaller, everything fine, then installed 13.11 BETA8, problem is I can't see Catalyst Control Center at all, like it have never been installed, I cleaned the drivers again and same thing, no CCC, I'm using Windows 8.1 Pro x64, everything was fine with my HD7970 in Crossfire.
> 
> anyone??


I got that problem when I tried to install the drivers from CD, I think; had to get them from the internet. Or vice versa... can't remember, lol, one of them didn't work.

Also, look at these 290 prices - http://www.overclockers.co.uk/showproduct.php?prodid=GX-086-HS&groupid=701&catid=56&subcat=1752

£309....£309! That is actually ridiculous, I paid £480 for my 290x.


----------



## Rar4f

Quote:


> Originally Posted by *RAFFY*
> 
> At the end of the day whether we like it or not the R9 290x is designed to run at 95c with a low fan speed to keep the noise down. Nothing more, nothing less that's just how it is.


It's not that I mind the temperature; I am simply worried that it's not normal and could be potentially damaging to the GPU over a long time.
Why buy a great performance card if 2 years later it breaks down?

Got any reading material from AMD that I can read about the temperature?


----------



## jerrolds

Quote:


> Originally Posted by *$ilent*
> 
> I get that problem when I try install drivers from CD I think, had to get ut from internet. OR vice versa...cant remember lol one of them didnt work.
> 
> Also, look at these 290 prices - http://www.overclockers.co.uk/showproduct.php?prodid=GX-086-HS&groupid=701&catid=56&subcat=1752
> 
> £309....£309! That is actually ridiculous, I paid £480 for my 290x.


Yeah, I'm kinda bummed I paid $160 more for the 290X and BF4; what's worse is the higher number means more tax. I should've sold BF4 to recoup some of the cost and then picked it up during a sale or something









I'm just hoping that I can hit 1200MHz once the IceQ X2 comes in - you have the same cooler, don't you? Or are you under water?


----------



## RAFFY

Quote:


> Originally Posted by *Rar4f*
> 
> It's not that i mind the temperature, i am simply worried that it's not normal and could be potentially damaging to the gpu over long time.
> Why buy a great performance card if 2 years later it breaks down.
> 
> Got any reading material from AMD thati can read about the temperature?


I don't have any reading material, but I have seen that statement published in multiple reviews, so I'm sure there is some official AMD press release or statement that contains it or something close. I'm not worried about it at all. My cards have a 3-year warranty, and I sure as heck know in 3 years I won't be the owner of these cards. Plus, I've owned a few ATI/AMD cards in the past and they always ran hot.


----------



## brazilianloser

Quote:


> Originally Posted by *Rar4f*
> 
> It's not that i mind the temperature, i am simply worried that it's not normal and could be potentially damaging to the gpu over long time.
> Why buy a great performance card if 2 years later it breaks down.
> 
> Got any reading material from AMD thati can read about the temperature?


Most guys dropping $500+ on the 290X are folks that always want the best; I highly doubt they will keep the card for over two years. Now, the 290, which is more widely received due to its low price - yeah, that could be problematic... but you have the tools to tune the card down to hit the temperature you wish for, as shown in the Linus video, which I don't have a link to. It comes down to: are you comfortable running your card at full power or not?


----------



## Taint3dBulge

Quote:


> Originally Posted by *RAFFY*
> 
> At the end of the day whether we like it or not the R9 290x is designed to run at 95c with a low fan speed to keep the noise down. Nothing more, nothing less that's just how it is.


Fan noise isn't that big of a deal, but when your GPU starts squealing twice as loud as the fan, there is an issue. It is starting to quiet down a little, though. Just going to keep using it for a week or two and see what happens... Gonna let Heaven run overnight, see if that will fix it. Well, that is if I can sleep; I can hear this thing across the house when running FurMark, cooler and coil whine anyway. I ain't gonna fret over it anymore, thanks to everyone that has helped. I just need to figure out what to do before I order up an Accelero Hybrid; can't get rid of the card after installing that, lol.


----------



## Rar4f

Quote:


> Originally Posted by *brazilianloser*
> 
> Most guys that are dropping 500+ on the 290x are folks that always want the best. Highly doubt they will keep the card for over two years... now on the 290 which is more widely received due to its low price yeah that could be problematic... but you have the tools to tune down the card to only go so far to hit the temperature you wish for as shown on the Linus video which I do not have a link. Its about are you comfortable having you card full power or not???


I am thinking of the R9 290, not the X. It has high temperatures as well








Anyway, I don't want to reduce the performance of the card by meddling with settings. I want the best performance out of the card, and a temperature that is safe for it.
Hopefully the non-reference cards will tackle a lot of the heat, so that we are even safer.


----------



## Taint3dBulge

Quote:


> Originally Posted by *TheSoldiet*
> 
> Any word on a new MSI AB version?
> 
> When i play BF4, or any other game my GPU usage is just crazy. Going from 100% to 40% and so on all the time! What is this? (MSI AB)


Well, holy smokes, I'm not the only one that gets this..... OK, it's gotta be a driver problem. I get the same thing with GPU Tweak, yet this is the only other instance I've read about GPU usage going all over the place like mine does.


----------



## RAFFY

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Fan noise isn't that big of a deal, but when your GPU starts squealing twice as loud as the fan, there is an issue. It is starting to quiet down a little, though. Just going to keep using it for a week or two and see what happens... Gonna let Heaven run overnight, see if that will fix it. Well, that is if I can sleep; I can hear this thing across the house when running FurMark, cooler and coil whine anyway. I ain't gonna fret over it anymore, thanks to everyone that has helped. I just need to figure out what to do before I order up an Accelero Hybrid; can't get rid of the card after installing that, lol.


Squealing? I haven't noticed any crazy noises coming from my GPUs until the fan speed increases.


----------



## iPDrop

Hey guys, something kind of weird happened. I was looking at the R9 290 on Newegg on November 4th; it showed a price of $399.99 and said it was available to purchase, so I ordered two of them... but as soon as I refreshed the page it said out of stock or something, and I don't think the card was officially released until the next day, on the 5th. But I checked the order status and the package has shipped, with estimated delivery tomorrow. Did they just send me my cards early, before the release or something?


----------



## jrcbandit

Quote:


> Originally Posted by *jomama22*
> 
> Gotta say, the average water oc im seeing right now on the asus.rom bios (1.299 max(1.412 set), though mostly 1.26-1.29 actual) is ~1230-1270...so 1250 or so. That is after 5 of 6 being thoroughly tested. 1300 stable is actually hard to reach, even with 1.4v. @ 1.35 (actual volts, not necessarily what gpu tweak says) 1280 seems to be the number right now.
> 
> Memory wise, 1600-1700 is attainable, hynix does matter on water it seems as the 1700s are all hynix while elpidia is lower.


I'm on water and I can't get the memory to go above 1375 with Hynix.... Used the thermal pads that came with the EK water block. Is it necessary to get better thermal pads? Also, my core didn't even reach 1200, although I had it set to 1.35 instead of 1.41 with the Asus bios (not sure if it makes much difference with Vdroop). I used the thermal paste that came with the EK block for the GPU, should I have used MX-4 instead?


----------



## brazilianloser

Quote:


> Originally Posted by *iPDrop*
> 
> Hey guys so something kind of weird happened, I was looking at the R9 290 on newegg on November 4th and it said a price of $399.99 and said it was available to purchase, so I ordered two of them... but then as soon as I refreshed the page it was saying out of stock or something and I don't think the card was actually officially released until the next day on the 5th. But I checked the order status and the package was shipped and estimated delivery is tomorrow.. Did they just send me my cards out early before the release or something?


The Asus, I know, was available late night on the 4th... and was available from Newegg again last night as well, after being out of stock for a day. The cards are released... so you didn't get them before release if you're only receiving them tomorrow... you just got lucky and ordered ahead of most.


----------



## iPDrop

Oh sweet







I was hoping they accidentally shipped me two R9 290Xs by mistake, but oh well haha


----------



## jerrolds

Quote:


> Originally Posted by *jrcbandit*
> 
> I'm on water and I can't get the memory to go above 1375 with Hynix.... Used the thermal pads that came with the EK water block. Is it necessary to get better thermal pads? Also, my core didn't even reach 1200, although I had it set to 1.35 instead of 1.41 with the Asus bios (not sure if it makes much difference with Vdroop). I used the thermal paste that came with the EK block for the GPU, should I have used MX-4 instead?


That sucks. And temps are all in line? Maybe the power limit needs to be maxed? What's your ASIC?


----------



## brazilianloser

Quote:


> Originally Posted by *iPDrop*
> 
> Oh sweet
> 
> 
> 
> 
> 
> 
> 
> 
> I was hoping they accidentally shipped me two R9290X'S by mistake but oh well haha


Never know though lol... but yeah, the 290s have been out since Monday night... so those that got any before they ran out probably had theirs shipped yesterday.


----------



## brazilianloser

Mine is in the packaging stage according to Newegg... and I did pay for next day... so if everything goes right I should have mine tomorrow too but only one for now.


----------



## Gilgam3sh

Has anyone tried flashing a 290X BIOS on the 290 yet? It seems to be the same BIOS version on both.


----------



## Forceman

Quote:


> Originally Posted by *jrcbandit*
> 
> I'm on water and I can't get the memory to go above 1375 with Hynix.... Used the thermal pads that came with the EK water block. Is it necessary to get better thermal pads? Also, my core didn't even reach 1200, although I had it set to 1.35 instead of 1.41 with the Asus bios (not sure if it makes much difference with Vdroop). I used the thermal paste that came with the EK block for the GPU, should I have used MX-4 instead?


290 or 290X? One of the reviews mentioned a BIOS bug that may be limiting 290 memory overclocks.


----------



## the9quad

Quote:


> Originally Posted by *RAFFY*
> 
> At the end of the day whether we like it or not the R9 290x is designed to run at 95c with a low fan speed to keep the noise down. Nothing more, nothing less that's just how it is.


In my experience, the R9 290X is designed to throttle like crazy to maintain 95C while in quiet mode.
My card will never hit advertised speeds except at the very beginning of a gaming session in quiet mode. It will peg at whatever the throttle temp is, because 40% fan is way too slow, and throttle like heck to hold that temp.

Uber mode doesn't fare much better: the fan kicks up speed too late to avoid throttling, and 55% fan speed is still too slow to stay under the throttle temp, so the card still throttles most of the time, just not as far. The only fix I found was to flash to the ASUS BIOS, use GPU Tweak, and set a custom fan profile. Now I can avoid throttling and keep decent card temps at the same time, albeit with the fan at 62-65%.
That would definitely be unacceptable for some.

In my opinion, if you are not going to custom cool these cards and are not willing to put up with fan noise in the 55-65% range, then DO NOT buy these cards (290 and 290X).

If you are on stock cooling and you can game without throttling using straight-up quiet mode, or even default uber mode, please post and let me know. So far I don't think it is possible. The cooling is just inadequate.
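For anyone setting up a profile like the one described above: a custom fan profile is just a list of (temperature, fan %) breakpoints with interpolation in between. Here is a minimal Python sketch of that idea; the breakpoints are illustrative guesses, not GPU Tweak's actual defaults:

```python
# Sketch of a custom fan curve like one you might set in GPU Tweak.
# Breakpoints are illustrative assumptions, not vendor defaults:
# 30% below 50C, ramping to 65% by 85C, and 100% past 94C.
CURVE = [(50, 30), (70, 45), (85, 65), (94, 100)]

def fan_speed(temp_c, curve=CURVE):
    """Linearly interpolate fan duty (%) for a core temperature (C)."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # Interpolate between the two surrounding breakpoints.
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(40))   # 30
print(fan_speed(85))   # 65
print(fan_speed(95))   # 100
```

The point of the steeper ramp between 70C and 85C is to get the fan moving before the card reaches its throttle temperature, instead of waiting at a capped speed the way the stock profile does.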


----------



## ukaussi

Quote:


> Originally Posted by *the9quad*
> 
> ...... In my opinion if you are not going to custom cool these cards or are not willing to put up with the fan noise in the 55%-65% range than DO NOT buy these cards (290 and 290x)....
> .


...yet.

Wait for the non-reference coolers from ASUS, Gigabyte, etc. That's what I'm doing, and possibly buying 2 x 290... Black Friday/Cyber Monday sales!!


----------



## Rar4f

As long as the cooling is good, I couldn't care less about the noise. I'll just take off my headphones and say "DIDN'T HEAR LOL!"


----------



## the9quad

Quote:


> Originally Posted by *ukaussi*
> 
> ...yet.
> 
> wait for the non-reference designed coolers from ASUS, Gigabyte etc ... that is what I am doing and possibly buying 2 x 290.... Black Friday/Cyber Monday sales!!


Yeah, for sure, I am talking about the reference cards with stock coolers. And don't think I am bad-mouthing the card; I love mine. I just think some people are going to be disappointed with the reference (290 and 290X) cards, because they are either 1) going to throttle and still run hot as heck, OR 2) going to be very loud and require a custom fan profile.


----------



## RAFFY

Quote:


> Originally Posted by *the9quad*
> 
> In my experience, the R9 290x is designed to throttle like crazy to maintain 95c while in quiet mode.
> My card will never hit advertised speeds except at the very beginning of a gaming session, in quiet mode. It will peg at whatever the throttling temp is because 40% is way too slow and throttle like heck to maintain that.
> 
> Uber mode doesn't fair much better as the fan will kick up speed way too late to not throttle and 55% fan speed is still to slow to not hit the throttling temp so it will just allow it throttle down not as far, but it will still be throttling most of the time.
> The only fix I had is to flash to the ASUS bios use GPU tweak and set a custom fan profile. Now I can maintain no throttling and decent card temps at the same time, albeit the fan is now at 62-65%.
> This would definitely be unacceptable for some.
> 
> In my opinion if you are not going to custom cool these cards or are not willing to put up with the fan noise in the 55%-65% range than DO NOT buy these cards (290 and 290x)
> 
> If you are on stock cooling and you can game without throttling using just straight up quiet mode or even the default uber mode, please post and let me know. So far I don't think it is possible. The cooling is just inadequate.


Why is everyone using GPU Tweak for controlling fan speed? I'm not overclocking right now, and I am assuming that's why people are using GPU Tweak. But currently I just have my Catalyst set to Max Fan Speed 100% and Max GPU Temp 80C. So far with that combo I seem to be having great results. Fans are usually around 60-70% and temps are around 77C, give or take 5C. And this is with my computer laid out on a desk with no extra airflow. I really just don't understand all the complaining about these cards. It's a top-of-the-line, high-performance gaming video card: it's going to get hot, it's going to get noisy, and it's going to kick ass in your games. It's like buying a Ferrari and then complaining that it makes noise when you accelerate or idle.

Edit: By the way, I'm not calling you a complainer, the9quad. That statement was just for the people of this thread lol


----------



## cyenz

At last! It was hard to get the first batch in Portugal, but here it is: a Gigabyte 290X!


----------



## RAFFY

Quote:


> Originally Posted by *cyenz*
> 
> At last! It was hard the get the first batch in Portugal but here it is a Gigabyte 290X!


Sweet! I've got a good friend that lives over in Porto!


----------



## Rar4f

Quote:


> Originally Posted by *RAFFY*
> 
> Why is everyone using GPU Tweak for controlling fan speed? I'm not overclocking right now and I am assuming that's why people are using GPU Tweak. But currently I just have my Catalyst set to Max Fan Speed 100% and Max GPU Temp 80c. So far with that combo I seem to be have great results. Fans are usally around 60-70% and temps are around 77c give or take a 5c. And this is with my computer laid out on a desk with no air pressure. I really just don't understand all the complaining with these cards. It's a top of the line, high performance, gaming video card, it's going to get hot, it's going to get noisy and it's going to kick ass in your games. It's like buying a Ferrari and then complaining that it makes noise when you accelerate or idle.
> 
> Edit: By the way I'm not calling you a complainer the9quad. That statement was just for people of this thread lol


AMD told us to never settle.
Even if the R9 290 hits 1 dBA at 100% fan, we will still complain about that 1 dBA.


----------



## Arizonian

Quote:


> Originally Posted by *cyenz*
> 
> At last! It was hard the get the first batch in Portugal but here it is a Gigabyte 290X!
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## the9quad

Quote:


> Originally Posted by *RAFFY*
> 
> Why is everyone using GPU Tweak for controlling fan speed? I'm not overclocking right now and I am assuming that's why people are using GPU Tweak. But currently I just have my Catalyst set to Max Fan Speed 100% and Max GPU Temp 80c. So far with that combo I seem to be have great results. Fans are usally around 60-70% and temps are around 77c give or take a 5c. And this is with my computer laid out on a desk with no air pressure. I really just don't understand all the complaining with these cards. It's a top of the line, high performance, gaming video card, it's going to get hot, it's going to get noisy and it's going to kick ass in your games. It's like buying a Ferrari and then complaining that it makes noise when you accelerate or idle.


So your card is not throttling to maintain that temp? If it's the same as my experience with the default CCC, then all the Max GPU Temp setting in CCC does is lower the temp at which you start throttling. You will still only get to a max of 55% fan speed (even if you have it set to 100%), and it will still wait too long to start speeding up the fan, which results in you hitting the set temp (in your case 80) and throttling with the fan at a constant 55% to hold just below the set temp (77 in your case). That is why I am using GPU Tweak: I found CCC broken for what I wanted it to do, which is never throttle and just increase the fan speed to maintain a set temperature... which is what you would think Max Temp plus a fan speed of 100% would do in CCC, but it does not. AKA, I want my Ferrari without the default CCC speed limiter, and I don't care about noise.

Then again, maybe my CCC and 290X are just broken, which is possible, but I am pretty sure that even in uber mode with the Sapphire BIOS, a setting of 100% fan speed in CCC will never go above 55%. Hence flashing to Asus and using GPU Tweak.


----------



## omgsosluuw

So tempted to get a 290 right now. Can anyone do a noise test with the case closed?


----------



## jerrolds

Flashing with WinFlash shouldn't pose any problems, right? I can't find a spare USB stick ;p


----------



## RAFFY

Quote:


> Originally Posted by *the9quad*
> 
> So your card is not throttling to maintain that temp? If it is the same as my experience with the default CCC, than all the max GPU temp does in CCC is lower the temp at which you will start throttling. You will still only get to a max of 55% fan speed even (if you have it set to 100%) and it will still wait too long to start speeding up the fan which results in you hitting the set temp (in your case 80) and throttling with the fan at a constant 55% to maintain just below the set temp (77 in your case). That is why I am using GPU tweak, because I found the CCC broken for what I wanted it to do, which is never throttle and just increase the fan speed to maintain a set temperature...which is what you would think max temp and fan speed of 100% would do in CCC but it does not..
> 
> Than again maybe my CC and 290x is just broke, which is possible, but I am pretty sure that even in uber with the sapphire bios, a setting of 100% fan speed in CCC will never go above 55%. Hence flashing to asus and using gpu tweak.


It may be throttling, but hell, I can't tell in game. According to my ASUS post-game reports for BF4, I'm averaging between 150-197 fps in CrossFire at 1440p with Ultra settings. Since I don't have any aftermarket cooling on my CPU or GPUs, I'm pretty content with these results.

Edit: I haven't tried using GPU Tweak because I don't have an optical drive, and I had a brain fart and didn't think to check the ASUS website to download it haha. I may switch to it because in COD: Ghosts I do get some chop, but I think that is more driver-related than anything.
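If you want to know for sure whether a card is throttling in game, most monitoring tools (GPU-Z, Afterburner) can log the core clock to a file, and counting samples that dip below the set clock tells you how often it happens. A rough sketch of that check; the `clock_mhz` column name is an assumption about the log format, so adjust it to whatever your tool actually writes:

```python
import csv
import io

def throttle_stats(log_csv, target_mhz, tolerance=0.02):
    """Fraction of log samples more than `tolerance` below the target clock."""
    rows = list(csv.DictReader(io.StringIO(log_csv)))
    dips = sum(1 for r in rows
               if float(r["clock_mhz"]) < target_mhz * (1 - tolerance))
    return dips / len(rows)

# Hypothetical log excerpt: a 1000 MHz card dipping under load.
sample = "clock_mhz\n1000\n1000\n840\n727\n1000\n"
print(f"throttled {throttle_stats(sample, 1000):.0%} of samples")  # 40%
```

Anything much above a few percent during a long session means the card is spending real time below its advertised clock, even if the frame rate still looks fine.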


----------



## Paul17041993

Quote:


> Originally Posted by *pompss*
> 
> guys
> Beware of buying the MSI 290X or any reference card if you wanna keep the stock cooler.
> 
> I just removed the stock cooler and found that the chip has some rust spots.
> 
> 
> 
> 
> 
> 
> 
> 
> Also, the heatpipe surface where it touches the GPU is scratched very badly.
> 
> 
> 
> 
> 
> 
> 
> 
> For the top AMD reference card, I am very disappointed. This kind of thing should never happen on a $600 video card.
> Coming from a GTX 780, Nvidia's quality is worth every cent.
> Contacted MSi about this let see what they say


Looks fine to me; if you really wanted one or two fewer degrees, you could polish it...


----------



## NightHawK360

I'm really considering buying an R9 290, due to the really attractive price. The only thing making me hesitate is how bad the throttling is: would it cause a dramatic decrease in performance, or just a noticeable one, or does it vary with how hot the card gets around the 95C threshold?


----------



## Gilgam3sh

I will try and flash a 290X BIOS on my 290 now. I will be using ATIFLASH, but can someone tell me if ATIWINFLASH works, as it's easier to use?


----------



## PillarOfAutumn

2 questions:

1- I keep reading that AMD "designed" the card to run at 95C. What exactly did they do differently to allow it to run at those temps? Are they using some NASA-designed PCB or something?

2- Also, I have a 290 and an EK waterblock on the way. What should I do right from the beginning to check the integrity of the card? ASIC score is one thing; what else?


----------



## sugarhell

They simply use a lower fan RPM to keep the TDP in check. Better cooling means more TDP headroom. That's only on the quiet BIOS; on uber you don't care.

Also, when the components have 110-130C max temps, 95C is nothing.
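The PowerTune behavior being described can be sketched as a toy control loop: the firmware trades clock speed against a power/thermal budget, so a capped fan (quiet BIOS) means the budget is hit sooner and the clock drops. All the numbers and the step logic below are invented for illustration; this is not AMD's actual algorithm:

```python
# Toy model of a PowerTune-style limiter: if estimated board power
# exceeds the TDP cap, step the clock down; otherwise step back up
# toward the boost clock. All constants here are made up.

def step_clock(clock_mhz, power_w, tdp_w, base=727, boost=1000, step=10):
    if power_w > tdp_w and clock_mhz > base:
        return max(base, clock_mhz - step)   # over budget: throttle down
    if power_w < tdp_w and clock_mhz < boost:
        return min(boost, clock_mhz + step)  # headroom: clock back up
    return clock_mhz

clock = 1000
for power in [250, 260, 260, 260, 230]:      # watts per sample, made up
    clock = step_clock(clock, power, tdp_w=250)
print(clock)  # 980
```

This is why better cooling "means more TDP": lower temperatures and a faster fan keep the estimated power under the cap longer, so the clock spends more time at boost instead of stepping down.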


----------



## sch010

Got my R9 290 today. Total crap (and I'm an AMD guy all the way). Fan speed won't set in CCC, it idles at 90 C, and XFX put warranty stickers on the baseplate screws so I can't even replace the TIM (which I assume doesn't even exist). Card is hot enough to literally burn me. What a joke.


----------



## Paul17041993

They just designed the chip to handle high temps without degrading like its predecessors; the PCB and components will handle such temps quite easily too, provided there is a decent amount of case airflow.

If you must know the truth, the real killer is flipping between hot and cold states frequently and aggressively; that causes movement across the PCB and silicon and warps it until it breaks.

Has anyone looked at Intel's specs and noted that their CPUs are designed for use at up to 99C constant too...?


----------



## stn0092

Any news on nonreference coolers yet?


----------



## Paul17041993

Quote:


> Originally Posted by *sch010*
> 
> Got my R9 290 today. Total crap (and I'm an AMD guy all the way). Fan speed won't set in CCC, it idles at 90 C, and XFX put warranty stickers on the baseplate screws so I can't even replace the TIM (which I assume doesn't even exist). Card is hot enough to literally burn me. What a joke.


Uh, perchance you're not trying to set the fan below 40%, right...?


----------



## tsm106

Quote:


> Originally Posted by *sch010*
> 
> Got my R9 290 today. Total crap (and I'm an AMD guy all the way). Fan speed won't set in CCC, it idles at 90 C, and XFX put warranty stickers on the baseplate screws so I can't even replace the TIM (which I assume doesn't even exist). Card is hot enough to literally burn me. What a joke.


You should return it and get a Ti.


----------



## sch010

Quote:


> Originally Posted by *tsm106*
> 
> you should return it and get a ti.


I want to like the card, and was planning to put it under water. Clearly I just got a dud. Would've liked to think QC would be a priority at launch, but I guess not.


----------



## Clockster

Quote:


> Originally Posted by *sch010*
> 
> Got my R9 290 today. Total crap (and I'm an AMD guy all the way). Fan speed won't set in CCC, it idles at 90 C, and XFX put warranty stickers on the baseplate screws so I can't even replace the TIM (which I assume doesn't even exist). Card is hot enough to literally burn me. What a joke.


Why would you use CCC in the first place? Use MSI Afterburner, set a custom fan profile, and switch it to auto when you're not using the machine.
That keeps temps in check and noise at an OK level. Everyone knows the card runs hot; what did you expect? Yours to run colder or something? lol


----------



## Blackops_2

Don't buy XFX... apparently even their reference versions are trouble.


----------



## sch010

Quote:


> Originally Posted by *Clockster*
> 
> Why would you use CCC in the 1st place? Use Msi Afterburner, set a custom fan profile and switch it to auto when your not using the machine.
> Keeps temps in check and noise at an ok level. Everyone knows the card runs hot, what did you expect? Yours to run colder or something? lol


Because I haven't had time to download it yet? Their site is crawling today, for whatever reason. It's not acceptable for a brand new card to IDLE in windows at the throttle threshold temperature. I know the card runs hot at load... not at 1% activity.


----------



## ricklen

Pff, I don't really understand why we can only buy these reference coolers. I'm hesitating between a GTX 770 4GB and an R9 290, but with the current coolers on the R9 290 I will be warming up my room this winter.


----------



## Gunderman456

Just pulled the trigger on 2 Gigabyte r9 290s from newegg.ca!

Could not resist any longer with a $429 listing. Will post pic with OCN name when I receive shipment.

With that, I'm now game for a whole new PC. Been putting a sweet parts list together. Will post "The Hawaiian Build Log" (trademarked) in the next few weeks.

But before the build log, I will OC each card separately and post the limits here, first with reference fans and then with an aftermarket cooler and VRM/RAM heatsink mods.

Can't wait!


----------



## tsm106

Quote:


> Originally Posted by *sch010*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> you should return it and get a ti.
> 
> 
> 
> I want to like the card, and was planning to put it under water. Clearly I just got a dud. Would've liked to think QC would be a priority at launch, but I guess not.
Click to expand...

No, just sounds like user error. The card was designed to run warm; did you not pay any attention to the reviews? The XFX stickers are for countries outside North America.


----------



## esqueue

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> 2 questions:
> 
> 1- I keep reading that AMD "designed" the card to run at 95c. What exactly did they do different for it to allow it to run at those temps? Are they using some NASA designed PCB or something?
> 
> 2- also I have a 290 and an EK waterblock on the way. What should I do right from the beginning to check the integrity of the card? ASICs score is one of them, what else?


I keep reading that too, but as I've said in earlier posts, I'll err on the side of caution and get aftermarket cooling. Only time will tell if the cards can actually stand up to 95C of daily use. The fact that the cards are throttling is proof that the cooling is inadequate. I purchased the card with the full intent of watercooling it, though.

I did have an Nvidia card that idled at 90C+, though. I forget the name of it, but I kept it for 3+ years before it started blacking out the screen.


----------



## sch010

Quote:


> Originally Posted by *tsm106*
> 
> no just sounds like user error. the fan was designed to run warm or did you not pay any attention to the reviews? the xfx stickers are for countries not in north america.


What could I possibly have done wrong? Seriously. Install card, plug in power cables, install the newest beta drivers (13.11 v8). Typically this combination does not yield high-90s centigrade temperatures at idle.

I'm not trolling, nor am I an idiot. I'm aware the card runs hot _at load_. Idling at the throttle point is absurd, and as I've already said, I assume there's something funky going on with the TIM.


----------



## sugarhell

Please don't keep spreading misinformation. First read how the new PowerTune works, then talk. The card throttles on the quiet BIOS because of the TDP limit and the locked fan speed.


----------



## bjozac

I've skimmed the thread from start to end...
I'm about to renew the rig.
The plan is to start experimenting with water cooling (Obsidian 900D + 2 x 480 rads, using PWM fans (Arctic F12) and a PWM pump, MCP___something).

When it comes to the GPU, I was thinking either 2 x 290, or starting with 1 x 290X.

But the big question is which brand; it seems there are many limitations with the different brands.
I want to overclock the most out of it, and I would like to keep the warranty with watercooling.

Got any ideas?


----------



## RAFFY

Quote:


> Originally Posted by *stn0092*
> 
> Any news on nonreference coolers yet?


Search the thread next time, but to answer your question: supposedly end of the month.
Quote:


> Originally Posted by *Paul17041993*
> 
> they just designed the chip in a way to handle high temps without degrading like its predecessors, the PCB and components will handle such temps quite easily too provided there is a decent amount of case airflow.
> 
> if you must know the truth, the real killer is flipping between hot and cold states frequently and aggressively, this causes movement across the PCB and silicon and warps it till it breaks.
> 
> has anyone looked at intel's specs and noted that their CPUs are designed for use at up to 99C constant too...?


Heck, aren't Xeon processors rated for 110C? CPUs and GPUs run hot; they're doing the heavy lifting. All these high-temp posts are making me laugh.
Quote:


> Originally Posted by *tsm106*
> 
> you should return it and get a ti.


+1. These 290s and 290Xs are pure trash; I hate getting close to 200 fps in BF4.
Quote:


> Originally Posted by *esqueue*
> 
> I keep reading that too but as I've said on earlier posts, I'll air on the side of caution and get aftermarket cooling. Only time will tell if it can actually stand up to 95c of daily use. The fact that the cards are throttling is proof that the cooling is far too inadequate. I purchased the card with the full intent on watercooling it though.
> 
> I did have an nvidia card that idled at 90c+ though. I forgot the name of it but I kept it for 3+ years before it started to black out the screen.


Was it the Nvidia 9800GX2? It sure was a powerhouse but damn was that thing a space heater lol


----------



## bpmcleod

OK, so I got the new R9 290, PowerColor version. Any ideas on how to unlock voltages for it yet?


----------



## jerrolds

Quote:


> Originally Posted by *sch010*
> 
> What can I have possibly done wrong? Seriously. Install card, plug in power cables, install newest beta drivers (13.11 v8). Typically this combination does not yield high 90s centigrade temperatures at idle.
> 
> I'm not trolling, nor am I an idiot. I'm aware the card runs hot _at load_. Idling at the throttle point is absurd, and as I've already said, I assume there's something funky going on with the TIM.


No, you're right - idle for me is about 50C, lol, but ambient temp is like 10C in that spare room with an open case. I do not approach 90C at idle, though.


----------



## RAFFY

Quote:


> Originally Posted by *bpmcleod*
> 
> Ok so I got the new r9 290 Powercolor version. Any ideas on how to unlock voltages for them yet?


Well, right now for the 290X, people are flashing to the ASUS BIOS. Not sure if that works for the 290, though.


----------



## bpmcleod

The R9 290 from ASUS was discontinued according to Newegg? I wonder why, unless Newegg just made a mistake.


----------



## sch010

Quote:


> Originally Posted by *jerrolds*
> 
> No your right - idle for me is about 50C lol but ambient temps is like 10C in that spare room with an open case. I do not approach 90C at idle though.


Thank you haha. Weird stuff happens.


----------



## Arizonian

Quote:


> Originally Posted by *bpmcleod*
> 
> The r9 290 from ASUS was discontinued according to newegg? I wonder why this is unless newegg just made a mistake


Newegg does that when they don't have any in stock and don't know when they are going to get more. It's only temporary, while an item's information is unavailable, rather than actually removing it. I've seen "discontinued" items go back in stock many times.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *bpmcleod*
> 
> The r9 290 from ASUS was discontinued according to newegg? I wonder why this is unless newegg just made a mistake


It's just marked like that. The Sapphire and Asus 290X non-BF4 editions were marked as discontinued for the first few days as well.


----------



## brazilianloser

Quote:


> Originally Posted by *Arizonian*
> 
> Newegg does that when they don't have any in stock and don't know when they are going to get some more. It's only temporarily when an item information is unavailable rather than actually remove it. I've seen discontinued items go back in stock many times.


Yeah, they had it on the site and in stock late on the 4th and some of the 5th... once it seemed like they ran out, they took it off the site completely for the entire day. But it was put back up late last night (that's when I got mine; it shipped today and should arrive tomorrow)... now they probably did the same again once they ran out of it. Good thing I woke up at 4am to do some homework.









Really didn't want the Sapphire, which I was actually only minutes from buying when I noticed the Asus in stock.


----------



## Gunderman456

Quote:


> Originally Posted by *bpmcleod*
> 
> The r9 290 from ASUS was discontinued according to newegg? I wonder why this is unless newegg just made a mistake


Newegg.ca is saying the same thing about the MSI R9 290. They were selling them just yesterday before selling out. I think it's a mistake and Newegg just means out of stock.


----------



## Knight26

It depends on the MB. The PCIe slots supply a certain amount of power, but I don't remember what it is off the top of my head. However, many MBs, like my G1 Sniper 3, have an additional power connector for SATA power or an ATX 4-pin cable from the PSU. These supply more watts to the PCIe slots for multi-card setups and overclocking. I'm going to find out shortly what the power draw will be. I have two Sapphire 290s on order, as well as a couple of EK blocks, to replace my 3-way SLI 670 setup. I'll post pics when I get them so I can be added to the owners club roster.
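For reference, the PCI Express spec rates the slot itself at up to 75 W, a 6-pin auxiliary connector at 75 W, and an 8-pin at 150 W, so a card's maximum in-spec draw is easy to budget:

```python
# PCIe power budget per the PCI-SIG connector ratings:
# slot = 75 W, 6-pin aux = 75 W, 8-pin aux = 150 W.
WATTS = {"slot": 75, "6pin": 75, "8pin": 150}

def board_budget(*connectors):
    """Max in-spec power for a card: the slot plus its aux connectors."""
    return WATTS["slot"] + sum(WATTS[c] for c in connectors)

# Reference 290/290X: one 6-pin + one 8-pin.
print(board_budget("6pin", "8pin"))  # 300
# Two such cards in CrossFire can pull up to 150 W through the slots
# alone, which is why boards like the G1 Sniper 3 add an extra PCIe
# power feed from the PSU.
print(2 * WATTS["slot"])             # 150
```

Actual draw depends on the BIOS power limit and overclock, of course; the spec numbers are just the per-connector ceilings.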


----------



## esqueue

Quote:


> Originally Posted by *RAFFY*
> 
> Was it the Nvidia 9800GX2? It sure was a powerhouse but damn was that thing a space heater lol


No, definitely not the 9000 series; it was a higher-end 6000-8000 series GTX OC. LOL, sorry, that's a wide range, but it was old.

Quote:


> Originally Posted by *sch010*
> 
> Fan speed won't set in CCC, it idles at 90 C, and XFX put warranty stickers on the baseplate screws so I can't even replace the TIM (which I assume doesn't even exist). Card is hot enough to literally burn me. What a joke.


90°C idle means that either your room is 120°F+ or something is wrong. As for the warranty stickers, I plan on turning my heat gun to 200°C and using tweezers to save the stickers.

http://xfxforce.com/en-us/help/support/warrantyinformation.aspx The site only applies to North America and has all their warranty info. From reading it, our cards DO NOT qualify for the lifetime warranty, only the 2-year warranty. It also mentions the cooling.

*"XFX has carefully selected the optimal thermal or fansink component for your graphics card model. We do not encourage the removal of components due to damage that may result in the process. XFX understands that some enthusiasts may choose to replace the original component with their own cooling solution. To support the gaming community, we recommend that you contact XFX prior to any modifications so that we can update your profile and product registration to avoid potential issues with warranty support. In addition, XFX support will be able to walk through the installation with you or provide feedback and pointers on available options for your specific product. You may even consider shipping your components to XFX and allow the technicians at XFX to perform the modification for you (shipping charges to XFX apply)."*


----------



## PillarOfAutumn

Quote:


> Originally Posted by *bjozac*
> 
> But the big question is which brand, seems as there are many limitation to different branches.
> I want to overclock the most of it and i would like to keep warrenty with watercooling.
> 
> Got any ideas?


When I was looking to buy, I weeded out MSI, XFX, and a few other brands, and was looking mainly at the Asus and Sapphire cards. I was leaning more towards the Asus card because they are voltage unlocked and many people are flashing Asus BIOSes onto their cards. But the main advantage of Sapphire and Asus vs the others is that they don't have those warranty-void stickers all around their cards. Removing those is necessary in order to take off the cooler to slap on a waterblock, or even just to change the TIM.


----------



## LunaP

Hey guys, quick question on this. This screenshot threw me off, as I thought that requiring 3 monitors of the same type was something only Nvidia forced on their drivers. From what I recall, Eyefinity is supposed to let you select (and profile, I think) monitors to group together for a surround effect. Just wanted to clarify this. Appreciate any input. I use more than 3 monitors (currently at 4) and am looking to go to 5 for the moment.


----------



## sch010

Quote:


> Originally Posted by *esqueue*
> 
> No, definitely not the 9000, a higher end 6000-8000 GTX OC LOL sorry that's a wide range but it was old.
> 90°C idle means that either your room is 120°F+ or something is wrong. As for the warranty stickers, I plan on turning my heat gun to 200°C and use tweezers to save the stickers.
> 
> http://xfxforce.com/en-us/help/support/warrantyinformation.aspx The site only applies to North America and has all their warranty info. From reading it, Our cards DO NOT qualify for the lifetime warranty but for the 2 year warranty. It also mentions about the cooling.
> 
> *"XFX has carefully selected the optimal thermal or fansink component for your graphics card model. We do not encourage the removal of components due to damage that may result in the process. XFX understands that some enthusiasts may choose to replace the original component with their own cooling solution. To support the gaming community, we recommend that you contact XFX prior to any modifications so that we can update your profile and product registration to avoid potential issues with warranty support. In addition, XFX support will be able to walk through the installation with you or provide feedback and pointers on available options for your specific product. You may even consider shipping your components to XFX and allow the technicians at XFX to perform the modification for you (shipping charges to XFX apply)."*


I assume there's something wrong with the card. My ambient temp is 70F. It's behaving like there's not any TIM.

I'll swap it out for a non-XFX branded card once Amazon has some in stock (XFX was all they had available yesterday).


----------



## esqueue

Quote:


> Originally Posted by *sch010*
> 
> I assume there's something wrong with the card. My ambient temp is 70F. It's behaving like there's not any TIM.
> 
> I'll swap it out for a non-XFX branded card once Amazon has some in stock (XFX was all they had available yesterday).


I would advise against XFX because of their website. It takes you around in circles and doesn't answer anything. XFXsupport.com was very frustrating to browse.

A second thing is that the included disc doesn't work on Windows 8 at all. That doesn't matter much, since their support site links you to AMD's site to download the latest drivers anyway.


----------



## jerrolds

I have Thermalright Chill Factor 3 lying around for when my Gelid Icy 2 arrives - it comes with GC-2 thermal compound. According to Xbit Labs http://www.xbitlabs.com/articles/coolers/display/thermal-interface-roundup-1_12.html#sect0 CF3 outperforms GC-2 by a good margin.

Will there be any issues using CF3 instead of GC-2? I don't think it's electrically conductive... not sure if it's capacitive - whatever that means.


----------



## nemm

I had a little go at flashing the various 290X BIOSes knocking about onto the second card. I started with the Asus BIOS and overclocking with GPU Tweak, which was not plain sailing. The GPU was recognized as 'Generic VGA', and the memory, which was stable at 1550 before the flash using the stock Sapphire 290 BIOS and MSI AB, is no longer stable after the flash.

Sapphire 1100 core (not 1115 I first thought) / 1550 memory @ 1.227v max temp 73degC
Asus 1180 core / 1425 memory @ 1.299v max temp 79degC
PT3 1220 core / 1425 memory @ 1.352v max temp 84degC, I tried 1.4v but I just got BSOD's

Flashed back to original bios and used Asus GPU tweak, unstable memory overclock at 1425+ so I reverted back to AB, 1425+ memory was fine.

No more cross flashing for me, will wait for AB or Trixx to support the cards voltage adjustments.

On a side note, I have 2 cards from the same batch, serial numbers incremented by 1. Serial xxxxxxxxxx7 (the one tested above) is not the best and has Elpida modules; the xxxxxxxxxx8 one has Hynix. Both obtain similar overclocks, although the Hynix card seems a better sample.

Just noticed in the spreadsheet on page 1 that I am down for 2x Sapphire 290X, but I actually have the non-X versions.

One of the two cards has its EK waterblock on; the second block is in the post, delivery set for tomorrow.

* Just got an Asus 290 BIOS - time to test that, as opposed to the 290X version.


----------



## RAFFY

Quote:


> Originally Posted by *sch010*
> 
> Thank you haha. Weird stuff happens.


Quote:


> Originally Posted by *sch010*
> 
> I assume there's something wrong with the card. My ambient temp is 70F. It's behaving like there's not any TIM.
> 
> I'll swap it out for a non-XFX branded card once Amazon has some in stock (XFX was all they had available yesterday).


I've never had a problem with an ASUS or SAPPHIRE graphics card.


----------



## $ilent

Quote:


> Originally Posted by *jerrolds*
> 
> Yea im kinda bummed I paid more than $160 for the 290X and BF4, whats worse is the higher number means more tax. I shouldve sold BF4 to recoupe some of the cost and then pick it up during a sale or something
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Im just hoping that i can hit 1200mhz once the icy 2 comes in - you have the same cooler dont you? or are you underwater?


Yeah, I got the same cooler; core temps are great - 33C idle, 56C max. Just waiting on my infrared thermometer to be delivered so I can confirm VRM temps.


----------



## Arizonian

Quote:


> Originally Posted by *LunaP*
> 
> Hey guys quick question on this. This screenshot threw me off as I thought that requiring the same 3 types of monitors was something only Nvidia forced upon their drivers. From what I recall EyeFinity is supposed to allow you select ( and profile I think ) monitors to group together for a surround effect. Just wanted to clarify on this. Appreciate any input. I use more than 3 monitors (currently @ 4 ) and looking to go 5 for the moment.


From the OP:

The next generation of AMD Eyefinity technology features all-new support for stereo 3D, universal bezel compensation and brand new display configurations, bringing you the ultimate panoramic computing experience.

Connections
HDMI 1.4b
DisplayPort 1.2 with Multi-Streaming
Two DVI-D



Up to six monitors are possible using one multi-stream DisplayPort connector.


----------



## jomama22

just bought my 7th 290x. Been waiting for TD to have some since i get 6% back and $10 off. Hopefully it gets here by friday, if not monday.


----------



## brazilianloser

Quote:


> Originally Posted by *jomama22*
> 
> just bought my 7th 290x. Been waiting for TD to have some since i get 6% back and $10 off. Hopefully it gets here by friday, if not monday.


7th holy cow...


----------



## Durquavian

I won't be able to get one till January. Good news is all the issues and such should have been worked out by then, with plenty of choices.


----------



## Rar4f

Quote:


> Originally Posted by *jomama22*
> 
> just bought my 7th 290x. Been waiting for TD to have some since i get 6% back and $10 off. Hopefully it gets here by friday, if not monday.


Are you trying to build a time machine or suuumething?


----------



## skupples

I have had nothing but terrible experiences with XFX in previous AMD generations - mostly fan failures and RMA issues.


----------



## Clukos

Quote:


> Originally Posted by *jomama22*
> 
> just bought my 7th 290x. Been waiting for TD to have some since i get 6% back and $10 off. Hopefully it gets here by friday, if not monday.


I get you like these cards a lot eh?


----------



## Sgt Bilko

I've never had an XFX card, but Sapphire, ASUS, HIS and Gigabyte have all been great cards for me.


----------



## JimmieRustle

is 1.4V on a 290x safe?

also what is power target(%) in gpu tweak?


----------



## jomama22

Quote:


> Originally Posted by *JimmieRustle*
> 
> is 1.4V on a 290x safe?
> 
> also what is power target(%) in gpu tweak?


i was topping out @ 1.42, which is 1.37-1.39 on pt3.rom


----------



## JimmieRustle

Quote:


> Originally Posted by *jomama22*
> 
> i was topping out @ 1.42, which is 1.37-1.39 on pt3.rom


What clocks were you able to get? Also, did you change the Power Target?


----------



## RAFFY

Quote:


> Originally Posted by *Clukos*
> 
> I get you like these cards a lot eh?


He IS a 290X!


----------



## DampMonkey

Quote:


> Originally Posted by *JimmieRustle*
> 
> is 1.4V on a 290x safe?
> 
> also what is power target(%) in gpu tweak?


Should probably ask this first: what kind of cooling are you using?


----------



## JimmieRustle

Quote:


> Originally Posted by *DampMonkey*
> 
> Should probably ask this first: what kind of cooling are you using?


EK wb + 3930k wb + 360mm rad + 240mm rad both with push pull fans


----------



## Remij

I'm going to be getting a couple of these cards early December. I'm just hoping they are readily available.


----------



## Mr357

Quote:


> Originally Posted by *JimmieRustle*
> 
> What clocks were you able to get? Also, did you change the Power Target?


You simply need to click on "Settings," then the tab labeled "Tune," then check two boxes which will enable you to modify the core voltage and power target.


----------



## JimmieRustle

Quote:


> Originally Posted by *Mr357*
> 
> You simply need to click on "Settings," then the tab labeled "Tune," then check two boxes which will enable you to modify the core voltage and power target.


I've done that. I'm just wondering what changing the Power Target does, or if I should just go ahead and set it to max.

In GPU-Z it shows 84C on VRM temp 1 and 54C on VRM temp 2. Does this mean I messed up putting several of the thermal pads on, or what?
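The power target on these cards is generally understood to scale the board power cap that PowerTune enforces, giving the card more headroom before power-based throttling kicks in. A minimal sketch of that relationship, assuming a hypothetical 250 W stock cap for illustration (not an official AMD figure):

```python
# Rough illustration of how a PowerTune-style power-target percentage
# scales the board power cap. The 250 W stock cap used below is an
# assumption for illustration, not an official specification.

def effective_power_cap(stock_cap_w: float, power_target_pct: float) -> float:
    """Return the board power cap after applying a power-target offset.

    power_target_pct is the slider value, e.g. +50 for "+50%".
    """
    return stock_cap_w * (1 + power_target_pct / 100)

if __name__ == "__main__":
    for pct in (0, 20, 50):
        print(f"+{pct}% -> {effective_power_cap(250, pct):.0f} W cap")
```

On this reading, maxing the slider simply raises the ceiling; the card only draws more power if the clocks and voltage actually demand it.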


----------



## VSG

I placed an order for a quarter sheet of Fujipoly extreme thermal pads (0.5 mm and 1 mm thick each) from FrozenCPU to come along with my EK blocks next week. These offer 4x the thermal conductivity of stock EK pads and are reasonably priced unlike the Fujipoly ultra extremes. For those having issues with VRM heating, consider better thermal pads.
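To see why pad conductivity matters for VRM temperatures, a back-of-the-envelope steady-state conduction estimate (delta T = q * d / (k * A)) is enough. All numbers below - 30 W through a 1 mm pad over 10 cm², conductivities of 3 and 12 W/m*K - are illustrative assumptions, not measured values for any specific pad:

```python
# Back-of-the-envelope steady-state conduction across a thermal pad:
# delta_T = q * d / (k * A). Every number below is an illustrative
# assumption, not a datasheet value for any particular pad.

def pad_delta_t(power_w: float, thickness_m: float,
                k_w_mk: float, area_m2: float) -> float:
    """Temperature drop (C) across a pad conducting `power_w` watts."""
    return power_w * thickness_m / (k_w_mk * area_m2)

if __name__ == "__main__":
    q, d, area = 30.0, 0.001, 10e-4   # 30 W through a 1 mm pad over 10 cm^2
    for name, k in (("baseline pad", 3.0), ("4x-conductivity pad", 12.0)):
        print(f"{name}: {pad_delta_t(q, d, k, area):.1f} C across the pad")
```

The takeaway: quadrupling conductivity cuts the temperature drop across the pad to a quarter, which is why higher-grade pads show up directly in VRM readings.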


----------



## Rar4f

Considering the R9 290 and 290X reference cards are both out, shouldn't the non-reference 290 and 290X cards from Asus, Sapphire, MSI, etc. be ready at the same time by the end of November, instead of only the R9 290X ones?
It would sure suck to have to wait until the 10th or 15th of December before R9 290 non-reference cards are ready :/

I will endure, but why not kill two birds with one stone?


----------



## CallsignVega

Sorry, don't have time to read 150 pages. Anyone able to give some cliff notes?

Is there a BIOS out that allows 1.35v+ on the 290x?

On water, what core clocks have people been getting on increased voltage? Temps / any throttling?

The market on Titans is pretty soft, still debating on whether to swap them out for 290x's. I think I would need to get like a consistent 1250+ core in crossfire on water to make the switch over the Titans.

Any word on custom PCB's?


----------



## Forceman

Quote:


> Originally Posted by *JimmieRustle*
> 
> I've done that. I'm just wondering what changing the Power Target does or if I should just go ahead and set it to max
> 
> in GPUZ it shows 84C VRM temp 1 and 54C VRM temp 2. Does this mean I messed up putting several of the thermal pads on or what?


There are two VRM blocks and they seem to have different temps, so the split is normal. 84C is still okay, although a lot of people have gotten them lower.
Quote:


> Originally Posted by *CallsignVega*
> 
> Sorry, don't have time to read 150 pages. Anyone able to give some cliff notes?
> 
> Is there a BIOS out that allows 1.35v+ on the 290x?
> 
> On water, what core clocks have people been getting on increased voltage? Temps / any throttling?
> 
> The market on Titans is pretty soft, still debating on whether to swap them out for 290x's. I think I would need to get like a consistent 1250+ core in crossfire on water to make the switch over the Titans.
> 
> Any word on custom PCB's?


Yes, the PT3 BIOS allows over 1.35V. Looks like 1250 is about the range for watercooled, although a couple have hit 1300+.


----------



## Arm3nian

Quote:


> Originally Posted by *CallsignVega*
> 
> Sorry, don't have time to read 150 pages. Anyone able to give some cliff notes?
> 
> Is there a BIOS out that allows 1.35v+ on the 290x?
> 
> On water, what core clocks have people been getting on increased voltage? Temps / any throttling?
> 
> The market on Titans is pretty soft, still debating on whether to swap them out for 290x's. I think I would need to get like a consistent 1250+ core in crossfire on water to make the switch over the Titans.
> 
> Any word on custom PCB's?


There is a BIOS that allows up to 2v, though you would need a hardmod at that point. With that BIOS you should be able to go up to 1.5/1.6, I think, without modding. 1.35-1.4 seems to be the current range on water without running into too many thermal problems. We have seen 1340MHz on the core on water, and it might be able to do more. 1250+ should be doable on water. We are still waiting for better voltage adjustment tools and maybe more optimized BIOSes. Aside from OC, new drivers should yield more performance.


----------



## armartins

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Fan noise isn't that big of a deal, but when your GPU starts squealing twice as loud as the fan, there is an issue. It is starting to die down a little though. Just going to keep using it for a week or two and see what happens... Gonna let Heaven run overnight, see if that will fix it - well, that is if I can sleep; I can hear this thing across the house when running FurMark, cooler and coil whine anyway. I ain't gonna fret over it anymore; thanks to everyone that has helped. I just need to figure out what to do before I order up an Accelero Hybrid - can't get rid of the card after installing that lol.


Coil whine rises with FPS... those benches that run 1000+ fps... or furmark in low res will always have very strong coil whine...


----------



## rdr09

Quote:


> Originally Posted by *CallsignVega*
> 
> Sorry, don't have time to read 150 pages. Anyone able to give some cliff notes?
> 
> Is there a BIOS out that allows 1.35v+ on the 290x?
> 
> On water, what core clocks have people been getting on increased voltage? Temps / any throttling?
> 
> The market on Titans is pretty soft, still debating on whether to swap them out for 290x's. I think I would need to get like a consistent 1250+ core in crossfire on water to make the switch over the Titans.
> 
> Any word on custom PCB's?


Why? Your Titans will outlast both the 290X and the Ti, especially the latter with only 3GB.


----------



## Arm3nian

Quote:


> Originally Posted by *rdr09*
> 
> why? your titans will outlast both the 290X and the Ti, especially the latter only having 3GB.


Lol what. The 780 Ti is meant to EOL the Titan, whose price hasn't even dropped from the 1 grand. The 290 series' 4GB should be enough for everything; we saw triple 4K running fine.


----------



## rdr09

Quote:


> Originally Posted by *Arm3nian*
> 
> Lol what. *780ti is meant to EOL titan*. Price hasn't even dropped from the 1 grand. 290 series 4gb should be enough for everything, we saw triple 4k running fine.


lol. that is the plan. that thing will choke at 1600P.


----------



## Arm3nian

Quote:


> Originally Posted by *rdr09*
> 
> lol. that is the plan. that thing will choke at 1600P.


A single 1600p monitor? Cards have been able to run that for almost 2 gens now. There are much higher resolutions. 290x also kills the competition at 4k.


----------



## jrcbandit

Quote:


> Originally Posted by *jerrolds*
> 
> That sucks, and temps are all in line? Maybe powerlimit needs to be maxed? Whats your ASIC?


I did max the power limit at 150%, and the ASIC quality was 73.9%. For temps, I get a max of 52C when playing games (it was 48C yesterday when the temperature outside was much cooler ;p) and the VRMs were pretty cool - is GPU-Z accurately reporting this? I did use MX-2 compound both under and over the thermal pad for the VRMs, as suggested by the EK instructions. The only thing I can think to do is purchase some higher-quality thermal pads like Fujipoly Extreme (not going Ultra Extreme), or maybe my Sapphire card doesn't like the Asus BIOS.

Quote:


> Originally Posted by *Forceman*
> 
> 290 or 290X? One of the reviews mentioned a BIOS bug that may be limiting 290 memory overclocks.


I have the 290X Sapphire.


----------



## rdr09

Quote:


> Originally Posted by *Arm3nian*
> 
> A single 1600p monitor? Cards have been able to run that for almost 2 gens now. There are much higher resolutions. 290x also kills the competition at 4k.


True, maybe skimp on some settings in some games. But that does not make sense on a card that costs $700.


----------



## iPDrop

So how are these cards overclocking? I've got two non-x's in the mail expected tomorrow.


----------



## Arm3nian

Quote:


> Originally Posted by *rdr09*
> 
> true. maybe skimp on some settings in some games. but does not make sense on a card that cost $700.


I highly doubt you would need to lower settings with a 780 Ti/Titan on a single 1600p monitor; the only things I'd say would hurt it are games like Crysis 3 or BF4 with max AA. Maybe at triple screens or 4K, but we all know you need multiple GPUs for that anyway.

Also, if I'm not mistaken, the only advantage the Titan will have over the Ti is more VRAM; everything else is the same or a bit better on the Ti. 3GB might be pushing it at triple 1440p/1600p or 4K, but I don't care for either of those cards. 290/290X ftw.


----------



## Snyderman34

Color me official

Will get some #s then post again


----------



## rdr09

Quote:


> Originally Posted by *Arm3nian*
> 
> I highly doubt you would need to lower settings with a 780ti/titan for a single 1600p monitor, all I would say would hurt it is a game like crysis 3 or BF4 with max AA. Maybe at triple screens or 4k, but we all know you need multiple gpus for that anyway.
> 
> Also, if I'm not mistaken, the only advantage the titan will have over the ti is more vram, everything else is the same or a bit more on the ti. 3gb might be pushing it at triple 1440/1600p or 4k, but I don't care for either of those cards. 290/290x ftw.


That is exactly the advantage of the Titan - the VRAM - and we can say the same thing about the 290s. I remember last year how we talked about 2GB cards lasting a while, and then Hitman and some of these titles came out. And monitors are getting cheaper.

checkout this thread . . .

http://www.overclock.net/t/1424464/benchmarking-thread-batman-origins-benchmarks-are-up


----------



## Arm3nian

Quote:


> Originally Posted by *rdr09*
> 
> that is exactly the advantage of the titan - the vram - and we can say the same thing with the 290s. i remember last year how we talk about 2Gb cards as will last awhile and then Hitman and some of these titles came out. then monitors are getting cheaper.
> 
> checkout this thread . . .
> 
> http://www.overclock.net/t/1424464/benchmarking-thread-batman-origins-benchmarks-are-up


Well, not everyone runs higher than 1600p. The 780 Ti should outperform the Titan at that. Triple 1440p or 4K, however, and you are going to wish you went with the Titan or the 290 series. Since the Titan is still at the ridiculous $1k price, you shouldn't even need to think about it. The 290 series also has lots of potential; we have already seen what the Titan is capable of.

As for the OP, he is going to lose money switching over, so it might not be worth it financially; performance-wise it is.


----------



## Arizonian

Quote:


> Originally Posted by *Snyderman34*
> 
> Color me official
> 
> Will get some #s then post again


Are they 290Xs or 290s? Can't tell from the box. When you post back, include an OCN name to make it official on the roster.

Edit: GPU-Z validation with OCN name would work.


----------



## Mas

Quote:


> Originally Posted by *RAFFY*
> 
> I've never had a problem with an ASUS or SAPPHIRE graphics card.


The last 3 Asus cards I've had have either died or had some sort of issue.

First one was an HD5870. The card itself was fine, but after about a year the fan started sounding like a lawnmower running over a log. Unfortunately, my wife threw out the invoice, so I just dealt with it myself.

Second one was constantly BSODing my system. Took it to the retailer, but they couldn't replicate it so they wouldn't RMA it for me. Contacted Asus directly and sent them the card, and they identified a fault with the card that reacted with my particular setup but obviously not with the system the retailer tested in. They sent me a new card as a replacement (GTX480).

Third one was yet another GTX480; after about a year and a half it started to BSOD when in the 1st PCI slot, but was fine in the second or third slot in SLI. Tried to initiate an RMA with Asus, but they gave me the run-around in emails, then stopped responding altogether. This was still well within the warranty period.

No more Asus cards for me. I still like their motherboards (even though I have had problems with my R3E).


----------



## rdr09

Quote:


> Originally Posted by *Arm3nian*
> 
> Well not everyone runs higher than 1600p res. 780ti should outperform titan at that. Triple 1440 or 4k however and you are going to wish you went with titan or 290 series. Since titan is still at the ridiculous 1k price range, you shouldn't even need to think about it. 290 series also has lots of potential, we have already seen what the titan is capable of.
> 
> As for the op, he is going to lose money switching over, so that might not be worth it, performance wise it is.


Good points, but Callsign's case is different. He probably just wants to play with something new.


----------



## JimmieRustle

Quote:


> Originally Posted by *Arizonian*
> 
> Are they 290X or 290's? Cant tell from box. When you post back an OCN name to make it official on roster.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: GPU-Z validation with OCN name would work.


they're 290s


----------



## Taint3dBulge

Has to be one of the higher scores?


----------



## Raxus

So my Sapphire 290X took a dump last night. I wasn't even overclocking it yet; it black screened every time I tried to load any games.

Talked to newegg, they gave me the $$$ in store credit so I can go with another Manufacturer.

Thinking I'll wait for my credit then grab an asus r9 290x and put it under water.


----------



## jomama22

Quote:


> Originally Posted by *Raxus*
> 
> So my Sapphire 290x took a dump last night. Not even over clocking it yet, black screened every time I tried to load any games up.
> 
> Talked to newegg, they gave me the $$$ in store credit so I can go with another Manufacturer.
> 
> Thinking I'll wait for my credit then grab an asus r9 290x and put it under water.


Were you running gpuz/ab/GPU tweak at all?


----------



## Arm3nian

Quote:


> Originally Posted by *rdr09*
> 
> good points but Callsign's case is different. He prolly just wants to play with something new.


Well, if he wants something new he should go with the 290/290X and see what it can do; we are all still anxious to see what this card is capable of in the future with better volt mods, BIOSes, drivers, and Mantle. No point in going with the Ti to get "something new" and then getting 1 FPS at high resolutions due to a VRAM wall.


----------



## Arm3nian

Quote:


> Originally Posted by *Raxus*
> 
> So my Sapphire 290x took a dump last night. Not even over clocking it yet, black screened every time I tried to load any games up.
> 
> Talked to newegg, they gave me the $$$ in store credit so I can go with another Manufacturer.
> 
> Thinking I'll wait for my credit then grab an asus r9 290x and put it under water.


All of these DOA cards seem like shipment issues to me. We have seen this happen to every brand. Good luck on your new one


----------



## Raxus

Quote:


> Originally Posted by *jomama22*
> 
> Were you running gpuz/ab/GPU tweak at all?


Not at the time no.


----------



## Raxus

Quote:


> Originally Posted by *Arm3nian*
> 
> All of these DOA cards seem like shipment issues to me. We have seen this happen to every brand. Good luck on your new one


Just hope I can get my hands on a reference Asus card. Been scarce since release.

Seemed to be fine for the first week. Initially noticed some wacky throttling early on though with 70% + fans.


----------



## iPDrop

How have the R9 290s or 290X's been overclocking anyone know?


----------



## Arm3nian

Quote:


> Originally Posted by *Raxus*
> 
> Just hope I can get my hands on a reference Asus card. Been scarce since release.
> 
> Seemed to be fine for the first week. Initially noticed some wacky throttling early on though with 70% + fans.


They were scarce on release, but they seem to restock in line with the other brands now. Newegg usually gets the ASUS cards with the other brands at the same time. Should be able to snatch one if you try lol.


----------



## the9quad

OK, if you own a 290 or 290X and are still using the reference cooling, please read and reply:

When running a game or benchmark in quiet mode: what temperature does your card reach when it starts to throttle? Does it pretty much remain there at 40% or so fan speed while it throttles the GPU (i.e. it stays around 90C, throttles the GPU, and stays at 40% fan speed the whole time)?

Does the same thing happen in Uber, except now the fan speed is around 55%?

This has been my experience with the card so far, with the exception that in Uber it takes longer to reach the throttling temp, and because of the higher fan speed it doesn't have to throttle the GPU quite as drastically. Is this your experience as well?

Do you have random black screen lockups in benchmarks or games? Does lowering the GPU temp target help? Did raising Vcore help?

Please respond to this if you own one of the cards; I need to know if this is just how the cards work or if I should RMA mine. For reference, my case is a CM HAF X and the CPU is water cooled, so it has plenty of room and airflow, and my power supply is a PC Power and Cooling 1200 watt unit running one card, so it is not a power issue.

In addition, using GPU Tweak and the Asus BIOS I was able to create a fan profile that lets the card avoid throttling and black screens, but it pretty much means the fan runs at around 60-65% at full load.

Thanks for any replies.


----------



## Sgt Bilko

Mine isn't even hitting 50C and it starts throttling with 100% fan... it's just weird. Power limit is set to +50... any ideas?


----------



## Raxus

Quote:


> Originally Posted by *the9quad*
> 
> OK if you own a 290 or 290x and are still using the reference cooling please read and reply:
> 
> When running a game or benchmark in quiet mode: What temperature does your card reach when it starts to throttle? Does it pretty much remain there at 40% or so fan speed while it throttles the GPU? (i.e it stays around 90C, throttles the GPU and stay at 40% fan speed the whole time pretty much).
> 
> Does the same thing happen in Uber except now the fan speed is around 55%?
> 
> This has been my experience with the card so far, with the exception being that in uber it takes longer to reach the throttling temp and because of the higher fan speed it doesn't have to throttle the GPU quite as drastic. Is this your experience as well?
> 
> Do you have random black screen lockups in benchmarks or games? Does lowering the GPU temp target help? Did raising Vcore help?
> 
> Please respond to this if you own one of the cards, I need to know If this is just how the cards work or if I should RMA the card. For reference, my case is a CM Hafx, and the CPU is water cooled, so it has plenty of room and airflow. and my power supply is a pc power anc cooling 1200 watt power supply running one card, so it is not a power issue.
> 
> In addition using GPU tweak and the Asus bios I was able to create a fan profile, that allows the card to not throttle or black screen, but it pretty much means the fan is around 60-65% at full load.
> 
> Thanks for any replies.


From everything I've read, this is how it currently works on all cards.

Once the GPU reaches 92-94C it throttles. You have to make a fan profile that prevents the card from ever getting to 94C.

----------



## binormalkilla

I was finally able to catch a non-BF4-bundle 290X in stock. I should be running Crossfire by Friday. I'm waiting on the release of the Rampage IV Black Edition to get the third card and the water blocks, though.


----------



## Raxus

http://www.newegg.com/Product/Product.aspx?Item=N82E16814121820

anyone know why this is still listed as coming soon?


----------



## the9quad

Quote:


> Originally Posted by *Raxus*
> 
> from everything ive read this is how it currently works on all cards.
> 
> Once the gpu reaches 92 - 94c it throttles. You have to make a fan profile that prevents the card from ever getting to 94c


That's what I am experiencing; I just wanted to make sure. The default Uber 55% is not enough - you actually need 62-65% fan speed.


----------



## Falkentyne

Quote:


> Originally Posted by *the9quad*
> 
> OK if you own a 290 or 290x and are still using the reference cooling please read and reply:
> 
> When running a game or benchmark in quiet mode: What temperature does your card reach when it starts to throttle? Does it pretty much remain there at 40% or so fan speed while it throttles the GPU? (i.e it stays around 90C, throttles the GPU and stay at 40% fan speed the whole time pretty much).
> 
> Does the same thing happen in Uber except now the fan speed is around 55%?
> 
> This has been my experience with the card so far, with the exception being that in uber it takes longer to reach the throttling temp and because of the higher fan speed it doesn't have to throttle the GPU quite as drastic. Is this your experience as well?
> 
> Do you have random black screen lockups in benchmarks or games? Does lowering the GPU temp target help? Did raising Vcore help?
> 
> Please respond to this if you own one of the cards, I need to know If this is just how the cards work or if I should RMA the card. For reference, my case is a CM Hafx, and the CPU is water cooled, so it has plenty of room and airflow. and my power supply is a pc power anc cooling 1200 watt power supply running one card, so it is not a power issue.
> 
> In addition using GPU tweak and the Asus bios I was able to create a fan profile, that allows the card to not throttle or black screen, but it pretty much means the fan is around 60-65% at full load.
> 
> Thanks for any replies.


Try using ONLY Afterburner to overclock, and DISABLE Overdrive in CCC. And use the original BIOS; one poster above said his RAM overclock took a nosedive after he switched to the Asus BIOS.


----------



## Raxus

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Mine isnt even hitting 50c and its starts throttling with 100% fan......its just weird. Powerlimit is set to +50.....any ideas?


Are you only using CCC?

100% fan in CCC isn't actually 100% fan lol


----------



## brazilianloser

Quote:


> Originally Posted by *Raxus*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814121820
> 
> anyone know why this is still listed as coming soon?


They either do that or just hide the item when the item is out of stock. Usually...


----------



## Sgt Bilko

Quote:


> Originally Posted by *Raxus*
> 
> Are you only using CCC?
> 
> 100% fan in CCC isnt 100% fan lol


Nope - AB for fan control and power limit, and CCC for the power limit and OC.


----------



## Raxus

Anyone know how long Asus produces reference cards typically?

Don't want to delay my water cooling plans again this generation because of being unable to secure reference cards.


----------



## Bloitz

Of these brands, which would you prefer for a 290 (Europe, so the XFX cooler warranty doesn't apply here if I'm not mistaken): Gigabyte, XFX, VTX3D (never heard of them TBH) or MSI ("in stock soon" they say)?

I've seen posts about Elpida and Hynix memory, so I presume reference isn't the same across the board...

Oh, and if you don't mind: how do they do under water (clocks and temps)?

I tried scrolling through the thread and looking for it myself, but it's 02:18 am and I'm getting tired.


----------



## Raxus

Quote:


> Originally Posted by *Bloitz*
> 
> Of these brands, which would you prefer for a 290 (Europe, so the XFX cooler warranty doesn't apply here if I'm not mistaken): Gigabyte, XFX, VTX3D (never heard of them TBH) or MSI ("in stock soon" they say)?
> I've seen posts about Elpida and Hynix memory, so I presume reference isn't the same across the board...
> 
> Oh, and if you don't mind: how do they do under water (clocks and temps)?
> 
> I tried scrolling through the thread and looking for it myself, but it's 02:18 am and I'm getting tired.


Personally I've had issues with XFX.

MSI, heard good things but never owned any of their hardware.

Gigabyte has done me right in the past.


----------



## the9quad

Quote:


> Originally Posted by *Falkentyne*
> 
> Try using ONLY Afterburner to overclock, and DISABLE overdrive in the CCC. And use the original Bios; one poster above said his RAM overclock took a nose dive after he switched to the Asus bios.


My questions were for people who are not using anything other than the default crap that's in Overdrive. Personally I am using the Asus BIOS with Overdrive disabled, stock clocks, and GPU Tweak for the fan profile. System is stable, but the fan hovers around 62-65% while gaming and temps stay in the 70s, which I prefer.

Basically I wanted to know: straight out of the box using just CCC Overdrive, are they all this bad, or was mine broken?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Bloitz*
> 
> Of these brands, which would you prefer for a 290 (Europe, so the XFX cooler warranty doesn't apply here if I'm not mistaken): Gigabyte, XFX, VTX3D (never heard of them TBH) or MSI ("in stock soon" they say)?
> I've seen posts about Elpida and Hynix memory, so I presume reference isn't the same across the board...
> 
> Oh, and if you don't mind: how do they do under water (clocks and temps)?
> 
> I tried scrolling through the thread and looking for it myself, but it's 02:18 am and I'm getting tired.


Me personally, I'd go for Gigabyte.

I've seen temps around 48C max on water with roughly a 1200/1500 OC (someone else might be able to answer that better).

And yes, ref boards are a mix of Hynix/Elpida memory. Neither is brand-exclusive from what I've seen.


----------



## CallsignVega

Quote:


> Originally Posted by *Forceman*
> 
> There are two VRM blocks and they seem to have different temps, so the split is normal. 84C is still okay, although a lot of people have gotten them lower.
> Yes, the PT3 BIOS allows over 1.35V. Looks like 1250 is about the range for watercooled, although a couple have hit 1300+.


Quote:


> Originally Posted by *Arm3nian*
> 
> We have seen 1340mhz on the core on water, might be able to do more. 1250+ should be doable on water. We are still waiting for better volt adjustment tools and maybe more optimized bioses. Aside from oc, new drivers should yield more performance.


Sweet, 1300+? That should pass my 1300 MHz Titans in multi-monitor by a good 10-15%. Will probably try to get my hands on two and go from there. Any benchmarks of these 1300 MHz core specimens destroying in benchmarks?
Quote:


> Originally Posted by *rdr09*
> 
> why? your titans will outlast both the 290X


What do you mean outlast? Are you referring to simply the VRAM amount? I am not really interested in the 780ti, I like the idea of the 290x's 512 bit bus better for my high FPS multi-monitor config.

I am getting the new Eizo's that can run "strobing" backlight motion clarity on any GPU, so no worries about brand there. The 4GB 290x will be plenty for my 6 mil pixel Eyefinity setup and I don't run much AA. I only keep GPU's for 9-12 months anyway.


----------



## Gunderman456

Quote:


> Originally Posted by *iPDrop*
> 
> How have the R9 290s or 290X's been overclocking anyone know?


I've been following this very closely, and from what I've seen, around 1100MHz core with the default BIOS and up to 100% fan speed. Higher on aftermarket air coolers. On water it can be as high as 1200-1300MHz+.


----------



## rv8000

Question for both 290x and 290 owners. Have any of you experienced coil whine on your cards, if so what brand?


----------



## josephimports

Quote:


> Originally Posted by *Gunderman456*
> 
> I've been following this very closely, and from what I've seen, around 1100MHz core with the default BIOS and up to 100% fan speed. Higher on aftermarket air coolers. On water it can be as high as 1200-1300MHz+.


This sounds about right. I'll have my shot at it come Monday; Asus 290 with Gelid Icy cooler en route.

FYI, the Gigabyte 290 just became available at Newegg.


----------



## Mr357

Quote:


> Originally Posted by *rv8000*
> 
> Question for both 290x and 290 owners. Have any of you experienced coil whine on your cards, if so what brand?


Yep, in high FPS situations (100+) I get a ton of coil whine out of my Sapphire.


----------



## Arm3nian

Quote:


> Originally Posted by *CallsignVega*
> 
> Sweet, 1300+? That should pass my 1300 MHz Titans in multi-monitor by a good 10-15%. Will probably try to get my hands on two and go from there. Any benchmarks of these 1300 MHz core specimens destroying in benchmarks?


We have gotten some nice scores in 3DMark 11. Other than that, the 1340/1700 290x got 82+ fps in Valley. But Valley is Nvidia-favored and really likes memory clocks, which we don't have voltage adjustments for currently. Those are a couple of examples; we should get more results from others too. The OC can only increase with AB and other software updates, and performance can only increase with drivers.


----------



## Slomo4shO

Yes, the camera on the Nexus 4 sucks, too lazy to find my camera


----------



## Taint3dBulge

Quote:


> Originally Posted by *rv8000*
> 
> Question for both 290x and 290 owners. Have any of you experienced coil whine on your cards, if so what brand?


I have with mine, an ASUS, but it seems like it's not as bad as it was. There's still some, but a little less maybe...


----------



## flopper

Quote:


> Originally Posted by *Gunderman456*
> 
> I've been following this very closely, and from what I've seen, around 1100MHz core with the default BIOS and up to 100% fan speed. Higher on aftermarket air coolers. On water it can be as high as 1200-1300MHz+.


1250MHz on water seems common; a few have hit 1300MHz so far.
Maybe Afterburner, once it's out, might unlock some more MHz.


----------



## rv8000

Quote:


> Originally Posted by *Mr357*
> 
> Yep, in high FPS situations (100+) I get a ton of coil whine out of my Sapphire.



I love my 120Hz monitor and hate it for this very reason. I went through 5 cards just to get away from coil whine (7950 > 7970 > 670 > 680 > 780), and although I still occasionally get some on my 780, I don't really seem to notice it. Seeing as they're all reference right now, it's not like a different brand is an option; it's just luck of the draw. I want to get rid of my 780 before the value drops even more, as I have a feeling it will, and there's unfortunately no specific confirmation date on aftermarket models for the 290.


----------



## iPDrop

What kind of overclocks are you guys getting?


----------



## MIGhunter

Quote:


> Originally Posted by *rv8000*
> 
> Question for both 290x and 290 owners. Have any of you experienced coil whine on your cards, if so what brand?


I don't have any on my Sapphire card, but after I installed it, it had a clicking that sounded like a fan blade hitting the housing. I shut down my PC, took the side off and looked at it: nothing. It hasn't done it since, so I don't know what it was.


----------



## rv8000

Quote:


> Originally Posted by *MIGhunter*
> 
> I don't have any on my Sapphire card but after I installed it, it had a clicking that sounded like a fan blade hitting the housing. I shutdown my pc, took the side off and looked at it, nothing. It hasn't done it since so I don't know what it was.


Do you run games with vsync enabled, or above 100+ fps?


----------



## jomama22

Quote:


> Originally Posted by *CallsignVega*
> 
> Sweet, 1300+? That should pass my 1300 MHz Titans in multi-monitor by a good 10-15%. Will probably try to get my hands on two and go from there. Any benchmarks of these 1300 MHz core specimens destroying in benchmarks?
> What do you mean outlast? Are you referring to simply the VRAM amount? I am not really interested in the 780ti, I like the idea of the 290x's 512 bit bus better for my high FPS multi-monitor config.
> 
> I am getting the new Eizo's that can run "strobing" backlight motion clarity on any GPU, so no worries about brand there. The 4GB 290x will be plenty for my 6 mil pixel Eyefinity setup and I don't run much AA. I only keep GPU's for 9-12 months anyway.


Just for you:

Ignore the overall score (was running at 4.7)

http://www.3dmark.com/fs/1098036

6600 gfx score
That's at 1315/1500. Ran valley @ 1340 max but best was 1320/1700.

FSE takes a dump on AMD, as you know, so to say it beats most Titans out there in their best bench (excluding single-card Valley), I'd say it's pretty good.

Though out of the 5 I have tested so far, the average seems to be ~1250 @ 1.3V and 1280 @ 1.35V.

If you get the memory up to 1700, these things really open up.


----------



## Xtreme21

Add me. Just got her today: Sapphire R9 290, currently stock, on Windows 8.1 Pro running 13.11 Beta 8.


----------



## ivanlabrie

Very nice gents...hope to join your ranks soon.


----------



## Arizonian

Quote:


> Originally Posted by *Xtreme21*
> 
> Add me. Just got her today Sapphire R9 290 currently stock on Windows 8.1 Pro running 13.11 Beta 8.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Xtreme21

Thank you! Now to start some benching =D


----------



## iPDrop

Ah cool, what are some easy ways to put it under water? Are there some closed loops I can buy that will fit on them?


----------



## spitty13

Just wanted to give a heads up: my ASUS 290 came with Elpida RAM. Hopefully it will still perform as well as Hynix.


----------



## jomama22

AIB doesn't matter. You can get Elpida/Hynix from any of them. Just an FYI.


----------



## Bloitz

Looks like they don't do too shabby in folding, so chances are I'm ordering a 290 + delicious EK block tomorrow.


----------



## CallsignVega

Quote:


> Originally Posted by *jomama22*
> 
> Just for you:
> 
> Ignore the overall score (was running at 4.7)
> 
> http://www.3dmark.com/fs/1098036
> 
> 6600 gfx score
> That's at 1315/1500. Ran valley @ 1340 max but best was 1320/1700.
> 
> Fse takes a dump on amd as you know, so to say it beats most Titans out there in their best bench (excluding single card valley) I'd say its pretty good.
> 
> Though out of the 5 I have tested so far, the average seems to be ~ 1250 @ 1.3v and 1280 @ 1.35.
> 
> If you get the memory up to 1700, these things really open up.


Thanks, ya the 290x doesn't seem to do much over the Titan at low resolutions. I would pretty much only switch because of the slight bump at higher resolutions + slightly better multi-GPU scaling. Although not looking forward to the higher energy use and heat.


----------



## Gunderman456

Quote:


> Originally Posted by *iPDrop*
> 
> Ah cool what are some easy ways to put it under water? are there some closed loops I can buy that will fit on them?


There is one closed-loop VGA water cooler, the Arctic Accelero Hybrid:

http://www.arctic.ac/worldwide_en/products/cooling/vga/accelero-hybrid.html


----------



## sugarhell

Quote:


> Originally Posted by *CallsignVega*
> 
> Thanks, ya the 290x doesn't seem to do much over the Titan at low resolutions. I would pretty much only switch because of the slight bump at higher resolutions + slightly better multi-GPU scaling. Although not looking forward to the higher energy use and heat.


Winter is coming


----------



## jomama22

Quote:


> Originally Posted by *CallsignVega*
> 
> Thanks, ya the 290x doesn't seem to do much over the Titan at low resolutions. I would pretty much only switch because of the slight bump at higher resolutions + slightly better multi-GPU scaling. Although not looking forward to the higher energy use and heat.


Lol, I hear ya. I'll be going to 3x 1080 120Hz, so it was a no-brainer for me. Yeah, these things need some volts to get up there for sure.


----------



## sugarhell

Yeah a lot of volts. 1.55


----------



## jomama22

Quote:


> Originally Posted by *sugarhell*
> 
> Yeah a lot of volts. 1.55


Pretty sure they are talking about their CPUs in that shot. As you know, the 290x will only boost by itself up to ~1.21V.

But these PCBs can take a licking; I was running at upwards of 1.42+ volts.


----------



## Mr357

Quote:


> Originally Posted by *sugarhell*
> 
> Yeah a lot of volts. 1.55


Okay, time to put 1.55V through my card and see if it can do 1400 or better.


----------



## sugarhell

Searching with a hex editor, they have over 250 power states up to 1.55V, but most of them are locked. This is the voltage regulator for the 290X, and you can adjust the volts up to 1.55V.


----------



## Slomo4shO

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added


You missed my proof post or haven't updated the OP.

Ironically enough, these release delays are going to delay my use of these cards. I won't be switching out the quadfire until after Thanksgiving. Might get 2 more and go water around Black Friday if a deal presents itself, or switch over to non-reference designs, since Newegg will allow returns of these cards (I got a chat rep to agree to it). Either way, I will be replacing the PSU along with the cards in the current rig.


----------



## brazilianloser

Curious about you guys running two 290 non-X cards: what power supplies...? Anyone done any testing on power consumption in CrossFire? I've got a few days to return my Corsair AX860, which I am sure will give me no headroom if I were to run two 290s in a water loop and such.


----------



## Arizonian

Newegg has the Gigabyte 290 up on sale if you're interested.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125479


----------



## fleetfeather

Gigabyte is using Elpida, yes?


----------



## Arizonian

Quote:


> Originally Posted by *fleetfeather*
> 
> Gigabyte is using Elpida, yes?


*shrug* I think all 290's are using Elpida. We haven't had any members confirm what's in their 290's. Thread's been moving fast the last couple days.


----------



## esqueue

Quote:


> Originally Posted by *fleetfeather*
> 
> Gigabyte is using Elpida, yes?


From reading these posts, they all can have either Hynix or Elpida.


----------



## brian015

Anyone interested in buying my Sapphire 290X w/ BF4 off me? I got in on the 780 Amazon price mistake and am going to keep that instead. Paid $559.49, so just looking for the same back out of it; thought I would offer it up before sending it back to Newegg.


----------



## SpewBoy

Quote:


> Originally Posted by *brazilianloser*
> 
> Curious at you guys with two 290 non X power supplies...? Anyone done any testing on power consumption in crossfire? I got a few days to return my corsair ax860 which I am sure will give me no headroom if I was to run two 290 water loop and such.


I'm running an AX860 with two 290Xs (stock), a 4770K (stock atm), 2 120mm fans, a 140mm fan, sound card, and 2 HDDs + 3 SSDs and this morning I've been having a few issues.

First, when I booted I got no video output whatsoever. Rebooted and I got video output but my HDD raid had disappeared by the time I got into Windows (I'm pretty sure it was present during POST). Rebooted again and all devices appear to be showing correctly.

Take this however you want but that kinda sounds like PSU trouble to me... which is worrying considering nothing is overclocked and my system shouldn't be using 860W even at full load from the information I've gathered.
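A rough power budget for a rig like this can be sanity-checked with back-of-envelope arithmetic. The figures below are nominal, assumed values, not measurements (roughly 290W board power per reference 290X, 84W TDP for a stock 4770K, plus a loose allowance for the rest):

```python
# Back-of-envelope power budget for the AX860 setup described above.
# All wattages are assumed nominal figures, not measured draws.
loads_w = {
    "R9 290X x2 (board power)": 2 * 290,
    "i7-4770K (stock)": 84,
    "motherboard/RAM": 50,
    "fans, sound card, drives": 40,
}

total = sum(loads_w.values())
print(f"Estimated peak draw: {total} W of 860 W")  # ~100 W headroom left
```

On those assumptions, two stock 290Xs leave only about 100W of headroom on an 860W unit, and raised power limits or an overclock could erase it, which fits the PSU suspicion in the post above even before assuming a faulty unit.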


----------



## Snyderman34

Quote:


> Originally Posted by *Arizonian*
> 
> Are they 290X or 290's? Can't tell from the box. Post back with an OCN name to make it official on the roster.
> 
> Edit: GPU-Z validation with OCN name would work.


:fp: I did forget that. I'll have it up in a bit. They're 290s.


----------



## Arizonian

Quote:


> Originally Posted by *SpewBoy*
> 
> I'm running an AX860 with two 290Xs (stock), a 4770K (stock atm), 2 120mm fans, a 140mm fan, sound card, and 2 HDDs + 3 SSDs and this morning I've been having a few issues.
> 
> First, when I booted I got no video output whatsoever. Rebooted and I got video output but my HDD raid had disappeared by the time I got into Windows (I'm pretty sure it was present during POST). Rebooted again and all devices appear to be showing correctly.
> 
> Take this however you want but that kinda sounds like PSU trouble to me... which is worrying considering nothing is overclocked and my system shouldn't be using 860W even at full load from the information I've gathered.


Yup, that sure may be the case. Sorry to hear this. Have you tried gaming on them? If so, what happened?

I'm looking at settling with one GPU with my AX850. For CrossFire I'd have to invest even further and upgrade my PSU too. Bummed, as I'm stretching my budget already going CrossFire, let alone upgrading a perfectly fine year-old PSU. So I'm thinking one now, and after Christmas shopping, in Feb, upgrade the rest of the way I guess. *sigh*

Darn these power-hungry performance beasts.


----------



## skupples

so is anyone under water getting to 7k mem or higher?
Quote:


> Originally Posted by *Arizonian*
> 
> Yup sure may be the case. Sorry to hear this. Have your tried gaming on them? If so what happened?
> 
> I'm looking at settling with one GPU with my AX850. Crossfire I'd have to invest even further and upgrade my PSU too. Bummed as I'm stretching my budget already if I went crossfire let alone upgrade a perfectly fine year old PSU. So I'm thinking one now and after Christmas shopping in Feb to upgrade the rest if the way I guess. *sigh*
> 
> Darn these power hungry performance beasts.


ADD2PSU. A great way to save money really; takes up a bit more space but is well worth not having to scrap one PSU for a bigger one.


----------



## rv8000

Could not resist any longer; Sapphire R9 290 on the way. GPUs are just too fun to mess with.


----------



## Mr357

Quote:


> Originally Posted by *skupples*
> 
> so is anyone under water getting to 7k mem or higher?
> ADD2PSU Great way to save money really, takes up a bit more space but is well worth not having to scrap one psu for a bigger one.


So far I've gotten to 6000 without a hiccup. I'll see how much more it will tolerate.


----------



## jtjoetan

Hey guys, is it difficult to install an aftermarket cooler on the R9 290?
These are the aftermarket coolers I can get: http://www.vismart.com.my/category/products/it-products/arctic/arctic-cooling-fan
Which one is suitable for the R9 290? Thanks very much, and have a nice day everyone!


----------



## jrcbandit

I am using the Asus BIOS to change voltage, and I thought there wouldn't be much difference between 1.35V and 1.41V, since the Asus BIOS with vdroop maxes out at 1.299V. However, I was able to up my overclock by a good amount switching to 1.41V: I was stuck at 1190 and now can do 1230 without any artifacting; even 1250 worked well except for artifacts every once in a while.


----------



## Taint3dBulge

What are the best ASIC scores, and how do those translate into OCing? Also, what program do you use to tell what memory your card is using, Elpida or Hynix? Thinking now I might keep this card; the coil whine is starting to ease up, and after hearing how many people have the same thing, well, I might just keep it...

It really just needs some new cooling and I'm golden.


----------



## DampMonkey

Quote:


> Originally Posted by *skupples*
> 
> so is anyone under water getting to 7k mem or higher?


Without memory voltage control, I don't think we'll see 7k any time soon. Non-reference might make it more attainable, but the guys on LN2 managed 6600 on their reference cards. I usually hang around 6000 because pushing it farther doesn't seem to add a lot of performance anyway.


----------



## Moustache

Quote:


> Originally Posted by *jtjoetan*
> 
> Mmm okay.. thanks for the advise. I live in malaysia.


Lol, I knew it. I'm from M'sia as well. If you live in KL, you could try Lowyat or Compuzone. Recently, Compuzone priced the Sapphire 290 at RM1599, which is quite expensive. BTW, their Asus 290X with BF4 is priced at RM2399; that's pretty expensive as well. Unfortunately, I don't live in KL, but I use Compuzone's prices as a reference; check their website. You can also try Newegg M'sia, but they're currently closed. I would assume the reference 290 is at RM1349 (without shipping) if you decide to order via them. It could get cheaper depending on the exchange rate they use; right now it's at $1=RM3.35. IMO, it's better for you to buy from your local store instead of dealing with too many things when ordering.


----------



## Aussiejuggalo

So how's the 290 in the real world @ 1080p compared to the reviews? Or compared to a 780, if anyone's tested both.

Stuck between the 290 and a 780 atm.


----------



## Arm3nian

Quote:


> Originally Posted by *DampMonkey*
> 
> Without memory voltage control, i don't think we'll see 7k any time soon. Non-reference might make it more attainable, but the guys on LN2 managed 6600 on their reference cards. I usually hang around 6000 because pushing it farther doesn't seem to add a lot of performance anyway


jomama got 6800 on water with no hard mods, so the LN2 guys were doing something wrong lol. And memory helps benches like Valley; core seems to make little difference on Valley.


----------



## FtW 420

Quote:


> Originally Posted by *Arm3nian*
> 
> jomama got 6800 on water and no hardmods so the ln2 guys were doing something wrong lol. And memory helps benches like valley, core seems to make little difference on valley.


Memory doesn't always like being too cold, while core clocks go up on ln2 not all memory likes to OC as much as on air or water. Some cards are fine all frozen, some are a hassle.


----------



## Bloitz

Couldn't sleep, so: Gigabyte R9 290 + EK full-cover block ordered.

Had to pay via bank transfer, so it will take 2-3 days to get processed, probably. Ze Germans don't trust us Belgians, it seems. Perhaps I should remind them who invaded whom all those years ago.


----------



## Sazz

Ok, here's my story again.

So I returned the first 290X that I bought because the DisplayPort stopped showing an image.
Here are some stats of what the first 290X had:

ASIC Rating: 70.2%
Memory: Hynix
Max stable OC: 1100/1350

Now I got another 290X as a replacement, and this one got a 69.9% ASIC, so I was like "crap, I got even lower; this would be a bad overclocker -_-". And then I checked the memory: "Elpida". Oh crap, double whammy on my face!

So I tested it. I did 1100/1375 no sweat; alright, I upped the core to 1150 and started to have artifacts. Backed it down and got to 1120 on the core clock stable. Hmm, I was thinking it should be a little lower on OC, but I actually got 20MHz extra; well, I'll take that!

Then I came to testing the memory clocks. I went nuts and went for 1500; Firestrike would just stop mid-test, so I backed it down and settled at 1430MHz on memory stable, so I was like "hmm, I thought Hynix were supposed to be the better ones... I got an extra 55MHz on this one."

Anyway, I guess Hynix or Elpida, it doesn't matter; some of these memory chips will like OC, some won't. And as for ASIC, I was surprised that the extra 0.3% didn't matter LOL. I sure would've loved at least a mid-70s ASIC rating so that I could get 1150 on the core at stock voltage/BIOS.


----------



## skupples

Quote:


> Originally Posted by *DampMonkey*
> 
> Without memory voltage control, i don't think we'll see 7k any time soon. Non-reference might make it more attainable, but the guys on LN2 managed 6600 on their reference cards. I usually hang around 6000 because pushing it farther doesn't seem to add a lot of performance anyway


Oh, I was under the impression you guys had access to memory voltage. It's something I have been dreaming about with my Titans for ages. Luckily Skyn3t's new BIOS has allowed me to run 7k easily in everything I do under 1.212V (stock limit).
Quote:


> Originally Posted by *Sazz*
> 
> Ok, here's my story again.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> So I returned the first 290X that I bought coz the Display port stopped showing image.
> Here are some stats of what the first 290X have,
> 
> ASIC Rating: 70.2%
> Memory: Hynix
> Max stable OC: 1100/1350
> 
> Now I another 290X as a replacement and this one got 69.9% ASIC so I was like "crap, I got even lower this would be a bad overclocker -_-". And then I checked the memory, "Elipdia" oh crap, double wammy on my face!
> 
> So I tested it, I did 1100/1375 no sweat alright, I did up the core to 1150 and now I started to have artifacts.. put it down and I got to 1120 on core clock stable, hmm... I was thinking it should be a little lower on OC but I actually got 20Mhz extra, well I'll take that!.
> 
> Now I came to testing the memory clocks, I went nuts and went for 1500, Firestrike will just stop mid test, so I put it down and I went as low as 1430Mhz on memory stable so I was like "hmm, I thought Hynix should be the better ones... I got extra 55Mhz on this one"
> 
> Anyways I guess Hynix or Elipdia it doesn't matter, some of these memory will like OC some won't.. and as for ASIC, I was surprised that extra .3% didn't matter LOL.. I sure would've loved atleast mid 70's ASIC rating so that I can get 1150 on the core at stock voltage/BIOS.


Memory brand seems to mean little; ASIC quality seems to mean even less in most instances. What does matter is the quality of the IMC, which seems to vary greatly from GPU to GPU.

My 67% Titan clocks much better than my 87% card. The only difference I really see is the amount of voltage required from one to the other: the 87% card idles @ 0.919V, the 67% card idles @ 0.887V.


----------



## Slomo4shO

Quote:


> Originally Posted by *Arizonian*
> 
> Yup sure may be the case. Sorry to hear this. Have your tried gaming on them? If so what happened?
> 
> I'm looking at settling with one GPU with my AX850. Crossfire I'd have to invest even further and upgrade my PSU too. Bummed as I'm stretching my budget already if I went crossfire let alone upgrade a perfectly fine year old PSU. So I'm thinking one now and after Christmas shopping in Feb to upgrade the rest if the way I guess. *sigh*
> 
> Darn these power hungry performance beasts.


You can usually find the V1000 for around $130 when it is on promo, and there are a few different 1200-1300W gold-rated quality PSUs that regularly go on sale for around $160-170... In fact, the KINGWIN LZG-1300 is currently $165.50 after rebate and the XFX ProSeries 1250W is $178.25 after rebate at Newegg. I am sure there will be better deals during Black Friday. I am looking for a deal on the Lepa G1600, since it is the only model over 1300W that fits my case.


----------



## Arm3nian

Quote:


> Originally Posted by *skupples*
> 
> oh, i was under the impression you guys had access to memory voltage. It's something I have been dreaming about with my titans for ages. Luckily Skyn3t's new bios has allowed me to run 7k easy in everything i do under 1.212(stock limit)


I mentioned in the BE club that we didn't have access to the mem voltage lol. Maybe we will though


----------



## Slomo4shO

Use Promo Code *NAFSAVE5NOV6G* for 5% off at Newegg.


----------



## alancsalt

You guys have probably already seen... (All Quad cards)


----------



## ImJJames

Quote:


> Originally Posted by *alancsalt*
> 
> You guys have probably already seen... (Quad card)


#2 is also a quad card, right?


----------



## Sazz

Ok, I have been experimenting with these fan speeds, to keep the card from throttling while staying as quiet as possible.

Found out that at +10% power limit, a 72% fan speed at 85C keeps my card flat out at the full 1000MHz core clock, with temps at a flat 90C all through the 15-minute FurMark benchmark.

I set my fan speed points like this:

40% = 40C
45% = 50C
50% = 60C
60% = 75C
72% = 85C

When I set the power limit at default it throttles between 920 and 980MHz; set it to +7% and you will see it dip down over and over, at maybe 1-3 second intervals, as low as 960MHz; set the power limit to +10% and I never see any sign of throttling.

Tomorrow after my DEP meeting I'm going to start putting the card under water cooling; will probably post some numbers on Monday.
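The fan points above describe a piecewise-linear curve. A minimal sketch of how a tool like Afterburner or GPU Tweak might map temperature to duty cycle (the points are from the post; the linear interpolation between them is an assumption, actual tools may behave differently):

```python
# Piecewise-linear fan curve built from the points in the post above.
# (temp C, fan %) pairs; interpolation between points is an assumption.
CURVE = [(40, 40), (50, 45), (60, 50), (75, 60), (85, 72)]

def fan_speed(temp_c: float) -> float:
    """Return fan duty cycle (%) for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            # Linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(70))  # falls between the 60C/50% and 75C/60% points
```

Above 85C the sketch just holds 72%, matching the top point in the post; a real profile would likely ramp to 100% as a safety net.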


----------



## alancsalt

Above: it's the current quad-card list... all of them are quad-GPU.


----------



## th3illusiveman

Any news on when the non-ref cards are coming out?


----------



## Spectre-

Quote:


> Originally Posted by *th3illusiveman*
> 
> Any news on when the non-ref cards are coming out?


Last time I heard, the 290X non-reference cards are coming out next week.

We all know that's a rumour, though.


----------



## PillarOfAutumn

Omg... I was in a rush to buy the 290 for fear of it selling out, so I did something incredibly stupid. Without looking at the shipping options, I just went for whichever was cheapest and placed the order. I'm in NJ and this is shipping from California. These guys are going to ship via FedEx and then have USPS deliver it, so the earliest I will get it is the 14th. What pisses me off is that USPS is involved; these guys have lost my last two packages and delayed delivery of others by a week. They mark it "delivered" but it's never in my mailbox or by my door. Luckily it was through Amazon that I bought those items, so they not only replaced them but had them delivered next day via UPS, which is a lot more reliable. I'm just scared that the same crap is going to happen, but with Newegg it's going to be a huge hassle to start the refund process or to make them believe it wasn't delivered even though it's marked delivered. I could have 1) paid $2 extra to have UPS do a 3-day delivery, or 2) driven about 45 minutes to a Newegg warehouse for pickup... really disappointed by this :-(


----------



## tsm106

http://www.3dmark.com/fs/1069571

Check this out. Look at the the graphics score. This is on the stock cooler.

Quote:


> Originally Posted by *alancsalt*
> 
> Above: It is the current quad card list....all of them are Quad GPU


----------



## th3illusiveman

Quote:


> Originally Posted by *Spectre-*
> 
> last time i heard 290X non reference are coming out next week
> 
> we all know that is a rumour


I hope so. I want to see what these cards can really do. I bought a reference 7970 and the blower on that thing drove me insane; never again. Plus this cooler causes so much throttling it makes reviews kinda pointless. I have no clue why they are taking so long to allow third-party coolers to market.
Quote:


> Originally Posted by *tsm106*
> 
> http://www.3dmark.com/fs/1069571
> 
> Check this out. Look at the the graphics score. This is on the stock cooler.


Your ears tho.


----------



## esqueue

The more I think about it, the more I wonder if my water cooling setup will be adequate for my 3770k http://valid.canardpc.com/2irqu1 and 290x. It's a '77 Bonneville heater core with a "custom" shroud, a Swiftech MCP350 pump, and 2 pretty decent 120mm fans. I honestly can't remember their CFM, but they kept an Xbox 360 CPU/GPU at 41°C/36°C during the day when my non-air-conditioned room was scorching, and 36°C/31°C during night gaming.


----------



## ImJJames

Quote:


> Originally Posted by *alancsalt*
> 
> Above: It is the current quad card list....all of them are Quad GPU


Damn 290x has sick potential


----------



## tsm106

Quote:


> Originally Posted by *th3illusiveman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> http://www.3dmark.com/fs/1069571
> 
> Check this out. Look at the graphics score. This is on the stock cooler.
> 
> 
> 
> Your ears tho.
Click to expand...

Dude, it's almost an insult; you didn't even look at the graphics score in comparison to the top 2 scores.


----------



## NateN34

Quote:


> Originally Posted by *Spectre-*
> 
> last time i heard 290X non reference are coming out next week
> 
> we all know that is a rumour


Hope this is true.

Getting really antsy and almost about to pull the trigger on a reference one.


----------



## th3illusiveman

Quote:


> Originally Posted by *tsm106*
> 
> dude it's almost an insult, you didn't even look at the graphics score in comparison to the top 2 scores.


No need to get offended lol. I didn't notice it was Fire Strike Extreme; I thought it was a 3DMark 11 Performance graphics score with a single card because I closed the tab so quickly. I've seen a dude pull 19K in 3DMark 11 with one of these under water, so I wasn't immediately impressed. Now, however... good job... but that's to be expected from you, I guess.


----------



## alancsalt

Quote:


> Originally Posted by *tsm106*
> 
> http://www.3dmark.com/fs/1069571
> 
> Check this out. Look at the graphics score. This is on the stock cooler.


Right now, that would take over my 19th place in Tri GPU with GTX 580s (11956)

http://www.hwbot.org/benchmark/3dmark_-_fire_strike/rankings?start=0#interval=20#cores=3#start=0


----------



## nemm

Quote:


> Originally Posted by *Arizonian*
> 
> *shrug* Think all 290's are using Elpida. We haven't had any members confirm what's in their 290's. Thread's been moving fast the last couple of days.


I have one with Hynix and one with Elpida.
They're consecutive serial numbers as well, so it's just luck.


----------



## Snyderman34

Ok, so here's my proof:



Now to the good stuff (pic heavy)


Spoiler: Warning: Spoiler!



NOTE: Overclocks were at 1125/1473, save for 3DMark Fire Strike (1100/1437 due to minor artifacting). Also, Heaven would not play nice with Crossfire, so no Heaven Crossfire results right now. CPU is an i7-4770K at stock. Cards used were a Sapphire HD 7970 Boost and two Sapphire R9 290s. PSU is a Corsair HX1050.

In Heaven 4.0 (following OCN's recommendations for the Heaven 4.0 thread for 1080p):

7970, stock


R9 290, stock


R9 290, OC


Heaven 4.0, OCN recommended settings @ 1440p

7970, stock


R9 290, stock


R9 290, OC


3DMark (3DMark 11 and Fire Strike, non-Extreme)

7970, stock (Result # 3 is Fire Strike. Result # 4 is 3DMark 11)


R9 290, stock (Result #3 is Fire Strike. Result #4 is 3DMark 11)


R9 290, OC (3DMark 11)


R9 290 XFire, OC (3DMark 11)


R9 290, OC (Fire Strike)


R9 290, OC (Fire Strike)




I'm pretty happy with them. Temps were sitting around 80°C during benching (fan curve in AB set to 10°C = 10% fan). They do get a little loud, but nothing overbearing IMO. After the first of the year, I may throw waterblocks on them. Wife says I have spent enough. Hope these help someone out.


----------



## SpewBoy

Quote:


> Originally Posted by *Arizonian*
> 
> Yup, sure may be the case. Sorry to hear this. Have you tried gaming on them? If so, what happened?
> 
> I'm looking at settling with one GPU with my AX850. For Crossfire I'd have to invest even further and upgrade my PSU too. Bummed, as I'm stretching my budget already if I went crossfire, let alone upgrading a perfectly fine year-old PSU. So I'm thinking one now, and after Christmas shopping in Feb, upgrade the rest of the way I guess. *sigh*
> 
> Darn these power hungry performance beasts.


It's odd. I can game all day on these (though tbh I haven't tried any really intensive games yet, but I have done some benchmarks) and my system performs absolutely fine. It has only been while booting that issues have occurred, plus one black screen while running Heaven v3 for 10+ minutes on my older 2600K rig (which is using an OCZ ZX 1000W PSU).

I purchased this AX860 a few weeks before the cards because I didn't know how power hungry they'd be. If this issue keeps happening I may have to sell it. I saved around 50 bucks by purchasing it overseas so a return isn't really feasible but I may be able to recoup my losses by selling it on eBay as a used PSU that has only been running for 3 days (that should be worth almost retail surely) and purchasing an AX1200.


----------



## Bloitz

Quote:


> Originally Posted by *Snyderman34*
> 
> /snip...
> After the first of the year, I may throw waterblocks on them. Wife says I have spent enough. Hope these help someone out.


I can understand that your wife is in charge, but that's kind of "stupid". Waterblocks don't really drop in price unless they're for severely outdated tech (at least from what I've seen). Might as well make the investment now and enjoy them for a year and then some.

PS: Take her out to dinner. Women love food, but don't tell them or they think you're calling them fat. Women are weird and they scare me.


----------



## Sazz

shoot forgot the link xD


----------



## Arizonian

Quote:


> Originally Posted by *Snyderman34*
> 
> Ok, so here's my proof:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Now to the good stuff (pic heavy)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> NOTE: Overclocks were at 1125/1473 save for 3DMark Fire Strike (1100/1437 due to minor artifacting). Also, Heaven would not play nice with Crossfire, so no Heaven Crossfire right now). CPU is an i7-4770k at stock. Cards used were a Sapphire HD7970 Boost and 2 Sapphire R9 290s). PSU is a Corsair HX1050.
> 
> In Heaven 4.0 (following OCN's recommendations for the Heaven 4.0 thread for 1080p):
> 
> 7970, stock
> 
> 
> R9 290, stock
> 
> 
> R9 290, OC
> 
> 
> Heaven 4.0, OCN recommended settings @ 1440p
> 
> 7970, stock
> 
> 
> R9 290, stock
> 
> 
> R9 290, OC
> 
> 
> 3DMark (3DMark 11 and Fire Strike (Fire Strike non Extreme))
> 
> 7970, stock (Result # 3 is Fire Strike. Result # 4 is 3DMark 11)
> 
> 
> R9 290, stock (Result #3 is Fire Strike. Result #4 is 3DMark 11)
> 
> 
> R9 290, OC (3DMark 11)
> 
> 
> R9 290 XFire, OC (3DMark 11)
> 
> 
> R9 290, OC (Fire Strike)
> 
> 
> R9 290, OC (Fire Strike)
> 
> 
> 
> 
> I'm pretty happy with them. Temps were sitting around 80C during benching (fan curve in AB set to 10C=10% fan). They do get a little loud, but nothing overbearing IMO. After the first of the year, I may throw waterblocks on them. Wife says I have spent enough. Hope these help someone out.


Congrats - Added

Question... you're running two on an AX850 PSU - any issues? Have you gamed?


----------



## Snyderman34

Quote:


> Originally Posted by *Bloitz*
> 
> I can understand your wife is in charge but that's kind of "stupid". WB don't really drop in price unless they're for severely outdated tech (at least from what I've seen). Might as well do the investment now and enjoy them for a year and then some more.
> 
> PS: Take her out to dinner. Women love food, but don't tell them or they think you're calling them fat. Women are weird and they scare me.


It isn't like that. I've dropped $1500 or so in the last two weeks, so that's probably enough anyway. I don't mind the noise. Better to wait for tax return time for me anyway. The funds, they are low.
Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - Added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Question... you're running two on an AX850 PSU - any issues? Have you gamed?


Running them on an HX1050. Haven't gamed yet. Just finished benching


----------



## Bloitz

Quote:


> Originally Posted by *SpewBoy*
> 
> I purchased this AX860 a few weeks before the cards because I didn't know how power hungry they'd be. If this issue keeps happening I may have to sell it. I saved around 50 bucks by purchasing it overseas so a return isn't really feasible but I may be able to recoup my losses by selling it on eBay as a used PSU that has only been running for 3 days (t*hat should be worth almost retail surely*) and purchasing an AX1200.


Not sure where you're from, but in Belgium, with pretty much everything (electronics, cars, etc.), as soon as you take it out of the box it's worth only 75% of the retail value (generally speaking; something like a 290X that's out of stock everywhere might fetch near the retail price). But what you don't ask for, you will never get.


----------



## Arizonian

Quote:


> Originally Posted by *Snyderman34*
> 
> -snip-
> Running them on an HX1050. Haven't gamed yet. Just finished benching


Ah, OK - your sig rig lists a Cooler Master Silent Pro 850W - that fooled me.


----------



## Snyderman34

Quote:


> Originally Posted by *Arizonian*
> 
> Ah ok - your sig rig listed - Cooler Master Silent Pro 850W - fooled me.


Yeah, working on updating that. lol


----------



## Doug2507

Apologies if this has been discussed in the thread already, but I've not had time to read through it yet. Is there a custom BIOS available for the 290X, or one in the pipeline?


----------



## Forceman

Choices, choices.



Put the 290 in and it immediately ran through a Metro bench at 1100/1300, so I think I'll just keep it. At that speed it was exactly even with the 290X I had before. I think the fan on this card is more annoying than the 290X fan, though - it has a higher, whinier pitch to it, and it seems to cycle a lot at idle. Not too concerned, since it'll be under water by this weekend (which reminds me, I forgot to take a pic of the block), but annoying in the interim. 75% fan kept the GPU temps under 75°C, though man it was loud. I had the target temp set to 95°C, but it ramped the fans all the way up to 75% even though the temp never got that high; not sure if that is a bug or what.

Edit: Oh, and the void if removed stickers were no big deal. They weren't really attached very well, since they were on the screws, and I was easily able to just lift them off with a razor blade. Parked them on some pins sticking up so I could put them back later, and off we go. Only tricky part was when I accidentally put one on the smooth PCB temporarily - that one stuck pretty good, but even then the razor blade lifted it off without any issue.


----------



## nemm

For those wondering if 850W will be enough for crossfire 290/290X with overclocks, I'll be testing my AX850 later, running a 3770K at 4.8 @ 1.336V and the graphics overclocked to the max under water. After looking at reviews and seeing the power draw, I have a 1300W (overkill, I know, but this is for a future trifire and Haswell-E build also) on standby.

Currently using the PT1 BIOS on my 290 for constant voltage to see how far I can push it; the ASUS 290, ASUS 290X, and PT3 BIOSes had voltage varying too much to find stable clocks. The 290 with Elpida memory isn't too great with overclocks so far: 1100/1450 @ stock 1.227V, 1125/1450 @ 1.265V, 1150/1450 @ 1.335V. Running at 75% fan, the highest temp registered has been 83°C.


----------



## SpewBoy

Quote:


> Originally Posted by *Bloitz*
> 
> Not sure where you are from, but in Belgium with pretty much everything (electronics, cars etc. ) as soon as you take it out of the box it's only worth 75% of the retail value anymore (generally speaking, something like a 290X that's out of stock everywhere might fetch near the retail price). But what you don't ask, you will never get


If I sell it locally, people will be expecting retail prices of around $280-290 for an AX860, so:

0.75 x 280 = $210

If I purchased it overseas for $230, that's a loss of $20, which isn't too bad (well, add a 10% or whatever eBay fee to that and I lose like 40 bucks, which isn't the greatest).
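For anyone wanting to sanity-check that, the rough math can be sketched out in a few lines (the retail price, overseas price, resale fraction, and fee rate below are just the example figures from this post, not anything official):

```python
# Rough resale-loss estimate using the example figures from the post above.
retail_price = 280.0    # local retail price for an AX860 (example figure)
purchase_price = 230.0  # what it cost overseas (example figure)
resale_fraction = 0.75  # assumed "out of the box" resale value
ebay_fee_rate = 0.10    # assumed flat selling fee

sale_price = retail_price * resale_fraction        # 280 * 0.75 = 210
net_after_fees = sale_price * (1 - ebay_fee_rate)  # 210 * 0.90 = 189
loss = purchase_price - net_after_fees             # 230 - 189 = 41

print(f"Sell for ${sale_price:.0f}, net ${net_after_fees:.0f}, lose ${loss:.0f}")
```

So the fee is what turns a $20 loss into the "like 40 bucks" figure.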


----------



## Gilgam3sh

anyone with the Sapphire 290 can tell if the card is voltage locked? will it be needing another BIOS or will it be enough when MSI Afterburner gets updated?


----------



## ZombieJon

Quote:


> Originally Posted by *Forceman*
> 
> Choices, choices.
> 
> 
> 
> Put the 290 in and it immediately ran through a Metro bench at 1100/1300, so I'm think I'll just keep it. At that speed it was exactly even with the 290X I had before. I think the fan on this card is more annoying than the 290X fan though - it has a higher whinier pitch to it, and it seems to be cycling a lot at idle. Not too concerned since it'll be under water by this weekend (which reminds me I forgot to take a pic of the block), but annoying in the interim. 75% fan kept the GPU temps under 75C though, but man it was loud - I had the target temp set to 95C, but it ramped the fans all the way up to 75% even though the temp never got that high, not sure if that is a bug or what.


What was your power draw on it?


----------



## Scotty99

Quote:


> Originally Posted by *Arizonian*
> 
> Yup, sure may be the case. Sorry to hear this. Have you tried gaming on them? If so, what happened?
> 
> I'm looking at settling with one GPU with my AX850. For Crossfire I'd have to invest even further and upgrade my PSU too. Bummed, as I'm stretching my budget already if I went crossfire, let alone upgrading a perfectly fine year-old PSU. So I'm thinking one now, and after Christmas shopping in Feb, upgrade the rest of the way I guess. *sigh*
> 
> Darn these power hungry performance beasts.


Your ax850 is fine for crossfire man.

http://www.hardwaresecrets.com/article/Corsair-AX850W-Power-Supply-Review/1081/8

They easily reached 1000W of DC output at over 80% efficiency, and they said they could have got even more but were limited by their testing equipment. Corsairs are monsters, and their actual wattage output is very underrated from the factory.


----------



## nemm

Quote:


> Originally Posted by *Gilgam3sh*
> 
> anyone with the Sapphire 290 can tell if the card is voltage locked? will it be needing another BIOS or will it be enough when MSI Afterburner gets updated?


You'll be happy to know the cards are not locked. I am using the various ASUS BIOSes floating around and GPU Tweak at the moment to unlock voltages, but new AB and Trixx revisions will support voltage control with the Sapphire default BIOS.

The maximum overclock I could get on the card with Elpida memory on stock cooling is 1165/5800 @ 1.352V, 86°C. I tried for 1200 but the voltage was over 1.45V at 92°C, so that is a no-go. Still happy with the card; it's not the best overclocker, but +218 core and +800 memory is still good enough for me, and under water I may get better (awaiting mail for the block).

http://www.3dmark.com/3dm/1572439


----------



## Forceman

Quote:


> Originally Posted by *ZombieJon*
> 
> What was your power draw on it?


Forgot to check. Hang on.

At the wall it was 75 idle, peaked at 375 load (1100/1300, default voltage). HWInfo says VRMs peaked at 63C with the fan at 75%. Card feels much cooler than the 290X was, although I don't think I ran the fan as high on the 290X so that's surely why. Anyone know how to check the VID?

Edit: GPU-Z says 1.18V under load. 69.6% ASIC.
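Since wall readings include PSU losses, a rough conversion to DC draw looks like this (the ~90% efficiency figure is an assumption for a decent PSU at this load, not a measurement, and the bench delta includes whatever the CPU added during Metro, so it overstates the card alone):

```python
# Rough DC-draw estimate from the wall readings above (assumed ~90% PSU efficiency).
idle_wall = 75.0    # W at the wall, idle
load_wall = 375.0   # W at the wall, during the Metro bench
efficiency = 0.90   # assumed PSU efficiency (not measured)

idle_dc = idle_wall * efficiency   # 67.5 W
load_dc = load_wall * efficiency   # 337.5 W
bench_delta = load_dc - idle_dc    # 270 W of extra DC load during the bench

print(f"DC load ~{load_dc:.0f} W, bench delta ~{bench_delta:.0f} W")
```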


----------



## Gilgam3sh

Quote:


> Originally Posted by *nemm*
> 
> You'll be happy to know the cards are not locked. I am using the various Asus bios floating around and gpu tweak at the moment to unlock voltages but new AB and Trixx revisions will support voltage control with sapphire default bios.


oh great, thanks


----------



## SpewBoy

A few reboots and Furmark burn-ins later, and I can't get my hardware to fail. If Furmark can't overload my 860W PSU with two stock 290Xs, then surely nothing will (excluding overclocks). I'm on the verge of calling those previous boot mishaps a one-off problem, or possibly some motherboard issue (yikes!). Until it happens again, all is good on the red side (apart from some strange 100%-GPU-load-at-idle bugs; I'm using GPU-Z and I assumed it got updated to fix those, but if not, then yeah, those can be attributed to GPU-Z).


----------



## Sgt Bilko

Well, I was expecting my 8350 to turn up tomorrow, but it's sat at the shop all day (paid for on Wednesday) and won't be posted till tomorrow... which means I won't get it till Monday now... so no more testing from me for a while.


----------



## selk22

The new GPU-Z supports the 290X. Wondering if it is still causing issues for people? It reads everything correctly for me as of right now, while AB still does not report GPU usage correctly.


----------



## rdr09

Quote:


> Originally Posted by *alancsalt*
> 
> Right now, that would take over my 19th place in Tri GPU with GTX 580s (11956)
> 
> http://www.hwbot.org/benchmark/3dmark_-_fire_strike/rankings?start=0#interval=20#cores=3#start=0


Wonder where it will land on the normal preset?


----------



## SpewBoy

I don't have a GPU usage appearing for my second card in GPU-Z for some reason. Although occasionally it might pop up.

Just testing out some stock voltage OCs to see if my PSU will take it. Just doing each GPU together to get a rough idea of what they can do.

Running 1130 and 1375 in Valley with stock 4770K (stock cooler so no OC right now) gets me 4783 points with the Extreme HD preset. I have no idea if that's good or not, or even in the ballpark for two 290Xs.


----------



## 2advanced

Quote:


> Originally Posted by *selk22*
> 
> The new GPU-Z supports the 290x.. Wondering if it is still causing issues for people? It reads everything correctly for me as of right now while AB still does not report GPU usage correctly.


MSI AB Beta 16 can be found HERE

Don't know if it works with the 290/290X series cards, though...


----------



## NapalmV5

Anyone know of any info or rumors on how soon to expect an 8GB 290/290X?

That would seal the deal for me to go 4K.


----------



## SpewBoy

Quote:


> Originally Posted by *NapalmV5*
> 
> anyone know anyone have any info any rumors on how soon to expect 8gb 290/290x ?
> 
> that would seal the deal for me to go 4k


I'm pretty sure even quad cf wouldn't be able to utilise anywhere near 8GB VRAM and get playable frame rates... Unless you're going for four 290Xs, I don't think 8GB VRAM will ever be of any use to you whatsoever.


----------



## selk22

Quote:


> Originally Posted by *2advanced*
> 
> MSI AB Beta 16 can be found HERE
> 
> Dont know if it works with the 290/X series cards though....


It's working in some aspects but not others: it reads things correctly except GPU usage, and it doesn't let you control voltage in any way. It also allows custom fan profiles, which eliminates the 290X throttling you're seeing in many reviewers' benchmarks.


----------



## Sazz

So far, this is the best Fire Strike score I have achieved with any single AMD card paired with an FX-8350 at 5.1GHz.

http://www.3dmark.com/fs/1105953

Will do more benches on monday under water.


----------



## NapalmV5

Quote:


> Originally Posted by *SpewBoy*
> 
> I'm pretty sure even quad cf wouldn't be able to utilise anywhere near 8GB VRAM and get playable frame rates... Unless you're going for four 290Xs, I don't think 8GB VRAM will ever be of any use to you whatsoever.


No less than quad 290/290X 8GB; that should do just fine. That's the point - I wouldn't want to fully utilize all 8 gigs; gotta have some free VRAM for those nasty spikes.

Don't worry, man, I know how to utilize VRAM - I mod just about every game I play - for me, 3GB is not enough at 1080p.


----------



## Ukkooh

Yay! Finally got confirmation from Club3D that I can put a waterblock on their 290X without losing the warranty. I've emailed every single manufacturer, and they seem to be the only ones who care about watercoolers worldwide.


----------



## kantxcape

Quote:


> Originally Posted by *rv8000*
> 
> Question for both 290x and 290 owners. Have any of you experienced coil whine on your cards, if so what brand?


My Sapphire 290X has coil whine; how can people complain about the fan noise? The coil whine makes more noise than the freakin' fan... I'm so mad... all the AMD cards I purchase have coil whine. Gonna watercool it so I can hear the coil whine perfectly.


----------



## kantxcape

Quote:


> Originally Posted by *armartins*
> 
> Coil whine rises with FPS... those benches that run 1000+ fps... or furmark in low res will always have very strong coil whine...


I have my card locked to 119 fps, and the coil whine is annoying as hell. The performance is great; I need to see if the problem gets less annoying, but my system is watercooled, so when I put the block on the 290X I'm gonna hear it even more, and when I crossfire it, it's gonna be insane! ZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZZ mega noise attack!


----------



## Paul17041993

Quote:


> Originally Posted by *NapalmV5*
> 
> anyone know anyone have any info any rumors on how soon to expect 8gb 290/290x ?
> 
> that would seal the deal for me to go 4k


Yeah, pretty sure there isn't any game with poor enough optimization that it would have to fill all 4GB with unnecessary textures; 8GB would have virtually no use and would most likely only worsen performance...


----------



## Ukkooh

Quote:


> Originally Posted by *kantxcape*
> 
> My Sapphire 290x does coil whine, how can people complain about the fan noise? The coil whine does more noise than the freakin fan... im so mad... all the AMD cards i purchase have coil whine. Gonna watercool it so i can hear the coil whine perfectly.


FYI you can RMA it for the coil whine. I wouldn't settle for anything less than no coil whine at all. Or you could get tinnitus like me and never be able to hear coil whine at all.


----------



## Sazz

I only get coil whine when I overclock. At stock clocks, even with the board power limit increased to +10%, it doesn't have any coil whine.

But when I do my max OC runs, that's when I hear the whine.


----------



## kantxcape

Quote:


> Originally Posted by *Ukkooh*
> 
> FYI you can RMA it for the coil whine. I wouldn't settle for anything less than no coil whine at all. Or you could get tinnitus like me and never be able to hear coil whine at all.


Quote:


> Originally Posted by *Sazz*
> 
> I only get coil whine when I overclock. at stock clocks even at board power limit increased to +10% it doesn't have any coil whine.
> 
> but when I do my max OC runs thats when I hear the whine.


Yeah, I guess that's what I need to do, but it sucks major donkey b.

I don't even have my 290X OCed; my 7970s suffered from the same problem. Gonna RMA the 290X once I get the 7970s back from RMA. Which means I can't waterblock my 290X. Bah.


----------



## NapalmV5

Quote:


> Originally Posted by *Paul17041993*
> 
> yea pretty sure there isn't any game with poor enough optimization that it would have to fill all 4GB up with unnecessary textures, 8GB would have virtually no use and would most likely only worsen the performance...


lol, if you knew what you were talking about you wouldn't say that ^

I'm not asking whether 8GB is necessary or not - I know how much VRAM I need, I know how much VRAM games use, the whole nine yards.

Maybe this was the wrong thread, maybe the wrong forum, to ask.


----------



## ZombieJon

Quote:


> Originally Posted by *Forceman*
> 
> Forgot to check. Hang on.
> 
> At the wall it was 75 idle, peaked at 375 load (1100/1300, default voltage). HWInfo says VRMs peaked at 63C with the fan at 75%. Card feels much cooler than the 290X was, although I don't think I ran the fan as high on the 290X so that's surely why. Anyone know how to check the VID?
> 
> Edit: GPU-Z says 1.18V under load. 69.6% ASIC.


Thanks. Good to know.

I think an 850 should be able to run them in CF as long as the voltage doesn't get increased.


----------



## SonDa5

My Sapphire 290X has coil whine when benching, but hopefully it will go away after I burn them in.

New Entry
SonDa5 - 4770k @4.8GHZ - AMD R9 290x - 12,132 - 11/07/13

http://www.3dmark.com/3dm/1573423?


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> http://www.3dmark.com/fs/1069571
> 
> Check this out. Look at the graphics score. This is on the stock cooler.


ln2

http://www.3dmark.com/fs/948203

1400 core


----------



## SonDa5

Quote:


> Originally Posted by *rdr09*
> 
> ln2
> 
> http://www.3dmark.com/fs/948203
> 
> 1400 core


Dude, that is not a 290X... it's 3 780s in SLI, and though they are clocked fast, they don't have the power that a 290X has. Great score though, but a 290X is MOAR powerful.


----------



## rdr09

Quote:


> Originally Posted by *SonDa5*
> 
> Dude that is not a 290x.... It's 3 780s in SLI and though they are clocked fast they don't have the power that a 290x has. Great score though but a 290x is MOAR powerful.


Compare the graphics scores; one is on the stock cooler, the other on LN2. Looks like Hawaii is scaling as well as the Tahitis.


----------



## Jpmboy

Okay guys, my Sapphire 290X should arrive tomorrow; the EK waterblock arrived already with the needed QDCs. I'm gonna pull the SLI Titans from ParkBench and see what this puppy can do. Yeah, I run both green and red side by side.

But, for benchmarking, some info would be very helpful:
Best BIOS?
Tools or softmods to up the mV?

I'm really looking forward to working with this 290X card. I've been running CFX 7970s since the week they were launched and they're still going strong (and still good enough for gaming at 1600p!).


----------



## SpewBoy

Quote:


> Originally Posted by *Ukkooh*
> 
> FYI you can RMA it for the coil whine. I wouldn't settle for anything less than no coil whine at all. Or you could get tinnitus like me and never be able to hear coil whine at all.


W-wait... if you have tinnitus and can't hear coil whine at all... how can you only settle for no coil whine when you can't hear it in the first place? Do you invite your neighbours over and say "hey, does anyone hear that screeching noise?" and observe their reaction?

I have a feeling my coil whine has disappeared. I guess I've burned the card in. Burned is the correct word: my top card likes to sit at 70°C idle (but with the right amount of tweaking, the cooler can handle it reasonably fine under full load with no throttling).


----------



## Newbie2009

I wonder whether we'll see a little price drop with the 780 Ti launched today.

Don't think I can hold off much longer on buying new cards.


----------



## Rar4f

Will a good 600W be enough for an R9 290 with an overclock, plus a 4770K that's also OCed?


----------



## M125

Quote:


> Originally Posted by *Arizonian*
> 
> Well gentlemen - since I'm not water cooling, not willing to upgrade my 1 yr old AX850 PSU until next rig build, I've sold my reference 290X. Sold it to my cousins work friend who wanted one so bad offered me what I paid for it without the BF4 code this afternoon. It's so hard to obtain he was ecstatic to hand me the money.
> 
> I got to try this beast and now waiting for non-reference coolers and see what they bring to the table for someone in my position. Originally I intended to keep it until non-reference cards came out but his offer was only good until he could score one somewhere else. Getting to keep BF4 was the icing on the cake as I made out ahead of what I paid theoretically.
> 
> Slapped my old 680 in main rig (since my 690 is already in the second 3D Vision rig) and going back to waiting. Happy / sad day I guess.


I'm in nearly the same boat as you. The R9 290X/R9 290 with its reference cooling is not suited for small-form-factor ITX systems unless you have the ability to watercool. The noise is either far too great to be acceptable, or the performance suffers due to thermal downclocking.

I sold mine to a friend/coworker who, in his words, "doesn't care about noise." He has it set to a static 75% fan speed that he sets up when he games. He does nothing besides game on it, so it's not like it's loud all the time. It's in a Thermaltake Level 10 paired with an FX-8150; maybe that dulls the noise somewhat?


----------



## SpewBoy

Quote:


> Originally Posted by *Rar4f*
> 
> Will a good 600W be enuff for r9 290 with overclock, and 4770k and also that oced?


That would be pushing it. I think the recommended wattage is 750W for the 290X, which is probably a bit of an overestimate to be on the safe side, but still, 600W is cutting it rather close with OCs in mind.


----------



## rdr09

Quote:


> Originally Posted by *Newbie2009*
> 
> I wonder will we see a little price drop with the 780ti launched today.
> 
> Don't think I can hold off much longer on buying new cards.


pretty sure it will. $600 would not be bad for that 3GB.


----------



## brazilianloser

Yeah, that was way back a few pages, but thanks for the answers... I have an AX860 as well. I did find a review of 2-way Crossfire power consumption: under full stress they only managed to pull 603W out of their system. If that proves to be true, a "non-problematic" AX860 should handle it just fine; now, if you have a lot of extras, maybe not.


----------



## Rar4f

Quote:


> Originally Posted by *SpewBoy*
> 
> That would be pushing it. I think the recommended wattage is 750 for the 290X, which is probably a bit of an over-estimate to be on the safe side but still, 600 is cutting it rather close with OCs in mind.


I guess no R9 290 for me then ;p Already stuck with 600W.


----------



## brazilianloser

Now, I am sure some have tried, since I remember seeing such a question a few pages back... but have any of you guys tried to flash the 290X BIOS onto the 290 yet?


----------



## fleetfeather

The 290 non-X is still the strongest choice IMO... I wonder if the 780 Ti release will cause an influx of AIB coolers to be released...


----------



## SpewBoy

Quote:


> Originally Posted by *Rar4f*
> 
> I guess no R9 290 for me then ;p Already stuck with 600W.


If it's a quality PSU, you'd probably be fine. I'm really not a PSU guru, but I'm running an AX860 with two overclocked 290Xs (not overvolted) and a stock 4770K, and apart from a string of odd booting behaviour (GPUs/HDDs going missing for 2 boot cycles this morning), I've had no symptoms of PSU trouble. I'll be able to provide more accurate feedback once I've had all components under water and overvolted, but at the moment it seems two 290Xs at stock volts with a 4770K pull less than 860W (though my PSU can apparently handle more than 860W anyway, so who knows). Take off 300W for one 290X and you're left with 560W, so yeah, a 600W PSU may do the job, but it may be tight.
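The back-of-the-envelope check above can be written out like this (all three figures are ballpark assumptions from the thread, not measurements):

```python
# Rough single-card PSU headroom check (all figures are ballpark assumptions).
psu_rating = 600.0      # W, the PSU being asked about
gpu_draw = 300.0        # W, rough draw for one 290/290X at stock volts
rest_of_system = 150.0  # W, 4770K plus board, drives, fans (assumed)

total_draw = gpu_draw + rest_of_system
headroom = psu_rating - total_draw

print(f"Estimated draw {total_draw:.0f} W, headroom {headroom:.0f} W on a {psu_rating:.0f} W unit")
```

About 150 W of slack at stock, which is why an overclock on both the card and the CPU makes 600 W feel tight.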


----------



## SonDa5

Quote:


> Originally Posted by *Rar4f*
> 
> I guess no R9 290 for me then ;p Already stuck with 600W.


May work out. I'm running single 290x and 4770k with a Seasonic 660W and so far no problems.


----------



## alancsalt

Quote:


> Originally Posted by *brazilianloser*
> 
> Yeah way back a few pages but thanks for the answers... I have an ax860 as well... I did find a review for crossfire 2 way Crossfire Power Consumption In full stress they only managed to pull 603w out of their system... if that proves to be true a "non problematic" ax860 should handle it just fine... now if you have a lot of extras maybe not.


Yeah, with no overclocking, which is fine if you are not going to.


----------



## brazilianloser

Quote:


> Originally Posted by *alancsalt*
> 
> Yeah, with no overclocking, which is fine if you are not going to.


I am going to, but only at the beginning of next year when I get the second one and put them on water. In the meantime, a single one OCed will be fine under the 860. It's just lame because I bought the set of individually braided wires, and the AX1200 is kind of expensive. Might finally bail on Corsair for some other fully modular option.


----------



## jtjoetan

Quote:


> Originally Posted by *Moustache*
> 
> lol i knew it. i'm from msia as well. if you live in kl, you could try lowyat or compuzone. recently, compuzone priced sapphire 290 at rm1599 which is quite expensive. btw, their asus with bf4 290x priced at rm2399. that's pretty expensive as well. unfortunately, i don't live in kl but i use compuzones' prices as reference, check their website. you can also try newegg m'sia but they're currently closed. i would assume reference 290 is at rm1349 (without shipping price) if you decide to order via them. it could get cheaper depending on the exchange rate that they use. right now, their exchange rate is at $1=rm3.35. imo, it's better for you to buy from your local store instead of dealing with too many things when you're ordering.


Oh, hi! Nice to meet you. Ideal Tech priced the Sapphire 290 at RM1.7k and the Sapphire R9 290X BF4 at RM2.3k... yeah, I prefer to buy from a local store as well, warranty is easy to settle. Now what's holding me back from buying the R9 290 is the fan.. the freaking fan! Aftermarket coolers such as the Arctic one can be found online (Lowyat forum). I heard the R9 290 performs better than the R9 290X, is that true?

And also, I feel like building my rig (buying the parts and assembling it myself), but I'm afraid I'll screw something up.. or should I just ask other people to help me assemble the parts? Because if I build it myself, I'll have that "I built this rig" feeling and be proud of it.. something like that.. hahaha


----------



## Rar4f

I think the PSU (www.newegg.com/Product/Product.aspx?Item=N82E16817580004&Tpk=newton r3 600w) should handle it without overclocking, but... if I overclock the 290 and the processor by a significant amount, I am pretty sure that even a good 600W will not be enough.


----------



## rdr09

Quote:


> Originally Posted by *jtjoetan*
> 
> Oh, hi! Nice to meet you. Ideal Tech priced the Sapphire 290 at RM1.7k and the Sapphire R9 290X BF4 at RM2.3k... yeah, I prefer to buy from a local store as well, warranty is easy to settle. Now what's holding me back from buying the R9 290 is the fan.. the freaking fan! Aftermarket coolers such as the Arctic one can be found online (Lowyat forum). I heard the *R9 290 performs better than the R9 290X*, is that true?
> 
> And also, I feel like building my rig (buying the parts and assembling it myself), but I'm afraid I'll screw something up.. or should I just ask other people to help me assemble the parts? Because if I build it myself, I'll have that "I built this rig" feeling and be proud of it.. something like that.. hahaha


Get an Xbox.


----------



## MojoW

I'm thinking of going R9 290 crossfire.
How does it compare to quad 6990?


----------



## battleaxe

Hey are there any Newegg 250 and $25 promo codes still going? Or any others?


----------



## Kriant

Quote:


> Originally Posted by *battleaxe*
> 
> Hey are there any Newegg 250 and $25 promo codes still going? Or any others?


5% off promo


----------



## Rar4f

Quote:


> Originally Posted by *jtjoetan*
> 
> Oh, hi! Nice to meet you. Ideal Tech priced the Sapphire 290 at RM1.7k and the Sapphire R9 290X BF4 at RM2.3k... yeah, I prefer to buy from a local store as well, warranty is easy to settle. Now what's holding me back from buying the R9 290 is the fan.. the freaking fan! Aftermarket coolers such as the Arctic one can be found online (Lowyat forum). I heard the R9 290 performs better than the R9 290X, is that true?
> 
> And also, I feel like building my rig (buying the parts and assembling it myself), but I'm afraid I'll screw something up.. or should I just ask other people to help me assemble the parts? Because if I build it myself, I'll have that "I built this rig" feeling and be proud of it.. something like that.. hahaha


Life is a challenge; instead of being afraid that you may screw up, prepare yourself not to.







I also have fears, since I will soon build my very first computer, but I try to read up as much as I can so that I don't make any mistakes.
If you want questions answered, maybe I can help you out with what I have read about building a computer.

Just PM me


----------



## jtjoetan

Quote:


> Originally Posted by *rdr09*
> 
> get a XBox.


Huh? What do you mean?


----------



## tsm106

Quote:


> Originally Posted by *brazilianloser*
> 
> Yeah way back a few pages but thanks for the answers... I have an ax860 as well... I did find a review for crossfire 2 way Crossfire Power Consumption In full stress they only managed to pull 603w out of their system... if that proves to be true a "non problematic" ax860 should handle it just fine... now if you have a lot of extras maybe not.


Word of advice: never trust consumption numbers measured at the wall from Guru3D. They are always extremely low; they would fit right in with the guys from our PSU forum, lol. And using FurMark to measure draw is silly too, since it has been a throttled app for years now. You need a real-world workload and max fan on the stock cooler. For example, look below. Consider also that the number I got below is without accessories, loop, drives, fans, pumps, etc. Then imagine when it is run on the uber BIOS or an unlocked BIOS and overclocked with anger!

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> Two cards on the PSU with my CPU and nothing else connected, i.e. no drives, no loop, nothing else. Running 3DM11 Extreme at stock with +50 on air, so it's throttling a bit as expected. 903W at the wall x 0.89 efficiency at that wattage = *803W.* This number should blow up with blocks and serious overclocking. You need to remember that on the stock BIOS, PowerTune is trying to keep the TDP of the cards very, very low.
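
For anyone repeating that math at home, the wall-to-DC conversion is just measured wall draw times PSU efficiency at that load. A minimal sketch; the 903W and 0.89 figures are tsm106's, and your own PSU's efficiency at a given load will differ:

```python
def dc_load_watts(wall_watts: float, efficiency: float) -> float:
    """Estimate the DC power the PSU actually delivers to the components,
    given draw measured at the wall and PSU efficiency at that load."""
    return wall_watts * efficiency

# tsm106's numbers: 903W at the wall, ~89% efficient at that load
print(f"{dc_load_watts(903, 0.89):.0f}W delivered")  # roughly the 803W figure quoted
```

The same relation run in reverse (DC load divided by efficiency) tells you what a given component load will pull from the wall, which is the number that matters when sizing against a PSU's rating.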


----------



## jtjoetan

Quote:


> Originally Posted by *Rar4f*
> 
> Live is a challenge, instead of being afraid that you may screw up, prepare yourself to not
> 
> 
> 
> 
> 
> 
> 
> I also have fears as i will soon build my very first computer but i try to read up as much as i can so that i don't make any mistake.
> If you want questions answered, maybe i can help you out with what i have read up on in context to building a computer.
> 
> Just PM me


Yeah! Thanks for the encouragement! I'll PM you if I need help, thanks for the offer! What are your desired specs? I've seen that you're quite worried about the reference cooler for the R9 290 as well.. guess we're in the same boat.. :X What if we have to wait until early next year to get the 3rd-party coolers? I'm still unsure whether to pull the trigger or not, as I've only 2 months of school holidays, which I'm going to put to full use


----------



## Rar4f

Quote:


> Originally Posted by *jtjoetan*
> 
> Yeah! Thanks for the encouragement! I'll PM you if I need help, thanks for the offer! What are your desired specs? I've seen that you're quite worried about the reference cooler for the R9 290 as well.. guess we're in the same boat.. :X What if we have to wait until early next year to get the 3rd-party coolers? I'm still unsure whether to pull the trigger or not, as I've only 2 months of school holidays, which I'm going to put to full use


If you won't use it now, you can next holiday







Also, someone in this thread said that R9 290X cards with 3rd-party coolers should be here by the end of November, and that R9 290 3rd-party coolers should follow a few weeks after.
But I am crossing my fingers that both the R9 290 and the X version with aftermarket coolers will be ready by the end of November.

I will pm you my desired specs


----------



## ivanlabrie

Quote:


> Originally Posted by *SpewBoy*
> 
> If it's a quality PSU you'd probably be fine. I'm really not a PSU guru and I'm running an AX860 with two overclocked 290Xs (not overvolted) and a stock 4770K and apart from a string of odd booting behaviour (GPUs / HDDs going missing for 2 boot cycles this morning), I've had no symptoms of PSU trouble. I'll be able to provide more accurate feedback once I've had all components under water and overvolted but at the moment it seems two 290Xs at stock volts with a 4770K pull less than 860W (though my PSU can apparently handle more than 860W anyway so who knows). Take off 300W for one 290X and you're left with 560W so yeah, a 600W PSU may do the job but it may be tight.


Overclock that CPU already; Crossfire with a stock CPU is a terrible idea.
Quote:


> Originally Posted by *MojoW*
> 
> I'm thinking of going R9 290 crossfire.
> How does it compare to quad 6990?


I think a single 290x oc'd would be on par with quadfired 6990s...


----------



## Nevk

Just ordered my two MSI R9 290Xs... hoping my AX850 can handle it


----------



## outofmyheadyo

I'm pretty sure your AX850 will fail pretty hard, since my Seasonic X850 failed hard on two 580s in SLI @ 1000 core each


----------



## Sgt Bilko

I have a Steam key for Batman: Arkham City here that I need to get rid of... any takers?

The only catch is you have to test your 290/X and post results in the GK110 vs Hawaii thread.


----------



## BonzaiTree

I'm so sorely tempted to sell my 670 and waterblock and buy a 290. So...very...tempting!!!

I wish I had enough rep to use the marketplace on OCN--if I did I would definitely be selling it.


----------



## Arizonian

Quote:


> Originally Posted by *Forceman*
> 
> Choices, choices.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Put the 290 in and it immediately ran through a Metro bench at 1100/1300, so I think I'll just keep it. At that speed it was exactly even with the 290X I had before. I think the fan on this card is more annoying than the 290X fan though; it has a higher, whinier pitch to it, and it seems to be cycling a lot at idle. Not too concerned since it'll be under water by this weekend (which reminds me, I forgot to take a pic of the block), but annoying in the interim. 75% fan kept the GPU temps under 75C though, but man it was loud. I had the target temp set to 95C, but it ramped the fans all the way up to 75% even though the temp never got that high; not sure if that is a bug or what.
> 
> Edit: Oh, and the void-if-removed stickers were no big deal. They weren't really attached very well, since they were on the screws, and I was easily able to just lift them off with a razor blade. Parked them on some pins sticking up so I could put them back later, and off we go. The only tricky part was when I accidentally put one on the smooth PCB temporarily; that one stuck pretty well, but even then the razor blade lifted it off without any issue.


Congrats on one of each. Added both









Let me know if you return one or the other. Since you got'em, keep'em both


----------



## GioV

Can anyone help me overclock with Asus GPU Tweak? I have experience overclocking, but my card just keeps crashing. I flashed my card with the Asus BIOS and am currently running a full custom loop that keeps the card around 50C. While trying to find a stable clock I kept getting BSODs and Valley crashes at just 1210 core and stock memory with maxed-out voltage.

Specs since I haven't included in OC.net profile:
i5 3570k at 4.7ghz
r9 290x
8GB 2133 ram
z77 Mpower
700W PSU


----------



## Deisun

Damn was hoping my R9 290 would come tomorrow for an epic weekend of BF4 gaming, but the shipping says Monday. SO SAD


----------



## Sherp

My R9 290 comes tomorrow, gonna stick a Kühler 620 on it.


----------



## outofmyheadyo

Do the 290/X cards all support AIO CPU coolers without any mods? Thinking it's a really nice and cheap solution if you have an AIO just collecting dust; just add some RAM/VRM heatsinks and you should be good to go.


----------



## battleaxe

Quote:


> Originally Posted by *Kriant*
> 
> 5% off promo


Where can I find it?


----------



## Arizonian

For those who were asking if frame times were being addressed.

*Source - [H]ardOCP*
Quote:


> However, in every other game we tested the gameplay experience was exactly the same between the GTX 780 Ti and the R9 290X at 2560x1600 resolution. *There were some small framerate performance differences, often with the GTX 780 Ti coming out on top. However, we are talking about differences at most of 5%. With a difference that small, it is not noticeable in-game and does not allow one card to offer a better visual quality experience over the other.*


Frame-time issues are a thing of the past with the 290X & 290.


----------



## jerrolds

Quote:


> Originally Posted by *outofmyheadyo*
> 
> do the 290/x cards all support the AIO CPU coolers without any mods ? Thinking its a really nice and cheap solution if u have any AIO just collecting dust, just add some ram/vrm heatsinks and you should be good to go.


Look at the "Red Mod" thread in the ATI Cooling forums; the cheapest way is to zip-tie everything. Not the best way, but the cheapest. There are also 3rd-party brackets that let you use AIOs like the 620 or H80; look for Sigma Cool MK2 brackets. I think they confirmed it should work on the 290(X): http://keplerdynamics.com/


----------



## Sherp

Quote:


> Originally Posted by *outofmyheadyo*
> 
> do the 290/x cards all support the AIO CPU coolers without any mods ? Thinking its a really nice and cheap solution if u have any AIO just collecting dust, just add some ram/vrm heatsinks and you should be good to go.


I ordered a bracket off Sigma. AirJarhead said the 7950/7 bracket fits his 290X.


----------



## staryoshi

I contacted Arctic Cooling with regard to 290/290X compatibility of the AC Hybrid. I'll report back when I hear something from them. I much prefer the AC Hybrid to "The Mod" for reasons of aesthetics, warranty, and design.


----------



## $ilent

any news on msi afterburner update?


----------



## jerrolds

Quote:


> Originally Posted by *staryoshi*
> 
> I contacted Arctic Cooling with regard to 290/290X compatibility of the AC Hybrid. I'll report back when I hear something from them. I much prefer the AC Hybrid to "The Mod" for reasons of aesthetics, warranty, and design.


It is indeed compatible - confirmed from multiple sources. Here's my conversation with one of their staff:

http://www.overclock.net/t/1436954/accelero-hybrid-extreme-iii-may-work-on-r9-290x


----------



## Sgt Bilko

Quote:


> Originally Posted by *staryoshi*
> 
> I contacted Arctic Cooling with regard to 290/290X compatibility of the AC Hybrid. I'll report back when I hear something from them. I much prefer the AC Hybrid to "The Mod" for reasons of aesthetics, warranty, and design.


Hybrid is compatible, along with the Accelero Xtreme III. I emailed Arctic just before i got my 290x









There is someone here that is currently using the Hybrid right now, i'll try and find the link.

EDIT: found it: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/2590#post_21120825


----------



## staryoshi

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Hybrid is compatible, along with the Accelero Xtreme III. I emailed Arctic just before i got my 290x
> 
> 
> 
> 
> 
> 
> 
> 
> 
> There is someone here that is currently using the Hybrid right now, i'll try and find the link.
> 
> EDIT: found it: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/2590#post_21120825


There are two variants of the AC Hybrid, one specific to the HD7970 and one for multiple GPUs. Neither link provides conclusive evidence that one of these models will work with the R9 290 series and I'm not familiar with its mounting layout. I'll report back once I have word from the horse's mouth.


----------



## jerrolds

Use the NON-7970 version, as that one has a shim, which is not needed on the 290 cards


----------



## iPDrop

1. 
2. ASUS
3. Stock


----------



## Arizonian

Quote:


> Originally Posted by *iPDrop*
> 
> 1.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 2. ASUS
> 3. Stock
> 
> 
> Spoiler: Warning: Spoiler!


You're in. Congrats


----------



## ImJJames

I can't believe the 780 Ti is priced at $700. From looking at benchmarks it's not worth the extra $350 compared to the R9 290; the 780 Ti has about a 13% performance increase at 1080p and is pretty much equal at 1440p+... I think Nvidia needs a new marketing team.


----------



## Sherp

Quote:


> Originally Posted by *staryoshi*
> 
> I contacted Arctic Cooling with regard to 290/290X compatibility of the AC Hybrid. I'll report back when I hear something from them. I much prefer the AC Hybrid to "The Mod" for reasons of aesthetics, warranty, and design.


The AC Hybrid will fit on a 7950 and 7970, so it will most likely fit on a 290.

I would've gone for that but it's double the price. Plus I'm not a huge fan of the shroud design.


----------



## Tobiman

Can anyone confirm that the HIS R9 280X cooler fits the R9 290? TechSpot did this and the results are great, but is anyone able to confirm it somehow? I'm thinking that is the route I'll be taking with my R9 290/X... haven't decided which one I'll be getting.


----------



## Rar4f

Quote:


> Originally Posted by *ImJJames*
> 
> I can't believe the 780 Ti is priced at $700. From looking at benchmarks it's not worth the extra $350; the 780 Ti has about a 13% performance increase at 1080p and is pretty much equal at 1440p+... I think Nvidia needs a new marketing team.


Tell me about it. I try to have an open mind and not pick a side, but be objective... but with the prices Nvidia is charging, where is the bloody point? For crying out loud... the GTX 770's initial price was not far from a reference R9 290 in my region.
-_-


----------



## the9quad

Has anyone crossfired a 290 and a 290X together yet? Interested in seeing the results, as I am thinking of just saving the $100 and getting my second card as a 290 instead of another 290X. It would be interesting to see if the difference between a 290X/290X setup and a 290X/290 is negligible.


----------



## jerrolds

A 290 with an aftermarket cooler like the Xtreme/Icy2/Prolimatech MK2 will be the best price/performance by far, assuming they cool the VRMs adequately. I won't be able to know till next week









Fortunately the 290X VRMs don't run that hot to begin with.. about 65C under load with the stock cooler. Hopefully they're rated to run up to 125C or so like other VRMs, and have lots of headroom even with crappy little ramsinks with a fan blowing on them.


----------



## Kriant

Quote:


> Originally Posted by *ImJJames*
> 
> I can't believe 780ti is priced at $700, from looking at benchmarks its not worth extra $350 compared to r9 290, 780ti has about 13% performance increase in 1080p and is pretty much equal at 1440P+...I think Nvidia needs a new marketing team.


And that sealed my determination to open my 290s =). $700 for a 10-13% difference and only 3GB of VRAM? I'd rather get a 4th 290 later on and be "balling"


----------



## VSG

I just got a call from FrozenCPU that my EK blocks/backplates have arrived and will ship today. So instead of end of next week, I will have these Monday/Tuesday. So I should FINALLY have my build completed by the next weekend. Hopefully MSI Afterburner also gets an update for the 290x by then.


----------



## Kriant

Quote:


> Originally Posted by *geggeg*
> 
> I just got a call from FrozenCPU that my EK blocks/backplates have arrived and will ship today. So instead of end of next week, I will have these Monday/Tuesday. So I should FINALLY have my build completed by the next weekend. Hopefully MSI Afterburner also gets an update for the 290x by then.


That's great, hopefully they will ship my blocks as well!

So, any reason to get backplates aside from visual presentation?
I'm not sure how much of that "passive VRAM cooling" it actually provides


----------



## ImJJames

Quote:


> Originally Posted by *Kriant*
> 
> And that sealed my determination to open my 290s =). 700$ 10-13% difference and only 3gb of vram ? I rather get a 4th 290 later on and be "balling"


Want to sell one of your 7970's to me for cheap


----------



## Tobiman

Quote:


> Originally Posted by *Kriant*
> 
> That's great, hopefully they will ship my blocks as well!
> 
> So, any reason to get backplates aside from visual presentation?
> I'm not sure how much of that "passive VRAM cooling" it actually provides


In my experience, it actually makes the card retain more heat, especially if it's black. The upside though is that the card doesn't warp its PCB under the waterblock's weight.


----------



## Rar4f

Is it hard to replace the reference cooler with an aftermarket one, and is it worth it?


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Kriant*
> 
> That's great, hopefully they will ship my blocks as well!
> 
> So, any reason to get backplates aside from visual presentation?
> I'm not sure how much of that "passive VRAM cooling" it actually provides


It will also give your card some rigidity so that it doesn't sag with all that heavy cooling gear on it.


----------



## Forceman

Quote:


> Originally Posted by *Rar4f*
> 
> Is it hard to replace reference cooler with a aftermarket one, and is it worth it?


It's not that hard as long as you are careful, and it'll make a huge difference for these cards. Lower temps and much less noise.


----------



## Ricdeau

Quote:


> Originally Posted by *geggeg*
> 
> I just got a call from FrozenCPU that my EK blocks/backplates have arrived and will ship today. So instead of end of next week, I will have these Monday/Tuesday. So I should FINALLY have my build completed by the next weekend. Hopefully MSI Afterburner also gets an update for the 290x by then.


Hoping I also hear from them today or tomorrow on mine. I got two 290Xs on the way as well which should be here Saturday. I'm almost tempted to return them and go with two 290s. Haven't decided yet.


----------



## vettefan8

Quote:


> Originally Posted by *staryoshi*
> 
> I contacted Arctic Cooling with regard to 290/290X compatibility of the AC Hybrid. I'll report back when I hear something from them. I much prefer the AC Hybrid to "The Mod" for reasons of aesthetics, warranty, and design.


I emailed them a couple of weeks ago and received the following reply. It takes them a while to respond to emails, so I thought that I would save you some time.

Hi Brett,

It has been confirmed: existing ARCTIC Accelero Hybrid and Xtreme III will both support the new AMD R9 290X factory reference layout VGA! The only minor thing you may need would be additional VR/RAM heatsinks but that was the easy part.

We hope this answers your question. Thank you for choosing ARCTIC products!

Best Regards

Eric Abellada
Customer Service Team
ARCTIC Inc.
14783 Clark Avenue | City of Industry, CA 91745 | USA
T. (626) 961-5437
E. [email protected]


----------



## VSG

Quote:


> Originally Posted by *Kriant*
> 
> That's great, hopefully they will ship my blocks as well!
> 
> So, any reason to get backplates aside from visual presentation?
> I'm not sure how much of that "passive VRAM cooling" it actually provides


Aesthetics, passive VRM cooling (possibly noticeable with better thermal pads), rigidity to PCB and protection in case something falls from above (screw/fitting/coolant).


----------



## Forceman

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> It will also give your card some rigidity so that it doesn't sag with all that heavy cooling gear on it.


Can you put the backplate on after the block is on, or does it need to be done at the same time (or do you need to remove the block to put it on)? All the backplates were OOS when I got my block so I didn't bother, and I probably won't bother in the future if it means unmounting the block to install one.


----------



## VSG

You can put the backplate on anytime, the instructions even mention putting on the block first. It might be easier to put it on outside the case though.


----------



## Kriant

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> It will also give your card some rigidity so that it doesn't sag with all that heavy cooling gear on it.


I thought cooling gear, even with water, weighs less than the stock turbine cooler


----------



## brazilianloser

And finally my beast has arrived...







Too bad I have to go to work now and won't be able to test it out to my full desires until tomorrow, probably.


----------



## Arizonian

Quote:


> Originally Posted by *brazilianloser*
> 
> And finally my beast has arrived...
> 
> 
> 
> 
> 
> 
> 
> too bad I have to go to work now and wont be able to test it out to my full desires until tomorrow probably.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Forceman

Quote:


> Originally Posted by *geggeg*
> 
> You can put the backplate on anytime, the instructions even mention putting on the block first. It might be easier to put it on outside the case though.


Excellent, thanks.


----------



## $ilent

so do we know if msi afterburner is updated yet?


----------



## jerrolds

The guru3d guys will know first...they should have a forum/thread dedicated to AB news


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Forceman*
> 
> Can you put the backplate on after the block is on, or does it need to be done at the same time (or do you need to remove the block to put it on)? All the backplates were OOS when I got my block so I didn't bother, and I probably won't bother in the future if it means unmounting the block to install one.


I'm not 100% sure, but most videos I've seen show the backplate screwed through the PCB and into the waterblock. So after you put the waterblock on, you can probably take the screws back out and, without removing the waterblock, put the backplate on and screw everything in. I think this is what I'm going to do to test out temps with no backplate, a backplate with stock pads, and a backplate with more expensive pads. I'm not sure which ones to get, though.

Quote:


> Originally Posted by *Kriant*
> 
> I thought cooling gear, even with water, weighs less than the stock turbine cooler


Not 100% sure, but I don't think it would hurt. Plus, on the off chance that something above the GPU spills, you won't fry your card right off the bat.

Edit: does anyone have any experience with fujipoly thermal pads? How are the 11 and the 17 W/mk pads compared to the stock ones provided by EK?


----------



## RYCRAI

Quote:


> Originally Posted by *MIGhunter*
> 
> I don't have any on my Sapphire card but after I installed it, it had a clicking that sounded like a fan blade hitting the housing. I shutdown my pc, took the side off and looked at it, nothing. It hasn't done it since so I don't know what it was.


This with an r9 290x or 290? 

I have this same problem... sounds like a clicking noise; I tried shaking the card and reseating it... Weird. I kind of tuned it out while playing games, but when I alt-tab out of games it stops immediately.


----------



## sugarhell

Hmm not impressed with 780ti. 290 is almost the same performance.

http://forums.overclockers.co.uk/showthread.php?t=18555564

A big no for 700 bucks


----------



## Arm3nian

Quote:


> Originally Posted by *sugarhell*
> 
> Hmm not impressed with 780ti. 290 is almost the same performance.
> 
> http://forums.overclockers.co.uk/showthread.php?t=18555564
> 
> A big no for 700 bucks


I know it's 8 Pack, but there was no mention of possible throttling. 1220MHz seems a bit high on air with the stock cooler, doesn't it?


----------



## Sgt Bilko

Quote:


> Originally Posted by *sugarhell*
> 
> Hmm not impressed with 780ti. 290 is almost the same performance.
> 
> http://forums.overclockers.co.uk/showthread.php?t=18555564
> 
> A big no for 700 bucks


Wow... $700 USD, I take it?

That's a pretty big price for something that's 290X-level performance...


----------



## the9quad

Well, you always pay a premium for the fastest available, and it looks like the Ti will definitely be that. Looks like the Ti is a nice card; I am not a hater. In reality it is $150 more than a 290X, and the 290X is $150 more than a 290, so if you're going $-per-performance the 290 wins, and if you're going pure performance the Ti wins, leaving the 290X somewhere in between. Still happy with my 290X, so don't get me wrong.
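
That dollars-per-performance framing can be sketched in a few lines. The prices follow the $150 increments above; the relative-performance figures are rough illustrative assumptions, not benchmark results:

```python
# (price in USD, relative performance vs. a stock 290 = 1.00) -- illustrative assumptions
cards = {
    "R9 290":     (400, 1.00),
    "R9 290X":    (550, 1.05),
    "GTX 780 Ti": (700, 1.12),
}

for name, (price, perf) in cards.items():
    # dollars paid per unit of relative performance: lower means better value
    print(f"{name}: ${price / perf:.0f} per performance unit")
```

With numbers anywhere near these, the 290 wins value, the Ti wins raw speed, and the 290X sits in between, which is exactly the split described above.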


----------



## $ilent

Me either.


----------



## jerrolds

The 780 Ti is impressive, imo: it overclocks well, runs quieter, is less power-hungry... it's just these prices Nvidia pushes... it's almost insulting


----------



## brazilianloser

Well, got one black screen so far... and I'm getting a missing PhysXLoader.dll error when trying to open 3DMark... Seems like I got some corrupted files, which means I might have to do a clean install :/


----------



## Blackops_2

Quote:


> Originally Posted by *sugarhell*
> 
> Hmm not impressed with 780ti. 290 is almost the same performance.
> 
> http://forums.overclockers.co.uk/showthread.php?t=18555564
> 
> A big no for 700 bucks


If anything, I keep becoming more impressed with Hawaii; given that we should see some gains from driver improvements and better cooling, this card has nowhere to go but up. Unless the AIB manufacturers butcher them like XFX did the 7970s.


----------



## black7hought

I received my 290 today.










Spoiler: Warning: Spoiler!


----------



## Arizonian

Quote:


> Originally Posted by *RYCRAI*
> 
> This with an r9 290x or 290?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I have this same problem ... Sounds like a clicking noise, tried shaking the card, reseating it.... Weird, I kind of tuned it out after playing games, but when I alt tab out of games it stops immediately.


Congrats - Added









Quote:


> Originally Posted by *black7hought*
> 
> I received my 290 today.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - Ditto


----------



## Rar4f

What did Hardocp use to test how much power r9 290 used? Furmark or games?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Blackops_2*
> 
> If anything, I keep becoming more impressed with Hawaii; given that we should see some gains from driver improvements and better cooling, this card has nowhere to go but up. Unless the AIB manufacturers butcher them like XFX did the 7970s.


I agree; at least the 290X will get better drivers and cooling solutions.

The Ti is faster, yes, but I'm not trading in my 290X for it... loving my card, now I just need to justify Crossfire


----------



## Arizonian

Quote:


> Originally Posted by *the9quad*
> 
> well you always pay a premium to have thee fastest available, and that looks like it will definitely be that. Looks like the TI is a nice card. I am not a hater. IN all reality it is $150 dollars more than a 290x, and the 290x is $150 more than a 290 so if your just gonna go $ per performance the 290 wins and if your going pure performance the Ti wins, leaving the 290x somewhere in between. Still happy with my 290x so dont get me wrong.


Since I'm not water cooling, I'm waiting on a non-reference 290X to solve my situation, so it's hard not to pull the trigger on the Ti. However, I still see the 290X passing the 780 Ti after Mantle in all Mantle-supported games, and G-Sync is not impressive for single-GPU users, so I'm going to hope it goes out of stock soon and I must resist.









Not to even mention the extra 1GB VRAM.


----------



## Slomo4shO

The EVGA GeForce GTX 780 Ti Classifieds are coming on December 6, and the Superclock ACX is expected to release Nov 29! I wonder when AMD will have their non-reference designs on the market...


----------



## NeMoD

Hey guys, I received my ASUS 290 about 2 hours ago and I'm having some problems.

I did the following:
-Ran the 13.11 beta8 drivers and chose uninstall
-Removed my 5850 and put in my 290
-Powered up and installed 13.11 beta8

The problem is that I'm getting worse frames than with my 5850. I only get 12 FPS in Valley and can't even break 120 in CS:GO. Any ideas? I've tried reinstalling the drivers too.

Processor is a 2500k at 4.3. Power supply is a seasonic 620w.


----------



## Yvese

Did anyone read about the 290 being fitted with an HIS 280X IceQ cooler and getting a 20C+ drop in temps? Here's the post (scroll all the way down).

That is very promising for upcoming non-ref coolers.


----------



## Arm3nian

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I agree, at least the 290x will get better drivers and cooling solutions.
> 
> Ti is faster yes but i'm not trading in my 290x for it....loving my card, now i just need to justify Crossfire


The Ti is faster in that bench because almost all the benches are Nvidia-favored; notice how in the bench where both AMD and Nvidia perform well, the 290X is on top... Secondly, the Ti has a much better stock cooler than the 290X. Third, the 700 series has had drivers for 9 months; the 290 series just came out.


----------



## Clukos

Quote:


> Originally Posted by *Arm3nian*
> 
> Ti is faster in that bench because almost all the benches are nvidia favored, notice how in the bench where both amd and nvidia perform well the 290x is on top... Secondly, the Ti has a much better stock cooler than the 290x. Third, 700 series has had drivers for 9 months, 290 series just came out.


Whatever, the difference when overclocked is minimal.

http://forums.overclockers.co.uk/showthread.php?t=18555564



Looks like 290 CF is the new thing to beat at under $1k price point.


----------



## Rar4f

I read a comment somewhere that people who bought the Radeon 7970 early later got Never Settle when the bundle was released. Is there any chance AMD would give Never Settle to people who bought the R9 290, once they update or re-release the bundle?


----------



## Paul17041993

Quote:


> Originally Posted by *NapalmV5*
> 
> lol, if you'd know what you're talking about you wouldn't say that ^
> 
> I'm not asking whether 8GB is necessary or not; I know how much VRAM I need, I know how much VRAM games use, the whole nine yards
> 
> maybe this was the wrong thread, maybe the wrong forum to ask


Please explain, son: you have a game running 4K textures at a high enough FPS to fill all 4GB...?


----------



## Kriant

Quote:


> Originally Posted by *Paul17041993*
> 
> please explain son, that you have a game running 4k textures at a high enough FPS to fill all 4GB...?


Max Payne 3
Hitman Absolution
Metro LL
Serious Sam 3
Crysis 3
Far Cry 3

IDK about 4K, but at 5760x1080 those games required more VRAM than my 7970s had.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Clukos*
> 
> Whatever, the difference when overclocked is minimal.


That's the point: the Ti already has its mature drivers and a better reference cooler.

The 290/290X can only get better from here on: drivers, coolers, etc.

And yeah, 290s in Crossfire will be the best bang for buck atm.


----------



## Tobiman

Quote:


> Originally Posted by *Yvese*
> 
> Did anyone read about the 290 being fitted with a 280X HIS IceQ cooler and getting a 20°C+ drop in temps? Here's the post (scroll all the way down).
> 
> That is very promising for upcoming non-ref coolers.


I'm trying to find anyone who has that cooler to tell me if it has a flat bottom.

EDIT: 290 with 280X cooler

Ploutonas, did I say I did it? Yes, I did, so that means I truly did it; we are not in the business of lying to our readers. Why do you think it is operating cooler than the 280X? It does nothing for the performance, or at least nothing much, since we didn't see any throttling. It could make a bit of a difference on the 290X, however.

I have attached some photos for those that require evidence.






EDIT: Comments from techspot reviewer

Some updated info guys. I took the IceQ X2 cooler off the HIS Radeon R9 280X and stuck it on our R9 290 sample. Cooling was dramatically improved. The FurMark stress test maxed out at 76 degrees while the card never exceeded 63 degrees in Crysis 3 and Battlefield 4. So it seems as expected the board partners will be able to solve the heat issues of the reference card.

Okay, not a problem. Just to clarify something from earlier: the Radeon R9 290X runs at 83 degrees with the AMD reference cooler (the same cooler as the R9 290). The HIS IceQ X2 gets the R9 280X down to 70 degrees in FurMark, so the R9 290 was 6 degrees warmer with the same IceQ X2 cooler.


----------



## esqueue

Quote:


> Originally Posted by *Yvese*
> 
> Did anyone read about the 290 being fitted with a 280X HIS IceQ cooler and getting a 20°C+ drop in temps? Here's the post (scroll all the way down).
> 
> That is very promising for upcoming non-ref coolers.


That is great news for those that don't want to watercool. Here are a few pics of the r9 290

Source: techspot.com comments


----------



## Paul17041993

Quote:


> Originally Posted by *Kriant*
> 
> Max Payne 3
> Hitman Absolution
> Metro LL
> Serious Sam 3
> Crysis 3
> Far Cry 3
> 
> IDK about 4K, but at 5760x1080 those games required more VRAM than my 7970s had.


Well, yeah, at that point you have an issue with only 3GB. I think those games might already use 4K or near-4K textures, or at least had enough loaded at any one time to fill it up...


----------



## skupples

Quote:


> Originally Posted by *NapalmV5*
> 
> Anyone know / have any info or rumors on how soon to expect an 8GB 290/290X?
> 
> That would seal the deal for me to go 4K.


Quote:


> Originally Posted by *NapalmV5*
> 
> No less than quad 8GB 290/290Xs should do just fine; that's the point. I wouldn't want to fully utilize all 8 gigs; gotta have some free VRAM for those nasty spikes.
> 
> Don't worry man, I know how to utilize VRAM; I mod just about every game I play. For me, 3GB is not enough at 1080p.


You likely won't see a consumer-grade Hawaii card with that kind of VRAM. You will likely have to wait for the commercial/professional-grade Hawaii GPUs to release if that's the kind of VRAM you are looking for.

For those saying 4GB is already too much: maybe on single monitors... Even 5760x1080/3240x1920 can consume over 4GB in these new games coming out. OccamRazor linked screenshots of COD: Ghosts sucking up 5GB at 3240x1920 portrait surround last night. Same goes for Batman: Origins, and BF4 is using well into 4 gigs in surround. So it seems these new games are being programmed to utilize as much of it as you have. We should have more information on whether it's actually increasing performance as time goes on.

Hell, even a lot of the new, not "next-gen" games suck down VRAM like crazy in multi-monitor.


----------



## Paul17041993

The thing to keep in mind is that you still have to render it; if there are too many large textures in one frame, it will take too long to render (of course the professional cards would have the 8GB, since they don't render in real time).

When you stack cards in Crossfire, though, the memory may be mirrored across each card, but there will be a significant reduction in what each card needs at any moment, since each card renders at a slower rate, which could be slow enough to free up an extra 1GB of utilization...


----------



## brazilianloser

Quote:


> Originally Posted by *NeMoD*
> 
> Hey guys, I received my ASUS 290 about 2 hours ago and I'm having some problems.
> 
> I did the following:
> -Ran the 13.11 beta8 drivers and chose uninstall
> -Removed my 5850 and put in my 290
> -Powered up and installed 13.11 beta8
> 
> The problem is that I'm getting worse frames than my 5850. I only get 12 frames in valley and can't even break 120 in CS:GO. Any ideas? I've tried reinstalling drivers too.
> 
> Processor is a 2500k at 4.3. Power supply is a seasonic 620w.


I personally had some black screens on mine earlier until I created a custom fan setting... For some reason my fan was stuck at 20% until the card hit 90°C in user mode. I made a fan curve and it was totally fine; it didn't even get past 86°C in FurMark. I would try that, and maybe get a Kill-A-Watt to see if you are pushing more than your PSU can handle, but I doubt that's it.
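For anyone who'd rather script it than eyeball it, a fan curve like the one described is just piecewise-linear interpolation from temperature to fan duty. A minimal sketch; the breakpoints below are illustrative guesses, not AMD's defaults, so set your real curve in Afterburner/Trixx:

```python
# Hypothetical fan curve: (temperature °C, fan duty %) breakpoints.
CURVE = [(40, 20), (60, 40), (75, 65), (85, 90), (95, 100)]

def fan_duty(temp_c):
    """Piecewise-linear interpolation over CURVE, clamped at both ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    # Walk adjacent breakpoint pairs and interpolate within the matching segment.
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
```

The aggressive mid-range points (60-75°C) are what avoid the stock behaviour of leaving the fan at 20% until the card is already at 90°C.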


----------



## maarten12100

Just ordered two R9 290s, AAAAAAAAAAAAAH YEAH!
The Gigabyte 290X was about the same price but had a wait time of 2 to 4 weeks, and I'd rather have my cards now. (By Dutch law I'm allowed to return them within 14 days, so if the 290Xs become available I might swap them for those.)


----------



## the9quad

Quote:


> Originally Posted by *brazilianloser*
> 
> I personally had some black screens on mine earlier until I created a fan settings... For some reason my fan was stuck on 20% until the card hit 90c on user mode... Made a fan curve and was totally fine didn't even got passed 86c on furmark. I would try that and maybe get a kilowatt thing to see if you are pushing more than you psu can handle, but doubt that.


That's exactly what I had to do with mine: create a custom fan profile. I also bumped up the vcore for some reason, but ever since I did that, no more black screens. Weird; before the custom profile it would wait until the last possible minute to spin up the fans, and by then it was too late and, voila, black screen, just like you said.


----------



## Arm3nian

Quote:


> Originally Posted by *skupples*
> 
> you likely wont see a consumer grade hawaii with that kind of vram. You will likely have to wait for the commercial/professional grade Hawaii GPU's to release if that's the kind of vram you are looking for.
> 
> For those saying 4gb is already too much, maybe on single monitors... Even 5760x1080/3240x1920 can consume over 4gb in these new games coming out. OccamRazor linked screenshots of COD:Ghosts sucking up 5gb in 3240x1920 portrait surround last night. Same goes for Batman Origins, & BF4 is using well into 4 gigs in surround. So, it seems these new games are being programmed to utilize as much of it as you have. We should have more information on if it's actually increasing performance or not as time goes on.
> 
> hell, even allot of the new, not "next gen" games suck down vram like crazy in multi-monitor.


Remember, just because a game can use more VRAM doesn't mean it actually needs more VRAM. Some games use more VRAM to preload textures to reduce stuttering when moving to a new area. With my 690s, I ran games that used less than 2GB of VRAM, but those with the 4GB 680s were running the same games at the same settings and using 2800-3200MB.

When you actually run out of VRAM, you get .01 FPS. I experienced this a lot with the 690, because it was the single most powerful GPU and only had 2GB of VRAM; it was even worse with quad 690s. The more VRAM the better; "usage" can be misleading.


----------



## skupples

Quote:


> Originally Posted by *Arm3nian*
> 
> Remember just because the game can use more vram doesn't mean it actually needs more vram. Some games use more vram to preload textures to reduce stuttering when moving to a new area. With my 690s, I ran games that used less than 2gb vram, but those with the 4gb 680s were running the same game at the same settings and were using 2800-3200mb of vram.
> 
> When you run out of vram, you get .01 FPS, I experienced this a lot with a 690, because it was the single most powerful gpu and only had 2gb of vram, it was even worse with quad 690s. The more vram the better, "usage" can be misleading.


Very true... Whoa, since when did Google start doing this "someone has quoted you on OCN" thing? I swear this is the first time I've noticed it.


----------



## PillarOfAutumn

I'm thinking of buying a set of Fujipoly thermal pads that conduct at 17 W/mK. Now, I know I should be getting 0.5mm for the VRAM and 1.0mm for the VRM, but what dimensions should I get? I need to buy enough for the main waterblock as well as the backplate. Also, what thickness is needed for the backplate?


----------



## esqueue

Will a 1977 Bonneville heater core provide enough cooling for an R9 290X and a 3770K CPU? I get my block tomorrow and will get something else if it's not good enough.
Quote:


> Originally Posted by *skupples*
> 
> Very true... Woah, since when did google start doing this "some one has quoted you on OCN" thing? I swear this is the first time iv'e noticed it.


Can you post a screen of what you are talking about?


----------



## skupples

Quote:


> Originally Posted by *esqueue*
> 
> Will a 1977 Bonneville heater core provide enough cooling for a r9 290x and a 3770k cpu? I get my block tomorrow and will get something else if it is not good enough.
> Can you post a screen of what you are talking about?


----------



## xxmastermindxx

Well, my 290X is gone. Two 290 incoming... dat price:


----------



## flopper

Quote:


> Originally Posted by *xxmastermindxx*
> 
> Well, my 290X is gone. Two 290 incoming... dat price:


Ouch, our taxes kill the price for me.


----------



## DampMonkey

Quote:


> Originally Posted by *esqueue*
> 
> Will a 1977 Bonneville heater core provide enough cooling for a r9 290x and a 3770k cpu? I get my block tomorrow and will get something else if it is not good enough.


Depends on your fans, I'm guessing. I have a 360 and a 240, both UT60s, for my overclocked 8350 and 290X. I can turn off the fans on the 360 during BF4 and my temps will still be within reason (50°C CPU, 52°C GPU).


----------



## Strata

Sadly I had to work so I don't get to play with it...Also looks like it won't fit in my HAF XB with my swiftech mcr220qp res using push pull, but I'll know better tonight...


----------



## rdr09

Quote:


> Originally Posted by *xxmastermindxx*
> 
> Well, my 290X is gone. Two 290 incoming... dat price:


congrats on your "780 Ti".



----------



## bpmcleod

So I know it was like this for the 290Xs, but man, it's pointless to up the memory clocks on these cards. I just got a PowerColor 290, and at 1100 core I scored 10009 with around an 11500 GPU score. At 1050 core and 5250 memory, my score dropped to 9600. But I will say this: at stock clocks I score around 8600. Pretty large jump on stock volts, only clocking up 150MHz or so!









http://www.3dmark.com/fs/1109159


----------



## xxmastermindxx

Quote:


> Originally Posted by *rdr09*
> 
> congrats on your "780 Ti".
> 
> kid kid


Heh, I thought about it. I like PhysX sometimes, but for only $60 more, there was absolutely no question. Plus, they'll be water cooled.


----------



## rdr09

Quote:


> Originally Posted by *xxmastermindxx*
> 
> Heh, I thought about it. I like PhysX sometimes, but for only $60 more, there was absolutely no question. Plus, they'll be water cooled.


XFX Pro should handle them. congrats again.


----------



## bpmcleod

Quote:


> Originally Posted by *xxmastermindxx*
> 
> Well, my 290X is gone. Two 290 incoming... dat price:


I can't wait for the ASUS ones to come back into stock. I'll be snatching one of those up and possibly flashing its BIOS onto my PowerColor one, and joining you in Crossfire!


----------



## esqueue

Quote:


> Originally Posted by *skupples*


Cool, I haven't seen anything like that on this forum. I have seen things like it on other forums, though.

Quote:


> Originally Posted by *DampMonkey*
> 
> Depends on your fans im guessing. I have a 360 and a 240, both ut60's for my overclocked 8350 and 290x. I can turn off the fans on the 360 during BF4 and my temps will still be within good reason(50 cpu, 52 gpu)


Okay, I guess I'll see what I can do with my current setup before thinking about a new rad. I heard the Bonneville falls between the 240 and 360 rads with decent fans.


----------



## Arizonian

Quote:


> Originally Posted by *Strata*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sadly I had to work so I don't get to play with it...Also looks like it won't fit in my HAF XB with my swiftech mcr220qp res using push pull, but I'll know better tonight...


Congrats - added


----------



## Forceman

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> I'm thinking of buying a set of Fujipoly thermal pads that conduct at 17 W/mK. Now, I know I should be getting 0.5mm for the VRAM and 1.0mm for the VRM, but what dimensions should I get? I need to buy enough for the main waterblock as well as the backplate. Also, what thickness is needed for the backplate?


What are you going to put the Fujipoly on, on the back of the card? Just the PCB itself? Seems like a lot of money for little gain on that side.


----------



## xxmastermindxx

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> I'm thinking of buying a set of Fujipoly thermal pads that conduct at 17 W/mK. Now, I know I should be getting 0.5mm for the VRAM and 1.0mm for the VRM, but what dimensions should I get? I need to buy enough for the main waterblock as well as the backplate. Also, what thickness is needed for the backplate?


The pads that came with my EK backplate are 1.5 mm for the core, 1 mm for the VRM.


----------



## thekamikazepr

Just installed


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Forceman*
> 
> What are you going to put the fujipoly on on the back of the card? Just the PCB itself? Seems like a lot of money for little gain on that side.


I just ordered one 60x50x0.5mm pad for the VRAM and one 15x100x1.0mm pad for the VRM. I'm mainly going to concentrate on the front of the card where the waterblock goes, and if I have any of the 1.0mm left over I'll use it on the backplate. I'm just trying to put the best setup on there. How much of a difference does the backplate make in cooling?

Quote:


> Originally Posted by *xxmastermindxx*
> 
> The pads that came with my EK backplate are 1.5 mm for the core, 1 mm for the VRM.


Thanks! How much of a difference does passive backplate cooling make?


----------



## SpewBoy

Quote:


> Originally Posted by *Rar4f*
> 
> I think the PSU, www.newegg.com/Product/Product.aspx?Item=N82E16817580004&Tpk=newton r3 600w, should handle it without overclocking, but... if I overclock the 290 and the processor by a significant amount, I am pretty sure that even a good 600W will not be enough.


I think overvolting plays a big role. Do you plan on tweaking the voltage? That's when power draw really increases.

Sorry if someone's already responded to you. I just got up and I've got 10 pages to sift through.
Quote:


> Originally Posted by *ivanlabrie*
> 
> Overclock that cpu already, crossfire with stock cpu is a terrible idea.


Lol, I'm waiting on my GPU blocks. I realise it is a crime not to overclock a K CPU, but my blocks may arrive today and then I can complete the water loop and really see what's what. Hmm, but I do have an exam tomorrow...
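On the overvolting point above: as a rough CMOS rule of thumb, dynamic power scales with frequency times voltage squared, which is why voltage tweaks hit the PSU much harder than clock bumps alone. A back-of-the-envelope sketch; the 250 W / 947 MHz / 1.15 V baseline is an assumption for a reference 290, not a measured figure:

```python
def scaled_power(base_watts, base_mhz, base_mv, oc_mhz, oc_mv):
    """Dynamic power scales roughly with frequency x voltage^2 (CMOS rule of thumb).
    Ignores static/leakage power, so treat the result as a rough lower-bound estimate."""
    return base_watts * (oc_mhz / base_mhz) * (oc_mv / base_mv) ** 2

# e.g. an assumed ~250 W card pushed from 947 MHz @ 1.15 V to 1100 MHz @ 1.25 V
# comes out around ~343 W: the +16% clock alone would be ~290 W, and the
# voltage bump adds the rest.
```

That gap between ~290 W and ~343 W is the "overvolting plays a big role" part: a 600 W unit that is fine at stock can get marginal once you raise both clocks and voltage on the GPU and CPU.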


----------



## esqueue

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> How much of a difference does passive backplate cooling make?


I've heard no one say that it helps; one guy even said he thinks it traps heat. Do the rear plates even come with thermal pads?


----------



## Mr357

Quote:


> Originally Posted by *esqueue*
> 
> I've heard no one say that it helps, one guy said that he even thinks it traps heat. Do the rear plates even come with thermal pads?


Yes, the EK one comes with pre-cut pads. Think of it like the heatsinks on RAM modules: the thermal utility of a backplate is laughable; its main uses are to prevent PCB sag and to protect the components on the back. Not to mention, most people like the look of it, myself included.


----------



## maarten12100

My 290s will be delivered tomorrow between 13:00 and 15:00.
Delivery within 16 hours of ordering!


----------



## PillarOfAutumn

Quote:


> Originally Posted by *esqueue*
> 
> I've heard no one say that it helps, one guy said that he even thinks it traps heat. Do the rear plates even come with thermal pads?


The EK one does. What I really want to do is run FurMark for 10 minutes, run 3DMark, then play some BF4 for 30 minutes and see where the temps are. Then I want to add the backplate, do the same thing, and see where the temperatures are.

My only problem is that I'll be using rigid copper pipes in my build, so I'll have to find a way to attach the backplate while the card is still hooked up to my computer.


----------



## xxmastermindxx

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Thanks! How much of a difference does passive backplate cooling make?


I don't think it makes any difference at all, to be honest. My backplate felt slightly warm when the card was in use, but I'm pretty sure that a naked rear PCB with a little case airflow is just fine.
Quote:


> Originally Posted by *esqueue*
> 
> I've heard no one say that it helps, one guy said that he even thinks it traps heat. Do the rear plates even come with thermal pads?


The EK backplates do.
Quote:


> Originally Posted by *Mr357*
> 
> Yes, the EK one comes with pre-cut pads. Think of it like the heatsinks on RAM modules. The thermal utility of a backplate is laughable; it's main uses are to prevent PCB sagging and to protect the components on the back. Not to mention, most people like the look of it, including myself.


I'm not sure about that; my 290X, fully blocked with a backplate, looked pretty saggy to me.


----------



## xxmastermindxx

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> The ek one does. What I really want to do is run furmark 10 minutes, run 3DMark and then play some BF4 for 30 minutes and see where the temps are. then I want to add the backplate on, do the same thing and then see where the temperatures are.
> 
> But my only problem is that I'll be using rigid copper pipes in my build. So I'll have to find a way to attach the backplate while it's still hooked up to my computer.


A stubby screwdriver with a T8 tip would work, though I wouldn't try installing it with the card in the slot. It would be pretty difficult, especially for the screws close to the slot.


----------



## BababooeyHTJ

The backplate on my GTX 680 Lightning saved the card. I dropped a drop of water on my top rad and it dripped through onto the card, luckily onto the backplate. It also protects against physical damage: a slipped screwdriver, a dropped screw, etc. I wish I hadn't cheaped out, and had bought backplates for my cards.


----------



## esqueue

Quote:


> Originally Posted by *BababooeyHTJ*
> 
> The backplate on my GTX680 lightning saved the card. I dropped a drop of water on my top rad. It dripped through onto the card. Luckilly onto the backplate. it also protects against physical damage. A slipped screwdriver, dropped screw, etc. I wish that I didn't cheap out and bought a backplate for my cards.


It's not about being cheap; it's that the damn things were unavailable when we purchased our blocks. I am sure almost everyone that ordered a block would have gotten one. These guys should know that if they can't keep up with initial demand, they lose some profit. Think of how many more they would have sold if there had been enough to meet demand.

On the other hand, I do understand that the card is new and their main priority was to get the block out first.


----------



## solitario07777

A question for the owners of these cards.

First, I hope you understand; I'm using a translator.

I come from a Spanish forum where criticism of this card is harsh, with lots of returns; hardly anyone there uses it. Then I came to this thread, where the reviews aren't so bad and some users say they're happy with their cards.

So this is my question: are the 290X or 290 such a bad choice?

Do they give many problems in games?

Greetings, and thanks for responding.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *solitario07777*
> 
> A question for the owners of these cards.
> 
> First, I hope you understand; I'm using a translator.
> 
> I come from a Spanish forum where criticism of this card is harsh, with lots of returns; hardly anyone there uses it. Then I came to this thread, where the reviews aren't so bad and some users say they're happy with their cards.
> 
> So this is my question: are the 290X or 290 such a bad choice?
> 
> Do they give many problems in games?
> 
> Greetings, and thanks for responding.


The problem with the 290/290X is the heat it produces. It isn't able to dissipate it, and as a result it gets severely limited in how much it can perform. I think that for the price, the 290 and 290X are excellent cards, and if you're looking for an upgrade you most certainly should consider them.

However, I would only recommend the current cards if you are planning to watercool. Either that, or wait for non-reference coolers, because once you fix the heat issue this card performs excellently! It overclocks pretty well too. It's just that the stock cooler from AMD was holding it back; this card is as strong as a bull









----------



## SpewBoy

Quote:


> Originally Posted by *skupples*
> 
> For those saying 4gb is already too much, maybe on single monitors... Even 5760x1080/3240x1920 can consume over 4gb in these new games coming out.


The original OP mentioned he could use 4GB of VRAM at 1080p. I don't know whether that means he is actually rocking a 1080p monitor and decided he needs 8GB of VRAM and quad 290Xs to play games the way he likes, or whether 1080p was just an example of how ridiculously reckless he is with VRAM consumption and he actually has a multi-monitor setup.

Personally, I may have misinterpreted it as him using just a 1080p monitor, for which 8GB of VRAM would be ridiculous. Even if you state "oh, I could use 8GB of VRAM at 1080p if I wanted to", that doesn't mean you should. VRAM ain't everything, and more VRAM doesn't make a better-looking game lol. In fact, I'd go as far as to say that if you had the choice between a more powerful GPU with 4GB of VRAM and a less powerful GPU with 8GB for the same price, you'd be crazy not to go with the former, because for most intents and purposes VRAM is a ceiling. Unless you constantly hit that ceiling, you're not even going to know it is there; i.e., extra VRAM is worthless unless you utilise it most of the time.

Of course, if you run 3x1440p or 4K monitors and the cards have enough raw grunt to spew out the pixels fast enough, then by all means, use 8GB of VRAM. My main concern is that the number of fragments/pixels you'd need to manipulate would be so large that not even four 290Xs could push a decent enough FPS.
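One way to see that resolution alone isn't what eats VRAM: the raw framebuffer at these resolutions is tiny, and it's the texture pools and render targets that balloon. A quick back-of-the-envelope sketch, assuming 32-bit color and triple buffering:

```python
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Raw framebuffer footprint in MiB for a triple-buffered 32-bit target.
    Real games add extra render targets, textures, geometry, etc., so this
    is only the floor, not an estimate of total VRAM usage."""
    return width * height * bytes_per_pixel * buffers / 2**20

# 5760x1080 surround: framebuffer_mb(5760, 1080) -> ~71 MiB, which is tiny
# next to the multi-GB texture pools that actually fill VRAM at these sizes.
```

So the multi-gigabyte numbers people report in surround are almost entirely assets, which is exactly why "usage" expands to fill whatever VRAM the card has.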


----------



## pompss

Quote:


> Originally Posted by *solitario07777*
> 
> A question for the owners of these cards.
> 
> First, I hope you understand; I'm using a translator.
> 
> I come from a Spanish forum where criticism of this card is harsh, with lots of returns; hardly anyone there uses it. Then I came to this thread, where the reviews aren't so bad and some users say they're happy with their cards.
> 
> So this is my question: are the 290X or 290 such a bad choice?
> 
> Do they give many problems in games?
> 
> Greetings, and thanks for responding.


As an owner of an R9 290X, I would suggest you go with the R9 290.
They have almost the same performance; it doesn't make sense to spend $150 (or €100) more for the R9 290X.
The GTX 780 Ti is also too expensive.
At this point I will go with two 290s in Crossfire in a couple of months instead of buying the GTX 780 Ti.


----------



## Forceman

Are there any settings I need to change in CCC to improve quality/performance in games? Things along the lines of Transparency Antialiasing in the Nvidia Control Panel? Anyone have a guide? First time in a while using a high-end AMD card where I care about maximizing the settings.


----------



## Rar4f

The R9 290 heat problem should be a non-issue once better coolers are used.


----------



## sugarhell

Quote:


> Originally Posted by *Forceman*
> 
> Are there any settings I need to change in CCC to improve quality/performance in games? Things along the lines of Transparency Antialiasing in the Nvidia Control Panel? Anyone have a guide? First time in a while using a high-end AMD card where I care about maximizing the settings.


Check this as a basic guide

http://forums.guru3d.com/showthread.php?t=350890

Also new driver

http://www2.ati.com/drivers/beta/amd_catalyst_13.11_betav9.2.exe


----------



## SpewBoy

Quote:


> Originally Posted by *Forceman*
> 
> Are there any settings I need to change in CCC to improve quality/performance in games? Things along the lines of Transparency Antialiasing in the Nvidia Control Panel? Anyone have a guide? First time in a while using a high-end AMD card where I care about maximizing the settings.


I think there are AA options under Application Settings. There's the option of MLAA or whatever, which is basically a post-process filter that can get rid of aliasing, but for stuff like transparent chain-link fences it won't do a good job. Stuff like that requires some adaptation of supersampling, like multisampling. I too am unfamiliar with CCC.


----------



## Arm3nian

Quote:


> Originally Posted by *Forceman*
> 
> What are you going to put the fujipoly on on the back of the card? Just the PCB itself? Seems like a lot of money for little gain on that side.


Using it on the backplate seems like an extreme waste of money. The heat will be transferred to the backplate faster, but it will just sit there, because there is no active cooling. On the front of the card, the heat is actually carried away from the VRMs by the water.

In other words, passive cooling won't be enough to take advantage of the better heat transfer.


----------



## Duvar

Have you guys installed the newest driver? Sorry, I haven't read the last pages...
http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx


----------



## VSG

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> I'm thinking of buying a set of Fujipoly thermal pads that conduct at 17 W/mK. Now, I know I should be getting 0.5mm for the VRAM and 1.0mm for the VRM, but what dimensions should I get? I need to buy enough for the main waterblock as well as the backplate. Also, what thickness is needed for the backplate?


Quote:


> Originally Posted by *Forceman*
> 
> What are you going to put the fujipoly on on the back of the card? Just the PCB itself? Seems like a lot of money for little gain on that side.


Quote:


> Originally Posted by *Arm3nian*
> 
> Using it on the backplate seems like an extreme waste of money. The heat will be transferred faster to the backplate, but it will just stay there, because there is no active cooling. For the front of the card, the heat is actually carried away from the vrms with the water.
> 
> In other words, passive cooling won't be enough to take advantage of the heat transfer.


Ya, pretty much. Don't bother with the Fujipoly pads on the backplate; use the stock EK ones. As far as dimensions go, the pre-cut Fujipoly Ultra Extreme pads on FrozenCPU (100x15x0.5/1.0mm) are about 1mm short on one axis, but it should still be fine. I ended up getting a quarter sheet of the Fujipoly Extreme pads instead.


----------



## tsm106

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Quote:
> 
> 
> 
> Originally Posted by *solitario07777*
> 
> A question for the owners of these cards.
> 
> First, I hope you understand; I'm using a translator.
> 
> I come from a Spanish forum where criticism of this card is harsh, with lots of returns; hardly anyone there uses it. Then I came to this thread, where the reviews aren't so bad and some users say they're happy with their cards.
> 
> So this is my question: are the 290X or 290 such a bad choice?
> 
> Do they give many problems in games?
> 
> Greetings, and thanks for responding.
> 
> 
> 
> *The problem with the 290/x is the heat it produces. It isn't able to dissipate it and as a result, it gets severely limited on how much it can perform.* I think that for the price, the 290 and 290x are excellent cards and if you're looking for an upgrade, you most certainly should consider these cards.
> 
> however, I would only recommend the current cards if you are planning to watercool. Either that, or wait for non reference coolers. Because once you fix the heat issue, this card performs excellently! It overclocks pretty well too. It's just that the stock cooler from AMD was holding it back, however this card is tan fuerte como torro
> 
> 
> 
> 
> 
> 
> 
> .

Did you miss the memo that it was designed to run at that temp? Saying that it isn't able to dissipate the heat when it wasn't designed to dissipate the heat in question is outright misinformation.


----------



## Arizonian

Quote:


> Originally Posted by *sugarhell*
> 
> Check this as a basic
> 
> http://forums.guru3d.com/showthread.php?t=350890
> 
> Also new driver
> 
> http://www2.ati.com/drivers/beta/amd_catalyst_13.11_betav9.2.exe


Yea









Feature Highlights of The AMD Catalyst 13.11 Beta9.2 Driver for Windows

Call of Duty®: Ghosts - Improves anti-aliasing performance, and updates the AMD CrossFire™ profile
AMD Radeon™ R9 290 Series - PowerTune update to reduce variance of fan speed / RPM
Resolves intermittent crashes seen in legacy DirectX® 9 applications

Will add to OP when I have time later this evening.


----------



## Raxus

Got my new Sapphire 290X, no issues.

Picked up a 750D for my water cooling build.

Think people would have interest in a build log?


----------



## VSG

Quote:


> Originally Posted by *Raxus*
> 
> got my new 290x sapphire, no issues.
> 
> Picked up a 750d for my water cooling build.
> 
> Think people would have interest in a build log?


Of course!


----------



## Slomo4shO

Quote:


> Originally Posted by *Strata*
> 
> 
> 
> Sadly I had to work so I don't get to play with it...Also looks like it won't fit in my HAF XB with my swiftech mcr220qp res using push pull, but I'll know better tonight...


My HAF XB allows for up to 11" cards with a Seidon 240M in push/pull. Your rad is 7mm thicker, so you probably have around 10.72" available if you are using standard 25mm fans. Sapphire indicates that the 290X is 277mm long, which is roughly 10.9". It doesn't seem likely that the card will fit with the fans installed.

For reference, the MSI R7950 Twin Frozr 3GD5/OC that are currently in my case are 10.28" long and as you can see there isn't much room left:
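The inch/mm arithmetic above is easy to slip on, so here's a quick sanity check of those clearance numbers (just a sketch, assuming the figures quoted in this post: 11" of clearance with the Seidon, a rad 7mm thicker, and Sapphire's 277mm card length):

```python
# Sanity check of the GPU clearance math quoted above (HAF XB case).
MM_PER_IN = 25.4

clearance_in = 11.0        # usable card length with Seidon 240M in push/pull
rad_extra_mm = 7.0         # Swiftech rad is ~7 mm thicker than the Seidon

available_in = clearance_in - rad_extra_mm / MM_PER_IN
card_in = 277 / MM_PER_IN  # Sapphire's quoted 290X length

print(f"available: {available_in:.2f} in")  # ~10.72 in
print(f"card:      {card_in:.2f} in")       # ~10.91 in
print("fits" if card_in <= available_in else "does not fit")
```

With those numbers the card comes up roughly 0.18" (about 4.6mm) short of fitting, which matches the conclusion above.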


----------



## PillarOfAutumn

Quote:


> Originally Posted by *tsm106*
> 
> Did you miss the memo that it was designed to run at that temp? Saying that it isn't able to dissipate the heat when it wasn't designed to dissipate the heat in question is outright misinformation.


What I meant to say was that it doesn't dissipate it as efficiently.


----------



## tsm106

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Did you miss the memo that it was designed to run at that temp? Saying that it isn't able to dissipate the heat when it wasn't designed to dissipate the heat in question is outright misinformation.
> 
> 
> 
> What I meant to say was that it doesn't dissipate it as efficiently.
Click to expand...

Now you are making up yet another variation of the same point. Read the darn info on the new PowerTune before coming up with another way to say the same thing.


----------



## Forceman

Quote:


> Originally Posted by *Arizonian*
> 
> AMD Radeon™ R9 290 Series - PowerTune update to reduce variance of fan speed / RPM


Nice, mine's been cycling like crazy.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *tsm106*
> 
> Now you are making up yet more variation of the same point. Read the darn info on the new Powertune before coming up with another way to say the same thing.


explain to me what the purpose of that cooler is then if it's not designed to dissipate heat.


----------



## tsm106

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Now you are making up yet more variation of the same point. Read the darn info on the new Powertune before coming up with another way to say the same thing.
> 
> 
> 
> explain to me what the purpose of that cooler is then if it's not designed to dissipate heat.
Click to expand...

You need to be spoon fed too?









You can't click on the Tom's AMA? Or google it or something?


----------



## thekamikazepr

Hey guys, I'm trying to remove the stock cooler on my R9 290 to put a block on it, but it seems to be stuck and I don't want to force it. Are there any hidden/secret screws (under stickers) other than the two "void warranty" ones?

I haven't been able to find videos/guides yet, and searching this thread gave many results, most mentioning the Gelid cooler

btw i used this:

http://www.ekwb.com/shop/EK-IM/EK-IM-3831109868539.pdf

Edit/Update:

Grabbed the crappy cooler by the sides, wiggled it with a little force, and it came off.


----------



## Strata

Quote:


> Originally Posted by *Slomo4shO*
> 
> My HAF XB allows for up to 11" cards with a Seidon 240M in push/pull. Your rad is 7mm thicker, so you probably have around 10.72" available if you are using standard 25mm fans. Sapphire indicates that the 290X is 277mm long, which is roughly 10.9". It doesn't seem likely that the card will fit with the fans installed.
> 
> For reference, the MSI R7950 Twin Frozr 3GD5/OC that are currently in my case are 10.28" long and as you can see there isn't much room left:


The CM site says the XB fits 334mm cards, but that's without fans or rads. My math says I should have 57mm behind the card, which is 2mm too little, but perhaps some minor modding could fix it... we shall see


----------



## ivanlabrie

Quote:


> Originally Posted by *sugarhell*
> 
> Check this as a basic
> 
> http://forums.guru3d.com/showthread.php?t=350890
> 
> Also new driver
> 
> http://www2.ati.com/drivers/beta/amd_catalyst_13.11_betav9.2.exe


You gots triple 780s now mate?








I'm getting my 290 non x soonish...
Quote:


> Originally Posted by *Raxus*
> 
> got my new 290x sapphire, no issues.
> 
> Picked up a 750d for my water cooling build.
> 
> Think people would have interest in a build log?


Always nice to see geek pr0n...


----------



## VSG

So if I am to overvolt 2 290x's, is it worth it to connect the supplementary molex on my motherboard?


----------



## Rar4f

You gonna get non reference Ivan?


----------



## esqueue

Quote:


> Originally Posted by *tsm106*
> 
> You need to be spoon fed too?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You can't click on the Tom's AMA? Or google it or something?


I guess he worded some facts wrong. The reference cooler on the 290X is a bad design: it can't _*quietly*_ keep the card under the temp they deem fit, and it throttles as a result. The IceQ X2 cooler on the HIS Radeon R9 280X proves that a better cooler can easily be made, as shown here

I guess that's what he should have typed. Yes, I'm quite certain that you know this already. I would respond to him, but it was easier to quote you


----------



## rdr09

Quote:


> Originally Posted by *thekamikazepr*
> 
> Hey guys, I'm trying to remove the stock cooler on my R9 290 to put a block on it, but it seems to be stuck and I don't want to force it. Are there any hidden/secret screws (under stickers) other than the two "void warranty" ones?
> 
> I haven't been able to find videos/guides yet, and searching this thread gave many results, most mentioning the Gelid cooler
> 
> btw i used this:
> 
> http://www.ekwb.com/shop/EK-IM/EK-IM-3831109868539.pdf
> 
> Edit/Update:
> 
> Grabbed the crappy cooler by the sides, wiggled it with a little force, and it came off.


Looks like there are 16 screws on the back of the PCB, including the ones for the core, plus 2 screws that hold the shroud to the bracket.

http://www.techspot.com/review/736-amd-radeon-r9-290/

All the reference cards should have the same screw layout.


----------



## tsm106

Quote:


> Originally Posted by *esqueue*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> You need to be spoon fed too?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You can't click on the Tom's AMA? Or google it or something?
> 
> 
> 
> I guess he worded some facts wrong. The reference cooler on the 290x is a bad design and can't _*quietly*_ keep the card under a temp which they deem fit and throttles as a result. The IceQ X2 cooler off the HIS Radeon R9 280X proves that a better cooler can easilly be made as proven as shown here
> 
> I guess that's what he should have typed. Yes, I'm quite certain that you know this already. I would respond to him but it was easier to quote you
Click to expand...

Again, more noobery. It's obvious you want a cool setup, and the 290s are not that. Is it because the cooler just plain sucks, or is it maybe because AMD decided in their wisdom to keep the card in a specific temp range on purpose? Maybe you know more than AMD? You know,







if you had a better cooler, it would take the card out of this specific range, make the card expend more energy to maintain the lower temp, and force it through many more heat cycles in a given timeframe. Just because everyone thinks what's in their head is right doesn't make it so.

Quote:


> My favorite is the new implementation of PowerTune on the 290X and 290. There's a lot of doom and gloom around the 95C temperature, because people are used to a world where the product is designed to run as cold as possible... but that's not the world we're living in with these units. The doom and gloom is based on an old viewpoint.
> 
> 95C is the optimal temperature that allows the board to convert its power consumption into meaningful performance for the user. Every single component on the board is designed to run at that temperature throughout the lifetime of the product.
> 
> If you throttle the temperature down below that threshold, then the board must in turn consume less power to respect the new temperature limit. Consuming less power means lowering vcore and engine clock, which means less performance.
> 
> You want to take full advantage of product TDP to maximize performance, and that is accomplished with a 95C ideal operating temperature for the 290 and 290X.
> 
> Even with a third-party cooling solution, like the Accelero 3 some users have started deploying, the logic of PowerTune will still try to maximize TDP by allowing temperatures to float higher until some other limit is met (voltage, clock, fan RPM, whatever).
> 
> It's so bloody smart and it kills me that more people don't fully understand it.


----------



## Snyderman34

Thinking I may get rid of one of my 290s. Two is a ton of power for one 1440p monitor, and I'm thinking I'd rather have one WC'd 290 than two hot and (semi-)loud 290s. I talked with Newegg and they set up a refund for me (when I received the cards, one had a broken seal and a tear in the box, so the lady set me up). Now I just gotta find a block...


----------



## esqueue

Quote:


> Originally Posted by *tsm106*
> 
> Again, more noobery. It's obvious you want a cool setup and the 290s are not that. Is it because the cooler just plain sucks or is maybe because AMD decided in their wisdom to keep the card in a specific temp range on purpose? Maybe you know more than AMD? You know,
> 
> 
> 
> 
> 
> 
> 
> if you had a better cooler, it would take the card out of this specific range and now make the card expend more energy to maintain the lower temp and it also forces the card to go thru many more heat cycles in a given timeframe. Just because everyone thinks they know what the hell they think in their head is right, doesn't make it so.


I will refrain from childish name-calling and pretend that we are all mature adults. You argue that 95°C is the optimal temperature for the GPU. I guess you are also saying that throttling your GPU to hold that 95°C is better than a more efficient cooling setup that is capable of more but reduces fan speed to keep that so-called optimal temperature.

OK, whatever you say. Almighty AMD knows it, the loud fan is optimal, and throttling is good.


----------



## bpmcleod

I recently purchased an R9 290 and Windows 8.1. After installing Windows 8.1, my screen locks up randomly, then black-screens on boot. I had this same problem with my 780, and all I had to do then was disable the Intel HD 4000 graphics, but now the problem still persists. Any ideas?

Edit: This only happens with the 290. When I unplug the PCIe power connectors and boot from the iGPU, it runs fine.


----------



## tsm106

Quote:


> Originally Posted by *esqueue*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Again, more noobery. It's obvious you want a cool setup and the 290s are not that. Is it because the cooler just plain sucks or is maybe because AMD decided in their wisdom to keep the card in a specific temp range on purpose? Maybe you know more than AMD? You know,
> 
> 
> 
> 
> 
> 
> 
> if you had a better cooler, it would take the card out of this specific range and now make the card expend more energy to maintain the lower temp and it also forces the card to go thru many more heat cycles in a given timeframe. Just because everyone thinks they know what the hell they think in their head is right, doesn't make it so.
> 
> 
> 
> I will refrain from childish name calling and pretend that we are all mature adults. You argue that 95°C is the optimal temperature for the gpu, I guess that you are also saying that throttling your gpu to keep that 95°C is better than having a more efficient cooling setup that is capable of more but reduces fan speed to keep that so called optimal temperature.
> 
> 
> 
> 
> 
> 
> 
> Ok, what ever you say. Almighty AMD knows it, the loud fan is optimal and throttling is good.
Click to expand...

Wrong. I'm not arguing anything. You guys are making the argument that the cooler sucks. But that is flawed, and you are projecting your frustration over not understanding AMD's design principles behind their decision to run the cards at that temp. You claim the cooler is an inferior design because it can't keep the card cooler. That just shows your disconnect from, or misunderstanding of, AMD's stated goal of 95c.

This is like damning Porsche because they used oil to cool their engines for decades before switching to water. Bah, doesn't Porsche know better than to cool their engines with OIL of all things? Or how about their rear-engine design, ahaha, what a joke that is, right? Am I getting close?


----------



## bpmcleod

Quote:


> Originally Posted by *tsm106*
> 
> Wrong. I'm not arguing anything. You guys are making the argument that the cooler sucks. But that is flawed, and you are projecting your frustration over not understanding AMD's design principles behind their decision to run the cards at that temp. You claim the cooler is an inferior design because it can't keep the card cooler. That just defines your disconnect or misunderstanding of AMD's stated goal of 95c.
> 
> This is like damning Porsche because they used oil to cool their engines for decades before switching to water. Bah, don't Porsche know better than to cool their engines with OIL of all things? Or how about their rear engine design, ahaha what joke that is right? Am I getting close?


The card is designed to run at 95C and runs best around that temp. Albeit it's a rather odd way to market a product, but that's what they wanted, I guess. Basically the best way to get performance from these cards is to run them hot and figure out a way to keep them hot while not allowing them to throttle at the same time! This was AMD's thinking! So good luck to all of us!


----------



## Rar4f

So if you cool a 290 to, say, 70 degrees, it will perform less. If it overheats at 95 degrees, it will throttle...
So 85 degrees is the sweet spot or something? Am I understanding this right?


----------



## Forceman

So now running 95C can be considered a feature? I guess everyone should dismantle their water cooling setups since they are just holding their cards back.


----------



## BababooeyHTJ

Just ignore TSM, AMD can do no wrong with him.









Yeah, GTX480 had a great cooler too. Ran at optimal temps as well.


----------



## tsm106

Quote:


> Originally Posted by *bpmcleod*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Wrong. I'm not arguing anything. You guys are making the argument that the cooler sucks. But that is flawed, and you are projecting your frustration over not understanding AMD's design principles behind their decision to run the cards at that temp. You claim the cooler is an inferior design because it can't keep the card cooler. That just defines your disconnect or misunderstanding of AMD's stated goal of 95c.
> 
> This is like damning Porsche because they used oil to cool their engines for decades before switching to water. Bah, don't Porsche know better than to cool their engines with OIL of all things? Or how about their rear engine design, ahaha what joke that is right? Am I getting close?
> 
> 
> 
> The card is designed to run at 95C and runs best around that temp. Albeit it's a rather odd way to market a product, but that's what they wanted, I guess. Basically the best way to get performance from these cards is to run them hot and figure out a way to keep them hot while not allowing them to throttle at the same time! This was AMD's thinking! So good luck to all of us!
Click to expand...

This might seem counterintuitive, but you can choose not to follow AMD's TDP plan. Go water and disable the TDP limit and temp limit. The card will work very hard, but its temp will stay very even since it's watercooled. This achieves the principal effect AMD is shooting for: keeping the card in a consistent temperature range. Since the card barely fluctuates in temps under water, the bloody thing will last forever because it doesn't have to go through countless heat cycles, like it would with a very good air cooler. A good cooler, for example, might keep idle temps in the mid-30s C while loaded temps shoot to 70c. That would be a large temperature swing in AMD's eyes, imo. Pretty much what they want to avoid with their PowerTune.

Quote:


> Originally Posted by *Rar4f*
> 
> So if you cool 290 to say 70 degree, it will perform less. If it overheats at 95 degree it will throttle...
> So 85degrees is the right spot or something? Am i understanding this right?


IIRC, if you put on a cooler that cools much better, you are still bounded by the TDP, temp, and voltage limits. Thus it will run cooler and faster until you hit one of those limits and throttle.
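The limit behavior described above can be sketched as a toy control loop. Purely illustrative, not AMD's actual firmware; the limit values, names, and step size here are invented:

```python
# Toy sketch of a PowerTune-style boost loop: raise clocks while under
# every limit (temperature, power), back off once any limit is hit.
# All numbers below are invented for illustration, not AMD's.

TEMP_LIMIT_C = 95
POWER_LIMIT_W = 290
CLOCK_MIN, CLOCK_MAX = 727, 1000  # MHz, roughly the reference 290X range


def next_clock(clock, temp_c, power_w, step=13):
    """One controller tick: boost if under all limits, throttle otherwise."""
    if temp_c >= TEMP_LIMIT_C or power_w >= POWER_LIMIT_W:
        return max(CLOCK_MIN, clock - step)  # throttle toward base clock
    return min(CLOCK_MAX, clock + step)      # headroom left: boost

print(next_clock(1000, temp_c=96, power_w=250))  # at temp limit -> 987
print(next_clock(900, temp_c=80, power_w=250))   # cool and under TDP -> 913
```

The point of the sketch: a better cooler doesn't remove the limits, it just changes which one you hit first. Lower temps at a given clock let the loop settle at a higher sustained clock until the power or voltage cap takes over.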


----------



## esqueue

Quote:


> Originally Posted by *tsm106*
> 
> Wrong. I'm not arguing anything. You guys are making the argument that the cooler sucks. But that is flawed, and you are projecting your frustration over not understanding AMD's design principles behind their decision to run the cards at that temp. You claim the cooler is an inferior design because it can't keep the card cooler. That just defines your disconnect or misunderstanding of AMD's stated goal of 95c.


You are completely ignoring what I said in my last post. A cooler that can cool something while remaining quieter is superior to one that can't, if they have the same durability. There are coolers on 280X cards that would let the 290 hold that 95°C without the noise, and certainly without the throttling.

The reference 290 cooler is not keeping it at 95°C quietly, and it still throttles because they don't want it to get louder. That is the definition of a bad design.

I'll end this discussion, as we will never agree.


----------



## bpmcleod

Sooo anyone happen to have an idea on my problem? :D


----------



## BababooeyHTJ

Yes, AMD's new powertune is excellent. I love what they did.

Using that to defend the subpar stock cooler is a ridiculous statement though. There is a lot of room for improvement with the stock cooling on these cards.


----------



## battleaxe

Quote:


> Originally Posted by *bpmcleod*
> 
> Sooo anyone happen to have an idea on my problem? D


Surely you can't be serious. I am serious and don't call me Shirley. No No... we're too busy talking about temps to answer any real questions.


----------



## ABD EL HAMEED

Quote:


> Originally Posted by *tsm106*
> 
> Again, more noobery. It's obvious you want a cool setup and the 290s are not that. Is it because the cooler just plain sucks or is maybe because AMD decided in their wisdom to keep the card in a specific temp range on purpose? Maybe you know more than AMD? You know,
> 
> 
> 
> 
> 
> 
> 
> if you had a better cooler, it would take the card out of this specific range and now make the card expend more energy to maintain the lower temp and it also forces the card to go thru many more heat cycles in a given timeframe. Just because everyone thinks they know what the hell they think in their head is right, doesn't make it so.


Quote:


> Originally Posted by *bpmcleod*
> 
> Sooo anyone happen to have an idea on my problem? D


In an argument like this you will be ignored lol


----------



## Kriant

Did I miss something, or does one of the members of this much HEATED argument seem to contradict himself? 0_o ( see what I did there







)


----------



## bpmcleod

Obviously, lol. I am trying to boot through the iGPU with the PCIe power connectors in, so the machine recognizes the 290 and I can try to update the drivers. If I have the iGPU disabled and run off the 290, trying to update freezes it. I can move the mouse but can't do anything. We'll see how this goes.

Edit: So far so good! Got past the step it was freezing on!


----------



## tsm106

Extreme heat damages parts, and 95c is not extreme. Ppl are freaking out and ignoring the fact that any typical CrossFire/SLI setup on air will routinely hit 95c. For ex. a set of Titans in SLI under load on the stock cooler will hit the mid 90s, and throttle like mad doing it; same with my old 7970s, especially with the tight spacing air coolers get. That aside... who am I to question AMD? They designed the card, and they are standing behind it at that temp. Now, just because I am putting up AMD's side and correcting ppl on their liberal use of opinion, it doesn't mean I am not going water. I'm totally going water, but there is one unique aspect to water, as I mentioned above: water loops keep GPUs in a set temperature range, just like what AMD is doing with the new PowerTune. The difference for me is that I'm going to disable what boundary limits I can, like the TDP limit.


----------



## esqueue

I can assure you that if AMD had a way of swapping all of our coolers for superior ones that cool better with less noise, at no cost to them, they would in a heartbeat. They would never say something as nonsensical as "throttling provides more performance than a card that doesn't throttle." Anyone who believes such things is a fool.
These are my opinions and I will stick to them. Companies make mistakes all the time, and putting a crappy cooler on a video card is hardly a major deal, but saying that it was designed that way is crazy.


----------



## rdr09

Quote:


> Originally Posted by *BababooeyHTJ*
> 
> Just ignore TSM, AMD can do no wrong with him.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, GTX480 had a great cooler too. Ran at optimal temps as well.


how are your 290Xs?


----------



## SpewBoy

tsm, it seems you're kinda just looking for an argument here...

I don't think anyone is debating whether or not it is good to keep the card at a consistently high temperature (except ABD EL HAMEED, who asked an innocent question and you replied - all good there). Yes there are fewer heat cycles and yes, it is apparent that AMD designed the card to run at 95c no problems. That's all fine.

I find it hard to believe that you can argue that the cooler itself is not rubbish. In order to _keep_ the card at 95 degrees and unthrottled, the cooler must ramp up to fan speeds that can only be considered "rather loud" at best.

Contrast this with a custom cooler and what you get instead is a card that still tops out at 95 degrees, but is 1. less likely to throttle (because the cooler has more noise / rpm headroom) and 2. much quieter. I believe that's the point just about everyone here is trying to get across.
Quote:


> Originally Posted by *battleaxe*
> 
> Surely you can't be serious. I am serious and don't call me Shirley. No No... we're too busy talking about temps to answer any real questions.


*High five* for Airplane! quote


----------



## dade_kash_xD

Finally got my Sapphire R9-290's! They OC well and stay pretty cool. Fan sounds super loud! Can't wait to throw these under water! Sign me up Arizonian




3DMark 13 Stock Clocks


3DMark 13 Overclocked 1125/1400


----------



## Tobiman

Try the new driver. Beta version 9.


----------



## Arizonian

/removed post swearing and replies in quotes

Reminder of the no profanity rule please. Even acronyms or abbreviations to circumvent swearing.


----------



## VSG

New driver resolving the fan speed issue is out: http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx


----------



## Tobiman

Quote:


> Originally Posted by *bpmcleod*
> 
> Sooo anyone happen to have an idea on my problem? D


Try this.
Quote:


> Originally Posted by *the9quad*
> 
> Thats exactly what I had to do with mine. Create a custom fan profile, and I also bumped up vcore for some reason, but ever since I did that no more black screens. Weird it was like before the custom profile it would wait til the last possible minute to spin up the fans, and by then it was too late and voila black screen, just like you said.


----------



## tsm106

Quote:


> Originally Posted by *SpewBoy*
> 
> tsm, it seems you're kinda just looking for an argument here...
> 
> I don't think anyone is debating whether or not it is good to keep the card at a consistently high temperature (except ABD EL HAMEED, who asked an innocent question and you replied - all good there). Yes there are fewer heat cycles and yes, it is apparent that AMD designed the card to run at 95c no problems. That's all fine.
> 
> *I find it hard to believe that you can argue that the cooler itself is not rubbish.* In order to _keep_ the card at 95 degrees and unthrottled, the cooler must ramp up to fan speeds that can only be considered "rather loud" at best.
> 
> Contrast this with a custom cooler and what you get instead is a card that still tops out at 95 degrees, but is 1. less likely to throttle (because the cooler has more noise / rpm headroom) and 2. much quieter. I believe that's the point just about everyone here is trying to get across.


You bring up somewhat of a fallacy. If you accept that AMD designed things this way on purpose, and that the cooler technically meets the specs they laid out, then how can you say it is a rubbish cooler? That statement would only make sense if you declare AMD's philosophy invalid. That's fine too; personally I like cool GPUs just like you. But I wonder if we're conditioned to prefer things one way since we've been doing it that way for so long? I'd like to bring up an example of AMD breaking the mold with their CrossFire bridge design, er, the lack of it on the 290s. Ppl blasted AMD over the same theoretical limit on the bridges as NV. And when they said they were not going to use bridges anymore, you had experts go off. And what happened? CrossFire is working pretty darn well over the PCIe bus.

I've had custom-cooled cards, like for ex. 580 TF2/OCs, and they were as hot and loud as anything I've ever heard. I dunno, I'm sitting here next to three 290Xs on the stock coolers and I'm not really bothered by them. They are idling at 62c atm, but I'm not swimming in heat like I was with the 580s. Gawd, I still remember those things and the insane heat they put out. The whole case got hot, and the cards were in a loop!


----------



## SpewBoy

Quote:


> Originally Posted by *tsm106*
> 
> You bring up somewhat of a fallacy. If you accept that AMD designed things this way on purpose, and that the cooler technically meets the specs they laid out, then how can you say it is a rubbish cooler? That statement would only make sense if you declare AMD's philosophy invalid. That's fine too; personally I like cool GPUs just like you. But I wonder if we're conditioned to prefer things one way since we've been doing it that way for so long? I'd like to bring up an example of AMD breaking the mold with their CrossFire bridge design, er, the lack of it on the 290s. Ppl blasted AMD over the same theoretical limit on the bridges as NV. And when they said they were not going to use bridges anymore, you had experts go off. And what happened? CrossFire is working pretty darn well over the PCIe bus.


In situations such as this, I like to turn to definitions. Because a cooler is meant for cooling, I define a rubbish cooler as a cooler that must resort to higher fan speeds and therefore louder noise to maintain a certain temperature compared to other available coolers / air cooling technologies. If the company that made the product got to define what a good and bad product was, we'd never have bad products.

I'm not debating that the cooler doesn't "giterdun". Mine are both capable of cooling the GPUs, but I have to resort to over 55% fan speed to do so without throttling, which for me is really loud (loudest card I've ever owned at load, including a GTX 260 with some generic custom cooler that had a 2 pin connector and blasted at 100% all day long).


----------



## esqueue

Guess I'll post my picture, getting happy that my waterblock will arrive tomorrow.









On a side note, notice how on the box they technically define a superior cooler as quiet.
EDIT: image deleted. I feel kinda foolish having posted it only to have it ignored.


----------



## bpmcleod

Quote:


> Originally Posted by *Tobiman*
> 
> Try this.


I don't think you got what my problem was. I did a fresh install of Windows 8.1, and my screen freezes when I run off the 290. I can still move the mouse but can't do anything. If I restart, it black-screens and the mouse/keyboard turn off. I can run off the iGPU just fine; it only happens on the 290. I downloaded the driver installer, turned off the PC, set it to boot from PCIe, restarted, plugged in the PCIe connectors, got it started, and launched the installer (BTW, I disabled the iGPU), and about halfway through the screen goes black. If I unplug and replug the cable, the screen comes back but is unresponsive, and I can't do anything.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> *The problem with the 290/X is the heat it produces. It isn't able to dissipate it, and as a result it gets severely limited in how much it can perform.* I think that for the price, the 290 and 290X are excellent cards, and if you're looking for an upgrade you most certainly should consider them.
> 
> However, I would only recommend the current cards if you are planning to watercool. Either that, or wait for non-reference coolers. Because once you fix the heat issue, this card performs excellently! It overclocks pretty well too. It's just that the stock cooler from AMD was holding it back; this card is as strong as a bull (tan fuerte como un toro)
> 
> 
> 
> 
> 
> 
> 
> .


Quote:


> Originally Posted by *tsm106*
> 
> Did you miss the memo that it was designed to run at that temp? Saying that it isn't able to dissipate the heat when it wasn't designed to dissipate the heat in question is outright misinformation.


Quote:


> Originally Posted by *PillarOfAutumn*
> 
> What I meant to say was that it doesn't dissipate it as efficiently.


Quote:


> Originally Posted by *tsm106*
> 
> Now you are making up yet more variation of the same point. Read the darn info on the new Powertune before coming up with another way to say the same thing.


Apparently you don't know how to interpret reviews? The more recent reviews even said that once the fan was pushed up by 7%, performance improved. What does this mean? Cooler = more performance. I'm not sure what you're trying to argue here. Do you think AMD made a good cooler?

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> explain to me what the purpose of that cooler is then if it's not designed to dissipate heat.


Quote:


> Originally Posted by *tsm106*
> 
> You need to be spoon fed too?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You can't click on the Tom's AMA? Or google it or something?


It's not my responsibility to research and figure out what the heck you're trying to say, because honestly, I don't even know what you're arguing. Learn how to talk like an adult.
Quote:


> Originally Posted by *esqueue*
> 
> I guess he worded some facts wrong. The reference cooler on the 290x is a bad design and can't _*quietly*_ keep the card under a temp which they deem fit and throttles as a result. The IceQ X2 cooler off the HIS Radeon R9 280X proves that a better cooler can easilly be made as proven as shown here
> 
> I guess that's what he should have typed. Yes, I'm quite certain that you know this already. I would respond to him but it was easier to quote you


I'm not sure I could have put it more similarly myself. I wasn't really speaking about the noise, but you bring up a good point about the noise too. I was mainly talking about the heat dissipation of the stock coolers.

Quote:


> Originally Posted by *tsm106*
> 
> Wrong. I'm not arguing anything. You guys are making the argument that the cooler sucks. But that is flawed, and you are projecting your frustration over not understanding AMD's design principles behind their decision to run the cards at that temp. You claim the cooler is an inferior design because it can't keep the card cooler. That just defines your disconnect or misunderstanding of AMD's stated goal of 95c.
> 
> This is like damning Porsche because they used oil to cool their engines for decades before switching to water. Bah, don't Porsche know better than to cool their engines with OIL of all things? Or how about their rear engine design, ahaha what joke that is right? Am I getting close?


Right, so if AMD came out and said a card is "designed to run at 95", then all of us and the reviewers should pack up and go home, because it was "designed" to do that. If one were to say (and this means "in a hypothetical situation") that this card doesn't beat the GTX 780, it'd be alright for you to counter with "it wasn't designed to beat the GTX 780." The way you're arguing is like telling someone who is complaining that their A/C isn't cooling their room, "oh, the A/C wasn't designed to keep your room cool." Any flaw that gets pointed out can be met with "it wasn't designed to do that."

I don't know how many reviews you've read, but almost all reviewers have said that if the card ran cooler, you'd be able to get more performance out of it. And the fact that you can't overclock much without quickly hitting 95C, or without the card sounding like a jet engine, says a lot about how flawed the cooler is and how little time AMD spent on it. If you need to modify something yourself in order to have it perform like it should, you can't chalk that up to "AMD put thought into this, so this is how it is supposed to work." Many of our members have replaced the cooler with aftermarket ones to remedy this design flaw from AMD. Please don't use your opinions to argue against facts.


----------



## ABD EL HAMEED

Quote:


> Originally Posted by *tsm106*
> 
> Extreme heat damages parts, which 95c is not. Ppl are freaking out and ignore the fact that any typical crossfire/sli setup on air will routinely hit 95c temps. For ex. a set of titans in sli under load on the stock cooler will hit mid 90s, and throttle like mad doing it, same with my old 7970s and especially with the spacing from aircoolers. That aside... who am I question AMD? They designed the card, and they are standing behind it at that temp. Now just because I am putting up AMD's side and correcting ppl on their liberal use of opinion, it doesn't mean I am not going water. I'm totally going water, but there is one unique aspect to water as I mentioned above. Water loops keep gpus in a set temperature range just like what AMD is doing with this new Powertune. The difference for me is that I'm going to disable what boundry limits I can, like the TDP limit.


OK, so 95°C isn't extremely hot, and what AMD is trying to accomplish here is to make temps stay in a certain range rather than swing all the time, which might be understandable since swinging temps cause stress. Right?


----------



## Forceman

Even if it was "designed to run at 95C" and AMD specifically runs powertune to keep the card at exactly 95C because it "improves performance" somehow - the cooler is _still_ a failure because it can't even keep the card at 95C without throttling. Even if they wanted it to run at exactly 95C all the time to maximize performance (which I think is very unlikely) then why not put a cooler on that could do so while maintaining the clock speed? Or is throttling also somehow beneficial to performance?


----------



## iPDrop

Is anyone else getting terrible crossfire scaling with the 290's? One card gives me an average of 80 fps in BF4, and then both of them together give me an average of 107 and crazy GPU usage..

This was happening both on the original drivers from the disc and the latest 13.11 v9.2 beta.


----------



## esqueue

Quote:


> Originally Posted by *ABD EL HAMEED*
> 
> OK, so 95°C isn't extremely hot, and what AMD is trying to accomplish here is to make temps stay in a certain range rather than swing all the time, which might be understandable since swinging temps cause stress. Right?


That would be true, but going from nearly 95°C to 60°C almost instantly after you close a game is what causes heat stress. If they wanted to keep it at 95°C, they would have the fan idle at lower than 20%. I set a custom fan profile to reduce the wide temperature swings, at the cost of quietness. It's a temporary solution.
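A custom fan profile like the one described is just a set of (temperature, fan-duty) breakpoints with linear interpolation between them. A minimal sketch with made-up breakpoints (real curves are set in a tool like MSI Afterburner, not in code):

```python
# Sketch of a piecewise-linear fan curve; the breakpoints are hypothetical.

def fan_speed(temp_c, curve=((40, 30), (60, 45), (80, 60), (95, 85))):
    """Interpolate fan duty (%) from GPU temperature using (temp, duty) points."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # linear interpolation between neighbouring breakpoints
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]  # above the last breakpoint: hold max duty

print(fan_speed(70))  # -> 52.5, halfway between the 60C and 80C points
```

A steeper curve at low temps keeps the card warmer at idle (smaller swings) at the cost of idle noise, which is the trade-off described above.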


----------



## Arizonian

This so reminds me of the Nvidia Fermi days, only this time the shoe's on the other foot, but I'm wearing it again. There are still some hot-running GTX 480s going to this day, even in SLI configurations. So there is some backing that these cards can last even with this kind of heat.

Personally, coming from Nvidia's reference coolers, which keep lower temps out of the box with low noise, I know these AMD ref coolers could've been done better too. Non-reference cards, out soon, will lay this debate to rest, and we won't be suggesting reference cards unless you're water blocking.

If you don't like the heat, stay out of the kitchen. I'm used to it and I'm not afraid of it. If you have horrible ventilation in your case, then you might look elsewhere until aftermarket coolers come out. Back in the day Nvidia said that those 480's, even 580's, could handle it, and they did. I don't see what's any different about AMD making the same claim now.


----------



## Tobiman

Quote:


> Originally Posted by *bpmcleod*
> 
> I don't think you got what my problem was. I did a fresh install of Windows 8.1 and my screen freezes when I run off the 290. I can still move the mouse but can't do anything. If I restart, it black screens and the mouse/keyboard turn off. I can run off the iGPU just fine. It only happens on the 290. I downloaded the driver installer, turned off the PC, set it to boot from PCIe, restarted, plugged in the PCIe connectors, got it started, started the installer (btw, I disabled the iGPU), and about halfway through the screen goes black. If I switch the cable out and in, the screen will appear but is inactive, and I can't do anything.


Do you have another card to test your PCIE slot?


----------



## Tobiman

Quote:


> Originally Posted by *iPDrop*
> 
> Is anyone else getting terrible crossfire scaling with the 290's ? One card gives me an average of 80fps in BF4 and then both of them together give me average of 107 and crazy GPU usage..
> 
> was happening both on original drivers from disk and latest 13.11 v9.2 beta


Disable ULPS! Use MSI ab or go through the registry route.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Arizonian*
> 
> This so reminds me of the Nvidia Fermi days, only this time the shoe's on the other foot, but I'm wearing it again. There are still some hot-running GTX 480s going to this day, even in SLI configurations. So there is some backing that these cards can last even with this kind of heat.
> 
> Personally, coming from Nvidia's reference coolers, which keep lower temps out of the box with low noise, I know these AMD ref coolers could've been done better too. Non-reference cards, out soon, will lay this debate to rest, and we won't be suggesting reference cards unless you're water blocking.
> 
> If you don't like the heat, stay out of the kitchen. I'm used to it and I'm not afraid of it. If you have horrible ventilation in your case, then you might look elsewhere until aftermarket coolers come out. Back in the day Nvidia said that those 480's, even 580's, could handle it, and they did. I don't see what's any different about AMD making the same claim now.


My GTX 280 doesn't run cool either. I'm not sure how high it was getting under Dota 2 @ 1440p, but it was high 80s for sure.


----------



## SpewBoy

Quote:


> Originally Posted by *iPDrop*
> 
> Is anyone else getting terrible crossfire scaling with the 290's ? One card gives me an average of 80fps in BF4 and then both of them together give me average of 107 and crazy GPU usage..


I get crazy GPU usage in Black Ops. Pretty much when one card is at 100% load, the other is close to 0%, and it fluctuates between the cards multiple times a second. I put this down to CF support no longer being available for Black Ops, because it's 3 years old and they don't care about it anymore.

In benchmarks I can't say I've experienced these issues.

EDIT:
Quote:


> Originally Posted by *Tobiman*
> 
> Disable ULPS! Use MSI ab or go through the registry route.


Ahh, I'll give that a shot too. Cheers.


----------



## tsm106

Quote:


> Originally Posted by *SpewBoy*
> 
> In situations such as this, I like to turn to definitions. Because a cooler is meant for cooling, I define a rubbish cooler as a cooler that must resort to higher fan speeds and therefore louder noise to maintain a certain temperature compared to other available coolers / air cooling technologies. If the company that made the product got to define what a good and bad product was, we'd never have bad products.
> 
> I'm not debating that the cooler doesn't "giterdun". Mine are both capable of cooling the GPUs, but I have to resort to over 55% fan speed to do so without throttling, which for me is really loud (loudest card I've ever owned at load, including a GTX 260 with some generic custom cooler that had a 2 pin connector and blasted at 100% all day long).


In classical terms, sure, the cooler isn't as good as it could be. However, the thing is they didn't try to make it better, so I'm not sure it's fair to say that unless we also call their implementation of Powertune a failure. Another thing: I've never been a fan of Powertune in its earlier forms, and I've written guides on how to bypass it, but I understand why they as a brand need PT. I guess I'm willing to wait and see. They've been doing a helluva lot of things everyone said they couldn't do during this launch, from the GPU coolers y'all hate, to Mantle, to the consoles, to APUs. Shrugs...

Quote:


> Originally Posted by *ABD EL HAMEED*
> 
> OK, so 95°C isn't extremely hot, and what AMD is trying to accomplish here is to make temps stay in a certain range rather than swing all the time, which might be understandable since swinging temps cause stress. Right?


Yea, lowered heat cycling improves component life.


----------



## RYCRAI

I got the Sapphire 290 and it produces a God awful amount of coil whine... Seriously considering RMA'ing this. Anyone else having this problem?


----------



## Paul17041993

I'm surprised I'm still hearing complaints about temperature and the ref cooler...

100C is in fact a normal running temp for the vast majority of silicon-based microchips, and these cards are no different. It's just that the FX and first-gen GCN cores had certain sensitivities in their design where high temps could trigger instability and wear them out. They changed this in the 290 and 290X, so, just like a lot of Intel chips, and my 5770, they can run as high as 99C before shutting off. (Funnily enough, my 5770 never did.)

Water boils at, guess what, 100C; solder usually melts around 180C; silicon melts above 1000C; copper melts at around 1085C.

Heating and cooling rapidly will eventually cause warpage, but as a lot of people have noticed, the core heats slowly (about 5-10 minutes under heavy load), then the fan shuts off after use and the card cools down slowly. Overall this limits warpage and will give it more than 3 years of life.

And the reference cooler: yes, it likely only costs 50 bucks; yes, the fan uses a sleeve bearing (or I'm pretty sure it does); the chamber is even a lower-end liquid type versus the vapor chamber the Sapphire Vapor-X cards use. Overall it's DESIGNED to be CHEAP and SIMPLE, so if you want to use it, you can; otherwise, AMD have done something very nice and cut 100 bucks off the price of the card by using only a basic cooler, for you to replace with whatever you want.

If you want to complain about it, compare it to the reference coolers supplied with your CPU.


----------



## esqueue

Quote:


> Originally Posted by *Paul17041993*
> 
> if you want to complain about it, compare it to the reference coolers supplied for your CPU.


I am using my CPU's reference cooler and keeping great temps. I don't see what cpu reference coolers have to do with this.
Quote:


> Originally Posted by *RYCRAI*
> 
> I got the Sapphire 290 and it produces a God awful amount of coil whine... Seriously considering RMA'ing this. Anyone else having this problem?


I get some on the opening screen of Batman: Arkham Origins. There's one screen where I can hear it. Thing is, I have to listen for it. I don't know if coil whine is bad for the card, but the noise isn't loud enough to bother me, nor can I even hear it when I'm playing games.

To anyone who knows: is coil whine bad for your card?


----------



## SpewBoy

Quote:


> Originally Posted by *tsm106*
> 
> In classical terms sure the cooler isn't as good it could be. However, the thing is they didn't try to make it better, so I'm not sure its fair to say unless we call their implementation of Powertune a failure.


I feel that a graphics card should be rated according to multiple factors, some of which include the performance of the cooler and the performance of the card itself. The card performs fine (it could be a little better without that throttling), but as far as coolers go, it doesn't work so well. Hawaii pulls some serious power, and that demands some serious cooling, which I believe they did not deliver (yes, it has the cooling capacity, but to get there you've really gotta crank the fan up higher than is comfortable). Just because AMD didn't really "try" to make the cooler better versus, say, the 7970 cooler doesn't mean it isn't fair to judge it against other coolers where the manufacturers did try.

They don't hold separate 100m sprints for those that try and those that don't. Bung 'em in together and see who gets to the finish first.
Quote:


> Originally Posted by *esqueue*
> 
> To anyone who know: Is coil whine bad for your card?


From what I've gathered, I don't think coil whine is bad for the card.

I'd also just like to point out that I like the way AMD is going with Powertune. It makes sense not to have the temperature constantly go up and down depending on load, and it's nice to have the noise reduced instantly at idle, which is partly down to how you've explained it, tsm. I just think the cooler could and should be a bit more efficient compared to the competition.


----------



## sugarhell

Oh, the 95C argument. That's only for the quiet BIOS. The new Powertune has a limit close to 220 watts. It uses a lower fan speed (and a lower TDP) so you can have a quieter card. That means when you hit 95C you get lower clocks, not a higher fan speed.

On uber, the card will increase the fan speed and remove the TDP limits. It's that simple. The cooler is good enough; it's just noisy. But sound-wise, my 7970's ref cooler had a high-pitched noise, while this one is a lot lower in pitch.
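The quiet-vs-uber behaviour described above can be sketched as a tiny control loop: at the temperature target, the controller prefers more fan up to the BIOS fan cap, and once the cap is hit it sheds clocks instead. All the numbers here (fan caps, step sizes) are invented for illustration:

```python
# Toy model of the 290X temperature-target behaviour: quiet BIOS = low fan
# cap, so clocks drop at the target; uber BIOS = higher fan cap, so the fan
# spins up and clocks hold. Step sizes and caps are hypothetical.

TEMP_TARGET = 95  # degrees C

def powertune_step(temp_c, clock_mhz, fan_pct, fan_cap_pct):
    """One control step: prefer more fan up to the cap, then throttle clocks."""
    if temp_c < TEMP_TARGET:
        return clock_mhz, fan_pct                        # under target: hold
    if fan_pct < fan_cap_pct:
        return clock_mhz, min(fan_pct + 5, fan_cap_pct)  # spin fan up first
    return clock_mhz - 13, fan_pct                       # fan capped: throttle

# Quiet BIOS (fan capped at 40%): clocks fall once the cap is reached.
print(powertune_step(95, 1000, 40, 40))  # -> (987, 40)
# Uber BIOS (fan cap 55%): fan speed rises, clocks hold.
print(powertune_step(95, 1000, 40, 55))  # -> (1000, 45)
```

Same cooler, same target temperature; only the fan cap differs, which is why the quiet BIOS trades clocks for noise.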


----------



## bpmcleod

Quote:


> Originally Posted by *Tobiman*
> 
> Do you have another card to test your PCIE slot?


I pulled a 780 HC from it and then put the 290 in, and it worked perfectly on Win 7 64-bit. I did a fresh install and the problem occurred. It's not the PCIe slot, unless it went bad in the time it took to install Win 8.1.


----------



## mrsus

My Sapphire 290 doesn't have any coil whine, just the fan noise when it hits 70% fan speed trying to cool itself.

It's got Elpida memory and a 66% ASIC score. I switched to the Asus BIOS and did some overclocking.

I'm at 1160/6000 right now; it doesn't seem to be a great overclocker.


----------



## Arm3nian

Quote:


> Originally Posted by *SpewBoy*
> 
> I feel that a graphics card should be rated according to multiple factors, some of which include the performance of the cooler and the performance of the card itself. The card performs fine (could be a little better without that throttling), but as far as coolers go, it doesn't work so well. Hawaii pulls some serious power and this demands some serious cooling, which I believe they did not deliver (yes, it has the cooling capacity, but to get there, you've really gotta crank the fan up higher than is comfortable). Just because AMD didn't really "try" to make the cooler better vs say, the 7970 cooler, doesn't mean it isn't fair to judge it compared to other coolers where the manufacturers did try.
> 
> They don't hold separate 100m sprints for those that try and those that don't. Bung 'em in together and see who gets to the finish first.
> From what I have gathered, I don't think coil whine is bad for the card.
> 
> I'd also just like to point out that I like the way AMD is going with Powertune. It makes sense to not have the temperature constantly go up and down depending on load and it is nice to have the noise reduced instantly when idle, and that is partly due to how you have explained it, tsm. I just think the cooler could and should be a bit more efficient compared to the compeition


This is a reference card; it's designed for a single purpose: performance. Saying the cooler is overall crap is not accurate. If AMD hadn't implemented Powertune, the card would have been clocked lower at stock and drawn less power, resulting in less heat. In that situation the cooler is the same, but since the card runs cooler, the cooler would seem better. What AMD did was sacrifice silence for maximum performance; if you don't like it, you should have either waited for non-reference or gone water.


----------



## Tobiman

Quote:


> Originally Posted by *bpmcleod*
> 
> I pulled a 780 hc from it and then put the 290 in and it worked perfect on win 7 64 bit. I did a fresh install and the problem occured. Its not the pcie slot unless it went bad in the time it took to install win 8.1


How did you uninstall your Nvidia drivers? Did you use DDU or go through the extra long route?


----------



## tsm106

Quote:


> Originally Posted by *sugarhell*
> 
> Oh 95C argument. This is only for the quiet bios. New powertune has a limit close to 220 watt. They use lower fan speed(lower tdp) so you can have a quieter card. Thats means when you hit 95C you will get lower clocks not higher fan speed.
> 
> On uber the card will increase the fan speed and it will remove tdp limits. Its so simply. The cooler is good enough its just noisy. But from the sound my 7970s ref cooler had a high pitch noise. This is a lower one a lot lower pitch.


Yeah, compared to the squirrel cage on the 7970s, the tone of the 290 cooler is much nicer. Deeper tones are less obtrusive, to a limit of course.

Quote:


> Originally Posted by *bpmcleod*
> 
> I pulled a 780 hc from it and then put the 290 in and it worked perfect on win 7 64 bit. I did a fresh install and the problem occured. Its not the pcie slot unless it went bad in the time it took to install win 8.1


What bios are you on? Have you tried again on the other setting? You didn't happen to flash a bios or anything?


----------



## bpmcleod

Quote:


> Originally Posted by *Tobiman*
> 
> How did you uninstall your Nvidia drivers? Did you use DDU or go through the extra long route?


Again, it's a fresh install. Unless I missed the part where Nvidia has super drivers that stay on your PC through a fresh install, they are gone.


----------



## Tobiman

Quote:


> Originally Posted by *bpmcleod*
> 
> Again, it's a fresh install. Unless I missed the part where Nvidia has super drivers that stay on your PC through a fresh install, they are gone.


My bad, didn't catch that for some reason. The only other options would be to return it or, if you can, test it on another PC.


----------



## bpmcleod

Quote:


> Originally Posted by *tsm106*
> 
> Yea, compared to the squirrel cage on 7970s, the tone of 290 cooler is much nicer. Deeper tones are less obtrusive to a limit of course.
> What bios are you on? Have you tried again on the other setting? You didn't happen to flash a bios or anything?


It's a PowerColor 290 on the stock BIOS. It worked great on Win 7; these problems are from a fresh install of Win 8.1. I feel it's the iGPU drivers conflicting, but it's not even letting me install the AMD drivers without freezing.


----------



## sugarhell

The Intel iGPU isn't working on 8.1, for me at least.


----------



## tsm106

Quote:


> Originally Posted by *bpmcleod*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Yea, compared to the squirrel cage on 7970s, the tone of 290 cooler is much nicer. Deeper tones are less obtrusive to a limit of course.
> What bios are you on? Have you tried again on the other setting? You didn't happen to flash a bios or anything?
> 
> 
> 
> Its a powercolor 290 on stock bios. It worked great on win 7. Thesenproblems are from a fresh installed win 8.1. I feel its the igpu drivers conflicting but its not even letting me install amd drivers without freezing.
Click to expand...

Have you run all the updates? The drivers rely on the .NET 4.5 libraries, IIRC, and won't function properly if those files are not up to date. Old Catalyst versions had lower requirements, so you could install them right after a fresh OS install, but not these days.


----------



## esqueue

Quote:


> Originally Posted by *tsm106*
> 
> Have you run all the updates? The drivers rely upon dot.net 4.5 libraries iirc and won't function properly if the files are not up to date. The old catalyst versions had lower reqs so you could install them right after a fresh os, but not these days.


Don't the new Catalyst drivers include the .NET libraries? I know I had installed them previously for other things, but I saw lots of checkboxes when installing Catalyst. I made sure they were all enabled but can't remember what they were now.


----------



## bpmcleod

Quote:


> Originally Posted by *Tobiman*
> 
> My bad. Didn't catch that for some reason. The only other option would be to return it or if you can test it on another PC.


It's not the card, nor the mobo. The card worked great on Windows 7 64-bit. It's some kind of driver conflict; I just haven't found the right workaround. I tried setting my mobo to boot from the iGPU while the 290 was powered, and that worked fine: it recognized both the iGPU and a generic VGA. As the drivers updated, though, the screen went black. I got the screen back but could only move my mouse, nothing else. I tried rebooting and had to go off the iGPU because the 290 black-screened and became unresponsive on boot. I can get it to boot from the 290 by disabling the iGPU, but as soon as I run Catalyst it freezes. Sorry for my typing, this is all off a phone lol


----------



## bpmcleod

Quote:


> Originally Posted by *tsm106*
> 
> Have you run all the updates? The drivers rely upon dot.net 4.5 libraries iirc and won't function properly if the files are not up to date. The old catalyst versions had lower reqs so you could install them right after a fresh os, but not these days.


This I haven't done. I usually do my GPU first because I hate looking at a lousy res. I'll try this and get back to you.


----------



## esqueue

In rare cases, Windows can just install badly and nothing works as it should. I've had that happen with XP, where it ran like crap and not even the CD drive functioned correctly. I'd suggest installing Windows 8.1 again: fresh if you have nothing to lose, or a repair install if you do. Many of us have Windows 8 and it works. Have you also installed the correct mobo drivers? Just throwing out suggestions; sorry if you've already said you've done all this.


----------



## tsm106

Quote:


> Originally Posted by *esqueue*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Have you run all the updates? The drivers rely upon dot.net 4.5 libraries iirc and won't function properly if the files are not up to date. The old catalyst versions had lower reqs so you could install them right after a fresh os, but not these days.
> 
> 
> 
> Don't the new catalyst drivers have the dot net libraries? I knew that I had installed them prior to for other things but saw lots of check marks when installing the catalysts. I made sure that they were all enabled but can't remember them now.
Click to expand...

IIRC, the drivers won't install on a completely virgin fresh OS install. I would double-check it if I were in the middle of a wipe, or at least remember this point the next time I do one. Anyway, as I recall, the package comes with some framework sets but not all; otherwise the driver files would be over 500 MB, I think. Sugar might know or have more relevant info taking up space in his noggin.


----------



## iPDrop

Quote:


> Originally Posted by *Tobiman*
> 
> Disable ULPS! Use MSI ab or go through the registry route.


I tried disabling ULPS but this is still happening... both cards benchmarked fine but in crossfire they are not working well together.


----------



## PillarOfAutumn

I have a question: if you're coming from a 7970 to a 290/X, are you still reformatting everything and doing a completely new install? Also, if your card has coil whine and you plan on watercooling, is it necessary to RMA it? Does coil whine mean something about the card itself is messed up?


----------



## sugarhell

Always run Windows Update before installing any driver. You can install the final framework manually if you want to install the GPU drivers first.

http://download.microsoft.com/download/1/6/7/167F0D79-9317-48AE-AEDB-17120579F8E2/NDP451-KB2858728-x86-x64-AllOS-ENU.exe

4.5.1
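For anyone checking whether that framework is already present: Windows records the installed .NET 4.x version as a `Release` DWORD under `HKLM\SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full`. A small sketch mapping a few of the documented `Release` thresholds to version strings (values as I recall them from Microsoft's docs; double-check before relying on them):

```python
# Map the registry Release DWORD to the .NET 4.5.x version it implies.
# Thresholds believed to match Microsoft's documentation, listed highest first.

RELEASE_KEYS = [
    (379893, "4.5.2"),
    (378675, "4.5.1"),  # 378758 on Windows 7/Vista, 378675 on 8.1
    (378389, "4.5"),
]

def net45_version(release_dword):
    """Return the highest .NET 4.5.x version implied by the Release value."""
    for threshold, version in RELEASE_KEYS:
        if release_dword >= threshold:
            return version
    return None  # only plain .NET 4.0 (or older) is present

print(net45_version(378758))  # -> "4.5.1"
```

On an actual machine you'd read the DWORD with regedit or `winreg` and feed it to this function; if it returns `None`, install the 4.5.1 package linked above before the Catalyst drivers.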


----------



## Tobiman

Quote:


> Originally Posted by *iPDrop*
> 
> I tried disabling ULPS but this is still happening... both cards benchmarked fine but in crossfire they are not working well together.


Did you use MSI ab or the registry method?


----------



## Arizonian

Quote:


> Originally Posted by *dade_kash_xD*
> 
> Finally got my Sapphire R9-290's! They OC well and stay pretty cool. Fan sounds super loud! Can't wait to throw these under water! Sign me up Arizonian
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 3DMark 13 Stock Clocks
> 
> 
> 3DMark 13 Overclocked 1125/1400


Congrats - you're added, buddy


----------



## esqueue

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> I have a question: if you're coming from a 7970 and going to a 290/x, are you still reformatting everything and doing a completely new install? Also, if your cooler has a coil whine to it and you plan on watercooling, is it necessary to RMA the card? Does coil whine mean something about the card itself is messed up?


I guess that depends on how bad it is. I can barely hear mine, but I'm also not pushing it as hard as an overclocked card being benchmarked. I only notice it when really listening for it with the game paused. I haven't read anywhere that coil whine is bad. Keep in mind that, given the type of sound it makes, some people just won't be able to hear it, as it's very high-pitched.


----------



## SpewBoy

Quote:


> Originally Posted by *Arm3nian*
> 
> This is a reference card, it is designed for a single purpose: performance. Saying the cooler is overall crap is not accurate. If AMD didn't implement powertune, the card would have been clocked lower at stock, and draw less power, resulting in less heat. In this situation the cooler is the same, but since the card is running cooler, the cooler would seem better. What AMD did is sacrifice silence for maximum performance, if you don't like it, you should have either waited for non reference or gone water.


Uhh, I *am* going water and I am so far reasonably happy with these cards overall so don't give me that crap.

You can design a card for performance, but you don't design a cooler for performance; you design it for cooling efficiency (noise vs. watts dissipated). All I'm saying is that there are other reference and custom coolers that are better. Plenty of reviews say this, and I'm saying it too.

Coolers are for cooling. GPUs are for pushing out pixels. The cooler is not that great at its job compared to the competition. The GPU's performance is irrelevant to my argument, except when throttling occurs, but as you imply, that is subjective due to the nature of how the new Powertune works and can be ignored for the purposes of this argument. I've already stated that I am content with my card reaching and staying at 95 degrees, even when not under load. I am not okay with the noise it generates to keep it that way. It has already been proven that other coolers do it better (i.e., keep it at 95 degrees with less noise).

Other coolers > 290X cooler > other even worse coolers. Nothing's perfect. I can accept that and acknowledge it. I don't see why you can't.


----------



## iPDrop

Quote:


> Originally Posted by *Tobiman*
> 
> Did you use MSI ab or the registry method?


The registry method. I changed both EnableUlps and EnableUlps_NA to 0, and changed EnableCrossFireAutoLink to 1 because it was at 0 for some reason, even though I had enabled CrossFire in the AMD Catalyst settings. It's still happening.


----------



## Elmy

http://s1126.photobucket.com/user/Elmnator/media/20131107_172808.jpg.html?sort=3&o=4 My new toys thanks to Club3D/AMD 

http://s1126.photobucket.com/user/Elmnator/media/20131107_184301.jpg.html?sort=3&o=1 replaced the EK waterblocks black screws with Stainless steel.

Should be up and running tomorrow.


----------



## tsm106

Quote:


> Originally Posted by *iPDrop*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Tobiman*
> 
> Did you use MSI ab or the registry method?
> 
> 
> 
> The registry method, I changed both enableulps and enableulps_na to 0 and changed enablecrossfireautolink to 1 because it was at 0 for some reason. even though i enabled crossfire in the amd catalyst settings, but it's still happening.
Click to expand...

Technically you don't have to disable ULPS right now, because you are not overclocking with voltage, since, well, AB does not support voltage on these cards. So that's not it. Oh, in the future you only need to disable "EnableUlps" under CurrentControlSet and nothing else.

I have a question regarding BF4: are you using the 32- or 64-bit client? Also, you should set AB back to defaults and use Overdrive to raise the Powertune limit. AB does not support our cards yet, and it can cause issues when you actually use it to change settings. Monitoring works fine, however.
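For anyone doing the registry route by hand, a sketch of what it amounts to: setting `EnableUlps` to 0 under each GPU subkey of the display-adapter class in CurrentControlSet. The class GUID is the standard display-adapter one; the slot count and error handling here are assumptions, so treat this as an illustration rather than a tool:

```python
# Sketch of the ULPS registry route: only EnableUlps under CurrentControlSet
# needs to change, per the post above. Subkeys 0000, 0001, ... are per-GPU.

DISPLAY_CLASS = r"SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}"

def ulps_key(index):
    """Build the display-class subkey path for GPU slot `index`."""
    return rf"{DISPLAY_CLASS}\{index:04d}"

def disable_ulps(max_slots=8):
    """Set EnableUlps=0 for each present GPU subkey (Windows only, needs admin)."""
    import winreg  # Windows-only module, imported lazily
    for i in range(max_slots):
        try:
            with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ulps_key(i), 0,
                                winreg.KEY_SET_VALUE) as key:
                winreg.SetValueEx(key, "EnableUlps", 0, winreg.REG_DWORD, 0)
        except OSError:
            continue  # slot not present or no permission

print(ulps_key(1))  # path ending in ...\0001
```

Reboot after changing the value; as noted above, the `_NA` variant and the CrossFire auto-link flag shouldn't need touching.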


----------



## esqueue

Quote:


> Originally Posted by *Elmy*
> 
> http://s1126.photobucket.com/user/Elmnator/media/20131107_172808.jpg.html?sort=3&o=4 My new toys thanks to Club3D/AMD
> 
> http://s1126.photobucket.com/user/Elmnator/media/20131107_184301.jpg.html?sort=3&o=1 replaced the EK waterblocks black screws with Stainless steel.
> 
> Should be up and running tomorrow.


What are those silver pieces with the dual X-clamp cut-outs? They look like polished SS backplates LOL.


----------



## Arizonian

Quote:


> Originally Posted by *Elmy*
> 
> http://s1126.photobucket.com/user/Elmnator/media/20131107_172808.jpg.html?sort=3&o=4 My new toys thanks to Club3D/AMD
> 
> http://s1126.photobucket.com/user/Elmnator/media/20131107_184301.jpg.html?sort=3&o=1 replaced the EK waterblocks black screws with Stainless steel.
> 
> Should be up and running tomorrow.


Congrats on 2x Club3D 290's, and under water - added

I assumed 290, so let me know if I'm wrong.


----------



## Arm3nian

Quote:


> Originally Posted by *SpewBoy*
> 
> Uhh, I *am* going water and I am so far reasonably happy with these cards overall so don't give me that crap.
> 
> You can design a card for performance, you don't design a cooler for performance, you design it for cooling efficiency (noise vs watts dissipated). All I'm saying is, there are other reference and custom coolers that are better. Plenty of reviews say this, and I'm saying it too.
> 
> Coolers are for cooling. GPUs are for pushing out pixels. The cooler is not that great at its job compared to the competition. The GPU's performance is irrelevent to my argument, except when throttling occurs but as you imply, that is subjective due to the nature of how the new Powertune works and for the purposes of this argument can be ignored.
> 
> Other coolers > 290X cooler > other even worse coolers. Nothing's perfect. I can accept that and acknowledge it. I don't see why you can't.


Disregarding the fact that the underlined part rendered what you've said thus far invalid, you might want to read my quote again.
Quote:


> Originally Posted by *Arm3nian*
> 
> If AMD didn't implement powertune, the card would have been clocked lower at stock, and draw less power, resulting in less heat. In this situation the cooler is the same, but since the card is running cooler, the cooler would seem better.


----------



## tsm106

Quote:


> Originally Posted by *esqueue*
> 
> What are those silver pieces that have dual x-clamp cut-outs? They Look like polished SS back plates LOL.


Seriously blinging 7990...


----------



## bond32

Would someone with a single 290 comment on the 1440p performance? Possibly BF4?


----------



## Elmy

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats on 2x Club3D - 290's and under water - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I assumed 290 so let me know if I'm wrong.


290X's

You guys talking about my chromed out 7990 backplates? LoL

I like bling.... LoL ... My computer is the first one in this video.






Both rads are also chromed


----------



## SpewBoy

Quote:


> Originally Posted by *Arm3nian*
> 
> Disregarding the fact that the underlined rendered what you've said thus far invalid, you might want to read my quote again.


Except for the fact that a more efficient cooler would be quieter and therefore better... which has been proven by attaching another cooler to the card (again, same clocks, same temp, less noise).

Or is that somehow fundamentally wrong? lol.


----------



## iPDrop

Quote:


> Originally Posted by *tsm106*
> 
> Technically you don't have to disable ulps right now because you are not overclocking with voltage, since well AB does not support volts on these cards. So that is not it. Oh in the future, you only need to disable "enableulps" in currentcontrolset and nothing else.
> 
> I have a question regarding bf4, are you using 32 or 64 bits? Also, you should set AB to default and use Overdrive to raise Powertune. AB does not support our cards yet, and it can cause issues when you actually use it to change settings. Monitoring works fine however.


I'm using 64-bit Windows, and BF4 is 64-bit only. Also, Overdrive won't open on my PC because I'm using an Intel processor. My cards came with ASUS's GPU Tweak; would that be okay to use? Also, any idea what's wrong with my crossfire issue?


----------



## Arm3nian

Quote:


> Originally Posted by *SpewBoy*
> 
> Except for the fact that a more efficient cooler would be quieter and therefore better.
> 
> Or is that somehow fundamentally wrong? lol.


For all we know, the cooler used on the 290x can be very good. One can say the 780 has a good cooler. If the 780 was clocked higher from the factory, which would require more voltage and therefore more power, and ran at 95c, we could say the 780 cooler is garbage.


----------



## Tobiman

Quote:


> Originally Posted by *iPDrop*
> 
> I'm using 64-bit Windows, and BF4 is 64-bit only. Also, Overdrive won't open on my PC because I'm using an Intel processor. My cards came with ASUS's GPU Tweak; would that be okay to use? Also, any idea what's wrong with my crossfire issue?


I have an intel processor and overdrive works perfectly.


----------



## iPDrop

It gave me a message box saying "No AMD processors were detected on this computer" or something along those lines, and then it never opened.


----------



## tsm106

Quote:


> Originally Posted by *iPDrop*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Technically you don't have to disable ulps right now because you are not overclocking with voltage, since well AB does not support volts on these cards. So that is not it. Oh in the future, you only need to disable "enableulps" in currentcontrolset and nothing else.
> 
> I have a question regarding bf4, are you using 32 or 64 bits? Also, you should set AB to default and use Overdrive to raise Powertune. AB does not support our cards yet, and it can cause issues when you actually use it to change settings. Monitoring works fine however.
> 
> 
> 
> I'm using 64-bit Windows, and BF4 is 64-bit only. Also, Overdrive won't open on my PC because I'm using an Intel processor. My cards came with ASUS's GPU Tweak; would that be okay to use? Also, any idea what's wrong with my crossfire issue?

AB and 64-bit BF4 are a bad mix. Btw, there is a 32-bit BF4: right-click BF4 in Origin and it's under settings. You can choose to run the 32-bit version, with limits.

It is fine to use GPU Tweak, though you will have to disable ULPS to be on the safe side. Technically, ULPS is not compatible with the unofficial overclocking method, aka overclocking past Overdrive limits using a third-party tool like GPU Tweak.

In typical cases, crossfire issues are predominantly driver related. The overclocking process, or how you set up all those variables, doesn't really apply yet since we are not really overclocking much nor using voltage 24-7. GPU Tweak is an annoying program to use daily, so that is kind of to be expected. I would start by fixing your driver issues first.
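For what it's worth, the ULPS toggle is just a registry value. A minimal sketch (assuming the standard display-adapter class GUID; the `0000`/`0001` subkey indices are example values and vary per system) that builds the `reg.exe` commands rather than touching the registry itself:

```python
# Sketch only: prints the reg.exe commands that would set EnableUlps=0 for
# each display-adapter subkey. The GUID below is the standard display class
# key; the subkey indices (0000, 0001, ...) are assumptions that differ per
# system, so this deliberately prints instead of writing the registry.
CLASS_KEY = (r"HKLM\SYSTEM\CurrentControlSet\Control\Class"
             r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

def ulps_disable_commands(subkeys=("0000", "0001")):
    """Build one 'reg add' command per display-adapter subkey."""
    return [
        rf'reg add "{CLASS_KEY}\{sub}" /v EnableUlps /t REG_DWORD /d 0 /f'
        for sub in subkeys
    ]

if __name__ == "__main__":
    for cmd in ulps_disable_commands():
        print(cmd)
```

Run the printed commands from an elevated prompt at your own risk; note that a driver reinstall can re-enable ULPS.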


----------



## xxmastermindxx

Quote:


> Originally Posted by *iPDrop*
> 
> I'm using 64-bit Windows, and BF4 is 64-bit only. Also, Overdrive won't open on my PC because I'm using an Intel processor. My cards came with ASUS's GPU Tweak; would that be okay to use? Also, any idea what's wrong with my crossfire issue?


Overdrive not opening has nothing to do with your processor. I only use Intel and Overdrive has always worked fine for me. Do a clean uninstall of all AMD software, and reinstall the latest driver.
Quote:


> Originally Posted by *iPDrop*
> 
> It gave me a message box saying "No AMD processors were detected on this computer" or something along those lines, and then it never opened.


You're not using the standalone Overdrive software, are you? All you need is Catalyst and the Overdrive that's built into it on the Performance tab.


----------



## bpmcleod

I got it working finally. What's weird is that my ISO for Windows 8.1 comes with everything already preloaded onto it. I am not sure why, but after reinstalling the entire OS and running the ATI Catalyst installer it worked... sort of. It black-screened on me again, but I then just restarted and it works perfectly fine. It didn't the first time but did now /shrug. Who knows. Thanks for the help anyway!


----------



## esqueue

Quote:


> Originally Posted by *bpmcleod*
> 
> I got it working finally. What's weird is that my ISO for Windows 8.1 comes with everything already preloaded onto it. I am not sure why, but after reinstalling the entire OS and running the ATI Catalyst installer it worked... sort of. It black-screened on me again, but I then just restarted and it works perfectly fine. It didn't the first time but did now /shrug. Who knows. Thanks for the help anyway!


I suggested reinstalling your OS two pages back.








Quote:


> Originally Posted by *esqueue*
> 
> In rare cases, Windows can just install bad and nothing works as it should. I've had that happen with xp where it ran like crap and not even the cd functioned correctly. I'd suggest installing windows 8.1 again. Fresh if you have nothing to lose or repair if you do. Many of us have windows 8 and it works. Have you also installed the correct mobo drivers? Just shooting out suggestions. Sorry if you have already said that you've done it already.


----------



## iPDrop

Okay, I was trying to use an Overdrive program that I downloaded; I guess I can just do it straight in Catalyst. But after disabling ULPS I still get poor crossfire performance and erratic GPU usage.

One thing I noticed was that the fan on my 2nd card was not running at all for a brief moment, and the 2nd card was not working at all for some reason just now when trying Crysis 3. My first card was doing all the work and my 2nd one was at 0% GPU Usage, and I was only getting 30 frames. Do you think there is something wrong with the 2nd card?


----------



## SpewBoy

Quote:


> Originally Posted by *Arm3nian*
> 
> For all we know, the cooler used on the 290x can be very good. One can say the 780 has a good cooler. If the 780 was clocked higher from the factory, which would require more voltage and therefore more power, and ran at 95c, we could say the 780 cooler is garbage.


It has been proven that non-reference coolers available for other cards right now (the IceQ x2 for the 280X in particular) are easily able to reduce the temperature (and probably noise levels) of the 290(X) to values comparable to 780s and the like with non-reference cooling. Now imagine doing that, but keeping the temp at 95 degrees. It would barely make a peep.

Yes, it is a non-reference cooler and therefore has more room to play with in terms of spacing and design but the important fact is, it gets non-reference 780-like temps whereas if what you are saying holds true, it should still be hotter than a non-reference 780.

But anyway, this conversation is getting old and I think we will have to agree to disagree. Once non-reference coolers are shipped with the 290 it might become clearer. Until then, good day to you sir.


----------



## tsm106

Quote:


> Originally Posted by *iPDrop*
> 
> Okay I was trying to use the overdrive program that I downloaded or something, I guess I can just do it straight in Catalyst.. But after disabling ULPS I still get poor crossfire performance and crazy GPU Usage.
> 
> One thing I noticed was that the fan on my 2nd card was not running at all for a brief moment, and the 2nd card was not working at all for some reason just now when trying Crysis 3. My first card was doing all the work and my 2nd one was at 0% GPU Usage, and I was only getting 30 frames. Do you think there is something wrong with the 2nd card?


The Overdrive you download is for the cpu. The one you want to use is in Catalyst.

Btw, did you test each card individually first?


----------



## brazilianloser

I am getting black screens in BF4 as well... even though the GPU is under 80°C the whole time. I guess I will up the voltage. Kind of sad about this.


----------



## Mr357

Quote:


> Originally Posted by *SpewBoy*
> 
> It has been proven that non-reference coolers available for other cards right now (the IceQ x2 for the 280X in particular) are easily able to reduce the temperature (and probably noise levels) of the 290(X) to values comparable to 780s and the like with non-reference cooling. Now imagine doing that, but keeping the temp at 95 degrees. It would barely make a peep.
> 
> Yes, it is a non-reference cooler and therefore has more room to play with in terms of spacing and design but the important fact is, it gets non-reference 780-like temps whereas if what you are saying holds true, it should still be hotter than a non-reference 780.
> 
> But anyway, this conversation is getting old and I think we will have to agree to disagree. Once non-reference coolers are shipped with the 290 it might become clearer. Until then, good day to you sir.


I think that hyperlink isn't what you meant it to be. It's the Techspot review of the reference 290, not anything about the HIS aftermarket 280X.

EDIT: Never mind, I see now what you were alluding to.


----------



## Sazz

Leak testing xD


----------



## skupples

For my AMD brethren.




Strange thing was, the Eyefinity centerpiece was on 2x 280X... All the BF4 machines were also 280X... The vendor systems were all on a single 6-pin card of some sort. Think it was a 240? They had non-HD Oculus Rifts running on single 240s; the low-HD roller coaster demo was pretty epic.


----------



## RAFFY

Quote:


> Originally Posted by *brazilianloser*
> 
> I am getting black screens in BF4 as well... even though the GPU is under 80°C the whole time. I guess I will up the voltage. Kind of sad about this.


I've had this happen to me a couple of times, and it has actually frozen my computer. But the last time it happened I was able to get out of it and saw that DirectX was the problem and was actually crashing. I noticed the new drivers that were released tonight mention a fix for legacy DirectX 9.0 and earlier; I'm wondering if this will help with the later versions as well. By the way, I'm at stock clocks and coolers as well.


----------



## brazilianloser

I was able to play without crashes for a while after I stopped using the 64-bit version of BF4. I guess that was helping bring on the crashes as well.


----------



## Falkentyne

Is this good enough to get in the club? Sorry, it was taken with an MSI laptop webcam.

http://img9.imageshack.us/img9/9436/ddfv.png

gpuz: http://www.techpowerup.com/gpuz/64e6y/


----------



## Arizonian

Quote:


> Originally Posted by *Sazz*
> 
> Leak testing xD


Anticipation









Quote:


> Originally Posted by *Falkentyne*
> 
> Is this good enough to get in the club? Sorry, it was taken with an MSI laptop webcam.
> 
> http://img9.imageshack.us/img9/9436/ddfv.png
> 
> gpuz: http://www.techpowerup.com/gpuz/64e6y/


Good enough - congrats - added


----------



## Slomo4shO

Quote:


> Originally Posted by *tsm106*
> 
> Wrong. I'm not arguing anything. You guys are making the argument that the cooler sucks. But that is flawed, and you are projecting your frustration over not understanding AMD's design principles behind their decision to run the cards at that temp. You claim the cooler is an inferior design because it can't keep the card cooler. That just defines your disconnect or misunderstanding of AMD's stated goal of 95c.


The 290X throttles massively in quiet mode and seems to average around 750-800MHz in certain applications. If the cooler was designed to run at 95C, then may I ask why quiet mode fails to perform at 95C? Unless, of course, you are content with running the GPU well below the specified clocks. Yes, uber mode resolves most of these issues, but it still does not completely eliminate throttling due to temperatures. How is the cooler meeting these design needs if the card remains at 94C and continues to throttle to maintain this temperature even at 55% fan speed? This was a poor decision by AMD, and there may be enough negative media coverage that this oversight keeps individuals from considering the 290s. I am unsure how you can justify such a lousy design...


----------



## esqueue

Quote:


> Originally Posted by *Sazz*
> 
> Leak testing xD


That looks awesome. Is that all copper with some sort of vinyl wrapping? For hard tubing I assume it is using compression fittings so I will not bother asking.


----------



## Arizonian

Quote:


> Originally Posted by *skupples*
> 
> For my AMD brethren.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Strange thing was, the Eyefinity centerpiece was on 2x 280X... All the BF4 machines were also 280X... The vendor systems were all on a single 6-pin card of some sort. Think it was a 240? They had non-HD Oculus Rifts running on single 240s; the low-HD roller coaster demo was pretty epic.


A wasted opportunity by AMD, I guess, to not spotlight their flagship. Not sure how this is relevant to us in this club, since neither the 290X nor the 290 was represented.


----------



## pioneerisloud

Quote:


> Originally Posted by *Slomo4shO*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Wrong. I'm not arguing anything. You guys are making the argument that the cooler sucks. But that is flawed, and you are projecting your frustration over not understanding AMD's design principles behind their decision to run the cards at that temp. You claim the cooler is an inferior design because it can't keep the card cooler. That just defines your disconnect or misunderstanding of AMD's stated goal of 95c.
> 
> 
> 
> The 290X throttles massively in quiet mode and seems to average around 750-800MHz in certain applications. If the cooler was designed to run at 95C, then may I ask why quiet mode fails to perform at 95C? Unless, of course, you are content with running the GPU well below the specified clocks. Yes, uber mode resolves most of these issues, but it still does not completely eliminate throttling due to temperatures. How is the cooler meeting these design needs if the card remains at 94C and continues to throttle to maintain this temperature even at 55% fan speed? This was a poor decision by AMD, and there may be enough negative media coverage that this oversight keeps individuals from considering the 290s. I am unsure how you can justify such a lousy design...

My 7970 reference card hits 93°C with the stock cooler at 66% fan speed (auto). What's your point?

All these complaints about the cooler sucking..... how about you fix the fan curve or manually set it to a fixed speed? There you go, problem solved.


----------



## Arizonian

While we're on this subject, here is an article that explains what we're seeing.

*AMD's Radeon R9 290 has a problem, but Nvidia's smear attack is heavy-handed* - Source - Extreme Tech
Quote:


> Maximum fan speeds below 50% will not maintain review frame rates. This appears to be true for both the R9 290 and R9 290X. AMD may have thought they could still deliver superior performance overall by designing a card that would burst and then idle at a lower frequency. It works with the 290X because that GPU sets an Uber fan profile of 55%. The 290′s default 47% is close - but not quite high enough. If you want full performance out of one of these cards, you're going to have to push the fan speed higher than 47%.


I encourage all of you to read it in its entirety. The title is a bit misleading, as it's more about the throttling than about Nvidia smearing AMD while it has the chance.

It's really not putting a good foot forward by AMD not to have thought this through, and I'm going to guess it's because their reference fan is loud past a certain point. As pioneerisloud said - ALL FIXABLE WITH A MANUAL FAN SETTING.
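To illustrate the manual-fan-setting point: a custom curve just maps temperature to a fan percentage with a floor above the stock 47% default, per the article's observation that sub-50% fan speeds can't sustain the review clocks. A toy sketch (breakpoints are made-up example values, not AMD's actual curve):

```python
# Hypothetical custom fan curve: (temperature in C, fan percent) breakpoints,
# with a 50% floor so the card never idles below the speed needed to hold
# clocks. Values are illustrative only.
CURVE = [(50, 50), (70, 60), (85, 75), (94, 100)]

def fan_percent(temp_c, curve=CURVE):
    """Linearly interpolate fan speed between curve breakpoints."""
    if temp_c <= curve[0][0]:
        return curve[0][1]                      # below first point: floor
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # interpolate between the two surrounding breakpoints
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]                         # past last point: max fan
```

This is the same idea Afterburner's fan-curve editor implements graphically: the card stays quiet at the desktop but ramps well past 47% before it hits the throttle point.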


----------



## esqueue

Quote:


> Originally Posted by *pioneerisloud*
> 
> My 7970 reference card hits 93*C with the stock cooler at 66% fan speed (auto). What's your point?
> 
> All these complaints about the cooler sucking..... how about you fix the fan curve or manually set it to a fixed speed? There you go, problem solved.


From all the posts written when this whole thing started a few hours ago, I didn't notice any complaining. We were discussing if AMD purposely designed the cooler to throttle while not being able to keep 95°C. I and many others think that they put in a cooling system that can't keep their goal without either the card throttling or the fans being loud. It wouldn't be smart to pay $600+ for a video card and not read reviews and not know what you are getting yourself into.

The card was purchased and performed exactly as I expected. I never planned on keeping the stock cooling, and I will never state that I think the cooling system was adequate to do the job at a low noise level, because it can't. My custom fan settings are keeping the card around 80°C, but the fan IS loud. People keep comparing other bad cards that are either louder or run hotter; the fact that other cards did it doesn't have anything to do with this card. I believe that this is an awesome card with a bad choice of cooling setup. The cons weren't enough for me to wait for another version nor even bring it up. I only commented because I strongly disagree that this is a flawless design.

It's like buying a new car with what I consider small ugly rims. I can say that they should have come with better wheels but I'll upgrade them myself. err....forget about the car analogy as rims are just a personal preference.


----------



## Paul17041993

A 3% fan difference for a minuscule amount of throttling; what environment were these tests done in, anyway...


----------



## pioneerisloud

Quote:


> Originally Posted by *esqueue*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pioneerisloud*
> 
> My 7970 reference card hits 93*C with the stock cooler at 66% fan speed (auto). What's your point?
> 
> All these complaints about the cooler sucking..... how about you fix the fan curve or manually set it to a fixed speed? There you go, problem solved.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> From all the posts written when this whole thing started a few hours ago, I didn't notice any complaining. We were discussing if AMD purposely designed the cooler to throttle while not being able to keep 95°C. I and many others think that they put in a cooling system that can't keep their goal without either the card throttling or the fans being loud. It wouldn't be smart to pay $600+ for a video card and not read reviews and not know what you are getting yourself into.
> 
> The card was purchased and performed exactly as I expected. I never planned on keeping the stock cooling, and I will never state that I think the cooling system was adequate to do the job at a low noise level, because it can't. My custom fan settings are keeping the card around 80°C, but the fan IS loud. People keep comparing other bad cards that are either louder or run hotter; the fact that other cards did it doesn't have anything to do with this card. I believe that this is an awesome card with a bad choice of cooling setup. The cons weren't enough for me to wait for another version nor even bring it up. I only commented because I strongly disagree that this is a flawless design.
> 
> It's like buying a new car with what I consider small ugly rims. I can say that they should have come with better wheels but I'll upgrade them myself. err....forget about the car analogy as rims are just a personal preference.

Well, I wasn't saying you in particular were complaining. But I see that excuse posted ALL the time, that it just runs too hot, blah blah blah. Lol. Reference coolers honestly suck, regardless of which GPU maker it's from. I've heard Nvidia has come around a little bit with their reference coolers, which is great. But every other high-end reference GPU has always been hot and loud. Just the way reference cards generally are. My GTX 580 was loud and hot, my HD 4890 was hot and loud. I could go on, lol.

My point is....I'm just tired of seeing the "it runs too hot" card being pulled, when the reason it runs hot is because of the insanely low fan speed curve, designed to keep noise down.....because people complained about noise from the 7970's.


----------



## esqueue

Quote:


> Originally Posted by *pioneerisloud*
> 
> Well, I wasn't saying you in particular were complaining. But I see that excuse posted ALL the time, that it just runs too hot, blah blah blah. Lol. Reference coolers honestly suck, regardless of which GPU maker it's from. I've heard Nvidia has come around a little bit with their reference coolers, which is great. But every other high-end reference GPU has always been hot and loud. Just the way reference cards generally are. My GTX 580 was loud and hot, my HD 4890 was hot and loud. I could go on, lol.
> 
> My point is....I'm just tired of seeing the "it runs too hot" card being pulled, when the reason it runs hot is because of the insanely low fan speed curve, designed to keep noise down.....because people complained about noise from the 7970's.


I understand. This is honestly my first big video card purchase and the first one I've actually done research on. I usually go down to Fry's Electronics and pick one with the highest numbers that is on sale. I didn't see the complaining, but I haven't been here long enough to witness it, so I do see how it would get tiring. My EK block is coming in tomorrow, and I just realized that I am way too impatient. I may pay a little extra for quicker shipping next time.


----------



## Sazz

Uhm... is it just me, or is the voltage-mod thread for the 290X no more? I was gonna check out the instructions, but the first two posts that contained them are not there anymore.


----------



## Paul17041993

Quote:


> Originally Posted by *Sazz*
> 
> Uhm... is it just me, or is the voltage-mod thread for the 290X no more? I was gonna check out the instructions, but the first two posts that contained them are not there anymore.


yea the owner killed it...


----------



## RAFFY

Quote:


> Originally Posted by *brazilianloser*
> 
> I was able to play without crashes for a while after I stopped using the 64 bit version of bf4. I guess that's helping as well bring on the crashes.


Well it turns out I'm a dummy and I forgot to disable "Overdrive" in the Catalyst while using GPU Tweak at the same time. Since I figured this out I haven't crashed. Oh stupidity got me today haha


----------



## Arizonian

Quote:


> Originally Posted by *Sazz*
> 
> Uhm... is it just me, or is the voltage-mod thread for the 290X no more? I was gonna check out the instructions, but the first two posts that contained them are not there anymore.


You're correct. The thread starter didn't want to support it, and it was locked per his request. The last few posts in that thread sum it up.

Quote:


> Originally Posted by *RAFFY*
> 
> Well it turns out I'm a dummy and I forgot to disable "Overdrive" in the Catalyst while using GPU Tweak at the same time. Since I figured this out I haven't crashed. Oh stupidity got me today haha


Sometimes it's the small things we miss. Your sharing that may help someone else experiencing the same issue.


----------



## Arm3nian

Quote:


> Originally Posted by *esqueue*
> 
> From all the posts written when this whole thing started a few hours ago, I didn't notice any complaining. We were discussing if AMD purposely designed the cooler to throttle while not being able to keep 95°C. I and many others think that they put in a cooling system that can't keep their goal without either the card throttling or the fans being loud. It wouldn't be smart to pay $600+ for a video card and not read reviews and not know what you are getting yourself into.
> 
> The card was purchased and performed exactly as I expected. I never planed on keeping the stock cooling and I will never state that I think that the cooling system was adequate to do the job at a low noise level because it can't. My custom fan settings are keeping the card around 80°C but IS loud. People keep comparing other bad cards that are either louder or run hotter. The fact that other cards did it doesn't have anything to do with this card. I believe that this is an awesome card with a bad choice of a cooling setup. The cons weren't enough for me to wait for another version nor even bring it up. I only commented as I strongly disagree that this is a flawless design.
> 
> It's like buying a new car with what I consider small ugly rims. I can say that they should have come with better wheels but I'll upgrade them myself. err....forget about the car analogy as rims are just a personal preference.


If you want to sacrifice performance for low noise, put it on quiet mode. If you don't care about noise, put it on uber. The design is very good: it gives those who want a silent card the option to have one, and those who are looking for maximum performance the option to have that. Without the 95°C target (which the card is designed to run at), the performance would be lower because clocks would be lower. This design allows both parties to have what they are looking for.

If you want quiet and maximum performance, then you're going to need water, simple as that. The cooler does its job. AMD didn't need to spend any more money putting the best cooler they could find on it. Leave that to the third parties.
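The design being argued over here is essentially a temperature-target governor: hold maximum clocks until the 95°C target is reached, then shed clock speed instead of raising the capped fan. A toy model of that loop (the clock limits and step size are invented for illustration, not AMD's actual PowerTune parameters):

```python
# Toy PowerTune-style governor. TARGET_C is the stated 95 C design target;
# MAX_CLOCK/MIN_CLOCK/STEP are hypothetical MHz values for illustration.
TARGET_C, MAX_CLOCK, MIN_CLOCK, STEP = 95, 1000, 727, 13

def governor_step(temp_c, clock_mhz):
    """Return the next clock given the current temperature reading."""
    if temp_c >= TARGET_C and clock_mhz > MIN_CLOCK:
        return max(MIN_CLOCK, clock_mhz - STEP)   # at target: throttle down
    if temp_c < TARGET_C and clock_mhz < MAX_CLOCK:
        return min(MAX_CLOCK, clock_mhz + STEP)   # headroom: clock back up
    return clock_mhz                              # steady state
```

In this model the card oscillates around whatever clock the cooler can sustain at 95°C, which is why a higher fan cap (uber mode) raises the average clock rather than lowering the temperature.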


----------



## Sazz

anyone got instructions on how to flash BIOS with this 290X?

God damn childish argument on the volt mod thread... -_-


----------



## Arizonian

Quote:


> Originally Posted by *Sazz*
> 
> anyone got instructions on how to flash BIOS with this 290X?
> 
> God damn childish argument on the volt mod thread... -_-


Only have the ASUS BIOS download

http://www.mediafire.com/download/5gewoc5e0jv4552/ASUS.ROM

Use atiflash in dos to flash 290X


----------



## esqueue

Quote:


> Originally Posted by *Arm3nian*
> 
> If you want to sacrifice performance for low noise, put it on quiet mode. If you don't care about noise, put it on uber. The design is very good, it gives those who are wanting a silent card the option to have one, and those who are looking for maximum performance the option to have one. Without the 95c (which the card is designed to run at) the performance would be lower because clocks would be lower. This design allows both parties to have what they are looking for.
> 
> If you want quiet and maximum performance, then you're going to need water, simple as that. The cooler does its job. AMD didn't need to spend any more money putting the best cooler they could find on it. Leave that to the third parties.


Most people look out for themselves, and of course I would prefer that AMD had paid for a better cooler, but I made the decision to get the card anyway.
Please tell me the purpose of this post, and why was it quoted as if you are replying to me? I posted facts about the reference cooler, and you post your opinions about the card. I said it many times, even in the post that you quoted: "I am watercooling the card."

"Lower Noise, Greater air flow cooling" is what's written on the box. That is only possible with aftermarket cooling, not the reference cooler. Again, these are the facts.


----------



## Rar4f

Could anyone please explain how running the card at 87-95°C is not damaging to GPU lifespan long-term? And if the R9 290 is designed to run at a hot temperature (I don't get how that works), would that mean there is a sweet spot, e.g. 85°C, that the card may be best at running?


----------



## Arm3nian

Quote:


> Originally Posted by *esqueue*
> 
> Most people look out for themselves, and of course I would prefer that AMD had paid for a better cooler, but I made the decision to get the card anyway.
> Please tell me the purpose of this post, and why was it quoted as if you are replying to me? I posted facts about the reference cooler, and you post your opinions about the card. I said it many times, even in the post that you quoted: "I am watercooling the card."


There were no facts in your post. You, like others, seem to misunderstand the design that was implemented and continue to ramble on with the same claims even though others and I have tried explaining it. I don't care if you're watercooling or absolute-zero cooling; don't say "it runs hot" or "it runs loud" if you knew it was going to do that to achieve max clocks.

If you want quiet performance switch it to quiet mode, don't need to be a genius for that.


----------



## esqueue

Quote:


> Originally Posted by *Rar4f*
> 
> Could anyone please explain how running the card at 87-95°C is not damaging to GPU lifespan long-term? And if the R9 290 is designed to run at a hot temperature (I don't get how that works), would that mean there is a sweet spot, e.g. 85°C, that the card may be best at running?


There are things that can be done to keep the card perfectly fine at those temps. It isn't the temperature that kills these devices, by the way; it is rapid heating and cooling.
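A rough way to see why cycling matters more than steady heat: in Coffin-Manson-style fatigue models, solder-joint cycle life scales roughly as (ΔT)^-n. A toy comparison (the exponent n ≈ 2 and the temperature swings below are illustrative values only, not measurements of any actual card):

```python
# Coffin-Manson-style sketch: cycles-to-failure scale with (delta T)^-n,
# so relative life between two designs is (dT_a / dT_b)^n. n=2 is a common
# textbook value for solder; real exponents vary by alloy and geometry.
def relative_cycle_life(delta_t_a, delta_t_b, n=2.0):
    """How many times more thermal cycles design B survives than design A."""
    return (delta_t_a / delta_t_b) ** n

# Example: a card swinging 70 C -> 96 C each session (26 C swing) versus one
# held near-steady at 90-95 C (5 C swing). The small-swing card tolerates
# roughly (26/5)^2, i.e. about 27x more cycles in this toy model.
```

This is why a constant 95°C by design can be less stressful on the package than repeated large idle-to-load swings, although a lower steady temperature is still better for electromigration.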


----------



## Rar4f

Quote:


> Originally Posted by *esqueue*
> 
> There are things that can be done to keep the card perfectly fine at those temps. It isn't the temperature that kills these devices, by the way; it is rapid heating and cooling.


So you're saying that if you run an R9 290 and for some reason the temps act up like this: from 70°C to 96°C, down to 70, back up to 96, and so on, then this will ruin the card.

But a card running a consistent 90-95°C will be just fine?


----------



## Forceman

Quote:


> Originally Posted by *Arm3nian*
> 
> There were no facts in your post. You, like others, seem to misunderstand the design that was implemented and continue to ramble on with the same claims even though others and I have tried explaining it. I don't care if you're watercooling or absolute-zero cooling; don't say "it runs hot" or "it runs loud" if you knew it was going to do that to achieve max clocks.
> 
> If you want quiet performance, switch it to quiet mode; you don't need to be a genius for that.


The point is that it was within AMD's power to keep us from having to make that choice. They could have put a better, more capable reference cooler on it like Nvidia did with the 780/Titan. I shouldn't have to choose between running 20% below the advertised clock speed just to keep the card from sounding like a jet engine, or conversely having a card that sounds like a jet engine just to get it to run at the advertised clock. Why is it okay for AMD to put us in that position when they could have just made the cooler better? I don't get why people are making excuses and defending AMD in this - the cooler is completely inadequate for the task at hand, and AMD shouldn't get a pass for that just because you have the option of running it louder.


----------



## esqueue

Quote:


> Originally Posted by *Rar4f*
> 
> So you're saying that if you run an R9 290 and for some reason the temps act up like this:
> 
> from 70C up to 96C,
> down to 70C, back up to 96C, and so on,
> this will ruin the card.
> 
> But a card running a consistent 90s temperature (90-95C) will be just fine?


If it was really designed to handle those temps (which I don't doubt), then it could run at them all the time. The thing is that after a gaming session, the card cools off quite rapidly. It might still be slow enough, but I have no idea, so I'll hold off on any comments about that.
Quote:


> Originally Posted by *Forceman*
> 
> The point is that it was within AMD's power to keep us from having to make that choice. They could have put a better, more capable reference cooler on it like Nvidia did with the 780/Titan. I shouldn't have to choose between running 20% below the advertised clock speed just to keep the card from sounding like a jet engine, or conversely having a card that sounds like a jet engine just to get it to run at the advertised clock. Why is it okay for AMD to put us in that position when they could have just made the cooler better? I don't get why people are making excuses and defending AMD in this - the cooler is completely inadequate for the task at hand, and AMD shouldn't get a pass for that just because you have the option of running it louder.


http://www.techspot.com/news/54576-radeon-r9-290-review-kick-ass-value-same-top-notch-performance.html — if you scroll down, they show a cooler that can easily achieve both performance and quietness.


----------



## Arizonian

Quote:


> Originally Posted by *Rar4f*
> 
> So you're saying that if you run an R9 290 and for some reason the temps act up like this:
> 
> from 70C up to 96C,
> down to 70C, back up to 96C, and so on,
> this will ruin the card.
> 
> But a card running a consistent 90s temperature (90-95C) will be just fine?


I'm sure you've been following since you first asked back on page 365. I'm going to re-post Paul's quote with bold and underlined text. Thank you, Paul17041993. Good explanation.
Quote:


> Originally Posted by *Paul17041993*
> 
> I'm surprised I'm still hearing complaints about temperature and the ref cooler...
> 
> 100C is in fact a normal running temp for the vast majority of silicon-based microchips, and these cards are no different. It's just that the FX and first GCN cores had certain sensitivities in their design where high temps could trigger instability and wear them; they changed this in the 290 and 290X so that, just like a lot of Intel chips, and my 5770, they can run as high as 99C before shutting off. (Funnily enough, my 5770 never did.)
> 
> *Water boils at, guess what, 100C; solder usually melts around 180C; silicon melts at about 1400C; copper at about 1085C.*
> 
> Heating and cooling rapidly will eventually cause warpage, but as a lot of people have noticed, the core heats slowly (about 5-10 minutes under heavy load), then the fan shuts off after use and it cools down slowly. Overall this prevents warpage and will give it more than 3 years of life.
> 
> And the reference cooler: yes, it likely only costs 50 bucks; yes, the fan uses a sleeve bearing (or I'm pretty sure it does); the chamber is even a lower-end liquid type versus the vapor chamber the Sapphire Vapor-X cards use. Overall it's DESIGNED to be CHEAP and SIMPLE, so if you want to use it, you can; otherwise, AMD have done something very nice and cut 100 bucks off the price of the card by only using a basic cooler for you to replace with whatever you want.
> 
> If you want to complain about it, compare it to the reference coolers supplied for your CPU.


----------



## Forceman

Quote:


> If you want to complain about it, compare it to the reference coolers supplied for your CPU.


I don't know about AMD, but the stock Intel cooler runs the CPU just fine at its rated speed, without excessive noise. If that's their justification, they are even more in denial than I thought.


----------



## Arm3nian

Quote:


> Originally Posted by *Forceman*
> 
> The point is that it was within AMD's power to keep us from having to make that choice. They could have put a better, more capable reference cooler on it like Nvidia did with the 780/Titan. I shouldn't have to choose between running 20% below the advertised clock speed just to keep the card from sounding like a jet engine, or conversely having a card that sounds like a jet engine just to get it to run at the advertised clock. Why is it okay for AMD to put us in that position when they could have just made the cooler better? I don't get why people are making excuses and defending AMD in this - the cooler is completely inadequate for the task at hand, and AMD shouldn't get a pass for that just because you have the option of running it louder.


Take a Titan or 780 and pretend that it came from the factory simply clocked higher with increased voltage. When overclocked, do you think that cooler would still be amazing? No, it would go over TDP without a jet-engine fan. AMD gave you a choice of quietness or performance. If they had balanced it like Nvidia, you would get mediocre performance with a still non-quiet fan. Putting on a better cooler would also drive up the price, and even aftermarket coolers still run into a thermal barrier.


----------



## Rar4f

Thanks Ariz!


----------



## Cheapshot

Hello,

I just registered for the forum because I have a Club3D R9 290X and I'm having the same black screen issue described in this thread in Battlefield 4 and some other games and benchmarks.
Personally, I've lost faith in the product because I bought it for BF4 and it's just not playable. Luckily I have an old GTX 460 lying around, so I can still play BF4 on medium without any problems.
Here are my specs:

Cpu: Intel 4670K stock, cooled by Corsair H100i
Motherboard: Gigabyte G1.Sniper M5, latest bios F7
GPU: Club3D R9 290X, Catalyst 13.11 Beta8 and tested the latest 9.2 beta
Memory: Crucial Ballistix 16GB DDR3 PC3-12800 1.5 volt
SSD: Samsung EVO 840 250GB
OS: Windows 8 Pro, up to date, no 8.1

Made a small video where I do some other benchmarks resulting in a black screen.





I hope someone will find a solution for this issue. I'm already emailing Club3D support to see if there is a solution or if I should just send the card in for repair.


----------



## Paul17041993

Quote:


> Originally Posted by *Forceman*
> 
> I don't know about AMD, but the stock Intel cooler runs the CPU just fine at it's rated speed, without excessive noise. If that's their justification, they are even more in denial than I thought.


Once you peak them (talking about top-end i7s and the basic air coolers), they get pretty noisy. I even had a Pentium 4 with a copper-block stock heatsink and it was quite the noise maker in summer...

But the point being, the cooler's only meant to be basic, to do its job while not increasing the cost of the card, and this being a card for the enthusiast market, a large majority will be modded anyway, just like the majority of people here will use a different cooler for their CPU.

In a sense, though, I wish they put a form of cap over the core so people don't have to worry about coating the surrounding micro components...


----------



## esqueue

Quote:


> Originally Posted by *Paul17041993*
> 
> In a sense, though, I wish they put a form of cap over the core so people don't have to worry about coating the surrounding micro components...


I actually have to be spoon-fed this. Are you talking about how the older CPUs/GPUs were covered in metal and now they are fully exposed? If so, aren't they doing it to allow better heat transfer?
Quote:


> Originally Posted by *Cheapshot*
> 
> Hello,
> 
> I just registered for the forum because I have a Club3D R9 290X and I'm having the same black screen issue described in this thread in Battlefield 4 and some other games and benchmarks.
> Personally, I've lost faith in the product because I bought it for BF4 and it's just not playable. Luckily I have an old GTX 460 lying around, so I can still play BF4 on medium without any problems.
> Here are my specs:
> 
> Cpu: Intel 4670K stock, cooled by Corsair H100i
> Motherboard: Gigabyte G1.Sniper M5, latest bios F7
> GPU: Club3D R9 290X, Catalyst 13.11 Beta8 and tested the latest 9.2 beta
> Memory: Crucial Ballistix 16GB DDR3 PC3-12800 1.5 volt
> SSD: Samsung EVO 840 250GB
> OS: Windows 8 Pro, up to date, no 8.1
> 
> Made a small video where I do some other benchmarks resulting in a black screen.
> 
> 
> 
> 
> 
> I hope someone will find a solution for this issue. I'm already emailing Club3D support to see if there is a solution or if I should just send the card in for repair.


Is your card overclocked or stock speed?


----------



## lugal

The R9 290 cooler is OK for a stock blower. Nvidia had to start a marketing campaign against the R9 290 because they can't counter it with an actual product, and they couldn't really find anything "bad" about the card, so they aim at the cooler.
And why would anyone complain about the card running within its operating temperature... oh, viral marketing. Nvidia's marketing team knows there is a bunch of customers who paid big money for their cards and need to justify their (past/future) purchases now that their cards are outclassed by AMD GPUs, so they actually leave the "dirty work" to them, while gently "showing them the way" via paid reviewers.


----------



## Paul17041993

Quote:


> Originally Posted by *esqueue*
> 
> I actually have to be spoon-fed this. Are you talking about how the older CPUs/GPUs were covered in metal and now they are fully exposed? If so, aren't they doing it to allow better heat transfer?


Older CPUs, from about 2000 and earlier, were bare silicon, very similar to GPUs, but they started soldering caps over the top to make installing CPU coolers much simpler and to add more surface area.

But yes, one downside is the extra metal can increase temperature slightly; I don't think by much, though, when using solder. With graphics cards having a limited profile, I think the extra few mm is also a concern...


----------



## Forceman

Quote:


> Originally Posted by *lugal*
> 
> The R9 290 cooler is OK for a stock blower. Nvidia had to start a marketing campaign against the R9 290 because they can't counter it with an actual product, and they couldn't really find anything "bad" about the card, so they aim at the cooler.
> And why would anyone complain about the card running within its operating temperature... oh, viral marketing. Nvidia's marketing team knows there is a bunch of customers who paid big money for their cards and need to justify their (past/future) purchases now that their cards are outclassed by AMD GPUs, so they actually leave the "dirty work" to them, while gently "showing them the way" via paid reviewers.


And here come the conspiracy theories; Nvidia behind every bush.


----------



## Cheapshot

System is completely stock.


----------



## Sazz

OK, I was able to figure out how to flash the BIOS, and here is a preview so far of what's to come. I have work over the weekend, so I can't really do many benchmarks.

My best Firestrike score so far, almost breaking the 10k barrier with an all-AMD build. I am using the PT1 BIOS:

http://www.3dmark.com/3dm/1580272


----------



## TheSoldiet

I'm having the same problem with BF4. The only fix I have is, after the black screen, to turn off the PSU and turn it back on. Works for me, as the PC won't boot if I don't. After I reboot I can play BF4 just fine, but when I try to play the next day, it crashes and I have to do it all over again.


----------



## lugal

Quote:


> Originally Posted by *Forceman*
> 
> And here come the conspiracy theories, Nvidia behind every bush


You mean conspiracy theories like "the USA is spying on everyone" was just a few years ago?


----------



## Forceman

Quote:


> Originally Posted by *sugarhell*
> 
> Check this as a basic
> 
> http://forums.guru3d.com/showthread.php?t=350890


Thanks, so is it best just to leave the CCC options alone, or try to change some of the antialiasing settings?


----------



## Clockster

I don't know, guys; this stock cooler is fine. Even though it is insanely loud, it keeps the card running fine.
Playing BF4 with a custom fan profile and my card at 1050 core, my temps are fantastic; the highest I've seen was 81C under full load.

On a side note my 2nd card was DOA







Getting it swapped later today.

Starting on my watercooling tonight


----------



## the9quad

The beta 9.2 drivers brought back black screen lockups and graphical corruption for me, I am thinking of RMA'ing this card.


----------



## Slomo4shO

Quote:


> Originally Posted by *pioneerisloud*
> 
> My 7970 reference card hits 93*C with the stock cooler at 66% fan speed (auto). What's your point?
> 
> All these complaints about the cooler sucking..... how about you fix the fan curve or manually set it to a fixed speed? There you go, problem solved.


Do you hear me complaining about one of my 7950s in my quadfire topping 95C under load? That is because the card isn't throttling. However, the 290X throttles in both stock BIOS modes. Your 7970 didn't have a predefined cap on fan speed to reduce noise levels; the Hawaii GPUs do.

The issue isn't the workarounds; it is the product delivery itself. The cards are designed to perform at 94C and will clock down when they reach 95C. Neither silent nor uber mode provides ample cooling, at stock AMD-configured settings, to maintain factory core frequencies. This is one of the reasons why AMD delayed the 290 and delivered a driver set that changed the 290's fan speed to 47%, so it could actually compete with the 780 instead of being throttled and underperforming.

Yes, individuals on these forums have enough sense to change the cooler or reconfigure the fan curve to ensure the card is cooled properly to where it can actually perform at full capacity at stock frequencies, at the very least. However, anyone outside this circle may not, and likely does not, go into any more detail than installing the card and drivers and loading their games. Such individuals purchased a product expecting it to perform at a certain level, will find that the cards underperform, and will have that perception reinforced by just about every review of the cards out there, which suggests the cards throttle under load.

The fact still stands: AMD paired an inefficient cooler with a cap on maximum fan speed to limit noise, and ended up with a product that is still relatively loud and still throttles at stock settings. Are you really going to argue against this?
Quote:


> Originally Posted by *Arm3nian*
> 
> If you want quiet performance switch it to quiet mode, don't need to be a genius for that.


The only problem is that uber mode is still unable to maintain a stable load clock of 1000MHz and will clock down under load. The only way to circumvent this is to set up a custom fan curve, and that means AMD's product delivery failed to perform at stock settings. I am unsure why you seem to think this argument is about heat and noise when it is actually about performance. Is it too much to ask for the card to perform at factory specs under factory settings? Catalyst 13.11 BETA 9.2 supposedly helps alleviate the downclocking, but we shall see what happens.
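For readers unfamiliar with what a custom fan curve actually does, here is a minimal sketch, in Python, of the kind of piecewise-linear temperature-to-fan-speed mapping a tool like MSI Afterburner applies. The breakpoints below are illustrative assumptions, not AMD's stock values.

```python
# Hypothetical custom fan curve: (temperature C, fan speed %) breakpoints.
# These numbers are made up for illustration, not AMD's defaults.
CURVE = ((40, 30), (60, 45), (75, 60), (85, 80), (94, 100))

def fan_speed(temp_c, curve=CURVE):
    """Piecewise-linear interpolation between (temp, fan%) breakpoints."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the curve: minimum fan speed
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # linear interpolation between adjacent breakpoints
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]             # above the curve: pin at max fan speed
```

The point of a curve like this versus AMD's stock cap is simply that fan speed keeps rising with temperature instead of flatlining at a noise-limited ceiling, so the core never reaches the 95C throttle point.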


----------



## Paul17041993

Quote:


> Originally Posted by *TheSoldiet*
> 
> I'm having the same problem with BF4. The only fix I have is, after the black screen, to turn off the PSU and turn it back on. Works for me, as the PC won't boot if I don't. After I reboot I can play BF4 just fine, but when I try to play the next day, it crashes and I have to do it all over again.


Hm, so what happens on this "black screen"? Does the system stay running? If it shuts off completely, you're overloading your PSU; if it stays on, then it's something else...


----------



## Strata

Definitely need WC on the card, but < 80C during Heaven 4.0 on ultra with extreme tessellation is good for a custom fan curve (tops at 65%); scored 1507.

FSE scored 5076

[email protected], WC
8gb 1600 @ 2133, 11-12-12-32 1T
Kingston Hyper 3K SSD
Sapphire R9-290X BF4 @ 1125/1500 AIR


----------



## Falkentyne

Quote:


> Originally Posted by *Cheapshot*
> 
> Hello,
> 
> I just registered for the forum because I have Club3D R9 290X and having the same black screen issue described in this thread in Battlefield 4 and some other games and benchmarks.
> Personally I lost faith in the product because I bought it for BF4 and its just not playable, luckily I have an old GTX460 laying around so I can still play BF4 on medium without any problems.
> Here are my specs:
> 
> Cpu: Intel 4670K stock, cooled by Corsair H100i
> Motherboard: Gigabyte G1.Sniper M5, latest bios F7
> GPU: Club3D R9 290X, Catalyst 13.11 Beta8 and tested the latest 9.2 beta, stock speeds
> Memory: Crucial Ballistix 16GB DDR3 PC3-12800 1.5 volt
> SSD: Samsung EVO 840 250GB
> OS: Windows 8 Pro, up to date, no 8.1
> 
> Made a small video where I do some other benchmarks resulting in a black screen.
> 
> 
> 
> 
> 
> I hope someone will find a solution for this issue. Im already mailing with Club3d support to see if there is a solution or that I should just send the card in for repair.


For you guys having black screens in windowed mode (borderless):

1) do you have Hynix or Elpida memory?
2) DO YOU HAVE OVERDRIVE ENABLED OR DISABLED?

I just ran heaven 4.0 for about 50 minutes windowed and didn't get a black screen. (Hynix, overdrive disabled, overclocking through afterburner only, 1100/1400).

95% sure these black screens are memory related. The only time I ever got a black screen was when I tried overclocking the memory through OVERDRIVE (to 1500 MHz), which was FINE in Windows, UNTIL I restarted the computer... then it would black screen on the desktop, almost as soon as CCC loaded...


----------



## Strata

No black screens here either, though I'm getting some odd visuals; it's like certain light sources (shafts, mostly) and a flag in Heaven 4. The shafts repeat instead of diffusing, and the flag is blurry, as if you smeared the art... I'll try to get some screenshots.


----------



## Gilgam3sh

Quote:


> Originally Posted by *Falkentyne*
> 
> 95% sure these black screens are memory related. Only time I ever got a black screen was when I tried overclocking the memory through OVERDRIVE (to 1500 MHz), which was FINE in windows, UNTIL I restarted the computer..then it would black screen on the desktop, almost as soon as CCC loaded...


When you say that, I also think it's something wrong with OVERDRIVE. I got a black screen after logging in to Windows 8.1; the only thing that helped was to boot into safe mode, uninstall the driver, then reboot and install the driver again. Now I only use MSI Afterburner to overclock and there is no problem with black screens.


----------



## TheSoldiet

My PSU and PC stay on. Nothing wrong with my PSU, an OCZ ZX 850 Gold. When I press the restart button on the front of my case, nothing happens; it just stays black. I have to power-cycle my PSU for some reason to make it work.

On my phone (SGS3), so it is hard to write.


----------



## Paul17041993

It seems a bit strange for black screens to be memory related; a black screen generally means the core shut off for some reason. Memory rarely causes this unless it's the controller in the core itself; memory issues usually show up as visual artifacts and bad textures...

Has anyone gotten in there with HWiNFO or similar and logged the values to a file for the crashes? I know when my 7970 had its zebra screens, the system was still alive for a couple of seconds afterward, enough to show uninitialized values in the logs (super low negatives and zeros)...
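Paul's suggestion can be sketched as follows: export the sensor log to CSV (HWiNFO and GPU-Z can both log to file) and scan it for the obviously bogus rows he describes, zeros or large negatives that appear after the core has died while logging keeps running. The column names here are assumptions; adjust them to whatever your logging tool actually writes.

```python
import csv
import io

def find_bogus_rows(csv_text, columns=("GPU Temp", "Core Clock")):
    """Return indices of data rows whose sensor values look uninitialized.

    A reading of zero or below for temperature or clock is treated as a
    sign the core had already shut off when the sample was taken.
    """
    bad = []
    for i, row in enumerate(csv.DictReader(io.StringIO(csv_text))):
        vals = [float(row[c]) for c in columns]
        if any(v <= 0 for v in vals):
            bad.append(i)
    return bad
```

Run against a real log, the rows flagged just before the timestamps stop are the interesting ones: they tell you whether the card died abruptly (values go straight to garbage) or was still reporting sane numbers when the display went black.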


----------



## TheSoldiet

When I get my black screen, I can still hear my friends on Skype for a couple of seconds; then it freezes and the sound is all weird. It's the same black screen I get if I OC my GPU too much (but this black screen happens at stock clocks).

Fx 8350 4.5 GHz 1.39 volts
Asus crosshair v formula z
Cm storm stryker
8gb vengeance 1600mhz
R9 290X (sapphire)
AMD his hd 6970 reference blue pcb (my old gpu)


----------



## Spectre-

Quote:


> Originally Posted by *TheSoldiet*
> 
> When i get my black screen, I can still hear my friends on Skype for a couple of seconds, then it freezes and the sound is all weird. The same Black screen i get if i oc my gpu to much. (this black screen happens at stock clock)
> 
> Fx 8350 4.5 GHz 1.39 volts
> Asus crosshair v formula z
> Cm storm stryker
> 8gb vengeance 1600mhz
> R9 290X (sapphire)
> AMD his hd 6970 reference blue pcb (my old gpu)


The graphics driver has crashed.

Really, most of the time it should recover, but you should look into it.


----------



## Cheapshot

Quote:


> Originally Posted by *Falkentyne*
> 
> For you guys having black screens in windowed mode (borderless):
> 
> 1) do you have Hynix or Elpida memory?
> 2) DO YOU HAVE OVERDRIVE ENABLED OR DISABLED?
> 
> I just ran heaven 4.0 for about 50 minutes windowed and didn't get a black screen. (Hynix, overdrive disabled, overclocking through afterburner only, 1100/1400).
> 
> 95% sure these black screens are memory related. Only time I ever got a black screen was when I tried overclocking the memory through OVERDRIVE (to 1500 MHz), which was FINE in windows, UNTIL I restarted the computer..then it would black screen on the desktop, almost as soon as CCC loaded...


1) I have Elpida memory
2) Disabled; I do my overclocking with MSI Afterburner

I just downclocked my card to 800/1000 and ran the Metro: Last Light test 5 times without a black screen; I wasn't able to do that before. Also did 30 minutes of BF4 without a black screen, so it must be the memory or the GPU. During the tests the core never got above 70C.

I'll keep testing and post when I have more to report.


----------



## the9quad

Definitely not a power issue with the black screens; I am running a PC Power and Cooling 1200-watt Platinum power supply for ONE CARD. Going back to beta 8 seems to have fixed it somewhat, but I still get random red dots on things in BF4, almost like someone has a laser pointer aimed at things once in a while. Going back to beta 8 fixed the flashing textures and black screens, I hope. Fan speed is roughly 70% and the card temp is roughly 75 or 80 or thereabouts. NO overclock on the card, just a custom fan profile.


----------



## kantxcape

I have a Sapphire 290X with Elpida memory and I'm getting the black screen crashes in normal and uber mode. I've only tried BF4, but I bought the 290X for it, so... I need to check if this happens with the new drivers. I will report back later. Also, do your 290Xs that crash to a black screen make a crazy coil whine?


----------



## Paul17041993

I guess they haven't quite worked out all the kinks with the new controllers; it should hopefully improve when the custom cards come, though...

Be sure you're reporting your issues to their driver team via the bug/issue reporter, so they can test with your setups and try to work it out.


----------



## the9quad

Quote:


> Originally Posted by *Paul17041993*
> 
> guess they haven't quite worked out all the kinks with the new controllers, should hopefully improve when he custom card come though...
> 
> be sure your reporting your issues to their driver team via the bug/issue reporter, so they can test with your setups and try to work it out.


where does one find the bug reporter?


----------



## Falkentyne

Quote:


> Originally Posted by *kantxcape*
> 
> I have a Sapphire 290x elpida mems and im getting the black screen crashes in normal and uber mode, only tried BF4 but i bought the 290x for it so... need to check if this happens with the new drivers. I will report back later, also do your 290x that crash at the black screen do a crazy coil whine?


So far, each and every person who has had the black screen problem has Elpida memory.

Has ANYONE WITH HYNIX had the black screen happen (when not overclocking with overdrive?)


----------



## Paul17041993

Quote:


> Originally Posted by *the9quad*
> 
> where does one find the bug reporter?


Uhm... oh, here it is:
http://www.amdsurveys.com/se.ashx?s=5A1E27D25AD12B60

AMD overhauled their site a while ago; I think they usually link the issue reporting on the beta driver pages...


----------



## Falkentyne

Quote:


> Originally Posted by *Cheapshot*
> 
> 1) I have Elpida memory
> 2) Disabled, I do my overclocking with MSI Asfterburner
> 
> I just down clocked my card to 800/1000 and did the Metro last light test 5 times without a black screen, I wasn't able to do this before. Also did 30 minutes of BF4 without a black screen, so it must be the memory or the gpu. During the tests the core never got above 70C.
> 
> I'll keep testing and post when I have more to report.


Meh, where did what I wrote vanish to?

Well, I said: try raising the memory 50 MHz at a time, with the core downclocked, back up to 1250, and see if it happens again.

So far I haven't seen anyone with Hynix have this issue (yet), so we need more feedback.
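The 50 MHz-at-a-time procedure described above amounts to a simple linear search for the highest stable memory clock. A minimal sketch, with `is_stable` as a placeholder for an actual stress-test run (Heaven, BF4, etc.) at each step:

```python
def highest_stable(start_mhz, target_mhz, step, is_stable):
    """Step the clock up by `step` MHz until instability appears.

    `is_stable` is a stand-in for a real stress test at that clock;
    returns the last clock that passed, or None if even the start fails.
    """
    best = None
    mhz = start_mhz
    while mhz <= target_mhz:
        if not is_stable(mhz):
            break               # first unstable step: stop searching
        best = mhz
        mhz += step
    return best
```

With a card that (hypothetically) artifacts above 1150 MHz, stepping from 1000 to 1250 in 50 MHz increments would land on 1150 as the last good clock, which is exactly the number you want before comparing Elpida vs Hynix cards.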


----------



## Dominican

Newegg sold out last night


----------



## TheSoldiet

I also have elpida memory on my sapphire 290x.....


----------



## TheSoldiet

Is it worth trading back my 290X and then getting a 290 with the Xtreme III cooler?


----------



## Spectre-

Quote:


> Originally Posted by *TheSoldiet*
> 
> Is it worth trading back my 290x and then get a 290 with xtreme III cooler?


Personally, save up and just put that Accelero on the 290X.


----------



## selk22

Quote:


> Originally Posted by *the9quad*
> 
> but I still get random red dots on things in BF4, almost like someone has a laser pointer pointed at things once in a while


BF4 right now is a terrible way to judge artifacts, because the thing you are talking about is actually a glitch in the game: the laser texture gets stuck in random places.


----------



## the9quad

Quote:


> Originally Posted by *selk22*
> 
> bf4 right now is a terrible way to judge for artifacts because the thing you are talking about is actually a glitch in the game of the laser texture being stuck randomly places.


YEAH!!! Thanks for pointing that out. Seriously, I was like, what the heck, overheating already!!!?


----------



## selk22

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *the9quad*
> 
> YEAH!!! thanks for pointing that out, seriously, I was like ***, overheating already!!!>?>>>






No problem buddy didn't want you freaking out because of dumpy EA!


----------



## esqueue

Quote:


> Originally Posted by *selk22*
> 
> 
> No problem buddy didn't want you freaking out because of dumpy EA!


LOL, anytime I read your responses, I hear it in Piccolo's voice.

I need sleep.


----------



## givmedew

So what's the way to go right now?

I am going to be watercooling this thing... Pretty much set on a 290 non-X version. Does it really matter which card I get? Should I wait for any specific vendors to release a non-reference card?


----------



## Pfortunato

Does anyone know when Asus, Gigabyte, etc. will launch the R9 290/290X with aftermarket coolers?

Thx


----------



## Spectre-

Quote:


> Originally Posted by *Pfortunato*
> 
> Does anyone knows when will asus, gigabyte... launch the r9 290/290x with aftermarket coolers?
> 
> Thx


I heard rumours of 2 weeks,

but then some people are saying 2 months.
Can't be too sure, mate.


----------



## flopper

Quote:


> Originally Posted by *givmedew*
> 
> So what the way to go right now?
> 
> I am going to be water cooling this thing... Pretty much set on a 290 non x version. Does it really matter what card I get? Should I wait for any specific vendors to release a non reference card?


No, they're all the same cards with different stickers; check for the warranty etc. that you want.


----------



## EliteReplay

Quote:


> Originally Posted by *TheSoldiet*
> 
> When i get my black screen, I can still hear my friends on Skype for a couple of seconds, then it freezes and the sound is all weird. The same Black screen i get if i oc my gpu to much. (this black screen happens at stock clock)
> 
> Fx 8350 4.5 GHz 1.39 volts
> Asus crosshair v formula z
> Cm storm stryker
> 8gb vengeance 1600mhz
> R9 290X (sapphire)
> AMD his hd 6970 reference blue pcb (my old gpu)


That's not an issue with your card but with Windows 8.1... look at Google; so many people have the same problem. Just use Windows 8.0 until Microsoft resolves this issue.

https://www.google.com.do/?gws_rd=cr&ei=ONt8UvS-Ko7OkQelq4GIAQ#q=windows+8.1+black+screen


----------



## TheSoldiet

But I run Windows 7 Ultimate -_-


----------



## Sammyboy83

Which memory is better, Hynix or Elpida? My friend bought a Gigabyte 290 and it came with Hynix memory. I also bought a Gigabyte 290 today with an EK waterblock; I will test the card after work.


----------



## rdr09

Quote:


> Originally Posted by *TheSoldiet*
> 
> But i run Windows 7 ultimate -_-


I would RMA that. I will even go as far as to disagree with Spectre: go get a 290 and an Accelero. This is if you're not into benching.


----------



## Durquavian

Quote:


> Originally Posted by *EliteReplay*
> 
> thats not an issue with your card but with windows 8.1 ... look at google... how many people with the same... just get windows 8.0 until Microsoft resolved this issue.
> 
> https://www.google.com.do/?gws_rd=cr&ei=ONt8UvS-Ko7OkQelq4GIAQ#q=windows+8.1+black+screen


It actually could be CCC Catalyst 13.11 beta 8. Granted, my cards aren't up to the same level, but I was having the same issue in Skyrim. The game was still playing but my screen was frozen; I had to ctrl-alt-delete my way out and back in to fix it. I also got a few black screen reboots. The system is stable; I checked that afterward. Downloaded 13.11 beta 9.2 and I'm gonna see if that fixes it tonight.


----------



## Spectre-

Quote:


> Originally Posted by *rdr09*
> 
> i would rma that. i will even go as far as disagree with Spectre Go get a 290 and an accelero. this is if you're not into benching.


How dare you, sir, mock me (sarcasm).
I guess I would have picked up the 290X instead of the 290 if it was the same price in Australia as in the USA.


----------



## Tobiman

If you are getting artifacts in BF4, use this to sort it out.


----------



## selk22

Quote:


> Originally Posted by *esqueue*
> 
> LOL, anytime I read your responses, I hear it in Piccolo's voice.
> 
> I need sleep.


Awesome! It just makes the responses much more epic...


----------



## rdr09

Quote:


> Originally Posted by *Spectre-*
> 
> How dare you sir mock me (sarcasm)
> i guess i would have picked up the 290X not the 290 if it was the same price as USA in australia


----------



## battleaxe

I decided to get another 670 for SLI instead of a 290x or 290 in the short term.

I'm glad I did, because now I want to wait for the aftermarket coolers on these 290s, at which point I really want to give AMD another chance. I'm especially excited that they seem to have worked out the micro-stuttering crossfire issues. Overall, I'm really happy with what AMD has done here, which is to bring Nvidia back to the real world and their prices out of the stratosphere. That was getting seriously annoying.


----------



## DarknightOCR

hello

I have not read the whole topic back.

I've seen some problems with black screens,
but mine seems to be different.

I bought a Sapphire R9 290, but the PC won't even start up properly.

The screen is all black after Windows loads.
I have tested Win7 and Win8 and it is always the same.

If I go into safe mode it works fine.

Booting without drivers installed it also works, at low resolution, but once I install the drivers the screen goes black.
The graphics card makes a small noise with the fan spinning faster for a second, then it seems to go into sleep mode.

Anyone with the same problem?

Or do I already have to go to RMA? It never even worked properly.


----------



## Lennyx

Just got my 290X and EK block in the house. But this damn W8 won't let me get my picture from the camera.
It feels like every freaking time I try to do a normal thing on this computer, W8 ticks me off.

Edit


----------



## nemm

2 Sapphire R9 290's now under water, awaiting the SLI tubing that was omitted from the delivery, which should be here tomorrow, before I can use them.


----------



## skupples

Quote:


> Originally Posted by *Arizonian*
> 
> Wasted opportunity by AMD, I guess, to not spotlight their flagship. Not sure how this is relevant to us in this club since neither the 290X nor the 290 was represented.


Shiny pictures are shiny!

They had boxes of Hawaii, just none in use from what I could tell. Probably because they are selling every single unit coming off the line like hotcakes.


----------



## Deisun

Does anyone know what kind of modifications are needed to make the Accelero aftermarket cooler fit the 290?
I read somewhere that it has to be modified just to fit.

I don't know if I want to do that. Maybe i'll just wait for something that fits without modifications needed.


----------



## xiong91

So has the issue of retail 290X cards peaking at 7xx MHz, much slower than the press sample cards, been resolved? I was thinking of getting this card but this issue is putting me off. Need clarification. Thanks









PS: I'm going to water cool it anyway..


----------



## EliteReplay

Quote:


> Originally Posted by *nemm*
> 
> 2 Sapphire R9 290's now under water, awaiting the sli tubing that was omitted from the delivery before I can use which should be here tomorrow.


Please can u do Unigine heaven 4.0?

1080p
8xAA
Ultra
Extreme HD?

Please!!!


----------



## Bloitz

Looks like everyone is going for the Acetal-Nickel blocks


----------



## Newbie2009

Quote:


> Originally Posted by *Bloitz*
> 
> Looks like everyone is going for the Acetal-Nickel blocks


Copper ones sold out everywhere.


----------



## nemm

Quote:


> Originally Posted by *EliteReplay*
> 
> Please can u do Unigine heaven 4.0?
> 
> 1080p
> 8xAA
> Ultra
> Extreme HD?
> 
> Please!!!


Will do as soon as the machine is up and running with loop free of air.


----------



## bond32

I have to say, I'm personally quite frustrated with some of the R9 290x reviews. For those of us who plan to water cool, the sound and heat won't be an issue. I'm only interested in performance. Seems hard to find reviews that show these things...


----------



## Kriant

"FrozenCPU y u no send me those EK blocks"


----------



## Deisun

So what were the "slight modifications" TomsHardware made to the Accelero Xtreme III cooler to make it fit the 290/290X?

I'm interested in ordering one for my 290.


----------



## jomama22

Quote:


> Originally Posted by *TheSoldiet*
> 
> But i run Windows 7 ultimate -_-


Is your CPU overclocked?


----------



## gooeyballzz

Hey Guys,

I have been lurking for a long time and I really like your site.

I just got my MSI R9 290X and I have a problem. It seems to be throttling; in Metro Last Light I see GPU usage going up and down constantly, from 100% to 0.

I have installed the drivers ATI released last night, which did not help.

I used MSI AB and set the fan speed to 50%. That helped with the Metro benchmark, but the game still played choppy and I saw the constant up and down.

I got it at Newegg so the RMA would be a piece of cake, but I would really like to make it work.

Sorry for the long windedness.

Thanks


----------



## rdr09

Quote:


> Originally Posted by *Deisun*
> 
> So what were the "slight modifications" TomsHardware did to the Accelero Xtreme iii cooler to make it fit the 290/290X?
> 
> I'm interested in ordering one for my 290.


here is another alternative . . .

http://www.overclock.net/t/1439731/bored-of-waiting-for-none-ref-290x-check-this-simple-cheap-fix


----------



## rdr09

Quote:


> Originally Posted by *gooeyballzz*
> 
> Hey Guys,
> 
> I have been lurking for along time and I really like your site.
> 
> I just got my MSI r9 290x and I have a problem. It seems to be throttling, in Metro Last Light I see GPU usage up and down constantly, 100% to 0.
> 
> I have installed the drivers from ATI released last night, which did not help.
> 
> I used MSI AB and set the fan speed to 50%. That helped with the Metro Benchmark,but the game still played choppy and I saw the constant up and down.
> 
> I got it at newegg so the RMA was a piece of cake, but I would really like to make it work.
> 
> Sorry for the long windedness.
> 
> Thanks


what driver are you using? do you mind running a bench like 3DMark11?


----------



## skupples

forgot to link this... XFX rep said this will be showing up on Hawaii very soon!

(there! Slightly more on topic, AZ!)


----------



## DampMonkey

Quote:


> Originally Posted by *skupples*
> 
> 
> 
> forgot to link this... XFX rep said this will be showing up on Hawaii very soon!
> 
> (there! Slightly more on topic, AZ!)


And i thought the card couldn't get any uglier. Bravo, XFX


----------



## gooeyballzz

I'm running 13.11_betav9.2; from what I read it was supposed to fix throttling issues.

I don't have 3DMark11, but don't mind running Heaven.


----------



## skupples

Quote:


> Originally Posted by *DampMonkey*
> 
> And i thought the card couldn't get any uglier. Bravo, XFX


My picture doesn't do it much justice. Thanks to the card punch contest, pretty much every booth was swarming with sheeple not even looking at the tech... I got to handle it; the build quality is great, though a bit plasticky.


----------



## TheSoldiet

Quote:


> Originally Posted by *jomama22*
> 
> Is your CPU overclocked?


Yes it is. 4.5 GHz at 1.39 volts, AMD FX 8350.

The Q-code LED shows 'b2' when I black screen and when I try to reboot. It means legacy option ROM initialization. Don't know what that means though


----------



## rdr09

Quote:


> Originally Posted by *gooeyballzz*
> 
> I'm running 13.11_betav9.2, from what I read it was suppose to fix throttling issues.
> 
> I don't have 3DMark11, but don't mind running Heaven.


Welcome aboard.

dl the latest if you like . . .

http://www.techpowerup.com/downloads/Benchmarking/Futuremark/

once you get things working right . . .

http://www.overclock.net/f/21/benchmarking-software-and-discussion


----------



## rdr09

Quote:


> Originally Posted by *TheSoldiet*
> 
> Yes it is. 4.5 GHz 1.39 volts amd fx 8350.
> 
> The q code led shows 'b2' when i black screen, and when i try to reboot. It means legacy option rom initialization. Dont know what it means though


you may need to set your PCIe mode in the BIOS properly. i just can't remember exactly what setting.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Kriant*
> 
> "FrozenCPU y u no send me those EK blocks"


I called FCPU yesterday and they said that they will receive a small batch of EK waterblocks today and will have it shipped out today. And next week they're expecting an even bigger delivery. I asked them where I was in the queue and he said that I will have to wait for next week's shipment. I ordered a nickel-acetal block last Thursday (Oct. 31) at night. So with the EK block delayed even further for me, and my 290 not coming in until the 15th next week so long as USPS doesn't lose my package again, I don't think I will be putting my WC setup together until Thanksgiving break maybe :-\.


----------



## tsm106

Quote:


> Originally Posted by *gooeyballzz*
> 
> Hey Guys,
> 
> I have been lurking for along time and I really like your site.
> 
> I just got my MSI r9 290x and I have a problem. It seems to be throttling, in Metro Last Light *I see GPU usage up and down constantly, 100% to 0.*
> 
> I have installed the drivers from ATI released last night, which did not help.
> 
> I used MSI AB and set the fan speed to 50%. That helped with the Metro Benchmark,but the game still played choppy and I saw the constant up and down.
> 
> I got it at newegg so the RMA was a piece of cake, but I would really like to make it work.
> 
> Sorry for the long windedness.
> 
> Thanks


That's not technically throttling but a usage problem. Throttling is usually clock frequency related. Also, I'm not sure AB is displaying the usage correctly. I get the same effect on my trifire, one card is at max usage, another going up, and the last going down to 0.
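A quick way to tell the two cases apart is to log core clock and usage together and check whether the clock also falls during the usage dips. A minimal sketch (my own illustration, not a tool mentioned in the thread; the 50% dip and 95%-of-rated-clock thresholds are arbitrary):

```python
def classify_dips(samples, rated_mhz=1000, clock_slack=0.95):
    """samples: list of (core_mhz, usage_pct) tuples from a monitoring log.

    If the clock also falls during usage dips it looks like throttling;
    if the clock holds near its rated value, it is likely a usage/driver
    problem instead.
    """
    dips = [(mhz, use) for mhz, use in samples if use < 50]
    if not dips:
        return "no dips"
    throttled = sum(1 for mhz, _ in dips if mhz < rated_mhz * clock_slack)
    return "throttling" if throttled > len(dips) / 2 else "usage problem"
```

For example, samples like `[(1000, 100), (300, 10), (320, 5)]` classify as throttling, while `[(1000, 100), (1000, 10)]` classify as a usage problem, matching the distinction above.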


----------



## evensen007

I'm a little annoyed with AMD right now. They have a 400 dollar card cannibalizing (IMHO) a 550 dollar card, and there really hasn't been a peep from AMD about the situation. So someone like me who is ready to upgrade to a new GPU is looking at the playing field and seeing an overpriced 780 Ti that is the performance leader, a $550 290X that looks very tempting (I am watercooling), and then a 290 that seems to have little to no faults against its 150 dollar more expensive big brother. It's keeping me from pulling the trigger at all.


----------



## the9quad

Quote:


> Originally Posted by *evensen007*
> 
> I'm a little annoyed with AMD right now. They have a 400 dollar card cannibalizing (IMHO) a 550 dollar card and there really hasn't been a peep from AMD about the situation. So someone like me who is ready to upgrade to a new GPU is looking at the playing field and seeing an over-priced 780ti that is the performance leader, a 550 290x that looks very tempting (I am watercooling), and then a 290 that seems to have little to no faults against it's 150 dollar more expensive big brother. It's keeping me from pulling the trigger at all.


Personally I would get the 290; it's just such a flat-out better deal than anything else that it ain't funny. However, I would wait for custom cooling.


----------



## Newbie2009

Quote:


> Originally Posted by *gooeyballzz*
> 
> I'm running 13.11_betav9.2, from what I read it was suppose to fix throttling issues.
> 
> I don't have 3DMark11, but don't mind running Heaven.


Does AB officially support 290s yet? Are you noticing a performance issue in the game itself or just in the AB readings?

If it is not supported by AB yet I would just turn it off; if there is a performance hit then perhaps a driver reinstall is needed. As TSM mentioned, throttling is clock related rather than utilization.

Quote:


> Originally Posted by *evensen007*
> 
> I'm a little annoyed with AMD right now. They have a 400 dollar card cannibalizing (IMHO) a 550 dollar card and there really hasn't been a peep from AMD about the situation. So someone like me who is ready to upgrade to a new GPU is looking at the playing field and seeing an over-priced 780ti that is the performance leader, a 550 290x that looks very tempting (I am watercooling), and then a 290 that seems to have little to no faults against it's 150 dollar more expensive big brother. It's keeping me from pulling the trigger at all.


At your rez I would go 290X. I had the same issue and figured I could get 3 290s for close to 2 290Xs, but would probably have to upgrade my PSU and perhaps get a socket 2011 CPU.


----------



## jrcbandit

Quote:


> Originally Posted by *Bloitz*
> 
> Looks like everyone is going for the Acetal-Nickel blocks


I still don't trust EK with nickel for the long term (1+ years), and my CPU block is copper... They were sold out of Acetal-Copper so I went Acrylic-Copper. Safer than nickel ;p.


----------



## gooeyballzz

Quote:


> Originally Posted by *tsm106*
> 
> That's not technically throttling but a usage problem. Throttling is usually clock frequency related. Also, I'm not sure AB is displaying the usage correctly. I get the same effect on my trifire, one card is at max usage, another going up, and the last going down to 0.


Yeah, I was not really sure if that was the correct term, but the "eye test" tells me there is a problem with a brand new video card.


----------



## TheSoldiet

It only happens after I boot, and at about 5 minutes of game time it black screens on me. When I try to reboot I get the b2 error message. When I finally manage to reboot there aren't any problems whatsoever. Plays games like a charm! (so far only BF4 has crashed on me)


----------



## Arizonian

Quote:


> Originally Posted by *Lennyx*
> 
> Just got 290x and ek block in house. But this damn w8 wont let me get my picture from the camera.
> It feels like every freaking time i try to do a normal thing on this computer w8 ticks me off.
> 
> Edit
> 
> 
> Spoiler: Warning: Spoiler!


Updated to water









Quote:


> Originally Posted by *nemm*
> 
> 2 Sapphire R9 290's now under water, awaiting the sli tubing that was omitted from the delivery before I can use which should be here tomorrow.
> 
> 
> Spoiler: Warning: Spoiler!


Didn't see an OCN name in the shot or GPU-Z. I'll take your word for it; you can add it to that post later, as it's your link that everyone sees from the roster.

Congrats - Added









Quote:


> Originally Posted by *skupples*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> forgot to link this... XFX rep said this will be showing up on Hawaii very soon!
> 
> (there! Slightly more on topic, AZ!)


Yes much better.







That's good news from a rep, as it means we are drawing nearer, and hopefully sooner than the latter part of November. Please, please... dare I say it, but if ACX 780 Ti's show up on Newegg first, I'm a pretty trigger-happy guy....


----------



## tsm106

Quote:


> Originally Posted by *gooeyballzz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> That's not technically throttling but a usage problem. Throttling is usually clock frequency related. Also, I'm not sure AB is displaying the usage correctly. I get the same effect on my trifire, one card is at max usage, another going up, and the last going down to 0.
> 
> 
> 
> Yeah I was not really sure if that was the correct term, but the "eye test" tells me there is a problem with a brand new Video Card.

Oh I don't doubt you have a usage problem. In my experience they are usually driver related.

Have you thrown some benches at your card to confirm its performance and operational condition, i.e. that it's a good working card under benches?


----------



## lordzed83

This is why I went copper this time...
I even had freaking EK coolant in the loop. Corrosion anyway.


----------



## gooeyballzz

Quote:


> Originally Posted by *tsm106*
> 
> Oh I don't doubt you have a usage problem. In my experience they are usually driver related.
> 
> Have you thrown some benches at your card to confirm its performance and operational condition, ie. its a good working card under benches?


The Metro Last Light bench appeared normal, 100% all the time with no ups and downs, but the actual gameplay is really bad.


----------



## Arizonian

So in anticipation of non-reference cards I'm going to ask here rather than start a thread. Since my 580, 680, and 690 have all been reference cards, I'm now looking at non-reference cards that will be best for overclocking foremost. Quality boards, VRMs, maybe extra power phases. Not too concerned with fan noise, as anything non-reference is going to be quieter than the reference 290X I had.

Sapphire Toxic - ASUS Matrix seem to be the top two. Any other suggestions and which ones to you guys think I should shoot for and why?

My last non-ref cards were two 6870 MSI TwinFrozr II's.


----------



## Newbie2009

Quote:


> Originally Posted by *Arizonian*
> 
> So in anticipation of non-reference cards I'm going to ask here rather than start a thread perhaps. Since 580, 680, 690 have all been reference cards for me I'm now looking at some non-reference that are going to be best for over clocking foremost. Quality boards, vrms. Maybe extra power phases. Not too concerned with fan noise as anything non-reference is going to be quieter than the ref 290X I had.
> 
> Sapphire Toxic - ASUS Matrix seem to be the top two. Any other suggestions and which ones to you guys think I should shoot for and why?
> 
> My last non-ref cards were two 6870 MSI TwinFrozr II's.


The usual suspects as you mentioned, plus MSI Lightnings; Gigabyte usually does one, and I do like the HIS IceQ cards. Not as expensive as the others, but I like them.


----------



## Rar4f

Quote:


> Originally Posted by *Arizonian*
> 
> So in anticipation of non-reference cards I'm going to ask here rather than start a thread perhaps. Since 580, 680, 690 have all been reference cards for me I'm now looking at some non-reference that are going to be best for over clocking foremost. Quality boards, vrms. Maybe extra power phases. Not too concerned with fan noise as anything non-reference is going to be quieter than the ref 290X I had.
> 
> Sapphire Toxic - ASUS Matrix seem to be the top two. Any other suggestions and which ones to you guys think I should shoot for and why?
> 
> My last non-ref cards were two 6870 MSI TwinFrozr II's.


Vapor X


----------



## Newbie2009

Quote:


> Originally Posted by *Rar4f*
> 
> Vapor X


I had some of those for the 5870 and they were nice; I think they were gimped in the HD7000 series, no volt control. Could be wrong.


----------



## Arizonian

Quote:


> Originally Posted by *Newbie2009*
> 
> I had some of those for the 5870 and they were nice, think they were gimped in the HD7000 series , no volt control. Could be wrong.


I've read that about Vapor-X too. Would be nice to have a bit of voltage control without flashing BIOS.


----------



## the9quad

I've had this Sapphire 290X since launch day. Do you think Newegg will take it back and give me credit so I can just get a couple of 290's? This thing just doesn't work right, IMO. Black screens are not right; I shouldn't have to use a different BIOS and run a fan profile just to avoid hard lockups. Even then I still get them, just not as frequently.


----------



## Elmy

Club 3D 290X's, waterblocks installed, check 

It still says I have 290's in the owners list. Not that it's a big deal or anything though... LoL


----------



## Newbie2009

Quote:


> Originally Posted by *Arizonian*
> 
> I've read that about Vapor-X too. Would be nice to have a bit of voltage control without flashing BIOS.


Sapphire said it will be ready for Trixx in December. I don't use AB anymore myself. Also Mantle should be out in Dec sometime too; interesting times ahead.

Quote:


> Originally Posted by *Elmy*
> 
> Club 3D 290X's waterblocks installed check


SEXY!


----------



## jerrolds

VaporX, Lightning, HoF, IceQ2 seem to be really nice.


----------



## battleaxe

Quote:


> Originally Posted by *the9quad*
> 
> Ive had this sapphire 290x since launch day, you think newegg will take it back and let me get credit so I can just get a couple of 290's this thing just doesnt work right imo, black screens are not right, I shouldnt have to use a different bios, and run a fan profile just not to get hardlock ups. Even then i still get them just not as frequently.


It should be able to handle stock clocks; if it cannot while running a game or test, then yes, they will take it right back.


----------



## MojoW

Just ordered my first 290







Hoping it arrives tomorrow as it was in stock.


----------



## ducknukem86

I'm thinking of selling my 280X now that I see the 290 at such a tempting price!


----------



## Rar4f

Quote:


> Originally Posted by *ducknukem86*
> 
> i'm thinking of selling my 280x, now that i see the 290 at such a tempting price!


You too huh


----------



## utnorris

Quote:


> Originally Posted by *nemm*
> 
> 2 Sapphire R9 290's now under water, awaiting the sli tubing that was omitted from the delivery before I can use which should be here tomorrow.


Two barbs and a piece of tubing. Don't make us wait!!!


----------



## MojoW

Quote:


> Originally Posted by *Elmy*
> 
> Club 3D 290X's waterblocks installed check
> 
> It still says I have 290's in the owners list. Not that its a big deal or anything though... LoL


Sweet


----------



## chiknnwatrmln

Anybody put the Arctic Accelero Extreme on their card? I'm considering upgrading to a 290x and was wondering how well one of these would work.


----------



## utnorris

Quote:


> Originally Posted by *the9quad*
> 
> Ive had this sapphire 290x since launch day, you think newegg will take it back and let me get credit so I can just get a couple of 290's this thing just doesnt work right imo, black screens are not right, I shouldnt have to use a different bios, and run a fan profile just not to get hardlock ups. Even then i still get them just not as frequently.


They let me do that. I have two 290's coming Monday (stupid UPS slow butts). It ends up being about $190 more because I used a 5% off coupon, but I get quite a bit more power. Granted, the 290X will probably overclock better in general, but mine was not very good; it topped out at 1221MHz even with 1.41v (real voltage). I figure if the 290's overclock to 1150MHz it will be good enough for my needs. Luckily the waterblocks are the same, so I don't have to worry about replacing them.


----------



## Gilgam3sh

Just flashed my Sapphire 290 to the ASUS 290X BIOS; now I can use ASUS GPU Tweak and increase voltage


----------



## tsm106

Quote:


> Originally Posted by *Gilgam3sh*
> 
> just flashed my Sapphire 290 to ASUS 290X BIOS, now I can use ASUS GPU Tweak and increase voltage


Grats. FYI, let me warn you before you get black screened: don't use the PT3 BIOS.

*Y'all stay away from the BIOS' from kingpin.*


----------



## the9quad

Quote:


> Originally Posted by *tsm106*
> 
> Grats. Fyi, let me warn you before you get black screened, don't use the pt3 bios.
> 
> *Yall stay away from the bios' from kingpin.*


can you link to the correct bios then? maybe that is my issue?


----------



## tsm106

Quote:


> Originally Posted by *the9quad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Grats. Fyi, let me warn you before you get black screened, don't use the pt3 bios.
> 
> *Yall stay away from the bios' from kingpin.*
> 
> 
> 
> can you link to the correct bios then? maybe that is my issue?

Just get the stock asus bios and download gputweak from Asus.

ASUS.zip 102k .zip file


http://support.asus.com/download.aspx?SLanguage=en&m=gpu+tweak&os=30


----------



## RAFFY

Quote:


> Originally Posted by *Tobiman*
> 
> If you are getting Artifacts in BF4. Use this to sort it out.


Can't wait to try this quick fix out later tonight. I'm not really having any stuttering or artifact issues, but I've gotten a couple black screens where it locked my computer and I had to power down and restart. I think this was because of me being a dummy and having CCC "Overdrive" enabled as well as my custom fan profile in ASUS GPU Tweak. But I'm going to add this to the mix just to be safe.

Quote:


> Originally Posted by *gooeyballzz*
> 
> Hey Guys,
> 
> I have been lurking for along time and I really like your site.
> 
> I just got my MSI r9 290x and I have a problem. It seems to be throttling, in Metro Last Light I see GPU usage up and down constantly, 100% to 0.
> 
> I have installed the drivers from ATI released last night, which did not help.
> 
> I used MSI AB and set the fan speed to 50%. That helped with the Metro Benchmark,but the game still played choppy and I saw the constant up and down.
> 
> I got it at newegg so the RMA was a piece of cake, but I would really like to make it work.
> 
> Sorry for the long windedness.
> 
> Thanks


Make sure that you don't have "Overdrive" enabled in CCC.


----------



## bpmcleod

Quote:


> Originally Posted by *tsm106*
> 
> Just get the stock asus bios and download gputweak from Asus.
> 
> ASUS.zip 102k .zip file
> 
> 
> http://support.asus.com/download.aspx?SLanguage=en&m=gpu+tweak&os=30


Is there an ASUS BIOS for the 290s? I'll flash the 290X one if I have to; I just wanted a 290 version.


----------



## TomiKazi

So what is the consensus about EK backplates on the 290/X? Does it help cooling certain parts or is the money better spent on extra radiator space?


----------



## SonDa5

Quote:


> Originally Posted by *tsm106*
> 
> Grats. Fyi, let me warn you before you get black screened, don't use the pt3 bios.
> 
> *Yall stay away from the bios' from kingpin.*


Yup. Like a bad virus that makes the entire system unstable.

You do need the GPU Tweak that supports Hawaii though; I don't think the ASUS web site has the updated GPU Tweak.


----------



## Gilgam3sh

Quote:


> Originally Posted by *SonDa5*
> 
> Yup. Like a bad virus that makes entire system unstable.
> 
> You do need the gpu tweak that supports Hawaii though, I don't think asus web site has updated gpu tweak.


GPU Tweak is updated, works great on my 290


----------



## SonDa5

Quote:


> Originally Posted by *utnorris*
> 
> They let me do that. I have two 290's coming Monday (stupid UPS slow butts). It ends up being about $190 more because I used a 5% off coupon, but I get quite a bit more power. Granted, the 290x will probably overclock better in general, but mine was not very good, topped out at 1221Mhz even with 1.41v (real voltage). I figure if the 290's overclock to 1150Mhz it will be good enough for my needs. Luckily, the waterblocks are the same, so I don't have to worry about replacing them.


1221/1700 is the max my 290X with 1.41v would do with the stock ASUS BIOS and GPU Tweak for Hawaii. The Sapphire OC 290X BIOS with Sapphire TriXX may work better. I'm being patient.


----------



## Gilgam3sh

So what is the max 24/7 voltage for the 290 (maybe the same for the 290X) on the stock air cooler and on the Accelero Xtreme III?


----------



## Forceman

Quote:


> Originally Posted by *Gilgam3sh*
> 
> just flashed my Sapphire 290 to ASUS 290X BIOS, now I can use ASUS GPU Tweak and increase voltage


Can you do me a solid and see if flipping the BIOS switch puts you back on the Sapphire BIOS? I'd like to be sure that it really is dual BIOS and not just some kind of dual mode.
Quote:


> Originally Posted by *TomiKazi*
> 
> So what is the consensus about EK backplates on the 290/X? Does it help cooling certain parts or is the money better spent on extra radiator space?


Minimal cooling at best. More for aesthetics and PCB protection.


----------



## Gilgam3sh

Quote:


> Originally Posted by *Forceman*
> 
> Can you do me a solid and see if flipping the BIOS switch puts you back on the Sapphire BIOS? I'd like to be sure that it really is dual BIOS and not just some kind of dual mode.


DUAL BIOS







it took me back to Sapphire, 947 on core instead of 1000, and it says ATI under Subvendor and not ASUS.

now back to the ASUS BIOS


----------



## gooeyballzz

Quote:


> Originally Posted by *RAFFY*
> 
> Can't wait to try this quick fix out later tonight. I'm not really having any stuttering or artifact issues but I've gotten a couple black screens where my it locked my computer and I had to power down and restart. But I think this was because of me being a dummy and having the CCC "Overdriver" enabled as well as my custom fan profile in ASUS GPU Tweak. But I'm going to add this to mix just to be safe.
> Make sure that you don't have "Overdrive" enabled in CCC.


Sorry, I'm being thick right now. I don't understand what that will do. Do I uncheck the Overdrive option in CCC and set the fan speed in Afterburner?


----------



## TheSoldiet

Is there any word on AMD 290's being unlocked to 290X's? Would be really cool if that was possible.


----------



## Gilgam3sh

Quote:


> Originally Posted by *TheSoldiet*
> 
> Is it any word on AMD 290's unlocked to 290x's? Would be really cool if that was possible.


not possible, I also flashed the ASUS 290X BIOS on my Sapphire 290 and it still has 2560 shaders..

http://www.techpowerup.com/forums/showthread.php?t=193961


----------



## Mr357

Quote:


> Originally Posted by *TheSoldiet*
> 
> Is it any word on AMD 290's unlocked to 290x's? Would be really cool if that was possible.


This has been brought up before, and sadly the answer is no. The dies are lasered rather than simply being software-locked, so it's physically impossible to unlock any cores.


----------



## TheSoldiet

I see you have an Xtreme III on your 290. How are the temps? VRM, core?


----------



## Poyri

Can I use 2 R9 290's and a 3930K with a Corsair AX860i under watercooling? One of my cards is on its way; if it is okay I will get 1 more.
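For a rough sanity check, you can add up approximate maximum draws and compare against the PSU rating. A back-of-envelope sketch (my own numbers, not an official calculator: ~275 W per R9 290 and ~130 W for the 3930K are approximate TDPs, and the 100 W overhead for board/RAM/drives/pump is a guess):

```python
def estimated_draw(parts, overhead_watts=100):
    """parts: dict of component -> approximate max draw in watts.
    overhead_watts covers motherboard, RAM, drives, fans, and pump."""
    return sum(parts.values()) + overhead_watts

# Approximate TDPs, not measurements:
draw = estimated_draw({"R9 290 #1": 275, "R9 290 #2": 275, "i7-3930K": 130})
print(f"{draw} W estimated vs. 860 W rating")  # prints "780 W estimated vs. 860 W rating"
```

At stock clocks that fits within an AX860i's rating, but it leaves little headroom for overvolted overclocks.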


----------



## Forceman

Quote:


> Originally Posted by *Gilgam3sh*
> 
> DUAL BIOS
> 
> 
> 
> 
> 
> 
> 
> it took me back to Sapphire 947 on core instead of 1000 and it says ATI under Subvendor and not ASUS.
> 
> now back to ASUS BIOS


Excellent, thanks. Now I feel better about the impending BIOS flash.


----------



## Gilgam3sh

Quote:


> Originally Posted by *TheSoldiet*
> 
> i see you have a Xtreme III on your 290. How is the temps? VRM, CORE


sorry, I thought I would get the cooler today but it did not arrive, so maybe on Monday if I'm lucky... but people here in Sweden reported losing 20-25°C under load when gaming etc. with the Accelero Xtreme III


----------



## tsm106

Quote:


> Originally Posted by *TomiKazi*
> 
> So what is the consensus about EK backplates on the 290/X? Does it help cooling certain parts or is the money better spent on extra radiator space?


I always get a backplate. For 28 bucks you get body armor for the pcb, it deflects screwdrivers and prying fingers, stuff falling on the pcb. It also gives you some defense against leaks, more time to react before fatality. They also help to prevent sag. And finally, sandwiching the vrms does help remove some latent heat.


----------



## TheSoldiet

Quote:


> Originally Posted by *Gilgam3sh*
> 
> sorry, I thought I would get the cooler today but it did not arrive, so maybe on Monday if I'm lucky... but people here in Sweden reported losing 20-25°C under load when gaming etc. with the Accelero Xtreme III


Nice! Thinking of getting the Gelid Icy Vision Rev 2 for my 290X. Prices here in Norway are normally a little higher than in Sweden, though







I do have some relatives in Sweden, so if it is cheaper there I will order in Sweden. What is the best retailer there? inet.se, komplett.se, etc...


----------



## rancor

Just jumped on the Sapphire 290 up on Newegg. This will be my first AMD GPU in 6-7 years.


----------



## Gilgam3sh

Quote:


> Originally Posted by *TheSoldiet*
> 
> Nice! Thinking of getting the Gelid Icy Vision REV 2 for my 290X. Prices here in Norway is normally a little higher than it is in Sweden though
> 
> 
> 
> 
> 
> 
> 
> I do have some relatives in Sweden, so if it is cheaper there i will order in Sweden. What is the best retailer there? inet.se, komplett.se etc...


Hehe, I bought my 290 from Webhallen.com. I usually don't buy new stuff as I don't think it's worth it, lol, but this time I jumped because of the good price. All of them are good, though; I've bought many times from Komplett too, and inet is nice as well even though I haven't bought from them yet. The Accelero Xtreme III I bought from a private person; he bought it but did not use it, so I got it for 370 SEK instead of 550 SEK.


----------



## lordzed83

TomiKazi, well, the backplate gets warm, so it does do something besides looking good


----------



## PillarOfAutumn

Quote:


> Originally Posted by *TomiKazi*
> 
> So what is the consensus about EK backplates on the 290/X? Does it help cooling certain parts or is the money better spent on extra radiator space?


It won't add anything in terms of cooling. The most it will do is conduct the heat, but if you don't have any airflow, the heat really won't move anywhere. Even if you did have good airflow, it doesn't really change anything. 99% of the heat exchange will happen at the front with the water flow.

The only things good about the backplate are the looks and the rigid support.


----------



## rancor

Does anyone know if it is currently possible to overclock a 1440p monitor on the 290/290x? Thanks in advance.


----------



## jerrolds

Quote:


> Originally Posted by *rancor*
> 
> Does anyone know if it is currently possible to overclock a 1440p monitor on the 290/290x? Thanks in advance.


Yes.. my QNIX is at 120Hz on my 290X. The signal is actually cleaner than on my Matrix 7970; with that card I had some random scan lines at 120, so I had to run at 119Hz.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> I always get a backplate. For 28 bucks you get body armor for the pcb, it deflects screwdrivers and prying fingers, stuff falling on the pcb. It also gives you some defense against leaks, more time to react before fatality. They also help to prevent sag. And finally, sandwiching the vrms does help remove some latent heat.


It's posts like this that got me to buy the Fuji Extreme and CLU. Now, a backplate . . .









Any tips on how to spread the CLU? I was thinking of not brushing it all the way to the edges.


----------



## GioV

Quote:


> Originally Posted by *jerrolds*
> 
> Yes..my QNIX is at 120hz on my 290X. The signal is actually cleaner than my Matrix 7970, i had some random scan lines at 120...had to run at 119hz.


Did you use the modded drivers? When I tried overclocking my monitor with a 290x I would always get some type of black screen or some games would not work right.


----------



## jerrolds

Quote:


> Originally Posted by *GioV*
> 
> Did you use the modded drivers? When I tried overclocking my monitor with a 290x I would always get some type of black screen or some games would not work right.


Nope, just patch with ToastyX's ATI patcher, reboot, set the resolution/refresh in CRU, reboot and you're good to go. No monitor drivers needed.

Try LCD Reduced in CRU and go from there - I have timings in the Monitor thread for [email protected] pixel clock - the lowest I've seen.


----------



## bond32

I have those random lines on my xstar as well, dropped the clock down and they disappeared.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I always get a backplate. For 28 bucks you get body armor for the pcb, it deflects screwdrivers and prying fingers, stuff falling on the pcb. It also gives you some defense against leaks, more time to react before fatality. They also help to prevent sag. And finally, sandwiching the vrms does help remove some latent heat.
> 
> 
> 
> your posts like this that got me to buy Fuji extreme and CLU. now, backplate . . .
> 
> 
> 
> 
> 
> 
> 
> 
> 
> any tips how to spread the CLU? i was thinking of not brushing it all the way to the edges.

Hehe. It's the little things that tend to pay out the most when you are looking for that lil bit extra. But on the backplate, I really would not do it up without a plate. There's so much to gain, or not lose, from using one. And there have been plenty of times where I'm more sloppy than I should be and the drops hit the backplate. So easy, I just wipe it off lol, instead of freaking out that it's on the PCB.

Get some painters tape, or any decent masking tape. Mask off the die. Apply your CLU, etc., then remove the tape. Voila, no overage/excess putting your card at risk.


----------



## SonDa5

Quote:


> Originally Posted by *tsm106*
> 
> I always get a backplate. For 28 bucks you get body armor for the pcb, it deflects screwdrivers and prying fingers, stuff falling on the pcb. It also gives you some defense against leaks, more time to react before fatality. They also help to prevent sag. And finally, sandwiching the vrms does help remove some latent heat.


The EK 290X one also has a heat pad sandwiched directly on the GPU area.


----------



## tsm106

^^Oh nice! That EK cutout over the die on the 79xx series did leave a weakness as far as body armor went. The 290 backplate looks to fix that and one-up the cooling.


----------



## RAFFY

Quote:


> Originally Posted by *gooeyballzz*
> 
> Sorry, I'm being thick right now. I don't understand what that will do. Do I uncheck the Overdrive option in CCC and set the fan speed in Afterburner?


Well, if you have them both enabled then they are not going to work; both programs will be trying to control your GPU at the same time. Just uncheck the Overdrive option and stick with AB.
Quote:


> Originally Posted by *jerrolds*
> 
> Nope, just patch with ToastyX's ati patcher, reboot - set resolution/refresh in CRU, reboot and youre good to go. No monitor drivers needed.
> 
> Try LCD Reduced in CRU and go from there - i have timings in the Monitor thread for [email protected] pixel clock - the lowest ive seen.


Sorry for the noob post but I've been failing at overclocking my ASUS PB278Q monitor with my 290x's. This is the correct process right...
1) Run patcher
2) Enable test mode
3) Reboot
4) Open CRU create new profile with 61+ Hz
5) Apply Profile
6) Restart
7) Select profile in CCC?


----------



## Gilgam3sh

Quote:


> Originally Posted by *RAFFY*
> 
> Well if you have them both enabled then they are not going to work. Both programs will be trying to control your GPU at the time. Just uncheck the Overdrive option and stick AB.
> Sorry for the noob post but I've been failing at overclocking my ASUS PB278Q monitor with my 290x's. This is the correct process right...
> 1) Run patcher
> 2) Enable test mode
> 3) Reboot
> 4) Open CRU create new profile with 61+ Hz
> 5) Apply Profile
> 6) Restart
> 7) Select profile in CCC?


No need for test mode anymore: just patch the driver, reboot, run CRU and create a profile, reboot, done.
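Side note for anyone building custom profiles: before rebooting you can sanity-check the pixel clock a CRU profile will need, since it is just horizontal total × vertical total × refresh rate. A rough sketch in Python (the 2720×1481 reduced-blanking totals below are illustrative numbers, not the exact timings from the Monitor thread):

```python
# Rough pixel-clock sanity check for a custom CRU profile.
# The timing totals are illustrative CVT-reduced-blanking-style numbers,
# not exact monitor timings.

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz = total pixels per line * total lines * refresh."""
    return h_total * v_total * refresh_hz / 1_000_000

# 2560x1440 with reduced blanking: roughly 2720 total pixels per line,
# 1481 total lines
clock = pixel_clock_mhz(2720, 1481, 96)
print(f"{clock:.1f} MHz")  # about 386.7 MHz
```

If the result lands above what the patched driver allows, the profile will black-screen or fall back, so it's worth checking before the reboot dance.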


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> Hehe. It's the little things that tend to pay out the most when you are looking for that lil bit extra. But on the backplate, I really would not do it up w/o a plate. Theres so much to gain or not lose from using one. And there's been plenty of times where I'm more sloppy than I should be and the drops hit the backplate. So easy, I just wipe it off lol, instead of freaking out that its on the pcb.
> 
> Get some painters tape, or any decent masking tape. Mask off the die. Apply your clu, etc, then remove tape. Voila no overage/excess putting your card at risk.


So it is OK to paint the whole GPU's surface evenly, and it is not going to overflow once the block is installed tight? Thanks, tsm.

i saw Sonda's backplate and i must say - it looks smexy.


----------



## nemm

Quote:


> Originally Posted by *utnorris*
> 
> Two barbs and a piece of tubing. Don't make us wait!!!


Tried it, not enough space unfortunately, and I even tried a 90° fitting in the loop with no joy. Damned Gene motherboard; even though I like the board I will never buy this form factor again.

To those wondering about the PSU requirement, my AX850 failed when overclocking with upped volts on 290XF and running background programs. All seemed okay running the system at stock, but not with a 4.8GHz 3770K @ 1.35V and stock 290s. I was suffering shutdowns when trying to push them, so I never got any data and proceeded to break down the system and install a new PSU. It didn't occur to me before the 290/X release that the AX850 would be a problem, and I never planned on replacing it when I decided to eventually go the 290X route. So glad the 290 debuted at such a good price, because I was able to get 2x290 and a new PSU for the cost of 2x290X. There will be others that say 850W is enough, but in my case the AX850 failed unfortunately.
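For anyone doing the same PSU math, here's a back-of-envelope sketch. All the wattage figures are assumptions for illustration (not measurements from this rig), and the 80% budget is just a common rule of thumb for sustained load:

```python
# Back-of-envelope PSU headroom estimate. Component wattages are rough
# assumptions for illustration, not measured draws.

def psu_headroom(psu_watts, loads, margin=0.8):
    """Remaining watts after budgeting only `margin` of the PSU's rating
    for sustained load (leaves slack for spikes and capacitor aging)."""
    return psu_watts * margin - sum(loads.values())

system = {
    "2x R9 290 (overclocked)": 2 * 300,  # assumed ~300W each when pushed
    "overvolted 3770K":        180,      # assumed
    "board/RAM/drives/fans":   80,       # assumed
}

print(psu_headroom(850, system))   # -180.0: over the 80% budget
print(psu_headroom(1200, system))  # 100.0: fits with slack
```

With numbers like these, an 850W unit ends up past a comfortable sustained budget once both cards are overvolted, which lines up with the shutdowns described.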


----------



## RAFFY

Quote:


> Originally Posted by *Gilgam3sh*
> 
> no need for test mode anymore, just patch the driver, reboot, run CRU creat profile, reboot, done.


OK cool, and then after the final reboot I can go into CCC to confirm that I'm actually at that refresh rate, correct?


----------



## ProdigalGenius

I need help with this card.. I have no signal!! First it was with a Sapphire 290X, sent back because I thought it was defective.. now I have an XFX 290, same thing, no signal output. I have a Corsair HG 900W. I tried moving it to different PCIe slots (Sabertooth Z77).. no result..


----------



## Gilgam3sh

Quote:


> Originally Posted by *RAFFY*
> 
> Ok cool and then after the final reboot I can go into CCC to confirm that I'm actually at that hertz correct?


First you need to change the Hz in the Windows settings.


----------



## jerrolds

Quote:


> Originally Posted by *RAFFY*
> 
> Ok cool and then after the final reboot I can go into CCC to confirm that I'm actually at that hertz correct?


You should have them selectable in the normal Windows resolution screen. I typically just run 2 files, ATIPatcher.exe or whatever... make sure you patch the values (if it's already patched and still doesn't work, try restoring them and starting over).

After rebooting, setup your resolution/refresh in CRU and reboot again

You should be able to select resolution/refresh in windows normally.


----------



## youra6

Just noticed this, and I'm pretty ticked off. On the back of the MSI card, there is a sticker covering one of the screws, saying *void if removed*, essentially used to discourage any modifications.

From what I understood when I read the warranty ToC, MSI allows modifications to the card as long as the card is returned with the stock cooler on.

I'm going to see if I can peel off the sticker first.


----------



## Forceman

Quote:


> Originally Posted by *youra6*
> 
> Just noticed this, and I'm pretty ticked off. On the back of the MSI card, there is a sticker covering one of the screws, saying :*void if removed* -essentially used to discourage any modifications.
> 
> From what I understood when I read the warranty ToC, MSI allows modifications to the card so as long as the card is returned with the stock coolers on.
> 
> I'm going to see if I can peel off the sticker first.


Try lifting it with the edge of a razor blade. That worked with my XFX stickers. Just don't put the removed sticker down on a flat spot of the PCB - much harder to get off.


----------



## SonDa5

Quote:


> Originally Posted by *nemm*
> 
> Tried it, not enough space unfortunately and I even tried a 90* fitting a loop with no joy. Damned Gene motherboard and even though I like the board I will never buy this form factor again.
> 
> To those wondering about the psu requirement, my AX850 failed when overclocking with upped volts on 290XF and running background programs. All seemed okay running system at stock but not 4.8 3770k @1.35v and stock 290s. I was suffering shutdowns when trying to push them so never got any data and proceeded to breakdown system and install new psu. It didn't occur to me before 290/x release 850x would be a problem and never planned on replacing when I decided to eventually go 290x route. So glad the 290 debut at such a good price because I was able to get 2x290 and new psu for cost of 2x290x. There will be others that say 850w is enough but in my case the ax850 failed unfortunately.


Stock bios on the card?


----------



## Lennyx

Quote:


> Originally Posted by *ProdigalGenius*
> 
> i need help with this card..i have a no signal!! 1st it was with sapphire 290x sent back coz i thought it was defective..now i have xfx 290..same thing no signal output.i have corsair HG 900w. i tried moving it in different pcie slot (sabertooth z77)..no result..


I had the same problem. I'm so tired right now I don't remember exactly what I did.
But yeah, I made sure the PCIe cables were pushed in properly, then I pushed down the card with a little bit of force, turned off my PSU and removed the PSU cable.
Well, I don't know, but it suddenly worked and I had signal.


----------



## SonDa5

Quote:


> Originally Posted by *rdr09*
> 
> i saw Sonda's backplate and i must say - it looks smexy.


----------



## SonDa5

Quote:


> Originally Posted by *Lennyx*
> 
> I had the same problem. Im so tired right now i dont remember exactly what i did.
> But ye i made sure the pci-e cabels where pushed in properly then i pushed downt he card with a little bit of force. turned off my psu and removed the psu cable.
> Well i dont know but it suddenly worked and i had signal.


I had the same problem. The card was not seated all the way in the PCIe slot.


----------



## RAFFY

Quote:


> Originally Posted by *SonDa5*


Damn those look great. When I switch to water I will be purchasing the plates.


----------



## Lennyx

Quote:


> Originally Posted by *SonDa5*


Looks awesome. Too bad I can't use a backplate on my mobo.


----------



## jerrolds

Interesting... I actually have extra acrylic lying around.. maybe I can just cut and spray paint my own backplate to add rigidity and looks...


----------



## youra6

Screw it, I'm doing it. The razor blade didn't work; I've got stones for hands. As long as this clause remains on the MSI website, I should be fine:

Quote:


> The product MUST be returned to MSI in the original factory configuration and condition. All aftermarket modifications must be reversed prior to sending in the product for repair or replacement.


----------



## nemm

Quote:


> Originally Posted by *SonDa5*
> 
> Stock bios on the card?


I was running the Asus 290 BIOS for unlocked voltage.


----------



## SonDa5

Quote:


> Originally Posted by *Lennyx*
> 
> Looks awesome. To bad i cant use a backplate on my mobo


What do you mean?

The back plate on 290x should work with any mb. Mine is in a mini-ITX mb and the rigidity is a must to prevent sagging.


----------



## ProdigalGenius

Quote:


> Originally Posted by *Lennyx*
> 
> I had the same problem. Im so tired right now i dont remember exactly what i did.
> But ye i made sure the pci-e cabels where pushed in properly then i pushed downt he card with a little bit of force. turned off my psu and removed the psu cable.
> Well i dont know but it suddenly worked and i had signal.


I'll be doing that.. because I'm so tired right now.. I had to pull out everything and messed up my cable management to make sure it wasn't the PSU, because I believe it requires just 500W, and 800-900W for CrossFire.

I'll update and let you know.. thanks!!!


----------



## SonDa5

Quote:


> Originally Posted by *nemm*
> 
> I was running Asus290 bios for unlocked voltage


You mean the 290X?

I had problems with the PT3 Asus BIOS and found a stock Asus 290X BIOS that worked a lot better. Be careful which BIOS you try.

660W PSU here with no problems when overclocking with a 4770K and 290X. Had problems with a bad BIOS, not the PSU.


----------



## TomiKazi

Quote:


> Originally Posted by *SonDa5*


...
This pretty much settles the backplate question for me.


----------



## rdr09

Quote:


> Originally Posted by *SonDa5*


----------



## Strata

My card seems to be stuck at 100% in AB even though the rest of my PC is idle. Any thoughts? Last night it was idling normally, and 92°C at idle is not OK.


----------



## jerrolds

Did you plug the card in with the 2 6+8pin power connector things?


----------



## brazilianloser

Man, good thing I won't have to work until Monday after today's shift... I need to reinstall Windows and all my games fresh, since something got corrupted while installing drivers for the 290... Hopefully in the process I can get rid of the black screen crashes as well. None of my benchmarks work other than Valley... Such an annoyance.


----------



## Lennyx

Quote:


> Originally Posted by *SonDa5*
> 
> What do you mean?
> 
> The back plate on the 290X should work with any mb. Mine is in a mini-ITX mb and the rigidity is a must to prevent sagging.


Maybe I was just tired and didn't pay attention when I tried to install the GTX 670 I have with a backplate on the Asus Gryphon.
On my m-ITX board the GTX 670 with the backplate worked fine.


----------



## nemm

Quote:


> Originally Posted by *SonDa5*
> 
> You mean 290x?
> 
> I had problems with the pt3 Saussure bios and found a stock asus 290x bios and it worked a lot better. Be careful which bios you try.
> 
> 660w PSU here with no problems when over clocking with 4770k and 290x. Had problems with bad bios not PSU.


At the time it was the ASUS 290 (non-X) BIOS in CrossFire; a single card was fine, as you would expect. The PT3 290X crossflashed to 290 was problematic; PT1 was fine though.


----------



## Bloitz

Got a mail today: payment has been received, but the EK FC block isn't in stock anymore (it was in stock when I placed the order). So bummed out they couldn't reserve it for 2 bloody days while waiting for payment; now I have to wait till the 15th of November.


----------



## utnorris

Quote:


> Originally Posted by *nemm*
> 
> Tried it, not enough space unfortunately and I even tried a 90* fitting a loop with no joy. Damned Gene motherboard and even though I like the board I will never buy this form factor again.
> 
> To those wondering about the psu requirement, my AX850 failed when overclocking with upped volts on 290XF and running background programs. All seemed okay running system at stock but not 4.8 3770k @1.35v and stock 290s. I was suffering shutdowns when trying to push them so never got any data and proceeded to breakdown system and install new psu. It didn't occur to me before 290/x release 850x would be a problem and never planned on replacing when I decided to eventually go 290x route. So glad the 290 debut at such a good price because I was able to get 2x290 and new psu for cost of 2x290x. There will be others that say 850w is enough but in my case the ax850 failed unfortunately.


I have the Gene too. My 2 x 290s will be here Monday. Looks like I will need to break out my 1200W PSU, but I am going to try it on my Seasonic Platinum 860 first. Fully modular, and the case that it will eventually be going in will not take my 1200 easily.


----------



## estens

What are people's VRM temps under water?
Currently running stock volts, and the core hits 42°C, which I am happy with, but the highest VRM temp sensor reads 72°C under load.


----------



## ProdigalGenius

Quote:


> Originally Posted by *SonDa5*
> 
> I had the same problem. The card was not seated all the way in the PCIe slot.


Probably retry later; my fingers are effing red from pulling, pushing and screwing... The PCIe contact on the card has an uber thick space before the PCIe contact..


----------



## Arm3nian

Do backplates come into stock often?


----------



## ProdigalGenius

Quote:


> Originally Posted by *jerrolds*
> 
> Did you plug the card in with the 2 6+8pin power connector things?


Plugged it in with a single 6+2 PCIe cable and a separate 6-pin PCIe cable, not from the extended cable of the 6+2, to make sure it has power.


----------



## ProdigalGenius

By the way, I never had these problems with my previous Sapphire 280 Toxic. I returned it after the 290s came out.


----------



## givmedew

Quote:


> Originally Posted by *flopper*
> 
> Quote:
> 
> 
> 
> Originally Posted by *givmedew*
> 
> So what the way to go right now?
> 
> I am going to be water cooling this thing... Pretty much set on a 290 non x version. Does it really matter what card I get? Should I wait for any specific vendors to release a non reference card?
> 
> 
> 
> no all same cards different stickers, check the warranty etc..that you want.

So is it worth waiting for non-reference cards? I will be water cooling, so I don't care about the fan.


----------



## RAFFY

Quote:


> Originally Posted by *utnorris*
> 
> I have the Gene too. My 2 x 290's will be here Monday. Looks like I will need to break out my 1200watt PSU, but I am going to try it on my Seasonic Platinum 860 first. Fully modular and the case that it will eventually be going in will not take my 1200 easily.


I have the XP2 model of your PSU and I'm not having any issues at all with 290Xs in CrossFire and a 4770K.


----------



## Sammyboy83

Just got my Gigabyte 290 today. Tested with cold Norwegian air and water cooling: 1160/1525 on the stock BIOS and voltage, and Heaven ran without artifacts. ASIC is 79.9%. The final max temp was 28 degrees. Forgot to reset the VRM temps, but they were around 22-30 degrees. As you can see, the CPU temps in the right corner are pretty low also.


----------



## DampMonkey

Quote:


> Originally Posted by *estens*
> 
> What are peoples vrm temps under water?
> Currently running stock volts, and core hits 42c wich I am happy with, but higest vrm temp sensor reads 72c under load.


Under my highest overclock (1320/6000), I never saw them go above 65°C with the EK block during benches.


----------



## estens

Guess I screwed up putting the block on then


----------



## chrisf4lc0n

Can I join in?








Proof


----------



## VSG

lol do you literally have a fan blowing air onto the GPU?


----------



## DampMonkey

Quote:


> Originally Posted by *estens*
> 
> Guess I screwed up putting the block on then


Or your loop could be a little warmer than mine. Are you sure you used the correct thickness of thermal pads for the VRMs? I almost messed that up.


----------



## Sammyboy83

I saw 52 degrees on the VRM1 temp when the core temp was 43. So you may need to reseat your block.


----------



## DampMonkey

Quote:


> Originally Posted by *geggeg*
> 
> lol do you literally have a fan blowing air onto the GPU?


Looks like its blowing on his ram


----------



## Falkentyne

I posted a few pages back but I guess it got lost in all the posts....

Are people having black screen problems (when NOT overclocking with Overdrive in CCC) using Hynix or Elpida memory?
Is it only in windowed/borderless mode, or full screen too?

I haven't seen a single report of Hynix black screens yet.
Does downclocking the memory to 1000MHz (with AB) fix all the black screens?


----------



## utnorris

Quote:


> Originally Posted by *estens*
> 
> What are peoples vrm temps under water?
> Currently running stock volts, and core hits 42c wich I am happy with, but higest vrm temp sensor reads 72c under load.


I was hitting 60-70°C with 1.41V. I am assuming I did not get the block mounted correctly and will check once I get my 290s in.
Quote:


> Originally Posted by *Arm3nian*
> 
> Do backplates come into stock often?


PPCS told me they should have some next week. I need one more.
Quote:


> Originally Posted by *givmedew*
> 
> So is it worth waiting for non reference cards? I will be water cooling so I don't care about the fan.


If you are water cooling, get reference ones, since that is typically what the water blocks fit.
Quote:


> Originally Posted by *estens*
> 
> Guess I screwed up putting the block on then


I understand your frustration.


----------



## DampMonkey

Quote:


> Originally Posted by *Falkentyne*
> 
> I posted a few pages back but I guess it got lost in all the posts....
> 
> Are people having black screen problems (when NOT overclocking with overdrive in CCC) using HYNIX or ELPIDA memory?
> Is it only windowed/borderless/ or full screen?
> 
> I haven't seen a single report of Hynix blackscreens yet.
> Does downclocking to 1000 MHz on memory (with AB) fix all the black screens?


The only time I've ever seen a black screen flash is from using the PT1 or PT3 BIOSes and clock speeds upwards of 1250MHz. Depends on the app as well.


----------



## tsm106

Re-ran FSE on 13.8, which is the driver I happen to be running. I gained 1k over 13.6, lolzers. The score is good enough for 5th spot overall in the HoF, and the graphics score beats all comers except for one.

http://www.3dmark.com/3dm/1585448


----------



## chrisf4lc0n

Quote:


> Originally Posted by *DampMonkey*
> 
> Under my highest overclock (1320/6000), i never saw them go above 65*C with the EK block during benches


How did you manage to achieve that? My max is 1150MHz GPU and 1400MHz VRAM; above that it either crashes or at least I see artifacts....


----------



## utnorris

Quote:


> Originally Posted by *Sammyboy83*
> 
> 
> 
> Just got my gigabyte 290 today. Tested with cold Norwegian air and watercooling, 1160/1525 stock bios and voltage, heaven ran without artifacts. Asic is 79.9%. The final max temp was 28 degrees. Forgot to reset vrm temps, but they were around 22-30 degrees. As you can see, the cpu temps on the right corner is pretty low also.


I am hoping my 290's overclock like this. Two will be monster like.


----------



## DampMonkey

Quote:


> Originally Posted by *tsm106*
> 
> Re-ran FSE on 13.8 which is the driver I happen to be running. I gained a 1k over 13.6 lolzers. Score is good enough for 5th spot overall HoF, and the graphics score beats all comers except for one.
> 
> http://www.3dmark.com/3dm/1585448


13.8? Why such an old driver







Did you see improvements in any other cases?


----------



## DampMonkey

Quote:


> Originally Posted by *chrisf4lc0n*
> 
> How did you managed to achieve that? My max is 1150MHz GPU and 1400MHz VRAM above that it either crashes or at least I see artefacts....


PT3 bios, wouldn't recommend it honestly


----------



## Sammyboy83

Quote:


> Originally Posted by *utnorris*
> 
> I am hoping my 290's overclock like this. Two will be monster like.


Normally I will get artifacts when the core hits 1140MHz, but with cold air the core will give me 20MHz extra. I will flash the BIOS tomorrow, and I really hope I can get a decent OC on the 290. But the stock voltage is pretty low; the highest point was 1.125V.

Edit: My desktop will start dancing salsa when I set the memory over 1525MHz, but when benching even 1625MHz goes smoothly. Anybody know why this is happening?


----------



## tsm106

Quote:


> Originally Posted by *DampMonkey*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Re-ran FSE on 13.8 which is the driver I happen to be running. I gained a 1k over 13.6 lolzers. Score is good enough for 5th spot overall HoF, and the graphics score beats all comers except for one.
> 
> http://www.3dmark.com/3dm/1585448
> 
> 
> 
> 13.8? Why such an old driver
> 
> 
> 
> 
> 
> 
> 
> Did you see improvements in any other cases?

Why the







?

Anyways, it's the beta 8 driver, which I wrote incorrectly as 13.8.


----------



## DampMonkey

Quote:


> Originally Posted by *tsm106*
> 
> Why the
> 
> 
> 
> 
> 
> 
> 
> ?
> 
> Anyways, it's the beta 8 driver which I wrote in correctly as 13.8.


I thought you meant 13.8, the original frame pacing driver, which came out months ago. I was rolling my eyes, wondering why a driver that doesn't support Hawaii would perform better than supporting drivers. AMD drivers at work!


----------



## tsm106

In the future, if someone supplies the 3DMark link, you can simply scroll down and see the listed driver.


----------



## LazarusIV

Quote:


> Originally Posted by *$ilent*
> 
> Depends on the game. On bf3 with everything max and 4xAA my average fps is around 60. If you want max fps with max settings youd need 2 290. Turn aa off and you prob get 90fps with a single 290.


I'm still waffling on this decision... my latest question involves my power supply. I'd gladly get one R9 290X, watercool it and OC it as much as I can to get >60 fps on an OC'd QNIX 2560x1440 monitor; since I've got a 750W PSU, that shouldn't be an issue. If I can't get those framerates on that monitor then I'd get 2 R9 290s, watercool them and then OC them, but I don't think my 750W PSU can handle that. I'd really rather not drop $200 on a new PSU when mine works perfectly. Will I be able to get >60 fps in BF3/BF4, Skyrim, Civ V, Borderlands 2, Planetside 2, Metro, etc. with an OC'd 290X? I don't play a ton of really demanding games I suppose; I don't play any of the Crysis games. What would you guys suggest? R9 290X, watercool, OC and take what I get, or balls-out CrossFire R9 290s with a new PSU?


----------



## jerrolds

Quote:


> Originally Posted by *LazarusIV*
> 
> I'm still waffling on this decision... my latest question involves my power supply. I'd gladly get one R9 290X, watercool it and OC it as much as I can to get >60 fps on an OCd Qnix 2560X1440 monitor, since I've got a 750W psu that shouldn't be an issue. If I can't get those framerates on that monitor then I'd get 2 R9 290s, watercool them and then OC them but I don't think my 750W psu can handle that. I'd really rather not drop $200 on a new psu when mine works perfectly. Will I be able to get >60 fps for BF3 / BF4, Skyrim, Civ V, Borderlands 2, Planetside 2, Metro, etc with an OCd 290X? I don't play a ton of really demanding games I suppose, I don't play any of the Crysis games. What would you guys suggest? R9 290X, watercool, OC and take what I get or balls out crossfire R9 290s with a new PSU?


I run BF4 on High everything, no AA, and HBAO - I get about 85-90fps outside and 110+ indoors. This is at 1100/1400, in campaign mode. Not sure how it is in multi, but at that point I'm guessing it's CPU bottlenecked.


----------



## chrisf4lc0n

Quote:


> Originally Posted by *DampMonkey*
> 
> PT3 bios, wouldn't recommend it honestly


So using atiflash and the PT3 BIOS on a good WC system I could do 1300/[email protected]??








Should I do it or not??? It is a brand new card...
Why did I even look at this thread, aaaaaa!
How many times has chrisf4lc0n flashed something which was working and then it stopped; but on the other side, how many times has chrisf4lc0n flashed something and it worked better?

You should be called DevilMonkey, not DampMonkey!


----------



## Mr357

Quote:


> Originally Posted by *jerrolds*
> 
> I run BF4 on High everything, No AA, and HBAO - i get about 85-90fp outside and 110+ indoors. This is at 1100/1400. This is campaign mode. Not sure how it is in multi, but at that point im guessing its CPU bottlenecked.


At max settings (1080p sadly) I get 70+ fps in multiplayer.


----------



## LazarusIV

Quote:


> Originally Posted by *jerrolds*
> 
> I run BF4 on High everything, No AA, and HBAO - i get about 85-90fp outside and 110+ indoors. This is at 1100/1400. This is campaign mode. Not sure how it is in multi, but at that point im guessing its CPU bottlenecked.


So it sounds like I could go the R9 290X route with a waterblock, keep my power supply and call it good on my OC'd 2560x1440 monitor. I'll run BF3 and BF4 on High if I have to; I don't really see a big difference between High and Ultra anyway. Plus I can tweak the settings as needed to play on Ultra and still get great framerates, I think. Anyone else want to chime in?

EDIT: Though now that I'm catching up on this thread, it seems like Raffy's running 2 x R9 290X on an 860W PSU and having no issues, so maybe I'd be alright running 2 x R9 290s on my PSU with a mild overclock on stock voltage... I think that might be it.


----------



## tsm106

Quote:


> Originally Posted by *chrisf4lc0n*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DampMonkey*
> 
> PT3 bios, wouldn't recommend it honestly
> 
> 
> 
> So using the atiflash and PT3 BIOS on a good WC system I could do 1300/[email protected] ??
> 
> 
> 
> 
> 
> 
> 
> 
> Should I do it or not??? It is a brand new card...
> Why did I even look at this thread aaaaaa!
> How many times chrisf4lc0n flashed something which was working and then stopped, but on the other side how many times chrisf4lc0n flashed something and it worked better.
> 
> You should be called DevilMonkey not DampMonkey!

Go on ahead, all you. Learn the hard way.


----------



## Spectre-

Quote:


> Originally Posted by *tsm106*
> 
> Go on ahead, all you. Learn the hard way.


Please correct me if I am wrong, but I have heard that BIOS flashing bricks cards :L


----------



## Dominican

I keep getting red screens, not fun.


----------



## bpmcleod

Is there a way to run atiflash without a USB drive? The only guide I have found for ATIFlash uses a thumb drive, which I don't currently have on me. I can get mine but would love to do without -.-

Edit: Mmmm, can't wait to get this card under water with a 67.9 ASIC


----------



## Sgt Bilko

Well, looks like I'm not getting my card under water anytime soon. I did some playing around looking at the various parts etc and I'm looking at around $1k for the dual loop system that I want (1 for CPU and 1 for GPU(s)).

I'm putting that on the backburner atm, need to take the wife out to dinner first or something I think


----------



## PillarOfAutumn

In order to squeeze out more frames in BF4, something else you can do is decrease motion blur. I seriously don't know why they put that in FPS games. It's fine for driving or flight games, but IMO it has no place in an FPS.

Also, for those of you using EK Nickel Acetal and have copper in your loops, what are you using for the liquid and biocide?


----------



## estens

I think I used the right thermal pads; the precut ones are for the RAM chips, right? It's an EK block.

Water in my loop gets up to about 29c, so I guess I have to reseat the block tomorrow and see..


----------



## SLADEizGOD

Quote:


> Originally Posted by *DampMonkey*
> 
> And i thought the card couldn't get any uglier. Bravo, XFX


lmao.. yup, you're right.


----------



## Strata

Quote:


> Originally Posted by *jerrolds*
> 
> Did you plug the card in with the 2 6+8pin power connector things?


No, my Seasonic PSU had native 6+2 connectors


----------



## Gilgam3sh

so does it give anything in games when overclocking the memory??


----------



## Asrock Extreme7

CPU holding you back. I get 100+ all Ultra, 4xAA on BF4 MP @ 1080p / 1000 core, 1250 mem


----------



## Asrock Extreme7

Vdroop is bad on the ASUS BIOS: set 1.410v, GPU Tweak says 1.267v max


----------



## Falkentyne

If you guys are still getting black screens in windowed or borderless mode and are NOT using overdrive/CCC to overclock, you can get the memory info tool here still:
http://www.mediafire.com/?voj4j1rlk0ucfz4

or just click here.

MemoryInfo 1005.zip 1101k .zip file


----------



## Gilgam3sh

Quote:


> Originally Posted by *Falkentyne*
> 
> If you guys are still getting black screens in windowed or borderless mode and are NOT using overdrive/CCC to overclock, you can get the memory info tool here still:
> http://www.mediafire.com/?voj4j1rlk0ucfz4
> 
> or just click here.
> 
> MemoryInfo 1005.zip 1101k .zip file


I got Elpida, good or bad?

http://imageshack.us/photo/my-images/11/na5w.jpg/


----------



## Jpmboy

Sapphire R9 290X arrived - plugged in air cooled to make sure it's okay...

Sign me up!


just breaking in to 10400 range on Firestrike

and my Titans sit by while I dance with another chick











could use a few pointers on OC this thing...


----------



## stilllogicz

Well, I've come to the realization that there most likely won't be a 6gb 780 Ti. I've owned a 7970 lightning in the past and loved it. I'm thinking about making the switch to the 290x due to the competitive performance and the fact it has 4gb of ram. Out of all the reference models, which would be the best to buy with watercooling in mind? I've heard some models don't have unlockable voltage control, etc.


----------



## Gilgam3sh

Quote:


> Originally Posted by *stilllogicz*
> 
> Well, I've come to the realization that there most likely won't be a 6gb 780 Ti. I've owned a 7970 lightning in the past and loved it. I'm thinking about making the switch to the 290x due to the competitive performance and the fact it has 4gb of ram. Out of all the reference models, which would be the best to buy with watercooling in mind? I've heard some models don't have unlockable voltage control, etc.


No card is hardware voltage locked, it's only the BIOS, so just pick any card (not XFX, it comes with ugly warranty stickers), but the ASUS comes with an unlocked BIOS


----------



## Jpmboy

Is there a set of bios' for the r290x somewhere? OP - any pointers?


----------



## Bull56

Hey

I cannot find the links at ocn for the PT3 BIOS anymore...
Anyone can post the link to the page for me?


----------



## Gilgam3sh

Quote:


> Originally Posted by *Jpmboy*
> 
> Is there a set of bios' for the r290x somewhere? OP - any pointers?


Here you have the ASUS 290X unlocked BIOS, if that's what you're after.

http://forums.overclockers.co.uk/showthread.php?t=18552408


----------



## DampMonkey

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> In order to squeeze out more frames in bf4, something else you can do is decrease motion blur. I seriously don't know why they put that in FPS games. Its fine for driving or flight games, but IMO it has no place in fps.
> 
> Also, for those of you using EK Nickel Acetal and have copper in your loops, what are you using for the liquid and biocide?


Distilled from the grocery store and a killcoil


----------



## DampMonkey

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well, looks like i'm not getting my card under water any-time soon, i did some playing around looking at the various parts etc and i'm looking at around $1k for the dual loop system that i want (1 for CPU and 1 for GPU/'s)
> 
> I'm putting that on the backburner atm, need to take the wife out to dinner first or something i think


Dual loop system wont see any gains btw. You can save a little money and effort by just doing a single loop


----------



## Falkentyne

Quote:


> Originally Posted by *Gilgam3sh*
> 
> I got Elpida, good or bad?
> 
> http://imageshack.us/photo/my-images/11/na5w.jpg/


I have no idea, but so far, on here and hardforum, everyone who has had black screen problems in windowed mode has been using Elpida (I haven't had a single report from anyone using Hynix).

I haven't had one myself using Hynix.

The *ONLY* time I got a black screen (that was not a VPU recovery crash - e.g. running Valley at 1125 MHz causes the demo to freeze, then it black screens, but you CAN ctrl-alt-delete or alt-tab to end the frozen program) was when I used CCC to set the memory to 1500 MHz and then rebooted... instant black screens on the -desktop- as soon as ANY sort of mouse or 2d activity began.

These black screen hard locks are 95% certain to be MEMORY (or IMC) related. Black screening due to an app hanging from a VPU recovery (which usually does NOT bring the system down) = core clock - big difference.

There was a post where someone flashed their BIOS with the Asus (NOT PT3) rom and then THEY had black screens too... memory timing differences?


----------



## HoZy

Please update me to water, Ordered from EK web store, I'm in Australia. Ordered on Monday 1pm Melbourne time, Received Friday 3pm









1100mhz Core, 1250 Memory. (5000mhz)

These things run so cool, 39c max on Furmark with a 21c ambient.


----------



## Sgt Bilko

Quote:


> Originally Posted by *DampMonkey*
> 
> Dual loop system wont see any gains btw. You can save a little money and effort by just doing a single loop


More for aesthetics, and I want dual rads anyway, so in my mind the tubing ends up being simpler with a dual loop as opposed to a single one.


----------



## alancsalt

Quote:


> Originally Posted by *HoZy*
> 
> Please update me to water, Ordered from EK web store, I'm in Australia. Ordered on Monday 1pm Melbourne time, Received Friday 3pm
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1100mhz Core, 1250 Memory. (5000mhz)
> 
> These things run soo cool, 39c Max on furmark with a 21c ambient.
> 
> 
> Spoiler: Warning: Spoiler!


Amazing how quick they are. I can't criticize them for shipping speed.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well, looks like i'm not getting my card under water any-time soon, i did some playing around looking at the various parts etc and i'm looking at around $1k for the dual loop system that i want (1 for CPU and 1 for GPU/'s)
> 
> *I'm putting that on the backburner atm, need to take the wife out to dinner first or something i think
> 
> 
> 
> 
> 
> 
> 
> *


But the more you treat her, the less money you'll have for your build


----------



## xxmastermindxx

Quote:


> Originally Posted by *HoZy*
> 
> Please update me to water, Ordered from EK web store, I'm in Australia. Ordered on Monday 1pm Melbourne time, Received Friday 3pm
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1100mhz Core, 1250 Memory. (5000mhz)
> 
> These things run soo cool, 39c Max on furmark with a 21c ambient.


That is a disgustingly low load temperature, especially in Furcrap. Grats!









Also, for everyone out there considering PT3 BIOS, I'd avoid it, honestly. It caused me nothing but issues. Completely stock and air cooled I was able to run 1150/1450, stable in BF4 and multiple 3DMark runs. Go to PT3, and I'd get black screens if I even looked at my damn monitor. Voltage did absolutely nothing for me. Flashed back to stock, and right back to stable 1150/1450 runs.

I sold the card and have two 290's incoming, but just thought I'd share my experience. I'll be waiting for full Afterburner support.


----------



## Sgt Bilko

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> But the more you treat her, the less money you'll have for your build


While that would be true, it's worth it in the long run.........she doesn't know i have a 290x yet


----------



## kcuestag

I wish I could afford a 2nd 290X.









Probably not happening any time soon (Maybe in January if lucky







).


----------



## cyenz

After 3 hours of heavy fighting against the Accelero Hybrid... it is done.

I will never again put an Accelero Hybrid on any VGA, terrible experience









But what counts are the results, and I could not be happier.

59C after one hour of BF4 at stock clocks.

If it wasn't for the pain that was mounting the goddam thing I would recommend it to anyone.

Here is the pic, sorry for the terrible cable management, but after 3 frustrating hours all I wanted was for everything to be finished.


----------



## rdr09

Quote:


> Originally Posted by *jomama22*
> 
> Alright, I squeezed her a bit harder in valley...
> 
> 
> 
> 1320/1700!!! So the memory can indeed get up there. Seems to be very dependent on core speed and voltage.
> 
> 82.1 FPS...Highest 290x i have seen on air/water.


the 780 Ti is a beast 1158 core . . .



edit: Valley is indeed reading the core speed incorrectly.


----------



## Arm3nian

Quote:


> Originally Posted by *rdr09*
> 
> the 780 Ti is a beast 1400 core . . .
> 
> if Valley is reading the speed right.


Valley has always been nvidia favored. Valley also likes memory clocks, with which the 290 series is unstable right now past a certain point, probably due to the BIOS.


----------



## evensen007

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well, looks like i'm not getting my card under water any-time soon, i did some playing around looking at the various parts etc and i'm looking at around $1k for the dual loop system that i want (1 for CPU and 1 for GPU/'s)
> 
> I'm putting that on the backburner atm, need to take the wife out to dinner first or something i think


Let me save you some money. I went from dual-loop to single, and the temp difference was ~nil. I have a single loop with a 3x360 rad in push/pull and an extra single rad in pull, and my temps are VERY close to what they were when I had 2 closed loops.

Nevermind. Just saw you were doing it for build reasons.


----------



## surrealalucard

So I just started having a problem after installing the 290x about a week or so ago, if I am watching a video or even sometimes when I am playing a game, the computer pretty much freezes.

I click on something and it takes like 20 seconds for it to move a window, I can't click on any icons, ctrl alt del brings up the screen quick but clicking task manager takes about 30 seconds to switch back to desktop and bring up task manager. The task manager is frozen, not showing any kind of activity on the cpu usage etc.

It kind of sounds like the display driver stops, and this happens even when it's not OC'd.

specs are:
AMD FX-8350(just bought this and installed when I got the 290x as well)
MSI R9 290x x1 BF4 edition (Latest Drivers Installed)
8 gbs Gskill memory
Windows 8.1


----------



## tsm106

Quote:


> Originally Posted by *Falkentyne*
> 
> I have no idea, but so far, on here and hardforum, everyone who has had black screen problems in windowed mode has been using Elpida, so far (I havent' had a single feedback from anyone using Hynix).
> 
> I haven't had one using Hynix.
> 
> The *ONLY* time I got a black screen (that was not a vpu recovery crash, e.g. running Valley at 1125 MHz causes the demo to freeze then it black screens but you CAN control alt delete or alt tab to end the frozen program), was when I used CCC to set the memory to 1500 MHz and then rebooted...instant black screens on the -desktop- as soon as ANY sort of mouse or 2d activity began.
> 
> These black screen hard locks are 95% certain to be MEMORY (or IMC) related. Black screening due to app hanging from a VPU recovery (this usually does NOT bring the system down)=core clock--big difference.
> 
> There was a post where someone flashed their bios with the Asus (NOT PT3) rom and then THEY had black screens too...memory timing differences?


It happens to both. My Sapphire with Hynix blackscreened after using the PT3 BIOS. We don't know enough and we don't have proper volt control. Ranger's 290X blackscreened on the PT3 BIOS also, and his has Elpida. It doesn't matter the brand imo.


----------



## brazilianloser

Blackscreen galore here at my end... No matter the game... Card is not even passing the 75c mark and it crashes...


----------



## sugarhell

Black screen=memory


----------



## brazilianloser

But I am not doing anything to the card other than stock settings... It's running cool enough to make most jealous... Why is the memory crashing? Probably faulty, which means RMA for me... Sucks big time.


----------



## Falkentyne

Quote:


> Originally Posted by *tsm106*
> 
> It happens to both. My Sapphire with hynix blackscreened after using the pt3 bios. We don't know enough and we don't have proper volt control. Rangers 290x blackscreened on the pt3 bios also and his has elpida. It doesn't matter the brand imo.


The PT3 BIOS is bad news.

The black screen issue I'm talking about is for people who are not even overclocking in the first place, not for those using modified BIOSes.
There may be timing differences between the Elpida and Hynix, which is what I'd like to find out.

So far, everyone who has had black screens without flashing a custom BIOS or using CCC to overclock has had Elpida GDDR5.


----------



## Falkentyne

Quote:


> Originally Posted by *brazilianloser*
> 
> But if I am not doing anything to the card other than stock settings... It's running cool enough to make most jealous... Why the memory crashing? Probably faulty which means rma for me... Sucks big time.


What memory is on your card? Elpida or Hynix? (use the memory tool I uploaded a few pages back in my post).


----------



## jomama22

I got plenty of black screens on the PT3 but also on stock and Asus. It's a matter of both core voltage and memory clock. Voltage itself has a tendency to create this issue if you don't give it enough juice OR if it's given too much. Artifacting only really occurred on 3 of the 6 cards while testing, which made it more difficult to pinpoint the exact issue.

It may suck, but I noticed that even with no artifacts, you can still have an unstable clock speed once you get the black screen. 90% of the time I need to lower clocks, but lowering both voltage and clocks or just raising voltage has helped.

I'll say, I didn't have an issue getting any of the 6 cards stable over 1215mhz, that is, with no black screens or artifacts. I had cards drop as much as 30 MHz from black screening to not.

For those running purely stock, that's cause for RMA in my mind, unless you are OK with upping the voltage on your own.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *DampMonkey*
> 
> Distilled from the grocery store and a killcoil


Alright cool. That's what I'm planning on doing. The copper and silver won't corrode the nickel plating, right? And should I put in some anti corrosive to protect the metals? Do you think it's necessary?


----------



## Mr357

Quote:


> Originally Posted by *Falkentyne*
> 
> The PT3 bios is bad news.
> 
> The black screen issue I'm talking about are for people who are not even overclocking in the first place, not for those using modified bioses.
> There may be timing issues on the Elpida vs Hynix, which is what I'd like to find out.
> 
> So far, everyone who has had black screens with NOT flashing a custom bios OR using CCC to overclock, has had elpida GDDR5.


Hopefully just coincidence. My card has Elpida IC's, but hasn't had a memory-related crash even once yet. I've gotten the memory clock up to 1500 without a hiccup so far.
Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Alright cool. That's what I'm planning on doing. The copper and silver won't corrode the nickel plating, right? And should I put in some anti corrosive to protect the metals? Do you think it's necessary?


A mix of copper and nickel won't cause problems, but I think nickel and silver don't mix well. To be safe, I would just get something like Primochill Liquid Utopia instead of a coil.


----------



## brazilianloser

Quote:


> Originally Posted by *Falkentyne*
> 
> What memory is on your card? Elpida or Hynix? (use the memory tool I uploaded a few pages back in my post).


Elpida... Sadly. I guess I am just that lucky as usual to get a faulty card.


----------



## Falkentyne

Quote:


> Originally Posted by *brazilianloser*
> 
> Elpidia... Sadly. I guess I am just that lucky as usual to get a faulty card.


Don't be so sure yet. So far your post has confirmed that 100% of the STOCK, non-overclocked black screen problems have ALL had Elpida memory on the card.

If you're not too busy, and you still have the card, use AFTERBURNER and downclock the memory to 1000 mhz.

Then run your windowed and fullscreen tests.

Do you still black screen?


----------



## jomama22

Just to make this clear, I highly doubt this is an Elpida-only problem. I got an equal share of Hynix and Elpida, and neither gave me more trouble than the other. The Hynix did manage a higher average overclock (1650-1750), but playing games, doing benches, etc didn't cause one or the other to crash any more than I expected it to.

Elpida also clocked better on air for me, so there's that.


----------



## the9quad

Is this the bios that causes black screens?


----------



## Falkentyne

My BIOS is the exact same as that, but I'm using Hynix modules, and no problems yet.

I want to be proven wrong, but if that's Elpida....


----------



## utnorris

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well, looks like i'm not getting my card under water any-time soon, i did some playing around looking at the various parts etc and i'm looking at around $1k for the dual loop system that i want (1 for CPU and 1 for GPU/'s)
> 
> I'm putting that on the backburner atm, need to take the wife out to dinner first or something i think


Not sure why you want or need a dual loop, but you could always start small and add on as you go. I think you could easily stay under $500 going retail, and if you shop the forums and Ebay you could drop that by a third. The items don't have to be brand new except for the GPU block, because you won't find a used one of those for some time, but pump, rads, CPU blocks, fittings, even tubing can be had used, and there are a lot of nice deals out there right now, you just have to look for them. Heck, I will be putting up a lot of stuff this weekend sometime when I get some time to take pictures.


----------



## brazilianloser

Quote:


> Originally Posted by *Falkentyne*
> 
> Don't be so sure, yet. So far your post has confirmed that 100% of the STOCK non overclocked black screen problems have ALL had to do with having Elpida memory on the card.
> 
> If you're not too busy, and you still ihave the card, use AFTERBURNER and downclock the memory to 1000 mhz.
> 
> Then run your windowed and fullscreen tests.
> 
> Do you still black screen?


Card survives at 1000 MHz after 15 min of Furmark... I guess I will keep bumping it up to see when the crashes start... Either way, definitely getting in touch with Newegg to get a new one once they come in stock again. Not going to settle for a card that needs to be downclocked.


----------



## utnorris

Quote:


> Originally Posted by *Mr357*
> 
> Hopefully just coincidence. My card has Elpida IC's, but hasn't had a memory-related crash even once yet. I've gotten the memory clock up to 1500 without a hiccup so far.
> A mix of copper and nickel won't cause problems, but I think nickel and silver don't mix well. To be safe, I would just get something like Primochill Liquid Utopia instead of a coil.


Silver and copper are fine. You use the silver to keep algae from growing. Distilled water and either the silver coil or a biocide from the pet store is your best bet, but if you must have color then you could do Mayhems dye. Plus distilled water is like $.79 a gallon versus $10 to $20 for a liter of "water cooling" fluid.


----------



## Arm3nian

Quote:


> Originally Posted by *brazilianloser*
> 
> Card survives at 1k after 15min of the Furmark... I guess I will keep bumping it up to see when the crashes start... Either way definitely getting in touch with Newegg to get a new one once they come in stock again. Not going to settle for a card that needs to be downclocked.


You're using furmark to test stability...?


----------



## brazilianloser

Quote:


> Originally Posted by *Arm3nian*
> 
> You're using furmark to test stability...?


Now I am... Haven't done any benchmark yet... The crashes were just happening in game. And not in the mood to play anymore tonight in search of crashes.


----------



## Sgt Bilko

Quote:


> Originally Posted by *utnorris*
> 
> Not sure why you want or need a dual loop, but you could always start small and add on as you go. I think you could easily stay under $500 going retail and if you shop the forums and Ebay you could drop that by a third. The items don't have to be brand new except for the GPU block because you won't find a used one of those for some time, but pump, rads, cpu blocks, fittings, even tubbing can be had used and there are a lot of nice deals out there right now, just have to look for them. Heck, I will be putting up a lot of stuff this weekend sometime when I get some time to take pictures.


Just because it's easier in my mind and there are fewer tubes going everywhere: one rad up top, one rad down bottom, res and pump in the middle.

I'll take a look through Ebay at some point, but I've gotten my price down to $800 on the website now.


----------



## Arm3nian

Quote:


> Originally Posted by *brazilianloser*
> 
> Now I am... Haven't done any benchmark yet... The crashes were just happening in game. And not in the mood to play anymore tonight in search of crashes.


Furmark just cooks your card. Other than that, it provides no validation of clock stability. Try 3DMark, Heaven, or Valley.


----------



## brazilianloser

But yeah, I am stupid though... Just no Internet here and I can't find the installer for Valley that I had saved. Furmark is not making use of the RAM on the GPU, which is the suspected cause of the crashes, so that won't help.


----------



## tsm106

Here's an elpida sapphire card @ 1300/1625 putting up a good showing.

http://www.3dmark.com/fs/1116676

http://www.3dmark.com/fs/1116635

**Interesting thing... test if you can raise mem clock w/o raising core voltage first. You get where I'm going with this?


----------



## Arm3nian

Quote:


> Originally Posted by *brazilianloser*
> 
> But yeah I am stupid though... Just no Internet here and I can't find the installer for Valley that I had saved up. Furmark is not making use of the ram on the gpu which is the suspicious cause of the crashes so that won't help.


Overheating can be a cause for crashes. What are your temps like?

If you need to lower clocks from stock then something seems wrong. But you need to try benchmarks first to test that, games aren't a valid test because there are still driver issues.


----------



## brazilianloser

Quote:


> Originally Posted by *Arm3nian*
> 
> Overheating can be a cause for crashes. What are your temps like?
> 
> If you need to lower clocks from stock then something seems wrong. But you need to try benchmarks first to test that, games aren't a valid test because there are still driver issues.


Just posted a pic a few posts back... 15 min of full GPU usage only hitting 86c. And while gaming, where the crashes are happening, I am sitting at 75c.


----------



## Arm3nian

Quote:


> Originally Posted by *brazilianloser*
> 
> just posted a pic a few posts back... 15 min of full gpu usage only hitting 86c. And while gaming where the crashes are happening I am sitting at 75c


Really the only way to test if your gpu has a defect is to revert everything in your system to stock and just run a benchmark.


----------



## brazilianloser

Quote:


> Originally Posted by *Arm3nian*
> 
> Really the only way to test if your gpu has a defect is to revert everything in your system to stock and just run a benchmark.


It has been all stock from point one... That was my initial point. Now what I have been doing is what the other homie brought up: reduce the memory speed and see if the crashes continue...


----------



## tsm106

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *utnorris*
> 
> Not sure why you want or need a dual loop, but you could always start small and add on as you go. I think you could easily stay under $500 going retail and if you shop the forums and Ebay you could drop that by a third. The items don't have to be brand new except for the GPU block because you won't find a used one of those for some time, but pump, rads, cpu blocks, fittings, even tubbing can be had used and there are a lot of nice deals out there right now, just have to look for them. Heck, I will be putting up a lot of stuff this weekend sometime when I get some time to take pictures.
> 
> 
> 
> just because it's easier in my mind and less tubes going everywhere, one rad up top, one rad down bottom, Res and Pump in the middle.
> 
> I'll take a look through Ebay at some point but i've gotten my price down to $800 on the website now.

Dual loop is worse unless you want to specifically separate CPU from GPU heat, like in mining for example. Otherwise you double the complexity, raise the costs, and reduce overall efficiency. Say you can fit 3 rads in your case; with a dual loop you have to split that up. If you are doing something that is not hitting both CPU and GPU, then in a single loop either block set can use the full radiator surface area in your case to cool. You lose that advantage in a dual loop. Then when both CPU and GPU are loaded, it's no different than a dual loop, so it's no gain or loss there.
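The single-vs-dual loop argument above can be sketched numerically. This is a rough illustration only: the 15 W/°C per 120mm rad section and the 300 W / 150 W component loads are assumed round numbers, not measurements, and the linear model ignores pump and airflow effects.

```python
# Coolant temperature rise over ambient ~= heat load / radiator dissipation.
# All figures here are assumptions for illustration.

RAD_W_PER_C = 15.0  # assumed: one 120mm rad section sheds ~15 W per deg C of delta-T

def delta_t(load_watts, rad_sections):
    """Coolant temperature rise over ambient for a given load and rad area."""
    return load_watts / (rad_sections * RAD_W_PER_C)

gpu, cpu = 300.0, 150.0  # assumed OC'd loads in watts

single_gpu_only = delta_t(gpu, 6)    # single loop: GPU alone sees all 6 sections
dual_gpu_only = delta_t(gpu, 3)      # dual loop: GPU loop owns only 3, CPU rad idles
single_both = delta_t(gpu + cpu, 6)  # both loaded: combined loop shares all area

# GPU-only load runs noticeably cooler in the single loop; with both
# components loaded the combined loop is no worse than the split one.
print(single_gpu_only, dual_gpu_only, single_both)
```

Under these assumptions the GPU-only delta-T roughly doubles when the loop is split, which is exactly the "lost advantage" described above.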


----------



## $ilent

any msi afterburner update yet?


----------



## Strata

Quote:


> Originally Posted by *Mr357*
> 
> A mix of copper and nickel won't cause problems, but I think nickel and silver don't mix well. To be safe, I would just get something like Primochill Liquid Utopia instead of a coil.


Actually silver and copper are more likely to corrode than copper and nickel; the killer usually is brass and nickel. In a typical loop, silver is the lowest on the galvanic scale, followed by nickel, copper, brass, chrome, and finally aluminum. If you're paranoid like I am and will have anything but copper and brass, use an inhibitor like Water Wetter or Feser.
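The rule of thumb behind the post above is that the further apart two metals sit in the galvanic series, the faster the less-noble one corrodes. A minimal sketch, using ordinal ranks taken from the ordering in the post (not measured electrode potentials):

```python
# Ordinal galvanic-series ranks as ordered in the post above (most noble first).
# These are illustrative ranks, not real electrode potentials in volts.
GALVANIC_RANK = {
    "silver": 0, "nickel": 1, "copper": 2,
    "brass": 3, "chrome": 4, "aluminum": 5,
}

def pairing_risk(metal_a, metal_b):
    """Bigger gap in the series = bigger galvanic corrosion risk for the pair."""
    return abs(GALVANIC_RANK[metal_a] - GALVANIC_RANK[metal_b])

# Copper + nickel: small gap, generally considered safe with an inhibitor.
print(pairing_risk("copper", "nickel"))
# Aluminum in an otherwise copper loop: the classic large-gap mistake.
print(pairing_risk("copper", "aluminum"))
```

A real decision should use a proper galvanic series for the coolant in question; this only encodes the "distance = risk" heuristic.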


----------



## Sazz

I've tested the PT3 BIOS and it seems unstable to me when I use it; I can't even do the same OC as I got with the PT1 BIOS, which is 1260/1420. Guess imma revert back to PT1 and do more benches.


----------



## ImJJames

Quote:


> Originally Posted by *tsm106*
> 
> Here's an elpida sapphire card @ 1300/1625 putting up a good showing.
> 
> http://www.3dmark.com/fs/1116676
> 
> http://www.3dmark.com/fs/1116635
> 
> **Interesting thing... test if you can raise mem clock w/o raising core voltage first. You get where I'm going with this?


Can you post a Valley benchmark please, Extreme HD


----------



## SonDa5

Quote:


> Originally Posted by *estens*
> 
> Guess I screwed up putting the block on then


1mm on VRMs. I used Fujipoly Ultra thermal pad on the VRMs and temps are great. Also, the EK instructions recommend a small drop of the EK TIM on the VRMs to help transfer heat.


----------



## brazilianloser

Ran Valley in surround for a while at all memory speeds ranging from 1000-1250... and no crashes. I guess I will attempt some games again.


----------



## jomama22

Quote:


> Originally Posted by *tsm106*
> 
> Here's an elpida sapphire card @ 1300/1625 putting up a good showing.
> 
> http://www.3dmark.com/fs/1116676
> 
> http://www.3dmark.com/fs/1116635
> 
> **Interesting thing... test if you can raise mem clock w/o raising core voltage first. You get where I'm going with this?


His memory is unstable. Have a look at this for some fun:

http://www.3dmark.com/fs/1111431

that's at 1225/1750

My 1315/1500 = 14100.

I'm saving the 1315/1725 @ 5.1 for later









Here's the thing about the memory. At 6000, it's good; 6100-6400, no go, lower scores in 3dmark13 (higher in valley though?); 6500-6800, back to working again and showing awesome scores. 7000? Yeah, only 1 card hit that... and the weakest core to boot haha.

Core voltage seemed to have a slight effect, but only at 1.35v+ (actual), and it was in a negative way lol.


----------



## Dominican

with stock cooler what is best overclock setting anybody able to have ?


----------



## Durquavian

Quote:


> Originally Posted by *Dominican*
> 
> with stock cooler what is best overclock setting anybody able to have ?


I've seen a few mention 1100 with 75% fan speed, but I gather it isn't the voltage or chip keeping them to 1100 but the fan itself so far.


----------



## jomama22

Quote:


> Originally Posted by *Durquavian*
> 
> Seen a few mention 1100 with 75% fan speed, but I gather it isn't the voltage or chip keeping them to 1100 but the fan itself so far.


80% fan (could go lower, probably 70%) on asus bios, maxed voltage (1.299 actual) gave me anywhere from 1150-1220 core on the 6 cards.


----------



## esqueue

Got my 290x watercooled and decided to throw on a block and watercool the CPU while I'm at it. So far, my Bonneville radiator is doing a damn good job. It's night and I've only run Valley a few times. I'll probably OC the video card sometime.


----------



## Jpmboy

Quote:


> Originally Posted by *Gilgam3sh*
> 
> here you have the ASUS 290X unlocked BIOS if thats what you're after.
> 
> http://forums.overclockers.co.uk/showthread.php?t=18552408


thanks +1


----------



## selk22

Quote:


> Originally Posted by *Poyri*
> 
> Can I use 2 R9 290 and a 3930k with a Corsair AX860i with watercooling? 1 of my cards is on its way; if it is okay I will get 1 more.


I am on this exact setup I would say its working great! Except mine is 290x


----------



## Spectre-

Quote:


> Originally Posted by *selk22*
> 
> I am on this exact setup I would say its working great!


heck with that AX 860i you can have 2 r9 290's watercooled


----------



## DampMonkey

Quote:


> Originally Posted by *Spectre-*
> 
> heck with that AX 860i you can have 2 r9 290's watercooled


Not if you plan on heavy overclocking


----------



## Spectre-

untitled-1.png 86k .png file

Quote:


> Originally Posted by *DampMonkey*
> 
> Not if you plan on heavy overclocking


I am sorry, you said something?


----------



## overclockFrance

On a Sapphire 290, in order to apply more than 1410 mV allowed with ASUS290 bios, do I have to flash the PT1 bios and use ASUS GPU Tweak ?


----------



## selk22

Yeah, if and when I CrossFire I will most likely be looking for 1100-1200 MHz core and 1400-1600 MHz memory clocks; nothing crazy!

Here is my setup in preparation for the GPU block...

I couldn't bear to look at the x45 any longer.. haha, I wanted it in the loop!

http://www.overclock.net/g/a/1058904/h220-expansion/


----------



## Spectre-

Anyone got the PowerColor OC edition 290 here?

Can someone tell me what their OC results were like?

Getting mine by next Wednesday.


----------



## Arm3nian

Quote:


> Originally Posted by *Spectre-*
> 
> [Attachment: untitled-1.png]
> 
> i am sorry you said something


That is going to double or triple once you start heavy overvolting.


----------



## hotrod717

Quote:


> Originally Posted by *Spectre-*
> 
> heck with that AX 860i you can have 2 r9 290's watercooled


I believe there are 3-4+ pages that cover why this won't work in an overclocking situation: 900+ W from the wall.


----------



## Spectre-

Quote:


> Originally Posted by *Arm3nian*
> 
> That is going to double - triple once you start heavy overvolting.


That's their GPUs OC'd.

Also, don't forget my original comment: I said it can run on an AX860i,

which is a great PSU with an output of 855 watts at 92% efficiency on average, so I doubt it would need anything better even OC'd.


----------



## Arizonian

Wow, a lot of pages in the one day I was away from a computer. I think I got all the new owners and updates.









Quote:


> Originally Posted by *chrisf4lc0n*
> 
> Can I join in
> 
> 
> 
> 
> 
> 
> 
> 
> Proof
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added and under water.









Quote:


> Originally Posted by *Jpmboy*
> 
> Sapphire R9 290X arrived - plugged it in air-cooled to make sure it's okay...
> 
> Sign me up!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> just breaking in to 10400 range on Firestrike
> 
> and my titans sitting by while i dance with another chick
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> could use a few pointers on OC this thing...


Congrats added.









Quote:


> Originally Posted by *HoZy*
> 
> Please update me to water, Ordered from EK web store, I'm in Australia. Ordered on Monday 1pm Melbourne time, Received Friday 3pm
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1100mhz Core, 1250 Memory. (5000mhz)
> 
> These things run soo cool, 39c Max on furmark with a 21c ambient.
> 
> 
> Spoiler: Warning: Spoiler!


Under water added


----------



## HoZy

Upped it a little more.

XFX Stock Bios, EK Waterblock. Stock Volt
1150Core & 1350Memory.
Max temp 40c


----------



## youra6

http://www.3dmark.com/3dm11/7456119

R9 290 CFX @ 1100/1400mhz *under water*, stock volts

http://www.3dmark.com/3dm11/7223454

GTX 670 SLI @ +1400/+500mhz *under water* 1.212V

9k higher score... impressive. Imagine the score once I push this thing closer to 1300Mhz... Gonna run the test again; killing all my unneeded processes in the task manager.


----------



## GodOfGaming

Is there a Windows XP driver for the 290X? I'm using Windows 7 mostly, but I keep XP for some old games that don't work on Win7, and I'd like to be able to use the card there as well. It's a bit early to drop support for XP; even Microsoft will keep at it for at least a few more months.


----------



## RAFFY

Quote:


> Originally Posted by *Gilgam3sh*
> 
> first you need to change the hz in the windows settings.


Quote:


> Originally Posted by *jerrolds*
> 
> You should have them selectable in the normal Windows resolution screen. I typically just run 2 files, ATIPatcher.exe or whatever... make sure you patch the values (if it's already patched and still doesn't work, try restoring them and starting over)
> 
> After rebooting, setup your resolution/refresh in CRU and reboot again
> 
> You should be able to select resolution/refresh in windows normally.


Edit: I got it to work but I can't get past 60 Hz.







Is Windows 8.1 messing it up?


----------



## Arm3nian

Quote:


> Originally Posted by *Spectre-*
> 
> thats there GPU's oc'd
> 
> also dont forget what the original comment i said it can run on an AX 860i
> 
> which is a great PSU and has an output of 855 watts with 92% efficiency on average so i doubt it would need anything better even oc'd


Efficiency has nothing to do with what the PSU can output. We've had many members report their 860i failing with crossfired 290s and a quad core. Throw in a hexacore and you're going to need an even heftier PSU.
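As a back-of-the-envelope illustration of that point: a PSU's rating is its DC output, and efficiency only determines how much is pulled from the wall. A minimal sketch, with hypothetical per-component wattages loosely based on the estimates discussed in this thread:

```python
def psu_check(component_watts, psu_rated_w=860, efficiency=0.92, headroom=0.20):
    """Rough PSU sanity check (illustrative numbers only).

    psu_rated_w is the DC output the unit can deliver; efficiency
    only tells you how much extra is drawn from the wall on top of that.
    """
    dc_load = sum(component_watts)               # what the PSU must deliver
    wall_draw = dc_load / efficiency             # what a wall meter would show
    ok = dc_load <= psu_rated_w * (1 - headroom) # leave margin for spikes
    return dc_load, round(wall_draw), ok

# Hypothetical build: two heavily overvolted 290s (~400 W each),
# an overclocked hexacore (~200 W) and the rest of the system (~60 W)
print(psu_check([400, 400, 200, 60]))  # (1060, 1152, False)
```

With these made-up numbers the 860i is past its rating on DC load alone, regardless of efficiency, which matches the failures reported above.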


----------



## selk22

Quote:


> Originally Posted by *Arm3nian*
> 
> Efficiency has nothing to do with what the psu can output. We had many members saying their 860i's failing with crossfire 290s and quad cores. Throw in a hexacore and you're going to need an even heftier psu.


I plan to upgrade to a 1200 W PSU for when I decide to CrossFire. I agree that with a hexacore OC and the GPUs it is out of the question... but it's a great PSU for a single OC'd 290X and an OC'd 3930K.


----------



## Spectre-

Quote:


> Originally Posted by *Arm3nian*
> 
> Efficiency has nothing to do with what the psu can output. We had many members saying their 860i's failing with crossfire 290s and quad cores. Throw in a hexacore and you're going to need an even heftier psu.


Hmm, so what do you need, another 100 watts? Just curious.

Thanks for your time.


----------



## HoZy

Quote:


> Originally Posted by *Arm3nian*
> 
> Efficiency has nothing to do with what the psu can output. We had many members saying their 860i's failing with crossfire 290s and quad cores. Throw in a hexacore and you're going to need an even heftier psu.


*Thanks to Jared Pace:*

R9 290X Power Consumption Estimates:

- These values represent scenes of maximum load & power draw (e.g. 3DMark, Far Cry 3, Furmark)
- The fan uses between 2-20 W depending on speed.
- All other instances, including typical gaming power usage, are always less; these are only maximums.
- Default: 8-pin = 150W, 6-pin = 75W, PCIe = 75W. Overvolted: (up to) 8-pin = 400W, 6-pin = 250W, PCIe = 125W
- Maximums, not averages.
- Wattage is affected (in order from most to least) by voltage, temperature & MHz
- Wattage can always momentarily spike 25% higher than what you see here.
- Safe limit = 525W and/or 1.45V
Quote:


> Silent bios TDP = 300W
> Uber bios TDP = 350W, limit 470W with voltage
> Asus bios limit = 470W
> VRM limit = 560W
> PCB limit = 580-600W
> PT1/PT3 limit = 800W
> 
> Air
> Stock cooler / Quiet bios / 95C / 1.175V / 1000 / 1250 = 255W
> Stock cooler / Quiet bios / 85C / 1.175V / 1000 / 1250 = 250W
> Stock cooler / Quiet bios / 75C / 1.175V / 1000 / 1250 = 240W
> Stock cooler / Quiet bios / 65C / 1.175V / 1000 / 1250 = 230W
> Stock cooler / Uber bios / 95C / 1.175V / 1000 / 1250 = 290W
> Stock cooler / Uber bios / 85C / 1.175V / 1000 / 1250 = 280W
> Stock cooler / Uber bios / 75C / 1.175V / 1000 / 1250 = 270W
> Stock cooler / Quiet bios / 95C / 1.175V / 1150 / 1350 = 285W
> Stock cooler / Quiet bios / 85C / 1.175V / 1150 / 1350 = 275W
> Stock cooler / Quiet bios / 75C / 1.175V / 1150 / 1350 = 270W
> Stock cooler / Quiet bios / 65C / 1.175V / 1150 / 1350 = 260W
> Stock cooler / Uber bios / 95C / 1.175V / 1150 / 1350 = 320W
> Stock cooler / Uber bios / 85C / 1.175V / 1150 / 1350 = 305W
> Stock cooler / Uber bios / 75C / 1.175V / 1150 / 1350 = 290W
> 
> Stock cooler / ASUS bios / 95C / 1.250V / 1150 / 1350 = 410W
> Stock cooler / ASUS bios / 85C / 1.250V / 1150 / 1350 = 400W
> Stock cooler / ASUS bios / 75C / 1.250V / 1150 / 1350 = 390W
> Stock cooler / ASUS bios / 65C / 1.250V / 1150 / 1350 = 370W
> Stock cooler / ASUS bios / 95C / 1.250V / 1200 / 1400 = 430W
> Stock cooler / ASUS bios / 85C / 1.250V / 1200 / 1400 = 415W
> Stock cooler / ASUS bios / 75C / 1.250V / 1200 / 1400 = 400W
> Stock cooler / ASUS bios / 65C / 1.250V / 1200 / 1400 = 380W
> Stock cooler / ASUS bios / 95C / 1.300V / 1150 / 1350 = 465W
> Stock cooler / ASUS bios / 85C / 1.300V / 1150 / 1350 = 445W
> Stock cooler / ASUS bios / 75C / 1.300V / 1150 / 1350 = 435W
> Stock cooler / ASUS bios / 65C / 1.300V / 1150 / 1350 = 420W
> Stock cooler / ASUS bios / 95C / 1.300V / 1250 / 1400 = 475W
> Stock cooler / ASUS bios / 85C / 1.300V / 1250 / 1400 = 455W
> Stock cooler / ASUS bios / 75C / 1.300V / 1250 / 1400 = 435W
> Stock cooler / ASUS bios / 65C / 1.300V / 1250 / 1400 = 430W
> 
> Water
> Asus bios / 60C / 1.300V / 1000/1250 =450W
> Asus bios / 40C / 1.300V / 1000/1250 =430W
> Asus bios / 20C / 1.300V / 1000/1250 =410W
> Asus bios / 60C / 1.300V / 1250/1450 =465W
> Asus bios / 40C / 1.300V / 1250/1450 =445W
> Asus bios / 20C / 1.300V / 1250/1450 =425W
> PT1/PT3 bios / 60C / 1.400V / 1250/1450=530W
> PT1/PT3 bios / 40C / 1.400V / 1250/1450=510W
> PT1/PT3 bios / 20C / 1.400V / 1250/1450=490W
> PT1/PT3 bios / 60C / 1.400V / 1300/1500=545W
> PT1/PT3 bios / 40C / 1.400V / 1300/1500=525W
> PT1/PT3 bios / 20C / 1.400V / 1300/1500=510W
> PT1/PT3 bios / 60C / 1.500V / 1250/1450=585W
> PT1/PT3 bios / 40C / 1.500V / 1250/1450=570W
> PT1/PT3 bios / 20C / 1.500V / 1250/1450=555W
> PT1/PT3 bios / 60C / 1.500V / 1350/1550=595W
> PT1/PT3 bios / 40C / 1.500V / 1350/1550=575W
> PT1/PT3 bios / 20C / 1.500V / 1350/1550=560W
> 
> Extreme (LN2 with core & memory voltages manually controlled)
> PT1/PT3 bios / 60C / 1.500V / 1300/1500= 625W
> PT1/PT3 bios -50C / 1.500V / 1300/1500= 585W
> PT1/PT3 bios -200C / 1.500V / 1300/1500= 500W
> PT1/PT3 bios / 60C / 1.700V / 1550/1750= 795W (up to 1000w)
> PT1/PT3 bios -50C / 1.700V / 1550/1750= 675W
> PT1/PT3 bios -200C / 1.700V / 1550/1750= 620W
> 
> Voltage (large impact on wattage)
> PT1/PT3 bios / 60C / 1.550V / 1300/1500=635W (dangerous voltages)
> PT1/PT3 bios / 60C / 1.500V / 1300/1500=600W
> PT1/PT3 bios / 60C / 1.450V / 1300/1500=580W
> PT1/PT3 bios / 60C / 1.400V / 1300/1500=545W
> PT1/PT3 bios / 60C / 1.350V / 1300/1500=525W
> PT1/PT3 bios / 60C / 1.300V / 1300/1500=495W
> PT1/PT3 bios / 60C / 1.250V / 1300/1500=455W
> 
> Temperature (medium impact on wattage)
> PT1/PT3 bios / 90C / 1.400V / 1300/1500=595W (not a good idea for 1.4 on air)
> PT1/PT3 bios / 80C / 1.400V / 1300/1500=575W
> PT1/PT3 bios / 70C / 1.400V / 1300/1500=555W
> PT1/PT3 bios / 60C / 1.400V / 1300/1500=545W
> PT1/PT3 bios / 40C / 1.400V / 1300/1500=535W
> PT1/PT3 bios / 20C / 1.400V / 1300/1500=525W
> PT1/PT3 bios / 10C / 1.400V / 1300/1500=520W
> PT1/PT3 bios / -50C / 1.400V / 1300/1500=490W
> 
> Mhz (small impact on wattage)
> PT1/PT3 bios / 60C / 1.400V / 1450/1550=575W (wont clock this high anyway)
> PT1/PT3 bios / 60C / 1.400V / 1400/1500=570W
> PT1/PT3 bios / 60C / 1.400V / 1350/1450=560W
> PT1/PT3 bios / 60C / 1.400V / 1300/1400=545W
> PT1/PT3 bios / 60C / 1.400V / 1100/1350=535W
> PT1/PT3 bios / 60C / 1.400V / 1000/1300=525W
> PT1/PT3 bios / 60C / 1.400V / 900/1250=520W
> 
> PT3
> Vdroop - it's ~.02-.075v depending on load & voltage. It's a percentage of VDDC.
> PT3 on has 1 predefined LLC setting for OCbios (maximum)
> 1.225v set + LLC, Vdroop = .025v, actual = 1.25v.
> 1.3v set + LLC, Vdroop = ~.033v, actual = ~1.33v
> 1.45v set + LLC, Vdroop = .070v, actual = ~1.52v.
> 1.5v probably spikes up to 1.6v
> 1.6v+ set = blow your card up territory
> 
> ASUS
> Asus.rom Vdroop averages (this depends on silicon lottery, temps, ASIC qual, scene rendering difficulty, board quality, chip leakage, etc)
> 1.2v set = 1.175v
> 1.3v set = 1.27v
> 1.35v set = 1.28-1.29v
> 1.412v set = 1.299v (spikes anywhere up to ~1.35v)
> 1.5v set (if possible) = 1.299v - 1.35v
> Since Asus bios has TDP throttling, depending on factors mentioned above voltage is equalized around 1.299v after Droop
> 
> PT1
> PT1 would equal Vdroop to Asus.rom since LLC is not modded, but unlimited TDP & voltage setting so max limit is higher. EX:
> 1.2v set = 1.175v
> 1.3v set = 1.27v
> 1.35v set = 1.28-1.29v
> 1.412v set = 1.36v
> 1.5v set = 1.42v
> 1.6v set = 1.5v
> 1.7v set = 1.55v OCP mods & Hard mods are needed for more headroom at this point
> voltage can always equal what you set, so for PT1 setting 1.5v in GPUTweak would result in voltage oscillation in a range of 1.42v-1.5v
> 
> 290x ref pcb nphase vrm is overloaded at 559.5 watts according to specifications, while the Asus bios stop at 468 watts. Uber & silent throttle you before 350w TDP threshold is reached, so if you go somewhere between 350w-468w, it's only due to power surging or tinkering voltage on ASUS bios. 1320mhz/1390mv 45/60c is probably somewhere between 400 to 500w. (Low temps = less power) ~580 watts & beyond is torture to the system - way out of spec. PT3 in theory would let you overload the card from 559.5w to ~725w or beyond. But it'll likely fry somewhere around there anyway. The stock blower cooler stops you at 265w-300w, AMD silent & uber stop you at ~300w & ~350w respectively, ASUS bios at 468w, the buck controller stops you at 559.5w, the pcb stops you at ~580w, PT3.rom (PT1 also) stops you at ~725-800w. I suppose beyond 1.45v is diminishing returns on overclock without LN2 cold temperatures.
> 
> By comparison the highest clocked GTX 780 reference card on full cover block I can find (OCN member KarateF22 1520mhz/1550mv) shows that NV OCP & OVP is active & hardware monitored up to ~572-580w. There are claims of Titan/780 ref >600w bioses, but that would only be possible with OCP mod. Remember that using PT3.rom isn't going to stop a R9 290x at ~580w like hardware OCP stops Skynet's vbios on 780 @ 572w. Therefore, PT3.rom is much more dangerous than most 780/Titan vbios @ OCN in regards to permanently damaging your video card since protections are bypassed..
> 
> VRM temperatures are an important limiting factor in overclocking and may very well be accurate, but it must first be validated using an Infra red thermometer gun pointed at the back side of the PCB. Then you must read GPUz's VRM temperature and check that it matches #s seen on your IR gun (give or take 2 or 3 degrees). Theres always the good old fashioned way of touching it and feeling the burn. tongue.gif GPUz has two VRM temperature readings, one of them is the memory voltage nphase's mosfet, the other is the core voltage nphase's mosfets. Once you start overclocking, the hotter of the two is usually from the mosfets regulating core voltage. Particularly true in the 290X's case since at this moment we aren't messing with memory overvoltage.
> 
> Concerning 3D clockstates:
> When you punch numbers into GPUTweak voltage slider it represents a target voltage that you wish to have. Depending on which of the 3 bioses you're using, the actual voltage as measured by a digital multimeter will read 3 different values compared to what you've set in GPUTweak. JJJC_93 has shown this for us since he has already done Vcore & Vmem measure on DMM with PT3 bios - i linked both of these in the OP. Let's say for example you set 1.412v in GPUTweak as your core voltage. Using ASUS.rom bios (GPUTweak at 1.412v) your actual core voltage is somewhere around 1.299v (as shown by both Paulenski & overclockfrance) and using PT1 bios your actual voltage is ~1.350v; finally using PT3 bios your actual voltage is close to 1.49v (shown in multiple posts at hwbot.org). However, in each scenario GPUTweak setting is still 1.412v, and I'm unsure of what its graph would read. GPUZ will have 'more realistic' measure of the voltage that would match the actual vcore. GPUZ with PT1 may read either 1.412, 1.36v or between & PT3 may read 1.412, 1.49v, or between. In the case of ASUS.rom bios, while GPUTweak reads 1.412v, Gpuz would show ~1.299v (and perhaps it would spike to 1.35v) I don't have a 290x to test - you will have to test it for yourself as Paulenski has and take screenshots to help prove or disprove it. Since GPUZ can show 'max', that 'max' value can misleadingly represent a spike that only occurred momentarily over an extended period of high load. You will need to view 'average' or carefully look at the graph while mousing over points in time coverage to see what your average or constant voltage readouts were. Or you will need a digital multimeter & Vcore measure point to verify that GPUTweak, GPUZ & your actual voltage shown on the meter measure up concurrently or as you've said, provide 'accurate readout'.
> 
> Afterburner is coming in a couple days or less. Unwinder says maybe after the weekend. All the normal bioses that come with 290x cards out of the box will throttle at total board power consumption under 460 watts because all bioses except PT1/PT3 limit power. IMO, judging by OC results thus far, AB + stock bios + 1.3-1.4v will make for overclocks of ~1225-1295mhz. It also depends somewhat on what MSI AB allows for maximum voltage. It will be interesting to see if Afterburner cooperates with PT1 & PT3 bioses, and what actual voltages end up being when changed. GPUTweak does suck.. Regardless of which software you prefer to use for overvolting (I prefer AB over GPUtweak), I think 290X will require a bios granting unlimited TDP for clocks of 1275-1350mhz. If you're really lucky on the lottery & have super cool temps, maybe >1350mhz.
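
The voltage rows in the table above follow the usual first-order rule that dynamic power scales roughly with the square of core voltage at fixed clocks. A quick sketch of that rule (illustrative only, not a fit to these exact numbers; leakage and temperature effects are ignored):

```python
def scale_power(p0_w, v0, v1):
    """First-order estimate: dynamic GPU power scales ~ (v1/v0)^2
    at fixed clocks. Leakage (which grows with temperature) is ignored,
    so this tends to overshoot measured deltas."""
    return p0_w * (v1 / v0) ** 2

# Starting from the table's ~455 W at 1.25 V (60 C, 1300/1500):
print(round(scale_power(455, 1.25, 1.40)))  # 571, vs the table's measured 545 W
```

The overshoot against the measured 545 W shows why the table lists voltage as the largest factor but not the only one.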


----------



## youra6

Anyone know of a non-X ASUS BIOS? I've heard flashing an X BIOS onto an R9 290 may cause problems.


----------



## rv8000

Could you kindly add me to the list? Gotta love having a Newegg warehouse one state away; pretty much free overnight shipping.











Only got to do a quick Firestrike run, but stock vs stock the GPU score was 700 points higher than my 780. It's a shame I have to trash that stock cooler as it goes so well with my theme D: . Looking forward to having some fun with this card for sure!


----------



## Arizonian

Quote:


> Originally Posted by *rv8000*
> 
> Could you add me to the list kindly. Gotta love having a newegg warehouse 1 state away, pretty much free overnight shipping
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Only got to do a quick firestrike run but stock vs stock gpu score was 700 points higher than my 780. It's a shame I have to trash that stock cooler as it goes so well with my theme D: . Looking forward to having some fun with this card for sure!


Congrats - Added


----------



## Falkentyne

So far, after continuing to look at the replies, *each and every person* affected by the black screen WITHOUT overclocking through CCC (or even in Afterburner) has been using Elpida memory. EVERY one (this does NOT include using other BIOSes).

The only cases of Hynix black screening have been from:
1) Overclocking the memory through CCC, usually fine until you restart the computer, then black screens galore on the *DESKTOP* (a big no-no; use Afterburner instead). 2) Using the PT3 BIOS.

Perhaps someone should make a thread dedicated to this EXACT issue, along with a poll on whether overclocking was used and when the black screens occurred.


----------



## Arm3nian

Quote:


> Originally Posted by *Spectre-*
> 
> hmm so what do you need another
> 
> 100 watts? just curious
> 
> thanks for your time


If overclocking on air, I'd say the minimum would be 1000 W for a full system. The list just posted has some good info.

The ASUS BIOS draws quite a bit more power, so it really depends on which BIOS you're using and whether you're on air or water.


----------



## th3illusiveman

Quote:


> Originally Posted by *jomama22*
> 
> His memory is unstable. Have a look at this for some fun:
> 
> http://www.3dmark.com/fs/1111431
> 
> thats at 1225/1750
> 
> My 1315/1500 = 14100.
> 
> Im saving the 1315/1725 @ 5.1 for later
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Heres the thing about the memory. At 6000, its good, 6100-6400, no go, lower scores in 3dmark13 (higher in valley though?). 6500-6800, back to working again and showing awesome scores. 7000? yeah, only 1 hit that...and the weakest core to boot haha.
> 
> Core voltage seemed to have a slight effect but only at 1.35v+ (actual) and it was in a negative way lol.


What's the bandwidth at that speed? 1750 MHz memory? Are you close to 500 GB/s?
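For reference, Hawaii's 512-bit bus makes this easy to work out: GDDR5 moves data at 4x the reported memory clock. A quick sketch:

```python
def gddr5_bandwidth_gb_s(mem_clock_mhz, bus_width_bits=512):
    """Effective GDDR5 bandwidth in GB/s for a given reported memory clock.

    GDDR5's effective data rate is 4x the reported clock
    (1250 MHz reads as 5.0 Gbps per pin).
    """
    gbps_per_pin = mem_clock_mhz * 4 / 1000   # effective Gbps per pin
    return gbps_per_pin * bus_width_bits / 8  # bits -> bytes

print(gddr5_bandwidth_gb_s(1250))  # 320.0 (stock 290X)
print(gddr5_bandwidth_gb_s(1750))  # 448.0
```

So 1750 MHz on the 512-bit bus lands at 448 GB/s: shy of 500, but a big jump over the stock 320 GB/s.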


----------



## youra6

http://www.3dmark.com/3dm11/7456225

Seems to max out at 1130/1400. Now to run Heaven as a better tool for stability.


----------



## HoZy

Quote:


> Originally Posted by *Falkentyne*
> 
> So far, after continuing to look at the replies, *each and every person* affected by the black screen WITHOUT overclocking through CCC (or even in afterburner) has been using Elpida memory. EVERY one (this does NOT include using other bioses).
> 
> The only cases of Hynix black screening have been from
> 1) Overclocking the memory through CCC, usually fine until you restart the computer, then black screen galore on the *DESKTOP* (a big No-No; use afterburner instead). 2) using the PT3 Bios.
> 
> Perhaps someone should make a thread dedicated to this EXACT issue, along with a poll, if overclocking was used or not and when the black screens occurred.


I'm on stock bios, XFX with Elpida memory. 1150 core & 1350 memory using CCC. Ek Waterblock, No "black screen" and I've been running 3dmark & furmark for the past week.

Cheers
Mat


----------



## Taint3dBulge

Quote:


> Originally Posted by *Falkentyne*
> 
> So far, after continuing to look at the replies, *each and every person* affected by the black screen WITHOUT overclocking through CCC (or even in afterburner) has been using Elpida memory. EVERY one (this does NOT include using other bioses).
> 
> The only cases of Hynix black screening have been from
> 1) Overclocking the memory through CCC, usually fine until you restart the computer, then black screen galore on the *DESKTOP* (a big No-No; use afterburner instead). 2) using the PT3 Bios.
> 
> Perhaps someone should make a thread dedicated to this EXACT issue, along with a poll, if overclocking was used or not and when the black screens occurred.


I get this, and I just looked at my ASUS; it's got Elpida. I thought ASUS used Hynix... I found though that when it black screens, all I have to do is change DVI ports and the issue goes away... if I use the bottom port, that is. Going to test more tomorrow...


----------



## skupples

Quote:


> Originally Posted by *Strata*
> 
> Actually silver and copper are more likely to corrode than copper and nickel, the killer usually is brass and nickel. In a typical loop, silver is the lowest on the galvanic scale, followed by nickel, copper,brass, chrome, and finally aluminum. If you're paranoid like I am and will have anything but copper and brass use an inhibitor like Water Wetter or Fesar.


I hear Red Line is great. (Not sure which actual product though; should have thought of that first before posting. Silly late-night thoughts, will clarify tomorrow.) I think it was the "Water Wetter" product they make.

I'm hoping I didn't overload my system with dead water (copper sulfate). I later added a kill coil & only flushed 3/4 of the water... Will find out next week if my nickel blocks are all nasty inside. Temps are the same since day one, so that's a good sign.


----------



## qqan

Would you sign me up?

All stock so far (keeps 1030 MHz on stock air "cooling" around 87C on 44% fan during BF4).
I'll move to the Accelero Hybrid tomorrow.

Thanks,
qq

Update: Accelero Hybrid





- requires 3 additional RAM sinks, and the single RAM chip at the PCIe port does not allow for a full-size sink
- the loop gets pretty warm during Furmark - I'll add more/better fans
- stock fan noise is gone

I'll repeat above BF4 test now.

- BF4 now gives me around 63C on GPU.
- VRMs are hot though, up to 85C


----------



## hotrod717

Quote:


> Originally Posted by *youra6*
> 
> Anyone know of a non-X Asus BIOS? I've heard flashing a X BIOS onto a R9 290 may cause problems.


I'm wondering about this myself. I seem to recall flashing Asus.rom to a 290 may cause issues with non-ASUS motherboards. Not sure why. I believe I read that on the UK OC page. Haven't read any first-hand accounts of someone trying this though. Any volunteers? Lol.


----------



## youra6

Quote:


> Originally Posted by *hotrod717*
> 
> I'm wondering about this myself. I seem to recall flashing Asus.rom to a 290 may cause issues with non-ASUS motherboards. Not sure why. I believe I read that on the UK OC page. Haven't read any first-hand accounts of someone trying this though. Any volunteers? Lol.


I don't think it has been released yet, we'll just have to wait.


----------



## skupples

Quote:


> Originally Posted by *hotrod717*
> 
> I'm wondering about this myself. I seem to recall flashing Asus.rom to a 290 may cause issues with non-ASUS motherboards. Not sure why. I believe I read that on the UK OC page. Haven't read any first-hand accounts of someone trying this though. Any volunteers? Lol.


You just need to run with the "override GPU ID mismatch" option, as it's called in NvFlash. Oh, and EEPROM protection.


----------



## Arizonian

Quote:


> Originally Posted by *qqan*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Would you sign me up?
> 
> All stock so far (keeps 1030 MHz on stock air "cooling" around 87C on 44% fan during BF4).
> I'll move to the Accelero Hybrid tomorrow.
> 
> Thanks,
> qq


Congrats added w/aftermarket cooling









EDIT: Since switching my 690 to the second 3D Vision rig, I'm using the 680 as a backup in my main rig. It's killing me playing BF4 on low settings to keep FPS up; I've stopped gaming completely. Don't feel like making the switch again with PSU cables etc. while I wait for AIBs to come out with their non-reference versions.


----------



## overclockFrance

Quote:


> Originally Posted by *HoZy*


Thank you very much for this useful information.

Can the 290 be flashed with the PT1 BIOS? It seems there are problems with the 290 and the PT3 one.


----------



## Sazz

Anyone here with a Sapphire 290X: could you upload the quiet-mode BIOS? I thought I backed up my quiet-mode BIOS before flashing, but for some reason my backup is the uber mode, not the quiet mode. Dunno how that happened, but anyway, it would be great if someone could upload the stock quiet-mode BIOS for the Sapphire card.

Quote:


> Originally Posted by *overclockFrance*
> 
> Thank you very much for this useful information.
> 
> Can the 290 be flashed with the PT1 BIOS? It seems there are problems with the 290 and the PT3 one.


Even with my R9 290X the PT3 BIOS is unstable.

I got 1275/1466 with PT1, but with PT3 I can't get the same clocks stable.


----------



## Sgt Bilko

I think I might have to RMA my 290X... I'm getting a BSOD around 5 minutes into a game now. I've tried 4 different games, switching from quiet to uber, and resetting the stock clocks and then raising them again.

Thoughts, anyone?


----------



## selk22

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Sgt Bilko*
> 
> i think i might have to RMA my 290x........i'm getting a BSOD around 5mins into a game now, i've tried 4 different games, switching from quiet to uber and resetting the stock clocks and then raising them again.
> 
> Thoughts anyone?






I thought you recently replaced the CPU with an 8350? If so, maybe that's the cause?


----------



## Sazz

And for the people getting black screens even without overclocking: do you happen to have MSI Afterburner beta 15? So far I've started to experience this when running MSI AB, but when I close it I don't get it any more, and I usually get the black screen when I switch tabs in Chrome.

I've installed beta 16 and am testing this theory out.


----------



## Sgt Bilko

Not yet, the CPU is still shipping (no post on weekends here). I haven't changed drivers or updated anything; everything is the same as before. It's only in games though, I can play movies, music videos etc. all fine.

It's just weird.


----------



## sasuki

Hi! Awesome thread!

Need some help! I'm thinking of buying the R9 290 (without the X).




I'm not gonna overclock it. So is the stock cooler good for normal use, or should I buy a custom cooler or wait for the non-reference R9 290?

Thanks


----------



## selk22

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Sgt Bilko*
> 
> not yet, CPU is still shipping (no post on weekends here), i haven't change drivers......updated anything, everything is the same as before, it's only in games though, i can play movies, music videos etc all fine
> 
> it's just weird





Do the usual troubleshooting: pull RAM sticks one at a time, then reset the CMOS and reinstall drivers in safe mode. The usual things.









If nothing helps then I would RMA that sucker quick!
Quote:


> Originally Posted by *sasuki*
> 
> Hi! awesome thread!
> 
> Need some help! Im thinking of buying the R9 290 (without the X)
> 
> 
> 
> 
> 
> 
> 
> 
> I'm not gonna overclock it. So is the stock cooler good for normal use, or should I buy a custom cooler or wait for the non-reference R9 290?
> 
> Thanks


If you can set a custom fan profile and don't mind noise, then sure, order a reference card, especially if you don't mind replacing the cooler with something better in the future. A fan profile with something like a 60% curve ensures you won't get thermal throttling. It is loud though..

Or wait a few weeks and see what happens.
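The "60% curve" above is just one point on a temperature-to-duty map. A minimal sketch of the kind of piecewise-linear fan curve tools like Afterburner apply (the breakpoints here are made up for illustration, not AMD defaults):

```python
def fan_percent(temp_c, curve=((40, 30), (70, 60), (85, 85), (95, 100))):
    """Linearly interpolate fan duty (%) from (temp C, duty %) breakpoints."""
    if temp_c <= curve[0][0]:
        return curve[0][1]  # pin at the minimum duty below the first point
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # linear interpolation between the two surrounding breakpoints
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]  # pin at maximum duty above the last breakpoint

print(fan_percent(55))   # 45.0, halfway between the 40 C and 70 C points
print(fan_percent(100))  # 100
```

The point of ramping duty with temperature rather than fixing it at 60% is that the card stays quiet on the desktop but still gets enough airflow under load to avoid throttling.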


----------



## Sazz

My 3DMark 11 and 3DMark Firestrike best scores are with the PT1 BIOS. PT3 isn't stable for me; I can't even achieve the 1100/1420 clocks that I can hit with the stock Sapphire BIOS without it getting unstable.

3Dmark 11
http://www.3dmark.com/3dm11/7456375

3Dmark Firestrike
http://www.3dmark.com/fs/1117688

Breaking 10k on a pure AMD system with a single GPU on Firestrike, finally! xD

And btw, anyone who owns a Sapphire 290X card: can someone upload the quiet-mode BIOS? Somehow I backed up the wrong BIOS; instead of the quiet mode I backed up the uber mode. I flashed over the quiet mode and I want to revert back.


----------



## Sgt Bilko

Well, I set the BIOS to quiet mode (didn't press the switch fully last time) and it seems to be OK now. Just running stock clocks and so far so good.

Fingers crossed......... heh, I was almost hoping I had to RMA so I could get myself 2 x 290s instead.









EDIT: my uber mode is busted....... Quiet works fine, but when I switch to uber I get 5 minutes into a game and it either crashes, black screens or BSODs. Time to RMA, I think.


----------



## sasuki

Quote:


> Originally Posted by *selk22*
> 
> Do the usual trouble shoot.. Pull ram sticks one at a time.. Then reset CMOS and reinstall drivers in safe mode. Usual things
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If nothing helps then I would RMA that sucker quick!
> If you can set a custom fan profile and don't mind noise then sure order a reference card especially if you don't mind replacing the cooler with something better in the future. The fan profile ensures you wont get thermal throttling something like a 60% curve. It is loud though..
> 
> or wait a few weeks and see what happens


Thanks for the reply! I think I'll wait, because I read somewhere that non-reference R9 290X cards are coming in late November.


----------



## Sgt Bilko

This might seem like a silly question, but is there any difference between the quiet and uber BIOSes besides CCC limiting the fan speed cap?


----------



## smokedawg

http://imgur.com/uWhd8Hr


Finally got to install my 290x (old case was too small):

1x Powercolor 290x
Stock cooling for the moment (waiting for EK Water Block)
OC to 1100 / 1330 with stock bios (any higher on either gave a few artifacts in valley)

ASIC 79.0%


----------



## HoZy

Quote:


> Originally Posted by *smokedawg*
> 
> Finally got to install my 290x (old case was too small):
> 
> 1x Powercolor 290x
> Stock cooling for the moment (waiting for EK Water Block)
> OC to 1100 / 1330 with stock bios (any higher on either gave a few artifacts in valley)
> 
> ASIC 79.0%


What are you running that's dropping your bus speed down to 1x ?


----------



## Sgt Bilko

Quote:


> Originally Posted by *HoZy*
> 
> What are you running that's dropping your bus speed down to 1x ?


It drops down to PCIe 1.1 speed when you aren't running anything at all.


----------



## Paul17041993

Quote:


> Originally Posted by *HoZy*
> 
> What are you running that's dropping your bus speed down to 1x ?


think that's just the link state power saving...


----------



## smokedawg

Quote:


> Originally Posted by *HoZy*
> 
> What are you running that's dropping your bus speed down to 1x ?


Not sure, but according to GPU-Z it might be a wrong reading. When I run the render test I get this:


http://imgur.com/8YnCGb6


(which is the maximum for my sandy bridge from what I read)


----------



## HoZy

Ah sorry, Didn't notice it was in power saving mode.


----------



## r0l4n

How do you guys avoid all the tearing in the desktop when overclocking the memory? I'm game stable at over 1650, but back to the desktop, I get tearing and image corruption from 1500 and above. Anything to do with the 2D profile maybe? EDIT: I'm using AB beta 16.


----------



## SonDa5

Hey guys and gals I just did some 290X PSU tests and shared my results in a separate thread.

Check it out.









http://www.overclock.net/t/1441118/290x-psu-power-output-tests


----------



## Sgt Bilko

Well, I'm going to put the stock cooler back on tomorrow and see if that makes a difference or not. Hopefully it does and it just means I've got poor cooling somewhere I need to work out.

if it doesn't then I guess I gotta RMA it.........which kinda sucks


----------



## LookOut

Hi,

I use the pt1 bios for my r290x Club 3d.
When I set the vcore up to 1.497V in Asus GPU Tweak, my computer shuts down immediately (without running any benchmark) (GPU @ 1230MHz).
With 1.48V everything runs fine (also under heavy load).

Seems to be a protection feature?

Anybody have an idea how to turn this off?

System:
3930k
Fatality Champion X79
290x Club 3d
Tagan PipeRock 1100W

Or should I try another Psu ?


----------



## Menthol




----------



## SonDa5

Quote:


> Originally Posted by *LookOut*
> 
> Hi,
> 
> I use the pt1 bios for my r290x Club 3d.
> When I set the vcore up to 1.497V in Asus GPU Tweak, my computer shuts down immediately (without running any benchmark) (GPU @ 1230MHz).
> With 1.48V everything runs fine (also under heavy load).
> 
> Seems to be a protection feature?
> 
> Anybody have an idea how to turn this off?
> 
> System:
> 3930k
> Fatality Champion X79
> 290x Club 3d
> Tagan PipeRock 1100W
> 
> Or should I try another Psu ?


I wouldn't mess with the BIOS. Get a BIOS that is stable and stick with it till a better BIOS is available.


----------



## $ilent

Every time I ask this people ignore me, but do we have any Afterburner update yet?


----------



## Jpmboy

Quote:


> Originally Posted by *youra6*
> 
> http://www.3dmark.com/3dm11/7456119
> 
> R9 290 CFX @ 1100/1400mhz *under water*, stock volts
> 
> http://www.3dmark.com/3dm11/7223454
> 
> GTX 670 SLI @ +1400/+500mhz *under water* 1.212V
> 
> 9k higher score... impressive. Imagine the score once I push this thing closer to 1300Mhz... Gonna run the test again; killing all my unneeded processes in the task manager.


for some reason, I can't even complete a 3DMark 11 run with my 290X. FS and FSE - fine, Valley - fine (but not impressive), but 3DMark 11 gives an error saying "Exclusive access to keyboard or monitor has been lost". Huh?

also, the new Cat driver does not want to play well with a 4K monitor and fails to do any resolution other than 4K correctly (1080p is a tiny window in the center)

HELP!

*OP - now watercooled:*


----------



## bpmcleod

Anyone have any ideas as to why my R9 290 can't be recognized in any flash utility? It is still on the stock BIOS atm..

Edit: In DOS using atiflash I just get the "Adapter not found" message, and WinFlash just says no discrete ATI card found..


----------



## flopper

Quote:


> Originally Posted by *$ilent*
> 
> Every time I ask this people ignore me, but do we have any Afterburner update yet?


no


----------



## upgraditus

Quote:


> Originally Posted by *$ilent*
> 
> Every time I ask this people ignore me, but do we have any Afterburner update yet?


VRM state/id/location seem to be pretty similar to my R290 (Hawaii PRO Press Sample).
You can alter the voltage on it even with the current MSI AB beta by sending commands to the VRM via the MSI AB command line:

To set +100mV offset:
MSIAfterburner.exe /wi4,30,8d,10

To restore original voltage:
MSIAfterburner.exe /wi4,30,8d,0

Use it at your own risk

source: http://forums.guru3d.com/showthread.php?t=382760&page=6
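If you'd rather not type raw voltage commands at the VRM, here's a small dry-run sketch (the `ab_vrm_cmd` helper name is mine, not part of Afterburner, and it's POSIX shell purely for illustration): it only prints the exact command line from the Guru3D post for a given offset so you can eyeball it before actually running anything. Same caveat as above, use at your own risk.

```shell
# Dry-run helper: maps a voltage offset to the MSI AB command line
# quoted above and prints it instead of executing it.
ab_vrm_cmd() {
  case "$1" in
    +100) echo 'MSIAfterburner.exe /wi4,30,8d,10' ;;  # set +100mV offset
    0)    echo 'MSIAfterburner.exe /wi4,30,8d,0'  ;;  # restore original voltage
    *)    echo "unsupported offset: $1" >&2; return 1 ;;
  esac
}

ab_vrm_cmd +100   # prints the +100mV command
ab_vrm_cmd 0      # prints the restore command
```

Once you're sure, copy the printed line into a command prompt in the Afterburner folder.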
Quote:


> Originally Posted by *Jpmboy*
> 
> for some reason, I can't even complete a 3DMark 11 run with my 290X. FS and FSE - fine, Valley - fine (but not impressive), but 3DMark 11 gives an error saying "Exclusive access to keyboard or monitor has been lost". Huh?
> 
> also, the new Cat driver does not want to play well with a 4K monitor and fails to do any resolution other than 4K correctly (1080p is a tiny window in the center)
> 
> HELP!
> 
> *OP - now watercooled:*


I had that issue in 3DMark11 (Exclusive access...) recently with a 7970, so it's not exclusive to the new GPUs. Fixed by removing all driver traces (DDU) and reinstalling.


----------



## rdr09

Standing in front of house waiting for the GPU.












waterblock should be here by 2015!


----------



## SonDa5

Quote:


> Originally Posted by *rdr09*
> 
> Standing in front of house waiting for the GPU.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> waterblock should be here by 2015!


The day mine was delivered I had the same message on the tracking but I didn't get it till almost 8pm the same day.










Good luck.


----------



## SonDa5

Quote:


> Originally Posted by *bpmcleod*
> 
> Anyone have any ideas as to why my R9 290 can't be recognized in any flash utility? It is still on the stock BIOS atm..
> 
> Edit: In DOS using atiflash I just get the "Adapter not found" message, and WinFlash just says no discrete ATI card found..


You need ATIflash 4.17
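For anyone else stuck at "Adapter not found": once you're on 4.17, the commonly cited DOS usage is roughly the following (option names are from memory, so run `atiflash` with no arguments to see your version's real help, and flash at your own risk):

```
REM list detected adapters and their index numbers
atiflash -i

REM back up the current BIOS of adapter 0 before touching anything
atiflash -s 0 backup.rom

REM program adapter 0 with the new BIOS image
atiflash -p 0 new.rom
```

Keep that backup.rom somewhere safe in case the flash goes sideways.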


----------



## SonDa5

Quote:


> Originally Posted by *$ilent*
> 
> Every time I ask this people ignore me, but do we have any Afterburner update yet?


Nope.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> Standing in front of house waiting for the GPU.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> waterblock should be here by 2015!


I wish our post would deliver on weekends.......my CPU is sitting in a warehouse 80kms away


----------



## qqan

Partner,

Could you change my card to X on the list? I'll post the AC Hybrid experience and some "black screen" thoughts later.

-qq


----------



## $ilent

Is it bad if my gpu is getting these artifacts at stock:



I get these little artifacts at stock but whilst folding at 100%. The artifacts move as soon as I open the Windows screen capture, so I had to mock them up in Paint, but this is what they look like.


----------



## rdr09

Quote:


> Originally Posted by *SonDa5*
> 
> The day mine was delivered I had the same message on the tracking but I didn't get it till almost 8pm the same day.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Good luck.


that's ok. i'll drain my loop while waiting.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> I wish our post would deliver on weekends.......my CPU is sitting in a warehouse 80kms away


weekend? i'd be in Bondi.


----------



## Sgt Bilko

Not all of Aus is Beaches you know









nah, i thought i'd actually get to work on my pile of shame....not bad so far, finished 3 games.......another 50-something to go..........


----------



## upgraditus

Quote:


> Originally Posted by *$ilent*
> 
> Is it bad if my gpu is getting these artifacts at stock:
> 
> I get these little artifacts at stock but whilst folding at 100%. The artifacts move as soon as I open the Windows screen capture, so I had to mock them up in Paint, but this is what they look like.


If the GPU is operating within safe temp levels, I personally would RMA it, as it should be able to run @ stock without errors (artifacts)


----------



## $ilent

hmm suppose I will contact them.


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Not all of Aus is Beaches you know
> 
> 
> 
> 
> 
> 
> 
> 
> 
> nah, i thought i'd actually get to work on my pile of shame....not bad so far, finished 3 games.......another 50-something to go..........


will your stock liquid cooler work?


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> will your stock liquid cooler work?


I'm guessing that's in reference to the 8350?

if so then no, the 8150 and closed loop I currently have will be donated to the wife's rig and I will be getting a H100i for the 8350.


----------



## Kipsofthemud

When I was drunk last night I ordered a Gigabyte 290X for 380 euros because it was 3 euros more expensive than a 290...I also ordered a waterblock for it.

I wish I would have read this thread before pulling the trigger...It seems like you guys are having a lot of trouble with these cards x.x

Are there any people that are actually enjoying their purchase?


----------



## SonDa5

Quote:


> Originally Posted by *Kipsofthemud*
> 
> Are there any people that are actually enjoying their purchase?


Me.


----------



## upgraditus

Quote:


> Originally Posted by *Kipsofthemud*
> 
> When I was drunk last night I ordered a Gigabyte 290X for 380 euros because it was 3 euros more expensive than a 290...I also ordered a waterblock for it.
> 
> I wish I would have read this thread before pulling the trigger...It seems like you guys are having a lot of trouble with these cards x.x
> 
> Are there any people that are actually enjoying their purchase?


If you don't mind me asking where did you find it at that ridiculous price?

I was considering a 290pro but if the x can be had for that......


----------



## rdr09

Quote:


> Originally Posted by *Kipsofthemud*
> 
> When I was drunk last night I ordered a Gigabyte 290X for 380 euros because it was 3 euros more expensive than a 290...I also ordered a waterblock for it.
> 
> I wish I would have read this thread before pulling the trigger...It seems like you guys are having a lot of trouble with these cards x.x
> 
> Are there any people that are actually enjoying their purchase?


dude, when you go to a bar . . do you try to meet a woman before drinking or after a few drinks?

kid kid

things will get ironed out soon enough. we are pioneers. some are flashing and overclocking and these things will introduce issues.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Kipsofthemud*
> 
> When I was drunk last night I ordered a Gigabyte 290X for 380 euros because it was 3 euros more expensive than a 290...I also ordered a waterblock for it.
> 
> I wish I would have read this thread before pulling the trigger...It seems like you guys are having a lot of trouble with these cards x.x
> 
> Are there any people that are actually enjoying their purchase?


I'm enjoying mine, i've had some hiccups but it's to be expected on a new ref card.......and since you ordered a water block with it i imagine you will enjoy yours too









Things can only go up from here


----------



## Arizonian

Quote:


> Originally Posted by *smokedawg*
> 
> 
> 
> http://imgur.com/uWhd8Hr
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Finally got to install my 290x (old case was too small):
> 
> 1x Powercolor 290x
> Stock cooling for the moment (waiting for EK Water Block)
> OC to 1100 / 1330 with stock bios (any higher on either gave a few artifacts in valley)
> 
> ASIC 79.0%


Congrats and added









Also thank you so much for putting all your info in the post. That's what I'm looking for.









Quote:


> Originally Posted by *Menthol*
> 
> 
> 
> Spoiler: Warning: Spoiler!


I don't know which brand or series you have, so you're down for a Sapphire 290X by default. Let me know if it's any different.









Congrats added









Quote:


> Originally Posted by *rdr09*
> 
> Standing in front of house waiting for the GPU.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> waterblock should be here by 2015!


I know that feeling of anticipation of new computer toys. Good times.









Quote:


> Originally Posted by *Sgt Bilko*
> 
> I wish our post would deliver on weekends.......my CPU is sitting in a warehouse 80kms away


I know that feeling too. Sorry bud.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Arizonian*
> 
> I know that feeling too. Sorry bud.


It's all good, i just got a new PB in valley so im happy atm......gonna try and tweak some more for Firestrike, i'm determined to crack 5k in extreme


----------



## qqan

Quote:


> Originally Posted by *Kipsofthemud*
> 
> When I was drunk last night I ordered a Gigabyte 290X for 380 euros because it was 3 euros more expensive than a 290...I also ordered a waterblock for it.
> 
> I wish I would have read this thread before pulling the trigger...It seems like you guys are having a lot of trouble with these cards x.x
> 
> Are there any people that are actually enjoying their purchase?


 these are enthusiast cards man, no xbox stuff..


----------



## Asrock Extreme7

My 290x works best with Windows 8. I had problems with Windows 7 and major problems with 8.1 (black screens), so I went back to Windows 8 and haven't had one problem since; BF4 runs great. If you lot are having problems, try Windows 8.


----------



## Tobiman

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> My 290x works best with Windows 8. I had problems with Windows 7 and major problems with 8.1 (black screens), so I went back to Windows 8 and haven't had one problem since; BF4 runs great. If you lot are having problems, try Windows 8.


Good to know. Seems windows 8.1 is the cause of a lot of black screen problems.


----------



## Asrock Extreme7

Yes, and GPU Tweak is a no-go on 8.1. I've had the card from day one so I've done a lot of problem solving.


----------



## Sgt Bilko

Looks like im not gonna crack 5k until that 8350 gets here

http://www.3dmark.com/fs/1119878


----------



## qqan

Quote:


> Originally Posted by *Tobiman*
> 
> Good to know. Seems windows 8.1 is the cause of a lot of black screen problems.


I found an issue when OCing using the Sabertooth Z77 performance option - screens go black, and USB stops (?) but the system keeps running.
This looks driver related and Windows 8.1 specific. On stock clocks everything is rock solid.


----------



## Kipsofthemud

Quote:


> Originally Posted by *qqan*
> 
> these are enthusiast cards man, no xbox stuff..


What are you saying son? Who's talking about xboxes.


----------



## Sammyboy83

Got my 290 benching at 1250MHz/1650MHz with the Asus 290X BIOS. The vdroop gives 1.227V when 1.4V was set in GPU Tweak. BF4 is artifact-free at 1220/1650. The card is watercooled.

http://www.3dmark.com/3dm11/7458936


----------



## Kipsofthemud

Quote:


> Originally Posted by *SonDa5*
> 
> Me.


Quote:


> Originally Posted by *upgraditus*
> 
> If you don't mind me asking where did you find it at that ridiculous price?
> 
> I was considering a 290pro but if the x can be had for that......


It's a Dutch shop, but they don't ship abroad. still want to know?
Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm enjoying mine, i've had some hiccups but it's to be expected on a new ref card.......and since you ordered a water block with it i imagine you will enjoy yours too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Things can only go up from here


I guess, I'm just a bit frightened cuz of all the NODRIVERS spam i've seen for the last 5 years







It'll be my first AMD card since the 9800 Pro...
Quote:


> Originally Posted by *rdr09*
> 
> dude, when you go to a bar . . do you try to meet a woman before drinking or after a few drinks?
> 
> kid kid
> 
> things will get ironed out soon enough. we are pioneers. some are flashing and overclocking and these things will intro issues.


I guess it wasn't the best call, but I saw that price, 3 euros more than their 290 and 100 euros cheaper than the 2nd-cheapest 290X, so I couldn't let it pass!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Sammyboy83*
> 
> Got my 290 benching at 1250MHz/1650MHz with the Asus 290X BIOS. The vdroop gives 1.227V when 1.4V was set in GPU Tweak. BF4 is artifact-free at 1220/1650. The card is watercooled.
> 
> http://www.3dmark.com/3dm11/7458936


Nice set-up.....you should start posting some benches over here:http://www.overclock.net/t/1436635/ocn-gk110-vs-hawaii-bench-off-thread#

We need some more 290/x's in the fight over there


----------



## hatlesschimp

Do all manufacturers of the 290X support HDMI 2.0?

I've seen on Club 3D's website that their 290X will support HDMI 2.0.

I have a Sony 65" 4K TV and some games are great in 4K at 30Hz, but others like CoD and BF4 are unplayable. Sony will soon release the firmware update for my TV to enable HDMI 2.0, but I think it's only going to be 8-bit colour and 4:2:0 chroma; this hasn't been confirmed yet.

So, in general, is the 290X good? Will 2 of them get me 60fps at 4K?

Which manufacturer's 290X should I get?

Thanks


----------



## Sammyboy83

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Nice set-up.....you should start posting some benches over here:http://www.overclock.net/t/1436635/ocn-gk110-vs-hawaii-bench-off-thread#
> 
> We need some more 290/x's in the fight over there


I will post the 3dmark score there. Thanks


----------



## bpmcleod

Quote:


> Originally Posted by *SonDa5*
> 
> You need ATIflash 4.17


Ty. Worked perfect!



Powercolor R9 290 flashed to 290x ASUS BIOS







Add me plz







Unless more is needed?


----------



## upgraditus

Quote:


> Originally Posted by *Kipsofthemud*
> 
> It's a Dutch shop, but they don't ship abroad. still want to know?
> I guess, I'm just a bit frightened cuz of all the NODRIVERS spam i've seen for the last 5 years
> 
> 
> 
> 
> 
> 
> 
> It'll be my first AMD card since the 9800 Pro...
> 
> I guess it wasn't the best call, but I saw that price, 3 euros more than their 290 and 100 euros cheaper than the 2nd-cheapest 290X, so I couldn't let it pass!


Thanks but no point if no euro shipping.

It was the right call, trust me; even if you don't like it you'll be able to eBay it and get your money back, if not make a profit on it.


----------



## esqueue

This is probably the best place to ask this question. If I want 5.1+ sound from a game on my PC, what would be my best way of achieving that? I have an ancient Kenwood 5.1 receiver that doesn't even have DTS, only DD 5.1 and older.

My current setup is a custom-made jumper that sits on the motherboard's S/PDIF pins, going to an AV cable that connects to my receiver's coaxial input. The optical out gives exactly the same result, so that isn't the issue.

I heard that I can either get a soundcard that supports Dolby Digital Live, or get a better receiver that has HDMI and use my videocard for 5.1. I'm more interested in the latter and can use ANY advice.

Anyone using their 290X or 290 as an audio out and getting 5.1 audio during gaming? I've been getting 5.1 from movies for years, but this is the one thing my consoles have over my PC setup.


----------



## upgraditus

Quote:


> Originally Posted by *esqueue*
> 
> This is probably the best place to ask this question. If I want 5.1+ sound from a game on my PC, what would be my best way of achieving that? I have an ancient Kenwood 5.1 receiver that doesn't even have DTS, only DD 5.1 and older.
> 
> My current setup is a custom-made jumper that sits on the motherboard's S/PDIF pins, going to an AV cable that connects to my receiver's coaxial input. The optical out gives exactly the same result, so that isn't the issue.
> 
> I heard that I can either get a soundcard that supports Dolby Digital Live, or get a better receiver that has HDMI and use my videocard for 5.1. I'm more interested in the latter and can use ANY advice.


For true surround you need either analogue or, as I do, HDMI from the GPU to a capable receiver (Onkyo 876). Onboard analogue is kinda sucky though, with a low SNR, while a cheap X-Fi will sound much better; just make sure it's a real X-Fi (i.e. Xtreme Gamer/Music).

Any surround over S/PDIF is lossy.


----------



## battleaxe

Your last choice was better. A new receiver with HDMI. You can pipe the sound via hdmi straight from the 290x into the receiver. You could even use the receiver's switching to then send the signal to your monitor if you use HDMI. If you use DVI-D then that won't work of course.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Sammyboy83*
> 
> I will post the 3dmark score there. Thanks


Awesome, feel free to post any other benches you do in there as well. people are very interested in seeing what the 290 is capable of and yours is no slouch


----------



## maarten12100

Got my 2 290s yesterday. They work fine, but the LEDs in the cooler don't light up. Aren't they supposed to light up?


----------



## esqueue

Quote:


> Originally Posted by *upgraditus*
> 
> For true surround you need either analogue or, as I do, HDMI from the GPU to a capable receiver (Onkyo 876). Onboard analogue is kinda sucky though, with a low SNR, while a cheap X-Fi will sound much better; just make sure it's a real X-Fi (i.e. Xtreme Gamer/Music).
> 
> Any surround over S/PDIF is lossy.


Quote:


> Originally Posted by *battleaxe*
> 
> Your last choice was better. A new receiver with HDMI. You can pipe the sound via hdmi straight from the 290x into the receiver. You could even use the receiver's switching to then send the signal to your monitor if you use HDMI. If you use DVI-D then that won't work of course.


Yeah, I read that it is lossy. Option 2 it is and thanks for the quick response guys.


----------



## upgraditus

Quote:


> Originally Posted by *battleaxe*
> 
> Your last choice was better. A new receiver with HDMI. You can pipe the sound via hdmi straight from the 290x into the receiver. You could even use the receiver's switching to then send the signal to your monitor if you use HDMI. If you use DVI-D then that won't work of course.


I cannot recommend daisy chaining from the receiver to the display; for me, even in game mode, it adds a huge amount of latency.
I run two separate DVI-HDMI adapters, one for the receiver and one for the display, set to duplicate mode; it then idles at the correct 300/150 instead of the 500/1000 you get in extended mode.

N.B.: You MUST use ATI/AMD DVI-HDMI adapters for audio.


----------



## Sgt Bilko

Quote:


> Originally Posted by *maarten12100*
> 
> Got my 2 290s yesterday. They work fine, but the LEDs in the cooler don't light up. Aren't they supposed to light up?


not that i know of.......another cost cut i imagine









EDIT: i just tried to start folding since i'm about to sleep and i got a BSOD, switched it to Quiet mode again and it's fine........yeah, i need to RMA, something is screwy here.


----------



## battleaxe

Quote:


> Originally Posted by *upgraditus*
> 
> I cannot recommend daisy chaining from the receiver to the display; for me, even in game mode, it adds a huge amount of latency.
> I run two separate DVI-HDMI adapters, one for the receiver and one for the display, set to duplicate mode; it then idles at the correct 300/150 instead of the 500/1000 you get in extended mode.
> 
> N.B.: You MUST use ATI/AMD DVI-HDMI adapters for audio.


Good point.


----------



## GioV

Quote:


> Originally Posted by *Sgt Bilko*
> 
> not that i know of.......another cost cut i imagine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: i just tried to start folding since i'm about to sleep and i got a BSOD, switched it to Quiet mode again and it's fine........yeah, i need to RMA, something is screwy here.


Did you flash your card then revert back to original bios?


----------



## PillarOfAutumn

Hey does, anyone have a spare copy of BF4 that they're looking to get rid of?


----------



## hotrod717

Quote:


> Originally Posted by *Sammyboy83*
> 
> Got my 290 benching at 1250MHz/1650MHz with the Asus 290X BIOS. The vdroop gives 1.227V when 1.4V was set in GPU Tweak. BF4 is artifact-free at 1220/1650. The card is watercooled.
> 
> http://www.3dmark.com/3dm11/7458936


Quote:


> Originally Posted by *bpmcleod*
> 
> Ty. Worked perfect!
> 
> 
> 
> Powercolor R9 290 flashed to 290x ASUS BIOS
> 
> 
> 
> 
> 
> 
> 
> Add me plz
> 
> 
> 
> 
> 
> 
> 
> Unless more is needed?


Alright! This gives me hope. Now if only I had ordered my 290 when I ordered my monitor last week, I'd be doing this too. Have to wait until Tuesday.


----------



## Arizonian

Quote:


> Originally Posted by *bpmcleod*
> 
> Ty. Worked perfect!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Powercolor R9 290 flashed to 290x ASUS BIOS
> 
> 
> 
> 
> 
> 
> 
> Add me plz
> 
> 
> 
> 
> 
> 
> 
> Unless more is needed?


Congrats - added


----------



## kcuestag

I'm having an issue where my computer loses signal to the monitor and I lose audio as well, so I think the computer just hangs with blackscreen (no signal).

Anyone experienced this? I'm on the Uber bios (Although on Water), I tried both CPU and GPU on stock and was still having the issue.


----------



## upgraditus

Quote:


> Originally Posted by *kcuestag*
> 
> I'm having an issue where my computer loses signal to the monitor and I lose audio as well, so I think the computer just hangs with blackscreen (no signal).
> 
> Anyone experienced this? I'm on the Uber bios (Although on Water), I tried both CPU and GPU on stock and was still having the issue.


Under what conditions are you getting these black screen crashes?

This is the only thing putting me off right now, as it seems many people are getting this. I would say the drivers are crashing but not recovering, and taking the OS with them.


----------



## kcuestag

Quote:


> Originally Posted by *upgraditus*
> 
> Under what conditions are you getting these black screen crashes?
> 
> This is the only thing putting me off right now, as it seems many people are getting this. I would say the drivers are crashing but not recovering, and taking the OS with them.


It happens when opening a game, for example.

Not always though; just now I was playing some Arma 2 and it happened after an hour. The card is under water and NEVER exceeds 1125MHz.

So this is a common issue? I thought it was either the card dying or the PSU dying... If you say others are having this issue as well, then it is a relief.


----------



## ImJJames

Quote:


> Originally Posted by *Sammyboy83*
> 
> Got my 290 benching at 1250MHz/1650MHz with the Asus 290X BIOS. The vdroop gives 1.227V when 1.4V was set in GPU Tweak. BF4 is artifact-free at 1220/1650. The card is watercooled.
> 
> http://www.3dmark.com/3dm11/7458936


Can you do valley extremehd pls


----------



## kcuestag

It just happened again, opened Arma 2 and 2 minutes later no signal and I had to hard reset the computer.









Anyone else getting similar behavior?


----------



## upgraditus

Quote:


> Originally Posted by *kcuestag*
> 
> It happens when opening a game, for example.
> 
> Not always though; just now I was playing some Arma 2 and it happened after an hour. The card is under water and NEVER exceeds 1125MHz.
> 
> So this is a common issue? I thought it was either the card dying or the PSU dying... If you say others are having this issue as well, then it is a relief.


I've seen it across a few different forums now, and it seems to be happening under many different circumstances, overclocked or stock. Some people claim to have fixed it with more juice to the CPU, some say certain software was the issue (e.g. GPU Tweak), others say reinstalling the drivers helped. My hope is that it's just teething trouble with the driver, but I fear it's something else. I used to get this same crash with a 7970 if OC'd too high on the core/not enough voltage.

http://forums.guru3d.com/showthread.php?t=383104
http://hardforum.com/showthread.php?s=4d59110b42a79f8e647c19304b17b434&t=1789166
http://battlelog.battlefield.com/bf3/en/forum/threadview/2955065218069487298/1/


----------



## kcuestag

Quote:


> Originally Posted by *upgraditus*
> 
> I've seen it across a few different forums now, and it seems to be happening under many different circumstances, overclocked or stock. Some people claim to have fixed it with more juice to the CPU, some say certain software was the issue (e.g. GPU Tweak), others say reinstalling the drivers helped. My hope is that it's just teething trouble with the driver, but I fear it's something else. I used to get this same crash with a 7970 if OC'd too high on the core/not enough voltage.
> 
> http://forums.guru3d.com/showthread.php?t=383104
> http://hardforum.com/showthread.php?s=4d59110b42a79f8e647c19304b17b434&t=1789166
> http://battlelog.battlefield.com/bf3/en/forum/threadview/2955065218069487298/1/


I'll remove MSI Afterburner and see if it helps... Maybe that's causing the issue? I remember reading some people were having issues with MSI Afterburner since it still doesn't fully support this GPU, but I don't know what kind of issues they were talking about.


----------



## outofmyheadyo

Quote:


> Originally Posted by *Sammyboy83*
> 
> Got my 290 benching at 1250MHz/1650MHz with the Asus 290X BIOS. The vdroop gives 1.227V when 1.4V was set in GPU Tweak. BF4 is artifact-free at 1220/1650. The card is watercooled.
> 
> http://www.3dmark.com/3dm11/7458936


What would the score be with a 4770K @ 5GHz? I don't think I can go much over 15K on my 780 & 4770K @ 5GHz, but you are running the X CPU; that must give quite a boost to the score.


----------



## Ricdeau

Got my two 290Xs on a truck out for delivery today. The wait is killing me. Still no waterblocks either.


----------



## evensen007

Quote:


> Originally Posted by *kcuestag*
> 
> I'll remove MSI Afterburner and see if it helps... Maybe that's causing the issue? I remember reading some people were having issues with MSI Afterburner since it still doesn't fully support this GPU, but I don't know what kind of issues they were talking about.


Are you using displayport by any chance? If so, unplug the DP from the back of the computer and then plug it back in.


----------



## Slomo4shO

Quote:


> Originally Posted by *Arm3nian*
> 
> That is going to double - triple once you start heavy overvolting.


For benchmarks it is fine to push such power consumption limits but for a 24/7 gaming rig it is impractical. There is absolutely no reason why anyone in their right mind would double or triple their power consumption to obtain 10-20% performance gains.

Quote:


> Originally Posted by *Arm3nian*
> 
> If overclocking on air I'd say the minimum would be 1000watts for a full system. The list just posted has some good info.
> 
> The Asus bios draws quite a bit more power, so it really depends on what bios you're using and if on air or water.


Power consumption is always exaggerated... This is a 7950 quadfire at 1000core/1500memory with a 4770K at 4.6GHz core/4.6GHz uncore running Heaven:


----------



## kcuestag

Quote:


> Originally Posted by *evensen007*
> 
> Are you using displayport by any chance? If so, unplug the DP from the back of the computer and then plug it back in.


No, I'm using Dual-Link DVI for 2560x1440.


----------



## upgraditus

Quote:


> Originally Posted by *kcuestag*
> 
> I'll remove MSI Afterburner and see if it helps... Maybe that's causing the issue? I remember reading some people were having issues with MSI Afterburner since it still doesn't fully support this GPU, but I don't know what kind of issues they were talking about.


It can't hurt to try, but I don't think that is the problem. I would personally look at clock speeds. Go back to stock and test for a bit, then try lower than stock if it's still doing it. I know no one wants to underclock, but if we can see whether it is clock related, be it core or vram, we can rule out pretty much everything else, then ask AMD what's the dealyo.

Quote from BF4 thread:

"Made a small video where I do some other benchmarks resulting in a black screen.

Now I also did these tests at 800/1000 settings in MSI Afterburner, and the funny thing was that everything was working fine. Then I turned the GPU back to default MHz and the memory to 1000, black screens again. And the other way around, 800MHz GPU and stock memory, black screens again.

My conclusion is that the card is broken, I returned my card yesterday for RMA, so I'll let you know what they say.

So maybe try downclocking your card, else I would RMA it."


----------



## kcuestag

Quote:


> Originally Posted by *upgraditus*
> 
> It can't hurt to try but I don't think that is the problem. I would personally look at clock speeds. Go back to stock and test for a bit then try lower than stock if still doing it. I know no-one wants to underclock but if we can see if it is clock related be it core or vram we can rule out everything else pretty much, then ask AMD what's the dealyo.
> 
> Quote from BF4 thread:
> 
> "Made a small video where I do some other benchmarks resulting in a black screen.
> 
> Now I also did these tests at 800/1000 settings in MSI Afterburner, and the funny thing was that everything worked fine. Then I set the GPU back to default MHz and the memory to 1000: black screens again. And the other way around, 800 MHz GPU and stock memory: black screens again.
> 
> My conclusion is that the card is broken, I returned my card yesterday for RMA, so I'll let you know what they say.
> 
> So maybe try downclocking your card, else I would RMA it."


I just tried on STOCK and same crap happened after 1 minute of Arma 2... It does indeed look like when my old 7970's were unstable on OC...

I'm going to downclock and see... If that's the issue, I'll be very mad.

Any idea how to unlock CCC so that I can mess with the clocks (Instead of using MSI Afterburner)? I can't find the clocks on AMD Overdrive.


----------



## upgraditus

Quote:


> Originally Posted by *kcuestag*
> 
> I just tried on STOCK and same crap happened after 1 minute of Arma 2... It does indeed look like when my old 7970's were unstable on OC...
> 
> I'm going to downclock and see... If that's the issue, I'll be very mad.
> 
> Any idea how to unlock CCC so that I can mess with the clocks (Instead of using MSI Afterburner)? I can't find the clocks on AMD Overdrive.


Sorry I'm not familiar with the new Overdrive section but after a quick glance it seems using -50% on the power target may throttle it back?
Are you seeing this?

If not then you need to accept the license agreement.

From bit-tech article:
The power limit can now be increased (or decreased) by a whopping 50 percent compared to just 20 percent before. Oddly though, AMD has also chosen to present the GPU clock and memory clock, which has its own separate slider, as percentages. They claim this is because the clock speeds are dynamic rather than fixed, but you're still effectively adjusting the card's maximum frequencies, so there's little reason not to just call them that. This is especially true of the memory clock, which doesn't fluctuate under load like the core clock does.


----------



## Jpmboy

Quote:


> Originally Posted by *upgraditus*
> 
> VRM state/id/location seem to be pretty similar to mine R290 (Hawaii PRO Press Sample).
> You can alter voltage on it even with current MSI AB beta by sending commands to VRM via MSI AB command line:
> 
> To set +100mV offset:
> MSIAfterburner.exe /wi4,30,8d,10
> 
> To restore original voltage:
> MSIAfterburner.exe /wi4,30,8d,0
> 
> Use it at your own risk
> 
> source: http://forums.guru3d.com/showthread.php?t=382760&page=
> I had that issue in 3DMark11 (Exclusive access...) recently with a 7970 so not exclusive to new GPU's. Fixed with removing all driver traces (DDU) and reinstall.


thx - i did that on the switch over. will try again.
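For anyone decoding that Afterburner command: a small sketch of how the quoted `/wi4,30,8d,10` line seems to be structured. The `/wi` switch appears to write an I2C register as bus, device, register, value; the 6.25 mV-per-step figure below is an assumption inferred from the +100 mV example in the quote, not taken from a datasheet.

```python
# Sketch of the voltage-offset encoding behind:
#   MSIAfterburner.exe /wi4,30,8d,10   (+100 mV)
#   MSIAfterburner.exe /wi4,30,8d,0    (restore stock)
# Reading: I2C bus 4, device 0x30, register 0x8D, value 0x10.
# ASSUMPTION: the offset register counts in 6.25 mV steps, since
# 0x10 = 16 steps * 6.25 mV = 100 mV matches the quoted example.

def offset_to_reg(offset_mv: float) -> str:
    """Convert a desired voltage offset (mV) to the hex register value."""
    step_mv = 6.25  # assumed LSB size of the offset register
    return format(int(offset_mv / step_mv), "x")

# +100 mV -> "10" (hex), 0 mV -> "0", matching the two quoted commands.
```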


----------



## upgraditus

Quote:


> Originally Posted by *Jpmboy*
> 
> thx - i did that on the switch over. will try again.


np. I also had iGPU drivers which I hadn't removed, might be worth a look.


----------



## Sammyboy83

Quote:


> Originally Posted by *ImJJames*
> 
> Can you do valley extremehd pls


I need to install the valley. Will post score later.


----------



## Arm3nian

Quote:


> Originally Posted by *Slomo4shO*
> 
> For benchmarks it is fine to push such power consumption limits but for a 24/7 gaming rig it is impractical. There is absolutely no reason why anyone in their right mind would double or triple their power consumption to obtain 10-20% performance gains.
> Power consumption is always exaggerated... This is a 7950 quadfire at 1000core/1500memory with a 4770K at 4.6GHz core/4.6GHz uncore running Heaven:


Last time I looked this is OCN. I don't have to write it out; go look at the list that was posted by Jared Pace. I also don't know what the power draw of a 7950 has to do with the 290/290X, especially at 1 GHz clocks. Overvolt that on water and repost a pic of your meter.


----------



## Sammyboy83

Quote:


> Originally Posted by *outofmyheadyo*
> 
> What would the score be with a 4770K @ 5GHz? I don't think I can go much over 15K on my 780 & 4770K @ 5G, but you are running the X CPU, that must give quite a boost to the score.


The 3970X has 6 cores and will get a higher physics score; the total score will rise with it. It's better to compare the graphics score when we have different CPUs.


----------



## brazilianloser

Updated to yesterday's drivers and I can't even set up Eyefinity now without losing signal to the monitors. Every day one more problem. All the excitement I had before getting these cards has quickly evaporated.


----------



## kcuestag

I tried downclocking card to 800/1100 and got instant signal loss (black screen) upon launching Valley Benchmark.

I'm going to contact the Store and see if they'd honor a refund or an instant replacement since they have stock.


----------



## Sammyboy83

Quote:


> Originally Posted by *ImJJames*
> 
> Can you do valley extremehd pls


----------



## upgraditus

Quote:


> Originally Posted by *kcuestag*
> 
> I tried downclocking card to 800/1100 and got instant signal loss (black screen) upon launching Valley Benchmark.
> 
> I'm going to contact the Store and see if they'd honor a refund or an instant replacement since they have stock.


Damn, good luck dude.


----------



## kcuestag

Quote:


> Originally Posted by *upgraditus*
> 
> Damn, good luck dude.


As a last resort I am going to try the Beta 8 drivers, but I have no faith at all (I used Beta 7 and now the new Beta 9.2).

I also noticed that twice today the computer rebooted itself in Windows after a fresh start following a crash, and the motherboard said it rebooted as an anti-surge protection. I'm wondering if the card may be damaging (or trying to damage) other components...


----------



## Tobiman

Windows 8.1?


----------



## tsm106

Quote:


> Originally Posted by *Spectre-*
> 
> untitled-1.png 86k .png file
> 
> Quote:
> 
> 
> 
> Originally Posted by *DampMonkey*
> 
> Not if you plan on heavy overclocking
> 
> 
> 
> i am sorry you said something

You fail immediately. Posting power consumption from guru... seriously?

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> Two cards on psu with my cpu and nothing else connected, ie. no drives, no loop, nothing else. Running 3dm11 extreme at stock with +50 on air, so it's throttling a bit as expected. 903w x .89 at that wattage = *803w.* This number should blow up with blocks and serious overclocking. Ya need to remember that on the stock bios PT is trying to keep the tdp of the cards very very low.
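The "903w x .89 = 803w" arithmetic in that quote can be sketched quickly: a wall meter reads AC draw, and the components only receive the wall figure times PSU efficiency. The 0.89 efficiency is tsm106's own assumed figure for that load, not a measured value here.

```python
# Wall draw vs. actual DC load delivered to the components.
def dc_load(wall_watts: float, efficiency: float) -> float:
    """DC power the system receives, given AC wall draw and PSU efficiency."""
    return wall_watts * efficiency

# 903 W at the wall with ~89% efficiency works out to about 803 W DC,
# matching the figure in the quoted post.
```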


----------



## psyside

Can anyone do some extreme IQ tests for me please?


----------



## Forceman

Quote:


> Originally Posted by *kcuestag*
> 
> I tried downclocking card to 800/1100 and got instant signal loss (black screen) upon launching Valley Benchmark.
> 
> I'm going to contact the Store and see if they'd honor a refund or an instant replacement since they have stock.


My original 290X did the same thing - instant black screen and then reboot as soon as you put a load on it, even heavily downclocked like yours. Eventually I just got a GPU initialization error on boot up and it wouldn't work at all. Had to return it.

Also, can you update me to 290 under water. Decided to send the 290X back. Not exactly thrilled with the VRM temps I'm getting with the block, about 63C running Metro bench with default voltages, but the GPU temp is amazing - 45C under the same test.


----------



## the9quad

I am giving mine one last chance before I RMA it:


Fresh install of Windows (sticking with 8 this time instead of 8.1, since I heard 8.1 is the cause of the black screen; doubtful...)
Latest MOBO bios
Latest chipset drivers
Catalyst beta 8
Stock bios no GPU tweak
Downloading BF4 currently

wish me luck, I am about fed up.


----------



## tsm106

Quote:


> Originally Posted by *the9quad*
> 
> I am giving mine one last chance before I RMA it:
> 
> 
> Fresh install of windows (sticking with 8 this time instead of 8.1 since I heard 8.1 is the cause of the black screen< doubtful...)
> Latest MOBO bios
> Latest chipset drivers
> Catalyst beta 8
> Stock bios no GPU tweak
> Downloading BF4 currently
> 
> wish me luck, I am about fed up.


I've been on Win 8 without "major" issues since release. No rush for 8.1 here. Though I use both 8 and 7, and 8 still has issues even on a perfectly set-up OS. I use 7 24/7, btw.


----------



## kcuestag

Quote:


> Originally Posted by *Forceman*
> 
> My original 290X did the same thing - instant black screen and then reboot as soon as you put a load on it, even heavily downclocked like yours. Eventually I just got a GPU initialization error on boot up and it wouldn't work at all. Had to return it.


What worries me is that my computer has shut down a few times today, and when booting again the BIOS displayed that the motherboard fired its ANTI-SURGE protection to prevent damage. Not quite sure if this is related to the GPU or something else... I've only been getting this since today.









Quote:


> Originally Posted by *the9quad*
> 
> I am giving mine one last chance before I RMA it:
> 
> 
> Fresh install of windows (sticking with 8 this time instead of 8.1 since I heard 8.1 is the cause of the black screen< doubtful...)
> Latest MOBO bios
> Latest chipset drivers
> Catalyst beta 8
> Stock bios no GPU tweak
> Downloading BF4 currently
> 
> wish me luck, I am about fed up.


What issues are you having exactly?


----------



## brazilianloser

Quote:


> Originally Posted by *the9quad*
> 
> I am giving mine one last chance before I RMA it:
> 
> 
> Fresh install of windows (sticking with 8 this time instead of 8.1 since I heard 8.1 is the cause of the black screen< doubtful...)
> Latest MOBO bios
> Latest chipset drivers
> Catalyst beta 8
> Stock bios no GPU tweak
> Downloading BF4 currently
> 
> wish me luck, I am about fed up.


Been there and done that... Now it's even worse: I can't set up Eyefinity or even organize the monitors without getting a black screen. Going back to Newegg as soon as they send me my replacement.


----------



## the9quad

To answer your questions about my issues (since I forgot to hit quote, lol):

Random black screen lockups, in benchmarks, games, bf4 etc..

Not a power issue- PC power and cooling silencer mkIII 1200 watt- overkill for a single card.


----------



## tsm106

Quote:


> Originally Posted by *the9quad*
> 
> to answer your questions about my issues (since i forgot to hit quote lol)
> 
> Random black screen lockups, in benchmarks, games, bf4 etc..
> 
> Not a power issue- PC power and cooling silencer mkIII 1200 watt- overkill for a single card.


When in the middle of a blackscreen, have you tried disconnecting the dvi cable and trying the 2nd port or even reconnecting it?


----------



## the9quad

Quote:


> Originally Posted by *tsm106*
> 
> When in the middle of a blackscreen, have you tried disconnecting the dvi cable and trying the 2nd port or even reconnecting it?


Nope. What will that do? I mean, what do you think that will fix?


----------



## tsm106

Quote:


> Originally Posted by *the9quad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> When in the middle of a blackscreen, have you tried disconnecting the dvi cable and trying the 2nd port or even reconnecting it?
> 
> 
> 
> Nope. What will that do? I mean, what do you think that will fix?

The blackscreens imo are tied to the card's output stage. I think that if enough ppl report the issue it could be fixed in the driver.


----------



## the9quad

Quote:


> Originally Posted by *tsm106*
> 
> The blackscreen imo are tied to the card's output stage. I think that if enough ppl report the issue it could be fixed in the driver.


Well, if it happens again after all this, I will give it a shot. Not sure if it will work, because it's a hard lock only fixed by rebooting; ctrl-alt-del doesn't work.

still have about 45 minutes left on this bf4 download.


----------



## battleaxe

Quote:


> Originally Posted by *Forceman*
> 
> My original 290X did the same thing - instant black screen and then reboot as soon as you put a load on it, even heavily downclocked like yours. Eventually I just got a GPU initialization error on boot up and it wouldn't work at all. Had to return it.
> 
> Also, can you update me to 290 under water. Decided to send the 290X back. Not exactly thrilled with the VRM temps I'm getting with the block, about 63C running Metro bench with default voltages, but the GPU temp is amazing - 45C under the same test.


That looks freaking sweet BTW. Just sayin'


----------



## tsm106

Quote:


> Originally Posted by *the9quad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> The blackscreen imo are tied to the card's output stage. I think that if enough ppl report the issue it could be fixed in the driver.
> 
> 
> 
> well if it happens again, after all this... I will give it a shot. Not sure if it will work, because it's a hard lock only fixed by rebooting ctrl alt del dont work.
> 
> still have about 45 minutes left on this bf4 download.

There are different blackscreens too. Some are not BSOD in nature, and it's those that can recover by resetting the ports.


----------



## rv8000

Seeing what my 290 can handle on stock volts. The highest I could get without artifacting in Firestrike was 1150/1550, although any memory clock above 1500 seems to show no gain (Hynix chips, btw).

http://www.3dmark.com/fs/1121788



What seems to be the average clock without voltage increases for the core on 290s?


----------



## battleaxe

Quote:


> Originally Posted by *tsm106*
> 
> When in the middle of a blackscreen, have you tried disconnecting the dvi cable and trying the 2nd port or even reconnecting it?


Don't you have to shut the monitor off before doing this? I burnt out my HDMI inputs on one of my monitors doing that. Or is HDMI different?


----------



## tsm106

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> When in the middle of a blackscreen, have you tried disconnecting the dvi cable and trying the 2nd port or even reconnecting it?
> 
> 
> 
> Don't you have to shut the monitor off before doing this? I burnt out my HDMI inputs on one of my monitors doing that. Or is HDMI different?

Shutting off the panel would be a safe way of doing it. Though I'd never thought about it since I haven't had any panels burn out. Perhaps you got a surge, static maybe I dunno?


----------



## battleaxe

Quote:


> Originally Posted by *tsm106*
> 
> Shutting off the panel would be a safe way of doing it. Though I'd never thought about it since I haven't had any panels burn out. Perhaps you got a surge, static maybe I dunno?


No the DVI port still works, it was just the HDMI port that burnt out. I've heard of this happening to others also (but only so far with respect to HDMI ports).

I just didn't know if the DVI ports could do the same thing with hot plugging them. Sounds like DVI's aren't affected by this for some reason.


----------



## Falkentyne

Quote:


> Originally Posted by *kcuestag*
> 
> No, I'm using Dual-Link DVI for 2560x1440.


I already explained what is happening in a previous post. Do you have elpida memory? (I attached the memory checker earlier) 95% chance you do, as EVERY single card that is black screening that has to have the memory underclocked to 1000 MHz is elpida (so far).


----------



## brazilianloser

Yeah, RMA set up, and come Monday morning bright and early it is on its way back to Newegg... Maybe I can get my hands on a stable one next shot; if not, I might just give up and go back to the usual.


----------



## battleaxe

That's it. I'm waiting for the non ref 290's.


----------



## Falkentyne

Quote:


> Originally Posted by *kcuestag*
> 
> I just tried on STOCK and same crap happened after 1 minute of Arma 2... It does indeed look like when my old 7970's were unstable on OC...
> 
> I'm going to downclock and see... If that's the issue, I'll be very mad.
> 
> Any idea how to unlock CCC so that I can mess with the clocks (Instead of using MSI Afterburner)? I can't find the clocks on AMD Overdrive.


check my post several pages back. Does the memory checker say you have Elpida memory?


----------



## Jpmboy

Quote:


> Originally Posted by *Falkentyne*
> 
> I already explained what is happening in a previous post. Do you have elpida memory? (I attached the memory checker earlier) 95% chance you do, as EVERY single card that is black screening that has to have the memory underclocked to 1000 MHz is elpida (so far).


can you please repost or attach the memory checker to your sig?


----------



## Falkentyne

I don't know how to attach it to a sig. But here.

MemoryInfo 1005.zip 1101k .zip file


----------



## upgraditus

Quote:


> Originally Posted by *Falkentyne*
> 
> I already explained what is happening in a previous post. Do you have elpida memory? (I attached the memory checker earlier) 95% chance you do, as EVERY single card that is black screening that has to have the memory underclocked to 1000 MHz is elpida (so far).


If that is the case then why did underclocking the core/vram to 800/1000 fix it for one user but when bumping core back to stock whilst maintaining vram at 1000 the black screen crash still occurred?


----------



## kcuestag

Quote:


> Originally Posted by *Falkentyne*
> 
> I already explained what is happening in a previous post. Do you have elpida memory? (I attached the memory checker earlier) 95% chance you do, as EVERY single card that is black screening that has to have the memory underclocked to 1000 MHz is elpida (so far).


Quote:


> Originally Posted by *Falkentyne*
> 
> check my post several pages back. Does the memory checker say you have Elpida memory?


Yes, it is Elpida memory (the tool actually shows "Elpdia", not "Elpida").

But, have you also seen your motherboard enabling the ANTI-Surge protection? I don't understand why that feature is kicking in...

I'll try to underclock the Memory to 1000MHz see if it helps. Where did you gather all this information?


----------



## overclockFrance

I returned my HIS 290X back to the shop and replaced it with this Sapphire 290, because of a far better performance/price ratio.

To summarize :

Sapphire 290 stable at 1270/1725 Mhz (water) at 1412 mV with ASUS GPU Tweak, 1.27-28 V real.
Max temperatures with 3DMark Vantage :
T GPU : 43°C
T VRM1 (GPU) : 63°C
T VRM2 (RAM) : 35°C
ambient T : 25°C

Scores at these frequencies :

3DMark Vantage : 31523 (106.95 and 77.21 fps)
Reference :
290 at 947/1250 Mhz : 24573
290X at 1260/1600 Mhz : 32246 (109.4 and 79 fps)
7970 at 1350/1650 Mhz : 26854 (86.32 and 70.74 fps)

3Dmark 11 : X5475 (26.85-28.03-24.85-14.32-40.02-29.93 fps)
Reference :
290 at 947/1250 Mhz : X4226 (20.41-21.26-19.32-10.84-39.59-23.49 fps)
290X at 1250/1650 Mhz : X5514 (27.44-28.01-24.47-14.49-40.65-29.26 fps)
7970 at 1320/1700 Mhz : X3760 (18.62-19.32-16.84-10.02-26.52-19.56 fps)

Graphic score Fire Strike Extreme : 6297 (32.55-23.63-37.67-12.54 fps)
Reference :
290 at 947/1250 Mhz : 4739 (24.59-17.33-9.79 fps)
290X at 1200/1600 Mhz : 6027 (31.88-22.25-37.56-11.62 fps)
7970 at 1350/1650 Mhz : 4460 (22.78-16.88-38.08-8.71 fps)

Unigine Heaven 2560x1600, everything maxed except AA 0X : 1184, 47 fps
Reference :
290X at 1110-1625 Mhz : 1074, 42.6 fps

Metro Last Light 2560x1600, everything maxed except SSAA and PhysX disabled : 56 fps
Reference :
290 at 1100/1480 Mhz : 49 fps
290X at 1100/1480 Mhz : 52 fps
GTX780 boost at around 1000 Mhz for the GPU (I did not write the accurate frequency but it was the boost frequency with fan at 100 %) : 42 fps

Bioshock Infinite 2560x1600 everything maxed except AA 0X : 71.89 fps
Reference :
290 1000/1250 Mhz : 56.12 fps
290X 1000/1250 Mhz : 56.99 fps

According to these results, a 34 % GPU overclock and 38 % RAM overclock gives the following gains :

3DMark vantage : 28 %
3DMark 11 : 29.5 %
Fire Strike Extreme : 33 %

At present, I cannot go higher because of the maximum voltage delivered by ASUS GPU Tweak. Given my low temperatures, I am sure that I will be able to reach 1300 Mhz at 1.3 V (real). *Do you know if it is possible to flash the 290 with the PT1 bios ?*

On the other hand, the memory overclock is excellent: my 290 is equipped with Hynix memory modules, whereas my previous HIS 290X had Elpida ones. The 290X could overclock the memory to 1650 Mhz, but that was stable only in a few benches; 1600 Mhz was fully stable. Here, my 290 at 1725 Mhz is 100 % stable. The memory bandwidth is 442 GB/s !
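As a quick sanity check on the figures above (a sketch; the 947/1250 stock clocks and 512-bit bus are taken from this post and the Hawaii reference specs):

```python
# GDDR5 transfers 4 bits per pin per memory clock, so bandwidth is
# clock (MHz) * 4 * bus width (bits) / 8, in MB/s.
def gddr5_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int = 512) -> float:
    """Peak GDDR5 bandwidth in GB/s for a given memory clock and bus width."""
    return mem_clock_mhz * 4 * bus_width_bits / 8 / 1000

def oc_percent(oc_clock: float, stock_clock: float) -> float:
    """Overclock expressed as a percentage above stock."""
    return (oc_clock / stock_clock - 1) * 100

# 1725 MHz on a 512-bit bus -> 441.6 GB/s, i.e. the "442 GB/s" above.
# 1270 vs. 947 MHz core -> ~34 %; 1725 vs. 1250 MHz memory -> 38 %.
```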


----------



## kcuestag

I just tried downclocking the Memory to 1000MHz and still got an instant shutdown/reboot after opening Valley Benchmark.

After booting again, this is what the motherboard shows:



Not sure if I should blame that on the GPU black screen issue?


----------



## upgraditus

Quote:


> Originally Posted by *overclockFrance*
> 
> I returned my HIS 290X back to the shop and replaced it with this Sapphire 290, because of a far better performance/price ratio.
> 
> At present, I cannot go higher because of the maximum voltage delivered by ASUS GPU Tweak. Since of my low temperatures, I am sure that I will be able to reach 1300 Mhz at 1.3 V (real). Do you know if it is possible to flash the 290 with PT bios ?
> 
> On the other hand, the memory overclock is excellent : my 290 is equipped with Hynix memeoy modules whereas my previous HIS 290X had Elpida ones. The 290X was able to overclock the memory at 1650 Mhz but stable only on a few benchs and 1600 Mhz to be fully stable. Here, my 290 at 1725 Mhz is 100 % stable. The memory bandwidth is 442 GB/s !


What you need is configurable LLC to sort that vdroop but that will probably require a custom PCB.


----------



## SonDa5

I updated my drivers to the latest beta and, a few moments after boot, I lose signal from the card and the screen goes black. I uninstalled the driver and the problem went away. I have the Intel graphics driver for the 4770K installed as well, and I think it could be part of the problem.

Definitely buggy drivers going on with the 290X... the good news is that it's not a hardware failure.

The motherboard's power surge protection may be part of the problem as well, as may the PCI-E power settings in the BIOS. I am running Windows 8.


----------



## tsm106

Quote:


> Originally Posted by *kcuestag*
> 
> I just tried downclocking the Memory to 1000MHz and still got an instant shutdown/reboot after opening Valley Benchmark.
> 
> After booting again, this is what the motherboard shows:
> 
> 
> 
> Not sure if I should blame that on the GPU black screen issue?


Try another psu? How is the power from the wall?


----------



## overclockFrance

I had flashed the 290X with the PT3 bios and it worked fine, no more vdroop. But I did not find the answer to the bold question: is a 290 with the PT1 or PT3 bios possible? With PT1, I would be able to apply more than 1412 mV and my problem would be solved.


----------



## upgraditus

Quote:


> Originally Posted by *overclockFrance*
> 
> I had flashed the 290X with the PT3 bios and it worked fine, no more Vdroop. But I did not find the answer to the bold question : 290 with PT1 or PT3 bios possible ? With PT1, I would be able to apply more than 1412 mV and my problem would be solved.


A 290 has been flashed with the 290X bios, as people wanted to know if you can unlock the extra shaders (you cannot). Unsure if it was the ASUS one, but I cannot see why vendor-specific should matter.


----------



## upgraditus

Quote:


> Originally Posted by *kcuestag*
> 
> I just tried downclocking the Memory to 1000MHz and still got an instant shutdown/reboot after opening Valley Benchmark.
> 
> Not sure if I should blame that on the GPU black screen issue?


Do you have another GPU and PSU to test with?

Edit: sorry for double post, should've multi quoted.


----------



## seabiscuit68

Any advice would be appreciated. I can get the Sapphire reference card for $399.99 with $20 off ($379.99) and free 2 days shipping. I don't plan on modding it with a new cooler and I don't plan to put it under water.

I am cheap and don't want to pay substantially more for one with a non-reference cooler.

Is this worth buying for this price if I am not going to change the cooling?

How bad is it (noise wise)?
Do you think a new game bundle will be put in for the holidays? If so, I will definitely wait as I don't have a lot of the newer games like BF4, Ghosts, Black Flag, etc.

Thanks


----------



## overclockFrance

Quote:


> Originally Posted by *upgraditus*
> 
> 290 has been flashedd to 290x bios as people wanted to know if you can unlock the extra shaders (which you cannot). Unsure if it was ASUS one but cannot see why vendor specific should matter.


My Sapphire has been flashed with the ASUS 290 bios. If the ASUS 290X bios can be flashed onto a 290, it may be the case with the PT1 too. But I don't want to take any risk.


----------



## upgraditus

Quote:


> Originally Posted by *overclockFrance*
> 
> My Sapphire has been flashed with the ASUs 290 bios. If the ASUS 290X can be flashed on a 290, it may be the case with the PT1 too. But I don't want to take any risk.


Is there no dual bios switch?


----------



## overclockFrance

Yes, I forgot it. Consequently, I may test tomorrow.


----------



## Raephen

The mail-man rang twice today









Sign me up!



I could have doodled my handle on a piece of paper, but I took the snapshot above instead.

Forgive the bad quality. I just propped the card up and took a picture with my Nexus 7.

It's a Sapphire R9 290. Running on stock with a manual fan profile.

ASIC: 70.2 Memory: Elpida

I did a few quick tests today. Both Skyrim and Dragon Age II look, for lack of a better word, "glossier/shinier" than they did with the HD 7870. Odd.

Maybe it's so, or maybe a placebo effect. I don't know

Nor do I care. I love this card.

I've done no overclocking yet, nor am I planning to until I get the Accelero Xtreme III I ordered on it.

Right now I'm just checking this card for quality and/or flaws. I have yet to notice any coil whine, so that's good. Do you guys have any suggestions on tests I could run?

Temp-wise it's okay with the manual fan profile. I did some full 3DMark runs and the core never exceeded 84C. But man, you know the fan is spinning when it hits 40% and above!

For the 3Dmark score: http://www.3dmark.com/3dm11/7460757.


----------



## Slomo4shO

Quote:


> Originally Posted by *Arm3nian*
> 
> Last time I looked this is ocn. I don't have to write it out, go look at this list that was posted by Jared Pace. I also don't know what the powerdraw of a 7950 has to do with the 290/290x, especially at 1ghz clocks, overvolt that on water and repost a pic of your meter.


Last I checked, context was important; the original question gave no indication of any plans for extreme overclocking:

From Jared Pace's figures:
Quote:


> Stock cooler / Quiet bios / 65C / 1.175V / 1150 / 1350 = *260W*
> PT1/PT3 bios / 60C / 1.400V / 1300/1500=*545W*


That is over double the power consumption (109.6% more to be exact) for an extra 150MHz core and memory clocks. It is perfectly reasonable to expect someone to crossfire 290X on a 850W PSU if you keep the voltage and temperature down. And yes, I am pretty confident that a 290X crossfire running at 1150/1350 will outperform a single 290X at 1300/1500.
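The "109.6% more to be exact" figure above is just the percent-increase formula applied to the two wattage figures from Jared Pace's list; a one-liner confirms it:

```python
# Percent increase in power draw between two measured wattages.
def percent_increase(old_w: float, new_w: float) -> float:
    """How much larger new_w is than old_w, as a percentage."""
    return (new_w / old_w - 1) * 100

# 260 W (stock cooler, 1150/1350) -> 545 W (PT1/PT3, 1300/1500)
# is a 109.6% increase, i.e. just over double the power.
```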

Again, I fail to see why anyone wanting a 24/7 clock would want to double the power consumption just to pickup around 10% in performance.

Just because we are on OCN doesn't mean that the blanket response should be to go pickup a 1600W LEPA, watercool the rig, and oc to the maximum stable clocks using as much power as your cooling solution can adequately cool. Or maybe it does


----------



## kcuestag

I'm now testing without the side panel behind the motherboard tray; no more shutdowns. Looks like something was causing a short circuit...









Let's see if I can figure what cable was doing that.


----------



## upgraditus

Quote:


> Originally Posted by *Raephen*
> 
> The mail-man rang twice today
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sign me up!
> 
> It's a Sapphire R9 290..
> 
> ASIC: 70,2 Memory: Elpidia.


Strange, Sapphire 290 with Elpida and Hynix vram lottery.


----------



## battleaxe

Quote:


> Originally Posted by *seabiscuit68*
> 
> Any advice would be appreciated. I can get the Sapphire reference card for $399.99 with $20 off ($379.99) and free 2 days shipping. I don't plan on modding it with a new cooler and I don't plan to put it under water.
> 
> I am cheap and don't want to pay substantially more for one with a non-reference cooler.
> 
> Is this worth buying for this price if I am not going to change the cooling?
> 
> How bad is it (noise wise)?
> Do you think a new game bundle will be put in for the holidays? If so, I will definitely wait as I don't have a lot of the newer games like BF4, Ghosts, Black Flag, etc.
> 
> Thanks


I'd advise waiting and paying $20 more for a non-reference cooler. Noise isn't the only issue; non-reference coolers just work better. This is what I'm doing too.


----------



## Arm3nian

Quote:


> Originally Posted by *Slomo4shO*
> 
> Last I checked, context was important, the original question had no indication that there were any plans of extreme overclocking:
> 
> From Jared Pace's figures:
> That is over double the power consumption (109.6% more to be exact) for an extra 150MHz core and memory clocks. It is perfectly reasonable to expect someone to crossfire 290X on a 850W PSU if you keep the voltage and temperature down. And yes, I am pretty confident that a 290X crossfire running at 1150/1350 will outperform a single 290X at 1300/1500.
> 
> Again, I fail to see why anyone wanting a 24/7 clock would want to double the power consumption just to pickup around 10% in performance.
> 
> Just because we are on OCN doesn't mean that the blanket response should be to go pickup a 1600W LEPA, watercool the rig, and oc to the maximum stable clocks using as much power as your cooling solution can adequately cool. Or maybe it does


The original question said he is going to be watercooling, and you posted the power consumption on the stock cooler, lol. 850 watts isn't going to be enough; two cards by themselves are going to draw that much:

Asus bios / 60C / 1.300V / 1000/1250 =450W
Asus bios / 40C / 1.300V / 1000/1250 =430W
Asus bios / 20C / 1.300V / 1000/1250 =410W
Asus bios / 60C / 1.300V / 1250/1450 =465W
Asus bios / 40C / 1.300V / 1250/1450 =445W
Asus bios / 20C / 1.300V / 1250/1450 =425W

That overclock from 1150 on air (which is quite hard to get) to 1300 on water is going to provide more than a 10% performance increase. And that is just the core clock; you can increase the memory clock as well with better cooling.

The original question was whether the wattage would be enough. On water with an average OC (1.3V, 1250MHz), an 850 watt PSU isn't going to cut it. Don't change the question to "double the wattage isn't worth the performance increase".
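The PSU-sizing argument above can be sketched roughly. The ~465 W per-card figure comes from the list in this post; the CPU and rest-of-system wattages below are illustrative placeholders, not measurements, and the 20% headroom is a common rule of thumb rather than anything from this thread.

```python
# Rough PSU sizing: total DC load plus a safety margin for sustained use.
# The gpu_w figure is from the quoted list; cpu_w and rest_w are
# PLACEHOLDER estimates for an overclocked CPU and drives/fans.
def min_psu_watts(gpu_w: float, n_gpus: int, cpu_w: float, rest_w: float,
                  headroom: float = 0.2) -> float:
    """Suggested minimum PSU rating for the given component draws."""
    load = gpu_w * n_gpus + cpu_w + rest_w
    return load * (1 + headroom)

# Two cards at ~465 W each plus ~150 W CPU and ~50 W of drives/fans is
# already ~1130 W before headroom, which is why 850 W looks marginal here.
```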


----------



## brazilianloser

After a second fresh Windows install, a last dying attempt to see if the problem was on my end rather than the card or drivers, I ran some Valley in Surround and bam: it finished, I took a screenshot, and the computer went into its black slumber. It's now sitting beside me keeping some papers in place, while the older 280X that I haven't returned yet is doing just fine at the job the 290 failed so hard to accomplish. Hoping for the best on the replacement.


----------



## TomiKazi

Hm, I think I'll hold off on getting a 290X until these black screen issues are solved


----------



## Arizonian

Quote:


> Originally Posted by *kcuestag*
> 
> I'm now testing without the side panel from behind motherboard tray, no more shutdowns, looks like something was causing a short circuit....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Let's see if I can figure what cable was doing that.


Glad to see it wasn't the GPU after all. I think we can say the same for some others having issues; it's something being overlooked besides the GPU.
Quote:


> Originally Posted by *Raephen*
> 
> The mail-man rang twice today
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sign me up!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I could have doodled my handle on a piece of paper, but I took the snapshot above instead.
> 
> Forgive the bad quality. I just propped the card up and took a picture with my Nexus 7.
> 
> It's a Sapphire R9 290. Running on stock with a manual fan profile.
> 
> ASIC: 70.2 Memory: Elpida
> 
> I did a few quick tests today. Both Skyrim and Dragon Age II look, for lack of a better word, "glossier / shinier" than they did with the HD 7870. Odd.
> 
> Maybe it's so, or maybe a placebo effect. I don't know
> 
> Nor do I care. I love this card.
> 
> I've done no overclocking yet, nor am I planning to until I get the Accelero Xtreme III I ordered on it.
> 
> Right now I'm just checking this card for quality and/or flaws. I have yet to notice any coil whine, so that's good. Do you guys have any suggestions on tests I could run?
> 
> Temp wise it's okay with the manual fan profile. I did some full 3DMark runs and the core never exceeded 84C. But man, you know the fan is spinning when it hits 40% and above!
> 
> For the 3Dmark score: http://www.3dmark.com/3dm11/7460757.


Congrats, you're in









Update us when you get the aftermarket cooling on it. Love to see a pic of those.


----------



## SonDa5

My 290x PSU testing with a Seasonic 660W Platinum PSU.
http://www.overclock.net/t/1441118/290x-psu-power-output-tests#post_21159770


----------



## Raephen

Quote:


> Originally Posted by *upgraditus*
> 
> Strange, Sapphire 290 with Elpida and Hynix vram lottery.


Indeed. Looks like just the luck of the draw.

But the AIBs get the reference boards straight from AMD, no? Like, the only thing really Sapphire about my board is the box, stickers and warranty?

That might explain it...

Oh well, never mind that. All the talk about black screening and its possible link to memory manufacturer kinda spooked me.

You guys know of any sure-fire way to test if I've got a super massive black screener in hand?


----------



## Arizonian

BTW - I want to apologize to all the members who copied the original club signature from the OP - it was a link to the third post, not the first. It's been updated.









Darn I was crazy busy that day.


----------



## Gilgam3sh

I've had the card for a couple of days but put me on the list, no serious overclock yet as I have not got my Accelero Xtreme III yet.


----------



## brazilianloser

Quote:


> Originally Posted by *Raephen*
> 
> Indeed. Looks like just the luck of the draw.
> 
> But the AIBs get the reference boards straight from AMD, no? Like, the only thing really Sapphire about my board is the box, stickers and warranty?
> 
> That might explain it...
> 
> Oh well, never mind that. All the talk about black screening and its possible link to memory manufacturer kinda spooked me.
> 
> You guys know of any sure-fire way to test if I've got a super massive black screener in hand?


Just play some games, some benchmarks... any real stress on the card will show you if you have a black screen card.


----------



## upgraditus

Quote:


> Originally Posted by *Raephen*
> 
> You guys know of any sure-fire way to test if I've got a super massive black screener in hand?


You could OC the mem to 2000 and watch it melt. Might be aMUSEing, but I wouldn't recommend it.









Seriously though I'd just try some games out since I guess that's what you bought it for.


----------



## the9quad

Well, I can confirm that switching to Windows 8 from 8.1 has zero effect on black screens. Unplugging the output and plugging in a different one has no effect either. Time to RMA this POS.


----------



## Raephen

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats your in
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Update us when you get the aftermarket cooling on it. Love to see a pic of those.


Will do, capo.


----------



## bpmcleod

Anyone attempted flashing the pt1 bios to a 290 yet ?


----------



## Slomo4shO

Quote:


> Originally Posted by *Arm3nian*
> 
> The original question said that he is going to be watercooling, and you posted the power consumption on the stock cooler lol, 850 watts isn't going to be enough, 2 cards themselves are going to draw that much:
> 
> Asus bios / 60C / 1.300V / 1000/1250 =450W
> Asus bios / 40C / 1.300V / 1000/1250 =430W
> Asus bios / 20C / 1.300V / 1000/1250 =410W
> Asus bios / 60C / 1.300V / 1250/1450 =465W
> Asus bios / 40C / 1.300V / 1250/1450 =445W
> Asus bios / 20C / 1.300V / 1250/1450 =425W
> 
> That overclock of 1150 on air (which is quite hard to get) to 1300 on water is going to provide more than a 10% performance increase. That is just the core clock, you can increase the memory clock as well with better cooling.
> 
> The original question was if the wattage would be enough. On water with an average oc (1.3v 1250mhz), an 850 watt psu isn't going to cut it. Don't change the question to "double the wattage isn't worth the performance increase".


Wow, you are dense. The list is incomplete so you have to pick and choose to make a point. The air clocks fit best as the temps were 65C and the clocks were based on 1.175V. In fact, those clocks and voltage on water would only mean that the power consumption goes down from the air example due to lower temperatures:
Quote:


> Temperature (medium impact on wattage)
> PT1/PT3 bios / 90C / 1.400V / 1300/1500=595W (not a good idea for 1.4 on air)
> PT1/PT3 bios / 80C / 1.400V / 1300/1500=575W
> PT1/PT3 bios / 70C / 1.400V / 1300/1500=555W
> PT1/PT3 bios / 60C / 1.400V / 1300/1500=545W
> PT1/PT3 bios / 40C / 1.400V / 1300/1500=535W
> PT1/PT3 bios / 20C / 1.400V / 1300/1500=525W
> PT1/PT3 bios / 10C / 1.400V / 1300/1500=520W
> PT1/PT3 bios / -50C / 1.400V / 1300/1500=490W


1150 with 1.2V or less seems pretty reasonable under water. Water isn't just for extreme clock speeds, it is also used to reduce noise, lower temperatures, and have a more stable 24/7 OC. Is there any rule in place that suggests that you should use 1.3V to reach 1250 MHz when on water? 1250MHz is only a 8.7% OC over 1150MHz. Even with improved memory clocks, performance gains will not exceed 10% but power consumption would be up 71-78% which still means that you can run two cards at 5-10% lower frequencies and still use about the same amount of power as a single card. And since crossfire scales really well...
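For reference, the percentage figures in the post above can be checked with a quick sketch; the two wattage inputs are hypothetical per-card numbers chosen to land in the quoted 71-78% band, not measurements of my own:

```python
# Quick check of the overclock and power-delta percentages discussed above.
# The wattages are illustrative figures, not new measurements.

def pct_increase(base, new):
    """Percentage increase from base to new."""
    return (new - base) / base * 100.0

core_oc = pct_increase(1150, 1250)    # 1150 MHz -> 1250 MHz core clock
power_delta = pct_increase(260, 455)  # hypothetical ~260 W -> ~455 W per card

print(f"core OC: {core_oc:.1f}%")     # about 8.7%, matching the post
print(f"power:   {power_delta:.1f}%") # about 75%, inside the quoted 71-78%
```

The same helper makes the crossfire point easy to see: two cards each drawing the lower-clock wattage land near the single overclocked card's total, which is the trade-off being argued.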


----------



## Raephen

Quote:


> Originally Posted by *upgraditus*
> 
> You could OC mem to 2000 and watch it melt. Might be aMUSEing, but I wouldn't recomend it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Seriously though I'd just try some games out since I guess that's what you bought it for.


LOL, yeah... I might not be an OCN-level 'PC enthusiast' just yet, but the 2000 MHz thing kinda seemed like a no-no to begin with









I guess the same advice would apply to coil whine? Just do my thing and keep my eyes (rather: ears) peeled?


----------



## Dominican

Has anyone been able to play for more than an hour without a red screen or a crash? Trying to play Batman and BF4, I keep getting a red screen.


----------



## CurrentlyPissed

Yeah, I can't even run a FF14 benchmark at stock speeds; it goes to a black screen. Playing the game, sometimes I can go for about an hour with no black screen, sometimes only 30 minutes.

2x 290 in CFX. Kinda annoying... About to put my Titan back in and order up some 780 Tis instead.


----------



## smokedawg

Just in case anyone is still keeping track:
Powercolor 290x MemoryInfo:


----------



## Arizonian

Quote:


> Originally Posted by *Gilgam3sh*
> 
> I've had the card for a couple of days but put me on the list, no serious overclock yet as I have not got my Accelero Xtreme III yet.
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## kcuestag

I believe I found my problem. The water pump's molex cable was loose and when I checked it ended up breaking. So I'll have to send it to a friend who'll solder it for me and fix it.









Looks like the GPU is fine, the few minutes I checked with the cable alive, the GPU was running Heaven just fine at 1125/1450.









So yeah, I'll report back when I get my pump on Wednesday or Thursday, hopefully next weekend I'll get to play some.


----------



## Kipsofthemud

Quote:


> Originally Posted by *maarten12100*
> 
> Just ordered 2 R9 290's AAAAAAAAAAAAAH YEAH!
> The 290x from Gigabyte was about the same price but had a waiting time of 2 to 4 weeks and I rather have my cards now. (by Dutch law I'm allowed to return it within 14 days so if the 290X's become available I might swap them with those)


You're talking about A z erty, I'm guessing? I ordered a 290X last night, but today they increased the price by 100 euros... I hope they will honor my order, but I'm kinda disappointed already because I think they'll return the money and say "it was a mistake".


----------



## brazilianloser

Quote:


> Originally Posted by *kcuestag*
> 
> I believe I found my problem. The water pump's molex cable was loose and when I checked it ended up breaking. So I'll have to send it to a friend who'll solder it for me and fix it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks like the GPU is fine, the few minutes I checked with the cable alive, the GPU was running Heaven just fine at 1125/1450.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So yeah, I'll report back when I get my pump on Wednesday or Thursday, hopefully next weekend I'll get to play some.


If only some of our problems weren't just the card... I put my 280X back in and left it burning on Valley in Surround for an hour and not a hiccup... Oh well.


----------



## Arm3nian

Quote:


> Originally Posted by *Slomo4shO*
> 
> Wow, you are dense. The list is incomplete so you have to pick and choose to make a point. The air clocks fit best as the temps were 65C and the clocks were based on 1.175V. In fact, those clocks and voltage on water would only mean that the power consumption goes down from the air example due to lower temperatures:
> 1150 with 1.2V or less seems pretty reasonable under water. Water isn't just for extreme clock speeds, it is also used to reduce noise, lower temperatures, and have a more stable 24/7 OC. Is there any rule in place that suggests that you should use 1.3V to reach 1250 MHz when on water? 1250MHz is only a 8.7% OC over 1150MHz. Even with improved memory clocks, performance gains will not exceed 10% but power consumption would be up 71-78% which still means that you can run two cards at 5-10% lower frequencies and still use about the same amount of power as a single card. And since crossfire scales really well...


Previously you posted 1150mhz on air to make a point and now you say 1150 is reasonable for water to make a point and I'm dense? Lol okay.

The original question didn't specify his needs. With water you can get 1300+ at 1.45 and still run an extremely quiet system, I am going for that, because I'm going to have a lot of rads.

Go start a new thread claiming that an overclock will always be under 10% performance gains, I don't want to see that here.


----------



## SpewBoy

I just completely reinstalled Windows (going from 8 to 8.1) because of boot trouble I was having with either my mobo or AX860 where devices (drives usually, occasionally GPU) would stop appearing or outputting during the boot process (but ALWAYS present in BIOS and POST message). This resulted in Windows boot screen hanging, chucking a blue screen, or getting into Windows to find one of my partitions is gone (it is still present but not "initialised"). This doesn't happen every time.

I'm running two 290Xs, a 4770K, Maximus VI Hero, RAID0 SSDs (boot), RAID0 HDDs (storage), an SSD cache drive for HDDs.

All components but the drives are brand new and those drives have all worked perfectly in the past.

Nothing is overclocked (I'm debugging boot stability issues here).

Reinstall did absolutely nothing to help. I'm running bare essentials with just the latest drivers and no extra fluff. I have noticed that Windows 8.1 black screens on me when idle sometimes. Seems to happen when or some time after the display goes to sleep. Computer has not frozen because I can still use capslock (which isn't the case when the computer is frozen) and unplugging the monitor and plugging it back in to the DVI port makes it instantly work again. This occurs with just one or two 290Xs. Pretty sure it is a driver issue. Elpida memory.

I've been running my PC on one 290X for a little while and trying reboots and so far nothing has gone wrong. I'm not sure what to think of this because the PSU can't be overloaded... Hybrid mode fan should come on at 50% load and it never comes on during boot, and AFAIK booting isn't the most GPU-intensive task. Computer is 100% stable in Furmark and benches.

EDIT: Okay, still having problems with just one card. That leaves two possibilities:
- PSU is faulty
- Mobo is faulty


----------



## Raephen

Quote:


> Originally Posted by *Kipsofthemud*
> 
> You're talking about A z erty im guessing? I ordered a 290X last night but today they increased the price by a 100 euros...I hope they will honor my order but I am kinda dissapointed already because I think they'll return the money and say "it was a mistake"


If you closed the deal at that price, they are bound by law to honour that agreement.

Just like the 290 I ordered at Afuture.nl. They had 3 prices listed that day. Luckily I decided to jump the gun when it dropped to € 354, because some time later it was € 15 more expensive.


----------



## Falkentyne

The problem is, so far, I haven't seen a single person with Hynix memory who has had these black screen problems. And believe me, I've been checking.

Someone really needs to get to the bottom of this. Why are all the cards (so far) that are having these black screen at stock issues all using Elpida memory?

Is there an IMC problem with the timings, that doesn't affect the Hynix boards?


----------



## Slomo4shO

Quote:


> Originally Posted by *Arm3nian*
> 
> Previously you posted 1150mhz on air to make a point and now you say 1150 is reasonable for water to make a point and I'm dense? Lol okay.
> 
> The original question didn't specify his needs. With water you can get 1300+ at 1.45 and still run an extremely quiet system, I am going for that, because I'm going to have a lot of rads.
> 
> Go start a new thread claiming that an overclock will always be under 10% performance gains, I don't want to see that here.


Is there a language barrier or do you just not comprehend what you read? I am unsure where you got the idea that "an overclock will always be under 10% performance gains." 1250MHz is only an 8.7% overclock from 1150MHz, so performance is not going to increase more than 10% jumping from 1150MHz to 1250MHz. Also, the original question only asked if he could run 2 290X under water on an 850W PSU. The answer to that is YES. It is perfectly reasonable to have a modestly overclocked crossfire config on an 850W PSU. Extreme OC requires extreme voltages and extreme power consumption. This isn't for everyone. Good for you if you think it is necessary to have your 290Xs running at 1.3-1.45V and using 2-3x the power in the process. It is impractical, but you do have the right to do as you wish.

I am done with this discussion since you can't seem to understand basic math and comprehend simple logic. Best of luck to you in your build.


----------



## Tobiman

Quote:


> Originally Posted by *Falkentyne*
> 
> The problem is, so far, I haven't seen a single person with Hynix memory, who has had these black screen problems. And believe me, I've been checking.
> 
> Someone really needs to get to the bottom of this. Why are all the cards (so far) that are having these black screen at stock issues all using Elpida memory?
> 
> Is there an IMC problem with the timings, that doesn't affect the Hynix boards?


I just scanned through the 290/X threads on some other forums and it's the same thing. All of them have Elpida RAM modules. I can see that you asked them to run the app to tell if it's Elpida.


----------



## Gilgam3sh

Quote:


> Originally Posted by *Falkentyne*
> 
> The problem is, so far, I haven't seen a single person with Hynix memory, who has had these black screen problems. And believe me, I've been checking.
> 
> Someone really needs to get to the bottom of this. Why are all the cards (so far) that are having these black screen at stock issues all using Elpida memory?
> 
> Is there an IMC problem with the timings, that doesn't affect the Hynix boards?


I have the Elpida memory and no problem here, running Windows 8.1 x64. One time I got a black screen when I tried overclocking with CCC, but now I use ASUS GPU Tweak and it seems to work great; played BF4 and CS: Source at 1.1GHz core and 5.7GHz mem with no crashes.


----------



## Arm3nian

Quote:


> Originally Posted by *Slomo4shO*
> 
> Is there a language barrier or do you just not comprehend what you read? I am unsure where you got the idea that "an overclock will always be under 10% performance gains." 1250MHz is only an 8.7% overclock from 1150MHz, so performance is not going to increase more than 10% jumping from 1150MHz to 1250MHz. Also, the original question only asked if he could run 2 290X under water on an 850W PSU. The answer to that is YES. It is perfectly reasonable to have a modestly overclocked crossfire config on an 850W PSU. Extreme OC requires extreme voltages and extreme power consumption. This isn't for everyone. Good for you if you think it is necessary to have your 290Xs running at 1.3-1.45V and using 2-3x the power in the process. It is impractical, but you do have the right to do as you wish.
> 
> I am done with this discussion since you can't seem to understand basic math and comprehend simple logic. Best of luck to you in your build.


I can't comprehend what you write primarily because of your poor writing skills, but disregarding that...

The original question was not asking if an 850 is enough for 2 cards, but if an 850 is enough for an entire system with a hexacore CPU. We have no idea how high he is clocking anything in his system, which is why I gave him an answer based on the average clocks that people run under water. The average clock and voltage for people who are watercooling is 1250MHz @ 1.3V. That would result in 400 watts x2, which equals 800 watts for the cards alone; the CPU with a mild OC would run another 200. Is this really that difficult to understand? TSM posted his meter drawing 900+ watts with 2 cards and a CPU only. I'm going off what I have seen.
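As a sanity check on the arithmetic above, here is a minimal budget sketch; the 400 W per card (~1.3V, 1250MHz) and 200 W CPU figures are the post's rough estimates, not measurements, and the 50 W "other" allowance is my own assumption for drives, fans and board:

```python
# Rough PSU budget for the scenario above. The 400 W/card and 200 W CPU
# numbers are the thread's estimates; other_w is an assumed allowance.

def psu_headroom(psu_w, gpu_w, gpu_count, cpu_w, other_w=50):
    """Return (total system draw, remaining PSU headroom) in watts."""
    total = gpu_w * gpu_count + cpu_w + other_w
    return total, psu_w - total

total, headroom = psu_headroom(psu_w=850, gpu_w=400, gpu_count=2, cpu_w=200)
print(total, headroom)  # 1050 -200: well past what an 850 W unit should supply
```

With those inputs the build would be drawing roughly 200 W more than the PSU's rating, which is the point being argued here.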


----------



## Arizonian

I had my Sapphire 290X ' elpida ' memory running 1100 / 1300 BF4 total campaign time 5 hrs and about the same multiplayer before selling mine and no issues but I was only up to 13.11 beta 6 and some on beta 7 no issues either.

Windows Pro 8.1 64bit


----------



## kcuestag

Yeah, mine is Elpida and same issues (apart from the short circuit issues I was having).

I'm wondering if AMD is aware of this...


----------



## CurrentlyPissed

Anyone having issues where, when it black screens, you reboot and it doesn't POST, forcing a second reboot and a Windows repair scan every time?


----------



## kpoeticg

Is there any brand 290x that's guaranteed to come with Hynix? I was under the impression that Sapphire was one of the companies that used Hynix in their BF4 Cards. Obviously I'm wrong about that, but is it just basically luck of the draw?


----------



## Arm3nian

Quote:


> Originally Posted by *kpoeticg*
> 
> Is there any brand 290x that's guaranteed to come with Hynix? I was under the impression that Sapphire was one of the companies that used Hynix in their BF4 Cards. Obviously I'm wrong about that, but is it just basically luck of the draw?


Nope, no single brand comes only with Hynix.


----------



## Gilgam3sh

Quote:


> Originally Posted by *CurrentlyPissed*
> 
> Anyone having issues where when it black screens, you reboot. And it doesn't post, forcing a 2nd reboot, and having to do a windows repair scan every time?


I had that when I tried overclocking with CCC: after I applied the OC the screen turned black, I rebooted, and after login the screen turned black again. After that I uninstalled the drivers, and since then I've not been using CCC to overclock.

Quote:


> Originally Posted by *kpoeticg*
> 
> Is there any brand 290x that's guaranteed to come with Hynix? I was under the impression that Sapphire was one of the companies that used Hynix in their BF4 Cards. Obviously I'm wrong about that, but is it just basically luck of the draw?


I don't even think the companies know which memory they get. I mean, all the cards are the same; they only put their stickers on them...


----------



## kpoeticg

Thanks for clearing that up, guys. I was under the impression Asus and Sapphire BF4s were all Hynix. I would have felt like an idiot if I'd paid more for one of those just for that reason, which I almost did a couple of times.


----------



## Strata

Been getting a lot of black screens now, and I still seem to have a load applied even when the PC is idle; maybe the two are related... I just know I shouldn't idle at 90C.

I'm going to try stock when I get home and see if the issues occur, as it seems to have gotten worse with burn-in.


----------



## qqan

Does anybody know (and would share) what are the VRM load temps on the stock cooler for the 290X?
What would be maximum acceptable temps?

-qq


----------



## upgraditus

Thread on AMD forum with a poll regarding the black screen crashes.
http://forums.amd.com/game/messageview.cfm?catid=440&threadid=169188

Also people having issue please try installing latest DirectX and report if fixed or not.


----------



## HoZy

Elpida memory here, Had my card since the 1st November.

It's put down 6hours of furmark, 10+ Uniengine Heaven & Valley benches as well as so far 11hours of BF4 Gameplay.

Put the EK block on 2days ago, And overclocked it to 1150core & 1350 memory on stock bios through CCC.

Ran another 2hours of furmark and another few hours of BF4 and it's flawless.

Cheers
Mat

Windows 8 64bit, AMD CCC B9.2


----------



## Slomo4shO

Quote:


> Originally Posted by *Arm3nian*
> 
> *The average clock* and voltage that people who are watercooling run is 1250mhz @1.3v.


1188/1393 on water.


----------



## blado

Quote:


> Originally Posted by *kpoeticg*
> 
> Is there any brand 290x that's guaranteed to come with Hynix? I was under the impression that Sapphire was one of the companies that used Hynix in their BF4 Cards. Obviously I'm wrong about that, but is it just basically luck of the draw?


http://videocardz.com/47200/amd-launches-radeon-r9-290x This article claims to have information on which brand of memory comes with each card. I'm somewhat doubtful of how accurate it is, though.


----------



## upgraditus

Quote:


> Originally Posted by *blado*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kpoeticg*
> 
> Is there any brand 290x that's guaranteed to come with Hynix? I was under the impression that Sapphire was one of the companies that used Hynix in their BF4 Cards. Obviously I'm wrong about that, but is it just basically luck of the draw?
> 
> 
> 
> http://videocardz.com/47200/amd-launches-radeon-r9-290x This article claims to have information on which brand of memory comes with each card. I'm somewhat doubtful of how accurate it is, though.

It's already been proven inaccurate a page or two back.

My guess is the only way to guarantee Hynix chips is to wait for AIB custom PCB cards.


----------



## IBIubbleTea

Is there something wrong with the memory brand they use? (I'm new to this.)
And also, should I get the 290X or the 290? I'm planning on watercooling it. I don't want a power-hungry computer, but I still want the performance. If I overclock the 290, will it still draw more power than a stock 290X? Sorry about that, I have so many questions...


----------



## bpmcleod

Quote:


> Originally Posted by *SpewBoy*
> 
> I just completely reinstalled Windows (going from 8 to 8.1) because of boot trouble I was having with either my mobo or AX860 where devices (drives usually, occasionally GPU) would stop appearing or outputting during the boot process (but ALWAYS present in BIOS and POST message). This resulted in Windows boot screen hanging, chucking a blue screen, or getting into Windows to find one of my partitions is gone (it is still present but not "initialised"). This doesn't happen every time.
> 
> I'm running two 290Xs, a 4770K, Maximus VI Hero, RAID0 SSDs (boot), RAID0 HDDs (storage), an SSD cache drive for HDDs.
> 
> All components but the drives are brand new and those drives have all worked perfectly in the past.
> 
> Nothing is overclocked (I'm debugging boot stability issues here).
> 
> Reinstall did absolutely nothing to help. I'm running bare essentials with just the latest drivers and no extra fluff. I have noticed that Windows 8.1 black screens on me when idle sometimes. Seems to happen when or some time after the display goes to sleep. Computer has not frozen because I can still use capslock (which isn't the case when the computer is frozen) and unplugging the monitor and plugging it back in to the DVI port makes it instantly work again. This occurs with just one or two 290Xs. Pretty sure it is a driver issue. Elpidia memory.
> 
> I've been running my PC on one 290X for a little while and trying reboots and so far nothing has gone wrong. I'm not sure what to think of this because the PSU can't be overloaded... Hybrid mode fan should come on at 50% load and it never comes on during boot, and AFAIK booting isn't the most GPU-intensive task. Computer is 100% stable in Furmark and benches.
> 
> EDIT: Okay, still having problems with just one card. That leaves two possibilities:
> - PSU is faulty
> - Mobo is faulty


Are you doing a fresh install of 8.1, or a fresh install of 8 and then upgrading? Upgrading to 8.1 instead of doing a direct install of it causes an amazing number of problems, so that is one possibility.

Second, try removing the power cables from your cards and booting from your CPU's iGPU to see if it will POST. If it does, it's most likely not the mobo (although maybe your second PCIe slot is faulty, though with the problems going on with these drivers that's not likely). If it boots from your iGPU, disable the iGPU and turn off the PC, plug the power cables back into the cards, and try booting again. (You may have to set your mobo to boot directly from the iGPU in the BIOS to get it to POST to it; if so, make sure you set it back to auto after disabling the iGPU and then reboot from the BIOS to save.) If that doesn't work, then maybe it's a bad PSU. But considering I had almost identical symptoms and had to do these cycles a thousand times to get it to POST correctly, that's most likely your problem. Good luck.


----------



## Arm3nian

Quote:


> Originally Posted by *Slomo4shO*
> 
> 1188/1393 on water.


You can't use hwbot for average clocks on a product this new. Look at the 4930K: the average clock for air is higher than for water. Your points are only valid if the user runs an overclock that is achievable on the STOCK cooler and basically on STOCK volts. People who invest in watercooling invest in it for maximum performance and silence, especially on OCN. From what I've seen reading 4200 posts in this thread, people under water are running 1250 @ 1.3V as a minimum. Almost all have tried to push it to the max.


----------



## bpmcleod

Quote:


> Originally Posted by *Arm3nian*
> 
> You can't use hwbot for average clocks on a product this new. Look at the 4930k, the average clock for air is higher than water. Your points are only valid if the user runs an overclock that is achievable on the STOCK cooler and basically on STOCK volts. People who invest in watercooling invest it in it for maximum performance and silence, especially on OCN. From what I've seen from reading 4200 posts on this thread is that people under water are running 1250 @1.3 as a minimum. Almost all have tried to push it to the max.


^^ agreed 100%


----------



## the9quad

Quote:


> Originally Posted by *blado*
> 
> http://videocardz.com/47200/amd-launches-radeon-r9-290x This article claims to have information on which brand of memory comes with each card. I'm somewhat doubtful of how accurate it is, though.


Well, the article is wrong for sure: it says the Sapphire BF4 edition uses Hynix, and that's what I have and it's Elpida.


----------



## Arm3nian

Quote:


> Originally Posted by *the9quad*
> 
> Well, the article is wrong for sure: it says the Sapphire BF4 edition uses Hynix, and that's what I have and it's Elpida.


Yeah, at first we all thought Sapphire and Asus BF4 use only hynix. But other members including you showed their brands having elpida.


----------



## ImJJames

I wanna see some ExtremeHD Valley scores from you 280X owners, come on!


----------



## bpmcleod

Quote:


> Originally Posted by *Arm3nian*
> 
> Yeah, at first we all thought Sapphire and Asus BF4 use only hynix. But other members including you showed their brands having elpida.


Launch units tend to have better-quality memory chips, it looks like lately. EVGA launches with Samsung, then switches to Elpida, then bounces between Hynix and Samsung. Seems others are doing similar with the 290/X series.


----------



## kpoeticg

I'd love to see one of the third party 290x's like the DCII/Matrix/Lightning rockin some Samsung memory modules. I wonder what the chances are


----------



## Arm3nian

Quote:


> Originally Posted by *bpmcleod*
> 
> Launch ones tend to have better quality memory chips as of late it looks. EVGA launches with samsung then switches to elpida then bounces between hynix and sammy. Seems others are doing similar with the 290/x series.


I got my 2 Sapphires on launch day, so I'll be sure to show what I got. Still have no motherboard to power them with, and I don't want to take the cover off yet.


----------



## mickeykool

Are these black screens happening only on the 290x's?


----------



## the9quad

Quote:


> Originally Posted by *mickeykool*
> 
> Are these black screens happening only on the 290x's?


I'd like to know this as well, as I think when I RMA my 290X I will just get two 290s instead.


----------



## Ukkooh

Quote:


> Originally Posted by *Slomo4shO*
> 
> 1188/1393 on water.


I think the average would be 1200+ if more people used the higher-voltage BIOSes.


----------



## Kipsofthemud

Quote:


> Originally Posted by *Raephen*
> 
> I you closed the deal with that price, they are bound by law to honour that agreement.
> 
> Just like my 290 I ordered at Afuture.nl. They had 3 pricings listed on that day. Luckily I decided to jump the gun when it dropped to € 354, because sometime later it was € 15 more exspensive.


Well, I checked their disclaimers etc. and it says they have the right to cancel an order if they want, prices can be changed, etc. They have the legal high ground, so to speak.

I just hope they honor my purchase, since I've spent thousands at this shop over the last couple of years.

€15 is not the same difference as €100.


----------



## brazilianloser

Quote:


> Originally Posted by *the9quad*
> 
> I'd like to know this as well, as I think when I RMA my 290X I will just get two 290s instead.


I have a 290 here and it does it to me... I know a few others as well, all having the same problem. My 280X, on the other hand, is going strong until I have to return it too.


----------



## Paul17041993

Well, with money shortages I'll likely not get a 290X for at least a few months from now. I'll probably have to ignore most of this thread for now too, seeing how long it takes to read through it every morning...

Memory modules technically shouldn't matter, and all brands and editions seem randomized; someone should make a separate thread for the black screen issues if there isn't one already, so it can be narrowed down more easily.
Quote:


> Originally Posted by *kpoeticg*
> 
> I'd love to see one of the third party 290x's like the DCII/Matrix/Lightning rockin some Samsung memory modules. I wonder what the chances are


agreed, could be quite a kicker.


----------



## Emmett

Can I join?


----------



## DampMonkey

Quote:


> Originally Posted by *Ukkooh*
> 
> I think the average would be 1200+ if more people used the higher-voltage BIOSes.


PT3 and PT1 aren't proving to be very stable. Not a good BIOS for most people's daily drivers.


----------



## tsm106

Quote:


> Originally Posted by *DampMonkey*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ukkooh*
> 
> I think the average would be 1200+ if more people used the higher-voltage BIOSes.
> 
> 
> 
> PT3 and PT1 aren't proving to be very stable. Not a good bios for most peoples daily drivers
Click to expand...

They work, but it's not for everyone.


----------



## Jpmboy

Quote:


> Originally Posted by *DampMonkey*
> 
> PT3 and PT1 aren't proving to be very stable. Not a good BIOS for most people's daily drivers.


So far the Asus unlocked BIOS is doing okay for me... the AMD drivers need improvement (of course).


----------



## Jpmboy

this is old news I'm sure:

http://forums.overclockers.co.uk/showthread.php?t=18552408


----------



## DampMonkey

Quote:


> Originally Posted by *tsm106*
> 
> They work, but its not for everyone.


Did they work at high resolutions for you? Some clocks would work completely fine on my 1200p monitor, but then they would artifact and become very blurry in the same tests on my 1440p monitor. The Asus drivers have been solid otherwise.


----------



## tsm106

Quote:


> Originally Posted by *Jpmboy*
> 
> this is old news I'm sure:
> 
> http://forums.overclockers.co.uk/showthread.php?t=18552408


Yea. The real source is kingpin, without the text.

Quote:


> Originally Posted by *DampMonkey*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> They work, but its not for everyone.
> 
> 
> 
> Did they work at high resolutions for you? Some clocks would work completely fine on my 1200p monitor, but then they would artifact and become very blurry in the same tests on my 1440p monitor. The Asus drivers have been solid otherwise.
Click to expand...

I'm using the stock Asus atm, and at 5760, no issues, but I'm not doing any more OC till I get my blocks on.


----------



## black7hought

Quote:


> Originally Posted by *kcuestag*
> 
> It just happened again, opened Arma 2 and 2 minutes later no signal and I had to hard reset the computer.
> 
> Anyone else getting similar behavior?


I've had loss of monitor signal while folding, and I had to hard reset. The monitors were off overnight, and in the morning they would only show multicolored lines across both screens. I turned off AMD OverDrive, but I haven't tried folding since. I was having similar folding issues with my 280X on beta drivers as well, so I'm leaning towards it being a driver-related issue.


----------



## Jared Pace

Quote:


> Originally Posted by *Jpmboy*
> 
> Got her to 1350/6000 at 1.4V. Posted a single-GPU FS score of 11550. Not great, but getting better.


^Fastest card here so far. 30MHz faster than DampMonkey & 10MHz faster than jomama22.
Quote:


> Originally Posted by *tsm106*
> 
> Yea. The real source is kingpin, without the text.


Sure it isn't Shamino?


----------



## FtW 420

Quote:


> Originally Posted by *Jared Pace*
> 
> ^Fastest card here so far. 30mhz faster than Dampmonkey & 10mhz faster than jomama22
> Sure it isn't Shamino?


It is Shamino, but posted at kingpin's site.


----------



## tsm106

LOL saved me the post.


----------



## skupples

Quote:


> Originally Posted by *HoZy*
> 
> Elpida memory here, Had my card since the 1st November.
> 
> It's put down 6hours of furmark, 10+ Uniengine Heaven & Valley benches as well as so far 11hours of BF4 Gameplay.
> 
> Put the EK block on 2days ago, And overclocked it to 1150core & 1350 memory on stock bios through CCC.
> 
> Ran another 2hours of furmark and another few hours of BF4 and it's flawless.
> 
> Cheers
> Mat
> 
> Windows 8 64bit, AMD CCC B9.2


Try not to read any rudeness into this... Are you trying to blow up your GPU with all that Furmark?


----------



## bpmcleod

Quote:


> Originally Posted by *Jared Pace*
> 
> ^Fastest card here so far. 30mhz faster than Dampmonkey & 10mhz faster than jomama22
> Sure it isn't Shamino?


What's weird is, those scores are barely beating mine, and I'm only at 1175/1500. What were their GPU scores? My FS score on a single card is 10880.

http://www.3dmark.com/3dm/1595580?

Edit: I am also still on air with a 290.


----------



## bpmcleod

Quote:


> Originally Posted by *skupples*
> 
> Try not to read any rudeness out of this.. Are you trying to blow up your GPU w/ all that furmark?


A lot of people do burn tests to make sure it's stable. GPUs are designed to be able to run for quite some time at stock clocks/voltages at 100% load. Look at people with folding machines.


----------



## skupples

Quote:


> Originally Posted by *bpmcleod*
> 
> A lot of people do burn tests to make sure it's stable. GPUs are designed to be able to run for quite some time at stock clocks/voltages at 100% load. Look at people with folding machines.


Furmark =/= folding... it = death... I'm sorry, I lost a 100% stock 670 to Furmark, will never use it again.


----------



## Taint3dBulge

Quote:


> Originally Posted by *kcuestag*
> 
> It just happened again, opened Arma 2 and 2 minutes later no signal and I had to hard reset the computer.
> 
> Anyone else getting similar behavior?


Try changing DVI ports... it's what fixed it for me. I had it when Windows started, though: after 5 seconds of Windows starting it would go black screen. But now I've switched back to the top DVI and it's working fine again.

It's gotta be a driver issue.


----------



## lacrossewacker

Quote:


> Originally Posted by *skupples*
> 
> furmark =/= folding... it = death... I'm sorry, I lost a 100% stock 670 to furmark, will never use it again.


For some reason my GPU heats up more with folding Core 17 work units than it does with Furmark.

Though you're right, there are stories of people's GPUs dying to Furmark; I haven't seen anything as drastic with folding users.


----------



## brazilianloser

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Try changing DVI ports... it's what fixed it for me. I had it when Windows started, though: after 5 seconds of Windows starting it would go black screen. But now I've switched back to the top DVI and it's working fine again.
> 
> It's gotta be a driver issue.


What about those that are using all ports on the card??? lol


----------



## maarten12100

Quote:


> Originally Posted by *Sgt Bilko*
> 
> not that i know of.......another cost cut i imagine
> 
> EDIT: i just tried to start folding since i'm about to sleep and i got a BSOD, switched it to Quiet mode again and it's fine........yeah, i need to RMA, something is screwy here.


Ah darn, I really thought there were LEDs in it. I might shove a white LED in there to see how it looks.
The stock cooler, btw, seems totally fine for me at 50%, though I might go water once I start on my Brickland build, since then I will be overclocking too.


----------



## skupples

Quote:


> Originally Posted by *lacrossewacker*
> 
> For some reason my gpu heats up more with folding core 17s than it does with furmark.
> 
> Though you're right, there are stories of people's gpus dying to furmark, haven't seen anything as drastic with folding users


I don't know the facts behind it... My 670 had been running strong for 3 months; 10 mins of Furmark & dat smell...

I do know that Furmark throttles the hell out of Kepler and STILL runs it to max temp, so yeah, idk... I'm officially scurred of Furmark.


----------



## Sgt Bilko

So these black screen issues.........have we found the cause yet?

Mine black screens and gives me a BSOD, at stock clocks and overclocked, when it's in Uber mode under load, but it seems to work fine in Quiet mode.


----------



## Rar4f

Will less noise be the only job of non-reference cards, since I keep hearing that 290s are meant to run at 95°C? And won't that be in contrast to the past priority for other cards, where the job was not only low noise but also good cooling?


----------



## brazilianloser

Quote:


> Originally Posted by *Sgt Bilko*
> 
> So these black screen issues.........have we found the cause yet?
> 
> mine black screens and gives me a BSOD at stock clocks and overclocked when it's in Uber mode under load but seems to work fine in Quiet mode.


Not sure if it is going to be fixable by a new BIOS or driver... have you checked which type of memory you have with the checker posted earlier?


----------



## hotrod717

All this talk of 290x and black screens. Haven't seen an issue with 290. Am I wrong?


----------



## Snyderman34

Here's a question (just thought about it, and I'm at work so I can't find out): Does running 3 monitors still keep idle temps up? I know my 6950s and 7970 (I think) idled higher (frequency and temp) because I had 3 monitors attached. Taking one out dropped frequency and temps. Is that still the case, by chance?


----------



## Sgt Bilko

Quote:


> Originally Posted by *brazilianloser*
> 
> Not sure if it is going to be fixable by a new bios or driver... have you checked which type of memory with the checker posted earlier?


Yeah, I'm running Elpida memory... I'd really rather not RMA this; getting this cooler off would be a nightmare.


----------



## brazilianloser

Quote:


> Originally Posted by *hotrod717*
> 
> All this talk of 290x and black screens. Haven't seen an issue with 290. Am I wrong?


You missed the point. Both 290 and 290X owners are having the problem; this is a 290/290X club after all, and myself and some others own the 290. The problem so far seems to be related to the type of memory used in your card. But hopefully AMD and the third-party folks will figure this junk out and stop dishing out rushed, broken cards.


----------



## brazilianloser

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah, I'm running Elpida memory... I'd really rather not RMA this; getting this cooler off would be a nightmare.


Hold out as long as you can before your RMA period is over, then, to see if they come up with any over-the-net fixes...


----------



## brazilianloser

I just want to get two stable cards before my leftover money for school arrives at the end of the year... Going to be putting in a full water loop, and I would hate to have problems after placing the cards in the loop. Otherwise I will be one happy little boy.


----------



## bpmcleod

Does anyone know if having one card flashed to a 290X BIOS and another on a 290 BIOS would affect CrossFire between the two cards? I have a PowerColor 290 flashed to an Asus 290X BIOS and plan on adding a second (hopefully an Asus 290), and wanted to ask this beforehand lol.


----------



## hotrod717

Quote:


> Originally Posted by *brazilianloser*
> 
> You missed the point. Both 290 and 290x owners are having the problem. This is a 290/290x club after all. Myself and some others own the 290 and so on. The problem so far seems to be related to the type of memory used in your card. But well hopefully AMD and the third party folks will figure this junk out and stop dishing out rushed broken cards.


No, I don't think I missed the point. I asked a simple question. Thanks for the rudeness!

I don't recall reading any posts by 290 owners having issues with their cards. Hence, the question.


----------



## brazilianloser

Quote:


> Originally Posted by *hotrod717*
> 
> No, I don't think I missed the point. I asked a simple question. Thanks for the rudeness!
> 
> I don't recall reading any posts by 290 owners having issues with their cards. Hence, the question.


Rudeness comes across in different ways when you are reading something emotionless on the net... I, for one, didn't mean to be rude. Anyway, there are many, many pages of me personally and a few others who are 290 (non-X) owners going on about the black screen issue. So yes, to answer your question: both cards have the problem, and as already mentioned, it's most likely memory related.


----------



## jomama22

Quote:


> Originally Posted by *Jared Pace*
> 
> ^Fastest card here so far. 30mhz faster than Dampmonkey & 10mhz faster than jomama22
> Sure it isn't Shamino?


Does he have a FS page to look at?

Found it: he got a 12800+ graphics score. Must be completely unstable, as I got 14100+ with 1315/1500.


----------



## jomama22

Quote:


> Originally Posted by *DampMonkey*
> 
> PT3 and PT1 aren't proving to be very stable. Not a good BIOS for most people's daily drivers.


Quote:


> Originally Posted by *tsm106*
> 
> They work, but its not for everyone.


People just aren't getting what voltage is actually going to their chip. Testing Asus.ROM/PT3/PT1, the voltage required at any given clock is exactly the same. If I set 1.25 in PT3, I get the same clocks as if I set 1.35 in PT1 or 1.412 in Asus.ROM.

There are zero stability differences. If the actual volts going to the card are the same, you will get the same clock speed. I tried this on all 6 cards after all the hoopla over the PT3 BIOS.

The only thing PT3 does is remove the inherent vdroop and increase LLC.
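To put numbers on the claim above, here is a minimal sketch; the per-BIOS droop offsets are made-up assumptions chosen to match the quoted set-points, not measurements of any real BIOS:

```python
# Illustrative model only: what matters for stability is the voltage actually
# delivered under load, not the number typed into each BIOS. The per-profile
# droop offsets below are hypothetical.

PROFILES = {
    "Asus.ROM": 0.162,  # assumed heavy vdroop under load
    "PT1":      0.100,  # assumed smaller vdroop
    "PT3":      0.000,  # vdroop removed via LLC
}

def delivered_voltage(profile: str, v_set: float) -> float:
    """Estimate the load voltage for a given set-point under a linear droop model."""
    return round(v_set - PROFILES[profile], 3)

# All three set-points land on the same delivered voltage, hence the same
# stable clocks:
for profile, v_set in [("Asus.ROM", 1.412), ("PT1", 1.35), ("PT3", 1.25)]:
    print(profile, delivered_voltage(profile, v_set))
```

In practice the mapping is messier (PT3's LLC can overshoot, as discussed elsewhere in the thread), so treat this purely as an illustration of "same delivered volts → same stable clock".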


----------



## Sazz

hmmm, people who got the Sapphire 290X card, can anyone upload the Quiet Mode BIOS? I need em. I accidentally made a back up of the Uber mode instead of Quiet mode and I need the Quiet mode BIOS to re-flash it back to that BIOS.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Sazz*
> 
> hmmm, people who got the Sapphire 290X card, can anyone upload the Quiet Mode BIOS? I need em. I accidentally made a back up of the Uber mode instead of Quiet mode and I need the Quiet mode BIOS to re-flash it back to that BIOS.


OK, so I finally managed to install the Beta 9 driver, and the BSOD and black screens seem to have stopped for me... I'm guessing people will have tried this already, but on the surface it seems like a driver issue.
Quote:


> Originally Posted by *Sazz*
> 
> hmmm, people who got the Sapphire 290X card, can anyone upload the Quiet Mode BIOS? I need em. I accidentally made a back up of the Uber mode instead of Quiet mode and I need the Quiet mode BIOS to re-flash it back to that BIOS.


Sure thing, just gimme a few mins to reboot back into Quiet mode and upload it.


----------



## Sazz

Quote:


> Originally Posted by *Sgt Bilko*
> 
> sure thing, just gimme a few mins to reboot back into quiet mode and upload it


Thanks!


----------



## Arm3nian

Quote:


> Originally Posted by *jomama22*
> 
> People just aren't getting what voltage is actually going through To their chip. Testing Asus.ROM/pt3/pt1 the voltage required at any clock is exactly the same. If I set 1.25 in pt3, I get the same clocks as if I set 1.35 in pt1 and 1.412 in Asus.ROM.
> 
> There are 0 stability differences. If the actual volts going to the card are the same, you will get the same clock speed. I tried this on all 6 cards after all the hooplah over the pt3 bios.
> 
> The only thing pt3 does is remove the inherent vdroop and increase llc.


PT3 is the BIOS that is closest to the actual voltage going to the card, because it has the lowest vdroop. The instability is caused by the BIOS itself. For example, the max voltage set on the Asus BIOS is equal to 1.29 without vdroop (actual voltage), and when tested it doesn't cause instability. But if you set 1.29 in PT3 (the same as the Asus without vdroop, and close to the actual voltage), it crashes.


----------



## CriticalHit

Quote:


> Originally Posted by *Sgt Bilko*
> 
> So these black screen issues.........have we found the cause yet?
> 
> mine black screens and gives me a BSOD at stock clocks and overclocked when it's in Uber mode under load but seems to work fine in Quiet mode.


http://www.simology.co/forums/gaming-platform-pc/3302-r9-290x-display-black-screen?limitstart=0
Quote:


> We know that there has been some issues with motherboards using the UEFI boot mode with this card. Please make sure that you can request a new BIOS update on the motherboard www.asus.com/Motherboards/MAXIMUS_VI_FORMULA/#support
> To support this graphic card. Normally booting on Legacy mode will not have issues like you describe.
> Kind regards,
> Club 3D support


There is a suggestion it may be a conflict with running a 290 on a Windows UEFI install.
The suggested fix is to set the BIOS to run in Legacy mode.


----------



## Snyderman34

I also can't remember if I had posted this or not, but at 1100/1437 my 290 cracked 10k in Fire Strike (with a stock 4770k, if it matters).

http://www.3dmark.com/fs/1105373

Gonna fiddle more later, see what else I can get out of it.


----------



## youra6

I would like to see a separate 290 owners club. The 290X owners outnumber us 3:1.


----------



## the9quad

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ok so i finally managed to install the Beta9 driver and the BSOD and black screens seem to have stopped for me..........i'm guessing that people would have tried this already but on the surface it seems like a driver issue.
> sure thing, just gimme a few mins to reboot back into quiet mode and upload it


Beta 9 didn't help me, glad it worked for you though.


----------



## CriticalHit

Quote:


> Originally Posted by *youra6*
> 
> I would like to see a separate 290s owners club. The 290x owners out number us 3:1


I wish they named it differently... searching for information on the R9 290 is really tough; you just get directed to articles on the 290X.


----------



## Arm3nian

Quote:


> Originally Posted by *youra6*
> 
> I would like to see a separate 290s owners club. The 290x owners out number us 3:1


The 290X came out earlier. The 290 is more bang for your buck; just like with the 7950, 670, 770, etc., 290 owners will most likely outnumber 290X owners in the future. Not a bad idea though, if we see differences between the cards such as BIOSes, drivers, etc.


----------



## Falkentyne

Why would switching from the Uber BIOS to the Quiet BIOS stop the black screens? They are exactly the same BIOS, just with different fan curves, so if you overrode the auto fan and used manual (in Afterburner), they would be identical... wouldn't they?

I STILL want to see feedback from ANYONE--even one person--who has Hynix GDDR5 memory and has the black screen at stock issue...


----------



## jomama22

Quote:


> Originally Posted by *Arm3nian*
> 
> PT3 is the bios that is closest to the actual voltage going to the card, because it has the lowest vdroop. The instability is caused by the actual bios. For example, the max voltage set on the asus bios is equal to 1.29 without vdroop (actual voltage), and when tested it doesn't cause instability. But if you set 1.29 in PT3 (same as the asus without vdroop and close to the actual voltage, it crashes.


I'm sorry, but this is wrong. You are going to get someone to kill their card with this info. If you are setting 1.29 in PT3, you are actually sending upwards of 1.34v to the chip. PT3 not only removes vdroop, but adds an extreme LLC; you are getting overvolted anywhere from .3v-.6v depending on the ASIC.

You were unstable because your card did not like that voltage (1.34v), especially on air... I bet if you lowered that PT3 BIOS voltage to 1.25-1.26 you would be just fine and those clocks would be stable.


----------



## brazilianloser

Quote:


> Originally Posted by *CriticalHit*
> 
> http://www.simology.co/forums/gaming-platform-pc/3302-r9-290x-display-black-screen?limitstart=0
> there is a suggestion it may be a conflict with running a 290 on windows UEFI OS ..
> Suggested fix is set BIOS to run in Legacy mode..


I guess it doesn't hurt to try. But even if it fixes it, I'm still returning mine.


----------



## Sgt Bilko

Quote:


> Originally Posted by *the9quad*
> 
> Beta 9 didn't help me, glad it worked for you though.


Well, it's so far so good... I can't say with absolute certainty this has fixed it.


----------



## brazilianloser

Nvm, one minute into Valley and the crashes are back again. Legacy vs. UEFI is not the issue, at my end at least.


----------



## CriticalHit

Quote:


> Originally Posted by *brazilianloser*
> 
> I guess it doesn't hurt to try. But even if it fixes still returning mine.


edit.... damn
Quote:


> Nvm one minute into Valley and the crashes are back again. Legacy or UEFI not the issue at my end at least.


----------



## CallsignVega

Hm, good thing I checked this thread. I was going to purchase 4 290Xs until I read about this black screen issue. Will wait until more info is known, or maybe even until a custom-PCB 290X comes out, if ever. Knowing how long it takes some manufacturers to release custom PCB versions, Maxwell may be out by then LOL.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Sazz*
> 
> hmmm, people who got the Sapphire 290X card, can anyone upload the Quiet Mode BIOS? I need em. I accidentally made a back up of the Uber mode instead of Quiet mode and I need the Quiet mode BIOS to re-flash it back to that BIOS.


http://www.mediafire.com/download/bmjcg2lial4b6a8/Sapphire+290x+Quiet+Mode.rom

Here ya go,


----------



## Emmett

Are people getting black screens running native and through PLX?

Just thought I would mention I am running non-UEFI with both cards on native 8x, and I'm not having any issues so far. Loaded Beta 9 from the start.

EDIT: Elpida memory


----------



## Arm3nian

Quote:


> Originally Posted by *jomama22*
> 
> I'm sorry but this is wrong. You are going to get someone to kill their card with this info. If you are setting 1.29 in PT3, you are actually sending upwards of 1.34v to the chip. Pt3 not only removes vdroop, but adds an extreme llc, you are getting overvolted anywhere from .3v-.6v depending on the asic.
> 
> You were unstable because your card did not like that voltage (1.34v) especially on air...i bet if you lower that pt3 bios voltage to 1.25-1.26 you would be just fine and those clocks would be stable.


I'm not talking about my cards, I'm talking about others who have had it under water. People running PT3 should know what they're doing to begin with, so we shouldn't see dead cards from any info regarding PT3.
Anyway, PT3 does not completely remove vdroop. The LLC can be combated by setting the voltage below the max. There is a reason we mentioned 1.45 in PT3 as a safe voltage: with LLC you can see up to 1.5, which is still safe for the card. As long as your cooling is good, you shouldn't have a problem, as long as you know what you're doing.

Also, the so-called overvoltage you will get is .0x, not .x; if you got overvolted by .6 and were running 1.3 you would be up to 1.9 and your card would blow up.


----------



## Sazz

Quote:


> Originally Posted by *Sgt Bilko*
> 
> http://www.mediafire.com/download/bmjcg2lial4b6a8/Sapphire+290x+Quiet+Mode.rom
> 
> Here ya go,


Thanks a bunch!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Sazz*
> 
> Thanks a bunch!


No worries, have fun


----------



## jomama22

Quote:


> Originally Posted by *Arm3nian*
> 
> I'm not talking about my cards I'm talking about others who have had it under water. People running PT3 should know what they're doing to begin with so we shouldn't see dead cards with any info regarding PT3.
> Anyways, PT3 does not completely remove vdroop. The LLC can be combated by setting the voltage below the max. There is a reason why we mentioned 1.45 in PT3 as a safe voltage, because with LLC, you can see up to 1.5, which is still safe for the card. As long as your cooling is good, you shouldn't have a problem as long as you know what you're doing.
> 
> Also, the so called overvoltage you will get is .0x not .x, if you got .6 overvolted and were running 1.3 you would be up to 1.9 and your card would blow up.


Thanks for the correction, as I meant .0x volts. Most cards will start to flop at 1.375v actual, or a setting of 1.325 in PT3, anyway. The indicator is static horizontal lines and/or shuddering of the image when there is too much voltage. When you see the "graph" or "checkerboard", that is core clock/too little voltage; when you see color spots or large bars of a single color, that is memory clock. Black screening is memory clock or too much voltage.

You just stated that you set PT3 at 1.29 under the assumption that it equals 1.29v actual, which is completely wrong. As I stated before, if you are setting 1.29 in PT3, actual is 1.33-1.35v.

Voltage is what is causing instability. PT1 with vdroop does allow more precise voltage adjustment, merely because of vdroop, but the idea that PT3 is displaying actual voltage through GPU Tweak is wrong.

Take a look at the original kingpin thread and see that LN2 benchers are seeing a .05v increase over their GPU Tweak setting. This is with hard-modded VDDC measurement points.

All I'm saying is don't tell people PT3 = actual voltage, as that is wrong.

I hate saying this, but I have way more experience at this point on these BIOSes than you do. The last thing I want is misinformation about voltage and BIOSes that could potentially lead to someone killing their card.


----------



## rv8000

Anyone here notice that with the 290/290X, overclocking memory past 1500MHz seems to have almost zero benefit in both 3DMark and Unigine benchmarks? My card shows no sign of artifacting @ 1575 or even 1600, yet it yields no performance increase. My only guess is ECC? Have chip timings ever changed depending on clocks for GPUs?
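One way to sanity-check the ECC/error-replay guess is to sweep the memory clock and look for where scores stop scaling. A small sketch of that idea; the sweep numbers are invented for illustration and `scaling_stops_at` is a hypothetical helper, not a real tool:

```python
# Hypothetical scores from a memory-clock sweep (MHz -> benchmark score).
# If error correction/replay kicks in, scores plateau or dip even though
# no artifacts are visible on screen.
sweep = {1400: 9650, 1450: 9790, 1500: 9910, 1550: 9915, 1600: 9905}

def scaling_stops_at(results: dict, min_gain: float = 0.005) -> int:
    """Return the last clock that still improved the score by at least
    `min_gain` (fractional) over the previous step -- the useful OC ceiling."""
    clocks = sorted(results)
    for lo, hi in zip(clocks, clocks[1:]):
        if results[hi] < results[lo] * (1 + min_gain):
            return lo
    return clocks[-1]

print(scaling_stops_at(sweep))  # with these made-up numbers: 1500
```

If real runs show the same shape (artifact-free but flat past some clock), that is at least consistent with the card silently correcting errors rather than gaining bandwidth.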


----------



## Arm3nian

Quote:


> Originally Posted by *jomama22*
> 
> Thanks for the correction, as I meant .0x volts. Most cards will start to flop at 1.375v actual, or a setting of 1.325 in PT3, anyway. The indicator is static horizontal lines and/or shuddering of the image when there is too much voltage. When you see the "graph" or "checkerboard", that is core clock/too little voltage; when you see color spots or large bars of a single color, that is memory clock. Black screening is memory clock or too much voltage.
> 
> You just stated that you set PT3 at 1.29 under the assumption that it equals 1.29v actual, which is completely wrong. As I stated before, if you are setting 1.29 in PT3, actual is 1.33-1.35v.
> 
> Voltage is what is causing instability. PT1 with vdroop does allow more precise voltage adjustment, merely because of vdroop, but the idea that PT3 is displaying actual voltage through GPU Tweak is wrong.
> 
> Take a look at the original kingpin thread and see that LN2 benchers are seeing a .05v increase over their GPU Tweak setting. This is with hard-modded VDDC measurement points.
> 
> All I'm saying is don't tell people PT3 = actual voltage, as that is wrong.
> 
> I hate saying this, but I have way more experience at this point on these BIOSes than you do. The last thing I want is misinformation about voltage and BIOSes that could potentially lead to someone killing their card.


Please quote me saying PT3 = actual voltage. I only said it is the closest to the actual voltage compared to the other BIOSes. I don't doubt you have more experience than me with these BIOSes, because I haven't even powered up my cards; all I'm saying is that without too little or too much voltage, people are still getting black screens with the PT3 BIOS.


----------



## MIGhunter

If you are using a DisplayPort adapter, make sure it's an active one. I have a passive one, and it has black screen issues that I can't get out of.


----------



## skupples

Quote:


> Originally Posted by *Snyderman34*
> 
> Here's a question (just thought about it, and I'm at work so I can't find out): Does running 3 monitors still keep idle temps up? I know my 6950s and 7970 (I think) idled higher (frequency and temp) because I had 3 monitors attached. Taking one out dropped frequency and temps. Is that still the case, by chance?


I didn't see anyone answer this... Yes, in theory running multi-monitor is going to increase idle temps, basically because you are putting more strain on the GPUs while in 2D. You will possibly see higher 2D voltage & higher 2D clocks, so that would translate to a higher temp. I don't know if AMD has a similar program, but Nvidia Inspector has a tool called "multi-monitor power saver" that basically forces the cards to run @ bare idle speeds while in 2D. For example, if I turn it off, my cards will run @ almost full clock speed on mem & core; when I enable it, my core & mem both drop to 324MHz @ ~.881v. Which means less power usage & lower 2D temps.


----------



## Snyderman34

Quote:


> Originally Posted by *skupples*
> 
> I didn't see anyone answer this... Yes, in theory running multi-monitor is going to increase idle temps. Basically because you are putting more strain on the gpu's while in 2D... You will possibly see higher 2d voltage, & higher 2d clocks. So, that would relate to a higher temp. I don't know if AMD has a similar program, but Nvidia inspector has a tool called "multi-monitor power saver" it basically forces the cards to run @ bare idle speeds while in 2D... For example, if I turn it off my cards will run @ almost full clock speed on mem & core. When I enable it my core & mem both drop to 324mhz @~.881v. Which means less power usage, & lower 2D temps.


Awesome. I figured that would be the case, but wasn't sure if it had changed or not. Thanks a lot!


----------



## Arm3nian

Quote:


> Originally Posted by *skupples*
> 
> I didn't see anyone answer this... Yes, in theory running multi-monitor is going to increase idle temps. Basically because you are putting more strain on the gpu's while in 2D... You will possibly see higher 2d voltage, & higher 2d clocks. So, that would relate to a higher temp. I don't know if AMD has a similar program, but Nvidia inspector has a tool called "multi-monitor power saver" it basically forces the cards to run @ bare idle speeds while in 2D... For example, if I turn it off my cards will run @ almost full clock speed on mem & core. When I enable it my core & mem both drop to 324mhz @~.881v. Which means less power usage, & lower 2D temps.


Back in the 600 series days, the control panel had an option for adaptive or maximum performance. Adaptive would clock your GPUs down to 324 and lower the voltage; maximum would run them at max stock settings in 2D apps. Both would go up to the overclocked settings once a 3D app was running. Basically, maximum performance on the desktop would heat your room for no reason. A low-end GPU might have trouble running 2D apps when clocked at a third of its power, especially on high resolutions/multi-monitor, but I seriously doubt this is a problem with any medium- to high-end GPU.


----------



## skupples

Quote:


> Originally Posted by *Snyderman34*
> 
> Awesome. I figured that would be the case, but wasn't sure if it had changed or not. Thanks a lot!


Np!
Quote:


> Originally Posted by *Arm3nian*
> 
> Back in the 600 series days with control panel there was an option for adaptive or maximum performance. Adaptive would turn your gpus down to 324 and lower voltage. Maximum would run it at the max stock settings on 2d apps. Both would go up to the overclocked settings once a 3d app was running. Basically maximum performance on the desktop would heat your room for no reason. A low end gpu might have trouble running even 2d apps when clocked at a third of its power, but I seriously doubt this is a problem with any medium-high end gpu.


It still exists... though with my surround setup it doesn't actually do anything. Not sure if it's the modded BIOS or what, so I use the Inspector tool to force it to downclock. Yeah... 2 Titans @ 324MHz are completely capable of tackling any basic 2D task.


----------



## Jared Pace

Quote:


> Originally Posted by *jomama22*
> 
> Thanks for the correction, as I meant .0x volts. *Most cards will start to flop at 1.375v actual, or a setting of 1.325 in pt3, anyway.* The indicator is static horizontal lines and/or shuddering of the image when there is too much voltage. When you see the "graph" or "checkerboard", that is core clock/too little voltage; when you see color spots or large bars of a single color, that is memory clock. Black screening is memory clock or too much voltage.
> 
> You just stated that you set pt3 at 1.29 under the assumption that it equals 1.29v actual, which is completely wrong. As I stated before, if you are setting 1.29 in pt3, actual is 1.33-1.35v.
> 
> Voltage is what is causing instability. Pt1 with vdroop does allow more precise voltage adjustment merely because of vdroop, but the idea that pt3 is displaying actual voltage through GPU tweak is wrong.
> 
> Take a look at the original kingpin thread and see that ln2 benchers are seeing a .05v increase over their GPU tweak setting. This is with hard modded vddc measurement points.
> 
> All I'm saying is don't tell people pt3 = actual voltage as it is wrong.
> 
> I hate saying this, but I have way more experience at this point on these BIOSes than you do. The last thing I want is misinformation about voltage and BIOSes that could potentially lead to someone killing their card.


I agree with everything you've posted here in response to Arm3nian, except I have one question about this remark you made (I bolded it inside your post): "Most cards will start to flop at 1.375v actual, or a setting of 1.325 in pt3." What do you think about this screenshot of a run at 1.4v/1.45v? Is it legit?
http://www.overclock.net/t/1436635/ocn-gk110-vs-hawaii-bench-off-thread/740#post_21101666

Is his particular R9 290X just a "special circumstance" card that doesn't fit into what "most cards" will do? Just curious. Thanks jomama22.
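For scale, the offset jomama22 describes (a PT3 setting of 1.29 reading 1.33-1.35v actual; LN2 benchers measuring roughly +0.05v over their GPU Tweak setting) amounts to a simple additive range. A minimal sketch, purely illustrative; the offset figures come from the posts in this thread, not from any official spec:

```python
# Illustrative only: estimate real VDDC from a PT3 BIOS voltage setting, using the
# +0.04 V to +0.06 V offset reported in this thread (not an official figure).

PT3_OFFSET_RANGE = (0.04, 0.06)  # volts, per the reports quoted above

def estimated_actual(v_set: float) -> tuple[float, float]:
    """Return a (low, high) estimate of actual core voltage for a given PT3 setting."""
    lo, hi = PT3_OFFSET_RANGE
    return v_set + lo, v_set + hi

lo, hi = estimated_actual(1.29)
print(f"PT3 set 1.290 V -> roughly {lo:.3f}-{hi:.3f} V actual")  # → roughly 1.330-1.350 V actual
```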


----------



## Arm3nian

Quote:


> Originally Posted by *Jared Pace*
> 
> I agree with everything you've posted here in response to Arm3nian; except I have 1 question about this remark you make. I bolded it inside your post: "Most cards top out at 1.325v/1.375v". What do you think about this screenshot of a run with 1.4v/1.45v? Is it legit?:
> http://www.overclock.net/t/1436635/ocn-gk110-vs-hawaii-bench-off-thread/740#post_21101666
> 
> Is his particular R9 290X just a "special circumstance" card that doesn't fit into what "most cards" will do? Just curious. Thanks jomama22.


Several people have reported black screens with the PT3 BIOS. Too little or too much voltage makes no sense to me. If they are running too little voltage and black screen, the obvious solution is to raise volts. If they still get black screens, what do you say then? Too much voltage? What?


----------



## tsm106

There's a lot of vagueness with Hawaii that makes it really annoying. Btw, a monkey could figure out that PT3 adds more volts than what you input, by watching the monitor and adding the difference from a droop BIOS. Anyways, there are two types of blackscreens that a friend and I have been trying to shed light on: those who get them at stock, and those who OC with voltage.

OCing w/o voltage, as far as we've tested, reacts the same way as Tahiti when you go too far: artifact, then crash. With volts on Hawaii, when you go too far it blackscreens. It doesn't take a genius to see we have a volt control problem. Whether it's the apps... not sure, even though GPU Tweak can suck it. Adding volts in AB causes blackscreens too. Yea, not a typo... thanks Falkentyne. Oh yea, and that other guy.

You can induce blackscreens for testing by adding a little bit of volts, going into an FPS game, and spinning around fast. The fast motion with added volts can and will induce a blackscreen. That didn't happen with Tahiti, for sure.


----------



## Forceman

Got my first red screen crash on BF4 today. Played a couple of hours of campaign yesterday at the same settings with no issue. Is there a known cause for the red screen? Is it overclock related, or is it maybe a game or driver issue?

Edit: Hynix on my XFX 290 also.


----------



## HoZy

Quote:


> Originally Posted by *Forceman*
> 
> Got my first red screen crash on BF4 today. Played a couple of hours of campaign yesterday at the same settings with no issue. Is there a known cause for the red screen? Is it overclock related, or is it maybe a game or driver issue?
> 
> Edit: Hynix on my XFX 290 also.


Update your driver, Red screen was a stock/beta6/7 thing. Update to beta 9.2.

Cheers
Mat


----------



## bpmcleod

BTW, thought I would throw my experience with black screens out there also. I have had a few black screens on bootup (a lot, actually), and they almost always seem to be caused either by error code 55 (a memory code that gets thrown during boot; I can normally reboot and it clears through to POST, though my memory has passed every memtest I have thrown at it, so idk why this code is thrown) or by having a second monitor hooked to my DVI port. If I unplug that monitor and reboot, it will POST fine and I can replug it after it has posted. Just a thought in case anyone is coming across similar problems.


----------



## Forceman

Quote:


> Originally Posted by *HoZy*
> 
> Update your driver, Red screen was a stock/beta6/7 thing. Update to beta 9.2.
> 
> Cheers
> Mat


Pretty sure I'm already on 9.2. I'll double check though.


----------



## tsm106

Quote:


> Originally Posted by *bpmcleod*
> 
> BTW thought I would throw my experience with black screens out there also. I have had a few black screens on bootup (alot actually) and they almost always seem to be caused by either error code 55 (A memory code that gets thrown during boot and I can normally reboot and it clears through to post) but my memory has passed any memtest I have thrown at it so idk why this code is thrown, and also because I have a second monitor hooked to my DVI port. If I unplug this monitor and reboot it will post fine and I can replug it in after it has posted. Just a thought incase anyone is coming accross similar problems.


Code 55 on your board generally means RMA time. Seems it's rather prevalent with your board type.

Only success I saw was this.

http://rog.asus.com/forum/showthread.php?24767-MAXIMUS-V-FORMULA-ThunderFX-Will-Not-Boot-with-More-than-2-Sticks-of-RAM/page2


----------



## HoZy

I had code 55 the other day after going direct-to-die on my CPU. I ended up bending 2 pins in the lower-right area of my 1155 socket; I bent them back with a needle, bolted it all back together, and no more 55.


----------



## tsm106

I see a pattern starting to develop with these stock blackscreen on boots...


----------



## fleetfeather

Quote:


> Originally Posted by *tsm106*
> 
> I see a pattern starting to develop with these stock blackscreen on boots...


I've been busy with uni stuff and missed the last 200 odd posts. Would you mind summarizing what's going on?


----------



## youra6

Quote:


> Originally Posted by *tsm106*
> 
> I see a pattern starting to develop with these stock blackscreen on boots...


I didn't get the black screen on boot, but my computer did go black spontaneously a moment ago. Just flashed to the 290 (non-X) BIOS.


----------



## Sgt Bilko

Quote:


> Originally Posted by *fleetfeather*
> 
> I've been busy with uni stuff and missed the last 200 odd posts. Would you mind summarizing what's going on?


Some people have been having issues with black screens, BSODs and complete lock-ups. Imo it seems to be driver related; others think it might be the video ports, some think it might be the memory (Elpida), and yeah... that's basically what I have gotten from it all.

Someone else could probably give a better explanation than I can.


----------



## the9quad

Well mine only black screens when the card is getting a workout.

Here is the list of things I have tried some initially had success, but ultimately the issue reared it's head again and again.


When I first got the card, I ran it stock in quiet mode with whatever driver was on the CD. Graphical corruption and black screen crash very fast.
Uninstalled that Raptr program. Seemed to do something, as my graphical corruption went away; still black screened.
Heard to use the beta drivers, so I switched to them. Much better performance, but still black screens.
Next up, tried out uber mode; set the fan at max and let her rip. Could play longer, it seemed, but eventually would black screen.
Hey, beta 8! This helped some, but still black screens.
Flashed to the ASUS BIOS and used GPU Tweak (turned off Overdrive). I thought I had the magic bullet with the new custom profile setup. Yes! Nope, random black screen lockups.
The next few days were spent trying various power and fan profile settings to get rid of the black screens; always had moments when I thought I'd fixed it, but they randomly returned.
Over that same period I also downclocked my CPU (didn't help) and adjusted VCORE up some; thought it helped, but it didn't.
Now I try beta 9. Holy balls, it was worse. Back to beta 8.
Uninstalled Windows 8.1 and reinstalled 8. Started fresh again with stock everything. Made sure to update everything for my mobo. Installed a few benchmarks and BF4, as well as the beta 8s.
NOPE!
Also tried using different display ports; no help.
Also in that time I tried a new power supply and different PCI-E power lines from my modular supply. None of that helped.
Anyone have any other ideas?


----------



## HoZy

the9quad:

Ever updated your motherboard bios? If you haven't I'd suggest updating it. And if you have, Enable legacy mode.

Then see how she goes.

Cheers
Mat


----------



## Sazz

I noticed that if the card doesn't need all its speed to run a game, it doesn't hold full clocks; it downclocks itself even if the temp is way under 50C.

But when I run Furmark, it runs flat out at 1100 (my daily clock).


----------



## the9quad

Quote:


> Originally Posted by *HoZy*
> 
> the9quad:
> 
> Ever updated your motherboard bios? If you haven't I'd suggest updating it. And if you have, Enable legacy mode.
> 
> Then see how she goes.
> 
> Cheers
> Mat


Yeah, I have updated my BIOS twice since the card came out; neither update helped. I don't think legacy mode is going to help either, but next time I black screen I will enable it.

*Asus Sabertooth X79 BIOS*

4404 on 10/25/13
4502 on 10/31/13


----------



## tsm106

Quote:


> Originally Posted by *gooeyballzz*
> 
> Hey Guys,
> 
> I have been lurking for a long time and I really like your site.
> 
> I just got my MSI r9 290x and I have a problem. It seems to be throttling, in Metro Last Light *I see GPU usage up and down constantly, 100% to 0.*
> 
> I have installed the drivers from ATI released last night, which did not help.
> 
> I used MSI AB and set the fan speed to 50%. That helped with the Metro Benchmark,but the game still played choppy and I saw the constant up and down.
> 
> I got it at newegg so the RMA was a piece of cake, but I would really like to make it work.
> 
> Sorry for the long windedness.
> 
> Thanks


I was thinking about this... and then it dawned on me, lol. It was so simple. I forgot I was running my rig at stock while waiting for my blocks to arrive (that sounds funny), well, that and a proper AB release. At stock I had usage problems too, because PowerTune was set to stock.

So double check your powertune slider is not at zero.


----------



## rv8000

Are people really still using furmark


----------



## Arm3nian

Quote:


> Originally Posted by *rv8000*
> 
> Are people really still using furmark


Duh, it's the best stability test.


Spoiler: Warning: Spoiler!



Sarcasm


----------



## Strata

So if I am (so far) not getting black screens at stock, does that mean that in the future, with better voltage tools and a WC block on my card, I can go back to OCing? Or should I keep RMAing to try and get Hynix?


----------



## jomama22

Quote:


> Originally Posted by *Strata*
> 
> So If I am (so far) not getting Black Screens on stock does that mean that in the future with better voltage tools and WC block on my card I can go back to OC? Or should I keep RMAing to try and get Hynix?


Do not RMA to get Hynix, please. That's just ridiculous, honestly. I highly doubt blackscreening is only an Elpida problem.

@tsm my previous post about pt3 voltage wasn't aimed at you, your quote was there for context.


----------



## Kuat

Can anyone in the know post a short summary on 290/290x cards?

I scanned this thread and got the impression that people are getting a lot of random problems (at least with the ref cards).

Is it really such a crap product or what?


----------



## Snyderman34

New results! Flashed to the ASUS BIOS. Got up to 1200/1500 on air @ 1.4v (drooped to 1.296 or so). 4770k @ 4.3GHz

3DMark 11 (Extreme):



Fire Strike:


----------



## Arm3nian

Quote:


> Originally Posted by *Kuat*
> 
> Can anyone in the know post a short summary on 290/290x cards?
> 
> I scanned this thread and got the impression that people are getting a lot of random problems (at least with the ref cards).
> 
> Is it really such a crap product or what?


Summary is that we are waiting for proper voltage control.

As for the problems, every gpu release has had problems. The cards are still new, with newer bioses and drivers the problems should be fixed. Far from a crap product.


----------



## CurrentlyPissed

Can someone with an Asus 290 (NON-X) dump the bios for me please?


----------



## Raephen

Quote:


> Originally Posted by *Kipsofthemud*
> 
> Well, I checked their disclaimers etc. and it says they have the right to cancel an order if they want, prices can be changed, etc. They have the legal high ground, so to speak.
> 
> I just hope they honor my purchase, since I've spent thousands at this shop over the last couple of years.
> 
> 15 euros is not the same difference as 100.


It's true €15 < €100.

Hmm, I never even thought to look at their disclaimer... Though, when you ordered it, did you also immediately pay? Did you receive confirmation of payment?

If so, it'd be hard for them to get out of, unless they are real bastards, which would be bad for their rep.

Imagine it like this: you purchase a frozen pizza at your local Jumbo / AH / Aldi / whatever supermarket, and right after you've paid, the shop manager comes up to you and says the item was priced wrong. The big difference being, in the supermarket example, you have your GPU-flavoured pizza in hand and the receipt to prove the deal was legal and the too-low pricing their mistake.

I hope for you Azerty aren't bastards, and even if the pricing was a mistake, I hope they own up to it and honour the deals struck while it was priced that way.

Keep us posted what happens!


----------



## hatlesschimp

When do people think the non reference cards will arrive?


----------



## Sgt Bilko

Quote:


> Originally Posted by *hatlesschimp*
> 
> When do people think the non reference cards will arrive?


Late Nov-Early Dec according to the rumours floating around


----------



## hatlesschimp

What's the best current 290X brand? Have they all had issues, or is one of them immune so far? I like the look of the Club3D.


----------



## r0l4n

Quote:


> Originally Posted by *r0l4n*
> 
> How do you guys avoid all the tearing in the desktop when overclocking the memory? I'm game stable at over 1650, but back to the desktop, I get tearing and image corruption from 1500 and above. Anything to do with the 2D profile maybe? EDIT: I'm using AB beta 16.


Quote:


> Originally Posted by *tsm106*
> 
> *Desktop flicker, flickering, etc*
> 
> It's due to the way the driver or AB handles being pushed into 3D clocks on the desktop. *You can avoid this by choosing option 2 at the unlock AB process, thereby disabling PowerPlay (PP).* Note, this only works for those using the Unofficial Overclock Method.
> What's happening is: while a game login screen is on, AB kicks the card into 3D mode because the game is technically running. However, PP sees 3D clocks on the desktop, goes "nice try", and tries to drop clocks. Bam, you get flickering. If you open AB's graph monitoring, you'll see the core/memory clocks fluctuating, which causes the flicker. This should be visible on the desktop, like on your way to logging into whatever multiplayer game. Also, if you're using the unofficial method like me, you don't need PowerPlay at all.
> 
> Here's a video showing the flicker with PowerPlay on and off.

Workaround
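The PowerPlay clock oscillation tsm106 describes is easy to spot in monitoring data. A hypothetical sketch (the function and sample values are invented for illustration; AB's monitoring graph is the real source of such readings):

```python
# Hypothetical helper: given core-clock samples (MHz) taken while on the desktop,
# flag the oscillation that causes the flicker (PowerPlay fighting forced 3D clocks).
# The sample lists below are invented for illustration.

def is_fluctuating(samples: list[int], tolerance_mhz: int = 50) -> bool:
    """True if the clock swings more than tolerance_mhz across the window."""
    return max(samples) - min(samples) > tolerance_mhz

steady = [1000, 1000, 1000, 1000]      # PowerPlay disabled: clocks hold steady
bouncing = [300, 1000, 300, 1000]      # PowerPlay on: clocks bounce -> flicker
print(is_fluctuating(steady), is_fluctuating(bouncing))  # → False True
```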


----------



## Sgt Bilko

Quote:


> Originally Posted by *hatlesschimp*
> 
> whats the best current 290x brand? have they all had issues or is one of them immune so far. I like the look of the CLUB3D


Well, they are all ref boards atm, so you can take your pick really; it's mostly about the warranty side of things for the moment.

The Asus cards all have unlocked voltages, but you can flash any card with an Asus 290/X BIOS if you really want to for the moment.

Imo most of the black screening issues going around at the moment might be just driver related. If you are looking at the 290X in Aus, why not Crossfire 290s? Roughly $1k for the pair vs $700 for 1 x 290X,
and it will be plenty fast.


----------



## hatlesschimp

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well they are all Ref boards atm so you can take your pick really, mostly about the warranty side of things for the moment.
> 
> The Asus cards all have unlocked voltages but you can flash your card with an Asus 290/x bios if you really want to for the moment.
> 
> imo most of the black screening issues going around at the moment might be just driver related. if you are looking at the 290x in Aus, why not Crossfire 290's? roughly about $1k vs $700 for 1 x 290x
> and it will be plenty fast


Thanks *Bilko* for the reply,

Im torn in my mind for what I want.

I have 3x GTX Titans, Sony 4K TV and 3x VG248QE Monitors.

My replacement Sony 4k TV arrives tomorrow. I could have had it 3 weeks ago but I was away from home with work. So for the last 3 weeks the 4k tv has been in the living room and my missus has fallen in love with it and she doesn't want me to have it in my office now lol. I might let her have it and keep going with my 3x VG248QE monitors and maybe get 2 more and make it 5 screens in portrait eyefinity.

I'm also thinking of selling my Titans and grabbing 2 or 3 of the Club3D AMD 290Xs. They claim HDMI 2.0 support. I really like 4K gaming; the detail is crazy, and web browsing is awesome.

Also, it looks like the TV and monitor companies are reading what we are typing and trying to produce some good items. I think Seiki's new 4K TV/monitor will have DP 1.2a with MST and HDMI 2.0 with SST (4K 60Hz). If they do, they will sell like hot cakes!!!


----------



## lordzed83

r0l4n, that's what I had. Flashing the BIOS sorted it out for me. But before that, I fixed it with MSI AB.
Wish the next version would be out already :/


----------



## psyside

Quote:


> Originally Posted by *the9quad*
> 
> Well mine only black screens when the card is getting a workout.
> 
> Here is the list of things I have tried some initially had success, but ultimately the issue reared it's head again and again.
> 
> 
> When i first got the card, I ran it stock quiet mode with whatever driver was on the cd. Graphical corruption and black screen crash very fast.
> Uninstalled that Raptr program- seemed to do something as my graphical corruption went away.- still black screened.
> Heard to use the beta drivers, so I switched to them. Much better performance but still black screen.
> Next up tried out the uber mode- set the fan at max and let her rip- could play longer it seemed, but eventually would black screen.
> Hey beta 8! - this helped some, but still black screens.
> Flashed to the ASUS bios and used GPU tweak (turned off overdrive)- I thought I had the magic bullet with the new custom profile setup yes! Nope random black screen lockups.
> The next few days was trying various settings with power and fan profiles to attempt to get rid of the black screens- always had moments when I thought i fixed it but they randomly returned.
> Over that same period also down clocked my CPU-didn't help also adjusted VCORE up some, thought it helped but it didn't.
> Now I try beta 9- holy balls it was worse. Back to beta 8.
> Uninstalled windows 8.1 and re installed 8. Started fresh again with stock everything. Made sure to update everything for my mobo. Installed a few benchmarks and BF4, as well as the beta 8's.
> NOPE!
> Also tried using different display ports no help
> Also in that time I tried out a new power supply and also tried using different PCI-E power lines from my modular supply. None of that helped.
> Anyone have any other ideas?


Hardware defect it seems, or very unstable drivers (think not)


----------



## Arizonian

Quote:


> Originally Posted by *Emmett*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> Can I join?


Congrats Added


----------



## Newbie2009

Woah, think I'll wait for Trixx to unlock volts. Don't like the amount of people having problems. Anyone have any Gigabyte 290Xs?


----------



## cyenz

Quote:


> Originally Posted by *Newbie2009*
> 
> Woah think I'll wait for trixx to unlock votls. Don't like the amount of people having problems. Anyone have any gigabyte 290Xs ?


Yep, I have one, with Elpida memory... and some nice black screens when playing games.


----------



## Newbie2009

Quote:


> Originally Posted by *cyenz*
> 
> yep, i have one, with Elpida memory.. and some nice black screens when playing games.


Awesome. Have you flashed it?


----------



## cyenz

I have tried the Asus BIOS, just to check if black screens would still occur when applying more voltage to the GPU. The same: I can play for hours without a black screen, and then just five minutes is enough to get a goddamn blackscreen. It's the last time I buy new hardware as soon as it's released, especially from AMD.


----------



## Falkentyne

Are you also using Elpida GDDR5 memory on your 290/290x?

Still more people having problems with elpida, none (STILL) with Hynix (YET).


----------



## cyenz

Quote:


> Originally Posted by *Falkentyne*
> 
> Are you also using Elpida GDDR5 memory on your 290/290x?
> 
> Still more people having problems with elpida, none (STILL) with Hynix (YET).


Yep, I have the "famous" Elpida chips; it really seems that the only users affected are those with Elpida GDDR5. Maybe a BIOS update with looser timings could help, I don't know.

It seems really strange that AMD could not detect this in quality control.


----------



## Jpmboy

Quote:


> Originally Posted by *upgraditus*
> 
> np. I also had iGPU drivers which I hadn't removed; might be worth a look.


Do you know how to adjust the VRM LLC? (Or do you know which VRM controller the 290X is using? I can probably get the manufacturer's spec sheet and its command list.)


----------



## r0l4n

Quote:


> Originally Posted by *lordzed83*
> 
> r0l4n thats what i had. Flashing bios sorted it out for me. But before that i fixed it with MSI AB.
> Wish next version would be out alrd :/


Thanks for your comments.

What BIOS did you flash your card with?


----------



## Jpmboy

Quote:


> Originally Posted by *hatlesschimp*
> 
> Thanks *Bilko* for the reply,
> 
> Im torn in my mind for what I want.
> 
> I have 3x GTX Titans, Sony 4K TV and 3x VG248QE Monitors.
> 
> My replacement Sony 4k TV arrives tomorrow. I could have had it 3 weeks ago but I was away from home with work. So for the last 3 weeks the 4k tv has been in the living room and my missus has fallen in love with it and she doesn't want me to have it in my office now lol. I might let her have it and keep going with my 3x VG248QE monitors and maybe get 2 more and make it 5 screens in portrait eyefinity.
> 
> Im also thinking of selling my titans and grabbing 2 or 3 of the CLUB3D AMD 290X. They claim HDMI 2.0 support. I really like 4k gaming the detail is crazy and web browsing is awesome.
> 
> Also it looks like the TV and Monitor companies are reading what we are typing and trying to produce some good items*. I think Seikis new 4K tv/monitor will have DP1.2a with MST and HDMI 2.0 with SST 4k 60hz. If they do they will sell like hot cakes*!!!


with at least one sale to me!


----------



## Pfortunato

So guys, I need to replace the 7870 XT that I sent in for RMA. I'm thinking of a Gigabyte R9 290 for around 360 euros; when the aftermarket versions launch, I'd sell it and buy an aftermarket one. What would you do? Get a better cooler for the reference card, or sell the reference and buy aftermarket? Damn confused atm xD

Cheers


----------



## chmodlabs

Quote:


> Originally Posted by *jerrolds*
> 
> Look at the "Red Mod" thread in the ATI Cooling forums, cheapest way is to zip tie everything. Not the best way, but the cheapest. There are also 3rd party brackets that let you use AIOs like the 620 or H80 - look for Sigma Cool MK2 brackets - i think they confirmed it should work on 290(X) http://keplerdynamics.com/


Compatibility has indeed been confirmed by multiple OCN users, who are reporting 40-48% reductions in load GPU core temperature with single 120mm AIO coolers!

- Kepler


----------



## bpmcleod

If it's Elpida memory, it's not all of them. The only blackscreens I've had have either been hardware related (code 55), iGPU conflicts with drivers, or a second monitor plugged in on boot causing it to hang. Other than these, my card has run perfectly, if you don't count BF4 crashes, heh.

Edit: honestly, I don't believe it's the Elpida memory at all. Nvidia cards run it too, with none of these issues. I can't believe AMD got all of the bad Elpida; just sayin'. To me it's either an underlying issue with drivers, something the user could fix and just hasn't figured out, or hardware related. Just my two cents.


----------



## Jpmboy

Quote:


> Originally Posted by *bpmcleod*
> 
> What's weird is, those scores are barely beating mine, and I'm only at 1175/1500. What were their GPU scores? My FS score on a single card is 10880.
> 
> http://www.3dmark.com/3dm/1595580?
> 
> Edit: I am also still on air with a 290.


http://www.3dmark.com/3dm/1596016

this is the run


----------



## bpmcleod

Quote:


> Originally Posted by *Jpmboy*
> 
> http://www.3dmark.com/3dm/1596016
> 
> this is the run


Yea, jomama commented on it earlier. He said he broke a 14k GPU score, though. He was hinting at damp's run being unstable, so idk, seems possible. I'm only 400 behind him, with quite a way to go to catch his clocks on a slightly slower card.

Edit: btw, for anyone who folds, these cards hit close to (and probably over) 200k PPD, especially on water. I'm running mine at 1100/1300 and it's pulling 185k.


----------



## nemm

Well, after a weekend of playing and endless problems, which started with the SLI WC fitting not arriving despite paying for Saturday delivery, the air coolers went back on because I wanted to actually use these beauties. After a while I discovered one of my cards is actually a dud, and it was the one that seemed to have the best potential (Hynix): it started giving black screens, shutdowns, BSODs, driver crashes, you name it. When I first tried crossfire, these symptoms were only present when beginning to overclock, which I thought would be the power supply, but now I think it may actually have been the card all along. After constant problems the card eventually started giving a VGA error on POST, preventing boot-up, and after testing in another build the problem was the same.

Not letting these unfortunate events stop me the other card got the water treatment and Asus290 flash
Stock max clocks were 1100/1450 @ 1.227v (monitor reading)
After the flash, the clocks were 1200/1600 @ 1.335v (monitor reading)



Even though the max voltage reads 1.390v, this was a spike; the constant voltage was approximately 1.335v.
Not too happy with the temperatures I am getting, but my loop does have some air left in it, as the CPU temperatures also show.

I couldn't quite break a 6000 GPU score, but with a better BIOS that provides less vdroop and a more stable supply, I will get there.

Despite the problems I am more than happy with the card and I shall wait for an RMA of the other before crossfire playing.

Another thing: why does the card get picked up as Generic even though it has a 290 BIOS, just from another manufacturer? Anyone know?


----------



## rdr09

Quote:


> Originally Posted by *nemm*
> 
> Well after a weekend of playing and endless problems which started with sli wc fitting not arriving despite paying for saturday delivery the air coolers went back on because I wanted to actually use these beauties. After a while I discovered one of my cards is actually a dud and it was the one that seemed to have the best potential (Hynix) started giving black screens, shutdowns, bsod, driver crash, you name it it was doing it. When I first tried in crossfire these symptoms were only present when beginning to overclock which I thought would be the power supply but know it may actually have been the card all along. After constant problems the card eventually started to give vga error on post preventing boot up and after testing in another build the problem was the same.
> 
> Not letting these unfortunate events stop me the other card got the water treatment and Asus290 flash
> Stock max clocks were 1100/1450 @1.,227v (monitor reading)
> After flash the clocks were 1200/1600 @1.335v (monitor reading)
> 
> 
> 
> Even though the max voltage reads 1.390, this was a spike and the constant approximate voltage was 1.335v.
> Not too happy with the temperatures I am getting but my loop does have some air left as it shows from cpu temperatures also.
> 
> I couldn't quite break the 6000 gpu score but with a better bios that will actually provide less vdroop and more stable supply I will get there.
> 
> Despite the problems I am more than happy with the card and I shall wait for an RMA of the other before crossfire playing.
> 
> Another thing, why does the card get picked up as Generic even though it has a 290bios just from another manufacturer, anyone know?


I am not sure I would react the same way you did if I ever got a dud. Thanks for sharing. Looking forward to your crossfire results.


----------



## bond32

Find some vcheck points on the board... Someone should put a meter on one of these


----------



## DavidH

New member and Powercolor R9 290 owner here! I thought I would chime in because I have been hearing some good and bad stuff about our cards. First off I have the elpida memory on my card but have not had any black screens, also I have flashed my card to the PT3 Asus bios and am running 5600mhz ram speed and with 1150mhz core at 1.225vc and have played bf4 for hours this way with no issues. I have not tried to go higher yet but I have a feeling I can get the core to at least 1200mhz which I will try to do later today. Also keep in mind I have not tried lower vcore settings than 1.225vc so I might actually be able to get away with lower vcore for my 1150mhz core overclock.

For those wondering about cooling, I have installed the Prolimatech air cooler onto my GPU with 2 120mm fans and it seems to be working well, as my peak temps with these settings have been 65C. Also, GPU-Z reports my voltage to be just a tad less than the set 1.225vc; it loads at 1.19 to 1.20vc, so I guess I'm getting just a tiny bit of vdroop under the PT3 bios.
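For anyone puzzled by the memory figure: GDDR5 is quad-pumped, so the "5600MHz" quoted above is just a 1400MHz base memory clock times four, the same convention behind figures like "1500MHz (6000 effective)" elsewhere in the thread. A minimal sketch of the conversion (the helper name is made up for illustration):

```python
# GDDR5 transfers four bits per clock per pin, so the "effective" rate
# quoted in posts is the base memory clock multiplied by 4.
def effective_gddr5_mhz(base_clock_mhz: float) -> float:
    return base_clock_mhz * 4

# A 1400MHz base clock is reported as 5600MHz effective:
print(effective_gddr5_mhz(1400))   # 5600
# and 1500MHz base is the "6000 effective" seen in other posts:
print(effective_gddr5_mhz(1500))   # 6000
```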


----------



## DavidH

I will try to upload some pics of my setup a little later today so I can be added. Thank you.


----------



## Amhro

Wanted to go with a 280x, but changed my mind and decided on a 290, since there is only a small difference (the 290 is 360€ while the 280x is 300€).
Just want to ask: Sapphire or Gigabyte? I guess they are all the same, but still... Asus is out because of its 150€ higher price than Sapphire/Gigabyte.


----------



## $ilent

Sapphire is good


----------



## hotrod717

Quote:


> Originally Posted by *Amhro*
> 
> Wanted to go 280x, but changed my mind and decided for 290, since there is only little difference (290 is 360€ while 280x is 300€).
> Just want to ask, sapphire or gigabyte? I guess they are all the same, but still... asus is out because of 150€ higher price than sapphire/gigabyte.


With the 290, XFX has a lifetime warranty. They're all reference at this point, so warranty and price are just about the only real considerations, other than brand loyalty, etc.


----------



## Asrock Extreme7

Screenshot (3).png 125k .png file

my score 1062 core & 1475 mem


----------



## Asrock Extreme7

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> Screenshot (3).png 125k .png file
> 
> my score 1062 core & 1475 mem


Sorry, 1162 core.


----------



## Amhro

Quote:


> Originally Posted by *hotrod717*
> 
> With the 290, XFX has a lifetime warranty. They're all reference at this point, so warranty and price are just about the only real considerations, other than brand loyalty, etc.


XFX is not available in my country








That's why I'm deciding between sapphire and gigabyte.
As for warranty, sapphire has 2 years, gigabyte 3 years.


----------



## Pfortunato

Quote:


> Originally Posted by *Amhro*
> 
> XFX is not available in my country
> 
> 
> 
> 
> 
> 
> 
> 
> That's why I'm deciding between sapphire and gigabyte.
> As for warranty, sapphire has 2 years, gigabyte 3 years.


I'm in the same debate and I'm buying the Gigabyte; my only question is about temperatures and noise. Any information about the aftermarket version?

Cheers


----------



## CallsignVega

Quote:


> Originally Posted by *hatlesschimp*
> 
> Thanks *Bilko* for the reply,
> 
> Im torn in my mind for what I want.
> 
> I have 3x GTX Titans, Sony 4K TV and 3x VG248QE Monitors.
> 
> My replacement Sony 4k TV arrives tomorrow. I could have had it 3 weeks ago but I was away from home with work. So for the last 3 weeks the 4k tv has been in the living room and my missus has fallen in love with it and she doesn't want me to have it in my office now lol. I might let her have it and keep going with my 3x VG248QE monitors and maybe get 2 more and make it 5 screens in portrait eyefinity.
> 
> Im also thinking of selling my titans and grabbing 2 or 3 of the CLUB3D AMD 290X. They claim HDMI 2.0 support. I really like 4k gaming the detail is crazy and web browsing is awesome.
> 
> Also it looks like the TV and Monitor companies are reading what we are typing and trying to produce some good items. I think Seikis new 4K tv/monitor will have DP1.2a with MST and HDMI 2.0 with SST 4k 60hz. If they do they will sell like hot cakes!!!


Torn here too. I like the possibility of going 5x1 portrait again, but may need a custom PCB/output config version of the 290x. A stock 290x can run 5x1 setup with a MST hub, but only at 60 Hz, not my required 120 Hz.

Also going to hold off until more is known about these black screen issues. Obviously, if the cards don't hold up under more voltage and overclocking, they aren't much use to me.

All 290x's currently are reference, so all of them are the same besides them using whatever memory is available at that time of production. 290x is not HDMI 2.0. What Club3D may be referring to is you could plug an HDMI 2.0 display into the port, but it will not run at full 2.0 speeds. There are no GPU's that have an HDMI 2.0 transmission controller as far as I've seen during research.


----------



## overclockFrance

I tried to bypass the ASUS GPU Tweak 1412 mV voltage limit by flashing my Sapphire 290 with PT1 and PT3 bioses.

My results are the following :

PT1 :
Advantages :
- no voltage limit (2V !)
- 3DMark vantage enhancements : on the feature tests 3, 4 and 6, the gains are respectively 4, 3 and 11 % compared to the ASUS normal bios.

Drawbacks :
- since there is no 2D mode, the GPU Tweak selected voltage is applied in 2D : if you choose 1400 mV, this voltage is applied even if a bench or a game has not been launched. It is dangerous because 1.4 V in 2D is too high even with a GPU temperature of 30°C. And since I want to apply more than 1412 mV, using PT1 is forbidden if I don't want to damage my card.

PT3 :
Advantages :
- no voltage limit (2V !)
- 3DMark vantage enhancements : on the feature tests 3, 4 and 6, the gains are respectively 4, 3 and 11 % compared to the ASUS normal bios.
- no surprises: the GPU Tweak selected voltage is applied in both 2D and 3D. Consequently, overclocking is more transparent, since one knows the voltage precisely.

Drawbacks :
- the card heats up much more because I am forced to apply a constant higher voltage instead of a dynamic one: +4°C for the GPU and +10°C for the VRMs (measured at 1250 mV and f GPU = 1220 MHz on 3DMark Vantage). Such a VRM temperature increase will limit the max overclock.
- no 2D mode: the 1.3 V used in 3D is also applied in 2D. Quite stressful for the card in the long run.

PT3 does not satisfy me. The ideal bios would be the PT1 one including a 2D mode.

I flashed my 290 card with the Asus290x bios but no gain, no difference.

So, now, I am limited to 1412 mV.


----------



## Xylene

Maybe someone already answered this, but I can't read 439 pages to find out: does the reference cooler cut it if you are willing to run the fan at 90% or higher? I am not talking about a crazy overclock, just being able to hold 1GHz. I don't care about the noise, as I only use the machine for gaming and I always use headphones.


----------



## rdr09

Quote:


> Originally Posted by *Xylene*
> 
> Maybe someone already answered this, but I can't read 439 pages to find out: does the reference cooler cut it if you are willing to run the fan at 90% or higher? I am not talking about a crazy overclock, just being able to hold 1GHz. I don't care about the noise, as I only use the machine for gaming and I always use headphones.


Check out post #4.


----------



## Xylene

Quote:


> Originally Posted by *rdr09*
> 
> Check out post #4.


Well I sure feel dumb. Thanks


----------



## PillarOfAutumn

So apparently AMD is updating the Never Settle bundle: http://www.amd4u.com/neversettlereloaded/index_maintenance.html

I'm wondering if those of us who bought the R9 before everyone else will be able to take advantage of this?


----------



## VSG

Sure hope so!


----------



## LookOut

Does anybody know which vcore is required to run the 290x at 1275MHz with the PT1 bios in Fire Strike Extreme?
1242MHz was the last result I could get with the standard stock cooler.

http://www.benchdeluxx.de/letzfryyourcard290x.jpg

I will try it again watercooled next week; I hope to manage 1260-1275MHz.


----------



## fleetfeather

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> so apparently AMD is updating the never settle bundle: http://www.amd4u.com/neversettlereloaded/index_maintenance.html
> 
> I'm wondering if those of us who bought the R9 before everyone else will be able to take advantage of this?


If the 290 non-X gets the top tier NS bundle deal, shiz is about to get real.


----------



## PillarOfAutumn

I just sent AMD an email, I'll reply back with what they say.


----------



## sugarhell

You know, guys, you can tweet at the Catalyst creator.

https://twitter.com/CatalystCreator


----------



## brazilianloser

Quote:


> Originally Posted by *fleetfeather*
> 
> If the 290 non-X gets the top tier NS bundle deal, shiz is about to get real.


That would be nice to calm my current dissatisfaction with this card... But I would still prefer some sort of fix appearing for those with problematic cards, because with my luck my RMA replacement will be a problem child as well.


----------



## lordzed83

Quote:


> Originally Posted by *r0l4n*
> 
> Thanks for your comments
> 
> 
> 
> 
> 
> 
> 
> What BIOS did you flash your card with?


Asus. All the other ones are a waste of time if you ask me.
PT1 and PT3 are not good for daily usage, from what I read around the forums.


----------



## selk22

I am hoping that waiting for the Trixx update or AB will be worth it. The 290x performance with just a custom fan profile and 1100/1400 is already so beastly! I can be patient and wait; it's going to make it just that much sweeter.


----------



## lordzed83

LookOut, awesome card mate. Even with max sliders I get artifacts past 1180MHz on the Asus bios :/


----------



## AlphaC

Quote:


> Originally Posted by *hotrod717*
> 
> With the 290, XFX has a lifetime warranty. They're all reference at this point, so warranty and price are just about the only real considerations, other than brand loyalty, etc.


Read the fine print.

XFX's lifetime warranty is for the USA/Canada only, and the card must be registered.
Quote:


> This limited hardware warranty covers defects in materials and workmanship in your - our end-user customer's - XFX-branded Graphics Cards purchased in the United States and Canada.


Quote:


> How long does this limited warranty last?
> 
> The limited hardware warranty for Graphics Cards lasts for a time period of two years.
> 
> Which products are eligible for a limited lifetime hardware warranty?
> 
> The following Graphics Cards are eligible for an extension of the standard two-year limited hardware warranty:
> 
> 1. XFX Radeon HD 7000 Series Dual Fan (Double Dissipation Edition) Graphics Cards with Ghost Technology; a floating cover design that maximizes airflow by creating exceptional venting throughout the card.
> 
> 2. XFX Radeon HD 7000 Series Graphics Cards with 10-digit model numbers ending in "R" (example: "FX-797A-TDFR")
> 
> 3. XFX Radeon HD 6000, HD 5000, and HD 4000 Series Graphics Cards
> 
> 4. XFX GeForce GT 520, GT 430, 200, 9000, 8000, 7000, 6000 Series Graphics Cards purchased after April 17, 2007.
> 
> If you register any of the specified products noted above online at http://www.xfxforce.com/ within 30 days of purchase, your limited warranty will be EXTENDED for the duration of your life. Registration within 30 days of the date of purchase is a condition precedent to receiving the lifetime warranty.***


They could be sneaky and say it's for the lifetime of the product (i.e. 2/3 yr), à la Zotac.


----------



## DavidH

Where can I get the normal Asus R9 290X bios from? Not the modded PT1 and PT3 versions. I just want 2D mode to work while still being able to have voltage control.


----------



## Arizonian

Just curious about the black screens. Correct me if I'm wrong; trying to put a finger on it.


* Out of 83 members, 22 of which are under water so far, how many are having issues? _Because it's not everyone._
* The theory is it's happening with Elpida memory, but we've now got a member with Hynix experiencing the same.
* Some members who had them tracked it down to a different issue, but the causes varied and none were the GPU.
* Which version of Windows are these members running? It seems to vary and isn't specific to one version.
* Can't seem to put a finger on a common denominator yet.

Wondering if supported voltage control and a voltage bump could resolve this issue for those who are affected?


----------



## AlphaC

Quote:


> Originally Posted by *bpmcleod*
> 
> Yea jomama comented on it earlier. He said he broke 14k gpu score though. He was hinting at damps run being unstable so idk seems possible. I'm only 400 behind him with quite a ways to go to catch his clocks with a slightly slower card.
> 
> Edit: btw for anyone who folds, these cards hit close to and probably over 200k PPD especially on water. I'm running mine at 1100/1300 and its pulling 185k.


That's not impressive given that the GTX 780 does about the same, ~180K PPD, with less power.




http://anandtech.com/show/7492/the-geforce-gtx-780-ti-review/14

see also
http://www.computerbase.de/artikel/grafikkarten/2013/nvidia-geforce-gtx-780-ti-gegen-gtx-titan-im-test/7/#gpucomputing-fahbench
http://www.computerbase.de/artikel/grafikkarten/2013/amd-radeon-r9-290x-im-test/8/#gpucomputing-fahbench

It's odd because in almost every other OpenCL application Nvidia's Kepler trails behind. I suppose Stanford has enough incentive to be equally attentive to AMD and Nvidia.

In raw TFLOPs,
GTX 780 TI is ~5 to 5.3Tflops (EVGA SC ACX is ~5.6Tflop but ~6Tflop under Boost clock of 1072Mhz)
--> 1/24 rate double precision
GTX 780 ~4Tflop (EVGA ACX Classified ~4.6Tflop , ~4.8Tflop when boosted to 1046Mhz)
--> 1/24 double precision
TITAN with 980Boost ~5Tflop
--> 1/3 rate double precision
R9 290X ~5.6Tflop (@1030Mhz it's 5.8Tflop)
--> 1/8 double precision
R9 290 ~ 4.9Tflop
--> 1/8 double precision
HD7970GE ~4Tflop , ~4.6Tflop @ 1120Mhz
--> 1/4 double precision

(bunch of ~150W-200W TDP cards like the HD7950 , HD7870XT, GTX 670, GTX 660 Ti, GTX [email protected] , etc do ~2.5 - 3Tflops)
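For anyone wanting to reproduce the raw figures above: peak single-precision throughput for these cards is just shader count × 2 operations per clock (a fused multiply-add counts as two) × clock speed, and the double-precision numbers follow from the listed rate fractions. A quick sketch (shader counts are the public reference specs; the helper name is made up):

```python
# Peak throughput in TFLOPs: shaders x 2 ops/clock (FMA) x clock in MHz.
def tflops_sp(shaders: int, clock_mhz: float) -> float:
    return shaders * 2 * clock_mhz / 1_000_000

# R9 290X: 2816 shaders at the 1000MHz "uber" clock
print(round(tflops_sp(2816, 1000), 2))        # 5.63
# R9 290: 2560 shaders at 947MHz
print(round(tflops_sp(2560, 947), 2))         # 4.85
# Hawaii runs double precision at 1/8 rate:
print(round(tflops_sp(2816, 1000) / 8, 2))    # 0.7
```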


----------



## selk22

Do we have a way to check memory type without removing cooler right now?

I will check when I put on water block but not until.

As for me, I'm on a Sapphire with Windows 7 and no black screen issues yet. I am wondering if any of these people have tried resetting both CPU and GPU OCs to see if that helped? Also, is anyone having this issue after voltage bumps?


----------



## Jpmboy

MemoryInfo 1005.zip 1101k .zip file


courtesy of Falkentyne


----------



## selk22

+rep

Looks like I have Elpida, which is actually what I expected lol


----------



## Asrock Extreme7

I have an Asus 290x with Elpida memory, but it runs fine. Mem goes up to 1475 and core to 1167 on Windows 8, with no downclocking; it runs at 1167 all the time. 1.37v set, 1.257 real.


----------



## Arizonian

Quote:


> Originally Posted by *AlphaC*
> 
> That's not impressive given that GTX 780 does about the same ~180K PPD with less power
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> http://anandtech.com/show/7492/the-geforce-gtx-780-ti-review/14
> 
> see also
> http://www.computerbase.de/artikel/grafikkarten/2013/nvidia-geforce-gtx-780-ti-gegen-gtx-titan-im-test/7/#gpucomputing-fahbench
> http://www.computerbase.de/artikel/grafikkarten/2013/amd-radeon-r9-290x-im-test/8/#gpucomputing-fahbench
> 
> It's odd because in almost every other OpenCL application Nvidia's Kepler trails behind. I suppose Stanford has enough incentive to be equally attentive to AMD and Nvidia. In raw TFLOPs, GTX 780 TI is ~5 to 5.3Tflops (EVGA SC ACX is ~5.6Tflop), GTX 780 ~4Tflop (EVGA ACX Classified ~4.6Tflop) , TITAN with 980Boost ~5Tflop. R9 290 ~ 4.9Tflop, R9 290X ~5.6Tflop (@1030Mhz it's 5.8Tflop), HD7970GE ~4Tflop


I'm pretty sure he wasn't comparing it to the 780 but just stating a fact.


----------



## TheSoldiet

I'm pretty sure it is my motherboard that is giving me my once-a-day black screen.
Something about legacy ROM. No new BIOS updates so far, though.


----------



## Asrock Extreme7

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> I have an Asus 290x with Elpida memory, but it runs fine. Mem goes up to 1475 and core to 1167 on Windows 8, with no downclocking; it runs at 1167 all the time. 1.37v set, 1.257 real.


I was thinking of taking it back to see if I can get one with Hynix. What do you think?


----------



## DavidH

Again, where can I get the Asus R9 290X bios? The normal, unmodded one?


----------



## Asrock Extreme7

Quote:


> Originally Posted by *DavidH*
> 
> Again, where can I get the Asus R9 290x bios? the normal unmodded one?


Uber or quiet? I have a backup of my card's bios; I will upload it for you.


----------



## DavidH

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> Uber or quiet? I have a backup of my card's bios; I will upload it for you.


I have the R9 290 card, so I guess the quiet one is best, right? I have an aftermarket cooler installed so heat is not an issue; I just want the best bios for overclocking and have decided to get away from the modded PT1 and PT3 bioses.


----------



## Arizonian

Quote:


> Originally Posted by *DavidH*
> 
> I have the R9 290 card so I guess the quiet one is best right? I have an aftermarket cooler installed so heat is not an issue, just want the best bios for overclocking and have decided to get away from the modded PT1 and PT3 bios.


I'd suggest Uber mode and create your own fan profile to keep temps down and allow full usage of Core without down clocking.


----------



## iPDrop

Just did some benchmarks all 1080p resolution maxed graphics Uber Mode

Crysis 3 8x MSAA
one R9 290: *36.7 fps*
two R9 290 xfx: *72.4 fps*
*197.3%*

Batman Arkham Origins 8x MSAA
one R9 290: *82.7 fps*
two R9 290 xfx: *134.4 fps*
*162.5%*

Borderlands 2 FXAA
one R9 290: *137.1 fps*
two R9 290: *220 fps*
*160.5%*

BF4 4xMSAA
one r9 290 (frame pacing off): *72.9 fps*
two r9 290 (frame pacing off): *88.6 fps*
two r9 290 (frame pacing on): *87.2 fps*
*121.5%* (frame pacing off)

BF4 4x MSAA @ 4K (Felt VERY jittery with two cards, but smooth with one)
one r9 290: *28.7*
two r9 290: *45.5*
*158.5%*

I'm not sure if everybody is having as bad performance in BF4 with crossfire as I am but that's currently what I'm dealing with.
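The scaling percentages above are simply two-card fps divided by one-card fps. A quick sketch to recompute them from the numbers in this post (the helper name is made up):

```python
# CrossFire scaling: two-card fps as a percentage of one-card fps.
def scaling_pct(one_card_fps: float, two_card_fps: float) -> float:
    return round(two_card_fps / one_card_fps * 100, 1)

runs = {
    "Crysis 3":         (36.7, 72.4),
    "Arkham Origins":   (82.7, 134.4),
    "Borderlands 2":    (137.1, 220.0),
    "BF4 (pacing off)": (72.9, 88.6),
    "BF4 4K":           (28.7, 45.5),
}
for game, (one, two) in runs.items():
    print(f"{game}: {scaling_pct(one, two)}%")
# Near 200% is ideal scaling; BF4's ~121% at 1080p suggests a CPU or
# driver bottleneck rather than the cards themselves.
```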


----------



## DavidH

Quote:


> Originally Posted by *Arizonian*
> 
> I'd suggest Uber mode and create your own fan profile to keep temps down and allow full usage of Core without down clocking.


The fan profile doesn't matter for me because I'm running the Prolimatech cooler with 2 120mm fans attached. I just wanted to know whichever bios would be the best one for overclocking/overvolting. Maybe I should try both?

BTW thanks so much for your offering to help me.


----------



## brazilianloser

Well, for me at least, I know it's the card... Besides trying all the fixes brought up here, I still have a second card, a 280x, in my possession to be returned this week. Pop in the 290, all stock or with the dozen fixes applied, and it's a black screen within one run of Valley or about five minutes of gameplay. Pop in my 280x and everything is fine. I even left the poor thing running Valley for six hours by accident, after falling asleep from spending the whole day trying to find the source of the problem, and woke up to a card in the mid 70s C without a hiccup. So for most people I believe it's card related. Whether it will be fixable by a driver or a new bios, I do not know.


----------



## Gilgam3sh

Quote:


> Originally Posted by *DavidH*
> 
> Again, where can I get the Asus R9 290x bios? the normal unmodded one?


http://forums.overclockers.co.uk/showthread.php?t=18552408


----------



## PwrElec

any 290x quad fire reviews? i can't find any...


----------



## Asrock Extreme7

Quote:


> Originally Posted by *DavidH*
> 
> I have the R9 290 card so I guess the quiet one is best right? I have an aftermarket cooler installed so heat is not an issue, just want the best bios for overclocking and have decided to get away from the modded PT1 and PT3 bios.


https://skydrive.live.com/?cid=0aa466d1c03b2ec3&id=AA466D1C03B2EC3%21787&authkey=!ANoiqfQqK6n6tdo


----------



## ImJJames

Quote:


> Originally Posted by *iPDrop*
> 
> Just did some benchmarks all 1080p resolution maxed graphics Uber Mode
> 
> Crysis 3 8x MSAA
> one R9 290: *36.7 fps*
> two R9 290 xfx: *72.4 fps*
> *197.3%*
> 
> Batman Arkham Origins 8x MSAA
> one R9 290: *82.7 fps*
> two R9 290 xfx: *134.4 fps*
> *162.5%*
> 
> Borderlands 2 FXAA
> one R9 290: *137.1 fps*
> two R9 290: *220 fps*
> *160.5%*
> 
> BF4 4xMSAA
> one r9 290 (frame pacing off): *72.9 fps*
> two r9 290 (frame pacing off): *88.6 fps*
> two r9 290 (frame pacing on): *87.2 fps*
> *121.5%* (frame pacing off)
> 
> BF4 4x MSAA @ 4K (Felt VERY jittery with two cards, but smooth with one)
> one r9 290: *28.7*
> two r9 290: *45.5*
> *158.5%*
> 
> I'm not sure if everybody is having as bad performance in BF4 with crossfire as I am but that's currently what I'm dealing with.


Yeah, your BF4 performance with single and double cards is kind of weird. You're using the latest beta drivers and running full-screen, right?


----------



## DavidH

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> https://skydrive.live.com/?cid=0aa466d1c03b2ec3&id=AA466D1C03B2EC3%21787&authkey=!ANoiqfQqK6n6tdo


Thanks brother.


----------



## CurrentlyPissed

Quote:


> Originally Posted by *Arizonian*
> 
> Just curious about the black screens. Correct me if I'm wrong; trying to put a finger on it.
> 
> 
> * Out of 83 members, 22 of which are under water so far, how many are having issues? _Because it's not everyone._
> * The theory is it's happening with Elpida memory, but we've now got a member with Hynix experiencing the same.
> * Some members who had them tracked it down to a different issue, but the causes varied and none were the GPU.
> * Which version of Windows are these members running? It seems to vary and isn't specific to one version.
> * Can't seem to put a finger on a common denominator yet.
> 
> Wondering if supported voltage control and a voltage bump could resolve this issue for those who are affected?


Here's my info

Windows 8.1, tried 8, same issue. Have reformatted to verify.
Elpida Memory on both Sapphire 290
I don't play many games. Only LoL, and Final Fantasy 14: A Realm Reborn. And benchmarks.
FF14 Benchmark I cannot complete, whether it be 1 card at a time, or two cards. Crossfire enabled, or disable. I have yet to complete the benchmark because of black screen.
When playing FF14 I can go anywhere from 30m to 1hr until I get a black screen. When the black screen happens I continue to have sound, but nothing can be seen. I can still even talk on Teamspeak.
In benchmarks. I can occasionally complete a Valley, or 3DMark. But generally towards the end they black screen.
I have also tried stock CPU clocks, no change.


----------



## iPDrop

Quote:


> Originally Posted by *ImJJames*
> 
> Yeah your BF 4 performance with single and double is kind of weird, you're using latest beta drivers and running on full-screen right?


Yeah, I've tried several different things and can't seem to get it right







I've seen a few other people reporting the same issue.


----------



## bpmcleod

Quote:


> Originally Posted by *AlphaC*
> 
> That's not impressive given that GTX 780 does about the same ~180K PPD with less power
> 
> 
> 
> 
> http://anandtech.com/show/7492/the-geforce-gtx-780-ti-review/14
> 
> see also
> http://www.computerbase.de/artikel/grafikkarten/2013/nvidia-geforce-gtx-780-ti-gegen-gtx-titan-im-test/7/#gpucomputing-fahbench
> http://www.computerbase.de/artikel/grafikkarten/2013/amd-radeon-r9-290x-im-test/8/#gpucomputing-fahbench
> 
> It's odd because in almost every other OpenCL application Nvidia's Kepler trails behind. I suppose Stanford has enough incentive to be equally attentive to AMD and Nvidia.
> 
> In raw TFLOPs,
> GTX 780 TI is ~5 to 5.3Tflops (EVGA SC ACX is ~5.6Tflop but ~6Tflop under Boost clock of 1072Mhz)
> --> 1/24 rate double precision
> GTX 780 ~4Tflop (EVGA ACX Classified ~4.6Tflop , ~4.8Tflop when boosted to 1046Mhz)
> --> 1/24 double precision
> TITAN with 980Boost ~5Tflop
> --> 1/3 rate double precision
> R9 290X ~5.6Tflop (@1030Mhz it's 5.8Tflop)
> --> 1/8 double precision
> R9 290 ~ 4.9Tflop
> --> 1/8 double precision
> HD7970GE ~4Tflop , ~4.6Tflop @ 1120Mhz
> --> 1/4 double precision
> 
> (bunch of ~150W-200W TDP cards like the HD7950 , HD7870XT, GTX 670, GTX 660 Ti, GTX [email protected] , etc do ~2.5 - 3Tflops)


It's still impressive considering the price difference, and the fact that the drivers are still in beta. The points may rise with final drivers.


----------



## CurrentlyPissed

Please add me to the list. 2 x 290 Sapphire on Air, for now. Watercool soon.


----------



## Arizonian

Quote:


> Originally Posted by *DavidH*
> 
> The fan profile doesnt matter for me cause im running the Prolimatech cooler with 2 120mm fans attached. I just wanted which ever bios would be the best one for overclocking/overvolting. Maybe I should try both?
> 
> BTW thanks so much for your offering to help me.










Should have checked what type of cooling you were using.

Quote:


> Originally Posted by *CurrentlyPissed*
> 
> Please add me to the list. 2 x 290 Sapphire on Air, for now. Watercool soon.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats and added.









Thought I had missed you when you originally posted and couldn't find your entry on list.


----------



## Rar4f

Quote:


> Originally Posted by *bpmcleod*
> 
> It's still impressive considering the price difference, and the fact that the drivers are still in beta. The points may rise with final drivers.


What has history shown about performance changes after beta drivers for past cards?


----------



## kcuestag

Quote:


> Originally Posted by *Arizonian*
> 
> Just curious about the black screens. Correct me if I'm wrong; trying to put a finger on it.
> 
> 
> * Out of 83 members, 22 of which are under water so far, how many are having issues? _Because it's not everyone._
> * The theory is it's happening with Elpida memory, but we've now got a member with Hynix experiencing the same.
> * Some members who had them tracked it down to a different issue, but the causes varied and none were the GPU.
> * Which version of Windows are these members running? It seems to vary and isn't specific to one version.
> * Can't seem to put a finger on a common denominator yet.
> 
> Wondering if supported voltage control and a voltage bump could resolve this issue for those who are affected?


I am quite sure my issues with the signal loss (black screen) were not caused by the failure of the water pump's cable (shorting the whole rig and making the motherboard reset for anti-surge protection). They are two different problems. I experienced the blackscreens even before those shutdowns.

So, in reply to your question:

- I am affected
- Mine is Elpida Memory
- I tracked the shutdowns, but again, the blackscreens were a GPU issue, the shutdowns were NOT.
- I'm using Windows 8.1 PRO 64 Bit.
- I can still hear sound as well, although mine is like a frozen/broken sound, can't really hear proper sound anymore when it happens and I have to hard reset.


----------



## Mr357

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/4390#post_21163667
Quote:


> Originally Posted by *Arizonian*
> 
> Just curious about the black screens. Correct me if I'm wrong; trying to put a finger on it.
> 
> 
> * Out of 83 members, 22 of which are under water so far, how many are having issues? _Because it's not everyone._
> * The theory is it's happening with Elpida memory, but we've now got a member with Hynix experiencing the same.
> * Some members who had them tracked it down to a different issue, but the causes varied and none were the GPU.
> * Which version of Windows are these members running? It seems to vary and isn't specific to one version.
> * Can't seem to put a finger on a common denominator yet.
> 
> Wondering if supported voltage control and a voltage bump could resolve this issue for those who are affected?


* I have a Sapphire 290X under water
* It has Elpida IC's
* I have had zero memory-related crashes so far, even at 1500MHz (6000 effective)
* I am running Windows 7 64 bit


----------



## ImJJames

Quote:


> Originally Posted by *Rar4f*
> 
> What has history shown of change of performance after beta drivers for past cards?


A lot, actually... look at AMD's only official release, 13.9, and now look at all the changes that have been made with 13.11 beta 9.2, and it's still in BETA.


----------



## GioV

290x at $445.19!

http://www.amazon.com/gp/product/B00GIAXVKM/ref=oh_details_o00_s00_i00?ie=UTF8&psc=1


----------



## Rar4f

Quote:


> Originally Posted by *ImJJames*
> 
> A lot actually...look at amd only official release 13.9, and now look at all the changes that has been made with 13.11 beta 9.2, and its still in BETA.


In terms of frames-per-second increases, that is, not fixing of bugs or issues.


----------



## Heinz68

Quote:


> Originally Posted by *hatlesschimp*
> 
> Im also thinking of selling my titans and grabbing 2 or 3 of the CLUB3D AMD 290X. They claim HDMI 2.0 support. I really like 4k gaming the detail is crazy and web browsing is awesome.
> 
> Also it looks like the TV and Monitor companies are reading what we are typing and trying to produce some good items. I think Seikis new 4K tv/monitor will have DP1.2a with MST and HDMI 2.0 with SST 4k 60hz. If they do they will sell like hot cakes!!!


I searched the club-3d site for HDMI 2.0 and the only results I got were for HDMI 1.4a.

On this picture it shows _"Built in HDMI 1.4a with 2.0 support, DisplayPort 1.2 and Dual DVI-D"_, so most likely there is some partial support built in the firmware but they can't call it HDMI 2.0 and maybe other cards are the same.

Anyway, good news if Seiki is also going to have DP1.2; it's proven it supports 60Hz with these cards.

EDIT
I e-mailed club-3d technical support, for more info.


----------



## AlphaC

Quote:


> Originally Posted by *GioV*
> 
> 290x at $445.19!
> 
> http://www.amazon.com/gp/product/B00GIAXVKM/ref=oh_details_o00_s00_i00?ie=UTF8&psc=1


Shows $419.99 + $8.64 shipping for me (+tax ofc)

It's an MSI too... 3 yr warranty


----------



## Jpmboy

Quote:


> Originally Posted by *Mr357*
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/4390#post_21163667
> * I have a Sapphire 290X under water
> * It has Elpida IC's
> * I have had zero memory-related crashes so far, even at 1500MHz (6000 effective)
> * I am running Windows 7 64 bit


ditto


----------



## ImJJames

Quote:


> Originally Posted by *Rar4f*
> 
> In terms of frames per second increase that is, not fixing of bugs or issues.


13.11 beta 1
http://www.guru3d.com/files_details/amd_catalyst_13_11_beta1_(13_200_16_september_26)_download.html

13.11 beta 6
http://www.guru3d.com/files_details/amd_catalyst_13_11_beta6_(13_250_18_october_24)_download.html

13.11 beta 8
http://www.guru3d.com/files_details/amd_catalyst_13_11_beta7_(13_250_18_october_25).html

Read the highlights


----------



## Jpmboy

Quote:


> Originally Posted by *Rar4f*
> 
> In terms of frames per second increase that is, not fixing of bugs or issues.


Plenty of evidence of fps improvements with driver updates (with the occasional bad release going the other direction







) and personal experience, including NV drivers.


----------



## GioV

Quote:


> Originally Posted by *AlphaC*
> 
> Shows $419.99 + $8.64 shipping for me (+tax ofc)
> 
> It's an MSI too... 3 yr warranty


Hmm, even better. It showed $445 for me and when I refreshed the page it went down to $419.99. All sold out now.


----------



## lordzed83

Rar4f man. Look at 79xx starting drivers and ones we got today. Card gained around 15-20% performance in ONE YEAR of drivers !!!!


----------



## bpmcleod

Quote:


> Originally Posted by *Rar4f*
> 
> In terms of frames per second increase that is, not fixing of bugs or issues.


Proper drivers bring stability, and with stability... the entire discussion was folding anyway, not fps. My 290 is around 185-190, and he was saying it wasn't impressive because a 780 pulls that also. I made the comment about the drivers still being in beta (because I get random drops in clock speeds with the card running at 70C, so it isn't temp throttling) and the price difference.


----------



## Jpmboy

Quote:


> Originally Posted by *bpmcleod*
> 
> Proper drivers bring stability, and with stability... the entire discussion was about folding anyway, not fps. My 290 is around 185-190 and he was saying it wasn't impressive because a 780 pulls that too. I made the comment that the drivers are still in beta (because I get random drops in clock speeds while the card is running at 70C, so it isn't temp throttling) *and noted the price difference.*


Ignore the price; the performance is very good and the R9 290X does not need that caveat.


----------



## jomama22

Some new score for you guys, still have a few higher fse in my pocket but this will do for now:

Fire strike performance:

http://www.3dmark.com/fs/1127453

13002 overall
14590 gfx
1335/1740

Fire strike extreme:

http://www.3dmark.com/fs/1127113

6562 overall
6802 gfx
1335/1750

This is a 290x BTW.


----------



## DarknightOCR

A few days ago I posted here about my black screen problem.
It happened every time I booted into Windows, during the loading screen.

After trying several things, including some cables, I discovered my problem:
it was the cable.

A second DVI-D > DVI-D cable has worked well.
The previous cables (DVI-D and HDMI) must have been defective in carrying the digital image.

With this new cable, which has anti-electromagnetic filters, everything works fine.

I am now at 1100/1400 with stock voltage.


----------



## utnorris

Quote:


> Originally Posted by *Arizonian*
> 
> Just curious about the black screens. Correct me if I'm wrong, trying to put the finger on it.
> 
> 
> Out of 83 members, 22 which are under water thus far, how many are having issues _because it's not everyone._
> Theory spinning is it's happening to Elpida memory but we've got a member with Hynix experiencing same now.
> Some of the members who've had them tracked it to a different issue but they have varied as to what caused it for them and none were the GPU.
> Which version is all these members of windows running as it seems to vary and not specific to one version.
> Can't seem to put a finger on a common denominator yet.
> 
> Wondering if supported voltage control and a bump could resolve this issue for those that are?


Sapphire 290x under water no issues at stock or overclock on stock bios, no issue on Asus bios or PT1 bios either. Hynix memory on this one. My two 290's will be here tomorrow and I can let you know then if the issue shows up.


----------



## Sandcracka

I have the R9 290X and I love it, but the fan is pretty loud when setting the target temp to 70C or below. This means that trying to OC the card would make it very loud with only the stock cooler. I've read that the ARCTIC Accelero Xtreme III is indeed compatible with the 290X; however, while watching an unboxing it appears that there are only 12 VRAM heatsinks included with the kit (I believe I would need 16). Can anyone tell me which heatsinks I should buy and where to get them? Also, would I need to get additional VRM heatsinks as well? I don't really want to go with water cooling because my HAF XM doesn't really have the room, and I'm already using the Kuhler 920 with my 4670K (delidded w/ CLU). Thanks for any help!


----------



## Arm3nian

Quote:


> Originally Posted by *jomama22*
> 
> Some new score for you guys, still have a few higher fse in my pocket but this will do for now:
> 
> Fire strike performance:
> 
> http://www.3dmark.com/fs/1127453
> 
> 13002 overall
> 14590 gfx
> 1335/1740
> 
> Fire strike extreme:
> 
> http://www.3dmark.com/fs/1127113
> 
> 6562 overall
> 6802 gfx
> 1335/1750
> 
> This is a 290x BTW.


That is pretty nice, gj. What voltage do you need for those clocks?


----------



## jomama22

Quote:


> Originally Posted by *Arm3nian*
> 
> That is pretty nice, gj. What voltage do you need for those clocks?


1.42 actual, 1.375 in gputweak with pt3.


----------



## sugarhell

Keep them coming jom


----------



## smartdroid

Anyone here running 2 R9 290s on an 850W PSU? After reading some reviews I'm kinda confused about whether it will hold two R9 290s.


----------



## Asrock Extreme7

is 1167 core 1475 mem 3.94v asus bios good or bad


----------



## rv8000

Quote:


> Originally Posted by *jomama22*
> 
> Some new score for you guys, still have a few higher fse in my pocket but this will do for now:
> 
> Fire strike performance:
> 
> http://www.3dmark.com/fs/1127453
> 
> 13002 overall
> 14590 gfx
> 1335/1740
> 
> Fire strike extreme:
> 
> http://www.3dmark.com/fs/1127113
> 
> 6562 overall
> 6802 gfx
> 1335/1750
> 
> This is a 290x BTW.


Are memory clocks above 1500 giving you performance increases? Anything above 1500 seems stable with no artifacting, but it yields no performance increase for me.


----------



## jomama22

Quote:


> Originally Posted by *rv8000*
> 
> Are memory clocks above 1500 giving you performance increases? Anything above 1500 seems stable with no artifacting, but it yields no performance increase for me.


The memory is very fickle on these cards. What I suggest is to actually push to 1625 (6500 effective) and then go up from there if you can. I have noticed that anywhere between 1500-1600 I always get bad scores; once I push up a bit more, my scores shoot back up. You really need to test at least every 25 MHz. I had memory fail at 1525 and frustrate me, then I just went up and, wouldn't you know, it can hit 1725 MHz.
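
The stepping approach above is easy to script. A minimal sketch (the helper name `memory_test_steps` is mine, not a real tool):

```python
# Hypothetical helper: enumerate the memory clocks to stability-test in
# 25 MHz increments, per the advice above. Illustration only.
def memory_test_steps(start_mhz, stop_mhz, step_mhz=25):
    """Return every memory clock from start to stop, inclusive."""
    return list(range(start_mhz, stop_mhz + 1, step_mhz))

# 1500 -> 1725 gives ten candidate clocks to run a bench at
print(memory_test_steps(1500, 1725))
```

Run your benchmark of choice at each clock rather than assuming a single failure marks the ceiling.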
Quote:


> Originally Posted by *sugarhell*
> 
> Keep them coming jom


Nice to have top-10 HOF scores for both FSE and FS Performance, if I can say so myself lol.


----------



## subyman

I'm trying to get up to date on the black screen issue, is this happening to people bone stock or when trying to OC?


----------



## CurrentlyPissed

Quote:


> Originally Posted by *subyman*
> 
> I'm trying to get up to date on the black screen issue, is this happening to people bone stock or when trying to OC?


Both, some people even have to underclock.


----------



## bjozac

I've been reading this thread from #1 for the past couple of days while waiting for my new parts and cooling stuff to arrive.
I'm fairly new to this forum but I like it a lot, many good comments!

What I do miss is an ASUS non-X 290 BIOS for flashing my Sapphire 290 non-X. I would prefer to flash that before the X version.


----------



## nemm

Quote:


> Originally Posted by *bjozac*
> 
> I've been reading this thread from #1 for past couple of days while waiting for my new parts and cooling stuff to arrive.
> I'm fairly new to this forum but I like it alot, many good comments!
> 
> What I do miss is a ASUS NON-X 290 bios for flashing my Sapphire 290 NON-X bios. I would prefer to flash that before the x version.


There is a non-X version floating about, which is what I am using at the moment, and it performs no different to the X version. Where you can get it I won't disclose, as it isn't mine and I'm not taking any responsibility for it, just in case. All I will say is another forum which begins with 'O', or you can ask some nice member on here with an Asus 290 to post a backup of theirs, since there are a few.


----------



## petedread

R9 290x with stock bios and stock fan, the best I can do is 1120 core and 1425 mem. Anything over 1120 core crashes, and the sweet spot for the memory is 1400. Have yet to flash and add a block.
What's the best out-of-the-box overclock? And what's the average? My card is from XFX. Which brand seems to be the strongest?


----------



## nemm

@petedread

Myself, out of the box one card did 1165/1550 (Hynix), which developed a fault and is now dead, whereas the other managed 1100/1450 (Elpida; initial max thought to be 1115/1550). The second card with the Asus bios can do 1100/1600 out of the box.


----------



## Forceman

Quote:


> Originally Posted by *lordzed83*
> 
> Rar4f man. Look at 79xx starting drivers and ones we got today. Card gained around 15-20% performance in ONE YEAR of drivers !!!!


New architecture going from VLIW to GCN, so more scope for improvement. You won't see those kind of gains with the 290 cards. But even 10% will be nice.


----------



## petedread

Thanks nemm. I'm surprised how high some people can get the memory. Personally I found that overclocking the memory does give me better bench scores, whereas with my 6970s OC'ing the memory didn't help as much as OC'ing the core. I'm looking forward to giving this card some more volts and seeing what happens.


----------



## Blackroush

Last time I set a 90C limit on my 290X, BF4 worked like a charm, although the fan is really noisy. When I changed to a 95C max, BF4 black-screen crashed within five minutes. I'm wondering: is this a hardware problem or a software problem? It works when the temp is limited to 90C max, but not at 95C max. I think this is a heat issue, since the card crashes before HWMonitor can show a max temp, which probably went above the 95C limit, so the system shut down automatically (black screen crash).


----------



## Blackroush

Moderator, I want to join your list. Please add me, thanks!

XFX 290X

Stock Cooler


----------



## Jpmboy

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> is 1167 core 1475 mem 3.94v asus bios good or bad


bad.







you are not pushing 3.94V


----------



## Jpmboy

Anyone know why the PT3 (and PT1) bios is 64K whereas the Asus unlocked bios is 128K? (It's not flashing quiet and uber.)


----------



## Newbie2009

Quote:


> Originally Posted by *jomama22*
> 
> Some new score for you guys, still have a few higher fse in my pocket but this will do for now:
> 
> Fire strike performance:
> 
> http://www.3dmark.com/fs/1127453
> 
> 13002 overall
> 14590 gfx
> 1335/1740
> 
> Fire strike extreme:
> 
> http://www.3dmark.com/fs/1127113
> 
> 6562 overall
> 6802 gfx
> 1335/1750
> 
> This is a 290x BTW.


Pretty sweet man. I'm really impressed.


----------



## PillarOfAutumn

So what's the problem with the black screens? Is it that over a certain memory overclock the screen just blacks out? Or is this happening without overclocking?


----------



## rdr09

Quote:


> Originally Posted by *Newbie2009*
> 
> Pretty sweet man. I'm really impressed.


Highest fs extreme i've seen in ocn. gj, jomama.


----------



## cyenz

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> So whats the problem with the black screens? Is it that over a certain overclock of the memory, the screen just blacks out? Or is this happening without overclocking?


Happens at stock to me.


----------



## TooBAMF

Anybody know if Crossfire is working well with 290X and 1600p?

290/290X results are very impressive at 1440p and 4K but last I heard those were specifically targeted by AMD.


----------



## brazilianloser

Man all brands other than XFX and HIS in stock now on Newegg... Guess I will play the auto-notify game again.


----------



## Asrock Extreme7

Quote:


> Originally Posted by *Jpmboy*
> 
> bad.
> 
> 
> 
> 
> 
> 
> 
> you are not pushing 3.94V


sorry 1.39


----------



## PillarOfAutumn

Quote:


> Originally Posted by *cyenz*
> 
> Happens at stock to me.


Oh crap.
So is the only other option to RMA it? My card is coming in this Thursday. What tests should I run to make sure there's nothing wrong with the card?


----------



## brazilianloser

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Oh crap.
> So is the only other option to RMA it? My card is coming in this Thursday. What tests should I run to make sure there's nothing wrong with the card?


No need for tests... just play some games... and you will see. Otherwise some benchmarks (valley, 3dmark...)


----------



## cyenz

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Oh crap.
> So is the only other option to RMA it? My card is coming in this Thursday. What tests should I run to make sure there's nothing wrong with the card?


For me it happens in BF4 and Firestrike Extreme; it can happen in five minutes or take an hour or two. It seems that 99% of the affected cards have Elpida GDDR5. I will wait a little bit before RMAing; it could be a bios incompatibility between the memory controller and the Elpida chips, or maybe the timings are too tight for the VRAM (just assuming).


----------



## brazilianloser

I guess Amazon is trying to make up for their 780 pricing mistake, when they had that price drop, by charging more than the other guys for the 290... come on now Amazon. Sapphire 290


----------



## BababooeyHTJ

Afterburner will adjust the core voltage on 290/290x if you don't mind using a command line. Works well for me.








Quote:


> Originally Posted by *Unwinder;4697356*
> Thanks, VRM state/id/location seem to be pretty similar to mine R290 (Hawaii PRO Press Sample).
> You can alter voltage on it even with current MSI AB beta by sending commands to VRM via MSI AB command line:
> 
> To set +100mV offset:
> MSIAfterburner.exe /wi4,30,8d,10
> 
> To restore original voltage:
> MSIAfterburner.exe /wi4,30,8d,0
> 
> Use it at your own risk


Quote:


> Originally Posted by *Unwinder;4697929*
> Almost correct, but you're thinking in right direction. Offset is adjusted in 6.25mV steps and specified in hexadecimal format. So 10 hexadecimal = 16 decimal = 16 * 6.25 mV = 100 mV. 5 results in 5 * 6.25 = 31.25 mV. To set it to 50 mV you should use 8.


Quote:


> Originally Posted by *Unwinder;4698353*
> By default I2C write commands (/wi) apply to the currently selected GPU only. But there are additional /sg command line switches allowing you to specify the target GPU for the next commands. So the next command line will apply settings to both GPU0 and GPU1:
> 
> MSIAfterburner.exe /sg0 /wi4,30,8d,10 /sg1 /wi4,30,8d,10
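
Per Unwinder's notes above (6.25 mV per step, value given in hex), the last field of the `/wi4,30,8d,XX` command can be computed like this. A quick sketch, not an official tool, and as he says: at your own risk.

```python
# Convert a desired voltage offset in mV into the hex step value used by
# MSI Afterburner's I2C write command (per Unwinder: 6.25 mV per step,
# argument in hexadecimal). Sketch only.
def offset_arg(millivolts):
    steps = millivolts / 6.25
    if steps != int(steps):
        raise ValueError("offset must be a multiple of 6.25 mV")
    return format(int(steps), "x")

print(offset_arg(100))    # '10' -> MSIAfterburner.exe /wi4,30,8d,10
print(offset_arg(50))     # '8'
print(offset_arg(31.25))  # '5'
```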


----------



## skupples

Quote:


> Originally Posted by *BababooeyHTJ*
> 
> Afterburner will adjust the core voltage on 290/290x if you don't mind using a command line. Works well for me.


Yay!!! Unwinder to the rescue!

Here's your answer JP!


----------



## Ricey20

Quote:


> Originally Posted by *brazilianloser*
> 
> No need for tests... just play some games... and you will see. Otherwise some benchmarks (valley, 3dmark...)


I'm getting both black screen crashes and sometimes vertical line/checkerboard crashes. Has anyone been able to narrow down whether it's a hardware issue or driver related? Currently running crossfire and will be taking out the other card to see if it still does it. I haven't touched a thing yet besides making a more aggressive fan profile.


----------



## Jpmboy

Quote:


> Originally Posted by *skupples*
> 
> Yay!!! Unwinder to the rescue!
> 
> Here's your answer JP!


oh yeah - been doing the vrm offset already - but it's not as effective as dialing down LLC IMO - which i have not yet figured out. And the international rectifier [restricted] tech sheet download is being a real PIA - unlike ON Semiconductor.
Alexey has been a great help. for real!! And Falkentyne is the path blazer


----------



## Jared Pace

I wonder why Shamino created that memory tool in the first place. It seems like a rather pointless piece of simple software coming from a seasoned overclocker, considering all we have to do is read the manufacturer off the memory chips with our own two eyes. It got me thinking about why he would make such a tool.

He probably tested several 290X cards well before the public release and saw what a disaster the black screen freezes are. Then he made the MemoryInfo tool so everyone, beginner/novice/expert alike, would download it and be able to easily discern from a large sample whether the memory brand was a major cause of the problem. Seeing you guys say it's 99% Elpida... looks like his hunch was right and it's working out.

Good job? lol. This is lame as hell.


----------



## Falkentyne

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> So whats the problem with the black screens? Is it that over a certain overclock of the memory, the screen just blacks out? Or is this happening without overclocking?


This is what I don't get, either.

I can trigger the black screens on my Hynix card, by overclocking the MEMORY through the CCC. Guru3d even said on their review, to not overclock the memory through the CCC, because the system may black screen on the *desktop* as soon as CCC loads. Using Afterburner and disabling overdrive works.

The FUNNY thing is, if I use CCC to overclock the memory, I do NOT get black screens UNTIL I RESTART THE COMPUTER.. I was happily gaming and running Valley at 1100/1500 UNTIL I restarted the computer. Then instant black screens on the desktop unless I was able to open CCC fast enough (not easy) to hit "defaults". It did NOT black screen if I did NOT touch the mouse, however (100% idle on desktop). Yet that didn't happen until restarting. Seems like there's a failsafe/voltage issue with the memory if you OC through CCC and restart computer.

This doesn't happen with Afterburner.

I'm stable at 1100/1400 with 70% fan speed. Any higher on the core needs more voltage: +31.25 mV on the core ran Valley at 1125 MHz until the core reached 90C (about 10 minutes), then it "app hung" with a black screen (VPU recovery), where you can alt-tab and end-task the frozen game just like it should. +50 mV might work for 1125 MHz, but 1150 MHz wasn't stable even with +50 mV.


----------



## Falkentyne

Quote:


> Originally Posted by *Jpmboy*
> 
> oh yeah - been doing the vrm offset already - but it's not as effective as dialing down LLC IMO - which i have not yet figured out. And the international rectifier [restricted] tech sheet download is being a real PIA - unlike ON Semiconductor.
> Alexey has been a great help. for real!! And Falkentyne is the path blazer


Path blazer? (what did I do? But thank you anyway).


----------



## Jpmboy

Quote:


> Originally Posted by *Falkentyne*
> 
> Path blazer? (what did I do? But thank you anyway).


Well, you ran the query for unwinder and Id'ed the vrm channel 30,8d for offset voltage (i think?). I locked myself out of guru3d... ask him to page the vrm for LLC adjustment.


----------



## AlphaC

Guys, you need to keep in mind the memory controller is simpler than the Tahiti one.
Quote:


> Oddly enough, the most intriguing thing about Hawaii's basic architecture may be a fairly straightforward engineering tradeoff. The chip has eight 64-bit memory interfaces onboard, giving it, effectively, a 512-bit-wide path to memory. In order to make that wide memory path practical while keeping the chip size in check, the Hawaii team chose to exchange the complex memory PHYs in Tahiti for smaller, simpler ones. Complex PHYs, or physical interface devices, are necessary to drive GDDR5 DRAMs at peak clock frequencies, but they also eat up silicon space. AMD claims Hawaii's 512-bit memory interface occupies 20% less die area than Tahiti's 384-bit interface. As a result, Hawaii's memory operates at lower speeds. The 290X's GDDR5 runs at 5 GT/s, down from 6 GT/s for Tahiti-based cards like the Radeon HD 7970 GHz Edition. Still, overall memory bandwidth is up from 288 GB/s on Tahiti to 320 GB/s with Hawaii, thanks to the wider data path.
> 
> Of course, Hawaii's advantage on this front extends beyond Tahiti. Nvidia chose a 384-bit interface and 6 GT/s memory rates for its competing GK110 chip, too.


http://techreport.com/review/25509/amd-radeon-r9-290x-graphics-card-reviewed

Hope that helps
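
The bandwidth figures in the quote follow directly from bus width × transfer rate ÷ 8. A quick check (the function name is mine):

```python
# Peak memory bandwidth in GB/s = bus width (bits) x data rate (GT/s) / 8.
def peak_bandwidth_gbs(bus_width_bits, rate_gtps):
    return bus_width_bits * rate_gtps / 8

print(peak_bandwidth_gbs(512, 5.0))  # 320.0 -- Hawaii (290X stock)
print(peak_bandwidth_gbs(384, 6.0))  # 288.0 -- Tahiti (7970 GHz)
```

So despite the slower 5 GT/s DRAM, the wider 512-bit bus nets Hawaii more total bandwidth.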


----------



## RAFFY

Please excuse the noob question, I just want to clarify this. My screen goes black while I play BF4 and it locks up my computer to the point where I have to press the reset button and then power, or just hold down the power button. I cannot hear any sound, nor can I Alt+Tab or anything else. So this means my current 860W PSU is not enough, correct? I have a Seasonic Platinum 860W that I just bought and can still return to Newegg. Let me know what you guys think, please.


----------



## SSJVegeta

When are non-reference R9 290's due?


----------



## rv8000

Does any program report memory voltage for any brand of 290/290x yet?


----------



## shilka

Quote:


> Originally Posted by *RAFFY*
> 
> Please excuse the noob question, I just want to clarify this. My screen goes black while I play BF4 and it locks up my computer to the point where I have to press the reset button and then power, or just hold down the power button. I cannot hear any sound, nor can I Alt+Tab or anything else. So this means my current 860W PSU is not enough, correct? I have a Seasonic Platinum 860W that I just bought and can still return to Newegg. Let me know what you guys think, please.


Have you overvolted those cards?


----------



## RAFFY

Quote:


> Originally Posted by *shilka*
> 
> Have you overvolted those cards?


No, I have the voltage set to 100%; everything is bone stock, no overclocks on my CPU or GPUs, and only an SSD and a 120mm fan are connected.


----------



## RocketAbyss

Quote:


> Originally Posted by *RAFFY*
> 
> Please excuse the noob question, I just want to clarify this. My screen goes black while I play BF4 and it locks up my computer to the point where I have to press the reset button and then power, or just hold down the power button. I cannot hear any sound, nor can I Alt+Tab or anything else. So this means my current 860W PSU is not enough, correct? I have a Seasonic Platinum 860W that I just bought and can still return to Newegg. Let me know what you guys think, please.


Do you have GPU-Z open while playing? I have had issues where GPU-Z will cause my computer to experience the same symptoms you're experiencing. I was using the 0.7.3 version previously before the recent update so I'm not too sure if the new one still causes the black screen. But ever since i stopped using GPU-Z, the black screens have disappeared.


----------



## shilka

Quote:


> Originally Posted by *RAFFY*
> 
> No, I have the voltage set to 100%; everything is bone stock, no overclocks on my CPU or GPUs, and only an SSD and a 120mm fan are connected.


Then your problem is not the PSU, unless it's broken.

It sounds like it could be something that is overheating.

If you only have one fan in your case, that might be the reason why.

Unless you don't have your motherboard in a case?


----------



## RAFFY

Quote:


> Originally Posted by *RocketAbyss*
> 
> Do you have GPU-Z open while playing? I have had issues where GPU-Z will cause my computer to experience the same symptoms you're experiencing. I was using the 0.7.3 version previously before the recent update so I'm not too sure if the new one still causes the black screen. But ever since i stopped using GPU-Z, the black screens have disappeared.


No, GPU-Z is closed. The only programs I have running are Steam, Origin, ASUS GPU Tweak and Ventrilo.


----------



## RAFFY

Quote:


> Originally Posted by *shilka*
> 
> Then your problem is not the PSU unless its broken
> 
> It sounds like it could be something that is overheating
> 
> If you only have one fan in your case that might be the reason why
> 
> Unless you dont have your motherboard in a case?


That would make sense because it'll only happen when I use ULTRA settings.


----------



## RocketAbyss

Quote:


> Originally Posted by *RAFFY*
> 
> No GPU-Z is closed the only programs I have running are steam, origin, ASUS GPU Tweak and Ventrilo.


Try turning off GPU Tweak and/or any other sensor monitoring software. If the problem still happens, then it might be on the hardware side.


----------



## RAFFY

Quote:


> Originally Posted by *RocketAbyss*
> 
> Try turning off GPU tweak and/or any other sensor monitor software. If problem still happens then it might be hardware side.


If I turn off GPU Tweak, won't my fan profile stop running?


----------



## Samurai Batgirl

Hola, errbody.

Got a question: Does the GTX 780 Ti have the crown now, or is it still too close to know?
Will we have to wait for more drivers and games?

I'm asking because it looks like it's too close to call. If it stays that way, the R9 290X seems to be the best one to get right now.

Thank y'all.


----------



## RocketAbyss

Quote:


> Originally Posted by *RAFFY*
> 
> If I turn off GPU Tweak then wont my fan profile not run?


Not too sure about that; you might have to consult someone more experienced with GPU Tweak. But you could try it yourself for a short run to see if the problem reappears.


----------



## Taint3dBulge

Quote:


> Originally Posted by *brazilianloser*
> 
> What about those that are using all ports on the card??? lol


Yeah, I understand that. But try different ports with one monitor.


----------



## Duvar

Guys who have black screens, try lowering your refresh rate.
If you have 120Hz try 60Hz; I read that on oc.uk, you can try it.


----------



## ImJJames

Quote:


> Originally Posted by *Samurai Batgirl*
> 
> Hola, errbody.
> 
> Got a question: Does the GTX 780 Ti have the crown now, or is it still too close to know?
> Will we have to wait for more drivers and games?
> 
> I'm asking because it looks like it's too close to call. If it stays that way, the R9 290X seems to be the best one to get right now.
> 
> Thank y'all.


The 780 Ti is $700... it had better perform better.


----------



## Arm3nian

Quote:


> Originally Posted by *ImJJames*
> 
> 780ti is $700...it better perform better.


I have never seen such flawed logic lol.


----------



## RAFFY

Quote:


> Originally Posted by *Duvar*
> 
> Guys who have black screens, try lowering your refresh rates.
> If you have 120 try 60, i read that on oc.uk, you can try it.


I'm only at 60Hz. This is a new issue that just started happening. I just played a whole round with no problems whatsoever at an average of 92.5fps. Then I left the game, joined as a commander, and I black screened. I get zero artifacting or anything of that nature before the black screen happens. It just happens out of the blue.


----------



## Falkentyne

Quote:


> Originally Posted by *RocketAbyss*
> 
> Do you have GPU-Z open while playing? I have had issues where GPU-Z will cause my computer to experience the same symptoms you're experiencing. I was using the 0.7.3 version previously before the recent update so I'm not too sure if the new one still causes the black screen. But ever since i stopped using GPU-Z, the black screens have disappeared.


The thing was, on my computer, 0.7.3 GPU-Z would not black screen. It would just take about 10-15 seconds to start a game or to alt-tab in or out. It would never 'freeze' on a black screen.
Wizzard's old GPUTool (still a very good artifact stress tester that puts a load on the cards equal to a max-load/supersampling-AA load) also causes the same temporary freeze (but only when alt-tabbing in); it accesses the same 'bad' register as the old GPU-Z. But closing GPUTool, then reopening and closing GPU-Z again, will "unfreeze" the clocks and set everything back to normal without the temporary freezing.

You said you had permanent black screens with 0.7.3 GPU-Z running?


----------



## RocketAbyss

Quote:


> Originally Posted by *Falkentyne*
> 
> The thing was, on my computer, 0.7.3 GPU-Z would not black screen. it would just take about 10-15 seconds to start a game or to alt tab in or out. It would never 'freeze' on the black screen.
> Wizzard's old "GPUtool (still a very good artifact stress tester that puts a load on the cards the same as max load/supersampling AA load) also causes the same temporary freeze 9but only when alt tabbing in; accesses the same 'bad' register as the old gpu-z. But closing gputool, and reopening and closing gpu-z again will "unfreeze" the clocks and set everything back to normal without the temporary freezing.
> 
> you said you had permanent black screens with 0.7.3 gpu-z running?


Not permanent, but exactly as you have described here. But instead of taking 10-15 seconds to start a game, the game or program (Heaven bench) would crash after that time, or my computer would lock up, preventing me from alt-tab or ctrl-alt-del, which forced me to hard reset the computer after waiting for 2-3 minutes to see if the problem rectified itself. But I've given up on GPU-Z for now and have been only using CCC to monitor temps and fan speed on my 290X. No issues with black screens ever since I stopped using that software.

On a side note, BF4 has been a blast besides that stupid net code and crashes


----------



## KyGuy

Quote:


> Originally Posted by *Arm3nian*
> 
> I have never seen such flawed logic lol.


I know. I have my 290x overclocked to 1100/1410 and I get a flat 55.0 FPS in Heaven 4.0. The Ti only has 336GB/s and I have a flat 361GB/s. One of the new reviews put the 290x pretty close to the 780Ti's average frame rate in most games. Even in Crysis 3, surprisingly. It kicks the 780Ti in BF4. Can't wait until they patch it for Mantle.







Maybe AMD will release some better drivers too.....
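
For reference, the 361 GB/s figure above checks out: GDDR5 is quad-pumped, so a 1410 MHz memory clock is 5.64 GT/s effective, and on Hawaii's 512-bit bus that works out to roughly 361 GB/s. A quick sketch of the arithmetic (the helper names are mine):

```python
# GDDR5 is quad-pumped: effective data rate (GT/s) = 4 x memory clock (GHz).
def effective_rate_gtps(mem_clock_mhz):
    return mem_clock_mhz * 4 / 1000

# Peak bandwidth (GB/s) = bus width (bits) x effective rate (GT/s) / 8.
def bandwidth_gbs(bus_width_bits, mem_clock_mhz):
    return bus_width_bits * effective_rate_gtps(mem_clock_mhz) / 8

print(round(bandwidth_gbs(512, 1410)))  # 361 -- the overclocked 290X above
print(round(bandwidth_gbs(384, 1750)))  # 336 -- 780 Ti (7 GT/s on 384-bit)
```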


----------



## Jpmboy

Quote:


> Originally Posted by *KyGuy*
> 
> I know, I have my 290x overclocked to 1100/1410 and I get 55.0 FPS flat in Heaven 4.0. The Ti only has 336GB/S and I have 361GB/s flat. One of the new reviews put the 290x pretty close to the average frame rate in most games vs the 780Ti. Even in Crysis 3 surprisingly. Kicks the 780Ti in BF4. Can't wait until they patch it for Mantle.
> 
> 
> 
> 
> 
> 
> 
> Maybe AMD will release some better drivers too.....


it's all in the drivers!


----------



## KyGuy

Quote:


> Originally Posted by *Jpmboy*
> 
> it's all in the drivers!


Sure is.....Hopefully we will get a big update when Mantle Support emerges...


----------



## Samurai Batgirl

Quote:


> Originally Posted by *KyGuy*
> 
> I know, I have my 290x overclocked to 1100/1410 and I get 55.0 FPS flat in Heaven 4.0. The Ti only has 336GB/S and I have 361GB/s flat. One of the new reviews put the 290x pretty close to the average frame rate in most games vs the 780Ti. Even in Crysis 3 surprisingly. Kicks the 780Ti in BF4. Can't wait until they patch it for Mantle.
> 
> 
> 
> 
> 
> 
> 
> Maybe AMD will release some better drivers too.....


I'm gonna hope this is the answer I was looking for.


----------



## KyGuy

Quote:


> Originally Posted by *Samurai Batgirl*
> 
> I'm gonna hope this is the answer I was looking for.


I think that it is drivers more than anything now. A 290x under water can overclock just as well as a 780Ti. Reviews have been putting the average overclock at about 200 on the core and about 175 on the memory for the Ti. I got 50 on the core and 160 on the memory for my 290x. Max temp in BF4 is 68C with 60% fan. Yes, it is audible, but I can deal with it.


----------



## some1zwatching

Hey guys - I am having an issue with my Sapphire reference R9 290. I noticed when playing BF4 or Metro LL, the card was throttling with temperatures way below 95 C. I opened BF3 to check, and my card locked onto a steady 946 MHz or whatever it's supposed to be. But when I play Metro LL or BF4, either of those games, the clock frequency is pretty erratic, and I can notice the jerkiness in the game.

For BF4, I've found for whatever reason if I play in borderless mode, the clock frequency stays where it should and there's no jerkiness. But BF4 in full screen, or Metro LL both are jerky and the clock is throttling. Using afterburner to plot frequency, temps, etc. Looking at reviews, it looks like this is not a normal problem - no one else has reported this yet.

I did go in the BF4 directory and install the DirectX stuff, and that didn't help. Wondering if anyone else here could give me a hand.

System - i7 2600k, 16 gb 1600 MHz G.Skill, Xonar Sound card, MSI P67 mobo,


----------



## Mr357

Quote:


> Originally Posted by *some1zwatching*
> 
> Hey guys - I am having an issue with my Sapphire reference R9 290. I noticed when playing BF4 or Metro LL, the card was throttling with temperatures way below 95 C. I opened BF3 to check, and my card locked onto a steady 946 MHz or whatever it's supposed to be. But when I play Metro LL or BF4, either of those games, the clock frequency is pretty erratic, and I can notice the jerkiness in the game.
> 
> For BF4, I've found for whatever reason if I play in borderless mode, the clock frequency stays where it should and there's no jerkiness. But BF4 in full screen, or Metro LL both are jerky and the clock is throttling. Using afterburner to plot frequency, temps, etc. Looking at reviews, it looks like this is not a normal problem - no one else has reported this yet.
> 
> I did go in the BF4 directory and install the DirectX stuff, and that didn't help. Wondering if anyone else here could give me a hand.
> 
> System - i7 2600k, 16 gb 1600 MHz G.Skill, Xonar Sound card, MSI P67 mobo,


You may just have to RMA it.


----------



## some1zwatching

Has anyone else on here with a reference 290 noticed erratic GPU frequencies and stuttering? Also, this is occurring before the card even warms up to 95C, or even if I manually crank the fan speed up to ensure that it stays away from the temp limit.


----------



## sterob

Hi, does anyone know when the non-reference 290/290X cards will come out?


----------



## SpewBoy

Quote:


> Originally Posted by *RAFFY*
> 
> Please excuse the noob question, I just want to clarify this. My screen goes black while I play BF4 and it locks up my computer to the point where I have to press the reset button and then the power button, or just hold down the power button. I cannot hear any sound, nor can I Alt+Tab or anything else. So this means my current 860W PSU is not enough, correct? I have a Seasonic Platinum 860W that I just bought and can still return to Newegg. Let me know what you guys think, please.


I'm running similar rig to yours with an AX860 (afaik pretty much the same as your PSU) and I haven't had any issues that are PSU related (have not overvolted but have OCed).


----------



## frogzombie

I just spent 2 hours playing Battlefield 4. My R9 290 got to 200 degrees Fahrenheit and blue-screened my computer. I have not done any overclocking. Is an aftermarket cooler the only option?
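For context, 200 degrees Fahrenheit is actually just under the 95 C throttle target these reference cards run at, so the reading alone doesn't explain the blue screen. A minimal conversion sketch (Python used purely for illustration; nothing here comes from the post except the 200 F figure):

```python
def fahrenheit_to_celsius(f):
    """Convert a temperature from Fahrenheit to Celsius."""
    return (f - 32) * 5 / 9

# 200 F is roughly 93.3 C, just below the 95 C target the
# reference 290/290X cards are designed to run at.
print(round(fahrenheit_to_celsius(200), 1))  # -> 93.3
```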


----------



## Tobiman

Quote:


> Originally Posted by *frogzombie*
> 
> I just spent 2 hours playing Battlefield 4. My R9 290 got to 200 degrees Fahrenheit and blue-screened my computer. I have not done any overclocking. Is an aftermarket cooler the only option?


For now, yes. You can also turn up the fans a bit.


----------



## frogzombie

Do I use the CCC Graphics OverDrive to up the fan speed?


----------



## Arizonian

Quote:


> Originally Posted by *frogzombie*
> 
> Do I use the CCC Graphics OverDrive to up the fan speed?


I used *MSI Afterburner* and didn't have issues with manual fan settings. I hear the new version will officially support the 290X/290 real soon.

IDK but CCC seems buggy atm.


----------



## DavidH

1200MHz core now with my BIOS-flashed PowerColor R9 290







Thanks for the ASUS BIOS, fellows. I think I still have a lot of room left, too!


----------



## Arizonian

Quote:


> Originally Posted by *Blackroush*
> 
> Moderator I want to Join in your List, Please Add me.. Thanks
> 
> XFX 290X
> 
> Stock Cooler
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## frogzombie

Thanks. I was freaking out a bit.


----------



## frogzombie

If I set MSI afterburner to auto, what temp does it try to keep the card at?

EDIT: Never mind. I found the user defined fan settings. Thanks.


----------



## DraXxus1549

Quote:


> Originally Posted by *some1zwatching*
> 
> Hey guys - I am having an issue with my Sapphire reference R9 290. I noticed when playing BF4 or Metro LL, the card was throttling with temperatures way below 95 C. I opened BF3 to check, and my card locked onto a steady 946 MHz or whatever it's supposed to be. But when I play Metro LL or BF4, either of those games, the clock frequency is pretty erratic, and I can notice the jerkiness in the game.
> 
> For BF4, I've found for whatever reason if I play in borderless mode, the clock frequency stays where it should and there's no jerkiness. But BF4 in full screen, or Metro LL both are jerky and the clock is throttling. Using afterburner to plot frequency, temps, etc. Looking at reviews, it looks like this is not a normal problem - no one else has reported this yet.
> 
> I did go in the BF4 directory and install the DirectX stuff, and that didn't help. Wondering if anyone else here could give me a hand.
> 
> System - i7 2600k, 16 gb 1600 MHz G.Skill, Xonar Sound card, MSI P67 mobo,


I was seeing something similar when playing Tomb Raider earlier. Do you have vsync enabled? I noticed that when I have vsync enabled I see more erratic core clocks. If I disable vsync it stays at 947 as expected. I may be totally wrong about this, but it seems like instead of reducing GPU usage once the frame limit is reached, it reduces the core clock. Not sure if that makes sense, but that's what it looks like to me.


----------



## DavidH

Is 1.280V good for 1200MHz core? Is that too much voltage? I'm not even sure what the max safe voltage is, but my temps are great with my R9 290 thanks to the Prolimatech cooler.


----------



## Mr357

I believe I have debunked the "Elpida ICs are poor performers" myth. At 1100MHz core (haven't confirmed that 1200 is rock-solid stable yet) and 1.4V set in GPU Tweak (actual idle is 1.367 and load drops to just barely under 1.3), I have reached 1700MHz stable on the memory and I'm continuing on.









I'll be back once I figure out the limit on these ICs.

EDIT: Wouldn't you know it, the second I try to pass 1700MHz, I get my first black screen with this card. I guess that's where I'll draw the line.


----------



## Arizonian

Quote:


> Originally Posted by *frogzombie*
> 
> If I set MSI afterburner to auto, what temp does it try to keep the card at?


I set the fan settings to manual. I felt that by 79C the fan needs to be at 70%, and by 84C set it to 76%, and with that I found optimal gaming with the core never downclocking. Anything below that is up to you and what you'd like when running idle. Idle temps are fine if your fan is at 20%, even if it's riding up to the 51C mark. It won't hurt it.
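The manual settings above amount to a piecewise-linear temperature-to-fan-speed map, the same shape you would draw in Afterburner's user-defined fan curve. A minimal sketch for illustration only; the 79C/70% and 84C/76% points are from the post, while the idle and 95C endpoints are illustrative assumptions:

```python
# Fan curve as (temp C, fan %) points. The 79/70 and 84/76 points come
# from the post above; the 30/20 idle point and 95/100 ceiling are
# assumptions added to make the curve complete.
CURVE = [(30, 20), (79, 70), (84, 76), (95, 100)]

def fan_speed(temp_c):
    """Linearly interpolate the fan % for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(79))  # -> 70.0
print(fan_speed(84))  # -> 76.0
```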


----------



## youngmanblues

I'm getting black screens too.

Happens randomly when I am playing BF4. Can take 30 minutes to 3 hours to pop up, very random. Sound keeps playing properly but there's no response from the keyboard.
Cannot Ctrl+Alt+Del, and the Caps Lock light doesn't light up when pressed. Have to hard-reset the computer to restart.


CPU and GPU: stock clocks
MB BIOS: latest
GPU BIOS: stock
Latest beta drivers
Elpida memory
GPU temp: 57C (idle), 78C (load), custom fan curve using MSI AB


----------



## Sazz

Quote:


> Originally Posted by *youngmanblues*
> 
> I'm getting black screens too.
> 
> Happens randomly when I am playing BF4. Can take 30 minutes to 3 hours to pop up, very random. Sound keeps playing properly but there's no response from the keyboard.
> Cannot Ctrl+Alt+Del, and the Caps Lock light doesn't light up when pressed. Have to hard-reset the computer to restart.
> 
> 
> CPU and GPU: stock clocks
> MB BIOS: latest
> GPU BIOS: stock
> Latest beta drivers
> Elpida memory
> GPU temp: 57C (idle), 78C (load), custom fan curve using MSI AB


Same thing on my 290X. I think it's just the game and not the card. I play fine with every other game except BF4.


----------



## homestyle

Anyone know if the Thermalright Shaman fits the 290 series?


----------



## RAFFY

Quote:


> Originally Posted by *youngmanblues*
> 
> I'm getting black screens too.
> 
> Happens randomly when I am playing BF4. Can take 30 minutes to 3 hours to pop up, very random. Sound keeps playing properly but there's no response from the keyboard.
> Cannot Ctrl+Alt+Del, and the Caps Lock light doesn't light up when pressed. Have to hard-reset the computer to restart.
> 
> 
> CPU and GPU: stock clocks
> MB BIOS: latest
> GPU BIOS: stock
> Latest beta drivers
> Elpida memory
> GPU temp: 57C (idle), 78C (load), custom fan curve using MSI AB


Quote:


> Originally Posted by *Sazz*
> 
> Same thing on my 290X. I think it's just the game and not the card. I play fine with every other game except BF4.


Quote:


> Originally Posted by *RocketAbyss*
> 
> Not permanent, but exactly what you have described here. But instead of taking 10-15 seconds to start a game, the game or program (Heaven bench) would crash after that time, or my computer would lock up, preventing me from Alt+Tab or Ctrl+Alt+Del, which forced me to hard-reset the computer after waiting 2-3 minutes to see if the problem would rectify itself. But I've given up on GPU-Z for now and have only been using CCC to monitor the temps and fan speed of my 290X. No issues with black screens ever since I stopped using that software.
> 
> On a side note, BF4 has been a blast besides that stupid netcode and the crashes


I take it you mean GPU Tweak. I just finished gaming for over two hours with GPU Tweak closed and didn't encounter a single issue, except for one that was BF4-related: the game just froze during the new-match loading. Other than that, not a single issue at full resolution and max settings. I hope they fix GPU Tweak to work with Windows 8.1 soon.


----------



## youra6

Quote:


> Originally Posted by *DavidH*
> 
> Is 1.280vc good for 1200mhz core? Is that too much voltage? Im not even sure what the max safe voltage is but my temps are great with my R9 290 cause of the prolimitech cooler.


My card needs 1.38 to reach 1200MHz. What are your ASIC values? I am under water, though. My VID voltage starts at 1.25... Boooo


----------



## Sazz

Quote:


> Originally Posted by *RAFFY*
> 
> I take it you mean GPU Tweak. I just finished gaming for over two hours with GPU Tweak closed and didn't encounter a single issue, except for one that was BF4-related: the game just froze during the new-match loading. Other than that, not a single issue at full resolution and max settings. I hope they fix GPU Tweak to work with Windows 8.1 soon.


I get the black screens in BF4 only, but when I Alt+Tab out of the game and back in, everything is restored to normal and it's working again, so I just do that whenever I get a black screen. Then again, I only get the black screen during BF4; in any other game I haven't had any problems, even with my daily clocks of 1100/1350.


----------



## ImJJames

Quote:


> Originally Posted by *Sazz*
> 
> I get the black screens in BF4 only, but when I Alt+Tab out of the game and back in, everything is restored to normal and it's working again, so I just do that whenever I get a black screen. Then again, I only get the black screen during BF4; in any other game I haven't had any problems, even with my daily clocks of 1100/1350.


Sounds like a driver issue


----------



## Sazz

Quote:


> Originally Posted by *ImJJames*
> 
> Sounds like a driver issue


Probably. Imma go close AB and use CCC for overclocking, and see if AB is conflicting with CCC.


----------



## DavidH

Quote:


> Originally Posted by *youra6*
> 
> My card needs 1.38 to reach 1200MHz. What are your ASIC values? I am under water, though. My VID voltage starts at 1.25... Boooo


Hi, I'm at 73.1% ASIC and the VID is 1.180.


----------



## xTristinx

Just sold my two 4GB 680s and ordered myself a 290X. Just waiting for my PayPal to clear to order my second one. SOOOOO EXCITED FOR 7680x1440 GAMING.


----------



## youra6

Quote:


> Originally Posted by *DavidH*
> 
> Hi, I'm at 73.1% ASIC and the VID is 1.180.


Interesting, looks like I got shanked in the silicon lottery. I believe you got a pretty decent overclocking 290.


----------



## Mr357

With the amount of Vdroop I'm seeing, I think I've hit my limit for the PT1 BIOS. Would it be worth it to step up to PT3? I would really like to see how much higher this card can clock.


----------



## some1zwatching

Quote:


> I was seeing something similar when playing Tomb Raider earlier. Do you have vsync enabled? I noticed that when I have vsync enabled I see more erratic core clocks. If I disable vsync it stays at 947 as expected. I may be totally wrong about this, but it seems like instead of reducing GPU usage once the frame limit is reached, it reduces the core clock. Not sure if that makes sense, but that's what it looks like to me.


Draxxus - no vsync, although I did try it with vsync just to see what would happen. I thought the purpose of the GPU frequency throttling was to prevent exceeding 95C? All of this is happening around 80C or so. I game with decent noise-isolating closed cans in BF4, so the fan doesn't bother me in the slightest; I've tried setting the fan to 52% manually, and the card stayed cool but it still downclocked.

Thinking this may be the issue, I also reset everything to factory settings from CCC and tried while just running Afterburner without changing anything, and it still couldn't keep the clock rate up.


----------



## Sazz

OK, I caught the culprit behind my BF4 black screens: it's the driver! I was actually on the Quiet mode BIOS, and since you can't OC in Quiet mode it was conflicting with AB. When I switched to Uber mode I went through a whole match of BF4 without a black screen xD


----------



## RAFFY

Quote:


> Originally Posted by *youngmanblues*
> 
> I'm getting black screens too.
> 
> Happens randomly when I am playing BF4. Can take 30 minutes to 3 hours to pop up, very random. Sound keeps playing properly but there's no response from the keyboard.
> Cannot Ctrl+Alt+Del, and the Caps Lock light doesn't light up when pressed. Have to hard-reset the computer to restart.
> 
> 
> CPU and GPU: stock clocks
> MB BIOS: latest
> GPU BIOS: stock
> Latest beta drivers
> Elpida memory
> GPU temp: 57C (idle), 78C (load), custom fan curve using MSI AB


Quote:


> Originally Posted by *youra6*
> 
> Interesting, looks like I got shanked in the silicon lottery. I believe you got a pretty decent overclocking 290.


Is 73.1 good? My first card reads 74.2 and my second card reads 77.4.


----------



## Arizonian

Quote:


> Originally Posted by *Sazz*
> 
> OK, I caught the culprit behind my BF4 black screens: it's the driver! I was actually on the Quiet mode BIOS, and since you can't OC in Quiet mode it was conflicting with AB. When I switched to Uber mode I went through a whole match of BF4 without a black screen xD


Could it be that simple? Can everyone else having this issue try this out by checking which BIOS you're on?

Quiet Mode - Switch is in position closest to where you plug in your displays.
Uber Mode - Switch is in position furthest away to where you plug in your displays.


----------



## RocketAbyss

Quote:


> Originally Posted by *RAFFY*
> 
> I take it you mean GPU Tweak. I just finished gaming for over two hours with GPU Tweak closed and didn't encounter a single issue, except for one that was BF4-related: the game just froze during the new-match loading. Other than that, not a single issue at full resolution and max settings. I hope they fix GPU Tweak to work with Windows 8.1 soon.


Nope, it was GPU-Z for me. I have not flashed my 290X's BIOS to the ASUS one, and all I'm using to OC my card is CCC. Never touched GPU Tweak.


----------



## RAFFY

Quote:


> Originally Posted by *Arizonian*
> 
> Could it be that simple? Can everyone else having this issue try this out by checking which BIOS you're on?
> 
> Quiet Mode - Switch is in position closest to where you plug in your displays.
> Uber Mode - Switch is in position furthest away to where you plug in your displays.


Just flicked mine to Uber mode! Have to go to bed, but will report back my results tomorrow.


----------



## selk22

Quiet Mode is for the Weak!

This is OCN!


----------



## DampMonkey

Quote:


> Originally Posted by *selk22*
> 
> Quiet Mode is for the Weak!
> 
> This is OCN!


Water cooled = 24/7 quiet mode


----------



## Arizonian

Quote:


> Originally Posted by *selk22*
> 
> Quiet Mode is for the Weak!
> 
> This is OCN!


No, I think you missed the point. Members may have had Quiet mode on while having black screens playing BF4, and Uber mode may solve this. So we need members who did have the black-screen issue in BF4 to check which BIOS they are set to and confirm whether it fixes the issue, _at least until a driver fix._

Bears repeating.

Quote:


> Originally Posted by *Sazz*
> 
> OK, I caught the culprit behind my BF4 black screens: it's the driver! I was actually on the Quiet mode BIOS, and since you can't OC in Quiet mode it was conflicting with AB. When I switched to Uber mode I went through a whole match of BF4 without a black screen xD


Quiet Mode - Switch is in position closest to where you plug in your displays.
Uber Mode - Switch is in position furthest away to where you plug in your displays.


----------



## HighTemplar

Quote:


> Originally Posted by *KyGuy*
> 
> I think it's drivers more than anything now. A 290X under water can overclock just as well as a 780 Ti. Reviews have been putting the average overclock at about 200 on the core and about 175 on the memory for the Ti. I got 50 on the core and 160 on the memory for my 290. Max temp in BF4 is 68C with 60% fan. Yes, it's audible, but I can deal with it.


Sorry, but the average OC for the Ti is closer to 1300MHz than you think.


----------



## lethal343

SOOO VERY UPSET

Set an aggressive fan profile through MSI Afterburner... temperatures are much better (obviously not as good as water cooling). However, the black screen still happens... mostly in Metro Last Light and BF4... never crashed in Crysis 3.

Should we all just RMA? What's the go? lol, you'd think when you spend quality money you'd get quality products...

Hopefully it's just drivers....

CPU: Core i7 4960X
CPU COOLER: H100i
GPU: 2X GIGABYTE R9 290X CF
MB: GIGABYTE X79S-UP5-WiFi
RAM: HyperX Beast 16GB (2x8GB)
PSU: THERMALTAKE 1350W TOUGHPOWER 80PLUS GOLD
CASE: THERMALTAKE LEVEL 10 GT SNOW


----------



## lethal343

Also, how do I check what type of VRAM I've got?
Thanks heaps, guys.


----------



## Arizonian

Quote:


> Originally Posted by *lethal343*
> 
> Also, how do I check what type of VRAM I've got?
> Thanks heaps, guys.


Download this to find out which memory you have without having to open up your GPU shroud.

http://www.mediafire.com/download/voj4j1rlk0ucfz4/MemoryInfo+1005.rar

I just decided to add this to the OP.


----------



## lethal343

Mine are Gigabyte cards.... noticed the ASUS logo.... the program reports back with empty fields?


----------



## Arizonian

Quote:


> Originally Posted by *lethal343*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Mine are Gigabyte cards.... noticed the ASUS logo.... the program reports back with empty fields?


Mine did too but it read my Sapphire memory. Hmmmmm


----------



## lethal343

Like it matters anymore, man.... I've heard the black screen is happening with either vendor.


----------



## Arizonian

Quote:


> Originally Posted by *lethal343*
> 
> Like it matters anymore, man.... I've heard the black screen is happening with either vendor.


Are you on Quiet or Uber mode? Trying to confirm something from another member tonight who switched to Uber mode and didn't have any black screens.

EDIT - Never mind, re-read your post and you said you 'heard', not that it was happening to you.


----------



## Falkentyne

Quote:


> Originally Posted by *youra6*
> 
> My card needs 1.38 to reach 1200MHz. What are your ASIC values? I am under water, though. My VID voltage starts at 1.25... Boooo


How do you find the starting VID for your card WITHOUT using GPU Tweak? (I'm using the stock BIOS.)


----------



## lethal343

Yeah, I'm on Uber mode either way...

on 13.11 Beta 9.2 drivers, Windows 8.1


----------



## selk22

Quote:


> Originally Posted by *Arizonian*
> 
> No, I think you missed the point. Members may have had quite mode on having black screens playing BF4 and Uber mode may solve this. So we need members that did have black screen issue in BF4 to see what BIOS they are set to and confirm if it fixes this issue, _at least until a driver fix._


I completely understood the point... I don't think you understood mine. It was really just playful, hinting at why you'd be in Quiet mode in the first place: this is OCN, "The Pursuit of Performance".

I understand the pleasure of a quiet PC; this is not intended to start an argument lol, I was being lighthearted.


----------



## Amhro

Quote:


> Originally Posted by *SSJVegeta*
> 
> When are non-reference R9 290's due?


Same question here


----------



## Blackroush

Mine is Elpdia too (290X). Also black screens in BF4.


----------



## Arizonian

Quote:


> Originally Posted by *selk22*
> 
> I completely understood the point... I don't think you understood mine. It was really just playful, hinting at why you'd be in Quiet mode in the first place: this is OCN, "The Pursuit of Performance".
> 
> I understand the pleasure of a quiet PC; this is not intended to start an argument lol, I was being lighthearted.


Just heard 'whoosh' over my head. LOL.

Quote:


> Originally Posted by *lethal343*
> 
> yeah im on UBER mode either way...
> 
> on 13.11,9.2 drivers windows 8.1


Bummer then; not sure why changing the BIOS mode worked for Sazz.







_To be honest, I didn't think it was that simple of an oversight_.
Quote:


> Originally Posted by *Amhro*
> 
> Same question here


Regarding when are non-reference cards coming out:

*Non-reference Design Radeon R9 290X to Be Available From Late-November*

*Non-Reference R9 290X Graphics Card Coming Late November*

Updated - correct links.


----------



## muhd86

I plan to get 3 Sapphire R9 290 GPUs. Any heads up? I've been reading since morning and all I see are complaints. What's with the black screen in BF4? I read somewhere that if you switch the BIOS to Uber mode the problem goes away.

Also, these new GPUs don't require a CrossFire bridge anymore; that's amazing.

With 3 GPUs, can I beat the crap out of a guy with 4 Titans?


----------



## Blackroush

I am not getting black screens when I use this setting (in Quiet mode).


----------



## lethal343

I don't know, man...
With non-ref cards doing well and all... what about all the people who bought ref cards?
I don't think AMD will just say "oh well, too bad".
We'd all be liars if we said we won't throw tantrums!

BTW, some people are saying just up the voltage on the cards and set the fan speeds to aggressive and you shouldn't crash?


----------



## Amhro

Quote:


> Originally Posted by *Arizonian*
> 
> Regarding when are non-reference cards coming out:
> 
> *Non-reference Design Radeon R9 290X to Be Available From Late-November*
> 
> *Non-Reference R9 290X Graphics Card Coming Late November*
> 
> Updated - correct links.


Same for 290?


----------



## Arizonian

Quote:


> Originally Posted by *Amhro*
> 
> Same for 290?


If not, it won't be more than two weeks later for the 290 like the reference release.


----------



## Sgt Bilko

OK, so my 8350 arrived today and it's installed......... but I'm returning my 290X; the black screens are getting somewhat annoying. I'm trying to decide between 2 x 290s and a non-ref 290X.

suggestions?


----------



## lordzed83

Reading through this whole topic, I noticed one thing:
I don't think ANY new card has had this many RMAs.
It's like: black screen, RMA.
Fan too loud, RMA.
Coil whine, RMA.

It's like 20% of cards ended up with an RMA lol...


----------



## Sgt Bilko

Quote:


> Originally Posted by *lordzed83*
> 
> Reading through this whole topic, I noticed one thing:
> I don't think ANY new card has had this many RMAs.
> It's like: black screen, RMA.
> Fan too loud, RMA.
> Coil whine, RMA.
> 
> It's like 20% of cards ended up with an RMA lol...


TBH, the last thing I want to do is RMA. The coil whine didn't bother me, the fan noise didn't bother me, but when I'm unable to use the card at load...... yeah, that bothers me.


----------



## SonDa5

Quote:


> Originally Posted by *lordzed83*
> 
> Reading through this whole topic, I noticed one thing:
> I don't think ANY new card has had this many RMAs.
> It's like: black screen, RMA.
> Fan too loud, RMA.
> Coil whine, RMA.
> 
> It's like 20% of cards ended up with an RMA lol...


I think your invented number of 20% RMAs doesn't mean jack.

For you guys who are thinking about RMAing, I think you are jumping the gun on a card that is worth keeping.

I lost signal from the video card to the display and had a black screen after updating from one beta driver to the newest beta driver. The black screen would happen a few seconds after boot-up. I had time to get into the desktop on Windows 8, and then the black screen would happen.

I booted into Safe Mode, deleted all the ATI driver folders and registry entries, then rebooted, and the black screen went away. Re-installed the newest beta driver and the problem stayed away.

Right now, after doing a clean driver uninstall and reinstall, the problem is gone. I blame the drivers.

Running the stock ASUS 290X BIOS on my Sapphire 290X and using GPU Tweak to adjust voltage.

I think the black screen problem is related to a driver glitch.

My 290X has Hynix memory ICs.


----------



## rdr09

Quote:


> Originally Posted by *Sazz*
> 
> I get the black screens in BF4 only, but when I Alt+Tab out of the game and back in, everything is restored to normal and it's working again, so I just do that whenever I get a black screen. Then again, I only get the black screen during BF4; in any other game I haven't had any problems, even with my daily clocks of 1100/1350.


Do you mind trying this?





Check this out if you experience crashes or stuttering in BF4.

It was posted by Tobiman in another thread.


----------



## kcuestag

Quote:


> Originally Posted by *Arizonian*
> 
> Could it be that simple? Can everyone else having this issue try this out by checking which BIOS you're on?
> 
> Quiet Mode - Switch is in position closest to where you plug in your displays.
> Uber Mode - Switch is in position furthest away to where you plug in your displays.


The issue is completely random and the BIOS won't matter; I've tried both. It can happen after 2 hours of gaming, or it can happen instantly upon opening a 3D application like BF4 or even Valley Benchmark; it doesn't matter.


----------



## Newbie2009

Is everyone experiencing this black screen on the unlocked volts bios?


----------



## kcuestag

Quote:


> Originally Posted by *Newbie2009*
> 
> Is everyone experiencing this black screen on the unlocked volts bios?


I experienced it at completely stock BIOS (Quiet & Uber), and even when underclocking the card as low as 800/1000.

I don't use ASUS GPU Tweak or any monitoring program except MSI AB, which I uninstalled to see if that was causing it, but it wasn't.

This issue is very weird, because sometimes it can happen instantly upon opening a game or a benchmark (as soon as the card gets load), or it can happen minutes or even hours later.

I'm hoping this is driver-related and that we don't need to RMA our cards.


----------



## Newbie2009

Quote:


> Originally Posted by *kcuestag*
> 
> I experienced it at completely stock BIOS (Quiet & Uber), and even when underclocking the card as low as 800/1000.
> 
> I don't use ASUS GPU Tweak or any monitoring program except MSI AB, which I uninstalled to see if that was causing it, but it wasn't.
> 
> This issue is very weird, because sometimes it can happen instantly upon opening a game or a benchmark (as soon as the card gets load), or it can happen minutes or even hours later.
> 
> I'm hoping this is driver-related and that we don't need to RMA our cards.


Has to be driver-related. I would roll back to the earliest drivers that supported the 290 series; no review sites reported on the issue.


----------



## CriticalHit

Quote:


> Originally Posted by *Newbie2009*
> 
> Has to be driver-related. I would roll back to the earliest drivers that supported the 290 series; no review sites reported on the issue.


It's sounding more and more like that... noting a few people have said it's fixed by doing a full driver wipe and a reinstall of the latest drivers.

I wonder if those having issues just updated the drivers or did a clean install (with a reboot after the uninstall)? (I haven't got the cards myself yet to witness the issue... paid up, but it will be a couple of weeks before they're in my rig.)


----------



## Sgt Bilko

Well, I just did a full driver wipe and re-installed Beta 9, and it seems to be OK for me, fingers crossed. The one game that always caused a black screen within 5 minutes of loading was DiRT: Showdown, so I'll go crash some cars and report back.


----------



## SpewBoy

Is there a reason why MemoryInfo says "Elpdia" instead of "Elpida"? Just a typo? Been annoying me somewhat with all the different spellings going on.


----------



## Falkentyne

Guys, I asked before but no one replied.









How do you find the VID of your 290X if you are NOT using the ASUS BIOS?


----------



## Forceman

Quote:


> Originally Posted by *Falkentyne*
> 
> Guys I asked before but no one replied
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How do you find the VID of your 290X if you are NOT using the ASUS BIOS?


It should just be your load voltage at stock settings. You should be able to read it with GPU-Z.


----------



## iksklld

I know some AMD cards are known for that, I had one in the past doing it.


----------



## ImJJames

Quote:


> Originally Posted by *Arm3nian*
> 
> I have never seen such flawed logic lol.


I have all AMD products, but you're going to tell me the 780 Ti is inferior to the 290X as of right now? Do you really believe that?


----------



## Sazz

Quote:


> Originally Posted by *Arizonian*
> 
> Could it be that simple? Can everyone else having this issue try this out by checking which BIOS you're on?
> 
> Quiet Mode - Switch is in position closest to where you plug in your displays.
> Uber Mode - Switch is in position furthest away to where you plug in your displays.


Yeah, I know that. I just forgot to switch back to Uber mode. I used the Quiet mode BIOS slot when I flashed to the PT1 BIOS to do some OC runs with a voltage increase, and when I re-flashed it back I forgot to flick the switch back to the Uber mode slot.

I've played 3 hours of BF4 now and no black screen whatsoever, with an OC using AB. But the game is still buggy because I lose sound from time to time, which is a known bug of the game.

Quote:


> Originally Posted by *rdr09*
> 
> Do you mind trying this?
> 
> 
> 
> 
> 
> Check this out if you experience crashes or stuttering in BF4.
> 
> It was posted by Tobiman in another thread.


I don't really have any other problems with BF4 besides that black screen, which was caused by me trying to overclock while the BIOS was in Quiet mode; I forgot to flick it back to Uber mode xD


----------



## Sgt Bilko

Well, 20 mins later and no black screens or crashes on my end......I'm gonna cancel my RMA and wait it out a bit longer.

This has to be a driver issue......


----------



## CriticalHit

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well, 20 mins later and no black screens or crashes on my end......I'm gonna cancel my RMA and wait it out a bit longer.
> 
> This has to be a driver issue......


Fingers crossed you don't see any more!


----------



## Newbie2009

Looks like I'm going to have to go for a nickel-plated copper base for my blocks. I am running a copper block in my loop; should there be any issues?


----------



## Sgt Bilko

Quote:


> Originally Posted by *CriticalHit*
> 
> fingers crossed you don't see anymore !


Same here......Only ever had one faulty component (CVF was DOA) and i don't want another.....


----------



## th3illusiveman

Tiny Tom Logan says 3rd-party coolers won't be coming out till 2014...

No idea why AMD is so slow with something that could help their cards fly off the shelves, since heat and noise are the #1 reason most people aren't buying these cards. Right now the GTX 780 is just a better buy, because unless you're water cooling you can easily make it faster than the 290X while it's still quieter and cooler.


----------



## DeadSkull

Quote:


> Originally Posted by *kcuestag*
> 
> The issue is completely random and the BIOS won't matter; I've tried both. It can happen after 2 hours of gaming, or it can happen instantly upon opening a 3D application like BF4 or even Valley Benchmark; it doesn't matter.


Sounds like drivers. Worst case scenario there is an inherent bug with the latest AMD implementation of dynamic clock tech within the architecture itself. Then you guys are SOL.


----------



## kcuestag

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well, 20 mins later and no black screens or crashes on my end......I'm gonna cancel my RMA and wait it out a bit longer.
> 
> This has to be a driver issue......


Yeah, I don't think it's a faulty card, as it can happen anywhere from instantly to 2 hours later.









Quote:


> Originally Posted by *DeadSkull*
> 
> Sounds like drivers. Worst case scenario there is an inherent bug with the latest AMD implementation of dynamic clock tech within the architecture itself. Then you guys are SOL.


Well, in that case I'll return the card.


----------



## kantxcape

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well, I just did a full driver wipe and re-installed Beta 9, and it seems to be OK for me, fingers crossed. The one game that always caused a black screen within 5 minutes of loading was DiRT: Showdown, so I'll go crash some cars and report back.


Are you using Windows 8.1? What was your full driver wipe method?


----------



## flopper

Quote:


> Originally Posted by *ImJJames*
> 
> I have all AMD products but you're going to tell me 780ti is inferior to 290x as of right now? Do you really believe that?


For the price, yeah.
In actual gameplay it's even, until Mantle hits the street.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kantxcape*
> 
> Are you using windows 8.1? What was your driver full wipe method?


I used the Atiman Uninstaller that was listed in the OP: http://www.mediafire.com/download/0jdko53gk5npzo0/Atiman+Uninstaller+v.7.0.2.msi

I only re-installed the Beta 9 driver and I'm using Win 7 64-bit (can't bring myself to change to 8 yet).

I hope this works for someone other than me.


----------



## lordzed83

Well, I am 100% sure it's a driver problem in my case.
When I installed Beta 9 with a full driver wipe I started having black screens for the first time.
Went back to Beta 8 and the problem is gone.


----------



## kitsunestarwind

Officially a Radeon R9 290 owner as of today; add me to the list.

Also, I am getting the black screen at stock clocks.
Been reading and testing all day.

I have an 80% ASIC on my XFX R9 290.
It is running Elpida GDDR5 memory chips.

It would black screen in Heaven, the FF14 benchmark utility, and Metro LL.

It does not even get close to overheating.
But my testing all afternoon has led to getting it stable with no black screens at a -5% memory clock in Overdrive; no more black screens at all.
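For reference, here is a quick sketch of what a percentage offset in CCC Overdrive works out to. The 1250 MHz base is the reference R9 290 memory clock (5 GHz effective GDDR5); your card's base clock may differ, so treat the numbers as illustrative:

```python
def overdrive_offset(base_mhz, percent):
    """Apply a percentage offset to a base clock, as Overdrive's slider does."""
    return base_mhz + base_mhz * percent / 100.0

# -5% on the reference 1250 MHz memory clock:
print(overdrive_offset(1250, -5))  # 1187.5
```

So a -5% memory offset lands the reference card at 1187.5 MHz (roughly 4.75 GHz effective).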


----------



## Sgt Bilko

Quote:


> Originally Posted by *lordzed83*
> 
> Well, I am 100% sure it's a driver problem in my case.
> When I installed Beta 9 with a full driver wipe I started having black screens for the first time.
> Went back to Beta 8 and the problem is gone.


Well it looks like good news on the surface, guess we will know for sure once some more drivers start getting released.


----------



## flopper

Quote:


> Originally Posted by *kitsunestarwind*
> 
> Does not even get close to overheating
> But my testing all afternoon has led to getting it stable with no black screens at a -5% memory clock in Overdrive; no more black screens at all.


Heh, that's just silly, having to downclock.


----------



## the9quad

Quote:


> Originally Posted by *CriticalHit*
> 
> It's sounding that way more and more... noting that a few people have said it's fixed by doing a full driver wipe and reinstall of the latest drivers.
> 
> I wonder if those having issues just updated drivers or did a clean install (with a reboot after uninstall)? (I haven't got the cards myself yet to witness the issue... paid up, but it will be a couple of weeks before they're in my rig.)


I did a complete fresh install of Windows and then the Beta 9s; that is not a fix. I seriously doubt anyone has a fix until they can run several days without the issue.


----------



## Falkentyne

Well, it "looks" like the VID of my card is 1.188v. That's the highest vcore it would ever read. I don't know if that accounts for any vdroop, and since vcore moves in 6.25 mV steps, it's either 1.188v or 1.200v.
1.188v was the highest vcore I could ever catch in GPU-Z, usually when closing a game. If I added 12.50 mV through Afterburner, it would read 1.195v.

My ASIC is 77.9%. I found that Heaven loops basically forever at or under 90% at 1100/1400, but if I run around in free mode I may eventually get an application freeze (black screen but alt-tabbable) from the heat if the fan is at 70%. Adding 12.5 mV makes it fully stable (though I have to put the fan at 75% instead of 70%).

So is 1.188v the proper default vcore for that ASIC, or is it 1.200v (with vdroop that might make it read 1.188v) for a 77.9% ASIC?
Someone had 1.25v as their VID?
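A quick illustration of the 6.25 mV quantization described above (the step size is taken from the post, not from AMD documentation, so treat this as a sketch). A GPU-Z reading of 1.188v actually sits on the 1.1875v step, which GPU-Z rounds for display:

```python
STEP_MV = 6.25  # VID step size, as reported in the post

def nearest_vid(volts):
    """Snap a voltage reading (in volts) to the nearest 6.25 mV VID step."""
    steps = round(volts * 1000 / STEP_MV)
    return steps * STEP_MV / 1000

print(nearest_vid(1.188))  # 1.1875
print(nearest_vid(1.200))  # 1.2
```

In other words, 1.188v and 1.200v are one step apart, and the 1.195v reading after a +12.5 mV bump is consistent with a 1.1875v base plus two 6.25 mV steps, displayed rounded.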


----------



## smokedawg

Would you advise upgrading my motherboard and CPU for the 290X? I was thinking of replacing my ASRock Z68 Extreme3 Gen3 + 2500K with a Z87 board + Haswell (4770K). I'd consider going CrossFire 290X if my 1000W EVGA SuperNOVA P2 PSU can handle it. This is for gaming at [email protected], if that matters. Any recommendations?


----------



## Bloitz

Quote:


> Originally Posted by *Amhro*
> 
> XFX is not available in my country
> 
> 
> 
> 
> 
> 
> 
> 
> That's why I'm deciding between sapphire and gigabyte.
> As for warranty, sapphire has 2 years, gigabyte 3 years.


Don't forget to check whether the warranty applies in Europe. The lifetime warranty from XFX, for example, is US and Canada only. Either way, a 2-year warranty is standard due to European legislation.
Quote:


> Originally Posted by *Arizonian*
> 
> Just curious about the black screens. Correct me if I'm wrong, trying to put the finger on it.
> 
> 
> Out of 83 members, 22 of which are under water thus far, how many are having issues? _Because it's not everyone._
> Theory spinning is that it's happening with Elpida memory, but we've got a member with Hynix experiencing the same now.
> Some of the members who've had them tracked it to a different issue, but the causes have varied and none were the GPU.
> Which version of Windows are these members running? It seems to vary and isn't specific to one version.
> Can't seem to put a finger on a common denominator yet.
> 
> Wondering if supported voltage control and a bump could resolve this issue for those affected?


Perhaps time for a poll?
The population here might not be enough to draw a conclusion but might show signs of a pattern.


----------



## Mas

Quote:


> Originally Posted by *Bloitz*
> 
> Don't forget to check if the warranty applies for Europe. The lifetime warranty from XFX for example is North America only (and Canada as well apparently).


Sorry don't mean to be a jerk, but that's because Canada is in North America.


----------



## Stay Puft

Quote:


> Originally Posted by *Bloitz*
> 
> Don't forget to check if the warranty applies for Europe. The lifetime warranty from XFX for example is North America only (and Canada as well apparently). Either way, 2 year warranty is standard due to European legislation.


I've got toilet paper worth more than the paper the XFX warranty is written on.


----------



## Bloitz

Quote:


> Originally Posted by *Mas*
> 
> Sorry don't mean to be a jerk, but that's because Canada is in North America.


It still is? Good to know, I'll edit it


----------



## Amhro

Quote:


> Originally Posted by *Bloitz*
> 
> Don't forget to check if the warranty applies for Europe. The lifetime warranty from XFX for example is US and Canada only . Either way, 2 year warranty is standard due to European legislation.


The local shop already gives a 2-3 year warranty, so I guess it's fine.


----------



## estens

Looks like I found the reason for my high VRM temps; it looks like the block isn't making contact with them.








http://s67.photobucket.com/user/eivst/media/290x/20131111_1408541_zps8c08fe09.jpg.html


----------



## hatlesschimp

Will one R9 290X max all settings for two 1080p projectors?


----------



## Kipsofthemud

Quote:


> Originally Posted by *estens*
> 
> Looks like I found the reason for my high VRM temps; it looks like the block isn't making contact with them.
> 
> 
> 
> 
> 
> 
> 
> 
> http://s67.photobucket.com/user/eivst/media/290x/20131111_1408541_zps8c08fe09.jpg.html


what block is that please?


----------



## Stay Puft

Quote:


> Originally Posted by *estens*
> 
> Looks like I found the reason for my high VRM temps; it looks like the block isn't making contact with them.
> 
> 
> 
> 
> 
> 
> 
> 
> http://s67.photobucket.com/user/eivst/media/290x/20131111_1408541_zps8c08fe09.jpg.html


Wrong size thermal pads?


----------



## Newbie2009

Quote:


> Originally Posted by *lordzed83*
> 
> Well I am 100% sure its driver problem in my case.
> When I installed beta 9 with full driver wipe i started having black screens for first time.
> Went back to beta 8 and problem is gone..


That is good news.


----------



## estens

Quote:


> Originally Posted by *Kipsofthemud*
> 
> what block is that please?


It's an EK copper/plexi block.

Quote:


> Originally Posted by *Stay Puft*
> 
> Wrong size thermal pads?


It could be, but I don't think so, because the thinner thermal pads are for the RAM chips and they come precut.
So if wrong thickness is the case, I must have gotten two strips of 0.5 mm thermal pads, one precut and one not.

Update: measured the remaining thermal pads and they were 0.5 mm, so I was only sent strips of 0.5 mm thermal pads.


----------



## evensen007

Quote:


> Originally Posted by *estens*
> 
> Looks like I found the reason for my high VRM temps; it looks like the block isn't making contact with them.
> 
> 
> 
> 
> 
> 
> 
> 
> http://s67.photobucket.com/user/eivst/media/290x/20131111_1408541_zps8c08fe09.jpg.html


Quote:


> Originally Posted by *estens*
> 
> It's an EK copper/plexi block.
> It could be, but I don't think so, because the thinner thermal pads are for the RAM chips and they come precut.
> So if wrong thickness is the case, I must have gotten two strips of 0.5 mm thermal pads, one precut and one not.
> 
> Update: measured the remaining thermal pads and they were 0.5 mm, so I was only sent strips of 0.5 mm thermal pads.


OK... this has me a bit worried. Is this why people were ordering the Fuji thermal pads? What about the others who have gotten EK blocks; have you had this same thermal pad thickness issue for the VRM pads?

I was going to wait for the Koolance, Alpha, or Swiftech waterblock, but those don't seem to be out yet.


----------



## Stay Puft

Quote:


> Originally Posted by *estens*
> 
> It's an EK copper/plexi block.
> It could be, but I don't think so, because the thinner thermal pads are for the RAM chips and they come precut.
> So if wrong thickness is the case, I must have gotten two strips of 0.5 mm thermal pads, one precut and one not.
> 
> Update: measured the remaining thermal pads and they were 0.5 mm, so I was only sent strips of 0.5 mm thermal pads.


Hit up Frozencpu and order more thermal pads


----------



## DavidH

Quote:


> Originally Posted by *Newbie2009*
> 
> Is everyone experiencing this black screen on the unlocked volts bios?


PT3 bios has been problem free for me on my Powercolor R9 290.


----------



## DavidH

Quote:


> Originally Posted by *Mr357*
> 
> With the amount of Vdroop I'm seeing, I think I've hit my limit for the PT1 BIOS. Would it be worth it to step up to PT3? I would really like to see how much higher this card can clock.


The only thing I didn't like about the PT3 BIOS was that the 2D clocks and voltage are missing, meaning that even on the desktop it's running higher clocks/vcore than necessary.


----------



## DavidH

Quote:


> Originally Posted by *HighTemplar*
> 
> Sorry, but the avg OC for the Ti is closer to 1300mhz than you think.


So my R9 290 at 1200 to 1250MHz core should give a 780 Ti at 1300MHz a run for its money?


----------



## DavidH

Quote:


> Originally Posted by *SonDa5*
> 
> I think your invented number of 20% RMAs doesn't mean jack.
> 
> For you guys that are thinking about RMAng I think you are jumping the gun on a card that is worth keeping.
> 
> I lost signal from the video card to the display and had a black screen after updating from one beta driver to the newest beta driver. The black screen would happen a few seconds after boot-up. I had time to get into the desktop on Windows 8 and then the black screen would happen.
> 
> I booted into SAFE mode, deleted all the ATI driver folders and registry entries, then rebooted and the black screen went away. Re-installed the newest beta driver and the problem stayed away.
> 
> Right now, after doing a clean driver uninstall and install, the problem is gone. I blame the drivers.
> 
> Running the stock Asus 290X BIOS on my Sapphire 290X and using GPU Tweak to adjust voltage.
> 
> I think the black screen problem is related to a driver glitch.
> 
> My 290X has Hynix memory ICs.


Yep, I always do a very careful cleaning of my old drivers and have never had an issue. When I went from my HD 7970 to my R9 290 this is what I did, in the following order, and it has always worked flawlessly:

1. Uninstall the drivers via Control Panel.
2. Delete the leftover ATI folders.
3. Boot into safe mode and run Driver Cleaner (the paid version, which stays updated). If you use the free version you are on your own; I don't recommend it.
4. Run CCleaner while still in safe mode, then reboot and install the new drivers.

I'm on Win 8.1 so I do not use the ATI uninstaller program; it has been known to cause issues on both Win 8 and 8.1. Win 7 or Vista should be fine if you want to use the ATI uninstaller utility.
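Step 2 above can be sanity-checked before deleting anything. This is a minimal, non-destructive sketch that only lists leftover driver folders; the candidate paths are assumptions (typical install locations), so adjust them for your own system before relying on it:

```python
from pathlib import Path

# Common leftover locations after an AMD/ATI driver uninstall.
# These paths are assumptions; adjust the drive letter and folder
# names for your own installation.
CANDIDATES = [
    r"C:\AMD",
    r"C:\ATI",
    r"C:\Program Files\ATI Technologies",
    r"C:\Program Files (x86)\ATI Technologies",
]

def leftover_dirs(paths):
    """Return only the candidate folders that still exist on disk."""
    return [p for p in paths if Path(p).is_dir()]

if __name__ == "__main__":
    for d in leftover_dirs(CANDIDATES):
        print("leftover:", d)
```

Nothing is removed; the script just reports what survived the uninstall so you know what to delete by hand (or in safe mode, per step 3).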


----------



## Mr357

Quote:


> Originally Posted by *Falkentyne*
> 
> Well it "looks" like the VID of my card is 1.188v. That's the highest vcore it would ever read. I don't know if that accounts for any vdroop, and since vcore is in 6.25 mv steps, then it's either 1.188v or 1.200v.
> 1.188v was the highest possible vcore I could ever catch in gpu-z, usually when closing a game. If I added 12.50 mv through afterburner, it would say 1.195v.
> 
> My ASIC is 77.9%. I found that heaven loops basically forever at or under 90% at 1100/1400, but if I run around in free mode, i may eventually get an application freeze (black screen but alt tabable) from the heat if its at 70% fan. Adding 12mv makes it fully stable (have to put fan at 75% instead of 70%)
> 
> So is 1.188v the proper default vcore for that asic or is it 1.200v (counting for droop that might make it 1.188v?) for 77.9% asic?
> Someone had 1.25v as their VID?


So far, it has been determined that ASIC quality doesn't mean much. If you want to see the VID max, you can set GPU-Z to read out the min/max voltage instead of the current value.


----------



## maarten12100

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> so apparently AMD is updating the never settle bundle: http://www.amd4u.com/neversettlereloaded/index_maintenance.html
> 
> I'm wondering if those of us who bought the R9 before everyone else will be able to take advantage of this?


Wondering that too; if not, I might just return my cards. (Do you think they'd make a problem out of the fact that I registered them with Sapphire? By law I'm in the right, but I'm not sure they'll like it.)
It would be nice if BF4 was included; that way I could give a copy to a friend of mine who is low on cash (along with my two 5870s).


----------



## kcuestag

Quote:


> Originally Posted by *the9quad*
> 
> I did a complete fresh install of windows and then the beta 9's, that is not a fix. I seriously doubt anyone has a fix until they can run several days without the issue.


This, this, and this.

Just because you can get it to run for a few hours doesn't mean it's fixed, wait a few days then you can say it's fixed.


----------



## r0l4n

Quote:


> Originally Posted by *Mr357*
> 
> So far, it has been determined that ASIC quality doesn't mean much. If you want to see the VID max, you can set GPU-Z to read out the min/max voltage instead of the current value.


I believe the ASIC determines what VID your GPU gets at each speed step. Given an ASIC of X, you'll get a VID of Y at a given speed of Z, where the higher the X, the lower the Y. Don't quote me on this, but I believe that's how it works.


----------



## DavidH




----------



## Mr357

Quote:


> Originally Posted by *r0l4n*
> 
> I believe the ASIC determines what VID your GPU gets at each speed step. Given an ASIC of X, you'll get a VID of Y at a given speed of Z, where the higher the X, the lower the Y. Don't quote me on this, but I believe that's how it works.


That's how it works in theory, but I was saying that in practice, it hasn't influenced overclocking results with these cards.


----------



## battleaxe

Quote:


> Originally Posted by *DavidH*


Good grief, that thing is large.

temps please?


----------



## r0l4n

Quote:


> Originally Posted by *Mr357*
> 
> That's how it works in theory, but I was saying that in practice, it hasn't influenced overclocking results with these cards.


Agree to that


----------



## eternal7trance

I wish I knew how long the wait is on the non-reference 290s. I'm tempted to just get a 290 and put a custom cooler on it instead, but all the issues I see in this thread make me want to stay away.


----------



## Jpmboy

Quote:


> Originally Posted by *evensen007*
> 
> Ok... This has me a bit worried. Is this why people were ordering the Fuji thermal pads? What about the others who have gotten EK blocks; have you had this same thermal pad thickness issue for the VRM pads?
> 
> I was going to wait for the koolance, alpha, or swiftech waterblock, but those don't seem to be out yet.


I've been running with the EK block and pads and doing some serious benching with the unlocked Asus BIOS. The max VRM temp I've seen in GPU-Z was 59c, and the max I recorded with an IR thermometer shooting the backside of the PCB was 51c.

Be sure to apply a tiny amount of good TIM to the VRMs and to the block itself. It really helps move heat through the thermal pads.

And the advice to get some Fujipoly pads is good advice whether or not EK sent the proper ones.


----------



## DavidH

Quote:


> Originally Posted by *eternal7trance*
> 
> I wish I knew how long the wait was on the non ref 290s. I'm tempted to just get a 290 and put a custom cooler on it instead but all these issues I see in this thread make me want to stay away


What issues? I am clocked at 1250 core and 6000 mem, rock solid with a modded BIOS. Not a single issue thus far, gaming for hours.


----------



## eternal7trance

Quote:


> Originally Posted by *DavidH*
> 
> What issues? I am clocked to 1250core and 6000mem rock solid with a modded bios. Not a single issue thus far gaming for hours.


All the people reporting black screens and whatnot


----------



## pkrexer

+1 for the black screen crash here. Sapphire 290 (non-X)... The odd thing is, I have yet to have it happen in BF4 (could just be luck).

However, it's happened in Crysis 2 several times. It seems to be triggered at certain game transitions, like during a level change or when the inventory screen comes up (though not always). Like other people stated, it can happen within 2 minutes or 2 hours.

Another annoying issue I'm experiencing, mostly only in BF4: it seems like my core clock is fluctuating like crazy, which is also causing a lot of stuttering. It's worse on some maps than others. I have a custom fan profile set up within AB and my temps never go above 85c, so it's not throttling because of temps. I have noticed that if I turn on Vsync it makes the fluctuation less erratic, though it does still happen. It's almost like the card is trying to adjust its clocks according to the load on the GPU, which in turn is causing these problems; I'm not sure.

I really want to like the card, and I'm hoping all these issues can be resolved with driver fixes or BIOS updates. I was so close to shipping it back to Newegg yesterday and picking up a HOF 780. *sigh* We'll see.


----------



## DavidH

Quote:


> Originally Posted by *eternal7trance*
> 
> All the people reporting black screens and whatnot


Which seem to be driver related. A few guys recently followed my advice over in the R9 290 / R9 290X owners thread and are now having no more issues. I always do a thorough cleaning of old drivers, which probably explains why I never got the black screen problem, and I'm highly overclocked with a modded BIOS to boot.


----------



## some1zwatching

Quote:


> Originally Posted by *rdr09*
> 
> do you mind trying this?.
> 
> 
> 
> 
> 
> Check this out if you experience crashes or stuttering in BF4.
> 
> it was posted by Tobiman in another thread.


Hey, I am not getting black screens in BF4, but in both BF4 and Metro LL my card is throttling well below 95C. I *do not* care about noise/temps/etc. None of that bothers me. As such, I simply turned the fan up to 53% to keep the card away from 95C.

I tried the video and it didn't work either. Is anyone else having their card throttle below 95C? Sapphire reference R9 290.

Wondering if I should RMA?

So far, BF3 and CS:S run perfectly; the clock rate stays locked in at 946 MHz. BF4 and Metro LL are all over the place.


----------



## iPDrop

Does anyone have benchmarks of BF4 with two R9 290's in crossfire?


----------



## DavidH

Quote:


> Originally Posted by *pkrexer*
> 
> +1 for the black screen crash here. Sapphire 290 (non-x)... The odd thing is, I have yet to have it happen in BF4 yet (could just be luck)
> 
> However, its happened in Crysis 2 several times. It seems to be triggered at certain game transitions... like during a level change or when the inventory screen comes up (though not always). Like other people stated, it can happen within 2 minutes or 2 hours.
> 
> Another annoying issue I'm experiencing, mostly only occurring with BF4; It seems like my core clock is fluctuating like crazy which is also causing lot of stuttering. Its worse on some maps than others. I have a custom fan profile setup within AB and my temps never go above 85c, so its not throttling it because of temps. I have noticed that if I turn on Vsync, it makes the fluctuating less erratic... though, it does still happen. Its almost like the card is trying to adjust its clocks according to the load on the GPU which in turn is causing these problems, I'm not sure.
> 
> I really want to like the card, I'm hoping all these issues can be resolved with driver fixes or bios updates. I was so close to shipping it back to newegg yesterday and picking up a HOF 780 *sigh* we'll see.


Sorry you're having those issues; BF4 is silky smooth for me without a hitch or stutter unless it's server-related lag, and I game with Vsync.


----------



## Bloitz

Quote:


> Originally Posted by *DavidH*
> 
> 
> 
> Spoiler: Warning: Spoiler!


I would make some sort of plexi housing around that cooler so the airflow is forced on the PCB more. Looks like a beastly cooler though :O


----------



## Mr357

Can anyone link to me the PT3 BIOS? I can't find it on TPU and that guy who made the 290X overclocking thread deleted the post that had all of the links in it.


----------



## DavidH

Quote:


> Originally Posted by *Bloitz*
> 
> I would make some sort of plexi housing around that cooler so the airflow is forced on the PCB more. Looks like a beastly cooler though :O


It's doing a good job. If I stick my hand over the top of the card you can feel the air coming up from the fans. VRM temps under load, as reported by GPU-Z, are 60c for the hottest one, and that's with the overclock plus additional vcore.


----------



## Phelan

Quote:


> Originally Posted by *frogzombie*
> 
> I just spent an 2 hours playing Battlefield 4. My R9 290 got to 200 degrees Fahrenheit and blue screened my computer. I have not done any overclocking. Is an aftermarket cooler the only option?


200°F is about 93.3°C, just under 95°C, the normal operating temperature.
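For anyone who wants to double-check the Fahrenheit-to-Celsius conversion, a minimal sketch:

```python
def f_to_c(temp_f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (temp_f - 32) * 5.0 / 9.0

print(round(f_to_c(200), 1))  # 93.3
```

So the 200°F reading is right around the card's designed 95°C operating target rather than a runaway temperature.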


----------



## eternal7trance

Has anyone gotten better temps by just redoing the paste on the card?


----------



## Mr357

Quote:


> Originally Posted by *eternal7trance*
> 
> Has anyone gotten better temps by just redoing the paste on the card?


Someone replaced the stock TIM with IC Diamond or some other high-grade stuff and reported a 2-degree difference. If you dig around you can find that post in this thread.


----------



## tsm106

Quote:


> Originally Posted by *Mr357*
> 
> Quote:
> 
> 
> 
> Originally Posted by *eternal7trance*
> 
> Has anyone gotten better temps by just redoing the paste on the card?
> 
> 
> 
> Someone replaced the stock TIM with IC Diamond or some other high grade stuff and reported a 2 degree difference. If you dug around you could find that post in this thread.

You shouldn't use something abrasive like IC on the bare die unless you don't value your card or its warranty.


----------



## Blackroush

I just got a black screen while playing BF4 in uber mode, SHUT!


----------



## some1zwatching

Quote:


> Originally Posted by *pkrexer*
> 
> Another annoying issue I'm experiencing, mostly only in BF4: it seems like my core clock is fluctuating like crazy, which is also causing a lot of stuttering. It's worse on some maps than others. I have a custom fan profile set up within AB and my temps never go above 85c, so it's not throttling because of temps. I have noticed that if I turn on Vsync it makes the fluctuation less erratic, though it does still happen. It's almost like the card is trying to adjust its clocks according to the load on the GPU, which in turn is causing these problems; I'm not sure.


I am having the same issue.

One thing I noticed that made it *drastically* better in BF4 - go into your settings, and instead of having it render in Fullscreen, set it to render in borderless. After you boot the game, switch to fullscreen, then to borderless. I don't think it fixed it completely for me, but it made the clock settings *much* more stable and the stuttering was *almost* eliminated. If it works for you too, will you please let me know? Thanks!


----------



## Mr357

Quote:


> Originally Posted by *tsm106*
> 
> You shouldn't use something abrasive like IC on the bare die unless you don't value your card or its warranty.


What should be used? CLP/CLU?


----------



## tsm106

Quote:


> Originally Posted by *Mr357*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> You shouldn't use something abrasive like IC on the bare die unless you don't value your card or its warranty.
> 
> 
> 
> What should be used? CLP/CLU?

Only CLU, since it literally wipes off because it stays in liquid form. CLP can harden over time and put you in the same negative situation as IC.


----------



## VSG

Anything non-conductive like MX-4 should be fine. CLU should also be OK, but I still don't trust it to stay completely liquid over time.


----------



## selk22

I've used MX-4 and it works great! Nice temps also. CLU is the way to go though, I hear... but doesn't it dry up eventually?


----------



## Mr357

Quote:


> Originally Posted by *tsm106*
> 
> Only CLU since it literally wipes off because it stays in a liquid form. CLP can harden over time and put you into the same negative situation like IC.


Well I'm glad to hear it since CLU is exactly what I used.


----------



## the9quad

Quote:


> Originally Posted by *DavidH*
> 
> Which seem to be driver related. A few guys recently followed my advice over in the R9 290 / R9 290X owners thread and are now having no more issues. I always do a thorough cleaning of old drivers, which probably explains why I never got the black screen problem, and I'm highly overclocked with a modded BIOS to boot.


OK Dave, please give me advice. Here is what I have done:

Replaced power supply
Replaced power supply cables
Installed FRESH Windows 8 and Beta 8 DOESN'T GET ANY MORE THOROUGH THAN DOING THIS!
Installed FRESH Windows 8.1 and Beta 8 DOESN'T GET ANY MORE THOROUGH THAN DOING THIS!
Installed FRESH Windows 8 and Beta 9 DOESN'T GET ANY MORE THOROUGH THAN DOING THIS!
Installed FRESH Windows 8.1 and Beta 9 DOESN'T GET ANY MORE THOROUGH THAN DOING THIS!
Tried different BIOSes
Adjusted GPU voltage
Adjusted Vcore
Ran with stock clocks, ran with underclocks

Please school us ignorant peasants on what we're doing wrong and how it's all a driver issue.
Sorry for the tone, but maybe, just maybe, there are some cards with defects and you didn't get one?


----------



## selk22

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *the9quad*
> 
> Ok dave please give me advice. Here is what I have done:
> 
> Replaced power supply
> Replaced power supply cables
> Installed FRESH windows 8 and beta 8 DOESN'T GET ANY MORE THOROUGH THAN DOING THIS!
> Installed FRESH windows 8.1 and beta 8 DOESN'T GET ANY MORE THOROUGH THAN DOING THIS!
> Installed FRESH windows 8 and beta 9 DOESN'T GET ANY MORE THOROUGH THAN DOING THIS!
> installed FRESH windows 8.1 and beta 9 DOESN'T GET ANY MORE THOROUGH THAN DOING THIS!
> tried different bios's
> adjusted GPU voltage
> adjusted Vcore
> ran with stock clocks, ran with underclocks
> 
> Please school us ignorant peasants on what we are doing wrong, and how it's all a driver issue please.
> Sorry for the tone, but maybe just maybe there are some cards with defects and you didn't get one?






Just RMA your card, man! You seem to know what you're talking about here.


----------



## the9quad

Quote:


> Originally Posted by *selk22*
> 
> 
> Just RMA your card, man! You seem to know what you're talking about here.


I am, lol; just waiting on the shipping label email.

I didn't mean to have an attitude; it just burns me when people act like "well mine is fine, you must have installed drivers wrong...."


----------



## tsm106

Quote:


> Originally Posted by *the9quad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *selk22*
> 
> 
> Just RMA your card man! You seem to know what your talking about here
> 
> 
> 
> I am lol, just waiting on the shipping label email.
> 
> I didnt mean to have an attitude, it just burns me when people act like "well mine is fine, you must have installed drivers wrong...."

Fill out yer rig specs btw. GL on yer rma, they can be like a dentist visit. Even when its good, it hurts.


----------



## flopper

Quote:


> Originally Posted by *the9quad*
> 
> Ok dave please give me advice. Here is what I have done:
> 
> Replaced power supply
> Replaced power supply cables
> Installed FRESH windows 8 and beta 8 DOESN'T GET ANY MORE THOROUGH THAN DOING THIS!
> Installed FRESH windows 8.1 and beta 8 DOESN'T GET ANY MORE THOROUGH THAN DOING THIS!
> Installed FRESH windows 8 and beta 9 DOESN'T GET ANY MORE THOROUGH THAN DOING THIS!
> installed FRESH windows 8.1 and beta 9 DOESN'T GET ANY MORE THOROUGH THAN DOING THIS!
> tried different bios's
> adjusted GPU voltage
> adjusted Vcore
> ran with stock clocks, ran with underclocks
> 
> Please school us ignorant peasants on what we are doing wrong, and how it's all a driver issue please.
> Sorry for the tone, but maybe just maybe there are some cards with defects and you didn't get one?


Yeah, whatever it is, I don't like things that don't work out of the box.

Some had success with a 5% memory downclock, some with drivers, etc.


----------



## flopper

Quote:


> Originally Posted by *some1zwatching*
> 
> Hey - I am not getting black screens in BF4, but in both BF4 and Metro LL my card is throttling well below 95 C. I *do not* care about noise/temps/etc etc. None of that bothers me. As such, I simply turned the fan up to 53% to keep the card away from 95C.
> 
> I tried the video and it didn't work either. Is anyone else having their card throttle below 95 C? Sapphire Reference R9 290.
> 
> Wondering if I should RMA?
> 
> So far, BF3 and CS:S run perfectly, clock rate stays locked in at 946 MHz. BF4 and Metro LL are all over the place.


downclock memory by 5%?
some had that working.
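For reference, a quick sketch of what that suggestion works out to, assuming the reference 1250 MHz memory clock on these cards:

```python
# Reference memory clock on a stock R9 290/290X, in MHz
stock_mem_mhz = 1250

# The 5% underclock suggested above
target_mem_mhz = stock_mem_mhz * 0.95
print(target_mem_mhz)  # about 1187.5, so set roughly 1185-1190 in Overdrive/Afterburner
```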


----------



## rdr09

Arizonian, add me please . . .



air for now. thanks.


----------



## pkrexer

Quote:


> Originally Posted by *some1zwatching*
> 
> I am having the same issue.
> 
> One thing I noticed that made it *drastically* better in BF4 - go into your settings, and instead of having it render in Fullscreen, set it to render in borderless. After you boot the game, switch to fullscreen, then to borderless. I don't think it fixed it completely for me, but it made the clock settings *much* more stable and the stuttering was *almost* eliminated. If it works for you too, will you please let me know? Thanks!


It didn't really seem to make much of a difference for me, though running borderless definitely caused the core clock to fluctuate even more.. making the game almost unplayable.


----------



## joeywootuk

Quote:


> Originally Posted by *Deisun*
> 
> I just ordered the R9 290. Couldn't resist...
> 
> BF4 here I come!
> I'm hoping my Core i7 920 @ 4GHz, 6gb memory and the R9 290 will push me into ULTRA territory.
> 
> If my i7 920 can last me until SKYLAKE in 2015, I will be a very happy man.


How does this run for you on Ultra at 1080p? I'm thinking of getting the R9 290 and I have the same setup as you, but my i7 920 D0 is @ 4GHz. What seems to be your lowest FPS? Cheers


----------



## Mr357

I guess this post was overlooked
Quote:


> Originally Posted by *Mr357*
> 
> Can anyone link to me the PT3 BIOS? I can't find it on TPU and that guy who made the 290X overclocking thread deleted the post that had all of the links in it.


----------



## sugarhell

The original source is from kingpin forum-shamino lair.


----------



## Mr357

Quote:


> Originally Posted by *sugarhell*
> 
> The original source is from kingpin forum-shamino lair.


Found it, thanks!









Link


----------



## Arizonian

Quote:


> Originally Posted by *kitsunestarwind*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Officially a Radeon R9 290 owner as of today, add me to the list
> 
> Also I am getting the black screen at stock clocks
> been reading and testing all day
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I have an 80% ASIC on my XFX R9 290
> it is running the Elpida GDDR5 memory chips
> 
> It would black screen in Heaven and FF14 (benchmark utility) and in Metro LL
> 
> Does not even get close to overheating
> But my testing all afternoon has led to me getting it stable with no black screens at -5% memory clock in Overdrive, no more black screens at all


Congrats added -









Quote:


> Originally Posted by *rdr09*
> 
> Arizonian, add me please . . .
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> air for now. thanks.


Welcome aboard as well


----------



## PillarOfAutumn

Just got an email from FCPU. My nickel waterblock for the 290 has shipped. I ordered it on the 31st of October.


----------



## ljreyl

I have a weird problem going on with my overclocked 290x

System
AMD 8350 @ 4.7 GHz (1.368 Volts)
8 GB 1866 RAM (Stock Volts)
2600 NB (1.275 volts)
2600 HT (Stock Volts)
R9 290x @ 1150 Core, 1450 memory (1.4 Volts, Max temps 80C, Stock cooling and 80% fan)

So my problem is, I play a game for a while and everything runs fine, no artifacts or anything BUT after about 30 minutes, my sound card cuts in and out. No in game stuttering for video but sound just takes a dump every 2 seconds or so.

Anyone know why?

Sound card - Xonar Phoebus
Note - I disabled onboard sound and didn't install the R9 HDMI sound drivers.

Also, all overclocks were stable (CPU, NB, HT, Etc) when I had my 7970 overclocked at 1250 core and 1800 memory


----------



## some1zwatching

Quote:


> Originally Posted by *flopper*
> 
> downclock memory by 5%?
> some had that working.


what can i use to downclock? afterburner?


----------



## the9quad

Quote:


> Originally Posted by *ljreyl*
> 
> I have a weird problem going on with my overclocked 290x
> 
> System
> AMD 8350 @ 4.7 GHz (1.368 Volts)
> 8 GB 1866 RAM (Stock Volts)
> 2600 NB (1.275 volts)
> 2600 HT (Stock Volts)
> R9 290x @ 1150 Core, 1450 memory (1.4 Volts, Max temps 80C, Stock cooling and 80% fan)
> 
> So my problem is, I play a game for a while and everything runs fine, no artifacts or anything BUT after about 30 minutes, my sound card cuts in and out. No in game stuttering for video but sound just takes a dump every 2 seconds or so.
> 
> Anyone know why?
> 
> Sound card - Xonar Phoebus
> Note - I disabled onboard sound and didn't install the R9 HDMI sound drivers.
> 
> Also, all overclocks were stable (CPU, NB, HT, Etc) when I had my 7970 overclocked at 1250 core and 1800 memory


What game, BF4 only? If so it's a game issue, happens to a lot of people, especially on the dam map. If it's every game, I have no idea why


----------



## the9quad

Quote:


> Originally Posted by *tsm106*
> 
> Fill out yer rig specs btw. GL on yer rma, they can be like a dentist visit. Even when its good, it hurts.


yeah I will fill them out when I get done being mad about this video card.... Anyway the card is with UPS now for the RMA process, and I am back to a 5770....eeeek.


----------



## cyenz

Quote:


> Originally Posted by *ljreyl*
> 
> I have a weird problem going on with my overclocked 290x
> 
> System
> AMD 8350 @ 4.7 GHz (1.368 Volts)
> 8 GB 1866 RAM (Stock Volts)
> 2600 NB (1.275 volts)
> 2600 HT (Stock Volts)
> R9 290x @ 1150 Core, 1450 memory (1.4 Volts, Max temps 80C, Stock cooling and 80% fan)
> 
> So my problem is, I play a game for a while and everything runs fine, no artifacts or anything BUT after about 30 minutes, my sound card cuts in and out. No in game stuttering for video but sound just takes a dump every 2 seconds or so.
> 
> Anyone know why?
> 
> Sound card - Xonar Phoebus
> Note - I disabled onboard sound and didn't install the R9 HDMI sound drivers.
> 
> Also, all overclocks were stable (CPU, NB, HT, Etc) when I had my 7970 overclocked at 1250 core and 1800 memory


If it's BF4 only, it's a game-related bug; it happens to me and to a lot of people. Don't worry about that.


----------



## battleaxe

Quote:


> Originally Posted by *the9quad*
> 
> What game, BF4 only? If so it's a game issue, happens to a lot of people, especially on the dam map. If it's every game, I have no idea why


True, as certain maps and servers have crashing artifact issues. I was getting really weird colors on my 670 too. Lots of crashes as well. Then other servers, and maps work totally fine. So this is not necessarily the card.


----------



## rdr09

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats added -
> 
> 
> 
> 
> 
> 
> 
> 
> Welcome aboard as well


thanks, again. mine uses Elpida VRAM. Played BF4 briefly just to make sure it works in that game, and it does. *No black screen issues so far*. Currently using 13.11 v8.

Here is my 3DMark11 stock score where the 290 is running at X8 (2nd slot) waiting for the waterblock.

http://www.3dmark.com/3dm11/7472292

Pretty happy with how it handled BF4 maxed out at 1080p, smoothly. But then again, my 7950 did, too. ha!


----------



## bjozac

Would it be an idea to create a thread with only overclocking results? I for one am a noob at overclocking and like to see the results others get. But reading 5000 posts and sorting the results out between X and non-X... well, that's gonna be a pain









Just got my Sapphire 290 now and verified that it does 1125/1500 on the stock cooler and stock BIOS. Next step is flashing the BIOS and then waiting for my water cooling stuff to arrive!


----------



## qqan

Quote:


> Originally Posted by *qqan*
> 
> I found an issue when OCing using the Sabertooth Z77 performance option - screens go black and USB stops (?), but the system keeps running.
> This looks driver-related and Windows 8.1-specific. On stock clocks everything is rock solid.


While the "Optimized Performance" auto-OC feature of the mainboard causes black screens or freezes with r9 290x, all is rock solid with manual OC on system CPU and RAM - as long as I don't touch the BUS speed...


----------



## Raephen

I'm still waiting for my Accelero Xtreme III to arrive, but damn, those waterblocks do look sexy on this card.

I might start saving up a bit and see what's available next year...

Though, this presents a problem: if I were to use the thermal glue that comes with the cooler, I'd be stuck with those heatsinks.

The RAM heatsinks aren't that critical and could be left off (I'm not a heavy overclocker and decent airflow from the fans at 35 - 40% minimum should suffice), but the VRMs on the other hand require some serious cooling.

What kind of alternatives would there be? Is there thermal padding that kinda acts like double-sided tape?

Ah... choices, choices...


----------



## Sazz

Quote:


> Originally Posted by *some1zwatching*
> 
> Hey - I am not getting black screens in BF4, but in both BF4 and Metro LL my card is throttling well below 95 C. I *do not* care about noise/temps/etc etc. None of that bothers me. As such, I simply turned the fan up to 53% to keep the card away from 95C.
> 
> I tried the video and it didn't work either. Is anyone else having their card throttle below 95 C? Sapphire Reference R9 290.
> 
> Wondering if I should RMA?
> 
> So far, BF3 and CS:S run perfectly, clock rate stays locked in at 946 MHz. BF4 and Metro LL are all over the place.


This is what I did for my 290X when it was on stock cooler.

I put +10% on the power limit and set my max fan speed to 72% at 80C (using Afterburner)

And the temp under load flattens out at 90C with an ambient temp of 25C; if your ambient temp is higher than mine you may need to increase the fan speed by a few %.

As for BF3 and BF4, the card is not "throttling"; when it doesn't need all the "power" it has, it will downclock a bit. On other games I noticed that the card usually runs flat out at max speed. Maybe this will improve once Mantle is implemented in BF4.

Quote:


> Originally Posted by *DavidH*


I find that configuration for the PSU troubling; the two fans on your GPU will steal most of the air that would've gone through the PSU, so watch out for your PSU's temps.. might wanna put them in a traditional configuration, since you've got two powerful fans competing with the fan of your Seasonic PSU.

And for those who are still experiencing black screens, let's take notes on which game/application it happened in, what clocks, and what background programs were running when it happened, so that we can see the pattern quicker.


----------



## Slomo4shO

Quote:


> Originally Posted by *Bloitz*
> 
> *It still is?*


Considering that the US and Canada are on the same North American plate...


----------



## ljreyl

Quote:


> Originally Posted by *ljreyl*
> 
> I have a weird problem going on with my overclocked 290x
> 
> System
> AMD 8350 @ 4.7 GHz (1.368 Volts)
> 8 GB 1866 RAM (Stock Volts)
> 2600 NB (1.275 volts)
> 2600 HT (Stock Volts)
> R9 290x @ 1150 Core, 1450 memory (1.4 Volts, Max temps 80C, Stock cooling and 80% fan)
> 
> So my problem is, I play a game for a while and everything runs fine, no artifacts or anything BUT after about 30 minutes, my sound card cuts in and out. No in game stuttering for video but sound just takes a dump every 2 seconds or so.
> 
> Anyone know why?
> 
> Sound card - Xonar Phoebus
> Note - I disabled onboard sound and didn't install the R9 HDMI sound drivers.
> 
> Also, all overclocks were stable (CPU, NB, HT, Etc) when I had my 7970 overclocked at 1250 core and 1800 memory


Quote:


> Originally Posted by *the9quad*
> 
> What game, BF4 only? If so it's a game issue, happens to a lot of people, especially on the dam map. If it's every game, I have no idea why


It's all games. I've tested Crysis 3, BF4, Tomb Raider, the Heaven benchmark, etc.


----------



## DampMonkey

Quote:


> Originally Posted by *Sazz*
> 
> And for those who are still experiencing black screens, lets take notes on what game/application when it happened, what clocks, and what background programs are running when it happened so that we can see the pattern quicker.


For me it's only happened when using the PT3 or PT1 BIOSes, with clocks upwards of 1250 and voltages upwards of 1.35. It never happened on the Asus BIOS at max GPU Tweak voltage; I can run 1200-1250 all day with no issue. This was occurring with core overclocks, not memory - it would still happen if I left the memory alone


----------



## evensen007

I may have gotten lost in the hundreds of pages of posts, but has anyone found a single reason yet to purchase a 290X over a 290? From the looks of things, this is a different situation from the 7950/7970 days, when the 7970 had a noticeable edge when highly overclocked.

What's the consensus?


----------



## Raephen

OK. I found what I was looking for: Thermal tape.

But it doesn't look all that encouraging: on the bits I've found, a thermal conductivity ranging from 0.9 to 1.5 W/mK. Would those suffice?
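For a rough sanity check (all numbers here are assumptions for illustration: a 1 mm pad, a 10 x 10 mm contact patch, a few watts per VRM component), the temperature rise across the tape itself follows from simple 1-D conduction, R = t / (k * A):

```python
# Rough conduction estimate across a thermal tape layer
# (ignores spreading resistance and surface contact quality).

t = 0.001            # pad thickness in m (assumed: 1 mm)
k = 1.2              # conductivity in W/mK (middle of the 0.9-1.5 range above)
area = 0.01 * 0.01   # contact area in m^2 (assumed: 10 mm x 10 mm)

r_pad = t / (k * area)   # thermal resistance of the pad, in K/W
watts = 3                # assumed dissipation of one VRM component

print(r_pad * watts)     # roughly 25 K extra across the tape alone
```

So even at the good end of that range, a thick tape layer adds a meaningful temperature delta on a hot VRM; thinner tape or higher-k pads help a lot.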


----------



## DampMonkey

Quote:


> Originally Posted by *evensen007*
> 
> I may have gotten lost in the hundreds of pages of posts, but has anyone found a single reason yet to purchase a 290X over a 290? From the looks of things, this is a different situation from the 7950/7970 days, when the 7970 had a noticeable edge when highly overclocked.
> 
> What's the consensus?


The 7950 to 7970 is a good comparison for the 290/X. Neither card has had the opportunity to show its best overclocking potential, so I can't attest to whether the 290 clocks better or worse than the 290X, but in every other case the 290X will be anywhere from 1-8 fps faster, depending on the application, throttling, and resolution. The 290 is a very, very good deal - I would have bought two of them instead of a 290X


----------



## ljreyl

Quote:


> Originally Posted by *evensen007*
> 
> I may have gotten lost in the hundreds of pages of posts, but has anyone found a single reason yet to purchase a 290X over a 290? From the looks of things, this is a different situation from the 7950/7970 days, when the 7970 had a noticeable edge when highly overclocked.
> 
> What's the consensus?


I had a 1st generation 7970 overclocked to 1250 core and 1800 mem stable (1.3 volts, Arctic Hybrid liquid cooler)
my 7970 vs my new R9 is night and day in performance.
Heaven 4.0 Benchmark
7970 OC: 41 FPS, Ultra
Stock R9 290X: 53 FPS, Ultra

Crysis 3
OC 7970: 40 FPS, Ultra, FXAA
R9 290X: 55 FPS, Ultra

When I overclock the R9 to 1150 core and 1450 memory, I average 59 FPS in Heaven on Ultra (8xMSAA, extreme tessellation)


----------



## Stay Puft

Quote:


> Originally Posted by *Raephen*
> 
> I'm still waiting for my Accelero Xtreme III to arrive, but damn, those waterblocks do look sexy on this card.
> 
> I might start saving up a bit and see what's available next year...
> 
> Though, this presents a problem: if I were to use the thermal glue that comes with the cooler, I'd be stuck with those heatsinks.
> 
> The RAM heatsinks aren't that critical and could be left off (I'm not a heavy overclocker and decent airflow from the fans at 35 - 40% minimum should suffice), but the VRMs on the other hand require some serious cooling.
> 
> What kind of alternatives would there be? Is there thermal padding that kinda acts like double-sided tape?
> 
> Ah... choices, choices...


Do not use those garbage heat sinks that Arctic includes. Grab some of these

http://www.frozencpu.com/products/15573/vid-184/Akust_Copper_Memory_Chip_Heatsink_-_13mm_x_12mm_x_5mm_-_4_Pack_RS00-0602-AKS.html?tl=g40c16s224

And you should be very happy with the Arctic


----------



## youra6

Quote:


> Originally Posted by *RAFFY*
> 
> Is 73.1 good? My first card reads 74.2 and my second card reads 77.4


ASIC really doesn't mean a whole lot. VID probably is a better indicator. Mine are 74% and 80% respectively, and neither card can exceed 1220 @ 1.412V (max volts)


----------



## Raephen

Quote:


> Originally Posted by *Stay Puft*
> 
> Do not use those garbage heat sinks that Arctic includes. Grab some of these
> 
> http://www.frozencpu.com/products/15573/vid-184/Akust_Copper_Memory_Chip_Heatsink_-_13mm_x_12mm_x_5mm_-_4_Pack_RS00-0602-AKS.html?tl=g40c16s224
> 
> And you should be very happy with the Arctic


Thanks for your input. I have ordered similar copper heatsinks, but the problem is the semi-permanent attachment, plus the fact that I doubt that shape is suitable for the VRMs.

Please, feel free to tell me my fears are wrong.


----------



## some1zwatching

Quote:


> Originally Posted by *Sazz*
> 
> This is what I did for my 290X when it was on stock cooler.
> 
> I put +10% on board power limit, put my max fan speed at 72% as my max speed at 80C (using Afterburner)
> 
> And the temp under load flat out at 90C with an ambient temp of 25C, if your ambient temp is higher than mine you may need to increase fan speed by a few %.
> 
> As of BF3 and BF4 the card is not "throttling" but when it doesn't need all the "power" it has it will downclock a bit, but on other games I noticed that the card usually flat out runs at max speed. maybe this will improve once mantle is implemented on BF4


Sazz - thanks for the tip - I'll give it a try. One point to note, the reason I suggest the GPU is "throttling" in BF4 is because when I see the frequency varying quite a bit, the game is stuttering, and pretty badly. If I run the game in borderless or windowed mode rather than full screen, there is drastically less stuttering, and afterburner shows a much more constant gpu frequency.


----------



## eternal7trance

Can you keep the stock cooling plate and then put something on to replace the actual GPU cooler, like a Gelid? Or will the stock plate block it?

I want a 290, but I just don't feel like having to take the whole thing off just for it to work right


----------



## DeadlyDNA

Forgive me if this is already somewhere in this thread. I want to know which models of the new AMD cards support quad CrossFire, since they no longer have the CrossFire bridge. I haven't seen it mentioned anywhere except for the 290X.

can anyone provide insight with a link or reference if possible?


----------



## rdr09

ok, i did some quick oc'ing of the 290 (stock 947/1250), and this is without any voltage tweaks - oc'ed to 1115/1400; ASIC 79%

GPU score: 16050

http://www.3dmark.com/3dm11/7473081

In comparison . . .

Crossfire 7950/7970 stock

GPU: 15640

http://www.3dmark.com/3dm11/6233890


----------



## some1zwatching

In BF4, I tested quite a few settings/options, and if you turn off deferred AF (or AA, can't remember which), the game smoothed out quite a bit - no more choppiness or lag.


----------



## Forceman

Quote:


> Originally Posted by *eternal7trance*
> 
> Can you use the stock cooling plate and then put something on to replace the actual gpu cooler like a gelid? Or will the stock plate block it?
> 
> I want a 290 but I just don't feel like having to take the whole thing off just for it to work right


No you have to take the stock cooler off as a unit. The aftermarket coolers replace the entire thing. If you don't want to have to mod the card yourself, you can wait a few weeks for the non-reference models - best of both worlds, although waiting sucks.


----------



## hatlesschimp

Will one 290X be able to get the job done at 3840 x 1080 @ 60fps minimum?


----------



## stilllogicz

These black screens are really disturbing. The last thing I want to deal with is black screening left, right & center. Sigh. The price, performance and the VRAM are the major selling points for me with the 290X; I really want to buy 2-3 of them.


----------



## RSharpe

Please excuse me if this has been covered, but I haven't been able to read every single post in every single 290/290X thread.

Has anyone been able to test out the differences between a water-cooled 290 versus a 290X? It's clear there are limits to the capabilities of the stock cooler, and many reviews have stated that the 290X throttles back because of it. Has anyone compared water cooling results from both cards?

I'm only interested in water cooling GPUs, and was wondering if the extra cost of the 290x might be worth it. It's clear it's not worth it under stock cooling.


----------



## rdr09

Quote:


> Originally Posted by *stilllogicz*
> 
> These black screens are really disturbing. The last thing I want to deal with is blackscreening left right & center. Sigh. The price, performance and the vram is the major selling point for me with the 290x, I really want to buy 2-3 of them.


haven't had any (crosses fingers). i still have the 7970 in my intel rig with the power unplugged. all i did was uninstall the driver (13.11 Beta 8, the same one i used for the 7970), shut down, and install the 290 in the second slot. booted and installed that same driver again. i played BF4 for about half an hour and ran some benches. also watched some youtube videos, and no issues. no flickering, none of that stuff. so far my highest stable oc in 3DMark11 without touching the voltages (they are greyed out anyway) is 1115/1400.

can't wait for the waterblock 'cause i heard the fan while playing BF4. at idle or browsing, my pump is still louder.


----------



## Amhro

Found this, not sure if posted
asus gtx 780 @ 1300mhz vs msi r9 290x @ 1200mhz, both watercooled

780


290x


----------



## DeadlyDNA

I asked earlier about Crossfire support and crossfire bridges. I found the answer for me on AMD's website. Posting this in case anyone else needs it for reference

http://www.amd.com/us/products/desktop/graphics/r9/pages/amd-radeon-hd-r9-series.aspx#5


----------



## esqueue

Used tweezers and a heat gun set to 300°F to heat up and remove the stickers without damaging them.

They are in better condition than the camera makes it look. I don't know if I should email them and ask for permission like they said in their warranty. If the card breaks for any reason, I don't want them to blame it on a water cooling block that does wonders for the card.

I even have the card supported to eliminate any sag due to the ek waterblock.


----------



## rdr09

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I asked earlier about Crossfire support and crossfire bridges. I found the answer for me on AMD's website. Posting this in case anyone else needs it for reference
> 
> http://www.amd.com/us/products/desktop/graphics/r9/pages/amd-radeon-hd-r9-series.aspx#5


i am sorry. info about crossfire (sideport) is included in the op.


----------



## evensen007

Good god, I failed my resistance roll against upgrading. I just bought 2x XFX 290s on Amazon and 2 Nickel/Plexi waterblocks from Performance PCs (they're back in stock). I should have the cards tomorrow and the blocks on Wednesday.
Quote:


> Originally Posted by *esqueue*
> 
> 
> 
> Used tweezers and a heat gun set on 300°F to heat up and remove the stickers without damaging them.
> 
> They are in better condition than the camera makes it look. I don't know if I should email them and ask for permission like they said in their warranty. If the card breaks for any reason, I don't want them to blame it on a water cooling block that does wonders for the card.
> 
> I even have the card supported to eliminate any sag due to the ek waterblock.


*EDIT*

Wait, XFX voids the warranty if I use a waterblock!? I may cancel this order.


----------



## DeadlyDNA

Quote:


> Originally Posted by *rdr09*
> 
> i am sorry. info about crossfire (sideport) is included in the op.


I see the information in the OP about Sideport/CrossFire; however, I don't see anything about the number of GPUs supported on a per-model basis. The link I provided shows how many GPUs are supported per model number.

R9 290 supports 4 GPUs in CF
R9 290X supports 4 GPUs in CF
R9 280X supports 4 GPUs in CF
R9 270X supports 2 GPUs in CF

Again, sorry if that was in the OP, but I looked for it and I'm not seeing it


----------



## Arvalin

Sapphire R9 290 stock








http://www.techpowerup.com/gpuz/5kmqd/


----------



## rdr09

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I see the information in the OP about sideport/crossfire, However i don't see anything about number of GPU's supported per model basis. In the link i provided it shows how many GPU's are supported per model number.
> 
> R9-290 supports 4 GPU's in CF
> R9-290x Supports 4 GPU's in CF
> R9-280x Supports 4 GPU's in CF
> R9-270x Supports 2 GPU's in CF
> 
> Again sorry if that was in the OP but i looked for it and i'm not seeing it


yes, for the 290s, quad is possible. we also want to consider the system needed to run this many gpus. afaik, you need an X79 system with a highly oc'ed cpu to fully utilize this configuration. with regard to the psu, well, 2 powerful psus for sure. might cause a blackout in the neighborhood. lol.


----------



## skupples

Quote:


> Originally Posted by *rdr09*
> 
> yes, for the 290s, quad is possible. we also want to consider the system to run this many gpus. afaik, you need a X79 system to fully utilize this configuration with a highly oc'ed cpu. with regard to the psu, well, 2 powerful psus for sure. might cause a blackout in the neighborhood. lol.


you could also do it on a PLX-chipped LGA 1150 if you like, for much less, with almost no hit (and sometimes a gain) in gaming performance. It of course won't handle the high-end tasks as well as X79. & yes, I would HIGHLY recommend dual PSUs & an ADD2PSU for quad CrossFire Hawaii.


----------



## MojoW

I'm joining in, with my Sapphire R9 290 with Elpida memory.
It's running at a comfortable 1100/1300 on the stock uber BIOS with the stock cooler. (Custom fan profile)
I'm running Windows 8.1 with 13.11 B9.2, no black screens so far, but I always clean thoroughly after a driver uninstall. (Came from 6990 quad)

(Don't mind the pic my phone lens is busted)


----------



## iPDrop

Is anyone else's card purring?

From my two R9 290s I can literally hear something along the lines of a cat's purr


----------



## Mas

Quote:


> Originally Posted by *evensen007*
> 
> Ok... This has me a bit worried. Is this why people were ordering the Fuji thermal pads? What about the others who have gotten EK blocks; have you had this same thermal pad thickness issue for the VRM pads?
> 
> I was going to wait for the koolance, alpha, or swiftech waterblock, but those don't seem to be out yet.


Not sure if already posted in the last 10 pages, but koolance full cover block is out 19th in Aus:

http://www.pccasegear.com/index.php?main_page=product_info&products_id=25725


----------



## Gilgam3sh

I just fitted the Accelero Xtreme III. If I don't run the fans @ 100% I get this annoying coil whine, it's killing me. The temps are very good, about 30-35 at idle, VRM temps at 29-34, but the coil whine is too much; if I don't find any solution this cooler will be removed asap.... and it's not only me having this problem, everyone who has fitted it has it.


----------



## VSG

Quote:


> Originally Posted by *Mas*
> 
> Not sure if already posted in the last 10 pages, but koolance full cover block is out 19th in Aus:
> 
> http://www.pccasegear.com/index.php?main_page=product_info&products_id=25725


$150?? Is that the MSRP, or is this the store raising its own price?


----------



## Slomo4shO

Quote:


> Originally Posted by *Amhro*
> 
> Found this, not sure if posted
> asus gtx 780 @ 1300mhz vs msi r9 290x @ 1200mhz, both watercooled
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 780
> 
> 
> 290x


Do you happen to know where I can find higher resolution pictures of these benches?


----------



## Stay Puft

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I see the information in the OP about sideport/crossfire, However i don't see anything about number of GPU's supported per model basis. In the link i provided it shows how many GPU's are supported per model number.
> 
> R9-290 supports 4 GPU's in CF
> R9-290x Supports 4 GPU's in CF
> R9-280x Supports 4 GPU's in CF
> R9-270x Supports 2 GPU's in CF
> 
> Again sorry if that was in the OP but i looked for it and i'm not seeing it


Sapphire for some reason gave its 270X Toxic two CrossFire connectors and states it can do 3-way CrossFire


----------



## Slomo4shO

Quote:


> Originally Posted by *Gilgam3sh*
> 
> I just fitted the Accelero Xtreme III, if I don't run the fans @ 100% I get this annoying coil whine it's killing me, the temps are very good , about 30-35 at idle, VRM temps at 29-34, but the coil whine is too much, if I don't find any solution this cooler will be removed asap.... and it's not only me who having this problem, everyone who fitted it have that.
> 
> 
> Spoiler: Warning: Spoiler!


I see you are using a HAF XB







Do you happen to have the total length of the card with the Accelero Xtreme III installed?


----------



## Mr357

Has anyone determined the margin of error between software voltage readouts and the actual voltage (tested with a meter)? I would do it myself, but I can't exactly take the backplate off without removing the card.


----------



## Gilgam3sh

Quote:


> Originally Posted by *Slomo4shO*
> 
> I see you are using a HAF XB
> 
> 
> 
> 
> 
> 
> 
> Do you happen to have the total length of the card with the Accelero Xtreme III installed?


it adds about 5 cm, but damn this coil whine.. at 100% fan speed it disappears completely


----------



## Arizonian

This was in the original post:

AMD Radeon R9 290X To Feature New AMD CrossFireX Technology

Where are CrossFire connectors?

The missing CrossFire finger technology is called Sideport. This technology supports the CrossFire connection through PCI Express rather than separate connectors. What does this mean? Goodbye, CrossFire bridge connectors.

*AMD has added a hardware DMA engine (Direct Memory Access) in the AMD CrossFire compositing block. The video cards now have dual DMA engines. This new technology is only present in the R9 290X and R9 290. CrossFire has been simplified in the new series, and as such AMD was able to get frame pacing working.*

DMA engines allow for direct access between the GPUs over the PCI Express bus alone. AMD has said the video cards can saturate PCIe 3.0 x16 bus bandwidth: 16GB/s bi-directional. There is no performance penalty for eliminating the external bridge. Sideport can improve scaling and efficiency between the GPUs, especially as you scale up to triple or quad cards.

No performance penalties vs the external bridge.

So this is only on the new 290X and 290 chips, none of the rebrands. The only new feature implemented elsewhere is TrueAudio on the 270X.
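As a sanity check on the bandwidth figure quoted above (PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding), the per-direction number works out to roughly 15.75 GB/s, so "16GB/s bi-directional" in AMD's material is best read as an approximate per-direction figure:

```python
# Back-of-the-envelope PCIe 3.0 x16 bandwidth, the link Sideport rides on.
gt_per_s = 8.0            # transfers per second per lane, PCIe 3.0
lanes = 16
encoding = 128 / 130      # 128b/130b line-code efficiency

gb_per_s = gt_per_s * lanes * encoding / 8   # bits -> bytes
print(round(gb_per_s, 2))                    # 15.75 GB/s each direction
```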


----------



## Mas

Quote:


> Originally Posted by *geggeg*
> 
> $150?? Is that the msrp or is this the store raising its own price?


It's Australia


----------



## Jokah

Only just started looking into getting one of these.

What's the deal with the fan speed, then? Can it be manually set to 100% (or anything above 50-55%)?

I will probably be water cooling though so a more relevant question for me would be does a water block make a considerable or noticeable difference?

I realise these points will have already been discussed and documented here but there are over 4500 posts so hopefully someone who has been keeping up to date will be able to answer without much hassle.


----------



## Raxus

Quote:


> Originally Posted by *evensen007*
> 
> Good god, I failed my resistance roll against upgrade. I just bought 2x XFX 290's on Amazon and 2 Nickel/Plexi waterblocks from performance Pc's (they're back in stock). I should have the cards tomorrow and the blocks on Wednesday.
> *EDIT*
> 
> Wait, Xfx voids the warranty if I use a waterblock!? I may cancel this order.


As far as I know modifying the card for any manufacturer voids the warranty these days.


----------



## Gunderman456

Quote:


> Originally Posted by *Jokah*
> 
> Only just started looking into getting one of these.
> 
> What's the deal with the fan speed then. Can it be manually set to 100% (or anything above 50-55%)?
> 
> I will probably be water cooling though so a more relevant question for me would be does a water block make a considerable or noticeable difference?
> 
> I realise these points will have already been discussed and documented here but there are over 4500 posts so hopefully someone who has been keeping up to date will be able to answer without much hassle.


Yes, waterblocks reduce heat, allow more of an OC, and cause less throttling!


----------



## Taint3dBulge

Quote:


> Originally Posted by *Gilgam3sh*
> 
> I just fitted the Accelero Xtreme III, if I don't run the fans @ 100% I get this annoying coil whine it's killing me, the temps are very good , about 30-35 at idle, VRM temps at 29-34, but the coil whine is too much, if I don't find any solution this cooler will be removed asap.... and it's not only me who having this problem, everyone who fitted it have that.


It's the GPU, not the cooler. It seems a buttload of cards have coil whine. My card has it bad too. It was worse, but it does seem to have quieted down a little. I have run Heaven for 8 hours with fps in the 100 fps range to try and squeal it out, but it's still loud. Loud enough that I don't want to remove the stock cooler, so I don't have to hear the zzzzzzzzzzzzzzzzzzzzzzzzzzVVVVVVVVVVVVVVVZZZZZZZZzzzzzzzzzzvvvvvvvvvvvvvvvvvvvvvvvvvvvVZZVVVVVZZZZZVZVVZV sound. Maybe I'll record a video of it and put it on YouTube.


----------



## Arizonian

Quote:


> Originally Posted by *Arvalin*
> 
> Sapphire R9 290 stock
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/5kmqd/


Congrats - added









Quote:


> Originally Posted by *MojoW*
> 
> I'm joining in, with my Sapphire R9 290 with elpida memory.
> It's running on a comfortable 1100/1300 on the stock uber bios with the stock cooler.(Custom fan profile)
> I'm running Windows 8.1 with 13.11 B9.2 no black screens so far but i always clean thoroughly after driver uninstall.(Came from 6990 quad)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> (Don't mind the pic my phone lens is busted)


Double congrats - added as well


----------



## Jokah

Quote:


> Originally Posted by *Gunderman456*
> 
> Yes, waterblocks reduces heat, allows more of an oc and less throttling!


Thanks

Anyone got any links to reviews/benchmarks that have overclocked with waterblocks? Struggling to find any.


----------



## rdr09

Quote:


> Originally Posted by *skupples*
> 
> you could also do it on a PLX chipped lga 1150 if you like, for much less, with almost no hit (and some times a gain) in gaming performance. It of course won't handle the high end tasks as well as x79. & yes I would HIGHLY recommend dual PSU & ADD2PSU for quad SLI Hawaii.


+rep.

Quote:


> Originally Posted by *Jokah*
> 
> Only just started looking into getting one of these.
> 
> What's the deal with the fan speed then. Can it be manually set to 100% (or anything above 50-55%)?
> 
> I will probably be water cooling though so a more relevant question for me would be does a water block make a considerable or noticeable difference?
> 
> I realise these points will have already been discussed and documented here but there are over 4500 posts so hopefully someone who has been keeping up to date will be able to answer without much hassle.


yes, you can set the fan manually. i used afterburner. i did not even check but i think my card is set at quiet mode 'cause i cannot hear its fan just browsing. it is set at 20% in auto mode and my idle temp is around 46C. i played BF4 with it the fan at auto and heard it roar. the only time i set it in manual @ 60% was when i oc'ed it and ran 3DMark11. i was hoping to get an ASIC of between 70 - 80% and I did. 79% without coil whine.


----------



## Gilgam3sh

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Its the GPU not the cooler.. It seems a buttload of cards have coilwhine.. My card has it bad too.. It was worse, but it does seem to have quieted down alittle.. I have ran heaven for 8 hours with fps in 100fps range to try and squeel it out.. But its still loud... Loud enough that i dont want to remove the stock cooler so i dont want to hear the zzzzzzzzzzzzzzzzzzzzzzzzzzVVVVVVVVVVVVVVVZZZZZZZZzzzzzzzzzzvvvvvvvvvvvvvvvvvvvvvvvvvvvVZZVVVVVZZZZZVZVVZV sound. Maybe ill video and record it and youtube it..


Hmm, I don't get it. With the stock cooler there's no coil whine at all, so how could it be the GPU when I only get it with the Xtreme III? Only 100% fan speed helps.


----------



## Raxus

Having a bit of an issue seeing the temps on my 290x memory.

Tried HWMonitor; it doesn't list memory temps.


----------



## Falkentyne

So what are the VIDs on these cards? Are 76% and higher ASIC at 1.18 or 1.20? (GPU-Z can't read the VID, because even at 1% load at the HIGHEST voltage reading, there is STILL vdroop between VID and current voltage.) Someone earlier said they had a 1.25v default VID...


----------



## diggiddi

How does TrueAudio affect you if you run sound out through an AVR?


----------



## josephimports

Finally!













ASIC 66.1%, Elpida RAM

More results to come.


----------



## jrcbandit

So for the black screen issues, are there any games more prone than others to test? I sold my copy of BF4 to a friend so I can't test that one...

Quote:


> Originally Posted by *Jokah*
> 
> Thanks
> 
> Anyone got any links to reviews/benchmarks that have overclocked with waterblocks? Struggling to find any.


I have overclocked my water cooled card and achieved 1230 core and 1500/6000 memory at 1.41V Asus bios (so 1.299V with Vdroop). Here is a Firestrike run:
6315 graphics score for extreme. I originally had 5633 graphics score when I was running it on air with a slight overclock of 1115 core and no memory overclock.

Non extreme graphics of 13,261 with core at 1230 and memory only at 1400.

I did a few other benches but I dont have access to them right now.

My card has Hynix memory and 73.5% ASIC. It idles around 36-38 C and on load goes up to 51 C, although if winter ever hits here I'm sure it will be a few degrees lower.


----------



## utnorris

Quote:


> Originally Posted by *iPDrop*
> 
> Does anyone have benchmarks of BF4 with two R9 290's in crossfire?


I just got my two 290's delivered, so I am hoping to have them up and running tonight, and to bench a little tonight and tomorrow since I am off.


----------



## Arizonian

Quote:


> Originally Posted by *josephimports*
> 
> Finally!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> ASIC 66.1%, Elpdia Ram
> 
> More results to come.


Congrats - added


----------



## Kipsofthemud

So, it's been a business day and I haven't gotten my money refunded or heard anything from the store about a canceled order, so I'm guessing I'll be the proud owner of a 290X for a measly 380 euros







They changed the price to 490 six hours after I ordered though lol











Time to order up dat waterblock, it's a shame EK decided to put the different stuff all over it - I like a more unified look better...gimme full frosty plexi CSQ please.

I'm wondering though, with all the problems people have been having here, how come not one of the bigger reviewers complained about this? Is it because of shilling? Being paid off or whatever?


----------



## Jokah

Quote:


> Originally Posted by *rdr09*
> 
> +rep.
> yes, you can set the fan manually. i used afterburner. i did not even check but i think my card is set at quiet mode 'cause i cannot hear its fan just browsing. it is set at 20% in auto mode and my idle temp is around 46C. i played BF4 with it the fan at auto and heard it roar. the only time i set it in manual @ 60% was when i oc'ed it and ran 3DMark11. i was hoping to get an ASIC of between 70 - 80% and I did. 79% without coil whine.


Quote:


> Originally Posted by *jrcbandit*
> 
> So for the black screen issues, are there any games more prone than others to test? I sold my copy of BF4 to a friend so I can't test that one...
> I have overclocked my water cooled card and achieved 1230 core and 1500/6000 memory at 1.41V Asus bios (so 1.299V with Vdroop). Here is a Firestrike run:
> 6315 graphics score for extreme. I originally had 5633 graphics score when I was running it on air with a slight overclock of 1115 core and no memory overclock.
> 
> Non extreme graphics of 13,261 with core at 1230 and memory only at 1400.
> 
> I did a few other benches but I dont have access to them right now.
> 
> My card has Hynix memory and 73.5% ASIC. It idles around 36-38 C and on load goes up to 51 C, although if winter ever hits here I'm sure it will be a few degrees lower.


Thanks guys.


----------



## esqueue

Quote:


> Originally Posted by *evensen007*
> 
> Good god, I failed my resistance roll against upgrade. I just bought 2x XFX 290's on Amazon and 2 Nickel/Plexi waterblocks from performance Pc's (they're back in stock). I should have the cards tomorrow and the blocks on Wednesday.
> *EDIT*
> 
> Wait, Xfx voids the warranty if I use a waterblock!? I may cancel this order.


They actually said to call them to get the OK to install an aftermarket cooler. I'll post the link and an excerpt from it if I can find it.
http://xfxforce.com/en-us/Help/Support/WarrantyInformation.aspx

*** XFX has carefully selected the optimal thermal or fansink component for your graphics card model. We do not encourage the removal of components due to damage that may result in the process. XFX understands that some enthusiasts may choose to replace the original component with their own cooling solution. To support the gaming community, we recommend that you contact XFX prior to any modifications so that we can update your profile and product registration to avoid potential issues with warranty support. In addition, XFX support will be able to walk through the installation with you or provide feedback and pointers on available options for your specific product. You may even consider shipping your components to XFX and allow the technicians at XFX to perform the modification for you (shipping charges to XFX apply).*


----------



## Stay Puft

Quote:


> Originally Posted by *evensen007*
> 
> Good god, I failed my resistance roll against upgrade. I just bought 2x XFX 290's on Amazon and 2 Nickel/Plexi waterblocks from performance Pc's (they're back in stock). I should have the cards tomorrow and the blocks on Wednesday.
> *EDIT*
> 
> Wait, Xfx voids the warranty if I use a waterblock!? I may cancel this order.


What did you expect? Them to cover it if you pushed it too hard?


----------



## Kriant

Add me to the "happy owners" list


----------



## skupples

Quote:


> Originally Posted by *Slomo4shO*
> 
> Do you happen to know where I can find higher resolution pictures of these benches?


Right click on them & select open in new window.
Quote:


> Originally Posted by *evensen007*
> 
> Good god, I failed my resistance roll against upgrade. I just bought 2x XFX 290's on Amazon and 2 Nickel/Plexi waterblocks from performance Pc's (they're back in stock). I should have the cards tomorrow and the blocks on Wednesday.
> *EDIT*
> 
> Wait, Xfx voids the warranty if I use a waterblock!? I may cancel this order.


It's too bad EVGA is Nv only.


----------



## evensen007

Quote:


> Originally Posted by *Stay Puft*
> 
> What did you expect? Them to cover it if you pushed it too hard?


Not sure what's funny. I could cook the card just as easily (or more easily) on the stock/crap cooler as I could on water.


----------



## Slomo4shO

Quote:


> Originally Posted by *skupples*
> 
> Right click on them & select open in new window.


Not sure why right click and "view image" doesn't have the same results.


----------



## Arizonian

Quote:


> Originally Posted by *Kriant*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Add me to the "happy owners" list


Done deal Congrats


----------



## esqueue

There is no way I would install any cooler that involved anything permanent. The heatsink adhesive is a no-no for me. The EK block is also keeping my VRM temps idling at 34°C, and I haven't seen it hit 50°C even when running Heaven. GPU temps are under 45°C under full load on a warm day. I haven't changed the thermal pads and am using an ancient watercooling setup: a recently cleaned Bonneville heatercore and a shroud made out of a cardboard box.

This was the setup I used on my Xbox, which was eventually hacked. It's still in working condition to this day, but non-op due to the missing cooling.


----------



## Raxus

Anyone know how I can monitor the memory temps on a Sapphire R9 290X?


----------



## Snyderman34

Appears FrozenCPU has a few blocks in stock (the acrylic ones)

http://www.frozencpu.com/products/21665/ex-blc-1564/EK_Radeon_R9-290X_VGA_Liquid_Cooling_Block_-_Acrylic_EK-FC_R9-290X.html?tl=g30c309s2073


----------



## evensen007

Quote:


> Originally Posted by *esqueue*
> 
> They actually said to call them to get the ok to install an aftermarket cooler. I'll post the link and an insert from it if I can find it.
> http://xfxforce.com/en-us/Help/Support/WarrantyInformation.aspx
> 
> _** XFX has carefully selected the optimal thermal or fansink component for your graphics card model. We do not encourage the removal of components due to damage that may result in the process. XFX understands that some enthusiasts may choose to replace the original component with their own cooling solution. *To support the gaming community, we recommend that you contact XFX prior to any modifications so that we can update your profile and product registration to avoid potential issues with warranty support.* In addition, XFX support will be able to walk through the installation with you or provide feedback and pointers on available options for your specific product. You may even consider shipping your components to XFX and allow the technicians at XFX to perform the modification for you (shipping charges to XFX apply)._




Quote:


> Originally Posted by *Snyderman34*
> 
> Appears FrozenCPU has a few blocks in stock (the acrylic ones)
> 
> http://www.frozencpu.com/products/21665/ex-blc-1564/EK_Radeon_R9-290X_VGA_Liquid_Cooling_Block_-_Acrylic_EK-FC_R9-290X.html?tl=g30c309s2073


Performance Pc's had all of them in stock when I ordered this afternoon.


----------



## MojoW

How do I reflash a bricked 290 BIOS?
Do I need another card?
I tried to flash the Asus BIOS over my uber BIOS (for voltage control), but now I just have a black screen.
So what should I do?
I used atiflash, btw.


----------



## Jpmboy

Quote:


> Originally Posted by *MojoW*
> 
> How do i reflash a bricked 290 bios ?
> Do i need another card ?
> Because i tried to flash the asus bios to my uber bios(for voltage control) but now i just have a black screen.
> So what should i do?
> I used atiflash btw.


I assume you are now using the quiet BIOS? You don't need another card if you use the iGPU (built-in graphics) on your 4770K. Switch to the on-board GPU. It's always best to flash from DOS, booting from a USB memstick.

follow the instructions here:

http://forums.overclockers.co.uk/showthread.php?t=18552408


----------



## MojoW

Quote:


> Originally Posted by *Jpmboy*
> 
> I assume you are now using the quiet bios? You don't need another card if you use the iGPU (built in graphics) on you r 4770k. Switch to the on-board gpu. It's always best to flash from dos, booting from a USB memstick.
> 
> follow the instructions here:
> 
> http://forums.overclockers.co.uk/showthread.php?t=18552408


If I connect my iGPU and do a CMOS reset, it doesn't give a signal; that's why I'm stuck.
And that thread is what I followed the first time.


----------



## Kelwing

Please add me to list.

Stock cooler coming off and putting on an Arctic Accelero III

http://s222.photobucket.com/user/mistwalker7/media/100_1569_1_zpsc1b62ffe.jpg.html


----------



## esqueue

Quote:


> Originally Posted by *evensen007*


LOL, you think so too? That's why I then posted the picture below. I never gave them a call. They will probably blame any problem on the water block.


----------



## brazilianloser

Is the Gigabyte 290 voltage unlocked, or will it require an Asus BIOS flash?


----------



## eternal7trance

Quote:


> Originally Posted by *MojoW*
> 
> How do i reflash a bricked 290 bios ?
> Do i need another card ?
> Because i tried to flash the asus bios to my uber bios(for voltage control) but now i just have a black screen.
> So what should i do?
> I used atiflash btw.


All you have to do is boot up on the working BIOS, switch the BIOS while the computer is on, and flash it. Then reboot and it's good. That's what I did on my 280X when I screwed up one of the BIOSes.


----------



## Gunderman456

Quote:


> Originally Posted by *brazilianloser*
> 
> Gigabyte 290 voltage unlocked or will it require a Asus bios flash?


All reference models are voltage unlocked.


----------



## Kriant

Quote:


> Originally Posted by *Arizonian*
> 
> Done deal Congrats


Thanks!

And FrozenCpu shipped the blocks today.

*soon babies, soon* (cuddles the cards)


----------



## Jpmboy

Quote:


> Originally Posted by *MojoW*
> 
> If i connect my IGPU and do a cmos reset it doesn't give a signal, so that's why i'm stuck.
> And that thread is what i followed the first time.


Try removing the card, clear CMOS, POST to BIOS and set your boot drives etc., no overclocking yet; that should restore the iGPU.
Boot to Windows and shut down.
Reinstall the card, but be sure to keep the graphics on the iGPU; don't connect the card to a monitor.
In atiflash, use the "--list" command to ID the card index number (should be 0),
then flash using the "-i0" modifier, as in "-i0 -p -f 0 [romname.rom]".

Should work.
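Put together, that sequence from the DOS prompt looks roughly like this. This is a sketch only: the index 0 and the ROM filename are placeholders, and you should confirm the index against the actual "--list" output before programming anything.

```
REM sketch: enumerate adapters first and note the card's index (assumed 0 here)
atiflash --list
REM then force-program that index with your saved ROM (filename is a placeholder)
atiflash -i0 -p -f 0 romname.rom
```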

(I've flashed this R9 290X so many times it's embarrassing







)


----------



## MojoW

Quote:


> Originally Posted by *eternal7trance*
> 
> All you have to do is boot up on the working bios, switch the bios while the computer is on and flash it. Then reboot and it's good. That's what I did on my 280x when I screwed up one of the bioses.


Tried it all! I'm on my iGPU now, and even now I get the error 0fl01, adapter not found.
But my mobo recognizes the GPU?


----------



## Jpmboy

Quote:


> Originally Posted by *Gunderman456*
> 
> All reference models are voltage unlocked.


Not true. The Sapphire cards are not volt unlocked. Flashing to the Asus BIOS does unlock it.


----------



## VSG

Every card is voltage unlocked, just that the software to recognize it is not there yet. Some programs such as Asus's GPU Tweak already have it, but that does not mean Sapphire's Trixx or MSI Afterburner will not have it soon.


----------



## lethal343

2x GIGABYTE r9 290x CF:

A very noob question... does anybody know if GIGABYTE's warranty would be void if I up the vcore voltage on my cards? Or any other overclock, for that matter, on the stock BIOS and stock reference cooler?


----------



## Jpmboy

So, I've been working at this R9 290X for a few days... we are in grave need of tools to really extract the power of this GPU. One thing that propelled the GK110 forward was the LLC tweak. I've been trying to probe/page the VRMs (IR3567B) to relax the LLC, which is very aggressive on these cards, with no luck, and the International Rectifier site is not cooperating with the [restricted access] instruction-set documents.

The VRM offset tweak, msiafterburner /wi4,30,8d,10 (or any dec-hex equivalent), only seems to add 6.25mV to the offset per hex increment, and still suffers from a large vdroop.
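As a sanity check on that step size, the arithmetic is simple enough to script. The 0x85 and 0x8d offsets below are made-up example values, and 6.25 mV/step is only the behaviour observed above, not a documented spec:

```shell
# apparent VRM offset gained by raising the offset byte from 0x85 to 0x8d (example values)
steps=$((0x8d - 0x85))   # 8 hex increments
awk -v n="$steps" 'BEGIN { printf "%.2f mV\n", n * 6.25 }'   # prints "50.00 mV"
```

So an eight-step bump in the offset byte would correspond to only about 50 mV, which is why the large vdroop swamps it.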

Does anyone know where we can get the instruction set? Maybe "unwinder" ??

Very frustrating. I know this card can kick some butt. (I've got this one to position 5 in 3DMark11, top 10 in Firestrike, and a poor score, but the highest R9 290X, in Valley.) If we unlock the LLC command, we'll have a much better way to control the card than the PT3 BIOS, which I find flawed (IMO).

HELP!!


----------



## MojoW

Quote:


> Originally Posted by *Jpmboy*
> 
> try removing the card, clr cmos, post to bios and set you boot drives etc. no overclocking yet. should restore iGPU
> boot to windows and shut down.
> reinstall the card but be sure to keep the graphics on the igpu. dont connect the card to a monitor.
> in atiflash, use the "--list" command to ID the card index number. (should be 0)
> then flash using the "-i0" modifier in as "-i0 -p -f 0 [romname.rom]
> 
> should work
> 
> (I've flashed this r290x so many times it embarrassing
> 
> 
> 
> 
> 
> 
> 
> )


I flashed once







just my luck. (There goes my uber BIOS.)
All of the above gave me error 0fl01, adapter not found, or invalid command.


----------



## Jpmboy

Quote:


> Originally Posted by *MojoW*
> 
> I flashed once
> 
> 
> 
> 
> 
> 
> 
> my luck.
> All of the above gave me error 0fl01 adapter not found or invalid command.


Most likely an "invalid command". It's definitely fixable. You do have the other on-board BIOS you can switch to, right? But search the atiflash command set; you need to flash with override to reset the EEPROMs.
As long as you have the iGPU working, it can be done. (If you were on X79 it would be a very different story!!)

if you enter the command:

atiflash --list

What does it return? Does atiflash "see" the card at all? If yes, that's the PCIe bus address of the card. Make sure you flash to that index number only (atiflash -i[#] -p -f ...etc). You seem to know what you're doing; go slow, don't panic... yet. It's not easy to bork a card to an unrecoverable state.

edit:
oops - hey - i gotta sign off 'till tomorrow. sorry. PM me if you're still stuck.


----------



## beejay

Okay, there was confusion about which VRM number goes to which VRM cluster. To add to the confusion, HWiNFO displays two sets of readings for one card, with the VRM figures swapped between the two. I'm going to use GPU-Z's thermal sensor as the basis. Last time, I thought VRM1 (which reaches 120°C) was the small cluster on the upper-left side. Last night I tried burning the GPU and used a simple thermometer to see which VRM clusters are really heating up. So now I'm almost sure that VRM1 is the large cluster of VRMs on the right side.

I'm going to try and find a large block of heatsink to put on those VRMs (like the ones on DCU II coolers, maybe I'll take the sink from my old 570 DCU 2) and cool those VRMs.

Anyone tried fitting a VRM cooler, like the Thermalright R5, on the R9 290X?

And updates on my clock.

Core: 1100
Mem: 6000
Voltage: Stock

These are stable now. The GPU temp is really low; the Arctic Accelero Hybrid really works wonders on GPU temps. VRM1 is still toasty though, reaching 90°C while gaming in BF4, and 100°C+ when burning.

Also, about those black screens: I was experiencing them before. I had the same clocks but raised the voltage to 1.3v, and when benching I got artifacts, then black screens (failed driver) and whatnot. I tried lowering the mem clock to 5500 and still got artifacts and black screens, so I just set the mem clock to default. On further thought, I figured it might be the VRMs causing the card to fail at that voltage, so I took a gamble and set my voltage to stock, bumped the mem clock to 6000 and the core to 1100, set the power target to 150%, and everything went stable.

I'm using Accelero Hybrid as my cooler by the way. (Why am I still on stock? OCD sorry)


----------



## MojoW

Quote:


> Originally Posted by *Jpmboy*
> 
> it's def fixable. you do have the other on-board bios you can switch to... but, search the atiflash command set, you need to flash override to reset the eeproms. As long as you have the igpu working, it can be done.
> 
> if you enter the command:
> 
> atiflash --list
> 
> what does it return? Does atiflash "see" the card at all" If yes, that's the pcie bus address of the card. make sure you flash to that index number only. you seem to know what you're doing - go slow, don't panic - yet. it's not easy to bork a card to an unrecoverable state.
> 
> edit:
> oops - hey - i gotta sign off 'till tomorrow. sorry. PM me if you're still stuck.


No problem i have time.
The "--list" as well as the "-i" command don't show anything, just "invalid command", but my mobo does recognize the card as an ATI subvendor in the BIOS.

This is what GPU-Z shows:


----------



## beejay

Quote:


> Originally Posted by *MojoW*
> 
> No problem i have time.
> The ''--list'' aswell as the ''-i'' command don't show anything just a ''invalid command'' but my mobo does recognize the card as a ati subvendor in the bios.


There is this problem with atiflash where it fails when your DOS boot has extended memory (himem.sys?), other drivers loaded (maybe CD drivers?), etc. Try using a DOS boot that doesn't load anything (delete or rename the startup files, config.sys and autoexec.bat, on your boot device; hopefully it's USB), then run atiflash again.

I'm thinking atiflash is finicky with memory addresses and might be loading itself into the High Memory Area when running. Himem.sys or any other extended-memory module lets DOS use the high memory area so DOS can access the extended memory beyond the first 1 MB.


----------



## DavidH

Quote:


> Originally Posted by *Gilgam3sh*
> 
> I just fitted the Accelero Xtreme III, if I don't run the fans @ 100% I get this annoying coil whine it's killing me, the temps are very good , about 30-35 at idle, VRM temps at 29-34, but the coil whine is too much, if I don't find any solution this cooler will be removed asap.... and it's not only me who having this problem, everyone who fitted it have that.


The cooler has nothing to do with making your card have coil whine, it was already there before you installed the aftermarket cooler but you simply could not hear it over the noisy stock cooler.


----------



## DavidH

Quote:


> Originally Posted by *Gilgam3sh*
> 
> it adds about 5cm, but damn this coil whine.. at 100% fan speed it disappear completely


Then just run it at 100 percent; it's still quiet.


----------



## MojoW

Quote:


> Originally Posted by *beejay*
> 
> There is this problem with atiflash that it fails when you use a DOS boot that has extended memory (himem.sys?), other drivers loaded (maybe cd drivers?) etc. Try using a DOS that doesn't load any thing (delete or rename startup files, config.sys and autoexec.bat on your boot device, hopefully it's usb) then run atiflash again.
> 
> I'm thinking Atiflash is finicky with memory addresses and might be loading itself up to the High Memory Area when running. Himem.sys or any other extended memory modules loads DOS to the high memory area so DOS can access the extended memory beyond 640k 1MB.


I reformatted the USB and did it all over again, but still the same problem (different USB as well).
And seeing the GPU-Z screen,
I think I bricked it badly, and I flashed the same way as most here did. (It didn't give errors the first time I flashed, but after a restart there was no signal.)
As I said, I just get adapter not found or invalid command.


----------



## beejay

Quote:


> Originally Posted by *MojoW*
> 
> I reformatted the usb and did it all over again but still the same problem.(Different usb aswell)
> And seeing the gpu-z screen
> I think i bricked it bad, and i flashed the same way as most here did.(It didn't give errors the first time i flashed so restarted and no signal)
> As i said i just get adaptor not found or invalid command.


How are you formatting the USB? How are you preparing DOS on it (HPUSB?)? After prepping DOS, did you delete (or rename) config.sys and autoexec.bat? Are you using MS-DOS, FreeDOS, or some other type of DOS? Do you have the latest atiflash.exe?

So in a nutshell:

1. Create a bootable USB DOS
2. Delete config.sys and autoexec.bat if any
3. Copy atiflash.exe to the flash drive; put it in the root directory
4. Boot DOS
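Concretely, the middle steps from a Windows command prompt might look like this before rebooting. This is a sketch only: the drive letter E: is an assumption, so adjust it to wherever the stick actually mounts.

```
REM sketch: E: stands in for the USB stick's drive letter
attrib -h -s E:\config.sys E:\autoexec.bat
del E:\config.sys E:\autoexec.bat
copy atiflash.exe E:\
```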

I understand you are seeing an "Invalid Command", not a device-not-found error.

Once you get to the actual flashing, you need the -f option

atiflash -f -p 0 biosname.rom

Aside from this, if it still does not flash, we may need more troubleshooting.

Come to think of it, you said "invalid command"; I think you forgot to copy atiflash.exe onto the USB drive? It needs to be the atiflash.exe file, not the compressed zip file. Disregard; reading your post, you already know this.


----------



## DavidH

Quote:


> Originally Posted by *Sazz*
> 
> This is what I did for my 290X when it was on stock cooler.
> 
> I put +10% on board power limit, put my max fan speed at 72% as my max speed at 80C (using Afterburner)
> 
> And the temp under load flat out at 90C with an ambient temp of 25C, if your ambient temp is higher than mine you may need to increase fan speed by a few %.
> 
> As of BF3 and BF4 the card is not "throttling" but when it doesn't need all the "power" it has it will downclock a bit, but on other games I noticed that the card usually flat out runs at max speed. maybe this will improve once mantle is implemented on BF4.
> I find that configuration for the PSU troubling, the two fans on your GPU will steal most of the air that would've gone thru the PSU, so watch out for your PSU's temps.. might wanna go put them on a traditional configuration since you got two powerful fans competing with the fan from your seasonic PSU.
> 
> And for those who are still experiencing black screens, lets take notes on what game/application when it happened, what clocks, and what background programs are running when it happened so that we can see the pattern quicker.


Thank you, I checked and the PSU is actually running very cool. There is a sound card positioned underneath one of the GPU fans so its keeping the fan from sucking up too much air from the PSU plus in that picture it looks a lot closer than it really is.


----------



## Snyderman34

Would anyone happen to have the BIOS for a Sapphire R9 290 (both Quiet and Uber)? I seem to have flashed both switches while trying to flash the ASUS BIOS, so yeah.


----------



## MojoW

Quote:


> Originally Posted by *beejay*
> 
> How are you formatting the USB? How are you loading DOS in it (HPUsb?)? After loading DOS, did you delete (or rename) config.sys and autoexec.bat? Are you using MS-DOS, FreeDOS or any type of DOS?
> 
> I understand you are seeing an "Invalid Command" not a device not found error.
> 
> Once you get to the actual flashing, you need the -f option
> 
> atiflash -f -p 0 biosname.rom


Yes, I use HPUSB with MS-DOS.
And I did use the -f option to force, even the first time I flashed. And how do I delete (or rename) those files after loading DOS?

Thnx


----------



## DavidH

Quote:


> Originally Posted by *Snyderman34*
> 
> Would anyone happen to have the BIOS for a Sapphire R9 290 (both Quiet and Uber)? I seem to have flashed both switches trying to flash the ASUS BIOS, so yeah.


Check TechPowerUp.com. They have a database of GPU BIOSes.


----------



## beejay

Quote:


> Originally Posted by *MojoW*
> 
> Yes i use HPUsb with MS-DOS.
> And i did used the -f option to force even the first time i flashed, and how do i delete (or rename) those files after loading DOS?
> 
> Thnx


After HPusb finishes, just go to your usb drive and delete config.sys and autoexec.bat (they might be hidden). Copy atiflash.exe inside the drive, then reboot with your usb drive.


----------



## Snyderman34

Quote:


> Originally Posted by *DavidH*
> 
> Check Techpowerup.com They have a database with GPU bios's.


Looks like they have the 290X version. That should work, I assume.


----------



## DraXxus1549

Count me in!





Hoping to put it underwater around Christmas time.


----------



## bond32

Ordered my Sapphire from Amazon. Caved and got the 290X; I was actually planning on the 290, but Amazon had this Sapphire, a brand I trust, in stock. Now just waiting on more GPU blocks...


----------



## iPDrop

What do y'all think is the highest safe voltage for the 290?


----------



## subyman

Got my R9 290. This thing is loud. The reviews were no joke. I'm glad my block gets here tomorrow. I have Elpida memory.

I noticed my fan speed went to 50% while in Heaven. I guess that is what Ryan from Anandtech was talking about when he said AMD updated their drivers to use absolute fan speeds instead of percentages. No more 47% hard cap.


----------



## Arizonian

Quote:


> Originally Posted by *Kelwing*
> 
> Please add me to list.
> 
> Stock cooler coming off and putting on Artic Accelero III
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s222.photobucket.com/user/mistwalker7/media/100_1569_1_zpsc1b62ffe.jpg.html


Congrats - added









Quote:


> Originally Posted by *DraXxus1549*
> 
> Count me in!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hoping to put it underwater around Christmas time.


Congrats - ditto


----------



## eternal7trance

Could someone take a picture of the stock vrm heatsink without the gpu cooler attached to it? Thanks.


----------



## tsm106

Quote:


> Originally Posted by *eternal7trance*
> 
> Could someone take a picture of the stock vrm heatsink without the gpu cooler attached to it? Thanks.


The vrm heatsink on the stock cooler is a part of the whole block.

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/3120_40#post_21136016


----------



## Snyderman34

Got my waterblock ordered! Hopefully will be here end of the week


----------



## eternal7trance

Quote:


> Originally Posted by *tsm106*
> 
> The vrm heatsink on the stock cooler is a part of the whole block.
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/3120_40#post_21136016


The screws in the picture make it seem like you can take it off.


----------



## esqueue

I have the stock heatsink/fan in my hand at the moment. The entire area that makes contact with all the memory and vrm is one metal piece. The only different area is the copper area that sits on the gpu. If you want a picture of the bottom of the stock cooler, I can post one if you respond. I'm going afk in a bit.


----------



## GioV

Who is excited for the gratuitous amount of information on Mantle that should come out tomorrow from AMD's developer summit?

http://developer.amd.com/apu/home/agenda-sessions/


----------



## utnorris

OK, so it's late, but I wanted to run a quick benchmark or two. I paired one of the 290's with my 290X because I didn't want to break down the water cooling tonight. Running at 1000MHz GPU and 1250MHz memory for both, using AB to set the clocks and fan (for the 290).

3DMark11

P20091

Graphics Score
28447

Physics Score
11431

Combined Score
9722

Now keep in mind that I am basically running them at the stock 290X settings. These guys are monsters. Tomorrow I should have my other 290 tested, and then I'll get both of them under water to play with. I will check to see what memory comes with them and post it. Also, my first 290 has an ASIC score of 69% and a VID of 1.24v. I am hoping I can get both running at 1150/1475MHz stable, which should give me a sweet score in 3DMark11.

Cost of the two 290's - $760
Water blocks - $240
Total - $1000

Compare that to a GTX 780 Ti and water block (doesn't exist yet), which should be around $820, but I end up with about 75% more GPU power with the 290's in CF for less than a 20% difference in price. And as drivers get better, well.....................
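As a sanity check, the percentages above can be reproduced from the dollar figures quoted in the post (Python; the only numbers used are the ones stated, and the "less than 20%" claim holds when the Crossfire total is taken as the base):

```python
# Quick check of the crossfire-vs-single-card price math quoted above.
cards = 760          # two R9 290s
blocks = 240         # two water blocks
cf_total = cards + blocks

nv_total = 820       # estimated single card + (future) water block

diff = cf_total - nv_total            # dollar gap between the two setups
pct_of_cf = diff / cf_total * 100     # gap as a share of the CF rig's cost
pct_of_nv = diff / nv_total * 100     # gap as a share of the single-card cost

print(cf_total, diff, round(pct_of_cf, 1), round(pct_of_nv, 1))  # 1000 180 18.0 22.0
```

So the gap is 18% or ~22% depending on which total you divide by; either way it is small next to the claimed ~75% extra GPU power.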


----------



## Sgt Bilko

Quote:


> Originally Posted by *utnorris*
> 
> Ok, so it's late but I wanted to run a quick benchmark or two. I paired one of the 290's with my 290x because I didn't want to break down the water cooling tonight. Running at 1000Mhz GPU and 1250Mhz on memory for both using AB to set the clocks and fan (for the 290).
> 
> 3DMark11
> 
> P20091
> 
> Graphics Score
> 28447
> 
> Physics Score
> 11431
> 
> Combined Score
> 9722
> 
> Now keep in mind that I am basically running them at the stock 290x settings. These guys are monsters. Tomorrow I should have my other 290 tested and then get both of them under water to play with. I will check to see what memory comes with them and post it. Also, my first 290 has an Asic score of 69% and a VID of 1.24v. I am hoping I can get both running at 1150/1475Mhz stable which should give me a sweet score in 3DMark11.
> 
> Cost of the two 290's - $760
> Water blocks - $240
> Total - $1000
> 
> Compare that to a GTX 780 Ti and water block (doesn't exist yet), which should be around $820, but I end up with about 75% more GPU power with the 290's in CF for less than a 20% difference in price. And as drivers get better, well.....................


That's some pretty awesome results there, I might start thinking about getting a 290 instead for Crossfire.


----------



## Arizonian

Quote:


> Originally Posted by *GioV*
> 
> Who is excited for the gratuitous amount of information on Mantle that should come out tomorrow from AMD's developer summit?
> 
> http://developer.amd.com/apu/home/agenda-sessions/


I am, mostly for the *Gaming Summit* part of it.

Mantle and TrueAudio, and how the upcoming Thief will implement both of these. Some insight for game developers. TressFX 2.0 rendering. The new CryEngine, aka CryEngine 4. DirectCompute in gaming. AstoundSound for 3D gaming audio. All good things.









I hope an announcement for non-reference 290X / 290 date comes out of it, but I know I'm grasping at straws now.









_Found under tabs - Sessions / Gaming_


----------



## famich

I'll be submitting my pics here shortly: Sapphire R9 290X, now with the EK block, VID 1,515. What we need most is the OC software.


----------



## Falkentyne

Quote:


> Originally Posted by *famich*
> 
> I'll be submitting my pics here shortly: Sapphire R9 290X, now with the EK block, VID 1,515. What we need most is the OC software.


Vid 1.515? That card would fry with that voltage unless you were on subzero.....


----------



## famich

Sorry, my apologies: 1.155


----------



## Gilgam3sh

Quote:


> Originally Posted by *DavidH*
> 
> The cooler has nothing to do with making your card have coil whine, it was already there before you installed the aftermarket cooler but you simply could not hear it over the noisy stock cooler.


Quote:


> Originally Posted by *DavidH*
> 
> Then just run it at 100 percent, its still quiet.


With the stock cooler installed, at idle I run it at about 25-30% and it's very quiet; I could not hear any coil whine at all. I'm not saying it can't be the GPU, but shouldn't I hear it too at 100% with the Xtreme III, since it's not as loud as the stock cooler?

Yeah, when gaming, 100% with the Xtreme III is good, not as loud as the stock cooler at 60%, but 100% at idle is annoying because I can hear it. Also, from time to time I leave the PC on overnight, and with the Xtreme III at 100% I would not be able to sleep, hehe. At least the stock cooler is dead quiet at 25%...


----------



## blue1512

Quote:


> Originally Posted by *Gilgam3sh*
> 
> With the stock cooler installed, at idle I run it at about 25-30% and it's very quiet; I could not hear any coil whine at all. I'm not saying it can't be the GPU, but shouldn't I hear it too at 100% with the Xtreme III, since it's not as loud as the stock cooler?
> 
> Yeah, when gaming, 100% with the Xtreme III is good, not as loud as the stock cooler at 60%, but 100% at idle is annoying because I can hear it. Also, from time to time I leave the PC on overnight, and with the Xtreme III at 100% I would not be able to sleep, hehe. At least the stock cooler is dead quiet at 25%...


I thought the Xtreme III has a PWM plug? Connecting it to the board and setting up your own fan curve seems very simple.


----------



## youra6

Can someone with a MSI R9 290 check for me what their BIOS vendor is? (I backed up my ROM, but now it says the vendor is ATI (1002)). I'm assuming this is expected behavior?


----------



## Gilgam3sh

Quote:


> Originally Posted by *blue1512*
> 
> I thought the Xtreme III has a PWM plug? Connecting it to the board and setting up your own fan curve seems very simple.


If you mean connecting the plug to the card, yes, that's what I've done... I have 2 profiles, one for idle and one for gaming, but atm I can only use the fan at 100%, otherwise the coil whine sound makes me want to break the card in two...


----------



## HoZy

http://www.3dmark.com/fs/1134780

*Firestrike extreme = 5529

+5% Power in CCC

+14.5% GPU in CCC = 1145mhz

+20% Memory in CCC = 1500MHz (6000MHz effective)*
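For anyone wondering how CCC's percentage sliders map to those clocks, here's a quick check (Python; the 1000 MHz core / 1250 MHz memory reference clocks are assumed stock 290X values, not stated in the post):

```python
# How CCC's OverDrive percentage sliders translate into actual clocks,
# assuming the reference 1000 MHz core / 1250 MHz memory of a stock 290X.
core_ref, mem_ref = 1000, 1250

core = core_ref * (1 + 0.145)   # +14.5% GPU slider
mem = mem_ref * (1 + 0.20)      # +20% memory slider
mem_effective = mem * 4         # GDDR5 moves 4 bits per clock per pin

print(round(core), round(mem), round(mem_effective))  # 1145 1500 6000
```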


----------



## blue1512

Quote:


> Originally Posted by *Gilgam3sh*
> 
> If you mean connecting the plug to the card, yes, that's what I've done... I have 2 profiles, one for idle and one for gaming, but atm I can only use the fan at 100%, otherwise the coil whine sound makes me want to break the card in two...


I am waiting for my 290X and your complaint about coil whine really worries me. When did it appear? From the start? After BIOS flashing? Or after replacing the cooler? Please help!


----------



## Gilgam3sh

Quote:


> Originally Posted by *blue1512*
> 
> I am waiting for my 290X and your complaint about coil whine really worries me. When did it appear? From the start? After BIOS flashing? Or after replacing the cooler? Please help!


After I fitted the Accelero Xtreme III, and I'm not the only one who got it...

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/4700#post_21171537

here is another coil whine problem with the Xtreme III

http://hardforum.com/showthread.php?t=1788658&page=1


----------



## Gilgam3sh

Quote:


> Originally Posted by *youra6*
> 
> Can someone with a MSI R9 290 check for me what their BIOS vendor is? (I backed up my ROM, but now it says the vendor is ATI (1002)). I'm assuming this is expected behavior?


It's normal; my Sapphire says that as well, but with the ASUS BIOS it says ASUS as the vendor.


----------



## blue1512

Quote:


> Originally Posted by *Gilgam3sh*
> 
> after I fitted the Accelero Xtreme III, and I'm not the only one who got it...
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/4700#post_21171537
> 
> here is another coil whine problem with the Xtreme III
> 
> http://hardforum.com/showthread.php?t=1788658&page=1


Thanks for your info.


----------



## beejay

So I'm thinking of getting a dedicated VRM cooling module. Looking at these: http://www.thermalright.com/vrm_heatsink.html. I'll remove the shroud from my Accelero Hybrid, install the VRM sink, and maybe add an 80mm fan. I don't know if any of those VRM heatsinks would fit; some kind of mod might be needed.

What do you think?


----------



## chrisf4lc0n

@beejay
Just get some thin thermal sticky pads and install little aluminium heatsinks; Akasa did them for 6 quid in the UK. You would be surprised how much difference they make with a bit of airflow over them...

Sent from my SGH-i927 using Tapatalk


----------



## beejay

Quote:


> Originally Posted by *chrisf4lc0n*
> 
> @beejay
> Just get some thin thermal sticky pads and install little aluminium heatsinks; Akasa did them for 6 quid in the UK. You would be surprised how much difference they make with a bit of airflow over them...
> 
> Sent from my SGH-i927 using Tapatalk


Already have those. They work, but I want better. You see, I'm using an Accelero Hybrid; my core temps do not go above 70 in normal gaming sessions, but VRM1 goes 90+. To match the cooling of my core, I need to bring down the temps of my VRMs. Doing that, I can feed more voltage. I'm currently on stock voltage, 1100 core, 1500 mem.


----------



## raptor15sc

I got these on launch day, but I've been busy until now:



Please forgive the messy cabling. The case has no windows and I decided to use all eight PCI-E sockets on my PSU individually. Also, the P9X79-E WS has a 6-pin power socket in a really stupid place just above the center of the PCI-E slots, so I used my PSU's unused EPS12V 8-pin cable with a 6-pin adapter (they differ, the pinouts are reversed vertically).

Full build specs:
Asus P9X79-E WS with 64 lanes of PCI-E 3.0
Intel i7 4930K @ 4.4GHz, HT off for increased performance in BF4
G.Skill TridentX (2 x 8GB) DDR3 @ 2400MHz
4 x MSI Radeon R9 290X with Asus BIOS and Coollaboratory Liquid Ultra TIM @ ...I'd like some suggestions on this. Good temps, great airflow.
2 x SanDisk Extreme II 240GB SSDs in RAID 0 SATA III
Coolmax ZPS-1600B 1600W Power Supply
NZXT Kraken X60 280mm Liquid Cooler
Cubitek Magic Cube AIO case with 4 x BitFenix Spectre Pro 140mm PWM Case Fans
3 x ASUS VG248QE Monitors @ 120Hz strobed with LightBoost and in-game Vsync enabled
Razer Tiamat Elite 7.1 Surround Sound Headset (Don't even hear the 290Xs' fans)
SteelSeries Sensei [RAW] mouse @ 5670 CPI, 10% sensitivity in BF4
Razer Goliathus Extended Mouse Pad, Speed Edition
Qpad MK-80 Pro Backlit PS2 Mechanical Keyboard with Cherry MX Red Switches
I'm too lazy to total up the price on everything right now, but I paid below MSRP on almost every item and I sold three BF4 keys on eBay.


----------



## the9quad

I already RMA'd my 290X, but sitting here I think I might have the solution to the black screens, if someone is willing to test it. Try setting the power limit to 90% or so. Alternatively, see if your motherboard has a different power plug that gives more power to the PCIe slots. I think the issue is the mobo isn't supplying the power the cards need through the PCIe slots. It's not that the 6- and 8-pin from the PSU to the card isn't getting the power; it's the slots themselves. What do ya think, am I dumb or maybe right? I'm willing to bet most of the people here with no problems are on ROG mobos with the EZ Plug connectors that give more stable power to the PCIe slots.


----------



## Sgt Bilko

Will using thermal pads on aftermarket heatsinks work alright?

Just wondering if they will lose their "stick" and fall off or not.......... I'd rather not use thermal adhesive again, just in case I do have to RMA.

Quote:


> Originally Posted by *the9quad*
> 
> I already RMA'd my 290X, but sitting here I think I might have the solution, if someone is willing to test it. Try setting the power limit to 90% or so. Alternatively, see if your motherboard has a different power plug that gives more power to the PCIe slots. I think the issue is the mobo isn't supplying the power the cards need through the PCIe slots. It's not that the 6- and 8-pin from the PSU to the card isn't getting the power; it's the slots themselves. What do ya think, am I dumb or maybe right? I'm willing to bet most of the people here with no problems are on ROG mobos with the EZ Plug connectors that give more stable power to the PCIe slots.


I don't have mine plugged in atm but I'll give it a go tomorrow...............also just had my system lock on me when clicking on the GPU tab in AIDA64 whilst playing a movie in PowerDVD 13. I tested it 3 times and it locked 3 times.......weird


----------



## Gilgam3sh

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Will using Thermal pads on aftermarket heatsinks work alright?
> 
> just wondering if they will lose their "stick" and fall off or not..........i'd rather not use thermal adhesive again just in case i do have to RMA.


I used the thermal pads from the stock cooler when fitted the Accelero Xtreme III memory heatsinks, did not want to use any glue so I can remove it if I need to RMA the card in future...


----------



## EliteReplay

Quote:


> Originally Posted by *Arizonian*
> 
> I am, mostly for the *Gaming Summit* part of it.
> 
> Mantle and TrueAudio, and how the upcoming Thief will implement both of these. Some insight for game developers. TressFX 2.0 rendering. The new CryEngine, aka CryEngine 4. DirectCompute in gaming. AstoundSound for 3D gaming audio. All good things.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I hope an announcement for non-reference 290X / 290 date comes out of it, but I know I'm grasping at straws now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _Found under tabs - Sessions / Gaming_


I'm interested as well. Is there any stream link, please?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gilgam3sh*
> 
> I used the thermal pads from the stock cooler when fitted the Accelero Xtreme III memory heatsinks, did not want to use any glue so I can remove it if I need to RMA the card in future...


Perfect. I actually took my AX III off when I thought I had to RMA, and trying to clean everything was a nightmare. I'll order some thermal pads right away; any type or size that's recommended?


----------



## Gilgam3sh

Quote:


> Originally Posted by *Sgt Bilko*
> 
> perfect, i actually took my AX III off when i thought i had to RMA and trying to clean everything was a nightmare, i'll order some Thermal pads right away, any type or size thats recommended?


OK, as I said, I just used the ones that came with the R9 290 cooler. I did not buy anything.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gilgam3sh*
> 
> ok, as I said, I just used the ones that came with the R9 290 cooler. did not buy anything.


My bad, I overlooked that part. Simple yet efficient solution.......I'll put the cooler back on tomorrow.

As for the coil whine you were hearing, mine does it too, but only at high FPS with the AX III on......maybe because the AX is more open, it allows the sound to travel more easily than it does with the stock cooler on?


----------



## beejay

Quote:


> Originally Posted by *Sgt Bilko*
> 
> perfect, i actually took my AX III off when i thought i had to RMA and trying to clean everything was a nightmare, i'll order some Thermal pads right away, any type or size thats recommended?


What happened to your card?


----------



## Sgt Bilko

Quote:


> Originally Posted by *beejay*
> 
> What happened to your card?


Nothing. I was going to RMA due to all the black screen issues I had, but I decided to do a full wipe of things and start fresh. It seems to be holding OK, so I decided to keep the card and see what happens.


----------



## Gilgam3sh

Quote:


> Originally Posted by *Sgt Bilko*
> 
> my bad, i overlooked that part, Simple yet efficient solution.......I'll put the cooler back on tomorrow,
> 
> As for the coil whine you were hearing, mine does it too but only at high FPS with the AX III on......maybe because the AX is more open it allows the sound to travel more easily than it does with the stock cooler on?


I did not have time to test any game; I will try later at home. Like I said before, a little coil whine when playing games with the fan @ 100% is OK, I wear headphones anyway. BUT the problem is that I get this coil whine at idle if I don't run the fan @ 100%... that is very, very annoying. I will record a video and put it on YouTube when I get home.


----------



## hatlesschimp

Hello just wondering if I could get some help.

So I'm on the verge of buying a 290X (probably an ASUS or Gigabyte). How do I go about supplying 2 HDMI signals to 2 projectors (2x1 Eyefinity)? Also, I have an AVR that needs HDMI input for surround sound.


----------



## hatlesschimp

Quote:


> Originally Posted by *Gilgam3sh*
> 
> I did not have time to test any game, will try later at home, but a little coil whine when playing games and with the fan @100% is OK, I wear headphones anyway, BUT the problem is that I get this coil whine at idle if I don't run the fan @ 100%... that is very very annoying. I will record a video and put on youtube when I get home.


I cut a hole in my wall and put my tower in the spare bedroom lol. 3 Titans with stock coolers lol. They weren't that loud or annoying, but they made the room a little bit warm.


----------



## FloJoe6669

Considering getting a 290 to replace my GTX 670, really tempted.

How do people here feel about the card's performance weighed against the temperature/noise/power? Would you still consider it a good deal, or are you just getting what you pay for?
Or is that too preferential a question?


----------



## beejay

Quote:


> Originally Posted by *hatlesschimp*
> 
> Hello just wondering if I could get some help.
> 
> So I'm on the verge of buying a 290X (probably a ASUS or Gigabyte). How do I go about supplying 2 hdmi signals to 2 projectors (2x1 eyefinity). Also I have a AVR that need HDMI input for surround sound.


The R9 290X has 2 DVI, 1 HDMI, 1 DP.

You can buy 2 DVI-to-HDMI converters, but they won't carry a dual-link signal (not really needed, as you are using HDMI).


----------



## hatlesschimp

OK, so from the 2 DVI ports I convert to HDMI with adapters, and then the projectors are happy. And the HDMI port from the 290X goes out to the AVR for audio.


----------



## hatlesschimp

Crossfire 290s or a single 290X for 3840 x 1080?


----------



## raptor15sc

Quote:


> Originally Posted by *hatlesschimp*
> 
> ok so from the 2 dvi ports I convert them to HDMI with adapters and then the projectors are happy . And the HDMI port from the 290x out to the AVR for audio.


Correct.
And if you plan on doing any overclocking, go with the ASUS card (if they're the same price) and save yourself the trouble of flashing the BIOS. I haven't seen anything about Gigabyte specifically, but everything I've seen says ASUS only if you want voltage control.

Edit: For your new question, I'd go with Crossfired 290 cards, unless you care about noise AND you only have something like an mATX board with only 4 expansion slots of space--these cards kinda need a little room to breathe.


----------



## MojoW

Quote:


> Originally Posted by *beejay*
> 
> How are you formatting the USB? How are you preparing DOS in it (HPUsb?)? After prepping DOS, did you delete (or rename) config.sys and autoexec.bat? Are you using MS-DOS, FreeDOS or any type of DOS? Do you have the latest atiflash.exe?
> 
> So in a nutshell:
> 
> 1. Create a bootable USB DOS drive
> 2. Delete config.sys and autoexec.bat, if any
> 3. Copy atiflash.exe to the flash drive, put it in the root directory
> 4. Boot DOS
> 
> I understand you are seeing an "Invalid Command" not a device not found error.
> 
> Once you get to the actual flashing, you need the -f option
> 
> atiflash -f -p 0 biosname.rom
> 
> Aside from this, if it still does not flash, we may need more troubleshooting.
> 
> Come to think of it, you said "invalid command"; I think you forgot to copy atiflash.exe to the USB drive? It needs to be the atiflash.exe file, not the compressed zip file. Disregard; reading your post, you already know this.


I found the problem!
The atiflash I used the first time was out of date (4.07, not 4.17), and it did flash the BIOS, but I guess in a different way than the new one.
So now the flashing works every time, but that first flash ruined something, because I still get no signal.
I don't know what else to do anymore.


----------



## hatlesschimp

Yeah, I'm thinking in the future I will buy a second 290X that has updated ports (HDMI 2.0? More DP, etc.).

Will a standard vanilla 290X card crossfire OK with a non-reference 290X card? I know with Nvidia you were governed by your slowest card.


----------



## raptor15sc

Quote:


> Originally Posted by *hatlesschimp*
> 
> Yeah, I'm thinking in the future I will buy a second 290X that has updated ports (HDMI 2.0? More DP, etc.).
> 
> Will a standard vanilla 290X card crossfire OK with a non-reference 290X card? I know with Nvidia you were governed by your slowest card.


I'm pretty sure yes, you'll be fine. AMD/ATI have always been pretty good about Crossfiring non-matching cards. Any 290X cards with 4GB of RAM should work together fine.


----------



## Jpmboy

Quote:


> Originally Posted by *raptor15sc*
> 
> I got these on launch day, but I've been busy until now:
> 
> 
> 
> Please forgive the messy cabling. The case has no windows and I decided to use all eight PCI-E sockets on my PSU individually. Also, the P9X79-E WS has a 6-pin power socket in a really stupid place just above the center of the PCI-E slots, so I used my PSU's unused EPS12V 8-pin cable with a 6-pin adapter (they differ, the pinouts are reversed vertically).
> 
> Full build specs:
> Asus P9X79-E WS with 64 lanes of PCI-E 3.0
> Intel i7 4930K @ 4.4GHz, HT off for increased performance in BF4
> G.Skill TridentX (2 x 8GB) DDR3 @ 2400MHz
> 4 x MSI Radeon R9 290X with Asus BIOS and Coollaboratory Liquid Ultra TIM @ ...I'd like some suggestions on this. Good temps, great airflow.
> 2 x SanDisk Extreme II 240GB SSDs in RAID 0 SATA III
> Coolmax ZPS-1600B 1600W Power Supply
> NZXT Kraken X60 280mm Liquid Cooler
> Cubitek Magic Cube AIO case with 4 x BitFenix Spectre Pro 140mm PWM Case Fans
> 3 x ASUS VG248QE Monitors @ 120Hz strobed with LightBoost and in-game Vsync enabled
> Razer Tiamat Elite 7.1 Surround Sound Headset (Don't even hear the 290Xs' fans)
> SteelSeries Sensei [RAW] mouse @ 5670 CPI, 10% sensitivity in BF4
> Razer Goliathus Extended Mouse Pad, Speed Edition
> Qpad MK-80 Pro Backlit PS2 Mechanical Keyboard with Cherry MX Red Switches
> I'm too lazy to total up the price on everything right now, but I payed below MSRP on almost every item and I sold three BF4 keys on eBay.


Good mobo!


----------



## TheRoot

Quote:


> Originally Posted by *raptor15sc*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I got these on launch day, but I've been busy until now:
> 
> 
> 
> Please forgive the messy cabling. The case has no windows and I decided to use all eight PCI-E sockets on my PSU individually. Also, the P9X79-E WS has a 6-pin power socket in a really stupid place just above the center of the PCI-E slots, so I used my PSU's unused EPS12V 8-pin cable with a 6-pin adapter (they differ, the pinouts are reversed vertically).
> 
> Full build specs:
> Asus P9X79-E WS with 64 lanes of PCI-E 3.0
> Intel i7 4930K @ 4.4GHz, HT off for increased performance in BF4
> G.Skill TridentX (2 x 8GB) DDR3 @ 2400MHz
> 4 x MSI Radeon R9 290X with Asus BIOS and Coollaboratory Liquid Ultra TIM @ ...I'd like some suggestions on this. Good temps, great airflow.
> 2 x SanDisk Extreme II 240GB SSDs in RAID 0 SATA III
> Coolmax ZPS-1600B 1600W Power Supply
> NZXT Kraken X60 280mm Liquid Cooler
> Cubitek Magic Cube AIO case with 4 x BitFenix Spectre Pro 140mm PWM Case Fans
> 3 x ASUS VG248QE Monitors @ 120Hz strobed with LightBoost and in-game Vsync enabled
> Razer Tiamat Elite 7.1 Surround Sound Headset (Don't even hear the 290Xs' fans)
> SteelSeries Sensei [RAW] mouse @ 5670 CPI, 10% sensitivity in BF4
> Razer Goliathus Extended Mouse Pad, Speed Edition
> Qpad MK-80 Pro Backlit PS2 Mechanical Keyboard with Cherry MX Red Switches
> I'm too lazy to total up the price on everything right now, but I payed below MSRP on almost every item and I sold three BF4 keys on eBay.


You need a bigger case and a better PSU.


----------



## beejay

Quote:


> Originally Posted by *MojoW*
> 
> I found the problem!
> The atiflash I used the first time was out of date (4.07, not 4.17), and it did flash the BIOS, but I guess in a different way than the new one.
> So now the flashing works every time, but that first flash ruined something, because I still get no signal.
> I don't know what else to do anymore.


Can you run your card using the other BIOS? You have two, the Uber and the Quiet. Which setting are you on now?

And are you flashing the Uber or the Quiet BIOS?

Remember we have 2 BIOSes; hopefully the other one is still working. (I always do the flashing on the Uber BIOS.)


----------



## MojoW

Quote:


> Originally Posted by *beejay*
> 
> Can you run your card if you are using the other bios? You have two, the uber and the quiet. In what settings are you now?
> 
> And are you flashing the Uber or the Quiet Bios?
> 
> Remember we have 2 bios, hopefully the other one is still working. (I always do the flashing on the UBER bios)


Yeah, I'm on Quiet now, and Uber mode is what I'm flashing every time.
Quiet mode is still untouched.


----------



## utnorris

Quote:


> Originally Posted by *hatlesschimp*
> 
> Yeah, I'm thinking in the future I will buy a second 290X that has updated ports (HDMI 2.0? More DP, etc.).
> 
> Will a standard vanilla 290X card crossfire OK with a non-reference 290X card? I know with Nvidia you were governed by your slowest card.


Yes, it will be fine. Any 290-series card will CF with any other 290-series card. I am currently running a 290 with a 290X. AMD updated CF some time back so the cards run at whatever their individual speed is; for example, if one is set to 947MHz and the other at 1000MHz, they will run at their respective speeds. I tested this last night with mine.


----------



## Raephen

Quote:


> Originally Posted by *blue1512*
> 
> I thought Xtreme III has a PWM plug? Connect it to the board and set up your own fan curve seems to be very simple


I'm no expert, but couldn't the coil whine perhaps be a result of the card now trying to regulate the three fans instead of the stock one?

I don't have mine in yet, but a thing worth trying would be running it off the supplied 7v feed from the molex plug. That's, roughly speaking, 40% speed.

At Tom's it was sufficient, and I imagine it would be easier on noise than 100%.


----------



## raptor15sc

Quote:


> Originally Posted by *TheRoot*
> 
> You need a bigger case and a better PSU.


Why a bigger case?

*And what PSU do you recommend?* I've monitored its power consumption and I don't reach the 1600W limit--the reason I got this PSU is because its two 12V rails have high amps. It is my understanding that other high-wattage PSUs give each video card its own 12V rail--at about 20 amps--and I think the 290X can go beyond that.
The Coolmax ZPS-1600B is the same as the Rosewill 1600W PSU, from what I've read.
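The per-rail concern is easy to put numbers on. A back-of-the-envelope check (Python; the ~300 W per-card draw is an illustrative worst-case assumption, not a figure measured in this thread):

```python
# Can a single ~20 A 12 V rail feed one R9 290X under heavy load?
# The 300 W board draw is an assumed worst-case figure for illustration.
board_power_w = 300
rail_v = 12.0
rail_amps = 20.0

amps_needed = board_power_w / rail_v
print(amps_needed, amps_needed > rail_amps)  # 25.0 True: one 20 A rail falls short
```

That is the arithmetic behind preferring a PSU whose 12 V rails carry well over 20 A per card (or a strong single-rail design).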


----------



## Gilgam3sh

Quote:


> Originally Posted by *Raephen*
> 
> I'm no expert, but isn't the coil whine perhaps a result of the card now trying to regulate the three fans instead of the stock one?
> 
> I don't have mine in yet, but a thing worth trying would be running it off the supplied 7v feed from the molex plug. That's, roughly speaking, 40% speed.
> 
> At Tom's it was sufficient, and I imagine it would be easier on noise than 100%.


I will try that when I come home, with the molex plug, like I said before, at 100% its perfect when gaming as it's not annoying at all, but at idle it's too much...


----------



## Tobiman

Quote:


> Originally Posted by *raptor15sc*
> 
> Why a bigger case?
> 
> *And what PSU do you recommend?* I've monitored it's power consumption and I don't reach the 1600w limit--The reason I got this PSU is because its two 12v rails have high amps. It is my understanding that other high wattage PSUs give each video card it's own 12v rail--at about 20 amps--and I think the 290X can go beyond that.
> The Coolmax ZPS-1600B is the same as the Rosewill 1600w PSU from what I've read.


I'd recommend two 850W (Gold) PSUs. They should be cheaper than one 1600W and much safer.


----------



## Arizonian

Quote:


> Originally Posted by *raptor15sc*
> 
> I got these on launch day, but I've been busy until now:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Please forgive the messy cabling. The case has no windows and I decided to use all eight PCI-E sockets on my PSU individually. Also, the P9X79-E WS has a 6-pin power socket in a really stupid place just above the center of the PCI-E slots, so I used my PSU's unused EPS12V 8-pin cable with a 6-pin adapter (they differ, the pinouts are reversed vertically).
> 
> Full build specs:
> Asus P9X79-E WS with 64 lanes of PCI-E 3.0
> Intel i7 4930K @ 4.4GHz, HT off for increased performance in BF4
> G.Skill TridentX (2 x 8GB) DDR3 @ 2400MHz
> 4 x MSI Radeon R9 290X with Asus BIOS and Coollaboratory Liquid Ultra TIM @ ...I'd like some suggestions on this. Good temps, great airflow.
> 2 x SanDisk Extreme II 240GB SSDs in RAID 0 SATA III
> Coolmax ZPS-1600B 1600W Power Supply
> NZXT Kraken X60 280mm Liquid Cooler
> Cubitek Magic Cube AIO case with 4 x BitFenix Spectre Pro 140mm PWM Case Fans
> 3 x ASUS VG248QE Monitors @ 120Hz strobed with LightBoost and in-game Vsync enabled
> Razer Tiamat Elite 7.1 Surround Sound Headset (Don't even hear the 290Xs' fans)
> SteelSeries Sensei [RAW] mouse @ 5670 CPI, 10% sensitivity in BF4
> Razer Goliathus Extended Mouse Pad, Speed Edition
> Qpad MK-80 Pro Backlit PS2 Mechanical Keyboard with Cherry MX Red Switches
> I'm too lazy to total up the price on everything right now, but I payed below MSRP on almost every item and I sold three BF4 keys on eBay.


Congrats - you're added








Quote:


> Originally Posted by *EliteReplay*
> 
> in interested aswell is there any stream link please?


No stream on the AMD Summit as far as I know.


----------



## Raephen

Quote:


> Originally Posted by *Gilgam3sh*
> 
> I will try that when I come home, with the molex plug. Like I said before, at 100% it's perfect when gaming as it's not annoying at all, but at idle it's too much...


Keep me posted!

So far - knock on wood - I have had next to no issues with my gem stone 290.

Apart from a single CTD in Skyrim, reminiscent of CTDs I had whilst overclocking an FX-4170 I owned before (loved the CPU, just too bad AMD's IMC is a bit on the weaker side, or I would still have it in my system).

And an odd squeal when quitting the Heaven benchmark. Odd, but when I exit that last screen and am idle on the desktop, it's gone. Don't know if it's a sound issue or something coming out of my system / card.

Let's try without sound...


----------



## flopper

whql driver 13.11
http://www.guru3d.com/files_details/amd_catalyst_13_11_whql_download.html


----------



## Arizonian

Quote:


> Originally Posted by *flopper*
> 
> whql driver 13.11
> http://www.guru3d.com/files_details/amd_catalyst_13_11_whql_download.html


This is news....first WHQL for the new series. Out of beta finally. Will stick on OP when it goes official.


----------



## flopper

Quote:


> Originally Posted by *Arizonian*
> 
> This is news....first WHQL for the new series. Out of beta finally. Will stick on OP when it goes official.


Downloaded and ready for my incoming 290 tomorrow.


----------



## Gilgam3sh

@Arizonian

can you update first post? I now run the Accelero Xtreme III cooler instead of stock.

http://imageshack.com/i/ns0b27j


----------



## Jpmboy

Quote:


> Originally Posted by *MojoW*
> 
> Yeah i'm on quiet now and the uber mode is what i'm flashing everytime.
> Quiet mode is still untouched.


this is what the usb key should look like with no files hidden. use MS DOS. make sure you are typing the commands correctly:

atiflash[space]-p[space]-f[space]0[space]romname.rom


----------



## raptor15sc

Quote:


> Originally Posted by *flopper*
> 
> whql driver 13.11
> http://www.guru3d.com/files_details/amd_catalyst_13_11_whql_download.html


Quote:


> Originally Posted by *Arizonian*
> 
> This is news....first WHQL for the new series. Out of beta finally. Will stick on OP when it goes official.


Thatguy91 on Guru3D: "These are OLDER than the 13.11 Beta 9.2. The obvious older component is OpenGL (12614 in the Beta 9.2's vs 12613 for this one), but even the other components that are supposedly the same version are older builds and are not bit identical (different uncompressed file sizes too). For example, atikmdag.sys."


----------



## Fezlakk

Quote:


> Originally Posted by *raptor15sc*
> 
> I got these on launch day, but I've been busy until now:
> 
> 
> 
> Please forgive the messy cabling. The case has no windows and I decided to use all eight PCI-E sockets on my PSU individually. Also, the P9X79-E WS has a 6-pin power socket in a really stupid place just above the center of the PCI-E slots, so I used my PSU's unused EPS12V 8-pin cable with a 6-pin adapter (they differ, the pinouts are reversed vertically).
> 
> Full build specs:
> Asus P9X79-E WS with 64 lanes of PCI-E 3.0
> Intel i7 4930K @ 4.4GHz, HT off for increased performance in BF4
> G.Skill TridentX (2 x 8GB) DDR3 @ 2400MHz
> 4 x MSI Radeon R9 290X with Asus BIOS and Coollaboratory Liquid Ultra TIM @ ...I'd like some suggestions on this. Good temps, great airflow.
> 2 x SanDisk Extreme II 240GB SSDs in RAID 0 SATA III
> Coolmax ZPS-1600B 1600W Power Supply
> NZXT Kraken X60 280mm Liquid Cooler
> Cubitek Magic Cube AIO case with 4 x BitFenix Spectre Pro 140mm PWM Case Fans
> 3 x ASUS VG248QE Monitors @ 120Hz strobed with LightBoost and in-game Vsync enabled
> Razer Tiamat Elite 7.1 Surround Sound Headset (Don't even hear the 290Xs' fans)
> SteelSeries Sensei [RAW] mouse @ 5670 CPI, 10% sensitivity in BF4
> Razer Goliathus Extended Mouse Pad, Speed Edition
> Qpad MK-80 Pro Backlit PS2 Mechanical Keyboard with Cherry MX Red Switches
> I'm too lazy to total up the price on everything right now, but I payed below MSRP on almost every item and I sold three BF4 keys on eBay.


Man, that is a beastly setup.
How loud is it under load with all those 290Xs? How is the throttling?


----------



## fearthisneo

Hi all, finally got my 290, going to install it here in a few minutes, will be running stock cooling till frozencpu ships my block.


----------



## rdr09

Quote:


> Originally Posted by *fearthisneo*
> 
> Hi all, finally got my 290, going to install it here in a few minutes, will be running stock cooling till frozencpu ships my block.


congrats. if i may make a suggestion - don't do anything fancy. if you are not using 13.11 Beta 8, then i suggest you download it.

1. use it to uninstall the currently installed driver. pick express.
2. shut down your rig
3. replace the 7970 with the 290
4. boot and install 13.11 Beta 8 - pick express.

can't recommend Beta 9 atm but Beta 8 works fine. Not sure if you'll get the same results with Win8.

edit: my bad. thought you were the one with a 7970.


----------



## Jpmboy

Quote:


> Originally Posted by *rdr09*
> 
> congrats. if i may make a suggestion - don't do anything fancy. if you are not using 13.11 Beta 8, then i suggest that and download it.
> 
> 1. use it to uninstall the driver currently. pick express.
> 2. shutdown your rig
> 3. replace the 7970 with the 290
> 4. Boot and install 13.11 Beta 8 - pick express.
> 
> can't recommend Beta 9 atm but Beta 8 works fine. Not sure if you'll get the same results with Win8.
> 
> edit: my bad. thought you were the one with a 7970.


beta 8 is not on the AMD site ?... where do you recommend getting it?


----------



## Blackops_2

http://www.guru3d.com/files_details/amd_catalyst_13_11_whql_download.html

Apparently WHQL has been released.


----------



## rdr09

Quote:


> Originally Posted by *Jpmboy*
> 
> beta 8 is not on the AMD site ?... where do you recommend getting it?


here jp . . .

http://www.overclock.net/t/1438283/amd-catalyst-13-11-beta8-released

+rep for all your hardwork.


----------



## rdr09

Quote:


> Originally Posted by *Blackops_2*
> 
> http://www.guru3d.com/files_details/amd_catalyst_13_11_whql_download.html
> 
> Apparently WHQL has been released.


i normally wait for results from others. i know it's bad.


----------



## Newbie2009

Ordered my cards when still in stock, now out of stock and still waiting.


----------



## Bloitz

Good news: just got an email from Caseking that my order is ready for shipping! Glad I didn't cancel it after all, because I was pissed when they told me I would have to wait until the 15th for my block. They claim next-day delivery... We'll see about that since it has to come from Germany, but it would be a nice birthday present to myself


----------



## Jpmboy

Quote:


> Originally Posted by *rdr09*
> 
> here jp . . .
> 
> http://www.overclock.net/t/1438283/amd-catalyst-13-11-beta8-released
> 
> +rep for all your hardwork.


thanks for the link - but, yeah - that is a dead end. AMD has legacy/older drivers somewhere....


----------



## rdr09

Quote:


> Originally Posted by *Jpmboy*
> 
> thanks for the link - but, yeah - that is a deadend. AMD has legacy/older drivers somewhere....


Looks like latest WHQL that Blackops linked is same as the Beta 8.


----------



## hatlesschimp

So, for a dual-projector 2x1 Eyefinity setup (3840 x 1080), which cards would you choose?

The pricing has really messed with my head and I can't decide between the cards below.

(Australian prices below LOL)

ASUS 290x = $769 + Free BF4
XFX 290x = $659
Gigabyte 290x = $699 + Free BF4

Sapphire 290 = $499
ASUS 290 = $499
Gigabyte 290 = $519

XFX 7990 = $699

Powercolor Radeon HD7970 3GB OC = $319
XFX Double Dissipation 3GB = $319

The 290 is cheap and perhaps I can buy BF4 separately???


----------



## raptor15sc

Quote:


> Originally Posted by *Fezlakk*
> 
> Man, that is a beastly setup.
> How loud is it under load with all those 290Xs? How is the throttling?


Not too loud for me. All the fans have custom PWM profiles.
The 140mm case fans are powerful, but they're silent at low RPMs.

Sure, it gets relatively loud, but that's just when I'm gaming and I can't hear the fans because I keep the case away from where I sit and I'm wearing around-the-ear headphones.

I don't get throttled because I don't come close to exceeding the 94c limit.

With the 290Xs at stock specs, I stay just under 80c while playing BF4 at Ultra settings (thanks to the Liquid Ultra paste I put on them) with this fan profile (ULPS disabled):


----------



## Technewbie

Got mine in yesterday.


Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!







Also I have a question: I plan on flashing my BIOS to the ASUS one so I can get voltage control for a better OC. However, I have a 650W PSU, so I'm concerned that if I up the voltage it will be too much for my PSU. Does anyone know if I can up the voltage and be fine? If so, what voltage would be the max?


----------



## raptor15sc

Quote:


> Originally Posted by *hatlesschimp*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> So for a dual projector 2x1 eyefinity (3840 x 1080p). Which cards would you choose?
> 
> The pricing has really messed with my head and I cant decide between these below cards.
> 
> (Australian prices below LOL)
> 
> ASUS 290x = $769 + Free BF4
> XFX 290x = $659
> Gigabyte 290x = $699 + Free BF4
> 
> Sapphire 290 = $499
> ASUS 290 = $499
> Gigabyte 290 = $519
> 
> XFX 7990 = $699
> 
> Powercolor Radeon HD7970 3GB OC = $319
> XFX Double Dissipation 3GB = $319
> 
> The 290 is cheap and I can buy bf4 separate perhaps???


What kind of FPS at what settings do you want/need to get in BF4? Also, it might be worth it to wait the couple of days till AMD Mantle comes out. If it has a big effect, you might not have to spend as much.


----------



## MojoW

Quote:


> Originally Posted by *Jpmboy*
> 
> this is what the usb key should look like with no files hidden. use MS DOS. make sure you are typing the commands correctly:
> 
> atiflash[space]-p[space]-f[space]0[space]romname.rom


Did this exactly and it does say it's flashed.
If i flash to Sapphire and back to Asus i even see the vendor change in GPU-Z (most info stays blank like the screen i posted earlier).
So i think the first time flashing with the wrong atiflash messed something up in the BIOS chip.
My usb folder does look different than yours - do you use HPUsb? (I just see atiflash and the ROM with hidden files shown.)
What else can i do?


----------



## beejay

Updated my first proof post with new GPU-Z validation and rig with OCN Signature tag. Pardon the messy wires, I've been benching some other things lately.

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/2260#post_21107958


----------



## raptor15sc

Quote:


> Originally Posted by *Technewbie*
> 
> Also I have a question, I plan on flashing my BIOS to the ASUS one so I can get voltage control for a better OC. However I have a 650w PSU so i'm concerned if I up the voltage that it will be to much for my PSU. Does any one know if I can up the voltage and be fine? If so to what voltage would be the max?


It depends on every other spec in your system too. To be completely sure and safe, I recommend getting a wattage meter.
Here's the cheapest one, but there are more precise ones on the market.
You just slowly increase the voltage/speeds and check the meter each time you do.
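For a rough sense of headroom before reaching for the wattage meter, a back-of-the-envelope budget helps. The figures below are ballpark assumptions (a reference 290/290X is commonly estimated around 300 W under load), not measurements from this thread:

```python
# Rough power-headroom estimate for judging PSU safety margins.
# All component wattages are ballpark assumptions, not measurements --
# a wall meter is the only way to know your real draw.

def estimated_draw(gpu_count, gpu_watts=300, cpu_watts=150, rest_watts=100):
    """Return an estimated peak system draw in watts."""
    return gpu_count * gpu_watts + cpu_watts + rest_watts

def headroom(psu_watts, gpu_count, **kw):
    """Watts of spare capacity left on the PSU (negative = overloaded)."""
    return psu_watts - estimated_draw(gpu_count, **kw)

print(headroom(650, 1))   # one stock card on a 650 W unit -> 100
print(headroom(750, 2))   # two cards on 750 W -> -100, overloaded
```

Positive headroom at stock is what you then spend on overclocking and extra voltage, which is why meter readings while stepping clocks up slowly are the safer guide.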


----------



## hatlesschimp

Quote:


> Originally Posted by *raptor15sc*
> 
> What kind of FPS at what settings do you want/need to get in BF4? Also, it might be worth it to wait the couple of days till AMD Mantle comes out. If it has a big effect, you might not have to spend as much.


I guess ideal would be 60fps min at 3840 x 1080 with the highest settings.


----------



## skupples

Quote:


> Originally Posted by *GioV*
> 
> Who is excited for the gratuitous amount of information on Mantle that should come out tomorrow from AMD's developer summit?
> 
> http://developer.amd.com/apu/home/agenda-sessions/


----------



## smokedawg

New AB release soon:
link
Quote:


> Of course no, beta 17 with 290 voltage support was submitted to MSI couple days ago. So it will be released soon.


----------



## raptor15sc

Quote:


> Originally Posted by *hatlesschimp*
> 
> I guess ideal would be 60fps min at 3840 x 1080 with the highest settings.


Well, the FPS will go up with Mantle, we just don't know how much.
For now, I'd go with either two 290X cards or two 290 cards. Depends on what you want to spend and what you want in the future.
Personally, I think you could get two 290 cards, overclocked with an empty PCI-E slot between them on your board, and get 60 FPS average. Maybe even 60 FPS min. with Mantle.


----------



## raptor15sc

Quote:


> Originally Posted by *smokedawg*
> 
> New AB release soon:
> link










Nice!!! Been waiting for this. GPU Tweak is horrible in comparison to AB.


----------



## hatlesschimp

Thanks


----------



## Bloitz

Quick question: I was planning to install it in my brother's rig first, since he has a 7950, to test for issues like black screening. He's running some variant of the 13.11 driver - I wouldn't need to reinstall that, or do I?
I know I have to select GeForce 500 series drivers on the nVidia website, but they're always the same drivers as for the other cards AFAIK...


----------



## Jpmboy

Quote:


> Originally Posted by *MojoW*
> 
> Did this exactly and it does say it's flashed.
> If i flash to Sapphire and back to asus i even see the vendor change in gpu-z.(most info stays blank like the screen i posted earlier)
> So i think that the first time flashing with the wrong atiflash messed something up in the bios chip.
> My usb folder does look different then yours, do you use HPUsb?(I just see atiflash and the ROM with hidden files shown)
> What else can i do?


okay - so when you select folder attributes and unhide all, *show hidden system files*, and show extensions, it looks different?

unless you entered the incorrect commands that first time, I'm not sure what happened. So - before trying to flash again, power off, shut off your PSU, then press the power-on switch. Hold it down until the fans stop spinning (or, if you can see the MB, until the MB LEDs go out). Boot via the iGPU directly to the USB key (that looks EXACTLY like the snip I posted). Reflash with the commands I posted earlier.

Sorry bud, but if this does not work - something is very wrong.


----------



## MojoW

Quote:


> Originally Posted by *Jpmboy*
> 
> okay - so when you select folder attributes and unhide all, and show extensions it looks different?


Yes don't know if it's because i'm on W8.1 but that's all i see.


----------



## iTurn

Any hints on when the custom cooler/PCB cards will be coming for the 290x?


----------



## hatlesschimp

Quote:


> Originally Posted by *iTurn*
> 
> Any hints on when the custom cooler/PCB cards will be coming for the 290x?


+1


----------



## Jpmboy

Quote:


> Originally Posted by *MojoW*
> 
> Yes don't know if it's because i'm on W8.1 but that's all i see.


see edits above this ^^^^ post


----------



## DampMonkey

Quote:


> Originally Posted by *GioV*
> 
> Who is excited for the gratuitous amount of information on Mantle that should come out tomorrow from AMD's developer summit?
> 
> http://developer.amd.com/apu/home/agenda-sessions/


I really hope they lay out some details and not just a bunch of fluff PowerPoint slides. With only a month until its release, it's about time we start hearing facts!


----------



## jamaican voodoo

got to say, after seeing all of you posting up your 290Xs and 290s, i'm super jelly right now... i'm 2 months away from getting 3 of these beasts... the wait is killing me haha!!! congrats to all who have their card already or are about to receive it









i also see promising results that i like....


----------



## MojoW

Quote:


> Originally Posted by *Jpmboy*
> 
> okay - so when you select folder attributes and unhide all, *show hidden system files* and show extensions it looks different?
> 
> unless you entered the incorrect commands that first time, I'm not sure what happened. So - before trying to flash again, power off, shut off your PSU then press the power-on switch. hold it down until fans stop spinning (or if you can see the MB, until the MB leds go out.) Boot via the iGPUU directly to the USB key (that looks EXACTLY like the snip I posted). reflash with the commands i posted earlier.
> 
> Sorry bud, but if this does not work - something is very wrong.


Yeah, all files unhidden and file extensions were already visible.
I did it exactly the same the first time; the only difference is i used the wrong atiflash.
While others have said it will not work with the older atiflash, it did work once, and that is what messed it up.
Think i can RMA this? I've only had it one day.


----------



## X-oiL

I've got a pre-order on 2 Asus 290Xs which i got 15% off (really good price). Would you guys cancel that pre-order and get 2 x 290 (non-X) for €67/$90 less?


----------



## Arizonian

Quote:


> Originally Posted by *Gilgam3sh*
> 
> @Arizonian
> 
> can you update first post? I now run the Accelero Xtreme III cooler instead of stock.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://imageshack.com/i/ns0b27j


Updated









Quote:


> Originally Posted by *fearthisneo*
> 
> Hi all, finally got my 290, going to install it here in a few minutes, will be running stock cooling till frozencpu ships my block.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *Technewbie*
> 
> Got mine in yesterday.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Also I have a question, I plan on flashing my BIOS to the ASUS one so I can get voltage control for a better OC. However I have a 650w PSU so i'm concerned if I up the voltage that it will be to much for my PSU. Does any one know if I can up the voltage and be fine? If so to what voltage would be the max?


Congrats - added too








Quote:


> Originally Posted by *Jpmboy*
> 
> beta 8 is not on the AMD site ?... where do you recommend getting it?


Wow - I just went to the OP to check whether those driver links with detailed info I've been adding since release still worked, and they only point to the newest one. *sigh*


----------



## jerrolds

I'll finally have some free time tonight to flash my 290X BIOS and hopefully overclock this thing - will atiflash work under safe mode? I have 1 usb stick i found lying around, and at 64MB it might be big enough to create a startup usb, heh.

I'll have to try it when i get home i guess.


----------



## DavidH

Quote:


> Originally Posted by *X-oiL*
> 
> I've got a pre-order on 2 Asus 290x which i got 15% off (really good price), would you guys cancel that pre-order and get 2 x 290 (non x) for 67€/90$ less?


$90 less a piece? yes.


----------



## X-oiL

Quote:


> Originally Posted by *DavidH*
> 
> $90 less a piece? yes.


Sorry, no - that's for both.


----------



## beejay

Hey Arizonian, would you mind updating me too? ^_^ My OCD is killing me. Using accelero hybrid.



I'm going to do a new mod and try fitting a thermalright r5 vrm cooler into this card.

Also, would a 750 watt PSU handle these babies in crossfire? Planning to get another one come pay day.


----------



## Technewbie

Quote:


> Originally Posted by *smokedawg*
> 
> New AB release soon:
> link


I just flashed my MSI BIOS to ASUS to get voltage control and now a new AB is coming out soon -_-


----------



## utnorris

If it is just $45 each and the wait is the same, then you might as well get the 290x's. However, that $90 almost buys you one block or another aftermarket cooler, so it really is up to you. I went with 290's and RMA'd my 290x because it was close to $200 each difference and the performance is too close to justify the extra cost for me.


----------



## Arizonian

Quote:


> Originally Posted by *beejay*
> 
> Hey Arizonian, would you mind updating me too? ^_^ My OCD is killing me. Using accelero hybrid.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I'm going to do a new mod and try fitting a thermalright r5 vrm cooler into this card.
> 
> Also, would a 750 watt PSU handle these babies in crossfire? Planning to get another one come pay day.


Updated









LOL - also now that I knew it was accurate - my OCD was killing me too.









As for 750 watts handling CrossFire - I don't think so, even without an OC. The minimum in reviews has been 800 watts, and that's without overclocking headroom and definitely not adding voltage.

I'm struggling over whether to CrossFire with an 850 watt PSU. I won't need to overclock if I go overkill, so I'm good with that for gaming as I don't need to bench them. Just waiting on non-reference to drop to make a decision.


----------



## beejay

Yeah, calculating stock max power draw, it does go past 300 watts, plus OC - that's a lot. I'll buy a new PSU, or might just use my backup 550 watt unit to power the second card.


----------



## Sainth

Hello, just got my second 290x.

got the Asus Sabertooth Z77, which has 2 PCI-E x16 3.0 slots, but my cards are running at PCI-E x8 :S


----------



## mickeykool

Getting my 290 in the mail today - have all these black screens been resolved yet?


----------



## beejay

Quote:


> Originally Posted by *Sainth*
> 
> Hello, just got my second 290x.
> 
> got the asus sabertooth z77 which got 2 pci-e x16 v3.0 slots, my cards is running at pci-e x8 :S


That's normal. With PCIe 2 unpopulated, PCIe 1 runs at x16 speed. If PCIe 2 is populated, the 16 lanes are split between PCIe 1 and 2.

Just so you know, x8 speed on PCIe 3.0 is more than enough. Also, a PCIe x16 slot is different from x16 speed - your board might have 2 x16-length slots, but that doesn't necessarily mean they both run at x16 speed.
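For a sense of why x8 on PCIe 3.0 is rarely a bottleneck, the theoretical link bandwidth can be computed from the transfer rate and encoding overhead - a quick sketch (real-world throughput is somewhat lower):

```python
# Per-lane PCIe throughput, from transfer rate and encoding overhead.
# Gens 1/2 use 8b/10b encoding (80% efficient); gen 3 uses 128b/130b.

GEN = {
    # generation: (gigatransfers/s per lane, payload bits, total bits)
    1: (2.5, 8, 10),
    2: (5.0, 8, 10),
    3: (8.0, 128, 130),
}

def bandwidth_gbs(gen, lanes):
    """Theoretical one-way bandwidth in GB/s for a PCIe link."""
    gt, payload, total = GEN[gen]
    return gt * payload / total / 8 * lanes

print(round(bandwidth_gbs(3, 8), 2))   # 3.0 x8  -> 7.88 GB/s
print(round(bandwidth_gbs(2, 16), 2))  # 2.0 x16 -> 8.0 GB/s
```

So a 3.0 x8 link carries roughly as much as a 2.0 x16 link, which is why splitting lanes between two cards on gen 3 costs so little in practice.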


----------



## DavidH

Quote:


> Originally Posted by *mickeykool*
> 
> Getting my 290 in the mail today, so all this black screens been resolved yet?


I have not had any black screen issues, seems to be hit and miss but is most likely driver related.


----------



## Jpmboy

Quote:


> Originally Posted by *beejay*
> 
> Yeah, calculating stock max power draw,, it does go past 300 watts, plus oc, that's alot. I'll buy a new PSU, or might just use my backup 550 watts as power for the second card.


just be sure to have at least 30A on the PCIe rail in that second psu. here's a great way to chain a second psu: "add2psu". google it. great product.


----------



## Jpmboy

Quote:


> Originally Posted by *Sainth*
> 
> Hello, just got my second 290x.
> 
> got the asus sabertooth z77 which got 2 pci-e x16 v3.0 slots, my cards is running at pci-e x8 :S


if you have a 37xx or 47xx series i7 then you have 3.0; sandybridge is 2.0 i believe (my 2700K certainly is)


----------



## raptor15sc

Quote:


> Originally Posted by *beejay*
> 
> Just so you know, 8x speed pcie 3.0 is more than enough.


Do you have a reputable citation for that? Also, remember that the 290X in Crossfire uses additional PCI-E bandwidth because that's how Crossfired cards connect now.


----------



## Kipsofthemud

Quote:


> Originally Posted by *raptor15sc*
> 
> Do you have a reputable citation for that? Also, remember that the 290X in Crossfire uses additional PCI-E bandwidth because that's how Crossfired cards connect now.


I think it was Linus who said "Xfire was tested in 2 pci-e *2.0* slots at *8x* and that wasn't slowed down at all."


----------



## Sainth

Quote:


> Originally Posted by *Kipsofthemud*
> 
> I think it was Linus who said "Xfire was tested in 2 pci-e *2.0* slots at *8x* and that wasn't slowed down at all.


so there should be no difference in performance with it running at PCI-E 3.0 x8 instead of PCI-E 3.0 x16?

since the 290X doesn't have any XFire bridge


----------



## X-oiL

Quote:


> Originally Posted by *utnorris*
> 
> If it is just $45 each and the wait is the same, then you might as well get the 290x's. However, that $90 almost buys you one block or another aftermarket cooler, so it really is up to you. I went with 290's and RMA'd my 290x because it was close to $200 each difference and the performance is too close to justify the extra cost for me.


Well that's the thing: the 290 is in stock right now... the 290X is without a delivery date. Leaning towards the 290 - really wanna play some games now...


----------



## Jpmboy

Quote:


> Originally Posted by *Sainth*
> 
> so it should be no diffrence in preformance with it running in PCI-E 3.0x8 instead of PCI-E 3.0x16 ?
> 
> since 290x doesnt have any xfire bridge


Doubtful you will saturate pcie3.0x8. What monitor(s) are you using? My 2 7970s at PCIe2.0 drive a 4K monitor without a problem (well... except the vram limitation in some instances).

but the question does deserve some careful testing...


----------



## givmedew

Man... I am buying one of these from amazon in case I don't like it lol!!!

So far sounds awesome!

Can you override the 290 to run the fan at 100%?

I'd like to see the difference in boost ability between the lower and higher settings. I will ultimately have it water cooled with a universal EK Supremacy and have the card's fan controller run a 3000 or 4000 RPM PWM 80mm or 120mm fan - not sure which fan yet. I have a huge box full of fans and I have all the chip heatsinks I need.


----------



## DavidH

Quote:


> Originally Posted by *X-oiL*
> 
> Well that's the thing, 290 is in stock right now..290x is without a delivery date. Leaning towards the 290, really wanna play some games now...


Just get the R9 290; there is not enough difference between the 2 to justify the extra $150.00 in cost. In fact I think it's a rip-off by comparison. I couldn't be happier with my R9 290 clocked at 1200MHz core and 6000MHz mem - it is an absolute beast, and all I needed was a $65.00 air cooler to do it.


----------



## Gilgam3sh

ok guys, I just recorded this video to show you the coil whine with the Accelero Xtreme III. First I run the fan at 20%, then 100%, and last 20% again... I will try with the molex adapter I got with the cooler and see if it helps...


----------



## evensen007

Quote:


> Originally Posted by *Jpmboy*
> 
> just be sure to have at least 30A on the pcie rail in that second psu. here's a great way to add a second "add2psu". google it. great product.


Quick question on this. I have a 3 year old Mushkin Joule 1000. It has 6 12v rails @ 19a each. There is also some weird switch on the PSU that allows you to combine the 12v rails. Would 6 12v rails @ 19a be an issue with 2 x 290's xfired?

http://www.newegg.com/Product/Product.aspx?Item=N82E16817812009


----------



## raptor15sc

Quote:


> Originally Posted by *givmedew*
> 
> Can you override the 290 to run the fan at 100%?


Easily. I recommend the program Afterburner (it has an easy option to disable ULPS too), but I'm pretty sure you can also do it with the stock AMD software.


----------



## X-oiL

Quote:


> Originally Posted by *DavidH*
> 
> Just get the R9290, there is not enough diff between the 2 to justify the extra $150.00 in cost. In fact I think its a rip off by comparison. I couldnt be happier with my R9 290 clocked at 1200mhz core and 6000mhz mem it is an absolute beast and all I needed was a $65.00 air cooler to do it.


Well, i'll be putting mine under water after Christmas - maybe the difference will still be just a few fps in games. Main concern is that gaming on a 120Hz monitor means I want a stable 120fps


----------



## mikep577

Hi, I just bought a new gaming rig (see signature) - only waiting on my GPU waterblock and motherboard.
I'm wondering if anyone has installed the EK-FC R9-290X - Nickel (Original CSQ) in CF and what kind of bridge was used to connect them...


----------



## evensen007

Quote:


> Originally Posted by *mikep577*
> 
> Hi, I just bought a new gaming rig(see signature only waiting on for my GPU waterblock and Motherboard.
> I'm wondering if anyone have installed the EK-FC R9-290X - Nickel (Original CSQ) in CF and what kind bridge was used to connect them...


Depending on your mobo spacing, you need a dual 2 slot spacing or dual 3 slot spacing bridge:

http://www.performance-pcs.com/catalog/index.php?main_page=index&cPath=59_971_1018_1038_1207&zenid=9f2e77716f17a42d010632d4bd42d032

Serial or parallel doesn't really matter (although you will hear varying opinions). I ordered the dual 3 space serial plexi bridge.


----------



## Taint3dBulge

Quote:


> Originally Posted by *Gilgam3sh*
> 
> ok guys, I just recorded this video to show you the coil whine with the Accelero Xtreme III. First I run the fan at 20%, then 100%, and last 20% again... I will try with the molex adapter I got with the cooler and see if it helps...


Ya, that sounds like voltage going through to your fan motor... It's gotta be the offset power?? Also wonder if it would go away if you laid your case on its side, so the fans don't have gravity pushing on them... I have a box fan that makes a sound "kinda" like that and i have to have it tilted about 100~110 degrees for the noise to go away. lol.


----------



## Jpmboy

Quote:


> Originally Posted by *evensen007*
> 
> Quick question on this. I have a 3 year old Mushkin Joule 1000. It has 6 12v rails @ 19a each. There is also some weird switch on the PSU that allows you to combine the 12v rails. Would 6 12v rails @ 19a be an issue with 2 x 290's xfired?
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817812009


the thing is, each card "can" draw 400W or more if you ever OC to WOT. So, if each rail connects to one PCIE cable (need to be sure of this), then two cables per card, each on an independent 19A rail, would give you 38A per card... that's a limit of 456 watts to that card before psu shutdown (overdrawing any rail will shut down the entire psu - most brands). This is why single-rail psus are a better choice.

So - that 1000W psu, with one PCIE power cable per 19A rail, feeding two cards should be fine. Or - read up on the combined-rails thing and flip the switch! Should be good to go
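The rail arithmetic can be sketched like this - a minimal example assuming each PCIe cable really does sit on its own 12 V rail (worth verifying against the PSU's label):

```python
# Multi-rail PSU sanity check: each 12 V rail trips independently, so what
# matters is the amps on the rails feeding each card, not total PSU wattage.
# The rail counts and amp ratings here are illustrative, not from a spec sheet.

def card_limit_watts(rails_per_card, amps_per_rail, volts=12.0):
    """Max watts one card can pull before tripping the rails feeding it."""
    return rails_per_card * amps_per_rail * volts

# Two PCIe cables per card, each on its own 19 A rail:
print(card_limit_watts(2, 19))  # -> 456.0 W per card
```

Note this ignores the ~75 W the PCIe slot itself supplies from the motherboard's rail, so the real per-card ceiling is a bit higher; a single-rail design sidesteps the whole exercise.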


----------



## bond32

http://koolance.com/video-card-vga-amd-radeon-r9-290x-water-block-vid-ar290x

Not sure if that's been posted yet. Anyone have experience with Koolance? I always thought they were top quality. That block is slightly more expensive, but it sure does look good... Curious about the performance...


----------



## mikep577

Quote:


> Originally Posted by *evensen007*
> 
> Depending on your mobo spacing, you need a dual 2 slot spacing or dual 3 slot spacing bridge:
> 
> http://www.performance-pcs.com/catalog/index.php?main_page=index&cPath=59_971_1018_1038_1207&zenid=9f2e77716f17a42d010632d4bd42d032
> 
> Serial or parallel doesn't really matter (although you will hear varying opinions). I ordered the dual 3 space serial plexi bridge.


Thnx, I'm using PCIe ports 1 and 3 (both PCIe 3.0 x16).



Ideally i would like to use this EK-FC quad-port bridge in plexi so the fluid is visible


----------



## sugarhell

Quote:


> Originally Posted by *Gilgam3sh*
> 
> ok guys, I just recorded this video to show you the coil whine with the Accelero Xtreme III. First I run the fan at 20%, then 100%, and last 20% again... I will try with the molex adapter I got with the cooler and see if it helps...


Coil whine: when you pass current through the VRMs, you create a magnetic field. This magnetic field can physically vibrate the coils, creating coil whine. Nothing to do with your cooler.


----------



## Gilgam3sh

Quote:


> Originally Posted by *sugarhell*
> 
> Coil whine: when you pass current through the VRMs, you create a magnetic field. This magnetic field can physically vibrate the coils, creating coil whine. Nothing to do with your cooler.


should I not have it with the stock cooler too then?


----------



## jerrolds

Can anyone give me a rundown on this "black screen" problem people are having? I've had my 290X for almost two weeks but haven't really given it a proper burn-in, and my card has Elpida memory. I was going to slap the ICY 2 cooler on it tonight, but if the black screen thing is something I should test for..

So far I get random crashes in BF4 (the only game I've really used with this card so far) when overclocked... but I just attributed it to an unstable clock.


----------



## sugarhell

Quote:


> Originally Posted by *Gilgam3sh*
> 
> should I not have it with the stock cooler too then?







coil whine


----------



## DavidH

Quote:


> Originally Posted by *sugarhell*
> 
> 
> 
> 
> 
> 
> coil whine


Now thats what coil whine sounds like.


----------



## the9quad

Quote:


> Originally Posted by *jerrolds*
> 
> Can anyone give me a run down on this "Black Screen" problem people are having? Ive had my 290X for almost 2 weeks but havent really gave it a proper burn in, and my card has Elpida memory. I was going to slap on the ICY 2 cooler on it tonight but if the black screen thing is something i should test for..
> 
> So far i get random crashes in BF4 (only game i really used with this card so far) when overclocked...but i just attributed it to an unstable clock.


When you say crash, does your monitor go black and the only way to fix it is to hard reboot? If yes, then you might want to start running other demanding games/benchmarks and see if it happens in those as well, because the above is the black screen problem. Random crashes of BF4 aren't the problem.


----------



## Sainth

Quote:


> Originally Posted by *Jpmboy*
> 
> Doubtful you will saturate pcie3.0x8. What monitor(s) are you using? My 2 7970s at PCIe2.0 drive a 4K monitor without a problem (well... except the vram limitation in some instances).
> 
> but the question does deserve some careful testing...


Don't have a 4K screen atm; running it on a 1920x1080 120Hz. When I'm benching it in Unigine Valley 1.0, the ASUS AI Suite tells me that the PCIe is running "hot" at 60 °C.

Should it automatically change to PCIe 3.0 x16 or :S


----------



## josephimports

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Nothing, was going to RMA due to all the black screen issues i had but i decided to do a full wipe of things and start fresh and seems to be holding ok so i decided to keep the card and see what happens.


I have the black screen issue and it happens randomly. During benchmarks and 30 min of BF3, no problems. Browse the internet for ten minutes and bam... black screen. Tonight I will perform a reformat and cross my fingers.


----------



## Arizonian

Are the black screens happening with 13.11 beta 8 too? It seems since 13.11 beta 9.2 this started to occur. I didn't have this issue but I never got to beta 9.2.


----------



## evensen007

Quote:


> Originally Posted by *mikep577*
> 
> Thanks, I'm using PCIe slots 1 and 3 (both PCIe 3.0 x16).
> 
> 
> 
> Ok, wait.


----------



## Gilgam3sh

Apart from the coil whine the cooler is good. Did some temp tests with the Heaven/Valley benchmark at Extreme HD for about 15 minutes, and I think it's OK


----------



## DavidH

Quote:


> Originally Posted by *Gilgam3sh*
> 
> part from the coil whine the cooler is good, did some temps test with Heaven Valley Benchmark at Extreme HD for about 15 minutes, and I think it's OK


Dude that's great, it's beating my Prolimatech by 11°C for whatever reason, and my Prolimatech is huge! I can still send it back for a refund; I'm probably gonna do that and get the Arctic instead.


----------



## Slomo4shO

Quote:


> Originally Posted by *raptor15sc*
> 
> *And what PSU do you recommend?* .



LEPA G Series G1600-MA or EVGA SuperNOVA NEX 1500


----------



## jerrolds

Quote:


> Originally Posted by *the9quad*
> 
> When you say crash, does your monitor go black and the only way to fix it is to hard reboot? If yes, then you might want to start running other demanding games/benchmarks and see if it happens in those as well, because the above is the black screen problem. Random crashes of BF4 aren't the problem.


Ummm, I believe a couple of times I had to do a hard reset... but it never happens at stock, though. And never during any benchmarks (3DMark and Heaven).

I guess I'll hold off on putting on the ICY tonight and test for the black screen with BF4/Bioshock and Heaven/Valley. It happens at stock as well?


----------



## Mr357

Quote:


> Originally Posted by *Arizonian*
> 
> Are the black screens happening with 13.11 beta 8 too? It seems since 13.11 beta 9.2 this started to occur. I didn't have this issue but I never got to beta 9.2.


I've been running Beta 8 with no issues until I got too ambitious with my overclocking (past 1700 mem) and when I used the PT3 BIOS for less than a day. I've held off on installing 9.2 since it seems to be causing issues.


----------



## kcuestag

Quote:


> Originally Posted by *Arizonian*
> 
> Are the black screens happening with 13.11 beta 8 too? It seems since 13.11 beta 9.2 this started to occur. I didn't have this issue but I never got to beta 9.2.


I was on Beta 7 (or 8? The first betas when the 290X released) when it started happening. I updated to 9.2 to see if it helped, but it was the same thing.


----------



## Jpmboy

Quote:


> Originally Posted by *Sainth*
> 
> Dont have a 4k screen atm, running it on a 1920x1080 120hz, when im benching it in unigine valley 1.0 the ASUS AI Suite tells me that the pci-e is running "hot" at 60 °C.
> 
> should it automatically change to pci-e 3.0x16 or :S


It would be helpful if you used the rigbuilder and posted your kit to your sig. This way, we know what gear you are using.

(and - I have ASUS AI Suite - where are you getting the 60C from?)

4K... looks great!!



(yes - I pulled my R9 290X/EKWB and put the SLI Titans back in. Waiting for a QDC from Koolance before putting the 290X back)


----------



## Gilgam3sh

*ACCELERO XTREME III COIL WHINE "FIX"*

OK GUYS!! The "fix" for not getting that annoying coil whine sound when not running the Xtreme III fans at 100% is to use the molex plug you get with the cooler. I run it at 7V now and it's dead quiet, temps are fine, about 35°C, and NO MORE crazy coil whine sound.
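For anyone wondering where the 7V comes from: a 4-pin molex connector carries both 12 V and 5 V lines, and the usual adapter trick is wiring the fan between them so it only sees the 7 V difference (a common approach - double-check how your specific adapter is actually wired):

```python
# Voltages a fan can see from a 4-pin molex connector.
V12, V5, GND = 12.0, 5.0, 0.0

full_speed = V12 - GND  # 12 V: fans at 100%
seven_volt = V12 - V5   # 7 V: the quiet setting used in the post above
five_volt = V5 - GND    # 5 V: quieter still, but may not start all fans

print(full_speed, seven_volt, five_volt)  # 12.0 7.0 5.0
```

The trade-off is that the fan no longer follows the card's temperature, so keep an eye on VRM temps under load.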


----------



## jerrolds

Quote:


> Originally Posted by *josephimports*
> 
> I have the black screen issue and it happens randomly. During benchmarks and 30 min of BF3, no problems. Browse the internet for ten minutes and bam...black screen. Tonight , I will be perform a reformat and cross my fingers.


Just browsing the internet and the entire screen goes black, needing a hard reset? That's weird... I think my computer was on 24/7 the last couple of days, but I didn't do any real 3D gaming, just straight Netflix/streaming with a ton of MAME-ing.

Hopefully my card doesn't have this problem. Will have to look for it tonight.


----------



## tsm106

Quote:


> Originally Posted by *Gilgam3sh*
> 
> *ACCELERO XTREME III COIL WHINE "FIX"*
> 
> OK GUYS!! The "fix" for not getting that annoying coil whine sound when not running the Xtreme III fans at 100% is to use the Molex plug you get with the cooler, I run it at 7v now and it's dead quite, temps are fine, about 35°C and NO MORE crazy coil whine sound.


Haha, are you a comedian? I'm not sure that was the intended effect?


----------



## josephimports

Quote:


> Originally Posted by *jerrolds*
> 
> Just browsing the internet and the entire screen goes black needing a hard reset? Thats weird...i think my computer was on 24/7 the last couple days but i didnt do any real 3D gaming, just straight Netflix/Streaming with a ton of mame-ing.
> 
> Hopefully my card doesnt have this problem. Will have to look for it tonight.


Yep, I downgraded to 13.11 v8 but it didn't resolve it. CPU and GPU are running stock. It will black screen randomly in 5-20 minutes just while browsing the internet. Temps are not the problem. I will report back tonight after a fresh install.


----------



## Jpmboy

Quote:


> Originally Posted by *josephimports*
> 
> Yep, I downgraded to 13.11v8 with no resolve. CPU and GPU are running stock. It will black screen randomly in 5-20 minutes just while browsing internet. Temps are not the problem. I will report back tonight after a fresh install.


I'm not sure what's causing this problem for you guys - I've not been able to replicate it. I have not experienced that at all - and believe me, I've crashed beta 9.2 very hard, many times. Maybe TSM?

But - I have only used beta 9.2 so far, and flashed back to the Asus BIOS from the PT3 BIOS, which I found to be flawed - but each rig is different, so PT3 may be fine for others.


----------



## famich

New Sapphire R9 290X BF 4 Edition
Humbly asking to be added


----------



## famich

Quote:


> Originally Posted by *Jpmboy*
> 
> I'm not sure what's causing this problem for you guys - I've not been able to replicate it. I have not experienced that at all - and believe me, I've crashed beta9.2 very hard, many times. Maybe TSM ?
> 
> But - I have only used beta 9.2 so far, and flashed back to teh asus bios from the PT3 bios, which I found to be flawed - but each rig is different so PT3 may be fine for others.


I, too, have found the PT3 BIOS to be flawed. On guru3d it looks like Afterburner beta 17 will be available with better OV possibilities - by means of directly addressing the respective registers and disabling LLC.


----------



## Gunderman456

I got this to link both r9 290 cards;

SLI/Crossfire Fitting: Swiftech Chrome G1/4 MALE-MALE SLI & CrossFireX Connector Fitting Adjustable From 41 to 65MM - $9.99 (NCIX.ca)


----------



## tsm106

Quote:


> Originally Posted by *Jpmboy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *josephimports*
> 
> Yep, I downgraded to 13.11v8 with no resolve. CPU and GPU are running stock. It will black screen randomly in 5-20 minutes *just while browsing internet*. Temps are not the problem. I will report back tonight after a fresh install.
> 
> 
> 
> I'm not sure what's causing this problem for you guys - I've not been able to replicate it. I have not experienced that at all - and believe me, I've crashed beta9.2 very hard, many times. Maybe TSM ?
> 
> But - I have only used beta 9.2 so far, and flashed back to teh asus bios from the PT3 bios, which I found to be flawed - but each rig is different so PT3 may be fine for others.
Click to expand...

Try disabling all GPU acceleration - browsers, Flash, etc.


----------



## Sainth

Quote:


> Originally Posted by *Jpmboy*
> 
> it would be helpful if you used the rigbuilder and posted your kit to your sig. this way, we know what gear you are using.
> 
> (and - i have asus ai suite - where are your getting the 60C from ?)
> 
> 4K... looks great!!
> 
> 
> 
> (yes - i pulled my r290x/EKWB and put the sli titans back in. Waiting for a QDC from Koolance before putting teh 290x back)


Will update it. I still get some crappy results in Crossfire in BF4. It's just in BF4 - does it work better with Win 8.1?

Anyone got any tricks that might help me?

I get the 60 °C on PCIE-2 under load :S


----------



## Raephen

Quote:


> Originally Posted by *tsm106*
> 
> Haha, are you a comedian? I'm not sure that was the intended affect?


I think the intended effect was getting rid of the coil whine he was experiencing when his fans on his AX III ran slower than 100% from his GPU fan controller.

I'm glad it worked for him. The 7V (roughly 40%) with that cooler should cool the card mighty fine. And seeing I'm getting the same cooler, it's good to know there's a good enough alternative, should my card experience the same whine when the 3 fans on the cooler are plugged into the card.


----------



## the9quad

Quote:


> Originally Posted by *kcuestag*
> 
> I was on Beta 7 (Or 8? The first BETA's when 290X released) when it started happening, I updated to 9.2 to see if it helped, but it was the same thing.


Can you plug in your EZ_PLUG 1 and 2 - I think it is actually just EZ_PLUG 1 - and see if that fixes the issue? I have a theory it's the PCIe slots not receiving enough power from the mobo.


----------



## kcuestag

Quote:


> Originally Posted by *the9quad*
> 
> Can you plug in your ezplug 1and 2, I think it is actually just ezplug 1, and see if that fixes the issue? I have a theory it's the pcie slots not receiving enough power from the mobo.


No I can't, I don't have spare molex connectors/cables. Plus I don't think that is the issue: I previously had 2x OC'd 7970s, and all week I've run a 290X OC'd to 1125/1500 with no issues, then this started happening even when downclocking the card.


----------



## Arizonian

Quote:


> Originally Posted by *famich*
> 
> New Sapphire R9 290X BF 4 Edition
> Humbly asking to be added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats and water blocked - added


----------



## TheSoldiet

Relaxing with 1200MHz on the STOCK COOLER! 1.38 volts in ASUS GPU Tweak. Playing BF4 like a charm!


----------



## nemm

I got hold of a 3rd Sapphire 290 today while I have the funds, either for this rig or for a future setup.

While I await the RMA card I started testing this one with some unusual findings.

1st card: Hynix(RMA) (ASIC 79%) stock bios managed 1155/[email protected], can't remember Asus bios results.

2nd card: Elpida (ASIC 74.2%) stock bios 1100/[email protected] Asus PT bios 1200/[email protected], 1225/[email protected] much voltage for little gain so reverted back to 1.394 as max.

3rd card: Elpida (ASIC 78.7%), stock BIOS I managed 1105/1450 @ 1.211v. All Asus BIOSes had weird results: Asus BIOS voltage slider at 1.410, but the max reading was 1.394 as a spike, averaging 1.207, so max overclock was now only 1080/1600. PT1 BIOS with the voltage slider set to 1450, the actual reading was still averaging 1.207, so again the previous overclock on the Sapphire BIOS was unstable. I thought it might be the memory overclock, but this did nothing for the stability, and the card is not throttling; core temp is max 68°C with VRM 53°C.

This is the first that hasn't taken well to the Asus bios so hurry up MSI AB or Trixx.

Another thing, does the Asus bios have increased memory voltage over others because I can always push the memory further compared to Sapphire bios?

*** Found out the card is actually throttling the voltage: setting 1325 in GPU Tweak with the Asus BIOS is fine, but pushing higher the card begins throttling due to high VRM temps, 70°C+. Max readings in GPU-Z were misread at first, as it was set to show avg and not max like I thought I'd set. I will wait for the RMA card to return before deciding which two I use under water, and put the other away for later.


----------



## famich

Thank you - regarding that black screen problem: for me it was definitely the 120Hz refresh rate on my BenQ 2420T, as reported over on the OCUK forum.

Even when I started to run, for example, the Heaven or Valley benchmark, it all of a sudden happened, just for a few seconds: black screen, then the picture, and so on.
After setting the RR to 60 Hz it has disappeared completely.


----------



## tsm106

Quote:


> Originally Posted by *famich*
> 
> Thank you- regarding that black screen problem - to me it was definitely the 120Hz refresh rate on my Benq2420T , as reported over on OCUK forum.
> 
> Even when I started to run for example Heaven or Valley benchmark, it all of sudden happened , just for a few seconds : black screen, then the picture and so on.
> After setting the RR to 60 Hz it has disappeared completely.


I've seen that before. A friend gets many more blkscrns with his 120hz panel than with his 60hz panel. It's got to be tied to the outputs, more stress etc? I would really love that beta 17 to be released any minute now so we can test the relationship to memory/imc and the addition of volts.


----------



## famich

Spoiler: Warning: Spoiler!






Can't agree more, it really looks to be tied to the DVI/120Hz "problem". Definitely a driver bug, not a reason for RMA - my card is capable of [email protected] in Heaven/FS Extreme - a good chip.


----------



## outofmyheadyo

I haven't really read through this thread, but how much of an OC can one hope to get on a 290 non-X card on water? Currently I am in between ordering myself a 290 non-X or just ordering a waterblock for my 780, which does 1300 on air.


----------



## Bull56

Can anyone send me the PT3 BIOS version?

I cannot find it anymore


----------



## jerrolds

Quote:


> Originally Posted by *famich*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Can't agree more, it really looks to be tied to the DVI/120Hz "problem". Definitely a driver bug, not a reason for RMA - my card is capable of [email protected] in Heaven/FS Extreme - a good chip.


Quote:


> Originally Posted by *famich*
> 
> Thank you- regarding that black screen problem - to me it was definitely the 120Hz refresh rate on my Benq2420T , as reported over on OCUK forum.
> 
> Even when I started to run for example Heaven or Valley benchmark, it all of sudden happened , just for a few seconds : black screen, then the picture and so on.
> After setting the RR to 60 Hz it has disappeared completely.


Oh weird... I don't think I've experienced the black screen bug, but I am using a 120Hz 1440p QNIX - though I'm using ToastyX Pixel Patcher and Custom Resolution Utility to set refresh rates.

I wonder: if you guys use CRU instead of CCC or whatever to set your refresh, it might just fix it?


----------



## famich

I think 1180-1200 is doable; let's see with MSI AB 17 beta...


----------



## famich

Quote:


> Originally Posted by *Bull56*
> 
> Anyone can send me the PT3 BIOS Version?
> 
> I cannot find it anymore


----------



## famich

Quote:


> Originally Posted by *famich*


Here

OC BIOS PT1 PT 3.zip 320k .zip file


----------



## Forceman

Quote:


> Originally Posted by *outofmyheadyo*
> 
> I havent really read through this thread, but how much of an OC can one hope to get on 290 non X card on water ? Currently I am inbetween ordering myself a 290 non x or just ordering a waterblock for my 780 wich does 1300 on air.


Without voltage control, probably 1150. With voltage, maybe 1250? Not many voltage tweaked results right now.


----------



## Bull56

My cards do 1245MHz in Quad Crossfire @ 1,4V!
But this only ran once; after that it always crashed, or only the first GPU got the overvolt.

Cheers


----------



## utnorris

Quote:


> Originally Posted by *Arizonian*
> 
> Updated
> 
> 
> 
> 
> 
> 
> 
> 
> 
> LOL - also now that I knew it was accurate - my OCD was killing me too.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As for 750 watts handling crossfire - I don't think so, even without an OC. Minimum in reviews has been 800 watts and that's without over clocking head room and definitely not adding voltage.
> 
> I'm struggling over to crossfire with an 850 watt PSU. I won't need to over clock if I go over kill so I'm good with that for gaming as I don't need to bench them. Just waiting on non-reference to drop to make a decision.


I am running two 290's in CF on a Seasonic Platinum 860 and have not gone above 700 watts with the GPU's and CPU overclocked. Personally I think you would be fine, but I do agree as we add voltage you may start pushing the limits of the PSU.

Quote:


> Originally Posted by *mickeykool*
> 
> Getting my 290 in the mail today, so all this black screens been resolved yet?


No issues here with the blackscreens and I am running two 290's.
Quote:


> Originally Posted by *X-oiL*
> 
> Well that's the thing, 290 is in stock right now..290x is without a delivery date. Leaning towards the 290, really wanna play some games now...


Then I would definitely get the 290's. Not enough difference that you will notice.


----------



## Sainth

Quote:


> Originally Posted by *utnorris*
> 
> I am running two 290's in CF on a Seasonic Platinum 860 and have not gone above 700 watts with the GPU's and CPU overclocked. Personally I think you would be fine, but I do agree as we add voltage you may start pushing the limits of the PSU.
> No issues here with the blackscreens and I am running two 290's.
> Then I would definitely get the 290's. Not enough difference that you will notice.


You got any problems in BF4 with Crossfire?


----------



## utnorris

Quote:


> Originally Posted by *bond32*
> 
> http://koolance.com/video-card-vga-amd-radeon-r9-290x-water-block-vid-ar290x
> 
> Not sure if that's been posted yet. Anyone have experience with Koolance? I always thought they were top quality. That block is slightly more expensive but it sure does look good... Curious of the performance...


The Koolance is nice but does not appear to actively cool the VRM's like the EK does.
Quote:


> Originally Posted by *outofmyheadyo*
> 
> I havent really read through this thread, but how much of an OC can one hope to get on 290 non X card on water ? Currently I am inbetween ordering myself a 290 non x or just ordering a waterblock for my 780 wich does 1300 on air.


If you already have a 780 I would probably just go with a water block as the performance will be close. Of course if you just want something new to play with, well....


----------



## DampMonkey

Quote:


> Originally Posted by *utnorris*
> 
> The Koolance is nice but does not appear to actively cool the VRM's like the EK does.


From that picture it looks to me like it in fact does cool the VRMs - a very similar block design to the EK.


----------



## outofmyheadyo

Quote:


> Originally Posted by *utnorris*
> 
> The Koolance is nice but does not appear to actively cool the VRM's like the EK does.
> If you already have a 780 I would probably just go with a water block as the performance will be close. Of course if you just want something new to play with, well....


I could play with something new and save 50€; I can sell the 780 for 400€ and buy a new 290 for 350.

Also, I'm pretty sure the 290 isn't slower than a 780 even on air, but the thing I worry about is the AMD drivers - they give me the chills


----------



## utnorris

Quote:


> Originally Posted by *Sainth*
> 
> U got any problems in BF4 with crossfire?


Haven't tried yet. Every time I launch BF4 it launches in a browser-windowed mode, and I have not taken the time to figure out how to make it full screen even though the settings have it set that way.




Arizonian,

Can you update me to 2 x 290 with water blocks?

So far without any additional voltage I have benched at 1150Mhz (not remotely stable, but benchable) and this was my score in 3DMark11:

http://www.3dmark.com/3dm11/7479195

P20921

Graphics Score
30921

Physics Score
11357

Combined Score
9677

Running at 1125Mhz/1475Mhz, still artifacts, but benchable.

http://www.3dmark.com/3dm11/7479233

P20636

Graphics Score
31056

Physics Score
10759

Combined Score
9647

Xtreme

http://www.3dmark.com/3dm11/7479267

X8917

Graphics Score
8651

Physics Score
11330

Combined Score
9231

Vantage

P53989

Graphics Score
69614

CPU Score
32264

I get the slightest coil whine while benching, but you can barely hear it even though I am under water. I am really glad I decided to send the 290X back and get these instead. The 290X's will probably be better, but mine was not very good, so I am coming out ahead this way. One of my cards has Elpida RAM and an ASIC of 69.9%, while on the other one I forgot to check the memory; it has an ASIC score of 78%. I personally think that the lower-ASIC ones are clocking better than the higher ones, but that is just what I am seeing from my experience.


----------



## rdr09

Quote:


> Originally Posted by *outofmyheadyo*
> 
> I havent really read through this thread, but how much of an OC can one hope to get on 290 non X card on water ? Currently I am inbetween ordering myself a 290 non x or just ordering a waterblock for my 780 wich does 1300 on air.


I am going to shoot for 1200. I haven't tried higher on air, just 1115/1400 without any voltage control.

http://www.3dmark.com/3dm11/7478595

Winter water works wonders. 1200 should match your 1300. If you can wait, my block should be here on the 14th.


----------



## utnorris

Quote:


> Originally Posted by *outofmyheadyo*
> 
> I could play with something new, and save 50€, I can sell the 780 for 400€ and buy a new 290 for 350
> 
> 
> 
> 
> 
> 
> 
> 
> Also im pretty sure the 290 aint slower even on air then a 780, but the thing i worry about are the AMD drivers they give me the chills


If you can get a 290 for $350 then jump on it, I got mine for $380 each and thought that was good.


----------



## outofmyheadyo

Quote:


> Originally Posted by *utnorris*
> 
> If you can get a 290 for $350 then jump on it, I got mine for $380 each and thought that was good.


I'm talking about euros here, 1 EUR = 0.75$, so my price is around 470$, but I guess you can't compare it like that


----------



## Jpmboy

Quote:


> Originally Posted by *DampMonkey*
> 
> From that picture it looks to me like it in fact does cool the VRMs - a very similar block design to the EK.


I agree. The area in red is the VRM contact. With my EK block I've never seen above 54°C on the VRMs (GPU-Z) in any setting (gaming, or benches like Valley etc.). Put a tiny amount of TIM on the VRMs and waterblock, and use the supplied pad (or Fujipoly, which is better).


----------



## utnorris

Quote:


> Originally Posted by *DampMonkey*
> 
> From that picture it looks to me like it in fact does cool the VRMs - a very similar block design to the EK.


If you look at the back, there does not appear to be any channel going from the main section to the VRM section, but it's possible.


----------



## TomiKazi

The price of available 290's seems to be 370 euros here in the Netherlands. 290X's go for 510 euros, but all are out of stock... though I've seen readily available XFX BF4 290X's somewhere for 560+.


----------



## rdr09

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Im talking about euros here 1eur = 0.75$ so my price is around 470$, but i quess you cant compare it like that


isn't it $1.25 = 1eur? lemme check.


----------



## outofmyheadyo

Quote:


> Originally Posted by *TomiKazi*
> 
> The price of available 290's seems to be 370 euro's here in the Netherlands. 290x's go for 510 euro's but all out of stock.., though I've seen readily available XFX BF4 290x's somewhere for 560+.


Perhaps it's just me, but the *TINY* difference in performance between the 290 and X is not really worth the 150/200€.


----------



## TomiKazi

Quote:


> Originally Posted by *rdr09*
> 
> isn't it $1.25 = 1eur? lemme check.


I think that's what he meant but mixed it up by mistake








Quote:


> Originally Posted by *outofmyheadyo*
> 
> perhaps its just me but the *TINY* difference in performance between the 290 and X is not really worth the 150/200€.


Well that really seems to be the general thought in this thread. Perhaps not a bad one.


----------



## outofmyheadyo

1 EUR = 1.34293 USD, 1 USD = 0.744638 EUR
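For reference, the arithmetic at those rates (a trivial sketch using the quoted 1.34293 figure):

```python
# EUR -> USD at the rate quoted above (1 EUR = 1.34293 USD).
EUR_TO_USD = 1.34293

def eur_to_usd(eur: float) -> float:
    """Convert euros to US dollars at the fixed rate above."""
    return eur * EUR_TO_USD

print(round(eur_to_usd(350)))  # a 350 EUR card is roughly 470 USD
```

This matches the ~470$ figure mentioned earlier: the mistake was applying the USD-to-EUR factor (0.75) in the wrong direction.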


----------



## DampMonkey

Quote:


> Originally Posted by *utnorris*
> 
> If you look at the back there does not appear to be any channel going from the main to the VRM section, but it's possibly.


The stainless steel plate on the top of the block is the channel, just like the EK block


----------



## Bull56

Is there any way to edit the 290x BIOS with an editor?
I want to set the clocks and the voltage manually...

It really stinks to use the standard clocks...

Cheers


----------



## flopper

Quote:


> Originally Posted by *outofmyheadyo*
> 
> perhaps its just me but the *TINY* difference in performance between the 290 and X is not really worth the 150/200€.


290, OC it, and call it a day.
Unless you do Eyefinity big time - then 2x 290


----------



## Falkentyne

I tested 100 Hz (LightBoost), 120 Hz (LB) and 121 Hz (120.5 Hz non-LB) on my VG248QE and got no black screens in Valley at all (haven't been playing BF4). Is this just an issue with the BenQ? Monitor type shouldn't matter at all. Has anyone tried switching to the other DVI port or DP, if it's only happening at 120 Hz for some people?


----------



## rdr09

Quote:


> Originally Posted by *outofmyheadyo*
> 
> perhaps its just me but the *TINY* difference in performance between the 290 and X is not really worth the 150/200€.


gaming, yes, any of the two. benching - 290X.


----------



## Maxxa

I'm going to be joining the club soon, but I need to know if the Accelero Xtreme Plus II will fit an R9 290. I know all about the number of heatsinks for the RAM chips - I've got lots - I just need to know from someone who has tried mounting it or knows what has been changed on the revised Xtreme III.


----------



## Arm3nian

Quote:


> Originally Posted by *outofmyheadyo*
> 
> perhaps its just me but the *TINY* difference in performance between the 290 and X is not really worth the 150/200€.


Worth is relative. The correct way of putting it would be to say that the 290 has a better price to performance ratio than the 290x.


----------



## X-oiL

Quote:


> Originally Posted by *utnorris*
> 
> Then I would definitely get the 290's. Not enough difference that you will notice.


Said and done!


----------



## bpmcleod

Anyone have any idea how two 290s will crossfire together if one is flashed to a 290X BIOS and the other is on a 290 BIOS? I have a PowerColor 290 on the Asus 290X BIOS and am about to buy an Asus 290 and probably leave it on the stock BIOS. I might end up putting them both on PT3 once I get blocks for them, but for now I'm wondering how well they will crossfire - since it is over the PCIe slot now, would it matter?


----------



## Raephen

Quote:


> Originally Posted by *Maxxa*
> 
> I'm going to be joining the club soon but I need to know if the Accelero Xtreme Plus II will fit a R9 290, I know all about the number of heatsink for the ram chips I got lots just need to know from someone who has tried mounting it or knows what has been changed on the revised Xtreme III.


Your question had me intrigued, so I did a quick Google image search for both coolers.

They look nearly identical save for one hole (well, one hole, four times) in the mounting plate, and since for the 290(X) you'd need one of the three outer holes, I would assume it's a safe bet that the AXP II could fit.

Maybe someone who's already installed the AX III could tell you which holes / hole spacing you'd need. There are one or two in this thread...


----------



## Pfortunato

So guys, I need some advice. I'm thinking of upgrading my GPU to an R9 290 like I said before, but the thing that is killing me is that the only version available is the reference one, and I've got two options:

1: buy a Gigabyte R9 290 and put an Arctic Cooling Xtreme III or a Gelid Icy Vision Rev. 2 on it (around 400 with the AC, around 390 with the Gelid)

2: buy a Gigabyte R9 290 and, when the aftermarket versions come out, sell my R9 290 and buy the aftermarket version

edit: any news about the new game bundles from AMD?

Cheers :b


----------



## Maxxa

Quote:


> Originally Posted by *Raephen*
> 
> Your question had me intrigued, so I did a quick google on images for both coolers.
> 
> The look nearly identical save for 1 (well 1 hole, 4 times) hole in the mounting plate, and since for the 290(x) you'd need one of the 3 outer holes, I would assume it's a safe bet that the AXpII could fit.
> 
> Maybe someone who's already installed the AX III could tell you which holes / hole spacing youd need. There are a one or two in this thread...


I thought about it a bit, and I can't imagine they would discontinue the Xtreme Plus II without it fitting the older cards too, so it should work. The only one I know is different is the one for the 7970, because the chip is rotated 45 degrees on the PCB.


----------



## bpmcleod

Quote:


> Originally Posted by *Maxxa*
> 
> I thought about it a bit, and I can't imagine they would discontinue the Xtreme Plus II without it fitting the older cards too, so it should work. The only one I know is different is the one for the 7970, because the chip is rotated 45 degrees on the PCB.


I think it has already been confirmed that it fits? It was a lot earlier in the thread though.


----------



## Maxxa

Quote:


> Originally Posted by *bpmcleod*
> 
> I think it has already been confirmed that it fits? It was a lot earlier in the thread though.


I've been googling and searching OCN for a couple of days now; it's a pita because all the versions have pretty much the same name. Trust me, the last thing I want to do is anger the thread police.


----------



## Arizonian

Quote:


> Originally Posted by *utnorris*
> 
> Haven't tried yet. Every time I launch BF4 it does it in a browser window mode and I have not taken the time to figure out how to make it full screen even though the settings have it set that way.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Arizona,
> 
> Can you update me to 2 x 290 with water blocks?
> 
> So far without any additional voltage I have benched at 1150Mhz (not remotely stable, but benchable) and this was my score in 3DMark11:
> 
> http://www.3dmark.com/3dm11/7479195
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> P20921
> 
> Graphics Score
> 30921
> 
> Physics Score
> 11357
> 
> Combined Score
> 9677
> 
> Running at 1125Mhz/1475Mhz, still artifacts, but benchable.
> 
> http://www.3dmark.com/3dm11/7479233
> 
> P20636
> 
> Graphics Score
> 31056
> 
> Physics Score
> 10759
> 
> Combined Score
> 9647
> 
> Xtreme
> 
> http://www.3dmark.com/3dm11/7479267
> 
> X8917
> 
> Graphics Score
> 8651
> 
> Physics Score
> 11330
> 
> Combined Score
> 9231
> 
> Vantage
> 
> P53989
> 
> Graphics Score
> 69614
> 
> CPU Score
> 32264
> 
> 
> 
> I get the slightest coil whine while benching, but you can barely hear it even though I am under water. I am really glad I decided to send the 290X back and get these instead. The 290Xs will probably be better, but mine was not very good, so I am coming out ahead this way. One of my cards has Elpida RAM and an ASIC of 69.9%, while the other one (I forgot to check the memory) has an ASIC score of 78%. I personally think the lower-ASIC ones are clocking better than the higher ones, but that is just what I am seeing from my experience.


Updated - congrats


----------



## Raephen

Quote:


> Originally Posted by *Pfortunato*
> 
> So guys, I need some advice. I'm thinking of upgrading my GPU to an R9 290 like I said before, but the thing that's killing me is that the only version available is the reference one, so I've got two options:
> 
> 1: buy a Gigabyte R9 290 and put an Arctic Cooling Xtreme III or a Gelid Icy Vision Rev 2 on it (around 400 with the AC, around 390 with the Gelid)
> 
> 2: buy a Gigabyte R9 290 and, when the aftermarket versions come out, sell it and buy an aftermarket version
> 
> edit: any news about the new game bundles from AMD?
> 
> Cheers :b


If you want to invest in an aftermarket version, I would suggest waiting just that bit longer (it shouldn't be much longer; AMD's partners would go bonkers if they missed out on holiday sales!)

But a reference model would be a good choice, too. I'm waiting for an Accelero Xtreme III to arrive, which I hope to fit to my GPU. If I get coil whine because of it, like Gilgam3sh did, I'll use the 7V molex plug. Or a fan controller.
Which is what you would do with the Gelid Icy Vision r2: either plug the fans straight into the PSU, or use a fan controller.

Either option is good enough, it seems, and both options have their pros and cons. A pro for the AX III would be the looks (I like it) and the - with luck - option of allowing the GPU control of the fans. A con would have to be the size and weight.

For the Icy Vision r2, the pros would certainly be its price and the smaller size. The biggest con would be the standard 3-pin fans.

But if Gelid's Icy Vision r2 works on a 290-series card, there is no reason a cooler like Arctic's Twin Turbo II wouldn't. I've used one on an HD 7870 and see no reason why it shouldn't fit. It's just like the AX III, but smaller and about 200 grams lighter.

Sorry if that made your choice even harder, but at the end of the day it's up to you. Both of the choices you've given do the trick; it's just a question of how much you want to spend.


----------



## stilllogicz

It's unbelievable, after reading through some of the latest posts and looking at benchmarks, how close the 290 is to the 290X. Hell, for the price of the 290X I can get a waterblock + a 290. Amazing. Now, is Asus still the best to get for voltage control, or are there others?

On a PSU note, would 1200w be enough for 3x 290 and a 4730k, all components OC'd?


----------



## SSJVegeta

Should I sell my 7950 crossfire setup for a single R9 290? I game on a 1440p Qnix monitor.


----------



## shilka

Quote:


> Originally Posted by *stilllogicz*
> 
> It's unbelievable, after reading through some of the latest posts and looking at benchmarks, how close the 290 is to the 290X. Hell, for the price of the 290X I can get a waterblock + a 290. Amazing. Now, is Asus still the best to get for voltage control, or are there others?
> 
> On a PSU note, would 1200w be enough for 3x 290 and a 4730k, all components OC'd?


1000 watts could do it, unless you overvolt your cards.

And no, overclocking and overvolting are not the same thing.

If you are thinking of the AX1200i, look elsewhere.


----------



## Jpmboy

Quote:


> Originally Posted by *stilllogicz*
> 
> It's unbelievable, after reading through some of the latest posts and looking at benchmarks, how close the 290 is to the 290X. Hell, for the price of the 290X I can get a waterblock + a 290. Amazing. Now, is Asus still the best to get for voltage control, or are there others?
> 
> On a PSU note, would 1200w be enough for 3x 290 and a 4730k, all components OC'd?


Probably not if you overclock [yeah - "overvolt"] the kit. I'd recommend two PSUs of at least 1000W each - single rail - plus an "add2psu" adapter.
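To put some rough numbers behind that caution, here's a back-of-the-envelope power budget. The draw figures are ballpark assumptions (roughly reference-card board power plus an overclocking margin), not measurements:

```python
# Rough power-budget sketch for 3x R9 290 + an overclocked quad-core i7.
# All figures below are ballpark assumptions, not measured values.
GPU_DRAW_STOCK = 275      # W per R9 290, roughly stock board power (assumption)
OC_OVERHEAD = 1.3         # ~30% extra when overclocked/overvolted (assumption)
CPU_DRAW_OC = 150         # W for an overclocked quad-core i7 (assumption)
REST_OF_SYSTEM = 75       # W for board, drives, fans, pump (assumption)

total = 3 * GPU_DRAW_STOCK * OC_OVERHEAD + CPU_DRAW_OC + REST_OF_SYSTEM
headroom = 1200 - total
print(f"Estimated draw: {total:.0f} W, headroom on a 1200 W unit: {headroom:.0f} W")
```

Under those assumptions the estimate already overshoots 1200 W, which is why two units (or a single much larger one) is the safer call for overvolted trifire.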


----------



## Jpmboy

Quote:


> Originally Posted by *SSJVegeta*
> 
> Should I sell my 7950 crossfire setup for a single R9 290? I game on a 1440p Qnix monitor.


see this post:

http://www.overclock.net/t/1363440/nvidia-geforce-gtx-titan-owners-club/18560#post_21175411


----------



## shilka

Quote:


> Originally Posted by *Jpmboy*
> 
> probably not if you overclock the kit. I'd recommend 2 PSUs of at least 1000W - and single rail.


Cooler Master V1000 and EVGA SuperNova G2/P2

are what I recommend.

Found out an interesting fact today: the Seasonic Platinum is not a single-rail PSU, it's multi-rail.


----------



## Jpmboy

Quote:


> Originally Posted by *shilka*
> 
> Cooler Master V1000 and EVGA SuperNova G2/P2
> 
> are what I recommend.
> 
> Found out an interesting fact today: the Seasonic Platinum is not a single-rail PSU, it's multi-rail.


Eh - didn't know that! I'm running a PC Power 1200 and a Seasonic 1050. I was shutting down the PCP&C 1200 in some benches with just two (heavily) OC'd Titans.


----------



## Taint3dBulge

LOL. had to try it.


----------



## utnorris

Quote:


> Originally Posted by *bpmcleod*
> 
> Anyone have any ideas on how well two 290s will crossfire together if one is flashed to a 290X BIOS and the other is on a 290 BIOS? I have a PowerColor 290 on the Asus 290X BIOS and am about to buy an Asus 290, which I'll probably leave on the stock BIOS. I might end up putting them both on PT3 once I get blocks for them, but for now I'm wondering how well they will crossfire. Since it's over the PCIe slot now, would it matter?


I ran a 290x and a 290 together last night and earlier today and no issues. They will run at their respective speeds or you can use AB to sync their speeds, up to you.


----------



## Arizonian

Some News - You'll need Google Translate









*Asus top models of Radeon R9 290/290X gets most advanced circuit board so far*
Quote:


> Asus has developed its most complex circuit board so far for a single-GPU card. Although the DirectCU II cooler has ample dimensions, the PCB for the Radeon R9 290 cards is expected to be as long as the cooler. The new PCB design also brings greatly *enhanced power delivery*, which should further improve the card's performance when overclocking. Asus' DirectCU II model of the GeForce GTX 780 has an 8-layer PCB; the DirectCU II model of the *Radeon R9 290X gets a full 12-layer PCB*.


----------



## iPDrop

Quote:


> Originally Posted by *Arizonian*
> 
> Some News - You'll need Google Translate
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Asus top models of Radeon R9 290/290X gets most advanced circuit board so far*


Nice, I think I might replace my reference cards with these. The two I've got are giving me really bad crossfire performance in BF4; I wonder if the next two will do the same thing.


----------



## Duvar

Hi guys, I discovered something in our German forums:

1. Download the DDU driver uninstaller and boot into safe mode, run the program there as admin and then reboot
2. Flash the newest BIOS (techpowerup v 015.039.000.007.000000) (he has an XFX 290X)
3. Don't install the beta driver - install the unofficial WHQL 13.11 (you can find it here http://www.guru3d.com/files_details/amd_catalyst_13_11_whql_download.html)
4. For BF4 players: you should update DirectX in the BF4 folder; there should be a folder named installer - run that.

He said he always had black screens etc., and after these steps he could play all night long.
I hope it helps you guys, and sorry for my English - I'm from Germany too.


----------



## VSG

Unless I am mistaken, that PCB has crossfire bridges that seem active. Also, the weird VRM heatsink "maze" might be a deterrent to water cooling. So if you guys are planning to water cool, I would still stick with the reference cards until we know more. I am sure EK will have blocks for this but I will need convincing about that VRM section.


----------



## VSG

Never mind, I noticed that PCB was for the 7970 Matrix. Thanks for that, Nordic Hardware!


----------



## BigGuns73

Guys, after flashing my R9 290 BIOS to an R9 290X BIOS, GPU-Z now shows that I have the full 2816 shaders. Is this info correct? I didn't think the shaders unlocked, right?


----------



## petedread

Anybody using TrueAudio? Is it any good?


----------



## rdr09

Quote:


> Originally Posted by *BigGuns73*
> 
> Guys after flashing my R9 290 bios to a R9 290X bios GPUZ now shows that I have the full 2816 shaders. Is this info correct? I didnt think the shaders unlocked right?


do you mind running a bench? 3DMark11.


----------



## BigGuns73

Quote:


> Originally Posted by *rdr09*
> 
> do you mind running a bench? 3DMark11.


I have valley installed, is that okay?


----------



## skupples

Quote:


> Originally Posted by *BigGuns73*
> 
> Guys after flashing my R9 290 bios to a R9 290X bios GPUZ now shows that I have the full 2816 shaders. Is this info correct? I didnt think the shaders unlocked right?
> 
> 
> 
> Spoiler: Warning: Spoiler!


It's likely a misread, since you are now on a BIOS for the 290X... Nowadays companies physically laser off the connection to the extra cores... normally... I haven't heard of being able to do this with a simple flash in a looong time.


----------



## SonDa5

Quote:


> Originally Posted by *BigGuns73*
> 
> Guys after flashing my R9 290 bios to a R9 290X bios GPUZ now shows that I have the full 2816 shaders. Is this info correct? I didnt think the shaders unlocked right?


What brand is your 290?

I read that the card couldn't be flashed to a 290X, but maybe it can. We'd have to do some testing to confirm. It could be a glitch in GPU-Z. What is your ASIC score?


----------



## BigGuns73

Quote:


> Originally Posted by *skupples*
> 
> It's likely a misread, since you are now on a BIOS for the 290X... Nowadays companies physically laser off the extra cores... normally... I haven't heard of being able to do this with a simple flash in a looong time.


That is what I was thinking, one could only hope!


----------



## tsm106

Quote:


> Originally Posted by *Jpmboy*
> 
> Eh - didn't know that! I'm running a PC Power 1200 and a Seasonic 1050. I was shutting down the PCP&C 1200 in some benches with just two (heavily) OC'd Titans.


Yeap. I pulled 900W+ off two bone-stock 290Xs and an OC'd 3930K. There was nothing else in the rig - no loop, fans, drives, accessories. Trifire on a 1000W PSU = haha.


----------



## BigGuns73

Quote:


> Originally Posted by *SonDa5*
> 
> What brand is your 290?
> 
> I read the card could not possibly be flashed to 290x but maybe it does. Have to do some testing to confirm. Could be a glitch with GPU-Z. What is your ASIC score?


PowerColor, 73.1 ASIC, and yes, it can be flashed to a 290X BIOS - lots of people have done so.


----------



## Arizonian

Quote:


> Originally Posted by *BigGuns73*
> 
> Guys after flashing my R9 290 bios to a R9 290X bios GPUZ now shows that I have the full 2816 shaders. Is this info correct? I didnt think the shaders unlocked right?
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> .


Welcome to overclock.net and congrats on your first post. Be sure to submit a picture with your OCN name on it and I'll add you to our club roster.

Well, from what we understand these boards are laser-cut, though I don't think we ever confirmed that. If they are, then it's physically impossible to unlock the shaders. What we're most likely looking at is the BIOS being fooled.

I'm pretty sure AMD learned from their 6950 / 6970 fiasco.


----------



## rdr09

Quote:


> Originally Posted by *BigGuns73*
> 
> I have valley installed, is that okay?


sure. nah, Valley shows less info; 3DMark or 3DMark11 are better. but no big deal if you can't.


----------



## skupples

Quote:


> Originally Posted by *BigGuns73*
> 
> I have valley installed, is that okay?


We did see one or two 780s with all 2688 cores (the Titan count)... So it is possible, though highly unlikely, that you got a 290 that didn't actually have its cores properly disabled. Some bench comparisons should give you the answer... though the two cards are so close in scoring, idk how you would be able to tell yes or no.


----------



## SonDa5

Quote:


> Originally Posted by *BigGuns73*
> 
> PowerColor, 73.1 ASIC, and yes, it can be flashed to a 290X BIOS - lots of people have done so.


I know it can be flashed to a 290X BIOS, but based on what I have read, I didn't think it would unlock all the shaders that the 290X has.


----------



## BigGuns73

Quote:


> Originally Posted by *Arizonian*
> 
> Welcome to overclock.net and congrats on your first post. Be sure to submit a picture with your OCN name on it and I'll add you to our club roster.
> 
> Well, from what we understand these boards are laser-cut, though I don't think we ever confirmed that. If they are, then it's physically impossible to unlock the shaders. What we're most likely looking at is the BIOS being fooled.
> 
> I'm pretty sure AMD learned from their 6950 / 6970 fiasco.


Hmm, surely others with flashed R9 290s can chime in with GPU-Z's readings for them?


----------



## SonDa5

Quote:


> Originally Posted by *BigGuns73*
> 
> Hmm, surely others with flashed R9 290's can chime in on GPUZ's reading for them?


Could be a GPU-Z glitch.


----------



## BigGuns73

Quote:


> Originally Posted by *skupples*
> 
> We did see one or two 780's with all 2680 cores (titan)... So, it is possible, though highly unlikely that you got a 290 that didn't actually have it's cores properly disabled. Some bench comparisons should give you the answer... Though, the two cards are so close in scoring idk how you would be able to tell yes or no.


Guess I could flip the BIOS switch and check with the other one that's not flashed?


----------



## SonDa5

Quote:


> Originally Posted by *BigGuns73*
> 
> guess i could flip the bios switch and check with the other one thats not flashed?


The other BIOS should show 290 shader count.


----------



## BigGuns73

Quote:


> Originally Posted by *SonDa5*
> 
> The other BIOS should show 290 shader count.


Crap, yeah, I guess so....

So others here who have done the flash need to chime in - I know it's not just me.


----------



## ImJJames

What you waiting for lol


----------



## rv8000

Does the ASUS r9 290 bios support voltage changing in GPU Tweak or just the ASUS 290x bios?


----------



## BigGuns73

Quote:


> Originally Posted by *ImJJames*
> 
> What you waiting for lol


Okay, I guess I will try the Valley demo with the stock BIOS at 1000MHz core and 1250MHz RAM, then do the R9 290X BIOS at the same clock speeds and compare.


----------



## beejay

Going to make a single large VRM heatsink later. I'll bond three or four heatsinks together using thermal epoxy, then stick that to the VRM using Akasa adhesive. It's hard to find a single large VRM heatsink in Singapore.

Still hoping I can get a Thermalright VRM heatsink, but no dice.


----------



## ImJJames

Quote:


> Originally Posted by *BigGuns73*
> 
> okay i guess i will try the valley demo with stock bios at 1000mhz core and 1250mhz ram and then do the r9 290x bios at the same clock speeds and compare.


Hurry im getting anxious


----------



## BigGuns73

Well, interesting: 56.9fps on the R9 290X BIOS and 54.6fps on the R9 290 BIOS. Ran it a couple of times and it's still a 2-3fps difference between the two.









Also, the card seems to run a few degrees cooler with the R9 290 BIOS, despite both BIOSes running at 1000MHz core and 1250MHz RAM.
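For what it's worth, the gap works out like this (a quick arithmetic sketch using the fps figures reported above):

```python
# Relative uplift between the two Valley runs reported above.
fps_290x_bios = 56.9   # Valley, R9 290X BIOS, 1000/1250
fps_290_bios = 54.6    # Valley, R9 290 BIOS, 1000/1250
gain = fps_290x_bios / fps_290_bios - 1
print(f"Uplift from the 290X BIOS at matched clocks: {gain:.1%}")  # ~4.2%
```

A ~4% gap at identical clocks is small enough that BIOS-level differences (memory timings, power limits) could plausibly explain it without any extra shaders.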


----------



## utnorris

How friggin' sweet would that be if they used 290X boards for 290s because they were worried about supply, and now they unlock?


----------



## BigGuns73

Quote:


> Originally Posted by *utnorris*
> 
> How friggin sweet would that be if they used 290x boards for 290's because they were worried about supply and now they unlock?


Well, something is going on - I score higher with the R9 290X BIOS, LOL! This would be sweet if I really have an unlocked R9 290X on my hands here!


----------



## evensen007

Pics or it didn't happen. No offense, it just seems highly unlikely.


----------



## BigGuns73

Quote:


> Originally Posted by *evensen007*
> 
> Pics or it didn't happen. No offense, it just seems highly unlikely.


omg, hold on. i will re run and post it. lol.


----------



## rdr09

Quote:


> Originally Posted by *BigGuns73*
> 
> omg, hold on. i will re run and post it. lol.


http://www.techpowerup.com/downloads/Benchmarking/Futuremark/

run 3DMark11, please. use the latest version. thanks.


----------



## Arizonian

Quote:


> Originally Posted by *BigGuns73*
> 
> omg, hold on. i will re run and post it. lol.


Post your old run with the 290 BIOS and the new one with the 290X BIOS, same settings on both, no overclock.

We'll get a better picture.


----------



## rv8000

Quote:


> Originally Posted by *BigGuns73*
> 
> Well something is going on, i score higher with the R9 290X bios, LOL! this would be sweet if I really have an unlocked R9 290x on my hands here!


Remember that the 290X BIOS will boost the core up to 1GHz instead of 947MHz on the 290; a discrepancy of 3fps can come from the core clock increase alone, depending on the program.
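Assuming an idealized linear scaling with core clock (real games and benches scale less than this), the clock difference alone is in the same ballpark as the reported gap; the 54.6fps figure below is just the example score from earlier in the thread:

```python
# Idealized linear clock scaling: how much could the 1000 MHz vs 947 MHz
# boost clocks alone explain? (Assumes perfectly clock-bound performance.)
clock_290, clock_290x = 947, 1000          # MHz reference boost clocks
uplift = clock_290x / clock_290 - 1        # ~5.6% higher core clock
fps_290 = 54.6                             # example 290-BIOS score from this thread
expected_fps = fps_290 * (1 + uplift)
print(f"Clock uplift: {uplift:.1%}; expected fps if perfectly clock-bound: {expected_fps:.1f}")
```

So if the clocks were not actually pinned equal during the runs, a ~3fps gap needs no unlocked shaders at all.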


----------



## blue1512

Quote:


> Originally Posted by *rv8000*
> 
> Remember that the 290X BIOS will boost the core up to 1GHz instead of 947MHz on the 290; a discrepancy of 3fps can come from the core clock increase alone, depending on the program.


He already stated that both bios were run at 1000 MHz. Let's wait for the pics.


----------



## BigGuns73

R9 290x bios


Hold on and I will get the R9 290 bios run............
Quote:


> Originally Posted by *blue1512*
> 
> He already stated that both bios were run at 1000 MHz. Let's wait for the pics.


Exactly, I made sure both ran locked at 1GHz core and 1250MHz mem.


----------



## tsm106

lol you really did just join to post that?


----------



## rv8000

Quote:


> Originally Posted by *BigGuns73*
> 
> R9 290x bios
> 
> 
> Hold on and I will get the R9 290 bios run............


Use F12 to take a screenshot after the benchmark finishes so it will display the core/mem clocks in the upper right-hand corner. Not trying to be a bother, but we don't necessarily know what clock that was run at.


----------



## BigGuns73

Quote:


> Originally Posted by *tsm106*
> 
> lol you really did just join to post that?


Why does it matter to you why I joined? I found it very interesting, and if it's true then this is great news. Brb with the other run.........
Quote:


> Originally Posted by *rv8000*
> 
> Use F12 to take a screenshot after the benchmark finishes so it will display the core/mem clocks in the upper right-hand corner. Not trying to be a bother, but we don't necessarily know what clock that was run at.


It was run at 1GHz core and 1250MHz RAM. Geez, I have no reason to lie, but I have a feeling some of you are not going to be satisfied regardless of what I do. I will do as you said though............

Okay, where does it store the photos after pressing F12? I looked where Valley is installed but don't see anything photo-related.


----------



## rv8000

Quote:


> Originally Posted by *BigGuns73*
> 
> Okay, where does it store the photo's after pressing F12. I looked where valley is installed but dont see anything photo related.


It's inside your user folder


----------



## DampMonkey

Quote:


> Originally Posted by *BigGuns73*
> 
> R9 290x bios
> 
> 
> Hold on and I will get the R9 290 bios run............


Here's a stock 290X run I did - Asus BIOS, 1000/1250


----------



## BigGuns73

Quote:


> Originally Posted by *rv8000*
> 
> It's inside your user folder


Cool thanks, here is the first one on the R9 290 bios. Ran it a few times for consistency.


----------



## utnorris

Ok, so I flashed both of mine to the Asus bios. I did the first card and no joy. Then I did the second card and the first time I launched GPUz it showed the full shader count of the 290x, so I checked my first card again and it was still showing 2560 shaders. I switched back to the second card and it was now showing 2560 shaders instead of the full shader count. I believe this was just a GPUz misread as both of mine are showing the correct shader count for a 290 and not of a 290x. Maybe someone will get lucky though.


----------



## skupples

It won't be the first time it's happened. As the ever so wise Forest Gump once said... xxxx happens.

Still needs more investigation though. I mean, the name isn't the only difference between the two BIOSes. I'm not a BIOS programmer, but I would guess many things are different inside each one. Why else would 290 owners want to flash over to the 290X BIOS? (Besides 53MHz more boost clock and a higher TDP.)


----------



## tsm106

^^Shaders are laser cut.
Quote:


> Originally Posted by *utnorris*
> 
> Ok, so I flashed both of mine to the Asus bios. I did the first card and no joy. Then I did the second card and the first time I launched GPUz it showed the full shader count of the 290x, so I checked my first card again and it was still showing 2560 shaders. I switched back to the second card and it was now showing 2560 shaders instead of the full shader count. I believe this was just a GPUz misread as both of mine are showing the correct shader count for a 290 and not of a 290x. Maybe someone will get lucky though.


----------



## BigGuns73

R9 290X Bios run....



I have run both three times each, and it's consistently higher on the R9 290X BIOS each time.


----------



## eternal7trance

You mind actually putting everything in one post instead of filling up the thread with multiple ones in less than a minute?


----------



## utnorris

Ok, so I reran 3DMark11 in Xtreme mode; my original score was X8917 and my score after flashing is X8830. The only thing I did was dial the memory back from 1475 to 1400, as it was choking badly. I think either you need to go buy a lottery ticket because you are super lucky, or it's a GPU-Z error.


----------



## rdr09

Quote:


> Originally Posted by *BigGuns73*
> 
> Cool thanks, here is the first one on the R9 290 bios. Ran it a few times for consistency.


is your i7 at stock? here is my 290 same oc


----------



## DampMonkey

I used to run a PNY Verto GeForce 6200 (128MB of AGP fury) and with the RivaTuner program you could edit the registry/BIOS and enable 4 of the 8 pixel pipelines that were disabled from the higher-end GeForce 6600. I'm fairly certain manufacturers have become smarter and are physically removing these capabilities. Although it wasn't that long ago that you could unlock cores in AMD CPUs (3-core processors unlocked to quads), so I wouldn't put it past them. Show me benches, then I'll make my judgement.


----------



## BigGuns73

Quote:


> Originally Posted by *utnorris*
> 
> Ok, so I reran 3DMark11 in xtreme mode and my original score was x8917 and my score after flashing is x8830. The only thing i did was dial the memory back from 1475 to 1400 as it was choking bad. I think either you need to go buy a lottery ticket because you are super lucky or it's a GPUz error.


Its not a GPUZ error as best I can tell, look at my benchmark pics.
Quote:


> Originally Posted by *rdr09*
> 
> is your i7 at stock? here is my 290 same oc


No, its at 4.8ghz with hyperthreading enabled.


----------



## Sgt Bilko

Quote:


> Originally Posted by *DampMonkey*
> 
> I used to run a PNY Verto GeForce 6200 (128MB of AGP fury) and with the RivaTuner program you could edit the registry/BIOS and enable 4 of the 8 pixel pipelines that were disabled from the higher-end GeForce 6600. I'm fairly certain manufacturers have become smarter and are physically removing these capabilities. Although it wasn't that long ago that you could unlock cores in AMD CPUs (3-core processors unlocked to quads), so I wouldn't put it past them. Show me benches, then I'll make my judgement.


I still have a 6200









I'm of the same opinion - I don't think it's possible, but on the slim chance it does work, then awesome. I'm reserving judgement for now.

moar data!!!


----------



## BigGuns73

I'm on Win 8.1 though; this program might run better under Win 7.


----------



## rdr09

Quote:


> Originally Posted by *BigGuns73*
> 
> No, its at 4.8ghz with hyperthreading enabled.


mine is at 4.5GHz. my score is higher than both your results.

yup, win8.


----------



## thekamikazepr

I just installed a universal block with small heatsinks on the VRAM.

After an hour of stress it shut down; the logs show 50 degrees (my fans were at minimum), but the PCB felt around or above 90.

Has anybody else with a universal block had this experience?

Please quote when replying, since posts here are plentiful and hard to follow, especially when you have to check from a cellphone.

Note: I have an XSPC universal GPU block - just got it today - and I did apply paste (AS5).


----------



## BigGuns73

Quote:


> Originally Posted by *rdr09*
> 
> mine is at 4.5GHz. my score is higher than both your results.


Like I said, you're also on a different version of Windows. Could be different drivers too, as I'm not sure what you are using.


----------



## DampMonkey

Quote:


> Originally Posted by *BigGuns73*
> 
> Like I said, you're also on a different version of Windows. Could be different drivers too, as I'm not sure what you are using.


He said he's on win8


----------



## BigGuns73

Quote:


> Originally Posted by *DampMonkey*
> 
> He said he's on win8


Then why is his Unigine showing otherwise....... Regardless, I'm on 8.1, not 8.


----------



## rdr09

Quote:


> Originally Posted by *DampMonkey*
> 
> He said he's on win8


no, I'm using Win7 and the Beta 8 driver. sorry.

Big, run 3DMark11 if you can.


----------



## tsm106

Make your own thread lol. Prove it before proclaiming it.


----------



## DampMonkey

Quote:


> Originally Posted by *BigGuns73*
> 
> Then why is his unigine showing otherwise....... regardless im on 8.1, not 8.


Do you know if you're throttling? Can you run another time with 100% fan?


----------



## BigGuns73

Quote:


> Originally Posted by *tsm106*
> 
> Make your own thread lol. Prove it before proclaiming it.












I proved exactly what I was claiming to see on both BIOSes and posted the pics as proof. If that's not good enough for you then tough - you're not getting a blood and stool sample from me too, weirdo.
Quote:


> Originally Posted by *DampMonkey*
> 
> Do you know if you're throttling? Can you run another time with 100% fan?


No, I'm not throttling; I have an aftermarket cooler. 65C tops with a heavy overclock/overvolt.


----------



## sugarhell

Shaders/texture units are laser-cut. A 2fps difference on these GPUs with variable clocks means nothing. Also keep in mind that different BIOSes have different memory timings.


----------



## BigGuns73

Quote:


> Originally Posted by *rdr09*
> 
> no, im using win7 and Beta 8. sorry.
> :


Yep, diff OS and diff drivers.
Quote:


> Originally Posted by *sugarhell*
> 
> Shaders/texture units are laser-cut. A 2fps difference on these GPUs with variable clocks means nothing. Also keep in mind that different BIOSes have different memory timings.


For the love of god, do any of you know how to read? I said the clocks were both locked down to 1GHz core and 1250 mem. Also, the RAM timing differences could have something to do with it - that's a good call.


----------



## utnorris

Quote:


> Originally Posted by *BigGuns73*
> 
> Its not a GPUZ error as best I can tell, look at my benchmark pics.


Does GPUZ consistently show the shaders unlocked?


----------



## BigGuns73

Quote:


> Originally Posted by *utnorris*
> 
> Does GPUZ consistently show the shaders unlocked?


Yes.

I will try 3dmark11 tomorrow with the 13.8 beta. I have to download the program, dont even have it on my PC.


----------



## sugarhell

Quote:


> Originally Posted by *BigGuns73*
> 
> For the love of god do any of you know how to read? I said the clocks were both locked down to 1ghz core and 1250mem. Also the ram timing differences could have something to do with it, thats a good call.


Leave god out of this. It doesn't matter. Locked down to 1GHz means nothing, because throttling without a fan speed increase is significant. Also, GPU-Z reads the BIOS ID for its info.


----------



## BigGuns73

Quote:


> Originally Posted by *sugarhell*
> 
> Leave god out of this. It doesn't matter. Locked down to 1GHz means nothing, because throttling without a fan speed increase is significant. Also, GPU-Z reads the BIOS ID for its info.


I don't throttle - like I said, learn how to read. If you think I'm being a bit pissy then fine, perhaps I am; I'm just getting sick of having to repeat myself over and over because people refuse to read the previous comments.

I only ever said I hoped that all the shaders were unlocked. I never said for 100 percent certain that they are, and when you guys asked for screens showing the Valley results from each BIOS I gladly obliged, but despite all that I continue to be talked down to like crap by some of you.

The only thing I know for certain is that one BIOS is giving me more performance than the other, in spite of the clocks being the same and no throttling issues on either BIOS.


----------



## Jared Pace

Quote:


> Originally Posted by *BigGuns73*
> 
> For the love of god do any of you know how to read? I said the clocks were both locked down to 1ghz core and 1250mem. Also the ram timing differences could have something to do with it, thats a good call.


How about making it easier for everyone to compare instead of 35 posts of random bits? Put two screenshots in a single post. Show the whole Unigine result in full screen with the score just like you have, _including_ the GPU-Z main screen showing the shader count, _and_ an Afterburner graph showing the core clock steady at 1000 during the run. First screenshot on the 290 BIOS, second screenshot on the 290X BIOS. Pretty simple, right? I'm with everyone else in agreeing the shaders can't be unlocked and the FPS difference is down to memory timings. Is this post possible for you to make?

Would be cool if they can unlock.


----------



## Falkentyne

You guys need to keep some things in mind, here.

HE MAY HAVE A 290X THAT WAS FLASHED WITH A 290 BIOS. Stranger things have happened.
Someone on OCUK received a 290 when he ordered a 290x (!), long before the 290 NDA expired.

RAM timings COULD be a factor, but the last time we ever had a RAM timing editor was for the 1900 XTX (I think ATI Tray Tools could change the timings back then). However, I highly doubt that the RAM timings would be different, as both cards use the exact same memory (you could downclock the memory by 100MHz to see just how much of an effect a 'timing' difference might have).
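To put rough numbers on that 100MHz experiment: GDDR5 moves four transfers per clock, so on Hawaii's 512-bit bus peak bandwidth scales directly with memory clock. A back-of-the-envelope sketch, not a measurement:

```python
# Peak GDDR5 bandwidth: (bus_width / 8) bytes per transfer, 4 transfers
# per memory clock (GDDR5 is quad-pumped).
def gddr5_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int = 512) -> float:
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * 4 * mem_clock_mhz * 1e6 / 1e9

stock = gddr5_bandwidth_gbs(1250)  # stock 290/290X memory clock
down = gddr5_bandwidth_gbs(1150)   # after the suggested 100 MHz downclock
print(f"{stock:.1f} GB/s -> {down:.1f} GB/s ({down / stock - 1:+.1%})")
```

If an ~8% bandwidth cut barely moves the Valley score, the gap between the two BIOSes probably isn't a simple bandwidth/timing story.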

He could also remove the heatsink and screen capture the GPU silicon reading on the chip.

He could also very well be trolling. Can't be surprised if no one believes him. The ONLY way anyone will know is if they purchase a 290 from the SAME manufacturer, also flash it to the 290X BIOS, and compare.


----------



## skupples

Quote:


> Originally Posted by *DampMonkey*
> 
> I used to run a PNY Verto GeForce 6200 (128MB of AGP fury) and with the RivaTuner program you could edit the registry/BIOS and enable 4 of the 8 pixel pipelines that were disabled from the higher-end GeForce 6600. I'm fairly certain manufacturers have become smarter and are physically removing these capabilities. Although it wasn't that long ago that you could unlock cores in AMD CPUs (3-core processors unlocked to quad), so I wouldn't put it past them. Show me benches and then I'll make my judgement.


While this is 100% true, they do laser-cut these things. It's been documented many times where somehow a chip missed out on the laser cutting. We had someone posting a 2688-core 780 in the Titan club a few months back. Tests concluded that yes, it was really a 3GB Titan. Though that wasn't really a BIOS flash... If this was the same deal, his GPU-Z should read the 290X core count either way, it would seem.


----------



## BigGuns73

Quote:


> Originally Posted by *skupples*
> 
> If this was the same deal, his GPU-Z should read the 290X core count either way, it would seem.


That's what I was thinking...


----------



## Xion X2

Guns isn't "trolling". He's actually a friend of mine and knows his stuff.

He's got a Prolimatech on his 290 that keeps it at 65°C max on the Valley bench, so I don't think he's throttling. And I can vouch that he's been flashing BIOSes on cards for years, so he's experienced at this.


----------



## BigGuns73

Quote:


> Originally Posted by *Xion X2*
> 
> Guns isn't "trolling". He's actually a friend of mine and knows his stuff.
> 
> He's got a Prolimatech on his 290 that keeps it at 65°C max on the Valley bench, so I don't think he's throttling. And I can vouch that he's been flashing BIOSes on cards for years, so he's experienced at this.


Thank you, brother, and that 65°C max is with the core at 1200MHz along with 1.28V on the core.

These runs that I just did, the temps never even passed 53°C.


----------



## Xion X2

Quote:


> Originally Posted by *Jared Pace*
> 
> How about making it easier for everyone to compare instead of 35 posts of random bits?


He answered questions as they were posed to him. If people don't have the attention span to go back 30 or so posts to follow a single line of discussion then that's their problem, not his.


----------



## Mr357

How do we know that we got 290X's and not 290's if software just reads from the BIOS?


----------



## BigGuns73

Quote:


> Originally Posted by *Mr357*
> 
> How do we know that we got 290X's and not 290's if software just reads from the BIOS?


LOL, wouldn't that suck?


----------



## selk22

It is an interesting find, BigGuns, but I am not entirely convinced yet... Not that you have done anything wrong, but I want to see the same results from other users before I believe this is the norm. Like skupples said, things like this have happened by mistake before.


----------



## BigGuns73

Quote:


> Originally Posted by *selk22*
> 
> It is an interesting find, BigGuns, but I am not entirely convinced yet... Not that you have done anything wrong, but I want to see the same results from other users before I believe this is the norm. Like skupples said, things like this have happened by mistake before.


I understand. Just know that there are lots of differing variables across different PCs, from drivers to OSes, so making a carbon-copy comparison from other systems could be pretty hard to do. I don't know what else to say, though, other than I'm not fully convinced that my card unlocked to an R9 290X either. What I am sure of is that one BIOS shows 2816 cores in GPU-Z and, oddly enough, is also the better performing of the two in Valley with identical clocks set on both BIOSes.


----------



## Slomo4shO

Quote:


> Originally Posted by *BigGuns73*
> 
> I have run both 3 times each, and it's consistently higher on the R9 290X BIOS each time.


Can you run a custom fan profile for both and rerun the tests just for consistency?


----------



## BigGuns73

Goodnight everyone, will be back in the morning. Time for some much needed sleep.
Quote:


> Originally Posted by *Slomo4shO*
> 
> Can you run a custom fan profile for both and rerun the tests?


Sigh... Aftermarket cooler, bro. No throttling.


----------



## Falkentyne

Well, one way to instantly show that is to take off the heatsink and look at/take a picture of the bare GPU core. If it says 290X, or whatever code they use for a 290, then someone flashed a 290 BIOS onto a 290X card. But that's probably too much work...


----------



## Kelwing

Playing around with CPU and GPU clocks, running Valley.

Win 8.1
FX8350
R9 290
Drivers are 13.11 Beta 9.2

2216 with stock CPU/GPU clocks
2286 with stock CPU and GPU @ 1000
2258 with CPU at 4.4GHz and GPU stock
2334 with CPU at 4.4GHz and GPU at 1000; CPU temp 40°C and GPU 56°C with 60% fan speed (Arctic Accelero III)

Good, bad, could it be tweaked more? Plus, I can't make heads or tails of it. What is the little switch on the side of the board for? It is positioned towards the back (where the monitor hooks up).

http://s222.photobucket.com/user/mistwalker7/media/00003_zpse80002eb.jpg.html

http://s222.photobucket.com/user/mistwalker7/media/100_1573_1_zps3ce8b10c.jpg.html
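Spelling out the gains in those four scores (just the arithmetic on the numbers above):

```python
# Kelwing's Valley scores, keyed by which component was overclocked.
scores = {"stock": 2216, "gpu_oc": 2286, "cpu_oc": 2258, "both_oc": 2334}

def gain(new: int, base: int) -> float:
    """Percent improvement of `new` over `base`."""
    return (new - base) / base * 100

for name in ("gpu_oc", "cpu_oc", "both_oc"):
    print(f"{name}: {gain(scores[name], scores['stock']):+.1f}% vs stock")
```

Roughly +3% from the GPU at 1000, +2% from the CPU at 4.4GHz, and about +5% combined; the two stack, which suggests this bench is partly CPU-limited here.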


----------



## DampMonkey

Quote:


> Originally Posted by *Mr357*
> 
> How do we know that we got 290X's and not 290's if software just reads from the BIOS?


My card's outperforming his; that's all I need to sleep well tonight.


----------



## Arizonian

Quote:


> Originally Posted by *Kelwing*
> 
> Playing around with CPU and GPU clocks, running Valley.
> 
> Win 8.1
> FX8350
> R9 290
> Drivers are 13.11 Beta 9.2
> 
> 2216 with stock CPU/GPU clocks
> 2286 with stock CPU and GPU @ 1000
> 2258 with CPU at 4.4GHz and GPU stock
> 2334 with CPU at 4.4GHz and GPU at 1000; CPU temp 40°C and GPU 56°C with 60% fan speed (Arctic Accelero III)
> 
> Good, bad, could it be tweaked more? Plus, I can't make heads or tails of it. What is the little switch on the side of the board for? It is positioned towards the back (where the monitor hooks up).
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s222.photobucket.com/user/mistwalker7/media/00003_zpse80002eb.jpg.html
> 
> http://s222.photobucket.com/user/mistwalker7/media/100_1573_1_zps3ce8b10c.jpg.html


Updated with aftermarket cooling - looks good.


----------



## skupples

Quote:


> Originally Posted by *Falkentyne*
> 
> Well, one way to instantly show that is to take off the heatsink and look at/take a picture of the bare GPU core. If it says 290X, or whatever code they use for a 290, then someone flashed a 290 BIOS onto a 290X card. But that's probably too much work...


If it is what he thinks it is, which I'm doubting: the chip would show the 290 imprint but would have somehow missed the laser treatment. This type of stuff is EXTREMELY rare.


----------



## Scorpion49

Well I gone and done it now...


----------



## selk22

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *BigGuns73*
> 
> I understand. Just know that there are lots of differing variables across different PCs, from drivers to OSes, so making a carbon-copy comparison from other systems could be pretty hard to do. I don't know what else to say, though, other than I'm not fully convinced that my card unlocked to an R9 290X either. What I am sure of is that one BIOS shows 2816 cores in GPU-Z and, oddly enough, is also the better performing of the two in Valley with identical clocks set on both BIOSes.


Well then, I'd say you've found yourself one winner of a BIOS, buddy.

I would be pretty happy with my purchase on that one!


----------



## th3illusiveman

Quote:


> Originally Posted by *Kelwing*
> 
> Playing around with CPU and GPU clocks, running Valley.
> 
> Win 8.1
> FX8350
> R9 290
> Drivers are 13.11 Beta 9.2
> 
> 2216 with stock CPU/GPU clocks
> 2286 with stock CPU and GPU @ 1000
> 2258 with CPU at 4.4GHz and GPU stock
> 2334 with CPU at 4.4GHz and GPU at 1000; CPU temp 40°C and GPU 56°C with 60% fan speed (Arctic Accelero III)
> 
> Good, bad, could it be tweaked more? Plus, I can't make heads or tails of it. What is the little switch on the side of the board for? It is positioned towards the back (where the monitor hooks up).
> 
> http://s222.photobucket.com/user/mistwalker7/media/00003_zpse80002eb.jpg.html
> 
> http://s222.photobucket.com/user/mistwalker7/media/100_1573_1_zps3ce8b10c.jpg.html


That's terrible; some 7970s were getting 60 FPS in Valley. Heck, mine got 54.5 FPS @ 1300MHz. What's up with these things and Valley? That 512-bit bus should give them a massive advantage against GK110 and Tahiti in that bench.

They have excellent performance in almost everything else.


----------



## youra6

Quote:


> Originally Posted by *th3illusiveman*
> 
> That's terrible; some 7970s were getting 60 FPS in Valley. Heck, mine got 54.5 FPS @ 1300MHz. What's up with these things and Valley? That 512-bit bus should give them a massive advantage against GK110 and Tahiti in that bench.
> 
> They have excellent performance in almost everything else.


True that. I just ran my 290s at 1200MHz and only got 126 FPS...


----------



## tsm106

My single 7970 got 62 FPS and dual got 115 FPS.

But then again my single 290X hit 79 FPS, so it's relative to the user and rig.


----------



## youra6

Quote:


> Originally Posted by *tsm106*
> 
> My single 7970 got 62 FPS and dual got 115 FPS.
> 
> But then again my single 290X hit 79 FPS, so it's relative to the user and rig.


My dual 290s at 1100 only got 119 FPS.

Slow and steady wins the race.
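For what it's worth, the dual-card figures quoted in this exchange work out to scaling efficiency like so (a quick sketch using the FPS numbers above):

```python
# CrossFire scaling: achieved dual-card FPS against an ideal 2x single card.
def scaling_efficiency(single_fps: float, dual_fps: float) -> float:
    """Fraction of perfect two-way scaling actually achieved."""
    return dual_fps / (2 * single_fps)

# tsm106's 7970s: 62 FPS single, 115 FPS dual.
print(f"7970 CrossFire: {scaling_efficiency(62, 115):.0%} of perfect scaling")
```

About 93% scaling for the 7970 pair; without a single-card baseline from the same rig, the 290 numbers can't be compared the same way.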


----------



## Scorpion49

Quote:


> Originally Posted by *Kelwing*
> 
> Playing around with CPU and GPU clocks, running Valley.
> 
> Win 8.1
> FX8350
> R9 290
> Drivers are 13.11 Beta 9.2
> 
> 2216 with stock CPU/GPU clocks
> 2286 with stock CPU and GPU @ 1000
> 2258 with CPU at 4.4GHz and GPU stock
> 2334 with CPU at 4.4GHz and GPU at 1000; CPU temp 40°C and GPU 56°C with 60% fan speed (Arctic Accelero III)
> 
> Good, bad, could it be tweaked more? Plus, I can't make heads or tails of it. What is the little switch on the side of the board for? It is positioned towards the back (where the monitor hooks up).
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s222.photobucket.com/user/mistwalker7/media/00003_zpse80002eb.jpg.html
> 
> http://s222.photobucket.com/user/mistwalker7/media/100_1573_1_zps3ce8b10c.jpg.html


That is a bad score for a 290 considering it should compete with a 780; you're missing ~20 FPS. The CPU shouldn't be an issue; although back when I ran an 8320 I did see some gains in Valley, it was on the order of 2-3 FPS between 4600 and 5250MHz. Are you getting any system shutdowns? The TT TR2 RX line is renowned for being horrifically mediocre PSUs and should generally be avoided.


----------



## eternal7trance

Quote:


> Originally Posted by *Scorpion49*
> 
> Well I gone and done it now...


Should have done it on the mobile app. 5% off is a lot of money when you spend $800.


----------



## selk22

Yeah, for some reason they just really don't like Valley, lol, I can't figure it out! My best bench at 1100/1400 is 65.5-68 FPS at best. I was hoping for more, but under water I think I can break that 70 FPS mark.

This is with sig rig


----------



## Scorpion49

Quote:


> Originally Posted by *eternal7trance*
> 
> Should have done it on the mobile app. 5% off is a lot of money when you spend $800.


I don't have a mobile app, and how would I know I could save money using it in the first place?

Is it posted somewhere on their site? Cuz I've really never seen it. I would have tried to get my POS Windows phone to download it if I had.


----------



## rv8000

Up to 71.5 in Valley on my 290 @ 1150/1550; scores for the 290s are all over the place atm, lol.


----------



## Arizonian

100% fan for benching to keep from downclocking is key, unless you're water-blocked.

i7 3770K @ 4.5GHz / 290X at 1150 core + 1350 memory - no voltage added.


----------



## eternal7trance

Quote:


> Originally Posted by *Scorpion49*
> 
> I don't have a mobile app, and how would I know I could save money using it in the first place?
> 
> Is it posted somewhere on their site? Cuz I've really never seen it. I would have tried to get my POS Windows phone to download it if I had.


You gotta look around.

http://slickdeals.net/f/6414476-5-off-your-entire-purchase-newegg-max-50-mobile-app-and-mobile-site-only


----------



## Scorpion49

Quote:


> Originally Posted by *eternal7trance*
> 
> You gotta look around.
> 
> http://slickdeals.net/f/6414476-5-off-your-entire-purchase-newegg-max-50-mobile-app-and-mobile-site-only


Well darn, guess I missed it. Oh well, 290s are on the way! Now I have to wait for the Aquacomputer blocks to be back in stock for them.


----------



## ImJJames

Quote:


> Originally Posted by *th3illusiveman*
> 
> That's terrible; some 7970s were getting 60 FPS in Valley. Heck, mine got 54.5 FPS @ 1300MHz. What's up with these things and Valley? That 512-bit bus should give them a massive advantage against GK110 and Tahiti in that bench.
> 
> They have excellent performance in almost everything else.


I agree, my 7970 is getting 53 FPS @ 1230, but he did benchmark on stock.


----------



## CurrentlyPissed

So now BF4 comes with the 290 and 290X? I haven't even had these cards a week (not even possible to).

No retroactive coupons.

I'm pretty much about to just write off AMD.

http://www.anandtech.com/show/7511/amd-announces-new-game-bundles-for-radeon-r7-260x-above


----------



## Gilgam3sh

Quote:


> Originally Posted by *CurrentlyPissed*
> 
> So now BF4 comes with the 290 and 290X? I haven't even had these cards a week (not even possible to).
> 
> No retroactive coupons.
> 
> I'm pretty much about to just write off AMD.
> 
> http://www.anandtech.com/show/7511/amd-announces-new-game-bundles-for-radeon-r7-260x-above


Yeah, read that too. Even though I already have BF4 Premium (bought it through a Mexican proxy), those of us who bought an R9 290/290X card a couple of weeks ago should get that too...


----------



## CurrentlyPissed

A couple of weeks ago? I haven't had mine a week. Hell, the reviews haven't even been out a week for the 290. It's pretty lame.


----------



## Gilgam3sh

Quote:


> Originally Posted by *CurrentlyPissed*
> 
> A couple of weeks ago? I haven't had mine a week. Hell, the reviews haven't even been out a week for the 290. It's pretty lame.


Yes, that's what I mean; I've also had mine for barely a week.


----------



## Connolly

Quote:


> Originally Posted by *CurrentlyPissed*
> 
> So now BF4 comes with the 290 and 290X? I haven't even had these cards a week (not even possible to).
> 
> No retroactive coupons.
> 
> I'm pretty much about to just write off AMD.
> 
> http://www.anandtech.com/show/7511/amd-announces-new-game-bundles-for-radeon-r7-260x-above


Does this offer only apply in the U.S.? I've just ordered an R9 290 from Ebuyer, I'm U.K. based, and didn't see any details of free games. Or do you need to register it directly with AMD or something to receive the game vouchers? The card I ordered isn't in stock until the 15th, so I can always cancel and order again if I need to/can get free games.


----------



## VSG

I tweeted AMD Gaming, AMD Radeon, and Roy about this and made my displeasure at having this happen to existing R9 290 owners known. If more people let them know via Facebook or Twitter, maybe they will respond. Otherwise this is a really bad PR move when Nvidia is actually having a great gaming bundle themselves.


----------



## hotrod717

Quote:


> Originally Posted by *geggeg*
> 
> I tweeted AMD Gaming, AMD Radeon, and Roy about this and made my displeasure at having this happen to existing R9 290 owners known. If more people let them know via Facebook or Twitter, maybe they will respond. Otherwise this is a really bad PR move when Nvidia is actually having a great gaming bundle themselves.


AMD has been really fumbling this launch. What started out as a never-ending wait with extremely low product availability has turned into driver issues, black screens, and a middle finger to some of their most loyal followers and newbs alike. Their saving grace for now is the price-to-performance ratio. If Mantle doesn't pay off and bring a nice return, it doesn't look good.


----------



## Snyderman34

Quote:


> Originally Posted by *CurrentlyPissed*
> 
> So now BF4 comes with the 290 and 290X? I haven't even had these cards a week (not even possible to).
> 
> No retroactive coupons.
> 
> I'm pretty much about to just write off AMD.
> 
> http://www.anandtech.com/show/7511/amd-announces-new-game-bundles-for-radeon-r7-260x-above


Yeah, I'm a little bummed about that myself. The price of early adoption...

Who knows, though? Maybe they'll change it.


----------



## VSG

Honestly, if nothing comes out of APU13 tomorrow about Mantle, and if we get shafted out of what actually seems like a worse "bundle" than even the silver tier, I will be really tempted to sell my cards and their blocks. I can't place much trust in a company that has no clue how to get anything right about their premier card's launch.


----------



## selk22

Started playing PlanetSide 2 after the update today and it was running amazingly, then about 20-30 minutes in, BAM, black screen... I can hear people, they can't hear me, and I am forced to hard-reset the computer... I have been playing BF4 and other games smoothly since I got the card. If this happens again I will be returning the card to Newegg and buying a 780. It's sad to say, but I will not go through this again; I had a 460 that had problems initially that I held on to thinking it would get better, and it ended up being my only card for years and was EXTREMELY problematic. Not worth it to me when I can return it and buy a 780, which I know will most likely be less problematic.

Edit:

I can't return it to Newegg... WOW
Quote:


> According to our records, this item is still covered by the product manufacturer's parts warranty.
> According to our records, this item is still covered by the product manufacturer's labor warranty.
> This item is not eligible for a refund.


Have I just gotten shafted?


----------



## hotrod717

I should be receiving my 290 tomorrow, and while I wanted to have a waterblock ready, I'm starting to think it was good not to pull the trigger so soon. I'm really disappointed and don't even have the card yet. Crossing my fingers that the problems have been blown out of proportion and I eat some of my words.


----------



## CurrentlyPissed

Quote:


> Originally Posted by *Snyderman34*
> 
> Yeah, I'm a little bummed about that myself. The price of early adoption...
> 
> Who knows, though? Maybe they'll change it.


I know all about "Early adoption fees".

I purchased two 7950s on release day for $579 each (MSI OC versions).

A little over a month later they dropped almost $200 in price each.

It's understandable given the competition Nvidia brought to the table, and it being over a month.

But less than a week is a little harder to swallow.


----------



## Arizonian

Quote:


> Originally Posted by *selk22*
> 
> Started playing PlanetSide 2 after the update today and it was running amazingly, then about 20-30 minutes in, BAM, black screen... I can hear people, they can't hear me, and I am forced to hard-reset the computer... I have been playing BF4 and other games smoothly since I got the card. If this happens again I will be returning the card to Newegg and buying a 780. It's sad to say, but I will not go through this again; I had a 460 that had problems initially that I held on to thinking it would get better, and it ended up being my only card for years and was EXTREMELY problematic. Not worth it to me when I can return it and buy a 780, which I know will most likely be less problematic.
> 
> Edit:
> 
> I can't return it to Newegg... WOW
> Have I just gotten shafted?


No, you weren't shafted. All the GPUs from Newegg are replacement-only within 30 days of purchase. After that it's through the manufacturer warranty. It's right there underneath the products.
Quote:


> Iron Egg Guarantee Replacement-Only Return Policy


In fact, once it's in packaging on the first day, it's yours. The only way out of it is, when the package arrives at your door, you're there to meet the UPS guy and decline it. If it's left at your door and he rides away, it's yours.


----------



## CurrentlyPissed

Yeah, I think that all started when the 6950-6970 return fiasco went on, due to the number of people returning 6970s because the 6950 was the same card.


----------



## selk22

Quote:


> Originally Posted by *Arizonian*
> 
> No, you weren't shafted. All the GPUs from Newegg are replacement-only within 30 days of purchase. After that it's through the manufacturer warranty. It's right there underneath the products.
> In fact, once it's in packaging on the first day, it's yours. The only way out of it is, when the package arrives at your door, you're there to meet the UPS guy and decline it. If it's left at your door and he rides away, it's yours.


Right, I understand. I don't mean shafted by the return policy, but more shafted because I am stuck with another card that has issues!!

Should I RMA with Newegg or Sapphire? Does anyone have experience with either? What are the time frames for a standard RMA? I do use this computer for more than just gaming.


----------



## Arizonian

Quote:


> Originally Posted by *selk22*
> 
> Right, I understand. I don't mean shafted by the return policy, but more shafted because I am stuck with another card that has issues!!
> 
> Should I RMA with Newegg or Sapphire? Does anyone have experience with either? What are the time frames for a standard RMA? I do use this computer for more than just gaming.


That will vary from company to company. You can use your iGPU by enabling it in your motherboard BIOS while you wait. Just can't game. You'll be able to stream though.

EDIT - Is the consensus in the club that the black screens are defective GPUs, or something else? How many members RMA'd but still get black screens with the new replacement?


----------



## Blackroush

Quote:


> Originally Posted by *selk22*
> 
> Right, I understand. I don't mean shafted by the return policy, but more shafted because I am stuck with another card that has issues!!
> 
> Should I RMA with Newegg or Sapphire? Does anyone have experience with either? What are the time frames for a standard RMA? I do use this computer for more than just gaming.


Just exchange it and sell the replacement in new, sealed condition.


----------



## bpmcleod

You can return cards not under Iron Egg. I returned a 780 Classified for a refund... minus about $125... But considering I paid $830 for the HC version, I can get almost two 290s for my refund.


----------



## Arizonian

Also on the OP

*
AMD Issue Reporting Form for AMD Catalyst™ 13.9 for Desktop Radeon™ Products*


----------



## kantxcape

Does anyone else get flickering and horizontal lines on your desktop when you overclock the memory on the 290X? I get those lines at 1450MHz and 1500MHz but not at 1400MHz. Just wanna know if it's normal. I'm using Afterburner, no voltage changes. Sapphire 290X, Elpida.


----------



## kcuestag

Quote:


> Originally Posted by *kantxcape*
> 
> Does anyone else get flickering and horizontal lines on your desktop when you overclock the memory on the 290X? I get those lines at 1450MHz and 1500MHz but not at 1400MHz. Just wanna know if it's normal. I'm using Afterburner, no voltage changes. Sapphire 290X, Elpida.


I get weird flickering and horizontal lines when booting into Windows 8.1, but it also happens at stock, and it only happens once, every time I boot into Windows.


----------



## Forceman

Quote:


> Originally Posted by *kantxcape*
> 
> Does anyone else get flickering and horizontal lines on your desktop when you overclock the memory on the 290X? I get those lines at 1450MHz and 1500MHz but not at 1400MHz. Just wanna know if it's normal. I'm using Afterburner, no voltage changes. Sapphire 290X, Elpida.


Yeah, I have the same thing, but at 1400. I saw someone say it was an issue with the card setting 2D clocks and there was a workaround, but I can't remember what it was. It was in a post in this thread a day or so ago.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kantxcape*
> 
> Does anyone else get flickering and horizontal lines on your desktop when you overclock the memory on the 290X? I get those lines at 1450MHz and 1500MHz but not at 1400MHz. Just wanna know if it's normal. I'm using Afterburner, no voltage changes. Sapphire 290X, Elpida.


It happens to me at 1430MHz; 1420 and 1440 are fine (not stable, obviously). Usually it's due to an unstable overclock of the memory, AFAIK.


----------



## r0l4n

Quote:


> Originally Posted by *kantxcape*
> 
> Does anyone else get flickering and horizontal lines on your desktop when you overclock the memory on the 290X? I get those lines at 1450MHz and 1500MHz but not at 1400MHz. Just wanna know if it's normal. I'm using Afterburner, no voltage changes. Sapphire 290X, Elpida.


Same here, but my limit is 1450MHz.
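The trial-and-error in the last few posts (fine at 1400, lines at 1430, a limit at 1450) amounts to stepping the memory clock down until the desktop artifacts stop. Sketched as code, with `flickers` standing in for the manual eyeball check at each step (a hypothetical helper, not a real tool):

```python
# Step the memory clock down from an ambitious start until the desktop
# stops flickering. `flickers` is a stand-in for a manual check.
def find_stable_mem_clock(start_mhz, floor_mhz, step_mhz, flickers):
    """Return the highest tested clock that shows no artifacts."""
    clock = start_mhz
    while clock > floor_mhz and flickers(clock):
        clock -= step_mhz
    return clock

# Example: pretend anything over 1440 MHz shows horizontal lines.
print(find_stable_mem_clock(1500, 1250, 10, lambda mhz: mhz > 1440))
```

As r0l4n's case shows, the desktop/2D threshold can sit well below what's stable in 3D, so a clock that passes this check isn't automatically game-stable (or vice versa).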


----------



## esqueue

Is Heaven even used anymore? My card was clocked at 1140/1400 on the stock BIOS: 45°C GPU and 50°C/38°C for VRMs 1 & 2.


----------



## r0l4n

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It happens to me at 1430MHz; 1420 and 1440 are fine (not stable, obviously). Usually it's due to an unstable overclock of the memory, AFAIK.


Not really an unstable overclock; my mems are perfectly stable at 1650+ during long gaming sessions, yet anything over 1450 on the desktop will give me the flickering.


----------



## Sgt Bilko

Quote:


> Originally Posted by *r0l4n*
> 
> Not really an unstable overclock; my mems are perfectly stable at 1650+ during long gaming sessions, yet anything over 1450 on the desktop will give me the flickering.


In that case, I stand corrected. It's probably the 2D profile, as Forceman suggested a few posts above, then.


----------



## SonDa5

Playing around with the PT1 BIOS, I improved my 290X overclock to 1274/1522 and broke 21,000 GPU score in 3DMark 11 Performance!!!

http://www.techpowerup.com/gpuz/drw57/

http://www.3dmark.com/3dm11/7481578


----------



## Kuat

Are there ANY aftermarket 290/290X cards at all yet?

How come Nvidia already has ACX/Windforce out and ATI has nothing?


----------



## selk22

Quote:


> Originally Posted by *kantxcape*
> 
> Does anyone else get flickering and horizontal lines on your desktop when you overclock the memory on the 290X? I get those lines at 1450MHz and 1500MHz but not at 1400MHz. Just wanna know if it's normal. I'm using Afterburner, no voltage changes. Sapphire 290X, Elpida.


I have gotten these anywhere above 1400...

*BLACK SCREEN UPDATE*

Alright, well, like I stated before, I was running at 1100/1400 completely fine in BF4 and other games. I ran the new PlanetSide update today and ended up with a black screen... I dropped the OC completely back to stock at 1000/1250 and played for a few hours without a hiccup. So for me it seems it may have been caused by an unstable memory OC. So for now I'm running at stock until I can get some voltage control to decide whether or not to RMA the card.

I will continue to run at stock until I can be 100% certain it is a stable card at stock settings.


----------



## Sgt Bilko

Quote:


> Originally Posted by *selk22*
> 
> I have gotten these anywhere above 1400..
> 
> *BLACK SCREEN UPDATE*
> 
> Alright well like I stated before I was running at 1100/1400 completely fine on bf4 and other games. Ran the new planetside update today and ended up with a black screen.. I dropped the OC completely back to stock at 1000/1250 and played for a few hours without a hiccup.. SO for me it seems it may have been cause by an unstable Mem OC.. So for now running at stock until I can get some voltage control to decided whether or not to RMA the card.
> 
> I will continue to run at stock until I can be 100% certain it is a stable card at stock settings


I've done the same, running at stock and the only time my PC has locked/crashed is due to me messing about with my new 8350.


----------



## Tobiman

Quote:


> Originally Posted by *Kuat*
> 
> Are there ANY aftermarket 290/290X cards at all yet?
> 
> How come Nvidia already has ACX/Windforce out and ATI has nothing?


Maybe because it's the exact same PCB design as the Titan and 780?


----------



## kantxcape

Quote:


> Originally Posted by *selk22*
> 
> I have gotten these anywhere above 1400..
> 
> *BLACK SCREEN UPDATE*
> 
> Alright well like I stated before I was running at 1100/1400 completely fine on bf4 and other games. Ran the new planetside update today and ended up with a black screen.. I dropped the OC completely back to stock at 1000/1250 and played for a few hours without a hiccup.. SO for me it seems it may have been cause by an unstable Mem OC.. So for now running at stock until I can get some voltage control to decided whether or not to RMA the card.
> 
> I will continue to run at stock until I can be 100% certain it is a stable card at stock settings


Did you change any voltages to run it at those clocks? Isn't 1400MHz on the memory kinda low? What's the average?


----------



## chropose

Folks, what's your opinion about the HIS Radeon 290/290X? What memory chips did they use? Hynix?


----------



## Arizonian

Quote:


> Originally Posted by *Kuat*
> 
> Are there ANY aftermarket 290/290X cards at all yet?
> 
> How come Nvidia already has ACX/Windforce out and ATI has nothing?


The ACX has only been sold in amazingly low quantities, and only by EVGA, twice now. Places like Newegg/Amazon have not had one yet. No other aftermarket cooler has even been offered by any other vendor, let alone any non-reference design shown, including on their own websites.

Multiple sources have now all said the latter part of November for AMD. Nvidia news indicates they are going to be ready to counter, but not before. I have a feeling it's to see where AMD will price theirs and whether other price drops happen, so they don't come out higher and end up with egg on their faces, is my opinion.

Gigabyte showed their aftermarket Windforce for review but hasn't put it up for sale anywhere, and hasn't given TechSpot a date for when they would, either.


----------



## Sgt Bilko

Quote:


> Originally Posted by *chropose*
> 
> Folks, what's your opinion about HIS Radeon 290/290X? What memory chips did they used? Hynix?


Memory chips are random. Some are Hynix and others are Elpida... no brand exclusively uses either, AFAIK.


----------



## Gilgam3sh

Could I use these copper heatsinks on the VRMs instead of the aluminium ones I got with the Accelero Xtreme III cooler?

http://www.ebay.com/itm/Cooling-Copper-Memory-Chip-Cooler-Ram-Heat-Sinks-HeatSink-8pcs-set-/121189703639?pt=US_Memory_Chipset_Cooling&hash=item1c37781fd7

and for the RAM: http://www.ebay.com/itm/8PCS-Pure-Copper-Memory-Chipset-Cooler-Heat-Sinks-HeatSink-For-IC-DDR-RAM-VGA/350705273019


----------



## chropose

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Memory chips are random. Some are Hynix and others are Elphida.....no brand exclusively uses either afaik


Ouch..Really? That's too bad. I want Hynix.


----------



## overclockFrance

Quote:


> Originally Posted by *Gilgam3sh*
> 
> could I use these copper heatsinks on the VRM's instead of the aluminium ones I got with the Accelero Xtreme III cooler?
> 
> http://www.ebay.com/itm/Cooling-Copper-Memory-Chip-Cooler-Ram-Heat-Sinks-HeatSink-8pcs-set-/121189703639?pt=US_Memory_Chipset_Cooling&hash=item1c37781fd7


No, I already tested with the same heatsinks.


----------



## rdr09

Quote:


> Originally Posted by *selk22*
> 
> I have gotten these anywhere above 1400..
> 
> *BLACK SCREEN UPDATE*
> 
> Alright, well like I stated before, I was running at 1100/1400 completely fine on BF4 and other games. Ran the new Planetside update today and ended up with a black screen.. I dropped the OC completely back to stock at 1000/1250 and played for a few hours without a hiccup.. So for me it seems it may have been caused by an unstable mem OC.. So for now I'm running at stock until I can get some voltage control to decide whether or not to RMA the card.
> 
> I will continue to run at stock until I can be 100% certain it is a stable card at stock settings


good call, selk. you can't RMA for reason of an unstable OC. i read your piece about BF4 now being offered with the new cards - i am pissed about that as well. what is AMD thinking!?


----------



## Gilgam3sh

Quote:


> Originally Posted by *overclockFrance*
> 
> No, I already tested with the same heatsinks.


ok, do you know why? any other copper heatsinks that work?

what about for the memory, then?

http://www.ebay.com/itm/8PCS-Pure-Copper-Memory-Chipset-Cooler-Heat-Sinks-HeatSink-For-IC-DDR-RAM-VGA/350705273019


----------



## kantxcape

Quote:


> Originally Posted by *rdr09*
> 
> good call, selk. can't rma for reason of unstable oc. i read your piece about BF4 now offered with the new cards - that i am pissed as well. what is AMD thinking!?


Thank god I asked for a refund, gonna mail my 290X today.


----------



## overclockFrance

Quote:


> Originally Posted by *Gilgam3sh*
> 
> ok, do you know why? any other copper heatsinks that works?
> 
> for the memory then?
> 
> http://www.ebay.com/itm/8PCS-Pure-Copper-Memory-Chipset-Cooler-Heat-Sinks-HeatSink-For-IC-DDR-RAM-VGA/350705273019


The VRMs are 6 mm long whereas the heatsinks are 8 mm. And to the right of the VRMs there are components taller than the VRMs, so there is not enough room.

And for the memory, the size is not correct. But on eBay you will find reliable Chinese sellers who offer copper heatsinks compatible with RAM modules. I bought some.

On the other hand, the Alpenföhn Peter cooler comes with an aluminium heatsink compatible with the 290 VRMs.


----------



## rdr09

Quote:


> Originally Posted by *kantxcape*
> 
> Thank good i asked for a refund, gonna mail my 290x today.


well, BF4 is not enough reason to return the card either . . . but their way of doing things is falling somewhere along the lines of the competition's.


----------



## Gilgam3sh

Quote:


> Originally Posted by *overclockFrance*
> 
> The VRMs are 6 mm long whereas the heatsinks are 8 mm. And to the right of the VRMs there are components taller than the VRMs, so there is not enough room.
> 
> And for the memory, the size is not correct. But on eBay you will find reliable Chinese sellers who offer copper heatsinks compatible with RAM modules. I bought some.
> 
> On the other hand, the Alpenföhn Peter cooler comes with an aluminium heatsink compatible with the 290 VRMs.


so there are no copper heatsinks that fit the VRMs at the moment?

ok, could you link me to the copper RAM heatsinks?


----------



## overclockFrance

Quote:


> Originally Posted by *Gilgam3sh*
> 
> so there is no copper heatsinks that fits the VRMs at the moment?
> 
> ok could you link me to the copper RAM heatsinks?


I found the same as mine: http://www.ebay.com/itm/8-Pcs-Copper-Cooler-Heatsink-DDR-DDR2-DDR3-RAM-Memory-/321012577542?pt=LH_DefaultDomain_0&hash=item4abdd73506


----------



## kantxcape

Quote:


> Originally Posted by *rdr09*
> 
> well, the BF4 is not enough reason to return the card either . . . but their ways of doing things are somewhere falling along the lines of the competition.


The black screens, the insane coil whine and the 290 price are.


----------



## EliteReplay

Sorry for posting this in here, I couldn't find a place to ask.

I'm confused - has AMD already shown Mantle?

Tues 10:00 - 10:45 GS-4112 Guennadi Riguer, Brian Bennett AMD Mantle: Empowering 3D Graphics Innovation (Read abstract)


----------



## hatlesschimp

As people are getting refunds, I ordered an ASUS R9 290X. Lol

I wonder how I will go.


----------



## Gilgam3sh

Quote:


> Originally Posted by *overclockFrance*
> 
> I found the same as mine : http://www.ebay.com/itm/8-Pcs-Copper-Cooler-Heatsink-DDR-DDR2-DDR3-RAM-Memory-/321012577542?pt=LH_DefaultDomain_0&hash=item4abdd73506


ok nice, do they really make a big improvement? the price is cheap so I might buy these, even though what I really need is something for the VRMs - if you find anything for the VRMs, tell me.


----------



## rdr09

Quote:


> Originally Posted by *hatlesschimp*
> 
> As people are getting refunds I ordered a ASUS R9 290x. Lol
> 
> I wonder how I will go.


if you have installed an nvidia card in your system in the past, then you may have to take a few more drastic steps prior to installing the 290X . . .

http://www.overclock.net/t/1265543/the-amd-how-to-thread

i suggest a clean install of os.


----------



## Sazz

Quote:


> Originally Posted by *chropose*
> 
> Ouch..Really? That's too bad. I want Hynix.


It doesn't really matter.. the first 290X I had had 70.2% ASIC with Hynix memory, and that one reached a max OC of 1100/1290 at stock voltage, but I sent it back coz the mini-DisplayPort stopped putting out an image. The replacement I have now has 69.9% ASIC with Elpida memory and it maxes out at a 1125/1360 OC at stock voltage.

So really, memory OC is just the same as any other part you OC: it's a lottery. You may end up with a good OC card on both memory and GPU, but at the same time you may not get a good OC card at all.


----------



## Supranium

When I ran BF4 tests I got a black screen on my first try @ 1150/1500. What I did was simply up the fan speed to 50%, and got nice stable gameplay. I suspect it wasn't the RAM limit but rather a high temperature on some component. The GPU stayed around 84C.


----------



## mikep577

Quote:


> Originally Posted by *evensen007*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mikep577*
> 
> Thnx, I'm using PCI port 1 and 3 (both PCi3.0 x16
> 
> 
> 
> Yes, you do in fact need the 3x spacing, but wait... What? Why do you want the quad? You only have 2x 290's, right? In that case, order this:
> http://www.performance-pcs.com/catalog/index.php?main_page=product_info&cPath=59_971_1018_1038_1207&products_id=38981


Hi evensen007,

Great info - it seems that this bridge will indeed fit my board/PCI/two-card config.
I have ordered it directly from EK this morning... should look great once finished.


----------



## Jpmboy

Quote:


> Originally Posted by *rdr09*
> 
> if you have installed a nvidia card in your system in the past, then you may have to do a bit more drastic steps prior to the 290X . . .
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread
> 
> i suggest a clean install of os.


So I've switched out NV and AMD on this rig (Titans and R9 290X), W7 x64. No black screens. I use a sweeper to remove registry entries.


----------



## rdr09

Quote:


> Originally Posted by *Jpmboy*
> 
> So I've switched out NV and AMD on this rig (titans and r390x) W7x64. No black screens, Use a sweeper to remove registry entries.


you sound seasoned. others will use a sweeper carelessly and end up borking their rig.

any news about AB with voltage control?


----------



## Gilgam3sh

Quote:


> Originally Posted by *rdr09*
> 
> you sound seasoned. others will use sweeper carelessly and end up borking their rig.
> 
> any news about the AB with voltage control?


you can follow AB news here: http://forums.guru3d.com/forumdisplay.php?f=55


----------



## chropose

Quote:


> Originally Posted by *Sazz*
> 
> It doesn't really matter.. the first 290X I had had 70.2% ASIC with Hynix memory, and that one reached a max OC of 1100/1290 at stock voltage, but I sent it back coz the mini-DisplayPort stopped putting out an image. The replacement I have now has 69.9% ASIC with Elpida memory and it maxes out at a 1125/1360 OC at stock voltage.
> 
> So really, memory OC is just the same as any other part you OC: it's a lottery. You may end up with a good OC card on both memory and GPU, but at the same time you may not get a good OC card at all.


I see. Good to know that.

My 7970 is Hynix and it can reach 1900 MHz on stock voltage. Valley stable, but with a few artifacts.


----------



## hatlesschimp

I may get a second card in the next week or two but I have a gtx 660 that I was using for sound and physx. Can I still run with that card in my system?


----------



## HoZy

Second watercooled 290x rig done.

Add me again to the first page?









This one is a Gigabyte with Hynix.


----------



## Sherp

Put a Gelid Icy Vision REV 2 on my 290. Voltages are stock, set my clocks to 1100/1300, idle temps are about 35°C (ambient temp = 24°C) and full load stays under 70°C with Furmark. VRM seems to hit about 98°C in Furmark, which is not a temperature I'm comfortable with, but it's not likely they'll hit those temps while gaming.

Took about an hour to fit the Icy Vision, very easy though and everything fits perfectly. Plus it doesn't sound like a hairdryer any more!

Everything runs perfectly.


----------



## utnorris

Quote:


> Originally Posted by *Scorpion49*
> 
> I don't have a mobile app, and how would I know I could save money using it in the first place?
> 
> 
> 
> 
> 
> 
> 
> Is it pasted somewhere on their site cuz I've really never seen it. Would have tried to get my POS windows phone to download it if I did.


Do a chat with them and let them know you forgot to put the code in, and they will probably refund the 5% once it ships. They've done that for me on several occasions.


----------



## HoZy

Quote:


> Originally Posted by *Sherp*
> 
> Put a Gelid Icy Vision REV 2 on my 290. Voltages are stock, set my clocks to 1100/1300, idle temps are about 35°C (ambient temp = 24°C) and full load stays under 70°C with Furmark. VRM seems to hit about 98°C in Furmark, which is not a temperature I'm comfortable with, but it's not likely they'll hit those temps while gaming.
> 
> Took about an hour to fit the Icy Vision, very easy though and everything fits perfectly. Plus it doesn't sound like a hairdryer any more!
> 
> Everything runs perfectly.


Mine's watercooled & I've never seen my VRMs over 41C in Furmark.

98C is something I'd definitely not be comfortable with.

Cheers
Mat


----------



## Supranium

Unless you use a waterblock, the VRMs will go over 90. If I'm right, their maximum operating temperature is 125C, so it's not too much of a problem if you stay under 100 at all times.


----------



## Sainth

Any fix for Crossfire yet? In BF4 etc...


----------



## hatlesschimp

What? An AMD-endorsed game has problems with Crossfire??????


----------



## selk22

As far as you guys blaming the black screens on corrupted driver installs due to leftover registry entries and such - in my opinion that just is not correct. I can assure you I uninstalled the drivers fully and correctly, down to the registry entries. We have also had users do fresh installs and still have issues with the black screen.

Like Sgt Bilko said, at stock things seem fine.. I am just anticipating the AB update


----------



## raptor15sc

Quote:


> Originally Posted by *hatlesschimp*
> 
> What an amd endorsed game has problem with crossfire??????


Not that I know of. Sure, there's some micro-stutter, but it's not that bad and SLI is comparable.

The only real 290X problem I've run into is with quadfire only--it doesn't work right with Vsync (at least in my multi-monitor setup).


----------



## rdr09

Quote:


> Originally Posted by *Gilgam3sh*
> 
> you can follow AB news here: http://forums.guru3d.com/forumdisplay.php?f=55


thanks. +rep.
Quote:


> Originally Posted by *selk22*
> 
> As far as you guys blaming the black screens on corrupted driver installs due to leftover registry entries and such - in my opinion that just is not correct. I can assure you I uninstalled the drivers fully and correctly, down to the registry entries. We have also had users do fresh installs and still have issues with the black screen.
> 
> Like Sgt Bilko said, at stock things seem fine.. I am just anticipating the AB update


you might be referring to me as one of those who blame a borked registry as the cause of these blackscreens. i do not believe that either, but it can happen. these blackscreens some are experiencing may be caused by a number of things, including the driver itself. some were using a 120Hz monitor and got blackscreens; reverting down to 60Hz solved it, so it could very well be the driver. others who have oc'ed their cards, and still others who have flashed their cards, add to the complexity of zeroing in on the cause.









imo, though, driver sweeper should be used with caution. i've never used it myself, but i've read of members having trouble with it and others getting good results.


----------



## Jpmboy

Quote:


> Originally Posted by *rdr09*
> 
> you sound seasoned. others will use sweeper carelessly and end up borking their rig.
> 
> any news about the AB with voltage control?


Lightly salted..









I really think it is simply due to the drivers and the launch BIOS on some cards. And yes, use a sweeper, Atiman, or DDU in safe mode only, and pay attention to what you ask it to do. Too many users clean out mobo drivers at the same time.

After using any sweeper, open an elevated command prompt and type in:

sfc /scannow

Let it run. Then install your new drivers.


----------



## hatlesschimp

I'm doing a fresh install when the card arrives.

What Windows should I install? I have Win7 installed, but since the last reinstall 2 months ago it keeps telling me I have an unlicensed version. Is the new Windows any good now?


----------



## evensen007

Quote:


> Originally Posted by *mikep577*
> 
> Hi evensen007,
> 
> Great info, it seems that this bridge indeed fit in my board/pci/two card config.
> I have ordered it directly at EK this morning... should look great once finished


No problem, Mike! Sorry, I was a little confused about the crazy spacing on that monster motherboard of yours!


----------



## Connolly

So is micro stutter present with 2 x R9 290s? I planned on grabbing another card in a couple of months' time, but won't bother if it is. I've never had a Crossfire or SLI setup - can you expect some micro stutter no matter what card/setup you choose? I'll cancel my 290 order and go for a GTX 780 if it is indeed better with regard to stutter, or should I expect a similar level?


----------



## Jpmboy

Quote:


> Originally Posted by *hatlesschimp*
> 
> Im doing a fresh install when the card arrives.
> 
> What Windows should I install? I have win7 installed but since the last reinstall 2 months ago it keeps telling me I have an unlicensed version. Is the New Windows any good now?


I haven't warmed up to 8.1 yet. But a licensed version of either is always best!


----------



## Need-To-Know

Hey guys - I have read through 6 or 7 pages so excuse me if this has been answered.

I am looking at options to run three 27" 1080p monitors. Having asked around on the EVGA forum while thinking about getting two GTX 770 4GB cards, someone pointed out that the 290 might be the better option.
So I came here to check a couple of things and get you guys' opinion.

One thing my research has led me to was one guy saying,
in reference to black screens:
"The reason so far seems to be related to the ones that are unlucky enough to have Elpida memory modules. Not all, but so far over at overclock.net there are dozens upon dozens of folks, and out of all of them only one is experiencing problems and has a Hynix module."

What is the general consensus on this?
What card manufacturer is recommended?

Thanks for your experience and help!


----------



## Gilgam3sh

the GTX 770 can't even use all of that 4GB of RAM... and yes, the 290/290X is way better.


----------



## Sainth

I get crappy results with my Crossfire setup; in BF4 I get almost the same performance as I did with one card. Used a driver sweeper yesterday and cleared all the old nvidia drivers. I still think an updated Catalyst is needed.


----------



## the9quad

Quote:


> Originally Posted by *Need-To-Know*
> 
> Hey guys - I have read through 6 or 7 pages so excuse me if this has been answered.
> 
> I am looking at options to run three 27" 1080p monitors. Having asked around on the EVGA forum while thinking about getting two GTX 770 4GB cards, someone pointed out that the 290 might be the better option.
> So I came here to check a couple of things and get you guys' opinion.
> 
> One thing my research has led me to was one guy saying,
> in reference to black screens:
> "The reason so far seems to be related to the ones that are unlucky enough to have Elpida memory modules. Not all, but so far over at overclock.net there are dozens upon dozens of folks, and out of all of them only one is experiencing problems and has a Hynix module."
> 
> What is the general consensus on this?
> What card manufacturer is recommended?
> 
> Thanks for your experience and help!


Doesn't matter what brand - they all have a mix of Elpida and Hynix, so it's hard to tell what you will get. Best advice is to just buy one, but be prepared to RMA it if you need to. Chances are you'll be fine, but be aware it's a potential issue.


----------



## overclockFrance

Quote:


> Originally Posted by *Gilgam3sh*
> 
> ok nice, do they really make a big improvement? price is cheap so I might buy these even though I would need it for the VRM, if you find something for the VRMs tell me.


Copper is a far better thermal conductor than aluminium, so I advise you to buy some.

Unfortunately, for the VRMs, the only solution is a 3 mm thick copper sheet: cut it and use adhesive thermal paste to glue it onto the VRMs. Then you can find a copper heatsink and glue it onto the 3 mm copper sheet. But it is a lot of work.


----------



## alancsalt

But aluminium is a better radiator of heat, so taking both properties together, conduction and radiation, it pretty much balances out....
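For what it's worth, the copper-vs-aluminium trade-off is easy to put rough numbers on. Here's a back-of-the-envelope sketch; all values are assumptions on my part (handbook-style conductivities of ~400 vs ~237 W/m·K, emissivities of ~0.05 for bare copper vs ~0.85 for anodized aluminium, and an arbitrary 1 cm² / 3 mm / 90 C geometry), not measurements of any actual heatsink:

```python
# Back-of-the-envelope: conduction vs. radiation for copper and aluminium.
# All material values below are assumed handbook figures, not measurements.
SIGMA = 5.67e-8  # Stefan-Boltzmann constant, W/(m^2 * K^4)

materials = {
    "copper":    {"k": 400.0, "eps": 0.05},  # k in W/(m*K); bare copper emissivity
    "aluminium": {"k": 237.0, "eps": 0.85},  # anodized aluminium emissivity
}

def conduction_w(k, area_m2=1e-4, thickness_m=3e-3, delta_t=10.0):
    """Fourier's law: heat pushed through a 3 mm base, 1 cm^2, 10 K drop."""
    return k * area_m2 * delta_t / thickness_m

def radiation_w(eps, area_m2=1e-4, t_hot=363.0, t_amb=297.0):
    """Stefan-Boltzmann: net radiation from 1 cm^2 at 90 C into a 24 C room."""
    return eps * SIGMA * area_m2 * (t_hot**4 - t_amb**4)

for name, m in materials.items():
    print(f"{name:9s}: conduction ~{conduction_w(m['k']):.0f} W, "
          f"radiation ~{radiation_w(m['eps']) * 1000:.1f} mW")
```

At heatsink scales conduction moves watts while radiation moves only milliwatts, so copper's conductivity edge matters more at the base than aluminium's emissivity edge at the fins. In practice convection from airflow (which this sketch ignores entirely) dominates both, which is probably why the difference often does feel like it balances out.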


----------



## hatlesschimp

I'm under the impression that if you're looking to boost the clocks, then be aware there could be black screens coming your way. But I normally run my cards just above stock, I like reliability, and I don't want to be sitting for hours, days, weeks testing overclocks on benchmarks.

I've ordered an Asus 290X and am hoping it's a decent card. I may order a second card if I have some coin left over.


----------



## Need-To-Know

Quote:


> Originally Posted by *the9quad*
> 
> Doesn't matter what brand - they all have a mix of Elpida and Hynix, so it's hard to tell what you will get. Best advice is to just buy one, but be prepared to RMA it if you need to. Chances are you'll be fine, but be aware it's a potential issue.


Thanks for the reply - I guess the next question then would be: who has the better RMA process? I promise you it is not Asus, as I went through that once and it was a nightmare.

Maybe you guys are biased toward the 290 (being a 290 thread and all







) but would you say 290s are the way to go instead of two GTX 770s (4 gig needed for three screens, from what I understand)? Considering drivers and such.

I do like eyefinity on my current ASUS HD7950-DC2T-3GD5.


----------



## Jpmboy

Quote:


> Originally Posted by *Sainth*
> 
> I get crappy results with my crossfire setup, in bf4 i get almost the same preformance as i did with one. Used a drive sweeper yesterday, cleared all old nvidia drivers. I still think an updated catalyst is needed.


Which driver did you install? In gpuz are both cards getting loaded while bf4 is running?


----------



## Sainth

Quote:


> Originally Posted by *Jpmboy*
> 
> Which driver did you install? In gpuz are both cards getting loaded while bf4 is running?


It did; I will check again when I get home. GPU-Z said that 2 GPUs were active - what could be wrong?

It can't be a CPU bottleneck, can it? Running an i7 3770K clocked to 4.6 GHz.


----------



## Sainth

Quote:


> Originally Posted by *Jpmboy*
> 
> Which driver did you install? In gpuz are both cards getting loaded while bf4 is running?


I'm running the latest beta that was released on the 10th?


----------



## BigGuns73

Quote:


> Originally Posted by *skupples*
> 
> If it is, what he thinks it is, which i'm doubting it is. The chip would show the 290 imprint, but would of some how missed the laser treatment. This type of stuff is EXTREMELY rare.


I never said that I think it's an unlocked R9 290X; I said that I could only HOPE that it is... and I mentioned more than once that I was doubtful. I'm gonna run the Valley bench again today with the 13.8 beta driver on my Windows 8.1, because from what I hear it performs better than the latest beta.
Quote:


> Originally Posted by *Kelwing*
> 
> Playing around with cpu and gpu clocks. Running Valley
> 
> Win 8.1
> FX8350
> R9 290
> Drivers are 13.11 Beta 9.2
> 
> 2216 with stock CPU/GPU clocks
> 2286 with stock CPU/GPU @1000
> 2258 with CPU at 4.4g and GPU Stock
> 2334 with CPU at 4.4 and GPU at 1000, CPU temp 40*c and GPU 56*c with 60% fan speed. (Artic Accelero III)
> 
> Good, bad, could be tweaked more? Plus I can't make heads or tails of it. What is the little switch on side of board for? It is positioned towards back(where monitor hooks up)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s222.photobucket.com/user/mistwalker7/media/00003_zpse80002eb.jpg.html
> 
> http://s222.photobucket.com/user/mistwalker7/media/100_1573_1_zps3ce8b10c.jpg.html


That's right in line with what I get on the R9 290 BIOS, and you are also on Win 8.1 using the same drivers.


----------



## shilka

Quote:


> Originally Posted by *BigGuns73*
> 
> Thats right in line with what I get on the R9 290 bios and you are also on win 8.1 using the same drivers.


Not trying to sound rude or anything, but that PSU you have there can't do 850 watts

In case you did not know already

It might do 600-650 watts on a good day, if it feels like it


----------



## BigGuns73

Quote:


> Originally Posted by *DampMonkey*
> 
> my cards out performing his, thats all i need to sleep well tonight


And you are also on Win 7, on which Valley is known to run an average of 2-3 fps higher than Win 8. It's very well possible that your card is not performing any better than mine, given mine was already nipping at your heels on Win 8.1 with the latest beta, which is also supposedly not that great.


----------



## brazilianloser

Back a few pages on the Newegg return issue... good thing, I guess, that I returned mine while they had none of the cards in stock, so my RMA is set for a refund instead. I think they do refunds as long as you plan on repurchasing something similar. Now, due to the high number of RMAs, I am pretty sure that will cause them to say no to various returns, like they did for the one guy a few pages back.


----------



## BigGuns73

Quote:


> Originally Posted by *shilka*
> 
> Not trying to sound rude or anything but that PSU you have there cant do 850 watts
> 
> If you did not know already
> 
> It might do 600-650 watts on a good day if it feels like it


Um, it's not an 850 watt unit. It's a Gold-rated Seasonic 1000 watt unit and it will do every bit of it.


----------



## shilka

Quote:


> Originally Posted by *BigGuns73*
> 
> Um, its not an 850 watt unit. Its a Gold rated Seasonic 1000watt unit and it will do every bit of it.


You showed a picture with an 850 watt TR-2 RX


----------



## rdr09

Quote:


> Originally Posted by *BigGuns73*
> 
> Um, its not an 850 watt unit. Its a Gold rated Seasonic 1000watt unit and it will do every bit of it.


looking forward to more results. i support you - Big. 'cause i own a 290.









Run 3DMark11 already. thanks.


----------



## BigGuns73

Quote:


> Originally Posted by *th3illusiveman*
> 
> That's terrible, some 7970s were getting 60 fps in valley. Heck mine got 54.5fps @ 1300Mhz, whats up with these things and Valley? That 512 bit bus should give them a massive advantage against GK110 and Tahiti in that bench.
> 
> They have excellent performance in almost everything else.


It would take a massively clocked 7970 to hit 60 fps in Valley, because mine clocked at 1200 MHz only hit 47 fps and yours at 1300 MHz is hitting 54.5 fps, so I'm guessing 1400+ on the core, which is not common at all. But yes, I agree, Valley is not performing well on the R9 290s; give the drivers time.
Quote:


> Originally Posted by *shilka*
> 
> You showed a picture with a 850 watts TR-2 RX


No I did not. I may have quoted someone else, but I think I know what my rig runs, bro.


----------



## shilka

It was Kelwing who has that TR-2 RX.

Sorry to say, but you can't do Crossfire with that PSU.


----------



## battleaxe

Quote:


> Originally Posted by *BigGuns73*
> 
> I never said that i think its an unlocked R9 290X, I said that i could only HOPE that it is... and mentioned more than one time that I was doubtful. Im gonna run the valley bench again today with 13.8 beta driver on my windows 8.1 because from what I hear it performs better than the latest beta.


I'm giving you a +1 rep just because of the flak and crap you seem to be getting here. I've been reading your posts since your very first one, and not once did I understand you to be anything other than excited that you _MAY_ have gotten a 290X instead of a 290 product.

I understood that from your posts. It wasn't hard; all I had to do was read. Obviously you may not have a 290X chip in that 290 (or whatever), but you raised a good question, and one worth looking into without getting hammered senselessly. And no one has been able to prove that you don't have a 290X there either. Who really cares, honestly? Maybe it's a GPU-Z glitch; maybe that chip was a rework that never got lasered and somehow made its way into a 290. It doesn't really matter, 'cause right now it kinda looks like it COULD be a 290X chip you've got there. (just like you said)

Sorry you've had to put up with all this, honestly. It's a little embarrassing.


----------



## BigGuns73

You are looking at Kelwings rig, not mine.


----------



## Sherp

Quote:


> Originally Posted by *HoZy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sherp*
> 
> Put a Gelid Icy Vision REV 2 on my 290. Voltages are stock, set my clocks to 1100/1300, idle temps are about 35°C (ambient temp = 24°C) and full load stays under 70°C with Furmark. VRM seems to hit about 98°C in Furmark, which is not a temperature I'm comfortable with, but it's not likely they'll hit those temps while gaming.
> 
> Took about an hour to fit the Icy Vision, very easy though and everything fits perfectly. Plus it doesn't sound like a hairdryer any more!
> 
> Everything runs perfectly.
> 
> 
> 
> Mines watercooled & I've never seen my VRM over 41c in furmark.
> 
> 98c is something I'd definitely not be comfortable with.
> 
> Cheers
> Mat
Click to expand...

I might watercool in the future, but it's not a priority for me. I think the VRM was at a similar temp with the stock cooler though.
Quote:


> Originally Posted by *Supranium*
> 
> Unless you use waterblock the VRMs will go over 90. If im right, their maximum operating temperature is 125C. So its not too much of the problem if you stay under 100 at all times.


And yeah I've heard the same, I might fiddle about with the VRM heatsink I have or buy a new one. It's always worrying seeing something hit 100C though.


----------



## shilka

Quote:


> Originally Posted by *BigGuns73*
> 
> You are looking at Kelwings rig, not mine.


Yes, I saw that - sorry.


----------



## BigGuns73

Quote:


> Originally Posted by *battleaxe*
> 
> I'm giving you a +1 rep just because of the flak and crap you seem to be getting here. I've been reading your posts since your very first one, and not once did I understand you to be anything other than excited that you _MAY_ have gotten a 290X instead of a 290 product.
> 
> I understood that from your posts. It wasn't hard; all I had to do was read. Obviously you may not have a 290X chip in that 290 (or whatever), but you raised a good question, and one worth looking into without getting hammered senselessly. And no one has been able to prove that you don't have a 290X there either. Who really cares, honestly? Maybe it's a GPU-Z glitch; maybe that chip was a rework that never got lasered and somehow made its way into a 290. It doesn't really matter, 'cause right now it kinda looks like it COULD be a 290X chip you've got there. (just like you said)
> 
> Sorry you've had to put up with all this, honestly. It's a little embarrassing.


Sir thank you very much for the kind words, that helps a lot.


----------



## rdr09

Quote:


> Originally Posted by *BigGuns73*
> 
> That would take a massively clocked 7970 to hit 60fps in valley because mine clocked at 1200mhz only hit 47fps and yours with 1300mhz is hitting 54.5 fps so im guessing 1400+ on the core which is not common at all. But yes I agree, valley is not performing well on the R9 290's, give the drivers time.


see, if you have win7 you'll get 50. there are 7950s that get 54 in Valley without tweaks. mine did 51.2 at 1255 core.

edit: valley relies heavily on the vram oc. i saw a 290 score 70 recently.


----------



## BigGuns73

Quote:


> Originally Posted by *shilka*
> 
> Yes i saw that sory


No problem man.
Quote:


> Originally Posted by *rdr09*
> 
> see, if you have win7 you'll get 50. there are 7950 that gets 54 in Valley without tweaks. mine did 51.2 at 1255 core.


Right, which means that my BIOS-flashed R9 290 would be hitting 60+ fps, just like the other guy here with the R9 290X on Win 7.


----------



## Jpmboy

Quote:


> Originally Posted by *Sainth*
> 
> I'm running the latest beta that was released on the 10th?


Quote:


> Originally Posted by *Sainth*
> 
> It did, i will check again when i get home. Gpu-z said that 2 gpus were active, what could be wrong?
> It cant be cpu bottleneck? Running a i7 3770k clocked to 4.6 ghz.


It's not likely the CPU. Tab out of BF4 and check the sensors tab in GPU-Z; make sure both GPUs are being used, and that both are hot.


----------



## Sainth

Quote:


> Originally Posted by *Jpmboy*
> 
> It's not likely the cpu. Tab out of bf4 and Check the sensors tab in gpuz , make sure both gpus are being used, and that both are hot.


I will double-check it when I get home


----------



## alancsalt

Quote:


> Originally Posted by *BigGuns73*
> 
> Quote:
> 
> 
> 
> Originally Posted by *shilka*
> 
> Yes i saw that sory
> 
> 
> 
> No problem man.
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> see, if you have win7 you'll get 50. there are 7950 that gets 54 in Valley without tweaks. mine did 51.2 at 1255 core.
> 
> Click to expand...
> 
> Right which means that my bios flashed R9 290 would be hitting 60+fps just like the other guy here with the R9 290X on win 7.
Click to expand...

I don't know if you are spamming for post count or something, but if no one else has posted since you last posted, please edit your last post rather than making a new one.


----------



## BigGuns73

I still have Win 7 64; I might install it later today on a spare drive (if I can find one) and retest Valley.


----------



## BigGuns73

Quote:


> Originally Posted by *alancsalt*
> 
> I don't know if you are spamming for post count or something, but if no one else has posted since you last posted, please edit your last post rather than making a new one.


I don't care about post count and I'm not trying to spam anything; I'm just responding to questions like everyone else. I'm sorry; apparently I'm breaking every rule around here without even trying.


----------



## mickeykool

What are you guys using to overclock the 290s? I just want to push my card a little past stock settings. Any tips or guides I can read up on?

thanks


----------



## BigGuns73

Quote:


> Originally Posted by *mickeykool*
> 
> What are you guys using to over clock the 290s? I just want to push my card a little on stock settings. Any tips and guide I can read up on.
> 
> thanks


MSI Afterburner works well; I think an updated version is due out soon, if it's not out already.


----------



## selk22

I believe he is referring to just general forum etiquette...

AB is the best I have found right now. CCC was weird


----------



## Stay Puft

Quote:


> Originally Posted by *BigGuns73*
> 
> MSI afterburner works well, I think an updated one is due out soon if its not out already.


Unwinder said that Beta 17 has been submitted to MSI for approval and has 290X/290 Voltage control

http://forums.guru3d.com/showpost.php?p=4699424&postcount=176


----------



## Slomo4shO

Quote:


> Originally Posted by *kantxcape*
> 
> the 290 price is.


Prob still better to wait for non-reference cards at this rate.


----------



## Ukkooh

Does anyone have any idea why club3d 290xs jumped significantly in price recently (~50-80€)?


----------



## Stay Puft

Quote:


> Originally Posted by *Ukkooh*
> 
> Does anyone have any idea why club3d 290xs jumped significantly in price recently (~50-80€)?


Probably due to availability. Less supply + Higher demand = Higher prices


----------



## NaifQK

Add me please


----------



## battleaxe

We should start seeing the non ref 290's in about a week correct?


----------



## Kuat

Quote:


> Originally Posted by *Arizonian*
> 
> The ACX has only been sold, in amazingly low quantities, through EVGA twice now. Places like Newegg / Amazon have not had one yet. No other aftermarket card has even been offered by any other vendor, let alone any non-reference design shown, including on their own websites.
> 
> Multiple sources have now said the latter part of November for AMD. Nvidia news indicates they are going to be ready to counter, but not before. I have a feeling it's to see where AMD will price, and whether other price drops happen, so they don't come out higher and end up with egg on their face, is my opinion.
> 
> Gigabyte showed their aftermarket Windforce for review but hasn't put it up for sale anywhere, and hasn't given TechSpot a date either.


Yeah, for me it's not really about if they are available for sale or not. I just want to see the reviews to get an idea what the aftermarket cards are gonna be like.

ACX / Windforce reviews were pretty informative (OC, power, noise, cooling)


----------



## battleaxe

Quote:


> Originally Posted by *Kuat*
> 
> Yeah, for me it's not really about if they are available for sale or not. I just want to see the reviews to get an idea what the aftermarket cards are gonna be like.
> 
> ACX / Windforce reviews were pretty informative (OC, power, noise, cooling)


Yeah, I can't imagine this will take longer than Thanksgiving, though. The vendors won't want to miss out on Black Friday shopping, at the very latest, I wouldn't think.

If I do get one, this is when I'm going to pick up my 290. I'm not expecting any sales or anything, but there might be some nice Newegg codes and such... let's hope.


----------



## Arizonian

Quote:


> Originally Posted by *NaifQK*
> 
> Add me please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## bond32

UPS just dropped mine off! Oh man I am excited. Got the Sapphire 290x from amazon when I saw it in stock. Already installed it; the card itself was freezing cold from being on the truck lol.

Benchmarks/pics to come. I have 2 big tests so once those are done I will begin lots of tests. Then hopefully order a water block... Have 960mm of rad space for cpu and gpu only!


----------



## rdr09

Wake up!

Block is out for delivery!









@bond, the hell with the tests. test your gpu. lol.



----------



## DampMonkey

Quote:


> Originally Posted by *BigGuns73*
> 
> And you are also on Win 7, where Valley is known to run an average of 2-3 fps higher than Win 8. It's very well possible that your card is not performing any better than mine, given mine was already nipping at your heels on Win 8.1 with the latest beta, which is also supposedly not that great.


Except for the fact that I have the worst-performing 290X I've seen in Valley. Here's a snip of me getting passed by 4 fps by a 290 at the same clocks. This would make yours the new worst-performing 290X, if you do indeed have one.
Quote:


> Quote:
> 
> 
> 
> Originally Posted by *rv8000*
> 
> Valley Extreme HD -- Sapphire r9 290 -- 1150/1575 -- stock bios/volts -- stock cooling -- 71.5 fps
> 
> 
> 
> Again going above 1500 on the memory seems to yield little to no result, ecc? I could see getting 76-78 fps out of this card if it's a good clocker, can't wait for aftermarket cooling!
> 
> 
> 
> Anyone know why a 290x would get 4 fps less than a 290 at the same clocks? I'd really hope the cpu doesn't play this big of a role in valley


----------



## bond32

Quote:


> Originally Posted by *rdr09*
> 
> Wake up!
> 
> Block is out for delivery!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @bond, the hell with the tests. test your gpu. lol.
> 
> kid kid


Dangit! must focus...... Benchmarks/clocks/temperatures are fighting it out against HVAC Design/Psychometrics in my head!

But seriously, I can't wait to test this card. Still planning on going with the Koolance Block likely friday (Hellfire PC said they will have it in stock). No backplate for me... Possibly after christmas I will purchase a second 290x. Gaming only on 1 monitor, 1440p 120hz (X-Star).

Had a 780 Lightning, then returned it when the prices fell. That was, what, 3 weeks ago? Been on the 4770K's onboard iGPU since... Had to play on my 1080p TV at low settings, but hey, at least I could play BF4, so I'm not complaining.


----------



## rdr09

Quote:


> Originally Posted by *bond32*
> 
> Dangit! must focus...... Benchmarks/clocks/temperatures are fighting it out against HVAC Design/Psychometrics in my head!
> 
> But seriously, I can't wait to test this card. Still planning on going with the Koolance Block likely friday (Hellfire PC said they will have it in stock). No backplate for me... Possibly after christmas I will purchase a second 290x. Gaming only on 1 monitor, 1440p 120hz (X-Star).
> 
> Had a 780 Lightning then returned it when the prices fell. That was what 3 weeks ago? Been on onboard 4770k igpu since... Had to play on my 1080p tv at low settings but hey, at least I could play bf4 so not complaining.


lol. Go back and study. unplug that rig. throw away the power cord.


----------



## Kipsofthemud

Quote:


> Originally Posted by *HoZy*
> 
> Second watercooled 290x rig done.
> 
> Add me again to the first page?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This one is a Gigabyte with Hynix.


My Gigabyte is on the way, good to hear that yours came with Hynix


----------



## eternal7trance

So newegg came out with 5% off on the normal website and I finally got a 290 for $380


----------



## battleaxe

Quote:


> Originally Posted by *bond32*
> 
> Dangit! must focus...... Benchmarks/clocks/temperatures are fighting it out against HVAC Design/Psychometrics in my head!
> 
> But seriously, I can't wait to test this card. Still planning on going with the Koolance Block likely friday (Hellfire PC said they will have it in stock). No backplate for me... Possibly after christmas I will purchase a second 290x. Gaming only on 1 monitor, 1440p 120hz (X-Star).
> 
> Had a 780 Lightning then returned it when the prices fell. That was what 3 weeks ago? Been on onboard 4770k igpu since... Had to play on my 1080p tv at low settings but hey, at least I could play bf4 so not complaining.


You were actually able to play bf4 on low with the iGPU? That's impressive for the 4770 really, I have to say.


----------



## battleaxe

Quote:


> Originally Posted by *eternal7trance*
> 
> So newegg came out with 5% off on the normal website and I finally got a 290 for $380


And what was the code? How long is it good for? I want a non ref card. Arghhh.........!


----------



## bond32

Quote:


> Originally Posted by *battleaxe*
> 
> You were actually able to play bf4 on low with the iGPU? That's impressive for the 4770 really, I have to say.


Yeah, I was playing BF4 at 1080p, all lowest settings but I could play. Not too bad really, although it's running at 4.8 ghz.

Edit: What's the name of that utility that shows the memory info on the card?


----------



## jerrolds

Quote:


> Originally Posted by *geggeg*
> 
> I tweeted AMD gaming, AMD Radeon and Roy about this and made my displeasure at having this happen to existing R9-290 owners be known. If more people let them know via Facebook or twitter maybe they will respond. Otherwise this is a really bad PR move when Nvidia is actually having a great gaming bundle themself.


Yup, I am pissed about this - I just tweeted @AMDGaming and am going to post on their FB page; hopefully the backlash will make it better for us. If anything, day-1 purchasers should get the bonuses.


----------



## BigGuns73

Quote:


> Originally Posted by *DampMonkey*
> 
> Except for the fact that I have the worst performing 290x i've seen at valley. Heres a snip of me getting passed by 4fps from a 290 at the same clocks. This would make you the new worst performing 290x if you do indeed have one


Not necessarily; again, Win 8.1 is slower in Valley than Win 7 (which I plan to reinstall on a separate drive today to do a rerun).


----------



## bond32

What should the card be idling at?? Mine just got to 51 C and was climbing... Dropped the refresh rate back to 60 hz and it seems to be falling back down now... on 13.11 drivers


----------



## rdr09

Quote:


> Originally Posted by *bond32*
> 
> What should the card be idling at?? Mine just got to 51 C and was climbing... Dropped the refresh rate back to 60 hz and it seems to be falling back down now... on 13.11 drivers


mine idles at 46C (290). the memory app is in Post #2.


----------



## eternal7trance

Quote:


> Originally Posted by *battleaxe*
> 
> And what was the code? How long is it good for? I want a non ref card. Arghhh.........!


NAFSAVE5NOV13Q


----------



## chiknnwatrmln

Why does everyone hate Elpida memory? To my understanding, it runs slower but has tighter timings. Wouldn't this make performance exactly the same?

Someone please correct me if I'm wrong.


----------



## bond32

Quote:


> Originally Posted by *rdr09*
> 
> mine idles at 46C (290). the memory app is in Post #2.


Thanks. I had some issue and it seemed the memory clock was stuck on 1250, wouldn't down clock which caused the high temps. Going to use 96 hz for now until I figure out more.


----------



## rdr09

Quote:


> Originally Posted by *bond32*
> 
> Thanks. I had some issue and it seemed the memory clock was stuck on 1250, wouldn't down clock which caused the high temps. Going to use 96 hz for now until I figure out more.


Oh, yeah. One reported cause of the black screens was someone running at 120Hz. Need to up the fan, methinks, or lower the Hz like you are doing.


----------



## jerrolds

I tested for black screens yesterday and luckily I believe I am immune. Sapphire BF4 290X w/ Elpida memory, stock fan/BIOS/voltage, overclocked to 1110/1400: I was able to play BF4 and Batman Origins for a few hours without black screening. This is with a 120Hz 1440p QNIX display and the new beta 9.2 drivers; ASIC 75%.

Then I flashed to the Asus BIOS and pushed it to 1190 @ 1.29-1.35v (after vdroop, mostly hovering around 1.29v), 100% fan speed, and ran 3DMark and Heaven without issue. I think I can hit 1200MHz on the stock heatsink/fan with lower voltages, but I'll be slapping on the Gelid ICY 2 tonight and aiming for 1220MHz+ on the core.

VRM temps were about 60 and 70C as reported by GPU-Z. I may log them to file so I can compare against the Gelid tonight.
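
If you do log the sensors to a file, a minimal Python sketch like this can pull the peak VRM temperature out of a GPU-Z log for the before/after comparison. The column name below is only a placeholder, since GPU-Z headers vary by card and driver - check your own log's header row:

```python
import csv

def max_sensor(path, column):
    """Return the highest value logged for one GPU-Z sensor column."""
    with open(path, newline="") as f:
        reader = csv.DictReader(f, skipinitialspace=True)
        values = []
        for row in reader:
            # GPU-Z pads fields with spaces, so normalize the keys first
            cleaned = {k.strip(): (v or "").strip() for k, v in row.items() if k}
            try:
                values.append(float(cleaned.get(column, "")))
            except ValueError:
                continue  # skip blank or non-numeric entries
        return max(values) if values else None
```

Run it once against the stock-cooler log and once against the Gelid log, e.g. `max_sensor("GPU-Z Sensor Log.txt", "VRM Temperature [C]")`, and compare the two numbers.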


----------



## bond32

Quick 3dmark11 run, stock gpu clocks. No throttling I think, says max temp was 90 C.


Compare to: http://www.3dmark.com/3dm11/7202906 likely my best score with the 780, although different cpus.


----------



## rdr09

Quote:


> Originally Posted by *jerrolds*
> 
> I tested for Blackscreens yesterday and luckily i believe i am immune. Sapphire BF4 290X w/ Elpida memory using stock fan/bios/voltage overclocked to 1110/1400 i was able to play BF4 and Batman Origins for a few hours without black screening.
> 
> Then i flashed to Asus BIOS and pushed it to 1190 @ 1.29-1.35v (after vdroop, mostly hovering around 1.29v) 100% fan speed and ran 3DMark and Heaven without issue. I think i can hit 1200mhz on stock heatsink fan with lower voltages but ill be slapping on the Gelid ICY 2 tonight and aim for 1220mhz+ on the core.
> 
> VRM Temps were about 60 and 70C reported by GPUZ. I may log them to file so i can compare against Gelid tonight.


No black screens here either; I think if bond's GPU were going to act up, it would have already. Let's hope for the best.


----------



## Gilgam3sh

One-hour Valley benchmark at 1.2GHz...



But I don't know why I get such low scores.


----------



## bond32

A low score usually meant unstable clocks; at least that's how it was when I had 7950s.

And no black screens yet here either, running at 96 hz. So far so good...


----------



## Stay Puft

Quote:


> Originally Posted by *Gilgam3sh*
> 
> one hour valley benchmark at 1,2GHz...
> 
> 
> 
> but I don't know why I get so low scores?


From the GPUZ tab i can see the core clocks jumping all over the place.


----------



## BigGuns73

Quote:


> Originally Posted by *Gilgam3sh*
> 
> one hour valley benchmark at 1,2GHz...
> 
> 
> 
> but I don't know why I get so low scores?


LOL, what the heck?

----------



## youra6

How much does vdroop account for? So if a voltage was set to 1.35V for example, approx. what would the voltage actually be?


----------



## BigGuns73

Quote:


> Originally Posted by *youra6*
> 
> How much does vdroop account for? So if a voltage was set to 1.35V for example, approx. what would the voltage actually be?


I think that equates to around 1.25 actual vcore for me last time I checked.
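
As a quick back-of-envelope, the ~100 mV droop figure owners report in this thread gives numbers in that ballpark. Treat the droop value as an assumption pulled from these posts, not a spec:

```python
# Rough vdroop estimate. The 0.100 V droop is only what owners in this
# thread report seeing under load; every card (and VRM) will differ.
def effective_voltage(set_v, droop_v=0.100):
    return round(set_v - droop_v, 3)

print(effective_voltage(1.35))  # a 1.35 V setpoint lands near 1.25 V under load
```

So a 1.35v setting dropping to roughly 1.25v actual vcore matches what's being reported above.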


----------



## jerrolds

Quote:


> Originally Posted by *youra6*
> 
> How much does vdroop account for? So if a voltage was set to 1.35V for example, approx. what would the voltage actually be?


Depends on your ASIC, I guess? Mine is 75%, and when I maxed the voltage to 1.412v using the Asus BIOS and GPU Tweak, during benches and gameplay it hovered around 1.29v - 1.35v, mostly around 1.3v. I'm going to try lowering the voltage to around 1350mV in GPU Tweak and see how it goes... I don't like that it hits 1.35v on occasion.

This was with the stock cooler; hoping I don't need as much juice with the ICY 2.


----------



## Gilgam3sh

Quote:


> Originally Posted by *bond32*
> 
> Low score usually meant unstable clocks, at least what it was when I had 7950's.
> 
> And no black screens yet here either, running at 96 hz. So far so good...


Well, I just ran a test with stock clocks and got an even worse score, hehe. I don't think it has anything to do with unstable clocks (must be Windows 8.1, lol); it should have crashed when I let it run for about an hour...



Quote:


> Originally Posted by *Stay Puft*
> 
> From the GPUZ tab i can see the core clocks jumping all over the place.


I noticed it did that when there was a change of scenes... but then it went back to 1.2GHz.


----------



## Stay Puft

Quote:


> Originally Posted by *Gilgam3sh*
> 
> well I just ran a test with stock clocks and got even worse score hehe, don't think it has something with unstable clocks (must be Windows 8.1 lol), it should have crashed when I let it run for about one hour...
> 
> 
> I noticed it did that when it was a change of scenes...but then back to 1,2GHz


For a score that low it has to be thermal throttling. What fan % are you using?


----------



## james111333

Hey, can I join the club with this http://www.overclock.net/t/1442321/is-this-the-best-r9-290x-in-the-world-right-now ?


----------



## BigGuns73

Quote:


> Originally Posted by *Stay Puft*
> 
> For a score that low it has to be thermal throttling. What fan % are you using?


Thats what I was thinking.


----------



## Gilgam3sh

Quote:


> Originally Posted by *Stay Puft*
> 
> For a score that low it has to be thermal throttling. What fan % are you using?


I use the Accelero Xtreme III, should it throttle when max GPU temp is 65 celsius?


----------



## BigGuns73

Quote:


> Originally Posted by *Gilgam3sh*
> 
> I use the Accelero Xtreme III, should it throttle when max GPU temp is 65 celsius?


VRM Temps, what are they?


----------



## Stay Puft

Quote:


> Originally Posted by *Gilgam3sh*
> 
> I use the Accelero Xtreme III, should it throttle when max GPU temp is 65 celsius?


Do us all a favor. Pull up the GPUZ sensor screen, Select max for everything and run valley again and post the gpuz.

What fan speed are you using with the Xtreme III? Try 100% fan speed


----------



## BigGuns73

Quote:


> Originally Posted by *Stay Puft*
> 
> Do us all a favor. Pull up the GPUZ sensor screen, Select max for everything and run valley again and post the gpuz.
> 
> What fan speed are you using with the Xtreme III? Try 100% fan speed


It really should be at 100 percent in order to cool the VRMs effectively; a lot of people don't realize this.


----------



## Raephen

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Why does everyone hate Elpida memory? To my understanding, it runs slower but has tighter timings. Wouldn't this make performance exactly the same?
> 
> Someone please correct me if I'm wrong.


From my understanding, bandwidth (MHz) matters more on GPUs than the timings. Or rephrased: higher bandwidth might have a bigger impact on the parallel computations than tighter timings...


----------



## Stay Puft

Quote:


> Originally Posted by *BigGuns73*
> 
> It really should be at 100 percent in order to effectively cool the vrms, a lot of people dont realize this.


Exactly. Even at 100% it's whisper quiet.


----------



## Gilgam3sh

Quote:


> Originally Posted by *Stay Puft*
> 
> Do us all a favor. Pull up the GPUZ sensor screen, Select max for everything and run valley again and post the gpuz.
> 
> What fan speed are you using with the Xtreme III? Try 100% fan speed


here you go: 

I run it at 12v, which I guess is 100%.


----------



## Gilgam3sh

Quote:


> Originally Posted by *Stay Puft*
> 
> Exactly. Even at 100% its whisper quiet


It's quiet, yes, but a little too much when idle; atm I can only run it at 12v as I use the molex that came with the cooler.


----------



## Stay Puft

Quote:


> Originally Posted by *Gilgam3sh*
> 
> here you go:
> 
> I run it at 12v, which I guess is 100%.


Could we get close-ups of GPU-Z, please? I don't feel like putting my face an inch away from the screen to try and read it.


----------



## Gilgam3sh

Quote:


> Originally Posted by *Stay Puft*
> 
> Could we get closeups of GPUZ please. I dont feel like putting my face an inch away from the screen to try and read it


Just press on the picture and then choose Original; it will give you the full size.


----------



## BigGuns73

Quote:


> Originally Posted by *Stay Puft*
> 
> Could we get closeups of GPUZ please. I dont feel like putting my face an inch away from the screen to try and read it


Right click it and open in new tab.


----------



## sugarhell

Try on stock and stock bios


----------



## Raephen

Quote:


> Originally Posted by *Gilgam3sh*
> 
> I use the Accelero Xtreme III, should it throttle when max GPU temp is 65 celsius?


I know the situations are remotely different, but I saw the gpu clock speed jump all over the place in MSI AB's OSD while playing Skyrim.

I only have a single 1080p 60Hz screen and I was getting a full 60 fps no matter what I did, but still it puzzled me.

So I decided to force Vsync always off in CCC. And in Skyrim, the bouncing clock speed vanished.

Could you do me a favour and run the bench while Vsync is forced always off? I'm curious...


----------



## Stay Puft

Quote:


> Originally Posted by *Gilgam3sh*
> 
> just press on the picture and then choose original it will give you full size.












Everything looks good. I'm going to take a guess and say that since it can't read your fan speed, it's running in limp mode.


----------



## BigGuns73

Quote:


> Originally Posted by *Stay Puft*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Everything looks good. I'm going to take a guess and say since it cant read your fan speed its running in limp mode


Can't be that; mine is also hooked up to the 12v and I have no such issues.


----------



## Gilgam3sh

Quote:


> Originally Posted by *Raephen*
> 
> I know the situations are remotely different, but I saw the gpu clock speed jump all over the place in MSI AB's OSD while playing Skyrim.
> 
> I only have a single 1080p 60Hz screen and I was getting a full 60 fps no matter what I did, but still it puzzled me.
> 
> So I decided to to force Vsync always of in CCC. And in Skyrim, the bouncing clock speed vanished.
> 
> Could you do me a favour and run the bench while Vsync is forced always off? I'm curious...


yes I will try that as I run with V-Sync ON....

Quote:


> Originally Posted by *Stay Puft*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Everything looks good. I'm going to take a guess and say since it cant read your fan speed its running in limp mode


OK, fired up BF4 and played for a couple of minutes just to see if I got very bad FPS, but the game was smooth as butter; no problem at all...


----------



## Technewbie

Spoiler: Warning: Spoiler!






Does anyone know how to get higher on the memory? 1465MHz is pretty much the wall I've hit, because anything higher and it black screens. I was hoping to hit 1500MHz. Would I have to back my core clock down in order to increase the memory clock? I know you can back the memory down to increase the core.


----------



## Stay Puft

AMD just must hate valley then


----------



## jerrolds

Quote:


> Originally Posted by *Stay Puft*
> 
> Could we get closeups of GPUZ please. I dont feel like putting my face an inch away from the screen to try and read it


You can view the original if you like - here's the link: http://cdn.overclock.net/b/b3/b308c279_Shot0001.jpeg


----------



## dade_kash_xD

Random question. When I have my system on, there is a little green light that lights up at the end of the PCB (at the opposite end of the input bracket). For some reason, only the green light on the second card lights up. Anybody know why this is or what that green light indicates?


----------



## r0l4n

Hi!

I've created a small Windows application that wraps the VRM console commands for MSI Afterburner (link and link).

Features:
-enables/disables LLC
-applies a VDDC offset between 0v and +0.3v
-allows for gpu selection
-loads profiles on load and/or on idle if desired
-applies these *only* to those processes in an input list
-stores settings to file so one doesn't have to enter them every time
-minimizes to system tray

Requires:
-R9 290/290X
-Microsoft .NET 4.0
-MSI Afterburner beta 16 or newer

Screenshot:


I've tested it myself (I'm running a Gigabyte R9 290) and it just works, but *I will not take, of course, any blame if your Windows crashes, your GPU gets fried, etc*. Enabling LLC adds +0.1v straight away, so you can get an idea of what you are putting your GPU through.

These are the AB settings I tested the application with:


Both files included in the .zip file need to be in the same location. If you have UAC enabled, you may need to run it as administrator.

CHANGELOG:
beta3:

support for multi-GPU setups
beta2:

fixed bug that prevented setting the correct path to MSIAfterburner.exe
ISSUES:

a user reported black screens. I have only been able to reproduce this by setting the tool to apply AB profiles with a high memory overclock.
Please report any bugs to me in a private message.

****USE AT YOUR OWN RISK****

ABAutoProfileVoltMod_beta3.zip 315k .zip file

****USE AT YOUR OWN RISK****
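
For anyone curious how the per-process part of a tool like this can work, here's a rough Python sketch of the idea: poll the process list and fire an Afterburner profile when a watched game appears. The install path, the `-Profile` switches, and the process names are all assumptions for illustration - r0l4n's actual implementation may differ entirely:

```python
import subprocess
import time

# Hypothetical path, switches, and process names for illustration only --
# check your own install and Afterburner's documented command-line options.
AB_EXE = r"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe"
WATCHED = {"bf4.exe", "valley.exe"}  # processes that should get the OC profile

def image_names(tasklist_output):
    """Parse `tasklist` output down to a set of lowercase image names."""
    return {line.split()[0].lower()
            for line in tasklist_output.splitlines()
            if line.strip() and not line.startswith(("Image", "="))}

def watch(poll_s=5):
    applied = False
    while True:
        out = subprocess.check_output(["tasklist"], text=True)
        active = bool(WATCHED & image_names(out))
        if active and not applied:
            subprocess.call([AB_EXE, "-Profile1"])  # apply the OC profile
            applied = True
        elif not active and applied:
            subprocess.call([AB_EXE, "-Profile2"])  # back to the idle profile
            applied = False
        time.sleep(poll_s)
```

The real tool also toggles LLC and the VDDC offset via the AB console commands, which a sketch like this deliberately leaves out.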


----------



## stilllogicz

Wanna do tri-CrossFire 290s & IB-E _overvolted_ for max overclocks on water. Don't wanna add a 2nd PSU; I would rather have 1 souped-up PSU that can do the job. I'm hoping a 1500W PSU will suffice with all the other system components (16-20 fans, 2 water pumps, and LED strips are the standouts). What should I look for in a PSU to tackle this? Any recommendations on a PSU that's reliable and up to the job? Thanks guys.
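
As a sanity check on whether 1500W is enough, a rough ballpark sum like this is the usual starting point. Every wattage below is a loose guess for heavily overclocked parts, not a measurement, so size the real purchase from load-test reviews:

```python
# Loose per-component draw estimates (assumptions, not measurements).
loads_w = {
    "three_overvolted_290s": 350 * 3,
    "ivy_e_cpu_overvolted": 250,
    "pumps_fans_leds": 120,   # 2 pumps, ~18 fans, LED strips
    "mobo_ram_drives": 80,
}
total = sum(loads_w.values())
psu_needed = total / 0.80     # keep the PSU at or below ~80% load
print(total, round(psu_needed))
```

By this kind of estimate a 1500W unit would be running near its limit, which is why strong 12V rails and proven load-test results matter more here than the label wattage.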


----------



## Gilgam3sh

I tried with V-Sync OFF but got the same crappy score, hehe. I don't care really; the most important thing is that games run fine and I have a stable 1.2GHz core...







But it would be good to know why I get these bad scores.


----------



## Arizonian

I really thought MSI Afterburner would support voltage control for the 290X & 290 cards by now. They are still on version 2.3.1 (2013/1/23). These cards still haven't even seen their potential; give us some control.


----------



## jerrolds

Unwinder from guru3d said he submitted Afterburner Beta 17 (the one with 290(X) voltage controls) a few days ago - so it should be "soon"

http://forums.guru3d.com/showthread.php?t=382760&page=8


----------



## psyside

Quote:


> Originally Posted by *Gilgam3sh*
> 
> yes I will try that as I run with V-Sync ON....
> ok, fired up BF4 and played for a couple of minutes just too see if I got very bad FPS but the game was smooth as butter, no problem at all..


Uninstall Valley, clean the registry with CCleaner, restart, reinstall Valley, max out the fans via CCC, don't open GPU-Z/AB, run Valley in full screen at stock, and report back.


----------



## Skylark71

Hi,
joining the club, my Asus R9 290, Elpida ram and no black screens so far











-Mika-


----------



## Arizonian

Quote:


> Originally Posted by *Skylark71*
> 
> Hi,
> joining the club, my Asus R9 290, Elpida ram and no black screens so far
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> -Mika-


Congrats - added.









You also make the *100th member* to join in the 21 days since the 290X release, and yours is the 130th GPU on the list.


----------



## Kriant

My EK blocks are coming tomorrow. Hyped!


----------



## Jpmboy

Quote:


> Originally Posted by *Arizonian*
> 
> I really thought by now MSI Afterburner would support voltage control / 290X & 290 cards. They are still on Version 2.3.1 (2013/1/23). These cards still haven't even seen their potential and give us some control.


*This* ^^^^


----------



## Skylark71

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You also make the *100th member* to join in 21 days since 290X release and the 130th GPU on the list purchased.


Awesome, what did I win?


----------



## VSG

Quote:


> Originally Posted by *Kriant*
> 
> My EK blocks are coming tomorrow. Hyped!


Mine too, but my FrozenQ reservoir is still MIA after almost a month since ordering.


----------



## Sainth

Crossfire is working like shiet; I get better performance when I'm running with 1 card :S


----------



## Arizonian

Quote:


> Originally Posted by *Skylark71*
> 
> Awesome , what did i win


Nothing - you're a club statistic.


----------



## Jpmboy

Quote:


> Originally Posted by *r0l4n*
> 
> Is it allowed to upload homebrewed utilities?
> 
> I've created a small windows application that wraps the VRM console commands towards MSI Afterburner (link and link).
> 
> Features:
> -enables LLC, or
> -applies a VDDC offset between 0v and +0.3v
> -loads profiles on load and/or on idle if desired
> -applies these *only* to those processes in an input list
> -stores settings to file so one doesn't have to enter them every time
> -minimizes to system tray
> 
> 
> 
> I've tested myself (I'm running a Gigabyte R9 290) and works very well, but I will not take, of course, any blame if your windows crashes, your GPU gets fried, etc. Enabling LLC adds +0.2v straight away, so you can get yourself an idea of what you are putting your GPU through.
> 
> I hope to get some feedback from the OP.


Feedback:

Nice! And thanks. I might give it a try...

Can you give more info? Did you have other OCN members beta it for you yet? When you say "enables LLC" do you mean dials down LLC (e.g., decreases vdroop)? 'Cause LLC is already enabled. Switching off LLC should not add mV to the base clock voltage, right? So how is it adding 200mV "straight away"? In fact, the OEM vdroop from 1.412 V on this GPU is only ~100 mV.

Am I missing something? Can you share the /wi4 commands this must issue? Otherwise, personally, I will not test it.

Oh, which AB version did you test with?


----------



## Forceman

Quote:


> Originally Posted by *Jpmboy*
> 
> When you say "enables LLC" do you mean dials down LLC (eg, decreases vdroop)? 'Cause "LLC" is already enabled. Switching off LLC should not add mV to base clock voltage, right? So how is it adding 200mV "straight away". In fact, the OEM vdroop from 1.412 mV on this gpu is only ~100 mV.


LLC doesn't cause Vdroop, LLC counters Vdroop. So enabling (or increasing) LLC is going to cause Vdroop to be less severe, and so raise the effective voltage. Unless LLC works the exact opposite on GPUs than it does on CPUs, that is.


----------



## r0l4n

Quote:


> Originally Posted by *Jpmboy*
> 
> Feedback:
> 
> Nice! And thanks. I might give it s try....
> 
> Can you give more info? Did you have other OCN members beta it for you yet? When you say "enables LLC" do you mean dials down LLC (eg, decreases vdroop)? 'Cause "LLC" is already enabled. Switching off LLC should not add mV to base clock voltage, right? So how is it adding 200mV "straight away". In fact, the OEM vdroop from 1.412 mV on this gpu is only ~100 mV.
> 
> Am i missing something? Can you share the /wi4 commands this must issue? Otherwise, personally, i will not test it.


Only Sammyboy83 from this forum has beta tested earlier versions of the application. He runs the exact same card as me, a Gigabyte R9 290.

I use exactly the commands I posted in the links (also check this).

Load Line Calibration adds volts under load to counter vdroop. The commands I use enable and disable LLC; when enabled, it's like having a boost voltage of +200mV. Dialing LLC down means increasing vdroop, and I'm not sure that's possible with these commands. Afaik it's only on or off: off being the default setting, on being +0.2v on load.

To make it clear, I have tested enabling LLC myself and the load voltage is a steady 1.352v running Heaven. Since I'm running the stock cooler, I don't let it run more than a few seconds, so I consider that feature the least tested. Some people on water should give it a try - if not with my app, then by running those commands I linked manually (and carefully).
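
To make the arithmetic in this exchange concrete, here's a toy model of the numbers being discussed. Droop and the LLC boost are treated as fixed offsets, which is a simplification, and both defaults are assumptions taken from the posts above, not measured behaviour:

```python
# Toy model: set voltage minus droop, plus a fixed LLC boost when enabled.
# The 0.10 V droop and 0.20 V boost figures come from this thread's posts.
def load_voltage(set_v, droop_v=0.10, llc_on=False, llc_boost_v=0.20):
    v = set_v - droop_v
    if llc_on:
        v += llc_boost_v
    return round(v, 3)

# e.g. a 1.35 V setpoint drooping under load, vs. LLC pushing it back up
print(load_voltage(1.35), load_voltage(1.35, llc_on=True))
```

If the commands really behave as an on/off +0.2v boost rather than reduced droop, that's the kind of jump the card sees under load, which is exactly the distinction being asked about above.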


----------



## EliteReplay

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You also make the *100th member* to join in 21 days since 290X release and the 130th GPU on the list purchased.


I went into the GTX 780 Ti club and there are only like 10 people.


----------



## $ilent

Still no MSI Afterburner update yet?


----------



## Jpmboy

Quote:


> Originally Posted by *Forceman*
> 
> LLC doesn't cause Vdroop, LLC counters Vdroop. So enabling (or increasing) LLC is going to cause Vdroop to be less severe, and so raise the effective voltage. Unless LLC works the exact opposite on GPUs than it does on CPUs, that is.


Enable, disable... really depends on the convention in the BIOS. So yes, increasing LLC will decrease vdroop on some mobos, for instance, and do the opposite on others.

The question is whether the program defeats vdroop and to what extent. For instance, on the GK110, the command /wi3,20,de,10 dials LLC to about zero. The instruction set for the NCP VRM was easy to get. It's not been easy for me to get the same from International Rectifier...


----------



## Snyderman34

My waterblock should be here tomorrow! Gonna see what the old H220 can handle by itself. If it can't hang, gonna look at a Phobya 200mm rad to add to it.

One question. Since flashing to the ASUS BIOS, I don't see my splash screen. I get what sounds like one long beep, 6 short beeps, then 2 long beeps. The screen stays black for a bit, but it always brings me to my login screen. Anyone else having issues getting into the BIOS after a BIOS flash? I can use the iGPU, but it's a bit of an inconvenience unhooking everything and removing the card when I want into the BIOS.


----------



## jerrolds

I can still boot into the BIOS after flashing to the Asus one. And this is with a dual-link connection to a 1440p monitor with no scalers. Sapphire 290X w/ an Asus Z77 mobo, I believe.


----------



## Jpmboy

Quote:


> Originally Posted by *r0l4n*
> 
> Only Sammyboy83 from this forum has beta tested earlier versions of the application. He runs the exact same card as me, a Gigabyte R9 290.
> 
> I use exactly the commands I posted in the links (also check this).
> 
> Load Line Calibration adds volts on load to prevent vdroop. The commands I use are enabling and disabling LLC, when enabled is like having a boost voltage of +200mV. Dialing down LLC means increasing vdroop, and I'm not sure it's possible with these commands. Afaik, it's only on or off, off being the default setting, on being +0.2v on load.
> 
> To make it clear, I have tested enabling LLC myself and the load *voltage is 1.352v steady running* Heaven. Since I'm running the stock cooler, I don't let it run more than a few seconds, so I consider that feature the least tested. Some people on water should give it a try, if not with my app, by running those commands I linked manually (and carefully).


Cool! That's great! Okay, I have to pull the SLI Titans and put the R9 290 back in. *Here's Alexey's specifics*:

"Yes, you can disable LLC by sending commands to IR3567B. LLC can be disabled by resetting bit 7 of register 38. To avoid calculations, you may use AND I2C command (/ai) instead of WRITE I2C (/wi) to reset some bit of some register. OR I2C (/oi) command can be used to set it back if needed.

The following commands disable bit 7 of register 38:

/ai4,30,38,7f

It can be set back with:

/oi4,30,38,80

Alexey Nicolaychuk aka Unwinder, RivaTuner creator"
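Those two commands are plain bitmask operations on register 0x38 of the IR3567B; a quick sketch of the arithmetic they perform (illustrative only — this does not talk to the VRM, and the starting register value is made up):

```python
# Bit arithmetic behind the two I2C commands quoted above:
#   /ai4,30,38,7f  -> AND register 0x38 with 0x7f (clears bit 7, disables LLC)
#   /oi4,30,38,80  -> OR  register 0x38 with 0x80 (sets bit 7, enables LLC)

def disable_llc(reg38: int) -> int:
    """Clear bit 7, as the /ai (AND I2C) command does."""
    return reg38 & 0x7F

def enable_llc(reg38: int) -> int:
    """Set bit 7, as the /oi (OR I2C) command does."""
    return reg38 | 0x80

value = 0xB5                       # hypothetical current register contents
assert disable_llc(value) == 0x35  # bit 7 cleared, other bits untouched
assert enable_llc(0x35) == 0xB5    # bit 7 restored
```

The nice property of the AND/OR form, as Unwinder notes, is that you don't need to know the rest of the register's contents: only bit 7 changes either way.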


----------



## Mr357

Quote:


> Originally Posted by *$ilent*
> 
> still no msi afterburner update yet?


Nope


----------



## famich

Quote:


> Originally Posted by *Jpmboy*
> 
> Hi, guys, these are my lines approved by the Great Master himself
> 
> 
> 
> 
> 
> 
> 
> I have to try them myself though; I simply had no time these days.
> BTW, I managed to run Heaven @ 1260MHz on the ASUS BIOS - not bad


----------



## Jpmboy

Got your quotes screwed up...


----------



## HughhHoney

I've got a really random issue. I've been testing out the pt1 bios and I've been getting a lot of black screens if I go above about 1.4v in gpu tweak. I primarily game on a Dell U2711 at 2560x1440. Today I randomly tried to play a little on my secondary screen, which is a dell s2440l at 1080p.

I was able to raise the voltage up to 1.45 in a short amount of testing and raise my overclock by at least 40mhz with no black screens or artifacting. Actually when I left the U2711 connected as a non-primary screen it still flickered and eventually turned off completely while I was playing bf4 just fine on the smaller screen.

I thought I was bumping up against the max voltage that my 290x could take (about 1.28-1.3v in GPU-Z) and causing black screens, but now I'm starting to think there's something wrong with either my monitor or my card's output hardware. I've had the monitor for about 5 months and it's never given me a problem before. I've tried the U2711 over both DisplayPort and DVI with the same results.

Anyone had any sort of similar issue with one display and not another?


----------



## famich

Quote:


> Originally Posted by *Jpmboy*
> 
> Got your quotes screwed up...










Yes, but you know what I mean!


----------



## bpmcleod

Have there been any driver updates past 9.2 beta?


----------



## Jpmboy

Quote:


> Originally Posted by *HughhHoney*
> 
> I've got a really random issue. I've been testing out the pt1 bios and I've been getting a lot of black screens if I go above about 1.4v in gpu tweak. I primarily game on a Dell U2711 at 2560x1440. Today I randomly tried to play a little on my secondary screen, which is a dell s2440l at 1080p.
> 
> I was able to raise the voltage up to 1.45 in a short amount of testing and raise my overclock by at least 40mhz with no black screens or artifacting. Actually when I left the U2711 connected as a non-primary screen it still flickered and eventually turned off completely while I was playing bf4 just fine on the smaller screen.
> 
> I thought I was bumping up against the max voltage that my 290x could take (about 1.28-1.3v in GPU-Z) and causing black screens, but now I'm starting to think there's something wrong with either my monitor or my card's output hardware. I've had the monitor for about 5 months and it's never given me a problem before. I've tried the U2711 over both DisplayPort and DVI with the same results.
> 
> Anyone had any sort of similar issue with one display and not another?


It's not unusual for different resolutions to affect OC limits/stability (sorry, it's actually usual). If you back down on clocks with the 1440p monitor, does it continue to black screen?


----------



## Jpmboy

Quote:


> Originally Posted by *famich*
> 
> 
> 
> 
> 
> 
> 
> 
> Yes , but you know what I mean !


Right you are. So for a two-card system, should we use the /sg# qualifier?


----------



## Slayem

I am looking into the 290 but I am worried about the noise and heat output. What do you guys think? Is the 290 as loud as I've heard? I may just wait for the cards with better coolers. Any ETA on those?


----------



## Sainth

I'm seriously considering returning my cards and going to buy a 780 Ti. I'm getting tired of AMD's bull**** customer service!


----------



## esqueue

Quote:


> Originally Posted by *alancsalt*
> 
> But aluminium is a better radiator of heat, so taking both properties together, conduct and radiate, it pretty much balances out....


Copper would also radiate heat better, but it isn't durable enough to make a properly designed heatsink. The only reason for aluminum is strength and cost, in the case of all-aluminum heatsinks. They use copper as a base due to its heat transfer properties and aluminum due to its strength, which allows thinner fins and more surface area. Notice how those eBay copper heatsinks have a horrible fin design? The fins are too short and too fat compared to some of the aluminum heatsinks, so they may not disperse heat as fast as an aluminum heatsink with taller, thinner fins.

That's why the automotive industry went from copper to aluminum. They can make thinner water tubes that will remain flat under pressure, offering more water-to-air area. The copper tubes end up getting round under pressure.

Just thought that I'd throw this out there for anyone wondering about the pros and cons of each material.


----------



## DampMonkey

Quote:


> Originally Posted by *HughhHoney*
> 
> I've got a really random issue. I've been testing out the pt1 bios and I've been getting a lot of black screens if I go above about 1.4v in gpu tweak. I primarily game on a Dell U2711 at 2560x1440. Today I randomly tried to play a little on my secondary screen, which is a dell s2440l at 1080p.
> 
> I was able to raise the voltage up to 1.45 in a short amount of testing and raise my overclock by at least 40mhz with no black screens or artifacting. Actually when I left the U2711 connected as a non-primary screen it still flickered and eventually turned off completely while I was playing bf4 just fine on the smaller screen.
> 
> I thought I was bumping up against the max voltage that my 290x could take (about 1.28-1.3v in GPU-Z) and causing black screens, but now I'm starting to think there's something wrong with either my monitor or my card's output hardware. I've had the monitor for about 5 months and it's never given me a problem before. I've tried the U2711 over both DisplayPort and DVI with the same results.
> 
> Anyone had any sort of similar issue with one display and not another?


My 1440p monitor does not like the PT3 or PT1 BIOS; I experience issues similar to yours. The Asus BIOS works fine though, until I hit the voltage wall.


----------



## Technewbie

I was playing BF4 for about 40 min to 1 hr, and when I backed out to check my temps, VRM temperature 2 had maxed at 141C. Is that normal? Because people have been saying anything over 100C is uncomfortable to them... well, this is well above that lol


----------



## DampMonkey

Quote:


> Originally Posted by *Sainth*
> 
> I'm seriously considering returning my cards and going to buy a 780 Ti. I'm getting tired of AMD's bull**** customer service!


Tell us your troubles


----------



## $ilent

I would have thought 140C VRM would be instant death for it.


----------



## HughhHoney

Quote:


> Originally Posted by *Jpmboy*
> 
> It's not unusual for different resolutions to affect OC limits/stability (sorry, it's actually usual). If you back down on clocks with the 1440p monitor, does it continue to black screen?


It seems like voltage is the problem more than clocks. But yes, I can play just fine at about 1.4v in GPU Tweak or less, but it will instantly black screen above that. If I push clocks higher I get artifacts, but I can recover and reset everything, which makes sense.

I've never had a situation like this before with any other card but it makes sense that resolution may be a factor.

PS my card is under water so temp isn't a factor.


----------



## Technewbie

Quote:


> Originally Posted by *$ilent*
> 
> I would have thought 140C VRM would be instant death for it.


Me too. Could it just be a GPU-Z error? Because VRM temp 1 maxed at 67C, and the VRM 2 temp is usually lower than the first one.


----------



## DampMonkey

Quote:


> Originally Posted by *Technewbie*
> 
> Me too, Could it just be that it was a GPUZ error? Because the VRM temp 1 maxed at 67C and the vrm 2 temp is usually lower than the first one.


It wouldn't be the first bug like that I've seen from GPU-Z. I wouldn't worry about it unless it's happening consistently. The VRMs on the reference heatsink/fan are actually cooled fairly well.
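One way to tell a one-off reading like that from a real trend is to log sensors in GPU-Z and scan the log afterwards; here is a minimal sketch (the column name and log format are assumptions — check the header of your own log file):

```python
import csv

# Flag VRM-temp spikes in a GPU-Z sensor log. An isolated spike amid normal
# readings is more likely a sensor/reporting glitch than a real thermal event.
# The column name below is an assumption; check your own log's header row.

def spike_count(temps, threshold=110.0):
    """Count readings above the threshold; a single one is likely a glitch."""
    return sum(1 for t in temps if t > threshold)

def read_temps(path, column="VRM Temperature #2 [C]"):
    """Read one temperature column from a CSV-style GPU-Z log (assumed format)."""
    with open(path, newline="") as f:
        return [float(row[column]) for row in csv.DictReader(f)
                if row[column].strip()]

# Example with in-memory data instead of a real log file:
temps = [62.0, 65.5, 141.0, 64.0, 63.5]   # one 141 C outlier
assert spike_count(temps) == 1            # single spike -> probably a glitch
```

If the spike count grows run after run, it's worth worrying about; a lone outlier next to 60-ish readings, as Technewbie saw, points at the sensor.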


----------



## Sainth

Quote:


> Originally Posted by *DampMonkey*
> 
> Tell us your troubles


I have been trying to get this broken Xfire to work for 2 days now, and I'm getting so frustrated. I've been speaking to AMD support, and they didn't tell me anything!
If AMD doesn't confirm a future fix for this problem I'm returning my cards!


----------



## DampMonkey

Quote:


> Originally Posted by *Sainth*
> 
> I have been trying to get this broken Xfire to work for 2 days now, and I'm getting so frustrated. I've been speaking to AMD support, and they didn't tell me anything!
> If AMD doesn't confirm a future fix for this problem I'm returning my cards!


....is crossfire enabled in CCC?


----------



## HughhHoney

Thanks... Maybe it's a problem with those bioses.

I'll test the Asus bios a bit more, but I think in my time using it I was in the same boat. I would crash if I set clocks too high, but never black screen.

The PT3 bios causes instant black screens on my 1440p as well, but I haven't tested it on another monitor.


----------



## sugarhell

Quote:


> Originally Posted by *Sainth*
> 
> I have been trying to get this broken Xfire to work for 2 days now, and I'm getting so frustrated. I've been speaking to AMD support, and they didn't tell me anything!
> If AMD doesn't confirm a future fix for this problem I'm returning my cards!


You know, you need to question yourself first: am I doing something wrong? Do I know the basics?


----------



## Sainth

Quote:


> Originally Posted by *DampMonkey*
> 
> ....is crossfire enabled in CCC?


Quote:


> Originally Posted by *Sainth*
> 
> I have been trying to get this broken Xfire to work for 2 days now, and I'm getting so frustrated. I've been speaking to AMD support, and they didn't tell me anything!
> If AMD doesn't confirm a future fix for this problem I'm returning my cards!


yes!


----------



## DampMonkey

Quote:


> Originally Posted by *Sainth*
> 
> yes!


Does each card work individually? Are your power plugs correctly plugged into the cards/PSU? What benches/games are you testing? What drivers?


----------



## famich

Quote:


> Originally Posted by *Jpmboy*
> 
> Right, you are. So for two card system, should we use the /sg# qualifier?


Yep, at least Mr U confirmed that - I've got just one card, so I cannot try it myself...


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Sainth*
> 
> yes!


...You need to give more information.

I understand you're frustrated but saying "it doesn't work" then getting mad at AMD helps nobody.

As DampMonkey said, try the cards individually. Make sure you seat them correctly and the power plugs are plugged in.

If you have any remnants of old drivers, entirely uninstall them. That means manually going through your PC and registry (not registry cleaners, they will do more harm than good) and delete all the old driver files. You can find guides online.

What specifically isn't working? The cards, or crossfire? You haven't really provided any detail, the most we can do is guess.


----------



## Sainth

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> ...You need to give more information.
> 
> I understand you're frustrated but saying "it doesn't work" then getting mad at AMD helps nobody.
> 
> As DampMonkey said, try the cards individually. Make sure you seat them correctly and the power plugs are plugged in.
> 
> If you have any remnants of old drivers, entirely uninstall them. That means manually going through your PC and registry (not registry cleaners, they will do more harm than good) and delete all the old driver files. You can find guides online.
> 
> What specifically isn't working? The cards, or crossfire? You haven't really provided any detail, the most we can do is guess.


I have been testing them individually and they perform as they should; I've spent quite a bit of time removing the old drivers. As I mentioned, they work perfectly individually, but when I try them in crossfire they perform like shiet. Running one card is better; in 1920x1080 (120hz) I'm getting a maximum of 60fps in either ultra or high settings... it doesn't matter.


----------



## Technewbie

Quote:


> Originally Posted by *DampMonkey*
> 
> It wouldn't be the first bug like that I've seen from gpu-z. I wouldn't worry about it unless its happening consistently. The VRM's on the reference heatsink/fan are actually cooled fairly well


Yeah I just played again and it hit 61C max


----------



## jerrolds

Quote:


> Originally Posted by *Sainth*
> 
> I have been testing them individually and they perform as they should; I've spent quite a bit of time removing the old drivers. As I mentioned, they work perfectly individually, but when I try them in crossfire they perform like shiet. Running one card is better; in 1920x1080 (120hz) I'm getting a maximum of 60fps in either ultra or high settings... it doesn't matter.


What games are these? Some are way worse than others for CF - Bioshock Infinite and Tomb Raider are two games that scale and perform pretty well when it comes to microstutter.

Games like Far Cry 3 are terrible.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Sainth*
> 
> I have been testing them individually and they perform as they should; I've spent quite a bit of time removing the old drivers. As I mentioned, they work perfectly individually, but when I try them in crossfire they perform like shiet. Running one card is better; in 1920x1080 (120hz) I'm getting a maximum of 60fps in either ultra or high settings... it doesn't matter.


Ok, so they're working, just not correctly.

Open GPU-Z while playing a game or benching. Are the GPUs at 100% load? Are they throttling? Are the temperatures getting too high?

Have the task manager open to the Performance tab and check CPU usage.

I have the same mobo and CPU as you and I'm planning to (eventually...) go CF 290's so I hope it's not a motherboard problem.

Have you tried running only one card in the bottom slot? Check both PCI-e slots and the cards for any bent pins or damage.

Also, in GPU-Z, try checking the bandwidth speed. To do this, look at the Bus Interface box. It should say what speed you're running at; at idle it should be 1.1. To speed it up, hit the "?" button next to the box; it'll run a little test to put some load on the GPUs.


----------



## Kipsofthemud

Quote:


> Originally Posted by *jerrolds*
> 
> I can still boot into the BIOS after flashing to the Asus one. And this is with a dual-link connection to a 1440p monitor with no scalers. Sapphire 290X w/ an Asus Z77 mobo, I believe.


You don't know what mobo you have exactly?


----------



## dade_kash_xD

Quote:


> Originally Posted by *Sainth*
> 
> yes!


He and I are having the same issue with erratic GPU usage: huge spikes from 0%-100%, fluctuating up and down every second.



Here's a snapshot of what I am seeing and so is he.

Also, Sainth, I don't know if you ordered from Newegg, but Newegg has an exchange-only policy with the R9 290.


----------



## esqueue

What's considered a decent Valley score? I've heard many saying that it isn't liking our cards.

1140/1400mhz


----------



## jerrolds

Quote:


> Originally Posted by *Kipsofthemud*
> 
> You don't know what mobo you have exactly?


It's not the most exciting piece of hardware I own. All the fun stuff is connected to it. It's an Asus P8Z77-V Pro, if you must know.


----------



## CurrentlyPissed

Quote:


> Originally Posted by *dade_kash_xD*
> 
> He and I are having the same issue with erratic GPU usage: huge spikes from 0%-100%, fluctuating up and down every second.
> 
> 
> 
> Here's a snapshot of what I am seeing and so is he.
> 
> Also, Sainth, I don't know if you ordered from Newegg, but Newegg has an exchange-only policy with the R9 290.


I have this SAME issue. I thought it was normal, or the frame pacing.


----------



## Stay Puft

Quote:


> Originally Posted by *esqueue*
> 
> What's considered a decent valley score? Heard many saying that it isn't lining our cards?
> 
> 1140/1400mhz


Good score would be 80+fps


----------



## esqueue

Quote:


> Originally Posted by *Stay Puft*
> 
> Good score would be 80+fps


My best with Heaven is only 56.8. I got 63.3 on Valley, so I am confused about why people are only discussing Valley.
I'm guessing that it's not possible with one 290x card, right?


----------



## Stay Puft

Quote:


> Originally Posted by *esqueue*
> 
> I'm guessing that it's not possible with one 290x card right?


780 Classifieds hit almost 90fps, so it should be very possible, but for some reason the new cards suck in Valley.


----------



## selk22

Quote:


> Originally Posted by *dade_kash_xD*
> 
> He and I are having the same issue with erratic GPU usage: huge spikes from 0%-100%, fluctuating up and down every second.
> 
> 
> 
> Here's a snapshot of what I am seeing and so is he.
> 
> Also, Sainth, I don't know if you ordered from Newegg, but Newegg has an exchange-only policy with the R9 290.






This is AB not reporting correctly. I have the same problem, but GPU-Z reports it fine. Just wait for support, guys.


----------



## blue1512

Why do you guys use AB? The latest version still doesn't support the R9 290/290X. IMHO, letting CCC, AB and GPU-Z run simultaneously is a bad idea for our new cards.


----------



## Sgt Bilko

Quote:


> Originally Posted by *blue1512*
> 
> Why do you guys use AB, the latest version still doesn't support R9 290/290x. IMHO, let CCC, AB and GPUZ run simultaneously is a bad idea for our new cards.


Because CCC won't let you change the fan speed properly, whereas AB will.


----------



## Technewbie

Quote:


> Originally Posted by *esqueue*
> 
> My best with Heaven is only 56.8. I got 63.3 on valley so I am confused on why people are only discussing valley?
> I'm guessing that it's not possible with one 290x card right?



Here's my best Valley score; I'm going to try to push my memory clock higher to get a better score.


----------



## bond32

Well, I played about 20 min of BF4 and I am pleased with my new Sapphire R9 290X. I manually set the fan to cap at 60%; in that time with BF4 at 1440p, all maxed ultra settings, 4x MSAA, 96 Hz, the fan hovered around 45% and the core was around 92 C. Max was 94 C. Provided I have pretty good airflow and cooling towards it, I suppose those are good results. I am going to check the framerates from FRAPS later.

Yeah the cooler is loud, but it's nothing I didn't expect. I had 3 reference 7950's at one time, the cooler is almost identical. Pics of my setup soon to come.


----------



## esqueue

Quote:


> Originally Posted by *Technewbie*
> 
> 
> Here's my best Valley score, I'm going to try to push my memory clock higher to get a better score.


Nice, I'm waiting till there is another option to mess with voltage besides flashing. Should be here soon.


----------



## bond32

Few pics of the setup; just sitting tight till I can get my hands on a good gpu block. Anyone heard anything from Swiftech or XSPC?


----------



## stl drifter

Has anybody benched these under water yet?


----------



## DampMonkey

Quote:


> Originally Posted by *Technewbie*
> 
> 
> Here's my best Valley score, I'm going to try to push my memory clock higher to get a better score.


Most people bench Valley on the Extreme HD setting; you did this bench at Extreme, which is only 1600x900 rather than 1920x1080.


----------



## Kelwing

Ok, this thing worked all of one evening. I come home, turn on the PC, and everything is fine up till the Windows login screen. BAM, black screen.


----------



## DampMonkey

Quote:


> Originally Posted by *Kelwing*
> 
> Ok this thing worked all of one evening. Come home turn on pc and everything is fine up till Windows login screen. BAM black screen


what bios and what monitor?


----------



## esqueue

Quote:


> Originally Posted by *DampMonkey*
> 
> Most people bench valley on the Extreme HD setting, you did this bench at Extreme which is only 1600x900 rather than 1920x1080


Didn't catch that. I bench full screen at 1920x1080. Do people do full screen or windowed?


----------



## brazilianloser

Quote:


> Originally Posted by *Kelwing*
> 
> Ok this thing worked all of one evening. Come home turn on pc and everything is fine up till Windows login screen. BAM black screen


Welcome to the Black Screen Club


----------



## DampMonkey

Quote:


> Originally Posted by *esqueue*
> 
> Didn't catch that. I bench on full screen 1900X1080 full screen. Do people do full screen or window?


fullscreen generally


----------



## bond32

The "extreme HD" setting should have full screen box checked.


----------



## looncraz

Quote:


> Originally Posted by *esqueue*
> 
> My best with Heaven is only 56.8. I got 63.3 on valley so I am confused on why people are only discussing valley?
> I'm guessing that it's not possible with one 290x card right?


You're testing with the "Extreme HD" preset and comparing to others using the "Extreme" preset.

You are performing at about double a 7870 XT (Tahiti LE / 7930)... which I find astounding at 1080p.

--The loon


----------



## Kelwing

Quote:


> Originally Posted by *DampMonkey*
> 
> what bios and what monitor?


Bios - not sure. Whatever MSI had on it stock.
Monitor - Samsung Syncmaster SA350, using HDMI cable to card.

Currently managed to get Windows Recovery screen to show.

Quote:


> Originally Posted by *brazilianloser*
> 
> Welcome to the Black Screen Club


Ok I want out of this club


----------



## bond32

It is interesting how low valley scores are... Here's mine. 

All stock. Stock 780 I had scored in the 2600-2700 range...


----------



## DampMonkey

Quote:


> Originally Posted by *Kelwing*
> 
> Bios - not sure. Whatever MSI had on it stock.
> Monitor - Samsung Syncmaster SA350, using HDMI cable to card.


But you were overclocked, right? Something might have gone wrong if you have an overclocking program that automatically sets clocks when Windows loads the desktop.


----------



## Kelwing

Quote:


> Originally Posted by *DampMonkey*
> 
> But you were overclocked, right? Something might have gone wrong if you have an overclocking program automatically sets clocks when windows loads the desktop


I was messing around with it last night when running Valley. I thought I set it back when it started acting up. Finally got into safemode. So will fix whatever I screwed up.

And it looks like I had Afterburner set to adjust on startup.









EDIT - Removed Afterburner, and it did not work. Still black screened at login.


----------



## DampMonkey

Quote:


> Originally Posted by *Kelwing*
> 
> I was messing around with it last night when running Valley. I thought I set it back when it started acting up. Finally got into safemode. So will fix whatever I screwed up.
> 
> And looks like I had Afterburner set to adjust on startup
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT - Removed Afterburner and did not work. Still black screened at login.


Check your AMD Overdrive settings too, in CCC


----------



## esqueue

Quote:


> Originally Posted by *Kelwing*
> 
> I was messing around with it last night when running Valley. I thought I set it back when it started acting up. Finally got into safemode. So will fix whatever I screwed up.
> 
> And looks like I had Afterburner set to adjust on startup
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT - Removed Afterburner and did not work. Still black screened at login.


Wasn't there a way to boot into safe mode and set it back to default? Do the settings reset to default upon uninstalling the program?


----------



## jerrolds

Quote:


> Originally Posted by *DampMonkey*
> 
> But you were overclocked, right? Something might have gone wrong if you have an overclocking program automatically sets clocks when windows loads the desktop


Good call - this can happen with AB or GPU Tweak or any OCing program, really.

Also, there were issues with GPU Tweak and Windows 8.1 when it first came out... disabling the GPU Tweak service let users get back into Windows. But I believe the last 3 releases fixed this.


----------



## selk22

Quote:


> Originally Posted by *bond32*
> 
> Few pics of the setup; just sitting tight till I can get my hands on a good gpu block. Anyone heard anything from Swiftech or XSPC?


I know that Swiftech has a block in the works confirmed by Bryan from Swiftech.

Post #4
http://forums.swiftech.org/viewtopic.php?f=20&t=2343
Quote:


> We'll have our Komodo model for this card out shortly.
> 
> Regards,


----------



## evensen007

I'll join the club!

2x Xfx 290's with Nickel/Plexi waterblocks. I had a HE$% of a time getting the new drivers to recognize the new cards after removing my 7970. I've never really had that issue before! Haven't started overclocking yet, but will soon.


----------



## ImJJames

Quote:


> Originally Posted by *evensen007*
> 
> I'll join the club!
> 
> 2x Xfx 290's with Nickel/Plexi waterblocks. I had a HE$% of a time getting the new drivers to recognize the new cards after removing my 7970. I've never really had that issue before! Haven't started overclocking yet, but will soon.


oooo KILLEM!


----------



## Kelwing

Quote:


> Originally Posted by *DampMonkey*
> 
> Check your AMD Overdrive settings too, in CCC


Hmm, CCC is telling me no graphics driver is installed. Going to be one of those evenings, I see. Good thing I have beer handy.


----------



## Stay Puft

Quote:


> Originally Posted by *bond32*
> 
> It is interesting how low valley scores are... Here's mine.
> 
> All stock. Stock 780 I had scored in the 2600-2700 range...


What's your tessellation setting in CCC? "Use Application Settings"? Select "AMD Optimized" instead and rerun if you don't mind.


----------



## bond32

It was set for "AMD optimized" when I did that run


----------



## selk22

What yields the best results? AMD Optimized or Application Settings?

EDIT

Just ran this myself and found that in Valley there is a .1 FPS difference between AMD Optimized and app settings for tessellation..

1100/1400 290x
65.5 - AMD Optimized
65.4 - App Settings

This could be anything really. I don't think Valley is affected by this.


----------



## Scorpion49

Anyone running 290 or 290X crossfire with an FX 8350 or similar chip? I've got the urge to try one again and I'm considering returning my RIVE (giving me issues already) for a 990FX board and AMD chip.


----------



## Jpmboy

Quote:


> Originally Posted by *stl drifter*
> 
> Have anybody benched these under water yet ?


yes. look on this thread first page:

http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0

best I could do was 72.6 (86th place!)

it's the drivers... typical AMD.


----------



## selk22

Well my last post got me wondering how much FPS I could push out of the same OC but tweaking the CCC..

So I followed the AMD Valley Tweak guide posted here
http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0

And ended up nearly breaking 70fps!
Scored 68.7, my highest Valley score yet on air.

I will try to keep pushing before I post again to the GK110 vs Hawaii thread.


----------



## Stay Puft

Quote:


> Originally Posted by *Scorpion49*
> 
> Anyone running 290 or 290X crossfire with an FX 8350 or similar chip? I've got the urge to try one again and I'm considering returning my RIVE (giving me issues already) for a 990FX board and AMD chip.


You really want to pair a 500-dollar GPU with a 170-dollar processor?


----------



## Ricey20

So far, since disabling AMD Overdrive in CCC I haven't had the weird black-screening or checkerboard crashes. I still have Xfire enabled; I just use MSI AB for the fan profile now, and so far everything has been as it should.


----------



## Scorpion49

Quote:


> Originally Posted by *Stay Puft*
> 
> You really want to pair a 500 dollar GPU with a 170 dollar processor?


Why not? I do this for fun, not for super serious business. I already have a 3770K and a 3930K in the house; I know what they do. I guess I could wait for the Rampage IV Black too. Just want something new to fiddle with.


----------



## stilllogicz

Quote:


> Originally Posted by *Jpmboy*
> 
> probably not if you overclock [yea - "overvolt"] the kit. I'd recommend 2 PSUs of at least 1000W - and single rail. and "add2psu"


Interesting, do you think a 1000W single-12V-rail PSU could power a tri-Xfire overvolted 290 setup? I'd keep my AX1200 for the CPU and all system components and connect the 1kW PSU via the add2psu adapter.


----------



## Blackroush

Now I'll try this. Hopefully this card won't give me a black screen like its brother (the 290X) did. Wish me luck.


----------



## Kelwing

Woot!

Removed all drivers and CCC. Reinstalled and back to normal. Think I need another beer for that


----------



## Jpmboy

Quote:


> Originally Posted by *stilllogicz*
> 
> Interesting, do you think a 1000w single 12v rail PSU could power a Tri Xfire 290 overvolted GPU setup? I'd keep my AX1200 for the CPU and all system components and connect the 1Kw PSU via the add2psu adapter.


I think you should use the 1200W for the 3 cards if you plan to Oc/OV them to any high level.


----------



## Arizonian

Quote:


> Originally Posted by *Blackroush*
> 
> 
> 
> Now I'll try this. Hopefully this card doesn't give me a black screen like its brother (the 290X) did. Wish me luck.


Updated to MSI -


----------



## stilllogicz

Quote:


> Originally Posted by *Jpmboy*
> 
> I think you should use the 1200W for the 3 cards if you plan to Oc/OV them to any high level.


After a double check it turns out my AX1200 is single rail (http://www.newegg.com/Product/Product.aspx?Item=N82E16817139014&Tpk=AX1200). So I can dedicate that PSU solely to the GPUs and buy a 1kW PSU for my CPU and all other system components. This would work and allow me to reach my goals of max overvolting and OCing under water, correct?

[EDIT]

Are there any programs out there that monitor how many watts each PSU is outputting? It would be interesting to see the actual draw myself.


----------



## brazilianloser

Quote:


> Originally Posted by *stilllogicz*
> 
> After a double check it turns out my AX1200 is single rail (http://www.newegg.com/Product/Product.aspx?Item=N82E16817139014&Tpk=AX1200). So I can dedicate that PSU to solely the GPU's and buy a 1k watt PSU for my CPU & all other system components. This would work and allow me to reach my goals of max overvolting & OCing under water correct?


Hmm, 1000W just for the rest of the PC seems overkill though. You would have to be doing some way-over-the-top OC and have a good two dozen pieces of hardware sucking on the poor thing.


----------



## Jpmboy

Quote:


> Originally Posted by *stilllogicz*
> 
> After a double check it turns out my AX1200 is single rail (http://www.newegg.com/Product/Product.aspx?Item=N82E16817139014&Tpk=AX1200). So I can dedicate that PSU to solely the GPU's and buy a 1k watt PSU for my CPU & all other system components. This would work and allow me to reach my goals of max overvolting & OCing under water correct?
> 
> [EDIT]
> 
> Are there any programs out there that monitor how many watts each PSU is outputting? It would be interesting to see the actual draw myself.


yeah - you probably don't need 1000W for everything but the gfx cards; 850 or so is plenty. You ideally want to run the PSUs at or below 60% of rated output. (Can't say I adhere to that all the time tho...







)

I collected this data a while ago with SLI Titans:

I've been struggling with 3DMark11 hanging at 5.0 with 1.3V into SLI Titans - actually will 101! It does not hang at 4.9 and 1.3V, and will do a P95 5-min FFT with 12288MB of RAM committed for as long as I'm willing to watch, temps in the high 70s/low 80s (82C max). Okay - so this may be the answer... hooked up a Kill A Watt meter to the PSU with the park-bench rig.

Killawatt power measurements (at the PSU plug)
3930K @5.0(1.523V)
2xTitans SLI (svl7v3 bios, softvoltmod, LLC=0)
Bios = 220W
Boot = 500W (?)
Idle = ~ 160-170 watts to the rig
Browser = ~300W
Super Pi = 340W
p95 (8G ram) = 600W (597+/-)
3Dmk11 @ 875/3005 1.16V = 800-900W
3DMk11 @1215/3602 1.3V = 1190-1220W !!!
Valley @ 1215/3602 1.3V = 950-1050W (1080P ExHD)
Firestrike @ 1215/3602 @ 1.3V = 1050-1130W (default)

So.... my 1200W PCP&Cooling PSU is barely enough!!
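For anyone doing the same headroom math, the wall readings above and the 60% guideline can be combined in a quick sketch. Only the quoted wall wattages come from the post; the 0.90 efficiency figure is my assumption (a wall meter reads AC input, not DC output, so check your unit's 80 Plus rating):

```python
# Rough PSU headroom check from wall-meter (Kill A Watt) readings.
EFFICIENCY = 0.90  # assumed AC-to-DC conversion efficiency at this load

def dc_load_watts(wall_watts: float) -> float:
    """Estimate the DC output the PSU is actually delivering."""
    return wall_watts * EFFICIENCY

def utilization(wall_watts: float, psu_rating_watts: float) -> float:
    """Fraction of rated output in use; the guideline above says stay under ~0.6."""
    return dc_load_watts(wall_watts) / psu_rating_watts

for label, wall in [("Valley", 1050), ("Firestrike", 1130), ("3DMark11", 1220)]:
    print(f"{label}: ~{dc_load_watts(wall):.0f} W DC, "
          f"{utilization(wall, 1200):.0%} of a 1200 W unit")
```

By this estimate the 3DMark11 run sits around 92% of the 1200W rating, which matches the "barely enough" conclusion.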


----------



## stilllogicz

Quote:


> Originally Posted by *brazilianloser*
> 
> Hmm, 1000W just for the rest of the PC seems overkill though. You would have to be doing some way-over-the-top OC and have a good two dozen pieces of hardware sucking on the poor thing.


Yeah, I agree it's completely overkill. The remaining components will be a highly-OC'd 4930K, about 20 fans (16 to 20 radiator fans), 2 water pumps (most likely DDC, not 100% sure yet), random LED strips, a Blu-Ray rewriter, 4-6 HDDs, 1-2 SSDs... I think that's it? But hey, if all of that only draws 600W then I still have 400W more of headroom; you never know what the future has in store. Starting to appreciate that new Enthoo Primo I have sitting in my room more.


----------



## Jpmboy

Quote:


> Originally Posted by *Stay Puft*
> 
> You really want to pair a 500 dollar GPU with a 170 dollar processor?


erm... probably not the best move you could make performance-wise. What's wrong with the RIVE?


----------



## Stay Puft

Quote:


> Originally Posted by *stilllogicz*
> 
> After a double check it turns out my AX1200 is single rail (http://www.newegg.com/Product/Product.aspx?Item=N82E16817139014&Tpk=AX1200). So I can dedicate that PSU to solely the GPU's and buy a 1k watt PSU for my CPU & all other system components. This would work and allow me to reach my goals of max overvolting & OCing under water correct?
> 
> [EDIT]
> 
> Are there any programs out there that monitor how many watts each PSU is outputting? It would be interesting to see the actual draw myself.


The AX1200 may be rated for 1200W, but I've pushed mine past 1400W with no issues


----------



## rv8000

Was so excited to come home and slap some aftermarket cooling onto my 290 today; shredded the box open only to find FrozenCPU sent me the wrong cooler model. All invoices show I ordered the proper Gelid HSF. Somewhat peeved e-mail sent. Just have to wait that much longer to have more fun with my 290


----------



## Kriant

Quote:


> Originally Posted by *stilllogicz*
> 
> After a double check it turns out my AX1200 is single rail (http://www.newegg.com/Product/Product.aspx?Item=N82E16817139014&Tpk=AX1200). So I can dedicate that PSU to solely the GPU's and buy a 1k watt PSU for my CPU & all other system components. This would work and allow me to reach my goals of max overvolting & OCing under water correct?
> 
> [EDIT]
> 
> Are there any programs out there that monitor how many watts each PSU is outputting? It would be interesting to see the actual draw myself.


I ran 3x 7970 at 1200/1600 OC + an i7 3930K @ 4.6GHz OC + two pumps and 3 HDDs on my AX1200 (when I had one) - it worked like a beast, just fine.


----------



## stilllogicz

Quote:


> Originally Posted by *Kriant*
> 
> I ran 3x7970 with 1200/1600 OC + i7 3930k @ 4.6ghz OC + two pumps, 3 hdds on my AX1200 ( when I had one) - it worked like a beast just fine.


I hear ya Kriant. It seems the 290s and 290X draw insane amounts of electricity. I'd rather just get the dual-PSU setup since I was planning on buying a new one anyway. At least I can continue to use the AX1200. With 2200 watts of available power (1200W just for the GPUs), the system will only use what's necessary for whatever it's doing at the time. I just don't want to end up with NOT enough power and then have to take apart and mod the case after everything is in, you know what I'm sayin'?

I want to go all out with this build and juice everything to the absolute limits of sanity lol. Maybe I'm overthinking this though? Meh.


----------



## CallsignVega

Anyone figure out why the 290x is performing so poorly in Unigine benchmarks? Like the same speed as a 7970. I could see issue with crossfire profile or something, but single card should have no issues and they are putting up some poor numbers.


----------



## Technewbie

Is there anyway to tweak the memory voltage for the 290x?


----------



## selk22




Quote:


> Originally Posted by *CallsignVega*
> 
> Anyone figure out why the 290x is performing so poorly in Unigine benchmarks? Like the same speed as a 7970. I could see issue with crossfire profile or something, but single card should have no issues and they are putting up some poor numbers.






My best guess right now is mature drivers vs. these crappy ones we are on now. That's my hope/guess
Quote:


> Originally Posted by *Technewbie*
> 
> Is there anyway to tweak the memory voltage for the 290x?


Yeah, if your card isn't an Asus, flash it to the Asus BIOS and use GPU Tweak, or wait a week or so for AB to update


----------



## utnorris

Quote:


> Originally Posted by *Scorpion49*
> 
> Why not? I do this for fun, not for super serious business. I already have a 3770k and 3930k in the house, I know what they do. I guess I could wait for the Rampage IV Black too. Just want something new to fiddle with


I loved having a 990FX board and the 8350, just got bored with it and switched. I still have the Gigabyte UD7 990FX board, so I may try it again sometime down the road.


----------



## Jpmboy

Quote:


> Originally Posted by *CallsignVega*
> 
> Anyone figure out why the 290x is performing so poorly in Unigine benchmarks? Like the same speed as a 7970. I could see issue with crossfire profile or something, but single card should have no issues and they are putting up some poor numbers.


very poor numbers... gotta be the drivers.

Have you tried the LLC hack yet? I have rol4n's app, but I may have a different version of AB than he used. So just using the following command sequence posted here and by Unwinder, yup - it seems to switch LLC off (or whatever, leading to minimal vdroop):

Add a 100mV offset and defeat vdroop (shift-right-click in the AB folder to open an elevated command prompt):
msiafterburner /wi4,30,8d,10 /ai4,30,38,7f

return to stock:
msiafterburner /wi4,30,8d,0 /oi4,30,38,80

Just begun experimenting with this - use extreme caution. It bumped Valley slightly, 73.4 so far - still very poor.
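If you end up toggling this a lot, the two command sequences above can be wrapped in a tiny script. This is only a sketch: the Afterburner install path is my assumption, and the /wi4, /ai4, and /oi4 arguments are copied verbatim from the post, not verified against your AB version. The same warnings apply (elevated prompt, extreme caution, watch temps):

```python
# Hedged wrapper around the exact MSIAfterburner.exe command lines quoted above.
import subprocess

AB_EXE = r"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe"  # assumed path

LLC_ON = ["/wi4,30,8d,10", "/ai4,30,38,7f"]   # +100mV offset, defeats vdroop
LLC_OFF = ["/wi4,30,8d,0", "/oi4,30,38,80"]   # return to stock

def llc_cmd(enable: bool) -> list[str]:
    """Assemble the full command line; run it with subprocess.run(cmd, check=True)."""
    return [AB_EXE] + (LLC_ON if enable else LLC_OFF)

if __name__ == "__main__":
    subprocess.run(llc_cmd(True), check=True)    # extreme caution: watch temps
    input("Benchmark now; press Enter to revert to stock... ")
    subprocess.run(llc_cmd(False), check=True)
```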


----------



## VSG

Well AMD Gaming announced their new gaming reward (BF4) for all R9 owners henceforth, effectively dealing us all a low blow


----------



## sugarhell

TSM got 80 fps with a 290X at 1250/1500 and a 3820 at 4.5GHz.


----------



## DampMonkey

Quote:


> Originally Posted by *CallsignVega*
> 
> Anyone figure out why the 290x is performing so poorly in Unigine benchmarks? Like the same speed as a 7970. I could see issue with crossfire profile or something, but single card should have no issues and they are putting up some poor numbers.


Heaven runs fine, it's just Valley acting wack


----------



## Jpmboy

Quote:


> Originally Posted by *sugarhell*
> 
> Tsm got 80 fps with a 290x at 1250/1500 and a 3820 at 4.5 ghz.


don't see it posted in the OCN thread... too embarrassed to share the result?







then again, TSM always beats my score!

oh - right now 72 is the highest-placed R9 290X posted... not good at all. And 80 I can beat with a stock Titan on air.

I know this R9 290X can do better - hope it does, or I'll be selling.


----------



## sugarhell

Quote:


> Originally Posted by *Jpmboy*
> 
> don't see it posted in th thread... too embarrassed to share the result?


It's way back in the thread.


----------



## Blackroush

Guys, what's the 290's switch function? It's not like the 290X; I can't see any change to max fan speed in CCC when I switch it left and right. Please help, thanks


----------



## BigGuns73

Quote:


> Originally Posted by *rv8000*
> 
> Was so excited to come home and slap some aftermarket cooling onto my 290 today; shredded the box open only to find FrozenCPU sent me the wrong cooler model. All invoices show I ordered the proper Gelid HSF. Somewhat peeved e-mail sent. Just have to wait that much longer to have more fun with my 290


What did they send you? Sorry to hear that.


----------



## evensen007

Only had 10 minutes to play with the two 290s, so I haven't had a chance to tweak anything. They are BEASTS. Got 120fps in Valley ultra/1080p 4xAA without touching anything yet. Can't wait to up the volts and the clocks!


----------



## Jpmboy

Quote:


> Originally Posted by *sugarhell*
> 
> It's way back in the thread.


in the valley top 30:

http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0

geeze


----------



## sugarhell

Oh found it

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/1250#post_21071910


----------



## Forceman

Quote:


> Originally Posted by *Blackroush*
> 
> Guys, what's the 290's switch function? It's not like the 290X; I can't see any change to max fan speed in CCC when I switch it left and right. Please help, thanks


It just switches between two identical BIOSes, since there is no quiet or uber mode on the 290. Think of it as a fail-safe BIOS switch.


----------



## Arizonian

Quote:


> Originally Posted by *Jpmboy*
> 
> in the valley top 30:
> 
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
> 
> geeze


Hey nice #9 spot with crossfire result there.


----------



## Technewbie

Kinda sad this is the max I got with my 290x in valley :/


----------



## Jpmboy

Quote:


> Originally Posted by *sugarhell*
> 
> Oh found it
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/1250#post_21071910


only 3 entries for r290


----------



## Jpmboy

Quote:


> Originally Posted by *Technewbie*
> 
> 
> Kinda sad this is the max I got with my 290x in valley :/


I must have a dog card, or something is wrong - running 1265/1575 and scoring lower.


----------



## Raxus

so just an odd observation i made this evening.

Sapphire 290X, was black-screening all over the place most of the week. Uninstalled Afterburner, OC'd with CCC, and no black screens. May be coincidence.


----------



## Technewbie

Quote:


> Originally Posted by *selk22*
> 
> Yeah flash the Bios if your card is not Asus to Asus bios and use GPU tweak or wait a week or so for AB to update


I flashed it to the ASUS bios but that only gives core voltage, I was wondering about memory voltage.


----------



## skupples

Quote:


> Originally Posted by *selk22*
> 
> 
> My best guess right now is mature drivers vs. these crappy ones we are on now. That's my hope/guess
> Yeah, if your card isn't an Asus, flash it to the Asus BIOS and use GPU Tweak, or wait a week or so for AB to update


The asus bios allows memory voltage tweaking? Or JUST core voltage tweaking? The two should be different.


----------



## Raxus

Any word on a TRIXX update?


----------



## Blackroush

Quote:


> Originally Posted by *Forceman*
> 
> It just switches between two identical BIOSes, since there is no quiet or uber mode on the 290. Think of it as a fail-safe BIOS switch.


Thanks, +rep. I got a better score and lower temps when I turned it to the left; don't know why.

LEFT (Close to DVI Ports)


RIGHT (Away From Ports)


----------



## sugarhell

Guys, more information about Mantle. Sometimes it's 3 times faster than the DX11 version. They demoed the Nitrous engine with Mantle support. Insane

https://twitter.com/cavemanjim

Read his comments


----------



## Jpmboy

Hey guys - run these benches and post your results in the thread below:

http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0

http://www.overclock.net/t/872945/top-30-3d-mark-13-fire-strike-scores

http://www.overclock.net/t/1406832/single-gpu-firestrike-top-30

http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad

http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores

let's get some numbers up...


----------



## Maxxa

Quote:


> Originally Posted by *sugarhell*
> 
> Guys, more information about Mantle. Sometimes it's 3 times faster than the DX11 version. They demoed the Nitrous engine with Mantle support. Insane
> 
> https://twitter.com/cavemanjim
> 
> Read his comments


I need to see it for myself to believe it. It very well could be game changing but I want to see it for myself on my hardware...that I ordered...yesterday...a new HIS R9 290


----------



## Jpmboy

little better


----------



## Jpmboy

Quote:


> Originally Posted by *skupples*
> 
> The asus bios allows memory voltage tweaking? Or JUST core voltage tweaking? The two should be different.


only core as far as I can tell.


----------



## selk22

Quote:


> Originally Posted by *Technewbie*
> 
> I flashed it to the ASUS bios but that only gives core voltage, I was wondering about memory voltage.


I am sorry, I didn't read it correctly; I am tired


----------



## Sgt Bilko

Quote:


> Originally Posted by *sugarhell*
> 
> Guys, more information about Mantle. Sometimes it's 3 times faster than the DX11 version. They demoed the Nitrous engine with Mantle support. Insane
> 
> https://twitter.com/cavemanjim
> 
> Read his comments


8350 as fast as a 4770k?

I want to believe it but I think I need to see such a feat for myself.

Although Q1 release looks good.


----------



## utnorris

So I definitely think we need better drivers, especially for CF. I ran the Valley benchmark and at first it ran smooth as butter, but after a bit it started switching from 100% to 0% on both GPUs, alternating between the two for whatever reason. Anyway, here is my score.

3491


----------



## Jpmboy

Quote:


> Originally Posted by *Arizonian*
> 
> Hey nice #9 spot with crossfire result there.


thanks OP. Trying to hold up the r290x "mantle"









http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad


----------



## Tobiman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 8350 as fast as a 4770k?
> 
> I want to believe it but I think I need to see such a feat for myself.
> 
> Although Q1 release looks good.


I think the catch is that the CPU workload is greatly reduced to the extent where an 8350 is as good as a 4770k.


----------



## utnorris

I posted my scores for 3DMark11 over there which should easily put me in the top 30 for single and CF, but they are not "Official" due to the driver being beta.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Tobiman*
> 
> I think the catch is that the CPU workload is greatly reduced to the extent where an 8350 is as good as a 4770k.


Yeah... pretty impressive if that is the case.
So basically what I'm getting from this is that in gaming Intel and AMD will equal each other, but in benches Intel will still pull ahead?


----------



## evensen007

Quote:


> Originally Posted by *utnorris*
> 
> So i definitely think we need better drivers, especially for CF. I ran the Valley benchmark and at first it ran smooth as butter, but after a bit it started switching from 100% to 0% on both GPU's alternating between the two for whatever reason. Anyways, here is my score.
> 
> 3491


Mine seem to be using 100% on both with the 13.11 beta 9.2. I haven't tweaked anything on mine yet except a small bump in CCC.


----------



## skupples

Quote:


> Originally Posted by *sugarhell*
> 
> Guys, more information about Mantle. Sometimes it's 3 times faster than the DX11 version. They demoed the Nitrous engine with Mantle support. Insane
> 
> https://twitter.com/cavemanjim
> 
> Read his comments


Now this is good ish; I just want to see it, not hear it from a Twitter feed. And it may actually be relevant to more than one engine by AMD's 20nm... Point, match; your move NV. I say this because many of us rarely set foot in any of the Frostbite games. Now then, Star Citizen + Mantle? = Skupples going tri-20nm AMD GPU.

*Anyone know where the live stream footage is?*


----------



## Arizonian

Quote:


> Originally Posted by *Tobiman*
> 
> I think the catch is that the CPU workload is greatly reduced to the extent where an 8350 is as good as a 4770k.


That's a huge impact. Here I was thinking minimal gains. I wonder what level of performance gain my 3770K would get with Mantle-enabled gaming?

I'm not going to spin my wheels just yet either; I'll believe it when I see it.


----------



## CurrentlyPissed

Does anyone know if Sapphire would void the warranty if you remove the cooler, and use water blocks?


----------



## skupples

mantle part of APU.

two things.. "not specific to gcn" "possibly allow our competitors hard work to be supported by mantle"


----------



## Blackroush

I just got a black screen then a BSOD with my new R9 290 (non-X) on Beta 9 while playing BF4. Darn!


----------



## Tobiman

Quote:


> Originally Posted by *CurrentlyPissed*
> 
> Does anyone know if Sapphire would void the warranty if you remove the cooler, and use water blocks?


I'd do my best not to leave any clues showing that the stock cooler was removed. Also make sure you don't tell them that you removed it.


----------



## Jpmboy

Quote:


> Originally Posted by *utnorris*
> 
> I posted my scores for 3DMark11 over there which should easily put me in the top 30 for single and CF, but they are not "Official" due to the driver being beta.


No problem. Alancsalt will add them to the table. Unlike the Futuremark HOF, many of the scores in the OCN list are with invalid drivers.


----------



## Jpmboy

Quote:


> Originally Posted by *sugarhell*
> 
> Oh found it
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/1250#post_21071910


tweaked with those clocks... gotta be.


----------



## sugarhell

No tweaks.


----------



## rdr09

Arizonian, please update me from air to water. thanks.


----------



## Arizonian

Quote:


> Originally Posted by *CurrentlyPissed*
> 
> Does anyone know if Sapphire would void the warranty if you remove the cooler, and use water blocks?


I'm going to try to word this carefully... As long as they receive the GPU in the same condition they sent it to you, the warranty stands. If there's a sticker that was placed on there and it's removed when you send it back, then they will void it, as the removed sticker confirms you've invalidated your warranty.

Quote:


> Originally Posted by *rdr09*
> 
> Arizonian, please update me from air to water. thanks.


Updated


----------



## Falkentyne

Quote:


> Originally Posted by *Jpmboy*
> 
> very poor numbers... gotta be the drivers.
> 
> Have you tried the LLC hack yet? I have rol4n's app, but I may have a different version of AB than he used. So just using the following command sequence posted here and by Unwinder, yup - it seems to switch LLC off (or whatever, leading to minimal vdroop):
> 
> Add a 100mV offset and defeat vdroop (shift-right-click in the AB folder to open an elevated command prompt):
> msiafterburner /wi4,30,8d,10 /ai4,30,38,7f
> 
> return to stock:
> msiafterburner /wi4,30,8d,0 /oi4,30,38,80
> 
> Just begun experimenting with this - use extreme caution. It bumped Valley slightly, 73.4 so far - still very poor.


DO NOT RUN this LLC command ON ANY SORT OF AIR COOLING unless it's the HIGHEST END. Do NOT even THINK about trying this on the stock cooler, even WITHOUT changing the voltage.

Your card *WILL* reach 90C in less than 15 seconds, and the temperature will rise so fast it will jump past the throttle mark and black screen.

A poster said that without the default stock LLC, your card will be getting between 1.35-1.4 volts at "default" vcore...
Quote:


> Load Line Calibration adds volts on load to prevent vdroop. The commands I use are enabling and disabling LLC, when enabled is like having a boost voltage of +200mV. Dialing down LLC means increasing vdroop, and I'm not sure it's possible with these commands. Afaik, it's only on or off, off being the default setting, on being +0.2v on load.
> 
> To make it clear, I have tested enabling LLC myself and the load voltage is 1.352v steady running Heaven. Since I'm running the stock cooler, I don't let it run more than a few seconds, so I consider that feature the least tested. Some people on water should give it a try, if not with my app, by running those commands I linked manually (and carefully).
> 
> (Forgot by whom.)


----------



## subyman

Can someone get their VRM temps on the stock cooler in Furmark?


----------



## Forceman

Quote:


> Originally Posted by *subyman*
> 
> Can someone get their VRM temps on the stock cooler?


I had about 63C at 70% fan speed on my 290. That was on a relatively short test though, however long the demo version of 3DMark takes (10 minutes or so?)


----------



## BigGuns73

What's the max safe VRM temp on the 290s, anyway?


----------



## Mr357

Quote:


> Originally Posted by *BigGuns73*
> 
> Whats the max safe VRM temp on the 290's anyway?


Hopefully 95C, just like the die. To be safe I would try to stay below 75C.


----------



## Forceman

Quote:


> Originally Posted by *BigGuns73*
> 
> Whats the max safe VRM temp on the 290's anyway?


According to some reports they're rated to 125C. I wouldn't go above 100, though. I like 80 or so myself.


----------



## EliteReplay

BF4: 3770k + 290X Ultra 1080p FPS counter


----------



## BigGuns73

Quote:


> Originally Posted by *Forceman*
> 
> According to some reports they're rated to 125C. I wouldn't go above 100, though. I like 80 or so myself.


Thanks! Now does anyone know what mod it is that tomshardware says has to be done in order to fit an arctic accelero 3 cooler onto a r9 290?


----------



## subyman

I'm using a universal GPU waterblock right now, so my VRMs aren't actively cooled. They are hitting 90C after a run through Heaven. Looks like I'll have to wait until the full covers get back in stock to push the card with higher voltage.


----------



## tsm106

Quote:


> Originally Posted by *subyman*
> 
> I'm using a universal GPU waterblock right now, so my VRMs aren't actively cooled. They are hitting 90C after a run through Heaven. Looks like I'll have to wait until the full covers get back in stock to push the card with higher voltage.


Throw a fan on it.

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/640_40#post_21060304

In other news, finally got my blocks and stuff related to it. The block shortage was more severe than the card shortage lol.


----------



## MrStick89

Quote:


> Originally Posted by *EliteReplay*
> 
> BF4: 3770k + 290X Ultra 1080p FPS counter


Quote:


> Originally Posted by *EliteReplay*
> 
> BF4: 3770k + 290X Ultra 1080p FPS counter


Looks good, man. Do you have any Fraps benches of your min, max, and avg? I'm considering upgrading my FX-8350 to a 4770K but I'm not sure if I'll get much of a performance increase.

Stock 290X/[email protected], Ultra 1080p 4xMSAA, Windows 7, 64 players in server.

2013-11-05 18:32:32 - bf4
Frames: 32818 - Time: 360000ms - Avg: 91.161 - Min: 48 - Max: 142

2013-11-05 18:07:43 - bf4
Frames: 27826 - Time: 360000ms - Avg: 77.294 - Min: 46 - Max: 121

2013-11-05 19:09:20 - bf4
Frames: 24042 - Time: 360000ms - Avg: 66.783 - Min: 47 - Max: 105

2013-11-04 17:01:20 - bf4
Frames: 33182 - Time: 360000ms - Avg: 92.172 - Min: 49 - Max: 135


----------



## ImJJames

Quote:


> Originally Posted by *MrStick89*
> 
> Looks good man. Do you have any fraps benchs of your min max and avg? I'm considering upgrading my FX8350 to a 4770k but I'm not sure if I'll get much performance increase.
> 
> stock 290x/[email protected] Ultra 1080p 4xmsaa Windows 7 64players in server.
> 
> 2013-11-05 18:32:32 - bf4
> Frames: 32818 - Time: 360000ms - Avg: 91.161 - Min: 48 - Max: 142
> 
> 2013-11-05 18:07:43 - bf4
> Frames: 27826 - Time: 360000ms - Avg: 77.294 - Min: 46 - Max: 121
> 
> 2013-11-05 19:09:20 - bf4
> Frames: 24042 - Time: 360000ms - Avg: 66.783 - Min: 47 - Max: 105
> 
> 2013-11-04 17:01:20 - bf4
> Frames: 33182 - Time: 360000ms - Avg: 92.172 - Min: 49 - Max: 135


You won't get any performance increase on BF 4 switching to 4770k, as long as your 8350 is OC'ed.


----------



## rdr09

Quote:


> Originally Posted by *ImJJames*
> 
> You won't get any performance increase on BF 4 switching to 4770k, as long as your 8350 is OC'ed.
> 
> 
> Spoiler: Warning: Spoiler!


got one for MP?


----------



## ImJJames

Quote:


> Originally Posted by *rdr09*
> 
> got one for MP?


MP is impossible to benchmark; way too many variables to get a viable result.


----------



## R35ervoirFox

That's not true; multiple 300-second benchmarks on a 64-player server will give you a more consistent result than you would expect.

I would just like to add: benchmarks that aren't done on a 64-player server in BF4 are worthless.

I've run six 300-second passes with an unlocked 6950 running ultra textures and ultra texture filtering on 64-player Operation Locker, and I got:

Min 52, Max 122, Average 82

Which was the average of all six runs, but I just ordered a 290 sooooo I can get even higher fps


----------



## grandpatzer

I've read it seems like 1200-1300 core on the 290(X) is possible with water(?), which should be 45-50% more FPS vs. a 7970 GHz.

Are there many people who have watercooled their 290X / 290?
I remember reading that 1.25v is a "safe" voltage for the 7950/7970.
What is considered "safe" voltage for the 290/290X?
I have full watercooling (1080 radiator, CPU/GPU full-cover blocks)

Quote:


> Originally Posted by *geggeg*
> 
> I placed an order for a quarter sheet of Fujipoly extreme thermal pads (0.5 mm and 1 mm thick each) from FrozenCPU to come along with my EK blocks next week. These offer 4x the thermal conductivity of stock EK pads and are reasonably priced unlike the Fujipoly ultra extremes. For those having issues with VRM heating, consider better thermal pads.


That sounds like overkill to me; are the 290 full-cover waterblocks supposed to have much worse VRM cooling compared to the 79x0 blocks?
My 7950 has good VRM temperatures.


----------



## tsm106

Quote:


> Originally Posted by *Maxxa*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> Guys, more information about Mantle. Sometimes it's 3 times faster than the DX11 version. They demoed the Nitrous engine with Mantle support. Insane
> 
> https://twitter.com/cavemanjim
> 
> Read his comments
> 
> 
> 
> I need to see it for myself to believe it. It very well could be game changing but I want to see it for myself on my hardware...that I ordered...yesterday...a new HIS R9 290

Skepticism is normal, but if you think it's not real or not happening, that's just stubbornness, especially when you consider that the game developers asked for this, and they are all on board this choo-choo train.


----------



## Sgt Bilko

Quote:


> Originally Posted by *BigGuns73*
> 
> Thanks! Now does anyone know what mod it is that tomshardware says has to be done in order to fit an arctic accelero 3 cooler onto a r9 290?


No mod needed. You just need more VRAM heatsinks than it's supplied with, unless you want to mix and match a bit


----------



## Exostenza

Hey guys I don't know if this was posted here or not, but I just did some research and it seems that these are the aftermarket coolers that support the R9 290(X) cards and they seem to do a seriously amazing job on bringing the temps *WAY* down:

The coolers that officially support the R9 290(X) cards are the Accelero Extreme III, Accelero Hybrid, Gelid Icy Vision Rev.2 and the Prolimatech MK-26 and MK Black Series.


----------



## R35ervoirFox

Quote:


> Originally Posted by *Exostenza*
> 
> Hey guys I don't know if this was posted here or not, but I just did some research and it seems that these are the aftermarket coolers that support the R9 290(X) cards and they seem to do a seriously amazing job on bringing the temps *WAY* down:
> 
> The coolers that officially support the R9 290(X) cards are the Accelero Extreme III, Accelero Hybrid, Gelid Icy Vision Rev.2 and the Prolimatech MK-26 and MK Black Series.


Ye, I just ordered the Sapphire R9 290 and an Arctic Cooling Accelero Xtreme III for €374. Hopefully I'll get it over the weekend and report back


----------



## mastahg

I'm really interested in getting a 290 and doing an aftermarket cooling solution, but I'm really worried about the VRM temp aspect of doing something like the red mod. Does anyone have a recommendation for VRM cooling that is reversible in case I decide to resell the card?


----------



## Technewbie

Does anyone have a copy of the PT1 BIOS? I can't find it anywhere and the original post of it is deleted.


----------



## Exostenza

Quote:


> Originally Posted by *R35ervoirFox*
> 
> Yeah, I just ordered the Sapphire R9 290 and Arctic Cooling Accelero Xtreme III for €374. Hopefully I'll get it over the weekend and report back.


I want to do this so badly, but the end price comes out to just around where the R9 290X is which makes me unable to pull the trigger.... I am sure it is going to be great for you, but I am going to wait and see what the non-ref cooled boards come out with at the end of this month.


----------



## HoZy

Quote:


> Originally Posted by *Exostenza*
> 
> I want to do this so badly, but the end price comes out to just around where the R9 290X is which makes me unable to pull the trigger.... I am sure it is going to be great for you, but I am going to wait and see what the non-ref cooled boards come out with at the end of this month.


You'll be waiting until December/January for a non-ref 290X. AMD have a strict contract with vendors at the moment.


----------



## grandpatzer

Quote:


> Originally Posted by *utnorris*
> 
> Wouldn't exactly call it a Turkey, just not an extreme performer.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am pretty happy with being able to beat my previous GTX670 SLI and HD7970 CF scores with just one card. I have an MSI coming in that I will try out and see how well it does. If it ends up being an extreme performer than I will sell the Sapphire and grab another 290x when they come back in stock. If it performs similarly I will just keep them both and run CF and call it the day.


I'm very surprised a watercooled 1200MHz 290X beats 670 SLI and 7970 CFX.
From what I've read, a 290X@1200MHz is about 45% better than a 7970 GHz in games, so 2x 7970 should be better.

Personally I'm going from 7950 CFX to a 290, and I'm hoping to watercool it to 1200-1300MHz. I'm expecting to lose about 20% FPS because my 2x 7950 were clocked at 1100 core.


----------



## NateN34

Quote:


> Originally Posted by *HoZy*
> 
> You'll be waiting until December/January for non-ref 290x. AMD Have a strict contract with vendors at the moment.


You sure? Dang, I can't wait that long! Stupidest move from AMD...how can they be this dumb to DELIBERATELY delay aftermarket cards? Almost don't want to go AMD, because of all the stupid moves they have been doing lately.

So then it is either Accelero + 290 or a 780......not sure which to get. Been hearing that the 290 has lots of issues with black screens and coil whine.


----------



## Arizonian

Quote:


> Originally Posted by *NateN34*
> 
> You sure? Dang, I can't wait that long! Stupidest move from AMD...how can they be this dumb to DELIBERATELY delay aftermarket cards? Almost don't want to go AMD, because of all the stupid moves they have been doing lately.
> 
> So then it is either Accelero + 290 or a 780......not sure which to get. Been hearing that the 290 has lots of issues with black screens and coil whine.


I don't think anyone is sure, but I've read to the contrary.

*Source - TechPowerUP!*
Quote:


> According to the report, AMD's AIB partners will launch R9 290X graphics cards *with custom-design air- and liquid-cooling solutions by late November, 2013.*


*Galaxy Readies GeForce GTX 780 Ti HOF Graphics Card*
Quote:


> The source expects non-reference design GTX 780 Ti cards to launch around late-November and early-December, strategically timed against non-reference Radeon R9 290X.


Personally I'm also impatient, as I've gone and put my 690 in a second rig and am now using a piddly 680. After having the 290X in my system I can't even game anymore until I get the juice back.


----------



## KaiserFrederick

Sapphire R9 290, EK full cover Acetal+Nickel block and backplate


----------



## stiv

just replaced my sapphire 7970's, just waiting for my water block to arrive


----------



## SonDa5

Quote:


> Originally Posted by *Technewbie*
> 
> Does anyone have a copy of the PT1 BIOS? I can't find it anywhere and the original post of it is deleted.


Everything you need below in the link:

http://www.overclockers.com/forums/showthread.php?t=739529


----------



## Toss3

Quote:


> Originally Posted by *grandpatzer*
> 
> I'm very surprised a watercooled 1200MHz 290X beats 670 SLI and 7970 CFX.
> From what I've read, a 290X@1200MHz is about 45% better than a 7970 GHz in games, so 2x 7970 should be better.
> 
> Personally I'm going from 7950 CFX to a 290, and I'm hoping to watercool it to 1200-1300MHz. I'm expecting to lose about 20% FPS because my 2x 7950 were clocked at 1100 core.


1200Mhz might not be possible. Even with no vdroop my 290 won't stay stable at anything above 1150Mhz (water-cooled with ek's water block). The highest stable clocks for BF4 have been 1150/5600 (no black screen/artefacts or red screens).


----------



## R35ervoirFox

Quote:


> Originally Posted by *Exostenza*
> 
> I want to do this so badly, but the end price comes out to just around where the R9 290X is which makes me unable to pull the trigger.... I am sure it is going to be great for you, but I am going to wait and see what the non-ref cooled boards come out with at the end of this month.


I hear ya. Here the X costs about €500 and you still need to buy an aftermarket cooler, so it actually comes to about €540. It was a very easy decision for me.


----------



## raptor15sc

Quote:


> Originally Posted by *Toss3*
> 
> 1200Mhz might not be possible. Even with no vdroop my 290 won't stay stable at anything above 1150Mhz (water-cooled with ek's water block). The highest stable clocks for BF4 have been 1150/5600 (no black screen/artefacts or red screens).


What voltage do you use to get 1150, and what's the highest you've tried?


----------



## Gilgam3sh

Quote:


> Originally Posted by *Toss3*
> 
> 1200Mhz might not be possible. Even with no vdroop my 290 won't stay stable at anything above 1150Mhz (water-cooled with ek's water block). The highest stable clocks for BF4 have been 1150/5600 (no black screen/artefacts or red screens).


It seems you're right. Yesterday I ran the Valley benchmark for about an hour at 1200/5000 on my 290 with the Accelero Xtreme III, no problem at all; max GPU temp was about 65°C, VRM1 was at 80 and VRM2 at 50. Then I tried BF4 and it started without problems, but later that day everything over 1150 core simply did not work in BF4. Even while loading the map I got artifacts, and increasing the voltage didn't help; I tried 1200, then 1180, 1170, 1160, but same problem. So now I'm stuck at 1150 core at about 1.32v. Not bad, but it's always nice to get 1200MHz. Maybe I'll try once again later today, or I'll simply leave it at 1150MHz core and try to increase memory speed (if that even gives me any boost at all?). Or maybe it doesn't like the ASUS 290X BIOS; I will try with stock once voltage support is available.


----------



## Arizonian

Quote:


> Originally Posted by *stiv*
> 
> just replaced my sapphire 7970's, just waiting for my water block to arrive
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## hotrod717

Just received and installed my new 290. 71.3% ASIC.

Bone stock valley. No black screens. ( fingers crossed)

I believe that's almost 20% improvement over stock 7970ghz.
Amazing visually on my new 144hz 3d Asus VG248QE.
Didn't realize how stunning valley could be.


----------



## Arizonian

Quote:


> Originally Posted by *hotrod717*
> 
> Just received and installed my new 290. 71.3% asic.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Bone stock valley. No black screens. ( fingers crossed)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> I believe that's almost 20% improvement over stock 7970ghz.
> Amazing visually on my new 144hz 3d Asus VG248QE.
> Didn't realize how stunning valley could be.


Congrats - added


----------



## Gilgam3sh

Afterburner BETA 17, support for voltage control!!

http://www.guru3d.com/files_details/msi_afterburner_beta_download.html


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gilgam3sh*
> 
> Afterburner BETA 17, support for voltage control!!
> 
> http://www.guru3d.com/files_details/msi_afterburner_beta_download.html





Wooo!!!


----------



## smokedawg

Quote:


> Originally Posted by *Gilgam3sh*
> 
> Afterburner BETA 17, support for voltage control!!


Perfect! Just got done installing my first watercooled build yesterday.









Arizonian, could you please update me to wc? Thanks!


----------



## Duvar

Change notes:

Added support for Hawaii family, MSI R9 270X, R9 270X hawk and R9 280 voltage control.

What about 290 and 290X ?


----------



## raptor15sc

Quote:


> Originally Posted by *Gilgam3sh*
> 
> Afterburner BETA 17, support for voltage control!!
> 
> http://www.guru3d.com/files_details/msi_afterburner_beta_download.html


Quote:


> Originally Posted by *Duvar*
> 
> Change notes:
> 
> Added support for Hawaii family, MSI R9 270X, R9 270X hawk and R9 280 voltage control.
> 
> What about 290 and 290X ?


I just tried it with my MSI 290X cards with stock Asus BIOS and no voltage control...









Can someone else confirm?


----------



## hotrod717

Yeah, no voltage control here. What the heck? Now that's a let down!


----------



## raptor15sc

Quote:


> Originally Posted by *hotrod717*
> 
> Yeah, no voltage control here. What the heck?


This sucks. I was going to finally wipe my system and start fresh with Windows 8.1 now that SteelSeries finally got their act together with their mouse drivers (lost A LOT of respect for SteelSeries from this, FYI), but now I guess I'll wait for the next beta drivers of Catalyst and AfterBurner...


----------



## Gilgam3sh

maybe you need to use the ASUS 290X BIOS?


----------



## raptor15sc

Quote:


> Originally Posted by *Gilgam3sh*
> 
> maybe you need to use the ASUS 290X BIOS?


It works for you? I am using the ASUS 290X BIOS.


----------



## hotrod717

Quote:


> Originally Posted by *raptor15sc*
> 
> I just tried it with my MSI 290X cards with stock Asus BIOS and no voltage control...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can someone else confirm?


Quote:


> Originally Posted by *raptor15sc*
> 
> It works for you? I am using the ASUS 290X BIOS.


Uh, I believe he said he is. And it shouldn't be limited to Asus bios. It's MSI AFTERBURNER.
Got my hopes up, but I'm not planning on oc'ing this until I get a waterblock anyways.


----------



## Gilgam3sh

Quote:


> Originally Posted by *raptor15sc*
> 
> It works for you? I am using the ASUS 290X BIOS.


I'm at work so I can't test it, but if you guys say it does not work, then ASUS GPU Tweak still works.


----------



## Forceman

Quote:


> Originally Posted by *raptor15sc*
> 
> I just tried it with my MSI 290X cards with stock Asus BIOS and no voltage control...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can someone else confirm?


Yeah, no dice on my XFX card either.


----------



## Connolly

Does anyone know if the R9 290/x will be compatible with the new G-Sync technology from Nvidia? If you have an Asus 248QE you can upgrade without needing to buy a new monitor - http://www.blurbusters.com/gsync/upgrade-walkthrough/

It seems strange that a hard upgrade to a display device wouldn't work with the R9's, but that's what's being said on many sites. I'd love someone to tell me otherwise.


----------



## MacClipper

Voltage controls?


----------



## raptor15sc

Quote:


> Originally Posted by *MacClipper*
> 
> Voltage controls?
> 
> 
> Spoiler: Warning: Spoiler!


Oh great, now I gotta flash back?


----------



## Gilgam3sh

http://forums.guru3d.com/showthread.php?t=383541

Guys, I've just noticed that in beta 17 database we unlocked voltage control for reference VGA BIOS only. So those who flash third party BIOS are out of luck. We'll remove reference BIOS restriction shortly and upload updated new version.
In meanwhile, while revised beta 17 is not uploaded experienced users may unlock voltage control in current beta 17 by adding the following lines to hardware profile file (.\Profiles\VEN_1002.......cfg):

[Settings]
VDDC_IR3567B_Detection = 30h
VDDC_IR3567B_Output = 0
VDDCI_IR3567B_Detection = 30h
VDDCI_IR3567B_Output = 1
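For anyone nervous about hand-editing the profile, the quoted workaround boils down to adding those four keys under a `[Settings]` section in the card's `.cfg` file. A minimal Python sketch of that edit (the profile path is whatever `VEN_1002...cfg` matches your card; this assumes Afterburner tolerates standard INI `key = value` formatting, so back up the original first):

```python
import configparser
import io

# The four unlock keys quoted above (beta 17 reference-BIOS workaround).
UNLOCK_KEYS = {
    "VDDC_IR3567B_Detection": "30h",
    "VDDC_IR3567B_Output": "0",
    "VDDCI_IR3567B_Detection": "30h",
    "VDDCI_IR3567B_Output": "1",
}

def add_unlock_settings(profile_text: str) -> str:
    """Return the profile text with a [Settings] section holding the unlock keys."""
    parser = configparser.ConfigParser()
    parser.optionxform = str  # preserve key case exactly as listed above
    parser.read_string(profile_text)
    if not parser.has_section("Settings"):
        parser.add_section("Settings")
    for key, value in UNLOCK_KEYS.items():
        parser.set("Settings", key, value)
    buf = io.StringIO()
    parser.write(buf)
    return buf.getvalue()
```

Once the revised beta 17 ships with the reference-BIOS restriction removed, none of this should be necessary.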


----------



## Sgt Bilko

Very nice, hopefully this means some more benches on the Hawaii vs GK110 Thread within the next few days


----------



## r0l4n

Quote:


> Originally Posted by *MacClipper*
> 
> Voltage controls?


Aux voltage?


----------



## Gilgam3sh

Quote:


> Originally Posted by *r0l4n*
> 
> Aux voltage?


"Its the phase locked loop (PLL) voltage. When you cant go higher on the OC it can help stabilize that last few MHz. Sometimes adjusting higher helps sometimes adjusting lower helps, its a try it and see what works kinda deal."

http://www.overclock.net/t/1355920/what-is-aux-voltage-in-afterburner


----------



## smokedawg

Quote:


> Originally Posted by *r0l4n*
> 
> Aux voltage?


Maybe the VDDC voltage Unwinder mentioned:
Quote:


> There is no programmable memory voltage controller on reference 290/x. Only VDDC/VDDCI.


source


----------



## Joeking78

Is the 280x Hawaii?

Really hoping this is the end of my voltage locked Gigabyte card...but not holding my breath.


----------



## MojoW

Quote:


> Originally Posted by *Joeking78*
> 
> Is the 280x Hawaii?
> 
> Really hoping this is the end of my voltage locked Gigabyte card...but not holding my breath.


Nope it's not.
But you can always check by looking in GPU-Z


----------



## r0l4n

Quote:


> Originally Posted by *Gilgam3sh*
> 
> "Its the phase locked loop (PLL) voltage. When you cant go higher on the OC it can help stabilize that last few MHz. Sometimes adjusting higher helps sometimes adjusting lower helps, its a try it and see what works kinda deal."
> 
> http://www.overclock.net/t/1355920/what-is-aux-voltage-in-afterburner


I was hoping for LLC...


----------



## Joeking78

Quote:


> Originally Posted by *MojoW*
> 
> Nope it's not.
> But you can always check by looking in GPU-Z


I'm at work QQ...

I'll be home soon though, with a bit of luck it works on the 280x, at least to some degree.


----------



## Arizonian

Quote:


> Originally Posted by *smokedawg*
> 
> Perfect ! Just got done installing my first watercooled build yesterday.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Arizonian, could you please update me to wc? Thanks!
> 
> 
> Spoiler: Warning: Spoiler!


Done








Looks great.

Side note:
*BF4 is getting a large patch today.*

*Source*


Spoiler: BF4 November 14th Update Notes



Quote:


> UPDATE #10 (Nov 14):
> PC game update is now live!
> Details on what the patch includes will be posted in a couple of hours.
> 
> UPDATE #9:
> PC Game Update Goes Live Nov 14
> We are happy to report that the PC Battlefield 4 game update will go live tomorrow, November 14. The main focus of this update is threefold:
> 
> *1) Reduce the number of crashes*
> We believe this update will solve a large portion of the most commonly occurring game crashes, and the improvement in overall stability should make a big difference for many players.
> 
> *2) Eliminate a network sync bug*
> If you've been experiencing situations in multiplayer where it appears that you are taking damage from enemies through walls, you might have been the victim of a network sync bug. In this game update, we have identified and eliminated one such bug that caused this type of gameplay experience. We are continuing to work on more multiplayer optimizations concerning network performance.
> 
> *3) Improve Network Smoothing*
> We have made some improvements to the "Network Smoothing" functionality that you can find in the OPTIONS>GAMEPLAY menu. The Network Smoothing slider governs a group of settings that aim to produce a tighter multiplayer experience based on your specific packet loss situation.
> 
> If you've been experimenting with the Network Smoothing slider earlier, it might now yield better results. If you haven't tried it before, please explore this setting and set it to the lowest value you can without experiencing graphical glitches in the game. By setting it lower, you can get a tighter multiplayer experience, depending on your specific network situation.
> 
> More details upcoming
> Besides the above mentioned focus areas, this PC game update contains a number of other fixes and tweaks that we will detail once it's live.
> 
> This game update for PC goes live around 10AM UTC/2AM PST November 14. Your Origin Client should acknowledge and automatically download this update for you, as usual. Otherwise, you can right click your game in the client and select "Check for Update". There will be a Battlelog maintenance downtime for about an hour as this update goes live. During this time multiplayer on PC will be unavailable.
> 
> We will release full patch notes for this game update later, but the three items mentioned above are by far the most important changes in this update. We understand that stability has been rocky for some players, and hope that this update will make the game run smoother and more stable for you. Let us know in the comments below.






EDIT - Also don't miss a break down to *Chipp @ AMD Developer Summit 2013* which is in the AMD / ATI section of our forum.









/sub - and await the update


----------



## SonDa5

Finally stabilized 1300mhz for 3dMark11P run!








P18,209 single AMD R9 290x.

http://www.3dmark.com/3dm11/7488080



Working great in my monster mini-ITX build.


----------



## blue1512

^ What a beautiful rig, and beastly too








One question, which voltage is used for 1300 MHz?


----------



## SonDa5

Quote:


> Originally Posted by *blue1512*
> 
> ^ What a beautiful rig, and beastly too
> 
> 
> 
> 
> 
> 
> 
> 
> One question, which voltage is used for 1300 MHz?


Thanks.

Using the modded Asus PT1 BIOS with the voltage set to 1.618V in the Hawaii version of the GPU Tweak software; looking at GPU-Z, the max voltage it reads is 1.539V. Some voltage loss, which is normal.

I'm hoping Sapphire scores big with a new BIOS for the card which I think will be the one that will go with the Sapphire OC Dual X fan version which should be out soon.
With a better BIOS and Sapphire TRIXX voltage control I think I can get more out of the card.
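The gap between the set and measured voltage is just vdroop under load; a one-liner makes the arithmetic explicit (the 1.618/1.539 figures are the ones reported above, not a spec):

```python
def vdroop(set_v: float, measured_v: float) -> float:
    """Voltage lost under load between the value set in software and the GPU-Z reading."""
    return round(set_v - measured_v, 3)

# 1.618 V set vs. 1.539 V read works out to roughly 0.079 V of droop
```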


----------



## Jpmboy

Quote:


> Originally Posted by *sugarhell*
> 
> No tweaks.


and how could you tell?


----------



## Jpmboy

Quote:


> Originally Posted by *MacClipper*
> 
> Voltage controls?


core clock slider????


----------



## Jpmboy

Quote:


> Originally Posted by *SonDa5*
> 
> Finally stabilized 1300mhz for 3dMark11P run!
> 
> 
> 
> 
> 
> 
> 
> 
> P18,209 single AMD R9 290x.
> http://www.3dmark.com/3dm11/7488080
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Working great in my monster mini-ITX build.
> 
> 
> Spoiler: Warning: Spoiler!


What besides the invalid driver is in the yellow box on the Futuremark report?

*tess off - invalid score.*



Run it again with just "optimal performance" selected in Catalyst, then post again... to here:

http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad


----------



## overclockFrance

Quote:


> Originally Posted by *r0l4n*
> 
> Is it allowed to upload homebrewed utilities?
> 
> I've created a small windows application that wraps the VRM console commands towards MSI Afterburner (link and link).
> 
> Features:
> -enables LLC, or
> -applies a VDDC offset between 0v and +0.3v
> -loads profiles on load and/or on idle if desired
> -applies these *only* to those processes in an input list
> -stores settings to file so one doesn't have to enter them every time
> -minimizes to system tray
> 
> 
> 
> I've tested myself (I'm running a Gigabyte R9 290) and works very well, but I will not take, of course, any blame if your windows crashes, your GPU gets fried, etc. Enabling LLC adds +0.2v straight away, so you can get yourself an idea of what you are putting your GPU through.
> 
> I hope to get some feedback from the OP.


Would it be possible to get this tool, please? I ask because the maximum voltage offset and frequencies in MSI Afterburner are only +100 mV and 1235/1625 MHz.


----------



## Menthol

Can someone please pm me with a 290 bios


----------



## Jpmboy

Quote:


> Originally Posted by *r0l4n*
> 
> I was hoping for LLC...


Me too


----------



## r0l4n

Quote:


> Originally Posted by *overclockFrance*
> 
> Would it be possible to get this tool, please? I ask because the maximum voltage offset and frequencies in MSI Afterburner are only +100 mV and 1235/1625 MHz.


Got PM.


----------



## rv8000

New afterburner beta is out, first post.
Quote:


> Hi guys, the Afterburner 3.0.0 Beta 17 is ready for you to download. this version supports the control of NVIDIA and AMD graphics card, if you get the card, you may try to play it.
> 
> Change note:
> Now supported MSI R270X HAWK and AMD Hawaii family voltage control.


http://forums.guru3d.com/showthread.php?s=c15e1ced672807e34ef0abc1a727979b&t=383541


----------



## rdr09

Quote:


> Originally Posted by *Menthol*
> 
> Can someone please pm me with a 290 bios


pm'ed


----------



## Connolly

Does anyone know what the best R9 290 (single card) valley score is so far, off the top of their heads?


----------



## Raephen

Quote:


> Originally Posted by *Connolly*
> 
> Does anyone know what the best R9 290 (single card) valley score is so far, off the top of their heads?


Edit: I misread your post and gave a different answer at first.
Hmm... "I have no idea" is the correct answer.


----------



## Jpmboy

Quote:


> Originally Posted by *Connolly*
> 
> Does anyone know what the best R9 290 (single card) valley score is so far, off the top of their heads?


http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0


----------



## Sgt Bilko

Speaking of Valley, I've been messing about with AB since the update and my core clock is throttling; fan is at 100%, power limit is +50... any ideas?

Or do I need to enable the power limit in CCC for it to work properly?

EDIT: nvm, I needed to enable and set it in CCC for the power limit to work.


----------



## Falkentyne

Quote:


> Originally Posted by *r0l4n*
> 
> Got PM.


Hi there, may I also get a link for your tool?


----------



## r0l4n

For those interested in giving my tool a try: I haven't tested it with AB beta 17 yet, so I can't guarantee it works. Will test later today. I will also make a change so one can enable LLC *and* apply a voltage offset (possibly *dangerous*); right now it's either/or.

I will post the link to the new version in my original post when I'm done, hopefully in a few hours.


----------



## DampMonkey

Quote:


> Originally Posted by *r0l4n*
> 
> For those interested in giving my tool a try: I haven't tested it with AB beta 17 yet, so I can't assure it works. Will test later today. I will, as well, make a change so one can enable LLC *and* apply a voltage offset (possibly *dangerous*), right now it's either or.
> 
> I will post the link to the new version in my original post when I'm done, hopefully in a few hours.


Ill give it a try if you want to send it my way with a pm


----------



## $ilent

Guys, I've had to RMA my 290X due to it being defective, and I'm thinking of picking up a regular Sapphire R9 290. Besides the slower core clock, what's the difference between the 290 and 290X? Does the 290 still have unlocked voltage control, BIOS switch, etc.?


----------



## Gilgam3sh

Quote:


> Originally Posted by *$ilent*
> 
> Guys, I've had to RMA my 290X due to it being defective, and I'm thinking of picking up a regular Sapphire R9 290. Besides the slower core clock, what's the difference between the 290 and 290X? Does the 290 still have unlocked voltage control, BIOS switch, etc.?


2560 stream processors instead of 2816, and 160 texture units instead of 176. And yes, it does have unlocked voltage and the BIOS switch.


----------



## jerrolds

Not sure if you guys are aware (haven't read the last 10 pages), but Afterburner Beta 17 is out (with 290X voltage support): http://www.guru3d.com/files_details/msi_afterburner_beta_download.html


----------



## jerrolds

Quote:


> Originally Posted by *tsm106*
> 
> Throw a fan on it.
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/640_40#post_21060304
> 
> In other news, finally got my blocks and stuff related to it. The block shortage was more severe than the card shortage lol.


What kind of temps were you getting on VRM1 (I guess it's the top-left group of 3 VRMs) with just a fan on it? I installed the Gelid Icy Vision 2, and while I was able to hit 1220MHz on the core with a max of 69°C, VRM1 was easily climbing to 95°C, haha.

VRM2 maxed out at around 65°C, which I believe is the long strip of VRMs on the right side.

That top VRM in the group of 3 was a PITA to tape a U or "bridge" ramsink onto, with so little contact area, so it's probably not on very securely. But if I can just lose it and have a fan blow air directly on it, it would save me a trip to buy thermal glue and permanently attach ramsinks.


----------



## Arizonian

Quote:


> Originally Posted by *KaiserFrederick*
> 
> Sapphire R9 290, EK full cover Acetal+Nickel block and backplate
> 
> 
> Spoiler: Warning: Spoiler!


Wow, almost missed you. Slotted you into the roster in order of your post. Congrats - added









On a related note: IF I've missed anyone else and you don't see your name on roster, please PM me with link to your submission post to be added. I've been following the thread religiously but some days I wake up and it's moved crazy fast in one day. I apologize up front.


----------



## tsm106

Quote:


> Originally Posted by *jerrolds*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Throw a fan on it.
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/640_40#post_21060304
> 
> In other news, finally got my blocks and stuff related to it. The block shortage was more severe than the card shortage lol.
> 
> 
> 
> What kind of temps were you getting on VRM1 (i guess its the top left group of 3 VRMs) with just a fan on it? I installed the Gelid ICY 2, and while i was able to hit 1220mhz on the core (maybe more) VRM1 was easily climbing to 95C haha
> 
> While VRM2 maxed out at around 65C, which i believe was the long strip of VRMs on the right side.
> 
> That top VRM in that group of 3 was a bit to tape a U or "bridge" ramsink on it, so little contact area - so its probably not on very securely. But if i can just lose it and have a fan blow air directly on it..it would save me a trip for buying thermal glue and permanently attaching ram sinks on it.
Click to expand...

It got so hot the thermal tape got gooey. You know the temp is bad when the sinks start falling off. I didn't even bother to measure temps after that. When I put my blocks in, I'll have thermal probes ready to get a more accurate measurement.


----------



## selk22

Yeah! I broke 10k in Firestrike Basic! Finally.....



http://www.3dmark.com/3dm/1629786

r9 290x stock bios. Stock volts. 1100/1400 on air


----------



## Sgt Bilko

Quote:


> Originally Posted by *selk22*
> 
> Yeah! I broke 10k in Firestrike Basic! Finally.....
> 
> http://www.3dmark.com/3dm/1629786
> 
> r9 290x stock bios. Stock volts. 1100/1400 on air


nice one









I can't even break 9k......highest so far is 8992 -_-

i should be able to break it sometime over the weekend


----------



## selk22

Thanks!
Cmon man you got this!







you're the AMD champion!









You get that 8350 yet buddy?


----------



## ImJJames

Quote:


> Originally Posted by *Sgt Bilko*
> 
> nice one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can't even break 9k......highest so far is 8992 -_-
> 
> i should be able to break it sometime over the weekend


That doesn't seem right, my 7970 hits almost 9k at 1180core/1685mem on air with barely any voltage increase.


----------



## Sgt Bilko

Quote:


> Originally Posted by *selk22*
> 
> Thanks!
> Cmon man you got this!
> 
> 
> 
> 
> 
> 
> 
> your the AMD champion!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You get that 8350 yet buddy?


Yep, running at 4.8 atm; I can't get 5GHz stable just yet... and I'm pretty sure the AMD championship in this thread belongs to DampMonkey









Quote:


> Originally Posted by *ImJJames*
> 
> That doesn't seem right, my 7970 hits almost 9k at 1180core/1685mem on air with barely any voltage increase.


my 290x will break 12k GPU score, it's the CPU that's holding me back atm........


----------



## Arizonian

Quote:


> Originally Posted by *ImJJames*
> 
> That doesn't seem right, my 7970 hits almost 9k at 1180core/1685mem on air with barely any voltage increase.


He's using an 8350 CPU and that's his combined score. Your rig isn't listed... what CPU are you sporting?

Also, list your rig; OCN members like to see 'em, so when you post they know where you're coming from.

*"How to put your Rig in your Sig"*


----------



## Jpmboy

Quote:


> Originally Posted by *r0l4n*
> 
> For those interested in giving my tool a try: I haven't tested it with AB beta 17 yet, so I can't assure it works. Will test later today. I will, as well, make a change so one can enable LLC *and* apply a voltage offset (possibly *dangerous*), *right now it's either or.*
> 
> I will post the link to the new version in my original post when I'm done, hopefully in a few hours.


So that's what was going on last night! I actually tried to ferret out why I didn't get the voltage increase by issuing the commands manually, which works btw.

Post the new version in your sig. It's OCN... Everything is "use at your own risk"


----------



## jerrolds

Anyone happen to know the safe operating temps for the 290(X) VRMs? I've heard they might be rated up to 125°C. I installed the Gelid Icy Vision 2 last night, and while the core maxed out at 69°C @ 1220MHz (~1.3v after vdroop), VRM1 easily hit 95°C. VRM2 held steady at 65°C, so that's good.

I was not artifacting or anything after about 10-15 mins in Heaven, but I shut it down manually; I'm just not comfortable with it almost hitting 100°C. I think I'd prefer to keep it under 85°C if possible.

Attaching the U-shaped or bridge heatsinks onto the group of 3 VRMs (the top one specifically) was a real PITA.

I have a feeling I can hit 1230-1250MHz on the core if I can just control the VRM temps a bit better. I have more room to tweak, as this was just a quick "crank it and see what happens" run.


----------



## bond32

I broke 10k too: http://www.3dmark.com/3dm/1630109?

1100/1400


----------



## Jpmboy

Quote:


> Originally Posted by *jerrolds*
> 
> Anyone happen to know the safe operating temps are for the 290(X) VRMs? I've heard they might be rated up to 125C. I installed the Gelid ICY 2 last night, and while the core maxed out at 69C @1220mhz ~1.3v after vdroop, VRM1 easily hit 95C. VRM2 held steady at 65C so thats good.
> 
> I was not artifacting or anything after about 10-15mins in Heaven, but i shut it down manually. Just not comfortable with it almost hitting 100C. I think id prefer to keep it under 85C if possible.
> 
> Attaching the U-shaped or bridge heat sinks onto the group of 3 VRMs (the top one specifically) was a real PITA.
> 
> I have a feeling i can hit 1230-1250mhz on the core if i can just control the VRM temps a bit better. I have more room to tweak as this was just a quick crank it and see what happens


I see VRM1 running hotter also (max on H2O is 56°C). I think 125°C is the spec limit, not the top of the nominal operating range. Generally, 80% of the spec limit is okay, but verify that limit in the IR documentation; it's available online.
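That 80%-of-spec rule of thumb is easy to sanity-check with a two-line sketch (the 125°C figure is the rumored limit from the posts above, not a verified IR datasheet value):

```python
def derated_ceiling(spec_limit_c: float, derating: float = 0.8) -> float:
    """Rule-of-thumb operating ceiling: run at no more than a fraction of the datasheet max."""
    return spec_limit_c * derating

# If the VRM spec limit really is 125 C, the comfortable ceiling is 100 C,
# which is why 95 C VRM1 readings feel uncomfortably close.
```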


----------



## thekamikazepr

Quote:


> Originally Posted by *Jpmboy*
> 
> I see vrm1 running hotter also. (Max on h2o is 56C) I think 125c is the spec limit, not in the nominal operating range. Generally, 80% of the spec limit is okay. But verify that limit in the IR documentation... It's available on-line.


Same here, 85-91C.


----------



## r0l4n

Application available for download, see original post.
Quote:


> Originally Posted by *r0l4n*
> 
> Hi!
> 
> I've created a small windows application that wraps the VRM console commands towards MSI Afterburner (link and link).
> 
> Features:
> -enables/disables LLC
> -applies a VDDC offset between 0v and +0.3v
> -loads profiles on load and/or on idle if desired
> -applies these *only* to those processes in an input list
> -stores settings to file so one doesn't have to enter them every time
> -minimizes to system tray
> 
> Requires:
> -Microsoft .NET 4.0
> -MSI Afterburner beta 16 or newer
> 
> Screenshot:
> 
> 
> I've tested it myself (I'm running a Gigabyte R9 290) and it just works, but *I will not take, of course, any blame if your Windows crashes, your GPU gets fried, etc*. Enabling LLC adds +0.1v straight away, so you can get an idea of what you are putting your GPU through.
> 
> These are the AB settings I tested the application with:
> 
> 
> Both files included in the .zip file need to be in the same location. If you have UAC enabled, you may need to run it as administrator.
> 
> Please report any bugs to me in a private message.
> 
> ****USE AT YOUR OWN RISK****
> 
> ABAutoProfileVoltMod.exe.zip 314k .zip file
> 
> ****USE AT YOUR OWN RISK****


----------



## th3illusiveman

Quote:


> Originally Posted by *SonDa5*
> 
> Finally stabilized 1300mhz for 3dMark11P run!
> 
> 
> 
> 
> 
> 
> 
> 
> P18,209 single AMD R9 290x.
> 
> http://www.3dmark.com/3dm11/7488080
> 
> 
> 
> Working great in my monster mini-ITX build.


tessellation disabled


----------



## HughhHoney

Is anyone getting memory voltage control on the new afterburner?

I flashed a stock rom, but it's still greyed out for me.


----------



## estens

As usual I have gotten a card that is a bad overclocker.
Card does about 1100MHz on stock voltage; now with MSI Afterburner voltage control I have managed to get to 1130MHz.








Can bump the voltage +50 on the slider in AB; any more than that I get really bad artifacts and eventually it locks up.


----------



## Technewbie

Quote:


> Originally Posted by *HughhHoney*
> 
> Is anyone getting memory voltage control on the new afterburner?
> 
> I flashed a stock rom, but it's still greyed out for me.


There is no memory voltage control, because the reference 290(X) doesn't have a programmable memory voltage controller. (This is what Unwinder said.)


----------



## Bull56

Well guys, I'm better









I scored 13k with one 290X


----------



## HughhHoney

Quote:


> Originally Posted by *Technewbie*
> 
> There is no memory voltage control, Because the reference 290(x) don't have a programmable memory voltage controller. (This is what unwinder said)


Oh, thanks, I guess I missed that. I was really looking forward to it since I can't push my memory past 1325.

Oh well...


----------



## rdr09

Quote:


> Originally Posted by *HughhHoney*
> 
> Oh thanks I guess I missed that. I was really looking forward to that I can't push my memory past 1325.
> 
> Oh well...


You can't go past 1325 using stock volts? What are you using to OC?

I think these are my highest settings for stock volts...



http://www.3dmark.com/3dm11/7489178


----------



## Porter_

rdr09 is that artifact-free and game stable? if so i'm impressed. at stock voltage my card begins to artifact at 1075 core clock. haven't really nailed down the max memory clock yet but 1350 is perfectly fine.


----------



## HughhHoney

Quote:


> Originally Posted by *rdr09*
> 
> you can't go past 1325 using stock volts? what are you using to oc?


I've been using gpu tweak.

I can get my core up over 1200, but games and benchmarks crash right away at memory speeds over 1325.

I'm going to test out afterburner for a while today. Hopefully I get better results.


----------



## dade_kash_xD

So, I had to undo my OC today because it kept causing BF4 to freeze. It would simply get stuck on the screen and I would have to hard power down my system. This is reminiscent of the issues I was having with my R9 280X Crossfire. At first, they played fine OC'd, then I had to put them at stock, then I couldn't play them at all... Starting to sound like my mobo? Or is this just the patch they released today?


----------



## bond32

Just placed the order for the Koolance GPU block... Little ticked that it came out to $142 with shipping... Couldn't really find it anywhere else.


----------



## jerrolds

Quote:


> Originally Posted by *Porter_*
> 
> rdr09 is that artifact-free and game stable? if so i'm impressed. at stock voltage my card begins to artifact at 1075 core clock. haven't really nailed down the max memory clock yet but 1350 is perfectly fine.


I was able to hit 1110/1400 using stock voltage/BIOS, but this was at 60% fan speed. With the Asus BIOS I was able to hit 1180 core at 80% fan speed.









Right now I'm at 1220 with the Gelid ICY 2 (I think I can go higher) but VRM1 temps are kinda crazy atm... going to try and rein those in before pushing past it.


----------



## ImJJames

Quote:


> Originally Posted by *Sgt Bilko*
> 
> yep, running at 4.8 atm, i can't get 5Ghz stable just yet........and i'm pretty sure the AMD championship in this thread belongs to DampMonkey
> 
> 
> 
> 
> 
> 
> 
> 
> my 290x will break 12k GPU score, it's the CPU that's holding me back atm........


Ah okay, I thought you meant graphics score when you said you can barely hit 9k.

Your CPU is not holding you back in anything but synthetic scores, lol.


----------



## rdr09

Quote:


> Originally Posted by *Porter_*
> 
> rdr09 is that artifact-free and game stable? if so i'm impressed. at stock voltage my card begins to artifact at 1075 core clock. haven't really nailed down the max memory clock yet but 1350 is perfectly fine.


These clocks are artifact free. Any higher and I start to see them. I game at stock clocks 'cause I only use 1080p. ASIC is 79.4%. It's watered, btw.

Quote:


> Originally Posted by *HughhHoney*
> 
> I've been using gpu tweak.
> 
> I can get my core up over 1200, but games and benchmarks crash right away at memory speeds over 1325.
> 
> I'm going to test out afterburner for a while today. Hopefully I get better results.


I will definitely have to increase the core's voltage to go any higher than 1155. You've got to keep it cool; my block in that run kept the core temp below 50C, as shown. Sucks I can't monitor the VRMs' temps.


----------



## SonDa5

Quote:


> Originally Posted by *Jpmboy*
> 
> what besides invalid driver is in the yellow box on the futuremark report?
> 
> *tess off - invalid score.*
> 
> 
> 
> run it by just selecting "optimal" performance" in catalyst, the post again... to here:
> 
> http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad


It's not an invalid score to the pros at HWbot.org. I just set my CCC setting for performance. This is the highest FPS possible from this benchmark. The pro overclockers do the same thing with AMD cards and with NVIDIA cards.


----------



## VSG

I finally got my EK blocks, backplates and terminal link. Now all I need is my FrozenQ reservoir to arrive, if it ever does!


----------



## drdrache

Quote:


> Originally Posted by *geggeg*
> 
> I finally got my EK blocks, backplates and terminal link. Now all I need is my FrozenQ reservoir to arrive, if it ever does!


I'd check with FrozenQ; last I read they were shut down until further notice.


----------



## Jpmboy

Quote:


> Originally Posted by *SonDa5*
> 
> It's not an invalid score to the pros at HWbot.org. I just set my CCC setting for performance. This is the highest FPS possible from this benchmark. The pro over clockers do the same thing with AMD cards and with NVIDIA cards.


Correct, not invalid for the bot. But for HOF, and I believe for the single-card top 30 on OCN, it is not acceptable to turn tess off.

Yeah, when I do that, it hits 19k...


----------



## SonDa5

Quote:


> Originally Posted by *Jpmboy*
> 
> Correct, Not invalid for the bot. But for HOF, and i believe for the single card top 30 on ocn it is not acceptable to turn tess off.
> 
> Yeah, when i do that, it hits 19k...


This thread is not the HOF and it's not the top 30 on OCN, so quit telling me I can't share my best score here. The score is in fact legal at HWbot.org, and IMO HWbot.org is good enough for me.









What card do you hit 19k with? What's the graphics-only line score?


----------



## VSG

Quote:


> Originally Posted by *drdrache*
> 
> i'd check with frozenQ, last I read they were shutdown until further notice.


Ya I know, there was a fire in the shop. But last I heard my package was ready for shipping.


----------



## GioV

Have you guys been able to resolve the black screen problem? I seem to have found 2 black screen issues:

1) Game is playing and it crashes; the computer is still usable and I get the error "windows has encountered a problem and needs to close". The game is just a black screen, and once I close the error message I go back to the desktop. Most common; happens randomly during gameplay.

2) Much rarer than the other one: the whole screen is black and I have to hard reset. Seems to happen when I have too many monitoring programs open. For example, I was testing temps with HWinfo and got a black screen when I opened GPU-Z. This was on the desktop and not during gaming.

My R9 290X (stock clocks) is under water; VRM temps and core temps are very low. I have Elpida memory and I'm using Windows 8.1 Pro. I have tried everything besides changing the OS because it's a pain. I tested my rig with a 6950 and no issues whatsoever besides low fps.







If you guys have any input let me know please


----------



## bond32

I know some got the black screen with high refresh rates. I have yet to get one myself. I play BF4 at 1440p, all ultra 4xmsaa and 120hz. My desktop/everything else is at 96 hz - no black screens.

This is with a Sapphire 290x.


----------



## Raxus

Has anyone seen the ASUS r9 290x NON bf4 edition for sale in the US yet?


----------



## kcuestag

Anyone tried MSI AB Beta 17? What were you able to achieve with the max allowed of +100mV? So far it looks like I can barely hit 1200MHz on the Core, and still not 100% stable.









Very disappointed with my card.


----------



## Gilgam3sh

Yeah, seems hard to hit 1200MHz stable 24/7 on air, at least for me, even though I have no problems with temperature. Mine is not stable over 1150MHz, so now I run 1150/1450 at 1325mV with ASUS GPU Tweak. Still pretty OK I think, but maybe another BIOS than the ASUS 290X one I flashed could increase it, I don't know...


----------



## kcuestag

Quote:


> Originally Posted by *Gilgam3sh*
> 
> yeah seems hard to hit 1200MHz stable 24/7 on air atleast for me even though I have no problems with temperature , mine is not stable over 1150MHz, so now I run 1150/1450... 1325mv with ASUS GPU Tweak, still pretty OK I think, but maybe another BIOS than the ASUS 290X I flashed can increase it, I don't know...


Mine runs 1125/1400 on stock volts and 1200/1400 at +100mV (which according to GPU-Z is around 1300mV). Is that good?

Also, if I go above 1400MHz on the memory, sometimes it starts flickering like mad on idle (at desktop)...


----------



## Newbie2009

Quote:


> Originally Posted by *kcuestag*
> 
> Mine runs 1125/1400 on stock volts and 1200/1400 at +100mV (Which according to GPU-Z it is around 1300mV. Is that good?
> 
> Also, if I go above 1400MHz on the MEmory, sometimes it starts flickering like mad on idle (at desktop)...


Do you have ULPS turned off?

EDIT : You have 1 card nvm.

Stupid question: you're using uber mode, right? Try a driver reinstall? I've read quite a few black screen issues from people who didn't clean all drivers or didn't do a fresh install. Could be causing some instability?

Then again you could just have a turkey.


----------



## kcuestag

I noticed in Valley Benchmark that even though I had the core set to 1200MHz, it kept jumping anywhere from 900MHz to 1168MHz, but never stayed fixed at 1200MHz, even though temperatures are well below 55°C. What the hell is going on here?









Not quite happy with the card so far.


----------



## qqan

This cooling setup works best for me: 58C after 1 hour Valley @stock clocks and low speed fans, VRM temps <72C.


----------



## Gilgam3sh

Quote:


> Originally Posted by *kcuestag*
> 
> Mine runs 1125/1400 on stock volts and 1200/1400 at +100mV (Which according to GPU-Z it is around 1300mV. Is that good?
> 
> Also, if I go above 1400MHz on the MEmory, sometimes it starts flickering like mad on idle (at desktop)...


seems good to me....

I just tried 1155MHz and Heaven benchmark went fine; testing 1160 now and so far no problems... hmm, strange.

Is Heaven benchmark any good for testing the card?


----------



## Mas

Quote:


> Originally Posted by *kcuestag*
> 
> Anyone tried MSI AB Beta 17? What were you able to achieve with the max allowed of +100mV? So far it looks like I can barely hit 1200MHz on the Core, and still not 100% stable.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Very dissapointed with my card.


Sorry to hear you aren't happy with your card.

How is the AB beta 17? I haven't been able to use any monitoring software for a couple of weeks because AB wasn't playing nice with the 290Xs at release, so I've stayed away (I also haven't been having any of the issues people have been reporting with black screens, etc.).


----------



## Newbie2009

Quote:


> Originally Posted by *Gilgam3sh*
> 
> seems good to me....
> 
> I just tried with 1155mhz and heaven benchmark went fine, testing 1160 now and so far no problems.. hmm strange this.
> 
> is heaven benchmark any good to test the card?


I've found it's pretty much useless.

3D Mark for memory
Crysis 2 bench for memory and core.


----------



## qqan

Quote:


> Originally Posted by *GioV*
> 
> Have you guys been able to resolve the black screen problem? I seem to have found 2 black screen issues:
> 
> 1) Game is playing and it crashes, computer is still usable and get error "windows has encountered a problem and needs to close". Game is just a black screen and once I close the error message I go back to desktop. Most common, happens randomly during game play.
> 
> 2) Much rarer than the other one, whole screen is black and have to hard reset. Seems to happen when I have too many monitor programs open. For example, was testing temps with HWinfo and got a black screen when I opened GPU-z. This was during the desktop and not during gaming.
> 
> My R9 290X (stock clocks) is under water, vrm temps and core temps are very low. I have Elpidia memory and using windows 8.1 pro. Have tried everything besides changing OS because its a pain. I tested my rig with a 6950 and no issues whatsoever besides low fps
> 
> 
> 
> 
> 
> 
> 
> If you guys have any input let me know please


I run Windows 8.1 Pro and get scenario 2), independent of card clocks, when overclocking the bus in the BIOS. Overclocking CPU and RAM without touching the bus works fine. I don't like going above 4.5GHz on the i7 3770K because of motherboard temps though, as I run all fans low... have not tested higher.


----------



## jerrolds

Quote:


> Originally Posted by *bond32*
> 
> I know some got the black screen with high refresh rates. I have yet to get one myself. I play BF4 at 1440p, all ultra 4xmsaa and 120hz. My desktop/everything else is at 96 hz - no black screens.
> 
> This is with a Sapphire 290x.


Wait... you're running BF4 1440p Ultra 4xMSAA... at 120fps? On a single Sapphire 290x? 'Cuz I get about 80-90fps outside on High everything (110-120fps inside), and no AA at [email protected] And this is in campaign mode.


----------



## bpmcleod

So the recent string of Windows 8.1 updates seems to have fixed a lot of my booting problems with my card. The card used to black screen on boot when I had two monitors connected and whatnot, and now those problems seem to have gone away. Anyone else notice any changes for the better with recent Windows updates? I see roughly 5-7 updates over the past 3 days.


----------



## kcuestag

For some reason the core is not stable with a voltage increase. It runs anywhere from 900 to 1150MHz (or so) but never stays at 1200MHz. I even tried +100mV with my stable OC of 1125MHz (which is stable at stock voltage) and even then it won't stay at 1125MHz.

Is this a bug in the drivers preventing the card from going faster with voltage increase? This is driving me nuts, if I don't add voltage the card runs fine.


----------



## bpmcleod

Quote:


> Originally Posted by *qqan*
> 
> This cooling setup works best for me: 58C after 1 hour Valley @stock clocks and low speed fans, VRM temps <72C.


Have you tried a run on ExtremeHD? At first I was like whoa, 102 fps, then I noticed your settings.


----------



## jerrolds

Quote:


> Originally Posted by *kcuestag*
> 
> For some reason the Core is not stable with voltage increase.. It runs anywhere from 900 to 1150MHz (or so) but never stays at 1200MHz. I even tried +100mV and my stable OC of 1125MHz (That is stable with voltage at stock) and even then it wont stay at 1125MHz.
> 
> Is this a bug in the drivers preventing the card from going faster with voltage increase? This is driving me nuts, if I don't add voltage the card runs fine.


AB only lets you go +100mV, or 1350mV max, with your card? Did you flash your BIOS? I'm at work, but with the Asus BIOS and GPU Tweak I can hit 1412mV... which is about 1.29-1.35v after vdroop.

I don't like it that high... but I was able to hit 1220MHz on the core with an aftermarket air cooler. With stock cooling and 100% fan speed I was able to get 1180MHz; unusable... but possible.


----------



## estens

Quote:


> Originally Posted by *kcuestag*
> 
> For some reason the Core is not stable with voltage increase.. It runs anywhere from 900 to 1150MHz (or so) but never stays at 1200MHz. I even tried +100mV and my stable OC of 1125MHz (That is stable with voltage at stock) and even then it wont stay at 1125MHz.
> 
> Is this a bug in the drivers preventing the card from going faster with voltage increase? This is driving me nuts, if I don't add voltage the card runs fine.


I had the same problem; I had to set the power target with CCC when I overvolted or else it would downclock.


----------



## kcuestag

Quote:


> Originally Posted by *estens*
> 
> I had the same problem, I had to set the powertarget with ccc when I overvolted or else it would downclock,


Interesting, I'll give this a try tomorrow after class, thank you.









Edit:

It worked! Constant 1200MHz now, thanks!


----------



## Porter_

Quote:


> Originally Posted by *jerrolds*
> 
> AB only lets you go +100mv or 1350mv max with your card?


that's the limit set by MSI


----------



## qqan

Quote:


> Originally Posted by *bpmcleod*
> 
> Have you tried a run on ExtremeHD? At first I was like woa 102 fps then i noticed your settings


Yes, that is a lot less impressive.

Here you go (card @stock):


----------



## estens

Happy to help


----------



## Kriant

EK blocks incoming in a few hours...*mental drool*


----------



## Kipsofthemud

hnnnnng I must have posted this picture a few times already but I just can't wait till they ship me mine...I called the shop - new cards are being delivered to the shop on the 21st...oh god I'm so bad at waiting.

Meanwhile I'm just reading this thread hoping you guys solve your/the black screen issues etc


----------



## selk22

Sooo AB beta 17 does nothing basically... lame. I was really excited to actually see if I can take this card a bit further


----------



## pkrexer

I also fixed my GPU erratically downclocking itself (while I was playing BF4) by setting the power target within CCC. Hopefully they get this corrected.


----------



## kcuestag

I'm keeping it at 1125/1400 on default voltages until they fix the Power Target issue in CCC making the card downclock; I don't want to have to open CCC every time I reboot the computer to set it to +50% so it doesn't downclock at 1200MHz...









Unless there's a fix for that?


----------



## PillarOfAutumn

So THIS is what true power feels like??


----------



## Gilgam3sh

So the power limit in CCC is only 50%, while with ASUS GPU Tweak I can set it at 150%... and I did just run Heaven benchmark at 1180/1450 with 1337mV and it went fine... this is driving me nuts... also BF4 showed no artifacts etc. at 1180MHz.


----------



## Ukkooh

Quote:


> Originally Posted by *Kipsofthemud*
> 
> 
> 
> hnnnnng I must have posted this picture a few times already but I just can't wait till they ship me mine...I called the shop - new cards are being delivered to the shop on the 21st...oh god I'm so bad at waiting.
> 
> Meanwhile I'm just reading this thread hoping you guys solve your/the black screen issues etc


How did you get it for so cheap?


----------



## cyenz

Please remove me from the Owners Club. I am so disappointed with the black screen issue that I returned the card, paid the difference, and bought a 780 Ti. I've been using AMD since forever; I didn't want to risk another black screen card. :\


----------



## jerrolds

Quote:


> Originally Posted by *kcuestag*
> 
> I'm keeping it at 1125/1400 default voltages until they fix the issue with the Power Target with CCC making the card downclock, I don't want to have to open CCC every time I reboot the computer to set it at 50% so it doesn't downclock at 1200MHz...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Unless there's a fix for that?


Why don't you just use AB or GPU Tweak to set the power limit to 150%? I don't use CCC at all when it comes to overclocking.


----------



## Mr357

Quote:


> Originally Posted by *Gilgam3sh*
> 
> so the powerlimit in CCC is only 50%, with ASUS GPU Tweak I can set it at 150%.... and I did just run Heaven Benchmark at 1180/1450 with 1337mv and it went fine... this driving me nuts...also BF4 showed no artifacts etc at 1180mhz


50% in CCC is the same as 150% in GPU Tweak. It simply means 100% +50%.
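Put another way, the two sliders are just two representations of the same multiplier; a trivial sketch (function names here are purely illustrative):

```python
# CCC expresses the power limit as an offset (+50%); GPU Tweak shows it as an
# absolute percentage (150%). Both resolve to the same 1.5x multiplier.

def ccc_offset_to_multiplier(offset_pct: float) -> float:
    return (100.0 + offset_pct) / 100.0

def tweak_pct_to_multiplier(abs_pct: float) -> float:
    return abs_pct / 100.0

print(ccc_offset_to_multiplier(50.0) == tweak_pct_to_multiplier(150.0))  # True
```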


----------



## Arizonian

Quote:


> Originally Posted by *cyenz*
> 
> Please remove me from Owners Club, i am so disapointed with the blackscreen issue that i returned the card and payed the diference and bought a 780ti, i was using AMD since forever, didnt want to risk another blackscreen card. :\


Understand - removed.


----------



## Forceman

Quote:


> Originally Posted by *kcuestag*
> 
> I'm keeping it at 1125/1400 default voltages until they fix the issue with the Power Target with CCC making the card downclock, I don't want to have to open CCC every time I reboot the computer to set it at 50% so it doesn't downclock at 1200MHz...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Unless there's a fix for that?


CCC doesn't keep the settings over the reboot for you? I set mine in CCC and it stayed just fine. Is there an "apply at startup" box somewhere you may have missed?

I think someone said that just enabling overdrive in CCC and then using AB to overclock worked fine. That's what I do as well, and I haven't had any problems with AB (the overclock shows up both in AB and CCC when I check). Haven't tried increasing the voltage yet though. But the settings stick over a reboot and the clock speed is constant.


----------



## Blackroush

Do you guys think the black screen is fixable? I really want to buy another R9 290 but I'm scared the black screen problem cannot be fixed through a future driver.


----------



## nemm

Just been playing with AB B17 with the stock BIOS. First impressions are no different to the Asus 290 BIOS, but at least now the card is using the correct BIOS it was shipped with.

Using +100mV I was only able to push from 1100 to 1140 core and 1450 memory, but that damn vdroop is the killer, with voltages fluctuating from 1.35 to 1.28, so stabilizing the overclock is no easy task. I gave up until I noticed the LLC AB command, for which I am very grateful. After applying the command and +50mV the new max stable overclock is 1180/1450, reporting a steady 1.297v. I can bench at 1200 but don't care for that since I am only after stable-clock benches.

A little note: I used +200mV at one point before applying LLC, which gave 1.408v, but couldn't pass 1200 core.

Yes, 1180 isn't anywhere near as high as some others get, but it's about average and I am happy with it, since it beat my 1398 core/1850 memory 670 by nearly 50%, making it a worthy upgrade.

670- http://www.3dmark.com/fs/1059152



Spoiler: My stable Sapphire 290 bench







Later tomorrow I'll get to test my replacement card and maybe actually get to use Crossfire, finally.


----------



## Raxus

Since I had an RMA set up with Newegg already, I went ahead and sent mine back (black screening), although it did seem to occur less frequently.

I just figured better safe than sorry.

HOPEFULLY newegg will get in a few ASUS cards.


----------



## Kipsofthemud

Quote:


> Originally Posted by *Ukkooh*
> 
> How did you get it for so cheap


Dutch shop. I was looking for a 290X but they weren't searchable on like, comparison sites so I went to this shop's site because I've been ordering there for years and I saw this while I was kinda drunk...I couldn't resist and pulled the trigger even though I was kinda fine with my 680 still..

Anyway, they changed the price to 490 the next day lol - they haven't cancelled my order though and I got the order confirmation so it seems like they are honoring it, it's been almost a week now.


----------



## Raephen

Quote:


> Originally Posted by *Kipsofthemud*
> 
> Dutch shop. I was looking for a 290X but they weren't searchable on like, comparison sites so I went to this shop's site because I've been ordering there for years and I saw this while I was kinda drunk...I couldn't resist and pulled the trigger even though I was kinda fine with my 680 still..
> 
> Anyway, they changed the price to 490 the next day lol - they haven't cancelled my order though and I got the order confirmation so it seems like they are honoring it, it's been almost a week now.


Azerty wasn't it? I ordered from them a few times too. They seem to have a good rep, so if they do back down on this agreement they've made -- damn their fine print!







-- blast away at them on tweakers.nl!


----------



## MrWhiteRX7

So are Sapphire 290s (not the X) definitely a good choice? I am about to order a couple of 290s and Sapphire is in stock.


----------



## Raxus

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> So are Sapphire 290's a definite good choice? (not the X) I am about to order a couple 290's and Sapphire is in stock


I had issues with mine, sent it back.

There's also no ETA, or even a peep, from Sapphire about their TriXX software.

Just a heads up.


----------



## flopper

Quote:


> Originally Posted by *Raxus*
> 
> I had issues with mine, sent it back.
> 
> Theres also no ETA or even a peep from sapphire about their TRIXX software.
> 
> Just a heads up.


Used MSI AB b17 to OC.
Haven't touched RAM though.
1155MHz on core on air with +100mV.
Sapphire 290.
Played BF4 today for hours, no issue; temps were 60C or so.


----------



## esqueue

On MSI AB 3.0.0 beta 17, what's the difference between the Core Voltage (mV) and Power Limit (%)? Another question I have is why Core Voltage is not selectable on my system. Unlock Voltage Control and Monitoring are both checked.


----------



## nemm

Quote:


> Originally Posted by *Raxus*
> 
> I had issues with mine, sent it back.
> 
> Theres also no ETA or even a peep from sapphire about their TRIXX software.
> 
> Just a heads up.


I contacted Sapphire with regard to the TriXX update and was told it should be available by December; they didn't say which year though ^^


----------



## Forceman

Quote:


> Originally Posted by *esqueue*
> 
> On Msi AB 3.0.0 beta 17, What's the difference between the Core Voltage (mV) and Power Limit (%)? Another question I have is why the Core Voltage is not selectable on my system? Unlock Voltage Control and Monitoring are both checked.


Voltage controls the GPU voltage itself, while power limit controls the total power draw of the card. If you don't increase the power limit it'll throttle to keep the total draw under the BIOS limit. So increase that to +50 and don't worry about it anymore.

And the first version of AB 17 had a problem that didn't unlock voltage control on some cards - either try downloading it again and see if they put the fixed file up, or a few pages back there's a couple of lines you can add to the config file to get it working.
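The throttling behavior described above can be sketched roughly like this (an illustration only, not AMD's actual PowerTune algorithm; the wattage limit and clock step are invented numbers):

```python
# Toy model of power-limit throttling: if estimated board power exceeds the
# limit, step the core clock down until the draw fits. The numbers below are
# hypothetical, for illustration only -- they do not reflect real firmware.

BOARD_POWER_LIMIT_W = 208.0   # hypothetical stock board power limit
STEP_MHZ = 13                 # hypothetical clock step size

def settle_clock(requested_mhz: int, draw_at, limit_w: float = BOARD_POWER_LIMIT_W) -> int:
    """Step the clock down until the modeled draw is under the power limit."""
    clock = requested_mhz
    while clock > 0 and draw_at(clock) > limit_w:
        clock -= STEP_MHZ
    return clock

toy_draw = lambda mhz: mhz * 0.2           # toy power model: draw scales with clock
print(settle_clock(1200, toy_draw))         # throttles below the requested 1200
print(settle_clock(1200, toy_draw, 312.0))  # raised limit (~+50%): stays at 1200
```

This is why raising the power limit slider to +50 lets an overclock hold its set clock instead of bouncing around.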


----------



## esqueue

Quote:


> Originally Posted by *Forceman*
> 
> Voltage controls the GPU voltage itself, while power limit controls the total power draw of the card. If you don't increase the power limit it'll throttle to keep the total draw under the BIOS limit. So increase that to +50 and don't worry about it anymore.
> 
> And the first version of AB 17 had a problem that didn't unlock voltage control on some cards - either try downloading it again and see if they put the fixed file up, or a few pagaes back there's a couple of lines you can add to the config file to get it working.


Thanks for that!


----------



## Raxus

Quote:


> Originally Posted by *flopper*
> 
> used msi ab b17 to oc.
> havent tocuhed ram though.
> 1155mhz on core on air with 100mv.
> sapphire 290.
> played bf4 today for hours no issue, temps was 60c or so


Yeah, I tried MSI AB; it seemed to increase the black screens I was getting.


----------



## Kriant

Aaaand the blocks arrived! Sunday, please do come soon!


----------



## smartdroid

can you add me please











http://www.3dmark.com/3dm/1627511?


----------



## Arizonian

Quote:


> Originally Posted by *smartdroid*
> 
> can you add me please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/1627511?


Congrats - added


----------



## MrWhiteRX7

Quote:


> Originally Posted by *smartdroid*
> 
> can you add me please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/1627511?


Almost 25000 graphics score at stock clocks??? AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAND SOLD!


----------



## smartdroid

The clocks were 1150/1500


----------



## MrWhiteRX7

Still uber impressed, that's the same as my 3 7950's







lol!!!

I literally just ordered my sapphire 290's mmmmmmmmmmm


----------



## Falkentyne

Quote:


> Originally Posted by *GioV*
> 
> Have you guys been able to resolve the black screen problem? I seem to have found 2 black screen issues:
> 
> 1) Game is playing and it crashes, computer is still usable and get error "windows has encountered a problem and needs to close". Game is just a black screen and once I close the error message I go back to desktop. Most common, happens randomly during game play.
> 
> 2) Much rarer than the other one, whole screen is black and have to hard reset. Seems to happen when I have too many monitor programs open. For example, was testing temps with HWinfo and got a black screen when I opened GPU-z. This was during the desktop and not during gaming.
> 
> My R9 290X (stock clocks) is under water, vrm temps and core temps are very low. I have Elpidia memory and using windows 8.1 pro. Have tried everything besides changing OS because its a pain. I tested my rig with a 6950 and no issues whatsoever besides low fps
> 
> 
> 
> 
> 
> 
> 
> If you guys have any input let me know please


#2 is the "crash bug" that is making the GPU shut down. No idea why it's happening, but it seems to be a fault that is triggering a shutdown protection on the card.
#1 is your usual old VPU recovery crash, which we've had since 9700 Pro days. The GPU crashes and resets; the game is suspended (thus the black screen).


----------



## Maxxa

I bet you can't wait to see some benches with my 965BE @4.0 + R9 290. I know I can't.
The best (I lie, it's the worst) part of it all is that I have an i7-4770K and a Z87-G45 sitting in front of me right now that I have promised not to touch until Dec. 25th.


----------



## Kipsofthemud

Quote:


> Originally Posted by *Maxxa*
> 
> I bet you can't wait to see some benches with my 965BE @4.0 + R9 290. I know I can't.
> The best (I lie, it's the worst) part of it all is that I have an i7-4770K and a Z87-G45 sitting in front of me right now that I have promised not to touch until Dec. 25th.


Why would you not touch it until then... you know you want to... it won't hurt


----------



## esqueue

Quote:


> Originally Posted by *Maxxa*
> 
> I bet you can't wait to see some benches with my 965BE @4.0 + R9 290. I know I can't.
> The best (I lie, it's the worst) part of it all is that I have an i7-4770K and a Z87-G45 sitting in front of me right now that I have promised not to touch until Dec. 25th.


Have you tested it to make sure that it isn't a dud? It's very possible and exchanges expire by then.

Another thing is that Lama or Camel is ugly. LOL.


----------



## utnorris

Number 12 for single card and number 23 for CF in the 3DMark11 thread. Not bad. Probably won't last long, but I am still happy.

http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad/0_100


----------



## rass101

I just ordered MSI R9 290

and those parts to make a full build:

Intel Core™ i5-4670K Processor

Corsair Vengeance Pro Series - 8GB (2 x 4GB) DDR3 1600MHz CL9 Memory Kit

Corsair TX Series™ TX650 - 650 Watt 80 PLUS Bronze Certified

Samsung 840 EVO Series 120GB SSD

WD Black Caviar HDD SATA3 1TB 7200RPM 64MB WD1002FAEX

ASRock Z87M Extreme4 Intel Z87 Socket 1150 mATX

Zalman Z3 Plus ATX Mid Tower Computer Case with USB 3.0

Has anyone had any experience with the MSI version of the R9 290?
Also, does the rest of the build seem solid, or would you change something on the list?


----------



## Maxxa

Quote:


> Originally Posted by *esqueue*
> 
> Have you tested it to make sure that it isn't a dud? It's very possible and exchanges expire by then.
> 
> Another thing is that Lama or Camel is ugly. LOL.


I've thought about it, but I'll take the chance on the CPU + mobo vs. the GPU and use the manufacturer's warranty if there are any problems. The one thing I did do, though, was look at the pins on the mobo, and they look sweet.


----------



## DizzlePro

Apparently the PowerColor 290 can be unlocked into the 290X with just a BIOS flash.

source: http://videocardz.com/47971/powercolor-radeon-r9-290-unlocked-r9-290x


----------



## stilllogicz

^ Alright OCN, investigate!


----------



## smartdroid

That sounds too good to be true!! But one can only hope.


----------



## iGameInverted

Nervous about moving forward with ordering the waterblock for this from AquaComputer (10-21 day lead time). I would want it to be ready for when I have all my other parts, but it seems as though this black screen is a huge issue for people.

Anyone want to enlighten me on this issue and what is going on? (If anything is certain yet.) Is it only certain games? Is it only when people are overclocking? Would really like to know before I pull the trigger. Tried to search through the threads and other sites, but it seems like no one really knows yet. Hopefully I can get some kind of advice. Thanks


----------



## dimwit13

Quote:


> Originally Posted by *iGameInverted*
> 
> Nervous about moving forward with ordering the waterblock for this from AquaComputer (10-21 day lead time) I would want it to be ready for when I have all my other parts. But it seems as though this black screen is a huge issue for people.
> 
> Anyone want to enlighten me on this issue and what is going on? (If anything is certain yet) Is it only certain games? Is it only when people are over clocking? Would really like to know before I pull the trigger. Tried to search though the threads and other sites but it seems like no one really knows yet. Hopefully I can get some kind of advice. Thanks


I have been going back and forth with the same problem.
Should I or shouldn't I?!?!?!
Well, I said "the he11 with it" - I am ordering one tomorrow morning (and an EK water block and back plate).
Now, if I can just figure out which one.
Leaning towards Sapphire, Asus and HIS.
I guess they are all pretty much the same - reference cards.

Anyone want to sway my vote?

-dimwit-


----------



## esqueue

Quote:


> Originally Posted by *iGameInverted*
> 
> Nervous about moving forward with ordering the waterblock for this from AquaComputer (10-21 day lead time) I would want it to be ready for when I have all my other parts. But it seems as though this black screen is a huge issue for people.
> 
> Anyone want to enlighten me on this issue and what is going on? (If anything is certain yet) Is it only certain games? Is it only when people are over clocking? Would really like to know before I pull the trigger. Tried to search though the threads and other sites but it seems like no one really knows yet. Hopefully I can get some kind of advice. Thanks


Why not just get an EK waterblock? Early sales are the reason they made sure to have the blocks available so soon. As they say, "the early bird gets the worm". I have zero issues with it and it actually exceeds my expectations. I even forgot to add the thermal paste with the pads on the VRMs as they recommend, and their temps have never broken 55°C.


----------



## Scorpion49

Man, I got bamboozled. UPS changed my delivery date to tomorrow just now









Glad I paid extra for 2 day shipping, now on day 4.


----------



## rv8000

Out of curiosity, has any 290 owner been able to get the core above 1200 stable yet, with or without voltage tweaks?

I can pull 1180 stable with AB 17 and +100 mV (at stock volts I can get 1150 on the core). The AB tool nol4n put out seems to not be playing nicely with my card or AB 17 and is causing lockups; temps are below 80°C on both core and VRMs.


----------



## Snyderman34

Update me to water please


----------



## Jpmboy

Quote:


> Originally Posted by *rv8000*
> 
> Out of curiosity has any 290 owner be able to get the core above 1200 stable yet with or without voltage tweaks?
> 
> I can pull 1180 stable with ab 17 and + 100mV (stock volts I can get 1150 on the core). *The ab tool nol4n put out seems to not be playing nicely with my card or ab 17* and causing lock ups, temps are below 80c on both core and vrms.


So far I've been having lockups (and hard ones...) with the tool also. But if I issue the LLC and offset commands in command mode (shift-right-click in the MSI AB folder - "open command window here") it holds better with AB 15 or 16.

On:
msiafterburner /wi4,30,8d,10 /ai4,30,38,7f
Off:
msiafterburner /wi4,30,8d,0 /oi4,30,38,80

Maybe something specific with this mobo and the LLC app...
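[Editor's sketch, not from the post: assuming the last field of that `/wi` write is the vcore offset encoded in 6.25 mV steps (so 100 mV -> 0x10), a tiny helper can build the command string. The `msiafterburner` executable name and the `4,30,8d` bus/device/register values are copied from the commands above and are likely card-specific.]

```python
# Sketch: build the MSI Afterburner I2C write command for a vcore offset.
# Assumes 6.25 mV per step in the final hex field (so 100 mV -> 0x10);
# the 4,30,8d addresses come from the post above and may differ per card.
def ab_vcore_command(offset_mv, exe="msiafterburner"):
    steps = round(offset_mv / 6.25)       # offset encoded in 6.25 mV steps
    return f"{exe} /wi4,30,8d,{steps:x}"  # step count rendered as hex

print(ab_vcore_command(100))  # msiafterburner /wi4,30,8d,10
print(ab_vcore_command(0))    # msiafterburner /wi4,30,8d,0
```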


----------



## MrStick89

Cannot unlock voltage control in Afterburner with my Asus 290X. I edited the config file in Notepad - is there something else I need to do? The slider appears, it's just grayed out.


----------



## PillarOfAutumn

Hey guys. Ran into some trouble, unsure how to fix it. I have the 290. I used onboard video to install all of the Windows 8 updates, then plugged the video card in and connected the HDMI to the 290. When I start, I can see the BIOS splash screen and the Windows 8 start screen as well. However, when the PC is fully started, the screen is just black.

Before installing the 290, I had the 7970 and Catalyst 13.8 or something. Please advise.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> Throw a fan on it.
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/640_40#post_21060304
> 
> In other news, finally got my blocks and stuff related to it. The block shortage was more severe than the card shortage lol.


tsm, clu at work and finally got rid of all the air. just 1 120 rad and a fan.



no clue how hot the vrms are. prolly in the 100s.


----------



## Jpmboy

Quote:


> Originally Posted by *MrStick89*
> 
> Cannot unlock voltage control in afterburner with asus 290x. I edited the notepad file is there something else I need to do? Slider appears just is grayed out.


which AB beta version?


----------



## Jpmboy

Quote:


> Originally Posted by *rdr09*
> 
> tsm, clu at work and finally got rid of all the air. just 1 120 rad and a fan.
> 
> 
> 
> no clue how hot the vrms are. prolly in the 100s.


Should show right in GPU-Z (290X only?)


----------



## pkrexer

I'm having an odd issue where, if I overclock my memory above 1400, I'll get desktop artifacts, but not in games. If I increase my core voltage, these desktop artifacts go away.









Currently running 1150 / 1450 stable with the +100mv, previous best was 1080 core with stock voltage.


----------



## rv8000

Tried to push my card in Valley a bit more; 1170/1500 is as far as it's going to go, it seems.











Really curious to see more core OC results for 290s. I thought there would be more headroom than this, especially seeing a few 290Xs around 1300.


----------



## Mas

Please update me to water.

Block finally came in last night, had to wait 2 weeks -_- Was going to get my second card a couple days ago for crossfire but no blocks were available and no ETA so I'm just going to wait until the retailer has both in stock and order them together next time. I have the 2nd backplate ready to go though


----------



## MrStick89

Quote:


> Originally Posted by *Jpmboy*
> 
> which AB beta version?


beta 17


----------



## grandpatzer

Can someone please confirm that Sapphire 290 does not have any warranty sticker on the screws?
I always watercool and want to keep my warranty.

Also, it seems like the 290/290X is exactly like the 7950/7970: the more expensive card clocks +100 MHz more, so it's about 7-10% more FPS, I guess.


----------



## DampMonkey

Quote:


> Originally Posted by *grandpatzer*
> 
> Can someone please confirm that Sapphire 290 does not have any warranty sticker on the screws?
> I always watercool and want to keep my warranty.
> 
> Also seems like the 290/290x is exaxtly like 7950/7970, the more expensive card clocks +100mhz more, so it's about 7-10% more FPS I guess.


It does not, the heatsink will come off with a small phillips screwdriver and a little pullin'. No stickers in the way!


----------



## CurrentlyPissed

Took a video of the Black Screen error if anyone is interested.

Happens around 1:35


----------



## grandpatzer

Has anyone seen a good comparison of watercooled 290(x) 1150-1400mhz compared to GTX 780 or 2x 7970 / 2x 680?
I'm considering getting a Sapphire R9 290 + EKWB and overclock it 1100-1300mhz depending on luck.

From my understanding, the 290(X) at 1200 MHz should be about 45% higher performance in-game vs a single 7970.


----------



## Jpmboy

Quote:


> Originally Posted by *MrStick89*
> 
> beta 17


and you checked the "voltage control" box in settings - right?


----------



## esqueue

What's the max safe voltage I can get with AB on the stock non-Asus BIOS? Watercooled, and temperature is not an issue from the temps I am getting. I did 1160/1450 on +6 mV but am afraid of upping the voltage without lots more research on safe voltages and more questions and reading.


----------



## grandpatzer

Quote:


> Originally Posted by *esqueue*
> 
> What's the max safe voltage I can get with AB on stock non asus bios? Watercooled and temp is not an issue from the temps I am getting. I did 1160/1450 on +6 mV but am afraid of upping the voltage without lots more research on safe voltages and more questions and reading.


Nice, so now the 290 supports MSI AB voltage control with the reference BIOS (Sapphire, PowerColor, MSI, etc.)


----------



## PillarOfAutumn

Any help?

Hey guys. Ran into some trouble, unsure how to fix it. I have the 290. I used onboard video to install all of the Windows 8 updates, then plugged the video card in and connected the HDMI to the 290. When I start, I can see the BIOS splash screen and the Windows 8 start screen as well. However, when the PC is fully started, the screen is just black.

Before installing the 290, I had the 7970 and Catalyst 13.8 or something. Please advise.


----------



## Falkentyne

Quote:


> Originally Posted by *kcuestag*
> 
> I'm keeping it at 1125/1400 default voltages until they fix the issue with the Power Target with CCC making the card downclock, I don't want to have to open CCC every time I reboot the computer to set it at 50% so it doesn't downclock at 1200MHz...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Unless there's a fix for that?


I use afterburner and have those settings saved in a profile.

And you can bypass the 100 mV vcore limit in AB by setting it manually through the command line, as posted above (0x10 = 100 mV, 0x08 = 50 mV, i.e. 6.25 × 8; it's 6.25 mV per hexadecimal step).
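[Editor's sketch, to sanity-check those numbers - assuming the 6.25 mV-per-step encoding described here, decoding a hex offset field back into millivolts is one multiply:]

```python
MV_PER_STEP = 6.25  # per the post: 6.25 mV for every hexadecimal step

def step_to_mv(step_hex):
    """Decode an Afterburner hex offset field into millivolts."""
    return int(step_hex, 16) * MV_PER_STEP

print(step_to_mv("10"))  # 100.0 mV
print(step_to_mv("08"))  # 50.0 mV
```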


----------



## esqueue

Quote:


> Originally Posted by *grandpatzer*
> 
> nice, so now the 290 supports MSI AB voltage control with reference bios (sapphire, powercolor, msi etc..)


I went to http://www.guru3d.com/files_details/msi_afterburner_beta_download.html

installed beta 17 and did what they said in the first post in the comments on that page. After that I could mess with the voltage.
Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Any help?
> 
> Hey guys. Ran into some trouble, unsure on how to fix it. I have the 290. I used onboard video to install all of wibdows 8 updates. I plugged the video card in and plugged the hdmi to the 290. When I start, I can see the bios splash screen and windows 8 start screen as well. However, when the PC is fully started, the screen is just black
> 
> Before installing the 290 here, i had the 7970 and catalyst 13.8 or something. Please advise


Can you disable your onboard graphics adapter?


----------



## PillarOfAutumn

Quote:


> Originally Posted by *esqueue*
> 
> Can you disable your onboard graphics adapter?


Where do I disable that from? From bios or from the within the OS?


----------



## CurrentlyPissed

Guys, please visit this thread. I want to get as much attention on it as possible to hopefully find us a fix.

http://www.overclock.net/t/1442736/video-and-interesting-find-on-black-screen-issue


----------



## th3illusiveman

Has the AB update eliminated the black screens?


----------



## selk22

Well guys, it's an exciting day with AB supporting voltage control!
It seems my card is much more stable now and able to clock higher.









+81 to Voltage
+20 to Power Limit
1150/1400

Took it 50mhz higher and decided to leave the volts and clocks here at a nice and stable point for the stock cooler...
Once I am on water I am excited to see how this card performs!

Pushed past the 70fps barrier for Valley that I have been trying to break this week!











It's nice to see what a little juice can do


----------



## mrsus

Anyone noticing artifacts when waking up from sleep mode? They disappear automatically afterward.

Running Valley and gaming is fine. It seems to happen both stock and overclocked.


----------



## CurrentlyPissed

Where did you find the new version?


----------



## selk22

Quote:


> Originally Posted by *CurrentlyPissed*
> 
> Where did you find the new version?


http://www.guru3d.com/files_details/msi_afterburner_beta_download.html

Download beta 17 and make sure in settings you have voltage control checked


----------



## blue1512

Remember a guy on this thread talked about a 290 being unlocked to a 290X? Looks like there is another lucky lad having success with that trick:
http://videocardz.com/47971/powercolor-radeon-r9-290-unlocked-r9-290x


----------



## esqueue

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Where do I disable that from? From bios or from the within the OS?


From your motherboard's BIOS - that's if you have the option. When you restart the PC, keep tapping the Delete key; that works for 90% of boards.

Quote:


> Originally Posted by *selk22*
> 
> Well guys its an exciting day with AB supporting the voltage control!
> It seems my card is much more stable now and able to clock higher
> 
> 
> 
> 
> 
> 
> 
> 
> 
> +81 to Voltage
> +20 to Power Limit
> 1150/1400
> 
> Took it 50mhz higher and decided to leave the volts and clocks here at a nice and stable point for the stock cooler...
> Once I am on water I am excited to see how this card performs!
> 
> Pushed past the 70fps barrier for Valley that I have been trying to break this week!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Its nice to see what a little juice can do


I'm a happy camper too. Max temp is 42°C, but I'm on water.
OC'd quite a bit for not much better results. I game on stock though.
+69 to Voltage
+50 to Power Limit
1200/1500


----------



## bpmcleod

Has anyone been ordering the EK blocks from FrozenCPU, and if so, how long has it been taking for them to ship/arrive? I am about to purchase two blocks and I would rather have the EK ones, but Koolance has their FC block in stock atm, so I just want a ballpark, if that's possible.


----------



## Arizonian

Quote:


> Originally Posted by *Snyderman34*
> 
> Update me to water please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 


Quote:


> Originally Posted by *Mas*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please update me to water.
> 
> Block finally came in last night, had to wait 2 weeks -_- Was going to get my second card a couple days ago for crossfire but no blocks were available and no ETA so I'm just going to wait until the retailer has both in stock and order them together next time. I have the 2nd backplate ready to go though


Both updated to water - congrats.









Quote:


> Originally Posted by *selk22*
> 
> http://www.guru3d.com/files_details/msi_afterburner_beta_download.html
> 
> Download beta 17 and make sure in settings you have voltage control checked


I'm going to add this link to the OP after this post.








Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Hey guys. Ran into some trouble, unsure on how to fix it. I have the 290. I used onboard video to install all of wibdows 8 updates. I plugged the video card in and plugged the hdmi to the 290. When I start, I can see the bios splash screen and windows 8 start screen as well. However, when the PC is fully started, the screen is just black
> 
> Before installing the 290 here, i had the 7970 and catalyst 13.8 or something. Please advise


Question - before you took out your 7970, did you have an overclock set in AB or CCC? If you didn't uninstall it (or at the very least - what I like to do - revert back to stock defaults and save, without it starting with Windows startup), an unstable overclock may be getting applied to your 290 as soon as it starts back up.


----------



## Snyderman34

Quote:


> Originally Posted by *bpmcleod*
> 
> Has anyone been ordering the EK blocks from FrozenCPU and if so how long has it been taking for them to ship/arrive? I am about to purchase two blocks and I would rather have the EK ones but Koolance has there FC block ins tock atm so I just want a ballpark if that would be possible?


I ordered my block from FrozenCPU on Monday and got it today. USPS Priority 2 day shipping


----------



## utnorris

Quote:


> Originally Posted by *bpmcleod*
> 
> Has anyone been ordering the EK blocks from FrozenCPU and if so how long has it been taking for them to ship/arrive? I am about to purchase two blocks and I would rather have the EK ones but Koolance has there FC block ins tock atm so I just want a ballpark if that would be possible?


Send an email to Performance and Frozen and ask before buying. PPCS had them listed but they were not in stock when I ordered them. However, I imagine they are finally able to keep up with demand, but I would check first.


----------



## Joeve

This is frustrating. As soon as I try to play a game, I get a blue screen - tried BF4, Crysis 3, Bioshock - each one ends up with a blue screen, running an MSI R9 290 at stock. Deleted the old drivers, installed the latest AMD driver, no luck.


----------



## spitty13

Anyone know how to make the 290 quit throttling?


----------



## esqueue

Quote:


> Originally Posted by *spitty13*
> 
> Anyone know how to make the 290 quit throttling?


Use a custom fan profile in AB which includes higher fan speeds.
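[Editor's sketch, for illustration only - AB's fan curve is drawn in its GUI, but conceptually it's a temp-to-speed mapping with linear interpolation between breakpoints. The breakpoints below are made up, not a recommendation:]

```python
# Hypothetical fan curve: (temp °C, fan %) breakpoints, linearly
# interpolated - similar in spirit to a custom Afterburner profile
# that ramps the fan up before the card hits its throttle point.
CURVE = [(40, 30), (60, 50), (75, 75), (85, 100)]

def fan_speed(temp_c):
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # linear interpolation between adjacent breakpoints
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pinned at max above the last breakpoint

print(fan_speed(70))  # ~66.7 percent
```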


----------



## grandpatzer

Quote:


> Originally Posted by *esqueue*
> 
> I went to http://www.guru3d.com/files_details/msi_afterburner_beta_download.html
> 
> installed beta 17 and did what they said in the first post in the comments on that page. After that I could mess with the voltage.
> Can you disable your onboard graphics adapter?


Thanks, looks like it's limited to +100 mV and 1235/1625 MHz; I guess those are fine.
I'm going under water and it would be a shame if I could get past 1235 MHz on the core, though I doubt that's possible on just +100 mV.
I'm probably fine on +100 mV; I want to keep this card for 2 years.









Quote:


> Originally Posted by *selk22*
> 
> Well guys its an exciting day with AB supporting the voltage control!
> It seems my card is much more stable now and able to clock higher
> 
> 
> 
> 
> 
> 
> 
> 
> 
> +81 to Voltage
> +20 to Power Limit
> 1150/1400
> 
> Took it 50mhz higher and decided to leave the volts and clocks here at a nice and stable point for the stock cooler...
> Once I am on water I am excited to see how this card performs!
> 
> Pushed past the 70fps barrier for Valley that I have been trying to break this week!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Its nice to see what a little juice can do


Quote:


> Originally Posted by *esqueue*
> 
> From your Motherboard's Bios. That's if you have the option. When you restart the pc, keep tapping the Delete key for %90 of boards.
> I'm a happy camper too. Max temp is 42° but I'm on water.
> OC'd quite a bit for not much better results. I game on stock though
> +69 to Voltage
> +50 to Power Limit
> 1200/1500


Nice, looks like the 290X at around 1150-1200 MHz is about 20% slower vs 2x 7970 GHz Edition; to me this is a good trade-off, as I want nothing to do with CF/SLI ever again lol.

Have you guys measured how many watts it draws at the wall at idle and load?


----------



## blue1512

Quote:


> Originally Posted by *blue1512*
> 
> Remember a guy on this thread talked about 290 being unlocked to 290x? Look like there is another lucky lad having success with that trick
> http://videocardz.com/47971/powercolor-radeon-r9-290-unlocked-r9-290x


Any thoughts on this, guys?
If this is true, I will switch to 290 and try my luck with 290x bios.


----------



## Arm3nian

Quote:


> Originally Posted by *blue1512*
> 
> Any thoughts on this, guys?
> If this is true, I will switch to 290 and try my luck with 290x bios.


Seems fake to me tbh. We have seen this almost every generation; it never works for unlocking more "hardware". Even turning the 690 into a Quadro 5000 required a hardmod, not a simple BIOS flash.


----------



## Blackroush

Origin just updated BF4.


----------



## evensen007

Anyone else running x-fire and getting strange gpu usage? It's running awesome maxed out, but the gpu usage chart in AB looks like a seismic sensor. When it does this, the game starts to feel funny.


----------



## Arizonian

Quote:


> Originally Posted by *Blackroush*
> 
> Origin Just update BF4
> 
> 


Yes, we did, but the thread moves so fast some may have missed the *post* - it bears repeating. Reduced crashing.

*Source*


Spoiler: BF4 November 14th Update Notes



Quote:


> UPDATE #10 (Nov 14):
> PC game update is now live!
> Details on what the patch includes will be posted in a couple of hours.
> 
> UPDATE #9:
> PC Game Update Goes Live Nov 14
> We are happy to report that the PC Battlefield 4 game update will go live tomorrow, November 14. The main focus of this update is threefold:
> 
> *1) Reduce the number of crashes*
> We believe this update will solve a large portion of the most commonly occuring game crashes, and the improvement in overall stability should make a big difference for many players.
> 
> *2) Eliminate a network sync bug*
> If you've been experiencing situations in multiplayer where it appears that you are taking damage from enemies through walls, you might have been the victim of a network sync bug. In this game update, we have identified and eliminated one such bug that caused this type of gameplay experience. We are continuing to work on more multiplayer optimizations concerning network performance.
> 
> *3) Improve Network Smoothing*
> We have made some improvements to the "Network Smoothing" functionality that you can find in the OPTIONS>GAMEPLAY menu. The Network Smoothing slider governs a group of settings that aim to produce a tighter multiplayer experience based on your specific packet loss situation.
> 
> If you've been experimenting with the Network Smoothing slider earlier, it might now yield better results. If you haven't tried it before, please explore this setting and set it to the lowest value you can without experiencing graphical glitches in the game. By setting it lower, you can get a tighter multiplayer experience, depending on your specific network situation.
> 
> More details upcoming
> Besides the above mentioned focus areas, this PC game update contains a number of other fixes and tweaks that we will detail once it's live.
> 
> This game update for PC goes live around 10AM UTC/2AM PST November 14. Your Origin Client should acknowledge and automatically download this update for you, as usual. Otherwise, you can right click your game in the client and select "Check for Update". There will be a Battlelog maintenance downtime for about an hour as this update goes live. During this time multiplayer on PC will be unavailable.
> 
> We will release full patch notes for this game update later, but the three items mentioned above are by far the most important changes in this update. We understand that stability has been rocky for some players, and hope that this update will make the game run smoother and more stable for you. Let us know in the comments below.






Quote:


> Originally Posted by *PillarOfAutumn*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So THIS is what true power feels like??


Yes it is, especially with water blocks like yours. Congrats.









Almost missed you too, but slotted you in where you should have been in order of entries.


----------



## selk22

So right now I have it set in AB like this:

+81 Volts
+21-50 power
1150/1400

In games, though, the card bounces between 1000-1150, and the volts seem to bounce around with the core clock. Mem stays constant. Any way to prevent this? I don't notice an FPS dip or anything really, but it does annoy the piss out of me haha


----------



## VSG

Got my EK blocks, backplates and dual card terminal installed. Suddenly my 900D doesn't seem big anymore and I have to actually think hard about my water loop.


----------



## _Killswitch_

The black screen/Coil Whine/buzzing issues I keep seeing going on with 290/290X worry me a little, hopefully things work out when I get ready to upgrade.


----------



## Blackroush

I just installed the Gelid Icy Vision Rev. 2 on my 290. So far so good, but I don't know how to make the fan quieter during idle. During load this fan is a lot quieter than stock.

*Max GPU Temp: @Idle 40°C, @BF4 Ultra 69°C
*Max VRM 1 Temp: @Idle 37°C, @BF4 Ultra 78°C
*Max VRM 2 Temp: @Idle 36°C, @BF4 Ultra 57°C


----------



## evensen007

Quote:


> Originally Posted by *selk22*
> 
> SO right now have it set in AB like this
> 
> +81 Volts
> +21-50 power
> 1150/1400
> 
> In games though the card bounces between 1000-1150, also the volts seem to bounce around with the core clock. Mem stays constant. Anyway to prevent this? I dont notice a FPS dip or anything really but it does annoy the piss out of me haha


You have to go into CCC and set powertune to 50.


----------



## utnorris

Quote:


> Originally Posted by *evensen007*
> 
> Anyone else running x-fire and getting strange gpu usage? It's running awesome maxed out, but the gpu usage chart in AB looks like a seismic sensor. When it does this, the game starts to feel funny.


CF seems to be broken, as I get all types of bouncing on GPU usage when benchmarking.
Quote:


> Originally Posted by *selk22*
> 
> SO right now have it set in AB like this
> 
> +81 Volts
> +21-50 power
> 1150/1400
> 
> In games though the card bounces between 1000-1150, also the volts seem to bounce around with the core clock. Mem stays constant. Anyway to prevent this? I dont notice a FPS dip or anything really but it does annoy the piss out of me haha


So I think there is something wrong with the beta AB. I pushed my voltage and clocks up to 1175 MHz and got a lower score than a single overclocked card without voltage control. According to AB the voltage was going up and I was not getting artifacts, but I was getting stutter, and the higher I went in MHz the lower the score got. It's as though it was not running CF or overclocking the second GPU, even though it was showing the second GPU running at the same clocks. I even tried turning off ULPS through regedit and that did not help. Not sure what's going on, but I had to uninstall the beta and go back to 16 to get what seems like normal CF functionality. Kinda frustrated right now, as I just wasted my night messing with this.


----------



## CurrentlyPissed

Both cards are going back to newegg RMA. They paid shipping, and will re-overnight them once received since I paid for overnight initially.

Hopefully all works with the new set. Guess I will be grabbing the 7750 out of the backup PC for now.


----------



## utnorris

Well, something is definitely messed up now because I cannot break 15k with two friggin GPU's. Not sure what is going on, but I broke something. Guess I will have to do a clean install of Windows to see if that fixes it. Not friggin happy right now.


----------



## Mas

Quote:


> Originally Posted by *utnorris*
> 
> Well, something is definitely messed up now because I cannot break 15k with two friggin GPU's. Not sure what is going on, but I broke something. Guess I will have to do a clean install of Windows to see if that fixes it. Not friggin happy right now.


Yeh, I've decided to stay away from all OCing and monitoring software until everything settles down and all the bugs are ironed out of everything.


----------



## Forceman

Quote:


> Originally Posted by *selk22*
> 
> SO right now have it set in AB like this
> 
> +81 Volts
> +21-50 power
> 1150/1400
> 
> In games though the card bounces between 1000-1150, also the volts seem to bounce around with the core clock. Mem stays constant. Anyway to prevent this? I dont notice a FPS dip or anything really but it does annoy the piss out of me haha


I'm getting the same thing with the new Afterburner. It was pretty stable before, but now I get a lot of small core fluctuations and the voltage is all over the place. It seems to get better if I put the voltage back to stock, but it still isn't as smooth as it used to be. Does the Asus BIOS and GPU Tweak have the same problem? I'm on water, and I've got +50 set for power in CCC and Afterburner but I'm not sure what is going on. Any way to get more than +50% power limit?


----------



## Samurai Batgirl

Quote:


> Originally Posted by *_Killswitch_*
> 
> The black screen/Coil Whine/buzzing issues I keep seeing going on with 290/290X worry me a little, hopefully things work out when I get ready to upgrade.


Quote:


> Originally Posted by *Mas*
> 
> Yeh, I've decided to stay away from all OCing and monitoring software until everything settles down and all the bugs are ironed out of everything.


I'm not sure if this will solve any crashing problems since it's an AMD card we're talking about, but in MSI Afterburner you could rename the RTcore.cfg.

http://forums.guru3d.com/showthread.php?t=343161

Found that when I was trying to fix crashing problems in BFBC2. I have a GTX 580, so it might not do the trick, but I think it's worth a shot. The same type of sound lock whirring and black/blue screening happened to me.


----------



## pkrexer

Gurrr... just when I think I'm satisfied with my overclock, BF4 black screen crashes 45 minutes later. Ah, the joys of overclocking.


----------



## RAFFY

Quote:


> Originally Posted by *Mas*
> 
> Yeh, I've decided to stay away from all OCing and monitoring software until everything settles down and all the bugs are ironed out of everything.


Same here. Since I stopped using GPU Tweak my PC has become a lot more stable. Now my main issue is DirectX crashing, but I'm sure that will be fixed once more mature drivers are released.


----------



## R35ervoirFox

Quote:


> Originally Posted by *DizzlePro*
> 
> apparently the PowerColor 290 can be unlocked into a 290X with just a BIOS flash
> 
> source: http://videocardz.com/47971/powercolor-radeon-r9-290-unlocked-r9-290x


OK, this is actually interesting. I could have had that PowerColor for 14 euros more, but went with Sapphire instead; it should be here this morning. Maybe the Sapphire will unlock like my last one did.








I will try flashing a Sapphire 290X BIOS later today or tomorrow once I've done some initial testing. If anyone has a Sapphire R9 290X, please PM me a BIOS dump for scientific testing.


----------



## jerrolds

Well, after reseating the VRM RAM sinks I'm still able to easily reach 100C on VRM1, while VRM2 is at 60C and the GPU is at 64C maximum. This is at 1200MHz core, 1500MHz memory and 1.29V after vdroop (1.385V set in GPU Tweak).

Sigh.

Settling on 1180MHz core at 1.335V set in GPU Tweak, VRM1 tops out at around 80C, which I can live with.

I might try slapping on a fan if I can find one and see if I can get 1200MHz game-stable.

This is with the Gelid ICY 2 aftermarket cooler.


----------



## Sazz

Pardon me if this has been answered, as I skipped to this last page.

But is it just me, or is AB Beta 17 not really good for OCing? Using the stock Asus BIOS with GPU Tweak I was able to get 1270/1466 at 1.45V.
Here in AB I put aux and core voltage at +100 and can't get 1200/1400 stable.

Edit:

I can get 1120/1375 at stock voltage, so 1150/1375 should be easy enough, but even at +100 on both core and aux voltage I can't get 1150/1375 stable.

I don't think it's even adding any voltage; using the stock Asus BIOS with GPU Tweak I was able to get 1275/1466.


----------



## Forceman

Well. Well. Well.

Read the article about someone flashing a 290 to a 290X so I figured I'd give it a shot and flash mine. Got the Asus 290X BIOS from Overclockers UK, switched to the backup BIOS and flashed it. Rebooted and this is what I saw.



Well, maybe that's a GPU-Z thing. So I dialed in my old overclock and ran Valley. Here's the result:

*XFX R9 290 BIOS @ 1120/1350*



*Asus 290X BIOS @ 1120/1350*



Soooo... I need to do some more testing. But this flash to a 290X is starting to look like a real thing. I've got some Metro LL benches I ran with my original Sapphire 290X, so I'll compare to those.

Edit: So the 290 flashed to 290X runs a little better at 1120/1350 than the 290 does at 1180/1375.



Well, I'm giddy as a schoolgirl. Here's Metro at 1100/1350:

290 BIOS 48 FPS



290X BIOS 50 FPS
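A quick back-of-the-envelope check on those numbers (the shader counts are the published 290/290X specs; this is just arithmetic on the runs above, not a new measurement):

```python
# Sanity check: measured Metro LL uplift vs the theoretical shader-count
# uplift (R9 290 = 2560 stream processors, R9 290X = 2816).
fps_290_bios = 48.0   # Metro LL @ 1100/1350, 290 BIOS
fps_290x_bios = 50.0  # Metro LL @ 1100/1350, 290X BIOS

measured_gain = (fps_290x_bios / fps_290_bios - 1) * 100  # percent
shader_gain = (2816 / 2560 - 1) * 100                     # percent

print(f"measured uplift: {measured_gain:.1f}%")  # ~4.2%
print(f"shader uplift:   {shader_gain:.1f}%")    # 10.0%
```

Games rarely scale linearly with shader count, so seeing roughly half the theoretical 10% at the same clocks is plausible for a real unlock.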


----------



## HoZy

It's the whole "Flash your 6950 to 6970" all over again.


----------



## jomama22

Were you throttling at all on either?

Only reason I ask is because tons of people have flashed the 290x Asus bios and no one unlocked anything until now. Maybe you got one of a few lucky uncut dies


----------



## nemm

Quote:


> Originally Posted by *rv8000*
> 
> Out of curiosity, has any 290 owner been able to get the core above 1200 stable yet, with or without voltage tweaks?
> 
> I can pull 1180 stable with ab 17 and + 100mV (stock volts I can get 1150 on the core). The ab tool nol4n put out seems to not be playing nicely with my card or ab 17 and causing lock ups, temps are below 80c on both core and vrms.


When applying LLC I had the same problems until I stopped going over +55mV; then everything was fine. It stabilized my overclock from 1100 to 1180, and I can also bench past 1220 with very little artifacting, and I mean very little.


----------



## Samurai Batgirl

Quote:


> Originally Posted by *Forceman*
> 
> Well. Well. Well.
> 
> Read the article about someone flashing a 290 to a 290X so I figured I'd give it a shot and flash mine. Got the Asus 290X BIOS from Overclockers UK, switched to the backup BIOS and flashed it. Rebooted and this is what I saw.
> 
> 
> 
> Well, maybe that's a GPU-Z thing. So I dialed in my old overclock and ran Valley. Here's the result:
> 
> *XFX R9 290 BIOS @ 1120/1350*
> 
> 
> 
> *Asus 290X BIOS @ 1120/1350*
> 
> 
> 
> Soooo... I need to do some more testing. But this flash to a 290X is starting to look like a real thing. I've got some Metro LL benches I ran with my original Sapphire 290X, so I'll compare to those.
> 
> Edit: So the 290 flashed to 290X runs a little better at 1120/1350 than the 290 does at 1180/1375.
> 
> 
> 
> Well, I'm giddy as a schoolgirl. Here's Metro at 1100/1350:
> 
> 290 BIOS 48 FPS
> 
> 
> 
> 290X BIOS 50 FPS


DUDE. Nice job! Get back to us when you have more results!


----------



## Sgt Bilko

OK, I've got the AC III back on now and I'm folding at 100% load (+50mV core, 1100/1300). VRM 1 is 55C, VRM 2 is 43C and the core is 62C at 100% fan. That's using the thermal pads off the stock block... apart from the VRM ones, which were no good, so I replaced them.

Hope that helps someone out


----------



## Supranium

This Asus 290X BIOS that unlocked your card is the same unlocked-voltage BIOS everyone here has been using. So it seems only a very small percentage of cards can unlock the shaders.


----------



## mr. biggums

Quote:


> Originally Posted by *Forceman*
> 
> Well. Well. Well.
> 
> Read the article about someone flashing a 290 to a 290X so I figured I'd give it a shot and flash mine. Got the Asus 290X BIOS from Overclockers UK, switched to the backup BIOS and flashed it. Rebooted and this is what I saw.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> Well, maybe that's a GPU-Z thing. So I dialed in my old overclock and ran Valley. Here's the result:
> 
> *XFX R9 290 BIOS @ 1120/1350*
> 
> 
> 
> *Asus 290X BIOS @ 1120/1350*
> 
> 
> 
> Soooo... I need to do some more testing. But this flash to a 290X is starting to look like a real thing. I've got some Metro LL benches I ran with my original Sapphire 290X, so I'll compare to those.
> 
> Edit: So the 290 flashed to 290X runs a little better at 1120/1350 than the 290 does at 1180/1375.
> 
> 
> 
> Well, I'm giddy as a schoolgirl. Here's Metro at 1100/1350:
> 
> 290 BIOS 48 FPS
> 
> 
> 
> 290X BIOS 50 FPS


What brand is your 290? The article specifies PowerColor was the one used. Someone posted about trying it on a Sapphire and it not working, but I'm curious whether it's just a needle-in-a-haystack situation.


----------



## Supranium

He has an XFX. They're all the same; brand doesn't matter.


----------



## Sgt Bilko

Quote:


> Originally Posted by *mr. biggums*
> 
> What brand is your 290? The article specifies PowerColor was the one used. Someone posted about trying it on a Sapphire and it not working, but I'm curious whether it's just a needle-in-a-haystack situation.


I believe he said XFX above the Valley bench


----------



## Forceman

Quote:


> Originally Posted by *jomama22*
> 
> Were you throttling at all on either?
> 
> Only reason I ask is because tons of people have flashed the 290x Asus bios and no one unlocked anything until now. Maybe you got one of a few lucky uncut dies


Most of the 290 tests were done on water, so no throttling. I did test a few things with it on air but I ran the fan at 70 or 100 and it wasn't throttling then either. So far it's been stable, so I don't know if the card has an actual 290X core, or if it just wasn't correctly disabled.
Quote:


> Originally Posted by *mr. biggums*
> 
> What brand is your 290? The article specifies PowerColor was the one used. Someone posted about trying it on a Sapphire and it not working, but I'm curious whether it's just a needle-in-a-haystack situation.


Mine is an XFX with Hynix memory.


----------



## famich

Quote:


> Originally Posted by *Jpmboy*
> 
> so far I've been having lockups (and hard ones...) with the tool also. But if I issue the LLC and offset commands in command mode (shift-rightclick in the MSI AB folder, "open command window here") it holds better with AB 15 or 16.
> 
> On:
> msiafterburner /wi4,30,8d,10 /ai4,30,38,7f
> Off:
> msiafterburner /wi4,30,8d,0 /oi4,30,38,80
> 
> May be something specific with this mobo and the LLC app...


Hi, I have been having the same problem: hard lockups with the tool. Even adding more offset, e.g. +15, caused a crash...
There must be something more to it, some other factor at play.


----------



## hotrod717

Quote:


> Originally Posted by *Forceman*
> 
> Most of the 290 tests were done on water, so no throttling. I did test a few things with it on air but I ran the fan at 70 or 100 and it wasn't throttling then either. So far it's been stable, so I don't know if the card has an actual 290X core, or if it just wasn't correctly disabled.
> Mine is a XFX with Hynix memory.


When and where did you purchase your XFX 290, if I may ask? I just recently received my XFX 290 and although I have Elpida, I am intrigued.

I (knocks on wood, crosses fingers, and says 7 Hail Marys) haven't experienced any of the issues that seem to be plaguing people in this thread. I am running stock and have tried both Quiet and Uber modes with my Asus VG248QE set at 144Hz.
I simply uninstalled the old drivers, deleted the x86 files, and ran CCleaner to clean out the registry entries. Installed the card after a reboot and then installed the newest drivers. I've done this with multiple cards, AMD and Nvidia, and haven't experienced any issues yet.


----------



## Gilgam3sh

well this 290-->290X flash thing sounds nice even though my 290 still says 2560 shaders after ASUS 290X BIOS...


----------



## hotrod717

Quote:


> Originally Posted by *Gilgam3sh*
> 
> well this 290-->290X flash thing sounds nice even though my 290 still says 2560 shaders after ASUS 290X BIOS...


You don't know unless you try. Apparently there is a lottery within a lottery! This is obviously the exception, not the norm.


----------



## Duvar

Perhaps it will only work on cards with Hynix memory.


----------



## R35ervoirFox

Quote:


> Originally Posted by *Duvar*
> 
> Perhaps it will only work on cards with Hynix memory.


No, the memory has nothing to do with the core.


----------



## Forceman

Quote:


> Originally Posted by *hotrod717*
> 
> When and where did you purchase your XFX 290, if I may ask? I just recently received my XFX 290 and although I have Elpida, I am intrigued.
> .


I bought it at Amazon on day one. I'm very tempted to flash the XFX 290X BIOS on it, but I don't want to screw it up since it is working so well with the Asus one. Pretty consistently about 5% faster than the 290 BIOS at the same clock speed - tested with Heaven, Valley, and Metro LL. Gonna run Firestrike now.


----------



## R35ervoirFox

Quote:


> Originally Posted by *Forceman*
> 
> I bought it at Amazon on day one. I'm very tempted to flash the XFX 290X BIOS on it, but I don't want to screw it up since it is working so well with the Asus one. Pretty consistently about 5% faster than the 290 BIOS at the same clock speed - tested with Heaven, Valley, and Metro LL. Gonna run Firestrike now.


Do you know what memory the Asus used? I'm concerned the two memory types have different timings, but if the memory is the same it shouldn't matter which BIOS you flash, as the cards are all reference designs beyond that.


----------



## lordzed83

Well, I had a play with the new AB.
With +100mV the max I can get without artifacts is 1150/1500. Interestingly, if I run the memory at 1550 I get lower scores in benchmarks than at 1500.
The memory must be producing more errors, so for me 6000MHz effective is the sweet spot.


----------



## hotrod717

Quote:


> Originally Posted by *Forceman*
> 
> I bought it at Amazon on day one. I'm very tempted to flash the XFX 290X BIOS on it, but I don't want to screw it up since it is working so well with the Asus one. Pretty consistently about 5% faster than the 290 BIOS at the same clock speed - tested with Heaven, Valley, and Metro LL. Gonna run Firestrike now.


Congratulations! You lucked out. I have a feeling you're in a small minority, but I'll give it a go just to see, once my block arrives. I don't want to alter the card in any way for a few more days, to make sure everything's good.


----------



## Forceman

Quote:


> Originally Posted by *R35ervoirFox*
> 
> Do you know what memory the Asus used? I'm concerned the two memory types have different timings, but if the memory is the same it shouldn't matter which BIOS you flash, as the cards are all reference designs beyond that.


Not sure. Although with the mix and match nature of the memory chips I'm wondering if all the timings are the same anyway.


----------



## R35ervoirFox

Quote:


> Originally Posted by *Forceman*
> 
> Not sure. Although with the mix and match nature of the memory chips I'm wondering if all the timings are the same anyway.


No, they have different timings, but it seems every BIOS has timings for both. Where did you get the Asus BIOS? I don't see it on TechPowerUp, but I do see the Sapphire 290X BIOS, so I might use that.

Edit: All the bios have this in the details:

Memory Support
4096 MB, GDDR5, Autodetect
4096 MB, GDDR5, Hynix H5GQ2H24AFR
4096 MB, Unknown, Elpida EDW2032BBBG

Of course, the timings could still be the same.


----------



## Gilgam3sh

Quote:


> Originally Posted by *R35ervoirFox*
> 
> No, they have different timings, but it seems every BIOS has timings for both. Where did you get the Asus BIOS? I don't see it on TechPowerUp, but I do see the Sapphire 290X BIOS, so I might use that.
> 
> Edit: All the bios have this in the details:
> 
> Memory Support
> 4096 MB, GDDR5, Autodetect
> 4096 MB, GDDR5, Hynix H5GQ2H24AFR
> 4096 MB, Unknown, Elpida EDW2032BBBG
> 
> ofc the timings could still be the same


here you have the ASUS 290X BIOS.

https://skydrive.live.com/?cid=0aa466d1c03b2ec3&id=AA466D1C03B2EC3!787&authkey=!ANoiqfQqK6n6tdo


----------



## R35ervoirFox

Quote:


> Originally Posted by *Gilgam3sh*
> 
> here you have the ASUS 290X BIOS.
> 
> https://skydrive.live.com/?cid=0aa466d1c03b2ec3&id=AA466D1C03B2EC3!787&authkey=!ANoiqfQqK6n6tdo


Thanks







I take it the Sapphire.rom is also an R9 290X ROM?

I should get my card soon. I'll wait a day or two to make sure it's working OK. It would be awesome if the shaders unlock; they might not have had time to cut them all.


----------



## Gilgam3sh

Quote:


> Originally Posted by *R35ervoirFox*
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> I take it the Sapphire.rom is also an R9 290X ROM?
>
> I should get my card soon. I'll wait a day or two to make sure it's working OK. It would be awesome if the shaders unlock; they might not have had time to cut them all.


Yes, the Sapphire one is a 290X BIOS, but the owner just made a backup of his own card's BIOS. I use that ASUS 290X BIOS on my Sapphire 290 without any problem, but it still shows 2560 shaders.


----------



## Duvar

We can check whether it really unlocks the shaders or not.
What about the people whose shaders were not unlocked with the Asus BIOS?
They could run the same tests at the same clocks, and if they still get better results with the Asus BIOS, then it would point to a timing difference rather than an unlock.
Gilgamesh, please test it with your card: overclock the stock BIOS to 1000/1250 and compare the results with the Asus 290X BIOS.


----------



## mustrum

I have done that. No performance increase at the exact same clocks (54.8 to 55 with the X BIOS).


----------



## Duvar

Thanks. I asked some guys from Germany to run the same test and the result was the same as yours: no performance increase at all.
mustrum, do you have Elpida memory?


----------



## mustrum

I need to get home first to have a look at the memory. The guy from Germany could've been me, though (Hardwareluxx forum?). I'm not actually German, but I am active in the flash thread over there.


----------



## Forceman

Can anyone think of another way to confirm the shaders? Preferably something that would be isolated to the shader count? Some kind of synthetic benchmark maybe?

My card appears to be about 5% faster, which is maybe a little on the low side? I'd like to test the shader throughput directly.

Okay, I put stock 290 clocks on and ran ShaderToyMark on the flashed BIOS. Anyone want to take a minute and check it with a stock 290?



http://www.geeks3d.com/20111215/shadertoymark-0-3-0-opengl-pixel-shader-benchmark-updated/
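For reference, the raw shader math (SP counts and the 947MHz stock 290 clock are the published specs; this is the standard peak-FLOPs formula, not a measurement):

```python
# Peak single-precision throughput scales directly with shader count:
# GFLOPS = shaders * 2 FLOPs per clock * core clock in MHz / 1000.
def peak_gflops(shaders, clock_mhz):
    return shaders * 2 * clock_mhz / 1000.0

print(peak_gflops(2560, 947))  # 290 at stock 290 clocks: 4848.64 GFLOPS
print(peak_gflops(2816, 947))  # 290X shader count at 290 clocks: 5333.5 GFLOPS
```

At identical clocks the gap is exactly 10%, so a purely shader-bound test like ShaderToyMark should show close to that if the unlock is real, rather than the ~5% seen in games.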


----------



## Duvar

@mustrum: This is the original thread; you can check it out if you want: http://extreme.pcgameshardware.de/grafikkarten/303356-amd-r9-290-serie-bios-flash-und-oc-austausch-shader-freigeschaltet.html
Let your browser translate it.


----------



## mustrum

Ah yeah, I've read that thread. It's where everything started, I believe.
I'm Austrian, so no need to translate. There is no additional information to be found there, though.


----------



## SonDa5

The shader problem could have been the reason the 290 was delayed... but it looks like AMD may have it under control. Then again, some 290s appear to be unlockable to a 290X, so it looks like a gamble. Either way, with or without all the shaders, the 290 is still a great card and a great value.


----------



## Menthol

I have a Sapphire 290X and have been using the ASUS BIOS with unlocked voltage; the card clocks pretty well. While waiting on my block, I flashed it with a 290 BIOS to see if it would take. It flashed OK and booted without problems, but GPU-Z showed every line blank (it couldn't read the BIOS on the card) and GPU Tweak wouldn't start. Why flash down, you ask? I was going to try to bench the card as a 290 for points, but I guess not.


----------



## ontariotl

Quote:


> Originally Posted by *Forceman*
> 
> Mine is a XFX with Hynix memory.


I have 2 of the exact same model with Hynix. I'm going to try and flash mine when I get home from work and see if I get the same results. I already have some benchmarks after watercooling to compare if I'm successful with the flash.


----------



## instream

Does anybody also have problems with Directx 9 games with R9 290 or R9 290X? I have an R9 290. I get random crashes that cause the computer to freeze for a second then reboot itself. It happens randomly after 1-60 minutes when playing Mount and Blade Warband, and also in Trine. Both are DirectX 9. I use the latest Catalyst 13.11 beta9.2 on win7 x64, and have run DDU and AMD cleanup utility before installing Catalyst. Twice I have removed the R9 290 and inserted my old HD5850. Then the crashes go away completely. Last night I even installed a fresh copy of Win7 x64 on an empty partition, then installed Catalyst 13.11 beta9.2, then Steam, then Mount and Blade Warband. The game crashed and rebooted the same way twice within 15 min. So the problem is still there with DirectX 9 games I guess, even in Catalyst 13.11 beta9.2.

To rule out other causes: I have a brand new 750W PSU (Fractal Design Integra R2); the CPU is a Phenom II 1100T (watercooled). I can run Unigine Heaven or Valley for hours without problems. Monitoring temps and frequencies at the same time, I see the GPU runs at 947MHz, reaches 94 deg C, and the fan goes at 2000+ RPM (47%). When instead running the DirectX 9 game, the GPU load goes up and down from 0 to 100% all the time, the GPU frequency is maybe 700MHz, and the temp reaches 94 deg C slowly, but the fan stays at around 1100 RPM anyway. Even when I set the max allowed temp to 70 deg (and I see the GPU at maybe 68), I get the random reboots. The fan then goes up to 2000+ RPM to keep the temp below 70, so the crashes shouldn't be caused by temperature.


----------



## SonDa5

Quote:


> Originally Posted by *instream*
> 
> Does anybody also have problems with Directx 9 games with R9 290 or R9 290X?


No problems with UT3. I've been playing it without any issues on my overclocked and watercooled Sapphire 290X with Hynix memory on Windows 8.1. I've tried the last two AMD CCC drivers without any problems in UT3, and I am running the latest beta BIOS from the AMD website.


----------



## Sgt Bilko

Quote:


> Originally Posted by *instream*
> 
> Does anybody also have problems with Directx 9 games with R9 290 or R9 290X? I have an R9 290. I get random crashes that cause the computer to freeze for a second then reboot itself. It happens randomly after 1-60 minutes when playing Mount and Blade Warband, and also in Trine. Both are DirectX 9. I use the latest Catalyst 13.11 beta9.2 on win7 x64, and have run DDU and AMD cleanup utility before installing Catalyst. Twice I have removed the R9 290 and inserted my old HD5850. Then the crashes go away completely. Last night I even installed a fresh copy of Win7 x64 on an empty partition, then installed Catalyst 13.11 beta9.2, then Steam, then Mount and Blade Warband. The game crashed and rebooted the same way twice within 15 min. So the problem is still there with DirectX 9 games I guess, even in Catalyst 13.11 beta9.2.
> 
> To rule out other causes: I have a brand new 750W PSU (Fractal Design Integra R2), CPU is Phenom II 1100T (watercooled). I can run Unigine heaven or valley for hours without problems. I monitor temps and freqs at the same time and see that GPU runs at 947MHz, reaches 94 deg C, and the fan goes at 2000+ RPM (47%). When instead running the DirectX 9 game the GPU load goes up and down from 0 to 100% all the time, GPU freq is maybe 700MHz, temp reaches 94% slowly, but the fan goes at around 1100RPM anyway. Even when I have set the max temp allowed to 70 deg (and I see the GPU is at maybe 68), I get the random reboots. The fan however goes up to 2000+ RPM to keep the temp below 70. So the crash shouldn't be caused by temperature.


That's strange. I've got Trine, so I'll go play it for an hour and see what happens on my end.


----------



## instream

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That's strange. I've got Trine, so I'll go play it for an hour and see what happens on my end.


That would be great!


----------



## muhd86

Guys, one question:

Would four Sapphire R9 290s tear the quad Titans or quad GTX 780s a new one?

I've always been a fan of quad setups, hence the question. Three Sapphire R9 290s will be here by Monday, and I figured if I add another one I could do some really badass benchmarking on them.
The setup is a 3970X on a Rampage IV Extreme.

I have quad 780s, but only the old drivers support quad and none of the new ones do, so I thought I might sell one and keep three to install in my second i7 rig.

Second question:

Are three R9 290s beasts in their own right? I can't wait to get my hands on them, but I've been reading on the forums about people having different issues with them. Are those card-related or driver-related?


----------



## Sgt Bilko

Alrighty, 30 minutes in and it hasn't crashed (alt-tabbed at the moment), but I'm seeing something similar: the GPU load isn't very consistent, bouncing between 0% and 90%.

I have an aftermarket cooler on and I'm running it overclocked anyway, but I'm using the 13.11 WHQL (unofficial) driver from Guru3D.

Other than that, no idea.

On a more positive note, I'd forgotten just how pretty the Trine games are... very colourful.


----------



## rdr09

Quote:


> Originally Posted by *muhd86*
> 
> Guys, one question:
>
> Would four Sapphire R9 290s tear the quad Titans or quad GTX 780s a new one?
>
> I've always been a fan of quad setups, hence the question. Three Sapphire R9 290s will be here by Monday, and I figured if I add another one I could do some really badass benchmarking on them.
> The setup is a 3970X on a Rampage IV Extreme.
>
> I have quad 780s, but only the old drivers support quad and none of the new ones do, so I thought I might sell one and keep three to install in my second i7 rig.
>
> Second question:
>
> Are three R9 290s beasts in their own right? I can't wait to get my hands on them, but I've been reading on the forums about people having different issues with them. Are those card-related or driver-related?


Depends on the bench. Titans are still up there in quads . . .


----------



## jjjc_93

4-way 290X holds the 3DMark 11 4-way world record. Definitely go for the 290X if you want to run 4-way and bench.


----------



## muhd86

Hmm... interesting. Too bad Nvidia isn't keeping the quad GTX 780 option alive, otherwise I would be there as well.

OK, I have three GTX 760s in tri-SLI which I wanted to upgrade. So are three R9 290s beasts in their own way? Can I compare a stock R9 290 to a stock GTX 780, and would that be a fair comparison?


----------



## muhd86

Quote:


> Originally Posted by *jjjc_93*
> 
> 4-way 290X holds the 3DMark 11 4-way world record. Definitely go for the 290X if you want to run 4-way and bench.


Yeah, but I have already ordered, so I can't change it now. To which Nvidia GPUs in tri configuration can I compare three Sapphire R9 290s?


----------



## rdr09

Quote:


> Originally Posted by *muhd86*
> 
> Yeah, but I have already ordered, so I can't change it now. To which Nvidia GPUs in tri configuration can I compare three Sapphire R9 290s?


780.


----------



## muhd86

Quote:


> Originally Posted by *rdr09*
> 
> 780.


Damn, that's badass. Let's hope it all plays well. No one here has three R9 290 GPUs; I've looked around. Can anyone post a link if I missed it or something? Three or four R9 290s or 290Xs in quadfire... hell yeah.


----------



## selk22

The scaling on quad 290X vs quad Titan is better on the AMD side. If I were to go quad GPU it would be AMD, so I could get the most out of the cards.


----------



## rdr09

Quote:


> Originally Posted by *muhd86*
> 
> Damn, that's badass. Let's hope it all plays well. No one here has three R9 290 GPUs; I've looked around. Can anyone post a link if I missed it or something? Three or four R9 290s or 290Xs in quadfire... hell yeah.


It depends which 780 you are referring to. There are the Classifieds, which compete with the Titans and the 780 Tis.


----------



## instream

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Alrighty,30 mins in and it hasn't crashed (alt tabbed atm) but i'm seeing something similar......the GPU load isn't very consistent, it keeps bouncing between 0% and 90%.
> 
> I have an aftermarket cooler on and i'm running it overclocked anyways but i'm using the 13.11 whql (unofficial) driver from Guru3D.
> 
> other than that, no idea
> 
> on a more positive note, i'd forgotten just how pretty the Trine games are......very colourful.


Ok, thanks for your help!








I was going to put an aftermarket cooler on it too, but since I'm not 100% sure the card isn't broken I don't dare... :-/
If it's not too much to ask, could you run the game a bit more?







Sometimes I have been able to play Mount and Blade for an hour or so, so it could be that you are just lucky right now.









Is that 13.11 WHQL driver the same as beta 9.2, or something else?


----------



## instream

Quote:


> Originally Posted by *SonDa5*
> 
> No problems with UT3. Have been playing it without any problems with my over clocked and water cooled Sapphire 290x with Hynix memory on Windows 8.1. Have tried last 2 AMD CCC drivers without any problems for UT3 and I am running the latest Beta BIOS from the AMD WEB site.


One difference between us is that I'm using Win7 x64 and you have Win8.1. But that seems a bit far-fetched as the reason for my crashes, right?


----------



## Sgt Bilko

Quote:


> Originally Posted by *instream*
> 
> Ok, thanks for your help!
> 
> 
> 
> 
> 
> 
> 
> 
> I was going to put an after market cooler on it too, but since I'm not 100% sure that the card is not broken I don't dare... :-/
> If it's not too much to ask, could you run the game a bit more?
> 
> 
> 
> 
> 
> 
> 
> Sometimes I have been able to play Mount and Blade for an hour or so, so it could be that you are just lucky right now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is that 13.11 whql driver the same as beta9.2 or something else?


I'll run it again tomorrow maybe; just trying to break through 9k in Firestrike at the moment...


----------



## Skylark71

Quote:


> Originally Posted by *instream*
> 
> Does anybody also have problems with Directx 9 games with R9 290 or R9 290X? I have an R9 290. I get random crashes that cause the computer to freeze for a second then reboot itself. It happens randomly after 1-60 minutes when playing Mount and Blade Warband, and also in Trine. Both are DirectX 9. I use the latest Catalyst 13.11 beta9.2 on win7 x64, and have run DDU and AMD cleanup utility before installing Catalyst. Twice I have removed the R9 290 and inserted my old HD5850. Then the crashes go away completely. Last night I even installed a fresh copy of Win7 x64 on an empty partition, then installed Catalyst 13.11 beta9.2, then Steam, then Mount and Blade Warband. The game crashed and rebooted the same way twice within 15 min. So the problem is still there with DirectX 9 games I guess, even in Catalyst 13.11 beta9.2.
> 
> To rule out other causes: I have a brand new 750W PSU (Fractal Design Integra R2), CPU is Phenom II 1100T (watercooled). I can run Unigine heaven or valley for hours without problems. I monitor temps and freqs at the same time and see that GPU runs at 947MHz, reaches 94 deg C, and the fan goes at 2000+ RPM (47%). When instead running the DirectX 9 game the GPU load goes up and down from 0 to 100% all the time, GPU freq is maybe 700MHz, temp reaches 94% slowly, but the fan goes at around 1100RPM anyway. Even when I have set the max temp allowed to 70 deg (and I see the GPU is at maybe 68), I get the random reboots. The fan however goes up to 2000+ RPM to keep the temp below 70. So the crash shouldn't be caused by temperature.


I have an Asus R9 290. I don't get crashes, but during gameplay the GPU load goes up and down from 0 to 100% all the time too.
Something like this:

I've tried upping the power target and fan speed; I don't know what's going on.








But in benchmarks like 3DMark and FurMark it stays at a solid 100%.


----------



## HoZy

Why do people even bother with regular firestrike?

Run firestrike extreme if you are really interested in it.


----------



## evensen007

Quote:


> Originally Posted by *Skylark71*
> 
> I have an Asus R9 290. I don't get crashes, but during gameplay the GPU load goes up and down from 0 to 100% all the time too.
> Something like this:
>
> I've tried upping the power target and fan speed; I don't know what's going on.
> 
> 
> 
> 
> 
> 
> 
> 
> But in benchmarks like 3DMark and FurMark it stays at a solid 100%.


I'm having this issue with my CrossFire 290s in BF4. It's running at 100+ FPS but feels like a slideshow because of what's happening in your screenshot above.


----------



## smokedawg

Could this be what you are experiencing?
Quote:


> Guys, in different forums I've seen claims that AB reports wrong GPU usage on 290 cards because it is not supporting new AMD GPUs yet. That's incorrect statement, new versions of AB won't fix it, GPU usage is reported as it is calculated by AMD driver and it is not anyhow calculated internally in AB. So it is not a case of unknown GPU, it is as correct as AMD driver team implemented it. Currently GPU usage sensor implementation inside AMD driver is broken and it may return unreliable data if other sensors were polled immediately before GPU usage. So the amount of wrong data can be minimized by adding artificial delays between polling the sensors, it may be minimized by putting GPU load sensors to the first positions in the list, but the only true fix can be expected from AMD drivers update.


source


----------



## Sgt Bilko

Quote:


> Originally Posted by *HoZy*
> 
> Why do people even bother with regular firestrike?
> 
> Run firestrike extreme if you are really interested in it.


Call it a personal goal, 9k in Regular and 5k in Extreme

Speaking of which: http://www.3dmark.com/fs/1151742

i finally broke 9k.

EDIT: just broke 5k as well









http://www.3dmark.com/fs/1151775


----------



## Skylark71

Quote:


> Originally Posted by *evensen007*
> 
> I'm having this issue with my CrossFire 290s in BF4. It's running at 100+ FPS but feels like a slideshow because of what's happening in your screenshot above.


Exactly, the FPS is fine but the game stutters like hell.
I hope AMD gets this fixed; I'm starting to get pissed.


----------



## Bloitz

Hey Mr. Postman, you got a package for me?


----------



## overclockFrance

My Sapphire 290 flashed with the Asus 290X BIOS still has 2560 SPs.


----------



## Sgt Bilko

Quote:


> Originally Posted by *overclockFrance*
> 
> My Sapphire 290 flashed with the Asus 290X BIOS still has 2560 SPs.


Maybe it's just a luck thing then?

Some have the full 2816 and others stay at 2560... it would have been pretty awesome if they all unlocked; I'd sell my 290x in a heartbeat and grab 2 x 290's


----------



## ontariotl

Quote:


> Originally Posted by *Forceman*
> 
> Can anyone think of another way to confirm the shaders? Preferably something that would be isolated to the shader count? Some kind of synthetic benchmark maybe?
> 
> My card appears to be about 5% faster, which is maybe a little on the low side? I'd like to test the shader throughput directly.
> 
> Okay, I put stock 290 clocks on and ran ShaderToyMark on the flashed BIOS. Anyone want to take a minute and check it with a stock 290?
> 
> 
> 
> http://www.geeks3d.com/20111215/shadertoymark-0-3-0-opengl-pixel-shader-benchmark-updated/


Forceman you might be onto something.

I just flashed my two XFX R9 290 cards since they had the Hynix memory as well. I put the bios switch to what should be quiet mode for the 290X and flashed with the Asus bios. This is what I got.

http://s81.photobucket.com/user/OntarioTL/media/biosflash_zps2465bccd.gif.html

All shaders accounted for. I even closed, rebooted and the shaders still are showing up as if it was a 290x.

Ok, so initial test so far.

Valley benchmark: 5760x1200, Ultra custom mode, no AA, OC'd to 1100MHz/1300MHz

First with the original Xfx bios for 290
http://s81.photobucket.com/user/OntarioTL/media/R9290valleyoc1120mem1400_zps3245f909.jpg.html

Now with the Asus flash for 290X
http://s81.photobucket.com/user/Ont...hoverclock11001300valley_zpse91fca7a.jpg.html

This seems too good to be true. Time to do more testing.....


----------



## selk22

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Call it a personal goal, 9k in Regular and 5k in Extreme
> 
> Speaking of which: http://www.3dmark.com/fs/1151742
> 
> i finally broke 9k.
> 
> EDIT: just broke 5k as well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/1151775


Congrats bro!
Quote:


> Originally Posted by *HoZy*
> 
> Why do people even bother with regular firestrike?
> 
> Run firestrike extreme if you are really interested in it.


Not everyone owns 3dmark11 so basic edition is a nice free open benchmark.


----------



## Sgt Bilko

Quote:


> Originally Posted by *selk22*
> 
> Congrats bro!
> Not everyone owns 3dmark11 so basic edition is a nice free open benchmark.


Thanks, and I have the Advanced version of the most recent 3DMark programs (11, Vantage and 3DMark)


----------



## selk22

So guys, is this normal after OC'ing with AB and setting +50% power in CCC? In the Heaven benchmark it's a pretty much constant 1150/1400, but here is a screenshot from a round of BF4. Is this normal for the clock to bounce around so much during gameplay? It's actually still completely smooth gameplay, but I was just wondering.


----------



## rdr09

Quote:


> Originally Posted by *selk22*
> 
> So guys is this normal after OC'ing with AB and setting 50% power in CCC.. in heaven benchmark pretty much constant 1150/1400 but in a round of bf4 here is a screenshot.. Is this normal for the clock to bounce around so much during gameplay? Its actually still completely smooth gameplay but i was just wondering.
> 
> 
> Spoiler: Warning: Spoiler!


I guess the GPU is just adjusting to the game scene. What caught my eye is the VRAM usage: over 3000MB. What res are you using?


----------



## selk22

Quote:


> Originally Posted by *rdr09*
> 
> i guess the gpu is just adjusting with the game scene. what caught my eyes is the VRAM usage. over 3000MB. what rez are you using?


I think you're right; after further testing this seems to just be the GPU adjusting to what is happening.

Well, yeah, that's why I have the 290x, bro! All that sexy VRAM! I'm at 1920x1200, 4xAA, full post AA, 135% res scale!








55-60FPS average on 64p OP locker.

If that's not a crisp game Idk what is!!


----------



## ontariotl

So far I can confirm it is working with my XFX cards. I had to re-flash the second card as GPU-Z was reporting odd numbers, but now it's working fine.

The only thing I have observed so far is that with the Asus BIOS, I had to up the voltage to 1.3V to run the Heaven benchmark at the same overclock I could get with the original XFX 290 BIOS (1100MHz GPU/1300 mem with no increase in voltage). With default 290X clocks, it was fine. Being watercooled they never throttle back, which may be why it wants more volts, but not by enough to call it a failure.

I've posted more info on another site as I didn't want to go back and forth here and there, so if it's ok I'll post a link instead.

http://linustechtips.com/main/topic/76456-success-xfx-r9-290-flashed-to-asus-r9-290x/

Off to do more testing. I should be asleep for my shift at work tonight, but this whole thing with the bios has me giddy like a kid in a candy store


----------



## evensen007

Going home for lunch to flash my 2 XFX 290's with the asus 290x bios. I'm hoping I won the chip lottery on both of them and unlock the shaders.


----------



## Toss3

Considering selling my Sapphire 290 and ordering an XFX 290 in its stead. My sapphire isn't stable above 1150 on water.


----------



## the9quad

Well, I ordered another 290x from Amazon next-day (so that will make two when the other one from the Newegg RMA gets here). If they both start black screening, I guess we will know it should be something that can be fixed in drivers or a mobo BIOS update or something, instead of faulty cards.


----------



## Maxxa

Quote:


> Originally Posted by *Sgt Bilko*
> 
> maybe it's just a luck thing then?
> 
> Some have the full 2816 and others stay at 2560.......would have been pretty awesome if they all unlocked, i'd sell my 290x in a heartbeat and grab 2 x 290's


I could just imagine how difficult it would be to sell a 290x for anywhere near what you paid for it if word is out that 290s can be flashed to Xs.


----------



## TomiKazi

Well, finally decided to go for the 290 today. The black screen and clock fluctuations made me worry so much that I considered just going the 780Ti route, but saving €320 gives more room for some overkill watercooling and perhaps a beefier PSU if I'd decide I want to overvolt.

Expecting it next week, hopefully.


----------



## MotionBlur84

Hi all!

My ASUS R9 290 first try:

3DM11:



TESS OFF:



Valley:



GPU load voltage: 1.23V
Driver: 13.11 WHQL (unofficial)


----------



## Sgt Bilko

Quote:


> Originally Posted by *Maxxa*
> 
> I could just imagine how difficult it would be to sell a 290x for anywhere near what you paid for it if word is out that 290s can be flashed to Xs.


Fortunately not everyone would hear about it









And the way it's looking it seems like only some of the 290's can flash to a 290x.

And I decided about 10 mins ago I'll get a non-ref 290x to go along with my current one later on.


----------



## jtjoetan

Hey everyone, I finally got my 290! Loving it ATM, but why do I have inconsistent FPS? It can be 70, 90, and even 30-40+.
And is there a way to set its fan speed/max temp? I tried Catalyst Control Centre, and after I clicked Apply it just reverted back to its original settings.

I'm currently placing my rig against the wall (back of the case against the wall) and my cables are all behind it, so will the heat from the GPU exhaust damage my cables?

EDIT: I did not overclock the GPU because I don't feel there's a need for it yet.


----------



## R35ervoirFox

Quote:


> Originally Posted by *jtjoetan*
> 
> Hey everyone, I finally got my 290! Loving it ATM, but why do I have inconsistent FPS? It can be 70, 90, and even 30-40+.
> And is there a way to set its fan speed/max temp? I tried Catalyst Control Centre, and after I clicked Apply it just reverted back to its original settings.
> 
> I'm currently placing my rig against the wall (back of the case against the wall) and my cables are all behind it, so will the heat from the GPU exhaust damage my cables?


You have to tick Enable Graphics OverDrive for the CCC fan to work and it works fine.
Just make sure you got proper airflow behind your case.

Guys, are you flashing from inside Windows or from DOS? Usually I would be lazy, since it's dual BIOS, and flash from Windows, but I dunno if that is problematic with these cards and it's better to flash from USB DOS. Thanks in advance


----------



## Xylene

Just set up a will-call order for the XFX 290 because it comes with BF4. Assuming it's ready in the next few hours I'll be getting it today; if not, I'll have to cancel, because I won't be able to make the 71-mile-each-way trip again until after it's too late (they only hold for 7 days). Got the 5% off promo, which paid for most of the tax.


----------



## jtjoetan

Quote:


> Originally Posted by *R35ervoirFox*
> 
> You have to tick Enable Graphics OverDrive for the CCC fan to work and it works fine.
> Just make sure you got proper airflow behind your case.
> 
> Guys are you flashing from inside windows or from dos, usually I would be lazy as it's dual bios and flash from windows but dunno if that is problematic with these cards and it's better to flash from usb dos. ty in advance


Thanks! It really works!
Proper airflow as in what? Right now I have a Noctua NH-D14 for my CPU, 1 fan on the top of the case, 1 fan on the front of the case, and another fan behind the case. But near the slots is where the heat is crazily exhausting...


----------



## rdr09

Quote:


> Originally Posted by *MotionBlur84*
> 
> Hi all!
> 
> My ASUS R9 290 first try:
> 
> 3DM11:
> 
> 
> 
> TESS OFF:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Valley:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> GPU load voltage: 1.23V
> Driver: 13.11 WHQL (unofficial)


your oc may not be stable. your graphics score is low for that oc (3DM11).


----------



## ontariotl

Quote:


> Originally Posted by *R35ervoirFox*
> 
> Guys are you flashing from inside windows or from dos, usually I would be lazy as it's dual bios and flash from windows but dunno if that is problematic with these cards and it's better to flash from usb dos. ty in advance


Flashing from USB Dos.
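For anyone following along, the DOS route is roughly this (a sketch, not exact instructions: `backup.rom` / `asus.rom` are example filenames, `0` is the adapter index the utility itself reports, and you should check the flags against the ATIFlash version you actually have on your bootable USB stick):

```
atiflash -i                 list adapters and their index numbers
atiflash -s 0 backup.rom    save the current BIOS of adapter 0 first
atiflash -p 0 asus.rom      program asus.rom onto adapter 0
```

With the dual-BIOS switch, flash only one position and keep the other untouched as a fallback.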


----------



## jerrolds

Quote:


> Originally Posted by *R35ervoirFox*
> 
> You have to tick Enable Graphics OverDrive for the CCC fan to work and it works fine.
> Just make sure you got proper airflow behind your case.
> 
> Guys are you flashing from inside windows or from dos, usually I would be lazy as it's dual bios and flash from windows but dunno if that is problematic with these cards and it's better to flash from usb dos. ty in advance


I tried WinFlash - it didn't work; I got an error saying "could not find discrete ATI video card" or something like that. I thought about doing it in Safe Mode... but ended up just finding an old USB stick and booting from there. Flashing to Asus.rom worked like a charm.

I ended up reinstalling drivers as well.


----------



## $ilent

Please update me, Arizonian; I now have an R9 290.

Thanks. ASIC is 75.4%.

Also, stock VDDC is 1.180 - is this good?


----------



## jtjoetan

Quote:


> Originally Posted by *jtjoetan*
> 
> Thanks! It really works!
> Proper airflow as in what? Right now I have a Noctua NH-D14 for my CPU, 1 fan on the top of the case, 1 fan on the front of the case, and another fan behind the case. But near the slots is where the heat is crazily exhausting...


Can anyone help me?
My wall is getting pretty hot, but I don't want to increase the fan speed - it's freaking disturbing @_@. It's currently sitting at 30% now.


----------



## iGameInverted

Just ordered the AquaComputer water block. Still haven't ordered any other parts for my PC yet. Will update when I receive the card. Wanted to order the water block to make sure it was here when I was ready to build. (Building my first water cooled PC). So much to order still.


----------



## Arizonian

Quote:


> Originally Posted by *$ilent*
> 
> Please update me, Arizonian; I now have an R9 290.
> 
> Thanks. ASIC is 75.4%.
> 
> Also, stock VDDC is 1.180 - is this good?


Updated to Sapphire 290








Quote:


> Originally Posted by *MotionBlur84*
> 
> Hi all!
> 
> My ASUS R9 290 first try:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 3DM11:
> 
> 
> 
> TESS OFF:
> 
> 
> 
> Valley:
> 
> 
> 
> 
> GPU load voltage: 1.23V
> Driver: 13.11 WHQL (unofficial)


Did I miss your pic w/OCN name for submission? If I did, I'm sorry; can you link me to it?

If you haven't submitted one please do and join us.


----------



## $ilent

Thanks arizonian. Anyone got any ideas on the vddc?


----------



## Technewbie

What is the best block for the 290x? I'm going to be watercooling mine soon, and I was going to get an EK block, but now others are coming out, so are there any that are better than the EK ones? Also, OP, you put my card down as ASUS but it's actually an MSI.


----------



## Thornet

Regarding the black screen issue...
After a lot of testing/benchmarking on my new watercooled 290x, I've come to the conclusion that (at least for my card):
1) The screen only goes black if I run all 3 of my monitors at 120Hz at a slightly higher overclock. If I set the screens to 60Hz at the same clockspeed and voltage, there is no more black screening.
2) Unplugging 1 screen and going 120Hz on the remaining 2, both on dual-link DVI, there is no black screening at the same clockspeed and voltage.
3) Using 1 monitor at 120Hz at the same clockspeed and voltage, regardless of which cable I use (DisplayPort or DVI), there is no black screening...

The black screens come when I use ~1230MHz on the core with 1450mV set (actual 1.30-1.32V), as I'm using the Asus BIOS with a pt1.rom.
I'm not sure whether it's a driver issue or a hardware issue, but the problem only occurs when I set the screens to 120Hz, which makes me wonder...

Anyone else with this issue using 120Hz monitors and tested at 60Hz?

Edit: Benchmarking done with Heaven on the Extreme preset.


----------



## $ilent

Guys, I'm having issues with my new R9 290.

When folding at stock with the fan set to around 80% so as to prevent thermal throttling, the card's GPU usage bounces up and down when using Folding@home.



Anyone know why this is? I have left my PC to fold away and the GPU usage is still doing the same up-and-down thing, and the GPU is only getting 114k PPD on a 7810, which I know is way off, since my GTX 670 can get like 105k PPD on it with a huge overclock. This card should be doing well over 150k on this work unit.


----------



## jerrolds

I am using a QNIX 1440p @ 120Hz and have yet to encounter black screen issues. Sapphire 290X w/ Asus.rom, 1200MHz core, 1500MHz memory, 1.385V (1.29-1.33 actual). I don't have 3 screens, just a secondary 1080p 60Hz plasma HDTV connected via HDMI.


----------



## jerrolds

Quote:


> Originally Posted by *$ilent*
> 
> guys im having issues with my new r9 290.
> 
> When folding at stock with the fan set to around 80% so as to prevent thermal throttling, the card's GPU usage bounces up and down when using Folding@home
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Anyone know why this is? I have left my pc to fold away and the gpu usage is still doing the same up and down thing, and the gpu is only getting 114k ppd on a 7810, which I know is way off since my gtx 670 can get like 105k ppd on it with a huge overclock. This card should be doing well over 150k on this work unit.


Usually maxing the power limit to 150% will prevent throttling, assuming your fan speed has a high enough ceiling.


----------



## oblivious45cs

Quote:


> Originally Posted by *Thornet*
> 
> Anyone else with this issue using 120 hz monitors and tested with 60 hz?


I also have noticed overclocking issues related to a 120Hz monitor. I have not played with it too much, but noticed that I can't get past about 1075 with my monitor at 120Hz; if I drop the monitor to 60Hz I was benching at 1240 without black screens or artifacts. I am really hoping this is a driver issue!


----------



## Nevk

Quote:


> Originally Posted by *MotionBlur84*
> 
> Hi all!
> 
> My ASUS R9 290 first try:
> 
> 3DM11:
> 
> 
> 
> GPU load voltage: 1.23V
> Driver: 13.11 WHQL (unofficial)


R9 290 Series is really a powerful GPU...

Your score has surpassed my two 7970(1050/1500)CF...


----------



## smokedawg

Quote:


> Originally Posted by *jerrolds*
> 
> I am using a QNIX 1440p @ 120hz and have yet to encounter Blackscreen issues. Sapphire 290X w/ Asus.rom 1200mhz core 1500mhz memory, 1.385v (1.29-1.33 actual). I dont have 3 screens, just a 2ndary 1080p 60hz Plasma HDTV connected via HDMI.


Same here: QNIX 1440p @ 120Hz over dual-link DVI and a second screen at 1080p@60Hz over HDMI. Overclocked with AB Beta 17 +100mV to 1200/1500 - no black screen so far (PowerColor 290x with stock BIOS).


----------



## $ilent

Quote:


> Originally Posted by *jerrolds*
> 
> Usually maxing the power limit to 150% will prevent throttling, assuming your fan speed has a high enough ceiling.


Thanks, but that didn't fix it at all...

It's still doing this stupid thermal throttling crap.

I'm getting annoyed now


----------



## R35ervoirFox

Quote:


> Originally Posted by *jtjoetan*
> 
> Thanks! It really works!
> Proper airflow as in what? Right now I have a Noctua NH-D14 for my CPU, 1 fan on the top of the case, 1 fan on the front of the case, and another fan behind the case. But near the slots is where the heat is crazily exhausting...


You need to have proper unobstructed airflow behind the case; 2 inches blocked with cables isn't proper airflow. Use common sense








Quote:


> Originally Posted by *ontariotl*
> 
> Flashing from USB Dos.


Quote:


> Originally Posted by *jerrolds*
> 
> I tried Winflash - didnt work got an error saying "could not find discreet ATI video card" or something like that. I thought about doing it in Safe Mode...but ended up just finding an old USB stick and booting from there. Flashing to Asus.rom worked like a charm.
> 
> I ended up reinstalling drivers as well.


Thanks guys I will do some flashing tomorrow xD

Quote:


> Originally Posted by *jtjoetan*
> 
> Can anyone help me?
> My wall is getting pretty hot, but I don't want to increase the fan speed - it's freaking disturbing @_@. It's currently sitting at 30% now.


The fan should be no lower than 50% under full load, I figure. I set the target GPU temp to 92°C for the time being and the max fan to 60% or greater.

So I got my Sapphire R9 290 with an aftermarket Accelero Xtreme III, which I may install tomorrow if all my tests are positive. Got Hynix memory and an ASIC quality of 76.2%


----------



## pkrexer

Quote:


> Originally Posted by *$ilent*
> 
> Thanks, but that didn't fix it at all...
> 
> It's still doing this stupid thermal throttling crap.
> 
> I'm getting annoyed now


Make sure Overdrive is turned on in your CCC. Also make sure the power limit is at the full 50% in CCC.


----------



## jerrolds

Quote:


> Originally Posted by *R35ervoirFox*
> 
> 
> 
> Spoiler: Warning: Spoiler!


That's a good-looking card: ~77% ASIC w/ Hynix memory. Should get good OCing results - maybe 1230MHz on the core if you can keep your VRM temps down.


----------



## muhd86

Does the Sapphire R9 290 have dual BIOS? I am reading that people are flashing them with the Asus R9 290X BIOS to enable the extra shaders. Has anyone unlocked a Sapphire card with more shaders?


----------



## jerrolds

Quote:


> Originally Posted by *$ilent*
> 
> Thanks, but that didn't fix it at all...
> 
> It's still doing this stupid thermal throttling crap.
> 
> I'm getting annoyed now


That sucks... maybe it's just Folding@home? Have you tried other apps? Mine is steady at 150%... maybe add a bit of juice?


----------



## Lennyx

Quote:


> Originally Posted by *iGameInverted*
> 
> Just ordered the AquaComputer water block. Still haven't ordered any other parts for my PC yet. Will update when I receive the card. Wanted to order the water block to make sure it was here when I was ready to build. (Building my first water cooled PC). So much to order still.


Just make sure to add a drain port or something.
I just changed the card in my setup and it was not pretty. Luckily no water got on any components.

I'm waiting for some quick-disconnect fittings. That will make everything so much easier.

Time to clean the PC of Nvidia drivers and get the AMD stuff installed







Can't believe I waited a week to install the 290x


----------



## muhd86

In order to get surround to work on 3 Dell 30-inch LCDs on Nvidia, I used 3 dual-link DVI cables. What about Eyefinity?

I hook 2 dual-link DVI cables to the 1st GPU; for the 3rd, can I use a simple HDMI-to-DVI cable? Will it support the 30-inch monitor's native 2560x1600 res?

Or is the 3rd LCD only picked up via DisplayPort?


----------



## $ilent

How are you getting 150%, jerrolds? Mine only goes up to 50%, unless you mean it's not an extra 150%?


----------



## R35ervoirFox

Quote:


> Originally Posted by *jerrolds*
> 
> Thats a good looking card, ~77% ASIC w/ Hynix memory - should get good ocing results. 1230mhz on the core maybe if you can keep your VRMs temp down.


Thanks, hopefully it will overclock well. So far it's been perfect at stock across lots of benchmarks and games, except that the BioShock Infinite benchmark wouldn't launch; I don't think that's to do with the card though. Dunno how good the Accelero Xtreme III is at keeping the VRMs cool, but I will see soon









Quote:


> Originally Posted by *muhd86*
> 
> Does the Sapphire R9 290 have dual BIOS? I am reading that people are flashing them with the Asus R9 290X BIOS to enable the extra shaders. Has anyone unlocked a Sapphire card with more shaders?


Every card is reference, so all should have dual BIOS. I will try and unlock a Sapphire tomorrow, hopefully


----------



## Forceman

Quote:


> Originally Posted by *ontariotl*
> 
> So far I can confirm it is working with my XFX cards. I had to re-flash the second card as GPU-Z was reporting odd numbers, but now its working fine.
> 
> The only thing I have observed so far is that with the Asus BIOS, I had to up the voltage to 1.3V to run the Heaven benchmark at the same overclock I could get with the original XFX 290 BIOS (1100MHz GPU/1300 mem with no increase in voltage). With default 290X clocks, it was fine. Being watercooled they never throttle back, which may be why it wants more volts, but not by enough to call it a failure.
> 
> I've posted more info on another site as I didn't want to go back and forth here and there, so if it's ok I'll post a link instead.
> 
> http://linustechtips.com/main/topic/76456-success-xfx-r9-290-flashed-to-asus-r9-290x/
> 
> Off to do more testing. I should be asleep for my shift at work tonight, but this whole thing with the bios has me giddy like a kid in a candy store


Hmm. So that makes 3 XFX 290s with Hynix memory that unlocked the shaders. Anyone have an XFX with Elpida who can check? Anyone with a different-brand 290 with Hynix tried the Asus BIOS yet?
Quote:


> Originally Posted by *$ilent*
> 
> How are you getting 150%, jerrolds? Mine only goes up to 50%, unless you mean it's not an extra 150%?


It's the same either way; some programs show it as +50% and some show it as 150%.
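The two readouts describe the same setting, just in different conventions (an offset from 100% vs an absolute scale). A trivial sketch of the mapping; the function names here are made up for illustration:

```python
def offset_to_scale(offset_pct):
    """Afterburner-style power-limit offset (+50) -> absolute scale (150)."""
    return 100 + offset_pct

def scale_to_offset(scale_pct):
    """Absolute scale (150) -> offset (+50)."""
    return scale_pct - 100

print(offset_to_scale(50))    # 150
print(scale_to_offset(150))   # 50
```

So "+50%" and "150%" both mean the card may draw up to one and a half times its stock power budget.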


----------



## jerrolds

Quote:


> Originally Posted by *$ilent*
> 
> How are you getting 150%, jerrolds? Mine only goes up to 50%, unless you mean it's not an extra 150%?


It's probably the same: 0% in AB == 100% in GPU Tweak (where the max is 150%). I haven't flashed my Sapphire back to the stock BIOS and tried OCing with AB yet.


----------



## Sgt Bilko

$ilent, Make sure that CCC and AB both have the power limit set to +50%

It's working fine for me atm


----------



## R35ervoirFox

Quote:


> Originally Posted by *muhd86*
> 
> In order to get surround to work on 3 Dell 30-inch LCDs on Nvidia, I used 3 dual-link DVI cables. What about Eyefinity?
> 
> I hook 2 dual-link DVI cables to the 1st GPU; for the 3rd, can I use a simple HDMI-to-DVI cable? Will it support the 30-inch monitor's native 2560x1600 res?
> 
> Or is the 3rd LCD only picked up via DisplayPort?


If you have two cards then you can use all 8 ports, so yeah, whatever floats your boat









Regarding bandwidth, both the HDMI and DisplayPort outputs offer higher resolutions than the dual-link DVI ports; it has two dual-link DVI ports, and again all 4 ports can be used simultaneously


----------



## $ilent

Quote:


> Originally Posted by *Sgt Bilko*
> 
> $ilent, Make sure that CCC and AB both have the power limit set to +50%
> 
> It's working fine for me atm


So is your card not looking the same as mine in Afterburner? i.e. GPU usage up and down like a yo-yo?


----------



## Skylark71

Quote:


> Originally Posted by *Forceman*
> 
> Hmm. So that makes 3 XFX 290s with Hynix memory that unlocked the shaders. Anyone have a XFX with Elpida who can check? Anyone with a different brand 290 with Hynix tried the Asus BIOS yet?
> It's the same either way, some programs show it at +50% and some show it as 150%.


I have an Asus R9 290 with Elpida, 78.5% ASIC.
Willing to try flashing it with the Asus 290X BIOS. Are there instructions for flashing via USB?


----------



## ontariotl

Quote:


> Originally Posted by *Skylark71*
> 
> I have an Asus R9 290 with Elpida, 78.5% ASIC.
> Willing to try flashing it with the Asus 290X BIOS. Are there instructions for flashing via USB?


I have switched back to the XFX BIOS for now. I do see some improvement, but is it enough to justify possibly killing your card?
For now, I say no. But I did have fun flashing it and seeing those extra shaders and TMUs like the 290X has.

I don't mind being the guinea pig to find out if it's possible, but I'm not sure I want to be the one to say I killed my card.

I am curious to know if the board specs are exactly the same as the R9 290X (components on board).

I will say this though: the 290 is no slouch when overclocking, and with steady clocks it keeps up with its "X" brother quite well.

Here are the flash instructions that I used

http://forums.overclockers.co.uk/showthread.php?t=18552408


----------



## Gunderman456

Arizonian please add me (by the way great job on this thread). Two Gigabyte R9 290s just arrived!



Still waiting on waterblocks.

Build Log started here;

http://www.overclock.net/t/1442038/build-log-the-hawaiian-heat-wave#post_21177067


----------



## evensen007

Quote:


> Originally Posted by *Forceman*
> 
> Hmm. So that makes 3 XFX 290s with Hynix memory that unlocked the shaders. Anyone have a XFX with Elpida who can check? Anyone with a different brand 290 with Hynix tried the Asus BIOS yet?
> It's the same either way, some programs show it at +50% and some show it as 150%.


So I flashed my 2 XFX 290's during lunch to the Asus 290X BIOS. Both flashes took just fine, and I found something very strange in GPU-Z which I will look into more when I get home. When I first booted into Windows, I loaded up GPU-Z and it instantly said that the first selected GPU now had the unlocked shaders. I dropped down the box to choose my other XFX 290 and it only had the 2560 shaders. Then I switched back to the other 290 in the drop-down, and now THAT one also showed 2560 shaders...

It seems more like a bug in GPU-Z. It *IS* possible that one of my 290's did unlock the shaders and one didn't, but why would GPU-Z reset the shaders back to 2560 on the card that was showing 2816 when I first loaded into it?


----------



## Technewbie

Is this a good ASIC quality and does asic quality have any effect on the performance of the card?


----------



## jomama22

ASIC means little on these cards. The higher-ASIC ones do seem able to take more voltage, though. I've seen 68.x%-78.x%.


----------



## black7hought

If you are folding with your 290, which beta driver are you using? I keep getting lock ups with the multicolored screen while
folding with beta 9.2.


----------



## evensen007

Quote:


> Originally Posted by *Technewbie*
> 
> 
> 
> Is this a good ASIC quality and does asic quality have any effect on the performance of the card?


Allegedly, the lower ASIC bodes better for water cooling, and less so for air.


----------



## Sazz

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Call it a personal goal, 9k in Regular and 5k in Extreme
> 
> Speaking of which: http://www.3dmark.com/fs/1151742
> 
> i finally broke 9k.
> 
> EDIT: just broke 5k as well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/1151775


Nice one, dude! My goal was to reach 10k on mine and I was able to; so far it's the only 9.5k+ score on a pure AMD build.

http://www.3dmark.com/fs/1117688

Quote:


> Originally Posted by *Technewbie*
> 
> 
> 
> Is this a good ASIC quality and does asic quality have any effect on the performance of the card?


ASIC mostly gives you an idea of how good an overclocker the card is; a higher ASIC generally means higher OC potential. The ASIC description in GPU-Z is kind of outdated, because it says a high-ASIC card gets a lower OC under watercooling than a low-ASIC card. That's not the case now, since watercooling has improved a lot and even the VRMs get cooled, compared to back then when only the GPU core was cooled.

But at stock clocks there is no performance difference; maybe the voltage they use at stock clocks differs, since a high-ASIC card tends to use less voltage at the same clocks than a low-ASIC card.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Sazz*
> 
> Nice one, dude! My goal was to reach 10k on mine and I was able to; so far it's the only 9.5k+ score on a pure AMD build.
> 
> http://www.3dmark.com/fs/1117688


Why thank you









I saw that one posted in the Hawaii vs GK110 thread; very nice score there :thumb:. I'd need to be under water to try and hit that.

But I'm happy for now; maybe when I get another 290x I'll aim a little higher


----------



## youra6

Quote:


> Originally Posted by *evensen007*
> 
> So I flashed my 2 XFX 290's during lunch to the Asus 290X BIOS. Both flashes took just fine, and I found something very strange in GPU-Z which I will look into more when I get home. When I first booted into Windows, I loaded up GPU-Z and it instantly said that the first selected GPU now had the unlocked shaders. I dropped down the box to choose my other XFX 290 and it only had the 2560 shaders. Then I switched back to the other 290 in the drop-down, and now THAT one also showed 2560 shaders...
> 
> It seems more like a bug in GPU-Z. It *IS* possible that one of my 290's did unlock the shaders and one didn't, but why would GPU-Z reset the shaders back to 2560 on the card that was showing 2816 when I first loaded into it?


Exact same thing happened to me. Scores in Valley and 3dmark 11 proved otherwise. It looks like its just a GPU-Z bug as you said. If you revert back to stock 290 BIOS, it will do the same thing.


----------



## DampMonkey

Quote:


> Originally Posted by *Sazz*
> 
> Nice one, dude! My goal was to reach 10k on mine and I was able to; so far it's the only 9.5k+ score on a pure AMD build.
> 
> http://www.3dmark.com/fs/1117688


Your pure amd build beat my pure amd build. Good job passing 10k!

http://i.imgur.com/8MOaWrq.jpg


----------



## motorwayne

I have a dilemma. I want to change to AMD because I like what I'm hearing in terms of Mantle and TrueAudio, and the innovation coming out of that corner.

Current GPU is a GTX 570 old but goes ok.

I have to buy a R9 290X in the next two days from the USA as I have a mate there at the moment on business and buying one in the US is 60% of the price I would pay here in New Zealand. The issue I have is the R9 290X seems filled with problems and issues both stock and when people try to overclock it. I am also grabbing a 120hz monitor in the next week and see people having issues with the Qnix 2560 x 1440.

I'm trying to decide between the R9 290X and the EVGA GTX 780 Classified (I know its a dirty word in here) and I'm flip flopping between the two. I really want to get the R9 but not sure the benefits are there right now and the aftermarket makers haven't released their version yet.

What do you think? Buy the 780 and sell it here in NZ for a profit and buy the R9 when aftermarket R9's come out over the next 6 months or suck the kumera and get the R9 now?

Cheers from NZ

Motorwayne


----------



## HughhHoney

Quote:


> Originally Posted by *Jpmboy*
> 
> I see vrm1 running hotter also. (Max on h2o is 56C) I think 125c is the spec limit, not in the nominal operating range. Generally, 80% of the spec limit is okay. But verify that limit in the IR documentation... It's available on-line.


Are you saying you have max VRM temps of 56C on water?

I'm on water with the EK copper acrylic block and my VRM temps are much higher. I'm pretty much positive that this block doesn't actually cool the VRMs after looking at it.

If anybody knows a waterblock with better VRM cooling please let me know. I've been looking around online, but none of them look like they actively cool the VRM area on the 290x.


----------



## muhd86

Quote:


> Originally Posted by *R35ervoirFox*
> 
> if you have two cards then you can use all 8 ports so ye whatever floats your boat
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As regards bandwidth, both HDMI and DisplayPort offer higher resolutions than the dual-link DVI ports; it has two dual-link DVI ports, and again all 4 ports can be used simultaneously


What I meant to say was: I connect the 2 LCDs to the DVI ports (I have dual-link cables), and I connect an HDMI-to-DVI cable for the 3rd LCD because the Dell 3007WFP models only have DVI connectors.
Can HDMI-to-DVI support the native 2560x1600 resolution of a 30-inch LCD?


----------



## muhd86

Waiting for people with Sapphire cards who want to flash to the Asus BIOS to unlock more shaders. This is very interesting indeed. Can someone explain how to flash the 290 to a 290X BIOS?


----------



## Jpmboy

Quote:


> Originally Posted by *HughhHoney*
> 
> Are you saying you have max VRM temps of 56C on water?
> 
> I'm on water with the EK copper acrylic block and my VRM temps are much higher. I'm pretty much positive that this block doesn't actually cool the VRMs after looking at it.
> 
> If anybody knows a waterblock with better VRM cooling please let me know. I've been looking around online, but none of them look like they actively cool the VRM area on the 290x.


Yes, I have not seen a higher max temp recorded in GPU-Z. That's through Valley, Firestrike, etc. at my max OC, with unlocked voltage set to 1.412V, which droops to 1.28-1.3V.

I also check the PCB backside with an IR thermometer: <50°C.

Either get yourself Fujipoly pads, or use a small amount of TIM on the EK pads. Makes a really big difference.

Nickel/acrylic on mine, no difference.


----------



## HughhHoney

Quote:


> Originally Posted by *Jpmboy*
> 
> Yes, I have not seen a higher max temp recorded in GPU-Z. That's through Valley, Firestrike, etc. at my max OC, with unlocked voltage set to 1.412V, which droops to 1.28-1.3V.


What block do you have?


----------



## youra6

Quote:



> Originally Posted by *muhd86*
> 
> Waiting for people with Sapphire cards who want to flash to the Asus BIOS to unlock more shaders. This is very interesting indeed. Can someone explain how to flash the 290 to a 290X BIOS?


Follow this guide word for word and you shouldn't have any problems:

http://forums.overclockers.co.uk/showthread.php?t=18552408


----------



## HughhHoney

Quote:


> Originally Posted by *HughhHoney*
> 
> What block do you have?


Sorry, I only saw the first line of your post.


----------



## pkrexer

I flashed my Sapphire 290 to a Sapphire 290X BIOS last night and it still showed the same number of shaders.


----------



## ihatelolcats

Subscribing because I'm interested in the unlock thing.


----------



## Jpmboy

Quote:


> Originally Posted by *HughhHoney*
> 
> Sorry only saw the first line of your post.


Cool. Always put a non-conductive TIM on both contact surfaces before using a thermal pad: a very tiny amount on the VRM, and you can thin-coat the contact surface on the water block.


----------



## evensen007

Quote:


> Originally Posted by *HughhHoney*
> 
> Are you saying you have max VRM temps of 56C on water?
> 
> I'm on water with the EK copper acrylic block and my VRM temps are much higher. I'm pretty much positive that this block doesn't actually cool the VRMs after looking at it.
> 
> If anybody knows a waterblock with better VRM cooling please let me know. I've been looking around online, but none of them look like they actively cool the VRM area on the 290x.


I have the EK Nickel/Plexi CSQ blocks and my VRM temps never breach 52-55°C during Valley Extreme. I used a small amount of the ectotherm on each RAM and VRM chip before placing the included thermal pads on them.


----------



## HughhHoney

Quote:


> Originally Posted by *Jpmboy*
> 
> Cool. Always put a non-conductive TIM on both contact surfaces before using a thermal pad: a very tiny amount on the VRM, and you can thin-coat the contact surface on the water block.


Thanks a lot I'll try that out.

I was pretty surprised/disappointed in my block after seeing the VRM temps. Hopefully this helps.


----------



## jerrolds

Quote:


> Originally Posted by *HughhHoney*
> 
> Are you saying you have max VRM temps of 56C on water?
> 
> I'm on water with the EK copper acrylic block and my VRM temps are much higher. I'm pretty much positive that this block doesn't actually cool the VRMs after looking at it.
> 
> If anybody knows a waterblock with better VRM cooling please let me know. I've been looking around online, but none of them look like they actively cool the VRM area on the 290x.


How high are your VRMs getting? VRM1 for me can easily hit 95°C at 1200MHz core, 1.3V actual, using an aftermarket cooler with RAM sinks. It kinda sucks.


----------



## Jpmboy

Quote:


> Originally Posted by *HughhHoney*
> 
> Thanks a lot I'll try that out.
> 
> I was pretty surprised/disappointed in my block after seeing the VRM temps. Hopefully this helps.


It should make a significant difference. Go slow, be patient. It can be tedious seating the pads correctly.


----------



## Arizonian

Quote:


> Originally Posted by *Gunderman456*
> 
> Arizonian please add me (by the way great job on this thread). Two Gigabyte R9 290s just arrived!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Still waiting on waterblocks.
> 
> Build Log started here;
> 
> http://www.overclock.net/t/1442038/build-log-the-hawaiian-heat-wave#post_21177067


Congrats - added


----------



## jomama22

So just a heads up: of the 9 290Xs I've tried, only 2 will reach 1300+ at 1.35V actual on water. The other 7 can't hit 1300 stable no matter the voltage, as they crap out around 1.375V actual peak and just crash.

So it's looking like roughly a 20% chance of getting a good chip for benching. I'll say, though, some of those chips would hit 1280 @ 1.3V actual (the max on asus.rom), which is awesome; they just can't handle the volts.

Memory-wise, Hynix takes the cake. 6/7 Elpida cards can't get over 1625 no matter what, while the 3 Hynix all hit 1725 minimum.
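For what it's worth, if you treat each card as an independent draw with that 2-in-9 hit rate (a simplifying assumption on my part, not anything claimed above), the odds of landing at least one good clocker in a multi-card order work out like this:

```python
# Back-of-envelope sketch: assumes each card is an independent draw
# with a 2/9 (~22%) chance of being a "good" 1300MHz+ bencher,
# per the 2-out-of-9 sample above.
p_good = 2 / 9

def chance_of_at_least_one(n_cards):
    """P(at least one good chip) = 1 - P(all n cards are duds)."""
    return 1 - (1 - p_good) ** n_cards

for n in (1, 2, 4):
    print(f"{n} card(s): {chance_of_at_least_one(n):.0%}")
```

Under that assumption, even a four-card order is only about a 63% shot at one good bencher, which is why binning nine of them is heroic.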


----------



## Poyri

Finally got mine. I'll get a water block in a week or two.



I can't control the voltage on this card; do I need to do something?


----------



## evensen007

Quote:


> Originally Posted by *HughhHoney*
> 
> Thanks a lot I'll try that out.
> 
> I was pretty surprised/disappointed in my block after seeing the VRM temps. Hopefully this helps.


Also, don't overlook the 2nd VRM strip. In the manual that came with mine, it was very hard to see in black and white.


----------



## Jpmboy

Quote:


> Originally Posted by *jomama22*
> 
> So just a heads up. Only found 2 out of 9 290xs will reach 1300+ at 1.35v actual on water. The other 7 can't hit 1300 stable no matter voltage as they crap out around 1.375v actua pk and just crash.
> 
> So its looking like a 20% chance of getting a good chip for benching. I'll say though, some of those chips would hit 1280 @ 1.3v actual (max on asus.rom)which is awesome, they just can't handle the volts.
> 
> Memory wise, hynix takes the cake. 6/7 elpidas can't get over 1625 no matter what, while the 3 hynix all hit 1725 minimum.


Unfortunately, I have one of the 80%.









It can hold 1300 only sporadically, and not reproducibly. Bummed.

Dude bins 9 R9 290Xs... Hero!


----------



## Sgt Bilko

Thanks for the info Jomama, good to know


----------



## jerrolds

Quote:


> Originally Posted by *Jpmboy*
> 
> Cool. Always put a non-conductive TIM on both contact surfaces before using a thermal pad: a very tiny amount on the VRM, and you can thin-coat the contact surface on the water block.


Interesting, my VRM temps are pretty bad; right now it's using thermal tape. But I do have thermal pads. Do you think that if I put a speck of Chill Factor 3 (the website says it's not electrically conductive: "CFIII does not contain any metal or other electrically conductive materials.") on both sides of the pads, it would improve cooling?

I just wonder if the RAM sinks will stick... but I guess I can always test on a scrap piece.


----------



## HughhHoney

Quote:


> Originally Posted by *jerrolds*
> 
> How high are your VRMs getting? VRM1 for me can easily hit 95C at 1200mhz core 1.3v actual using an aftermarket cooler w/ ram sinks. It kinda sucks.


At around 1.3-1.35V they get up to around 90°C looping Heaven. Usually lower in games, maybe 75-80°C.

I've seen them get up to 100-105°C running Furmark at 1200MHz before I killed it.
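Those readings sit right at the edge of the 80%-of-spec rule of thumb Jpmboy mentioned earlier in the thread. A quick sanity check (the 125°C spec limit is his recollection, worth verifying against the IR documentation, so treat it as an assumption):

```python
# Rule-of-thumb check, not a datasheet value: the thread's assumed
# 125C VRM spec limit, derated to 80% per Jpmboy's guideline.
SPEC_LIMIT_C = 125
derated_limit_c = 0.8 * SPEC_LIMIT_C  # 100C working ceiling

# Temps reported in this post, by workload.
readings = {"Heaven loop": 90, "games": 80, "Furmark": 105}
for workload, temp_c in readings.items():
    verdict = "within" if temp_c <= derated_limit_c else "over"
    print(f"{workload}: {temp_c}C is {verdict} the {derated_limit_c:.0f}C derated limit")
```

By that yardstick the Heaven and gaming numbers are inside the working ceiling, while the Furmark run blows past it, which matches the advice that follows.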


----------



## HughhHoney

Quote:


> Originally Posted by *evensen007*
> 
> Also, don't overlook the 2nd VRM strip. In my manual that came with mine, it was very hard to see in black and white.


Thanks.. those temps are actually alright.

I don't think I've seen over 50s/60s. Certainly nothing alarming.

I'm still going to replace that pad and add a bit of thermal paste when I redo the rest.


----------



## GioV

Quote:


> Originally Posted by *jomama22*
> 
> So just a heads up. Only found 2 out of 9 290xs will reach 1300+ at 1.35v actual on water. The other 7 can't hit 1300 stable no matter voltage as they crap out around 1.375v actua pk and just crash.
> 
> So its looking like a 20% chance of getting a good chip for benching. I'll say though, some of those chips would hit 1280 @ 1.3v actual (max on asus.rom)which is awesome, they just can't handle the volts.
> 
> Memory wise, hynix takes the cake. 6/7 elpidas can't get over 1625 no matter what, while the 3 hynix all hit 1725 minimum.


Thanks for all the info on binning, very interesting indeed. Do you know if ASIC scores really matter? Which brands have Hynix chips and was it the BF4 edition?


----------



## $ilent

Yomamma how did they compare in terms of asic quality?


----------



## Jpmboy

Quote:


> Originally Posted by *HughhHoney*
> 
> at around 1.3-1.35v they get up around 90C looping heaven. Usually lower in games.. maybe 75-80.
> 
> I've seen them get up to 100-105C running furmark at 1200Mhz before I killed it.


Delete your copy of Furmark... and never use it on your GPU.









Quote:


> Originally Posted by *jerrolds*
> 
> Interesting, my VRM temps are pretty bad - right now its using thermal tape. But i do have tpads, you think if i put a speck of Chill Factor 3 (website says its not electrically conductive "CFIII does not contain any metal or other electrically conductive materials." on both sides of the pads it would improve cooling?
> 
> I just wonder if the ram sinks will stick...but i guess i can always test on a scrap peice


I think you need to use tape or the heatsinks fall off, right?


----------



## selk22

Thanks for the info Jomama!







A lot of good work.

Unfortunately I did get stuck with the Elpida memory. Oh well, I mostly game, and if I can see 1200/1500 under water I will be happy.


----------



## WuLF

So would it matter if when I receive this card, I unmount the cooler and replace the tim/clean stuff up? Do you think that would make a difference in stock cooler performance? Or is it really just not worth it?


----------



## Jpmboy

Quote:


> Originally Posted by *WuLF*
> 
> So would it matter if when I receive this card, I unmount the cooler and replace the tim/clean stuff up? Do you think that would make a difference in stock cooler performance? Or is it really just not worth it?


Can't hurt, but I don't think it will help with the stock air cooler. Let us know what you find out.


----------



## Raxus

Quote:


> Originally Posted by *GioV*
> 
> Thanks for all the info on binning, very interesting indeed. Do you know if ASIC scores really matter? Which brands have Hynix chips and was it the BF4 edition?


Seems like the memory was scattered, every brand came up with both.


----------



## Jpmboy

Here's my water-cooled R9 290X in the bench rig, while SLI Titans sit and watch... So far, they are not worried.


----------



## VSG

Wait so applying TIM on the VRMs/Vram before a thermal pad made a lot of difference to people? The guys at EK said TIM is only worth it if you would keep removing and applying the block. I just installed Fujipoly extreme pads which aren't very easy to remove and reapply so I am unsure if I should remove everything and put in some TIM.


----------



## FtW 420

Quote:


> Originally Posted by *Jpmboy*
> 
> Delete your copy of furmark... And never use that on your gpu.
> 
> 
> 
> 
> 
> 
> 
> 
> *I think you need to use tape or the HS falls off - right?*


Yes, I tried with a few kinds of TIM before with ramsinks & while it might stick when cold, they tend to fall off when the mosfets heat up.

Also yes to deleting furmark


----------



## Jpmboy

Quote:


> Originally Posted by *geggeg*
> 
> Wait so applying TIM on the VRMs/Vram before a thermal pad made a lot of difference to people? The guys at EK said TIM is only worth it if you would keep removing and applying the block. I just installed Fujipoly extreme pads which aren't very easy to remove and reapply so I am unsure if I should remove everything and put in some TIM.


If you read back, you probably don't need TIM with Fujipoly.


----------



## HughhHoney

Quote:


> Originally Posted by *Jpmboy*
> 
> Delete your copy of furmark... And never use that on your gpu.


Good advice... I actually did get rid of it already.

I used to use it to test my loop's capability since it has such a low CPU impact I can run cpu stress tests at the same time.


----------



## Jpmboy

Quote:


> Originally Posted by *FtW 420*
> 
> Yes, I tried with a few kinds of TIM before with ramsinks & while it might stick when cold, they tend to fall off when the mosfets heat up.
> 
> Also yes to deleting furmark


And FtW means *COLD*


----------



## VSG

Quote:


> Originally Posted by *Jpmboy*
> 
> If you read back, you prolly dont need tim with fujipoly.


I thought I read otherwise, but I guess I was mistaken. Thanks!


----------



## Xylene

Just picked up the XFX 290. Pictures and unlock (maybe) to follow.


----------



## FtW 420

Quote:


> Originally Posted by *Jpmboy*
> 
> And FtW means *COLD*


I actually meant room temperature there when testing, if the ramsinks had stayed on the mosfets when the card was used it would have gone cold. The TIM is just thicker & stickier at room temp than it is when the card is in use & warms up.
I have heard that TIM can be mixed with glue to hold them on, but at that point it would be like using thermal adhesive.


----------



## galil3o

I just got my Sapphire 290X in the mail, plugged it in, reinstalled Windows with the new 13.11 beta 9.2 driver, and fired up 3DMark to see what it could do.

I'm getting a high-pitched whining sound in the first few tests that goes away as 3DMark moves through its tests.

Has anyone else experienced something similar? I'm guessing it's a capacitor that's about to pop...


----------



## esqueue

Quote:


> Originally Posted by *HughhHoney*
> 
> Are you saying you have max VRM temps of 56C on water?
> 
> I'm on water with the EK copper acrylic block and my VRM temps are much higher. I'm pretty much positive that this block doesn't actually cool the VRMs after looking at it.
> 
> If anybody knows a waterblock with better VRM cooling please let me know. I've been looking around online, but none of them look like they actively cool the VRM area on the 290x.


I have an EK copper acrylic block connected to an old '77 Pontiac Bonneville heater core.

Three days ago it was 99°F (37.2°C) in my bedroom, and running Valley around 1200/1450 on stock voltage never got my VRM past 55°C. That is the highest I've seen it, too. The GPU remained in the mid to low 40s.
Did you make sure to cut the thermal pad for the three in the top corner, so the pad isn't sitting on any resistors or other circuitry? I had to make a "T"-shaped piece to avoid contact with other components. I'm sure that if I hadn't done that, it would have kept the block from making good contact.
Quote:


> Originally Posted by *Jpmboy*
> 
> Cool. Always put a non conductive tim on both contact surfaces before using a t-pad. A very tiny amt on the vrm, you can thin coat the contact surface on the water block.


I failed to do this; fortunately, my temps aren't bad at all.
Quote:


> Originally Posted by *galil3o*
> 
> I'm getting a high pitched whining sound in the first few tests that goes away as 3dmark moves through its tests.
> 
> Has anyone else experienced something similar? I'm guessing its a capacitor thats about to pop...


Sounds like coil whine. Search for that and see how many of us have it.


----------



## selk22

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *galil3o*
> 
> I just got my sapphire 290x in the mail, plugged it in, reinstalled windows with the new 13.11 beta 9.2 driver and fired up 3d mark to see what it could do.
> 
> I'm getting a high pitched whining sound in the first few tests that goes away as 3dmark moves through its tests.
> 
> Has anyone else experienced something similar? I'm guessing its a capacitor thats about to pop...






Most likely it's coil whine, and some of the 290 series seems to be having issues with it. Mine has also coil-whined a couple of times here and there, but I can't seem to replicate it or figure out what causes it. It has not whined since I added more volts... yet.


----------



## Xylene

Quote:


> Originally Posted by *galil3o*
> 
> I just got my sapphire 290x in the mail, plugged it in, reinstalled windows with the new 13.11 beta 9.2 driver and fired up 3d mark to see what it could do.
> 
> I'm getting a high pitched whining sound in the first few tests that goes away as 3dmark moves through its tests.
> 
> Has anyone else experienced something similar? I'm guessing its a capacitor thats about to pop...


Just installed my XFX 290 and I have a bit of a buzz, but it doesn't change with fan speed.


----------



## galil3o

Quote:


> Originally Posted by *Xylene*
> 
> Just installed my XFX 290 and I have a bit of a buzz, but doesn't change with fan speed.


Same for me. Oddly, it doesn't do it when I fire up BF4 or in the more graphically intense tests towards the end of the 3DMark benchmark, just in the initial one where I'm getting 1700 FPS.


----------



## jomama22

Quote:


> Originally Posted by *$ilent*
> 
> Yomamma how did they compare in terms of asic quality?


Quote:


> Originally Posted by *GioV*
> 
> Thanks for all the info on binning, very interesting indeed. Do you know if ASIC scores really matter? Which brands have Hynix chips and was it the BF4 edition?


ASIC means very little. The only thing I can see is that higher-ASIC cards can hold higher voltage (1.375V actual) and hold it just a bit better (but with more heat). I will say my best two clockers are 76.8% and 77.2%, but that only matters once you go over the 1.34-1.35V range. I have some that do awesome up to 1290 @ 1.35V and then just refuse to accept more volts. I also have cards that can take voltage like no other... but only make it to 1250 @ 1.41V. It's all luck of the draw.

Also, those two, the only ones able to actually go over 1320, both hit ~1350 max in FSE and 1365 in Valley.

It's luck of the draw for Hynix/Elpida. The AIB doesn't matter. There is way more Elpida out there than Hynix.


----------



## Xylene

Any idea how I can flash/dump the BIOS on this card in Windows, like I was able to with my 7950? I tried dumping with GPU-Z; the video cut out and the machine hung.


----------



## Mr357

Quote:


> Originally Posted by *jomama22*
> 
> Its luck of the draw for hynix/elpida. AIB doesn't matter. There is way more elpida out there then hynix.


I can vouch for this. As I have said before, mine (Elpida) will do an astounding 1700MHz stable on the PT1 BIOS, and possibly higher on the ASUS stock BIOS.


----------



## Sgt Bilko

From what I understand, coil whine occurs when you have very high FPS.

I get it in DiRT: Showdown and a couple of menus, but I wear a headset, and it seems to go away at 100% fan speed, for me anyway.


----------



## Forceman

Quote:


> Originally Posted by *Xylene*
> 
> Any idea how I can flash/dump the BIOS on this card in Windows like I was able to with my 7950? I tried dumping with GPU-Z and video cut and the machine hung..


You can use ATIFlash to dump it. I think it's the -s switch, but I'm not positive.


----------



## Amhro

Oh come on, I want a non-reference 290 already.


----------



## Kokin

Quote:


> Originally Posted by *galil3o*
> 
> I just got my sapphire 290x in the mail, plugged it in, reinstalled windows with the new 13.11 beta 9.2 driver and fired up 3d mark to see what it could do.
> 
> I'm getting a high pitched whining sound in the first few tests that goes away as 3dmark moves through its tests.
> 
> Has anyone else experienced something similar? I'm guessing its a capacitor thats about to pop...


It is coil whine and is common to hear when your gpu is pushing hundreds of FPS. If you don't hear it when gaming you should be okay.


----------



## Skylark71

Quote:


> Originally Posted by *Forceman*
> 
> Hmm. So that makes 3 XFX 290s with Hynix memory that unlocked the shaders. Anyone have a XFX with Elpida who can check? Anyone with a different brand 290 with Hynix tried the Asus BIOS yet?
> It's the same either way, some programs show it at +50% and some show it as 150%.


Quote:


> Originally Posted by *Skylark71*
> 
> I have Asus R9 290 with Elpida, 78.5% asic.
> Willing to try flashing it with Asus 290X bios. Is there instructions for flashing via usb?


OK, flashed. Everything went fine, but no unlocking; still 2560 shaders...









What do you think, is it better to reflash the original BIOS?


----------



## kcuestag

Quote:


> Originally Posted by *Mr357*
> 
> I can vouch for this. As I have said before, mine (Elpida) will do an astounding 1700MHz stable on the PT1 BIOS, and possibly higher on the ASUS stock BIOS.


I have to ask, what's all this PT1 BIOS talk? Is it a special BIOS? I want to know because I think I can't hit above 1400-1500 on the memory with the default Sapphire BIOS.









If I set the memory to 1400, all is good; if I set 1500, then on the desktop, as soon as I apply it, it starts flickering and showing lots of horizontal lines like crazy. Why?


----------



## Xylene

Any of you guys having issues dumping the BIOS in GPU-Z? I want to make a back up incase my flash goes wrong (going to need to do the boot CD method because I don't have a flash drive right now).


----------



## jerrolds

Quote:


> Originally Posted by *kcuestag*
> 
> I have to ask, what's all this PT1 BIOS talking? IS it a special BIOS? I want to know, because I think I can't hit above 1400-1500 on the Memory with the default Sapphire bios.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If I set Memory to 1400, all is good, if I do 1500, on desktop, as soon as I apply, it starts flickering and having lots of horizontal lines like CRAZY, why?


I accidentally discovered that somehow the memory frequency is tied to the GPU voltage. When setting up a new profile in GPU Tweak I set the memory to 1600MHz (something I knew it could do), but the screen went all crazy and I had to hit default. When I cranked up the voltage +100mV, the memory was able to hit 1600MHz.

So that's something you can try.


----------



## Forceman

Quote:


> Originally Posted by *jerrolds*
> 
> I accidentally realized that somehow the memory frequency is tied to the gpu voltage. When setting up a new profile in GPU Tweak i set the memory to 1600mhz (something i knew it could do) but the screen went all crazy and i had to hit default. When i cranked up the voltage +100mV, the memory was able to hit 1600mhz.
> 
> So thats something you can try.


I saw the same thing. Crazy lines at stock voltage, but bumping the core voltage fixed it. Probably an issue with the memory controller and not the memory itself.


----------



## kcuestag

Quote:


> Originally Posted by *jerrolds*
> 
> I accidentally realized that somehow the memory frequency is tied to the gpu voltage. When setting up a new profile in GPU Tweak i set the memory to 1600mhz (something i knew it could do) but the screen went all crazy and i had to hit default. When i cranked up the voltage +100mV, the memory was able to hit 1600mhz.
> 
> So thats something you can try.


You're right, but I also heard some BIOSes make Elpida memory OC better? If so, which BIOS should I install on my card?


----------



## youra6

Quote:


> Originally Posted by *Xylene*
> 
> Any idea how I can flash/dump the BIOS on this card in Windows like I was able to with my 7950? I tried dumping with GPU-Z and video cut and the machine hung..


Are you using the latest version?


----------



## alancsalt

Quote:


> Originally Posted by *HoZy*
> 
> Why do people even bother with regular firestrike?
> 
> Run firestrike extreme if you are really interested in it.


Maybe because OCN has a Firestrike thread but not a FireStrike Extreme thread.

On HWbot only FireStrike Extreme earns points.


----------



## Xylene

Yes. Also, I found a way to make a USB drive that is Win98/DOS bootable, and I got everything set up on there, but I still get "adapter not found" errors.


----------



## kcuestag

What's the difference between the PT1 and PT3 BIOS? I want to try them; what are the advantages over my stock Sapphire BIOS?


----------



## Xylene

Got it to flash with another guide, but no video. Lulz, awesome. Gotta figure out how to get the old BIOS back. Thank god for the dual-BIOS switch.


----------



## estens

Quote:


> Originally Posted by *HughhHoney*
> 
> Are you saying you have max VRM temps of 56C on water?
> 
> I'm on water with the EK copper acrylic block and my VRM temps are much higher. I'm pretty much positive that this block doesn't actually cool the VRMs after looking at it.
> 
> If anybody knows a waterblock with better VRM cooling please let me know. I've been looking around online, but none of them look like they actively cool the VRM area on the 290x.


I have the same block and had problems with high VRM temps also. I could see with my own eyes that the block didn't make contact with the VRMs.

I measured the remaining thermal pads and they were 0.5mm. So I only got 0.5mm thermal pads with my block, but one strip should have been 1mm.
Maybe that is your case also?


----------



## Newbie2009

Quote:


> Originally Posted by *kcuestag*
> 
> What's the difference between PT1 and PT3 BIOS? I want to try them, what are the advantages of those against my stock Sapphire bios?


I think PT3 is an LN2 BIOS. Very risky.


----------



## r0l4n

Released beta2.
Quote:


> Originally Posted by *r0l4n*
> 
> Hi!
> 
> I've created a small windows application that wraps the VRM console commands towards MSI Afterburner (link and link).
> 
> Features:
> -enables/disables LLC
> -applies a VDDC offset between 0v and +0.3v
> -loads profiles on load and/or on idle if desired
> -applies these *only* to those processes in an input list
> -stores settings to file so one doesn't have to enter them every time
> -minimizes to system tray
> 
> Requires:
> -R9 290/290X
> -Microsoft .NET 4.0
> -MSI Afterburner beta 16 or newer
> 
> I've tested it myself (I'm running a Gigabyte R9 290) and it just works, but *I will not take, of course, any blame if your Windows crashes, your GPU gets fried, etc*. Enabling LLC adds +0.1v straight away, so you can get yourself an idea of what you are putting your GPU through.
> 
> Both files included in the .zip file need to be in the same location. If you have UAC enabled, you may need to run it as administrator.
> 
> CHANGELOG:
> beta2:
> -fixed bug that prevented setting the correct path to MSIAfterburner.exe
> 
> ISSUES:
> -a user reported black screens. I have only been able to reproduce this by setting the tool to apply AB profiles with a high memory overclock.
> 
> Please report any bugs to me in a private message.
> 
> ****USE AT YOUR OWN RISK****
> 
> ABAutoProfileVoltMod.exe_beta2.zip 314k .zip file
> 
> ****USE AT YOUR OWN RISK****


----------



## crazycrave

Will this get me in the club?

http://img.photobucket.com/albums/v400/crazycrave/IMG_1372.jpg


----------



## givmedew

Quote:


> Originally Posted by *Kokin*
> 
> Quote:
> 
> 
> 
> Originally Posted by *galil3o*
> 
> I just got my sapphire 290x in the mail, plugged it in, reinstalled windows with the new 13.11 beta 9.2 driver and fired up 3d mark to see what it could do.
> 
> I'm getting a high pitched whining sound in the first few tests that goes away as 3dmark moves through its tests.
> 
> Has anyone else experienced something similar? I'm guessing its a capacitor thats about to pop...
> 
> 
> 
> It is coil whine and is common to hear when your gpu is pushing hundreds of FPS. If you don't hear it when gaming you should be okay.

Correct,

I would like to add that you should be OK no matter what. Not everyone can even hear it, but it is annoying to the people who can. If you get a crappy overclocker, I would look to exchange it on the grounds that it makes those wretched noises; if it is a good overclocker, I would just deal with it. I have had motherboards, power supplies, and several video cards that do this. Sometimes it is a combination of certain parts: my Sapphire 2GB 5870 causes my KingWin PF-850 (known for whining) to make noises on startup and shutdown, but no other card I have makes that PSU make any noises. When it was connected to my RIIIE motherboard it would always whine.

So if you can, just deal with it, but if it is a bad overclocker then I think you have a good enough reason to request an exchange from whoever you got it from.


----------



## Xylene

Looking like a no-go on my 290-to-290X trial. I can flash a 290X BIOS and boot, and it shows a default clock of 1000MHz (but I have no video until Windows loads), but still the same shader count.


----------



## skupples

After seeing someone mistakenly flash a 780 Ti with a Titan BIOS and have it read only 2688 cores (it should read 2880), I'm guessing it can go both ways. Just a thought for those thinking they have unlocked the defective/fused-off sections of their 290s.


----------



## Taint3dBulge

Quote:


> Originally Posted by *galil3o*
> 
> I just got my sapphire 290x in the mail, plugged it in, reinstalled windows with the new 13.11 beta 9.2 driver and fired up 3d mark to see what it could do.
> 
> I'm getting a high pitched whining sound in the first few tests that goes away as 3dmark moves through its tests.
> 
> Has anyone else experienced something similar? I'm guessing its a capacitor thats about to pop...


Mine is super loud. It's quieted down a little bit, but not enough to not notice. I can hear it across the house when I have something running. It also doesn't matter what FPS a game or a bench is running: at 40 FPS it's just a slower ticking noise, at 140 FPS it's a screeching, screaming, loud-as-hell sound. Sucks, big time. I really wanted to put an aftermarket cooler on it, but now I'm just going to wait till the 25th to RMA it. I guess if you say you want to RMA it because of coil whine, they won't let you. So luckily one of my DVIs has a flicker every now and again, and well, I can't have that now, can I?

Quote:


> Originally Posted by *givmedew*
> 
> Correct,
> 
> I would like to add that you should be OK no matter what. Not everyone can even hear it, but it is annoying to the people who can. If you get a crappy overclocker, I would look to exchange it on the grounds that it makes those wretched noises; if it is a good overclocker, I would just deal with it. I have had motherboards, power supplies, and several video cards that do this. Sometimes it is a combination of certain parts: my Sapphire 2GB 5870 causes my Kingwin PF-850 (known for whining) to make noises on startup and shutdown, but no other card I have makes that PSU make any noise. When it was connected to my RIIIE motherboard, though, it would always whine.
> 
> So if you can, just deal with it, but if it is a bad overclocker then I think you have a good enough reason to request an exchange from whoever you got it from.


If your PSU is making noises every now and again, that might mean it's ready to pop... that just can't be right. For your PSU to do it only when that card or mobo is plugged in, there has to be something going on there. Maybe just an unhappy coil.


----------



## skupples

Coil whine is pretty much par for the course when it comes to AMD GPUs, and for some PSUs... It rarely means something is about to "pop" unless you are on an extremely low-wattage or low-build-quality unit.

Many companies will accept RMA requests for coil whine, though most of them will tell you that they can't guarantee the next one won't do it.

If you are lucky, or old enough, you stop hearing it.


----------



## mojobear

Hey guys, I'm looking into getting 3 R9 290s and have read through most of the comments, but I can't get a sense of whether the black screen issue has been solved, or whether it is hardware- or software-based.

Before I pull the trigger on $2,000 worth of gear (including waterblocks), I just want to make sure I won't be disappointed with my purchase.


----------



## the9quad

To be fair, even GTX 780s have coil whine; it is not exclusive to AMD.

Also, mojobear, I am getting another 290X tomorrow, so if it crashes I'd say it's more than likely something driver/software-based on the user end; I'll let you know. The odds of me getting two bum cards in a row would be pretty crazy. My gut feeling is that it's either mobos not supplying enough power to the PCIe slots for these cards, or bad RAM on some cards. So in short, if tomorrow's card crashes, I am going to see if I can return this mobo and get one of the ROG boards that have extra power connectors for the PCIe slots.


----------



## hew1fr

Hi, count me in!

After a poor Sapphire sample, I've swapped it for a Gigabyte card (290 non-X).

Hynix memory chips!!! No warranty sticker.
290X BIOS: no change to the shader count.
For now, voltage control doesn't work with AB or OCGuruII, but I got control back with the ASUS 290 BIOS and GPU Tweak.


----------



## Arizonian

Quote:


> Originally Posted by *the9quad*
> 
> To be fair even the GTX 780's have coil whine, it is not exclusive to AMD.
> 
> -SNIP


Quite true









http://www.overclock.net/newsearch/?search=Coil+whine&resultSortingPreference=recency&byuser=&output=posts&sdate=0&newer=1&type=all&containingthread%5B0%5D=1438886&advanced=1

6 out of 28 members in the 780Ti club would agree.


----------



## sugarhell

Because only AMD follows the laws of nature.


----------



## FtW 420

Quote:


> Originally Posted by *skupples*
> 
> Coil whine is pretty much par for the course when it comes to AMD GPUs, and for some PSUs... It rarely means something is about to "pop" unless you are on an extremely low-wattage or low-build-quality unit.
> 
> Many companies will accept RMA requests for coil whine, though most of them will tell you that they can't guarantee the next one won't do it.
> 
> If you are lucky, or old enough, you stop hearing it.


When I was running the GTX 285s regularly, those things would scream. It came in handy when walking away from the rig while folding or benching: when I stopped hearing the coil whine, it was an audible indicator that the driver had crashed.


----------



## outofmyheadyo

My GTX 780 squeals just like a few ladies in Amsterdam.


----------



## Xylene

I gave it +50mV and set it to 1050MHz just for a quick and dirty test, and played a bunch of BF3 with no issues. I have to do more research on these cards before I go any further. I set the fan speed to 75% (because I use headphones and it doesn't matter), but the temp topped out around 64°C and it never throttled. For the lulz I set it to 100% to see how loud it is, and it sounded like my computer might take flight.


----------



## skupples

My Titans probably scream too, especially when overvolted. It's either drowned out by fans, or I've lost the ability to hear that pitch.


----------



## Technewbie

I don't get it. I can overclock my card and run benchmarks no problem, but when I try to run a game it just black screens; sometimes right away, other times after an hour or longer. I've tried so many different OC settings that I've lost count, and in the end the same thing happens: I get a black screen and have to hard restart. I have even had OCs last for over a day, but then black screens.


----------



## Xylene

Is it still true with XFX that, in the US, removing the warranty-void stickers over the screws doesn't actually void the warranty? I'd love to re-apply the compound, as I'm sure the factory job is horrible, but it's not worth voiding the warranty.


----------



## skupples

Quote:


> Originally Posted by *Technewbie*
> 
> I don't get it. I can overclock my card and run benchmarks no problem, but when I try to run a game it just black screens; sometimes right away, other times after an hour or longer. I've tried so many different OC settings that I've lost count, and in the end the same thing happens: I get a black screen and have to hard restart. I have even had OCs last for over a day, but then black screens.


That's pretty typical actually. May just require more voltage.


----------



## Technewbie

Quote:


> Originally Posted by *skupples*
> 
> That's pretty typical actually. May just require more voltage.


I've run max voltage in AB beta 17 at 1160MHz core and 1425MHz memory, and it lasted for about a day and a half, and then boom, black screen.


----------



## Forceman

Quote:


> Originally Posted by *Xylene*
> 
> Is it still true with XFX that, in the US, removing the warranty-void stickers over the screws doesn't actually void the warranty? I'd love to re-apply the compound, as I'm sure the factory job is horrible, but it's not worth voiding the warranty.


The website says you can make modifications, but you should contact them first (yeah, fat chance of that). Just carefully lift off the stickers and store them on another part of the PCB so you can replace them later if needed. They aren't stuck on those screws very well.


----------



## rdr09

Quote:


> Originally Posted by *skupples*
> 
> Coil whine is pretty much par for the course when it comes to AMD GPUs, and for some PSUs... It rarely means something is about to "pop" unless you are on an extremely low-wattage or low-build-quality unit.
> 
> Many companies will accept RMA requests for coil whine, though most of them will tell you that they can't guarantee the next one won't do it.
> 
> If you are lucky, or old enough, you stop hearing it.


High-end NVIDIA cards are not immune to coil whine; I know because owners complain here on OCN.


----------



## esqueue

Quote:


> Originally Posted by *Xylene*
> 
> Is it still true with XFX that, in the US, removing the warranty-void stickers over the screws doesn't actually void the warranty? I'd love to re-apply the compound, as I'm sure the factory job is horrible, but it's not worth voiding the warranty.


They claim that you should get their permission before making any mods so that they can update your status.









I believe that they will "update" your status to VOID without letting you know.

Do what I did with my XFX 290X: use a heat gun set to 300°F and some tweezers to heat up the stickers and remove them. A blow dryer might work if you don't have a heat gun that displays temps or can't go that low.


----------



## skupples

Quote:


> Originally Posted by *rdr09*
> 
> highend nvidia cards are not immune to coil whine. i know 'cause owners complain here in OCN.


That's correct; I didn't say they were. I even posted afterwards that my units probably do it too, but that it's either drowned out or I've lost the ability to hear that pitch.

Multiple people posted afterwards that their 780s and whatnot also scream. Please stop assuming that I'm AMD-bashing every time I post something.

Ahem (clears throat). My main point was this: *Coil whine is quite normal for GPUs.* It does not mean they are about to "pop". PSUs can also produce coil whine, though it's normally a lot harder to hear, and much more prevalent in low-end units.

*You can file an RMA over coil whine, but do not expect to get a unit back that won't do it.*


----------



## rdr09

Quote:


> Originally Posted by *skupples*
> 
> That's correct; I didn't say they were. I even posted afterwards that my units probably do it too, but that it's either drowned out or I've lost the ability to hear that pitch.
> 
> Multiple people posted afterwards that their 780s and whatnot also scream. Please stop assuming that I'm AMD-bashing every time I post something.
> 
> Ahem (clears throat). My main point was this: _*Coil whine is quite normal for GPUs.*_ It does not mean they are about to "pop". PSUs can also produce coil whine, though it's normally a lot harder to hear, and much more prevalent in low-end units.


There you go... now you're talking.


----------



## Mr357

Quote:


> Originally Posted by *kcuestag*
> 
> I have to ask, what's all this PT1 BIOS talk? Is it a special BIOS? I want to know, because I think I can't hit above 1400-1500 on the memory with the default Sapphire BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If I set the memory to 1400, all is good; if I set 1500, then on the desktop, as soon as I apply it, the screen starts flickering and showing lots of horizontal lines like CRAZY. Why?


The PT1 BIOS removes the TDP limit on the card, raises the Vcore maximum, and I believe it reduces Vdroop by a small degree, though it's certainly still there. I only recommend it if you're comfortable with putting a lot of volts through your card, and it would help to also have a multimeter so software readings don't mislead you. Keep in mind that your card will begin to eat watts like mad, and of course put out a lot more heat. I can dig up the download link if you'd like. There's also PT3, but that's a bad idea unless you're just going for some high benchmark scores and then flashing back.

As for achieving high memory clocks, bumping up the core voltage should help, but if you can't get far on the stock BIOS I doubt PT1 will be any kinder. I know for sure that memory overclocking is tougher on PT3; supposedly it's due to tighter timings.
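The Vdroop mentioned above is why a multimeter reading under load sits below what software requests. A minimal sketch of the usual loadline estimate; the resistance and current figures are illustrative assumptions, not measured values for these cards:

```python
def vdroop(v_set, i_load_a, loadline_mohm=1.2):
    """Estimate the loaded core voltage from the set voltage.

    v_set         -- voltage requested in software (V)
    i_load_a      -- core current under load (A); assumed value
    loadline_mohm -- effective loadline resistance (milliohms); assumed value
    """
    return v_set - i_load_a * (loadline_mohm / 1000.0)

# e.g. with these assumed numbers, 1.35 V set at ~80 A of load
# lands around 1.25 V at the core
loaded = vdroop(1.35, 80)
```

A stronger LLC setting effectively shrinks that resistance, which is consistent with PT1 reportedly reducing (but not eliminating) the droop.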


----------



## black7hought

What is the cause of this: faulty hardware or drivers?


Spoiler: Warning: Spoiler!


----------



## utnorris

Quote:


> Originally Posted by *motorwayne*
> 
> I have a dilemma: I want to change to AMD because I like what I'm hearing in terms of Mantle and TrueAudio, and the innovation coming out of that corner.
> 
> My current GPU is a GTX 570; old, but it goes OK.
> 
> I have to buy an R9 290X in the next two days from the USA, as I have a mate there on business at the moment, and buying one in the US is 60% of the price I would pay here in New Zealand. The issue is that the R9 290X seems filled with problems, both at stock and when people try to overclock it. I am also grabbing a 120Hz monitor in the next week, and I see people having issues with the QNIX 2560x1440.
> 
> I'm trying to decide between the R9 290X and the EVGA GTX 780 Classified (I know it's a dirty word in here), and I'm flip-flopping between the two. I really want to get the R9, but I'm not sure the benefits are there right now, and the aftermarket makers haven't released their versions yet.
> 
> What do you think? Buy the 780, sell it here in NZ for a profit, and buy the R9 when aftermarket R9s come out over the next 6 months, or suck the kumara and get the R9 now?
> 
> Cheers from NZ
> 
> Motorwayne


If you want guaranteed trouble-free, get the GTX 780 Classified for its more mature drivers. If you don't mind the growing pains, grab a 290X, or better yet a 290.


----------



## jomama22

PT3 does not make it any more difficult to OC the RAM; my max mem scores are on PT3. It should be noted that just because you hit a higher clock doesn't mean you are getting better scores. This stuff crushes scores hard if it's not stable, same with the core.

All 9 of my cards' memory clocked exactly the same on each BIOS, barring the original stock BIOS. There are no memory timing changes.

Anyone achieving a higher clock on a different BIOS just ran into bad luck with drivers during testing.


----------



## rdr09

Quote:


> Originally Posted by *selk22*
> 
> I think you're right; after further testing, this seems to just be the GPU adjusting to what is happening.
> 
> Well yeah, that's why I have the 290X, bro! All that sexy VRAM! I'm at 1920x1200, 4xAA, full post AA, 135% res scale!
> 
> 55-60 FPS average on 64-player Operation Locker.
> 
> If that's not a crisp game, idk what is!!


Got a bit concerned there, 'cause 3400MB in BF4? Too close to our 4GB.


----------



## Tobiman

Quote:


> Originally Posted by *black7hought*
> 
> What is the cause of this faulty hardware or drivers?
> 
> 
> Spoiler: Warning: Spoiler!


That's usually memory related from my experience.


----------



## nemm

Finally got my 290s installed for Crossfire; time to play, now that they have been overclocked and are stable.











Spoiler: Sapphire 290WC 1180/[email protected]


----------



## selk22

Quote:


> Originally Posted by *rdr09*
> 
> got concerned there a bit 'cause 3400MB in BF4? Too close to our 4GB.


Yeah, that's nowhere near the normal usage without all that AA and res scale. I think it's somewhere around 1.5-2GB normally? BF4 does eat up VRAM pretty hard, though.


----------



## grandpatzer

Quote:


> Originally Posted by *Arizonian*
> 
> Quite true
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/newsearch/?search=Coil+whine&resultSortingPreference=recency&byuser=&output=posts&sdate=0&newer=1&type=all&containingthread%5B0%5D=1438886&advanced=1
> 
> 6 out of 28 members in the 780Ti club would agree.


It also depends on hearing sensitivity and what type of PC one has: noisy fans/hard drives can mask coil whine, while with watercooling and a dead-silent pump/fans/SSD you can hear coil whine much more easily.


----------



## sugarhell

Don't worry, Mantle will reduce memory usage in BF4.


----------



## selk22

Yeah, I am not worried at all. I am extremely happy with this 290X right now, and excited for things to get even better with Mantle and updated drivers.


----------



## grandpatzer

Quote:


> Originally Posted by *kcuestag*
> 
> I have to ask, what's all this PT1 BIOS talk? Is it a special BIOS? I want to know, because I think I can't hit above 1400-1500 on the memory with the default Sapphire BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If I set the memory to 1400, all is good; if I set 1500, then on the desktop, as soon as I apply it, the screen starts flickering and showing lots of horizontal lines like CRAZY. Why?


Quote:


> Originally Posted by *jerrolds*
> 
> I accidentally realized that somehow the memory frequency is tied to the GPU voltage. When setting up a new profile in GPU Tweak I set the memory to 1600MHz (something I knew it could do), but the screen went all crazy and I had to hit default. When I cranked the voltage up +100mV, the memory was able to hit 1600MHz.
> 
> So that's something you can try.


Are these like the AMD 79x0 GPUs, where a memory OC gave little improvement in FPS?
In that case I don't see the point in OCing the memory a lot.


----------



## Scorpion49

Well now I'm depressed. I missed the UPS guy by like 10 minutes coming home from work and now my cards won't be delivered until monday.


----------



## glenquagmire

Just sold my CrossFire 7950s for an XFX 290. My new thread home.


----------



## Falkentyne

Quote:


> Originally Posted by *Technewbie*
> 
> I've ran max voltage in AB 17 at 1160MHZ core and 1425MHz memory and it lasted for about a 1 1/2 days and then boom blackscreen


This isn't related to your overclock (it's usually memory-related, or some funky hardware fault that also causes black-screen hard locks). Black screens are "supposed" to trigger (just like on the 7970) if your GPU reaches the throttle point quickly, can't throttle, and the temps keep rising; the card shuts down, just like Intel chips will do.

These cards STILL have VPU recovery, and unstable core overclocks should simply cause the application to hang with a black screen after the scene freezes for 5 seconds; then the driver recovers and the game is suspended with a black screen (alt-tabbable, so you can end-task the process). I can trigger that in Valley in 25 seconds if I use 1125MHz core at default volts, and at about 89°C+ temps at +12mV at 1125MHz (it took 15 minutes last time to reach 90°C, then the suspended black-screen freeze from a GPU crash plus VPU recovery), while +25mV runs fine at 1125MHz if I keep the temps under 92°C (75-80% fan speed). The system isn't frozen; I just have to terminate the demo. Haven't tested that in BF4.

I won't deny that multiple things (games, BIOS, hardware or driver issues, or defects somewhere) could be triggering this "GPU shutdown"; that's what the hard-lock black screen is. VPU recovery does NOT trigger an INSTANT black screen. You guys who are getting the hard locks are getting INSTANT black screens; a GPU crash should cause a freeze for 5 seconds, then a VPU recovery.

The fact that increasing GPU volts is helping people pass 1600-1700MHz on memory tells us that something is tying GPU stability to memory speeds.


----------



## eternal7trance

While these cards don't cool that great, at 67% fan mine maxes out around 70°C while playing Ghosts and BF4 at 1440p. Not sure what the big deal is. It's about as loud as my old 470 was.


----------



## black7hought

Quote:


> Originally Posted by *Tobiman*
> 
> That's usually memory related from my experience.


GPU memory or system memory?


----------



## Arizonian

Quote:


> Originally Posted by *Scorpion49*
> 
> Well now I'm depressed. I missed the UPS guy by like 10 minutes coming home from work and now my cards won't be delivered until monday.


Bummer, and over the weekend too. I feel your pain on the new toy, bud.

Waiting is a drag. I'm waiting on a non-reference card with aftermarket cooling; sold my reference card already.
Quote:


> Originally Posted by *glenquagmire*
> 
> Just sold my xfire 7950s for xfx 290. My new thread home.


Waiting on your pic & OCN submission to join us. In the meantime, pull up a chair and grab some popcorn; there are a lot of pages to catch up on.


----------



## Raephen

Guys, have pity on me... I'm ashamed to admit it even to myself, but I've given in...

To watercooling my 290









I got fed up with waiting for my Accelero Xtreme III. I haven't even gotten a shipping notice yet, and it seems it's no longer even available from the site I ordered it from.

Add to that those sexy waterblocks winking at me from every corner (yes, I know: I'm a nerd







) and I just couldn't help myself.

So I ordered the Aqua Computer kryographics Hawaii from Aquatuning, along with some other parts needed to add it to my loop. And I'm going to cancel my AX III order ASAP.

Sure, it costs me about €110 more, but look on the bright side: I get a nice image of palm trees on the block, which I'll never see when it's in my system, but that's beside the point: I'll know it's there.

Here's to keeping my fingers crossed that Aquatuning can live up to the stated shipping date of 18-11-2013.


----------



## grandpatzer

Quote:


> Originally Posted by *nemm*
> 
> Finally got my 290s installed for crossfire, time to play now they have been overclocked and stable
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Sapphire 290WC 1180/[email protected]


Nice OC!

Is it stable in BF4 at those clocks and voltage?


----------



## LancerVI

Could I be added to the club please? ASUS R9 290 w/ Koolance VID-AR290X full-cover WC block.




So far so good. It's running nicely at stock and never gets over 36°C at load. Lots more testing to do.


----------



## rv8000

Tried unlocking my Sapphire 290 (Hynix memory) using the ASUS BIOS; no success. Voltage control, whether with AB beta 17 on the stock BIOS or GPU Tweak on the ASUS BIOS, seems to do nothing in terms of stabilizing my card; 1150-1180 seems to be the wall even with stock volts. Gonna try some more testing with LLC commands and nol4ns' tool. Disappointed with this card's overclocking potential atm.


----------



## sun100

Solved the heating problem with a custom Afterburner fan profile: a 1:1 ratio of temp to fan speed, so I got it to run at 63°C and around 60-65% fan speed. Not too loud; not silent for sure, but it beats 95°C. Even though the card is built for high temps, the rest of the components in the chassis aren't, lol. This card gets the whole damn system cooking, even though I have pretty good cooling and quite a large and spacious chassis.
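That 1:1 profile boils down to a tiny mapping from temperature to fan duty. A sketch of the idea; the floor and cap values are my own assumptions, not anything Afterburner enforces:

```python
def fan_duty(temp_c, ratio=1.0, floor=20, cap=100):
    """1:1 temp-to-fan curve: duty cycle (%) tracks temperature (°C),
    clamped between a minimum and maximum duty.

    ratio/floor/cap defaults are illustrative assumptions.
    """
    return max(floor, min(cap, round(temp_c * ratio)))
```

With this curve a card sitting at 63°C gets a 63% fan, which matches the 60-65% figure in the post; the floor keeps the fan from stalling at idle.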


----------



## Forceman

What do we think a safe long-term 24/7 voltage is? I've got 1.35V set in GPU Tweak, which gives about 1.25V under load. How much higher can I push it for normal (not benching) use? Temps are no issue: still 50°C core and 60°C VRM.


----------



## gamervivek

Quote:


> Originally Posted by *grandpatzer*
> 
> Are these like the AMD 79x0 GPUs, where a memory OC gave little improvement in FPS?
> In that case I don't see the point in OCing the memory a lot.


It should help. See the first test.

http://techreport.com/review/25509/amd-radeon-r9-290x-graphics-card-reviewed/6


----------



## eternal7trance

If I already have a Gelid cooler, what heatsinks do I need to buy to finish it off?


----------



## Tobiman

Quote:


> Originally Posted by *eternal7trance*
> 
> If I already have a gelid cooler, what heatsinks do i need to buy to finish it off?


Do you mean fans? Noctua or Phanteks fans are the best. Gelid fans aren't bad either.


----------



## skupples

Quote:


> Originally Posted by *grandpatzer*
> 
> Are these like the AMD 79x0 GPUs, where a memory OC gave little improvement in FPS?
> In that case I don't see the point in OCing the memory a lot.


The bandwidth is already pretty damned high at stock, so if you are not on extreme resolutions it's likely going to do little. If you are on Eyefinity it may be of more use.


----------



## eternal7trance

Quote:


> Originally Posted by *Tobiman*
> 
> Do you mean fan? Noctuas or phantek fans are the best. Gelid fans aren't bad either.


No, I have the Gelid cooler from an older card, but I don't have the heatsinks for it.


----------



## sugarhell

Quote:


> Originally Posted by *Forceman*
> 
> What do we think a safe long-term 24/7 voltage is? I've got 1.35V set in GPU Tweak, which gives about 1.25V under load. How much higher can I push it for normal (not benching) use? Temps are no issue: still 50°C core and 60°C VRM.


If the temps are not a problem, anything under 1.4V is okay, I think. 50°C on the core is a bit high for watercooling, but nowhere near too high. The PCB seems higher quality than the reference 7970's, and on those cards I could push 1.38V 24/7.


----------



## Arizonian

Quote:


> Originally Posted by *LancerVI*
> 
> Could I be added to the club please? Asus R9 290 w/ Koolance VID-AR290X fullcover WC block
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So far so good. It's running nicely at stock. Never gets over 36C at load. Lot's more testing to do.


Congrats - added


----------



## Jpmboy

Quote:


> Originally Posted by *Forceman*
> 
> What do we think a safe long-term 24/7 voltage is? I've got 1.35V set in GPU Tweak, which gives about 1.25V under load. How much higher can I push it for normal (not benching) use? Temps are no issue: still 50°C core and 60°C VRM.


I've been running the same: GPU Tweak 1.412V with LLC on/enabled. Seems very safe. Just stay away from the following mod, or use it with EXTREME caution in combination with LLC disabled; I think we cooked a card! Not sure yet.

[Settings]
VDDC_IR3567B_Detection = 30h
VDDC_IR3567B_Output = 0
VDDCI_IR3567B_Detection = 30h
VDDCI_IR3567B_Output = 1


----------



## tsm106

helluva long wait...


----------



## raghu78

Quote:


> Originally Posted by *tsm106*
> 
> helluva long wait...


tsm106, are those 3 R9 290X cards in the pic watercooled?


----------



## tsm106

Yep, Tri-X cards.


----------



## raghu78

Quote:


> Originally Posted by *tsm106*
> 
> Yep, Tri-X cards.


Tri-CF scales better than tri-SLI. I am quite sure you will beat even GTX 780 Ti triple SLI. I think you are back to ruling the charts.


----------



## Jpmboy

Quote:


> Originally Posted by *tsm106*
> 
> Yeap, tri x cards.


Post it up in FS; I'll add it to the chart pronto. I'm curious to see if it tops your quad 7970s.


----------



## PillarOfAutumn

Really dumb question: on the switch, which side is Uber mode and which is Quiet mode? The side closest to the vent, or the side closest to the 6+8-pin connectors/fan?


----------



## Forceman

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Really dumb question: on the switch, which side is Uber mode and which is Quiet mode? The side closest to the vent, or the side closest to the 6+8-pin connectors/fan?


Uber is toward the power connectors, quiet is toward the display connectors.


----------



## tsm106

http://www.3dmark.com/fs/1115531

Top 5 here for tri-card FSE on air.


----------



## Arizonian

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Really dumb question: on the switch, which side is the uber mode and which mode is quiet mode? The side closest to the vent or the side closest to 6+8 pin connectors/fan?


The OP has the info and a pic.

Quiet Mode: the switch is in the position closest to where you plug in your displays. The fan is capped at 40%.

Uber Mode: the switch is in the position furthest from where you plug in your displays. The fan is capped at 55%.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Forceman*
> 
> Uber is toward the power connectors, quiet is toward the display connectors.


Thanks boss!

Also, I JUST set my PC up, and after taking an 8-hour exam yesterday I really wanted to game. My only problem is that it's nighttime here and the rest of my family is sleeping, but when I tried running BF4 it sounded like a jet engine. I don't want to wake anyone up, and I don't care if I get fewer frames, but what's the best thing to do? Can I just run it in Quiet mode, set the power and GPU clock settings to -25%, set the fan speed to 40%, and set the temperature tolerance to 95°C?


----------



## Forceman

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Thanks boss!
> 
> Also, I JUST set my PC up, and after taking an 8-hour exam yesterday I really wanted to game. My only problem is that it's nighttime here and the rest of my family is sleeping, but when I tried running BF4 it sounded like a jet engine. I don't want to wake anyone up, and I don't care if I get fewer frames, but what's the best thing to do? Can I just run it in Quiet mode, set the power and GPU clock settings to -25%, set the fan speed to 40%, and set the temperature tolerance to 95°C?


If you limit the fan speed in CCC to whatever you think is a comfortable level, it'll automatically throttle the clock speed to keep the temps at 95°C with that fan speed.
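That behavior can be pictured as a simple control loop: with the fan capped, the card steps the core clock down whenever the temperature target is exceeded and lets it recover otherwise. A rough sketch; the step size, clock floor, and stock clock are illustrative assumptions, not the actual PowerTune parameters:

```python
def next_clock(temp_c, clock_mhz, target_c=95, step_mhz=13,
               floor_mhz=727, stock_mhz=1000):
    """One iteration of a crude clock-throttling model: drop the core
    clock while over the temperature target, recover toward stock
    otherwise. All numeric defaults are assumptions for illustration.
    """
    if temp_c > target_c:
        return max(floor_mhz, clock_mhz - step_mhz)
    return min(stock_mhz, clock_mhz + step_mhz)
```

Run repeatedly, this settles at whatever clock the capped fan can sustain at the 95°C target, which is why a quiet fan limit costs frames instead of stability.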


----------



## jomama22

I personally wouldn't go over 1.35V actual for 24/7. Most of these chips don't cooperate above 1.375V actual, which leads me to think they will degrade much faster above 1.35V.


----------



## bpmcleod

OK, so is this shader thing "real", or just some glitch with GPU-Z reading the shader count directly off the BIOS? I flashed my PowerColor 290 to the ASUS 290X BIOS a while back and noticed that GPU-Z shows the 290X numbers, but thought nothing of it. Considering my FS scores, maybe I should have? I was pushing 11k on a 290 at only 1140 core/1500 memory. So my question is: is this flash to 290X real or what?


----------



## bpmcleod

Quote:


> Originally Posted by *tsm106*
> 
> http://www.3dmark.com/fs/1115531
> 
> top 5 here for tri fse on air.


If it makes you feel any better, you beat the GPU score of number two, but he crushed your physics score.


----------



## ledzepp3

How's the coil whine been with the 290Xs? I'm looking into getting two Sapphire cards in particular, even though brand makes almost no difference now that the boards are all reference.

-Zepp


----------



## sugarhell

Quote:


> Originally Posted by *bpmcleod*
> 
> If it makes you feel any better, you beat the GPU score of number two, but he crushed your physics score.


Yeah, LN2, 6.1GHz: 8Pack.


----------



## the9quad

Quote:


> Originally Posted by *ledzepp3*
> 
> How's the coil whine been with the 290Xs? I'm looking into getting two Sapphire cards in particular, even though brand makes almost no difference now that the boards are all reference.
> 
> -Zepp


It's a crapshoot; mine had almost none, but it black screened. Others seem to get quite a bit.


----------



## raghu78

Quote:


> Originally Posted by *jomama22*
> 
> I personally wouldn't go over 1.35V actual for 24/7. Most of these chips don't cooperate above 1.375V actual, which leads me to think they will degrade much faster above 1.35V.


Yeah, 1.35V is generally the max limit for long-term reliability.


----------



## youra6

I see a green light on the far end of the 290 PCB (the side away from the mounting bracket). Anyone know what that light signifies?

*Edit:* Just did a simple test, and waddya know... For anyone wondering, it means CFX is enabled. Disable CFX and the light goes away.


----------



## ontariotl

Quote:


> Originally Posted by *youra6*
> 
> I see a green light on the far end of the 290 PCB (the side away from the mounting bracket). Anyone know what that light signifies?
> 
> *Edit:* Just did a simple test and waddya know...
> 
> 
> 
> 
> 
> 
> 
> For anyone wondering, it means CFX is enabled. Disable CFX and the light goes away.


Actually, I thought it was an indicator that power saving was enabled on the second card when it's not running a load. I notice it's on while I'm on the desktop, since the card isn't needed, but it shuts off as soon as I load a game or benchmark where CrossFire is needed.


----------



## Scorpion49

Ooohhh yeah, come to daddy. Luckily I was able to pick my cards up at the UPS depot, although it's like 40 minutes away. I braved rush-hour traffic to get them!


----------



## Forceman

Quote:


> Originally Posted by *bpmcleod*
> 
> OK, so is this shader thing "real", or just some glitch with GPU-Z reading the shader count directly off the BIOS? I flashed my PowerColor 290 to the ASUS 290X BIOS a while back and noticed that GPU-Z shows the 290X numbers, but thought nothing of it. Considering my FS scores, maybe I should have? I was pushing 11k on a 290 at only 1140 core/1500 memory. So my question is: is this flash to 290X real or what?


Yes, it appears to be real for at least some cards. Run a benchmark at 290 clocks and compare to stock 290 scores to see if your card is really faster, or just compare it clock for clock.


----------



## ledzepp3

Quote:


> Originally Posted by *the9quad*
> 
> It's a crapshoot; mine had almost none, but it black screened. Others seem to get quite a bit.


Was it a low-pitched whine or a screaming noise? I had an MSI 7970 a while back that whined like a little girl. Hated it to death.

-Zepp


----------



## utnorris

So anyone with a CF setup seeing the GPU usage going sporadic from 100% to 0% on both cards? This just started on mine and I have not been able to resolve it. My next step is to test the cards separately and then reinstall the drivers, any thoughts? I am getting really low scores, lower than what I was on a single GPU. It starts off ok, but then by the third test in 3DMark it goes haywire.


----------



## bpmcleod

Quote:


> Originally Posted by *utnorris*
> 
> So anyone with a CF setup seeing the GPU usage going sporadic from 100% to 0% on both cards? This just started on mine and I have not been able to resolve it. My next step is to test the cards separately and then reinstall the drivers, any thoughts? I am getting really low scores, lower than what I was on a single GPU. It starts off ok, but then by the third test in 3DMark it goes haywire.


I'll let you know Monday. Hope not, though.

Edit: Could also be that there's no profile for it yet... I don't know.


----------



## skupples

Quote:


> Originally Posted by *raghu78*
> 
> tri CF scales better than tri SLI. i am quite sure you will beat even GTX 780 Ti triple SLI . i think you are back to rule the charts


For sure! SLI scaling is something they need to address once and for all, despite the conspiracy theories about forcing people to buy Titans. They limited the 780 to three cards for more than one reason.


----------



## DraXxus1549

Anyone else having horribly inconsistent frame rates in Battlefield 4? I tried lowering everything to High and my frames were still all over the place. I tried with V-sync on and it was even worse, stuttering like crazy. Is it just BF4, or is something weird going on with my card?


----------



## youra6

Quote:


> Originally Posted by *ontariotl*
> 
> Actually I thought it was an indicator that power saver was enabled on the second card when not running a load. I notice it on while I'm on desktop since its not needed, but it shuts off as soon load a game or benchmark that xfire is needed.


You're probably right; it just so happens that disabling CFX disables power saver altogether on the second card.


----------



## Scorpion49

So I just ran a quick baseline of 3Dmark11 compared to the outgoing 780 SLI. Not a bad showing for 947mhz compared to 1163 on the 780's.

http://www.3dmark.com/compare/3dm11/7498994/3dm11/7468822


----------



## ljreyl

Quote:


> Originally Posted by *the9quad*
> 
> What game BF4 only? if so it's the game issue, happens to a lot of people especially on the dam map. If it's every game, I have no idea why


I guess no one has the same problem as I do?








This is for all games btw...


----------



## evensen007

Quote:


> Originally Posted by *DraXxus1549*
> 
> Anyone else having horribly inconsistent frame rates in Battlefield 4? I tried lowering everything to high and my frames were still all over the place. I tried with v sync on and it was even worse stuttering like crazy. Is it just BF4? or is something weird going on with my card?


It is horrible. My 2 x 290's feel worse than my single 7970 did.







My gpu usage is bouncing all over the charts 0-100 0-100 0 - 100. It feels like it paused every 3 or 4 seconds. awful.


----------



## youra6

Quote:


> Originally Posted by *Scorpion49*
> 
> So I just ran a quick baseline of 3Dmark11 compared to the outgoing 780 SLI. Not a bad showing for 947mhz compared to 1163 on the 780's.
> 
> http://www.3dmark.com/compare/3dm11/7498994/3dm11/7468822


That's weird; on my stock dual-290 setup I get around 29K on the PXXXX GPU score. At 1200 MHz I get around 34K+.


----------



## Scorpion49

Quote:


> Originally Posted by *youra6*


Do you have the fans turned up? I left everything stock and didn't move the switch over either. I can see the problem in Valley right away after looking at the usage. I've benched that program a whole lot on many, many GPUs, and the 290 just falls off continuously instead of following the pattern of every other GPU out there. Most of the run was spent at clock speeds around 616 MHz while the fan ramped up to 50%, which means it is going past the set limit to try and stay cool. Definitely a driver issue with that bench; something is getting overworked.


----------



## DraXxus1549

Quote:


> Originally Posted by *evensen007*
> 
> It is horrible. My 2 x 290's feel worse than my single 7970 did.
> 
> 
> 
> 
> 
> 
> 
> My gpu usage is bouncing all over the charts 0-100 0-100 0 - 100. It feels like it paused every 3 or 4 seconds. awful.


Looks like it's not just me. I only played one other game so far, Tomb Raider, and I didn't notice these issues. Is BF4 to blame or drivers? a combination maybe?


----------



## Scorpion49

Quote:


> Originally Posted by *DraXxus1549*
> 
> Looks like it's not just me. I only played one other game so far, Tomb Raider, and I didn't notice these issues. Is BF4 to blame or drivers? a combination maybe?


I think it's mostly BF4; literally everyone trying to play it with these new cards has the same stupid issues.


----------



## Xylene

w00t, broke 13k in 3dmark2011.

http://www.3dmark.com/3dm11/7499195

1165/1350. I need moar volts. Any way to get more than +100 mV in AB? I tried unlocking the limits in the preferences but it still topped out at +100.


----------



## tsm106

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bpmcleod*
> 
> If it makes you feel any better, you beat the GPU score of number two, *but he crushed your physics score*.
> 
> 
> 
> Yeah ln2. 6.1 ghz-8 pack
Click to expand...

Crushed?

6.1-5.0 =


----------



## th3illusiveman

You guys with black screen issues try these DVi settings? Maybe changing them could help?


----------



## sugarhell

Quote:


> Originally Posted by *tsm106*
> 
> Crushed?
> 
> 6.1-5.0 =


6.1-5.0=







? Nice maths


----------



## eternal7trance

How do I flash my BIOS? I'm using the bootable USB drive with atiflash 4.0.7 and it keeps saying "adapter not found" no matter what command I use.


----------



## ontariotl

Quote:


> Originally Posted by *DraXxus1549*
> 
> Looks like it's not just me. I only played one other game so far, Tomb Raider, and I didn't notice these issues. Is BF4 to blame or drivers? a combination maybe?


Is it possible it was borked after the latest patch they came out with recently to reduce issues of crashes and netcode?


----------



## ontariotl

Quote:


> Originally Posted by *eternal7trance*
> 
> How do i flash my bios? I'm using the bootable usb drive with atiflash 4.0.7 and it keeps saying adapter not found no matter what command I use


Use the command that lists adapters to see if any AMD cards are found. I didn't check with the version I had, but is the atiflash you're using the latest version?
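For reference, a hypothetical atiflash session from a DOS-bootable USB might look like the following. The flag meanings here are from common atiflash 4.x usage and are assumptions — confirm them against your version's own help output before flashing anything.

```shell
REM Hypothetical atiflash workflow (DOS boot) - verify flags against your
REM atiflash version's help output before running any of these.

atiflash -i                 REM list detected adapters; if nothing shows here,
                            REM the flasher cannot see the card at all
atiflash -s 0 backup.rom    REM save the current BIOS of adapter 0 first
atiflash -p 0 new.rom       REM program adapter 0 with the new BIOS
REM add -f only if the flasher refuses a known-good but mismatched image
```

If `-i` lists no adapters at all, the "adapter not found" error is about detection, not about the flashing command itself.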


----------



## iPDrop

Wow, just got a pretty sick benchmark. Almost 30k in graphics score with two R9 290's









http://www.3dmark.com/3dm11/7499424


----------



## jerrolds

Quote:


> Originally Posted by *DraXxus1549*
> 
> Looks like it's not just me. I only played one other game so far, Tomb Raider, and I didn't notice these issues. Is BF4 to blame or drivers? a combination maybe?


BF4 runs really smooth and consistent for me. High everything, no AA, HBAO, 1140/1500 - about 85-90 fps outside and 120 fps inside. But this is during campaign mode... haven't tried multi yet.

I guess you guys are running in CF?


----------



## Arizonian

Quote:


> Originally Posted by *jerrolds*
> 
> BF4 runs really smooth and consistent for me. High everything, no AA, HBAO. 1140/1500 - about 85-90fps outside, and 120fps inside. But this is during campaign mode..havent tried multi yet.
> 
> I guess you guys are running in CF?


BF4 Played smooth for me 13.11 beta7 1100/1300 75% on air with default game Ultra settings. (Video on OP)

However I did not get chance to try other drivers after that or the new BF4 update recently released.


----------



## iPDrop

I'm running CF 290's and I get really bad performance in BF4... When using only one card I get great performance, 70-80 frames on Ultra at 1080p, but when I CrossFire I get anywhere from a -10% to +30% change in frames.


----------



## RAFFY

How do my scores stack up? I finally installed 3DMark (newest version) and ran the test without touching any of my hardware (CCC fan profile set to 70% when the GPU reaches 70°C). Everything ran at stock speeds: CPU stock with no changes in the BIOS at all, memory at 1333 MHz.


----------



## ontariotl

Quote:


> Originally Posted by *iPDrop*
> 
> I'm running CF 290's and I get really bad performance in BF4... When using only 1 card I get great performance 70-80 frames on ultra 1080p but when I crossfire I get like -10 to 30% boost in frames


I haven't played a MP game since the new patch, I'm curious to try this when I get home and see if I have the same issue.


----------



## NaifQK

Quote:


> Originally Posted by *black7hought*
> 
> What is the cause of this - faulty hardware or drivers?
> 
> 
> Spoiler: Warning: Spoiler!


I have the same problem, but it only happens when folding, even at stock.


----------



## evensen007

The frame issues with BF4 are on xfire 290's on multi-player. I feel like I'm playing the game on a 4 year old gpu.


----------



## Sgt Bilko

I just had a game of CS:GO and the game was "freezing" every 2-3 seconds... core clock and load were bouncing all over the place.

powerlimit and fan speed are both at max.


----------



## blue1512

Quote:


> Originally Posted by *evensen007*
> 
> The frame issues with BF4 are on xfire 290's on multi-player. I feel like I'm playing the game on a 4 year old gpu.


For BF4 multi and a CF setup, you need a beastly CPU. Your 2600K is not enough even at 4.7 GHz. However, this problem will be fixed when the Mantle patch arrives, as Dice said at AMD Developer Summit 14.


----------



## nemm

When using the LLC command in AB, is there another one to apply to the second card when using CrossFire? The reason I ask is the second card has fluctuating voltage whereas the main card doesn't, which is causing my max obtained overclock to artifact when I know it is rock solid, since each card was tested before CrossFire. Both cards, when tested separately, would loop H4 at 1440p maxed overnight with no problem at 1.297 V, but in CrossFire the second card dips below 1.25 V at times, which is no good.

Another thing: is there a way to set the power limit in CCC to stay at 50%? AB doesn't appear to do anything with this setting, and it is required to prevent varying core clocks.

To the user that asked if it's stable in BF4: I don't have it yet so I'm unable to test, but if I can control the poor vdroop on these cards then I don't see why not, since they looped maxed H4 1440p separately for an easy 8+ hours without problems.


----------



## estens

Quote:


> Originally Posted by *nemm*
> 
> When using the LLC command in AB, is there another one to apply to second card when using crossfire? The reason I ask is the second card has fluctuating voltage whereas the main card doesn't which is causing my max obtained overclock to artifact when I know it is rock solid since each card was tested before crossfire. Both cards when tested separately would loop H4 1440p maxed overnight no problem at 1.297 but in crossfire the the second card dips below 1.25 at times which is no good.
> 
> Another thing is there a way to set power in ccc to stay at 50% because AB doesn't do anything or spear to do with this setting and it is required to prevent varying core clocks.
> 
> To the user that asked if stable in BF4, I don't have yet so unable to test but if i can control the poor vdroop on these cards then I don't see why not since they looped max H4 1440p for an easy 8hour+ separately without problems.


Don't know about the LLC command for the second card.

But if you uncheck voltage monitoring in AB, you can use AB to set the power limit without using CCC.

Anyone else having issues with an overclocked 290(X) and a monitor with a refresh rate higher than 60 Hz?
My 290X does 1170 MHz core when the monitor is at 115 Hz, but if I set the screen to 60 Hz I can do 1230 MHz core.








I have a qnix monitor btw


----------



## jtjoetan

Hey guys, why do I get inconsistent FPS in BF4 with a Sapphire R9 290?


----------



## rdr09

Quote:


> Originally Posted by *iPDrop*
> 
> Wow, just got a pretty sick benchmark. Almost 30k in graphics score with two R9 290's
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/7499424


hope you get your BF4 working soon . . .

SLI GTX Titans 1371/7256

http://www.3dmark.com/fs/1105291

Edit: mind running FS and comparing?


----------



## C-BuZz

Quote:


> Originally Posted by *Mr357*
> 
> As far as achieving high memory clocks, bumping up the core voltage should help with that, but if you can't get far on the stock BIOS I doubt PT1 will be any kinder. I know for sure that memory overclocking is tougher on PT3. Supposedly it's due to tighter timings.


Memory overclocking is the same on PT3 as it is with stock bios. For both my Powercolor's anyway. Both easily bench at 7000 no prob on PT3 & start to really max out at 7250


----------



## Jpmboy

Quote:


> Originally Posted by *tsm106*
> 
> http://www.3dmark.com/fs/1115531
> 
> top 5 here for tri fse on air.


Nice, tsm! 3x 290X looking much better than 4x 7970?

Please post a "Performance"-level run here


----------



## Jpmboy

Quote:


> Originally Posted by *nemm*
> 
> When using the LLC command in AB, is there another one to apply to second card when using crossfire? The reason I ask is the second card has fluctuating voltage whereas the main card doesn't which is causing my max obtained overclock to artifact when I know it is rock solid since each card was tested before crossfire. Both cards when tested separately would loop H4 1440p maxed overnight no problem at 1.297 but in crossfire the the second card dips below 1.25 at times which is no good.
> 
> Another thing is there a way to set power in ccc to stay at 50% because AB doesn't do anything or spear to do with this setting and it is required to prevent varying core clocks.
> 
> To the user that asked if stable in BF4, I don't have yet so unable to test but if i can control the poor vdroop on these cards then I don't see why not since they looped max H4 1440p for an easy 8hour+ separately without problems.


Use the /sg indexer, e.g. msiafterburner /sg0 /wi4, then the same using /sg1.
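Spelled out as a batch sketch for a CrossFire pair — the `/wi4` switch is taken from Jpmboy's example and assumed to apply profile slot 4; check MSI Afterburner's own command-line notes before relying on it.

```shell
REM Hypothetical batch sketch of per-GPU Afterburner profiles via /sg
REM (select GPU). /wi4 is assumed to apply profile slot 4, per the example
REM above - verify against your Afterburner version's command-line help.

MSIAfterburner.exe /sg0 /wi4    REM apply the profile to GPU index 0
MSIAfterburner.exe /sg1 /wi4    REM repeat for the second card (index 1)
```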


----------



## Jpmboy

Quote:


> Originally Posted by *sugarhell*
> 
> 6.1-5.0=
> 
> 
> 
> 
> 
> 
> 
> ? Nice maths


----------



## anteante

I just managed to unlock all shaders on my PowerColor 290 OC with the ASUS 290X BIOS. I'm happy


----------



## rdr09

Quote:


> Originally Posted by *anteante*
> 
> I just manage to unlock all shaders on my Powercolor 290 OC with Asus 290X bios. I´m happy


congrats. do you know if your gpu has Hynix or Elpida VRAM?

you can check with this . . .

http://www.mediafire.com/download/voj4j1rlk0ucfz4/MemoryInfo+1005.rar#!

thanks.


----------



## anteante

Quote:


> Originally Posted by *rdr09*
> 
> congrats. do you know if your gpu has Hynix or Elpida VRAM?
> 
> you can check with this . . .
> 
> http://www.mediafire.com/download/voj4j1rlk0ucfz4/MemoryInfo+1005.rar#!
> 
> thanks.


The program says Elpida


----------



## Duvar

Someone on a German forum unlocked with Elpida VRAM too








Once again Power Color...


----------



## nemm

@estens and jpmboy, thank you, I shall try those later on.

With all this unlocking talk I just realised I need to try the newest one to test my luck, 0/3 so far. Is there a Sapphire 290X BIOS about?


----------



## rdr09

Quote:


> Originally Posted by *anteante*
> 
> The program says elpida


this gives hope to some of us. thanks, again.


----------



## Gilgam3sh

I might try to buy 2x Powercolor R9 290 and see if they can be unlocked, if not, I will return them.


----------



## r0l4n

Quote:


> Originally Posted by *blue1512*
> 
> For BF4 multi and a CF setup, you need a beastly CPU. Your 2600K is not enough even at 4.7 GHz. However, this problem will be fixed when the Mantle patch arrives, as Dice said at AMD Developer Summit 14.


Do you have any evidence to support this statement? Not that I don't believe it, just want to compare it to other processors.


----------



## estens

Quote:


> Originally Posted by *nemm*
> 
> @estens and jpmboy, thank you, I shall try those later on.
> 
> With all this unlocking talk I just realised I need try the newest one to test my luck 0\3 so far. Is there a Sapphire 290x bios about?


I can send you my Sapphire 290X BIOS later today if you can't find it somewhere else


----------



## Duvar

@ Full HD (max settings) there will be no difference http://www.ocaholic.ch/modules/smartsection/item.php?itemid=1123&page=13


----------



## Jpmboy

Quote:


> Originally Posted by *iPDrop*
> 
> Wow, just got a pretty sick benchmark. Almost 30k in graphics score with two R9 290's
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/7499424


post that score here please


----------



## Jpmboy

Quote:


> Originally Posted by *nemm*
> 
> @estens and jpmboy, thank you, I shall try those later on.
> 
> With all this unlocking talk I just realised I need try the newest one to test my luck 0\3 so far. Is there a Sapphire 290x bios about?


Which? Q or U?
Saved from my Sapphire R9 290X with atiflash.

Q:

STOCKUBE.txt 128k .txt file


U:

STOCKUBE.txt 128k .txt file


Change .txt to .rom and you're good to go.


----------



## blue1512

Quote:


> Originally Posted by *r0l4n*
> 
> Do you have any evidence to support this statement? Not that I don't believe it, just want to compare it to other processors.


If you read more about AMD's new CF you will see what I mean. The 2600K at 4.7 GHz is fast, but its PCIe interface is still 2.0. You will be fine with one card, but 2x 290 in CF need the bandwidth of PCIe 3.0, which means your platform is not enough.


----------



## sun100

Quote:


> Originally Posted by *jtjoetan*
> 
> hey guys why do i get inconsistent fps in bf4? with a r9 290 saphire.


As someone mentioned earlier, try different modes (windowed, fullscreen, borderless). I have the same card; on the same map and same spot I get 30 FPS, and 5 minutes later at the same spot, under nearly the same conditions, I get over 150 FPS. It's just random. I guess the drivers still need to be fine-tuned since the card is pretty new.
I'm currently running the latest beta drivers; try them out if you haven't yet.

P.S.
I'd like to join the club !
GPU-Z Link : http://www.techpowerup.com/gpuz/6sk9h/
Manufacturer & Brand: Sapphire Radeon R9 290 4 GB
Cooling: Stock


----------



## the9quad

Quote:


> Originally Posted by *blue1512*
> 
> If you read more about AMD's new CF you will see what I mean. The 2600K at 4.7 GHz is fast, but the PCIe interface is still 2.0. You will be fine with 1 card, but 2x 290 in CF need the bandwidth of PCIe 3.0, which mean your CPU is not enough.


The same issue is happening with someone on an EVGA Dark X79 and a 4930K.


----------



## r0l4n

Quote:


> Originally Posted by *blue1512*
> 
> If you read more about AMD's new CF you will see what I mean. The 2600K at 4.7 GHz is fast, but the PCIe interface is still 2.0. You will be fine with 1 card, but 2x 290 in CF need the bandwidth of PCIe 3.0, which mean your CPU is not enough.


Please post some links, that's an interesting read.


----------



## Raephen

Quote:


> Originally Posted by *Jpmboy*
> 
> which? Q or U
> save from my sapphire r290x with atiflash.
> 
> Q:
> 
> STOCKUBE.txt 128k .txt file
> 
> 
> U:
> 
> STOCKUBE.txt 128k .txt file
> 
> 
> change txt to rom and your good to go.


Thanks for those two.

I might also take a shot with my Sapphire 290.

I'm happy with it as it is - no issues yet - but there's no harm in trying, aye?


----------



## r0l4n

Quote:


> Originally Posted by *r0l4n*
> 
> 
> 
> CHANGELOG:
> beta3:
> 
> support for multi-GPU setups
> Please report any bugs to me in a private message.
> 
> ****USE AT YOUR OWN RISK****
> 
> ABAutoProfileVoltMod_beta3.zip 315k .zip file
> 
> ****USE AT YOUR OWN RISK****


New beta3 available.

I've added GPU selection for crossfire setups. I haven't got the chance to test the new functionality as I only have one card. If anyone on Crossfire runs it and finds issues (or gets it to work), I'd appreciate some feedback


----------



## evensen007

Quote:


> Originally Posted by *blue1512*
> 
> For BF4 multi and a CF setup, you need a beastly CPU. Your 2600K is not enough even at 4.7 GHz. However, this problem will be fixed when the Mantle patch arrives, as Dice said at AMD Developer Summit 14.


This is so very wrong. A 2600K @ 4.7 is not bottlenecking anything. I really don't like it when people throw out incorrect information because they heard it second-hand and repeat it. The statement is so blatantly incorrect that it needs to be put down right away.


----------



## kcuestag

Well, I am sending my R9 290X back and hoping they give me a refund. I'm fed up with these blackscreen issues; I might grab two 290's and hope they don't have blackscreen issues and that they're Hynix.


----------



## airisom2

According to the official slides, Crossfire Hawaii can saturate PCIe 3.0
Quote:


> This hardware DMA engines allow for direct access between the GPUs over the PCI-Express bus solely. AMD has said the video cards can saturate a PCIe 3.0 x16 bus bandwidth; 16GB/s bi-directional. If this is true, it is possible that gamers who are going with two R9 290's or R9 290X cards will now have a need to upgrade to PCIe 3.0 for the best performance. It makes your CPU and PCIe bus speed choice more important than it was in the past, where previously it made little or no difference. Certainly this will be something to go back and test with real world gaming and see if there is an impact.


Source
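The quoted "16GB/s bi-directional" figure can be sanity-checked against the published PCIe line rates. This is just standard-spec arithmetic, not a measurement of what Hawaii's DMA engines actually move:

```python
# Per-direction usable bandwidth of a PCIe x16 link from published line
# rates: PCIe 2.0 runs 5 GT/s with 8b/10b encoding, PCIe 3.0 runs 8 GT/s
# with 128b/130b encoding.

def pcie_x16_gbs(gt_per_s: float, encoding_efficiency: float, lanes: int = 16) -> float:
    """Usable GB/s per direction: transfers/s * efficiency * lanes / 8 bits per byte."""
    return gt_per_s * encoding_efficiency * lanes / 8

gen2 = pcie_x16_gbs(5.0, 8 / 10)      # -> 8.0 GB/s per direction
gen3 = pcie_x16_gbs(8.0, 128 / 130)   # -> ~15.75 GB/s per direction

print(f"PCIe 2.0 x16: {gen2:.2f} GB/s, PCIe 3.0 x16: {gen3:.2f} GB/s")
```

So "saturating PCIe 3.0 x16" implies roughly double what a 2.0 x16 slot can move per direction, which is why the platform debate keeps coming up here, even though other posts in the thread dispute the real-game impact.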


----------



## ZealotKi11er

Quote:


> Originally Posted by *kcuestag*
> 
> Well, I am sending my R9 290X back and hoping they give me a refund, I'm fed up of these blackscreen issues, might grab two 290's and hope they don't have blackscreen issues and hope they're Hynix.


That sucks. I am getting a 290 next week; we'll see how it goes.


----------



## bond32

I had my first black screen yesterday in BF4. Considered myself lucky, haven't had near as many as some of you.


----------



## Kipsofthemud

Quote:


> Originally Posted by *blue1512*
> 
> If you read more about AMD's new CF you will see what I mean. The 2600K at 4.7 GHz is fast, but the PCIe interface is still 2.0. You will be fine with 1 card, but 2x 290 in CF need the bandwidth of PCIe 3.0, which mean your CPU is not enough.


Maybe you should read some more about the new CF yourself mate, 2 290X ran at PCI-E 2.0 *8x* without any performance issues or difference according to Linus from NCIX/Linus Tech Tips.


----------



## evensen007

Quote:


> Originally Posted by *airisom2*
> 
> According to the official slides, Crossfire Hawaii can saturate PCIe 3.0
> Source


I think hardocp misinterpreted AMD's point. I think AMD was saying 'saturate' as in make use of all the available bandwidth. It's definitely an early driver/BF4 thing right now. This guy has an x79 x16x16 and those platforms are exhibiting the same issues:

http://www.overclock.net/t/1440600/r9-290-having-problem-with-crossfire/30


----------



## Arizonian

Quote:


> Originally Posted by *evensen007*
> 
> I think hardocp misinterpreted AMD's point. I think AMD was saying 'saturate' as in make use of all the available bandwidth. It's definitely an early driver/BF4 thing right now. This guy has an x79 x16x16 and those platforms are exhibiting the same issues:
> 
> http://www.overclock.net/t/1440600/r9-290-having-problem-with-crossfire/30


I agree with this. I've read it does saturate all the bandwidth, 2.0 or 3.0. Got a lot of info on Mantle in the OP too.

Also, Mantle will be plugged into Battlefield 4 by mid-December; that news came out of the developer summit this week, reported by Chipp.

http://www.overclock.net/t/1442071/chipp-amd-developer-summit-2013


----------



## black7hought

Quote:


> Originally Posted by *NaifQK*
> 
> I have the same problem
> but it only happen when folding even on stock


That is when it happens to me. I can play Guild Wars 2 for hours, but five minutes of folding causes the issue. I thought it was the display driver crashing, but I haven't had a strong answer to support that theory.


----------



## Scorpion49

Well, here's my screenshot thingy, on stock cooling for now.



Anyone have a link to the flash procedures for these cards? Or is it just the same as the last gen?


----------



## the9quad

Quote:


> Originally Posted by *black7hought*
> 
> That is when it happens to me. I can play Guild Wars 2 for hours but five minutes of folding causes that issue. I thought it was the display driver crashing but l haven't had a strong answer to support that theory.


whats your event log say?


----------



## utnorris

I am getting the inconsistent GPU load in benchmarks, not games (haven't tried those yet), to the point where I might as well be running just one card. This just started recently; it was not happening before. Hopefully I will have time today to work on this and run the cards as singles to make sure one of them is not borked. I have reversed any changes I made prior to this happening and it is still happening, which is why I will probably reinstall the drivers, as they may have gotten corrupted. It's just frustrating going from awesome to meh for no apparent reason.


----------



## evensen007

Quote:


> Originally Posted by *Scorpion49*
> 
> Well heres my screenshot thingy, on stock cooling for now.
> 
> 
> 
> Anyone have a link to the flash procedures for these cards? Or is it just the same as the last gen?


Here ya go:

http://forums.overclockers.co.uk/showthread.php?t=18552408


----------



## Arizonian

Quote:


> Originally Posted by *Scorpion49*
> 
> Well heres my screenshot thingy, on stock cooling for now.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Anyone have a link to the flash procedures for these cards? Or is it just the same as the last gen?


.

http://www.overclock.net/t/1442692/vc-powercolor-radeon-r9-290-unlocked-into-r9-290x/210#post_21198215


----------



## Scorpion49

Thanks guys, dunno if I want to flash these. Just tried to run 3DMark Vantage: black screen after about 10 seconds. Rebooted and tried it again, and it made it to the second test and black screened. So then I tried Firestrike - black screen there as well. *sigh* I did manage a 1050 MHz run in 3DMark11, still not as fast as the 780's I got rid of. Maybe they'll do better under water.


----------



## airisom2

Oh, you guys were talking about the cf gpu usage bug in BF4. Yeah, I've seen a couple of crossfire users that have that problem. As far as Hawaii cf saturating 3.0 on a more general basis, we still need concrete verification. I'm thinking that with their DMA engines working on the frametime latency stuttering (probably the thing that's saturating PCIe), the more bandwidth you give the engines, the more uniform the frametimes will be.


----------



## Tobiman

Yo, Scorpion49, do you still play EvE?


----------



## Scorpion49

Quote:


> Originally Posted by *Tobiman*
> 
> Yo, Scorpion49, do you still play EvE?


I just started up again with one of my older alts. Why?


----------



## Tobiman

Quote:


> Originally Posted by *Scorpion49*
> 
> I just started up again with one of my older alts. Why?


Ah, not much, really, It's just good to see other eve players on the same forum.


----------



## Scorpion49

Quote:


> Originally Posted by *Tobiman*
> 
> Ah, not much, really, It's just good to see other eve players on the same forum.


Yeah, I decided to get back into it a little bit. Haven't done much but mine in highsec; I'm broke, haha. Plus apparently it's a new thing to advertise a corp for the sole purpose of ambushing your new recruits; I lost some stuff like that too. I wish the game was like it was back in 2004-2005, way better even without the fancy graphics.


----------



## ljreyl

I have a weird problem going on with my overclocked 290x

System
AMD 8350 @ 4.7 GHz (1.368 Volts)
8 GB 1866 RAM (Stock Volts)
2600 NB (1.275 volts)
2600 HT (Stock Volts)
R9 290x @ 1150 Core, 1450 memory (1.4 Volts, Max temps 80C, Stock cooling and 80% fan)

So my problem is, I play a game for a while and everything runs fine, no artifacts or anything BUT after about 30 minutes, my sound card cuts in and out. No in game stuttering for video but sound just takes a dump every 2 seconds or so.

Anyone know why?

Sound card - Xonar Phoebus
Note - I disabled onboard sound and didn't install the R9 HDMI sound drivers.

Also, all overclocks were stable (CPU, NB, HT, Etc) when I had my 7970 overclocked at 1250 core and 1800 memory

This is for all games btw...


----------



## skupples

Quote:


> Originally Posted by *anteante*
> 
> I just manage to unlock all shaders on my Powercolor 290 OC with Asus 290X bios. I´m happy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


I can flash my Titan with a 780 Ti BIOS and have it read 2880 cores - does this mean it's now unlocked?

What I mean to say is, there has to be a better way to verify this supposed unlocking of cards... GPU-Z is known to fail at reading things, especially voltages. Why is this any different?

A guy accidentally flashed his 780 Ti with a 780 BIOS and it now reads 2304 cores; does this mean he has locked 592 cores? Another person accidentally flashed a 780 Ti with a Titan BIOS and it read 2688 in GPU-Z; does this mean he locked 192 cores?


----------



## Scorpion49

So I went ahead and flashed the cards; the GPU-Z readings seem to be bouncing all around. I don't think it worked correctly. It will jump from 2816 shaders, to 2560, to "unknown" very rapidly, along with all the other measurements.


----------



## RAFFY

How are my scores for all stock speeds? Also, with the memory reader, how do I get it to read both of my cards? Do I need to pull a GPU, run it, then repeat for the second card?


----------



## Arizonian

Quote:


> Originally Posted by *sun100*
> 
> -SNIP-
> 
> P.S.
> I'd like to join the club !
> GPU-Z Link : http://www.techpowerup.com/gpuz/6sk9h/
> Manufacturer & Brand: Sapphire Radeon R9 290 4 GB
> Cooling: Stock


Congrats - added


----------



## Tobiman

Quote:


> Originally Posted by *Scorpion49*
> 
> Yeah I decided to get back into it a little bit. Haven't done much but mine in highsec, I'm broke haha. Plus apparently its a new thing to advertise a corp for the sole purpose of ambushing your new recruits, I lost some stuff like that too. I wish the game was like it was back in 2004-2005, way better even without the fancy graphics.


I started late 2009 so the game hasn't changed that much for me but it's still a bit different if I compare it now to back then. Especially due to the UI revamp and all.

I heard about the corp ganking stuff too but it wasn't as bad as the bug that was discovered during incursions. A logi at war could transfer aggression to any ship that it reps which allowed the aggressors (in the same fleet) to attack said ship. They would invite clueless incursioners to join their fleet, rep them and instantly blow them to pieces. Tears were shed during that period. Some videos on youtube about it too.

Other crazy bugs like the magnetar wormhole buff that allowed a blaster fitted BS to alpha strike a frig at 0 - 249kms for perfect damage. There's a whole 1 hr video on this bug on youtube.

Even without bugs, people go to great lengths to make others have a bad day. For example, players create alts just to suicide-gank one-day-old characters in noob systems, gank miners in highsec, gank freighters and shiny ships in highsec, drop suicide dreads on jump freighters offloading stuff at a POS, drop 30+ titans on a couple of battleships, and camp highsec-lowsec gates 24/7. No one wants to play fair anymore, and it's all part of the game and what makes EvE what it is now.


----------



## Tobiman

Quote:


> Originally Posted by *ljreyl*
> 
> I have a weird problem going on with my overclocked 290x
> 
> System
> AMD 8350 @ 4.7 GHz (1.368 Volts)
> 8 GB 1866 RAM (Stock Volts
> 2600 NB (1.275 volts)
> 2600 HT (Stock Volts)
> R9 290x @ 1150 Core, 1450 memory (1.4 Volts, Max temps 80C, Stock cooling and 80% fan)
> 
> So my problem is, I play a game for a while and everything runs fine, no artifacts or anything BUT after about 30 minutes, my sound card cuts in and out. No in game stuttering for video but sound just takes a dump every 2 seconds or so.
> 
> Anyone know why?
> 
> Sound card - Xonar Phoebus
> Note - I disabled onboard sound and didn't install the R9 HDMI sound drivers.
> 
> Also, all overclocks were stable (CPU, NB, HT, Etc) when I had my 7970 overclocked at 1250 core and 1800 memory
> 
> This is for all games btw...


I'd report this on AMD's specific driver report web page.


----------



## tsm106

Quote:


> Originally Posted by *Kipsofthemud*
> 
> Quote:
> 
> 
> 
> Originally Posted by *blue1512*
> 
> If you read more about AMD's new CF you will see what I mean. The 2600K at 4.7 GHz is fast, but the PCIe interface is still 2.0. You will be fine with 1 card, but 2x 290 in CF need the bandwidth of PCIe 3.0, which means your CPU is not enough.
> 
> 
> 
> Maybe you should read some more about the new CF yourself, mate: two 290Xs ran at PCI-E 2.0 *8x* without any performance issues or difference, according to Linus from NCIX/Linus Tech Tips.

Here's trifire on GEN2. The difference is HUGE!

http://www.overclock.net/t/1440974/anyone-have-real-world-performance-with-an-r9-290-or-290x-crossfire-on-pcie-2-0-and-eyefinity-5670x1080-rez/0_40#post_21154921

Quote:


> Originally Posted by *skupples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *anteante*
> 
> I just managed to unlock all shaders on my Powercolor 290 OC with the Asus 290X BIOS. I'm happy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can flash my Titan with a 780 Ti BIOS and have it read 2880 cores; does this mean it's now unlocked?
> 
> What I mean to say is, there has to be a better way to verify this supposed unlocking of cards... GPU-Z is known to fail at reading things, especially voltages. Why is this any different?
> 
> A guy accidentally flashed his 780 Ti with a 780 BIOS and it now reads 2304 cores; does this mean he has locked 592 cores? Another person accidentally flashed a 780 Ti with a Titan BIOS and it read 2688 in GPU-Z; does this mean he locked 192 cores?

I don't think ppl realize that merely flashing a 290X Uber BIOS will make a 290 faster. And then, depending on which stock BIOS they ran their benches on, the difference between a 290X Uber and a 290 Quiet is freaking huge. If anything, the flashed cards need to be compared to an actual 290X.


----------



## Arizonian

Quote:


> Originally Posted by *skupples*
> 
> I can flash my Titan with a 780 Ti BIOS and have it read 2880 cores; does this mean it's now unlocked?
> 
> What I mean to say is, there has to be a better way to verify this supposed unlocking of cards... GPU-Z is known to fail at reading things, especially voltages. Why is this any different?
> 
> A guy accidentally flashed his 780 Ti with a 780 BIOS and it now reads 2304 cores; does this mean he has locked 592 cores? Another person accidentally flashed a 780 Ti with a Titan BIOS and it read 2688 in GPU-Z; does this mean he locked 192 cores?


I agree we need more proof than GPU-Z shots. Here's more of what you're seeking, with benchmarks.



*[VC] PowerColor Radeon R9 290 unlocked into R9 290X?* - OCN

http://www.overclock.net/t/1442692/vc-powercolor-radeon-r9-290-unlocked-into-r9-290x/30#post_21194328
http://www.overclock.net/t/1442692/vc-powercolor-radeon-r9-290-unlocked-into-r9-290x/90#post_21194879

*Success! XFX R9 290 flashed to Asus R9 290X!* - LinusTech Tips

http://linustechtips.com/main/topic/76456-success-xfx-r9-290-flashed-to-asus-r9-290x/

EDIT - BTW this member who posted in LinusTechTips is also an OCN member.









I agree that just seeing a GPU-Z screenshot doesn't mean it's unlocked. Some members claiming it was unlocked did not take the time to properly document before/after benchmarks as these two did. So unless a comparison is being made, people can't just take your word for it.

We also might be seeing a common denominator with certain manufacturers like PowerColor and XFX, regardless of memory. However, not all the members saying it unlocked showed proof as solid as these two did.
Quote:


> Originally Posted by *tsm106*
> 
> -SNIP-
> 
> I don't think ppl realize that merely flashing a 290x uber bios on will make a 290 faster. And then depending on which stock bios they tested their benches on the difference between a 290x uber and a 290 quiet is freaking huge. If anything the flashed cards need to be compared to an actual 290x.


Another valid point. The problem would be getting two systems equally spec'd to run those tests, or it'd be a flawed comparison.


----------



## DraXxus1549

Anyone who was having stuttering issues in BF4, try Fullscreen instead of Borderless. I got much more consistent framerates that way.


----------



## Technewbie

Well, I took my 290X out, put my 7970 back in, and submitted a support ticket to MSI, and will hopefully RMA the card to get a new one. One that hopefully won't black screen. :/


----------



## rdr09

Quote:


> Originally Posted by *jtjoetan*
> 
> hey guys why do i get inconsistent fps in bf4? with a r9 290 saphire.


jt, what's the rest of your rig? I was having issues with BF4, so I ran the in-game test and the result ended up telling me to get a better internet connection. What I did was replace my cable, and it helped a bit, but it was still not as good as my 7950 and 7970 gameplay. I monitored the game using AB and saw how much pagefile the game was using. I thought that was the issue, 'cause I had set mine to 3000MB in Win7. I changed it to Auto and, I must say, the game is now as smooth as ever. Not sure if this would help. The game also uses a lot of RAM.

Pagefile set at Auto



set at 3000MB


----------



## Scorpion49

Okay, my Gigabyte 290's did not unlock. I'm going to flash them back to the stock BIOS for now.


----------



## MotionBlur84

ASUS R9 290 @ 290X BIOS:



So... shader unlock doesn't work


----------



## skupples

Quote:


> Originally Posted by *tsm106*
> 
> Here's trifire on GEN2. The difference is HUGE!
> 
> http://www.overclock.net/t/1440974/anyone-have-real-world-performance-with-an-r9-290-or-290x-crossfire-on-pcie-2-0-and-eyefinity-5670x1080-rez/0_40#post_21154921
> I don't think ppl realize that merely flashing a 290x uber bios on will make a 290 faster. And then depending on which stock bios they tested their benches on the difference between a 290x uber and a 290 quiet is freaking huge. If anything the flashed cards need to be compared to an actual 290x.


I'm still skeptical. There has to be a better way to validate this. The difference between the two BIOSes isn't just the name slapped on it. I see people are unlocking voltage control and power target... How else can this be validated? A microscope?


----------



## black7hought

Quote:


> Originally Posted by *the9quad*
> 
> whats your event log say?


The event log is showing a COM crash, and the next event is the recovery from the unexpected shutdown/reboot. It's also listing another issue I'm having after every crash: my USB WiFi adapter driver crashing. When I have to hard reboot because of this crashing issue, I need to unplug the adapter and plug it back in for it to function.


Spoiler: Warning: Spoiler!


----------



## CallsignVega

Any particular brand more likely to have Hynix memory versus Elpida? Just ordered two Sapphire 290X's to play around with. Waited for them to come to Amazon for zero hassle return if necessary.


----------



## Raephen

Quote:


> Originally Posted by *MotionBlur84*
> 
> ASUS R9 290 @ 290X BIOS:
> 
> 
> 
> So... shader unlock doesn't work


I had no luck either... Ah well, I've still got a ballin' good card as it is, and when it's swimming in water I'll check out what my limits are in pushing the clocks.


----------



## kalel8

I love mine, but it's running so hot it's making me nervous. But it's FAAAAAAAAST.


----------



## skupples

Quote:


> Originally Posted by *CallsignVega*
> 
> Any particular brand more likely to have Hynix memory versus Elpida? Just ordered two Sapphire 290X's to play around with. Waited for them to come to Amazon for zero hassle return if necessary.


From what I can tell it seems to be a total luck of the draw with any brand atm.


----------



## Forceman

Quote:


> Originally Posted by *skupples*
> 
> I'm still skeptical. There has to be a better way to validate this. The difference between the two BIOSes isn't just the name slapped on it. I see people are unlocking voltage control and power target... How else can this be validated? A microscope?


What we need is a shader-specific benchmark we can use to compare. I ran the CompuBench test at stock 290 speeds and scored higher than a stock 290 according to their database; another user who had a failed unlock ran it and got exactly 290-level scores, so that may be something to use. I also ran that ShaderToyMark test, but I don't have anything to compare it to.


----------



## rancor

I would love to join








It's a Sapphire 290.



I also have a question on throttling. In Afterburner I have:
core voltage +100mV
aux voltage +50mV
power limit maxed
core clock 1200
memory clock 1500

core temps 49C
VRMs 65C

In Fire Strike the card seems to be throttling, and it's not like I'm hitting temp problems, so am I hitting the power limit of my stock BIOS?
Would the Asus BIOS fix this, or do I need the PT1 BIOS?


----------



## muhd86

Did you switch to Uber mode?

If not, try it and report back.


----------



## grandpatzer

Quote:


> Originally Posted by *rv8000*
> 
> Tried unlocking my Sapphire 290 with Hynix memory using the Asus BIOS; no success. Voltage control, whether with AB beta 17 on the stock BIOS or on the Asus BIOS using GPU Tweak, seems to do nothing in terms of stabilizing my card: 1150-1180 seems to be the wall even with stock volts. Gonna try some more testing with LLC commands and nol4ns tool. Disappointed with this card's overclocking potential atm.


So at stock volts you get 1150-1180, and increasing voltage does not help improve the OC?


----------



## muhd86

Anyone unlock shaders on their Sapphire R9 290 GPUs?

And any link to software which tells which RAM is installed in the R9 290 GPUs?


----------



## Gunderman456

Quote:


> Originally Posted by *muhd86*
> 
> Anyone unlock shaders on their Sapphire R9 290 GPUs?
> 
> And any link to software which tells which RAM is installed in the R9 290 GPUs?


It's in the OP - Post #2!


----------



## NaifQK

Quote:


> Originally Posted by *black7hought*
> 
> That is when it happens to me. I can play Guild Wars 2 for hours but five minutes of folding causes that issue. I thought it was the display driver crashing but l haven't had a strong answer to support that theory.


After closely monitoring the card's parameters, I found the cause of the failure is unstable memory speed (jumping from 150 to 1250 continuously).
So I disabled PowerPlay support with Afterburner to prevent the card from downclocking and tested the card on folding,
and the fluctuation in memory speed is gone.
I hope the problem is gone too, since it's been 2 hours and it hasn't failed.









FYI, the card was overclocked 10% (1100/1375). Gonna try higher, but after a day or so, to make sure it is stable.

Keep in mind that disabling PowerPlay will keep your card hot.
I am under water, and the temperatures with the GPU idle:
GPU = 45
VRM1 = 44
VRM2 = 40

Temperatures while folding:
GPU = 48
VRM1 = 47
VRM2 = 43

With PowerPlay, all idle temperatures were around 30-33.
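As a quick sanity check, that 10% figure lines up with the commonly cited 290X reference clocks of 1000 MHz core and 1250 MHz memory (those stock values are my assumption, not stated in the post):

```python
# Check an overclock percentage against assumed reference clocks
# (assumed stock 290X: 1000 MHz core, 1250 MHz memory).
def oc_percent(new_mhz, stock_mhz):
    """Return the overclock as a percentage over stock, rounded to 0.1."""
    return round((new_mhz / stock_mhz - 1) * 100, 1)

print(oc_percent(1100, 1000))  # 10.0 (core)
print(oc_percent(1375, 1250))  # 10.0 (memory)
```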


----------



## rancor

Quote:


> Originally Posted by *muhd86*
> 
> Did you switch to Uber mode?
> 
> If not, try it and report back.


No change. I don't think the 290s have an Uber mode.


----------



## Arizonian

Quote:


> Originally Posted by *rancor*
> 
> I would love to join
> 
> 
> 
> 
> 
> 
> 
> 
> It's a Sapphire 290.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> I also have a question on throttling. In Afterburner I have:
> core voltage +100mV
> aux voltage +50mV
> power limit maxed
> core clock 1200
> memory clock 1500
> 
> core temps 49C
> VRMs 65C
> 
> In Fire Strike the card seems to be throttling, and it's not like I'm hitting temp problems, so am I hitting the power limit of my stock BIOS?
> Would the Asus BIOS fix this, or do I need the PT1 BIOS?


Congrats - added


----------



## Blackroush

Dear Moderator this is my new 290 cooler


----------



## Arizonian

Quote:


> Originally Posted by *Blackroush*
> 
> Dear Moderator this is my new 290 cooler
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Wow. Don't know if I should put you down for Aftermarket or the first Modified.


----------



## overclockFrance

Quote:


> Originally Posted by *Blackroush*
> 
> Dear Moderator this is my new 290 cooler


What are the heatsink and the fans?


----------



## Maxxa

Quote:


> Originally Posted by *Blackroush*
> 
> Dear Moderator this is my new 290 cooler
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 


That is awesome you should post a breakdown on it.


----------



## Mr357

Quote:


> Originally Posted by *Arizonian*
> 
> Wow. Don't know if I should put you down for Aftermarket or the first Modified.


That's a mod if I ever saw one.








Quote:


> Originally Posted by *overclockFrance*
> 
> What are the heatsink and the fans?


The fans look to be Scythe GTs.


----------



## Blackroush

This is the GELID Icy Vision Rev. 2; I changed the fans to Gentle Typhoon AP-15s.

During BF4 (compared to the Gelid stock fans, this is so quiet):


----------



## Kipsofthemud

Quote:


> Originally Posted by *tsm106*
> 
> Here's trifire on GEN2. The difference is HUGE!
> 
> http://www.overclock.net/t/1440974/anyone-have-real-world-performance-with-an-r9-290-or-290x-crossfire-on-pcie-2-0-and-eyefinity-5670x1080-rez/0_40#post_21154921
> I don't think ppl realize that merely flashing a 290x uber bios on will make a 290 faster. And then depending on which stock bios they tested their benches on the difference between a 290x uber and a 290 quiet is freaking huge. If anything the flashed cards need to be compared to an actual 290x.


I never thought Linus would lie, but I guess he did







It's tri-fire though; would it be exactly the same with crossfire?


----------



## spitty13

Hey, does anyone know how to change GPU Tweak so it displays the effective memory clocks instead of the 5000MHz? Been looking around but can't seem to find where that option is.


----------



## R35ervoirFox

Quote:


> Originally Posted by *muhd86*
> 
> did u swtich to the uber mode ---
> 
> if not try it and report back .


Quote:


> Originally Posted by *rancor*
> 
> no change I don't think the 290s have an uber mode


No, the Sapphire 290 that I have has the exact same BIOS in position 1 as position 2.

Bit pissed now; neither the Asus nor the Sapphire 290X BIOS worked. The display wouldn't work after flashing: the computer booted, but with a black screen.








Wish I had waited a day or two; I haven't heard of a single Sapphire 290 unlocking, but it seems some XFX and PowerColor cards do.
I'm sure it's still a lottery, but I would still like a chance in the lotto










Anyway, off to flash Jpmboy's Uber and Quiet BIOSes just to make sure.


----------



## jerrolds

Quote:


> Originally Posted by *evensen007*
> 
> This is so very wrong. a 2600k @ 4.7 is not bottle-necking anything. I really don't like when people throw out incorrect information because they heard it second hand and repeat it. The statement is so blatantly incorrect that it needs to be put down right away.


Exactly - my [email protected] handles BF4 multi just like it does campaign mode: 80-90fps outside, 110fps+ indoors @1440p, High everything/No AA/HBAO. I would suspect a 2500K @ 4.7GHz would be almost as good.


----------



## Mr357

Quote:


> Originally Posted by *spitty13*
> 
> Hey, does anyone know how to change GPU Tweak so it displays the effective memory clocks instead of the 5000MHz? Been looking around but can't seem to find where that option is.


You mean you want it to display 1250MHz instead of 5000? 5000 is the effective memory clock, whereas 1250 is the actual memory clock. I don't know that there is an option to change it; you may just have to divide by 4 every time you want to know.
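The divide-by-four rule comes from GDDR5 being quad-pumped (four data transfers per memory clock cycle), which is why tools disagree on what to display. A trivial sketch of the conversion:

```python
# GDDR5 is quad-pumped: four data transfers per clock cycle, so the
# "effective" clock reported by some tools is 4x the actual clock.
def to_effective(actual_mhz):
    return actual_mhz * 4

def to_actual(effective_mhz):
    return effective_mhz // 4

print(to_effective(1250))  # 5000 -- what GPU Tweak displays
print(to_actual(5000))     # 1250 -- the real memory clock
```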


----------



## galil3o

Guys, I apologize if this has been addressed already; I did my best to search through the thread, but it's rather huge at this point.

I just received my Sapphire R9 290X yesterday and am trying to establish a decent overclock and learn my way around this whole PowerTune thing.
My problem, it seems, is that my core voltage is not elevating itself to the level I'm setting in CCC. I pushed it all the way to +50% and set the max fan speed to 100% to see what it could do with no thermal restrictions.

I can get up to about a 7.5% core clock increase before the artifacts start to appear in the 3DMark Fire Strike test (haven't even tried to mess with the memory yet). However, when I go to GPU-Z after running the benchmark, the max VDDC I ever reach is about 1.1V, when my understanding is that it should jump to 1.5V.

Has anyone experienced or heard of this happening?


----------



## utnorris

So, something I came across as I was rebuilding my system: for those having issues with black screens, is anyone on an AMD setup having this issue? I was rebuilding my OS, and after I installed the Intel graphics driver I started having issues again. Keep in mind this is a fresh install, so no other conflicts had happened until I installed the Intel drivers. If the black screen is only happening on Intel setups, this might be the issue. Obviously this is just a thought and I have not confirmed it, but I thought I would throw it out there for everyone to think about.


----------



## skupples

Quote:


> Originally Posted by *OccamRazor*
> 
> *Revised Afterburner beta 17*
> 
> http://www.guru3d.com/files_get/msi_afterburner_beta_download,19.html


----------



## jerrolds

Quote:


> Originally Posted by *galil3o*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Guys, I apologize if this has been addressed already; I did my best to search through the thread, but it's rather huge at this point.
> 
> I just received my Sapphire R9 290X yesterday and am trying to establish a decent overclock and learn my way around this whole PowerTune thing.
> My problem, it seems, is that my core voltage is not elevating itself to the level I'm setting in CCC. I pushed it all the way to +50% and set the max fan speed to 100% to see what it could do with no thermal restrictions.
> 
> I can get up to about a 7.5% core clock increase before the artifacts start to appear in the 3DMark Fire Strike test (haven't even tried to mess with the memory yet). However, when I go to GPU-Z after running the benchmark, the max VDDC I ever reach is about 1.1V, when my understanding is that it should jump to 1.5V.
> 
> Has anyone experienced or heard of this happening?


Grab the newest MSI Afterburner Beta 17; it supports 290(X) voltage control. Most 290Xs can hit 1100MHz on stock voltage as long as the card is cool enough (~55% on the stock fan).

After that you'll probably need to give it some juice. AB can go up to 1350mV, but anything over 1120MHz and you're going to need to crank the fan/voltage. For me the wall is 1150MHz, and I would have to push voltage up to 1310mV+; at that point my VRM temps get pretty toasty.

If you wanna push past 1350mV, you may have to use GPU Tweak and flash your BIOS to the Asus one.

I've gone up to 1220 on the core, but VRM1 temps hit 105C (1400mV in GPU Tweak, ~1.3V actual) before I shut it down. GPU temp was only 65C, so I need higher-pressure fans or something.


----------



## galil3o

Quote:


> Originally Posted by *jerrolds*
> 
> Grab the newest MSI Afterburner Beta 17; it supports 290(X) voltage control. Most 290Xs can hit 1100MHz on stock voltage as long as the card is cool enough (~55% on the stock fan).
> 
> After that you'll probably need to give it some juice. AB can go up to 1350mV, but anything over 1120MHz and you're going to need to crank the fan/voltage. For me the wall is 1150MHz, and I would have to push voltage up to 1310mV+; at that point my VRM temps get pretty toasty.
> 
> If you wanna push past 1350mV, you may have to use GPU Tweak and flash your BIOS to the Asus one.
> 
> I've gone up to 1220 on the core, but VRM1 temps hit 105C (1400mV in GPU Tweak, ~1.3V actual) before I shut it down. GPU temp was only 65C, so I need higher-pressure fans or something.


Thank you, that's kinda what I thought was wrong. I have a watercooling loop on its way in the mail, so I will probably be looking to push past 1.35V. I will check out the Asus BIOS flashing. Can you point me to a good thread for that procedure?


----------



## airisom2

To Flashed 290 users: There is an official 290 -> 290x unlock thread now.

http://www.overclock.net/t/1443242/official-r9-290-290x-unlock-thread#post_21203153


----------



## nemm

Well, I tried the Sapphire 290X with no joy on unlocking, but never mind, still happy.

I applied the suggestions made in regards to a previous post of mine and they worked wonders: the 3DMark11 graphics score went from 30058 to 31738, and 3DMark13 from 9484 to 10799.



Spoiler: 3DMark11 Before:









Spoiler: 3DMark11 After:









Spoiler: 3DMark13 Before:









Spoiler: 3DMark13 After:







I've noticed that the LLC command needs applying every boot; is there a way for it to auto-apply on startup?

Out of interest, are my scores in line with what they should be?

*Missed out the clocks: 1180/1450 x2


----------



## muhd86

Quote:


> Originally Posted by *R35ervoirFox*
> 
> No, the Sapphire 290 that I have has the exact same BIOS in position 1 as position 2.
> 
> Bit pissed now; neither the Asus nor the Sapphire 290X BIOS worked. The display wouldn't work after flashing: the computer booted, but with a black screen
> 
> 
> 
> 
> 
> 
> 
> 
> Wish I had waited a day or two; I haven't heard of a single Sapphire 290 unlocking, but it seems some XFX and PowerColor cards do.
> I'm sure it's still a lottery, but I would still like a chance in the lotto
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway, off to flash Jpmboy's Uber and Quiet BIOSes just to make sure.


If there are 2 BIOS switches, then just switch to the 2nd BIOS... or can you not boot from either? Hmm, I wonder why XFX cards are unlocking and not others. Well, I guess it's a waiting game.
Have to see if my GPUs are unlockable or not.


----------



## muhd86

I want some information, please:

I have 3 Dell 30-inch LCDs (3007WFP models) and they only have DVI connectors. I want to use Eyefinity.

Now, only with dual-link DVI cables can one get the full 2560x1600 resolution of the 30-inch LCD; using any other cable it only goes to 1920x1080.

So how do I hook up the 3rd LCD? I don't have a DisplayPort cable or monitor, and last time, with the 7970 series of GPUs, I could not get Eyefinity to work, as it required one display to be connected via DisplayPort.

Can I hook up an HDMI-to-DVI cable for the 3rd LCD, and will an HDMI-to-DVI cable support the full resolution of the LCD?


----------



## R35ervoirFox

Quote:


> Originally Posted by *muhd86*
> 
> Did you switch to Uber mode?
> 
> If not, try it and report back.


Quote:


> Originally Posted by *muhd86*
> 
> If there are 2 BIOS switches, then just switch to the 2nd BIOS... or can you not boot from either? Hmm, I wonder why XFX cards are unlocking and not others. Well, I guess it's a waiting game.
> Have to see if my GPUs are unlockable or not.


Both switches have the exact same BIOS, and you can boot from both. But once I flash a 290X BIOS, it will boot but nothing will display on the screen. Only XFX or PowerColor cards unlock, from what has been reported.


----------



## R35ervoirFox

Quote:


> Originally Posted by *muhd86*
> 
> I want some information, please:
> 
> I have 3 Dell 30-inch LCDs (3007WFP models) and they only have DVI connectors. I want to use Eyefinity.
> 
> Now, only with dual-link DVI cables can one get the full 2560x1600 resolution of the 30-inch LCD; using any other cable it only goes to 1920x1080.
> 
> So how do I hook up the 3rd LCD? I don't have a DisplayPort cable or monitor, and last time, with the 7970 series of GPUs, I could not get Eyefinity to work, as it required one display to be connected via DisplayPort.
> 
> Can I hook up an HDMI-to-DVI cable for the 3rd LCD, and will an HDMI-to-DVI cable support the full resolution of the LCD?


So you're wondering if you can get either an HDMI or DisplayPort to dual-link DVI adapter? I would imagine so; I guess it needs to be active (needs power). The 7970 couldn't drive two DVI displays plus HDMI on the internal clock, but the 290 doesn't have this limitation, so you can use all 4 ports at once.
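For the curious, the dual-link requirement falls out of rough pixel-clock arithmetic. The 165 MHz single-link TMDS ceiling comes from the DVI spec; the 20% blanking allowance below is a simplifying assumption of mine, since real mode timings vary:

```python
# Rough pixel-clock check for why 2560x1600 needs dual-link DVI.
# A single TMDS link is capped at a 165 MHz pixel clock; the 1.20
# factor is a crude stand-in for blanking-interval overhead.
SINGLE_LINK_LIMIT_MHZ = 165

def approx_pixel_clock_mhz(width, height, refresh_hz, blanking=1.20):
    return width * height * refresh_hz * blanking / 1e6

for w, h in [(1920, 1080), (2560, 1600)]:
    mhz = approx_pixel_clock_mhz(w, h, 60)
    links = 1 if mhz <= SINGLE_LINK_LIMIT_MHZ else 2
    print(f"{w}x{h}@60: ~{mhz:.0f} MHz -> {links} link(s)")
```

So 1920x1080@60 (~149 MHz) fits on one link, while 2560x1600@60 (~295 MHz) exceeds it and needs both links, which is why a passive single-link adapter caps these 30-inch panels at lower resolutions.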


----------



## ZealotKi11er

So my R9 290 will be paired with an AMD A8-3870K. Hoping Mantle will make it so this CPU can power this card fine.


----------






## nemm

I sorted the LLC problem, the AB profile wasn't saved after the command was applied so it is all good now.









Scratch that; it never worked, and I still need to apply it every restart.


----------



## webafile

Quote:


> Originally Posted by *AlphaC*
> 
> Read the fine print.
> 
> XFX has a lifetime warranty for USA/Canada. & must be registered
> 
> They could be sneaky and say it's for the lifetime of the product (i.e. 2/3 yr) ala Zotac.


http://www.newegg.com/Product/Product.aspx?Item=N82E16814150683

No, the XFX R9 290 is only 2 years now, not lifetime warranty. Not sure what made them change; might be the fact that they run hot, so XFX is, like, not going to replace them.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *muhd86*
> 
> anyone unlock shaders on there sapphire r290 gpus ---
> 
> and any link to the software which tells which ram is installed in the r290 gpus


Wait... you can unlock the 290 into a 290X??? Is there anything warranty-voiding about this?


----------



## tsm106

Quote:


> Originally Posted by *Kipsofthemud*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Here's trifire on GEN2. The difference is HUGE!
> 
> http://www.overclock.net/t/1440974/anyone-have-real-world-performance-with-an-r9-290-or-290x-crossfire-on-pcie-2-0-and-eyefinity-5670x1080-rez/0_40#post_21154921
> I don't think ppl realize that merely flashing a 290x uber bios on will make a 290 faster. And then depending on which stock bios they tested their benches on the difference between a 290x uber and a 290 quiet is freaking huge. If anything the flashed cards need to be compared to an actual 290x.
> 
> 
> 
> I never thought Linus would lie, but I guess he did
> 
> 
> 
> 
> 
> 
> 
> It's tri-fire though; would it be exactly the same with crossfire?

The load in tri is a lot higher than in dual, so if there was a penalty, it would be worse on tri. In the future, take Linus with more salt. Btw, his fan tests are a joke too.

Quote:


> Originally Posted by *nemm*
> 
> Well, I tried the Sapphire 290X with no joy on unlocking, but never mind, still happy.
> 
> I applied the suggestions made in regards to a previous post of mine and they worked wonders: the 3DMark11 graphics score went from 30058 to 31738, and 3DMark13 from 9484 to 10799.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Spoiler: 3DMark11 Before:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: 3DMark11 After:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: 3DMark13 Before:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: 3DMark13 After:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've noticed that the LLC command needs applying every boot; is there a way for it to auto-apply on startup?
> 
> Out of interest, are my scores in line with what they should be?


Your results are just as expected in regards to the gains from simply flashing a 290X BIOS. The scores themselves look to be in the ballpark, without knowing any specifics.

Quote:


> Originally Posted by *tsm106*
> 
> I don't think ppl realize that *merely flashing a 290x uber bios on will make a 290 faster*. And then depending on which stock bios they tested their benches on the difference between a 290x uber and a 290 quiet is freaking huge. If anything the flashed cards need to be compared to an actual 290x.


----------



## Jpmboy

Eh, gotta RMA my R9 290X. Bummed.


----------



## bond32

Why? Which one did you have?


----------



## rv8000

Quote:


> Originally Posted by *grandpatzer*
> 
> so stock volts you get 1150-1180 and increasing voltage does not help improve OC?


So far this seems to be the case. I went to rerun Valley last night at 1170 with +100mV, which it ran with zero artifacting the other day, and on the second scene today it was getting artifacts everywhere. Vcore seems to bounce all over the place during 3D loads. I've tried using the LLC command for AB, and I'll get a steady 1.25V according to GPU-Z, but I'll black screen almost immediately (under 3D load). Starting to wonder if it may be PSU related. Even at 1.25V with my CPU @ 4.5, I don't think I'm pulling more than 650W; max load tests in the reviews for my PSU were above 720W from what I remember. *scratches head*
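That 650 W guess can be framed as a quick back-of-envelope budget. Every number below is an illustrative assumption of mine (component draws vary widely with voltage and load), not a measurement:

```python
# Back-of-envelope PSU load estimate. All component figures here are
# illustrative assumptions, not measurements.
def estimated_wall_draw_w(gpu_w, cpu_w, rest_w=100, psu_efficiency=0.90):
    dc_load = gpu_w + cpu_w + rest_w      # what the PSU rails must supply
    return dc_load / psu_efficiency       # what a wall meter would read

# e.g. assuming an overvolted 290 at ~350 W and an OC'd CPU at ~180 W:
print(350 + 180 + 100)                         # 630 W DC load on the PSU
print(round(estimated_wall_draw_w(350, 180)))  # ~700 W at the wall
```

The distinction matters when comparing against review numbers: wall-meter figures include PSU conversion losses, while the PSU's rated capacity refers to the DC side.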


----------



## utnorris

For black screens try removing the intel driver completely. It worked for me.


----------



## the9quad

Received a Sapphire R9 290X today from Amazon. The good news is the black screens are gone, which makes me think it was definitely a hardware issue with the first card.

The better news is I will now have two Sapphire 290Xs when the one I RMA'd to Newegg gets here next week!


----------



## ledzepp3

Quote:


> Originally Posted by *the9quad*
> 
> Received a Sapphire R9 290X today from Amazon. The good news is the black screens are gone, which makes me think it was definitely a hardware issue with the first card.
> 
> The better news is I will now have two Sapphire 290Xs when the one I RMA'd to Newegg gets here next week!


Again, any whine?


----------



## rv8000

Anyone here with a 290, an 800W+ quality PSU, aftermarket/WC cooling, and some time willing to test power draw with LLC commands and additional voltage? PM me please; I might make a separate thread as well. Or does anyone know of any threads/forums/sites that have checked power draw for OC'd/OV'd 290/290X cards?


----------



## the9quad

Quote:


> Originally Posted by *ledzepp3*
> 
> Again, any whine?


I honestly don't hear any, but my hearing might not be the best, bro; I'm 40 yrs. old


----------



## skupples

Quote:


> Originally Posted by *the9quad*
> 
> I honestly don't hear any, but my hearing might not be the best, bro; I'm 40 yrs. old


High possibility of not being able to hear coil whine, for sure. I can still hear my ANCIENT "flat screen" whine, but I can't ever hear my Titans, even @ 1.35V, and I'm only 25. (Other people have claimed to hear them cry, though.)


----------



## HighTemplar

Quote:


> Originally Posted by *tsm106*
> 
> The load in tri is a lot higher than in dual, so if there was a penalty, it would be worse on tri. In the future, take Linus with more salt. Btw, his fan tests are a joke too.
> Your results are just as expected in regards to the gains from simply flashing a 290X BIOS. The scores themselves look to be in the ballpark, without knowing any specifics.


Linus may have a few issues in his methodology, but he is by FAR the most knowledgeable YouTuber in the PC niche. It pains me watching the other wannabes try to review hardware and give tech advice when they made their bones as a gaming channel. Then you try to correct them in the comments and they ban you. *Ahem* RivalxFactor *Ahem*


----------



## zpaf

Asus R9 290 non X










Generic VGA video card benchmark result - Intel Core i7-4770K,ASUSTeK COMPUTER INC. MAXIMUS VI IMPACT


----------



## HighTemplar

I get coil whine from both of my 780 Ti's. It's only when the frames are insane such as in the thousands (in menus and such), but it's there.


----------



## kitsunestarwind

Quote:


> Originally Posted by *utnorris*
> 
> So something I came across as I was rebuilding my system. For those having issues with black screens, is anyone on an AMD setup having this issue? I was rebuilding my OS and after I installed the Intel Graphics driver I started having issues again. Keep in mind this is a fresh install, so no other conflicts had happened until I installed the Intel drivers. If the black screen is only happening on Intel setups, this might be the issue. Obviously this is just a thought and I have not confirmed this, but i thought I would throw it out there for everyone to think about.


I tested mine of 3 different systems, One Based on AMD X6 1090T, AMD A10-5800K and a Intel Core i5-3570k System. Card exhibited random black screening on all 3 test systems (didn't want to pull the cards from my X79 rig till I have new 290's all waterblocked)

But my XFX R9-290 was bad, Downclocking the memory made it more stable, but a day later even that wouldn't help it and it got progressively worse.
It has now been sent back on RMA and will try my luck again. not pulling my Tri-Fire 7970's till i got at least 2 good 290's and waterblocks


----------



## Porter_

Quote:


> Originally Posted by *HighTemplar*
> 
> I get coil whine from both of my 780 Ti's. It's only when the frames are insane such as in the thousands (in menus and such), but it's there.


I think that's pretty common. My GTX 460's and 7970 did that. My 290X does it too.


----------



## Rar4f

Hows the R9 290 been for you ever since AMD released a driver to fix the issues it had?


----------



## Technewbie

At least the biggest problem we have with 290(x)'s is black screens, while some 780 ti's explode .-.


----------



## Pfortunato

whut? xD Explode?


----------



## rv8000

Quote:


> Originally Posted by *zpaf*
> 
> Asus R9 290 non X
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Generic VGA video card benchmark result - Intel Core i7-4770K,ASUSTeK COMPUTER INC. MAXIMUS VI IMPACT


What was the voltage for that core clock?


----------



## Technewbie

Quote:


> Originally Posted by *Pfortunato*
> 
> whut? xD Explode?


Lol yeah http://linustechtips.com/main/topic/75958-galaxy-issues-gtx-780-ti-recalls-mosfets-exploding/


----------



## Pfortunato

Well, between 95º and a mosfets popcorn maker I take the 95º . Poor Galaxy xD


----------



## eternal7trance

Actually on a heaven benchmark, if you make a better custom fan profile you can stay around 70c or less


----------



## Scorpion49

Quote:


> Originally Posted by *utnorris*
> 
> For black screens try removing the intel driver completely. It worked for me.


I have black screens but on X79 so there is no display driver installed. Maybe it is limited to Intel setups? I'm about to pick up a Crosshair V and FX chip, I kind of feel like having a true AMD build (and this Rampage IV Extreme is giving me problems), I can give it a shot and see if its better.


----------



## youra6

Quote:


> Originally Posted by *Scorpion49*
> 
> I have black screens but on X79 so there is no display driver installed. Maybe it is limited to Intel setups? I'm about to pick up a Crosshair V and FX chip, I kind of feel like having a true AMD build (and this Rampage IV Extreme is giving me problems), I can give it a shot and see if its better.


Scorpion, you had a question for me that I never got to yesterday. What was it again? It was about your 3dmark 11 scores.


----------



## Scorpion49

Quote:


> Originally Posted by *youra6*
> 
> Scorpion, you had a question for me that I never got to yesterday. What was it again? It was about your 3dmark 11 scores.


I think it was about my score being lower, I asked if you had a fan curve set because I was running all stock, hadn't switched the BIOS either yet.


----------



## youra6

Quote:


> Originally Posted by *Scorpion49*
> 
> I think it was about my score being lower, I asked if you had a fan curve set because I was running all stock, hadn't switched the BIOS either yet.


Oh right, I'm under water so no fan profile 

@1000/1250mhz, I was getting a graphics score P29200. My i5 is @ 4.5Ghz. Memory is at 2133 @9-10-10-24.


----------



## LazarusIV

Add me!!! Add me!!! Just got my HIS R9 290 the other day, it's testing out pretty well so far. I'm on air for the moment, later this month I'll snag some wc equipment and dunk my computer. So far I've gotten to 1090 / 1375 with +50 on power, any higher on either core or memory gives me artifacts in Heaven. I had the black screen issue when I first got the drivers installed, but only when I initially used Overdrive to increase the power limit. I downloaded MSI Afterburner and disabled Overdrive, I do everything with Afterburner now and I've had no issues whatsoever. Qnix monitor is incoming, should be here by Monday.

Proof:


Heaven (1920x1200 @ 1090 / 1350):


I haven't messed with a different BIOS yet, I'm not sure I will. I've never done it and I'd like to get used to the card first. Also, I'd like to see how far I can overclock my new monitor, then I'll squeeze every last FPS out of this card that I can. I probably won't get a second one until next summer.


----------



## galil3o

Can anyone point me to a good thread or possibly a post in this one that gives a guide for bios flashing and some of the more advanced overclocking methods for 290x. This thread has become very large and its difficult to digest and the overclocking of this card is such a radical departure from cards I've worked with before.

I've got the new Afterburner up and running so I can adjust the voltage on my Sapphire 290x. I have a watercooling system on its way so I'm trying to learn as much as possible before hand.


----------



## skupples

Quote:


> Originally Posted by *Technewbie*
> 
> At least the biggest problem we have with 290(x)'s is black screens, while some 780 ti's explode .-.


Quote:


> Originally Posted by *Technewbie*
> 
> Lol yeah http://linustechtips.com/main/topic/75958-galaxy-issues-gtx-780-ti-recalls-mosfets-exploding/


this never actually happened. The supposed exploding 780Ti was actually a reference Tahiti PCB of some sort, the supposed massive recall, was ONE batch because of a misplaced sticker. Feel free to review the topic. We have a member on OCN who's had a tahiti do the same thing actually. It's linked in the review below. The galaxy rep made a statement about it in the owners club the other night. 100% confirmed all of our speculation. The authors who put these stories out are doing a major disservice to the community by not retracting/redacting/correcting their failed stories. oh, & the one batch that was recalled due to a *misplaced sticker* was only in China.


Spoiler: Warning: Spoiler!



Originally Posted by Galaxy View Post

Actually yes I do. With all the speculation that's been going on I think you'll be surprised to hear how simple the problem actually was. This batch of cards failed because the s*erial number sticker was somehow placed over the back of a MOSFET*. Seriously*. No faulty parts, no bad GPUs* or anything like that. You'd be surprised how many problems a sticker being slightly out of place can cause. I'm sure it goes without saying the guy who calibrates the sticker sticking machine has gotten more attention than he's used to recently. Either way there's *no risk of this happening to any cards in North America, Europe, Australia, or anywhere else.* Just the one batch in the Chinese market was affected.



Facts, Falacies, & Speculation of the fried Taiwanese 780Ti


----------



## utnorris

So the clean install and removing the Intel GPU driver has fixed my randon GPU usage issue I was having. Not sure exactly what caused it in the first place, but I am back to getting benches where I was previously. Of course I have to spend the rest of the night reinstalling everything, but at least it is working properly now.


----------



## evensen007

Quote:


> Originally Posted by *utnorris*
> 
> So the clean install and removing the Intel GPU driver has fixed my randon GPU usage issue I was having. Not sure exactly what caused it in the first place, but I am back to getting benches where I was previously. Of course I have to spend the rest of the night reinstalling everything, but at least it is working properly now.


Fuuuuu. I'm half ecstatic and half destroyed to hear that. I am not looking fwd to a full reinstall to resolve that issue.


----------



## skupples

Quote:


> Originally Posted by *evensen007*
> 
> Fuuuuu. I'm half ecstatic and half destroyed to hear that. I am not looking fwd to a full reinstall to resolve that issue.


Theirs no way to remove the drivers w/o a system wipe? This may help?

http://www.intel.com/support/graphics/sb/CS-034574.htm


----------



## Forceman

Just a head's up to anyone with a XFX card - they are giving out BF4 codes for them. I just got a code for my 290 by registering it at xfxsupport.com and clicking the game code link.


----------



## utnorris

I tried manually uninstalling all graphic drivers before I did a system wipe and it didn't help which led me to believe something else happened before hand to cause the issue. Doing a clean wipe ensured I didn't have any lingering drivers from previous installs that would cause problems. Anyways, this is what worked for me, may not be needed or work for everyone.


----------



## eternal7trance

Is there a modded bios floating around that gives a more aggressive fan setup? This is pretty dumb to only let the fan go to 55% without using MSI AB


----------



## hotrod717

Quote:


> Originally Posted by *Forceman*
> 
> Just a head's up to anyone with a XFX card - they are giving out BF4 codes for them. I just got a code for my 290 by registering it at xfxsupport.com and clicking the game code link.


Nice! Thanks for that.


----------



## evensen007

Quote:


> Originally Posted by *skupples*
> 
> Theirs no way to remove the drivers w/o a system wipe? This may help?
> 
> http://www.intel.com/support/graphics/sb/CS-034574.htm


There is, but driver removal can be black magic at times and even the most tenacious scrub can leave crap behind. I had a heck of a time getting the drivers to install correctly for the new 290's right off the bat, so that should have been my first warning sign. I even used the full ATIMan scrub.


----------



## grandpatzer

Quote:


> Originally Posted by *Jpmboy*
> 
> Cool. Always put a non conductive tim on both contact surfaces before using a t-pad. A very tiny amt on the vrm, you can thin coat the contact surface on the water block.


what is thin coat?


----------



## AfterFX

So do I go for a 290/x card that is priced right and performs but runs hot with fan noise?

Or do I go 780 ti that goes poof?


----------



## eternal7trance

Quote:


> Originally Posted by *AfterFX*
> 
> So do I go for a 290/x card that is priced right and performs but runs hot with fan noise?
> 
> Or do I go 780 ti that goes poof?


You mean should you get a 780ti that doesn't go poof because they already recalled the batch or should you get a 290/x card that runs fine with fan noise or hot with low fan noise


----------



## skupples

Quote:


> Originally Posted by *eternal7trance*
> 
> You mean should you get a 780ti that doesn't go poof because they already recalled the batch or should you get a 290/x card that runs fine with fan noise or hot with low fan noise


& that batch was only from one manufacturer, and was only in china, and none of them went poof, they simply didn't boot because of a metal serial code sticker that was incorrectly placed.









It's rather sad how quickly lies spread. I bet the guy who started that rumor is STILL lol'ing him self to sleep every night. I know I would be.


----------



## airisom2

Quote:


> Originally Posted by *AfterFX*
> 
> So do I go for a 290/x card that is priced right and performs but runs hot with fan noise?
> 
> Or do I go 780 ti that goes poof?


You could also wait for non-reference 290xs to come out, or buy a 290x now and slap an aftermarket cooler on it.


----------



## brazilianloser

As long as the replacement card I will be getting works on Monday I could care less for any of the two sides... I just want to play my darn games.


----------



## Sgt Bilko

Quote:


> Originally Posted by *AfterFX*
> 
> So do I go for a 290/x card that is priced right and performs but runs hot with fan noise?
> 
> Or do I go 780 ti that goes poof?


Well think of it this way.

AMD gives you Popcorn

Nvidia gives you Fireworks









I seen a post a few pages back suggesting that the whole 780Ti goes pop thing was actually a Tahiti GPU. not sure on the full story of it


----------



## AfterFX

Quote:


> Originally Posted by *skupples*
> 
> & that batch was only from one manufacturer, and was only in china, and none of them went poof, they simply didn't boot because of a metal serial code sticker that was incorrectly placed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's rather sad how quickly lies spread. I bet the guy who started that rumor is STILL lol'ing him self to sleep every night. I know I would be.


Oh, A card that will not boot is better than one that performs at high temps and loud fans?

That is the card I want.........The one that don't boot.......................


----------



## skupples

Quote:


> Originally Posted by *AfterFX*
> 
> Oh, A card that will not boot is better than one that performs at high temps and loud fans?
> 
> That is the card I want.........The one that don't boot.......................


remove sticker, hey look it now boots! The point is, you would of never, and will never get one of the 200(estimate number, probably lower) defective chinese 780ti's made by galaxy. It was one batch, distributed in one country, by one vendor. You might as well wipe off any galaxy made product for the rest of history from either amd or nvidia... Just to be safe.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well think of it this way.
> 
> AMD gives you Popcorn
> 
> Nvidia gives you Fireworks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I seen a post a few pages back suggesting that the whole 780Ti goes pop thing was actually a Tahiti GPU. not sure on the full story of it


If you want the full story, please go read my thread.

this thread, debunk the whole thing, and set the entire story straight

It was a masterful plot, put forth by some one who knew no one would fact check it before it spun the whole globe. Still waiting for some one to translate the Cantonese on the supposed recall lists. Now that galaxy has confirmed what happened, they are fake, we just don't know what they really are. They are likely the entire batch list for all of China.


----------



## AfterFX

What makes you think I didn't get one?


----------



## Redvineal

Hi all R9 290(X) owners! I plan to join the ranks real soon, but want to settle on an alternative cooling solution for OC potential before I pull the trigger.

Let me say, first, that I'm *NOT* at all interesting in water cooling, and I cannot be swayed otherwise! Whatever I wind up doing with air will be my first experience with a custom mod. Baby steps are kinda my thing, so I'm going with air for sure this go around.

I really like the idea of mounting the Prolimatech MK-26 (BEAST) on an R9 290 with two high static pressure fans. I'm not a quiet freak despite never using headphones. This is due to having music going 99.9% of the time, and having a dedicated room as my office (no sleeping with the PC in the same room).

As a little background info, I'm using the Corsair Carbide Air 540 case with 3x120mm fans for intake, and 3x140mm fans for exhaust (definitely no suffocation here). In the future, I may cut into the bottom of the case in order to mount two fans that can feed nice, cool air to any R9 290 air solution I go with (but that's for a future discussion).

Now, for the questions and humble request for recommendations:

1) I've seen many reports of severe sag when using the MK-26 (even worse with heavy, high performance fans). Are there any known backplates compatible with the R9 290 that expose the heatsink mounting holes? All the backplates I've seen so far are meant for water cooling, and don't expose these holes.

2) Is my assumption that backplates can provide extra stability a valid one, or am I just delusional? My instincts tell me that a solid plate on the back will prevent any board bend from the weight of the MK-26.

3) Is this project worthwhile? I really like the idea of a "project" instead of falling back to buying an aftermarket solution from manufacturers (which I've always done in the past). However, if the Arctic Accelero Xtreme III is worth what reviews have said it is, should I just save time and/or frustration and go with it?

3.a) If the Accelero is the best suggested cooler, and knowing it's weight, is a backplate still a good idea/necessary to prevent board bend?

In closing, I have visited the Overclock forums anonymously for many years and have always appreciated the immense wealth of information exchange going on around here. Many advance thanks go to all of you that take the time to read my words, and even more to those that are kind enough to offer advice, thoughts, and personal experiences!


----------



## Jpmboy

Quote:


> Originally Posted by *grandpatzer*
> 
> what is thin coat?


like you would spread out on a cpu heatsink. use a credit card or some other non-metallinc straight edge.

http://www.pugetsystems.com/labs/articles/Thermal-Paste-Application-Techniques-170/


----------



## AfterFX

Quote:


> Originally Posted by *Redvineal*
> 
> Hi all R9 290(X) owners! I plan to join the ranks real soon, but want to settle on an alternative cooling solution for OC potential before I pull the trigger.
> 
> Let me say, first, that I'm *NOT* at all interesting in water cooling, and I cannot be swayed otherwise!


Then wait 6 months for a party solution.


----------



## RedRage

Quote:


> Originally Posted by *Redvineal*
> 
> Hi all R9 290(X) owners! I plan to join the ranks real soon, but want to settle on an alternative cooling solution for OC potential before I pull the trigger.
> 
> Let me say, first, that I'm *NOT* at all interesting in water cooling, and I cannot be swayed otherwise! Whatever I wind up doing with air will be my first experience with a custom mod. Baby steps are kinda my thing, so I'm going with air for sure this go around.
> 
> I really like the idea of mounting the Prolimatech MK-26 (BEAST) on an R9 290 with two high static pressure fans. I'm not a quiet freak despite never using headphones. This is due to having music going 99.9% of the time, and having a dedicated room as my office (no sleeping with the PC in the same room).
> 
> As a little background info, I'm using the Corsair Carbide Air 540 case with 3x120mm fans for intake, and 3x140mm fans for exhaust (definitely no suffocation here). In the future, I may cut into the bottom of the case in order to mount two fans that can feed nice, cool air to any R9 290 air solution I go with (but that's for a future discussion).
> 
> Now, for the questions and humble request for recommendations:
> 
> 1) I've seen many reports of severe sag when using the MK-26 (even worse with heavy, high performance fans). Are there any known backplates compatible with the R9 290 that expose the heatsink mounting holes? All the backplates I've seen so far are meant for water cooling, and don't expose these holes.
> 
> 2) Is my assumption that backplates can provide extra stability a valid one, or am I just delusional? My instincts tell me that a solid plate on the back will prevent any board bend from the weight of the MK-26.
> 
> 3) Is this project worthwhile? I really like the idea of a "project" instead of falling back to buying an aftermarket solution from manufacturers (which I've always done in the past). However, if the Arctic Accelero Xtreme III is worth what reviews have said it is, should I just save time and/or frustration and go with it?
> 
> 3.a) If the Accelero is the best suggested cooler, and knowing it's weight, is a backplate still a good idea/necessary to prevent board bend?
> 
> In closing, I have visited the Overclock forums anonymously for many years and have always appreciated the immense wealth of information exchange going on around here. Many advance thanks go to all of you that take the time to read my words, and even more to those that are kind enough to offer advice, thoughts, and personal experiences!


I had the MK-26 on my R9 290x and recently slapped on an arctic accelero 3 for comparison. To my surprise the accelero out performs the much bigger/bulky MK-26 cooler by around 10c under load. I mounted the MK-26 3 different times and have good fans on it "110cfm each" but it simply doesnt do as well as the arctic. Believe me when I tell you I was shocked, you would think the larger MK-26 would cool it better.


----------



## RedRage

Anyone here know if the VRM 1 that GPUZ reads is the smaller section of VRMS "roughly 3 of them"? The VRM 1 temps are hotter than VRM 2 under full load, not dangerously hot but considerably hotter so im trying ot narrow it down.


----------



## Redvineal

Quote:


> Originally Posted by *AfterFX*
> 
> Then wait 6 months for a party solution.


Awww man, don't cheapen my post with selective quoting.

I'm looking for an air project with parts I can get my hands on now. I even explicitly mentioned that I don't want non-reference from manufacturers.

I'm not interested in sarcasm, egos, or pissing matches...just decent, relevant advice and discussion.


----------



## skupples

Quote:


> Originally Posted by *AfterFX*
> 
> What makes you think I didn't get one?


We have now reached the end of the usefulness of this convo for this thread. Feel free to send me proof of your blown up chinese 780 Ti via PM.


----------



## Redvineal

Quote:


> Originally Posted by *RedRage*
> 
> I had the MK-26 on my R9 290x and recently slapped on an arctic accelero 3 for comparison. To my surprise the accelero out performs the much bigger/bulky MK-26 cooler by around 10c under load. I mounted the MK-26 3 different times and have good fans on it "110cfm each" but it simply doesnt do as well as the arctic. Believe me when I tell you I was shocked, you would think the larger MK-26 would cool it better.


Thanks for the info! Shocked certainly describes my reaction to what you said. A long term 10c difference is worth far more than the slice of satisfaction I can get from completing an air project.

I've read a few discussions sprinkled around the net that the Accelero fan rig is "garbage". Since you've put hands on it, what do you think of it in terms of quality? Also, if you don't mind, what sinks did you have attached to the ram and vrm when you tried the Accelero?


----------



## tsm106

http://www.3dmark.com/fs/1162117

LOL just 427 pts away from the pro submission in first place heh.


----------



## Mr357

Quote:


> Originally Posted by *tsm106*
> 
> http://www.3dmark.com/fs/1162117
> 
> LOL just 427 pts away from the pro submission in first place heh.












Great work!


----------



## RedRage

Quote:


> Originally Posted by *Redvineal*
> 
> Thanks for the info! Shocked certainly describes my reaction to what you said. A long term 10c difference is worth far more than the slice of satisfaction I can get from completing an air project.
> 
> I've read a few discussions sprinkled around the net that the Accelero fan rig is "garbage". Since you've put hands on it, what do you think of it in terms of quality? Also, if you don't mind, what sinks did you have attached to the ram and vrm when you tried the Accelero?


Ignore that crap, most people who make such claims have never even owned one before. I have been using accelero coolers on my cards since way back in the days of the ATI X850XT cards and I had an accelero card on a HD 7970, GTX 275, GTX 480. GTX 580 and GTX 680..... So as you see, I have plenty of experience with them.

The best thing about the fans on the accelero 3 is the fact that you can crank it to 100 percent and you can barely even tell a difference in sound from 25 percent fan speed while it moves a good bit of air and keeps everything cool. I used the heatsinks that came with the cooler and they seem to be working well.

To give you an idea of performance my card is at 1200mhz core with 1.3 to 1.323vc under full load and the memory is at 1450mhz. My peak load temps after playing Crysis 3 for a few hours is 62c on the core.


----------



## tsm106

Quote:


> Originally Posted by *grandpatzer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jpmboy*
> 
> Cool. Always put a non conductive tim on both contact surfaces before using a t-pad. A very tiny amt on the vrm, you can thin coat the contact surface on the water block.
> 
> 
> 
> what is thin coat?
Click to expand...

I don't use tim on pads, its too annoying to remove and I like to reuse my ultra extremes because they are expensive as hell. Well, I did splurge this time and put tim on vrm1, though it was mostly to hold them in place but I really didn't need to do it. My vrms temps are 25c idle and 38-40c loaded and I'm talking at 1300 core.

Quote:


> Originally Posted by *Mr357*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Great work!


Thanks man.


----------



## Sgt Bilko

Quote:


> Originally Posted by *skupples*
> 
> remove sticker, hey look it now boots! The point is, you would of never, and will never get one of the 200(estimate number, probably lower) defective chinese 780ti's made by galaxy. It was one batch, distributed in one country, by one vendor. You might as well wipe off any galaxy made product for the rest of history from either amd or nvidia... Just to be safe.
> If you want the full story, please go read my thread.
> 
> this thread, debunk the whole thing, and set the entire story straight
> 
> It was a masterful plot, put forth by some one who knew no one would fact check it before it spun the whole globe. Still waiting for some one to translate the Cantonese on the supposed recall lists. Now that galaxy has confirmed what happened, they are fake, we just don't know what they really are. They are likely the entire batch list for all of China.


Thanks for the link









Just bad sticker placement huh?

Well thats not bad, especially considering it was only one batch in China....


----------



## eternal7trance

How do you guys manage putting a heatsink on the one part that is marked in red? Also, which is the best to get if I need to remove the heatsinks for warranty or selling?


----------



## Sgt Bilko

Quote:


> Originally Posted by *eternal7trance*
> 
> How do you guys manage putting a heatsink on the one part that is marked in red? Also, which is the best to get if I need to remove the heatsinks for warranty or selling?


I'm using the pads that came on the stock cooler apart from the vrm's which i used the pads included with my Accelero Xtreme III. They are stuck there and haven't fallen off so i'll call that a win.


----------



## hotrod717

Woot! I may have one!

XFX R9 290 stock w/ Elpida

Asus 290X bios


----------



## chiknnwatrmln

What brand card? About to buy a 290, I want to have a chance to get one that can be unlocked.


----------



## hotrod717

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> What brand card? About to buy a 290, I want to have a chance to get one that can be unlocked.


XFX. From Newegg. Another member got his from Amazon. Seems like XFX and Powercolor are the only 2 to unlock at this point. Memory makes no difference. It's the chip.


----------



## Emmett

Hey all.

I just got done putting an ek nickel/plexi block on one of my cards.

ran 3dmark11 still at stock speeds.

GPU topped out at 39C
VRM1 36C
VRM2 30C

Decent temps?


----------



## grandpatzer

Quote:


> Originally Posted by *tsm106*
> 
> I don't use tim on pads, its too annoying to remove and I like to reuse my ultra extremes because they are expensive as hell. Well, I did splurge this time and put tim on vrm1, though it was mostly to hold them in place but I really didn't need to do it. My vrms temps are 25c idle and 38-40c loaded and *I'm talking at 1300 core*.
> Thanks man.


is youre 290(x) at 1300 core?
what cooler/voltage/bios do you have and is it stable in modern games (BF, Crysis, Metro)?


----------



## tsm106

Quote:


> Originally Posted by *grandpatzer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I don't use tim on pads, its too annoying to remove and I like to reuse my ultra extremes because they are expensive as hell. Well, I did splurge this time and put tim on vrm1, though it was mostly to hold them in place but I really didn't need to do it. My vrms temps are 25c idle and 38-40c loaded and *I'm talking at 1300 core*.
> Thanks man.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> is youre 290(x) at 1300 core?
> what cooler/voltage/bios do you have and is it stable in modern games (BF, Crysis, Metro)?
Click to expand...

Water with an ek block. I doubt any air cooler can achieve this unless it comes with its own hurricane. Those are bench clocks. You don't need to oc trifire 290x to get silly high framerates, stock is enough.


----------



## grandpatzer

Quote:


> Originally Posted by *tsm106*
> 
> Water with an ek block. I doubt any air cooler can achieve this unless it comes with its own hurricane. Those are bench clocks. You don't need to oc trifire 290x to get silly high framerates, stock is enough.


Ah I see, I have ordered a Radeon R9 290, It will go under water once the fullcover block arives.

So what voltage did you have to achieve 1300core?

I'm propably at most going to only overvolt +200mv vcore, for the R9 290 I should be able to expect 1100-12xx mhz core from the looks of things...................


----------



## RAFFY

Quote:


> Originally Posted by *tsm106*
> 
> http://www.3dmark.com/fs/1162117
> 
> LOL just 427 pts away from the pro submission in first place heh.


My dual crossfire 290x's are scoring higher than your 3. I scored a 14752


----------



## Mr357

Quote:


> Originally Posted by *grandpatzer*
> 
> Ah I see, I have ordered a Radeon R9 290, It will go under water once the fullcover block arives.
> 
> So what voltage did you have to achieve 1300core?
> 
> I'm propably at most going to only overvolt +200mv vcore, for the R9 290 I should be able to expect 1100-12xx mhz core from the looks of things...................


+200mV would put you at 1.45V. Bad idea, especially for 24/7 since your card won't last very long with that many volts. It's been said that 1.35V should be the limit for daily use.
Quote:


> Originally Posted by *RAFFY*
> 
> My dual crossfire 290x's are scoring higher than your 3. I scored a 14752


Is your graphics score higher? I think that's what he cares about.


----------



## rv8000

Quote:


> Originally Posted by *grandpatzer*
> 
> Ah I see, I have ordered a Radeon R9 290, It will go under water once the fullcover block arives.
> 
> So what voltage did you have to achieve 1300core?
> 
> I'm propably at most going to only overvolt +200mv vcore, for the R9 290 I should be able to expect 1100-12xx mhz core from the looks of things...................


I'm getting 1150 on stock volts, even with +200mV, or 50mV with llc I can't stabilize anything over 1180. I haven't really seen any 290s above 1200, the highest I've seen is 1220 and I don't know what kind of voltage it required.


----------



## CallsignVega

Quote:


> Originally Posted by *tsm106*
> 
> Water with an ek block. I doubt any air cooler can achieve this unless it comes with its own hurricane. Those are bench clocks. You don't need to oc trifire 290x to get silly high framerates, stock is enough.


Any black screen issues in your testing?


----------



## grandpatzer

Quote:


> Originally Posted by *Mr357*
> 
> +200mV would put you at 1.45V. Bad idea, especially for 24/7 since your card won't last very long with that many volts. It's been said that 1.35V should be the limit for daily use.


I think it depends on the ASIC quality, so maybe it's much lower than 1.45V at +200mV.

What are considered safe limits for the 290 and 290X?
I remember 1.25V being safe for the 7950, and slightly more for the 7970.


----------



## hotrod717

Quote:


> Originally Posted by *grandpatzer*
> 
> I think it depends on the ASIC quality, so maybe it's much lower than 1.45V at +200mV.
> 
> What are considered safe limits for the 290 and 290X?
> I remember 1.25V being safe for the 7950, and slightly more for the 7970.


Depends on cooling; water will handle a lot more. Also depends on use: 24/7 and benchies are two different topics.


----------



## Mr357

Quote:


> Originally Posted by *grandpatzer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mr357*
> 
> +200mV would put you at 1.45V. Bad idea, especially for 24/7 since your card won't last very long with that many volts. *It's been said that 1.35V should be the limit for daily use.*
> 
> 
> 
> I think it depends on the ASIC quality, so maybe it's much lower than 1.45V at +200mV.
> 
> What are considered safe limits for the 290 and 290X?
> I remember 1.25V being safe for the 7950, and slightly more for the 7970.

Just look at the post you quoted.


----------



## grandpatzer

Quote:


> Originally Posted by *rv8000*
> 
> I'm getting 1150 on stock volts, even with +200mV, or 50mV with llc I can't stabilize anything over 1180. I haven't really seen any 290s above 1200, the highest I've seen is 1220 and I don't know what kind of voltage it required.


Interesting. I remember reading another user in this thread stating that increasing volts does not help their OC; I'm not sure if that was you.
Do you have the stock cooler?
You have a very good stock-volt overclock, IMO; it just stops there regardless of volts.

So to OC the card, should I increase vcore or LLC?
Is it a bad idea to increase both?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *RAFFY*
> 
> My dual crossfire 290x's are scoring higher than your 3. I scored a 14752


Firestrike EXTREME?


----------



## rv8000

Quote:


> Originally Posted by *grandpatzer*
> 
> Interesting. I remember reading another user in this thread stating that increasing volts does not help their OC; I'm not sure if that was you.
> Do you have the stock cooler?
> You have a very good stock-volt overclock, IMO; it just stops there regardless of volts.
> 
> So to OC the card, should I increase vcore or LLC?
> Is it a bad idea to increase both?


Depends on cooling, I guess. LLC + 0.05V is going to give you a stable ~1.297V or thereabouts; I personally wouldn't go higher unless I decide to get a loop. I only use extra voltage and LLC for benching anyway. Either I'm limited by a power issue or there is something else weird going on, because I really can't imagine not pulling another 50-75MHz out of the core at bare minimum.


----------



## RAFFY

Quote:


> Originally Posted by *Mr357*
> 
> +200mV would put you at 1.45V. Bad idea, especially for 24/7 since your card won't last very long with that many volts. It's been said that 1.35V should be the limit for daily use.
> Is your graphics score higher? I think that's what he cares about.


Yes my graphics score is 20056.


----------



## RAFFY

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Firestrike EXTREME?


No, just normal; I didn't realize he was in Extreme. How do I run Extreme? Yes, I'm a noob; I haven't overclocked in a couple of years.


----------



## tsm106

Quote:


> Originally Posted by *CallsignVega*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Water with an ek block. I doubt any air cooler can achieve this unless it comes with its own hurricane. Those are bench clocks. You don't need to oc trifire 290x to get silly high framerates, stock is enough.
> 
> 
> 
> Any black screen issues in your testing?

Nope, once you understand the PCB.

Tahiti was a beauty because she was so gentle when pushed to the edge. Hawaii is a cold mistress in comparison. I've learned a few things benching these cards. Hawaii has a cliff, and it's a sharp, quick drop!

Basically, you cannot run a max memory OC together with a core OC; it will hard black screen. Memory doesn't artifact, it hard black screens. The core, though, artifacts. The core and memory work in relation to the volts, and obviously the volts are dynamic, so you have to keep the voltage high enough for the clocks. Basically it's the opposite of the usual approach: you are not looking for the lowest volts per MHz, you are looking to feed enough volts for the MHz you want to run. Think in ranges. Also, this PCB is freaking insane; it eats volts up. By the way, if you have memory at the right clocks and are pushing the core and it black screens, it is from the core. It is not a hard lock; it can come back from the dead. It's just the outputs.

Quote:


> Originally Posted by *Mr357*
> 
> Is your graphics score higher? I think that's what he cares about.


Thanks. It's not even worth replying to though lol.

Oh btw, 1300 core in trifire, wee.

http://www.3dmark.com/fs/1162284


----------



## MrWhiteRX7

Quote:


> Originally Posted by *RAFFY*
> 
> No just normal didn't realize he was in Extreme. How do I run Extreme? Yes I'm a noob, I haven't overclocked in a couple years.


You will have to purchase the 3DMark utility. It's 25 bucks, I believe, but it will let you run each benchmark separately, and you'll also get access to the Firestrike Extreme preset.


----------



## rv8000

Quote:


> Originally Posted by *tsm106*
> 
> Nope, once you understand the pcb.
> 
> Tahiti was a beauty cuz she was so gentle when pushed to the edge. Hawaii is a cold mistress in comparison. I've learned a few things on the benching these cards. Hawaii has a cliff and its a sharp quick drop!
> 
> Basically you cannot run max memory oc with a core oc, it will hard blackscreen. Memory doesn't artifact, it hard blackscreens. The core though artifacts. The core and memory work in relationship to the volts, and obviously the volts are dynamic. Thus you have to keep the voltage high enough for the clocks. Basically its the opposite, you are not looking for the lowest volts per mhz, you are looking to feed enough volts for the mhz you want to run. Think in ranges. Also, this pcb is freaking insane, it eats the volts up. Btw, if you have memory at the right clocks and are pushing the core and it blackscreens it is from the core, it is not a hard lock, it can come back from the dead. It's just the outputs.
> Thanks. It's not even worth replying to though lol.
> 
> Oh btw, 1300 core in trifire, wee.
> 
> http://www.3dmark.com/fs/1162284


Any idea why I'm getting black screens with no memory OC while pushing the core? Stock volts at 1150 is stable in all benchmarks; with +100mV I can sometimes clear benchmarks at 1180 without artifacting, apparently stable. With no mem OC and LLC + 0.05V at 1200 I will instantly black screen under 3D load, and even with just LLC (1.25V) I'll black screen with the core at 1100. It doesn't make sense to me; I really think I should be able to pass 1200 if 1150 is stable on stock volts.


----------



## tsm106

Quote:


> Originally Posted by *rv8000*
> 
> Any idea why I'm getting black screens with no memory OC while pushing the core? Stock volts at 1150 is stable in all benchmarks; with +100mV I can sometimes clear benchmarks at 1180 without artifacting, apparently stable. With no mem OC and LLC + 0.05V at 1200 I will instantly black screen under 3D load, and even with just LLC enabled (1.25V) I'll black screen with the core at 1100. It doesn't make sense to me; I really think I should be able to pass 1200 if 1150 is stable on stock volts.


You are not thinking in terms of Hawaii. The goal is not to achieve the highest clock on the lowest volts.

You're using AB? If you are, stop; it sucks. I hear beta 16 is better with the LLC applet, but I think it's just a pain. GPU Tweak is rubbish too, but for me, OCing with those relationships in mind and keeping the volts within range, I don't black screen. In GPU Tweak you can get a relative idea of the volts needed for the clocks by clicking the lock button between core and voltage, then clicking the up button to watch it roll up. That gives you an idea of the volts you need for X clock. My real-world volts vs. clocks are higher than GPU Tweak's locked settings, by the way.

Also, go to settings and remove the skin, because the skins hide a lot of settings.


----------



## rv8000

Quote:


> Originally Posted by *tsm106*
> 
> You are not thinking in terms of Hawaii. The goal is not to achieve the highest clock on the lowest volts.
> 
> Yer using AB? If you are stop. It sucks. I hear beta 16 is better with the llc applet, but I think its just a pita. Gputweak is rubbish but for me ocing with the relationships in mind, and keeping the volts within range I don't blackscreen. In gputweak you can get a relative idea of the volts needed for the clocks, by clicking the lock button between core and voltage. then click the up button to see it roll up. That gives you an idea of the volts u need for X clock. My real world volts vs clocks are higher than gputweaks locked settings btw.
> 
> Btw, go to settings and remove the skin, because there are a lot of settings that the skins sort of hides.


So try the Asus 290X BIOS with GPU Tweak?


----------



## tsm106

Yes... but you kinda have to figure out what you wanna do. If you wanna really bench, use the PT1 BIOS. If it's more for regular stuff, use the Asus stock BIOS. PT1/PT3 have the PowerTune stuff removed, so they do not do power savings.


----------



## rv8000

Quote:


> Originally Posted by *tsm106*
> 
> Yes... but you kinda have to figure what you wanna do? If you wanna really bench then use the pt1 bios. If it's more for regular stuff the use the asus stock bios. The pt1/pt3 have the powertune stuff removed so they do not do powersavings.


PT3 has a more severe LLC setting, correct?


----------



## tsm106

Quote:


> Originally Posted by *rv8000*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Yes... but you kinda have to figure what you wanna do? If you wanna really bench then use the pt1 bios. If it's more for regular stuff the use the asus stock bios. The pt1/pt3 have the powertune stuff removed so they do not do powersavings.
> 
> 
> 
> PT3 has a more severe LLC setting, correct?

Hmm, not quite. PT1 is a normal BIOS with no limits, so it has natural droop. PT3 is the same but with no droop. PT3 is inherently more dangerous, obviously, since it will actually feed more volts than you input, and once you figure in the droop it is actually a lot more volts. I prefer droop; it's safer, IMO. I want to be the one in control of how much droop affects me, not a predetermined offset.


----------



## Falkentyne

Quote:


> Originally Posted by *galil3o*
> 
> guys, I apologize if this has been addressed already, I did my best to search through the thread but its rather huge at this point.
> 
> I just received my Sapphire R9 290x yesterday and am trying to establish a decent overclock and learn my way around this whole Powertune thing.
> My problem it seems is that my core voltage is not elevating itself to the level I'm setting in CCC. I pushed it all the way to +50% and set the max fan speed to 100% to see what it could do with no thermal restrictions.
> 
> I can get up to about a 7.5% core clock increase before artifacts start to appear in the 3DMark Fire Strike test (I haven't even tried to mess with the memory yet). However, when I check GPU-Z after running the benchmark, the max VDDC I ever reach is about 1.1V, when my understanding is that it should jump to 1.5V.
> 
> Has anyone experienced or heard of this happening?


Core voltage? +50% is the power limit (50% = 150%). You can't adjust vcore in CCC. You can use Afterburner beta 17, or run this from the command line in the Afterburner folder: msiafterburner.exe /wi4,30,8d,##, where ## is a HEXADECIMAL value. 00 is default vcore, and each step is 6.25mV times the DECIMAL value of ##. For example, /wi4,30,8d,09 would be 56.25mV; /wi4,30,8d,10 would be 6.25 x 16 (10 hex = 16 decimal), which = 100mV; and /wi4,30,8d,0f would be 93.75mV (0F hex = 15 decimal).
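The hex math above is easy to get wrong by hand, so here is a small sketch of it, assuming the 6.25mV-per-step, one-byte granularity described in the post. The command string is Afterburner's; `offset_to_hex` and `hex_to_offset` are made-up helper names for illustration:

```python
# Helpers for the /wi4,30,8d,## argument described above.
# Assumes one byte, 6.25 mV per step (so 00 = stock vcore, 10 hex = 100 mV).
STEP_MV = 6.25

def offset_to_hex(offset_mv):
    """Hex byte encoding a desired vcore offset given in millivolts."""
    steps = round(offset_mv / STEP_MV)
    if not 0 <= steps <= 0xFF:
        raise ValueError("offset does not fit in one byte")
    return format(steps, "02x")

def hex_to_offset(arg):
    """Millivolt offset encoded by a hex byte such as '0f'."""
    return int(arg, 16) * STEP_MV

print(offset_to_hex(100))   # 10  -> msiafterburner.exe /wi4,30,8d,10
print(hex_to_offset("0f"))  # 93.75
```

So 0x09 is 56.25mV, 0x10 is 100mV, and 0x0F is 93.75mV, matching the worked examples above.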

I found two very fast checkerboard artifacts in Firestrike graphics test 1, always at the exact same spot, if I use 1125/1500 and +25mv. Using +50mv seems to clear up those artifacts, although 1125/1500 and +25mv seems stable in valley..

No black screens though.


----------



## x800xt

I need atiflash 4.17 pls


----------



## hotrod717

Quote:


> Originally Posted by *tsm106*
> 
> You are not thinking in terms of Hawaii. The goal is not to achieve the highest clock on the lowest volts.
> 
> Yer using AB? If you are stop. It sucks. I hear beta 16 is better with the llc applet, but I think its just a pita. Gputweak is rubbish but for me ocing with the relationships in mind, and keeping the volts within range I don't blackscreen. In gputweak you can get a relative idea of the volts needed for the clocks, by clicking the lock button between core and voltage. then click the up button to see it roll up. That gives you an idea of the volts u need for X clock. My real world volts vs clocks are higher than gputweaks locked settings btw.
> 
> Btw, go to settings and remove the skin, because there are a lot of settings that the skins sort of hides.


I see you're finally coming to terms with the pros of gpu tweak.


----------



## famich

Quote:


> Originally Posted by *tsm106*
> 
> You are not thinking in terms of Hawaii. The goal is not to achieve the highest clock on the lowest volts.
> 
> Yer using AB? If you are stop. It sucks. I hear beta 16 is better with the llc applet, but I think its just a pita. Gputweak is rubbish but for me ocing with the relationships in mind, and keeping the volts within range I don't blackscreen. In gputweak you can get a relative idea of the volts needed for the clocks, by clicking the lock button between core and voltage. then click the up button to see it roll up. That gives you an idea of the volts u need for X clock. My real world volts vs clocks are higher than gputweaks locked settings btw.
> 
> Btw, go to settings and remove the skin, because there are a lot of settings that the skins sort of hides.


I basically agree with you, above all regarding the black screen related to GPU failure: IMHO that one is recoverable, while the other one (GPU + mem) is not.

But I like AB beta 17. I can run FC3 and Crysis (finished it today after about six months, LOL), all of them at +100mV core and +45mV aux at 1200MHz core, no problem.
Asus GPU Tweak has no on-screen display server.

I would like to try Sapphire Trixx, but it looks like Sapphire doesn't care about it anymore.

For 24/7 use, the Asus BIOS or the original BIOS is better, IMHO.


----------



## zpaf

Quote:


> Originally Posted by *rv8000*
> 
> What was the voltage for that core clock?


1.37V from Asus GPU Tweak, but I see 1.35V in GPU-Z.

I don't know why, but with MSI AB and Asus GPU Tweak I can't go above 1235MHz.
Does anyone know how I can get past this limitation?


----------



## Duvar

It's the silicon lottery; some can overclock further and some can't.
You can try to apply more voltage with the PT1 or PT3 BIOS, but I wouldn't recommend it.
1235MHz is a good result; why bother going higher, with more power consumption and the risk of destroying your card?


----------



## anteante

Trying to give my card some higher clocks, but Valley always crashes at 1100/5000, even with voltage up to 1.3V. Using Asus GPU Tweak. Need tips.


----------



## zpaf

Quote:


> Originally Posted by *Duvar*
> 
> It's the silicon lottery; some can overclock further and some can't.
> You can try to apply more voltage with the PT1 or PT3 BIOS, but I wouldn't recommend it.
> 1235MHz is a good result; why bother going higher, with more power consumption and the risk of destroying your card?


I don't want to apply more voltage.
I want to try 1250MHz at the same voltage, but Asus GPU Tweak and MSI AB don't let me slide past 1235MHz.


----------



## Duvar

Have a look here: http://extreme.pcgameshardware.de/grafikkarten/304527-how-flash-amd-r9-290-290x.html
Look at the screens there.
I think you can slide further if you flash the Asus 290X BIOS.


----------



## Sgt Bilko

You can slide to 1300MHz core and 1625MHz memory if you download AB beta 17;
it also gives you core voltage control.

http://www.guru3d.com/files_details/msi_afterburner_beta_download.html


----------



## nemm

To the user who posted something along the lines of "stable at 1150 stock but cannot get 1180 with +100mV": you have vdroop to thank for that. I am stable at 1100 stock, with 1120 being the highest before artifacts appear, but with +100mV I only manage 1140 because of the voltage varying. If I apply LLC and drop to +50mV, I can get 1180 stable; artifacts begin at 1200, but I can max bench at 1215. Adding more voltage does nothing, and I get a black screen within seconds of a 3D application running. With LLC and +50mV the recorded stable voltage is 1.297V; anything over that is problematic, so I settle for a max stable of 1180/1500 but run 1160/1450 as my 24/7 clocks. An approximately 20% increase over stock is nothing to be disappointed with.


----------



## Sgt Bilko

Ok stupid question here, how do i apply LLC?


----------



## CriticalHit

Quote:


> Originally Posted by *tsm106*
> 
> Here's trifire on GEN2. The difference is huge


Don't these charts show the difference between single and triple monitor setups with two CrossFire cards? The difference between PCIe 3 and PCIe 2 x8 is only a couple of frames.
Am I misreading something?


----------



## Skylark71

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You can slide to 1300Mhz Core and 1625Mhz Memory if you download AB Beta 17,
> also gives you Core Voltage Control
> 
> http://www.guru3d.com/files_details/msi_afterburner_beta_download.html


Just downloaded AB beta 17, but the voltage control slider is still blank, even though I enabled voltage control and unofficial overclocking mode.
GPU Tweak works great, but making custom fan profiles totally sucks; that's why I like AB.

Any tips?


----------



## Arizonian

Quote:


> Originally Posted by *LazarusIV*
> 
> Add me!!! Add me!!! Just got my HIS R9 290 the other day, it's testing out pretty well so far. I'm on air for the moment, later this month I'll snag some wc equipment and dunk my computer. So far I've gotten to 1090 / 1375 with +50 on power, any higher on either core or memory gives me artifacts in Heaven. I had the black screen issue when I first got the drivers installed, but only when I initially used Overdrive to increase the power limit. I downloaded MSI Afterburner and disabled Overdrive, I do everything with Afterburner now and I've had no issues whatsoever. Qnix monitor is incoming, should be here by Monday.
> 
> Proof:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> Heaven (1920x1200 @ 1090 / 1350):
> 
> 
> 
> 
> I haven't messed with a different BIOS yet, I'm not sure I will. I've never done it and I'd like to get used to the card first. Also, I'd like to see how far I can overclock my new monitor, then I'll squeeze every last FPS out of this card that I can. I probably won't get a second one until next summer.


Congrats - You've been added


----------



## Jpmboy

Quote:


> Originally Posted by *tsm106*
> 
> I don't use tim on pads, its too annoying to remove and I like to reuse my ultra extremes because they are expensive as hell. Well, I did splurge this time and put tim on vrm1, though it was mostly to hold them in place but I really didn't need to do it. My vrms temps are 25c idle and 38-40c loaded and I'm talking at 1300 core.
> Thanks man.


Eh, I just removed my EK block for an RMA. Use PK-1 or Gelid; they're easy to remove from the VRMs and memory. Fujipoly etc. does not benefit as much from TIM. I never saw >55C on VRM1, which is plenty low (1.3V actual).


----------



## Jpmboy

Quote:


> Originally Posted by *RAFFY*
> 
> My dual crossfire 290x's are scoring higher than your 3. I scored a 14752


Not even the same benchmark, and YOU DISABLED tessellation.


----------



## Jpmboy

Quote:


> Originally Posted by *tsm106*
> 
> Nope, once you understand the pcb.
> 
> Tahiti was a beauty cuz she was so gentle when pushed to the edge. Hawaii is a cold mistress in comparison. I've learned a few things on the benching these cards. Hawaii has a cliff and its a sharp quick drop!
> 
> Basically you cannot run max memory oc with a core oc, it will hard blackscreen. Memory doesn't artifact, it hard blackscreens. The core though artifacts. The core and memory work in relationship to the volts, and obviously the volts are dynamic. Thus you have to keep the voltage high enough for the clocks. Basically its the opposite, you are not looking for the lowest volts per mhz, you are looking to feed enough volts for the mhz you want to run. Think in ranges. Also, this pcb is freaking insane, it eats the volts up. Btw, if you have memory at the right clocks and are pushing the core and it blackscreens it is from the core, it is not a hard lock, it can come back from the dead. It's just the outputs.
> Thanks. It's not even worth replying to though lol.
> 
> Oh btw, 1300 core in trifire, wee.
> 
> http://www.3dmark.com/fs/1162284


http://www.overclock.net/t/1443196/firestrike-extreme-top-30/0_20


----------



## eternal7trance

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm using the pads that came on the stock cooler apart from the vrm's which i used the pads included with my Accelero Xtreme III. They are stuck there and haven't fallen off so i'll call that a win.


Thanks

Is it easy to remove in case you would need to send it in for warranty?


----------



## kcuestag

My 290X is going back to the store either tomorrow or Tuesday. If they can't give me one with Hynix memory, I'll go for a GTX 780 Ti.


----------



## mojobear

Hi all, are black screens usually from overclocking? Are most people okay at stock? I'm really confused about this issue, and it's really the only thing preventing me from buying three 290s.

:thumb:


----------



## RedRage

Quote:


> Originally Posted by *eternal7trance*
> 
> How do you guys manage putting a heatsink on the one part that is marked in red? Also, which is the best to get if I need to remove the heatsinks for warranty or selling?


Is that section of vrm's what gpuz refers to as being VRM 1?


----------



## kcuestag

Quote:


> Originally Posted by *mojobear*
> 
> Hi all, are black screens usually from overclocking? Are most people okay at stock? I'm really confused about this issue, and it's really the only thing preventing me from buying three 290s.
> 
> :thumb:


Nope, black screens are happening even at stock, and EVEN underclocked! This issue is sick...


----------



## EliteReplay

Quote:


> Originally Posted by *hotrod717*
> 
> Woot! I may have one!
> 
> XFX R9 290 stock w/ Elpida
> 
> Asus 290X bios


nice make sure to post in this thread... it will help people









http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread


----------



## youngmanblues

Quote:


> Originally Posted by *mojobear*
> 
> Hi all, are black screens usually from overclocking? Are most people okay at stock? I'm really confused about this issue, and it's really the only thing preventing me from buying three 290s.
> 
> :thumb:


Mine happens at stock (CPU and GPU; Elpida memory). In my case, it seems it will only black screen and hard crash after a cold boot, when playing BioShock Infinite and BF4. After a reset, everything works fine. Does anyone know why? I'm hoping it's just a driver issue; I don't want to go through the process of RMAing.

Hynix seems to be less prone to black screens.


----------



## EliteReplay

Quote:


> Originally Posted by *kcuestag*
> 
> Nope, black screens are happening even at stock, and EVEN underclocked! This issue is sick...


RMA?


----------



## flopper

Quote:


> Originally Posted by *mojobear*
> 
> Hi all, are black screens usually from overclocking? Are most people okay at stock? I'm really confused about this issue, and it's really the only thing preventing me from buying three 290s.
> 
> :thumb:


I can replicate it by adding 100MHz to the RAM on the 290;
the core is fine at 1155MHz.

I would assume, as some have stated, that the dynamic clocks and voltage make the cards act differently:
you need X amount of voltage to run X clock.
Still, at stock it should just work.


----------



## RedRage

Guys, sorry to bother you, but the question keeps getting ignored. Again: which section of VRMs on the card is considered VRM1 according to GPU-Z?


----------



## kcuestag

Quote:


> Originally Posted by *EliteReplay*
> 
> RMA?


No one can guarantee I'll get one with Hynix memory and no black screen issues; most likely I'd get another Elpida with the same problems, and I don't want to waste a few weeks on an RMA for that.


----------



## cowie

Quote:


> Originally Posted by *tsm106*
> 
> Nope, once you understand the pcb.
> 
> Tahiti was a beauty cuz she was so gentle when pushed to the edge. Hawaii is a cold mistress in comparison. I've learned a few things on the benching these cards. Hawaii has a cliff and its a sharp quick drop!
> 
> Basically you cannot run max memory oc with a core oc, it will hard blackscreen. Memory doesn't artifact, it hard blackscreens. The core though artifacts. The core and memory work in relationship to the volts, and obviously the volts are dynamic. Thus you have to keep the voltage high enough for the clocks. Basically its the opposite, you are not looking for the lowest volts per mhz, you are looking to feed enough volts for the mhz you want to run. Think in ranges. Also, this pcb is freaking insane, it eats the volts up. *Btw, if you have memory at the right clocks and are pushing the core and it blackscreens it is from the core, it is not a hard lock, it can come back from the dead. It's just the outputs.* Thanks. It's not even worth replying to though lol.


OK, so how do you bench through a black screen?
There are times when, after a black screen, I can hear (say) the Heaven bench still running, with no overclocks at all, and I lose output. How would one get that back without an app crash, a driver crash, or needing a restart?


----------



## CallsignVega

Quote:


> Originally Posted by *tsm106*
> 
> Nope, once you understand the pcb.
> 
> Tahiti was a beauty cuz she was so gentle when pushed to the edge. Hawaii is a cold mistress in comparison. I've learned a few things on the benching these cards. Hawaii has a cliff and its a sharp quick drop!
> 
> Basically you cannot run max memory oc with a core oc, it will hard blackscreen. Memory doesn't artifact, it hard blackscreens. The core though artifacts. The core and memory work in relationship to the volts, and obviously the volts are dynamic. Thus you have to keep the voltage high enough for the clocks. Basically its the opposite, you are not looking for the lowest volts per mhz, you are looking to feed enough volts for the mhz you want to run. Think in ranges. Also, this pcb is freaking insane, it eats the volts up. Btw, if you have memory at the right clocks and are pushing the core and it blackscreens it is from the core, it is not a hard lock, it can come back from the dead. It's just the outputs.
> Thanks. It's not even worth replying to though lol.
> 
> Oh btw, 1300 core in trifire, wee.
> 
> http://www.3dmark.com/fs/1162284


Hmm, interesting. 1300 core sounds nice; it should beat my 1300-core Titans. Well, unless it's Valley of course, lol; these new AMD cards apparently don't run that too well. I have two 290Xs inbound for testing to see how they work with my flight sim. I think it will be between 290Xs and 780 Ti Classifieds for my new chill-box build.


----------



## Raephen

Quote:


> Originally Posted by *RedRage*
> 
> Is that section of vrm's what gpuz refers to as being VRM 1?


I think that varies...

I still have the stock cooler, and I imagine the VRM section in the top left, farthest from the stock fan, would get warmest; with my Sapphire 290 that's VRM2.

VRM1 is usually 15 degrees or so cooler under load, which makes sense with the stock fan right on top of it.


----------



## zpaf

We need more voltage for 1300MHz ...


----------



## jomama22

Quote:


> Originally Posted by *kcuestag*
> 
> My 290X is going back to the store either tomorrow or Tuesday. If they can't give me one with Hynix memory and I'll go for a GTX780 Ti.


Lololol please do that.


----------



## Scorpion49

All AMD build coming together.


----------



## TomiKazi

Quote:


> Originally Posted by *Scorpion49*
> 
> All AMD build coming together.
> 
> 
> Spoiler: Warning: Spoiler!


Looks delicious.


----------



## mojobear

Quote:


> Originally Posted by *kcuestag*
> 
> Nope, black screens are happening even at stock, and EVEN underclocked! This issue is sick...


From your signature, you have the 290X under water as well... and no difference?

So just some general questions for all...

1) I know you said underclocking still produces black screens, but have people had success reducing black screens by decreasing RAM frequency specifically? (I know it's counterproductive to OCN.) It sounds like some people on this forum distinguish hard-lock black screens from recoverable black screens, with the former being RAM related and the latter core related.

2) Are the black screens driver dependent? For those on beta 8, before AMD updated the 290 drivers with beta 9 (I think it was beta 9?) to squeeze out extra performance, do you still get black screens?

Thanks all. I'm really trying to convince myself that buying three 290s will not be a bad investment, haha.


----------



## rv8000

Disregard, needed more thorough reading


----------



## Scorpion49

Quote:


> Originally Posted by *TomiKazi*
> 
> Looks delicious.


I like it; this is how it will be while I order all my water parts. Got the 8350 running nice and smooth at 4600MHz with 1.385V right now.


Spoiler: Warning: Spoiler!


----------



## IBIubbleTea

Quote:


> Originally Posted by *jomama22*
> 
> Lololol please do that.


Sorry but what are the differences between the two memory companies?


----------



## hotrod717

Quote:


> Originally Posted by *EliteReplay*
> 
> nice make sure to post in this thread... it will help people
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread


Already done, thanks. When I switch out the stock cooler for a waterblock, I'll also take pictures to confirm the number on the chip. It seems like these may actually be 290X chips that were used for 290s. PowerColor and XFX are the only two manufacturers confirmed so far. Memory has no bearing; there are both Elpida and Hynix examples.
http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/80#post_21206004


----------



## sugarhell

Quote:


> Originally Posted by *rv8000*
> 
> I'd like to recommend this thread be added to part of the op, contains very valuable info about overclocking, power, bioses etc....
> 
> http://www.overclock.net/t/1437170/290x-voltage-control-vdroop-mods-custom-bios-thread
> 
> I regret getting a 650w psu at this point, limiting me from getting any further


It's full of misinformation. As for the power, it's easy to calculate yourself: 300W stock + 50% power limit gives the max TDP of the 290X.
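That arithmetic can be sketched in a couple of lines (a minimal illustration, assuming the ~300W stock board power quoted above for a reference 290X; the function name is made up):

```python
# Board power ceiling after applying a PowerTune power-limit offset.
# Assumes ~300 W stock board power for a reference 290X, as stated above.
def max_board_power(stock_watts: float, power_limit_pct: float) -> float:
    return stock_watts * (1 + power_limit_pct / 100)

print(max_board_power(300, 50))  # 450.0 -> ~450 W with the slider at +50%
```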


----------



## airisom2

It is said that Hynix overclocks better. That said, some have gotten Elpida up to 1.7GHz on Hawaii. I also read somewhere that past 1500MHz you really don't gain much performance.

EDIT:
Quote:


> Originally Posted by *rv8000*
> 
> Anyone here notice that with 290/290x overclocking memory past 1500mhz seems to have almost 0 benefit in both 3dmark and Uniengine benchmarks? My card shows no sign of artifacting @ 1575 or even 1600 yet it yields no performance increase. My only guess is ecc? Have chip timings ever changed depending on clocks for gpus?


Quote:


> Originally Posted by *Mr357*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jomama22*
> 
> Its luck of the draw for hynix/elpida. AIB doesn't matter. There is way more elpida out there then hynix.
> 
> 
> 
> I can vouch for this. As I have said before, mine (Elpida) will do an astounding 1700MHz stable on the PT1 BIOS, and possibly higher on the ASUS stock BIOS.
Click to expand...


----------



## RedRage

Quote:


> Originally Posted by *Raephen*
> 
> I think that varies....
> 
> I still have the stock cooler, and I imagine the vrm section in the top left section, the farthest away from the stock fan, would get warmest, and with my Sapphire 290 that's VRM 2.
> 
> VRM 1 is usually 15 degrees or so cooler under load, which would make sense with the stock fan right on top of it.


Thanks for the info, my VRM1 is actually considerably higher than VRM2 now that I have the Accelero 3 cooler installed. Not dangerously hot but still hotter than I would like.


----------



## ZealotKi11er

I was thinking it's not fair to compare an R9 290 to current Nvidia cards since they don't have a 512-bit memory bus, so I am going to compare it with Nvidia's last 512-bit card, the GTX 280. Should be a fair/good fight.


----------



## nemm

Hynix or Elpida memory for these cards makes no difference in my experience.

I've had 4 samples, 2 of each, with only one having problems, which was a Hynix sample. That card was fine to start but eventually died after endless problems. All cards have overclocked to 1500 easily, but it has been stated numerous times before that going past 1500 offers little to no gain for most, and in my case the resulting scores always come out lower. All my cards have actually hit 1600-1650 depending on voltage, with the Elpida having the highest memory overclock.

As for trying to grab a Hynix card for sure, that would involve the card either being tested or the cooler being removed to find out, as serial numbers offer no insight in my opinion. The serial numbers of my cards have been xxx7 and xxx8, where 7 was Hynix and 8 was Elpida, and also yy60 and yy70, where 60 is Hynix and 70 is Elpida.
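For context on why gains taper off past 1500, here is a back-of-the-envelope bandwidth calculation for Hawaii's 512-bit bus (a rough sketch; the helper name is invented, and it only covers theoretical peak bandwidth, not timings or error correction):

```python
# Theoretical bandwidth of a 512-bit GDDR5 bus at the clocks discussed above.
# GDDR5 transfers data at 4x the clock that tools like GPU-Z report
# (1250 MHz stock on Hawaii -> 5 GT/s effective).
def gddr5_bandwidth_gb_s(bus_width_bits: int, reported_clock_mhz: float) -> float:
    effective_gt_s = reported_clock_mhz * 4 / 1000   # transfers per second, in GT/s
    return bus_width_bits / 8 * effective_gt_s       # bytes per transfer x rate

print(gddr5_bandwidth_gb_s(512, 1250))  # 320.0 GB/s at stock
print(gddr5_bandwidth_gb_s(512, 1500))  # 384.0 GB/s at a 1500 MHz overclock
```

Even at stock the bus is already very wide, which is one plausible reason memory overclocks past 1500 show little benefit in benchmarks.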


----------



## EmZkY

I'm planning to watercool an R9 290X, but retailers never seem to have them in stock here in Norway. I asked one of them and they replied "within November". Anyone got a clue when a new batch is coming this way? I'm planning to buy Sapphire.


----------



## Snyderman34

Really enjoying this card on water. I'm still capping out at about 1200/1500, but it's so much cooler. Definitely need to add another rad though. Temps were 54C on the core and 58C on the VRMs at stock (on a 4770K (also stock), everything attached to an H220 with no extra rad). Pretty respectable for a single 240mm rad (CPU temps were at 58C; temps taken during a Heaven Extreme run @ 1080p as per the OCN Heaven 4.0 thread).


----------



## Mr357

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I was thinking it's not fair to compare an R9 290 to current Nvidia cards since they don't have a 512-bit memory bus, so I am going to compare it with Nvidia's last 512-bit card, the GTX 280. Should be a fair/good fight.


----------



## Toss3

Quote:


> Originally Posted by *EmZkY*
> 
> I'm planning to watercool an R9 290X, but retailers never seem to have them in stock here in Norway. I asked one of them and they replied "within November". Anyone got a clue when a new batch is coming this way? I'm planning to buy Sapphire.


Don't buy Sapphire. Just get a PowerColor 290 and unlock the shaders by flashing the 290X BIOS.









Anyone else getting red screens instead of black ones when playing BF4? Latest 9.2 beta and stock clocks.


----------



## Forceman

Quote:


> Originally Posted by *Toss3*
> 
> Anyone else getting Red screens instead of black ones when playing BF4? Latest 9.2 beta and stock clocks.


Yeah, I got one just last night using Beta 8. I had switched back to Beta 8 because I got a couple with 9.2 and 8 seemed to be better. But I guess not.


----------



## EmZkY

Quote:


> Originally Posted by *Toss3*
> 
> Don't buy Sapphire. Just get a Powercolor 290 and unlock the shaders by flashing 290x bios.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone else getting Red screens instead of black ones when playing BF4? Latest 9.2 beta and stock clocks.


The PowerColors are all sold out too.


----------



## sugarhell

Quote:


> Originally Posted by *Forceman*
> 
> Yeah, I got one just last night using Beta 8. I had switched back to Beta 8 because I got a couple with 9.2 and 8 seemed to be better. But I guess not.


I get red screen on bf4 too with 7970s


----------



## ZealotKi11er

http://ncix.ca/products/?sku=91803&vpn=AXR9%20290%204GBD5%2DMDH%2FOC&manufacture=PowerColor


----------



## RAFFY

Quote:


> Originally Posted by *Jpmboy*
> 
> not even the same benchmark and YOU DISABLED tessellation.


It must be a default setting in the basic version of 3DMark. I just downloaded it and pressed run test. I only posted that my score was better in case his rig was having problems or wasn't scaling properly. Is there a way to enable tessellation in the free version?


----------



## tsm106

Quick and dirty overclock on Win 8. Ok, I killed them all in the tri FSE HoF.

http://www.3dmark.com/3dm/1655600


----------



## skupples

Quote:


> Originally Posted by *Toss3*
> 
> Don't buy Sapphire. Just get a Powercolor 290 and unlock the shaders by flashing 290x bios.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone else getting Red screens instead of black ones when playing BF4? Latest 9.2 beta and stock clocks.


Just let it be known... these companies sell these cards as 290s for a reason. That reason being that they didn't pass whatever manufacturer QA is needed to end up as a 290X, meaning that problems could arise over time IF they truly are unlocking to the full core count.

I guess the question is, for what reason do they turn them into 290s? They wouldn't take a profit loss just to make a second line, would they? Normally GPU chips that don't pass QA get fused off & plugged into a different SKU.


----------



## sugarhell

Quote:


> Originally Posted by *tsm106*
> 
> Quick and dirty overclock on Win 8. Ok, I killed them all in the tri FSE HoF.
> 
> http://www.3dmark.com/3dm/1655600


Zombie hunter


----------



## tsm106

Quick, someone pass me another card and block, lol. Btw, these cards are not binned, just off the shelf. I'm pretty sure that in the right hands all 290Xs will do this.


----------



## FtW 420

Quote:


> Originally Posted by *RAFFY*
> 
> It must be a default setting in the basic version of 3DMark. I just downloaded it and pressed run test. I only posted that my score was better in case his rig was having problems or wasn't scaling properly. Is there a way to enable tessellation in the free version?


Running the presets has tessellation enabled; it can only be disabled in the benchmark using custom settings. It is probably disabled in your driver settings.
Quote:


> Originally Posted by *tsm106*
> 
> Quick and dirty overclock on Win 8. Ok, I killed them all in the tri FSE HoF.
> 
> http://www.3dmark.com/3dm/1655600


Nice!


----------



## Forceman

Quote:


> Originally Posted by *skupples*
> 
> Just let it be known... these companies sell these cards as 290s for a reason. That reason being that they didn't pass whatever manufacturer QA is needed to end up as a 290X, meaning that problems could arise over time IF they truly are unlocking to the full core count.
> 
> I guess the question is, for what reason do they turn them into 290s? They wouldn't take a profit loss just to make a second line, would they? Normally GPU chips that don't pass QA get fused off & plugged into a different SKU.


Or they just screwed up a batch. Or they knew this info would get out and drive sales of their cards. Who knows? But the fact that they have 290X part numbers on the chip (if that is borne out) would seem to indicate they are actually fully functional 290Xs.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> Quick and dirty overclock on Win 8. Ok, I killed them all in the tri FSE HoF.
> 
> http://www.3dmark.com/3dm/1655600


Well done!


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Quick and dirty overclock on Win 8. Ok, I killed them all in the tri FSE HoF.
> 
> http://www.3dmark.com/3dm/1655600
> 
> 
> 
> Well done!
Click to expand...

Over 3.5K higher graphics score than 1st place. Houston, I need a better CPU, or to run cold like everyone else, lol.

http://www.3dmark.com/fs/1165787


----------



## kcuestag

Quote:


> Originally Posted by *mojobear*
> 
> From your signature you have the 290x under water as well...and no difference?
> 
> So just some general questions for all...
> 
> 1) I know you said underclocking still produces black screens, but have people had success reducing them by decreasing RAM frequency specifically? I know it's counterproductive to OCN, but it sounds like some people on this forum distinguish hard-lock black screens from recoverable ones, the former being RAM related and the latter core related.
> 
> 2) Are the black screens driver dependent? For those on beta 8, before AMD released beta 9 (I think it was beta 9?) to squeeze extra performance out of the 290, do you still get black screens?
> 
> Thanks, all. I'm really trying to convince myself to buy three 290s and that it won't be a bad investment, haha.


1) They are hard-lock black screens (I am forced to press the reset button on the motherboard; sometimes even that doesn't respond and I have to shut it off).

2) No, they happen with ANY driver.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> Over 3.5K higher gs than 1st place. Houston, I need a better cpu or to run cold like everyone else lol.
> 
> http://www.3dmark.com/fs/1165787


You beat your quad 7970s.


----------



## the9quad

Quote:


> Originally Posted by *kcuestag*
> 
> 1) They are hard-lock black screens (I am forced to press the reset button on the motherboard; sometimes even that doesn't respond and I have to shut it off).
> 
> 2) No, they happen with ANY driver.


Just RMA it; mine did the same thing and the new card works fine. It's a hardware issue for sure. The new card is a Sapphire with Elpida RAM just like the first one. The only difference is the new one works and the old one black-screened = hardware issue.


----------



## kcuestag

Quote:


> Originally Posted by *the9quad*
> 
> Just RMA it, mine did the same thing, the new card works fine. It's a hardware issue for sure. New card is sapphire with elpida ram just like the first one. The only difference is the new one works the old one black screened=hardware issue.


I already made a ticket with the store, hoping to get the RMA done tomorrow or Tuesday at the latest.


----------



## Jpmboy

Quote:


> Originally Posted by *RAFFY*
> 
> It must be a default setting in the basic version of 3DMark. I just downloaded it and pressed run test. I only posted that my score was better in case his rig was having problems or wasn't scaling properly. Is there a way to enable tessellation in the free version?


It's actually your CCC settings, most likely. If you select the "Performance" benchmark, it has standard settings, which can be overridden in CCC. Just set your 3D settings to "Optimal Performance" and run it again.


----------



## Sgt Bilko

Quote:


> Originally Posted by *tsm106*
> 
> Over 3.5K higher gs than 1st place. Houston, I need a better cpu or to run cold like everyone else lol.
> 
> http://www.3dmark.com/fs/1165787


Oh damn... that's nice.

I really need watercooling now...


----------



## K1llrzzZ

Hi everyone. I have an R9 290X CrossFire setup on a Z77 Extreme4 board with an i7-3770K. My question: as far as I know, Z77/Z87 boards only support CrossFire in x8/x8 mode, and the 290X now communicates over PCIe, since there isn't a CrossFire bridge anymore. Does that mean Z77/Z87 boards bottleneck such a configuration? Is R9 290X CrossFire only good on X79 boards, or does it not matter that much? How much performance difference is there between x8/x8 and x16/x16 with these cards? We're talking about PCIe 3.0, of course.


----------



## Jared Pace

Quote:


> Originally Posted by *sugarhell*
> 
> It's full of misinformation.


No it is not.


----------



## evensen007

So I went ahead and did a full reload of my PC and installed the 13.11 b9.2 drivers again. All of the stuttering problems I had in xfire are now gone. I also downloaded the core unparking app, but I think it has way more to do with reloading and getting rid of old DLLs and driver conflicts than anything. My FPS dips were into the low 30s, making it feel like a really bad gameplay experience. My dips are now in the low 50s, with my average being 95-100 @ Ultra, 4x AA, 2560x1600.

I don't believe it was the 1155 chipset bottlenecking xfire like some people said. For those having the same issue in BF4, do yourself a favor and reload.


----------



## sugarhell

Quote:


> Originally Posted by *Jared Pace*
> 
> No it is not.


You bet?


----------



## Scorpion49

So now I have a new problem; it has shown up in both BL2 and Planetside 2: fullscreen white flashes. Like one frame of solid white every 45 seconds or so. It is really obnoxious.


----------



## rdr09

Quote:


> Originally Posted by *evensen007*
> 
> So I went ahead and did a full reload of my PC and installed the 13.11 b9.2 drivers again. All of the stuttering problems I had in xfire are now gone. I also downloaded the core unparker app, but I think it has way more to do with reloading and getting rid of old dll's and driver conflicts than anything. My FPS dips were into the low 30's causing me to feel like it was a really bad gameplay experience. My dips are now in the low 50's, with my avg. being 95-100 @ Ultra 4x AA 2560x1600p.
> 
> I don't believe it was the 1155 chipset bottle-necking xfire like some people said. For those having the same issue in BF4, do yourself a favor and reload.


+rep.


----------



## sugarhell

It's still full of misinformation.


----------



## Jared Pace

Quote:


> Originally Posted by *sugarhell*
> 
> It's still full of misinformation.


No it's not. But don't worry, bro, you'll get to find out for yourself. Arizonian is deleting it again. Thank you, Arizonian. I'm tired of people arguing about it.


----------



## Xylene

My XFX 290 (Elpida) black-screened twice, but only when I had some stupidly high overclock. I've had no issues at 1150MHz, no need to push any higher, and I don't remember even trying higher than 1300MHz on the memory. The frame rates I am getting in BF3/4 with all ultra settings and no AA (I just don't need it at 1440p) are awesome.


----------



## utnorris

Quote:


> Originally Posted by *tsm106*
> 
> Over 3.5K higher gs than 1st place. Houston, I need a better cpu or to run cold like everyone else lol.
> 
> http://www.3dmark.com/fs/1165787


Get a bucket of water and load it up with ice; that will give you pretty low temps (not LN2, of course) on the cheap. Or you could grab a window AC unit and make a chiller. I did one for under $150, with most of that tied up in the controller, but you can skip the controller and do it for under $100 and see anywhere from -10C to -20C water temps.
Quote:


> Originally Posted by *evensen007*
> 
> So I went ahead and did a full reload of my PC and installed the 13.11 b9.2 drivers again. All of the stuttering problems I had in xfire are now gone. I also downloaded the core unparker app, but I think it has way more to do with reloading and getting rid of old dll's and driver conflicts than anything. My FPS dips were into the low 30's causing me to feel like it was a really bad gameplay experience. My dips are now in the low 50's, with my avg. being 95-100 @ Ultra 4x AA 2560x1600p.
> 
> I don't believe it was the 1155 chipset bottle-necking xfire like some people said. For those having the same issue in BF4, do yourself a favor and reload.


Yep, really the first time I have had to do a complete reinstall to fix a driver issue in a long time, but my OS had seen so many different cards it was bound to need it sooner or later.


----------



## maarten12100

Too bad one of my R9 290s has such bad coil whine, since it was a great clocker: with +100mV in Afterburner it went all the way up to 1235MHz (fan maxed) on just a simple attempt.

Once the BF4-bundled cards are in at my retailer I will order either two PowerColor or two XFX ones for the sake of the lottery (no luck unlocking either of my 290s).


----------



## xxmastermindxx

Weird issue I'm having with any flashed BIOS on my Gigabyte 290s. Unless they're on the stock BIOS, in Windows I get a mouse cursor stuck in the upper-left corner of the screen. It changes along with my normal cursor depending on what I'm doing.

I've tried the ASUS, PT1, and PT3 BIOS files, and they all do it. Any ideas?


----------



## Loktar Ogar

I need more info regarding driver stability issues on these cards in recent games and overall system performance... If I read some posts right, there are some issues... Kindly enlighten me, guys, because I'm close to buying this over a 780. I don't have any experience buying an AMD card yet and am hoping to get honest and informative replies from this thread.


----------



## cyenz

Quote:


> Originally Posted by *Loktar Ogar*
> 
> I need more info regarding driver stability issues on these cards in recent games and overall system performance... If I read some posts right, there are some issues... Kindly enlighten me, guys, because I'm close to buying this over a 780. I don't have any experience buying an AMD card yet and am hoping to get honest and informative replies from this thread.


Just beware of the infamous black screen.


----------



## galil3o

I've got my card up to 1185 on the core and 5700 on the memory using the ASUS BIOS with all the voltage settings maxed out.

GPU-Z is still showing a max voltage of 1.295V during benchmarks when I have the max of 1.412V set in GPU Tweak with a 150% power target. Is the vdroop a full 0.1V? And if so, what is the next step to achieve a higher voltage? I've heard of a PT-something BIOS but can't seem to find a concise overview. I have a water-cooling loop on its way in the mail, so I should be able to push it a bit further once that's all set up.
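The droop described there is simple subtraction (a trivial sketch using the numbers quoted in the post; the function name is invented):

```python
# Vdroop = voltage set in the tool minus voltage GPU-Z observes under load.
def vdroop(set_v: float, observed_v: float) -> float:
    return round(set_v - observed_v, 3)

print(vdroop(1.412, 1.295))  # 0.117 -> roughly 0.12 V of droop under load
```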


----------



## CallsignVega

Quote:


> Originally Posted by *tsm106*
> 
> Quick and dirty overclock on Win 8. Ok, I killed them all in the tri FSE HoF.
> 
> http://www.3dmark.com/3dm/1655600


I really wish 3DMark reported CPU and GPU frequencies properly. There is a 3x Titan setup about a hundred points below yours, but there's no clue what his true frequencies were, and hence no idea of his cooling setup.

I take it you are still using ambient water?


----------



## sugarhell

Quote:


> Originally Posted by *CallsignVega*
> 
> I really wish 3Dmark reported CPU and GPU freq's properly. There is a 3x Titan setup about a hundred points below yours but no clue what his true freq's were at and hence don't know cooling setup.
> 
> I take it you are still using ambient water?


IIRC they are zombie titans


----------



## r0l4n

Quote:


> Originally Posted by *galil3o*
> 
> I've got my card up to 1185 core and 5700 on the memory using the asus bios with all the voltage setting maxed out.
> 
> GPU-Z is still showing a max voltage of 1.295v during benchmarks when I have the max of 1.412v set in GPU tweak with 150% power target. Is the vdrop a full 0.1 volts? and if so what is *the next step to achieve a higher voltage*? I've heard of PT something bios but can't seem to find a concise overview. I have a water cooling loop on its way in the mail so I should be able to push it a bit farther once thats all set up.


See the tool in my signature.


----------



## the9quad

Quote:


> Originally Posted by *cyenz*
> 
> Just beware of the infamous blackscreen.


Just FYI, I replaced my sapphire 290x with another sapphire 290x, and the black screen issue went away. So I doubt it's a driver issue. Probably just a bad batch of cards, and nothing that can't be remedied by just returning it for another one.


----------



## tsm106

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *CallsignVega*
> 
> I really wish 3Dmark reported CPU and GPU freq's properly. There is a 3x Titan setup about a hundred points below yours but no clue what his true freq's were at and hence don't know cooling setup.
> 
> I take it you are still using ambient water?
> 
> 
> 
> IIRC they are zombie titans
Click to expand...

What's it matter, lol? Those are hwbot benchers with their hwbot settings. You can go get the info you want from the hwbot WR leaderboard.

Btw, 8pack in 2nd place on the HoF is the hwbot WR, if I'm not mistaken.


----------



## ImJJames

So far, what is the average max overclock on air for these, clock-speed wise?


----------



## Arizonian

Quote:


> Originally Posted by *ImJJames*
> 
> So far what is the average overclock max on air on these? Clock speed wise?


Anywhere between 1100MHz and 1150MHz core on air with no added voltage. Some voltage might yield 1175MHz, I'm wagering.


----------



## Raxus

Quote:


> Originally Posted by *the9quad*
> 
> Just FYI, I replaced my sapphire 290x with another sapphire 290x, and the black screen issue went away. So I doubt it's a driver issue. Probably just a bad batch of cards, and nothing that can't be remedied by just returning it for another one.


I RMAed my sapphire to newegg last week.

This makes me feel better that I didn't jump the gun.


----------



## maarten12100

Quote:


> Originally Posted by *Loktar Ogar*
> 
> I need more info regarding driver stability issues on these cards in recent games and overall system performance... If I read some posts right, there are some issues... Kindly enlighten me, guys, because I'm close to buying this over a 780. I don't have any experience buying an AMD card yet and am hoping to get honest and informative replies from this thread.


Drivers are fine, actually. In CF, not only are the frame rates higher on average than any Nvidia SLI setup, the experience at high resolutions is also better.
If you're in for a single card, you could get a 290 and add another one later. I would get an XFX or PowerColor one if I were you, as they might unlock to 290Xs.


----------



## drdrache

Before I try to RMA for no reason: is there a test that triggers these black screens? (I've had a few, but I want to "PROVE" to myself that they are the same.)


----------



## ImJJames

Quote:


> Originally Posted by *Arizonian*
> 
> Anywhere between 1100 Mhz - 1150 Mhz Core on air. No voltage. Some voltage might yield 1175 Mhz I'm wagering.


Not bad. I'm so eager to pull the trigger, but I'm waiting patiently for non-reference.


----------



## Scorpion49

Anybody here notice things like this? I'm getting black and white lines flashing on the screen, and the 3DMark fill tests show it most easily.


----------



## Raxus

Quote:


> Originally Posted by *drdrache*
> 
> Before I try to RMA for no reason,
> is there a test to cause these blackscreens? (i've had a few, but I want to "PROVE" to myself that they are the same)


At this point, if you're seeing the black screens at stock clocks, I'd RMA it to be safe.


----------



## sugarhell

Quote:


> Originally Posted by *Scorpion49*
> 
> Anybody here notice things like this? I'm getting black and white lines flashing on the screen and the 3Dmark fill tests display it easiest.


Why are you benching such an old 3DMark? It's probably unoptimized. Try Fire Strike.


----------



## drdrache

Quote:


> Originally Posted by *Raxus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *drdrache*
> 
> Before I try to RMA for no reason,
> is there a test to cause these blackscreens? (i've had a few, but I want to "PROVE" to myself that they are the same)
> 
> 
> 
> At this point if you're seeing the blackscreens at stock clocks, Id rma it to be safe.
Click to expand...

I'll start the process then. Anyone ever RMA through TigerDirect? So far it seems like I send it in, THEN let them know why...


----------



## Arizonian

Quote:


> Originally Posted by *ImJJames*
> 
> Not bad, so eager to pull trigger but waiting patiently on non-reference.


Yes, wait. Unless you're putting on water blocks or aftermarket cooling, non-reference is the way to go. It can't come soon enough. I'm going to start doing some homework to see which non-reference / aftermarket-cooled cards use Hynix memory.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Arizonian*
> 
> Yes wait. Unless your putting water blocks or aftermarket cooling, non-reference is the way to go. It can't come soon enough. I'm going to start doing some homework to see which non-reference / aftermarket cooling use Hynix memory.


That's my plan now... I'm gonna wait for a good non-reference card to come along to add to mine for CrossFire. Hopefully PCIe 2.0 won't hold me back too much.


----------



## Scorpion49

Quote:


> Originally Posted by *sugarhell*
> 
> Why you bench such an old 3dmark? Prob its unoptimized. Try firestrike


I can't even respond to this without getting an infraction.


----------



## maarten12100

Just benched them at 1200/1500
http://www.3dmark.com/3dm/1657138


Spoiler: Warning: Spoiler!







Nice card, really. Sad I have to send it back, but hey, maybe I'll get myself an unlockable 290, or, if I'm very lucky, two unlockable ones, lads.
Once I get the new ones I will run them with a PT3 BIOS to see how deep the rabbit hole really goes.

Once Brickland is here they will go under water.


----------



## CriticalHit

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Thats my plan now....im gonna wait for a good non ref card to come along to add to mine for crossfire. Hopefully PCIe 2.0 wont hold me back too much.


As long as you are on PCIe 2.0 x8 and above, you are fine for at least 2 cards in CrossFire.
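The theoretical numbers behind that advice can be sketched from the PCIe encoding overheads (a rough per-direction estimate; the helper name is made up):

```python
# Per-direction theoretical PCIe link bandwidth.
# PCIe 2.0: 5 GT/s per lane with 8b/10b encoding (80% efficiency).
# PCIe 3.0: 8 GT/s per lane with 128b/130b encoding (~98.5% efficiency).
def pcie_bandwidth_gb_s(gen: int, lanes: int) -> float:
    rate_gt_s, efficiency = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}[gen]
    return lanes * rate_gt_s * efficiency / 8  # bits -> bytes

print(pcie_bandwidth_gb_s(2, 8))   # 4.0 GB/s for PCIe 2.0 x8
print(pcie_bandwidth_gb_s(3, 16))  # ~15.75 GB/s for PCIe 3.0 x16
```

Even PCIe 2.0 x8's 4 GB/s is rarely saturated by two-card CrossFire in games, which is why x8/x8 boards are generally considered fine.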


----------



## galil3o

Quote:


> Originally Posted by *r0l4n*
> 
> See the tool in my signature.


Does this require that I use Afterburner? I just got the ASUS BIOS flashed, and GPU Tweak seems to be working well.


----------



## Sgt Bilko

Quote:


> Originally Posted by *CriticalHit*
> 
> As long as u are pcie 2 x8 and above u r fine for at least 2 cards in crossfire


Crosshair V Formula:

3 x PCIe 2.0 x16 (dual x16 or x16/x8/x8)

I'm good for 2 cards then.


----------



## rdr09

Quote:


> Originally Posted by *maarten12100*
> 
> Just benched them at 1200/1500
> http://www.3dmark.com/3dm/1657138
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Nice card really sad I have to send it back but hey maybe I get myself a unlock able 290 if I'm very lucky 2 unlock able ones you lads.
> Once I get the new ones I will run them with a PT3 bios to see how deep the rabbit hole really goes.
> 
> Once Brickland is here they will go underwater


It could be the version of FS or the CPU holding back your 290 a bit.

http://www.3dmark.com/3dm/1656055

Could be Win 8, too.


----------



## Dan848

I would like to be added to the R9 290 club please.

The following is a link to a screenshot with all the information you requested.

Thank you.

Image:

http://s130.photobucket.com/user/Swing848/media/OCNR9290OwnersClub.jpg.html


----------



## frogzombie

Edited: Resolved


----------



## dartuil

No one has tried the Accelero Hybrid?


----------



## Sgt Bilko

Quote:


> Originally Posted by *dartuil*
> 
> No one tried the accelero hybrid?


2 members have the Hybrid on, IIRC; check the member list for aftermarket cooling.


----------



## qqan

Me
works.


----------



## RedRage

Quote:


> Originally Posted by *Arizonian*
> 
> Anywhere between 1100 Mhz - 1150 Mhz Core on air. No voltage. Some voltage might yield 1175 Mhz I'm wagering.


1200MHz core on my BIOS-unlocked PowerColor R9 290. Rock stable for heavy gaming at 1.3 Vcore under load.


----------



## beejay

Quote:


> Originally Posted by *dartuil*
> 
> No one tried the accelero hybrid?


I'm on the Hybrid. It works wonders on the GPU temps; it's just the VRM temps that need more work.

On that note, I just got an idea. I might be able to use the stock heatsink of the R9 290X for the VRMs. Looking at it, it's just the black metal heatsink that's in contact with the VRMs, not the big one with fins. And it has screw holes ^__^. I can most likely attach an 80mm fan directly too (zip ties?). Then just remove the Hybrid's shroud... Fun night later.

I'll check later if it's detachable. If not, dremel we go.

Last idea in my head: if the VRMs still run hot, I'll put this under water.


----------



## RedRage

How does the Hybrid match up to the Accelero? I'm getting load temps of just 61C at 1.3 Vcore with 1200MHz core after a few hours of Crysis 3 maxed out. BF4 single player runs about the same, and multiplayer runs cooler, around 58C. The fan is at 100 percent to help with VRM temps, and it's also extremely quiet at full speed.


----------



## Sgt Bilko

Quote:


> Originally Posted by *beejay*
> 
> I'm on hybrid. It works wonders on the GPU temps. It's just the VRM temps that needs more work.
> 
> On that note, I just got an idea. I might be able to use the stock heatsink of the R9 290x for the VRMs. Looking at it, it's just the black metal heatsink that's in contact with the VRMs, not the big one with fins. And it has screw holes ^__^. I can attach an 80mm fan directly too, most likely (zip ties?). Then just remove the Hybrid's shroud... Fun night later.
> 
> I'll check later if it's detachable. If not, dremel we go.
> 
> Last idea in my head, if the VRMs still runs hot, I'll put this under water.


That's a good idea. Let me know how it goes. I might do the same with my AX III. The sink should provide some support for the PCB as well.


----------



## omgsosluuw

Pre-add me to the list! I just placed an order on Newegg for an R9 290. It's a Sapphire R9 290 with Battlefield 4; the total was $395 with a promo code. It also has a Newegg Guarantee return policy, unlike the other R9s, which are nonrefundable, so I can return it for a refund if needed.

I am pretty excited to see the massive improvement over my current 660 Ti.

Once I get it I will do some measurement of the fan noise from a layman's perspective. I am sure there are a lot of members wanting to know the actual sound from a non-professional reviewer.

Can't wait!


----------



## Scorpion49

Just ordered a pair of the Aquatuning blocks; supposedly they will be in stock tomorrow, but I'm not holding my breath on it.


----------



## RAFFY

Quote:


> Originally Posted by *skupples*
> 
> Just let it be known... these companies sell these cards as 290s for a reason. That reason being that they didn't pass whatever manufacturer QA is needed to end up as a 290X, meaning that problems could arise over time IF they truly are unlocking to the full core count.
> 
> I guess the question is, for what reason do they turn them into 290s? They wouldn't take a profit loss just to make a second line, would they? Normally GPU chips that don't pass QA get fused off & plugged into a different SKU.


Isn't this still the process that AMD uses with their CPUs? Hence why they came out with tri-core CPUs a couple of years ago. I'd just buy the 290X.
Quote:


> Originally Posted by *FtW 420*
> 
> Running the presets has tessellation enabled; it can only be disabled in the benchmark using custom settings. It is probably disabled in the driver settings.
> Nice!


Thanks, I'll have to go in and change my default settings. I was pretty stoked at how high my cards scored but was a little skeptical too.
Quote:


> Originally Posted by *Jpmboy*
> 
> It's actually your CCC settings, most likely. If you select the "Performance" benchmark, it has standard settings, which can be overridden in CCC. Just set your 3D settings to "Optimal Performance" and run it again.


Just made that quick change, thank you.
Quote:


> Originally Posted by *the9quad*
> 
> Just FYI, I replaced my sapphire 290x with another sapphire 290x, and the black screen issue went away. So I doubt it's a driver issue. Probably just a bad batch of cards, and nothing that can't be remedied by just returning it for another one.


For me, I know it's not my cards. Since the new drivers (9.2) I haven't had any hard lock-ups. Now my only problem is DirectX crashing sometimes in BF4, but since the new BF4 update that has decreased a little. In games like COD I have no issues at all, except for the game not properly engaging CrossFire and my game looking like I'm on low settings.
Quote:


> Originally Posted by *Scorpion49*
> 
> Just ordered a pair of the Aquatuning blocks, supposedly they will be in stock tomorrow but I'm not holding my breath on it.


So with this block you do not need to use any thermal pads? I watched the video but it didn't show the side of the block that contacts the card. From the video it looks like just thermal grease is used, which seems pretty awesome, and I would expect better cooling than with thermal pads on the EK. Can you post or PM me your temps once you have them installed, please?


----------



## Arizonian

Quote:


> Originally Posted by *Dan848*
> 
> I would like to be added to the R9 290 club please.
> 
> The following is a link to a screenshot with all the information you requested.
> 
> Thank you.
> 
> http://www.overclock.net/content/type/61/id/1751503/
> 
> Image:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s130.photobucket.com/user/Swing848/media/OCNR9290OwnersClub.jpg.html


Thank you - Congrats - added to roster


----------



## Falkentyne

Quote:


> Originally Posted by *galil3o*
> 
> I've got my card up to 1185 core and 5700 on the memory using the asus bios with all the voltage setting maxed out.
> 
> GPU-Z is still showing a max voltage of 1.295v during benchmarks when I have the max of 1.412v set in GPU tweak with 150% power target. Is the vdrop a full 0.1 volts? and if so what is the next step to achieve a higher voltage? I've heard of PT something bios but can't seem to find a concise overview. I have a water cooling loop on its way in the mail so I should be able to push it a bit farther once thats all set up.


Vdroop is there by DESIGN, to PROTECT the card and stop VOLTAGE OVERSHOOT. The SAME goes for Intel CPUs. It's by DESIGN.

You can REMOVE the vdroop with the PT3 BIOS, but that can FRY the card if you aren't on subzero.

The LLC tweak from the Afterburner command line pretty much does the same thing as the PT3 BIOS does, but without getting rid of the TDP/voltage protection (your card will black-screen crash if it gets too hot with the LLC tweak).


----------



## galil3o

Quote:


> Originally Posted by *Falkentyne*
> 
> Vdroop is there by DESIGN, to PROTECT the card and stop VOLTAGE OVERSHOOT. The SAME goes for Intel CPUs. It's by DESIGN.
> 
> You can REMOVE the vdroop with the PT3 BIOS, but that can FRY the card if you aren't on subzero.
> 
> The LLC tweak from the Afterburner command line pretty much does the same thing as the PT3 BIOS does, but without getting rid of the TDP/voltage protection (your card will black-screen crash if it gets too hot with the LLC tweak).


I see. Well, it was a little more straightforward in GPU Tweak. How do the +100/-100 Core Voltage, Aux Voltage and Power Limit options in Afterburner relate to just the core voltage in mV and a power limit up to 150% in GPU Tweak?


----------



## dartuil

Quote:


> Originally Posted by *qqan*
> 
> Me
> works.


Quote:


> Originally Posted by *beejay*
> 
> I'm on hybrid. It works wonders on the GPU temps. It's just the VRM temps that needs more work.
> 
> On that note, I just got an idea. I might be able to use the stock heatsink of the R9 290x for the VRMs. Looking at it, it's just the black metal heatsink that's in contact with the VRMs, not the big one with fins. And it has screw holes ^__^. I can attach an 80mm fan directly too, most likely (zip ties?). Then just remove the Hybrid's shroud... Fun night later.
> 
> I'll check later if it's detachable. If not, dremel we go.
> 
> Last idea in my head, if the VRMs still runs hot, I'll put this under water.


Hey,

Do you have the 7970 hybrid or the regular hybrid?
What are your temperatures?
Was it hard to mount, or is it OK?

Thank you!!


----------



## Scorpion49

Quote:


> Originally Posted by *RAFFY*
> 
> So with this block you do not need to use any thermal pads? I watched the video but it didn't show the side of the block that contacts the card. According to the video it looks like just great is used, seems pretty awesome and I would expect better cooling than with thermal pads on the EK. Can you post or PM me your temps once you have them installed please.


Seems that way, I'll post up pics for sure when I get them.


----------



## Dan848

Quote:


> Originally Posted by *rdr09*
> 
> it could be the version of fs or the cpu holding back your 290 a bit.
> 
> http://www.3dmark.com/3dm/1656055
> 
> could be Win 8, too.


Here are mine, at a lower setting:

http://www.3dmark.com/fs/1166783


----------



## Dan848

Quote:


> Originally Posted by *Arizonian*
> 
> Thank you - Congrats - added to roster


Thank you!


----------



## HoZy

*Fact*: Just ordered a 2nd 290X & water cooling supplies to suit. ETA 1.5 weeks.


----------



## brazilianloser

Quote:


> Originally Posted by *HoZy*
> 
> *Fact*: Just ordered a 2nd 290X & water cooling supplies to suit. ETA 1.5 weeks.


The only thing stopping me from buying my second 290 is the fact that the first went bananas and I am still waiting on the replacement, which should arrive Monday or Tuesday... The other fact is that even though some reviews show an 860W PSU being enough, in reality that is not the case, and buying a new one when I just got the AX860 not more than two months ago is a shame. Hopefully things can start going in the right direction for me soon and I can get my PC pimped out.


----------



## Loktar Ogar

Quote:


> Originally Posted by *maarten12100*
> 
> Drivers are fine, actually. In CF not only are the frame rates higher on average than any Nvidia SLI, the experience at high resolutions is also better.
> If you're in for a single card you could get a 290 and get another one later on. I would get an XFX or Powercolor one if I were you, as they might unlock to 290Xs.


Thanks for your response... This was very helpful. REP+!


----------



## beejay

Quote:


> Originally Posted by *dartuil*
> 
> Hey,
> 
> Do you have the 7970 hybrid or the regular hybrid?
> What are your temperatures?
> Was it hard to mount, or is it OK?
> 
> Thank you!!


The normal hybrid. The 7970 hybrid is for old 7970s with raised RAM.

It's not that hard to mount. Just follow the instructions; it took me 30 mins. I can burn in FurMark at 80°C with the GPU clock at 1100 and memory at 1500MHz. The only problem is that the VRMs still go past 120 on a burn test. Anyway, in actual gaming (TR, BF4, Arkham City) the core never goes over 75 and the VRMs never go above 95.


----------



## Falkentyne

Quote:


> Originally Posted by *galil3o*
> 
> I see. Well, it was a little more straightforward in GPU Tweak. How do the +100/-100 Core Voltage, Aux Voltage and Power Limit options in Afterburner relate to just the core voltage in mV and a power limit up to 150% in GPU Tweak?


I think a slightly newer build of AB Beta 17 came out shortly after the first one, because there was a problem with GPU usage if you had voltage monitoring selected (unselecting it fixed it).
+50% in Afterburner is the same as the max slider in CCC or 150% in GPU Tweak. +50% means 50% HIGHER, i.e. 150% of 'normal' (100% of normal being the starting point). Same thing.

100 mV = 0.1 V.

Aux voltage is clearly explained in the popup in Afterburner. No need to explain it here.
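For anyone who wants the unit conversions spelled out, the arithmetic above reduces to this (a throwaway sketch; the function names are made up for illustration and are not from Afterburner's or GPU Tweak's actual interfaces):

```python
# Sketch: converting between the overclocking tools' units.
# Afterburner expresses the power limit as an offset from 100%;
# GPU Tweak and CCC show it as an absolute percentage of stock TDP.

def ab_offset_to_gputweak(offset_pct: int) -> int:
    """+50% in Afterburner corresponds to 150% in GPU Tweak."""
    return 100 + offset_pct

def mv_offset_to_volts(offset_mv: int) -> float:
    """+100 mV in Afterburner is +0.1 V on the core."""
    return offset_mv / 1000.0

print(ab_offset_to_gputweak(50))   # 150
print(mv_offset_to_volts(100))     # 0.1
```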


----------



## Elmy

http://www.3dmark.com/fs/1167379

2 Club3D 290Xs with a 4770K @ stock

In CCC I had the power limit set at 15%

GPU clock setting @ 15%

High Performance memory clock setting @ 15%

Just playing around I scored 15,810 in Firestrike (not Extreme), which is 35th in the world for 2 video cards.

On water, temps not over 43 degrees using EK Nickel Plexi blocks on a quad XSPC 480 rad. Ambient is probably around 65 degrees.

Tomorrow I might try to max everything out and see what these babies can do.

Got 5 monitors at 5400x1920 working: 2 over DVI, 2 DisplayPort monitors connected to an MST hub, and one monitor connected over HDMI.

BF4 is getting around 60 FPS average on High and is pretty smooth most of the time. 2 more Club3D 290Xs are inbound. Just need AMD to make a driver that will work at 120Hz. The last driver that worked was 13.9 (worked on my 7990s) and it doesn't apply to the 290X.


----------



## zpaf

Anyone with a 290X care to show how much better it can do on Metro LL?


----------



## Forceman

Quote:


> Originally Posted by *zpaf*
> 
> Anyone with a 290X care to show how much better it can do on Metro LL?


Couldn't quite match it at 1180/1375



Edit: But 1200/1425 did the trick


----------



## zpaf

Quote:


> Originally Posted by *Forceman*
> 
> Edit: But 1200/1425 did the trick


Is this with a 290 flashed to 290X with the shader unlock?


----------



## tsm106

Quote:


> Originally Posted by *tsm106*
> 
> Metro LL @1150/1500 air


Posted a while back. You can see by the spikes that the card throttled a bit.


----------



## zpaf

Quote:


> Originally Posted by *tsm106*
> 
> Posted a while back. You can see by the spikes that the card throttled a bit.


I had those spikes too, but they disappeared when I did a fresh Windows install.
Is your card a 290X or non-X?


----------



## Forceman

Quote:


> Originally Posted by *zpaf*
> 
> Is this with a 290 flashed to 290X with the shader unlock?


Yes.


----------



## r0l4n

Quote:


> Originally Posted by *galil3o*
> 
> Does this require that I use afterburner? I just got the Asus bios flashed and gpu tweak seems to be working well.


Yes, you need to install AB and not have any other overclocking software running at the same time for reliable results.


----------



## Paul17041993

If you're a 'Stralian, the GELID Icy Vision Rev 2 is now on preorder for 60 AUD; it comes with the whole kaboodle of 16 RAM heatsinks and so on.

http://www.pccasegear.com/index.php?main_page=product_info&products_id=25815

Not sure about performance, but I'd imagine it would still be a good alternative to the reference cooler: 5-year warranty and 2kRPM fans.


----------



## Faksnima

Not sure...but this looks like the first non-ref 290x? http://www.amazon.com/PowerColor-AXPowerColor-290X-4GBD5-MDH-OC/dp/B00G9JAQA0/ref=sr_1_2?ie=UTF8&qid=1384767215&sr=8-2&keywords=powercolor+290


----------



## R35ervoirFox

So the Sapphire 290 I had, with Hynix memory, OC'd to 1155/1500 with +63mV; not 100% sure the core was stable. For comparison, 1169 with +100mV was not stable. I'm sure the memory could go higher but I didn't try anything above 1500.
I used the latest AMD Catalyst 13.11 Beta 9.2 and never had a black screen or crash. I have 3 monitors connected, 2 by DVI and one by HDMI; my main screen runs at 120Hz. During various benchmarks and in BF4 there were no black screens or any other problems related to either the driver or the hardware.

Just for the record, my 290 would boot to a blank screen with any 290X BIOS, so it was not possible to determine if it unlocked or not.









All tests done with a 2500K @ 4.3GHz and 16GB 800MHz memory. The stock cooler was used and temps were kept under 80 degrees to ensure there was no throttling; VRM temps were also under 80.






http://www.3dmark.com/3dm11/7505274

I decided to send the Sapphire back today and will order a Powercolor 290 OC in a few days. I did this because I would never have ordered the Sapphire had I known it wouldn't unlock. I thought it was a lottery, but there was no lottery for Sapphire cards, and the Powercolors all unlock, so it's less a lottery and more the fact that they actually use 290X/uncut cores in their 290 OC editions.


----------



## maarten12100

Quote:


> Originally Posted by *Faksnima*
> 
> Not sure...but this looks like the first non-ref 290x? http://www.amazon.com/PowerColor-AXPowerColor-290X-4GBD5-MDH-OC/dp/B00G9JAQA0/ref=sr_1_2?ie=UTF8&qid=1384767215&sr=8-2&keywords=powercolor+290


It looks really, really bad. Hope it is a custom PCB ;p


----------



## Connolly

Quote:


> Originally Posted by *R35ervoirFox*
> 
> So the Sapphire 290 I had, with Hynix memory, OC'd to 1155/1500 with +63mV; not 100% sure the core was stable. For comparison, 1169 with +100mV was not stable. I'm sure the memory could go higher but I didn't try anything above 1500.
> I used the latest AMD Catalyst 13.11 Beta 9.2 and never had a black screen or crash. I have 3 monitors connected, 2 by DVI and one by HDMI; my main screen runs at 120Hz. During various benchmarks and in BF4 there were no black screens or any other problems related to either the driver or the hardware.
> 
> Just for the record, my 290 would boot to a blank screen with any 290X BIOS, so it was not possible to determine if it unlocked or not.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All tests done with a 2500K @ 4.3GHz and 16GB 800MHz memory. The stock cooler was used and temps were kept under 80 degrees to ensure there was no throttling; VRM temps were also under 80.
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/7505274
> 
> I decided to send the Sapphire back today and will order a Powercolor 290 OC in a few days. I did this because I would never have ordered the Sapphire had I known it wouldn't unlock. I thought it was a lottery, but there was no lottery for Sapphire cards, and the Powercolors all unlock, so it's less a lottery and more the fact that they actually use 290X/uncut cores in their 290 OC editions.


Is it proven that all the Powercolor R9 290s can have their shaders unlocked by flashing the 290X BIOS?


----------



## Toss3

Quote:


> Originally Posted by *Connolly*
> 
> Is it proven that all the Powercolor R9 290s can have their shaders unlocked by flashing the 290X BIOS?


Not seen a single one that hasn't been able to unlock after flashing. Sapphire, HIS, MSI, and XFX (for the most part) are all locked. Not seen anyone try unlocking a Club 3D 290 yet.


----------



## Newbie2009

Finally got my blocks & backplates


----------



## JordanTr

Any news about non-reference cards? I'm waiting for them really badly. I wanted to buy a reference 290 + Gelid Icy Vision V2 today, but somehow I was able to stop myself and wait another week, because I'm worried about those VRM temps with aftermarket coolers.


----------



## Duvar

I guess it will be 2-4 weeks until custom designs are released.


----------



## Haldi

Quote:


> Originally Posted by *Toss3*
> 
> Not seen a single one that hasn't been able to unlock after flashing. Sapphire, HIS, MSI, and XFX (for the most part) are all locked. Not seen anyone try unlocking a Club 3D 290 yet.


http://extreme.pcgameshardware.de/grafikkarten/304527-howto-flash-amd-r9-290-290x-12.html#post5870008

A Powercolor 290 OC that did NOT unlock with the Asus BIOS!

He hasn't yet changed the cooler to check the number on the chip, nor does he plan to. But he said it's from the new batch that arrived at Mindfactory (an online store).


----------



## Scorpion49

Quote:


> Originally Posted by *Duvar*
> 
> I guess it will be 2-4 weeks until custom designs are released.


Seems that way... http://hardforum.com/showthread.php?t=1791552


----------



## Prospector

Hello, guys. I am having black screen issues too with my Sapphire R9 290 + Accelero Xtreme III. I am stable at 1.240 Vcore (GPU-Z real VDDC) with 1170/1450. When I raise the voltage bar further in MSI AB/GPU Tweak, I get the black screen at the end of the Metro LL or Tomb Raider benchmark! Can you suggest some solutions to fix this? The fan was at 100% RPM, and my temps are perfect (max temp 52C, VRM1 65, VRM2 49C).


----------



## Raxus

Quote:


> Originally Posted by *JordanTr*
> 
> Any news about non-reference cards? I'm waiting for them really badly. I wanted to buy a reference 290 + Gelid Icy Vision V2 today, but somehow I was able to stop myself and wait another week, because I'm worried about those VRM temps with aftermarket coolers.


Nothing but rumors atm.


----------



## Nightgamerx

Does anyone know if you can use all 4 display outputs simultaneously on the R9 290 (non-X), similar to Nvidia Surround + Accessory Display, except I won't be using Eyefinity? I know there are two dual-link DVI ports, one full-size DisplayPort and one HDMI port. I want to use all 4 of these at once with the following monitors: 1) dual-link DVI to my 27" X-Star 2560x1440, 2) dual-link DVI to a Samsung 23" 1920x1080, 3) full-size DisplayPort via an active DVI adapter to a second Samsung 23" 1920x1080, and 4) HDMI to my 58" Panasonic HDTV. Think this is possible?


----------



## cowie

Quote:


> Originally Posted by *EliteReplay*
> 
> I went into the GTX 780 Ti club and there are only like 10 people


Yeah, but they still all have their cards with no issues at all. In general, I must have seen more than half the people that got 290X cards say they were RMA'ing or just getting their money back.

I am keeping mine another week; if I cannot fix the stock-clock black screens it is going back.


----------



## beejay

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Thats a good idea. Let me know how it goes. I might do the same with my AX III. The sink should provide some support for the pcb as well.


Okay, so it does look promising. You can cut the back part and you basically get a slab of metal that has direct contact with the VRMs. I imagine you can place heatsinks on top of the slab.

But since I'm using the Hybrid, I'm thinking of cutting a hole in the middle where the Hybrid's water block will go. Now this would be like "the hybrid" if successful: I can keep the stock cooler for the VRMs and the Hybrid block on the GPU. I'll do the cutting tomorrow so I can get exact measurements. And I need thermal pads (I used up all of mine).

Wish me luck. Haha, I'll have no more warranty options if I do this, but it sure is a lot of fun. If it fails, I'll put it on water instead.


----------



## ReHWolution

Count me in, guys!







Benching atm to get a maximum stable OC. By the time I'm writing this message, I have successfully benched @ 1150/1500 with +50mV on the GPU and +50% power target. Do you suggest I flash the Asus BIOS on my Sapphire R9 290X? Atm it's stock cooled; eventually I'll get an EK WB.


----------



## flopper

Quote:


> Originally Posted by *Elmy*
> 
> http://www.3dmark.com/fs/1167379
> 
> 2 Club3D 290X with 4770K @ stock
> 
> In CCC I had Power limit setting at 15%
> 
> GPU clock setting @ 15%
> 
> High Performance memory clock setting @ 15%
> 
> Just playing around scored 15,810 in Firestrike (not extreme)  which is 35th in the world for 2 video cards.
> 
> On water temps not over 43 degrees using EK Nickel Plexi blocks on Quad XSPC 480 rad. Ambient is probably around 65 degrees.
> 
> Tomorrow might try to max everything out and see what these babies can do.
> 
> Got 5 monitors 5400X1920 working with 2 DVI, 2 displayport monitors connected to MST hub and one monitor connected to HDMI.
> 
> BF4 getting around 60 FPS average on High and pretty smooth most of the time. 2 more Club3D 290X's are inbound. Just need AMD to make a driver that will work for 120Hz. Last driver they had that worked was 13.9 (worked on my 7990's) and doesn't apply to 290X.


Took notice since I run 120Hz Eyefinity: 13.11 Beta 9.2 works with 120Hz, but I need to use my Apple active adapter to get the driver to offer 120Hz there as well.
So currently: 2 screens on the DVI ports,
and the third on the DisplayPort, with the Apple active adapter (Mini DisplayPort to DVI) in between.
The reason I use the Apple one is that the other certified adapter was nowhere to be found... good job AMD (NOT).

120Hz works now.


----------



## hotrod717

Quote:


> Originally Posted by *Toss3*
> 
> Not seen a single one that hasn't been able to unlock after flashing. Sapphire, HIS, MSI, and XFX (for the most part) are all locked. Not seen anyone try unlocking a Club 3D 290 yet.


False! XFX cards do unlock; go check the unlock thread. The correct information is that XFX and Powercolor are the only 2 known to unlock.







Not all cards of either manufacturer unlock. Lottery.


----------



## petedread

So Afterburner Beta 17 is out, but no support for the R9 290X.


----------



## bond32

Quote:


> Originally Posted by *hotrod717*
> 
> False! XFX cards do unlock; go check the unlock thread. The correct information is that XFX and Powercolor are the only 2 known to unlock.
> 
> 
> 
> 
> 
> 
> 
> Not all cards of either manufacturer unlock. Lottery.


I'm still skeptical. Running a 290 with a 290X BIOS will produce better benchmark scores, likely in the range shown. I have still yet to see someone with a 290 run Valley at the 290X stock clocks, then change to the 290X BIOS and run it again.


----------



## Elmy

Quote:


> Originally Posted by *flopper*
> 
> Took notice since I run 120Hz Eyefinity: 13.11 Beta 9.2 works with 120Hz, but I need to use my Apple active adapter to get the driver to offer 120Hz there as well.
> So currently: 2 screens on the DVI ports,
> and the third on the DisplayPort, with the Apple active adapter (Mini DisplayPort to DVI) in between.
> The reason I use the Apple one is that the other certified adapter was nowhere to be found... good job AMD (NOT).
> 
> 120Hz works now.


With my V2 7990s, 144Hz @ 5400x1920 just worked flawlessly under Cat 13.9. I just created the 5x1 group, went to desktop properties in CCC, and the drop-down menu under refresh rate had every rate available from 60Hz all the way to 144Hz. It was super easy. Every beta driver since then broke that option, but the beta driver works 10x better in BF4, so I just threw the Club3D 290Xs in and am going to wait till AMD fixes whatever they did. It might still work in 3x1 Eyefinity, but I changed my 7990s out for 2 290Xs yesterday.

So are your monitors DVI and DisplayPort? And when you connected 2 to DVI and one to DisplayPort, the 120Hz did not work?

I have reported this a couple of times via the beta driver feedback form on AMD's website and sent their office an email explaining the problem... But us multi-monitor users are a minority, and I think the driver team might be working hard on Mantle right now.

Just playing the waiting game now.


----------



## airisom2

Here ya go:
Quote:


> Originally Posted by *RedRage*
> 
> Powercolor R9 290 Flashed to Powercolor R9 290x Bios. Clocks are same on both Bios and Load temps are 61c with Arctic Cooler Accelero.
> 
> R9 290 Bios
> 
> 
> R9 290x Bios


The clock speeds are in the top right corner of the first two pics.


----------



## bond32

The core is clocked 30MHz higher than a stock 290X? Or am I missing something...


----------



## sun100

Quote:


> Originally Posted by *Nightgamerx*
> 
> Does anyone know if you can use all 4 display outputs simultaneously on the R9 290(non-x), similar to Nvidia surround + Accessory Display, but I won't be using Eyefinity. I know there are two Dual Link DVI ports, one Full size Display Port and one HDMI port. I want to use all 4 of these at once with my following monitors. 1) Dual Link DVI to my 27' X-Star 2560x1440 2) Dual Link DVI to a Samsung 23' 1920x1080 3) Fullsize DisplayPort to DVI Active Adapter to second Samsung 23' 1920x1080 and 4) HDMI to my 58' Panasonic HDTV. Think this is possible?


I believe you can, since AMD cards have been known for being able to run more than 2 displays from 1 card, unlike Nvidia. Try it and see, and get back to us.


----------



## flopper

Quote:


> Originally Posted by *Elmy*
> 
> With my V2 7990s, 144Hz @ 5400x1920 just worked flawlessly under Cat 13.9. I just created the 5x1 group, went to desktop properties in CCC, and the drop-down menu under refresh rate had every rate available from 60Hz all the way to 144Hz. It was super easy. Every beta driver since then broke that option, but the beta driver works 10x better in BF4, so I just threw the Club3D 290Xs in and am going to wait till AMD fixes whatever they did. It might still work in 3x1 Eyefinity, but I changed my 7990s out for 2 290Xs yesterday.
> 
> So are your monitors DVI and DisplayPort? And when you connected 2 to DVI and one to DisplayPort, the 120Hz did not work?
> 
> I have reported this a couple of times via the beta driver feedback form on AMD's website and sent their office an email explaining the problem... But us multi-monitor users are a minority, and I think the driver team might be working hard on Mantle right now.
> 
> Just playing the waiting game now.


My monitors are all DVI,
so an adapter is needed for the DisplayPort.

I checked, and DVI works with those screens, but on the third I only get the 120Hz option once I put the active Apple adapter there.
The other adapters I tested at home don't give me that option.

Still, it works better now than my 7970, as I only need one adapter.


----------



## airisom2

Quote:


> Originally Posted by *bond32*
> 
> The core is clocked 30 mhz higher than a stock 290x? Or am I missing something...


He used a Powercolor 290X BIOS that has a factory 1030MHz core. In the benches, he downclocked it to 1000MHz to compare it clock for clock with the overclocked 1000MHz 290 BIOS. If you look in the top right corner of the first two benches, you'll see that both are at 1000MHz.
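The clock-for-clock comparison described above is just score-per-MHz arithmetic. A rough sketch with placeholder numbers (these are not RedRage's actual scores, just illustrative values):

```python
# Sketch: comparing two BIOSes "clock for clock" by normalizing
# a benchmark score to the core clock it was run at.

def score_per_mhz(score: float, core_mhz: float) -> float:
    return score / core_mhz

def clock_for_clock_gain(score_290: float, score_290x: float, mhz: float) -> float:
    """Percent gain of the 290X BIOS over the 290 BIOS at the same core clock."""
    return (score_per_mhz(score_290x, mhz) / score_per_mhz(score_290, mhz) - 1) * 100

# Placeholder scores at a matched 1000 MHz core:
print(round(clock_for_clock_gain(9000, 9400, 1000), 1))  # 4.4
```

Matching the core clock first (as RedRage did) is what isolates the extra shaders: any remaining score difference can't be explained by clock speed.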


----------



## petedread

There are Z77 and Z87 boards that support two cards running at PCIe x16. Mine does, as do the Sniper 5 and the Asus ROG Formula. The Gigabyte Z77X-UP7 comes to mind, but there are others.


----------



## themasterpiece1

When are non-reference 290s slated to be out? I am looking to upgrade to either a 290 or a 780. I only air cool. I run a single 1080p 120Hz monitor.

Currently have 580 SLI. Thinking of going either 7970 CF, 290, or 780.


----------



## RAmable

Set up my Gigabyte R9 290 over the weekend.
Installed the EK 290X waterblock and backplate.
Flashed to the 290X Asus.rom.
It doesn't unlock to a 290X, since the Valley benches are identical.
I can OC stable to [email protected] 1350 mem.
I can bench Valley at [email protected], but Tomb Raider shows artifacts (artifacts in Tomb Raider, who'd have thought...).









I put a vid on YouTube that demonstrates the sound of the stock fan under load before I put the waterblock on. I also found an issue with the factory install of the stock cooler: it seems they left a piece of tape between the GPU and the heatsink. This caused the card to reduce clocks to 676MHz at 95C instead of 947MHz.


----------



## kcuestag

Anyone here with a Sapphire R9 290X that has Hynix memory? If so, could you please tell me what the P/N of your GPU is? It should show on the box.

I'm asking because I just returned mine (Elpida memory) and I want to tell them which P/N is for the ones that have Hynix, so they can give me one of those instead.


----------



## Arizonian

Quote:


> Originally Posted by *ReHWolution*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Count me in, guys!
> 
> 
> 
> 
> 
> 
> 
> Benching atm to get a maximum stable OC. By the time I'm writing this message, I have successfully benched @ 1150/1500 with +50mV on the GPU and +50% power target. Do you suggest I flash the Asus BIOS on my Sapphire R9 290X? Atm it's stock cooled; eventually I'll get an EK WB.


Congrats - added









Quote:


> Originally Posted by *RAmable*
> 
> Set up my Gigabyte R9 290 over the weekend.
> Installed the EK 290X waterblock and backplate.
> Flashed to the 290X Asus.rom.
> It doesn't unlock to a 290X, since the Valley benches are identical.
> I can OC stable to [email protected] 1350 mem.
> I can bench Valley at [email protected], but Tomb Raider shows artifacts (artifacts in Tomb Raider, who'd have thought...).


Post a pic w/OCN name on it and join us.








Quote:


> Originally Posted by *kcuestag*
> 
> Anyone here with a Sapphire R9 290X that has Hynix memory? If so, could you please tell me what the P/N of your GPU is? It should show on the box.
> 
> I'm asking because I just returned mine (Elpida memory) and I want to tell them which P/N is for the ones that have Hynix, so they can give me one of those instead.


So just to reiterate - has anyone with Hynix memory received black screens?


----------



## galil3o

So I just checked my ASIC quality and it's sitting at 82.3%. That's higher than most others I've seen, and according to the little chart it may negatively impact my watercooling overclocks. Is this something I should be concerned about?


----------



## Forceman

Quote:


> Originally Posted by *bond32*
> 
> I'm still skeptical. Running a 290 with a 290X BIOS will produce better benchmark scores, likely in the range shown. I have still yet to see someone with a 290 run Valley at the 290X stock clocks, then change to the 290X BIOS and run it again.


I've run numerous tests (including shader specific tests) clock for clock and the 290X BIOS is faster across the board. If you check out the unlock thread you'll find all the evidence you are looking for.


----------



## Dan848

Quote:


> Originally Posted by *brazilianloser*
> 
> The only thing stopping me from buying my second 290 is the fact that the first went bananas and I am still waiting on the replacement, which should arrive Monday or Tuesday... The other fact is that even though some reviews show an 860W PSU being enough, in reality that is not the case, and buying a new one when I just got the AX860 not more than two months ago is a shame. Hopefully things can start going in the right direction for me soon and I can get my PC pimped out.


If you have good [clean, stable] power from the wall, a good motherboard, and an 80 Plus Certified PSU, 850 watts is more than enough to power a single R9 290. I am using a Corsair TX650M, and my R9 290 is stable with a 17% GPU and 16.5% memory overclock. I have not tried anything higher than 17% because, at my older age and on too many medications, I do not feel like tweaking hardware to the maximum as in years past. The GPU is very stable at a 17% overclock; however, I might bump it up to 20% using CCC.

Maximum power draw during benchmarking spikes around 450 watts for my system [including an Intel i5 3570K CPU overclocked to 4.6GHz] and my power supply barely gets warm.

Nothing beats quality. I don't care about what DVD drive/burner I have; however, I do care about the important components in my computer, including the case [good airflow], power supply, motherboard, RAM, SSD, CPU and a very good air cooler for it [no hassles with water over long periods or an uncooperative installation], and hard drives.

I suppose that is most of a computer build. I purchase the best that I can afford, and if I can't afford something good I wait until I have the money.

I purchase Sapphire video cards now; if you have noticed, they only make AMD cards. Of course any company can let a dud slip through; that is what warranties are for [and keeping a back-up video card just in case].

If your power supply is 80 Plus Gold Certified or better, an 850 watt PSU should drive two R9 290s.
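The headroom reasoning above can be sketched as a back-of-the-envelope sum (the wattage figures below are rough assumptions for illustration, not measurements of any particular system):

```python
# Sketch: rough PSU headroom estimate for one or two R9 290s.
# All component wattages are ballpark assumptions, not measured values.

GPU_LOAD_W = 290        # assumed worst-case board power per R9 290
CPU_OC_W = 150          # assumed draw of an overclocked quad-core CPU
REST_OF_SYSTEM_W = 100  # board, RAM, drives, fans (assumption)

def system_draw(num_gpus: int) -> int:
    """Estimated total system draw in watts."""
    return num_gpus * GPU_LOAD_W + CPU_OC_W + REST_OF_SYSTEM_W

def headroom(psu_watts: int, num_gpus: int) -> int:
    """Watts of margin left on the PSU under these assumptions."""
    return psu_watts - system_draw(num_gpus)

print(system_draw(1), headroom(850, 1))  # 540 310
print(system_draw(2), headroom(850, 2))  # 830 20
```

Under these assumptions a single 290 leaves ample margin on 850 W, while CrossFire with heavy overclocks eats nearly all of it, which is the crux of the disagreement in this exchange.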


----------



## hotrod717

Quote:


> Originally Posted by *bond32*
> 
> I'm still skeptical. Running a 290 with a 290X BIOS will produce better benchmark scores, likely in the range shown. I have still yet to see someone with a 290 run Valley at the 290X stock clocks, then change to the 290X BIOS and run it again.


Actually the cards have a 290X chip; "unlock" is not the correct term. Visit the unlock thread. The bare die shows a 290X chip on the cards that have been observed. Once I remove my stock cooler, I will verify before waterblocking my card.


----------



## brazilianloser

Quote:


> Originally Posted by *Dan848*
> 
> If you have good [clean, stable] power from the wall, a good motherboard, and an 80 Plus Certified PSU, 850 watts is more than enough to power a single R9 290. I am using a Corsair TX650M and my R9 290 is stable with a 17% GPU and 16.5% memory overclock. I have not tried anything higher than 17% because, at my older age and on too many medications, I do not feel like tweaking hardware to the maximum as in years past. The GPU is very stable at a 17% overclock, however, so I might bump it up to 20% using CCC.
> 
> Maximum power draw during benchmarking spikes around 450 watts for my system [including an Intel i5 3570K CPU overclocked to 4.6GHz] and my power supply barely gets warm.
> 
> Nothing beats quality. I don't care about what DVD drive/burner I have, however, I do care about the important components in my computer, including case [good airflow], power supply, motherboard, RAM, SSD, CPU and very good air cooler for it [no hassles with water over long periods or uncooperative installation], and hard drives.
> 
> I suppose that is most of a computer build. I purchase the best that I can afford and wait until I have the money if I can't afford something good.
> 
> I purchase Sapphire video cards now; if you have noticed they only make AMD cards. Of course any company can let a dud slip through, that is what warranties are for [and keeping a back-up video card just in case].
> 
> If your power supply is 80 Plus Gold Certified or better, the 850 watt PSU should drive two R9 290's.


You misunderstood me. I have an AX860 and I am aware that I can power a single 290 no problem. My point was that I am holding back on getting a second 290 (crossfire) because even though some reviews show them not consuming so much power at stock, in the real world that is a totally different situation.


----------



## Skrillex101

Quote:


> Originally Posted by *Gilgam3sh*
> 
> *ACCELERO XTREME III COIL WHINE "FIX"*
> 
> OK GUYS!! The "fix" for not getting that annoying coil whine sound when not running the Xtreme III fans at 100% is to use the Molex plug you get with the cooler. I run it at 7v now and it's dead quiet; temps are fine, about 35°C, and NO MORE crazy coil whine sound.


Is there a different solution to this?

I got the Club 3D 290X, and I've attached the Arctic Accelero Xtreme III to it; the sound is insane when it's not running at 100%.

Only at 100% fan speed does the sound go away, but the sound from the fans at full speed is too much imo for normal browsing etc.

Regards
Skrill


----------



## nemm

Quote:


> Originally Posted by *Arizonian*
> 
> So just to reiterate - has anyone with Hynix memory received black screens?


Yes, the 1st was RMA'd after completely dying, and now the replacement also gives black screens, whereas the Elpida did not.
While waiting for the RMA, using only the Elpida sample, there were no problems for over a week running at 1180/1450 @1.297, and upon receiving the replacement I tested to find the max clocks using the same method as always, which resulted in the same clocks. I didn't have time to fully test it like the first one, but looping H4 maxed at 1440p without problem should have sufficed.
Installed in crossfire, the cards began to instantly black screen at those clocks, so I dropped them to 1160, which resulted in a black screen after 30 or so minutes. I changed the memory to stock with the same result. Dropped the core to 1140 with memory at stock: no black screen after an hour, then bang, black screen :s.
Currently testing 1125/1450 @1.297 with no black screen after 2 hours. This is a core problem and not memory in my opinion, since I can bench at 1200/1500 @1.297 but cannot play games without problems at 1140/1450 @1.297, and changing the memory clock has the same outcome, whereas lowering the core proves to be more of a solution to the black screen.


----------



## RAmable

My watercooled total system is pulling 530W from my UPS (metered) under load. Idles at 252W.

Devices using UPS power:
R9 290 OCed to [email protected]
i7 4770 (non-OC atm)
16GB 1866 ram
3 x 256GB Samsung 840Pro SSDs
2TB Seagate 5900rpm
4TB Seagate 5900rpm
D5 waterpump
8 low speed fans
3x 24" LED LCDs (34W usage each) ~100W total
cable modem/router

The video card is the big power user, obviously. Under full load the system is constantly drawing between 415 and 430W (excluding the LCDs), which is fine for my HX1000 PSU; two cards would push it close to or over 700W depending on the overclocking. FYI, I don't know how accurate the UPS reporting software is.


----------



## Mr357

Quote:


> Originally Posted by *RAmable*
> 
> My watercooled total system is pulling 530W from my UPS (metered) under load. Idles at 252W.
> 
> Devices using UPS power:
> R9 290 OCed to [email protected]
> i7 4770 (non-OC atm)
> 16GB 1866 ram
> 3 x 256GB Samsung 840Pro SSDs
> 2TB Seagate 5900rpm
> 4TB Seagate 5900rpm
> D5 waterpump
> 8 low speed fans
> 3x 24" LED LCDs (34W usage each) ~100W total
> cable modem/router
> 
> The video card is the big power user obviously. Under full load the CPU system is constantly drawing between 415 to 430W (removing the LCDs) which is fine for my HX1000 PSU, two cards would push it close to or over 700W depending on the overclocking. FYI, I don't know how accurate the UPS reporting software is.


Good to know since that likely means my 750W can provide enough juice.


----------



## zpaf

Quote:


> Originally Posted by *RedRage*
> 
> Powercolor R9 290 Flashed to Powercolor R9 290x Bios. Clocks are same on both Bios and Load temps are 61c with Arctic Cooler Accelero.
> 
> R9 290 Bios
> 
> 
> R9 290x Bios


This is my result with Asus R9 290 non X at 1000MHz.


----------



## Skrillex101

Just for anyone interested in power consumption.

Im running this:

Intel I5 4670K @ 4.5GHZ + Noctua 14 Cooler
8GB G.Skill 1800 MHZ
Asus Z87 PRO
3x 1TB SATA Drives
1x 250GB Samsung SSD Drive
Club 3D R9 290X @ 1100Mhz/1385Mhz

4 USB Devices, 3 Monitors (DVI, DVI, DP), 4 LEDS, 5 Fans (4x 120mm, 1x 140mm)

All this is running fine in any game with this PSU: *Seasonic Platinum Fanless 520W PSU*


----------



## Amhro

TrueAudio comparison (be sure to use headphones)


----------



## Loktar Ogar

Quote:


> Originally Posted by *cowie*
> 
> yeah but they still all have their cards with no issues at all. In general I must have seen more than half the people that got 290x cards say they were RMA'ing or just getting their money back.
> 
> I am keeping mine another week if I can not fix stock clock blsc's it is going back


Please do update us... I need to take a closer look at these issues.







Green or Red team.. Decisions.. Decisions...


----------



## Mr357

Quote:


> Originally Posted by *Amhro*
> 
> TrueAudio comparison (be sure to use headphones)


What I'm wondering is if watching that video is the same as actually experiencing it for yourself. I honestly couldn't tell much of a difference between the old stereo and TrueAudio. A lot of us already resent it for how much time they spent talking about it at the AMD event in Hawaii.


----------



## skupples

All I notice is less cut out on the opposite ear. I would assume it's hard to replicate this demo on a system that doesn't actually make use of "true audio"


----------



## TooBAMF

Quote:


> Originally Posted by *Mr357*
> 
> What I'm wondering is if watching that video is the same as actually experiencing it for yourself. I honestly couldn't tell much of a difference between the old stereo and TrueAudio. A lot of us already resent it for how much time they spent talking about it at the AMD event in Hawaii.


Seems like something would be lost in the regular stereo encoding for upload to Youtube. I couldn't tell much of a difference either.


----------



## jerrolds

I wonder if its the same thing as the "Virtual Barbershop" youtube video - cuz that was AWESOME


----------



## Mr357

Quote:


> Originally Posted by *TooBAMF*
> 
> Seems like something would be lost in the regular stereo encoding for upload to Youtube. I couldn't tell much of a difference either.


Exactly, it's like that commercial for Sharp's "Quatron" TV's with George Takei (Sulu on Star Trek). They're using a medium of advertisement that doesn't allow for the viewer to actually experience the highlighted feature.



Spoiler: Warning: Spoiler!


----------



## nemm

Decided to step away from the LLC command and apply more voltage.

Sapphire XF 1200/1450 with +257mv,



Fingers crossed I can play at these clocks.


----------



## tsm106

Quote:


> Originally Posted by *Mr357*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TooBAMF*
> 
> Seems like something would be lost in the regular stereo encoding for upload to Youtube. I couldn't tell much of a difference either.
> 
> 
> 
> Exactly, it's like that commercial for Sharp's "Quatron" TV's with George Takei (Sulu on Star Trek). They're using a medium of advertisement that doesn't allow for the viewer to actually experience the highlighted feature.
> 
> 
> 
> Spoiler: Warning: Spoiler!
Click to expand...

Except Sharp's Quatron is a real gimmick. TrueAudio... we can't know until we listen to it first hand, and thus the problem. It highlights the fundamental weakness of the medium. You can't approximate how awesome hi-fi speakers are thru budget speakers. And you can't imagine how great surround is thru a youtube video, lol. But marketers be marketers.


----------



## Mr357

Quote:


> Originally Posted by *tsm106*
> 
> Except Sharp's Quatron is a real gimmick. True Audio... we can't know until we listen to it first hand and thus the problem. It highlights the fundamental weakness of the medium. You can't approximate how awesome high-fi speakers are thru budget speaks. And you can't imagine how great surround is thru a youtube video, lol. But marketers be marketers.


I wasn't directly comparing it to Quatron. I was comparing the troubles of advertising the two things.


----------



## tsm106

I was more or less adding to your post. It's funny though that Sharp's is an actual gimmick since we can't see yellow directly, which is ironic considering with Quatron they added a yellow light.


----------



## ukaussi

Quote:


> Originally Posted by *RAmable*
> 
> Setup my Gigabyte R9 290 on the weekend.
> Installed the EK 290x waterblock and backplate.
> Flashed to 290x Asus.rom
> Doesn't unlock to a 290x since the Valley benchs are identical.
> I can OC stable to [email protected] 1350 mem
> I can bench Valley at [email protected] but Tombraider shows artifacts (artifacts in Tombraider, who'd of thought...)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I put a vid on Youtube that demonstrates the sounds of the stock fan under load before I put the waterblock on and I also found an issue with the stock fan install at factory, seems that they left a piece of tape on the GPU between the heatsink. This caused the card to reduce clocks to 676MHz at 95C instead of 947MHz.


You should do a separate post on the GPU tape part of the video to see if anyone else has found this. Sounds kind of weird and maybe a one-off, but you could go all "conspiracy theory" and it could well be why some 290/290Xs with the ref cooler are throttling worse than others, with somebody sabotaging them at the vendor's factory


----------



## RAmable

I tweeted to Gigabyte_USA and AMDRadeon about this.


----------



## bpmcleod

Anyone have any problems with GPU Tweak not allowing access to the second card? I can switch to it, but can't adjust anything on it with GPU Tweak...


----------



## Slomo4shO

Quote:


> Originally Posted by *bpmcleod*
> 
> Anyone have any problems with GPU Tweak not allowing access to the second card? I can switch to it, but cant adjust anything on it with GPUTweak...


You likely have the settings set to mirror the OC of the first card.


----------



## bpmcleod

Quote:


> Originally Posted by *Slomo4shO*
> 
> You likely have the settings set to mirror the OC of the first card.


Nope, I thought of that also. It is disabled. The fan speed on the second card never ramps up above 1400 rpm or so, while I can set the first one to whatever I feel like. My only thought is that the second card is flashed to a 290x bios and the first is still stock on a 290 bios. Would this conflict with the xfire? It runs fine but I can't control the second card with anything. Just tried MSI Afterburner also


----------



## Scorpion49

Quote:


> Originally Posted by *Arizonian*
> 
> So just to reiterate - has anyone with Hynix memory received black screens?


I have.


----------



## Moragg

I saw Arizonian's post about 290/290X problems (different thread) and, rather interestingly, he said mainly boards with Elpida memory were affected.

This is just an out-there conjecture, but wasn't it the case that most of the cards that unlocked to 290Xs had Elpida? It might be worth seeing if the 290s with problems also flash to 290X successfully more often.


----------



## Mr357

Quote:


> Originally Posted by *tsm106*
> 
> I was more or less adding to your post. It's funny though that Sharp's is an actual gimmick since we can't see yellow directly, which is ironic considering with Quatron they added a yellow light.


Okay, I was just worried that you thought I was a victim of the gimmick.


----------



## hatlesschimp

Has anyone had any trouble with powercolor 290x?


----------



## bpmcleod

Has anyone run memoryinfo with more than one card? Also, anyone have any idea why, after flashing the Asus 290x bios onto an ASUS 290, it got all the way to the login screen and freezes there every time? And the same with my PowerColor 290: I have it flashed to a 290x bios, but if I try to flash it to the ASUS 290 bios it hangs on the login screen as well. I am just trying to get them on similar BIOSes to see if it fixes the problem of only being able to control one card with GPUTweak/MSI


----------



## Sgt Bilko

Since I'm still waiting for a better driver to come out, I've decided to put my old 4850's in my PC and bench them









might continue with my 6970, 7970 and then the 290x just for fun


----------



## ReHWolution

Could anyone tell me a safe daily overvolt for the 290X? Let's say 1.35? 1.30?


----------



## RAmable

Quote:


> Originally Posted by *bpmcleod*
> 
> Has anyone run memoryinfo with more than one card? Also, anyone have any idea why, after flashing the Asus 290x bios onto an ASUS 290, it got all the way to the login screen and freezes there every time? And the same with my PowerColor 290: I have it flashed to a 290x bios, but if I try to flash it to the ASUS 290 bios it hangs on the login screen as well. I am just trying to get them on similar BIOSes to see if it fixes the problem of only being able to control one card with GPUTweak/MSI


Do you have an Asus motherboard?


----------



## RAmable

Quote:


> Originally Posted by *ReHWolution*
> 
> Anyone could tell me a safe daily overvolt for the 290X? let's say 1.35? 1.30?


Asus GPU tweak sets it to 1.250v on the power savings preset. 950 GPU 90% Target


----------



## ReHWolution

Quote:


> Originally Posted by *RAmable*
> 
> Asus GPU tweak sets it to 1.250v on the power savings preset. 950 GPU 90% Target


Overvolt for a daily overclock, not downvolt/downclock







I'm not an "underclock" kind of person


----------



## Scorpion49

lol...

http://www.3dmark.com/3dm11/7515885


----------



## ReHWolution

Quote:


> Originally Posted by *Scorpion49*
> 
> lol...
> 
> http://www.3dmark.com/3dm11/7515885


Great score, but that CPU is holding back those beasts :\


----------



## Sgt Bilko

Quote:


> Originally Posted by *Scorpion49*
> 
> lol...
> 
> http://www.3dmark.com/3dm11/7515885


Well that's got me sold... guess I'm getting another 290x after Xmas along with a 1440p monitor.


----------



## Scorpion49

Quote:


> Originally Posted by *ReHWolution*
> 
> Great score, but that CPU is holding back those beasts :\


It's a really poor score, and 4000 points less than what the same cards scored a few days ago. There is something terribly wrong with one or both of them; I'm about to try them one at a time. I'm getting VERY frequent black screen lockups now (at least 10 in the last 3 hours) and white flashes.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Onglar*
> 
> Really REALLY slow, i can beat that with a single 290x.


You are also running a 3930k correct?


----------



## RAmable

Quote:


> Originally Posted by *ReHWolution*
> 
> Overvolt for a daily overclock, not downvolt/downclock
> 
> 
> 
> 
> 
> 
> 
> I'm not a "underclock" kind of person


Sorry, misread your post.

I am running 1.331 for 1230MHz/1350 mem; haven't gone any higher since I still get artifacts at 1.34 and 1.35.
Each extra .01v also increases the water temps by 2C, so it's not worth overvolting any higher on this card.
I might keep it at 1220/1350 since that only requires 1.281v to run
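RAmable's observed ~2°C of water-temp rise per extra 0.01 V can be turned into a quick back-of-the-envelope estimate. This is a minimal sketch assuming the trend stays linear, which it will not at higher voltages (dissipated power grows roughly with V² × f), so treat the result as a lower bound:

```python
# Linear extrapolation of the ~2 degC water-temp rise per +0.01 V
# reported above. Purely illustrative: real heat output grows faster
# than linearly with voltage, so this is a lower-bound estimate.

def water_temp_delta(v_base, v_new, degc_per_step=2.0, step=0.01):
    """Estimated loop water temperature rise going from v_base to v_new."""
    return (v_new - v_base) / step * degc_per_step

# Going from the 1.281 V setting to the 1.331 V setting quoted above:
print(water_temp_delta(1.281, 1.331))  # five 0.01 V steps -> ~10 degC
```

Ten extra degrees on the loop for ~10 MHz of core is the kind of trade-off that makes "not worth overvolting any higher" a reasonable call.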


----------



## Gunderman456

Quote:


> Originally Posted by *bpmcleod*
> 
> Has anyone run memoryinfo with more than one card? Also, anyone have any idea why, after flashing the Asus 290x bios onto an ASUS 290, it got all the way to the login screen and freezes there every time? And the same with my PowerColor 290: I have it flashed to a 290x bios, but if I try to flash it to the ASUS 290 bios it hangs on the login screen as well. I am just trying to get them on similar BIOSes to see if it fixes the problem of only being able to control one card with GPUTweak/MSI


It looks like the Asus bios is looking for those extra shaders and not finding them. Best bet is to keep the Asus 290 bios and run with that, or try another 290x bios from a different vendor.

http://www.techpowerup.com/vgabios/index.php?architecture=ATI&manufacturer=&model=R9+290X&interface=&memType=&memSize=
Quote:


> Originally Posted by *ReHWolution*
> 
> Anyone could tell me a safe daily overvolt for the 290X? let's say 1.35? 1.30?


Consensus is 1.35 volts for 24/7.


----------



## Scorpion49

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You are also running a 3930k correct?


The CPU has nothing to do with this. The graphics score will remain nearly the same with just about any CPU. I have Titan 3DMark11 runs with a Celeron G530 and they are identical on the graphics score to runs with a 4.8GHz 3930K. That's why the Physics score doesn't influence the combined that much.

I just pulled card 2 out of the system and it was almost cold, I'm guessing crossfire wasn't working.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Scorpion49*
> 
> CPU has nothing to do with this. The graphics score will remain nearly the same with just about any CPU. I have Titan 3Dmark11 runs with a celeron G530 and they are identical on the graphics score to runs with a 4.8ghz 3930k. Thats why the Physics score doesn't influence the combined that much.
> 
> I just pulled card 2 out of the system and it was almost cold, I'm guessing crossfire wasn't working.


I was always told that the CPU has more of an influence in 3DMark than what you're suggesting. Guess I've been wrong

Edit: just went back and noticed the p score......nvm me, carry on


----------



## Scorpion49

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I was always told that the CPU has more of an influence in 3DMark than what you're suggesting. Guess I've been wrong
> 
> Edit: just went back and noticed the p score......nvm me, carry on


It does to some extent, but not as big as this. Well card 1 failed miserably. Set it at 55% fan and it black screened at test 3. So now I'm on card 2, we'll see how it does.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Onglar*
> 
> Bet you cant beat my 7970 score with that CPU lol.
> 
> Show me your best 3d11/FS score


And what would that achieve?


----------



## RAmable

Now that I am at home I can submit the 290 proof to join the club.

Gigabyte R9 290 reference cooler model.
Equipped with an EK 290x acetal waterblock and backplate in a custom loop.


----------



## bpmcleod

Quote:


> Originally Posted by *Gunderman456*
> 
> It looks like the Asus is looking for those extra shaders and not finding them. Best bet is to flash both to the Asus 290 unlocked bios and run with that.
> Consensus is 1.35 volts for 24/7.


I've tried that. My powercolor won't go to the asus 290. It locks up on the login also. It "appears" to have unlocked with the 290x bios so idk. They may not be able to be on the same bios. I uninstalled CCC and was able to control them with msi and gputweak. It seems it was getting in the way. I decided to go back to win7 ult 64 bit and try my luck there. 8.1 has been plagued with problems anyways.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Onglar*
> 
> You sound hesitant.


Mobile device but sure go look through the thread if you really want them.


----------



## DeadlyDNA

So I wanted to share a tidbit with everyone. First off, obviously I would not recommend anyone try this. I did, however, disconnect the blower fan on my stock-cooler R9 290 and run a benchmark. I can say without a doubt that in MY case I was able to create a black screen by doing this. My guess is it was the card's thermal protection shutting it down.
I reconnected the fan and the video card works fine. So to summarize....

A black screen CAN be caused by thermal/protection of the card.

I would recommend that anyone having black screens, if they feel brave enough, take apart the card and examine all the contacts between the heatsink and the GPU core and the VRMs/memory. And I am NOT saying all black screens have a single root cause such as this.

Thanks.


----------



## Gunderman456

Quote:


> Originally Posted by *bpmcleod*
> 
> I've tried that. My powercolor won't go to the asus 290. It locks up on the login also. It "appears" to have unlocked with the 290x bios so idk. They may not be able to be on the same bios. I uninstalled CCC and was able to control them with msi and gputweak. It seems it was getting in the way. I decided to go back to win7 ult 64 bit and try my luck there. 8.1 has been plagued with problems anyways.


I reread and saw that, actually; see my edit and included link. Basically, try another 290x bios from another vendor.


----------



## Falkentyne

Quote:


> Originally Posted by *nemm*
> 
> Yes, 1st was RMA after completely dying and now the replacement also gives black screen whereas the Elpida did not.
> While waiting for RMA only using the Elpida sample there were no problems for over a week running at 1180/1450 @1.297 and upon receiving replacement I tested to find the max clocks using same method as always which resulted in the same clocks. I didn't have time to fully test like the first one but looping H4 maxed 1440p without problem should have sufficed.
> Installed in crossfire the cards were beginning to instant black screen at those clocks so I dropped them to 1160 which resulted in black screen after 30 or so minutes. I changed the memory to stock with same result. Dropped the core to 1140 and memory stock with no black screen after an hour then bang, black screen :s.
> Currently testing 1125/1450 @1.297 with no black screen after 2 hours. This is a core problem and not memory in my opinion since I can bench at 1200/1500 @1.297 but cannot play games without problems at 1140/1450 @1.297 and changing the memoryclock has same outcome whereas lowering the core proves to be more of a solution to the black screen.


Can you do something to test this?

Raise the core back to your 1200 MHz speed, but do one of two things:

Run the OLDER GPU-Z 1.7.3 so it 'locks' your card at full 3D speed on the desktop (and thus on the load screen); that should stop the RAM clocks from dropping to 150. Then see if you black screen anymore.

You can also try Afterburner with the overclock-without-PowerPlay-support option and see if that also locks the RAM clocks. Then see if you stop black screening.


----------



## Jpmboy

Quote:


> Originally Posted by *RAFFY*
> 
> Isn't this still the process that AMD uses with their CPU's. Hence why they came out with tri-core CPU's a couple years ago. I'd just buy the 290X.
> Thanks, I'll have to go in and change my default settings. I was pretty stoked at how high my cards scored but was a little skeptical too.
> Just made that quick change thank you.
> For me I know its not my cards. Since the new drivers (9.2) I haven't received any hard lock ups. Now my only problem is direct x crashes sometimes in BF4. But since the new update in BF4 that has decreased a little bit. In games like COD I have no issues at all except for the game not properly accepting crossfire and my game looking like im on low settings.
> So with this block you do not need to use any thermal pads? I watched the video but it didn't show the side of the block that contacts the card. According to the video it looks like just great is used, seems pretty awesome and I would expect better cooling than with thermal pads on the EK. Can you post or PM me your temps once you have them installed please.


Trying to guage contribution of physx to overall scoring in firestrike? pg 35:

3DMark_Technical_Guide.pdf 1700k .pdf file


more formulaic than it seems...


----------



## Sgt Bilko

Quote:


> Originally Posted by *Onglar*
> 
> http://www.3dmark.com/compare/fs/1005312/fs/1151742


Uh huh.....nothing unexpected there.

Nice clock on the 7970 btw


----------



## Sgt Bilko

Quote:


> Originally Posted by *Onglar*
> 
> Thank you, you could probably get more physics out of the 8350. Set your RAM to 2133 or so and tighten the timings. CPUNB at 2600 atleast, and HT at 3000.


Getting new RAM next week, and this CPU is very new (less than a week) so I haven't had much time to tweak everything yet.........work gets in the way of everything


----------



## IBIubbleTea

Sorry, I'm kinda new to this.
How would I flash the 290x bios on the 290?
Also do all of the 290's have dual bios so if I mess up I could go back to the old one?
Will this ruin the card?
Can I still RMA it?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Onglar*
> 
> Feel free to PM me for some tweaking tips. I have quite a bit of Vishera tweaking experience.


Thanks for the offer


----------



## galil3o

So I'm having a bit of a voltage issue. I'm still on stock air waiting for my loop to make its way through the snail mail, trying to see how far the card will go on air so I have a starting point once it's under water.

I'm not getting consistent results from r0l4n's Afterburner voltmod program. I black screen at 1150 core (leaving mem at 1250) with the LLC mod enabled and power options maxed out in Afterburner. GPU-Z reads a max voltage of 1.273 when it black screens.

However, I was able to run the core at 1185 with 1.3v using Asus GPU Tweak maxed out.

All my temps, both core and VRM, are under 60c with the fan at 100%.

I'd like to be able to push past 1.3v once my cooling situation improves. Are there any other options, such as the PT bioses (are those safe on water?)


----------



## Arizonian

Quote:


> Originally Posted by *RAmable*
> 
> Now that I am at home I can submit the 290 proof to join the club.
> 
> Gigabyte R9 290 reference cooler model.
> Equipped with an EK 290x acetal waterblock and backplate in a custom loop.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Jpmboy

Quote:


> Originally Posted by *IBIubbleTea*
> 
> Sorry but I'm kinda new to this but.
> How would I flash the 290x bios on the 290?
> Also do all of the 290's have dual bios so if I mess up I could go back to the old one?
> Will this ruin the card?
> Can I still RMA it?


Follow the instructions *here*. You just need the 290x bios.


----------



## galil3o

Quote:


> Originally Posted by *Onglar*
> 
> Try overvolting (100-200mV or so) without overclocking. Launch Valley, and spin around as fast as you can. Tell me what the result is.


Using afterburner without the LLC mod, +100mV on the Core Voltage, Aux at 0, 100% fan speed, no overclock.

Nothing happens, I'm getting about 80 fps


----------



## tsm106

Quote:


> Originally Posted by *galil3o*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Onglar*
> 
> Try overvolting (100-200mV or so) without overclocking. Launch Valley, and spin around as fast as you can. Tell me what the result is.
> 
> 
> 
> Using afterburner without the LLC mod, +100mV on the Core Voltage, Aux at 0, 100% fan speed, no overclock.
> 
> Nothing happens, I'm getting about 80 fps
Click to expand...

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/6120_40#post_21206032

AB breaks my setup in a lot of ways.

Quote:


> Originally Posted by *hotrod717*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> You are not thinking in terms of Hawaii. The goal is not to achieve the highest clock on the lowest volts.
> 
> Yer using AB? If you are stop. It sucks. I hear beta 16 is better with the llc applet, but I think its just a pita. Gputweak is rubbish but for me ocing with the relationships in mind, and keeping the volts within range I don't blackscreen. In gputweak you can get a relative idea of the volts needed for the clocks, by clicking the lock button between core and voltage. then click the up button to see it roll up. That gives you an idea of the volts u need for X clock. My real world volts vs clocks are higher than gputweaks locked settings btw.
> 
> Btw, go to settings and remove the skin, because there are a lot of settings that the skins sort of hides.
> 
> 
> 
> I see you're finally coming to terms with the pros of gpu tweak.
Click to expand...

Haha, I probably know it better than you but I still feel dirty using it everytime.


----------



## galil3o

Quote:


> Originally Posted by *Onglar*
> 
> Did ya spin around fast?


Yep, loaded up the Extreme HD preset and set the camera to walk, then spun as fast as I could (it stops when the cursor hits the edge of the screen, so I had to keep clicking).

The only out-of-the-ordinary thing I saw was it reporting my card temp as slightly under 60000 degrees


----------



## galil3o

Quote:


> Originally Posted by *tsm106*
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/6120_40#post_21206032
> 
> AB breaks my setup in a lot of ways.
> Haha, I probably know it better than you but I still feel dirty using it everytime.


So I should switch back to GPU Tweak. It's been the best for me so far, but even when I set the core to 1.41 V, GPU-Z only reads 1.3 V max


----------



## Jpmboy

Quote:


> Originally Posted by *galil3o*
> 
> So I should switch back to GPU Tweak. It's been the best for me so far, but even when I set the core to 1.41 V, GPU-Z only reads 1.3 V max


That's vdroop. It's a good thing, unless you want to OC very carefully; it will prevent your VRMs from doing the flashbulb trick.
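The gap galil3o saw (1.41 V set, ~1.3 V read under load) can be pictured with a toy droop model. This is a deliberate simplification assuming droop is a fixed offset partially compensated by load-line calibration — real load-line behavior is current-dependent:

```python
# Toy vdroop model: the voltage the VRM actually delivers under load
# sags below the set point. The 1.41 V set / ~1.3 V read figures come
# from the exchange above; llc=1.0 models full LLC compensation.
# Simplified on purpose -- real droop scales with load current.

def loaded_voltage(v_set, droop, llc=0.0):
    """Effective core voltage under load, with optional LLC compensation."""
    return v_set - droop * (1.0 - llc)

v_set, v_read = 1.41, 1.30
droop = v_set - v_read                                # ~0.11 V of observed sag

print(round(loaded_voltage(v_set, droop), 2))         # no LLC: back to 1.3
print(round(loaded_voltage(v_set, droop, llc=1.0), 2))  # full LLC: 1.41
```

This is also why the LLC mod discussed earlier in the thread is double-edged: compensating the droop keeps clocks stable but removes the safety margin that protects the VRMs during load transients.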


----------



## galil3o

Quote:


> Originally Posted by *Jpmboy*
> 
> that's vdroop. It's a good thing, unless you want to OC very carefully. Will prevent your vrms from doing the flashbulb trick.


Gotcha, that makes sense now. So my next question would be whats a safe max voltage for a watercooled 290x and would you guys recommend playing around with the PT1 bios once its properly cooled?


----------



## tsm106

Quote:


> Originally Posted by *galil3o*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/6120_40#post_21206032
> 
> AB breaks my setup in a lot of ways.
> Haha, I probably know it better than you but I still feel dirty using it everytime.
> 
> 
> 
> So I should switch back to GPU Tweak. Its been the best for my so far but even when I set the core to 1.41 V GPUz only reads 1.3 V max
Click to expand...

The voltage limit is locked in the BIOS. If you want to play with more than 1.3v loaded then you'll need to flash to one of the unlocked BIOSes (and the modded GPU Tweak), but those are not really for daily use since they have PowerTune removed. Btw, you should not use GPU-Z unless you absolutely have to. Use the built-in monitor in GPU Tweak.


----------



## Scorpion49

So I just re-TIM'd one of my cards: a 15°C drop in idle temps. The compound on these things is horrible, very dry and flaky already. Working on card #2 and something else I thought of to try while I had it apart.


----------



## warbucks

Add me to the club







These will be under water later this week once my EK waterblocks arrive.


----------



## Jpmboy

Quote:


> Originally Posted by *galil3o*
> 
> Gotcha, that makes sense now. So my next question would be: what's a safe max voltage for a watercooled 290X, and would you guys recommend playing around with the PT1 BIOS once it's properly cooled?


For most every use, the ASUS unlocked-voltage BIOS is very good - it works well with ASUS GPU Tweak. Listen to tsm regarding the PT-series BIOSes.


----------



## galil3o

Quote:


> Originally Posted by *Jpmboy*
> 
> for most every use, the asus unlocked voltage bios is very good - works well with asus gpu tweak. Listen to tsm regarding the PT series bios'.


I agree, though will my water cooling have any effect on what speed my card starts to artifact at while running 1.3 V (1200 MHz on air @ 100% fan), or will it simply rid me of this turbine sound?

Also, I can't view any voltages from the GPU Tweak monitor, just GPU temp, usage, fan duty, fan speed, GPU clock, mem clock, mem usage and power target %


----------



## Jpmboy

Quote:


> Originally Posted by *galil3o*
> 
> I agree, though will my water cooling have any effect on what speed my card starts to artifact at while running 1.3 V (1200 MHz on air @ 100% fan), or will it simply rid me of this turbine sound?
> 
> Also, I can't view any voltages from the GPU Tweak monitor, just GPU temp, usage, fan duty, fan speed, GPU clock, mem clock, mem usage and power target %


If you seated the waterblock well (and dabbed the stock EK thermal pads with some TIM), your VRMs will stay very cool (<60°C). So both: quiet and a better OC, because the temps are controlled.



Eh - check the "monitor voltage" box in settings I think. Should be there.


----------



## tsm106

Quote:


> Originally Posted by *galil3o*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jpmboy*
> 
> for most every use, the asus unlocked voltage bios is very good - works well with asus gpu tweak. Listen to tsm regarding the PT series bios'.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I agree, though will my water cooling have any effect on what speed my card starts to artifact at while running 1.3 V (1200 Mhz on air @100%) or will it simply rid me of this turbine sound?
> 
> Also I cant view any voltages from the gpu tweak monitor, just gpu temp, usage, fan duty, fan speed, gpu clock, mem clock, mem usage and power target %
Click to expand...

Read the link I posted and my posts leading up to it and after it. Remove the skin, and you will see all the options. You have to manually enable all the options you want in both the app and the monitor. Gputweak is incredibly non-user friendly.


----------



## IBIubbleTea

Has anyone been able to flash the 290X BIOS on the MSI or XFX cards? Asking because it comes with Battlefield 4 xD


----------



## Forceman

Quote:


> Originally Posted by *IBIubbleTea*
> 
> Has anyone been able to flash the 290X BIOS on the MSI or XFX cards? Asking because it comes with Battlefield 4 xD


A few on XFX cards.

http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/0_30
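For reference, the unlock guides in that thread generally boil down to an atiflash sequence like the one below. This is a hedged dry-run sketch - the adapter index and ROM filenames are made up, and the `echo` prefixes only print the commands rather than flashing anything. Run the real atiflash utility from a DOS/admin prompt, and always save your stock BIOS first:

```shell
#!/bin/sh
# Dry-run outline of a typical atiflash session (filenames are hypothetical).
# Remove the 'echo' prefixes to run for real with the actual atiflash utility.
ADAPTER=0   # first GPU; confirm the index with 'atiflash -i'

echo "atiflash -i"                              # list adapters
echo "atiflash -s $ADAPTER stock_290.rom"       # back up the stock BIOS first
echo "atiflash -p $ADAPTER unlocked_290x.rom"   # program the new BIOS
# Some guides add -f to force past subsystem-ID mismatches - riskier:
echo "atiflash -p $ADAPTER -f unlocked_290x.rom"
```

If the flash goes wrong, the saved `stock_290.rom` (or the card's second BIOS switch position) is your way back.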


----------



## Scorpion49

Well, I think I solved the black screens. I removed my SoundBlaster X-Fi and I haven't had one since then.

Here is the best I can muster up on the stock BIOS:

FS http://www.3dmark.com/3dm/1663912

11 http://www.3dmark.com/3dm11/7516307


----------



## ZealotKi11er

So tomorrow I will officially join the club with an R9 290. Going to run it on air for a week or so and then snag a water block.


----------



## IBIubbleTea

Quote:


> Originally Posted by *Forceman*
> 
> A few on XFX cards.
> 
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/0_30


So the highest success rate is with the PowerColor?


----------



## Arizonian

Quote:


> Originally Posted by *warbucks*
> 
> Add me to the club
> 
> 
> 
> 
> 
> 
> 
> These will be under water later this week once my EK waterblocks arrive.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## RAmable

The PowerColor OC version has the highest chance, I believe.


----------



## rv8000

Quote:


> Originally Posted by *IBIubbleTea*
> 
> So the highest success rate is with the PowerColor?


So far yes.

Finally got my Gelid Icy Vision installed. Using the ASUS 290X BIOS and GPU Tweak, I'm able to run 1215/1550 stable with max voltage in GPU Tweak. I think I may go water soon - the VRM1 temp shot up to 100°C (through the software reading); I have a feeling there's a bit more left under the hood if I put it under water.

3Dmark 11 - http://www.3dmark.com/3dm11/7516332
Firestrike p - http://www.3dmark.com/fs/1171517


----------



## Falkentyne

Well, it looks like I can do a Firestrike/full 3DMark loop stable at 1125/1500 with +50mv and 80% fan, and 1150/1700 with +100mv at 90% fan. 1125/1500 with +25mv will pass Valley just fine, but will eventually give a few random checkerboard artifacts in Firestrike graphics test 1 (always at the same location, oddly enough); +50mv didn't show that anymore.

I only did 2 passes of 3DMark at +100mv (1150/1700) as it's on the stock cooler and I don't intend to blow up the card. 1100/1400 is stable without a voltage increase, but 1500 on the memory for some reason requires a vGPU +mv increase (I really don't know why).


----------



## DarkZR9

Quote:


> Originally Posted by *Scorpion49*
> 
> Well, I think I solved the black screens. I removed my SoundBlaster X-Fi and I haven't had one since then.
> 
> Here is the best I can muster up on the stock BIOS:
> 
> FS http://www.3dmark.com/3dm/1663912
> 
> 11 http://www.3dmark.com/3dm11/7516307


That would figure, Creative and their stupid drivers.


----------



## Scorpion49

Well, it appears that Sgt Bilko was correct: the 8350 is holding back the 290s a bunch. I compared them at the same settings on a 3930K, and also compared GTX 780 SLI results on an 8350 I found on the website to my 780 SLI on the 3930K, and both are around 3000 points lower on the GPU score. That sucks.


----------



## DarkZR9

Quote:


> Originally Posted by *Scorpion49*
> 
> Well it appears that Sgt Bilko was correct. The 8350 is holding back the 290's a bunch. I compared them to the same settings on a 3930k, and also compared a GTX 780 SLI 8350 results I found on the website to my 780 SLI on 3930k, and both are around 3000 points lower on the GPU score. That sucks.


Not surprising to me at all actually.


----------



## bpmcleod

Quote:


> Originally Posted by *IBIubbleTea*
> 
> So the highest success rate is with the PowerColor?


My PowerColor 290 "unlocked" to a 290X, but my ASUS won't even take the ASUS 290X BIOS.. go figure xD lol


----------



## Scorpion49

Quote:


> Originally Posted by *DarkZR9*
> 
> Not surprising to me at all actually.


It is surprising, because the FX chips have been able to keep up with quad 7970/680 setups on the GPU score side, just not on the physics score. I don't think there is much difference in actual games - I haven't been able to tell any.


----------



## Kelwing

Which group would be considered VRM1 - the three small ones, or the ones in a line? My VRM1 temps get up over 100°C and I black screen. Currently removing the heatsinks to try and redo them. Using the Extreme III air cooler on my 290.


----------



## DarkZR9

Quote:


> Originally Posted by *Kelwing*
> 
> Which group would be considered VRM1? The three small ones or the ones in a line? My VRM1 temps get up over 100c and I black screen. Currently removing heatsinks to try and redo them. Using the Extreme III air cooler on my 290.


Yeah, they should not be getting that hot. I have the same cooler, btw, and I run 1200MHz core with 1.3 Vcore. VRM1 should be the 3 small ones, and even with my overclock I peak around 75°C in BF4 multiplayer; Crysis 3 pulls it up the most, to 93°C.


----------



## broken pixel

Quote:


> Originally Posted by *DarkZR9*
> 
> That would figure, Creative and their stupid drivers.


I found that with W8.1, any driver I use for my X-Fi makes my rig unstable for the most part, unless I use the Creative X-Fi driver from Windows Update.


----------



## utnorris

Quote:


> Originally Posted by *Scorpion49*
> 
> lol...
> 
> http://www.3dmark.com/3dm11/7515885


I was getting the same results, not sure what caused it, but doing a clean install and not installing the Intel GPU driver fixed my issues.


----------



## Scorpion49

Quote:


> Originally Posted by *utnorris*
> 
> I was getting the same results, not sure what caused it, but doing a clean install and not installing the Intel GPU driver fixed my issues.


Yeah.... this is a clean install with an FX 8350. Seems the FX just isn't cut out for this line of work.


----------



## Kelwing

Quote:


> Originally Posted by *DarkZR9*
> 
> Yeah they should not be getting that hot, I have the same cooler btw and I run 1200mhz core with 1.3vc. VRM1 should be the 3 small ones and even with my overclock I peak around 75c in BF4 multiplayer and Crysis 3 pulls it up the most to 93c.


I was running Kombustor and watching temps. 12v to the fans and only had the GPU at 1000MHz, and they were over 100°C in maybe 2 minutes. I pulled the heatsinks off and it looked like I didn't have them seated well - there were what looked like air bubbles in the glue between the VRMs and the heatsinks.


----------



## utnorris

Quote:


> Originally Posted by *Scorpion49*
> 
> Yeah.... this is a clean install with an FX 8350. Seems the FX just isn't cut out for this line of work.


Have you verified your GPU usage is not jumping all over the place on your cards? That is what was causing me to score low and once I did a clean install I was back to getting decent scores.


----------



## tsm106

Quote:


> Originally Posted by *Scorpion49*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DarkZR9*
> 
> Not surprising to me at all actually.
> 
> 
> 
> It is surprising, because the *FX chips have been able to keep up with quad 7970*/680 setups on the GPU score side, just not on the Physics score. I don't think there is much difference in actual games, I haven't been able to tell any.
Click to expand...

LOL, that is most assuredly not true - AMD-CPU quad-7970 setups are around 30% slower on a good day.


----------



## DarkZR9

Quote:


> Originally Posted by *broken pixel*
> 
> I found with w8.1 any driver i use for my xfi makes my rig unstable for the most part unless i use the creative xfi driver within windows update.


I'm using a Win 7 driver for mine and just running it in compatibility mode; seems OK so far.


----------



## Scorpion49

Quote:


> Originally Posted by *tsm106*
> 
> LOL, that is most assuredly not true, AMD cpu quad 7970s are around 30% slower on a good day.


Man, I don't want to bring the 3930k back out. I just re-installed windows.


----------



## DarkZR9

Quote:


> Originally Posted by *Scorpion49*
> 
> Man, I don't want to bring the 3930k back out. I just re-installed windows.


What are you doing with a 8350 when you have a 3930K sitting around?


----------



## tsm106




----------



## Scorpion49

Quote:


> Originally Posted by *DarkZR9*
> 
> What are you doing with a 8350 when you have a 3930K sitting around?


Quote:


> Originally Posted by *Onglar*
> 
> You really running a 8350 when you have a 3930k?
> 
> Ok nice scorpion1776.


Because the RIVE I had sucked. I returned it, and they didn't have anything decent for motherboards. This CHV-Z and CPU fit into the return price for the RIVE with cash to spare.


----------



## tsm106

What bios were u running on it?


----------



## bpmcleod

Does anyone have any idea why I can't control my second card in the crossfire setup? I had it working before, but now it has stopped working again. Syncing them does nothing. The fan speed and clocks won't budge no matter what I throw at them with any software.


----------



## Scorpion49

Quote:


> Originally Posted by *tsm106*
> 
> What bios were u running on it?


I tried 4 different ones; no matter what I did, it wanted me to hook up a CPU fan and keyboard, apparently. I took a chance buying it open box because Fry's only had the one.


----------



## tsm106

Quote:


> Originally Posted by *Scorpion49*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> What bios were u running on it?
> 
> 
> 
> I tried 4 different ones, no matter what I did it wanted me to hook up a CPU fan and keyboard apparently. I took a chance buying it when it was open box because Fry's only had the one.
Click to expand...

Like a box of chocolates... lol, you never know what you're gonna get. It's OK man, that's the way the cookie crumbles sometimes. And obviously there's a reason why it was open box to begin with. I got mine open box too and I love it. I got a RIVF for my son's rig - same thing. Sometimes it's worth the chance.


----------



## Scorpion49

Apparently my 290 waterblocks are pushed back to the end of the month too. Anyone ever cancel from aquatuning? I should have just got the EK but they're so damn ugly.


----------



## bond32

Quote:


> Originally Posted by *Scorpion49*
> 
> Apparently my 290 waterblocks are pushed back to the end of the month too. Anyone ever cancel from aquatuning? I should have just got the EK but they're so damn ugly.


Don't forget Koolance. Mine should be in this week. Ordered straight from Koolance, but you can get them from Amazon from Hellfire pc for $129.99 + shipping.


----------



## tsm106

Quote:


> Originally Posted by *Scorpion49*
> 
> Apparently my 290 waterblocks are pushed back to the end of the month too. Anyone ever cancel from aquatuning? I should have just got the EK but they're so damn ugly.


I like them so much better than the palm trees with puny vrm cooling.


----------



## Sazz

So I've sold my 290X for what I bought it for, and I will be getting a 290 instead and see if I can flash it to a 290X. xD

Gonna receive the 290 by Friday at the earliest.


----------



## Prospector

Quote:


> Originally Posted by *DarkZR9*
> 
> Yeah they should not be getting that hot, I have the same cooler btw and I run 1200mhz core with 1.3vc. VRM1 should be the 3 small ones and even with my overclock I peak around 75c in BF4 multiplayer and Crysis 3 pulls it up the most to 93c.


Quote:


> Originally Posted by *Kelwing*
> 
> Which group would be considered VRM1? The three small ones or the ones in a line? My VRM1 temps get up over 100c and I black screen. Currently removing heatsinks to try and redo them. Using the Extreme III air cooler on my 290.


Guys, I also have the Arctic Extreme 3 on my 290 and the auto fan is not working properly. It always stays at 20%, and every time I have to manually set the fan speed while gaming. What did I do wrong? I just put the 4-pin of the fan into the 4-pin slot on the card - that's the 12 V one, right? What do you suggest?


----------



## Arizonian

Well some new news on non-reference cards as well as some topic discussion we've had here on different things.

*When will the 290 partner boards arrive?* - SemiAccurate
Nov 18, 2013 by Thomas Ryan
Quote:


> *With CES approaching quickly the AIB partners are rushing to have their own versions ready for the show at the latest.* On the other end of the spectrum we heard a few people mention "before Christmas" as the due date for non-reference design from AMD's partners. Either way we don't have too long to wait for a Christmas present from AMD's partners.


Quote:


> While AMD is going to put more of a focus on limiting the noise output of future cooling solutions it's unlikely that we'll ever see a brushed aluminum TITAN-style cooling solution from the red team. AMD wants to meet the expectations of the market, but it's not interested in putting itself in the poor house to do so.


and finally.....
Quote:


> In our checks with partners we learned that the aftermarket GPU air cooling business is a very small and unprofitable slice of the market. Multiple vendors told us that they'd done significant research on entering that market, but that their focus groups consistently told them that they would not under any circumstance willingly void their graphics card's warranty. Even if that means they could get higher overclocks or significantly better cooling performance.


A lot more to the article. Interesting read.

Guys, I'm not sure I can hold out any longer. If Nvidia drops 780 Ti non-reference cards before AMD does, then I might be pulling the trigger. I won't abandon the club, but I'm done waiting. I sold my 290X while I had the chance to recoup the entire amount plus keep BF4 - I couldn't pass it up. I've already put the 690 in the second rig and it's a done deal. No gaming for two weeks now is killing me, as I refuse to lower settings / eye candy with my single 680. Ah well... I digress.


----------



## Taint3dBulge

GPU Tweak has a BIOS update available. I just updated mine to 015.039.000.006.003515. Not sure if this has been posted before, but if anyone wants to try the update, I'd say try it out.. I should have run a few benches beforehand.


----------



## hatlesschimp

Well the second Benq W1080ST Projector and Powercolor 290x finally arrived. It was a mission! But that's a story for another day.



So I'm guessing I will do a new windows install because I'm coming from Nvidia.


----------



## Arizonian

Quote:


> Originally Posted by *hatlesschimp*
> 
> Well the second Benq W1080ST Projector and Powercolor 290x finally arrived. It was a mission! But that's a story for another day.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> So I'm guessing I will do a new windows install because I'm coming from Nvidia.


Nice projectors ... congrats and added


----------



## Forceman

Quote:


> Originally Posted by *Taint3dBulge*
> 
> GPU tweak has a bios update available. I just updated myn to 015.039.000.006.003515 not sure if this has been posted before, but if anyone wants to try the update id say try it out.. shoulda ran afew benches before hand.


Any release notes with it?


----------



## Scorpion49

So here we have a stock 3930K (no turbo - the ECS board is a total POS) vs the 8350 at 4600, with the cards at 1050/1300. Gained a lot of frames on test 1 with the SB-E.

http://www.3dmark.com/compare/3dm11/7516811/3dm11/7516387


----------



## golemite

Please update my entry to water!



Running well at around 52°C max with an OC to 1200MHz


----------



## Sgt Bilko

Quote:


> Originally Posted by *Scorpion49*
> 
> Well it appears that Sgt Bilko was correct. The 8350 is holding back the 290's a bunch. I compared them to the same settings on a 3930k, and also compared a GTX 780 SLI 8350 results I found on the website to my 780 SLI on 3930k, and both are around 3000 points lower on the GPU score. That sucks.


Wait....I was right about something?

That almost never happens..........

Anyways, yeah..... if AMD doesn't lift their CPU line then I'll be forced to change to the blue team - a shame, considering I've owned AMD CPUs since the Athlon 1800+.

Then again, if Mantle pays off then I won't have to change, but I'll need to see that before I believe it.


----------



## Forceman

Quote:


> Originally Posted by *Onglar*
> 
> Mantle makes the 8350 perform on par with a stock 4770k in BF4.


Got any stats to back that up?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Onglar*
> 
> Mantle makes the 8350 perform on par with a stock 4770k in BF4.


That's what I've heard, but it's something I'll need to see for myself.

Don't get me wrong, it would be awesome


----------



## Roaches

Has there been any public demonstration of Mantle yet? The longer we wait, the more smoke and mirrors it becomes....

I'm skeptical, though I'd love to see real results....


----------



## Arizonian

Quote:


> Originally Posted by *golemite*
> 
> Please update my entry to water
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Running well at around 52'max with OC to 1200mhz


I realized from your original post that you didn't submit an OCN name in the pic or GPU-Z, so it wasn't added. I just slotted you in where you belonged and will take your word for it now.









http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/3030#post_21132767

Congrats - added -







and updated -


----------



## hotrod717

Quote:


> Originally Posted by *tsm106*
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/6120_40#post_21206032
> 
> AB breaks my setup in a lot of ways.
> Haha, I probably know it better than you but I still feel dirty using it everytime.


You're probably right. I would venture to guess you spent as much, if not more, time using it in the last 3 weeks than I have the past year. Don't worry, that dirty feeling will go away. At least that's what my girlfriend says!
Funny though, your description of the synergy mode and overclocking Hawaii is exactly how I approached Tahiti. It gives you a great idea of what volts it will take to push what core clock. A good starting point, if you will. The GPU Tweak you see is only half of what it was for the Matrix: VDDCI, VRM clock, LLC, etc. offer even more control. I know you're a reference purist, but man, you'd have a grand old time with all that parameter control on a custom PCB. Congrats on your top score!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Roaches*
> 
> Has there been any public demonstration of Mantle yet? The longer we wait, the more smoke and mirrors it becomes....
> 
> I'm skeptical, though I'd love to see real results....


IIRC there was a demonstration done at the AMD Developer Summit recently, with the Nitrous engine I believe.....


----------



## golemite

Quote:


> Originally Posted by *Arizonian*
> 
> I realized from your original post you didn't submit a OCN name in pic or GPU-Z so it wasn't added. I just slotted you in where you belonged and will take your word for it now.
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added -
> 
> 
> 
> 
> 
> 
> 
> and updated -


Oops, here's a quick run


----------



## Spectre-

tfw new card


----------



## Sgt Bilko

Quote:


> Originally Posted by *Forceman*
> 
> Got any stats to back that up?


This is the best I can find on it so far. Will try and dig up some solid numbers.......if they exist atm, that is.

http://www.overclock.net/t/1439628/amd-amds-revolutionary-mantle-graphics-api-adopted-by-industry-leading-game-developers-cloud-imperium-eidos-montreal-and-oxide/500#post_21191031

EDIT:


"FX-8350 at 2ghz does not bottleneck a R9-290"

Wha...........How?


----------



## Taint3dBulge

Quote:


> Originally Posted by *Forceman*
> 
> Any release notes with it?


None that I can find. I just clicked check for updates, it asked if I wanted to update now, I hit OK, it flashed, hung for about 3 min, then was done.. Seems to be OK, but no idea if it's changed anything.


----------



## gamervivek

Mantle makes the game GPU limited, so the par results are not shocking.


https://www.reddit.com/r/1qlkk1/mantle_starswarm_demo_100k_draw_calls_on_fx8350/


----------



## Bloitz

Arizonian, you haven't added me yet!









http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/5720#post_21195643

Gigabyte card,
Hynix memory, and no warranty sticker.
I haven't tried unlocking or overclocking yet.
I did get a black screen when I tested it in my brother's rig, when it should have gone to the desktop. But it worked fine in safe mode and during booting, so it was a driver issue. It works fine in my rig now: I uninstalled the Nvidia drivers, did a driver sweep with Driver Fusion in safe mode (which is kind of a pain to get into in W8.1) and let CCleaner clean the registry (I did manually check the keys it modified). Chugging along fine now.

I haven't installed CCC though. Should I? It seems some people have had issues with it.


----------



## kcuestag

The store received my 290X just a few minutes ago and they'll check it soon. I hope they ship me a new replacement this evening or tomorrow; I'm very impatient with no GPU to play with right now.









I also asked them if they could try to send me one with Hynix; we'll see what they say/do.


----------



## Joeking78

Just joined the club (but probably leaving the happily married club).


----------



## icezar

Quote:


> Originally Posted by *RYCRAI*
> 
> I got the Sapphire 290 and it produces a God awful amount of coil whine... Seriously considering RMA'ing this. Anyone else having this problem?


Yep, I have it. Going to exchange it tomorrow.


----------



## Maxxa

Quote:


> Originally Posted by *Joeking78*
> 
> Just joined the club (but probably leaving the happily married club).
> 
> 
> Spoiler: Warning: Spoiler!


If you blow funds and she comes back, she's yours; but if you splurge on 2x 290Xs and she bolts, it was never meant to be.


----------



## RYCRAI

Quote:


> Originally Posted by *icezar*
> 
> Yep, I have it. Going to exchange it tomorrow.


I got my exchange and it still has a very small amount of coil whine, but it gets drowned out after playing for a bit and when the fan kicks up a little. Let me know how yours comes out. I may still ask Newegg to refund my money or exchange it again, but this is a little ridiculous.


----------



## kcuestag

Good news: the store received the card only 2 hours ago, and just now they told me they've already shipped a replacement, which I should get tomorrow! That was fast!









Hoping this one is Hynix; if it's Elpida, then I hope it's got no black-screen issues.


----------



## Newbie2009

Quote:


> Originally Posted by *kcuestag*
> 
> Good news, the store recieved the card only 2 hours ago, and just now they told me they've already shipped a replacement which I should get tomorrow! That was fast!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hoping this one is Hynix, if it's Elpida, then I hope it's got no blackscreen issues.


Did you still have blackscreen issues after a clean install?


----------



## Ukkooh

Just ordered a Club 3D 290X BF4 edition. It will be here probably by tomorrow, so I'll post pictures then.


----------



## hatlesschimp

I've never had a Club 3D card, but I like how they go about it.


----------



## Ukkooh

The sole reason for getting the Club 3D was watercooling support.


----------



## DarkZR9

What's up with the GPU core fluctuation in BF4? For example, I have my card clocked to 1200MHz core / 1400MHz mem, and it's not temp related because I'm not exceeding 57°C in BF4 and the VRMs are nice and cool too.

GPU-Z shows the core all over the place, not locked at 1200MHz; it fluctuates between 1125MHz and 1200MHz and it's driving me nuts. However, the Valley demo keeps the core pegged at 1200MHz.
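One likely explanation is PowerTune holding the card inside its board power limit: BF4 draws more power at a given clock than Valley, so the governor keeps pulling the core down and letting it recover. A toy sketch of that feedback behavior - every constant below is invented, not real 290X telemetry:

```python
# Toy PowerTune-style governor (invented numbers): each tick, drop the core
# clock if estimated board power exceeds the limit, raise it back otherwise.
def governor_trace(load_factor, power_limit_w=290.0, clock=1200, steps=8):
    """load_factor scales power draw per MHz; returns the clock after each tick."""
    trace = []
    for _ in range(steps):
        est_power = clock * load_factor          # crude linear power model
        if est_power > power_limit_w and clock > 1125:
            clock -= 25                          # throttle one bin down
        elif est_power <= power_limit_w and clock < 1200:
            clock += 25                          # recover toward the target
        trace.append(clock)
    return trace

print(governor_trace(0.25))  # heavy load (BF4-like): clock bounces below 1200
print(governor_trace(0.20))  # lighter load (Valley-like): stays pegged at 1200
```

The heavier workload makes the clock oscillate between bins exactly because the limiter keeps re-engaging, which matches clocks bouncing in one game while staying pegged in another.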


----------



## evensen007

Have to say, I'm not too impressed with my 290s so far. They are pretty awful overclockers, which is to be expected since they are lower-binned 290Xs. They really aren't beating the heck out of my old trusty 7970 @ 1300. Drivers are early and I'm not really pushing a crossfire-type resolution (Battlefield 4, 2560x1600/ULTRA). I wanted them to be able to max AA and post-processing effects. They do allow me to do that, but with a LOT of extra power draw and heat (even watercooled) compared to my 7970. In retrospect, I would have added another 7970 instead of going 2x 290s.

The 2x 290s in the water loop have added about 15°C to my CPU max temps and about 10°C to the GPU temps.


----------



## sugarhell

Your 7970 at 1300 uses about the same power as a 290X in Uber mode.


----------



## Toss3

So today I solved my damned red-screen issues in BF4 - I simply ran it as administrator, and voilà, no more issues. That simple, after flashing countless BIOSes and thinking my card was damaged.







Now I can start clocking again!


----------



## evensen007

Quote:


> Originally Posted by *sugarhell*
> 
> Your 7970 at 1300 use the same power consumption as a 290x uber


You don't know that for sure, and also it is irrelevant to what I posted about my 290's.


----------



## sugarhell

Quote:


> Originally Posted by *evensen007*
> 
> You don't know that for sure, and also it is irrelevant to what I posted about my 290's.


A 7970 with a +20% power limit and 1.3 V will be around 300 W. So yeah, I know it. I've had over 20 7970s in the last 2 years.


----------



## DarkZR9

I can't say I'm disappointed in the card's performance, because compared to my old 7970 at 1200MHz core, my new card kills it. Even at stock clocks it was faster than my overclocked 7970.


----------



## evensen007

Quote:


> Originally Posted by *sugarhell*
> 
> A 7970 with 20% powerlimit and 1.3 volt will be around 300 watt. So yeah i know it.I had over 20 7970s the last 2 years


That's fine, and I still stand by my thoughts/impressions of the 290's.


----------



## rdr09

Quote:


> Originally Posted by *evensen007*
> 
> That's fine, and I still stand by my thoughts/impressions of the 290's.


But you are comparing the heat produced by one GPU to that of two GPUs. I am thinking of adding another 290, but I think my i7 SB won't be able to push them to their full potential. Have you monitored your CPU usage?


----------



## evensen007

Quote:


> Originally Posted by *rdr09*
> 
> but you are comparing heat produced by one gpu from two gpus. i am thinkiing adding another 290 but i think my i7 SB won't be able to push them to their full potential. you monitored your cpu usage?


I understand. What I am saying is that I could have probably added another 7970, left them at stock volts, and gotten way more bang for my buck as well as much less heat and power draw. The new crossfire setup may add extra overhead to the PCIe bus, but I'm not sure. My 2600K cores are definitely getting more of a workout, as they are all at 60-85% in BF4. To be fair, they may have been just as high if I had added another 7970.

My 290s can't even go to a 1000 core without causing black screens in BF4, and that's with extra volts up to 1.3+.


----------



## Stay Puft

Quote:


> Originally Posted by *Sgt Bilko*
> 
> This is the best i can find on it so far. Will try and dig up some solid number.......if they exist atm that is
> 
> http://www.overclock.net/t/1439628/amd-amds-revolutionary-mantle-graphics-api-adopted-by-industry-leading-game-developers-cloud-imperium-eidos-montreal-and-oxide/500#post_21191031
> 
> EDIT:
> 
> 
> "FX-8350 at 2ghz does not bottleneck a R9-290"
> 
> Wha...........How?


Comparable to a 3.5GHz 4770K with 1600 memory? What about those of us at 4.8GHz with DDR3-3000 memory?


----------



## DarkZR9

Quote:


> Originally Posted by *evensen007*
> 
> My 290's can't even go to 1000core without causing black screens in BF4, and that's with extra volts up to 1.3+.


You are certainly in the minority on that one; most can do 1150MHz core without much issue. Mine does 1200MHz core and it's flashed to an R9 290X now.


----------



## DarkZR9

Quote:


> Originally Posted by *DarkZR9*
> 
> What's up with the GPU core fluctuation in BF4? For example, I have my card clocked to 1200MHz core / 1400MHz mem, and it's not temp related because I'm not exceeding 57°C in BF4 and the VRMs are nice and cool too.
> 
> GPU-Z shows the core all over the place, not locked at 1200MHz; it fluctuates between 1125MHz and 1200MHz and it's driving me nuts. However, the Valley demo keeps the core pegged at 1200MHz.


Can anyone chime in on this? I know I'm not the only one who has seen this odd behavior in BF4.


----------



## sugarhell

Quote:


> Originally Posted by *Stay Puft*
> 
> Comparable to a 3.5ghz 4770K with 1600 memory? What about us at 4.8ghz with ddr3 3000 memory?


With Mantle a CPU OC will not make much of a difference. It's with DX11's limited draw calls that you need to brute-force it with a 5GHz OC.
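To make the draw-call point concrete, here's a back-of-the-envelope sketch (my own toy numbers, not AMD's or DICE's): if the CPU pays a fixed cost per draw call, cutting that per-call cost - as a thin API like Mantle aims to - moves the CPU-bound frame rate far more than a realistic clock bump ever could.

```python
# Back-of-the-envelope model of a CPU-bound frame rate. All numbers are
# invented for illustration; the point is the shape of the math, not
# real BF4 figures.
def cpu_bound_fps(draw_calls, us_per_call, cpu_clock_ghz, base_clock_ghz=4.0):
    """FPS when the CPU's draw-call submission time limits the frame."""
    # Assume per-call cost shrinks in proportion to CPU clock speed.
    frame_time_us = draw_calls * us_per_call * (base_clock_ghz / cpu_clock_ghz)
    return 1_000_000 / frame_time_us

# 10,000 draw calls per frame:
thick_api = cpu_bound_fps(10_000, us_per_call=2.0, cpu_clock_ghz=4.0)     # 50 fps
thick_api_oc = cpu_bound_fps(10_000, us_per_call=2.0, cpu_clock_ghz=5.0)  # 62.5 fps
thin_api = cpu_bound_fps(10_000, us_per_call=0.25, cpu_clock_ghz=4.0)     # 400 fps
```

With these invented numbers, an 8x cheaper submission path beats a 4.0 → 5.0 GHz overclock by a wide margin, which is the gist of the "5GHz brute force" comparison.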


----------



## evensen007

Quote:


> Originally Posted by *DarkZR9*
> 
> You are certainly in the minority on that one, most can do 1150mhz core without much issue. Mine does 1200mhz core and its flashed to a R9 290X now.


And honestly, I'm ok with that. I understand the chip lottery deal (I won it with my previous 7970 @ 1300/1700). I still just can't shake the overwhelming feeling of being severely underwhelmed.


----------



## rdr09

Quote:


> Originally Posted by *evensen007*
> 
> I understand. What I am saying is that I could have probably added another 7970, left them at stock volts, and gotten way more bang for my buck as well as much less heat and power draw. The new xfire spec may add extra overhead to the PCIe bus, but I'm not sure. My 2600k cores are definitely getting more of a workout, as they are all at 60-85% in BF4. To be fair, they may have been just as high if I had added another 7970.
> 
> My 290's can't even go to 1000core without causing black screens in BF4, and that's with extra volts up to 1.3+.


voltage control is still broken with these cards - imo. my single 290 can run benches at 1155/1500 at stock volts and bios, but any higher (even with added volts) I start to see artifacts. i play stock with all my gpus. coming from a single 7970, you prolly do not even need to oc 2 290s. on the other hand, we may need to take our cpus to 5GHz and beyond.


----------



## DarkZR9

Quote:


> Originally Posted by *evensen007*
> 
> And honestly, I'm ok with that. I understand the chip lottery deal ( I won it with my previous 7970 @ 1300/1700). I still just can't shake the overwhelming feeling of being severely underwhelmed.


Well, if you are coming from crossfired 7970's then I can understand, because as of right now there is not much of a diff in a lot of games going from crossfire 7970's to crossfire R9 290's. It's likely the fact that there really isn't a CPU out right now that can keep up very well with two R9 290's, whereas with the 7970's you were able to get the most out of both cards.

Now when comparing a single 7970 to a single R9 290 the improvement is massive.


----------



## ZealotKi11er

Quote:


> Originally Posted by *evensen007*
> 
> And honestly, I'm ok with that. I understand the chip lottery deal ( I won it with my previous 7970 @ 1300/1700). I still just can't shake the overwhelming feeling of being severely underwhelmed.


If I had HD 7970s that could do 1300MHz I would not have upgraded. You probably need 1200MHz+ 290s to feel the difference.

My HD 7970s do about 1200/1650 under water.

Also since your temps increased how much RAD do you have to cool 2 x 290s and CPU?


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If i had HD 7970s that could do 1300MHz i would have not upgraded. You probably need 1200MHz + 290s to feel the difference.
> 
> My HD 7970s do about 1200/1650 under water.
> 
> Also since your temps increased how much RAD do you have to cool 2 x 290s and CPU?


zeal, i think evensen is talking about just one (1) 7970. even if that 7970 can clock to 1300, it will still be a massive upgrade going to 2 290s.


----------



## evensen007

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If i had HD 7970s that could do 1300MHz i would have not upgraded. You probably need 1200MHz + 290s to feel the difference.
> 
> My HD 7970s do about 1200/1650 under water.
> 
> Also since your temps increased how much RAD do you have to cool 2 x 290s and CPU?


I did have a single 7970 before. I have a 360 RAD and a 120 RAD in my loop.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Scorpion49*
> 
> So here we have stock 3930k (no turbo, ECS board is total POS) vs the 8350 at 4600 with the cards at 1050/1300. Gained a lot of frames on test 1 in the SB-E.
> 
> http://www.3dmark.com/compare/3dm11/7516811/3dm11/7516387


Isn't the first test more CPU-dependent than GPU-dependent? It evens out in the heavy GPU sections. I wouldn't stress it!









I have a 2600k that overclocks to 5.2 on the daily and will out-bench my 8350 with the same GPUs by far, but in games I can honestly say I've had a smoother experience on my 8350. Don't get too caught up in these synthetic benches if your sole purpose is to game.


----------



## rdr09

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Isn't the first test more cpu dependent rather than gpu? It evens out in the heavy gpu sections. I wouldn't stress it!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have a 2600k that overclocks to 5.2 on the daily and will out bench my 8350 with the same gpu's by far, but in games I can honestly say I've had a smoother experience on my 8350. Don't get too caught up in these synthetic benches if your sole purpose is to game


^this. so long as the AMD cpu is not bottlenecking the GPU or GPUs in games it will be just as good as a powerful Intel chip.


----------



## Arizonian

Quote:


> Originally Posted by *Spectre-*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> tfw new card


Congrats - added








Quote:


> Originally Posted by *Joeking78*
> 
> Just joined the club (but probably leaving the happily married club).
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Newbie2009

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If i had HD 7970s that could do 1300MHz i would have not upgraded. You probably need 1200MHz + 290s to feel the difference.
> 
> My HD 7970s do about 1200/1650 under water.
> 
> Also since your temps increased how much RAD do you have to cool 2 x 290s and CPU?


Honestly diminishing returns once you pass 1200 on the core with the 7970. 1300 core sounds better than it is.

7970s such beastly cards, oh the fond memories.


----------



## MrWhiteRX7

My Black Edition 7970's did 1200/1675, and I was able to get them to bench higher than that, but in games I really stopped gaining fps around 1180/1575... except in BF4, which for some reason did well with 1675 on the memory.

I'm "downgrading" to a single 290 at the moment. If I like it as much as I think I will, then it will be time to get a few more lol


----------



## DraXxus1549

Quote:


> Originally Posted by *DarkZR9*
> 
> What's up with the GPU core fluctuation in BF4? For example, I have my card clocked to 1200mhz core / 1400mhz mem, and it's not temp related because I am not exceeding 57c in BF4 and the VRMs are nice and cool too.
> 
> GPUZ shows the core all over the place, not locked at 1200mhz; it fluctuates from 1125mhz to 1200mhz and it's driving me nuts. However, the Valley demo keeps the core pegged at 1200mhz.


Anyone have a fix for this?

I am seeing the same downclocking issues and my temps are below 80C.


----------



## jerrolds

Quote:


> Originally Posted by *DraXxus1549*
> 
> Anyone have a fix for this?
> 
> I am seeing the same downclocking issues and my temps are below 80C.


That's weird - running a single 290X @ 1180/1500 I get no fluctuations in BF4. I believe my voltage is set to 1325mV in GPU Tweak (~1.22v actual) and +50% Power Limit.

Core temps are below 65C tho with the Gelid aftermarket cooler; it's the VRM1 temps I need to worry about - hitting 80-85C pretty easily.


----------



## DraXxus1549

While it seems unlikely, could it be my CPU holding me back?


----------



## evensen007

Quote:


> Originally Posted by *DraXxus1549*
> 
> While it seems unlikely could it be my CPU is holding me back?


I believe it is a xfire bug with the 290's right now. Mine does the same thing, but the frames remain relatively smooth. Should be ironed out in new drivers as that happened to me in the past with my 2 5870's and a bad driver release.


----------



## kfxsti

Got my 2 Sapphire 290s yesterday. Scored a 14521 in Firestrike... I'm going to reload my rig tonight and see if that helps, as my last run on my 7990 was a 15221. Any thoughts, guys? Crap... come to think of it, I didn't uninstall the driver I had been using for the 7990, lol, and didn't install the latest for the 290 lol


----------



## Forceman

Quote:


> Originally Posted by *DarkZR9*
> 
> Can anyone chime in on this? I know I'm not the only one who has seen this odd behavior in BF4.


If you are using Afterburner, try unchecking voltage monitoring. It causes tons of small core speed dips (which may be just a monitoring issue) - disabling monitoring, while leaving voltage control enabled, cleared it right up for me.
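For anyone curious how a monitoring option could even cause (or fake) dips, here's a tiny simulation of the hypothesis - entirely illustrative numbers, not measurements: if each voltage-sensor read occasionally stalls the driver or catches the core mid-transition, a fraction of polled samples will log below the real clock, and the graph looks jumpy even when the card is steady.

```python
import random

# Hypothetical model: most polls read the true core clock, but a small
# fraction catch a brief dip caused (or merely observed) by the extra
# sensor read itself. All numbers are invented for illustration.
def poll_core_clock(target_mhz, dip_chance, dip_depth_mhz, rng):
    """Return one monitoring sample of the core clock."""
    if rng.random() < dip_chance:
        return target_mhz - dip_depth_mhz  # sample lands in a dip
    return target_mhz                      # normal steady reading

rng = random.Random(42)                    # fixed seed for repeatability
samples = [poll_core_clock(1000, 0.1, 125, rng) for _ in range(1000)]
```

The log then shows scattered drops to 875MHz among mostly 1000MHz readings - noisy-looking, even though frame delivery may be unaffected.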


----------



## DarkZR9

Quote:


> Originally Posted by *Forceman*
> 
> If you are using Afterburner, try unchecking voltage monitoring. It causes tons of small core speed dips (which may be just a monitoring issue) - disabling monitoring, while leaving voltage control enabled, cleared it right up for me.


Hi, thanks - my voltage monitoring was already unticked.


----------



## Warsam71

Hello everyone!

I apologize for my late response...

I just wanted to let you know we are aware of the black screen issues and are currently working on it. I've exchanged a few PMs with @Arizonian in the past few days about this issue. He has provided me a thorough summary of the issue, possible sources of the problem, and links to existing threads, which I have already forwarded to our Customer Support and Driver Team. Very helpful









Thank You to everyone who has posted, and more importantly Thank You for your patience


----------



## rv8000

Quote:


> Originally Posted by *rdr09*
> 
> voltage control is still broken with these cards - imo. my single 290 can run benches at 1155/1500 at stock volts and bios, but any higher (even with added volts) I start to see artifacts. i play stock with all my gpus. coming from a single 7970, you prolly do not even need to oc 2 290s. on the other hand, we may need to take our cpus to 5GHz and beyond.


What block do you have? I just got my Gelid Icy Vision and could originally only get 1170~1180 with the ASUS 290X bios without black screening, but I can run 1215/1550 stable now. I can go higher, but I noticed that when VRM temps pass a certain point I start to artifact, and without a heat gun there's no true way to tell my VRM temps. For me at least, I still think it's a heat issue holding me back; interesting that you're under water and having issues.


----------



## sugarhell

Nice. At least we have confirmation. I would prefer it a bit earlier but either way


----------



## jerrolds

Quote:


> Originally Posted by *Warsam71*
> 
> Hello everyone!
> 
> I apologize for my late response...
> 
> I just wanted to let you know we are aware of the black screen issues and currently working on it. I've exchanged a few pm with @Arizonian in the past few days about this issue. He has provided me a thorough summary of the issue, possible sources of the problem, as well as links to existing threads; which I have already forwarded to our Customer Support and Driver Team. Very helpful
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thank You to everyone who has posted, and more importantly Thank You for your patience


Thanks for the update!

Offtopic, but I haven't got any replies from the reps on the #AMDGaming FB page and Twitter accounts - why did Day 1 290X customers have to pay extra for BF4 while two-week laggers buying an R7 270 (a card that costs 1/3rd the price) get it for free? Can we (high end enthusiasts) expect some sort of reward for being loyal?

It's just frustrating that we got totally crapped on imo.


----------



## kcuestag

Quote:


> Originally Posted by *Warsam71*
> 
> Hello everyone!
> 
> I apologize for my late response...
> 
> I just wanted to let you know we are aware of the black screen issues and currently working on it. I've exchanged a few pm with @Arizonian in the past few days about this issue. He has provided me a thorough summary of the issue, possible sources of the problem, as well as links to existing threads; which I have already forwarded to our Customer Support and Driver Team. Very helpful
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thank You to everyone who has posted, and more importantly Thank You for your patience


Thank you for your reply; it took over a week to get AMD to notice.









I already sent mine for RMA and am getting a brand new replacement tomorrow; hoping you guys sort it out if it's a software issue.


----------



## Warsam71

Quote:


> Originally Posted by *jerrolds*
> 
> Thanks for the update!
> 
> Offtopic, but I haven't got any replies from the reps on the #AMDGaming FB page and Twitter accounts - why did Day 1 290X customers have to pay extra for BF4 while two-week laggers buying an R7 270 (a card that costs 1/3rd the price) get it for free? Can we (high end enthusiasts) expect some sort of reward for being loyal?
> 
> It's just frustrating that we got totally crapped on imo.


Hello jerrolds,

I am sorry about the frustration...

I'm not sure if you've noticed it, I recently posted information on the R9 Series + BF4 Promotion, it's a Q&A, you can find it here: http://www.overclock.net/t/1443510/amd-r9-series-battlefield-4-promotion-official-q-a

Please note we will continue to update this page throughout the week


----------



## Emmett

Well, my one ek blocked card I have plumbed in tops out at ---

1120 / 1430 on stock volts

played a few hours bf4 at these clocks. no issues.

saw 1.227 max volts at one point during 3dmark 11 run


----------



## DraXxus1549

Quote:


> Originally Posted by *evensen007*
> 
> I believe it is a xfire bug with the 290's right now. Mine does the same thing, but the frames remain relatively smooth. Should be ironed out in new drivers as that happened to me in the past with my 2 5870's and a bad driver release.


I'm only running one 290, so it isn't crossfire related, at least for me.


----------



## MrWhiteRX7

Are the Sapphire's voltage locked?


----------



## jerrolds

Warsam - I did; while that's nice (and better than nothing), it seems like a contest or promo more than anything: "As a gesture of goodwill to our most loyal fans, we'll be giving away in a *random draw* 1000 Battlefield 4 game codes in the coming weeks."

So not every early adopter will be guaranteed a BF4 code, and codes could even go to anyone that likes the AMDGaming FB page or is a Twitter follower.


----------



## Maxxa

Quote:


> Originally Posted by *jerrolds*
> 
> Warsam - i did, while thats nice (and better than nothing) - it seems like a contest or promo more than anything "As a gesture of goodwill to our most loyal fans, we'll be giving away in a *random draw* 1000 Battlefield 4 game codes in the coming weeks. "
> 
> So not every early adopter will be guaranteed a BF4 code, and could even be given to anyone that likes AMDGaming fb page or is Twitter follower.


Yeah, and I, like most people on this site, will not go near FB or Twitter just to "like" a corporation with someone else's hardware... never mind our own.


----------



## Mr357

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Are the Sapphire's voltage locked?


Hard to say, but if so it's a BIOS lock and not hardware. Many of us are on an ASUS/third party BIOS that allows for voltage adjustments.


----------



## Joeking78

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Are the Sapphire's voltage locked?


I got two Sapphire cards today... ran Afterburner, made some minor adjustments to settings, and I had full control over core voltage & aux voltage (no bios flashing)... memory voltage is locked.

Without touching voltage I could get to 1100mhz from 1000mhz; with the voltage maxed I can hit 1175/1400 stable. I can drop the memory to 1250 and push the core to 1200, but it's not very stable.

EDIT: GPUz reports 1.055 VDDC when I have core voltage maxed (0.961 stock) and 1.094 VDDCI with aux maxed (1.000 stock)... that's idle btw, need to check under load.


----------



## sugarhell

Memory voltage is not locked - the 290X doesn't have memory voltage control at all


----------



## the9quad

Quote:


> Originally Posted by *Warsam71*
> 
> Hello jerrolds,
> 
> I am sorry about the frustration...
> 
> I'm not sure if you've noticed it, I recently posted information on the R9 Series + BF4 Promotion, it's a Q&A, you can find it here: http://www.overclock.net/t/1443510/amd-r9-series-battlefield-4-promotion-official-q-a
> 
> Please note we will continue to update this page through out the week


Yeah! Wish you guys luck figuring the issues out.


----------



## Joeking78

Quote:


> Originally Posted by *sugarhell*
> 
> Memory voltage is not locked. 290x doesnt have memory voltage control at all


Well, I meant it's greyed out and can't be used.


----------



## kcuestag

Sapphire R9 290X cards are not voltage locked, but if you want to go above +100mV, you'll have to use an ASUS bios and GPU Tweak, because MSI Afterburner has set a max of +100mV to be safe on their part with MSI cards and RMAs.

However, Unwinder (dev of MSI Afterburner) posted a trick where you can add a flag to AB's shortcut to allow more than +100mV.


----------



## rdr09

Quote:


> Originally Posted by *rv8000*
> 
> What block do you have? I just got my Gelid Icy Vision and could originally only get 1170~1180 with the ASUS 290X bios without black screening, but I can run 1215/1550 stable now. I can go higher, but I noticed that when VRM temps pass a certain point I start to artifact, and without a heat gun there's no true way to tell my VRM temps. For me at least, I still think it's a heat issue holding me back; interesting that you're under water and having issues.


you are using the Asus bios, though. I am using the stock bios, so that could be the difference.

anyway, i have an ek nickel-plated block paired with just one 120 rad and a fan. it seems to do a good job keeping my temps under control but, as you know, the stock bios does not show the vrm temps.


----------



## Joeking78

Quote:


> Originally Posted by *kcuestag*
> 
> Sapphire R9 290X cards are not voltage locked, but if you want to go above +100mV, you'll have to use an ASUS bios and use GPU Tweak, because MSI Afterburner has set a max of +100mV to be safe on their part with MSI cards and RMA's.
> 
> However, Unwinder (Dev of MSI Afterburner) posted a trick where you could add a flag to AB's shortcut to add more than 100mV.


Oooh, link pls









In 3DMark I hit 80c with +100mV at 1175/1400... I think I've got a tiny bit of headroom for some more volts.


----------



## kcuestag

Quote:


> Originally Posted by *Joeking78*
> 
> Oooh, link pls
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In 3DMark I hit 80c, with 100mV 1175/1400...I think I got a tiny bit of head room for some more volts.


Here you go: http://forums.guru3d.com/showpost.php?p=4701410&postcount=40

I haven't really tried it as I can't be bothered, mine hit 1200/1400 at +100mV so I am pretty happy with that.


----------



## rv8000

Quote:


> Originally Posted by *rdr09*
> 
> you are using the Asus bios, though. I am using the stock bios, so that could be the difference.
> 
> anyway, i have an ek nickel-plated block paired with just one 120 rad and a fan. it seems to do a good job keeping my temps under control but, as you know, the stock bios does not show the vrm temps.


Have you tried the ASUS bios yet? The stock bios does report vrm temps in gpu-z, but who knows how accurate they are; it's at least a ballpark estimate though.


----------



## DarknightOCR

I can run 3DMark, Valley, and even BF4 smoothly.

+100mV shows as 1.32V in ASUS GPU Tweak, dropping to around 1.3V under load. It gives more than 1200MHz, but that was just to test whether everything was OK.

http://www.3dmark.com/fs/1173041



Sapphire R9 290 @ Asus bios, stock cooler, fan 80%


----------



## Slomo4shO

Quote:


> Originally Posted by *Warsam71*
> 
> Thank You for your patience


Many have been patiently waiting for non-reference cards to hit the market, and I, personally, would consider it a failure if AMD and its AIB partners are unable to deliver such products by the end of November. The launch of the 290X and 290 was bad enough; expecting someone to forgo potential Black Friday and Cyber Monday deals on the GTX 780 and wait for aftermarket cards near Christmas is rather naive. I guess I'll know by the end of the month if AMD truly wants my business.


----------



## Joeking78

Quote:


> Originally Posted by *kcuestag*
> 
> Here you go: http://forums.guru3d.com/showpost.php?p=4701410&postcount=40
> 
> I haven't really tried it as I can't be bothered, mine hit 1200/1400 at +100mV so I am pretty happy with that.


I think I'll wait for the proper AB update, I don't want to mess around in cfg files and bork something and as you said, our clocks are pretty close and they are perfectly fine for everyday use.


----------



## kcuestag

Quote:


> Originally Posted by *Joeking78*
> 
> I think I'll wait for the proper AB update, I don't want to mess around in cfg files and bork something and as you said, our clocks are pretty close and they are perfectly fine for everyday use.


That's the final AB update for 290/290X voltage control; MSI specified no more than +100mV, so unless you do that trick or flash to the ASUS bios and use GPU Tweak, there's nothing else you can do - AB will stay like that.


----------



## rdr09

Quote:


> Originally Posted by *rv8000*
> 
> Have you tried the ASUS bios yet? Stock bios does report vrm temps in gpu-z but who knows how accurate they are, it's at least a ballpark estimate though.


mine doesn't . . .



i'll think about using the Asus bios but for now, the gpu is enough even at stock. played C3 last night and it was sweet.

i am assuming my vrms are as cool as the core . . .



till then, not much oc'ing here.


----------



## jerrolds

AB17 won't work on an ASUS-bios 290X, right? It doesn't for me at least... even with voltage unlock checked.

rdr09 - i'm almost positive the newest GPUZ shows VRM temps with my original Sapphire BIOS


----------



## bjozac

Quote:


> Originally Posted by *kcuestag*
> 
> That's the final AB update for 290/290X voltage control; MSI specified no more than +100mV, so unless you do that trick or flash to the ASUS bios and use GPU Tweak, there's nothing else you can do - AB will stay like that.


Meaning that if we have a Sapphire 290 we need to flash the Asus bios?
I have not been reading the last 4 days of this thread, but is there any ASUS non-X bios which works as well as the asus 290x bios?


----------



## Joeking78

Quote:


> Originally Posted by *kcuestag*
> 
> That's the final AB update for 290/290X voltage control; MSI specified no more than +100mV, so unless you do that trick or flash to the ASUS bios and use GPU Tweak, there's nothing else you can do - AB will stay like that.


Ah, I misread a quote on Guru, thought there was another update coming but it was dated a week ago.


----------



## Epsi

I just ordered an XFX 290, hoping to receive it tomorrow.

It's been a while since I had an AMD card; can't wait to mess around with it.

Little bit of hope it will unlock. And if not... well, still a nice card i guess.


----------



## kcuestag

Quote:


> Originally Posted by *bjozac*
> 
> Meaning that if we have a Sapphire 290 we need to flash the Asus bios?
> I have not been reading the last 4 days of this thread, but is there any ASUS non-X bios which works as well as the asus 290x bios?


It means anyone that doesn't have an ASUS must flash to an ASUS bios and use ASUS GPU Tweak if they want to exceed the +100mV limit that is set in MSI Afterburner.

Honestly, I tried it and it's not worth it; if the card hits a ceiling at +100mV, stay there - it's more than enough.
Quote:


> Originally Posted by *Joeking78*
> 
> Ah, I misread a quote on Guru, thought there was another update coming but it was dated a week ago.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Mr357*
> 
> Hard to say, but if so it's a BIOS lock and not hardware. Many of us are on an ASUS/third party BIOS that allows for voltage adjustments.


That's what I was thinking. A friend was telling me they were voltage locked which I didn't think they were, but figured throwing in dat der Asus bios would get it done anyways


----------



## rv8000

Quote:


> Originally Posted by *rdr09*
> 
> mine doesn't . . .
> 
> 
> 
> i'll think about using Asus bios but for now, the gpu is enough even at stock. played C3 last night and it was sweet.


Interesting, you have an MSI board right?


----------



## VSG

Any idea if Trixx will be updated to support voltage control for Sapphire cards? Similarly, what about other vendors?


----------



## sugarhell

They released 13.11 WHQL

http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64

Last download


----------



## bjozac

Quote:


> Originally Posted by *kcuestag*
> 
> It means anyone that doesn't have an ASUS must flash to an ASUS Bios and use ASUS GPU Tweak if they want to exceed the +100mV limit that is set with MSI Afterburner.
> 
> Honestly, I tried it and it's not worth it, if the card hits a roof with +100mV, stay there, it's more than enough.


So they have fixed that?
Initially the voltage was locked on Sapphire cards.
I flashed mine (290 non-X) with an ASUS 290X bios and overclocked it to 1200 stable with max voltage. Now I've got my second card and am waiting for the water cooling to arrive. But would you suggest I use MSI Afterburner and just max it out on the stock bios?


----------



## rv8000

Quote:


> Originally Posted by *geggeg*
> 
> Any idea if Trixx will updated to support voltage control for Sapphire cards? Similarly, what about other vendors?


A beta for Trixx is apparently being tested, according to a mod on the Sapphire tech forums, and December was mentioned as a release (whether official or beta, I'm not sure).


----------



## brazilianloser

Quote:


> Originally Posted by *sugarhell*
> 
> They released 13.11 WHQL
> 
> http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64
> 
> Last download


I thought it had been out for a while now... maybe I am thinking of the .10 version though. Either way, both the .10 and .11 are showing the same release date, and there are no release notes that I can find.


----------



## Arizonian

Quote:


> Originally Posted by *kfxsti*
> 
> Got my 2 Sapphire 290s yesterday. Scored a 14521 in Firestrike... I'm going to reload my rig tonight and see if that helps, as my last run on my 7990 was a 15221. Any thoughts, guys? Crap... come to think of it, I didn't uninstall the driver I had been using for the 7990, lol, and didn't install the latest for the 290 lol
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added








Quote:


> Originally Posted by *sugarhell*
> 
> They released 13.11 WHQL
> 
> http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64
> 
> Last download


Thank you - added to OP as link for reference if anyone needs to be directed.


----------



## kcuestag

Quote:


> Originally Posted by *bjozac*
> 
> So they have fixed that?
> Initially the voltage was locked on sapphire cards.
> I flashed mine(290-non-x) with a ASUS 290-x bios and overclocked it to 1200 stable with max voltage. Now i got my second card and are waiting for the water cooling to arrive. But you would suggest me to use msi afterburner and just max it out on stock bios?


They were never locked; it's simple: there were no programs that supported the voltage regulators the new cards use, as they were NEW to the market - just like the GTX 780 Ti has no voltage control right now because Afterburner needs an update.


----------



## selk22

Quote:


> Originally Posted by *DarkZR9*
> 
> Can anyone chime in on this? I know I'm not the only one who has seen this odd behavior in BF4.


I had posted this same "issue" a while back. It's basically been determined that the 290x simply fluctuates clocks in BF4/games according to what you are looking at/doing, whereas in a benchmark like Valley or Heaven your GPU clock will stay maxed out because it needs to. I never have frame issues because of it.

Side note... I had a dream my 290x began to smoke and hiss then caught fire.. it was so real.. lol


----------



## Skylark71

Quote:


> Originally Posted by *DraXxus1549*
> 
> I'm only running one 290, so it isn't crossfire related, at least for me.



I get this GPU-usage fluctuation in almost every game, and I'm using a single Asus 290.
I've tried almost everything short of reinstalling Win7...







is that the only solution?


----------



## flopper

Quote:


> Originally Posted by *Skylark71*
> 
> 
> I get this GPU-usage fluctuation in almost every game, and I'm using a single Asus 290.
> I've tried almost everything short of reinstalling Win7...
> 
> 
> 
> 
> 
> 
> 
> is that the only solution?


not sure, but since PowerPlay now adjusts for heat against a defined speed, it's likely this behaviour will happen in games.
Unwinder said the polling rate also makes it into the graphs, and he said it's how AMD does it in the drivers currently.
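A minimal sketch of how a power/heat-driven governor produces exactly this wandering-clock behaviour (a deliberate simplification on my part, not AMD's actual PowerTune algorithm):

```python
# Simplified PowerTune-style step: throttle when over the power limit,
# recover toward the requested clock when there is headroom. Numbers
# (step size, idle floor) are invented for illustration.
def governor_step(clock_mhz, power_w, power_limit_w, max_clock_mhz, step=25):
    """Return the next core clock given current power draw vs. the limit."""
    if power_w > power_limit_w:
        return max(300, clock_mhz - step)        # step down toward an idle floor
    return min(max_clock_mhz, clock_mhz + step)  # climb back toward the target

# Heavy scene over the limit: 1200MHz steps down to 1175MHz...
clock = governor_step(1200, power_w=310, power_limit_w=290, max_clock_mhz=1200)
# ...lighter scene with headroom: it climbs straight back to 1200MHz.
clock = governor_step(clock, power_w=250, power_limit_w=290, max_clock_mhz=1200)
```

Since lighter scenes in a game need less power, the clock legitimately sits below the slider value some of the time, whereas a benchmark like Valley keeps the load (and thus the clock) pinned.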


----------



## the9quad

Good news from Caveman Jim: a driver update is coming this week to fix the black screen issue

http://www.rage3d.com/board/showthread.php?t=34006026


----------



## jjjsong

Hopefully the black screen fix will also fix my BSOD issues o.o


----------



## brazilianloser

My replacement finally came in... and it's Elpida again. Off to run tests to see if it is a black screen monster again.


----------



## brazilianloser

Well, it's surviving Valley in a single-monitor run so far without having to tweak fan curves or anything of the sort in Uber mode... so, a good sign so far. Off to try some 3-monitor runs now.


----------



## lordzed83

Hmm, updated to that new asus bios and honestly can't tell a difference at all


----------



## BababooeyHTJ

Quote:


> Originally Posted by *Warsam71*
> 
> Hello everyone!
> 
> I apologize for my late response...
> 
> I just wanted to let you know we are aware of the black screen issues and currently working on it. I've exchanged a few pm with @Arizonian in the past few days about this issue. He has provided me a thorough summary of the issue, possible sources of the problem, as well as links to existing threads; which I have already forwarded to our Customer Support and Driver Team. Very helpful
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thank You to everyone who has posted, and more importantly Thank You for your patience


Honestly, how in the hell are you guys only now aware of this issue? There are threads on every tech forum on the web about just this issue. The one on hardforum started just over two weeks ago.

Seriously, if you guys were to monitor and respond on your own forums, people might actually use them. You could collect data there, like Nvidia does.

Every launch issue that I've seen with Nvidia, reps have kept people up to date on their forums.

You guys seem so out of touch. It's honestly as if you don't care about your customers. Stuff like this is why AMD/ATI has earned that reputation for poor driver support.


----------



## jerrolds

Quote:


> Originally Posted by *BababooeyHTJ*
> 
> Honestly, how in the hell are you guys only now aware of this issue? There are threads on every tech forum on the web about just this issue. The one on hardforum started just over two weeks ago.
> 
> Seriously, if you guys were to monitor and respond on your own forums, people might actually use them. You could collect data there, like Nvidia does.
> 
> Every launch issue that I've seen with Nvidia, reps have kept people up to date on their forums.
> 
> You guys seem so out of touch. It's honestly as if you don't care about your customers. Stuff like this is why AMD/ATI has earned that reputation for poor driver support.


While I agree the launch was kinda bad - who's to say they weren't aware of this as soon as it was reported on the tech forums, or shortly after?

Some AMD PR rep called Caveman Jim is saying a patch should be ready by the end of the week - so they've been working on it for a while.

The black screen issue has only been a real big topic for the last week and a half or so. The turnaround isn't that bad imo.


----------



## BababooeyHTJ

Quote:


> Originally Posted by *jerrolds*
> 
> While I agree the launch was kinda bad - who's to say they weren't aware of this as soon as it was reported on the tech forums, or shortly after?
> 
> Some AMD PR rep called caveman jim is saying a patch should be ready by the end of the week - so they've been working on it for a while.
> 
> The black screen issue has only been a really big topic for the last week and a half or so. The turnaround isn't that bad, imo.


You're right, they must have been working on it for a little while now. It just would have been nice to see a little more communication from AMD.

It's a gripe that I've had with them for quite a while. They really should respond on their forums like Nvidia does.


----------



## the9quad

In this day and age, the gains you get by keeping in constant communication with your customer base seem to me like a no-brainer. Problem is, people are just generally jerks and abusive when they experience problems. I've been guilty of it myself, if I'm honest. So the person a company hires to be the public rep really needs to have thick skin and not take things personally. They need to realize that when people pay $500-$1,000 for a product, they do so expecting it to work, and when it doesn't, all sense of manners goes out the window. On the other hand, it's really better to at least acknowledge issues right away, so your angry customers don't start adding to the negative image of an uncaring corporation.

Tl;dr: every company should be Johnny-on-the-spot with communicating with their customers, especially during a new product launch. It just makes so much sense. With the amazing tools at their disposal (forums, Facebook, Twitter, etc.), there really is no excuse anymore not to do so.


----------



## DraXxus1549

Is the new WHQL driver different than the Beta 11, or should I just stick with the beta?


----------



## Kipsofthemud

Got my waterblock and backplate in the mail today. The shop I got the 290X from for 380 euros mailed me today that it might take another 2 weeks before they get my card in stock - so I decided to order a Club3D 290 that is going to be delivered in about 12-16 hours... Seeing as PowerColor and Club3D get their cards from the same company, I'm hoping it will unlock.









Either way I couldn't wait another 2 weeks lol. Will post pics tomorrow for proof


----------



## BababooeyHTJ

Quote:


> Originally Posted by *DraXxus1549*
> 
> Is the new WHQL driver different than the Beta 11, or should I just stick with the beta?


WHQL drivers are normally older than the latest beta. I honestly don't bother with WHQL drivers.


----------



## BababooeyHTJ

Quote:


> Originally Posted by *the9quad*
> 
> In this day and age, the gains you get by keeping in constant communication with your customer base seem to me like a no-brainer. Problem is, people are just generally jerks and abusive when they experience problems. I've been guilty of it myself, if I'm honest. So the person a company hires to be the public rep really needs to have thick skin and not take things personally. They need to realize that when people pay $500-$1,000 for a product, they do so expecting it to work, and when it doesn't, all sense of manners goes out the window. On the other hand, it's really better to at least acknowledge issues right away, so your angry customers don't start adding to the negative image of an uncaring corporation.
> 
> Tl;dr: every company should be Johnny-on-the-spot with communicating with their customers, especially during a new product launch. It just makes so much sense. With the amazing tools at their disposal (forums, Facebook, Twitter, etc.), there really is no excuse anymore not to do so.


Thank you, I completely agree with you. It just doesn't make any sense to me. You would think that having one person on staff to address customers would be a lot cheaper than what they pay for some of these game bundles. Also a lot more effective.
Quote:


> On the other hand it's really better to at least acknowledge issues right away, so your angry customers don't start adding to the negative image of an uncaring corporation.


Honestly, it's the way that I've always viewed AMD.

Nvidia seems more open about issues and progress. I've even seen Nvidia respond to feature requests on their forums, like LOD bias adjustments for DX10/11.


----------



## maarten12100

Quote:


> Originally Posted by *jerrolds*
> 
> Warsam - I did; while that's nice (and better than nothing), it seems like a contest or promo more than anything: "As a gesture of goodwill to our most loyal fans, we'll be giving away in a *random draw* 1000 Battlefield 4 game codes in the coming weeks."
> 
> So not every early adopter will be guaranteed a BF4 code, and it could even be given to anyone that likes the AMDGaming FB page or is a Twitter follower.


Quote:


> Originally Posted by *Maxxa*
> 
> Yeah, and I, like most people on this site, will not go near FB or Twitter just to "like" a corporation with someone else's hardware... nm our own.


This.

PS: going to try the PT1 BIOS before sending the cards back tomorrow. Farewell, my loves.


----------



## Warsam71

Quote:


> Originally Posted by *BababooeyHTJ*
> 
> You're right, they must have been working on it for a little while now. It just would have been nice to see a little more communication from AMD.
> 
> It's a gripe that I've had with them for quite a while. They really should respond on their forums like Nvidia does.


Hello,

I can assure you we have been working hard on this issue for a little while now. We've received lots of feedback to help us address the issue, including through our Support Forums. Perhaps I should have been a bit clearer in my last post; I apologize if I wasn't...


----------



## evensen007

Quote:


> Originally Posted by *the9quad*
> 
> In this day and age, the gains you get by keeping in constant communication with your customer base seem to me like a no-brainer. Problem is, people are just generally jerks and abusive when they experience problems. I've been guilty of it myself, if I'm honest. So the person a company hires to be the public rep really needs to have thick skin and not take things personally. They need to realize that when people pay $500-$1,000 for a product, they do so expecting it to work, and when it doesn't, all sense of manners goes out the window. On the other hand, it's really better to at least acknowledge issues right away, so your angry customers don't start adding to the negative image of an uncaring corporation.
> 
> Tl;dr: every company should be Johnny-on-the-spot with communicating with their customers, especially during a new product launch. It just makes so much sense. With the amazing tools at their disposal (forums, Facebook, Twitter, etc.), there really is no excuse anymore not to do so.


I agree 1000%. I am very tech savvy and have been wading through the 290 issues to get them working properly. Can you imagine a tech neophyte buying a 290 off of Amazon and plugging it in expecting it to "just work"? I have had a hard time whole-heartedly recommending the new 290 line to friends. One of them wants to build a custom loop and he got the 290 first. He installed it and is getting way worse frames out of the box than he was with his old dual nVidia 280x's. Granted, I know how to go through the whole driver deep-dive and figured out my own problems, but I think AMD needs to work on the day-one and early-adopter user experience.


----------



## Arizonian

I see an AMD rep addressing us here on a personal level as a good thing. We've learned they've already been working on the black screen issue.

As opposed to Nvidia, who doesn't have a rep here at all. They've had their fair share of driver issues too.

Both have different ways of reaching out to their customer base when things aren't running smoothly, is how I see it.

I'd like to thank Warsam71 for taking the time to keep OCN in the loop and take us seriously.

I don't think you guys are aware how much they value your information here and how much they have been listening.


----------



## Taint3dBulge

Quote:


> Originally Posted by *Arizonian*
> 
> I see an AMD rep addressing us here on a personal level as a good thing. We've learned they've already been working on the black screen issue.
> 
> As opposed to Nvidia, who doesn't have a rep here at all. They've had their fair share of driver issues too.
> 
> Both have different ways of reaching out to their customer base when things aren't running smoothly, is how I see it.
> 
> I'd like to thank Warsam71 for taking the time to keep OCN in the loop and take us seriously.


Yup, what Arizonian said. It would be nice to know a time frame for new drivers that might take care of some of these issues. Maybe like an unofficial statement on driver bugs they are working on and a time frame for release... Or am I just dreaming, lol.


----------



## Warsam71

Hello again everyone,

We are going to release a new driver tomorrow which fixes the black screen problem you've been experiencing.

Again, I want to thank everyone for your patience.


----------



## Scorpion49

Quote:


> Originally Posted by *Warsam71*
> 
> Hello again everyone,
> 
> We are going to release a new driver tomorrow which fixes the black screen problem you've been experiencing.
> 
> Again, I want to thank everyone for your patience.


Good news. I'm excited for that. Does the driver team have any idea why Valley bench runs so poorly on these cards?


----------



## Arizonian

Quote:


> Originally Posted by *Warsam71*
> 
> Hello again everyone,
> 
> We are going to release a new driver tomorrow which fixes the black screen problem you've been experiencing.
> 
> Again, I want to thank everyone for your patience.


Thank you as well.









I'm sure many owners will be looking forward to it and will be very happy to hear this. I will add a small section to the OP info section regarding black screen issue news, for easy access so everyone can see it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Arizonian*
> 
> I see an AMD rep addressing us here on a personal level as a good thing. We've learned they've already been working on the black screen issue.
> 
> As opposed to Nvidia, who doesn't have a rep here at all. They've had their fair share of driver issues too.
> 
> Both have different ways of reaching out to their customer base when things aren't running smoothly, is how I see it.
> 
> I'd like to thank Warsam71 for taking the time to keep OCN in the loop and take us seriously.
> 
> I don't think you guys are aware how much they value your information here and how much they have been listening.


Nvidia does have reps here. They are in disguise.


----------



## Scorpion49

On another note, is anyone besides me getting full white frames inserted during gaming? It's a brief flash that appears to be a solid frame of white, and it happens in all of my games.


----------



## rdr09

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Yup, what Arizonian said. It would be nice to know a time frame for new drivers that might take care of some of these issues. Maybe like an unofficial statement on driver bugs they are working on and a time frame for release... Or am I just dreaming, lol.


Are you affected by black screens as well? What's weird is, based on the number of R9 290 owners on OCN, the majority seem not to have any of these issues.

I am one of those who is not affected. If it were a driver issue, shouldn't it be happening to everyone, not just a few (relatively speaking)?


----------



## Arizonian

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Nvidia does have reps here. They are in disguise.


I'm sure they do


----------



## jjjsong

Quote:


> Originally Posted by *Warsam71*
> 
> Hello again everyone,
> 
> We are going to release a new driver tomorrow which fixes the black screen problem you've been experiencing.
> 
> Again, I want to thank everyone for your patience.


This is great news!

Does it also happen to fix the BSOD issues?


----------



## Roaches

Quote:


> Originally Posted by *Arizonian*
> 
> I'm sure they do


Some of them can be trolling us as we speak


----------



## VSG

So the black screen issue was not because of a lack of power to the card? I read that people didn't get the black screens anymore when they plugged in supplemental power to the PCI-E lanes via molex like some motherboards have.


----------



## brazilianloser

Quote:


> Originally Posted by *geggeg*
> 
> So the black screen issue was not because of a lack of power to the card? I read that people didn't get the black screens anymore when they plugged in supplemental power to the PCI-E lanes via molex like some motherboards have.


It is, or was, case dependent. For example, I had a faulty card that couldn't handle a single run of Valley or a few minutes of a game before crashing. RMA'd it for a new one, got it in today, and so far it's been running Valley for the past hour with no problems. That's without changing a single thing on the PC.


----------



## ReHWolution

Quote:


> Originally Posted by *Warsam71*
> 
> Hello again everyone,
> 
> We are going to release a new driver tomorrow which fixes the black screen problem you've been experiencing.
> 
> Again, I want to thank everyone for your patience.


Thanks Warsam, glad to see the AMD drivers team working hard to bring the right power where it deserves to be (into a 290X).
Quote:


> Originally Posted by *rdr09*
> 
> Are you affected by black screens as well? What's weird is, based on the number of R9 290 owners on OCN, the majority seem not to have any of these issues.
> 
> I am one of those who is not affected. If it were a driver issue, shouldn't it be happening to everyone, not just a few (relatively speaking)?


I've been getting random black screens, mostly when launching a game right after another (to be precise, launching different benchmarks one after another), then all of a sudden they disappeared. Looking forward to tomorrow's driver release (expecting 13.11 Beta 10, or maybe 13.12 Beta?). I've also noticed poor performance in the Unigine benchmarks and in BioShock Infinite...


----------



## CallsignVega

I have two 290Xs arriving tonight to test against my Titans with my 3x1 Eizo setup, and the Elder Scrolls Online beta this weekend. So sweet. I will test 2 vs 2 of course; there are no water blocks for the AMDs yet, so the results will have to be put into context.

Which is the best 290X BIOS currently? I heard you can run up to 1.4v with the stock air cooler? JK


----------



## Kelwing

Fixed my temp issues and then some. Yesterday I found my VRM1 temps going through the roof. Found the heatsinks were not seated well and the glue ended up with air bubbles in it. So I yanked them back off and reglued them last night. Used one small heatsink and one large one (long and skinny). Got the Extreme III back on and used Formula 7 on the GPU. Ran Kombustor for 35 minutes at stock clocks with no overheating issues. Dropped GPU temps a surprising 14C, from 60C to 46C. VRM1 temps are now very steady, peaking and holding at 45C, with VRM2 at 34C.

Bumped the GPU to 1000MHz, which really caused VRM1 temps to shoot up fast yesterday. Again the GPU held steady at 46C, and VRM1 reached 46C with VRM2 at 33C.









Now to play with this some more


----------



## spikexp

I need some help.

Just finished installing my XFX 290 with the Gelid Icy Vision Rev 2 using this guide:
http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x
and I think I did not seat the cooler correctly.

First, temperatures at idle are 50C, and as soon as I start BF4 they skyrocket to 95C...

It has enough thermal paste and all. I think the problem is that the cooler is not fastened tightly enough to the GPU.
I am afraid to tighten it fully because it is bending the card at the core level.
Is this bend normal?


----------



## bpmcleod

Quote:


> Originally Posted by *CallsignVega*
> 
> I have two 290x's arriving tonight to test against my Titans with my 3x1 Eizo setup, and Elder scrolls online beta this weekend. So sweet. I will test 2 vs 2 of course, no water blocks for the AMD's yet so the results will of course have to be put into context.
> 
> Which is the best 290x BIOS currently? I heard you can run up to 1.4v with the stock air cooler? JK


You can, but it vdroops to 1.3 or so, which is fine on the stock cooler. On a different note, though: is anyone experiencing zero scaling with crossfire setups in BF4? I'm getting the same FPS with one or two GPUs and terrible usage.

Edit: I also cannot even play BF4, really. I get the "Leaving Level" message 30 seconds to a minute into the game.


----------



## CallsignVega

Quote:


> Originally Posted by *Warsam71*
> 
> Hello again everyone,
> 
> We are going to release a new driver tomorrow which fixes the black screen problem you've been experiencing.
> 
> Again, I want to thank everyone for your patience.


Sweet. Since the new NVIDIA drivers just came out, I will do all those runs tonight and do the AMD runs after that driver releases tomorrow.









I'll make a 290X/Titan sandwich and just switch between the two setups using the RIVE PCI-E DIP switches.


----------



## rdr09

Quote:


> Originally Posted by *bpmcleod*
> 
> You can, but it vdroops to 1.3 or so, which is fine on the stock cooler. On a different note, though: is anyone experiencing zero scaling with crossfire setups in BF4? I'm getting the same FPS with one or two GPUs and terrible usage.
> 
> Edit: I also cannot even play BF4, really. I get the "Leaving Level" message 30 seconds to a minute into the game.


Run the Test in the menu; it might give you an idea what's going on. I had problems with lag the other night and ran the test, and it pointed to my internet connection being slow. Changed the cable (20 ft) to a shorter one and that kind of solved the issue. I also reset my Pagefile from 3000MB to Auto.


----------



## Technewbie

Quote:


> Originally Posted by *Warsam71*
> 
> Hello again everyone,
> 
> We are going to release a new driver tomorrow which fixes the black screen problem you've been experiencing.
> 
> Again, I want to thank everyone for your patience.


Guess I'll be holding off on shipping my RMA for a day so I can see if the driver fixes the issue.


----------



## Frozenoblivion

When are aftermarket coolers coming for the R9 290?


----------



## Loktar Ogar

Quote:


> Originally Posted by *Frozenoblivion*
> 
> When are aftermarket coolers coming for the R9 290?


Also looking forward to this... Then I'll decide what to buy.


----------



## Roaches

If the AIBs can tease us with their designs, it gives me hope to wait for them.


----------



## bpmcleod

Quote:


> Originally Posted by *Warsam71*
> 
> Hello again everyone,
> 
> We are going to release a new driver tomorrow which fixes the black screen problem you've been experiencing.
> 
> Again, I want to thank everyone for your patience.


Haven't had any black screens, thankfully, but I still can't wait for a new driver...


----------



## DarkZR9

Quote:


> Originally Posted by *Kelwing*
> 
> Fixed my temp issues and then some. Yesterday I found my VRM1 temps going through the roof. Found the heatsinks were not seated well and the glue ended up with air bubbles in it. So I yanked them back off and reglued them last night. Used one small heatsink and one large one (long and skinny). Got the Extreme III back on and used Formula 7 on the GPU. Ran Kombustor for 35 minutes at stock clocks with no overheating issues. Dropped GPU temps a surprising 14C, from 60C to 46C. VRM1 temps are now very steady, peaking and holding at 45C, with VRM2 at 34C.
> 
> Bumped the GPU to 1000MHz, which really caused VRM1 temps to shoot up fast yesterday. Again the GPU held steady at 46C, and VRM1 reached 46C with VRM2 at 33C.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now to play with this some more


Isn't that glue permanent? I used it on mine too, and it feels like if I tried to remove a RAM sink now it would rip the RAM chip right off the board with it.


----------



## Snyderman34

Had a couple black screens during gaming (Dota 2, actually), but they've been few and far between. I still love the card, especially under water. Glad to see AMD reps on here letting us know what's going on. Thanks WarSam!


----------



## sid123

Yay, got a 290X with 78.5% ASIC quality; waterblock coming soon.

Elpida memory as well.


----------



## bpmcleod

Quote:


> Originally Posted by *sid123*
> 
> Yay, got a 290X with 78.5% ASIC quality; waterblock coming soon.
> 
> Elpida memory as well.


I got lucky and got two 290s with sub-70s ASICs, if it really makes a difference.


----------



## BababooeyHTJ

Quote:


> Originally Posted by *Arizonian*
> 
> Opposed to Nvidia who dosent have a rep here at all. They've had thier fair share of driver issues too.


Why would Nvidia need a rep here? They actively address their customers all the time on their own forums. It makes a lot more sense than the occasional post at one of many tech sites. A lot of the complaints could have been confined to their own forum.

There were a few people who joined hardforum just to post in their black screen thread. I'm sure that had AMD provided support on their own forum, people would use it.


----------



## Arizonian

Quote:


> Originally Posted by *BababooeyHTJ*
> 
> Why would Nvidia need a rep here? They actively address their customers all the time on their own forums. It makes a lot more sense than the occasional post at one of many tech sites. A lot of the complaints could have been confined to their own forum.
> 
> There were a few people who joined hardforum just to post in their black screen thread. I'm sure that had AMD provided support on their own forum, people would use it.


You took my post out of context by removing the key sentence and twisting what I said.
Quote:


> Originally Posted by *Arizonian*
> 
> I see an AMD rep addressing us here on a personal level as a good thing. We've learned they've already been working on the black screen issue.
> 
> As opposed to Nvidia, who doesn't have a rep here at all. They've had their fair share of driver issues too.
> 
> *Both have different ways of reaching out to their customer base when things aren't running smoothly, is how I see it.
> 
> *I'd like to thank Warsam71 for taking the time to keep OCN in the loop and take us seriously.
> 
> I don't think you guys are aware how much they value your information here and how much they have been listening.


----------



## Forceman

Has anyone compared the stock voltage for 290 and 290X cards yet? Do 290Xs run at a lower voltage?


----------



## BababooeyHTJ

Quote:


> Originally Posted by *Arizonian*
> 
> You took my post out of context by removing the key sentence and twisting what I said.


No, I didn't. I addressed that perfectly well in my post. Maybe you feel differently since a rep personally addressed you.


----------



## sid123

Quote:


> Originally Posted by *bpmcleod*
> 
> I got lucky and got teo 290s sub 70s asicsif it really makes a difference


It seems like the average for the 290X is 70 to 75 (yet to see one over 80; highest 79).
The 290 seems to be 65 to 75 (highest I have seen is 81).

Very odd how similar they are, since the 290X should be a higher-binned chip.


----------



## Arizonian

Quote:


> Originally Posted by *BababooeyHTJ*
> 
> No, I didn't I addressed that perfectly well in my post. Maybe you feel differently since a rep personally addressed you.


Yes, my assisting 290X / 290 owners with the black screen issue, by providing the rep a thorough breakdown with specifics of what the club members are experiencing, has swayed me.

Please don't insinuate things and derail the club from the main issue the AMD rep was trying to convey here in the 290X / 290 Owners Club, regarding the black screen issues, which you are not dealing with as a non-owner.


----------



## CurrentlyPissed

Just wanted to say Newegg did a stand-up job and gave me $50 in gift cards for the BF4 issue ($25/card).


----------



## ZealotKi11er

Add me. AMD R9 290.


----------



## jjjsong

Quote:


> Originally Posted by *Arizonian*
> 
> Yes, my assisting 290X / 290 owners with the black screen issue, by providing the rep a thorough breakdown with specifics of what the club members are experiencing, has swayed me.
> 
> Please don't insinuate things and derail the club from the main issue the AMD rep was trying to convey here in the 290X / 290 Owners Club, regarding the black screen issues, which you are not dealing with as a non-owner.


Arizonian, when you worked with them, did they mention anything about BSOD and/or possibly BSOD being another symptom of the same cause?


----------



## Arizonian

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Add me. AMD R9 290.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Is it the PowerColor?


----------



## rdr09

Quote:


> Originally Posted by *Arizonian*
> 
> Yes, my assisting 290X / 290 owners with the black screen issue, by providing the rep a thorough breakdown with specifics of what the club members are experiencing, has swayed me.
> 
> Please don't insinuate things and derail the club from the main issue the AMD rep was trying to convey here in the 290X / 290 Owners Club, regarding the black screen issues, which you are not dealing with as a non-owner.


As far as I know, Bababooey is an owner of two of these cards, based on his thread here . . .

http://www.overclock.net/t/1438401/r9-290x-microstutter

I asked him in that thread why he made a thread instead of joining the club and maybe seeking help first. I understand, though, that that was the OP's prerogative.


----------



## Arizonian

Quote:


> Originally Posted by *jjjsong*
> 
> Arizonian, when you worked with them, did they mention anything about BSOD and/or possibly BSOD being another symptom of the same cause?


No, I provided them with specific information on the black screen issues only, running down all the common denominators from this thread and the black screen thread.

They did not provide me with any information, though. It was passed along to their driver team. I'm expecting that when the driver releases tomorrow, we will perhaps find out exactly what the cause was.
Quote:


> Originally Posted by *rdr09*
> 
> As far as I know Bababooey is an owner of 2 of these cards based on his thread here . . .
> 
> http://www.overclock.net/t/1438401/r9-290x-microstutter
> 
> I asked him in that that thread how come he made a thread and not joined the club and maybe seeked help first.


Well, since he had not joined us, I did not know that; my bad. I had also assumed as much because his signature did not show it either.

My point to him was that Nvidia has a different way of approaching issues than AMD; that's all I said. He presumed I was saying something different, and when I corrected him, he insinuated I was somehow biased from dealing directly with the rep. I would have done the same exact thing for Nvidia owners if this were an Nvidia club and I was speaking to a rep.


----------



## jjjsong

Quote:


> Originally Posted by *Arizonian*
> 
> No, I provided them with specific information on the black screen issues only, running down all the common denominators from this thread and the black screen thread.
> 
> They did not provide me with any information, though. It was passed along to their driver team. I'm expecting that when the driver releases tomorrow, we will perhaps find out exactly what the cause was.


I see. Hopefully it also fixes the BSODs I'm getting. I'll most likely be receiving my replacement the day after tomorrow, so I guess I'll find out!


----------



## jerrolds

Quote:


> Originally Posted by *CurrentlyPissed*
> 
> Just wanted to say Newegg did a stand-up job and gave me $50 in gift cards for the BF4 issue ($25/card).


Wait...what BF4 issue? How did you get $50 back in gift cards?


----------



## Redvineal

Please add me to the list!









I'm rocking this stock XFX R9 290 for now. No aftermarket cooling... *yet*.

Now to get working on liberating those extra shaders...


----------



## rdr09

Quote:


> Originally Posted by *Arizonian*
> 
> No, I provided them with specific information on the black screen issues only, running down all the common denominators from this thread and the black screen thread.
> 
> They did not provide me with any information, though. It was passed along to their driver team. I'm expecting that when the driver releases tomorrow, we will perhaps find out exactly what the cause was.
> Well, since he had not joined us, I did not know that; my bad. I had also assumed as much because his signature did not show it either.
> 
> My point to him was that Nvidia has a different way of approaching issues than AMD; that's all I said. He presumed I was saying something different, and when I corrected him, he insinuated I was somehow biased from dealing directly with the rep. I would have done the same exact thing for Nvidia owners if this were an Nvidia club and I was speaking to a rep.


I think we should invite Bababooey.

edit: Baba just left.

edit: Arizonian, I'd like to thank you personally for the great job you are doing for this club, even though you are about to get a 780 Ti.


----------



## Dan848

I am sorry it took so long to get back to you.

If you do not overclock your CPU, your power supply will power two R9 290s.

Edit:

See this post as reference:

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/6300_50#post_21214227


----------



## galil3o

I'm getting my stuff ready for water cooling installation. Can anyone point me to a good video or tutorial on removing the stock cooler?


----------



## tsm106

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BababooeyHTJ*
> 
> No, I didn't. I addressed that perfectly well in my post. Maybe you feel differently since a rep personally addressed you.
> 
> 
> 
> Yes, my assisting 290X / 290 owners with the black screen issue, by providing the rep a thorough breakdown with specifics of what the club members are experiencing, has swayed me.
> 
> Please don't insinuate things and derail the club from the main issue the AMD rep was trying to convey here in the 290X / 290 Owners Club, regarding the black screen issues, which you are not dealing with as a non-owner.

I would ignore the negativity and continue doing what you're doing as thread OP.


----------



## raghu78

Quote:


> Originally Posted by *tsm106*
> 
> I would ignore the negativity and continue doing what you're doing as thread OP.


Well said.

Keep up the good work, Arizonian.


----------



## ReHWolution

Quote:


> Originally Posted by *sid123*
> 
> It seems like the average for the 290X is 70 to 75 (yet to see one over 80; highest 79).
> The 290 seems to be 65 to 75 (highest I have seen is 81).
> 
> Very odd how similar they are, since the 290X should be a higher-binned chip.



*cough cough*


----------



## CurrentlyPissed

Quote:


> Originally Posted by *jerrolds*
> 
> Wait...what BF4 issue? How did you get $50 back in gift cards?


By "BF4 issue" I mean me paying overnight shipping on release day for the 290, getting my 290s, then 4 days later them announcing it would be bundled.


----------



## the9quad

Received my 2nd 290X today; will post pics later. So far no black screens, just need my new monitor, because @ 1080p this is overkill like a mug atm.


----------



## jerrolds

Quote:


> Originally Posted by *CurrentlyPissed*
> 
> By "BF4 issue" I mean me paying overnight shipping on release day for the 290, getting my 290s, then 4 days later them announcing it would be bundled.


Yes, I was hoping for something like this - and Newegg was accommodating? 'Cause I did the exact same thing - paid express shipping on release day for a BF4 290X, then saw it FREE with R7 270 cards not even 2 weeks later.

That card is 30% the price of a 290X.

Are you kidding me with this, AMD?


----------



## rdr09

Quote:


> Originally Posted by *rv8000*
> 
> Interesting, you have an MSI board right?


yes, MSI.


----------



## Gunderman456

Quote:


> Originally Posted by *galil3o*
> 
> I'm getting my stuff ready for water cooling installation. Can anyone point me to a good video or tutorial on removing the stock cooler?


Here you go!

http://m.youtube.com/index?gl=CA&desktop_uri=%2F%3Fgl%3DCA


----------



## MlNDSTORM

Still not sure if I should go with the 780 or 290x....The noise and the heat really bug me...Going to read some more and see videos, before I make my decision.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> Is it the Powercolor?


It's the AMD reference card. I was also able to unlock it to a 290X.


----------



## ReHWolution

Quote:


> Originally Posted by *MlNDSTORM*
> 
> Still not sure if I should go with the 780 or 290x....The noise and the heat really bug me...Going to read some more and see videos, before I make my decision.


The 780 is a good card (I had one), but driver improvements are basically zero from this point on. The 290X is already faster, and will probably pull further ahead thanks to new Catalysts, Mantle, and all that stuff, BUT the heat and the noise are a p.i.t.a. Just wait less than a month: Vapor-X cards from Sapphire are on the way for the second half of December, or if you wish, ASUS, Gigabyte, and MSI models in the same release period (kinda), or even better, wait for a custom 290, so that you save some cash and still have a GREAT card.


----------



## MacClipper

Quote:


> Originally Posted by *ReHWolution*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sid123*
> 
> It seems like the average for 290x is 70 to 75 (yet to see one over 80 highest 79)
> 290 seems to be 65 to 75 ( highest i have seen is 81)
> 
> very odd how similar they are since 290x should be a higher binned chip
> 
> 
> 
> 
> *cough cough*











Should I be coughing too?


----------



## Forceman

Quote:


> Originally Posted by *MacClipper*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Should I be coughing too?


Why does your card have a 937 clock speed?


----------



## ReHWolution

Quote:


> Originally Posted by *Forceman*
> 
> Why does your card have a 937 clock speed?


'cause it's an R9 290, not an R9 290*X*

We were talking about X cards, so yea, you can cough too, but not for the same reason hahahaha xD


----------



## Forceman

Quote:


> Originally Posted by *ReHWolution*
> 
> 'cause it's a R9 290, not a R9-290*X*
> 
> 
> 
> 
> 
> 
> 
> We were talking about X cards
> 
> 
> 
> 
> 
> 
> 
> so yea, you can cough too, but not for the same reason hahahaha xD


290s should have a 9*4*7 clock speed.


----------



## ZealotKi11er

Quote:


> Originally Posted by *MlNDSTORM*
> 
> Still not sure if I should go with the 780 or 290x....The noise and the heat really bug me...Going to read some more and see videos, before I make my decision.


I have been hearing a lot of reviews complaining about the sound and heat of this card. At 40% I can say it's not bad at all, and this is coming from someone who can't stand fans over 1000 RPM and only runs water-cooled cards. I thought it was going to be a jet.


----------



## MacClipper

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ReHWolution*
> 
> 'cause it's a R9 290, not a R9-290*X*
> 
> 
> 
> 
> 
> 
> 
> We were talking about X cards
> 
> 
> 
> 
> 
> 
> 
> so yea, you can cough too, but not for the same reason hahahaha xD
> 
> 
> 
> 290s should have a 9*4*7 clock speed.

Eagle Eye points awarded!

Cos it is not off the shelf...










And what lies beneath...


----------



## Joeking78

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I have been hearing a lot of reviews complaining about the sound and heat of this card. At 40% I can say it's not bad at all, and this is coming from someone who can't stand fans over 1000 RPM and only runs water-cooled cards. I thought it was going to be a jet.


I agree.

I was quite pleased with temps and noise when running BF4: temps in the low/mid 70s and not totally unbearable noise.

They do get to turbine noise levels when running 3DMark, and temps go a tad higher, but that's not every day.

Very pleased with these cards, phenomenal horsepower.

Edit: nice timing with the ad at the bottom of my page, GE jet turbine engine lol.


----------



## Arizonian

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I have been hearing a lot of reviews complaining about the sound and heat of this card. At 40% I can say it's not bad at all, and this is coming from someone who can't stand fans over 1000 RPM and only runs water-cooled cards. I thought it was going to be a jet.


IMO, judging by ear, I'd say the 290X fan at 70% is just a tad (if noticeably) louder than my 690 at 95% fan, which I ran while gaming to keep Kepler temps under the 82C threshold where Kepler downclocks 13 MHz on the core.

Benching at 100% did sound like a vacuum cleaner. Luckily I don't play benchmarks.









Still, I think if one isn't water blocking, then an aftermarket cooling alternative is the best route for the 290X.

I can't wait any longer, I'm going to bust. I'm sure I'm not going to be able to hold out for the ASUS Matrix if a Gigabyte Windforce III or ASUS DCUII Top comes out first.


----------



## Scorpion49

So I just picked me up a new MSI X79A-GD45 Plus and a Samsung 840 EVO 250GB... now re-installing windoze. Wheeeeeee...

Quote:


> Originally Posted by *Arizonian*
> 
> IMO judging by ear, I'd say up to 290X fan at 70% is just a tad (if noticeable) louder than my 690 at 95% fan which I ran gaming to keep Kepler temps under the 82C threshold before Kepler down clocks 13 MHz on the Core.
> 
> Benching at 100% did sound like a vacuums cleaner. Luckily I don't play benchmarks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Still I think if one isn't water blocking then aftermarket cooling alternative is the best route for 290X.
> 
> I can't wait any longer, I'm going to bust. I'm sure I'm not going to be able to hold out for ASUS Matrix if a Gigabyte Windforce III or ASUS DCIIU Top comes out first.


Matrix is a waste of money IMO; I've had 5 or 6 of them and they never did anything better than the regular DCUII cards. Lightning is where it's at for a custom PCB... 290XL


----------



## Spectre-

Quote:


> Originally Posted by *Arizonian*
> 
> IMO judging by ear, I'd say up to 290X fan at 70% is just a tad (if noticeable) louder than my 690 at 95% fan which I ran gaming to keep Kepler temps under the 82C threshold before Kepler down clocks 13 MHz on the Core.
> 
> Benching at 100% did sound like a vacuums cleaner. Luckily I don't play benchmarks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Still I think if one isn't water blocking then aftermarket cooling alternative is the best route for 290X.
> 
> I can't wait any longer, I'm going to bust. I'm sure I'm not going to be able to hold out for ASUS Matrix if a Gigabyte Windforce III or ASUS DCIIU Top comes out first.


Meh, sound and heat don't bother me.

The temps are bad for the system, but I'll be grabbing the Accelero 3 soon if Arctic doesn't properly announce the Accelero 4.


----------



## Arizonian

Quote:


> Originally Posted by *Scorpion49*
> 
> Matrix is waste of money IMO, I've had 5 or 6 of them and they never did anything better than the regular DCUII cards. Lightning is where its at for custom PCB... 290XL


Thanks, I'll take a previous owner's word for it then. An MSI Lightning would be sweet.









I've always wanted one. Just never made it past release day launch on new series.


----------



## Spectre-

Can someone here help me set up my 290 for folding please?

I have installed and re-installed Folding@home 4 times now.

Much appreciated.


----------



## Scorpion49

Quote:


> Originally Posted by *Arizonian*
> 
> Thanks I'll take the previous owners word for it then. An MSI lightning would be sweet.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've always wanted one. Just never made it past release day launch on new series.


Digging deep into my graphics card past, these are the only Matrix cards I've had that were amazing. Both 580 Matrix Platinums were able to game at 975 and bench at 1000MHz on air, which was pretty decent for Fermi. I wonder if the Lightning will be extra special over the "Gaming" version...


Spoiler: Warning: Spoiler!


----------



## Arizonian

Quote:


> Originally Posted by *Redvineal*
> 
> Please add me to the list!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm rocking this stock XFX R9 290 for now. Not aftermarket cooling...*yet*.
> 
> Now to get working on liberating those extra shaders...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added








Quote:


> Originally Posted by *Scorpion49*
> 
> Digging deep in to my graphics card past, these are the only Matrix cards I've had that were amazing. Both 580 Matrix Platinums were able to game at 975 and bench at 1000mhz on air which was pretty decent for fermi. I wonder if the Lightning will be extra special over the "gaming" version...
> 
> 
> Spoiler: Warning: Spoiler!


I remember the 580 days fondly. Got my 580 reference with voltage bump (_the good old Nvidia days before the Green initiative_) @ 950 MHz to bench / @ 925 MHz core 24/7.









Quote:


> Originally Posted by *Spectre-*
> 
> Can someone here help me set up my 290 for folding please?
> 
> I have installed and re-installed Folding@home 4 times now.
> 
> much appreciated


Might want to get a hold of @$ilent ---- *2013 Forum Folding War - Team Intel
* - tell him I sent ya.


----------



## ZealotKi11er

Using the 290X right now with an A8-3870K @ 3.0GHz to play BF4. Man, the CPU holds it back. Getting like 25-40 fps lol.


----------



## Scorpion49

Anyone do any checking to see if PCI-E 2.0 vs 3.0 has any effect on these cards in CF?

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Using the 290X right now with an A8-3870K @ 3.0GHz to play BF4. Man, the CPU holds it back. Getting like 25-40 fps lol.


Ouch, yeah, that's why I brought back the 8350. It would be fine for new games, but I still play a lot of older ones like Skyrim, and at 4.6 it can't match a stock Sandy chip in Skyrim unfortunately.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Scorpion49*
> 
> Anyone do any checking to see if PCI-E 2.0 vs 3.0 has any effect on these cards in CF?
> Ouch, yeah thats why I brought back the 8350. It would be fine for new games but I play a lot of ones like Skyrim still and at 4.6 it can't match a stock Sandy chip in Skyrim unfortunately.


Cant wait for Mantle to hit and see how much faster it will make BF4 in super CPU limited systems.


----------



## Scorpion49

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Cant wait for Mantle to hit and see how much faster it will make BF4 in super CPU limited systems.


I know, right? I'm really excited for that technology. It seems the problems with node shrinks have caused both GPU makers to look elsewhere for things people want to buy/upgrade for.


----------



## Mr357

Quote:


> Originally Posted by *Spectre-*
> 
> Can someone here help me set up my 290 for folding please?
> 
> I have installed and re-installed Folding@home 4 times now.
> 
> much appreciated


If you're folding on your CPU as well, you need to limit it to two fewer threads than your CPU has. Otherwise a fully loaded CPU will cause your GPU WUs to fail.
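For what it's worth, in the V7 FAHClient that advice maps to capping the CPU slot's thread count in config.xml. A minimal sketch, assuming an 8-thread CPU (the slot IDs and the value 6 are examples, not from the post):

```xml
<config>
  <!-- CPU slot capped at 6 of 8 threads, leaving headroom for the GPU work unit -->
  <slot id='0' type='CPU'>
    <cpus v='6'/>
  </slot>
  <!-- GPU slot driving the 290 -->
  <slot id='1' type='GPU'/>
</config>
```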


----------



## bpmcleod

Quote:


> Originally Posted by *CurrentlyPissed*
> 
> By BF4 issue. I mean, me paying Overnight shipping on release day of 290, getting my 290s, then 4 days later them announcing it would be bundled.


MMMM I ordered two 290s with no BF4... Newegg here I comeeee!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Arizonian*
> 
> Might want to get a hold of @$ilent ---- *2013 Forum Folding War - Team Intel
> * - tell him I sent ya.


Afaik $ilent has RMA'd his 290. Artifacts at stock clocks, I believe.

As for the folding, mine was above 200k and now it won't pass 50k. Nothing has changed on my rig, so I've no idea what's going on there.


----------



## bpmcleod

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Afaik $ilent has RMA'd his 290. Artifacts at stock clocks I believe.
> 
> As for the folding....mine was above 200k and now it wont pass 50k....nothing has changed on my rig so ive no idea whats going on there.


They fluctuate. Also, make sure you're folding only the newer WUs; they yield the most PPD.


----------



## tsm106

The leaked WHQL driver is pretty fast, but dammit AMD, it's not validated! Argh, I could have shaken things up in the HOF, but no...


----------



## bpmcleod

where is the leaked driver?


----------



## tsm106

Quote:


> Originally Posted by *bpmcleod*
> 
> where is the leaked driver?


Actually, they're not leaked anymore; they're on AMD's site.


----------



## r0cawearz

Can anyone who got the free Battlefield 4 with the XFX 290 help me? For some reason I'm getting the message "Sorry, your product is not subject to XFX game promotion" after having registered and put in the personal activation code from the card.


----------



## Ukkooh

A quick question: how many 290Xs have fallen short of 1200 MHz under water?


----------



## Mr357

Quote:


> Originally Posted by *Ukkooh*
> 
> A quick question: How many 290xs have fallen short of 1200mhz under water?


Depends where you draw the line with voltage. It's not about cooling.


----------



## tsm106

You can do 1200 on air pretty easily.


----------



## Forceman

Quote:


> Originally Posted by *r0cawearz*
> 
> can anyone who got the xfx 290 free battlefield 4 help me? For some reason i'm getting the message: Sorry, your product is not subject to XFX game promotion. after having registered and putting in the personal activation code from the card.


Maybe they wised up?


----------



## Scorpion49

Much better: http://www.3dmark.com/3dm11/7522457


----------



## Arizonian

Quote:


> Originally Posted by *Scorpion49*
> 
> Much better: http://www.3dmark.com/3dm11/7522457


That is good


----------



## Scorpion49

Quote:


> Originally Posted by *Arizonian*
> 
> That is good


The P score would be a bit higher, but I don't like HT for gaming so I shut it off, which drags the Physics score down a lot. I could see the difference in the run though: the FPS didn't bounce around as much; before, the digits were changing so fast the number was tearing.


----------



## MintyOne

You can add me to the list please! Sapphire 290

Edit: lol, my screenshot is with the 290X BIOS that I flashed, hence the 1GHz clock speed shown.


----------



## ZealotKi11er

I noticed that even with no load at all, setting the memory to 1500 MHz did not work. Is overclocking these memory chips really this limited?


----------



## TooBAMF

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I notice with no load at all setting the memory to 1500MHz did not work. Is overclocking these memory chips really this limited.


It has looked pretty limited to me. With the Titan, rigging some extra fans to blow directly over the back of the card helps, but the 290s don't have memory chips on the back. I assume you're on air?

Edit: saw the rads in your sig


----------



## Arizonian

Quote:


> Originally Posted by *MintyOne*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> You can add me to the list please! Sapphire 290
> 
> Edit: lol my screenshot is with the 290x bios that I flashed, hence the 1ghz clockspeed shown.


Congrats - added


----------



## brazilianloser

So here is my new issue... Got a replacement card in, and it's working great compared to the black screen madness I had with my last one... but for some reason AB Beta 17 and 3DMark are only showing that the card has 3GB of RAM. Has anyone noticed this on their cards as well (ASUS 290)?

Edit: on the other hand, GPU-Z does state that the card has 4GB as it should.


----------



## kdawgmaster

Quote:


> Originally Posted by *Scorpion49*
> 
> Much better: http://www.3dmark.com/3dm11/7522457


Here's my Fire Strike score with 3 cards: http://www.3dmark.com/fs/1177247

I have to say though that I'm having issues other than the black screen problem.

For some reason in my 3-card setup, after playing games the system will crash (narrowed down to drivers), but it crashes so hard that I have to go through a reinstall process where I even have to use Driver Sweeper. The worst part is that one card always errors out in Device Manager, to where I have to remove the cards and plug them back in. I've gone through the process of trying each card individually to make sure they work, and they do, but I can't get that card re-enabled in the 3-card setup after the crash. I'm thinking this is a registry problem between Windows and the AMD drivers, but I can't confirm 100%.


----------



## MlNDSTORM

Thanks for the input guys, guess I'll make my decision soon... I really like the glowing GeForce GTX logo and the obviously cooler, quieter card... but I have always had Nvidia, and going to AMD seems like a change I've been wanting to try, even though they're basically the same card. Just wish they had the non-reference cards out. I also just bought the ASUS VG248QE, which will be G-Sync compatible. Hoping to upgrade to 1440p in summer 2014; I'm sure CrossFire 290Xs will be enough then.


----------



## Sgt Bilko

Quote:


> Originally Posted by *brazilianloser*
> 
> So here is my new issue... Got a replacement card in and its working great compared to the black screen madness I had going on with my last one... but for some reason AB Beta 17 and 3Dmark are only showing that the card has 3GB of ram. Anyone has noticed this situation on their cards as well (ASUS 290).???
> 
> 
> 
> edit: on the other hand GPUZ does states that the card has 4GB as it should.


You need to change it to 4GB in the AB Monitoring settings.

I just did it after a re-install.


----------



## Forceman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I notice with no load at all setting the memory to 1500MHz did not work. Is overclocking these memory chips really this limited.


You talking about the flashing lines on the desktop? Supposedly that is some kind of 2D/3D clock issue. Try increasing the core voltage and see if that helps your memory overclocking - seems to help some people.


----------



## Sazz

Do we have any list of 290 owners who were successful in flashing their 290s to 290X without any performance problems? I just wonder if the chances of flashing successfully to 290X without problems are high; I might buy a 290 to CrossFire with my 290X, or sell the 290X I have now, use the 290, and flash it to a 290X. Either way I am planning to sell my 290X and replace it with a 290; I already have a local buyer if I decide to sell xD


----------



## Falkentyne

Guys I have a VERY serious question (also for the friendly AMD rep, if he sees this):

if it's a driver problem, how come:
1) some people who had black screens RMA'd their cards and the new card was rock solid?

2) How come I haven't experienced any black screens? (the only time I ever had one was when overclocking the memory through CCC and after restarting, it applied the oc on startup and blackscreened on desktop after any gpu/scrolling activity).. I've even gone to 1150/1500 (90% fan) with +100mv and it ran 3dmark/firestrike like a baws, and heaven is solid with 1125/1500 and +50mv. I didn't see any probs in BF4 either, and one test mentioned on ocuk, running "dark Souls" which caused this one guy to black screen almost instantly, seemed fine too (and MAME emulator was fine).

Sure I'll do more testing to try to trigger it, but why isn't it happening to me?
(and why does it seem the vast majority of cases are those with Elpida memory rather than Hynix?).

Thank you.


----------



## kdawgmaster

Quote:


> Originally Posted by *Sazz*
> 
> do we have any list of 290 owners that was successful on flashing their 290's to 290X without any performance problems? I just wonder if the chances of flashing it successfully to 290X without problems is high, and I might buy a 290 instead to x-fire with my 290X now, or sell the 290X that I got now and use the 290 and flash it to 290X. But either way I am planning to sell my 290X and re-place it with 290, already got a buyer locally if I decide to sell xD


There's a boatload of people who have done it if you Google it. But this will be like the 6000 series, going from a 6950 to a 6970. Very risky.

Quote:


> Originally Posted by *Falkentyne*
> 
> Guys I have a VERY serious question (also for the friendly AMD rep, if he sees this):
> 
> if it's a driver problem, how come:
> 1) some people who had black screens RMA'd their cards and the new card was rock solid?
> 
> 2) How come I haven't experienced any black screens? (the only time I ever had one was when overclocking the memory through CCC and after restarting, it applied the oc on startup and blackscreened on desktop after any gpu/scrolling activity).. I've even gone to 1150/1500 (90% fan) with +100mv and it ran 3dmark/firestrike like a baws, and heaven is solid with 1125/1500 and +50mv. I didn't see any probs in BF4 either, and one test mentioned on ocuk, running "dark Souls" which caused this one guy to black screen almost instantly, seemed fine too (and MAME emulator was fine).
> 
> Sure I'll do more testing to try to trigger it, but why isn't it happening to me?
> (and why does it seem the vast majority of cases are those with elpida memory rather than Hynix?).
> 
> Thank you.


Judging by your system, you aren't using the R9 290X but rather the 7970; that's why. There are theories on why they black screen, ranging from a bad batch of VRAM from a second supplier AMD sources from, to bad drivers.


----------



## Forceman

Quote:


> Originally Posted by *Sazz*
> 
> do we have any list of 290 owners that was successful on flashing their 290's to 290X without any performance problems? I just wonder if the chances of flashing it successfully to 290X without problems is high, and I might buy a 290 instead to x-fire with my 290X now, or sell the 290X that I got now and use the 290 and flash it to 290X. But either way I am planning to sell my 290X and re-place it with 290, already got a buyer locally if I decide to sell xD


Whole thread about it here:

http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/0_30
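For reference, the procedure discussed in that thread generally boils down to a couple of ATIFlash commands. A sketch of the usual steps (the adapter index and file names are illustrative examples, not from the thread; flashing is at your own risk):

```shell
# Save the card's stock BIOS first so you can always flash back
atiflash -s 0 290_stock.rom

# Program adapter 0 with the 290X BIOS (-f forces past the subsystem ID check)
atiflash -p 0 290x.rom -f

# After a reboot, confirm which BIOS the card reports
atiflash -i
```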


----------



## Falkentyne

Quote:


> Originally Posted by *kdawgmaster*
> 
> theres a boat load of people that have done it if u google. But this will be like the 6000 series going from a 6950 to 6970. Very risky.
> Judging by ur system you arent using the R9 290x but rather the 7970 thats why. There are theories to why they black screen from a bad batch of Vram from a second source that AMD gets their ram from, to bad drivers.


I haven't updated my signature yet. I'm already in the 290X club; I bought the 290X the first night of release.


----------



## kdawgmaster

Quote:


> Originally Posted by *Falkentyne*
> 
> I haven't updated my signature yet. I'm already in the 290x club. I bought the 290x the first night of release.


Ah lol xD


----------



## Arm3nian

Quote:


> Originally Posted by *Sazz*
> 
> do we have any list of 290 owners that was successful on flashing their 290's to 290X without any performance problems? I just wonder if the chances of flashing it successfully to 290X without problems is high, and I might buy a 290 instead to x-fire with my 290X now, or sell the 290X that I got now and use the 290 and flash it to 290X. But either way I am planning to sell my 290X and re-place it with 290, already got a buyer locally if I decide to sell xD


Selling the 290X and buying a 290 on a gamble to save a few bucks is illogical, IMO. First of all, most of them do not unlock. Second, you can brick your card, because not all silicon is the same. And third, we still haven't seen any proper proof that it actually unlocks the shaders; for all we know it could just be GPU-Z reading incorrectly. The 2 fps increases are most likely from the 290X BIOS itself, which is different from the 290's.


----------



## Duvar

You have to check the number on the chip: if it is 215-0852000, it's a 290X.
So all 290 cards with numbers ending in ...2000 are unlockable; this is what we figured out here in Germany.
You can check here how many people have already unlocked their cards: http://extreme.pcgameshardware.de/grafikkarten/304527-howto-flash-amd-r9-290-290x.html


----------



## TooBAMF

Quote:


> Originally Posted by *Arm3nian*
> 
> Selling the 290x and buying a 290 on a gamble to save a few bucks is illogical imo. First of all, most of them do not unlock. Second of all you can brick your card because not all silicon is the same. And third of all, we still haven't seen any proper proof that it actually unlocks the shaders, for all we know it could just be gpu-z reading incorrectly. The 2fps increases are most likely the actual bios on the 290x which is different than the 290.


I agree selling a 290X for a 290 to unlock is illogical. If the new 290 bricks for whatever reason, whether or not it's related to the unlock, a locked 290 will likely be the RMA replacement.

Still, I'm pretty sure people have tested to make sure the card doesn't throttle and that the clocks are the same before and after the BIOS update.

If not, wouldn't it be easy for someone to benchmark a 290 underclocked to ~800MHz to stay well within stock TDP, update to the ASUS 290X BIOS, and run at the same 800MHz clock?

There may not be a way to prove that unlocked shaders = true 290X shaders without owning both cards. Nevertheless, it should be possible to prove that some shaders are unlocked through the clock-for-clock performance boost (assuming clocks stay locked at 800MHz).
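To put a number on that: the 290X carries 2816 stream processors vs the 290's 2560, so a genuine unlock should show roughly a 10% clock-for-clock gain in a shader-bound test. A quick sketch of that sanity check (the function name, tolerance, and fps figures are hypothetical; only the shader counts are the published specs):

```python
# Expected clock-for-clock gain if a flashed 290 really enables the 290X's
# extra shaders: the 290X has 2816 stream processors vs 2560 on the 290.
R9_290_SHADERS = 2560
R9_290X_SHADERS = 2816

expected_gain = R9_290X_SHADERS / R9_290_SHADERS - 1  # = 0.10 (10%)

def looks_unlocked(fps_before: float, fps_after: float, tolerance: float = 0.03) -> bool:
    """Crude sanity check: at identical clocks, is the measured gain in a
    shader-bound benchmark close to the ~10% shader-count ratio?"""
    measured = fps_after / fps_before - 1
    return measured >= expected_gain - tolerance

# Hypothetical numbers: a 2 fps bump on 60 fps (~3%) is within run-to-run
# noise, while ~66 fps (~10%) matches the shader ratio.
print(looks_unlocked(60.0, 62.0))  # False
print(looks_unlocked(60.0, 66.0))  # True
```
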


----------



## Forceman

Quote:


> Originally Posted by *Arm3nian*
> 
> Selling the 290x and buying a 290 on a gamble to save a few bucks is illogical imo. First of all, most of them do not unlock. Second of all you can brick your card because not all silicon is the same. And third of all, we still haven't seen any proper proof that it actually unlocks the shaders, for all we know it could just be gpu-z reading incorrectly. The 2fps increases are most likely the actual bios on the 290x which is different than the 290.


Quote:


> Originally Posted by *TooBAMF*
> 
> There may not be a way to prove that unlocked shaders = true 290X shaders without owning both cards. Nevertheless, it should be possible to prove that some shaders are unlocked through the clock for clock performance boost (assuming clocks stay locked at 800MHz)


It's been shown in multiple tests, by multiple people, that clock for clock the flashed 290X is faster than the unflashed 290 and I believe someone tested that it is faster than a flashed but still locked 290. So far no one has both an actual 290X and a unlocked 290 to compare in the same system, but there's little doubt still remaining that it is actually unlocking the shaders.

On a different topic, is there any word on the red screens in BF4? I thought Beta 8 was supposed to fix them, but I still occasionally get them on Beta 8 and 9.2, both overclocked and stock (and unflashed, before anyone asks).


----------



## hotrod717

It is not unlocking; let's get that straight. The cards we're talking about have 290X chips. This is not like the 6900 series: 290 chips have been laser-cut and cannot be unlocked. Please do research before making assumptions and spreading misinformation. Yes, I own one. Please visit the thread Forceman listed to get a correct idea of what is in question.

These are 290X chips that are "BIOS locked" and sold as 290s.


----------



## Arm3nian

Quote:


> Originally Posted by *TooBAMF*
> 
> I agree selling 290X for a 290 to unlock is illogical. If the new 290 bricks for whatever reason, whether or not it's related to the unlock, a locked 290 will likely be the RMA replacement.
> 
> Still, I'm pretty sure people have tested to make sure the card doesn't throttle and the clocks are the same before and after BIOS update to compare.
> 
> If not, wouldn't it be easy for someone to benchmark a 290 underclocked to ~800MHz to stay well within stock TDP, update to the ASUS 290X BIOS and run at the same 800MHz clock?
> 
> There may not be a way to prove that unlocked shaders = true 290X shaders without owning both cards. Nevertheless, it should be possible to prove that some shaders are unlocked through the clock for clock performance boost (assuming clocks stay locked at 800MHz)


To prove the flash made a difference, we would need to see consistent results with more of an increase than 2 fps; a difference that small can be within the margin of error. The correct way to test is under water with no throttling: you would need both a 290 and a 290X, find the differences, then flash the 290 and compare the results to the 290X.

The 290 is great, and more bang for the buck, but selling a 290X and going to a 290 in hopes of flashing and getting the same performance is just asking to be disappointed.


----------



## Forceman

Quote:


> Originally Posted by *Arm3nian*
> 
> To prove the flash made a difference we would need to see consistent results and by more of an increase then 2fps. Such low of a difference can be the percentage of error. The correct way to test is under water for no throttling. You would need both a 290 and 290x, and find the differences. Then flash the 290 and compare the results to the 290x.


My card is under water and doesn't throttle. I've tested it clock for clock on Heaven, Valley, 3DMark, Metro:LL, Compubench and ShaderToyMark. The flashed card is anywhere from 5-10% faster on every test. It's unlocked.

Why don't you run ShaderToyMark and Compubench on your card and see what you get?


----------



## GenoOCAU

When putting my waterblocks on and removing the stock TIM, both cores on my XFX 290s were totally blank.

Not sure how people are checking the numbers on their chips.

Mine were literally a black rectangle. I remember commenting to my partner that it's one of the only chips I've ever seen since I started playing with PCs that didn't have at least a couple bits of lettering on it.


----------



## TooBAMF

Quote:


> Originally Posted by *Arm3nian*
> 
> To prove the flash made a difference we would need to see consistent results and by more of an increase then 2fps. Such low of a difference can be the percentage of error. The correct way to test is under water for no throttling. You would need both a 290 and 290x, and find the differences. Then flash the 290 and compare the results to the 290x.
> 
> 290 is great, and more bang for buck, but selling a 290x and going to a 290 in hopes of flashing and getting the same performance is just asking to be dissapointed.


Comparing a real 290 to a real 290X under water is the ideal test, but I don't know if we're ever going to get that. 800MHz shouldn't throttle on air, and clock speeds can be logged to verify that.

Different benchmarks can be used, like the texel fill feature test in 3DMark Vantage. The AnandTech 780 Ti review shows the 290X being 15% faster than the 290 in that particular benchmark and only 1-5% faster in the other synthetics:

http://www.anandtech.com/show/7492/the-geforce-gtx-780-ti-review/13


----------



## Arm3nian

Quote:


> Originally Posted by *Forceman*
> 
> My card is under water and doesn't throttle. I've tested it clock for clock on Heaven, Valley, 3DMark, Metro:LL, Compubench and ShaderToyMark. The flashed card is anywhere from 5-10% faster on every test. It's unlocked.
> 
> Why don't you run ShaderToyMark and Compubench on your card and see what you get?


Because I'm one of the unlucky ones who preordered the RIVBE from NCIX, and am never going to get it.

That thread has a whopping 5 people who've successfully unlocked the shaders? We're going to have some sad people when it doesn't work.


----------



## aznever

I ran the CompuBench ray tracing test at the same clock speed, and the R9 290 flashed to a 290X is about 4.6% faster.


----------



## SonDa5

Quote:


> Originally Posted by *Forceman*
> 
> My card is under water and doesn't throttle. I've tested it clock for clock on Heaven, Valley, 3DMark, Metro:LL, Compubench and ShaderToyMark. The flashed card is anywhere from 5-10% faster on every test. It's unlocked.
> 
> Why don't you run ShaderToyMark and Compubench on your card and see what you get?


My water-cooled Sapphire 290X (72.5% ASIC) doesn't throttle on benchmarks at around 1200/1700. We need to compare scores some time.

I don't give a rat's ass whether the 290 does indeed turn into a 290X with a BIOS flash. I think that is cool, and more power to you and AMD for allowing such a kick-ass trick.


----------



## SonDa5

Quote:


> Originally Posted by *GenoOCAU*
> 
> When putting my waterblocks on and removing stock TIM, both the cores on my XFX 290's were totally blank.
> 
> Not sure how people are checking the numbers on their chip
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Mine were literally a black rectangle, I remember commenting to my partner its one of the only chips ive ever seen since playing with PC's that didn't have at least a couple bits of lettering on it.


My Sapphire 290x was completely blank as well.


----------



## Forceman

Quote:


> Originally Posted by *GenoOCAU*
> 
> When putting my waterblocks on and removing stock TIM, both the cores on my XFX 290's were totally blank.
> 
> Not sure how people are checking the numbers on their chip
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Mine were literally a black rectangle, I remember commenting to my partner its one of the only chips ive ever seen since playing with PC's that didn't have at least a couple bits of lettering on it.


The part number is on the shim, not the core itself, for some reason.


----------



## Sazz

Quote:


> Originally Posted by *Arm3nian*
> 
> Selling the 290x and buying a 290 on a gamble to save a few bucks is illogical imo. First of all, most of them do not unlock. Second of all you can brick your card because not all silicon is the same. And third of all, we still haven't seen any proper proof that it actually unlocks the shaders, for all we know it could just be gpu-z reading incorrectly. The 2fps increases are most likely the actual bios on the 290x which is different than the 290.


By "few bucks" you mean 150 bucks is just a few bucks? Of course I will test it out first, and if it doesn't successfully unlock to a 290X it doesn't really matter, coz the 290 is a pretty darn good performer for 150 bucks less.

I can sell my card for 550 to my friend coz he's thinking of buying one anyway; I just have to let him know before he pulls the trigger.


----------



## Hogesyx

Hi guys, first post here.

I have the chance to return my Sapphire R9 290 and top up to a Sapphire R9 290X (BF4 bundle) for about $188 USD (converted from local pricing). Wondering if it is worth it, considering I do not have Battlefield yet and am planning to buy it.

My current Sapphire 290 has an ASIC score of 70.1%; I haven't really pushed the OC yet as I am still on the stock blower fan.

Should I jump the gun and upgrade to the 290X, or stick with my current 290, just buy BF4 off Origin, and spend the rest of the money on something else?


----------



## Sazz

Quote:


> Originally Posted by *Hogesyx*
> 
> Hi guys, first post here.
> 
> I have the chance to return my Sapphire R9 290 and top up to a Sapphire R9 290X(BF4 bundle) for about $188 USD(converted from local pricing), wondering if it is worth it considering I do not have Battlefield yet and is planning to buy it.
> 
> My current Sapphire 290 has a ASCI score of 70.1%, haven't really push the OC yet as I am still on stock blower fan.
> 
> Should I jump the gun and upgrade to 290X or stick with my current 290 and just buy BF4 off origin and spent the rest of the money on something else?


From looking at the current benchmarks out there, you might just wanna buy the game separately. The 290 is probably the best performing card out there for its price; the price/performance ratio is just that good, and the extra 180 bucks for 3-5 FPS is not worth it.

You can buy the game somewhere for 30-40 bucks (from those who own the 290X and are selling the code for it), and with the extra 100 bucks or so, buy an aftermarket cooler for the 290 so you reduce heat and noise significantly.


----------



## Hogesyx

Quote:


> Originally Posted by *Sazz*
> 
> from looking at the current benchmarks out there, might just wanna buy the game separately, 290 is probably the best performing card out there for its price, price/performance ratio is just that good for the 290 and the extra 180bucks for a few 3-5FPS is not worth it.
> 
> You can buy the game somewhere out there for 30-40bucks (from those who owns the 290X and selling the code for it) and the extra 100bucks or so, buy a aftermarket cooler for the 290 so that you would reduce heat and noise significantly.


You basically sum up my dilemma, bf4 + arctic extreme or 290X + free bf4.


----------



## Sazz

Quote:


> Originally Posted by *Hogesyx*
> 
> You basically sum up my dilemma, bf4 + arctic extreme or 290X + free bf4.


yeah, if I were you I'd stick with the 290, buy the game and get an arctic cooling Xtreme III cooler for it.


----------



## Forceman

Quote:


> Originally Posted by *Sazz*
> 
> yeah, if I were you I'd stick with the 290, buy the game and get an arctic cooling Xtreme III cooler for it.


Me too.


----------



## Sgt Bilko

I'm thinking about doing a water loop for my 290X (just the GPU atm)... what size rad should I be looking for?


----------



## rdr09

Quote:


> Originally Posted by *kdawgmaster*
> 
> Here's my Firestrike score with 3 cards: http://www.3dmark.com/fs/1177247
> 
> I have to say though that I'm having other issues besides the black screen problem.
> 
> For some reason in my 3-card setup, after playing games the system will crash (narrowed down to drivers), and it crashes the system so hard that I have to go through a reinstall process where I even have to use Driver Sweeper. The worst part is I always have one card error out on me in Device Manager, to where I have to remove the cards and plug them back in. I've gone through the process of trying each card individually to make sure they work, and they do, but I can't get this card re-enabled in the 3-card setup after the crash. I'm thinking this is a registry problem with Windows and the AMD drivers, but I can't confirm 100%.


Might be a PSU issue if you are still using the 1050W.


----------



## Sazz

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm thinking about doing a Water Loop for my 290x (Just the GPU atm).........what size Rad should i be looking for?


I would say in any loop 240mm should be the minimum now, to ensure you are getting the most out of your watercooling solution. Some people say 120mm per block or part being cooled, but I personally opt for 240mm per block. If it's just the GPU, your pump wouldn't need to be that strong; a variable speed pump would be great. You don't want it so powerful that the coolant doesn't spend enough time transferring heat through the rad, but you also don't want it so weak that the liquid doesn't move quickly enough to carry heat away from the card.

If you want a cheap solution, and if you have two spare bay slots, I'd go for the XSPC X20 bay pump+res combo, then buy a used radiator somewhere. EK XT and XSPC radiators work well and I have used both of them.

XSPC AX is by far the best I've used; it's not a HUGE radiator like the RX but performs just as well, and even better at low fan speeds. The RS is the cheapest but still performs great.
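Sazz's sizing rule above can be written down as a trivial helper (Python; the 240mm-per-block figure is his stated preference, not a hard spec):

```python
# Rule of thumb from the post above: at least 240 mm of radiator per
# water-cooled component (others run 120 mm/component as a bare floor).
def min_radiator_mm(num_blocks, mm_per_block=240):
    """Suggested total radiator length for a loop with num_blocks components."""
    return num_blocks * mm_per_block

print(min_radiator_mm(1))  # GPU-only loop -> 240
print(min_radiator_mm(2))  # CPU + GPU loop -> 480
```

So a GPU-only loop fits a single 240mm rad, and adding the CPU to the loop suggests 480mm total.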


----------



## Hogesyx

Quote:


> Originally Posted by *Sazz*
> 
> yeah, if I were you I'd stick with the 290, buy the game and get an arctic cooling Xtreme III cooler for it.


Quote:


> Originally Posted by *Forceman*
> 
> Me too.


I guess I'll stick with it then! Anyway Forceman, did you manage to unlock your shaders?


----------



## Sazz

Quote:


> Originally Posted by *Hogesyx*
> 
> I guess I still stick to it then! Anyway Forceman, did you manage to unlock your shaders?


From what I've seen in the 290-to-290X unlocking thread, PowerColor and XFX cards are so far the only ones people have been able to unlock; none from any other manufacturer. I decided to downgrade to a 290. I'm gonna sell my card (the 290X) to my friend once I get the 290, and do some stress testing on it to be sure it works before I put the waterblock on it. I will try to flash it to a 290X, but if it's not successful it still doesn't matter, since the 290 as it is, is already a great performing card.


----------



## rdr09

Quote:


> Originally Posted by *Sazz*
> 
> I would say in any loop 240mm should be the minimum now, to ensure you are getting the most out of your watercooling solution. Some people say 120mm/block or part that you are cooling but I personally opt for 240mm/block. if just the GPU then your pump wouldn't need to be that strong, variable speed pump would be great. You don't want it to be too powerfull that it don't spend enough time to transfer heat thru the rad while you don't want it to be weak that the liquid doesn't move quick enough to dissipate heat from the card.
> 
> if you wan't a cheap solution, and if you have two extra bay slots. I'd go for XSPC X20 bay pump+res combo, then buy a used radiator somewhere, EK XT and XSPC Radiators works well and I have used both of em.
> 
> XSPC AX is by far the best I've used, its not a HUGE radiator like RX but performs just as well and even better at low fan speeds, RS is the cheapest but still performs great.


^This. I agree with Sazz, 240mm would be ideal for the 290X. I only have a 120 rad with one fan, but my loop includes the CPU, which has its own 120 rad with 2 fans. My temp in BF4 stays below 60C, but I'd imagine during warmer weather it can go higher.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Sazz*
> 
> I would say in any loop 240mm should be the minimum now, to ensure you are getting the most out of your watercooling solution. Some people say 120mm/block or part that you are cooling but I personally opt for 240mm/block. if just the GPU then your pump wouldn't need to be that strong, variable speed pump would be great. You don't want it to be too powerfull that it don't spend enough time to transfer heat thru the rad while you don't want it to be weak that the liquid doesn't move quick enough to dissipate heat from the card.
> 
> if you wan't a cheap solution, and if you have two extra bay slots. I'd go for XSPC X20 bay pump+res combo, then buy a used radiator somewhere, EK XT and XSPC Radiators works well and I have used both of em.
> 
> XSPC AX is by far the best I've used, its not a HUGE radiator like RX but performs just as well and even better at low fan speeds, RS is the cheapest but still performs great.


Thanks for the info. I'm looking at these two atm: XSPC D5 and an XSPC 270 tube res.

And the rad choices: Alphacool NexXxoS Monsta 280mm (http://www.pccasegear.com/index.php?main_page=product_info&cPath=207_160_297_922&products_id=24486)
or a Phobya Xtreme Quad 480 radiator (http://www.pccasegear.com/index.php?main_page=product_info&cPath=207_160_297_922&products_id=25812)

I'd like to keep everything inside the case; I don't mind taking out a few HDD racks here and there.

EDIT: The reason I'm finally thinking about doing it is I was just playing a game and my core was at 91C.

Bear in mind that I'm using the Accelero Xtreme III here; VRMs were 91C and 73C... and that was with a very mild 1100/1300 clock at 100% fan speed... so yeah, it's time for water lol


----------



## Sazz

Quote:


> Originally Posted by *rdr09*
> 
> ^this. I agree with Sazz, 240mm would be ideal for the 290X. I only have a 120 rad and a fan but my loop includes the cpu and it has its own 120 rad with 2 fans. My temp in BF4 stays below 60C but I'd imagine during warmer weather it can go higher.


Yeah, I got two 240mm rads in my loop; the CPU is in the same loop as my GPU. The CPU is an FX-8350 at 5GHz at 1.55v, while the GPU stays at stock voltage at 1100/1375. Playing BF4 my core maxed out at 47C, while during a 15-minute Furmark stress test I reached 55C. Ambient temp is 25C and my fans are 4 Corsair SP120s (both radiators set up as push only).

Funny thing is, under water the 290X actually runs 2-3C cooler than the 7970 did. I had the same block on my 7970, and that thing reached 57C in Furmark at stock clocks and idled at 36C, while my 290X idles at 33C.


----------



## bjozac

Quote:


> Originally Posted by *kcuestag*
> 
> They were never locked, it's simple, there were no programs that supported the voltage regulators the new cards use as they were NEW into the market, just like the GTX780Ti has no voltage control right now because Afterburner needs an update.


So what overclocking potential can be expected using MSI Afterburner now with the stock BIOS on a Sapphire card?
And thanks for all the enlightening answers!


----------



## kcuestag

Quote:


> Originally Posted by *bjozac*
> 
> So what overclocking potential can be expected using MSI Afterburner now with the stock BIOS on a Sapphire card?
> And thanks for all the enlightening answers!


If you're lucky you'll hit 1200MHz on the Core, which is pretty good already.


----------



## maarten12100

Quote:


> Originally Posted by *kcuestag*
> 
> If you're lucky you'll hit 1200MHz on the Core, which is pretty good already.


Should be doable; mine was stable up to 1225 with +100mV. I tried the ASUS PT1 BIOS, but it didn't unlock voltage control in GPU Tweak, so I guess the AB hack is it.


----------



## rdr09

Quote:


> Originally Posted by *Sazz*
> 
> Yeah, I got two 240mm in my loop, CPU is in the same loop as my GPU, CPU is FX8350 at 5Ghz at 1.55v while GPU remains at stock voltage at 1100/1375. Playing BF4 my core maxed out at 47C while during furmark 15minute stress test I reached 55C. ambient temp at 25C and my fans are 4 Corsair SP 120 (both radiators only set up as push).
> 
> Funny thing is, under watercooling the 290X actually runs 2-3C cooler than 7970 under watercooling, I had same block on my 7970 and that thing reaches 57C with furmark run at stock clocks and idles at 36C while my 290X idle's at 33C.


Yes, temps look similar if not better than a watercooled 7970 (had one too, sitting in the closet), but this time I used CLU. It could also be that EK just did a magnificent job with these blocks. I only have 3 fans in my system - all for the rads.


----------



## basco

Hey maarten, did ya use this specific version of GPU Tweak? I think so:
http://www.mediafire.com/download/7g1in2cj3rik7is/GPUTweakVer2491_+20131017-Hawaii.zip

Just slapped my old Twin Frozr 5830 cooler on the 290 and hell, it's quiet now.
Used the VRM cooler from a Giga 470 SOC and a small alu heatsink.


----------



## Joeking78

Quote:


> Originally Posted by *bjozac*
> 
> So what overclocking potential can be expected using MSI Afterburner now with the stock BIOS on a Sapphire card?
> And thanks for all the enlightening answers!


I can hit 1200 core on mine, but I get the odd glitch in 3DMark... the best overclock for me seems to be 1175/1400 with additional volts. I'm still testing though, so there might be a little extra tonight.


----------



## SpewBoy

Quote:


> Originally Posted by *Scorpion49*
> 
> Anyone do any checking to see if PCI-E 2.0 vs 3.0 has any effect on these cards in CF?


Dunno if this has been answered in the millions of posts I've missed, but I've been testing with a 2600K while I wait for my replacement Z87 Maximus Hero to arrive (first one was faulty, of course). Benchmarks seem to be fine, but every game I've played so far (which isn't many atm - just Crysis 3, Black Ops, Ghosts, SimCity) has given worse performance when enabling CF. GPU usage fluctuates wildly between 100% and 0% for both GPUs, and the performance loss versus one card has been anywhere from a little bit to over 30%. Ghosts has serious CF issues for all cards (last I checked, it was not CF compatible). SimCity and Crysis 3, however, should scale extremely well in CF, but they just don't at all for me. I just get completely crazy GPU usage fluctuations that ruin the frame rate and cause stutter.

I haven't had the chance to actually test a game with a working CF profile with my 4770K (I did benches and they were fine, a little faster probably due to extra bandwidth and faster CPU) but hopefully tomorrow I can do just that when my replacement board arrives.

So far I am not a happy camper at all. Got everything ready to go under water, but mobo issues have delayed me for over a week, and these cards just don't want to behave unless they are being benched. I get 40 fps and crazy stutter in the first Crysis 3 level on medium, rendering it unplayable, so... yeah. Bit late to return the cards now and go for 780/Tis because the water blocks came from a million miles away. Just gonna have to ride this out and hope my issue goes away on PCIe 3 (not looking too likely) or AMD eventually fixes my problems (also not getting my hopes up).

EDIT: Oh, and I've disabled ULPS and all that jazz. Still runs like a dog.


----------



## rdr09

Quote:


> Originally Posted by *SpewBoy*
> 
> Dunno if this has been answered in the millions of posts I've missed but I've been testing with a 2600K while I wait for my replacement Z87 MAXIMUS Hero to arrive (first one was faulty, of course
> 
> 
> 
> 
> 
> 
> 
> ). Benchmarks seem to be fine but every game I've played so far (which isn't many atm, just Crysis 3, Black Ops, Ghosts, SimCity) has given worse performance when enabling CF. GPU usage fluctuates wildly between 100% and 0% for both GPUs and performance loss versus one card has been anywhere from a little bit to over 30%. Ghosts has serious CF issues for all cards (last I checked, it was not CF compatible). SimCity and Crysis 3 however, should scale extremely well in CF but they just don't at all for me. I just get completely crazy GPU usage fluctuations that ruin the frame rate and cause stutter.
> 
> I haven't had the chance to actually test a game with a working CF profile with my 4770K (I did benches and they were fine, a little faster probably due to extra bandwidth and faster CPU) but hopefully tomorrow I can do just that when my replacement board arrives.
> 
> So far I am not a happy camper at all. Got everything ready to go under water but mobo issues have delayed me for over a week and these cards just don't want to behave unless they are being benched. I get 40 fps and crazy stutter in the first Crysis 3 level on medium, rendering it unplayable so... yeah. Bit late to return the cards now and go for 780 /Tis because the water blocks came from a million miles away. Just gonna have to ride this out and hope my issue goes away on PCIe3 (not looking too likely) or AMD eventually fixes my problems (also not getting my hopes up).
> 
> EDIT: Oh, and I've disabled ULPS and all that jazz. Still runs like a dog.


Can you post a synthetic benchmark? Preferably 3DMark11.


----------



## bjozac

Quote:


> Originally Posted by *Joeking78*
> 
> I can hit 1200 core on mine but I get the odd gltich in 3Dmark...the best overlcock for me seems to be 1175/1400 with additional volts, I'm still testing tho so might be a little extra tonight.


I managed to get 1185/1500 stable; I will try my new card and see how it turns out.
With the ASUS BIOS I do 1200/1500 easy. I have not tried how far I can go - will wait for watercooling before that.


----------



## bjozac

What did we say about Hynix vs Elpida?

I got one Sapphire 290 (non-X) with each memory brand; will this affect CrossFire?
The one with Hynix seems faster at the same clocks (benchmark), but the Elpida seems to overclock better.

How do we provoke these black screens?


----------



## evensen007

Quote:


> Originally Posted by *SpewBoy*
> 
> Dunno if this has been answered in the millions of posts I've missed but I've been testing with a 2600K while I wait for my replacement Z87 MAXIMUS Hero to arrive (first one was faulty, of course
> 
> 
> 
> 
> 
> 
> 
> ). Benchmarks seem to be fine but every game I've played so far (which isn't many atm, just Crysis 3, Black Ops, Ghosts, SimCity) has given worse performance when enabling CF. GPU usage fluctuates wildly between 100% and 0% for both GPUs and performance loss versus one card has been anywhere from a little bit to over 30%. Ghosts has serious CF issues for all cards (last I checked, it was not CF compatible). SimCity and Crysis 3 however, should scale extremely well in CF but they just don't at all for me. I just get completely crazy GPU usage fluctuations that ruin the frame rate and cause stutter.
> 
> I haven't had the chance to actually test a game with a working CF profile with my 4770K (I did benches and they were fine, a little faster probably due to extra bandwidth and faster CPU) but hopefully tomorrow I can do just that when my replacement board arrives.
> 
> So far I am not a happy camper at all. Got everything ready to go under water but mobo issues have delayed me for over a week and these cards just don't want to behave unless they are being benched. I get 40 fps and crazy stutter in the first Crysis 3 level on medium, rendering it unplayable so... yeah. Bit late to return the cards now and go for 780 /Tis because the water blocks came from a million miles away. Just gonna have to ride this out and hope my issue goes away on PCIe3 (not looking too likely) or AMD eventually fixes my problems (also not getting my hopes up).
> 
> EDIT: Oh, and I've disabled ULPS and all that jazz. Still runs like a dog.


I had this exact same issue. The only thing that fixed it for me was a complete O/S reload. Utnorris had the same success after reloading.


----------



## SpewBoy

Quote:


> Originally Posted by *rdr09*
> 
> can you post a synthetic benchmark? preferrably - 3DMark11.


Ok, working on them as I type... Aagh, just got a black screen hahah. Gimme 20 mins. This is a temporary Windows 8 install with nothing on it but the minimum so bear with me... I'm used to Photoshop for managing screenshots.
Quote:


> Originally Posted by *evensen007*
> 
> I had this exact same issue. The only thing that fixed it for me was a complete O/S reload. Utnorris had the same success after reloading.


You mean, reinstall the OS? I've tried both Windows 8 and 8.1 fresh installs with no luck. Current install is Windows 8 (mouse issues in 8.1 have deterred me for now) and it is less than 3 days old with no hardware swap outs or driver updates during this time (all up to date drivers were installed once 3 days ago).


----------



## rass101

Just received my new build with an MSI 290; waiting for the new driver to be released. If anyone gets their hands on it, please let the forum know.


----------



## DarkZR9

Quote:


> Originally Posted by *Arm3nian*
> 
> To prove the flash made a difference we would need to see consistent results, with more of an increase than 2 fps. A difference that small can be within the margin of error. The correct way to test is under water so there's no throttling. You would need both a 290 and a 290X, find the differences, then flash the 290 and compare the results to the 290X.
> 
> The 290 is great, and more bang for the buck, but selling a 290X and going to a 290 in hopes of flashing and getting the same performance is just asking to be disappointed.


Sounds to me like you are just bitter about paying full price for two 290Xs when you could have gotten a couple of PowerColor 290s and saved yourself $300.00, while still ending up with a 290X after a BIOS flash.

And it's more than just the 2 fps that you keep pulling out of your behind; for me and many others the diff is around 4 fps, which is just about right, and you don't need the cards under water to prevent throttling - that's one of the stupidest things I have heard thus far in this conversation. Myself and many others are running the Accelero III cooler, which keeps the core under 60C at load while the hottest VRM only reaches 65C, so I can assure you that no throttling is taking place.

Not only that, but as someone else mentioned, we are smart enough to run identical clock speeds between both BIOSes during testing.


----------



## DarkZR9

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Thanks for the info, I'm looking at these two atm : XSPC D5 and a XSPC 270 Tube res
> 
> And the rad choices: Alphacool NexXxoS Monsta 280mm (http://www.pccasegear.com/index.php?main_page=product_info&cPath=207_160_297_922&products_id=24486)
> or a Phobya Xtreme Quad 480 Radiator (http://www.pccasegear.com/index.php?main_page=product_info&cPath=207_160_297_922&products_id=25812)
> 
> I'd like to keep everything inside the case, don't mind taking out a few HDD ranks here and there.
> 
> EDIT: Reason i'm finally thinking about doing it is i was just playing a game then and my core was at 91c
> 
> Bear in mind that i'm using the Accelero Xtreme III here, Vrm's were 91c and 73c.......that was with a very mild 1100/1300 clock and 100% fan speed......so yeah, it's time for water lol


I'm using 1.328 Vcore for 1200MHz core and get 93C load on VRM1 with the exact same cooler when playing Crysis 3 for a few hours, which seems to heat it up more than anything else I have played. In other games like BF4 it stays around 80C tops. With 1100MHz core all I need are stock volts, and VRM1 never goes past 65C in that case.


----------



## SpewBoy

Here are some Unigine benchmarks. In a fit of stupidity I forgot to set a custom fan profile, so the cards throttle down to as low as 925MHz.

And here is a 3DMark11 benchmark (on the Extreme preset) with a custom fan profile (no throttling):



CPU is a 2600K @ 4.5GHz. GPU usage ain't perfect, but it isn't going crazy like it does in games. Obviously 3DMark11 has gaps where the loading screens and non-GPU benches are. Tomorrow I should be able to do some tests with my Z87 board and try some more games. Got Far Cry 3, Skyrim, and Metro 2033 sitting around ready for some lovin', possibly a few others too. My Origin account got hijacked and EA support have told me they won't do a thing about it, so BF3 is out.


----------



## drdrache

Quote:


> Originally Posted by *DarkZR9*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Arm3nian*
> 
> To prove the flash made a difference we would need to see consistent results and by more of an increase then 2fps. Such low of a difference can be the percentage of error. The correct way to test is under water for no throttling. You would need both a 290 and 290x, and find the differences. Then flash the 290 and compare the results to the 290x.
> 
> 290 is great, and more bang for buck, but selling a 290x and going to a 290 in hopes of flashing and getting the same performance is just asking to be dissapointed.
> 
> 
> 
> Sounds to me like you are just bitter for paying full price for 2 290x's when you could have gotten a couple of powercolor 290's and saved yourself $300.00 in the end while still having a 290x after a bios flash.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And its more than just the 2 fps that you keep pulling out of your behind, for me and many others the diff is around 4fps which is just about right and you dont need the cards under water to prevent throttle, thats one of the stupidest things I have heard thus far in this conversation. Myself and many others are running the accelero 3 cooler which keeps the core under load at 60c while the hottest vrm is only reaching 65c so I can assure you that no throttle is taking place.
> 
> Not only that but as someone else mentioned we are smart enough to run identical clock speeds between both bios's during testing.
Click to expand...

Not to get too deep into this, but the issue we are all having is that no one (at least no one who has come out with graphs and proof - correct me if I am completely wrong) has benched a stock 290, a stock 290X, an "unlocked" 290, AND a non-unlocked 290 running a 290X BIOS, all on the same system, with cooling taken care of (water gets mentioned because it removes any question of VRM temps) so that no throttling is observed, and with all cards given the same repeatable testing. "Same clocks but it's faster" doesn't mean anything if it's still slower than a TRUE 290X on the same system.

4 FPS is nothing; I can get that by changing the speed on a case fan pointed in the right direction. I'm pretty sure you guys are lucky, and good for you, but other than conjecture about the speed testing, is it ACTUALLY working, or is the gain just from the BIOS?

PS - the "unlocking" is not true unlocking like in times past, because that's impossible; the chips were/are 290X chips, put into a 290 for a REASON.
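The A/B discipline being asked for here is easy to sanity-check numerically: repeat each run, compare means, and only trust a gain that clearly exceeds run-to-run spread. A minimal sketch (Python; the FPS lists are made-up placeholders, not anyone's real results):

```python
# Same card, same clocks, repeated runs under each BIOS.
# All numbers below are placeholder data for illustration only.
from statistics import mean, stdev

runs_290_bios  = [52.1, 52.4, 51.9, 52.3]   # avg FPS per run, 290 BIOS
runs_290x_bios = [54.2, 54.0, 54.5, 54.1]   # avg FPS per run, flashed 290X BIOS

gain = mean(runs_290x_bios) / mean(runs_290_bios) - 1.0
# Use the worse of the two sample spreads as a rough noise floor.
noise = max(stdev(runs_290_bios), stdev(runs_290x_bios)) / mean(runs_290_bios)

print(f"gain: {gain:.1%}, run-to-run noise: {noise:.1%}")
# Only call the flash real if the gain clearly exceeds the noise floor.
```

With the placeholder numbers the gain (~4%) dwarfs the run-to-run noise (~0.4%), which is the shape of evidence that would settle the argument either way.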


----------



## DarkZR9

Quote:


> Originally Posted by *drdrache*
> 
> 4FPS is nothing


4 fps is about the diff you can expect between an equally clocked R9 290 and R9 290X in Valley, and in many games the cards are that close when running the same clocks with no throttling, so saying 4 fps is nothing is just flat-out wrong, because it's about right where it should be.

I have benched Valley with both BIOSes on my own card, 4 times each, and the results were always consistent. This was using 1000 core and 1250 mem for BOTH BIOSes, in order to rule out the possibility that the increase was coming from clock speed.

Btw, changing the direction of your airflow isn't going to net you an extra 4 fps unless temps were an issue and the card was throttling, which for me and many others is not a problem, as we are not on stock cooling.
Quote:


> Originally Posted by *drdrache*
> 
> PPS - the "unlocking" is not true unlocking like times past, because that's impossible, the chips were/are 290x chips, put into a 290 for a REASON.


And I agree with that.


----------



## the9quad

Ok, update me to two Sapphire 290Xs in CrossFire (on air for the time being).


----------



## Warsam71

Quote:


> Originally Posted by *Warsam71*
> 
> Hello again everyone,
> 
> We are going to release a new driver tomorrow which fixes the black screen problem you've been experiencing.
> 
> Again, I want to thank everyone for your patience.


Hello everyone (@Arizonian)

I apologize, I just wanted to let you know we need a few more days to post the driver. I will make sure to keep you up-to-date on our progress...


----------



## TomiKazi

Finally got the card. It should've been here Saturday, but then the status update moved it to Monday, then Tuesday. Too bad the courier arrived three hours before the stated time frame. Damn, PostNL is weak...

Okay, enough whining, here she is:



Hope to be able to update it to water next week.


----------



## the9quad

Quote:


> Originally Posted by *Warsam71*
> 
> Hello everyone (@Arizonian)
> 
> I apologize, I just wanted to let you know we need a few more days to post the driver. I will make sure to keep you up-to-date on our progress...


AAAAAnd that's how it is done: you guys ran into a snag, and said "hey, we ran into a snag, it's gonna take a bit longer."

YEAH! Thanks for keeping us updated. I think I am with the 99.9999999999% of people who don't mind waiting for fixes as long as we know they will get fixed. Can't tell ya how much I appreciate it, bro.


----------



## fearthisneo

Got my waterblock today. Update me please.


----------



## bond32

Borrowed a watt meter from a professor today. Just threw it on and saw that during BF4 my setup with a single R9 pulled 384 watts. Much lower than I thought... Max load was during Firestrike - it got to around 430 watts during the combined test. This is with a 4770K heavily overclocked to 4.8GHz at 1.42 vcore; the R9 is stock.


----------



## rass101

Thanks for the update, Warsam71. Can anyone suggest what drivers to use for a new R9 290?


----------



## Warsam71

Quote:


> Originally Posted by *the9quad*
> 
> AAAAAnd that's how it is done, you guys ran into a snag, and said hey we ran into a snag, gonna take a bit longer.
> 
> YEAH! Thanks for keeping us updated. i think I am with 99.9999999999% of the people who don't mind waiting for fixes as long as we know they will get fixed. Can't tell ya how much I appreciate it bro.


No problem at all









And thank you in advance for your patience


----------



## redshirtx

Quote:


> Originally Posted by *rass101*
> 
> Thanks for the update Warsam71. Can anyone suggest what drivers to use for new r9 290?


You'll want either the most recent beta driver (13.11b9.2) or the 13.11 WHQL drivers. Both can be found in the same place.


----------



## the9quad

Quote:


> Originally Posted by *bond32*
> 
> Borrowed a watt meter from a professor today. Just threw it on, saw during BF4 my setup with a single R9 pulled 384 watts. Much lower than I thought... Max load was during firestrike, got to around 430 watts during the combined test. This is with a 4770k heavily overclocked to 4.8 ghz with 1.42 vcore, R9 is stock.


You think my 1200 watt would do 3 of the 290Xs? If so, do you think I'd get much performance gain from doing so? I don't want a new power supply and I don't want dual power supplies; if it isn't enough I will stick with just two cards, to be honest.


----------



## Forceman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Thanks for the info, I'm looking at these two atm : XSPC D5 and a XSPC 270 Tube res
> 
> And the rad choices: Alphacool NexXxoS Monsta 280mm (http://www.pccasegear.com/index.php?main_page=product_info&cPath=207_160_297_922&products_id=24486)
> or a Phobya Xtreme Quad 480 Radiator (http://www.pccasegear.com/index.php?main_page=product_info&cPath=207_160_297_922&products_id=25812)
> 
> I'd like to keep everything inside the case; I don't mind taking out a few HDD racks here and there.
> 
> EDIT: The reason I'm finally thinking about doing it is that I was just playing a game and my core was at 91C.
> 
> Bear in mind that I'm using the Accelero Xtreme III here; VRMs were 91C and 73C... and that was with a very mild 1100/1300 clock and 100% fan speed... so yeah, it's time for water lol


Bear in mind that the Monsta will require high static pressure fans (and really push-pull) to work best. You'll get almost the same performance from the UT60 with quieter fans.
Quote:


> Originally Posted by *bond32*
> 
> Borrowed a watt meter from a professor today. Just threw it on, saw during BF4 my setup with a single R9 pulled 384 watts. Much lower than I thought... Max load was during firestrike, got to around 430 watts during the combined test. This is with a 4770k heavily overclocked to 4.8 ghz with 1.42 vcore, R9 is stock.


I get about the same.


----------



## ReHWolution

Quote:


> Originally Posted by *Warsam71*
> 
> No problem at all
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And thank you in advance for your patience


Ah dang, I was waiting for the update these past few hours; well, I'm gonna wait like everybody else







Is there any way I can help you? I test video cards with A LOT of benchmarks and games, so I could be helpful for fixing something


----------



## DampMonkey

Quote:


> Originally Posted by *the9quad*
> 
> you think my 1200 watt would do 3 of the 290x's? if so do you think I'd get much performance gain from doing so? I don't want a new power suppy and I dont want dual power supplies, if it isn't enough i will stick with just two cards to be honest.


I think it could handle that. Don't count on record breaking overclocks though


----------



## the9quad

Quote:


> Originally Posted by *DampMonkey*
> 
> I think it could handle that. Don't count on record breaking overclocks though


I am just going to run stock; I just want 2560x1440 at 120 fps or so in BF4 @ ultra... I couldn't really care less about OC'ing the cards until much later, when they don't handle games how I want them to look @ stock.


----------



## bond32

Quote:


> Originally Posted by *the9quad*
> 
> you think my 1200 watt would do 3 of the 290x's? if so do you think I'd get much performance gain from doing so? I don't want a new power suppy and I dont want dual power supplies, if it isn't enough i will stick with just two cards to be honest.


A 1200 watt, good one at that, would be perfectly fine for 3xR9 290x's. Heck a good 1000 watt would likely be fine. I think people are overestimating the power consumption of these cards, especially when coupled with other components.

I personally have an EVGA 1000 watt g2 gold which I highly recommend. They have a platinum rated now which is a little more expensive. I think I paid somewhere in the $180 range for mine.


----------



## S410520

Quote:


> Originally Posted by *TooBAMF*
> 
> Comparing real 290 to real 290X under water is the ideal test but I don't know if we're ever going to get that. 800MHz shouldn't throttle on air, and clock speeds can be logged to verify that.
> 
> Different benchmarks can be used, like the texel fill feature test in 3DMark Vantage. The Anandtech 780Ti review shows the 290X being 15% faster than the 290 in that particular benchmark and only 1-5% faster in the other synthetics:
> 
> http://www.anandtech.com/show/7492/the-geforce-gtx-780-ti-review/13


This is very interesting, thank you for this information.

I will bench R9 290 crossfire;

- Both cards set to 800 MHz / 1250 MHz at stock voltage (GPU Tweak)
- I can keep temps below 80C easily (on air, cards are re-timed)
- Run the texel fill test 3 times for this bench.

- Do it all again with the asus.rom bios.

Any suggestions before I start?


----------



## kdawgmaster

Quote:


> Originally Posted by *the9quad*
> 
> I am just going to run stock, I just want 2560x1440p at 120 fps or so in BF4 @ ultra...I could really care less about OC'ing the cards until much later when they don't handle games how I want them to look @ stock.


I have 3 290Xs with my 2600K, which is bottlenecking them massively. Performance is fantastic, but there are a lot of fixes they need to do. For some reason my drivers will crash, forcing me to do a hard shutdown; after this the drivers will no longer respond, leaving me to driver-sweep them and then reinstall. After the reinstall I then have to make sure all my cards are enabled in Device Manager, and if they aren't I have to disable the non-working card and re-enable it, and even then that's not 100%. All cards test fine individually, so I know that's not the problem. :/

But BF4, depending on the map and how much action there is, is an easy 120 for sure, maxed at 1440p, as that's what I'm getting right now xD


----------



## Warsam71

Quote:


> Originally Posted by *ReHWolution*
> 
> Ah dang, I was waiting for the update in these hours, well I'm gonna wait as everybody will
> 
> 
> 
> 
> 
> 
> 
> Is there any way I can help you? I test videocards with A LOT of benchmarks and games, so I could be helpful for fixing something


That's very kind of you! Thank you for the offer!

I'm sure our Driver Team has had lots of feedback; that being said, I'm going to ask... you never know


----------



## maarten12100

Quote:


> Originally Posted by *basco*
> 
> hey maarten did ya use this specific version of gputweak? i think so:
> http://www.mediafire.com/download/7g1in2cj3rik7is/GPUTweakVer2491_+20131017-Hawaii.zip
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> just slapped my old twinfrozr 5830 cooler on 290 and hell its quiet now.
> used vrm cooler from giga 470 soc and small alu heatsink
> 
> 


I'll try that one before I return them
thanks


----------



## Arizonian

Quote:


> Originally Posted by *the9quad*
> 
> Ok update me to two Sapphire 290x's in crossfire (on air for the time being).
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated to two -









Quote:


> Originally Posted by *TomiKazi*
> 
> Finally got the card. Should've been here saturday. But then the status update moved it to monday, then tuesday. Too bad the deliverer arrived three hours before the stated time frame. Damn, PostNL is weak...
> 
> Okay, enough whining, here she is:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Hope to be able to update it to water next week
> 
> 
> 
> 
> 
> 
> 
> .


Congrats - added








Quote:


> Originally Posted by *fearthisneo*
> 
> Got my waterblock today. Update me please.
> 
> 
> Spoiler: Warning: Spoiler!


Updated








Quote:


> Originally Posted by *Warsam71*
> 
> Hello everyone (@Arizonian)
> 
> I apologize, I just wanted to let you know we need a few more days to post the driver. I will make sure to keep you up-to-date on our progress...


We thank you for keeping members in the loop here.


----------



## DarkZR9

Quote:


> Originally Posted by *Warsam71*
> 
> Hello everyone (@Arizonian)
> 
> I apologize, I just wanted to let you know we need a few more days to post the driver. I will make sure to keep you up-to-date on our progress...


LOL, I guess my crystal ball really does work. Thanks for keeping us up to date anyway; it's nothing personal, just something I have come to expect from the ATI driver team.


----------



## tsm106

Quote:


> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *the9quad*
> 
> you think my 1200 watt would do 3 of the 290x's? if so do you think I'd get much performance gain from doing so? I don't want a new power suppy and I dont want dual power supplies, if it isn't enough i will stick with just two cards to be honest.
> 
> 
> 
> A 1200 watt, good one at that, would be perfectly fine for 3xR9 290x's. Heck a good 1000 watt would likely be fine. I think people are overestimating the power consumption of these cards, especially when coupled with other components.
> 
> I personally have an EVGA 1000 watt g2 gold which I highly recommend. They have a platinum rated now which is a little more expensive. I think I paid somewhere in the $180 range for mine.

Do you have tri 290x to have the gall to say ppl are overestimating oc'd consumption?


----------



## kdawgmaster

Well, here we are; I guess you can submit me to the official owners club 3 times xD





Can't figure out how to rotate the images for some reason :/


----------



## bond32

Quote:


> Originally Posted by *tsm106*
> 
> Do you have tri 290x to have the gall to say ppl are overestimating oc'd consumption?


No, but you don't have to own 3; you can look at the numbers and see how people are overestimating.


----------



## hatlesschimp

I had 3 air-cooled, mildly overclocked Titans and a 4.8 GHz 3930K and maxed out a Corsair 1200i. On the watt meter I was pulling 1200 watts from the wall.


----------



## bond32

Quote:


> Originally Posted by *hatlesschimp*
> 
> I had 3 air fed and mild overclocked titans and a 4.8ghz 3930k and maxed out a Corsair 1200i. On the watt meter i was 1200watts from the wall.


I find that very hard to believe, as it contradicts many others' measurements of the Titan...


----------



## S410520

Please add me to the list
Managed to get 2 VTX3D R9 290 X-editions to unlock


----------



## kdawgmaster

Quote:


> Originally Posted by *bond32*
> 
> I find that very hard to believe as it contradicts with many other's measurements of the titan...


Where I work, we had 3 Titans completely fry what was a perfectly good Corsair AX1200i. We know this because we ran it in a test rig. They also draw a hell of a lot of power.


----------



## bond32

Quote:


> Originally Posted by *kdawgmaster*
> 
> Where I work we had 3 titans completely fry what was a perfectly good Corsair AX 1200I we know this because we ran it for a test rig they also draw a hell of alot of power.


Yes, they draw a lot, but to make a claim like that you need to put a number with it. A measured number, not an estimate or some software measurement. How much is "a hell of a lot"?

And don't say whatever Corsair's software measures, if they even have that feature.


----------



## tsm106

Quote:


> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Do you have tri 290x to have the gall to say ppl are overestimating oc'd consumption?
> 
> 
> 
> No, but you don't have to own 3 to look at the numbers to see how people are overestimating.

Now would be a good time for you to stop spouting misinformation.


----------



## hatlesschimp

Quote:


> Originally Posted by *bond32*
> 
> I find that very hard to believe as it contradicts with many other's measurements of the titan...


Oops, I had a GTX 660 in there as well. But still, no room, and in the end I had to stop overclocking.
3x Titans = at least a 1200i PSU, unless you don't want to overclock.
Maybe you should check out the Titan thread; they are doing crazy 400 watt overclocks now! It's a bit different from the locked-down 265 watt reviews that you're reading.


----------



## Arizonian

Quote:


> Originally Posted by *kdawgmaster*
> 
> Well here we are i guess we can submit me in to the official owners club 3 times xD
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cant figure out how to rotate the images for this for some reason :/


Congrats - added









Quote:


> Originally Posted by *S410520*
> 
> Please add me to the list
> Managed to get 2 VTX3D R9 290 X-editions to unlock
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Also, you're the first VTX3D on the list.


----------



## kdawgmaster

Quote:


> Originally Posted by *bond32*
> 
> Yes, they draw a lot, but to make a claim like that you need to put a number with it. A measured number, not an estimate or some software measurement. How much is "a hell of alot"?
> 
> And don't say whatever corsair's software measures, if they even have that feature.


We never used any software. This was a power supply that we had running boatloads of high-end customer systems with 2-4 7970s and even GTX 680s, only to eventually have 3 Titans make it bite the dust. I mean, it could have just been a coincidence.


----------



## bond32

Quote:


> Originally Posted by *tsm106*
> 
> Now would be a good time for you to stop spouting misinformation.


Then prove me wrong, show results. And can the condescending nonsense, I've seen your previous posts. Not going to waste my time with that.


----------



## tsm106

Quote:


> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Now would be a good time for you to stop spouting misinformation.
> 
> 
> 
> Then prove me wrong, show results. And can the condescending nonsense, I've seen your previous posts. Not going to waste my time with that.

I've already posted my consumption numbers in this thread. Since you are making ludicrous statements without proof or experience, not to mention not owning said setup, how can you tell others what is correct? lmao...


----------



## S410520

Thanks!

I love VTX3D, since my 7970 OC was very cheap, available early, and clocked great. Just like this (25 MHz OC) 290 X-edition!
(The cooler on the 7970 was plastic but performed great: silent, and not so heavy)

I can't believe nobody is trying out this "rookie" brand; they are high-grade builders imo.

After reading all the rumors I was very happy that I had bought the OC models; considering the throttling, I had a feeling they would both unlock, and voila.


----------



## hatlesschimp

Quote:


> Originally Posted by *bond32*
> 
> Then prove me wrong, show results. And can the condescending nonsense, I've seen your previous posts. Not going to waste my time with that.


Just because you are ignorant and a pain in the butt, I'm going to trawl through all my photos and see if I can find that picture of me drawing 1200 watts from the wall with 3 standard Titans and a GTX 660 doing only PhysX and audio. I'll be back lol


----------



## kdawgmaster

Quote:


> Originally Posted by *bond32*
> 
> Then prove me wrong, show results. And can the condescending nonsense, I've seen your previous posts. Not going to waste my time with that.


Quote:


> Originally Posted by *tsm106*
> 
> I've already posted my consumption numbers on this thread. Since you are making ludicrous statements without proof not experience not to mention not owning said setup, how can you tell others what is correct? lmao...


Can't we all just get along


----------



## bond32

Quote:


> Originally Posted by *tsm106*
> 
> I've already posted my consumption numbers on this thread. Since you are making ludicrous statements without proof not experience not to mention not owning said setup, how can you tell others what is correct? lmao...


Good reply, you really answered the questions well followed by solid results.

http://www.guru3d.com/articles_pages/geforce_gtx_titan_3_way_sli_review,4.html

They measured their 3-Titan setup pulling 843 watts. You're seriously going to tell me that when you overclock these you need a 1200 watt PSU minimum? LOL

Edit:
Quote:


> Originally Posted by *hatlesschimp*
> 
> Just because you are ignorant and a pain in the butt, im going to trawl through all my photos and see if i can find that picture of me drawing 1200watts from the wall with 3 standard titans and a gtx 660 doing only physx and audio. I'll be back lol


You quoted a post that wasn't even directed at you, and even resorted to name-calling? This thread is getting fun.


----------



## kdawgmaster

Quote:


> Originally Posted by *hatlesschimp*
> 
> Just because you are ignorant and a pain in the butt, im going to trawl through all my photos and see if i can find that picture of me drawing 1200watts from the wall with 3 standard titans and a gtx 660 doing only physx and audio. I'll be back lol


If there were 3 Titans, why the hell would you need a GTX 660 there for PhysX :/


----------



## jerrolds

Quote:


> Originally Posted by *bond32*
> 
> Then prove me wrong, show results. And can the condescending nonsense, I've seen your previous posts. Not going to waste my time with that.


While I agree with your assessment of TSM's previous posts lol, you should be the one providing proof of your claims. I've seen single 290X power measurements at around 385W, and CF at just under 800W... adding a 3rd can definitely go over 1000W imo, esp with big overclocks and aftermarket cooling.

http://www.hardocp.com/article/2013/11/01/amd_radeon_r9_290x_crossfire_video_card_review/9 - that's with a [email protected]

1200W will be very close for trifire 290X with an OCN-type system
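To make that napkin math concrete, here's a rough sanity-check in Python. All the wattages are assumptions loosely based on numbers quoted in this thread (roughly 300 W per overclocked 290X, 200 W for an overclocked CPU, 75 W for the rest of the system), not measurements:

```python
# Rough PSU-headroom estimate for a multi-GPU rig. The component
# wattages passed in below are assumptions, not measured values.

def estimated_dc_load(cpu_w: float, gpu_w: float, gpu_count: int,
                      other_w: float = 75) -> float:
    """Sum the estimated DC-side draw of the major components."""
    return cpu_w + gpu_w * gpu_count + other_w

def headroom(psu_rating_w: float, load_w: float) -> float:
    """Remaining capacity as a fraction of the PSU's rating."""
    return (psu_rating_w - load_w) / psu_rating_w

load = estimated_dc_load(cpu_w=200, gpu_w=300, gpu_count=3)
print(f"~{load:.0f} W load, {headroom(1200, load):.0%} headroom on a 1200 W unit")
```

With those guesses it lands around 1175 W, i.e. only a couple percent of headroom on a 1200 W unit, which is why "very close" is a fair call.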


----------



## maarten12100

Quote:


> Originally Posted by *maarten12100*
> 
> I'll try that one before I return them
> thanks


Didn't work; going with Afterburner.


----------



## hatlesschimp

Quote:


> Originally Posted by *bond32*
> 
> Good reply, you really answered the questions well followed by solid results.
> 
> http://www.guru3d.com/articles_pages/geforce_gtx_titan_3_way_sli_review,4.html
> 
> They measured their 3 titan setup to pull 843 watts. You seriously going to tell me when you overclock these you need a 1200 watt PSU minimum? LOL
> 
> Edit:
> You quoted a post that wasn't even directed at you, even further resulting in name calling? This thread is getting fun.


OK bond32, you have just proved yourself wrong. Those are all standard, non-overclocked Titan tests by reviewers.

Unfortunately I can't find my photo showing 1230 watts. But I do have proof that I did at least have 3 Titans and a watt meter, which ironically shows 195 watts lol. But nevertheless, I know what I saw and what I'm talking about. I, like you, was surprised when I finally found out why I couldn't overclock anymore and that I had hit the ceiling with my PSU.


----------



## kdawgmaster

Quote:


> Originally Posted by *jerrolds*
> 
> While I agree with your assessment of TSMs previous posts lol, you should be the one providing proof of your claims. I've seen single 290X power measurements to be around 385w, and CF at just under 800w...adding a 3rd can definately go over 1000W imo esp with big overclocks and aftermarking cooling
> 
> http://www.hardocp.com/article/2013/11/01/amd_radeon_r9_290x_crossfire_video_card_review/9 - thats with a [email protected]
> 
> 1200W will be very close for Trifire 290X with an OCN type system


I don't follow HardOCP much because of their unstandardized testing. In most of their tests the results are skewed, because they will set one thing to normal for one video card, and for the next card they are comparing it to they will have it maxed or even lower.

If you are going to compare one video card to another, it MUST be 100% the SAME settings and out-of-the-box experience to give people the best impression.


----------



## DarkZR9

Quote:


> Originally Posted by *kdawgmaster*
> 
> I dont follow Hardocp much because of their unstandardised means of testing. With most of their test they have skewed results because they will turn one thing on normal for one video card and then the next card they are comparing it to they will have it maxed or even lower.
> 
> If you are going to compare a video card to a video card it MUST BE 100% SAME settings and out of the box experience to give people the best impressions


Exactly, which is why I despise HardOCP.


----------



## bond32

Quote:


> Originally Posted by *jerrolds*
> 
> While I agree with your assessment of TSMs previous posts lol, you should be the one providing proof of your claims. I've seen single 290X power measurements to be around 385w, and CF at just under 800w...adding a 3rd can definately go over 1000W imo esp with big overclocks and aftermarking cooling
> 
> http://www.hardocp.com/article/2013/11/01/amd_radeon_r9_290x_crossfire_video_card_review/9 - thats with a [email protected]
> 
> 1200W will be very close for Trifire 290X with an OCN type system


I did see that review, and I literally just boxed up my meter as I have to return it, but I measured about the same as you for one card. Until I see a meter on someone's tri setup, I fail to see how a 1200 watt PSU will be "very close".
Quote:


> Originally Posted by *hatlesschimp*
> 
> OK bond32, you have just proved yourself wrong. Those are all standard, non-overclocked Titan tests by reviewers.
> 
> Unfortunately I can't find my photo showing 1230 watts. But I do have proof that I did at least have 3 Titans and a watt meter, which ironically shows 195 watts lol. But nevertheless, I know what I saw and what I'm talking about. I, like you, was surprised when I finally found out why I couldn't overclock anymore and that I had hit the ceiling with my PSU.


So, you claim I am proven wrong by posting a photo that tells us nothing about the total power draw of your 3-way Titan setup? Ok then? I guess we will just take your word for it?


----------



## tsm106

Guru consumption numbers... lol.









Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> Two cards on psu with my cpu and nothing else connected, ie. no drives, no loop, nothing else. Running 3dm11 extreme at stock with +50 on air, so it's throttling a bit as expected. 903w x .89 at that wattage = *803w.* This number should blow up with blocks and serious overclocking. Ya need to remember that on the stock bios PT is trying to keep the tdp of the cards very very low.
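The wall-reading-times-efficiency step in that quote is worth making explicit; here's a minimal sketch of the same arithmetic (903 W and 0.89 are simply the figures quoted above, and a real PSU's efficiency varies with load, so treat the result as an estimate):

```python
# Convert a watt-meter (wall/AC) reading into the DC load the PSU
# actually delivers: DC output = AC input * efficiency.
# 903 W and 0.89 are the figures from the quote above.

def dc_load_from_wall(wall_watts: float, efficiency: float) -> float:
    """Estimate the DC-side load from an AC wall reading."""
    return wall_watts * efficiency

print(f"{dc_load_from_wall(903, 0.89):.1f} W")  # prints "803.7 W"
```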


----------



## DarkZR9

Don't bother, denial will continue in the face of hard evidence. Can't stand that type of person...


----------



## sugarhell

I don't know why people want to play it clever and spread misinformation when their only proof is a bad review. I can't really give advice when I don't have first-hand experience or real evidence


----------



## hatlesschimp

Quote:


> Originally Posted by *kdawgmaster*
> 
> If there were 3 titans why the hell would u need a GTX660 there for physx :/












I had a unique situation and no one could work it out.


3x VG248qe monitors in surround portrait.
1x Accessory display
3x GTX Titans
1x Cambridge Audio AVR

The 3 monitors were connected by dual-link DVI and spanned in NVIDIA Surround. When I connected the HDMI out of the Titans to the AVR and then on to the VG278H, I was getting no audio. I went mad trying all the different ports, drivers, and settings in Windows, and nothing. So I thought I would buy a cheap GPU for the 4th slot and see if that could carry the audio, since it wasn't a part of the NVIDIA Surround setup. To my surprise it worked, and as an added bonus I selected it as the dedicated PhysX card, which also worked and got me a higher PhysX score in benchmarks.


----------



## bond32

Ok then, I guess this discussion is over. I legitimately wanted to see some numbers from real systems. Other than the photo you just posted, TSM, all I hear now is people telling me which reviews they like and don't like.
Quote:


> I dont know why people want to play it clever and spread misinformations when their only proof is a fail review. I cant really give an advice when i dont have first hand experience or real evidence


Then what's the problem with the review? Why is it a "fail review"? I couldn't care less about your _opinion_ of a review, but if you have some reason to believe their test is wrong, then by all means share it.

Edit: Let's take a step back here and think about what's being discussed. I made ONE statement, saying a 1000 watt PSU may _*likely*_ be enough and a 1200 watt PSU was plenty. That statement is now seen by some of you as "ludicrous misinformation"? I mean, really? It's like some of you are simply hunting around for people to start a flame war with.

Now let's also examine some other facts. If you are buying 3 of these cards, the price difference between a 1000 watt and a 1200 watt PSU is a drop in the bucket for you. I am simply curious about the amount of power they ACTUALLY pull, especially considering that in this very thread I have seen some people concerned that their 1000 watt PSU isn't enough for 2 R9 290X cards.


----------



## tsm106

Quote:


> Originally Posted by *hatlesschimp*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kdawgmaster*
> 
> If there were 3 titans why the hell would u need a GTX660 there for physx :/
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *I had a unique situation and no one could work it out.*
> 
> 
> 3x VG248qe monitors in surround portrait.
> 1x Accessory display
> 3x GTX Titans
> 1x Cambridge Audio AVR
> 
> the 3 monitors were connected by Dual Link DVI and spanned in nvidia surround. When I connected the HDMI out of the titans to the AVR and then onto the VG278H i was getting no audio. I went mad trying all the different ports, drivers and settings in windows and nothing. So I thought I would buy a cheap GPU for the 4th slot and see if that could carry the audio because it wasn't apart of the nvidia surround setup. And to my surprise it worked.

I had similar problems with an AVR as well. I switched to optical, no room at that time for another card with quad 7970s hehe.


----------



## jerrolds

Quote:


> Originally Posted by *kdawgmaster*
> 
> I dont follow Hardocp much because of their unstandardised means of testing. With most of their test they have skewed results because they will turn one thing on normal for one video card and then the next card they are comparing it to they will have it maxed or even lower.
> 
> If you are going to compare a video card to a video card it MUST BE 100% SAME settings and out of the box experience to give people the best impressions


I hear ya... here's another one from TweakTown: http://www.tweaktown.com/reviews/5853/his-radeon-r9-290x-4gb-in-crossfire-video-card-review/index23.html - 750W CF with an i7 [email protected]

Personally, I think I would do 1200W for trifire if I wasn't looking to overclock past stock voltages. Maybe 1100 MHz on the core max. Luckily I'm too poor to think about it.


----------



## hatlesschimp

Quote:


> Originally Posted by *tsm106*
> 
> I had similar problems with an AVR as well. I switched to optical, no room at that time for another card with quad 7970s hehe.


Yeah, I tried optical but got lost in the PCM mumbo jumbo, and I really wanted true 5.1 surround.


----------



## kdawgmaster

Quote:


> Originally Posted by *jerrolds*
> 
> I hear yea..heres another one from Tweaktown http://www.tweaktown.com/reviews/5853/his-radeon-r9-290x-4gb-in-crossfire-video-card-review/index23.html 750W CF with an i7 [email protected]
> 
> Personally i think i would do 1200W for Trifire if i wasnt looking to overclock past stock voltages. Maybe 1100mhz on the core max. Luckily im too poor to think about it.


Maybe I should throw in a 4th 290X to see if the 1300 watt can take it xD jkjk


----------



## hatlesschimp

Quote:


> Originally Posted by *kdawgmaster*
> 
> Maybe i should throw a 4th 290x to see if the 1300 watt can take it xD jkjk


Make sure you have a watt meter rolling and take a photo, otherwise we won't believe you.


----------



## kdawgmaster

Quote:


> Originally Posted by *hatlesschimp*
> 
> make sure you have a watt meter rolling and take a photo otherwise we wont believe you.


This is becoming too much work now :/

It's -30 out and now I have to go buy a watt meter, dammit all :'(


----------



## tsm106

Quote:


> Originally Posted by *hatlesschimp*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kdawgmaster*
> 
> Maybe i should throw a 4th 290x to see if the 1300 watt can take it xD jkjk
> 
> 
> 
> make sure you have a watt meter rolling and take a photo otherwise we wont believe you.

I pull over 1100W at a near-max overclock with two 290Xs and my 3930K at 5 GHz, and that's with no accessories, loops, or drives. I use two PSUs, which is why I can run just two GPUs + CPU like that. The thought of running OC'd quads, haha, I picture fry-guy time.


----------



## hatlesschimp

Also, we need you to get today's newspaper and have that in the photo as well.


----------



## skupples

Quote:


> Originally Posted by *brazilianloser*
> 
> So here is my new issue... Got a replacement card in and it's working great compared to the black-screen madness I had going on with my last one... but for some reason AB Beta 17 and 3DMark only show that the card has 3GB of RAM. Has anyone else noticed this on their card (ASUS 290)?
> 
> edit: on the other hand, GPU-Z does state that the card has 4GB as it should.


Titans only show 4 gigs in pretty much everything, and they've been out for a year


----------



## bond32

Quote:


> Originally Posted by *hatlesschimp*
> 
> Also we need you to get todays newspaper and have that in the photo as well.


Really? What are you trying to say? If you have some issue then please share, otherwise you are continuing to make yourself look like a fool.


----------



## hatlesschimp

You can never trust anyone these days! Especially people on the internet!

In 1938 a radio network ran a prank broadcast (Orson Welles' "War of the Worlds") saying aliens were invading, and people in the broadcast area went crazy, doing the random things you would do if aliens really did come to Earth. It was in the US, and it is a perfect example of why not to believe what others say without questioning its legitimacy.


----------



## kdawgmaster

Quote:


> Originally Posted by *bond32*
> 
> Really? What are you trying to say? If you have some issue then please share, otherwise you are continuing to make yourself look like a fool.


Quote:


> Originally Posted by *hatlesschimp*
> 
> You can never trust anyone these days! Especially people on the internet!


----------



## bond32

Quote:


> Originally Posted by *hatlesschimp*
> 
> You can never trust anyone these days! Especially people on the internet!


Then perhaps it may help if you reread my post. Here I will help:
Quote:


> Ok then, guess this discussion is over. I legit wanted to see some numbers from systems. Other than the photo you just posted TSM, all I hear are people telling of which reviews they like and don't like now.
> Quote:
> I dont know why people want to play it clever and spread misinformations when their only proof is a fail review. I cant really give an advice when i dont have first hand experience or real evidence
> 
> Then what's the problem with the review? Why is it a "fail review"? I couldn't care less about your opinion of a review, but if you have some reason to believe their test is wrong then by all means share.
> 
> Edit: Lets take a step back here and think about what's being discussed. I made ONE statement, saying a 1000 watt psu may likely be enough and a 1200 watt psu was plenty. That statement is now seen by some of you as "ludicrous misinformation"? I mean really? It's like some of you are simply hunting around to find people to start a flame war with.
> 
> Now lets also examine some other facts. If you are buying 3 of these cards, the price difference between a 1000 watt and a 1200 watt psu is a drop in a bucket for you. I am simply curious about the amount of power they ACTUALLY pull, especially considering in this very thread I have seen some people concerned how their 1000 watt psu to them, isn't enough for 2 R9 290X cards.


----------



## hatlesschimp

I can't be bothered reading. I'm sorry if I offended you. It's 5:37 am and I'm going to sleep.

Damn, I'll be lucky if I can sleep without finding that picture of the 1230 watts.


----------



## Jpmboy

I posted benchmark power draw for several common benchmarks in the Titan thread - I don't have it with me on this iPad. 2x SLI Titans and a 3930K @ 5.0 easily pull more than 1300 system watts, and would shut down a PC Power and Cooling 1200 W (read: the best PSU ever made). Will post measurements here later.

A very wise man once said: you ARE entitled to your own opinion... but not your own facts.


----------



## skupples

I really don't understand this ongoing war to discredit anyone coming forward with valid information on power draw.

TDP goes out the window and to the moon when overclocking.


----------



## tsm106

Quote:


> Originally Posted by *skupples*
> 
> I really don't understand this on going war to discredit anyone coming forward with valid information on power draw.
> 
> TDP goes out the window and to the moon when over clocking.


This is what happens when those who don't know argue with those who do.


----------



## Epsi

Received my new XFX R9 290 today; it also unlocked successfully.

Only downside is it has quite a bit of coil whine.

Gonna bench/play some more with it.


----------



## skupples

Quote:


> Originally Posted by *tsm106*
> 
> This is what happens when those who don't know, argue with those that do.


Folks seem to be hypersensitive to anything that could possibly equate to "bashing." These (and many other) GPUs use a ton of power when pushing serious clocks. Voltage alone does not increase power draw that much unless the clocks are there to use it... and so forth. Back to work for me.


----------



## bond32

Quote:


> Originally Posted by *tsm106*
> 
> This is what happens when those who don't know, argue with those that do.


So, you really just float around this thread waiting to start an argument? Sure seems that way.

Where did I argue with anyone? I made one statement which turned into all of this, mainly by you. I wanted to see facts/measurements/numbers. Yes, you posted a picture of a meter, with a value one would expect given the parameters you specified.

What I thought was a discussion turned into me being on the defensive for a claim I made, and I still haven't seen any evidence to prove that a 1200 watt PSU is not enough for 3 R9 290s, or Titans.

The individuals fueling the flames haven't provided any evidence against that claim other than, "Well, I own it and I know, so there, you couldn't possibly know otherwise."


----------



## lethal343

Hey guys, I've just got a question.

These are my specs:

PSU: Thermaltake Toughpower 1350W
MB: X79S-UP5-WiFi
CPU: i7 4960X
CPU COOLER: Corsair H100i
RAM: HyperX Beast 16GB 2133MHz
GPU: 2x R9 290X CF
STORAGE: 3x 1TB HDD + 256GB SSD

My question:
Is my Thermaltake 1350W power supply enough for a 3rd R9 290X?







(I can remove a HDD if needed)

Also, I have both my 290Xs in PCIe 3.0 x16 ports... if I were to throw in a third, my config would run at x16/x8/x16 (because I only have two x16 ports). How should I expect it to perform - lower than 2-way CF because of the x8, or with the other slots dropping to x8? Would I notice any fps increase in games or benchmarks?

Help is much appreciated, thanks heaps guys.


----------



## sugarhell

Quote:


> Originally Posted by *bond32*
> 
> So, you really just float around this thread, waiting to start an argument? Sure seems that way.
> 
> Where did I argue with anyone? I made one statement which turned into all of this, mainly by you. I wanted to see facts/measurements/numbers. Yes you posted a picture of a meter, with a value one would expect given the perimeters you specified.
> 
> What I thought was a discussion, turned out to me being on the defensive for a claim I made, and I still haven't seen any evidence to prove how a 1200 watt psu is not enough for 3 r9 290's, or titans.
> 
> The individuals fueling the flames haven't provided any evidence against that claim other than, " Well I own it and I know, so there you couldn't possibly know otherwise".


Maths: a 290X is ~300 W stock; add the +50% power limit and max TDP is ~450 W; multiply by 3 = profit.

Now add a 5 GHz SB-E and a big loop on top of that and you can get your number.
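That back-of-envelope estimate can be written out explicitly. A minimal sketch: the ~300 W stock board power and the +50% power-limit slider are the approximations used in the post above, not measured values.

```python
# Rough worst-case GPU power budget for tri-fire 290X, per the estimate above.
STOCK_TDP_W = 300          # approximate stock board power of one 290X
POWER_LIMIT_SLIDER = 0.50  # +50% power limit

per_card_max = STOCK_TDP_W * (1 + POWER_LIMIT_SLIDER)  # ~450 W per card
gpu_budget = per_card_max * 3                          # GPUs alone, before CPU and the rest

print(per_card_max, gpu_budget)
```

Note this is only the GPU side of the budget; the overclocked CPU, pumps, fans, and drives come on top of it.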


----------



## Jpmboy

Quote:


> Originally Posted by *bond32*
> 
> So, you really just float around this thread, waiting to start an argument? Sure seems that way.
> 
> Where did I argue with anyone? I made one statement which turned into all of this, mainly by you. I wanted to see facts/measurements/numbers. Yes you posted a picture of a meter, with a value one would expect given the perimeters you specified.
> 
> What I thought was a discussion, turned out to me being on the defensive for a claim I made, and I still haven't seen any evidence to prove how a 1200 watt psu is not enough for 3 r9 290's, or titans.
> 
> The individuals fueling the flames haven't provided any evidence against that claim other than, " Well I own it and I know, so there you couldn't possibly know otherwise".


Will post some data this evening.


----------



## tsm106

Let's look at this post again.
Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *the9quad*
> 
> you think my 1200 watt would do 3 of the 290x's? if so do you think I'd get much performance gain from doing so? I don't want a new power suppy and I dont want dual power supplies, if it isn't enough i will stick with just two cards to be honest.
> 
> 
> 
> A 1200 watt, good one at that, would be perfectly fine for 3xR9 290x's. Heck a good 1000 watt would likely be fine. I think people are overestimating the power consumption of these cards, especially when coupled with other components.
> 
> I personally have an EVGA 1000 watt g2 gold which I highly recommend. They have a platinum rated now which is a little more expensive. I think I paid somewhere in the $180 range for mine.
> 
> 
> Do you have tri 290x to have the gall to say ppl are overestimating oc'd consumption?

No experience with the setup, because he doesn't have the setup. Says people are overestimating their recommendations, yet he has no proof of his own. That reads like misinformation to me.

One cannot make statements like these and then demand that others prove him wrong.


----------



## Jpmboy

Quote:


> Originally Posted by *lethal343*
> 
> hey guys ive just got a question.
> 
> these are my specs:
> 
> PSU: Thermaltake TT TOUHGHPOWER 1350W
> MB: X79s-up5-WiFi
> CPU: i7 4960X
> CPU COOLER: Corsair H100i
> RAM: HYPER X BEAST 16GB 2133MHz
> GPU: 2X R9 290X CF
> STORAGE: 3X 1TB HDD + 256 SSD
> 
> my question:
> Is my thermaltake 1350W power supply enough for a 3RD R9 290X?
> 
> 
> 
> 
> 
> 
> 
> (I can remove a HDD if needed)
> 
> Also I have both my 290's in pcie 3.0 16x ports... if i was to throw in a third, my config would run in 16x,8x,16x (because i only have two 16x ports) How would I be expected to perform? Lower than 2 way CF? because of the 8x maybe setting the other slots to 8x? Would i notice any fps increase on games or benchmarks?
> 
> Help is much appreciated thanks guys heaps.


1350 W is plenty unless you intend to overclock/overvolt both the 4960X and the GPUs. You're okay if you only overclock that great CPU!


----------



## CriticalHit

Got my 2 HIS 290s in and have been testing at stock for the last 3 hours or so... no crashes / black screens so far... minimal coil whine in menus but not during gaming (I probably won't notice it once I close the case).

I'm running on PCIe 2.0 x8 on an old i7 860 @ 3.8 GHz... seems to be scaling OK, but for now I'm simply making sure it all works before playtime.

Waterblocks are here but, alas, no thermal compound... once that arrives I can have a bit more fun. Firestrike Extreme score X7348 (86th percentile), while my Performance score was in the 95th percentile at (P) 12144... everything is bulk stock until I play.

I'm gathering this is where they should be at stock (am I right?), which would mean my 860 and x8 PCIe 2.0 may not be much of a bottleneck after all.

http://www.3dmark.com/3dm/1675290?


----------



## CriticalHit

double


----------



## lethal343

Quote:


> Originally Posted by *Jpmboy*
> 
> 1350 is plenty unless you intend to overclock/overvolt the 4960 and the gpus. Okay if you only overclock that great CPU!


So I can only overclock one or the other, GPU or CPU? Or is just the CPU overclockable in that setup?

How do you think performance will go?

Thanks heaps buddy.


----------



## kdawgmaster

Quote:


> Originally Posted by *lethal343*
> 
> hey guys ive just got a question.
> 
> these are my specs:
> 
> PSU: Thermaltake TT TOUHGHPOWER 1350W
> MB: X79s-up5-WiFi
> CPU: i7 4960X
> CPU COOLER: Corsair H100i
> RAM: HYPER X BEAST 16GB 2133MHz
> GPU: 2X R9 290X CF
> STORAGE: 3X 1TB HDD + 256 SSD
> 
> my question:
> Is my thermaltake 1350W power supply enough for a 3RD R9 290X?
> 
> 
> 
> 
> 
> 
> 
> (I can remove a HDD if needed)
> 
> Also I have both my 290's in pcie 3.0 16x ports... if i was to throw in a third, my config would run in 16x,8x,16x (because i only have two 16x ports) How would I be expected to perform? Lower than 2 way CF? because of the 8x maybe setting the other slots to 8x? Would i notice any fps increase on games or benchmarks?
> 
> Help is much appreciated thanks guys heaps.


You should be fine as long as there's no massive OC. As for the PCIe layout, I think it will be 16/8/8. X79 CPUs only have 40 lanes to work with for GPUs, USB, sound cards and so on, so with 3 GPUs you would use 32 of them, leaving 8 lanes for the rest, unless you have a PLX controller, and I don't think that motherboard has one.
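As a rough illustration of that lane budget, here is a minimal sketch. The slot widths are typical layouts for boards without a PLX lane switch, assumed for illustration rather than read from this specific motherboard's manual.

```python
# Hypothetical sketch of splitting X79's 40 CPU PCIe lanes, assuming no PLX switch.
TOTAL_LANES = 40

# Typical slot widths by GPU count on switchless boards (assumed layouts).
configs = {1: [16], 2: [16, 16], 3: [16, 8, 8]}

for n, widths in configs.items():
    used = sum(widths)
    assert used <= TOTAL_LANES, "a real board cannot exceed the CPU's lane budget"
    print(f"{n} GPU(s): {widths}, lanes left for other devices: {TOTAL_LANES - used}")
```

With three GPUs this uses 32 of the 40 lanes, which matches the 8 lanes left over for everything else mentioned above.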


----------



## iGameInverted

Has anyone installed the Aquacomputer block on their card yet? Looking for some temps and overall and very general review of the block. I ordered one a couple of days ago but it will be some time before I get a chance to get my rig together.


----------



## bond32

Quote:


> Originally Posted by *tsm106*
> 
> Let's look at this post again.
> No experience with setup because he doesn't have the setup. Says ppl are overestimating their recommendations yet he has no proof of his own. That reads like misinformation to me.
> 
> One cannot make statements like these and demand that others prove him wrong.


And yet you have yet to provide any information other than, "I own it, he doesn't, so I know and he doesn't."

I'm done having this discussion. If anyone can put a meter on their setup, it would be cool to see the power draw. Maybe someone could make a thread and a spreadsheet or something...


----------



## Arizonian

Quote:


> Originally Posted by *Jpmboy*
> 
> Will post some data this evening.


Thank you.









I've seen this debate heat up before over how much wattage is required. I've seen posts in the Titan club where adding voltage can be too much even for an SLI setup on a 1200 watt PSU.

Anyway, let's all keep it civil; it's OK to disagree. We've got very credible members whose word I'd take without pictures as proof. I also understand if someone is only asking to see proof, and we should respect that. Don't get offended. Remember, you guys don't have anything to prove.

The way I see it, if someone has been given advice about how much wattage is required and goes contrary to it, then experience is sometimes the best teacher.

So let's just get along.









Play nice guys - using this post as my marker. Going to be away from my computer, without full access or time, for the next 10 hrs. If you post submissions I will get to them tonight.


----------



## tsm106

Quote:


> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Let's look at this post again.
> No experience with setup because he doesn't have the setup. Says ppl are overestimating their recommendations yet he has no proof of his own. That reads like misinformation to me.
> 
> One cannot make statements like these and demand that others prove him wrong.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yet you have yet to provide any information other than, "I own it, he doesn't, so I know and he doesn't know".
> 
> I'm done having this discussion. If anyone can put a meter on their setup, it would be cool to see the power draw. Maybe someone could make a thread and a spreadsheet or something...











I've already posted my draw with two stock 290Xs and an OC'd 3930: it was over 900 W at the wall. If you have no capacity to extrapolate, maybe you shouldn't make misinformed statements to begin with.

Max OC'd 7970s have similar consumption to a stock 290X. What we know, and have proof of since it's been posted numerous times and should be common knowledge by now, is that quad 7970s will draw well over 1600 W and trifire 7970s well over 1300 W. My own quad 7970s draw over 1800 W, and that's been posted here in the ATI forums multiple times.

Fact: the 290X starts at the 7970's max of 300 W. The implied consumption of a well-overclocked 290X should be obvious, no?


----------



## jerrolds

Quote:


> Originally Posted by *tsm106*
> 
> Let's look at this post again.
> No experience with setup because he doesn't have the setup. Says ppl are overestimating their recommendations yet he has no proof of his own. That reads like misinformation to me.
> 
> One cannot make statements like these and demand that others prove him wrong.


I think it's the way you initiate conversation that rubs people the wrong way.
Quote:


> Originally Posted by *tsm106*
> 
> Do you have tri 290x to have the gall to say ppl are overestimating oc'd consumption?


Quote:


> Originally Posted by *tsm106*
> 
> Now would be a good time for you to stop spouting misinformation.


Words like "gall" and "spouting" are pretty condescending and would automatically put a lot of people on the defensive. You could've just said, "I've measured over 1100 W with CF 290X without loops and at stock. 1000 W will not cut it." I'm not saying you're wrong... you were just kind of a jerk about it.









I agree that he should've been the one to provide proof of his initial claims.


----------



## tsm106

I don't care to hold hands with misinformers, shrugs. A simple cursory search of this thread would have brought up more than enough info, and any basic understanding of overclocked 7970s comes with loads of consumption numbers.


----------



## DarknightOCR

here is my sapphire

now with EK block



Temp idle : 31ºC

Temp run 3dmark clocks stock : 42ºC


----------



## Arm3nian

Quote:


> Originally Posted by *DarkZR9*
> 
> Sounds to me like you are just bitter for paying full price for 2 290x's when you could have gotten a couple of powercolor 290's and saved yourself $300.00 in the end while still having a 290x after a bios flash.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And its more than just the 2 fps that you keep pulling out of your behind, for me and many others the diff is around 4fps which is just about right and you dont need the cards under water to prevent throttle, thats one of the stupidest things I have heard thus far in this conversation. Myself and many others are running the accelero 3 cooler which keeps the core under load at 60c while the hottest vrm is only reaching 65c so I can assure you that no throttle is taking place.
> 
> Not only that but as someone else mentioned we are smart enough to run identical clock speeds between both bios's during testing.


Quote:


> Originally Posted by *DarkZR9*
> 
> Im using 1.328vc for 1200mhz core and get 93c load on vrm1 with the exact same cooler when playing Crysis 3 for a few hours which seems to heat it up more than anything else I have played. Other games like BF4 it stays around 80c tops.


Oh wow, 2 more fps than what I said, groundbreaking. To Forceman: I think he has a little trouble with math, considering a 5% increase at 60 fps is 2-4 fps, like I said.

Please just stop your hypocrisy. First you say your core never goes above 60C under load, and then you say your core temp is 93C, LOL.

By your logic there is no point in the 290X, because everyone can just get a 290 and flash it to a 290X with a 100% guarantee, right? Okay bro.


----------



## DarkZR9

Quote:


> Originally Posted by *jerrolds*
> 
> I think its the way you initiate conversation is what rubs people the wrong way
> 
> Words like gall and spouting is pretty condescending and would automatically put a lot of people on the defensive. You couldve just said "Ive measured over 1100W with CF 290X without loops and at stock. 1000W will not cut it". I'm not saying youre wrong..you were just kind of a jerk about it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I agree in that he shouldve beent he one to provide proof of his initial claims.


I agree. tsm106, I take it you don't work with the public, because you would probably get it handed to you if you talked down to people face to face the way you do here. Your delivery when communicating is lacking and only hurts your credibility. But...... carry on.


----------



## DarkZR9

Quote:


> Originally Posted by *Arm3nian*
> 
> Oh wow 2 more fps than what I said, groundbreaking. To forceman, I think he has a little trouble with math considering 5% increase at 60fps is 2-4fps, like I said.


And there is nothing groundbreaking between an equally clocked R9 290 and an equally clocked R9 290X. Point proven.........

Quote:


> Originally Posted by *Arm3nian*
> 
> Please just stop your hypocrisy. First you say your core never goes above 60c under load and then you say your core temp is 93c LOL.


Reading comprehension > you. Go back and reread what I said.

Quote:


> Originally Posted by *Arm3nian*
> 
> With your logic, there is no point in the 290x because everyone can just get a 290 and with 100% guarantee flash it to a 290x right?


Never said it was a 100% guarantee.
Quote:


> Originally Posted by *Arm3nian*
> 
> Okay bro.


One thing is for certain.. I ain't your bro.


----------



## Arm3nian

Quote:


> Originally Posted by *DarkZR9*
> 
> And there is nothing ground breaking between an equally clocked R9 290 vs an equally clocked R9 290x. Point proven.........
> Reading comprehension>you. Go back, re read what I said.
> Never said it was a 100% guarantee.
> One thing is for certain.. I ain't your bro.


1. Yeah, at what, clocks below stock?
2. This is due to your poor writing skills.
3. Actually, you did: "when you could have gotten a couple of powercolor 290's and saved yourself $300.00 in the end while still having a 290x after a bios flash."
4. Thank god.


----------



## Ukkooh

Quote:


> Originally Posted by *DarknightOCR*
> 
> here is my sapphire
> 
> now with EK block
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Temp idle : 31ºC
> 
> Temp run 3dmark clocks stock : 42ºC


How much rad area do you have?


----------



## Gralle

Hi all!

First-time poster, and I'd like to get into this exclusive club.









Sapphire Radeon R9 290X with EK-FC R9-290X - Acetal waterblock.

http://www.3dmark.com/fs/1166818


----------



## DarknightOCR

Quote:


> Originally Posted by *Ukkooh*
> 
> How much of rad area do you have?


I have an EK XT 360 in push-pull, for GPU and CPU.


----------



## ukaussi

Newegg just posted a 4-way R9 290X video which has a slide with total power for 1, 2, 3 and 4 cards, which may help answer some questions.
Naturally, none of the cards were OC'd, so that is likely the lowest power they will pull.


----------



## stilllogicz

3-way is close to 1000 watts, but that's with the rest of the system included. I guess a dedicated 1200 W PSU should hold up for tri-fire with overvolting.


----------



## sugarhell

Quote:


> Originally Posted by *stilllogicz*
> 
> 3 way is close 1000 watts but that's connected with the rest of the system. I guess a 1200 W PSU dedicated should hold over for tri fire overvolting.


Ask tsm. He is using 2 PSUs with 3 overvolted 290Xs.


----------



## HowHardCanItBe

cleaned
Carry on guys


----------



## the9quad

Quote:


> Originally Posted by *ukaussi*
> 
> Newegg just posted a 4-way R9 290X video which has a slide with total power for 1, 2, 3 and 4 cards which may help answer some questions.
> Naturally, none of the cards were OC'ed so that will likely be the lowest power they pull.


Thanks a ton. Looks like I will order another and should be fine for tri on 1200 W at stock speeds. +rep


----------



## kpoeticg

Quote:


> Originally Posted by *Gralle*
> 
> Hi all!
> 
> First time poster and I like to get in to this exclusive club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sapphire Radeon R9 290X with EK-FC R9-290X - Acetal waterblock.
> 
> 
> Spoiler: PiCS
> 
> 
> 
> http://www.3dmark.com/fs/1166818


Great job man!!! You should put together a build log for that beast
Quote:


> Originally Posted by *Arm3nian*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 1.Yeah at what, clocks below stock?
> 2.This is due to your poor writing skills.
> 3.Actually you did: when you could have gotten a couple of powercolor 290's and saved yourself $300.00 in the end while still having a 290x after a bios flash.
> 
> 
> *4.Thank god.*


















Quote:


> Originally Posted by *ukaussi*
> 
> Newegg just posted a 4-way R9 290X video which has a slide with total power for 1, 2, 3 and 4 cards which may help answer some questions.
> Naturally, none of the cards were OC'ed so that will likely be the lowest power they pull.


That's a great video for the differences between single, 2, 3, and 4 way CFX. Not just for power usage. Thanx +1


----------



## skupples

Quote:


> Originally Posted by *the9quad*
> 
> Thanks a ton, looks like I will order another and shouldbe fine for tri @ 1200 at stock speeds. +rep


Quote:


> Originally Posted by *ukaussi*
> 
> Newegg just posted a 4-way R9 290X video which has a slide with total power for 1, 2, 3 and 4 cards which may help answer some questions.
> Naturally, none of the cards were OC'ed so that will likely be the lowest power they pull.


His Eyefinity + Xfire results are extremely interesting, to say the least...

Also, those sandwiched 4-way setups HAVE to be throttling some.

Running PSUs @ 90% of capacity can be unhealthy. Let us know how you fare.

It's rather sad how much hate & flame flies through this thread on a daily basis. Seems they have to clean and/or lock it daily... That should tell us something about *our* behavior.


----------



## Jpmboy

Quote:


> Originally Posted by *lethal343*
> 
> So i can only overclock one or the other? that being GPU or CPU?
> or just the CPU is overclockable in that setup?
> 
> How do you think performance will go?
> 
> Thx heaps buddy.


It really depends on the extent of the overvolting (and the resulting overclocking). Your system even at stock will do most anything and play nearly every game at max settings and very high frame rates, with the possible exception of 4K and big surround.
Unless you unlock voltage and start benching like crazy, you're good to go with 2 GPUs and the best OC you can get within the TDP of the cards.
Watercooled? If not, no worries about pulling too much power; your cards will throttle.
Also, please use Rigbuilder and add your rig to your OCN signature.


----------



## CallsignVega

Wow, benching Titans and 290Xs on the 4K ASUS, I've pretty much found you don't need more than two cards even in modern games. Yeah, it's 8 million pixels, but you hardly need any AA, and even with high graphics settings it's pretty darn easy to maintain at least 60 FPS with these babies overclocked.

BTW, where are the anti-black-screen drivers that were supposed to release today?


----------



## sugarhell

Quote:


> Originally Posted by *CallsignVega*
> 
> Wow, benching Titans and 290x's on the 4K Asus I've pretty much found you don't need more than two cards even in modern games. Ya, it's 8 mil pixels but you hardly need any AA, and even with high graphics settings with these babies overclocked it's pretty darn easy to maintain at least 60 FPS.


Do you like the 290X?


----------



## CallsignVega

Quote:


> Originally Posted by *ukaussi*
> 
> Newegg just posted a 4-way R9 290X video which has a slide with total power for 1, 2, 3 and 4 cards which may help answer some questions.
> Naturally, none of the cards were OC'ed so that will likely be the lowest power they pull.


Oh, and what's up with some of that crossfire scaling? _Really bad_...


----------



## skupples

Quote:


> Originally Posted by *CallsignVega*
> 
> Wow, benching Titans and 290x's on the 4K Asus I've pretty much found you don't need more than two cards even in modern games. Ya, it's 8 mil pixels but you hardly need any AA, and even with high graphics settings with these babies overclocked it's pretty darn easy to maintain at least 60 FPS.
> 
> BTW where are the anti-black screen drivers that were suppose to release today?


Would love to see a breakdown comparison! End-user results seem to be a good bit different than reviewer results.


----------



## CallsignVega

Ya, would like to eventually post something, been so freaking busy.







I actually have to work to pay for toys, how lame...


----------



## ukaussi

Quote:


> Originally Posted by *ukaussi*
> 
> Newegg just posted a 4-way R9 290X video which has a slide with total power for 1, 2, 3 and 4 cards which may help answer some questions.
> Naturally, none of the cards were OC'ed so that will likely be the lowest power they pull.


For those that do not have time to watch the video, here is the screenshot from it showing power draw at STOCK settings.


----------



## FtW 420

Quote:


> Originally Posted by *DarkZR9*
> 
> I agree, tsm106 I take it you dont work with the public because you would probably get it handed to you if you talked down to people like this face to face the way that you do here. Your means of delivery when communicating is lacking and only hurts your credibility. But...... carry on.


He used to say the same thing more politely; the power consumption question comes up pretty often, though, and it is probably the 100th time he has posted overclocked/overvolted numbers. It does get frustrating repeating things sometimes, but at least he is trying to clear up some misinformation.
I used to post power numbers and literally quit replying to the "how much PSU do I need" threads after a while. It can get ugly with all the graphs from reviewers who don't overclock or overvolt getting thrown at you.
Quote:


> Originally Posted by *skupples*
> 
> his eyefinity+Xfire results are extremely interesting, to say the least...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> running PSU's @ 90% of capacity can be unhealthy. Let us know how you fare.
> 
> It's rather sad how much hate & flame fly's through this thread on a daily basis. Seems they have to clean & or lock it daily... That should tell us something as to *our* behavior.


Indeed! 80 or 90% isn't even bad, but hitting rig shutdown can mean the PSUs are going to 100% or more, which isn't healthy. They may be made to handle that in short bursts, but they're not meant to run sustained like that.


----------



## Jpmboy

Some data. PSU plugged into a Tripp Lite isolator on a "home run" dedicated 20A 120V line (not your US-standard 15A 120V wall socket).

Kill A Watt power measurements (at the PSU plug) = complete system draw for "ParkBench":

3930K @ 5.0 (1.523 V)
2x Titans SLI (svl7 v3 bios, soft voltmod, LLC=0)
50" 4K monitor (separate socket)

Bios = 220W (post and F2)
Boot = 500W (peak, W7 x64)
Idle = ~160-170W
Browser = ~300W
Super Pi = 340W
p95 (5 min per FFT, 8G ram) = 600W (597 +/-)
3DMark 11 (P) @ 875/3005 1.16V = 800-900W
3DMark 11 @ 1215/3602 1.3V = 1190-1220W !!!
Valley @ 1215/3602 1.3V = 950-1050W (1080p ExHD)
Firestrike @ 1215/3602 1.3V = 1050-1130W (default)

So... my 1200W PC Power & Cooling PSU is barely enough!!

Adding a second PSU (Seasonic 1080) to drive one card and some fans fixed any overcurrent shutdowns. Used an "add2psu" for the second power supply; it works flawlessly.
3DMark 11 @ 1280/3728 (P) will actually complete a run now! Note: my MB has PSU power-warning hardware/firmware; it would light up with one PSU when a heavy OC was on the CPU and SLI Titans, and I haven't seen that since adding a second PSU.
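Measurements like these translate into a quick headroom check. A minimal sketch follows; the 90% sustained-load threshold is a rule of thumb from this discussion, not a spec, and note that wall watts include PSU inefficiency, so comparing them against the DC rating somewhat overstates the true load.

```python
# Coarse PSU loading check based on measured wall draw vs. the unit's rating.
# Caveat: wall draw includes conversion losses, so this is pessimistic on purpose.
def psu_load(draw_watts, rated_watts):
    """Return (load fraction, verdict); thresholds are rules of thumb, not specs."""
    frac = draw_watts / rated_watts
    if frac >= 1.0:
        return frac, "over capacity - expect shutdowns"
    if frac > 0.9:
        return frac, "sustained load too high"
    return frac, "ok"

# Firestrike peak above (~1130 W at the wall) on a 1200 W unit:
print(psu_load(1130, 1200))
```

Running the peak 3DMark 11 figure (1220 W) through the same check lands in "over capacity," which matches the shutdowns described above.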


----------



## chiknnwatrmln

Would some 290X owners kindly run some benches at stock clocks? Over in the 290-unlocking thread we're trying to compare unlocked 290s to 290Xs but have nothing to compare them against.

Heaven 4.0 max settings @ 1080p would be nice, plus any 3DMark benches, a bench called ShaderToyMark, and another called CompuBench.

Please post your results here:

We would really appreciate it, we're just trying to fully understand what's going on with these unlocked 290's.

Please do stock clocks; you can do OCs as well, but please list the clocks. If anyone could also test 947 MHz (the 290's stock clock), that would be nice too. Also, make sure the card is not thermal throttling; that will throw off the results. Thanks.

http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/330#post_21229818


----------



## jerrolds

^ I have 3DMark stock results - I'll post 'em in the thread in a sec.


----------



## the9quad

Quote:


> Originally Posted by *CallsignVega*
> 
> Oh and whats up with some of those crossfire scaling, _really bad_..


Looks like it's OK in games, benches not so much, but games are fine with me.


----------



## FIX_ToRNaDo

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Would some 290x owners kindly run some benches at stock clocks? Over at the unlocking 290 thread we're trying to compare unlocked 290's to 290x's but have nothing to compare to.
> 
> Heaven 4.0 max settings @ 1080p would be nice, any 3dMark benches, a bench called ShaderToyMark, and another called CompuBench.
> 
> Please post your results here:
> 
> We would really appreciate it, we're just trying to fully understand what's going on with these unlocked 290's.
> 
> Please do stock clocks, you can do oc's as well but please list the clocks. If anyone could also test 947 MHz (290's stock clock) that would be nice too. Also, make sure the card is not thermal throttling, that will throw off the results. Thanks.
> 
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/330#post_21229818


I've been hopelessly trying to find some benches in the last 10 pages of this thread


----------



## CallsignVega

Quote:


> Originally Posted by *the9quad*
> 
> looks like it's ok in games, benches not so much, but games is fine with me.


I thought I'd seen some original 290X crossfire scaling at 90+% in game benchmarks. I don't think any of those in that review had that sort of efficiency.


----------



## Sgt Bilko

Looks like my water loop is getting put on hold... wife's motherboard just died.

Oh well.


----------



## zpaf

Nice boost just from memory bandwidth.


----------



## redshirtx

FWIW, I got X4383 off a 3DMark 11 run on my rig (completely stock) with my 290X and the 13.11 b9.2 driver. I should probably run it again for kicks under 13.11 WHQL...

EDIT: huh, X4611 with 13.11 WHQL. Who knew?


----------



## kdawgmaster

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Would some 290x owners kindly run some benches at stock clocks? Over at the unlocking 290 thread we're trying to compare unlocked 290's to 290x's but have nothing to compare to.
> 
> Heaven 4.0 max settings @ 1080p would be nice, any 3dMark benches, a bench called ShaderToyMark, and another called CompuBench.
> 
> Please post your results here:
> 
> We would really appreciate it, we're just trying to fully understand what's going on with these unlocked 290's.
> 
> Please do stock clocks, you can do oc's as well but please list the clocks. If anyone could also test 947 MHz (290's stock clock) that would be nice too. Also, make sure the card is not thermal throttling, that will throw off the results. Thanks.
> 
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/330#post_21229818


Here are my benchies with my R9 290X:

1 card: http://www.3dmark.com/fs/1047425 graphics score 10968
2 cards: http://www.3dmark.com/fs/1100725 graphics score 21240
3 cards: http://www.3dmark.com/fs/1177369 (had to OC the CPU due to a bottleneck with 3 cards) graphics score 30086
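Scaling efficiency for multi-card results like these can be sanity-checked with a quick calculation: divide the actual graphics score by the single-card score times the card count. A minimal sketch using the scores posted above:

```python
# CrossFire scaling check: actual graphics score vs. ideal linear scaling.
# Scores taken from the Fire Strike runs posted above.
scores = {1: 10968, 2: 21240, 3: 30086}

single = scores[1]
for cards in sorted(scores):
    # Ideal scaling would be the single-card score multiplied by the card count.
    efficiency = scores[cards] / (single * cards)
    print(f"{cards} card(s): {scores[cards]} -> {efficiency:.1%} of ideal")
```

With these numbers, two cards come out around 97% and three cards around 91% of ideal, which is why the CPU bottleneck mattered for the triple-card run.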


----------



## tsm106

Quote:


> Originally Posted by *CallsignVega*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ukaussi*
> 
> Newegg just posted a 4-way R9 290X video which has a slide with total power for 1, 2, 3 and 4 cards which may help answer some questions.
> Naturally, none of the cards were OC'ed so that will likely be the lowest power they pull.
> 
> Oh and whats up with some of those crossfire scaling, _really bad_..

That is a seriously bad review. Quad air-cooled 290Xs = throttle-all-the-time fun. It makes all the numbers questionable for real-world users. Unless you want to run air-cooled quad 7970s that are throttled all the time?

Did you note the horrific scaling in Valley? Obviously someone does not know how to create something as fundamental as a CrossFire profile for Unigine. I just ran this at stock clocks.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *kdawgmaster*
> 
> Heres my benchies with my R9 290x
> 
> 1 card http://www.3dmark.com/fs/1047425
> 2 cards http://www.3dmark.com/fs/1100725
> 3 cards http://www.3dmark.com/fs/1177369 ( had to OC CPU due to bottleneck with 3 cards )


Thanks, I'm gonna post this in the other thread if you don't mind.


----------



## kdawgmaster

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Thanks, I'm gonna post this in the other thread if you don't mind.


Nope, go ahead. You might want to make sure people are aware that it's done with a 2600K, which doesn't have PCIe 3.0 support, so there will be a bottleneck there; performance with a 3770K will probably be higher.


----------



## CallsignVega

Quote:


> Originally Posted by *tsm106*
> 
> That is a seriously bad review. Quad aircooled 290x = throttle all the time fun. It makes all the numbers questionable for real world users. Unless you want to run aircooled quad 7970s that are throttled all the time??
> 
> Did you note the horrific scaling in Valley? Obviously someone does not know how to create something as fundamental as a crossfire profile for unigine. I just ran this at stock clocks.


Ya, any serious user will water cool. I just find it funny: where do you go from "this GPU is supposed to run at 95C" to hard throttling? To me, when you say something is supposed to run at a temperature, it means the clock is supposed to stay at its rated frequency, not throttle down. Just marketing BS. Does anyone's card stay at 1 GHz stock core at a 95C core temp?

As for CrossFire profiles, you would think they would already be in there, unless you are talking about modifying the existing profile with a different config to work better.


----------



## tsm106

^^That setup was probably running under 800 MHz core clocks. Shakes head. You hit 95C, you lose. Shrugs... it is what it is at this point.

Profiles... there hasn't been a proper profile since Cayman, and that was over two years ago and it was for Heaven. We've been creating 1x1-optimized profiles that are not ideal, but it's all we have. If one is going to bench Valley without a profile, the score is a joke, as you can see from the video review.


----------



## Taint3dBulge

What happened to the "new drivers" that were supposed to be out today?


----------



## Scorpion49

Quote:


> Originally Posted by *iGameInverted*
> 
> Has anyone installed the Aquacomputer block on their card yet? Looking for some temps and overall and very general review of the block. I ordered one a couple of days ago but it will be some time before I get a chance to get my rig together.


I paid for two, and the shipping date has been pushed back twice now. I tried to contact them to see when they might actually ship, but they appear to be ignoring me. First and last time I ever order an Aquacomputer part.
Quote:


> Originally Posted by *Taint3dBulge*
> 
> What happened to the "new drivers" that were supposed to be out today?????????


Warsam said they will be a couple more days.


----------



## CallsignVega

Ya, I was waiting for these new drivers to publish 290X numbers. Maybe they will squeak something out right before they close for the day.


----------



## jerrolds

Lol Vega vs Tsm

/popcorn

I agree with Vega tho; this "designed to run at 95C" is pretty garbage imo: marketing's way of saying "yes, these coolers are passable; we needed a way to turn a profit while competing with Titans."

If two stock 290Xs throttle in most cases because of temperature, that's pretty bad design imo.


----------



## Sgt Bilko

Quote:


> Originally Posted by *skupples*
> 
> I think we need some one who actually knows something about how BIOS' work to weigh in on this debate.
> 
> What happens when you flash a 290X over to a 290 bios? Does it loose 4 FPS?


That would be a much better way to test the 290 unlock.

A 290X already has the full shader count, so flashing to a 290 BIOS would give you more relevant results.


----------



## sugarhell

It's not 1000 MHz stock. That's the huge mistake. The stock clock is 737 MHz.


----------



## kdawgmaster

Quote:


> Originally Posted by *Taint3dBulge*
> 
> What happened to the "new drivers" that were supposed to be out today?????????


For me there's still 9 hours left in the day :/ Keep in mind they always go by their time zone, not yours or mine.


----------



## Forceman

Quote:


> Originally Posted by *sugarhell*
> 
> Its not 1000 mhz stock. Thats the huge mistake. Stock clocks are 737 mhz


AMD's mistake. They shouldn't be advertising it as a 1000 MHz part if it can't maintain that speed under load out of the box.


----------



## kdawgmaster

Quote:


> Originally Posted by *sugarhell*
> 
> Its not 1000 mhz stock. Thats the huge mistake. Stock clocks are 737 mhz


The lowest my core clock has ever gotten is 900 MHz, and that's with 3 cards in the system.


----------



## sugarhell

Quote:


> Originally Posted by *Forceman*
> 
> AMD's mistake. They shouldn't be advertising it as a 1000 MHz part if it can't maintain that speed under load out of the box.


"Up to 1000 MHz" doesn't mean a fixed 1000 MHz. My first language is not English, but I think I can understand the difference between a 1000 MHz base clock and "up to 1000 MHz". I mostly blame the review sites, since every one of them lists the 290X as 1000 MHz stock.


----------



## kdawgmaster

Just did an overclock on my R9 290X trio and it resulted in a performance downgrade in 3DMark. The OC was 1075 core and 1350 RAM.


----------



## tsm106

Quote:


> Originally Posted by *kdawgmaster*
> 
> Just did an overclock to my R9 290x Trio and it resulted in a downgrade in performance for 3d mark. OC was 1075 core and ram 1350


Throttle...

With that many cards on air, it's impossible to avoid it.


----------



## kdawgmaster

Quote:


> Originally Posted by *tsm106*
> 
> Throttle...
> 
> That many cards on air, its impossible to avoid it.


With Open Hardware Monitor only one card throttled though; the other two stayed at 1075 and 1350. But yes, that is most likely the cause.

What I think AMD needs to do is allow the display card to downclock the RAM like earlier series could, such as the 7000 series. With the RAM always stuck at 1250, temps are going to be higher.

Doing this would give the cards more time to heat up, and in doing so they could ramp the fans up with the card's temperature rather than waiting for the magic 95C.
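Rather than eyeballing a monitoring window, throttling can be confirmed by parsing a sensor log after a benchmark run; tools like Open Hardware Monitor and GPU-Z can log sensors to CSV. A rough sketch of the idea (the column name and log layout here are assumptions for illustration, not a fixed spec; adjust them to whatever your tool actually writes):

```python
import csv
import io

def throttle_report(log_csv, clock_column, rated_mhz):
    """Count how many log samples fall below the rated core clock."""
    below = total = 0
    for row in csv.DictReader(io.StringIO(log_csv)):
        total += 1
        if float(row[clock_column]) < rated_mhz:
            below += 1
    return below, total

# Tiny fabricated log for illustration; a real log has one row per poll interval.
sample_log = """GPU Clock [MHz],Temperature [C]
1000,94
1000,95
890,95
860,95
"""

below, total = throttle_report(sample_log, "GPU Clock [MHz]", 1000)
print(f"{below}/{total} samples below rated clock")  # 2/4 for this sample
```

Any nonzero count during a sustained load means the run's scores were taken at reduced clocks, which is exactly what makes unthrottled comparisons tricky on these cards.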


----------



## Forceman

Quote:


> Originally Posted by *sugarhell*
> 
> Up to 1000 mhz doesnt mean fixed 1000 mhz.My first language is not english but i think i can understand the difference between 1000 mhz base clocks or up to 1000 mhz. I blame most the review sites that every one of them list the 290x as 1000 mhz stock


Newegg, Amazon, and everyone else advertise it as a 1000 MHz core clock. Compare that to how they advertise Nvidia cards. AMD may say "up to 1000", but that's not the way it's advertised. I put that on AMD for not making the distinction clear to reviewers, and to resellers. Nvidia didn't seem to have any problem getting their message across. If AMD wants to go with the throttle down approach instead of boost up, that's fine, but they still need to make it clear to people what they are getting.


----------



## Skrillex101

To be honest, I wish I could return my 290X. I don't really want to own this crap. It feels like I'm a beta tester for something that's pushed to its limit, with screws falling off everywhere along the way.

Too much time has passed though, and there's no returning it now. Sure, it's my problem for being too late. But seriously, buying a card that's pretty much the same as the 290 for $150 more is just ridiculous. The noise, the performance, the throttling of stock cards, the poor OC.

I know I'm no guru of PC hardware, but I do know when I've been cheated on a purchase, and this, guys, is like stealing our money; we're buying a product that's not fit for sale.

First and last time I buy an AMD product. I don't want to buy a car without wheels twice.

I'm forced, though, to keep looking for fixed, proper drivers for this card until I can afford a new one.

EDIT:
BTW, I'm new here, I know, but I can post pics of my card and name if you want proof, just so you know I'm not here just to troll. I've been lurking on forums for many years.


----------



## sugarhell

Wait wait wait. So you buy a card based on the Newegg spec? You don't do research? Reviewers knew about the new PowerTune and still didn't explain to anyone how it works. Do you think AMD didn't tell them how the new PowerTune works? Why does hardware.fr have the best review of PowerTune? Maybe because Guru3D has copy-pasted their reviews for the last two years and just adds the results to the graphs? Just because NVIDIA has a different boost doesn't mean AMD's boost works the same way. Also, do you mean how reviewers just report the base boost, for example on the 770, when the card boosts close to 1200?


----------



## bond32

Quote:


> Originally Posted by *Skrillex101*
> 
> To be honest, i wish i could turn my 290X back.. Dont really want to own this crap. It feels like im a Beta tester for something thats pushed to its limit with screws falling off everywhere along the way.
> 
> To much time have passed thou, and no more returning of it. Sure its my problem being out late. But seriously, buying a Card thats pretty much the same as the 290, for in $150 more, is just ridicolous. The noise, the performance, the downthrottle of stock cards, poor OC.
> 
> I know im no GURU of PC hardware, but i do know when i get cheated on a purchase.. and this guys is like stealing our money, were buying a product thats not fit for sale.
> 
> First and last time i buy AMD product for me. Dont want to buy a car without wheels twice..
> 
> Im forced thou to keep looking for fixed, and proper drivers for this card until i can afford a new one..
> 
> EDIT:
> BTW, im new here i know, but i can post pics of my card and name if you want proof, idk.. but i can if you guys want, just so you know im not here just to troll. Ive been lurking here on forums for many years


Guess you didn't get the memo from the thousands of reviews saying not to get the card unless you plan on aftermarket cooling, or to wait until the partners release their own versions.

If you don't have a water loop, I would look into the Arctic closed-loop cooler.


----------



## Skrillex101

Quote:


> Originally Posted by *bond32*
> 
> Guess you didn't get the memo from the thousands of reviews saying not to get the card unless you plan to have aftermarket cooling, or saying wait till the partners release their own versions.
> 
> If you don't have a water loop I would look into the arctic closed loop cooler.


I have already gotten the Accelero Xtreme III, but that doesn't change my point of view. The card still performs poorly for the cost and it's buggy as hell; at least now it's not insanely loud. And I got this card literally the day it came out, when there weren't many reviews, and most of the reviews were done, as they say today, with "fixed cards" sent to them.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Skrillex101*
> 
> I have already gotten the Accelero Xtreme III, but that doesnt change my point of view. Card still performs poorly for the cost and its buggy as hell, atleast now its not insanely loud. And i got this card litterally the day it got out, not many reviews, and most of the reviews was, as they say today "fixed cards" sent to them.


New hardware always has problems, etc. etc.

You could have waited a week or two and weighed up your options before forking out the cash.


----------



## Skrillex101

Quote:


> Originally Posted by *Sgt Bilko*
> 
> New hardware always has problems etc etc
> 
> you could have waited a week or two then weighed up your options before forking out the cash.


I agree, Sgt Bilko, that I should have waited, but that would have meant two months of waiting? Already I wish I had bought the 780 Ti; it's already much more promising and without these problems the 290X has. But in my work, for instance, I would never make something that almost worked and say "here you go, pay full price as if it worked, and maybe it will get fixed later."
But I'm not whining for a refund or anything; I'm merely stating that I'm very disappointed and wish I had returned it earlier. This is both to get my rage out and to warn other buyers.


----------



## kdawgmaster

Quote:


> Originally Posted by *Skrillex101*
> 
> To be honest, i wish i could turn my 290X back.. Dont really want to own this crap. It feels like im a Beta tester for something thats pushed to its limit with screws falling off everywhere along the way.
> 
> To much time have passed thou, and no more returning of it. Sure its my problem being out late. But seriously, buying a Card thats pretty much the same as the 290, for in $150 more, is just ridicolous. The noise, the performance, the downthrottle of stock cards, poor OC.
> 
> I know im no GURU of PC hardware, but i do know when i get cheated on a purchase.. and this guys is like stealing our money, were buying a product thats not fit for sale.
> 
> First and last time i buy AMD product for me. Dont want to buy a car without wheels twice..
> 
> Im forced thou to keep looking for fixed, and proper drivers for this card until i can afford a new one..
> 
> EDIT:
> BTW, im new here i know, but i can post pics of my card and name if you want proof, idk.. but i can if you guys want, just so you know im not here just to troll. Ive been lurking here on forums for many years


Here's my take on that.

Whenever a product gets released and you buy it in, say, the first month, you should have it in your head that you are a beta tester. We are the people who test the product and get its bugs known. There's nothing to say the R9 290X can't stay at a 1000 MHz core clock all the time; there is, however, something to say it won't happen the way AMD is going about this, and I've now posted a few ways around it.

I would say in your case this has nothing to do with being scammed; rather, you should have followed the posts and waited for the people who don't mind being beta testers to get the word out and for AMD to get the fixes out. Am I bummed that I can't fully use all of my R9 290Xs? Yes, quite bummed in fact. BUT I knew walking into this that I was nothing more than a beta tester, and I was fine with that.

HOWEVER, if AMD doesn't focus on the right things first, this could mean the end of me getting their cards, at least at launch. With the problems this card has (and performance, i.e. FPS, is not one of them right now), they should be working on the crashing and driver issues first and foremost (which it seems they are).

My theory on the 1000 MHz problem is simply that the card idles too hot. I believe this could be directly related to the RAM always running at 1250 MHz (but I could be wrong). The thing that gets me is that the display card always sits with the RAM clocks as high as they can go, when in past GPU series the VRAM would downclock. Why has that changed with this lineup? If the RAM could downclock, idle temps wouldn't be nearly as high and the card would take longer to heat up (meaning longer to reach 95C), and during that warm-up the cards could ramp fan speeds with temperature instead of waiting for the thermal limit. All of this together should, in theory, lower temps, extend shelf life, and reduce DOAs from thermal stress.


----------



## Stay Puft

Quote:


> Originally Posted by *hatlesschimp*
> 
> I had 3 air fed and mild overclocked titans and a 4.8ghz 3930k and maxed out a Corsair 1200i. On the watt meter i was 1200watts from the wall.


My AX1200 was fine even at 1300 W on the Kill A Watt. Most units can be pushed.


----------



## the9quad

Quote:


> Originally Posted by *Skrillex101*
> 
> I have already gotten the Accelero Xtreme III, but that doesnt change my point of view. Card still performs poorly for the cost and its buggy as hell, atleast now its not insanely loud. And i got this card litterally the day it got out, not many reviews, and most of the reviews was, as they say today "fixed cards" sent to them.


Soooooo what *exactly* are the issues you are having, if it isn't heat and noise? Some had or have the black-screen issue, myself included (nothing a driver update or an RMA won't fix); that is the only bug I am aware of and the only one I have seen...

Please explain what is so "buggy" and how the performance is so bad?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Skrillex101*
> 
> I agree sgt Bilko.. But in my work for instance, i would never make something that almost worked, and say here you go, pay full price as IF it worked, and maybe it will get fixed later.
> But im not whining for refund or anything, im merely stating that im very disappointed, and wish i had returned it earlier, this is both for me to get my rage out, and warn other buyers.


Well, my 290X including the AX III cost over $800, and I'm a little disappointed, but I'm not that bothered by it. The card isn't a dud and it's working fine; I just needed a better cooling solution to keep it running at a level I'm happy with.

I agree that the 290 is better value for money, but I don't have buyer's remorse over getting the 290X.

Maybe that's just me though.


----------



## Stay Puft

Quote:


> Originally Posted by *Skrillex101*
> 
> I have already gotten the Accelero Xtreme III, but that doesnt change my point of view. Card still performs poorly for the cost and its buggy as hell, atleast now its not insanely loud. And i got this card litterally the day it got out, not many reviews, and most of the reviews was, as they say today "fixed cards" sent to them.


What is performing like crap? Core temps should be super low due to the AX III. How are VRM temps? Did you use those crappy heatsinks Arctic supplies?


----------



## Skrillex101

Quote:


> Originally Posted by *kdawgmaster*
> 
> This is my saying to that.
> 
> When ever a product gets released and you are one to buy lets say month of you should have it in your head that you are the beta tester. We are the people who test the product and get it known for the bugs. Theres nothing to say that the R9 290x cant stay at a 1000mhz core clock all the time, there however is something to say it wont happen the way AMD is going about this for which iv posted now a few times ways around it.
> 
> I would say in your case this has nothing to do with the feel of being scammed but rather you should have stayed with the posts and waited for the people who dont mind being beta testers get their word out and AMD gets the fixes out. Am i bumbed that i cant fully use all my R9 290x? Yes rather quite bumbed infact. BUT I knew walking into this that i was nothing more then a person who was a beta tester and I was fine with that.
> 
> HOWEVER if AMD dosnt focus on the right things first then this could mean the end of me getting their cards atleast on launch my self. A card like this with the problems it has ( and performance AKA FPS not being one of them right now ) they should be working on the crashing problems and driver issues first and foremost ( which it seems they are )
> 
> My idea of solving this 1000mhz problem is simply the card idles to high for temps. This I believe could be in direct relation with the ram always being at 1250mhz ( but i could also be wrong ) The thing that gets me is the display card will always site with the ram clocks as high as they can go when in past GPU series the Vram would down clock. My question is why has that changed with this line up? I think if the ram were able to down clock to something lower idle temps wouldnt be near as high and the time it takes the card to heat up will be increased ( for those not to get confused meaning the card will take longer to get to the 95C ) and then in this warming up process the cards would also adjust the fan speeds on the fly and not when it reaches its thermal limit. All this put together in theory should make the cards lower in temps and would also allow for a greater shelf life and less DOA's due to thermaling the card.


I have not bought many newly released hardware items, but I did expect it to work. And just to clarify, I am running the card at full speed now after installing the Accelero. I got it OC'd a bit to 1095, with 60C under full load. VRM1 is mainly the temp problem.

It still leaves me disappointed, and yes, I should have waited. But IF it really is the case that we're the "beta" testers, couldn't it be mentioned, hehe.

Also, why is AMD selling the 290X a few days before the 290? We all jump on the wagon buying it, and a few days later the 290 guys come saying "HAHA, our card works just as well as yours for much less."

Idk, I'm just angry.


----------



## skupples

Quote:


> Originally Posted by *sugarhell*
> 
> Wait wait wait. So you get a card based on the spec of newegg? You dont do a research? Reviewers knew the new powertune and they still didint inform anyone how this work. Do you think that amd didint tell them how the new powertune works? Why hardware.fr has the best review about powertune? Maybe because guru3d copy paste their reviews the last 2 years they just add the results on the graphs?Because nvidia has a different boost doesnt mean amd boost is the same. Also do you mean how revieweres just report the base boost for example 770 when the card boost close to 1200?


The mass of people buying these products do not take the time to do thorough research. They go directly to the price/performance charts and pick one.


----------



## sugarhell

Quote:


> Originally Posted by *skupples*
> 
> The mass population of people buying these products do not take the time to do thorough research on the product. They go directly to the price/performance ratio charts & pick one.


That's wrong, and it's foolish. I'm not going to buy a $100k car without proper research.


----------



## Stay Puft

Quote:


> Originally Posted by *Skrillex101*
> 
> I have not bought many new release hardware items. But i did expect it to work. and just to Clarify i am running the Card on full speed now after i installed the Accelero. i got it OC´d a bit to 1095 with 60c under full load. VRM1 is mainly the temp problem.
> 
> Still leaves me disapointed, and yes i should have waited. But IF it really is the case that were the "BETA" testers, couldnt it be mentioned hehe.
> 
> Also why are AMD selling 290X a few days before the 290, and then we all jump on the wagon buying it, and a few days later the 290 guys come "HAHA" our card works just as good as yours for much less.
> 
> Idk, im just angry


To cure your VRM issues you'll need two sets of these

http://www.frozencpu.com/products/7190/vid-105/Enzotech_MOS-C1_MOSFET_Heatsinks_-_65mm_x_65mm_x_12mm_-_10_Pack.html?tl=g40c16

And this

http://www.frozencpu.com/products/3770/thr-06/Arctic_Silver_Adhesive_Premium_Silver_Thermal_Epoxy_-_70_Gram_ASTA-7G.html

For better memory cooling I highly recommend these

http://www.frozencpu.com/products/15573/vid-184/Akust_Copper_Memory_Chip_Heatsink_-_13mm_x_12mm_x_5mm_-_4_Pack_RS00-0602-AKS.html?tl=g40c16


----------



## Skrillex101

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well my 290x including the the AX III cost over $800 and im a little disappointed but im not that bothered by it. The card isnt a dud and its working fine I just needed a better cooling solution to keep it running at a level im happy with.
> 
> I agree that the 290 is better value for money but I dont have buyers remorse by getting the 290x
> 
> Maybe thats just me though


Well, I'll learn from my mistakes. I just think it's weird that we accept buying stuff that isn't working properly. I know you guys say it's a secret "beta" test, but when did that become okay? Aren't we paying for them to have done the beta test already, giving us a final version that works?

Maybe I'm all wrong on that, but that's how I feel.

Quote:


> Originally Posted by *Stay Puft*
> 
> What is performing like crap? Core temps should be super low due to the AXIII. How are VRM temps? Did you use those crappy heatsinks arctic supplies?


Core temp is excellent now with the $100 cooler added. VRM is a bit high, but not so high that it can't take it. I'm still disappointed by the OC ability, that the 290 almost beats it, and the beta drivers that work 50/50; I've got my reasons. But as others said... I'm a beta tester, haha. I still have every right to be disappointed.


----------



## kdawgmaster

Quote:


> Originally Posted by *Skrillex101*
> 
> I have not bought many new release hardware items. But i did expect it to work. and just to Clarify i am running the Card on full speed now after i installed the Accelero. i got it OC´d a bit to 1095 with 60c under full load. VRM1 is mainly the temp problem.
> 
> Still leaves me disapointed, and yes i should have waited. But IF it really is the case that were the "BETA" testers, couldnt it be mentioned hehe.
> 
> Also why are AMD selling 290X a few days before the 290, and then we all jump on the wagon buying it, and a few days later the 290 guys come "HAHA" our card works just as good as yours for much less.
> 
> Idk, im just angry


But there is a problem with the R9 290, and this goes back to the 6900 series too. An R9 290 is a chip that failed a key part of 290X qualification but passed the second QC and became a 290. If you feel the need to trust failed hardware and flash it, go ahead, but if it fails, you have only yourself to blame. It was the same with the 6950 being flashed to a 6970. I myself would NEVER buy something that failed the primary QC at a lower price on the off chance the unlock works.


----------



## Skrillex101

Quote:


> Originally Posted by *Stay Puft*
> 
> To cure your VRM issues you'll need two sets of these
> 
> http://www.frozencpu.com/products/7190/vid-105/Enzotech_MOS-C1_MOSFET_Heatsinks_-_65mm_x_65mm_x_12mm_-_10_Pack.html?tl=g40c16
> 
> And this
> 
> http://www.frozencpu.com/products/3770/thr-06/Arctic_Silver_Adhesive_Premium_Silver_Thermal_Epoxy_-_70_Gram_ASTA-7G.html
> 
> For better memory cooling i highly recommend these
> 
> http://www.frozencpu.com/products/15573/vid-184/Akust_Copper_Memory_Chip_Heatsink_-_13mm_x_12mm_x_5mm_-_4_Pack_RS00-0602-AKS.html?tl=g40c16


Cooling is not a problem anymore with the Accelero X3, but thanks for the links.


----------



## Stay Puft

Quote:


> Originally Posted by *Skrillex101*
> 
> Maybe im all wrong on that, but thats how i feel.
> Core temp is excellent now with the $100 cooler added. VRM a bit high, but not to much that it cant take it. Still disappointed by the OC ability, that the 290 almost beats it, beta drivers that work 50/50, i got my reasons. But as others said.. im a beta tester haha. I still got my fair rights to being disappointed.


Exact reason I sold my 290Xs. They just weren't worth it once the 290s were released at $179 cheaper. I might buy a 290 on Friday to play around with.


----------



## eternal7trance

Quote:


> Originally Posted by *Stay Puft*
> 
> Exact reason i sold my 290X's. They just weren't worth it when the 290's were released 179 dollars cheaper. I might buy a 290 on friday to play around with.


Yeah, between the two cards, a 290X just doesn't seem worth it.


----------



## the9quad

Quote:


> Originally Posted by *Skrillex101*
> 
> I have not bought many new release hardware items. But i did expect it to work. and just to Clarify i am running the Card on full speed now after i installed the Accelero. i got it OC´d a bit to 1095 with 60c under full load. VRM1 is mainly the temp problem.
> 
> Still leaves me disapointed, and yes i should have waited. But IF it really is the case that were the "BETA" testers, couldnt it be mentioned hehe.
> 
> Also why are AMD selling 290X a few days before the 290, and then we all jump on the wagon buying it, and a few days later the 290 guys come "HAHA" our card works just as good as yours for much less.
> 
> Idk, im just angry


I still don't get where the beta drivers fail 50% of the time; fail how? Sure, you can be disappointed and unhappy with your purchase, that's your right, but come on, the drivers aren't an issue and the cards aren't buggy. There is a percentage of cards with the black-screen issue, but other than that, nada.


----------



## Raxus

So Newegg got my RMA for my 290X; I'm torn between buying another 290X or a 780 Ti, even though I despise NVIDIA.


----------



## Skrillex101

Quote:


> Originally Posted by *the9quad*
> 
> I still dont get where the beta drivers fail 50% of the time, fail how? Sure you can be disappointed and unhappy with your purchase that is your right, but come on the drivers aren't an issue and the cards arent buggy. There is a percentage of cards with the blackscreen issue, but other than that,nada.


Yeah, sorry, I meant 50/50 as a figure of speech. I too have black-screen problems, though not as often as others. Of course it's not 50% of the time, but it's pretty damn annoying.


----------



## stilllogicz

Quote:


> Originally Posted by *sugarhell*
> 
> This is wrong. And its stupid. I will not gonna buy a 100k car without proper research


Apples to oranges. You can't compare a $100k car to a $500 GPU. And it's true, most people will not do the research and will just look at the charts, fully expecting a 100% working item. I'd guess most of the community on OCN would be wise enough to DO research through past experiences. I know I'm that way; I research everything regardless of price.

I'm pretty sure 99% of regular gamers who aren't enthusiasts wouldn't expect a brand new card to have heat/throttling problems preventing it from running properly on stock settings. This is how companies get a bad rep. Negativity spreads faster than positive rep. AMD _should_ have done better.

For me it's not a problem, because I expect things to have problems; I've built many PCs, both air cooled and watercooled. As for the average joe, I can sympathize with his frustration.


----------



## sugarhell

The average joe should start with a mid-range GPU like a 280X/270X. Without knowledge you shouldn't start at the high end; that's the point of the $100k car analogy. If you don't know the fundamentals, then you shouldn't buy a $500-600 GPU. Most people, instead of troubleshooting, come here to find the easy solution (I don't mean the black screen) instead of searching for the problem. We all know that you only buy an AMD reference card to watercool, yet look how many got a reference 290 to stay on air. When there is a problem we can easily say the drivers suck, or Nvidia sucks, or AMD sucks. We never say that maybe I failed somewhere and I need to find the reason.


----------



## Skrillex101

Quote:


> Originally Posted by *sugarhell*
> 
> The average joe should start with a mid-range GPU like a 280X/270X. Without knowledge you shouldn't start at the high end; that's the point of the $100k car analogy. If you don't know the fundamentals, then you shouldn't buy a $500-600 GPU. Most people, instead of troubleshooting, come here to find the easy solution (I don't mean the black screen) instead of searching for the problem. We all know that you only buy an AMD reference card to watercool, yet look how many got a reference 290 to stay on air. When there is a problem we can easily say the drivers suck, or Nvidia sucks, or AMD sucks. We never say that maybe I failed somewhere and I need to find the reason.


I did a lot of research, but no review or article mentioned black screens, poor performance, or review cards tweaked to look better in benchmarks. Nor did any mention poor drivers or AMD's slow response to critical errors in mid-October.

You're almost saying that we should accept buying 95%-finished hardware. And that, my sir, is the WRONG mentality!

We should not settle. We're the consumers; it's us buying their product, and we should expect it to be 100% ready. It's up to them to do the quality control. That may not be how it works, but it's how it should work.

You're lying to yourself if you think AMD didn't know about the black screen issue pre-launch. They knew...


----------



## kdawgmaster

Quote:


> Originally Posted by *sugarhell*
> 
> The average joe should start with a mid-range GPU like a 280X/270X. Without knowledge you shouldn't start at the high end; that's the point of the $100k car analogy. If you don't know the fundamentals, then you shouldn't buy a $500-600 GPU. Most people, instead of troubleshooting, come here to find the easy solution (I don't mean the black screen) instead of searching for the problem. We all know that you only buy an AMD reference card to watercool, yet look how many got a reference 290 to stay on air. When there is a problem we can easily say the drivers suck, or Nvidia sucks, or AMD sucks. We never say that maybe I failed somewhere and I need to find the reason.


The problem, though, is that the average joe relies on their salesperson to make some of those calls for them. I work in computer sales where I live, and we get a lot of people saying "this is my first real gaming rig and I want something around the 3K mark." I tell them it's not needed and will usually come with loads of problems. In that sense we have to hope the salesperson recommends the right things.

As it is right now, I don't think I can recommend anyone an R9 290X or 290 with the driver problems I know they have, which means I'll most likely be going with Nvidia. But it would take a hell of a lot for me to recommend someone a 780 Ti at that price right now; I would probably still stick them with a regular GTX 780 instead.

Quote:


> Originally Posted by *Skrillex101*
> 
> I did a lot of research, but no review or article mentioned black screens, poor performance, or review cards tweaked to look better in benchmarks. Nor did any mention poor drivers or AMD's slow response to critical errors in mid-October.
> 
> You're almost saying that we should accept buying 95%-finished hardware. And that, my sir, is the WRONG mentality!
> 
> We should not settle. We're the consumers; it's us buying their product, and we should expect it to be 100% ready. It's up to them to do the quality control. That may not be how it works, but it's how it should work.
> 
> You're lying to yourself if you think AMD didn't know about the black screen issue pre-launch. They knew...


My theory as to why reviewers didn't experience black screen issues is that it was a press-release card and most likely a press-release set of drivers. Also consider that reviewers don't spend hours in games; it's normally 10 minutes max, and that's a good reviewer actually playing through the game rather than using in-game benchmarks, which a lot of people do.


----------



## sugarhell

Reviewers didn't have the black screen issue, and not all users get it.
Second, I only see strong performance from the card.
Third, how is this a slow response? They need to replicate it first, then fix it.
Fourth, Tom's got it wrong: it was a fan speed problem, not AMD sending golden cards to reviewers. Good cards hit 2200 RPM at 40% fan speed; bad cards hit 1800 RPM at the same 40%. And when your whole PowerTune behavior is based on fan speed, that makes the second card throttle like crazy.

I am not sure they knew about it. Maybe it's a problem with the BIOS from AIBs. Not a single reviewer got a black screen with AMD samples.

Also, you should have seen the 7970 release...

I think people are overreacting to some of these problems. You don't like the reference cooler? Plenty of reviews mention that the cooler is like a hairdryer.

Temps aren't a problem; this PCB can probably handle up to 110°C. Do you think AMD would release a product without heat testing the silicon?


----------



## Stay Puft

Is PowerTune holding back the cards? PowerTune is comparable to Nvidia's power target, correct?


----------



## HeliXpc

Just got my 290X. Overclocked, it's slower than my GTX 780 OC'd to 1200 MHz core / +550 on memory. Still going to play around with it a bit. Max OC is 1150 core and 1400 memory with 1.25 V on the core (+100 mV in Afterburner). Set up a custom fan profile to keep it around 80°C under full load. So far it's a good card, but at $550 I wouldn't recommend it; the overclocked GTX 780 is a better card overall: faster, less power, less noise. Here are some photos.

http://s145.photobucket.com/user/gar818/media/20131118_184028_zps8a01c279.jpg.html
http://s145.photobucket.com/user/gar818/media/20131118_184018_zps784e3292.jpg.html
http://s145.photobucket.com/user/gar818/media/20131118_184723_zps18fb4312.jpg.html


----------



## sugarhell

Quote:


> Originally Posted by *Stay Puft*
> 
> Is PowerTune holding back the cards? PowerTune is comparable to Nvidia's power target, correct?


Just use the uber BIOS: +50% power limit.


----------



## blue1512

Still, does no one realize that the black screen crash isn't limited to Hawaii? Even the 78xx/79xx had it at launch.

Eventually AMD rolled out drivers to fix it, and this time their response has been much faster. Well done.


----------



## Stay Puft

Quote:


> Originally Posted by *sugarhell*
> 
> Just use the uber BIOS: +50% power limit.


And if 50% isn't enough?


----------



## Skrillex101

Quote:


> Originally Posted by *blue1512*
> 
> Still, is there no one who realizes that black screen crash is not limited to Hawaii? Even 78xx/79xx had it at launch.
> 
> 
> 
> 
> Eventually, AMD rolled out drivers to fix it. And this time the response from them is much faster. Well done.


You're congratulating them for being less slow?

Funny


----------



## sugarhell

Quote:


> Originally Posted by *Stay Puft*
> 
> And if 50% isn't enough?


I think a 450 W TDP is enough.

Otherwise I can give you a hand with the soldering iron.
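To put a rough number on that: assuming the reference 290X's board power is around 300 W (an assumption, since AMD never published an official TDP figure for Hawaii), the +50% PowerTune slider works out as a simple multiplier:

```python
# Effect of a PowerTune offset on the board-power ceiling.
# base_power_w is an assumption: AMD published no official TDP for the
# R9 290X; ~300 W reference board power is the figure implied above.
def power_ceiling(base_power_w: float, powertune_pct: float) -> float:
    """Return the board-power ceiling (W) after a PowerTune offset in percent."""
    return base_power_w * (1 + powertune_pct / 100)

print(power_ceiling(300, 50))  # 450.0 W with the slider maxed at +50%
```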


----------



## Stay Puft

Is anyone running the bios Shamino posted over at kingpin cooling?
Quote:


> Originally Posted by *sugarhell*
> 
> I think a 450 W TDP is enough.
> 
> Otherwise I can give you a hand with the soldering iron.


----------



## CallsignVega

lol, doing some benchmarks I can't get BF4 to 100% GPU utilization in Surround. What the heck... very CPU-limited game.


----------



## sugarhell

Quote:


> Originally Posted by *Stay Puft*
> 
> Is anyone running the bios Shamino posted over at kingpin cooling?


Tsm. He pushes like 1.5 V through three cards. He's insane.


----------



## Stay Puft

Quote:


> Originally Posted by *CallsignVega*
> 
> lol, doing some benchmarks I can't get BF4 to 100% GPU utilization in Surround. What the heck... very CPU-limited game.


Time for a 4960X


----------



## blue1512

Quote:


> Originally Posted by *Skrillex101*
> 
> You're congratulating them for being less slow?
> 
> Funny


Well, learn a bit of sarcasm. Drivers have been AMD's weak spot for a long time, unlike Nvidia, which likes being described as a software company.


----------



## CallsignVega

Quote:


> Originally Posted by *Stay Puft*
> 
> Time for a 4960X


I don't know about that. In order for IB-E to match my SB-E it would have to do a min of 4.9GHz, which I hear is pretty rare for that chip.


----------



## Heinz68

Quote:


> Originally Posted by *Skrillex101*
> 
> I agree, sgt Bilko, that I should have waited, but that would mean two months of waiting? Already I wish I had bought the 780 Ti; it's much more promising and without these problems the 290X has. But in my work, for instance, I would never make something that almost worked and say "here you go, pay full price as IF it worked, and maybe it will get fixed later."
> But I'm not whining for a refund or anything; I'm merely stating that I'm very disappointed and wish I had returned it earlier. This is both for me to get my rage out and to warn other buyers.


Thanks a lot for warning me. I almost bought a 780 Ti from an NV shill in the official-amd-r9-290x-290-owners-club thread.


----------



## selk22

A lot of complaining here in this thread....

I personally am happy with my 290X! I purchased at launch knowing full well that the 290 would most likely be the smartest option, but I just wanted the best upgrade I could get for the money I had. I was coming from a GTX 580 1.5 GB. For you guys upset that the 290 came out after you bought the 290X, sorry, but you can only blame yourselves. It's the same thing from these GPU companies over and over. A lot of Titan guys were upset when the 780 came out, but if you are always looking back in the tech industry, you will always be regretting your purchase. As for the people who had 780s and moved to the 290X, I don't know about that one.

Overall this card performs beastly in games, which is where it counts. Yes, right now we are on beta drivers that have some problems, but AMD has acknowledged this and I am sure we will get a fix. I really think these cards will start to shine a few driver releases from now.

Just my 2 cents


----------



## skupples

Quote:


> Originally Posted by *sugarhell*
> 
> This is wrong. And its stupid. I will not gonna buy a 100k car without proper research


With all due respect, *this is not a $100,000 car.* It's a $550 GPU (less than the sales tax on a $100,000 car). I'm sorry you feel it's necessary to call my post stupid, but it is correct. The larger share of people buying GPUs do not spend massive amounts of time researching these things. They simply pull up the price/performance charts and pick whichever card suits their price point; after that they do minimal research via a Google search, which turns up those theoretical-performance sites. Now, if they are on their third or fourth GPU purchase, they are likely doing more research.

"People buy reference AMD GPUs to watercool" is once again wrong; water cooling is the essence of niche. I would wager that less than 10% of the GPU market puts their cards under water.


----------



## Mas

People have to keep in mind, you hear more about bad experiences than good when reading forums and feedback on the internet. I'm pretty sure for every one person having problems with these cards, there are dozens and dozens of people who are just fine.

Sure, I'm on water now, but I only got my block last Thursday. Up until then, I had been running with stock cooling. Card ran fine. Bumped max fan speed up in CCC which doesn't bother me, since 99% of the time I'm wearing headphones while using my computer. I also don't have to worry about bothering others, because my wife and I have our own private computer rooms. Previous card (actually previous 3 cards) were all nvidia. Didn't have to do anything fancy with registry. Didn't have to reinstall OS. Just uninstalled nvidia software, ran the DDU uninstaller linked on the first page of this thread, put in new 290X, and installed beta driver. Next day, a new beta driver came out so I uninstalled previous beta driver with DDU and installed new one.

Have not had a single black screen. No lockups or artifacts. No throttling issues. Card has been just fine.

Also, like previously mentioned, you should do at least a little bit of research before dropping $500+ on a component. If you're not running at high resolutions, you should probably choose something else. This card's specs practically scream high-res gaming. Just because something does high res really well and/or better than the competition doesn't mean the same holds at lower resolutions.

Yes, some people have been experiencing issues, and that sucks. I wish they weren't, because no one likes the disappointment of getting a dud when you spend a decent chunk of money on something you're excited about. But a handful of people on a forum reporting issues can't be taken as representative of the whole picture.


----------



## CallsignVega

Hm, with everything on ultra in BF4 on the test range at 3320x1920, *two* of my Titans are pulling 138 FPS. Just destroying this game. I didn't expect two Titans to be that fast. I'll switch back to the 290Xs, which I haven't run this test on yet. Really wish those new drivers had launched today.


----------



## tsm106

http://www.3dmark.com/fs/1179138

I'm pretty pleased with my cards. The EK blocks are excellent too.


----------



## sugarhell

Quote:


> Originally Posted by *skupples*
> 
> With all due respect, *this is not a $100,000 car.* It's a $550 GPU (less than the sales tax on a $100,000 car). I'm sorry you feel it's necessary to call my post stupid, but it is correct. The larger share of people buying GPUs do not spend massive amounts of time researching these things. They simply pull up the price/performance charts and pick whichever card suits their price point; after that they do minimal research via a Google search, which turns up those theoretical-performance sites. Now, if they are on their third or fourth GPU purchase, they are likely doing more research.
> 
> "People buy reference AMD GPUs to watercool" is once again wrong; water cooling is the essence of niche. I would wager that less than 10% of the GPU market puts their cards under water.


I didn't call your post stupid; I called the no-research thing stupid. Also, the reference cooler from AMD was always bad. After so many cards, I get a reference card only if I want to watercool; otherwise I get a custom design.


----------



## CallsignVega

I would like to know if this guy had Force Gen 3 enabled on X79 when he ran these tests:

If it's PCIe 2.0 with 3-4 cards, there could be some issues there.


----------



## CallsignVega

Quote:


> Originally Posted by *tsm106*
> 
> http://www.3dmark.com/fs/1179138
> 
> I'm pretty pleased with my cards. The EK blocks are excellent too.


Not going with four this time? Or testing the waters before a fourth?


----------



## tsm106

Quote:


> Originally Posted by *CallsignVega*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> http://www.3dmark.com/fs/1179138
> 
> I'm pretty pleased with my cards. the ek blocks are excellent too.
> 
> 
> 
> Not going with 4 this time? Or testing the waters before 4th?

I was originally only going with two; I knew that as a pure gaming card it would be more than enough. But I sort of splurged when people were bugging me about running quads. Three is a good compromise, and it really is overkill for triple 1080p.

Btw, I already ran PCIe tests; there's a difference, but it's small at trifire 5760.


----------



## Mas

Quote:


> Originally Posted by *CallsignVega*
> 
> Hm, with everything on ultra in BF4 on the test range at 3320x1920, *two* of my Titans are pulling 138 FPS. Just destroying this game. I didn't expect two Titans to be that fast. I'll switch back to the 290Xs, which I haven't run this test on yet. Really wish those new drivers had launched today.


Are you scaling up the resolution using the in-game slider to 200%?


----------



## EliteReplay

Quote:


> Originally Posted by *Skrillex101*
> 
> I did a lot of research, but no review or article mentioned black screens, poor performance, or review cards tweaked to look better in benchmarks. Nor did any mention poor drivers or AMD's slow response to critical errors in mid-October.
> 
> You're almost saying that we should accept buying 95%-finished hardware. And that, my sir, is the WRONG mentality!
> 
> We should not settle. We're the consumers; it's us buying their product, and we should expect it to be 100% ready. It's up to them to do the quality control. That may not be how it works, but it's how it should work.
> 
> You're lying to yourself if you think AMD didn't know about the black screen issue pre-launch. They knew...


You're a troll or some sort of thing like that... most of the people here have never had a problem with performance; the card is fine for what you pay for. AMD didn't put a gun to your head to buy the card, and on top of that you have RMA as an option...

Why are people responding to this troll who just came here to talk really badly about the card? I think I know this guy from somewhere else...

You said you have the card, right? What about taking a picture of yourself near your PC, with your name written on a piece of paper? I don't know, but I think I know you, lol. Your name is very familiar to me... it reminds me of an Nvidia fanboy going around forums talking really badly about AMD products without actually owning them.


----------



## Skrillex101

Quote:


> Originally Posted by *EliteReplay*
> 
> You're a troll or some sort of thing like that... most of the people here have never had a problem with performance; the card is fine for what you pay for. AMD didn't put a gun to your head to buy the card,
> and on top of that you have RMA as an option...
> 
> Why are people responding to this troll who just came here to talk really badly about the card? I think I know this guy from somewhere else...


Know me from somewhere else? How would you? And no, I'm not a troll; read the other 5000 posts on the net complaining. It's stupid to throw out accusations just because you "think" something.

So you think it's not allowed to complain about a non-working product? If the card worked as it should, I would probably have no complaints, but it doesn't... and the last couple of thousand replies prove my point.

So thanks for your useful reply.

Actually, I will apologize if you can prove my complaints invalid and nonexistent. Begin...


----------



## Stay Puft

Quote:


> Originally Posted by *Skrillex101*
> 
> Know me from somewhere else? How would you? And no, I'm not a troll; read the other 5000 posts on the net complaining. It's stupid to throw out accusations just because you "think" something.
> 
> So you think it's not allowed to complain about a non-working product? If the card worked as it should, I would probably have no complaints, but it doesn't... and the last couple of thousand replies prove my point.
> 
> So thanks for your useful reply.


Don't pay attention to Elite. He calls everyone a troll if they say something bad about an AMD product.


----------



## Skrillex101

Quote:


> Originally Posted by *Stay Puft*
> 
> Don't pay attention to Elite. He calls everyone a troll if they say something bad about an AMD product.


Hard to ignore. I've been reading this forum for a long time, and on my first posts, where I really felt the need to sign up and give my voice, he calls me a troll over problems that are clearly stated in the last (what is it, 700 pages?). He's just a waste of time then, IMO.


----------



## brazilianloser

Well, for my first attempt at AMD I was very displeased at first due to the black screen problem, which at least for me was hardware based. After much headache and two weeks without my PC, I got the replacement in, and so far it has blown me out of the water for the price I paid. Not sure what others are experiencing, but my Asus 290 can play BF4 on Ultra at 5760x1080, and other than some dips in FPS the game is playable, something neither my previous 660 Ti nor the 280X I had before this beast could do. Crysis 3 still kicks it to the ground, but what do you expect. I will definitely get a second one, and once I do, put both of them under water. If they perform this well now, imagine once they're on water with better drivers out. I am satisfied with my purchase; it's the first time I have paid so much for a single card and had it perform so well in a single-card config. Can't say the same for those out there who purchased the 290X, though.


----------



## Scorpion49

Quote:


> Originally Posted by *CallsignVega*
> 
> I would like to know if this guy had Force Gen 3 enabled on X79 when he ran these tests:
> 
> If it's PCIe 2.0 with 3-4 cards, there could be some issues there.


I just spent a little time testing, and I see a small (1-2%) advantage on my simple 1080p CrossFire system with 3.0 vs 2.0. It is definitely an improvement, because it shows up across the board in everything I do, not just within the margin of error. I'm going to guess that tri/quad high-resolution setups will show some decent gains with 3.0 over 2.0.
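The spec-sheet numbers support that guess. A quick sketch of per-direction PCIe bandwidth, using the standard link rates (2.0 runs at 5 GT/s with 8b/10b encoding; 3.0 at 8 GT/s with 128b/130b):

```python
# Per-direction PCIe bandwidth in GB/s for a given generation and lane count.
# Gen 2.0: 5 GT/s with 8b/10b encoding (80% efficiency).
# Gen 3.0: 8 GT/s with 128b/130b encoding (~98.5% efficiency).
def pcie_bandwidth_gbs(gen: float, lanes: int) -> float:
    rate_gt, eff = {2.0: (5.0, 8 / 10), 3.0: (8.0, 128 / 130)}[gen]
    return rate_gt * eff * lanes / 8  # divide by 8 bits per byte

print(pcie_bandwidth_gbs(2.0, 16))  # 8.0 GB/s
print(pcie_bandwidth_gbs(3.0, 16))  # ~15.75 GB/s, nearly double
```

So 3.0 x16 carries nearly twice the data of 2.0 x16, which is why setups that actually saturate the bus (tri/quad CrossFire at high resolutions) have the most to gain.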


----------



## EliteReplay

Quote:


> Originally Posted by *Skrillex101*
> 
> Hard to ignore. I've been reading this forum for a long time, and on my first posts, where I really felt the need to sign up and give my voice, he calls me a troll over problems that are clearly stated in the last (what is it, 700 pages?). He's just a waste of time then, IMO.


Man, I know you... you don't even have any AMD product...

Why on earth would you go straight to a forum to talk badly about a product? 99% of people go to a forum to get help, post about their experience, and ask how they can fix this and that.

I just told you... you want us to believe you? Post a picture of yourself near your PC with your name written on a piece of paper.

Prove me wrong.


----------



## CallsignVega

Quote:


> Originally Posted by *tsm106*
> 
> I was originally only going with two; I knew that as a pure gaming card it would be more than enough. But I sort of splurged when people were bugging me about running quads. Three is a good compromise, and it really is overkill for triple 1080p.
> 
> Btw, I already ran PCIe tests; there's a difference, but it's small at trifire 5760.


Interesting. You had at least one of the slots at PCIe 2.0 x8 speed?
Quote:


> Originally Posted by *Mas*
> 
> Are you scaling up the resolution using the in-game slider to 200%?


I will test the 290Xs in BF4 before I go that far.


----------



## Skrillex101

Quote:


> Originally Posted by *EliteReplay*
> 
> Man, I know you... you don't even have any AMD product...
> 
> Why on earth would you go straight to a forum to talk badly about a product? 99% of people go to a forum to get help, post about their experience, and ask how they can fix this and that.
> 
> I just told you... you want us to believe you? Post a picture of yourself near your PC with your name written on a piece of paper.
> 
> Prove me wrong.


As you wish. Give me 2 minutes and I'll edit this post.

ADDED: I added the stock cooler next to the PC case, where you can see the Accelero; that should be enough proof.


----------



## tsm106

http://www.overclock.net/forum/newestpost/1440974

Linking here because moving posts around on my phone is a big pain. And yes, at least one card is at x8.

Quote:


> Originally Posted by *CallsignVega*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I was originally only going with two. I knew as a pure gaming card it would be more than enough. But I sort of splurged when ppl were bugging me about running quads. Three is a good compromise and it really is overkill for triple 1080.
> 
> Btw I already ran pcie tests, there's a difference but it is small at trifire 5760.
> 
> 
> 
> Interesting, you had at least one of the slots at PCi-E 2.0 - 8 speed?
> Quote:
> 
> 
> 
> Originally Posted by *Mas*
> 
> Are you scaling up the resolution using the in-game slider to 200%?
> 
> 
> I will test the 290x's in BF4 before I go that far.


----------



## rdr09

Quote:


> Originally Posted by *brazilianloser*
> 
> Well, for my first attempt at AMD I was very displeased at first due to the black screen problem, which at least for me was hardware based. After much headache and two weeks without my PC, I got the replacement in, and so far it has blown me out of the water for the price I paid. Not sure what others are experiencing, but my Asus 290 can play BF4 on Ultra at 5760x1080, and other than some dips in FPS the game is playable, something neither my previous 660 Ti nor the 280X I had before this beast could do. Crysis 3 still kicks it to the ground, but what do you expect. I will definitely get a second one, and once I do, put both of them under water. If they perform this well now, imagine once they're on water with better drivers out. I am satisfied with my purchase; it's the first time I have paid so much for a single card and had it perform so well in a single-card config. Can't say the same for those out there who purchased the 290X, though.


what driver are you using?


----------



## brazilianloser

Quote:


> Originally Posted by *rdr09*
> 
> what driver are you using?


Just the official 13.11 WHQL one that's available. Keep in mind I do not play multiplayer, at least at the moment...

And since I don't feel like paying $27 for 3DMark, I can only run the normal tests... but my score has gone from 7.5k on the 280X to a bit over 9k on the 290 without overclocking.
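For what it's worth, that jump works out to roughly a 20% gain (treating these forum scores as rough, uncontrolled numbers):

```python
# Relative improvement between two benchmark scores, in percent.
# The 3DMark figures above (7.5k -> 9k) are approximate forum numbers,
# not controlled results.
def pct_gain(old_score: float, new_score: float) -> float:
    return (new_score - old_score) / old_score * 100

print(round(pct_gain(7500, 9000)))  # 20
```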


----------



## vol8

I have a question. I have an XFX R9 290, and the reference cooler sounds like an F-16.

If I change to an aftermarket cooler, is the warranty void, or not?


----------



## the9quad

Quote:


> Originally Posted by *Skrillex101*
> 
> As you wish. Give me 2 minutes and I'll edit this post.


Also, other than the black screen, please explain how the drivers/cards are buggy and the performance is bad? I don't get you, bro.


----------



## tsm106

Quote:


> Originally Posted by *brazilianloser*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> what driver are you using?
> 
> 
> 
> Just the official 13.11 WHQL one that's available. Keep in mind I do not play multiplayer, at least at the moment...
> 
> And since I don't feel like paying $27 for 3DMark, I can only run the normal tests... but my score has gone from 7.5k on the 280X to a bit over 9k on the 290 without overclocking.

People seem to have missed this driver. It seriously rocks.


----------



## Stay Puft

Quote:


> Originally Posted by *brazilianloser*
> 
> Just the official 13.11 WHQL one that's available. Keep in mind I do not play multiplayer, at least at the moment...
> 
> And since I don't feel like paying $27 for 3DMark, I can only run the normal tests... but my score has gone from 7.5k on the 280X to a bit over 9k on the 290 without overclocking.


Wait for it to go on sale on Steam for 12 dollars.


----------



## CriticalHit

Quote:


> Originally Posted by *Mas*
> 
> Have not had a single black screen. No lockups or artifacts. No throttling issues. Card has been just fine.


Had my cards for 24 hours now, running in CrossFire. So far no issues, just nice stock performance. Will put them under water as soon as I get some thermal compound.


----------



## Skrillex101

Quote:


> Originally Posted by *the9quad*
> 
> Also, other than the black screen, please explain how the drivers/cards are buggy and the performance is bad? I don't get you, bro.


Not gonna discuss it more, mate. What I meant was that performance is bad compared to other cards in the same price range and even lower.

Maybe I'm also just blinded by all the things that have gone wrong with this card. Perhaps in 2-3 months it will all be different. I just didn't expect a released card to have this many issues, but as you guys say, it's normal for new cards. I wouldn't know; I had never bought a newly released card before, and none of my research mentioned any of it either. But now I know not to buy at launch. I still feel that AMD screwed us over by releasing something they knew didn't work 100%, with the cheap $2 Walmart cooler they put on it, etc.









Btw, I've uploaded images for you, Elite; check my earlier post.

EDIT: Here's another image of what the box contained:


----------



## brazilianloser

Quote:


> Originally Posted by *Stay Puft*
> 
> Wait for it to go on sale at steam for 12 dollars


Yeah, I'll be on the lookout... but even then, spending money on something I run only when I'm testing an OC or get a new toy is kind of bothersome. I have all the previous versions, which I got free through EVGA, and there is a discount if you have 3DMark 11 to upgrade to 3DMark, but the discount doesn't work since I got mine in a bundle.


----------



## brazilianloser

Quote:


> Originally Posted by *tsm106*
> 
> Ppl seem to have missed this driver. It seriously rocks.


Yeah, so far it has performed admirably on my end at least. Can't wait for more to come. Now I just have to wait for school loans and end-of-year bonuses to hit so I can buy the second card and get all my WC parts.


----------



## the9quad

Quote:


> Originally Posted by *Skrillex101*
> 
> Not gonna discuss it more, mate. What I meant was that performance is bad compared to other cards in the same price range and even lower.


In other words, the 780 Ti is faster. That is the only card that is outright faster, period. It's also more expensive. Let's put it this way: the 780 Ti sits as far above the 290X in price/performance terms as the 290X sits above the 290. Is it AMD's fault you didn't wait for a 290, if price was all you cared about? If price wasn't what you cared about, then you have arguably the second-fastest card out, still cheaper than the marginally faster card, or you could have waited, spent more money, and bought the Ti. As for buggy drivers, you've complained about them but haven't once said what makes them so buggy. So far all you have done is basically whine that you didn't have the patience to wait two weeks for a 290 and/or the budget for a Ti.

TL;DR: saying you are unhappy that you didn't wait for a cheaper card or spend more money on a faster card is justifiable, but trying to make it seem like this is a shoddy card with buggy drivers is false.


----------



## Skrillex101

Quote:


> Originally Posted by *the9quad*
> 
> In other words, the 780 Ti is faster. That is the only card that is outright faster, period. It's also more expensive. Let's put it this way: the 780 Ti sits as far above the 290X in price/performance terms as the 290X sits above the 290. Is it AMD's fault you didn't wait for a 290, if price was all you cared about? If price wasn't what you cared about, then you have arguably the second-fastest card out, still cheaper than the marginally faster card, or you could have waited, spent more money, and bought the Ti. As for buggy drivers, you've complained about them but haven't once said what makes them so buggy. So far all you have done is basically whine that you didn't have the patience to wait two weeks for a 290 and/or the budget for a Ti.


There is no reason for me to state the obvious faults of this card atm.

I've seen benchmarks with 290s exceeding the 290X, and I've seen benchmarks where the normal 780 exceeds the 290X. I could spend a lot of time digging up those benchmarks, but I won't waste my time on that. I know my complaints are valid, I know the card has the problems I've mentioned at the current time, and that's all I have to say about it, the9quad. If you want, though, you can try to prove to me that the 290X doesn't black-screen, doesn't need a new cooling solution, doesn't overheat or throttle down, doesn't have poor OC headroom compared to the majority of cards in this tier, and doesn't have poor drivers for the moment...

Then there are all the sales problems, with AMD handing out rigged review cards, selling a card that should have said "Up to 1000MHz core", and KNOWING that the card black-screened (don't tell me they never experienced it)...

Prove all these things wrong, and I'll apologize and bow to you, sir.

EDIT: Budget wasn't a problem; I wanted the best, and I WAS going to buy the 780, until AMD announced this 290X and it was oh so golden, it could do everything. Got the card on the 28th of last month, and sure, it performed excellently in BF4; it's a fast card, no doubt, but for the price, the competitors offer more. But we'll see when proper driver fixes come out, Mantle and all that jazz... then I may be glad I chose as I did.


----------



## rdr09

Quote:


> Originally Posted by *brazilianloser*
> 
> Just the official 13.11 WHQL one available. Keep in mind I do not play multi player, at least at the moment...
> 
> And since I don't fell like paying 27$ for 3Dmark I can only run the normal tests... but the score has gone from 7.5k on the 280X to a tiny bit over 9k on the 290 without overclocking.


Thanks. I'll DL it now. Still using Beta 6.

I just played C3 MP maxed out. Ha! Very High and 8xMSAA, but only 1080p. Still, I thought no single card could do it. This was at stock clocks. Unbelievable.

No need for me to move my 7950 around to crossfire with the 7970. I may have to sell one of those.


----------



## Skrillex101

Quote:


> Originally Posted by *the9quad*
> 
> tl;dr: saying you are unhappy that you didn't wait for a cheaper card or spend more money on a faster card is justifiable, but trying to make it seem like this is a shoddy card with buggy drivers is false.


What world are you living in? Have you not read the past thousand posts?

Btw, just to add to your previous post, here's a bench by LinusTech:

A lot of post-launch reviews are actually recommending NOT to buy this card...


----------



## Raephen

Here's some positive energy for the thread:

Ever since I got my hands on my 290, I've had no issues.

I simply let CCC in Windows 7 remove all AMD/ATI drivers, powered down, replaced my HD 7870 with the 290, and installed the beta 9.2 drivers.

No issues whatsoever. OK, I had one CTD when playing Skyrim, but I haven't been able to replicate that - and believe me: I've tried.

The only major gripe I have is my single monitor's refresh rate of 60Hz. I just HATE seeing the constant 60fps in my AB OSD counter all the time while running around in Skyrim on ultra. It's so boring.


----------



## Stay Puft

Quote:


> Originally Posted by *Skrillex101*
> 
> What world are you living in? Have you not read the past thousand posts?
> 
> Btw just to add to your previous post, here´s a bench by Linustech:
> 
> A lot of post-launch reviews are actually recommending NOT to buy this card...


Linus


----------



## Skrillex101

Quote:


> Originally Posted by *Stay Puft*
> 
> Linus


I can find you many more; this was just the first bench that popped up. But to be honest, you can find any benchmark you prefer; there are so many different ones stating the opposite of each other... Do you know a place that's 100% legit? Is there a collection here on OCN?


----------



## tsm106

Quote:


> Originally Posted by *Skrillex101*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Stay Puft*
> 
> Linus
> 
> I can find you many more; this was just the first bench that popped up. But to be honest, you can find any benchmark you prefer; there are so many different ones stating the opposite of each other... Do you know a place that's 100% legit? Is there a collection here on OCN?

Why are you still in this thread?


----------



## Skrillex101

Quote:


> Originally Posted by *tsm106*
> 
> Why are you still in this thread?


Because some keep quoting my posts... and why shouldn't I be?


----------



## jerrolds

Is the PT1 BIOS safe on air? I just left the window open in my spare bedroom and it's a cool 5C in there or something, haha. Able to hit 1225/1500 on the max ASUS BIOS, ~1.3V actual after vdroop. VRM1 temps, which were normally my problem, hovered around 90C max.

I think come Black Friday I'll be getting a couple of Cougar high static pressure fans and hopefully hit 1230+... I don't want to go too much over 1.3V actual though.

Hopefully going from lower-CFM 92mm fans to high-CFM 120mm fans will get me a higher stable clock without touching the voltage any more.


----------



## tsm106

Quote:


> Originally Posted by *Skrillex101*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Why are you still in this thread?
> 
> Because some keep quoting my posts... and why shouldn't I be?

Usually ppl move on. You don't like the card, fine; you said your piece, fine. Sell it or return it, whatever. All you're doing now is trolling, more or less; at least that's how it looks from here.


----------



## bond32

Quote:


> Originally Posted by *Skrillex101*
> 
> What world are you living in? Have you not read the past thousand posts?
> 
> Btw just to add to your previous post, here´s a bench by Linustech:
> 
> A lot of post-launch reviews are actually recommending NOT to buy this card...


1080p comparisons don't show much IMO. The R9 290/290X is huge overkill if you play at 1080p...


----------



## Stay Puft

Quote:


> Originally Posted by *jerrolds*
> 
> Is the PT1 BIOS safe on air? I just left the window open in my spare bedroom and it's a cool 5C in there or something, haha. Able to hit 1225/1500 on the max ASUS BIOS, ~1.3V actual after vdroop. VRM1 temps, which were normally my problem, hovered around 90C max.
> 
> I think come Black Friday I'll be getting a couple of Cougar high static pressure fans and hopefully hit 1230+... I don't want to go too much over 1.3V actual though.


I would address those VRM temps before flashing to that BIOS.


----------



## kdawgmaster

Well, I've got something for you guys.

I decided to step away from my comp for about 30 minutes or so. What I did was throw FurMark on as a stress test for the cards, just to see how stable they were under load (from OpenGL, keep in mind), and what I've found is that, so far today at least, my trio of R9 290Xs are FAR more stable running an OpenGL application than a DirectX application.

This leads me to wonder: why is it that something way more stressful for the card can be WAY more stable?

Is this the primary problem with the drivers right now that's causing most of the problems?

Is there something that's causing the DirectX API to cut communication with the GPU, making it crash?

What I'm thinking I might do is keep the system on FurMark all night and see what happens. If the system crashes then I might want to look and see if one of the cards is defective; however, if it doesn't crash then I'll post what I've found out.


----------



## SpewBoy

Quote:


> Originally Posted by *tsm106*
> 
> Profiles... there hasn't been a proper profile since Cayman, and that was over two years ago and it was for heaven. We've been creating 1x1 optimize profiles that are not ideal, but it's all we have. If one is going to bench Valley w/o a profile, the score is a joke as you can see from the video review.


Could you enlighten a poor profile noob as to some better-optimised profile settings for Valley? I just tried a 1x1 optimize profile and got worse results with no throttling than I did with throttling and the default profile (running 2 290Xs)


----------



## Skrillex101

Quote:


> Originally Posted by *tsm106*
> 
> Usually ppl move on. You don't like the card, fine; you said your piece, fine. Sell it or return it, whatever. All you're doing now is trolling, more or less; at least that's how it looks from here.


If you look back through the thread, mate, you can clearly see me saying several times, "I've made my point and don't want to discuss it any further."
I'm not gonna sell the card; I will use it until next-gen stuff comes out, but I still feel I had the right to share my opinion of this card. I've had great performance in games, no doubt. I'm not replying to anyone else's posts, so I don't quite understand you. Sorry.

And yes, bond32, I know; I've played in Eyefinity for the most part.


----------



## Raephen

Quote:


> Originally Posted by *Skrillex101*
> 
> A lot of post-launch reviews are actually recommending NOT to buy this card...


Hmm... I must have missed those, too, when doing my research into the 290x and the 290 after their launch.

I have read that, yes, the card has issues (heat, noise and power), and as such is difficult to recommend as it stands right now, and that, yes, perhaps people wanting this card should wait for the AIB versions to hit the market.

You, 'sir', are twisting words.

Oh yes, English isn't your first language, so there is a chance your brain translated words similar to "I find it hard to recommend this card as it is now" into "whatever you do: do NOT buy this card!!!" (<--- I'll just put the word SARCASM here as a warning).

One last thing: English isn't my first language either. Technically, it's my third.


----------



## jerrolds

Quote:


> Originally Posted by *Stay Puft*
> 
> I would address those vrm temps first before flashing to that bios


I can live with 90C max, I think; pretty sure the VRMs are rated for 125C. I've seen them hit 110C before I manually shut it down. But I think I'll wait to see how better fans handle them before trying PT1. I wasn't sure if it was a "watercooling or better" only BIOS.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> Ppl seem to have missed this driver. It seriously rocks.


I'll play with it more tomorrow. Not much difference from Beta 6, but it seems my CPU is putting out more . . .

290 stock beta 6

http://www.3dmark.com/3dm11/7514277

13.11 whql

http://www.3dmark.com/3dm11/7528056

edit: gained some at the same OC at stock volts, 1155/1500

beta 6

http://www.3dmark.com/3dm11/7489178

13.11 whql

http://www.3dmark.com/3dm11/7528079


----------



## hatlesschimp

I'm quite happy with my PowerColor 290X. I noticed it goes well in Arma 3 and a few of the racing games I like. Obviously the price of the 290 vs the 290X can be annoying, but I don't like playing with patches, unlockers and hacks to get something to work, because if you go that route you find you're playing fewer games and doing more benchmarking. I would have bought 3 or maybe 4 290X cards if they had two DisplayPorts or HDMI 2.0, but now I have to wait for the non-reference versions.


----------



## Skrillex101

Quote:


> Originally Posted by *Raephen*
> 
> Hmm... I must have missed those, too, when doing my research into the 290x and the 290 after their launch.
> 
> I have read that, yes the card has issues (heat, noise and power), and as such the card was difficult to recomend as it is right now and yes, perhaps people wanting this card should wait for the AIB versions to hit the market.
> 
> You, 'sir', are twisting words.
> 
> Oh yes, english isn't your first language, so yes there is a chance your brain translated words similar to "I find it hard to recommend this card as it is now" into "whatever you do: do NOT buy this card!!!" (<--- I'll just put the word SARCASM here as a warning).
> 
> One last thing: english isn't my first language either. Technically, it's my third.


Yes, sorry, that might have been a slight mistyping. I saw a few vids here tonight, and they recommend other cards; not that they didn't recommend this one, of course.







But please leave it here, as tsm106 is after my head, I think...


----------



## Spectre-

http://www.3dmark.com/3dm11/7528077

My scores were OK, I guess,

but my clock speeds and memory info are being silly.


----------



## hatlesschimp

It's all good recommending other cards, but sometimes you're stuck with only one option. For me, I had 3 GTX Titans and they dominated every game, and I didn't even need to overclock. I'm sure they would have lasted for the next few years in their current configuration. However, I wanted HDMI 2.0 for my 4K TV, or 5x1 portrait Eyefinity for my VG248QE monitors. Nvidia won't do either, and AMD's last series won't do HDMI 2.0. So it was 290 or 290X.


----------



## Raephen

Quote:


> Originally Posted by *Skrillex101*
> 
> Yes, sorry, that might have been a slight mistyping. I saw a few vids here tonight, and they recommend other cards; not that they didn't recommend this one, of course.
> 
> 
> 
> 
> 
> 
> 
> But please leave it here, as tsm106 is after my head, I think...


No worries. We're all friends here.









And don't worry about tsm or whoever. The flame wars have been raging quite furiously the last few days. I think some people felt like they got kicked in their e-peen.


----------



## Forceman

Quote:


> Originally Posted by *hatlesschimp*
> 
> However, I wanted HDMI 2.0 for my 4K TV, or 5x1 portrait Eyefinity for my VG248QE monitors. Nvidia won't do either, and AMD's last series won't do HDMI 2.0. So it was 290 or 290X.


I don't think the 290/290X do HDMI 2.0 either.


----------



## hatlesschimp

They don't, but the rumor is they will with an update. The non-reference cards will for sure.


----------



## Jpmboy

Quote:


> Originally Posted by *rdr09*
> 
> i'll play with more tomorrow. not much difference with beta 6 but seems my cpu is putting out more . . .
> 290 stock beta 6
> http://www.3dmark.com/3dm11/7514277
> 13.11 whql
> http://www.3dmark.com/3dm11/7528056
> edit: gained some at same oc'ed at stock volts 1155/1500
> beta 6
> http://www.3dmark.com/3dm11/7489178
> 13.11 whql
> http://www.3dmark.com/3dm11/7528079


That 2700K is very capable of 5.0GHz. One of the best CPUs I've had (and still have).


----------



## CallsignVega

Doing some 2 vs 2 testing:



I've got a serious headache after a few hours of dealing with AMD.

Let's go down the list:

http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64

What software company on the planet puts their _newest_ drivers at the _bottom_ of a list? I mean, really?

Then I tried to figure out why, during bezel correction, the wider I space the triangles apart the narrower the resolution gets, and the narrower the triangles the wider the resolution. I thought I was losing my mind. It turns out AMD's wonderful testing team let another home run through. Turns out in my portrait flipped setup they didn't code it properly, so it's the exact opposite of what you see on screen. And then, once I do get the resolution set properly for bezel correction, it shows properly in the Windows control panel but not in CCC. Oh jeez.

So I set everything to default in CCC, loaded Beta 17 AB with stock settings, fired up some games, and I get this:



LOL today is not my day. Trying to get crossfire flipped portrait Eyefinity working with AMD has always been a full-time job.


----------



## Nopileus

That is what I get when I clock my 290's GPU too high (which is, of course, to be expected).


----------



## hatlesschimp

Wow


----------



## kdawgmaster

Moved to Beta v7 and the cards are all stable as hell right now.


----------



## CallsignVega

Disabling crossfire and it works properly, so I have to find out if my other card is bad or if there is some config issue with crossfire which should be plug and play. Listening to these 290x's at 100% fan is a trip though.

I have my crew erecting a shield as we speak to protect my wall:



Haha, I know these cards need water, but the stock fans are just so epic to listen to.


----------



## Spectre-

Beta 9 is giving me better scores.

New benchmark:
http://www.3dmark.com/3dm11/7528195

http://www.techpowerup.com/gpuz/h45b5/

add me to teh clubz


----------



## Spectre-

sorry edit- http://www.3dmark.com/3dm11/7528245

this is the best run i can do


----------



## ZealotKi11er

I've used the R9 290 for only 1 day now and have only tested BF4. I have not played more than 30 mins at a time. Also, my CPU is very weak, but the resolution is 1440p, which is a bit more GPU-bound.

The 290 is fully unlocked with the 290X Uber BIOS.

All I can say after testing is that reviews have exaggerated the noise this card makes. With 30 mins of BF4 I was hitting ~92C MAX. Clocks based on MSI AB were 980-1000MHz. Fan speed hit 47% MAX but was ~44% most of the time.

Compared to my older stock HD 7970, the tone of the fan is a lot better. I also love the fact that the card does not bother to speed up the fan even when it hits the 70s. This is nice if you're using the card for things besides gaming.

A lot of people forget how "amazing" the HD 7970 launch was. Yeah, it was an amazing overclocker, but the drivers were meh.

To people that complain about the 290 being cheaper than the 290X: well, that has always been the case. The HD 7970 is like 5% faster clock-for-clock than the HD 7950; same with the HD 6970/50 and HD 5870/50. You always pay a premium for the higher-tier card.

As far as reviews go, comparing a 290 @ 47% fan speed and a 290X @ 40% is not really an apples-to-apples comparison.


----------



## jerrolds

Quote:


> Originally Posted by *CallsignVega*
> 
> Disabling crossfire and it works properly, so I have to find out if my other card is bad or if there is some config issue with crossfire which should be plug and play. Listening to these 290x's at 100% fan is a trip though.
> 
> Haha, I know these cards need water, but the stock fans are just so epic to listen to.


Thats one way to put it







The stock fans are so bad.

The 290(X)s are ideal under water, but perform very well with an aftermarket HSF like the Accelero Xtreme 3 or Hybrid, Gelid ICY 2, or Prolimatech MK26, if you can keep VRM temps manageable. I'm sure the next gen of coolers will be even better with the VRMs, since these weren't designed for 290s; they just happened to work.


----------



## kdawgmaster

Quote:


> Originally Posted by *CallsignVega*
> 
> Disabling crossfire and it works properly, so I have to find out if my other card is bad or if there is some config issue with crossfire which should be plug and play. Listening to these 290x's at 100% fan is a trip though.
> 
> I have my crew erecting a shield as we speak to protect my wall:
> 
> 
> 
> Haha, I know these cards need water, but the stock fans are just so epic to listen to.


What's happening to the cards in crossfire? And what drivers are you using?


----------



## DraXxus1549

Quote:


> Originally Posted by *brazilianloser*
> 
> Well, for my first attempt at AMD I was very displeased at first, due to the black screen problem, which at least for me was hardware-based. After much headache and two weeks without playing with my PC, I got the replacement in, and so far it has just blown me out of the water for the price I paid... Not sure what others are experiencing, but my Asus 290 is able to play BF4 on Ultra at 5760x1080, and other than some dips in FPS the game is playable, something neither my previous 660 Ti nor the 280X I got before this beast has been able to do. Crysis 3 still kicks it to the ground, but what do you expect. I will definitely get a second one, and once I do, put both of them on water. If they are performing as well as they are now, imagine once it's on water and better drivers are out. I am satisfied with my purchase, since it's the first time I have paid so much for a single card and had it perform so well in a single-card config. Can't say the same for those out there that purchased the 290X, though.


What kind of framerates are you getting in BF4 at that resolution? If I run it on Ultra I only get 30ish FPS. Are you seeing similar?


----------



## brazilianloser

Quote:


> Originally Posted by *DraXxus1549*
> 
> What kind of framerates are you getting in BF4 at that resolution? If I run it on Ultra I only get 30ish FPS. Are you seeing similar?


AB does not show me FPS in BF4, so I was just going by feel when playing. It felt smooth, which usually means above 30, but when I play tomorrow I'll run with Fraps up to see the average FPS. For now, though, I'm attempting to study for some exams in the morning.


----------



## Raxus

Question for watercoolers, does the MSI R9 290x have warranty stickers?


----------



## ZealotKi11er

Quote:


> Originally Posted by *DraXxus1549*
> 
> What kind of framerates are you getting in BF4 at that resolution? If I run it on Ultra I only get 30ish FPS. Are you seeing similar?


That looks about right.


----------



## Forceman

Quote:


> Originally Posted by *brazilianloser*
> 
> AB does not show me FPS in BF4, so I was just going by feel when playing. It felt smooth, which usually means above 30, but when I play tomorrow I'll run with Fraps up to see the average FPS. For now, though, I'm attempting to study for some exams in the morning.


Open the console with ~ and then use the command "perfoverlay.drawfps 1" and it'll put the FPS in the corner for you.
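If you'd rather not open the console every session, Frostbite games can reportedly also auto-run console commands from a `user.cfg` file placed in the game's install folder (the same directory as `bf4.exe`); treat the file name and location as an assumption and double-check for your install:

```
perfoverlay.drawfps 1
perfoverlay.drawgraph 1
```

`drawfps` puts the FPS counter in the corner, and `drawgraph` (if your build supports it) adds a CPU/GPU frame-time graph, which is handy for spotting throttling.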


----------



## utnorris

Anyone have the AB download? The link seems to be broken.


----------



## CallsignVega

OK, so I figured out the problem. The same issue that is reversed and broken in portrait bezel correction is what is causing my cards to go haywire. If I use a stock non-bezel-corrected Eyefinity resolution, the cards work fine.

On air I can get them up to 1160 MHz core and 1500 mem with the fans at 100%, and the GPUs max out around 80C. Both cards are identical at 72% ASIC. So now I have to re-do my Titan benchmarks, since those were bezel-corrected numbers. This time, since TSM has gotten ~1300 MHz core, about the same as my Titans, I will lower the Titans' core to match the 290X's core to make it a more even test. If I stick with the 290Xs, I obviously would get water blocks, and would hope AMD can fix this bezel correction driver bug.


----------



## Mr357

Quote:


> Originally Posted by *utnorris*
> 
> Anyone have the AB download? The link seems to be broken.


I wouldn't bother with it anyway. Some have said that it's a bit glitchy, and I couldn't get voltage control to work no matter what I did. Although it's not a great alternative, GPUTweak at least works for me.


----------



## MrWhiteRX7

Just threw my Sapphire 290 in, installed 13.11 beta 9.2, loaded the AB b17, unlocked voltage, restarted...

Power Limit +50
Core Clock 1200
Core Voltage +100
Mem clock 1375

Right out of the gate it's jammin' in the Heaven bench so far. Not done testing yet; I'll show validation once done, but I'm just so happy right now.







I had this thing at 1200 core less than a minute after loading the damn driver and benching without any hassle.

Just wanted to throw some light out for the people seeing all the craziness going on in here. GEEEEEEEEEZ


----------



## Kriant

What's with these R9 290s? Are they superglued to their coolers or something? I can't take the damn thing apart without fearing I'll break it in half.


----------



## utnorris

Well, I just updated to the WHQL driver and got a higher score at a slightly lower overclock in Firestrike. Keep them coming, AMD.


----------



## Heinz68

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I've used the R9 290 for only 1 day now and have only tested BF4. I have not played more than 30 mins at a time. Also, my CPU is very weak, but the resolution is 1440p, which is a bit more GPU-bound.
> 
> The 290 is fully unlocked with the 290X Uber BIOS.
> 
> All I can say after testing is that reviews have exaggerated the noise this card makes. With 30 mins of BF4 I was hitting ~92C MAX. Clocks based on MSI AB were 980-1000MHz. Fan speed hit 47% MAX but was ~44% most of the time.
> 
> Compared to my older stock HD 7970, the tone of the fan is a lot better. I also love the fact that the card does not bother to speed up the fan even when it hits the 70s. This is nice if you're using the card for things besides gaming.
> 
> A lot of people forget how "amazing" the HD 7970 launch was. Yeah, it was an amazing overclocker, but the drivers were meh.
> 
> *To people that complain about the 290 being cheaper than the 290X: well, that has always been the case. The HD 7970 is like 5% faster clock-for-clock than the HD 7950; same with the HD 6970/50 and HD 5870/50. You always pay a premium for the higher-tier card.*
> 
> As far as reviews go, comparing a 290 @ 47% fan speed and a 290X @ 40% is not really an apples-to-apples comparison.


So true. The same goes for Nvidia or Intel, only the premium is much higher.


----------



## MrGaZZaDaG

Hi All,

Firstly I have not gone through all 698 pages.. So if this has been mentioned before "sorry in advance".

But does anyone know, or have any links to, what the typical R9 290 overclocking limits are on water?
I just got my XFX R9 290 watercooled with the EK block.

Cheers


----------



## jerrolds

Quote:


> Originally Posted by *Kriant*
> 
> what's with those R9 290, they are superglued to their coolers or something ? I can't take that damn thing apart without fearing to break it in half


Don't forget: there are 2 screws on the back panel where the output connectors are. I was the same way. Just unscrew everything you can find and the shroud should come off with a bit of force; the pads and TIM are a bit sticky. Use your judgement though, and be careful.


----------



## Kriant

Quote:


> Originally Posted by *jerrolds*
> 
> Don't forget: there are 2 screws on the back panel where the output connectors are. I was the same way. Just unscrew everything you can find and the shroud should come off with a bit of force; the pads and TIM are a bit sticky. Use your judgement though, and be careful.


I've unscrewed everything, the center just won't budge


----------



## CallsignVega

I needed to take a break between benchmarks so I decided to have a different kind of test:






All in good fun..


----------



## Raxus

Quote:


> Originally Posted by *CallsignVega*
> 
> I needed to take a break between benchmarks so I decided to have a different kind of test:
> 
> 
> 
> 
> 
> 
> All in good fun..


lol


----------



## Scorpion49

Quote:


> Originally Posted by *CallsignVega*
> 
> I needed to take a break between benchmarks so I decided to have a different kind of test:
> 
> 
> 
> 
> 
> 
> All in good fun..


----------



## Raxus

I know I've asked this a few times in this thread and no one seems to answer me.

Between MSI and Asus, which would be best for watercooling warranty wise?


----------



## Scorpion49

Quote:


> Originally Posted by *Raxus*
> 
> I know I've asked this a few times in this thread and no one seems to answer me.
> 
> Between MSI and Asus, which would be best for watercooling warranty wise?


I prefer MSI, very very easy warranty support. Asus is hit or miss, with a lot more misses especially if they can put it down to "customer induced damage".


----------



## Ukkooh

Quote:


> Originally Posted by *Raxus*
> 
> I know I've asked this a few times in this thread and no one seems to answer me.
> 
> Between MSI and Asus, which would be best for watercooling warranty wise?


You lose warranty on both when watercooling so flip a coin.


----------



## evensen007

Quote:


> Originally Posted by *MrGaZZaDaG*
> 
> Hi All,
> 
> Firstly I have not gone through all 698 pages.. So if this has been mentioned before "sorry in advance".
> 
> But does anyone know or have any links where I can find out the R9 290 limits to overclocking on water,
> I just got my XFX R9 290 watercooled with the EK block.
> 
> Cheers


Depends on how you draw in the chip lottery. My 2 XFX 290s won't hit 1100 on water and won't even run at 1000 at stock volts.


----------



## Raxus

Quote:


> Originally Posted by *Ukkooh*
> 
> You lose warranty on both when watercooling so flip a coin.


Well, clearly there are some that are easier to RMA than others after removing the stock cooler; that's what I'm asking *nudge nudge* *wink wink*


----------



## Ukkooh

Quote:


> Originally Posted by *Raxus*
> 
> Well clearly theres some that are easier to RMA than others after removing the stock cooler, thats what im asking *nudge nudge* *wink wink*


Club3D fully supports watercooling on the AMD side.


----------



## Forceman

Quote:


> Originally Posted by *Kriant*
> 
> I've unscrewed everything, the center just won't budge


Sometimes gently twisting it helps break the seal, if only the GPU is holding it. Be careful you don't damage the memory pads though - just a little twisting pressure, don't unscrew it like a bottle top. Or you can try lightly prying the edge up with something plastic.


----------



## Scorpion49

Quote:


> Originally Posted by *Raxus*
> 
> Well clearly theres some that are easier to RMA than others after removing the stock cooler, thats what im asking *nudge nudge* *wink wink*


I've never had a problem with removing the stock cooler on any brand; even XFX, with the stickers on the screws, will tell you it's fine, you just have to return it in original condition.

With Asus, I've taken to pretending the products have no warranty when I buy them. I've had enough of their RMA system to last me a lifetime, hence the pile of motherboards and graphics cards in my closet.


----------



## Elmy

http://www.3dmark.com/3dm11/7528468

Just playing around with my Club3D 290X's under water.



21,210P

Can't wait to get my other 2 installed. Need new mobo and CPU.

Temps didn't get over 46 degrees C


----------



## Raxus

Quote:


> Originally Posted by *Scorpion49*
> 
> I've never had a problem with removing the stock cooler on any brand, even XFX with the stickers on the screw will tell you its fine, you just have to return it in original condition.
> 
> With Asus, I've taken to pretending the products have no warranty if I buy it. I've had enough of their RMA system to last me a lifetime, hence the pile of motherboards and graphics card in my closet.


I got a lot of runaround from Asus with my G55VW laptop.

So you've never had any issues with MSI? Any stickers on their cards?


----------



## Hogesyx

My original card has coil whine, so while deciding between staying with the Sapphire 290 or upgrading to a 290X, the good folks here recommended I stay with the 290 and get an Arctic cooler. Just when I finally decided to stick with my 290 and went back to my local store to swap it, the 290 was out of stock, with a wait of around 2 weeks!

Ended up topping up to a 290X! I am sure I am gonna miss the value-for-money 290, but I guess the 290X will not disappoint me either. I'll probably still buy an Arctic on my next paycheck.


----------



## Scorpion49

Quote:


> Originally Posted by *Raxus*
> 
> I had a lot of run around with asus with my g55vw laptop.
> 
> So you've never had any issues with MSI? any stickers on their cards?


No, they've been helpful to me several times. I've RMA'd one motherboard and at least 3 or 4 graphics cards. Their warranty is serial-number based, so there's no need for receipts, which is nice if you buy used. They have an RMA center in California, so I usually get a replacement in less than a week.


----------



## grandpatzer

So I got my R9 290 and decided to bench it in my secondary PC (Sandy Bridge Dual Core 2.6ghz).

*All Below is Unigine Valley Extreme HD*

All Stock stuff 947c1250m
*FPS: 48.6 Score: 2033*

ALL 3 OC BELOW is MSI AB: *+100mV, Power Limit +50%, 100% GPU Fanspeed ALWAYS*

1000c1500m
run1: FPS: 51.8 Score: 2169
run2: FPS: 51.3 Score: 2146

1100c1500m
run1: FPS: 52.7 Score: 2205
run2: FPS: 53.3 Score: 2229

*************************************************************************
*Highest I tried*; also, it seems MSI AB only goes to 1235MHz core...

*1200c1500m
Run1: FPS: 54.6 Score: 2284
Run2: FPS: 54.0 Score: 2259*

The average is *54.3fps* vs *48.6fps* stock: an *11.7% FPS increase* from a *26.7% core increase* and a *20% memory increase*.
Not sure if that is *good scaling*; maybe I'll get better FPS once it's in my primary PC with a much better CPU.

GPU-Z VRM1 MAX 59c, VRM2 MAX 62c.
It reports a MAX VDDC of 1.320v, but during the bench it sat around 1.25v - 1.29v if memory serves; I didn't really see it go above 1.3v.
VDDCI MAX is 1.000v; there is no AUX slider on my MSI AB?
These are after 2 runs of Valley Extreme HD.
My screenshot does not show the GPU core temperature, but I believe the MAX was 76c; I don't think it ever reached 80c with 100% fan speed.

*************************************************************************

*What do you guys think, does that 11.7% fps increase make sense going from 947mhz Core / 1250 Memory to 1200mhz Core / 1500 Memory?*
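For anyone who wants to double-check that math, here's a quick sketch (Python, with the numbers taken straight from the runs above):

```python
# Sanity check of the overclock scaling numbers from the Valley runs above.
stock_core, stock_mem, stock_fps = 947, 1250, 48.6
oc_core, oc_mem = 1200, 1500
oc_fps = (54.6 + 54.0) / 2  # average of the two 1200c/1500m runs

def pct_gain(new, old):
    """Percentage increase of new over old."""
    return (new - old) / old * 100

core_gain = pct_gain(oc_core, stock_core)  # ~26.7%
mem_gain = pct_gain(oc_mem, stock_mem)     # ~20.0%
fps_gain = pct_gain(oc_fps, stock_fps)     # ~11.7%

print(f"core +{core_gain:.1f}%, mem +{mem_gain:.1f}%, fps +{fps_gain:.1f}%")
```

An FPS gain well under half of the core clock gain usually points at a bottleneck elsewhere (the dual-core CPU here is a prime suspect), so re-testing in the primary rig makes sense.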


----------



## Raxus

Quote:


> Originally Posted by *Scorpion49*
> 
> No, they've been helpful to me several times. I RMA'd one motherboard and at least 3 or 4 graphics cards. They are serial based so no need for receipts which is nice if you buy used. They have an RMA center in California so I usually get a replacement in less than a week.


I'm in Philadelphia. I've never owned any MSI products; I've always been an Asus guy. A little wary of MSI.

And every Sapphire product I've owned has given me issues.


----------



## Scorpion49

Quote:


> Originally Posted by *Raxus*
> 
> I'm in Philadelphia. I've never owned any MSI products; I've always been an Asus guy. A little wary of MSI.
> 
> And every Sapphire product I've owned has given me issues.


They used to be pretty crappy back in the day, but they seem to have worked hard since. I've had many of their boards and cards recently: at least 6 of the TFIII 7950/7970 cards and a couple of Lightnings. I'm using the X79A-GD45 Plus right now and it works great. I used to be primarily Asus as well, but when their support went downhill I started looking at other options; when I spend hundreds of dollars on something, I expect the support to be up to par.


----------



## Raxus

Quote:


> Originally Posted by *Scorpion49*
> 
> They used to be pretty crappy back in the day, but they seem to have worked hard since. I've had many of their boards and cards recently: at least 6 of the TFIII 7950/7970 cards and a couple of Lightnings. I'm using the X79A-GD45 Plus right now and it works great. I used to be primarily Asus as well, but when their support went downhill I started looking at other options; when I spend hundreds of dollars on something, I expect the support to be up to par.


Absolutely, you watercool any of their parts? I'm just starting watercooling and of course I'm really paranoid hahah.


----------



## Scorpion49

Quote:


> Originally Posted by *Raxus*
> 
> Absolutely, you watercool any of their parts? I'm just starting watercooling and of course I'm really paranoid hahah.


A couple of them. Same with any brand, take your time and read up on water cooling and it will work great. Patience is key, rushing to get the machine running so you can see awesome temps is how you set yourself up for failure.


----------



## Raxus

Quote:


> Originally Posted by *Scorpion49*
> 
> A couple of them. Same with any brand, take your time and read up on water cooling and it will work great. Patience is key, rushing to get the machine running so you can see awesome temps is how you set yourself up for failure.


Think I'll give MSI a go then. Thanks for the help, man.

Anyone know what kind of memory they've been shipping with?


----------



## Scorpion49

Quote:


> Originally Posted by *Raxus*
> 
> Think I'll give MSI a go then. Thanks for the help, man.
> 
> Anyone know what kind of memory they've been shipping with?


You should be happy; they're all reference right now with different stickers slapped on anyway, so it should be about the same no matter what brand you get. As for memory, I ended up with Hynix, but I think it's just down to luck.


----------



## Raxus

Quote:


> Originally Posted by *Scorpion49*
> 
> You should be happy; they're all reference right now with different stickers slapped on anyway, so it should be about the same no matter what brand you get. As for memory, I ended up with Hynix, but I think it's just down to luck.


All I know is my Sapphire dropped Grimlock turds in the bed. That's the third card I've had from them that's done that.


----------



## Raxus

Quote:


> Originally Posted by *Scorpion49*
> 
> You should be happy; they're all reference right now with different stickers slapped on anyway, so it should be about the same no matter what brand you get. As for memory, I ended up with Hynix, but I think it's just down to luck.


Did it have the VOID WARRANTY stickers?


----------



## Hogesyx

Quote:


> Originally Posted by *Raxus*
> 
> All I know is my Sapphire dropped Grimlock turds in the bed. That's the third card I've had from them that's done that.


My Sapphire experience has been pretty good; maybe I got lucky? That's why I went back to Sapphire for my 290, and now for my 290X as well. The only issue I've had is the coil whine on my 290 at high speed.


----------



## jerrolds

Quote:


> Originally Posted by *Raxus*
> 
> Did it have the VOID WARRANTY stickers?


Sapphire cards don't have stickers, but don't tell them you've removed the HSF.


----------



## Scorpion49

Quote:


> Originally Posted by *Raxus*
> 
> All I know is my Sapphire *dropped Grimlock turds in the bed*. That's the third card I've had from them that's done that.


----------



## Raxus

Quote:


> Originally Posted by *jerrolds*
> 
> Sapphire cards don't have stickers, but don't tell them you've removed the HSF.


I was asking about MSI. Thanks though!


----------



## grandpatzer

Quote:


> Originally Posted by *CallsignVega*
> 
> OK, so I figured out the problem. The same issue that is reversed and broken in portrait bezel correction is what is causing my cards to haywire. If I use a stock non-bezel corrected Eyefinity resolution, the cards work fine.
> 
> On air I can get them up to 1160 MHz core and 1500 Mem with fan at 100% and GPU's max out around 80C. Both cards are identical 72% ASIC. So now I have to re-do my Titan benchmarks since those were bezel corrected numbers. This time I think since TSM has gotten ~1300 MHz core, about the same as my Titans, I will lower the Titans core to match the 290X's core to make it a more even test. If I stick with the 290x's, I obviously would have water blocks and would hope AMD could fix this bezel correction driver bug.


Is a high ASIC good for those of us who watercool and also try to OC?
Mine has a 76.9% ASIC, but it's in my secondary rig until my waterblock arrives.

I ran Unigine Valley @ 1200 core / 1500 memory with MSI AB: +100mV, Power Limit +50%, 100% GPU fan speed.
But I probably need a stronger CPU to rule out a bottleneck and make sure it passes Unigine Valley.


----------



## Raxus

Quote:


> Originally Posted by *Scorpion49*


Grimlock? Transformers?


----------



## ZealotKi11er

Quote:


> Originally Posted by *grandpatzer*
> 
> Is a high ASIC good for those of us who watercool and also try to OC?
> Mine has a 76.9% ASIC, but it's in my secondary rig until my waterblock arrives.
> 
> I ran Unigine Valley @ 1200 core / 1500 memory with MSI AB: +100mV, Power Limit +50%, 100% GPU fan speed.
> But I probably need a stronger CPU to rule out a bottleneck and make sure it passes Unigine Valley.


The higher the better. Lower is only good in extreme cooling.


----------



## MrGaZZaDaG

Thanks Evensen,

I guess there's really only one way to find out: slowly overclock on stock volts to find my max, then tinker with the volts beyond that...









Will post some results over the weekend


----------



## CallsignVega

Some interesting results coming out of my 290X vs Titan comparo. On my 3240x1920 120 Hz setup, they go back and forth in wins but average out at around the same performance clock for clock.

I guess technically, clock for clock, that would make the 780 Ti Classified probably the fastest GPU setup to have. Its performance per clock should be the highest, and its overclock headroom should be higher than both the 290X's and the Titan's.









The 290X is the best performance per dollar though, no doubt.


----------



## MrWhiteRX7

Been playing the hell out of Batman Origins and getting a lot of downclocking and erratic GPU usage. The game runs perfectly smooth; I didn't notice until I exited and looked at my datalog. Went back to stock 290X clocks and averaged the same FPS, but with a lot less downclocking and much better GPU usage, LOL. I'm not worried about it at the moment; it's not affecting my gameplay.









About to try the WHQL driver to see if that helps.


----------



## Hogesyx

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Been playing the hell out of Batman Origins and getting a lot of downclocking and erratic GPU usage. The game runs perfectly smooth; I didn't notice until I exited and looked at my datalog. Went back to stock 290X clocks and averaged the same FPS, but with a lot less downclocking and much better GPU usage, LOL. I'm not worried about it at the moment; it's not affecting my gameplay.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> About to try the WHQL driver to see if that helps.


I believe the 13.11 WHQL is dated prior to 9.2.


----------



## MrWhiteRX7

Hmmm, I just checked and it seems it came a few days after, but you may be right. Oh well, fug it, I already wiped and installed, lol. Let's see what happens.


----------



## DampMonkey

Quote:


> Originally Posted by *Hogesyx*
> 
> I believe the 13.11 WHQL is dated prior to 9.2.


The WHQL came out later, but it's an earlier build; someone pointed out that the OpenGL revisions in the WHQL were dated earlier than the 9.2 beta's.


----------



## MrWhiteRX7

Yeah, it's acting the same way in game. Not a biggie right now... it's funny, I turned the MSAA up to 8x and only dropped a few FPS, but noticed the core clock stabilized, lol. Still waaaaaay above my refresh rate. Gonna leave it like that.

Maybe I need to bump the voltage? Or try the Asus BIOS?

I must say though, I'm very happy with my 290.


----------



## blue1512

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Yeah, it's acting the same way in game. Not a biggie right now... it's funny, I turned the MSAA up to 8x and only dropped a few FPS, but noticed the core clock stabilized, lol. Still waaaaaay above my refresh rate. Gonna leave it like that.
> 
> Maybe I need to bump the voltage? Or try the Asus BIOS?
> 
> I must say though, I'm very happy with my 290.


The usage behavior of your card is exactly the same as my old 7950 Boost's when not overclocked. I think that is just how AMD's PowerTune works. Hopefully once you raise the power target and clock, the usage bar will be more stable.
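That PowerTune behavior can be pictured with a toy model (just an illustration of power-target throttling, not AMD's actual algorithm — the real controller works on estimated board power, current, and temperature at much finer granularity):

```python
def effective_clock(requested_mhz, est_power_w, power_target_w, floor_mhz=300):
    """Toy PowerTune: if estimated board power exceeds the target,
    scale the clock down proportionally; otherwise run as requested."""
    if est_power_w <= power_target_w:
        return requested_mhz
    return max(floor_mhz, requested_mhz * power_target_w / est_power_w)

# At the stock power target a heavy load forces a downclock;
# raising the power limit (e.g. +50%) lifts the target so the
# same load can hold the requested clock.
print(effective_clock(1000, 250, 208))        # throttled below 1000
print(effective_clock(1000, 250, 208 * 1.5))  # holds the full 1000
```

This is why cranking the Power Limit slider usually flattens out the clock graph even when the overclock itself is unchanged.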


----------



## Kriant

Quote:


> Originally Posted by *Raxus*
> 
> Well clearly theres some that are easier to RMA than others after removing the stock cooler, thats what im asking *nudge nudge* *wink wink*


Been chatting with Asus support today.
Quote:


> Chester B Wed, 11/20/2013 09:22:37 pm
> Hello XXX, thank you for contacting ASUS support. Please give me a few moments to review your information. I will be with you shortly.
> 
> XXXX Wed, 11/20/2013 09:25:04 pm
> Hello Chester, I would like to know whether installing an aftermarket cooler or an EK waterblock will void the warranty on an Asus AMD R9 290. Reason for the installation: the horrendous 95C-and-above temperatures that those cards produce, which makes it a hassle to run them in tri- or quad-fire (three or four cards). Installing a waterblock would lower the temperatures to a reasonable 40s and 50s under load on those cards, but I'm not sure whether ASUS allows installation of waterblocks on those cards without voiding the warranty
> 
> Chester B Wed, 11/20/2013 09:27:18 pm
> Hi XXX, for your question if there is no physical damage or tampering on the device, the warranty for the graphic card will not be voided.
> 
> XXX Wed, 11/20/2013 09:28:47 pm
> Please define tampering. Because removing the stock cooler means touching the warranty sticker on one of the screws
> 
> I mean, if I ever have to RMA the card, I would obviously put the stock cooler back on
> 
> but will removing the sticker to replace the cooler be classified as tampering?
> 
> Chester B Wed, 11/20/2013 09:30:56 pm
> No it will not classified as tampering. Just make sure that there is no physical damage when the repair team check the Graphic when you placed it for RMA.
> 
> XXXv Wed, 11/20/2013 09:31:21 pm
> Ok, thank you for clarification
> 
> That will be all
> 
> Chester B Wed, 11/20/2013 09:31:49 pm
> I appreciate your time and patience. Thank you for choosing ASUS!


----------



## Sazz

Quote:


> Originally Posted by *CallsignVega*
> 
> Some interesting results coming out of my 290x vs Titan comparo. On my 3240x1920 120 Hz setup, they go back and forth in wins but averaging out around the same performance clock for clock.
> 
> I guess technically clock for clock that would make the 780Ti Classified probably the fastest GPU setup to have. It's performance per clock should be the highest, and it's overclock headroom should be both higher than the 290x and Titan.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 290x is best performance per dollar though no doubt.


You mean the R9 290 (non-X) is the best performance/dollar among the top-tier cards from both camps. I just can't justify the extra $150+ between the 290 and the 290X when the performance difference is a mere 3-5 FPS on average, and you're already getting 60+ FPS in pretty much every game at 1080p.

If the 290X were around $450-470 ($50-70 more than the 290), then sure, why not, but $150 can get you an aftermarket cooler for the 290.

Quote:


> Originally Posted by *Kriant*
> 
> Been chatting with Asus support today.


Might wanna save that chat log as well for future use, in case they later say the warranty has been voided. I've encountered a scenario like that before; now every time I talk to a CSR through chat, I save the logs to be safe in case they turn their backs on me.


----------



## Kriant

Unfortunately, I failed to remove the sticker without it ripping apart.

1 card down, 2 to go. Going to finish up tomorrow, and get my loop going.

EK needs better instructions that clearly say the top-right screw hole on the card/block requires a different screw than the rest of the block ._.


----------



## RAFFY

Quote:


> Originally Posted by *Warsam71*
> 
> Hello everyone!
> 
> I apologize for my late response...
> 
> I just wanted to let you know we are aware of the black screen issues and currently working on it. I've exchanged a few pm with @Arizonian in the past few days about this issue. He has provided me a thorough summary of the issue, possible sources of the problem, as well as links to existing threads; which I have already forwarded to our Customer Support and Driver Team. Very helpful
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thank You to everyone who has posted, and more importantly Thank You for your patience


When will the new drivers be available? I see they were not released on Wednesday.


----------



## Kriant

Quote:


> Originally Posted by *Sazz*
> 
> You mean the R9 290 (non-X) is the best performance/dollar among the top-tier cards from both camps. I just can't justify the extra $150+ between the 290 and the 290X when the performance difference is a mere 3-5 FPS on average, and you're already getting 60+ FPS in pretty much every game at 1080p.
> 
> If the 290X were around $450-470 ($50-70 more than the 290), then sure, why not, but $150 can get you an aftermarket cooler for the 290.
> Might wanna save that chat log as well for future use, in case they later say the warranty has been voided. I've encountered a scenario like that before; now every time I talk to a CSR through chat, I save the logs to be safe in case they turn their backs on me.


Yeah, I saved it to my email transcript right away.


----------



## Falkentyne

Quote:


> Originally Posted by *sugarhell*
> 
> Wait wait wait. So you buy a card based on Newegg's spec sheet? You don't do any research? Reviewers knew about the new PowerTune and still didn't inform anyone how it works. Do you think AMD didn't tell them how the new PowerTune works? Why does hardware.fr have the best review of PowerTune? Maybe because Guru3D has been copy-pasting their reviews for the last 2 years, just adding the results to the graphs? Just because NVIDIA has a different boost doesn't mean AMD's boost is the same. Also, do you mean how reviewers just report the base boost (for example on the 770) when the card boosts close to 1200?


Definitely agree.

And seriously, for the rest of you, stop trolling AMD about the core speed!! KEEP THE CARD COOL AND IT RUNS AT 1000 MHZ. I ran BOTH my 6970 and 7970 at a locked 60% fan speed and neither throttled. And I run my 290X at 70% fan speed (when not overvolting), and IT DOESN'T THROTTLE EITHER.

And DID ANY OF YOU BASHERS REALIZE that AMD fixed the "scrambled video/YouTube screen" problem by REMOVING the UVD clocks? (That happened if you overclocked the memory by even 1 MHz on 144Hz/some 120Hz screens or on multi-monitor setups, and sometimes even at lower refresh rates if you alt-tabbed from a game to a stream.) I bet no one noticed that, huh?

Big step in the right direction.


----------



## Arizonian

Quote:


> Originally Posted by *DarknightOCR*
> 
> here is my sapphire
> 
> now with EK block
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Temp idle : 31ºC
> 
> Temp run 3dmark clocks stock : 42ºC


Congrats - added









Quote:


> Originally Posted by *Gralle*
> 
> Hi all!
> 
> First time poster and I like to get in to this exclusive club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sapphire Radeon R9 290X with EK-FC R9-290X - Acetal waterblock.
> 
> http://www.3dmark.com/fs/1166818
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *HeliXpc*
> 
> Just got my 290X; overclocked, it's slower than my GTX 780 OC'd to 1200 core / +550 on mem. Still going to play around with it a bit; max OC is 1150 core and 1400 on mem with 1.25v on the core (+100 in Afterburner). Set up a custom fan profile to keep it around 80c under full load. So far it's a good card, but at $550 I wouldn't recommend it; the GTX 780 overclocked is a better card overall: faster, less power, less noise. Here are some photos.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s145.photobucket.com/user/gar818/media/20131118_184028_zps8a01c279.jpg.html
> http://s145.photobucket.com/user/gar818/media/20131118_184018_zps784e3292.jpg.html
> http://s145.photobucket.com/user/gar818/media/20131118_184723_zps18fb4312.jpg.html


Congrats - added









Quote:


> Originally Posted by *Skrillex101*
> 
> As you wish, give me 2 minutes. and ill edit this post.
> 
> ADDED. I added the stock cooler next to the PC case where you can see the Accelero, should be enough proof.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added as well







Sorry it was less than welcoming, but I hope you hang around. It's not always like this.

On another note: what a day I missed. Thank you to the other moderators for taking care of the thread in my day-long absence. I don't see how any of the bashing was productive or one bit helpful to the cards.

Anyway, still glad to hear we have drivers coming real soon to address the black screen issues for those experiencing them here. Can't wait for aftermarket cooling to put that subject to rest as well.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *blue1512*
> 
> The usage behavior of your card is exactly the same with my old 7950 boost when not overclocked. I think that is how AMD's powertune works. I hope that when you raise the power target and clock, the usage bar will be more stable.


It is downclocking with the overclock and +50%. No matter what I set the core clock at, it stays around 1000MHz. I'm at +100mV as well. I plan to try the Asus BIOS tomorrow.


----------



## james111333

Does anyone know about editing the card's BIOS settings?
I was gutted to see that my 290 is just like my 7950 in that, on my TV (a 42" Panasonic plasma, which is my monitor), all I get during POST and startup is coloured lines, millions of them! I read somewhere that the issue is that the card outputs a resolution that HD TVs don't like, and I wondered if there was any way to change it to something like 1280x720? It's a pain, as my PhysX card from the dark ages (a 9800GT) works perfectly and displays POST etc. just fine... The HDMI out from my Maximus VI F also works fine; it just seems to be AMD discrete cards. If anyone knows how I might be able to avoid this, specifically on my new 290, I'd love to hear some input.


----------



## Sazz

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> It is downclocking with the overclock and +50%. No matter what I set the core clock at, it stays around 1000MHz. I'm at +100mV as well. I plan to try the Asus BIOS tomorrow.


When I tried using the new AB with the voltage mod, I didn't get any overclocking done; none of my overclocks were even stable. It seems to me it's not adding any voltage at all.

But when I used the stock Asus BIOS with GPU Tweak, I was able to get 1270/1465 on the 290X that I had (sold it to my friend; an R9 290 "non-X" is on the way xD).

I'm gonna wait til Sapphire Trixx comes out and see if that works; if not, then I'll wait til AB gets a better volt mod. I don't get why they expose the voltage as "+/- mV". Why can't they do it like GPU Tweak, where you set the voltage to exactly where you want it? If you want 1.4v, show it being set to 1.4v, not some +/- crap.
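On the +/- mV gripe: the offset is applied on top of whatever VID the card is running at that moment, so the absolute voltage depends on the card and its current state. A rough conversion looks like this (the 1.180 V stock VDDC is just an example figure — read your own card's value from GPU-Z):

```python
def absolute_vddc(stock_vddc_v, offset_mv):
    """Convert an Afterburner-style mV offset into an absolute voltage.
    stock_vddc_v: the VDDC your card runs at with no offset (varies per card)."""
    return stock_vddc_v + offset_mv / 1000.0

# Example: a card running 1.180 V stock with a +100 mV offset
print(round(absolute_vddc(1.180, 100), 3))  # 1.28
```

Which is also why two cards at the same "+100mV" can be sitting at quite different real voltages.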


----------



## grandpatzer

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> It is downclocking with the overclock and +50%. No matter what I set the core clock at, it stays around 1000MHz. I'm at +100mV as well. I plan to try the Asus BIOS tomorrow.


Did you solve your problem?

I'm having some downclocking in Unigine Valley; not sure if it's because of a weak CPU (it's in my secondary PC).


----------



## rass101

Just finished my first build and ran 3DMark on a completely stock system: http://www.3dmark.com/3dm11/7529252
I got some quite big artifacts right before the final test. Is this something I should worry about, or is it normal?


----------



## um2802

Any idea how I can solve the stupid BSOD 0xA0000001?
It is related to the ATI driver and is extremely random.
Any help, please?
THX


----------



## Duvar

For those who are waiting for custom designs, check this out: http://forums.overclockers.co.uk/showthread.php?t=18559156
Especially post 21...


----------



## Heinz68

Quote:


> Originally Posted by *RAFFY*
> 
> When will the new drivers be available? I see they were not released on Wednesday.


This is the latest news from AMD rep:
Quote:


> Originally Posted by *Warsam71*
> 
> Hello everyone (@Arizonian)
> 
> I apologize, I just wanted to let you know we need a few more days to post the driver. I will make sure to keep you up-to-date on our progress...


----------



## Heinz68

Quote:


> Originally Posted by *Duvar*
> 
> For those who are waiting for custom designs, check this out: http://forums.overclockers.co.uk/showthread.php?t=18559156
> Especially post 21...


I did read post #21, and hopefully Gibbo is wrong.
Here is what Kyle_Bennett posted at HardOCP on 11-15-2013:
Quote:


> R9 290/X AIB Cards
> Just talked to AMD as well as three big AIB partners and AMD as well as one big AIB told me that we should expect to see custom R9 290 and 290X cards in etail the second week of December. FYI....


----------



## hatlesschimp

Second week of December is perfect. Why miss out on Christmas?

I want 5x1 Eyefinity @ 144Hz, and I think the current vanilla 290X cards won't do it. So I was wondering: should I buy a second 290X now, and then when the right non-reference card with the right outputs comes along, use it as the output for the 5 monitors? In the end I'd have 2 reference 290Xs and a non-reference card in tri-fire.


----------



## Kipsofthemud

Because it's another 2-week wait for my 290X, I decided to get a 290 in the meantime.







Here's my Club3d 290 with waterblock, backplate and kittehs.



Waterblock installed:



Pretty :3



Kitty happy too 

I only played some Max Payne 3 so far because it was late but I like the card anyway!

Loop is not totally bled yet, I think, but my temps are:

- 29C idle
- 38C max load after 4 runs of Valley and 2 hours of Max Payne 3.

This is with a 360 and 240 rad, 1 gpu 1 cpu loop.


----------



## Gilgam3sh

@ Arizonian

Can you please update my card to a Powercolor R9 290X (I guess it's a 290X now, since I got a 290X chip with a 290 BIOS) with an Arctic Accelero Xtreme III?


----------



## kcuestag

I just got my replacement card (With Elpida vram as well) and it has the same damn problem, as soon as I load a game it'll instantly black screen!!!

Oh, and I had written my previous card's S/N to make sure they didn't send me the same one, this one is brand new.

How awesome this is, just wasted a week without a GPU only to get a new one with the same issues.


----------



## hatlesschimp

Quote:


> Originally Posted by *kcuestag*
> 
> I just got my replacement card (With Elpida vram as well) and it has the same damn problem, as soon as I load a game it'll instantly black screen!!!
> 
> Oh, and I had written my previous card's S/N to make sure they didn't send me the same one, this one is brand new.
> 
> How awesome this is, just wasted a week without a GPU only to get a new one with the same issues.


What brand is it?


----------



## kcuestag

Quote:


> Originally Posted by *hatlesschimp*
> 
> What brand is it?


Another Sapphire.


----------



## rdr09

Quote:


> Originally Posted by *Jpmboy*
> 
> that 2700K is very capable of 5.0GHz. One of the best CPUs I've had (and still have).


here is Osjur's 2600K - 13000 physics score.









http://www.3dmark.com/3dm11/7509893

and a graphics score 290 owners can only dream of. Guess that's what the premium is for.

Quote:


> Originally Posted by *um2802*
> 
> Any idea how I can solve the stupid BSOD 0xA0000001?
> It is related to the ATI driver and is extremely random.
> Any help, please?
> THX


Please fill out Rig Builder . . .

http://www.overclock.net/rigbuilder

need more info.


----------



## eternal7trance

Quote:


> Originally Posted by *kcuestag*
> 
> Another Sapphire.


You could always wait for the driver that's supposed to fix the black screen issues, due out either today or tomorrow. That's what they said, anyway.


----------



## r0l4n

Quote:


> Originally Posted by *kcuestag*
> 
> I just got my replacement card (With Elpida vram as well) and it has the same damn problem, as soon as I load a game it'll instantly black screen!!!
> 
> Oh, and I had written my previous card's S/N to make sure they didn't send me the same one, this one is brand new.
> 
> How awesome this is, just wasted a week without a GPU only to get a new one with the same issues.


I'd go nVidia for the next 2+ generations if that happened to me.


----------



## Raxus

Quote:


> Originally Posted by *Kriant*
> 
> Been chatting with Asus support today.


Thanks!

I came across this thread, which isn't too old, so I assume it still applies.

MSI and Powercolor responses on adding waterblocks to GPUs:

http://www.overclock.net/t/1218606/gpu-watercooling-keeping-warranty/30


----------



## Raxus

Quote:


> Originally Posted by *eternal7trance*
> 
> You could always wait for the driver that's supposed to fix the black screen issues, due out either today or tomorrow. That's what they said, anyway.


The AMD rep did say in this thread that they have a driver fix coming for the black screening. I'd definitely wait and see before going through the RMA circus again.


----------



## rdr09

Quote:


> Originally Posted by *kcuestag*
> 
> Another Sapphire.


Any way to test it on another system?

A rep might be able to help . . .

http://www.overclock.net/t/1429534/greetings-ocn-from-sapphire


----------



## Raxus

Quote:


> Originally Posted by *kcuestag*
> 
> Another Sapphire.


Is it me, or does it seem like Sapphire 290Xs have more issues than other brands?


----------



## Joeking78

Quote:


> Originally Posted by *Raxus*
> 
> Is it me, or does it seem like Sapphire 290Xs have more issues than other brands?


I've got two of them and they're fine; maybe I got lucky (twice), though.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Raxus*
> 
> Is it me, or does it seem like Sapphire 290Xs have more issues than other brands?


Mine is a Sapphire; it only black screens when I set the mem clock too high.

Other than that, very happy with my card.


----------



## eternal7trance

No problems here with an MSI 290; it has Hynix memory, not sure if that makes a difference.


----------



## Ghostpilot

Left my system on overnight and awoke to find a hard-locked black screen. This is on an XFX 290 at stock with Elpida memory. It'd be something I'd RMA if I thought I could get one with Hynix memory. I'll wait for the driver update before pursuing the issue further.


----------



## hatlesschimp

I nearly chose Sapphire.


----------



## FloRin050

Is a 290 overkill for a 1080p monitor while playing BF4?


----------



## Kipsofthemud

Quote:


> Originally Posted by *Kriant*
> 
> Unfortunately, I failed to remove the sticker without it ripping apart.
> 
> 1 card down, 2 to go. Going to finish up tomorrow, and get my loop going.
> 
> EK needs better instructions that clearly say the top-right screw hole on the card/block requires a different screw than the rest of the block ._.


A different screw... what? I haven't done/seen that...


----------



## evensen007

Quote:


> Originally Posted by *Ghostpilot*
> 
> Left my system on overnight and awoke to find a hard-locked black screen. This is on an XFX 290 at stock with Elpida memory. It'd be something I'd RMA if I thought I could get one with Hynix memory. I'll wait for the driver update before pursuing the issue further.


One of my biggest complaints about my 2 290s with Elpida mem is that they aren't even stable at stock clocks/volts (under water blocks, no less). I hope the new driver truly does straighten out these issues, although my fear is that this is really design or hardware related.


----------



## smokedawg

I am very happy with my Powercolor 290X: a decent OC with the Asus BIOS and a big step up from my 7850. Skyrim with all the HD mods I could find plus ENB, and BF4, all run great @ 1440p. I read a lot of reviews and forums and realised I'd probably need to get into watercooling (something I always wanted to do). Mantle and drivers can only improve its performance.
This is the first time I've bought a "cutting edge" GPU since the late 90s, when I got this beast: Quantum3D's Obsidian2 X-24 Single Board Voodoo II SLI.


----------



## ZealotKi11er

Quote:


> Originally Posted by *kcuestag*
> 
> I just got my replacement card (With Elpida vram as well) and it has the same damn problem, as soon as I load a game it'll instantly black screen!!!
> 
> Oh, and I had written my previous card's S/N to make sure they didn't send me the same one, this one is brand new.
> 
> How awesome this is, just wasted a week without a GPU only to get a new one with the same issues.


That sucks.


----------



## jbottz

Apologies for the terrible picture, but you can update me to being watercooled. Need to do some tubing rework next time I drain the loop, but for now it's working like a champ. Great temperatures.


----------



## rdr09

Quote:


> Originally Posted by *evensen007*
> 
> One of my biggest complaints about my 2 290s with Elpida mem is that they *aren't even stable at stock clocks/volts* (under water blocks, no less). I hope the new driver truly does straighten out these issues, although my fear is that this is really design or hardware related.


My MSI 290 has Elpida and is stable up to 1500 at stock volts and BIOS (well, no voltage tweaks are available here yet). I have not tried any higher. ASIC is 79%, if that matters.

Could it be your PSU?


----------



## evensen007

Quote:


> Originally Posted by *rdr09*
> 
> my msi 290 has Elpida and is stable up to 1500 at stock volts and bios (well no voltage tweaks available here yet). have not tried any higher. ASIC is 79% if that matters.
> 
> could it be your psu?


Anything is possible, although I would be very surprised (and disappointed) to find that 2 290's were destroying my decent 1000 watt PSU.


----------



## Stay Puft

Whats the best brand of 290 for overclocking? Sapphire? Asus? XFX?


----------



## evensen007

Quote:


> Originally Posted by *rdr09*
> 
> my msi 290 has Elpida and is stable up to 1500 at stock volts and bios (well no voltage tweaks available here yet). have not tried any higher. ASIC is 79% if that matters.
> 
> could it be your psu?


Also, when I said stable at stock clocks/volts, I meant the core not really the memory.


----------



## rdr09

Quote:


> Originally Posted by *evensen007*
> 
> Anything is possible, although I would be very surprised (and disappointed) to find that 2 290's were destroying my decent 1000 watt PSU.


i hope you are right. not heard much about Mushkin. I did see tsm pulled like 800w from the wall (after efficiency) with 2 290X.

maybe you can experiement if you are willing to mess around with your crossfire again. have you tried oc'ing them individually? like disabling and disconnecting the power of one gpu to play with the other. if you are willing to. normally before crossfiring, you would want to test each car and see which one is a better oc'er, then install that as the primary.


----------



## Skrillex101

Quote:


> Originally Posted by *rdr09*
> 
> i hope you are right. not heard much about Mushkin. I did see tsm pulled like 800w from the wall (after efficiency) with 2 290X.
> 
> maybe you can experiement if you are willing to mess around with your crossfire again. have you tried oc'ing them individually? like disabling and disconnecting the power of one gpu to play with the other. if you are willing to. normally before crossfiring, you would want to test each car and see which one is a better oc'er, then install that as the primary.


Im running a 290X on a 520w PSU Seasonic Platiunum Fanless with no issues. 4.6GHZ OC on cpu. Not sure if that helps putting things into perspective.


----------



## HeliXpc

Guys, how do i check if i have the hynix or elpida memory sticks?


----------



## pounced

Hey guys I'm running a R9 290x Sapphire edition, Is there any news on MSI After Burner or any overclocking tools that work well or are compatible now?

The first week I had my card there was nothing supporting them yet but this is the first time I've checked back here in a while.


----------



## Skrillex101

Quote:


> Originally Posted by *HeliXpc*
> 
> Guys, how do i check if i have the hynix or elpida memory sticks?


Here you go mate: http://www.mediafire.com/download/voj4j1rlk0ucfz4/MemoryInfo+1005.rar

Link is taken from bottom of very first post in here


----------



## rdr09

Quote:


> Originally Posted by *Skrillex101*
> 
> Im running a 290X on a 520w PSU Seasonic Platiunum Fanless with no issues. 4.6GHZ OC on cpu. Not sure if that helps putting things into perspective.


if the amp available is sufficient, then it should work. i would not oc the gpu with that psu, though.


----------



## evensen007

Quote:


> Originally Posted by *Skrillex101*
> 
> Here you go mate: http://www.mediafire.com/download/voj4j1rlk0ucfz4/MemoryInfo+1005.rar
> 
> Link is taken from bottom of very first post in here


Is it easy (obvious) to check the memory on both Gpu's if you are in xfire, or will it only ready the primary GPU's memory?


----------



## Skrillex101

Quote:


> Originally Posted by *rdr09*
> 
> if the amp available is sufficient, then it should work. i would not oc the gpu with that psu, though.


Ye that would not work i think hehe, its a strong high quality PSU, but its right at the edge i think. One day ill get some measurement on it to check what im actually using hehe, i have OC´d without raising Voltages thou. that should keep me safe.


----------



## Skrillex101

Quote:


> Originally Posted by *evensen007*
> 
> Is it easy (obvious) to check the memory on both Gpu's if you are in xfire, or will it only ready the primary GPU's memory?


I got no idea mate. I just linked to the tool hehe


----------



## rdr09

Quote:


> Originally Posted by *Skrillex101*
> 
> Ye that would not work i think hehe, its a strong high quality PSU, but its right at the edge i think. One day ill get some measurement on it to check what im actually using hehe, i have OC´d without raising Voltages thou. that should keep me safe.


you see, Sk, this is the reason why serious oc'ers will pick a 7970 over a 7950 or a 290X over a 290, especially reference models . . .

http://www.3dmark.com/3dm11/7509893

but you can't achieve it if you will gimped on the other components.


----------



## Skrillex101

Quote:


> Originally Posted by *rdr09*
> 
> you see, Sk, this is the reason why serious oc'ers will pick a 7970 over a 7950 or a 290X over a 290, especially reference models . . .
> 
> http://www.3dmark.com/3dm11/7509893
> 
> but you can't achieve it if you will gimped on the other components.


In my opinion its not worth doubling the power consumption for a 5% FPS gain in games. This PSU is specificly selected for its purpose. My setup got litterally no sound comming out of it. yet its high end. Thats just my thing.

So my reason is the opposite of what your referering to actually hehe, i Picked the 290X, so i wouldnt have to overvoltage the 290 to get the extra performance


----------



## rdr09

Quote:


> Originally Posted by *Skrillex101*
> 
> In my opinion its not worth doubling the power consumption for a 5% FPS gain in games. This PSU is specificly selected for its purpose. My setup got litterally no sound comming out of it. yet its high end. Thats just my thing.


if it is just gaming . . . 290 and 520W psu. for gaming and a chance to be in the leaderboard . . .


----------



## kfxsti

Hey guys... anyone know if adding 2 ARCTIC Accelero Xtreme III's to my 290s will allow me to still fit the cards in crossfire on my Z77 sabertooth mobo?


----------



## Skrillex101

Quote:


> Originally Posted by *rdr09*
> 
> if it is just gaming . . . 290 and 520W psu. for gaming and a chance to be in the leaderboard . . .


A 290 thats not overclocked, cannot handle Eyefinity Gaming of BF4 at ULTRA.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kfxsti*
> 
> Hey guys... anyone know if adding 2 ARCTIC Accelero Xtreme III's to my 290s will allow me to still fit the cards in crossfire on my Z77 sabertooth mobo?


The AX III + Card takes up Three Slots.....maybe 3 and a bit, hope that helps


----------



## Kriant

Quote:


> Originally Posted by *Kipsofthemud*
> 
> a different screw...what? I havent done/seen that...


Well, EK gives you a set of screws of one size and one that is a bit thinner and 1.5 longer ( idk the size, it was 1am in the morning, when I was putting the block on the card).
When I've tried to use one of the shorter screws in the top right corner - it stuck and started twisting that metal screw-in with the actual screw ( I'll make a picture later today). After managing to take it off, I've used the longer thinner screw and voila, it stepped in and worked like a charm. Moral of the story: EK manual doesn't say I had to use a different screw in that corner, yet all other of the provided screws were getting stuck and starting to twist the screw-in. -_-


----------



## BIG MICHO

What is a good setting to benchmark the R9 290's. or should i say what is the standard that most people use to compare to.


----------



## kfxsti

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The AX III + Card takes up Three Slots.....maybe 3 and a bit, hope that helps


thanks !!! i think im better off water blocking them.. lol


----------



## Sgt Bilko

Quote:


> Originally Posted by *kfxsti*
> 
> thanks !!! i think im better off water blocking them.. lol


Yep, i'm planning on getting a 2nd 290x and going water myself actually......the AX III is nice but it's coming up to summer here and this card already gets fairly warm


----------



## rdr09

Quote:


> Originally Posted by *Skrillex101*
> 
> A 290 thats not overclocked, cannot handle Eyefinity Gaming of BF4 at ULTRA.


i am not challenging you but where did you base that info?


----------



## brazilianloser

Quote:


> Originally Posted by *Skrillex101*
> 
> A 290 thats not overclocked, cannot handle Eyefinity Gaming of BF4 at ULTRA.


I guess the two hours I spent playing yesterday were in my wild imagination. Clearly it's not 60fps but it's above 30 which is playable.


----------



## Skrillex101

Quote:


> Originally Posted by *rdr09*
> 
> i am not challenging you but where did you base that info?


I dont know, experience and intuition?.. If the 290 Stock can do Eyefinity 5760x1080 at Ultra settings in BF4 >50 FPS atleast, then damn, Gief! heh.


----------



## XtinG

hi everyone,

had also the black screen issue with my gigabyte R9 290. throttled it down (overdrive) to GPU-temp max 92°(celsius) and fan-speed to 42%. since then now probs, not satisfying but at least no more black screens. hope the new driver will fix the issue...

cheers from good old germany ;-)


----------



## rdr09

Quote:


> Originally Posted by *Skrillex101*
> 
> I dont know, experience and intuition?.. If the 290 Stock can do Eyefinity 5760x1080 at Ultra settings in BF4 >50 FPS atleast, then damn, Gief! heh.


both stock in games should not show much of a difference even in eyeinfinity. it is when you oc, i suppose, where we'll see the difference. if and when oc'ing is needed. but without hard data kinda hard to say.

have a nice day, Sk.

edit: oops, did not see Brazilianwinner's post.


----------



## esqueue

Quote:


> Originally Posted by *Kriant*
> 
> Well, EK gives you a set of screws of one size and one that is a bit thinner and 1.5 longer ( idk the size, it was 1am in the morning, when I was putting the block on the card).
> When I've tried to use one of the shorter screws in the top right corner - it stuck and started twisting that metal screw-in with the actual screw ( I'll make a picture later today). After managing to take it off, I've used the longer thinner screw and voila, it stepped in and worked like a charm. Moral of the story: EK manual doesn't say I had to use a different screw in that corner, yet all other of the provided screws were getting stuck and starting to twist the screw-in. -_-


The stock heatsink had three different types of screws. Twelve screws + four screws (gpu) + two screws (front plate). My ek acrylic copper block omitted the three outer screws near the power cables, the two screws for the front plate and 1 random screw on the inside.

I didn't notice any longer thinner screw out of the thirteen screws that I had to use to mount my block.


----------



## galil3o

Spent 8 hours last night building my water loop inside my shiny new bitfenix prodigy case.

I'm excited to see the load temps on my 290x with that EK waterblock. I'll post some pics and finally join the 290x owners list later today.

I'd do it now if it wasnt for this pesky job that pays for all this stuff.


----------



## Kriant

Quote:


> Originally Posted by *esqueue*
> 
> The stock heatsink had three different types of screws. Twelve screws + four screws (gpu) + two screws (front plate). My ek acrylic copper block omitted the three outer screws near the power cables, the two screws for the front plate and 1 random screw on the inside.
> 
> I didn't notice any longer thinner screw out of the thirteen screws that I had to use to mount my block.


I'll post a pick, when I come back home


----------



## Skrillex101

Quote:


> Originally Posted by *brazilianloser*
> 
> I guess the two hours I spent playing yesterday were in my wild imagination. Clearly it's not 60fps but it's above 30 which is playable.


Brazillian, i would like to see a picture of your eyefinity setup.







But im not really comfortable with just above 30, that means you will dip down under at occasions no?

But gratz that you can. Impressed that the Stock card can even get +30 FPS with 5760x1080 at Ultra in 64 Multiplayer.


----------



## esqueue

Quote:


> Originally Posted by *galil3o*
> 
> Spent 8 hours last night building my water loop inside my shiny new bitfenix prodigy case.
> 
> I'm excited to see the load temps on my 290x with that EK waterblock. I'll post some pics and finally join the 290x owners list later today.
> 
> I'd do it now if it wasnt for this pesky job that pays for all this stuff.


I do have some door fans that I will have to remove. They make the case ugly. All wires are hidden and tinted black except for the ugly ass wires to the door fan and to the water pump


----------



## brazilianloser

Quote:


> Originally Posted by *galil3o*
> 
> Spent 8 hours last night building my water loop inside my shiny new bitfenix prodigy case.
> 
> I'm excited to see the load temps on my 290x with that EK waterblock. I'll post some pics and finally join the 290x owners list later today.
> 
> I'd do it now if it wasnt for this pesky job that pays for all this stuff.


Looking forward to some pictures.


----------



## Ukkooh

Just picked up my club 3d 290x. I'll eat a bit and then I'll post some pictures.


----------



## Skrillex101

Quote:


> Originally Posted by *rdr09*
> 
> both stock in games should not show much of a difference even in eyeinfinity. it is when you oc, i suppose, where we'll see the difference. if and when oc'ing is needed. but without hard data kinda hard to say.
> 
> have a nice day, Sk.
> 
> edit: oops, did not see Brazilianwinner's post.


Ye agree, we dont got much data, so its all speculations, but so far i havent seen the biggest jumps. I can get 1100mhz on my core without touching voltages, most other OC´ers on water here only got to 1200 ish?, with a few lucky ones getting higher, and some lower, so its not to big loss imo. Was still safest to go with the 290X regardless









You have a good day to


----------



## brazilianloser

Quote:


> Originally Posted by *Skrillex101*
> 
> Brazillian, i would like to see a picture of your eyefinity setup.
> 
> 
> 
> 
> 
> 
> 
> But im not really comfortable with just above 30, that means you will dip down under at occasions no?
> 
> But gratz that you can. Impressed that the Stock card can even get +30 FPS with 5760x1080 at Ultra in 64 Multiplayer.


It's right there on my signature. Sure it dips but rarely. And this at stock. Will take some screenshots and what not after work.


----------



## Skrillex101

Quote:


> Originally Posted by *brazilianloser*
> 
> It's right there on my signature. Sure it dips but rarely. And this at stock. Will take some screenshots and what not after work.


Ah nice, like your walls btw, lovely relaxing warm color.









But its nice that your 290 can handle the game there, but its also struggling







, the 290X for me atleast is lying around 60 FPS with some dips into 48-49 at STOCK. So i consider that Ok ish, i would like higher, but thats where my slight OC comes in without overvoltage


----------



## Arizonian

Quote:


> Originally Posted by *Kipsofthemud*
> 
> Because it's another 2 week wait for my 290X I decided to get a 290 in the meantime
> 
> 
> 
> 
> 
> 
> 
> Here's my Club3d 290 with waterblock, backplate and kittehs.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Waterblock installed:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Pretty :3
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Kitty happy too
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I only played some Max Payne 3 so far because it was late but I like the card anyway!
> 
> Loop is not totally bled yet I think but my temps are::
> 
> - 29C idle
> - 38C max load after 4 runs of Valley and 2 hours of Max Payne 3.
> 
> This is with a 360 and 240 rad, 1 gpu 1 cpu loop.


Congrats - added
















Quote:


> Originally Posted by *Gilgam3sh*
> 
> @ Arizonian
> 
> can you please update my card to a Powercolor R9 290X (I guess it's a 290X now since I got a 290X chip with 290 BIOS) with Arctic Accelero Xtreme III
> 
> 
> Spoiler: Warning: Spoiler!


Sweet - nice of Powercolor sending 290X with 290 BIOS....we can take care of those ourselves, thanks for the proof. Updated









Quote:


> Originally Posted by *jbottz*
> 
> Apologies for the terrible picture, but you can update me to being watercooled. Need to do some tubing rework next time I drain the loop, but for now it's working like a champ. Great temperatures.
> 
> 
> Spoiler: Warning: Spoiler!


Updated


----------



## jerrolds

Quote:


> Originally Posted by *james111333*
> 
> Does anyone know about editing the card bios settings?
> I was gutted to see that my 290 is just like my 7950 in that on my TV (42" Panasonic plasma which is my monitor) during post and startup, all I get is coloured lines, millions of them! I read somewhere that the issue is that the card outputs a resolution that HD TVs don't like and I wondered if there was anyway to change it to something like 1280 x 720? It's a pain as my Physx card from the dark ages (9800GT) works perfectly and displays POST etc just fine...... The HDMI out from my maximus VI F also works fine, it just seems to be AMD discrete cards. If anyone knows how I might be able to avoid this, specifically on my new 290, I'd love to hear some input


I have my Panny ST50 hooked up to my 290X via HDMI without issue, POST shows up fine. But its hooked up as a secondary monitor, do you have anything plugged into the DVI ports?


----------



## brazilianloser

Quote:


> Originally Posted by *Skrillex101*
> 
> Ah nice, like your walls btw, lovely relaxing warm color.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But its nice that your 290 can handle the game there, but its also struggling
> 
> 
> 
> 
> 
> 
> 
> , the 290X for me atleast is lying around 60 FPS with some dips into 48-49 at STOCK. So i consider that Ok ish, i would like higher, but thats where my slight OC comes in without overvoltage


Yeah it struggles for sure. I do plan on getting a second one just waiting on some extra cash since I will have to update my psu and I do have plans for a water solution. Money money money... But so far for a single card without oc it has impressed me even after all the headache with black screens and rma. Thanks on the wall comment. In laws old house and since they own a lumber mill the whole house has wood walls and floor lol


----------



## galil3o

Quote:


> Originally Posted by *brazilianloser*
> 
> Looking forward to some pictures.


Its running a leak test at the moment but I plugged in some of the UV cathodes and its looking pretty sweet so far.

I was pleasantly surprised when my tubing came with a white nylon lattice structure in the tubing wall (couldnt see it in the website image), lights up well with the uv reactive fluid inside


----------



## jerrolds

Quote:


> Originally Posted by *Raxus*
> 
> Is it me or does it seem like sapphire 290x have more issues than others?


Ive never blacked screen during gaming - hour+ sessions in BF4, Batman Origins and Mame-ingm as well as long burn in/3DMark/Valley runs this is fully overclocked at 1225mhz and 1.3v actual...but i did Blackscreen/Hard Reset while watching media on VLC and MPC @stock which i thought was weird









Not worth me RMAing though


----------



## Arizonian

Quote:


> Originally Posted by *jerrolds*
> 
> Ive never blacked screen during gaming - hour+ sessions in BF4, Batman Origins and Mame-ingm as well as long burn in/3DMark/Valley runs this is fully overclocked at 1225mhz and 1.3v actual...but i did Blackscreen/Hard Reset while watching media on VLC and MPC @stock which i thought was weird
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not worth me RMAing though


I'd definitely wait for the AMD driver fix for black screening issue which should theoretically be out tomorrow. See if that solves it for others who are having the issue and if it indeed fixes it.


----------



## jerrolds

I'm hoping it solves it for people...as for me..i'd be ok either way - luckily its not that big of annoyance.

Can anyone link that chart or grid that shows the wattage used the higher you overclock/add voltage to the 290X? I cant find it. If [email protected] is somthing like 300W while [email protected] is 500W then maybe it wont be worth it for me pushing this hard.


----------



## CallsignVega

Results are done. I kept the cores both at 1160 MHz as it wouldn't be fair to pair the 1160 MHz core (stock air) 290x cards versus my 1300 MHz water Titan's.



The cards trade blows in AA titles like BF4, Skyrim and Witcher 2. My flight sim which doesn't work with SLI or Crossfire just happens to like NVIDIA a lot better. Crysis 3 and Heaven 2.0 didn't like my 290X's all that much for some reason. I purposely kept all SLI and Crossfire profiles stock that come from the most up to date drivers.

In all, the 290X's deliver superb performance for the money. These cards must be water cooled though, as the fan required at the voltage I ran to keep 1160 MHz core from throttling in all benchmarks would blow you out of the room.

Some of the original Eyefinity bugs from like five years ago are unfortunately still there. AMD also doesn't allow me to run bezel-corrected flipped portrait Eyefinity which is also a no-go. My flight sim also favors NVIDIA, which makes it that much harder to stick with the 290x's. I received one black screen crash from the 290x's during my testing.

290x's and Titans are great cards, but I've decided to go with 780Ti Classifieds as it's per-clock performance and overclockability should exceed both cards run in this test. Water blocks for the new Ti Classifieds are already on their way!


----------



## velocityx

just got my gigabyte r9 290

upgraded from 6970 CF because they weren't enough in 1440p for BF4. Wow, not only this card is less noisy than my single 6970, performance is really good. I'm almost sure I will grab a second one next week to do a crossfire setup;p


----------



## JonathanQC30

hi,

Does anyone have a Asus r9 290 Non-X bios to give me, im trying to unlock my voltage on my watercooled crossfire setup, any help will be appreciated. If u got a Asus r9 290 Non-X just download gpu-z and save your bios and send me please


----------



## Ukkooh

Spoiler: R9 290x









Finally got my 290x and joining the owners club. I didn't spot any warranty void stickers on the card so I guess Club 3D honours the statement about watercooling they gave me. Also got a doorhanger as you can see.


----------



## Stay Puft

We should make a list of what manufacturers put warranty stickers on the heatsink screws


----------



## Roaches

I know XFX does have them on my 7850 GPU backet screws on the PCB.


----------



## nyboy42

Just got a Sapphire R9 290. Anyone having trouble keeping the 947 boost clock consistent? I am using the latest Catalyst 13.11 Beta V9.2 on stock settings and within minutes of launching a game or unigine, my boost clock drops below 947 (sometimes as low as 750-850) before even reaching 80C. How do I get the 947 clock to stay constant and not throttle?


----------



## Stay Puft

Quote:


> Originally Posted by *nyboy42*
> 
> Just got a Sapphire R9 290. Anyone having trouble keeping the 947 boost clock consistent? I am using the latest Catalyst 13.11 Beta V9.2 on stock settings and within minutes of launching a game or unigine, my boost clock drops below 947 (sometimes as low as 750-850) before even reaching 80C. How do I get the 947 clock to stay constant and not throttle?


Increase fan speed


----------



## Arizonian

Quote:


> Originally Posted by *Ukkooh*
> 
> 
> 
> Spoiler: R9 290x
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Finally got my 290x and joining the owners club. I didn't spot any warranty void stickers on the card so I guess Club 3D honours the statement about watercooling they gave me. Also got a doorhanger as you can see.


Nice, Congrats - added


----------



## nyboy42

Quote:


> Originally Posted by *Stay Puft*
> 
> Increase fan speed


well based on all the reviews I saw, everyone was able to maintain a 947 clock with the 47 default fan speed at 95C. Is this not possible? And again the clock throttles well before even reaching 70 C.


----------



## Ghostpilot

Quote:


> Originally Posted by *JonathanQC30*
> 
> hi,
> 
> Does anyone have a Asus r9 290 Non-X bios to give me, im trying to unlock my voltage on my watercooled crossfire setup, any help will be appreciated. If u got a Asus r9 290 Non-X just download gpu-z and save your bios and send me please


Plenty to be found in the OP here: http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread








Quote:


> Originally Posted by *Stay Puft*
> 
> We should make a list of what manufacturers put warranty stickers on the heatsink screws


My XFX 290 had them on gpu bracket screws on the PCB.


----------



## Arizonian

Quote:


> Originally Posted by *nyboy42*
> 
> well based on all the reviews I saw, everyone was able to maintain a 947 clock with the 47 default fan speed at 95C. Is this not possible?


You should be in Uber mode honestly at the very least. A manual fan curve after that will take care of that even further.


----------



## nyboy42

Quote:


> Originally Posted by *Arizonian*
> 
> You should be in Uber mode honestly at the very least. A manual fan curve after that will take care of that even further.


i think only the 290x has an uber mode switch, not the 290. With the latest driver, the default fan speed for the 290 was bumped to 47 and was designed to keep the clock at 947 for extended period of time (at 95 C) im just not seeing that with my card, as my clock starts to throttle even before hitting 70C, wondering if I should be concerned or if this is expected at the default fan speed


----------



## jerrolds

Quote:


> Originally Posted by *nyboy42*
> 
> well based on all the reviews I saw, everyone was able to maintain a 947 clock with the 47 default fan speed at 95C. Is this not possible? And again the clock throttles well before even reaching 70 C.


Try UBER mode (switch position furthest away from the outputs), and then create a profile that manually sets the fan to 50-55% if you plan on keeping it at stock, and just hit it when youre about to game. The increase in fan speed should get a headstart on the heat and prevent any throttling. Or set a more aggressive fan curve if you wanna set it and forget it.


----------



## CurrentlyPissed

huge props to Newegg. I RMAd my two 290 Sapphires due to the black screen errors last week. They just got them today. I asked if I could swap them out for XFX instead. They said no at first due to not allowing swapping of products. But she asked the sueprvisor and they made an exception. So now I get two XFX back instead of sapphire.

Best part is you get BF4 with XFX now too


----------



## flopper

My 290 needs cooling badly.
added cold from the outside by opening door, added 500p to my firestrike score.
same settings as previously.
10k breached
http://www.3dmark.com/fs/1183884


----------



## ZealotKi11er

Quote:


> Originally Posted by *CallsignVega*
> 
> Results are done. I kept the cores both at 1160 MHz as it wouldn't be fair to pair the 1160 MHz core (stock air) 290x cards versus my 1300 MHz water Titan's.
> 
> 
> 
> The cards trade blows in AA titles like BF4, Skyrim and Witcher 2. My flight sim which doesn't work with SLI or Crossfire just happens to like NVIDIA a lot better. Crysis 3 and Heaven 2.0 didn't like my 290X's all that much for some reason. I purposely kept all SLI and Crossfire profiles stock that come from the most up to date drivers.
> 
> In all, the 290X's deliver superb performance for the money. These cards must be water cooled though, as the fan required at the voltage I ran to keep 1160 MHz core from throttling in all benchmarks would blow you out of the room.
> 
> Some of the original Eyefinity bugs from like five years ago are unfortunately still there. AMD also doesn't allow me to run bezel-corrected flipped portrait Eyefinity which is also a no-go. My flight sim also favors NVIDIA, which makes it that much harder to stick with the 290x's. I received one black screen crash from the 290x's during my testing.
> 
> 290x's and Titans are great cards, but I've decided to go with 780Ti Classifieds as it's per-clock performance and overclockability should exceed both cards run in this test. Water blocks for the new Ti Classifieds are already on their way!


When you say 1160MHz for Titan. How do u know that? Is that Actual MAX boost? A Titan clocked @ 1160 will boots a lot higher i would think.


----------



## Forceman

Quote:


> Originally Posted by *jerrolds*
> 
> Try UBER mode (switch position furthest away from the outputs), and then create a profile that manually sets the fan to 50-55% if you plan on keeping it at stock, and just hit it when youre about to game. The increase in fan speed should get a headstart on the heat and prevent any throttling. Or set a more aggressive fan curve if you wanna set it and forget it.


The 290 doesn't have an uber mode, and he's throttling at low temps so a fan increase isn't likely to help.
Quote:


> Originally Posted by *nyboy42*
> 
> i think only the 290x has an uber mode switch, not the 290. With the latest driver, the default fan speed for the 290 was bumped to 47 and was designed to keep the clock at 947 for extended period of time (at 95 C) im just not seeing that with my card, as my clock starts to throttle even before hitting 70C, wondering if I should be concerned or if this is expected at the default fan speed


Did you increase the power limit?


----------



## Arizonian

Quote:


> Originally Posted by *nyboy42*
> 
> i think only the 290x has an uber mode switch, not the 290. With the latest driver, the default fan speed for the 290 was bumped to 47 and was designed to keep the clock at 947 for extended period of time (at 95 C) im just not seeing that with my card, as my clock starts to throttle even before hitting 70C, wondering if I should be concerned or if this is expected at the default fan speed


Sorry didn't know you had a 290. This is key when posting OCN if your asking questions for help. Helps others and you don't have to repeat yourself.

*"How to put your Rig in your Sig"*


----------



## CallsignVega

Quote:


> Originally Posted by *ZealotKi11er*
> 
> When you say 1160MHz for Titan. How do u know that? Is that Actual MAX boost? A Titan clocked @ 1160 will boots a lot higher i would think.


No I have custom BIOS, removed that boost crap.


----------



## nyboy42

Quote:


> Originally Posted by *Arizonian*
> 
> Sorry didn't know you had a 290. This is key when posting OCN if your asking questions for help. Helps others and you don't have to repeat yourself.
> 
> *"How to put your Rig in your Sig"*


oh ok sry, i clearly stated in my op that i had a r9 290 - will add info to my sig


----------



## nyboy42

Quote:


> Originally Posted by *Forceman*
> 
> The 290 doesn't have an uber mode, and he's throttling at low temps so a fan increase isn't likely to help.
> Did you increase the power limit?


i havent touched any settings because based on all the reviews out there the card should stay 947 steady at stock, do I have a faulty card?


----------



## evensen007

Quote:


> Originally Posted by *nyboy42*
> 
> i havent touched any settings because based on all the reviews out there the card should stay 947 steady at stock, do I have a faulty card?


And this right here is exactly what I was talking about when I said I was having trouble recommending this card to my friends. Why should you have to tinker with CCC options to get it to run at its rated speed?

Anyhow, nyboy, go into CCC's performance section, enable OverDrive, and then set your power limit to +50. Now see what happens.


----------



## rv8000

Quote:


> Originally Posted by *nyboy42*
> 
> I haven't touched any settings because, based on all the reviews out there, the card should stay at a steady 947 at stock. Do I have a faulty card?


What game is this in? Are you at 100% GPU usage consistently?


----------



## ukaussi

Quote:


> Originally Posted by *nyboy42*
> 
> I haven't touched any settings because, based on all the reviews out there, the card should stay at a steady 947 at stock. Do I have a faulty card?


Hopefully they didn't leave some of the plastic protective tape on the GPU like that other guy in the video!


----------



## DampMonkey

FYI, the Sapphire 290X Toxic has been confirmed by our friendly neighborhood Sapphire rep on OCN:
Quote:


> Originally Posted by *VaporX*
> 
> I am excited about the 290X Toxic as well, I am only getting hints from our team but the hints are very encouraging.


----------



## dezerteagle323

Can anyone speculate what the prices of the aftermarket-cooled R9 290s will be? I'm guessing they won't be around $400 like the reference cards.

I'm wondering because if the AIB cards tend to cost a lot more, then I'm just going to grab a reference one today. Help me decide, please!


----------



## nyboy42

Quote:


> Originally Posted by *rv8000*
> 
> What game is this in? Are you at 100% GPU usage consistently?


Yes, I am getting 100% usage consistently according to MSI Afterburner. I ran games like Crysis 3 and BF4, and the Unigine Heaven benchmark. The clock is constantly fluctuating between 740-947; it's never a steady 947 for more than 5 seconds.


----------



## kuruptx

Got a question - fixing to buy one of these, but I want to know: how can you make it run in Uber mode 24/7? Because I want it to run at 1000MHz on the core clock.


----------



## Ukkooh

So what is the best way to test a daily clock with these? I guess I could try to get 1100-1150 core stable on air.


----------



## the9quad

I am really hoping the next person who puts an aftermarket cooler on one of these puts up a YouTube video of them doing it. It really is daunting when you have never done it before; I imagine it would be easy to mess up.


----------



## Gilgam3sh

Quote:


> Originally Posted by *Ukkooh*
> 
> So what is the best way to test a daily clock with these? I guess I could try to get 1100-1150 core stable on air.


1150MHz is no problem with a good aftermarket cooler.


----------



## jerrolds

Quote:


> Originally Posted by *Ukkooh*
> 
> So what is the best way to test a daily clock with these? I guess I could try to get 1100-1150 core stable on air.


For me, both BF4 and Heaven on loop get my temps to max pretty quick. Running Heaven for a good 30 mins to 1 hr and monitoring VRM/GPU temps is a pretty good way to find general stability. Then I play BF4 normally and have GPU-Z running on a second display monitoring temps.

BF4 shows artifacting pretty quick, so I tuned my clocks/voltages using that - as soon as I saw artifacting I would alt-tab out and increase the voltage by 0.005V.

Then I confirmed using other games I play (currently Batman Origins).
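
The loop-and-monitor routine above is easy to sanity-check afterwards from a log file. Here's a minimal sketch in Python - the CSV column names and the inline sample data are assumptions for illustration, not GPU-Z's or Afterburner's actual export format:

```python
# Sketch: summarize a hardware-monitor log from a Heaven/BF4 stress run to
# spot throttling. Column names below are assumed; match your tool's export.
import csv
import io

RATED_CLOCK = 947  # R9 290 rated boost clock in MHz

# Inline sample standing in for a real log export.
sample_log = """time,core_clock,gpu_temp,vrm1_temp
0,947,70,75
30,947,88,84
60,840,94,92
90,947,93,90
"""

def summarize(log_file):
    """Return (max GPU temp, max VRM temp, fraction of samples throttled)."""
    rows = list(csv.DictReader(log_file))
    max_gpu = max(int(r["gpu_temp"]) for r in rows)
    max_vrm = max(int(r["vrm1_temp"]) for r in rows)
    throttled = sum(1 for r in rows if int(r["core_clock"]) < RATED_CLOCK)
    return max_gpu, max_vrm, throttled / len(rows)

max_gpu, max_vrm, throttle_frac = summarize(io.StringIO(sample_log))
print(max_gpu, max_vrm, throttle_frac)
```

Point it at a real log export and the throttle fraction tells you at a glance whether the card held its clock for the whole run.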


----------



## galil3o

Has there been any word on that new driver that solves the black screen issues? I thought the rep said they were releasing it yesterday or today.


----------



## Raxus

For anyone with an MSI card who was worried about voiding their warranty by removing the stock cooler:

http://www.overclock.net/t/1444675/watercooling-msi-cards-direct-from-msi


----------



## rv8000

Quote:


> Originally Posted by *nyboy42*
> 
> Yes, I am getting 100% usage consistently according to MSI Afterburner. I ran games like Crysis 3 and BF4, and the Unigine Heaven benchmark. The clock is constantly fluctuating between 740-947; it's never a steady 947 for more than 5 seconds.


Did you manually remove the older/previous drivers, delete the leftover folders, and clear the registry?


----------



## nyboy42

Quote:


> Originally Posted by *rv8000*
> 
> Did you manually remove the older/previous drivers, delete the leftover folders, and clear the registry?


Yes - I uninstalled the drivers, then ran Driver Sweeper and CCleaner in safe mode.


----------



## rv8000

Quote:


> Originally Posted by *nyboy42*
> 
> Yes - I uninstalled the drivers, then ran Driver Sweeper and CCleaner in safe mode.


I'd open Afterburner and set fan control to manual, set the fan to 85%, then run Heaven, BF4, and Crysis 3 for ~10 mins each and monitor the core clocks. Check both core and VRM temps during the tests (GPU-Z should show VRM temps).


----------



## evensen007

Quote:


> Originally Posted by *nyboy42*
> 
> Yes - I uninstalled the drivers, then ran Driver Sweeper and CCleaner in safe mode.


Did you do what I said in CCC and add +50 to the power limit?


----------



## kcuestag

Quote:


> Originally Posted by *galil3o*
> 
> Has there been any word on that new driver that solves the black screen issues? I thought the rep said they were releasing it yesterday or today.


If they don't come out tomorrow, I will return the card and go to Nvidia. I got a brand new 290X from RMA with the same issues; in fact, I had to uninstall the AMD drivers so that I could at least use my computer at idle on the desktop (browse, chat, watch movies...), otherwise it would black screen even at idle.


----------



## nyboy42

Quote:


> Originally Posted by *evensen007*
> 
> Did you do what I said in CCC and add +50 to the power limit?


Yes, I did. The core clock still fluctuates between 840-947, with a load temp of 80C at 55% fan speed.


----------



## evensen007

Quote:


> Originally Posted by *nyboy42*
> 
> Yes, I did. The core clock still fluctuates between 840-947, with a load temp of 80C at 55% fan speed.


Just out of curiosity, try setting the fan to 100% and see what happens.


----------



## Kriant

Installed the second EK block; one left to go tonight. This time around, Asus used 1mm thermal pads that have the density of thermal compound - go try peeling that off with your hands. Ugh.


----------



## Ukkooh

Quote:


> Originally Posted by *Gilgam3sh*
> 
> 1150MHz is no problem with a good aftermarket cooler.


I'm using the stock cooler until my water gear arrives.


----------



## Kriant

That's what I was talking about: that one screw is bigger than the rest and fits in the upper-right corner of the card when mounting the EK block, but the instructions don't say that this is the one to use, and when I used any of the smaller ones, they would get stuck and the standoff you screw it into would start twisting.


----------



## rdr09

Quote:


> Originally Posted by *nyboy42*
> 
> Yes - I uninstalled the drivers, then ran Driver Sweeper and CCleaner in safe mode.


Ugh - Driver Sweeper. I never use that crap. You may have to do a clean OS install, sorry.


----------



## selk22

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Mine is a Sapphire; it only black screens when I set the mem clock too high.
> 
> Other than that, very happy with my card.


Same! I think the new drivers may put a stop to this, so I'm going to be holding on to mine for a while.

We seem to have similar cards, Bilko - what's your ASIC quality?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Sazz*
> 
> When I tried using the new AB with the voltage mod, I didn't get any overclocking done; none of my clocks were even stable. It seems to me it's not adding any voltage at all.
> 
> But when I used the stock Asus BIOS with GPU Tweak, I was able to get 1270/1465 on the 290X that I had (sold it to my friend - an R9 290 "non X" is on the way xD).
> 
> I'm gonna wait till Sapphire Trixx comes out and see if that works. If not, then I'll wait till AB gets a better volt mod. I don't get why they put the voltage as "+/- mV" - why can't they do it like GPU Tweak, where you set the voltage to exactly where you want it? If you want it at 1.4V, then make it so you see it being set to 1.4V, not some +/- crap.


It's stable as hell - my memory is locked at 1500 - but yeah, the core clock fluctuates a bit. Either way, I appreciate that; I'm going to try the Asus BIOS and GPU Tweak tonight.

If I can get it to at least hold that 1200 solid, my FPS should be even more impressive. I'm stoked!

I'll post results tonight!!!


----------



## Forceman

Quote:


> Originally Posted by *nyboy42*
> 
> Yes, I did. The core clock still fluctuates between 840-947, with a load temp of 80C at 55% fan speed.


Do you have voltage monitoring enabled in Afterburner? Because that'll cause the Afterburner core clocks to jump around like crazy.


----------



## DampMonkey

Quote:


> Originally Posted by *Kriant*
> 
> That's what I was talking about: that one screw is bigger than the rest and fits in the upper-right corner of the card when mounting the EK block, but the instructions don't say that this is the one to use, and when I used any of the smaller ones, they would get stuck and the standoff you screw it into would start twisting.


My EK backplate used one screw that was longer than the others, but not my block. That's very strange.


----------



## the9quad

Quote:


> Originally Posted by *Kriant*
> 
> Installed the second EK block; one left to go tonight. This time around, Asus used 1mm thermal pads that have the density of thermal compound - go try peeling that off with your hands. Ugh.


Could you video that?


----------



## TheSoldiet

I keep getting these black screens. R9 290X Sapphire with Elpida memory. Is there any solution yet? If not, then I am thinking about returning it and getting a 780 instead, or just a regular 290 later on.


----------



## FloRin050

Hello!!









I'm new to this forum. I have read a lot of this thread, and now I've got some questions.

Since my current computer is not so good, I decided to build my first gaming PC.

People on this forum said that getting a 290(X) for 1080p gaming is overkill?
I have a 1080p screen, and I was planning to get an R9 290 to play BF4 on ultra with smooth FPS.
I also want to play future games at a decent FPS.
Because I'm not sure when the non-reference 290 cards come out, and it might be overkill for a 1080p monitor, should I take a different GPU?

The build I'm planning:
Corsair 750D
Corsair AX860i
Intel Core i7 4770K
Corsair H100i
Corsair Air Series SP120 PWM x2 (replacement for the H100i stock fans)
Asus Maximus VI Hero
Corsair Vengeance Pro Series 2x8GB @ 1600MHz
AMD R9 290??
Samsung 840 EVO Basic 250GB
2TB Seagate Barracuda ST2000DM001

Should I change the GPU?
Since this is my first build, any other advice is highly appreciated too.


----------



## rv8000

Quote:


> Originally Posted by *Forceman*
> 
> Do you have voltage monitoring enabled in Afterburner? Because that'll cause the Afterburner core clocks to jump around like crazy.


@nyboy42 This - I completely forgot about this issue. Ty, Forceman.


----------



## shilka

Quote:


> Originally Posted by *FloRin050*
> 
> Hello!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm new to this forum. I have read a lot of this thread, and now I've got some questions.
> 
> Since my current computer is not so good, I decided to build my first gaming PC.
> 
> People on this forum said that getting a 290(X) for 1080p gaming is overkill?
> I have a 1080p screen, and I was planning to get an R9 290 to play BF4 on ultra with smooth FPS.
> I also want to play future games at a decent FPS.
> Because I'm not sure when the non-reference 290 cards come out, and it might be overkill for a 1080p monitor, should I take a different GPU?
> 
> The build I'm planning:
> Corsair 750D
> Corsair AX860i
> Intel Core i7 4770K
> Corsair H100i
> Corsair Air Series SP120 PWM x2 (replacement for the H100i stock fans)
> Asus Maximus VI Hero
> Corsair Vengeance Pro Series 2x8GB @ 1600MHz
> AMD R9 290??
> Samsung 840 EVO Basic 250GB
> 2TB Seagate Barracuda ST2000DM001
> 
> Should I change the GPU?
> Since this is my first build, any other advice is highly appreciated too.


Welcome to OCN, then.

You should change the PSU - it's a ripoff. You can get something just as good for less money; no need to spend that much. You won't even need that much wattage either.


----------



## nyboy42

Quote:


> Originally Posted by *Forceman*
> 
> Do you have voltage monitoring enabled in Afterburner? Because that'll cause the Afterburner core clocks to jump around like crazy.


Great suggestion - voltage monitoring was enabled. I turned it off, and now the clocks are more stable. They still fluctuate, but it seems steadier than before. An interesting note is that I'm getting different clock behavior from different applications. In Unigine Heaven the clock is basically solid and may dip for a moment here and there, but overall it stays locked at 947 for the entire benchmark.

Crysis 3, on the other hand, still suffers from a fluctuating clock, which leads me to believe this could be a driver issue. I tested Far Cry 3, and the clocks seem a bit more stable there as well.

One thing for certain, though, is that AMD is kind of spreading false info regarding the 95C load temp. They emphatically stated that 95C is an optimal temp for performance, but for me, any time the card hits 95C (on the 47% stock fan setting) the clocks start to throttle, and I lose a good 20-30MHz which I never gain back. So I think it's best to have a custom fan profile to ensure the card stays under 90C.

I am now gonna fool around with some OC settings and see if I can push up the clock speed and clock stability - thanks for the tip, +1 to you and rv8000.
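
That "keep it under 90C" rule is exactly what a custom fan curve encodes. Here's a rough sketch of the kind of breakpoint interpolation a tool like Afterburner does - the breakpoints themselves are made-up illustrative values, not anyone's actual profile:

```python
# Sketch of a custom fan curve aimed at keeping the core under ~90C so the
# 947 MHz clock holds. Breakpoints are illustrative assumptions, not defaults.
FAN_CURVE = [(0, 30), (60, 40), (75, 55), (85, 75), (90, 100)]  # (temp C, fan %)

def fan_speed(temp_c):
    """Linearly interpolate fan % between curve breakpoints, clamped at the ends."""
    if temp_c <= FAN_CURVE[0][0]:
        return FAN_CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(FAN_CURVE, FAN_CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return FAN_CURVE[-1][1]  # above the last breakpoint: pin the fan at max

print(fan_speed(75))   # on a breakpoint
print(fan_speed(95))   # past the curve: max fan
```

The steep ramp between 85C and 90C is the design choice here: the fan gets loud only in the zone where the card would otherwise start shedding clocks.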


----------



## rv8000

Quote:


> Originally Posted by *nyboy42*
> 
> Great suggestion - voltage monitoring was enabled. I turned it off, and now the clocks are more stable. They still fluctuate, but it seems steadier than before. An interesting note is that I'm getting different clock behavior from different applications. In Unigine Heaven the clock is basically solid and may dip for a moment here and there, but overall it stays locked at 947 for the entire benchmark.
> 
> Crysis 3, on the other hand, still suffers from a fluctuating clock, which leads me to believe this could be a driver issue. I tested Far Cry 3, and the clocks seem a bit more stable there as well.
> 
> One thing for certain, though, is that AMD is kind of spreading false info regarding the 95C load temp. They emphatically stated that 95C is an optimal temp for performance, but for me, any time the card hits 95C (on the 47% stock fan setting) the clocks start to throttle, and I lose a good 20-30MHz which I never gain back. So I think it's best to have a custom fan profile to ensure the card stays under 90C.
> 
> I am now gonna fool around with some OC settings and see if I can push up the clock speed and clock stability - thanks for the tip, +1 to you and rv8000.


I'd still be careful with Crysis 3. I have a Gelid Icy Vision on my 290 (cheapy RAM/VRM sinks); the core will stay around 74C, but the VRM temps (VRM1) still get up there - it hit 86C after about 5 mins of play. VRM temps might be causing some funky business too.


----------



## Lennyx

Quote:


> Originally Posted by *FloRin050*
> 
> Hello!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm new to this forum. I have read a lot of this thread, and now I've got some questions.
> 
> Since my current computer is not so good, I decided to build my first gaming PC.
> 
> People on this forum said that getting a 290(X) for 1080p gaming is overkill?
> I have a 1080p screen, and I was planning to get an R9 290 to play BF4 on ultra with smooth FPS.
> I also want to play future games at a decent FPS.
> Because I'm not sure when the non-reference 290 cards come out, and it might be overkill for a 1080p monitor, should I take a different GPU?
> 
> The build I'm planning:
> Corsair 750D
> Corsair AX860i
> Intel Core i7 4770K
> Corsair H100i
> Corsair Air Series SP120 PWM x2 (replacement for the H100i stock fans)
> Asus Maximus VI Hero
> Corsair Vengeance Pro Series 2x8GB @ 1600MHz
> AMD R9 290??
> Samsung 840 EVO Basic 250GB
> 2TB Seagate Barracuda ST2000DM001
> 
> Should I change the GPU?
> Since this is my first build, any other advice is highly appreciated too.


For me, the 290X was horrible on air. I ran two Valley runs on my 2nd rig on air before I put the waterblock on; I could not stand the noise - it was that bad.
So after that experience, if I were to buy an air-cooled GPU, I would not buy the 290/290X before the new coolers arrive.
My advice would be to get a GTX 780 with a decent cooler.


----------



## nyboy42

Quote:


> Originally Posted by *rv8000*
> 
> I'd still be careful with Crysis 3. I have a Gelid Icy Vision on my 290 (cheapy RAM/VRM sinks); the core will stay around 74C, but the VRM temps (VRM1) still get up there - it hit 86C after about 5 mins of play. VRM temps might be causing some funky business too.


Are you running your R9 290 at stock settings or overclocked? Does your core clock ever fluctuate?


----------



## FloRin050

Quote:


> Originally Posted by *Lennyx*
> 
> For me, the 290X was horrible on air. I ran two Valley runs on my 2nd rig on air before I put the waterblock on; I could not stand the noise - it was that bad.
> So after that experience, if I were to buy an air-cooled GPU, I would not buy the 290/290X before the new coolers arrive.
> My advice would be to get a GTX 780 with a decent cooler.


I'd really like to buy a non-reference 290(X), but since AMD is not giving any information about when they will be released, I am getting kind of impatient...
Waiting till the end of November or maybe the beginning of December is not a problem at all, but nobody has confirmed they will be out by then.
And because this is my first build ever, I don't want to risk putting something like an Arctic cooler on there.

Quote:


> Originally Posted by *shilka*
> 
> Welcome to OCN, then.
> 
> You should change the PSU - it's a ripoff. You can get something just as good for less money; no need to spend that much. You won't even need that much wattage either.


Thanks!

Okay, I thought it would be useful, because I never know what the future brings.


----------



## rv8000

Quote:


> Originally Posted by *nyboy42*
> 
> Are you running your R9 290 at stock settings or overclocked? Does your core clock ever fluctuate?


Stock currently. Crysis with default CCC settings, 11 mins of gameplay: 946-947 on the core, highest core temp 62C, highest VRM temp 76C. I wish I had run a few logs with the stock cooler on. Even overclocked in 3DMark or Valley, I consistently stayed at 1215/1550 under load during benches.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *nyboy42*
> 
> Are you running your R9 290 at stock settings or overclocked? Does your core clock ever fluctuate?


Overclocked, and yes, my core clock definitely downclocks... it stays around 1000-1050MHz even when I have it set to 1200MHz, lol. Going to test some stuff tonight and see how it goes.


----------



## nyboy42

Quote:


> Originally Posted by *rv8000*
> 
> Stock currently. Crysis with default CCC settings, 11 mins of gameplay: 946-947 on the core, highest core temp 62C, highest VRM temp 76C. I wish I had run a few logs with the stock cooler on. Even overclocked in 3DMark or Valley, I consistently stayed at 1215/1550 under load during benches.


So I don't understand why my clock is still dipping to 840 in Crysis 3 when you get a solid 946-947 at stock. I don't let my temps go above 85C.


----------



## skupples

Quote:


> Originally Posted by *FloRin050*
> 
> Hello!!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm new to this forum. I have read a lot of this thread, and now I've got some questions.
> 
> Since my current computer is not so good, I decided to build my first gaming PC.
> 
> People on this forum said that getting a 290(X) for 1080p gaming is overkill?
> I have a 1080p screen, and I was planning to get an R9 290 to play BF4 on ultra with smooth FPS.
> I also want to play future games at a decent FPS.
> Because I'm not sure when the non-reference 290 cards come out, and it might be overkill for a 1080p monitor, should I take a different GPU?
> 
> The build I'm planning:
> Corsair 750D
> Corsair AX860i
> Intel Core i7 4770K
> Corsair H100i
> Corsair Air Series SP120 PWM x2 (replacement for the H100i stock fans)
> Asus Maximus VI Hero
> Corsair Vengeance Pro Series 2x8GB @ 1600MHz
> AMD R9 290??
> Samsung 840 EVO Basic 250GB
> 2TB Seagate Barracuda ST2000DM001
> 
> Should I change the GPU?
> Since this is my first build, any other advice is highly appreciated too.


Welcome to OCN, brother!

Case: Solid.
PSU: Go with something else - an eVGA G2 or P2, or a Seasonic unit. 99% of PSUs are rebranded, so you are paying for brand recognition & Corsair Link (which is still pretty broken for a lot of people).
CPU: Good stuff.
Cooler: Good stuff; the H100i has enough improvements over the regular H100 that it's worth the surcharge.
Fans: Fine - a bit loud, but much, much better than the stock leaf blowers.
Mobo: Can't go wrong with Asus!
RAM: Drop that RAM; go with G.Skill or Mushkin. Once again, RAM is a rebrand, so you are paying for the Corsair name, not a specific quality trait only Corsair can provide.
GPU: May as well go with the 290 in this instance; I don't really see the mediocre jump in performance being worth $150.
SSD: Badass unit - I have two of the 1TB EVOs; they are blazing fast, & the RAPID tech is pretty good.
HDD: Good to go!


----------



## CurrentlyPissed

Quote:


> Originally Posted by *TheSoldiet*
> 
> I keep getting these Black Screen's. R9 290X Sapphire with Elpida memory. Is there any solution yet? If not, the I am thinking about returning it and getting a 780 instead, or just a regular 290 later on.


There is a driver that should be coming out today or tomorrow to solve this issue - or it may already be out. Someone else will need to confirm; I haven't been able to keep up with the thread the past few days, but I didn't want you to go without a response.


----------



## kcuestag

Quote:


> Originally Posted by *CurrentlyPissed*
> 
> There is a driver that should be coming out today or tomorrow to solve this issue - or it may already be out. Someone else will need to confirm; I haven't been able to keep up with the thread the past few days.


I'm pretty sure they said they would delay it a few days, as it needed some work.

So yeah, I have a 290X which can't even work at idle, so right now I am running with NO drivers and it's working fine...


----------



## rdr09

Quote:


> Originally Posted by *FloRin050*
> 
> I'd really like to buy a non-reference 290(X), but since AMD is not giving any information about when they will be released, I am getting kind of impatient...
> Waiting till the end of November or maybe the beginning of December is not a problem at all, but nobody has confirmed they will be out by then.
> And because this is my first build ever, I don't want to risk putting something like an Arctic cooler on there.
> Thanks!
> 
> Okay, I thought it would be useful, because I never know what the future brings.


If you can't wait, then do what Lennyx suggested, but IMO, for Battlefield, I suggest you stick with AMD.


----------



## FloRin050

Quote:


> Originally Posted by *skupples*
> 
> Welcome to OCN, brother!
> 
> Case: Solid.
> PSU: Go with something else - an eVGA G2 or P2, or a Seasonic unit. 99% of PSUs are rebranded, so you are paying for brand recognition & Corsair Link (which is still pretty broken for a lot of people).
> CPU: Good stuff.
> Cooler: Good stuff; the H100i has enough improvements over the regular H100 that it's worth the surcharge.
> Fans: Fine - a bit loud, but much, much better than the stock leaf blowers.
> Mobo: Can't go wrong with Asus!
> RAM: Drop that RAM; go with G.Skill or Mushkin. Once again, RAM is a rebrand, so you are paying for the Corsair name, not a specific quality trait only Corsair can provide.
> GPU: May as well go with the 290 in this instance; I don't really see the mediocre jump in performance being worth $150.
> SSD: Badass unit - I have two of the 1TB EVOs; they are blazing fast, & the RAPID tech is pretty good.
> HDD: Good to go!


Thank you!

Okay, thanks a lot for the info! Good to hear that (most of) the parts I picked are a good choice.
You are clearing things up for me; I'll pick different RAM, then.
Should I stick with 1600MHz, or should I get more for overclocking my 4770K? I'm not really sure how it works - does the RAM MHz go down when overclocking?
Quote:


> Originally Posted by *rdr09*
> 
> If you can't wait, then do what Lennyx suggested, but IMO, for Battlefield, I suggest you stick with AMD.


Okay, I guess I'll have to wait till the non-reference cards come out, then.
At first I was thinking about an R9 280 Matrix (it looks so good), but I think it's not worth it since the 290 is a lot better.


----------



## BigBeard86

Hey all - I've got a pair of 290s in CrossFire.
I have them running at 1065MHz core and 1425 memory.
I have a nice case fan setup and good airflow, and the cards do not exceed 75C with a custom fan profile.
Since the new MSI AB came out, I decided to try overvolting. With +100mV, I was able to get 1110 on the core and 1550 on the memory. Core temps did not exceed 80C, and VRAM temps were below 80C - under full load, of course. It was strange that raising the core voltage allowed me higher memory overclocks.

Now, do you guys think I should keep these settings? How damaging would +100mV be for daily use?


----------



## skupples

Quote:


> Originally Posted by *FloRin050*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Thank you!
> 
> Okay, thanks a lot for the info! Good to hear that (most of) the parts I picked are a good choice.
> You are clearing things up for me; I'll pick different RAM, then.
> Should I stick with 1600MHz, or should I get more for overclocking my 4770K? I'm not really sure how it works - does the RAM MHz go down when overclocking?
> Okay, I guess I'll have to wait till the non-reference cards come out, then.
> At first I was thinking about an R9 280 Matrix (it looks so good), but I think it's not worth it since the 290 is a lot better.


I would shoot for a 2133MHz kit personally; anything above that is overkill... RAM prices are utterly disgusting right now, so I would camp on it for a while to see if prices start to come down again. I paid $130 for a 2x8GB Trident X 2400MHz kit about 3 months ago; the same kit is ~$200 now.

I would personally stay away from the 280s - they are straight rebrands. Your best bet is the 290.


----------



## nyboy42

Quote:


> Originally Posted by *BigBeard86*
> 
> Hey all - I've got a pair of 290s in CrossFire.
> I have them running at 1065MHz core and 1425 memory.
> I have a nice case fan setup and good airflow, and the cards do not exceed 75C with a custom fan profile.
> Since the new MSI AB came out, I decided to try overvolting. With +100mV, I was able to get 1110 on the core and 1550 on the memory. Core temps did not exceed 80C, and VRAM temps were below 80C - under full load, of course. It was strange that raising the core voltage allowed me higher memory overclocks.
> 
> Now, do you guys think I should keep these settings? How damaging would +100mV be for daily use?


Which brand of 290 did you purchase? Do your core clocks ever throttle and fall below 947? Could you also post your custom fan profile?


----------



## Mas

Quote:


> Originally Posted by *Stay Puft*
> 
> We should make a list of what manufacturers put warranty stickers on the heatsink screws


My Gigabyte 290X had no stickers or anything that would show that I've opened it up.


----------



## Amhro

Quote:


> Originally Posted by *FloRin050*
> 
> At first I was thinking about an R9 280 Matrix (it looks so good), but I think it's not worth it since the 290 is a lot better.


Yeah, the Matrix is cool, but it's not worth it in my opinion; for only 30€ more you can get a 290 already (at least in my country).


----------



## Forceman

Quote:


> Originally Posted by *BigBeard86*
> 
> Hey all - I've got a pair of 290s in CrossFire.
> I have them running at 1065MHz core and 1425 memory.
> I have a nice case fan setup and good airflow, and the cards do not exceed 75C with a custom fan profile.
> Since the new MSI AB came out, I decided to try overvolting. With +100mV, I was able to get 1110 on the core and 1550 on the memory. Core temps did not exceed 80C, and VRAM temps were below 80C - under full load, of course. It was strange that raising the core voltage allowed me higher memory overclocks.
> 
> Now, do you guys think I should keep these settings? How damaging would +100mV be for daily use?


I think the issue with memory overclocking is that the memory controller is the weak link, which is why increasing core voltage helps with memory overclocks. The chips can run it fine, but the new simplified controller is holding them back.

And 100mV should be fine for long-term use, as long as your temps are controlled.
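
For a rough feel of what that +100mV costs, dynamic power scales approximately with frequency times voltage squared. Here's a quick sketch using BigBeard86's 1065 -> 1110MHz bump; the 1.20V baseline is an assumption for a reference 290, not a measured value - actual VID varies per card:

```python
# Rough first-order estimate of overvolt cost: dynamic power ~ f * V^2.
# The 1.20 V baseline is an assumed reference-290 voltage (assumption).
def power_scale(f0_mhz, v0, f1_mhz, v1):
    """Relative dynamic power of (f1, v1) versus (f0, v0)."""
    return (f1_mhz / f0_mhz) * (v1 / v0) ** 2

# +100 mV and +45 MHz on top of an assumed 1.20 V / 1065 MHz daily clock
scale = power_scale(1065, 1.20, 1110, 1.30)
print(round(scale, 3))  # ~1.223, i.e. roughly 22% more core power
```

So a +100mV daily overclock plausibly draws on the order of 20% more power on the core, which is why the caveat about keeping temps controlled matters.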


----------



## pharma57

http://videocardz.com/48107/overclockers-uk-offering-pre-flashed-vtx3d-r9-290-290x


----------



## Raxus

Quote:


> Originally Posted by *Mas*
> 
> My Gigabyte 290x had no stickers or anything to show that I've opened it up.


MSI has stickers, but...

Did some digging with MSI

http://www.overclock.net/t/1444675/watercooling-msi-cards-edit-directly-from-msi-from-msi-forums


----------



## Forceman

Quote:


> Originally Posted by *pharma57*
> 
> http://videocardz.com/48107/overclockers-uk-offering-pre-flashed-vtx3d-r9-290-290x


Surprised AMD is letting them do that.


----------



## BigBeard86

Quote:


> Originally Posted by *nyboy42*
> 
> Which brand of 290 did you purchase? Do your core clocks ever throttle and fall below 947? Could you also post your custom fan profile?


They are Sapphires. No, my cards have never throttled - they average 75C, as I said before (right now they are idling at 29 and 35C). When running benchmarks or playing games, temps are in the low 70s, with an instantaneous jump up to about 85C, which only lasts until the fans rev up and cool the card down. I created the curve in MSI AB; I have the max fan set to 100%, but the highest they go, based on their temps, is around 70% fan speed... just under it, if I am remembering correctly.

Here is the fan profile. I played BF4 for one map just to show you the card is nowhere near throttling, and this holds no matter how long I play.


----------



## Loktar Ogar

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *BigBeard86*
> 
> They are Sapphires. No, my cards have never throttled - they average 75C, as I said before (right now they are idling at 29 and 35C). When running benchmarks or playing games, temps are in the low 70s, with an instantaneous jump up to about 85C, which only lasts until the fans rev up and cool the card down. I created the curve in MSI AB; I have the max fan set to 100%, but the highest they go, based on their temps, is around 70% fan speed... just under it, if I am remembering correctly.
> 
> Here is the fan profile. I played BF4 for one map just to show you the card is nowhere near throttling, and this holds no matter how long I play.






Good cards! Happy days...

EDIT: Am I reading the memory usage right - 3072MB!?


----------



## nyboy42

Quote:


> Originally Posted by *BigBeard86*
> 
> They are Sapphires. No, my cards have never throttled - they average 75C, as I said before (right now they are idling at 29 and 35C). When running benchmarks or playing games, temps are in the low 70s, with an instantaneous jump up to about 85C, which only lasts until the fans rev up and cool the card down. I created the curve in MSI AB; I have the max fan set to 100%, but the highest they go, based on their temps, is around 70% fan speed... just under it, if I am remembering correctly.
> 
> Here is the fan profile. I played BF4 for one map just to show you the card is nowhere near throttling, and this holds no matter how long I play.


Ah, thanks so much for all the info, I really appreciate it. I'm struggling with a single Sapphire R9 290 and getting really bad downclocks in Crysis 3. Would you happen to have Crysis 3? Could you please run a test for me at max settings and see if your core clock throttles?


----------



## rdr09

Quote:


> Originally Posted by *nyboy42*
> 
> Ah, thanks so much for all the info, I really appreciate it. I'm struggling with a single Sapphire R9 290 and getting really bad downclocks in Crysis 3. Would you happen to have Crysis 3? Could you please run a test for me at max settings and see if your core clock throttles?


Are you using 1080p? If you are, then you should not have any issue with C3. I played it last night maxed (Very High and 8xMSAA) but I did not monitor my usage. I do have my i7 SB OC'ed to 4.5GHz and the 290 stock. Is your CPU OC'ed?


----------



## BigBeard86

Quote:


> Originally Posted by *nyboy42*
> 
> Ah, thanks so much for all the info, I really appreciate it. I'm struggling with a single Sapphire R9 290 and getting really bad downclocks in Crysis 3. Would you happen to have Crysis 3? Could you please run a test for me at max settings and see if your core clock throttles?


I really doubt a specific game will make your card throttle. What is your case fan setup like, and how is the airflow? Is it very hot in your room? I don't have Crysis 3, sorry.


----------



## Loktar Ogar

Hey BigBeard86,

Am i reading the memory usage right, 3072MB!?


----------



## rdr09

Quote:


> Originally Posted by *Raxus*
> 
> MSI has stickers, but...
> 
> Did some digging with MSI
> 
> http://www.overclock.net/t/1444675/watercooling-msi-cards-edit-directly-from-msi-from-msi-forums


+rep. i did not care. i unscrewed one with the sticker on it without hesitation.


----------



## BigBeard86

Quote:


> Originally Posted by *Loktar Ogar*
> 
> Good cards! Happy days...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: Am i reading the memory usage right, 3072MB!?


Thanks man. I don't know, MSI Afterburner is weird for me. A lot of times it shows usage over 5GB... when the cards are 4GB. However, when I took that screenshot, I was playing BF4 on Ultra with 16xEQ supersampling AA and 16xAF high quality at 150% resolution scale. TONS of eye candy, so it's quite possible it was using that much.
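For a rough sense of why 150% resolution scale balloons VRAM: the scale applies to each axis, so pixel count grows with its square. The bytes-per-pixel and buffer count below are illustrative assumptions, not BF4's actual allocations, so this is a lower bound rather than a real figure:

```python
# Illustrative back-of-envelope estimate of render-target memory at a given
# resolution scale. Real engines allocate many more buffers (G-buffer,
# shadow maps, textures), so this understates a game's true VRAM use.
def render_target_mb(width, height, scale, bytes_per_pixel=16, targets=4):
    # scale applies per axis, so pixel count grows with scale**2
    px = int(width * scale) * int(height * scale)
    return px * bytes_per_pixel * targets / 1024**2

print(round(render_target_mb(1920, 1080, 1.0)))  # 1080p baseline
print(round(render_target_mb(1920, 1080, 1.5)))  # 150% scale: 2.25x the pixels
```

Whatever the absolute numbers, the 2.25x multiplier from 150% scaling alone makes a 3GB-plus reading at 1080p with supersampling entirely plausible.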


----------



## NorcalTRD

Powercolor R9 290 Unlocked to 290X
Stock Cooling


----------



## Loktar Ogar

Quote:


> Originally Posted by *BigBeard86*
> 
> Thanks man. I don't know, MSI Afterburner is weird for me. A lot of times it shows usage over 5GB... when the cards are 4GB. However, when I took that screenshot, I was playing BF4 on Ultra with 16xEQ supersampling AA and 16xAF high quality at 150% resolution scale. TONS of eye candy, so it's quite possible it was using that much.


This beats 3GB VRAM cards IMO! This card rocks! Hopefully the issues will be resolved soon... I'm super excited to buy one.


----------



## Raxus

Quote:


> Originally Posted by *rdr09*
> 
> +rep. i did not care. i unscrewed one with the sticker on it without hesitation.


LOL. Just wanted to know for peace of mind.


----------



## chiknnwatrmln

Wow I got unlucky... My XFX 290 can't even flash a BIOS, and can barely hit 1100 MHz on stock voltage.

How do I increase voltage in MSI AB? The slider is stuck even though I enabled unofficial overclocking and disabled ULPS.
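A voltage slider that stays locked is often down to the unofficial-overclocking flags in MSIAfterburner.cfg rather than ULPS. The key names below come from community Afterburner guides of that era and may differ between versions, so treat them as assumptions and verify against your own file; this sketch just shows the edit programmatically on an in-memory copy:

```python
# Hedged sketch: flip the community-documented unofficial-overclocking keys
# in an Afterburner-style ini file. Key names are assumptions from guides of
# the 290 era -- check your own MSIAfterburner.cfg before editing anything.
import configparser
import io

CFG_TEXT = """\
[ATIADLHAL]
UnofficialOverclockingEULA =
UnofficialOverclockingMode = 0
"""

def enable_unofficial_oc(cfg_text):
    cfg = configparser.ConfigParser()
    cfg.optionxform = str  # preserve the file's key casing
    cfg.read_string(cfg_text)
    cfg["ATIADLHAL"]["UnofficialOverclockingMode"] = "1"
    cfg["ATIADLHAL"]["UnofficialOverclockingEULA"] = (
        "I confirm that I am aware of unofficial overclocking limitations "
        "and fully understand that MSI will not provide me any support on it"
    )
    out = io.StringIO()
    cfg.write(out)
    return out.getvalue()

print(enable_unofficial_oc(CFG_TEXT))
```

Beyond the cfg flags, the 290's voltage controller generally also needs the "Unlock voltage control" checkbox in Afterburner's settings and a new enough build (the 3.0 betas), as pointed out elsewhere in the thread.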


----------



## MlNDSTORM

You guys think there will be some discounts come Black Friday Cyber Monday for the 290 or 290x?


----------



## MrWhiteRX7

Doubtful... too new and it's their flagship. Might see some bundles but price will probably not budge


----------



## eternal7trance

Quote:


> Originally Posted by *MlNDSTORM*
> 
> You guys think there will be some discounts come Black Friday Cyber Monday for the 290 or 290x?


No, but you should watch all the Newegg/TigerDirect promos. I got my card for $380 because of those. I think TigerDirect has $20 off $100 right now if you use V.me.


----------



## BigBeard86

Download MSI Afterburner version beta 17.


----------



## skupples

Quote:


> Originally Posted by *Loktar Ogar*
> 
> Hey BigBeard86,
> 
> Am i reading the memory usage right, 3072MB!?


That seems spot on for max possible settings (including AA)

Quote:


> Originally Posted by *BigBeard86*
> 
> Thanks man. I don't know, MSI Afterburner is weird for me. A lot of times it shows usage over 5GB... when the cards are 4GB. However, when I took that screenshot, I was playing BF4 on Ultra with 16xEQ supersampling AA and 16xAF high quality at 150% resolution scale. TONS of eye candy, so it's quite possible it was using that much.


AB (and other monitoring apps) are known to misread VRAM if they are not properly reset after a system crash/forced power down.
Quote:


> Originally Posted by *MlNDSTORM*
> 
> You guys think there will be some discounts come Black Friday Cyber Monday for the 290 or 290x?


I highly doubt it. Flagship items from any company rarely get a discount on those days. Hell, most of the time prices slowly increase, then go back to $5 off normal for Black Friday/Cyber Monday.


----------



## ReHWolution

Ok, I saved some money: should I go for Accelero Xtreme III or should I not? (290X for me)


----------



## chiknnwatrmln

Ugh overclocking this card is a pain.

So I'm at +100mv, +50 power limit, 1150 core, and 1400 mem.

Card's stable in games, but when I run 3dMark11 I get hardcore throttling, down below 900 MHz. The card is running about 60c, VRM's below 50c.

I don't get why this card is throttling. If I lower power limit to 100% in MSI AB there's no difference, but if I close it and make the power limit +50% in CCC the card doesn't throttle but I don't get to use my fan profile.

Is there any way to keep my additional volts and fan profile but have the increased power limit actually work?


----------



## Kriant

Who at EK came up with the installation procedure for the 3-slot bridge? I can't fit any of the O-rings without them flying all over the place when I try to mount that SLI bridge.


----------



## ReHWolution

Quote:


> Originally Posted by *Kriant*
> 
> Who at EK came up with the installation procedure for the 3-slot bridge? I can't fit any of the O-rings without them flying all over the place when I try to mount that SLI bridge.


Did you try mounting the system outside the case, then plugging all the cards in at once? Every time I see someone installing a 3/4-way bridge, they go for this procedure :O


----------



## airisom2

Argh, this black screen is really aggravating me. I run furmark, black screen. I run valley, black screen. I run crysis 3, black screen. I shut down my computer, black screen (it does it right before it shuts down). I turn on my computer, black screen. My computer idles, black screen. Monitor goes to sleep, black screen (no, really. I move the mouse to wake it up, and it doesn't do anything). I've only had the card for one day...gimme a break.

What sucks even more is that Newegg doesn't allow you to return your card. Replacement only.

So, I'm gonna wait until these drivers get released, and if they don't work, then I'll complain to newegg that they sold me a defective card. Maybe that'll let me get a refund.

At first, I thought it was a powertune problem, which it partly is, because when I ran furmark with the power limit at default, it ran fine. When I upped it to 150%, my VRM and core temps skyrocketed, and I got a black screen within 30sec. Keep in mind that I'm using a Gelid aftermarket cooler.

So, first I thought, maybe it's because of my psu. Nah, I think a single rail 1000w 80+ gold has that covered. Then I thought, maybe I should plug in the molex to give more power to the PCIe lanes. Nope. Still does it. Then, I thought that it could be a BIOS problem, so I flashed the PT1 rom that removes powertune and all of that other stuff completely. Still does it (nice performance boost, though). Then again, I think the PT1 bios makes all that stuff run at the maximum anyways. Then, I thought it could be an OS/driver problem, so I reinstalled Windows. Still does it.

I really hope these drivers fix this problem, but I'm pretty skeptical. I'm thinking that this is more of a power delivery issue than anything, and that's hardware. It looks like Elpida people, like me, suffer this problem more than Hynix users too. Maybe it's a memory voltage problem, or Elpida's memory chips are bad.

Something told me to wait for aftermarket cards to come out...I was dead set on getting a non-reference Asus DC2 Top until this unlocking stuff got me.


----------



## Kriant

Quote:


> Originally Posted by *ReHWolution*
> 
> Did you try mounting the system outside the case, then plugging all the cards in at once? Every time I see someone installing a 3/4-way bridge, they go for this procedure :O


I am doing it outside the case. The o-rings just won't stay in those super shallow indents for them


----------



## ReHWolution

Quote:


> Originally Posted by *Kriant*
> 
> I am doing it outside the case. The o-rings just won't stay in those super shallow indents for them


D'oh! Well, then keep trying, you'll get there.


----------



## Kriant

Quote:


> Originally Posted by *ReHWolution*
> 
> D'oh! Well, then keep trying, you're gonna do that


I gave up -> this is torture, not fun. It either requires some lube or deep-freezing to get those O-rings to stay in place for a few minutes without them jumping out all over the table.
Got my trusty Swiftech connectors... 3 minutes later I was done and re-measuring the tubing distance.


----------



## tsm106

Quote:


> Originally Posted by *Kriant*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ReHWolution*
> 
> D'oh! Well, then keep trying, you're gonna do that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I gave up -> this is torture and not fun. It either requires some lube or deep freezing to get those o-rings in place for a few minutes without them jumping out all over the table.
> Got my trusty swiftech connectors...3 minutes later I am done and re-measuring the tubing distance

O-rings flopping around? Grab the vaseline, go oldschool.


----------



## psychok9

Guys, Radeon R9 290 still have UVD clock in high priority over 3d(max) clock?


----------



## Jpmboy

Quote:


> Originally Posted by *tsm106*
> 
> O-rings flopping around? Grab the vaseline, go oldschool.


actually... you shouldn't use petroleum-based lube on the o-rings. Use a pure silicone-based lube.


----------



## airisom2

Whoa, the weirdest thing just happened. I shut down my computer, and I got the usual black screen. This time, the computer rebooted, and went to the desktop. Afterwards, the computer went to sleep! I don't even have that enabled in power options! When I turned it back on, the computer resumed back on the desktop. So, instead of my computer just shutting off like it was supposed to, it did all that. Now, I'm even more skeptical of the upcoming drivers.


----------



## tsm106

Quote:


> Originally Posted by *Jpmboy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> O-rings flopping around? Grab the vaseline, go oldschool.
> 
> 
> 
> actually... you shouldn't use petroleum-based lube on the o-rings. Use a pure silicone-based lube.

Oh yes that's true, silicone is better and not just for... eh another topic lol.


----------



## ZealotKi11er

Quote:


> Originally Posted by *airisom2*
> 
> Argh, this black screen is really aggravating me. I run furmark, black screen. I run valley, black screen. I run crysis 3, black screen. I shut down my computer, black screen (it does it right before it shuts down). I turn on my computer, black screen. My computer idles, black screen. Monitor goes to sleep, black screen (no, really. I move the mouse to wake it up, and it doesn't do anything). I've only had the card for one day...gimme a break.
> 
> What sucks even more is that Newegg doesn't allow you to return your card. Replacement only.
> 
> So, I'm gonna wait until these drivers get released, and if they don't work, then I'll complain to newegg that they sold me a defective card. Maybe that'll let me get a refund.
> 
> At first, I thought it was a powertune problem, which it partly is, because when I ran furmark with the power limit at default, it ran fine. When I upped it to 150%, my VRM and core temps skyrocketed, and I got a black screen within 30sec. Keep in mind that I'm using a Gelid aftermarket cooler.
> 
> So, first I thought, maybe it's because of my psu. Nah, I think a single rail 1000w 80+ gold has that covered. Then I thought, maybe I should plug in the molex to give more power to the PCIe lanes. Nope. Still does it. Then, I thought that it could be a BIOS problem, so I flashed the PT1 rom that removes powertune and all of that other stuff completely. Still does it (nice performance boost, though). Then again, I think the PT1 bios makes all that stuff run at the maximum anyways. Then, I thought it could be an OS/driver problem, so I reinstalled Windows. Still does it.
> 
> I really hope these drivers fix this problem, but I'm pretty skeptical. I'm thinking that this is more of a power delivery issue than anything, and that's hardware. It looks like Elpida people, like me, suffer this problem more than Hynix users too. Maybe it's a memory voltage problem, or Elpida's memory chips are bad.
> 
> Something told me to wait for aftermarket cards to come out...I was dead set on getting a non-reference Asus DC2 Top until this unlocking stuff got me.


Always fill out the form: http://www.amdsurveys.com/se.ashx?s=5A1E27D25AD12B60

It helps you, AMD, and other AMD users. 10 filled forms do more than 100 people complaining in forums.


----------



## Maxxa

I has one too?!!


3d Mark11 @ Performance Settings
CPU- 965BE @4.0
R9 290 Stock
http://www.3dmark.com/3dm11/7533455

Vs.

CPU- 965BE @4.0
GTX 480 @ *875Mhz*
http://www.3dmark.com/3dm11/7528065

480 Stock
http://www.3dmark.com/3dm11/7527981

Note*: Slightly better memory OC on the system RAM with the 480 results. I lost the RAM OC when switching video cards, which I found odd.
Also Note*: Yeah, I get it, a 965BE is probably not the best pairing for an R9 290, but I have a 4770K + Z87-G45 sitting on the table that will be mine in a month and 3 days.
Also Also Note* The cat's name is Auron he's my guardian.


----------



## sugarhell

I have seen 16k 3DMark scores at stock. Your card is probably getting bottlenecked.









I still have my 965 be

Nice cat by the way


----------



## tsm106

^^That sugar, ruined my post with ninja posting.

I can see the resemblance with your kitty! +1


----------



## Arizonian

Quote:


> Originally Posted by *Maxxa*
> 
> I has one too?!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 3d Mark11 @ Performance Settings
> CPU- 965BE @4.0
> R9 290 Stock
> http://www.3dmark.com/3dm11/7533455
> 
> Vs.
> 
> CPU- [email protected]
> GTX 480 @ *875Mhz*
> http://www.3dmark.com/3dm11/7528065
> 
> 480 Stock
> http://www.3dmark.com/3dm11/7527981
> 
> Note* Slightly better memory OC on the RAM with the 480 results. I lost the RAM OC in switching video cards which I found odd.
> Also Note* Yeah I get it a 965BE is probably not the best pairing for a R9 290 but I have a 4770K+Z87-G45 sitting on the table that will be mine in a month and 3 days.
> Also Also Note* The cat's name is Auron he's my guardian.


Congrats - added


----------



## Mr357

Quote:


> Originally Posted by *Maxxa*
> 
> I has one too?!!


I love your cat


----------



## Maxxa

Quote:


> Originally Posted by *Mr357*
> 
> I love your cat


Thanks! ... So hard to resist, but I'll leave it at that for fear of turning this into a cat thread.
This R9 290 is great though; I can get 70fps on Ultra with 140% scaling in BF4... now time for a new screen.


----------



## Jpmboy

Quote:


> Originally Posted by *tsm106*
> 
> Oh yes that's true, silicone is better and not just for... eh another topic lol.


both good for many uses! just not the same ones.


----------



## eternal7trance

I'm having a strange issue with my 290. When I click on Screen Resolution in Windows, it's showing a 3rd monitor that is not really there. Already reinstalled latest beta drivers and this is a fresh install of W8.1

Also, I notice that if I enable this mystery monitor, I get stuck with a copy of my cursor in the top left of my screen.


----------



## Scorpion49

Anyone have Far Cry 3 and could give it a shot for me and let me know what performance is like? I feel a little let down that these cards can't even reach 120fps in that game at 1080p, even on low settings.


----------



## Kelwing

OK, waving the white flag. This thing is coming out. Sick of crashing to desktop every time I try to play a game, and I'm not even trying to OC it. Maybe I will put it back in when the drivers are better, but my 660 Tis are going back in.


----------



## Forceman

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Ugh overclocking this card is a pain.
> 
> So I'm at +100mv, +50 power limit, 1150 core, and 1400 mem.
> 
> Card's stable in games, but when I run 3dMark11 I get hardcore throttling, down below 900 MHz. The card is running about 60c, VRM's below 50c.
> 
> I don't get why this card is throttling. If I lower power limit to 100% in MSI AB there's no difference, but if I close it and make the power limit +50% in CCC the card doesn't throttle but I don't get to use my fan profile.
> 
> Is there any way to keep my additional volts and fan profile but have the increased power limit actually work?


You can't set +50% in CCC and then still use Afterburner? That's how I have it set - overdrive enabled in CCC but settings controlled via Afterburner. Seems to work fine.


----------



## Falkentyne

Quote:


> Originally Posted by *psychok9*
> 
> Guys, Radeon R9 290 still have UVD clock in high priority over 3d(max) clock?


Nope, no UVD clocks anymore. The RAM no longer auto-downclocks from your Overdrive speed back to idle unless there is VERY little load on the GPU (300-ish MHz core).


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Forceman*
> 
> You can't set +50% in CCC and then still use Afterburner? That's how I have it set - overdrive enabled in CCC but settings controlled via Afterburner. Seems to work fine.


When I have both running, every time I apply a setting via MSI AB it resets itself. Same for CCC, stuck at stock clocks.


----------



## battleaxe

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> When I have both running, every time I apply a setting via MSI AB it resets itself. Same for CCC, stuck at stock clocks.


I'm sure you already know this but just checking;

after you enter a value and hit enter, do you hit "apply" ?

otherwise it will always reset itself.


----------



## psychok9

Quote:


> Originally Posted by *Falkentyne*
> 
> Quote:
> 
> 
> 
> Originally Posted by *psychok9*
> 
> Guys, Radeon R9 290 still have UVD clock in high priority over 3d(max) clock?
> 
> 
> 
> Nope, no UVD clocks anymore. Ram no longer auto downclocks back to 0% speed from your overdrive speed unless there is VERY little load on the GPU (300'ish MHz core).

I did not understand the 0% part.
On the classic older Radeons, like the 5850 (up to the 7970), the clock priority is: 2D/idle 157 GPU / 300 mem -> intermediate 550/1000 -> 3D 725 GPU / 1000 mem -> UVD 400 GPU / 900 mem (accelerated video output and decoding).

So if I launch a hardware-accelerated video and then play a 3D game, the GPU stays at 400/900MHz and we get low performance.

I did not mean the Overdrive clocks, which are also an issue with multi-monitor setups.


----------



## sugarhell

UVD is a different performance state, at least on the 7970s. When you play something in 3D, it stays at 3D clocks. The first priority is the 3D clocks; on my 7970 that's 1000/1500.


----------



## psychok9

Quote:


> Originally Posted by *sugarhell*
> 
> Uvd is a different performance state at least on 7970s. When you play something on 3d stays on 3d clocks. The first priority is 3d clocks on my 7970 1000/1500


Thanks a lot.
Have you tried some HD/1080p videos in hardware-accelerated players? Sometimes, on YouTube with Flash Player, I can't get accelerated video, and then the card never enters the UVD state.


----------



## Scorpion49

Quote:


> Originally Posted by *Kelwing*
> 
> Ok waving white flag. This thing is coming out. Sick of crashing to desktop every time I try to play a game. Not even trying to OC it. Maybe I will put it back in when drivers are better but my 660Ti's are going back in.


I'm about to do the same. Tried playing the first intensive game I've felt like playing since I bought them: white flashes and artifacts all over the screen, and having to run the lowest settings to make 70fps on a 1080p screen. Geez, my old 580 can do better than this. I should have kept the 780s.


----------



## sugarhell

Quote:


> Originally Posted by *psychok9*
> 
> Thanks a lot.
> Have you tried some HD/1080p videos in hardware-accelerated players? Sometimes, on YouTube with Flash Player, I can't get accelerated video, and then the card never enters the UVD state.


Don't use Flash Player on YouTube; its hardware acceleration is buggy with AMD.


----------



## samoth777

hi 290/290x users,

Does slapping a block on these cards give a substantial benefit in terms of max overclocking?

Can it comfortably stay in the red zone? What MHz can one expect?

thanks!


----------



## psychok9

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *psychok9*
> 
> Thanks a lot.
> Have you tried some HD/1080p videos in hardware-accelerated players? Sometimes, on YouTube with Flash Player, I can't get accelerated video, and then the card never enters the UVD state.
> 
> 
> 
> Dont use flash player on youtube. Its buggy with amd.

That was not the point; it was to make sure that, in your experience, hardware acceleration really was enabled without sacrificing 3D game performance (as on Nvidia cards).


----------



## chiknnwatrmln

Quote:


> Originally Posted by *battleaxe*
> 
> I'm sure you already know this but just checking;
> 
> after you enter a value and hit enter, do you hit "apply" ?
> 
> otherwise it will always reset itself.


Yes.

As long as I don't run AB and CCC's OC tool at the same time, they each work independently.

The only thing is for some reason AB's Power Limit setting won't actually do anything for me; I set it to +50% and it still throttles.

If I use CCC's OC tool then the card doesn't throttle, but it gets very hot and eventually artifacts.

CCC's PowerTune thing is terrible. They should just let the user set a fan profile and make it so the card doesn't throttle.


----------



## evensen007

Quote:


> Originally Posted by *samoth777*
> 
> hi 290/290x users,
> 
> does slapping a block on these cards have a substantial benefit in terms of max overclocking?
> 
> can it comfortably stay at the red zone? what mhz can one expect?
> 
> thanks!


Unfortunately, the overclocking potential doesn't have much to do with temps as much as winning the chip lottery. My 290's won't even run at 1100 with +100mv in afterburner, and won't even run at 1000 at stock volts. This is under water and max core and vrm temps in the upper 50's.


----------



## skupples

Quote:


> Originally Posted by *psychok9*
> 
> That was not the point; it was to make sure that, in your experience, hardware acceleration really was enabled without sacrificing 3D game performance (as on Nvidia cards).


What? In my experience the only hardware acceleration I can turn on and off is in the application itself, like Google Chrome.

While on multi-monitor setups cards tend to stay at 3D clocks, so a lot of us use Nvidia Inspector to force a downclock, with zero performance hit in 2D/hardware accel.


----------



## psychok9

Quote:


> Originally Posted by *skupples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *psychok9*
> 
> That was not the point; it was to make sure that, in your experience, hardware acceleration really was enabled without sacrificing 3D game performance (as on Nvidia cards).
> 
> 
> 
> What? In my experience the only hardware acceleration I can turn on and off is in the application itself, like Google Chrome.
> 
> While on multi-monitor setups cards tend to stay at 3D clocks, so a lot of us use Nvidia Inspector to force a downclock, with zero performance hit in 2D/hardware accel.

I'm sorry if I was not clear.

I've noticed this strange behaviour on AMD cards: sometimes YouTube starts in HTML5 mode, or Flash starts in *windowed* mode (not fullscreen), and DXVA/hardware acceleration is *still* disabled; if you click fullscreen, the clocks ramp up to 400/900.
You can check whether DXVA is on or off by enabling dynamic contrast or demo mode (split screen).

The best way to check UVD priority is to start a 1080p video in a hardware-accelerated player and then play a game.


----------



## jerrolds

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> When I have both running, every time I apply a setting via MSI AB it resets itself. Same for CCC, stuck at stock clocks.


How are temps looking? You should be able to hit 1100 at stock volts with a decent fan profile. I wonder if you need to redo the TIM.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *jerrolds*
> 
> How are temps looking? You should be able to hit 1100 at stock volts and a decent fan profile. I wonder if you need to redo the TIM


My temps are 65-70c, VRMs have only gotten up to about 55c.

The temps aren't a problem, it's the card throttling because of the power limit.

The throttling is worse when I up the voltage, which makes sense.

I took a screenshot right after waiting at the Metro LL intro menu while running max settings and 4x SSAA.

Temps are fine.

CPU is not bottlenecking because my GPU usage is at 100%.

You can tell it's the power limit because all of the power related graphs are spiky and fluctuate.

1100 MHz is my max OC at stock voltage, increasing by 100mV yields an additional 50MHz.

throttling.png 306k .png file
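The spiky power-related graphs described above can be told apart from thermal throttling in a logged clock trace: thermal throttling is a smooth, sustained drop, while power-limit throttling shows up as rapid dips below the target clock. A minimal sketch; the sample trace and 2% tolerance are illustrative assumptions, not the screenshot's data:

```python
# Sketch: flag power-limit-style throttling from a list of sampled core
# clocks (MHz), e.g. exported from a monitoring tool's log. Frequent short
# dips below the target suggest a power limit rather than temperature.
def throttle_stats(samples_mhz, target_mhz, tolerance=0.02):
    floor = target_mhz * (1 - tolerance)
    dips = sum(1 for s in samples_mhz if s < floor)
    return {
        "dip_ratio": dips / len(samples_mhz),  # fraction of samples below target
        "worst_mhz": min(samples_mhz),
    }

# Hypothetical spiky trace around a 1150 MHz target
trace = [1150, 1150, 890, 1150, 905, 1150, 1150, 880, 1150, 1150]
print(throttle_stats(trace, 1150))
```

A trace like this, with the card cool and dips recurring every few samples, points at the power limit; a temperature-driven clamp would instead hold the lower clock continuously.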


----------



## Mas

So, where are these new drivers that I thought we were supposed to be getting a day or so ago?


----------



## Raxus

Quote:


> Originally Posted by *Mas*
> 
> So, where are these new drivers that I thought we were supposed to be getting a day or so ago?


I think Warsam said yesterday that it could be a couple of days.


----------



## the9quad

Quote:


> Originally Posted by *Mas*
> 
> So, where are these new drivers that I thought we were supposed to be getting a day or so ago?


The rep posted that they ran into a snag and it would take an extra day or so. Another rep posted that it has something to do with GDDR5 memory training under different workloads.


----------



## Mas

Thanks guys


----------



## Scorpion49

Hrm, well Newegg doesn't want their 290's back very much. That was a fun conversation, I don't think I'll ever buy from them again.


----------



## Raxus

Quote:


> Originally Posted by *Scorpion49*
> 
> Hrm, well Newegg doesn't want their 290's back very much. That was a fun conversation, I don't think I'll ever buy from them again.


I managed to get them to give me a store credit for my black screening 290x.

I think I told them it was defective and that I would take a store credit to buy a 290x from a different manufacturer.

Something along those lines anyway.


----------



## Scorpion49

Quote:


> Originally Posted by *Raxus*
> 
> I managed to get them to give me a store credit for my black screening 290x.
> 
> I think I told them it was defective and that I would take a store credit to buy a 290x from a different manufacturer.
> 
> Something along those lines anyway.


Yeah, no. They didn't want any of that. That's all I wanted, a credit to get something different; instead they lost a customer who has been shopping there since the freaking website started. They eventually told me to try an RMA with Gigabyte for a different model or something. That's about the point where I told them I would never shop there again and that Amazon would be getting my business instead.


----------



## selk22

Quote:


> Originally Posted by *Raxus*
> 
> I managed to get them to give me a store credit for my black screening 290x.
> 
> I think I told them it was defective and that I would take a store credit to buy a 290x from a different manufacturer.
> 
> Something along those lines anyway.


Yeah I just black screened myself at stock memory clocks in planetside 2 for the 3rd or 4th time... If the driver updates don't fix things I think I will try the same. It is really a shame because I actually really like this card and want to support AMD! But they really are testing my faith on this one.

Anyone RMA with Sapphire before? How long will I be without a GPU?


----------



## Raxus

Quote:


> Originally Posted by *Scorpion49*
> 
> Yeah, no. They didn't want none of that. Thats all I wanted was a credit to get something different, instead they lost a customer that has been shopping there since the freaking website started. They eventually told me to try and RMA with Gigabyte for a different model or something. Thats about the point where I told them I would never shop there again and Amazon would be getting my business instead.


Sorry, I just popped back into the thread; are you getting the black screen crap?


----------



## MrGaZZaDaG

So if the card is not throttling down due to temps.. it must also have something in place so it can only draw 'X' amount of power...

=/ Slightly disappointed now that I shelled out on water cooling to try and achieve some decent clocks...
Well, I hope my XFX R9 290 rev 1.1 can unlock to a 290X.


----------



## Scorpion49

Quote:


> Originally Posted by *selk22*
> 
> Yeah I just black screened myself at stock memory clocks in planetside 2 for the 3rd or fourth time... If the driver updates don't fix things I think I will try the same. It is really a shame because I actually really like this card and want to support AMD! But they really are testing my faith on this one.


I was patiently waiting for that driver update until I discovered that my GTX 580, which has been around for years, can provide equal levels of performance in several games I play. I don't think there is anything physically wrong with the cards, but they do seem to spend an awful lot of time at 0% utilization when I'm not benching.


----------



## eternal7trance

Why are you guys complaining about newegg's return policy? It's on the page where you buy the card...


----------



## Scorpion49

Quote:


> Originally Posted by *Raxus*
> 
> Sorry I just popped back into the thread, you getting the blackscreen ****?


I've gotten many black screens, but my problem is that performance is lacking and I see a ton of white frames. The full screen will flash white for 1 or 2 frames and then go back to normal. It's driving me nuts because it's like a strobe light.


----------



## Raxus

Quote:


> Originally Posted by *MrGaZZaDaG*
> 
> So if the card is not throttling down due to temps.. it must also have something in place so it can only draw 'X' amount of power...
> 
> =/ slightly disappointed now, that I shelled out on water cooling to try and achieve some decent clocks...
> Well I hope my XFX R9 290 rev 1.1 can unlock to a 290x


I did notice with my Sapphire 290X that the card would be at a steady 1100, but in BioShock it would run around 900... bizarre.


----------



## Raxus

Quote:


> Originally Posted by *Scorpion49*
> 
> I was patiently waiting for that driver update until I discovered that my GTX 580 thats been around for years can provide equal levels of performance in several games I play. I don't think there is anything physically wrong with the cards, but they do seem to spend an awful lot of time at 0% utilization when I'm not benching.


In what games is the 580 on par with the 290?


----------



## Raxus

Also I'm kinda confused why everyone seems to expect a 290 to overclock and perform on par with 290xs? I'm not nearly as experienced as most of you guys. But it seems kinda crazy to expect a $400 of hardware to perform inline with a $550 one.

I assume there's some reason the 290Xs are $150 more? Better-binned chips maybe?


----------



## Raxus

Quote:


> Originally Posted by *eternal7trance*
> 
> Why are you guys complaining about newegg's return policy? It's on the page where you buy the card...


As long as they replace defective cards with the same model, I can't complain too much, I would think.

I did get lucky, but I was also very polite whenever I emailed customer support. It really goes miles in your favor.


----------



## bond32

Koolance water block is installed. Played some bf4, ran a few benchmarks, max core temp hit 46 C, max VRM hit 50 C. Pretty stoked, I like it a lot. Can post pictures later


----------



## Raxus

Quote:


> Originally Posted by *bond32*
> 
> Koolance water block is installed. Played some bf4, ran a few benchmarks, max core temp hit 46 C, max VRM hit 50 C. Pretty stoked, I like it a lot. Can post pictures later


You got pretty lucky; I see a lot of people with the Sapphire 290X cards taking a dump. Although according to Warsam, it's a driver thing.


----------



## jomama22

Quote:


> Originally Posted by *Raxus*
> 
> As long as they replace defective cards with the same, can't complain to much I would think.
> 
> I did get lucky, but I was also very polite and always em with customer support.. It really goes miles in your favor.


A courteous, understanding yet firm undertone always wins in the end. The amount of freebies, discounts, gift cards, etc. I have received because of this is honestly ridiculous lol.

When you walk into a CS conversation with a chip on your shoulder, it never goes well haha.


----------



## jomama22

Quote:


> Originally Posted by *Raxus*
> 
> You got pretty lucky, see a lot of people with the sapphire 290x cards taking a dump. Although according to Warsam, its a driver thing.


Sapphire has provided me with the best cards tbh. 3 out of 4 290xs from them will hit 1345+ core @ ~1.42v actual. Compare that to 2/6, other brands, that could only reach 1300-1310 before needing over 1.42v actual.


----------



## Scorpion49

Quote:


> Originally Posted by *Raxus*
> 
> What games is the 580 on par with the 290?


I decided to play Far Cry 3. At the exact same settings, my single old 580 gets 70-80fps, 290 CF gets 70-80fps with the added bonus of weird white screens. FC3 has sucked optimization-wise since it came out, so if I didn't already know that 7950CF can play it much better than this I wouldn't really be upset I guess.


----------



## jomama22

Quote:


> Originally Posted by *Scorpion49*
> 
> I decided to play Far Cry 3. At the exact same settings, my single old 580 gets 70-80fps, 290 CF gets 70-80fps with the added bonus of weird white screens.


You know Far Cry 3 is a disaster on anything above a 680/7970, especially SLI/crossfire.

Why don't you try it with cfx disabled?


----------



## skupples

Quote:


> Originally Posted by *Scorpion49*
> 
> Yeah, no. They didn't want none of that. Thats all I wanted was a credit to get something different, instead they lost a customer that has been shopping there since the freaking website started. They eventually told me to try and RMA with Gigabyte for a different model or something. Thats about the point where I told them I would never shop there again and Amazon would be getting my business instead.


You, my friend, are not the only one. They have been conducting a lot of shady business lately. They are too big to care about one customer, which is part of the problem.

FC3 is a disaster; it ran gloriously on my 670s, but runs TERRIBLE on my Titans... Go figure.


----------



## Cubeman

http://www.techpowerup.com/gpuz/533qc/

Validation. Got mine for $563. Only one we got in my store too. Customers are upset, haha.


----------



## jerrolds

Just got my first black screen playing BF4... well, that sucks, I thought I was immune.







Sapphire w/ Elpida @1150/1500 1.293v in GPU Tweak at the time.


----------



## Scorpion49

Quote:


> Originally Posted by *jomama22*
> 
> You know far cry 3 is a disaster on anything above a 680/7970, especially sli/crossfire.
> 
> Why don't you try it with cfx disabled?


It ran great on 7950CF, 680 SLI, 690, Tri-670's, and probably other setups I've used that I can't think of. The only card I had that didn't like the game was the Titan.

Quote:


> Originally Posted by *skupples*
> 
> You my friend are not the only one. They have been conducting allot of shady business lately. They are too big to care about one customer, which is part of the problem.
> 
> FC3 is a disaster, it ran gloriously on my 670's, but runs TERRIBLE on my titans... Go figure.


Yeah, I understand their RMA policy is written there but I don't want to get the same thing back and have more issues after waiting for shipping. I don't know what I'll do right now honestly. Maybe they can sit on my desk until the driver patch comes out.


----------



## jomama22

Quote:


> Originally Posted by *skupples*
> 
> You my friend are not the only one. They have been conducting allot of shady business lately. They are too big to care about one customer, which is part of the problem.
> 
> FC3 is a disaster, it ran gloriously on my 670's, but runs TERRIBLE on my titans... Go figure.


He was asking for something that was clearly prohibited on the sale page. Just because someone was polite enough to overlook that for someone else doesn't mean it's the rule; if anything, it is the exception.

If I was Newegg I would do the same thing. It's like ordering a well-done grilled steak then complaining that it's too dry, what do you expect?

No one forced anyone to buy through Newegg. We all know their lame return policy on chips/gfx cards. But that isn't an excuse to be upset because they upheld their own rule.

If you couldn't tell, I'm in the service industry lol.


----------



## jomama22

Quote:


> Originally Posted by *Scorpion49*
> 
> It ran great on 7950CF, 680 SLI, 690, Tri-670's, and probably other setups I've used that I can't think of. The only card I had that didn't like the game was the Titan.
> Yeah, I understand their RMA policy is written there but I don't want to get the same thing back and have more issues after waiting for shipping. I don't know what I'll do right now honestly. Maybe they can sit on my desk until the driver patch comes out.


I would just RMA through newegg. 3/4 sapphires I have are absolute boss clocking wise. I personally have had 0 issues with black screens and such.


----------



## Ukkooh

My card has got 73.9% asic quality and elpida mem. How is that asic for overclocking? From my quick tests the card seems to be very sensitive to temps as I start artifacting @1100 mhz right after the temps pass 80°C.


----------



## Stay Puft

Quote:


> Originally Posted by *Raxus*
> 
> Also I'm kinda confused why everyone seems to expect a 290 to overclock and perform on par with 290xs? I'm not nearly as experienced as most of you guys. But it seems kinda crazy to expect a $400 of hardware to perform inline with a $550 one.
> 
> I assume theres some reason the 290xs are $150 more? Better selected chips maybe?


290s have only 10% fewer stream processors and lower clocks. Those are the only two differences between the cards.
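For rough intuition on that gap, here's a back-of-envelope peak-FP32 comparison, assuming the usual reference specs (2560 stream processors @ 947 MHz for the 290, 2816 @ 1000 MHz for the 290X). This is peak math only and says nothing about memory, drivers, or real game performance:

```python
# Peak FP32 throughput sketch: SPs x 2 ops per cycle (FMA) x clock.
# Assumed reference specs: R9 290 = 2560 SPs @ 947 MHz,
#                          R9 290X = 2816 SPs @ 1000 MHz.

def peak_gflops(stream_processors, clock_mhz):
    """Peak single-precision GFLOPS (2 ops per SP per cycle via FMA)."""
    return stream_processors * 2 * clock_mhz / 1000.0

r9_290 = peak_gflops(2560, 947)    # ~4849 GFLOPS
r9_290x = peak_gflops(2816, 1000)  # ~5632 GFLOPS
print(f"290: {r9_290:.0f}, 290X: {r9_290x:.0f}, gap: {1 - r9_290 / r9_290x:.1%}")
```

On paper that's roughly a 14% gap at stock, which is also why a 290 overclocked well past reference clocks keeps landing so close to a stock 290X.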


----------



## DeadlyDNA

Black screens have only happened a few times for me, and I am also certain that in my case it was for a few reasons:
1. I screwed around with lowering fans in CCC
2. I unplugged my GPU fan intentionally and ran a benchmark - black screen within 1 minute of use.
3. When I ran trifire with the cards right next to each other

I also feel that in my case the issue was temperature related. I spaced out the trifire cards on my X79 board for now until my water cooling stuff arrives tomorrow. Since then, I also run these CCC settings: target temp 65C, 100% fan speed.
Yes, it's noisy as hell, but it will soon be fixed. Since I did this I have had no black screens, and I've been gaming on them all week and running benchmarks.

Also, I forgot: of 4 cards I did have 1 DOA right out of the box. I am still waiting on the replacement. If there isn't already a plethora of benchmarks for water vs. stock, I might go ahead and try to compare them if people need the results.


----------



## Scorpion49

Quote:


> Originally Posted by *jomama22*
> 
> He was asking for something that was clearly prohibited on the sale page. Just because someone was polite enough to overlook that for someone else, doesn't meant its the rule and if anything, it is the exception.
> 
> If I was newegg I would do the same thing. Its like ordering a well done grilled steak then complaining that its too dry, what do you expect?
> 
> No one forced anyone to buy through newegg. We all know the lame return policy of chips/gfx cards from them. But that isn't an excuse to be upset because they upheld their own rule.
> 
> If you couldn't tell, I'm in the service industry lol.


That's fine, they can enjoy their replacement-only policy without my money, or that of anyone I know, from here on out. Lesson learned: go with a company that still has some semblance of customer support.


----------



## Taint3dBulge

I've been getting the black screens a lot too the last few days. It stopped and never happened for about a week, then all of a sudden it's back. Nothing has changed settings-wise either.


----------



## Kriant

Quote:


> Originally Posted by *tsm106*
> 
> O-rings flopping around? Grab the vaseline, go oldschool.


I thought of that, actually. But (and here goes a bad old joke): "I've used it up on OTHER things". In all reality though, I didn't have anything handy in my apt that would work, so I've used my Swiftech connectors; now testing for leaks.

But it's a PITA, and I probably won't use that tri-SLI connector till my next build or something. There's no real advantage of the EK connector block over separate connectors, right?


----------



## MrWhiteRX7

*I figured out the downclock issue!!!!!!!!!!!!!!!!!* Unreal!!!

Ok, for those using AB b17: once you have applied your overclock, you MUST GO BACK INTO CCC and reset the Power Limit back to 50% again. Even though it says it's 50% in AB, it's NOT 50% in CCC.

Once I did this I held my overclock without one single dip! WOOOOOOOOOOOOO!


----------



## xTristinx

http://www.3dmark.com/3dm11/7534174. here are my results but why in the world am i only getting 17000 when each of my cards get 12300-12800


----------



## HeliXpc

Fellas, how do you check if you have Elpida or Hynix? And which one is better?


----------



## hatlesschimp

I got another 290X today. That makes it a Gigabyte & Powercolor


----------



## Stay Puft

Quote:


> Originally Posted by *xTristinx*
> 
> http://www.3dmark.com/3dm11/7534174. here are my results but why in the world am i only getting 17000 when each of my cards get 12300-12800


Your physics score was only 8000. CPU throttling?


----------



## chiknnwatrmln

My card is so loud...

When it's under load it makes a loud buzzing noise that sounds too low to be coil whine. It's louder than the fan up to 50%.

If my card isn't an exceptional OCer then I'm going to RMA it.


----------



## Arizonian

Quote:


> Originally Posted by *hatlesschimp*
> 
> I got another 290X today. That makes it a Gigabyte & Powercolor
> 
> 
> 
> Spoiler: Warning: Spoiler!


Nice - updated









PS> Cool dog and bunny you got there. They match too.


----------



## Stay Puft

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> My card is so loud...
> 
> When it's under load it makes a loud buzzing noise that sounds too low to be coil whine. It's louder than the fan up to 50%.
> 
> If my card isn't an exceptional ocer then I'm going to RMA it.


Rma because it's not a good overclocker? Are you out of your mind?


----------



## MrWhiteRX7

I scanned the first page looking for the command line/s to do +200mv in AB B17... anyone know this real quick? lol I need mo voltagez


----------



## MrWhiteRX7

Ohhh and btw add me! LOL I've been rocking out for a minute now but keep forgetting to post my card









Sapphire 290


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Stay Puft*
> 
> Rma because it's not a good overclocker? Are you out of your mind?


No, I'm gonna RMA it because of the ridiculously loud buzzing noise. The only way I'm putting up with it is if this card does 1250+ MHz.


----------



## xTristinx

How would I go about fixing this?


----------



## Arizonian

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Ohhh and btw add me! LOL I've been rocking out for a minute now but keep forgetting to post my card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sapphire 290
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added








Quote:


> Originally Posted by *xTristinx*
> 
> http://www.3dmark.com/3dm11/7534174. here are my results but why in the world am i only getting 17000 when each of my cards get 12300-12800


Quote:


> Originally Posted by *xTristinx*
> 
> How would I go about fixing this


Wish I could help with crossfire issues, but I haven't done that myself. So far I've only had single-GPU flagships and one dual-GPU setup.

I'm sure another member will chime in here.


----------



## hatlesschimp

I can't get 3DMark 11 to run; it just freezes at the start.

Also, I slotted the second card into the 3rd double slot down. (Slot 1 Card 1, Slot 5 Card 2, etc.)

Should I reinstall the drivers?

What are the latest drivers? I have 13.11. Sorry, I'm new to AMD and how they do things. My last AMD card was a 4XXX.

What's the recommended way to remove the AMD drivers?

Also, I have the switch on the cards set to the left???

LOL Thanks

Cheers

Cheers


----------



## Scorpion49

Quote:


> Originally Posted by *xTristinx*
> 
> http://www.3dmark.com/3dm11/7534174. here are my results but why in the world am i only getting 17000 when each of my cards get 12300-12800


I have money on an unstable CPU overclock. My last 3770K would do 10k+ physics at 400 MHz less under Win 8.


----------



## Arizonian

Quote:


> Originally Posted by *hatlesschimp*
> 
> I cant get 3Dmark11 to run it just freezes at the start.
> 
> Also I slotted the second card into the 3rd double slot down. (Slot 1 Card 1, Slot 5 Card 2 etc)
> 
> Should I re install the drivers?
> 
> Whats the latest drivers I have 13:11. Sorry Im new to AMD and how they do things. My last AMD card was a 4XXX.
> 
> Whats the recommended way to remove the AMD Drivers?
> 
> Also I have the switch on the cards set to the left???
> 
> LOL Thanks
> 
> Cheers


A really good tool that worked for me is linked in the OP.


----------



## Sgt Bilko

Quote:


> Originally Posted by *selk22*
> 
> Same! I think maybe the new drivers may allow this to stop so im going to be holding on to mine for a while..
> 
> We seem to have similar cards Bilko, whats your ASIC?


ASIC is 70.4% for me


----------



## MrGaZZaDaG

So what is this 'ASIC' you're talking about and how do you check that?


----------



## Sgt Bilko

Quote:


> Originally Posted by *MrGaZZaDaG*
> 
> So what is this 'ASIC' you're talking about and how do you check that?


Right-click on the top of GPU-Z when it's open and select "Read ASIC Quality".
I'm not sure exactly what the scores mean, but GPU-Z has a short explanation.

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> *I figured out the downclock issue!!!!!!!!!!!!!!!!!* Unreal!!!
> 
> Ok for those using AB b17, once you have applied your overclock and all you MUST GO BACK IN TO CCC and reset the Power Limit back to 50% again. Even though it says it's 50% in AB it's NOT 50% in CCC.
> 
> Once I did this I held my overclock without one single dip! WOOOOOOOOOOOOO!


Yep, I thought I posted this earlier in the thread..

Arizonian, might be worthwhile adding this to the OP


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yep, i thought i posted this earlier in the thread..
> 
> Arizonian, Might be worthwhile adding this to the OP


Yes please add this to the front page! I know there were two others having my same issue, they need to see this


----------



## grandpatzer

Quote:


> Originally Posted by *Arizonian*
> 
> You should be in Uber mode honestly at the very least. A manual fan curve after that will take care of that even further.


Can you please explain?

My card's BIOS switch is currently in the position closest to the DVI connector; I think that is the silent mode?
I put the fan on 100% static and still had clock throttling, and the card was at MAX 74C..... (I've tried 947c/1250m & also 1200c/1500m)


----------



## Sgt Bilko

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Yes please add this to the front page! I know there were two others having my same issue, they need to see this


I also found out that whenever you set a new clock and your screen flashes, you need to bump the power limit back up in CCC as well; it resets every time the driver does, I think.

EDIT: I just checked, and AB b17 is changing the power limit in CCC properly now; I have graphics overdrive turned off in CCC.


----------



## hotrod717

Quote:


> Originally Posted by *pharma57*
> 
> http://videocardz.com/48107/overclockers-uk-offering-pre-flashed-vtx3d-r9-290-290x


Not sure how this only got one response, but for all those naysayers and pessimists, this should go a long way toward legitimizing the 290X-chip-in-a-290, aka BIOS-flash, debate. Please read where it says TUL Corp. didn't receive 290 chips, so they used a 290 BIOS with a 290X chip for some initial-release cards.


----------



## grandpatzer

Quote:


> Originally Posted by *Kriant*
> 
> 
> 
> That's what I was talking about, that one screw is bigger than the rest, and fits in the upper right corner of the card, when mounting EK block, but the instructions doesn't say that this is the one to use, and when I used any of the smaller once - they would get stuck and then that thing in which you screw this screw would start twisting.


Is yours Acetal (black) or Plexi?

My guess is you have Acetal and it's the top-right screw in this picture?
http://www.ekwb.com/shop/EK-IM/EK-IM-3831109868539.pdf


----------



## hatlesschimp

So I installed the second 290X. I removed the drivers and reinstalled, and when I open GPU-Z or 3DMark 11 I get this. I'm taking the missus out for tea, but when I get back I'm putting the 2nd 290X in slot 1 by itself and working from there. Anyone have any ideas?


----------



## Falkentyne

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Yes.
> 
> As long as I don't run AB and CCC's OC tool at the same time, they each work independently.
> 
> The only thing is for some reason AB's Power Limit setting won't actually do anything for me; I set it to +50% and it still throttles.
> 
> If I use CCC's OC tool then the card doesn't throttle, but it gets very hot and eventually artifacts.
> 
> CCC's PowerTune thing is terrible. They should just let the user set a fan profile and make it so the card doesn't throttle.


It works. I just tested it just now.

-50% in afterburner caused my 1100/1400 clocks to run at 800 MHz in Valley benchmark.
-25% in afterburner ->933 MHz in valley.

Afterburner works perfectly, zero problems.

Make sure you 1) HAVE VCORE MONITORING DISABLED.
2) HAVE OVERDRIVE DISABLED IN CCC. DISABLED meaning do NOT enable it.


----------



## MrWhiteRX7

Now if only there were a driver release that didn't give us this kind of GPU usage! LOL

The game is smooth but the FPS was jumping around pretty frequently; I exited the game and found out why.


----------



## FloRin050

Quote:


> Originally Posted by *Lennyx*
> 
> For me the 290x was horrible on air. I ran 2 valley runs on my 2nd rig on air before i put the waterblock on. Could not stand the noise, it was that bad.
> So after that experience and if i was to buy a air cooled gpu. I would not buy the 290/290x before new coolers arrive.
> My advice would be to get a gtx 780 with a decent cooler.


And when playing BF4 with a stock 290 and default fan speeds, is it that bad?
I'd like to buy a normal 290, but I'm worried I will regret it.


----------



## Arizonian

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Right click on the top of GPU-Z when it's open and select "Read ASIC Quality"
> I'm not sure exactly what the scores mean btu GPU-Z has a short explanation.
> Yep, i thought i posted this earlier in the thread..
> 
> Arizonian, Might be worthwhile adding this to the OP


Ok, good suggestion - this way new members can refer to the OP rather than hunt it down. Done.


----------



## grandpatzer

What is a safe voltage for these cards, both as set in software and after vdroop?


----------



## grandpatzer

Quote:


> Originally Posted by *jomama22*
> 
> Sapphire has provided me with the best cards tbh. 3 out of 4 290xs from them will hit 1345+ core @ ~1.42v actual. Compare that to 2/6, other brands, that could only reach 1300-1310 before needing over 1.42v actual.


Is that 1.42v in software or after Vdroop?

I'm watercooling my R9 290; still not sure if I have the balls to run it 24/7 at 1.42v. I played some games at 1.25v with my 7950.
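For anyone new to the terminology: the "after vdroop" voltage sits below the software setpoint because of the VRM load line, so the droop scales with current draw. A minimal sketch with made-up numbers (the 0.5 mOhm load line and 200 A draw are purely hypothetical, not measured 290 values):

```python
def load_voltage(v_set, current_a, loadline_mohm):
    """Effective core voltage under load: software setpoint minus I*R droop."""
    return v_set - current_a * loadline_mohm / 1000.0

# Hypothetical example: a 1.40 V software setpoint drooping ~0.10 V at 200 A.
print(round(load_voltage(1.40, 200, 0.5), 3))
```

Which is why two people quoting "1.42v" can mean quite different real voltages depending on whether they read the setpoint or the loaded value.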


----------



## grandpatzer

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Black screens for me only happened a few times for me. I am also certain in my case it was for a few reasons.
> 1. I screwed around with lowering fans in CCC
> 2. i unplugged my gpu fan intentionally and ran a benchmark - black screen within 1 minute of use.
> 3. when i ran trifire with the cards right next to each other
> 
> I also feel in my case my issue was temperature related. I spaced out the trifire on my x79 board for now until my water cooling stuff arrives tomorrow. Since then, i also run CCC settings : target temp 65C - 100% fan speed.
> Yes it's noisy as hell but it will soon be fixed. Since i did this i have had no black screens and i've been gaming on them all week, and running benchmarks.
> 
> Also forgot, of 4 cards i did have 1 DOA right out of the box. I am waiting on the replacement still. If there are not a huge plethora of benchmarks for water vs stock. I might go ahead and try to compare them if people need the results.


I would appreciate some benching on water.








Maybe I'll bench some games myself stock vs MAX OC, also I should have some old CFX 7950 logs/benches.


----------



## jomama22

Quote:


> Originally Posted by *grandpatzer*
> 
> Is that 1.42v in software or after Vdroop?
> 
> I'm watercooling my R9 290, still not sure If I have the balls to run it 24/7 at 1.42v, played some games at 1.25v with my 7950


That's for benching. Wouldn't dare that for 24/7. And that is voltage after vdroop on pt1 290x bios.


----------



## grandpatzer

Quote:


> Originally Posted by *jomama22*
> 
> That's for benching. Wouldn't dare that for 24/7. And that is voltage after vdroop on pt1 290x bios.


Ok, what voltages are you running 24/7?

I'm probably going to start out at +100mv in MSI AB, then if I feel comfortable start finding ways to OC more.
I'm still waiting for my waterblock; currently the card is in my secondary PC. It passed Valley @ 1200 core/1500 memory at +100mv in MSI AB, but the CPU might be bottlenecking, so I'm not sure if the card is stable at those settings.


----------



## Prospector

Is there any chance the new driver comes out today??


----------



## Lennyx

Quote:


> Originally Posted by *FloRin050*
> 
> And when playing BF4 with a stock 290, and default fan speeds, is it so bad?
> i'd like to buy a normal 290, but i'm worried i will regret it.


It all comes down to how much noise you feel is OK. The only thing I tested was the stock BIOS on my 290X. Didn't even try Uber because the noise it made at stock running Valley was extreme.
So I actually only used the card with the stock cooler for maybe 10 minutes.

You can do a test: get your vacuum cleaner and place it next to your computer. Turn the vacuum cleaner on, and if you feel the noise is OK to game with, then get the reference card.

There is always the choice of going custom water or getting a 3rd-party cooling solution. If you want to spend the extra money on changing the cooler, then the 290 is a great choice.


----------



## FloRin050

Quote:


> Originally Posted by *Lennyx*
> 
> There is always the choice of going custom water or getting a 3rd party cooling solution. If you want to spend the extra money on changing the cooler then the 290 is a great choice.
> I


Is it that bad?
And well, since this is my first build I'm not sure how to do watercooling etc.
Seems to be a lot of work + a lot of extra money/parts.
For that money I could also just buy a 780 Ti ;p


----------



## Falkentyne

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> *I figured out the downclock issue!!!!!!!!!!!!!!!!!* Unreal!!!
> 
> Ok for those using AB b17, once you have applied your overclock and all you MUST GO BACK IN TO CCC and reset the Power Limit back to 50% again. Even though it says it's 50% in AB it's NOT 50% in CCC.
> 
> Once I did this I held my overclock without one single dip! WOOOOOOOOOOOOO!


Mine says whatever I set it to in Afterburner. I tested it directly. No problems, and I don't have to touch CCC; it's disabled. DO NOT ENABLE OVERDRIVE IN CCC and your AB settings will WORK.
Even though overdrive is not enabled, it still SEES the overdrive settings. Probably because PowerTune itself is enabled but the SETTINGS to change it through CCC are "locked".


----------



## Sgt Bilko

Quote:


> Originally Posted by *Falkentyne*
> 
> Mine says whatever I set it to in afterburner. I tested it directly. No problems and i dont have to touch CCC. its disabled. DO NOT ENABLE OVERDRIVE IN CCC and your AB settings will WORK.
> Even though overdrive is not enabled, it still SEES the overdrive settings. Probably because powertune itself is enabled but the SETTINGS to change it through CCC are "locked".


If I disable graphics overdrive in CCC and set the power limit in AB it's fine, but for some reason I used to have to change it in CCC to get it to work properly.

This might vary from rig to rig.


----------



## NorcalTRD

Try just using AMD OverDrive in the Catalyst Control Center.
When I used GPU Tweak to change my fan graph it throttled the card.


----------



## Lennyx

Quote:


> Originally Posted by *FloRin050*
> 
> is it that bad?
> And well, since this is my first build i'm not sure how to do watercooling etc.
> Seems to be alot of work + alot of extra money/parts.
> For that money i could also just buy a 780TI ;p


It is that bad, it really is. It's a great performance/price card.
You can get closed headphones, btw. Then the card won't bother you.


----------



## selk22

It certainly is not as bad as you say it is, Lennyx. I came from a GTX 580, and max fan on the GTX 580 is probably comparable in noise to the Uber BIOS at around 55-60% fan speed on the 290X. If you push it further it is an extremely loud card, yes, but running at stock it's pretty much as loud as any other reference GPU I have had. Like mentioned many times before, if you game with headphones or music going then you will not notice. It also is a reference card and should be W/C imo..


----------



## FloRin050

Quote:


> Originally Posted by *Lennyx*
> 
> It is that bad it realy is. Its a great performance/price card.
> You can get closed heaphones btw. Then the card wont bother you.


Well, I have a G930 which I use almost always when gaming, but I've got a 7.1 set too.
So I'm not always having headphones on.
And I might take it to a LAN party sometimes, and I don't want it to make a ton of noise and annoy people.
+ I am buying quieter fans for my H100i, which I actually don't need if I put a jet engine next to them.









----------



## FloRin050

Quote:


> Originally Posted by *selk22*
> 
> It certainly is not as bad as you say it is Lennyx.. I came from a GTX 580 and the max fan on the GTX 580 is probably comparable in noise to the Uber bios to around 55-60% fan speed on the 290x. If you push it further it is an extremely loud card yes but if running at stock its pretty much as loud as any other reference GPU I have had. But like mentioned many times before if you game with headphones or music going then you will not notice. It also is a reference card and should be W/C imo..


Oh, good to hear that.
Also, I've got a closed case (I mean, going to buy one* xD). Would it be too bad?

Sorry for double posting, btw.


----------



## centvalny

Testing Powercolor 290 stock bios air

1600 memory default voltage



http://imgur.com/bywI1y0


CPU @ 4500



http://imgur.com/5vDqRcp


----------



## Sgt Bilko

Quote:


> Originally Posted by *FloRin050*
> 
> OH, good to hear that.
> Also i've got a closed case( i mean going to buy* xD ) . Would it be too bad?
> 
> Soz for double posting btw.


You can always edit your previous post and then hit quote, and it will become part of your edited post, if that makes it easier for you.


----------



## selk22

Quote:


> Originally Posted by *FloRin050*
> 
> OH, good to hear that.
> Also i've got a closed case( i mean going to buy* xD ) . Would it be too bad?
> 
> Soz for double posting btw.


I have a HAF 932 with a window mod, so it's a relatively closed-up case. That being said, under gaming load I have my fan set to 75% because I OC and I want 0 thermal throttling. But if you are okay with the card running at stock settings, which do cause thermal throttling unless you set a custom fan profile to around 50-60%, then I think you would be fine. It is a loud card, yes, but like I said it really was not much louder than my reference GTX 580 at full load. Once you start pushing that fan profile, though, this 290X gets extremely loud, and I would personally recommend waiting a few weeks for news of the non-reference coolers if you're going to keep it on air.


----------



## Lennyx

Quote:


> Originally Posted by *selk22*
> 
> It certainly is not as bad as you say it is Lennyx.. I came from a GTX 580 and the max fan on the GTX 580 is probably comparable in noise to the Uber bios to around 55-60% fan speed on the 290x. If you push it further it is an extremely loud card yes but if running at stock its pretty much as loud as any other reference GPU I have had. But like mentioned many times before if you game with headphones or music going then you will not notice. It also is a reference card and should be W/C imo..


My ears are maybe just more sensitive than most others'. But yeah, I would never recommend the reference 290 cards to any of my friends if they did not go for another cooling solution on the card.


----------



## rdr09

Quote:


> Originally Posted by *centvalny*
> 
> Testing Powercolor 290 stock bios air
> 
> 1600 memory default voltage
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://imgur.com/bywI1y0
> 
> 
> 
> 
> 
> CPU @ 4500
> 
> 
> 
> http://imgur.com/5vDqRcp


Core at only 1087 and you got 17000+ in graphics? I may have to try that and up my mem. We have the same memory - Elpida.


----------



## centvalny

Yup, Elpida seems OK.

Flashed to the ASUS BIOS and no 290X. Lol

Here's with 1650 mem, still stock air



http://imgur.com/j9jtidY


----------



## TomiKazi

As far as I can tell, Afterburner and CCC (overdrive enabled) seem to be working together without significant downclocking here, at least until now. The only downclocking I notice is when changing a scene or a tiny spike downwards, but I think that's normal.

A quick noob question: Is it correct that gpu voltage only changes when you actually alter the voltage in a program like AB? Just to be sure that it's not raising anything automatically.

Found this to be pretty funny:


----------



## utnorris

Quote:


> Originally Posted by *Raxus*
> 
> You got pretty lucky, see a lot of people with the sapphire 290x cards taking a dump. Although according to Warsam, its a driver thing.


I have two Sapphire 290s in CF with water blocks, and once I did a reinstall and shut down the auto updates for Windows, I haven't had a single issue. The WHQL driver seems to be better than the latest beta driver too. I get all the frustration, but I think it is a combination of driver conflicts and possibly even an IRQ setting (for CF) that may be causing a lot of the issues here. However, that being said, if you want something that has had all the quirks worked out, go with a GTX 780: not the Ti, but the regular GTX 780. Performance is close enough for 99% of the people out there, and they have had 6 months to work on the driver issues. It's funny, I remember when the Titan came out and everyone was having issues until Skynet created a custom BIOS that allowed people to use the card like they thought it should be, and then all the complaining seemed to stop. My point is, these cards have a lot of potential, but it's early and there are going to be some growing pains with them; most early adopters should realize this.


----------



## Maxxa

Is it unusual to lose your CPU and RAM OC when switching cards/brands? I thought after downclocking the RAM a bit I was fine, but then while playing BF4 my PC crashed, which had never happened since BF4 came out on the same CPU OC but with a heavily OC'd 480.


----------



## flopper

Quote:


> Originally Posted by *FloRin050*
> 
> And when playing BF4 with a stock 290, and default fan speeds, is it so bad?
> i'd like to buy a normal 290, but i'm worried i will regret it.


No issue with fan noise for me.
The 8800GT I used in between cards was worse, and my 7970 was worse.

Quote:


> Originally Posted by *grandpatzer*
> 
> Is that 1.42v in software or after Vdroop?
> 
> I'm watercooling my R9 290, still not sure If I have the balls to run it 24/7 at 1.42v, played some games at 1.25v with my 7950


24/7?
I just run the voltage when I game.
I mean, one push of a button and I am maxing it out for the game.
The desktop doesn't need it.









Personally, my 290 is a great card.
It's hot, sure, and can be noisy for many I'm sure, but I reason the same way I do with my CPU. I replaced my CPU's default Intel stock cooler; some call me insane because it's so good, it's Intel-made, right?
I slapped a block on that sucker of a CPU.
I'll do the same with my 290 soonish.


----------



## Spectre-

Quote:


> Originally Posted by *flopper*
> 
> No issue with fan noise for me.
> The 8800GT I used in between cards was worse, and my 7970 was worse.
> 24/7?
> I just run the voltage when I game.
> I mean, one push of a button and I am maxing it out for the game.
> The desktop doesn't need it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Personally, my 290 is a great card.
> It's hot, sure, and can be noisy for many I'm sure, but I reason the same way I do with my CPU. I replaced my CPU's default Intel stock cooler; some call me insane because it's so good, it's Intel-made, right?
> I slapped a block on that sucker of a CPU.
> I'll do the same with my 290 soonish.


Finally, another man who understands what true hot and loud cards are.

My 8800GTX topped 100 degrees and was loud as hell.

And don't get me started on the 4870X2.


----------



## rass101

I got my MSI R9 290 build running a day ago and have to say I was expecting a much louder card based on the comments. I run the 13.11 beta version 9.2 driver. Temperatures rise up to 95°C at maximum load, but it has not caused any problems so far.

I also found that I have an Accelero Xtreme Plus II cooler that I did not know I had. Can it be used with R9 290 cards?


----------



## flopper

Quote:


> Originally Posted by *Spectre-*
> 
> Finally, another man who understands what true hot and loud cards are.
> 
> My 8800GTX topped 100 degrees and was loud as hell.
> 
> And don't get me started on the 4870X2.


Got to be an Aussie or such to understand, I guess.









Personally, any card pushing a lot of air is going to be loud.
AMD, however, needs to hire someone who understands how the customer thinks and reacts, as many just look at heat and reviews.
They are not that good with public image.

My reference 290 is better than my reference 7970 at the same 80% fan speed.


----------



## Gilgam3sh

Here is a video of the Arctic Accelero Xtreme III at 100% on my 290X. Just a note: the phone camera makes the noise sound louder than it is IRL.


----------



## skupples

Quote:


> Originally Posted by *jomama22*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> He was asking for something that was clearly prohibited on the sale page. Just because someone was polite enough to overlook that for someone else doesn't mean it's the rule; if anything, it is the exception.
> 
> If I were Newegg I would do the same thing. It's like ordering a well-done grilled steak and then complaining that it's too dry - what do you expect?
> 
> No one forced anyone to buy through Newegg. We all know their lame return policy on chips/gfx cards. But that isn't an excuse to be upset because they upheld their own rule.
> 
> 
> If you couldn't tell, I'm in the service industry lol.


That's fine and dandy; my list of foul experiences with them is pretty long. The most recent was a 15% off coupon on a 1TB EVO SSD. 15% of $600 = $90... Well, the checkout page showed the discount and went through with the discount, but when the receipt came to my email it was for the full value. When I contacted them they gave me some technobabble about the 15% coupon reaching its limit mid-queue, and offered me $5. Literally, $5. Not to mention the guy in the warehouse who fat-thumbs CPU pins.
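For what it's worth, the coupon math in that story is easy to sanity-check. The $600 price and 15% rate below are just the figures from the post, not anything official:

```python
# Figures from the post above: a 15% coupon on a $600 1TB SSD.
# Working in cents avoids float rounding surprises.
price_cents = 600 * 100
discount_cents = price_cents * 15 // 100

print(discount_cents / 100)                   # 90.0  -> the advertised credit
print((price_cents - discount_cents) / 100)   # 510.0 -> what should have been billed
```

So being billed the full $600 and offered $5 back leaves an $85 gap versus the advertised deal.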


----------



## grandpatzer

Quote:


> Originally Posted by *flopper*
> 
> No issue with fan noise for me.
> The 8800GT I used in between cards was worse, and my 7970 was worse.
> 24/7?
> I just run the voltage when I game.
> I mean, one push of a button and I am maxing it out for the game.
> The desktop doesn't need it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Personally, my 290 is a great card.
> It's hot, sure, and can be noisy for many I'm sure, but I reason the same way I do with my CPU. I replaced my CPU's default Intel stock cooler; some call me insane because it's so good, it's Intel-made, right?
> I slapped a block on that sucker of a CPU.
> I'll do the same with my 290 soonish.


When I wrote 1.25v 24/7, I meant I played a couple of games at 1.25v; that was the maximum comfortable voltage for me with the 7950.
I don't run 1.25v in Windows.









Looks like the 290 is able to handle more volts, I think 1.30v is OK, maybe even 1.35v (?)


----------



## hatlesschimp

ASIC Quality %
Gigabyte = 72.7
Powercolor = 71.1

*WHAT THE .......*
Is this normal between the two cards in CrossFire? The Gigabyte is in the top slot and the Powercolor is in the 3rd slot as per the motherboard manual. The readings with the yellow markings are from the Powercolor in the 5th PCIe slot (3rd double slot).


----------



## Raxus

Quote:


> Originally Posted by *skupples*
> 
> That's fine and dandy; my list of foul experiences with them is pretty long. The most recent was a 15% off coupon on a 1TB EVO SSD. 15% of $600 = $90... Well, the checkout page showed the discount and went through with the discount, but when the receipt came to my email it was for the full value. When I contacted them they gave me some technobabble about the 15% coupon reaching its limit mid-queue, and offered me $5. Literally, $5. Not to mention the guy in the warehouse who fat-thumbs CPU pins.


I have always gotten great customer service from newegg. Granted I never expect to get something that they explicitly say they don't provide.

Never once had a problem with them, just my experience though.


----------



## Scorpion49

Quote:


> Originally Posted by *hatlesschimp*
> 
> ASIC Quality %
> Gigabyte = 72.7
> Powercolor = 71.1
> 
> *WHAT THE .......*
> Is this normal between the two cards in CrossFire? The Gigabyte is in the top slot and the Powercolor is in the 3rd slot as per the motherboard manual. The readings with the yellow markings are from the Powercolor in the 5th PCIe slot (3rd double slot).


Disable ULPS if you don't want the second card to shut off for power savings.
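For anyone who goes hunting for that setting: the commonly circulated way to disable ULPS outside of a tweak utility is a registry flip. The sketch below is the usual .reg approach; the `0000`/`0001` instance numbers are examples only and vary per system, so enumerate the subkeys under the display class key in regedit and apply it to each AMD adapter entry (MSI Afterburner also exposes a ULPS toggle in its settings). Reboot afterwards; set the value back to 1 to re-enable.

```reg
Windows Registry Editor Version 5.00

; Display-adapter class key; the 0000/0001 instances below are examples --
; check which entries belong to your AMD cards before applying.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0001]
"EnableUlps"=dword:00000000
```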


----------



## Raxus

Quote:


> Originally Posted by *flopper*
> 
> No issue with fan noise for me.
> The 8800GT I used in between cards was worse, and my 7970 was worse.
> 24/7?
> I just run the voltage when I game.
> I mean, one push of a button and I am maxing it out for the game.
> The desktop doesn't need it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Personally, my 290 is a great card.
> It's hot, sure, and can be noisy for many I'm sure, but I reason the same way I do with my CPU. I replaced my CPU's default Intel stock cooler; some call me insane because it's so good, it's Intel-made, right?
> I slapped a block on that sucker of a CPU.
> I'll do the same with my 290 soonish.


What i think is funny is that my Sapphire OC w boost was far louder than my 290x and my 7970 reference. That aftermarket cooler did not solve the sound problem for sure.


----------



## CurrentlyPissed

Quote:


> Originally Posted by *Raxus*
> 
> I have always gotten great customer service from newegg. Granted I never expect to get something that they explicitly say they don't provide.
> 
> Never once had a problem with them, just my experience though.


Same. When I purchased two 7950s at the Egg and two weeks later they went from $550/ea to $450/ea, they gave me a $100 GC.

When I purchased my two 290s and the BF4 bundle was then announced, they gave me $25 x 2 (one per card) in GCs.

Unfortunately I was plagued with horrible black screen issues, so I RMA'd, and they paid to ship the cards back to them. I originally purchased 2 Sapphires. I asked if I could switch from Sapphire to XFX; they said they don't allow it but would ask a supervisor. He approved it, so I got two XFX instead, and they paid for overnight shipping back to me.


----------



## conwa

What is the normal downclock for an R9 290 at idle?

My XFX 290 has a constant clock of 947MHz, even when I browse in Windows.

Is that normal?


----------



## SonDa5

I'm thinking about returning my 290X. I bought the Sapphire brand, and at the time I ordered it the Sapphire website stated that the card supported TRIXX voltage control. Got the card, and even the box states the same thing.

So I'm getting close to my 30-day return window. I like the card, but I'm not satisfied with the lack of TRIXX voltage control as promised.

Am I overreacting? Should I be patient and wait?

If I return it, I can wait a few weeks and possibly get one with an improved PCB design.

I have 2 days left to return this card.


----------



## Stay Puft

Quote:


> Originally Posted by *SonDa5*
> 
> I'm thinking about returning my 290X. I bought the Sapphire brand, and at the time I ordered it the Sapphire website stated that the card supported TRIXX voltage control. Got the card, and even the box states the same thing.
> 
> So I'm getting close to my 30-day return window. I like the card, but I'm not satisfied with the lack of TRIXX voltage control as promised.
> 
> Am I overreacting? Should I be patient and wait?
> 
> If I return it, I can wait a few weeks and possibly get one with an improved PCB design.
> 
> I have 2 days left to return this card.


Afterburner supports voltage control, but that's not going to matter anyway if you're still on reference cooling.


----------



## evensen007

Quote:


> Originally Posted by *SonDa5*
> 
> I'm thinking about returning my 290X. I bought the Sapphire brand, and at the time I ordered it the Sapphire website stated that the card supported TRIXX voltage control. Got the card, and even the box states the same thing.
> 
> So I'm getting close to my 30-day return window. I like the card, but I'm not satisfied with the lack of TRIXX voltage control as promised.
> 
> Am I overreacting? Should I be patient and wait?
> 
> If I return it, I can wait a few weeks and possibly get one with an improved PCB design.
> 
> I have 2 days left to return this card.


Personally, I would keep it since you're not having any issues with it. I would flash the Asus 290X BIOS on it, which is really easy. Then you could use Asus GPU Tweak to do the same thing as Trixx until your card truly is supported by it. Then you can flash back to Sapphire and use Trixx.


----------



## SonDa5

I have flashed to the Asus BIOS and run GPU Tweak for voltage control. I have also water-cooled the card. The problem is that Sapphire describes the card as supporting TRIXX, but Sapphire doesn't update TRIXX to support it.

Probably going to return it. I hate buying into stupid stuff like this.


----------



## Jpmboy

Quote:


> Originally Posted by *evensen007*
> 
> Personally, I would keep it since you're not having any issues with it. I would flash the Asus 290X BIOS on it, which is really easy. Then you could use Asus GPU Tweak to do the same thing as Trixx until your card truly is supported by it. Then you can flash back to Sapphire and use Trixx.


This^^^. Flash over the quiet BIOS, since all it does is drop anchor right when the card is just starting to breathe.


----------



## Stay Puft

Quote:


> Originally Posted by *SonDa5*
> 
> I have flashed to the Asus BIOS and run GPU Tweak for voltage control. I have also water-cooled the card. The problem is that Sapphire describes the card as supporting TRIXX, but Sapphire doesn't update TRIXX to support it.
> 
> Probably going to return it. I hate buying into stupid stuff like this.


So you have voltage control with other software but want to return it because Trixx doesn't support it? That's the stupidest thing I have ever heard, and I've heard some doozies in my time on this forum.


----------



## sugarhell

New trixx will be on beta soon.


----------



## SonDa5

Quote:


> Originally Posted by *Stay Puft*
> 
> So you have voltage control with other software but want to return it because Trixx doesn't support it? That's the stupidest thing I have ever heard, and I've heard some doozies in my time on this forum.


Not stupid at all.

Sapphire markets the card as supporting TRIXX, and one of the main reasons I bought the Sapphire brand is that I like to use TRIXX voltage control.

On my Sapphire 950MHz Edition HD7950, Asus GPU Tweak sucked for overclocking in comparison to Sapphire TRIXX.

I'm only using Asus GPU Tweak to provide some overclocking fun while I patiently wait for Sapphire to finish the job with my card.

I only have 2 days left to return the card, and I feel like I have been patient waiting for TRIXX to support it.


----------



## maarten12100

Quote:


> Originally Posted by *Stay Puft*
> 
> So you have voltage control with other software but want to return it because Trixx doesn't support it? That's the stupidest thing I have ever heard, and I've heard some doozies in my time on this forum.


He was promised voltage control in Trixx and it was falsely advertised, so that is a valid argument already.
Even though there is an easy fix until Trixx gets updated.


----------



## Raxus

Quote:


> Originally Posted by *SonDa5*
> 
> I'm thinking about returning my 290X. I bought the Sapphire brand, and at the time I ordered it the Sapphire website stated that the card supported TRIXX voltage control. Got the card, and even the box states the same thing.
> 
> So I'm getting close to my 30-day return window. I like the card, but I'm not satisfied with the lack of TRIXX voltage control as promised.
> 
> Am I overreacting? Should I be patient and wait?
> 
> If I return it, I can wait a few weeks and possibly get one with an improved PCB design.
> 
> I have 2 days left to return this card.


As long as you're not having black screens I would keep it; even then, Warsam seems to think it's a driver issue anyway. Trixx will come out eventually; there are also alternatives atm.


----------



## skupples

Quote:


> Originally Posted by *hatlesschimp*
> 
> I got another 290X today. That makes it a Gigabyte & Powercolor
> 
> 
> Spoiler: Warning: Spoiler!


Awwww, sooo cute... I had a dwarf rabbit when I was a kid. My Jack Russell used to looove playing with it.


----------



## Stay Puft

Quote:


> Originally Posted by *maarten12100*
> 
> He was promised voltage control in Trixx and it was falsely advertised, so that is a valid argument already.
> Even though there is an easy fix until Trixx gets updated.


No, it's really not, because TRIXX is inferior to Afterburner. TRIXX is down the list next to Asus GPU Tweak:

Afterburner
Precision
Trixx
GPU Tweak


----------



## SonDa5

Quote:


> Originally Posted by *sugarhell*
> 
> New trixx will be on beta soon.


I can wait 2 more days but if it doesn't materialize I'm going to return the card.


----------



## maarten12100

Quote:


> Originally Posted by *Stay Puft*
> 
> No, it's really not, because TRIXX is inferior to Afterburner. TRIXX is down the list next to Asus GPU Tweak:
> 
> Afterburner
> Precision
> Trixx
> GPU Tweak


I meant that false advertising is a legit reason to return a product.


----------



## brazilianloser

Well, so the OC starts... little baby steps, since I am still on air with reference cooling. But a simple step like increasing the core clock to 1GHz without altering any voltage has given me an extra 274 points in Firestrike. And the highest temp is still only 76°C.


----------



## Raxus

Quote:


> Originally Posted by *Stay Puft*
> 
> No, it's really not, because TRIXX is inferior to Afterburner. TRIXX is down the list next to Asus GPU Tweak:
> 
> Afterburner
> Precision
> Trixx
> GPU Tweak


Opinions are like A holes. And his is that Trixx is superior.


----------



## SonDa5

Quote:


> Originally Posted by *maarten12100*
> 
> He was promised voltage control in Trixx and it was falsely advertised, so that is a valid argument already.


That sums up the situation.

I won't have to pay a restocking fee because the product is incomplete and not as promised. I can return the card and get a 290 and then flash it to a 290X, or I can wait for an AIB card and get a possibly better PCB.

Seems like the best choice is to return it.


----------



## Raxus

Quote:


> Originally Posted by *SonDa5*
> 
> That sums up the situation.
> 
> I won't have to pay a restocking fee because the product is incomplete and not as promised. I can return the card and get a 290 and then flash it to a 290X, or I can wait for an AIB card and get a possibly better PCB.
> 
> Seems like the best choice is to return it.


If you're not happy, return it and get something else.

Doesn't need to be validated by any of us.


----------



## brazilianloser

Well upping the Memory clock for me is actually reducing my scores in Firestrike









Guess I will just keep upping the Core to see where my limit is without voltage increase.


----------



## hatlesschimp

Quote:


> Originally Posted by *skupples*
> 
> Awwww, sooo cute... I had a dwarf rabbit when I was a kid. My Jack Russell used to looove playing with it.


LOL

The Jack Russell loves chasing and playing with the rabbit, but I have to be careful because they can get rough when playing.


----------



## Ukkooh

Is my card even worth putting under water, as I get artifacts at [email protected] voltage? Would I still be able to hit 1200 with 1.3-1.35V?
ASIC quality is 73.9%. It might be temp sensitivity though, as the artifacting begins at ~80°C.


----------



## Stay Puft

Quote:


> Originally Posted by *Ukkooh*
> 
> Is my card even worth putting under water, as I get artifacts at [email protected] voltage? Would I still be able to hit 1200 with 1.3-1.35V?
> ASIC quality is 73.9%. It might be temp sensitivity though, as the artifacting begins at ~80°C.


What are your VRM temps hitting?


----------



## Ukkooh

Quote:


> Originally Posted by *Stay Puft*
> 
> What are your VRM temps hitting?


59 and 81 max in a light game (Smite), so I guess they'll hit well over 90°C in Valley. I'll run it in a minute and report.

Edit: Actually, VRM temps were lower in Valley. At 100% fan speed the core still hit 82°C and kept artifacting.


----------



## jerrolds

Quote:


> Originally Posted by *Raxus*
> 
> As long as you're not having black screens I would keep it; even then, Warsam seems to think it's a driver issue anyway. Trixx will come out eventually; there are also alternatives atm.


Agreed - if Trixx is your only problem, then why risk another card? If it overclocks decently and doesn't black screen, you're ahead of the game. Unless of course his ulterior motive is to return the 290X to unlock a Powercolor/XFX 290...

which I wish I could do


----------



## jerrolds

Quote:


> Originally Posted by *Ukkooh*
> 
> Is my card even worth putting under water as I get artifacts at [email protected] voltage? Would I still be able to hit 1200 with 1.3-1.35V?
> Asic quality is 73.9%. It might be temp sensitivity though as the artifacting begins at ~80°C.


You're hitting [email protected] stock volts under water? Seems like the GPU isn't seated properly or whatever; it has to be under a block.







With the Gelid I hit 65°C max at 1220MHz core and 1.3v actual. My ASIC is 75%, not the best... but I guess it could be worse.


----------



## Ukkooh

Quote:


> Originally Posted by *jerrolds*
> 
> You're hitting [email protected] stock volts under water? Seems like the GPU isn't seated properly or whatever; it has to be under a block.
> 
> 
> 
> 
> 
> 
> 
> With the Gelid I hit 65°C max at 1220MHz core and 1.3v actual. My ASIC is 75%, not the best... but I guess it could be worse.


Stock cooler.


----------



## Euda

It's awful that the release of the new fix drivers was delayed. Going to visit a friend's LAN party in 3 hours, and I'm looking forward to getting blackscreened out of my games every hour or so








Thank you, AMD.


----------



## jerrolds

Quote:


> Originally Posted by *Ukkooh*
> 
> Stock cooler.


Oh sorry, misread that - you're asking if it's worth putting under water if you artifact at [email protected] Ummm, when I had my stock cooler I tested "potential" overclocks by raising the fan speed to 100% and the power limit to +50%.

If I remember correctly, I think I hit 1110MHz on the core. After flashing to the Asus BIOS and cranking the voltage to the max, 1.412v (~1.3v actual), I hit 1180MHz.

With the Gelid I was able to hit 1225MHz before artifacts started happening (GPU < 65°C, VRM < 85°C), so getting a better cooler does help. I think under water this card can maybe hit 1240, 1250 if I'm lucky, because VRM cooling would be much better. But I also wouldn't be surprised if I had to use a PT1 BIOS, where voltages start to get silly.


----------



## Raxus

Quote:


> Originally Posted by *jerrolds*
> 
> Agreed - if Trixx is your only problem then why risk another card. If it overclocks decently, doesnt black screen your ahead of the game. Unless of course his ulterior motive is to return the 290x to unlock a powercolor/xfx 290..
> 
> which i wish i could do


I had the opportunity to do just that, but the chance of getting one that didn't unlock wasn't worth it to me.

And we have to assume there's some reason the 290Xs are a little more expensive, whether it be better chips or something else.


----------



## Stay Puft

I'm looking at the Powercolor one right now on Newegg, but I'm kinda thinking about just buying the Gigabyte. Unlocking doesn't mean that much to me. It's only 10% more shaders.
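That 10% figure lines up with the published shader counts (2,560 stream processors on the 290 versus 2,816 on the full 290X); a quick check:

```python
# AMD's published stream processor counts for Hawaii.
r9_290 = 2560
r9_290x = 2816

extra = (r9_290x - r9_290) / r9_290
print(f"{extra:.0%}")  # 10%
```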


----------



## Raxus

Quote:


> Originally Posted by *Stay Puft*
> 
> I'm looking at the Powercolor one right now on Newegg, but I'm kinda thinking about just buying the Gigabyte. Unlocking doesn't mean that much to me. It's only 10% more shaders.


If you're going to buy a 290, might as well get one that has the possibility of unlocking, no? If it doesn't, no big deal, right?


----------



## jerrolds

Quote:


> Originally Posted by *Raxus*
> 
> I had the opportunity to do just that, but the chance of getting one that didn't unlock wasn't worth it to me.
> 
> And we have to assume there's some reason the 290Xs are a little more expensive, whether it be better chips or something else.


I think it's worth it if I can get most of my money back; for $150 less you get a potential 290X and a free copy of BF4, which you can sell.

I am still pretty pissed that day-1 290X customers got shafted on BF4. The only thing keeping me from totally flipping is that I am OK with my card's overclock. Hmm, maybe I should appraise it and see what I can get, since it's "guaranteed" to OC over 1200MHz.


----------



## brazilianloser

http://www.3dmark.com/fs/1189380

Asus 290 @ 1050/1200 (no voltage increase)
Highest temp 80c

Increase of 734 points in Firestrike, about a 5 fps increase in both graphics tests.

Not bad for stock air cooling without putting the fan in hurricane mode.


----------



## Raxus

Quote:


> Originally Posted by *jerrolds*
> 
> I think it's worth it if I can get most of my money back; for $150 less you get a potential 290X and a free copy of BF4, which you can sell.
> 
> I am still pretty pissed that day-1 290X customers got shafted on BF4. The only thing keeping me from totally flipping is that I am OK with my card's overclock. Hmm, maybe I should appraise it and see what I can get, since it's "guaranteed" to OC over 1200MHz.


Maybe it's me, but I just don't know that an unlocked 290 is equivalent to a 290X in every way, and I didn't want to risk it.

How did day-1 290X customers get shafted on BF4?


----------



## jomama22

Quote:


> Originally Posted by *jerrolds*
> 
> I think it's worth it if I can get most of my money back; for $150 less you get a potential 290X and a free copy of BF4, which you can sell.
> 
> I am still pretty pissed that day-1 290X customers got shafted on BF4. The only thing keeping me from totally flipping is that I am OK with my card's overclock. Hmm, maybe I should appraise it and see what I can get, since it's "guaranteed" to OC over 1200MHz.


This is for everyone: only marked 290/290X boxes receive BF4. If it doesn't say "includes BF4" on the sale page, you aren't getting it.

So early adopters didn't really get screwed, as every "BF4 included" card costs more than the vanilla version. All 290Xs with BF4 are $579, the same price day-1 people paid.


----------



## the9quad

Quote:


> Originally Posted by *jomama22*
> 
> This is for everyone: only marked 290/290X boxes receive BF4. If it doesn't say "includes BF4" on the sale page, you aren't getting it.
> 
> So early adopters didn't really get screwed, as every "BF4 included" card costs more than the vanilla version. All 290Xs with BF4 are $579, the same price day-1 people paid.


And as far as a Never Settle bundle goes, definitely check with the retailer/etailer, because apparently it's they who offer it, not the card manufacturer. In short: BF4 is marked on the box; Never Settle is etailer-specific.


----------



## hatlesschimp

Well, I sorted out my issues with installing the second card today, and I must say I'm happy. I'm running stock settings and gaming has been fine. Mind you, I've only played for 1 hour total. Noise doesn't bother me because I poked a hole in the wall and have my rig in the spare room. It also makes for a decent spy hole for when people stay over. LOL


----------



## Kriant

Hello fellow members of this great community!

Now that I've finally installed all 3 of the cards, put the EK blocks on, and filled up my loop (or rather added some coolant; thanks, quick disconnects, for cards only







), and ran it for 5h, I have a few questions, but this one goes first:
Do I want to disable ULPS for stability just like with 7970s?


----------



## sugarhell

Trixx is working with the 290X. You just can't overvolt. So it's not false advertising.

Also, I prefer Trixx over MSI AB for 7970s.


----------



## tsm106

Quote:


> Originally Posted by *Raxus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Stay Puft*
> 
> No, it's really not, because TRIXX is inferior to Afterburner. TRIXX is down the list next to Asus GPU Tweak:
> 
> Afterburner
> Precision
> Trixx
> GPU Tweak
> 
> 
> 
> Opinions are like A holes. And his is that Trixx is superior.











Ask me which app I used to oc my 7970s? These 7970s are still averaging at least top 30 in the Titan/780 crowded HoF.

Quote:


> Originally Posted by *Kriant*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> O-rings flopping around? Grab the vaseline, go oldschool.
> 
> 
> 
> I thought of that, actually. But (and here goes a bad old joke): "I've used it up on OTHER things." In all reality though, I didn't have anything handy in my apt that would work, so I've used my Swiftech connectors; now testing for leaks.
> 
> But it's a PITA, and I probably won't use that tri-SLI connector till my next build or something. There's no real advantage of the EK connector block over separate connectors, right?

It looks to have much better flow than standard SLI links. The ek block is very thin and flat. The terminal block is designed with this in mind and it is consistent with their minimal design. Essentially there are less points where the flow changes from their wide flat channel to round. And then there is the advantage of being able to grab the array by the terminal, makes handling lots of cards a breeze. Don't forget to buy some qdcs too.


----------



## jerrolds

Quote:


> Originally Posted by *jomama22*
> 
> This is for everyone: only marked 290/290X boxes receive BF4. If it doesn't say "includes BF4" on the sale page, you aren't getting it.
> 
> So early adopters didn't really get screwed, as every "BF4 included" card costs more than the vanilla version. All 290Xs with BF4 are $579, the same price day-1 people paid.


It irks me that buyers of the R9 270X, a card that is a third of the price, get BF4 for free while 290X day-1 buyers (not even 2 weeks prior) did not. We should at least get offered a coupon or something, imo. Especially since the only reason is contract/timing, apparently.

I don't see why we can't retroactively get coupons for _something_.


----------



## Kriant

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ask me which app I used to oc my 7970s? These 7970s are still averaging at least top 30 in the Titan/780 crowded HoF.
> It looks to have much better flow than standard SLI links. The ek block is very thin and flat. The terminal block is designed with this in mind and it is consistent with their minimal design. Essentially there are less points where the flow changes from their wide flat channel to round. And then there is the advantage of being able to grab the array by the terminal, makes handling lots of cards a breeze. Don't forget to buy some qdcs too.


Did that a loong time ago; QDCs are a must, made swapping cards so much easier.

I've ended up running regular SLI links for now. I'll probably attempt to fit that terminal block sometime in December, with all the vaseline and lube I can muster


----------



## jerrolds

Quote:


> Originally Posted by *the9quad*
> 
> And as far as a Never Settle bundle goes, definitely check with the retailer/etailer, because apparently it's they who offer it, not the card manufacturer. In short: BF4 is marked on the box; Never Settle is etailer-specific.


Etailer, so... I should try Newegg? I know one user was able to get 2 x $25 gift certs, but I feel that's an exception.


----------



## iGameInverted

Ordering my 290X today, and I have been trying to figure out which brand to buy. I don't want to get the 290 and try to flash it and test my luck in the lottery. I do plan to watercool and overclock.

Anyone have some solid advice on this? Thank you


----------



## anarekist

hi newest member to join

http://www.techpowerup.com/gpuz/6mnke/

Sapphire AMD R9 290, reference

stock cooler.


----------



## Raxus

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ask me which app I used to oc my 7970s? These 7970s are still averaging at least top 30 in the Titan/780 crowded HoF.
> It looks to have much better flow than standard SLI links. The ek block is very thin and flat. The terminal block is designed with this in mind and it is consistent with their minimal design. Essentially there are less points where the flow changes from their wide flat channel to round. And then there is the advantage of being able to grab the array by the terminal, makes handling lots of cards a breeze. Don't forget to buy some qdcs too.


He was basically giving him crap for preferring trixx over AB and returning a card because of it.

Never heard the expression opinions are like a holes?


----------



## hatlesschimp

Quote:


> Originally Posted by *iGameInverted*
> 
> Ordering my 290X today, and I have been trying to figure out which brand to buy. I don't want to get the 290 and try to flash it and test my luck in the lottery. I do plan to watercool and overclock.
> 
> Anyone have some solid advice on this? Thank you


I have a Powercolor, and they have a good warranty when it comes to watercooling. They basically don't care as long as it looks normal when they get it back. I'm unsure about the Gigabyte that I bought today, though, but neither has stickers over any screws, so that's a good sign.


----------



## Raxus

Quote:


> Originally Posted by *jerrolds*
> 
> It irks me that buyers of the R9 270X, a card a third of the price, get BF4 for free while 290X day-1 buyers (not even 2 weeks prior) did not. We should at least get offered a coupon or something, IMO. Especially since the only reason is apparently contract/timing.
> 
> I don't see why we can't retroactively get coupons for _something_.


290X day-1 buyers DID get BF4; it came in the box. The 290 might be a different story.


----------



## Epsi

Bleh, my ASIC is 66.9%. That's freaking low, ain't it? At stock volts it's starting to artifact at 1075 core.

Sent from my HTC One using Tapatalk


----------



## hatlesschimp

Not sure if I posted this but my ASIC are:

Gigabyte R9 290X = *72.7%*
Powercolor R9 290X = *71.1%*


----------



## Ghostpilot

XFX R9 290 = 76.9%


----------



## iGameInverted

Quote:


> Originally Posted by *hatlesschimp*
> 
> I have a PowerColor, and they have a good warranty when it comes to watercooling. They basically don't care as long as it looks normal when they get it back. I'm unsure about the Gigabyte that I bought today, though, but neither has stickers over any screws, so that's a good sign.


Thanks, I am trying to get one around $550 that doesn't come with BF4, so PowerColor isn't an option.

Does anyone have any experience with HIS 290x? Specifically in regards to overclocking.


----------



## Arizonian

Quote:


> Originally Posted by *anarekist*
> 
> hi newest member to join
> 
> http://www.techpowerup.com/gpuz/6mnke/
> 
> Sapphire AMD R9 290, reference
> 
> stock cooler.


Congrats - added


----------



## dezerteagle323

Quote:


> Originally Posted by *dezerteagle323*
> 
> Can anyone speculate what the prices of the aftermarket-cooled R9 290s will be? I'm guessing they won't be around $400 like the reference cards.
> 
> I'm wondering because if AIB cards tend to be a lot more, then I'm just going to grab a reference one today. Help me decide please!


bump
?


----------



## hatlesschimp

Quote:


> Originally Posted by *iGameInverted*
> 
> Thanks, I am trying to get one around $550 that doesn't come with BF4, so PowerColor isn't an option.
> 
> Does anyone have any experience with HIS 290x? Specifically in regards to overclocking.


What about the PowerColor 290 (non-X version)? Apparently there has been success unlocking this card into the full 290X, and it's cheaper than the X.

http://www.overclock.net/t/1442692/vc-powercolor-radeon-r9-290-unlocked-into-r9-290x


----------



## CriticalHit

My 2 HIS R9 290s are ASIC 70.5% and 70.7%.

Guess it's a good thing they're going under water.


----------



## jerrolds

Quote:


> Originally Posted by *Raxus*
> 
> 290X day-1 buyers DID get BF4; it came in the box. The 290 might be a different story.


Yes, but we paid extra. Yes, less than retail, but $549 without BF4 and $579 for the BF4 edition. And IIRC, it does not include any future DLCs/expansions.


----------



## Kriant

ASIC:
Card 1: 65.8
Card 2: 69.6
Card 3: 72.7


----------



## TheSoldiet

ASIC: 290X 76.9


----------



## Arizonian

Quote:


> Originally Posted by *dezerteagle323*
> 
> Can anyone speculate what the prices of the aftermarket-cooled R9 290s will be? I'm guessing they won't be around $400 like the reference cards.
> 
> I'm wondering because if AIB cards tend to be a lot more, then I'm just going to grab a reference one today. Help me decide please!


If you're comparing a reference card with a reference blower, like the Gigabyte, to a reference card with aftermarket cooling, like a Windforce, then the price difference can be approximately $30-$40 more.

If you're asking about reference cards vs. non-reference designs, aka extra power phases, thicker PCB, etc., then it can vary, and it's more than just extra cooling, like your Lightnings and Matrix.

I've been waiting so long I'm going to bust, but a few AIBs have said no later than the 2nd week of December. So anywhere from next week to then we should see something. The difference will be worth the wait IMO if you're not putting water blocks on it. I'm leaning toward the ASUS DCUII but hope to last long enough for a Lightning.


----------



## bond32

I'm finding that when I change the clock rates, even water cooling (temps all in check, including VRM), the black screens happen...


----------



## jomama22

ASIC games, eh?
All 290X:

1 72.6%
2 72.6%
3 74%
4 73.8%
5 70.4%
6 77.2%
7 76.7%
8 76.3%
9 73.3%
10 72.9%


----------



## Jpmboy

Quote:


> Originally Posted by *jomama22*
> 
> ASIC games, eh?
> All 290X:
> 
> 1 72.6%
> 2 72.6%
> 3 74%
> 4 73.8%
> 5 70.4%
> 6 77.2%
> 7 76.7%
> 8 76.3%
> 9 73.3%
> 10 72.9%


And out of this binning list, which of them performed the best?


----------



## Raxus

Quote:


> Originally Posted by *jerrolds*
> 
> Yes, but we paid extra. Yes, less than retail, but $549 without BF4 and $579 for the BF4 edition. And IIRC, it does not include any future DLCs/expansions.


I was never promised a free game, so I really don't feel cheated. I knew they'd eventually fire the Never Settle bundle back up, but I really didn't want to wait.

I do not feel that I am owed anything.


----------



## flamin9_t00l

HIS Radeon R9 290



Had a bit of bother coming from an Nvidia system (BSODs 01 and F4). Also experienced distorted colours during boot-up (not in Windows).

Since reinstalling the OS I've never had any more issues, very pleased.


Not the best OC'er: artifacting at 1100MHz with +50mV on the reference cooler.

Will be adding another 290 after crimbo and both will be under water... coming from 580 SLI.


----------



## jerrolds

Quote:


> Originally Posted by *Arizonian*
> 
> If your comparing a reference card with reference blower like Gigabyte to a reference card with aftermarket cooling like a Windforce then the price difference can be approximately $30-$40 more.
> 
> If your asking between reference cards vs non-reference design aka extra power phases, thicker PCB, etc...then it can vary and more than just extra cooling like your Lightnings and Matrix.
> 
> I've been waiting for so long I'm going to bust but a few AIB's have said no later than 2nd week of December. So anywhere from next week to then we should see something. Difference will be worth the wait IMO if your not putting water blocks on it. I'm leaning toward ASUS DCUII but hope to last long enough for a Lightning.


Aren't 7970 DC2s voltage locked? I just hope third-party aftermarket solutions don't lock their cards like they did last gen; most stayed locked even after BIOS updates.


----------



## iGameInverted

I did take a look at that, but I don't want to play the lottery. While some will say it is worth it, I just want to be certain that I will have a 290X, and I will pay the premium.

I would like to overclock, though, and that is my main concern with certain brands. Wondering if any particular brand is more likely to overclock better than another, as well as where the HIS model stands.

I have had a HIS IceQ 6970 in the past and loved it, which is also a reason why I am considering buying it. As well as the non-bundle price of $550.


----------



## Arizonian

Quote:


> Originally Posted by *jerrolds*
> 
> Aren't 7970 DC2s voltage locked? I just hope third-party aftermarket solutions don't lock their cards like they did last gen; most stayed locked even after BIOS updates.


Don't know. I've never lasted past Nvidia reference cards when they release.


Last aftermarket / AMD card I had was MSI TwinFrozr 6870.


----------



## Sgt Bilko

I had a Windforce 3 7970 and it was volt locked... bit of a pain, but it served its purpose.


----------



## jomama22

Quote:


> Originally Posted by *Jpmboy*
> 
> And out of this binning list, which of them performed the best?


76% and up were the best, but only above 1.35V actual. If you're sticking with the Asus.ROM BIOS, it makes little difference.

The highest stock-BIOS OC I had was 1160 (no voltage changes), and that was the 73.8%. On air.

As luck would have it, my best card was the very first one I tried, lol. I have one with a better core, but it's stuck with Elpida, while the other is Hynix. A shame really.


----------



## Raxus

I had two Sapphire 7970 OC w/ Boost cards. Now those were loud cards.


----------



## Sgt Bilko

In other news: The 13.11 WHQL drivers are now official for Futuremark.

Now we can see AMD rolling in


----------



## flamin9_t00l

Is the WHQL recommended over the Beta 9.2?


----------



## petedread

Quote:


> Originally Posted by *flamin9_t00l*
> 
> HIS Radeon R9 290
> 
> Had a bit of bother coming from an Nvidia system (BSODs 01 and F4). Also experienced distorted colours during boot-up (not in Windows).
> 
> Since reinstalling the OS I've never had any more issues, very pleased.
> 
> Not the best OC'er: artifacting at 1100MHz with +50mV on the reference cooler.
> 
> Will be adding another 290 after crimbo and both will be under water... coming from 580 SLI.


I know this might sound backwards, but have you tried 1100 core with stock volts? Or by +50mV did you mean +50% power in CCC? (Just checking.)


----------



## Sgt Bilko

Quote:


> Originally Posted by *flamin9_t00l*
> 
> Is the WHQL recommended over the Beta 9.2?


Beta 9.2 is newer, and AMD recommends it over the WHQL driver.


----------



## brazilianloser

Quote:


> Originally Posted by *Sgt Bilko*
> 
> In other news: The 13.11 WHQL drivers are now official for Futuremark.
> 
> Now we can see AMD rolling in


I just ran a test minutes ago and it still said unknown GPU... where did you see that news?


----------



## flamin9_t00l

OK, thanks, will just stick with the beta for now.


----------



## tsm106

Quote:


> Originally Posted by *Sgt Bilko*
> 
> In other news: The 13.11 WHQL drivers are now official for Futuremark.
> 
> Now we can see AMD rolling in


It's about to get real yo!


----------



## Arizonian

Quote:


> Originally Posted by *flamin9_t00l*
> 
> Is the WHQL recommended over the Beta 9.2?


As a rule of thumb, *IMO*, I've always stuck with WHQL drivers even when a beta was out for Nvidia. Zero problems, as by the time they hit WHQL all bugs are worked out (for the most part).

However, if there are performance improvements in games you play, or bugs ironed out in the beta, and you want the performance boost or the bug fix without waiting, then move up to the latest beta.

Ask a bencher: they will try a beta to see whether it improves benchmark scores before deciding if it's worth it. If it brings a decrease in benchmarks, they roll back to the last best driver.
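That bencher routine (try the beta, keep whichever driver benches higher, roll back otherwise) is simple enough to sketch. The driver names and scores below are made up for illustration, not real results:

```python
from statistics import median

def best_driver(runs):
    """Pick the driver whose benchmark runs have the highest median score.

    runs: dict mapping driver name -> list of scores from repeated runs.
    Returns (driver_name, median_score).
    """
    medians = {drv: median(scores) for drv, scores in runs.items()}
    winner = max(medians, key=medians.get)
    return winner, medians[winner]

# Hypothetical Fire Strike scores: three runs per driver
runs = {
    "13.11 WHQL":    [10150, 10180, 10120],
    "13.11 Beta9.2": [10240, 10260, 10210],
}
print(best_driver(runs))  # ('13.11 Beta9.2', 10240)
```

Using the median of a few runs, rather than a single run, smooths out run-to-run variance before deciding whether to roll back.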
Quote:


> Originally Posted by *tsm106*
> 
> It's about to get real yo!


Good luck bro, represent us well.


----------



## Sgt Bilko

Quote:


> Originally Posted by *brazilianloser*
> 
> I just ran a test minutes ago and it still said unknown GPU... where did you see that news?


Here: http://www.3dmark.com/fs/1151775

When I ran this test it wasn't counted, but I just loaded it a few minutes ago to check something and BAM... it's official.


----------



## tsm106

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> It's about to get real yo!
> 
> Good luck bro, represent us well.
Click to expand...

The wait until they get posted on the leaderboard is going to be agonizing hehe.


----------



## Raxus

Quote:


> Originally Posted by *Arizonian*
> 
> As a rule of thumb, *IMO*, I've always stuck with WHQL drivers even when a beta was out for Nvidia. Zero problems, as by the time they hit WHQL all bugs are worked out (for the most part).
> 
> However, if there are performance improvements in games you play, or bugs ironed out in the beta, and you want the performance boost or the bug fix without waiting, then move up to the latest beta.
> 
> Ask a bencher: they will try a beta to see whether it improves benchmark scores before deciding if it's worth it. If it brings a decrease in benchmarks, they roll back to the last best driver.
> Good luck bro, represent us well.


What made you decide on a 290X over a 780 Ti?


----------



## brazilianloser

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Here: http://www.3dmark.com/fs/1151775
> 
> When I ran this test it wasn't counted, but I just loaded it a few minutes ago to check something and BAM... it's official.


Well I guess I will do another batch of tests.


----------



## flamin9_t00l

Quote:


> Originally Posted by *Arizonian*
> 
> As a rule of thumb, *IMO*, I've always stuck with WHQL drivers even when a beta was out for Nvidia. Zero problems, as by the time they hit WHQL all bugs are worked out (for the most part).
> 
> However, if there are performance improvements in games you play, or bugs ironed out in the beta, and you want the performance boost or the bug fix without waiting, then move up to the latest beta.
> 
> Ask a bencher: they will try a beta to see whether it improves benchmark scores before deciding if it's worth it. If it brings a decrease in benchmarks, they roll back to the last best driver.
> Good luck bro, represent us well.


Yeah, I used to stick with WHQL with Nvidia 99% of the time, as the betas tended to cause issues.

Just to add some more info on my BSOD problem before reinstalling the OS: I used BlueScreenView to view the crash dump, and it confirmed atikmdag.sys (a driver file) was the cause of the problems. Just throwing that out there in case anyone else is having similar problems. Maybe it conflicts with the Nvidia driver, as the problem is now sorted with the reinstall... I used DDU as well before installing the 290 for the first time.

And for performance, one of these cards is similar to my two-card 580 SLI but without the crippling VRAM limitation. They are beasts; very chuffed with mine, and after all the bashing of the super loud fans I was pleasantly surprised... it's really not that bad unless you're going for crazy OCs.

Have to say the 4GB VRAM was the deciding factor for me (and of course the price).

Ghosts @ 1440p almost maxes out 4GB.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> The wait until they get posted on the leaderboard is going to be agonizing hehe.


Here, I'll post it.

1st place: 290X x3 - 15574

http://www.3dmark.com/fs/1179138

2nd place: Titan x3 - 15156

http://www.3dmark.com/fs/926452


----------



## nyboy42

After 2 days with the Sapphire R9 290, I'm getting inconsistent core clock and GPU usage. For example, the Unigine Heaven benchmark performs beautifully: my GPU usage stays a solid 100% and my core clock is locked to max through the entire benchmark. Then I'll fire up Crysis 3 and the core clock constantly fluctuates between 500MHz and 947MHz, giving terrible frame rates. Then I fired up Assassin's Creed 4 and it works beautifully, core clock locked at 947. Then I launch Metro: Last Light and it's TERRIBLE, GPU usage dipping down to 30% and core clock going as low as 300MHz. BUT I'll run the Metro: Last Light benchmark on MAX settings and it works perfectly; I average over 60 fps and the core clock stays solid. So this is not making any sense to me. I have a custom fan profile and the temps NEVER go above 80C. Besides, AMD has stated that at 95C you should be getting full boost-clock performance with no throttling; my card comes nowhere near 95C, and certain games still give terrible performance.

At this point I can't tell if it's a faulty card, bad GPU drivers, or certain games needing patches. Wondering if anyone else with a Sapphire R9 290 and some of these games can comment on whether they have the same experience.
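One way to pin down whether this is PowerTune throttling rather than a game bug: log the core clock once a second (GPU-Z and Afterburner can both write a sensor log) and count how much of the run sits below boost. A rough sketch; the sample numbers are made up, and 947MHz is just the 290's reference boost clock:

```python
def throttle_report(clocks_mhz, boost_mhz=947, tolerance=0.97):
    """Return the fraction of samples sitting below `tolerance` * boost clock.

    clocks_mhz: per-second core-clock samples pulled from a sensor log.
    """
    floor = boost_mhz * tolerance
    throttled = sum(1 for c in clocks_mhz if c < floor)
    return throttled / len(clocks_mhz)

# Made-up samples: a steady Heaven run vs. a run dipping toward 500MHz
heaven = [947] * 60
crysis3 = [947, 900, 720, 500, 947, 650, 947, 500, 810, 947]
print(throttle_report(heaven))   # 0.0
print(throttle_report(crysis3))  # 0.6
```

If the throttled fraction is high while temps stay under 80C, the power limit (not the 95C temperature target) is the likelier culprit.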


----------



## rdr09

Quote:


> Originally Posted by *nyboy42*
> 
> After 2 days with the Sapphire R9 290, I'm getting inconsistent core clock and GPU usage. For example, the Unigine Heaven benchmark performs beautifully: my GPU usage stays a solid 100% and my core clock is locked to max through the entire benchmark. Then I'll fire up Crysis 3 and the core clock constantly fluctuates between 500MHz and 947MHz, giving terrible frame rates. Then I fired up Assassin's Creed 4 and it works beautifully, core clock locked at 947. Then I launch Metro: Last Light and it's TERRIBLE, GPU usage dipping down to 30% and core clock going as low as 300MHz. BUT I'll run the Metro: Last Light benchmark on MAX settings and it works perfectly; I average over 60 fps and the core clock stays solid. So this is not making any sense to me. I have a custom fan profile and the temps NEVER go above 80C. Besides, AMD has stated that at 95C you should be getting full boost-clock performance with no throttling; my card comes nowhere near 95C, and certain games still give terrible performance.
> 
> At this point I can't tell if it's a faulty card, bad GPU drivers, or certain games needing patches. Wondering if anyone else with a Sapphire R9 290 and some of these games can comment on whether they have the same experience.


I played C3 MP maxed out at 1080 (I mean maxed out) with the 290 stock, but my i7 is OC'ed to 4.5GHz. Never bothered checking usage 'cause it was so smooth. I cannot believe I can max out this game with one card.


edit: do you mind running 3DMark11?


----------



## Clockster

Well, after getting two DOA 290 cards, I managed to swap the working one for a 290X and then grabbed another 290X.
Hopefully I can finish my loop this weekend. Also changed my board and CPU.

Picked up an i7 4770K + MSI Z87 MPower Max for a sick price; couldn't say no, lol.


----------



## nyboy42

Quote:


> Originally Posted by *rdr09*
> 
> I played C3 MP maxed out at 1080 (I mean maxed out) with the 290 stock, but my i7 is OC'ed to 4.5GHz. Never bothered checking usage 'cause it was so smooth. I cannot believe I can max out this game with one card.
> 
> edit: do you mind running 3DMark11?


Your core clock was solid at 1080? Never dipped? Can you enable MSI Afterburner to see the usage and core clock and run the first level? Right on the first level of single player, at max settings, 1920x1080 with FXAA, I see my clocks dipping immediately.

I don't have 3DMark11.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *nyboy42*
> 
> after 2 days with the Sapphire r9 290, im getting inconsistent core clock and gpu usage performance. For example, Unigine Heaven Benchmark performs beautifully, my gpu usage stays 100% solid and my core clock is locked to max thru the entire benchmark. Then ill fire up Crysis 3 and the core clock is constantly fluctuating between 500mhz- 947mhz getting terrible frame rates. Then I fired up Assassins Creed 4 and it works beautifully, core clock locked at 947. Then i launch Metro Last Light and its TERRIBLE gpu usage dipping down to 30%, core clock going as low as 300 mhz BUT ill run the Metro Last Light Benchmark in MAX settings and it works perfect, i average over 60 fps, core clock stays solid, so this not making any sense to me. I have a custom fan profile and the temps NEVER go above 80C. Besides AMD has stated that at 95C you should be getting full boost clock performance with no throttling, my card comes nowhere near 95C and certain games give terrible performance.
> 
> At this point I cant tell if its a faulty card, bad GPU drivers, or certain games need patches - wondering if anyone else with a Sapphire r9 290 and some of these games can comment whether they have the same experience.


Increase your power target via CCC or GPU Tweak.

Check that PhysX is turned off for Metro, that will murder performance on AMD cards.

Some games cause the card to hit the power limit and throttle more than others.

Edit: Download 3DMark11 and run it. That caused my card to throttle before I upped the power limit.


----------



## rdr09

Quote:


> Originally Posted by *nyboy42*
> 
> Your core clock was solid at 1080? Never dipped? Can you enable MSI Afterburner to see the usage and core clock and run the first level? Right on the first level of single player, at max settings, 1920x1080 with FXAA, I see my clocks dipping immediately.
> 
> I don't have 3DMark11.


http://www.techpowerup.com/downloads/Benchmarking/Futuremark/

I did not even check, 'cause like I said, I was ecstatic to play it maxed with just a single card.

Edit: wait, mine is water cooled.


----------



## nyboy42

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Increase your power target via CCC or GPU Tweak.
> 
> Check that PhysX is turned off for Metro, that will murder performance on AMD cards.
> 
> Some games cause the card to hit the power limit and throttle more than others.
> 
> Edit: DL 3dMark11 and run it. That caused my card to throttle before I upped the power limit.


Yeah, I forgot to mention that the power limit is at +50 and PhysX is off.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *nyboy42*
> 
> Yeah, I forgot to mention that the power limit is at +50 and PhysX is off.


What program did you use to set the power limit? MSI AB's power limit setting doesn't work for my card.

What is your CPU clocked at?

Run Task Manager and watch CPU usage during/after gaming.


----------



## rdr09

Quote:


> Originally Posted by *nyboy42*
> 
> Yeah, I forgot to mention that the power limit is at +50 and PhysX is off.


Run the bench.


----------



## Arizonian

Quote:


> Originally Posted by *Raxus*
> 
> What made you decide on a 290X over a 780 Ti?


Originally, and what made me start the club: it's a good time to come back to AMD, with change and innovation happening. AMD reference cards, though, IMO aren't as good as Nvidia's regarding fan noise if one is staying on air. So mine was sold and I got to keep BF4. I'm hoping the black screen issue gets taken care of properly. Hoping to hear about the fix here today or tomorrow and see the results.

I still want a 290X at heart; however, if Nvidia drops non-reference cards first, the urge to purchase will be hard for me to resist. I'll be honest, I'm not biased to either.

I've got a small dilemma, though, because my sister and I are looking into the Nvidia Shield for my two nephews, who've requested to share one as a Christmas gift, and the $100 off makes it a lucrative purchase for me, on top of a game bundle where I like at least two of the games.

Both are great cards, and decisions are harder to make than ever before for me.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Falkentyne*
> 
> Mine says whatever I set it to in Afterburner. I tested it directly. No problems, and I don't have to touch CCC; it's disabled. DO NOT ENABLE OVERDRIVE IN CCC and your AB settings will WORK.
> Even though OverDrive is not enabled, it still SEES the OverDrive settings. Probably because PowerTune itself is enabled, but the SETTINGS to change it through CCC are "locked".


Unfortunately for me it is not that way. That's what I assumed it would do when I first installed the card and started overclocking; I never even opened CCC. Once I finally did and realized the power limit was not changing, I had to enable it and slide it over, and then all was well for me.


----------



## stilllogicz

I've been contemplating something the last few days and I'm torn on the answer. For 1440p @ 100-120 fps solid (Ultra settings, at least 2x-4x MSAA as a general guideline for all games): two 290s or three 290s?

They will be overvolted and OC'd, but that's a lottery at best, so for factoring in the answer consider it a mild OC.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Arizonian*
> 
> I've got a small dilemma though because my sister and I are looking into the Nvidia Shield for my two nephews who've requested to *share* one as a Christmas gift


Siblings sharing a gaming system?

You jest.....surely?


----------



## jerrolds

Quote:


> Originally Posted by *stilllogicz*
> 
> I've been contemplating something the last few days and I'm torn on the answer. For 1440p @ 100-120 fps solid (Ultra settings, at least 2x-4x MSAA as a general guideline for all games): two 290s or three 290s?
> 
> They will be overvolted and OC'd, but that's a lottery at best, so for factoring in the answer consider it a mild OC.


[email protected]/1500 I get 100fps+ in multiplayer BF4 Ultra, no AA, quite often... sometimes it dips to around 75fps, but more often than not it's hovering around 100fps.

Two 290s should give you a solid 120fps at 1440p, I would think. Going "Low" or "Med" post-processing AA doesn't really kill performance either... maybe 5-10fps.


----------



## brazilianloser

Just ran another 3DMark and it did not pick up my 290 while using the WHQL drivers.

And to add to it, every time I up the core my scores get better, but if I up the memory clock the scores go down...


----------



## Lennyx

So I just slept some hours this afternoon, and woke up to no new drivers.
Seriously, this whole experience is starting to get to me. I want to play some BF4, but just knowing that at some point I'm going to black screen is really frustrating. I want to have fun with my card and push its limits.

Oh, today is the day my quick disconnect fittings were coming... stuck in shipping...

Guess I can just open Valley and go watch some anime on my laptop while I wait for the black screen. Laters, guys.


----------



## DampMonkey

Quote:


> Originally Posted by *Lennyx*
> 
> So I just slept some hours this afternoon, and woke up to no new drivers.
> Seriously, this whole experience is starting to get to me. I want to play some BF4, but just knowing that at some point I'm going to black screen is really frustrating. I want to have fun with my card and push its limits.
> 
> Oh, today is the day my quick disconnect fittings were coming... stuck in shipping...
> 
> Guess I can just open Valley and go watch some anime on my laptop while I wait for the black screen. Laters, guys.


What voltage, what BIOS, and what clocks?


----------



## Poyri

Which block is better, EK or Koolance?
http://www.frozencpu.com/products/21972/ex-blc-1575/Koolance_VID-AR290X_Radeon_VGA_Liquid_Cooling_Block_No_Fittings.html


----------



## brazilianloser

Man, almost breaking 10k on Fire Strike... http://www.3dmark.com/fs/1190685


----------



## rdr09

Quote:


> Originally Posted by *nyboy42*
> 
> your core clock was solid at 1080? never dipped ? can you enable MSI Afterburner to see the usage and core clock and run the first level. Right on the first level on single player in max settings 1920x1080 with FXAA - I see my clocks dipping immediately
> 
> I dont have 3dMark11


Here was C3 campaign from level 2 to 3 (let this run a bit)...



Look at my core temp. And I am using water.

The heater is on in the house.

Edit: 13.11 WHQL is a win. BTW, if you are running 1080... why are you using FXAA? Use Very High and 8x MSAA. If FXAA, then you should have gotten a 280X instead.


----------



## jerrolds

I know how you feel. I'm 18 points away from hitting 11k on my 2600K, but I don't think I'll hit it without going over 1230/1600.

I hit a brick wall at 1225: artifacting at max allowable voltage on the Asus BIOS, and temps are 65C core, 90C VRM1, 65C VRM2.

http://www.3dmark.com/fs/1182498


----------



## Sazz

I got my R9 290 that replaced my 290X, and I think I was able to unlock it successfully. I'mma post screenshots of before and after runs of Fire Strike in a sec.


----------



## NorcalTRD

Quote:


> Originally Posted by *stilllogicz*
> 
> I've been contemplating something the last few days and I'm torn on the answer. For 1440p @ 100-120 fps solid (Ultra settings, at least 2x-4x MSAA as a general guideline for all games): two 290s or three 290s?
> 
> They will be overvolted and OC'd, but that's a lottery at best, so for factoring in the answer consider it a mild OC.


I'd say two, which would probably be overkill.
At 1080p in BF4, ultra'd out, I get 100-130 FPS on a single 290X.


----------



## Lennyx

Quote:


> Originally Posted by *DampMonkey*
> 
> What voltage, what BIOS, and what clocks?


Stock Sapphire on water. Played around with Afterburner yesterday, or was it last night, and I could force the black screen freeze in Valley when memory was at 1500.
It was stable in Valley at 1400. Without any fiddling or fine tuning, the best stable clock I got running in Valley was 1200 core / 1400 memory. It did not work in BF4, though.
But in the end, it's no fun even playing around with OCing when I have a black-screening card to begin with.

I really hope a driver update will fix this. I regret that I did not RMA the card right away or get a refund. My sea of patience looks like a puddle at this point.


----------



## Neutronman

Sapphire released a performance boost for the R9 290...

Reported today.

As we all know, it is nothing more than an official BIOS replacement with a 47% fan profile baked in. I have looked hard for the BIOS image and the flash tool they are using but have yet to find it.

Have any of you had better luck getting your hands on this? If so, links please...

Cheers,

Greg


----------



## ReHWolution

Just had a Red Screen of Death on W8.1 Pro x64, playing BF4 with the card at stock settings...
Seriously guys, I crash at least twice per game; I can't play like this. Sapphire R9 290X with Elpida here :\


----------



## Raxus

Quote:


> Originally Posted by *ReHWolution*
> 
> Just had a Red Screen of Death on W8.1 Pro x64, playing BF4 with the card at stock settings...
> Seriously guys, I crash at least twice per game; I can't play like this. Sapphire R9 290X with Elpida here :\


Shocker, another Sapphire R9 290X with Elpida black screening.

Anyway, supposedly the next round of driver updates is supposed to remedy that. They said they got a little hung up, and it'll take a little longer.


----------



## Sgt Bilko

Quote:


> Originally Posted by *brazilianloser*
> 
> Man almost breaking the 10k on firestrike... http://www.3dmark.com/fs/1190685


You can do it!!


----------



## evensen007

Quote:


> Originally Posted by *ReHWolution*
> 
> Just had a Red Screen of Death on W8.1 Pro x64, playing BF4 with the card at stock settings...
> Seriously guys, I crash at least twice per game; I can't play like this. Sapphire R9 290X with Elpida here :\


Someone said they fixed this by running BF4 as admin (however unlikely it seems that would fix it).


----------



## ReHWolution

Quote:


> Originally Posted by *evensen007*
> 
> Someone said they fixed this by running bf4 as admin (however unlikely it seems that would fix it).


Dang, I wrote BF4, but I meant BF3, sorry :\ However, is it really THAT simple, just running a program as admin? I mean, I had black screens with BF3, Metro LL, Crysis 3, Unigine Heaven 2.1 (the one for HWBOT), and now BF3 with a red screen... I'm really pissed off...


----------



## Scorpion49

Anyone else having these white flashes? I found a video by a guy over on Anand; I haven't been able to capture it on mine, as it only does it once every 60-90 seconds.


----------



## Ukkooh

Are any of you who get black/red/white screens running Windows 8.1? How long did it take for them to start appearing on your card? I'm running Win 8.1 and haven't had a single issue yet. My card has Elpida memory too.


----------



## cyenz

New Drivers: http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx

Feature Highlights of The AMD Catalyst 13.11 Beta9.4 Driver for Windows

Includes all Feature Highlights of The AMD Catalyst 13.11 Beta9.2 Driver

*May resolve* intermittent black screens or display loss observed on some AMD Radeon™ R9 290X and AMD Radeon R9 290 graphics cards

Improves AMD CrossFire™ scaling in the multi-player portion of Call of Duty®: Ghosts

AMD Enduro Technology Profile updates:

XCOM: Enemy Unknown
Need for Speed Rivals


----------



## evensen007

Quote:


> Originally Posted by *cyenz*
> 
> New Drivers: http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
> 
> Feature Highlights of The AMD Catalyst 13.11 Beta9.4 Driver for Windows
> 
> Includes all Feature Highlights of The AMD Catalyst 13.11 Beta9.2 Driver
> 
> *May resolve* intermittent black screens or display loss observed on some AMD Radeon™ R9 290X and AMD Radeon R9 290 graphics cards
> 
> Improves AMD CrossFire™ scaling in the multi-player portion of Call of Duty®: Ghosts
> 
> AMD Enduro Technology Profile updates:
> 
> XCOM: Enemy Unknown
> Need for Speed Rivals


----------



## Warsam71

Hello everyone (@Arizonian),

The driver is up!

You can download it from here: http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx

Please make sure to read the Feature Highlights as it provides more details about the driver.

And if you are still experiencing an issue after the installation of the driver, please use our AMD Issue Reporting Form for the Catalyst driver, here is the link: http://www.amdsurveys.com/se.ashx?s=5A1E27D25AD12B3D

Thank you again for all your patience!

Best,
Sam


----------



## eternal7trance

Quote:


> Originally Posted by *Warsam71*
> 
> Hello everyone (@Arizonian),
> 
> The driver is up!
> 
> You can download it from here: http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
> 
> Please make sure to read the Feature Highlights as it provides more details about the driver.
> 
> And if you are still experiencing an issue after the installation of the driver, please use our AMD Issue Reporting Form for the Catalyst driver, here is the link: http://www.amdsurveys.com/se.ashx?s=5A1E27D25AD12B3D
> 
> Thank you again for all your patience!
> 
> Best,
> Sam


Yay, more drivers. Will test when I get home, but thankfully I had no black screen issues.

Would it be possible to release an updated BIOS that lets the fan be more aggressive? I'm tired of using Afterburner


----------



## stilllogicz

Quote:


> Originally Posted by *Warsam71*
> 
> Hello everyone (@Arizonian),
> 
> The driver is up!
> 
> You can download it from here: http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
> 
> Please make sure to read the Feature Highlights as it provides more details about the driver.
> 
> And if you are still experiencing an issue after the installation of the driver, please use our AMD Issue Reporting Form for the Catalyst driver, here is the link: http://www.amdsurveys.com/se.ashx?s=5A1E27D25AD12B3D
> 
> Thank you again for all your patience!
> 
> Best,
> Sam


It begins.


----------



## jerrolds

@Warsam71 - Thanks!

Are you able to give us any info on what the problem was/might be?


----------



## ReHWolution

Quote:


> Originally Posted by *stilllogicz*
> 
> It begins.


*conga train* If it effectively solves my problems, well guys, you've got me for another 5 or 6 generations of GPUs, each at day 1


----------



## Lennyx

Let's see if this works.


----------



## estens

I have some issues overclocking my Sapphire 290X: every time the actual voltage goes past 1.25v for more than a few seconds, my monitor looks like the picture below.
This happens even at stock clocks if I increase the voltage over 1.25.

Is it just that my card doesn't like the extra voltage, is it broken, or could it be caused by another component in my PC?

I don't think it is a software problem, since I have tried several BIOSes and drivers. And I have a full custom loop, so I don't think it is temperature related.

http://s67.photobucket.com/user/eivst/media/20131122_172318_zpscd53ec82.jpg.html


----------



## ReHWolution

Quote:


> Originally Posted by *jerrolds*
> 
> @Warsam71 - Thanks!
> 
> Are you able to give us any info on what the problem was/might be?


+1


----------



## selk22

Alright well lets hope this fixes it! If I can actually OC this 290x I will start to be one happy OC'er


----------



## Raephen

Quote:


> Originally Posted by *nyboy42*
> 
> After 2 days with the Sapphire R9 290, I'm getting inconsistent core clock and GPU usage performance. For example, the Unigine Heaven benchmark performs beautifully: my GPU usage stays a solid 100% and my core clock is locked to max through the entire benchmark. Then I'll fire up Crysis 3 and the core clock constantly fluctuates between 500 MHz and 947 MHz, with terrible frame rates. Then I fired up Assassin's Creed 4 and it works beautifully, core clock locked at 947. Then I launch Metro Last Light and it's TERRIBLE, GPU usage dipping down to 30% and core clock going as low as 300 MHz. BUT I'll run the Metro Last Light benchmark at MAX settings and it works perfectly: I average over 60 fps and the core clock stays solid, so this is not making any sense to me. I have a custom fan profile and the temps NEVER go above 80C. Besides, AMD has stated that at 95C you should be getting full boost clock performance with no throttling; my card comes nowhere near 95C, and certain games still give terrible performance.
> 
> At this point I can't tell if it's a faulty card, bad GPU drivers, or certain games needing patches. Wondering if anyone else with a Sapphire R9 290 and some of these games can comment on whether they have the same experience.


I can't comment on the games you mentioned.

I can, however, say I've seen similar behavior in Skyrim (I just play that a lot atm... I thought ultra settings with my old HD7870 looked awesome; who could've guessed there could be such a difference between ultra and ultra...)

I play with the Afterburner OSD on, and I've seen instances of the clock speed dropping down to half of what it was, but that's on load screens, so not that inexplicable.

No, the thing that strikes me as odd is how my GPU usage can drop to 0% for multiple (really, multiple multiple) seconds, yet the game stays silky smooth. Has anyone noticed any visual effects from bouncing clocks?

Another weird thing I've noticed is in the AB graphs: the GPU usage graph is usually pretty low and tight, right up until the point in-game that I open the menu and exit. That seems to cause a bit of a spike in GPU usage...

Odd, I would have thought flying around Skyrim and having dragons breathe fire down my neck would be more graphically taxing than a menu.

Another thing to note is I never reach full clocks in game. In Valley I do reach the full 1000 MHz I've got my Sapphire 290 at, but maybe because Skyrim is an older, less taxing DX9/10 game, PowerTune decides it doesn't need full clocks.

I don't know, really, but so far, in Skyrim at least, I've suffered no ill effects from the bouncing clocks (btw: in game usually between 700ish and 847; yes 847, not 947). And I can rule out throttling: in Skyrim the core maxes around 65C, and in Valley Extreme HD just about the high 80s, with VRM1 (the hotter of the two in my case) mostly 10 to 15 degrees hotter.

I'm no expert, but I've got a gut feeling this odd behavior and these odd readings could be driver/software related.

I have no issues with 9.2 Beta yet, so I'll just wait and see what the next real update brings.


----------



## MrWhiteRX7

I see no reason to update the driver







I don't have black screen issues and there's nothing else on that driver pretty much. Hope it helps the rest of you!


----------



## tsm106

Oh new drivers!

Quote:


> Originally Posted by *Poyri*
> 
> Which block is better ek or koolance?
> http://www.frozencpu.com/products/21972/ex-blc-1575/Koolance_VID-AR290X_Radeon_VGA_Liquid_Cooling_Block_No_Fittings.html


Ek for me.


----------



## Neutronman

Quote:


> Originally Posted by *tsm106*
> 
> Oh new drivers!
> Ek for me.


EK for me too. Traditionally, Koolance has introduced more flow restriction into my loops.


----------



## Raephen

Quote:


> Originally Posted by *Neutronman*
> 
> Sapphire released performance boost for R9 290....
> 
> Reported today.
> 
> As we all know it is nothing more than an official bios replacement with fan profile of 47% baked in. I have looked hard for the bios image and the flash tool that they are using but have yet to find it.
> 
> Any of you guys had better luck getting your hands on this? If so links please....
> 
> Cheers,
> 
> Greg


Uhmm... My Sapphire 290 already had a 47% fan profile BIOS when I bought it... So where's the upgrade in this replacement?


----------



## jjjsong

Quote:


> Originally Posted by *Warsam71*
> 
> Hello everyone (@Arizonian),
> 
> The driver is up!
> 
> You can download it from here: http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
> 
> Please make sure to read the Feature Highlights as it provides more details about the driver.
> 
> And if you are still experiencing an issue after the installation of the driver, please use our AMD Issue Reporting Form for the Catalyst driver, here is the link: http://www.amdsurveys.com/se.ashx?s=5A1E27D25AD12B3D
> 
> Thank you again for all your patience!
> 
> Best,
> Sam


Great news! Thanks!

I'll test it once my replacement comes... although my issue has been blue screens.

Is it possible to share some info on the cause?


----------



## Virus_Shell

It finally came!!! 290 Club, please!!

http://www.techpowerup.com/gpuz/agmvv/

XFX 290 with the Arctic Cooling Accelero Hybrid


----------



## Prospector

Guys, is there any other efficient way to cool VRM1? I get around 71C at 70% fan speed on my Accelero Xtreme III (R9 290) during extreme gaming..


----------



## D749

From what I understand, frame pacing for the 290X currently supports DX10 and DX11 up to 2560x1600. DX9 support is in the works.

What about current support for frame pacing in 4K and Eyefinity using 290X in CF?

Thanks.


----------



## Scorpion49

Quote:


> Originally Posted by *D749*
> 
> From what I understand frame pacing for the 290X currently supports D10 and DX11 up to 2560x1600. DX9 support is in the works.
> 
> What about current support for frame pacing in 4K and Eyefinity using 290X in CF?
> 
> Thanks.


I dunno about the frame pacing, but I just had major deja-vu switching between two tabs and seeing the same post on multiple forums


----------



## ZealotKi11er

Quote:


> Originally Posted by *D749*
> 
> From what I understand frame pacing for the 290X currently supports D10 and DX11 up to 2560x1600. DX9 support is in the works.
> 
> What about current support for frame pacing in 4K and Eyefinity using 290X in CF?
> 
> Thanks.


The 290X supports everything but DX9, I think. 4K and Eyefinity do work.


----------



## D749

Quote:


> Originally Posted by *Scorpion49*
> 
> I dunno about the frame pacing, but I just had major deja-vu switching between two tabs and seeing the same post on multiple forums


Then why the heck didn't you post a reply over there too?


----------



## D749

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 290X support everything bu DX9 i think. 4K and Eyefinity does work.


Thanks. That's what I thought, but I was hoping for some links to official confirmations, etc.


----------



## Scorpion49

Oh man, the Aquacomputer blocks got pushed back into December now. Has anyone ever ordered from aquatuning.us? I'm going to try and cancel, since technically it's a "pre-order" and it doesn't look like I'll ever actually get the blocks.


----------



## Forceman

Quote:


> Originally Posted by *Prospector*
> 
> Guys,there is any other efficient way to cool the VRM1.. I got like 71C with 70%fan speed on my accelero extreme III(r9 290) on extreme gaming..


Better heatsinks on them? But 71C is nothing for the VRMs.


----------



## NorcalTRD

Can you guys please run Hawaiiinfo (found in this post)
http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/610#post_21243519
and report back?

Please state the brand and whether it was unlocked with a 290X bios.
We think we have found a way to identify which R9 290 chips can be unlocked into 290X.
We believe the R1 reading F8000005 indicates an unlockable chip and R1 reading F8010005 indicates a locked chip.

EXAMPLE

Powercolor R9 290 unlocked with 290X bios
Compatible adapters detected: 1
PCI ID: 1002:67B0 - 1043:0466
Memory config: 0x5A0013A9 Elpida
R1: F8000005
R2: 00000000


----------



## Heinz68

Quote:


> Originally Posted by *D749*
> 
> Thanks. That's what I though but I was hoping for some links to official confirmations, etc.


HARDOCP AMD Radeon R9 290X CrossFire Video Card Review
Quote:


> In addition, this technology works with AMD's software based frame pacing technology. Currently, with Radeon R9 290X CrossFire you will get frame pacing support with the current driver at Eyefinity resolutions, and 4K resolutions.


----------



## Scorpion49

Quote:


> Originally Posted by *NorcalTRD*
> 
> Can you guys please run Hawaiiinfo (found in this post)
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/610#post_21243519
> and report back?
> Please state the base model and whether it was unlocked with a 290X bios.
> We think we have found a way to identify which R9 290 chips can be unlocked into 290X.
> We believe the R1 reading F8000005 indicates an unlockable chip and R1 reading F8010005 indicates a locked chip.
> 
> EXAMPLE
> 
> Powercolor R9 290 unlocked with 290X bios
> Compatible adapters detected: 1
> PCI ID: 1002:67B0 - 1043:0466
> Memory config: 0x5A0013A9 Elpida
> R1: F8000005
> R2: 00000000


Compatible adapters detected: 2
PCI ID: 1002:67B1 - 1002:0B00
Memory config: 0x500036A9 Hynix
R1: F8010005
R2: 00000000

Mine do not unlock.


----------



## NorcalTRD

Quote:


> Originally Posted by *Scorpion49*
> 
> Compatible adapters detected: 2
> PCI ID: 1002:67B1 - 1002:0B00
> Memory config: 0x500036A9 Hynix
> R1: F8010005
> R2: 00000000
> 
> Mine do not unlock.


Perfect! Thank you!
You are helping to confirm that R1: F8010005 chips indeed are locked!
Can you mention the brand as well?


----------



## Technewbie

Is there still no word on when the new drivers will come out? I know it got pushed back a couple days, but I don't want to ship my RMA, then have the driver come out and fix the black screen issue, and end up paying return shipping because the card is fine now .-.


----------



## Loktar Ogar

Quote:


> Originally Posted by *Technewbie*
> 
> Is there still no word on when the new drivers will come out? I know it got pushed back a couple days but I don't want to ship my RMA and then have the driver come out and it fixes the black screen issue and I have to pay for return shipping because the card is fine now .-.


...


----------



## jjjsong

Quote:


> Originally Posted by *Technewbie*
> 
> Is there still no word on when the new drivers will come out? I know it got pushed back a couple days but I don't want to ship my RMA and then have the driver come out and it fixes the black screen issue and I have to pay for return shipping because the card is fine now .-.


It's already out http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx

But someone at AMD support forum said he's still getting black screens...


----------



## Ukkooh

Quote:


> Originally Posted by *Technewbie*
> 
> Is there still no word on when the new drivers will come out? I know it got pushed back a couple days but I don't want to ship my RMA and then have the driver come out and it fixes the black screen issue and I have to pay for return shipping because the card is fine now .-.


You already tested the ones that were released 20 minutes ago?


----------



## Scorpion49

Quote:


> Originally Posted by *NorcalTRD*
> 
> Perfect! Thank you!
> You are helping to confirm that R1: F8010005 chips indeed are locked!
> Can you mention the brand as well?


Sorry, both Gigabyte. Was typing one handed while eating a sandwich.


----------



## JordanTr

Quote:


> Originally Posted by *Technewbie*
> 
> Is there still no word on when the new drivers will come out? I know it got pushed back a couple days but I don't want to ship my RMA and then have the driver come out and it fixes the black screen issue and I have to pay for return shipping because the card is fine now .-.


They were released a couple hours ago







I'm expecting my Sapphire R9 290 tomorrow


----------



## Technewbie

Quote:


> Originally Posted by *jjjsong*
> 
> It's already out http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
> 
> But someone at AMD support forum said he's still getting black screens...


Yeah, I just went to check and saw it was out, then came back to edit my post. Guess it's time to try it out.


----------



## fewohfjweoifj

I just registered to confirm that the new drivers failed to fix the black screen issue. It's just as bad as always. Hopefully there are some NEW new drivers before the weekend is up.


----------



## Ukkooh

Quote:


> Originally Posted by *jjjsong*
> 
> It's already out http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
> 
> But someone at AMD support forum said he's still getting black screens...


Inb4 all cards using Elpida mem get replaced with Hynix-equipped cards.
Quote:


> Originally Posted by *fewohfjweoifj*
> 
> I just registered to confirm that the new drivers failed to fix the black screen issue. It's just as bad as always. Hopefully they are some NEW new drivers before the weekend is up.


What OS are you running on?


----------



## fewohfjweoifj

Quote:


> Originally Posted by *Ukkooh*
> 
> What OS are you running on?


I'm using Windows 7 64-bit.


----------



## stilllogicz

I'm just gonna sit back and see what you pros deduce. I've gotta wait a couple weeks before I pull the trigger on a few 290's. Hope you all get it sorted out!


----------



## kpoeticg

Quote:


> Originally Posted by *Ukkooh*
> 
> Inb4 all cards using elpida mem are going to be replaced with hynix using cards.


That would be wonderful. Hopefully would teach companies to stop cutting corners and trying to mix sub-par products in with the good versions


----------



## NorcalTRD

Quote:


> Originally Posted by *Scorpion49*
> 
> Sorry, both Gigabyte. Was typing one handed while eating a sandwich.


Thanks!
That means the R1 state should be a sure indication regardless of brand.
My readings were from a PowerColor.









My R9 290 unlocked to a 290X with Elpida memory and has no black screens or other issues to speak of.
9.7% performance increase from stock 290 to unlocked 290X with a 6% OC.


----------



## VSG

Quote:


> Originally Posted by *kpoeticg*
> 
> That would be wonderful. Hopefully would teach companies to stop cutting corners and trying to mix sub-par products in with the good versions


What? The black screens have nothing to do with memory type, they happen to owners of cards with both hynix and elpida. Also, as far as performance goes, elpida is right up there with hynix for this generation.


----------



## nemm

Just tried the latest drivers and all I can say is awful...

Overdrive was no longer in CCC, so I was unable to set the power limit to 50%, which dropped my FSE score from 99** to 80**, with the graphics score down by almost 20%.

Back to beta2 I go.


----------



## DampMonkey

Quote:


> Originally Posted by *kpoeticg*
> 
> That would be wonderful. Hopefully would teach companies to stop cutting corners and trying to mix sub-par products in with the good versions


I have Elpida memory; the only time I see a black screen is when my core is over 1320 MHz or memory over 6600


----------



## GenoOCAU

Just ran the hawaiinfo tool, afaik neither of my cards unlocked.

Compatible adapters detected: 2
PCI ID: 1002:67B1 - 1682:9295
Memory config: 0x5A0013A9 Elpida
R1: FA000005
R2: 00000000


----------



## Ukkooh

Quote:


> Originally Posted by *nemm*
> 
> Just tried the latest drivers and all I can say is awful...
> 
> Overdrive was no longer in CCC so unable to set power to 50% which resulted FSE 99** score down to 80** with graphics score down by almost 20%.
> 
> Back to beta2 I go.


Did you wipe the earlier drivers before installing the new ones? I also got this issue once with my 7970 but I got it fixed by wiping the drivers properly and reinstalling after that.


----------



## Arizonian

Looks like the update is posted today - 13.11 beta 9.4

http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx

Includes all Feature Highlights of The AMD Catalyst 13.11 Beta9.2 Driver

May resolve intermittent black screens or display loss observed on some AMD Radeon™ R9 290X and AMD Radeon R9 290 graphics cards
Improves AMD CrossFire™ scaling in the multi-player portion of Call of Duty®: Ghosts
AMD Enduro Technology Profile updates:
XCOM: Enemy Unknown
Need for Speed Rivals

Will update OP later tonight. Post back with results for everyone who is experiencing black screens please. Hope this is the fix.


----------



## nemm

Quote:


> Originally Posted by *Ukkooh*
> 
> Did you wipe the earlier drivers before installing the new ones? I also got this issue once with my 7970 but I got it fixed by wiping the drivers properly and reinstalling after that.


Indeed I did, always wipe old drivers before new release installation.

I am back onto beta 9.2 and no problem with downclock power saving - 10016
http://www.3dmark.com/3dm/1692471

This was the best I could get from beta 9.4 which required clocks locked with no down clock power saving - 9030
http://www.3dmark.com/fs/1191559

Both runs were clocked at 1235/1500 @ 1.42v


----------



## Ukkooh

Quote:


> Originally Posted by *Arizonian*
> 
> Looks like the update is posted today - 13.11 beta 9.4
> 
> http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
> 
> Includes all Feature Highlights of The AMD Catalyst 13.11 Beta9.2 Driver
> 
> May resolve intermittent black screens or display loss observed on some AMD Radeon™ R9 290X and AMD Radeon R9 290 graphics cards
> Improves AMD CrossFire™ scaling in the multi-player portion of Call of Duty®: Ghosts
> AMD Enduro Technology Profile updates:
> XCOM: Enemy Unknown
> Need for Speed Rivals
> 
> Will update OP later tonight. Post back with results for everyone who is experiencing black screens please. Hope this is the fix.


Some people are still getting black screens and I've seen a few reports of overdrive disappearing from CCC.


----------



## Sgt Bilko

Well, I'm going to stick with the WHQL driver for now. Same as DampMonkey, the only black screens I ever get are when I'm clocking the memory too high (in my case 1500+).

As long as I keep the mem clock down, it's all good for me


----------



## Technewbie

So far I haven't black screened with the new drivers, whereas on the last ones I black screened almost instantly after starting a game. However, I'm now getting about the same performance in BF4 with the 290X as I did with my 7970, sometimes a little higher and sometimes less.


----------



## Falkentyne

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Unfortunately for me it is not this way. That's what I assumed it would do when I first installed the card and started overclocking. I never even opened the ccc. Once I finally did and realized the power limit was not changing I had to enable it and slide it over then all was well for me.


Well, maybe you have to enable Overdrive, then disable it after making a change. Probably needs initialization? But the only black screens I ever had were caused by using Overdrive to overclock the memory









I ran FFIX benchmark for an hour then turned it off. Borderless, no black screens.


----------



## chiknnwatrmln

What kind of oc's are you guys getting?

I'm at 1185/5800 game+bench stable at 1.412 before droop. I can hit 1200 core but it artifacts past 65c.

I haven't maxed out the memory yet, but I was going for a higher core clock. Hopefully the Gelid can let me hit 1220+. On reference cooling now.


----------



## cyenz

My friend's 290X "black screen edition" is running strong, it seems; before, he could not play over 10 min without a black screen. If this fixes it completely, then I will regret changing mine for a Ti.

Still too early to call a victory.


----------



## Raephen

Quote:


> Originally Posted by *Scorpion49*
> 
> Oh man, Aquacomputer blocks got pushed back into December now. Anyone ever order from aquatuning.us? I'm going to try and cancel since technically its a "pre-order" and it doesn't look like I'll ever actually get the blocks.


Lmao! The bastards! I just checked my order at aquatuning.nl and indeed: 4-12, before that 25-11 and before that 21-11









I like those palm trees, but I'll ask around at HighFlow here in the NL about their ETA on the EK Acetal one.


----------



## metalion

Good evening all. It seems my P67 WS Revolution dislikes my new R9 290. I get BSODs sometimes (playing or watching a movie, sometimes a few seconds after turning on). I tried two PSUs, so that is not the cause. Is there a known issue that could help me get rid of these blue screens of death? Thank you in advance, all.


----------



## djsatane

Quote:


> Originally Posted by *nemm*
> 
> Just tried the latest drivers and all I can say is awful...
> 
> Overdrive was no longer in CCC so unable to set power to 50% which resulted FSE 99** score down to 80** with graphics score down by almost 20%.
> 
> Back to beta2 I go.


Beta 9.4 no longer has Overdrive in CCC? WHAT? Can anyone confirm this? How can that be?


----------



## Kuivamaa

Can't you just use the standalone overdrive program?


----------



## Kriant

I nominate myself as "the dumbest modder 2013" in my own list of nominations.

I've started playing AC4 and noticed that cards 1 and 2 were at 39C while card 3 was at 50C. So I QDC'ed the cards and took the third one apart, thinking "didn't screw it tight enough, or too much TP, or both," and to my shock and surprise: there was no thermal paste at all.







Apparently past me (aka yesterday me) was distracted by something and slapped one block on without putting any TP on whatsoever.

Well, at least now that's fixed, and I'm back to filling up the loop for the GPU difference.


----------



## selk22

Well, for me on the new drivers: in the Star Citizen Hangar, where I used to get fluctuating clock speeds, I'm now at a constant 1150 with zero fluctuations. So that's good... No black screens yet, but for me they are not common.


----------



## Technewbie

Quote:


> Originally Posted by *djsatane*
> 
> Beta v9.4 no longer have overdrive in CCC? WHAT? Can anyone confirm this? How can that be?



Looks to be true


----------



## MeneerVent

Is there any news on the aftermarket coolers for these cards yet?


----------



## djsatane

Why didn't they mention the lack of Overdrive in the driver change notes? That's a pretty big change... Is that an attempt to hide things from the user?

Also, does anyone know of any mirrors for the new drivers (amd_catalyst_13.11_betav9.4.exe)? The AMD site is very slow currently, and for some reason I get a weird problem where the download cuts off after 200 MB or so; it's the only site this happens from...


----------



## Scorpion49

Quote:


> Originally Posted by *Raephen*
> 
> Lmao! The bastards! I just checked my order at aquatuning.nl and indeed: 4-12, before that 25-11 and before that 21-11
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I like those palm trees, but I'll ask around at HighFlow here in the NL to ask their ETA on the EK Acetal one.


Email them, then, but don't use the web form; it apparently doesn't work. I just got this:
Quote:


> Dear Scorpion49!
> 
> we are pleased to be able to report that your order (No. 91014089) has been dispatched.
> 
> Aquatuning GmbH
> CEO: Mr. Nathanael Draht & Mr. Andreas Rudnicki
> USt.ID Nr.: DE252383793
> Tax number: 349/5701/2422 Finanzamt Bielefeld Außenstadt
> Trade register: HRB 38891 Bielefeld
> Address:
> Aquatuning GmbH
> Beckheide 13
> D-33689 Bielefeld (Dalbke)
> Legal Info: http://www.aquatuning.us/shop_content.php/coID/4


----------



## Raephen

Aye, I think I'll do that first, then..

So according to them your shipment has already been dispatched?


----------



## Scorpion49

Quote:


> Originally Posted by *Raephen*
> 
> Aye, I think I'll do that first, then..
> 
> So according to them your shipment has already been dispatched?


Yep. Amazing how it only updated after I emailed them though.

EDIT: Nevermind, they actually just cancelled the order.


----------



## jomama22

If you are missing CCC, you are clearly not uninstalling and installing drivers properly.


----------



## Scorpion49

Quote:


> Originally Posted by *jomama22*
> 
> If you are missing CCC, you are clearly not uninstalling and installing drivers properly.


I don't have any overdrive in my CCC either with the 9.4 betas.


----------



## jomama22

Well then, there you go. Uninstall using the included express uninstaller, then use DDU from Guru3D in safe mode. Unplug the internet so Windows can't install its own drivers. Then reinstall the latest drivers and you'll be good to go.


----------



## ReHWolution

Maybe I found something relevant to the black screens. I was benching with Unigine Heaven from HWBot; a run at default went pretty smooth, no black screens. Then I loaded my OC'd profile (1230/1500), and around stage 21 of 26 the bench would black screen on me... I tried six or seven times to complete it with my OC'd settings...
Eventually, I pulled out my Delta 92mm fan, and you know what? I cooled the back of the PCB... and I haven't had any black screens anymore. I played Batman: Arkham Origins for like 45 minutes without a single black screen. It seems like the back of the card can't dissipate the heat properly... Has anyone with a backplate (liquid cooled with an EKWB block) ever had black screens?


----------



## Lennyx

tpi2007 just posted in the other thread about the black screen issue. Looks like it's a hardware issue, and AMD has had the same problem on cards in the past. http://www.overclock.net/t/1441349/290-290x-black-screen-poll/80#post_21244483


----------



## djsatane

I noticed that amd_catalyst_13.11_betav9.4.exe is missing the AMD digital signature on its files, whereas amd_catalyst_13.11_betav9.2.exe had the AMD digital signature...


----------



## Xylene

I can run my 290 at 1150 MHz +75mV with no problems and no throttling on the stock cooler. I have the stock 200mm fan in the front of my 600T at the highest speed the built-in fan controller allows, and no hard drive cages. I run the fan at 60% on the card. BF3 @ 1440p ultra, and it never throttles and rarely hits 80C, usually mid 70s.


----------



## Raephen

Quote:


> Originally Posted by *Scorpion49*
> 
> Yep. Amazing how it only updated after I emailed them though.
> 
> EDIT: Nevermind, they actually just cancelled the order.


Wut wwut wut WUUUUT?!

I just posted a topic in their cs forum. We'll have to wait and see...


----------



## Scorpion49

Quote:


> Originally Posted by *Raephen*
> 
> Wut wwut wut WUUUUT?!
> 
> I just posted a topic in their cs forum. We'll have to wait and see...


Well, to be fair I asked if it would be possible to cancel the order. I think they took it literally, I was just trying to see what my options were. Oh well, I'll wait for paypal to get a refund and get the koolance blocks since I refuse to own any more EK.


----------



## NorcalTRD

Guys, use the 9.2 beta drivers, which still have AMD OverDrive.

Make sure to use DDU to uninstall the old drivers while in safe mode, and disable the internet so the PC can't search for new ones.
Then restart, install the beta drivers and CCC, and enjoy.


----------



## rdr09

Quote:


> Originally Posted by *Lennyx*
> 
> tpi2007 just posted in the other thread about the blackscreen issue. Looks like its hardware issue and amd have had the same problem on cards in the past. http://www.overclock.net/t/1441349/290-290x-black-screen-poll/80#post_21244483


I would not listen to someone who does not even own the hardware. Have you tested the new driver or what? Are you still getting black screens?


----------



## Technewbie

Quote:


> Originally Posted by *rdr09*
> 
> i would not listen to someone who does not even own the hardware. have you tested the new driver or what? are you still getting black screens?


I tested the new driver and it fixed the black screens for me. Before I couldn't even play BF4 at any clock for more than maybe 10 minutes and now i've been playing it for around 3 hours and haven't had a single issue.


----------



## rdr09

Quote:


> Originally Posted by *Technewbie*
> 
> I tested the new driver and it fixed the black screens for me. Before I couldn't even play BF4 at any clock for more than maybe 10 minutes and now i've been playing it for around 3 hours and haven't had a single issue.


glad it worked for you. seen any changes in settings inside CCC or AB? fan setting or something?


----------



## brinox

In case anyone was dying to know, the Arctic Cooling Accelero Xtreme 5870 heatsink *DOES NOT* bolt onto the R9 290/290X. I think with some very slight filing/dremeling it might work, but the work would have to be done on the PCB mounting holes, not the heatsink mounting bracket.

I'm also attempting to source a replacement mounting bracket for an Accelero Xtreme III to see if that will fit the bill.

In spite of all this, the Accelero Xtreme 5870 will actually fit on the R9 290/290X PCB once the mounting situation is... situated. I also happen to think it will fit more snugly on my R9 290 over the Xtreme III and take up less case space for those that are concerned with the Xtreme III/ R9 290/290X setup.


----------



## Raephen

Quote:


> Originally Posted by *Scorpion49*
> 
> Well, to be fair I asked if it would be possible to cancel the order. I think they took it literally, I was just trying to see what my options were. Oh well, I'll wait for paypal to get a refund and get the koolance blocks since I refuse to own any more EK.


Ah, that's how the cookie crumbles









According to Aquacomputer's own webshop, the estimated time for the block is some 10 days.

But then, ever since ordering my 290, I've grown fond of the word 'estimated'.


----------



## Technewbie

Quote:


> Originally Posted by *rdr09*
> 
> glad it worked for you. seen any changes in settings inside CCC or AB? fan setting or something?


CCC now has HydraVision in it (don't know what that is) and it no longer has Overdrive. As for AB, it's the same - no changes.


----------



## NorcalTRD

posted in wrong tab


----------



## CurrentlyPissed

Got my two XFX 290s in. @ 1335c/1500m

If I take the memory down to 1400~ I can get the cores to 1165 (This is +100mv)



Also, I tried running the memory info app - any idea why it's filled with blanks?


----------



## xxmastermindxx

Hey Arizonian, switch my info in the OP please









290CF with full cover EK blocks, CLU on the cores. Nice and chilly ~45C max.


----------



## xxmastermindxx

Hey guys, did a little searching, but anyone know offhand the reason for crap GPU usage like this? This is BF4, 2560x1440, almost everything maxed. 3770K at 4.8GHz, 13.11 Beta 9.2 drivers. It's a stuttery mess


----------



## dmasteR

Anyone know what the thermal limits are on the VRM's for the R9 290?

I remember seeing a graph of them throttling a little above 80C, but I can't seem to find it.


----------



## CurrentlyPissed

I believe that's because afterburner currently doesn't report correctly.

Someone feel free to correct me if I am wrong. But I believe that's what I read earlier in this thread.


----------



## Arizonian

Quote:


> Originally Posted by *xxmastermindxx*
> 
> Hey guys, did a little searching, but anyone know offhand the reason for crap GPU usage like this? This is BF4, 2560x1440, almost everything maxed. 3770K at 4.8GHz, 13.11 Beta 9.2 drivers. It's a stuttery mess


Are you using AB Beta 17?

If you have applied your overclock, go back into CCC and reset the Power Limit back to 50% again. Even though AB says it's 50%, it's actually not 50% in CCC.

Members have also noticed that whenever you set a new clock you need to reset the power limit in CCC as well, since it resets every time the driver does. Not sure if that will take care of it, but it's worth a try.


----------



## CurrentlyPissed

Quote:


> Originally Posted by *Falkentyne*
> 
> I don't know how to attach it to a sig. But here.
> 
> MemoryInfo 1005.zip 1101k .zip file


Anyone know what would cause this to report all blanks? Is it because I am in CFX?


----------



## Arizonian

Quote:


> Originally Posted by *CurrentlyPissed*
> 
> Anyone know what would cause this to report all blanks? Is it because I am in CFX?


One of our members noticed that you have to have GPU-Z open when you run that test in order to see the memory. You'll also only see the memory in GPU #1.


----------



## NorcalTRD

Possibly.
Click the link in my signature to get linked to the official downloads page for each and try re-running.


----------



## pkrexer

Quote:


> Originally Posted by *xxmastermindxx*
> 
> Hey guys, did a little searching, but anyone know offhand the reason for crap GPU usage like this? This is BF4, 2560x1440, almost everything maxed. 3770K at 4.8GHz, 13.11 Beta 9.2 drivers. It's a stuttery mess


I had stuttering issues with bf4 as well. The main thing that fixed it for me is unparking my cores. Just google core unparking.


----------



## CurrentlyPissed

Quote:


> Originally Posted by *Arizonian*
> 
> One of our members noticed that you have to have GPU-Z open when you run that test in order to see the memory. You'll also only see the memory in GPU #1.


Thanks that fixed it.


----------



## Neutronman

Quote:


> Originally Posted by *Raephen*
> 
> Uhmm... My Sapphire 290 already had a 47% fan profile bios when I bought it... So where's the upgrade in this replacement?


That was driver based... I'm talking about official BIOS support, not driver level...


----------



## CurrentlyPissed

Any idea why I'm only getting 60-70 FPS in League of Legends? It's in borderless windowed mode. But still. The Titan got 400+


----------



## NorcalTRD

Are your drivers up to date?
Double check the settings in LoL.
OH yeah, then uninstall LoL and start playing Dota 2 like a grown up


----------



## jerrolds

Quote:


> Originally Posted by *dmasteR*
> 
> Anyone know what the thermal limits are on the VRM's for the R9 290?
> 
> I remember seeing a graph of them throttling a little above 80C, but I can't seem to find it.


I think they're rated to 125C - I've personally seen mine hit 105C without artifacting/throttling before I shut it down manually - I'd prefer to have them around 85C or lower.

Starcraft 2 > all MOBAs


----------



## CurrentlyPissed

Quote:


> Originally Posted by *NorcalTRD*
> 
> Are your drivers up to date?
> Double check the settings in LoL.
> OH yeah, then uninstall LoL and start playing Dota 2 like a grown up


Nah, not a big fan of DotA. Was huge into it in WC3, but I have a lot more time invested into LoL (Diamond). Don't care to retool mechanics.


----------



## CurrentlyPissed

See this is what it looks like in LoL. I'm not even getting max clocks? And look at the spiking in usage for single card mode. And my cores are unparked. I use an app for it.


----------



## dmasteR

Quote:


> Originally Posted by *jerrolds*
> 
> I think they're rated to 125C - I've personally seen mine hit 105C without artifacting/throttling before I shut it down manually - I'd prefer to have them around 85C or lower.
> 
> Starcraft 2 > all MOBAs


Nevermind. Figured it out. They're rated at 150C


----------



## Adglu

Quote:


> Originally Posted by *CurrentlyPissed*
> 
> See this is what it looks like in LoL. I'm not even getting max clocks? And look at the spiking in usage for single card mode. And my cores are unparked. I use an app for it.
> 
> 
> Spoiler: Warning: Spoiler!


try to force max clocks through RadeonPro


----------



## Technewbie

Quote:


> Originally Posted by *CurrentlyPissed*
> 
> See this is what it looks like in LoL. I'm not even getting max clocks? And look at the spiking in usage for single card mode. And my cores are unparked. I use an app for it.


Aren't games like LoL more CPU intensive? So wouldn't it use less of your GPU since it doesn't need it?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Adglu*
> 
> try to force max clocks through RadeonPro


Pretty sure RP isn't supporting the 290 series yet.


----------



## CurrentlyPissed

Quote:


> Originally Posted by *Technewbie*
> 
> Aren't games like LoL more CPU intensive? So wouldn't it use less of your GPU since it doesn't need it?


To a point, yes. But why would the Titan get 400+ fps, and the 290 get 60-70? Sometimes it flutters from 40-107 back and forth, very rapid. And it's very, very annoying.


----------



## Adglu

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Pretty sure RP isn't supporting the 290 series yet.


There's a special build for the R9 series; I think it supports the 290 too.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Adglu*
> 
> There's a special build for r9, i think it supports 290 too


I heard it was in the works, didn't realize it was out. Thank you!!! I love me some Radeon Pro


----------



## NorcalTRD

Maybe it's Powertune?
If it doesn't need the power it might be cycling its usage to conserve.


----------



## DeadlyDNA

Got my water blocks, PSU, and extras in today. Waiting on my 4th card to make it here. I'm not official, but here is a pic for fun.


----------



## rv8000

Quote:


> Originally Posted by *CurrentlyPissed*
> 
> To a point, yes. But why would the Titan get 400+ fps, and the 290 get 60-70? Sometimes it flutters from 40-107 back and forth, very rapid. And it's very, very annoying.


Have you uninstalled Pando Media Booster? I've seen this program cause so much GPU havoc. It installs via the standard League installation. The first time I noticed something was when my GTX 680 started to whine while idling on the desktop; I checked GPU utilization and it was all over the place. Removed Pando after doing some investigative work, and no more sporadic GPU clocks/utilization on desktop/idle.


----------



## jerrolds

Quote:


> Originally Posted by *dmasteR*
> 
> Nevermind. Figured it out. They're rated at 150C


Can you link where it says that?


----------



## NorcalTRD

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Got my water blocks, PSU, and extras in today. Waiting on my 4th card to make it here. I'm not official, but here is a pic for fun.


Holy balls man, that's a lot of graphics processing power!
You're going four-way CrossFire?

I think without a CPU bottleneck you should show about 240 FPS in the Valley 1080p Extreme HD benchmark.


----------



## Arizonian

Quote:


> Originally Posted by *xxmastermindxx*
> 
> Hey Arizonian, switch my info in the OP please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 290CF with full cover EK blocks, CLU on the cores. Nice and chilly ~45C max.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats







Added:








Quote:


> Originally Posted by *DeadlyDNA*
> 
> Got my water blocks, PSU, and extras in today. Waiting on my 4th card to make it here. I'm not official, but here is a pic for fun.
> 
> 
> Spoiler: Warning: Spoiler!


Cool - I'll hold off on adding you then. Hope you're going to bench that setup and represent us in the Quads.


----------



## Technewbie

Quote:


> Originally Posted by *CurrentlyPissed*
> 
> To a point, yes. But why would the Titan get 400+ fps, and the 290 get 60-70? Sometimes it flutters from 40-107 back and forth, very rapid. And it's very, very annoying.


Could it be that it isn't going into 3D mode so it is staying down at lower clock speeds?


----------



## Forceman

Quote:


> Originally Posted by *CurrentlyPissed*
> 
> See this is what it looks like in LoL. I'm not even getting max clocks? And look at the spiking in usage for single card mode. And my cores are unparked. I use an app for it.


Disable voltage monitoring in Afterburner (you can leave voltage control on). It can cause those spikes.


----------



## DeadlyDNA

Quote:


> Originally Posted by *NorcalTRD*
> 
> holy balls man, thats alot of graphics processing power!
> Your going four way crossfire?
> 
> I think without a cpu bottleneck you should show about 240FPS in Valley 1080P Extreme HD benchmark.


Pretty sure I will be CPU limited. I currently only have an i7 3820. I am getting the WB for it tomorrow, so I hope to OC it more as well.
I run Eyefinity Portrait, so I'm hoping for good scaling on the cards. First time going red team for my main gaming rig. I am a little
worried about bugs and problems but hoping for the best. I will definitely post benchmark scores and stuff when I get it all on water.


----------



## Sazz

Ok, I got my Sapphire R9 290 and I have done the unlocking. It was able to unlock to a 290X, and I ran some benchmarks to see if the increased performance is from the clocks or if it's really unlocked.

My methodology: I ran 7 3DMark Firestrike runs. I used a custom fan profile, putting the card at +10% board power limit and 72% max fan speed at 80C, which keeps the card flat at 90C, so I don't encounter any throttling.

The first Firestrike run is a "warm up" run which I did not record, since its sole purpose is to warm the card up. It is followed right away by 6 Firestrike passes: the first 3 at 947MHz and the other 3 at 1000MHz.

This method is done with both BIOSes (stock Sapphire 290 and Sapphire 290X BIOS), and I took the highest graphics score on each BIOS at both 947 and 1000MHz.

The first batch of screenshots that follows is with the stock Sapphire 290 BIOS (look at the graphics score, not the overall score; Firestrike's combined score seems to have a big margin of error in every run compared to looking at each separate score).

*Stock 290 BIOS @ 947Mhz Firestrike Graphics score 9932pts*


*Stock 290 BIOS @ 1000Mhz Firestrike Graphics score 10377*


The following are the screenshots of scores with the Sapphire 290X BIOS.

*290X BIOS @ 947Mhz Firestrike Graphics score 10025*


*290X BIOS @ 1000Mhz Firestrike Graphics score 10467*


To summarize and make it easier for you guys to compare, here's an easier way to view them:

                     | 290 Stock BIOS | 290X BIOS
Firestrike @ 947MHz  | 9932 pts       | 10025 pts
Firestrike @ 1000MHz | 10377 pts      | 10467 pts

There seems to be a gain, but not a lot.

What are your thoughts guys?

I switched to a 290 and sold my 290X for 550 bucks to my friend. So far with overclocking I can reach the same core clocks as my previous 290X - 1100MHz at stock voltage - while on memory I got a LOT more OC headroom on this 290: the 290X maxed out at 1425MHz memory stable, while this 290 maxes out at 1520MHz (this is while using the stock 290 BIOS).

Imma go test overclocking now to see how high I can get on this 290.
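For anyone wanting to put numbers on the comparison above, here's a quick back-of-the-envelope sketch - it's just arithmetic on the Firestrike graphics scores posted in this thread, nothing card-specific:

```python
# Percent gain of the 290X BIOS over the stock 290 BIOS at matched clocks,
# using the Firestrike graphics scores quoted in the post above.
def pct_gain(stock: int, unlocked: int) -> float:
    """Percentage improvement of `unlocked` over `stock`."""
    return (unlocked - stock) / stock * 100

scores = {
    "947MHz":  (9932, 10025),
    "1000MHz": (10377, 10467),
}

for clock, (stock, unlocked) in scores.items():
    print(f"{clock}: +{pct_gain(stock, unlocked):.2f}%")
# Both matched-clock gains come out under 1%, which lines up with
# "there seems to be a gain, but not a lot."
```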


----------



## grandpatzer

Quote:


> Originally Posted by *jomama22*
> 
> 76% and up were the best but only above 1.35v actual. If sticking with the Asus.ROM bios, it makes little difference.
> 
> The highest stock bios oc I had was 1160 (no voltage changes) and that was the 73.8%. On air.
> 
> As luck would have it, my best card was the very first one I tried, lol. I have one with a better core, but its stuck with elpida, while the other is hynix. A shame really.


when you say *1.35v actual*, is that after Vdroop?
How do you measure this value - do you look at MAX VDDC in GPU-Z?

What did your above-76% ASIC card overclock to on core/memory at this voltage?

I tried my 290 at +100mv, +50% power with fan @ 100% and it seems to have passed Valley @ 1200 core / 1500 memory. However, the CPU is weak; I'm waiting for my waterblock so I can put this stuff in the main rig (i5 2500k).


----------



## grandpatzer

Quote:


> Originally Posted by *Adglu*
> 
> try to force max clocks through RadeonPro


How does that work? I just realized there is Overdrive in RadeonPro.








I only use RadeonPro to have Dynamic Frame Control locked at a certain fps for smoother gameplay; my RadeonPro is 1.1.10.

I use MSI AB for my OC, always.


----------



## chiknnwatrmln

Interesting - increasing my 290's memory clock from 6000 to 6500 yields less than 40 points in 3DMark11. Graphics score is at 17436 for reference.

Anyway, I thought I asked earlier but I guess I never actually posted: what OCs are you guys getting?

Currently benching at 1200/6500 @ 1.26V after Vdroop. I start to get slight artifacts above 63C; hopefully once I put on my Gelid I'll stay under that point and be able to OC more.

The reference cooler sucks and is loud.


----------



## NorcalTRD

Chiknnwatrmln, I'm pretty sure you're stating your memory clock wrong.
I'm at 1060 core clock and 1350 memory, for example.


----------



## the9quad

Quote:


> Originally Posted by *NorcalTRD*
> 
> Chiknnwatrmln Im pretty sure your stating your memory clock wrong.
> Im at 1060 core clock and 1350 memory for example.


4*1350=5400

so 4x1500=6000


----------



## NorcalTRD

Yeah, that's what I meant lol


----------



## chiknnwatrmln

I'm at 6500 MHz effective, or 1625 MHz.

1625 x 4 = 6500 MHz
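To save confusion between the two conventions being mixed in the posts above: monitoring tools report the real GDDR5 memory clock, while the "effective" figure is 4x that because GDDR5 transfers data four times per clock. A trivial converter, just illustrating the arithmetic:

```python
def effective_gddr5(real_mhz: float) -> float:
    """GDDR5 moves data 4x per clock, so the 'effective' rate
    quoted in specs is four times the real memory clock."""
    return real_mhz * 4

print(effective_gddr5(1250))  # stock 290/290X memory: 5000 MHz effective
print(effective_gddr5(1350))  # NorcalTRD's OC: 5400 MHz effective
print(effective_gddr5(1625))  # chiknnwatrmln's OC: 6500 MHz effective
```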


----------



## centvalny

Powercolor 290 locked; Newegg's Tenn. warehouse gets stock this week.



http://imgur.com/OGsnRqw



Cpu & gpu (asus bios PT3) h20



http://imgur.com/NGLlZcK


----------



## rv8000

Quote:


> Originally Posted by *centvalny*
> 
> Powercolor 290 locked; Newegg's Tenn. warehouse gets stock this week.
> 
> 
> 
> http://imgur.com/OGsnRqw
> 
> 
> 
> Cpu & gpu (asus bios PT3) h20
> 
> 
> 
> http://imgur.com/NGLlZcK


Mind sharing a direct link to that result?


----------



## grandpatzer

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Interesting, increasing my 290's memory clock from 6000 to 6500 yields less than 40 points in 3dMark11. Graphics score is at 17436 for reference.
> 
> Anyway, I thought I asked earlier but I guess I never actually posted; what OC's are you guys getting?
> 
> Currently benching at 1200/6500 @ 1.26 after Vdroop. I start to get slight artifacts after 63c, hopefully once I put on my Gelid I'll stay under that point and be able to OC more.
> 
> Reference cooler sucks and is loud.


From the looks of it, there is little performance gain OCing memory over 1500.


----------



## Falkentyne

Quote:


> Originally Posted by *Raephen*
> 
> I can't comment on the games you mentioned.
> 
> I can, however, say I've seen similar behavior in Skyrim (I just play that a lot atm... I thought ultra settings with my old HD7870 looked awesome - who could've guessed there could be such a difference between ultra and uktra...)
> 
> I play with the Afterburner OSD on, and I've seen instances of the clockspeed dropping down to half of wat it was - but that's on load screens, so not that inexplicable.
> 
> No, the thing that strikes me as odd is how my GPU ussage can drop to 0% for multiple - really, multiple multiple - seconds yet the game is silky smooth. Has anyone noticed any visual effects from bouncing clocks?
> 
> Another weird thing I've noticed is in the AB graphs. the GPU usage graph is usually pretty low, tight about up until th epoint ingame that I open the menu and exit. That seems to cause a bit of a spike in GPU usage...
> 
> Odd, I would have thought flying around Skyrim and having dragons breathe fire down my neck would be more graphically taxing than a menu.
> 
> Another thing to note is I never reach full clocks in game. In Valley I do reach the full 1000MHz I've got my Sapphire 290 at, but maybe because Skyrim is an older, less taxing DX 9 / 10 game, Powerune decides it doesn't need full clocks.
> 
> I don't know, really, but so far, in Skyrim at least, I've suffered no ill effects from the bouncing clocks (btw: in game ussually between 700ih and 847 (yes 847, not 947). And I can rule out throttling: in Skyrim the core maxes around 65 C and in Valley Extreme HD just about high 80's with the VRM1 - the hottest of the two in my case - mostly 10 to 15 degrees hotter.
> 
> I'm no expert, but I've got a gut feeling this odd behavior / thes eodd readings could be driver / software related.
> 
> I have no issues yet with 9.2 Beta yet, so I'll just wait and see what the next real update brings.


I THINK I can explain EXACTLY what is happening.

THE REASON YOU ARE NOT AT FULL CLOCKS - this applies to everyone - is because the GPU is not being fully used. Basically, when you are CPU limited, I think it drops the clocks to save power.
I tested this in Unreal Tournament 2004 really quickly. When I ran 1920x1080 with 8x anti-aliasing at 1100/1400 on the clocks, I noticed that whenever I got gibbed in a 1v1 deathmatch, the GPU clock rate would sometimes drop as low as 700 MHz for a short time, and the same happened during certain parts of the map or when the bot appeared. That's because explosions like that are calculated by the CPU. I also noticed when panning around a lot of areas, I saw many clock drops, as low as 950-1050 MHz, then rising back.

I then started it with 8x SUPERSAMPLING AA, and then the clocks remained FROZEN at 1100 MHz, with momentary dips to 1099 MHz which are not important. The entire match was at 1100 MHz (ignoring the 1099 MHz) because the GPU required full power. Might explain some things.
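The intuition above can be sketched as a toy model. To be clear, this is purely illustrative - the function and its scaling are invented for this post and are NOT AMD's actual PowerTune algorithm; the only idea it captures is "clocks follow GPU demand, down to an idle floor":

```python
def toy_powertune_clock(utilization: float, base_mhz: int = 1100,
                        floor_mhz: int = 300) -> int:
    """Illustrative only: scale the core clock with GPU demand,
    never dropping below the idle floor. Not AMD's real algorithm."""
    utilization = max(0.0, min(1.0, utilization))  # clamp to [0, 1]
    return max(floor_mhz, int(base_mhz * utilization))

# CPU-limited scene (GPU half idle) vs. supersampled scene (GPU pegged):
print(toy_powertune_clock(0.65))  # clocks sag well below the 1100 MHz target
print(toy_powertune_clock(1.0))   # stays pinned at 1100 MHz
```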


----------



## Sazz

Any Powercolor 290/290X users here? My friend has been thinking of buying a second 290 to pair with his other 290 - an XFX that was unlocked to a 290X. I see that so far all the Powercolor 290 users in the 290 unlock thread have been able to unlock (a total of 14 users).

My question is: does Powercolor put void stickers on the screws or anywhere else? He's putting them on watercooling too once he gets both.


----------



## NorcalTRD

I have a Powercolor card and didn't notice any stickers.
Regardless of brand, if you do anything to a card besides insert it into your PCI Express slot, you're voiding your warranty.
And don't think they won't know that you have tinkered with the card - they will.


----------



## Falkentyne

Quote:


> Originally Posted by *CurrentlyPissed*
> 
> See this is what it looks like in LoL. I'm not even getting max clocks? And look at the spiking in usage for single card mode. And my cores are unparked. I use an app for it.


No idea. I get 120 fps with vsync on, flat, at 1920x1080, 8x adaptive AA...(supersampling causes fps drops during fights at that setting, as it also did on my 7970).


----------



## brazilianloser

So, probably a dumb question... but this is after all my first AMD card and I am just so used to EVGA Precision... Anyway, I got my core going at 1100 on air (BF4 temps 80C) with a bump on the core voltage of 25mV. But here is my thing: I do not have the option to modify the memory voltage in my AB Beta 17. The card is an ASUS 290. Overclocking the core is pretty straightforward and I am sticking with 1100 until I come around to putting this thing under water, but what about the memory? Any increase from the stock 1250 causes 3DMark to crash. Any help?

Is the Aux Voltage or the Power Limit responsible for allowing me to OC the memory clock?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *brazilianloser*
> 
> So probably dumb question... but this is after all my first AMD card and I am just so used to EVGA Precision... anyways... I got my core going at 1100 on air (BF4 temps 80c) with a bump on the core voltage of 25mV. But here is my thing, I do not have the option to modify the memory voltage in my AB Beta 17. The card is an ASUS 290. The overclocking on the core is pretty straight foward and I am sticking with 1100 until I come around to put this thing under water but what about the memory? Any increase from the stock 1250 causes 3Dmark to crash. Any help.


Use GPUTweak to modify memory voltages.

It also lets you raise core voltage to 1.412.


----------



## Derpinheimer

Anyone know how much performance difference there is between a good-overclocking 7950 and a 290?

I've got my 7950 under water and run it at 1275/1750. I can't really tell how much OC potential to expect from the 290 under water. Is it similar? Or will the fact that my 7950 overclocks pretty well reduce the gains considerably?

Thanks


----------



## brazilianloser

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Use GPUTweak to modify memory voltages.
> 
> It also lets you raise core voltage to 1.412.


Hmm, I am not a big fan of GPU Tweak; I first tried it with the initial card that was giving me black screens. And I do not think it would be a good idea to use both AB and GPU Tweak concurrently? So no way of doing such on the current AB, I take it?


----------



## centvalny

Quote:


> Originally Posted by *rv8000*
> 
> Mind sharing a direct link to that result


Here's:







http://www.3dmark.com/3dm11/7540546



http://imgur.com/0MoPpLx


Rig



http://imgur.com/0oXFSdE


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Derpinheimer*
> 
> Anyone know how much performance difference there is between a good-overclocking 7950 and a 290?
> 
> I've got my 7950 under water and run it 1275/1750. I cant really tell how much OC potential to expect from the 290 under water. Is it similar? Or will the fact my 7950 overclocks pretty well reduce the gains considerably?
> 
> Thanks


I had a 7950 clocked at 1150/1700 and it had a 3DMark11 graphics score of around 10.7k on performance iirc. What is your 3DMark11 performance graphics score?

That was air cooled, my current ref cooled 290 @ 1200/6500 gets 17436 in the same bench.

Massive performance difference. Hawaii is temp sensitive (at 1200 I artifact over 62c) so watercooling will only help more.
Quote:


> Originally Posted by *brazilianloser*
> 
> Hmm I am not a big fan of GPU Tweak when I first tried with the initial card that was giving me black screens. And I do not think it would be a good idea to use both AB and GPU tweak correctly? So no way of doing such on the current AB I take?


I honestly hate the program; but MSI AB's power limit doesn't work for me, and I want to crank the voltage so I use GPUTweak.

They will not work simultaneously.

They're similar programs, except GPUTweak has a clunky UI.


----------



## Redvineal

Wanna see something crazy?

I wasn't at all happy with the VRM1 temps (~91C) the Accelero Xtreme III was producing with the rinky-dink heatsinks on my XFX R9 290 (does NOT unlock, btw).

As a matter of fact, the heat caused the thermal pads to fail and all 3 heatsinks fell off! I didn't want to go with the permanent glue, and here's why I'm glad I didn't:


Spoiler: Warning: Spoiler!













What you see there is the result of 2 hours of quality time spent with a hacksaw! Yes, a HACKSAW. I don't own a rotary tool, and my local hardware stores were closed when I started around 9:30 PM.

Essentially, I took inspiration from some folks at Overclockers UK for the VRM1 heat and got to work on the stock heatsink! The back plate covering VRM1 was the easy part (thanks OCUK). On the other hand, the VRM2 plate up front was much more difficult. I had to work my way through the black plate, the copper plate, and finally the aluminum fins. Once I got the VRM2 cover separated from the rest of the sink, I used a pair of pliers to individually tear off the aluminum fins.

Told you it was crazy!









Now here comes some gritty:


Spoiler: Warning: Spoiler!



Furmark 10 Minute Run 1

Core: 1100
Memory: 1300
Power: +50
mV: +0

GPU Max: 57C
VRM1 Max: 65C
VRM2 Max: 50C



Furmark 10 Minute Run 2

Core: 1150
Memory: 1500
Power: +50
mV: +100

GPU Max: 65C
VRM1 Max: 78C
VRM2 Max: 56C





Pretty sweet results, right? If you've made it this far, thanks for checking out my work!









I'd really like to know what you other R9 290(X) owners think of this Frankenstein of a mod, the results, and what other benches would be good to tax this monstrosity!

Also, if anyone is interested in more details about how I wound up with the VRM2 plate you see in the pics (where to cut, etc), I'll be happy to share.

Edit: Forgot to mention my idles
GPU: 35C
VRM1: 31C
VRM2: 33C


Spoiler: Warning: Spoiler!


----------



## CurrentlyPissed

Well I tried all the suggestions in this thread.

Using Radeon Pro

Uninstalling Pando Media Booster

Turning off Voltage Control

Still only getting 60-70 fps with dips into the 30s, and it just FEELS slow. Even when I'm getting 70 FPS it feels like 20.


----------



## djsatane

So, since AMD silently removed Overdrive from CCC with this latest beta driver, what's the best program to view and adjust power limit/fan stuff? Afterburner?

I wonder why AMD removed Overdrive in the latest beta driver that's supposed to fix black screens... did something tied to Overdrive in CCC cause problems when altered? Hmm.


----------



## hatlesschimp

Quote:


> Originally Posted by *Sazz*
> 
> any Powercolor 290/290X users here? my friend has been thinking of buying a second 290 to pair with his other 290 XFX that was unlocked to 290X, I see that the powercolor 290's so far all users have on the 290 unlock thread has been able to unlock (Total of 14 users).
> 
> My question is *do powercolor put void stickers on the screws* or anywhere else coz he's putting them on watercooling too once he gets both.


No, they don't. Here is a picture of my Powercolor 290X.


----------



## youngmanblues

interesting









http://www.sapphiretech.com/presentation/media/media_index.aspx?psn=0004&articleID=5453&lid=1


----------



## Sazz

Quote:


> Originally Posted by *youngmanblues*
> 
> interesting
> 
> http://www.sapphiretech.com/presentation/media/media_index.aspx?psn=0004&articleID=5453&lid=1


The Sapphire 290 that I just received today already had the BIOS with 47% max fan speed.


----------



## nemm

Quote:


> Originally Posted by *youngmanblues*
> 
> interesting
> 
> http://www.sapphiretech.com/presentation/media/media_index.aspx?psn=0004&articleID=5453&lid=1


Fan profile improvements are not what watercooling people care much about. I just hope it offers more than that, else there's no point in using it.


----------



## BIG MICHO

Quote:


> Originally Posted by *Sazz*
> 
> Ok, I got my Sapphire R9 290 and I have done the unlocking. It was able to unlock to a 290X, and I ran some benchmarks to see if the increased performance is from the clocks or if it's really unlocked.
> 
> My methodology: I ran 7 3DMark Firestrike runs. I used a custom fan profile, putting the card at +10% board power limit and 72% max fan speed at 80C, which keeps the card flat at 90C, so I don't encounter any throttling.
> 
> The first Firestrike run is a "warm up" run which I did not record, since its sole purpose is to warm the card up. It is followed right away by 6 Firestrike passes: the first 3 at 947MHz and the other 3 at 1000MHz.
> 
> This method is done with both BIOSes (stock Sapphire 290 and Sapphire 290X BIOS), and I took the highest graphics score on each BIOS at both 947 and 1000MHz.
> 
> The first batch of screenshots that follows is with the stock Sapphire 290 BIOS (look at the graphics score, not the overall score; Firestrike's combined score seems to have a big margin of error in every run compared to looking at each separate score).
> 
> *Stock 290 BIOS @ 947MHz Firestrike Graphics score 9932pts*
> 
> 
> *Stock 290 BIOS @ 1000MHz Firestrike Graphics score 10377*
> 
> 
> The following are the screenshots of scores with the Sapphire 290X BIOS.
> 
> *290X BIOS @ 947MHz Firestrike Graphics score 10025*
> 
> 
> *290X BIOS @ 1000MHz Firestrike Graphics score 10467*
> 
> 
> To summarize and make it easier for you guys to compare:
> 
>                      | 290 Stock BIOS | 290X BIOS
> Firestrike @ 947MHz  | 9932 pts       | 10025 pts
> Firestrike @ 1000MHz | 10377 pts      | 10467 pts
> 
> There seems to be a gain, but not a lot.
> 
> What are your thoughts, guys?
> 
> I switched to a 290 and sold my 290X for 550 bucks to my friend. So far with overclocking I can reach the same core clocks as my previous 290X - 1100MHz at stock voltage - while on memory I got a LOT more OC headroom on this 290: the 290X maxed out at 1425MHz memory stable, while this 290 maxes out at 1520MHz (this is while using the stock 290 BIOS).
> 
> Imma go test overclocking now to see how high I can get on this 290.


Well i went to the 3dMark website and started comparing my scores to others. Unfortunately i think you and i are in the same boat. our CPU's are the weak links. Somebody correct me if I'm wrong but because physics is dependent on our CPU and memory, we get overall low scores with our AMD set ups. here are some of my scores with single dual and triple CFX. you see that the graphics test get better, however the physics score stay about the same or get worse.

Single (sorry, no image, but here are my results):
http://www.3dmark.com/fs/1192543

Dual CFX



Triple CFX
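The pattern above (Graphics rising with each card while Physics stays flat) matters because 3DMark folds the sub-scores into the overall Fire Strike number with a weighted harmonic mean, which drags the total toward the weakest sub-score. A minimal sketch - the 0.75/0.15/0.1 weights are my assumption based on the public 3DMark technical guide, and the sample scores are made up:

```python
def fire_strike_overall(graphics, physics, combined,
                        weights=(0.75, 0.15, 0.1)):
    """Weighted harmonic mean of the three sub-scores.

    The weight values are illustrative assumptions, not official numbers.
    """
    subs = (graphics, physics, combined)
    return sum(weights) / sum(w / s for w, s in zip(weights, subs))

# A second GPU lifts the Graphics and Combined scores, but the
# CPU-bound Physics score stays put, so the total scales poorly:
one_card = fire_strike_overall(10000, 9000, 4500)
two_cards = fire_strike_overall(18000, 9000, 8000)
```

A harmonic mean always sits at or below the plain weighted average, so a weak Physics (CPU) score costs more overall points than a simple sum would suggest.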


----------



## hatlesschimp

My stock CF 290x score

Fire Strike Extreme = 8787

http://www.3dmark.com/fs/1188889


----------



## smonkie

I got a PowerColor 290X OC and everything seems OK while playing, but it can't keep the 300/150 MHz idle frequencies on the desktop. It's constantly spiking, and things get worse when I open a video file, because the memory frequency is always at full speed and the card reaches 60-62°C.

I've tried three different Catalyst versions (13.10 WHQL and 13.11 beta 2 - 4), but no luck yet. Any ideas?


----------



## Spectre-

Quote:


> Originally Posted by *smonkie*
> 
> I got a Powercolor 290X OC and everything seems ok while playing, but It can't keep 300/150 frequencies in desktop. It's constantly spiking, and things get worse when I open a video file, because memory frequency is always at full usage and card reaches 60-62º.
> 
> I've tried three diferent catalyst versions (13.10 whql and 13.11 beta 2 - 4), but no luck yet. Any ideas?


If you are watching videos, then those spikes would occur,

but they really shouldn't otherwise. My R9 290 is @ 300/1250 all the time (running 3 monitors).


----------



## smonkie

Quote:


> Originally Posted by *Spectre-*
> 
> but they really shouldnt my R9 290 is @ 300/1250 all the time


Yeah, some spikes here and there may occur, but nothing like what happens to mine. These are spikes while watching a video (even a poor-quality one):


----------



## Prospector

I had the memory clock at full speed when I connected the card to the monitor via HDMI. After switching to a DVI cable, I always get 150 MHz on the desktop.


----------



## smonkie

I use DVI too.


----------



## zpaf

Nice boost from stock.


----------



## Spectre-

Quote:


> Originally Posted by *Prospector*
> 
> I had full usage of memory clock when i connect the card with the monitor via Hdmi. After i use Dvi cable,i have always 150mhz in desktop


It shouldn't make a difference; my brother's R9 290 uses HDMI and idles at 300/150.


----------



## Kuivamaa

Quote:


> Originally Posted by *djsatane*
> 
> So since AMD silently removed overdrive in CCC with this latest beta driver, whats the best program to view and adjust power limit/fan stuff? Afterburner?
> 
> Wonder why AMD removed overdrive in latest beta driver that supposed to fix black screens.... did something tied to overdrive in ccc when altered caused possible problems? hmm


Even if OverDrive were removed from CCC (which I don't think is the case), it also exists as a standalone program, you know.

http://sites.amd.com/us/game/downloads/amd-overdrive/Pages/overview.aspx


----------



## fewohfjweoifj

Did the new drivers fix the issue for a lot of people or is the situation still mostly the same? Hopefully AMD gets their act together and fixes this for everyone.


----------



## JordanTr

Wanna join club










http://imageshack.us/photo/my-images/14/ep52.png/

Uploaded with ImageShack.us

That overclock I just put in in 5 seconds, just at random, and it gave a 10% performance increase in Unigine Heaven. I don't want to play with OC before the aftermarket cooler. I wanted your opinion: should I get the Gelid Icy Vision v2 and put Scythe Gentle Typhoons on it, or just go for the Accelero Xtreme III (I don't like that scenario, because I would have to buy extra accessories)? With the Gelid, everything comes together. Max VRM1 was 55°C, VRM2 65°C. GPU max in Heaven was 75°C, but it's really loud - the fan was at 66%. P.S. Not a bad ASIC, I guess.







I didn't get any issues using the newest 9.4 beta. I came from GTX 660 SLI: went into Safe Mode, uninstalled the drivers from Control Panel, used a driver cleaner to clean everything, booted up with no internet, installed the AMD drivers, and then reconnected the internet. I'll try some more benching to check for black screens before ordering an aftermarket cooler. I get coil whine at idle too, but it's not that bad - tomorrow I'm turning 27, so maybe I'll stop hearing it.


----------



## Spectre-

Quote:


> Originally Posted by *JordanTr*
> 
> Wanna join club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://imageshack.us/photo/my-images/14/ep52.png/
> 
> Uploaded with ImageShack.us
> 
> That overclock i just put in 5s, just random, gave 10% performance increase in unigine heaven. Dont want to play with OC before aftermarket cooler. Wanted your opinion, should i get gelid icy vision v2 and put Scythe Gentle Typhoons on it or just go for accelero extreme III (i dont like this scenario, cause i will have to buy extra accesories i dont know ). With gelid everything comes together. max VRM 1 was 55, VRM2 65. GPU on heaven max 75, but its really loud, fan was on 66%. P.S. Not bad ASIC i guess


i am grabbing the accelero 3 as soon as they are in stock here


----------



## smonkie

Here's after watching a YouTube video:



It's really annoying having the memory at full usage while watching any kind of video.









I've installed an ASUS BIOS, but it didn't help. Any ideas?


----------



## sugarhell

Disable hardware acceleration in Flash Player.


----------



## macamba

My Sapphire R9 290 didn't have a single black screen with the 9.2 beta drivers, until I turned OverDrive on and set the power limit to +50% (clocks at stock).
Without that, my card throttled at stock speeds.

Now, with the 9.4 beta driver, they didn't include OverDrive. It seems to me that OverDrive is related to the black screen problem.
Too bad the power limit isn't working for me in Afterburner. It might be that the card can't handle more power correctly.


----------



## fewohfjweoifj

I was getting black screens before I had even touched Overdrive settings, so it can't be that.


----------



## macamba

It can still be a power problem; some cards run into trouble at less power than others.
Bad capacitors?


----------



## sugarhell

People get black screens at stock. One friend still has OverDrive with the new drivers. Also, it's not a hardware problem; the 7970 had almost the same problem at release.


----------



## nemm

OverDrive definitely appears to have been stealth-removed. It is still present if you don't remove the old driver before installing the new one, but nothing you set gets applied. After a clean install, OverDrive is completely gone, which leads me to believe PowerPlay is at fault for the black screens. With this going on, setting clocks in AB makes no difference: the clocks bounce around, causing terrible performance. The only workaround, other than installing the separate OverDrive program, was to have AB run at max clocks, but even then performance was shy of the previous 9.2 drivers.
I'm not happy running 1200/1450 @ 1.42 V on the desktop (thank you, AMD), so until it's sorted I'll stick with 9.2. After all, I found a method that seems to stop the black screens and hard locks: don't be too conservative with voltage, but don't go over the top either. I can run any bench or test at +200 mV for 1200/1450 problem-free, but eventually I get a failure when gaming, so I use +227 mV and voila, no more failures.


----------



## Falkentyne

Quote:


> Originally Posted by *Kuivamaa*
> 
> Even If overdrive is removed from CCC (which I don't think is the case), it actually exists as a standalone program you know.
> 
> http://sites.amd.com/us/game/downloads/amd-overdrive/Pages/overview.aspx


That's for an AMD platform, and it's for overclocking the CPU/memory, NOT the graphics card!


----------



## Raephen

Quote:


> Originally Posted by *Redvineal*
> 
> Wanna see something crazy?
> 
> I wasn't at all happy with the VRM1 temps (~91C) the Accelero Xtreme III was producing with the rinky-dink heatsinks on my XFX R9 290 (does NOT unlock, btw).
> 
> As a matter of fact, the heat caused the thermal pads to fail and all 3 heatsinks fell off! I didn't want to go with the permanent glue, and here's why I'm glad I didn't:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What you see there is the result of 2 hours of quality time spent with a hacksaw! Yes, a HACKSAW. I don't own a rotary tool, and my local hardware stores were closed when I started around 9:30 PM.
> 
> Essentially, I took inspiration from some folks at Overclockers UK for the VRM1 heat and got to work on the stock heatsink! The back plate covering VRM1 was the easy part (thanks OCUK). On the other hand, the VRM2 plate up front was much more difficult. I had to work my way through the black plate, the copper plate, and finally the aluminum fins. Once I got the VRM2 cover separated from the rest of the sink, I used a pair of pliers to individually tear off the aluminum fins.
> 
> Told you it was crazy!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now here comes some gritty:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Furmark 10 Minute Run 1
> 
> Core: 1100
> Memory: 1300
> Power: +50
> mV: +0
> 
> GPU Max: 57C
> VRM1 Max: 65C
> VRM2 Max: 50C
> 
> 
> 
> Furmark 10 Minute Run 2
> 
> Core: 1150
> Memory: 1500
> Power: +50
> mV: +100
> 
> GPU Max: 65C
> VRM1 Max: 78C
> 
> 
> 
> 
> 
> 
> 
> 
> VRM2 Max: 56C
> 
> 
> 
> 
> 
> Pretty sweet results, right? If you've made it this far, thanks for checking out my work!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'd really like to know what you other R9 290(X) owners think of this Frankenstein of a mod, the results, and what other benches would be good to tax this monstrosity!
> 
> Also, if anyone is interested in more details about how I wound up with the VRM2 plate you see in the pics (where to cut, etc), I'll be happy to share.
> 
> Edit: Forgot to mention my idles
> GPU: 35C
> VRM1: 31C
> VRM2: 33C
> 
> 
> Spoiler: Warning: Spoiler!


Good job, man!

If Aquatuning keeps postponing the waterblock, I might consider something like this (that is, if the AX III ever comes back in stock here in the NL).

I wonder, do you have a picture of the heatsink + alu fins as they were before you cut them up?


----------



## macamba

If you do a clean install on a fresh Windows, you don't have OverDrive in CCC (a friend of mine did a fresh install: no OverDrive).
The same happens if you clean up your old drivers completely (AMD/ATI registry entries).

I installed 9.4 over the 9.2 beta first and I still had OverDrive, but it wasn't working as it should, so I uninstalled every single AMD thing on my PC (registry etc.) and reinstalled 9.4. After that, no more OverDrive.


----------



## cyenz

One thing my friend noticed is that the GPU-Z readings of the 12 V line are higher than on previous drivers. Before, it would idle at 11.4 V; now it idles at 11.75 V and drops to 11.5 V when gaming. Still on the low side, but the truth is I still haven't had a black screen since these new drivers. AMD is on the right track.

Another thing, as already stated in the poll thread, is that GPU usage is pretty much constant now; before, it would bounce between 0 and 100% pretty quickly.


----------



## JordanTr

I read an article saying that everyone who bought an R9-series GPU would be able to get BF4 for free. How do I do that? Or is it not quite true? Also, why should I enable or disable PowerPlay in AB? Or should I leave unofficial overclocking disabled? My core moves between 947 and 1050 MHz a bit, but most of the time it stays at 1050 and never goes below 947.


----------



## lethal343

Hey guys, what does everybody reckon of the new beta 13.11 v9.4 drivers?
Has it fixed the black screen problem?
Also, what program should I use to completely remove my 9.2 drivers for a clean install of 9.4?

Beta 13.11 v9.4 for anyone who's looking for it:

http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx


----------



## fewohfjweoifj

13.11 9.4 does not fix the blackscreen problem for me.


----------



## JordanTr

I haven't tested 9.2; I went straight to 9.4. The PC has been running for 3 hours with no black screens. I've only tested Unigine Heaven and played Assassin's Creed 4 for a while; AC4 managed to raise my GPU temp to 78°C, while Unigine Heaven only hit 75°C.


----------



## rdr09

Quote:


> Originally Posted by *cyenz*
> 
> One thing my friend noticed is that the GPUZ readings of the 12v line are higher than on previous drivers. Before it would just idle at 11.4v, now it idles at 11.75V and drops to 11.5V when gaming. Still on the low side, but the truth is that is still havent got a blackcrseen after this new drivers. AMD are on the correct way.
> 
> 
> 
> Another thing as already stated in the poll thread is that the GPU usage is pretty much constant, before it would balance between 0 and 100% pretty quicly.


Mine has always been 11.75 V since day 1, and I was using 13.11 beta 1 then. Never had a black screen. Hmmm.

Quote:


> Originally Posted by *fewohfjweoifj*
> 
> 13.11 9.4 does not fix the blackscreen problem for me.


Please post your GPU-Z with the sensor tab visible. Thanks.


----------



## cyenz

Quote:


> Originally Posted by *rdr09*
> 
> mine has always been 11.75v since day1 and i was using 13.11 beta1 then. never had black screen. hmmm
> pls. post your gpuz with sensor tab pls.


I don't know, but it may be related to the black screens, as he would be idling at 11.4 V, and I imagine it would be 0.2 V lower under load on the previous drivers.

Just my 2 cents really; I'm not an expert in this area.


----------



## rdr09

Quote:


> Originally Posted by *lethal343*
> 
> hey guys what does everybody reckon of the new BETA 13.11 9.4 drivers?!
> Has it fixed the blackscreen problem?
> Plus what program should i go for completely removing my 9.2 drivers for a legit install of 9.4?
> 
> Beta 13.11 v9.4 for anyone who's looking for it:
> 
> http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx


Just use the 9.2 or 9.4 installer to remove the driver itself - pick Uninstall (Express). Don't use any driver sweeper junk.


----------



## fewohfjweoifj

Here is my GPU-Z reading. I'm using Uber mode if that matters, black screen happens in both modes.


----------



## rdr09

Quote:


> Originally Posted by *cyenz*
> 
> I dont know, but it may be related to the blackscreens as he would be idling at 11.4V and i image it would be 0.2V lower when on load on previous drivers.
> 
> Just my 2 cents really, not an experted on this area.


So some cards may be idling at 11.4 V. I am not doubting you; it could very well be the issue. Software/BIOS/driver related - the hardware is dependent on the software.


----------



## rdr09

Quote:


> Originally Posted by *fewohfjweoifj*
> 
> 
> Here is my GPU-Z reading. I'm using Uber mode if that matters, black screen happens in both modes.


Ha! 11.5 V. Hmmm.

What PSU do you have? Please fill out Rig Builder as well. Thanks.

290 using 13.11 WHQL


----------



## smonkie

As I see in your GPU-Z screenshots, I'm not the only one having those spikes on the desktop. Could you please play a YouTube video and post a GPU-Z screenshot after a few minutes, to see how the frequencies behaved? It would be much appreciated.


----------



## rdr09

Quote:


> Originally Posted by *smonkie*
> 
> As I see in your GPUZ screenshots, I'm not the only one having those spikes in desktop mode. Could you please play a Youtube video and post GPUZ screenshot after a few minutes to see how frequencies worked? It would be much appreciated.


Mine spikes, too. And look at the core voltage... it goes down. So if the cards suffering black screens are idling at 11.5 or 11.4 V, I assume their voltages will go down under load too. And what happens? Black screen.

Just speculating - some batches may come with BIOSes that force lower voltages.


----------



## JordanTr

I get 11.88 V at idle, and I'm experiencing no problems.


----------



## rdr09

Quote:


> Originally Posted by *JordanTr*
> 
> I get 11.88V in idle, experiencing no problems.


What is it under load?


----------



## JordanTr

Quote:


> Originally Posted by *rdr09*
> 
> what is it loaded?


Give me a minute, will check on heaven.


----------



## JordanTr

Just checked: it's 11.63 V.

BTW, I'm running mine at 1050/1400 on stock volts.


----------



## Arizonian

Quote:


> Originally Posted by *JordanTr*
> 
> Wanna join club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://imageshack.us/photo/my-images/14/ep52.png/
> 
> 
> 
> Uploaded with ImageShack.us
> 
> That overclock i just put in 5s, just random, gave 10% performance increase in unigine heaven. Dont want to play with OC before aftermarket cooler. Wanted your opinion, should i get gelid icy vision v2 and put Scythe Gentle Typhoons on it or just go for accelero extreme III (i dont like this scenario, cause i will have to buy extra accesories i dont know ). With gelid everything comes together. max VRM 1 was 55, VRM2 65. GPU on heaven max 75, but its really loud, fan was on 66%. P.S. Not bad ASIC i guess
> 
> 
> 
> 
> 
> 
> 
> Didnt get any issues, using newest 9.4 beta, came from gtx660 sli, went in safe mode, uninstalled drivers from controll panel, then used driver cleaner to clean everything, booted up with no internet, installed AMD drivers, connected internet. Will try some more benching to be aware of blackscreen before orderin aftermarket cooler. I got coil whine too on idle, but its not that but, tomorrow im turning 27, so maybe i will stop hearing it


Congrats - welcome aboard - added









PS - any new members who've recently joined OCN are welcome to add a pic and join us too.


----------



## rdr09

Quote:


> Originally Posted by *JordanTr*
> 
> Just checked, its 11.63V.
> 
> BTW running mine on 1050/1400 on stock volts.


Thanks, Jordan. I've seen mine go as low as 11.5 V running the render test in GPU-Z. I wonder what the cards idling at 11.4 V drop down to when loaded?
I can run benches at 1155/1500 on stock volts. I game at stock.


----------



## JordanTr

Quote:


> Originally Posted by *rdr09*
> 
> thanks, Jordan. i've seen mine go as low as 1.5v running the render test in GPUz. wonder what those 1.4V idle do down to when loaded?
> i can run benches using 1155/1500 at stock volts. i game at stock.


Nice clock. I play at the same clock as well. I didn't mess with OC before getting an aftermarket cooler; I set these clocks just so I wouldn't stay at stock.


----------



## Raxus

Quote:


> Originally Posted by *NorcalTRD*
> 
> I have a PowerColor card and didn't notice any stickers.
> Regardless of brand, if you do anything to a card besides insert it into your PCI Express slot, you're voiding your warranty.
> And don't think they won't know you have tinkered with the card; they will.


Not true. Several manufacturers have said that as long as you don't physically damage the card, your warranty is fine.


----------



## grandpatzer

Quote:


> Originally Posted by *JordanTr*
> 
> I read article that all who bought R9 series GPU will be able to get BF4 for free. How to do that? Or its not quite true? And why for should i enable or disable powerplay in AB? Or should i leave unofficial overclocking disabled? My core moves between 947 and 1050 a bit, but most of the time stays on 1050, never goes below 947.


Where did you read that regarding BF4 and R9 290?


----------



## macamba

Is it normal that my Sapphire 290 throttles at stock speeds?
I have the fan at 55%, so the target temp of 95°C is no problem.
When I set the power limit to +50%, it stops throttling.

Power throttling at stock speeds... that can't be right.


----------



## MrWhiteRX7

How bad is it throttling... what clocks does it go down to?


----------



## JordanTr

Quote:


> Originally Posted by *grandpatzer*
> 
> Where did you read that regarding BF4 and R9 290?


I misinterpreted it. It has to be the Battlefield 4 Edition to get the game. Never mind.







Even the 270s get that.









I ran the basic 3DMark test - is that a good score or not? My 2500K is running at 4.5 GHz.


----------



## TheSoldiet

Are people still getting black screens after the recent driver update? I can't test because I'm not home right now. I will test tomorrow, though.


----------



## fewohfjweoifj

I found that by turning down the core voltage and aux voltage to about -44 mV, I could run games for about twice as long before getting a black screen crash. It doesn't solve the issue, but you might want to look into this. I used Afterburner for it, but I'm sure other programs will work as well.


----------



## Raxus

Quote:


> Originally Posted by *grandpatzer*
> 
> Where did you read that regarding BF4 and R9 290?


They said it's up to the vendor/manufacturer to decide which cards get BF4.


----------



## utnorris

Quote:


> Originally Posted by *hatlesschimp*
> 
> My stock CF 290x score
> 
> Fire Strike Extreme = 8787
> 
> http://www.3dmark.com/fs/1188889


Wow! That's better than my CF with the 290s overclocked to 1100 MHz.

http://www.3dmark.com/fs/1182327

I attribute some of it to your CPU, but if you look at the graphics score, you are still higher than me.


----------



## macamba

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> How bad is it throttling... what clocks does it go down to?


It throttles right away when I start Furmark or a game, and it stays between 720 and 850 MHz. Temps are 75°C max.


----------



## Redvineal

Quote:


> Originally Posted by *Raephen*
> 
> Good job, man!
> 
> I wonder, do you have a picture of the heatsink + alu fins as they were before you cut them up?


Sorry, I didn't think to take pics before I started. Too much excitement.

However, I can probably round up some pics of the bottom and sides and draw some lines where I made the cuts.


----------



## fewohfjweoifj

I have also observed that increasing the core voltage and aux voltage makes the black screen crash happen sooner; turning them up as far as possible causes a crash as soon as the game launches.


----------



## nyboy42

Sapphire released a new BIOS for their R9 290: http://www.sapphiretech.com/presentation/downloads/?pid=2044&psn=0006&lid=1&os=6

Can we flash this in Windows, or does it have to be done from a bootable USB?


----------



## utnorris

Quote:


> Originally Posted by *JordanTr*
> 
> I interpretated. It has to be battlefield 4 edition to get the game, never mind
> 
> 
> 
> 
> 
> 
> 
> even 270's get that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I ran 3dmark basic, is that good score or not? My 2500K running on 4.5ghz


It's in line with what most are getting at stock, or slightly above.


----------



## S410520

Finally finished installing the Gelid Icy Vision Rev 2 on my unlocked 290. In short:

This cooler is not compatible without modding!

Long story:

After cleaning all the RAM chips, VRMs and GPU, I found that the following needed modding to fit the cooler at all:

- 1 RAM sink blocked the GPU cooler and has to be modded or left out.
- The large VRM heatsink in the package has risers shaped into the aluminum, which made it unfit for my reference card.
- Another VRM heatsink needs modding; the square block supplied does not fit. You could work with several really small ones and fold the adhesive stickers a couple of times to make them fit...

I decided to give it the best cooling possible with the materials supplied with the cooler.

- Flattened the base of the large VRM heatsink by sanding the risers off completely to make it fit (luckily for me, aluminum is soft enough to work with, and I had a good friend helping me).
- Sawed 2 pieces off a large square VRM block to make it fit, rather than use the very small separate blocks.
- Lowered half of the low RAM heatsink, which was not low enough to use with this cooler.
- Used thermal paste on the large VRM heatsink on VRM2, and adhesive stickers on the RAM and VRM1* (*the 3 VRMs in the corner of the card).

Here are the results. (Note: my case is half-open and ambient is very cold, 15°C I guess.)
I connected the fans to the 290's PCB, so they run at 2000 RPM all the time for this test (will change that later).

These temps were measured after a run of the Valley bench at Extreme with 8x AA (68 fps).

290X BIOS @ stock:
max GPU: 52°C
max VRM1: 62°C
max VRM2: 43°C

[email protected] +1.320 V (GPU-Z) / 1.296 V (GPU Tweak):
max GPU: 62°C
max VRM1: 83°C
max VRM2: 49°C

All in all, a sweet cooler if you like to mod


----------



## Arizonian

@ S410520

Nice work. Roster updated


----------



## utnorris

Quote:


> Originally Posted by *nyboy42*
> 
> Sapphire released a new bios for their r9 290: http://www.sapphiretech.com/presentation/downloads/?pid=2044&psn=0006&lid=1&os=6
> 
> can we flash this in windows or does it have to USB bootable?


You realize all this does is change the fan curve? It is already part of the latest beta driver.


----------



## utnorris

Quote:


> Originally Posted by *S410520*
> 
> Finally finished placing the Gelid Vision Icy Rev2 on my unlocked 290. Just in short:
> 
> This cooler is not compatible without modding!
> 
> Long story:
> 
> After cleaning all the ram chips, vrm's and gpu, I found that the following needed modding to fit the cooler at all;
> 
> - 1 ram sink blocked the gpucooler and has to be modded or left out.
> - The large VRM heatsink in the package had risers shaped in the aluminum which made it unfit for my 290x reference.
> - Another vrm heatsink need modding, there is a square block supplied which does not fit. You could work with several really small ones and fold the adhesive stickers a couple of times to make it fit...
> 
> I decided to give it the best cooling possible with materials supplied with the cooler.
> 
> - Flattened the base of the large vrm heatsink by sanding off the risers completely to make it fit. (lucky for me, alu is soft enough to work with and I had a good friend helping me)
> - sawed 2 pieces of a large square vrm block to make it fit, rather then use the very small separate blocks.
> - Lowered half of the low ram heatsink, which was not low enough to use with this cooler.
> - I used thermal paste on the large vrm heatsink on vrm2 and adhesive stickers on ram and vrm1*. (*3 vrm's in corner of the card)
> 
> Here are the results: (Note, my case is half-open and ambient is very cold, 15c I guess.)
> I connected the fans to the 290's pcb so the are at 2000rpm all the time for this test. (will change that later)
> 
> These temps were measured after a run of Valley bench at extreme 8x aa. (68fps)
> 
> 290X bios @stock:
> max gpu: 52c
> max vrm1:62c
> max vrm2:43c
> 
> [email protected]+1.320v GPUZ / 1.296V gputweak:
> max gpu: 62c
> max vrm1:83c
> max vrm2:49c
> 
> All in all, sweet cooler if you like to mod


Awesome mod!!


----------



## S410520

Thanks - it cost me and a friend about 2.5 hours. I thought I might as well take pics and post it, as others might buy the Icy Rev 2 thinking it fits out of the box.

Hopefully the Arctic hybrid that's going on the other unlocked 290 next week will fit out of the box.

I like modding, but this was just a bit too much only to make it fit.


----------



## JordanTr

Quote:


> Originally Posted by *S410520*
> 
> Finally finished placing the Gelid Vision Icy Rev2 on my unlocked 290. Just in short:
> 
> This cooler is not compatible without modding!
> 
> Long story:
> 
> After cleaning all the ram chips, vrm's and gpu, I found that the following needed modding to fit the cooler at all;
> 
> - 1 ram sink blocked the gpucooler and has to be modded or left out.
> - The large VRM heatsink in the package had risers shaped in the aluminum which made it unfit for my 290x reference.
> - Another vrm heatsink need modding, there is a square block supplied which does not fit. You could work with several really small ones and fold the adhesive stickers a couple of times to make it fit...
> 
> I decided to give it the best cooling possible with materials supplied with the cooler.
> 
> - Flattened the base of the large vrm heatsink by sanding off the risers completely to make it fit. (lucky for me, alu is soft enough to work with and I had a good friend helping me)
> - sawed 2 pieces of a large square vrm block to make it fit, rather then use the very small separate blocks.
> - Lowered half of the low ram heatsink, which was not low enough to use with this cooler.
> - I used thermal paste on the large vrm heatsink on vrm2 and adhesive stickers on ram and vrm1*. (*3 vrm's in corner of the card)
> 
> Here are the results: (Note, my case is half-open and ambient is very cold, 15c I guess.)
> I connected the fans to the 290's pcb so the are at 2000rpm all the time for this test. (will change that later)
> 
> These temps were measured after a run of Valley bench at extreme 8x aa. (68fps)
> 
> 290X bios @stock:
> max gpu: 52c
> max vrm1:62c
> max vrm2:43c
> 
> [email protected]+1.320v GPUZ / 1.296V gputweak:
> max gpu: 62c
> max vrm1:83c
> max vrm2:49c
> 
> All in all, sweet cooler if you like to mod


I was 90% sure I would get the Gelid Icy Vision Rev 2, but after your post I will definitely go for the Accelero Xtreme III and buy those extra RAM sinks or whatever is needed.


----------



## Redvineal

Quote:


> Originally Posted by *Redvineal*
> 
> Wanna see something crazy?
> 
> I wasn't at all happy with the VRM1 temps (~91C) the Accelero Xtreme III was producing with the rinky-dink heatsinks on my XFX R9 290 (does NOT unlock, btw).
> 
> As a matter of fact, the heat caused the thermal pads to fail and all 3 heatsinks fell off! I didn't want to go with the permanent glue, and here's why I'm glad I didn't:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What you see there is the result of 2 hours of quality time spent with a hacksaw! Yes, a HACKSAW. I don't own a rotary tool, and my local hardware stores were closed when I started around 9:30 PM.
> 
> Essentially, I took inspiration from some folks at Overclockers UK for the VRM1 heat and got to work on the stock heatsink! The back plate covering VRM1 was the easy part (thanks OCUK). On the other hand, the VRM2 plate up front was much more difficult. I had to work my way through the black plate, the copper plate, and finally the aluminum fins. Once I got the VRM2 cover separated from the rest of the sink, I used a pair of pliers to individually tear off the aluminum fins.
> 
> Told you it was crazy!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now here comes some gritty:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Furmark 10 Minute Run 1
> 
> Core: 1100
> Memory: 1300
> Power: +50
> mV: +0
> 
> GPU Max: 57C
> VRM1 Max: 65C
> VRM2 Max: 50C
> 
> 
> 
> Furmark 10 Minute Run 2
> 
> Core: 1150
> Memory: 1500
> Power: +50
> mV: +100
> 
> GPU Max: 65C
> VRM1 Max: 78C
> 
> 
> 
> 
> 
> 
> 
> 
> VRM2 Max: 56C
> 
> 
> 
> 
> 
> Pretty sweet results, right? If you've made it this far, thanks for checking out my work!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'd really like to know what you other R9 290(X) owners think of this Frankenstein of a mod, the results, and what other benches would be good to tax this monstrosity!
> 
> Also, if anyone is interested in more details about how I wound up with the VRM2 plate you see in the pics (where to cut, etc), I'll be happy to share.
> 
> Edit: Forgot to mention my idles
> GPU: 35C
> VRM1: 31C
> VRM2: 33C
> 
> 
> Spoiler: Warning: Spoiler!


@Arizonian Can you update me to aftermarket please? I can supply pics of the finished product, if necessary. Thanks.

@S410520 Very nice work on the Gelid install. Looks nice and clean!


----------



## Arizonian

@Redvineal - done - good work.


----------



## brazilianloser

I must be overlooking something very simple... Neither AB nor GPU Tweak gives me an option to increase the voltage on the memory of my Asus 290. I can keep bumping up the core in small increments, but any increase on the memory causes instability, due to not being able to raise the memory voltage.

Edit: GPU Tweak blows some buelas


----------



## Scorpion49

Oh my god, this is driving me insane. I cannot for the life of me get Far Cry 3 to run properly. No matter what the settings, the game runs at around 70 fps - the same with one card or both. How in the hell are review sites getting 50+ fps at Eyefinity resolutions with CrossFire on Ultra when I can't scrape my way above 70 fps on a single 1080p screen with the settings as low as they will go? They were praising the smoothness of the game; I get a stuttery mess that makes it nearly impossible to do anything.


----------



## fewohfjweoifj

I'd really like some kind of word from AMD on whether they're still working on these issues. I bet they're stalling for time so people can't RMA.


----------



## ReHWolution

Quote:


> Originally Posted by *fewohfjweoifj*
> 
> I'd really like some kind of word from AMD on whether they're still working on these issues. I bet they're stalling for time so people can't RMA.


There is a form to fill out if you still have problems with the latest driver release. I was skeptical, but the 9.4 beta solved my black screen issues (except in a few benchmarks when I clock my card at maximum speed). They won't take two years just so you can't RMA. Instead of complaining, just ask for help the proper way.









Hey guys, can anyone tell me what the heck the PT1/PT3 BIOS for our cards is?


----------



## rdr09

Quote:


> Originally Posted by *smonkie*
> 
> As I see in your GPUZ screenshots, I'm not the only one having those spikes in desktop mode. Could you please play a Youtube video and post GPUZ screenshot after a few minutes to see how frequencies worked? It would be much appreciated.


smonkie, i see no fluctuations in IE watching the same video. So, it could be Chrome or whatever browser you are using. Try using IE.


----------



## ReHWolution

Quote:


> Originally Posted by *rdr09*
> 
> smonkie, i see no fluctuations in IE watching the same video. So, it could be Chrome or whatever browser you are using. Try using IE.


Quoting this one: Chrome sucks at hardware acceleration, whether AMD or nVidia... Recently that browser has gotten worse...


----------



## S410520

@Redvineal; Thanks, you have done a very nice job yourself, I see.

I have thought about modding the end-plate together with the Gelid, but figured the plate blocked a lot of airflow. Seeing your temps, I guess I didn't have to worry about that. Maybe I will try this when I put the Arctic hybrid on my other card next week.


----------



## anubis1127

Quote:


> Originally Posted by *ReHWolution*
> 
> Quoting this one: Chrome sucks at hardware acceleration, whether AMD or nVidia... Recently that browser has gotten worse...


One thing Chrome is good at is spying on all your web browsing.


----------



## tsm106

Quote:


> Originally Posted by *Arizonian*
> 
> @Redvineal - done - good work.


Hey Arizonian, I returned my Sapphire triples. I will replace them with something from a brand that doesn't have a punitive warranty policy.


----------



## Pfortunato

Guys, what's the best thermal paste for a GPU? I asked Sapphire if I can replace the stock cooler and they said replacing it would void my warranty, so I only want to change the thermal paste. Any suggestions? :b

Cheers


----------



## Redvineal

Quote:


> Originally Posted by *S410520*
> 
> @Redvineal; Thanks, you have done a very nice job yourself, I see.
> 
> I have thought about modding the end-plate together with the Gelid, but figured the plate blocked a lot of airflow. Seeing your temps, I guess I didn't have to worry about that. Maybe I will try this when I put the Arctic hybrid on my other card next week.


Thanks for the kind words. VRMs need a lot of love when using aftermarket coolers, and this solves both the VRM1 temperature problem and the VRM2 heatsink awkwardness.

It certainly isn't pretty, being my first ever mod and all. However, it seems you and your friend are handy and meticulous with your work, so I'm sure your outcome would be much cleaner.

I don't see why the front VRM2 plate would interfere with the Gelid with how low the profile is. For the rear VRM1 plate, you might need to cut the small bar that secures the power connectors. It was necessary for the AX3, but not sure about the Gelid.

I hope you decide to go with it. I'd really like to see how others accomplish the task! PM me if you want anymore details about cuts and such.


----------



## Redvineal

Quote:


> Originally Posted by *Pfortunato*
> 
> Guys, what's the best thermal paste for a GPU? I asked Sapphire if I can replace the stock cooler and they said replacing it would void my warranty, so I only want to change the thermal paste. Any suggestions? :b
> 
> Cheers


The "best thermal paste" can be a highly subjective debate, more like asking for "favorite thermal paste".

I have read many great reports about, and personally used, Gelid GC-Extreme with no complaints. Although, I'm sure you'll get plenty of other recommendations!


----------



## Forceman

Quote:


> Originally Posted by *brazilianloser*
> 
> I must be overlooking something very simple... Both AB and GPU Tweak do not give me an option to increase voltages on my memory on a Asus 290. I can keep on bumping up the core with small increases but any increase on memory causes instability but due to not being able to raise the memory voltage.
> 
> Edit: GPU Tweak blows some buelas


There's no memory voltage control on these cards. So you aren't overlooking anything, it just isn't possible right now.


----------



## BackwoodsNC

Quote:


> Originally Posted by *tsm106*
> 
> Hey Arizonian, I returned my Sapphire triples. I will replace them with something from a brand that doesn't have a punitive warranty policy.


What brands on the AMD side still allow us to remove the stock heatsink?


----------



## Xylene

Perhaps someone already mentioned this, but is Aux voltage in AB for memory? If I change it, VDDCI changes in GPU-Z.


----------



## tsm106

Quote:


> Originally Posted by *BackwoodsNC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Hey Arizonian, I returned my Sapphire triples. I will replace them with something from a brand that doesn't have a punitive warranty policy.
> 
> 
> 
> What brands on the amd side still allow us to remove stock heatsink?

MSI just went on the record stating that cooler removal does not expressly void the warranty; only damage incurred during removal would. That is logical and fair. I will support that by replacing my Sapphires with MSI.


----------



## brazilianloser

Quote:


> Originally Posted by *Forceman*
> 
> There's no memory voltage control on these cards. So you aren't overlooking anything, it just isn't possible right now.


I see. Well, when I increase the memory clock it causes crashes, so the card just isn't handling the overclock since there's no extra voltage to enable such a push... Feel free to correct me if I'm wrong.

Quote:


> Originally Posted by *Xylene*
> 
> Perhaps someone already mentioned this, but is Aux voltage in AB for memory? If I change it, VDDCI changes in GPU-Z.


Hmm, I was unsure what that Aux setting was there for, but let me try small increases to see if the card can OC the memory.


----------



## Forceman

Quote:


> Originally Posted by *brazilianloser*
> 
> I see. Well, when I increase the memory clock it causes crashes, so the card just isn't handling the overclock since there's no extra voltage to enable such a push... Feel free to correct me if I'm wrong.
> Hmm, I was unsure what that Aux setting was there for, but let me try small increases to see if the card can OC the memory.


Adding core voltage seems to help memory overclocking as well - it may be a controller limitation and not a memory voltage issue. How high are you trying to go?
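The trial-and-error being discussed here (bump the memory clock, bench, back off at the first crash) can be sketched as a simple search loop. Everything below is hypothetical: `set_clocks` and `passes_stress_test` are placeholder hooks standing in for whatever tool actually drives the card (an Afterburner profile plus a Valley run, say), not a real API.

```python
# Hypothetical sketch of the stepping approach discussed in the thread.
# set_clocks / passes_stress_test are placeholder hooks, not a real API.
def find_max_memory_clock(set_clocks, passes_stress_test,
                          start=1250, limit=1500, step=25):
    """Raise the memory clock in fixed steps; keep the last stable value."""
    best = None
    for mem in range(start, limit + 1, step):
        set_clocks(mem_mhz=mem)
        if not passes_stress_test():
            break  # first unstable step: stop and keep the last stable clock
        best = mem
    return best

# Toy model: pretend this particular card is stable up to 1375 MHz.
state = {"mem": 0}
best = find_max_memory_clock(
    set_clocks=lambda mem_mhz: state.update(mem=mem_mhz),
    passes_stress_test=lambda: state["mem"] <= 1375,
)
print(best)  # 1375
```

In practice each "stress test" is minutes of Valley or 3DMark, so coarse steps first and finer steps near the limit save a lot of time.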


----------



## ontariotl

Has anyone tried the Asus bios update with their unlocked cards? I'm curious to know what they changed. I may try it later as I need to create another USB boot stick in order to flash again, but I was wondering if anyone else has any feedback.


----------



## djsatane

I know it's been said over and over to RMA cards until you get one with no issues, but when you guys RMA your cards to a seller like Newegg, do they cover your shipping costs? RMAing repeatedly through the holiday season isn't something I want to be doing, as the place I ordered from is in Canada and it costs me $40+ to ship each time... If my replacement has black screens too, I will be furious.


----------



## brazilianloser

Quote:


> Originally Posted by *Forceman*
> 
> Adding core voltage seems to help memory overclocking as well - it may be a controller limitation and not a memory voltage issue. How high are you trying to go?


Anything simple like 1275 or 1300, just the initial steps to get my feet wet, but it crashes Valley and 3DMark. That's with stock core voltage. These are my current settings, or at least the best I can seem to achieve without artifacts or crashing 3DMark.


----------



## jerrolds

Quote:


> Originally Posted by *Redvineal*
> 
> Wanna see something crazy?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I wasn't at all happy with the VRM1 temps (~91C) the Accelero Xtreme III was producing with the rinky-dink heatsinks on my XFX R9 290 (does NOT unlock, btw).
> 
> As a matter of fact, the heat caused the thermal pads to fail and all 3 heatsinks fell off! I didn't want to go with the permanent glue, and here's why I'm glad I didn't:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> What you see there is the result of 2 hours of quality time spent with a hacksaw! Yes, a HACKSAW. I don't own a rotary tool, and my local hardware stores were closed when I started around 9:30 PM.
> 
> Essentially, I took inspiration from some folks at Overclockers UK for the VRM1 heat and got to work on the stock heatsink! The back plate covering VRM1 was the easy part (thanks OCUK). On the other hand, the VRM2 plate up front was much more difficult. I had to work my way through the black plate, the copper plate, and finally the aluminum fins. Once I got the VRM2 cover separated from the rest of the sink, I used a pair of pliers to individually tear off the aluminum fins.
> 
> Told you it was crazy!
> 
> Now here come some gritty details:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Furmark 10 Minute Run 1
> 
> Core: 1100
> Memory: 1300
> Power: +50
> mV: +0
> 
> GPU Max: 57C
> VRM1 Max: 65C
> VRM2 Max: 50C
> 
> 
> 
> Furmark 10 Minute Run 2
> 
> Core: 1150
> Memory: 1500
> Power: +50
> mV: +100
> 
> GPU Max: 65C
> VRM1 Max: 78C
> VRM2 Max: 56C
> 
> 
> 
> 
> 
> Pretty sweet results, right? If you've made it this far, thanks for checking out my work!
> 
> I'd really like to know what you other R9 290(X) owners think of this Frankenstein of a mod, the results, and what other benches would be good to tax this monstrosity!
> 
> Also, if anyone is interested in more details about how I wound up with the VRM2 plate you see in the pics (where to cut, etc), I'll be happy to share.
> 
> Edit: Forgot to mention my idles
> GPU: 35C
> VRM1: 31C
> VRM2: 33C
> 
> 
> Spoiler: Warning: Spoiler!


Heh, very cool, +rep for trying something new. Were you able to overclock higher due to the lower VRM1 temps? On my Gelid I noticed the temps went down after a couple of days, I guess the thermal glue settled or something... from 100C to about 85C. I won't be able to hacksaw the stock heatsink, so I'm just planning on adding stronger fans. It's too bad I can't find any really good 3rd party VRM heatsinks... I think FrozenCPU might have one, but I'm not sure.


----------



## Forceman

Quote:


> Originally Posted by *brazilianloser*
> 
> Anything simple like 1275 or 1300, just the initial steps to get my feet wet, but it crashes Valley and 3DMark. That's with stock core voltage. These are my current settings, or at least the best I can seem to achieve without artifacts or crashing 3DMark.


Strange, bumping the core voltage to +50mV helped me get higher memory clocks. Stock cooling?


----------



## brazilianloser

Quote:


> Originally Posted by *Forceman*
> 
> Strange, bumping the core voltage to +50mV helped me get higher memory clocks. Stock cooling?


Yes. Even so, though, my temps have yet to go above 80C, and VRM temps according to GPU-Z are under 65. So I'm really not sure what is up.


----------



## DeadlyDNA

I am installing my water blocks now. This whole "stock heatsink removal voids warranty" stance is not really helping matters, considering the cooler is subpar. Not only that, but as I install my water blocks I am noticing that chances are someone else has taken my "new" cards apart already. I am really getting tired of buying new video cards on Newegg that show signs of use or even damage. While they still work, it's not fair to me: if I send one back before taking it apart, they can claim I damaged it.

I have an HIS that, when I took off the stock cooler, showed where someone had pried on the copper heatsink with a flat blade. "Brand new." On my second card, a Gigabyte, one of the screws on the I/O plate was stripped when I took off the stock cooler. I can tell someone used one of the screws off the PCB instead and stripped it. I seriously doubt this happened at the factory.

Ordering online these days is more like a lottery, it seems... too bad my local brick-and-mortar shops don't even have R9 290s...


----------



## djsatane

Quote:


> Originally Posted by *jerrolds*
> 
> Heh, very cool, +rep for trying something new. Were you able to overclock higher due to the lower VRM1 temps? On my Gelid I noticed the temps went down after a couple of days, I guess the thermal glue settled or something... from 100C to about 85C. I won't be able to hacksaw the stock heatsink, so I'm just planning on adding stronger fans. It's too bad I can't find any really good 3rd party VRM heatsinks... I think FrozenCPU might have one, but I'm not sure.


Any of these good for the 290X VRM chips?

http://www.frozencpu.com/products/3285/ram-21/Zalman_ZM-RHS1_VGA_RAM_Heat_Sinks_-_SILVER.html

http://www.frozencpu.com/products/5518/vid-82/Enzotech_Forged_Copper_VGA_Memory_Heatsink_Multipack_-_ATI_and_nVidia_-_14mm_x_14mm_x_14mm_BMR-C1.html


----------



## cyenz

It really seems that AMD is on the right track regarding the black screen problems. My mate's GPU has been black-screen free since yesterday; for the first time since he got the 290X, he could play almost 3 hours of BF4 instead of crashing within 5 minutes.

It's just a shame that a problem like this slipped by unnoticed during testing on AMD's side.

For 500€ we should get a lot more (not talking performance-wise, obviously).


----------



## Newbie2009

Quote:


> Originally Posted by *tsm106*
> 
> MSI just went on the record stating that cooler removal does not expressly void the warranty; only damage incurred during removal would. That is logical and fair. I will support that by replacing my Sapphires with MSI.


Oh thanks for this info.


----------



## jerrolds

Quote:


> Originally Posted by *djsatane*
> 
> Any of these good for the 290X VRM chips?
> 
> http://www.frozencpu.com/products/3285/ram-21/Zalman_ZM-RHS1_VGA_RAM_Heat_Sinks_-_SILVER.html
> 
> http://www.frozencpu.com/products/5518/vid-82/Enzotech_Forged_Copper_VGA_Memory_Heatsink_Multipack_-_ATI_and_nVidia_-_14mm_x_14mm_x_14mm_BMR-C1.html


Nope, unfortunately they look too wide to fit onto the VRMs - something like http://www.frozencpu.com/products/9084/vid-126/Enzotech_MST-88_Forged_Copper_Mosfet_Heatsink_-_ASUS_P6T_Rampage_II_Extreme_.html might work... too bad I didn't take any measurements.

These look like they should work as well http://www.frozencpu.com/products/7190/vid-105/Enzotech_MOS-C1_MOSFET_Heatsinks_-_65mm_x_65mm_x_12mm_-_10_Pack.html


----------



## jomama22

Quote:


> Originally Posted by *tsm106*
> 
> MSI just went on the record stating that cooler removal does not expressly void the warranty; only damage incurred during removal would. That is logical and fair. I will support that by replacing my Sapphires with MSI.


Just a fair warning to you: you'd better get them to say they will accept your RMA even though the void stickers are gone, because MSI puts one on top of a retaining bracket screw... same way XFX does.


----------



## jrcbandit

The crappy new beta drivers have caused a black screen issue in Windows 8 where I previously had none. I can't even boot into the OS.


----------



## bond32

Quote:


> Originally Posted by *jrcbandit*
> 
> The crappy new beta drivers have caused a black screen issue in Windows 8 where I previously had none. I can't even boot into the OS.


I have them too. It seems to happen only when changing clock rates for me. I am trying to figure out which setting it is right now, adjusting the core to 1100 and seeing what happens.

Edit: regarding the warranty stickers, my sapphire 290x did not have one.


----------



## jomama22

Quote:


> Originally Posted by *bond32*
> 
> I have them too. Seems only when changing clock rates for me. I am trying to figure out which one right now, adjusting core to 1100 and seeing what happens.
> 
> Edit: regarding the warranty stickers, my sapphire 290x did not have one.


I believe Sapphire and HIS may be the only ones who do not put warranty stickers on the retention bracket screws, but HIS does have a piece of paper between the PCB and cooler saying the warranty is void if it's missing.

XFX, MSI, and Asus have the warranty stickers; I have seen it with my own eyes lol.

Not sure about powercolor...


----------



## tsm106

Quote:


> Originally Posted by *jomama22*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> MSI just went on the record stating that cooler removal does not expressly void the warranty; only damage incurred during removal would. That is logical and fair. I will support that by replacing my Sapphires with MSI.
> 
> 
> 
> Just a fair warning to you: you'd better get them to say they will accept your RMA even though the void stickers are gone, because MSI puts one on top of a retaining bracket screw... same way XFX does.

Did you not READ my post where I wrote they went on the record?

https://forum-en.msi.com/index.php?topic=174490.0

To users in this club. Yall need to read this. This sort of stuff is small in the scope of the big picture, but you can do something about it with what power you have, and that is your future buying power.

http://www.overclock.net/forum/newestpost/1444881


----------



## bond32

Quote:


> Originally Posted by *jomama22*
> 
> I believe sapphire and HIS may be the only ones who do not put warranty stickers on the retention bracket screws, but HIS does have a piece of paper between the pcb and cooler saying if it is missing it is void.
> 
> XFX, MSI and Asus have the warranty stickers, i have seen it with my own eyes lol.
> 
> Not sure about powercolor...


Interesting... And interesting reads TSM. Honestly, I recommend purchasing from amazon anyway. They will bend over backwards to help you if you have a problem...


----------



## jomama22

Quote:


> Originally Posted by *tsm106*
> 
> Did you not READ my post where I wrote they went on the record?
> 
> https://forum-en.msi.com/index.php?topic=174490.0
> 
> To users in this club. Yall need to read this. This sort of stuff is small in the scope of the big picture, but you can do something about it with what power you have, and that is your future buying power.
> 
> http://www.overclock.net/forum/newestpost/1444881


Quote:


> if you have replaced the cooler and something goes wrong with it just remove the after-market one then put the original cooler back on and RMA it and all should be well! (some techs on the phones have no idea what they are doing and give out wrong information at times)


This is the last reply from the mod:
Quote:


> if there gone they will usually inspect if for signs of physical damage and if non is there then they carry on and then do the repair! they just use them as a indicator that its been taken off unless it gets to someone who is a idiot and just regects it but usually just tell them you had to take it off to change the thermal paste as it was overheating as it was badly applied and all will be well!


I'm sorry, but if you think a mod of their forums, who doesn't even work for MSI, has any clearer information than anyone else, you may end up getting burned.

I mean, twice he states how you could basically get screwed out of warranty service if the person working that day decides that a missing sticker or a removed cooler voids the warranty.

All I'm saying is don't be upset when you get denied service because a stupid little 0.5 cm sticker is missing. Because according to your source, it can happen.


----------



## Redvineal

Quote:


> Originally Posted by *jerrolds*
> 
> Heh very cool +rep for seeing something new. Were you able to overclock higher due to the lower VRM1 temps? On my gelid i noticed the temps went down after a couple days, i guess the thermal glue settled or something...from 100C to about 85C. I wont be able to hacksaw the stock heastink, just planning on adding stronger fans. Its too bad i cant find any really good 3rd party VRM heatsinks..i think frozencpu might have one..but im not sure


Thanks! The highest I've gone was shown in the 2nd Furmark result I posted. I plan to push my luck a little further today!

I hope you get your black screen issue resolved soon so you can start doing some of the things you want with the card and cooling!


----------



## fewohfjweoifj

Has anyone received any word on whether the issue is still being worked on?


----------



## SonDa5

I think that if you test your card with benchmarks and overclocks within the first few days you have it, and nothing fails, the card is in good shape for the warranty period; and if you use aftermarket cooling, the card will be in even better shape.

MSI and Sapphire have both been willing to exchange cards that I have used aftermarket cooling on. The key was that no physical damage from using aftermarket cooling was the reason for the RMA request.

My experience may not be yours, and I would expect any video card manufacturer to handle each RMA on a case-by-case basis.


----------



## the9quad

could someone post the size needed for the heatsinks? has anyone measured them?


----------



## rdr09

Quote:


> Originally Posted by *cyenz*
> 
> It really seems that AMD is in the right way regarding the Blackscreen problems, my m8 GPU is blackscreen free since yesterday, for the first time since he got the 290x that he could play almost 3 hours of BF4 without crashing in 5 minutes.
> 
> Just a shame that a problem like this sliped unnoticed during testing on AMD side.
> 
> For 500€ we should get alot more (not talking performance wise obviously).


With the quantity in question... there will be some that slip through. There's no way QC can do 100% testing; it's impossible.


----------



## lordzed83

I went Sapphire because of the NO STICKERS, so in case of a dead card I could send it back.

Anyhow, I ran LOADS of benchmarks today.

13.11 9.2 vs 9.4: same scores everywhere, and no black screens, even though on 9.2 I had black screens even while sitting in Windows (with the unofficial WHQL I had no black screens).

Anyhow, 9.4 is a great driver and by the looks of it fixed problems for around 90% of people.


----------



## Kriant

Ok, got to play some AC4: all 3 cards working, all 3 go up to 42C under load.

Now... how do I OC them? Same as with the 7970, or a bit different because of their "dynamic" clock system, or whatever that is?

Almost forgot: 3x R9 290 perform better than 4x 7970 in AC4

waaay better


----------



## Raephen

Quote:


> Originally Posted by *Redvineal*
> 
> Sorry, I didn't think to take pics before I started. Too much excitement.
> 
> However, I can probably round up some pics of the bottom and sides and draw some lines where I made the cuts.


It's ok. I was just curious as to how the aluminium and the copper and the fins all fit together.

Should I ever decide to cut it up I think I'll manage. Luckily I've got a Dremel multitool at hand for that kind of thing


----------



## jerrolds

Just wanted to report that the new patch seems to have helped my system. I think I black-screened a total of 3 times, so it wasn't a real issue for me to begin with, but since installing the patch everything seems even more solid.

Zero black screens, no hard resets at stock; even high overclocks don't cause hard resets, just artifacting. The only time I had a hard crash was when switching from a high overclock profile (1220/1500 @ 1400mV -> 1150/1500 @ 1293mV in GPU Tweak) while a game was running in windowed mode; I don't think it likes that.

I'm sure if I had tried to switch while it was minimized it would've been fine.


----------



## Sergio4

I have a question for you guys: does anybody know the distances between the cooler mounting holes, and their diameter? Please!


----------



## Arizonian

Quote:


> Originally Posted by *fewohfjweoifj*
> 
> Has anyone received any word on whether the issue is still being worked on?


Not yet, but it's working for some members here. We're waiting on more of our owners to report on the fix. It's too early to tell, as less than a handful of members have responded / tested it over the weekend.

Welcome to OCN btw - please submit an entry pic of your GPU w/OCN screen name in shot and I'll add you to our group.

*"How to put your Rig in your Sig"* is very helpful when addressing members here to easily see your rig when asking questions.
Quote:


> Originally Posted by *jerrolds*
> 
> Just wanted to report that the new patch seems to have helped my system. I think I black-screened a total of 3 times, so it wasn't a real issue for me to begin with, but since installing the patch everything seems even more solid.
> 
> Zero black screens, no hard resets at stock; even high overclocks don't cause hard resets, just artifacting. The only time I had a hard crash was when switching from a high overclock profile (1220/1500 @ 1400mV -> 1150/1500 @ 1293mV in GPU Tweak) while a game was running in windowed mode; I don't think it likes that.
> 
> 
> I'm sure if I had tried to switch while it was minimized it would've been fine.


Nice, you're our fourth member to say it's working.


----------



## overclockFrance

I encounter a recurrent problem when I overclock: as soon as I increase the GPU voltage too high, my screen becomes blurred in 3D and then remains blurred in 2D. Furthermore, texture quality becomes low, as if everything were rendered at 1280x800: text becomes unreadable. If I reboot or log out, the problem is gone. When the display is blurred, if I take a screenshot, save it in Paint for example, reboot, and then open the file, the saved image is not blurred.

It is not a cooling problem, since I am watercooled. I changed BIOSes (ASUS, Sapphire, PowerColor), tried ASUS GPU Tweak and MSI Afterburner, changed the DVI port, and loosened the waterblock screws a little. The problem remains. It happened on 3 cards: an HIS 290X, a Sapphire 290, and a PowerColor 290 flashed to 290X. I plugged the display into the card's HDMI port with a DVI-HDMI adapter: the problem disappears, but the resolution is limited to 1280x800.

My hypothesis is that the GPU does not like voltages that are too high: the blurred display appears around 1325 mV in ASUS GPU Tweak, or +70 mV in MSI Afterburner.

Nevertheless, I have not found anyone else on this forum who seems to have this problem.

I'm starting to get fed up with it. I may even contemplate selling my PowerColor and buying a GTX 780 Ti.

Any ideas?


----------



## Redvineal

Quote:


> Originally Posted by *Raephen*
> 
> It's ok. I was just curious as to how the aluminium and the copper and the fins all fit together.
> 
> Should I ever decide to cut it up I think I'll manage. Luckily I've got a Dremel multitool at hand for that kind of thing


Best I can tell it's some kind of epoxy adhesive. I tried wedging the copper off, but started to bend the black plate. It's on there far too well to get apart by hand. That's why I just decided to cut through that sucker!

Once the front plate is separated from the rest, the aluminum fins are extremely easy to peel. I just used pliers to pull a fin to the side away from the rest, and peeled from back to front. Actually, once I peeled all the fins off, the surface of the copper plate was sticky.

A note about where I made the cut: When you flip the heatsink over and look where the gray VRM2 pad sits, there's a screw hole just off to the side. I made the cut just between the screw hole and raised VRM 2 pad area in as straight of a line as the saw would allow. I'm sure your Dremel will work very well for that part, and take far less time, too!


----------



## DampMonkey

Quote:


> Originally Posted by *overclockFrance*
> 
> I encounter a recurrent problem when I overclock: as soon as I increase the GPU voltage too high, my screen becomes blurred in 3D and then remains blurred in 2D. Furthermore, texture quality becomes low, as if everything were rendered at 1280x800: text becomes unreadable. If I reboot or log out, the problem is gone. When the display is blurred, if I take a screenshot, save it in Paint for example, reboot, and then open the file, the saved image is not blurred.
> 
> It is not a cooling problem, since I am watercooled. I changed BIOSes (ASUS, Sapphire, PowerColor), tried ASUS GPU Tweak and MSI Afterburner, changed the DVI port, and loosened the waterblock screws a little. The problem remains. It happened on 3 cards: an HIS 290X, a Sapphire 290, and a PowerColor 290 flashed to 290X. I plugged the display into the card's HDMI port with a DVI-HDMI adapter: the problem disappears, but the resolution is limited to 1280x800.
> 
> My hypothesis is that the GPU does not like voltages that are too high: the blurred display appears around 1325 mV in ASUS GPU Tweak, or +70 mV in MSI Afterburner.
> 
> Nevertheless, I have not found anyone else on this forum who seems to have this problem.
> 
> I'm starting to get fed up with it. I may even contemplate selling my PowerColor and buying a GTX 780 Ti.
> 
> Any ideas?


is this happening with a 1440p monitor?


----------



## flamin9_t00l

Hi Arizonian, I can't see my name in the owners list... please add me I would like to use the club sig. My entry is on page 740 post #7400.
Cheers.


----------



## prostreetcamaro

Quote:


> Originally Posted by *overclockFrance*
> 
> I encounter a recurrent problem when I overclock: as soon as I increase the GPU voltage too high, my screen becomes blurred in 3D and then remains blurred in 2D. Furthermore, texture quality becomes low, as if everything were rendered at 1280x800: text becomes unreadable. If I reboot or log out, the problem is gone. When the display is blurred, if I take a screenshot, save it in Paint for example, reboot, and then open the file, the saved image is not blurred.
> 
> It is not a cooling problem, since I am watercooled. I changed BIOSes (ASUS, Sapphire, PowerColor), tried ASUS GPU Tweak and MSI Afterburner, changed the DVI port, and loosened the waterblock screws a little. The problem remains. It happened on 3 cards: an HIS 290X, a Sapphire 290, and a PowerColor 290 flashed to 290X. I plugged the display into the card's HDMI port with a DVI-HDMI adapter: the problem disappears, but the resolution is limited to 1280x800.
> 
> My hypothesis is that the GPU does not like voltages that are too high: the blurred display appears around 1325 mV in ASUS GPU Tweak, or +70 mV in MSI Afterburner.
> 
> Nevertheless, I have not found anyone else on this forum who seems to have this problem.
> 
> I'm starting to get fed up with it. I may even contemplate selling my PowerColor and buying a GTX 780 Ti.
> 
> Any ideas?


Mine will do the same thing! Everything gets blurry. So you're not alone on that one.


----------



## Raxus

Quote:


> Originally Posted by *tsm106*
> 
> Did you not READ my post where I wrote they went on the record?
> 
> https://forum-en.msi.com/index.php?topic=174490.0
> 
> To users in this club. Yall need to read this. This sort of stuff is small in the scope of the big picture, but you can do something about it with what power you have, and that is your future buying power.
> 
> http://www.overclock.net/forum/newestpost/1444881


I linked this and actually had an MSI tech tell me the same thing:

http://www.overclock.net/t/1444675/watercooling-msi-cards-edit-directly-from-msi-from-msi-forums


----------



## anarekist

thanks for adding me to the list.


----------



## jerrolds

Quote:


> Originally Posted by *overclockFrance*
> 
> I encounter a recurrent problem when I overclock
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> : as soon as I increase the GPU voltage too high, my screen becomes blurred in 3D and then remains blurred in 2D. Furthermore, the texture quality becomes low, as if rendered in 1280x800: the characters are unreadable. If I reboot or close the session, the problem is gone. When the display is blurred, if I take a screenshot, save it in Paint for example, reboot, then open the file, the saved display is not blurred.
> 
> It is not a cooling problem, since I am watercooled. I changed BIOSes (ASUS, Sapphire, Powercolor), tried ASUS GPU Tweak and MSI Afterburner, changed the DVI port, and even loosened the waterblock screws a little. The problem remains. It happened on 3 290 cards: HIS 290X, Sapphire 290, Powercolor 290 flashed to 290X. I plugged the display into the graphics card's HDMI port with a DVI-HDMI adapter: the problem disappears, but the resolution is 1280x800.
> 
> My hypothesis is that the GPU does not like too-high voltages: the blurred display appears around 1325 mV in ASUS GPU Tweak, or +70 mV in MSI Afterburner.
> 
> Nevertheless, I have not found anyone else on this forum who seems to have this problem.
> 
> I am starting to get fed up with this problem. I may even contemplate selling my Powercolor and buying a GTX 780 Ti.
> 
> Any ideas?


Yea, that happens to me when I overclocked too high, 1230MHz on the core with GPU Tweak maxed out. It stopped once I lowered the clock, even if voltage was still maxed. If temps still look good for you, then you may have hit the ceiling of your GPU. That, or push past it hah. What happens when you max voltage?


----------



## Arizonian

Quote:


> Originally Posted by *anarekist*
> 
> thanks for adding me to the list.


No problem.








Quote:


> Originally Posted by *flamin9_t00l*
> 
> HIS Radeon R9 290
> 
> 
> 
> Had a bit of bother coming from a Nvidia system (bsods 01 and F4). Also experienced distorted colours during boot up (not in windows).
> 
> Since reinstalling the OS I've never had any more issues, very pleased
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not the best OC'er artifacting at 1100mhz with +50mV on reference cooler.
> 
> Will be adding another 290 after crimbo and both will be under water... coming from 580 SLI.


Sorry about that. Sometimes this thread moves very fast. Slotted you in order according to your entry.

Congrats - added


----------



## Gilgam3sh

@ Arizonian

it says I have SAPPHIRE, but it's POWERCOLOR









btw, running with the PT1 BIOS atm and it seems to be more stable when OC'd, but is there any way to get 2D clocks at idle working?


----------



## Arizonian

Quote:


> Originally Posted by *Gilgam3sh*
> 
> @ Arizonian
> 
> it says I have SAPPHIRE, but it's POWERCOLOR
> 
> 
> 
> 
> 
> 
> 
> 
> 
> btw, running with the PT1 BIOS atm and it seems to be more stable when OC'd, but is there any way to get 2D clocks at idle working?


Weird, I looked at the box in the pic. Unless you did an exchange. It's updated.









http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/4200#post_21159818

EDIT: 136 members and counting. Non-reference haven't even hit yet. I expect a lot more members will be coming our way soon too.


----------



## Gasoliner

PT1.ROM is broken, 2D is always 1000MHz. Update to the latest ASUS BIOS and 2D works: 300MHz.
Upgrade to the latest ASUS BIOS with ASUS GPU Tweak. Just install it and search for new updates. It will install the new BIOS from within Windows.
http://support.asus.com/download.aspx?SLanguage=en&m=gpu%20tweak&os=30


----------



## Gilgam3sh

Quote:


> Originally Posted by *Arizonian*
> 
> Weird, I looked at the box in the pic. Unless you did an exchange. It's updated.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/4200#post_21159818
> 
> EDIT: 136 members and counting. Non-reference haven't even hit yet. I expect a lot more members will be coming our way soon too.


http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/7030#post_21233569


----------



## Gilgam3sh

Quote:


> Originally Posted by *Gasoliner*
> 
> PT1.ROM is broken, 2D is always 1000MHz. Update to the latest ASUS BIOS and 2D works: 300MHz.
> Upgrade to the latest ASUS BIOS with ASUS GPU Tweak. Just install it and search for new updates. It will install the new BIOS from within Windows.
> http://support.asus.com/download.aspx?SLanguage=en&m=gpu%20tweak&os=30


ok, I have the stock ASUS 290X BIOS, but PT1 gives me better OC. I miss the 2D clocks at idle though... I might revert back to the stock ASUS 290X BIOS until (if it ever happens) the custom BIOS gets 2D clocks.

edit: I just checked with GPU Tweak and yes, there is a newer 290X BIOS, ending with AS02M


----------



## Sergio4

Quote:


> Originally Posted by *Sergio4*
> 
> i have a question for you guys. does anybody know the distances between the holes made for mounting the cooler, and their diameter? pleaseplease


bump


----------



## flamin9_t00l

Quote:


> Originally Posted by *Arizonian*
> 
> Sorry about that. Sometimes this thread moves very fast. Slotted you in order according to your entry.
> 
> Congrats - added


no probs, thanks mate









it's hard to keep up with this thread, another 20-30 pages each time I return lol


----------



## HanSoloMe

Hey there guys, I've been reading through the thread for a few days and I figured I'd make a post. I just recently purchased an XFX 290 (It's currently out of my system, but as soon as I put it back in on Monday I'll post a screenshot for the Admins). First off, I'd like to say that XFX DOES allow North Americans to remove those warranty stickers on the backplate and attach an aftermarket cooler. I spoke with 2 different representatives and have written consent that installing an aftermarket cooler does not void the warranty on an XFX card in the US.

With that being said, I'm having some serious issues with this card. I was not able to install the Accelero Xtreme III I bought, because one of the screws on the backplate stripped and I've yet to get it off. Afterwards I decided to try it stock with the reference cooler, and as soon as I tried to 3DMark it, it hit 100 degrees and shut my system down. After modifying some fan speeds the temperatures seemed to be running fine, but the card has shut my system down one more time and crashed my games. LoL and BF4 seem to crash randomly with temps never over 75 degrees. I've tried the most recent beta drivers that came out yesterday and the preceding driver. I'm kind of at a loss and XFX is closed on weekends, so I just removed the card from my system until Monday.

Any ideas/tips/comments would be greatly appreciated. Great community you guys have here, glad to be a part of it.

Thanks.


----------



## TacticalAce42

Hey guys, I'm really getting frustrated that I keep getting crashes with my R9 290 in Skyrim. I had the 290X and RMA'd it because I thought it was faulty, but I just recently got my R9 290 and I'm getting the same problem, where I play for a couple of minutes and the game crashes. I know it's the graphics card because I have no mods installed whatsoever; also, when I installed my 2 6950's in crossfire I don't have that problem. I wanted to ask: where do I go to bring this to AMD's attention? Because it's a pretty old game and it might not be on their radar right now. I would really appreciate some help.


----------



## Raxus

Quote:


> Originally Posted by *TacticalAce42*
> 
> Hey guys, I'm really getting frustrated that I keep getting crashes with my R9 290 in Skyrim. I had the 290X and RMA'd it because I thought it was faulty, but I just recently got my R9 290 and I'm getting the same problem, where I play for a couple of minutes and the game crashes. I know it's the graphics card because I have no mods installed whatsoever; also, when I installed my 2 6950's in crossfire I don't have that problem. I wanted to ask: where do I go to bring this to AMD's attention? Because it's a pretty old game and it might not be on their radar right now. I would really appreciate some help.


Who made both cards? Just curious.


----------



## DeadlyDNA

Quote:


> Originally Posted by *TacticalAce42*
> 
> Hey guys, I'm really getting frustrated that I keep getting crashes with my R9 290 in Skyrim. I had the 290X and RMA'd it because I thought it was faulty, but I just recently got my R9 290 and I'm getting the same problem, where I play for a couple of minutes and the game crashes. I know it's the graphics card because I have no mods installed whatsoever; also, when I installed my 2 6950's in crossfire I don't have that problem. I wanted to ask: where do I go to bring this to AMD's attention? Because it's a pretty old game and it might not be on their radar right now. I would really appreciate some help.


I had my R9 290's installed on stock cooling all week and one of the games I played the most was Skyrim. I did not crash at all; matter of fact, I was expecting tons of issues. Well, it was disappointing in a good way, because with 100 or so mods and an ENB it ran like a champ. The only issue I had was some flashing on the menu screen.


----------



## TacticalAce42

Quote:


> Originally Posted by *Raxus*
> 
> Who made both cards? Just curious.


The R9 290X I had was MSI the R9 290 I have is gigabyte.


----------



## TacticalAce42

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I had my R9 290's installed on stock cooling all week and one of the games I played the most was Skyrim. I did not crash at all; matter of fact, I was expecting tons of issues. Well, it was disappointing in a good way, because with 100 or so mods and an ENB it ran like a champ. The only issue I had was some flashing on the menu screen.


There has to be something wrong; I've had two different cards from two different manufacturers. Does your fan get loud when you are playing Skyrim?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Kriant*
> 
> Ok, got to play some AC4, all 3 cards working, all 3 cards go up to 42c under load.
> 
> Now....how do I OC them, same as with 7970 or a bit different because of their "dynamic" speed system or whatever that is ?
> 
> Almost forgot : 3x R9 290 perform better than 4x 7970 in AC4
> 
> 
> 
> 
> 
> 
> 
> waaay better


Is there a way to get over 62fps?


----------



## Spectre-

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Is there a way to get over 62fps?


turn off v-sync?


----------



## Raephen

Quote:


> Originally Posted by *Redvineal*
> 
> Best I can tell it's some kind of epoxy adhesive. I tried wedging the copper off, but started to bend the black plate. It's on there far too well to get apart by hand. That's why I just decided to cut through that sucker!
> 
> Once the front plate is separated from the rest, the aluminum fins are extremely easy to peel. I just used pliers to pull a fin to the side away from the rest, and peeled from back to front. Actually, once I peeled all the fins off, the surface of the copper plate was sticky.
> 
> A note about where I made the cut: When you flip the heatsink over and look where the gray VRM2 pad sits, there's a screw hole just off to the side. I made the cut just between the screw hole and raised VRM 2 pad area in as straight of a line as the saw would allow. I'm sure your Dremel will work very well for that part, and take far less time, too!


Thanks for the tips!

I'm hoping it won't come to that - rather, I'm hoping the pushed-back estimated shipping date of 04-12-2013 (from 25-11, from 21-11) will finally be the date they send me the Aquacomputer Hawaii block...

If they push it back again, I might start to look at alternatives. The AX III I was thinking of at first would be a good option, and with a base plate mod like yours it would even be better









Only time will tell.

Cheers and +rep for your info


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Spectre-*
> 
> turn off v-sync?


Have you played it? Lol when you turn off vsync it still caps you at 62fps.


----------



## Sergio4

can someone please tell me distance and diameter of mounting holes on r9 290/x


----------



## Raephen

Quote:


> Originally Posted by *TacticalAce42*
> 
> Hey guys Im really getting frustrated that I keep getting crashes with my R9 290 in skyrim. I had the 290x and RMA'd it because I thought it was faulty but I just recently got my R9 290 and Im getting the same problems where I would play for a couple of minutes and the game would crash. I know itsthe graphics card because I have no mods installed what so ever, also I installed my 2 6950's in crossfire and I dont have that problem. I wanted to ask were do I go to bring this to AMD's attention? Because its a pretty old game and it might not be on their radar right now. I would really appreciate some help.


Wow, that's odd: with me, Skyrim seems to love my 290. Ultra on a 290 > Ultra on a HD7870. And I run loads of mods.

I once had a CTD from Skyrim, though, but I couldn't replicate it. I did, however, raise the power limit by 10% to be on the safe side.

The CTD reminded me of when I was OCing the FX-4170 CPU before my Ivy i5, when a CTD in Skyrim = more volts, please!


----------



## DeadlyDNA

Quote:


> Originally Posted by *Sergio4*
> 
> can someone please tell me distance and diameter of mounting holes on r9 290/x


I did not measure them, so please don't kill me; however, I did mount an aftermarket air cooler just to see if the posts fit, and they did. That was a CarbonX4 VGA cooler, according to its manual. The holes that lined up were 53mm. I will try to measure them for you as well; I am still building my machine back up with watercooling.

Yes, according to my measuring it comes to about 52.4mm or so. I think 53mm is safe to say; if it's not exactly 53mm then it's 52.5ish. Hope that helps; maybe someone else can confirm for the R9 290.


----------



## Scorpion49

Great, still have the crappy AMD driver issue with the audio devices causing my audio control panel to lock up. I thought they fixed that back in the 12-series drivers.


----------



## rdr09

Quote:


> Originally Posted by *Sergio4*
> 
> can someone please tell me distance and diameter of mounting holes on r9 290/x


is this what you are looking for?



used the stock cooler to measure but i only used a tape measure.


----------



## DeadlyDNA

Quote:


> Originally Posted by *rdr09*
> 
> is this what you are looking for?
> 
> 
> 
> used the stock cooler to measure but i only used a tape measure.


Thank you, I was about to take a picture of the measurements for both


----------



## the9quad

Quote:


> Originally Posted by *rdr09*
> 
> is this what you are looking for?
> 
> 
> 
> used the stock cooler to measure but i only used a tape measure.


what are the measurements on the RAM? if you don't mind.


----------



## Sergio4

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I did not measure them, so please don't kill me; however, I did mount an aftermarket air cooler just to see if the posts fit, and they did. That was a CarbonX4 VGA cooler, according to its manual. The holes that lined up were 53mm. I will try to measure them for you as well; I am still building my machine back up with watercooling.
> 
> Yes, according to my measuring it comes to about 52.4mm or so. I think 53mm is safe to say; if it's not exactly 53mm then it's 52.5ish. Hope that helps; maybe someone else can confirm for the R9 290.


Thank you







what about the diameter of those holes?


----------



## Sgt Bilko

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Have you played it? Lol when you turn off vsync it still caps you at 62fps.


I thought that was strange; I'm just running a single 290X with most of the eye-candy on and I'm getting 62fps all the time.......the PC version didn't get a lot of love.

Still though, much better than AC III.


----------



## TacticalAce42

Does anyone have 2 290's in crossfire? Whats your fps on Assassins creed 4 and settings?


----------



## nyboy42

Quote:


> Originally Posted by *Scorpion49*
> 
> Great, still have the crappy AMD driver issue with the audio devices causing my audio control panel to lock up. I thought they fixed that back in the 12 series drivers


got the same problem, AND the HDMI port is not pushing out audio to my home theater receiver, ONLY video... bah, my 6970 never had issues with HDMI audio


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I thought that was strange; I'm just running a single 290X with most of the eye-candy on and I'm getting 62fps all the time.......the PC version didn't get a lot of love.
> 
> Still though, much better than AC III.


Lol yea it threw me off at first too! I was like good lawwwwd this is the most consistent fps I've ever had... then i realized.

Ahhh wellll definitely a fun game!


----------



## MrWhiteRX7

Quote:


> Originally Posted by *TacticalAce42*
> 
> Does anyone have 2 290's in crossfire? Whats your fps on Assassins creed 4 and settings?


won't go over 62 regardless


----------



## TacticalAce42

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I thought that was strange, i'm just running the Single 290x with most of the eye-candy on and i'm getting 62fps all the time.......PC version didn't get a lot of love.
> 
> Still though, much better than AC III.


What are your settings exactly, and where are you when you get that fps? Try going to a city and let us know your fps, because on my 290 I get 60fps, but when I'm in a city it drops down to 30fps: vsync can't maintain the 60fps, so it goes to 30 and locks there. So what are your settings?
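The 60-to-30 lock described above is classic double-buffered vsync behaviour: a frame that misses the refresh window waits for the next one, so the displayed rate quantizes to 60/n. A rough sketch of that arithmetic (assuming a 60Hz display and no triple buffering; the render times are made-up examples):

```python
import math

def vsync_fps(render_ms, refresh_hz=60):
    """Effective frame rate under double-buffered vsync:
    each frame stays on screen for a whole number of refresh intervals."""
    interval_ms = 1000 / refresh_hz
    intervals = math.ceil(render_ms / interval_ms)
    return refresh_hz / intervals

print(vsync_fps(15))  # finishes inside one 16.7ms interval -> 60.0 fps
print(vsync_fps(20))  # just misses a refresh               -> 30.0 fps
print(vsync_fps(40))  # -> 20.0 fps
```

So a scene that renders even slightly slower than 16.7ms drops straight to 30fps rather than, say, 55, which matches the city behaviour described in the post.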


----------



## rdr09

Quote:


> Originally Posted by *the9quad*
> 
> what are the measurements on the RAM? if you don't mind.


I can't measure those 'cause my GPU is installed. If you're asking about the distance of the holes near the RAM, those are not symmetrical. I don't think they form a rectangle or square, is what I am saying.


----------



## r0cawearz

for those who installed aftermarket coolers like the Gelid Icy Vision v2: I was thinking about flipping the case upside down so the VRM heatsinks don't fall off. I can't think of anything that could possibly go wrong atm, but I could be wrong.

anyone think this is a good or terrible idea?


----------



## Sgt Bilko

Quote:


> Originally Posted by *TacticalAce42*
> 
> What are your settings exactly, and where are you when you get that fps? Try going to a city and let us know your fps, because on my 290 I get 60fps, but when I'm in a city it drops down to 30fps: vsync can't maintain the 60fps, so it goes to 30 and locks there. So what are your settings?


I don't have Vsync on for starters, and as MrWhiteRX7 said, it won't go over 62 fps regardless of what Vsync is set to (console influence there)

I'll edit this post with a screencap of my settings when Uplay finishes being a jerk

EDIT: Uplay stopped being a jerk









I dip to about 40-50 fps in towns and 30 in heavy Naval combat in a storm (4-5 Ships plus rain and waterspouts)


----------



## the9quad

Quote:


> Originally Posted by *rdr09*
> 
> I can't measure those 'cause my GPU is installed. If you're asking about the distance of the holes near the RAM, those are not symmetrical. I don't think they form a rectangle or square, is what I am saying.


nah, was asking so I can order heatsinks for 'em. thanks though!


----------



## TacticalAce42

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I don't have Vsync on for starters and as MrWhiteRX7 said, it wont go over 62 fps regardless of what Vsync is set to (Console Influence there)
> 
> I'll edit this post with a screencap of my settings when Uplay finishes being a jerk
> 
> EDIT: Uplay stopped being a jerk
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I dip to about 40-50 fps in towns and 30 in heavy Naval combat in a storm (4-5 Ships plus rain and waterspouts)


Ok cool, so I suspected as much. I have AA on, EQAA 4x(x8) to be exact. I want AA and a constant 60 fps in Assassin's Creed 4. I know a single 290/290X would be more than enough if the game were optimized well, but sadly we have to rely on raw power these days. So I think a second 290 should be fine for this new generation of games.


----------



## Sergio4

Diameter of mounting holes?????


----------



## Sgt Bilko

Quote:


> Originally Posted by *TacticalAce42*
> 
> Ok cool, so I suspected as much. I have AA on, EQAA 4x(x8) to be exact. I want AA and a constant 60 fps in Assassin's Creed 4. I know a single 290/290X would be more than enough if the game were optimized well, but sadly we have to rely on raw power these days. So I think a second 290 should be fine for this new generation of games.


Yeah, tbh most of the time I don't notice AA in games so I generally leave it off, and AC IV is a very pretty game without it
......kinda surprised me actually.

And i'm going to do the same, getting a 2nd 290x after Xmas sometime.
Quote:


> Originally Posted by *Sergio4*
> 
> Diameter of mounting holes?????


Roughly 5mm, and you don't need to spam to ask a question, so please don't


----------



## TacticalAce42

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah tbh most of the time i don't notice AA in games so i generally leave it off and AC IV is a very pretty game without it
> ......kinda surprised me actually.
> 
> And i'm going to do the same, getting a 2nd 290x after Xmas sometime.


I'm just waiting to see if AMD will have a holiday bundle or something. I wish they would bring back Never Settle, but I think they realize there are not that many "new" games being released this holiday.


----------



## the9quad

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah tbh most of the time i don't notice AA in games so i generally leave it off and AC IV is a very pretty game without it
> ......kinda surprised me actually.
> 
> And i'm going to do the same, getting a 2nd 290x after Xmas sometime.
> Roughly 5mm, and you don't need to spam to ask a question, so please don't


this i find hilarious

PC Gamers Who Want Performance Should Just Buy a Better GPU, Ubisoft Suggests

"It's always a question of compromise about the effect, how it looks, and the performance it takes from the system. *On PC, usually you don't really care about the performance, because the idea is that if it's not [running] fast enough, you buy a bigger GPU.* Once you get on console, you can't have this approach."


----------



## Subby

I saw this worked for at least one other guy, and now I can confirm that it worked for me. After ordering and receiving my XFX 290 (non-X), I registered on XFX's site, clicked on the promotion, and received a Battlefield 4 key. Ya boy!

Side note: Koolance water block is already here, just waiting on my XSPC Raystorm AX360 w/ D5 Vario kit to get here.


----------



## Sgt Bilko

Quote:


> Originally Posted by *TacticalAce42*
> 
> I'm just waiting to see if AMD will have a holiday bundle or something. I wish they would bring back Never Settle, but I think they realize there are not that many "new" games being released this holiday.


I'm just waiting for the non-ref cards to drop so I can get another reference card cheap and finally get my water loop
Quote:


> Originally Posted by *the9quad*
> 
> this i find hilarious
> 
> PC Gamers Who Want Performance Should Just Buy a Better GPU, Ubisoft Suggests
> 
> "It's always a question of compromise about the effect, how it looks, and the performance it takes from the system. *On PC, usually you don't really care about the performance, because the idea is that if it's not [running] fast enough, you buy a bigger GPU.* Once you get on console, you can't have this approach."


PC doesn't care about performance?........









Well at the very least we have Mantle to look forward to.....


----------



## Kriant

Quote:


> Originally Posted by *TacticalAce42*
> 
> Ok cool, so I suspected as much. I have AA on, EQAA 4x(x8) to be exact. I want AA and a constant 60 fps in Assassin's Creed 4. I know a single 290/290X would be more than enough if the game were optimized well, but sadly we have to rely on raw power these days. So I think a second 290 should be fine for this new generation of games.


I run 5760x1080 with everything maxed out, vsync, and MSAA x4; with three 290s I get around 30-40 fps with dips to 28 sometimes. It's pretty playable, but I haven't figured out how to force constant 100% usage on the cards instead of them bouncing all over the place (as shown by the Afterburner load graph)


----------



## SamEkinci

Links to 3DMark Results Before the mod - 947 @ default voltage
ICE STORM - 168147 http://www.3dmark.com/is/1148598
CLOUD GATE - 19435 http://www.3dmark.com/cg/1026338
FIRE STRIKE - 8566 http://www.3dmark.com/fs/1168983

Links to 3DMark Results After the mod - 1150 @ default voltage
ICE STORM - 175120 http://www.3dmark.com/is/1149048
CLOUD GATE - 20462 http://www.3dmark.com/cg/1028133
FIRE STRIKE - 9522 http://www.3dmark.com/fs/1171254

Temperatures

Before
Idle 38C
Load 95C

After
Idle 30C
Load 50C

VRM1 and VRM2 @ idle
32C and 43C

VRM1 and VRM2 @ Load
76C and 62C
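For anyone wanting to compare their own runs, the scaling implied by the before/after numbers above can be worked out directly. A quick Python sketch (all clock and score values are copied from the results posted above):

```python
# Percent improvement from the before/after 3DMark runs above
# (947 MHz stock vs 1150 MHz overclock, both at default voltage).
def pct_gain(before, after):
    """Percentage improvement from `before` to `after`."""
    return (after - before) / before * 100

results = {
    "Core clock":  (947, 1150),
    "Ice Storm":   (168147, 175120),
    "Cloud Gate":  (19435, 20462),
    "Fire Strike": (8566, 9522),
}

for name, (before, after) in results.items():
    print(f"{name}: +{pct_gain(before, after):.1f}%")
```

Fire Strike, the most GPU-bound of the three tests, gains about 11% from the roughly 21% core clock bump, while Ice Storm and Cloud Gate are largely CPU-limited and move much less.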

Video of unboxing XFX R9 290 - cheap packaging.




DIY installation of Antec 620 closed loop water cooler on to R9 290.


----------



## the9quad

You guys think a single 360 rad will keep the temps ok on a pair of 290X's? Would love them to be <80 at full load, preferably in the low 70's. Loop will essentially be: res>pump>CPU>240rad>GPUs>360rad>res.

and no zipties! I know I'd break something doing that.


----------



## Hogesyx

Quote:


> Originally Posted by *SamEkinci*
> 
> Links to 3DMark Results Before the mod - 947 @ default voltage
> ICE STORM - 168147 http://www.3dmark.com/is/1148598
> CLOUD GATE - 19435 http://www.3dmark.com/cg/1026338
> FIRE STRIKE - 8566 http://www.3dmark.com/fs/1168983
> 
> Links to 3DMark Results After the mod - 1150 @ default voltage
> ICE STORM - 175120 http://www.3dmark.com/is/1149048
> CLOUD GATE - 20462 http://www.3dmark.com/cg/1028133
> FIRE STRIKE - 9522 http://www.3dmark.com/fs/1171254
> 
> Temperatures
> 
> Before
> Idle 38C
> Load 95C
> 
> After
> Idle 30C
> Load 50C
> 
> VRM1 and VRM2 @ idle
> 32C and 43C
> 
> VRM1 and VRM2 @ Load
> 76C and 62C


Did you mount another fan to cool the VRMs?


----------



## Pesmerrga

Quote:


> Originally Posted by *rdr09*
> 
> is this what you are looking for?
> 
> 
> 
> used the stock cooler to measure but i only used a tape measure.


Quote:


> Originally Posted by *Sergio4*
> 
> Diameter of mounting holes?????


Dropped a higher-res pic of the die/mounting area/RAM chips into Photoshop and made lines using 3 inches as the reference point for the diagonal across the die. Second image blown up so the numbers could be read better.

That makes sense, right? Definitely not better than using a ruler, but seeing as I haven't purchased a card yet, it's all I can do.


----------



## SamEkinci

Quote:


> Originally Posted by *Hogesyx*
> 
> Did you mount another fan to cool the VRMs?




They are right smack in front of the airflow, and it turned out the additional fan actually disrupted that and warmed things up. If you look at the picture above, the two push/pull radiator fans are blowing right on the card and VRM, and there is an exhaust at the back and top. This allowed for lower temperatures. Considering people are seeing close to 100C VRM temperatures, I think it's fairly good.


----------



## Pesmerrga

Quote:


> Originally Posted by *basco*
> 
> hey maarten did ya use this specific version of gputweak? i think so:
> http://www.mediafire.com/download/7g1in2cj3rik7is/GPUTweakVer2491_+20131017-Hawaii.zip
> 
> just slapped my old Twin Frozr 5830 cooler on the 290 and hell, it's quiet now.
> used the VRM cooler from a Giga 470 SOC and a small alu heatsink


I've been waiting to see if someone responded to this guy. It looks like no one did. Seems like an uncommon thing, using an MSI Twin Frozr II cooler from a 5830 on an R9 290. It was the last post on page 670, so it might have just gotten missed.


----------



## SamEkinci

Not sure if this has been mentioned, but AMD announced a Battlefield 4 bundle with R9 series cards. If you bought your card before, you can go to your manufacturer's website and claim it through EA Origin.

Now if anyone can help me figure out how to get past the corruption errors when installing Origin, that would be dandy.

Quote:


> Originally Posted by *Pesmerrga*
> 
> I've been waiting to see if someone responded to this guy. It looks like no one did. Seems like an uncommon thing, using a MSI Frozr II cooler from a 5830 on a R9-290. It was the last post on page 670.. so it might have just gotten missed..


Looks like he just used the mounting bolts around the GPU. That thing looks like it weighs a ton; I wonder how it performs.

Quote:


> Originally Posted by *the9quad*
> 
> You guys think a single 360 rad will keep the temps ok on a pair of 290x's? Would love them to be <80 full load. preferably in the low 70's. Loop will essentially be: res>pump>CPU>240rad>GPU's>360rad>res.
> 
> and no zipties! I know I'd break something doing that.


In my experience, and although I am a very experienced artisan, I have had more breakage with brackets and bolts. I always end up putting on too much torque. Zip-ties only stretch and clip so much, so you cannot force them like you can with screws; in short, it's easier to spot the limit with a zip-tie. That's my experience. With every custom bracket I've made, I get scared to death about putting too much torque on the screws, even when using tension springs.

Unless I am designing it with CAD and using a milling machine. But then again, zip-ties have never failed me and I've always had 100% success with them in comparison, so why bother? Unless of course for looks, but imho duct tape and zip-ties have a clean, more realistic look.


----------



## the9quad

Quote:


> Originally Posted by *SamEkinci*
> 
> Not sure if this has been mentioned but AMD announced Battlefield 4 bundle with R9 series cards.. If you bought your card before you can go to your manufacturers website and claim it through EA Origin..
> 
> Now if anyone can help me figure out to get over corruption errors when installing Origin it will be dandy.
> Looks like he just used the mounting bolts around the GPU, that thing looks like it weighs a ton, I wonder how it performs.
> In my experience, and although I am a very experienced artisan, I have had more breakage with brackets and bolts.. I always end up putting too much torque always. Zip-ties stretch only and clip so much that you cannot force them like you could with screws. In short its easier to spot the limit with a zip-tie. Thats my experience. Every custom bracket I made, I get scared to death about putting too much torque on screws, even when using tension springs.
> 
> Unless I am designing it with cad and using milling machine, but then again, zip-ties never failed me and always had 100% success with them in comparison so why bother? Unless of course for looks, but imho I like duct tape and zip ties clean look, more realistic imho.


I wasn't knocking you bro, your video was awesome. I even repped ya for it.


----------



## SamEkinci

Quote:


> Originally Posted by *the9quad*
> 
> i wasn't knocking you bro, your video was awesome. I even repped ya for it.


Sorry, I was just trying to reply nicely. I meant it as an honest opinion based on experience; I didn't mean to be offensive/defensive or to attack you in any way.









For some reason, after doing this mod literally on 20 or so different cards (without knowing the existence of these threads) I just prefer zip-ties. Maybe thats because its quicker way of doing things. However, of all the bracket and bolt failures and me over torqueing, zip-ties work so much better.

It's weird: all my knowledge and experience makes me anal about this sort of thing, but I can't help but admit zip-ties just work on everything.


----------



## Kriant

Quote:


> Originally Posted by *SamEkinci*
> 
> Not sure if this has been mentioned but AMD announced Battlefield 4 bundle with R9 series cards.. If you bought your card before you can go to your manufacturers website and claim it through EA Origin..


Wait whaaa? how ? where ? 0_o didn't see any news about that


----------



## SamEkinci

http://sites.amd.com/us/promo/battlefield/Pages/battlefield4.aspx

There you go!


----------



## Kriant

Quote:


> Originally Posted by *SamEkinci*
> 
> http://sites.amd.com/us/promo/battlefield/Pages/battlefield4.aspx
> 
> There you go!


Ah well, that I saw, but it's only true for XFX and Sapphire. My Asus cards are left out


----------



## jomama22

Quote:


> Originally Posted by *Kriant*
> 
> Wait whaaa? how ? where ? 0_o didn't see any news about that


He is wrong. It's only for specific listings at retailers; the AIBs are making specific bundled cards. Not all R9 cards are eligible.


----------



## Kriant

Quote:


> Originally Posted by *jomama22*
> 
> He is wrong. Its with only the specific listings at retailers. The AIBs are making Specific bundled cards. Not all r9 cards are eligible.


That's what I thought.


----------



## SamEkinci

He is right, the ad was a bit misleading: it lists all manufacturers, but then you find out it's Sapphire and XFX only..

That's just wrong..


----------



## KyGuy

I don't know much about overclocking records with these, but I got my memory to 1500 MHz stable on air, which gives me 384 GB/s!!! Temps maxed out at 83C with 60% fan. I'm not too worried about noise. I added 100 mV through Afterburner Beta 17. Pretty good if you ask me. I really want to push for 400 GB/s LOL!!
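For anyone curious where that 384 GB/s figure comes from, here is the back-of-envelope arithmetic as a small sketch, assuming GDDR5's 4x effective data rate per pin and the 290/290X's 512-bit bus:

```python
# Rough memory-bandwidth arithmetic for a 512-bit GDDR5 card (a sketch of the
# usual formula, not a vendor-verified tool): GDDR5 transfers at 4x the
# memory clock, and 512 bits = 64 bytes per transfer across the bus.
def gddr5_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int = 512) -> float:
    effective_rate_gbps = mem_clock_mhz * 4 / 1000   # e.g. 1500 MHz -> 6 Gbps per pin
    return effective_rate_gbps * bus_width_bits / 8  # bits -> bytes

print(gddr5_bandwidth_gbs(1250))    # stock 290X memory clock -> 320.0 GB/s
print(gddr5_bandwidth_gbs(1500))    # the 1500 MHz OC above   -> 384.0 GB/s
print(gddr5_bandwidth_gbs(1562.5))  # ~1563 MHz would be needed for 400 GB/s
```

By the same math, the 400 GB/s goal works out to roughly another 63 MHz on the memory clock.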


----------



## KyGuy

Could you add me please? Make is Sapphire, and cooling is stock....


----------



## prostreetcamaro

I am having a strange issue. Sometimes when I start playing COD: Ghosts my frame rates will be lower than normal and the game doesn't feel very smooth. If I reboot and then go back in, the fps will stay pegged at pretty much 91 and the game plays smooth as silk. I can't figure out why this happens. Same goes for benchmarks: sometimes I have to reboot to get my bench scores up to where they should be. Anybody else run into this?


----------



## Sgt Bilko

Quote:


> Originally Posted by *prostreetcamaro*
> 
> I am having a strange issue. Sometimes when i start playing COD ghosts my frame rates will be lower than normal and the game doesnt feel very smooth. If I reboot and then go back in the fps will pretty much stay pegged at 91 and the game plays smooth as silk. I cant figure out why this happens? Same goes for benchmarks. Sometimes I have to reboot to get my bench scores up to where they should be. Anybody else run into this?


Just happened to me in AC4. It was running at 62 fps same as always, then I alt-tabbed, and it dropped to 30 or so and stuttered like hell... just reloaded the game and now it's fine.

Not sure what's up with that; it was fine yesterday


----------



## Banedox

Hey Add me to this list, I got a XFX 290 unlocked to a 290X, all at stock on air right now. Waiting on my water cooling stuff in the coming weeks...


----------



## Arizonian

Quote:


> Originally Posted by *Banedox*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Hey Add me to this list, I got a XFX 290 unlocked to a 290X, all at stock on air right now. Waiting on my water cooling stuff in the coming weeks...


Congrats - added


----------



## overclockFrance

Quote:


> Originally Posted by *DampMonkey*
> 
> is this happening with a 1440p monitor?


2560x1600 monitor : DELL 3007.


----------



## hatlesschimp

I get around 55-65 FPS in ARMA 3 with CF 290Xs on the highest settings with MSAA at 2x, running 3360 x 1920.

Depending on how many players and objects are on the map it can drop to 35 fps, but that has been rare so far.


----------



## DampMonkey

Quote:


> Originally Posted by *overclockFrance*
> 
> 2560x1600 monitor : DELL 3007.


I have the problems you've described when overclocking hard on my 1440p monitor, but when i switch to a 1080p or 1200p monitor, the problems disappear. Might wanna try that


----------



## overclockFrance

Quote:


> Originally Posted by *prostreetcamaro*
> 
> Mine will do the same thing! Everything gets blurry. So your not alone on that one.


Did you manage to solve the problem ?

Otherwise, here is my configuration, in order to find common points :

ASUS Maximus VI Hero, BIOS 0711
[email protected] MHz, 1.37 V
4x32 GB G.Skill
Powercolor [email protected], Catalyst 13.11 beta v9.4
512 GB Samsung 830 SSD
2x2 TB HDD
Cooler Master 850 W Gold power supply
Windows 7 64-bit
DELL 3007 @ 2560x1600

Does your configuration have common points with mine ?


----------



## overclockFrance

Quote:


> Originally Posted by *jerrolds*
> 
> Yea that happens to me when i overclocked too high, 1230mhz on the core and maxed out GPU Tweak. It stopped once i lowered the clock, even if voltage was still maxed. If temps still look good for you - they you may have hit the ceiling of your gpu. That or push past it hah what happens when you max voltage?


With max voltage, the display becomes more blurry.

My locked Sapphire 290 was a very good overclocker (1270/1725 MHz, stable in benchmarks at 1412 mV), and the display was blurry in 3DMark Vantage at 1270 MHz, and at 1250 MHz in Call of Duty: Ghosts and Total War: Rome II. At the same frequency, in about 10 other games, the display was fine.

With the HIS 290X, 3DMark Vantage was blurry at 1260 MHz and games at 1200 MHz. With the current Powercolor [email protected], 1140 MHz is OK; 1150 MHz at +70 mV is blurry.


----------



## overclockFrance

Quote:


> Originally Posted by *DampMonkey*
> 
> I have the problems you've described when overclocking hard on my 1440p monitor, but when i switch to a 1080p or 1200p monitor, the problems disappear. Might wanna try that


I own only one monitor. But by plugging it in via HDMI with an HDMI-DVI adapter, my problem disappears; however, the resolution is then 1280x800 instead of 2560x1600.

I changed parameters in CCC for the DVI interface, but that changed nothing.

What is your configuration, so we can find common points in addition to the monitor?


----------



## brazilianloser

BF4 and its bugs... one night playing just fine, the other I can't even pass five minutes without the game crashing.


----------



## SamEkinci

My XFX 290 gives a "drivers stopped responding" error over 1170 MHz at 2560x1440 with DVI-I at default voltage.

I never experience the blurriness you guys mentioned, no matter what the voltage is.

The only thing is, mine is water cooled. So perhaps temperature is the problem?


----------



## overclockFrance

Quote:


> Originally Posted by *SamEkinci*
> 
> My xfx 290 gives "drivers stopped responding" error over 1170mhz @ 2560*1440 resolution with dvi-i at default voltage.
> 
> Never experience bluriness you guys mentioned no matter what the voltage is.
> 
> Only thing is mine is water cooled. So perhaps the temperature is the problem?


No, mine is watercooled too.

What is your configuration ?


----------



## SamEkinci

Quote:


> Originally Posted by *overclockFrance*
> 
> No, mine is watercooled
> 
> What is your configuration ?


Similar to yours. Perhaps it's the power supply then? I also have the uber BIOS selected.

The only difference I can see between our setups is that I am running two PSUs for a combined output of 1200 W, with 650 W dedicated to the GPU and CPU alone.

Everything else, and I mean everything, is on the secondary PSU.


----------



## AlNasty

Quote:


> Originally Posted by *S410520*
> 
> Thanks, cost me and a friend helping me about 2.5 hours. Thought might as well take pics and post it as others might buy the Icy Rev2 thinking it fits out of the box.
> 
> Hopefully the Arctic hybrid that's going on the other unlocked 290 next week will fit out of the box.
> 
> I like modding, but this was just a bit to much only to make it fit.


I just finished installing the Hybrid tonight. You are in for a bit of work and will spend a good amount of time installing it. In the end, it's worth it: nearly silent under load, with load temps of 55-60C for me so far. VRAM 1 does not get much airflow; I saw 95C under load with my case open. My case has a side fan that blows right at the card when installed, cooling the heatsinks, so I put the side panel back on. VRAM 1 now hovers around 75C under load.

You are going to find you don't have enough RAM heatsinks. I used the heatsink package from the Gelid Icy (I had a revision A lying around). They were nearly perfect; there is even a sink that almost fits perfectly where the copper base overhangs the single isolated RAM chip near the edge. I had to mod one sink for VRAM 1. I might mod another spare a little differently for better cooling and put it on later; glad that one got thermal tape. I wish I had ordered some good thermal tape ahead of time. I used the Gelid stuff, lost a couple of pieces and trashed a couple more (the stuff is small and thin). I did not want to use the thermal epoxy, but I had to mix it 50/50 with thermal paste to get the job finished on the last few pieces. I did not want to use it at 100% in case I need to remove a heatsink later.

You will need to mod a sink to fit under the copper base unless you have the one from the Gelid package that is half flat and half fins. You will need to mod one for VRAM 1 as well. Everything else was done following the directions.

Good luck.


----------



## overclockFrance

Quote:


> Originally Posted by *SamEkinci*
> 
> Similar to yours perhaps its the power supply then? I also have uber bios selected.
> 
> Only difference I could see between our setup would be that I am running two PSUs for combined output of 1200w 650w dedicated for GPU and CPU alone.
> 
> Everything else and I mean everything is on secondary psu.


Yes, maybe the PSU.

In order to confirm that, I asked other people in this thread who encounter the same problem, to give me their configuration.


----------



## SamEkinci

Quote:


> Originally Posted by *overclockFrance*
> 
> Yes, maybe the PSU.
> 
> In order to confirm that, I asked other people in this thread who encounter the same problem, to give me their configuration.


Looking at your setup, you could try removing some of the hard drives, returning the CPU overclock to default, and maybe removing fans and other hardware you aren't using, then give it a shot. That should free up enough power, if your problem is indeed power.

Either way, with your hardware and the CPU overclocked, it sounds like your PSU would be having a hard time regardless. A PSU calculator says this card alone, with the CPU and motherboard, draws close to 500 W; add water cooling on top and it's a big problem. This is why I went with dual PSUs: to clear the 12V rail, I put all the water cooling gear and fans on one PSU while giving the ATX, CPU and GPU their own PSU.
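The budget check behind that dual-PSU decision can be sketched in a few lines. All the wattages below are illustrative guesses for this kind of build, not measured values; plug in your own components:

```python
# Back-of-envelope 12V power budget (a sketch; every number here is an
# assumed worst-case draw, not a measurement): sum the loads and compare
# against the PSU's rated output to see the remaining headroom.
loads_watts = {
    "R9 290 (overclocked, uber BIOS)": 300,  # assumed peak draw
    "CPU (overclocked)": 180,                # assumed peak draw
    "motherboard + RAM": 60,
    "pump + fans + drives": 60,
}

total = sum(loads_watts.values())
psu_rating = 850  # e.g. a single 850 W unit
headroom = psu_rating - total
print(f"estimated draw: {total} W, headroom on a {psu_rating} W unit: {headroom} W")
```

If the headroom comes out thin, either shedding loads (as suggested above) or splitting them across two PSUs gets you back into safe territory.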


----------



## overclockFrance

Quote:


> Originally Posted by *SamEkinci*
> 
> Looking at your setup, you can try to remove off some of the hard-drives and default the overclock on your CPU, maybe remove fans and some other hardware you arent using, then give it a shot, that should give you enough power to get going, if indeed your problem is power.
> 
> Whether it is or not, with your hardware and CPU overclocked sounds like your PSU would be having a hard time nevertheless. PSU calculator says this card alone sucks out close to 500w with CPU and motherboard alone, add water cooling on that and its a big problem. This is why I went double PSU, to clear 12v and put all the water cooling stuff and fans on an PSU while putting ATX, CPU and GPU on its own PSU.


OK, i am going to test and report later.

Thanks.


----------



## MrWhiteRX7

Well, the final piece of the puzzle has been solved for me. Once I fixed my downclock issue, all I had left to figure out was my erratic GPU usage issue. A lot of people were telling me it's just an AB bug, but with my fps jumping around a lot I knew that couldn't be the case...

Check it out now! This is Arkham Origins; the long dips are the stupid cut scenes. I'm getting incredible FPS now with 4x MSAA and alllll the DX11 junk turned up. Everything maxed out, except no PhysX of course. This 290 at these clocks has made me so happy











No more stutter either...

So what was my fix? I WENT BACK TO MY 2600k!!!!







My FX 8350 @ 4.8 GHz just couldn't keep up. I switched teams to give it a shot, and although it did well for some things over my Intel, when it comes to gaming it's just painfully unable to push these GPUs the way they need to be pushed. IMO

My Sandy Bridge is only at 4.2 GHz, and the memory is downclocked to 2133 instead of the 2400 I ran on the AMD rig.

I am happy

EDIT: This is on the 13.11 WHQL driver


----------



## selk22

Quote:


> Originally Posted by *overclockFrance*
> 
> In order to confirm that, I asked other people in this thread who encounter the same problem, to give me their configuration.


It is very helpful if you guys include your rig in your signature. That way people don't need to ask!


----------



## overclockFrance

Signature updated.


----------



## Sgt Bilko

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> So what was my fix? I WENT BACK TO MY 2600k!!!!
> 
> 
> 
> 
> 
> 
> 
> My fx 8350 @ 4.8ghz just couldn't muster. I switched teams to give it a shot and although it did well for some things over my intel, when it comes to gaming it's just painfully not able to push these gpu's the way they need to be pushed. IMO
> 
> My sandybridge is only at 4.2ghz, memory is downclocked to 2133 instead of 2400 when I had it on the amd rig.
> 
> I am happy
> 
> EDIT: This is on the 13.11 WQHL driver


Yeah, AMD's processors really need a speed bump in the gaming department. My 8350 was great for my 7970, but for the 290X it's just not up to scratch.

I've got high expectations for AMD's next processor, because if it falls short then I'm switching to the Blue team (as much as it pains me)


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah, AMD's Processors really need a speed bump in the gaming department, My 8350 was great for my 7970 but for the 290x it's just not up to scratch.
> 
> I've got high expectations for AMD next Processor because if it falls short then i'm switching to the Blue team (as much as it pains me)


THIS!

On my FX 8350 @ 4.88ghz with single 7970 it was neck and neck with my 2600k @ 4.5ghz in games. Tri-fire not so much but still pretty damn good. I'm really shocked at how big of a difference this 290 makes.


----------



## Sgt Bilko

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> THIS!
> 
> On my FX 8350 @ 4.88ghz with single 7970 it was neck and neck with my 2600k @ 4.5ghz in games. Tri-fire not so much but still pretty damn good. I'm really shocked at how big of a difference this 290 makes.


I can't say that I'm shocked, more disappointed really.

AMD brings out this truly awesome GPU, but the CPUs they sell can't keep up with it; it's almost funny in a sad sort of way.

Mind you, I don't think we will see a Steamroller FX chip until late 2014 (if ever at all)


----------



## MrWhiteRX7

I really, really wanted the AMD rig so I could have a complete red team setup, but it just doesn't work out that way. And to think I almost went tri-fire 290's with that CPU.


----------



## Sgt Bilko

I'm going CF later on and i'm hoping to keep it an all AMD rig but if it turns out crap then i guess i gotta fork out some more cash and replace half my PC


----------



## Sergio4

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah tbh most of the time i don't notice AA in games so i generally leave it off and AC IV is a very pretty game without it
> ......kinda surprised me actually.
> 
> And i'm going to do the same, getting a 2nd 290x after Xmas sometime.
> Roughly 5mm and you don't need to spam to ask a question so please dont't


Thank you. But everyone was just ignoring me


----------



## Mas

Quote:


> Originally Posted by *jomama22*
> 
> I believe sapphire and HIS may be the only ones who do not put warranty stickers on the retention bracket screws, but HIS does have a piece of paper between the pcb and cooler saying if it is missing it is void.
> 
> XFX, MSI and Asus have the warranty stickers, i have seen it with my own eyes lol.
> 
> Not sure about powercolor...


My Gigabyte 290x had no stickers


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm going CF later on and i'm hoping to keep it an all AMD rig but if it turns out crap then i guess i gotta fork out some more cash and replace half my PC


I'm ordering two more 290s. I love trifire but I'm thinking ivy-e to push it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Sergio4*
> 
> Thank you. But everyone were just ignoring me


If someone had known the answer off the top of their head, they would have answered you. It's an uncommon question and I don't think many would have known it; I had to measure it myself (I've assumed that the mounting holes for the stock cooler are the same size as the ones around the core).

I don't think people were ignoring you, I just think they didn't know, so they didn't answer.


----------



## S410520

Sergio4;

I have a manual from a Gelid Icy Vision Rev 2. It says the mounting spacing is the same as the 4870 (measured by hand).
Check step 7:

http://www.gelidsolutions.com/clients/gelid/userfiles/In_A3_4870_r9(2).pdf


----------



## S410520

Quote:


> Originally Posted by *AlNasty*
> 
> I just finished installing the Hybrid tonight. You are in for a bit of work. You will spend a nice bit of time installing it. In the end, its worth it. Nearly silent while under load, temps under load 55-60c for me so far. Vram +1 does not get much airflow. I seen 95c under load with my case open. My case has a side fan, and when installed it blows right at the card, cooling the heat sinks-so I put the side on. Vram 1 now hovers about 75c under load.
> 
> You are going to find you don't have enough ram heat sinks. I used the heat sink package from the Gelid Icy(had a revision A lying around). They were nearly perfect. It even has a sink that nearly fits perfect where the copper base overhangs the single isolated ram near the edge. I had to mod 1 sink for the vram 1. I might mod another spare a little differently for better cooling and put it on at a later time. Glad that one got thermal tape. I wish I had ordered some good thermal tape ahead of time. I used the Gelid stuff, lost a couple of those pieces, trashed a couple (stuff is small and thin). I did not want to use the thermal epoxy, but had to mix it 50/50 with thermal pasted to get the job finished on the last few pieces. I did not want to use it 100% just in case I need to remove a heat sink later.
> 
> You will need to mod a sink to fit under the copper base unless you have the one from the Gelid package that is half flat and half fins. You will need to mode one for the vram 1 also. Everything else was done following directions.
> 
> Good luck.


Thanks for sharing








Guess I'm in for another afternoon of modding; maybe I will mod the endplate from the stock cooler onto it for the VRMs.

I found that the thermal tape worked great when you clean the RAM chips first. Just rub them clean with an eraser or Q-tip, then clean with ArctiClean 1 and 2.
You should see the label printed very clearly on the RAM chip after cleaning. See the difference in the pics below.

I needed to remove a RAM sink to mod it and found it was very hard to remove.


----------



## alancsalt

Quote:


> Originally Posted by *overclockFrance*
> 
> Signature updated.


http://www.overclock.net/t/1258253/how-to-put-your-rig-in-your-sig


----------



## TheSoldiet

Currently I'm running an FX 8350 at 4.5 GHz on a 212 EVO, and I'm really pushing it (1.39 V).
Is it worth getting a CM Nepton 280L and trying to aim for 4.8 or higher? Hearing that the FX 8350 is weak hurts me lol, but I manage to get 100 percent GPU usage in BF4 though  (W7 Ultimate)


----------



## Gralle

I did a signature update too.


----------



## rdr09

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> THIS!
> 
> On my FX 8350 @ 4.88ghz with single 7970 it was neck and neck with my 2600k @ 4.5ghz in games. Tri-fire not so much but still pretty damn good. I'm really shocked at how big of a difference this 290 makes.


post#20

http://www.overclock.net/t/1443622/r9-290-gpu-usage-issues/10

+rep.


----------



## Sgt Bilko

Quote:


> Originally Posted by *TheSoldiet*
> 
> Currently im running a fx 8350 4.5 GHz on a 212 evo, and im really pushing it (1.39v)
> Is it worth getting a CM nepton 280l, and try to aim for 4.8 or higher? Hearing that the fx 8350 is weak hurts me lol, but i manage to get 100 percent gpu usage in bf4 though  (W7 ultimate)


What are your temps like at 4.5?

1.39 isn't really pushing it with an 8350... mine's running 1.46 for 4.8 GHz (H100i here), and if you plan on 5 GHz then you will get pretty close to 1.5 V.

But tbh it's really only for those few extra fps more than anything.

As for BF4, all I can say is that AMD and DICE worked together on BF4 for PC; in some other games you just need raw power more than anything.


----------



## evensen007

Quote:


> Originally Posted by *prostreetcamaro*
> 
> I am having a strange issue. Sometimes when i start playing COD ghosts my frame rates will be lower than normal and the game doesnt feel very smooth. If I reboot and then go back in the fps will pretty much stay pegged at 91 and the game plays smooth as silk. I cant figure out why this happens? Same goes for benchmarks. Sometimes I have to reboot to get my bench scores up to where they should be. Anybody else run into this?


This happens to me sometimes after the PC comes out of sleep. I have real-time CPU clocking set in my BIOS, and sometimes my 4.7 GHz overclock is stuck/locked at Intel's 1.6 GHz idle speed. Reboot and it's back to 4.7


----------



## centvalny

Cpu and powercolor 290 (asus PT3 bios) h20



http://imgur.com/pZ5AXaS


----------



## COMBO2

Got my R9 290x the other day!!

LOVE the card so far. Much smoother and easier to deal with than the 2x GTX 760s I had. The 4 GB of 512-bit memory is just plain awesome for 1440p.
Now I'm just waiting for my Phanteks Enthoo Primo so I can get on and watercool the card along with my 3770K. Bring it on!!


----------



## TheSoldiet

Quote:


> Originally Posted by *Sgt Bilko*
> 
> What are your temps like at 4.5?
> 
> 1.39 isn't really pushing it with an 8350....Mine's running 1.46 for 4.8Ghz (H100i here) and if you plan on 5Ghz then you will go pretty close to 1.5V
> 
> But tbh it's really only for those few extra fps more than anything.
> 
> As for BF4, all i can say is that AMD and DICE worked together for BF4 on PC some other games you just need raw power more than anything.


The temps are at 60-64C (OCCT) at 100 percent load. I meant that I was pushing my cooler ;-)


----------



## Sgt Bilko

Quote:


> Originally Posted by *TheSoldiet*
> 
> The temps are at 60-64 (OCCT) 100 percent load. I meant that I was pushing my cooler ;-)


Ahh i misunderstood, my bad









I'm getting about 30c with my H100i.....might be worth investing in one or similar?

Since they can fit on any CPU it's a decent upgrade


----------



## TheSoldiet

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ahh i misunderstood, my bad
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm getting about 30c with my H100i.....might be worth investing in one or similar?
> 
> Since they can fit on any CPU it's a decent upgrade


Yeah, thinking of getting either the H100i or CM Nepton 280l, a 250gb evo and a icy vision rev 2 for christmas :-D


----------



## TheSoldiet

I have another question Sgt Bilko, can your CM storm trooper handle the h100I in push pull?


----------



## overclockFrance

Quote:


> Originally Posted by *overclockFrance*
> 
> OK, i am going to test and report later.
> 
> Thanks.


I disconnected the HDDs and stopped overclocking the CPU; the blurriness problem is still there, so the PSU wattage is probably not the cause. What connection do you use on your 1440p monitor: DVI or DisplayPort? If DisplayPort, it may explain why you don't encounter the problem.


----------



## Raephen

Interesting.

I sent a message to Sapphire tech support asking 'would I void my warranty just by wanting to check my TIM spread?'

While the answer might seem like a straightforward no, the fact that it's never plainly said, and the fact that the representative chose to put a single line in bold, made me scratch my head.

This can be read one of two ways: as 'don't mess with our stuff or you'll pay!', or more like a plausible-deniability statement, kind of like a don't-ask-don't-tell policy...

Answer I got from Sapphire:

2013-11-22 [11:29]

Dear Sir:
Product Warranty will not be valid even if returned after purchased for the following cases:
-*Products that are defaced or physically damaged and modified by customer.*
-Products that become non-functional due to customer improper use.
-Products that cannot be verified as Sapphire products.
-Products that do not have a matching serial number between the product
and the original receipt.
-Products not sold from our official distributors or resellers.


----------



## Spectre-

Quote:


> Originally Posted by *Raephen*
> 
> Intteresting.
> 
> I sent a message to Sapphire tech support with 'would I void my warranty by just wanting to check my TIM spread?'
> 
> While the answer might seem as a straight forward no, the fact it's never plainly said and the fact the representative had chosen to put a single line in bold made me scratch my head.
> 
> This can be seen as one or two things: As 'don't mess with our stuff or you'll pay!' or more like a plausible deniability statement, kinda like a don't-ask-don't-tell policy...
> 
> Answer I got from Sapphire:
> 
> 2013-11-22 [11:29]
> 
> Dear Sir:
> Product Warranty will not be valid even if returned after purchased for the following cases:
> -*Products that are defaced or physically damaged and modified by customer.*
> -Products that become non-functional due to customer improper use.
> -Products that cannot be verified as Sapphire products.
> -Products that do not have a matching serial number between the product
> and the original receipt.
> -Products not sold from our official distributors or resellers.


well then you know what this means

sapphire is just literally a dictatorship


----------



## Raephen

Quote:


> Originally Posted by *Spectre-*
> 
> well then you know what this means
> 
> sapphire is just literally a dictatorship


Huh? I don't get your jump from something like a don't-ask-don't-tell policy to a dictatorship?


----------



## flopper

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I can't say that i'm shocked, more disappointed really.
> 
> AMD Bring out this truly awesome GPU but the CPU's they sell can't keep up with it, it's almost funny in a sad sort of way.
> 
> Mind you i don't think we will see a steamroller FX chip until late 2014 (if ever at all)


Mantle might alter this for the games that support it, so the GPU becomes the bottleneck.


----------



## prostreetcamaro

Quote:


> Originally Posted by *overclockFrance*
> 
> Did you manage to solve the problem ?
> 
> Otherwise, here is my configuration, in order to find common points :
> 
> ASUS Maximus VI Hero, bios 0711
> [email protected] Mhz, 1.37 V
> 4x32 Gb Gskill
> Powercolor [email protected], Catalyst 13.11beta v9.4
> 512 GB Samsung 830 SSD
> 2x2 TB HDD
> Power Supply Coolermaster 850 W GOLD
> Windows 7 64 bits
> DELL 3007 @2560*1600
> 
> Does your configuration have common points with mine ?


Windows 7 64 looks to be our only common point lol

For those that do get the blurry screen when overclocking the card in games what cable are you using? I am using a DVI-D on my VG278H 120hz monitor. I wonder if the displayport would do the same thing?


----------



## Joeking78

Is the gpu usage in the image below normal?

I get great FPS and no stutter but odd GPU usage... I'm guessing it's just a bug and it's not reading correctly, because I have 100+ FPS even when AB shows zero GPU usage lol.


----------



## Joeking78

Hmmm, I fixed it already lol.

I didn't have a CCC profile setup for the game, set one up and override game AA settings from CCC and now I have 100% gpu usage the whole time.


----------



## overclockFrance

Quote:


> Originally Posted by *prostreetcamaro*
> 
> Windows 7 64 looks to be our only common point lol
> 
> For those that do get the blurry screen when overclocking the card in games what cable are you using? I am using a DVI-D on my VG278H 120hz monitor. I wonder if the displayport would do the same thing?


I am using a DVI-D cable too.

I tested on Windows 8 x64 and the problem remains.

I'm considering buying a dual-link DVI to DisplayPort adapter. I hope that it will solve the problem; otherwise I will get rid of the 290 card.


----------



## Ukkooh

What kind of fan speeds are you guys using when OCing with the stock cooler? I tried 1085 MHz core with 70% fan speed and it still throttled.


----------



## Joeking78

Quote:


> Originally Posted by *Ukkooh*
> 
> What kind of fanspeeds are you guys using when OCing with the stock cooler? I tried 1085mhz core with 70% fan speed and it still throttled.


How do you know if its throttling?

I have stock cooler, no oc and 70% fan speed, 70c max temp.


----------



## Ukkooh

Quote:


> Originally Posted by *Joeking78*
> 
> How do you know if its throttling?
> 
> I have stock cooler, no oc and 70% fan speed, 70c max temp.


By monitoring the clock speed, obviously.
I got a score of 2385 at stock in Valley with the Extreme HD preset. Is my card performing the way it should?
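For anyone who wants to do that clock monitoring offline rather than watching a graph, here is a small sketch that scans a GPU-Z sensor log for samples below the target clock. The log filename and column name are assumptions; check the header row of your own log file:

```python
import csv

# Sketch: detect throttling by scanning a GPU-Z sensor log (CSV) for samples
# where the core clock fell below the target. "GPU Core Clock [MHz]" and the
# default log filename are assumptions -- verify against your own log header.
def throttle_samples(log_path: str, clock_column: str, target_mhz: float):
    dips = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                clock = float(row[clock_column].strip())
            except (KeyError, ValueError):
                continue  # skip malformed or partial rows
            if clock < target_mhz:
                dips.append(clock)
    return dips

# e.g. throttle_samples("GPU-Z Sensor Log.txt", "GPU Core Clock [MHz]", 1085)
```

If the returned list is non-empty, the card dipped below the set clock during the run, i.e. it throttled.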


----------



## hotrod717

Please change my entry to watercooled. Some pics of the 290X chip. Hopefully I'll have time to finish my build in the next couple of weeks and get rid of the ghetto bench.


----------



## Gilgam3sh

in case some of you did not notice yet there is a new BIOS for the ASUS 290X, ends with AS02M

http://www11.zippyshare.com/v/5874082/file.html

(I got it from the ASUS folder after I updated my card)

btw, can someone make a "PT1" BIOS with 2D IDLE clocks??


----------



## sugarhell

No, because PT1 is for extreme OC and you don't need 2D clocks. With PowerPlay it's like a stock ASUS BIOS.


----------



## ZealotKi11er

I was getting ready to buy a block and could find none in stock. Does anyone know when EK will have more?


----------



## Gilgam3sh

Quote:


> Originally Posted by *sugarhell*
> 
> No. Because pt1 is for extreme oc and you dont need 2d clocks. With powerplay is like a stock asus bios


but there is a higher TDP limit on the PT1 BIOS vs the ASUS stock BIOS.


----------



## Ukkooh

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Was getting ready to buy a block and could find nothing on Stock. Does anyone know when EK will have more?






https://www.facebook.com/EKWaterBlocks/posts/590899024297466


----------



## ontariotl

Quote:


> Originally Posted by *Ukkooh*
> 
> 
> 
> 
> __ https://www.facebook.com/EKWaterBlocks/posts/590899024297466


I think EK should thank AMD for making them a boatload of money just from selling these blocks. I can't remember the last time EK was back-ordered on blocks, with practically all stores completely out of stock.


----------



## hotrod717

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Was getting ready to buy a block and could find nothing on Stock. Does anyone know when EK will have more?


Performance PCs has them: http://www.performance-pcs.com/catalog/index.php?main_page=index&cPath=59_971_240_579&zenid=a72e74fafb5772f116286e81ad0fc8a9. Oops, just checked and they're down to just acrylic/copper. You've got to keep checking; they usually get stock in once a week.


----------



## ReHWolution

Quote:


> Originally Posted by *Gilgam3sh*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> No. Because pt1 is for extreme oc and you dont need 2d clocks. With powerplay is like a stock asus bios
> 
> 
> 
> but there is a higher TDP limit on the PT1 bios vs the asus stock bios.
Click to expand...

Where can I find this PT1 BIOS for the ASUS card? What's the difference from the ordinary one? If there's a higher TDP limit on the PT1, though, it's because it's a specific OC version, not for everyone...


----------



## Ukkooh

Quote:


> Originally Posted by *Ukkooh*
> 
> By monitoring the clock speed obviously.
> I got a score of 2385 @stock in valley with extremehd preset. Is my card performing the way it should?


Shameless bump for my question.


----------



## simon45op

Hi guys, I'm new here. Not sure if I can join the club; I bought 2 PowerColor R9 290X Battlefield 4 Edition cards a week ago.
May I ask where to submit driver-related issues?
I'm having a problem running games in full-screen mode at 4320x2560 regardless of game settings. The game gets into an alt-tab loop before eventually hitting a BSOD.
However, when I run in windowed mode, it works smoothly. This affects most of the games I'm playing.

List of games affected:
Need For Speed: Most Wanted
Dota 2
GRID 2
SimCity 2013
Path of Exile (no BSOD, but extremely low FPS, <10)
Diablo 3 (crashes when the hero dies)

Will post some overclocking results after my EK blocks arrive.


----------



## Gilgam3sh

Quote:


> Originally Posted by *ReHWolution*
> 
> Where can I find this PT1 bios from Asus? What's the difference with the ordinary one? However, if there's a higher TDP limit on the PT1 it's because it's a specific OC version, not for everyone...


PT1: http://www.overclock.net/attachments/18264
PT3: http://www.overclock.net/attachments/18265


----------



## Raephen

Quote:


> Originally Posted by *Ukkooh*
> 
> Shameless bump for my question.


Aye, seems fine to me.

I just ran the same test with my 290 @ 1000 MHz core: 2389.


----------



## flamin9_t00l

Quote:


> Originally Posted by *Ukkooh*
> 
> I got a score of 2385 @stock in valley with extremehd preset. Is my card performing the way it should?


Here's my stock 290 (non-X) for comparison, if it helps. ExtremeHD preset.


----------



## SamEkinci

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Well the final piece of the puzzle has been solved for me.. Once I fixed my downclock issue all I had left to figure out was my erratic gpu usage issue. A lot of people were telling me it's just an AB bug but with my fps jumping around a lot I knew that couldn't be the case...
> 
> Check it out now! This is Arkham Origins... the long dips are the stupid cut scenes. I'm getting incredible FPS now with 4x msaa, and alllll the dx 11 junk turned up. Everything maxed out except no physx of course. This 290 at these clocks has made me so happy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No more stutter either...
> 
> So what was my fix? I WENT BACK TO MY 2600k!!!!
> 
> 
> 
> 
> 
> 
> 
> My fx 8350 @ 4.8ghz just couldn't muster. I switched teams to give it a shot and although it did well for some things over my intel, when it comes to gaming it's just painfully not able to push these gpu's the way they need to be pushed. IMO
> 
> My sandybridge is only at 4.2ghz, memory is downclocked to 2133 instead of 2400 when I had it on the amd rig.
> 
> I am happy
> 
> EDIT: This is on the 13.11 WQHL driver


I don't know your PSU, so this may be a long shot, but perhaps it is also a power issue? You may be right, but the downclocking suggests one of two things: either a bottleneck like you mentioned, or the overclock drawing more power than your PSU's 12 V rail can provide.

Hope it helps.


----------



## SamEkinci

Quote:


> Originally Posted by *overclockFrance*
> 
> I am using a DVI-D cable too.
> 
> I tested on Windows 8 x64 and the problem remains.
> 
> I contemplate buying a dual link DVI - display port adapter. I hope that it will solve the problem otherwise I will get rid of the 290 card.


That is my only difference from you guys, aside from the PSU: I am using a dual-link cable with no adapters. When buying my monitor I was advised to stick with DisplayPort or DVI, as friends told me it was essential. That's as far as my knowledge goes, though.


----------



## HeliXpc

Guys, my memory doesn't underclock in Windows. What's wrong? It's making the card idle at 58°C.


----------



## overclockFrance

Quote:


> Originally Posted by *SamEkinci*
> 
> That is my only difference with you guys aside psu as well I am using dual link cable with no adapters. I was advised when buying my monitor to not dwell from display port or dvi-i as friends told me it was essential. That's my knowledge as far as that though.


What monitor do you own? Maybe some monitors are more sensitive to this blurriness problem?


----------



## TheRoot

Quote:


> Originally Posted by *COMBO2*
> 
> Got my R9 290x the other day!!
> 
> LOVE the card so far. Much smoother and easier to deal with then the 2x GTX 760s I had. The 4GB, 512bit memory is just plain awesome for 1440p.
> Now I'm just waiting for my Phanteks Enthoo Primo so I can get on and watercool the card along with my 3770k. Bring it on!!
> 
> 
> Spoiler: Warning: Spoiler!


The Koolance block looks awesome.


----------



## zpaf

GRID 2, ULTRA, 8XMSAA 5760x1080

If I push past 1220 MHz there's no gain; I think the card hits the TDP limit.


----------



## SamEkinci

Quote:


> Originally Posted by *overclockFrance*
> 
> What monitor do you own ? A few monitors my be more sensitive to this bluriness problem ?


A Shimian, one of those Korean ones. I remember you guys both use Dell, right?


----------



## overclockFrance

Quote:


> Originally Posted by *SamEkinci*
> 
> Shimian one of those Korean ones. I remember you guys both use dell right?


Yes, a Dell for me. But another user with the same problem has an ASUS 27'' 1440p.


----------



## ReHWolution

Quote:


> Originally Posted by *Gilgam3sh*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ReHWolution*
> 
> Where can I find this PT1 bios from Asus? What's the difference with the ordinary one? However, if there's a higher TDP limit on the PT1 it's because it's a specific OC version, not for everyone...
> 
> 
> 
> PT1: http://www.overclock.net/attachments/18264
> PT3: http://www.overclock.net/attachments/18265
Click to expand...

Thanks a lot! Could you tell me what are the differences between these BIOSes?


----------



## prostreetcamaro

Quote:


> Originally Posted by *overclockFrance*
> 
> Yes, a DELL for me. But another one with the same problem has an ASUS 27'' 1440p.


If you are talking about me, my ASUS is the 120 Hz 1080p model. Actually, I might sell it, pick up a QNIX 1440p, and see how high I can overclock it. Most of them will go to 120 Hz.


----------



## Arizonian

Quote:


> Originally Posted by *COMBO2*
> 
> Got my R9 290x the other day!!
> 
> LOVE the card so far. Much smoother and easier to deal with then the 2x GTX 760s I had. The 4GB, 512bit memory is just plain awesome for 1440p.
> Now I'm just waiting for my Phanteks Enthoo Primo so I can get on and watercool the card along with my 3770k. Bring it on!!
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added







and water blocked









Quote:


> Originally Posted by *hotrod717*
> 
> Please change to watercooled. Some pics of the 290X chip. Hopefully have time to finish my build in the next couple weeks and get rid of the ghetto bench.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - Updated









Quote:


> Originally Posted by *simon45op*
> 
> hi guys, i'm new here . Not sure if i can join the club. i bought 2 powercolor r9 290x battlefield 4 edition a week ago.
> May i ask where to submit driver related issue?
> I'm having problem running games in full screen mode 4320x2560 regardless of game settings.It will get into a loop of alt-tab before hitting bsod eventually.
> However when i try to run in window mode, it works smoothly. This affect most of the games i'm playing.
> 
> List of games affected:
> Need For Speed: Most Wanted
> Dota2
> Grid 2
> SimCity 2013.
> Path Of Exile(no bsod but extremely low fps <10)
> Diablo3(Crash when hero die).
> 
> Will post some overclocking result after my EK blocks arrive
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats both added


----------



## SamEkinci

But do you guys both use HDMI? My monitor is a true 1440p 27" and has DVI-I input only. I'm not sure the monitors are the problem if you guys can get it to work without overclocking.


----------



## overclockFrance

Quote:


> Originally Posted by *SamEkinci*
> 
> But you guys both use HDMI? My monitor is a true 1440p and has DVI-I input only, 27".. Not sure if monitors are the problem if you guys can get it to work without overclocking.


I use DVI only. With HDMI, my problem disappears, but the resolution is 1280x800 instead of 2560x1600.

Without overclocking, I have no problem.


----------



## SamEkinci

Quote:


> Originally Posted by *overclockFrance*
> 
> I use DVI only, with HDMI, my problem disappears but the resolution is 1280x800 instead of 2560x1600.
> 
> Without overclocking, I have no problem.


That's why I feel it doesn't have much to do with the monitor, and it's the power instead. Unless you are overclocking the monitor or changing refresh rates or pixel clocks, it shouldn't be the monitor. Your problems sound like they are due to voltages and power, and maybe instability from a bottleneck. This card uses more power than advertised, for sure.


----------



## prostreetcamaro

Quote:


> Originally Posted by *SamEkinci*
> 
> Thats why I feel it doesnt have much to do with monitor and instead its the power. Unless you are overclocking the monitor or changing refresh rates or pixel rates, it shouldnt be due to monitor. Your problems sound like they are due to voltages and power, and maybe instability due to bottleneck. This card uses more power than advertised for sure.


I was kind of thinking my Corsair TX650 might not really be enough for my system. I have a PC Power and Cooling Quad 750 in my backup rig. I am going to swap that into my main rig and see if it helps any. Even if it doesn't, I won't be pushing that power supply as hard as I push this 650.


----------



## overclockFrance

Quote:


> Originally Posted by *SamEkinci*
> 
> Thats why I feel it doesnt have much to do with monitor and instead its the power. Unless you are overclocking the monitor or changing refresh rates or pixel rates, it shouldnt be due to monitor. Your problems sound like they are due to voltages and power, and maybe instability due to bottleneck. This card uses more power than advertised for sure.


But 850 W is a lot for just one 290!

If I change my power supply, I will probably buy the Seasonic Platinum 1000 W, but it costs €200. For that price, I could instead keep my power supply and buy a GTX 780 Ti, which would perform about 5% better than the 290X, for the same total.


----------



## SamEkinci

True, but you are running a lot of other hardware along with the watercooling, and it draws a lot of power as well. Not to mention it also comes down to your 12 V rail.

Say your 12 V rail can deliver only 700 W of that 850 W PSU: your overclocked CPU and GPU, along with the motherboard and possibly overclocked RAM (I am guessing), are sharing all of that. It also depends on how many 12 V rails your PSU has; multiple rails limit it even more.

So in short, it's not how many watts your PSU is rated for; it's how many watts it has on the 12 V rail, how many rails there are, and then how much is left after the CPU and RAM are overclocked. This information is on the box of your PSU or on its side label.

If you have one 12 V rail, you want to make sure it has enough power for the CPU, GPU, and ATX, all after overclocking. If you have multiple 12 V rails, you want to make sure the rail assigned to PCIe has enough power for the overclocked GPU.

Sorry, I don't mean to be preachy; I don't know how much you know, I just want to be helpful.
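The rail arithmetic above boils down to W = V × A, minus the loads sharing the rail. A minimal sketch in Python (the 70 A single-rail figure matches the PSU discussed in this thread; every component wattage below is an illustrative guess, not a measured value):

```python
# Rough 12 V rail budget check, as described in the post above.
# The 70 A rating matches the PSU under discussion; every load
# wattage here is an illustrative assumption, not a measurement.

def rail_watts(amps: float, volts: float = 12.0) -> float:
    """Capacity of a rail in watts: W = V * A."""
    return volts * amps

def headroom(rail_amps: float, loads_w: dict) -> float:
    """Watts left on the rail after subtracting the listed loads."""
    return rail_watts(rail_amps) - sum(loads_w.values())

# Single 70 A 12 V rail, as on the 850 W unit mentioned earlier
loads = {
    "overclocked CPU": 180,     # guess
    "overclocked 290X": 350,    # guess
    "board, fans, drives": 60,  # guess
}
print(rail_watts(70))       # -> 840.0
print(headroom(70, loads))  # -> 250.0
```

On a multi-rail unit you would run the same check per rail, counting only the loads actually attached to that rail.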


----------



## sugarhell

Quote:


> Originally Posted by *ReHWolution*
> 
> Thanks a lot! Could you tell me what are the differences between these BIOSes?


Don't use them. The first one is for a high-end loop; the second is mostly for LN2. The TDP is around 500 W, I think.

The normal ASUS BIOS gives you plenty of TDP headroom with the 50% power limit. I'd only approve of using these if you know what you are doing; it would not be funny to push 1.4 V stock through a stock cooler.


----------



## SamEkinci

Quote:


> Originally Posted by *sugarhell*
> 
> Dont use them. The first one is for a high end loop. The second one is mostly for ln2. The tdp is around 500 watt i think.
> 
> The normal asus bios gives you plenty enough tdp limit with 50% powerlimit. I only approve to use them if you know what are you doing.It will not be funny to push 1.4 stock volts through a stock cooler


I have also heard of many people bricking their cards with these BIOSes. Just FYI, it's not proven to work 100% yet. As he said, the stock TDP limit is already large enough for a max stable overclock, provided you have enough cooling.


----------



## overclockFrance

Quote:


> Originally Posted by *SamEkinci*
> 
> True but you are running a lot of other hardware along with water cooling, they take a lot of resources as well. Not to mention its also to do with your 12V rail.
> 
> If your 12V rail can utilize lets say only 700W of that 850W PSU and your overclocked CPU and GPU along with Motherboard with possibly overclocked RAM (I am guessing) is sharing all that. Also depends on how many 12V rails your PSU has too, that limits it even more if you have more than one 12V rail.
> 
> So in short its not how much W your PSU has, its how much W your PSU has on 12V rail and how many rails you have, then how much of it you got left after CPU and RAM are overclocked. This information is on the box of your PSU or on the side of it.
> 
> If you have one 12V rail you want to make sure it has enough power for CPU, GPU and ATX all after overclocked. If you have multiple 12V rails, then you want to make sure the rail assigned to PCI-EX has enough power for GPU after overclocked.
> 
> Sorry I dont mean to be preachy, I dont know how much you know, just want to be helpful.


Thank you for your help. My power supply is a Cooler Master Silent Pro Hybrid 850 W. It has only one 12 V rail, which provides 70 A.

You may be right: with the power calculator (http://www.extreme.outervision.com/), my configuration consumes 716 W with CPU overclocking but without GPU overclocking.

Having to buy another power supply would really annoy me.


----------



## ReHWolution

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ReHWolution*
> 
> Thanks a lot! Could you tell me what are the differences between these BIOSes?
> 
> 
> 
> Dont use them. The first one is for a high end loop. The second one is mostly for ln2. The tdp is around 500 watt i think.
> 
> The normal asus bios gives you plenty enough tdp limit with 50% powerlimit. I only approve to use them if you know what are you doing.It will not be funny to push 1.4 stock volts through a stock cooler
Click to expand...

Flashing the PT1, I can easily replicate what I do with the ordinary ASUS BIOS. However, the card will be put in a liquid cooling loop AND tested under LN2, so the PT1 is more like a halfway point between extreme modding and light modding. In any case, I can easily recover the card if the BIOS flash doesn't go properly: I have a Haswell, so I can boot through the IGP.


----------



## SamEkinci

Quote:


> Originally Posted by *overclockFrance*
> 
> Thank you for your help. My power supply is a Coolermaster Silent Pro Hybrid 850W. It has only 1 12v rail which provides 70 A.
> 
> You may be right : with the power calculator (http://www.extreme.outervision.com/), my configuration consumes 716 W with CPU overclocking but without GPU overclocking.
> 
> If I have to buy another power supply, it really annoys me.


Sorry, but TBH I always buy a new PSU with a new card, because PSU output also decays: after a year of overclocking, a PSU will not put out the same power it did when you first purchased it.

That's exactly why I run a two-PSU setup. When I buy a new card, I retire the old PSU to run the watercooling gear and so on, and get a new one that fits the bill for the overclocked CPU and GPU.

Unfortunately this card is really, and I mean really, power hungry. My problem overclocking it is not cooling but power: past 1200 MHz I began running out of power.


----------



## overclockFrance

Quote:


> Originally Posted by *SamEkinci*
> 
> Sorry TBH I always buy a new PSU with a new card, because there is also decay with PSU power output, meaning after one year of overclocking the PSU will not output same amount of power it did when you first purchased it.
> 
> These are the exact reasons why I run two PSU setup.. When I buy a new card I retire the old PSU to run Watercooling drivers and etc, and get a new one that fits the bill for CPU and GPU overclocked.
> 
> Unfortunately this card really and I mean really power hungry, my problem with overclocking it is not cooling as well, but the power. After 1200Mhz I began running out of power.


What power supplies do you use? And what is your entire configuration? Because if you start running out of power at 1200 MHz, 1000 W may not be enough for me!


----------



## SamEkinci

Quote:


> Originally Posted by *overclockFrance*
> 
> What power supplies do you use ? And what is your entire configuration. Because if you start running out of power at 1200 Mhz, 1000 W may be not enough for me !


I am not actually drawing all of the PSUs' rated power; that headline number is marketing by PSU and card manufacturers. It's all about which 12 V rail your GPU, CPU, and ATX connector use. Let me explain.

I am running two PSUs. The main one is an OCZ 750 W with a single 12 V rail rated around 700 W; it powers only, and I mean only, the CPU, GPU, and ATX. Nothing else: no motherboard-powered fans or other PCI cards (if I had other PCI cards, I would divert the ATX to the secondary PSU). With the CPU and RAM overclocked and the GPU at default, this draws 430 W going by AMD's figure for the 290 (which is BS); I'd predict it's closer to 550 W. Overclock all of that and it's pure madness, the PSU fan starts blazing!

The second PSU is a Kingwin with a single 650 W 12 V rail; it powers everything else: hard drives, fans, watercooling, you name it. I am not comfortable switching the ATX power to the secondary PSU, for no reason other than a bad feeling, but I might give that a shot.

This all results in less heat, less power loss, and a clean 12 V rail. If you buy a new PSU, don't throw the other one away; just get a Lian Li or add2psu adapter.

I will definitely make a video about this later, as I am surprised more overclockers aren't using this method. All you need is a decently powered secondary PSU.


----------



## kdawgmaster

Quote:


> Originally Posted by *SamEkinci*
> 
> Sorry TBH I always buy a new PSU with a new card, because there is also decay with PSU power output, meaning after one year of overclocking the PSU will not output same amount of power it did when you first purchased it.
> 
> These are the exact reasons why I run two PSU setup.. When I buy a new card I retire the old PSU to run Watercooling drivers and etc, and get a new one that fits the bill for CPU and GPU overclocked.
> 
> Unfortunately this card really and I mean really power hungry, my problem with overclocking it is not cooling as well, but the power. After 1200Mhz I began running out of power.


PSU degradation takes longer than one year if you get something half decent. I've used 500 W PSUs that are 5+ years old and still work with today's high-end stuff.


----------



## overclockFrance

OK, but unfortunately I don't have enough room for a second power supply in my case, so I will buy only one. Given your configuration and the power I consume at present, 1000 W is a minimum.

I have a choice between two power supplies:

- Seasonic 1000 W Platinum: one 12 V rail, 83 A
- Enermax Platimax 1250 W: six 12 V rails, 30 A each

The price is the same. Which power supply would you advise? I prefer a single 12 V rail for flexibility, but the Enermax is impressive.


----------



## Forceman

Quote:


> Originally Posted by *overclockFrance*
> 
> Thank you for your help. My power supply is a Coolermaster Silent Pro Hybrid 850W. It has only 1 12v rail which provides 70 A.
> 
> You may be right : with the power calculator (http://www.extreme.outervision.com/), my configuration consumes 716 W with CPU overclocking but without GPU overclocking.
> 
> If I have to buy another power supply, it really annoys me.


You are running a single 290, right? If so, you are not running out of power with your system. Even overclocked, you aren't pulling more than 450 W from the wall with that setup, probably not even that much. I have a similar setup, and I draw about 390 W from the wall running BF4; I've never seen mine go above 400 W.


----------



## SamEkinci

Quote:


> Originally Posted by *kdawgmaster*
> 
> psu degradation takes longer then 1 year if u get something half decent. Iv used 500 psu's that are 5+ years old and still work on todays high end stuff


You are correct, but I was speaking of overclocking and pushing the PSU all the time. If your PSU is Gold or Platinum and you have given it 10% breathing room for your setup, degradation is *slower*. It's about not overworking your PSU. Most people have enough headroom with their PSUs that they don't notice degradation, but it mainly comes down to the 12 V rail more than anything else.

Sorry, I guess I was generalizing a bit too much.

The moral, though: the R9 series uses more power than advertised, and that makes it hard to predict overclocks, pick a PSU, or figure out whether your current PSU is up to the task. This is probably why XFX puts PSU advertisements all over their R9 series.


----------



## kdawgmaster

Quote:


> Originally Posted by *SamEkinci*
> 
> You are correct, but I was speaking of overclocking and pushing the PSU all the time. If you are PSU is Gold and Platinum and you have given it 10% power breathing room for your setup, degrading is *slower*. Its about not overworking your PSU. Most people have enough headroom with their PSUs that they dont notice degrading, but its mainly about 12V rail then anything else.
> 
> Sorry I guess I was generalizing a bit too much.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Moral of it though, R9 series use more power than advertised and that makes it hard to predict overclocking or PSU picking or figuring out if your current PSU is up to task. This is probably why XFX puts PSU advertisements all over their R9 series.


That's why I upgraded the 1050, which was great for my three 7970s, to a 1300 for my three R9 290Xs. I was well aware of the extra power draw, but I thought that even if I wanted to I could run four on a 1300, and it doesn't look like I can.

But before that I need to upgrade my 2600K.

AND that brings me to my question for you all: can someone recommend an LGA 2011 motherboard that supports PCIe x16 across all the slots at a reasonable price? I'm getting a 4930K from Intel and would like to run all my 290Xs at the best possible performance.

I was thinking of the ASUS WS motherboard, but I would prefer Gigabyte, and they don't appear to make one.


----------



## the9quad

Quote:


> Originally Posted by *kdawgmaster*
> 
> Thats why i upgraded my 1050 which was great for my 3 7970's to a 1300 for my 3 R9 290x's. I was well aware of the extra power draw but i thought even if i wanted to i could do 4 with a 1300 and it dosnt look so
> 
> 
> 
> 
> 
> 
> 
> 
> 
> but before that i need to upgrade my 2600K
> 
> AND that bring me to my question for you all. Can someone recommend me a 2011 mobo that supports PCI-E 16X across all the rails thats a resonable price? Im getting a 4930K from intel and would like to run all my 290x at the best possible performance.
> 
> I was thinking Asus WS motherboard but i would preffer Gigabyte but they dont appear to make one


ASRock X79 Extreme11 LGA 2011 Intel X79 SATA 6Gb/s USB 3.0 Extended ATX Intel Motherboard

that's the only one I know of.


----------



## the9quad

The power draw issue was covered earlier:

Here:


----------



## overclockFrance

Quote:


> Originally Posted by *the9quad*
> 
> The power draw issue was covered earlier:
> 
> Here:


OK, thank you for this information.

For my blurriness problem, I ordered a DVI to DisplayPort adapter for €5. If that does not solve it, I will see.


----------



## kpoeticg

It's really not worth spending the extra money on a board with two PLX PEX8747 chips, especially for AMD cards. PLX lanes have more latency than native PCIe lanes, and AMD cards scale well enough that you probably wouldn't benefit much, if at all. Get a RIVE BE!!!!


----------



## kdawgmaster

Quote:


> Originally Posted by *the9quad*
> 
> ASRock X79 Extreme11 LGA 2011 Intel X79 SATA 6Gb/s USB 3.0 Extended ATX Intel Motherboard
> 
> that's the only one I know of.


Yeah, it's something like $730 CAD, and I don't need the LSI controller it comes with since I'm not doing RAID; that's kind of why I looked into the ASUS WS motherboard, which is around $550. But I was looking into that Extreme11 board.

Thanks a bunch for the recommendation =D


----------



## the9quad

Quote:


> Originally Posted by *kdawgmaster*
> 
> yeah its something like $730CAD and i dont need the LSI controller it comes with as im not doing raid, thats kinda why i looked into the Asus WS motherboard and its like 550 or something like that :/ but i was looking into that Extreme 11 motherboard.
> 
> But thanks a bunch for the recommendation =D


To be honest, I recommend the Rampage IV (regular or BE) for the same reasons someone stated; I was just throwing that one out as the only other full-16-lane board I know of. 1x16 and 3x8 is plenty, I think.


----------



## brazilianloser

Hmm, I guess my ASUS 290 is not voltage unlocked. I changed over to GPU Tweak, and the GPU voltage reading never goes over the default 1.25 V, even if I apply +25 mV during stress testing.


----------



## kdawgmaster

Quote:


> Originally Posted by *the9quad*
> 
> I recommend the Rampage IV (reg or BE) to be honest for the same reasons someone stated, I was just throwing that one out as the only other full 16 lane card I know of. 1x16 and 3x8 is plenty I think.


Yeah, that might be true, but I was thinking long term. I've kept my 2600K since almost day one and it works fine, aside from the bottleneck it now gives my setup. The LSI controller is fantastic for RAID reliability, but I won't use it.

Hmm, time to think.


----------



## the9quad

Quote:


> Originally Posted by *kdawgmaster*
> 
> Yeah that might be true, but i was just thinking in the long term thing. i kept my 2600K since almost day 1 and working fine besides now with the bottleneck that it gives my setup. I was just thinking that the LSI controller while fantastic for raid for reliability but i wont use it.
> 
> Hmm time to think


I can tell you this much: if you are going to run more than one card, watercooling is a must. The noise going from one card to two is exponential, so figure that into your price as well; it's going to take a pretty beefy cooling setup for four cards. I was going to do tri-fire and was really pumped about it, but the noise is so bad that I am spending the money on watercooling the two cards I have instead. And then you're also looking at probably a 1600 W power supply, or better yet two power supplies.


----------



## kdawgmaster

Quote:


> Originally Posted by *the9quad*
> 
> I can tell you this much, if you are going to do more than one card, water cooling is a must. The noise going from one card to two is exponential. So figure that into your price as well, gonna take a pretty beefy cooling setup for 4 cards.I was going to do tri-fire and was really pumped about doing it, but the noise is really so bad, that I am spending the money on watercooling for the two I have, instead. and then your also looking at probably a 1600 watt power supply as well or probably better yet two power supplies.


I've already got all three, and the fans hit a max of 65% on each card, which isn't something I really care about. All the cards, by the way, can sit at 85°C with no problems at that fan speed.


----------



## Raephen

Quote:


> Originally Posted by *overclockFrance*
> 
> Ok, but unfortunately, I don't have enough room for a 2nd power supply in my case. consequently, I will buy only 1. According to your configuration and the power I consume at present, 1000 W is a minimum.
> 
> I may have the choice for 2 power supplies :
> 
> - Seasonic 1000 W Platinum, 1 12 V 83 A rail
> - Enermax Platimax, 1250 W, 6 12 V rails, 30 A each
> 
> The price is the same. What power supply would you advise ? I prefer 1 12 V rail for flexibility but the Enermax is impressive.


My vote would be for the SeaSonic.

But then, maybe I'm biased: I'm running the 860 W XP2 version from Seasonic, before that the SS X-560, and my HTPC is powered by their Bronze 430 W PSU (can't recall the name... something like S12...430).

Keep it to a single 12 V rail, or dual at most with decent amperage, and you'll be fine.


----------



## overclockFrance

OK, thank you.


----------



## the9quad

Quote:


> Originally Posted by *Raephen*
> 
> My vote would be for the SeaSonic.
> 
> But then, maybe I'm biased: I'm running the 860W XP2 version of Seasonic, and before that the SS X-560 and my htpc is powered by their bronze 430W PSU (can't recall the name... something like S12...430).
> 
> Keep the 12v rail to a single, or dual at most with decent amperage, and you'll be fine.


I am running a PC Power & Cooling *Platinum* 1200 W; it got really good reviews. It has one 12 V rail at 99.5 A and a 7-year warranty, compared to the Seasonic's 5 years, and it's Platinum rated, not Gold. I am sure there are better power supplies, but this one is quiet and does the job admirably, with a sweet warranty.


----------



## SamEkinci

Quote:


> Originally Posted by *Raephen*
> 
> My vote would be for the SeaSonic.
> 
> But then, maybe I'm biased: I'm running the 860W XP2 version of Seasonic, and before that the SS X-560 and my htpc is powered by their bronze 430W PSU (can't recall the name... something like S12...430).
> 
> Keep the 12v rail to a single, or dual at most with decent amperage, and you'll be fine.


I also prefer one good 12 V rail over multiple rails, from experience and for future-proofing overclocking needs. 83 A works out to about 996 W (12 V × 83 A), and Platinum is awesome, of course.

I would also pick the Seasonic if those were my options.

Multiple 12 V rails are fine, in my opinion, if each individual rail's amperage meets your needs, but that might change in the future with different CPUs or GPUs, so I don't like it much. I'm on a strict budget, though, so price is an important deciding factor, as are future upgrade costs.


----------



## Raephen

Quote:


> Originally Posted by *the9quad*
> 
> i am running a pc power and cooling *platinum* 1200 watt, it got really good reviews. it has one 12v rail 99.5A. and a 7 year warranty. compared to the seasonic 5 yr. and it's platinum rated not gold. I am sure there are better power supplies, but this one is quiet, and does the job admirably with a sweet warranty.


Uhm... the X-560 was Gold, yes. My current 860 XP2 is Platinum.

True, the PC Power and Cooling PSUs are great, *too*, but the SS 860 XP2 was what I needed (and still overkill for a single-card setup) and about €40 less than a PC Power & Cooling Silencer Mk III 1200 W.

So let's just shake hands and concede that both manufacturers make great PSUs (PC P&C do make their own, right? They don't buy them from OEMs like Super Flower or Seasonic?)

----
EDIT: Sorry, I misread your post. I thought you meant to say the Seasonics weren't Platinum, unlike yours. I apologise; that's not what you meant, right?









----------



## the9quad

Quote:


> Originally Posted by *Raephen*
> 
> Uhm..... the X-560 was gold, yes. My current 860 XP2 is platinum.
> 
> True, the PC Power and Cooling PSU are great, *too*, but the SS 860 XP2 was what I needed (and still overkill for a single card setup) and about € 40 less than a PC Power & Cooling Silencer Mk III 1200W.
> 
> So let's just shake hands and concede that both manufacturers make great PSU's (PP&C do make their own, right? They don't buy them from oem manufacturers like Superflower or SeaSonic?)
> 
> ----
> EDIT: Sorry, I misread your post. I thought you meant to say the SeaSonics weren't platinum, unlike yours. I apologise, that's not what you meant, right?
> 
> 
> 
> 
> 
> 
> 


PC Power and Cooling makes their own, I believe, in the US; they do some brands for OCZ now. I wasn't trying to start an argument over which is better, I was just stating another choice. 1200 watts should do 2 or 3 cards, which is why I have 1200 watts. 1200-watt Platinum-rated power supplies are few and far between, which is why I said that in that power range I don't believe Seasonic has a Platinum one.







And that 7-year warranty is sweet.

here is their blurb:

For over 25 Years, PC Power & Cooling has been at the forefront of the high-performance computer power supply industry. We have produced many innovations along the way: the first CPU cooler, the first PC heat alarm, the first independently-regulated PC power supply, the first redundant power system, the first NVIDIA SLI Certified supply, the first 1000W computer power supply and the first - and still only company - to offer an individual certified test report with each power supply sold. We produce the world's premier high-performance computer power supplies, all expertly engineered in San Diego, California.


----------



## shilka

Quote:


> Originally Posted by *Raephen*
> 
> Uhm..... the X-560 was gold, yes. My current 860 XP2 is platinum.
> 
> True, the PC Power and Cooling PSU are great, *too*, but the SS 860 XP2 was what I needed (and still overkill for a single card setup) and about € 40 less than a PC Power & Cooling Silencer Mk III 1200W.
> 
> So let's just shake hands and concede that both manufacturers make great PSU's (PP&C do make their own, right? They don't buy them from oem manufacturers like Superflower or SeaSonic?)
> 
> ----
> EDIT: Sorry, I misread your post. I thought you meant to say the SeaSonics weren't platinum, unlike yours. I apologise, that's not what you meant, right?
> 
> 
> 
> 
> 
> 
> 


PC Power and Cooling has never made their own units, unless Win-Tact is PC Power and Cooling?

OCZ bought them a few years ago.

Today their units are a mix of Super Flower, Seasonic and HighPower.


----------



## ImJJames

So it looks like the intermittent black screen was a driver issue, not a hardware issue, since the new drivers seem to have fixed it. Glad AMD came through.


----------



## the9quad

Quote:


> Originally Posted by *shilka*
> 
> PC Power and Cooling has never made their own units
> 
> Unless Win-Tact is PC Power and Cooling?
> 
> OCZ bought them a few years ago
> 
> Today their units are a mix of Super Flower Seasonic and HighePower


They design and assemble them in the US; they're built overseas by Win-Tact, Sirtec, etc. Some have been re-engineered from Seasonic platforms, like the Mk II series, though I'm not sure about the Mk III.


----------



## shilka

Quote:


> Originally Posted by *the9quad*
> 
> They design and assemble them in the US, built overseas, by wintact,sirtec, etc.. some have been reengineered from seasonic platform's, like the MKII series, not sure about the MKIII though.


400/500/600 are Seasonic maybe M12II units?

750 and 850 watts are Super Flower Golden Green

And the 1200 watts is a Super Flower Golden King

http://www.realhardtechx.com/index_archivos/Page1418.htm


----------



## the9quad

Quote:


> Originally Posted by *shilka*
> 
> 400/500/600 are Seasonic maybe M12II units?
> 
> 750 and 850 watts are Super Flower Golden Green
> 
> And the 1200 watts is a Super Flower Golden King
> 
> http://www.realhardtechx.com/index_archivos/Page1418.htm


there ya go.







Tom's Hardware does say this much, after noting they are Super Flower PSUs:

2. Designers: Without Their Own Production

The second group of companies also *develops and designs their own products*. However, they have to outsource either some or all of the manufacturing to other companies. One example of this is Be Quiet. Those familiar with the brand noted how Be Quiet P7 models were suddenly much better than the disappointing P6. The answer was simply a manufacturer change, from Topower to FSP. Other examples of designers include SilverStone, Corsair, *PC Power & Cooling*, and Tagan.

And that would jibe with what they say on their front page and what they have publicly stated as well. But who knows, all companies are shady; it's the end product that counts, and it's a quality power supply.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *SamEkinci*
> 
> I don't know your PSU so this maybe a long shot perhaps it is also a power issue? I understand and you maybe right but down clicking proves two things its either bottleneck like you mentioned or over clock draining too much power than your 12v rail can provide from psu.
> 
> Hope it helps.


Clocks were fixed along with my CCC issue. I'm on a Corsair 1000w PSU. The only game where the 8350 is marginally close to my frame rate with the Intel is BF4, but my minimum framerates are still much higher on the Intel, matching my GPU usage.


----------



## Sgt Bilko

Quote:


> Originally Posted by *TheSoldiet*
> 
> I have another question Sgt Bilko, can your CM storm trooper handle the h100I in push pull?


Sorry for the late reply. I went to bed afterwards.

It can, but not if your RAM has heatsinks, afaik. I only have it in a push setup atm.

Check out the CM Storm Trooper club thread. There is a small mod you can do, by modding the handle bracket, that will let a push/pull setup fit easily. Planning to do it myself when I get some spare time.


----------



## TheSoldiet

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Sorry for the late reply. I went to bed afterwards.
> 
> It can but not if your ram has heatsinks afaik. I only have it in a push setup atm.
> 
> Check out the CM Storm trooper club thread. There is a amall mod you can do that will allow you to have a push/pull setup fit easy by modding the handle bracket. Planning to do it myself when I get some spare time


Thx for the reply! I have been looking over there. Seems like a good mod, I also have some spare fans to use. Too bad I have corsair vengeance ram lol. Huge heatsinks.

+1 rep for all of your replies


----------



## evensen007

I see that AMD released some new drivers that appear to have fixed the black screens, but also removed Overdrive? Doesn't this seriously hamper any overclocking and cause throttling by not having the power +50 controls?


----------



## Imprezzion

Guys, I'm thinking of switching from my 780 to an R9 290X, but I really want to know something before I do.

What is the average OC I can count on when using an Accelero Hybrid?

Now, don't tell me the VRMs are hard to cool, because I ran a PowerColor R9 290 unlocked to X @ 1.412v in GPU Tweak (1.34v after vdroop) in BF4 and it ran just 65c VRM temps @ 100% fan. With the reference cooling there's literally nothing cooling them, because there's no airflow whatsoever over the tiny little aluminium plate cooling them, as it's right beneath the fan. I dare to bet the Hybrid with its fan pulled off and a 120mm Swiftech Helix fan pointed right at the sinks can cool the VRMs just fine.

Also, the rad will be installed push-pull with the same Swiftech Helix fans, which are specially designed for rads, so no problems there either.

So, what can I expect from a regular 290X at whatever voltage? Is 1300+ MHz something you'd reach at ~1.4v?


----------



## ihaveworms

Hey guys, I just ordered a Koolance GPU block for a 290. I have it unlocked to a 290X with the ASUS BIOS and would like to overclock it when my water cooling stuff gets here. What would be considered a safe voltage for this card? What tool is recommended to tweak the voltage/clock rates?


----------



## jomama22

Quote:


> Originally Posted by *Imprezzion*
> 
> Guys, i'm thinking of switching from my 780 to a R9 290X but I really want to know something before I do..
> 
> What is the average OC I can count on when using a Accelero Hybrid.
> 
> Now, don't tell me the VRM's are hard to cool cause I ran a Powercolor R9 290 unlocked to X @ 1.412v in GPU Tweak (1.34v after vdroop) in BF4 and it ran just 65c VRM temps @ 100% fan. Now with reference cooling there's litterly nothing cooling them cause there's no airflow whatsoever over the tiny little aluminium plate cooling them as it's right beneath the fan. I dare to bet the Hybrid with the fan pulled from it and a 120mm Swiftech Helix fan pointed right at the sinks can coool the VRM's just fine.
> 
> Also, the rad will be installed push-pull with the same specially designed for rads Swiftech Helix fans so no problems there either.
> 
> So, what can I expect from a regular 290X at whatever voltage? Is 1300+ Mhz something you'd reach at ~1.4v?


The VRM temps are not bad at all as long as you have heatsinks on them, as you saw. They did a good job keeping the heat away from them.

At max voltage using Asus.ROM, I would expect a core of 1200-1280. You will only be running 1.3v actual at that setting.

It's a silicon lottery. Water is these cards' best friend.
Quote:


> Originally Posted by *evensen007*
> 
> I see that amd released some new drivers that appear to have fixed the black screens but also removed overdrive? Doesn't this seriously hamper any overclocking and cause throttling by not having power +50 controls?


You didn't do a proper uninstall if you don't have overdrive.


----------



## bond32

Quote:


> Originally Posted by *ihaveworms*
> 
> Hey guys I just ordered a koolance GPU block for a 290. I have it unlocked to a 290x with the ASUS bios and would like to overclock it when my water cooling stuff gets here. What would be considered a safe voltage for this card? What tool is recommended to tweak the voltage/clock rates?


So far I really have been impressed with the Koolance block. It came with two large thermal pad squares, and you cut the lengths to fit the card. My temps are nothing short of fantastic right now, but I can't overclock and run a game without getting a black screen.

Which BIOS seems to provide the best results so far?


----------



## evensen007

Quote:


> Originally Posted by *jomama22*
> 
> The vrm temps are not bad at all as long as you have heatsinks on them as you saw. They did a good job keeping the heat away from then.
> 
> At max voltage using Asus.ROM, I would expect a core of 1200-1280. You will only be running 1.3v actual at that setting.
> 
> Its a silicon lottery. Water is these cards best friend.
> You didn't do a proper uninstall if you don't have overdrive.


Even if you do a clean install and have Overdrive, apparently the power limit doesn't work anymore.


----------



## Imprezzion

Quote:


> Originally Posted by *jomama22*
> 
> The vrm temps are not bad at all as long as you have heatsinks on them as you saw. They did a good job keeping the heat away from then.
> 
> At max voltage using Asus.ROM, I would expect a core of 1200-1280. You will only be running 1.3v actual at that setting.
> 
> Its a silicon lottery. Water is these cards best friend.
> You didn't do a proper uninstall if you don't have overdrive.


I thought there were a couple of ROMs out there: PT1, which has the stock power limit but up to 2.00v of voltage adjustment, and PT3.rom, which had an LLC fix that made voltages in excess of 2v possible?

EDIT: I have a PowerColor R9 290 here that unlocks to 290X and does probably 1150-1160MHz at the maximum voltage possible with the ASUS BIOS.
It obviously runs at ~80c on 100% fan speed with the stock cooler, but maybe the lower temps the Hybrid brings will add ~20MHz or so to the OC.
Is it smart to just keep that card, or should I buy a full 290X / another PowerColor 290 and see if it goes any faster?

I have a good return policy in Holland, which means I can return ANY purchase within 14 days for a full refund, for ANY reason.


----------



## bangbangbowman

I am a prospective owner in need of a massive GPU upgrade. I was thinking of a 280X, but I want to get a GPU that will last me some time. I would like recommendations from the owners here about the 290: which brand should I get, and is it a massive upgrade from the 280X? As much help as you can provide is appreciated. Thank you!


----------



## ihaveworms

Quote:


> Originally Posted by *bond32*
> 
> So far I really have been impressed with the Koolance block. Came with two large thermal pad squares and you cut the lengths to fit the card. My temps are nothing short of fantastic right now, but I can't overclock and run a game without getting a black screen.
> 
> What bios seems to provide the best results so far?


Glad to hear you like it. At first I read that as "I really haven't been impressed", and then you went on about how awesome your temps were, so I was a little confused. I had to re-read it, heh. Did you try the new beta drivers out? Since installing those, I haven't gotten a black screen.


----------



## Scorpion49

Quote:


> Originally Posted by *bond32*
> 
> So far I really have been impressed with the Koolance block. Came with two large thermal pad squares and you cut the lengths to fit the card. My temps are nothing short of fantastic right now, but I can't overclock and run a game without getting a black screen.
> 
> What bios seems to provide the best results so far?


Do they have a backplate by any chance? I didn't see one myself. Does it sag?


----------



## GenoOCAU

Just thought I'd post some of my 3DMark 11 benchmarking on these two XFX R9 290s (Elpida). These EK blocks keep these cards frosty cool; I haven't finished testing yet.

P25569 - 2x XFX R9 290 4gb - 1195/1420 - i7 3960x @ 5.1GHz, 16gb G.skill TridentX @ 2400 CL9-11-11-21 - Asus Rampage IV Formula - Geno - Watercooled - Link

P16393 - XFX R9 290 4gb - 1230/1400/1.33v - i7 3960x @ 4.9GHz, 16gb G.skill TridentX @ 2400 CL9-11-11-21 - Asus Rampage IV Formula - Geno - Watercooled - Link


----------



## Forceman

Quote:


> Originally Posted by *bangbangbowman*
> 
> I am a prospective owner. I'm in need of a massive gpu upgrade. I was thinking of a 280x but I want to get a gpu that will last me some time. I would like recommendations from the owners here about the 290 and what brand to get and also is it a massive upgrade from the 280x. As much help as you can provide is appreciated Thank You


Unless you are money-limited, get the 290 over the 280X. It's a very solid upgrade over a 280X.


----------



## bangbangbowman

which maker?


----------



## Gunderman456

Quote:


> Originally Posted by *kpoeticg*
> 
> That would be wonderful. Hopefully would teach companies to stop cutting corners and trying to mix sub-par products in with the good versions


Quote:


> Originally Posted by *geggeg*
> 
> What? The black screens have nothing to do with memory type, they happen to owners of cards with both hynix and elpida. Also, as far as performance goes, elpida is right up there with hynix for this generation.


I can attest to this, since I bought two Gigabyte R9 290s. One came with Elpida and the other with Hynix memory. Both were benched separately today, and both were comparable in performance and thermals, with no black screen or other issues. See the build log for "The Hawaiian Heat Wave" (in sig) for results/pics.


----------



## DraXxus1549

So I'm a bit confused about the whole BF4 thing. I bought a Sapphire R9 290 before the promotion started; am I still able to get a code?


----------



## Banedox

Uggg. According to GPU-Z my ASIC quality is 72%; is that terrible for an unlocked 290?


----------



## Mr357

Quote:


> Originally Posted by *Banedox*
> 
> Uggg According to GPUz my ASIC quality is 72% is that terrible for a unlocked 290?


It means virtually nothing.


----------



## djsatane

Quote:


> Originally Posted by *evensen007*
> 
> I see that amd released some new drivers that appear to have fixed the black screens but also removed overdrive? Doesn't this seriously hamper any overclocking and cause throttling by not having power +50 controls?


I wouldn't say they "appear to have fixed the black screens"; they're fixed for some, while other people had/have cards where the drivers simply don't matter, so RMA of course. It makes you wonder whether it was tied to a memory batch or what.


----------



## Banedox

Quote:


> Originally Posted by *Mr357*
> 
> It means virtually nothing.


Haha, good. Now all I need to do is finish piecing my first water loop together and wait for the EK blocks to come back in stock...


----------



## givmedew

Quote:


> Originally Posted by *Imprezzion*
> 
> Guys, i'm thinking of switching from my 780 to a R9 290X but I really want to know something before I do..
> 
> What is the average OC I can count on when using a Accelero Hybrid.
> 
> Now, don't tell me the VRM's are hard to cool cause I ran a Powercolor R9 290 unlocked to X @ 1.412v in GPU Tweak (1.34v after vdroop) in BF4 and it ran just 65c VRM temps @ 100% fan. Now with reference cooling there's litterly nothing cooling them cause there's no airflow whatsoever over the tiny little aluminium plate cooling them as it's right beneath the fan. I dare to bet the Hybrid with the fan pulled from it and a 120mm Swiftech Helix fan pointed right at the sinks can coool the VRM's just fine.
> 
> Also, the rad will be installed push-pull with the same specially designed for rads Swiftech Helix fans so no problems there either.
> 
> So, what can I expect from a regular 290X at whatever voltage? Is 1300+ Mhz something you'd reach at ~1.4v?


The VRMs are attached with thermal pads to the metal plate that is attached to the heatsink, the same metal that cools the RAM. On a non-reference cooler there will most likely be a small separate heatsink that goes over all of the VRM parts and is cooled by the air being pushed onto the card, just like on most non-reference cards out there. For example, on the ASUS DirectCU 680 cards there is a little black heatsink on the VRM that stays on when you take the big heatsink off. Same thing with my 6950s: one came with a reference cooler and the other was an MSI Twin Frozr, and the Twin Frozr had its own heatsink for the VRM and no cooling for the memory. The reference cooler had cooling for both the VRM and the memory, but it was all on the huge plate that covered the entire thing.

I would say YES, you need to have heatsinks on that bad boy!!! Wind blowing onto bare metal isn't going to do jack!


----------



## kdawgmaster

Quote:


> Originally Posted by *Imprezzion*
> 
> Guys, i'm thinking of switching from my 780 to a R9 290X but I really want to know something before I do..
> 
> What is the average OC I can count on when using a Accelero Hybrid.
> 
> Now, don't tell me the VRM's are hard to cool cause I ran a Powercolor R9 290 unlocked to X @ 1.412v in GPU Tweak (1.34v after vdroop) in BF4 and it ran just 65c VRM temps @ 100% fan. Now with reference cooling there's litterly nothing cooling them cause there's no airflow whatsoever over the tiny little aluminium plate cooling them as it's right beneath the fan. I dare to bet the Hybrid with the fan pulled from it and a 120mm Swiftech Helix fan pointed right at the sinks can coool the VRM's just fine.
> 
> Also, the rad will be installed push-pull with the same specially designed for rads Swiftech Helix fans so no problems there either.
> 
> So, what can I expect from a regular 290X at whatever voltage? Is 1300+ Mhz something you'd reach at ~1.4v?


If you take apart an R9 290X or R9 290 you will see that the VRMs are getting the most direct airflow. With the R9 290 heatsink taken apart, you can see fins for the VRMs that get the first introduction of air into the system.


----------



## kdawgmaster

Using MSI Afterburner I was able to get my 3 cards to stay under 80C with my own custom fan profile. This test was done using Metro LL and BF4 for 30 minutes each.
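A custom fan profile like the one above is just a set of (temperature, fan speed) breakpoints that the software interpolates between. A rough sketch of the idea, assuming made-up breakpoint values for illustration (the poster's actual curve isn't given):

```python
def fan_percent(temp_c, curve=((40, 20), (60, 45), (75, 80), (85, 100))):
    """Linearly interpolate a fan speed (%) from (temp C, speed %) breakpoints,
    the same idea as a custom Afterburner fan curve."""
    if temp_c <= curve[0][0]:
        return curve[0][1]                      # below the curve: minimum speed
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if temp_c <= t1:                        # inside this segment: interpolate
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]                         # above the curve: maximum speed

print(fan_percent(50))   # -> 32.5 (halfway between the 40C and 60C points)
print(fan_percent(90))   # -> 100
```

The aggressive upper breakpoints are what keep a 290 under its 80C target at the cost of noise.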


----------



## Technewbie

Quote:


> Originally Posted by *bangbangbowman*
> 
> which maker?


I would choose either XFX or Sapphire, because they have a free copy of BF4 that comes with the card. But if you already have it, or don't want it, then any manufacturer will do, because they are all the same card.


----------



## the9quad

Quote:


> Originally Posted by *Technewbie*
> 
> I would choose either XFX or Sapphire because they have a free copy of BF4 that comes with the card. But if you already have it or don't want it then any manufacture because they are all the same card.


If you do that, keep in mind they (XFX/Sapphire) don't all come with that copy; it has to say BF4 edition, and those are generally $30 more expensive, so that "free" copy isn't really that free now, is it? I'd also make sure you get one from a retailer that is doing the new AMD bundle. It's pretty confusing: AMD has a new bundle, but only with specific retailers. It's not like the old way, where you got a coupon in your box from the manufacturer; it's basically a really, really crappy way to do it, and it only served to piss off people who bought cards (during the new promo period, mind you) but not from a sponsoring retailer. Who those retailers are is anyone's guess atm, because I bought mine from Amazon and Newegg, who you would think would be the big two retailers, so of course they'd offer it? NOOOOOOOOOOOOOOOO, they don't.

But you're in luck: AMD, out of the kindness of their hearts, is going to offer 1000 free copies of BF4. All you have to do is watch Facebook and Twitter (yup, straight out of the EA playbook there, AMD, nicely done), couching free advertisement under the guise of "helping". Pitiful.


----------



## Sgt Bilko

Quote:


> Originally Posted by *TheSoldiet*
> 
> Thx for the reply! I have been looking over there. Seems like a good mod, I also have some spare fans to use. Too bad I have corsair vengeance ram lol. Huge heatsinks.
> 
> +1 rep for all of your replies


No worries. Happy to help if I can.

I've got a couple of CM fans spare, so I might see if I can get a push/pull setup going without modding. I've got Corsair Dominator RAM, so I have a height issue as well.


----------



## Imprezzion

Quote:


> Originally Posted by *givmedew*
> 
> The VRMs are attached with thermal pads to the metal that is attached to the heat sink. The same metal that cools the ram. Most likely on a non reference cooler there will be a small separate heat-sink that goes over all of the VRM parts and is cooled by the air being pushed onto the card. Just like on most non reference cards out there. For example The ASUS DirectCU 680 cards there is a little black heatsink on the VRM... it stays on when you take the
> big heat sink off. Same thing with my 6950s I have one that came with a reference cooler and the other was a MSI Twin Frozr and the Twin Frozr had its own heatsink for the vrm and no cooling for the memory. The reference cooler had cooling for the VRM and memory but it was all on the huge plate that covered the entire thing.
> 
> I would say YES you need to have heat sinks on that bad boy!!! Wind blowing onto bare metal isn't going to do jack!


I meant that I was going to use the heatsinks that come with the Accelero.

Plus, they have more direct airflow and more surface area, so they should cool even better.


----------



## tsm106

Quote:


> Originally Posted by *Imprezzion*
> 
> Guys, i'm thinking of switching from my 780 to a R9 290X but I really want to know something before I do..
> 
> What is the average OC I can count on when using a Accelero Hybrid.
> 
> Now, don't tell me the VRM's are hard to cool cause I ran a Powercolor R9 290 unlocked to X @ 1.412v in GPU Tweak (1.34v after vdroop) in BF4 and it ran just 65c VRM temps @ 100% fan. *Now with reference cooling there's litterly nothing cooling them cause there's no airflow whatsoever over the tiny little aluminium plate cooling them as it's right beneath the fan.* I dare to bet the Hybrid with the fan pulled from it and a 120mm Swiftech Helix fan pointed right at the sinks can coool the VRM's just fine.
> 
> Also, the rad will be installed push-pull with the same specially designed for rads Swiftech Helix fans so no problems there either.
> 
> So, what can I expect from a regular 290X at whatever voltage? Is 1300+ Mhz something you'd reach at ~1.4v?


That's incorrect. As far as VRM-specific cooling goes, it is hard to improve upon the reference cooler. Look at the tests done by Swiftech on their 7970 unisink, for example.

The reference cooler is a full-size block that covers both VRMs. Since it is a large frame, there is plenty of material to transfer heat. The squirrel cage sits right next to the VRMs, giving them plenty of airflow. It's the chokes that are not heatsinked, and they just have air that moves along the path of the cooler. The chokes do not need active cooling, btw, so don't confuse them with the VRMs. Also, it's nearly impossible to cool the small VRM cluster at the front of the PCB without a reference cooler.


----------



## selk22

Well, I figured out my erratic clocks in games!!!

I am on a 60Hz monitor, so I have to use vsync very often to prevent screen tearing. This in turn has been affecting my GPU clock and making it bounce around, because it's trying to maintain that perfect 60fps mark.

Turn off vsync! *BOOM* A CONSTANT 1150 on the core; it literally only dips during a cut scene or something similar!

Guess that means it's time to upgrade to 1440p 120Hz....

Hope this helps someone!


----------



## COMBO2

Hi guys,

My R9 290X appears to have coil whine, I've linked two videos, one load + one idle.





 - load




 - idle

Fan speeds are slightly different, but you can hear the hissing. It's incredibly annoying and varies upon load. If I look at the sky in BF4, it's nowhere near as bad, but if I look at an explosion, it's ridiculously annoying. Don't get me started on the noises it makes with Crysis 3









I'm going to watercool this card, and I had a long discussion with the retailer that I purchased from (PCCaseGear, very helpful for any Australians thinking of buying there). The guy said that the stock coolers/fans can often do this to a varying degree, and that a lot of the cards they get actually do have some degree of coil whine under high load. He said that if I'm watercooling it, there is a high chance the coil whine could be significantly reduced or eliminated. I don't really understand how, but he said that if I was concerned, I am better off going for a GTX 780 or Ti (I don't have close to enough money for that...).

My question is: does anybody else have this problem? And will watercooling possibly rectify it? Because if so (I can't test WC now, so I don't know), I'll just wait for my Enthoo Primo to come and deal with the noise for a few weeks.


----------



## ontariotl

Quote:


> Originally Posted by *COMBO2*
> 
> Hi guys,
> 
> My R9 290X appears to have coil whine, I've linked two videos, one load + one idle.
> 
> 
> 
> 
> 
> - load
> 
> 
> 
> 
> - idle
> 
> Fan speeds are slightly different, but you can hear the hissing. It's incredibly annoying and varies upon load. If I look at the sky in BF4, it's nowhere near as bad, but if I look at an explosion, it's ridiculously annoying. Don't get me started on the noises it makes with Crysis 3
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm going to watercool this card, and had a long discussion with the retailer that I purchase from (PCCaseGear, very helpful for any Australians thinking of buying there). The guy said that the stock coolers/fans can often do this to a varying degree, and that a lot of the cards they get actually do have some degree of coil whine under high load. He said if I'm watercooling it, there is a high chance that the coil whine could be significantly reduced or eliminated. I don't really understand how, but he said if I was concerned, I am better off going for a GTX 780 or Ti (don't have close to enough money for it...).
> 
> My question is, does anybody else have this problem? And will watercooling possibly rectify the problem? Because if so (can't test WC now so I don't know) I'll just wait for my Enthoo Primo to come and deal with the noise for a few weeks.


WC does not rectify the coil whine issue. It's the load that the chokes are taking, and they aren't as good as the chokes on some other cards, which is why they whine. It's something you'll either have to live with, or you'll have to find another card, probably a non-reference one with higher-quality chokes that won't whine.
For me, it's probably the first card where I have noticed whine, but since I flashed my 290 (actually hardly any whine) to a 290X (noticeable whine), I really have nothing to complain about.


----------



## NapalmV5

please delete


----------



## hatlesschimp

^^^ MY GOD!!! ^^^

Do you think, when the non-reference 290X cards come out, that I could buy 1 or 2 of them, use them with my 2 vanilla cards, and watercool the lot without issues?

Like, do you think it will look silly, or that the fittings won't line up?


----------



## Hogesyx

Quote:


> Originally Posted by *selk22*
> 
> Well I figured out my erratic clocks in games!!!
> 
> I am on a 60hz monitor so I have to use vsync very often to prevent screen tearing.. This in turn has been effecting my GPU clock and making it bounce around cause its trying to maintain that perfect 60fps mark...
> 
> Turn off Vsync! *BOOM* 1150 CONSTANT on the core literally only dips during a cut scene or something similar!
> 
> Guess that means its time to upgrade to 1440p 120hz....
> 
> Hope this helps someone!


I use vsync as well, along with a frame limiter to cap my fps at 59.

You can choose to save power/cooling and let it downclock, or turn up the other graphics settings to make your graphics card work harder, by overriding the AA settings etc.
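Capping frames just below the refresh rate, as described above, usually works by sleeping out whatever is left of each frame's time budget. A minimal sketch of that idea (the 59 fps target is from the post; the class and its names are my own illustration, not any particular limiter's API):

```python
import time

class FrameLimiter:
    """Sleep out the remainder of each frame so we never exceed target_fps."""

    def __init__(self, target_fps=59):
        self.frame_time = 1.0 / target_fps   # seconds per frame (~16.9 ms at 59 fps)
        self.last = time.perf_counter()

    def wait(self):
        """Call once per frame, after rendering, to pace the loop."""
        elapsed = time.perf_counter() - self.last
        if elapsed < self.frame_time:
            time.sleep(self.frame_time - elapsed)
        self.last = time.perf_counter()
```

Pacing this way lets the GPU idle between frames, which is exactly why the card downclocks instead of holding its boost clock.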


----------



## hotrod717

Quote:


> Originally Posted by *the9quad*
> 
> if you do that keep in mind, they (XFX/Sapphire) don't all come with that copy it has to say bf4 edition, and it is generally $30 more expensive, so that "free" copy isn't really that free now is it? I'd also make sure you get one from a retailer that is doing the new AMD bundle as well. It's pretty confusing, because AMD has a new bundle, but it is with specific retailers. It's not like the old way where you got a coupon in your box from the manufacturer it's basically a really really crappy way to do it, and it only served to piss of people who bought cards (during the new promo period mind you), but not from a sponsoring retailer. who those retailers are is no ones guess atm, because I bought mine form amazon and newegg who you would think would be the big two retailers so of course they'd offer it? NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO they don't.
> 
> But your in luck, AMD out of the kindness of their hearts is going to offer a free 1000 copies of BF4, all you have to do is watch facebook and twitter (yup straight out of the EA playbook there AMD nicely done), couch free advertisement under the guise of "helping" . pitiful.


Actually, XFX offers a free copy of BF4 when you register a 290 with them. It's not advertised, but it is there. A 290 unlocked to 290X plus free BF4 = epic price/performance.


----------



## Blackops_2

Quote:


> Originally Posted by *hatlesschimp*
> 
> ^^^ MY GOD!!! ^^^
> 
> Do you think when the non reference 290x cards come out that I could buy 1 or 2 of them and use them with my 2 vanilla cards and water cool the lot with out issues?
> 
> Like do you think it will look silly or the fittings wont line up?


Some of the AIB cards with custom cooling will keep the reference PCB, so some will still be waterblock compatible. Then you'll have the 290/X Lightning down the road (I'm assuming), which will have its own block. At least that's how the 7970 generation went. I'm interested to see a HOF or Lightning edition of a 290/X. Would be awesome, I think.


----------



## hatlesschimp

I'm just chasing a 290X with a different port configuration to run all my monitors. I need more DP 1.2 outputs.


----------



## Imprezzion

Quote:


> Originally Posted by *tsm106*
> 
> That's incorrect. As far as vrm specific cooling, it is hard to improve upon the reference cooler. Look at the tests done by swiftech and their 7970 unisink for example.
> 
> The reference cooler is a full size block that covers both vrms. Since it is a large frame, there is plenty of material to transfer heat. The squirrel cage sits right next to the vrms, giving them plenty of airflow. It's the chokes that are not heatsinked ans they just have air that moves along the path of the cooler. The chokes do not need active cooling btw so don't confuse them with the vrms. Also, its nearly impossible to cool the small vrm cluster at the front of the pcb without a reference cooler.


I decided to pull my R9 290 back apart again and you're absolutely right. My apologies.









I tried to cool it with the sinks from a Gelid Icy Vision and the heatsink of an Accelero Xtreme Plus gen 1 (with bent heatpipes to fit around the double DVI port), and while core temps were ~15°C lower, the VRMs shot up to 90-100°C in a matter of minutes. Those sinks are just too small.

I'll grab an Accelero Hybrid and I'll just cut the core section out of the stock cooler's heat plate so the waterblock fits through the mounting holes, if you get what I mean.

By the way, I've been playing BF4 for a couple of rounds unlocked to 290X (it's a Club3D that fully unlocks) @ 1180MHz core and 6000MHz VRAM (Elpida, so it won't do much faster than this, alas) and it seems to be stable at max voltage (1.408v) in GPU Tweak.

With the reference fan set to 100%, the core hits a stable temp of 75°C; the VRMs sit at exactly 55°C for VRM1 and exactly 60°C for VRM2.

1200MHz core works, but gives light artifacting, so I guess 1180-1190MHz is the max.

And now a question I shouldn't be asking here: will it beat my GTX 780 @ 1254MHz core and 7000MHz VRAM when unlocked to 290X and clocked to 1180MHz? Especially in BF4 @ 2880x1620 (150% res scale @ 1080p), all Ultra, no AA?
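That 150% resolution-scale figure is easy to sanity-check — a quick sketch of the arithmetic from the post, nothing tool-specific:

```python
# 150% resolution scale on a 1080p base, as used for BF4 above
base_w, base_h, scale = 1920, 1080, 1.5
w, h = int(base_w * scale), int(base_h * scale)
print(w, h)                          # 2880 1620
print((w * h) / (base_w * base_h))   # 2.25 -> 2.25x the pixels to shade
```

So 150% scale is not 1.5x the GPU load but 2.25x the pixels, which is why it hits the card so much harder than it sounds.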


----------



## grandpatzer

Quote:


> Originally Posted by *zpaf*
> 
> GRID 2, ULTRA, 8XMSAA 5760x1080
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> If I push > 1220MHz no gain.
> I think the card hits TDP LIMIT.


Can you please check what your ASIC quality is with GPU-Z?
You get the ASIC info by clicking at the top left of the GPU-Z window.


----------



## grandpatzer

Quote:


> Originally Posted by *Imprezzion*
> 
> Guys, i'm thinking of switching from my 780 to a R9 290X but I really want to know something before I do..
> 
> What is the average OC I can count on when using a Accelero Hybrid.
> 
> Now, don't tell me the VRM's are hard to cool cause I ran a Powercolor R9 290 unlocked to X @ 1.412v in GPU Tweak (1.34v after vdroop) in BF4 and it ran just 65c VRM temps @ 100% fan. Now with reference cooling there's litterly nothing cooling them cause there's no airflow whatsoever over the tiny little aluminium plate cooling them as it's right beneath the fan. I dare to bet the Hybrid with the fan pulled from it and a 120mm Swiftech Helix fan pointed right at the sinks can coool the VRM's just fine.
> 
> Also, the rad will be installed push-pull with the same specially designed for rads Swiftech Helix fans so no problems there either.
> 
> So, what can I expect from a regular 290X at whatever voltage? Is 1300+ Mhz something you'd reach at ~1.4v?


Actually, I would say the reference cooler @ 100% fan cools the VRMs much better; the reference cooler has direct contact between the main heatsink and the VRMs.
My guess is Arctic uses tiny heatsinks for the VRMs?
If you get 65°C @ 100% fan on the reference cooler, my guess would be 85-120°C for the Arctic cooler, considering 1.412v (1.34v after vdroop).

Also, it's a lottery: if you get lucky you can hit 1300MHz+; if unlucky, somewhere in the 1100-1299MHz range is my guess.
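For reference, the droop quoted above (1.412v set, 1.34v under load) works out to about 5%. A quick sketch of the arithmetic — the helper function is mine, not from any tool:

```python
def vdroop(set_v, load_v):
    """Fractional droop between the software set point and the voltage
    actually seen under load."""
    return (set_v - load_v) / set_v

# The numbers quoted above: 1.412v set, 1.34v under load
print(f"{vdroop(1.412, 1.34):.1%}")  # 5.1%
```

Worth keeping in mind when comparing voltages between posts: some people quote the set point, others the load voltage.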


----------



## kdawgmaster

I've tried using both MSI Afterburner and GPU Tweak and neither can adjust my core voltage :/

I've tried to overclock my setup without adjusting core voltage, and the max I can get is 1085 core and 1350 RAM.


----------



## Imprezzion

Oh well, the 1180MHz this card pulls now is good enough, I guess. Not worth the trouble and extra money to send it back and buy another one.

You should flash the ASUS BIOS, btw. That'll allow voltage control.

Even though that's a 290X BIOS, it doesn't matter. If the card doesn't unlock, it doesn't. But it still runs with the 290X power limit, 290X voltage control and so on.


----------



## grandpatzer

Quote:


> Originally Posted by *Imprezzion*
> 
> Oh well, the 1180Mhz this card pulls now is good enough I guess. Not worth the trouble and extra money to send it back and buy another one..
> 
> You should flash the ASUS BIOS btw. That'll allow voltage control.
> 
> Even though that's a 290X BIOS it doesn't matter. If the card doesn't unlock, it doesn't. But it still runs with the 290X power limit, 290X voltage control and such.


You don't need to flash the BIOS for voltage control; the Asus GPU Tweak software can change voltage.
MSI AB can also change voltage, but it "only" gives +100mV - that should really be enough for most people.


----------



## maarten12100

Quote:


> Originally Posted by *hotrod717*
> 
> Actually XFX offeres a free copy of BF4 when you register a 290 with them. It's not advertised, but is there. 290 unlocked to 290X plus free BF4= Epic price/performance.


Are you sure about this?


----------



## Fahrenheit85

Thinking of picking up a 290X and tossing an EK block on it. Was thinking of going Gigabyte since they have amazing RMA speed if I get a DOA card. Good choice?


----------



## Technewbie

Quote:


> Originally Posted by *the9quad*
> 
> if you do that keep in mind, they (XFX/Sapphire) don't all come with that copy it has to say bf4 edition, and it is generally $30 more expensive, so that "free" copy isn't really that free now is it? I'd also make sure you get one from a retailer that is doing the new AMD bundle as well. It's pretty confusing, because AMD has a new bundle, but it is with specific retailers. It's not like the old way where you got a coupon in your box from the manufacturer it's basically a really really crappy way to do it, and it only served to piss of people who bought cards (during the new promo period mind you), but not from a sponsoring retailer. who those retailers are is no ones guess atm, because I bought mine form amazon and newegg who you would think would be the big two retailers so of course they'd offer it? NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO they don't.
> 
> But your in luck, AMD out of the kindness of their hearts is going to offer a free 1000 copies of BF4, all you have to do is watch facebook and twitter (yup straight out of the EA playbook there AMD nicely done), couch free advertisement under the guise of "helping" . pitiful.


Well, according to Newegg they are still $400 and claim to come with a free copy of BF4, so that copy actually is free. The Sapphire is actually $410, but Gigabyte is also offering it for free. (I just checked again.)


----------



## Bloitz

Quote:


> Originally Posted by *Imprezzion*
> 
> I got a good return policy in holland which means I can return ANY purchase within 14 days for full cashback for ANY reason.


I would advise you to double-check that policy. If you're talking about the European "cooldown period" for buying stuff online: it is indeed 14 days to return any product without reason for a full refund. Do note that the packaging should still be sealed, so you can't bin some cards and send the duds back for a refund. It could be that the NL has changed this rule a bit, but I doubt it.


----------



## COMBO2

Quote:


> Originally Posted by *ontariotl*
> 
> WC does not rectify the coil whine issue. Its the load that the chokes are taking and they aren't as good as some chokes on other cards which is why they whine. It's something you'll either have to live with or find another card that probably will have to be non-reference with higher quality chokes that won't whine.
> For me, it's probably the first card that I have noticed whine, but since I flashed my 290 (actually hardly any whine) to a 290x (noticeable whine) I really have nothing to complain about.


Should I even bother testing the card with the waterblock then? From what you've said, the whine is a card-based thing and can't really be solved, am I right? Does it happen on every card? Because if it does, although I love the card, I might need to move to the green side. The noise is so irritating when it actually occurs...


----------



## Imprezzion

Quote:


> Originally Posted by *Bloitz*
> 
> I would advise you to double-check that policy. If youŕe talking about the European "cooldown-period" for buying stuff online: it is indeed 14 days to return any product without reason for a full refund. Do note, the packaging should be sealed so you can't bin some cards and send the duds back for a refund. It could be the the NL have changed this rule a bit but I doubt it.


True, but most, if not all, shops accept opened / used products.


----------



## rdr09

Quote:


> Originally Posted by *COMBO2*
> 
> Should I even bother testing the card with the waterblock then?? From what you've said, the whine is a card based thing and can't really be solved, am I right? Does it happen on every card, because if it does, although I love the card, I might need to move to the green side. The noise is so irritating when actually occurring...


No, I suggest returning them, or at least the one with whine. My 290 does not have any whine except in the first Firestrike scene. Games, zero. Don't touch them. Mine is watercooled and the only thing I hear from my rig is the pump.


----------



## Letmefly

Hi peeps, just wondering if anyone knows of brackets I could use to fit an Antec Kühler H2O 620 on a 290, and whether it will tame the beast under load.







cheers


----------



## alancsalt

Quote:


> Originally Posted by *Letmefly*
> 
> Hi peeps, just wondering if anyone knows of brackets i could use for a Antec Kuhlar H20 620 to fit on a 290 and whether it will tame the beast on load
> 
> 
> 
> 
> 
> 
> 
> cheers


What about these? A kit to fit it, and a bracket to hold a fan above the rest....
http://www.overclock.net/t/1439434/gpu-cool-the-universal-bracket-now-for-sale-givaways-inside


----------



## Letmefly

Quote:


> Originally Posted by *alancsalt*
> 
> What about these? A kit to fit it, and a bracket to hold a fan above the rest....
> http://www.overclock.net/t/1439434/gpu-cool-the-universal-bracket-now-for-sale-givaways-inside


Cheers bud, seems pretty good for seating the actual block on. What VRM and memory heatsinks would you recommend? Soz, it's been a while since I last did anything custom.


----------



## Skylark71

Quote:


> Originally Posted by *selk22*
> 
> Well I figured out my erratic clocks in games!!!
> 
> I am on a 60hz monitor so I have to use vsync very often to prevent screen tearing.. This in turn has been effecting my GPU clock and making it bounce around cause its trying to maintain that perfect 60fps mark...
> 
> Turn off Vsync! *BOOM* 1150 CONSTANT on the core literally only dips during a cut scene or something similar!
> 
> Guess that means its time to upgrade to 1440p 120hz....
> 
> Hope this helps someone!


Thank you,

that solved my problem, i use vsync also.
Just OC`ed my display to 74hz and it helped a lot.

+rep
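The reason the display overclock helps is just arithmetic: a higher refresh raises the vsync cap and shrinks the per-frame time budget, so the GPU stays loaded and holds its clocks instead of downclocking. A quick sketch (the refresh rates are just examples):

```python
# Per-frame time budget imposed by vsync at a few refresh rates
for hz in (60, 74, 120):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")
```

At 74 Hz the card has to deliver a frame every ~13.5 ms instead of every ~16.7 ms, enough extra demand to keep the clocks pinned.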


----------



## alancsalt

Quote:


> Originally Posted by *Letmefly*
> 
> Quote:
> 
> 
> 
> Originally Posted by *alancsalt*
> 
> What about these? A kit to fit it, and a bracket to hold a fan above the rest....
> http://www.overclock.net/t/1439434/gpu-cool-the-universal-bracket-now-for-sale-givaways-inside
> 
> 
> 
> Cheers bud, seems pretty good for seating the actual block on. What VRM and MEM heatsinks would you reccomend, soz it's been a while since i have decided to do anything custom.
Click to expand...

I'm in Australia. It's been seven generations since my ancestors were in your neck of the woods. Anyone from Great Britain know where Letmefly can get some suitable heatsinks?
In the States, I'd think http://www.frozencpu.com/


----------



## evensen007

I am hopeful that Warsam will come back to let us know what AMD changed in the beta 9.4 drivers to 'fix' the black screen problems. Some people have had success with them, but have also lost the overdrive functionality (full/clean install or not).

Did they adjust some invisible power limiter behind the scenes?


----------



## hotrod717

Quote:


> Originally Posted by *maarten12100*
> 
> Are you sure about this?


Absolutely, check my Sig, I own one. Not the first time someone posted this either.


----------



## rdr09

Quote:


> Originally Posted by *evensen007*
> 
> I am hopeful that Warsam will come back to let us know what AMD changed in the beta 9.4 drivers to 'fix' the black screen problems. Some people have had success with them, but have also lost the overdrive functionality (full/clean install or not).
> 
> Did they adjust some invisible power limiter behind the scenes?


even, that happened to me on a previous beta. CCC was gone. I only use CCC to enable CrossFire or adjust scaling; others use it for more than that. I find the latest WHQL hard to use for OC'ing. I just use AB, but I don't use the core voltage control, and I also make sure CCC is not one of the startup programs (it slows boot).

I am not exactly sure whether we can install CCC by itself from another driver version.

edit: this is just an example . . . 13.11 Beta 6



it might work but the BS might come back.


----------



## maarten12100

Quote:


> Originally Posted by *hotrod717*
> 
> Absolutely, check my Sig, I own one. Not the first time someone posted this either.


http://www.alternate.nl/XFX/XFX+R9_290_Core_Edition_(R9-290A-ENFC),_grafische_kaart/html/product/1104166/?tk=7&lk=10190

That one also?
I hope so as that is what I'm looking for


----------



## evensen007

Quote:


> Originally Posted by *rdr09*
> 
> even, that happened to me on a previous beta. CCC was gone. I only use CCC to enable crossfire or adjust scaling. others use it more than those stuff. i find the latest whql hard to use for oc'ing. i just use AB but i do not use the voltage control (core) and also make sure CCC is not one of the startup programs (it slows boot).
> 
> i am not exactly sure if we can install CCC by itself from another driver version.


I think CCC is in place with the new 9.4 beta, but the Overdrive/power-limit settings have been removed/nerfed. With the +50 power limit removed/not working, I imagine it would be REALLY hard to keep the cards from throttling.

I have finally got my cards overclocking properly after lots of work. They are now running at 1150/1500, which is pretty good.


----------



## rdr09

Quote:


> Originally Posted by *evensen007*
> 
> I think CCC is in place with the new 9.4 beta, but the overdrive/power limit settings have been removed/nerfed. With the power limit +50 removed/not working, I imagine it would make it REALLY hard to get the cards not to throttle.
> 
> I have finally got my cards overclocking properly after lots of work. They are now running at 1150/1500 which is pretty good.


Where is that AMD rep?! I am still using 13.11 WHQL since I never get any BS issues and I don't use CCC. Good job on the OC.


----------



## ZealotKi11er

Does Windows 8 still perform better in BF4? I am not getting amazing fps with 290X @ 1440p Ultra 0xAA.


----------



## muhd86

3 GPUs... Sapphire R9 290s.

Thought they were all locked. Cursed --

Still happy with the performance though... downloading the latest drivers from guru3d.com now.


----------



## shelter

Sign me up! Got my Gigabyte R9 290 and EK waterblock.


----------



## jerrolds

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Does Windows 8 still perform better in BF4? I am not getting amazing fps with 290X @ 1440p Ultra 0xAA.


What kind of performance are you getting? I get 75fps lows and 120fps highs, normally hovering at 100fps+ in both campaign and multiplayer. This is with Windows 8, Ultra, no AA, and the 290X @ 1200/1500.

Outdoors usually around 90+ and indoors 100+. I'm happy with the performance of the card.


----------



## Arizonian

Some news: I have purchased a 780 Ti ACX.

I'll elaborate; my second rig is an Nvidia 3D Vision rig, and when 20nm comes out I'm going to be upgrading again. This will give me a chance to upgrade my second rig's 2GB 690 to the 3GB 780 Ti, keeping 3D Vision enabled and adding another gig of VRAM, which I feel will be needed eventually. My family still watches 3D movies frequently, and we use that computer for 3D gaming on new titles. Eventually the main rig becomes the second rig with hand-me-downs, etc.

Secondly, on top of that, I was able to take advantage of the $100-off promo for the Nvidia Shield that ends today, which my sister and I are gifting to nephews for Christmas. She made the purchase today.

Between the promo and two of the three bundle games I was going to buy anyway, it was a nice value and sweetened the deal a bit.

Which brings me to the following:

*I am happy to remain the thread starter and active in this club; however, I'm also going to offer a 290X / 290 owner the opportunity to take on the task of thread starter to better serve the club. PM me if you're interested and willing to dedicate the time and consistent effort to keep the club active and updated.*

I'm sure when Mantle comes out I'm going to regret this decision. I'll be looking forward to Mantle launching, and by the time the 20nm video cards come out Mantle will be in full force. Again, don't worry gentlemen, I'm not going anywhere and will continue to fully support the club. This wasn't an easy decision.


----------



## TheSoldiet

Nice! Hope you are happy with your purchase, but what happened to your 290X?


----------



## Arizonian

Quote:


> Originally Posted by *TheSoldiet*
> 
> Nice! Hope you are happy with your purchase, but what happened to your 290X?


I sold it a while back since I wasn't water-blocking it. While the card's availability was scarce I was able to sell it for what I paid for it, and I got to keep BF4.


----------



## evensen007

I got lucky and got in on the XFX 290 BF4 voucher before they apparently closed it off. I had already bought BF4, so now I have 2 codes from both of my cards.


----------



## conwa

Quote:


> Originally Posted by *maarten12100*
> 
> http://www.alternate.nl/XFX/XFX+R9_290_Core_Edition_(R9-290A-ENFC),_grafische_kaart/html/product/1104166/?tk=7&lk=10190
> 
> That one also?
> I hope so as that is what I'm looking for


I just bought that from Alternate and it's unlocked, but with a large amount of coil whine...


----------



## zpaf

Quote:


> Originally Posted by *grandpatzer*
> 
> Can you please check what youre ASIC is with GPU-Z?
> You get ASIC info by clicking top left on gpu-z window.


----------



## r0cawearz

what kind of vddc is safe for 24/7 usage on this card air cooled?


----------



## zpaf

Quote:


> Originally Posted by *GenoOCAU*
> 
> Just thought id post some of my 3dmark 11 benchmarking on these two XFX R9 290's (elpida). These ek blocks keep these cards frosty cool, havn't finished testing yet.
> 
> P25569 - 2x XFX R9 290 4gb - 1195/1420 - i7 3960x @ 5.1GHz, 16gb G.skill TridentX @ 2400 CL9-11-11-21 - Asus Rampage IV Formula - Geno - Watercooled - Link
> 
> P16393 - XFX R9 290 4gb - *1230/1400*/1.33v - i7 3960x @ 4.9GHz, 16gb G.skill TridentX @ 2400 CL9-11-11-21 - Asus Rampage IV Formula - Geno - Watercooled - Link


I think you can do better on gpu score.

1234/1452

Generic VGA video card benchmark result - Intel Core i7-4770K,ASUSTeK COMPUTER INC. MAXIMUS VI IMPACT


----------



## maarten12100

Quote:


> Originally Posted by *conwa*
> 
> I just bought that from Alternate and its unlocked, but with a large amount of coil whine...


One of my Sapphire 290s whined, so it is luck of the draw pretty much.
Have you been able to redeem a BF4 code through the XFX site? If so, I'll jump ship to two of these ASAP.


----------



## muhd86

Stock 3930K, stock RAM, slight OC on the core and memory.

I am wondering what score I will get with a 4.8GHz OC on the CPU...


----------



## Arizonian

Quote:


> Originally Posted by *shelter*
> 
> Sign me up! Got my Gigabyte R9 290 and EK waterblock.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Also for any new members which there are quite a few I've seen in the last few days. Please submit an entry pic w/OCN name on it to be added to the club roster and welcome aboard.


----------



## Imprezzion

I'm buying myself a second Club3D (for the Dutchies among us: 4Launch) to see if it clocks any better than my current unlocked one, which only goes as high as 1175MHz at the max possible voltage (1.4v setpoint, 1.3v load) before starting to artifact.


----------



## ihaveworms

Quote:


> Originally Posted by *r0cawearz*
> 
> what kind of vddc is safe for 24/7 usage on this card air cooled?


I too would like to know what a safe voltage is for this card, in my case water.


----------



## zpaf

First try to break 13K on Fire Strike.

1235/1600








AMD Radeon R9 290 video card benchmark result - Intel Core i7-4770K,ASUSTeK COMPUTER INC. MAXIMUS VI IMPACT


----------



## jerrolds

Quote:


> Originally Posted by *zpaf*
> 
> First try to break 13K on Fire Strike.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 1235/1600
> 
> 
> 
> 
> 
> 
> 
> 
> AMD Radeon R9 290 video card benchmark result - Intel Core i7-4770K,ASUSTeK COMPUTER INC. MAXIMUS VI IMPACT


If you have Elpida memory, try lowering your mem to 1500MHz; it netted me 200 points over 1600MHz. Seems to be a sweet spot for me.

http://www.3dmark.com/compare/fs/1187577/fs/1182498 - 1225/1500
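Whether 1500MHz or 1600MHz scores better comes down to error rates and retries on the Elpida chips, but the theoretical bandwidth difference is easy to compute. A sketch assuming the 290/290X's 512-bit GDDR5 bus (GDDR5 transfers 4 bits per pin per clock); the helper function is mine:

```python
def gddr5_bandwidth_gbs(mem_mhz, bus_bits=512):
    """Theoretical bandwidth in GB/s: GDDR5 moves 4 bits per pin per clock,
    so bytes/s = clock * 4 * (bus width in bytes)."""
    return mem_mhz * 1e6 * 4 * (bus_bits / 8) / 1e9

print(gddr5_bandwidth_gbs(1250))  # stock 290/290X: 320.0 GB/s
print(gddr5_bandwidth_gbs(1500))  # 384.0 GB/s
print(gddr5_bandwidth_gbs(1600))  # 409.6 GB/s
```

If 1500MHz outscores 1600MHz despite ~26 GB/s less raw bandwidth, the memory is almost certainly erroring and retrying at the higher clock.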


----------



## Ukkooh

After having it for a week, I got my first black screen on my 290X today. I just emailed the shop I bought it from to ask whether that is enough to RMA it, since it doesn't OC too well either (1150MHz core with Afterburner voltage at +100mV).


----------



## Kolier

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Well the final piece of the puzzle has been solved for me.. Once I fixed my downclock issue all I had left to figure out was my erratic gpu usage issue. A lot of people were telling me it's just an AB bug but with my fps jumping around a lot I knew that couldn't be the case...
> 
> Check it out now! This is Arkham Origins... the long dips are the stupid cut scenes. I'm getting incredible FPS now with 4x msaa, and alllll the dx 11 junk turned up. Everything maxed out except no physx of course. This 290 at these clocks has made me so happy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No more stutter either...
> 
> So what was my fix? I WENT BACK TO MY 2600k!!!!
> 
> 
> 
> 
> 
> 
> 
> My fx 8350 @ 4.8ghz just couldn't muster. I switched teams to give it a shot and although it did well for some things over my intel, when it comes to gaming it's just painfully not able to push these gpu's the way they need to be pushed. IMO
> 
> My sandybridge is only at 4.2ghz, memory is downclocked to 2133 instead of 2400 when I had it on the amd rig.
> 
> I am happy
> 
> EDIT: This is on the 13.11 WQHL driver


I have the same issue as you. I think it's a Win 8.1-specific issue, because in Win 7 it works well.
In other words, do you mean that after you switched to an Intel CPU and motherboard, the issue was solved on your Win 8.1?


----------



## zpaf

Quote:


> Originally Posted by *jerrolds*
> 
> If you have Elpida memory - try lowering your mem to 1500mhz, it net me 200pts over 1600mhz. Seems to be a sweet spot for me.
> 
> http://www.3dmark.com/compare/fs/1187577/fs/1182498 - 1225/1500


I had this http://www.3dmark.com/3dm/1675598 with Windows 8.1

I always check RAM with AvP and I see gains.




Anyway I will try.
Thanks


----------



## Epsi

Quote:


> Originally Posted by *Ukkooh*
> 
> After having it for a week got my first blackscreen on my 290x today. I just emailed the shop I bought it from if it is enough to RMA it or not since it doesn't OC too well. (1150mhz core with afterburner voltage +100mV)


I can't even reach 1150MHz, not even at max volts (1.410v in GPU Tweak). Really crappy; maybe I'm going to RMA it as well because of the coil whine.

Even if I get a new one that won't unlock, if it clocks higher it will feel better









So i have a really bad clocker here also.


----------



## Falkentyne

Quote:


> Originally Posted by *Ukkooh*
> 
> After having it for a week got my first blackscreen on my 290x today. I just emailed the shop I bought it from if it is enough to RMA it or not since it doesn't OC too well. (1150mhz core with afterburner voltage +100mV)


What memory is on that card?


----------



## Ukkooh

Quote:


> Originally Posted by *Falkentyne*
> 
> What memory is on that card?


Elpida, and the card was completely stock when the crash happened. Well, I did have the fan speed at 70%, but that shouldn't hurt.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Kolier*
> 
> I've the same issue as you, I think it's a win 8.1 specific issue, because in win 7, it works well.
> In your words, do you mean after you used an Intel CPU and Motherboard, the issue was solved on your Win 8.1?


Yeah, I've been on 8.1 the whole time. I did two fresh OS installs on the AMD CPU with no difference, then went back to my Intel 2600K and it was just a better experience. Not that my experience on the AMD was bad... it was a random wild hair up my arse that made me do it. And since it's better, I figure why go back.


----------



## famich

It looks like the chips are of different quality - I got a Sapphire with 75% ASIC, one of the first 8000 BF4 editions.
AFB +100mV @ 1220, +150mV @ 1250+ - a good chip, watercooled with an EK block.


----------



## Fahrenheit85

So who's the best manufacturer for 290x, Asus? Sapphire? Gigabyte?


----------



## the9quad

Quote:


> Originally Posted by *famich*
> 
> It looks that chips are of a different quality -I got Sapphire ASIC 75% . One of the first 8000 BF 4 s.
> AFB +100mV @1220 , +150 mV 1250+-a good chip, watercooling with EK block.


I got one of the first 8,000 as well; it was Elpida, had around a 71% ASIC, and black-screened like a champ. I don't think when you got the card makes much of a difference.


----------



## Ukkooh

Quote:


> Originally Posted by *famich*
> 
> It looks that chips are of a different quality -I got Sapphire ASIC 75% . One of the first 8000 BF 4 s.
> AFB +100mV @1220 , +150 mV 1250+-a good chip, watercooling with EK block.


Well, mine is a BF4 edition too, with a 73.9% ASIC, and it's still black-screening at stock. Kind of odd that it didn't black-screen when I pushed it to the limit and had light artifacting.


----------



## jerrolds

Are chips higher quality lately? I got my Sapphire BF4 edition on day 1 and it's a 75% ASIC that has barely black-screened - maybe 3 times total, and none after the patch. Pushed to 1225/1600 without black-screening.


----------



## kcuestag

Anyone experienced a Red Screen of Death yet? I just got my first one while playing Battlefield 4:


----------



## bond32

I've gotten that red screen before. I think it's an unstable overclock on the CPU. The red screen is the same as a BSOD, at least I think.


----------



## jassilamba

Quote:


> Originally Posted by *kcuestag*
> 
> Anyone experienced a Red Screen of Death yet? I just got my first one while playing Battlefield 4:


I had this happen the other day (on a 7950), updated to the latest beta driver, and it's been fine now. No OCs of any kind on anything.


----------



## kcuestag

Quote:


> Originally Posted by *bond32*
> 
> I've got that red screen before. I think it's an unstable overclock on cpu. The red screen is the same as a bsod, at least I think.


It's not, I never got them before except when I had my GTX680 SLI and they had a bug in their drivers with Red Orchestra 2.


----------



## Raxus

Quote:


> Originally Posted by *Fahrenheit85*
> 
> So who's the best manufacturer for 290x, Asus? Sapphire? Gigabyte?


It doesn't really matter, unless you want to watercool it and not void your warranty.


----------



## bond32

Well, guess I was wrong. It sure seemed like a BSOD to me though, because when I play BF4 and get the red screen my sound goes wonky too. It sounds exactly the same as a BSOD, and if I let it sit for a few seconds it will restart.


----------



## ZealotKi11er

Which memory overclocks better? Hynix or Elpida?


----------



## Ukkooh

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Which memory overclocks better? Hynix or Elpida?


I think someone from OCUK said earlier that they both oc pretty much the same.


----------



## jassilamba

Quote:


> Originally Posted by *Raxus*
> 
> It doesnt really matter unless you want to watercool it and not void your warranty.


So do all AMD GPU manufacturers void the warranty if you remove the stock cooler?

I have been on the fence between a couple of Sapphire 290s (CF) or some EVGA 780s (SLI), and I will be water cooling.

My primary use will be BF4 at 1440p, and I don't think I will need a pair of 290Xs for that.


----------



## Ukkooh

Quote:


> Originally Posted by *jassilamba*
> 
> So do all AMD GPU manufacturers void the warranty if you remove the stock cooler?
> 
> I have been on the fence between couple Sapphire 290s (CF) or some EVGA 780s (SLI), and will be water cooling.
> 
> My primary use will be BF4 on 1440P, and don't think I will need a pair 290Xs for that.


From what I've heard, Club3D, Asus and MSI don't void the warranty. Club3D is the only one I managed to confirm myself.

Edit: Club 3D's reply when I asked about it:
"Does the warranty break on your r9 290x cards if i replace the reference cooler with a water block?"
"Thanks for your message.
In case the product goes defective and you return it back in the original condition with no damages on the card and cooler itself, it's no problem.
Best regards
Club 3D support."


----------



## X-oiL

Add me please







ASIC 77.1 and 78.5 with Hynix memory; both cards locked.


----------



## famich

Quote:


> Originally Posted by *Ukkooh*
> 
> Well mine is a BF4 edition too with 73.9% asic and still black screening at stock. Kind of odd that it didn't black screen when I pushed it to the limit and had light artifacting.


I've had no such thing; strange...


----------



## jerrolds

Quote:


> Originally Posted by *jassilamba*
> 
> So do all AMD GPU manufacturers void the warranty if you remove the stock cooler?
> 
> I have been on the fence between couple Sapphire 290s (CF) or some EVGA 780s (SLI), and will be water cooling.
> 
> My primary use will be BF4 on 1440P, and don't think I will need a pair 290Xs for that.


Check the Sapphire thread - they don't allow modifications on their cards. On the other hand, there are no stickers that get broken when removing the stock cooler.


----------



## Fahrenheit85

Quote:


> Originally Posted by *Raxus*
> 
> It doesnt really matter unless you want to watercool it and not void your warranty.


Well, I see people saying they flash to the Asus BIOS for unlocked voltage, so that's why I asked. Also worried about coil whine. And the cards are going under water.

And hey, it's nice to see someone else from PA on here


----------



## Forceman

Quote:


> Originally Posted by *kcuestag*
> 
> Anyone experienced a Red Screen of Death yet? I just got my first one while playing Battlefield 4:


I was getting them until I lowered my memory overclock. Pushed it back up last night and got one again, so you might consider that. Dropping the memory just 25 MHz appears to have stopped them.


----------



## jassilamba

Quote:


> Originally Posted by *Ukkooh*
> 
> From what I've heard club3d, asus and msi don't void the warranty. Club3d is the only one I managed to confirm myself.
> 
> Edit: Club 3D's reply when I asked about it:
> "Does the warranty break on your r9 290x cards if i replace the reference cooler with a water block?"
> "Thanks for your message.
> In case the product goes defective and you return it back in the original condition with no damages on the card and cooler itself, it's no problem.
> Best regards
> Club 3D support."


Awesome. Of the above, I'd rather take MSI. I have heard mostly bad experiences with Asus support.
Quote:


> Originally Posted by *jerrolds*
> 
> Check the Sapphire thread - but they don't allow modifications on their cards. On the other hand, there are no stickers that get broken when removing the stock cooler.


That makes sense; as long as there is no physical damage on the GPU, I don't see any way for them to tell if the cooler was removed or not.

Thinking of ordering two Sapphire 290 BF4 editions.


----------



## famich

Sapphire should pose no problem when exchanging the cooler, IMHO.
No damage and that's it.









BTW Valley +1270MHz +200mV AFB " manual offset"

http://abload.de/image.php?img=0000139a04.png


----------



## kcuestag

Quote:


> Originally Posted by *Forceman*
> 
> I was getting them until I lowered my memory overclock. Pushed it back up last night and got one again, so you might consider that. Dropping the memory just 25 MHz appears to have stopped them.


Thanks, I'll give it a try tomorrow.

I just did a few Valley benchmarks at 1125MHz Core and tested the following:

1250MHz Memory (Stock)
1350MHz Memory (Where I got RSOD at BF4)
1325MHz Memory (-25MHz as you suggested)

And I noticed the gain between 1250 and 1350MHz Memory was literally 0.8fps.









I guess I'll just keep the Memory at stock since it seems pretty useless to overclock it. Instead I'll focus on the Core.
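For what it's worth, the payoff above can be sanity-checked with quick arithmetic: the 1250 to 1350MHz bump is an 8% bandwidth increase that bought roughly 1% in frame rate. A minimal sketch (the 60 fps baseline is an assumed figure for illustration, not from the post):

```python
# Rough sanity check of the memory-OC payoff reported above.
stock_mem = 1250            # MHz, stock memory clock
oc_mem = 1350               # MHz, overclocked memory clock
measured_fps_gain = 0.8     # fps gain measured in Valley, per the post
assumed_baseline_fps = 60.0 # assumed baseline for illustration only

bandwidth_gain_pct = (oc_mem - stock_mem) / stock_mem * 100
fps_gain_pct = measured_fps_gain / assumed_baseline_fps * 100

print(f"bandwidth +{bandwidth_gain_pct:.1f}%, fps +{fps_gain_pct:.1f}%")
# -> bandwidth +8.0%, fps +1.3%
```

An 8% bandwidth bump returning about 1% in fps suggests the core, not the memory, is the bottleneck at these settings, which matches the conclusion of keeping memory at stock.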


----------



## centvalny

Probably the limit for this PowerColor locked 290 baldie on H2O. RAM clocking for CPU and GPU (Elpida) is optimal for ambient cooling.



http://imgur.com/o679zJ4



http://www.3dmark.com/3dm11/7554245


----------



## Jpmboy

Just received a replacement 290X from Newegg. Downloaded the most recent CCC from AMD... installed, looks fine. AND THERE'S NO AMD OVERDRIVE IN CCC?


----------






## jerrolds

Quote:


> Originally Posted by *Jpmboy*
> 
> Just received a replacement 290x from new egg. Downloaded the most recent CCC from AMD... installed, looks fine. AND THERE'S NO AMD OVERDRIVE IN CCC?


There is. Uninstall it again, delete Windows\Temp and User\LocalAppData\Temp (or wherever), and reinstall.

If you wanna take it another step further, uninstall with Display Driver Uninstaller in safe mode: https://forums.geforce.com/default/topic/550192/geforce-drivers/display-driver-uninstaller-ddu-v9-6-1-released-11-19-13-/ - it works on AMD as well; I've used it with good results.

But I think just deleting the temp files before/after uninstalling will work just fine.

Although I don't even enable CCC and just do my overclocking in GPU Tweak, I don't get any GPU throttling or usage fluctuations.
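The temp-folder cleanup described above can be scripted. A minimal sketch, assuming the standard Windows temp locations are the targets; the demo runs on a scratch directory rather than the real paths:

```python
import os
import shutil
import tempfile

def clear_dir(path):
    """Remove every entry in a temp directory, skipping files in use."""
    removed = 0
    if not os.path.isdir(path):
        return removed
    for name in os.listdir(path):
        full = os.path.join(path, name)
        try:
            if os.path.isdir(full):
                shutil.rmtree(full)
            else:
                os.remove(full)
            removed += 1
        except OSError:
            pass  # locked by another process; leave it
    return removed

# Demo on a scratch directory; the real targets would be
# %TEMP% (tempfile.gettempdir()) and C:\Windows\Temp.
scratch = tempfile.mkdtemp()
open(os.path.join(scratch, "stale.log"), "w").close()
print(clear_dir(scratch), "entries removed")  # -> 1 entries removed
shutil.rmtree(scratch)
```

DDU in safe mode, as linked above, remains the more thorough route; this only automates the temp-file step.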


----------



## jerrolds

Quote:


> Originally Posted by *centvalny*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Probably the limit for this powercolor locked 290 baldie on h2o and ram clocking for cpu and gpu (elpida) are optimal for ambient cooling
> 
> 
> 
> http://imgur.com/o679zJ4
> 
> 
> 
> http://www.3dmark.com/3dm11/7554245


1250MHz on only 1363mV in GPU Tweak? Nice - what's the ASIC on that?


----------



## jassilamba

Quote:


> Originally Posted by *centvalny*
> 
> Probably the limit for this powercolor locked 290 baldie on h2o and ram clocking for cpu and gpu (elpida) are optimal for ambient cooling
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://imgur.com/o679zJ4
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/7554245


How were the temps under water in the above run?


----------



## Jpmboy

Quote:


> Originally Posted by *jerrolds*
> 
> There is. Uninstall it again, delete Windows\Temp and User\LocalAppData\Temp (or wherever), and reinstall.
> 
> If you wanna take it another step further, uninstall with Display Driver Uninstaller in safe mode: https://forums.geforce.com/default/topic/550192/geforce-drivers/display-driver-uninstaller-ddu-v9-6-1-released-11-19-13-/ - it works on AMD as well; I've used it with good results.
> 
> But I think just deleting the temp files before/after uninstalling will work just fine.
> 
> Although I don't even enable CCC and just do my overclocking in GPU Tweak, I don't get any GPU throttling or usage fluctuations.


In the beta version there was. I was using it a week ago before returning that 290X (it cooked itself). Very strange.
Will try...


----------



## centvalny

Quote:


> Originally Posted by *jerrolds*
> 
> whats the ASIC on that?


Quote:


> Originally Posted by *jassilamba*
> 
> How were the temps under water in the above run?


Thanks









Using the old swiftech block



http://imgur.com/SIbTpC2


----------



## jassilamba

Quote:


> Originally Posted by *centvalny*
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Using the old swiftech block
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://imgur.com/SIbTpC2


Very nice. I'm gonna wait for the Aquacomputer blocks for the R9 to hit the market before I put them under water.

290s are looking more and more tempting....


----------



## rdr09

Quote:


> Originally Posted by *zpaf*
> 
> I think you can do better on gpu score.
> 
> 1234/1452
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Generic VGA video card benchmark result - Intel Core i7-4770K,ASUSTeK COMPUTER INC. MAXIMUS VI IMPACT


Genoa's score does seem kinda low. Here is mine at 1200/1500:

http://www.3dmark.com/3dm11/7556915


----------



## centvalny

Quote:


> Originally Posted by *jassilamba*
> 
> 290s are looking more and more tempting....


Thanks, it's fun benching this card











http://imgur.com/NjxVnGk


----------



## Jpmboy

Quote:


> Originally Posted by *jerrolds*
> 
> There is. Uninstall it again, delete Windows\Temp and User\LocalAppData\Temp (or wherever), and reinstall.
> 
> If you wanna take it another step further, uninstall with Display Driver Uninstaller in safe mode: https://forums.geforce.com/default/topic/550192/geforce-drivers/display-driver-uninstaller-ddu-v9-6-1-released-11-19-13-/ - it works on AMD as well; I've used it with good results.
> 
> But I think just deleting the temp files before/after uninstalling will work just fine.
> 
> Although I don't even enable CCC and just do my overclocking in GPU Tweak, I don't get any GPU throttling or usage fluctuations.


Nope - no AMD Overdrive. Yeah, sure - I had my last 290X under water with GPU Tweak and the volt hack. Geeze, what's with AMD? This is ridiculous. Now I have to spend time just to get this Fisher-Price crap taken care of.


----------



## DampMonkey

Quote:


> Originally Posted by *centvalny*
> 
> Thanks, it's fun benching this card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://imgur.com/NjxVnGk


It won't be fun when your VRMs combust. I hope this pic is just from a leak test?


----------



## jerrolds

Quote:


> Originally Posted by *centvalny*
> 
> Thanks, it's fun benching this card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://imgur.com/NjxVnGk


Hold up - you have no sinks on your VRMs at all... what are your VRM temps when the core is at 1250MHz and voltage is 1353mV in GPU Tweak? Is this artifact free/stable after an hour+?

I can only get 1225MHz on the core, max temps are 65C, and my ASIC is 75%... I wonder if I should just lose the VRM sinks like you...


----------



## rdr09

Quote:


> Originally Posted by *Jpmboy*
> 
> Nope - no AMD Overdrive. Yeah, sure - I had my last 290X under water with GPU Tweak and the volt hack. Geeze, what's with AMD? This is ridiculous. Now I have to spend time just to get this Fisher-Price crap taken care of.


9.4? That's for those with black screen issues. Do you mind using 13.11 WHQL?


----------



## centvalny

Quote:


> Originally Posted by *jerrolds*
> 
> Hold up - you have no sinks on your VRMs at all...what are your VRM temps at when core is at 1250mhz and voltage is 1353mv in gpu tweak? Is this artifact free/stable after an hour+?
> 
> I can only get 1225mhz on the core and max temps is 65C and my ASIC is 75%...i wonder if i just lose the vrm sinks like you...


My oc settings above only for benching, no way it will hold for 24/7


----------



## Raxus

Update to Msi please.


----------



## Fahrenheit85

So do all 290Xs have unlocked voltage?


----------



## Forceman

Quote:


> Originally Posted by *kcuestag*
> 
> Thanks, I'll give it a try tomorrow.
> 
> I just did a few Valley benchmarks at 1125MHz Core and tested the following:
> 
> 1250MHz Memory (Stock)
> 1350MHz Memory (Where I got RSOD at BF4)
> 1325MHz Memory (-25MHz as you suggested)
> 
> And I noticed the gain between 1250 and 1350MHz Memory was literally 0.8fps.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I guess I'll just keep the Memory at stock since it seems pretty useless to overclock it. Instead I'll focus on the Core.


Strangely enough, 1350 is where I got my RSOD as well. 1325 was trouble free over the weekend though. What's crazy is that I can run Heaven, Valley, Metro, etc at 1450 no problem. I don't know what BF4's deal is, unless it's actually just some kind of bug.


----------



## iPDrop

Hey if anyone can help me real quick I have an issue/question about overclocking two R9 290's

I'm using GPU Tweak and I set these settings:

GPU Clock 1127MHz

GPU Voltage 1300mV

Memory Clock default @ 5000MHz

Power Target 150%

Fanspeed is user defined.

I'm getting some flashes of a pattern of tiny white boxes and other flashes in the game... Does this mean I need to lower the GPU clock? But that's kind of strange because I can do 1100MHz just fine at the stock 1250mV voltage

Also, is adjusting the GPU Voltage the only thing I need to do to adjust the voltage? In other words, is that voltage slider actually doing anything or is the voltage locked? If so, can I unlock it and how?


----------



## velocityx

Quote:


> Originally Posted by *Fahrenheit85*
> 
> So do all 290Xs have unlocked voltage?


is that a question?

Anyway, I used to have 6970s in CF. The experience was OK when frames were in the 120 range; anything below and the stutters were too noticeable.

Anyone with 290 CF, how is it? I kinda want another 290....


----------



## Arizonian

Quote:


> Originally Posted by *X-oiL*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Add me please
> 
> 
> 
> 
> 
> 
> 
> ASIC 77.1 and 78.5 with Hynix memory, both cards locked.


Congrats - added









Quote:


> Originally Posted by *Raxus*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Update to Msi please.


Congrats - added


----------



## RAFFY

Quote:


> Originally Posted by *kcuestag*
> 
> I'm pretty sure they said they would delay it a few days as it needed some work.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So yeah, I have a 290X which can't even work at idle, so right now I am running with NO drivers and it's working fine ...


Sounds like something else is wrong with your computer and not the 290x.
Quote:


> Originally Posted by *Kriant*
> 
> Who in EK came up with that installation for the 3-slot bridge? I can't fit any of the O-rings without them flying all over the place when I try to mount that SLI bridge.


I was under the impression that ALL EK bridges were incompatible with the new 290/290x blocks. Have they come out with a compatible version?


----------



## jerrolds

Quote:


> Originally Posted by *iPDrop*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Hey if anyone can help me real quick I have an issue/question about overclocking two R9 290's
> 
> I'm using GPU Tweak and I set these settings:
> 
> GPU Clock 1127MHz
> 
> GPU Voltage 1300mV
> 
> Memory Clock default @ 5000MHz
> 
> Power Target 150%
> 
> Fanspeed is user defined.
> 
> I'm getting some flashes of a pattern of tiny white boxes and other flashes in the game... Does this mean I need to lower the GPU clock? But that's kind of strange because I can do 1100MHz just fine at the stock 1250mV voltage
> 
> Also, is adjusting the GPU Voltage the only thing I need to do to adjust the voltage? In other words, is that voltage slider actually doing anything or is the voltage locked? If so, can I unlock it and how?


Yup - usually. You can try increasing the voltage/power limit while keeping temps cool to see if it stabilizes. If you're at 1127MHz at stock voltages, you probably have some room to go... bump up the voltage a bit; you're probably going to need to increase fan speeds though.


----------



## Scorpion49

Anybody have an Nvidia card in their system with the 290s? I put in my 9800GT to see if I can get hybrid PhysX working, but the display only works on the Nvidia card now. If I plug in the 290 it shows the desktop for a brief second and then goes black, though I can still see the mouse cursor moving around. Anyone know how to fix it?


----------



## iPDrop

Quote:


> Originally Posted by *jerrolds*
> 
> Yup - usually. You can try increasing the voltage/power limit while keeping temps cool to see if it stabilizes. If you're at 1127MHz at stock voltages, you probably have some room to go... bump up the voltage a bit; you're probably going to need to increase fan speeds though.


But I thought that 1300mV was like the highest you should ever go for any gpu


----------



## selk22

Quote:


> Originally Posted by *Forceman*
> 
> Strangely enough, 1350 is where I got my RSOD as well. 1325 was trouble free over the weekend though. What's crazy is that I can run Heaven, Valley, Metro, etc at 1450 no problem. I don't know what BF4's deal is, unless it's actually just some kind of bug.


Yeah, it's weird; on my card anything in the 1300s is completely glitchy but at 1400 it's 100% stable? I am not complaining though! This is on the stock cooler btw.


----------



## Jpmboy

Uninstalled beta 9.4... reinstalled beta 9.2 and got CCC control back:

Capture.PNG 167k .PNG file

Does anyone have AMD Overdrive in 13.11 beta 9.4??

Flashed over to the Asus BIOS, got voltage control via GPU Tweak... okay - onward ho..


----------



## Raxus

Quote:


> Originally Posted by *Fahrenheit85*
> 
> Well I see people saying they flash to Asus bios for unlocked voltage so that's why I asked. Also worried about coil whine. Also cards are going under water.
> 
> And hey its nice to see someone else from PA on here


Ya, good ol' Philly. I confirmed with MSI that they don't void your warranty unless you physically damage the card. There's a thread on the MSI forums about it.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *famich*
> 
> Sapphire should pose no problem when exchanging the cooler IMHO .
> No damage and that s it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BTW Valley +1270MHz +200mV AFB " manual offset"
> 
> http://abload.de/image.php?img=0000139a04.png


How did you do the +200mV mod in AB??? I tried going through this thread again to find it but maaaaaaaan it's taking forever! LOL I would greatly appreciate the assistance. I'm getting good clocks at +100mV; anxious to see some higher clocks.


----------



## Arizonian

Quote:


> Originally Posted by *Raxus*
> 
> Ya good ol philly. I confirmed with Msi that they don't void your warranty unless you physically damage the card. There's a thread in the Msi forums about it.


Same reason tsm went MSI. That's the way it should be. Manufacturers that don't are restricting sales from enthusiasts who change GPUs frequently and alienating their best clients, IMO.


----------



## Jpmboy

Quote:


> Originally Posted by *Raxus*
> 
> Ya good ol philly. I confirmed with Msi that they don't void your warranty unless you physically damage the card. There's a thread in the Msi forums about it.


Ha! A local! Quick test of this card (after pulling 2 water-cooled Titans from the bench)... time to put the EK block on. Not a bad card.. we'll see. The last one is still holding in the top 10 on a few benches around here.. alas, she died in the call of duty.











btw - there's a guy in Hershey that is wringing SLI Titans very hard.. no hard mods - just the available soft mods.


----------



## Scorpion49

Well, I can say that installing an Nvidia card with these ones is a very bad idea. I can't get display out of either of them now, so I'm going to have to go ahead and wipe windows and re-install everything. Never had a problem running hybrid setups in the past at all, so I'm not sure what the deal is with these cards.


----------



## Jpmboy

Quote:


> Originally Posted by *Scorpion49*
> 
> Well, I can say that installing an Nvidia card with these ones is a very bad idea. I can't get display out of either of them now, so I'm going to have to go ahead and wipe windows and re-install everything. Never had a problem running hybrid setups in the past at all, so I'm not sure what the deal is with these cards.


Download and clean your system with Display Driver Uninstaller (DDU). I've swapped NV and AMD on the "parkbench" so many times it's embarrassing...


----------



## Jpmboy

Quote:


> Originally Posted by *Scorpion49*
> 
> Anybody have an Nvidia card in their system with the 290's? I put in my 9800GT to see if I can get hybrid PhysX working, but the display only works on the Nvidia card now. If I plug in the 290 it shows the desktop for a brief second and then it goes black, however I can still see the mouse cursor moving around. Anyone know how to fix it?


That will not work.


----------



## Scorpion49

Quote:


> Originally Posted by *Jpmboy*
> 
> Download and clean your system with Display Driver Uninstaller (DDU). I've swapped NV and AMD on the "parkbench" so many times it's embarrassing...


Yeah, that doesn't help. Something has been corrupted in Windows besides the driver. It happened exactly the same with the R7 270X I had before for a friend's machine, actually.
Quote:


> Originally Posted by *Jpmboy*
> 
> That will not work.


I disagree, I've run Hybrid PhysX for a long, looong time and it works great with many generations of GPU (most recently with 7950CF). However, these newest AMD drivers don't seem to want to play nice.


----------



## Technewbie

Quote:


> Originally Posted by *iPDrop*
> 
> Hey if anyone can help me real quick I have an issue/question about overclocking two R9 290's
> 
> I'm using GPU Tweak and I set these settings:
> 
> GPU Clock 1127MHz
> 
> GPU Voltage 1300mV
> 
> Memory Clock default @ 5000MHz
> 
> Power Target 150%
> 
> Fanspeed is user defined.
> 
> I'm getting some flashes of a pattern of tiny white boxes and other flashes in the game... Does this mean I need to lower the GPU clock? But that's kind of strange because I can do 1100MHz just fine at the stock 1250mV voltage
> 
> Also, is adjusting the GPU Voltage the only thing I need to do to adjust the voltage? In other words, is that voltage slider actually doing anything or is the voltage locked? If so, can I unlock it and how?


You are getting artifacts because of an unstable overclock. You will either need to lower the clock speeds until they go away or increase the voltage. The reason you can run 1100MHz just fine at stock is because the voltage is high enough for those clocks, but once you increased the clock the voltage was no longer enough. And yes, the voltage is unlocked; you just use the slider to adjust it.


----------



## iPDrop

Quote:


> Originally Posted by *Technewbie*
> 
> You are getting artifacts because of an unstable overclock. You will either need to lower the clock speeds until they go away or increase the voltage. The reason you can run 1100MHz just fine at stock is because the voltage is high enough for those clocks, but once you increased the clock the voltage was no longer enough. And yes, the voltage is unlocked; you just use the slider to adjust it.


Yeah but _I did_ increase the voltage.

1100MHz 1250mV stable

1127MHz 1300mV artifacts....

It's not safe to increase the voltage past 1300mV, am I right?


----------



## CurrentlyPissed

And I just had a black screen in Windows; hadn't had one on the new XFX cards until now. I was just browsing and whammo.

(stock speeds custom fan profile)


----------



## iPDrop

I think my cards just died or something... I was playing BF4 just fine, then all of a sudden extreme lag and like 10 fps with two R9 290s.


----------



## Raxus

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added


Hey Arizonian, you have me on there twice; I just needed my Sapphire R9 290X updated to an MSI R9 290X.


----------



## Arizonian

Quote:


> Originally Posted by *Raxus*
> 
> Hey Arizonian, you have me on there twice; I just needed my Sapphire R9 290X updated to an MSI R9 290X.










Will fix that.


----------



## Raxus

Quote:


> Originally Posted by *Arizonian*
> 
> 
> 
> 
> 
> 
> 
> 
> Will fix that.


Lol thanks, it's a 290X as well.


----------



## the9quad

Is 9959 pretty good in Firestrike Extreme with 290Xs in Crossfire? I really want to hit 10k but don't feel like pushing these on air.


----------



## warbucks

Got my EK waterblocks in for my R9 290's. Going to install them tonight! You can update me to water now


----------



## Scorpion49

Quote:


> Originally Posted by *Arizonian*
> 
> 
> 
> 
> 
> 
> 
> 
> Will fix that.


I've never been added at all


----------



## Jpmboy

Quote:


> Originally Posted by *Scorpion49*
> 
> Yeah, that doesn't help. Something has been corrupted in windows besides the driver. It happened exactly like the R7-270X I had before for a friends machine, actually.
> I disagree, I've run Hybrid PhysX for a long, looong time and it works great with many generations of GPU (most recently with 7950CF). However, these newest AMD drivers don't seem to want to play nice.


I agree, it worked in the past. Do you actually gain anything with a separate PhysX coprocessor? I haven't since CFX 7970s.


----------



## Arizonian

Quote:


> Originally Posted by *Raxus*
> 
> Lol thanks, it's a 290X as well.


Thanks for your patience. Updated.









Quote:


> Originally Posted by *warbucks*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Got my EK waterblocks in for my R9 290's. Going to install them tonight! You can update me to water now


Congrats - updated


----------



## jerrolds

Quote:


> Originally Posted by *iPDrop*
> 
> Yeah but _I did_ increase the voltage.
> 
> 1100MHz 1250mV stable
> 
> 1127MHz 1300mV artifacts....
> 
> It's not safe to increase the voltage past 1300mV, am I right?


I think as long as the voltage after vdroop is around 1.3V and your temps are low, you'll be OK. What's your ASIC? Stick to the Asus BIOS, and only use the PT1+ BIOS if you're under water; you should be fine as long as you can keep temps down.


----------



## Scorpion49

Quote:


> Originally Posted by *Jpmboy*
> 
> I agree, it worked in the past. Do you actually gain anything with sep physx coprocessor? I haven't since cfx 7970s.


I play a lot of Borderlands 2 and I like the extra effects. Planetside 2 will have them back sooner or later as well. I really hate that it's a proprietary tech, but what can you do, right?


----------



## mojobear

Oh man... so am I reading this right: people are still getting black screens even with the new beta drivers? So itching to purchase 3 x R9 290s... but resisting because of this one stupid issue.

What has the general consensus been? Reading through, it seems like for some it has totally fixed it, for others it decreased, with a small number still having black screens even at stock.

Thanks for the tips in advance.


----------



## the9quad

Just ordered one more, will post pics when I get back from my trip back home. Oh cold Ohio how I don't miss thee.


----------



## Raxus

Quote:


> Originally Posted by *mojobear*
> 
> Oh man... so am I reading this right: people are still getting black screens even with the new beta drivers? So itching to purchase 3 x R9 290s... but resisting because of this one stupid issue.
> 
> What has the general consensus been? Reading through, it seems like for some it has totally fixed it, for others it decreased, with a small number still having black screens even at stock.
> 
> Thanks for the tips in advance.


It seems most of the people that were getting consistent black screens RMA'd and aren't experiencing the issue at all, or only very rarely, with their replacement cards. And it seems like the 290Xs had it way more often, specifically of the Sapphire variety. That is just my personal observation.


----------



## the9quad

Quote:


> Originally Posted by *Raxus*
> 
> It seems most of the people that were getting consistent black screens RMA'd and aren't experiencing the issue at all, or only very rarely. And it seems like the 290Xs had it way more often, specifically of the Sapphire variety. That is just my personal observation.


I think your personal observations are pretty spot on!


----------



## Technewbie

Quote:


> Originally Posted by *Raxus*
> 
> It seems most of the people that were getting consistent black screens RMA'd and aren't experiencing the issue at all, or only very rarely. And it seems like the 290Xs had it way more often, specifically of the Sapphire variety. That is just my personal observation.


I would say that is pretty spot on; I have seen very few 290s with black screen issues. But I have to add that my MSI 290X had black screen issues starting just when OC'd, and later even at stock; however, the latest drivers fixed it and I have yet to experience another.


----------



## beejay

For me, I've only experienced one black screen, when I was trying to find the sweet spot for my overclock. I have been using an 1100MHz core and 1500MHz memory clock at 1.3V with +50% power limit when gaming and it's working fine (although I did experience some artifacting due to hot VRMs; I rectified this with bigger heatsink fins). I'm using PT3, on an Accelero Hybrid, PowerColor R9 290X.


----------



## Raxus

Quote:


> Originally Posted by *Technewbie*
> 
> I would say that is pretty spot on; I have seen very few 290s with black screen issues. But I have to add that my MSI 290X had black screen issues starting just when OC'd, and later even at stock; however, the latest drivers fixed it and I have yet to experience another.


I've been playing BF4 and benchmarking all night, not a single black screen.

Goodbye sapphire, hello MSI.


----------



## muhd86

I can overclock my three R9 290s in TriFire to 1000/1330; will try for more later today.

Is it just me, or does the GPU utilization in Afterburner keep jumping up and down? I've seen this many times. Is it an Afterburner bug? The latest drivers did not fix it either.

With a 4.2GHz OC on my 3930K and 1000 on the core of each GPU I got a 3DMark 11 score of 25800 points. Is this OK for a 3-GPU setup?


----------



## Jpmboy

So far I've had 2 Sapphire 290Xs... never a black screen yet (well, at least none I didn't cause) - maybe just lucky so far. Put the EK block I had on this new one after testing it with the stock cooler, flashed to Asus, and 1265/6600 is working very well. VRM temps are below 45C, GPU <40C with a 17C water loop.
http://www.3dmark.com/3dm/1715631
http://www.3dmark.com/fs/1209143

Still learning to tune this thing. Very different from 7970s or GK110.


----------



## Jpmboy

Quote:


> Originally Posted by *muhd86*
> 
> I can overclock my three R9 290s in TriFire to 1000/1330; will try for more later today.
> 
> Is it just me, or does the GPU utilization in Afterburner keep jumping up and down? I've seen this many times. Is it an Afterburner bug? The latest drivers did not fix it either.
> 
> With a 4.2GHz OC on my 3930K and 1000 on the core of each GPU I got a 3DMark 11 score of 25800 points. Is this OK for a 3-GPU setup?


Have a look down the charts. No 290(X) in the HOF because the driver is not yet validated.

http://www.3dmark.com/hall-of-fame-2/3dmark+11+3dmark+score+performance+preset/version+1.0.5/3+gpu

http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad


----------



## Raxus

Quote:


> Originally Posted by *Jpmboy*
> 
> So far I've had 2 Sapphire 290Xs... never a black screen yet (well, at least none I didn't cause) - maybe just lucky so far. Put the EK block I had on this new one after testing it with the stock cooler, flashed to Asus, and 1265/6600 is working very well. VRM temps are below 45C, GPU <40C with a 17C water loop.
> http://www.3dmark.com/3dm/1715631
> http://www.3dmark.com/fs/1209143
> 
> Still learning to tune this thing. Very different from 7970s or GK110.


Just my observations from these forums and customer reviews on Newegg etc.

The Sapphire 290X seemed to be the worst.


----------



## brazilianloser

Quote:


> Originally Posted by *Jpmboy*
> 
> have a look down the charts. No R290(x) in HOF cause the driver is not yet validated.
> 
> http://www.3dmark.com/hall-of-fame-2/3dmark+11+3dmark+score+performance+preset/version+1.0.5/3+gpu
> 
> http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad


3DMark is picking up my WHQL driver, but I do not have 3DMark 11 so I can't say anything.


----------



## Scorpion49

So I just put one card back in after that driver fiasco, wow what a difference. Everything is smooth as butter, no crashing no stuttering, its amazing. I guess CF on the 290's isn't quite up to speed yet.


----------



## Jpmboy

Quote:


> Originally Posted by *brazilianloser*
> 
> 3Dmark is picking up my whql driver, but i do not have 3dmark11 so can't say anything.


I'm still on 13.11 beta 9.2; 9.4 gave me problems. Which driver are you using?


----------



## Jpmboy

Quote:


> Originally Posted by *Raxus*
> 
> Just my observations from these forums and customer reviews on newegg etc.
> sapphire 290x seemed to be the worst.


Could it be that flashing to a different BIOS would fix it?


----------



## skupples

Quote:


> Originally Posted by *Jpmboy*
> 
> So far I've had 2 Sap 290x's... never a blkscrn yet (well, at least one I did not cause
> 
> 
> 
> 
> 
> 
> 
> ) - maybe just lucky so far. Put the Ek i had on this new one after testing it with the stock cooler, flashed to Asus and 1265/6600 working very well. VRM temps are below 45C. GPu <40C with 17C water loop.
> http://www.3dmark.com/3dm/1715631
> http://www.3dmark.com/fs/1209143
> 
> Still learning to tune this thing. Very different from 7970s or GK110.


You make me want to get a chiller!


----------



## Jpmboy

Quote:


> Originally Posted by *skupples*
> 
> You make me want to get a chiller!


In the great white north I just open the window this time of year... whereas in Florida... I'd def get an aquarium chiller, if only to save on the AC bill bringing the room down to 60F!


----------



## skupples

Quote:


> Originally Posted by *Jpmboy*
> 
> in the great white north, i just open the window this time of year... whereas Florida... I'd def get an aquarium chiller if only to save on the AC bill bringing the room down to 60F !


Now you have me confused. I thought it was you who picked up a chiller! Either way, down here a chiller couldn't be run very cool. It would be more of a luke-warmer.


----------



## Jpmboy

Yes, I did: a 1/10 HP Aqua Euro. Works great! But it's disconnected now (too many QDCs, and Koolance changed the style of them recently... argh!).


----------



## tsm106

Do you still have room in the core MHz tank? Btw, you should use the 13.11 WHQL; it's not only fast and valid, but it is also easier for FSI to read card specs, i.e. it will read clocks correctly.


----------



## Stray_Bullet

Hey guys, I'm interested in picking up an R9 290, but am wondering if it would be a waste with my current setup. I'm running FM2+ with a 760K @ 4.7GHz and wondering if this is going to be a major bottleneck. I could probably hit 5GHz, but don't really want to atm. What do you all think? Thanks


----------



## Ukkooh

Depends on what you like to play. If you will play Mantle-enabled games, then it won't be that big of a problem.


----------



## Arizonian

Quote:


> Originally Posted by *Scorpion49*
> 
> I've never been added at all


I was wondering how I passed you. Found that *POST*.

I was expecting an entry, but it was an invoice. I assumed you'd post when the card arrived.









Per OP I was looking for following:
1. GPU-Z Link or Screen shot with OCN Name
2. Manufacturer & Brand
3. Cooling - Stock, Aftermarket or 3rd Party Water

No worries, as a lot of members seem to be missing this and I've been able to catch it. You were slotted into the roster at the spot of your original submission.

A belated congrats - added


----------



## Scorpion49

Quote:


> Originally Posted by *Arizonian*
> 
> I was wondering how I passed you. Found that *POST*.
> 
> I was expecting an entry and it was an invoice. I think I thought you'd post when it arrived.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Per OP I was looking for following:
> 1. GPU-Z Link or Screen shot with OCN Name
> 2. Manufacturer & Brand
> 3. Cooling - Stock, Aftermarket or 3rd Party Water
> 
> No worries as a lot of member seem to be missing this and I've been able to catch it. You were slotted in where you were at the original submission on the roster list.
> 
> A belated congrats - added


I did post all that, because you PM'ed me about it:








Quote:


> Originally Posted by *Scorpion49*
> 
> Well heres my screenshot thingy, on stock cooling for now.
> 
> 
> 
> Anyone have a link to the flash procedures for these cards? Or is it just the same as the last gen?


----------



## zpaf

Quote:


> Originally Posted by *Jpmboy*
> 
> So far I've had 2 Sap 290x's... never a blkscrn yet (well, at least one I did not cause
> 
> 
> 
> 
> 
> 
> 
> ) - maybe just lucky so far. Put the Ek i had on this new one after testing it with the stock cooler, flashed to Asus and 1265/6600 working very well. VRM temps are below 45C. GPu <40C with 17C water loop.
> http://www.3dmark.com/3dm/1715631
> http://www.3dmark.com/fs/1209143
> 
> Still learning to tune this thing. Very different from 7970s or GK110.


What clocks was this run at? http://www.3dmark.com/fs/1209143


----------



## DampMonkey

Quote:


> Originally Posted by *zpaf*
> 
> With what clocks is this run ? http://www.3dmark.com/fs/1209143


Here's a different one:
http://i.imgur.com/8MOaWrq.jpg


----------



## hatlesschimp

Where's the HDMI 2.0 upgrade for 4K @ 60Hz?????

My Sony TV has just received the 4K update and now I'm waiting on AMD.

The update added support for:
3840x2160 (59.94/60Hz), YUV 4:2:0, 8 bit
4096x2160 (59.94/60Hz), YUV 4:2:0, 8 bit

I know it's not 4:4:4 at 12-bit or 10-bit, but my 3x VG248QE monitors are only 6-bit, so I'm happy!!!

http://www.sony.jp/bravia/info2/20131126.html
Quote:


> Thank you for your continued patronage of Sony products. Starting Wednesday, November 27, 2013, for the 4K-compatible BRAVIA LCD TV X9200A / X8500A series, we will begin an update (via network download, or via USB memory) for 4K/60p signal input compliant with the latest HDMI standard, version 2.0. Target products: KD-65X9200A / 55X9200A / 65X8500A / 55X8500A. [Features added by the update] Compliance with the newly standardized HDMI 2.0 adds support for 4K/60p signal transmission, so future 4K content at 60p, such as broadcasts or 4K delivery services, can be played back over a single HDMI cable from a compatible device. Displaying 4K-resolution video at 60 frames per second renders fast-moving content such as sports scenes more smoothly, for more realistic 4K picture quality than ever. All HDMI inputs (HDMI 1-4) will support 4K/60p signals after the update. [How to update] The update is planned via network download or USB memory; details will be posted under "body update information" on the BRAVIA LCD TV home page from around 15:00 on Wednesday, November 27, 2013. HDMI 2.0 4K/60p support for the KD-84X9000 is expected to be announced during December 2013. (4K/60p here means 4,096 x 2,160 or 3,840 x 2,160 at 59.94/60Hz, YUV 4:2:0, 8-bit; the broadcast-download update method is scheduled for later implementation.)


----------



## DeadlyDNA

Okay, I am not losing my mind, but after getting these 290s under water, I cannot believe the gains I just got from stock to water!!! I thought I was hallucinating, but I'm pretty sure I'm not.

I just got this thing up and running and will have to bring it down again tomorrow to add the 4th card. Pretty sure I am going to need another radiator as well. While this isn't much to go on, I'll throw up a couple of screenshots. Everything was run at matching settings with STOCK clocks.

3 way CF with Stock air cooler:


3way CF with waterblocks stock clocks:


3way CF with stock cooler


3wayCF with water :


Wish I had more time to fiddle tonight, but it's off to bed:thumb:


----------



## zpaf

Quote:


> Originally Posted by *DampMonkey*
> 
> heres a different one
> http://i.imgur.com/8MOaWrq.jpg


Good score, but as I see you are on an unofficial BIOS.
I want to see how far an X card can go in Fire Strike with an official BIOS.


----------



## MrWhiteRX7

Got into an argument with a guy who was telling me 4K doesn't look any better than 720p........... I wanted to slap him in the gums! UGGGGGH, that stuff looks incredible! I'm just now stepping up to 1440p!!!


----------



## DampMonkey

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Got in to an argument with a guy that was telling me 4k doesn't look any better than 720p........... I wanted to slap him in the gums! UGGGGGH that junk looks incredible! I'm just now stepping up to 1440p!!!


Pixel density can make for some amazing visuals, but if you've never experienced it, you have no idea. Tell that guy he's dumb









The 4K TVs at my local electronics store are mind-bogglingly beautiful. I seriously can't wait for them to get into my price range; it's a shame there's no real content for them right now. I have a feeling that a 4K computer monitor would just look unreal. The awesome part is that you don't have to wait for "content", you just increase the resolution of your games!


----------



## DampMonkey

Quote:


> Originally Posted by *zpaf*
> 
> Good score but as I see you are with unofficial bios.
> I want to see how far an X card can go on Firestrike with official bios.


1200/1600 is pretty rock stable. Honestly, I haven't explored the max on the Asus BIOS.


----------



## Arizonian

Quote:


> Originally Posted by *hatlesschimp*
> 
> Wheres the HDMI 2.0 upgrade for 4K @ 60Hz?????
> 
> My Sony TV has just received the 4K update and now I'm waiting on AMD.
> 
> The update added support for:
> 3840x2160 (59.94/60Hz), YUV 4:2:0, 8 bit
> 4096x2160 (59.94/60Hz), YUV 4:2:0, 8 bit
> 
> I know its not 4:4:4 at 12 Bit or 10 Bit but my 3x VG248QE Monitors are only 6 Bit so Im happy!!!
> 
> http://www.sony.jp/bravia/info2/20131126.html
> 
> 
> Spoiler: Warning: Spoiler!


That's awesome man....congrats


----------



## Scorpion49

Quote:


> Originally Posted by *DampMonkey*
> 
> Pixel density can make for some amazing visuals, but if you've never experience it you have no idea. Tell that guy he hes dumb
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The 4k TV's at my local electronics store are mind bogglingly beautiful. I seriously cant wait for things to get in my price range, it's a shame theres no real content for them right now. I have a feeling that a 4k computer monitor would just look unreal. The awesome part is that you don't have to wait for "content", you just increase the resolution of your games!


Yeah, I've used all sorts of monitors up to 1600p, and when I saw some Sony 4K sets on display in the store I was shocked at how good they looked. Obviously they play material that best takes advantage of the screen, but it was amazing (the $7000 price tag was NOT amazing).


----------



## DeadlyDNA

Adding Heaven real quick; again, this is all stock clocks on water, pulling about 800-900 watts at the wall now.


That's Eyefinity ^^^^^

1080p for folks as well


----------



## DeadlyDNA

Add me please! 



4 cards: 2 Gigabyte, 1 HIS, and 1 PowerColor (arrives tomorrow)

All on water....


----------



## hatlesschimp

Quote:


> Originally Posted by *Arizonian*
> 
> That's awesome man....congrats


Cheers bro!

Just waiting on AMD now. OMG, the wait is killing me. ARMA 3 works well at 4K and 30Hz anyway, because it's a slow game, not like COD or BF4. But 60Hz will be sweet!!!!!!!


----------



## Forceman

Quote:


> Originally Posted by *zpaf*
> 
> Good score but as I see you are with unofficial bios.
> I want to see how far an X card can go on Firestrike with official bios.


Here's mine with the regular Asus BIOS at 1200/1450:



http://www.3dmark.com/fs/1155472


----------



## Sgt Bilko

Quote:


> Originally Posted by *zpaf*
> 
> Good score but as I see you are with unofficial bios.
> I want to see how far an X card can go on Firestrike with official bios.


I'm running the normal Sapphire BIOS on air:

http://www.3dmark.com/fs/1151742

On an Intel chip this would be a good amount faster, as indicated by Forceman's result.


----------



## zpaf

Quote:


> Originally Posted by *Forceman*
> 
> Here's mine with the regular Asus BIOS at 1200/1450:
> 
> 
> 
> http://www.3dmark.com/fs/1155472


Thanks Forceman.

I think these cards power-throttle, depending on the game or benchmark, if they run above 1200MHz.
With my card (non-X) I can do 1270, but I see no gains in many games.
I had a GTX 780 with the same situation:
when it ran above 1200/1800 it would power-throttle.
That's why a lot of owners use an unofficial BIOS to bypass the TDP limit.


----------



## Fahrenheit85

Sorry for the repost but I'm just trying to find an answer here.

Do all 290Xs have unlocked voltage? I see people flashing to the Asus BIOS, and it seems like they are doing it to unlock the voltage for overclocking. Also, is coil whine a luck-of-the-draw kind of thing?


----------



## Taint3dBulge

Has there been ANY news of late as to when the custom PCBs will be out? I know it's "supposed" to be sometime in mid-December... but you would think someone would have heard an exact date by now.


----------



## hatlesschimp

I've heard nothing on the non-reference cards, but I firmly believe they'll be out within 2.5 weeks from today.

THEY WOULD BE CRAZY NOT TO HAVE THESE CARDS OUT WELL BEFORE XMAS!!


----------



## mboner1

Just grabbed the Gigabyte R9 290 and have a few quick questions while I google for answers (which is made harder by the fact that nearly everything I search for comes up with results for the 290X):

1. I always disable ULPS with my 7970 after every driver update; should I still disable ULPS with the 290?

2. I have uninstalled MSI Afterburner; can I still use it to overclock and for the fan profile?

3. Is the 290 set to uber mode by default, or do I need to change it manually?

If anyone can answer these, even with yes/no answers, that would be great. Any necessary additional info would be appreciated. Thanks.


----------



## Forceman

Quote:


> Originally Posted by *Fahrenheit85*
> 
> Sorry for the repost but I'm just trying to find an answer here.
> 
> Do all 290xs have unlocked voltage? I see people switching flashing to Asus bios and It seems like they are doing it to unlock the voltage for OC. Also is coil whine a luck of the drawl kind of thing?


Yes, all the reference cards have unlocked voltage; the Asus BIOS just lets you go a little higher (Afterburner is limited to +100mV).
Quote:


> Originally Posted by *mboner1*
> 
> Just grabbed the gigabyte r9 290 and got a quick few questions while i google for answers (which is made harder by the fact nearly everything i search for comes up with results for 290x)
> 
> 1. I always disable ULPS with my 7970 after every driver update, should i still disable ULPS with the 290??
> 
> 2. I have uninstalled msi afterburner, can i still use that to overclock and for the fan profile??
> 
> 3. Is the 290 set to uber mode by default or do i need to change it manually??
> 
> If anyone can answer these even with yes and no answers that would be great. Any necessary Additional info would be appreciated. Thanks.


1. I disabled ULPS and it seems to have eliminated the idle (monitor wake-up) black screens I sometimes got, so I'd disable it.

2. You can use Afterburner to overclock and control the fan, but it'll need to be installed. Once you set the overclock, I think you can uninstall it again, since it makes the changes at the driver level.

3. The 290 doesn't have an Uber mode; both BIOSes are the same.


----------



## Hogesyx

Quote:


> Originally Posted by *mboner1*
> 
> Just grabbed the gigabyte r9 290 and got a quick few questions while i google for answers (which is made harder by the fact nearly everything i search for comes up with results for 290x)
> 
> 1. I always disable ULPS with my 7970 after every driver update, should i still disable ULPS with the 290??
> 
> 2. I have uninstalled msi afterburner, can i still use that to overclock and for the fan profile??
> 
> 3. Is the 290 set to uber mode by default or do i need to change it manually??
> 
> If anyone can answer these even with yes and no answers that would be great. Any necessary Additional info would be appreciated. Thanks.


1) I left ULPS alone on both my Sapphire 290 and Sapphire 290X.
2) I am using a custom profile in MSI Afterburner; you need to have it running minimized for the fan profile to work, else it will default back to the BIOS profile.
3) My 290 has the same BIOS on both switch positions; only the 290X seems to have Uber and Quiet modes. Uber is the position towards the blower fan, Quiet is the one towards the DVI ports.


----------



## smokedawg

Regarding the Beta 9.4 driver:
I uninstalled the official WHQL using DDU (which just received an AMD-related update, as I saw looking up this link) in safe mode, rebooted, installed 9.4, and got the performance tab with Overdrive. I'm OCing with GPU Tweak though, and no problems so far. Performance (FS & FSE) seems slightly higher than the WHQL for my setup.


----------



## mboner1

+ rep to both of you.

No uber mode, there you go, lol; I had just changed it to what would be uber mode on the 290X. Ah well. I have uninstalled MSI Afterburner and reinstalled it; any link to what people are getting out of these for an ordinary overclock?

Does the 290 still need the CrossFire bridge? Because I'm about to sell my 7970 and am not sure if I need to keep the bridge, as the Gigabyte R9 290 didn't come with one. Cheers.

Oh, and is GPU-Z still a no-go zone? I heard it causes 99% GPU usage at the desktop and experienced that with my 7970.


----------



## Forceman

No crossfire bridge needed. The new GPU-Z doesn't seem to cause the 100% usage bug anymore. And with the stock cooler, probably 1100 for the overclock, with about 1200 on water depending on how much you want to push the voltage.


----------



## Imprezzion

You can max out the voltage at 1.408v perfectly fine with the stock cooler if you can stand 100% fan noise









I have been for a few days, and it never got above 75C core and 65C on the VRMs









I did buy an Accelero Hybrid for it now though, as I for one can't stand the noise, haha.


----------



## mboner1

Cheers guys, super helpful!!! Time to see what sort of overclock i can get


----------



## Hogesyx

Quote:


> Originally Posted by *Forceman*
> 
> No crossfire bridge needed. The new GPU-Z doesn't seem to cause the 100% usage bug anymore. And with the stock cooler, probably 1100 for the overclock, with about 1200 on water depending on how much you want to push the voltage.


I managed 1140 on my 290 with coil whine; after returning it and replacing it with a 290X, I could only get 1080 on the new card. Both on stock voltage, and the funny thing is that this 290X actually runs much cooler at the same fan speed (2.8K RPM): VRM1 is 20C cooler, VRM2 is 10C cooler, and the GPU under load is 7-10C cooler. Best of all, both cards have the same ASIC score, 70.7.


----------



## beejay

Quote:


> Originally Posted by *Imprezzion*
> 
> You can maxout voltage at 1.408v perfectly fine with stock cooler if you can stand 100% fan noise
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have been for a few days and it never got above 75c core and 65c VRM's
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I did buy a Accelero Hybrid for it now tho as I for one can't stand the noise haha.


Wow, it must be cold in your place. By the way, the Accelero Hybrid does not cool the VRMs as well as the stock cooler (small heatsinks and such). I had my VRMs shooting up into the 80s when gaming (BF4, Crysis 3, Tomb Raider) on Ultra. I do have it overclocked.

But the core temp is really sweet; it never went over 75 at 1100 core, 1.3v.


----------



## Imprezzion

I know, but I don't plan on using the Hybrid's stock heatsinks









I'm going to go all ghetto-mod on the thing and chop up the stock cooler, just as long as it takes me to separate the main core heatsink from the VRM/VRAM baseplate.
Then I'll cut out the hole around the core so I can mount the Hybrid through it, and use the stock cooler's baseplate.


----------



## King4x4

Gonna be joining this club soon.

Just ordered 4x290x for my Hydra 2 Build (It was either that or 3x780ti)


----------



## Hogesyx

Quote:


> Originally Posted by *King4x4*
> 
> Gonna be joining this club soon.
> 
> Just ordered 4x290x for my Hydra 2 Build (It was either that or 3x780ti)


Don't look at the green grass patch, come to the RED CAMP!


----------



## beejay

Quote:


> Originally Posted by *Imprezzion*
> 
> I know, but I don't plan on using the hybrid stock heatsinks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm going to go all ghetto mod on the thing and chop up the stock cooler just as long as it takes me to seperate the main core heatsink from the VRM/VRAM baseplate.
> Then i'llcut out the hole around the core so i can mount the hybrid through it, and use the stock coolers baseplate


Oh my!!! We did the same thing!!! I was actually thinking of cutting a hole in the stock heatsink, then putting the Accelero Hybrid block in it (plus shortening some fins to accommodate the tubes). But I'm afraid my Dremel won't handle the task. I'll have it cut on a proper machine, then weld the VRM plates back. That way, even the VRAM will get nice cooling, as well as the small VRMs at the upper left of the GPU.

Although, I officially have no warranty now. Just a thought to ponder.

Plus one rep for going ghetto!


----------



## Hogesyx

Quote:


> Originally Posted by *beejay*
> 
> Oh My!!! We did the same thing!!! I was actually thinking of cutting a hole on the stock heatsink, then putting the accelero hybrid block in it (plus lessen the height of some fins to accommodate the tubes). But I'm afraid my dremel won't handle the task. I'll have it cut on a proper machine then weld the vrm plates back. That way, even the vrams will get nice cooling. As well as the small vrms at the upper left of the gpu.
> 
> Although, I officially have no warranty now. Just a thought to ponder.
> 
> Plus one rep for going ghetto!


You mean using the reference base plate? Is it possible to separate the original heatsink from the base plate?


----------



## Imprezzion

Yeah, it's possible, but you have to either cut it loose somehow or desolder it.

I used an acetylene torch to do the same on my HD 7970 with an Accelero Xtreme III 7970.
It was either that or buying a Swiftech HS-7970 and cutting the fins to length.

Just a shame I blew the card when benching extreme voltages (1.38v load core and 1.70v VRAM); it blew the VRM on the upper left somehow mid-bench, haha.
It makes for a nice piece of wallpaper, along with the 8 dead motherboards I collected over time from buddies, local people, and of course myself. (RIP GA-X58 UD7 Rev 1.0)


----------



## Hogesyx

Quote:


> Originally Posted by *Imprezzion*
> 
> Yeah it's possible but you have to either cut it loose somehow or desolder it.
> 
> I used a acethylene torch to do the same on my HD7970 w/ Accelero Xtreme III 7970.
> It was either that or buying a Swiftech HS-7970 and cutting the fins to length.
> 
> Just a shame I blew the card when benching extreme voltages (1.38v load core and 1.70v VRAM) it blew the VRM on the upper left somehow mid-bench run haha.
> It makes for a nice piece of wallpaper along with the 8 dead motherboards I collected over time from buddy's and local people and ofcourse myself. (RIP GA-X58 UD7 Rev 1.0)


If you manage to retain the stock base plate and water only the GPU, it will be a sick looking RED MOD!


----------



## beejay

Quote:


> Originally Posted by *Hogesyx*
> 
> You mean using the reference base plate? Is it possible to separate the original heatsink from the base plate?


Yes: remove the shroud, and you'll see that the base plate touching the VRMs can be detached from the big copper plate with fins by cutting the thin metal bits on each side. Dremel these off using reinforced cut-off discs and screw the VRM base plate back onto the board; as Sgt Bilko said, it also adds support to the board. After doing this, I used some thermal pads and tape to attach more heatsinks on top of the plate, because it's actually just a large plate, but it has better contact and area.

Also, I reused the stock gray pads that were originally on the VRMs.


----------



## mboner1

Well, I just set the power limit to 50%, the core clock to 1100, and the memory to 1400, and ran Valley; it made it through fine. Is that what everyone is using to test for stability and benchmark? In Valley it said my temp was 1630001 degrees or something, lol. But I really only hit 70 degrees with 70% fan speed, and it was no louder than my 7970. I don't know what people are crying about with these temps and noise, honestly.

This look normal?...


----------



## beejay

Quote:


> Originally Posted by *Hogesyx*
> 
> If you manage to retain the stock base plate and water only the GPU, it will be a sick looking RED MOD!


Haha, but yeah, no more warranty. I'm now under constant fear of my card crapping out.


----------



## Hogesyx

Quote:


> Originally Posted by *mboner1*
> 
> Well i just set the power limit to 50% and core clock to 1100, and memory to 1400 and ran valley, made it through fine, is that what everyone is using to test for stability and benchmark? In valley it said my temp was 1630001 degress or something lol. But i really only hit 70 degrees with 70% fan speed and it was no louder than my 7970. Don't know what people are crying about with these temps and noise honestly.
> 
> This look normal?...


Temps look good; what is your ambient temperature? You should test your OC in Heaven 4.0 as well. I feel Heaven is much more stressful than Valley; a max clock in Valley often fails in Heaven.


----------



## Heinz68

Quote:


> Originally Posted by *hatlesschimp*
> 
> Wheres the HDMI 2.0 upgrade for 4K @ 60Hz?????
> 
> My Sony TV has just received the 4K update and now I'm waiting on AMD.
> 
> The update added support for:
> 3840x2160 (59.94/60Hz), YUV 4:2:0, 8 bit
> 4096x2160 (59.94/60Hz), YUV 4:2:0, 8 bit
> 
> I know its not 4:4:4 at 12 Bit or 10 Bit but my 3x VG248QE Monitors are only 6 Bit so Im happy!!!


HDMI 2.0 was only finalized on September 4, 2013, so none of the current graphics cards support it. The AMD cards support 4K UHD at 60Hz through DisplayPort 1.2; the same goes for Nvidia cards, I believe.
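Some back-of-the-envelope math (my own rough numbers, not official spec figures, and ignoring blanking intervals and link encoding overhead) shows why that Sony update uses 4:2:0 for 4K60 while full 4:4:4 needs HDMI 2.0 or DisplayPort 1.2:

```python
# Raw uncompressed video bandwidth, ignoring blanking and 8b/10b overhead.
def bandwidth_gbps(width, height, fps, bits_per_pixel):
    """Raw pixel data rate in Gbit/s."""
    return width * height * fps * bits_per_pixel / 1e9

# 8-bit 4:4:4 carries 24 bits per pixel; 4:2:0 subsampling averages 12.
uhd_444 = bandwidth_gbps(3840, 2160, 60, 24)   # ~11.9 Gbit/s
uhd_420 = bandwidth_gbps(3840, 2160, 60, 12)   # ~6.0 Gbit/s

# HDMI 1.4-class hardware has roughly 8.16 Gbit/s of usable video bandwidth,
# so 4K60 only squeezes through with 4:2:0 chroma subsampling.
HDMI_1_4_GBPS = 8.16
print(uhd_444 > HDMI_1_4_GBPS, uhd_420 < HDMI_1_4_GBPS)  # True True
```

So the TV firmware isn't magic: it just accepts a half-chroma-bandwidth signal that existing HDMI silicon can already carry.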


----------



## mboner1

Quote:


> Originally Posted by *Hogesyx*
> 
> Temp looks good, what is your ambient temperature? You should test your OC on Heaven 4.0 as well, I feel that Heaven is much more stressful than Valley, max clock on Valley often fails in Heaven.


Will give that a run now. It was 32 degrees Celsius here today, but I have the aircon on in the living room, so that's probably helping things out a bit... but I'm always going to have the aircon on if it's hot, so yeah.

I just saw the post from the OP saying 600 watts at the wall for a single 290X at full load?? I was planning on grabbing a 2nd 290 for CrossFire, not overclocking, with my current 850W PSU; is that fraught with danger, and has anyone else done it??

Well, I just did 1150 and 1450 and it's still only 72 degrees @ 72%... should I keep going? lol.


----------



## Forceman

Quote:


> Originally Posted by *mboner1*
> 
> I just saw the post from the op saying 600watts for a single 290x with full load at the wall?? I was planning on grabbing a 2nd 290 for crossfire and not overclocking with my current 850w psu, is that frought with danger and has anyone else done it??


I don't know what kind of voltage they are running to draw 600W for a single card. I've run mine as high as the standard Asus BIOS allows and I've never seen more than 400W at the wall.


----------



## Hogesyx

Quote:


> Originally Posted by *mboner1*
> 
> Will give that a run now. It was 32 degrees celcius here today but i have the aircon on in the living room so that's probz helping things out a bit.. but i'm always going to have the aircon on if it's hot so yeaha.
> 
> I just saw the post from the op saying 600watts for a single 290x with full load at the wall?? I was planning on grabbing a 2nd 290 for crossfire and not overclocking with my current 850w psu, is that frought with danger and has anyone else done it??


There is a chart for it






If the link doesn't work, go to 8 minutes 33 seconds.

Personally, I am running my entire system on a 550W 80+ Platinum PSU (90%+ efficiency).
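For anyone comparing wall readings to PSU ratings, a quick sketch of the conversion (the 90% efficiency figure is an assumption for illustration; real efficiency varies with load):

```python
# PSU ratings are DC output; wall meters read AC input, which is higher
# by the inverse of the PSU's efficiency.

def wall_draw_watts(dc_load, efficiency):
    """Power pulled from the wall to deliver dc_load watts to the components."""
    return dc_load / efficiency

def dc_load_watts(wall_draw, efficiency):
    """DC power the components actually receive for a given wall reading."""
    return wall_draw * efficiency

# A 550W PSU fully loaded at ~90% efficiency (80+ Platinum territory)
print(round(wall_draw_watts(550, 0.90)))  # ~611 W at the wall
# And an 800-900W wall reading implies roughly 720-810W of real DC load
print(round(dc_load_watts(900, 0.90)))    # ~810 W
```

That's why a "900W at the wall" CrossFire reading doesn't necessarily mean the PSU itself is delivering 900W.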


----------



## mboner1

Quote:


> Originally Posted by *Hogesyx*
> 
> There is a chart for it
> 
> 
> 
> 
> 
> 
> If link doesnt work, go to 8 mins and 33s.
> 
> Personally I am running my entire system on a 550W 80+ platinum psu(90+).


Sweet, no issues?

I think I will go with a 2nd one then, and no overclock. Although if this card can go much higher, I might have to rethink everything.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Hogesyx*
> 
> If you manage to retain the stock base plate and water only the GPU, it will be a sick looking RED MOD!


I am seriously considering this approach if Water-Cooling is too far off for me.
Quote:


> Originally Posted by *beejay*
> 
> Haha, but yeah, no more warranty. I'm now under constant fear of my card crapping out.


oh pish posh........silly details









Seriously though, it's kind of mixing the best of both worlds there:
VRM cooling is a problem on any aftermarket air cooler, and core cooling (combined with noise) is only a problem on the stock cooler.


----------



## Hogesyx

Quote:


> Originally Posted by *mboner1*
> 
> Sweet, no issues?
> 
> I think i will go with a 2nd then and no overclock. Although if this card can go much higher i might have to rethink everything.


Well, my 550 seems to be hitting its limit when I OC both my CPU and GPU together, so I am running them both at stock voltage.


----------



## JMCB

I would like to join:



2 Gigabyte R9 290X video cards in CrossFire. Loving it much more than 2x 7970s....


----------



## mboner1

Quote:


> Originally Posted by *Hogesyx*
> 
> Well my 550 seems to be hitting the limit when I oc both my CPU and GPU together. So I am running them both at stock volt


How are you establishing that? I'm running my CPU at 4.5GHz and don't want to lower that, but I would think 2 290s at stock should be fine on 850W.

They recommend an 800W PSU for stock here, with the 2 GPUs reaching 493W at full load; that's just the 2 GPUs, not the whole system.

http://www.guru3d.com/articles_pages/radeon_r9_290_crossfire_review_benchmarks,4.html
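The headroom math works out something like this (the 493W figure is the guru3d measurement quoted above; the CPU/rest-of-system estimate is my own assumption for illustration, not a measured value):

```python
# Simple PSU headroom check for stock 290 CrossFire on an 850W unit.

def psu_headroom(psu_watts, *loads):
    """Remaining capacity after summing the component loads."""
    return psu_watts - sum(loads)

gpus_crossfire = 493   # two R9 290s at stock, full load (guru3d measurement)
cpu_and_rest   = 200   # assumed: overclocked CPU, board, drives, fans

headroom = psu_headroom(850, gpus_crossfire, cpu_and_rest)
print(headroom)  # 157W spare, so stock CrossFire on 850W looks workable
```

Overclocking with raised voltage would eat into that margin quickly, which matches the "second card at stock only" plan.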


----------



## Hogesyx

Quote:


> Originally Posted by *mboner1*
> 
> How are you establishing that? I'm running my cpu at 4.5ghz. Don't want to lower that but I would think 2 290's at stock should be fine on 850w.
> 
> They recommend a 800w psu for stock here, with the 2 gpu's reaching 493w at full load, just the 2 gpu's not the whole system.
> 
> http://www.guru3d.com/articles_pages/radeon_r9_290_crossfire_review_benchmarks,4.html


I can reach max clock if I clock them one at a time, but OCing both together, my stability goes doodoo.


----------



## Valice

Quick question, as I'm not willing to wade through all 800+ pages:

Is there a specific R9 290 card that has a higher chance of unlocking the additional shaders to get an R9 290X? I'm planning on getting two R9 290s with the intention of running them at 290X specs (watercooled). Thanks!


----------



## Durquavian

Quote:


> Originally Posted by *Valice*
> 
> quick question, as I'm not willing to wade through all 800+ pages:
> 
> Is there a specific R290 card that has a higher chance of success towards unlocking the additional shaders to get a R290X ? I'm planning on getting two R290's with the intention to get them running under 290X specs (watercooled). thanks!


PowerColor, it seems, but numerous other brands have been unlocked as well.


----------



## Jpmboy

Quote:


> Originally Posted by *tsm106*
> 
> Do you still have room in the core mhz tank? Btw, you should use the 13.11 whql, its not only fast and valid, but it is easier for FSI to read card specs, ie. it will read clocks correctly.


thanks tsm. will try it today. A few conf calls this morning then clear.


----------



## PwrElec

Are there any stickers on the Gigabyte 290X's screws?


----------



## Valice

Quote:


> Originally Posted by *Raxus*
> 
> It doesnt really matter unless you want to watercool it and not void your warranty.


and which manufacturer would be preferable to maintain warranty when planning to watercool the card?


----------



## Jpmboy

Quote:


> Originally Posted by *Valice*
> 
> and which manufacturer would be preferable to maintain warranty when planning to watercool the card?


Unless they have a seal or sticker on any screws, or a wrap from the PCB to the air-cooler shell, what's to know?

But I hear MSI, and I know EVGA (for GK110, anyway).


----------



## rdr09

Quote:


> Originally Posted by *Valice*
> 
> and which manufacturer would be preferable to maintain warranty when planning to watercool the card?


Stay away from Sapphire for sure. MSI, the brand I use, allows it; it has been posted here a few pages back.


----------



## Jpmboy

Quote:


> Originally Posted by *tsm106*
> 
> Do you still have room in the core mhz tank? Btw, you should use the 13.11 whql, its not only fast and valid, but it is easier for FSI to read card specs, ie. it will read clocks correctly.


this one? amd_catalyst_13.11_r9_290_whql (247.026Mb file)


----------



## Ukkooh

Quote:


> Originally Posted by *Ukkooh*
> 
> After having it for a week got my first blackscreen on my 290x today. I just emailed the shop I bought it from if it is enough to RMA it or not since it doesn't OC too well. (1150mhz core with afterburner voltage +100mV)


Update on the situation: The shop mailed me back and said that I should absolutely RMA it. They told me that it definitely won't get any better in a way that implies that it will get worse over time. It sounds like they know something that we don't (specifically that black screens are hardware faults) and I definitely recommend everyone who has experienced black screens at stock to RMA their cards.


----------



## sugarhell

Quote:


> Originally Posted by *Jpmboy*
> 
> this one? amd_catalyst_13.11_r9_290_whql (247.026Mb file)


Yeah ^^


----------



## maynard14

Hi, please add me. I'm a new owner of an XFX R9 290 video card.

Here's my rig with the card:



And here is my 3DMark11 score:

http://www.3dmark.com/3dm11/7559701

Do you guys think my 3DMark11 score is normal for a stock R9 290?

Thank you. I'm so happy with this card; from a 7970 to an R9 290, and I'm lucky that my card is unlockable to an R9 290X!


----------



## Banedox

Quote:


> Originally Posted by *maynard14*
> 
> Hi pls add me,. newly owner of a xfx r9 290 video card
> 
> heres my rig with the card :
> 
> 
> 
> and here is my 3dmark11 score
> 
> http://www.3dmark.com/3dm11/7559701
> 
> do you think guys my score on 3dmark11 is normal for stock r9 290 ?
> 
> thank you and im so happy with this card,. from 7970 to r9 290 and im am lucky that my card is unlockable to r9 290x!


Grats on the unlock; I got an unlocked XFX as well. Now register your card and get Battlefield 4 free! Now waiting on an EK waterblock.


----------



## Connolly

I've just got a VTX3D R9 290 that's been flashed to a 290x, can someone tell me whether this is a decent Valley score for the GPU with a stock cooler & settings and does everything look right in GPUz?





The card is also supposed to be voltage unlocked; does that mean I should be able to adjust the core voltage in MSI Afterburner? At the moment that slider is locked for me. And is the ASIC score in GPU-Z relevant to anything at all? Mine scored 70.6%.
Obviously I have no idea what I'm doing, but I've downloaded and installed all of the relevant software from the OP in this thread. If anyone wants to tell me what sort of OC levels I should be aiming for in MSI Afterburner, it'd be much appreciated, as would any advice in general for getting the most out of my card.

Thanks in advance,

Matt.


----------



## OverSightX

About to pull the trigger on a single card for now, and eventually a second. What does everyone recommend: a 290 that possibly unlocks plus a block, or a 290X without a block?

My Note2


----------



## mboner1

Quote:


> Originally Posted by *maynard14*
> 
> Hi pls add me,. newly owner of a xfx r9 290 video card
> 
> heres my rig with the card :
> 
> 
> 
> and here is my 3dmark11 score
> 
> http://www.3dmark.com/3dm11/7559701
> 
> do you think guys my score on 3dmark11 is normal for stock r9 290 ?
> 
> thank you and im so happy with this card,. from 7970 to r9 290 and im am lucky that my card is unlockable to r9 290x!


I just picked mine up today as well and ran 3DMark to compare to yours. Mine is not an unlocked card. What presets did you use? Have you overclocked at all? I think I got a good score? I'm not much for benchmarking, but I like to know how I did, lol.

http://www.3dmark.com/3dm11/7559918


----------



## rdr09

Quote:


> Originally Posted by *OverSightX*
> 
> About to pukl the trigger on a single card for now and eventually a second.What does everyone recommend: get a 290 that possibly unlocks and a block OR a 290x without a block.
> 
> My Note2


If you are into benching, get the 290X; if it's just for games, the 290. Titan-like performance . . .



http://www.3dmark.com/3dm11/7556915

edit: Oh yeah, you need two. Those 7970s are faster. Much faster.


----------



## X-oiL

For some reason my cards look like this while playing BF4... does anyone know the reason?


----------



## mboner1

So after a bit of digging, it seems my 16217 GPU score in 3DMark 11 is pretty good, yeah?


----------



## mojobear

Quote:


> Originally Posted by *Raxus*
> 
> seems most of the people that were getting consistent black screens RMAd and arent experiencing the issue at all or very very little with their replacement cards. And it seems like the 290xs had it way more often, specifically of the sapphire variety. That is just my personal observations.


Thanks very much for the responses; I really appreciate the perspective. The problem with checking OCN only once a day is that I'm so late with my responses, and we are already 9 pages past the original post, haha.

Next time there is a deal, 3 x R9 290s are going in my cart along with EK or XSPC waterblocks


----------



## Raxus

Quote:


> Originally Posted by *Jpmboy*
> 
> Unless they have a seal or sticker on any screws or wrapped from the PCB to aircooler shell, what's to know?
> 
> but I hear MSI, and I know EVGA (for gk110 anyway).


Don't go by whether or not the card has a warranty-void sticker on it. MSI has stickers on their cards, but removing them doesn't void the warranty; Sapphire has no stickers but voids the warranty when you remove the cooler.

If you're concerned, search around, or better yet, contact the company yourself and ask.


----------



## OverSightX

Quote:


> Originally Posted by *rdr09*
> 
> if you are into benching, then get the 290X. just for games, then the 290. Titan like performance . . .
> 
> 
> 
> http://www.3dmark.com/3dm11/7556915
> 
> edit: oh, yah, you need 2. those 7970s are faster. much faster.


Mostly for gaming. I'm starting to think benches are useless to me; it's nice to compare, but I've noticed that many cards act very differently when gaming compared to benches. I'm planning to CF whichever I get within the next 2 months, so I hope to gain much more over my current 7970s. The biggest question for me is whether the noise is worth the performance of the 290X. I'm used to a pretty quiet rig.

Thanks for the input.


----------



## rdr09

Quote:


> Originally Posted by *OverSightX*
> 
> Mostly for gaming. I'm starting to think benches are useless to me. It's nice to compare but I've noticed that many cards act very differently when gaming compared to benches. I'm planning to CF whichever I get within the next 2 months so I hope to gain much more over my current 7970's. Biggest question for me is if the noise is worth the the performance of the 290X. Im used to a pretty quiet rig.
> 
> Thanks for the input.


By that time the non-reference cards will be out, and so will Maxwell, I think.

or

http://www.ocaholic.ch/modules/news/article.php?storyid=8466


----------



## Arizonian

Quote:


> Originally Posted by *JMCB*
> 
> I would like to join:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 2 Gigabyte R9 290x Video cards, in Crossfire. Loving it much more than 2x 7970s....


Congrats - added









Quote:


> Originally Posted by *maynard14*
> 
> Hi pls add me,. newly owner of a xfx r9 290 video card
> 
> heres my rig with the card :
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> and here is my 3dmark11 score
> 
> http://www.3dmark.com/3dm11/7559701
> 
> do you think guys my score on 3dmark11 is normal for stock r9 290 ?
> 
> thank you and im so happy with this card,. from 7970 to r9 290 and im am lucky that my card is unlockable to r9 290x!


Congrats - added









Quote:


> Originally Posted by *Connolly*
> 
> I've just got a VTX3D R9 290 that's been flashed to a 290x, can someone tell me whether this is a decent Valley score for the GPU with a stock cooler & settings and does everything look right in GPUz?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The card is also supposed to be voltage unlocked, does that mean I should be able to adjust the core voltage in MSI Afterburner? At the moment that slide bar is locked for me. And is the ASIC score in GPUz relevant to anything at all? Mine scored 70.6%.
> Obviously I have no idea what I'm doing, but I've downloaded all of the relevant software from the OP in this thread and have it all installed, if anyone wants to tell me what sort of OC levels I should be aiming for in MSI afterburner it'd be much appreciated, or any advice in general for getting the most out of my card.
> 
> Thanks in advance,
> 
> Matt.


Congrats - added


----------



## Ghostpilot

With my previous GTX 260 and 7970 in this system I was able to run 3DMark without any issues. Since installing the 290 I've not been able to run 3DMark. I thought that maybe it was a driver issue or perhaps something going wrong with the 3DMark install, but I've since updated my drivers and reinstalled 3DMark, with the same issues. I am also not able to pull up VRM temps in anything other than HWiNFO64, and the system stutters whenever it takes a sensor reading. An acquaintance has mentioned that this suggests hardware issues, and I'm strongly considering sending it back while still inside Newegg's warranty window. Am I off base, or should I begin the RMA process?


----------



## Kriant

Guys, how do I find out which memory brand my R9 290 has?


----------



## Blackroush

Quote:


> Originally Posted by *OverSightX*
> 
> Mostly for gaming. I'm starting to think benches are useless to me. It's nice to compare but I've noticed that many cards act very differently when gaming compared to benches. I'm planning to CF whichever I get within the next 2 months so I hope to gain much more over my current 7970's. Biggest question for me is if the noise is worth the the performance of the 290X. Im used to a pretty quiet rig.
> 
> Thanks for the input.


Try this; it will be dead silent. http://www.overclock.net/t/1439731/bored-of-waiting-for-none-ref-290x-check-this-simple-cheap-fix/80#post_21203049


----------



## Raxus

Quote:


> Originally Posted by *Kriant*
> 
> Guys, how to find out which memory brand does your R9 290 has ?


A program called MemoryInfo.

A Google search should pop it up.


----------



## rdr09

Quote:


> Originally Posted by *Raxus*
> 
> Program called memoryinfo
> 
> google search should pop it up


It's included in the OP; just scroll down.


----------



## OverSightX

Quote:


> Originally Posted by *Blackroush*
> 
> Try This, that will be dead silent. http://www.overclock.net/t/1439731/bored-of-waiting-for-none-ref-290x-check-this-simple-cheap-fix/80#post_21203049


Thanks for that, but I won't be going that route. Thinking of just getting the 290 with a block; I just can't see a big enough difference in performance between the two, unless I'm missing something after looking at all the reviews I can find.


----------



## Kriant

Quote:


> Originally Posted by *Raxus*
> 
> Program called memoryinfo
> 
> google search should pop it up


thx


----------



## OJSimps

Hey guys, I have a bit of a dilemma here. I've got a single Sapphire R9 290 which I just can't get to work. It's a new one, as I had to RMA the first 290 I got for the same reasons. I did a clean sweep of any Nvidia drivers before I installed the 290 (I was using an EVGA GTX 570 before). After booting off the motherboard graphics, installing the latest beta drivers for the 290, and rebooting, my computer hard freezes and needs to be shut down. This happens consistently anywhere from 2-10 minutes after booting up my PC. Do I have a power issue? Or what might be at fault? I'm just lost at this point, as I feel like I've pulled out all the stops.


----------



## kcuestag

For those asking about Warranty when removing the cooler, I RMA'd a Sapphire R9 290X and got a brand new replacement, even though mine had a waterblock installed on it, they didn't notice or mind, not sure which.


----------



## Arizonian

Quote:


> Originally Posted by *OJSimps*
> 
> Hey guys, have a bit of a dilemma here. So I've got a single Sapphire R9 290 which I just can't get to work. It's a new one as I had to RMA the first 290 I got for the same reasons. I did a clean sweep of any Nvidia drivers before I installed the 290, I was using a EVGA GTX 570 before. After booting off of Mobo graphics and installing the latest beta drivers for the 290 + rebooting, my computer will hard freeze and need to shut down. This constantly is happening anywhere from 2-10 minutes after booting up my PC. Do I have a power issue? Or what might be at fault? I'm just lost at this point as I feel like i've pulled out all the stops.


Hi OJSimps - welcome to OCN with your first post.









Before members can assess how to help you, you'll need to provide information on your rig. Knowing your PSU size, motherboard, etc., can help narrow things down.

I strongly suggest you take the time to fill out the form so you don't have to keep reposting your specs every few pages; members won't look back through many posts, the information will get buried, and they'll have to re-ask.

*"How to put your Rig in your Sig"*

Hope someone has had this issue and can guide you out of it or figure out your problem / solution.


----------



## Ukkooh

So how do I get overdrive back after updating to 13.11v9.4 betas? I already tried wiping with the newest DDU version and then reinstalling but it didn't help.


----------



## Arizonian

Quote:


> Originally Posted by *Ukkooh*
> 
> So how do I get overdrive back after updating to 13.11v9.4 betas? I already tried wiping with the newest DDU version and then reinstalling but it didn't help.


For the moment I honestly think it's been disabled as a temp fix to 'black screen' issue. I don't know of anyone that re-enabled it using the 13.11 beta 4 driver. We've not heard official news regarding why this was disabled and / or what AMD thinks was / is causing the issue.


----------



## Grimwarr

Quote:


> Originally Posted by *kcuestag*
> 
> For those asking about Warranty when removing the cooler, I RMA'd a Sapphire R9 290X and got a brand new replacement, even though mine had a waterblock installed on it, they didn't notice or mind, not sure which.


It also appears that MSI and XFX allow aftermarket cooling solutions to be used on their GPUs (North America only for XFX cards).

link: http://www.overclock.net/t/1444675/watercooling-msi-cards-edit-directly-from-msi-from-msi-forums


----------



## mboner1

Quote:


> Originally Posted by *Arizonian*
> 
> For the moment I honestly think it's been disabled as a temp fix to 'black screen' issue. I don't know of anyone that re-enabled it using the 13.11 beta 4 driver. We've not heard official news regarding why this was disabled and / or what AMD thinks was / is causing the issue.


Can i get added as well? Using a stock cooler.


----------



## Jpmboy

Quote:


> Originally Posted by *Arizonian*
> 
> For the moment I honestly think it's been disabled as a temp fix to 'black screen' issue. I don't know of anyone that re-enabled it using the 13.11 beta 4 driver. We've not heard official news regarding why this was disabled and / or what AMD thinks was / is causing the issue.


Yes - it is disabled in the new beta driver. I had to scrub it from the system and drop back to 9.2... then on the advice of rdr09 and tsm, dropped back to the last whql driver and all is good with the world again!


----------



## RYCRAI

Sapphire just released an updated BIOS for its 290 card:

http://www.xbitlabs.com/news/graphics/display/20131125220348_Sapphire_Releases_Performance_Boost_for_AMD_Radeon_R9_290_Graphics_Cards.html

Question:

Should you download and flash it if you're already using AMD's overclock/fan-speed utility?

Also, as other users have pointed out, on the 9.4 beta I am unable to OC the memory on my Sapphire 290.


----------



## Ghostpilot

Quote:


> Originally Posted by *Ghostpilot*
> 
> With my previous GTX 260 and 7970 on this system I was able to run 3DMark without any issues. Since installing the 290 I've not been able to run 3DMark. I thought that maybe it was a driver issue or perhaps something going wrong with the 3DMark install, but I've since updated my drivers and reinstalled 3DMark to the same issues. I am also not able to pull up VRM temps in anything other than HWiNFO64 and the system stutters whenever it takes a sensor reading. An acquaintance has mentioned that this suggest hardware issues and I'm strongly considering sending it back while still under Newegg's warranty window. Am I off-base or should I begin the RMA process?


Quoting since it seems to have gotten lost in the discussion. Also, I've just tried benching in Unigine Heaven and it threw back a black screen at me. So yeah, an RMA'ing I go.


----------



## OJSimps

Quote:


> Originally Posted by *Arizonian*
> 
> Hi OJSimps - welcome to OCN with your first post.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well before members can access how to help you, you'll need to provide information on your rig. Knowing what your PSU size is, motherboard, etc can help narrow down things sometimes.
> 
> I strongly suggest you take the time to fill out the form so you don't have to keep doing it every so many pages or so because members won't look back too many posts and it will get buried and have to re-ask you this question.
> 
> *"How to put your Rig in your Sig"*
> 
> Hope someone has had this issue and can guide you out of it or figure out your problem / solution.


I went ahead and put my rig in my signature. Sorry!


----------



## Ukkooh

Quote:


> Originally Posted by *OJSimps*
> 
> Hey guys, have a bit of a dilemma here. So I've got a single Sapphire R9 290 which I just can't get to work. It's a new one as I had to RMA the first 290 I got for the same reasons. I did a clean sweep of any Nvidia drivers before I installed the 290, I was using a EVGA GTX 570 before. After booting off of Mobo graphics and installing the latest beta drivers for the 290 + rebooting, my computer will hard freeze and need to shut down. This constantly is happening anywhere from 2-10 minutes after booting up my PC. Do I have a power issue? Or what might be at fault? I'm just lost at this point as I feel like i've pulled out all the stops.


What exactly did you use to "clean sweep" the old nvidia drivers? To me it sounds like it might have messed up something in your windows installation. That or it is a case of black screen and you'll have to RMA your card again.

Ps. Welcome to OCN!


----------



## Arizonian

Quote:


> Originally Posted by *mboner1*
> 
> Can i get added as well? Using a stock cooler.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *OJSimps*
> 
> I went ahead and put my rig in my signature. Sorry!


No worries, all good.









As Ukkooh said, did you uninstall the previous drivers? In the OP there is an Nvidia/AMD driver uninstaller; it works for both. Just choose the Nvidia uninstall when prompted. It worked great for me in clearing up issues.


----------



## mboner1

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> No worries, all good.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As Ukkooh said did you uninstall previous drivers? On the OP is a Nvidia / AMD driver uninstaller, it works for both. Just choose Nvidia uninstall when prompted. It worked great for me to clear up issues.


Cheers dude









How come you're switching to the 780 Ti, Arizonian? Is that some kind of sick joke?


----------



## Ukkooh

It is not the wisest move to switch to a 780 Ti just a few weeks before we get the first Mantle results in.


----------



## jerrolds

Quote:


> Originally Posted by *kcuestag*
> 
> For those asking about Warranty when removing the cooler, I RMA'd a Sapphire R9 290X and got a brand new replacement, even though mine had a waterblock installed on it, they didn't notice or mind, not sure which.


How long was the entire RMA process? You sent it to Sapphire, and not the retailer, correct?


----------



## ZealotKi11er

This is with BF4 after playing for 30 mins. The core stays @ 1000MHz even @ 94°C, using the Uber BIOS.

The 52-53% fan speed does not really bother me that much. People have exaggerated the noise it makes.


----------



## fewohfjweoifj

Still getting black screens at stock on the 9.4 beta driver. Has anyone received any word whatsoever about whether further fixes will come for the unlucky few still stuck with an unusable card? It is extremely frustrating that the rest of the internet seems to have had their problems with this card solved except for me.


----------



## qqan

Quote:


> Originally Posted by *OJSimps*
> 
> Hey guys, have a bit of a dilemma here. So I've got a single Sapphire R9 290 which I just can't get to work. It's a new one as I had to RMA the first 290 I got for the same reasons. I did a clean sweep of any Nvidia drivers before I installed the 290, I was using a EVGA GTX 570 before. After booting off of Mobo graphics and installing the latest beta drivers for the 290 + rebooting, my computer will hard freeze and need to shut down. This constantly is happening anywhere from 2-10 minutes after booting up my PC. Do I have a power issue? Or what might be at fault? I'm just lost at this point as I feel like i've pulled out all the stops.


Make sure your motherboard is at stock clocks, then re-add any OC step by step. Worked for me.
-qq


----------



## OJSimps

Quote:


> Originally Posted by *Ukkooh*
> 
> What exactly did you use to "clean sweep" the old nvidia drivers? To me it sounds like it might have messed up something in your windows installation. That or it is a case of black screen and you'll have to RMA your card again.
> 
> Ps. Welcome to OCN!


I went to add/remove programs and uninstalled everything NVIDIA oriented. Then went to delete all NVIDIA folders, and then wrapped it up by cleaning anything NVIDIA from my registry. I'll have to try another uninstall of everything with the program in the OP.


----------



## kcuestag

Quote:


> Originally Posted by *jerrolds*
> 
> How long was the entire RMA process? You sent it to Sapphire, and not the retailer correct?


My RMA was via the store I bought the card from, not the manufacturer.

It took like 3 days since the day I shipped it until I got a new one.








Quote:


> Originally Posted by *ZealotKi11er*
> 
> This is with BF4 playing for 30 mins. Core stays @ 1000MHz even @ 94C. Using Uber Bios.
> 
> Fan speed 52-53% does not really bother me that much. People have over-exaggerated the noise it makes.


It is loud once you get used to water cooling and fans at 800rpm.








Quote:


> Originally Posted by *fewohfjweoifj*
> 
> Still getting black screens at stock on the 9.4 beta driver. Has anyone received any word whatsoever about whether further fixes will come for the unlucky few still stuck with an unusable card? It is extremely frustrating that the rest of the internet seems to have had their problems with this card solved except for me.


Sorry to hear that. Do you still have sound when those black screens occur?

If not, it may be the PSU not delivering +12V properly. I experienced that because I was using just one PCIe cable from the PSU to power the whole card; after I started using 2 cables, those kinds of crashes were gone.

If you do have sound, then it's the black-screen issue, and I guess you have two options:

1. RMA
2. Wait for new drivers

Maybe you could PM the AMD Rep who posted in this thread and let him know there are still a few people who get those blackscreen issues.


----------



## Ukkooh

Quote:


> Originally Posted by *fewohfjweoifj*
> 
> Still getting black screens at stock on the 9.4 beta driver. Has anyone received any word whatsoever about whether further fixes will come for the unlucky few still stuck with an unusable card? It is extremely frustrating that the rest of the internet seems to have had their problems with this card solved except for me.


Quote:


> Originally Posted by *Ukkooh*
> 
> Update on the situation: The shop mailed me back and said that I should absolutely RMA it. They told me that it definitely won't get any better in a way that implies that it will get worse over time. It sounds like they know something that we don't (specifically that black screens are hardware faults) and I definitely recommend everyone who has experienced black screens at stock to RMA their cards.


----------



## Arizonian

Quote:


> Originally Posted by *mboner1*
> 
> Cheers dude
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How come your switching to the 780ti Arizonian? .. Is that some kind of sick joke?


Not a joke. Here is the post explaining it. *LINK*

Quote:


> Originally Posted by *Ukkooh*
> 
> It is not the wisest move to switch to a 780ti just a few weeks before we get the first mantle results in.


I'm sure I will regret it when 290X performance soars past the 780 Ti in the games I play, mainly BF4. I may sell at that point, but I will be upgrading again when 20 nm releases.

I'd also like to reiterate that I'm going to fully support the club thread; I don't have any bias toward one company. If non-reference cards had been out earlier, I'd have already pulled the trigger. I waited until the latter part of November, and then news came out that they wouldn't arrive until the second week of December; after waiting as long as I did, I got impatient. I've gone without gaming far too long.

I did take advantage of the promos: the Nvidia Shield at $100 off, plus the three bundled games, which end today. Two of the three games I was going to purchase anyway, so it came to a $540 value as far as I'm concerned. My second rig, which is 3D Vision, will see the 780 Ti eventually for the extra gig of VRAM. At that point I will sell the 690 and upgrade the main rig again. I may be back soon.


----------



## Raxus

Quote:


> Originally Posted by *ZealotKi11er*
> 
> This is with BF4 playing for 30 mins. Core stays @ 1000MHz even @ 94C. Using Uber Bios.
> 
> Fan speed 52-53% does not really bother me that much. People have over-exaggerated the noise it makes.


It's certainly no louder than my H80.


----------



## ZealotKi11er

Quote:


> Originally Posted by *kcuestag*
> 
> My RMA was via the store I bought the card from, not retailer.
> 
> It took like 3 days since the day I shipped it until I got a new one.
> 
> 
> 
> 
> 
> 
> 
> 
> It is loud once you get used to water cooling and fans at 800rpm.
> 
> 
> 
> 
> 
> 
> 
> 
> Sorry to hear that, do you still have sound when those blackscreens occur?
> 
> If not, it may be the PSU not giving +12V properly. I experienced that because I was using just one PCIe cable from PSU to power whole card, after I started using 2 cables those kind of crashes were gone.
> 
> If you do have sound, then it's the blackscreen issue and I guess you have two options;
> 
> 1. RMA
> 2. Wait for new drivers
> 
> Maybe you could PM the AMD Rep who posted in this thread and let him know there are still a few people who get those blackscreen issues.


I have water cooling and I know how nice it is to have a silent GPU, but even on air it's not that bad unless you want to overclock it. To the ear it is quieter than the reference HD 7970 cooler.


----------



## Raxus

Quote:


> Originally Posted by *fewohfjweoifj*
> 
> Still getting black screens at stock on the 9.4 beta driver. Has anyone received any word whatsoever about whether further fixes will come for the unlucky few still stuck with an unusable card? It is extremely frustrating that the rest of the internet seems to have had their problems with this card solved except for me.


I would probably RMA it at this point


----------



## rdr09

Quote:


> Originally Posted by *fewohfjweoifj*
> 
> Still getting black screens at stock on the 9.4 beta driver. Has anyone received any word whatsoever about whether further fixes will come for the unlucky few still stuck with an unusable card? It is extremely frustrating that the rest of the internet seems to have had their problems with this card solved except for me.


Return it. BTW, the OP has invited you to join the club, but you seem to be ignoring it while asking members to help you. What gives? I know it must be frustrating, but I think your only avenue now is to return the product; it's only been a month.


----------



## Arizonian

Quote:


> Originally Posted by *ZealotKi11er*
> 
> This is with BF4 playing for 30 mins. Core stays @ 1000MHz even @ 94C. Using Uber Bios.
> 
> Fan speed 52-53% does not really bother me that much. People have over-exaggerated the noise it makes.
> 
> 
> Spoiler: Warning: Spoiler!


I couldn't agree more about the noise. Those jokes at 100% fan (which I agree is loud) miss the point; 100% is not needed in real-world usage, and who the heck plays at 100% fan?

Benching is temporary and the only time 100% should be used.

But you know, haters will be haters; it helps them validate their own purchase, or feel good about it, when they feel threatened by performance meeting theirs at a lower cost.


----------



## mojobear

XSPC released their 290/290X waterblock:

http://www.xs-pc.com/waterblocks-gpu/razor-r9-290x-290

Also a backplate:

http://www.xs-pc.com/waterblocks-gpu/razor-r9-290x-290-backplate


----------



## mboner1

Quote:


> Originally Posted by *Arizonian*
> 
> I couldn't agree more about the noise it makes. Those jokes at 100% fan (I agree is loud) is not needed in real world usage and who the heck plays at 100% fan?
> 
> Benching is temporary and the only time 100% should be used.
> 
> But you know, haters will be haters and it helps them validate their own purchase or feel good about it when they feel threatened by performance meeting theirs at less cost.


Honest to god, my 7970 was louder than this 290. It would hit 80-82°C at 70-72% fan speed. I started with a similar fan curve for this card, and at 72% fan speed it was still a bit quieter, and only hitting 72°C! With a max temp of 94°C, I have lowered the fan curve and now hit 86°C with a fan speed of 55% and an overclock of 1125/1400. Either I got lucky, or there was/is a lot of fuss about nothing. Tom's Hardware has been a horrible source of information regarding everything to do with the 290/290X.
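A fan curve like the one described above (roughly 55% duty around 86°C) is just piecewise-linear interpolation between (temperature, duty) points. Here's a minimal Python sketch of that math; the curve points are illustrative only, not Afterburner's or the card's actual defaults:

```python
def fan_speed(temp_c, curve=((40, 30), (70, 45), (86, 55), (94, 75))):
    """Return fan duty (%) for a GPU temperature using linear interpolation.

    `curve` is a sorted sequence of (temp_C, duty_%) points; temperatures
    outside the curve are clamped to the first/last duty value.
    """
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # Linear interpolation between the two surrounding points
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(86))  # 55
print(fan_speed(78))  # 50.0
```

The actual fan control of course lives in Afterburner or the card's firmware; this only shows how the duty value falls out of the curve you drag around in the editor.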


----------



## Arizonian

Quote:


> Originally Posted by *mojobear*
> 
> XSPC released their 290/290x waterblock
> 
> http://www.xs-pc.com/waterblocks-gpu/razor-r9-290x-290
> 
> Also a blackplate
> 
> http://www.xs-pc.com/waterblocks-gpu/razor-r9-290x-290-backplate


Thank you for the update. Added to OP "Water Block" section.


----------



## the9quad

To be fair, when you have more than one card installed with stock cooling, the temps get quite a bit higher, and you have to raise the fan speeds a bit past 50-60%. I find roughly 65% keeps my cards fairly cool, mid-70s, and it isn't obnoxiously loud that way either. My case has pretty good airflow as well, but multiple cards can generate some heat. I just don't feel like spending the money on water cooling at the moment, but with any more than one card I think it's borderline necessary.


----------



## mojobear

Quote:


> Originally Posted by *fewohfjweoifj*
> 
> Still getting black screens at stock on the 9.4 beta driver. Has anyone received any word whatsoever about whether further fixes will come for the unlucky few still stuck with an unusable card? It is extremely frustrating that the rest of the internet seems to have had their problems with this card solved except for me.


Do you have a Sapphire as well, fewohfjweoifj?


----------



## fewohfjweoifj

Quote:


> Originally Posted by *mojobear*
> 
> Originally Posted by fewohfjweoifj
> Do you have a sapphire as well ?


Yes, I have a Sapphire. I'm probably going to RMA it later today based on the advice in the thread. I was prepared to wait for drivers, but as AMD has given no information on whether those are actually being produced, I don't really have anything to go on. I sent a PM to the AMD rep who posted in the thread, but it seems he doesn't come online unless he has an announcement to make, so I haven't got a reply yet.


----------



## mboner1

Quote:


> Originally Posted by *fewohfjweoifj*
> 
> Yes, I have Sapphire. I'm probably going to rma later today based on the advice in the thread. I was prepared to wait for drivers but as AMD have given no information on whether those are actually being produced, I don't really have anything to go on. I sent a pm to the AMD rep who posted in the thread but it seems he doesn't come online unless he has an announcement to make or something so I haven't got a reply yet.


Have you tried running the Catalyst uninstall utility and then restarting in safe mode and using Driver Sweeper? My mate had a really messed-up install of CCC and that fixed it. Worth a shot.


----------



## fewohfjweoifj

Quote:


> Originally Posted by *mboner1*
> 
> Have you tried running the catalyst uninstall utility and then restarting in safe mode and using driver sweeper? My mate had a really messed up install of ccc and that fixed it. Worth a shot.


I did try doing this a few times, to no avail. However, I keep ending up with a CCC that has no Overdrive, which according to a few people means I didn't do it properly. I'm fairly sure I did everything correctly, though.
By the way, I'm not trying to be rude by not joining the club; I'm just refraining from doing so until I actually get a card working in my machine.


----------



## mboner1

Quote:


> Originally Posted by *fewohfjweoifj*
> 
> I did try doing this a few times to no avail. However I keep ending up with a CCC that has no Overdrive, which according a few people means I didn't do it properly. I definitely should have done everything correctly though.
> By the way, I'm not trying to be rude by not joining the club, I'm just refraining from doing so until I actually get a card working in my machine.


Yeah, I don't blame you, man. Have you disabled ULPS? Not sure if that is likely to be the cause, but it can't hurt. I would try not running GPU-Z while gaming as well; it caused issues on my 7970, though I'm not sure if that is still a possibility with these new cards.
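For anyone unsure what "disabling ULPS" involves: it can be toggled in MSI Afterburner's settings, or done directly in the registry, where AMD's driver stores an `EnableUlps` value under each display-adapter subkey. The fragment below is a sketch, not a definitive fix - the class GUID is the standard display-adapter one, but the `0000` subkey number varies per system, so repeat the change for every `00xx` subkey that actually contains an `EnableUlps` value (and back up the registry first):

```
Windows Registry Editor Version 5.00

; Hypothetical example - adjust the 0000 subkey number to match your system.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

A reboot is needed afterwards for the driver to pick up the change.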


----------



## Falkentyne

Quote:


> Originally Posted by *fewohfjweoifj*
> 
> I did try doing this a few times to no avail. However I keep ending up with a CCC that has no Overdrive, which according a few people means I didn't do it properly. I definitely should have done everything correctly though.
> By the way, I'm not trying to be rude by not joining the club, I'm just refraining from doing so until I actually get a card working in my machine.


Beta 9.4 has no Overdrive. You installed it properly. Use beta 9.2 or the WHQL.


----------



## jerrolds

I have Overdrive in beta 9.4... unless the install went wrong. Can anyone copy/paste their software info? I'll have to double check later.


----------



## fewohfjweoifj

Quote:


> Originally Posted by *Falkentyne*
> 
> beta 9.4 has no overdrive. You installed it properly. Use beta 9.2 or the WHQL.


Ok, I'm glad I installed it properly. I was worried it was my fault for being too dumb to install drivers correctly. I did try both beta 9.2 and the WHQL but the issue is still present when using those. Either AMD need to bring out another driver or I have a faulty card that needs replacing.


----------



## Skylark71

Quote:


> Originally Posted by *Falkentyne*
> 
> beta 9.4 has no overdrive. You installed it properly. Use beta 9.2 or the WHQL.


Stupid question: do I need Overdrive to raise the power limit % even if I only use GPU Tweak?


----------



## HanSoloMe

Hey guys

1st off, here is my verification for membership: 
I'm using an XFX 290 and I'll be putting the Accelero Xtreme III on it within a few days, so you can put me down for aftermarket.

Along those lines, I have a question: according to a Tom's Hardware post detailing the Xtreme III install specifically on a 290, it says that the Xtreme III is short 4 heatsinks ("As I browsed through the cooler's contents, I noticed that we were short four memory heat sinks (the R9 290 needs 16 for its 256 MB packages, and Arctic only includes 12). Fortunately, this isn't a show-stopper. Part of AMD's approach to this card was widening the memory bus to 512 bits and running its GDDR5 at more conservative frequencies, scaling back on voltage at the same time. The memory packages don't get so hot that they pose a thermal issue.") As stated it doesn't sound like an issue, but looking at the barebones card with no cooler on it, I have no idea which spots to leave un-sinked.

I would be incredibly grateful if somebody could take a picture of the card with no cooler on it (I think there are several on this thread already) and point out either with a diagram or something which 4 spots I should leave open.

Much appreciated!


----------



## Arizonian

Quote:


> Originally Posted by *HanSoloMe*
> 
> Hey guys
> 
> 1st off, here is my verification for membership:
> 
> 
> 
> I'm using an XFX 290 and I'll be putting the Accelero Xtreme III on it within a few days, so you can put me down for aftermarket.
> 
> Along those lines, I have a question: according to a Tom's Hardware post detailing the Xtreme III install specifically on a 290, it says that the Xtreme III is short 4 heatsinks ("As I browsed through the cooler's contents, I noticed that we were short four memory heat sinks (the R9 290 needs 16 for its 256 MB packages, and Arctic only includes 12). Fortunately, this isn't a show-stopper. Part of AMD's approach to this card was widening the memory bus to 512 bits and running its GDDR5 at more conservative frequencies, scaling back on voltage at the same time. The memory packages don't get so hot that they pose a thermal issue.") As stated it doesn't sound like an issue, but looking at the barebones card with no cooler on it, I have no idea which spots to leave un-sinked.
> 
> I would be incredibly grateful if somebody could take a picture of the card with no cooler on it (I think there are several on this thread already) and point out either with a diagram or something which 4 spots I should leave open.
> 
> Much appreciated!


First welcome to OCN - I see it's your second post.

Congrats - added.

_We have some Xtreme III owners here that will be able to answer your question_.


----------



## DizzlePro

i wanna pull the trigger, but dat cooler

http://www.overclockers.co.uk/showproduct.php?prodid=GX-337-SP


----------



## TheRoot

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Add me please!
> 
> 
> 
> 4 cards, 2 Gigabyte, 1 HIS, and 1 Powercolor(arrives tomorrow)
> 
> All on water....


You missed this guy, mod.


----------



## jerrolds

Quote:


> Originally Posted by *Skylark71*
> 
> Stupid question: do i need overdrive to raise power limit% even if i only use Gpu tweak?


Just use GPU Tweak - my CCC is turned off/disabled. Hell, I was only using GPU Tweak while CCC was waiting for me to "Accept" its conditions.


----------



## Arizonian

Quote:


> Originally Posted by *TheRoot*
> 
> you miss this guy mod


I surely did. Thanks.

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Add me please!
> 
> 
> 4 cards, 2 Gigabyte, 1 HIS, and 1 Powercolor(arrives tomorrow)
> 
> All on water....


Congrats - I slotted you in accordingly. Added (all four) and under water.


----------



## Jpmboy

Quote:


> Originally Posted by *Arizonian*
> 
> Some news : I have purchased a 780Ti ACX.
> 
> I'll elaborate; my second rig is an Nvidia 3D vision rig and when 20nm comes out I'm going to be upgrading again. Will give me a chance to upgrade my 2nd rigs 2GB 690 with the 780Ti 3GB to keep 3D Vision enabled and add another gig of VRAM which I feel will be needed eventually. My family still watches 3D movies frequently and we use that computer to do so with 3D gaming on new titles. Eventually the main rig becomes second rig with hand me downs etc....
> 
> Secondly on top of that, I was able to take advantage of the $100 off promo for Nvidia shield that ends today which my sister and I are gifting to nephews for Christmas. She's made the purchase today.
> 
> Between promo and two of the three bundle games I was going to buy anyway it made it a nice value minus that cost and sweetened the deal a bit.
> 
> Which brings me to the following:
> 
> *Now, I am more than happy to remain the thread starter and active in this club; however, I'm also going to put out the opportunity for a 290X / 290 owner to take on the task of thread starter to better serve the club. PM me if you're interested and willing to dedicate the time and consistent effort to keep the club active and updated.*
> 
> I'm sure when Mantle comes out I'm going to regret this decision. I'll be looking forward to Mantle launching, and by the time the 20 nm video cards come out Mantle will be in full force. Again, don't worry gentlemen, I'm not going anywhere and will continue to fully support the club. This wasn't an easy decision.


the 780Ti is a fantastic card...you will not regret it anytime soon!


----------



## Ricey20

For those looking to do the "Red Mod" or don't want to go full blown watercooling yet, NZXT released a GPU mount for both NV/AMD cards for asetek AIO coolers. Comes in white, black, or red with a 92mm fan for VRM cooling and zip tie mounts for tube routing. VRM and Ram sinks you'll have to buy separately.

http://www.nzxt.com/product/detail/138-kraken-g10-gpu-bracket.html


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ricey20*
> 
> For those looking to do the "Red Mod" or don't want to go full blown watercooling yet, NZXT released a GPU mount for both NV/AMD cards for asetek AIO coolers. Comes in white, black, or red with a 92mm fan for VRM cooling and zip tie mounts for tube routing.
> 
> http://www.nzxt.com/product/detail/138-kraken-g10-gpu-bracket.html


Don't you need RAM and VRM heat-sinks also?


----------



## malik22

Hello guys, I just got my XFX 290, and when I register it to get the Battlefield 4 promo it says it's not available for my card?


----------



## Newbie2009

Quote:


> Originally Posted by *Jpmboy*
> 
> the 780Ti is a fantastic card...you will not regret it anytime soon!


As good as it is, I managed to get 2 290s for 50€ less than a Ti.


----------



## Raxus

Quote:


> Originally Posted by *malik22*
> 
> hello guys I just got my xfx 290 and when I register it to get the battlefield 4 promo its says its not available for my card?


If it didn't say it on the page you bought it from, you might be SOL.

Otherwise contact xfx support.


----------



## pompss

Quote:


> Originally Posted by *Ricey20*
> 
> For those looking to do the "Red Mod" or don't want to go full blown watercooling yet, NZXT released a GPU mount for both NV/AMD cards for asetek AIO coolers. Comes in white, black, or red with a 92mm fan for VRM cooling and zip tie mounts for tube routing. VRM and Ram sinks you'll have to buy separately.
> 
> http://www.nzxt.com/product/detail/138-kraken-g10-gpu-bracket.html


Quote:


> Originally Posted by *ZealotKi11er*
> 
> Don't you need RAM and VRM heat-sinks also?


I am running the R9 290X with heatsinks installed on the RAM (8 dollars), an EK universal waterblock, and a Cougar fan pushing air over the VRMs.
Total: 69 dollars. Overclocked to 1200/1400, stable on the stock BIOS.
The VRMs are getting really hot; I think I have to add some heatsinks.


----------



## Ricey20

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Don't you need RAM and VRM heat-sinks also?


Yep, but those you'll have to buy separately.


----------



## jerrolds

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Don't you need RAM and VRM heat-sinks also?


Sure do - while the fan on that bracket will probably hit the VRMs on the right side, I doubt much flow will hit the cluster of three near the outputs. It's too bad the fan isn't angled or something.


----------



## MintyOne

Quote:


> Originally Posted by *HanSoloMe*
> 
> Hey guys
> 
> 1st off, here is my verification for membership:
> I'm using an XFX 290 and I'll be putting the Accelero Xtreme III on it within a few days, so you can put me down for aftermarket.
> 
> Along those lines, I have a question: according to a Tom's Hardware post detailing the Xtreme III install specifically on a 290, it says that the Xtreme III is short 4 heatsinks ("As I browsed through the cooler's contents, I noticed that we were short four memory heat sinks (the R9 290 needs 16 for its 256 MB packages, and Arctic only includes 12). Fortunately, this isn't a show-stopper. Part of AMD's approach to this card was widening the memory bus to 512 bits and running its GDDR5 at more conservative frequencies, scaling back on voltage at the same time. The memory packages don't get so hot that they pose a thermal issue.") As stated it doesn't sound like an issue, but looking at the barebones card with no cooler on it, I have no idea which spots to leave un-sinked.
> 
> I would be incredibly grateful if somebody could take a picture of the card with no cooler on it (I think there are several on this thread already) and point out either with a diagram or something which 4 spots I should leave open.
> 
> Much appreciated!


There is one memory chip that will be blocked by the heatsink; if you place the cooler over the top of the board, it's clear which one it is. I just used a piece of thermal pad from the stock cooler. To deal with the shortage of RAM sinks I used 4 of the longer/narrower pieces to cover 4 RAM chips, placing 2 of the longer sinks side by side, spanning 2 RAM chips each.

One other important point is to use the long plastic spacers on the back of the card between the backplate/foam and the PCB. If you don't use the longest ones in the package, the backplate will come into contact with some PCB components when you tighten it down and compress the foam.

Sorry I can't take a picture; hopefully my crappy explanation will help. Just place the cooler into position, look where your clearances are, and everything should be clear.


----------



## Banedox

Quote:


> Originally Posted by *malik22*
> 
> hello guys I just got my xfx 290 and when I register it to get the battlefield 4 promo its says its not available for my card?


Hey man, I would contact XFX, because mine wasn't purchased with anything mentioning Battlefield 4, though I just registered my card and I got it. Hope that helps.


----------



## steelkevin

Count me in.

http://www.techpowerup.com/gpuz/ws29h/
Sapphire with Hynix Memory
Non unlockable
EK-FC R9-290X - Nickel and an EK Backplate
74.3% ASIC Quality

Just wondering what other R290 (or 290X) owners are getting temp wise after playing a bit of BF4 for example just to check whether or not my temps are normal.

Thanks in advance


----------



## Ukkooh

This is not my video, but I think it should be added to the OP to help those who are going down "the mod" way. SamEkinci: I hope you don't mind me posting this here! My apologies if this offends you.


----------



## Jpmboy

Quote:


> Originally Posted by *Newbie2009*
> 
> As good as it is, I managed to get 2 290s for 50€ less than a TI


----------



## Forceman

Quote:


> Originally Posted by *malik22*
> 
> hello guys I just got my xfx 290 and when I register it to get the battlefield 4 promo its says its not available for my card?


I saw someone mention, kind of in passing, that it may have ended already for XFX.
Quote:


> Originally Posted by *Skylark71*
> 
> Stupid question: do i need overdrive to raise power limit% even if i only use Gpu tweak?


You shouldn't, but some people have reported the power limit not being correctly raised if it wasn't also raised in CCC. That may be an Afterburner-specific issue, though.


----------



## maarten12100

Quote:


> Originally Posted by *Banedox*
> 
> Hey man I would contact XFX, cause mine was not purchased with anything mentions battlefield 4, tho I just registered my card and I got it. Hope that helps


http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/7950#post_21258698

I'm thinking about ordering 2 XFX ones for this reason; I demand my BF4.
It's really the reason I returned my Sapphire ones, so I could get BF4 with XFX ones from Alternate.NL.
Now that also went away...

False advertising at its finest, as the slide clearly said it was bundled and there was nothing about only specific SKUs having it.
I mean, an SKU that costs more than BF4 plus a non-BF4 R9 290 SKU - what is the point in that?

So shall I order the XFX version and hope for BF4, or shall I wait for Warsam to explain what exactly they are doing (he replied in the Q&A thread, but I never got my answer...)?
I can always return it if there is no BF4 (again...).

http://www.alternate.nl/XFX/XFX+R9_290_Core_Edition_%28R9-290A-ENFC%29,_grafische_kaart/html/product/1104166/?tk=7&lk=10190
http://www.alternate.nl/MSI/MSI+R9_290_4GD5_Battelfield_4_Edition,_grafische_kaart/html/product/1113488/?tk=7&lk=10190

Amazing deals: BF4 goes for 35 Euro and the card costs 40 Euro extra.


----------



## maynard14

Quote:


> Originally Posted by *Banedox*
> 
> Gratz on the unlock, I got an unlocked XFX as well. Now register your card and get Battlefield 4 free! Now waiting on an EK waterblock.


Really, sir? How can I get Battlefield 4? Please, can you tell me how to get it? I don't see any promo on the box of my card that says free Battlefield 4... thanks


----------



## Banedox

Well, all I did was go to the official XFX website, create an account, and register my graphics card serial number; there should be a little card in the box with a username and password plus some activation and serial numbers on it. Use those on the site. Once the serial is in, find the My Products page; to the right of the card information should be a promotions link to click, then click redeem Battlefield 4 with the activation code on the little card. They will then email you a serial to be redeemed on Origin. (I may have missed a step in there.)


----------



## hatlesschimp

Quote:


> Originally Posted by *maarten12100*
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/7950#post_21258698
> 
> I'm thinking about ordering 2 XFX ones for this reason I demand my BF4.
> I mean really it is the reason I returned my Sapphire ones so I could get BF4 with XFX ones from Alternate.NL
> Now that also went away...
> 
> False advertising at its finnest as the slide clearly said that is was bundeld and there was nothing about only specific SKU's having it.
> I mean a SKU that costs more than BF4 + a non BF4 SKU R9 290 what is the point in that.
> 
> So shall I order the XFX version and hope for BF4 or shall I wait for Warsam to explain what they are exactly doing (replied in the Q&A thread and never got my answer...)
> I can always return it if there is no BF4 though (again...)
> 
> http://www.alternate.nl/XFX/XFX+R9_290_Core_Edition_%28R9-290A-ENFC%29,_grafische_kaart/html/product/1104166/?tk=7&lk=10190
> http://www.alternate.nl/MSI/MSI+R9_290_4GD5_Battelfield_4_Edition,_grafische_kaart/html/product/1113488/?tk=7&lk=10190
> 
> Amazing deals BF4 goes for 35 Euro and the card costs 40 Euro extra


I think all the new stock will see the BF4 codes absent.

Just something I have noticed in Australia.


----------



## maynard14

Quote:


> Originally Posted by *Banedox*
> 
> Well All I did was go to the offical XFX website, created an account and registered by graphics card serial number, there should be a little card in the box that says username and password with some activation and serial numbers on it. Use those on the site. Once the serial is in find the my products page, to the right of the card information should be a promotions link to click and then click redeem Battlefield 4, with the activation code on the little card . They will then email you a serial to be redeemed on Origin. (I may have missed a step in there)


Thank you, sir, but I tried that, and here is the result:

"Sorry, your product is not subject to XFX game promotion"

Wahhh... hmm, can someone tell me what the use of the personal activation code that I got is? I'm confused right now.


----------



## maarten12100

Quote:


> Originally Posted by *hatlesschimp*
> 
> I think all the new stock will see the BF4 codes absent.
> 
> Just something I have noticed in Australia.


First they said it is free; next they say it's only with specific SKUs.
Then the SKUs were more expensive than buying separately - where is the logic in that?

I'm going to order the XFX and hope for the best.


----------



## maarten12100

Quote:


> Originally Posted by *maynard14*
> 
> thank you sir but i tried that but here is the result:
> 
> 
> 
> Sorry, your product is not subject to XFX game promotion
> 
> wahhh,.. hmm can someone tell me what is the use of the personal activation code that i got ? im confuse right now


I revoke my previous statement.


----------



## Gunderman456

Arizonian, change me to water!

Will be incorporating them in my new build, "The Hawaiian Heat Wave" (in sig)!


----------



## maynard14

Quote:


> Originally Posted by *maarten12100*
> 
> I revoke my previous statement.


Ahmm, so the code is for the warranty? Maybe I should contact them... heh, thanks sir.


----------



## ReHWolution

Quote:


> Originally Posted by *maynard14*
> 
> Quote:
> 
> 
> 
> Originally Posted by *maarten12100*
> 
> I revoke my previous statement.
> 
> 
> 
> ahmm so the code is for the warranty? maybe i should contact them.. eheh thanks sir
Click to expand...

Seems like the only cards the promotion is valid for are the 270/270X :\


----------



## Forceman

Quote:


> Originally Posted by *maynard14*
> 
> thank you sir but i tried that but here is the result:
> 
> 
> 
> Sorry, your product is not subject to XFX game promotion
> 
> wahhh,.. hmm can someone tell me what is the use of the personal activation code that i got ? im confuse right now


I had to use the personal activation code to get the BF4 deal, not sure what else it is good for.


----------



## SultanOfWalmart

Quote:


> Originally Posted by *Forceman*
> 
> I had to use the personal activation code to get the BF4 deal, not sure what else it is good for.


Personal code is also used for the warranty.

As far as XFX and BF4 go, they did mention on their website that BF4 copies are only good while their "limited" supplies last, so I'm guessing their allotment of BF4 codes has run dry. They also say you should check with your dealer/vendor for specific qualifying models, although they still have the promotion shown on their site, so I'm not sure what the deal is.

I do know that when I got my XFX R9 290 from Amazon, it mentioned nothing about getting a free copy of BF4 anywhere in the product description. But when I went to the XFX website and followed their directions for getting a code, it worked. Essentially, it seems that YMMV.

Best of luck.


----------



## maarten12100

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> Personal code is also used for the warranty.
> 
> As far as XFX and BF4 goes, they did mention on their website that BF4 copies are only good while their "limited" supplies last. So I'm guessing their allotment of BF4 codes has run dry. THey also say that you should check with your dealer/vendor for specific qualifying models. Although they still have the promotion shown on their site, so I'm not sure whats the deal.
> 
> I do know that when I got my XFX R9 290 from Amazon, it mentioned nothing of getting a free copy of BF4 anywhere in the product description. But when I went to XFX website and followed their directions for getting your code, it worked. Essentially, seems that YMMV.
> 
> Best of luck.


Newegg still states that it comes with BF4; if I were an American, I would sue them for false advertising.


----------



## SultanOfWalmart

Quote:


> Originally Posted by *maarten12100*
> 
> Newegg still states that it comes with BF4 if I was an American I would sue them for false advertising


As far as I know - and this is based on two of my friends who ordered their R9 290s from Newegg within the last week and got their BF4 codes just fine - I'm not really sure where this "false advertisement" you speak of comes in. Unless I missed something.

It may just be an issue with the vendor the other guy bought his from, which I do not think was Newegg. Though I could be wrong.


----------



## voldomazta

Guys what are your thoughts on this:
http://www.amazon.com/dp/B00GJSUNHC

It's kind of shady without the pictures, but what the heck, I pulled the trigger on 2. I contacted their live chat support and she confirmed the items are legitimate and sold by Amazon. It's kind of weird that you can't just go to amazon.com and search for "R9290X-G-4GD5" to get there; I got there through Google.


----------



## selk22

I just want to clear something up here about beta 9.4 and CCC overdrive...

If you did a complete driver sweep before installing the new 9.4 drivers - one that includes the uninstall of Overdrive with CCC - then you will not have Overdrive and power control in CCC with beta 9.4.
I used this tool from the OP


Spoiler: Warning: Spoiler!



https://forums.geforce.com/default/topic/550192/display-driver-uninstaller-ddu-v7-1/



BUT

If you have not completely uninstalled everything properly before installing 9.4, then you will most likely still have Overdrive and power control in CCC, like some users have reported.

So 9.4 by itself does not ship with Overdrive.

It's actually been way more stable for me on 9.4 than anything else, and it doesn't bother me because MSI AB all the way!


----------



## the9quad

That must be why I didn't have it and went back to the previous betas. So what, just install on top of them?


----------



## JordanTr

Quote:


> Originally Posted by *the9quad*
> 
> That must be why I didn't have it and went back to the previous betas, so what just install on top of them?


I'm just using MSI Afterburner beta 17 - problems solved. They probably removed Overdrive because you don't need to set the power limit to +50 in both places. By the way, even with the power limit at 0% my card doesn't throttle (it hits 82 degrees Celsius max with my current fan profile).


----------



## maarten12100

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> As far as I know, and this is based on two of my friends that have ordered their R9 290 from Newegg within the last week, got their BF4 codes just fine. So I'm not really sure where "false advertisement" you speak of comes in. Unless I missed something.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It may just be an issue with the vendor where the other guy bought his from, which I do not think was Newegg. Though I could be wrong.



I'm pretty sure I'm not the only one who bought the card expecting it to be bundled with BF4 and received nothing.
They totally turned us over by giving the game only to specific SKUs that are more expensive than the game plus the card.

It is an insult to the Never Settle program *we could as well start calling it the Settle program*, as people with a 260X got way more value, as did anyone with a card from the gold promotion pack, AKA the cheap 7950/7970s or 7990s.
We bought the highest-end cards, yet we got nothing.

Now, I'm not saying those cards aren't great value; I'm just saying they threw us a bone and didn't deliver.


----------






## HanSoloMe

Quote:


> Originally Posted by *MintyOne*
> 
> there is one memory chip that will be blocked by the heatsink. if you place the cooler over top the board it's clear which one it is. I just used a piece of thermal pad from the stock cooler. To deal with the shortage of ram sinks I used 4 of the longer/narrow pieces to cover 4 ram chips. I did this by using 2 of the longer sinks side by side spanning 2 ram chips each.
> 
> One other important issue is to use the long plastice spacers on the back of the card between the backplate/foam and the pcb. If you don't use the longest ones in the package the backplate will come into contact with some pcb components when you tighten it down and compress the foam.
> 
> Sorry I can't take a picture and hopefully my crappy explanation will help. Just place the cooler into position and look where your clearances are and everything should be clear.


Awesome, thank you!

I actually had to RMA my card shortly after posting because it started to artifact and crash.... But as soon as I get it back I'll use your tips and post the results/more questions.


----------



## Heinz68

Quote:


> Originally Posted by *maarten12100*
> 
> Newegg still states that it comes with BF4 if I was an American I would sue them for false advertising


The Battlefield 4 bundle with the R9 Series is a promotional, limited-time offer available from AMD's board partners.
I believe the XFX R9 290 was one of the first of the bundled cards listed at Newegg, but today the card is no longer listed, so it must be sold out. I'm sure anybody who purchased the bundled Battlefield 4 card from Newegg while the promotion was on will get the game without any lawsuit.

Right now at Newegg, the GIGABYTE and HIS R9 290 offer "Free BattleField 4 Game w/purchase, limited offer".
The Sapphire R9 290 also has "BattleField 4 Game included, limited time offer" for $10 extra, and they don't call it free.

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=R9+290&N=-1&isNodeId=1


----------



## maarten12100

If anybody can just point me to the XFX that is bundled with BF4 here in NL (and that doesn't cost more than buying them separately),
I'll just get 2 of those.

http://www.forbes.com/sites/jasonevangelho/2013/11/14/amd-may-not-have-99-problems-but-battlefield-4-is-certainly-one-of-them/

I don't concur with the part of the article about Origin PC, since Origin is just an overpriced bunch of stupid people who have been bribed by Nvidia, but the rest stands correct: this was handled wrong and they should be doing damage control.


----------



## skupples

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Got in to an argument with a guy that was telling me 4k doesn't look any better than 720p........... I wanted to slap him in the gums! UGGGGGH that junk looks incredible! I'm just now stepping up to 1440p!!!


This must have been on YouTube.

BTW, from what I've read elsewhere, you can get plenty fine scores out of Valley in CF with custom profiles. Someone was linking a 160fps+ run on 4 cards; said all he did was create a custom CF profile.


----------



## KyGuy

Hey guys, has anyone experienced a sleep bug? For me, the issue is that my GPU will fail to recover if the screen turns off to go into sleep mode, so I changed my monitor to never switch off. This has happened a lot lately. No overclock on the core, but up to 1500 on the mem, tested stable in Unigine for 30 minutes and Furmark for 10 minutes. The screen shows a signal from the card, but no picture...


----------



## dartuil

No one can show me a 290 with an Accelero on it?

Please


----------



## maarten12100

Quote:


> Originally Posted by *Heinz68*
> 
> The Battlefield 4 bundle with R9 Series is promotial limited time offer avaiable from AIO's partners.
> I believe the XFX R9 290 was one the first to have the cards listed at Newegg, but today the card is no longer listed so it must be sold out. I'm sure anybody who purchased the bundled Battlefield 4 card from Newegg when the promotion was on will get the game without any lawsuit.
> 
> Right now at Newegg GIGABYTE and HIS R9 290 offers "Free BattleField 4 Game w/purchase, limited offer".
> Saphire R9 290 also has the "BattleField 4 Game included, limited time offer" for $10 extra and they don't call it free.
> 
> http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=R9+290&N=-1&isNodeId=1


Let's go then and read some reviews over at Newegg:
Quote:


> DO NOT BUY, SEE BELOW!
> 
> Cons: I bought the card after they announced the Battlefield 4 offer but got no Battlefield 4. I think it's totally bogus of a company to advertise this (AMD) and then not provide the software. I purchased it the day the deal was announced and no battlefield 4.


Quote:


> Pros: Amazing card, giving over 80 FPS in battlefield 3(not 4)
> 
> Cons: Registered the card at XFX and then entered the promotional activation code and
> 
> Other Thoughts: I will update when I either get Battlefield 4 from XFX, or return the card.


The one above, however, should be fixed now.

I myself ordered 2 R9 290s from Sapphire. I contacted Sapphire: no code. Contacted the retailer: no code. Left a post in the Q&A: no reply.
I asked Alternate, their partner, to clarify; they told me it was bundled with SKUs that were not even available here.
Then, when those became available, they were more expensive than buying BF4 separately. I mean, come on.

I told a friend, to whom I'm gifting my old 5870, that I was planning to give him one of the 2 BF4 games. Seems like he has to buy it himself now, and so do I.


----------



## maynard14

Two Steps to Obtain Game
STEP 1: Input Personal Activation Code
* If your product is a PSU, please input "001" as your Personal Activation Code
** Click here to learn what a Personal Activation Code is.
STEP 2: Click Obtain to initiate the game code email for the promotion below.
If you have done it previously, you can resend the game email straight away without keying in the Personal Activation Code again.
"Sorry, your product is not subject to the XFX game promotion."

Help... what is the use of the promotion code on my XFX card if I got that message from the XFX site?


----------



## Forceman

Quote:


> Originally Posted by *KyGuy*
> 
> Hey guys, anyone experience a sleep bug? For me, the issue is that my GPU will fail to recover if the screen turns off to go into sleep mode. I changed my monitor to never switch off. This has happened a lot lately. No overclock on the core, but up to 1500 on the mem, tested stable in Unigine for 30 minutes and FurMark for 10 minutes. The screen shows a signal from the card, but no picture...


Yes, I had the same problem. Disabling ULPS in Afterburner seems to have fixed it though.
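For reference (this part is my own assumption, not anything AMD documents): Afterburner's ULPS switch corresponds to per-adapter registry values, so the same change can also be made by hand with a .reg file. A sketch, with the caveat that the "0000" adapter subkey number varies per system:

```reg
Windows Registry Editor Version 5.00

; Example only - the "0000" subkey number differs between systems; apply
; the same change to every 00xx subkey that contains an EnableUlps value.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Reboot afterwards; setting the value back to 1 re-enables ULPS.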


----------



## tsm106

Quote:


> Originally Posted by *skupples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MrWhiteRX7*
> 
> Got into an argument with a guy who was telling me 4K doesn't look any better than 720p... I wanted to slap him in the gums! UGGGGGH, that junk looks incredible! I'm just now stepping up to 1440p!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This must have been on YouTube.
> 
> BTW, from what I've read elsewhere, you can get plenty fine scores out of Valley in CF with custom profiles. Someone linked a 160fps+ run on 4 cards; said all he did was create a custom CF profile.
Click to expand...

Dude, I get 156fps with three cards at stock clocks. Just set up a 1x1 optimize profile. It's Valley 101, but that thread sucks with the mysterious unbanning of the declared cheaters.


----------



## KyGuy

Quote:


> Originally Posted by *Forceman*
> 
> Yes, I had the same problem. Disabling ULPS in Afterburner seems to have fixed it though.


I'll try disabling ULPS. Thank you for the response. Lucky for me, no black screens yet, and I have a Sapphire with the Elpida RAM running at 1500.


----------



## Raxus

Anyone else not seeing VRM temps in GPU-Z?


----------



## KyGuy

Quote:


> Originally Posted by *Forceman*
> 
> Yes, I had the same problem. Disabling ULPS in Afterburner seems to have fixed it though.


ULPS did not work. The voltage reverted, but not the clocks, so I had a very difficult time resetting after the black screen, as the clocks were locked in. "Apply clock at startup" is NOT checked. Weirdest thing ever... Drivers, possibly? CCC?


----------



## KyGuy

Quote:


> Originally Posted by *Raxus*
> 
> Anyone else not seeing VRM temps in GPU-Z?


Mine show up. My version is 0.7.4, according to the top. Seems like they just showed up; I don't remember them being there last time I checked...


----------



## Arizonian

Quote:


> Originally Posted by *steelkevin*
> 
> Count me in
> 
> 
> 
> 
> 
> 
> 
> .
> 
> http://www.techpowerup.com/gpuz/ws29h/
> Sapphire with Hynix Memory
> Non unlockable
> EK-FC R9-290X - Nickel and an EK Backplate
> 74.3% ASIC Quality
> 
> Just wondering what other R9 290 (or 290X) owners are getting temp-wise after playing a bit of BF4, for example, just to check whether or not my temps are normal.
> 
> Thanks in advance


Congrats - added
















Quote:


> Originally Posted by *Gunderman456*
> 
> Arizonian, change me to water!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Will be incorporating them in my new build, "The Hawaiian Heat Wave" (in sig)!


Congrats updated


----------



## Raxus

Quote:


> Originally Posted by *KyGuy*
> 
> Mine show up. My version is 0.7.4, according to the top. Seems like they just showed up; I don't remember them being there last time I checked...


Yeah, same, but no VRM temps, just core.


----------



## Hogesyx

Quote:


> Originally Posted by *Raxus*
> 
> Yeah, same, but no VRM temps, just core.


Can you take a screenshot after you scroll the graph tab all the way down and see what GPU-Z is displaying? My VRM temps are all the way at the bottom.


----------



## ZealotKi11er

So what's the VID for the 290 and 290X?


----------



## Raxus

Quote:


> Originally Posted by *Hogesyx*
> 
> Can you take a screenshot after you scroll the graph tab all the way down and see what GPU-Z is displaying? My VRM temps are all the way at the bottom.


----------



## bloodkil93

Got my R9 290 about a week ago:
http://www.techpowerup.com/gpuz/vcgwg/
Sapphire with Elpida memory
stock cooler (Arctic Accelero Xtreme III in the post)
non-unlockable
76.8% ASIC quality

Please add me. Also, what does ASIC mean? Is higher better?

Thanks!


----------



## ZealotKi11er

Quote:


> Originally Posted by *bloodkil93*
> 
> Got my R9 290 about a week ago:
> http://www.techpowerup.com/gpuz/vcgwg/
> Sapphire with Elpida memory
> stock cooler (Arctic Accelero Xtreme III in the post)
> non-unlockable
> 76.8% ASIC quality
> 
> Please add me. Also, what does ASIC mean? Is higher better?
> 
> Thanks!


The place where you check for it tells you what it means.

I don't think it means much for 290 cards. On the HD 7970 it did make a difference: a lower ASIC tends to be better under cold water and more extreme cooling, while for anything else a higher ASIC is better.


----------



## bloodkil93

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The place where you check for it tells you what it means.
> 
> I don't think it means much for 290 cards. On the HD 7970 it did make a difference: a lower ASIC tends to be better under cold water and more extreme cooling, while for anything else a higher ASIC is better.


What is a good ASIC on the 290s, going by what others have got?


----------



## Gunderman456

I see people are still having issues with CCC and the new AMD beta 9.4. Yes, AMD omitted CCC; however, you can re-enable it by opening regedit, browsing to HKEY_CURRENT_USER\Software\ATI\ACE\Settings\Runtime\Graphics\OverDrive5, and changing the value of "OverclockEnabled_DEF" to True. Reboot the computer and you should have CCC back!
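If the driver install wipes it again, the same change can be kept as a .reg file and re-applied in one double-click. This just captures the path and value from above (assuming the value is a string, as "True" suggests):

```reg
Windows Registry Editor Version 5.00

; Re-enables CCC OverDrive after the beta 9.4 install, per the registry
; path and value described above. Reboot after merging.
[HKEY_CURRENT_USER\Software\ATI\ACE\Settings\Runtime\Graphics\OverDrive5]
"OverclockEnabled_DEF"="True"
```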


----------



## evensen007

Quote:


> Originally Posted by *bloodkil93*
> 
> What is a good ASIC on the 290s, going by what others have got?


ASIC doesn't mean squat in the 290 generation; just ask jomomma. He benched 8 of them, and ASIC didn't correlate at all with the overclocks he was able to achieve. I have 74.6 and 74.2 for my 290s.


----------



## bloodkil93

Quote:


> Originally Posted by *evensen007*
> 
> ASIC doesn't mean squat in the 290 generation, just ask jomomma. He benched 8 of them, and ASIC didn't correlate at all to the overclocks he was able to achieve. I have 74.6 and 74.2 for my 290's.


OK, I hope I can get a decent overclock on my 290 with the Arctic Accelero Xtreme III on it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *bloodkil93*
> 
> Ok, hope I can get a decent overclock on my 290 with the Arctic Accelero Xtreme III on it


Make sure to keep those VRMs cool.


----------



## bloodkil93

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Make sure to keep those VRMs cool.


Does that cooler handle that, or will they need something extra?


----------



## Forceman

It makes no sense to me that I can run Heaven, Valley, 3DMark, Metro, Batman, whatever all day long at 1200/1450, but BF4 gives me a RSOD at 1100/1325. Is BF4 really that much more stressful than any other game, or are they using some code path that is particularly sensitive to overclocks for some reason? Wonder what Mantle will bring.


----------



## bloodkil93

Quote:


> Originally Posted by *Forceman*
> 
> It makes no sense to me that I can run Heaven, Valley, 3DMark, Metro, Batman, whatever all day long at 1200/1450, but BF4 gives me a RSOD at 1100/1325. Is BF4 really that much more stressful than any other game, or are they using some code path that is particularly sensitive to overclocks for some reason? Wonder what Mantle will bring.


Are you using Hynix or Elpida memory, out of curiosity?


----------



## Kelwing

Well, after seeing that these heat sinks didn't work all too well:
http://s222.photobucket.com/user/mi.../2013-11-25_16-54-51_110_zps633bef63.jpg.html

I tried adding a larger one with more surface area. Tried this combo and it seemed to work, but I wasn't pleased.
http://s222.photobucket.com/user/mi.../2013-11-25_16-55-26_871_zps903eba7e.jpg.html

Busted out the Dremel and modified the bottom to give it some clearance over a few components underneath.
http://s222.photobucket.com/user/mi.../2013-11-25_17-05-28_236_zpsd5e12e39.jpg.html

http://s222.photobucket.com/user/mi.../2013-11-25_19-45-27_399_zps8c337674.jpg.html

So far so good with this heat sink and the beta 9.4 drivers. Next step is water.


----------



## bloodkil93

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats updated


Just in case you didn't see my post on the previous page


----------



## Heinz68

Quote:


> Originally Posted by *maarten12100*
> 
> Let's go then and read some reviews over at Newegg:
> 
> The one above, however, should be fixed now.
> 
> I myself ordered 2 R9 290s from Sapphire. I contacted Sapphire: no code. Contacted the retailer: no code. Left a post in the Q&A: no reply.
> I asked Alternate, their partner, to clarify; they told me it was bundled with SKUs that were not even available here.
> Then, when those became available, they were more expensive than buying BF4 separately. I mean, come on.
> 
> I told a friend, to whom I'm gifting my old 5870, that I was planning to give him one of the 2 BF4 games. Seems like he has to buy it himself now, and so do I.


You can't blame AMD for NL retailers' sales policies; they don't have any control over that.

Now, there was some miscommunication about the bundle with some hardware tech sites. In this article you can see AnandTech posted an update the same day the original article was published, after they checked with AMD, and that was even before the bundle was listed on AMD's web site.
EDIT
I believe the people who have a valid promotional activation code will get the game.

It's strange that somebody expected a free game when it was not yet available at retail; I'm sure they knew what they were paying for. I guess some people are just trying to get the free game retroactively, but sales promotions do not work like that.


----------



## Forceman

Quote:


> Originally Posted by *bloodkil93*
> 
> Are you using Hynix or Elpida memory, out of curiosity?


Hynix. Actually, I just got an RSOD at stock, so I guess it isn't the overclock at all. Maybe a BF4 problem?

I've tried the Beta 8, Beta 9.2, and WHQL drivers, same problem on all three. Very annoying.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> Hynix. Actually just got a RSOD at stock, so I guess it isn't the overclock at all. Maybe a BF4 problem?


I have not played much BF4, but so far I have not had any problems with the card @ stock.


----------



## xxmastermindxx

How many of you guys are using the command line + Afterburner for voltage, and is it useful?


----------



## Durquavian

I used that Memory Info program and got the memory on my 7770s, which are Elpida, but that wasn't the cool part. I entered the memory name, EDW2032BBBG, into a search and got the spec sheet for them. A lot of info; cool for knowledge. I don't know what I can do with it, but still cool.


----------



## Joeking78

Quote:


> Originally Posted by *Forceman*
> 
> Hynix. Actually just got a RSOD at stock, so I guess it isn't the overclock at all. Maybe a BF4 problem?
> 
> I've tried the Beta 8, Beta 9.2, and WHQL drivers, same problem on all three. Very annoying.


Have you tried running with no overclocks? I had the RSOD ages ago in BF4; I returned to stock clocks on the CPU & GPU and all good since.


----------



## bloodkil93

Quote:


> Originally Posted by *Forceman*
> 
> Hynix. Actually just got a RSOD at stock, so I guess it isn't the overclock at all. Maybe a BF4 problem?
> 
> I've tried the Beta 8, Beta 9.2, and WHQL drivers, same problem on all three. Very annoying.


Mine uses Elpida and is fine if the memory isn't overclocked; anything above 1325 causes issues, and from what I've seen, overclocking the memory makes almost no difference. I get 1120 on the core with stock voltage, and probably even higher with an overvolt, but I've got to wait for my Arctic Accelero Xtreme III to get better temps for that.

Edit: The GPU clock overclocks fine by itself and I get no black or red screen issues, but as soon as the memory is overclocked, BAM! Basically, I'll take the GPU as high as it's capable of, but just leave the memory at stock.


----------



## Forceman

Quote:


> Originally Posted by *Joeking78*
> 
> Have you tried running with no overclocks? I had the RSOD ages ago in BF4; I returned to stock clocks on the CPU & GPU and all good since.


Yeah, just got one at stock (first time though).
Quote:


> Originally Posted by *bloodkil93*
> 
> Mine uses Elpida and is fine if the memory isn't overclocked; anything above 1325 causes issues, and from what I've seen, overclocking the memory makes almost no difference. I get 1120 on the core with stock voltage, and probably even higher with an overvolt, but I've got to wait for my Arctic Accelero Xtreme III to get better temps for that.


Yeah, it's strange: I got them all the time at 1350, but 1325 was fine until today, when I got 2 at 1325 and then one more at stock. So either my card is degrading or there is something else going on that is causing them.

Anyone know what the "ACP Application" is in the new beta drivers?


----------



## Hogesyx

Check this out guys, I am holding back my Arctic for this.

http://www.techpowerup.com/195042/nzxt-releases-the-kraken-g10-liquid-cooling-gpu-bracket.html


----------



## Joeking78

Quote:


> Originally Posted by *Forceman*
> 
> Yeah, just got one at stock (first time though).
> Yeah, it's strange: I got them all the time at 1350, but 1325 was fine until today, when I got 2 at 1325 and then one more at stock. So either my card is degrading or there is something else going on that is causing them.
> 
> Anyone know what the "ACP Application" is in the new beta drivers?


http://en.m.wikipedia.org/wiki/Average_CPU_power

Mantle related?


----------



## bloodkil93

Also, does anyone know how well the Arctic Accelero Xtreme III cools the VRMs?


----------



## Raxus

Quote:


> Originally Posted by *Raxus*


Anyone have any clue why I have no VRM temps?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Raxus*
> 
> Anyone have any clue why I have no VRM temps?


Maybe the drivers are not installed correctly?

Have you tried other apps, like HWMonitor?


----------



## Jpmboy

Quote:


> Originally Posted by *Gunderman456*
> 
> I see people are still having issues with CCC and the new AMD beta 9.4. Yes, AMD omitted CCC; however, you can re-enable it by opening regedit, browsing to HKEY_CURRENT_USER\Software\ATI\ACE\Settings\Runtime\Graphics\OverDrive5, and changing the value of "OverclockEnabled_DEF" to True. Reboot the computer and you should have CCC back!


Scrubbed 9.4 just for this reason, even though I use GPU Tweak on this ASUS-flashed Sapphire 290X. Will try this as soon as I reload it. Dropped back to 13.11 WHQL... which is working fine. No black screen after an hour of BF4 at 1200/6000, 1.412V (~1.3 actual). But I have not had the black screen problem at all (lucky, I guess). [fingers crossed]


----------



## Raxus

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Maybe the drivers are not installed correctly?
> 
> Have you tried other apps, like HWMonitor?


HWiNFO appears to show them.


----------



## Jpmboy

Which driver are you using?

gpuZ.png 767k .png file


----------



## djsatane

Quote:


> Originally Posted by *Raxus*
> 
> Anyone have any clue why I have no VRM temps?


Scroll down?


----------



## djsatane

Quote:


> Originally Posted by *Hogesyx*
> 
> Check this out guys, I am holding back my Arctic for this.
> 
> http://www.techpowerup.com/195042/nzxt-releases-the-kraken-g10-liquid-cooling-gpu-bracket.html


In those pictures, is that the Kraken G10 combined with the Kraken X40 performance cooler? Would anything else be required? Would you even bother installing heatsinks on the VRAM chips with that?


----------



## Rnglols

Thinking of getting R9 290s instead of 780s...


----------



## Raxus

Quote:


> Originally Posted by *djsatane*
> 
> Scroll down?


That's the bottom.

Using the 9.4 drivers.


----------



## djsatane

Not sure if anyone linked this before, but this is quite insane:

Retail Radeon R9 290X Frequency Variance Issues Still Present (Sapphire vs. reference): http://www.pcper.com/reviews/Graphics-Cards/Retail-Radeon-R9-290X-Frequency-Variance-Issues-Still-Present


----------



## COMBO2

Well, I got a replacement R9 290X today from my retailer. They actually got another brand new card and tested it for coil whine for me. In the end, the one I've got now is about half as bad, but still a bit noticeable. The retailer was saying that they've only had a few cards on the stock cooler that actually haven't had coil whine. The guy also said that if I choose to put my waterblock on this card and remove the warranty stickers, I'll be stuck with it, so I have to make up my mind before doing so.

They actually instructed me to watercool the last card (the one with the bad whine) so they could hear the coil whine without the stock cooler. I told them I had put a block on it and tested it, but actually I just removed the stickers to eliminate suspicion (this was because they were pretty confident that doing so would solve the issue, but I begged to differ and didn't want to use a brand new waterblock before I had to). They are now saying that because I used the waterblock (wink wink), they can give me a credit for the R9 but not the waterblock. If I decide to take the credit, I might just tell them I borrowed a mate's block and that mine is still in new condition (it is; all I've done is attach two compression fittings).

We'll see, but I might sadly end up with a GTX 780 if this can't be fixed...


----------



## djsatane

COMBO2, was the loud coil whine your only problem with the first card? Also, did you check whether the memory chips are from the same manufacturer as on your previous card?


----------



## Raxus

1400 memory clock, no artifacting in Fire Strike or Unigine here.

Edited: a 5 snuck in there lol


----------



## the9quad

Quote:


> Originally Posted by *COMBO2*
> 
> Well, I got a replacement R9 290X today from my retailer. They actually got another brand new card and tested it for coil whine for me. In the end, the one I've got now is about half as bad, but still a bit noticeable. The retailer was saying that they've only had a few cards on the stock cooler that actually haven't had coil whine. The guy also said that if I choose to put my waterblock on this card and remove the warranty stickers, that I'll be stuck with it so I have to make up my mind before doing so.
> 
> They actually instructed me to watercool the last card (with the bad whine), so they could see the coil whine without the stock cooler. I told them I had put a block on it and tested it, but actually I just removed the stickers to eliminate suspicion (this was because they were pretty confident that doing would solve the issue, but I begged to differ and didn't want to use a brand new waterblock before I had to). They are now saying that because I used the waterblock (wink wink), they can give me a credit for the R9 but not the waterblock. If I decide to take the credit, I might just tell them I borrowed a mates block and that mine is still in new condition (it is, all I've done is attach two compression fittings).
> 
> We'll see but I might end up with a GTX 780 sadly if this can't be fixed...


Just google "GTX 780 coil whine". I'm pretty sure it's a phenomenon not specific to AMD. I hope you get lucky whichever way you go and get one with an acceptable level of whine.


----------



## Hogesyx

Quote:


> Originally Posted by *djsatane*
> 
> In those pictures, is that the Kraken G10 combined with the Kraken X40 performance cooler? Would anything else be required? Would you even bother installing heatsinks on the VRAM chips with that?


I'll still use Sekisui 5760 to tape the heatsinks to the RAM and VRM, of course. Still thinking about which AIO to use, as the bracket supports almost all the brands out there with a round waterblock.


----------



## bloodkil93

Guys, I was wondering: since I'm getting an Arctic Accelero Xtreme III, would I need heatsinks on the VRAM? The kit comes with 12 and I need 16.

The memory isn't going to be overclocked, as that's been shown to be unnecessary.


----------



## Hogesyx

Quote:


> Originally Posted by *bloodkil93*
> 
> Guys, I was wondering: since I'm getting an Arctic Accelero Xtreme III, would I need heatsinks on the VRAM? The kit comes with 12 and I need 16.
> 
> The memory isn't going to be overclocked, as that's been shown to be unnecessary.


You would still need 4 more, and 1 of them should be a low-profile heatsink.


----------



## Raxus

So I decided to give my card a run at 1175 core and 1425 mem on air; the fan got loud as hell (still quieter than my H80). This is with a 4.1GHz clock on my 3770K.

Not too shabby?


----------



## skupples

Quote:


> Originally Posted by *tsm106*
> 
> Dude, I get 156fps with three cards at stock clocks. Just set up a 1x1 optimize profile. It's Valley 101, but that thread sucks with the mysterious unbanning of the declared cheaters.


Then maybe it was your post I'm thinking of, & it was three cards!
Quote:


> Originally Posted by *bloodkil93*
> 
> Guys, I was wondering: since I'm getting an Arctic Accelero Xtreme III, would I need heatsinks on the VRAM? The kit comes with 12 and I need 16.
> 
> The memory isn't going to be overclocked, as that's been shown to be unnecessary.


OC'ing memory only really comes into play in two situations: benchmarking & multi-monitor.


----------



## Ukkooh

Why leave your 3770k at 4.1?


----------



## COMBO2

Quote:


> Originally Posted by *djsatane*
> 
> COMBO2, was loud coil whine your only problem with the first card? Also did you check whether memory chips are same manufacturer as your previous card?


I haven't checked the mem chips on either, and yeah, it was my only problem. I actually love this card, but the coil whine is quite irritating, and I don't want to feel like I spent $800 for an irritating experience...
Quote:


> Originally Posted by *the9quad*
> 
> Just google "GTX 780 coil whine". I'm pretty sure it's a phenomenon not specific to AMD. I hope you get lucky whichever way you go and get one with an acceptable level of whine.


I would try to get them to give me another card, but I honestly don't think they will, and it may well be worse than the current one. I'll give it thought over the next few days, but hopefully I can get an R9 that doesn't have issues.

I guess I should be thankful that I'm not getting black/red screens haha.


----------



## Raxus

Quote:


> Originally Posted by *Ukkooh*
> 
> Why leave your 3770k at 4.1?


I haven't messed with overclocking it too much; had it at 4.8 earlier today but wanted to poke around with the card before stressing the CPU.


----------



## Arizonian

Quote:


> Originally Posted by *bloodkil93*
> 
> Got my R9 290 about a week ago:
> http://www.techpowerup.com/gpuz/vcgwg/
> Sapphire with Elpida memory
> stock cooler (Arctic Accelero Xtreme III in the post)
> non-unlockable
> 76.8% ASIC quality
> 
> Please add me. Also, what does ASIC mean? Is higher better?
> 
> Thanks!


Congrats - added


----------



## Hogesyx

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added


Come to think of it, I am not added yet. Lol.

Sapphire 290X BF4 edition / Hynix, 70.7% ASIC
Stock cooler (waiting for thermal double-sided tape and heatsinks to arrive for the RED mod).


----------



## Arizonian

Quote:


> Originally Posted by *Hogesyx*
> 
> Come to think of it, I am not added yet. Lol.
> 
> Sapphire 290X BF4 edition / Hynix, 70.7% ASIC
> Stock cooler (waiting for thermal double-sided tape and heatsinks to arrive for the RED mod).


I need something more, as the OP states.

******* To be added on the member list please submit the following in your post *******
1. GPU-Z Link *or* Screen shot with OCN Name
2. Manufacturer & Brand
3. Cooling - Stock, Aftermarket or 3rd Party Water


----------



## aznever

Quote:


> Originally Posted by *Raxus*
> 
> So I decided to give my card a run at 1175 core and 1425 mem on air; the fan got loud as hell (still quieter than my H80). This is with a 4.1GHz clock on my 3770K.
> 
> Not too shabby?


My GTX 780 clocked @ 1280MHz/1670MHz mem, with an i7 2700K @ 4.0GHz, can't even beat your graphics score...


----------



## Hogesyx

Quote:


> Originally Posted by *Arizonian*
> 
> I need something more as per OP states.
> 
> ******* To be added on the member list please submit the following in your post *******
> 1. GPU-Z Link *or* Screen shot with OCN Name
> 2. Manufacturer & Brand
> 3. Cooling - Stock, Aftermarket or 3rd Party Water


Got it, missed that part. Will request again once I've got my GPU-Z uploaded when I'm back home.


----------



## mboner1

Quote:


> Originally Posted by *aznever*
> 
> My GTX 780 clocked @ 1280MHz/1670MHz mem, with an i7 2700K @ 4.0GHz, can't even beat your graphics score...


My R9 290 clocked at 1120 got 16,200 for the graphics score. I thought a higher overclock would mean a higher score, so why is mine so high for such a low overclock?


----------



## Forceman

Quote:


> Originally Posted by *mboner1*
> 
> My R9 290 clocked at 1120 got 16,200 for the graphics score. I thought a higher overclock would mean a higher score, so why is mine so high for such a low overclock?


Different test?


----------



## mboner1

Quote:


> Originally Posted by *Forceman*
> 
> Different test?


Well, I don't know; I thought it was the same. Here it is; I just did the only test available in the free version.

http://www.3dmark.com/3dm11/7559918


----------



## Hogesyx

Quote:


> Originally Posted by *mboner1*
> 
> Well i don't know, i thought it was the same, here it is here, i just did the only test available in the free version.
> 
> http://www.3dmark.com/3dm11/7559918


That's 3DMark 11; they are comparing 3DMark 1.0 (2013). =D


----------



## mboner1

Quote:


> Originally Posted by *Hogesyx*
> 
> That's 3DMARK 11, they are comparing 3DMARK 1.0 (2013). =D


Cheers dude, just figured that out lol. Downloading 3DMark 1.1.0 now. Thought I had some freakishly good 290 lol.


----------



## bloodkil93

http://www.scan.co.uk/products/akasa-vga-rma-heatsinks-oem-package-10pcs

Are these heatsinks good enough to cool the VRAM with an Arctic Accelero Xtreme III?


----------



## zpaf

R9 290 (non-X) vs. GTX 780 Ti in GRID 2 at ultra settings with 8x MSAA at 5760x1080.


----------



## iPDrop

Has anyone here been using the 9.4 beta drivers? I have been getting a lot of buggy, glitchy performance ever since I installed them. What's the best driver version so far? Two R9 290s here.


----------



## Hogesyx

Quote:


> Originally Posted by *Arizonian*
> 
> I need something more as per OP states.
> 
> ******* To be added on the member list please submit the following in your post *******
> 1. GPU-Z Link *or* Screen shot with OCN Name
> 2. Manufacturer & Brand
> 3. Cooling - Stock, Aftermarket or 3rd Party Water


1) http://www.techpowerup.com/gpuz/7y84s/
2) Sapphire 290X BF4 edition
3) Stock cooler (waiting for thermal double-sided tape and heatsinks to arrive for the RED mod).


----------



## mboner1

Well, my score sucks then...

http://www.3dmark.com/3dm/1723446


----------



## Arizonian

Quote:


> Originally Posted by *Hogesyx*
> 
> 1) http://www.techpowerup.com/gpuz/7y84s/
> 2) Sapphire 290X BF4 edition
> 3) Stock cooler (waiting for thermal double-sided tape and heatsinks to arrive for the RED mod).


Congrats - added









Nice. Take some pics of the mod.


----------



## bloodkil93

Are those heatsinks that I posted above OK for the VRAM?


----------



## Hogesyx

Quote:


> Originally Posted by *mboner1*
> 
> Well my score scks then...
> 
> http://www.3dmark.com/3dm/1723446


I think it's fine; you just need to start overclocking it! This is your result compared to my 290 (the one I returned):

http://www.3dmark.com/compare/fs/1215024/fs/1172692

Don't ask me why it reported only 3GB; I've got no idea. This is one of the few reasons why I returned the 290.
Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nice. Take some pics of the mod.


Thanks! Sure will.


----------



## mboner1

Quote:


> Originally Posted by *Hogesyx*
> 
> I think its fine, just need to start overclocking it! This is a your result compare to my 290(the one i returned)
> 
> http://www.3dmark.com/compare/fs/1215024/fs/1172692
> 
> Don't ask me why it reported only 3GB, I got no idea. This is one of the few reason why I return the 290.
> Thanks! Sure will.


Arggh, didn't even realise my overclock wasn't applied, because I restarted and didn't set it to apply at startup after reinstalling. Will give it another run with a conservative overclock.


----------



## Hogesyx

Quote:


> Originally Posted by *bloodkil93*
> 
> Are those Heatsinks that I posted above ok for VRAM?


From what I understand, the R9's memory does not require a lot of cooling, but you might want to either modify the leftover heatsinks from your Xtreme III package or buy something that looks the same, for aesthetic reasons. lol

http://www.arctic.ac/worldwide_en/products/cooling/vga/accelero-xtreme-iii.html

There are some documents here; I believe they include height clearance info as well.


----------



## mboner1

Meh. Slightly better, but no world beater. I'll stop with the benchmarks now and just enjoy it, I think. It runs cooler and quieter than my previous 7970, at least.

http://www.3dmark.com/3dm/1723539


----------



## Ukkooh

Anyone else experiencing driver crashes when launching GPU-Z with the beta 9.4 drivers?


----------



## kdawgmaster

Quote:


> Originally Posted by *Ukkooh*
> 
> Anyone else experiencing driver crashes when launching gpu-z with beta v9.4 drivers?


It just opened up on my screen, no issues. Have you tried reinstalling GPU-Z?

Quote:


> Originally Posted by *Hogesyx*
> 
> Based on what I understand R9 memory does not requires alot of cooling, but you might want to either modify the left over heatsink from your xtreme iii package or buy something that looks the same. aesthetic reason. lol
> 
> http://www.arctic.ac/worldwide_en/products/cooling/vga/accelero-xtreme-iii.html
> 
> Some documents here, I believe it has height clearance info as well.


The problem with that cooler is it uses adhesive to hold the VRAM heatsinks in place. That could potentially cost you the manufacturer's warranty, as it's apparently a pain in the ass to remove. I'm personally waiting for HIS to make their aftermarket cooler to see if I can't buy 3 of them for my cards. Other than that, I can get the cards to hit a max of 80C with the fan profiles I made in MSI Afterburner.


----------



## smokedawg

Quote:


> Originally Posted by *Ukkooh*
> 
> Anyone else experiencing driver crashes when launching gpu-z with beta v9.4 drivers?


Yes. I had 2 black screens after one day of using beta 9.4 - once while switching GPU Tweak profiles (with GPU-Z running) and once just browsing OCN. Never happened before on WHQL or 9.2. Might have been a botched install though, since Overdrive was still enabled for me on 9.4. Switched back to WHQL and no problems so far.


----------



## zpaf

My card needs a waterblock to finish heaven on these clocks.


----------



## Hogesyx

Quote:


> Originally Posted by *kdawgmaster*
> 
> Just opened fine on my screen, no issues. Have you tried reinstalling GPU-Z?
> The problem with that cooler is that it uses adhesive to hold the VRAM heatsinks in place. This can potentially void the manufacturer's warranty, as it's apparently a pain in the ass to remove. I'm personally waiting for HIS to make their aftermarket cooler and will see if I can't buy 3 of them for my cards. Other than that, I can keep the cards at a max of 80C with the fan profiles I made in MSI Afterburner.


Sekisui 5760 and 3M 8815 thermal tape to the rescue! Their thermal performance won't be as good as Arctic Silver adhesive, but it's passable for VRAM and MOSFETs/VRMs.








Quote:


> Originally Posted by *zpaf*
> 
> My card needs a waterblock to finish heaven on these clocks.


What's your voltage?


----------



## zpaf

Quote:


> Originally Posted by *Hogesyx*
> 
> What's your voltage?


At max with GPU Tweak and the official BIOS, ~1.410 V.


----------



## Hogesyx

Quote:


> Originally Posted by *zpaf*
> 
> At max with gpu tweak and official bios ~1.410v










I will never have the guts to go past 1.3 V lol


----------



## maynard14

pls add me sir r9 290

1) 
2) XFX R9 290
3) Stock cooler

thanks


----------



## zpaf

Quote:


> Originally Posted by *Hogesyx*
> 
> 
> 
> 
> 
> 
> 
> 
> I will never have the guts to go past 1.3 V lol


My card is on air







and it's cold here in Greece.



With the fan at 100% and ambient < 20C it's OK.
This is from another try, with core temperature at 70C and VRM at ...


----------



## iPDrop

Is anyone else having problems with the new beta drivers 13.11 9.4 ?


----------



## chrisf4lc0n

Quote:


> Originally Posted by *iPDrop*
> 
> Is anyone else having problems with the new beta drivers 13.11 9.4 ?


Yup, black screens and no Overdrive; I went back to WHQL...

On top of that, I just flashed the Asus BIOS, but I cannot achieve a stable OC above 1200 MHz at 1.3 V. No matter how much I push the voltage (1.412 V max), I get artifacts and it eventually crashes.







I have given up on overclocking my VRAM, as any bump in clock makes it unstable








Card Sapphire R9 290X with EK waterblock and Asus BIOS...


----------



## hatlesschimp

Played ARMA 3 for a total of 30 hours in the past 4 days and not one issue from my CF R9 290X cards and 3-screen Eyefinity.
I'm running the latest drivers.


----------



## X-oiL

I'm having a small issue with my screen (Eizo foris fg2421). It's "restarting" by itself about twice a day, first it goes all black and after a few seconds the eizo logo shows together with the selected input source up to the right.

I've tried reinstalling the graphics drivers, but it didn't help. Also checked that the cables are connected correctly.

Anyone else having this problem, or found a solution to it? Or could my graphics card be the problem?

I'm using DP.

Cheers


----------



## evensen007

Quote:


> Originally Posted by *iPDrop*
> 
> Is anyone else having problems with the new beta drivers 13.11 9.4 ?


My performance dropped through the floor and got very glitchy on the 9.4 release coming from the 9.2 release. I am going to try the 13.11 WHQL drivers. Hopefully this will be a happy middle ground between no black screens and decent performance.


----------



## smonkie

Do those drops in VDDC seem OK to you?


----------



## Jpmboy

Quote:


> Originally Posted by *chrisf4lc0n*
> 
> Yup, black screens no overdrive, I went back to WHQL...
> 
> Above that I just flashed the Asus BIOS, but I cannot achieve stable OC above 1200MHz and 1.3V, no matter how much I push the voltage (1.412V max) I get artifacts and it eventually crashes
> 
> 
> 
> 
> 
> 
> 
> I have given up on overclocking my VRAM, as any bump in clock makes it unstable
> 
> 
> 
> 
> 
> 
> 
> 
> Card Sapphire R9 290X with EK waterblock and Asus BIOS...


My 290X is set up the same way. Max stable OC with 13.11 WHQL is 1260/6700 for benching; gaming (BF4) it holds 1200/6250 fine. Make sure your VRMs are staying below 60C. When you mounted the EK block, did you use any TIM between the pads and surfaces... or just those pads alone?

*Does ANYONE know if there is a way in the Catalyst drivers to make custom resolutions (pixel clocks etc.) - like with NVCP?*


----------



## Hogesyx

http://imgur.com/juOjmjN


1160 is the max stable I could do with +100mv and stock cooler. Can't wait to replace the stock cooler when my stuff arrives.


----------



## Bloitz

Quote:


> Originally Posted by *Arizonian*
> 
> I was wondering how I passed you. Found that *POST*.
> 
> I was expecting an entry and it was an invoice. I think I thought you'd post when it arrived.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Per OP I was looking for following:
> 1. GPU-Z Link or Screen shot with OCN Name
> 2. Manufacturer & Brand
> 3. Cooling - Stock, Aftermarket or 3rd Party Water
> 
> No worries, as a lot of members seem to be missing this and I've been able to catch it. You were slotted into the roster list where you were at your original submission.
> 
> A belated congrats - added


You never added me either









*Brand:* Gigabyte - No warranty sticker on any screws
*Memory:* Hynix
*Cooling:* Water (EK-FC R9-290X Acetal-Nickel)
*Unlockable?* Haven't tried, but I doubt it
*ASIC:* 79.8
Clocks @ stock volts (hovers around 1.17-1.18 V during benches according to GPU-Z and afterburner) and +20% powertune:
1180 core (could do 1185 but I prefer round numbers) / 1350 memory (haven't fiddled a lot with it, though). 3DMark and Valley stable with 0 artifacts.

All in all, quite pleased. I haven't always been the luckiest one when it comes to good clocking chips but I can't complain about this one











Spoiler: GPU-Z screen name ASIC









Spoiler: Original post



Quote:


> Originally Posted by *Bloitz*
> 
> Hey Mr. Postman, you got a package for me?


----------



## smokedawg

Quote:


> Originally Posted by *Jpmboy*
> 
> *ANYONE know if there is a way in the cat drivers to make custom resolutions (pixel clocks etc) - like with NVCP?*


Is AMD pixel clock patcher + Custom Resolution Utility what you are looking for?


----------



## DCRussian

Add me to the list as well.

I have a PowerColor 290 that's been flashed to 290X, running with stock cooling at the moment.

Benches pre-flash/post-flash *here*.

Stock Flashed


----------



## Jpmboy

Quote:


> Originally Posted by *smokedawg*
> 
> Is AMD pixel clock patcher + Custom Resolution Utility what you are looking for?











thanks! +1


----------



## dade_kash_xD

Would a 360 mm Alphacool Monsta radiator with a 140 ml reservoir and a single D5 pump be enough to cool an i7-4770K and 2x R9 290?


----------



## Topsu

Anyone have any results about overclocking r9 290 or r9 290x with Asus unlocked voltage + watercooler?


----------



## evensen007

Quote:


> Originally Posted by *dade_kash_xD*
> 
> Would a 360mm alphacool monsta radiator with a 140ml reservoir w a single D5 pump be enough to cool an i7-4770k and 2x r9-290?


I have a Swiftech 320 and a 120 rad cooling my 2600K @ 4.8 and my 2x 290s at stock. I can tell you that my setup is RIGHT at the edge of what I would call acceptable. These 290s put out a TON of heat.


----------



## GaurabBrah

Quote:


> Originally Posted by *SASDF*
> 
> I was replying to the sapphire rep bro, he said we should assume every company was like this, thus i linked the xfx post...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It should be common knowledge to read words around other words, by doing so you gain something called "context".


I was responding in general but I'll be sure not to reply to you again.


----------



## jassilamba

Finally ordered my 290s yesterday thanks to the Newegg 5% coupon. Now going to wait for the aquacomputer blocks to hit the US shores.


----------



## SultanOfWalmart

Quote:


> Originally Posted by *evensen007*
> 
> I have a swiftech 320 and 120 RAD cooling my 2600k @4.8 and my 2x 290's at stock. I can tell you that my setup is RIGHT at the edge of what I would call acceptable. These 290's put out a TON of heat.


What would you say should be sufficient to cool a single 290 along with an i7 [email protected] (still can't bring myself to replace it)? My case is a Graphite 600T, so there's not a lot of room - not enough for a 320 without some hacking. I was thinking a 240 up top and maybe a 120 in the front... any ideas?


----------



## Jpmboy

Quote:


> Originally Posted by *dade_kash_xD*
> 
> Would a 360mm alphacool monsta radiator with a 140ml reservoir w a single D5 pump be enough to cool an i7-4770k and 2x r9-290?


Close... with push-pull fans on high, it's probably good. If you overvolt and overclock everything, you'll do better with more rad area.


----------



## evensen007

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> What would you say should be sufficient to cool a single 290 along with an i7 [email protected] (still cant bring myself to replace it). My case is a Graphite 600T so there's not a lot of room, not enough for a 320 without some hacking. I was thinking 240 up top, and maybe a 120 in the front...any ideas?


320/340/360 Radiator would be sufficient for that setup (or 240 and 120 like you plan).


----------



## voldomazta

Regarding the EK acetal block, which do you prefer, the copper or the nickel-coated one?


----------



## evensen007

Quote:


> Originally Posted by *voldomazta*
> 
> Regarding the EK acetal block, which do you prefer, the copper or the nickel-coated one?


There really is zero difference in cooling between them. In the past, I would have said grab the copper because of all the nickel-plating issues EK had, but they fixed that a while back. Get whichever one you like better.


----------



## smokedawg

Quote:


> Originally Posted by *dade_kash_xD*
> 
> Would a 360mm alphacool monsta radiator with a 140ml reservoir w a single D5 pump be enough to cool an i7-4770k and 2x r9-290?


I am no expert on this, but I have read you should plan on one 120 mm rad section per 150 W. A stock 290 can draw up to ~400 W (Furmark) as I recall, so 2x 290 would require 5-6x 120 mm of rad area. The 4770K can draw another ~170 W under load (Cinebench), so add another 120 to that. Adjust depending on overclock. Someone correct me if I am totally wrong.
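The guideline above is simple enough to sanity-check in a few throwaway lines. This is only a sketch of the arithmetic: the 150 W-per-120 mm figure and the per-component wattages are the ballpark numbers quoted in this thread, not measurements.

```python
# Rough radiator sizing from the "one 120 mm rad section per 150 W" guideline.
# All wattages are the ballpark figures quoted in the thread, not measured values.
WATTS_PER_120MM_SECTION = 150

def sections_needed(total_watts, watts_per_section=WATTS_PER_120MM_SECTION):
    """Number of 120 mm radiator sections for a given heat load (rounded up)."""
    return -(-total_watts // watts_per_section)  # ceiling division

heat_load_watts = {
    "R9 290 #1 (Furmark worst case)": 400,
    "R9 290 #2 (Furmark worst case)": 400,
    "4770K (Cinebench load)": 170,
}
total = sum(heat_load_watts.values())
print(total, sections_needed(total))  # prints: 970 7
```

So a 2x 290 + 4770K loop lands around 6-7 sections of 120 mm at worst-case load by this rule, matching the estimate above; gaming loads are lower, so a single 360 covers roughly half of that.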


----------



## jerrolds

Quote:


> Originally Posted by *Ukkooh*
> 
> Anyone else experiencing driver crashes when launching gpu-z with beta v9.4 drivers?


Yea, that happens to me about 1 in 10 times. It has happened with all drivers since the 290X release... except I don't think it has happened with the newest v9.4.


----------



## dade_kash_xD

What if I keep my h100i on the cpu and bottom mount an alphacool monsta 240 for the gpus alone?


----------



## Jpmboy

Quote:


> Originally Posted by *smokedawg*
> 
> I am no expert on this but I have read you should plan 1x120mm rad per 150watts. A stock 290 can draw up to ~400w (furmark) as I recall so 2x290 would require 5-6 x120rad area. The 4770k can draw another ~170w under load (cinebench) so add another 120rad to that. Adjust depending on overclock. Someone correct me if I am totally wrong.


Yes, this is about right in order to keep coolant temp under control. The numbers are based on the thinner rad form factor. Two old-school 360 rads and 2 L of coolant keep my OC'd 7970s and 2700K under 40C after hours of gaming - by then the 6 fans are at max 1200 rpm. The single fat 360 (kit in parkbench) with 3 Cougar Vortex fans (and only ~1 L of coolant) will exceed 40C coolant temp with 2 Titans; I added a Black Ice fat 2x90mm - no big help. Then went full bore and looped in a 1680 GiGant... (now ~3 L coolant) liquid temp never above 30C (or +10 over ambient).


----------



## mboner1

I have had 2 black screens since getting the card yesterday. Once when I launched BF4 and once when I launched Crysis 3 - both crashed when joining multiplayer, whatever that means, and both times required a hard reset.


----------



## Raxus

Quote:


> Originally Posted by *mboner1*
> 
> I have had 2 black screens since getting the card yesterday. Once when i launched bf4 and once when i launched crysis 3, both crashed when joiing multiplayer.. whatever that means, and both times required a hard reset.. .


My sapphire r9 290x blackscreened like crazy at stock clocks

My MSI R9 290X has yet to black screen once at 1175/1425.

I RMAd my sapphire to newegg, glad I did.


----------



## surrealalucard

Is anyone getting screen freezes for about a minute? I am using Chrome, and that seems to be when it usually happens: if I am playing a game and Chrome is still up as well, my screen will freeze and I can't do anything for about a minute, and then it starts working again.


----------



## Arizonian

Quote:


> Originally Posted by *maynard14*
> 
> pls add me sir r9 290
> 
> 1)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 2 XFX r9 290
> 3) Stock cooler
> 
> thanks


Congrats - added









Quote:


> Originally Posted by *zpaf*
> 
> My card is on air
> 
> 
> 
> 
> 
> 
> 
> and we have cold here in Greece.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> with the fan at 100% and ambient < 20c its ok.
> This is from another try with core temperature at 70c and vrm at ...
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *Bloitz*
> 
> You never added me either
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Brand:* Gigabyte - No warranty sticker on any screws
> *Memory:* Hynix
> *Cooling:* Water (EK-FC R9-290X Acetal-Nickel)
> *Unlockable?* Haven't tried, but I doubt it
> *ASIC:* 79.8
> Clocks @ stock volts (hovers around 1.17-1.18 V during benches according to GPU-Z and afterburner) and +20% powertune:
> 1180 core (could do 1185 but I prefer round numbers) / 1350 memory (haven't fiddled a lot with it, though). 3DMark and Valley stable with 0 artifacts.
> 
> All in all, quite pleased. I haven't always been the luckiest one when it comes to good clocking chips but I can't complain about this one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: GPU-Z screen name ASIC


I looked back at your posts and didn't see a submission. I did see you discussing the water blocks you received. That response you quoted was for Scorpion49. I should have asked you when you first posted, so I apologize. Decent overclock btw; even though you could get that extra 5 MHz on the core, it really only translates to an extra 1-2 FPS in games anyway.









Congrats - added


----------



## Forceman

Quote:


> Originally Posted by *smonkie*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Does those falls in VDDC seem ok to you?


Software power readings are notoriously unreliable. I wouldn't worry about it unless you noticed any problems.


----------



## Ukkooh

Has anyone gotten these with the new v9.4 drivers? The guys from the shop told me to try these drivers before proceeding with the RMA, to see if it fixes the black screens. They were right: I'm not getting black screens anymore, and now I get some beautiful art instead. Should I proceed with the RMA or not?


----------



## smonkie

Quote:


> Originally Posted by *Forceman*
> 
> Software power readings are notoriously unreliable. I wouldn't worry about it unless you noticed any problems.


No additional problems noticed. Thanks!


----------



## Jpmboy

Quote:


> Originally Posted by *smokedawg*
> 
> Is AMD pixel clock patcher + Custom Resolution Utility what you are looking for?


Worked great! This panel is 120 Hz native with only basic electronics (no upscaling or any of that crap), but both NV and AMD need a custom res and pixel clock for it to hit that refresh. Thank you again!!


----------



## rv8000

Has anyone run into the issue of not being able to boot successfully after switching BIOSes? I just switched to a UD3H + 4670K and have no issues running off the stock BIOS (Sapphire 290 BIOS), but when I flip the switch to the Asus 290X BIOS my system won't start up. I had no issues switching between the BIOSes on my X79 system, and I have not re-flashed either BIOS. What's up?


----------



## nemm

To the comments about radiator requirements: as a previous post stated, 120 mm per 150 W is about right as a guideline, but what you consider acceptable temperatures will determine whether you want to add more radiator surface afterwards, if you are able to. I myself have an EX360 and a UT60 240, which was more than enough for my liking when I had a 670 at 1359/1800 and then a single 290 at 1200/1450 @ 1.4 V, but after adding a second 290 I no longer like what I see, because they sure do put out a lot of heat, especially when overvolted. 58/60° load @ 1.4 V (before it was 51°), and now 51° load @ 1.3 V (before it was 48°), so it's keeping my toes toasty. I will be adding another UT60 240 and swapping the EX360 for an XT45 360 (not required, but my second machine needs a 30 mm thick radiator). To get what I consider acceptable temperatures the fans need to run faster, but I built the system for silence, which it is not when doing this. My recommendation would be 240 mm per 290 if overvolting.

**Ambient temperature was 25°, with all results adjusted if ambient differed.


----------



## jerrolds

Quote:


> Originally Posted by *Ukkooh*
> 
> 
> 
> 
> Has anyone gotten these with the new v9.4 drivers? The guys from the shop told me to try these drivers before proceeding with the RMA to see if it fixes the black screens. They were right I'm not getting black screens anymore and now I get some beautiful art. Should I proceed with the RMA or not?


That happens with me with unstable clocks - did that happen to you at stock?


----------



## Im Batman

Would anyone happen to know if there is any sort of game bundle with the 290/x around the corner? No need to wait for aftermarket coolers, I will be putting it under water.

Ready to pull the trigger on picking one up, just want to make sure i'm not going to be kicking myself in a week because I missed out on a good deal.

Also, are any of the current reference cards voltage locked? I was most likely going to go with MSI.

Thanks for any info.


----------



## ricklen

I have been patiently waiting for non-reference coolers, but nothing is here.

Still no news on the non-reference coolers?

Still debating between an R9 280, a GTX 770 4GB and an R9 290. The R9 290 seems the best price/performance, but the cooling is so awful.


----------



## Ukkooh

Quote:


> Originally Posted by *jerrolds*
> 
> That happens with me with unstable clocks - did that happen to you at stock?


Stock clocks but increased power limit and fan speed.


----------



## Forceman

Quote:


> Originally Posted by *smokedawg*
> 
> Is AMD pixel clock patcher + Custom Resolution Utility what you are looking for?


At what refresh rate does the pixel clock limitation become an issue? Or do you need to use the patcher for all custom resolutions on AMD cards?
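For anyone wondering where the limit actually bites: the pixel clock is just horizontal total × vertical total × refresh rate, where the totals include the blanking intervals and so are larger than the visible resolution. A hedged sketch, using illustrative reduced-blanking-style totals for 1920×1080 and the commonly cited 165 MHz single-link limit (both are assumptions, not values read out of the driver):

```python
# Pixel clock = horizontal total * vertical total * refresh rate.
# The 2080x1111 totals are illustrative reduced-blanking-style figures for
# 1920x1080, and 165 MHz is the commonly cited single-link DVI/HDMI limit;
# treat both as assumptions, not exact driver values.
SINGLE_LINK_LIMIT_HZ = 165_000_000

def pixel_clock(h_total, v_total, refresh_hz):
    """Pixel clock in Hz for the given total timings and refresh rate."""
    return h_total * v_total * refresh_hz

for refresh in (60, 75, 120):
    clk = pixel_clock(2080, 1111, refresh)
    print(refresh, clk / 1e6, clk > SINGLE_LINK_LIMIT_HZ)
```

At these example timings, 1080p stays under 165 MHz up to about 71 Hz and is over it by 72 Hz, which is why high-refresh panels tend to need the patcher; the exact crossover depends on the blanking your monitor's timings use.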


----------



## utnorris

Anyone here play BF2142? For some reason it takes forever to load a map. I thought it was the mod, so I uninstalled it and reinstalled it without the mod, and whenever it tries to adjust the shaders or load a map it takes forever - like 20-30 minutes. I usually just give up. I wasn't having any issues previously, and all of a sudden it started doing this. I have tried the WHQL and the latest beta driver, and neither seemed to help. I am thinking I will try an earlier beta driver, but I am not sure why that would make a difference. Any thoughts?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *ricklen*
> 
> I have been patiently waiting for non-reference coolers but nothing is here.
> 
> Still no news on the non-reference coolers here?
> 
> Still debating between a R9 280 or a GTX 770 4GB or a R9 290, the R9 290 seems the best price/performance but the cooling is so awfull.


Rumor has it they will start coming out in early December... The latest I would guess is late December.

If the aftermarket cards aren't out before the holidays AMD will be kicking themselves....

Go for the 290, you won't be disappointed with the performance. If you want a DIY project you can get a cooler to put on yourself, such as an Accelero or a Gelid. I personally have the Gelid and it is much quieter and better at cooling than the reference cooler.
Quote:


> Originally Posted by *Im Batman*
> 
> Would anyone happen to know if there is any sort of game bundle with the 290/x around the corner? No need to wait for aftermarket coolers, I will be putting it under water.
> 
> Ready to pull the trigger on picking one up, just want to make sure i'm not going to be kicking myself in a week because I missed out on a good deal.
> 
> Also, are anyway of the current reference cards voltage locked? I was most likey going to go with MSI.
> 
> Thanks for any info.


I got free BF4 with my XFX 290.

Try waiting for Newegg to run a 5% off promo (they do all the time) and pick up a 290 for $380... Then either enjoy your free $60 game or sell it for like $50 and enjoy your $330 card that rivals the 780...

As far as I know, none of the reference 290X/290s are voltage locked.


----------



## Raxus

It seems the difference in the press copies could've been a different BIOS.
Quote:


> *AMD did inform us they are working on an improved BIOS for the AMD Radeon R9 290X and we hope to get our hands on it in the near future*


Read more at http://www.legitreviews.com/amd-radeon-r9-290x-press-sample-versus-retail_129583/3#ORChLmvq2GiIK2Q5.99


----------



## warbucks

Quote:


> Originally Posted by *evensen007*
> 
> I have a swiftech 320 and 120 RAD cooling my 2600k @4.8 and my 2x 290's at stock. I can tell you that my setup is RIGHT at the edge of what I would call acceptable. These 290's put out a TON of heat.


The type of RAD makes a huge difference. I have a Thermochill PA120.3 (360 essentially) and a XSPC RX120. Running a 4770k @ 4.6ghz and two R9 290's overclocked and I have no issues dealing with the heat.


----------



## OverSightX

Just ordered my Sapphire 290 w/ an EK Block and hope to have it Friday!

The card: http://www.newegg.com/Product/Product.aspx?Item=N82E16814202060

The Block: http://www.frozencpu.com/products/21664/ex-blc-1565/EK_Radeon_R9-290X_VGA_Liquid_Cooling_Block_-_Acetal_EK-FC_R9-290X_-_Acetal.html?id=UntXaP7W&mv_pc=2018

Looking to join soon!


----------



## Newbie2009

Say hello to my first R9 290.









Sapphire, unlocked. Will test on air first, then see if I can unlock the other before watercooling them.


----------



## chiknnwatrmln

Seems like everyone except me can unlock their 290's...

Congrats on the discounted 290x though.


----------



## sparkle128

Say hello to my upgrade from HD5970. (I'll miss you only a little).



Sapphire R9 290 Stock

So far I notice less noise than my OC HD5970.


----------



## jomama22

Quote:


> Originally Posted by *Newbie2009*
> 
> Say hello to my first R9 290.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sapphire unlock. will test on air first then see if I can unlock the other before watercool them.


GPU-Z reporting it does not mean an unlock. Just an FYI.


----------



## Grimwarr

Add me to the list! Got two XFX R9 290s. They are going to stay stock cooled until I feel I need to add the Arctic Accelero Xtreme IIIs I picked up.


----------



## MintyOne

One thing I forgot to do was check the power consumption before/after installing the Arctic Accelero III. Next time someone with a stock cooler does a run of Valley, can you see what GPU-Z reports for power consumption?

I'm guessing the stock cooler uses a fair bit of power once you hit 70-100% fan speed.

Here is a shot of a 290 at 1100/1250 stock voltage, power limit +50, fan at 100% on the Accelero III.


----------



## Arizonian

Quote:


> Originally Posted by *Newbie2009*
> 
> Say hello to my first R9 290.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sapphire unlock. will test on air first then see if I can unlock the other before watercool them.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *sparkle128*
> 
> Say hello to my upgrade from HD5970. (I'll miss you only a little).
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Sapphire R9 290 Stock
> 
> So far I notice less noise than my OC HD5970.


Congrats - added









Quote:


> Originally Posted by *Grimwarr*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Add me to the list! got two XFX R9 290s. They are going to be stock cooled until i feel i need to add my artic acceleros extreme IIIs i picked up.


Congrats - added









I was looking for the OCN name in the screenshots, but I trust you guys.


----------



## TheSoldiet

So, 1 question....... Xtreme III or Icy Vision Rev 2?


----------



## RAFFY

Quote:


> Originally Posted by *nemm*
> 
> To the comments about radiator requirements: as a previous post stated, 120 mm per 150 W is about right as a guideline, but what you consider acceptable temperatures will determine whether you want to add more radiator surface afterwards, if you are able to. I myself have an EX360 and a UT60 240, which was more than enough for my liking when I had a 670 at 1359/1800 and then a single 290 at 1200/1450 @ 1.4 V, but after adding a second 290 I no longer like what I see, because they sure do put out a lot of heat, especially when overvolted. 58/60° load @ 1.4 V (before it was 51°), and now 51° load @ 1.3 V (before it was 48°), so it's keeping my toes toasty. I will be adding another UT60 240 and swapping the EX360 for an XT45 360 (not required, but my second machine needs a 30 mm thick radiator). To get what I consider acceptable temperatures the fans need to run faster, but I built the system for silence, which it is not when doing this. My recommendation would be 240 mm per 290 if overvolting.
> 
> **Ambient temperature was 25°, with all results adjusted if ambient differed.


What is the rule of thumb for 140 mm radiators? Are they good for something like 175 W?
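One hedged way to extrapolate the rule of thumb: if you assume the guideline scales with fan face area (a simplification that ignores radiator thickness, fin density, and airflow), a 140 mm section comes out around 200 W rather than 175 W:

```python
# Extrapolating the "150 W per 120 mm section" guideline to 140 mm fans
# purely by face area. This ignores radiator thickness, fin density and
# airflow, so treat the result as a ballpark, not a spec.
WATTS_PER_120MM_SECTION = 150

def watts_per_section(fan_mm, base_mm=120, base_watts=WATTS_PER_120MM_SECTION):
    """Scale the per-section wattage guideline by fan face area."""
    return base_watts * (fan_mm / base_mm) ** 2

print(round(watts_per_section(140)))  # prints: 204
```

Either figure is only a ballpark; a thick 140 mm rad with good fans will beat it, a restrictive one won't.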


----------



## Newbie2009

Quote:


> Originally Posted by *jomama22*
> 
> Gpuz does not mean an unlock. Just an FYI.


Cheers, yeah, I have been reading. Not to be a pain, but how does this score stack up for a 290 "X" at stock with a 3770K @ 4.5 GHz?

Been a long time since I have been benching.


----------



## prostreetcamaro

Quote:


> Originally Posted by *Newbie2009*
> 
> Cheers yeah I have been reading. Not to be a pain but how does this score stack up for a 290 "x" at stock with a 3770x @ 4.5 GHZ?
> 
> Been a long time since I have been benching.


That is low. Do you have CCC set to performance, quality or high quality? It should be set to performance. I will do a run at stock 290X clocks and post it for you.

EDIT: Here you go. 290X clocks on my unlocked 290 and my 2600K at 4.8



Here it is at 1150/6000 and my 2600K at 4.5


----------



## RAFFY

I know many of you are using the EK waterblocks but are they the best option? How are the temps on the Aquacomputer and Koolance blocks where tape is not needed for the memory and vrms? I'd like to pull the trigger on two waterblocks soon.


----------



## Arizonian

Quote:


> Originally Posted by *Newbie2009*
> 
> Cheers yeah I have been reading. Not to be a pain but how does this score stack up for a 290 "x" at stock with a 3770x @ 4.5 GHZ?
> 
> Been a long time since I have been benching.


I did not run my 290X at stock, but I did have the 3770K @ 4.5. 290X @ 1150/1350, Fire Strike score 10749.

Post #4 results on OP

http://www.3dmark.com/3dm/1484478


----------



## Jpmboy

Quote:


> Originally Posted by *RAFFY*
> 
> I know many of you are using the EK waterblocks but are they the best option? How are the temps on the Aquacomputer and Koolance blocks where tape is not needed for the memory and vrms? I'd like to pull the trigger on two waterblocks soon.


Where did you see that thermal pads are not needed for the memory and VRMs with the AQ blocks? The EK does very well and has active cooling for the VRMs... and they need it!



vrm heat plates highlighted.

from the AQ website:

_Combined GPU/RAM/VRM-cooler for graphics cards of the type Radeon R9 290X and 290 with 4GB RAM according to reference design.
This cooler combines the features of a graphics chip cooler and RAM-coolers in an elegant and very flat watercooler. Additionally the voltage regulators are also cooled effectively.

The kryographics for R9 290X and 290 water block offers outstanding cooling performance and a low flow resistance. All mounting threads are equipped with spacers for easy installation and to prevent twisting the graphics card. The areas contacting the GPU and RAM modules are polished for excellent heat transfer. For the voltage regulators, a thermal pad with high thermal conductivity is included._

BTW - they make very good quality stuff!


----------



## jerrolds

Quote:


> Originally Posted by *TheSoldiet*
> 
> So, 1 question....... Xtreme III or Icy Vision Rev 2?


Both probably perform near identically, except the Xtreme doesn't come with enough sinks and uses thermal glue instead of tape - so RMAing it will be tougher. I'd pick the Gelid because it's cheaper, to boot. Both will bring core temps down quite a bit, but will have trouble keeping VRM temps in check at high overclocks/voltages imo.


----------



## Scorpion49

My Koolance blocks are finally here! I don't know if I can wait for Saturday, when my RIVF arrives, before I assemble the loop though.


----------



## Raephen

Quote:


> Originally Posted by *RAFFY*
> 
> I know many of you are using the EK waterblocks but are they the best option? How are the temps on the Aquacomputer and Koolance blocks where tape is not needed for the memory and vrms? I'd like to pull the trigger on two waterblocks soon.


For Aquacomputer's Hawaii block results you'll have to wait just a bit longer.

I've received word of Aquatuning shipping my order and if the postal Gods are in a good mood, the order should be here by the end of the week.

Too bad I've got a stuffed weekend ahead of me, so the first real free time where I can take the time and do a proper job of installing the block and integrating it into my loop will be next Tuesday.

Once done, I'll keep y'all posted on the results!

In terms of order in my loop, I'm leaning towards placing it after my CPU - it's an i5 3570K clocked at a mere 4.3 GHz, where it hits a voltage barrier: 1.27 V is OK for 4.3, but anything higher needs a huge increase in voltage.

So my order would be: dual-bay res -> mcp35x pump -> Alphacool Nexxxos st30 360mm rad -> XSPC Raystorm CPU block -> Aquacomputer GPU block -> dual bay res.

We'll see how it goes


----------



## Raxus

1175/1500

Will have to start upping my cpu.


----------



## Snyderman34

No pics since I'm at work, but I'm topping out at 10280 or so in Fire Strike (4770K @ 4.5 GHz, Sapphire 290 @ 1200/1500). I'm wondering if my RAM is holding it back? I'm currently running Corsair Vengeance (4x4GB, 1600) and am curious whether switching to 2133 MHz RAM would help scores (or games in general) any. Eyeing that G.Skill from Newegg's sale.


----------



## djsatane

Quote:


> Originally Posted by *Raxus*
> 
> it seems the difference in the press copies could've been different bios.
> Read more at http://www.legitreviews.com/amd-radeon-r9-290x-press-sample-versus-retail_129583/3#ORChLmvq2GiIK2Q5.99


Any idea whether this bios will be released by AMD publically?


----------



## Raxus

Quote:


> Originally Posted by *djsatane*
> 
> Any idea whether this bios will be released by AMD publically?


Quote:


> AMD did inform us they are working on an improved BIOS for the AMD Radeon R9 290X and we hope to get our hands on it in the near future
> 
> Read more at http://www.legitreviews.com/amd-radeon-r9-290x-press-sample-versus-retail_129583/3#1jyVz2dUzRqQP5kJ.99


----------



## Derpinheimer

Can someone help me understand how to get higher than a +100mV voltage increase? I have a flashed R9 290 with the ASUS 290X BIOS, and extended clock limit MSI AB.


----------



## taminhncna

My MSI R9 290 with the Gelid Icy Vision rev 2 does very well, overclocked to 1200/1425 with +144mV core. At full load, max GPU temp is 67-70°C and max VRM temp is 67-71°C (room temp about 25°C).
The Gelid Icy is old, but surprisingly it's like it was made for the R9 290(X) (special VRM heatsink).


----------



## Forceman

Quote:


> Originally Posted by *Derpinheimer*
> 
> Can someone help me understand how to get higher than +100mV voltage increase? I have a flash R9 290 with ASUS 290X bios, and extended clock limit MSI AB.


If you flashed the Asus BIOS, use GPU Tweak instead of Afterburner. It has higher limits.


----------



## Hogesyx

Quote:


> Originally Posted by *Forceman*
> 
> If you flashed the Asus BIOS, use GPU Tweak instead of Afterburner. It has higher limits.


Yep, and make sure you have sufficient cooling, and factor in power consumption.


----------



## Derpinheimer

Sure, thanks to both of you.

I have another issue though. Any time I try to play a YouTube video the computer hardlocks. Games work completely fine. I would try disabling HW acceleration, if it's enabled (I had it off for my 7950, which I just removed for this card), but I only get about 1 second between opening a video and having it freeze, regardless of whether it's paused or not.

.mkv files play fine through VLC media player.


----------



## BouncingBall

Might be a little late to the party, but:



HIS R9 290X.

Air cooled and running at stock.


----------



## kdawgmaster

Quote:


> Originally Posted by *BouncingBall*
> 
> Might be a little late to the party, but:
> 
> 
> 
> HIS R9 290X.
> 
> Air cooled and running at stock.


I think photos are required to get in.


----------



## ZealotKi11er

I suggest most people that run stock coolers keep the card stock with stock fan profile and test them for at least 1-2 weeks before jumping to conclusions about RMAs and such.

Also, if a 290 unlocks, how do you make sure it's really unlocked?


----------



## KyGuy

Could you add me please?

Sapphire 290x- Stock Cooling for now. Thinking about Water to take on this and the 3770k.


----------



## Derpinheimer

Quote:


> Originally Posted by *Derpinheimer*
> 
> Sure, thanks to both of you.
> 
> I have another issue though. Any time I try to play a youtube video the computer hardlocks. Games work completely fine. I would try to disable HW acceleration, if enabled {I had it off for my 7950 which I just removed for this card}, but I only get about 1 second from opening a video and having it freeze, regardless of if its paused or not.
> 
> .mkv play fine through VLC Media player


One more thing: I have a universal waterblock I'm going to put on it, but I have no idea what to use for VRM heatsinks. I do have 16 memory heatsinks..

Are there any kits out there with a set of nice heatsinks? I know a lot of the air coolers come with them, but I, of course, am not going to pay $100 for a VRM heatsink lol.


----------



## kdawgmaster

For those using MSI Afterburner, what are you setting the AUX voltage to? And will this possibly allow my core clock to go higher? I can't seem to hit the 1100 mark at all without it artifacting.


----------



## chiknnwatrmln

Aux voltage is for RAM.

It shouldn't affect your core speed.


----------



## kdawgmaster

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Aux voltage is for RAM.
> 
> It shouldn't affect your core speed.


That's what I got from reading the description, but it just seems so odd that I can't get past 1075 on my core without it starting to artifact. This is also with the core voltage at +100.

I must have really lost the silicon lottery on all or one of my cards :/


----------



## ironhide138

So, anyone plan on picking up one of the new NZXT WC brackets and tossing an H55 on a 290/290X?


----------



## kdawgmaster

Quote:


> Originally Posted by *ironhide138*
> 
> so, Anyone plan on picking up one of the new NZXT WC brackets and tossing an H55 on a 290/290x?


Unless it cools the RAM and the VRMs as well, then no.


----------



## Forceman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Also if 290 Unlocks how do you make sure its really Unlocked?


GPU-Z actually seems to be pretty accurate, and that Hawaiiinfo tool is pretty much 100% accurate, but the fool-proof way is before-and-after clock-for-clock benchmarks.
Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Aux voltage is for RAM.
> 
> It shouldn't affect your core speed.


I don't know what AUX is (I think I heard it is PLL or something), but it isn't RAM. There is no VRAM voltage control on these cards.


----------



## briclark1

I just bought a XFX R9 290 from Newegg. I bought it on Sunday and just registered today and it allowed me to download BF4. I know a few pages ago people were saying that they stopped honoring the free game offer. I had to register through XFX, AMD and Origin but it allowed it if someone is interested.

I'll add my rig to my signature and join the club once it's in.

One question. I planned to upgrade my system, buy a 290, and put it under water, but Newegg had a code to drop the XFX card by $20 so I jumped. I won't have the water cooling stuff until the middle of next month, so is there a problem running on the stock cooler for a month and then switching? It won't be any harder to get the stock cooler off after running for a month, will it? I would think not. Thanks in advance.


----------



## Gunderman456

Quote:


> Originally Posted by *briclark1*
> 
> I just bought a XFX R9 290 from Newegg. I bought it on Sunday and just registered today and it allowed me to download BF4. I know a few pages ago people were saying that they stopped honoring the free game offer. I had to register through XFX, AMD and Origin but it allowed it if someone is interested.
> 
> I'll add my rig to my signature and join the club once it's in.
> 
> One question. I planned to upgrade my system and buy a 290 and put it under water but Newegg had a code to drop the XFX card by $20 so I jumped. I won't have the water cooling stuff until middle of next month so is there a problem running on stock cooler for a month then switching? No harder to get the stock cooler off after running for a month? I would think not. Thanks in advanced.


No, there would be no problem removing the stock cooler after a month, or ever.


----------



## DeadlyDNA

I agree. However, I personally got a huge increase going to water, I think because in CF/3-way/4-way they get too hot next to each other, even at 100% fan.

One card should be pretty decent on the stock cooler, aside from the noise should you choose to go with high fan settings.


----------



## briclark1

Quote:


> Originally Posted by *Gunderman456*
> 
> No, there would be no problems removing stock cooler after a month or ever.


I figured. Never had an issue with the CPU. Thanks.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> GPU-Z actually seems to be pretty accurate, but that Hawaiiinfo tool is pretty much 100% accurate, and the fool-proof way is before and after clock for clock benchmarks.
> I don't know what AUX is (I think I heard it is PLL or something), but it isn't RAM. There is no VRAM voltage control on these cards.


Where do I get the Hawaiiinfo tool?


----------



## Forceman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Where do i get Hawaiiinfo tool?


http://www.overclock.net/t/1445030/is-your-r9-290-unlockable-find-out-here/0_30


----------



## Falkentyne

Quote:


> Originally Posted by *kdawgmaster*
> 
> Thats what i got from reading the description but it just seems so odd that i cant get past 1075 with my core or it will start to artifact. This is also with the core voltage at 100+
> 
> Ive must have rly lost on the silicon for all or one of my cards :/


Check GPU-Z to see if that +100mV actually applied. It should raise the IDLE voltage by +100mV also.


----------



## BouncingBall

Sorry, didn't realize that I didn't include my OCN name. Whoops.










HIS.

Running on stock cooling and stock clock speeds.


Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *kdawgmaster*
> 
> I thinking photos are required to get in.


Really? I thought that would be enough.

If I must, here's a photo of the box:


Spoiler: Warning: Spoiler!







Amazon/HIS accidentally put it in the wrong box (which was pretty exciting; I ordered it before the release of the R9 290 and got a kick out of the fact that I got its box). Here's a HWiNFO screenshot to prove that it is in fact an R9 290X.


Spoiler: Warning: Spoiler!


----------



## kdawgmaster

Quote:


> Originally Posted by *BouncingBall*
> 
> Really? I thought that would be enough.
> 
> If I must, here's a photo of the box:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Amazon/HIS accidentally put it in the wrong box (which was pretty exciting; I ordered it before the release of the R9 290 and got a kick out of the fact that I got it's box.) Here's a HWinfo screenshot to prove that is it in fact a R9 290X.
> 
> 
> Spoiler: Warning: Spoiler!


That was just my understanding; I could have been wrong.


----------



## Arizonian

Quote:


> Originally Posted by *BouncingBall*
> 
> Really? I thought that would be enough.
> 
> If I must, here's a photo of the box:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Amazon/HIS accidentally put it in the wrong box (which was pretty exciting; I ordered it before the release of the R9 290 and got a kick out of the fact that I got it's box.) Here's a HWinfo screenshot to prove that is it in fact a R9 290X.
> 
> 
> Spoiler: Warning: Spoiler!


Very much congrats getting that 290X in a 290 box - added









Quote:


> Originally Posted by *KyGuy*
> 
> Could you add me please?
> 
> Sapphire 290x- Stock Cooling for now. Thinking about Water to take on this and the 3770k.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









I try to be lenient, but I would like to see your OCN name in either a pic w/OCN name or a GPU-Z validation shot with the validation tab open so I can see your user name. The link becomes proof off the roster list when others click on it.

I updated the OP - moved the club roster up to the first post, GPU info to the second post and reviews to the third post, so it's easier now. Also added spoilers to condense some info so the OP isn't as hard to sift through.

To become a member I'm looking for certain things to make it easier when I add members.

1. GPU-Z Link *or* Screen shot --- with OCN Name
2. Manufacturer & Brand
3. Cooling - Stock, Aftermarket or 3rd Party Water


----------



## jorgitin02

Guys, is 90°C on VRM1 a safe temp after 2 hours of stress testing?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> http://www.overclock.net/t/1445030/is-your-r9-290-unlockable-find-out-here/0_30


Thank You.


----------



## RAFFY

Quote:


> Originally Posted by *Jpmboy*
> 
> where did you see that thermal pads are not needed for the memory and vrms with the AQ blocks? The EK does very well, and has active cooling for the VRMs... and they need it!
> 
> 
> 
> vrm heat plates highlighted.
> 
> from the AQ website:
> 
> _Combined GPU/RAM/VRM-cooler for graphics cards of the type Radeon R9 290X and 290 with 4GB RAM according to reference design.
> This cooler combines the features of a graphics chip cooler and RAM-coolers in an elegant and very flat watercooler. Additionally the voltage regulators are also cooled effectively.
> 
> The kryographics for R9 290X and 290 water block offers outstanding cooling performance and a low flow resistance. All mounting threads are equipped with spacers for easy installation and to prevent twisting the graphics card. The areas contacting the GPU and RAM modules are polished for excellent heat transfer. For the voltage regulators, a thermal pad with high thermal conductivity is included._
> 
> BTW - they make very good quality stuff!


I thought I read somewhere that people were just using thermal grease and having great results. I could be wrong, as I am a water noob.


----------



## BouncingBall

Quote:


> Originally Posted by *Arizonian*
> 
> Very much congrats getting that 290X in a 290 box - added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I try to be lenient but would like to see some OCN name in either pic w/OCN name or a GPU-Z validation shot with validation tab open so I can see your user name. The link becomes proof off the roster list when others click on it.
> 
> I updated OP - moved up the club roster to first post. GPU info to to second post and reviews to third post so it's more easier now. Also spoilers to condense some info so OP isn't as hard to sift through.
> 
> To become a member I'm looking for certain things to make it easier when I add members.
> 
> 1. GPU-Z Link *or* Screen shot --- with OCN Name
> 2. Manufacturer & Brand
> 3. Cooling - Stock, Aftermarket or 3rd Party Water


Yeah, sorry about that. My first post had a GPU-Z link and rest of the info and second had more pictures with OCN name; forgot to merge them. Will edit.


----------



## jomama22

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Aux voltage is for RAM.
> 
> It shouldn't affect your core speed.


No it's not, it's for VDDCI. There are no memory voltage adjustments on the 290/X, and there is no programmable VRM.

So there will be zero memory voltage control for reference designs unless you hard mod.


----------



## jomama22

Quote:


> Originally Posted by *RAFFY*
> 
> I thought I read some where that people were just using thermal grease and having great results. I could be wrong as I am a water noob.


You are not wrong, but kinda are, haha. On the Aquacomputer block, only the VRMs need the thermal pad. It's the memory chips that have direct contact with the block.


----------



## ZealotKi11er

Quote:


> Originally Posted by *jomama22*
> 
> No its not, its for vddcl. There is no memory voltage adjustments on the 290/x ad there is no programmable vrm.
> 
> So there will be 0 memory voltage control for reference designs unless you hard mod.


The memory in the 290/290X does not need voltage control. It's actually rated @ 6GHz. What needs a boost is the internal memory bus, which has been purposely undervolted in order to bring down power consumption.
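To put rough numbers on that 6GHz rating: GDDR5 moves four bits per pin per memory clock, so the effective data rate is 4x the clock shown in overclocking tools. A quick sketch, assuming the reference 290/290X figures of a 1250MHz stock memory clock on a 512-bit bus:

```python
# GDDR5 is quad-pumped: effective data rate = 4 x the listed memory clock.
def effective_gbps(mem_clock_mhz):
    return mem_clock_mhz * 4 / 1000  # per-pin data rate in Gbps

def bandwidth_gbs(mem_clock_mhz, bus_width_bits=512):
    # total bandwidth = per-pin rate x bus width in bytes
    return effective_gbps(mem_clock_mhz) * bus_width_bits / 8

print(effective_gbps(1250))  # 5.0 Gbps at stock, below the 6.0 Gbps chip rating
print(bandwidth_gbs(1250))   # 320.0 GB/s at stock
print(bandwidth_gbs(1500))   # 384.0 GB/s at the chips' rated speed
```

So clocking the memory to 1500MHz only brings the chips up to their rated 6.0 Gbps; any wall below that is on the bus/IMC side rather than the chips themselves.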


----------



## RAFFY

Quote:


> Originally Posted by *jomama22*
> 
> You are not wrong, bit kinda are haha. On the aqua computer block, only the vrms need the thermal pad. Its the memory chips that have direct contact with the block.


Lol thanks for the clarification!


----------



## jomama22

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The memory in 290/290X does not need voltage control. Its actually rated @ 6GHz. That needs boots is internal memory bus. That has been purposely under volted in order to bring down power consumption.


Memory voltage would be nice for the crappy Elpida that can't go over 1625, while my Hynix do 1750 just fine lol. Nets me an extra 250 pts in FSE at the same core clock.

IMC voltage is not an issue at all. The IMC itself isn't as robust as Tahiti's, but it's far from weak. Like I said, I can run both my Hynix cards upwards of 1750, obviously at the same stock mem voltage of 1.5.

A bump up to 1.6 would be huge, honestly.


----------



## kdawgmaster

How do you all get the memory info for your cards, apart from taking them apart?


----------



## Hogesyx

Quote:


> Originally Posted by *KyGuy*
> 
> Could you add me please?
> 
> Sapphire 290x- Stock Cooling for now. Thinking about Water to take on this and the 3770k.


What's the ASIC on your Sapphire 290X? I am glad you could hit 1500MHz on memory. Mine maxes out at ~1440.


----------



## ZealotKi11er

Quote:


> Originally Posted by *jomama22*
> 
> Memory voltage would be nice for the crappy elpida that cant go over 1625 while my hynixs' do 1750 just fine lol. Nets me an extra 250 pts in fse at the same core clock.
> 
> IMC voltage is not an issue at all. The IMC itself isn't as robust as tahitis but far from weak. Like I said, I can run both my hynix cards upwards of 1750. Obviously at the same stock mem voltage of 1.5.
> 
> A bump up to 1.6 would be huge honestly.


Why do I see a lot of people with 1350-1450MHz OCs?


----------



## kdawgmaster

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Thank You.


How did you get the memory info, like brand and everything?


----------



## brazilianloser

Quote:


> Originally Posted by *kdawgmaster*
> 
> How did u get the memory info like brand and everything?


Go to the beginning of the thread; I think they put the memory checker up there.

edit: http://www.mediafire.com/download/voj4j1rlk0ucfz4/MemoryInfo+1005.rar


----------



## utnorris

So I am officially annoyed. BF2142 does not seem to want to play nice anymore. I got it semi-working by uninstalling Catalyst manager. I suspect that one of the many Microsoft updates has boinked everything. Since I do not have Catalyst manager installed, I do not get frame pacing or Eyefinity, which I was going to try out since I am running two of these. I know BF2142 is an older game, but that's all the more reason there should not be any issues with it. Now I am debating selling these and getting a 780 or 780 Ti, or going back to reinstalling Windows just so I can get BF2142 working right. It's one of my favorite games, where I can just jump in and play a mind-numbing shoot-em-up without having to log into a server and play with others or play a campaign. Anyways, I will give it a few days to see if I can figure out what the issue is.


----------



## kdawgmaster

Bawls, it looks like HIS uses Elpida; no wonder my overclocks were garbage.


----------



## Forceman

Quote:


> Originally Posted by *kdawgmaster*
> 
> Bawls it looks like HIS uses Elpdia no wonder why my overclocks were garbage


All the cards are mix and match for the memory, some cards Elpida and some Hynix. But the Elpida cards seem to overclock the memory pretty much the same as Hynix this time.


----------



## kdawgmaster

Quote:


> Originally Posted by *Forceman*
> 
> All the cards are mix and match for the memory, some cards Elpida and some Hynix. But the Elpida cards seem to overclock the memory pretty much the same as Hynix this time.


From a lot of the stuff I've been reading, Elpida has been doing a lot worse in terms of reliability and overclocking so far.

For the overclocking portion, they require more tweaks to make them stable.


----------



## KyGuy

Quote:


> Originally Posted by *Hogesyx*
> 
> What's your ASIC on your sapphire 290X? I am glad you could hit 1500 MHz on memory. Mine max out at 1440~


My ASIC quality is only 71.5%. As many other people have said, there is no proof that ASIC quality directly relates to overclock potential. It is more of a silicon lottery, just like it is with CPUs. Some can overclock great, some you can't push at all. It just depends on the particular card. My memory is Elpida too.


----------



## jomama22

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Why do i see a lot of people with 1350 - 1450 MHz OC?


Stock BIOS. The ASUS, PT1 and PT3 BIOSes allow higher memory clocks. This is my own experience with 10 cards: I could do 1400-1500 on the stock BIOS on all of them. With the ASUS BIOS/PT3/PT1, 1600 was the minimum, with Elpida always topping out at 1600-1625 and the only 2 Hynix I got doing 1749+.


----------



## kdawgmaster

Hey guys, here's a question for you all. When any of you have taken apart the cooler for the R9 290X, were you able to remove the fins from the rest of the cooling unit?



Removing part A from part B.

I'm asking because, if this can be done, I was interested in the NZXT bracket for my GPU and getting some H80s for it.


----------



## iPDrop

Hmmm, now I'm getting this weird glitch in BF4... Ultra settings, 2560x1440, with a single R9 290.

Any idea what's causing this?


----------



## DeadlyDNA

Quote:


> Originally Posted by *kdawgmaster*
> 
> Hey Guys heres a question for you all. When any of you have taken apart the cooler for the R9 290x were u able to remove the Fins from the rest of the cooling unit?
> 
> 
> 
> Removing part A from Part B
> 
> Im asking this because if this can be done then i was interested in the Nzxt bracket for my GPU and getting some H80's for it


The copper block with the fins on it is glued or something to the black aluminum base. In other words, no, it's not easily removed unless you get out tools and practically saw it off. I was going to do the same thing, and someone else posted in this thread about trying to remove it...

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/7590#post_21246084

I thought he said he couldn't pry it off, but either way, looking at his mod, it wasn't removable.


----------



## kdawgmaster

Quote:


> Originally Posted by *DeadlyDNA*
> 
> The copper block with fins on it is glued or something to the black aluminum base. Another words no it's not easily removed unless you get out tools and practically saw it off. I was going to do the same thing and someone else posted in this thread about trying to remove it...
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/7590#post_21246084
> 
> i thought he said he couldnt pry it off, but either way by loooking at his mod it wasnt removable


Looking at it though, it appears to be 2 pieces. I might look into it some more and see what I can do. If it can be removed without sawing, I'll let everyone know.


----------



## DeadlyDNA

Quote:


> Originally Posted by *kdawgmaster*
> 
> By looking at it tho it appears to be 2 pieces. I might look into it some more and see what i can do. If it can be removed without sawing it ill let everyone know.


Yes, it is 2 pieces, but they really secured it with something. Anyways, good luck; I look forward to your results. It would be cool to have an alternative aside from the typical blocks.


----------



## S410520

Upgraded my second unlocked 290 from the Gelid Icy to an Accelero Hybrid last night.
The first one went *poof* before I could upgrade the cooler, so that's been RMA'd..







(smoke from the power connectors, a chemical burn smell and all that)

Just a point of advice for people getting this cooler for a 290(X): you will need to modify the risers to set a safe pressure level on the cooler.

A few more pointers:
- only thermal glue is provided for the memory chips and VRMs.
- not enough RAM sinks in the package (it came 3 short or something, and no, I did not lose them)
- no clearance for 1 RAM sink; you can saw half of a sink's fins off to make it fit (the lower RAM chip by the PCI slot side)
- not enough clearance for another RAM heatsink; you can place it 1mm off center to make it fit.

The rest of the Hybrid's package is very universal; lots of VRM sinks in many shapes and sizes.

I used the end plate of the stock cooler for the VRMs by the GPU. It works great and I can really recommend it.
I decided to glue the RAM sinks on for better temps, and used some thermal stickers I had left over from the Gelid package on the small group of VRMs at the front.

Temps are the best I've ever had for a GPU, as this is my first water-cooled GPU









Some runs of Valley and BF4 give the temps below @ 1140MHz, 1.214V actual load voltage:
max GPU: 61°C
max VRM1: 72°C
max VRM2: 56°C

Too bad this one does not seem to want 1150MHz no matter what.

(forgot screenshots, here are pics of the mod and clearance to the card.)


----------



## kdawgmaster

Quote:


> Originally Posted by *S410520*
> 
> Upgraded my second unlocked 290 from Gelid Icy to Accelero Hybrid last night
> .
> The first one did *poof* before I could upgrade the cooler so thats rma'd..
> 
> 
> 
> 
> 
> 
> 
> (smoke from power connectors, chemical burn smell and al that)
> 
> Just a point of advise for people getting this cooler for a 290(X); You will need to modify the risers to set a safe pressure level on the cooler.
> 
> Another 3 pointers;
> - only thermal glue provided for memory chips and vrm's.
> - not enough ram sinks in package (came 3 short or something and no, I did not lose them)
> - no clearence for 1 ram sink, you can saw half of a sink's fins off to make it fit. (the lower ram chip by the pci slot side)
> - not enough clearence for another ram heat sink, you can place it 1mm off the center of it to make it fit.
> 
> The rest of the package of the Hybrid is very universal; lot's of vrm's sinks, many shapes and sizes.
> 
> I used the endplate of the stock cooler for the vrm's by the gpu. It works great and I can really recommend this.
> Decided to glue the ram sinks for better temps and used some thermal stickers I had left from Gelid package on the small group of VRM's in the front.
> 
> Temps are best I ever had for a gpu as this is my first watercooler gpu
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Some runs of valley and BF4 give the temps below @ 1140 1.214v load actual v.
> max GPU: 61c
> max vrm1; 72c
> max vrm2; 56c
> 
> Too bad this one does not seem to want 1150mhz no matter what.
> 
> (forgot screenshots, here are pics of the mod and clearance to the card.)


I know I said I would just do it, but can you see if the fins can be removed from the plate on the stock cooler? I have a diagram to show you what I mean



Removing part A from B


----------



## S410520

Sorry mate, not sure what you mean here.

If you want to separate the fins from the base block, I can tell you you're going to have a bad time.
It will take serious work to separate the block with fins (or the fins alone) from the stock cooler.


----------



## kdawgmaster

Damn, I wish that was removable so I could make a custom bracket, or buy one that will work with an H80i or something. Then we wouldn't have to worry about VRM and RAM temps on the card.


----------



## S410520

I think it is impossible without power tools. You will probably bend the base if you go blacksmith on it.

Edit: forget the RAM temps, I'm pretty sure you won't have a problem even cooling them passively.
For the VRMs, the back of the stock cooler's base is great, and the small group in front is easy to cool with a little airflow as well. (Just apply good sinks; these are supplied in the Accelero Hybrid package, while the Gelid Icy Vision needs modding/creativity for this VRM.)


----------



## smokedawg

Quote:


> Originally Posted by *Forceman*
> 
> At what refresh rate does the pixel clock limitation become an issue? Or do you need to use the patcher for all custom resolutions on AMD cards?


I think the limit for AMD cards is 330MHz. Your required pixel clock is shown in CRU, or you can calculate it here (xls file).
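For anyone without the spreadsheet handy, the check is simple arithmetic: pixel clock = total pixels per frame (including blanking intervals) x refresh rate. A rough sketch; the 2720x1481 totals below are hypothetical reduced-blanking figures for 2560x1440, so use the exact totals CRU reports for your own timings:

```python
# Pixel clock = horizontal total x vertical total x refresh rate.
# Totals include blanking, so they're larger than the visible resolution.
AMD_LIMIT_MHZ = 330  # the AMD limit discussed above

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1_000_000

# Example: 2560x1440 @ 96Hz with assumed reduced-blanking totals
clock = pixel_clock_mhz(2720, 1481, 96)
print(f"{clock:.1f} MHz")  # ~386.7 MHz
print("patcher needed" if clock > AMD_LIMIT_MHZ else "within limit")
```

Anything that lands over the 330MHz line is where the pixel clock patcher comes in.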


----------



## Newbie2009

For anyone installing new cards, I recommend avoiding the newest beta drivers; they took out Windows on me, but I got it recovered.

13.11 WHQL seems pretty sweet.


----------



## mohit9206

Is 13.11 WHQL out? I'm running 13.10 beta 2.


----------



## brazilianloser

Quote:


> Originally Posted by *mohit9206*
> 
> Is 13.11 whql out ? Am running 13.10 beta 2.


It's been out for quite a while now.

http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64


----------



## brazilianloser

My card is bipolar... The other day when I was trying to overclock the memory, any touch to it would cause the PC to crash during 3DMark and 3DMark11... and today I upped the core to 1200 and the memory to 1500 on reference, and not a single crash...

But look at that, I got past the 15k mark on 3DMark11 on RC, and the max temp I got was about 82°C... maybe it's time to turn on the heater.

http://www.3dmark.com/3dm11/7572560


----------



## specopsFI

I've never joined one of these owners clubs before but this one might be both handy and fun









Here's a screenie, sorry had to crop it a bit:



This is a second-hand Club3D R9 290 that I got in a straight swap for my 780. Usually I lose those gambles, but this one seems to have panned out OK. First of all, it was already unlocked by the previous owner, hence the shader count in the pic. Also, no black screens so far and practically no coil whine. It even seems to be a decent clocker; it could do 1120MHz on the core at stock voltage in Heaven with no artifacts. Memory is Elpida, but that doesn't seem to be a problem for me, at least until I start to OC it, which I haven't yet.

So, to plug into this knowledge base: what's the deal with the huge mouse pointer/cursor? I remember having read about it but couldn't find it now. I've had it since I installed the drivers for the first time, on 13.11 beta 9.4.


----------



## COMBO2

I'm not sure whether I'll be RMAing this one or not... but here's my proof anyway.



It's an XFX model that I want to put under water (I have the block here). It has about half the coil whine of the one they replaced, and I'm really nervous about putting a waterblock on it, as they say it will void my warranty and that will be that, so I have to think carefully about whether I want to keep this one or not. The higher the FPS, the more it whines, and hence the more irritating it gets. I don't want the GPU itself to be the only thing causing noise in my watercooled rig.

I tried that paper towel trick to channel the noise. It seems to be coming from the power connectors or around the fan area. Nonetheless, it's annoying.

I really don't want a GTX 780, but I don't think they'll be replacing this card. I think at most they'll credit me my money back.

Any suggestions, guys? Should I just put up with it or push them to replace it?


----------



## quakermaas

Just got a pair of Powercolor R9 290s via DHL; they have consecutive serial numbers. I'll get to play with them later and try the BIOS flash with a 290X ROM.


----------



## iPDrop

Anybody know what could be causing this?


----------



## darkelixa

Heya, I'm looking at buying either a Sapphire/XFX R9 290 or a Sapphire Toxic 280X if they ever come in stock; the R9 290s are in stock, but the 280Xs aren't. Would it be worth the extra cash for the R9 290, or should I wait for the custom coolers? I'm after a lower-profile card so it will fit in my case and still work with the PCIe tool-less holder, so it can't be super tall.


----------



## mboner1

Quote:


> Originally Posted by *iPDrop*
> 
> Anybody know what could be causing this?


Just guessing really, man; it could be heaps of things.

Overclock? Overheating? Monitor frequency (doubtful, but try a different monitor/TV if you can)? Try operating at reduced DVI frequency in CCC; it can fix some issues. Power supply? Faulty card?


----------



## grandpatzer

I recently changed to an Asus Maximus IV Extreme P67 + *watercooled 290 @ 1218 core/1500 memory, and I only get 66.1fps in Valley Extreme HD*.

Is it because my 2500k CPU is currently at 3.4-3.5GHz and my memory is currently at 667MHz?


----------



## BackwoodsNC

Quote:


> Originally Posted by *Derpinheimer*
> 
> Sure, thanks to both of you.
> 
> I have another issue though. Any time I try to play a youtube video the computer hardlocks. Games work completely fine. I would try to disable HW acceleration, if enabled {I had it off for my 7950 which I just removed for this card}, but I only get about 1 second from opening a video and having it freeze, regardless of if its paused or not.
> 
> .mkv play fine through VLC Media player


Lower your memory overclock. Put it back to stock and see if it still happens.


----------



## JordanTr

OK guys, so I decided to grab an Accelero Extreme III for my Sapphire R9 290. Can someone give me a link to Amazon or eBay where I can get those extra VRAM heatsinks? Or is there no need for that; should I just put on as many as I got and leave the bottom one and a few other VRAM modules without sinks? Thanks in advance.


----------



## S410520

@Derpinheimer;

I had similar issue with my 144hz screen, if you have a 120+ hz screen, try to set it lower. (60hz to test)


----------



## Jpmboy

Quote:


> Originally Posted by *RAFFY*
> 
> I thought I read some where that people were just using thermal grease and having great results. I could be wrong as I am a water noob.


The VRMs are not uniform contact surfaces, and neither are the VRAM chips (uneven as a result of how they are mounted to the PCB). OEMs use the thermal pads for a reason... but let us know if you go without and how it turns out.


----------



## Jpmboy

Quote:


> Originally Posted by *smokedawg*
> 
> I think the limit for AMD cards is 330Mhz. Your required pixel clock is shown in CRU or you can calculate it here (xls file)

















thx. +1
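For anyone following along, the required pixel clock is just total pixels (active plus blanking) times refresh rate. A quick sketch; the blanking figures here are rough CVT-style guesses I've plugged in, not the exact per-mode timings CRU would report:

```python
# Required pixel clock = h_total * v_total * refresh.
# The default blanking overheads below are rough CVT-style guesses;
# use CRU for the exact timings of a given mode.
def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=40):
    h_total = h_active + h_blank
    v_total = v_active + v_blank
    return h_total * v_total * refresh_hz / 1e6

# 2560x1440 at 60 Hz with these timings lands around 241.5 MHz,
# comfortably under the ~330 MHz limit mentioned above.
print(round(pixel_clock_mhz(2560, 1440, 60), 1))
```

Handy for sanity-checking whether an overclocked refresh rate will blow past the limit before you try it.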


----------



## Arizonian

Quote:


> Originally Posted by *specopsFI*
> 
> I've never joined one of these owners clubs before but this one might be both handy and fun
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here's a screenie, sorry had to crop it a bit:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> This is a second hand Club3D R9 290 that I got for a straight swap for my 780. Usually I lose those gambles but this seems to have panned out ok. First of all, it was already unlocked by the previous owner, hence the shader count in pic. Also no black screens so far and practically no coil whine. It even seems to be a decent clocker, could do 1120MHz for core on stock voltage in Heaven with no artifacts. Memory is Elpida but that doesn't seem to be a problem for me, at least until I start to OC it which I haven't yet.
> 
> So to plug in to this knowledge base: what's the deal with the huge mouse pointer / cursor? I remember having read about it but couldn't find it now. I have it since I installed the drivers for the first time, on 13.11 beta 9.4.


Welcome to the owners club then, happy to have you aboard.







Congrats - added









Quote:


> Originally Posted by *COMBO2*
> 
> I'm not sure whether I'll be RMAing this one or not... but here's my proof anyway.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> It's an XFX model that I want to put on water (I have the block here). It has about half the coil whine of the one they replaced, and I'm really nervous to put a waterblock on it as they say it will void my warranty and that will be that, so I have to think carefully as to whether I want to keep this one or not. The higher the FPS, the more it whines, hence the more irritating it gets. I don't want the GPU itself to be the only thing causing noise in my watercooled rig.
> 
> I tried that paper towel thing to channel the noise. It seems to be coming from the power connectors or around the fan area. None the less, it's annoying.
> 
> I really don't want a GTX 780 but I don't think they'll be replacing this card. I think at most they'll credit me my money back.
> 
> Any suggestions guys? Should I just put up with it or force them to replace it?


Hey COMBO2 and fellow U2713HM club member







Congrats - added









Quote:


> Originally Posted by *quakermaas*
> 
> Just got a pair of Powercolor R9 290s via DHL, they have consecutive serial numbers, will get a play with them later, and try the BIOS flash with a 290x ROM.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added the pair -


----------



## Scorpion49

Quote:


> Originally Posted by *iPDrop*
> 
> Anybody know what could be causing this?


I get that effect in far cry when the shadows are set to high. Not sure about battlefields though.


----------



## SultanOfWalmart

Anyone else's Afterburner (beta 17) not saving power-limit settings? I can change it, hit apply, and it looks like it sticks; however, it still throttles. Open ASUS GPU Tweak and it's showing the power limit at its original setting. Change it to 20% there, apply, and it sticks and no longer throttles.

Any ideas?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> Anyone else's Afterburner (beta 17) not saving power-limit settings? I can change it, hit apply, and it looks like it sticks; however, it still throttles. Open ASUS GPU Tweak and it's showing the power limit at its original setting. Change it to 20% there, apply, and it sticks and no longer throttles.
> 
> Any ideas?


MSI AB doesn't work for me either, I just use GPUTweak now.


----------



## KyGuy

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> Anyone else's Afterburner (beta 17) not saving power-limit settings? I can change it, hit apply, and it looks like it sticks; however, it still throttles. Open ASUS GPU Tweak and it's showing the power limit at its original setting. Change it to 20% there, apply, and it sticks and no longer throttles.
> 
> Any ideas?


I think it is something with Afterburner. I had my card overclocked to 1050/1500, and it worked fine when I benched it, stressed it, etc. But even with "Don't apply clocks at startup" set, the clocks applied, and it was very difficult to revert the overclock, as the screen would frequently flash lines across it. No problem with the power limit for me... but this is the problem I have: if I shut down without reverting the OC manually, it will not revert at all, and Windows almost fails to start. ULPS is disabled.


----------



## jerrolds

Quote:


> Originally Posted by *S410520*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Upgraded my second unlocked 290 from Gelid Icy to Accelero Hybrid last night
> .
> The first one did *poof* before I could upgrade the cooler so thats rma'd..
> 
> 
> 
> 
> 
> 
> 
> (smoke from the power connectors, a chemical burn smell and all that)
> 
> Just a point of advice for people getting this cooler for a 290(X): you will need to modify the risers to set a safe pressure level on the cooler.
> 
> A few more pointers:
> - only thermal glue is provided for the memory chips and VRMs.
> - not enough RAM sinks in the package (came 3 short or something, and no, I did not lose them).
> - no clearance for 1 RAM sink; you can saw half of a sink's fins off to make it fit (the lower RAM chip by the PCI slot side).
> - not enough clearance for another RAM heatsink; you can place it 1mm off its center to make it fit.
> 
> The rest of the Hybrid package is very universal; lots of VRM sinks in many shapes and sizes.
> 
> I used the endplate of the stock cooler for the VRMs by the GPU. It works great and I can really recommend this.
> Decided to glue the RAM sinks for better temps and used some thermal stickers I had left from the Gelid package on the small group of VRMs in the front.
> 
> Temps are best I ever had for a gpu as this is my first watercooler gpu
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Some runs of valley and BF4 give the temps below @ 1140 1.214v load actual v.
> max GPU: 61c
> max vrm1; 72c
> max vrm2; 56c
> 
> Too bad this one does not seem to want 1150mhz no matter what.
> 
> (forgot screenshots, here are pics of the mod and clearance to the card.)


Nice job on hacking the stock cooler - can I ask how you did that? Hacksaw?


----------



## RAFFY

Quote:


> Originally Posted by *Jpmboy*
> 
> The vrms are not uniform contact surfaces, neither are the vram chips (uneven as a result of how they are stuck into the pcb. OEMs use the thermal pads for a reason... but let us know if you go without and the result.


Lol fook that! Once I hit water I want the best performance possible. I'll be purchasing those aftermarket cooling pads.


----------



## errorlulz

Any news or eta on non reference r9 290/290x


----------



## S410520

Thanks jerrolds,

Just a junior hacksaw indeed, and it destroyed all the teeth on 2 blades, but it worked great.

Stay patient and make sure you cannot slip. This is actually the easiest part of mounting the Hybrid or the Gelid







.
Try customizing ram heat sinks, tiny things are hard to work on with a hacksaw









Here's a screen from GPU-Z after 15min [email protected]:


I finally found out which vrm is which in GPUZ:










The VRM1 temp is the temperature of the row of VRMs between the GPU and the power connectors.
VRM2 is the temperature of the 3 small, grouped VRMs at the front, near the display connectors.

Funny thing is that the 3 grouped VRMs at the front are very easy to cool with a small heatsink per VRM.
It's the long row of VRMs, VRM1 in GPU-Z, that is usually difficult to keep cool.

Probably everyone here knew that, but if you search for it on Google you cannot find it...


----------



## hotrod717

Quote:


> Originally Posted by *grandpatzer*
> 
> I recently changed to a Asus Maximus IV Extreme P67 + *watercooled 290 @ 1218core/1500memory I only get 66.1fps in Valley Extreme HD*.
> 
> Is it because my 2500k CPU is atm 3.4-3.5ghz and memory is atm 667mhz?


I would say your memory and possibly the low oc on your cpu. Although valley doesn't seem to show these cards well. Your close though. This is what I get at 1200/1500 1.32v - Unoptimized and std. win7


----------



## tsm106

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> It's about to get real yo!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Good luck bro, represent us well.

Not all scores are up yet, but I just saw this one!



http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+extreme+preset/version+1.1/3+gpu


----------



## hotrod717

Quote:


> Originally Posted by *tsm106*
> 
> Not all scores are up yet, but I just saw this one!
> 
> 
> 
> http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+extreme+preset/version+1.1/3+gpu


Congrats. Way to represent the red team!


----------



## Connolly

I can't change the core voltage in MSI Aferburner, do I need to do something specific first?


----------



## DCRussian

Looks like I got skipped. Could you add me? Check this post for proof


----------



## S410520

Here's my temps @ 1140Mhz (still working on oc) with the Accelero Hybrid modded with endplate of the stock cooler:

The fan's are set to 50%@40c which is about same noise as 20% on the stock cooler








Really enjoying the pwm on the fans.

If you don't mind modding, I can definitely recommend this cooler over the Gelid Icy and the Accelero Extreme III as well.
It has better heat dissipation and you can place the radiator as an exit-fan, meaning, no heat in the case








IDLE, LOAD
GPU: 29c 57c
VRM1: 24c 64c
VRM2: 28c 56c


----------



## cyenz

Is +100mV the max safe voltage for a 290X, or does it depend on the temps? My friend is currently at 1160MHz at +100mV, which is the max on Afterburner; his temps are around 67C after 2 hours of BF4 and 69C on the VRMs.


----------



## zpaf

Here is a Heaven run, but with a lot of dips in GPU usage and core clock.


----------



## mboner1

Quote:


> Originally Posted by *cyenz*
> 
> Is +100mV the max safe voltage for a 290X, or does it depend on the temps? My friend is currently at 1160MHz at +100mV, which is the max on Afterburner; his temps are around 67C after 2 hours of BF4 and 69C on the VRMs.


Just about to ask the same thing lol. Does anyone know how dangerous it is to set it to max voltage in afterburner??


----------



## Jpmboy

Quote:


> Originally Posted by *RAFFY*
> 
> Lol fook that! Once I hit water I want the best performance possible. I'll be purchasing those aftermarket cooling pads.


FujiPoly !!


----------



## chiknnwatrmln

Quote:


> Originally Posted by *mboner1*
> 
> Just about to ask the same thing lol. Does anyone know how dangerous it is to set it to max voltage in afterburner??


Well +100 mV in AB is what, 1.35v before droop? Correct me if I'm wrong.

Also, actual voltages are more important. This means what the voltage is when the GPU is at 100%, after vdroop.

I wouldn't put more than 1.3v actual into a card for daily use, any less than that should be fine so long as your core and VRM temps are under control.

With Hawaii, the cooler the better, but I'd try and keep your core under 80c and VRM's under 100c. Use a custom fan profile if you need to, I'd rather have a loud GPU run cool than a quiet GPU burn itself out.

I'm running my 290 @ 1.412 (max in GPUTweak, before droop), about 1.26 actual at 1200/6000 and my max temps are 62c core, 85 and 55 for VRMs 1 and 2. Past 62c core and I artifact.

Edit: I will say that you should create a second profile for when you're not doing GPU intensive stuff. I have one with the core and mem clocks lowered by about 10% and voltage left stock. If you're just browsing the web there's no point in pumping your GPU full of volts.
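To put rough numbers on the droop math above, here's a minimal sketch. The 1.25 V stock level and the 7% droop fraction are illustrative guesses of mine, not measured values; read the real load voltage off GPU-Z rather than trusting this:

```python
# Estimate the actual load voltage from the stock voltage plus the
# Afterburner offset, minus an assumed vdroop. Both the stock voltage
# and the droop fraction here are illustrative guesses, not specs.
def actual_voltage(stock_v, offset_mv, droop_frac=0.07):
    set_point = stock_v + offset_mv / 1000.0
    return set_point * (1.0 - droop_frac)

# +100 mV on an assumed 1.25 V stock: 1.35 V set, ~1.26 V after droop
print(round(actual_voltage(1.25, 100), 2))
```

Which lines up with the numbers above: a +100 mV offset that reads ~1.35 V before droop ends up in the mid-1.2s under load.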


----------



## tsm106

Quote:


> Originally Posted by *hotrod717*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Not all scores are up yet, but I just saw this one!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+extreme+preset/version+1.1/3+gpu
> 
> 
> 
> Congrats. Way to represent the red team!

Thanks dude. Oh, look what I saw here. Two medium-clocked quad setups pushing Titans around.


----------



## mboner1

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Well +100 mV in AB is what, 1.35v before droop? Correct me if I'm wrong.
> 
> Also, actual voltages are more important. This means what the voltage is when the GPU is at 100%, after vdroop.
> 
> I wouldn't put more than 1.3v actual into a card for daily use, any less than that should be fine so long as your core and VRM temps are under control.
> 
> With Hawaii, the cooler the better, but I'd try and keep your core under 80c and VRM's under 100c. Use a custom fan profile if you need to, I'd rather have a loud GPU run cool than a quiet GPU burn itself out.
> 
> I'm running my 290 @ 1.412 (max in GPUTweak, before droop), about 1.26 actual at 1200/6000 and my max temps are 62c core, 85 and 55 for VRMs 1 and 2. Past 62c core and I artifact.
> 
> Edit: I will say that you should create a second profile for when you're not doing GPU intensive stuff. I have one with the core and mem clocks lowered by about 10% and voltage left stock. If you're just browsing the web there's no point in pumping your GPU full of volts.


What program are you using to monitor your vrm temps, and what is the max for those temps? I think my temperatures should be fine, I was 1120/1450 with no voltage increase with 55% fan speed and sitting on 80 degrees, so I'm maxing the volts in afterburner and can get 1180/1500 and fan profile @ 70 max. Just seeing what the temps are now with a quick run of fire strike. + rep as well, cheers.


----------



## Skylark71

Arizonian, can you update my status to water?

Installed EK FC R9 290X to my Asus 290:


----------



## Derpinheimer

Anyone have any idea of what kind of heatsink I can use for VRM with a universal waterblock? I saw some Gelid VRM thing, but have no idea if its actually compatible [was for a 5970 or something]


----------



## Skylark71

Quote:


> Originally Posted by *mboner1*
> 
> What program are you using to monitor your vrm temps, and what is the max for those temps? I think my temperatures should be fine, I was 1120/1450 with no voltage increase with 55% fan speed and sitting on 80 degrees, so I'm maxing the volts in afterburner and can get 1180/1500 and fan profile @ 70 max. Just seeing what the temps are now with a quick run of fire strike. + rep as well, cheers.


You can check your vrm temps with Gpu-z, under the sensor tab.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *mboner1*
> 
> What program are you using to monitor your vrm temps, and what is the max for those temps? I think my temperatures should be fine, I was 1120/1450 with no voltage increase with 55% fan speed and sitting on 80 degrees, so I'm maxing the volts in afterburner and can get 1180/1500 and fan profile @ 70 max. Just seeing what the temps are now with a quick run of fire strike. + rep as well, cheers.


VRM temps appear at the very bottom of GPU-z for me, I have to scroll down. If yours don't appear, update GPU-z or try HWInfo.

I have heard the VRMs are rated up to 125c, but if they get too hot they can really affect your OC. So far I've seen my VRMs max at about 85c (only VRM 1; VRM 2 maxes at about 55c).

Besides, I don't feel comfortable having anything in my PC hot enough to boil water. Reminds me of my friend's MacBook Pro; that thing's CPU breaks 110c regularly...


----------



## hotrod717

Quote:


> Originally Posted by *tsm106*
> 
> Thanks dude. Oh look what I saw here. Two medium clocked quads setups pushing titans around.


Nice to see "k" in there. The really great thing is, all those Titans are on mature drivers. Things can only get better as time progresses and we get some better drivers.


----------



## Arizonian

Quote:


> Originally Posted by *tsm106*
> 
> Not all scores are up yet, but I just saw this one!
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+extreme+preset/version+1.1/3+gpu


And represent Hawaii you did.







Waiting on your other scores that didn't post yet.

Quote:


> Originally Posted by *Skylark71*
> 
> Arizonian, can you update my status to water?
> 
> Installed EK FC R9 290X to my Asus 290:
> 
> 
> Spoiler: Warning: Spoiler!


Updated









Quote:


> Originally Posted by *DCRussian*
> 
> Add me to the list as well.
> 
> I have a PowerColor 290 that's been flashed to 290X, running with stock cooling at the moment.
> 
> Benches pre-flash/post-flash
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *here*.
> 
> Stock Flashed


Quote:


> Originally Posted by *DCRussian*
> 
> Looks like I got skipped. Could you add me? Check this post for proof


Sorry about missing you. You're slotted in order of entry, and belated congrats - added


----------



## grandpatzer

Quote:


> Originally Posted by *hotrod717*
> 
> I would say your memory and possibly the low oc on your cpu. Although valley doesn't seem to show these cards well. Your close though. This is what I get at 1200/1500 1.32v - Unoptimized and std. win7


Do you have a 290X / unlocked 290?
If that is the case, I think you should have about 6% more than me, so about 1.06 * 66.1 = 70.0fps.
In case your score is with a normal 290, then I have some tweaking to do to get more than 66.1fps.

A user suggested I tweak some CCC settings for Valley. I'll try to figure out how to OC with this motherboard and put the CPU somewhere around 3.9-4.2GHz.
Then the memory should be at about 9-9-9-24 1600MHz instead of the 1333MHz it is at now.


----------



## mboner1

Almost cracked 10,000 in Fire Strike: GPU score of 12,000+, at 1180/1500 with max voltage in MSI Afterburner. Temps hit 89 degrees with a fan speed of 67%, so I think I'm just gonna dial it back down. VRM temps stayed under 80 as well, according to GPU-Z. Is that a decent score for how hard I'm having to push it?


----------



## Forceman

Quote:


> Originally Posted by *cyenz*
> 
> Is +100mV the max safe voltage for a 290X, or does it depend on the temps? My friend is currently at 1160MHz at +100mV, which is the max on Afterburner; his temps are around 67C after 2 hours of BF4 and 69C on the VRMs.


+100mV should be fine. With the Vdroop that's only about 1.25V actual, which should be no problem (especially since it is lower at idle and 2D mode). Consider that the GPUs are on a larger process than recent CPUs, and people push 1.35 or 1.4V through those long-term without any problems at all.


----------



## Jack Mac

So does anyone have any idea when Amazon is getting more R9 290s in stock? I *REALLY* don't want to use Newegg again.


----------



## Gunderman456

Quote:


> Originally Posted by *Connolly*
> 
> I can't change the core voltage in MSI Aferburner, do I need to do something specific first?


Enable voltage control in AB settings.


----------



## skupples

Quote:


> Originally Posted by *RAFFY*
> 
> Lol fook that! Once I hit water I want the best performance possible. I'll be purchasing those aftermarket cooling pads.


Fujipoly 30cm x 10cm pads are ~$30-40. Not bad really when you think about it. Phobya charges $20 for a 10x10cm piece. 30x10cm is enough thermal pad to last you a few years of upgrades.

The only thing I've yet to figure out is which side is up and which is down. They come with a diamond-type finish on one side.


----------



## battleaxe

Quote:


> Originally Posted by *skupples*
> 
> Fujipoly 30cm x 10cm pads are ~$30-40. Not bad really when you think about it. Phobya charges $20 for a 10x10cm piece. 30x10cm is enough thermal pad to last you a few years of upgrades.
> 
> The only thing I've yet to figure out is which side is up and which is down. They come with a diamond-type finish on one side.


Aren't they sticky on two sides? Which should mean that it doesn't matter, no?


----------



## skupples

Quote:


> Originally Posted by *battleaxe*
> 
> Aren't they sticky on two sides? Which should mean that it doesn't matter, no?


Yes, they are sticky on both sides, but that doesn't take into account the imprint on one side. I would guess the imprinted side is the side between the TIM & the vrm/vram.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> Not all scores are up yet, but I just saw this one!
> 
> 
> 
> http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+extreme+preset/version+1.1/3+gpu


Congrats, tsm. Jp needs to be congratulated, too. Tops at single run here . . . by a lot.

http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad

edit: oops, that was tess off. Beats Osjur's graphics with tess on, though.


----------



## Arizonian

Quote:


> Originally Posted by *rdr09*
> 
> Congrats, tsm. Jp needs to be congratulated, too. Tops at single run here . . . by a lot.
> 
> http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad


That he does too, even if he wasn't congratulated by any of us yet.









I was an early adopter and had my name in top 30 on air, for a very brief time anyway.


----------



## rdr09

Quote:


> Originally Posted by *Jack Mac*
> 
> So does anyone have any idea when Amazon is getting more R9 290s in stock? I *REALLY* don't want to use Newegg again.


there are other stores. Try Provantage - it is reputable.


----------



## rdr09

Quote:


> Originally Posted by *Arizonian*
> 
> At that he does too, if he wasn't by any of us.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was an early adopter and had my name in top 30 on air, for a very brief time anyway.


i beat your graphics score already . . .

http://www.3dmark.com/3dm11/7556915








just using the power limit.

edit: can't wait to see you play with your Ti, Arizonian.


----------



## RAFFY

Quote:


> Originally Posted by *Jpmboy*
> 
> FujiPoly !!


Yup that's the stuff! Going to go with the extreme or whatever it is called for the hell of it.


----------



## Raephen

I have some 7W/mk Phobya thermal pad stuff lying arround, but when I get my hands on my Aquacomputer block I think I'll try with the supplied padding first.

So a dab of MX4 for the core and RAM modules and the AC-stock thermal padding for the VRMs... I can't wait!!!


----------



## Jack Mac

Well, I'm impatient. Hopefully Newegg isn't slow like they usually are; wish I could have used Amazon Prime, oh well.

I would love to be added to the club, if this isn't enough proof I can take a picture of it when I get it.


----------



## RAFFY

*Arizonian please update! I have just order my third ASUS 290x!*

PROOF RAFFY NO LIE!!!


----------



## Arizonian

Quote:


> Originally Posted by *RAFFY*
> 
> *Arizonian please update! I have just order my third ASUS 290x!*
> 
> PROOF RAFFY NO LIE!!!


Congrats - updated







+







= 3 updated!


----------



## RAFFY

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - updated
> 
> 
> 
> 
> 
> 
> 
> +
> 
> 
> 
> 
> 
> 
> 
> = 3 updated!


Thanks! You'll have to give me more







: once I get em under water in the next few weeks!


----------



## hotrod717

Quote:


> Originally Posted by *grandpatzer*
> 
> Do you have a 290X / unlocked 290?
> If that is the case, I think you should have about 6% more than me, so about 1.06 * 66.1 = 70.0fps.
> In case your score is with a normal 290, then I have some tweaking to do to get more than 66.1fps.
> 
> A user suggested I tweak some CCC settings for Valley. I'll try to figure out how to OC with this motherboard and put the CPU somewhere around 3.9-4.2GHz.
> Then the memory should be at about 9-9-9-24 1600MHz instead of the 1333MHz it is at now.


Unlocked to 290X. Yes, there are some tweaks that help with Valley. My run is unoptimized and not tweaked in any way. I'm actually downclocked from what I usually run my memory at; I usually run 9-11-11-28 2400MHz with my Flares. I know people have said you don't need any more than 1600MHz, but things are changing, and faster memory does improve things quite a bit. I believe there was discussion that faster memory helps in Valley, and if you look at the big dogs in OCing, it's 3930Ks with 2400MHz+ 16GB memory. There's a reason. BTW, not my max OC, just where I've gotten to so far!


----------



## grandpatzer

Quote:


> Originally Posted by *hotrod717*
> 
> Unlocked to 290X. Yes, there are some tweaks that help with Valley. My run is unoptimized, and not tweaked in any way. I actually am downclocked from what I usually run my memory at. I usually run 9-11-11-28 2400mhz with my Flares. I know people have said you don't need any more than 1600mhz, but things are changing and faster memory does improve things quite a bit. I believe there was discussion that faster memory does help in Valley and if you take a look at the big dogs in oc'ing, 3930k's 2400mhz+16gb memory. There's a reason. BTW not my max oc. Just where I've gotten to so far!


So is yours fully unlocked to a 290X?
Because then we have similar scores (I get 66fps with a 290, you get 70fps with a 290X; I believe the 290X gets about 6% more fps in Valley).

I have not done any tweaking in Valley, and my CPU is at 3.4-3.5GHz; memory is at [email protected] 8 8 20 I think.

It's interesting to see if OCing the DDR3 memory + CPU gives anything.
I was shocked to see that my Metro LL benchmarks [email protected]/1500 gave *exactly* the same fps as my [email protected]/1250(!)
Unless there is some software error, the [email protected] is bottlenecking both the 290 and the 7950 in Metro LL.

At least in Valley the 290 scores about 40% better than the 7950.


----------



## Jpmboy

Quote:


> Originally Posted by *rdr09*
> 
> Congrats, tsm. Jp needs to be congratulated, too. Tops at single run here . . . by a lot.
> 
> http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad
> 
> edit: oops, tess off. Beasts Osjur's graphics with tess on.


This is where I was with valid drivers until I read the rules for alancsalt's bench thread. Does anyone really understand why HWBot allows AMD tess off...?
http://www.3dmark.com/3dm11/7560926


oh yeah - tsm is (has been) the king AMD overclocker!!









but sometimes Futuremark has to be reminded of invalid scores in HOF. Check the #1 spot.


----------



## Jpmboy

Quote:


> Originally Posted by *RAFFY*
> 
> Yup that's the stuff! Going to go with the extreme or whatever it is called for the hell of it.


Like skup was saying... put the "rough" side towards the PCB with a trace of TIM on that side. The smooth side will not need any with a clean waterblock contact surface. And this stuff is very reusable!


----------



## tsm106

it's cuz they can't detect nv hax and ppl have been using nv hax for a long time, so as payback they allowed tess off to even things out after all those years.


----------



## battleaxe

Quote:


> Originally Posted by *tsm106*
> 
> it's cuz they can't detect nv hax and ppl have been using nv hax for a long time, so as payback they allowed tess off to even things out after all those years.


Ah.... okay.


----------



## Jpmboy

Quote:


> Originally Posted by *tsm106*
> 
> it's cuz they can't detect nv hax and ppl have been using nv hax for a long time, so as payback they allowed tess off to even things out after all those years.


I would hope they have run some tests and have metrics that justify it as leveling the playing field. Frankly, when I run the Futuremark tests (any of them) comparing my Titans to the 290X with tess off, there is NO visual difference in the graphics... I know, not very scientific at all.


----------



## Ukkooh

Quote:


> Originally Posted by *Ukkooh*
> 
> 
> 
> Spoiler: Images
> 
> 
> 
> 
> 
> 
> 
> Has anyone gotten these with the new v9.4 drivers? The guys from the shop told me to try these drivers before proceeding with the RMA to see if it fixes the black screens. They were right I'm not getting black screens anymore and now I get some beautiful art. Should I proceed with the RMA or not?


Bumping my question, as I'm still not sure whether my card is faulty, since it runs normally 90% of the time. This happened at stock clocks. Just before I took those images I was playing DiRT Showdown, and the actual crashing/artifacting started at the desktop. The card's performance was also a bit odd: I averaged only ~32fps with lower dips where my 7970 could run a stable 60fps (limited) with the same settings. If someone else has DiRT Showdown, could you kindly play it for a few minutes to see if it causes issues like mine? I'd really like to get this sorted out before I order my watercooling stuff next week. I'm kind of afraid to RMA, as they'll make me pay the postage and testing fees if it turns out that the card is not faulty. Thanks in advance, comrades.


----------



## brazilianloser

Quote:


> Originally Posted by *Ukkooh*
> 
> Bumping my question, as I'm still not sure whether my card is faulty, since it runs normally 90% of the time. This happened at stock clocks. Just before I took those images I was playing DiRT Showdown, and the actual crashing/artifacting started at the desktop. The card's performance was also a bit odd: I averaged only ~32fps with lower dips where my 7970 could run a stable 60fps (limited) with the same settings. If someone else has DiRT Showdown, could you kindly play it for a few minutes to see if it causes issues like mine? I'd really like to get this sorted out before I order my watercooling stuff next week. I'm kind of afraid to RMA, as they'll make me pay the postage and testing fees if it turns out that the card is not faulty. Thanks in advance, comrades.


Man, try the WHQL drivers, since you are on the latest beta. If that doesn't help, in my sincere opinion any crash, black screen or problem like that at stock clocks is an RMA ticket for me. Why settle for something that doesn't even operate at stock? But first, check your power consumption to make sure the card is getting the power it needs from your PSU, try different drivers, and maybe do a fresh install on another HDD you might have sitting around to see...


----------



## maynard14

Temps while playing Crysis 3 on an unlocked XFX R9 290 with the BIOS from an Asus 290X.

Is it ok, sir? Or normal?


----------



## zpaf

Just play with 3dmark vantage.


----------



## Snyderman34

Sorry to ask again, but I can't find an answer. Would upgrading my RAM from 1600MHz to 2133MHz have any effect on games or benching?


----------



## devilhead

Quote:


> Originally Posted by *grandpatzer*
> 
> Do you have a 290X / unlocked 290?
> If that is the case, I think you should have about 6% more than me, so about 1.06 * 66.1 = 70.0fps.
> In case your score is with a normal 290, then I have some tweaking to do to get more than 66.1fps.
> 
> A user suggested I tweak some CCC settings for Valley. I'll try to figure out how to OC with this motherboard and put the CPU somewhere around 3.9-4.2GHz.
> Then the memory should be at about 9-9-9-24 1600MHz instead of the 1333MHz it is at now.


here is mine 1170/1500 xfx 290, not X


----------



## Gunderman456

Quote:


> Originally Posted by *Snyderman34*
> 
> Sorry to ask again, but I can't find an answer. Would upgrading my RAM from 1600MHz to 2133MHz have any effect on games or benching?


Typically the consensus was no real advantage going from 1600MHz to 2133MHz in either, however, apparently that is not the case in BF4.


----------



## Snyderman34

Quote:


> Originally Posted by *Gunderman456*
> 
> Typically the consensus was no real advantage going from 1600MHz to 2133MHz in either, however, apparently that is not the case in BF4.


Alright then. I'll hold onto my cash then. Thanks a bunch!


----------



## Jpmboy

Quote:


> Originally Posted by *Snyderman34*
> 
> Sorry to ask again, but I can't find an answer. Would upgrading my RAM from 1600MHz to 2133MHz have any effect on games or benching?


yes - measurable in benching, but not noticeable in gaming.


----------



## Forceman

Quote:


> Originally Posted by *maynard14*
> 
> 
> 
> 
> 
> temp while playing crysis 3 on unlock xfx r9 290 with bios from asus 290x
> 
> is it ok sir? or normal?


Looks pretty normal for air cooling.


----------



## tsm106

Quote:


> Originally Posted by *Jpmboy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Snyderman34*
> 
> Sorry to ask again, but I can't find an answer. Would upgrading my RAM from 1600MHz to 2133MHz have any effect on games or benching?
> 
> 
> 
> yes - measurable in benching, but not noticeable in gaming.
Click to expand...

Going from 1600 to 2133 raised my physics score 1K points or more. 2400 MHz is the sweet spot imo; on Ivy-class CPUs, though, the higher the better.


----------



## zpaf

My best on 3dmark11 with [email protected]

















AMD Radeon R9 290 video card benchmark result - Intel Core i7-4770K,ASUSTeK COMPUTER INC. MAXIMUS VI IMPACT


----------



## Snyderman34

Quote:


> Originally Posted by *tsm106*
> 
> Going from 1600 to 2133 raised my physics score 1K points or more. 2400mhz is the sweet spot imo, however on ivy class cpus the higher the better is best.


Ah. Lol. Was wondering why I couldn't crack 11k in Fire Strike. What's the scope with Haswell? Probably gonna hold onto my money for now, but that G.Skill RipJaw from NewEgg is hard to resist ($54.99 for 2x4GB)


----------



## binormalkilla

Just ordered my third 290x. I need to get 3 blocks and an EK GPU adapter too. Anyone tried the Koolance block yet?


----------



## RAFFY

How many sheets of the Fujipoly in .5 and 1 will I need for 3*290x using the EK blocks?


----------



## rdr09

Quote:


> Originally Posted by *RAFFY*
> 
> How many sheets of the Fujipoly in .5 and 1 will I need for 3*290x using the EK blocks?


this should be enough for all three . . .

http://www.frozencpu.com/products/16880/thr-165/Fujipoly_Extreme_System_Builder_Thermal_Pad_-_14_Sheet_-_150_x_100_x_10_-_Thermal_Conductivity_110_WmK.html

for each size. I only did one GPU and I have plenty left. Measure before applying. You can use the pads that come with the kit if there's not enough. The VRAMs will use the most, and if you want to be safe, get 2 of those.


----------



## Jpmboy

Quote:


> Originally Posted by *RAFFY*
> 
> How many sheets of the Fujipoly in .5 and 1 will I need for 3*290x using the EK blocks?


From frozencpu?


----------



## flamin9_t00l

Quote:


> Originally Posted by *Ukkooh*
> 
> 
> 
> 
> Has anyone gotten these with the new v9.4 drivers? The guys from the shop told me to try these drivers before proceeding with the RMA to see if it fixes the black screens. They were right I'm not getting black screens anymore and now I get some beautiful art. Should I proceed with the RMA or not?


I have had this happen 2 times but they only flashed up for milliseconds. The first time was during post before I got into windows and it booted up just fine. The second was resuming from sleep which resulted in a bsod. These problems happened almost as soon as I got the card from new and I was using 13.11 beta 9.2.

I have since reinstalled windows about 2 weeks ago and it has never happened again. The card is running perfectly now, no problems whatsoever and I have tried all 3 drivers for days at a time without issue.

I'm putting it down to the fact the card was brand new and maybe bedding into the thermal paste or something, I dunno.

On another note I've noticed that sleeping dogs really hammers this gpu getting a good few degrees hotter than both Ghosts and BF4 which I wasn't expecting, haha.


----------



## RAFFY

Quote:


> Originally Posted by *Jpmboy*
> 
> From frozencpu?


Yes, and correct me if I'm wrong, but I need one sheet of 0.5 thickness and another sheet of 1 thickness, correct?


----------



## rdr09

Quote:


> Originally Posted by *RAFFY*
> 
> Yes and correct me if im wrong but I need one sheet of .5 thickness and another sheet of 1 thickness correct?


yes.


----------



## Jpmboy

Quote:


> Originally Posted by *RAFFY*
> 
> Yes and correct me if im wrong but I need one sheet of .5 thickness and another sheet of 1 thickness correct?


Yes, you need both. The $15 0.5 mm pad is enough for at least 3 blocks. You need one 16x100mm piece of each thickness for each block.

The instructions on the EK website show what you need. I can't attach the PDF file from this iPad.
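If you want to double-check that coverage claim, the arithmetic is simple. This is a rough sketch only: the 150 x 100 mm sheet size is taken from the FrozenCPU listing linked earlier in the thread, and cutting waste is ignored.

```python
# Sanity check: how many 16 x 100 mm strips fit in one 150 x 100 mm sheet?
# (Sheet dimensions assumed from the FrozenCPU listing; waste ignored.)
SHEET_LENGTH_MM = 150
STRIP_WIDTH_MM = 16   # one 16 x 100 mm strip per thickness per block

strips_per_sheet = SHEET_LENGTH_MM // STRIP_WIDTH_MM
print(strips_per_sheet)  # 9 strips, so one sheet per thickness easily covers 3 blocks
```

So one sheet of each thickness comfortably covers three EK blocks, which matches the "enough for at least 3 blocks" estimate above.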


----------



## RAFFY

Quote:


> Originally Posted by *Jpmboy*
> 
> Yes, you need both. The $15 0.5 mm pad is enough for at least 3 blocks. You need one 16x100mm piece of each thickness for each block.
> 
> The instructions on the EK website show what you need. I can't attach the PDF file from this iPad.


Thanks for the answers guys. One last noob question! Which thermal compound should I use on my cpu (AS5 still king?) ? Which thermal grease should I use on my GPU?


----------



## tsm106

Quote:


> Originally Posted by *RAFFY*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jpmboy*
> 
> Yes, you need both. The. $15 0.5 mm pad is enough for at least 3 blocks. You need one 16x100mm piece of each thickness for each block.
> 
> The instructions on the ek website show what you need. I cant attach the pdf file from this ipad
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for the answers guys. One last noob question! Which thermal compound should I use on my cpu (AS5 still king?) ? Which thermal grease should I use on my GPU?
Click to expand...

CLU for the win. Masking tape for applying and removal. Use a baby wipe to wipe it off.


----------



## rdr09

Quote:


> Originally Posted by *RAFFY*
> 
> Thanks for the answers guys. One last noob question! Which thermal compound should I use on my cpu (AS5 still king?) ? Which thermal grease should I use on my GPU?


I read good stuff about Gelid Extreme and it is safe (non-capacitive and non-conductive). I think if you spent that much on cooling you might as well use the best there is. I used Coollaboratory Liquid Ultra from reading so much of tsm106's posts.









CLU is hazardous to your gpus if you are not careful.

edit: you see where i am coming from?


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> I read good stuff about Gelid Extreme and it is safe (non-capacitive and non-conductive). I think if you spent that much on cooling you might as well use the best there is. I used Coollaboratory Liquid Ultra from reading so much of tsm106's posts.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> CLU is hazardous to your gpus if you are not careful.


I've always used Gelid Extreme too. Good stuff.

Phobya HE Grease is the exact same thing. Frozen CPU has it for 10.00 for the 3.5g tube. Enough to do about 15 installs.


----------



## ImJJames

I really can't wait any longer for a non-ref version; which R9 290 manufacturer's cards can BIOS unlock to a 290X?


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> I've always used Gelid Extreme too. Good stuff.
> 
> Phobya HE Grease is the exact same thing. Frozen CPU has it for 10.00 for the 3.5g tube. Enough to do about 15 installs.


i've only used MX-2 in the past and they are good as well. i went a bit overboard with the 290.

edit: my cpus use MX-2 and i have not changed them for almost a year. Their temps have not changed much.


----------



## Jpmboy

Quote:


> Originally Posted by *battleaxe*
> 
> I've always used Gelid Extreme too. Good stuff.
> 
> Phobya HE Grease is the exact same thing. Frozen CPU has it for 10.00 for the 3.5g tube. Enough to do about 15 installs.


No doubt the liquid metal TIMs are the best... But the worst to apply, very electrically conductive, and do not put it near any Alu. Gelid extreme, HeGrease or PK-1, 2 or 3 are very good. (I use gelid on the gpu, pk1 on the vrams and vrms).

If you take the right precautions (mask as tsm shows), CLU is the best TIM.


----------



## RAFFY

Thanks guys, I went with the Phobya HE as I found a couple of charts and it seems to test the 2nd lowest overall. Looks pretty awesome! Now I just need to pick out some coolant and wait for it all to arrive.


----------



## Derpinheimer

Quote:


> Originally Posted by *BackwoodsNC*
> 
> y
> 
> lower your memory overclock. put it back to stock and see if it still happens.


Yup, still happens when underclocked and overvolted.

Games and desktop work 100%, no issues. I found that Yahoo embed video also freezes it, but .mp4 and .mkv, etc, through VLC media player are fine.


----------



## battleaxe

Quote:


> Originally Posted by *RAFFY*
> 
> Thanks guys, I went with the Phobya HE as I found a couple of charts and it seems to test the 2nd lowest overall. Looks pretty awesome! Now I just need to pick out some coolant and wait for it all to arrive.


Just a warning: the stuff isn't the easiest to spread around.

They aren't kidding when they say 'grease'; it likes to stick to everything except the metal you want it on.

My best results have come from putting just the right amount on the GPU or CPU and setting the heatsink straight down on it. The less you mess with it the better. For a CPU, about the size of a grain of rice; for a GPU, about half a grain. That has always served me well with this stuff. It's a little different from the cheaper TIMs that are common.

Once on though, it works great.


----------



## Jpmboy

Quote:


> Originally Posted by *RAFFY*
> 
> Thanks guys, I went with the Phobya HE as I found a couple of charts and it seems to test the 2nd lowest overall. Looks pretty awesome! Now I just need to pick out some coolant and wait for it all to arrive.


Dont buy a premix. Just get some distilled water from the grocery store, then pick up a bottle of redline water wetter from your local auto parts store... Best anticorrosion you can get. Use just 5-10% in the DW. Been running this mix for ever. Keeps all the metal contact surfaces like new.
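To put numbers on that 5-10% ratio, here's a quick sketch. The 1.5 L loop volume below is a made-up example figure, not anything from the thread; substitute your own loop capacity.

```python
# Rough mix calculation for the distilled water + Redline Water Wetter
# approach described above (5-10% additive by volume).
loop_volume_ml = 1500  # hypothetical loop capacity; measure your own

low_ml = 0.05 * loop_volume_ml    # 5%  -> 75 ml
high_ml = 0.10 * loop_volume_ml   # 10% -> 150 ml
print(f"Add {low_ml:.0f}-{high_ml:.0f} ml Water Wetter, then top up with distilled water")
```

For a typical single-GPU loop of around a litre and a half, that's well under half of one retail bottle.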


----------



## RAFFY

Quote:


> Originally Posted by *Jpmboy*
> 
> Dont buy a premix. Just get some distilled water from the grocery store, then pick up a bottle of redline water wetter from your local auto parts store... Best anticorrosion you can get. Use just 5-10% in the DW. Been running this mix for ever. Keeps all the metal contact surfaces like new.


I'm only buying coolant for the blood red color from EK. I looked up some results and they all seem to score within the same range 29-32c. So I picked the EK because the color goes with my build.


----------



## DeadlyDNA

3 way Cf slight OC test run for Heaven 4.0 max settings


just goofing around before i add 4th card and test


----------



## Jack Mac

Quote:


> Originally Posted by *DeadlyDNA*
> 
> 3 way Cf slight OC test run for Heaven 4.0 max settings
> 
> 
> just goofing around before i add 4th card and test


Look at the temperature that Heaven is reporting, lol. 62,369C, I knew Hawaii ran hot, but dang.


----------



## Jpmboy

Quote:


> Originally Posted by *RAFFY*
> 
> I'm only buying coolant for the blood red color from EK. I looked up some results and they all seem to score within the same range 29-32c. So I picked the EK because the color goes with my build.


That's cool. Gonna look great!


----------



## maynard14

Hi there, does anyone know why HWiNFO doesn't show my R9 290 card's temps and other valuable info?


----------



## Forceman

You running the beta version? The release version may not be updated yet.


----------



## maynard14

I'm using this, sir:

HWiNFO64 v4.26
for Windows (Native 64-bit)

It doesn't show my temps. Do you have other programs to monitor the temps and info and display them while gaming?


----------



## skupples

Quote:


> Originally Posted by *Jpmboy*
> 
> Like skup was saying... put the "rough" side towards the PCB and a trace of TIM on that side. The smoothside will not need any with a clean waterblock contact surface. And this stuff is very reusable!


This "full size" sheet of Fujipoly will last me a REALLY long time, even with the massive pieces i'll be using for my CPU backplate mod. Thanks for the tip! It's cheep too. Phobya charges 20$ for 1/5 the amount.

Also, I see the TIM spreading debate has cropped up here...

*DOTS ARE THE BEST METHOD. "grain of rice" for CPU, and "half grain" for GPU. Spreading can & will introduce air bubbles, which are bad for dissipation.*

This has been proven time & time again. The only time this doesn't apply is with Coollabs liquid ultra. It is made to be painted on *as thin as possible.*


----------



## Forceman

Quote:


> Originally Posted by *maynard14*
> 
> im using this sir
> 
> HWiNFO64 v4.26
> for Windows (Native 64-bit)
> 
> i doesnt show my temps ,. do you have other programs to monitor the temps and infos and display them while gaming?


Download the new beta 4.27.2040 instead and try it.


----------



## jerrolds

Anyone know if I can still fit an EK backplate even though I put a Gelid cooler on the 290X? There's a bit of spacing from the nuts used to secure the cooler...

Hard to judge from this pic - 

Gonna try and find one during the weekend sales


----------



## Derpinheimer

Quote:


> Originally Posted by *Derpinheimer*
> 
> Yup, still happens when underclocked and overvolted.
> 
> Games and desktop work 100%, no issues. I found that Yahoo embed video also freezes it, but .mp4 and .mkv, etc, through VLC media player are fine.


Just FYI for anyone else: the problem was fixed by uninstalling the drivers with Display Driver Uninstaller [DDU] in safe mode and reinstalling Cat 13.11 v9.2.

----

I have another question though. I see reviews where people have these at 1.1 V or even less. I've not found a way to get below 1150 mV; MSI AB only allows -100, and GPU Tweak also caps at -100, despite allowing +700.. lol

Reason I ask is I'll have it under air for a few weeks, maybe, and would like to undervolt it so it's not a vacuum.


----------



## COMBO2

Alright guys, the guy from the retailer pretty much said that the 290X coil whine is a problem they had been having across the board on all their cards, and that it was a characteristic. They added that I must have extremely good hearing if I can hear it like they can, because they haven't had THAT many complaints, and on cards with coil whine that they've heard, the people who actually bought them couldn't hear it. So blessed or cursed, depends how you look at it.

Long story short, it's irritating to the point where I feel like it's better to just shell out the extra cash and grab a GTX 780 Ti, as I can pay it off pretty quickly (within a few weeks) and the general experience will be more pleasant. The retailer said they would credit me for the R9 & the waterblock I bought (total of $800AUD) and I could pay as much as I needed for the 780 Ti and waterblock.

So Arizonian, looks like we were and are now in the same GPU situation!!









Yeah but you also might want to remove me from the club. Sorry guys, tried the red camp but didn't work out for me.


----------



## Raephen

Quote:


> Originally Posted by *RAFFY*
> 
> Thanks for the answers guys. One last noob question! Which thermal compound should I use on my cpu (AS5 still king?) ? Which thermal grease should I use on my GPU?


AS5 is not really what I'd use for a GPU. I'd go for a non-conductive, easily spread TIM, like Arctic's MX-4.


----------



## RAFFY

Quote:


> Originally Posted by *Jpmboy*
> 
> That's cool. Gonna look great!


Yea but my wallet wont









Quote:


> Originally Posted by *Raephen*
> 
> AS5 is not really what I'd use for a GPU. I'd go for a non conductive, easily spread TIM, like Arctic's MX4.


Oh no, AS5 on a GPU goes fry-fry if you're not careful, I believe. I ended up getting that Phobya stuff.


----------



## Sgt Bilko

Quote:


> Originally Posted by *RAFFY*
> 
> Oh no, AS5 on a GPU goes fry-fry if you're not careful, I believe. I ended up getting that Phobya stuff.


I've been using AS5 on my stuff for years.....never knew it was bad for GPU's

Oh well, hopefully i'll be running on water in a month or so anyways.


----------



## grandpatzer

Quote:


> Originally Posted by *devilhead*
> 
> here is mine 1170/1500 xfx 290, not X


Damn, that's 6 fps more than what I get at 1200/1500.
Did you tweak Valley in any way, for example via AMD CCC?

What CPU and memory do you have?

I really need to get on and OC my memory/CPU (2500k at stock + memory at 1333, should be 1600).


----------



## skupples

Quote:


> Originally Posted by *jerrolds*
> 
> Anyone know if i can still put an EK Backplate even tho i put a Gelid cooler on the 290X? Theres a bit of spacing from the nuts used to secure the cooler...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Hard to judge from this pic -
> 
> Gonna try and find one during the weekend sales


The EK plates normally screw directly into the waterblock itself, so it may take some finagling.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> I've been using AS5 on my stuff for years.....never knew it was bad for GPU's
> 
> Oh well, hopefully i'll be running on water in a month or so anyways.


It's only an issue if you aren't careful, as it's conductive. I use nothing but CLU on my GPU/CPU dies. Very conductive, must be careful.

PK3 is good stuff, and supposedly non-conductive. It's normally what I use on my VRMs & VRAM.


----------



## Connolly

Quote:


> Originally Posted by *Gunderman456*
> 
> Enable voltage control in AB settings.


I've already done that, no joy. The core voltage slider bar remains locked.


----------



## WinterActual

I really didn't follow the whole story, so I'll ask here - can you guys tell me what's causing the black screen issue on these GPUs? Is it a production failure or is it driver related? Because I'm seeing really random results - some people report black screens at stock voltages, others with a slight overclock, etc.


----------



## Forceman

Quote:


> Originally Posted by *WinterActual*
> 
> I really didn't followed the whole story so I will ask here - can you guys tell me whats causing the black screen issue on those GPUs? Is it production failure or its driver related? Because I am seeing really random results - some people are reporting black screen on stock voltages, other on slight clock, etc.


No one knows.


----------



## amlett

New member here. Powercolor 290 flashed to Asus 290X.












http://www.techpowerup.com/gpuz/mym2u/


----------



## maynard14

^ congrats bro









By the way, has anyone here seen this too?

Is my card throttling?




When I took that screenshot, Crysis 3 was paused; I alt-tabbed to the desktop and took the shot there.

And does MSI Afterburner work with R9 series cards, or is there a bug?


----------



## Falkentyne

Quote:


> Originally Posted by *WinterActual*
> 
> I really didn't followed the whole story so I will ask here - can you guys tell me whats causing the black screen issue on those GPUs? Is it production failure or its driver related? Because I am seeing really random results - some people are reporting black screen on stock voltages, other on slight clock, etc.


The Black screens are a GPU shutdown; it's basically the same as the old "Gray screen of death" from the 5800 series cards, except it actually cuts the video.
It's triggered by either a hardware thermal trip (GPU getting too hot and emergency shuts down--similar to intel's thermal trip), or from something related to the GDDR5 and/or the memory controller.

So far, almost all of the current black screens have been caused by memory issues, but no one knows WHY or what is triggering it when you are at STOCK speeds.

You can trigger it manually by overclocking the memory without increasing the GPU voltage; once you reach a certain point, the RAM switching from idle (2d) speeds to load (3d/overdrive) speeds causes an INSTANT fault and a blackscreen. (usually you can trigger that by moving a browser window around or scrolling). But if you manage to start a game and do not have the memory ever downclock to 2D speeds, it will usually run fine. It's the 2D to 3D memory switch that seems to cause a fault when overclocked past a certain point (may be 1350-1500 MHz).

Raising the GPU voltage by 50+ mV often fixes this and lets you push the RAM higher.

It's very strange: black screens from clock switching on the memory, yet game stable if you can KEEP it at 3D speeds...

The faulty cards may simply have memory that can't handle stock speeds, or otherwise broken hardware.


----------



## devilhead

Quote:


> Originally Posted by *grandpatzer*
> 
> daumn, thats 6fps more then what I get at 1200/1500.
> Did you tweak the Valley in any way, as example AMD CCC.
> 
> what CPU and Memory do you have?
> 
> I really need to get on and OC my memory/cpu (2500k stock + memory 1333, should be 1600).


my cpu 3930k 4.8ghz, ram 32gb 1866mhz, and yes i used tweaks


----------



## flamin9_t00l

DELETED


----------



## BouncingBall

Wondering if this (14970) was a decent score for my R9 290X.

I am running an HIS R9 290X at stock clocks with no modifications. 3DMark 11 is testing on Performance mode.

Additional image:


Spoiler: Warning: Spoiler!


----------



## Sgt Bilko

Well now i'm annoyed, Asus has just sent out pics of the GTX 780Ti Poseidon and still nothing about non-ref 290/x's


----------



## selk22

For you miners out there...

Sapphire 290x 947/1500 +31core +50 power limit% beta 9.4 drivers

850-862khash/s

GUIminer-scrypt @ give-me-coins.com pool - port 3333

Thread concurrency- 32764
v- 1
gpu threads - 1
worksize - 512
intensity - 20
stratum - yes
GPUz VDDC - 1.25-1.33 fluctuations

This has been my best setup with the fewest errors so far, at least! Hopefully it remains stable; as of right now it's been up and running for a few hours at these speeds without dipping. On give-me-coins.com I am seeing as low as 750 for very short periods and almost always 850+, with spikes up to 1024. So far pretty happy.

Hope this helps someone trying to get their 290 or 290X up and mining. 1500 is 100% stable on the mem for mining, but I don't dare try it for gaming until I am under water. This is all on the stock cooler, btw, at 70% fan:
VRM1-70c
VRM2-78c
GPU-85c
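For reference, the settings listed above map roughly onto a cgminer command line (GUIMiner-scrypt is a frontend for cgminer). This is a sketch, not selk22's exact invocation: the worker credentials are placeholders, and you should check the flag names against your cgminer version's scrypt readme.

```shell
# Approximate cgminer equivalent of the GUIMiner-scrypt settings above.
# YOUR_WORKER / YOUR_PASSWORD are placeholders - use your own pool login.
cgminer --scrypt \
  -o stratum+tcp://give-me-coins.com:3333 \
  -u YOUR_WORKER -p YOUR_PASSWORD \
  --thread-concurrency 32764 \
  --gpu-threads 1 \
  --worksize 512 \
  --intensity 20
```

Stratum is selected automatically by the `stratum+tcp://` URL scheme, which matches the "stratum - yes" line in the settings.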


----------



## flamin9_t00l

Anyone in the UK looking for 290(X) waterblocks or any other watercooling gear: I just noticed WaterCoolingUK have a Black Friday sale on from today until 2/12/13, with 15% off everything using the code BLACK15. You could get a block for as little as £77.50 instead of £91.









Dang I just missed out... ordered my blocks a couple of days before


----------



## rdr09

Quote:


> Originally Posted by *amlett*
> 
> New member here. Powercolor 290 flashed to Asus 290X.


you got 100 pts more in graphics score compared to a 290 stock run. nice. i think it increases at higher clocks.


----------



## WinterActual

Quote:


> Originally Posted by *Falkentyne*
> 
> The Black screens are a GPU shutdown; it's basically the same as the old "Gray screen of death" from the 5800 series cards, except it actually cuts the video.
> It's triggered by either a hardware thermal trip (GPU getting too hot and emergency shuts down--similar to intel's thermal trip), or from something related to the GDDR5 and/or the memory controller.
> 
> So far, almost all of the current black screens have been caused by memory issues, but no one knows WHY or what is triggering it when you are at STOCK speeds.
> 
> You can trigger it manually by overclocking the memory without increasing the GPU voltage; once you reach a certain point, the RAM switching from idle (2d) speeds to load (3d/overdrive) speeds causes an INSTANT fault and a blackscreen. (usually you can trigger that by moving a browser window around or scrolling). But if you manage to start a game and do not have the memory ever downclock to 2D speeds, it will usually run fine. It's the 2D to 3D memory switch that seems to cause a fault when overclocked past a certain point (may be 1350-1500 MHz).
> 
> Raising the GPU voltage by 50+mv often fixes this and lets you push the RAM higher.
> 
> It's very strange. black screens from clock switching on the memory, yet game stable if you can KEEP it at 3d speeds...
> 
> The faulty cards seem to maybe have memory that cant handle stock speeds or broken hardware


Thanks for the answer!


----------



## amlett

Quote:


> Originally Posted by *rdr09*
> 
> you got 100 pts more in graphics score compared to a 290 stock run. nice. i think it increases at higher clocks.


This is the run before flashing the 290X BIOS.



And this is with a little OC at stock volts as a 290X, which I'm running BF4-stable with nice temps. Waiting for the EK block before trying 1200-1250.


It's 1080/1300


----------



## rdr09

Quote:


> Originally Posted by *amlett*
> 
> This is the run before flashing the 290X BIOS.
> 
> 
> 
> And this is with a little OC with stock volts as 290X that I'm using BF4 stable with nices temps. Waiting for the EK block for trying 1200-1250
> 
> 
> Spoiler: Warning: Spoiler!


Mine gets a 10200 graphics score at stock; that's 300 more points than yours. Hmmm.

Maybe your score after flashing might be low, too. It doesn't matter, so long as your card doesn't black screen.


----------



## amlett

Quote:


> Originally Posted by *rdr09*
> 
> mine gets 10200 graphics score at stock. that's 300 more pts than yours. hmmm.
> 
> maybe your score after flashing might be low, too. does not matter, so long as your card does not black screens.


I have to reinstall the OS. It's pretty dirty: same installation since the 470, 480, 580, 670 (SLI), 780, and finally this.


----------



## rdr09

Quote:


> Originally Posted by *amlett*
> 
> I've to reinstall OS. Pretty dirty. Same instalation since 470, 480, 580, 670 (SLI), 780 and finally this.


Your very first AMD? Hope it doesn't whine like mine, or else you'll end up whining as well. lol

Just out of curiosity... what happened to the 780?


----------



## amlett

I had to sell it. It didn't fit in the case by just 1 cm of width; the EK block was huge. It was a Classified I got for a good price, so I think the 290 as a replacement was the smartest option. After all, both were going to be under water.

I've run Fire Strike with the 290 clocks and the 290X BIOS (947/1250). I think this is where the unlock shows best:

Before Flashing


After flashing


----------



## rdr09

Quote:


> Originally Posted by *amlett*
> 
> Got to sell it. It dindn't fit in the case for just 1cm width, EK block was huge. I was a classified I got for a good price, so I think the 290 for replacing it was the smartest option. After all both were going to be under water.
> 
> I've run a FireStrike with the 290 clocks and the 290X BIOS (947/1250). I think is where the unlock can be seen better:
> 
> Before Flashing
> 
> 
> After flashing


i see and you got one that unlocks. lucky you.


----------



## ihaveworms

Quote:


> Originally Posted by *Falkentyne*
> 
> The Black screens are a GPU shutdown; it's basically the same as the old "Gray screen of death" from the 5800 series cards, except it actually cuts the video.
> It's triggered by either a hardware thermal trip (GPU getting too hot and emergency shuts down--similar to intel's thermal trip), or from something related to the GDDR5 and/or the memory controller.
> 
> So far, almost all of the current black screens have been caused by memory issues, but no one knows WHY or what is triggering it when you are at STOCK speeds.
> 
> You can trigger it manually by overclocking the memory without increasing the GPU voltage; once you reach a certain point, the RAM switching from idle (2d) speeds to load (3d/overdrive) speeds causes an INSTANT fault and a blackscreen. (usually you can trigger that by moving a browser window around or scrolling). But if you manage to start a game and do not have the memory ever downclock to 2D speeds, it will usually run fine. It's the 2D to 3D memory switch that seems to cause a fault when overclocked past a certain point (may be 1350-1500 MHz).
> 
> Raising the GPU voltage by 50+mv often fixes this and lets you push the RAM higher.
> 
> It's very strange. black screens from clock switching on the memory, yet game stable if you can KEEP it at 3d speeds...
> 
> The faulty cards seem to maybe have memory that cant handle stock speeds or broken hardware


From what I read the recent beta drivers bumped up the voltage to try and fix the black screen issue. The beta drivers fixed the black screen issue for me, but it seems like it didn't fix it for all. My guess is that some of these chips were borderline stable at the voltage they were given at factory. Some were crashing, like mine, and a voltage bump was able to stabilize it.


----------



## Connolly

Does anyone have any idea why my bench scores are so terrible? It feels like a duff card, but I've added my rig to my sig, and if anyone can see a major bottleneck, feel free to point it out. I don't want to RMA the card, as it was an R9 290 flashed to a 290X and the company doesn't have any more in stock.



I've overclocked the CPU to 4.4 GHz, at least I think I have.







Does everything look ok here?



And here is my GPUz if it's of any use


----------



## rdr09

Quote:


> Originally Posted by *Connolly*
> 
> Does anyone have any idea why my bench scores are so terrible? It's feeling like a duff card, but I've added my rig to my sig and if anyone can see a major bottle neck feel free to point it out. I don't want to RMA the card as it was an R9 290 flashed to a 290x, and the company don't have any more in stock.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> It says on there that my CPU is at 3.4ghz, but it's OC'd to 4.4ghz.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> And here is my GPUz if it's of any use
> 
> 
> Spoiler: Warning: Spoiler!


Looks normal. Valley loves higher VRAM clocks. Run other benchmarks like Fire Strike.


----------



## Connolly

You think that's a normal score for a 290x? Each one I've seen has been way higher than that, maybe I've just seen ridiculously overclocked benches.


----------



## rdr09

Quote:


> Originally Posted by *Connolly*
> 
> You think that's a normal score for a 290x? Each one I've seen has been way higher than that, maybe I've just seen ridiculously overclocked benches.


At stock and with no Valley tweaks (check out the Valley bench thread for more info), that score is normal.

Run Fire Strike and compare it to amlett's score in an earlier post.

And, yes, the ones you were seeing were done with OCs.


----------



## TomiKazi

I can finally be updated to water. EK Acetal block + backplate. Picture is a bit blurry:


And the Monsta rad with ghetto fan setup:

(to be improved of course)

Idle temps @ 1100/1250 stock voltage:
GPU: 33C
VRM1: 28C
VRM2: 34C

Max temps after 30 minutes of valley (2560*1440, 8xaa, ultra)
GPU: 42C
VRM1: 51C
VRM2: 38C

Ambient: ~20.5C

Now the question remains how much overvolting this psu can handle.


----------



## Connolly

Quote:


> Originally Posted by *amlett*
> 
> This is the run before flashing the 290X BIOS.
> 
> 
> 
> And this is with a little OC with stock volts as 290X that I'm using BF4 stable with nices temps. Waiting for the EK block for trying 1200-1250
> 
> 
> It's 1080/1300


Is this test at standard resolution and settings? I only have the basic version, so I can only run the full test and can't see what configuration options there are. Here are my results, if you're interested.


----------



## grandpatzer

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Well +100 mV in AB is what, 1.35v before droop? Correct me if I'm wrong.
> 
> Also, actual voltages are more important. This means what the voltage is when the GPU is at 100%, after vdroop.
> 
> I wouldn't put more than 1.3v actual into a card for daily use, any less than that should be fine so long as your core and VRM temps are under control.
> 
> With Hawaii, the cooler the better, but I'd try and keep your core under 80c and VRM's under 100c. Use a custom fan profile if you need to, I'd rather have a loud GPU run cool than a quiet GPU burn itself out.
> 
> I'm running my 290 @ 1.412 (max in GPUTweak, before droop), about 1.26 actual at 1200/6000 and my max temps are 62c core, 85 and 55 for VRMs 1 and 2. Past 62c core and I artifact.
> 
> Edit: I will say that you should create a second profile for when you're not doing GPU intensive stuff. I have one with the core and mem clocks lowered by about 10% and voltage left stock. If you're just browsing the web there's no point in pumping your GPU full of volts.


Can I ask how you measure voltage after vdroop?
I was able to see voltage on screen while benching with MSI AB but now it's disappeared.

According to GPU-Z the MAX voltage after putting +100mV is 1.320V; however, I believe the actual voltage is about 1.2xV.
Are you on air or water?

I think the +100mV is hopefully safe for me








Quote:


> Originally Posted by *devilhead*
> 
> my cpu 3930k 4.8ghz, ram 32gb 1866mhz, and yes i used tweaks


OK, you will always have more fps than me then. Not sure how much difference the CPU makes, because from what I gather Valley is more stressful on the GPU.
However, I'm not happy with [email protected]/1500. I'll OC my 2500K and memory and tweak Valley; I think I should at least get 70fps.


----------



## psychok9

Hi guys,
I'm looking for a Radeon R9 290 but I have some questions:

Any news on custom versions? I need good cooling to run it 24/7.

Is the reference cooler easy to clean of dust?


----------



## RAFFY

Quote:


> Originally Posted by *psychok9*
> 
> Hi guys,
> I'm looking for a Radeon R9 290 but I have some questions:
> 
> News from custom version? I need a good cooling compartment to use it 24/7.
> 
> Clean from dust the reference cooler is easy to you?


I don't remember which brand it is, but there are some non-reference 290s out right now, or at least coming soon. I know I saw one, or an advertisement for one, on Newegg yesterday.


----------



## Connolly

You mean this thread?

http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0/9220

Can't find any non-OC'd results in there, but I did stop looking after 50 pages


----------



## psychok9

Quote:


> Originally Posted by *RAFFY*
> 
> Quote:
> 
> 
> 
> Originally Posted by *psychok9*
> 
> Hi guys,
> I'm looking for a Radeon R9 290 but I have some questions:
> 
> News from custom version? I need a good cooling compartment to use it 24/7.
> 
> Clean from dust the reference cooler is easy to you?
> 
> 
> 
> I dont remember which brand it is but there are some non-reference 290's out right now or at least coming soon. I know i saw one or an advertisement for one on newegg yesterday.
Click to expand...

None at the moment :/
In any case I think I'll wait a bit more; I live in Europe :-/

If the heatsink is easy to disassemble, clean and reassemble, I'd nearly take the reference card... even if I don't like the temperatures (95°C).
It's one thing to run at 95°C a few hours a day, and another to run at 95°C 24 hours a day... what do you think?


----------



## voldomazta

Is the EK backplate usable with the Koolance block?


----------



## Jack Mac

Guys, Amazon has R9 290s in stock if you're interested. Cancelled my Newegg order of an XFX 290 for a Sapphire R9 290. And I get Saturday delivery with Amazon, yay.


----------



## Arizonian

Quote:


> Originally Posted by *amlett*
> 
> New member here. Powercolor 290 flashed to Asus 290X.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/mym2u/


Welcome aboard - congrats - added








Quote:


> Originally Posted by *TomiKazi*
> 
> I can finally be updated to water. EK Acetal block + backplate. Picture is a bit blurry:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> And the Monsta rad with ghetto fan setup:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> (to be improved of course)
> 
> Idle temps @ 1100/1250 stock voltage:
> GPU: 33C
> VRM1: 28C
> VRM2: 34C
> 
> Max temps after 30 minutes of valley (2560*1440, 8xaa, ultra)
> GPU: 42C
> VRM1: 51C
> VRM2: 38C
> 
> Ambient: ~20.5C
> 
> Now the question remains how much overvolting this psu can handle.


Congrats - updated


----------



## Gunderman456

Quote:


> Originally Posted by *Connolly*
> 
> I've already done that, no joy. The core voltage slider bar remains locked


Do you then have the latest beta for AB which allows voltage unlocks?


----------



## Connolly

Quote:


> Originally Posted by *Gunderman456*
> 
> Do you then have the latest beta for AB which allows voltage unlocks?


I have version 2.3.1, not the beta though. Does this version not support voltage unlocks?


----------



## jerrolds

You need AB Beta 17, or GPU Tweak with the Asus BIOS (or PT1/PT3 if you're under water or LN2), to unlock voltages.


----------



## Gunderman456

Quote:


> Originally Posted by *Connolly*
> 
> I have version 2.3.1, not the beta though. Does the this version not support voltage unlocks?


No, only the latest beta does.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *grandpatzer*
> 
> Can I ask how you measure voltage after vdroop?
> I was able to see voltage on screen while benching with MSI AB but now it's disappeared.
> 
> According to GPU-Z the MAX voltage after puttin +100mV is 1.320V, however I believe the actual voltage is about 1.2xV.
> you have air or water?
> 
> I think the +100mV hopefuly is safe for me
> 
> 
> 
> 
> 
> 
> 
> 
> Ok, you will always have more fps then me then, not sure how much difference CPU makes bcs Valley is from what I gather more stressful on GPU.
> However I'm not happy with [email protected]/1500, I'll OC my 2500k and memory and tweak Valley I think should atleast get 70fps.


Use GPU-Z and read the voltage after running at 100% load. It will usually jump around a bit, so estimate an average.

My card goes from 1.25 to 1.3v under load, so I call it about 1.27v.

I have a Gelid air cooler for my card.


----------



## X-oiL

What happens when you get a black screen? Does your computer freeze so you have to restart it?

I have an issue with something triggering my screen to "reboot": it goes all black for 2-3 seconds and then comes back on... trying to find out whether it's the graphics card or the screen causing this :/


----------



## DeadlyDNA

Quote:


> Originally Posted by *X-oiL*
> 
> What happens when you get a black screen? Do your computer freeze and you have to restart it?
> 
> I have a issue with something triggering my screen to "reboot" it goes all black for 2-3 sec and then starts again..trying to find out whether it's the graphic card or screen causing this to happen :/


When it happened to me, I had it happen both ways. Mine was temperature related; I had 3-way CF on stock coolers. I had to rearrange them to give more space and they were fine, and now on water there are no issues at all.


----------



## PorkchopExpress

My card has been an angel until I just did a fresh OS install; now it seems to be giving me problems. When I try to shut down, it turns the screen off, but I have to hold the power button to actually turn the computer off. Can't seem to clock as high either.


----------



## grandpatzer

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Use GPU-z and read the voltage after running 100% load. It will usually jump around a bit so take a guess.
> 
> My card goes from 1.25 to 1.3v under load, so I call it about 1.27v.
> 
> I have a Gelid air cooler for my card.


But GPU-Z is in the background while I'm benching?
Also, I used to have an OSD (on-screen display) showing voltage with MSI AB; I think maybe it disappeared with the new MSI AB...

So is 1.30V after vdroop OK on water?


----------



## Forceman

Quote:


> Originally Posted by *X-oiL*
> 
> What happens when you get a black screen? Do your computer freeze and you have to restart it?
> 
> I have a issue with something triggering my screen to "reboot" it goes all black for 2-3 sec and then starts again..trying to find out whether it's the graphic card or screen causing this to happen :/


The "normal" black screen requires a reboot, what you are experiencing sounds like a driver crash/reset. Usually happens when you overclock too high.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *grandpatzer*
> 
> but gpu-z is in background while I'm benching?
> Also I used to have OSD (On screen display) showing of voltage with MSI AB, I think maybe with the new MSI AB it disappeared...
> 
> so is after Vdroop 1.30v OK with water?


You can mouse over the voltage right after you exit the bench. It should show the values wherever your mouse is.

1.3v should be fine as long as your temps are under control.


----------



## Scorpion49

Well I finally tried BF4 with my 290, it was running great with a GTX 580 but on the 290 I get black screens every 10-15 seconds. It will come back after 4-5 seconds but it keeps doing it over and over. Really, really annoying.


----------



## Connolly

I've managed to OC my 290x to these settings before I start to get artifacts:



These are the results:



Does that look about right so far? I'm really new to overclocking; what's my next move in Afterburner? Increase core voltage and core clock slowly alongside each other? Or something else entirely? Or have I just reached the max for my card? Sorry for so many questions; I've done a little research into voltage increases and don't want to jump into something I don't really understand.


----------



## X-oiL

Quote:


> Originally Posted by *Forceman*
> 
> The "normal" black screen requires a reboot, what you are experiencing sounds like a driver crash/reset. Usually happens when you overclock too high.


I think it might have to do with my screen; in Turbo 240 mode it "restarts" randomly, but with that turned off it has run fine for hours now playing BF4.


----------



## chiknnwatrmln

What are your max temps? I can't tell if you have a custom fan profile. If you don't already, you should make one. A 15°C difference can give you a bit of a better OC. My card artifacts when it gets past a certain temperature at the clock I have it set to, so I ramp up the fans so it doesn't get that hot.

After that, bump the voltage by a bit (25 mV to start) then bump up core speed and test it. Keep going until you're not stable, then either lower the core and keep it at that or bump the voltage and keep going.

+100mV is nothing really, Afterburner has the limit set really low.
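The bump-and-test loop described above can be sketched roughly like this. Everything here is illustrative: `stability_test` stands in for "run a benchmark and watch for artifacts or crashes", and the step sizes and +100mV ceiling mirror the numbers mentioned in the post, not any real tuning tool.

```python
# Sketch of the incremental OC procedure from the post above:
# raise the core until unstable, then trade a voltage bump for more clock.
# stability_test(core_mhz, volt_offset_mv) is a hypothetical stand-in for
# an actual benchmark run; limits and steps are the poster's examples.

def find_stable_clock(stability_test, core_mhz=1000, volt_offset_mv=0,
                      core_step=10, volt_step=25, max_offset_mv=100):
    """Return the highest (core_mhz, volt_offset_mv) pair that passed."""
    best = (core_mhz, volt_offset_mv)
    while volt_offset_mv <= max_offset_mv:
        if stability_test(core_mhz, volt_offset_mv):
            best = (core_mhz, volt_offset_mv)
            core_mhz += core_step          # stable: push the core higher
        else:
            volt_offset_mv += volt_step    # unstable: add voltage and retry
    return best
```

With a toy model where each extra mV of offset buys one extra stable MHz (`lambda c, v: c <= 1050 + v`), the loop walks up to (1150, 100) and stops at the offset ceiling.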


----------



## Connolly

I set my fan to 65% while benching and temps haven't exceeded 75°C. Think I should be able to get my clock higher?


----------



## Jpmboy

Quote:


> Originally Posted by *Connolly*
> 
> I've managed to OC my 290x to these settings before I start to get artifacts:
> 
> These are the results:
> 
> Does that look about right so far? I'm really new to overclocking, what's my next move in Afterburner? Increase core voltage and core clock slowly alongside each other? Or something entirely? Or have I just reached the max for my card? Sorry for so many questions, I've done a little research into voltage increase and don't want to jump into something I don't really understand.


That's pretty good for stock mV. Unless you have flashed to a non-OEM BIOS, you can push the mV slider to the right without fear of borking the card. Basically the same for the GPU clock (until it is not stable). Memory can cause a black-screen "event", which can be a problem for some. I had one so bad via a memory OC (7000 at 1.412V = 1.3V actual) that I had to boot to safe mode, disable CCC and ASUS GPU Tweak in msconfig, then uninstall/reinstall both... a real PITA. It would load Windows fine, then as soon as it tried to load CCC: blackout and system hang. Nasty!

Just go slow and you can squeeze a lot out of that card.

btw - use the rigbuilder link in my sig, so we know what you are working with.









Make sure you check the VRM temps with GPU-Z; they run hotter than the GPU.


----------



## Connolly

Quote:


> Originally Posted by *Jpmboy*
> 
> That's pretty good for stock mV. Unless you have flashed to a non OEM bios, you can push the mV slider to the right without fear of borking the card. Basically the same for the gpu clock ()until it is not stable). Memory can cause a blckscrn "event" which can be a problem for some. I had one so bad via memory OC (7000 at 1.412V = 1.3V actual) that I had to boot to safe mode, disable CCC and asus gpu tweak in msconfig, then uninstall/reinstall both... a real PIA. It would load windows fine trhen as soon as it tried to load CCC - - blackout and system hang. Nasty!
> 
> Just go slow and you can squeeze a lot out of that card.
> 
> btw - use the rigbuilder link in my sig, so we know what you are working with.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> make sure you check the vrm temps with gpuZ. they run hotter than the gpu.


Nice, thanks for the advice









Is my rig not in my sig? I did set it up earlier, but I guess to no avail. I'm watching TV with the gf now, she won't be too happy if I slip off to my PC upstairs and start OCing again









Asus P8Z68-V PRO Z68 Socket 1155 8 Channel HD Audio ATX Motherboard
Corsair Vengeance 8GB (2x4GB) DDR3 1866Mhz Memory Kit CL9 1.5V
Intel Core i7 2600k 3.4GHz Socket 1155 8MB Cache Retail Boxed Processor
R9 290x GPU

That's the essentials of my kit. I did OC the CPU earlier to 4.4GHz, but it was overheating; I need some better cooling. Again, thanks for the advice! (+rep)


----------



## kot0005

So, I replaced my Sapphire R9 290X with an ASUS one yesterday, because my Sapphire died on me  I finally get to do some benches under water... but the new ASUS card has coil whine. Going to return it and get a new one on Monday...

[email protected] 1154 MHz core result and 5264 MHz memory

http://www.3dmark.com/3dm/1750500?


----------



## jomama22

You guys know coil whine settles down after a little while, right? Hell, you could leave Valley running for a few hours and most of that whine will be gone.


----------



## Connolly

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> What are your max temps? I can't tell if you have a custom fan profile. If you don't already, you should make one. A 15c difference can give you a bit of a better OC. My card artifacts when it gets past a certain temperature at the clock I have it at, so I ramp up the fans so it doesn't get that hot.
> 
> After that, bump the voltage by a bit (25 mV to start) then bump up core speed and test it. Keep going until you're not stable, then either lower the core and keep it at that or bump the voltage and keep going.
> 
> +100mV is nothing really, Afterburner has the limit set really low.


I set my fan to 65% while benching and temps haven't exceeded 75°C. Think I should be able to get my clock higher? Thanks for the advice, +rep


----------



## PorkchopExpress

I'm getting horrid horizontal lines and black screens AFTER I did a Windows 8 reinstall. Before that the card was fine. The card doesn't have a problem with games; it seems to hate Windows. I'm going to do another looong reinstall and see if that helps, but damn you AMD.


----------



## Newb Builder

Hey people, I'm in need of some help. I've just bought 3 new monitors, so I need to upgrade from my 8750.

Preferably I'd like to spend as little as possible, as the monitors set me back £900.
I was wondering which would provide the best bang for the buck for essentially a 4K setup.
I was looking at dual 7990s, possibly 3 or 4 7970s or 280Xs, or 2x 290s.

I want it running at 60Hz and, as I said, at a 4K resolution. The reason for this is that
I'm going to be running 3 monitors at 2560 x 1080, which has the same number of pixels
as a 4K monitor.

So may I ask your opinions? I was possibly thinking about crossfiring Sapphire's 6GB 7970s.
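For what it's worth, the pixel math in the post above checks out; a quick sanity check (4K here meaning 3840x2160 UHD):

```python
# Total pixels of a 3x 2560x1080 Eyefinity setup vs. a single 4K UHD panel.
eyefinity = 3 * 2560 * 1080   # three 21:9 1080p monitors
uhd_4k = 3840 * 2160          # one 4K UHD monitor

print(eyefinity, uhd_4k, eyefinity == uhd_4k)  # 8294400 8294400 True
```

Both come to exactly 8,294,400 pixels, so GPU recommendations for 4K apply directly to this triple-monitor setup.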


----------



## Jack Mac

Go for the 290s, they're beasts at high resolutions.


----------



## KyGuy

Quote:


> Originally Posted by *Newb Builder*
> 
> Hey people im in need of some help ive just brought 3 new monitors so need to upgrade from my 8750
> 
> Preferably id like to spend as little as possible as the monitors set me back £900
> I was wandering which would provide best bang for bucks for essentially a 4k set up
> I was looking at dual 7990's, possibly 3 or 4 7970's or 280x's, or 2 x 290's
> 
> Want it running at 60hz and as i said at a 4k resolution, the reason for this is because
> Im going to be running 3 monitors at 2560 x 1080 which has the same amount of pixels
> as a 4k monitor
> 
> So may I ask your guys opinions ? Was possibly thinking about crossfiring sapphire's 6GB 7970's


Go for the 290x's. Forget 280x's and 7970's as the 280x is basically a rebranded 7970. Quadfire with the 7990's won't scale well, although the extra RAM may help at high res. For now though the 290x is king, and when it is updated for Mantle, who knows how fast it will be.


----------



## voldomazta

http://www.amazon.com/gp/product/B00GJSUNHC/

ASUS R9290X-G-4GD5, down from $589.99 to $569.99.
Sadly I already bought two before the price drop.


----------



## Newbie2009

You can put me down for xfire and watercooling now.


----------



## maynard14

Quote:


> Originally Posted by *Newbie2009*
> 
> You can put me down for xfire and watercooling now.


wow nice nice...


----------



## amlett

Hi everyone

Finally, after seeing that my results were pretty low compared to other 290Xs after the unlock, I've reinstalled the OS and redone the Fire Strike runs. It seems that my system needed a refresh.

I can't redo the 290 bench, but here are both 290X runs:

This was with the old Windows 7 (after many cards):


And this with a fresh Windows 8.1:


----------



## kot0005

Quote:


> Originally Posted by *jomama22*
> 
> You guys know coil wine settles down after a little while right? Hell, you could leave valley runnign for a few hours and most of that whine will be gone.


Are you serious? This is my first time ever experiencing coil whine. I thought it would get worse, no?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *kot0005*
> 
> Are you serious? this is my first time ever experiencing a coil whine. I thought it would get worse, no?


Not usually.

Most of the time it'll quiet down a bit after use, but it won't go away completely.

My old 7950 did that, and so did my 290. It's still audible, but not unbearable.

My 670 always had quiet whine, never changed after a year and a half though.

If it keeps getting louder then you should try RMA'ing, if it's worth the time/shipping costs to you.


----------



## tsm106

Quote:


> Originally Posted by *kot0005*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jomama22*
> 
> You guys know coil wine settles down after a little while right? Hell, you could leave valley runnign for a few hours and most of that whine will be gone.
> 
> 
> 
> Are you serious? this is my first time ever experiencing a coil whine. I thought it would get worse, no?
Click to expand...

It's very common with high-powered hardware. Your existing component quality can also have a large bearing on the perception of coil whine. Your PSU is the root of coil whine.


----------



## jomama22

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Not usually.
> 
> Most of the time it'll quiet down a bit after use, but it won't go away completely.
> 
> My old 7950 did that, and so did my 290. It's still audible, but not unbearable.
> 
> My 670 always had quiet whine, never changed after a year and a half though.


It will diminish after heavy use. I have two 290Xs that squealed like a pig but settled to inaudible after about 12 hours of heavy, non-stop use.

They just need to be broken in.


----------



## Newbie2009

Anyone else having issues with XFire? I have it enabled; it says so in CCC, it says so in GPU-Z, and it works in 3DMark 13, but it does not seem to be working in games...


----------



## tsm106

Quote:


> Originally Posted by *Newbie2009*
> 
> Anyone else having issues with xfire? I have enabled , says so in CCC, says so in GPUz, works in 3DMark 13 but does not seem to be working in games......


Did you check the box for enable cfx for games without profiles?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *jomama22*
> 
> It will diminish after heavy use. I have 2 290x that squeal like a pig but settled to inaudible after abount 12 hours of heavy, non stop use.
> 
> They just need to be broken in.


Oh trust me my older cards were broken in quite a bit.









I really haven't had that much time to use my 290 though, with it being the end of the semester and all my schoolwork.

Actually, come to think of it, other than benching I think I've only gamed on my 290 a handful of times.


----------



## kot0005

Quote:


> Originally Posted by *tsm106*
> 
> Its very common with high powered hardware. Your existing component quality can also have a large bearing on perception of coil whine. Your psu is the root of coil whine.


It's not the PSU; it's coming from the card. My Sapphire did not have this issue. I live like 40 minutes from the store, and they replace stuff right away if the fault is major, so yeah.


----------



## sugarhell

Your graphics card doesn't take power from the gods; it takes power from your PSU. That's what he means. If the PSU sends dirty power, it can easily cause the GPU to produce coil whine. All GPUs have coil whine, but in the majority of them it's at a frequency low enough that the human ear can't hear it.


----------



## tsm106

Quote:


> Originally Posted by *kot0005*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Its very common with high powered hardware. Your existing component quality can also have a large bearing on perception of coil whine. Your psu is the root of coil whine.
> 
> 
> 
> Its not the psu, its coming from the card, my sapphire did not have this issue. I live like 40mins away from the store, they replace stuff right away if the fault is major so yeah.
Click to expand...

I bought a card off this guy once for cheap. It annoyed him to no end that it coil whined so badly. After receiving the card, I looked for this coil noise but found none. It was perfectly quiet for me. A psu can be the cause of the noise and you end up hearing it off the gpu. This is very common, shrugs.


----------



## CriticalHit

Well, I'm under water now... my stock Valley scores went up on that alone.

Also not artifacting at ~1130 anymore.

But once I hit 1165 or so, still no artifacts, but my monitor on DisplayPort (I'm on Eyefinity) just switches to black and back on every second or so... Is this standard overclock-limit behavior? Never seen it before... If I were running only one monitor off DVI, I wouldn't even know it was happening.


----------



## RAFFY

Quote:


> Originally Posted by *tsm106*
> 
> I bought a card off this guy once for cheap. It annoyed him to no end that it coil whined so badly. After receiving the card, I looked for this coil noise but found none. It was perfectly quiet for me. A psu can be the cause of the noise and you end up hearing it off the gpu. This is very common, shrugs.


Honestly I am very skeptical that all these people have "coil whine" issues. It seems like a few people yelled FIRE and now everyone is saying the building is burning.


----------



## chiknnwatrmln

Mhm, I say my card makes a little buzzing noise just because everyone else says that, and not because, ya know, it makes a buzzing noise.

It's all in my head.


----------



## kot0005

I have a Corsair AX860i; it should be more than enough for one card lol.

anyway valley benchmark:


----------



## jokrik

I have two 290s on the way
and will run them with a 4930K and RIVE BE, 4 sticks of RAM.
Does anyone with a similar setup reckon an 860W power supply is enough?
I used the PSU calc, and with OC and other stuff they only draw 780W,

but I heard the Hawaii cards are power hungry; I just want to confirm this.


----------



## hotrod717

Quote:


> Originally Posted by *kot0005*
> 
> Have a corsair ax 860i, it should be more than enough for 1 card lol.
> 
> anyway valley benchmark:


Most comparisons are made using the Extreme HD preset.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *jokrik*
> 
> I have two 290 on the way
> and will run it with a 4930k and RIVE BE, 4 x rams
> does anyone with similar setup reckon a 860 power supply is enough?
> I used the psu calc and with oc and other stuff they only draw 780w
> 
> but I heard the Hawaii card is power hungry, just wanna confirm bout this matter


My entire PC draws 500w max load at the outlet (my UPS tells me how much it's drawing).

This is with a myriad of case fans, an overclocked 3770k, and a heavily overclocked 290. Without any card my PC pulls about 200w under 100% CPU usage at the wall. Now when it was pulling 500w the CPU was not maxed out, so I can guess that my 290 is making my PC draw about 320-330 more watts at the outlet. For good measure let's call that 350w.

Factor in my PSU's 80% efficiency and you get a total of 400w DC, of which my card is pulling about 280w.

So yes, with a Seasonic PSU you'll be fine with two cards. Even if you heavily OC a 290x I can't imagine it'd draw more than 325w maximum.
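The estimate above can be reproduced step by step. The wattage figures are the ones quoted in the post, and the 80% efficiency is the poster's own assumption:

```python
# Back-of-the-envelope GPU power estimate from wall readings,
# using the figures quoted in the post above.
wall_total = 500       # W at the outlet under full load (UPS reading)
wall_cpu_only = 200    # W at the outlet under 100% CPU load, no GPU card
efficiency = 0.80      # the poster's assumed PSU efficiency

gpu_wall = wall_total - wall_cpu_only   # ~300 W at the wall attributed to the GPU
gpu_wall_padded = 350                   # rounded up "for good measure"

dc_total = wall_total * efficiency      # 400 W delivered on the DC side
gpu_dc = gpu_wall_padded * efficiency   # ~280 W DC drawn by the card

print(dc_total, gpu_dc)  # 400.0 280.0
```

Note the subtraction slightly overstates the GPU's share (the CPU wasn't maxed during the 500W reading), which is why the poster pads to 350W before applying efficiency.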


----------



## muhd86

Quote:


> Originally Posted by *jokrik*
> 
> I have two 290 on the way
> and will run it with a 4930k and RIVE BE, 4 x rams
> does anyone with similar setup reckon a 860 power supply is enough?
> I used the psu calc and with oc and other stuff they only draw 780w
> 
> but I heard the Hawaii card is power hungry, just wanna confirm bout this matter


That's more than enough for now, as long as you don't go into some serious overclocking...

I have 3 R9 290s with a 3930K overclocked to 4.5GHz, but my PSU is 1200W Gold.


----------



## jokrik

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> My entire PC draws 500w max load at the outlet (my UPS tells me how much it's drawing).
> 
> This is with a myriad of case fans, an overclocked 3770k, and a heavily overclocked 290. Without any card my PC pulls about 200w under 100% CPU usage at the wall. Now when it was pulling 500w the CPU was not maxed out, so I can guess that my 290 is making my PC draw about 320-330 more watts at the outlet. For good measure let's call that 350w.
> 
> Factor in my PSU's 80% efficiency and you get a total of 400w DC, of which my card is pulling about 280w.
> 
> So yes, with a Seasonic PSU you'll be fine with two cards. Even if you heavily OC a 290x I can't imagine it'd draw more than 325w maximum.


Quote:


> Originally Posted by *muhd86*
> 
> thats a more then enough for now ---if you dont go in to some serious over clocking ...
> 
> i have 3 r290 with a 3930k over clocked to 4.5ghz but my psu is 1200watt gold


Thx guys !
I'll stick to mine for now

+1


----------



## sugarhell

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> My entire PC draws 500w max load at the outlet (my UPS tells me how much it's drawing).
> 
> This is with a myriad of case fans, an overclocked 3770k, and a heavily overclocked 290. Without any card my PC pulls about 200w under 100% CPU usage at the wall. Now when it was pulling 500w the CPU was not maxed out, so I can guess that my 290 is making my PC draw about 320-330 more watts at the outlet. For good measure let's call that 350w.
> 
> Factor in my PSU's 80% efficiency and you get a total of 400w DC, of which my card is pulling about 280w.
> 
> So yes, with a Seasonic PSU you'll be fine with two cards. Even if you heavily OC a 290x I can't imagine it'd draw more than 325w maximum.


tsm already posted 900 watts of power consumption with just 2 cards on air and a CPU; no loop, no fans, nothing. Yeah, these cards can pull up to 450 watts each. We should put his power consumption photo in the OP.


----------



## Chomuco

XFX R9 290 with Elpida RAM: does it unlock to a 290X with chip XXX2020??


----------



## chiknnwatrmln

Quote:


> Originally Posted by *sugarhell*
> 
> Tsm already post a 900 watt power consumption with just 2 cards on air and a cpu no loop no fans anything. Yeah these cards can pull up to 450 watt.We should put his power consumption photo in OP section


Interesting, do you have a link to a post or something? I wonder what voltage he was running.


----------



## Ricdeau

Please add me to the list. 2x Sapphire R9 290Xs with EK full cover blocks and back plates.

http://www.techpowerup.com/gpuz/mpsah/
http://www.techpowerup.com/gpuz/667v3/


----------



## leo82

Hi, quick question: I'm buying 4 Sapphire AMD Radeon R9 290 4GBs. Can someone tell me what power supply I need for all 4?


----------



## muhd86

Anyone here with quad R9 290s? I have quad GTX 780s, but on old drivers which supported it. Thinking of selling one and buying another R9 290 for quad on my 3930K / Rampage IV rig,
and installing the three 780s in my 2nd rig.

Would a 4th R9 290 be a benchmark killer?


----------



## muhd86

Quote:


> Originally Posted by *leo82*
> 
> Hi quick question , I'm buying 4 Sapphire AMD Radeon R9 290 4GB's .. Can Someone tell me what power supply I need for all 4?


I'm reckoning a minimum of 1200W, but to be on the safe side get a 1500W one.


----------



## hatlesschimp

Get the 1500w!!!


----------



## brazilianloser

Just in case no one posted this yet... for all those out there that are in need of a block for their new 290/290x and are kind of tired of waiting on EK coming in stock here in the US...

http://www.xs-pc.com/waterblocks-gpu/razor-r9-290x-290


http://www.xs-pc.com/waterblocks-gpu/razor-r9-290x-290-backplate


According to their Facebook it should be in stock starting the first week of December.

If already brought up then sorry otherwise I hope that this helps out those impatient souls out there.


----------



## Heinz68

Quote:


> Originally Posted by *leo82*
> 
> Hi quick question , I'm buying 4 Sapphire AMD Radeon R9 290 4GB's .. Can Someone tell me what power supply I need for all 4?


This should give you some idea. Only the CPU is overclocked. The PSU is a Rosewill Hercules 1600W.
If the timestamp doesn't work, go to 8 min 14 sec.


----------



## Arizonian

Quote:


> Originally Posted by *Newbie2009*
> 
> You can put me down for xfire and watercooling now.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated to Xfire







and updated to water cooling









Quote:


> Originally Posted by *Chomuco*
> 
> XFX R9 290 ram elpida ,: unlocked CON CHP XXX2020 ?? A 290X ?
> 
> 
> Spoiler: Warning: Spoiler!


Submit a GPU-Z validation with OCN name showing to be added to roster.








Quote:


> Originally Posted by *Ricdeau*
> 
> Please add me to the list. 2x Sapphire R9 290Xs with EK full cover blocks and back plates.
> 
> http://www.techpowerup.com/gpuz/mpsah/
> http://www.techpowerup.com/gpuz/667v3/


Congrats - added


----------



## brazilianloser

Quote:


> Originally Posted by *Heinz68*
> 
> This should give you some idea. Only the CPU is oveclocked. The PSU is Rosewill Hercules 1600 W.
> If it doesn't work go to 8min.14sec


Yet another video that shows that a good 860w psu should be able to power two of these things as long as you don't overclock it extremely... Instead of jumping the gun I guess I will just by my second card and do my own testing to see if my 860w can handle it.

And just as I posted this, the Asus 290 came in stock on Newegg again







to the glory of Glob


----------



## tsm106

Quote:


> Originally Posted by *Heinz68*
> 
> Quote:
> 
> 
> 
> Originally Posted by *leo82*
> 
> Hi quick question , I'm buying 4 Sapphire AMD Radeon R9 290 4GB's .. Can Someone tell me what power supply I need for all 4?
> 
> 
> 
> This should give you some idea. Only the CPU is overclocked. The PSU is a Rosewill Hercules 1600 W.
> If it doesn't work, go to 8 min
> 
> 
> Spoiler: Warning: Spoiler!

PSA: that is one of the worst reviews in a while. Quad 290X on air = mass throttling, so any metrics they get will be seriously skewed. And the reviewer has no basic understanding of the different benches, as evidenced by the pathetic scores they got in Heaven and Valley. The power consumption numbers are also off due to the stock settings and throttling.

If my gold 7970 quads pulled over 1800w running full tilt, maxed out... what does that say about 1200w for 290X quads?

Bear in mind the 290X starts with a higher max TDP by 50w (300w vs 250w), but it can have a much lower TDP limit if used on the quiet BIOS. So, for example, it's either 300w or 210w or so...

Thus if, for example, you own a Corvette LS1 with a key that switches it from 400HP to 650HP... do you buy tires only rated for 400HP? Or do you plan for the full capability of your car and GPUs...?
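
The plan-for-peak sizing above boils down to simple arithmetic. A rough sketch (the per-card TDPs are the ones from this post; the CPU, platform, and 20% headroom figures are illustrative assumptions, not measurements):

```python
# Rough worst-case PSU sizing for a multi-GPU build: plan for the cards'
# full TDP limit, not a throttled stock run.
# GPU TDPs per the post: 290X ~300 W on the uber BIOS, ~210 W on the quiet BIOS.
# cpu_w, platform_w, and the 1.2x headroom factor are assumed example values.

def required_psu_watts(gpu_tdp_w, num_gpus, cpu_w=250, platform_w=100, headroom=1.2):
    """Total DC load times a headroom factor, so the PSU isn't run at 100% duty."""
    load = gpu_tdp_w * num_gpus + cpu_w + platform_w
    return load * headroom

print(required_psu_watts(300, 4))  # uber BIOS: plan for the full 300 W per card
print(required_psu_watts(210, 4))  # quiet BIOS: much lower TDP cap per card
```

With four cards, the BIOS switch alone swings the target by over 400 W, which is the whole point: size for the mode you'll actually run.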


----------



## Scorpion49

Quote:


> Originally Posted by *brazilianloser*
> 
> Just in case no one posted this yet... for all those out there that are in need of a block for their new 290/290x and are kind of tired of waiting on EK coming in stock here in the US...
> 
> According to their Facebook it should be in stock starting the first week of December.
> 
> If already brought up then sorry otherwise I hope that this helps out those impatient souls out there.


Oh man, I was totally waiting for those but I bought the Koolance instead.


----------



## DeadlyDNA

Quote:


> Originally Posted by *leo82*
> 
> Hi quick question , I'm buying 4 Sapphire AMD Radeon R9 290 4GB's .. Can Someone tell me what power supply I need for all 4?


You're going to need around a 1500w or 1600w PSU; I am now hitting 1200 watts with a mild OC on 3 cards, and I haven't installed the 4th yet.

I only have a 3820 i7 quad, mildly OC'd to 4.6GHz, and I'm using a secondary PSU for pumps, fans...

You might be able to go dual PSU as well, like dual 1000 watts or so.


----------



## Sgt Bilko

I'm getting artifacts in games now for some reason, even at stock clocks... just ran Kombustor for 20 mins to check and it's fine there, but in games it's a mess.

I've tried upping the core voltage at stock clocks and that lessens them, but even at +100mV through AB I'm still getting them.

Any ideas?

Only thing I've changed lately is my system RAM; could that have an effect?


----------



## kot0005

Quote:


> Originally Posted by *hotrod717*
> 
> Most comparison's are made using ExtremeHD preset.


I keep getting a D3D11 error in fullscreen, so I have to use a custom preset.

Hey Ariz, can you change my card to Asus please.. here is a GPU-Z screenshot:


----------



## quakermaas

Quote:


> Originally Posted by *jokrik*
> 
> I have two 290 on the way
> and will run it with a 4930k and RIVE BE, 4 x rams
> does anyone with similar setup reckon a 860 power supply is enough?
> I used the psu calc and with oc and other stuff they only draw 780w
> 
> but I heard the Hawaii card is power hungry, just wanna confirm bout this matter


Mine are not drawing as much power as I was expecting.

See rig below, very similar to your own: HX850w PSU, 3930k @ 4.5, 2 x 290s @ 1100/1350 so far on default voltage, and max power draw from the wall hasn't gone over 800 watts in benchmarks, so far.









My pair of HD7970s was hitting 900+ watts at the wall in benchmarks in the same rig, but they were at their MAX clocks; haven't found that limit yet with the 290s.


----------



## cloppy007

What are the VRM temps like for those of you using universal waterblocks? What heatsinks and fans do you use?

I want to use the Enzotech VRM heatsinks with a case fan pointed at them, but I don't know what kind of temps I'll get.


----------



## hotrod717

Quote:


> Originally Posted by *kot0005*
> 
> I keep getting a D3D11 error at fullscreen, so have to use custom preset
> 
> Hey Ariz, can you change my card to Asus please.. here is a gpuz ss:


That stinks. Kind of odd. Unfortunately, running a custom preset, you're not going to get a very good comparison.


----------



## Sgt Bilko

Just ran Valley at stock settings and it's a mess, artifacts everywhere... time for the RMA process I think. I didn't want to, but this is just no good.

Any tips on cleaning the TIM off?


----------



## head9r2k

@Sgt Bilko

Can you make a video or screenshot of the artifacts?


----------



## Connolly

Well, I think I've found the limits of my card with the stock cooler now.

Was hoping to get the core a bit higher, but I guess I'll need better cooling for that, even though my temps never exceeded 74C. Only started OCing for the first time last night; seriously addictive.








Only 1hr to wait till my shiny new Eizo Foris FG2421 arrives, hope my card and it are a match made in heaven!


----------



## Sgt Bilko

Quote:


> Originally Posted by *head9r2k*
> 
> @Sgt Bilko
> 
> Can you make a video or screenshot of the artifacts?


Ummm, not really, as the card is lying in front of me with the cooler off being cleaned. Once I get the stock cooler back on I'll see what I can do.


----------



## jokrik

Quote:


> Originally Posted by *quakermaas*
> 
> Mine are not drawing as much power as I was expecting.
> 
> See rig below, very similar to your own, HX850w psu, 3930k @ 4.5, 2 x 290s @ 1100/1350 so far on default voltage and max power draw from wall hasn't went over 800watts in benchmarks, so far
> 
> 
> 
> 
> 
> 
> 
> 
> My pair of HD7970s were hitting 900 + watts at wall in benchmarks in the same rig, but they were at their MAX clocks, haven't found that yet with the 290s.


I'll stick with my 860 first and see how it goes; I hope it can manage all the stuff I'm going to put in the new rig.
I'll have 15 fans running at 7v along with a D5 pump.

The Ivy Bridge-E should be more efficient on voltage; besides, I'm not gonna OC the hell out of my CPU, probably around 4.3.


----------



## JordanTr

For those whining about coil whine:







My R9 290 had coil whine when it came, but after a few days of use, the coil whine is gone.


----------



## Sgt Bilko

Yeah, still getting artifacts with the stock cooler,

Uploading a short vid now and will edit it into this post when it's done.

Well this sucks.........

EDIT:


----------



## head9r2k

@Sgt Bilko

Ah okay :/ Send the card back.


----------



## quakermaas

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah, still getting artifacts with the stock cooler,
> 
> Uploading a short vid now and will edit it into this post when it's done.
> 
> Well this sucks.........
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> EDIT:


I had a HD7970 go the exact same way a few months ago, I had owned it for over 18 months, MSI replaced it with a new card.


----------



## Sgt Bilko

Quote:


> Originally Posted by *head9r2k*
> 
> @Sgt Bilko
> 
> AH okay :/ Send the Card Back


Yeah, running on 7970 power now.

I tried bumping up the voltage, underclocking, and swapping around a couple of drivers.

Not sure what else I could do... oh well, here's hoping they give me a refund; might put it towards 2 x 290's or 280X's.
Quote:


> Originally Posted by *quakermaas*
> 
> I had a HD7970 go the exact same way a few months ago, I had owned it for over 18 months, MSI replaced it with a new card.


Well, I'm not sure how this will go actually; will my retailer (PCCG in Aus) decide, or will Sapphire?

I've heard that people have a lot of trouble with Sapphire.


----------



## Jack Mac

My 290 should be getting here shortly, what drivers should I use?


----------



## Newbie2009

Quote:


> Originally Posted by *tsm106*
> 
> Did you check the box for enable cfx for games without profiles?


Hmmm, I don't see this box in CCC.

I'm not too familiar with using 2 screens, but Tomb Raider will work in Xfire if I enable "exclusive fullscreen" and not in plain fullscreen. Any ideas?


----------



## rdr09

Quote:


> Originally Posted by *Jack Mac*
> 
> My 290 should be getting here shortly, what drivers should I use?


Scroll down; I recommend 13.11 (not the beta) . . .

http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64

If your current card is AMD, you don't need to do anything fancy.

Just download that driver and use it to uninstall any existing driver. Reboot and run it again, this time choosing Install. Just use Express for both.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Newbie2009*
> 
> Hmmm, I don't see this box in CCC.
> 
> I'm not too familiar using 2 screens, but Tomb raider will work in xfire if I enable "exclusive fullscreen" and not when in full screen. Any ideas?


AFAIK Crossfire has never worked in windowed mode, only fullscreen. Complete PITA for those of us that use more than one screen.


----------



## Jack Mac

Quote:


> Originally Posted by *rdr09*
> 
> scroll down. i recommend 13.11 (not the beta) . . .
> 
> http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64
> 
> if your current card is amd you don't need to do anything fancy.
> 
> just download that driver and use it to uninstall any existing driver. Reboot and run it again, then this time Install it. Just use Express for both.


Alright, thanks. I just wanted to know if you guys ran the beta or just 13.11. I uninstalled all my Nvidia drivers yesterday after I sold my 670. (I put it on Craigslist for $180, got an offer an hour later, and sold it the same day.)


----------



## head9r2k

Why is it recommended that we use 13.11 and not the 13.11 beta 9.4?


----------



## rdr09

Quote:


> Originally Posted by *head9r2k*
> 
> Why is it recommended that we use 13.11 and not the 13.11 beta 9.4?


13.11 WHQL is stable based on my experience, and AFAIK that beta is for those with black screen issues.


----------



## Newb Builder

Quote:


> Originally Posted by *KyGuy*
> 
> Go for the 290x's. Forget 280x's and 7970's as the 280x is basically a rebranded 7970. Quadfire with the 7990's won't scale well, although the extra RAM may help at high res. For now though the 290x is king, and when it is updated for Mantle, who knows how fast it will be.


My only problem with the 290 or 290X is the price for performance. I've looked at some game benchmarks at 4K, and at relatively high settings it's difficult to achieve 30fps, and I can't afford 2 of these. I've seen 4K running on a 7970, which wasn't too bad, and I can afford to Crossfire those. As for Mantle, it's a little pointless to me ATM until more developers support it, and even then the 7970 and 7990 will support Mantle as well since they are part of the GCN architecture.

ATM I can afford just 1 290X, maybe 2 x 290's, 2 x 7990's, 3 x 7970's, or 2 x 6GB 7970's. For the moment the 290X, for its price, doesn't offer enough bang for the buck.


----------



## Newbie2009

Just a little input for anyone using RadeonPro & vsync with these cards and 13.11 WHQL.

If you are having performance issues or stuttering, try this. Turn off vsync in the app you are using, obviously.

No triple buffering or anything. I think it is partly an OpenGL issue, but my performance increased massively.

With vsync on in game and no RadeonPro, I was getting crackling in the sound and choppy performance.
With vsync control on and triple buffering with dynamic FC in RadeonPro, I was getting the same.

No more sound crackling or chugging performance with the above setting, in case anyone else is having trouble.


----------



## Jack Mac

Go for the two 290s


----------



## DraXxus1549

Quote:


> Originally Posted by *Newbie2009*
> 
> Just a little input from anyone using radeon pro & vync with these cards and 13.11 WHQL.
> 
> If you are having performance issues, stuttering try this. Turn off vsync in the app you are using, obviously.
> 
> 
> 
> No triple buffering or anything, I think it is partly an OpenGL issue, but my performance increased massively.
> 
> With V Sync on in game and no radeon pro I was getting crackling in the sound and choppy performance.
> With VSync control on and triple buffering with dynamic FC in Radeon pro I was getting the same.
> 
> No more sound crackling or chugging performance with above setting, incase anyone else is having trouble.


Do you have to be using the WHQL drivers? When I load up radeon pro it tells me that I have an invalid GPU, any idea why?


----------



## pompss

I just reinstalled the stock cooler on my R9 290X and my card doesn't boot anymore.








I really don't understand why the card doesn't boot and what the problem is.
It's weird that the card would break like that....


----------



## Newbie2009

Quote:


> Originally Posted by *DraXxus1549*
> 
> Do you have to be using the WHQL drivers? When I load up radeon pro it tells me that I have an invalid GPU, any idea why?


No; probably because the program is old. Anyone know how to apply RadeonPro to BF4?


----------



## KyGuy

Quote:


> Originally Posted by *Newb Builder*
> 
> My ownly problem with the 290 or 290x is the price for performance ive looked it some games benchmarks at 4k and for relatively high settings its difficult to achieve 30fps and i cant afford 2 of these, ive seen a 4k running on a 7970 which wasnt too bad and i can afford to crossfire them, and as for mantle its a little pointless to me atm until more developers support it and even then the 7970 and 7990 will support mantle as well since they are apart of the GCN Architecture
> 
> Atm i can afford just 1 290x, maybe 2 x 290's, 2 x 7990's, 3 x 7970's or 2 x 6Gb 7970's. For the moment the 290x fpr its price doesn't offer enough bang for the buck.


You could go for the 290's or the 7970's. Nothing wrong with that. They are still great cards. You could look into flashing the 290 to a 290x if you wanted, although there are some people that are reporting failures after the flash. XFX, Club3d, and a couple others report the highest success rate.


----------



## Connolly

You guys can actually use RadeonPro with the R9 290X? I get the error message "Invalid video adapter: AMD Radeon R9 200 Series" whenever I try.


----------



## Letmefly

Right peeps, just a quick one: I am considering purchasing a Sapphire AMD Radeon R9 290 card and I want to know if the warranty will still be covered if I decide to remove the stock cooler. If not, which brand of card would you recommend for me?

Cheers


----------



## Sinch979

Hi. My first post...


----------



## Arizonian

Quote:


> Originally Posted by *kot0005*
> 
> I keep getting a D3D11 error at fullscreen, so have to use custom preset
> 
> Hey Ariz, can you change my card to Asus please.. here is a gpuz ss:
> 
> 
> Spoiler: Warning: Spoiler!


Updated


----------



## Jpmboy

Quote:


> Originally Posted by *leo82*
> 
> Hi quick question , I'm buying 4 Sapphire AMD Radeon R9 290 4GB's .. Can Someone tell me what power supply I need for all 4?


If you keep the stock BIOS and OC only via CCC, 1200-1500W at least, and a single 12V rail. If you do anything beyond stock, or have a pump, fans, an OC'd CPU, and 10,000 RPM drives, then use two PSUs and an "add2psu".


----------



## Jpmboy

Quote:


> Originally Posted by *pompss*
> 
> i just reinstall the stock cooler of my r9 290x and my card doesnt boot anymore
> 
> 
> 
> 
> 
> 
> 
> 
> I really dont understand why the card doesnt boot and what is problem.
> It's weird that card get broken like that....


Did you reconnect the fan plug? Reinstall all the pads and TIM?


----------



## Ricdeau

Quote:


> Originally Posted by *Connolly*
> 
> You guys can even use Radeon Pro with the R9 290x? I get the error message Invalid video adapter: AMD Radeon R9 200 Series whenever I try.


I use it all the time. Are you using the latest release that added R9 290/290X support?


----------



## Jack Mac

Well, I have everything installed. This card is silent at idle, but runs at 67C idle with both of my monitors plugged in; it's ridiculous, that's what my 670 used to reach under 100% load. Plus my desktop feels laggy, I don't know why. On the other hand, this card has a 79% ASIC, so I'm hoping it's a good clocker.
http://www.techpowerup.com/gpuz/7kgk3/


----------



## brazilianloser

Quote:


> Originally Posted by *Jack Mac*
> 
> Well, I have everything installed. This card is silent at idle, but runs at 67C idle w/ both of my monitors plugged in, it's ridiculous, that's what my 670 used to get to under 100% load. Plus my desktop feels laggy, I don't know why, but on the other hand, this card has 79% ASIC, hoping it's a good clocker.
> http://www.techpowerup.com/gpuz/7kgk3/


Hm, something is wrong in your setup then... mine for example is in the high 40s, with four monitors and currently spreadsheets, Word, a webpage, and a music player all open across all four.


----------



## Jack Mac

Lmao, just tried putting the fan on AB to 47% to see what all the fuss was about, it isn't quiet but it's not "unbearably loud." I can't believe people were QQing about this.


----------



## CriticalHit

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah, still getting artifacts with the stock cooler,
> 
> Uploading a short vid now and will edit it into this post when it's done.
> 
> Well this sucks.........
> 
> EDIT:


Ohh... that's not good...

I was running on the unlocked BIOS and noticed some artifacting in StarCraft 2 of all things, running at stock just now... (but not to the extent you have in Valley - just a few specific textures would flicker).

Have gone back to the stock 290 BIOS for now... Some weird things going on with the monitor on the DisplayPort adapter too... just watching it for now... I'm assuming / hoping it's driver stuff...

(on beta 9.2)


----------



## Jack Mac

I seem to be stable at 1150Mhz stock voltage, I wonder how hard I can push this thing.


----------



## CriticalHit

Quote:


> Originally Posted by *Jack Mac*
> 
> Lmao, just tried putting the fan on AB to 47% to see what all the fuss was about, it isn't quiet but it's not "unbearably loud." I can't believe people were QQing about this.


My thoughts too; it's not THAT bad... Actually, I upgraded my PSU to a Corsair 1050W for crossfired 290's, and to me that makes a comparable amount of noise... (put my cards under water, but that PSU fan ensures I no longer have a quiet system).


----------



## Ricdeau

Quote:


> Originally Posted by *Jack Mac*
> 
> Lmao, just tried putting the fan on AB to 47% to see what all the fuss was about, it isn't quiet but it's not "unbearably loud." I can't believe people were QQing about this.


A lot of people are having to set their fans at around 60-70% to keep their cards from throttling, which is quite a bit louder (especially if you are running Crossfire). I was running mine at 70% to keep both of mine from throttling down, which was a bit noisy, but they are under water now, so no fan noise at all from them!


----------



## ZealotKi11er

So I am not 100% sure what I did, but I just moved the voltage slider to -100mV in MSI AB Beta 17 and got a black screen. Restarted the PC and after the Welcome screen still got a black screen. Went to Safe Mode and removed MSI AB, and still got a black screen after a normal boot. Removed the drivers in Safe Mode and now it works. Just have to install the drivers again and see.


----------



## Jack Mac

Is 1200MHz good for a 290? I can run benches at that speed, but it artifacts a little. Also, I think my Sapphire has Elpida memory and the memory voltage is locked.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Jack Mac*
> 
> Is 1200Mhz good for a 290? I can run benches at that speed, but it artifacts a little. Also I think my Sapphire has Elpida and the memory voltage is locked


Is it really worth seeing artifacts for 0.5 fps? Drop it to 1190 or something. I know 1200MHz looks nicer.


----------



## chiknnwatrmln

I'm in the same boat: 1200 MHz artifacts past 62C, and my card tops out at 63C, but 1190 solves the problem.

My OCD hates having my core clock at 1190 though.


----------



## Jack Mac

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Is it really worth it to see artifact for 0.5 fps. Drop it to 1190 or something. I know 1200MHz looks nice


I'll probably end up doing 1150MHz 24/7 and 1200 for benches, if only the memory and cooler could keep up; I was hitting 85C at 1200MHz with 80% fan speed.


----------



## Newb Builder

Quote:


> Originally Posted by *KyGuy*
> 
> You could go for the 290's or the 7970's. Nothing wrong with that. They are still great cards. You could look into flashing the 290 to a 290x if you wanted, although there are some people that are reporting failures after the flash. XFX, Club3d, and a couple others report the highest success rate.


Thank you. I'll try to get some benchmarks of the 6GB 7970's and the normal 290's to see how they compare; hopefully I may be able to get Crossfire results for the 6GB cards.


----------



## Jpmboy

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So i am not 100% what i did but i just moved the Voltage Slider to -100mv in MSI AB Beta 17 and got black screen. Restarted the PC and after Welcome Screen still got black screen. Went to Safe mode and removed MSI AB and still black screen after normal boot. Removed the drivers in Safe mode and not it works. Juts have to install drivers again and see.


Boot to Safe Mode, open the Win menu and type msconfig, disable any MSI services, and do not let CCC start (disable it); it should then reboot to the desktop. My only way around this hard-locked driver issue has been to remove and reinstall the AMD driver pack.
It's fixable, but a PITA.


----------



## Slomo4shO

Quote:


> Originally Posted by *leo82*
> 
> Hi quick question , I'm buying 4 Sapphire AMD Radeon R9 290 4GB's .. Can Someone tell me what power supply I need for all 4?


EVGA Supernova NEX 1500 or LEPA G1600 are good choices.


----------



## pompss

Quote:


> Originally Posted by *Jpmboy*
> 
> did you reconnect the fan plug? Reinstalled all pads and tim?


Yes, I did. It's not my first time doing something like this, so I'm really pissed off and I don't understand why the card doesn't boot.
The fan is working, but the card doesn't boot, and I get the two beeps indicating a bad card.
Guys, any suggestions???
I removed the warranty seal, so I can't return it as DOA.


----------



## Ukkooh

Quote:


> Originally Posted by *pompss*
> 
> Yes i did . its not my first time i do something like that, so iam really pissed off and i dont understand why the card doesnt boot.
> Fan is working but the card doesnt boot with the two beeps detecting bad card.
> Guys any suggestions???
> I remove the warranty seal so i cant return it for a DOA


Maybe you crushed the die?


----------



## Jpmboy

Quote:


> Originally Posted by *pompss*
> 
> Yes i did . its not my first time i do something like that, so iam really pissed off and i dont understand why the card doesnt boot.
> Fan is working but the card doesnt boot with the two beeps detecting bad card.
> Guys any suggestions???
> I remove the warranty seal so i cant return it for a DOA


Dauum - never heard of that before! Clear the PCIe slot... (no rig in sig...) POST with, and set to boot from, onboard video if you have it, and see if ATIFlash sees the card in the slot.
Seems like you know what things to try before smashing it against the wall...


----------



## chiknnwatrmln

Try loosening the cooler. I have heard of overtightening putting pressure on the die and causing it to malfunction.


----------



## Mr357

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Try loosening the cooler. I have heard of overtightening putting pressure on the die and causing it to malfunction.


"Malfunction" is a euphemism, seeing as dies can be cracked by too much pressure.


----------



## josephimports

I received my replacement Asus 290 yesterday and it's been running flawlessly. The ASIC rating is much higher at 76.9%, compared to 66.0% for the original card. It runs artifact-free in benchmarks at 1150 with no voltage adjustments. Max vcore is 1.234v at 1150 and 1.195v at stock running 3DMark Extreme. Max VRM1 49C, VRM2 57C.

http://www.techpowerup.com/gpuz/9cxp7/





Valley @ 1150/1250/PT 150%



3DMark @ 1180 / 1375 / 1300mv / PT 150%



http://www.3dmark.com/3dm/1759629

3DMark11 Performance 1180/1375/1300/150%



http://www.3dmark.com/3dm11/7592721

Heaven 4.0 1180/1375/1300/150%


----------



## Scorpion49

Putting the block on my first card right now; hit a snag though, because I ran out of alcohol to clean the thermal pad/paste residue with. Gotta run to the store in a few minutes and get some more. I decided to be smart about how I store the stock cooler this time, rather than losing half the pieces like the last time I did this: I put all the screws back in the cooler, then put a slice of butcher paper over the pads and used the GPU bracket to hold it down so the pads don't get ruined.

Took these with a potato


----------



## Pandora's Box

Just purchased 2 HIS R9 290's from amazon. Can't justify the cost of these 2 780Ti's I currently have. $800 for 2 290's vs $1400 for the 2 780 Ti's. While the Ti's are beasts it just doesn't seem worth it. I'm finding in BF4 with the resolution scale at 120% @ 1440P I'm hitting about 2900MB vram. I have performance to spare to crank it higher but can't due to VRAM limit. Figured I can save myself $600 and go with 290's and be able to crank settings even higher. Just hope AMD drivers are there to support these cards, I found them lacking when I had 7970 crossfire last year.


----------



## jomama22

So here's a new record:
http://www.3dmark.com/fs/1233769
3x 290x
1332/1600



Gfx score: 19564

I'll have a "valid" result later. This is just a quick test, first time firing up 3 at once.


----------



## psyside

So I have a few questions.

1. Where can I find clock-for-clock comparisons of the 290X/290 vs 780/780Ti?

2. What are the max safe VRM temps on these cards with voltage tweaks and a max OC on air, and does the 290 have *exactly the same VRMs as the 290X?*

3. What fan speed are you guys using in order to keep the VRMs in check?

4. What is the max average OC on air? 1150 should be pretty much a sure thing, right? And what about the memory? Hynix vs Elpida garbage, any difference?

Thanks.


----------



## pompss

Quote:


> Originally Posted by *Ukkooh*
> 
> Maybe you crushed the die?


Any way to check and determine whether the die is actually crushed???
But I really doubt the die would crush that easily...
I didn't put that much pressure on it.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *psyside*
> 
> So i have few questions.
> 
> 1. Where can i find clock for clock 290X/290 vs 780/780Ti ?
> 
> 2. What is the max safe vrm temps on this cards, with voltage tweaks and max oc on air, and does the 290 have *exactly the same vrm like 290x?
> *
> 3. What fan speed are you guys using in order to keep vrms in check?
> 
> 4. What is the max average oc on air? 1150 should be like sure right, what is the memory as well? Hynix vs E garbage, any difference?
> 
> Thanks.


1. No idea, sorry. Maybe someone else knows.
2. They are rated for 125C, but I would keep them below 100C. Other than the chip, 290 and 290X boards are identical. Same VRMs, same everything.
3. With the ref cooler and maxed voltage via GPUTweak, it would regularly ramp up to 75%+. Way too loud. Interestingly enough, the reference cooler kept my VRMs much cooler than the Gelid does.
4. 1150 sounds about right. For memory I'd say the average is maybe 5700-6000 effective. Memory OCs really don't provide a big benefit past 5500 MHz because the bus is so wide.

I've personally never noticed a difference between Elpida and Hynix. My Elpida memory has no problem clocking up to 6500 MHz effective.
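
The bus-width point is just arithmetic: a 512-bit bus moves 64 bytes per transfer, so bandwidth piles up quickly even at modest effective clocks. A quick sketch using the standard GDDR5 bandwidth formula (the clocks are the ones mentioned above):

```python
# Peak GDDR5 bandwidth = (bus width in bytes) x (effective data rate).
# Hawaii's 512-bit bus moves 64 bytes per transfer.

def bandwidth_gbs(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s for a given bus width and effective clock."""
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

print(bandwidth_gbs(512, 5000))  # stock 290/290X: 320.0 GB/s
print(bandwidth_gbs(512, 5500))  # 352.0 GB/s -- already a lot of headroom
print(bandwidth_gbs(512, 6500))  # 416.0 GB/s from a big memory OC
```

The wide bus is why even stock clocks deliver more bandwidth than narrower-bus cards manage with much higher memory clocks.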

To the poster above: try loosening the four screws around the GPU die a bit. If that doesn't work, take off the cooler and clean up the old TIM. Make sure there's no TIM where it's not supposed to be, put it back together, and hope for the best.

If that doesn't work, put the warranty stickers back on and try to RMA it.


----------



## skupples

Quote:


> Originally Posted by *Slomo4shO*
> 
> EVGA Supernova NEX 1500 or LEPA G1600 are good choices.


The NEX 1500 gets some of the lowest reviews I have ever seen, but it's in a unique situation due to its power output. It's also multi-rail, which means users would have to balance the rails to provide proper amperage to the cards.


----------



## psyside

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> 1. No idea, sorry. Maybe someone else knows.
> 2. They are rated for 125c, but I would keep them below 100c. Other than the chip, 290 and 290x boards are identical. Same VRM's, same everything.
> 3. With the ref cooler and maxed voltage via GPUTweak it would regularly ramp up to 75%+. Way too loud. Interestingly enough, the reference cooler kept my VRM's much cooler than the Gelid does.
> 4. 1150 sounds about right. For memory I'd say average is maybe 5700-6000 effective. Memory OC's really don't provide a big benefit past 5500 MHz because the bus is so wide.
> 
> I've personally never noticed a difference from Elpida vs Hynix. My Elpida memory has no problem clocking up to 6500 MHz effective.


Thanks bro, rep +









I hope the non-reference cards will be out soon; it's already been too much waiting...


----------



## Jack Mac

Quote:


> Originally Posted by *psyside*
> 
> So i have few questions.
> 
> 1. Where can i find clock for clock 290X/290 vs 780/780Ti ?
> 
> 2. What is the max safe vrm temps on this cards, with voltage tweaks and max oc on air, and does the 290 have *exactly the same vrm like 290x?
> *
> *3. What fan speed are you guys using in order to keep vrms in check?
> 
> 4. What is the max average oc on air? 1150 should be like sure right, what is the memory as well? Hynix vs E garbage, any difference?*
> 
> Thanks.


The auto fan speed will easily keep the VRMs in check; they run quite cool. I play games @ 1150MHz (stock voltage) at 65% fan speed and they stay around 60C, with the GPU itself reaching around 70-75C at max. I'd say the max average on air is 1100-1200MHz, and memory OCing doesn't matter very much on this card, but Hynix is usually better.


----------



## psyside

Quote:


> Originally Posted by *Jack Mac*
> 
> The auto fan speed will easily keep VRMs in check, they run quite cool. I play games @ 1150Mhz (stock voltage) at 65% fanspeed and they stay around 60C with the GPU itself getting to around 70-75C at max. I'd say the max average on air is 1100-1200Mhz, and memory OCing doesn't matter very much on this card, but Hynix is usually better.


Thanks rep + as well


----------



## shilka

Quote:


> Originally Posted by *skupples*
> 
> The nex 1500 gets some of the lowest reviews I have ever seen, but it's in a unique situation due to it's power output. It's also multi-rail, which means users should mix them to provide proper amperage to the cards.


The EVGA SuperNOVA NEX1500 is a PSU everyone should avoid, as it has a ton of flaws.

The Lepa G 1600 is also multi-rail.


----------



## Jpmboy

Quote:


> Originally Posted by *jomama22*
> 
> So here's a new record:
> http://www.3dmark.com/fs/1233769
> 3x 290x
> 1332/1600
> 
> Gfx score: 19564
> I'll have a "valid" result later. This is just a quick test, first time firing up 3 at once.


Don't need a valid driver for this thread.


----------



## psyside

So on 780 cards the VRMs are rated at around 90C, and on these ones at 125C? That is a big difference.


----------



## RAFFY

Quote:


> Originally Posted by *Slomo4shO*
> 
> EVGA Supernova NEX 1500 or LEPA G1600 are good choices.


Quote:


> Originally Posted by *skupples*
> 
> The nex 1500 gets some of the lowest reviews I have ever seen, but it's in a unique situation due to it's power output. It's also multi-rail, which means users should mix them to provide proper amperage to the cards.


I summon the POWER SUPPLY GODS... SHILKA, where are you? That NEX PSU is horrible, DO NOT purchase it. I believe the LEPA is mediocre as well. I just purchased two EVGA SuperNOVA 1000 P2s for my tri-290X setup.


----------



## shilka

Quote:


> Originally Posted by *RAFFY*
> 
> I summon the POWER SUPPLY GODS... SHILKA, where are you? That NEX PSU is horrible, DO NOT purchase it. I believe the LEPA is mediocre as well. I just purchased two EVGA SuperNOVA 1000 P2s for my tri-290X setup.


Only the 500/600B, the NEX650G/750B/750G, and the NEX1500 are mediocre.

The G2 and P2 models are great.

As for Lepa, they are for the most part a cheaper Enermax rebrander with a few CWT units mixed in.

The new Lepa MaxBron and MaxGold series are made by an OEM called Yue-Lin Electrical Technology.

That name sounds dubious as hell.

As for the Lepa G 1600W: no, it's not bad in any way.

In fact, it's a great PSU.


----------



## mojobear

Quote:


> Originally Posted by *Jpmboy*
> 
> From frozencpu?


Hi there...does the ek block not come w pads? Why get the Fuji kind?


----------



## skupples

Quote:


> Originally Posted by *shilka*
> 
> EVGA SuperNova NEX1500 is a PSU everyone should avoid as it has a ton of flaws
> 
> Lepa G 1600 is also multi rail


I wasn't going to say, in the hopes someone else (you) might be lurking.
Quote:


> Originally Posted by *mojobear*
> 
> Hi there...does the ek block not come w pads? Why get the Fuji kind?


Yes, they come with pads, but they're pretty low quality. You can get a giant 30x20cm sheet of Fuji for $30-40; it's a lifetime supply. For contrast, Phobya charges $20 for a 10x10cm piece.


----------



## Jack Mac

Quote:


> Originally Posted by *psyside*
> 
> So on 780 cards the VRMs are rated at around 90C, and these are rated at 125C? That's a big difference.


It's not like you really want to run your VRMs that hot anyway...








I'd be worried if they passed 85C, as that's my threshold for temperatures. I was worried when I purchased the 290, but with the fan speed I run at I never pass 77C.


----------



## UNOE

I have odd GPU usage all up and down with crossfire


----------



## psyside

Quote:


> Originally Posted by *Jack Mac*
> 
> It's not like you really want to run your VRMs that hot anyway...
> 
> 
> 
> 
> 
> 
> 
> 
> I'd be worried if they passed 85C, as that's my threshold for temperatures. I was worried when I purchased the 290, but with the fan speed I run at I never pass 77C.


Let me be more clear.

I've been told that VRM temps run around 25C+ above core temps on the GTX 78x, so you'd be around 90C with a custom fan at, let's say, 70%.

On the 29x they run around 70C, which is around 20C lower than the core temp, with a custom fan. So in theory, if you get a golden card for air OC, like I plan to, you're better off with an AMD card than an Nvidia one, as far as VRM temps go anyway...

My goal is 90C max on the VRMs for either brand, depending on which one I pick.


----------



## Jpmboy

Quote:


> Originally Posted by *mojobear*
> 
> Hi there...does the ek block not come w pads? Why get the Fuji kind?


Quote:


> Originally Posted by *skupples*
> 
> Yes, they come with pads. They are pretty low quality. You can get a giant 30x20cm sheet of Fuji for 30-40$. It's a life time supply. For contrast, Phobya charges 20$ for a 10x10 piece.


^^^^ This.

The EK pads are okay, and perform very well if you use a dab of TIM on both sides. No doubt, the thermal conductivity of the FujiPoly is better than the stock pads (dry).


----------



## Jpmboy

Quote:


> Originally Posted by *psyside*
> 
> Let me be more clear.
> I've been told that VRM temps run around 25C+ above core temps on the GTX 78x, so you'd be around 90C with a custom fan at, let's say, 70%.
> On the 29x they run around 70C, which is around 20C lower than the core temp, with a custom fan. So in theory, if you get a golden card for air OC, like I plan to, you're better off with an AMD card than an Nvidia one, as far as VRM temps go anyway...
> My goal is 90C max on the VRMs for either brand, depending on which one I pick.


Your criterion for choosing a GPU is the VRM temps on air-cooled cards? Really? Either will benefit from upgrading the cooler and the thermal transfer materials.


----------



## psyside

Quote:


> Originally Posted by *Jpmboy*
> 
> Your criterion for choosing a GPU is the VRM temps on air-cooled cards? Really? Either will benefit from upgrading the cooler and the thermal transfer materials.


Yes, I'll be using my card in a pretty basic way; I can't be bothered with warranty hassles, mods, and such, so I'm looking for the best VRM temps in order to run my card at max OC for sustained periods and long gaming sessions.

Why is this so strange to you?

Also, I won't buy reference cards; a DCII or something similar...

I hope the non-reference cards will have good VRMs like the AMD reference ones; this time AMD used quite good VRMs.


----------



## kot0005

Bought Windows 8.1 Pro, as it was on sale for students, and did a fresh install.

Here are some results:

stock clocks:http://www.3dmark.com/3dm/1758407?



OC: 1200/5108 ; http://www.3dmark.com/3dm/1758492?



I will try valley again.


----------



## Jack Mac

http://www.techpowerup.com/gpuz/7kgk3/
Sapphire R9 290
Stock cooling


----------



## kot0005

OK, Valley is working now; results are really low for some reason. Will post a screenshot later.

Another Firestrike OC test @ 1200/5996: http://www.3dmark.com/3dm/1758632?



Minor artifacting above 1200MHz on the core, but the memory is stable at 5996... Elpida chips.


----------



## Slomo4shO

Quote:


> Originally Posted by *skupples*
> 
> I wasn't going to say, in the hopes someone else (you) might be lurking.
> Yes, they come with pads, but they're pretty low quality. You can get a giant 30x20cm sheet of Fuji for $30-40; it's a lifetime supply. For contrast, Phobya charges $20 for a 10x10cm piece.


I could be wrong, but weren't most of the issues that plagued the XR resolved in the VR? I just ordered the VR edition, so I guess I'll know soon enough









Also, on the subject of pads, what thickness pads do I need for an R9 290 waterblock?

Lastly, anyone know of a vendor that actually has some blocks in stock in the US?


----------



## Jack Mac

Yeah, there's definitely a problem with Valley on the 290; I keep up with 780s/Titans/780 Tis in most benchmarks, but I can't in Valley.


----------



## shilka

Quote:


> Originally Posted by *Slomo4shO*
> 
> I could be wrong, but weren't most of the issues that plagued the XR resolved in the VR? I just ordered the VR edition, so I guess I'll know soon enough
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, on the subject of pads, what thickness pads do I need for R9 290 waterblock?
> 
> Lastly, anyone know of a vendor that actually has some blocks in stock in the US?


I don't think that PSU is worth it.

You can find Sukhoi fighter jets that make less noise than the fan in it.

And the voltage regulation is just crap; it's nowhere near great, and it should not have been sold in the state it's in.

The NEX1500 was rushed way too much, and as a result it has more than a few flaws that should have been fixed but never were.


----------



## Jack Mac

Speaking of PSUs that sound like jets, you basically summed up what my TX750 V2 is like. It probably has a broken temperature sensor, because the fan is always at 100% even though it's nice and cool. I can't wait until my new PSU comes in, though; the TX750 is louder than my R9 290.


----------



## skupples

Quote:


> Originally Posted by *Slomo4shO*
> 
> I could be wrong, but weren't most of the issues that plagued the XR resolved in the VR? I just ordered the VR edition, so I guess I'll know soon enough
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, on the subject of pads, what thickness pads do I need for R9 290 waterblock?
> 
> Lastly, anyone know of a vendor that actually has some blocks in stock in the US?


Watercooling is becoming extremely popular; combine that with the holiday season and everything worthwhile is sold out.

erm... FSU #1....
Quote:


> Originally Posted by *Jpmboy*
> 
> ^^^^ This.
> 
> The EK pads are okay, and perform very well if you use a dab of TIM on both sides. No doubt, the thermal conductivity of the FujiPoly is better than the stock pads (dry).


I'll be using a dab of PK-3 this time around. Hopefully it's good stuff, I got a 30g tube of it.
Quote:


> Originally Posted by *psyside*
> 
> Yes, I'll be using my card in a pretty basic way; I can't be bothered with warranty hassles, mods, and such, so I'm looking for the best VRM temps in order to run my card at max OC for sustained periods and long gaming sessions.
> 
> Why is this so strange to you?
> 
> Also, I won't buy reference cards; a DCII or something similar...
> 
> I hope the non-reference cards will have good VRMs like the AMD reference ones; this time AMD used quite good VRMs.


You should probably look into Classifieds & waiting for non-ref Hawaii's in this case.


----------



## PorkchopExpress

hate you guys and your great overclocks! grrr


----------



## rdr09

Quote:


> Originally Posted by *PorkchopExpress*
> 
> hate you guys and your great overclocks! grrr


you're better than most of us 'cause you are mining.









teach me.


----------



## Slomo4shO

Quote:


> Originally Posted by *skupples*
> 
> Watercooling is becoming extremely popular; combine that with the holiday season and everything worthwhile is sold out.
> 
> erm... FSU #1....


I now have 4 Sapphire BF Edition 290s on the way (replacing my original 2 Asus models) and I can't find blocks in stock anywhere!







I had pieced together a collection of parts for my water loop and I would appreciate your feedback since this is going to be my first custom loop.


----------



## rdr09

Quote:


> Originally Posted by *kot0005*
> 
> Ok valley is working now, results are really low for some reason. will post ss later.
> 
> @nother Firestrike OC test @ 1200/ 5996; http://www.3dmark.com/3dm/1758632?
> 
> 
> 
> 
> Minor Artifacting above 1200Mhz on core, but the Memory is stable at 5996... Elpida chips.


your card is prolly throttling. compare . . .

http://www.3dmark.com/fs/1155472

my 290 gets a 12600 in graphics score at that oc.


----------



## psyside

Quote:


> Originally Posted by *skupples*
> 
> You should probably look into Classifieds & waiting for non-ref Hawaii's in this case.


The only cards available where I live are:

Sapphire, DCII, Gigabyte, and (rarely) MSI.

For Nvidia I know the DCII is the only choice; for AMD it should be the DCII as well, lol?


----------



## Jpmboy

Quote:


> Originally Posted by *psyside*
> 
> Yes, i will be using my card really basic, can't be bothered with waranty, mods and such, so i seek for best vrm temps in order to be able to use my card with max oc for sustained periods, long gaming sessions.
> 
> Why is this so strange for you?
> Also i wont buy reference cards, DCII or something similar...
> I hope non reference cards will have good vrms like AMD reference ones, this time AMD use quite good vrm's.


Not so strange, just a little... if you are not overclocking/overvolting and have a well-ventilated case, your VRM temp will not be the determinant of your long-term gaming experience.








Quote:


> Originally Posted by *Slomo4shO*
> 
> I could be wrong, but weren't most of the issues that plagued the XR resolved in the VR? I just ordered the VR edition, so I guess I'll know soon enough
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Also, on the subject of pads, what thickness pads do I need for R9 290 waterblock?*
> 
> Lastly, anyone know of a vendor that actually has some blocks in stock in the US?


For EK, it's 0.5 and 1mm - check their website for the instruction pdf.


----------



## psyside

Quote:


> Originally Posted by *Jpmboy*
> 
> Not so strange, just a little... if you are not overclocking/overvolting, and have a well ventilated case, your vrm temp will not be the determinant of your long term gaming experience.
> 
> 
> 
> 
> 
> 
> 
> 
> For EK, it's 0.5 and 1mm - check their website for the instruction pdf.


I will be overclocking/overvolting, but without advanced mods: max air clocks out of the box, controlling the voltage with AB.

So if I can get a card that can sustain long sessions with VRM temps around 90C max, I'll be happy. So I need feedback about this... it looks like AMD cards would be the better bet in this case... much lower VRM temps, it seems.

On my old 7950, the VRMs hit like 105C in no time at my max stable OC (1150/6200) with the fan at 75%, so I had to lower the clocks after 10 mins in-game...


----------



## skupples

Quote:


> Originally Posted by *psyside*
> 
> So on 780 cards the vrms are rated at around 90c this ones at 125? that is big difference.


No, the 780/Titan/780 Ti VRMs are rated to 110C.

The 20C+ estimate on the GK110 VRMs is when heavily overclocked. At stock they should be running pretty cool.

Those temps are never going to be an issue for you unless you are planning to do reckless levels of overclocking, which isn't really possible with the cards you have listed as available to you.

The DCII 780s do not allow overvolting past 1.212 (not directly comparable to the amount of voltage allowed on the 290 due to different architecture designs).

Not sure if Asus will lock down Hawaii DCII voltage; I believe that's up to AMD, not Asus. The only reason we can go past 1.212 on the ref GK110 is that the buck controller is easily accessible.


----------



## VSG

So both my Sapphire and Asus 290X cards use Elpida RAM. Tomorrow I can finally test them out under stock cooling, to make sure there are no issues such as black screening, before they go under water.


----------



## Slomo4shO

Quote:


> Originally Posted by *Jpmboy*
> 
> For EK, it's 0.5 and 1mm - check their website for the instruction pdf.


Thank you for the info on the EK blocks. I initially checked the manual for the Koolance blocks, since they were in stock last week, and it didn't specify the actual size.
Quote:


> Please use the *diagram included with your water block* to determine the approximate sizes needed. Koolance heat transfer pads can have different thicknesses (0.5mm, 0.7mm, and 1.0mm).


Needless to say, it wasn't very helpful. Since I plan on ordering the first set of blocks that becomes available, any idea what pads these blocks require?


----------



## skupples

Quote:


> Originally Posted by *Slomo4shO*
> 
> Thank you for the info on the EK blocks. I initially checked the manual for the Koolance blocks since they were in stock last week and they didn't specify the actual size.
> Needless to say, it wasn't very helpful. Since I plan on ordering the first available set of blocks that become available, any idea what pads these blocks require?


EK cools VRM #1!


----------



## Rar4f

AMD seems to be screwing people over by not allowing non-reference cards until after December...
And what's worse? Their reference card makes a lot of noise and is not very good at cooling.


----------



## ZealotKi11er

Ordered my Block from US Retailer. Could not wait any longer for Dazmode to have them in stock.


----------



## smokedawg

Quote:


> Originally Posted by *Newbie2009*
> 
> Anyone know how to apply R Pro to BF4?


I added the bf4.exe to the profile and set the launcher option to origin:


----------



## maynard14

Quote:


> Originally Posted by *Jack Mac*
> 
> The auto fan speed will easily keep VRMs in check, they run quite cool. I play games @ 1150Mhz (stock voltage) at 65% fanspeed and they stay around 60C with the GPU itself getting to around 70-75C at max. I'd say the max average on air is 1100-1200Mhz, and memory OCing doesn't matter very much on this card, but Hynix is usually better.


Hi guys, mine is Hynix memory... and it doesn't overclock that well







on stock voltage and on air...

I only have a max OC on the core clock of 1080 and memory of 1280.

But it's OK; it's unlockable to a 290X, though I won't use it as a 290X until that's confirmed safe...

Thanks guys


----------



## Jack Mac

Quote:


> Originally Posted by *maynard14*
> 
> Hi guys, mine is Hynix memory... and it doesn't overclock that well
> 
> 
> 
> 
> 
> 
> 
> on stock voltage and on air...
> 
> I only have a max OC on the core clock of 1080 and memory of 1280.
> 
> But it's OK; it's unlockable to a 290X, though I won't use it as a 290X until that's confirmed safe...
> 
> Thanks guys


Have you tried messing with memory voltage? That might help you.


----------



## maynard14

Quote:


> Originally Posted by *Jack Mac*
> 
> Have you tried messing with memory voltage? That might help you.


Hi sir, you mean the core voltage, sir? Hmm, I'm afraid of frying my card, haha.


----------



## Jack Mac

Quote:


> Originally Posted by *maynard14*
> 
> Hi sir, you mean the core voltage, sir? Hmm, I'm afraid of frying my card, haha.


No, memory voltage; it's pretty safe to increase. I'm mad that mine's locked.


I have my clocks low, as my 290 has issues dropping to 2D clocks; I just switch over to 1150MHz when I play games.


----------



## maynard14

Ohhh, I see, sir... well, I just checked it, but we are the same, sir; my memory voltage is locked also... guess I'm just unlucky.









If only there were lots of reports that the R9 290 unlocked to an R9 290X via BIOS is stable, I might use my R9 290 as a 290X.


----------



## Forceman

Quote:


> Originally Posted by *Jack Mac*
> 
> No, memory voltage; it's pretty safe to increase. I'm mad that mine's locked.
> 
> 
> I have my clocks low, as my 290 has issues dropping to 2D clocks; I just switch over to 1150MHz when I play games.


It's not locked, there is just no software memory voltage control on these cards.


----------



## maynard14

Hmm, maybe that's why it's locked, sir? And I think MSI Afterburner is still at the beta stage with R9 290 cards.


----------



## psyside

Quote:


> Originally Posted by *skupples*
> 
> No, the 780/Titan/780 Ti VRMs are rated to 110C.
> 
> The 20C+ estimate on the GK110 VRMs is when heavily overclocked. At stock they should be running pretty cool.
> 
> Those temps are never going to be an issue for you unless you are planning to do reckless levels of overclocking, which isn't really possible with the cards you have listed as available to you.
> 
> The DCII 780s do not allow overvolting past 1.212 (not directly comparable to the amount of voltage allowed on the 290 due to different architecture designs).
> 
> Not sure if Asus will lock down Hawaii DCII voltage; I believe that's up to AMD, not Asus. The only reason we can go past 1.212 on the ref GK110 is that the buck controller is easily accessible.


What I meant was using the max voltage allowed at stock, 1.212, or like 1.25/1.3 if I go with an AMD card. I know it's locked.

I just hope I get a card with good memory OC capabilities; it's beneficial for SSAA and extreme game mods.

Thanks for the help!

Also, 110C on the GTX 78x? Are you sure? That's great, rep+


----------



## psyside

Quote:


> Originally Posted by *Rar4f*
> 
> AMD seems to be screwing people over by not allowing non-reference cards until after December...
> And what's worse? Their reference card makes a lot of noise and is not very good at cooling.


What do you mean, after December? Is there any new info? OMG!


----------



## Jack Mac

Quote:


> Originally Posted by *Forceman*
> 
> It's not locked, there is just no software memory voltage control on these cards.


Hm, that sucks.


----------



## Rar4f

Quote:


> Originally Posted by *psyside*
> 
> What do you mean, after December? Is there any new info? OMG!


http://hardforum.com/showpost.php?p=1040415170&postcount=35


----------



## kot0005

Quote:


> Originally Posted by *rdr09*
> 
> your card is prolly throttling. compare . . .
> 
> http://www.3dmark.com/fs/1155472
> 
> my 290 gets a 12600 in graphics score at that oc.


Prolly... I have the voltage at max with GPU Tweak. Temps are 45-50C under load. ASIC is 75%.

OK, this core is a bit better... 1215/1475: http://www.3dmark.com/3dm/1759317?



I'm not sure why the clocks show as default, but it's definitely 1215 and 1475.


----------



## hotrod717

Quote:


> Originally Posted by *Jack Mac*
> 
> Hm, that sucks.


Another reason people are anxious for non-reference cards to come out. Adjustable memory voltage, LLC, etc. help stabilize high OCs. It would be nice to see some leaks on new Asus and MSI designs.

^^ Not an approved driver. ^^


----------



## psyside

Quote:


> Originally Posted by *Rar4f*
> 
> http://hardforum.com/showpost.php?p=1040415170&postcount=35


LOL, that's terrible!!!


----------



## Derpinheimer

Isn't core voltage like a substitute for memory voltage this time around? Raising it seems to increase memory OC potential.


----------



## Rar4f

Quote:


> Originally Posted by *psyside*
> 
> LOL thats terrible!!!


Indeed, I may just put an Arctic cooler on the GPU if I get one.


----------



## psychok9

Quote:


> Originally Posted by *Rar4f*
> 
> Quote:
> 
> 
> 
> Originally Posted by *psyside*
> 
> What do you mean after december? is there any new info? omg!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://hardforum.com/showpost.php?p=1040415170&postcount=35

Thanks for the info


----------



## Sgt Bilko

Quote:


> Originally Posted by *Rar4f*
> 
> http://hardforum.com/showpost.php?p=1040415170&postcount=35


Well, that sucks. I was kinda hoping to replace mine with a non-ref card, but not if I'm going to have to wait that long for them.


----------



## kosmosrl

Quote:


> Originally Posted by *Rar4f*
> 
> http://hardforum.com/showpost.php?p=1040415170&postcount=35










Maybe I should go with a 280X...


----------



## Jack Mac

Just got an atikmdag bluescreen, should I be worried?


----------



## SpewBoy

Just got my 290Xs under water and I can get them up to 1250MHz core using GPUTweak and ASUS bios to get more than +100mv voltage (Afterburner is capped at 100mv it seems). GPUTweak shows 1.406 but there's probably some v-droop in there.

I have Elpida memory and the best I can do is... 1362MHz. They will almost hit 1400 stable but anything over 1362 causes some pretty nasty performance drops. That just seems crazy low to me. I've seen people hitting 1750... what am I doing wrong? Or are the Elpida chips really that bad? Temps don't go above 50 and I could probably keep going on the core if I didn't hit GPUTweak's voltage limit.

Also, does Afterburner's aux voltage adjustment actually do anything meaningful? Ramping that up to +100mv does nothing for my OCs.


----------



## Raxus

Anyone have any idea why in games like Dead Space 3 the GPU clock won't rise above 650MHz, but in Battlefield 3 it's pinned to the max? No heat issues.


----------



## Hogesyx

Has anyone tested out the reference bios uploaded on techpowerup?

http://www.techpowerup.com/vgabios/index.php?architecture=ATI&manufacturer=&model=R9+290X&interface=&memType=&memSize=

I reflashed my Sapphire 290X to the reference 290X right BIOS (uber); I can now run the memory at 1600 on stock voltage, whereas the Sapphire BIOS only allows 1460 max.


----------



## maynard14

Quote:


> Originally Posted by *Raxus*
> 
> Anyone have any idea why in games like Dead Space 3 the GPU clock won't rise above 650MHz, but in Battlefield 3 it's pinned to the max? No heat issues.


Me too, sir. While playing Assassin's Creed 4 it won't go to the max 947 clock speed... it only goes to 700-800. But the memory is stable at 1250... maybe driver issues or something. R9 290s are so mysterious... but maybe that's because they are new.


----------



## Raxus

Quote:


> Originally Posted by *maynard14*
> 
> Me too, sir. While playing Assassin's Creed 4 it won't go to the max 947 clock speed... it only goes to 700-800. But the memory is stable at 1250... maybe driver issues or something. R9 290s are so mysterious... but maybe that's because they are new.


GPU load in Dead Space 3 never goes over 26%; granted, I'm not getting any frame issues. Still weird.


----------



## maynard14

Quote:


> Originally Posted by *Raxus*
> 
> GPU load in Dead Space 3 never goes over 26%; granted, I'm not getting any frame issues. Still weird.


Yes sir, we're the same. Funny thing: I'm playing Assassin's Creed 4 and my clock speed is decreasing, and GPU load is decreasing too... it's spiking, BUT my FPS is stable, or at least not decreasing drastically... haha, lots of bugs. I hope HWiNFO, MSI Afterburner, and GPU-Z will update their software to be better and much more stable for R9 290 cards.


----------



## arang

The black screens seem to have disappeared on my card (overclocked & Elpida RAM).
I ran "ULPS_control.exe", which opens the command prompt and lets you disable ULPS,
and rebooted.

In regedit
I found this value unchanged,
so I changed 1 -> 0,



then rebooted.

With GPU Tweak at a 1150/1400 setting,
during 6 hours of BF4 online play a black screen never happened, even while overclocking.


----------



## skupples

Quote:


> Originally Posted by *Raxus*
> 
> GPU load in Dead Space 3 never goes over 26%; granted, I'm not getting any frame issues. Still weird.


I noticed this when the game first came out when I was on a single 670, it's just extremely low demand.


----------



## tsm106

Quote:


> Originally Posted by *arang*
> 
> The black screens seem to have disappeared on my card (overclocked & Elpida RAM).
> I ran "ULPS_control.exe", which opens the command prompt and lets you disable ULPS,
> and rebooted.
> 
> In regedit
> I found this value unchanged,
> so I changed 1 -> 0,
> 
> 
> 
> then rebooted.
> 
> With GPU Tweak at a 1150/1400 setting,
> during 6 hours of BF4 online play a black screen never happened, even while overclocking.


That's not where ULPS is traditionally disabled; it's in the CLASS directory. When overclocking above Overdrive limits, you will BSOD with ULPS enabled if you use an unofficial-overclocking app like GPU Tweak or AB in unofficial mode. That's one of the obvious ways to tell whether ULPS is on or not.
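For anyone hunting for the CLASS directory mentioned above, a sketch of where the flag typically lives (the four-digit adapter subkey varies per system, and some driver versions also carry an `EnableUlps_NA` value — check your own registry rather than taking this layout as authoritative):

```
HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\
  {4d36e968-e325-11ce-bfc1-08002be10318}\     <- display adapter class
    0000\                                     <- one subkey per adapter (0000, 0001, ...)
      EnableUlps    REG_DWORD    0            <- 0 = ULPS disabled, 1 = enabled
```

Set the value to 0 under every adapter subkey that has it, then reboot for the change to take effect.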


----------



## Kelwing

Weird. I've been messing with Skyrim for the last few hours and not one black screen. This is the best my 290 has behaved since I got it. Even temps seem to have stabilized and dropped slightly; inside-case temps, which would increase 5C, no longer do. Still using the beta 9.4 drivers.

Card is impressive when it works without issues.


----------



## maynard14

Quote:


> Originally Posted by *Kelwing*
> 
> Weird. I've been messing with Skyrim for the last few hours and not one black screen. This is the best my 290 has behaved since I got it. Even temps seem to have stabilized and dropped slightly; inside-case temps, which would increase 5C, no longer do. Still using the beta 9.4 drivers.
> 
> Card is impressive when it works without issues.


I agree, sir; my card doesn't have any issues at all... no black screen, no BSOD, no artifacts, BUT it's low on overclocking, haha... but still, I believe the drivers are not optimized yet. Even MSI Afterburner always shows GPU load spikes, and GPU-Z's load reading is buggy. I guess we'll have to wait for better programs for this card.


----------



## Scorpion49

Change me over to water, finally!

Moar potato pics


----------



## Maracus

Sign me up!

XFX 290x
Stock cooling (For now)
Elpida Memory

I'll do some testing later; it's so damn hot here atm, 35C and 95% humidity, so I guess I'll be doing some runs at 100% fan speed. Seems no louder than my 6970/7970, though.


----------



## darkelixa

Is there a best brand of the 290 to buy? I live in Australia, so only the Sapphire or Gigabyte cards are really in stock currently.


----------



## DeadlyDNA

Quote:


> Originally Posted by *darkelixa*
> 
> Is there a best brand of the 290 to buy? I live in Australia, so only the Sapphire or Gigabyte cards are really in stock currently.


Everyone has opinions on this; supposedly, though, some brands allow water cooling if you plan on it. Personally, I bought 2 Gigabyte 290s and they work great. The other 2 were HIS, and one was DOA; the good one seems fine as well. I have a PowerColor BNIB I haven't opened yet, so I can't speak to that brand. Everyone has different experiences with brands. I don't know that there is a "best" brand yet.


----------



## kot0005

OT, but RIP Paul Walker. I enjoyed Fast and Furious 2...


----------



## tsm106

Quote:


> Originally Posted by *kot0005*
> 
> OT, but RIP Paul Walker. I enjoyed Fast and Furious 2...


Oh snap he went out like James Dean, in a Porsche.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Is there a best brand of the 290 to buy? I live in Australia, so only the Sapphire or Gigabyte cards are really in stock currently.


They are all ref design atm, so it all comes down to your experience with warranty etc.

MSI, I think, allows you to watercool without voiding the warranty.

I had a Sapphire and it was fine until it died









And Paul Walker RIP


----------



## dansi

Quote:


> Originally Posted by *Hogesyx*
> 
> Has anyone tested out the reference bios uploaded on techpowerup?
> 
> http://www.techpowerup.com/vgabios/index.php?architecture=ATI&manufacturer=&model=R9+290X&interface=&memType=&memSize=
> 
> I reflashed my Sapphire 290X to the reference 290X right BIOS (uber); I can now run the memory at 1600 on stock voltage, whereas the Sapphire BIOS only allows 1460 max.


As reported in the unlock thread, I am using this review-sample BIOS and can confirm it allows slightly better core overclock stability!

As reviewers noted, AMD partners may be using an older BIOS version on their shipped cards, which affects performance:

http://www.legitreviews.com/amd-radeon-r9-290x-press-sample-versus-retail_129583


----------



## kosmosrl

Check this out

"Always looking to provide the user with the best solutions, SAPPHIRE will be introducing new models in the R9 290 family over the coming weeks based on the company's own designs and with innovative and enhanced cooling solutions."

http://www.sapphiretech.com/presentation/media/media_index.aspx?psn=0004&articleID=5453&lid=1


----------



## tsm106

Quote:


> Originally Posted by *kosmosrl*
> 
> Check this out
> 
> "Always looking to provide the user with the best solutions, SAPPHIRE will be introducing new models in the R9 290 family over the coming weeks based on the company's own designs and with innovative and enhanced cooling solutions."
> 
> http://www.sapphiretech.com/presentation/media/media_index.aspx?psn=0004&articleID=5453&lid=1


Just keep in mind Sapphire custom designs are never done with full intent. They always eff something up, and on top of that they are never supported by the watercooling community, whereas MSI and Asus always have support from EK. For that reason, it's really hard to get excited over a fancy Toxic card.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kosmosrl*
> 
> Check this out
> 
> "Always looking to provide the user with the best solutions, SAPPHIRE will be introducing new models in the R9 290 family over the coming weeks based on the company's own designs and with innovative and enhanced cooling solutions."
> 
> http://www.sapphiretech.com/presentation/media/media_index.aspx?psn=0004&articleID=5453&lid=1


Coming weeks, huh?

So they will launch press cards just before Xmas, and then they will go on sale in mid-January?


----------



## Raxus

Quote:


> Originally Posted by *tsm106*
> 
> Just keep in mind Sapphire custom designs are never done with full intent. They always eff something up and on top of that they are never supported by the watercooling community. MSI and Asus always have support from EK. For that it's really hard to get excited over a fancy Toxic card.


I ran into this issue with my 7970 OC with boost; I started looking into watercooling... NOPE! lol


----------



## Arizonian

Quote:


> Originally Posted by *Scorpion49*
> 
> Putting the block on my first card right now, hit a snag though because I ran out of alcohol to clean the thermal pad/paste residue with. Gotta run to the store in a few minutes and get some more. I decided to be smart about how I store the stock cooler this time rather than losing half the pieces like last time I did this, I put all the screws back in the cooler and then put a slice of butcher paper over the pads and used the GPU bracket to hold it down so the pads don't get ruined.
> 
> Took these with a potato
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated









Quote:


> Originally Posted by *jomama22*
> 
> So here's a new record:
> http://www.3dmark.com/fs/1233769
> 3x 290x
> 1332/1600
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Gfx score: 19564
> 
> I'll have a "valid" result later. This is just a quick test, first time firing up 3 at once.


Nice score








Quote:


> Originally Posted by *josephimports*
> 
> I received my replacement Asus 290 yesterday and its been running flawless. ASIC rating is much higher at 76.9% compared to 66.0% for the original card. It runs artifact-free in benchmarks at 1150 with no voltage adjustments. Max vcore is 1.234 at 1150 and 1.195v at stock running 3DMark Extreme. Max VRM1 49c, VRM2 57c.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Valley @ 1150/1250/PT 150%
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 3DMark @ 1180 / 1375 / 1300mv / PT 150%
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/1759629
> 
> http://www.techpowerup.com/gpuz/9cxp7/


Congrats - welcome aboard - added









Quote:


> Originally Posted by *Maracus*
> 
> Sign me up!
> 
> XFX 290x
> Stock cooling (For now)
> Elpida Memory
> 
> I'll do some testing later, so damn hot here atm (35c and 95% humidity), guess I'll be doing some runs at 100% fan speed. Seems no louder than my 6970/7970 tho.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - welcome to the club - added


----------



## taem

MSI 290x lightning to require 2x8-pin AND a 6-pin? Crazy. http://wccftech.com/msi-radeon-r9-290x-lightning-iccoming-pcb-pictured-full-power-hawaii-xt-unleashed/

Link shows pix of the pcb tho unfortunately not the cooler.


----------



## hotrod717

Quote:


> Originally Posted by *taem*
> 
> MSI 290x lightning to require 2x8-pin AND a 6-pin? Crazy. http://wccftech.com/msi-radeon-r9-290x-lightning-iccoming-pcb-pictured-full-power-hawaii-xt-unleashed/
> 
> Link shows pix of the pcb tho unfortunately not the cooler.


Honestly don't care about the cooler. Probably a few months away from release of the card and an EK waterblock. I went with Asus on the 7970 custom; thinking I may have to try MSI this time around. Time will tell.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *dansi*
> 
> As reported in the unlock thread, i am using this review sample bios and confirm it allows slightly better core overclock stability!
> 
> As reviewers noted, amd partners may be using older bios version on their shipped cards and affects the performance
> 
> http://www.legitreviews.com/amd-radeon-r9-290x-press-sample-versus-retail_129583


There are two on the techpowerup page... Left or Right?


----------



## dansi

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> There are two on the techpowerup page... Left or Right?


Right!








Is uber.
Left is quiet and worthless for a high end card from AMD.

I will continue to look out for new 290 series vBIOSes, because this time AMD is introducing plenty of new PowerTune techniques and it helps to keep the vBIOS updated.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *dansi*
> 
> Right!
> 
> 
> 
> 
> 
> 
> 
> 
> Is uber.
> Left is quiet and worthless for a high end card from AMD.


Thank you!!!!!!!!!!!


----------



## skupples

Quote:


> Originally Posted by *Forceman*
> 
> It's not locked, there is just no software memory voltage control on these cards.


ohhh how I wish that would come back some day.


----------



## Maracus

Just a quick run at 1100/1400; the only thing I touched was PL 50%.
Gonna see how far I can push the Elpida memory.
ASIC 71%
So far no black screens on 9.4 beta..... I'll keep testing and go for the mother lode later


----------



## quakermaas

Quote:


> Originally Posted by *UNOE*
> 
> I have odd GPU usage all up and down with crossfire


Mine is the same in MSI Afterburner; I'm thinking it is a bug, as performance doesn't seem to be affected. I don't get throttling, since I have a custom fan profile and temps max out at about 78c.


----------



## RAFFY

Quote:


> Originally Posted by *kot0005*
> 
> OT but, RIP paul walker I enjoyed Fast and furious 2...


Vin Diesel was right, he really was a "Busta" lol. I'm not sure about this but it looks like they were driving a Porsche GT. Hard to tell since the car is in millions of pieces.


----------



## MrWhiteRX7

Well I went on a flashing spree tonight. Tried the press release sample and I did get a little more on my memory clocks, but after 30 minutes of stress benching it would pop a little artifact here and there, so no go. Then I tried the Asus bios and GPU Tweak. Card will do 1200 on the core but nowhere near the same on the memory, which brings my scores down considerably. So back to my stock Sapphire 290 bios and all is well again.

It was worth a shot









I'm hitting 1175 / 1500 with +100mv in AB17 - stock cooler hovering around 68 - 71c peak temps. No need to switch I guess but the grass is always greener right? hahaha ah well. I LOVE THIS CARD!


----------



## MrWhiteRX7

Isn't there a way to do +200mv in AB 17? I know it's command line or something.


----------



## Falkentyne

Quote:


> Originally Posted by *pompss*
> 
> Anyway to check and determine that the die is actually crushed ???
> But i really doubt that the die crush so easy..
> I didnt put so much pressure


Try using it in the x8 PCIe slot. See if it boots.
My old 7970 suddenly wouldn't boot in x16 (the beeps) but did in x8. It eventually died a red screen death much later (and the system wouldn't even POST).


----------



## CriticalHit

couple of black screen crashes today on a stock 290 with beta 9.2 (just now on a Skype chat)
downloading 9.4 now

edit: it's only a black screen on the center screen though... the right and left screens are loaded with graphic corruption.


----------



## Imprezzion

Guys, I'm hitting 1170MHz core and 6000MHz VRAM on my unlocked Club3D R9 290, but this is at 1.412v in GPU Tweak with 1.30-1.32v load in GPU-Z.

Is there any chance that buying a proper R9 290X will get me noticeably higher clocks? I mean, 1170MHz seems quite low for such a voltage.

I am planning to put my Accelero Hybrid on it with custom-built VRM cooling (made from the stock cooler's baseplate), but before I start cutting and grinding my stock cooler to pieces to salvage the baseplate from it, I want to know if this card @ 1170MHz is worth it, or whether I should get another one that unlocks / a proper 290X.


----------



## head9r2k

Hey guys, what do you think is safe for 24/7 at default voltage, without memory overclocking?

Sapphire R9 290 - 1000MHz? 1100MHz core clock?

Can I break the card by running 1000-1100MHz?


----------



## rdr09

Quote:


> Originally Posted by *head9r2k*
> 
> Hey guys what do you think is safe for 24/7 with default voltage? and without memory overclocking
> 
> Sapphire R9 290 , 1000mhz? 1100Mhz? core clock
> 
> Can i broken the Card when i make 1000-1100Mhz?


have you tried 1000MHz or higher without touching the voltage? i game at stock but i've benched mine at 1155 at stock volts and gamed at 1100. mine is watercooled, though.


----------



## head9r2k

Yep, I have tested 1000MHz in 3DMark11 and a match of Battlefield 4, but I don't know if I can break the card at 1000MHz or higher with default voltage.

Sry for my bad english.


----------



## rdr09

Quote:


> Originally Posted by *head9r2k*
> 
> Yep for 3DMark11 i have test 1000mhz and a match Battlefield 4 but I dont know if i can broken the card with 1000mhz or higher with default voltage.
> 
> Sry for my bad english.


no, especially without an increase in voltage. if it is playing BF4 at that, then i guess your temps are acceptable. what are they? keep the temps below the manufacturer's spec and you are good. the vrm temps are just as important as the core.

game on!


----------



## maynard14

my friend told me that the reason the gpu load on my r9 290 shows only a little, like 26 percent, is not that msi afterburner or gpu-z is buggy, but that the r9 290 is designed for 4k resolution? is that true...?


----------



## rdr09

Quote:


> Originally Posted by *maynard14*
> 
> my friend told me that the reason why gpu load on r9 290 shows only a little like 26 percent coz its not about msi after burner buggy or gpuz is buggy but because of r9 290 is design for 4k resolution? its that true...?


what game? minecraft?

i get 100% in BF4. a few dips which i point to the cpu.


----------



## maynard14

assassins creed 4 and crysis 3... im looking at my gpu load, it spikes up and down, but my fps isn't affected


----------



## head9r2k

That's when I play BF4 at 1000MHz; I use the stock cooler with my own fan profile.

What is the max temp for the VRMs? On the screen, VRM1: 49°C and VRM2: 63°C


----------



## rdr09

Quote:


> Originally Posted by *maynard14*
> 
> assassins creed 4 and crysis 3... im looking at my gpu load its spikes up and down but my fps its not effected


i don't play AC4, sorry. here was C3 maxed out at 1080p . . .



look at my temp. almost 60 and that's watered. BF4 is lower.

Quote:


> Originally Posted by *head9r2k*
> 
> 
> 
> Thats when i Play BF4 with 1000mhz , i use the stock cooler with own fan profile.
> 
> What is Max Temp for the VRM? on the screen 1:49°C and 2:63°C


your vrm temps are lovely. can't see the core, though. keep all of them under 90C and you are good - imo.


----------



## Sgt Bilko

Quote:


> Originally Posted by *maynard14*
> 
> assassins creed 4 and crysis 3... im looking at my gpu load its spikes up and down but my fps its not effected


mine did the same in AC4 and Crysis 3; the core clock bounced around the same as usage, but the fps was fine.


----------



## Ukkooh

Looks like the 13.11 9.4 betas didn't resolve my black screen issues. Just got another one today. RMA time.


----------



## maynard14

mine is like this.. what could be the problem?

and i was playing assassins creed 4 when i screenshotted that gpu-z


----------



## maynard14

Quote:


> Originally Posted by *Sgt Bilko*
> 
> mine did the same in AC4 and Crysis 3, Core clock bounced around same as usage but the fps was fine.


but our cards are not faulty, right sir? im just a little worried... even if i slide the power slider to the max it's still the same..


----------



## blak24

Guys, I noticed these days that my 290 has GPU load spikes, dropping from 100% to 70-80% for a fraction of a second and then going back to 100%. It happens in benches (Unigine Valley) and games, but the GPU's clock is always at 1050MHz as it's set. I already tried downgrading drivers to 13.11 beta 9.2 instead of 9.4, but nothing changed. I also searched here in the forum but didn't find a solution, and it's not MSI Afterburner, because I logged a file with GPU-Z and it also sees these load drops.

Now I'm supposing it's a driver problem, as it happens both with the 290X bios (it's unlocked) and with the stock 290 one... Any ideas?


----------



## Sgt Bilko

Quote:


> Originally Posted by *maynard14*
> 
> but are cards are not faulty right sir? im just alittle worried... even i slide the power slider to the max it still the same..


No, not faulty, pretty sure that Ubisoft never gave two damns when they ported AC4 over.

I had a fairly consistent 100% GPU usage in nearly every other game i tried.

And Ubisoft have said that if AC4 runs a bit sloppy on your PC then "buy a better GPU"...........yeah


----------



## Ukkooh

Why would you guys even care about usage and clock drops if you have no fps or stutter issues? My gpu clocks down to 500mhz in some games but still runs fine (when it is not artifacting or black screening).


----------



## rdr09

Quote:


> Originally Posted by *maynard14*
> 
> 
> 
> mine is like this.. what should be tha problem
> 
> and im playing assassins creed 4 when i screen shot that gpu z


use AB to monitor cpu (yes, cpu) usage. it could be AC4. have you added a profile for AC4 in CCC?

checkout post # 8849. that's BF4, though. is that what you want to see?


----------



## maynard14

Quote:


> Originally Posted by *rdr09*
> 
> use AB to monitor cpu (yes, cpu) usage. it could be AC4. have you added a profile for AC4 in CCC?
> 
> checkout post # 8849. that's BF4, though. is that what you want to see?


yup sir... i was just wondering if my gpu is faulty or something, coz sometimes while i play a game, even with no fps drops, gpu usage is 0 percent and then quickly climbs to 100 percent haha.. how odd, right.

no, i dont have any profile in ccc

but sir, i have another question: if i run my r9 290 with a custom r9 290x bios, is it ok or should i be worried?


----------



## rdr09

Quote:


> Originally Posted by *Ukkooh*
> 
> Looks like the 13.11 9.4Betas didn't resolve my black screen issues. Just got another one today. RMA time.


Quote:


> Originally Posted by *Ukkooh*
> 
> Why would you guys even care about usage and clock drops if you have no fps or stutter issues? My gpu clocks down to 500mhz in some games but still runs fine (when it is not artifacting or black screening).


^this. sorry about your card.
Quote:


> Originally Posted by *maynard14*
> 
> yup sir ... i just wondering if my gpu is not faulty or what coz sometimes while i play a game evenno fps drops gpu usage is 0 percent then quickly climb to 100 percent haha,, how odd right.
> 
> no i dont have any profile on ccc
> 
> but sir i have another question if i run my card as r9 290x custom bios to my r9 290 is it ok or would i be worry?


only time will tell. do you really need the X?


----------



## maynard14

Quote:


> Originally Posted by *rdr09*
> 
> ^this. sorry about your card.
> only time will tell. do you really need the X?


thank you sir for your answer.. ahmm, no, im only using a 1080p monitor


----------



## head9r2k

Quote:


> Originally Posted by *rdr09*
> 
> look at my temp. almost 60 and that's watered. BF4 is lower.
> your vrm temps are lovely. can't see core, though. keep all of them under 90C you are good - imo.


Here's a screen with the core ^^


----------



## Jpmboy

Quote:


> Originally Posted by *SpewBoy*
> 
> Just got my 290Xs under water and I can get them up to 1250MHz core using GPUTweak and ASUS bios to get more than +100mv voltage (Afterburner is capped at 100mv it seems). *GPUTweak shows 1.406 but there's probably some v-droop in there.*
> 
> I have Elpida memory and the best I can do is... 1362MHz. They will almost hit 1400 stable but anything over 1362 causes some pretty nasty performance drops. That just seems crazy low to me. I've seen people hitting 1750... what am I doing wrong? Or are the Elpida chips really that bad? Temps don't go above 50 and I could probably keep going on the core if I didn't hit GPUTweak's voltage limit.
> 
> Also, does Afterburner's aux voltage adjustment actually do anything meaningful? Ramping that up to +100mv does nothing for my OCs.


If you set 1.412V in GPUT, it droops to 1.28-1.3V at full load. My Sapphire has Elpida memory also; try setting the gpu clock lower and raising the vram clock while holding vcore constant. It should easily do 1500 with the Asus bios and GPU Tweak.

There should be a new MSI Afterburner out soon... in testing at MSI according to "Unwinder".
http://forums.guru3d.com/showthread.php?t=382760&page=8


----------



## SultanOfWalmart

Quote:


> Originally Posted by *maynard14*
> 
> 
> 
> mine is like this.. what should be tha problem
> 
> and im playing assassins creed 4 when i screen shot that gpu z


Is your Vsync on?


----------



## Raxus

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> Is your Vsync on?


It seems like it's all drivers/demand on the gpu


----------



## maynard14

yes sir, my vsync is on in assassins creed 4

@ sir raxus/ yes maybe its all in the drivers...


----------



## zpaf

Anyone with R9 290X to see score from MetroLL with the same settings ?
I think this game loves more shaders.


----------



## SultanOfWalmart

Quote:


> Originally Posted by *maynard14*
> 
> yes sir my v sync is on in asassins creed 4
> 
> @ sir raxus/ yes maybe its all in the drivers...


Turn off your vsync.


----------



## SpewBoy

Quote:


> Originally Posted by *Jpmboy*
> 
> If you set 1.412V in GPUT, it droops to 1.28-1.3V on full load. My sapphire has Elp memory also, try, setting the gpu clock lower and raising teh vram clock while holding vcore constant. Should easily do 1500 with the Asus bios and GPU tweak.
> 
> There should be a new MIS afterburner out soon... in testing at MSI according to "Unwinder".
> http://forums.guru3d.com/showthread.php?t=382760&page=8


I tried just upping the mem clock to past 1362MHz previously (using Afterburner and HIS BIOS) and it resulted in a performance drop and instability. Tomorrow I'll give it another go with GPUTweak + ASUS BIOS and failing that, I might just turn to the Sapphire or reference BIOS because apparently people have been having some better luck with them. Gon' be doin' some intense BIOS hopping.

EDIT: Oh, and for some reason Ghosts produces horrible artefacting and crippling black screens / weird pixelation and scanlines at my otherwise stable core overclocks. Had to dial it back to 1100 / 1362 and even then I get occasional artefacting (which can be reproduced all the time with certain weapon sights... lol). These cards (and this game) are proving to be very finicky. Hopefully will be improved with new drivers.

Also, anyone know how to get +200mv with Afterburner? Someone mentioned that they read it was possible through command line. Me want need.


----------



## blak24

Quote:


> Originally Posted by *Ukkooh*
> 
> Why would you guys even care about usage and clock drops if you have no fps or stutter issues? My gpu clocks down to 500mhz in some games but still runs fine (when it is not artifacting or black screening).


Well I also noticed huge stutter issues on BF4 and AC4 with Vsync ON. The only way to get it to work without stutter and Vsync was forcing it via RadeonPro... But this is not good anyway..


----------



## Jpmboy

Quote:


> Originally Posted by *SpewBoy*
> 
> I tried just upping the mem clock to past 1362MHz previously (using Afterburner and HIS BIOS) and it resulted in a performance drop and instability. Tomorrow I'll give it another go with GPUTweak + ASUS BIOS and failing that, I might just turn to the Sapphire or reference BIOS because apparently people have been having some better luck with them. Gon' be doin' some intense BIOS hopping.
> 
> EDIT: Oh, and for some reason Ghosts produces horrible artefacting and crippling black screens / weird pixelation and scanlines at my otherwise stable core overclocks. Had to dial it back to 1100 / 1362 and even then I get occasional artefacting (which can be reproduced all the time with certain weapon sights... lol). These cards (and this game) are proving to be very finicky. Hopefully will be improved with new drivers.
> 
> Also, anyone know *how to get +200mv with Afterburner*? Someone mentioned that they read it was possible through command line. Me want need.


To set a +100mV offset:
MSIAfterburner /wi4,30,8d,10

To restore the original voltage offset:
MSIAfterburner /wi4,30,8d,0

shift-right-click in the MSI Afterburner folder -> "Open command window here" - _type in MSIAfterburner /wi4,30,8d,10_

The last argument is hex, so 50mV is "8". When you then add +100 in AB, it's on top of this offset. Each hex increment = 6.25mV

*Use at your own risk!!*
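For other offsets, the last argument is just the step count in hex at 6.25mV per step - here's a quick sketch of that arithmetic (plain Python, purely illustrative; the `wi4,30,8d` part is taken verbatim from the commands above and stays the same):

```python
def ab_voltage_arg(mv_offset):
    """Convert a desired voltage offset in mV to the hex step value
    Afterburner's /wi switch expects (1 step = 6.25 mV)."""
    steps = round(mv_offset / 6.25)
    return format(steps, "x")  # hex string without the "0x" prefix

# +100 mV -> 16 steps -> "10", i.e. MSIAfterburner /wi4,30,8d,10
print(ab_voltage_arg(100))  # 10
# +50 mV -> 8 steps -> "8"
print(ab_voltage_arg(50))   # 8
# +200 mV -> 32 steps -> "20"
print(ab_voltage_arg(200))  # 20
```

So the +200mV people keep asking about would be `MSIAfterburner /wi4,30,8d,20` - same warning applies, at your own risk.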

which driver are you using?


----------



## arang

Quote:


> Originally Posted by *tsm106*
> 
> That's not where ulps is traditionally disabled. It's in the CLASS directory. When overclocking above Overdrive limits you will BSOD with ULPS enabled if you use an unofficial overclock method app like gputweak or AB in unofficial mode. That's one of the obvious ways to tell if ulps is on or not.


Thank you for the detailed information. I misuploaded.. the present setting is 1100/1300, not 1150/1400 (that is my MAX at stock voltage), and I will test more.


----------



## Kenshiro 26

Valley run @ 1150/1600.


----------



## CriticalHit

Quote:


> Originally Posted by *zpaf*
> 
> Anyone with R9 290X to see score from MetroLL with the same settings ?
> I think this game loves more shaders.


but i feel it has a certain dislike for crossfire.... CFX gpu usage really bounces around on my rig in LL..

not the most consistent graph, sadly.


----------



## zpaf

Quote:


> Originally Posted by *CriticalHit*
> 
> but i feel it has a certain dislike for crossfire .... CFX gpu usage really bounces around on my rig with LL..
> 
> not the most consistent graph sadly .


Can you disable CF and give us the best from one card ?


----------



## spitty13

So are any of you guys overvolting your memory? If so, what is the overclockability like, and what software are you using?


----------



## grandpatzer

When you "unlock voltage monitoring" in MSI AB the core clocks fluctuate, so does this affect performance as well?
Should I *disable MSI AB's "unlock voltage monitoring"*?


----------



## SultanOfWalmart

Quote:


> Originally Posted by *spitty13*
> 
> So are any of you guys overvolting your memory? If so, what is the overclock ability like and what software are you using.


290/X does not have vmem control via software. Hard mods only. If you have Elpida chips it wouldn't matter much either way, as the chips pretty much take a nose-dive after 1600MHz regardless of temp and voltage...


----------



## spitty13

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> 290/x does not have vmem control via software. Hard mods only. If if you have Elpdia chips it wouldn't matter much either way, as the chips are pretty much take a nose-dive after 1600Mhz regardless of temp and voltage...


Well that's a bit of a bummer. Do you think software for memory will ever come out? My memory doesn't clock very high, so I was hoping that a little bit of tweaking would let me get to 1500.


----------



## Matess

Hello,
I have a question about the R9 290(X) cooler. Is it possible to dismount it so that the bottom steel plate stays in place and cools the RAM and VRMs, while removing the vapor chamber and heatsink? I want to replace that heatsink with a small water cooler. I wasn't able to google it...

thanks for answers...

Matess


----------



## zpaf

Quote:


> Originally Posted by *spitty13*
> 
> Well thats a bit of a bummer. You think software for memory will ever come out? My memory doesn't clock very high, so I was hoping that a little bit of tweaking would let me get to 1500.


You can get 1500 on the memory if you give more volts to the core.
Yes, it sounds strange, but it's true.
Try it.

I can do 1625 with max voltage.


----------



## ZealotKi11er

Is it really safe to use 1.412v on the GPU? When I had my HD 7970 I would never go above 1.25v, even though 1.3v was the MSI AB limit.


----------



## hotrod717

Quote:


> Originally Posted by *zpaf*
> 
> You can get 1500 on memory if you give more volts on core.
> Yes sounds strange but its true.
> Try it.
> 
> I can do 1625 with max voltage.


A screenshot of a completed bench would be nice. Anyone can plug those numbers in and show a screenshot of GPU Tweak and GPU-Z. The fact that it doesn't crash your desktop doesn't mean it's stable.


----------



## CriticalHit

Quote:


> Originally Posted by *zpaf*
> 
> Can you disable CF and give us the best from one card ?


sure
here you go

CFX disabled


CFX enabled


edit: ( wrong upload )


----------



## zpaf

Quote:


> Originally Posted by *hotrod717*
> 
> Screen shot of a completed bench would be nice. Anyone can plug those numbers in and show a screenshot of GPU Tweak and GPU-Z. The fact that it doesn't crash your desktop, doesn't mean it's stable.




AMD Radeon R9 290 video card benchmark result - Intel Core i7-4770K,ASUSTeK COMPUTER INC. MAXIMUS VI IMPACT


----------



## Jack Mac

Quote:


> Originally Posted by *zpaf*
> 
> 
> 
> AMD Radeon R9 290 video card benchmark result - Intel Core i7-4770K,ASUSTeK COMPUTER INC. MAXIMUS VI IMPACT


Dang, you make my bench-stable 1200 core 290 feel mediocre. You got a 1k higher GPU score than me. Anyway, do these Heaven results look normal for 1200 core / 1300 memory?


----------



## SultanOfWalmart

Quote:


> Originally Posted by *zpaf*
> 
> 
> 
> AMD Radeon R9 290 video card benchmark result - Intel Core i7-4770K,ASUSTeK COMPUTER INC. MAXIMUS VI IMPACT


Nice. Mine can't do over 1550 without going black screen.











----------



## zpaf

Quote:


> Originally Posted by *Jack Mac*
> 
> Dang, you make my bench stable 1200 Core 290 feel mediocre. You got 1k higher GPU score than me. Anyway, do these heaven results look normal for 1200 Core/1300 memory?


Heaven and Valley both run with problems on the R9.
They show strange gpu usage, and the clocks have ups and downs.
It's better to test with 3DMark 11/13 or with an in-game benchmark like Tomb Raider, Metro LL, Hitman...


----------



## Jpmboy

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Is is really safe to use 1.412v on the GPU? When i had my HD 7970 i would never go above 1.25v even though 1.3v was the MSI AB limit.


yes - the vdroop is so extreme that 1.412V in GPUT is 1.28-1.3V actual


----------



## Jpmboy

Quote:


> Originally Posted by *zpaf*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> AMD Radeon R9 290 video card benchmark result - Intel Core i7-4770K,ASUSTeK COMPUTER INC. MAXIMUS VI IMPACT


let's get some more 290s in here.
Read the rules in the OP. Easy!


----------



## r0cawearz

Quote:


> Originally Posted by *Jpmboy*
> 
> yes - the vdroop is so extreme that 1.412V in gpuT is 1.28-1,3V actual


Is that the vddc shown in GPU-Z? At maximum, my 1.34 in GPUT is 1.3v max.


----------



## brazilianloser

Quote:


> Originally Posted by *zpaf*
> 
> 
> 
> AMD Radeon R9 290 video card benchmark result - Intel Core i7-4770K,ASUSTeK COMPUTER INC. MAXIMUS VI IMPACT


Man, I got a higher score - 16300 - with lower volts at 1200/1500.


----------



## brazilianloser

Post your score over the 3dmark11 ranking forum.


----------



## ZealotKi11er

So, has anyone done any testing with different memory clocks to see what the benefits are? I plan to do it once I get my water block.


----------



## tivook

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So anyone has dont any testing with different memory clocks and see what the benefits are. I plant to do it once i get my water block.


I'm rather curious if overclocking the memory has any performance gain at all.

I reset the memory clock to stock 1250 mhz and ran some tests and then I clocked it up to 1500 mhz.

The only gain I got was 1 more fps in furmark and black screens all the time, like literally every 40 minutes or so.

This was with +100mv and 50% power limit.

If anyone has black screen problems with a 290X, OC'ed or not, I suggest either running the memory at stock or even lowering it to 1200 to see if that helps.

I'm running my 290x at 1150 core clock and 1250 mhz memory and I'm on my third day without a black screen. Before I ran it at 1150 / 1500 and would black screen all the time.

Tiv


----------



## Rar4f

If you have replaced the stock cooler on an R9 290(X) with an Arctic Xtreme 3, could you please be kind enough to provide feedback on how it's going, besides temperature and noise? Have you faced any problems?


----------



## zpaf

Quote:


> Originally Posted by *brazilianloser*
> 
> Man I got higher 16300 with lower volt and 1200/1500.


Yes, but my run is with no tweaks in the driver.
Quote:


> Originally Posted by *brazilianloser*
> 
> Well here is mine
> 
> brazilianloser --- i7 3770K @ 4.5GHz --- Asus 290 @ 1200/1500 --- P16303
> 
> http://www.3dmark.com/3dm11/7583561
> 
> 
> 
> *An extra 1300 points by turning tess off*


----------



## Jpmboy

Quote:


> Originally Posted by *r0cawearz*
> 
> Is that the vddc shown on gpuz? At maximum my 1.34 in gput is 1.3v max.


yeah - don't confuse GPU-Z MAX and load vddc. During sustained full load the vddc will drop below 1.3V even though you set 1.412V in GPUT. You need the PT3 bios to get around this, or issue commands directly to the VRM to disable LLC (very dangerous!!)


----------



## brazilianloser

Quote:


> Originally Posted by *zpaf*
> 
> Yes but my run is withno tweaks on driver.


sorry forgot about the different rules over there.


----------



## Jpmboy

Quote:


> Originally Posted by *brazilianloser*
> 
> sorry forgot about the different rules over there.


you can post a score with any settings within HWBot rules.


----------



## brazilianloser

Quote:


> Originally Posted by *zpaf*
> 
> Yes but my run is withno tweaks on driver.


post those scores there, and ones with tess off, to knock some of those Nvidia guys off the top.


----------



## zpaf

Quote:


> Originally Posted by *Jpmboy*
> 
> yes - the vdroop is so extreme that 1.412V in gpuT is 1.28-1,3V actual


You are right.


----------



## zpaf

Quote:


> Originally Posted by *CriticalHit*
> 
> sure
> here you go
> 
> CFX disabled
> 
> 
> CFX enabled
> 
> 
> edit: ( wrong upload )


I don't know why, but you can do better.
Here is my card at defaults.


----------



## Jpmboy

Quote:


> Originally Posted by *brazilianloser*
> 
> sorry forgot about the different rules over there.


Tess on, no voltage or LLC hacks:
http://www.3dmark.com/3dm11/7498155

Tess off (not posted to csalt's bench thread)
http://www.3dmark.com/3dm11/7491482

I struggle to break 18K graphics. Z has a very good card.


----------



## Arizonian

Question: I didn't 'save' my 3DMark11 scores, so I have no way to upload them to the HOF. I still have the URLs from while I was logged in when I ran them. Is there a way to submit my score, or am I out of luck?

I've never submitted a score before.


----------



## Jpmboy

Quote:


> Originally Posted by *Arizonian*
> 
> Question: If I didn't 'save' my 3DMark11 scores so have no way to upload them to HOF. I still have a URL while I was logged on when I ran them. Is there a way to submit my score or am I out of luck?
> 
> I've never submitted score before.


Yes, create an account at Futuremark, and in "settings" use the upload results function.

Sorry, wait - you do not have to actively save the results. Look in your My Documents folder for the 3DMark11 folder. They're in there.


----------



## Arizonian

Quote:


> Originally Posted by *Jpmboy*
> 
> Yes, create an account at futuremark, and in "settings" use the upload results function.


I have an account. I didn't save any of the 3DMark11 scores, so I have nothing to submit. Luckily Firestrike auto-saves, but right now it's giving me an error when I try to upload.

I only have the 3DMark11 URL's bookmarked.

On a side note - new members that would like to be added to roster please read OP how to submit your proof. I see a lot of members who are just not providing an OCN name in pic or with GPU-Z validation link.


----------



## Jpmboy

Quote:


> Originally Posted by *Arizonian*
> 
> I have an account. I didn't save any of the 3DMARK11 scores so have nothing to submit. Luckily the Firestrike auto saves but right now giving me an error when I try to upload.
> 
> I only have the 3DMark11 URL's bookmarked.
> 
> On a side note - new members that would like to be added to roster please read OP how to submit your proof. I see a lot of members who are just not providing an OCN name in pic or with GPU-Z validation link.


Plz read edit in post above...


----------



## zpaf

Quote:


> Originally Posted by *Jpmboy*
> 
> Tess on, no voltage or LLC hacks:
> http://www.3dmark.com/3dm11/7498155
> 
> Tess off (not posted to csalt's bench thread)
> http://www.3dmark.com/3dm11/7491482
> 
> Struggle to break 18K graphics. Z has a v good card.


Something went wrong in your Graphics Test 4


http://www.3dmark.com/compare/3dm11/7577071/3dm11/7498155

You have to play more with clocks.


----------



## Jpmboy

Quote:


> Originally Posted by *zpaf*
> 
> Something goes wrong on your Graphics Test 4
> 
> http://www.3dmark.com/compare/3dm11/7577071/3dm11/7498155
> You have to play more with clocks.


Yup, thanks. Just a quick run, no fine tuning. Pulled the 290x and put the SLI Titans back in this morning. Maybe I'll play with it later.


----------



## robert0507

Hi add me to the club


Sapphire R290X

http://www.techpowerup.com/gpuz/edkd8/


----------



## robert0507

3dmark.docx 1092k .docx file


3dmarkfirestrikescores.jpg 272k .jpg file


----------



## velocityx

Just got my first ever R9 290 black screen in BF4 (since I've owned it). Had to do a hard power-off / power-on reset.


----------



## robert0507




----------



## robert0507

Quote:


> Originally Posted by *robert0507*


These are with a slight overclock. VRM max temps are 58°C for VRM1 and 38°C for VRM2, using a Koolance GPU block.


----------



## Forceman

Quote:


> Originally Posted by *maynard14*
> 
> yes sir my v sync is on in asassins creed 4
> 
> @ sir raxus/ yes maybe its all in the drivers...


If Vsync is on, the card will throttle down to keep the frame rate at 60. That's why you are getting those fluctuations. Turn off Vsync and try it.
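The throttling Forceman describes falls out of simple frame-time arithmetic; a tiny sketch (all numbers here are illustrative, not taken from anyone's card):

```python
# With Vsync on, the GPU only has to finish each frame within one refresh
# interval, so it can downclock whenever it has headroom. Illustrative
# numbers: a 60 Hz display and a card that could render a frame in 10 ms.
refresh_hz = 60
frame_budget_ms = 1000 / refresh_hz   # ~16.67 ms allowed per frame
render_time_ms = 10.0                 # hypothetical full-speed frame time

idle_fraction = 1 - render_time_ms / frame_budget_ms
print(f"GPU idle for {idle_fraction:.0%} of each frame")  # → GPU idle for 40% of each frame
```

That idle headroom is what shows up as fluctuating clocks in the monitoring graphs; with Vsync off the card renders flat-out and the clocks hold steady.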
Quote:


> Originally Posted by *grandpatzer*
> 
> When you "unlock voltage monitoring" in MSI AB the core clocks fluctuate, so does this effect performance also?
> So should I *disable MSI AB "unlock voltage monitoring*?


It shouldn't affect performance, it's just a monitoring issue, but once you've checked the voltage there's really no reason to leave monitoring on.
Quote:


> Originally Posted by *Matess*
> 
> Hello,
> I have a question about r9 290(x) cooler. Is it possible to dismount it so that bottom steel stays there and cools ram and vrm and remove vapor chamber with a heatsink? I wanna replace that heatsink with small water cooler. I wasnt able to google it...
> 
> thanks for answers...
> 
> Matess


No, not without cutting up the stock cooler with a hacksaw or something. They are both attached together.


----------



## Matess

Quote:


> Originally Posted by *Forceman*
> 
> No, not without cutting up the stock cooler with a hacksaw or something. They are both attached together.


Is it glued or what? It's metal on the VRM and RAM, copper on the vapor chamber, and aluminium on the heatsink...


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Matess*
> 
> is it glued or what? it is metal on vrm and ram, cuprum on vaporchamber a aluminium on heatsink....


I think so, but it's very tough adhesive. You're probably not gonna get it apart without damaging something.

I know a user here hacksawed the VRM cooling off and used that with an Accelero Hybrid, I think. He said it works fairly well.


----------



## stickg1

What seems to be an average max OC for the 290? I'm looking at one that can hit around 1200MHz, I was just wondering if this was good, average, or poor?


----------



## Ukkooh

Any reviews out for the different water blocks yet? I'm going to order my watercooling stuff next wednesday and I have no idea which block I should get. EK's blocks are going to be 10-30€ cheaper for me because those are the only ones I can get without postage.


----------



## Jack Mac

Quote:


> On a side note - new members that would like to be added to roster please read OP how to submit your proof. I see a lot of members who are just not providing an OCN name in pic or with GPU-Z validation link.


*1. GPU-Z Link or Screen shot with OCN Name
2. Manufacturer & Brand
3. Cooling - Stock, Aftermarket or 3rd Party Water*

http://www.techpowerup.com/gpuz/f57xz/ - GPU-Z validation link.
Sapphire R9 290
Stock cooling
Validation photo:

This good enough?


----------



## stickg1

Quote:


> Originally Posted by *stickg1*
> 
> What seems to be an average max OC for the 290? I'm looking at one that can hit around 1200MHz, I was just wondering if this was good, average, or poor?


Also, how hot and how loud is the stock cooling? I plan to use the card (R9 290) for Team Competition Folding@Home and will fold nearly 24/7.

I know these questions probably have been covered already in this thread, I did a brief keyword search and didn't find any definitive answers, I'm off to work and trying to decide on pulling the trigger on a used 290, if anyone can toss me the info I need I will gladly rep you. Will check back later from the phone. Thanks in advance.


----------



## Forceman

Quote:


> Originally Posted by *stickg1*
> 
> What seems to be an average max OC for the 290? I'm looking at one that can hit around 1200MHz, I was just wondering if this was good, average, or poor?


1150 on air, 1200 on water (with voltage control).


----------



## grandpatzer

While gaming the core drops under 947MHz. I need 144fps; how do I make the core stay solid at 947MHz?!


----------



## tivook

Quote:


> Originally Posted by *grandpatzer*
> 
> While gaming the core drops under 947MHz. I need 144fps; how do I make the core stay solid at 947MHz?!


As far as I know, that's the new thing about Hawaii: the core has a variable clock speed. It's normal.


----------



## Forceman

Quote:


> Originally Posted by *grandpatzer*
> 
> While gaming the core drops under 947MHz. I need 144fps; how do I make the core stay solid at 947MHz?!


If it is throttling (most likely) then you need to turn up the fan speed.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *stickg1*
> 
> Also, how hot and how loud is the stock cooling? I plan to use the card (R9 290) for Team Competition Folding@Home and will fold nearly 24/7.
> 
> I know these questions probably have been covered already in this thread, I did a brief keyword search and didn't find any definitive answers, I'm off to work and trying to decide on pulling the trigger on a used 290, if anyone can toss me the info I need I will gladly rep you. Will check back later from the phone. Thanks in advance.


If you overvolt, the stock cooling honestly can get pretty loud. At 1200/6000 @ 1.28v and 70% fan my core would be about 70-75c, VRMs under 60c.

If you get an aftermarket air cooler (have the Gelid myself) or water cooling the card will be much quieter and oc better. Temps definitely affect max oc.

Overall though, the 290 is a great card and you can't go wrong with one.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ukkooh*
> 
> Any reviews out for the different water blocks yet? I'm going to order my watercooling stuff next wednesday and I have no idea which block I should get. EK's blocks are going to be 10-30€ cheaper for me because those are the only ones I can get without postage.


EK is best.


----------



## ImJJames

Quote:


> Originally Posted by *Kenshiro 26*
> 
> Valley run @ 1150/1600.


290's are so bad in Valley; that's literally only 5fps more than my 7970 @ 1180/1680.


----------



## robert0507

XSPC has a new block, and it's gorgeous. I also like the Hawaii block by Alphacool because the backplate has active cooling. Only problem is that both blocks won't be available until the end of the month or early next year.


----------



## Scorpion49

My Koolance blocks seem to be doing well; after 4 or 5 hours of BF4 at 1000/1300 my temps are around 50°C on both cards. It would be lower but my fans are at 900 RPM so it's very quiet. I'd rather have less noise and slightly worse temps.


----------



## VSG

My babies are finally alive: http://www.3dmark.com/fs/1239040



Details:
1) Stock cooling, even on CPU








2) Sapphire 290x - Elpida - ASIC 72.2%
3) Asus 290x - Elpida - 77.5%
4) i7 4770k at 3.5 GHz
5) Overclocked to 1100 core, 1300 memory using MSI Afterburner at stock volts (I can probably go even higher, but will wait till these are under water along with that bottlenecking CPU)
6) Driver: 13.11 beta 9.4

I will run Heaven and/or Valley for a couple of times and then play a few games to make sure I don't have to RMA these cards. So far so good though, I was afraid being one of the very first to buy these cards would come back and bite me with black screens.

Update: More benchmarks as a reference for comparison against when I put these under water and do some proper overclocking


Spoiler: Warning: Spoiler!



Heaven 4.0/1080p/Tess off










Heaven 4.0/1080p/Normal Tess










Valley 1.0/1080p/Extreme HD


----------



## Imprezzion

Does anyone know why my R9 290, unlocked to 290X using the PT1.ROM BIOS, is artifacting / losing signal to my screen in a flickery way whenever I push voltage past 1.412v in GPU Tweak (the max a regular BIOS can do)?

Even at 1.440v it's unusable.

This only happens when there's a 3D load applied, btw.

Cooling for the core is an Accelero Hybrid; the VRMs are cooled by a cut-apart stock cooler's baseplate. Neither core nor VRMs go over 70c in these tests. Also, the screen flickering and such happens instantly under 3D load when temps are still <50c.

EDIT: Seems to be a bad flash or something? I mean, EVERY value above 1412mv does it. ANY value under it is fine. Even 1413mv or 1425mv instantly acts weird and flashes the screen and such.


----------



## Scorpion49

My cards are not very good overclockers even under water. The most I can get without artifacts is 1125/1300.

http://www.3dmark.com/3dm11/7594068

http://www.3dmark.com/fs/1239160


----------



## SultanOfWalmart

Quote:


> Originally Posted by *Imprezzion*
> 
> Does anyone know why my R9 290 unlocked to 290X using PT1.ROM BIOS is artifacting / losing signal to my screen in a flickery way whenever I push voltage past 1.412v in GPU Tweak (The max a regular BIOS can do)
> 
> Even at 1.440v it's unusable.
> 
> This only happens when there's a 3D load applied btw.
> 
> Cooling for the core is an Accelero Hybrid; the VRMs are cooled by a cut-apart stock cooler's baseplate. Neither core nor VRMs go over 70c in these tests. Also, the screen flickering and such happens instantly under 3D load when temps are still <50c.
> 
> EDIT: Seems to be a bad flash or something? I mean, EVERY value above 1412mv does it. ANY value under it is fine. Even 1413mv or 1425mv instantly acts weird and flashes the screen and such.


It's very unlikely that it is a bad flash; more than likely it's just a bit of a dud and is hitting the wall. Just the luck of the draw, I'm in the same boat with mine. Can't complain too much though: a 1200/1150 290X + BF4 for $370. Might not be a bench king, but still a hell of a deal any way you look at it.


----------



## Jpmboy

Quote:


> Originally Posted by *Imprezzion*
> 
> Does anyone know why my R9 290 unlocked to 290X using PT1.ROM BIOS is artifacting / losing signal to my screen in a flickery way whenever I push voltage past 1.412v in GPU Tweak (The max a regular BIOS can do)
> Even at 1.440v it's unusable.
> This only happens when there's a 3D load applied btw.
> Cooling for the core is a Accelero Hybrid, VRM's are cooled by a cut apart stock coolers baseplate. Neither core nor VRM's go over 70c in these tests. Also, the screen flickering and stuff happens instantly under 3D load when temps are still <50c.
> EDIT: Seems to be a bad flash or something? I mean, EVERY value above 1412mv does it. ANY value under it is fine. Even 1413mv or 1425mv instantly acts wierd and flashes screen and such.


More vddc is not necessarily gonna allow higher clocks. As tsm put it, you need to find the lowest vddc for a given clock; overvolting will lead to instability. Looks to me like you're a few mV from frying that card...


----------



## MrWhiteRX7

I just did a bunch of benches today at 1200 / 1500 on air.


----------



## Jpmboy

Quote:


> Originally Posted by *Scorpion49*
> 
> My cards are not very good overclockers even under water. The most I can get without artifacts is 1125/1300.
> 
> http://www.3dmark.com/3dm11/7594068
> 
> http://www.3dmark.com/fs/1239160


What BIOS? What control software for the OC? Asus BIOS + GPU Tweak should do better than that.


----------



## SpewBoy

Okay, leaving the core at 1000 and overvolting to "1.412v" in GPUTweak does *not* help me get better memory clocks. 1400MHz is as much as I can muster and even then, it will probably eventually black screen if I bench for long enough. Just compared 1400MHz Valley score to 1250MHz score and I get about 60 more points.

I haven't benched each card separately because a) they are under water and b) for some reason Valley crashes if I unsync their clocks.

I'll try the OEM BIOS and see if that helps. If not, I'll give the Sapphire BIOS a go.


----------



## Imprezzion

Quote:


> Originally Posted by *Jpmboy*
> 
> more vddc is not necessarily gonna allow higher clocks. As tsm put it - you need to find the lowest vddc for a given clock... overvolting will lead to instability. Looks to me like you're a few mV from frying that card...


That's what my fear was as well: that even though temps are low enough, it isn't liking it one bit.

I might even buy another unlockable 290, or a full 290X, just to see whether it clocks better. I find 1170MHz a bit sad at full open voltage with ASUS.ROM.


----------



## Jack Mac

Does anyone know of a way to force 2D clocks on AMD? My 290 wants to idle at the desktop with two monitors plugged in at my 3D clocks for some reason.


----------



## Ricdeau

Quote:


> Originally Posted by *Scorpion49*
> 
> My Koolance blocks seem to be doing well, after 4 or 5 hours of BF4 at 1000/1300 my temps are around 50*C on both cards. It would be lower but my fans are at 900 RPM so its very quiet, I'd rather have less noise and slightly worse temps.


Glad to hear my temps with the EK blocks are about the same. I usually hover around 49C in games. I also run my fans pretty quiet.


----------



## JordanTr

Quote:


> Originally Posted by *Jack Mac*
> 
> Does anyone know of a way to force 2D clocks on AMD? My 290 wants to idle at the desktop with two monitors plugged in at my 3D clocks for some reason.


It looks like you disabled ULPS through AB. Enable it if I'm right.


----------



## Jpmboy

Quote:


> Originally Posted by *SpewBoy*
> 
> Okay, leaving the core at 1000 and overvolting to "1.412v" in GPUTweak does *not* help me get better memory clocks. 1400MHz is as much as I can muster and even then, it will probably eventually black screen if I bench for long enough. Just compared 1400MHz Valley score to 1250MHz score and I get about 60 more points.
> 
> I haven't benched each card separatly because a) they are under water and b) for some reason Valley crashes if I unsync their clocks.
> 
> I'll try the OEM BIOS and see if that helps. If not, I'll give the Sapphire BIOS a go.


Uh - you should try this the other way around: set the cards to stock vddc and increase the core clock until it is unstable, then add more vddc... repeat until it goes as far as the silicon allows. Same for memory.

We all need a tutorial from tsm106 for AMD GPUs!
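The stepwise procedure Jpmboy describes can be sketched as a simple search loop. This is a toy illustration only: `is_stable` stands in for an actual stress-test run (Valley, 3DMark, etc.), and the numbers in the fake stability model are made up.

```python
# Hypothetical sketch of the tuning loop: raise the core clock at stock
# voltage until it fails, bump vddc one step, and repeat until extra
# voltage stops helping or the chosen safety limit is reached.

def find_max_clock(is_stable, start_clock, stock_mv, max_mv,
                   clock_step=10, mv_step=25):
    """Return (best_clock, mv_used) found by the step-up procedure."""
    clock, mv = start_clock, stock_mv
    best = (clock, mv)
    while True:
        # Push the clock as far as it will go at the current voltage.
        while is_stable(clock + clock_step, mv):
            clock += clock_step
            best = (clock, mv)
        # Unstable at the next step: add voltage and try again.
        if mv + mv_step > max_mv:
            return best
        mv += mv_step
        if not is_stable(clock + clock_step, mv):
            return best  # more vddc no longer helps

# Toy stability model: each +25 mV buys roughly 30 MHz of headroom.
def fake_stability(clock, mv):
    return clock <= 1100 + (mv - 1250) // 25 * 30

print(find_max_clock(fake_stability, 1000, 1250, 1350))  # → (1220, 1350)
```

The point of doing it this direction, rather than maxing voltage first, is that you only ever carry the minimum vddc a given clock actually needs, which keeps heat and stress down.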


----------



## Scorpion49

Quote:


> Originally Posted by *Jpmboy*
> 
> what bios? what control software for the oc? Asus bios + gpuTweak should do better than that.


Stock BIOS. The cards were worthless with the Asus BIOS; they wouldn't even run at stock settings.


----------



## SpewBoy

Quote:


> Originally Posted by *Jpmboy*
> 
> uh - you should try this the other way around. set the cards to stock vddc and increase the core clock until it is unstable.. then add more vddc... repeat until it goes as far as the silicon allows. same for memory.
> 
> we all need a Tutorial from tsm106 for AMD GPUs!


That's what I did first time around. Best I could get out of the memory at any voltage and clocks was 1362.

Trying the OEM BIOS and for the first time ever, I've managed to run through an entire Valley bench without black screening at 1450MHz. Will see how high I can go. Score was slightly worse than 1400 but within margin of error.


----------



## Jack Mac

Quote:


> Originally Posted by *JordanTr*
> 
> It looks like you disabled ULPS through AB. Enable it if I'm right.


That's not it, ULPS is enabled.


----------



## SpewBoy

Seems I can't get past 1450 on mem at stock voltage, will try increasing core voltage and see if that makes a diff.


----------



## Jpmboy

Quote:


> Originally Posted by *Scorpion49*
> 
> Stock BIOS. The cards were worthless with the Asus BIOS; they wouldn't even run at stock settings.


Quote:


> Originally Posted by *SpewBoy*
> 
> That's what I did first time around. Best I could get out of the memory at any voltage and clocks was 1362.
> Trying the OEM BIOS and for the first time ever, I've managed to run through an entire Valley bench without black screening at 1450MHz. Will see how high I can go. Score was slightly worse than 1400 but within margin of error.


Wow! That's unusual. Maybe it's just my setup, but the Asus BIOS works well so far (with GPU Tweak). How are you unlocking vddc with the stock BIOS? Or are you OCing through CCC Overdrive?


----------



## VSG

Ok, I am in a bit of an interesting situation here. My Asus 290x (when tested by itself) gave a few black screens, and I have successfully talked Newegg into giving me a store credit of $590 after they received this RMA'd card, as the card isn't in stock over there. Do I use that credit for a 290 or a 290x from another brand for CFX with my existing Sapphire 290x? If the latter, what brand should I go for? Does it matter if the replacement card has Hynix memory if my Sapphire has Elpida? Thanks again!


----------



## SpewBoy

Quote:


> Originally Posted by *Jpmboy*
> 
> Wow! That's unusual. may be just my setup, but the asus bios works well so far. (with gpu tweak). How are you unlocking vddc with ther stock bios? Or are you OCing thru CCC overdrive?


For some reason I don't even have access to CCC Overdrive. The tab just is not there and I've tried all the tricks to make it show up, short of reinstalling the OS.

OCing through GPUTweak with the ASUS BIOS and through GPUTweak or Afterburner (tried both) with OEM BIOS. Afterburner makes the screen flash black and go slightly corrupt for a split second when applying any adjustments, which I might need to remedy (it did it before, and I always got crappy performance the higher I OC'd when this was the case, then I fixed it somehow and performance increased but now it is back).

These cards are not making it easy for me .__.

It seems any mem clock above 1450, even with +25 - 50mv, causes random black screens. Happens sometimes during benchmarks and sometimes just idle on the desktop.

MEMORY, Y U SO BAD >: (

EDIT: Oh, and my memory overclocks don't actually seem to increase performance...


----------



## Forceman

Beta 9.4 removed CCC Overdrive, so that might be what happened there. And 1450 is pretty good for the memory, a lot of people can't get that high.


----------



## Rar4f

Is there any significant difference between an AX3 R9 290 and an R9 290 from Asus, Gigabyte, MSI, etc., other than the warranty?


----------



## jomama22

Now we have a valid result! Should take first in FM hof:

3x 290x @ 1333/1625(6500)
3960x @ 5.1
http://www.3dmark.com/fs/1239528


----------



## SpewBoy

Quote:


> Originally Posted by *Forceman*
> 
> Beta 9.4 removed CCC Overdrive, so that might be what happened there. And 1450 is pretty good for the memory, a lot of people can't get that high.


Ohh, that's it then. I'm rockin' b9.4.


----------



## jrcbandit

Quote:


> Originally Posted by *SpewBoy*
> 
> For some reason I don't even have access to CCC Overdrive. The tab just is not there and I've tried all the tricks to make it show up, short of reinstalling the OS.
> 
> OCing through GPUTweak with the ASUS BIOS and through GPUTweak or Afterburner (tried both) with OEM BIOS. Afterburner makes the screen flash black and go slightly corrupt for a split second when applying any adjustments, which I might need to remedy (it did it before, and I always got crappy performance the higher I OC'd when this was the case, then I fixed it somehow and performance increased but now it is back).
> 
> These cards are not making it easy for me .__.
> 
> It seems any mem clock above 1450, even with +25 - 50mv, causes random black screens. Happens sometimes during benchmarks and sometimes just idle on the desktop.
> 
> MEMORY, Y U SO BAD >: (


The memory doesn't overclock well on the 290 series, especially since it doesn't have separate voltage control. Only very few lucky people can get 1600+ and still have 0 black screen issues or no corruption. Most are stuck at 1500 or below. I was running at 1500, but now even that seems unstable for me, while some people with 290s flashed to 290X can get around the same overclocks... I feel ripped off, lol. However, my 1400 vs 1500 memory issues might also have something to do with my Haswell overclocks - even taking that into account the best I could get on watercooling was 1500...

My core clock also caps out at 1230, can't get anything higher at the Asus bios voltage limit, while people with flashed 290s can get 1200....


----------



## broken pixel

Quote:


> Originally Posted by *tsm106*
> 
> Oh snap he went out like James Dean, in a Porsche.


He was not driving the car; another guy was.


----------



## ReHWolution

Hey guys! Sorry for not posting here in these days, I'm still waiting for Swiftech to reply to my e-mail so basically I got no news :\


----------



## UNOE

Does Trixx unlock mem voltage?


----------



## stickg1

Thanks for the advice. I went ahead and got a Power Color R9 290. Can't wait for it to get in!


----------



## kot0005

Well, Asus is not getting my money for graphics cards anymore. I wasn't able to RMA my 2-day-old card with coil whine because the warranty sticker is missing... No PC for 3 weeks now.


----------



## VSG

They don't specify anywhere that it is a warranty sticker on the card, can't you tell them that and feign ignorance?


----------



## Forceman

Quote:


> Originally Posted by *UNOE*
> 
> Does Trixx unlock mem voltage ?


I don't think Trixx has been updated for 290 cards yet, and from what Unwinder said there won't be any memory control on any reference cards.


----------



## MrWhiteRX7

I'm able to crank memory clocks up using the Aux setting on AB17. I've been as high as 1600 but anything over 1500 wasn't worth it as far as fps gains.


----------



## blazestalker100

I clocked the core to 1135 and the memory to 1500 without overvolting.

I got 210 artifacts during an 11-minute stress test.

How bad is that? Should I overvolt or lower the memory? Any suggestions? How can I tell from the artifacts whether the clock is too high?
I'm using EVGA OC Scanner.


----------



## quakermaas

Quote:


> Originally Posted by *jomama22*
> 
> Now we have a valid result! Should take first in FM hof:
> 
> 3x 290x @ 1333/1625(6500)
> 3960x @ 5.1
> http://www.3dmark.com/fs/1239528


----------



## RAFFY

Quote:


> Originally Posted by *geggeg*
> 
> Ok I am in a bit of an interesting situation here- My Asus 290x (when tested by itself) gave a few black screens and I have successfully talked Newegg into giving me a store credit of $590 after they received this RMA'd card as the card isn't in stock over there. Do I use that cash for a 290 or a 290x from another brand for CFX with my existing Sapphire 290x? If the latter, what brand should I go for? Does it matter if the replacement card has hynix memory if my Sapphire has elpida? Thanks again!


Maybe I'm weird, but I prefer to have all my graphics cards from the same manufacturer. Get another Sapphire 290x. I've never had a problem with them, nor have I with ASUS. Those would be my top two picks.
Quote:


> Originally Posted by *SpewBoy*
> 
> Ohh, that's it then. I'm rockin' b9.4.


+1. For me, 9.4 has increased the stability of my cards. Before 9.4 I would get a few black screens / game crashes every day. Now I may get one game crash every two or more days. It still sucks that I'm having issues at all, but at least the driver updates are working.

Quote:


> Originally Posted by *kot0005*
> 
> Well, Asus is not getting my money for Graphics cards anymore. Wasn't able to get my 2 day old card with coil whine because the warranty sticker is missing.....No pc for 3 weeks now.


That sucks but at the same time you can't be mad at ASUS. It's part of the game we all play here.


----------



## MrWhiteRX7

Ordered a Rampage IV board; time to go Ivy-E and get more 290's!


----------



## SpewBoy

Quote:


> Originally Posted by *RAFFY*
> 
> +1 for me 9.4 has increase the stability of my cards. Before 9.4 I would get a few black screens/ game crashes every day. Now I may get one game crash every two or more days. It still sucks that I'm having at issues but at least the driver updates are working.


Yeah, 9.4 fixed the black screen issues for me at stock clocks. Overclocks are a different matter entirely. What appears stable in one game is cripplingly bad in another game, makes it difficult to find what is actually stable. And from the looks of it, to get stable in Ghosts the best overclock I can do is around... 1100mhz on the core ROFL. Memory is ok at 1362. Yet in other games, the core overclock is stable up to 1250mhz.

Speaking of the 9.4 drivers, they mentioned:
Quote:


> Improves AMD CrossFire™ scaling in the multi-player portion of Call of Duty®: Ghosts


My cards have -30% scaling in CF with this game, LOL. And for some reason when I enable v-sync I get better, more stable FPS (without v-sync it hovers between 72 and 82 fps at a given location; with v-sync it sits at a solid 82 fps). I also get no noticeable input lag, and believe me, I normally wouldn't touch v-sync with a 10-foot pole because of input lag (I honestly don't notice or care about screen tearing one bit - I'm lucky). I guess it is because the game is capped at 90 fps and therefore I set my refresh rate to 90Hz, which can prevent input lag apparently.

I turned on v-sync because some guy on the Steam forums said he found a fix for CF. From my testing, it was simply the fact that he enabled v-sync that caused better fps; the CF scaling was still just as bad.


----------



## Arizonian

Quote:


> Originally Posted by *robert0507*
> 
> Hi add me to the club
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Sapphire R290X
> 
> http://www.techpowerup.com/gpuz/edkd8/


Congrats - added








Quote:


> Originally Posted by *Jack Mac*
> 
> *1. GPU-Z Link or Screen shot with OCN Name
> 2. Manufacturer & Brand
> 3. Cooling - Stock, Aftermarket or 3rd Party Water*
> 
> http://www.techpowerup.com/gpuz/f57xz/ - GPU-Z validation link.
> Sapphire R9 290
> Stock cooling
> Validation photo:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> This good enough?


Congrats - added









Thank you so much for paying attention to the OP submission request.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kot0005*
> 
> Well, Asus is not getting my money for Graphics cards anymore. Wasn't able to get my 2 day old card with coil whine because the warranty sticker is missing.....No pc for 3 weeks now.


Sorry to hear that, I'm in a similar situation with PCCG over my Sapphire card... they weren't happy I removed the stock cooler and now they aren't sure if I can get a refund...


----------



## kot0005

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Sorry to hear that, I'm in a similar situation with PCCG over my Sapphire card... they weren't happy I removed the stock cooler and now they aren't sure if I can get a refund...


My cards are from PCCG as well. How come, dude? The Sapphire one does not have a warranty void sticker. The Asus one does; it's a round sticker on one of the screws of the metal frame on the back of the GPU. I got a refund for the Sapphire one on Friday because it would not be detected at all. Paid an additional $69 just to realise that Asus does not honor warranty for installing a waterblock, and it had a coil whine issue, which is pretty noticeable and irritating if you have a silent watercooling loop. I know I should have tested it before installing the block, but I had no choice; the card wouldn't fit in as the chipset block fittings were getting in the way ;/

After a lot of begging and arguing, the guy did not budge at all even though it was 2 days old. So I have to wait for Asus to repair/replace it.

I seriously should have bought a next gen console. 3 weeks of my gaming holidays down the drain now.

Did you physically damage it?


----------



## Sgt Bilko

Quote:


> Originally Posted by *kot0005*
> 
> My cards are from PCCG as well. How come, dude? The Sapphire one does not have a warranty void sticker. The Asus one does; it's a round sticker on one of the screws of the metal frame on the back of the GPU. I got a refund for the Sapphire one on Friday because it would not be detected at all, paid an additional $69 just to realise that Asus does not honor warranty for installing a waterblock, and it had a coil whine issue, which is pretty noticeable and irritating if you have a silent watercooling loop.
> 
> After a lot of begging and arguing, the guy did not budge at all even though it was 2 days old. So I have to wait for Asus to repair/replace it.
> 
> I seriously should have bought a next gen console. 3 weeks of my gaming holidays down the drain now.
> 
> Did you physically damage it?


No, not damaged, but PCCG aren't too hopeful that Sapphire will honour it... I hope they do, otherwise that's $800 down the drain.


----------



## kot0005

Quote:


> Originally Posted by *Sgt Bilko*
> 
> No, not damaged but PCCG aren't too hopeful that Sapphire will Honour it.........i hope they do, otherwise thats $800 down the drain


How did they even know that you removed the reference cooler? Did you tell em haha?


----------



## Sgt Bilko

Quote:


> Originally Posted by *kot0005*
> 
> How did they even know that you removed the reference cooler? Did you tell em haha?


They asked why the screws had paint off them; I said I dunno... don't think they believed me somehow, lol.


----------



## Fahrenheit85

Just to confirm before I buy on Amazon: the Sapphire 290x doesn't have a warranty sticker, right? Does Sapphire tend to be good with RMAs and whatnot? With Amazon Prime I'd just use them if I got a dead one out of the box, but down the road if I have an issue I would have to send it to Sapphire.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Fahrenheit85*
> 
> Just to confirm before I buy on amazon. The Sapphire 290x don't have a warranty sticker right? Does Sapphire tend to be good with RMAs and what not? With Amazon prime id just use them if i got a dead on out of the box but down the road if I have an issue i would have to send it to Sapphire


Sapphire does not have any sticker over any screws; as for the RMA... well, give me a few days and I will be able to let you know.


----------



## kot0005

Quote:


> Originally Posted by *Fahrenheit85*
> 
> Just to confirm before I buy on amazon. The Sapphire 290x don't have a warranty sticker right? Does Sapphire tend to be good with RMAs and what not? With Amazon prime id just use them if i got a dead on out of the box but down the road if I have an issue i would have to send it to Sapphire


As far as I know, sapphire and gigabyte versions don't have a warranty void sticker.


----------



## Fahrenheit85

If the GB version was on Amazon Prime I'd get that. Had great service from them in the past.


----------



## tivook

Quote:


> Originally Posted by *Fahrenheit85*
> 
> Just to confirm before I buy on amazon. The Sapphire 290x don't have a warranty sticker right? Does Sapphire tend to be good with RMAs and what not? With Amazon prime id just use them if i got a dead on out of the box but down the road if I have an issue i would have to send it to Sapphire


My Sapphire 290X did not have any warranty stickers, 100% sure.

However, my dealer told me they won't honor any warranty issues if the stock cooler has been removed.

The dealer also told me that if I swap the waterblock back for the stock cooler prior to an RMA, they really can't say I tampered with it, so I should be fine as long as the stock cooler is on if I ever need to RMA.


----------



## Brian18741

Hoping to pick up one of these soon. Do we have a favorite brand yet for overclocking? Are there any with locked voltage?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Brian18741*
> 
> Am hoping to pick up one of these soon. Do we have a favorite brand yet re overclocking? Is there any with locked voltage?


All have unlocked core voltage, but the Asus BIOS lets you add more voltage I believe, and you can flash your card to it if needed.


----------



## Fahrenheit85

Quote:


> Originally Posted by *tivook*
> 
> My Sapphire 290X did not have any warranty stickers 100% sure.
> 
> However, my dealer told me they won't honor any warranty issues if the stock cooler have been removed.
> 
> The dealer also told me that if I replace the waterblock with the stock cooling prior to a RMA they really can't say you tampered with it so.. I should be fine as long as the stock cooler is on if I ever need to RMA.


It's one of those "if you don't tell them, how will they know" deals. Plus, to me it seems that if it's not DOA or dead within a few days of use, then it won't die from wear and tear.


----------



## grandpatzer

Quote:


> Originally Posted by *Forceman*
> 
> If it is throttling (most likely) then you need to turn up the fan speed.


Really?

Because the 1080 radiator is keeping this GPU at 51°C (65°C VRM1, 52°C VRM2).
Those temps are from heavy benching at 1200 core / 1500 memory.

While I'm playing CS:GO I'm sure the temperatures are much lower...

I really need to have this GPU at 947MHz so I get my 144fps...

The 7950 was always solid @ 950MHz.


----------



## rdr09

Quote:


> Originally Posted by *SpewBoy*
> 
> Yeah, 9.4 fixed the black screen issues for me at stock clocks. Overclocks are a different matter entirely. What appears stable in one game is cripplingly bad in another game, makes it difficult to find what is actually stable. And from the looks of it, to get stable in Ghosts the best overclock I can do is around... 1100mhz on the core ROFL. Memory is ok at 1362. Yet in other games, the core overclock is stable up to 1250mhz.
> 
> Speaking of the 9.4 drivers, they mentioned:
> My cards have -30% scaling in CF with this game LOL. And for some reason when I enable v-sync I get better, more stable FPS (without v-sync it hovers between 72 and 82 fps at a given location, with v-sync it hovers at 82 fps solid). I also get no noticeable input lag and believe me, I stay away from v-sync with a 10 foot pole because of input lag (I honestly don't notice or care about screen tearing one bit - I'm lucky). I guess it is because the game is capped at 90 fps and therefore I set my refresh rate to 90Hz, which can prevent input lag apparently.
> 
> I turned on v-sync because some guy on the Steam forums said he found a fix for CF. From my testing, it was simply the fact that he enabled v-sync that caused better fps, the CF scaling was still just as bad


GPU scaling in CrossFire depends on the CPU, too. My i7 2700K at 4.5GHz was not able to push my 7950/7970 enough to get good scaling, and those are slower cards than these. Check your CPU usage.


----------



## rdr09

Quote:


> Originally Posted by *grandpatzer*
> 
> really?
> 
> because the 1080 radiator is keeping the temps at 51c for this gpu (65c VRM 1, 52c VRM2)
> Those temps are on 1200core/1500memory heavy benching.
> 
> While Im playing CSGO I'm sure the temperatures are much lower....
> 
> I really need to have this GPU at 947mhz so I get my 144fps..........
> 
> The 7950 always was solid @ 950mhz


Perhaps you need a faster CPU, or to OC higher? Is your CPU OC'ed at all?


----------



## grandpatzer

Quote:


> Originally Posted by *rdr09*
> 
> perhaps you need a faster cpu or oc higher? is your cpu oc'ed at all?


I can try that, but I never had this problem with the 7950 with all the same settings, so to me it feels more like something else.


----------



## Imprezzion

Is there any way to make MSI AB B17 voltage go to 1.412V? I don't really understand the whole command-line thing...

I'm getting so fed up with GPU Tweak not being able to apply clocks at boot, which MSI AB can...


----------



## rdr09

Quote:


> Originally Posted by *grandpatzer*
> 
> I can try that, never had this problem with the 7950 with same all around settings, so to me feels more like something else.


Don't tell me your i5 is at stock? Based on your sig, if that 290 is OC'ed that much, it's faster than two 7950s. You are bottlenecking it big time.

crossfire 7900 series

http://www.3dmark.com/3dm11/6233890

290 oc'ed

http://www.3dmark.com/3dm11/7556915


----------



## darkelixa

Finding it hard to find any decent 280X in stock in Australia; very tempted to just buy a 290.


----------



## velocityx

So yesterday I had my first black screen, and today I had my first red screen. I think these are pretty much the same thing, as I had to switch my machine off and on both times. Running a Gigabyte R9 290 with Elpida VRAM. I wonder what's up. I've had the card for about two weeks now and this is the first time it's happened. The card is not overclocked yet.


----------



## grandpatzer

Quote:


> Originally Posted by *rdr09*
> 
> don't tell me your i5 is at stock? based on your sig, if that 290 is oc'ed that much it is faster than 2 7950s. you are bottlenecking it big time.
> 
> crossfire 7900 series
> 
> http://www.3dmark.com/3dm11/6233890
> 
> 290 oc'ed
> 
> http://www.3dmark.com/3dm11/7556915


Actually, the overclocked 290 is about 20-30% slower than OC'ed CrossFire 79x0s.

As for my problem: my 7950 and 2500K were able to run the game fine [email protected]+
Also, I'm only using 947/1250 on the 290 in CS:GO; it's not a demanding game at all.


----------



## Banedox

Quote:


> Originally Posted by *darkelixa*
> 
> Finding it hard to find any decent 280x in stock in australia, very tempted to just buy a 290


Hey, just pick up a 290. Find a PowerColor, Sapphire BF4 Edition or XFX and see if it unlocks...


----------



## rdr09

Quote:


> Originally Posted by *velocityx*
> 
> so yesterday I had my first black screen, and today I had my first red screen. pretty much I think these are the same as I had to do a switch off/on on my machine. running gigabyte r9 290, elpida vram. I wonder what's up. I had my card for like 2 weeks now and it's the first time I had it. Card is not overclocked yet.


I would not waste another second with that card. RMA if you can. Sorry.
Quote:


> Originally Posted by *grandpatzer*
> 
> actually the 290 overclocked is about 20-30% slower then OC crossfire 79x0.
> 
> As for my problem my 7950 and 2500k where able to run fine the game [email protected]+
> Also I'm only using 947/1250 for 290 in CSGO, it's not a demanding game at all.


Oh, I know. But you should at least OC your CPU a bit. What driver are you using? Dang, CS:GO is on sale on Steam!

Edit: funny, though... the 7900s in CrossFire are faster, but for some reason I was not able to max out C3 with them (at stock; I never OC my GPUs for games). The 290 can max out C3 at stock at 1080p.


----------



## tivook

Quote:


> Originally Posted by *darkelixa*
> 
> Finding it hard to find any decent 280x in stock in australia, very tempted to just buy a 290


Or just buy a cheap 7970 and pretend it's a 280X. It's the exact same card; in fact, it's possible to CrossFire a 7970 and a 280X.


----------



## Fahrenheit85

Went to order, and the Gigabyte version was in stock, so I went with that, since GB has the best RMA department in the business in case of blank-screen/coil-whine issues.


----------



## Newbie2009

I have 2 Sapphire 290s, unlocked to 290Xs & no coil whine.


----------



## BlueW1zard

Hi guys

I'm new round here and looking for some advice.

I'm currently running a gtx 580 (PNY reference model) and was looking to buy a R9 290 reference model. I've seen one for around £290 which seems very good value and want to go ahead and purchase, but I do have a few concerns.

Is the reference cooler as bad as everyone says? I know this is subjective due to personal noise- and temperature-tolerance levels, but would I be making a grave mistake because it is actually a really poor-quality cooler?

My current GTX 580 is fairly loud under load (75% fan speed, 73°C), but this doesn't bother me in the slightest. I can hear it over my other fans, but I usually wear headphones and just ignore it.
The impression I get from the R9 290 reviews and forum posts is that it sounds like someone using a hairdryer next to you, where the noise is so loud you have to raise your voice to speak to each other. Is the blower cooler on the 290 louder than other blower fans?

I'm planning on setting a target temp of about 80°C (not comfortable with 95°C) and letting the fan run as fast as required to keep it from throttling. My case has average/adequate negative airflow, and my current system never reaches high temps. I would consider waiting for a non-reference release, but I doubt they will sell for £290 (pretty much the very top of my budget), and there is a chance I'd see higher system temps due to the hot air being blown into my case as opposed to out of the back.

TL;DR: Is the blower cooler on the 290 louder than other blower fans, specifically a GTX 580 ref? What's the build quality of the cooler?

Thanks in advance


----------



## Slomo4shO

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Ordered Rampage IV board, time to go IVY - E and get more 290's!!!!!


I was thinking the same thing, but I can't seem to find the Rampage IV BE in stock anywhere. My 4770K will likely hold back the quad 290s, at least in benching.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *BlueW1zard*
> 
> Hi guys
> 
> I'm new round here and looking for some advice.
> 
> I'm currently running a gtx 580 (PNY reference model) and was looking to buy a R9 290 reference model. I've seen one for around £290 which seems very good value and want to go ahead and purchase, but I do have a few concerns.
> 
> Is the reference cooler as bad as everyone is saying? I know this is subjective due to personnel noise and temperature tolerance levels but would I be making a grave mistake due to the fact it is actually a really poor quality cooler?
> 
> My current GTX580 is fairly loud when running under load (75% fan speed, 73 degrees c) but this doesn't bother me in the slightest. I can hear it over my other fans but I usually wear headphones and just ignore it.
> The impression I get from the r9 290 reviews and forum posts is that it sounds like someone is using a hairdryer next to you were the noise is that loud you have to raise your voice to speak to each other. Is the blower cooler on the 290 louder than other blower fans?
> 
> I'm planning on setting a target temp of about 80c (not comfortable with 95c) and letting the fan run as fast as required to keep it from throttling. My case has average/ adequate negative airflow and my current system never reaches high temps. I would consider waiting on getting a non-reference release but I doubt they will be sold for £290(pretty much the very top of my budget) and there is a chance I will have higher system temps due to the hot air being blown into my case as oppose to being blown out of the back.
> 
> TLDR: Is the blower cooler on the 290 louder than other blower fans, specifically a gtx580 ref? What's the build quality of the cooler?
> 
> Thanks in advance


Can't compare to the 580 because I've never heard one, but when my GTX 670 GTW's fan (680 ref fan) would spin up to 80%, it was about as loud as the 290's fan at 60%... So yes, it's very loud. To the point where, when I would try to talk to my roommates, I'd have to either yell or turn off my game so the card would cool down and the fan would slow down.

For blower-style coolers you should have positive airflow, so that air is forced into the blower and out the back. Negative airflow will only hurt the blower's cooling.


----------



## VSG

Quote:


> Originally Posted by *RAFFY*
> 
> Maybe I'm weird but I prefer to have all my graphics cards from the same manufacture. Get another Sapphire 290x. I've never had a problem with them nor have I had one with ASUS. Those would be my top two picks.


Will CrossFire be impacted if I get a card with Hynix memory, given that my Sapphire has Elpida? Anyone know? I am pretty tempted to go with a 290, especially if I get one with free BF4 too.


----------



## BlueW1zard

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Can't compare to the 580 because I've never heard one, but when my GTX 670 GTW's fan (680 ref fan) would spin up to 80% it was about as loud as the 290 fan at 60%... So yes, it's very loud. To the point where when I would try and talk to my roommates, I'd have to either yell or turn off my game so the card would cool down and the fan would slow down.
> 
> For blower style coolers you should have positive airflow, that way the air is forced into the blower and out the back. Negative airflow will only hurt the blowers cooling.


I see. That's pretty loud then if you had to yell.

Thanks


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *grandpatzer*
> 
> I can try that, never had this problem with the 7950 with same all around settings, so to me feels more like something else.
> 
> 
> 
> don't tell me your i5 is at stock? based on your sig, if that 290 is oc'ed that much it is faster than 2 7950s. you are bottlenecking it big time.
> 
> crossfire 7900 series
> 
> http://www.3dmark.com/3dm11/6233890
> 
> 290 oc'ed
> 
> http://www.3dmark.com/3dm11/7556915

I'm not sure what point you're trying to make, but those 7970 scores are weak, seriously weak. Comparing stock to overclocked leaves a blurry picture, imo.

http://www.3dmark.com/3dm11/7298100


----------



## Gunderman456

Quote:


> Originally Posted by *geggeg*
> 
> Will crossfire be impacted if I get a card with hynix memory given that my sapphire has elpida? Anyone know? I am pretty tempted to go with a 290, especially if I get one with free BF4 too.


No, crossfire will not be impacted.


----------



## nemm

Quote:


> Originally Posted by *geggeg*
> 
> Will crossfire be impacted if I get a card with hynix memory given that my sapphire has elpida? Anyone know? I am pretty tempted to go with a 290, especially if I get one with free BF4 too.


Mixed memory modules make no difference. I have two Sapphire 290s, one of each, and they run just fine. I used to get black screens, but that was caused by too high a core clock with too little voltage plus the vdroop varying too much; it had nothing to do with memory.


----------



## VSG

OK, thanks a lot, guys. Once I get the store credit (a week or so from now), I will get another 290/290X depending on how the deals look then. At this point, only Sapphire and Gigabyte have cards without warranty stickers, right? On the other hand, XFX has a lifetime, transferable warranty and is OK with watercooling for people in North America.


----------



## BlueW1zard

Any idea how much more the non-reference 290s will cost over the reference models?


----------



## tsm106

Quote:


> Originally Posted by *BlueW1zard*
> 
> Any idea how much more the non-reference 290's will cost over the reference models?


Most of the time they will be the same price or cheaper. It's only the high-end customs that demand a premium. Remember, it's much cheaper for the AIB partners to produce their own cards than to buy reference ones from AMD.


----------



## RAFFY

Quote:


> Originally Posted by *SpewBoy*
> 
> Yeah, 9.4 fixed the black screen issues for me at stock clocks. Overclocks are a different matter entirely. What appears stable in one game is cripplingly bad in another game, makes it difficult to find what is actually stable. And from the looks of it, to get stable in Ghosts the best overclock I can do is around... 1100mhz on the core ROFL. Memory is ok at 1362. Yet in other games, the core overclock is stable up to 1250mhz.
> 
> Speaking of the 9.4 drivers, they mentioned:
> My cards have -30% scaling in CF with this game LOL. And for some reason when I enable v-sync I get better, more stable FPS (without v-sync it hovers between 72 and 82 fps at a given location, with v-sync it hovers at 82 fps solid). I also get no noticeable input lag and believe me, I stay away from v-sync with a 10 foot pole because of input lag (I honestly don't notice or care about screen tearing one bit - I'm lucky). I guess it is because the game is capped at 90 fps and therefore I set my refresh rate to 90Hz, which can prevet input lag apparently.
> 
> I turned on v-sync because some guy on the Steam forums said he found a fix for CF. From my testing, it was simply the fact that he enabled v-sync that caused better fps, the CF scaling was still just as bad


I can't even begin to express how disappointed I am with COD: Ghosts. I hadn't played the game in almost a week, and before that it had been running well. Then they went and updated it, and that made the CF scaling worse again. Plus the game just looks like poop too. The gameplay is great, but the visuals and overall performance are just pathetic.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> They asked why the screws had paint off them, i said i dunno........don't think they believed me somehow lol


Ah, you just checked to make sure they were torqued to factory specs.

Quote:


> Originally Posted by *darkelixa*
> 
> Finding it hard to find any decent 280x in stock in australia, very tempted to just buy a 290


Get a 290 series card!!!!!
Quote:


> Originally Posted by *geggeg*
> 
> Will crossfire be impacted if I get a card with hynix memory given that my sapphire has elpida? Anyone know? I am pretty tempted to go with a 290, especially if I get one with free BF4 too.


Won't be a problem at all. The memory vendor on these cards doesn't matter: both have issues, and neither one overclocks better than the other.


----------



## BlueW1zard

Quote:


> Originally Posted by *tsm106*
> 
> Most of the time they will be same price or cheaper. It's only the high end customs that demand a premium. Remember, its much cheaper for the AIB partners to produce their own cards than to buy reference from AMD.


Really? That's great news.

I might hold out for a 3rd party option then.

Will negative airflow (just a large exhaust fan at the back) with just a side vent be sufficient for a non-reference cooler that blows hot air into the case? My system temps are fairly low with the current 580 blower cooler.


----------



## Forceman

Quote:


> Originally Posted by *grandpatzer*
> 
> really?
> 
> because the 1080 radiator is keeping the temps at 51c for this gpu (65c VRM 1, 52c VRM2)
> Those temps are on 1200core/1500memory heavy benching.
> 
> While Im playing CSGO I'm sure the temperatures are much lower....
> 
> I really need to have this GPU at 947mhz so I get my 144fps..........
> 
> The 7950 always was solid @ 950mhz


Oh, I thought you were on stock cooling. Did you turn up the power limit in CCC? Some people have had problems where it wouldn't "stick" in AB unless they also changed it in CCC.

Does it only happen in CSGO?


----------



## airisom2

Hey guys, I made a 290/X overclock thread if y'all are interested.

http://www.overclock.net/t/1447763/amd-r9-290-x-overclock-thread


----------



## Taint3dBulge

So I just sent my 290X off for RMA. Looks like Newegg has no Asus cards in stock... wonder if they will just refund my money?

Kinda sad that it's sent back now... It is a great card, but I just can't handle that high-pitched squeal, plus I want a 1440p monitor, and having only one DisplayPort is weird and gives me red lines. Anyway, I kinda hope I just get my money back and can get a Lightning or a DCII... god, I hope those hurry up and come out.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> I'm not sure what point you're trying to make but those 7970 scores are weak, seriously weak. Comparing stock to overclocked leaves a blurry picture imo.
> 
> http://www.3dmark.com/3dm11/7298100


Like I said, I know the 7900s in CrossFire are faster, but look at grand's sig. We should know by now that the 290s are just as fast as Titans (clock for clock), but I doubt you'll see a Titan paired with a stock SB i5.


----------



## RAFFY

Quote:


> Originally Posted by *Taint3dBulge*
> 
> So i just sent my 290x off for RMA. Looks like newegg has no Asus's in stock.. Wonder if they will just refund my money?


That's what they will do.


----------



## nievz

Is there a way to tell from the packaging whether a Sapphire 290 is unlockable or not?


----------



## Gilgam3sh

Quote:


> Originally Posted by *Redvineal*
> 
> Wanna see something crazy?
> 
> I wasn't at all happy with the VRM1 temps (~91C) the Accelero Xtreme III was producing with the rinky-dink heatsinks on my XFX R9 290 (does NOT unlock, btw).
> 
> As a matter of fact, the heat caused the thermal pads to fail and all 3 heatsinks fell off! I didn't want to go with the permanent glue, and here's why I'm glad I didn't:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What you see there is the result of 2 hours of quality time spent with a hacksaw! Yes, a HACKSAW. I don't own a rotary tool, and my local hardware stores were closed when I started around 9:30 PM.
> 
> Essentially, I took inspiration from some folks at Overclockers UK for the VRM1 heat and got to work on the stock heatsink! The back plate covering VRM1 was the easy part (thanks OCUK). On the other hand, the VRM2 plate up front was much more difficult. I had to work my way through the black plate, the copper plate, and finally the aluminum fins. Once I got the VRM2 cover separated from the rest of the sink, I used a pair of pliers to individually tear off the aluminum fins.
> 
> Told you it was crazy!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now here comes some gritty:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Furmark 10 Minute Run 1
> 
> Core: 1100
> Memory: 1300
> Power: +50
> mV: +0
> 
> GPU Max: 57C
> VRM1 Max: 65C
> VRM2 Max: 50C
> 
> 
> 
> Furmark 10 Minute Run 2
> 
> Core: 1150
> Memory: 1500
> Power: +50
> mV: +100
> 
> GPU Max: 65C
> VRM1 Max: 78C
> 
> 
> 
> 
> 
> 
> 
> 
> VRM2 Max: 56C
> 
> 
> 
> 
> 
> Pretty sweet results, right? If you've made it this far, thanks for checking out my work!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'd really like to know what you other R9 290(X) owners think of this Frankenstein of a mod, the results, and what other benches would be good to tax this monstrosity!
> 
> Also, if anyone is interested in more details about how I wound up with the VRM2 plate you see in the pics (where to cut, etc), I'll be happy to share.
> 
> Edit: Forgot to mention my idles
> GPU: 35C
> VRM1: 31C
> VRM2: 33C
> 
> 
> Spoiler: Warning: Spoiler!


For those of you thinking about this mod: it's great! I finished it a couple of hours ago. I only did the VRM1 part, since VRM2 hits just 55°C at load, so there's no need to make it harder than it is. Just go with the VRM1 part; it's very easy, and it's VRM1 that is the big problem, not VRM2. It took me only a minute to cut the VRM1 piece with the Dremel. A very good and easy mod for those of you running the Arctic Accelero Xtreme III cooler. Arctic should make this VRM piece and include it with the cooler.


----------



## RAFFY

Quote:


> Originally Posted by *nievz*
> 
> Is there a way to find out in the packaging if a Sapphire 290 is unlockable or not?


I do not believe so. I think the only way to tell physically is to see whether a certain connection has been cut with a laser. That's what someone stated a few hundred posts back.


----------



## ironhide138

I'm tempted to get a stock 290 and toss that new NZXT bracket on it with an H55.


----------



## Jpmboy

Quote:


> Originally Posted by *airisom2*
> 
> Hey guys, I made a 290/x overclock thread if ya'll are interested.
> 
> http://www.overclock.net/t/1447763/amd-r9-290-x-overclock-thread


So just post up numbers ... Sans stability assessment?


----------



## VSG

lol, I managed to convince Newegg to give me a full refund on my current card (Asus BF4 edition at $489, minus $15 in store credit because I missed out on using a coupon code, minus $45 from selling the BF4 code), which I am about to RMA with them, so that I could place an order for the Sapphire BF4 edition currently on sale at $499. Once I sell off this BF4 code too, I will essentially have gotten this 290X for about $400. Thanks a lot, awesome Newegg rep.


----------



## airisom2

Quote:


> Originally Posted by *Jpmboy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *airisom2*
> 
> Hey guys, I made a 290/x overclock thread if ya'll are interested.
> 
> http://www.overclock.net/t/1447763/amd-r9-290-x-overclock-thread
> 
> 
> 
> So just post up numbers ... Sans stability assessment?

Pretty much. I made the thread to hopefully see where the average limit is for these cards, and also to determine whether Elpida clocks lower or higher than Hynix (as well as whether aftermarket cards clock better because of their different PCB designs, whenever they're released).

I was thinking about adding stability-test requirements, but everyone uses different software to determine stability. An overclock may be unstable in Furmark but rock solid when gaming or running synthetic benchmarks. So I'll take their word for it if they say it's stable.

What I could do is add another column asking the user to state whether their card is "game/24-7" stable or only "bench" stable. How does that sound?
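For what it's worth, a roster row with that extra stability column could be modeled like this. This is a hypothetical Python sketch; the field names and sample entries are made up for illustration and are not the actual thread spreadsheet's format:

```python
# Sketch of an overclock-roster entry with a self-reported stability
# category, as discussed above. All names and sample data are invented.
from dataclasses import dataclass
from typing import Literal

Stability = Literal["24/7", "bench"]

@dataclass
class OcEntry:
    user: str
    card: str
    memory_vendor: str        # "Elpida" or "Hynix"
    core_mhz: int
    mem_mhz: int
    stability: Stability      # self-reported by the submitter

entries = [
    OcEntry("exampleUser", "Sapphire R9 290", "Elpida", 1100, 1375, "24/7"),
    OcEntry("anotherUser", "Asus R9 290X", "Hynix", 1250, 1500, "bench"),
]

# e.g. compare core clocks by memory vendor across submissions
hynix = [e.core_mhz for e in entries if e.memory_vendor == "Hynix"]
elpida = [e.core_mhz for e in entries if e.memory_vendor == "Elpida"]
print(sum(hynix) / len(hynix), sum(elpida) / len(elpida))
```

With real submissions in `entries`, the same two list comprehensions would answer the Elpida-vs-Hynix question the thread keeps raising.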


----------



## SpewBoy

Quote:


> Originally Posted by *rdr09*
> 
> gpu scaling in crossfire depends on the cpu, too. my i7 2700K at 4.5 was not able to push my 7950/7970 enough to get good scaling. and those are slower cards than these. check your cpu usage.


A moderate CPU bottleneck would not cause negative 30 percent GPU scaling with two GPUs when I can relieve any possible CPU bottleneck by increasing graphics settings until FPS dips below the 90 fps cap in Ghosts.

But you're right, I need to overclock my 4770K if I want to get the most out of these cards. There's just so much to overclock, and I'm still messing with BIOSes and have yet to get the perfect overclock out of these cards before I delve into my CPU and RAM.

Also, for anyone interested: the HIS 290X does *not* have a warranty sticker, but there is a warning on the box about modifications not being covered by warranty. All vendors have similar warnings on their sites, so take that how you will. Mine are under water.
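As an aside, the "scaling" percentages quoted in this back-and-forth reduce to one ratio; a minimal sketch (all fps numbers here are made-up examples, not anyone's benchmarks):

```python
# How CrossFire "scaling" figures like "-30%" are derived:
# the gain of the multi-GPU frame rate over the single-GPU one.

def cf_scaling(single_gpu_fps: float, crossfire_fps: float) -> float:
    """Return CrossFire scaling as a percentage gain over one GPU.

    +100 means the second GPU doubled the frame rate;
    a negative value means enabling CrossFire lost performance.
    """
    return (crossfire_fps / single_gpu_fps - 1.0) * 100.0

# Ideal doubling: 45 fps on one card, 90 fps on two -> +100%
print(round(cf_scaling(45, 90)))   # 100

# The "-30% scaling" case: the second GPU makes things worse.
print(round(cf_scaling(80, 56)))   # -30
```

So a negative figure only appears when the CrossFire frame rate is actually below the single-card one, which is why a mere CPU bottleneck (which would just flatten the gain toward 0%) doesn't explain it.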


----------



## Raephen

Today I found the time I needed to install the Aquacomputer 290(x) waterblock onto my card and into my loop.

I run my card at 1000/1250 with +50% power limit in AB (which does show up in CCC's Overdrive page, even though I haven't enabled Overdrive). I'm on v9.2, which has been trouble-free for me since I first installed it.

I let Valley run for a while to heat things up, then checked out my temps:



I took the liberty of doing some cut-and-paste work on the two images I would otherwise have posted.

Not bad at all, this Aquacomputer block

Here are some pictures:




@Arizonian: Please update my cooling to water.


----------



## SuprUsrStan

The Asus R9 290X comes with an unlocked-voltage BIOS, but what about the one from Sapphire? I've got both on their way to me right now, but I need to return one. Is the Sapphire BIOS overvoltage-locked?

EDIT: It appears AB beta 17 is able to overvolt +100mV even on the stock Uber Sapphire BIOS? If that's the case, I think I'll stick with my Sapphire cards, as they came with BF4.


----------



## darkelixa

Anyone got a Sapphire or XFX R9 290? Looking at buying one of these if they don't sound like a jet lol


----------



## Rar4f

Quote:


> Originally Posted by *darkelixa*
> 
> Anyone got an r9 290 sapphire or xfx 290? Looking at buying these if they dont sound like a jet lol


They sound like a jet. But you can replace the cooler with an Arctic Xtreme III or a waterblock.


----------



## Ukkooh

Quote:


> Originally Posted by *Raephen*
> 
> Today I found the time I needed to install the Aquacomputer 290(x) waterblock onto my card and into my loop.
> 
> I run my card at 1000 / 1250 +50% powerlimit in AB (which does show up in CCC's overdrive page, eventhough I haven't enabled Overdrive - I run v9.2, which has been trouble free for me since I first installed it)
> 
> I let Valley run for a time to heat things up, and check out my temps:
> 
> 
> 
> I took the liberty to do some cut and paste work on the two images I would have otherwise posted.
> 
> Not bad at all, this Aquacomputer block
> 
> Here are some pictures:
> 
> 
> 
> 
> @ Arizonian: Please update my cooling to Water, please


Why did you put your card on those anti-static bags? They are conductive on the outside, and that only makes it easier to zap your card... My eyes are bleeding now. Such a poor card.


----------



## darkelixa

Don't want to change the cooler, as it voids the warranty; I'd rather keep it all stock.


----------



## chiknnwatrmln

Anti-static bags are conductive on the inside as well.

They work by conducting static charges away, so there is no localized static discharge from the item inside the bag.


----------



## jerrolds

So how would we get TrueAudio to work? Do we have to use HDMI for the card to process the audio? What if my only display connection is DL-DVI; how can I take advantage of the 290X processing the audio?


----------



## Forceman

Quote:


> Originally Posted by *Syan48306*
> 
> The Asus R9 290X comes with an unlocked voltage bios but how about the one from Sapphire? I've got both on its way to me right now but I need to return one. Is the sapphire bios overvoltage locked?
> 
> EDIT: It appears AB beta 17 is able to overvolt +100mV even on the stock uber Sapphire bios? If that's the case, I think I'll stick with my Sapphire cards as they've got the BF4 game inside them.


AB can overvolt any card by +100mV; if you want more than that, you can flash the Asus BIOS or use the AB command-line switch.


----------



## Raxus

Quote:


> Originally Posted by *darkelixa*
> 
> Don't want to change the cooler as it voids warranty, would rather keep it all stock


Depends on the manufacturer.


----------



## MrWhiteRX7

I recommend using AB with the command line. When I flashed to the Asus BIOS and used GPU Tweak, I was unimpressed. AB (for me) works a LOT better. No issues.


----------



## ImJJames

PowerColor or HIS for the R9 290?


----------



## LazarusIV

What programs do you use for stability-testing your overclock? I was using Heaven, but my HIS card is artifacting at 1125 with +100mV and +50% power... suggestions?


----------



## ImJJames

Quote:


> Originally Posted by *LazarusIV*
> 
> What programs do you use for stability testing your overclock? I was using heaven but my HIS card is artifacting at 1125 with +100 mV and +50% power... suggestions?


Increase voltage?


----------



## Jpmboy

Quote:


> Originally Posted by *airisom2*
> 
> Pretty much. I made the thread to hopefully see where the average limit will be for these cards, and to also determine if Elpida clocks lower or higher than Hynix (as well as aftermarket cards clocking better because of different pcb design whenever they're released).
> 
> I was thinking about adding stability test requirements, but everyone uses different software to determine stability. The overclock may be unstable on Furmark, but it's rock solid when gaming, or when running synthetic benchmarks. So, I'll take their word if they say it's stable.
> 
> What I could do is add another column inferring the user to determine if their card is either "game or 24/7" stable, or "bench" stable. How does that sound?


why not something free and simple like 3DMK11 or Firestrike (or even catzilla) and post only the graphics score + a gpuZ shot?


----------



## Jpmboy

Quote:


> Originally Posted by *Raephen*
> 
> Today I found the time I needed to install the Aquacomputer 290(x) waterblock onto my card and into my loop.
> I run my card at 1000 / 1250 +50% powerlimit in AB (which does show up in CCC's overdrive page, eventhough I haven't enabled Overdrive - I run v9.2, which has been trouble free for me since I first installed it)
> I let Valley run for a time to heat things up, and check out my temps:
> 
> I took the liberty to do some cut and paste work on the two images I would have otherwise posted.
> Not bad at all, this Aquacomputer block
> Here are some pictures:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> @ Arizonian: Please update my cooling to Water, please


AQ makes excellent stuff. I've had two of their copper blocks on my 7970s for almost 2 years - never get above 48C!
nice job!


----------



## darkelixa

Sapphire r9 290

http://ijk.com.au/branch/ijk/product_info.php?cPath=353_340&products_id=146897

Would my PSU be strong enough for that GPU? It would need to power the R9 290, my AMD 8350, an Asus Sabertooth 990FX R2.0, 16 GB DDR3, one 128 GB SSD, one 2 TB HDD, and 4 case fans.

My psu is a http://ijk.com.au/branch/ijk/product_info.php?manufacturers_id=216&products_id=135658

FSP FSP-AM-750MG

Many thanks in advance


----------



## Raephen

Quote:


> Originally Posted by *Ukkooh*
> 
> Why did you put your card on those anti static bags? They are conductive from the outside and it only makes it easier to zap your card.. My eyes are bleeding now. Such a poor card.


Ah, is that so?

Well, we learn new things each day then.

I guess I should count my blessings then. I've treated more than a few cards in the same manner to no ill effect. Lucky me.

Out of interest: would cutting up those bags and using the inside as an improvised mat be a better idea?


----------



## airisom2

Quote:


> Originally Posted by *Jpmboy*
> 
> why not something free and simple like 3DMK11 or Firestrike (or even catzilla) and post only the graphics score + a gpuZ shot?


Yeah, I guess I'm trying to make something more complicated than it should be. Catzilla it is! Maybe putting it in the overclock thread will give it some much needed attention from OCN.


----------



## Forceman

Quote:


> Originally Posted by *darkelixa*
> 
> Sapphire r9 290
> 
> http://ijk.com.au/branch/ijk/product_info.php?cPath=353_340&products_id=146897
> 
> Would my psu be strong enough for that gpu. The psu would need to be able to power the r9 290,my amd 8350, asus sabertooth 990fx rv2, 16gb ddr3, 1 128xgb ssd, 1 2tb hdd, 4 case fans.
> 
> My psu is a http://ijk.com.au/branch/ijk/product_info.php?manufacturers_id=216&products_id=135658
> 
> FSP FSP-AM-750MG


750W is plenty, although I don't know anything about that unit specifically.
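As a rough sanity check, you can add up typical published TDP/board-power figures for that build; the numbers below are ballpark spec-sheet values (assumptions), not measurements of this exact system:

```python
# Ballpark power budget for the build in question vs. a 750 W PSU.
# Figures are typical published TDP/board-power values, not measurements.
tdp_watts = {
    "R9 290 (typical board power)": 275,
    "FX-8350 (CPU TDP)": 125,
    "Motherboard + 16 GB DDR3": 60,
    "SSD + HDD": 15,
    "4 case fans": 10,
}
total = sum(tdp_watts.values())
headroom = 750 - total
print(f"Estimated peak draw: {total} W, headroom: {headroom} W")
```

Even with everything loaded at once, that leaves a couple hundred watts of headroom, which is why 750 W is comfortable for a single 290.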


----------



## stickg1

FSP makes some decent power supplies. But not all of their models are top quality. I don't know much about that specific model but I would think you should be fine.


----------



## Raephen

Quote:


> Originally Posted by *Jpmboy*
> 
> AQ makes excellent stuff. I've had two of their copper blocks on my 7970s for almost 2 years - never get above 48C!
> nice job!


I used MX-4 for the core, memory and just the smallest possible dab between the VRMs and the thermal pads.

It works like a charm.


----------



## darkelixa

What would you need to know about the PSU?

I was told when I bought it that it was one of the best-quality PSUs available, with its Gold rating and all.


----------



## Ukkooh

Quote:


> Originally Posted by *Raephen*
> 
> Ah, is that so?
> 
> Well, we learn new things each day then.
> 
> I guess I should count my blessings then. I've treated more than a few cards in the same manner to no ill effect. Lucky me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Out of interest, cutting up those bags and using the inside as a improvised mat would be a better idea?


From what I've heard they rely on a Faraday-cage effect, and cutting one up would ruin it. Just putting the card on the table should be safe enough.


----------



## stickg1

Quote:


> Originally Posted by *darkelixa*
> 
> What would you need to know about the Psu?
> 
> I was told when i bought it was one of the best quality psus available , with its gold rating and all


Hardware Secrets gave it a decent review. Ripple is a little high at 750 W for such an expensive unit, but other than that it's made of high-quality components and holds true to its Gold rating. You shouldn't have any concerns with it.


----------



## Neutronman

Curious about coil whine/buzz on a reference R9 290. Currently I get exceedingly loud coil whine/buzz when running 3DMark, but nothing when I run Heaven or Valley, and nothing when I play BF4. Not sure why I only get this in 3DMark. Anyone else noticed this???


----------



## Jack Mac

I don't have coil whine, but the fan on my card has a weird pitch to it at high speeds.


----------



## Jpmboy

Quote:


> Originally Posted by *Raephen*
> 
> I used MX-4 for the core, memory *and just the smallest possible dab between the VRMs and the thermal pads.*
> 
> It works like a charm.
> 
> 
> Spoiler: Warning: Spoiler!


That's all it takes!


----------



## VSG

How do you guys even manage to put on something like MX4 on those tiny VRMs?


----------



## Forceman

Quote:


> Originally Posted by *Neutronman*
> 
> Curious about coil whine/buzz on a reference R9 290. Currently I have exceeding loud coil whine/buzz when running 3D Mark. Nothing when I run Heaven or Valley and nothing when I play BF4. Not sure why I get this when I run 3D Mark. Anyone else noticed this???


Coil whine is normally worst/most noticeable in high-FPS situations, so the low-stress parts of 3DMark could be making it noticeable. It's the same for me: those early tests in the new version have noticeable coil whine, even though BF4 isn't bad.


----------



## darkelixa

Ah, fantastic. I thought with such a power-hungry CPU/video card the PSU might not have been good enough, but it looks like it is, as you say.

Is Sapphire a good brand to buy, or XFX? They're the main choices I have available in stock.


----------



## Scorpion49

Quote:


> Originally Posted by *Raephen*
> 
> Ah, is that so?
> 
> Well, we learn new things each day then.
> 
> I guess I should count my blessings then. I've treated more than a few cards in the same manner to no ill effect. Lucky me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Out of interest, cutting up those bags and using the inside as a improvised mat would be a better idea?


ESD bags are fine to use as a work mat, *as long as the devices have no power*. They are conductive, but it makes no difference if you put it on the outside of the bag or on the table. If you put a motherboard on the bag and then power it on, you'll have a problem, otherwise it is harmless. Your clothes and carpet pose a much greater risk to your parts.


----------



## Raephen

Quote:


> Originally Posted by *Jpmboy*
> 
> That's all it takes!


Indeed.

And for those wondering what the smallest possible dab is (I should have mentioned this before, really): just touch the chip with the end of your TIM spout and go with whatever sticks.

That'll do nicely, I feel, for filling the gaps between the VRMs and the padding.


----------



## shilka

Quote:


> Originally Posted by *darkelixa*
> 
> Ah fantastic , I thought with such a power hungry cpu/video card the psu might not of been good enough to power but looks like it is as you say
> 
> Is the sapphire a good brand to purchase or xfx, they are the main choices i have available in stock


Just so you know, 80 Plus has zero to do with quality.

FSP makes their own units, one of the few brands that does; Corsair and Cooler Master, just to name two, do not.

The normal FSP Aurum is alright, but not outstanding.

If you don't overvolt, you could run two cards on your PSU, no problem.

So no, you don't need a new one.


----------



## Neutronman

Quote:


> Originally Posted by *Forceman*
> 
> Coil whine is normally worst/most noticeable in high-FPS situations, so the low-stress parts of 3DMark could be making it noticeable. It's the same for me: those early tests in the new version have noticeable coil whine, even though BF4 isn't bad.


Just extremely weird: the noise is continuous in 3DMark even when FPS are lower than 60. I can't hear any buzz/whine at similar FPS in Valley, Heaven, or games.


----------



## ReHWolution

Quote:


> Originally Posted by *Neutronman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> Coil whine is normally worst/most noticeable in high-FPS situations, so the low-stress parts of 3DMark could be making it noticeable. It's the same for me: those early tests in the new version have noticeable coil whine, even though BF4 isn't bad.
> 
> 
> 
> Just extremely weird, the noise is continuous in 3D mark even when fps are lower than 60.... I am not able to hear any buzz/whine when the fps are similar in Valley, Heaven or Games....

When you're GPU-limited (like in 3DMark), coil whine is ordinary... it just means your card is pushed to the limit.


----------



## SpewBoy

Quote:


> Originally Posted by *ReHWolution*
> 
> When you're VGA limited (like in 3dmarks) coil whine is ordinary... It just means your card is pushed to the limit


It's just your card getting all excited about how many frames it is pushing. Screams like a little girl, but they are screams of excitement, not anguish.


----------



## tsm106

Quote:


> Originally Posted by *Jpmboy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Raephen*
> 
> Today I found the time I needed to install the Aquacomputer 290(x) waterblock onto my card and into my loop.
> I run my card at 1000 / 1250 +50% powerlimit in AB (which does show up in CCC's overdrive page, eventhough I haven't enabled Overdrive - I run v9.2, which has been trouble free for me since I first installed it)
> I let Valley run for a time to heat things up, and check out my temps:
> 
> I took the liberty to do some cut and paste work on the two images I would have otherwise posted.
> Not bad at all, this Aquacomputer block
> Here are some pictures:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> @ Arizonian: Please update my cooling to Water, please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> AQ makes excellent stuff. I've had two of their copper blocks on my 7970s for almost 2 years - *never get above 48C!*
> nice job!

Waterblock temps are not comparable between loops because it's dependent upon individual loop surface area.

The thing about that pic is the gap between the vrm temps. The AC block is falling behind on vrm1 cooling.


----------



## Jpmboy

Quote:


> Originally Posted by *tsm106*
> 
> Waterblock temps are not comparable between loops because it's dependent upon individual loop surface area.
> The thing about that pic is the gap between the vrm temps. The AC block is falling behind on vrm1 cooling.


With my EK block, vrm 1 is usually hotter than #2. Always less than 60C. No doubt, the Aq blocks work well and look fantastic:


----------



## tsm106

AC really love their bling lol.

My vrm temps stay under 38c-40c on both 1 and 2. I use temp probes on the block and the back vrms on the RIVE too.


----------



## Neutronman

Quote:


> Originally Posted by *SpewBoy*
> 
> It's just your card getting all excited about how many frames it is pushing. Screams like a little girl, but they are screams of excitement, not anguish.


Ha, perhaps. It's a lot louder than any other card I have owned in the last 3 years, though.

It's going to be even more noticeable with a waterblock installed.

This is an AMD issue for sure: my GTX 780 and my GTX 670s were all silent (no coil whine). This is the first AMD card I have owned since the original 7970 was released and, if I recall, that also had coil whine.


----------



## Jack Mac

Quote:


> Originally Posted by *Neutronman*
> 
> Ha, perhaps. A lot louder than any other card I have owned in the last 3 years though.
> 
> It's going to be even more noticeable with a waterblock installed.
> 
> This is an AMD issue for sure, my GTX 780 and my GTX 670's were all silent (no coil whine). This is the first AMD card I have owned since the original 7970 was released and if I recall this also had coil whine....


It's not just an AMD issue; my MSI 670 PE had coil whine but my 290 doesn't.


----------



## Neutronman

Quote:


> Originally Posted by *Jpmboy*
> 
> With my EK block, vrm 1 is usually hotter than #2. Always less than 60C. No doubt, the Aq blocks work well and look fantastic:


I love the look of the Aqua Blocks, I have an EK Nickel version being delivered tomorrow. The silver and blue looks fantastic. I'll post back when I have everything installed.


----------



## SpewBoy

Every card I've ever owned after my GTX 295s (6970, 2x 560 Ti, 2x 670 FTW, 2x 290X) had coil whine. I just kind of take it as a given these days that I'm going to get coil whine. That said, it doesn't bother me too much because, although I sleep in the same room as my PC, I don't game while I sleep (AFAIK) and the sound of the game generally drowns out the coil whine. That, and during gaming it usually isn't so bad. I limit the FPS to 120 at 120 Hz if I can. It is only during benching and menu screens that I get either lower-pitched coil whine (I often get it in Valley) or high-pitched screams (2000 FPS menu screens, etc.).

I totally get that you water cool for lower noise and coil whine somewhat ruins that point but from my experience (maybe I'm unlucky), most cards have it and it is not a "fault" in the card because the card still does its job just fine and runs within spec. I've learned to appreciate the noise as my GPUs putting in a good frame's work.

It gives 'em character


----------



## Raephen

Quote:


> Originally Posted by *tsm106*
> 
> Waterblock temps are not comparable between loops because it's dependent upon individual loop surface area.
> 
> The thing about that pic is the gap between the vrm temps. The AC block is falling behind on vrm1 cooling.


To be sure, there is a gap between the VRM temperatures.

But then, with the stock cooler the gap was much the same, while 30ish degrees higher.

They may not be the best, but I'm more than satisfied


----------



## binormalkilla

Anyone running 290Xs in Crossfire with the Rampage IV Black Edition? Battlefield 4 freaks out on me: I get some stuttering and audio clipping (with a lot of I/O activity according to my HDD LED), then a screen full of snow pattern, and it locks up. Only in BF4, though; I can bench and play PlanetSide 2 just fine. These are the same graphics cards that were running fine in another socket 1155 system. I'm also running 3-monitor Eyefinity in portrait mode. I tried removing my ASUS Xonar Essence STX sound card, but that didn't affect it.

I should also mention that I don't know if they've patched anything in the 2 weeks that I haven't played, and I'm running different graphics drivers now (13.11 beta 4).


----------



## Jpmboy

Quote:


> Originally Posted by *tsm106*
> 
> AC really love their bling lol.
> 
> My vrm temps stay under 38c-40c on both 1 and 2. I use temp probes on the block and the back vrms on the RIVE too.


probably that fujipoly! I used the stock pads. Yes, those are good temps.


----------



## robert0507

Quote:


> Originally Posted by *Jpmboy*
> 
> probably that fujipoly! I used the stock pads. Yes, those are good temps.


Yeah, must be the Fujipoly... damn. I have the Koolance block and my VRM temps are like 59 max for VRM1 and 38 max for VRM2. Of course, this is with an overclock of 1200 GPU and 1500 mem. At stock, my VRM1 temp never goes above 54. But yeah, Fujipoly... man.


----------



## igrease

Hey guys. I may be getting an R9 290 soon. The last time I bought an AMD graphics card was about a month after the 7950 had just come out, and I was extremely disappointed by its overall performance. I even overclocked it to 1100/1400, and my 560 Ti Twin Frozr still ended up getting more FPS in most games I played than the OC'd 7950. I currently play Battlefield 4 most of the time. How is this card's performance in that game at 1080p? I tried to find some YouTube videos, but nothing really showed the actual FPS. I just don't want to end up regretting this decision the second time around. Also, when the hell are the non-reference versions coming out?

Thanks.


----------



## rdr09

Quote:


> Originally Posted by *igrease*
> 
> Hey guys. I may be getting a r9 290 soon. The last time I bought an AMD graphics card was about a month after the 7950 had just came out. I was extremely disappointed by the overall performance of the card. I even overclocked it to 1100/1400. My 560 Ti Twin Frozr still ended up getting more fps in most games I played over the OC'd 7950. I currently play Battlefield 4 most of the time. How is this cards performance in this game @ 1080p? I tried to find some Youtube videos but nothing really showed the actual fps. I just don't want to end up regretting this decision for the second time around. Also when the hell are the non-reference versions coming out?
> 
> Thanks.


well, today the 7950 can max out BF4 at 1080. the 290 is about 20% - 30% faster. if you are still using the i5, i suggest you oc it to at least 4.5GHz.


----------



## igrease

Quote:


> Originally Posted by *rdr09*
> 
> well, today the 7950 can max out BF4 at 1080. the 290 is about 20% - 30% faster. if you are still using the i5, i suggest you oc it to at least 4.5GHz.


Trust me, I would love to overclock my 2500k to 4.5ghz. I can't however. My motherboard (p8p67 LE) doesn't allow me to manually adjust my voltage. The first time I attempted to overclock I just threw it on Asus Optimal and it actually ran stable(did a 10 hour + stream session) for about 2 weeks and then all of a sudden while I was messing with my stream settings it said Overclock failed and never booted again. I believe it was using 1.4 volts for a 4.3ghz overclock, lol. As of right now I don't really have the money to be messing around with my cpu overclock with this board because if my motherboard shorts out again then I won't have a PC for quite a while.

So what kind of frames would I be looking at while playing Battlefield 4 with this card? I plan to get a Benq 144hz monitor in the future. Of course I know I won't be playing BF4 on Ultra at 100+ fps.


----------



## stickg1

Quote:


> Originally Posted by *igrease*
> 
> Hey guys. I may be getting a r9 290 soon. The last time I bought an AMD graphics card was about a month after the 7950 had just came out. I was extremely disappointed by the overall performance of the card. I even overclocked it to 1100/1400. My 560 Ti Twin Frozr still ended up getting more fps in most games I played over the OC'd 7950. I currently play Battlefield 4 most of the time. How is this cards performance in this game @ 1080p? I tried to find some Youtube videos but nothing really showed the actual fps. I just don't want to end up regretting this decision for the second time around. Also when the hell are the non-reference versions coming out?
> 
> Thanks.


Did you do a clean driver install? It sounds ridiculous that your 7950 would be outperformed by a 560 Ti. Having owned both of those cards myself, I'm almost certain this is user error.


----------



## rdr09

Quote:


> Originally Posted by *igrease*
> 
> Trust me, I would love to overclock my 2500k to 4.5ghz. I can't however. My motherboard (p8p67 LE) doesn't allow me to manually adjust my voltage. The first time I attempted to overclock I just threw it on Asus Optimal and it actually ran stable(did a 10 hour + stream session) for about 2 weeks and then all of a sudden while I was messing with my stream settings it said Overclock failed and never booted again. I believe it was using 1.4 volts for a 4.3ghz overclock, lol. As of right now I don't really have the money to be messing around with my cpu overclock with this board because if my motherboard shorts out again then I won't have a PC for quite a while.
> 
> So what kind of frames would I be looking at while playing Battlefield 4 with this card? I plan to get a Benq 144hz monitor in the future. Of course I know I won't be playing BF4 on Ultra at 100+ fps.


can't really help you there; maybe others can, 'cause i only play at 1080. i know in the BF4 beta with HT off at 4.5, my cpu usage on all 4 cores was above 90%. SP may not pose a problem, but MP will rely more on cpu speed. Hopefully mantle will relieve most of our cpu bottlenecks.

anyways, i was averaging 60 fps using the 7950, with dips to the high 30s. that was in BF4 maxed at 1080. the game is like C3 in that even if it dips that low it stays smooth. i'd imagine you'll need to oc the 290 a bit for 144Hz, but it will really need help from a fast cpu.
Quote:


> Originally Posted by *stickg1*
> 
> Did you clean install drivers? That sounds ridiculous that your 7950 would be outperformed by a 560ti. Having owned both of those cards myself I am almost certain this is user error.


at launch the 7900 series was lackluster. 12.11 changed all that.


----------



## stickg1

Idk, I'm pretty sure about a year ago I had 7950s and 7970s that I used on older drivers and was pleased, having come from a 560 Ti. Maybe the games he's referring to were some of the few that were not AMD-optimized? Still seems kind of silly.


----------



## chiknnwatrmln

I know some recent (within 3 or so months) AMD drivers really boosted performance.

My first GPU (670 FTW, oc in sig) used to be better than my 2nd (7950), but then when the new drivers came out my 7950 leapt ahead of the 670. Albeit not by much, but my 670 had a much higher oc on it.


----------



## igrease

Quote:


> Originally Posted by *stickg1*
> 
> Did you clean install drivers? That sounds ridiculous that your 7950 would be outperformed by a 560ti. Having owned both of those cards myself I am almost certain this is user error.


Quote:


> Originally Posted by *stickg1*
> 
> Idk, I'm pretty sure about a year ago I had 7950s and 7970s that I used on older drivers and was pleased having come from a 560ti. Maybe the games he's referring to were some of the few that were not AMD optimized? Still seems kind of silly.


Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I know some recent (within 3 or so months) AMD drivers really boosted performance.
> 
> My first GPU (670 FTW, oc in sig) used to be better than my 2nd (7950), but then when the new drivers came out my 7950 leapt ahead of the 670. Albeit not by much, but my 670 had a much higher oc on it.


Yeah, all my drivers were fresh and cleanly installed. I tried many different versions that were out at the time. The main games I played then were Battlefield 3, Bad Company 2, and Tera Online, and the 560 Ti performed better in all three. Honestly, I think I should have kept the card, because at least I could have CrossFired them now, but at the time, having bought a $450+ card only to have my 560 outperform it was just, nah.

But yeah, like I said, I don't plan on playing the game on Ultra at 100+ FPS. Maybe everything on High to Medium to get 100+ frames or so.


----------



## binormalkilla

ATI drivers have truly hit an all-time low. After experiencing tons of crashing problems with the latest beta, I downloaded a previous version from Guru3D (since ATI doesn't offer anything other than beta drivers for the 290X on their site). After the driver uninstalled, I was met with full-screen graphical corruption, rendering my screen completely unreadable.

On top of that, there isn't a way (that I know of) to enter safe mode in Windows 8.1, so I'm left hunting for a fix while booted into Linux, or simply reinstalling Windows. If anyone has any suggestions, I'm all for it.

Has ATI even released a non-beta driver for these 290Xs yet?


----------



## magicase

Are there any reviews for a 290 Tri CF? I got refunded my 1st 7990 and might sell my 2nd 7990 if the performance is good for 290 Tri CF.


----------



## Scorpion49

Quote:


> Originally Posted by *binormalkilla*
> 
> ATI drivers truly have hit an all time low. After experiencing tons of crashing problems with the latest beta, I downloaded a previous version from Guru3d (since ATI doesn't offer any driver other than beta drivers for 290X's on their site). After the driver uninstalled I was met with full screen graphical corruption, rendering my screen completely unreadable.
> 
> On top of that, there isn't a way (that I know of) to enter safe mode in Windows 8.1, so I'm left with hunting for a fix while booted into Linux, or simply reinstall Windows. If anyone has any suggestions I'm all for it.
> 
> Has ATI even released a non-beta driver for these 290X's yet?


For safe mode, hold down the left Shift key while you click Restart and it will come up with the recovery menu. I have the 13.11 WHQL drivers, but I don't know where they hide them on their website. I wish they would take a hint from Nvidia and have a page with the last 3 years' worth of previous beta and WHQL drivers on it.


----------



## binormalkilla

Quote:


> Originally Posted by *Scorpion49*
> 
> For safe mode, hold down left shift key while you click reboot, it will come up with the menu. I have the 13.11 WHQL drivers but I don't know where they hide it on their website, I wish they would take a hint from Nvidia and get a page with like the last 3 years worth of previous beta and WHQL on it.


I'm unable to even see enough to click, and the shift+F8 trick doesn't work. At this point it takes less time to just reinstall.

ATI used to keep all their previous drivers on their site, but recently removed them. Their driver quality has been on a slow, steady decline for the last 3 years or so. They really need to shape up. This is coming from someone that has been using ATI drivers since before they even named them 'Catalyst'.


----------



## Scorpion49

Quote:


> Originally Posted by *binormalkilla*
> 
> I'm unable to even see enough to click, and the shift+F8 trick doesn't work. At this point it takes less time to just reinstall.
> 
> ATI used to keep all their previous drivers on their site, but recently removed them. Their driver quality has been on a slow, steady decline for the last 3 years or so. They really need to shape up. This is coming from someone that has been using ATI drivers since before they even named them 'Catalyst'.


Oooh, that's rough. Got onboard graphics you could use? I don't comment on AMD drivers any more; it just starts a flame war every time. I do keep track of the issues I have with each side, though; like you, I've been using them for a long time. My first ATi card was a Rage 3D Pro 8MB.


----------



## binormalkilla

Quote:


> Originally Posted by *Scorpion49*
> 
> Oooh thats rough. You got onboard you could use? I don't comment about AMD drivers any more, it just starts a flame war every time. I do keep track of the issues I have with each side though, like you I've been using them for a long time. My first ATi card was a Rage 3D Pro 8MB


I have onboard and other cards as well, but I've had so many problems with the ATI drivers on this new Rampage IV Black Edition build that I'm just going to start with a clean slate. The first time I tried to install the beta driver after the initial Windows installation, it hard-locked my PC at 'Installing Catalyst Control Center'... several times in a row. Weirdly enough, the drivers installed... just not CCC.


----------



## Scorpion49

Quote:


> Originally Posted by *binormalkilla*
> 
> I have onboard and other cards as well, but I've had so many problems on this new rampage iv black edition build with the ati drivers that I'm just going to start with a clean slate. The first time I tried to install the beta driver after the initial windows installation, it hard locked my PC at 'installing catalyst control center'..... several times in a row. Weirdly enough the drivers installed....just not CCC.


The 9.4 betas don't have CCC, but I know what you mean about a clean slate. It took me two fresh installs to stop having serious problems all the time; now the cards run pretty well.


----------



## ZealotKi11er

Quote:


> Originally Posted by *binormalkilla*
> 
> I'm unable to even see enough to click, and the shift+F8 trick doesn't work. At this point it takes less time to just reinstall.
> 
> ATI used to keep all their previous drivers on their site, but recently removed them. Their driver quality has been on a slow, steady decline for the last 3 years or so. They really need to shape up. This is coming from someone that has been using ATI drivers since before they even named them 'Catalyst'.


I know you've had ATi cards since the dawn of time, but that won't help you, lol. I had to reinstall Windows 7 to get my 290X to work. Part of my problem was having HD 7970 drivers installed, and the drivers being beta. Right now the 290X drivers are really bad; we just have to be patient. They do have 13.11 drivers you can try.


----------



## Jack Mac

The drivers need a bit more work, but they're usable.


----------



## darkelixa

Are all the horror stories true about the R9 290s having black screens, not booting, and needing to be RMA'd straight away in the hope of getting one that works, or is this all just a myth? Trying to decide between the Sapphire and XFX cards.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Jack Mac*
> 
> The drivers need a bit more work, but they're usable.


True, but they're at a low point. They still need to fix CCC. Also, BF4 is buggy in general, though as far as in-game problems go, I haven't had one.


----------



## Jack Mac

Quote:


> Originally Posted by *darkelixa*
> 
> Is it true all the horror stories with the r9 290s having black screens and not booting and having to be Rmaed straight away in hope of getting one that works or is this all just a myth. Trying to decide between the Sapphire or XFX branded cards


Haven't had any black-screen issues. I have had a few weird ones, but that was probably my fault: I lowered clocks and voltages in AB because the card wasn't downclocking to 2D for whatever reason, and I got some hard locks, an ATI bluescreen, and tons of artifacting before I was able to get into Windows and reset my AB settings.


----------



## binormalkilla

The weird part is that my 290Xs worked perfectly fine with my Z77-UD5H board in Windows 8.1. That was with another driver, though. Well, I'm off to try installing again.


----------



## darkelixa

I won't be changing any clocks or voltages.

I just want a powerhouse card that will play Final Fantasy: A Realm Reborn, Crysis 3, and BF4 at 60 FPS, 1920x1080.


----------



## VSG

Then why spend the money on a voltage-unlocked card? A non-reference Nvidia 770 or AMD R9 280X will do that for you (maybe with Crysis 3 toned down a bit in the filters).


----------



## darkelixa

I own a GTX 770 and all it does is stutter in both of those games, while my 5850 and 7870 do not, so I'm looking to change sides. Later on down the track, I'll look into overclocking the card.


----------



## darkelixa

Is the only difference between the Gigabyte, MSI, and Sapphire 290s the brand on the side, or are there real differences?


----------



## lethal343

Hey guys, just been busy with uni...

WHAT'S THE GO WITH THE R9 290(/X) CARDS?

Have the black screens faded away, or do we still have heaps of problems?

With 13.11 v9.4 beta I have NOT encountered any black screens; however, my cards RANDOMLY set themselves to absolute max fan speed, and the PC locks up at idle so I have to hard-reboot.

A mixture of other problems, but no black screens, though.

Essentially, I have transitioned from black screens to blue screens.

What is going on? Have AMD said anything yet?

I have 2x Gigabyte R9 290Xs in CF on an X79S-UP5 motherboard with an i7-4960X CPU.

Comments appreciated.


----------



## HardwareDecoder

count me in just ordered a 290x and an EK waterblock


----------



## darkelixa

So it's not advised to buy a 290 yet unless you want blue/black screens?


----------



## the9quad

Quote:


> Originally Posted by *darkelixa*
> 
> So it's not advised to buy a 290 yet unless you want blue/black screens?


I bought a Sapphire 290X on launch day, and it had black screen issues. Since then I have gotten three more 290Xs, all from Sapphire, and they all work flawlessly.


----------



## darkelixa

Ah very nice!! I was looking at buying just the non-X Sapphire 290.

How loud are they during gaming with the stock cooler on?


----------



## Roaches

Hopefully non-reference designs will have more mature BIOSes out of the box and better drivers by then; we haven't seen a WHQL-certified driver from AMD yet.


----------



## HardwareDecoder

Quote:


> Originally Posted by *the9quad*
> 
> I bought a Sapphire 290X on launch day, and it had black screen issues. Since then I have gotten three more 290Xs, all from Sapphire, and they all work flawlessly.


Ahhh, good, that puts my mind at ease. Thank you, going to bed happy now; I hadn't even heard of this black screen thing until 20 minutes ago.
Quote:


> Originally Posted by *darkelixa*
> 
> Ah very nice!! I was looking at buying just the non-X Sapphire 290.
> 
> How loud are they during gaming with the stock cooler on?


You weren't asking me, and I don't own one yet (well, I do, but it isn't here yet and it's going directly on water), but I'll answer anyway: stupidly loud and hot, according to every review.

But one of the best cards to put on water ever.


----------



## darkelixa

So during gaming, if someone were watching TV behind me, the card would be loud enough to interrupt their program?


----------



## Roaches

Quote:


> Originally Posted by *darkelixa*
> 
> So during gaming, if someone were watching TV behind me, the card would be loud enough to interrupt their program?


Courtesy of Callsign Vega @ OCN


----------



## darkelixa

Thank you for the video.

Apparently Sapphire will be releasing custom-cooled R9 290s in the next couple of weeks.


----------



## maynard14

Hi guys... I'm just curious about the R9 290X Hawaii info.

Can you please post your Hawaii info, using the Hawaii info tool from the first post of the R9 290 unlock thread?

I really want to see info from a real 290X.

I would like to compare it to my R9 290 that I BIOS-modded and unlocked to a 290X.

Please post it here... thank you so much.


----------



## Arizonian

Quote:


> Originally Posted by *darkelixa*
> 
> Ah very nice!! I was looking at buying just the non-X Sapphire 290.
> 
> How loud are they during gaming with the stock cooler on?


Quote:


> Originally Posted by *darkelixa*
> 
> Thank you for the video.
> 
> Apparently Sapphire will be releasing custom-cooled R9 290s in the next couple of weeks.


The poster of the video above fails to mention it's at 100% fan speed. No one games at 100% fan.

Here is some real world usage.



TweakTown has more extensive noise-level testing comparing more GPUs - *LINK*

Uber mode is where you'll be gaming. With higher overclocks you'd want more fan speed, and past 60% it does get loud if you've got someone else in the room watching TV, IMO. Hope the chart comparing other GPUs helps you figure out where you'll sit.

Non-reference cooling solutions, out mid-December to the first week of January, will take care of the noise and tame the temps too.

Quote:


> Originally Posted by *Roaches*
> 
> Hopefully non-reference designs will have more mature BIOSes out of the box and better drivers by then; we haven't seen a WHQL-certified driver from AMD yet.


13.11 WHQL

http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64


----------



## Faint

Just bought the Sapphire R9 290 (with BF4). I will be getting it in a couple days.


----------



## darkelixa

Let me know how it goes and how loud it is; I'm almost about to buy the same card.


----------



## JordanTr

It's loud, but it's quite OK if you don't go above 50% fan speed. Waiting for Friday to order a Gelid Icy Vision myself.


----------



## broken pixel

I was going to wait for the AIB 290Xs to appear, but I have decided to get two reference 290X GPUs. I sold six 7950s, so that pays for the 290Xs. In a month I will sell the 290Xs and get the AIBs.


----------



## JordanTr

Quote:


> Originally Posted by *broken pixel*
> 
> I was going to wait for the AIB 290Xs to appear, but I have decided to get two reference 290X GPUs. I sold six 7950s, so that pays for the 290Xs. In a month I will sell the 290Xs and get the AIBs.


Haha, money waster


----------



## darkelixa

Well, with the new Sapphire BIOS the fan shouldn't go over 47%.


----------



## kot0005

Amazon has AC4: Black Flag on sale for $29 (Steam code). Nab it before it's gone. Just got one myself.

http://www.amazon.com/dp/B00GGUWLE6


----------



## charliew

I think a lot of people would appreciate some entries in the Unigine Valley 1.0 thread from you guys:
http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0

Crossfire, for example, is being led by a non-X 290 and needs to be taken down a notch by someone with two 290Xs.

I'm terribly interested in how the overclocked 290/290Xs are doing on water.


----------



## broken pixel

I recycled man! I always do.


----------



## charliew

Quote:


> Originally Posted by *broken pixel*
> 
> I recycled man! I always do.


Dude, I thought GPU mining died with the ASICs. Or are you turning free electricity into money?


----------



## rdr09

Quote:


> Originally Posted by *charliew*
> 
> I think a lot of people would appreciate some entries in the Unigine Valley 1.0 thread from you guys:
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
> 
> Crossfire, for example, is being led by a non-X 290 and needs to be taken down a notch by someone with two 290Xs.
> 
> I'm terribly interested in how the overclocked 290/290Xs are doing on water.


The 780s are running Valley while we are running 3DMark, lol.

Do you mind running your 780 at 1200/1500 to compare? Here is mine, watercooled:

http://www.3dmark.com/3dm11/7556915

No voltage control yet.


----------



## broken pixel

I have an ASIC mining BTC and two 7970s going at some alt coin. I had 85,000 QRK and sold them a month early for 27.00 in BTC. At today's value they'd be worth close to 17 BTC, ~17,000, lol. I messed up selling early.


----------



## velocityx

God.

I was really trying, man.

Really.

But I just couldn't help myself, and I ordered a second 290 along with an 850W 80+ Gold PSU.

I know I don't need it, but damn, the case looks empty inside without a second one ;p


----------



## broken pixel

Add me to the list please.

Order #: 180XXXXXX (shipped via Newegg Next Business Day)
2x XFX R9-290X ENFC Radeon R9 290X 4GB 512-bit GDDR5 PCI Express 3.0 CrossFireX Support Video Card
Item #: N82E16814150675
Adult signature required on delivery.
Iron Egg Guarantee Replacement-Only Return Policy
$1,099.98 ($549.99 ea)
Subtotal: $1,099.98
Tax: $0.00
Newegg Next Business Day: $32.76
Rush Processing: $2.99
Order Total: $1,135.73


----------



## charliew

Quote:


> Originally Posted by *rdr09*
> 
> The 780s are running Valley while we are running 3DMark, lol.
> 
> Do you mind running your 780 at 1200/1500 to compare? Here is mine, watercooled:
> 
> http://www.3dmark.com/3dm11/7556915
> 
> No voltage control yet.


I'll sort it as soon as I get home.

I'll throw in a single-GPU and a multi-GPU run for comparison to measure SLI/CrossFire scaling if you want.


----------



## maynard14

Hi guys... I'm just curious about the R9 290X Hawaii info.

Can you please post your Hawaii info using the Hawaii info tool from the first post of the R9 290 unlock thread?
I really want to see info from a real 290X.
I would like to compare it to my R9 290 that I BIOS-modded and unlocked to a 290X.

Please post it here... thank you so much.


----------



## Arizonian

Quote:


> Originally Posted by *broken pixel*
> 
> Add me to the list please.
> 
> Order #: 180XXXXXX (shipped via Newegg Next Business Day)
> 2x XFX R9-290X ENFC Radeon R9 290X 4GB 512-bit GDDR5 PCI Express 3.0 CrossFireX Support Video Card
> Item #: N82E16814150675
> Adult signature required on delivery.
> Iron Egg Guarantee Replacement-Only Return Policy
> $1,099.98 ($549.99 ea)
> Subtotal: $1,099.98
> Tax: $0.00
> Newegg Next Business Day: $32.76
> Rush Processing: $2.99
> Order Total: $1,135.73


Need you to throw in a GPU-Z validation link as proof when you get them. Congrats in advance.

To be added on the member list please submit the following in your post

1. GPU-Z Link or Screen shot with OCN Name
2. Manufacturer & Brand
3. Cooling - Stock, Aftermarket or 3rd Party Water


----------



## blazestalker100

Maybe this just can't work, but I have an old 4870 reference cooler. It seems to fit the 290, but I'm afraid it can't cool the memory. Could I fit it and see some positive results?


----------



## rdr09

Quote:


> Originally Posted by *charliew*
> 
> I'll sort it as soon as I get home.
> 
> I'll throw in a single-GPU and a multi-GPU run for comparison to measure SLI/CrossFire scaling if you want.


Single at 1200/1500, please. I only have one 290. Maybe others are interested in SLI results. Thanks.


----------



## Ricdeau

Quote:


> Originally Posted by *charliew*
> 
> I think a lot of people would appriciate some inputs to the unigine valley 1.0 thread from you guys
> 
> 
> 
> 
> 
> 
> 
> .
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
> 
> Crossfire for example is being led by a 290 non-X model for example and needs to be taken down a notch by someone with 2 290X's.
> 
> Im terribly interested in how the overclocked 290/290x's are doing on water.


I'm on water, and I have the second highest score for the R9 290/290X, but that's just at stock and pre-waterblocking. After putting on the waterblocks I see no improvement on overclocking with unmodified BIOS and turning up the voltage offset in AB all the way. Temperatures are far improved, but that's it. Also these cards are pretty temperamental in Valley, and seem to be pretty inconsistent. For example I actually have a run that beats the 290 Crossfire score with a slight overclock, but it's just not consistent and the score fluctuates a lot so I've since stopped bothering in that bench for now. I'd say look at using something else besides Valley if you want to compare these for now. Hopefully with some driver maturity we'll see an improvement.


----------



## SuprUsrStan

Quote:


> Originally Posted by *Ricdeau*
> 
> I'm on water, and I have the second highest score for the R9 290/290X, but that's just at stock and pre-waterblocking. After putting on the waterblocks I see no improvement on overclocking with unmodified BIOS and turning up the voltage offset in AB all the way. Temperatures are far improved, but that's it. Also these cards are pretty temperamental in Valley, and seem to be pretty inconsistent. For example I actually have a run that beats the 290 Crossfire score with a slight overclock, but it's just not consistent and the score fluctuates a lot so I've since stopped bothering in that bench for now. I'd say look at using something else besides Valley if you want to compare these for now. Hopefully with some driver maturity we'll see an improvement.


What clock speeds are you getting then? Is that with an overvolt, and if so, how much?


----------



## HardwareDecoder

Quote:


> Originally Posted by *maynard14*
> 
> Hi guys... I'm just curious about the R9 290X Hawaii info.
> 
> Can you please post your Hawaii info using the Hawaii info tool from the first post of the R9 290 unlock thread?
> I really want to see info from a real 290X.
> I would like to compare it to my R9 290 that I BIOS-modded and unlocked to a 290X.
> 
> Please post it here... thank you so much.


I will definitely do this, since someone else asked me to as well, but I won't have my card until Thursday or Friday.


----------



## ZealotKi11er

Quote:


> Originally Posted by *blazestalker100*
> 
> Maybe this just can't work, but I have an old 4870 reference cooler. It seems to fit the 290, but I'm afraid it can't cool the memory. Could I fit it and see some positive results?


That's a horrible cooler, the worst non-reference cooler I have used. Very loud and hot.


----------



## Ricdeau

Quote:


> Originally Posted by *Syan48306*
> 
> What clock speeds are you getting then? Is that with an overvolt, and if so, how much?


Max I can hit is 1200MHz on the core (haven't tapped out the memory yet; the highest I've tested is 1500MHz, which didn't seem to cause any issues), and that's only sometimes stable enough to make it through a bench. For no artifacting you're looking at around 1175MHz. As I said in my initial post, that is with the stock BIOS, so the only voltage you can change is the offset in Afterburner. A few people have killed their cards with the modified BIOS, so I'm not looking to risk anything until we have a better understanding of the voltage thresholds. I don't have the money to throw around on replacing cards (I shouldn't have even bought these!). From the looks of it, 1200MHz seems to be the extent of what most get on stock volts, but for most it's not stable, water or not.


----------



## Darklyric

Okay, so I'm going to pull the trigger on two 290s today. Any recommendations on where and what to get?

This PowerColor seems to be my best chance since XFX is sold out: AXR9 290 4GBD5-MDH/OC

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131527

Also, does anyone have one of these? Are they voltage unlocked, and can I flash a BIOS with a higher TDP to prevent throttling?

Edit: Do these just take the standard 290/290X waterblocks? Anyone have an idea where to find one in stock, and which one is best?

Thanks


----------



## Jpmboy

The Egg is selling the Sapphire 290X for $499!


----------



## Darklyric

Quote:


> Originally Posted by *Jpmboy*
> 
> The Egg is selling sapphire 290x for $499!


where?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Darklyric*
> 
> where?


Out of stock now: http://www.newegg.com/Product/Product.aspx?Item=N82E16814202058


----------



## Darklyric

OK, so which has the best chance of unlocking among what's left in stock?
ASUS: http://www.newegg.com/Product/Product.aspx?Item=N82E16814121807
HIS: http://www.newegg.com/Product/Product.aspx?Item=N82E16814161439


----------



## charliew

Quote:


> Originally Posted by *Ricdeau*
> 
> I'm on water, and I have the second highest score for the R9 290/290X, but that's just at stock and pre-waterblocking. After putting on the waterblocks I see no improvement on overclocking with unmodified BIOS and turning up the voltage offset in AB all the way. Temperatures are far improved, but that's it. Also these cards are pretty temperamental in Valley, and seem to be pretty inconsistent. For example I actually have a run that beats the 290 Crossfire score with a slight overclock, but it's just not consistent and the score fluctuates a lot so I've since stopped bothering in that bench for now. I'd say look at using something else besides Valley if you want to compare these for now. Hopefully with some driver maturity we'll see an improvement.


I'll try to get around to moving over to 3DMark. It's only a bit AMD-favored, IMO, so it's a bit of a bummer to run it on NVIDIA. Also, watching the demos while running the demo version of Fire Strike is almost as fun as pulling out your wisdom teeth with pincers while watching Twilight, sitting on a honeycomb full of Africanized bees, licking two 9-volt batteries, and performing demeaning sexual acts on yourself with a medieval weapon of your own choice... while listening to Rebecca Black (not on a Friday).

The 290s are really plowing up to the top of my recommendation list, though; they're plowing through the competition at about 20% lower prices than the 780 (Sweden, weeoooh). Almost 17.5k 3DMark11 graphics is veeeery good.

If I can't beat it I'll have to figure out some bull**** excuse.


----------



## rdr09

Quote:


> Originally Posted by *charliew*
> 
> I'll try to get around to moving over to 3DMark. It's only a bit AMD-favored, IMO, so it's a bit of a bummer to run it on NVIDIA. Also, watching the demos while running the demo version of Fire Strike is almost as fun as pulling out your wisdom teeth with pincers while watching Twilight, sitting on a honeycomb full of Africanized bees, licking two 9-volt batteries, and performing demeaning sexual acts on yourself with a medieval weapon of your own choice... while listening to Rebecca Black (not on a Friday).
> 
> The 290s are really plowing up to the top of my recommendation list, though; they're plowing through the competition at about 20% lower prices than the 780 (Sweden, weeoooh). Almost 17.5k 3DMark11 graphics is veeeery good.
> 
> If I can't beat it I'll have to figure out some bull**** excuse.


I know you won't beat it at 1200, so no need to try, and no pressure. Good thing, though: you can OC higher and we can both max out the same games.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Darklyric*
> 
> OK, so which has the best chance of unlocking among what's left in stock?
> ASUS: http://www.newegg.com/Product/Product.aspx?Item=N82E16814121807
> HIS: http://www.newegg.com/Product/Product.aspx?Item=N82E16814161439


Only the HIS is in stock now, but from memory I think PowerColor has a 100% unlock rate so far, and XFX has a very good chance as well. I think Sapphire is a 50/50 chance, but I'm not sure about that one.

Either way, it's still a kick-ass card even if it doesn't unlock.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Darklyric*
> 
> OK, so which has the best chance of unlocking among what's left in stock?
> ASUS: http://www.newegg.com/Product/Product.aspx?Item=N82E16814121807
> HIS: http://www.newegg.com/Product/Product.aspx?Item=N82E16814161439


Zero. I would get the ASUS, though.


----------



## brazilianloser

Newegg must have had a major overhaul in their customer service training, or else they changed the company handling customer relations...

Their new favorite line: "May I please place you on hold?"

If I had a few bucks for every time I heard that over the last few weeks, whether I called them or used their live chat... Just ship me my darn card.


----------



## The Storm

Sold my 7950s on the Bay for a mint; now I have two R9 290s (Sapphires) on the way. Can't wait to try them out. I need some EK blocks, but they are sold out right now.


----------



## stickg1

I bought a used PowerColor 290 that should arrive soon. The original owner said it did not unlock, but I was swayed by the 3DMark run he had at ~1253/~1600 on water. That looks above average, and it was at a discounted price, so I decided to get it instead of taking my chances on a new one. But it is a PowerColor and does not unlock.


----------



## Darklyric

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Zero. I would get the ASUS, though.


Couldn't find anyone here who had unlocked one, so I grabbed the HIS models. Oh well, I know I'll be happy even if they're duds at overclocking.


----------



## ImJJames

My 290 should arrive tomorrow. Got a PowerColor brand. Sold my 7970 for $330 to a hungry LTC miner.


----------



## stickg1

Quote:


> Originally Posted by *ImJJames*
> 
> My 290 should arrive tomorrow. Got a PowerColor brand. Sold my 7970 for $330 to a hungry LTC miner.


Yeah, I didn't look much into market prices and sold my 7970 for $200. Sigh, I feel like a moron. Oh well! Someone got a good deal at least.


----------



## Darklyric

Quote:


> Originally Posted by *stickg1*
> 
> Yeah I didn't look much into market price and sold my 7970 for $200. Sigh, I feel like a moron. Oh well! Someone got a good deal at least.


Yes, 7970s are like gold on eBay, I guess... and TigerDirect has a $400 7770, lol. The world is ending.


----------



## ImJJames

Quote:


> Originally Posted by *stickg1*
> 
> Yeah I didn't look much into market price and sold my 7970 for $200. Sigh, I feel like a moron. Oh well! Someone got a good deal at least.


Yeah LiteCoins increased in value, cards are selling like hotcakes.


----------



## rdr09

Quote:


> Originally Posted by *The Storm*
> 
> Sold my 7950's on the bay for a mint, now I have 2 r9-290's (sapphires) on the way. Can't wait to try them out. I need some ek blocks but they are sold out right now.


here you go storm . . .

http://www.overclock.net/t/1447333/fs-us48-three-ek-290-290x-blocks-w-backplates-in-red


----------



## hotrod717

Quote:


> Originally Posted by *stickg1*
> 
> Yeah I didn't look much into market price and sold my 7970 for $200. Sigh, I feel like a moron. Oh well! Someone got a good deal at least.


Quote:


> Originally Posted by *Darklyric*
> 
> Yes 7970 are like gold i guess on ebay... and tiger direct has a 400$ 7770 lol. The world is ending


Wow, roughly a month ago they were still selling for $225. I just looked, and a used reference card is around $300 now. I guess I should have held onto mine longer.


----------



## CriticalHit

Killing me... got my cards under water with nice low temps (around 40C), but whenever I overclock beyond 1100 my DisplayPort monitor keeps dropping and regaining its input signal (on/off) every couple of seconds.

I'm running Eyefinity; the DVI monitors' signal is solid.

I continued anyway all the way up to 1180 to see if it would artifact or crash out, but it's pretty stable up to there. Is the signal dropping an indication that my card is at its limits, or is it a driver issue?


----------



## jerrolds

Not sure if anyone's answered this, but how would we get TrueAudio to work? Do we have to use HDMI for the card to process the audio? What if my only display connection is via DL-DVI; how can I take advantage of the 290X processing the audio?


----------



## The Storm

Quote:


> Originally Posted by *rdr09*
> 
> here you go storm . . .
> 
> http://www.overclock.net/t/1447333/fs-us48-three-ek-290-290x-blocks-w-backplates-in-red


Thank you


----------



## Darklyric

Quote:


> Originally Posted by *jerrolds*
> 
> Not sure if anyone's answered this, but how would we get TrueAudio to work? Do we have to use HDMI for the card to process the audio? What if my only display connection is via DL-DVI; how can I take advantage of the 290X processing the audio?


Good question, because I just use the audio jacks on the motherboard, so I'm not sure it will work for me either... which would suck.


----------



## sugarhell

Quote:


> Originally Posted by *jerrolds*
> 
> Not sure if anyone's answered this, but how would we get TrueAudio to work? Do we have to use HDMI for the card to process the audio? What if my only display connection is via DL-DVI; how can I take advantage of the 290X processing the audio?


What? TrueAudio can work with anything... and there isn't a game with TrueAudio support ATM.


----------



## Jpmboy

Quote:


> Originally Posted by *ImJJames*
> 
> Yeah LiteCoins increased in value, cards are selling like hotcakes.


Whoa. Maybe i should sell my two WC'd 7970s and keep the WC'd 290x i listed in the ocn market place?


----------



## binormalkilla

Quote:


> Originally Posted by *lethal343*
> 
> Hey guys, just been busy with uni....
> 
> WHAT'S THE GO WITH THE R9 290(/X) CHIPSETS?
> 
> Have the black screens faded away or do we still have heaps of problems?
> 
> With 13.11 v9.4 beta I HAVE NOT encountered any black screens; however, my cards RANDOMLY set themselves to absolute max fan speed and the PC locks up at idle, and then I have to hard reboot.
> 
> A mixture of other problems, but no blue screens though.
> 
> Essentially, I have transitioned from black screens to blue screens.
> 
> What is going on? Have AMD said anything yet?
> 
> I have 2x GIGABYTE R9 290Xs in CF on an X79S-UP5 motherboard with an i7-4960X CPU.
> 
> Comments appreciated.


I got the random max-fan bug while browsing the internet. Scared me to death. Two 290Xs at 100% is no joke.


----------



## jerrolds

Quote:


> Originally Posted by *sugarhell*
> 
> What? True audio can use anything...And there isnt a game with true audio support atm


I'm talking about the future. My question is: is HDMI the only way to make sure the card itself processes the audio, or does it somehow process it and hand it over to the system audio renderer?

For example, my motherboard audio doesn't work, so I use a USB external 7.1 audio adapter; that's what my speakers/headphones are connected to. When a game that supports TrueAudio comes out, how can I make sure I can take advantage of it?


----------



## sugarhell

It just processes the audio instead of the CPU doing it. Nothing else.


----------



## Gunderman456

!!!WOW - at least in Canada the R9 290 is sold out (Newegg.ca / Direct Canada / Canada Computers), and the R9 290X, when available, hovers around $600+ now - WOW!!!

Note: NCIX.ca has the XFX R9 290 in stock for $450; the R9 290X is in line with the other boutiques at $600+, though.


----------



## ImJJames

Quote:


> Originally Posted by *Jpmboy*
> 
> Whoa. Maybe i should sell my two WC'd 7970s and keep the WC'd 290x i listed in the ocn market place?


Non-reference 7970s are selling for $400 right now, even used... but you would make a lot more money keeping it and mining Litecoin yourself.


----------



## The Storm

Quote:


> Originally Posted by *ImJJames*
> 
> Non-reference 7970s are selling for $400 right now, even used... but you would make a lot more money keeping it and mining Litecoin yourself.


I got $850 for a pair of 7950s; I couldn't possibly pass that up.


----------



## tsm106

Quote:


> Originally Posted by *The Storm*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ImJJames*
> 
> Non-reference 7970s are selling for $400 right now, even used... but you would make a lot more money keeping it and mining Litecoin yourself.
> 
> 
> 
> I got $850 for a pair of 7950s; I couldn't possibly pass that up.

Holy Carp. I have 5 gold 7970s right here, and one is super gold. Hmm...


----------



## ImJJames

Quote:


> Originally Posted by *tsm106*
> 
> Holy Carp. I have 5 gold 7970s right here, and one is super gold. Hmm...


Put them up on eBay; the lowest price for a reference 7970 right now is $400, and non-reference cards can get much more.


----------



## stickg1

So the 290 can't mine?


----------



## hotrod717

I sold 4 reference cards a month and a half ago for $225-$250 each, and a waterblocked XFX a week and a half ago for $320.


----------



## ZealotKi11er

Quote:


> Originally Posted by *ImJJames*
> 
> Put them up on eBay; the lowest price for a reference 7970 right now is $400, and non-reference cards can get much more.


Really? I had a hard time selling 2x reference 7970s with blocks for $650.


----------



## ImJJames

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Really? I had a hard time selling 2x reference 7970s with blocks for $650.


Just recently? These price spikes just happened 1-2 days ago because of the increase in Litecoin's value.


----------



## battleaxe

Litecoin... Hmmm... Any info on this?

So is the 7970 better at Litecoin than a 290?


----------



## Slomo4shO

Quote:


> Originally Posted by *Darklyric*
> 
> Ok so best chance of unlocking between whats left in stock
> asus http://www.newegg.com/Product/Product.aspx?Item=N82E16814121807
> his http://www.newegg.com/Product/Product.aspx?Item=N82E16814161439


Just got my 4 Sapphire 290 BF editions from Newegg Business, and they all seem likely to unlock.


----------



## quakermaas

Is there anything else that will cause my cards to throttle other than temperature?

My temperatures don't go over 80C in benchmarks (fan profile set for 100% at 80C), and the VRMs don't go over 60-70C.

I am using the latest beta driver and MSI AB (ULPS off, unofficial OC mode with PowerPlay).

It's as if the power limit of 150% is not being applied to the second card.

This was after a Heaven run.
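For anyone else chasing this: once you have a monitoring log (Afterburner can export one), telling power-limit throttling apart from thermal throttling is just a filter over the samples. A minimal sketch in Python; the field names, 3% margin, and 94C thermal limit are assumptions for illustration, not values from this thread:

```python
# Hedged sketch: spotting power-limit throttling in a GPU clock/temp log.
# Field names, the 3% clock margin, and the 94C thermal limit are assumed.

def throttle_events(samples, set_clock_mhz, temp_limit_c=94):
    """Flag samples where the core clock sags below the set clock even
    though the temperature is under the thermal limit - on Hawaii that
    usually points at the power limit rather than temps."""
    return [
        s for s in samples
        if s["core_mhz"] < set_clock_mhz * 0.97  # >3% below target = throttled
        and s["temp_c"] < temp_limit_c           # but not thermally limited
    ]

log = [
    {"core_mhz": 1000, "temp_c": 78},
    {"core_mhz": 840, "temp_c": 79},   # clock sagging, temp fine: power limit?
    {"core_mhz": 1000, "temp_c": 80},
]
print(throttle_events(log, set_clock_mhz=1000))
```

If the flagged samples line up with power-hungry scenes rather than temperature peaks, the power limit (or one card not getting the raised limit) is the likely culprit.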


----------



## Ricdeau

Quote:


> Originally Posted by *quakermaas*
> 
> Is there anything else that will cause my cards to throttle other than temperature?
> 
> My temperatures don't go over 80C in benchmarks (fan profile set for 100% at 80C), and the VRMs don't go over 60-70C; I am thinking it is the power limit. I am using the latest beta driver and MSI AB (ULPS off, unofficial OC mode with PowerPlay).


I'm assuming you also have the power limit turned all the way up. My cards are under water now, but while my blocks were on backorder I just set my fan profile to 70% and never throttled. As far as I could tell I wasn't throttling at around 50% or higher either.

Have you tested the cards individually to see if they throttle outside of CrossFire as well?

Edit: I see that you updated your post, so we know you have the power limit cranked up too. I'd still like to know if that card throttles on its own. Also, have you tested anything else to see if it throttles outside of that?


----------



## ImJJames

Posted my other Sapphire 7970 on Amazon for $380; it sold within an hour, and yes, used... a total of $710 for two used 7970s within 24 hours. Thank you, Litecoin.


----------



## quakermaas

Quote:


> Originally Posted by *Ricdeau*
> 
> I'm assuming you also have the power limit turned up all the way too. My cards are under water now, but while my blocks were on backorder I just set my fan profile for 70% and never throttled. As far as I could tell I wasn't throttling at around 50% or higher either.
> 
> Have you tested individually to see if they throttling out of Crossfire as well?


Added a screenshot to the original post. Haven't tested individually; might have to do that, but not tonight.

I have AC blocks ordered.

It's throttling in all benchmarks.


----------



## battleaxe

I just looked it up. It seems the 7970 series can do high 500s (kH/s) on coins. The 290s and 290Xs can do well over 800, so that's that. The 290 series is the best out there for mining ATM, besides dedicated devices.

This has just become the best time to sell your old AMD card, or to get into mining; I'm not sure which. Wish I'd never sold my old 6970... those do quite well too. Bummer.


----------



## chiknnwatrmln

Now if only those miners could make the price of the 290/Xs double, then I'd be happy.


----------



## battleaxe

Does anyone think it's worth it to mine Litecoin with these 290s/290Xs?

They score over 800 kH/s, but I'm not sure that's enough to make the electricity worth it.
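Whether the electricity is worth it comes down to simple arithmetic once you pin down a few numbers. Here's a minimal sketch in Python; the hashrates, power draws, electricity price, and per-MH/s revenue rate are all assumed values purely for illustration (mining revenue per hash swings with difficulty and coin price), not figures from this thread:

```python
# Rough Litecoin-mining profitability sketch. All numbers are assumptions:
# hashrate in kH/s, card power draw in watts, electricity in $/kWh, and
# an assumed revenue rate in $ per MH/s per day.

def daily_profit(khs, watts, usd_per_kwh, usd_per_mhs_day):
    """Return (revenue, electricity_cost, net) per day in USD."""
    revenue = (khs / 1000.0) * usd_per_mhs_day  # income scales with hashrate
    cost = watts / 1000.0 * 24 * usd_per_kwh    # kWh per day times price
    return revenue, cost, revenue - cost

# Hypothetical comparison: a 290X at ~820 kH/s / 300 W vs a 7970 at
# ~570 kH/s / 250 W, $0.12/kWh, and $2.50 per MH/s per day of revenue.
for name, khs, watts in [("290X", 820, 300), ("7970", 570, 250)]:
    rev, cost, net = daily_profit(khs, watts, 0.12, 2.50)
    print(f"{name}: revenue ${rev:.2f}/day, power ${cost:.2f}/day, net ${net:.2f}/day")
```

The point of the exercise: at typical home electricity prices the power cost is under a dollar a day per card, so profitability hinges almost entirely on the revenue-per-hash rate, which is the volatile part.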


----------



## ZealotKi11er

Quote:


> Originally Posted by *ImJJames*
> 
> Just recently? These price spikes just happened 1-2 days ago because of the increase in Litecoin's value.


Why can't they buy 280Xs?


----------



## chiknnwatrmln

IDK; if the card's on stock voltage or undervolted, maybe it'd be worth it.

I don't think they use 280Xs because most of them probably don't know all that much about graphics cards, and as a result they go for what everyone else uses.


----------



## mickeykool

Have the black screens been fixed yet? I want to buy the R9 290 but have been seeing some issues with this card.


----------



## ImJJames

Quote:


> Originally Posted by *mickeykool*
> 
> Have the black screens been fixed yet? I want to buy the R9 290 but have been seeing some issues with this card.


Yes, it's been fixed, and good luck finding a 290 in stock at $399.


----------



## The Storm

Quote:


> Originally Posted by *ImJJames*
> 
> Yes, it's been fixed, and good luck finding a 290 in stock at $399.


I picked up 2 sapphire 290's from newegg last night and used the 5% discount and they each come with a BF4 coupon. Too bad I already have the game lol.


----------



## fastpcman12

Quote:


> Originally Posted by *The Storm*
> 
> I picked up 2 sapphire 290's from newegg last night and used the 5% discount and they each come with a BF4 coupon. Too bad I already have the game lol.


you can sell the game for $20 to $25 on ebay or amazon...


----------



## LuPo95

Quote:


> Originally Posted by *mickeykool*
> 
> Have the black screen been fixed yet? Want to buy the R290 but been seeing some issues with this card.


For me it's not fixed yet. Even the 13.11 v9.4 beta hasn't got it to work -.-


----------



## mickeykool

Quote:


> Originally Posted by *LuPo95*
> 
> For me its not fixed yet. Even the 11.4 beta hasnt got it to work -.-


Not sure if you're aware but there is a new driver update released today.


----------



## RAFFY

Quote:


> Originally Posted by *battleaxe*
> 
> Litecoins... Hmmm... Any info on this?
> 
> So the 7970 is better at Litecoin than a 290?


What does the 7970 pull? My 290X's each get around 820 kH/s.


----------



## battleaxe

Quote:


> Originally Posted by *RAFFY*
> 
> What does the 7970 pull? My 290X's each get around 820 kH/s.


550 to 590ish depending on OC. Not bad, but nowhere near the 290's.

290's are looking really good again. To me anyway. Looks like the driver issues are getting worked out too. Gonna wait just a bit longer...


----------



## bond32

I'm just getting into Litecoin, my single r9 290x was doing 600


----------



## battleaxe

Quote:


> Originally Posted by *bond32*
> 
> I'm just getting into Litecoin, my single r9 290x was doing 600


Something's not right. You should be in the 800's or really close.


----------



## Slomo4shO

Quote:


> Originally Posted by *battleaxe*
> 
> 550 to 590ish depending on OC. Not bad, but nowhere near the 290's


700-750 is pretty common on the 7970s and 280X


----------



## jerrolds

Quote:


> Originally Posted by *battleaxe*
> 
> does anyone think its worth it to mine Litecoins with these 290's/290x's?
> 
> They score over 800Kh/s but I'm not sure that's enough to make the electricity worth it?


Depends on your total power draw and electricity costs, obviously. At stock the 290X pulls about 300W, I believe; I estimate my system pulls about 450W. I pay $0.07/kWh, so going with those numbers it should cost me about $13/month to run litecoin mining 12 hr/day for 30 days.

I don't plan on doing it 24/7.
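The estimate above is easy to sanity-check. A quick script using the stated numbers (450 W system draw, $0.07/kWh, 12 h/day for 30 days) lands in the same ballpark as the ~$13/month figure:

```python
# Back-of-envelope electricity cost for a mining rig, using the numbers
# quoted above: 450 W total system draw, $0.07/kWh, 12 hours/day, 30 days.

def monthly_power_cost(watts, price_per_kwh, hours_per_day, days=30):
    """Energy used (kWh) times price: cost in dollars for one month."""
    kwh = watts / 1000.0 * hours_per_day * days
    return kwh * price_per_kwh

cost = monthly_power_cost(450, 0.07, 12)
print(f"${cost:.2f}/month")  # about $11.34 -- close to the ~$13 quoted,
                             # which plausibly rounds up for PSU losses etc.
```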


----------



## battleaxe

Quote:


> Originally Posted by *Slomo4shO*
> 
> 700-750 is pretty common on the 7970s and 280X


Wow... I missed the bottom part of the 7970's there. I was looking at that very page. Man, take a look at those 7990's... weeeewwwww... 1500 kH/s.


----------



## fewohfjweoifj

Just updated to the new 9.5 Beta Drivers. The black screen crash now occurs EVEN MORE FREQUENTLY than ever before. AMD, whatever you're doing with these drivers, do the opposite


----------



## rdr09

Quote:


> Originally Posted by *fewohfjweoifj*
> 
> Just updated to the new 9.5 Beta Drivers. The black screen crash now occurs EVEN MORE FREQUENTLY than ever before. AMD, whatever you're doing with these drivers, do the opposite


sure.


----------



## darkelixa

Just about to pull the trigger on the sapphire r9 290 unless the xfx r9 290 is better


----------



## Ukkooh

Anyone else experiencing hard locks with catalyst 13.11v9.4 betas and flash player? I did a fresh install of windows around 2 hours ago to confirm that my issues aren't software related and now my system just hard locks with looping sound when I open a youtube video.


----------



## stl drifter

Performance-wise, how does an R9 290 compare to two 7950's in Crossfire?


----------



## fewohfjweoifj

I made this video demonstrating the black screen problem persisting when using the new Beta 9.5 drivers.


----------



## darkelixa

Makes me a bit worried to buy an r9 290 if they have all these black screen problems


----------



## rdr09

Quote:


> Originally Posted by *fewohfjweoifj*
> 
> 
> 
> 
> 
> I made this video demonstrating the black screen problem persisting when using the new Beta 9.5 drivers.


9.5 for black screen issues? Wasn't it 9.4? If 9.4 did not resolve YOUR issue, then 13.11 Beta 9.5 won't.


----------



## fewohfjweoifj

Quote:


> Originally Posted by *rdr09*
> 
> 9.5 for black screen issues? Wasn't it 9.4? If 9.4 did not resolve YOUR issue, then 13.11 Beta 9.5 won't.


The issue has occurred on every driver version so far, including the new Beta 9.5 released today. I guess I can wait for whenever the next non-beta driver is, but if there's still an issue at that point then it's a hardware fault for sure.


----------



## Heinz68

Quote:


> Originally Posted by *fewohfjweoifj*
> 
> 
> 
> 
> 
> I made this video demonstrating the black screen problem persisting when using the new Beta 9.5 drivers.


Maybe something's wrong with your system config (which isn't in your signature), or it's just about time to RMA the card, since just about all your posts are about the same subject.


----------



## rdr09

Quote:


> Originally Posted by *fewohfjweoifj*
> 
> The issue has occured on every driver version so far, including the new Beta 9.5 released today. I guess I can wait for whenever the next non-beta driver is, but if there is still an issue at that point then it's a hardware fault for sure.


And it will happen even with 13.11 Beta 9.5. RMA.

Read the notes. Nothing has changed from the previous beta; they just added stuff to address newer issues. Hey, where are you from?

Sorry to be sarcastic, but seriously - RMA.


----------



## voldomazta

Hi guys,

Will an Alphacool NexXxoS 480 Monsta be enough to cool three 290X's?


----------



## fewohfjweoifj

Quote:


> Originally Posted by *rdr09*
> 
> And it will happen even with 13.11 Beta 9.5. RMA.
> 
> Read the notes. Nothing has changed from the previous beta; they just added stuff to address newer issues. Hey, where are you from?
> 
> Sorry to be sarcastic, but seriously - RMA.


If I RMA the card and it turns out not to be faulty, I'll be charged for it, and I can't really afford that. It's very possible the issue is specific to my system and the card wouldn't appear faulty to whoever tests it. And it's not like I expect beta drivers to run perfectly on a card that was only released very recently. So I will wait at least until non-beta drivers are out. If at that point the black screens are still an issue, then I will definitely RMA. The issue only affects 3D games; normal computer use isn't hindered by it, so I'm happy to wait that long.


----------



## head9r2k

AMD 13.11 Beta 9.5 works fine for me.

Many BF4 matches, no black screens or other issues.


----------



## rdr09

Quote:


> Originally Posted by *fewohfjweoifj*
> 
> If I RMA the card and it turns out not to be faulty I'll be charged for it and I can't really afford that. It's very possible that the issue is specific to my system and would not appear faulty to whoever tests it. And it's not like I expect beta drivers to run perfectly, on a card that was only released very recently. So I will wait at least until non-beta drivers are out. If at that time the black screen are still an issue, then I will definitely RMA. The issue is only affecting 3D games, normal computer use isn't hindered by it so I'm happy to wait that long.


By now it is pretty obvious that no patch will fix it, beta or otherwise, in your case. I am just giving you a suggestion. You can disregard it, but if six months from now you post about it again... I'll say the same thing - RMA.


----------



## Dankal

Anyone else have problems with their card not downclocking?

The card runs games fine, but back in Windows it's not always returning to the 300MHz idle state.

This causes 75°C temps on a normal fan curve since it's stuck at 950MHz... with 0% load.

Using 13.11 9.2.


----------



## jerrolds

Quote:


> Originally Posted by *Mas*
> 
> Man I'm so jealous.. I pay between .36c/kwh and .44c/kwh, and my supply charge is almost a dollar a day. Quarterly electricity bill is usually almost a grand.


Ouch...







Yeah, we export a ton of our electricity to the States (I live in Manitoba), so it makes sense that power would be on the cheaper side compared to the rest of the world. My summer bill is about $25/mo and my winter about $45, heh... but I live in a smallish apartment-style condo. Winters do get cold though, about -30C for a good 3 months.


----------



## jerrolds

Quote:


> Originally Posted by *Dankal*
> 
> anyone else have problems with their card not downclocking?
> 
> The card runs games fine, but when back to windows its not always going back to the 300 mhz idle state.
> 
> This causes 75 * C temps on normal fan curve since it is stuck at 950 mhz.... with 0% load.
> 
> using 13.11 9.2


I had that issue when a virus called ie8util or something similar was running bit/litecoin mining on my GPU. I noticed Bioshock 1 was running at around 100fps when it should've been 300+, and the GPU was at 100% usage in Windows.

Might wanna check your startup folder/task manager to make sure nothing weird is running.

Luckily I think it was only running for a few hours... bastards won't make virtual dollars off me!


----------



## Mas

Quote:


> Originally Posted by *jerrolds*
> 
> Ouch...
> 
> 
> 
> 
> 
> 
> 
> Yea we export a ton of our electricity to the states (I live in Manitoba) so i makes sense that power would be on the cheaper side compared to the rest of the world. My summer bill is about $25/mo and my winter about 45 heh..but i live in a smallish apartment style condo. Winters do get cold though about -30C for a good 3 months.


Time to talk the missus into moving to Canada.


----------



## Dankal

I wouldn't even know how to check for it, but the computer is fine if I put it to sleep and reboot, so I doubt it's a virus. It's fine if I restart it.


----------



## Jack Mac

It's not a virus; my card takes a very long time to return to 2D clocks but easily goes right back up to 3D clocks. Super annoying.


----------



## darkelixa

Umart have the sapphire r9 290 for $475 in australia, is that a good price?


----------



## Jack Mac

Not familiar with Australian prices; if it's within $70-100 of the 280X and 770 and you can afford it, then yeah, it's probably worth it.


----------



## djsatane

My RMA process is finally complete, and for my troubles and the fact I had to ship it over the border, I am receiving an AMD "golden sample" press/VIP card (they say it's hard to get and on the market it can run as high as $2k; not sure if that's true). I have been a steady GPU customer of this place since 2008, so I appreciate them treating a long-time customer like me this way. They also said they stress-tested it for 3 days. I will post details once I have it, but it will take a while since it's international. The initial Sapphire 290X that I RMA'd 2 weeks back was black screen hell, but it looks like everything will turn out much better than I expected.

Btw, did anyone notice camelegg no longer works for newegg price charts?


----------



## escapedmonk

I'm after a cheap BF4 key if anyone has a spare.
Thanks


----------



## darkelixa

Yeah, it's under the 770 GTX and about $70 more than a 280X.

I'd better ask before I buy one: are there huge stuttering issues in games with the R9 290?

The games I play are Final Fantasy: A Realm Reborn, BF3/BF4, and Crysis 3.

Stuttering is usually a driver/video card issue, isn't it?


----------



## ImJJames

Quote:


> Originally Posted by *darkelixa*
> 
> Yeah its under the 770gtx and about 70 more than a 280x.
> 
> I better ask before I buy one, are there huge stuttering issues with games and the r9 290.
> 
> The games i play is final fantasy a realm reborn,bf3/bf4,crysis 3.
> 
> Stuttering is usually a driver/video card issue isnt it?


Stuttering is a Crossfire thing, and even then it's pretty much a non-issue now.


----------



## ImJJames

Plus, if you're planning to buy one, it's almost impossible. It's out of stock everywhere unless you want to pay a higher price.


----------



## darkelixa

Oh, no issues buying R9 290s here in Australia; they have plenty in stock. It's the 280Xs they have zero stock of. I guess it's time to put the 770 GTX up on eBay and buy the R9 290. DDR3 RAM wouldn't cause stuttering, would it?


----------



## deathlikeeric

Quote:


> Originally Posted by *stl drifter*
> 
> Performance wise , how do a R290 compare to 2 7950's in crossfire ?


I just went from two 7950's to an R9 290 @ 1000/1250. I get around 65-90fps (depending on the map) in BF4 @ 1440p with all ultra settings and no MSAA... crossfire: 65-110fps depending on the map.
AC4: a solid 60fps with all ultra and FXAA... crossfire 7950's: solid 60fps (cap).

All in all I enjoy a single card over two - less trouble with games.


----------



## HardwareDecoder

Quote:


> Originally Posted by *deathlikeeric*
> 
> i just went from 2 7950's to a r9 290 @ 1000/1250 i get around
> 65-90fps(depending on the map) on BF4 @ 1440p with all ultra setting with no msaa... crossfire: 65-110fps depending on the map
> AC4: a solid 60fps with all ultra and FXAA... crossfire 7950's: solid 60fps (cap).
> 
> all in all i enjoy a single card over 2, less trouble with games


Good to hear. I'm doing the same except with a 290X; can't wait to get mine. Paid for 3-day shipping but Newegg didn't ship yet.


----------



## DeadlyDNA

Quote:


> Originally Posted by *charliew*
> 
> I think a lot of people would appreciate some input in the Unigine Valley 1.0 thread from you guys
> 
> 
> 
> 
> 
> 
> 
> .
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
> 
> Crossfire for example is being led by a 290 non-X model for example and needs to be taken down a notch by someone with 2 290X's.
> 
> Im terribly interested in how the overclocked 290/290x's are doing on water.


I submitted a sample for 5760x1080 (multi-monitor). I haven't gotten a chance to get my fourth card in, so I did tri-fire. My 1080p runs are not so hot and I think I am CPU-limited on those. Every test I have run at 1080p has been mediocre, and I think I would have to unleash my CPU as well. But a quad core only scores so well in things like 3DMark11...

I am contemplating finding a used 6-core 2011 chip if they are around...


----------



## ZealotKi11er

Quote:


> Originally Posted by *HardwareDecoder*
> 
> good to hear i'm doing the same except with a 290x cant wait to get mine. Paid for 3 day shipping but newegg didn't ship yet


My HD 7970s @ 1200MHz are faster than a 290X at stock for sure. They get an extra 20-30 fps in BF4: 60-80 fps for the 290X compared to 80-110 fps.
After an OC the 290X should come closer.


----------



## darkelixa

So get a 7970 instead of an R9 290?


----------



## HardwareDecoder

Quote:


> Originally Posted by *darkelixa*
> 
> So get a 7970 instead of a r9 290?


He was talking about two 7970's, and DO NOT use XFire. Trust me, I just got done with 6 months of it.

At first you can tell yourself you're okay with it not scaling well in everything, and that you can deal with the micro stutter too. Slowly, though, you realize you just wasted the last 6 months of your gaming life. Nothing better has ever happened to me in terms of PC hardware than getting rid of my two 7950's for a huge profit.

My 290X is going under water as soon as the GPU block comes in; gonna be 1-2 weeks though. It's gonna be so much nicer than XFire 7950's was.

The "extra fps" means nothing when you've got micro stutter.


----------



## DeadlyDNA

Quote:


> Originally Posted by *darkelixa*
> 
> Yeah its under the 770gtx and about 70 more than a 280x.
> 
> I better ask before I buy one, are there huge stuttering issues with games and the r9 290.
> 
> The games i play is final fantasy a realm reborn,bf3/bf4,crysis 3.
> 
> Stuttering is usually a driver/video card issue isnt it?


I play FFXIV: RR in Eyefinity, running three R9 290's, so my experience may be different than yours, but I can say it runs really well for me. My GTX 680 3-way SLI couldn't run that game properly.


----------



## darkelixa

Oh, tell me about it with Nvidia; my single 770 GTX just couldn't handle it.


----------



## evensen007

I am giving these 290's 10 more days before I return them both (end of return period). I get random black screen at stock clocks in BF4 using the 13.11 WHQL driver. I tried the beta 9.4 driver, but my FPS dropped through the floor.


----------



## ImJJames

Quote:


> Originally Posted by *evensen007*
> 
> I am giving these 290's 10 more days before I return them both (end of return period). I get random black screen at stock clocks in BF4 using the 13.11 WHQL driver. I tried the beta 9.4 driver, but my FPS dropped through the floor.


Remove the driver using Catalyst > reboot to safe mode > uninstall AMD drivers using Display Driver Uninstaller (latest version found on Guru3D) > reboot > install the latest 9.5 beta drivers (also on Guru3D).


----------



## battleaxe

Quote:


> Originally Posted by *ImJJames*
> 
> Remove the driver using Catalyst > reboot to safe mode > uninstall AMD drivers using Display Driver Uninstaller (latest version found on Guru3D) > reboot > install the latest 9.5 beta drivers (also on Guru3D).


He already did that.


----------



## Forceman

So I finally thought to check the minidumps on my red screens in BF4, and it turns out they were 124 errors, so it looks like my 4770K is to blame and not the 290. No idea what BF4 is doing that IBT/Prime aren't to cause me to crash in the game and not in stress tests though.


----------



## evensen007

Quote:


> Originally Posted by *ImJJames*
> 
> Remove the driver using Catalyst > reboot to safe mode > uninstall AMD drivers using Display Driver Uninstaller (latest version found on Guru3D) > reboot > install the latest 9.5 beta drivers (also on Guru3D).


Trust me, I've been through the whole deal. I've been on AMD cards for the last 3 generations so I've been around the block. I even reloaded windows as a last resort which did improve performance, but left this lingering black screen.


----------



## skupples

Quote:


> Originally Posted by *Dankal*
> 
> anyone else have problems with their card not downclocking?
> 
> The card runs games fine, but when back to windows its not always going back to the 300 mhz idle state.
> 
> This causes 75 * C temps on normal fan curve since it is stuck at 950 mhz.... with 0% load.
> 
> using 13.11 9.2


It could be multiple things: mining malware, a high resolution or 120Hz+ monitor, or hardware-accelerated programs like Google Chrome and the like.


----------



## maynard14

So that's why sometimes even at idle my temps are 50°C to 60°C, haha.

I'm using a 120Hz monitor...


----------



## sugarhell

Quote:


> Originally Posted by *maynard14*
> 
> so thats why mine somethimes even idle my temps are 50c to 60c hahah
> 
> im using 120 hrtz monitor/..


A 144Hz monitor causes 3D clocks. With 120Hz you should have 2D clocks.


----------



## darkelixa

I wish people didn't have all these black screens, lol. It's starting to put me off my Sapphire R9 290 purchase.


----------



## lethal343

Guys...

I'm getting VIDEO_TDR_FAILURE with my R9 290X CF setup.
Win 8.1.

Anyone else with this blue screen issue?!

I only started getting it when I updated to 13.11 beta v9.4.

Also, how do I force my GPUs into 3D mode at all times?!

I think the transition from 2D to 3D is causing the BSODs.


----------



## Dankal

I am scanning for viruses because apparently they can do that. As for my status now, it's totally random. I can be using Chrome just surfing and get the low power state, but if I step away I can come back to a hot GPU. It's annoying, but viruses don't seem to be the issue.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Forceman*
> 
> So I finally thought to check the minidumps on my red screens in BF4, and it turns out they were 124 errors, so it looks like my 4770K is to blame and not the 290. No idea what BF4 is doing that IBT/Prime aren't to cause me to crash in the game and not in stress tests though.


I honestly considered this as a culprit for most of the people who are having issues, possibly even the black screen. I did OC my CPU yesterday to 5GHz, and while it benchmarked fine, my games kept crashing. I had forgotten I'd set my OC that high. However, my screens just froze; no black or red or flickering. One thing I can say for sure in my case is it sure seems like my CPU is working overtime with these cards. I haven't bothered to keep CPU usage meters rolling. Even at my 4.8 OC, which I was able to game on with my GTX 680 3-way SLI, the R9 290s wouldn't run without crashes. I went down to 4.6 to be able to game without crashes.


----------



## maynard14

Quote:


> Originally Posted by *sugarhell*
> 
> 144 hz monitor cause 3d clocks. With 120hz you should have 2d clocks.


But my monitor is 3D... though I'm not using it in 3D.

If 120Hz won't push my card out of idle, why is my idle temp as high as 55°C?


----------



## skupples

Quote:


> Originally Posted by *Dankal*
> 
> I am scanning for viruses because apparently they can do that. As for my status now this is totally random. I can be using chrome and just surfing and get low power state but if i step away i can come back to a hot gpu. its annoying but viruses dont seem to be the issue.


Try a trial of a program called HitmanPro. Many of the new "viruses" do not show up in your typical freeware anti-virus program.
Quote:


> Originally Posted by *maynard14*
> 
> but my monitor is 3d.. but im not using it in 3d...
> 
> if 120hz will not push my card to idle as high as mine @ 55c why i am getting that temp?


I don't know about AMD specifically, but high refresh rates will force many other GPUs into 3D clocks. I'm not sure the "120Hz won't, 144Hz will" comment is correct; many people in the GK110 owners clubs have this "issue" with high-refresh-rate monitors.


----------



## the9quad

Running my 290x's with a monitor at 120hz, no problems at all. Always goes back to idle clocks


----------



## DeadlyDNA

Quote:


> Originally Posted by *the9quad*
> 
> Running my 290x's with a monitor at 120hz, no problems at all. Always goes back to idle clocks


How are those X's treating ya? What do you play mostly?


----------



## skupples

Quote:


> Originally Posted by *the9quad*
> 
> Running my 290x's with a monitor at 120hz, no problems at all. Always goes back to idle clocks


Would be interesting to find out if that's due to the extra grunt. As in, one Hawaii may need 3D clocks to push the refresh rate, but two may not. It could also be the combo of hardware-accelerated web browsers + high-Hz monitors.

Intel Burn Test & Prime95 are great for stability testing, but they don't compare to the fluctuating loads of a high-demand game like BF4. I can run IBT/P95 all day long @ 5.0, but my system will crash within 30 minutes of playing any high-CPU-usage game.


----------



## the9quad

Quote:


> Originally Posted by *DeadlyDNA*
> 
> How are those X's treating ya? What do you play mostly?


BF4
1440p everything ultra, 4xmsaa,low post process aa, 115% upscale. Pretty much solid >120 fps and average in the 160-170 fps range. Bring on mantle!


----------



## maynard14

Quote:


> Originally Posted by *the9quad*
> 
> Running my 290x's with a monitor at 120hz, no problems at all. Always goes back to idle clocks


Sir, please can you post your Hawaii tool result from your R9 290X?

I just want to compare my R9 290 unlocked to R9 290X via a modded BIOS... please, sir.

And maybe that's why he's running two cards on a 120Hz monitor.

Maybe it's because my room is a little bit warm; that's why my idle temp is high, 55 at idle.


----------



## skupples

Quote:


> Originally Posted by *the9quad*
> 
> BF4
> 1440p everything ultra, 4xmsaa,low post process aa, 115% upscale. Pretty much solid >120 fps and average in the 160-170 fps range. Bring on mantle!


results like this make me miss playing on a single monitor... Then I realize i'm now totally blind on a single panel.


----------



## magicase

I have the following 2 options to pick from in the near future.

2 x 290x for ~$1300
or
3 x 290 for ~$1500.

Which one should I pick?

The MB will be a Asrock Z87 OC Formula.


----------



## Slomo4shO

Quote:


> Originally Posted by *magicase*
> 
> I have the following 2 options to pick from in the near future.
> 
> 2 x 290x for ~$1300
> or
> 3 x 290 for ~$1500.
> 
> Which one should I pick?
> 
> The MB will be a Asrock Z87 OC Formula.


You may be able to receive $150 off your order of $999 or more still from Newegg Business if you order tonight. Triple crossfire on that board will only run at x8/x4/x4. It would be ideal to stick to two cards on that board. Nevertheless, going with two 290s is still the better value.


----------



## darkelixa

OMG, PCCG are getting Sapphire R9 280X Toxics this week for $429.

Should I buy a Toxic for $429 or an R9 290 for $475?

Last chance before I order!!

The black screens really do worry me though.


----------



## Ghostpilot

That, and there's the chance that they'll unlock to 290x's.


----------



## Hogesyx

Can anyone give some insight regarding my card?

idle with 1400MHz memory = stable
idle with 1430MHz+ memory = crash
during stress tests/games with 1600MHz memory = stable

Is vdroop causing this issue? Wondering if I should try out the PT1 or PT3 BIOS, but I'd hate to lose idle power saving.


----------



## CriticalHit

Quote:


> Originally Posted by *skupples*
> 
> results like this make me miss playing on a single monitor... Then I realize i'm now totally blind on a single panel.


Exactly how I feel... those frame numbers are sooooo nice, but I'd still choose 3 screens at 40-60 fps any day of the week over the single-screen option. Here's hoping Mantle can make it more like 70-90 fps in a few new titles for CFX setups.


----------



## the9quad

Quote:


> Originally Posted by *CriticalHit*
> 
> exactly how i feel.. those frame numbers are sooooo nice... but id still choose 3 screens at 40-60 fps anyday of the week over the single screen option . heres hoping mantle can make it more like 70-90 fps on a few new titles for CFX setups


I honestly wanted to try 120Hz gaming, so I chose one monitor. Down the line I might get two more and go back to 60Hz gaming; I just don't know if three 290X's can consistently push 60fps at ultra across three 1440p monitors. I'm thinking they can't.


----------



## maynard14

After my XFX 290 unlocked with the ASUS 290X BIOS:

Compatible adapters detected: 1
Adapter #1 PCI ID: 1002:67B0 - 1043:0466
Memory config: 0x500036A9 Hynix
RA1: F8000005 RA2: 00000000
RB1: F8000005 RB2: 00000000
RC1: F8000005 RC2: 00000000
RD1: F8000005 RD2: 00000000

can you compare your R9 290X card to my unlock via bios mod using hawaii tool?

thanks


----------



## CriticalHit

Quote:


> Originally Posted by *the9quad*
> 
> I honestly wanted to try 120hz gaming so I chose one monitor. I think down the line I might get two more and go back to 60hz gaming, I just honestly don't know if 3 of the 290x's can push consistently 60fps at ultra with three 1440p monitors. I'm thinking they can't.


Yeah, probably not... two 290X cards with a mild overclock barely get over 60 fps in most games now in 1080 Eyefinity @ ultra, so I imagine tri-finity would be floating around 60 but dipping below at times at 1440. (Currently replaying Crysis 1 with the real lifesys mod at max settings, and that's floating around 40-50fps.)

Edit: it's where Mantle needs to step it up.


----------



## Sgt Bilko

Alright, I have a pretty serious question and i can't make up my mind about it.

I've RMA'd my 290x due to it artifacting at stock and now i can't decide if i want another 290x or grab two 280x's and CF them.

The 290X is newer tech, no bridges needed for CF, and a better long-term plan. But I hate the ref cooler and don't want to mess around putting another cooler on it, so I'd need to wait for non-ref (Jan).

A 280X would be about the same price, non-ref so cooler and quieter, but I'd suffer from the problems that only occur with Crossfire...

I need to have a replacement card/s by the end of December.

I'm somewhat stumped here.

Help?


----------



## CriticalHit

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm somewhat stumped here.
> 
> Help?


Non-ref 780? It's a bit of a premium, but it's a no-fuss solution... I like my AMD cards, but I also like to tinker around to maximise performance. Nvidia tend to be a bit easier out of the box and they overclock well with stock fans - but you tend to pay for that convenience with more cash... and you can always add a second later, rather than being locked in with two 280's...

I only crossfire cards if:
1) they are the latest on offer and I basically couldn't get anything comparable with a single card
2) I've had a card a while and want to improve performance, so I add a second...

Crossfire/SLI always has drawbacks over a one-card solution...
I'm very happy with my 290's, but I have put waterblocks on, so I understand if you don't like the ref blower...


----------



## Sgt Bilko

Quote:


> Originally Posted by *CriticalHit*
> 
> non-ref 780? its a bit of a premium but its a no fuss solution ... i like my AMD cards but i also like to tinker around to maximise performance. Nvidia tend to be a bit easier out of the box and they overclock well with stock fans. - but you tend to have to pay for that convenience with more cash... and you can always add a second later, rather than being locked in with 2 , 280's....
> 
> i only crossfire cards if -
> 1 ) they are the latest on offer and i basically couldnt get anything comparable with a single card
> 2) ive had a card a while and want to improve performance so add a second...
> 
> crossfire / sli always has drawbacks over 1 card solution...
> im very happy with my 290's but have put waterblocks on so understand if you don't like ref blower..


Ah, I did consider going green for about 2-3 seconds.

I'd prefer to stick with AMD (no offense to the green team), and yes, the 780 is a bit of a premium, but I have high hopes for Mantle.


----------



## darkelixa

OMG, don't go Nvidia; my 770 is horrible!!!!

Makes me wonder why Final Fantasy stutters in gameplay and Crysis 3 does not.


----------



## stickg1

Quote:


> Originally Posted by *HardwareDecoder*
> 
> He was talking about two 7970's and DO NOT use xfire. Trust me I just got done with 6 months of it.
> 
> At first you can tell your self you are okay with it not scaling well in everything, you can deal with the micro stutter too. Slowly though you realize you just wasted the last 6 months of your gaming life. Nothing better has ever happened to me in terms of pc hardware than getting rid of my two 7950's for a huge profit.
> 
> My 290x is going under water as soon as the gpu block comes in, gonna be 1-2 weeks though. It's gonna be so much nicer than xfire 7950's was.
> 
> The "extra fps" means nothing when you got micro stutter


UGH, I forgot about microstutter. I actually noticed it with a single card, which is why I left my 7950s and 7970s once I got my hands on a 670 - the games I played were so smooth. I thought AMD fixed this though.
Quote:


> Originally Posted by *the9quad*
> 
> Running my 290x's with a monitor at 120hz, no problems at all. Always goes back to idle clocks


Just responding to you because you have Kenny Powers as an avatar and I love that show!








Quote:


> Originally Posted by *darkelixa*
> 
> Omg dont go nvidia, my 770 is horrible!!!!
> 
> Makes me wonder why final fantasy stutters in gameplay and crysis 3 does not


C'mon man, there's no need for any of that. The 770 is a good card, the 7970 is a good card, the R9 290 is a good card, the 780 is a good card. We're all good, so stop the non-sense.

Also, for the past few days I've seen your name every 4 or 5 posts, asking whether you should go ahead and get the R9 290 or not. Then you ask questions that are answered on almost every page of this thread. Please buy a card or don't buy a card, but quit the mindless posting. What's the saying? "Piss or get off the pot," I believe.

Not trying to be a dick, even though I'm sure it comes off that way. But you've been asking the same questions for days and have received your answer over and over. It's almost like you're posting just to post.


----------



## blak24

Guys, I've got a big problem with core usage: every second it drops from 100% to 50-70% for a fraction of a second, then back to 100%, then it drops again... I tried uninstalling drivers, a clean-up, and the latest betas (9.5, 9.4, 9.2); the problem is the same every time. It started randomly. Has anyone had this problem?


----------



## The Storm

Quote:


> Originally Posted by *blak24*
> 
> Guys, I've got a big problem with core usage: every second it drops from 100% to 50-70% for a fraction of a second, then back to 100%, then it drops again... I tried uninstalling drivers, a clean-up, and the latest betas (9.5, 9.4, 9.2); the problem is the same every time. It started randomly. Has anyone had this problem?


I had this issue on my 7950s. I just went to MSI Afterburner, slid the power adjustment to +20%, and my cards would go up and stay up. Hope this helps.


----------



## maynard14

Wow!

Nice scores after unlocking my XFX 290 with the XFX 290X BIOS:

http://www.3dmark.com/3dm11/7606246


----------



## TheSoldiet

I found out that 1.300 V in GPU Tweak is 1.180 V in GPU-Z, and 1.400 V in GPU Tweak is 1.250 V in GPU-Z. Stock 1.250 V is 1.148 V in GPU-Z. Weird, right?


----------



## Jpmboy

Quote:


> Originally Posted by *maynard14*
> 
> wowow
> nice scores unlock xfx 290 to xfx 290x bios
> http://www.3dmark.com/3dm11/7606246


Eh... Push those clocks some. http://www.3dmark.com/3dm11/7474358

Quote:


> Originally Posted by *TheSoldiet*
> 
> I found out that 1.300 v on GPU tweak is 1.180 v in GPU-z and 1.400 v on GPU tweak is 1.250 in GPU-z. Stock 1.250 is 1.148 in GPU-z. Weird right


That's vdroop.
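For anyone curious what vdroop looks like on paper: under load, the VRM's load-line lets the delivered voltage sag below the setpoint roughly in proportion to the current drawn. A minimal sketch; the load-line resistance below is a made-up illustrative value, not AMD's actual spec:

```python
# Rough model of vdroop: V_delivered = V_set - I_load * R_loadline.
# R_loadline here is invented for illustration only.

def vdroop(v_set, i_load, r_loadline=0.0005):
    """Return the approximate core voltage delivered under load (volts)."""
    return v_set - i_load * r_loadline

# A card pulling ~200 A at a 1.300 V setpoint sags noticeably:
print(round(vdroop(1.300, 200), 3))  # about 1.2 V under this load
```

So a higher reading in GPU Tweak than in GPU-Z isn't a bug; the monitoring tool is just reporting the sagged voltage, not the setpoint.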


----------



## maynard14

Haha, no sir.

What I'm saying is that going from the stock BIOS score of 11,810 to the 290X BIOS score of 12,463 just amazes me... I can't overclock because I'm using the stock cooler on my reference R9 290.


----------



## blak24

Quote:


> Originally Posted by *The Storm*
> 
> I had this issue on my 7950s. I just went to MSI Afterburner, slid the power adjustment to +20%, and my cards would go up and stay up. Hope this helps.


Even with the power limit at +50% I have the same problem...


----------



## psyside

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> If you overvolt, the stock cooling can get pretty loud honestly. At 1200/6000 @ 1.28 V and 70% fan my core would be about 70-75°C, VRMs under 60°C.


So as I understand it, AMD is using fantastic VRMs on these cards? With that huge OC and voltage, on the poor stock cooler, you get under 60°C on the VRMs? That's epic.

Do others also see these VRM temps with fans at ~70%? I heard there was an issue and modding was required to remove some "glue"; did you do that?

Does the card throttle if you use 60% fan at, let's say, 1150/6000? At stock or slightly increased voltage?

If these temps are on reference cards without any mods, that's epic. I wonder if 55-60% fan is too loud.


----------



## Darklyric

Bah, so much BS. Now XFX 290s are back in stock... calling Newegg right now.

But...


----------



## The Storm

Ok, so I've been looking for a bit and I'm not finding a definite answer: which EK bridge is the correct one to link two 290s together? I have the http://www.performance-pcs.com/catalog/index.php?main_page=product_info&cPath=59_971_1018_1038_1207&products_id=30021 (parallel version) left over from my old 7950s, but I don't think it works on these. Does anyone know, or could someone link me some photos?


----------



## VSG

You need the EK terminal connection, not EK link/bridge. Get a dual card parallel terminal in double/triple slot config based on your motherboard.


----------



## killerhz

While I will take some time to read through all 928 pages of this thread... lol.

What's the deal with these cards? I'm a long-time Nvidia guy looking to upgrade my whole system, so naturally I'm starting with the graphics card. Is this card better than the 780? The cost of the 290X seems pretty sexy.


----------



## stickg1

The card performs better than the 780 in multiple situations. AMD is still ironing out some driver issues. You will spend less money on the card and get more performance, but similar to AMD vs Nvidia last generation, you may need to do some tweaking to get everything running smoothly.


----------



## RAFFY

Quote:


> Originally Posted by *geggeg*
> 
> You need the EK terminal connection, not EK link/bridge. Get a dual card parallel terminal in double/triple slot config based on your motherboard.


+1
Quote:


> Originally Posted by *killerhz*
> 
> while i will take some time to read through all 928 pages of this thread... lol
> 
> what's the deal with this cards? i am currently a long time nvidia dude and well, looking to upgrade my whole system so normally i start with a graphics card. is this card better than the 780? it seems like the cost of the 290x is pretty sexy.


The deal is that the 290X is simply awesome. It scales great in CrossFire and performs like a champion. Right now (correct me if I'm wrong) the card beats the 780 and is on par with the 780 Ti, and that's while it's still on beta drivers and pre-Mantle. Mantle is a new API that should substantially increase frame rates in supported games like BF4 for AMD users.


----------



## killerhz

Quote:


> Originally Posted by *stickg1*
> 
> The card performs better than the 780 in multiple situations. AMD is still ironing out some driver issues. You will spend less money on the card and get more performance, but similar to AMD vs Nvidia last generation, you may need to do some tweaking to get everything running smoothly.


Quote:


> Originally Posted by *RAFFY*
> 
> +1
> The deal is that the 290x is simply awesome. It scales great in crossfire and performs like a champion. Right now (correct me if I'm wrong) the card beats the 780 and is on par with the 780ti. This is while the card is still on beta drivers and pre-mantle release. Mantle is a new API that should exponentially increase frame rates in supported games like BF4 for AMD users.


Thanks. Not big into BF4 at the moment. AMD graphics and I haven't gotten along in the past; nothing to do with performance, rather the picture itself.

Thanks for the info. Seems like this time around, for the price of one I can get two, and well, two is better than one...

cheers


----------



## stickg1

Not sure I follow on that last line. Care to elaborate? One of what and two of what?


----------



## flopper

Quote:


> Originally Posted by *the9quad*
> 
> I honestly wanted to try 120hz gaming so I chose one monitor. I think down the line I might get two more and go back to 60hz gaming, I just honestly don't know if 3 of the 290x's can push consistently 60fps at ultra with three 1440p monitors. I'm thinking they can't.


120 Hz Eyefinity here.
Can't do 60 Hz anymore; it just sucks.
I don't imagine moving to 4K either, as 120 Hz will be impossible to do there.
I'm happy with a single 290 for my setup; I don't fancy Ultra in games, even though many games can be maxed out.


----------



## jerrolds

Quote:


> Originally Posted by *skupples*
> 
> It could be multiple things. Mineware/high resolution/120hz+ monitor/hardware accelerated programs like G-chrome & the like.


I have everything above (except mining malware) and my 290X downclocks just fine; the memory stays at 6000 MHz though. This is on Windows 8, haven't upgraded to 8.1 yet.


----------



## Darklyric

Well my xfx 290s just shipped. Wish me luck!


----------



## Jpmboy

Quote:


> Originally Posted by *killerhz*
> 
> while i will take some time to read through all 928 pages of this thread... lol
> 
> what's the deal with this cards? i am currently a long time nvidia dude and well, looking to upgrade my whole system so normally i start with a graphics card. is this card better than the 780? it seems like the cost of the 290x is pretty sexy.


Quote:


> Originally Posted by *stickg1*
> 
> The card performs better than the 780 in multiple situations. AMD is still ironing out some driver issues. You will spend less money on the card and get more performance, but similar to AMD vs Nvidia last generation, you may need to do some tweaking to get everything running smoothly.


Just to break the myth: a reference 780 is cheaper than a reference 290X. I have both... and a pair of Titans. Way too early to say the 290X is a faster card. It's a good card, sure, but not better than the (now cheaper) 780.


----------



## Jaju123

I ordered two 290s and they were shipped today. Should be in my excited hands by tomorrow (Netherlands).

Any idea how the performance in crossfire will be affected by PCI-e 2.0 x8?

I am coming from two GTX 680s, on an MSI Z68A-GD65 (G3) but with a 2500K @ 4.7 GHz.

Hopefully my Corsair AX 850W will be enough! AMD said on Twitter that it would be.


----------



## stickg1

I was thinking of the 290 @ $400. I guess I misread the original question.


----------



## Arizonian

Quick reminder, everyone - all mining discussion must take place in the *Distributed Computing* section for those interested.


----------



## The Storm

Quote:


> Originally Posted by *Darklyric*
> 
> Well my xfx 290s just shipped. Wish me luck!


Good luck! My Sapphire 290s have been in packaging since Monday. I'm growing restless already.


----------



## Jack Mac

Use Amazon, thank me later.


----------



## ImJJames

Will receive my 290 today; I keep going outside every time I hear a truck.


----------



## Jpmboy

Quote:


> Originally Posted by *Jaju123*
> 
> I ordered two 290s and they were shipped today. Should be in my excited hands by tomorrow (Netherlands).
> Any idea how the performance in crossfire will be affected by PCI-e 2.0 x8?
> I am coming from two GTX 680s, on a MSI Z68a-GD65(g3) but with a 2500k @ 4.7 ghz.
> Hopefully my Corsair AX 850W will be enough! AMD said it was on twitter.


I think you can inform us!

Concurrent (bidirectional) bandwidth at PCIe 2.0 x16 is roughly 16 GB/s; 3.0 x16 is about 31.5 GB/s; 2.0 x8 is about 8 GB/s. With no CFX bridge, the effect of PCIe bandwidth will be interesting to see. I don't think it will slow down your throughput until x4, or if you go big res.
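For reference, the nominal PCIe numbers fall straight out of the per-lane rates; a quick sketch (spec figures per direction, so real-world throughput will be lower):

```python
# Nominal per-direction PCIe bandwidth per lane, after encoding overhead:
# Gen 2 runs 5 GT/s with 8b/10b encoding  -> ~500 MB/s per lane;
# Gen 3 runs 8 GT/s with 128b/130b encoding -> ~985 MB/s per lane.
PER_LANE_MBS = {"2.0": 500, "3.0": 985}

def link_bandwidth_mbs(gen, lanes):
    """Nominal one-direction bandwidth of a PCIe link in MB/s."""
    return PER_LANE_MBS[gen] * lanes

print(link_bandwidth_mbs("2.0", 16))  # 8000 MB/s each direction
print(link_bandwidth_mbs("3.0", 16))  # 15760 MB/s each direction
print(link_bandwidth_mbs("2.0", 8))   # 4000 MB/s each direction
```

Double those figures for the concurrent (both-directions) totals, which is what matters for bridgeless CrossFire traffic going over the bus.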









If you OC those cards, 850W is probably not enough.
Quote:


> Originally Posted by *stickg1*
> 
> I was thinking of the 290 @ $400. I guess I misread the original question.


The 290 is a great buy!


----------



## Assasin

Quote:


> Originally Posted by *Jpmboy*
> 
> Just to break the myth. A reference 780 is cheaper than a reference 290x. I have both... And a pair of titans. Way too early to say that the 290x is a faster card. It is a good card, sure. But not better than the (now cheaper) 780.


Clock for clock, an R9 290X matches a 780 Ti while beating a regular GTX 780 quite handily. Even an R9 290 will outperform a normal GTX 780 clock for clock. It takes a much higher-clocked GTX 780 to compare to a lower-clocked R9 290X. This is not a myth; it is a fact.


----------



## Jpmboy

Quote:


> Originally Posted by *Assasin*
> 
> Clock for clock, an R9 290X matches a 780 Ti while beating a regular GTX 780 quite handily. Even an R9 290 will outperform a normal GTX 780 clock for clock. It takes a much higher-clocked GTX 780 to compare to a lower-clocked R9 290X. This is not a myth; it is a fact.


Show me the data


----------



## Assasin

Quote:


> Originally Posted by *Jpmboy*
> 
> Show me the data


Show you the data? Really? Benchmarks are all over the web; I'm sure you know how to navigate it, right?


----------



## VSG

Newegg has finally shipped out those Sapphire 290x BF4 cards that were on sale. I should have my second Sapphire 290x by Friday.


----------



## ImJJames

Quote:


> Originally Posted by *Jpmboy*
> 
> Show me the data


Use Google.


----------



## Jpmboy

Quote:


> Originally Posted by *ImJJames*
> 
> Use google.


Guys, I've had a 290X since launch. The benchmarks "on the web" are trash... Stock clocks? Compare stock clocks? C'mon.

And you can't compare "clock for clock" anyway, because they are different architectures. Secondly, you do know that the press received pumped-up BIOSes, and retail cards have been shown not to measure up to the press samples, right?

I think *you* should look at the benches right here on OCN. Inconclusive: with the exception of a few guys like tsm and jomamma, we're not dominating the leaderboards.


----------



## stickg1

Yeah, there were a few games that I play, plus the compute performance in Folding@home, where I see the 290 go blow for blow and sometimes exceed the 780's capabilities at a lower price point, which is why I went AMD this round. That said, I love Nvidia, I love AMD, I love Intel. I just love hardware in general. There's almost always a situation where one brand will beat another.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *stickg1*
> 
> Yeah, there were a few games that I play, plus the compute performance in Folding@home, where I see the 290 go blow for blow and sometimes exceed the 780's capabilities at a lower price point, which is why I went AMD. That said, I love Nvidia, I love AMD, I love Intel. I just love hardware in general.


This. I'm loyal to price/performance ratio, not brand names.


----------



## Assasin

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> This. I'm loyal to price/performance ratio, not brand names.


+1


----------



## RAFFY

Quote:


> Originally Posted by *stickg1*
> 
> Yeah, there were a few games that I play, plus the compute performance in Folding@home, where I see the 290 go blow for blow and sometimes exceed the 780's capabilities at a lower price point, which is why I went AMD this round. That said, I love Nvidia, I love AMD, I love Intel. I just love hardware in general. There's almost always a situation where one brand will beat another.


Totally agree! I think this time around AMD hit a home run with the 290X. I am really stoked to see what the future brings for these cards, and also what kind of scaling I'll get in Tri-Fire under water.


----------



## Darklyric

Quote:


> Originally Posted by *ImJJames*
> 
> Use google.


What's Google?

Some sort of waterfowl?
Quote:


> Originally Posted by *The Storm*
> 
> Good luck, my sapphire 290's have been in packaging since monday. I'm growing restless already.


I hate to say it, but they are somehow still packing them... Oh well, that saved me from the mistake of buying HIS, lol.


----------



## brazilianloser

Yeah, Newegg took two days to ship mine, so after some complaining (and since it was their fault) they upgraded my shipping to two-day; my 290 is finally arriving tomorrow.


----------



## Arizonian

I've found the 290X cannot reach the same overclocks on air for me. Clock vs. clock is a bit irrelevant, since the only thing that matters is the best OC either GPU can obtain and the resulting performance.

As an example, my 290X could only obtain 1100 core / 1300 memory gaming stable, where my 780 Ti ACX gets 1237 boost / 1800 memory stable. In my case the 780 Ti has been slightly better than the 290X in every circumstance, gaming or benching on air.

It boils down to what's important to the end user: raw performance or price/performance ratio. On performance, the 780 Ti won for me in all the games I play. On price/performance, the 290X easily wins.

Having said this, I realize all overclocks vary; with the luck of the draw I could easily have had a better-overclocking 290X and a poor-overclocking 780 Ti, and gotten better end results with the 290X.

So is the $150 price difference between the two worth the difference in performance? That's what each person should ask themselves, and the answer will vary with where they place importance.

*Firestrike Extreme*

14 Arizonian 3770K GTX 780 Ti *5526* http://www.3dmark.com/3dm/1760139
15 Arizonian 3770K R290x *5414* http://www.3dmark.com/3dm/1488266

*3DMark 11 Performance*

I'm not officially done with this one due to lack of time. I feel I can get more memory OC out of the 780 Ti, but even so, as you can see, it's only going to be a small difference even if I beat my 290X score.

Arizonian i7-3770K R9 290X *P14894* http://www.3dmark.com/3dm11/7375771
Arizonian i7-3770K GTX 780 Ti *P14842* http://www.3dmark.com/3dm11/7583602

So was the price difference worth it to me?

290X @ $570 - $50 game bundle = *$520*
780Ti ACX @ $730 - $150 game bundle - $100 Shield = *$480*

If I hadn't used the Shield discount it would have been $580 vs. $520 for a small difference in performance. This was how I based my decision in the end. However, I feel that when non-reference 290X cards come out, topped with Mantle, I'm going to regret it, as I only hold gaming as a priority. Then I'll compare price/performance between a non-ref 290X and my 780 Ti with its aftermarket cooler, and may have a different opinion.
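The effective-price arithmetic above is trivial to check; a throwaway sketch (the helper name is mine; the dollar figures are the ones quoted in this post):

```python
# Effective price after subtracting bundle/promo credits.
def effective_price(msrp, credits):
    return msrp - sum(credits)

r9_290x = effective_price(570, [50])          # $50 game bundle
gtx_780ti = effective_price(730, [150, 100])  # $150 bundle + $100 Shield promo

print(r9_290x, gtx_780ti)  # 520 480
```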


----------



## Jpmboy

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> This. I'm loyal to price/performance ratio, not brand names.


Weighted to performance!


----------



## Jpmboy

Quote:


> Originally Posted by *Arizonian*
> 
> I've found the 290X cannot reach same over clocks on air for me. Clock vs.clock is a bit irrelevant since the only thing that matters is the best OC either GPU can obtain and performance achieved as end result.
> 
> As example; my 290X could only obtain 1100 Core 1300 Memory gaming stable where my 780Ti ACX gets 1237 Boost 1800 Memory stable. The 780Ti is slightly better than 290X in all my circumstances gaming or benching on air in comparison I've found.
> 
> It boils down to what's important to the end user with true performance or price / performance ratio between the two. Performance the 780Ti won for me in all games I play. Price / performance ratio the 290X easily wins. So the main question - is the price difference justified and that will be subjective to each individuals opinion on where they place importance.
> 
> Having said this I realize all over clocks vary and I could have easily had a better over clocking 290X and a poor over clocking 780Ti with luck of the draw and got better end results with 290X over 780Ti.
> 
> Is the $150 price difference between the two worth the difference in performance is what each person should ask themselves and that answer will vary where they place importance.
> 
> *Firestrike Extreme*
> 
> 14 Arizonian 3770K GTX 780 Ti *5526* http://www.3dmark.com/3dm/1760139
> 15 Arizonian 3770K R290x *5414* http://www.3dmark.com/3dm/1488266
> 
> *3DMark 11 Performance*
> 
> I'm not officially done with this one due to lack of time. I feel I can get more memory OC on 780Ti but even so as you can see it's a only going to be a small difference even if I beat it my 29X score.
> 
> Arizonian i7-3770K R9 290X *P14894* http://www.3dmark.com/3dm11/7375771
> Arizonian i7-3770K GTX 780 Ti *P14842* http://www.3dmark.com/3dm11/7583602
> 
> So was the price difference worth it to me?
> 
> 290X @ $570 - $50 game bundle = *$520*
> 780Ti ACX @ $730 - $150 game bundle - $100 shield = *$480*
> 
> If I didn't use the Shiled discount then it would have been $580 vs $520 for small difference in performance. This was how I based my decision in the end. However I feel that when 290X non-reference cards come out topped with mantle I'm going to be regretting it as I only hold gaming as priority. Then I will look at price / performance with non-ref 290X vs my ref aftermarket cooling 780Ti and may have a different opinion.


Thank you for the evidence-based comparison! I would +2 if it were possible.


----------



## PwrElec

What's up with the Asus 290X... there's a sticker on it!!


----------



## Jpmboy

Quote:


> Originally Posted by *ImJJames*
> 
> Use google.


Didn't know about that google thing...is that how you know so much?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Arizonian*
> 
> I've found the 290X cannot reach same over clocks on air for me. Clock vs.clock is a bit irrelevant since the only thing that matters is the best OC either GPU can obtain and performance achieved as end result.
> 
> As example; my 290X could only obtain 1100 Core 1300 Memory gaming stable where my 780Ti ACX gets 1237 Boost 1800 Memory stable. The 780Ti is slightly better than 290X in all my circumstances gaming or benching on air in comparison I've found.
> 
> It boils down to what's important to the end user with true performance or price / performance ratio between the two. Performance the 780Ti won for me in all games I play. Price / performance ratio the 290X easily wins. So the main question - is the price difference justified and that will be subjective to each individuals opinion on where they place importance.
> 
> Having said this I realize all over clocks vary and I could have easily had a better over clocking 290X and a poor over clocking 780Ti with luck of the draw and got better end results with 290X over 780Ti.
> 
> Is the $150 price difference between the two worth the difference in performance is what each person should ask themselves and that answer will vary where they place importance.
> 
> *Firestrike Extreme*
> 
> 14 Arizonian 3770K GTX 780 Ti *5526* http://www.3dmark.com/3dm/1760139
> 15 Arizonian 3770K R290x *5414* http://www.3dmark.com/3dm/1488266
> 
> *3DMark 11 Performance*
> 
> I'm not officially done with this one due to lack of time. I feel I can get more memory OC on 780Ti but even so as you can see it's a only going to be a small difference even if I beat it my 29X score.
> 
> Arizonian i7-3770K R9 290X *P14894* http://www.3dmark.com/3dm11/7375771
> Arizonian i7-3770K GTX 780 Ti *P14842* http://www.3dmark.com/3dm11/7583602
> 
> So was the price difference worth it to me?
> 
> 290X @ $570 - $50 game bundle = *$520*
> 780Ti ACX @ $730 - $150 game bundle - $100 shield = *$480*
> 
> If I didn't use the Shiled discount then it would have been $580 vs $520 for small difference in performance. This was how I based my decision in the end. However I feel that when 290X non-reference cards come out topped with mantle I'm going to be regretting it as I only hold gaming as priority. Then I will look at price / performance with non-ref 290X vs my ref aftermarket cooling 780Ti and may have a different opinion.


Interesting. Your 780 Ti @ 1237/1800 scores lower in 3DMark11 Performance than my 290 (non-X) at 1200/1625. My card gets around 174xx graphics points.
All the benches I've seen say the 780 Ti should be faster than a 290, especially considering your higher clock speed.

Edit: Here's the link. http://www.3dmark.com/3dm11/7540268 No tweaks were used.
Also, our rigs both use a lot of the same/similar parts.


----------



## PwrElec

Does anyone know about Asus's warranty policy towards water cooling?

Please help. <3


----------



## Darklyric

Quote:


> Originally Posted by *PwrElec*
> 
> anyone knows about asus warranty policies towards water cooling?
> 
> please help. <3


I know that if you spill water on their motherboard it's a no-go. If they find out, that is.


----------



## Arizonian

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Interesting. Your 780ti @ 1237/1800 scores lower in 3dMark11 performance than my 290 (non x) at 1200/1625. My card gets around 174xx graphics points.
> All the benches I've seen have said the 780ti should be faster than a 290, especially considering your higher clockspeed.


I see your system is similar to mine, apart from your 4.5 GHz vs. my 4.6 GHz OC on the i7-3770K, except you're using Gelid air cooling... you shouldn't be able to out-score me by that much, but there is a huge difference, and that leaves me wondering.









I didn't see your name in the *Top 30 3DMark11 Scores for Single/Dual/Tri/Quad*, however, and can't confirm whether you had tessellation ON or OFF. Please post a link like I did as proof; it will carry more weight in this discussion, as I had tessellation ON for both benches.


----------



## OverSightX

I have my Sapphire BF4 290 at home. I got the last one in stock a few days ago, so no CF until they get more. Got the card yesterday and the block last week. I will be joining once I get the pics up tonight!

Here's to hoping it unlocks!


----------



## Darklyric

Quote:


> Originally Posted by *Arizonian*
> 
> I've found the 290X cannot reach same over clocks on air for me. Clock vs.clock is a bit irrelevant since the only thing that matters is the best OC either GPU can obtain and performance achieved as end result.


It's still important to know that per clock it's better, IMO, but relevant to overclocking? No.


----------



## PwrElec

Quote:


> Originally Posted by *Darklyric*
> 
> I know that if you spill water on their motherboard its a no go. If they find out =
> 
> 
> 
> 
> 
> 
> 
> .


Hehe.

Well, I just bought a bunch of 290Xs and I see they have a sticker on one of the backplate screws.


----------



## Darklyric

Quote:


> Originally Posted by *PwrElec*
> 
> hehe
> 
> 
> 
> 
> 
> 
> 
> 
> well, i just bought a bunch of 290X's and I see they have a sticker on one of the backplate screws.


XFX? If you live in the continental US, and maybe the rest of North America, you can safely remove those without voiding the warranty.

At least that was the case on my XFX 7870 DD card, as I had to get that cooler off of there as soon as possible.


----------



## PwrElec

Quote:


> Originally Posted by *Darklyric*
> 
> xfx? If you live in the cont us, and maybe the rest of north america, you can safely remove those without voiding warranty.


No...Asus


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Arizonian*
> 
> I see your system is similar to mine minus 4.5 Ghz vs 4.6 Ghz OC on the i7 3770K except your using Gelid air cooling....you should be able to score better OC but there is a huge difference and that leaves me wondering.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I didn't see your name however in the *Top 30 3DMark11 Scores for Single/Dual/Tri/Quad* and can't confirm if you had tessellation ON or OFF? Please post link like I did as proof as it will carry more weight in this discussion as I had tessellation ON for both benches.


I currently can't submit anything with my name on it, as my rig is without internet access. I would gladly re-run and submit my results, but I can't.

I did in fact have tess on, and no other driver tweaks or anything like that. I really can't prove it right now, so until I get around to wiring an Ethernet cable to my room (too cheap to buy a wireless card) you'll just have to take my word on it.

I don't know why it still says the driver was not approved, and if you check the 3DMark link I edited in, it shows incorrect clocks. I'm guessing that has to do with using beta drivers and flashing to a 290X ROM.

For clarification, my card is not unlocked; it's on an ASUS ROM for extra voltage.

http://www.3dmark.com/3dm11/7540268 (I edited it into the last post; perhaps you didn't see that.)

I'm exactly 1 point away from the top 30... lol, now I have a goal.


----------



## Arizonian

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I currently can't submit anything with my name on it as my rig is currently without internet access. I would gladly re-run and submit my results but I can't.
> 
> I did in fact have tess on, and no other driver tweaks or anything like that. I really can't prove it right now, so until I get around to wiring and Ethernet cable to my room (too cheap to buy a wireless card) you'll just have to take my word on it.
> 
> I don't know why it still says the driver was not approved, and if you check the 3dmark link I edited in it shows incorrect clocks. I'm just guessing that has to do with using beta drivers and flashing to a 290x ROM.
> 
> For clarification, my card is not unlocked. It's on an ASUS ROM for extra voltage.


Well, the driver not being approved is because it's a beta; mine said the same thing when submitting to 3DMark11. It still shows whether tess was on or off.

So your 290X at 1200/1625 vs. mine at 1150/1350 is +2000 on the 3DMark11 score.

I only ask because I might be doing something wrong. It may be my CPU overclock in the UEFI settings that's holding me back; I'm not too proficient with overclocking in that regard. I'd like to figure out how to score better, as I'm holding myself back.


----------



## chiknnwatrmln

My card is a 290, not a 290X. Also, I edited the link into my last post, again after you quoted it, haha.


----------



## Darklyric

Quote:


> Originally Posted by *PwrElec*
> 
> No...Asus


Worth a call, I'd say. GL.


----------



## PwrElec

Quote:


> Originally Posted by *Darklyric*
> 
> worth a call I'd say. Gl


I will ask at ROG... thanks <3


----------



## Darklyric

Quote:


> Originally Posted by *PwrElec*
> 
> i will ask at rog...thanks <3


Yes, that is way better than calling them. There are usually guys from their support on there, if I remember right.


----------



## Arizonian

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> My card is a 290, not a 290x. Also I edited in the link in my last post, again after you quoted it haha.


Oh, that makes better sense:

290 / *15115* - 1200 / 1625 Gelid Cooling

290X / *14894* - 1150 / 1350 Reference Cooling

Only 221 points difference.


----------



## RAFFY

Quote:


> Originally Posted by *PwrElec*
> 
> i will ask at rog...thanks <3


Post back or PM me what they say, please. I have 3 ASUS R9 290Xs that are getting naked in the next couple of weeks.


----------



## Assasin

Quote:


> Originally Posted by *Arizonian*
> 
> I've found the 290X cannot reach same over clocks on air for me. Clock vs.clock is a bit irrelevant since the only thing that matters is the best OC either GPU can obtain and performance achieved as end result.
> 
> As example; my 290X could only obtain 1100 Core 1300 Memory gaming stable where my 780Ti ACX gets 1237 Boost 1800 Memory stable. The 780Ti is slightly better than 290X in all my circumstances gaming or benching on air in comparison I've found.


That's a shame and certainly in the minority; most will do 1100 MHz core without even having to touch the voltage. But, as you said, overclocking varies from card to card to a certain degree. It looks like you just got unlucky on the low end of the overclock spectrum.


----------



## Slomo4shO

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I'm loyal to price/performance ratio, not brand names.










Agreed!


----------



## Assasin

Quote:


> Originally Posted by *Jpmboy*
> 
> Guys, I've had 290x since launch. The benchmarks "on the web" are trash... Stock clocks? Compare stock clocks? Come'on.


And why are they trash? Because you said so?... Or because the results are in ATI's favor in spite of the cards throttling?









Quote:


> And you cant compare "clock for clock" anyway, cause they are different architectures.


I realize that they are different architectures, but clock for clock is still a relevant argument, especially since it takes a higher-clocked GTX 780 to compete with a lower-clocked R9 290 or 290X. And when we are talking about the GTX 780 Ti, clock for clock it's very close to the R9 290X despite the architectural differences.
Quote:


> Secondly, you do know that the press received pumped up bios' and have shown that the retail cards are not measuring up the same as their press samples. Right?


That myth has been debunked already; both BIOSes perform the same in Uber mode. The reference you make was in regards to the Quiet mode BIOS.


----------



## Assasin

Quote:


> Originally Posted by *RAFFY*
> 
> Right now (correct me if I'm wrong) the card beats the 780 and is on par with the 780ti. This is while the card is still on beta drivers and pre-mantle release. Mantle is a new API that should exponentially increase frame rates in supported games like BF4 for AMD users.


This is correct, the R9 290X is on par with the 780 Ti in a clock-for-clock comparison. Where the 780 Ti can pull away some is when you compare both cards at max overclocks, because the more expensive 3GB 780 Ti can overclock higher than an R9 290X.


----------



## Slomo4shO

Quote:


> Originally Posted by *Arizonian*
> 
> As example; my 290X could only obtain 1100 Core 1300 Memory gaming stable where my 780Ti ACX gets 1237 Boost 1800 Memory stable. The 780Ti is slightly better than 290X in all my circumstances gaming or benching on air in comparison I've found.


To be fair, you are comparing a card with a poor reference cooler to a card with an aftermarket cooler...


----------



## Arizonian

Quote:


> Originally Posted by *Slomo4shO*
> 
> To be fair you are comparing a card with a poor reference cooler to a card with a after market cooler...


I agree looking at the scores.









Price/performance wise, with the deals I took advantage of it was hard to pass up when in the end I paid less.

EDIT: my 290X was at 100% fan and my 780Ti at 87% with those results posted.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Arizonian*
> 
> Oh that makes better sense:
> 
> 290 / *15115* - 1200 / 1625 Gelid Cooling
> 
> 290X / *14894* - 1150 / 1350 Reference Cooling
> 
> Only 221 points difference.


Yeah I'm sure if I had a 290x or if my card unlocked there would be a bigger gap.

The part that seems more interesting to me is that my card beat a 780ti at higher clocks. I'll have to look up some more benches of oc'ed 780ti's and compare.

Hopefully once I get this card under water I'll be able to push 1225+ MHz.


----------



## iamhollywood5

Quick question - what is the stock voltage on 290Xs? Mine is 1.25v, but do all 290Xs across the board have 1.25v stock? Or does it scale with ASIC like the 7900 series?


----------



## Jpmboy

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Yeah I'm sure if I had a 290x or if my card unlocked there would be a bigger gap.
> 
> The part that seems more interesting to me is that my card beat a 780ti at higher clocks. I'll have to look up some more benches of oc'ed 780ti's and compare.
> 
> Hopefully once I get this card under water I'll be able to push 1225+ MHz.


http://www.overclock.net/t/1443196/firestrike-extreme-top-30

http://www.overclock.net/t/1406832/single-gpu-firestrike-top-30

http://www.3dmark.com/hall-of-fame-2/3dmark+11+3dmark+score+performance+preset/version+1.0.5/2+gpu


----------



## Slomo4shO

Quote:


> Originally Posted by *Arizonian*
> 
> I agree looking at the scores.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Price / Performance wise for my taking advantage of deals it was hard to pass up when in the end I paid less.
> 
> EDIT _ my 290X was 100% fan and my 780Ti 87% with those results posted.


I am sure that the ACX cooler at 87% still provides much greater cooling than the reference cooler at 100%. I was just suggesting that aftermarket OC would be a better comparison to your Ti once they are released.

As far as the price/performance, it is going to be difficult to beat my 290s that unlocked to 290X which cost $320 each and included BF4









Lastly, can you update me in the OP to 4 Sapphire 290s? They will be going under water as soon as I receive the blocks (backordered to the end of the month). Also, you should include an unlocked-to-290X column.


----------



## Banedox

Quote:


> Originally Posted by *iamhollywood5*
> 
> Quick question - what is the stock voltage on 290Xs? Mine is 1.25v, but do all 290Xs across the board have 1.25v stock? Or does it scale with ASIC like the 7900 series?


Mine is stock at 1.25; granted, mine is a 290 unlock, but I can't overclock like at all.


----------



## HOMECINEMA-PC

My first Radeon card











HOMECINEMA-PC

http://www.techpowerup.com/gpuz/996br/

Sapphire R9 290

Reference cooler

Please add me to the club list.


----------



## PwrElec

Quote:


> Originally Posted by *RAFFY*
> 
> Post back or PM me what they say please. I have 3 ASUS R9 290x that are getting naked in the couple of weeks.


Roger that


----------



## the9quad

Here is 100% stock clocks and cooling:



Firestrike: (21,852)
http://www.3dmark.com/fs/1247818


Firestrike Extreme: (12,588)
http://www.3dmark.com/fs/1243348


----------



## RAFFY

Quote:


> Originally Posted by *iamhollywood5*
> 
> Quick question - what is the stock voltage on 290Xs? Mine is 1.25v, but do all 290Xs across the board have 1.25v stock? Or does it scale with ASIC like the 7900 series?


1.25V is the stock voltage for these cards, and ASIC doesn't mean a thing with these.

Quote:


> Originally Posted by *the9quad*
> 
> Here is 100% stock clocks and cooling:
> 
> 
> 
> Firestrike: (21,852)
> http://www.3dmark.com/fs/1247818
> 
> 
> Firestrike Extreme: (12,588)
> http://www.3dmark.com/fs/1243348


Looking good! I get my third card and new PSUs this Friday!


----------



## iamhollywood5

Quote:


> Originally Posted by *Banedox*
> 
> Move is stock at 1.25 granted mind is a 290 unlock but I can overclock like at all


What's the max stable OC you can get? Not being able to get 1200MHz is not exactly something to complain about, lol


----------



## Fahrenheit85

Gigabyte 290x
Stock Cooling (for now)
Time to play some games


----------



## Banedox

Quote:


> Originally Posted by *iamhollywood5*
> 
> What's the max stable OC you can get? Not being able to get 1200Mhz is not exactly something to complain about, lol


Well, as of right now nothing over stock is stable....

Can an unstable CPU mess with a GPU overclock?


----------



## Banedox

Also, is 1.5V safe for these cards?


----------



## HardwareDecoder

Heh, I paid $500 for a 290X and I didn't even research beforehand. Are any of these voltage unlocked, or are they all voltage locked?

Mine is gonna be the Sapphire 290X from the Newegg CM sale.

I'm hoping for a massive overclock since I bought a legit 290X.

If not I'll just have to say it black screens and RMA it for another chance at a good overclocker. I feel I deserve it for paying the premium while all you guys are stealing 290Xs (kidding about the stealing part, not about deserving a good clocker for paying the premium)


----------



## Jack Mac

I wouldn't try it, and I highly doubt it would actually run at 1.5V anyway; these cards have a pretty large amount of vdroop.


----------



## staryoshi

Quote:


> Originally Posted by *Darklyric*
> 
> Its still important to know that per clock its better imo, but relevant to overclocking, no.


As far as I'm concerned, clock-for-clock comparisons have NO relevance when comparing different architectures. Such comparisons within architectures can sometimes be useful, though.

Also, I hope to be joining the club soon - once the non-reference R9 290s are released. (DirectCU II)


----------



## Jack Mac

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Heh I paid 500$ for a 290x and I didn't even research before hand, are any of these voltage unlocked or are they all voltage locked?
> 
> mine is gonna be the sapphire 290x from newegg CM sale.
> 
> I'm hoping for a massive overclock since I bought a legit 290x.
> 
> If not i'll just have to say it black screens and RMA it for another chance at a good overclocker, I feel I deserve it in paying the premium while all you guys are stealing 290x's (kidding about the stealing part, not about deserving a good clocker for paying the premium)


It's voltage unlocked to +100mV in the latest Afterburner beta, but if you flash to an ASUS BIOS you can get a bit more voltage out of it. I wouldn't go in expecting massive OCs though; it's still all about luck.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Jack Mac*
> 
> It's voltage unlocked to +100MV in the latest Afterburner beta, but if you flash to an ASUS bios you can get a bit more voltage out of it, I wouldn't go in expecting massive OCs though, it's still all about luck.


How much more voltage, do you know? I'll have to look up the max safe voltage for these too.

Hopefully I'll get a really nice clocker; being under water should help, since I've heard the higher the temps, the worse the OC you get on these.

Yeah, I need to do some research before I get mine; I don't know the max safe voltage or temps. Well, it doesn't really matter, since mine is likely coming Friday or Saturday and I won't be able to put it under water for another week or two until my block comes in from FrozenCPU. I guess I won't be pushing it very hard until then, since these are so loud and hot on stock coolers.


----------



## prostreetcamaro

Quote:


> Originally Posted by *Arizonian*
> 
> Oh that makes better sense:
> 
> 290 / *15115* - 1200 / 1625 Gelid Cooling
> 
> 290X / *14894* - 1150 / 1350 Reference Cooling
> 
> Only 221 points difference.


Man, 14807 is all I can muster out of mine at 1180/5500. Which drivers are you guys using? I think the biggest thing holding my score back a little is my 2600K at 4.5GHz. I might have to go ahead and throw some voltage back into her, bring it back up to 5.2, and make a balls-to-the-wall run to see what I can get with my flashed 290X.

http://www.3dmark.com/3dm11/7608306


----------



## Jack Mac

Quote:


> Originally Posted by *HardwareDecoder*
> 
> How much more voltage do you know? I'll have to look up the max safe voltage for these too.
> 
> Hopefully i'll get a really nice clocker, being under water should help since i've heard the higher the temps the worse OC you get on these.
> 
> Yeah I need to do some research before I get mine, I don't know the max safe voltage or temps. well doesn't really matter since mine is likely coming friday or saturday and I won't be able to put it under water for another week or two until my block comes in from frozencpu. I guess I won't be pushing it very hard until then since these are so loud and hot on stock coolers.


Not sure, I haven't flashed to the ASUS BIOS, but I believe the limit for their GPU Tweak is 1412mV, which with vdroop comes out to around 1.35V. My 290 with +100mV in Afterburner runs at around 1.25V, and occasionally gets up to 1.3V.
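For anyone wanting to sanity-check those numbers: the sag from a 1412mV set point down to ~1.35V under load works out to roughly 4-5% droop. A quick illustrative calculation (the helper below is just a sketch of the arithmetic, not part of GPU Tweak or Afterburner):

```python
# Quick vdroop arithmetic for the figures discussed above (1412 mV set,
# ~1350 mV measured under load). Illustrative only.
def vdroop_pct(set_mv: float, measured_mv: float) -> float:
    """Percent the voltage sags under load relative to the set point."""
    return (set_mv - measured_mv) / set_mv * 100.0

print(round(vdroop_pct(1412, 1350), 1))  # -> 4.4 (percent droop)
```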


----------



## darkelixa

I've only asked so many times because I can only afford a GPU upgrade once in the next couple of years, since I have a wedding to pay for; sorry.

I was just trying to help with the 770: if you type '770 GTX stuttering' into Google you will see the many pages of complaints.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Jack Mac*
> 
> Not sure, I haven't flashed to the ASUS BIOS, but I believe the limit for their GPUTweak is 1412MV, with vdroop which comes out to around 1.35V. My 290 with +100Mv in afterburner runs at around 1.25V, and occasionally gets up to 1.3V.


Oh, awesome. Yeah, I'll likely do that since I should have no problem keeping it cool with my loop.

So I'm guessing you can flash the ASUS BIOS to any 290X, since they are all reference at this point?


----------



## Forceman

Quote:


> Originally Posted by *Jack Mac*
> 
> Not sure, I haven't flashed to the ASUS BIOS, but I believe the limit for their GPUTweak is 1412MV, with vdroop which comes out to around 1.35V. My 290 with +100Mv in afterburner runs at around 1.25V, and occasionally gets up to 1.3V.


Yes, that's right, it's 1.412V with GPU Tweak. And I get about the same voltage with AB as you do.


----------



## ImJJames

What kind of wattage are single 290s pulling at 1.4V? Would a 630W PSU suffice for that type of overclock?


----------



## Nopileus

Put an Arctic Accelero Hybrid on my 290 today; it was well worth it, even though Arctic's compatibility claim is a bit off.
Getting 65°C (1180/1385 +100mV) under gaming conditions now with the fan going up to a max of 60%, still much quieter than the reference cooler.

A picture of my humble setup can be seen here.

Here's a validation link in case you want to add me to the list, even though I'm not a regular poster.
It's a Gigabyte card now running an Accelero Hybrid.


----------



## Ukkooh

Just a quick question: could a faulty CPU cause a machine_check_exception BSOD?
I did a fresh install (well, three) of Windows to verify that my black screens and other issues weren't driver related, and started getting these. If I open any YouTube video it hard locks my system and ends up in that BSOD if I wait long enough. Also, when I try to launch CoD4 I get a 0xc0000142 error. How do I troubleshoot this mess? I am completely clueless.
I took an image of my old OS installation, so I guess I'll try that next to see if I can at least watch YouTube videos during my morning coffee.


----------



## ImJJames

Quote:


> Originally Posted by *Ukkooh*
> 
> Just a quick question: Could a faulty cpu cause a machine_check_exception bsod?
> I did a fresh install (well three) of windows to verify that my black screens and other issues weren't driver related and started getting these. If I open any youtube video it hard locks my system and ends up in that bsod if I wait long enough. Also when I try to launch cod4 I get a 0xc0000142 error. How do I troubleshoot this mess? I am completely clueless.
> I took an image of my old os installation so I guess I'll try that next to see if I can atleast watch youtube videos during my morning coffee.


Have you tried stock CPU clocks? Sounds like what happens to my computer when I have an unstable CPU overclock.


----------



## Ukkooh

My CPU has been at stock for ~3 months. I guess I could leave memtest running when I go to school tomorrow.

Edit: I was problem-free until the day I got my 290X.


----------



## Jpmboy

Quote:


> Originally Posted by *staryoshi*
> 
> *As far as I'm concerned, clock-for-clock comparisons have NO relevance when comparing different architectures. Such comparisons within architectures can sometimes be useful, though.*
> Also, I hope to be joining the club soon - once the non-reference R9 290s are released. (DirectCU II)


^^^ This

[CSU Alumni!]


----------



## Jpmboy

Quote:


> Originally Posted by *prostreetcamaro*
> 
> Man 14807 is all I can muster out of mine at 1180/5500. Which drivers you guys using? I think my biggest thing holding my score back a little is my 2600K at 4.5Ghz. I might have to go ahead and throw some voltage back into her and bring it back up to 5.2 and make a balls to the wall run to see what I can get with my flashed 290X.
> http://www.3dmark.com/3dm11/7608306
> 
> 
> Spoiler: Warning: Spoiler!


*Your graphics score is very good!!*


----------



## Jpmboy

http://www.3dmark.com/hall-of-fame-2/3dmark+11+3dmark+score+performance+preset/version+1.0.5/2+gpu

http://www.overclock.net/t/1443196/firestrike-extreme-top-30

http://www.overclock.net/t/1406832/single-gpu-firestrike-top-30/470

http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad

Post some scores! Let's move some AMDs up the lists.


----------



## stickg1

Quote:


> Originally Posted by *Nopileus*
> 
> Put an Arctic Accelero Hybrid on my 290 today, very well worth it even though Arctics compatibility claim is a bit off.
> Getting 65°C (1180/1385 +100mv) under Gaming conditions now with the fan going up to a max of 60%, still much more quiet than the reference cooler.
> 
> A picture of my humble setup can be seen here
> 
> Here's a validation link in case you want to add me to the list, even though i'm not a regular poster.
> Its a Gigabyte card now running an Accelero Hybrid.


Those temps seem a tad high. Are you sure the block is making even and tight contact?

Does anyone else want to share a picture of their 290/290x with an aftermarket cooler on it? I might potentially be shopping for one and want to see some IRL examples. Thanks in advance!


----------



## chiknnwatrmln

Quote:


> Originally Posted by *stickg1*
> 
> Those temps seem a tad high. Are you sure the block is making even and tight contact?
> 
> Does anyone else want to share a picture of their 290/290x with an aftermarket cooler on it? I might potentially be shopping for one and want to see some IRL examples. Thanks in advance!


Check some of the pictures uploaded under my rig.

I have the Gelid cooler.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *prostreetcamaro*
> 
> Man 14807 is all I can muster out of mine at 1180/5500. Which drivers you guys using? I think my biggest thing holding my score back a little is my 2600K at 4.5Ghz. I might have to go ahead and throw some voltage back into her and bring it back up to 5.2 and make a balls to the wall run to see what I can get with my flashed 290X.
> 
> http://www.3dmark.com/3dm11/7608306


Your graphics score is 200 higher than mine, but your physics score is lower.

5.2 would probably bring your score up past 15k.

I'm on the first beta drivers that had 290 support; my rig doesn't have internet access right now, so I can't update.

Edit: Just absent-mindedly double posted, oops.


----------



## Nopileus

Quote:


> Originally Posted by *stickg1*
> 
> Those temps seem a tad high. Are you sure the block is making even and tight contact?
> 
> Does anyone else want to share a picture of their 290/290x with an aftermarket cooler on it? I might potentially be shopping for one and want to see some IRL examples. Thanks in advance!


65°C while overvolted under full load seems high? Please elaborate.

The Hybrid doesn't have the grunt of a full-blown water cooling loop; I knew that going in.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Nopileus*
> 
> 65°C while overvolted under full load seems high? please elaborate
> 
> The hybrid doesn't have to grunt of a full blown water cooling loop, i knew that going in.


What are your VRM temps?


----------



## josephimports

Quote:


> Originally Posted by *Jpmboy*
> 
> *Your graphics score is very good!!*


Yes, it is. Here is a 290 score to compare.

http://www.3dmark.com/3dm11/7608899


----------



## jerrolds

It's easy to cool the core using aftermarket cooling; using the Gelid, mine hits 72°C max (20°C ambient) at [email protected], or ~1.3-1.35V after vdroop. It's the VRMs that are the killer: VRM1 can easily climb to 90-95°C+.

Still on the fence about hacking up the ref cooler to pull the VRM portion out, in case I need to RMA for any reason.


----------



## Nopileus

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> What are your VRM temps?


Hovering around 65-70°C for both clusters under load.

For the long strip I cut off the end of the stock cooler's plate; for the small cluster I just glued on some of the included heatsinks.


----------



## Jpmboy

Quote:


> Originally Posted by *josephimports*
> 
> Yes, it is. Here is a 290 score to compare.
> http://www.3dmark.com/3dm11/7608899


and a 290x score: http://www.3dmark.com/3dm11/7498155

And post your score here: http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad

(read the rules in the OP)


----------



## stickg1

Quote:


> Originally Posted by *Nopileus*
> 
> 65°C while overvolted under full load seems high? please elaborate
> 
> The hybrid doesn't have to grunt of a full blown water cooling loop, i knew that going in.


I guess I have never seen anyone else with that cooler, so it's hard to get a direct comparison. However, I remember my buddy putting that cooler on a 6970 and getting full load temps in the mid 50s. Perhaps it's a case of comparing apples to oranges. I was just wondering if you tried mounting it once, then unmounting to check contact and make sure everything was tight. No disrespect, homie!


----------



## ImJJames

Why do people still use 3DMark 11? 3DMark 2013 is out.


----------



## stickg1

Quote:


> Originally Posted by *ImJJames*
> 
> Why do people still use 3dmark11? 3dmark 2013 is out?


Unless you buy a copy, you have to sit through all those other lame benchmarks and demos. That swayed me away, at least until I got a free copy of 3DMark with my MSI MPower.


----------



## ImJJames

Quote:


> Originally Posted by *stickg1*
> 
> Unless you buy a copy you have to sit through all those other lame benchmarks and demos. That swayed me away at least until I got a free copy of 3DMark with my MSI MPower.


3DMark 2013 was $7 on Steam when I bought it lol


----------



## Nopileus

Quote:


> Originally Posted by *stickg1*
> 
> I guess I have never seen anyone else with that cooler so its hard to get a direct comparison. However, I remember my buddy putting that cooler on a 6970 and getting full load temps in the mid 50's. Perhaps its a case of comparing apples to oranges. I was just wondering if you tried mounting it once, then unmounting to check contact make sure everything was tight. No disrespect homie!


I'm actually not sure either; on one hand it's a drop of 30°C compared to stock, on the other hand reviews show the mid 50s you speak of, but on different cards.
My radiator mounting might also not be optimal; maybe mounting it as an exhaust on the floor of the case is better.

When mounting it I went with shortening the 4mm standoffs to 3mm as suggested by people on several forums; maybe I should try again using the 1.5mm ones for more pressure.
For now it's working nicely; maybe in a while I'll give it another go.


----------



## stickg1

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Check some of the pictures uploaded under my rig.
> 
> I have the Gelid cooler.


What are your load temps like with that cooler?


----------



## ivers

Well, my 290 is on its way. How much should I OC my i7 950 to match this new baby?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *stickg1*
> 
> What are your load temps like with that cooler?


Benching at 1200/6500 @ 1.28V actual gives 63°C core and 90°C/60°C for the VRMs if I run all my fans maxed and bench for 20+ minutes.
Gaming at 1140/6000 @ 1.2V actual gives about 60°C core and 80°C/50°C VRMs.
For less demanding games I run stock and get 52°C core and 55°C/40°C for the VRMs.


----------



## stickg1

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Benching at 1200/6500 @ 1.28v actual gives 63c core, 90c and 60c for VRM's if I run all my fans maxed and bench for 20+ minutes.
> Gaming at 1140/6000 @ 1.2v actual gives about 60c core, 80c and 50c VRM's.
> For less demanding games I run stock, I get 52c core, 55c and 40c for VRM's.


Oh wow, for $55 or so that's some decent cooling power. Did you have to make any modifications? Did you use all the RAM sinks and VRM sinks that came with the Gelid, or did you use something else?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *stickg1*
> 
> Oh wow, for $55 or so that's some decent cooling power. Did you have to make any modifications? Did you use all the RAMsinks and VRMsinks that came with the Gelid or did you use something else?


I bought a heatsink from FrozenCPU to put on the cluster of 3 VRMs. I also had to bend part of the supplied VRM sink, as the screw holes don't fit. I taped some VRM sinks onto the bigger one to increase surface area as well.

I used some 3M tape from FrozenCPU; Gelid gives you enough, but I ran out after trying a few different setups. Make sure you clean the RAM chips, VRMs, and core very well. Use alcohol, a Q-tip, and an eraser.

I can upload a pic of the card with the sinks on it if you want.

Edit: Never mind, I thought I had one but I don't. I don't really feel like turning off my rig and pulling out the card and re-TIM'ing it just to take a pic.

The next time I take the cooler off I'll take a picture and upload it.


----------



## DeadlyDNA

Quote:


> Originally Posted by *skupples*
> 
> results like this make me miss playing on a single monitor... Then I realize i'm now totally blind on a single panel.


I agree... I feel like I imagine a horse with blinders would.


----------



## stickg1

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I bought a heatsink from FrozenCPU to put on the cluster of 3 VRM's. I also had to bend part of the supplied VRM sink, the screw holes don't fit. I taped on some VRM sinks onto the bigger one to increase surface area also.
> 
> I used some 3M tape from FrozenCPU, Gelid gives you enough but I ran out after trying a few different setups. Make sure you clean the RAM chips, VRMs, and core very well. Use alcohol, qtip, and an eraser.
> 
> I can upload a pic of the card with the sinks on it if you want.
> 
> Edit: Nevermind, I thought I had one but I don't. I don't really feel like turning off my rig and pulling out the card and re-TIM'ing it just to take a pic.
> 
> The next time I take the cooler off I'll take a picture and upload it.


Could you take a side profile pic, like looking directly at the card from the side panel? I want to get an idea of how big it is. Thanks!


----------



## r0cawearz

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I bought a heatsink from FrozenCPU to put on the cluster of 3 VRM's. I also had to bend part of the supplied VRM sink, the screw holes don't fit. I taped on some VRM sinks onto the bigger one to increase surface area also.
> 
> I used some 3M tape from FrozenCPU, Gelid gives you enough but I ran out after trying a few different setups. Make sure you clean the RAM chips, VRMs, and core very well. Use alcohol, qtip, and an eraser.
> 
> I can upload a pic of the card with the sinks on it if you want.
> 
> Edit: Nevermind, I thought I had one but I don't. I don't really feel like turning off my rig and pulling out the card and re-TIM'ing it just to take a pic.
> 
> The next time I take the cooler off I'll take a picture and upload it.


What were the dimensions of the 3M tape and heatsink? I'm interested in getting the same.


----------



## The Storm

Growing impatient with Newegg already; my 290s have been in packaging status since Monday. It's well past the 48hr processing time...


----------



## maynard14

Hey guys, I just want to post here the result of my XFX R9 290 unlocked to R9 290X, flashed with the XFX 290X BIOS.

But first I would like to thank our fellow OCN member the9quad for posting his R9 290X Hawaii info tool results.

Here are the results off his 3 R9 290Xs:

Compatible adapters detected: 3
Adapter #1 PCI ID: 1002:67B0 - 1002:0B00
Memory config: 0x500013A9 Elpida
RA1: F8000005 RA2: 00000000
RB1: F8000005 RB2: 00000000
RC1: F8000005 RC2: 00000000
RD1: F8000005 RD2: 00000000
Adapter #2 PCI ID: 1002:67B0 - 1002:0B00
Memory config: 0x5A0013A9 Elpida
RA1: F8000005 RA2: 00000000
RB1: F8000005 RB2: 00000000
RC1: F8000005 RC2: 00000000
RD1: F8000005 RD2: 00000000
Adapter #3 PCI ID: 1002:67B0 - 1002:0B00
Memory config: 0x500013A9 Elpida
RA1: F8000005 RA2: 00000000
RB1: F8000005 RB2: 00000000
RC1: F8000005 RC2: 00000000
RD1: F8000005 RD2: 00000000

And here is my R9 290 on the stock BIOS:

Compatible adapters detected: 1
Adapter #1 PCI ID: 1002:67B1 - 1682:9295
Memory config: 0x500036A9 Hynix
RA1: F8000005 RA2: F8010000
RB1: F8000005 RB2: F8010000
RC1: F8000005 RC2: F8010000
RD1: F8000005 RD2: F8010000

And after my XFX 290 was unlocked with the XFX 290X BIOS:

Compatible adapters detected: 1
Adapter #1 PCI ID: 1002:67B0 - 1043:0466
Memory config: 0x500036A9 Hynix
RA1: F8000005 RA2: 00000000
RB1: F8000005 RB2: 00000000
RC1: F8000005 RC2: 00000000


----------



## stickg1

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I bought a heatsink from FrozenCPU to put on the cluster of 3 VRM's. I also had to bend part of the supplied VRM sink, the screw holes don't fit. I taped on some VRM sinks onto the bigger one to increase surface area also.
> 
> I used some 3M tape from FrozenCPU, Gelid gives you enough but I ran out after trying a few different setups. Make sure you clean the RAM chips, VRMs, and core very well. Use alcohol, qtip, and an eraser.
> 
> I can upload a pic of the card with the sinks on it if you want.
> 
> Edit: Nevermind, I thought I had one but I don't. I don't really feel like turning off my rig and pulling out the card and re-TIM'ing it just to take a pic.
> 
> The next time I take the cooler off I'll take a picture and upload it.


One more question, lol, sorry for all this. I have a bunch of these leftover Enzotech copper RAM sinks and VRM sinks. Will these fit under the Gelid Icy Vision R2? And if so, do you recall how many RAM chips and VRMs there were to cover?


----------



## Jpmboy

Quote:


> Originally Posted by *Forceman*
> 
> So I finally thought to check the minidumps on my red screens in BF4, and it turns out they were 124 errors, so it looks like my 4770K is to blame and not the 290. No idea what BF4 is doing that IBT/Prime aren't to cause me to crash in the game and not in stress tests though.


A 124 may be the CPU, but it can also be the video driver. If you have the minidump, look at the call immediately preceding the final fail (was it ntoskrnl.exe? then definitely check the preceding call).

Check sysinternals.org...


----------



## Jpmboy

Quote:


> Originally Posted by *ImJJames*
> 
> Why do people still use 3dmark11? 3dmark 2013 is out?


Both are still pretty tough on the system. 3DMark 11 will draw more watts at the limit (for sure). Example:

Kill-A-Watt power measurements (at the PSU plug)
3930K @5.0(1.523V)
2xTitans SLI (svl7v3 bios, softvoltmod, LLC=0)
Bios = 220W
Boot = 500W (?)
Idle = ~ 160-170 watts to the rig
Browser = ~300W
Super Pi = 340W
p95 (8G ram) = 600W (597+/-)
3Dmk11 @ 875/3005 1.16V = 800-900W
3DMk11 @1215/3602 1.3V = 1190-1220W !!!
Valley @ 1215/3602 1.3V = 950-1050W (1080P ExHD)
Firestrike @ 1215/3602 @ 1.3V = 1050-1130W (default)
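Since those are wall-plug readings, they include the PSU's conversion losses. To get a rough idea of the actual DC load (relevant to the earlier 630W PSU question), multiply by the PSU's efficiency; the 88% figure below is just an assumed Gold-ish number, check your own unit's efficiency curve:

```python
# Rough wall-to-DC conversion for Kill-A-Watt style readings like the above.
# The 0.88 efficiency is an assumption (roughly 80 Plus Gold at heavy load);
# real efficiency varies with the PSU model and load point.
def dc_load_watts(wall_watts, efficiency=0.88):
    return wall_watts * efficiency

print(round(dc_load_watts(1220)))  # -> 1074 (DC watts for a 1220 W wall reading)
print(round(dc_load_watts(600)))   # -> 528
```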


----------



## chiknnwatrmln

Quote:


> Originally Posted by *stickg1*
> 
> Could you take a side profile pic, like looking directly at the card from the side panel? I want to get an idea of how big it is. Thanks!


Here. I suggest you put an exhaust fan or two on your case door; I have two 85 CFM fans exhausting air and it still dumps a LOT of hot air into my case.

Wow, that took a long time.... My ISP claims they've upped their speeds to past 25Mbps and I'm getting <100kbps.... I hate you so much, Comcast.

After gaming, my CPU idles at almost 40°C until all the hot air is gone, at which point it goes down to about 30°C.
Quote:


> Originally Posted by *r0cawearz*
> 
> What were the dimensions of the 3M tape and heatsink? I'm interested in getting the same.


I got this heatsink: http://www.frozencpu.com/products/18210/vid-220/Aavid_Thermalloy_Premium_Heatsink_-_33mm_x_33mm_x_15mm_-_Anodized_Black.html

It was slightly too tall and I had to grind it down. It was also a bit too wide and long so it overlapped on 2 memory chips. You can see that in one of the pics uploaded under my rig.

I got 3 sheets of this: http://www.frozencpu.com/products/10223/thr-75/3M_8815_Thermally_Conductive_Adhesive_Transfer_Tape_-_2_x_2.html

Edit: Those sinks should fit. I can't remember how many RAM chips there are, I think 12 maybe? There are 3 VRMs on one part of the card, and 10-15 in a line on another part.

Now if I were you I'd use the supplied RAM sinks, cut those thicker copper sinks into halves, and use those for the line of VRMs. Copper is better at cooling than aluminum, and it's relatively soft and easy to cut.


----------



## stickg1

Thanks for the info; looks like I would need to order another pack of each to cover everything. If your big heatsink fit and it's 15mm tall, mine are 14mm tall, so it might be kinda tight like yours, but it's worth a shot since I already have them. I guess I could use what I have and then finish off the open ones with what Gelid supplies with the cooler. Thank you though, you have been very helpful!


----------



## Sgt Bilko

Asus have previewed the DCU II version of the 290X; I'll put some pics in here when I get them.

Hopefully this won't take longer than a week or two.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *stickg1*
> 
> Thanks for the info, looks like I would need to order another pack of each to cover everything. If your big heatsink fit and it's 15mm tall, mine are 14mm tall, so it might be kinda tight like yours, but it's worth a shot since I already have them. I guess I could use what I have and then finish off the open ones with what Gelid supplies with the cooler. Thank you though, you have been very helpful!


No problem, and yes those sinks should be fine. Worst comes to worst, you can grind off ~1mm or so and they'll fit.

To above: I saw ASUS's PCB picture... Oh my. Doesn't it have two 8-pins and one 6-pin? And like an 18-phase power system? Just thinking of the OCs possible on that makes me jealous.... I may have to sell my 290 when the DCUII comes out.


----------



## Sgt Bilko

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> No problem, and yes those sinks should be fine. Worst comes to worst, you can grind off ~1mm or so and they'll fit.
> 
> To above: I saw ASUS's PCB picture... Oh my. Doesn't it have two 8-pins and one 6-pin? And like an 18-phase power system? Just thinking of the OCs possible on that makes me jealous.... I may have to sell my 290 when the DCUII comes out.


One 8-pin plus one 6-pin from what I've seen, but hopefully I can swap my dead reference 290X for one of these bad boys.


----------



## stickg1

You might be thinking of the MSI Lightning PCB preview...

http://www.techpowerup.com/195272/msi-radeon-r9-290x-lightning-pcb-pictured.html

Man that thing looks H.A.M.!


----------



## chiknnwatrmln

Haha I just spent a solid 10 minutes searching my history trying to find that link...

Yes, I want one of those. Or two.


----------



## Sgt Bilko

Quote:


> Originally Posted by *stickg1*
> 
> You might be thinking of the MSI Lightning PCB preview...
> 
> http://www.techpowerup.com/195272/msi-radeon-r9-290x-lightning-pcb-pictured.html
> 
> Man that thing looks H.A.M.!


Oh damn









What PSU would you need for two of them in Crossfire?

Actually I don't care... I want them, Naow!!!


----------



## chiknnwatrmln

AT LEAST 1 kW.... Probably 1.2 or more for overclocking.

I don't think my house's circuitry could handle that; 530 W from one 290 is already enough to make my lights dim.
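If you want a rough number instead of a gut feel, just add up worst-case component draws and pad for headroom. A back-of-envelope sketch; every wattage below is my guess for an overclocked dual-card rig, not a measured figure:

```python
# Back-of-envelope PSU sizing for two R9 290X Lightnings in CrossFire.
# All wattages are illustrative guesses, not measurements.
gpu_each = 350   # W, one heavily overclocked 290X under load
cpu = 220        # W, overclocked CPU under load
rest = 80        # W, motherboard, drives, fans, pump

load = 2 * gpu_each + cpu + rest
headroom = 1.2   # ~20% so the PSU isn't running flat out

print(f"estimated load: {load} W")                 # 1000 W
print(f"suggested PSU:  {load * headroom:.0f} W")  # 1200 W
```

Which lines up with the "at least 1 kW, probably 1.2 for overclocking" rule of thumb above.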


----------



## Chargeit

Yo, I'm going to be in the market for a new GPU once I'm done with holiday shopping. Basically, planning on buying between the end of this month and mid-Jan.

My system specs (That matter for this gpu) are,

CPU - Fx8320 @ 4.2 (212 evo, Soon to be closed loop, after GPU)
Mobo - 990FXA-UD5
PSU - TX850M
Case - Rosewill R5 (Will replace if heat is an issue, after adding GPU / closed loop)
Monitor - 1080p 60hz (Might add a 2nd for watching temps n the such later)

My question is, gaming in 1080p, would you guys suggest a 290 (Once custom coolers come out), or should I just go with the cheaper 280x?

Basically, I'm all about maintaining a smooth 60fps with Vsync on, while using the highest visual settings possible (The AA is give and take, x4 usually looks really good to me, but, more is better, even if I don't use it)

So, should the 280x be enough, or should I step it up to the 290? I don't mind the cost, but, with the 290 custom coolers coming out mid-Jan, I'd have to wait longer. Also, I don't want more heat than needed.

I'm not looking for a "future proof" card, since I don't mind buying a new one later (1 card every year or 2 wouldn't kill me at this price range), as much as I'm looking for one that will meet my needs, and then some.


----------



## Sgt Bilko

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> AT LEAST 1 kW.... Probably 1.2 or more for overclocking.
> 
> I don't think my house's circuitry could handle that, 530w from 1 290 is already enough to make my lights dim.


I might be pushing it on the overclocking front, but two of them at stock will destroy any game out there, so I'll be fine.


----------



## ImJJames

Quote:


> Originally Posted by *Chargeit*
> 
> Yo, I'm going to be in the market for a new GPU once I'm done with holiday shopping. Basically, planning on buying between the end of this month and mid-Jan.
> 
> My system specs (That matter for this gpu) are,
> 
> CPU - Fx8320 @ 4.2 (212 evo, Soon to be closed loop, after GPU)
> Mobo - 990FXA-UD5
> PSU - TX850M
> Case - Rosewill R5 (Will replace if heat is an issue, after adding GPU / closed loop)
> Monitor - 1080p 60hz (Might add a 2nd for watching temps n the such later)
> 
> My question is, gaming in 1080p, would you guys suggest a 290 (Once custom coolers come out), or should I just go with the cheaper 280x?
> 
> Basically, I'm all about maintaining a smooth 60fps with Vsync on, while using the highest visual settings possible (The AA is give and take, x4 usually looks really good to me, but, more is better, even if I don't use it)
> 
> So, should the 280x be enough, or should I step it up to the 290? I don't mind the cost, but, with the 290 custom coolers coming out mid-Jan, I'd have to wait longer. Also, I don't want more heat than needed.
> 
> I'm not looking for a "future proof" card, since I don't mind buying a new one later (1 card every year or 2 wouldn't kill me at this price range), as much as I'm looking for one that will meet my needs, and then some.


A 280X is more than enough for 60 FPS at 1080p.


----------



## Derpinheimer

Hey guys, I think I asked this before but can't find the post or any answers.

I can't lower my core clock below 1000 MHz. CCC sliders go back to 0% if I try negative values, and MSI AB/GPU Tweak do not allow lower than 1000 MHz.

Any idea what the deal is?


----------



## ShortySmalls

Sold my 7990 for $1300 with its block; now I'm looking to buy 2 290s but can't find any in stock for a reasonable price, lol!


----------



## ImJJames

Quote:


> Originally Posted by *ShortySmalls*
> 
> Sold my 7990 for $1300 with its block; now I'm looking to buy 2 290s but can't find any in stock for a reasonable price, lol!


7950s, 7970s, 280Xs, and 290s are all going bye-bye because of Litecoin.


----------



## Chargeit

Cool thanks, pretty much what I'm thinking.

I'm still undecided of course.

Really, I wouldn't forgive myself if I bought the 280X without at least seeing how the 290s perform with the custom cooling.

I just hate to get the 280X, then go to run some game, get lower FPS (or worse frame timing) than expected, and think, "for only 100 bucks more, I could have had a 290."

My wallet wants the 280X, my thermal levels want the 280X... But the man inside of me wants more... Dang it.


----------



## psyside

For anyone who wants to see an amazing 290, a great overclocker tested in the most extreme conditions one can imagine (SSAA, 8x MSAA, resolution scale)... this guy deserves subs, let's do it, guys







beast of a card!












Overclock guide; apparently aux voltage really helps?


----------



## Lord Xeb

Do you guys know which VRM is VRM 1 and VRM 2?


----------



## Dettie

*Today = Non-Reference model Day (?)*

EVERY R9 290 and R9 290X (ALL reference models) in existence just went out of stock at the same time. Does this mean NON-reference models are being pushed live today?

See for yourself: http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=r9+290&N=-1&isNodeId=1

(They were ALL in stock today).


----------



## black7hought

Quote:


> Originally Posted by *Dettie*
> 
> *Today = REFERENCE MODEL Day (?)*
> 
> EVERY R9 290 and R9 290X (ALL non-reference models) in existence just went out of stock at the same time. Does this mean Reference models are being pushed live today?
> 
> See for yourself: http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=r9+290&N=-1&isNodeId=1
> 
> (They were ALL in stock today).


I'm not sure what you are trying to say but those are all reference models. I think you mean non-reference models may be pushed out soon. However, stock on the 290 and 290X has been scarce since release so I wouldn't get too excited.


----------



## psyside

Quote:


> Originally Posted by *Dettie*
> 
> *Today = REFERENCE MODEL Day (?)*
> 
> EVERY R9 290 and R9 290X (ALL non-reference models) in existence just went out of stock at the same time. Does this mean Reference models are being pushed live today?
> 
> See for yourself: http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=r9+290&N=-1&isNodeId=1
> 
> (They were ALL in stock today).


I really don't get what you're saying.

There are no aftermarket (non-reference) cards yet, they are all reference.


----------



## Dettie

Yes, I meant "Today is NON-reference day"... as EVERY reference model is gone now (check Newegg, TigerDirect, Amazon)... sorry, I switched the words around.


----------



## Taint3dBulge

Quote:


> Originally Posted by *Dettie*
> 
> Yes, I meant "Today is NON-reference day"... as EVERY reference model is gone now (check Newegg, TigerDirect, Amazon)... sorry, I switched the words around.


Good, hope it stays that way till next week so I can get my money back; my RMA should be back in Cali on Friday.

But my 1440p monitor comes tomorrow :\


----------



## Dettie

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Good, hope it stays that way till next week so I can get my money back; my RMA should be back in Cali on Friday.


I'm 100% sure it will, and that today at 3 AM Eastern time the non-ref cards will be pushed live.

I checked Amazon, Tigerdirect, and Newegg, and literally all 290 / 290x are gone. (they all had them earlier today).


----------



## psyside

Omg, I really hope that's the case, but I doubt it; the 290 being out of stock IMHO is because of the miners.









They said the non-reference cards will be out in 2 weeks minimum.


----------



## ImJJames

Quote:


> Originally Posted by *Dettie*
> 
> I'm 100% sure it will, and that today at 3 AM Eastern time the non-ref cards will be pushed live.
> 
> I checked Amazon, Tigerdirect, and Newegg, and literally all 290 / 290x are gone. (they all had them earlier today).


It's all out of stock, including the 7950, 7970, and 280X, because of miners... not because of non-ref models coming out.


----------



## Dettie

We'll see in exactly 2 hours from now, since that's the official "day end" of all big retailers (Amazon/Newegg).


----------



## Dettie

Quote:


> Originally Posted by *ImJJames*
> 
> It's all out of stock, including the 7950, 7970, and 280X, because of miners... not because of non-ref models coming out.


Oh those bitcoin hoes? -_-


----------



## psyside

Quote:


> Originally Posted by *Dettie*
> 
> We'll see in exactly 2 hours from now, since that's the official "day end" of all big retailers (Amazon/Newegg).


What is day end?


----------



## Dettie

lol I might be correct.


----------



## Derpinheimer

Yeah, it's the miners. At least, that's what I'd bet on. Good luck finding a 7950, 7970, R9 280X, R9 290, or R9 290X anywhere.

Actually I do see one R9 280X for $310, but that's it, lol.

Edit: nvm, that's out of stock now too! (was on B&H Photo Video)


----------



## Dettie

Ugh, Newegg had every manufacturer in stock this week until now... Should've bought in bulk, lol.


----------



## ImJJames

Quote:


> Originally Posted by *Dettie*
> 
> Ugh, Newegg had every manufacturer in stock this week until now... Should've bought in bulk, lol.


Miners win, gamers lose







I myself will be turning my machine to mining; might as well make some money while this mining thing is hot.


----------



## HOMECINEMA-PC

HOMECINEMA-PC [email protected]@2428 Sapphire R9 290 1111 / 1287

http://www.3dmark.com/3dm11/7600977



http://www.3dmark.com/fs/1244748











Wadda ya rekon ........ any good ?

This is a beasty card, never pulled those scores on a single card before.


----------



## darkelixa

Of course they bring out non-ref cards when I pre-order an XFX R9 290, lol. I should cancel it.


----------



## Forceman

Quote:


> Originally Posted by *Dettie*
> 
> I'm 100% sure it will, and that today at 3 AM Eastern time the non-ref cards will be pushed live.
> 
> I checked Amazon, Tigerdirect, and Newegg, and literally all 290 / 290x are gone. (they all had them earlier today).


I have no idea why you are equating reference cards being out of stock with custom cards releasing. They don't stop making/selling reference cards once custom cards are available. They are out of stock because they are out of stock, not because they are trying to clear shelf space for new cards or something.


----------



## Arizonian

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> My first Radeon card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> HOMECINEMA-PC
> 
> http://www.techpowerup.com/gpuz/996br/
> 
> Sapphire R9 290
> 
> Reference cooler
> 
> Pls add me to da club list pls


Congrats - added







Good time to try Radeon as things are turning the corner in many ways.









Quote:


> Originally Posted by *Fahrenheit85*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Gigabyte 290x
> Stock Cooling (for now)
> Time to play some games


Congrats - added









Quote:


> Originally Posted by *prostreetcamaro*
> 
> Man 14807 is all I can muster out of mine at 1180/5500. Which drivers you guys using? I think my biggest thing holding my score back a little is my 2600K at 4.5Ghz. I might have to go ahead and throw some voltage back into her and bring it back up to 5.2 and make a balls to the wall run to see what I can get with my flashed 290X.
> 
> http://www.3dmark.com/3dm11/7608306
> 
> 
> 
> Spoiler: Warning: Spoiler!


Actually pretty good scores off that 2600K









Quote:


> Originally Posted by *staryoshi*
> 
> As far as I'm concerned, clock-for-clock comparisons have NO relevance when comparing different architectures. Such comparisons within architectures can sometimes be useful, though.
> 
> Also, I hope to be joining the club soon - once the non-reference R9 290s are released. (DirectCU II)


I agree on clock-vs-clock being a bad way to compare two different types of GPUs.

Asus DCUII Sounds great - we await you.


----------



## Dettie

Quote:


> Originally Posted by *Forceman*
> 
> I have no idea why you are equating reference cards being out of stock with custom cards releasing. They don't stop making/selling reference cards once custom cards are available. They are out of stock because they are out of stock, not because they are trying to clear shelf space for new cards or something.


Because they were all in stock yesterday for a whole week, and now they all happen to be out of stock at the same time, across the entire web, coincidence eh?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> Good time to try Radeon as things are turning the corner in many ways.
> 
> 
> 
> 
> 
> 
> 
> 
> 'SNIP'
> 
> Asus DCUII Sounds great - we await you.


Yep, agreed. Besides, I've got about 15 Nvidia cards; time to collect something else LooooL









Don't know too much about BIOS flashing these, straightforward or what?


----------



## iPDrop

Does anybody know when Mantle will be coming to BF4? And how one goes about installing it or w/e?


----------



## Scorpion49

I went ahead and gave it full steam; was trying to break 10k in FSE but I couldn't quite make it. One of my GPUs gives up the ghost around 1160 MHz and the other will not budge past 1150 without artifacts even with full voltage thrown at it. I'm game stable at 1125 +50 mV though, so I'm happy with that.

http://www.3dmark.com/fs/1253393
http://www.3dmark.com/3dm11/7610246


----------



## Forceman

Quote:


> Originally Posted by *Dettie*
> 
> Because they were all in stock yesterday for a whole week, and now they all happen to be out of stock at the same time, across the entire web, coincidence eh?


More likely coincidence than that it somehow signals the impending release of custom cards. Why would reference cards go out of stock just because custom cards are coming out? Did they send them all back to AMD to be recycled into customs?


----------



## Fahrenheit85

Just got done with 10 hrs or so of AC4, and no issues like black screen or anything like that. I got to say though, when the card hits 55% fan speed it sounds like my computer is about to launch, haha. With all the talk of the aftermarket coolers coming real soon I may send my card back for one of them. My card has coil whine and I can't stand that crap.

My only worry is water block compatibility, since that's the plan in the future. How compatible do the 3rd-party cards tend to be?



First time I ever watched the 3Dmark bench marks. This is at stock 290x speeds with a 4.4ghz 4770k.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Fahrenheit85*
> 
> Just got done with 10 hrs or so of AC4, and no issues like black screen or anything like that. I got to say though, when the card hits 55% fan speed it sounds like my computer is about to launch, haha. With all the talk of the aftermarket coolers coming real soon I may send my card back for one of them. My card has coil whine and I can't stand that crap.
> 
> My only worry is water block compatibility, since that's the plan in the future. How compatible do the 3rd-party cards tend to be?
> 
> 
> 
> First time I ever watched the 3Dmark bench marks. This is at stock 290x speeds with a 4.4ghz 4770k.


And here I was pushing to hit 9k









http://www.3dmark.com/fs/1151742

Shows how much of a difference a good processor makes.


----------



## HOMECINEMA-PC

You can get more..........

http://www.3dmark.com/fs/1244748 *10255*


----------



## Forceman

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> You can get more..........
> 
> http://www.3dmark.com/fs/1244748 *10255*


Yes, yes you can.

http://www.3dmark.com/fs/1155472 *11259*


----------



## Joeking78

Just picked up a 3rd 290x, will install later and post some pics/benchmarks.


----------



## Dettie

Newegg has the Asus R9 290 in stock now for $409... comes with a copy of Battlefield 4.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814121807


----------



## Hogesyx

Quote:


> Originally Posted by *Joeking78*
> 
> Just picked up a 3rd 290x, will install later and post some pics/benchmarks.


Nice, just picked up my 2nd too.

http://postimg.org/image/7p71h2s2r/


----------



## Dettie

Quote:


> Originally Posted by *Joeking78*
> 
> Just picked up a 3rd 290x, will install later and post some pics/benchmarks.


lol what do you need 3x 290x' for? XD


----------



## Joeking78

Quote:


> Originally Posted by *Dettie*
> 
> lol what do you need 3x 290x' for? XD


This is Overclock.net...we all need more stuffs









And its almost Christmas!


----------



## Hogesyx

Quote:


> Originally Posted by *Dettie*
> 
> lol what do you need 3x 290x' for? XD


It's winter, gotta need more radiator.


----------



## Dettie

lol you guys crack me up


----------



## chiknnwatrmln

My room gets a good 10 degrees warmer than the other rooms in my house just from my rig after gaming for like an hour.


----------



## HardwareDecoder

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> My room gets a good 10 degrees warmer than the other rooms in my house just from my rig after gaming for like an hour.


lol so glad my 290x is going under water, god I hope it doesn't take 2 weeks for my block to come in


----------



## Falkentyne

Quote:


> Originally Posted by *iamhollywood5*
> 
> Quick question - what is the stock voltage on 290Xs? Mine is 1.25v, but do all 290Xs across the board have 1.25v stock? Or does it scale with ASIC like the 7900 series?


Should scale. My 77.9% ASIC card shows a max current (after vdroop) voltage of 1.203 V in GPU-Z at the lightest load (usually when switching between 3D and 2D), so I don't know if that's a VID of 1.2, 1.213, or 1.212, or whatever the step is, but it's definitely NOT 1.250 V.

With +100 mV, the highest voltage I've seen read is 1.293 V.

Maybe GPU Tweak might read the current VID....


----------



## Fahrenheit85

http://www.3dmark.com/fs/1253492

Not even gonna push it any till I get it wet. I spend most of my game time with my core at 930ish and my fan at 55% as it is.

When I do overclock, what is the best tool to use? Can I just max out the sliders in CCC and see if it runs?


----------



## Bloitz

Any thoughts on why my 290 seems to be constantly downclocking for just a second while folding?



Core sometimes goes as low as 847 MHz, but usually remains above 1100 MHz (1200 MHz target speed). It's the memory that worries me: it dips to 150 MHz every so often...

Powertune: +50
Full cover WB, so temps aren't the issue (right now 30 on the core, 23 and 21 for the VRMs; yeah, I had the window open the last hour, it's usually a bit higher).

Not exactly impressed with the AMD drivers so far. It was running like a charm when I first installed it and installed the 9.4 beta, but then just when the FFW was about to begin (thank you, Kevdog) it started acting up. Reinstalled the driver, installed the 9.5 beta, and now I'm on the WHQL, which seems to be the most stable so far. (Used Driver Fusion and AMD's install manager; nothing really helped.)

Before last Sunday it would happily fold away, rock solid, no issues whatsoever. The OC is stable in 3DMark 11 and the newest one, Valley, and BF4... When I had my GTX 570, BF3 was a lot harder on the OC than folding... not sure what the deal is with AMD.

EDIT: and I can't help but feel like this card doesn't produce nearly as much heat as my 570 did ... I normally don't use the central heating in my room when folding..
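For anyone chasing dips like this: GPU-Z's "Log to file" option writes its sensor readings out as CSV while you fold, and a small script can then flag every sample where the memory clock fell off. A rough sketch, where the column header and log layout are assumptions, so match them to what your GPU-Z build actually writes:

```python
import csv
import os
import tempfile

# Scan a GPU-Z sensor log (CSV) for memory-clock dips.
# Column names below are assumptions -- check your own log's header row.
MEM_COL = "Memory Clock [MHz]"
THRESHOLD = 1000.0  # anything below this counts as a dip


def find_dips(path):
    """Return (timestamp, MHz) for every logged sample below THRESHOLD."""
    dips = []
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        # GPU-Z pads its header names with spaces, so normalize them
        reader.fieldnames = [name.strip() for name in reader.fieldnames]
        for row in reader:
            try:
                mhz = float(row[MEM_COL])
            except (KeyError, TypeError, ValueError):
                continue  # skip malformed lines
            if mhz < THRESHOLD:
                dips.append((row.get("Date", "?").strip(), mhz))
    return dips


# Tiny self-test on a fake two-sample log
sample = "Date,Memory Clock [MHz]\n12:00:01,1250\n12:00:02,150\n"
with tempfile.NamedTemporaryFile("w", suffix=".txt", delete=False) as f:
    f.write(sample)
print(find_dips(f.name))  # [('12:00:02', 150.0)]
os.remove(f.name)
```

Point it at the real log file and the timestamps will tell you whether the dips line up with work-unit changes, screen blanking, or something else.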


----------



## Fahrenheit85

Shouldn't I be able to adjust core voltage? I have "unlock voltage" checked in the settings.


----------



## aznever

Use the new version of afterburner.


----------



## TheSoldiet

Will this do? A Xtreme III with Alpenfohn DRAM/VRAM heatsinks.

https://www.deal.no/deal/default.asp?page=vare&ProdusentID=DCACO-V15G400-BL

https://www.deal.no/deal/default.asp?page=vare&ProdusentID=84000000063

(Norwegian)


----------



## Fahrenheit85

Quote:


> Originally Posted by *aznever*
> 
> Use the new version of afterburner.


I am as far as I can tell


----------



## maynard14

new beta version sir
3.0.0 beta 17


----------



## Jpmboy

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yep agreed besides ive got about 15 nvidia cards , time to collect something else LooooL
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Dont know too much about bios flashing these , straight forward or what


yeah, easy.

http://forums.overclockers.co.uk/showthread.php?t=18552408


----------



## Gilgam3sh

Quote:


> Originally Posted by *TheSoldiet*
> 
> Will this do? A Xtreme III with Alpenfohn DRAM/VRAM heatsinks.
> 
> https://www.deal.no/deal/default.asp?page=vare&ProdusentID=DCACO-V15G400-BL
> 
> https://www.deal.no/deal/default.asp?page=vare&ProdusentID=84000000063
> 
> (Norwegian)


I use the AX3 with the VRM1 plate from the stock cooler, works great and very good VRM1 temps


----------



## Nopileus

I just remounted my Arctic Hybrid using the 1.5mm spacers and gave it as much pressure as I dared.

In BF4 I got 65°C yesterday; I am now down to 58°C (+100 mV, 1180/1385). This seems to be in line with expectations.


----------



## darkelixa

Will core parking affect my new R9 290? It seems most games I play currently park 2-3 cores with my 5850. Won't be till next Thursday that I get my shiny new card.


----------



## HardwareDecoder

What kind of overclocks are people getting on actual 290Xs (not unlocked 290s)? I have my 290X coming soon and am curious what to expect.


----------



## velocityx

So I just got my second R9 290. Judging by the black mark near the PSU plugs, this one won't unlock; the first one didn't either. I don't care about it that much, so all is OK.

But what I care about is smoothness. Previously, with my old CF 6970 setup, although it was quite smooth in full HD, I could feel the stutters in BF3/BF4. Even more so in 1440p.

I wonder how this new XDMA engine will work out. If I see repeatable microstutter I will get rid of it. Gonna see tomorrow when I fire up the rig.

Anyone care to share their experience X-firing these babies?
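If you want numbers instead of feel, FRAPS (or any frametime logger) can dump per-frame timestamps, and microstutter shows up in the tail of the distribution even when the average looks fine. A sketch on made-up data; feed it the cumulative timestamps from your own log, since the real file layout varies by tool:

```python
# Turn cumulative frame timestamps (ms) into average and tail frame times.
# The 99th-percentile index math is deliberately rough -- good enough to
# spot a hitch, not a proper statistics library.

def frame_stats(cumulative_ms):
    deltas = sorted(b - a for a, b in zip(cumulative_ms, cumulative_ms[1:]))
    avg = sum(deltas) / len(deltas)
    p99 = deltas[int(len(deltas) * 0.99)]  # rough 99th percentile
    return avg, p99


# Made-up run: steady 16.7 ms frames with one 50 ms hitch in the middle
times, t = [], 0.0
for i in range(100):
    t += 50.0 if i == 50 else 16.7
    times.append(t)

avg, p99 = frame_stats(times)
print(f"avg {avg:.1f} ms, p99 {p99:.1f} ms")  # the single hitch dominates the tail
```

A big gap between the average and the 99th percentile is the signature of microstutter; two runs of this before and after enabling CrossFire make the comparison concrete.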


----------



## Joeking78

Quote:


> Originally Posted by *HardwareDecoder*
> 
> What kind of overclocks are people getting on actual 290Xs (not unlocked 290s)? I have my 290X coming soon and am curious what to expect.


1150/1400 was a stable OC on mine...I could push 1200 core with 1250 memory but it would artifact randomly, sometimes pass 3dmark fine, sometimes artifact.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Joeking78*
> 
> 1150/1400 was a stable OC on mine...I could push 1200 core with 1250 memory but it would artifact randomly, sometimes pass 3dmark fine, sometimes artifact.


How were the temps and voltages? Did you have the 290X Asus BIOS flashed? Were you on water?

I'm really hoping for at least 1200/1400 on water with the Asus BIOS.

UGHHH, Newegg is irritating me; ordered mine in the Cyber Monday sale and they didn't ship till yesterday, and now the expected delivery date is Monday.... I paid for 3-day shipping, gonna ask for a refund on my shipping.


----------



## Joeking78

Quote:


> Originally Posted by *HardwareDecoder*
> 
> How were the temps and voltages? Did you have the 290X Asus BIOS flashed? Were you on water?
> 
> I'm really hoping for at least 1200/1400 on water with the Asus BIOS.


That was with the stock BIOS but running the latest MSI AB with maxed-out volts on the Sapphire BIOS... IIRC temps reached 80-82 on stock air cooling.

I run stock now and see temps never go above 70°C, with 70% fan speed at that temp.

I did run the Asus BIOS once; I think I got 1215 core, can't remember the temps now.


----------



## Fahrenheit85

Stock

Core - 1150 Memory - Stock

Core - 1150 Memory 1550

Core - 1150 Memory 1600


All tests with core volts and the power % slider on max, with 100% fan speed. Temp maxes out at 75 degrees. 1175 on the core and it starts to artifact. If I keep this card (depending on how much the coil whine goes away and if the 3rd-party cards are compatible with EK water blocks) I'll flash the Asus BIOS and try for 1250 on the core.

Also 100% fan speed made my ears ring.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Joeking78*
> 
> That was with stock bios but running latest MSI AB with maxed out volts on the Sapphire bios...IIRC temps reached 80-82 on stock air cooling.
> 
> I run stock now and see temps never go above 70c with 70% fan speed at that temp.
> 
> I did run the Asus bios once, I think I got 1215 core, can't remember the temps now.


Awesome, hope I don't have to RMA mine for being a dud overclocker.

Oh, anyone who has a waterblock on a 290/X so far: is it really easy to take the fan shroud off, and does it void the warranty if you do? Like, can they tell if you have removed it somehow? If it is just a sticker I'm sure I can get around that somehow.

Because I'll have mine long before my waterblock comes, but I'm not sure if max OC on air is really comparable to max OC on water, so I'd want to try both before RMA'ing it.


----------



## Jack Mac

Does auxiliary voltage actually do anything to help with stability?


----------



## prostreetcamaro

Quote:


> Originally Posted by *HardwareDecoder*
> 
> How were the temps and voltages? Did you have the 290X Asus BIOS flashed? Were you on water?
> 
> I'm really hoping for at least 1200/1400 on water with the Asus BIOS.
> 
> *UGHHH, Newegg is irritating me; ordered mine in the Cyber Monday sale and they didn't ship till yesterday, and now the expected delivery date is Monday.... I paid for 3-day shipping, gonna ask for a refund on my shipping.*


Ordered another one 2 days ago and it still hasn't shipped. I am going to get on them today about it. It used to be you could order and within 2 hours it was boxed up and shipped. I did the same thing with the 3-day shipping, and I am going to tell them I paid for 3-day shipping so that it would get here Friday at the latest, and now it won't be here until Monday or Tuesday, which is ridiculous. Maybe they will upgrade the shipping to overnight for me.


----------



## Bloitz

Quote:


> Originally Posted by *prostreetcamaro*
> 
> Ordered another one 2 days ago and it still hasn't shipped. I am going to get on them today about it. It used to be you could order and within 2 hours it was boxed up and shipped. I did the same thing with the 3-day shipping, and I am going to tell them I paid for 3-day shipping so that it would get here Friday at the latest, and now it won't be here until Monday or Tuesday, which is ridiculous. Maybe they will upgrade the shipping to overnight for me.


What you don't ask, you won't get


----------



## HardwareDecoder

Quote:


> Originally Posted by *prostreetcamaro*
> 
> Ordered another one 2 days ago and it still hasn't shipped. I am going to get on them today about it. It used to be you could order and within 2 hours it was boxed up and shipped. I did the same thing with the 3-day shipping, and I am going to tell them I paid for 3-day shipping so that it would get here Friday at the latest, and now it won't be here until Monday or Tuesday, which is ridiculous. Maybe they will upgrade the shipping to overnight for me.


http://www.overclock.net/t/1448460/has-newegg-shipped-your-r290-yet

Off topic: I really need to turn off my S4, keep getting beeps from OCN; need to sleep, must study for this anatomy final Monday.


----------



## The Storm

Quote:


> Originally Posted by *prostreetcamaro*
> 
> Ordered another one 2 days ago and it still hasn't shipped. I am going to get on them today about it. It used to be you could order and within 2 hours it was boxed up and shipped. I did the same thing with the 3-day shipping, and I am going to tell them I paid for 3-day shipping so that it would get here Friday at the latest, and now it won't be here until Monday or Tuesday, which is ridiculous. Maybe they will upgrade the shipping to overnight for me.


They will not, sorry man; I called them probably 5 different times yesterday about this. I am in the same situation: I ordered 2 Sapphire R9 290s on Monday and they hadn't shipped as of last night. They told me that they cannot alter the shipping because the order has already been processed. I just received an update that they shipped through FedEx as of this morning; I am in Indiana and they are coming from California. I normally get stuff from the Tennessee warehouse and it shows up in 2 days.


----------



## The Storm

Quote:


> Originally Posted by *HardwareDecoder*
> 
> http://www.overclock.net/t/1448460/has-newegg-shipped-your-r290-yet
> 
> off topic: I really need to turn off my s4 keep getting beeps from ocn, need to sleep must study for this anatomy final monday.


Haha, I have an S4 as well. There is a setting where you can shut off the ringer and vibrate for email alerts but keep all your other alerts audible.


----------



## head9r2k

Quote:


> Originally Posted by *aznever*
> 
> Use the new version of afterburner.


What is the new one? Beta 17?


----------



## Jack Mac

Quote:


> Originally Posted by *head9r2k*
> 
> What is the new one? Beta 17?


Yep, the beta version. Also you guys complaining about NE really should just use Amazon.


----------



## The Storm

Quote:


> Originally Posted by *Jack Mac*
> 
> Yep, the beta version. Also you guys complaining about NE really should just use Amazon.


In all my years dealing with Newegg I have never had an issue with shipping. I have, however, had issues with Amazon in the past. So as far as that goes, my track record is still better with the Egg. Your experience may be different from others'.


----------



## Fahrenheit85

I order all my stuff through Amazon now. As long as the seller is Amazon LLC, they ship fast and are a dream to return things to. Newegg has fallen behind. Things used to ship the same day every time, but now it seems like they never ship the same day. I also went through hell trying to get an RMA for a hot-swap bay that was broken out of the box. I'll never order from them again. I'll use FrozenCPU if Amazon doesn't have it.


----------



## RAFFY

Quote:


> Originally Posted by *Bloitz*
> 
> Any thoughts why my 290 seems to be downclocking for just a second constantly while folding?
> 
> 
> 
> Core sometimes goes as low as 847 Mhz, but usually remains above 1100 Mhz (1200 Mhz target speed). It's the memory that worries me: It dips to 150 Mhz every so often ...
> 
> Powertune: +50
> Full cover WB so temps aren't the issue (right now 30 on the core, 23 and 21 for the VRMs; yeah, I had the window open the last hour, it's usually a bit higher)
> 
> Not exactly impressed with the AMD drivers so far. Was running like a charm when I first installed it and installed the 9.4 beta, but then just when the FFW was about to begin (Thank you Kevdog) it started acting up. Reinstalled the driver, installed 9.5 beta, and now I'm on the WHQL which seems to be the most stable so far. (used driver fusion or AMD's install manager, nothing really helped)
> 
> Before last Sunday it would happily fold away, rock solid, no issues whatsoever. The OC is stable, 3D Mark 11 and the newest one, Valley and BF4 ... When I had my GTX570, BF3 was a lot harder on the OC than folding .. not sure what the deal is with AMD.
> 
> EDIT: and I can't help but feel like this card doesn't produce nearly as much heat as my 570 did ... I normally don't use the central heating in my room when folding..


I know that with mining, in some of the programs you can throw in a flag that forces the GPU to 100% load.


----------



## magicase

Bought two 290s (Asus and XFX) and managed to flash the Asus to the 290X BIOS.


----------



## DizzlePro

http://wccftech.com/asus-shows-custom-radeon-r9-290x-directcu-ii-v2-prerelease-sample/


I'm throwing money at the screen but nothing's happening.


----------



## Arizonian

Quote:


> Originally Posted by *Hogesyx*
> 
> Nice, just picked up my 2nd too.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://postimg.org/image/7p71h2s2r/


Congrats, updated to two.


----------



## prostreetcamaro

P15242 !!!!! This throws me into the top 30 with an older cpu and a $400 GPU! Man these cards are beastly!

http://www.3dmark.com/3dm11/7612014


----------



## OverSightX

Finally.. Add me! Sapphire r9 290 BF4 edition - unlocked with Asus Bios. Unlocked and flashed before waterblock install.


----------



## HardwareDecoder

Quote:


> Originally Posted by *OverSightX*
> 
> Finally.. Add me! Sapphire r9 290 BF4 edition - unlocked with Asus Bios. Unlocked and flashed before waterblock install.


you are the exact guy I need to answer a couple questions. PLEASE lol

Did the Sapphire card have any warranty-void stickers on it? How easy is it to remove the stock shroud on these? Is it just 4 screws in the back or something?

Gonna be water blocking a Sapphire 290X myself.

That is the EK nickel/acetal block, huh? Same one I ordered; it's on a 1-2 week back order right now though.


----------



## ivers

Quote:


> Originally Posted by *OverSightX*
> 
> Finally.. Add me! Sapphire r9 290 BF4 edition - unlocked with Asus Bios. Unlocked and flashed before waterblock install.


Where did you get your EK? they are all out of stock.


----------



## Joeking78

Definitely a bottleneck (at least in benchmarks) with three 290Xs on an 8/4/4 PCIe board.

I'm getting just shy of 24k in 3DMark 11, whilst otherwise identical systems on the Asus Maximus Extreme are scoring 2k+ higher at exactly the same clocks.

In games I can't see it, just ran BF4, Ultra settings, no AA, 2560x1440, 96hz and FPS never dropped below 90 with V-sync...yesterday with two cards same settings I would drop down to 65ish FPS.

Off to the shop tomorrow for an Asus board


----------



## OverSightX

Quote:


> Originally Posted by *HardwareDecoder*
> 
> you are the exact guy I need to answer a couple questions. PLEASE lol
> 
> Did the sapphire card have any warranty void stickers on it? How easy is it to remove the stock shroud on these, is it just like 4 screws in the back or something?
> 
> Gonna be water blocking a sapphire 290x my self.
> 
> That is the EK nickle/acetal block huh? Same one I ordered, like a 1-2 week back order right now though


No stickers whatsoever. The shroud was very easy to take off. Once the screws are out, just start moving the PCB back and forth, starting from the back (near the power plugs). There are a bunch of screws (I don't remember the actual amount) to take off the back side and like 4 on the I/O bracket. Took me about 25 min to install both the block and backplate.

Quote:


> Originally Posted by *ivers*
> 
> Where did you get your EK? they are all out of stock.


I ordered and received the block like a week ago from Frozencpu.com. I want to say I bought the last block and the last BF4 edition 290 when I purchased them.


----------



## HardwareDecoder

Quote:


> Originally Posted by *OverSightX*
> 
> No stickers whatsoever. The shroud was very easy to take off. Once the screws are out, just start moving the PCB back and forth, starting from the back (near the power plugs). There are a bunch of screws (I don't remember the actual amount) to take off the back side and like 4 on the I/O bracket. Took me about 25 min to install both the block and backplate.
> I ordered and received the block like a week ago from Frozencpu.com. I want to say I bought the last block and the last BF4 edition 290 when I purchased them.


Hmm. I'm a total WC noob; I just did my first loop with a 240mm RASA kit and I've got the itch, so I ordered that acetal/nickel block for my 290X. I have never installed a GPU block before.

Any other tips for me? Wondering if I should order a backplate too; it's not actually necessary for anything, right?


----------



## OverSightX

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Hmm. I'm a total WC noob, just did my first loop with a 240mm rasa kit and I've got the itch so I ordered that acetal/nickle block for my 290x. I have never installed a GPU block before
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any other tips for me? Wondering if I should order a backplate too, it's not actually necessary for anything right?


I would say just take your time and don't over-tighten; dies are sensitive to pressure. Clean the old TIM completely and don't apply too much new TIM. The backplate does some passive cooling, but it's not necessary; I bought this one just for aesthetics. Below is a link to the installation manual.

FC 290X EK Block install


----------



## HardwareDecoder

Quote:


> Originally Posted by *OverSightX*
> 
> I would say just take your time and don't over-tighten; dies are sensitive to pressure. Clean the old TIM completely and don't apply too much new TIM. The backplate does some passive cooling, but it's not necessary; I bought this one just for aesthetics. Below is a link to the installation manual.
> 
> FC 290X EK Block install


Oh wow, you are the man. And dang, 16 screws on the back, lol.


----------



## Arizonian

Quote:


> Originally Posted by *OverSightX*
> 
> Finally.. Add me! Sapphire r9 290 BF4 edition - unlocked with Asus Bios. Unlocked and flashed before waterblock install.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## OverSightX

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added


Thank you!


----------



## PwrElec

Quote:


> Originally Posted by *RAFFY*
> 
> Post back or PM me what they say please. I have 3 ASUS R9 290x that are getting naked in the couple of weeks.


*Asus answer:
You water cool at your own risk. If you remove the stock cooler then you do break your warranty.*


----------



## Jack Mac

They'd have to know that you did it to void your warranty; it's a bit of a grey area.


----------



## PwrElec

Is it correct that MSI is OK with water? I think I read that somewhere.


----------



## Raxus

Quote:


> Originally Posted by *PwrElec*
> 
> Is it correct that MSI is OK with water? I think I read that somewhere.


It seems to depend on where you live.

US seems to be ok with it.


----------



## pkrexer

So when did the 290/290X become impossible to find? I ended up doing an RMA for my Sapphire 290 and Newegg refunded my order because it was out of stock, even though it was replacement-only. Now that I'm looking online, prices have exploded and nothing is in stock... sweet.


----------



## RAFFY

Quote:


> Originally Posted by *pkrexer*
> 
> So when did the 290/290X become impossible to find? I ended up doing an RMA for my Sapphire 290 and Newegg refunded my order because it was out of stock, even though it was replacement-only. Now that I'm looking online, prices have exploded and nothing is in stock... sweet.


Last night they all went out of stock.


----------



## battleaxe

Quote:


> Originally Posted by *pkrexer*
> 
> So when did the 290/290X become impossible to find? I ended up doing an RMA for my Sapphire 290 and Newegg refunded my order because it was out of stock, even though it was replacement-only. Now that I'm looking online, prices have exploded and nothing is in stock... sweet.


Miners are grabbing everything up. Everything. 280's can't be found either.


----------



## Jpmboy

Quote:


> Originally Posted by *battleaxe*
> 
> Miners are grabbing everything up. Everything. 280's can't be found either.


Even 7970s (aka 280). I sold my 2 in minutes yesterday.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Jpmboy*
> 
> Even 7970s (aka 280). I sold my 2 in minutes yesterday.


Where? LOL, I posted my 7970 a few days ago on Amazon and nothing.


----------



## Zero383

7950/7970/280x are all being bought up like crazy due to the recent surge of mining bitcoins/litecoins or anything similar.


----------



## MrWhiteRX7

Well they need to buy mine! LOL


----------



## Jpmboy

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Where? LOL I posted my 7970 up a few days ago on amazon and nothing


OCN marketplace... but they never made it there. Got a PM and sold immediately. The two I sold had AQ waterblocks, but that won't matter.


----------



## Dettie

ASUS R9 290 vs XFX R9 290 ? Which would you pick?


----------



## HOMECINEMA-PC

HOMECINEMA-PC [email protected]@2428 Sapphy R9 290 1095 / 1317



http://www.3dmark.com/3dm11/7610631


----------



## MrWhiteRX7

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> HOMECINEMA-PC [email protected]@2428 Sapphy R9 290 1095 / 1317
> 
> 
> 
> http://www.3dmark.com/3dm11/7610631


Why does it say tessellation load modified?

Amazing clock on that 3930k btw!!!!


----------



## mmsandi

Quote:


> Originally Posted by *Dettie*
> 
> ASUS R9 290 vs XFX R9 290 ? Which would you pick?


It seems that you have a much better chance of unlocking your card to a 290X with XFX. I don't know, maybe it's because of all the X's in the name...


----------



## Dettie

Quote:


> Originally Posted by *mmsandi*
> 
> It seems that you have a much better chance of unlocking your card to a 290X with XFX. I don't know, maybe it's because of all the X's in the name...


ASUS can't be unlocked?


----------



## Forceman

Quote:


> Originally Posted by *Dettie*
> 
> ASUS can't be unlocked?


Very few, if any. Much better odds with XFX.


----------



## mmsandi

Judging from THIS thread - NO!


----------



## Dettie

Hmm... I ordered an XFX at Amazon and an ASUS at Newegg. Thinking of cancelling the ASUS then.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Why does it say tessellation load modified?
> 
> Amazing clock on that 3930k btw!!!!


Thanks, I can run this hexy at 5.2 24/7 if I wanted to, but I don't; only for benching. It boots at 1.48 Vcore in the BIOS.

Tess off as per HWBOT rules for AMD cards (something like that anyway).


----------



## MrWhiteRX7

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Thanks, I can run this hexy at 5.2 24/7 if I wanted to, but I don't; only for benching. It boots at 1.48 Vcore in the BIOS.
> 
> Tess off as per HWBOT rules for AMD cards (something like that anyway).


Awesome! Thanks, I had no idea; I'd just never seen that before.


----------



## Slomo4shO

Here is an interesting alternative to the reference cooler:


----------



## Chomuco

ASUS Radeon R9 290X DirectCU II Pre-Release Sample Photos

http://www.guru3d.com/news_story/asus_radeon_r9_290x_directcu_ii_pre_release_sample_photos.html


----------



## quakermaas

R9 290 CF 1165/1450 air

Tess on

http://www.3dmark.com/3dm11/7612709



Tess off

http://www.3dmark.com/3dm11/7612761


----------



## The Storm

Quote:


> Originally Posted by *quakermaas*
> 
> R9 290 CF 1165/1450 air
> 
> Tess on
> http://www.3dmark.com/3dm11/7612709
> 
> 
> 
> Tess off
> http://www.3dmark.com/3dm11/7612761


Unreal man, I really cannot wait to get my 2 290's and run them. I should see a big improvement over my 7950's.


----------



## Jaju123

Anyone know how I can flash the BIOS of my second AMD R9 290 instead of the first one while they are both plugged in to the system? Thanks.


----------



## skupples

Quote:


> Originally Posted by *the9quad*
> 
> I honestly wanted to try 120hz gaming so I chose one monitor. I think down the line I might get two more and go back to 60hz gaming, I just honestly don't know if 3 of the 290x's can push consistently 60fps at ultra with three 1440p monitors. I'm thinking they can't.


I would truly hope they can. May have to resort to FXAA, but such is life.


----------



## quakermaas

Quote:


> Originally Posted by *Jaju123*
> 
> Anyone know how I can flash the BIOS of my second AMD R9 290 instead of the first one while they are both plugged in to the system? Thanks.


Change the 0 to a 1:

atiflash.exe -f -p 0 ~.rom for the first card

atiflash.exe -f -p 1 ~.rom for the second card

Reboot after doing the first card to make sure it flashed OK before doing the second card.

I usually just flash both at the same time after I have tested the BIOS.
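For what it's worth, the `-p` flag is what selects the adapter index. A minimal dry-run helper (hypothetical; atiflash itself has to be run for real from an elevated prompt, and the adapter numbering should be confirmed first, commonly with `atiflash -i`) that only prints the command for each card, so you can sanity-check the indices before committing to a flash:

```shell
# flash_cmd prints (but does not run) the atiflash invocation for one
# adapter index, so you can eyeball the command before flashing.
# Assumption: index 0 is the first card, 1 the second, per the post above.
flash_cmd() {
    idx="$1"   # adapter index (verify against your atiflash adapter list)
    rom="$2"   # path to the BIOS image you already tested
    echo "atiflash -f -p $idx $rom"
}

flash_cmd 0 card.rom   # first card
flash_cmd 1 card.rom   # second card
```

Once the printed commands match the ordering you expect, run the real ones one at a time with a reboot in between, as described above.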


----------



## iamhollywood5

I guess it's time for me to join the club!

1. GPU-Z Validation http://www.techpowerup.com/gpuz/u8dww/


2. XFX R9 290 unlocked to 290X with BIOS flash
3. Water cooling (EK full cover block)


----------



## EmZkY

Add me to da club

XFX R9 290 unlocked into an R9 290X.
Using the Asus BIOS and Catalyst 13.11 beta 9.5.
Clocks core/mem: 1150/1400 @ 1.375V

I am getting artifacts at 1.400V+ and core clocks of 1180+. Anyone got any suggestions? Try another BIOS or driver?


----------



## Newbie2009

It's taking a long time for TriXX to get updated, innit.


----------



## the9quad

Quote:


> Originally Posted by *Newbie2009*
> 
> It's taking a long time for TriXX to get updated, innit.


yes it is.


----------



## iamhollywood5

Quote:


> Originally Posted by *EmZkY*
> 
> Add me to da club
> 
> XFX R9 290 unlocked into an R9 290X.
> Using the Asus BIOS and Catalyst 13.11 beta 9.5.
> Clocks core/mem: 1150/1400 @ 1.375V
> 
> I am getting artifacts at 1.400V+ and core clocks of 1180+. Anyone got any suggestions? Try another BIOS or driver?


What's your ASIC?


----------



## pkrexer

Looks like I'm "upgrading" to a 290X. After searching the internet for half the day for a decently priced 290, I noticed some Asus 290Xs popped into stock on Newegg, so I decided I'd grab one. I was going to hold out for an XFX or PowerColor 290 so I'd have a chance of getting an unlockable one, but at least I know I'm for sure getting a true 290X now =p


----------



## EmZkY

Quote:


> Originally Posted by *iamhollywood5*
> 
> What's your ASIC?


My ASIC is 68.8 %.


----------



## HardwareDecoder

Quote:


> Originally Posted by *pkrexer*
> 
> Looks like I'm "upgrading" to a 290X. After searching the internet for half the day for a decently priced 290, I noticed some Asus 290Xs popped into stock on Newegg, so I decided I'd grab one. I was going to hold out for an XFX or PowerColor 290 so I'd have a chance of getting an unlockable one, but at least I know I'm for sure getting a true 290X now =p


Sucks that you missed the CM sale.


----------



## Durvelle27

Do you still need to flash the ASUS BIOS in order to OC the 290(X) without CCC?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Durvelle27*
> 
> Do you still need to flash the ASUS BIOS in order to OC the 290(X) without CCC?


No, you never had to. Afterburner Beta 17 does everything GPU Tweak does, and for me it's even better. I'm on a Sapphire and get a 1200/1600 OC with the stock BIOS.


----------



## Durvelle27

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> No, you never had to. Afterburner Beta 17 does everything GPU Tweak does, and for me it's even better. I'm on a Sapphire and get a 1200/1600 OC with the stock BIOS.


Does it have voltage control?


----------



## Forceman

Quote:


> Originally Posted by *Durvelle27*
> 
> Does it have voltage control?


+100 mV. The Asus BIOS and GPU Tweak go higher, though, if you want more; they can do 1.412 V.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Durvelle27*
> 
> Does it have voltage control?


Yes, once you enable it in the settings menu. It will allow you to do +100 mV right away, but you can go as high as +200 mV by doing this:

To set a +200 mV offset:
MSIAfterburner /wi4,30,8d,20

To restore the original voltage offset:
MSIAfterburner /wi4,30,8d,0

Shift+right-click in the MSI Afterburner folder -> "Open command window here" -> type in MSIAfterburner /wi4,30,8d,10

Use at your own risk!!
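Reading the examples above together (0x20 for +200 mV, 0x10 in the last command, 0 to restore), the trailing hex byte appears to encode the offset in 6.25 mV steps. That mapping is my inference from the post, not documented MSI behavior. A small helper that just builds the command string for a given offset, without touching Afterburner itself:

```shell
# ab_offset_arg prints the MSIAfterburner command line for a given
# positive core-voltage offset in millivolts.
# Assumption (inferred from the examples above, not from MSI docs):
# the trailing byte is the offset in 6.25 mV steps, so 200 mV -> 0x20.
ab_offset_arg() {
    mv="$1"                         # desired offset in millivolts
    steps=$(( mv * 100 / 625 ))     # integer number of 6.25 mV steps
    printf 'MSIAfterburner /wi4,30,8d,%x\n' "$steps"
}

ab_offset_arg 200   # +200 mV
ab_offset_arg 100   # +100 mV
ab_offset_arg 0     # restore stock offset
```

Treat the printed command as something to review and run by hand; as the poster says, use at your own risk.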


----------



## Durvelle27

Quote:


> Originally Posted by *Forceman*
> 
> +100mv. The Asus BIOS and GPU Tweak go higher though, if you want more. They can do 1.412.


+100 mV is around 1.3 V, correct?


----------



## stickg1

Man, this is brutal. Every time I ship someone something USPS Priority, they get it in like 2 days. Every time I buy something and it's shipped USPS, it takes a dang week. My 290 was shipped Monday and the expected delivery date is Saturday? Da fuq? I shipped the 7970 Wednesday at lunch and the dude says it's scheduled for delivery tomorrow. I shipped a 670 Tuesday and the dude got it today. Not cool man, not cool!


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Durvelle27*
> 
> +100 mV is around 1.3 V, correct?


Yes, that is before Vdroop though, so effectively under load you'll be right around 1.2 V.


----------



## HardwareDecoder

Quote:


> Originally Posted by *stickg1*
> 
> Man, this is brutal. Every time I ship someone something USPS Priority, they get it in like 2 days. Every time I buy something and it's shipped USPS, it takes a dang week. My 290 was shipped Monday and the expected delivery date is Saturday? Da fuq? I shipped the 7970 Wednesday at lunch and the dude says it's scheduled for delivery tomorrow. I shipped a 670 Tuesday and the dude got it today. Not cool man, not cool!


Heh... same exact feeling. I shipped my 7950s Tuesday and they got delivered today. Newegg shipped my 290X yesterday; won't have it till Monday. Ughhh.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *stickg1*
> 
> Man, this is brutal. Every time I ship someone something USPS Priority, they get it in like 2 days. Every time I buy something and it's shipped USPS, it takes a dang week. My 290 was shipped Monday and the expected delivery date is Saturday? Da fuq? I shipped the 7970 Wednesday at lunch and the dude says it's scheduled for delivery tomorrow. I shipped a 670 Tuesday and the dude got it today. Not cool man, not cool!


YEP! It's craaaay! I know dat feel


----------



## Sgt Bilko

Quote:


> Originally Posted by *stickg1*
> 
> Man, this is brutal. Every time I ship someone something USPS Priority, they get it in like 2 days. Every time I buy something and it's shipped USPS, it takes a dang week. My 290 was shipped Monday and the expected delivery date is Saturday? Da fuq? I shipped the 7970 Wednesday at lunch and the dude says it's scheduled for delivery tomorrow. I shipped a 670 Tuesday and the dude got it today. Not cool man, not cool!


Well, at least Express makes a difference for you.

I ordered 3 packages from the same shop: 1 Express, 1 Normal Post, and another by Courier. They all turned up on the same day at the same time.


----------



## iamhollywood5

Quote:


> Originally Posted by *EmZkY*
> 
> My ASIC is 68.8 %.


And people say it means nothing, lol.

Mine is 68.1%, so that doesn't bode well for me. I've only tried 1100 MHz on stock volts with air cooling and there are no problems so far, but I haven't tried more yet. Eager to see what it's like when I get my waterblock.


----------



## Jaju123

Is it normal for BF4 on ultra at 1440P to eat all 4GB of VRAM? Mine is like 4.1GB VRAM used just in singleplayer!


----------



## battleaxe

Quote:


> Originally Posted by *stickg1*
> 
> Man, this is brutal. Every time I ship someone something USPS Priority, they get it in like 2 days. Every time I buy something and it's shipped USPS, it takes a dang week. My 290 was shipped Monday and the expected delivery date is Saturday? Da fuq? I shipped the 7970 Wednesday at lunch and the dude says it's scheduled for delivery tomorrow. I shipped a 670 Tuesday and the dude got it today. Not cool man, not cool!


Which is why I don't use Universal Package Smashers anymore. FedEx only for me.

Universal Package Smashers took two months to get my money back after they did their deed on a $2k subwoofer for my company. It looked like they threw it off the truck and then dragged it down the highway for a few miles before it made it to its destination. Haven't used them since. At least they got it there in one piece for you.


----------



## Durvelle27

Anybody using an Accelero Xtreme III?


----------



## Arizonian

Quote:


> Originally Posted by *iamhollywood5*
> 
> I guess it's time for me to join the club!
> 
> 1. GPU-Z Validation http://www.techpowerup.com/gpuz/u8dww/
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 2. XFX R9 290 unlocked to 290X with BIOS flash
> 3. Water cooling (EK full cover block)


Congrats - added

Quote:


> Originally Posted by *EmZkY*
> 
> Add me to da club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> XFX R9 290 unlocked into a R9 290X.
> Using Asus bios and Catalyst 13.11 beta 9.5.
> Clocks core/mem: 1150/1400 @ 1.375V
> 
> I am getting artefacts at 1.400V+ and core clock at 1180+. Anyone got any suggestions? Try another bios or driver?
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Forceman

Quote:


> Originally Posted by *Durvelle27*
> 
> +100mV is around 1.3v correct


Should be about 1.35V, or around 1.3V actual. That's what I've seen.


----------



## Durvelle27

Quote:


> Originally Posted by *Forceman*
> 
> Should be about 1.35V, or around 1.3V actual. That's what I've seen.


Thanks, hopefully the Accelero Xtreme III can handle it. I ordered that and a Sapphire R9 290.


----------



## ImJJames

Newegg uses OnTrac, and OnTrac sucks at delivering packages....

What is this crap??? I'm seriously about to file a claim. How can you claim you delivered my package yesterday but then show it out for delivery again today... and no package to be found.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> Thx hopefully the Accelero Xtreme III can handle it. I ordered that and a Sapphire R9 290.


Yeah, the AX III can handle it fine; your VRMs will heat up a bit, though. In benches I hit 80°C/69°C for the VRMs and 80°C or so for the core.

This was at about 30°C ambient, though.


----------



## rdr09

Quote:


> Originally Posted by *iamhollywood5*
> 
> I guess it's time for me to join the club!
> 
> 1. GPU-Z Validation http://www.techpowerup.com/gpuz/u8dww/
> 
> 
> 2. XFX R9 290 unlocked to 290X with BIOS flash
> 3. Water cooling (EK full cover block)


Welcome aboard, iamhollywood5!


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah, the AX III can handle it fine; your VRMs will heat up a bit, though. In benches I hit 80°C/69°C for the VRMs and 80°C or so for the core.
> 
> This was at about 30°C ambient, though.


My room ambient ranges from 20°C to 23°C, and thanks. Are those temps stock or OC'd?


----------



## Slomo4shO

Quote:


> Originally Posted by *ImJJames*
> 
> Newegg uses Ontrac and Ontrac sucks for delivering packages....
> 
> 
> 
> What is this crap??? I'm seriously about to file a claim, how the hell do you claim you delivered my package but then delivery again today...and no PACKAGE yet to be found.


I have had UPS deliveries as late as 8 PM... That, however, is odd.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> My room ambient ranges from 20°C to 23°C, and thanks. Are those temps stock or OC'd?


OC'd with +100 mV through AB; I'll see if I can find my original post for it.

With the heatsinks, though, you will need 16 for the VRAM and at least one low-profile one (if you didn't already know, that is).


----------



## ImJJames

Quote:


> Originally Posted by *Slomo4shO*
> 
> I have had UPS deliveries as late as 8PM...


Don't you see, they claimed they delivered it yesterday...


----------



## Slomo4shO

Quote:


> Originally Posted by *ImJJames*
> 
> Don't you see, they claimed they delivered it yesterday...


It is odd. At least you are not dealing with USPS trying to get them to fix a drop box at a condo. I have been trying for two weeks with no success. I have already had them drop 3 deliveries in there and then return them to sender after a few days, since the key no longer works and I am unable to retrieve them... I get to spend another hour on the phone tonight.


----------



## kcuestag

Quote:


> Originally Posted by *quakermaas*
> 
> Change the 0 to a 1:
> 
> atiflash.exe -f -p 0 ~.rom for the first card
> atiflash.exe -f -p 1 ~.rom for the second card
> 
> Reboot after doing the first card to make sure it flashed OK before doing the second card.
> I usually just flash both at the same time after I have tested the BIOS.


Are you going to get them under water?


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> OC'd with +100 mV through AB; I'll see if I can find my original post for it.
> 
> With the heatsinks, though, you will need 16 for the VRAM and at least one low-profile one (if you didn't already know, that is).


Yes, I know. I bought some copper heatsinks for the VRMs and VRAM. And thanks, I'd love to see what you get with this cooler. I'm hoping for 1200+/1400+.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> Yes I know. I bought some copper heatsinks for the VRMs and VRAM. And thanks would love to see what you got with this cooler. I'm hoping for 1200+/1400+


I never hit that high, unfortunately; 1150/1350 was my 24/7 clock (Elpida RAM), but if you get lucky I can believe 1200/1400 with extra volts.


----------



## the9quad

Quote:


> Originally Posted by *ImJJames*
> 
> Dont you see they claimed they delivered yesterday...


If it's anything like UPS: sometimes when they try to deliver something and no one is there, they put it back to deliver again, and it will sometimes still show up as delivered. They will try to deliver for 3 days; after that they will call you and schedule a delivery for when you can be home. This recently happened to me; even the UPS guy mentioned it to me again tonight. He said sometimes it just gets entered in funny when you are not home the first time.

You can try entering the tracking info into the UPS website instead, as that tracker has more detail.


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I never hit that high, unfortunately; 1150/1350 was my 24/7 clock (Elpida RAM), but if you get lucky I can believe 1200/1400 with extra volts.


Thx. Well we'll know this weekend. Gonna push it to the limits before putting it under water.


----------



## Jaju123

Can you add me to the club?

http://www.techpowerup.com/gpuz/3kuvw/

http://www.techpowerup.com/gpuz/w3caa/

Got two in Crossfire!


----------



## ImJJames

Quote:


> Originally Posted by *the9quad*
> 
> If it's anything like UPS: sometimes when they try to deliver something and no one is there, they put it back to deliver again, and it will sometimes still show up as delivered. They will try to deliver for 3 days; after that they will call you and schedule a delivery for when you can be home. This recently happened to me; even the UPS guy mentioned it to me again tonight. He said sometimes it just gets entered in funny when you are not home the first time.
> 
> You can try entering the tracking info into the UPS website instead, as that tracker has more detail.


I just called OnTrac and they claim they accidentally used the wrong code showing it was delivered yesterday (but we all know OnTrac was just trying to protect their behinds, since it was supposed to be delivered yesterday). They told me they had a huge influx of deliveries and that is what caused the delay. They contacted the truck driver; they do have my package in the truck, and it should be coming any time from now till 8 PM.


----------



## darkelixa

Still waiting for my XFX R9 290 to ship.


----------



## Forceman

Anyone else having video playback problems with Beta 9.4? If I try to play any movie (in pretty much any format) I get either application crashes or BSODs with Media Player or Media Player Classic. VLC seems to work, but any time you seek it goes gray for a second and then shows the video again. It worked fine a week ago, but since then I've switched from the WHQLs to 9.4 and also re-installed my DisplayLink adapter, so I don't know whether the drivers are to blame or if it is some interaction with DisplayLink. DisplayLink didn't cause any trouble before, though, so I'm leaning toward the beta drivers; I just don't want to go back to the WHQLs unless I have to.

Edit: Guess it was DisplayLink after all. Removed that and now it works fine.


----------



## the9quad

Firestrike Extreme: *13,402*
Graphics Score 16234
Physics Score 16736
Combined Score 5141
http://www.3dmark.com/fs/1257453


----------



## Jpmboy

http://www.overclock.net/t/1443196/firestrike-extreme-top-30


----------



## DeadlyDNA

Quote:


> Originally Posted by *the9quad*
> 
> Firestrike Extreme: *13,402*
> Graphics Score 16234
> Physics Score 16736
> Combined Score 5141
> http://www.3dmark.com/fs/1257453


You know, I have four 290s but only three in the system right now, and this tri-fire, like yours, is really kicking it in gaming for me.
I know the 4th card may not scale as well as tri-fire does, and I am almost tempted to send the 4th back, lol. It's not even open yet either. I am really digging the step up from my 680s.


----------



## Sgt Bilko

Just got off the phone with PCCG; they have just received my 290X and will test it tonight... hopefully all goes well.

Wish these non-reference cards were releasing sooner than January, though; I need a replacement before then.


----------



## magicase

Add me to the list please.

Managed to flash the Asus card to the 290X BIOS and it's stable.


----------



## grunion

Couple of 3DM11 scores comparing FP on and off.

Performance

Extreme

So if you're going for numbers, turn off frame pacing.

And I know, it shows 1000 core, but trust me it was 1150.


----------



## skupples

Quote:


> Originally Posted by *Slomo4shO*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> It is odd. At least you are not dealing with USPS trying to get them to fix a drop box at a condo. I have been trying for two weeks and no success. I have already had them drop 3 deliveries in there and then returning them back to sender after a few days since the key no longer works and I am unable to retrieve them... I get to spend another hour on the phone tonight


I avoid USPS at all costs. Had a decent-sized eBay order hit their sorting facility in Rosemead, California on the 27th of November (paid $30 for overnight). It is now obviously lost, & the people on the phone simply said "we will call you when we find it." This is the third time in 6 months they have lost a package of mine. Being that they are a federal entity they have no incentive to operate in a professional or competent manner. A few years ago I shipped a PC; I chose 2-day air plus insurance. Well guess what? They "lost" the package, and would only pony up 50% of the insured amount. About 6 months later the system showed up SMASHED to pieces. Literally, smashed. Like someone opened up the packaging, beat the computer with a hammer, then packaged it back up & shipped it to me. I'm sure this is a minority experience, but these combined experiences make me avoid them like the plague.


----------



## Sgt Bilko

Quote:


> Originally Posted by *skupples*
> 
> I avoid USPS @ all costs. Had a decent sized Ebay order hit their sorting facility in Rosemead california on the 27th of november(payed 30$ for over night). It is now obviously lost, & the people on the phone simply said "we will call you when we find it." This is the third time in 6 months they have lost a package of mine. Being that they are a federal entity they have no incentive to operate in a professional or competent manner. A few years ago I shipped a PC, i chose 2 day air, + insurance. Well guess what? They "lost" the package, and would only pony up 50% of the insured amount. About 6 months later the system showed up SMASHED to pieces. Literally, smashed. Like some one opened up the packaging, beat the computer with a hammer, then packaged it back up & shipped it to me. I'm sure this is a minority experience, but these combined experiences make me avoid them like the plague.


Wow, that's a horror story if I've ever heard one. I've gotten some things from the BioWare store in the US and they get sent through USPS to me, and I have to say.....they aren't in great shape when they get here. One box was crushed to half its original size........Bubble Wrap saved Liara there


----------



## Durquavian

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Wow, that's a horror story if i've ever heard one, I've gotten somethings from the BioWare store in the US and they gets sent through USPS to me and i have to say.....they aren't in great shape when they gets here, one box was crushed to half it's original size........Bubble Wrap saved Liara there


Do you guys have ACE VENTURA delivering your mail?


----------



## skupples

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Wow, that's a horror story if i've ever heard one, I've gotten somethings from the BioWare store in the US and they gets sent through USPS to me and i have to say.....they aren't in great shape when they gets here, one box was crushed to half it's original size........Bubble Wrap saved Liara there


It has to do with who & what they are.

I now only use them when ordering from in state, yet packages still show up pancaked with busted tape.


Spoiler: Warning: Spoiler!


----------



## Durquavian

Quote:


> Originally Posted by *skupples*
> 
> It has to do with who & what they are.
> 
> I now only use them when ordering from in state, yet packages still show up pancaked with busted tape.
> 
> 
> Spoiler: Warning: Spoiler!


That there is the employee of the month. LOOK at that ENTHUSIASM. They say hard work and determination is dead.


----------



## theyedi

Hmm, I followed the instructions to prevent downclocking. It works, for one of the cards.. the other one seems like it's compensating or something though, dropping up to 200MHz while the other sits at max clock the whole time. Is there something else to do for CrossFire?


----------



## Scorpion49

I've shipped literally hundreds of packages with USPS and only had one issue, one single time, and it was lost in the freaking UK by whatever service they use. USPS refunded me for the item with the insurance even though it did eventually turn up after about a month. Now Fedex and UPS have been a nightmare for me.


----------



## RAFFY

Quote:


> Originally Posted by *Scorpion49*
> 
> I've shipped literally hundreds of packages with USPS and only had one issue, one single time, and it was lost in the freaking UK by whatever service they use. USPS refunded me for the item with the insurance even though it did eventually turn up after about a month. Now Fedex and UPS have been a nightmare for me.


I live in a college town and the guy who owns the ONLY two UPS stores in town is a scoundrel. I had already purchased postage online from UPS.com and was shipping a Thermaltake mATX case (that black one that was popular about 2 years back). I was actually shipping it in the original packaging that the case was shipped to me in. Funny thing is, the box still had the stickers from when it was in China or Japan. The owner of the UPS store said my box wasn't good enough to ship the item and that I could only ship it if they boxed it for me. The box for the case was in perfect condition besides the usual minor dents in the cardboard from shipping. Needless to say the conversation became very heated and I was banned from his UPS stores since I called him a F(word) A(word), all while not raising my voice lol. Unfortunately he didn't know I have some friends with parents who work for corporate....ouch








But yeah, USPS, I love them! Their new 3-day guaranteed flat rate service is stellar!


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> You can get more..........
> 
> http://www.3dmark.com/fs/1244748 *10255*


Quote:


> Originally Posted by *Forceman*
> 
> Yes, yes you can.
> 
> http://www.3dmark.com/fs/1155472 *11259*


Dang ...... get back to ya on that one .......... Very niiiiice


----------



## binormalkilla

Got my third 290x in today. Here are Trifire results at stock clocks, 4.4 GHz 4930K. Still trying to get this CPU stable at 4.6 or higher.


----------



## the9quad

binormalkilla, my results are nearly identical btw for stock clocks at 4.4. Glad you posted those, cuz I always worried mine were slow or something.


----------



## quakermaas

Quote:


> Originally Posted by *kcuestag*
> 
> Are you going to get them under water?


Of course, Aqua computer blocks ordered, can't listen to them hair dryers blasting.

I have the computer in the bedroom, and after I removed the watercooled 7970s and fitted the air-cooled R9 290s the wife was asking "WTH is wrong with your computer, it's sooo loud"









I had the fan profile at 80C/100%







for some benchmarks.


----------



## darkelixa

16/12 ETA for my xfx


----------



## RAFFY

Quote:


> Originally Posted by *darkelixa*
> 
> 16/12 ETA for my xfx


I don't even have an ETA on my 3 water blocks and back plates


----------



## Scorpion49

Quote:


> Originally Posted by *binormalkilla*
> 
> Got my third 290x in today. Here are Trifire results at stock clocks, 4.4 GHz 4930K. Still trying to get this CPU stable at 4.6 or higher.


Nice scores. That's why I'm afraid to change out my 3930K; it can do 4.8 easy at 1.425V and 5.0 with a little more voltage than I like to use every day (~1.475V or so). I haven't seen very many 4930Ks at all that can match up to this.


----------



## binormalkilla

Quote:


> Originally Posted by *Scorpion49*
> 
> Nice scores. Thats why I'm afraid to change out my 3930k, it can do 4.8 easy at 1.425V and 5.0 with a little more voltage than I like to use every day (~1.475V or so). I haven't seen very many 4930k at all that can match up to this.


I was able to run small-FFT Prime95 for 18 hours at 4.6, but I crashed after I opened my browser. I'm running OCCT at 4.5 right now (testing memory too). For 4.4 I'm at 1.344V after LLC at full load.

This board is pretty buggy too...so hopefully a BIOS update might help.


----------



## Arizonian

Quote:


> Originally Posted by *Jaju123*
> 
> Can you add me to the club?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/3kuvw/
> 
> http://www.techpowerup.com/gpuz/w3caa/
> 
> Got two in Crossfire!


I don't see manufacturer listed so by default it's been listed as Sapphire. PM me if I'm wrong and I'll update roster. Either way - Congrats - added









Quote:


> Originally Posted by *magicase*
> 
> Add me to the list please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Managed to flash the Asus card to the 290x bios and stable


No OCN name in the pic or a validation link with OCN name provided, but I'll take your word for it. Manufacturer boxes in the pic helped me list both. Congrats - added









Quote:


> Originally Posted by *binormalkilla*
> 
> Got my third 290x in today. Here are Trifire results at stock clocks, 4.4 GHz 4930K. Still trying to get this CPU stable at 4.6 or higher.
> 
> 
> Spoiler: Warning: Spoiler!


I looked on the roster to update you to a third GPU when I noticed that I don't have you on the list. Either I missed you or you haven't submitted a pic w/proof. You're welcome to join us if you haven't, and if you did, please send me a PM with the link to your entry post.

Just a reminder to new members coming on board per OP and ease of me spotting entries.

To be added on the member list please submit the following in your post

1. GPU-Z Link or Screen shot with OCN Name
2. Manufacturer & Brand
3. Cooling - Stock, Aftermarket or 3rd Party Water


----------



## Scorpion49

Quote:


> Originally Posted by *binormalkilla*
> 
> I was able to run small fft prime 95 for 18 hours at 4.6, but I crashed after I opened my browser. I'm running occt at 4.5 right now (testing memory too). For 4.4 I'm at 1.344 after LLC in full load.
> 
> This board is pretty buggy too...so hopefully a BIOS update might help.


I haven't bothered with that rigorous stability testing in years. I run a round of IBT, if it fails I add some voltage. Screw running hours and hours and hours of tests just to have it crash, if it does crash someday I'll add a little more and I've still saved myself a whole day or more of efforts. Play some games or videos, no crash I'm happy. This method has always ended up with my machine being folding stable so I'm fine with that.


----------



## psyside

Guys, a few questions. I know, I know, I ask the same thing all over, but I want to make sure.

1. What VRM temps *(max)* should one expect at 22C ambient, stock cooling, ~1100/1400 and fan at 55-60% with a 290, *during prolonged gaming periods?* What is the max you guys get? If anyone can loop Unigine or something it would be great, ~20 mins would be fine, just max it out. Also check the max core temps please.

2. Do you guys see any downclocking or anything if you don't max out the power target? Did anyone try to OC to around 1100/1400 without touching the power and volts, but with fans at ~60%? What effect does it have on temps and clock stability?

3. I can only get Sapphire cards where I live (the BF4 edition) - any chance to unlock this?

4. Do all cards have the option to monitor VRM temps, and does flashing the BIOS help with stability/OC if you are not going for a max air OC? I don't mind running my card @ 5% lower clocks; not really a fan of flashing the vBIOS.

5. Does anyone know of a clock-for-clock test/review of the 290 vs 290X?

Thanks guys


----------



## binormalkilla

Quote:


> Originally Posted by *Scorpion49*
> 
> I haven't bothered with that rigorous stability testing in years. I run a round of IBT, if it fails I add some voltage. Screw running hours and hours and hours of tests just to have it crash, if it does crash someday I'll add a little more and I've still saved myself a whole day or more of efforts. Play some games or videos, no crash I'm happy. This method has always ended up with my machine being folding stable so I'm fine with that.


Well the weird thing is that it would pass a stability test, but crash by having a single browser tab open.


----------



## binormalkilla

Quote:


> Originally Posted by *Arizonian*
> 
> I don't see manufacturer listed so by default it's been listed as Sapphire. PM me if I'm wrong and I'll update roster. Either way - Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> No OCN name in the pic or a validation link with OCN name provided but I'll take your word for it. Manufactuer boxes in pic helped me list both. Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> I looked on the roster to update you to a third GPU when I noticed that I don't have you on the list. Either I missed you or you haven't submitted pic w/proof. Your welcome to join us if you haven't and if you did please send me PM with the link to your entry post.
> 
> Just a reminder to new members coming on board per OP and ease of me spotting entries.
> 
> To be added on the member list please submit the following in your post
> 
> 1. GPU-Z Link or Screen shot with OCN Name
> 2. Manufacturer & Brand
> 3. Cooling - Stock, Aftermarket or 3rd Party Water


I'll have to add the info some other time, since I'm posting from my phone ATM. I'm on air until more ek blocks come in. Also 1 MSI and 2 Sapphire.


----------



## Scorpion49

Quote:


> Originally Posted by *binormalkilla*
> 
> Well the weird thing is that it would pass a stability test, but crash by having a single browser tab open.


That's because stability tests suck, IMO. They only hit certain instruction sets, and if they do hit all instruction sets the load is so unreasonable that it doesn't matter anyway. Random hits and loads from everyday use have always picked out a bad OC for me after the initial "is it enough voltage" check.


----------



## quakermaas

Quote:


> Originally Posted by *theyedi*
> 
> hmm, I followed the instructions to prevent downclocking. it works, for one of the cards.. the other one seems like it's compensating or something though, dropping up to 200 mhz while the other sits at max clock the whole time. is there something else to do for xfire?


I was getting the same thing.



I then installed the 9.5 beta driver and was then able to check in Overdrive that the powerlimits were not being applied properly to the second card.


----------



## hotrod717

Do I need a WHQL driver to submit to the top 30? First run ever of 3DMark -


It's got more in it; this is only at 1150.

1180.


----------



## Forceman

Quote:


> Originally Posted by *psyside*
> 
> Guys few questions, i know i know, i ask the same thing all over, but i want to make sure
> 
> 1. What VRM temps *(max)* should one expect in 22c ambient, stock cooling, ~ 1100/1400 and fan at 55-60% with 290, *during prolong game periods?* what is the max you guys get, if anyone can loop Unigine or something it would be great, ~20 mins would be fine, just max it out. Also check the max core temps please.
> 
> 2. Do you guys see any downclocking or something if you don't max out power target? did anyone try to oc at around 1100/1400 without touching the power and volts, but fans at ~60% what effect does it makes on temps and clock stability?
> 
> 3. I can only get Sapphire cards where i live, its the BF4 edition any chance to unlock this?
> 
> 4. Does all cards have option to monitor vrm temps, and does flashing BIOS help with stability/oc if you are not going for max air oc? i don't mind running my card @5% lower clocks, not really fan of flashing vBIOS.
> 
> 5. Does anyone know of clock for clock test/review with 290 vs 290X?
> 
> Thanks guys


1. When I had the stock cooler on mine, the VRMs maxed out around 65C or 70C with a 65% max fan speed. So probably under 70C in your conditions. Max core temp was 85 or 90C at those fan speeds, for me.

2. Yes, you can get some downclocking at the 100% power target, but it seems to depend on the individual card. I was able to run mine at 1100/1350 with 65% fan speed without any downclocking at stock volts and 150% power target.

3. Some Sapphire cards have unlocked, but not as many as the other brands.

4. All cards have VRM temp monitoring ability. The only advantage to flashing a different BIOS would be to get the higher voltage limits with the Asus BIOS/GPU Tweak.

5. About 5% lead for the 290X over the 290 from my testing, but I'm pretty sure several sites did comparisons when the 290 launched. I know Hardware Canucks did.


----------



## muhd86

*
modest score for 3 tri fire sapphire r290*


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I never hit that high unfortunately 1150/1350 was 24/7 clocks for me (Elphida Ram
> 
> 
> 
> 
> 
> 
> 
> ) but if you get lucky i can believe 1200/1400 with extra volts


Do you have an AB snapshot of GPU usage while in BF4? I'm trying to compare other FX-8350s with these 290s. My buddy gets a lot of drops and just overall low usage in BF4 specifically. Much appreciated if possible


----------



## brazilianloser

Having a bit of a headache with my recently arrived second card and my eyefinity setup...


----------



## binormalkilla

Quote:


> Originally Posted by *Scorpion49*
> 
> Thats because stability tests suck, IMO. They only hit certain instruction sets, and if they do hit all instruction sets the load is so unreasonable that it doesn't matter anyway. Random hits and loads from everyday use have always picked out a bad OC for me after the initial "is it enough voltage" check.


FYI, an 'instruction set' is simply the way a CPU can execute operations on its components. You can download instruction set documentation for different microarchitectures and see how an operation utilizes that architecture's registers. For instance:
http://www.intel.com/content/dam/www/public/us/en/documents/manuals/64-ia-32-architectures-software-developer-vol-2a-manual.pdf

When you're running small FFTs (Fast Fourier Transforms), you're basically 'working out' the inner workings of your CPU (registers, ALU, buffers, buses, etc.). That said, I think that Prime95 and memtest86 are both fairly reasonable stress tests. You just have to use them correctly.
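To illustrate the idea (this is my own sketch, not Prime95's actual code): a transform small enough to fit in cache keeps the ALUs and registers saturated with almost no main-memory traffic, and checking the FFT/inverse-FFT round trip catches silent math errors from an unstable overclock. The function name and sizes here are made up for the example.

```python
# Minimal small-FFT-style workload sketch (illustrative only, not Prime95).
import numpy as np

def fft_stress_round(size=4096, iterations=100):
    """Run repeated in-cache FFT/inverse-FFT round trips and return the
    worst round-trip error seen; a stable CPU keeps this near machine epsilon."""
    rng = np.random.default_rng(42)
    data = rng.standard_normal(size)
    worst = 0.0
    for _ in range(iterations):
        spectrum = np.fft.fft(data)            # forward transform
        restored = np.fft.ifft(spectrum).real  # inverse should recover the input
        worst = max(worst, float(np.abs(restored - data).max()))
    return worst

print(fft_stress_round())  # stays near machine epsilon unless the CPU miscalculates
```

A real stress tester pins one such loop per core for hours; the point is that a wrong answer, not just a crash, is the failure signal.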

@Arizonian
Here's my validation link
http://www.techpowerup.com/gpuz/d3wwe/


----------



## muhd86

Is it just me? I wonder why I don't reach 150/160 FPS in the Valley benchmark with a 4.4GHz OC on my 3930K and some overclocking on the three R9 290s - it's more like 130/145 frames.

Should this be higher? I had GTX 760s before - removed the drivers with Driver Cleaner etc. and installed the ATI cards.

I know 3-way CrossFire is working, but I wonder why the scores are lower here.

Even in 3DMark 2013 (now installing the new 13.11 Beta 9.5 driver), Fire Strike runs in slow motion - it's like the GPU just does not like this benchmark for some reason.

Can anyone help?


----------



## Sgt Bilko

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Do you have an AB snapshot of GPU usage while in BF4? I'm trying to compare other FX8350's with these 290's. My buddy gets a lot of drops and just overall low usage in BF4 specifically. Much appreciated if possible


Unfortunately my card is about 800km away waiting to be replaced or refunded. I could only ever play the SP campaign due to shoddy net, and even then it was somewhat sporadic. I never tried with the 13.11 WHQL drivers though; they seemed to smooth out GPU usage everywhere else.


----------



## brazilianloser

Eyefinity worked like a charm when was using single card... Put my second in and now it doesn't work anymore. Yet another headache.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Scorpion49*
> 
> I haven't bothered with that rigorous stability testing in years. I run a round of IBT, if it fails I add some voltage. Screw running hours and hours and hours of tests just to have it crash, if it does crash someday I'll add a little more and I've still saved myself a whole day or more of efforts. Play some games or videos, no crash I'm happy. This method has always ended up with my machine being folding stable so I'm fine with that.


*YEP , THIS ^^







*

Quote:


> Originally Posted by *muhd86*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> *
> modest score for 3 tri fire sapphire r290*


Hey there . Up here too eh ? Sweet score . 1200pts or so more than my ol' tri 760 score........


----------



## famich

I don't get it: MSI AB shows a default core voltage of 1.15V on my card... +100mV gives 1.25V, and with the 150% "switch" I got 1.28V at load (Sapphire BIOS).

The ASUS BIOS shows a default +100mV more (1.250V); the max is, as we all know, 1.41V, i.e. around 1.31V in the "AB reading"? But hey, what is the correct reading - AB or GPU Tweak?

Sapphire or ASUS?


----------



## the9quad

13,759 Firestrike Extreme- Good enough for the 3 card #16 spot in the HOF. (http://www.3dmark.com/fs/1258271)

I am done benching these things before I break them.
Honestly couldn't be happier with these cards.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Unfortunately my card is about 800kms away waiting to be replaced or refunded, I could only ever play the SP campaign due to shoddy net and even then it was somewhat sporadic, never tried with the 13.11 WHQL drivers though, they seemed to smooth out GPU usage everywhere else.


Crap I forgot you had to send yours out









Hard to keep up with everyone in here







I only ask because once I switched back to my Intel, all my games are getting 100% GPU usage across the board. My friend got pissed because he's still on the 8350, but I didn't reinstall BF4 until just now, which he really wanted to see, and sure enough: incredible FPS at 1440p and 100% GPU usage lol. So just seeing if it's really an FX issue.

Hope you get sorted soon enough man, you need to be gaming!


----------



## Forceman

Quote:


> Originally Posted by *famich*
> 
> I don t get it : MSI AFB default VC on my card : 1.15 ... +100mV 1,25 with the 150"switch " I got 1,28 @ load. /Sapphire BIOS /
> 
> ASUS BIOS shows default +100mV more /1250/ max is , as we all know , 1,41 e.g. around 1,31 V in "AFB reading " ? But hey what is the correct reading - AFB /Sapphire or ASUS ?:r


Maybe the two BIOSes just have different voltages programmed. Or did you mean Afterburner and GPU Tweak?

I just checked mine, and I get 1.18V under load at default.

Edit: Interesting, looks like AB didn't apply the voltage increase at startup. It was 1.18V with +50 set, then 1.18V with +0 set, then when I reset it to +50, it went to 1.22V.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *theyedi*
> 
> hmm, I followed the instructions to prevent downclocking. it works, for one of the cards.. the other one seems like it's compensating or something though, dropping up to 200 mhz while the other sits at max clock the whole time. is there something else to do for xfire?


Did you go in to CCC and do +50% power limit for BOTH cards? Make sure you do it for both and apply after each one and also that this is done after you already applied your overclock.


----------



## Arizonian

Quote:


> Originally Posted by *the9quad*
> 
> 13,759 Firestrike Extreme- Good enough for the 3 card #16 spot in the HOF. (http://www.3dmark.com/fs/1258271)
> 
> I am done benching these things before I break them.
> Honestly couldn't be happier with these cards.


Way to go.









It is a good feeling when you press them as far as they'll go and see how well they perform. I find it most gratifying when I've found max stable OC while gaming which is my priority since I'm not a big benching fan.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *muhd86*
> 
> is it just me i wonder why i dont reach fps 150 / 160 in valley benchmark , with a 4.4ghz oc on my 3930k , and some over clocking on the 3 r290 - its like 130 / 145 frames
> 
> should this be higher ---- i had gtx 760 before - removed the drivers with driver cleaner etc ..installed the ati cards ---
> 
> i know crossfire 3 way is working ---but wonder why the scores are less here-
> 
> even in 3d mark 2013 , now installing the new 13.11 beta 9.5 driver ---fire strike works in slow motion - its like the gpu just does not like this benchmark for some reason .
> 
> can any one help


O/clock your hexy to 5 gigahurtles or so .............

I can bench mine @ [email protected]@1.48vc


----------



## brazilianloser

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Did you go in to CCC and do +50% power limit for BOTH cards? Make sure you do it for both and apply after each one and also that this is done after you already applied your overclock.


Sounds more like he needs to disable ULPS
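For anyone who hasn't done it before: ULPS parks the secondary CrossFire card at idle, and the commonly circulated way to disable it is setting `EnableUlps` to 0 in the registry. A sketch of the tweak is below; the subkey numbers (0000, 0001, ...) vary per system and you should verify the paths against your own registry (and back it up) before importing anything - treat this as an assumption, not gospel.

```
Windows Registry Editor Version 5.00

; Example only: repeat for each AMD display adapter subkey (0000, 0001, ...)
; under the display-class GUID. Check your own registry paths first.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Afterburner also exposes a "disable ULPS" toggle in its settings, which does the same thing without hand-editing the registry.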


----------



## nexusforce

Hey everyone, just got my new 290:

*HIS AMD R9 290
*Stock Cooler


Spoiler: Picture!


----------



## MrWhiteRX7

Quote:


> Originally Posted by *brazilianloser*
> 
> Sounds more like he needs to disable ULPS


Yea definitely should do that as well fo shooooooooooooooooooo!


----------



## Sgt Bilko

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Crap I forgot you had to send yours out
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hard to keep up with everyone in here
> 
> 
> 
> 
> 
> 
> 
> I only ask because once I switched back to my intel and all my games are getting 100% gpu usage across the board, my friend got pissed because he's still on 8350, but I didn't reinstall bf4 untl just now which he really wanted to see and sure enough, incredible fps in 1440p and 100% gpu usage lol so just seeing if it's really an FX issue.
> 
> Hope you get sorted soon enough man, you need to be gaming!


It's all good, I finally got the stable 5GHz on the 8350 I wanted. So if I decide to get another 290X then I'll run some more tests. Still hoping the non-ref cards drop before Xmas though (I can dream, can't I?







)


----------



## psyside

Quote:


> Originally Posted by *Forceman*
> 
> 1. When I had the stock cooler on mine, the VRMs maxed out around 65C or 70C with a 65% max fan speed. So probably under 70C in your conditions. Max core temp was 85 or 90C at those fan speeds, for me.
> 
> 2. Yes, you can get some downclocking at the 100% power target, but it seems to depend on the individual card. I was able to run mine at 1100/1350 with 65% fan speed without any downclocking at stock volts and 150% power target.
> 
> 3. Some Sapphire cards have unlocked, but not as many as the other brands.
> 
> 4. All cards have VRM temp monitoring ability. The only advantage to flashing a different BIOS would be to get the higher voltage limits with the Asus BIOS/GPU Tweak.
> 
> 5. About 5% lead for the 290X over the 290 from my testing, but I'm pretty sure several sites did comparisons when the 290 launched. I know Hardware Canucks did.


And this is why i love OCN, fantastic, rep + bro!


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's all good, I finally got my stable 5Ghz on the 8350 i wanted, So if i decide to get another 290x then i'll run some more tests, still hoping the non-ref cards drop before Xmas though (i can dream can't I?
> 
> 
> 
> 
> 
> 
> 
> )


NICE overclock! I left mine at 4.8GHz but used a 300+ FSB with 3000MHz on the NB and HT. Still didn't seem to push the 290 the way I had hoped :'(


----------



## psyside

@Forceman, where do you see the clock for clock @ HardwareCanucks bro? i can't find it.


----------



## Joeking78

Loving this Maximus Extreme









Not managed to run 3DMark yet on the three 290s, got carried away with all these BIOS tweaks









Currently at 4.5GHz 1.275 volts, which is a lower vcore than I used for the same freq on the MSI board...hoping I can get 4.8GHz stable with this board









Will post some 3DMark results shortly.


----------



## Tabzilla

Has anyone had to RMA a 290 (or 290x for that matter)? The card is HIS and did not appear to have any warranty stickers, but I'm still worried newegg might hassle me.
I am experiencing artifacts and restarts after installing my universal GPU water block and vrm/memory heatsinks. The restarts occur during scrypt mining. The artifacts appear while idle on the desktop as well as in games. Oddly enough, the artifacts do NOT appear while mining. I've had to resort to underclocking to 900/1200 to stay stable.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Tabzilla*
> 
> Has anyone had to RMA a 290 (or 290x for that matter)? The card is HIS and did not appear to have any warranty stickers, but I'm still worried newegg might hassle me.
> I am experiencing artifacts and restarts after installing my universal GPU water block and vrm/memory heatsinks. The restarts occur during scrypt mining. The artifacts appear while idle on the desktop as well as in games. Oddly enough, the artifacts do NOT appear while mining. I've had to resort to underclocking to 900/1200 to stay stable.


You should be good as long as you can put the stock cooler back on without it looking tampered with.


----------



## quakermaas

Any offers on 3Dmark 2013 anywhere? I want to buy it, but don't want to pay $24


----------



## brazilianloser

Just a heads up to anyone getting multiple cards and doing Eyefinity: make sure you have the same BIOS on all cards. I spent a few hours today beating my head against a wall because my Eyefinity setup would not work no matter what I did. Then, out of curiosity, I checked Asus for an update for the new card that arrived today, and to my surprise the fact that my older card had the newer version of their BIOS and the new one didn't was stopping Eyefinity from working. Anyways, if it helps anyone, great; if it doesn't, oh well. Just glad I can finally give Eyefinity a test with two cards.


----------



## HardwareDecoder

Quote:


> Originally Posted by *brazilianloser*
> 
> Just a heads up to anyone getting multiple cards and doing Eyefinity, make sure you got the same bios on all cards. Spent a few hours today beating my head against a wall because for some reason my eyefinity setup would not work no matter what I did until I just out of curiosity decided to check for an update for the new card that I got today from Asus and to my surprise the fact that my older one had the newer version of their bios and the new one didn't was stopping Eyefinity from working. Anyways if it helps anyone great if it doesn't oh well. Just glad I can finally give it a test to Eyefinity with two cards.


Ugh, good info. I'm gonna keep that in a file I'm making on these cards, in case I ever go xfire. Let me know how the micro stutter is please? That is why I got rid of my 7950 setup, that and the absurd mining prices on these.


----------



## Forceman

Quote:


> Originally Posted by *psyside*
> 
> @Forceman, where do you see the clock for clock @ HardwareCanucks bro? i can't find it.


Here's the summary:



But the actual results are spread through the review.

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/63863-amd-radeon-r9-290-4gb-review.html

Edit: never mind - not clock for clock. Sorry - missed that part of your post.


----------



## brazilianloser

Quote:


> Originally Posted by *HardwareDecoder*
> 
> ugh good info, i'm gonna keep that in a file i'm making on these cards, if I ever go xfire. Let me know how the micro stutter is please? that is why I got rid of my 7950 setup, that and the absurd mining prices on these.


If I notice anything odd I will let ya know. About to hit some Crysis, BF4 and AC3 since it's actually working now


----------



## HardwareDecoder

Quote:


> Originally Posted by *brazilianloser*
> 
> if I notice anything odd will let ya know. About to hit some crysis, bf4 and ac3 since it's actually working now


I'm jealous, won't have mine till monday and who knows when i'll get my block for it. Oh well glad you got it figured out finally, hope you have fun.

Back to studying for me









edit: meant to say absurd mining prices on 7950's but I think these have gotten inflated too.


----------



## Jaju123

I just came from SLI GTX 680s - I haven't noticed any stutter with my 290s to be honest, and the FPS is much more consistent in general (but I think I am CPU bottlenecked in many games, also with pcie 2.0 x8).


----------



## famich

Quote:


> Originally Posted by *Forceman*
> 
> Maybe the two BIOSes just have different voltages programmed. Or did you mean Afterburner and GPU Tweak?
> 
> I just checked mine, and I get 1.18V under load at default.
> 
> Edit: Interesting, looks like AB didn't apply the voltage increase at startup. It was 1.18V with +50 set, then 1.18V with +0 set, then when I reset it to +50, it went to 1.22V.


Yes, that's what I meant. My card's default voltage is 1.15V. Interestingly, it is stable in AB at +100mV @ 1200MHz in all games (Crysis 3, FC3, you name it),
but for COD: Ghosts I had to up the voltage a notch, e.g. 1.28 under load. The ASUS BIOS seems better for this game, more stable for me: 1200MHz @ 1.39 or so..

EDIT: these cards are already pushed to their limits: nice, powerful chips, but Kepler overclocks way better.


----------



## brazilianloser

Man, going from pulling 500ish watts at stock with two cards (according to my friendly Kill-A-Watt) to 700ish with a slight OC to 1100 and a 50mV increase is crazy. Lol


----------



## Newbie2009

These cards are so beastly, no regrets from xfire 7970s to these. I'm quite surprised at the jump and I've not even pushed them yet. Min fps tombraider maxed @ 1600p is about 58fps lol


----------



## chiknnwatrmln

Quote:


> Originally Posted by *brazilianloser*
> 
> Man going from pulling according to my friendly kill-a-wat 500ish at stock with two cards to pulling 700ish by a slight oc to 1100 with a 50mV increase is crazy. Lol


700 at the outlet, correct?

Mind doing +100mv and +50% power limit and letting me know how much it draws?

Getting a 2nd 290 eventually, but my UPS is only 865 watts... I'm hoping that's enough.

I pull ~540w total at max OC, 1.28v actual and 80+ bronze PSU.


----------



## brazilianloser

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> 700 at the outlet, correct?
> 
> Mind doing +100mv and +50% power limit and letting me know how much it draws?
> 
> Getting a 2nd 290 eventually, but my UPS is only 865 watts... I'm hoping that's enough.
> 
> I pull ~540w total at max OC, 1.28v actual and 80+ bronze PSU.


I just did a 3DMark11 run with the core at 1100, +75mV, +50% Power Limit and fans at 65%, and it topped out at about 710w on the Kill-A-Watt I have. Anything above that and my cards seem to start to act up... but for today I've had enough headaches.

edit: yes, at the outlet. I've got a Kill-A-Watt plugged into the outlet that I picked up at the local Home Depot.

If only I could figure out why BF4 is only using one GPU while other games and benchmarks on my system use both properly, it would make my day...


----------



## famich

Is it possible to run Crossfire with 290x+ 290 combination ?


----------



## chiknnwatrmln

It should work fine, but the 290x will match the 290's speed.


----------



## famich

Thanks


----------



## psyside

No problem buddy, thanks anyway








Quote:


> Originally Posted by *Forceman*
> 
> Here's the summary:
> 
> 
> 
> But the actual results are spread through the review.
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/63863-amd-radeon-r9-290-4gb-review.html
> 
> Edit: never mind - not clock for clock. Sorry - missed that part of your post.


----------



## velocityx

so my second 290 is here

two things

I hope mantle will free my cpu from bottlenecking them. performance is very good in BF4, although I can see in the graph it can go higher. So, mantle where are you!

my gpu usage is very high, although on the first GPU the clock is throttling between 850 and 947. the second gpu isn't throttling. fan speeds 57%.

so, although crossfire on these is much much better than what I experienced with the 6970, you can still tell it's not as fluid as a single-card solution. most of the time it's okay and a non-issue you can't tell, but it depends on the map, and I feel maybe a cpu bottleneck is the issue, because in paracel storm there are almost no frame-time spikes, while in rogue transmission you can kinda tell it's not perfect, but like 95% perfect. Also, there's this weird thing with textures and draw distance when in CF.

with my 4.7ghz 8320 and 1440p ultra settings I'm averaging 90-120 fps depending on the map. which is good. hope the drivers will continue to improve the scaling and frame time. However that's with an empty servers, will test later and see how it goes in full 64 man serv.,

---

the thing that worries me a lot is that both GPUs are hitting 94 C. And I have a Sound Blaster Z right in between them. I wonder if it's OK for the audio card to be there; I don't really have a choice, because when I put the sound card below the second card, the system doesn't recognize it.


----------



## psyside

Your CPU is the issue. Even Crossfire 290/290X will be way smoother than the 6970s.


----------



## Durquavian

Quote:


> Originally Posted by *velocityx*
> 
> so my second 290 is here
> 
> two things
> 
> I hope mantle will free my cpu from bottlenecking them. performance is very good in BF4, although I can see in the graph it can go higher. So, mantle where are you!
> 
> my gpu usage is very high, although first GPU, the clock is throttling down to 850 and up to 947. second gpu isn't throttling. fan speeds 57%.
> 
> so, although crossfire in these is much much better than what I experienced in 6970, you can still tell it's not as fluent as a single card solution. most of the time it's okay and a non issue you can't tell, but it depends on the map and I feel maybe cpu bottleneck is the issue, because in paracel storm, there is almost no FT spikes, in rogue transsmision, you can kinda tell it's not perfect but like 95 % perfect. Also, there's this weird thing with textures and draw distance when in CF.
> 
> with my 4.7ghz 8320 and 1440p ultra settings I'm averaging 90-120 fps depending on the map. which is good. hope the drivers will continue to improve the scaling and frame time.
> 
> ---
> 
> the thing that worries me a lot is that both gpu's are hitting 94 C. And I have a sound blaster Z right in between them. I wonder if it's ok for the audio card to be there, I don't really have a choice because when I put the sound card below the second card, the system doesnt recognize it.


with the fans so far back on the cards , as long as the sound card doesn't block them it shouldn't be a huge issue.


----------



## velocityx

Quote:


> Originally Posted by *Durquavian*
> 
> with the fans so far back on the cards , as long as the sound card doesn't block them it shouldn't be a huge issue.


I'm actually worried about my audio card, not my gpus ;d

yea i'm gonna wait for mantle to see how it goes. if mantle doesn't help with cpu bottleneck like they say it will, i will switch to intel


----------



## Sgt Bilko

Quote:


> Originally Posted by *velocityx*
> 
> I'm actually worried about my audio card, not my gpus ;d
> 
> yea i'm gonna wait for mantle to see how it goes. if mantle doesn't help with cpu bottleneck like they say it will, i will switch to intel


AMD seem very confident in it (I know it's their product and all). I mean, if they say an underclocked 8350 will run at the same speed as a 4770K, then I'm going to be really impressed.

December is going to be one interesting month


----------



## velocityx

hmm

I hear some coil whine. Dunno if it's the second GPU or the Seasonic-built XFX 850. It's audible even when the machine is switched off and disconnected.

will have to do some investigation on that.

new gpu is 77% asic quality, the two week old one is 69.


----------



## Jpmboy

Quote:


> Originally Posted by *velocityx*
> 
> hmm
> 
> I hear some coil whine. Donno if it's the second gpu or seasonic xfx 850. it's audible when machine is switched off and disconnected, shut off.
> 
> will have to do some investigation on that.
> 
> new gpu is 77% asic quality, the two week old one is 69.


An 850w psu may be cutting it close for those 2 cards... Any overvolt/overclock and you'll hit 800+ watts easy.


----------



## psyside

Yep, 850W is not enough; I never thought about that. If you OC, the PSU could be the issue; if you overvolt, 100% it is.


----------



## psyside

So after my obsession with VRM temps, i "discovered" something very interesting!

R9 290X VRM temps are way higher than the R9 290's!

_*290X*_



_*290*_



It could be the main reason why the 290 OCs better than the 290X.

Like a 15C difference!


----------



## Banedox

Quote:


> Originally Posted by *psyside*
> 
> So after my obsession with VRM temps, i "discovered" something very interesting!
> 
> R9 290X VRM temps are way higher then R9 290!
> 
> _*290X*_
> 
> 
> 
> _*290*_
> 
> 
> 
> It could be the main reason why 290 oc better then 290X.
> 
> Like 15c difference!


Also, one of the biggest things I noticed with my card (an XFX 290 that unlocked to a 290X) is that the voltage fluctuates all over the place, by as much as 250mV, which could be a big reason people are having trouble overclocking; my card runs at way below stock voltage. I was able to get 1150 core on stock voltage even though GPU Tweak was set to 1400mV (really 1250mV according to GPU-Z).

I think with the non-reference cards, better power delivery will determine whether these puppies can overclock... I just need to figure out how to get my voltage stable... btw, I'm using the Asus 290X BIOS...


----------



## psyside

Yes, that is strange. You should ask whether anyone else can trigger the issue; maybe the unlock causes the instability? Did you notice it at stock? Or unlocked but on a different BIOS?


----------



## Banedox

Quote:


> Originally Posted by *psyside*
> 
> Yes it is strange you should ask any other if the issue can be triggered, maybe the unlock make the instability? Did you notice it on stock? Or even unlocked but on different bios?


Yeah, the same thing happened with the stock 290 BIOS. I think it's just a thing with my card, not sure though; I really wish I knew how to test it properly. I'm gonna have to test some other BIOSes... could be an Asus thing, who knows as of now...


----------



## Newbie2009

The asus bios everyone is using, is it the stock asus bios or a modded one?


----------



## velocityx

Quote:


> Originally Posted by *psyside*
> 
> Your cpu is the issue. Even Crossfire 290/290X, will be way smoother then 6970.


I did a DDU in safe mode, reinstalled my 9.5 driver, no more cpu spikes, gameplay is smooth like butter in bf4. I was really upset with 6970 cf, but this, oh boy, so good.


----------



## ZombieJon

MSI R9 290

Last 290 the store had in stock. One person cleared them out earlier today.


----------



## Arizonian

Quote:


> Originally Posted by *velocityx*
> 
> so my second 290 is here
> 
> two things
> 
> I hope mantle will free my cpu from bottlenecking them. performance is very good in BF4, although I can see in the graph it can go higher. So, mantle where are you!
> 
> my gpu usage is very high, although first GPU, the clock is throttling down to 850 and up to 947. second gpu isn't throttling. fan speeds 57%.
> 
> so, although crossfire in these is much much better than what I experienced in 6970, you can still tell it's not as fluent as a single card solution. most of the time it's okay and a non issue you can't tell, but it depends on the map and I feel maybe cpu bottleneck is the issue, because in paracel storm, there is almost no FT spikes, in rogue transsmision, you can kinda tell it's not perfect but like 95 % perfect. Also, there's this weird thing with textures and draw distance when in CF.
> 
> with my 4.7ghz 8320 and 1440p ultra settings I'm averaging 90-120 fps depending on the map. which is good. hope the drivers will continue to improve the scaling and frame time. However that's with an empty servers, will test later and see how it goes in full 64 man serv.,
> ---
> the thing that worries me a lot is that both gpu's are hitting 94 C. And I have a sound blaster Z right in between them. I wonder if it's ok for the audio card to be there, I don't really have a choice because when I put the sound card below the second card, the system doesnt recognize it.
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









http://tinypic.com/view.php?pic=nwj6na&s=5#.UqH8lvRDvNk


----------



## iamhollywood5

Quote:


> Originally Posted by *Newbie2009*
> 
> The asus bios everyone is using, is it the stock asus bios or a modded one?


Most people are using the stock ASUS bios.


----------



## Arizonian

Quote:


> Originally Posted by *ZombieJon*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> MSI R9 290
> 
> Last 290 the store had in stock. One person cleared them out earlier today.


Congrats - added









Glad to hear you got yours before being cleared out.


----------



## binormalkilla

Quote:


> Originally Posted by *brazilianloser*
> 
> Just a heads up to anyone getting multiple cards and doing Eyefinity, make sure you got the same bios on all cards. Spent a few hours today beating my head against a wall because for some reason my eyefinity setup would not work no matter what I did until I just out of curiosity decided to check for an update for the new card that I got today from Asus and to my surprise the fact that my older one had the newer version of their bios and the new one didn't was stopping Eyefinity from working. Anyways if it helps anyone great if it doesn't oh well. Just glad I can finally give it a test to Eyefinity with two cards.


Interesting....I have 2 sapphire and 1 msi and I don't have problems with eyefinity setup in windows 8.1. I still plan on flashing the ASUS BIOS when I start OCing though.


----------



## Banedox

Quote:


> Originally Posted by *Newbie2009*
> 
> The asus bios everyone is using, is it the stock asus bios or a modded one?


I am using the stock ASUS one


----------



## TheRoot

oh


----------



## brazilianloser

Quote:


> Originally Posted by *binormalkilla*
> 
> Interesting....I have 2 sapphire and 1 msi and I don't have problems with eyefinity setup in windows 8.1. I still plan on flashing the ASUS BIOS when I start OCing though.


That is why I never thought that was the problem and only solved my issue by luck. But for me it did the trick: two hours moving around connections, cards, etc., then I updated the BIOS and BAM, it works without me lifting a finger.


----------



## brazilianloser

Feel free to add me up to the CF part of the club. My second 290 arrived yesterday but since it was a bit late I didn't take any pictures at the time...


----------



## Banedox

Well just loaded up the MSI 290X bios still getting the low voltage...


----------



## iamhollywood5

Quote:


> Originally Posted by *Banedox*
> 
> Well just loaded up the MSI 290X bios still getting the low voltage...


Have you tried the PT3 BIOS? You'll get way more voltage with that one.


----------



## Arizonian

Quote:


> Originally Posted by *brazilianloser*
> 
> Feel free to add me up to the CF part of the club. My second 290 arrived yesterday but since it was a bit late I didn't take any pictures at the time...
> 
> 
> Spoiler: Warning: Spoiler!


Right on it finally arrived. Congrats - updated


----------



## brazilianloser

Quote:


> Originally Posted by *Arizonian*
> 
> Right on it finally arrived. Congrats - updated


I know, right... Now to wait another infinity to get all my water parts...


----------



## rdr09

Anyone noticed that Afterburner makes the core and memory clocks jump? I let GPU-Z run without AB and then with AB; this was the result . . .



smooth for the first half, then it started jumping once AB was opened.


----------



## Jaju123

Quote:


> Originally Posted by *psyside*
> 
> Yep, 850 is not enough, never thought about that. If you oc, PSU could be the issue, if you over-volt 100%


850W is way more than enough for two 290s, surely.... A 700W PSU is 'recommended' for 290X Crossfire.


----------



## brazilianloser

Quote:


> Originally Posted by *Jaju123*
> 
> 850W is way more than enough for two 290s, surely.... A 700W PSU is 'recommended' for 290X Crossfire.


Yeah, I've got a 3770K OC'd to 4.5, two 290s, and after adding 75mV, +50% power limit, and OCing the core to 1100, I was only pulling 700W tops at the wall during stress testing. So unless you have a lot of junk in your system, an 850/860W unit would be just fine. Now, I am sure if I OC the core to 1200 and put in the full 100mV that AB allows, I'll be hitting the 800s.

Will post some pictures later maybe for those naysayers.


----------



## psyside

Quote:


> Originally Posted by *velocityx*
> 
> I did a DDU in safe mode, reinstalled my 9.5 driver, no more cpu spikes, gameplay is smooth like butter in bf4. I was really upset with 6970 cf, but this, oh boy, so good.


Great to know. I knew something was wrong when you said the 6970 was smoother; when I used one it was not smooth even with all the tweaks one can imagine. The 290/290X are way smoother than the 7xxx/6xxx generation.


----------



## binormalkilla

Quote:


> Originally Posted by *the9quad*
> 
> binorma my results are nearly identical btw for stock clocks at 4.4, glad you posted those, cuz i always worries mine are slow or something.


Good to hear. It's always nice to have a sanity check beyond the results section of futuremark's site. Going to try a few runs with a light OC now.


----------



## Jpmboy

Quote:


> Originally Posted by *Jaju123*
> 
> 850W is way more than enough for two 290s, surely.... A 700W PSU is 'recommended' for 290X Crossfire.


At all stock, sure... 700W for the GPUs is okay. The TDP for two in CFX is >500W at stock on the uber BIOS, right? If you run at 1200 and 1.412V, gear up.


----------



## Forceman

Quote:


> Originally Posted by *rdr09*
> 
> anyone noticed Afterburner makes the core and memory jump. I let GPUZ run without AB and with AB. this was the result . . .
> 
> 
> 
> smooth at the first half and started jumping with AB opened.


It's a known issue if you have voltage monitoring turned on in AB, so that could be the problem. Mine is completely stable after I disabled monitoring.


----------



## esqueue

I am currently running my WC'd 290X with the beta 9.2 drivers and have no complaints. I completed BF4's single player before deleting the game and only had 2 cases where the game crashed. The first was while it was loading another stage; the other is a glitch in the game where you have to restart the mission to reload. Many people have the exact same issue, so I blame the game.

The system is running great with no black screens; should I risk installing the beta 9.5 drivers, which could possibly cause issues I don't currently have? I really don't feel like messing with what works. The fact that the drivers are beta means they don't completely trust them either.


----------



## broken pixel

During the brief moments of benching I did before either my mobo or CPU bit the dust, I noticed the voltages would flux like mad but the clock speeds would stay constant. I also noticed the newest beta drivers did not have the OverDrive section in CCC.


----------



## psyside

Quote:


> Originally Posted by *Jaju123*
> 
> 850W is way more than enough for two 290s, surely.... A 700W PSU is 'recommended' for 290X Crossfire.


*Here is Guru3D's power supply recommendation:*

AMD Radeon R9 290 - On your average system the card requires you to have a 550~600 Watt power supply unit.
AMD Radeon R9 290 Crossfire - On your average system the cards require *you to have a 800 Watt power supply unit as minimum.*



Only 70W left, with only 1 HDD, no fans, open bench, etc.

You are hitting the limit without an OC on the cards. 1000W with an OC is a must; you don't want to push your PSU that hard.


----------



## Meatdohx

Question.

I have bought an Arctic Accelero Twin Turbo II for my R9 290.

I have seen a russian video on youtube of someone that did it.

Anyone ever heard of someone doing this?

I am pretty sure I can pull it off; I'll post pictures if no one has done it before.


----------



## theyedi

GPU usage is all over the place... stutters a lot in-game. any ideas as to what's causing this?



had crossfire disabled for the first half, tried turning it on and it made no difference. happens in every game/benchmark

using 13.11

I also disabled ULPS and the clock is holding at max, power is +50%


----------



## Raephen

Quote:


> Originally Posted by *rdr09*
> 
> anyone noticed Afterburner makes the core and memory jump. I let GPUZ run without AB and with AB. this was the result . . .
> 
> 
> 
> smooth at the first half and started jumping with AB opened.


This isn't half as bad as the issue I've noticed when running HWInfo64.

The program takes its time loading up (it seems to take a while reading my single 290's sensors), and once loaded, it reads GPU usage at a full 100% all the time.

GPU-Z, on the other hand, doesn't. It just shows the card locked at max clock - and my watt-meter at the wall reads an increase of 30-40 W in power draw.

And the kicker is: even if I close HWInfo, the GPU clock stays maxed (1000 in my case). I need to reboot to get PowerTune, or whatever is going haywire, to do its thing again.

I think I'll check if HWInfo has had an update since I last installed it


----------



## Raephen

Quote:


> Originally Posted by *Meatdohx*
> 
> Question.
> 
> I have bought an Artic accelero twin turbo ii for my r9 290.
> 
> I have seen a russian video on youtube of someone that did it.
> 
> Anyone ever heard of someone doing this?
> 
> I am pretty sure i can pull it off i'll send pictures if no one ever done it before.


I think it should be doable. I haven't heard of anyone doing it in this thread, but there are a few members here with Gelid's Icy Vision Rev. 2, and they say that cooler does the job.

Seeing as both coolers are about equal, there's no reason the Arctic ATT2 would perform any worse.

Just make sure you have enough heatsinks for your memory and VRMs. Oh, and be sure at least one or two of the memory heatsinks are low profile - otherwise they interfere with the mounting.

Good luck:thumb:


----------



## MlNDSTORM

Decided to go with an R9 290X over the 780, since I will keep my GTX 670... but my patience with these non-reference models is running low... really low.


----------



## brazilianloser

Now that my second 290 has arrived it was time to clean up my setup and organize my little area at least until January when I will start putting in the water loop.


----------



## esqueue

Should I install beta 9.5 drivers if 9.2 drivers are running with no issues?


----------



## chiknnwatrmln

About to pull the trigger on a new case/psu, then in a week or two about $600 worth of watercooling parts....

I'm going for 5GHz CPU and 1220 MHz core on the GPU.


----------



## quakermaas

Quote:


> Originally Posted by *psyside*
> 
> *Here is Guru3D's power supply recommendation:*
> 
> AMD Radeon R9 290 - On your average system the card requires you to have a 550~600 Watt power supply unit.
> AMD Radeon R9 290 Crossfire - On your average system the cards require *you to have a 800 Watt power supply unit as minimum.*
> 
> 
> 
> Only 70W left, with only 1 HDD, no fans open bench ect.
> 
> You are hitting the limit, without oc on the cards. 1000W with oc is a must, you don't want to run push your psu so hard.


A good quality 850W PSU will deliver 850W+ to the system. To supply that 850W, it needs to draw around 1000W from the wall due to efficiency losses.

I had my 850W PSU running a 3930K @ 4.9 and 290s in CF @ 1150/1450 for a few benchmark runs today, plus 10 fans and 2 pumps, and I was peaking at 1000W from the wall; at 85% efficiency, that is 850 watts delivered to the system.

Yes, it is cutting it fine, but it's not a major problem as long as I don't push any further until I upgrade the PSU. In games I draw around 600-700W from the wall with the 3930K @ 4.5 and 290 CF @ 1100/1450.
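As a back-of-envelope check on the wall-draw vs. delivered-power arithmetic above (the 85% efficiency figure and the 1000W reading come from this post; the helper names are just for illustration):

```python
# Rough PSU sizing sketch for a CrossFire 290 rig.
# Assumptions (taken from the post above, not official specs):
#   - wall reading comes from a Kill-A-Watt
#   - PSU is ~85% efficient at this load
#   - DC power delivered to components = wall draw * efficiency

def dc_load(wall_watts, efficiency=0.85):
    """Power actually delivered to the components for a given wall reading."""
    return wall_watts * efficiency

def wall_draw(dc_watts, efficiency=0.85):
    """Wall draw needed for the PSU to deliver dc_watts to the system."""
    return dc_watts / efficiency

# Peak benchmark reading reported above: 1000 W at the wall.
print(dc_load(1000))          # 850.0 -> an 850 W unit is right at its rating

# Conversely, a fully loaded 850 W PSU pulls about this from the wall:
print(round(wall_draw(850)))  # 1000
```

The same arithmetic shows why an overvolted CF setup can quietly push an 850W unit to the edge of its rating even when the wall meter still reads "only" 1000W.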


----------



## Arizonian

A few days late, but the OP has been updated with *13.11 Beta 9.5 driver info* and a link. Driver links since R9 290X & 290 support began are being kept there for history and easy tracking, if you'd like to see the progression.


----------



## rdr09

Quote:


> Originally Posted by *Forceman*
> 
> It's a known issue if you have voltage monitoring turned on in AB, so that could be the problem. Mine is completely stable after I disabled monitoring.


thanks. that was it.
Quote:


> Originally Posted by *Raephen*
> 
> This isn't half as bad as the issue I've noticed when running HWInfo64.
> 
> The program takes it's time loading up (seems to take it's time reading my single 290's sensors) and when loaded, it reads gpu use a full 100% all the time.
> 
> GPU-Z, on the other hand doesn't. It just reads the card locked at max clock - and my watt-meter at the wall reads an increase of 30-40 W power draw.
> 
> And the kicker is: even if I closeHWInfo, the GPU clock stays maxed (1000 in my case). I need to reboot to get Powertune or whatever is going haywire to do it's thing again.
> 
> I think I'll check if HWInfo has an update out since I last installed it


As Forceman said, it was voltage monitoring enabled in AB. I have the latest HWInfo64 version and I don't get any readings at all for the GPU; I just have to rely on GPU-Z.


----------



## ImJJames

Love Newegg; they let me use the 10% coupon code they're promoting today on the R9 290 I purchased on Monday







Basically bought my r9 290 for $360


----------



## brazilianloser

Here are my two cents yet again on the power consumption subject.

Overview of my system: Gigabyte GA-Z77X-UD3H mobo, 8GB RAM @ 1600, 3770K @ 4.5 (without changing the voltage), H100 cooler, 6 fans, 1 WD Black 1TB HDD, 1 Kingston SSD, and finally 2 Asus 290s with reference cooling.

The cards @ 1150/1250, +100mV, +50% Power Limit peaked at about 750W, give or take 10W.



Hope this helps ease some of your minds, at least about the 290 non-X


----------



## iamhollywood5

Quote:


> Originally Posted by *ImJJames*
> 
> Love newegg they let me use the 10% coupon code they are promoting today on my r9 290 that I purchased on Monday
> 
> 
> 
> 
> 
> 
> 
> Basically bought my r9 290 for $360


How are you using a 10% coupon for a purchase you made on Monday?


----------



## MrWhiteRX7

There's a 10% coupon?


----------



## ebduncan

Cannot seem to find any 290Xs for sale :-(

I wanted to buy one.


----------



## Jpmboy

Quote:


> Originally Posted by *quakermaas*
> 
> A good quality 850w PSU, will produce 850w+ for the system. To make that 850w, it will need to draw around 1000w from the wall due to efficiency.
> I had my 850w PSU running a 3930k @ 4.9 and 290s CF @ 1150/1450 for a few benchmark runs today, + 10 fans and 2 pumps, I was peaking at 1000w from the wall, at 85% efficiency that is 850 watts for the system.
> Yes, it is cutting it fine, but not a major problem as long as I don't push any more until I upgrade the PSU. In games I draw from the wall around 600w-700w with a 3930k @ 4.5 and 290 CF @ 1100/1450


To compare to the Guru3D Titan measurements ("parkbench"):

Killawatt power measurements (at the PSU plug)
3930K @5.0(1.523V)
2xTitans SLI (svl7v3 bios, softvoltmod, LLC=0)
Bios = 220W
Boot = 500W peak (?)
Idle = ~ 160-170 watts to the rig
Browser = ~300W
Super Pi = 340W
p95 (8G ram) = 600W (597+/-)
3Dmk11 @ 875/3005 1.16V = 800-900W
3DMk11 @1215/3602 1.3V = 1190-1220W !!!
Valley @ 1215/3602 1.3V = 950-1050W (1080P ExHD) (at 1306/3728 = 1200W)
Firestrike @ 1215/3602 @ 1.3V = 1050-1130W (default)

the Guru numbers are at stock (they actually add up to component TDPs)

I use 2 PSUs on that rig.


----------



## ImJJames

Quote:


> Originally Posted by *iamhollywood5*
> 
> How are you using a 10% coupon for a purchase you made on Monday?


Because I asked nicely?
Quote:


> Originally Posted by *MrWhiteRX7*
> 
> There's a 10% coupon?


Yes, NAFSAVETENDEC6W


----------



## chiknnwatrmln

Quote:


> Originally Posted by *ImJJames*
> 
> Love newegg they let me use the 10% coupon code they are promoting today on my r9 290 that I purchased on Monday
> 
> 
> 
> 
> 
> 
> 
> Basically bought my r9 290 for $360


The discount maxes at $20, was your 290 $380 or $400 originally?


----------



## ImJJames

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> The discount maxes at $20, was your 290 $380 or $400 originally?


It was $430 total including taxes and she reimbursed me $40 out of that.


----------



## Newbie2009

I have been playing with overclocking on single 290x, (unlocked 290)

Results are from 13.11 whql, sapphire 290x bios, Afterburner.



http://www.3dmark.com/fs/1261638

Firestrike Extreme score 5310, cpu @ 4.5ghz.

http://www.3dmark.com/fs/1261619

Firestrike score 10281, cpu @4.5ghz

I know the newer betas give a few hundred more points in this bench but had bad experience with 9.4.

I seem to be hitting a TDP wall @ +50mv. I can put more volts into 1125 core and score will go down.

Does the asus bios give more than another 50% TDP limit?


----------



## Forceman

No, same power limit. Only the PT BIOSes have higher power limits.


----------



## Newbie2009

Quote:


> Originally Posted by *Forceman*
> 
> No, same power limit. Only the PT BIOSes have higher power limits.


Thought so. It's odd that I'm only loading about 1.22V at my clocks. Don't see how I'm hitting a wall.


----------



## X-oiL

Quote:


> Originally Posted by *theyedi*
> 
> GPU usage is all over the place... stutters a lot in-game. any ideas as to what's causing this?
> 
> 
> 
> had crossfire disabled for the first half, tried turning it on and it made no difference. happens in every game/benchmark
> 
> using 13.11
> 
> I also disabled ULPS and the clock is holding at max, power is +50%


I'm having same issues as you and can't seem to find a solution


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *brazilianloser*
> 
> Here is my two cents yet again on the power consumption subject.
> 
> Overview of my system: Gigabyte GA-Z77X-UD3H Mobo, 8GB Ram 1600, 3770k @ 4.5 (without changing the voltage), H100 Cooler, 6 Fans, 1 HDD WD Black 1TB, 1 Kingston SSD, and finally 2 Asus 290 reference cooling.
> 
> The cards @ 1150/1250, +100mV, +50% Power Limit, peaked at about 750w give it or take 10w.
> 
> 
> 
> Hope this help ease some of you guys minds at least about the 290 non X


Thanks for that. Very hungry beasties


----------



## Raephen

Quote:


> Originally Posted by *rdr09*
> 
> as Forceman said, it was voltage monitor enabled in AB. i have the latest Hwinfo64 version and i don't get any readings at all for the gpu. just have to rely on gpuz.


Aye, I use GPU-Z too. I wanted to use HWInfo for keeping track of my cpu use. I'll have to find something else, I guess.

And btw: I have no issues with AB either,

And I just had an eureka moment: I can use AB to monitor cpu use







Why didn't I think of that before, lol!

I just want to guesstimate how much of the system draw is due to the GPU alone.

Valley, Extreme HD, draws 294 W at the wall.

Furmark, good for cooking your card I know, draws 432 W at the wall.

Both show max GPU clocks (1000/1250 in my case) and a full 100% usage. Valley looks lowish for a system with a 290, but it is what it is.

So I'm curious about the difference. I want to figure out what could be causing the disparity...
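For guesstimates like this, one rough approach is to subtract an idle wall reading and scale by PSU efficiency. A sketch of that, where the idle figure and the 90% efficiency are placeholder assumptions (only the 294 W Valley and 432 W Furmark readings come from this post):

```python
# Back-of-envelope estimate of GPU-only power from wall readings.
# Only the 294 W (Valley) and 432 W (Furmark) wall readings come from
# the post above; the idle baseline and efficiency are illustrative.

EFFICIENCY = 0.90   # assumed PSU efficiency at these loads
IDLE_WALL = 100.0   # hypothetical idle wall draw, in watts

def gpu_estimate(load_wall, idle_wall=IDLE_WALL, eff=EFFICIENCY):
    """Approximate DC watts attributable to the GPU under load."""
    return (load_wall - idle_wall) * eff

valley = gpu_estimate(294)   # ~175 W
furmark = gpu_estimate(432)  # ~299 W

# A likely source of the disparity: Furmark is a synthetic worst-case
# load built to maximize power draw, so it stresses the card far harder
# than a real renderer like Valley does.
print(round(furmark - valley))  # 124
```

The exact numbers shift with whatever idle baseline and efficiency you plug in, but the method gives a sanity check on how much of the wall draw belongs to the card.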


----------



## Newbie2009

Quote:


> Originally Posted by *X-oiL*
> 
> I'm having same issues as you and can't seem to find a solution


Vsync is probably the culprit; try turning it off. Vsync is broken for me.


----------



## ImJJames

Quote:


> Originally Posted by *Raephen*
> 
> Aye, I use GPU-Z too. I wanted to use HWInfo for keeping track of my cpu use. I'll have to find something else, I guess.
> 
> And btw: I have no issues with AB either,
> 
> And I just had an eureka moment: I can use AB to monitor cpu use
> 
> 
> 
> 
> 
> 
> 
> Why didn't I think of that before, lol!
> 
> I just want to guestimate how much of the system draw would be only due to the GPU.
> 
> Valley, Extreme HD, draws 294 W at the wall.
> 
> Furmark, good for cooking your card I know, draws 432 W at the wall.
> 
> Both have max GPU clocks (1000 / 1250 in my case) and a full 100%. Valley looks lowish for a system with a 290, but it is what it is.
> 
> So I'm curious about the difference. I want to figure out what could be causing the disparity...


432 Watts on Furmark??? Stock volts??


----------



## HOMECINEMA-PC

Startin' to get somewhere with me clocks.......











I love you AB 300 beta 17..........


----------



## Raephen

Quote:


> Originally Posted by *ImJJames*
> 
> 432 Watts on Furmark??? Stock volts??


Total system draw, not just the GPU. And measured at the wall, so take roughly 90%: about 388 W actually used by the system.

And yes, stock volts for the GPU. The Ivy i5 is actually undervolted, at 1.27 V (it had a high VID to begin with).

Add to that one SSD, one HDD, a bunch of fans and my MCP35X pump chugging away at sub-2000 rpm.

I do agree the GPU is the most power hungry part in my setup, and to be honest: use Furmark at your own risk.

VRM1 hit 85 C in the 15 min burn-in test. Under water! True, it's with an Aquacomputer block, and I might just also get some Fujipoly Extreme padding for use with this block. Already sent a mail to AC asking which thickness they use for the pads on the VRMs.
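The wall-reading-to-system-draw conversion in the posts above is just a multiplication by PSU efficiency. A quick sketch; the 90% figure is an assumption (roughly right for a Gold-rated unit at these loads), so check your own PSU's efficiency curve:

```python
# Estimate power actually delivered to components from a wall-socket
# (kill-a-watt style) reading. Efficiency of 0.90 is an assumed figure,
# not a spec -- real PSUs vary with load.

def system_draw(wall_watts: float, efficiency: float = 0.90) -> float:
    """Watts delivered to the system, given the reading at the wall."""
    return wall_watts * efficiency

valley_wall = 294.0   # W at the wall during Valley Extreme HD
furmark_wall = 432.0  # W at the wall during Furmark

print(round(system_draw(valley_wall)))   # 265
print(round(system_draw(furmark_wall)))  # 389
```

So the 432 W Furmark reading works out to roughly 389 W actually consumed by the system, matching the ~388 W estimate in the post.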


----------



## Jack Mac

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Startin to get some where with me clocks.......
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I love you AB 300 beta 17..........


Nice, what's your ASIC?
Edit:


Spoiler: Warning: Spoiler!



I can do 1200 core (maybe a bit more) with +100mV core / +15mV auxiliary and 1400 memory, and I have a 79.8% ASIC.


----------



## Jpmboy

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Startin to get some where with me clocks.......
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I love you AB 300 beta 17..........


what happened here:


----------



## VSG

Ya I was wondering the same thing.

In other news, my 2nd Sapphire 290X has arrived to replace the Asus one I had RMA'd. The box it came in, despite it being a BF4 edition, was the regular Sapphire 290X box and it just had the AMD BF4 card thrown in as an afterthought lol.

Arizonian: Could you please update me to 2x Sapphire 290X? Tomorrow I will re-test these two cards before putting the EK blocks on (again). Now if only my Aquaero 6 XT and the FrozenQ reservoir would get here


----------



## Newbie2009

Fixed my TDP limit; AB wasn't setting PowerTune properly.

AUX is at default, 3770K @ 4.5GHz. Gaming-stable clocks, no artifacts.

Hopefully Trixx will have higher limits.


----------



## Jaju123

Quote:


> Originally Posted by *X-oiL*
> 
> I'm having same issues as you and can't seem to find a solution


Me too....


----------



## Newbie2009

Quote:


> Originally Posted by *Jaju123*
> 
> Me too....


Try turning off vsync. Audio crackling too?


----------



## broken pixel

Quote:


> Originally Posted by *Newbie2009*
> 
> Vsync is probably the culprit, try turning it off, vsync broken for me.


Turn off frame pacing in CCC. It's enabled by default and does not work with vsync.


----------



## Jaju123

Does anyone else have horrible throttling when playing games with 290s in crossfire? With my stock sapphire 290s with 47% fan, when playing bf4 the performance becomes noticeably worse over time and the 2nd card drops to 650mhz or so for prolonged periods.

Sent from my Nexus 7 using Tapatalk 4


----------



## sugarhell

Increase the fan speed?


----------



## geoxile

Quote:


> Originally Posted by *Jaju123*
> 
> Does anyone else have horrible throttling when playing games with 290s in crossfire? With my stock sapphire 290s with 47% fan, when playing bf4 the performance becomes noticeably worse over time and the 2nd card drops to 650mhz or so for prolonged periods.
> 
> Sent from my Nexus 7 using Tapatalk 4


Sounds like they're throttling because they're getting too hot. Even a single 290X will throttle over time, even with higher fan speeds, so 290s in CrossFire will definitely throttle.

On that note, are there any retailers selling 290s at MSRP? Everywhere I look it's $450.


----------



## Tobiman

Good to see that the black screen issue isn't as bad as it was at launch. I should be getting a non-reference 290X as soon as they are available. My 780 Classy is stuttering in some games, which sucks.


----------



## leo82

Quick question: I have 4 Sapphire R9 290s. I want to separate them to keep them cooler; can I hook all of them up to non-powered riser cables? Can I just connect my power supply to all 4 of my graphics cards, and would all of the power be drawn from my power supply only?

I have a Hercules 1600w power supply.


----------



## Tobiman

Quote:


> Originally Posted by *leo82*
> 
> Quick question , I have 4 sapphire r290's. I want to separate them to keep them cooler, can I hook all of them up to non powered riser cables? Can I just connect my power supply to all 4 of my graphics cards and would all of the power be drawn from my power supply only??
> 
> I have a Hercules 1600w power supply.


EDIT: I didn't read that thoroughly before replying but I think it should be possible. You do run the risk of burning your GPU though. I'd use powered risers just to be safe. After doing a quick search, it seems to be the more reliable choice.


----------



## DeadlyDNA

Quote:


> Originally Posted by *leo82*
> 
> Quick question , I have 4 sapphire r290's. I want to separate them to keep them cooler, can I hook all of them up to non powered riser cables? Can I just connect my power supply to all 4 of my graphics cards and would all of the power be drawn from my power supply only??
> 
> I have a Hercules 1600w power supply.


I was considering an idea like this, but in my short research I didn't see any risers that are PCIe 3.0...

Since CrossFire no longer requires bridges, I would think you could get two riser cables and mount the cards away from each other. Quad CF on stock cooling is a nightmare, though; I had issues with just 3 spaced out. I think your real choice would be water cooling, maybe just 2 of the cards if you're trying to save cash. Or water them all.


----------



## broken pixel

Quote:


> Originally Posted by *leo82*
> 
> Quick question , I have 4 sapphire r290's. I want to separate them to keep them cooler, can I hook all of them up to non powered riser cables? Can I just connect my power supply to all 4 of my graphics cards and would all of the power be drawn from my power supply only??
> 
> I have a Hercules 1600w power supply.


You need to use powered risers with that amount of GPUs. I used to run 5 7950s in a miner and they needed powered risers.


----------



## ImJJames

Add me please









http://www.techpowerup.com/gpuz/h2g95/

Powercolor r9 290

Reference Cooler

Unfortunately it's not unlockable :/


----------



## Emmett

Hmm, 2 separate loops, one for each card, each with its own pump and a 360 rad.
Both using distilled water with a kill coil.

Anyone seen this before?

This is after only 7-9 days of setting up this cards loop.







The brown is where the acrylic lies on the nickel; then there is galvanic-looking corrosion on the ends of the fins over the core
(that is where I have the inlet). Can't believe it happened so fast...

The other loop looks fine and clean, and has been running much longer.

No aluminum in the loops, but I have copper/brass/nickel/silver and a kill coil.


----------



## Mirob0t

Hey guys, I'm new to this forum and I'd like to ask you pros some questions.

I bought a Sapphire R9 290 BF4 edition a few days ago.
Since the stock cooler is so bad and the GPU gets HOT (94C), I replaced it with a Gelid Icy Vision rev2.
My temps are really good compared to the stock cooler (54C max in BF4 @ 100% usage).
But here's the thing: I'm not getting the performance I expected in Battlefield 4.
It's almost the same as my previous GPU (Sapphire 7970 GHz Edition).
So I'm wondering, am I doing something wrong? Have I made a bad choice getting this card? If anyone is willing to help, please share.









here are my specs

i5 2500K @ 4.2 GHz
8 GB of RAM
Win 8.1
Asus Maximus VI Gene-Z
Samsung 840 EVO SSD 120 GB

thanks for reading,


----------



## ImJJames

Did a few runs with my new R9 290; looks like I may have a good overclocking card. So far I ran Heaven @ 1175 core without touching voltage or memory, and it passed @ 60 FPS. I'll be pushing this card to the limit all night.


----------



## Arizonian

Quote:


> Originally Posted by *ImJJames*
> 
> Add me please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/h2g95/
> 
> Powercolor r9 290
> 
> Reference Cooler
> 
> Unfortunately its not unlock-able :/


Congrats - added


----------



## psikeiro

Quote:


> Originally Posted by *Tobiman*
> 
> Good to see that the black screen issue isn't as bad as during induction. I should be getting a 290X non-reference as soon as they are available. My 780 classy is stuttering in some games which sucks.


Going for 2 myself. Can't stand trying to play BF4 on Ultra with 4x MSAA on triple 1080p and running out of VRAM on 2 780s.


----------



## grunion

FS with FP on/off


----------



## broken pixel

Quote:


> Originally Posted by *grunion*
> 
> FS with FP on/off


FP is borked.


----------



## the9quad

Quote:


> Originally Posted by *broken pixel*
> 
> FP is borked.


I don't understand: how is it borked? It seems to work great for me, no stutter. I didn't expect to see the exact same score with FP on/off; that would be ridiculous to think. Not trying to be argumentative or a smart aleck, btw, but is it borked in some other way?

It does make me wish I had turned it off when I ran 3DMark with the WHQL drivers to get a better spot in the HOF, but I can't be bothered now, because I went back to the betas due to flickering textures in BF4.


----------



## ImJJames

Just got this card one hour ago and here is what I got so far









*1200 Clock/1500Mem Artifact Free







Max temp 70C*


----------



## the9quad

Quote:


> Originally Posted by *ImJJames*
> 
> Just got this card one hour ago and here is what I got so far
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *1200 Clock/1500Mem Artifact Free
> 
> 
> 
> 
> 
> 
> 
> Max temp 70C*


Nice OC dude, I can never get close to that, you got a keeper!


----------



## VSG

By FP, are you guys referring to firestrike physics?


----------



## rdr09

Quote:


> Originally Posted by *Mirob0t*
> 
> Hey guys, im new to this forum and i like to ask you pro's some questons
> 
> I just bought few days ago Sapphire r9 290 Bf4 edition,
> Since the cooler is so bad and the gpu gets HOT ( 94c) i changed the cooler and replaced it with Gelid Icy rev2
> my temps are realy good compare to the stock cooler, ( 54 max on bf4 @100 Usage)
> so waht happend next is... i am not getting the preformance i expected to get in battlefield 4,
> its kinda almost the same as my previous Gpu ( Sapphire 7970 ghz edition),
> so im wondering, am i doing something wrong.?... have i made a bad choice to get this card? so if anyone is willing to help me.. plz share
> 
> 
> 
> 
> 
> 
> 
> 
> 
> here are my specs
> 
> i5 2500K @4.2 ghz
> 8 gig of ram
> Win 8.1
> Asus maximus VI gene-z
> Samsung 840 EVO SSD 120 Gig
> 
> thanks for reading,


OC the CPU a bit more, like 4.5 GHz. I played BF4 earlier with HT off on my i7 and it looks great. I'm running Win7, though; I've heard Win8 works better. What driver are you using?


----------



## broken pixel

Quote:


> Originally Posted by *the9quad*
> 
> I dont understand, how is it borked? Seems to work great to me, no stutter. I didn't expect to see the exact same score with FP on/off, that would be ridiculous to think that. Not trying to be argumentative or a smart alec, btw is it borked in another way?
> 
> It does make me wish I would have turned it off when I ran 3dmark with the WHQL drivers to get a better spot in the HOF, but I can't be bothered now, because I went back to the beta's due to flickering textures in BF4


With frame pacing off, the score is higher. Try using FP with vsync on and playing BF4 or another game: it will be choppy! Since I run 1440p @ 75-120Hz I use vsync, and whenever I enable both, games stutter.


----------



## broken pixel

Quote:


> Originally Posted by *geggeg*
> 
> By FP, are you guys referring to firestrike physics?


Frame pacing for crossfire configurations.


----------



## RAFFY

Quote:


> Originally Posted by *broken pixel*
> 
> Frame pacing for crossfire configurations.


Has anyone created a tweak guide for crossfire 290/290x yet?


----------



## Mirob0t

Hmm, isn't 4.2 GHz enough?
I'm using the latest driver (AMD_Catalyst_13.11_BetaV9.5).


----------



## ImJJames

Quote:


> Originally Posted by *the9quad*
> 
> Nice OC dude, I can never get close to that, you got a keeper!


Thanks! Here's my Valley run, with the stock R9 290 BIOS at 1225/1500.


----------



## the9quad

Quote:


> Originally Posted by *broken pixel*
> 
> with frame pacing off the score is higher. Try using FP with vsync on and play BF4 or another game. It will be choppy! Since I run 1440p @ 75-120Hz I use vsync and when ever I enable both games stutter.


Have you tried just limiting the framerate instead of vsync? Vsync always seemed to make things sluggish to me, so I would never use it; instead I would limit the frame rate to where I wouldn't get tearing. I don't do either anymore at 110 or 120Hz @ 1440p; I never see stuttering or tearing in BF4 with frame pacing on, vsync off, and no frame limiter. I am thinking of limiting frames to 125 though, just to keep the cards cooler.


----------



## RAFFY

Quote:


> Originally Posted by *the9quad*
> 
> Have you tried just limiting the framerate instead of vsync? It seems like vsync always made things sluggish to me, so Iwould never use it and instead limit the frame rate to where I wouldn't get tearing instead. I don't do either anymore with 110 or 120hz @ 1440p, I never see stuttering or tearing in bf4 with frame pacing on, V-sync Off, and No frame limiter. I am thinking of limiting frames to 125 though, just to keep the cards cooler.


Are you talking about limiting the frames within BF4? If so, what is the command?


----------



## MrWhiteRX7

So is it me, or did Newegg change the price of their 290s? I see they all come with BF4 now; what in the hell. I'm trying to buy another one right meow, but I'm not paying $470.00 for a 290 :/


----------



## Sgt Bilko

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> So is it me or did newegg change the price of their 290's? I see they all come with bf4 now what in the hell. I'm trying to buy another one right meow but I'm not paying 470.00 for a 290 :/


But But.......$470 is cheap.....I think the XFX 290's on sale here were $480


----------



## brazilianloser

Glad that I got mine then. Even though it had signs of being an open box. At least it works just as well as my other one.


----------



## quakermaas

Quote:


> Originally Posted by *RAFFY*
> 
> Are you talking about limiting the frames with in BF4? If so what is the command?


I did it with the BF4 Settings Editor

http://battlefield.realmware.co.uk/bf4-settings-editor/

Set my max FPS to 65 (same as my monitor), which helps with heat/power draw/fan noise.

I set it to always display the FPS as well, so I can adjust settings and easily check that my frames are not dropping below 65.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Sgt Bilko*
> 
> But But.......$470 is cheap.....I think the XFX 290's on sale here were $480










It's still worth it, but I paid $405.00 shipped for my first one lol. I think they just threw in the BF4 junk and used that to jack up the price for the holidays. No buenos!


----------



## iamhollywood5

Really happy I got my unlockable 290 with BF4 for just $399 shipped


----------



## the9quad

Quote:


> Originally Posted by *RAFFY*
> 
> Are you talking about limiting the frames with in BF4? If so what is the command?


GameTime.MaxVariableFps 60

or 100 or 45 or whatever you want them to be.

The best thing to do is just add a user.cfg file with that line in it, so it runs every time.

Here is a guide if you don't know how to make one or where to put it:

http://diaryofdennis.com/2013/11/04/how-to-load-bf-4-console-commands-on-startup-with-a-config-file/
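The user.cfg approach above boils down to dropping a one-line text file in the BF4 install directory. A minimal sketch; the directory path here is illustrative, so point it at your actual install folder:

```python
# Write a one-line user.cfg so the frame cap is applied on every launch.
# The bf4_dir below is a placeholder -- substitute your real Battlefield 4
# install directory (e.g. under Origin Games).
from pathlib import Path

bf4_dir = Path(".")  # e.g. Path(r"C:\Program Files (x86)\Origin Games\Battlefield 4")
cap = 60             # or 100, 45, whatever you want

(bf4_dir / "user.cfg").write_text(f"GameTime.MaxVariableFps {cap}\n")
print((bf4_dir / "user.cfg").read_text().strip())  # GameTime.MaxVariableFps 60
```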


----------



## psyside

Quote:


> Originally Posted by *quakermaas*
> 
> A good quality 850w PSU, will produce 850w+ for the system. To make that 850w, it will need to draw around 1000w from the wall due to efficiency.
> 
> I had my 850w PSU running a 3930k @ 4.9 and 290s CF @ 1150/1450 for a few benchmark runs today, + 10 fans and 2 pumps, I was peaking at 1000w from the wall, at 85% efficiency that is 850 watts for the system.
> 
> Yes, it is cutting it fine, but not a major problem as long as I don't push any more until I upgrade the PSU. In games I draw from the wall around 600w-700w with a 3930k @ 4.5 and 290 CF @ 1100/1450


Like I said, you're pushing it to the limits. PSUs operate best around 50% load, and for sustained high draw you should stay at roughly 80% of rated capacity max, always keeping ~20% headroom. That's why, if you pull ~800 W, you need a 1000 W unit IMHO.
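The sizing rule above works out as simple division. A sketch, where the 0.80 cutoff is the poster's rule of thumb rather than any hard specification:

```python
# PSU sizing rule of thumb: keep sustained system draw under ~80% of
# the PSU's rating, i.e. preserve ~20% headroom. The 0.80 fraction is
# a forum heuristic, not a standard.

def min_psu_rating(system_watts: float, max_load_fraction: float = 0.80) -> float:
    """Smallest PSU rating that keeps system_watts within the load fraction."""
    return system_watts / max_load_fraction

print(min_psu_rating(800))  # 1000.0 -> an ~800 W system wants a 1000 W unit
```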


----------



## RAFFY

Quote:


> Originally Posted by *quakermaas*
> 
> I done it with BF 4 editor
> 
> http://battlefield.realmware.co.uk/bf4-settings-editor/
> 
> Set my max FPS to 65 (same as monitor), so helps with heat/power draw/fan noise.
> I set it to always display the FPS as well, so I can adjust setting and easily check my frames are not dropping below 65 .


Thanks +rep
Quote:


> Originally Posted by *the9quad*
> 
> GameTime.MaxVariableFps 60
> 
> or 100 or 45 or whatever you want them to be.
> 
> best thing to do is just add a user.cfg file with that line in it, so it runs every time.
> 
> here is a guide if you dont know how to make one or where to put it:
> 
> http://diaryofdennis.com/2013/11/04/how-to-load-bf-4-console-commands-on-startup-with-a-config-file/


Thanks, I've always loved how you can create and edit config files to boost up game performance. I'll have to tweak this after I get my third 290x installed


----------



## tsm106

Quote:


> Originally Posted by *the9quad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *broken pixel*
> 
> FP is borked.
> 
> 
> 
> I dont understand, how is it borked? Seems to work great to me, no stutter. I didn't expect to see the exact same score with FP on/off, that would be ridiculous to think that. Not trying to be argumentative or a smart alec, btw is it borked in another way?
> 
> It does make me wish I would have turned it off when I ran 3dmark with the WHQL drivers to get a better spot in the HOF, but I can't be bothered now, because I went back to the beta's due to flickering textures in BF4
Click to expand...

Don't think brokenpixel understands what FP is.

Frame pacing is by design going to slow things down: it delays frames so it can better align them, so naturally the pace of the frames drops for that to happen.

Obviously, disable FP when benching. Benching 101.


----------



## ImJJames

DELETE


----------



## chiknnwatrmln

Quote:


> Originally Posted by *ImJJames*
> 
> http://www.3dmark.com/3dm11/7577056


Wow, 1270 MHz is a high OC. What cooling do you use?


----------



## ImJJames

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Wow 1270 MHz is a high oc. What cooling do you use?


It's actually 1225 MHz, and I am using the stock BIOS with the reference cooler; max temps 71C.


----------



## tsm106

Quote:


> Originally Posted by *ImJJames*
> 
> http://www.3dmark.com/3dm11/7577056


You must be throttling a bit. I would watch it till you get some better cooling.
http://www.3dmark.com/3dm11/7381362

Compare the GS (graphics) scores vs the clock speeds. You're very far off on efficiency, and your gains clock-for-clock are not great.


----------



## Mirob0t

Hey guys,

I just overclocked my Sapphire R9 290. I'd like to know if my score in the Heaven benchmark is correct,
and if you guys have better tips for overclocking, let me know!!


----------



## psyside

Quote:


> Originally Posted by *tsm106*
> 
> You must be throttling a bit. I would watch it till you get some better cooling.
> http://www.3dmark.com/3dm11/7381362
> 
> Compare the gs scores vs the clock speeds. You're very far off on efficiency and your gains IPC wise is not great.


What is GS, sorry?


----------



## HOMECINEMA-PC

I paid $475 for my Sapphire R9 290 and I'm very impressed. I'm that stoked I don't really care if it's locked or not.









P17127 in 3DMark 11 for a single card under $500 AU is, well, just the bee's knees I reckon.



Just sold my old 1985 300ZX, so I'm gonna get another one


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *psyside*
> 
> What is gs sorry?


Graphics score dude


----------



## chiknnwatrmln

Quote:


> Originally Posted by *ImJJames*
> 
> Its actually 1225 Mhz, and I am using just stock bios reference cooler max temps 71C.


It's annoying how 3DMark11's reported clock speeds are never right.

You must be insane running 1225 MHz on the reference cooler; surely you have to turn the fan up past 75%, right?

What voltage are you running?


----------



## AddictedGamer93

I swear, at these prices I might as well just get a 780 Ti. At least they are in stock.


----------



## tsm106

FYI y'all, 3DM11 is biased toward Ivy Bridge chips, inflating graphics scores a few hundred points on Ivy CPUs.


----------



## psyside

Quote:


> Originally Posted by *Mirob0t*
> 
> He guys
> 
> i just overclocked my Sapphire r9 290, i like to know if my score in heaven benchmark is correct
> and if u guys have better tips for overclocking... let me know!!


Use the fan at 60% at least.
Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Graphics score dude


Yeah, it's 6am here, so....


----------



## tsm106

Quote:


> Originally Posted by *AddictedGamer93*
> 
> I swear, at these prices I might as well just get a 780 Ti. At least they are in stock.


You can't mine with your Ti then. And when Mantle is out you'll be competing vs 7970s. Just sayin'...


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *tsm106*
> 
> FYI yall, 3dm11 is biased to IVY chips by inflating graphics scores a few hundred points on IVY cpus.


Geeze louise







That explains why some of my benchies aren't cuttin' it against them . [email protected]@2428 should be the other way around


----------



## Mirob0t

Oops


----------



## ImJJames

Quote:


> Originally Posted by *tsm106*
> 
> You must be throttling a bit. I would watch it till you get some better cooling.
> http://www.3dmark.com/3dm11/7381362
> 
> Compare the gs scores vs the clock speeds. You're very far off on efficiency and your gains IPC wise is not great.


I just realized I had someone else's 3DMark11 link on my clipboard....that's not mine


----------



## Mirob0t

Quote:


> Originally Posted by *psyside*
> 
> Use fan at 60% at least.


Well, the thing is, I'm not using the stock cooler; I replaced it with the Gelid Icy Vision rev2.
The cooler always runs at one speed, as I haven't bought the fan speed controller for it yet.
Fan RPM is always around 4100-4150; even if I set the speed to 100 in AB, it will not change anything.

So was my OC successful, or can I still get better performance?


----------



## tsm106

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> FYI yall, 3dm11 is biased to IVY chips by inflating graphics scores a few hundred points on IVY cpus.
> 
> 
> 
> Geeze louise
> 
> 
> 
> 
> 
> 
> 
> That explains why some of my benchies arent cuttin it against them . [email protected]@2428 should be the other way around
Click to expand...

Thus the reason why FS is a better bench than 11.









11... needs to be retired, especially with how it handles physics at the end of the runs.


----------



## Arizonian

Quote:


> Originally Posted by *nexusforce*
> 
> Hey everyone, just got my new 290:
> 
> *HIS AMD R9 290
> *Stock Cooler
> 
> 
> Spoiler: Picture!


I have to apologize for missing you. Congrats, and welcome to the club.









You were slotted in by the order of your entry.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *psyside*
> 
> Use fan at 60% at least.
> Yea, its 6am here so....


Yep, another case of upallnitebenchitis


----------



## psyside

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yep a another case of upallnitebenchitis


Sorry dude, I don't really understand you; I'm not much into OCN benching humor, or whatever that phrase means


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *psyside*
> 
> Sorry dude, don't really understand you, i'm not so much into OCN benching humor or whatever that phrase means


That's okay, I will break it down for ya:

up all nite bench itis


----------



## tsm106

It's hard to understand MADMEN unless you're a little MAD yourself. RAWR!


----------



## psyside

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Thats okay i will break it down for ya
> 
> up all nite bench itis


I thought about that, wasn't sure


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *tsm106*
> 
> It's hard to understand MADMEN unless you're a little MAD yourself. RAWR!


Correct, but what's that thing under your desk, hmmm? You're just as mad as I am, bro. And you know it


----------



## darkelixa

At this rate, it won't be till next year that my R9 290 arrives


----------



## djsatane

Hey guys, I am confused because I haven't seen any new AMD CAP profiles released lately; the latest was 13.5 CAP1.... Does AMD still release CAP profiles (mainly for crossfire), or do they just package them all with the latest drivers without actually listing them?


----------



## Forceman

Quote:


> Originally Posted by *djsatane*
> 
> Hey guys, I am confused because I havent seen any new AMD CAP profiles released lately at all, latest was 13.5 CAP1.... Does AMD still release CAP profiles(mainly for crossfire) or do they just packaged it all with latest drivers without actually listing them?


They are packaged in now.


----------



## Lord Xeb

The Gelid VRM heatsinks are horrible (the 290X VRMs produce a lot of heat), so I made my own:


----------



## Sgt Bilko

Quote:


> Originally Posted by *Lord Xeb*
> 
> Gelid VRM heat sinks are horrible (the 290x VRMs produce a lot of heat), so I made my own:


Now that's an epic mod









How are the temps now?


----------



## zpaf

Quote:


> Originally Posted by *ImJJames*
> 
> I just realized I had someone elses 3dmark11 link on my paste clipboard....thats not mine


This score is mine.









I had to up the core to 1275MHz for an 18,006 GPU score.
http://www.3dmark.com/compare/3dm11/7577071/3dm11/7577056


----------



## ImJJames

http://www.3dmark.com/3dm11/7621583


----------



## quakermaas

Quote:


> Originally Posted by *zpaf*
> 
> This score is mine.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I had to up the core to 1275MHz for 18.006 gpu score.
> http://www.3dmark.com/compare/3dm11/7577071/3dm11/7577056


The highly overclocked card is scoring less than the card at default clocks!?


----------



## james111333

Hey, is there anyone on here who has a 4770K and a 290X together? I'd like to compare a few scores, as I'm not sure everything is running as it should.
Ideally Valley, TR, Hitman and 3DMark.

If you can help, please PM me


----------



## givmedew

Quote:


> Originally Posted by *james111333*
> 
> Hey, is there anyone on here that has a 4770k and 290x together? I'd like to compare a few scores as i'm not sure that everything is running as it should?
> Ideally, Valley, TR, Hitman and 3D Mark.
> 
> If you can help, please PM me


*STOCK speed unlocked 290 into 290x*

Unigine Valley Benchmark 1.0

FPS:
38.1
Score:
1594
Min FPS:
20.3
Max FPS:
71.8

*1120/1440 +75mV offset w/ unlocked shaders*

Unigine Valley Benchmark 1.0

FPS:
42.0
Score:
1757
Min FPS:
21.7
Max FPS:
79.5

Both runs are with the CPU at stock speed and Hyper-Threading turned off, but HT shouldn't matter one bit in Valley, and it is weighted low in Fire Strike, so not too big a difference there.

My graphics-only score for Fire Strike Extreme is 5602, with a total score of 5088. I know the total score would be higher with HT on, but there would be no change to the graphics-only score.


----------



## darkelixa

Will my AMD 8350 be good enough for the R9 290 that I have coming?


----------



## psyside

Quote:


> Originally Posted by *givmedew*
> 
> *STOCK speed unlocked 290 into 290x*
> 
> Unigine Valley Benchmark 1.0
> 
> FPS:
> 38.1
> Score:
> 1594
> Min FPS:
> 20.3
> Max FPS:
> 71.8
> 
> *1120/1440 +75mV offset w/ unlocked shaders
> 
> *Unigine Valley Benchmark 1.0
> 
> FPS:
> 42.0
> Score:
> 1757
> Min FPS:
> 21.7
> Max FPS:
> 79.5


Pretty low increase; try without the memory OC, and fans @ 70%.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Will my amd 8350 be good enough for the r9 290 that I have comming?


Short answer: Yes

Somewhat longer answer: Yes, it's fast enough for gaming at 1080p. You won't be setting any benching records, but my 290X worked perfectly fine with my 8150 and then better with my 8350.
An i7 3770K/4770K is always going to be faster with a 290 than an 8350, but that is what we are hoping Mantle will change.


----------



## darkelixa

Maybe in the near future, when my 8350 can't keep up, I'll upgrade again







Hopefully they release a Steamroller chip for AM3, but that's hoping lol


----------



## givmedew

Quote:


> Originally Posted by *darkelixa*
> 
> Will my amd 8350 be good enough for the r9 290 that I have comming?


Yes, it will be totally fine. Did you OC the 8350?

The best way to verify that your CPU isn't getting in the way is to use MSI Afterburner, Precision X or any similar application in conjunction with RivaTuner (included with them) to display your GPU usage. If vsync is off, your frame rates are low, and your GPU usage isn't at 99%, then you are most likely being limited by the CPU.

To enable the overlay, go into settings; on the second tab there is a button for monitoring. Enable GPU usage and make sure it is set to overlay in game.

A good example: if you had 48 FPS and your GPU usage was 80%, then with a further CPU OC or a better CPU you could see 60 FPS. One thing to watch for: if your frame rate drops in certain scenes, glance at your GPU usage, because if it drops as well those could be scenes that are heavy on your CPU, and that is why the framerate is dropping. If the framerate drops and the GPU usage stays high, then the scene is simply very GPU-intense.
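The rule of thumb above (low FPS with the GPU well under 99% usage points at the CPU; low FPS with the GPU pegged means a GPU-heavy scene) can be sketched as a simple check. The thresholds here are illustrative, not anything Afterburner itself reports:

```python
# Classify a bottleneck from an FPS reading and GPU usage, per the
# forum heuristic: with vsync off, a GPU idling below ~95% while frames
# are low suggests a CPU limit. Thresholds (60 FPS, 95%) are assumptions.

def likely_bottleneck(fps: float, gpu_usage_pct: float,
                      target_fps: float = 60.0) -> str:
    if fps >= target_fps:
        return "none"           # hitting the target; nothing to diagnose
    if gpu_usage_pct < 95.0:
        return "cpu"            # GPU has headroom but frames are low
    return "gpu"                # GPU pegged; scene is GPU-intense

print(likely_bottleneck(48, 80))  # cpu
print(likely_bottleneck(45, 99))  # gpu
```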


----------



## rdr09

Quote:


> Originally Posted by *Mirob0t*
> 
> hmm isnt 4.2 ghz enough?
> im using the latest driver ( AMD_Catalyst_13.11_BetaV9.5 )


for a 280 maybe.


----------



## Sgt Bilko

RMA Update time!!

Good news: they tested my card with Furmark and Heaven; no problems with Furmark, but massive artifacts and coil whine in Heaven. So that's good (in a weird way).

Bad news: there is a screw missing from the fan shroud.......I have no idea why; I never took the shroud off, just the cooler. So now they are thinking I can't get a refund.......anyone want to mail a screw to them on my behalf?


----------



## givmedew

Quote:


> Originally Posted by *psyside*
> 
> Quote:
> 
> 
> 
> Originally Posted by *givmedew*
> 
> *STOCK speed unlocked 290 into 290x*
> 
> Unigine Valley Benchmark 1.0
> 
> FPS:
> 38.1
> Score:
> 1594
> Min FPS:
> 20.3
> Max FPS:
> 71.8
> 
> *1120/1440 +75mV offset w/ unlocked shaders
> 
> *Unigine Valley Benchmark 1.0
> 
> FPS:
> 42.0
> Score:
> 1757
> Min FPS:
> 21.7
> Max FPS:
> 79.5
> 
> 
> 
> Pretty low increase, try without memory oc, and fans @70%
Click to expand...

Do you have a score to compare to? Like what you got at stock and what you got with settings similar to mine? My math says ~10% more than 38 is just under 4FPS, or ~42FPS, which is what I got. For the max I had 71.8, and 10% would be 7.2; I got 79.5.

So how would reducing the memory clock help me if I got almost exactly the same percentage gain as the clock increase? From my understanding, if you keep OCing the memory it eventually stops helping unless you also raise the core clock, and if you raise the core clock the memory eventually starts choking it and needs to be raised again. It is also my understanding that right before your memory starts causing noticeable issues you will see a slightly lower benchmark score. I tested at every 10MHz with the core fixed at 1120; the score improved every time, but the gains eventually tapered off, so I stopped at 1440. Not because it got unstable or because my score went down, but because it stopped giving noticeable results. I mean like 2-4 points per 5-10MHz, not enough to keep increasing it.

As for the core clock without the +75mV offset, I get graphical glitches starting just over 1050. I have not tried going over 1120, so I don't know exactly how far +75mV will take me; I do know that even +50mV seems to be enough to make the card stable. Since my card usually runs around 40C flat out, I'm not too worried about adding +50-75mV over stock.
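The scaling argument in the post above can be checked with quick arithmetic. The Valley numbers are from the quoted runs; the 1000 MHz baseline is an assumption (reference 290X stock clock):

```python
# Sanity check: does the FPS gain track the core-clock gain?
# Clocks/FPS from the Valley runs above; 1000 MHz stock core is an assumption.
def pct_gain(before, after):
    """Percentage increase from before to after."""
    return (after - before) / before * 100

core = pct_gain(1000, 1120)   # core clock bump
avg  = pct_gain(38.1, 42.0)   # average FPS
mx   = pct_gain(71.8, 79.5)   # max FPS

print(f"core +{core:.1f}%, avg FPS +{avg:.1f}%, max FPS +{mx:.1f}%")
```

The FPS gains land within a couple of points of the clock gain, which supports the "scaling is already near-linear, so memory isn't choking anything" reading.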


----------



## givmedew

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mirob0t*
> 
> hmm isnt 4.2 ghz enough?
> im using the latest driver ( AMD_Catalyst_13.11_BetaV9.5 )
> 
> 
> 
> for a 280 maybe.
Click to expand...

For most games 4.2GHz isn't going to limit you. I went from a 4.6GHz 4670K on water to a stock 4670K with the stock cooler while I am rebuilding my watercooling loop. I have the GPU back under water but not the CPU yet, and there is hardly a difference in frame rates in most games. However, in some games there is a noticeable change.


----------



## darkelixa

99% gpu usage in crysis 3
99% in ffxiv

So i am pretty sure the 5850 is a huge bottleneck not my cpu,

Oh is that what pccg said about your RMA?


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> 99% gpu usage in crysis 3
> 99% in ffxiv
> 
> So i am pretty sure the 5850 is a huge bottleneck not my cpu,
> 
> Oh is that what pccg said about your RMA?


There is a BIG difference between a 5850 and an R9 290, my 7970 maxes out at 100% usage in some games but the 290x won't.

imo you will need to overclock the 8350 to at least 4.6 to get the best benefit from it. I run mine at 4.8 and sometimes 5.0.

And yes, PCCG want me to mail them a screw i don't have, i'd buy another 290x if i could afford it and tell them to take it out of that


----------



## Raephen

Quote:


> Originally Posted by *Sgt Bilko*
> 
> RMA Update time!!
> 
> Good News: So they tested my card with Furmark and Heaven, no problems with Furmark but massive artifacts and coil whine with heaven. So thats good (in a weird way)
> 
> Bad news: There is a screw missing from the fan shroud.......I have no idea why, i never took the shroud off, just the cooler. So now they are thinking that i can't get a refund.......anyone want to mail a screw to them on my behalf?


My card has shown no issues since I got it and seems very happy with the kryographics Hawaii block I gave it, so I have some screws from the stock cooler to spare...

... but they'd have to be shipped to the other side of the world


----------



## rdr09

Quote:


> Originally Posted by *givmedew*
> 
> For most games 4.2GHz isn't going to limit you. I went from 4.6GHz 4670K on water to stock 4670K with stock cooler while I am rebuilding my watercooling loop. I have the GPU back under water but not the CPU yet and there is hardly a difference in frame rates on most games. However some games there is a noticeable change.


i went back to make sure what game is giving Miro issue and it is BF4. it could be some settings but i think a higher cpu overclock is in order.
Quote:


> Originally Posted by *darkelixa*
> 
> 99% gpu usage in crysis 3
> 99% in ffxiv
> 
> So i am pretty sure the 5850 is a huge bottleneck not my cpu,
> 
> Oh is that what pccg said about your RMA?


290 is prolly equal to 4 5850s.


----------



## darkelixa

Oh how come you have to Oc to get the performance? when i look at my cpu usage its only at about 30% usage

If i use the asus oc on the bios setting it puts it to 4.7 standard.

I could get a screw from an older gpu i own


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> i went back to make sure what game is giving Miro issue and it is BF4. it could be some settings but i think a higher cpu overclock is in order.
> 290 is prolly equal to 4 5850s.


These 'red things' have come a long way eh


----------



## Sgt Bilko

Quote:


> Originally Posted by *Raephen*
> 
> My card has shown no issues since I got it and seems very happy with the kryographics Hawaii block I gave it, so I have some screws from the stock cooler to spare...
> 
> ... but they'd have to be shipped to the other side of the world


lol, i meant it as a joke more than anything, thanks for the offer though









ah i dunno, i'll give them a call tomorrow, they might be able to put another card aside for me when i get paid again


----------



## darkelixa

Lol i sell Nuts and bolts as a living ..


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Oh how come you have to Oc to get the performance? when i look at my cpu usage its only at about 30% usage
> 
> If i use the asus oc on the bios setting it puts it to 4.7 standard.
> 
> I could get a screw from an older gpu i own


Games don't generally use more than 4 cores, so that 30% usage is averaged across all 8 cores.

Someone else could explain this a lot better than me but i'll try.

The CPU processes the information then sends it to the GPU to render and display on the screen, so the faster a CPU is (per thread, i think) the faster the information gets sent. Ideally the CPU can feed frames faster than the GPU can render them; if it can't keep up, the GPU sits waiting and your frame rate drops.

Intel CPUs are much better at this than AMD's, so that's why an FX 8 core will generally be compared to an i5 instead of an i7.

That said, i love my 8350 and once you overclock it a bit you will notice the benefit over stock almost immediately.
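That pipeline description boils down to: each frame costs some CPU time to prepare and some GPU time to render, and with the two stages overlapped, the slower one sets your frame rate. A toy model (the millisecond figures are made-up examples, not measurements):

```python
# Toy model of the CPU->GPU frame pipeline: with the stages overlapped,
# frame time is dominated by whichever stage is slower.
def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms_per_frame)

# Hypothetical numbers: a faster CPU only helps while the CPU is the slower stage.
print(fps(cpu_ms_per_frame=12.0, gpu_ms_per_frame=10.0))  # CPU-bound
print(fps(cpu_ms_per_frame=8.0,  gpu_ms_per_frame=10.0))  # GPU-bound
```

This is also why overclocking the CPU past the point where `cpu_ms < gpu_ms` stops adding frames: the GPU becomes the limiter.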

Thanks for the offer, i even went and checked out my 6970 Ref but it's clips....not screws.









It's all good, i'll figure it out on Monday.


----------



## darkelixa

Ah np man









When I look at the cpu usage in the resource monitor most of the time 3 or 4 of the threads are parked lol they just have nothing to do


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Ah np man
> 
> 
> 
> 
> 
> 
> 
> 
> 
> When I look at the cpu usage in the resource monitor most of the time 3 or 4 of the threads are parked lol they just have nothing to do


Yep, Battlefield 3 and 4 are the only games i know off the top of my head that use more than 4 cores, (6 and 8 respectively)

More games will be coming out that will be able to utilise the FX's 8 cores due to the new consoles,
so combine that with Mantle and i think a Crossfire 290 setup is looking like the future for me


----------



## rdr09

Quote:


> Originally Posted by *darkelixa*
> 
> Oh how come you have to Oc to get the performance? when i look at my cpu usage its only at about 30% usage
> 
> If i use the asus oc on the bios setting it puts it to 4.7 standard.
> 
> I could get a screw from an older gpu i own


what gpu again? the 5850?
Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Thses 'red things' have come a long way eh


yes, they have . . .

6870

http://www.3dmark.com/3dm11/2454753

7950

http://www.3dmark.com/3dm11/5948666

290

http://www.3dmark.com/3dm11/7556915

my 6870 played all the games the other 2 play but ran out of vram in games like BF3 MP.

edit: MADMAN, you made a lot of 760 owners proud.


----------



## darkelixa

Correct, currently using a 5850 until the r9 290 gets here


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> what gpu again? the 5850?
> yes, they have . . .
> 
> 6870
> 
> http://www.3dmark.com/3dm11/2454753
> 
> 7950
> 
> http://www.3dmark.com/3dm11/5948666
> 
> 290
> 
> http://www.3dmark.com/3dm11/7556915
> 
> my 6870 played all the games the other 2 plays but ran out of vram in games like BF3 MP.
> 
> edit: MADMAN, you made a lot of 760 owners proud.


Pity i dont have any single 6970 runs here.

Crossfire 6970

http://www.3dmark.com/3dm11/6414400

7970Ghz

http://www.3dmark.com/3dm11/6377583

290x

http://www.3dmark.com/3dm11/7423200

And yes....They've come a long way


----------



## rdr09

Quote:


> Originally Posted by *darkelixa*
> 
> Correct, currently using a 5850 until the r9 290 gets here


with that gpu, you won't need much of the 8350. with the 290, you gonna need to oc until Mantle.


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Pity i dont have any single 6970 runs here.
> 
> Crossfire 6970
> 
> http://www.3dmark.com/3dm11/6414400
> 
> 7970Ghz
> 
> http://www.3dmark.com/3dm11/6377583
> 
> 290x
> 
> http://www.3dmark.com/3dm11/7423200
> 
> And yes....They've come a long way


stick with the 290s, Sgt.


----------



## conzilla

I am using an 8800GT till my 290 gets here. And I'll tell ya what, it plays BF4 at 26 fps on as low as the settings will go. lol


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> stick with the 290s, Sgt.


No doubt there, the 6970 is paired with a 940 BE, the 7970 with an 8150 and i'll be having 2 x 290's whenever my RMA goes through........if it does


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Thses 'red things' have come a long way eh
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> what gpu again? the 5850?
> yes, they have . . .
> 
> 6870
> 
> http://www.3dmark.com/3dm11/2454753
> 
> 7950
> 
> http://www.3dmark.com/3dm11/5948666
> 
> 290
> 
> http://www.3dmark.com/3dm11/7556915
> 
> my 6870 played all the games the other 2 plays but ran out of vram in games like BF3 MP.
> 
> edit: MADMAN, you made a lot of 760 owners proud.
Click to expand...

Why thanks very much, i made sure i overclocked and overvolted those little 'green buggars' as much as possible and nearly smashed every record / benchmark i could without using extreme cooling on single, SLI and tri. Tried quad, non-event bummer







. But it sure looked real good









Best 760 is da hawk cause its the only one that could go past 1.212v (AB soft mod). my highest clock i gots was [email protected]@1.32v on factory bios









Gonna do me best with these 'red things' .









Quote:


> Originally Posted by *Sgt Bilko*
> 
> Pity i dont have any single 6970 runs here.
> 
> Crossfire 6970
> 
> http://www.3dmark.com/3dm11/6414400
> 
> 7970Ghz
> 
> http://www.3dmark.com/3dm11/6377583
> 
> 290x
> 
> http://www.3dmark.com/3dm11/7423200
> 
> And yes....They've come a long way


Very interesting . Ive been a green man since i started this (epeen) hobby


----------



## psyside

Quote:


> Originally Posted by *givmedew*
> 
> Do you have a score to compare to? Like what you got with stock and what you got with settings similar to mine? Because my math says ~10% more than 38 is under 4FPS or ~42FPS which is what I got. For the max I had 71.8 and 10% would be 7.2 and I received 79.5.
> 
> So how would reducing the memory clock help me if I got almost exactly the percentage of clock increase out of things? From my understanding if you keep OCing the memory eventually it stops helping unless you increase the clock and if you increase the clock eventually the memory starts choking the clock and it will need to be increased. It is also my understanding that right before you memory starts causing noticeable issues you will get just a slightly lower benchmark. I did tests at every 10MHz with the core set to the same speed set of 1120 each time the score was better every time but eventually stopped getting much better so I stopped at 1440 not because it got unstable or because my score went down but because it stopped giving noticeable results. I mean like 2-4 points for 5-10MHz so not enough to keep increasing it.
> 
> As for the clock without the 70mV offset I get graphical glitches starting at just over 1050. I have not tried going over 1120 so I don't know exactly how far +70mV will take me I do know that it seems even +50mV is enough to make the card stable. However since my card usually runs around 40C full out I'm not to worried about adding +50-70mV over stock.


Well, i didn't know you went with the step by step oc, in that case i guess its the benchmark. I thought your memory was unstable so you got lower fps. You should try the Metro benchmark maybe? Max out the settings


----------



## alancsalt

Quote:


> Originally Posted by *james111333*
> 
> Hey, is there anyone on here that has a 4770k and 290x together? I'd like to compare a few scores as i'm not sure that everything is running as it should?
> Ideally, Valley, TR, Hitman and 3D Mark.
> 
> If you can help, please PM me


Ricdeau has 4770K/290X
Quote:


> Originally Posted by *Ricdeau*


----------



## nexusforce

Quote:


> Originally Posted by *Arizonian*
> 
> I have to apologize for missing you. Congrats - and welcome to the club.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You were slotted in by your order of your entry.


Thanks and no problem its great to be a member. Do you guys think my Corsair HX 750 PSU, is enough for two 290's crossfired? Hardware secrets ( http://www.hardwaresecrets.com/article/Corsair-HX750W-Power-Supply-Review/775/8 ) indicates that it should be able to pull more than 750w (close to 900w) while still staying within specs. I have a i5-3570k overclocked.


----------



## Sgt Bilko

Quote:


> Originally Posted by *nexusforce*
> 
> Thanks and no problem its great to be a member. Do you guys think my Corsair HX 750 PSU, is enough for two 290's crossfired? Hardware secrets ( http://www.hardwaresecrets.com/article/Corsair-HX750W-Power-Supply-Review/775/8 ) indicates that it should be able to pull more than 750w (close to 900w) while still staying within specs. I have a i5-3570k overclocked.


At stock i think it will be fine, but if you plan on overclocking them then i'd want a bigger PSU. tsm and sonda have shown some 290x results (sonda's are on the OP) and they were pulling 800W+

And running your PSU at or close to it's max isn't good for it.
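A back-of-the-envelope DC power budget makes the advice concrete. All per-component wattages below are rough ballpark assumptions, not measurements; check reviews for your actual cards:

```python
# Rough DC power budget for two 290s + an overclocked i5 on a 750 W unit.
# Every figure here is an assumed ballpark, not a measured value.
gpu_w     = 275    # one R9 290 under load, roughly stock
oc_factor = 1.25   # overclock + overvolt can add on the order of 25%
cpu_w     = 120    # overclocked quad-core i5
rest_w    = 60     # board, drives, fans

stock_total = 2 * gpu_w + cpu_w + rest_w
oc_total    = 2 * gpu_w * oc_factor + cpu_w + rest_w
print(f"stock: {stock_total} W, overclocked: {oc_total:.0f} W vs 750 W rated")
```

Under these assumptions a stock pair sits right at the unit's rating and an overclocked pair exceeds it, which is why a bigger PSU is the safer call.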


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *alancsalt*
> 
> Ricdeau has 4770K/290X


Hey there, come to have a second looksy before you go one of those expensive 'green things' eh


----------



## quakermaas

Quote:


> Originally Posted by *nexusforce*
> 
> Thanks and no problem its great to be a member. Do you guys think my Corsair HX 750 PSU, is enough for two 290's crossfired? Hardware secrets ( http://www.hardwaresecrets.com/article/Corsair-HX750W-Power-Supply-Review/775/8 ) indicates that it should be able to pull more than 750w (close to 900w) while still staying within specs. I have a i5-3570k overclocked.


I would say you are OK as long as you don't overclock those 290s at all. I am seeing up to 1000+w draw from the wall on my overclocked CPU & 290s during benchmarks, and about 600~700w when gaming.
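One caveat when comparing wall readings like these to a PSU's rating: the rating is DC output, while a wall meter reads AC input, which includes conversion losses. A quick conversion, assuming roughly 88% efficiency (an assumption; check your unit's 80 Plus curve for the real figure):

```python
# Convert an AC wall-meter reading to the approximate DC load on the PSU.
# The 88% efficiency figure is an assumption, not a spec for any given unit.
def dc_load(wall_watts, efficiency=0.88):
    return wall_watts * efficiency

print(f"{dc_load(1000):.0f} W DC")  # benchmark-load wall reading from above
print(f"{dc_load(650):.0f} W DC")   # gaming-load wall reading from above
```

So a 1000 W wall reading is closer to ~880 W of actual DC load, which still leaves very little headroom on a 750 W unit.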


----------



## VSG

So my replacement Sapphire 290x with its ASIC of 82.2% is an excellent overclocker on air- so much so that the other card is slightly holding it back on stock cooling. On water, I guess it will equal out more but so far so good. I managed to get both cards at 1125/1375 on stock volts before artifacts.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *geggeg*
> 
> So my replacement Sapphire 290x with its ASIC of 82.2% is an excellent overclocker on air- so much so that the other card is slightly holding it back on stock cooling. On water, I guess it will equal out more but so far so good. I managed to get both cards at 1125/1375 on stock volts before artifacts.


Sound about right


----------



## alancsalt

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *alancsalt*
> 
> Ricdeau has 4770K/290X
> 
> 
> 
> Hey there , Come to have a second looksy before you go one of those expensive 'green things' eh o
Click to expand...

Fixed that for you!


----------



## Durvelle27

Just got a notification from USPS

Scheduled Delivery Day: December 7, 2013, 12:00 pm
Money Back Guarantee

I'm so excited. Quick question. Has anybody thought about using the NZXT Kraken G10 & AIO with the 290(X)


----------



## ImJJames

1225/1500 Stock bios

http://www.3dmark.com/fs/1265096


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> Just got a notification from USPS
> 
> Scheduled Delivery Day: December 7, 2013, 12:00 pm
> Money Back Guarantee
> 
> I'm so excited. Quick question. Has anybody thought about using the NZXT Kraken G10 & AIO with the 290(X)


I've seen them but i'm not sure about the vrm cooling.....all the pics NZXT have shown for it have it attached to a 780 with no sinks anywhere


----------



## stickg1

I was literally about to cry. After a week of waiting for my card I installed it and no display.

Updated mobo BIOS, reseated the card, tried the BIOS switch, tried different cables. Not sure what exactly was the problem but I got a display and installing drivers now, phew!


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I've seen them but i'm not sure about the vrm cooling.....all the pics NZXT have shown for it has it attached to a 780 with no sinks anywhere


Wow that's not good. You might be able to install some


----------



## hotrod717

Quote:


> Originally Posted by *tsm106*
> 
> FYI yall, 3dm11 is biased to IVY chips by inflating graphics scores a few hundred points on IVY cpus.


Not only that, but a 2011 cpu will inherently score better because of physics. I score 20071 in graphics, but only 12065 for physics, and have a total of P16934. Home-Cinema for example has lower clocks and graphics (18359), but his 6-core physics (15174) pushes him past me on overall: P17127. I definitely got to break down and get Firestrike. It's the cheaper alternative to getting a 3930k or 4930k.


----------



## stickg1

Well it runs, wow 100% fan on the stock cooler is really something else, LOL...

I might look for an aftermarket cooler, or swap cards when the AIBs drop custom cooled models.

Why are there no crossfire fingers on my card? Am I missing something?

Anyway, here's my validation link: http://www.techpowerup.com/gpuz/2xess/

Here's a screenie..


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> Wow that's not good. You might can install some


----------



## rdr09

Quote:


> Originally Posted by *stickg1*
> 
> Well it runs, wow 100% fan on the stock cooler is really something else, LOL...
> 
> I might look for an aftermarket cooler or swap cards with the AIB's drop custom cooled models.
> 
> Why are there no crossfire fingers on my card? Am I missing something?
> 
> Anywhere heres my validation link: http://www.techpowerup.com/gpuz/2xess/
> 
> Here's a screenie..


no need for fingers. it is now called Sideport (see post no. 2).


----------



## stickg1

Quote:


> Originally Posted by *rdr09*
> 
> no need for fingers. it is now called Sideport (see post no. 2).


What a brilliant idea, I always thought of crossfire/sli bridges as an eyesore. Not that I even plan to crossfire these cards, but still that's really cool.


----------



## ImJJames

Alright, seems like the max OC I can go on stock bios is 1225 core, 1500 memory. Which bios for the r9 290 will give me some more volts?


----------



## Sgt Bilko

Quote:


> Originally Posted by *ImJJames*
> 
> Alright seems like the max OC I can go on stock bios is 1225 Clock, 1500Memory. Which Bios for r9 290 will give me some more volts?


PT3 gives the most volts but it's meant as an LN2 bios.......you are still on stock air correct?

If so then wow...1225 on stock.


----------



## Jack Mac

I can do 1200 with my card, haven't really tried going much higher though and I wouldn't game on it because it requires 70% fanspeed.


----------



## stickg1

I'm having a little trouble adjusting the voltage on my 290. In MSI-AB I accepted the EULA and enabled overclocking but the voltage slider is grayed out. I installed GPU Tweak and I don't even see a voltage slider or an option in the settings, period. Sorry for the noobness in advance, lol, I'm sure its something silly.


----------



## ImJJames

Quote:


> Originally Posted by *Sgt Bilko*
> 
> PT3 gives the most volts but it's meant as an LN2 bios.......you are still on stock air correct?
> 
> If so then wow...1225 on stock.


Yeah I am on the stock air cooler lol, I haven't even passed 70C yet. So what's my other option besides PT3?
Quote:


> Originally Posted by *Jack Mac*
> 
> I can do 1200 with my card, haven't really tried going much higher though and I wouldn't game on it because it requires 70% fanspeed.


I wouldn't game this high either, but i just want to see this card's full potential.


----------



## Sgt Bilko

Quote:


> Originally Posted by *stickg1*
> 
> I'm having a little trouble adjusting the voltage on my 290. In MSI-AB I put in the EULA and enabled overclocking but the slider is grayed out. I installed GPU-Tweak and I don't even see a voltage slider or an option in the settings period. Sorry for the noobness in advance, lol, I'm sure its something silly.


AB beta 17 right?

Should just be a simple option check, then you are in business.


----------



## ZombieJon

For some reason, I can't put in the screws on the PCIe bracket for the 290. I had difficulty doing the same with the 7950, and was barely able to put in a thumbscrew.

Anybody have a suggestion that might work? I might try regular screws tomorrow, but doubt they will fit either.


----------



## Durvelle27

My Sapphire R9 290 arrived and i got the Accelero Xtreme III installed along with some copper heatsinks


----------



## ImJJames

Quote:


> Originally Posted by *Durvelle27*
> 
> My Sapphire R9 290 arrived and i got the Accelero Xtreme III installed along with some copper heatsinks


Nice! Planning on adding that cooler on mine also, let us know how it is.


----------



## X-oiL

I'm not pushing 120 FPS stable in BF4 with my rig below, so i want to OC my GPU and CPU.

So where do i begin? So far i've got MSI AB and 3dmark for some temp control and benchmarking. I've made a custom fan profile so during gaming it never goes above 80 celsius, and I've raised the power limit to max +50%.

Now how much do i raise the core clock? In steps of 5? Or 10? Or more?
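A common answer to the "steps of 5 or 10?" question is coarse-then-fine: climb in big steps until the first failure, back off, then refine in small steps. A sketch of that search logic, where `is_stable` stands in for an actual test pass (e.g. a benchmark loop plus artifact inspection) and every number is just an example:

```python
# Coarse-to-fine core-clock search: big steps up to the first failure,
# then small steps from the last known-good clock. is_stable() is a
# stand-in for running your benchmark and checking for artifacts/crashes.
def find_max_clock(start, is_stable, coarse=25, fine=5, limit=1300):
    clock = start
    while clock + coarse <= limit and is_stable(clock + coarse):
        clock += coarse              # coarse climb
    while clock + fine <= limit and is_stable(clock + fine):
        clock += fine                # fine refinement
    return clock

# Example with a pretend card that artifacts above 1113 MHz:
print(find_max_clock(1000, lambda mhz: mhz <= 1113))  # -> 1110
```

The same idea works for memory; just keep the other clock fixed while you search, as givmedew described earlier in the thread.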


----------



## Ricdeau

Quote:


> Originally Posted by *james111333*
> 
> Hey, is there anyone on here that has a 4770k and 290x together? I'd like to compare a few scores as i'm not sure that everything is running as it should?
> Ideally, Valley, TR, Hitman and 3D Mark.
> 
> If you can help, please PM me


I have a 4770K and two 290Xs, and everything you listed. Let me know what you want to know, and I can either pull benchmarks or re-run some with a single card.


----------



## Ricdeau

Quote:


> Originally Posted by *givmedew*
> 
> Do you have a score to compare to? Like what you got with stock and what you got with settings similar to mine? Because my math says ~10% more than 38 is under 4FPS or ~42FPS which is what I got. For the max I had 71.8 and 10% would be 7.2 and I received 79.5.
> 
> So how would reducing the memory clock help me if I got almost exactly the percentage of clock increase out of things? From my understanding if you keep OCing the memory eventually it stops helping unless you increase the clock and if you increase the clock eventually the memory starts choking the clock and it will need to be increased. It is also my understanding that right before you memory starts causing noticeable issues you will get just a slightly lower benchmark. I did tests at every 10MHz with the core set to the same speed set of 1120 each time the score was better every time but eventually stopped getting much better so I stopped at 1440 not because it got unstable or because my score went down but because it stopped giving noticeable results. I mean like 2-4 points for 5-10MHz so not enough to keep increasing it.
> 
> As for the clock without the 70mV offset I get graphical glitches starting at just over 1050. I have not tried going over 1120 so I don't know exactly how far +70mV will take me I do know that it seems even +50mV is enough to make the card stable. However since my card usually runs around 40C full out I'm not to worried about adding +50-70mV over stock.


Your Valley scores are really low. Single card with no OC on a 290X and I score 64.7 FPS average and 2708 on ExtremeHD preset. Your Firestrike is much closer to where it should be I think, but still seems maybe a touch low with your clock speeds.


----------



## stickg1

Hrmm, having a bit of trouble now. It all started when I unlocked the voltage and started tinkering. I set the clocks too high and would black screen; rebooting didn't help because I had the checkbox active in MSI-AB to apply the overclock at startup. So I uninstalled MSI AB in safe mode, got everything going again, and reinstalled AB. Now I can set the clocks and voltage, but when I put any stress on the GPU the clock fluctuates a lot, from 2D to stock to 1050MHz. I have it set for 1125MHz in AB but can't get it to run that fast in benchmarks. It was running that fast before; maybe I borked my drivers? I guess I will try a driver and MSI software sweep, then reinstall and see where I'm at, unless anyone has some suggestions?


----------



## Arizonian

Quote:


> Originally Posted by *stickg1*
> 
> Well it runs, wow 100% fan on the stock cooler is really something else, LOL...
> 
> I might look for an aftermarket cooler or swap cards with the AIB's drop custom cooled models.
> 
> Why are there no crossfire fingers on my card? Am I missing something?
> 
> Anywhere heres my validation link: http://www.techpowerup.com/gpuz/2xess/
> 
> Here's a screenie..
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Powercolor
Quote:


> Originally Posted by *Durvelle27*
> 
> My Sapphire R9 290 arrived and i got the Accelero Xtreme III installed along with some copper heatsinks
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added with Aftermarket cooling.









12 members now using Arctic Extreme cooling. Results are really good too, enjoy.


----------



## stickg1

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _I didn't see what manufacturer so you're down for default Sapphire, the most purchased GPU. If different PM me I'll update._


PM'd - It's PowerColor


----------



## Durvelle27

For verification


----------



## Arizonian

Got you both updated - thanks


----------



## Durvelle27

I just ran 3DMark11 and got a weird time measurement error. What is it?

But this is at stock (947/1250)



http://www.3dmark.com/3dm11/7624361

OC'd (1100/1400) Core @58c Max



http://www.3dmark.com/3dm11/7624460


----------



## geoxile

Can someone tell me which OEMs have the warranty stickers on the HS/F screws?


----------



## brazilianloser

Quote:


> Originally Posted by *geoxile*
> 
> Can someone tell me which OEMs have the warranty stickers on the HS/F screws?


Asus does, both my cards have it. Not sure on the others.


----------



## Sgt Bilko

Quote:


> Originally Posted by *geoxile*
> 
> Can someone tell me which OEMs have the warranty stickers on the HS/F screws?


afaik, Sapphire (100% certain), MSI and Powercolor don't but Asus and XFX do......I also think Gigabyte does but someone should confirm these.

Might be a good thing to add to the OP Arizonian?


----------



## Roaches

Quote:


> Originally Posted by *Sgt Bilko*
> 
> afaik, Sapphire (100% certain), MSI and Powercolor don't but Asus and XFX do......I also think Gigabyte does but someone should confirm these.
> 
> Might be a good thing to add to the OP Arizonian?


Gigabyte doesn't have stickers on the GPU bracket screws. My GTX 680s WF5X don't have them.


----------



## brazilianloser

Remove it very carefully... make sure to keep it safe in an easy to remember location and hopefully you are able to put it back on with a tiny bit of glue... maybe...???

On another subject, can anyone shed some light on the option under CCC for "Enable ITC Processing"? And should I disable it, since it is only available on 2 of my 3 screens due to the way they are connected to the GPU?


----------



## geoxile

Doesn't MSI normally have stickers? I recall the MSI 7950 Twin Frozr having them.


----------



## broken pixel

Have you seen the asking prices for 290x, 7970L and 7950s on ebay? Insane, man. I sold 6 7950s a few weeks too early it seems, as people are paying over 400 bucks for them.

I have two XFX 290x GPUs sitting on my desk and I am debating whether I should sell them. I also have two 7970Ls, a BE and just an L.

I was waiting for AIBs to release 290x variants but i got tired of waiting. But the same day I got my 290xs my mobo goes poof. So now I wait and stare at my 290xs, as all my powered risers broke but one. They run too hot to run side by side.

Should I sell them while the asking prices are outta this world? Or hold them? I have a RIVE BE that should be delivered Monday.


----------



## RAFFY

Quote:


> Originally Posted by *broken pixel*
> 
> Have you seen the asking prices for 290x, 7970L and 7950s on ebay, insane man. I sold 6 7950s a few weeks too early it seems as people are paying over 400 bucks for them.
> 
> I have two XFX 290x GPUs sitting on my desk and I am debating on if I should sell them? I also have two 7970L a BE and just an L.
> 
> I was waiting for AIBs to release 290x varients but i got tired of waiting. But the same day I got my 290xs my mobo goes poof. So now I wait and stare at my 290xs as all my powered risers broke but one. They run too hot to run side by side.
> 
> Should I sell them while the asking prices are Outa this world? Or hold them, I have a RIVEBE that should be delivered Monday.


If these prices for the 290x stay around the 650-700 mark i'll sell all three of mine, get non-reference when those are released in January, and put them under water again.

Edit: When I am mining what would be a safe VRM temperature to hold for countless hours?


----------



## amlett

I've been playing with clocks after getting the block, and these seem to be the best stable BF4 spots with the ASUS 290X BIOS:

1100/1350 on stock volts (1.23 in GPUZ)
http://img22.imageshack.us/img22/3269/v8d0.png
Score 10173 GPU SCORE 11726

1200/1500 with 1.41v in Asus GPU Tweaker
http://img69.imageshack.us/img69/9881/gjzl.png
Score 11277 GPU SCORE 13209

And these are the highest clocks I've been able to run in Firestrike without artifacts. Not tested in BF4 or Catzilla like the other two.
1225/1600 with 1.41v in Asus GPU Tweaker
http://img27.imageshack.us/img27/355/unld.png
Score 11377 GPU SCORE 13406

Love this card with the block


----------



## broken pixel

Quote:


> Originally Posted by *RAFFY*
> 
> If these prices for the 290x stay around the 650-700 mark ill sell all three of mine and get non-reference when those are released in January and put them under water again.
> 
> Edit: When I am mining what would be a safe VRM temperature to hold for countless hours?


Even Newegg jumped the prices of the out-of-stock 290Xs; the Asus 290X with BF4 is 600 bones. I bought my two XFX 290Xs for $549.00 last week and they are $599.00 now. I guess 290X GPUs are limited edition now, lulz.

About the VRMs I am not sure. What are the max VRM temps for the 290X? I would try keeping them 5-10C below the max if possible.

If you can, position a decent 120mm fan somehow. I use some heavy-duty 120mm fans and lay them on top of my GPUs near the power connectors.
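The 5-10C rule of thumb above is easy to make concrete. A tiny sketch; the 125C limit here is an assumed figure for illustration, not a published spec for the 290X's VRMs, so check your card's actual rating:

```python
# Safe sustained VRM target = spec max minus a safety margin.
VRM_MAX_C = 125  # assumed VRM limit, for illustration only

def safe_vrm_target(max_c: float, margin_c: float = 10) -> float:
    """Temperature to stay under for sustained loads such as mining."""
    return max_c - margin_c

print(safe_vrm_target(VRM_MAX_C))     # conservative 10 C margin -> 115
print(safe_vrm_target(VRM_MAX_C, 5))  # tighter 5 C margin -> 120
```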


----------



## jrcbandit

Anyone have experience with on-screen graphics corruption after idling when overclocking a 290X? I can have the GPU voltage set to 1.412V in GPU Tweak and the core at 1230 MHz and everything is fine after the computer idles for hours. However, if I raise the memory above the default 5000 (1250) MHz, I get screen corruption after the computer idles. During gaming and normal desktop use there is absolutely zero problem setting the memory higher (although my memory doesn't overclock all that well: 1400 MHz for normal use and 1500 MHz for benchmarking).

I am also overclocking a 4770K processor using set voltage, not adaptive.


----------



## MrWhiteRX7

Newegg and Amazon raised prices of the 290 and 290x


----------



## stickg1

Ah, supply and demand at its best. I was just looking at eBay prices and thinking about selling my howler monkey here, but then I wouldn't be able to find another! So I will just get aftermarket cooling for now. I think the easiest thing to do will be getting a little AIO liquid CPU cooler and strapping the block/pump to the GPU.

Does the R9 290 have a recessed GPU like the 7950/7970? In other words, will I need a copper shim in order to mount an AIO liquid cooling pump/block to it?


----------



## RAFFY

Quote:


> Originally Posted by *broken pixel*
> 
> Even Newegg jumped the prices of the out-of-stock 290Xs; the Asus 290X with BF4 is 600 bones. I bought my two XFX 290Xs for $549.00 last week and they are $599.00 now. I guess 290X GPUs are limited edition now, lulz.
> 
> About the VRMs I am not sure. What are the max VRM temps for the 290X? I would try keeping them 5-10C below the max if possible.
> 
> If you can, position a decent 120mm fan somehow. I use some heavy-duty 120mm fans and lay them on top of my GPUs near the power connectors.


Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Newegg and Amazon raised prices of the 290 and 290x


Sweet! Glad I ordered my third 290x for $549.99 then!


----------



## ImJJames

Flashed r9 290 to asus bios, I don't see any voltage options in asus gpu tweak


----------



## iamhollywood5

Quote:


> Originally Posted by *ImJJames*
> 
> Flashed r9 290 to asus bios, I don't see any voltage options in asus gpu tweak


You've got to go into the Settings tab, then the "Tune" tab, then the "Display Priority" tab; under that you'll see various checkboxes, and you have to check "GPU Voltage."


----------



## iamhollywood5

So I guess the R9 290 series cards just have problems with COD Ghosts? I've been noticing artifacts/glitches while I play the game on the stock 290 BIOS at the stock 947 MHz clock with the fans cranked up to keep the core under 80C. I ran OCCT for 15 minutes straight with the stock BIOS and clocks and had no errors. Last night, with the 290X BIOS at 1000 MHz, I ran 15 minutes of Valley and a half-hour of Tomb Raider with no artifacts or crashes, and I've played 3-4 hours of BF4 at 1100 MHz on the 290X BIOS with absolutely no problems. So my card can't be the problem; it must be the game or the drivers... weird, because I didn't notice any of those artifacts playing the game on my 7970 at 1300 MHz...


----------



## pounced

Been playing around with clocks and I'm sitting at 1175 MHz core and 1400 MHz memory, playing Battlefield 4 the past 3 days without crashing.

I added 100 mV to the voltage, so I think that's why it holds stable. Anything over 1075 MHz on stock voltage would crash.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *pounced*
> 
> Been playing around with clocks and I'm sitting at 1175 MHz core and 1400 MHz memory, playing Battlefield 4 the past 3 days without crashing.
> 
> I added 100 mV to the voltage, so I think that's why it holds stable. Anything over 1075 MHz on stock voltage would crash.


Next step, flash to an ASUS BIOS, crank the voltage, and break 1200 MHz.

What's your ASIC?


----------



## ImJJames

*1250 Clock / 1500 Memory* Max temp 73C reference cooler


----------



## binormalkilla

Quote:


> Originally Posted by *the9quad*
> 
> I dont understand, how is it borked? Seems to work great to me, no stutter. I didn't expect to see the exact same score with FP on/off, that would be ridiculous to think that. Not trying to be argumentative or a smart alec, btw is it borked in another way?
> 
> It does make me wish I would have turned it off when I ran 3dmark with the WHQL drivers to get a better spot in the HOF, but I can't be bothered now, because I went back to the beta's due to flickering textures in BF4


So the latest betas fixed the flickering? I'm getting sick of it myself...


----------



## iamhollywood5

Quote:


> Originally Posted by *ImJJames*
> 
> *1250 Clock / 1500 Memory* Max temp 73C reference cooler


Impressive!! ASIC?


----------



## ImJJames

Quote:


> Originally Posted by *iamhollywood5*
> 
> Impressive!! ASIC?


77%


----------



## ImJJames

*1260 Clock / 1500 Memory* Max temp 68C

http://www.3dmark.com/3dm11/7625352


----------



## chiknnwatrmln

^That is an impressive overclock. You sir got very lucky with that card.


----------



## TheSoldiet

Getting an Antec Kuhler 620 and Alpenfohn DRAM/VRAM heatsinks, and I'm gonna zip-tie the cooler on. Waiting for Christmas though.


----------



## Durvelle27

How do I OC with ASUS GPU Tweak?


----------



## chiknnwatrmln

First you've got to have your card running an ASUS BIOS.

Then download and install GPU Tweak from ASUS, and overclock just like in MSI AB.

GPU Tweak is built on the same software as MSI AB and EVGA Precision X, if I'm not mistaken; just different skins and voltage limits.


----------



## Kelwing

OK, I'm officially done with this 290. Sick of messing with this card when all I want to do is sit down and relax with a game, not deal with CTDs, black screens, and hard locks that require a reboot. So back to the green team. Got a great deal on two 770s and life is peaceful again.


----------



## Durvelle27

What's power target?


----------



## GioV

Quote:


> Originally Posted by *ImJJames*
> 
> *1250 Clock / 1500 Memory* Max temp 73C reference cooler


Are you overclocking with Asus bios or MSI AB?


----------



## crun

Hey.

Can I change the reference cooler on the SAPPHIRE Rad R9 290 4096MB DDR5/512 D/H PCI-E (VGASAPATI0457) without voiding the warranty? Is it voltage locked? Anything I should know about this card?

With all the cryptocoin fuss I change GPUs all the time, lol


----------



## -YC-

Quote:


> Originally Posted by *Slomo4shO*
> 
> Yes, the camera on the Nexus 4 sucks, too lazy to find my camera


People are having problems with BF4 R9 290 CF; are you? What's your FPS at 1080p? That is, if you play BF4.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *crun*
> 
> Hey.
> 
> Can I change the reference cooler on SAPPHIRE Rad R9 290 4096MB DDR5/512 D/H PCI-E (VGASAPATI0457) without voiding the warranty? Is it voltage locked? Anything I should know about this card?
> 
> With all the cryptocoin fuss I change GPUs all the time, lol


If you remove the stock cooler, even just to change the TIM, it will void the warranty if they find out.

No reference 290s are voltage locked.
Quote:


> Originally Posted by *Durvelle27*
> 
> What's power target?


It's the percentage by which the card's power limit can be raised above its default draw. For overclocking purposes, set it to +50%.
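Put differently, the slider is just a multiplier on the card's default board-power limit. A minimal sketch of the arithmetic, assuming a 250 W default PowerTune limit purely for illustration (the real figure isn't given in this thread):

```python
# Illustrative power-target arithmetic; the 250 W default is an assumption.
BOARD_POWER_LIMIT_W = 250  # assumed stock PowerTune limit, not an official spec

def effective_limit(base_watts: float, power_target_pct: float) -> float:
    """Power the card may draw before PowerTune throttles the core clock."""
    return base_watts * (1 + power_target_pct / 100)

print(effective_limit(BOARD_POWER_LIMIT_W, 50))  # +50% -> 375.0 W
print(effective_limit(BOARD_POWER_LIMIT_W, 0))   # stock -> 250.0 W
```

With the slider at +50% the card should rarely hit the power limit during an overclocking session, so any throttling you still see is likely thermal.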


----------



## brazilianloser

Quote:


> Originally Posted by *-YC-*
> 
> People are having problems with BF4 R9 290 CF; are you? What's your FPS at 1080p? That is, if you play BF4.


On my Asus 290s I have stopped attempting to play, because out of the dozen or so games I have tried since going CrossFire, it is the only one that, even with ULPS disabled, uses just one card... giving me no performance gain from having two. All other games are fine, but BF4 just doesn't want to play nice.
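Since ULPS comes up in so many of these CrossFire reports: it is toggled with a registry value on the AMD display-adapter keys. A hedged sketch of the .reg fragment (the `{4d36e968-...}` class GUID is the standard Windows display-adapter class, but the numbered `0000` subkey varies per machine, so check each numbered subkey that actually contains an `EnableUlps` value before importing):

```
Windows Registry Editor Version 5.00

; Set EnableUlps to 0 to stop the secondary card from being put to sleep.
; The 0000 subkey index below is an assumption -- edit it to match your system.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Tools like MSI Afterburner can flip the same setting from their options, which is less error-prone than editing the registry by hand.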


----------



## bir86

XFX 290 unlocked to 290X
ASIC 73%

1270/1400 @ 1.5 vCore with PT1 BIOS - stock fan @ 100%


----------



## crun

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> If you remove the stock cooler, even to change the TIM, it will void the warranty if they find out.


This is disappointing


----------



## RAFFY

Quote:


> Originally Posted by *crun*
> 
> This is disappointing


This seems like common sense lol


----------



## Durvelle27

My never ending overclocking quest. 1190/1450


----------



## hotrod717

Quote:


> Originally Posted by *Durvelle27*
> 
> My never ending overclocking quest. 1190/1450


That's really low with those clocks. What is your physics score with that 8350?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *alancsalt*
> 
> Fixed that for you!


LoooL

Gonna have one of those later

Quote:

> Originally Posted by *ImJJames*
> 
> 1225/1500 Stock bios
> 
> http://www.3dmark.com/fs/1265096

..............

Quote:

> Originally Posted by *hotrod717*
> 
> Not only that, but a 2011 CPU will inherently score better because of physics. I score 20071 in graphics but only 12065 in physics, for a total of P16934. Home-Cinema, for example, has lower clocks and graphics (18359), but his 6-core physics (15174) pushes him past me overall: P17127. I definitely have to break down and get Firestrike. It's the cheaper alternative to getting a 3930K or 4930K.

Theory: quaddies for gaming and hexys for benching

Once you do go hexy you won't look back. Mind you, I am running my 3930K with a 2GHz overclock and 2428 on the mem.

Quote:

> Originally Posted by *Durvelle27*
> 
> My Sapphire R9 290 arrived and I got the Accelero Xtreme III installed along with some copper heatsinks
> 
> 
> Spoiler: Warning: Spoiler!

WHOA, gnarly man

Quote:

> Originally Posted by *bir86*
> 
> XFX 290 unlocked to 290X
> ASIC 73%
> 
> 1270/1400 @ 1.5 vCore with PT1 BIOS - Stock fan @100%

Now that's a kick-ass FS score, AWESOME.

1.5vc!? Holy Overvolt Batman


----------



## Durvelle27

Quote:


> Originally Posted by *hotrod717*
> 
> That's really low with those clocks. What is your physics score with that 8350?


I actually have the 3rd highest score with an FX-8350 + R9 290: 5 GHz, physics 8935.


----------



## broken pixel

Quote:


> Originally Posted by *brazilianloser*
> 
> Asus does, both my cards have it. Not sure on the others.

My XFX 290Xs have 2 warranty-void stickers. I know for sure MSI still RMAs with the stickers tampered with; I spoke with two MSI techs and have RMA'd two R7950s with tampered stickers.


----------



## magicase

Does anyone have a 290 Tri CF setup? Any issues?


----------



## ImJJames

Quote:


> Originally Posted by *bir86*
> 
> XFX 290 unlocked to 290X
> ASIC 73%
> 
> 1270/1400 @ 1.5 vCore with PT1 BIOS - Stock fan @100%


What's your rated PSU?


----------



## Slomo4shO

Quote:


> Originally Posted by *ImJJames*
> 
> *1250 Clock / 1500 Memory* Max temp 73C reference cooler


Impressive

I think this is the highest I have seen it with the stock cooler.


----------



## ImJJames

Quote:


> Originally Posted by *GioV*
> 
> Are you overclocking with Asus bios or MSI AB?


Asus


----------



## ImJJames

Does the PT1 BIOS work with MSI AB and GPU Tweak? Thinking of flashing to PT1.


----------



## ImJJames

The Asus BIOS is weird: in GPU Tweak I have it maxed out at 1.4V, but software monitors say it's only reaching 1.34V.


----------



## Slomo4shO

Quote:


> Originally Posted by *-YC-*
> 
> People are having problems with BF4 R9 290 CF; are you? What's your FPS at 1080p? That is, if you play BF4.


I have since returned those two Asus cards and picked up four Sapphire 290s, which I have unlocked to 290X. I haven't played BF4 yet, so I can't comment on BF4-specific benchmarks. I am waiting on my waterblocks; four of these sound like a turbine, and I would rather not upset the wife by starting up the jets to benchmark, so I will wait patiently and stick to the PS4 until the blocks arrive.

They throttled like crazy at reference clocks and 55% fan speed:
http://www.3dmark.com/fs/1246693


----------



## DeadlyDNA

Quote:


> Originally Posted by *magicase*
> 
> Does anyone have a 290 Tri CF setup? Any issues?


I've been running tri-fire and now quad; there have been small issues here and there. Why do you ask?


----------



## bir86

Quote:


> Originally Posted by *ImJJames*
> 
> What's your rated PSU?


Corsair HX1000W
Quote:


> Originally Posted by *ImJJames*
> 
> Does PT 1 bios work with msi AB and GPU tweak? Thinking of flashing to PT 1


Only works with GPUTweak.
Quote:


> Originally Posted by *ImJJames*
> 
> Asus bios is weird, in GPU tweak I have it maxed out @ 1.4, but with software monitors it says its only maxing out at 1.34Volts


That's called vdroop. My 1.5 vCore is actually 1.45 or something.
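The gap between the set and measured voltage can be expressed as a droop percentage. A quick sketch using the numbers from the posts above (the linear model is a simplification; real droop varies with load):

```python
# Vdroop: the VRM delivers less than the set voltage once the core is loaded.
def vdroop_pct(set_v: float, measured_v: float) -> float:
    """Percentage lost between the voltage you set and the loaded voltage."""
    return (set_v - measured_v) / set_v * 100

print(round(vdroop_pct(1.40, 1.34), 1))  # 1.40 V set, 1.34 V read -> 4.3 % droop
print(round(vdroop_pct(1.50, 1.45), 1))  # 1.50 V set, 1.45 V read -> 3.3 % droop
```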


----------



## crun

Since I've sold both my 7950 and (a few minutes ago) my R9 280X, I'm pretty much forced to pick a new GPU.

Could anyone tell me anything more about the SAPPHIRE Rad R9 290 4096MB DDR5/512 D/H PCI-E (VGASAPATI0457)? Any nuances I should know of? Overall, is it a good choice with the stock cooler and a well-ventilated case?

I could get it for the price I've sold my R9 280X for, but I need to decide fast, early tomorrow probably, or they will all sell out.


----------



## HOMECINEMA-PC

Get it, don't delay


----------



## pounced

This is with a 2500K and a 290X at 1200MHz/1475MHz. Is this a good score/bench? I'm pretty new to this. Also, how does one flash to the Asus BIOS? I hear I can get more voltage; I want to run 1250 MHz stable.


----------



## Forceman

Quote:


> Originally Posted by *pounced*
> 
> This is with a 2500K and a 290X at 1200MHz/1475MHz. Is this a good score/bench? I'm pretty new to this. Also, how does one flash to the Asus BIOS? I hear I can get more voltage; I want to run 1250 MHz stable.


What speed is the 2500K at? I got 70 FPS with about the same GPU clocks, but that was with a 4770K at 4.4.


----------



## pounced

Quote:


> Originally Posted by *pounced*
> 
> This is with a 2500K and a 290X at 1200MHz/1475MHz. Is this a good score/bench? I'm pretty new to this. Also, how does one flash to the Asus BIOS? I hear I can get more voltage; I want to run 1250 MHz stable.


The CPU is clocked at 4.7 GHz; left that out, sorry.


----------



## HardwareDecoder

Quote:


> Originally Posted by *pounced*
> 
> This is with a 2500K and a 290X at 1200MHz/1475MHz. Is this a good score/bench? I'm pretty new to this. Also, how does one flash to the Asus BIOS? I hear I can get more voltage; I want to run 1250 MHz stable.


I hope my 290X does at least those clocks under water or it's gonna get RMA'd.

Did you do that on air or water?

My card comes in Monday and the block Tuesday, so I'll test it a bunch Monday night when I get home from school and see if it's a dud before I bother waterblocking it.


----------



## pounced

Quote:


> Originally Posted by *HardwareDecoder*
> 
> I hope my 290x does atleast those clocks under water or it's gonna get rma'd.
> 
> Did you do that on air or water?
> 
> my card comes in monday, block tuesday so i'll test it a bunch monday night when I get home from school, see if it is a dud before I bother waterblocking it.


That was on air with the stock cooler, and it was the third bench I did, so the card was warmed up. I have the power target set to 150% in GPU Tweak, but I'm not sure if it's actually getting that because I'm not running the Asus BIOS.

Edit: The card will sit at 80-85C with ~60% fan speed, so it's not too horrible.


----------



## Slomo4shO

Does anyone know of any merchant with blocks in stock? FrozenCPU is suggesting after Christmas for my order.


----------



## brazilianloser

Quote:


> Originally Posted by *Slomo4shO*
> 
> Does anyone know of any merchant with blocks in stock? FrozenCPU is suggesting after Christmas for my order.


http://www.frozencpu.com/products/21663/ex-blc-1567/EK_Radeon_R9-290X_VGA_Liquid_Cooling_Block_-_Acetal_Nickel_EK-FC_R9-290X_-_AcetalNickel.html?tl=g57c599s2078

Weird because they show about 6 in stock... just switch to those...


----------



## iamhollywood5

Quote:


> Originally Posted by *Slomo4shO*
> 
> Does anyone know of any merchant with blocks in stock? FrozenCPU is suggesting after Christmas for my order.


They told you after Christmas?? I ordered my block as part of an order on Cyber Monday and it's still in the "packaging" phase. I called them up and they said it would ship early next week; I hope they weren't just lying to get me off the phone... If I really have to wait until after Christmas to get this ear-splitting card under water, I'm going to lose my patience.


----------



## Kenshiro 26

Quote:


> Originally Posted by *iamhollywood5*
> 
> They told you after Christmas?? I ordered my block as part of an order on Cyber Monday and it's still in "packaging" phase, I called them up and they said it would be shipped out early next week, I hope they weren't just lying to get me off the phone... If I really have to wait until after Christmas to get this ear-splitting card under water I'm going to lose my patience.


In the same boat, been waiting about 3 weeks for my nickel & acrylic block. Got everything put together in my loop except for this block.


----------



## HOMECINEMA-PC

Geeze this club posts quick


----------



## tsm106

Quote:


> Originally Posted by *Kenshiro 26*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iamhollywood5*
> 
> They told you after Christmas?? I ordered my block as part of an order on Cyber Monday and it's still in "packaging" phase, I called them up and they said it would be shipped out early next week, I hope they weren't just lying to get me off the phone... If I really have to wait until after Christmas to get this ear-splitting card under water I'm going to lose my patience.
> 
> 
> 
> In the same boat, been waiting about 3 weeks for my nickel & acrylic block. Got everything put together in my loop except for this block.

Oh, cancel that, you do not want nickel. It will corrode. There's a guy on here who had his corrode within 7 days of use.

I have one last acetal copper block left to sell. It has a backplate and is fitted with ultra extremes.

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scorpion49*
> 
> Apparently my 290 waterblocks are pushed back to the end of the month too. Anyone ever cancel from aquatuning? I should have just got the EK but they're so damn ugly.
> 
> 
> 
> I like them so much better than the palm trees with puny vrm cooling.

Quote:


> Originally Posted by *Emmett*
> 
> Hmm, 2 seperate loops for each card. own pumps, 360 Rad.
> Both using Distilled water with kill coil.
> 
> Anyone seen this before?
> 
> This is after only 7-9 days of setting up this cards loop.
> 
> 
> 
> 
> 
> 
> 
> The brown is where the acrylic lies on the nickel, and there is galvanic-looking corrosion on the ends of the fins over the core
> (that is where I have the inlet). Can't believe it happened so fast...
> 
> the other loop looks fine and clean, and has been running much longer.
> 
> No aluminum in the loops, but I have a copper/brass/nickel/silver kill coil.


Where have you been for the last two or three years? Silver is out. If you use nickel, at least use a coolant. EK nickel is a joke.


----------



## rv8000

Finally got this stupid 290 to break a 13k GPU score in Firestrike @ 1230/1500; can't go any higher due to VRM temps D:

http://www.3dmark.com/3dm/1806045


----------



## pounced

Could anyone direct me to where I can learn how to flash my BIOS from Sapphire to Asus so I can get some more juice out of this card? Thanks in advance.


----------



## the9quad

Quote:


> Originally Posted by *pounced*
> 
> Could anyone direct me to where I can learn how to flash my BIOS from Sapphire to Asus so I can get some more juice out of this card? Thanks in advance.


http://forums.overclockers.co.uk/showthread.php?t=18552408
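For anyone following that guide, the overall shape of the procedure is a command-line ATIFlash session. This is a hedged from-memory sketch, not the guide's exact steps; `asus290x.rom` is a placeholder filename, the adapter index may differ on your system, and flashing the wrong image can brick the card, so follow the linked guide for the actual ROM and BIOS-switch position:

```
atiflash -i                    :: list adapters and note your card's index (0 here)
atiflash -s 0 backup.rom       :: save the current Sapphire BIOS first
atiflash -f -p 0 asus290x.rom  :: force-program the Asus image, then reboot
```

Always keep the backup so you can flash back (or flip to the second BIOS switch position) if anything goes wrong.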


----------



## Slomo4shO

Quote:


> Originally Posted by *brazilianloser*
> 
> http://www.frozencpu.com/products/21663/ex-blc-1567/EK_Radeon_R9-290X_VGA_Liquid_Cooling_Block_-_Acetal_Nickel_EK-FC_R9-290X_-_AcetalNickel.html?tl=g57c599s2078
> 
> Weird because they show about 6 in stock... just switch to those...


I placed the order on Monday, and I have actually spoken with them on the phone and asked them to ship whatever block comes in stock... I can't even contact them until Monday.

Quote:


> Originally Posted by *iamhollywood5*
> 
> They told you after Christmas?? I ordered my block as part of an order on Cyber Monday and it's still in "packaging" phase, I called them up and they said it would be shipped out early next week, I hope they weren't just lying to get me off the phone... If I really have to wait until after Christmas to get this ear-splitting card under water I'm going to lose my patience.


I spoke with them on Thursday and was advised that my order for the full acetal blocks will likely not ship in the next two weeks, which would put delivery past Christmas...


----------



## HardwareDecoder

Quote:


> Originally Posted by *Slomo4shO*
> 
> I placed the order on Monday and I have actually spoke with them on the phone and asked them to ship me whatever block that comes in stock... I can't even contact them until Monday.
> I spoke with them on Thursday and was advised that my order for the full acetal blocks will likely not ship in the next two weeks, which would put delivery past Christmas...


The EK nickel/acetal block?

They told me it might be 1-2 weeks and then it shipped in like 3 days. It will be here Tuesday.


----------



## tsm106

Quote:


> Originally Posted by *Slomo4shO*
> 
> Quote:
> 
> 
> 
> Originally Posted by *brazilianloser*
> 
> http://www.frozencpu.com/products/21663/ex-blc-1567/EK_Radeon_R9-290X_VGA_Liquid_Cooling_Block_-_Acetal_Nickel_EK-FC_R9-290X_-_AcetalNickel.html?tl=g57c599s2078
> 
> Weird because they show about 6 in stock... just switch to those...
> 
> 
> 
> I placed the order on Monday and I have actually spoke with them on the phone and asked them to ship me whatever block that comes in stock... I can't even contact them until Monday.
> 
> Quote:
> 
> 
> 
> Originally Posted by *iamhollywood5*
> 
> They told you after Christmas?? I ordered my block as part of an order on Cyber Monday and it's still in "packaging" phase, I called them up and they said it would be shipped out early next week, I hope they weren't just lying to get me off the phone... If I really have to wait until after Christmas to get this ear-splitting card under water I'm going to lose my patience.
> 
> 
> I spoke with them on Thursday and was advised that my order for the full acetal blocks will likely not ship in the next two weeks, which would put delivery past Christmas...

When I bought my blocks, FCPU strung me along for two weeks. When I finally called them, it was "next Monday," and then "next Friday," lol. I said eff you and forced a return, and waited for PPCS. PPCS ships when they have stock. FCPU does not; they string you along while taking your money. They have an order list, so it also depends on where you are on the list.


----------



## -YC-

Quote:


> Originally Posted by *brazilianloser*
> 
> On my Asus 290s I have stopped attempting to play, because out of the dozen or so games I have tried since going CrossFire, it is the only one that, even with ULPS disabled, uses just one card... giving me no performance gain from having two. All other games are fine, but BF4 just doesn't want to play nice.


Well, I play BF4 98% of the time I am gaming.

I am lost; I see people getting 110 avg in BF4 @ 2560x1440 with two cards, and some actually get worse FPS than a single card lol


----------



## brazilianloser

Quote:


> Originally Posted by *tsm106*
> 
> When I bought my blocks, Fcpu strung me along for two weeks. When I finally called them, it was next Monday, and then next Friday lol. I said eff you and forced a return. And waited for PPCS. PPCS ships when they have stock. Fcpu does not, they string you along taking your money. They have an order list, so it also depends on where you are on the list.


As you yourself explained, both sites operate differently... FCPU will take orders even when they do not have stock (pre-order), so depending on how many folks are ahead of you on the list, you might have to wait quite a while, since those EK blocks have sold like crazy... PPC will only take orders when they actually have it in stock. Hence why you get yours faster: if you are able to order, that means they have it.

Anyways, I have heard stories about FrozenCPU, and most of them weren't very nice.


----------



## Durvelle27

Striving for 18k graphics


----------



## Slomo4shO

Quote:


> Originally Posted by *tsm106*
> 
> When I bought my blocks, Fcpu strung me along for two weeks. When I finally called them, it was next Monday, and then next Friday lol. I said eff you and forced a return. And waited for PPCS. PPCS ships when they have stock. Fcpu does not, they string you along taking your money.


That is why I am looking for something in stock.

On a side note, I will have nickel-plated quick releases and chrome fittings in my loop. I was planning on DI water with a biocide like methylisothiazolinone. Considering that the rads are copper with steel and bronze casings, should I be worried about corrosion?
Quote:


> Originally Posted by *HardwareDecoder*
> 
> the ek nickle/acetal block?


Nah the acetal full cover block.


----------



## brazilianloser

Quote:


> Originally Posted by *-YC-*
> 
> Well i play BF4 98% of the time i am gaming
> 
> I am lost i see people getting 110avg. on BF4 @ 2560x1440p with the two cards, and some actually get worse fps than single card lol


Seems like there is a bug, or whatever you want to call it... some folks with dual 290/290X setups are having the game use only a single card. I am one of those guys. Yes, ULPS is disabled, and yes, other games use both cards... BF4, on the other hand, is gimped to a single card on my setup. I have stopped playing altogether until there is some sort of fix.


----------



## tsm106

Quote:


> Originally Posted by *brazilianloser*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> When I bought my blocks, Fcpu strung me along for two weeks. When I finally called them, it was next Monday, and then next Friday lol. I said eff you and forced a return. And waited for PPCS. PPCS ships when they have stock. Fcpu does not, they string you along taking your money. They have an order list, so it also depends on where you are on the list.
> 
> 
> 
> As you yourself explained both sites operate differently... FC will take orders even when they do not have the stock (Pre-Order) so depending on how many folks are ahead of you on the get your block list you might have to wait quite a while since those EK blocks have sold like crazy... PPC will only take orders when they actually have it in stock. Hence why you get yours faster, if you are able to order that means they have it.
> 
> Anyways I have heard stories about FrozenCpu and most of them weren't very nice.

Yeap. And FCPU will try their hardest not to let you have a refund of money they have no right to. That sort of shenanigans really ticks me off. Also, since I'm ranting, the FREE holiday shipping is BS. That shipping speed is two weeks, so don't get free shipping unless you have a lot of time on your hands.


----------



## brazilianloser

Quote:


> Originally Posted by *Slomo4shO*
> 
> That is why I am looking for something in stock


Just FYI, that wasn't me saying that. Not sure how the quote made it seem like me and not the other guy who said it... but who knows.


----------



## DeadlyDNA

Quote:


> Originally Posted by *brazilianloser*
> 
> Seems like there is a bug or whatever you want to call it... some folks with double 290/290x are having the game only put use on a single card. I am being one of these guys. Yes ULPS is disabled and Yes other games use both cards
> 
> 
> 
> 
> 
> 
> 
> ... BF4 on the other hand is gimped to a single card on my setup. I have stopped playing altogether until there is some sort of fix.


Have you tried making a custom CrossFire profile in CCC for BF4 and trying different settings like 1x1 or AFR? Not sure if it will make a difference, but maybe it will work.


----------



## grunion

Both the PT1 and PT3 BIOSes lock my core speed at 1GHz.
Any thoughts?


----------



## DeadlyDNA

I added my 4th card on the stock cooler; I forgot to order a fitting for my WB, doh... Anyways, I'm hitting 1.3k to 1.5k at the wall plug now.....


----------



## brazilianloser

Quote:


> Originally Posted by *DeadlyDNA*
> 
> have you tried making a custom crossfire profile in CCC for bf4and maybe try different settings like 1x1, or AFR? Not sure if it will make a difference but maybe it will work.


I am new to the AMD scene, these dual 290s being the first AMD cards I have owned (if you disregard the 280X I had for a week before returning it for a 290)... But thank you, I will give that a try.


----------



## HardwareDecoder

Quote:


> Originally Posted by *tsm106*
> 
> Yeap. And Fcpu will try their hardest to not let you have a refund on your money that they have no right to. That sort of shenanigans really ticks me off. Also, since I'm ranting, the FREE Holiday shipping is BS. That shipping speed is two weeks, so don't get free shipping unless you have a lot of time on your hands.


I've ordered twice from FrozenCPU so far and both times they were super legit.....

The first time, I had no idea what I was doing with watercooling and thought the pump had gone bad; they got me a replacement immediately.

The second time, I ordered the copper block and found out it was too ugly and not full-cover. I called and they got it straightened out immediately (they said the full-cover nickel/acetal block was gonna be 1-2 weeks, but it shipped in like 3 days). Also, when you call you actually speak with someone immediately, which is pretty amazing nowadays.

As far as the shipping being two weeks: I live in PA and they are in NY; they have shipped UPS both times and the stuff comes super fast. This includes the holiday shipping... they just shipped me something and it will be here Tuesday; obviously it would have been here a lot sooner except for the weekend.

I have no idea what your experience has been, and I'm not trying to downplay any of you guys' complaints, just wanted to offer a positive light.

tl;dr I've had a great experience with FrozenCPU both times; got 7% off and free shipping on my last order.

Quote:


> Originally Posted by *Slomo4shO*
> 
> That is why I am looking for something in stock
> 
> On a side note, I will have nickel-plated quick releases and chrome fittings in my loop. I was planning on DI water with a biocide like methylisothiazolinone. Considering that the rads are copper with steel and bronze casings, should I be worried about corrosion?
> Nah the acetal full cover block.


Oh okay, I think they are very close to the same thing.

BTW, the acetal/nickel version is like $12 more and is in stock now, just a heads up.

http://www.frozencpu.com/products/21663/ex-blc-1567/EK_Radeon_R9-290X_VGA_Liquid_Cooling_Block_-_Acetal_Nickel_EK-FC_R9-290X_-_AcetalNickel.html?tl=g57c599s2078


----------



## Jack Mac

Quote:


> Originally Posted by *Durvelle27*
> 
> Striving for 18k graphics


Disable tessellation and you'll be over 19k.


----------



## tsm106

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Yeap. And Fcpu will try their hardest to not let you have a refund on your money that they have no right to. That sort of shenanigans really ticks me off. Also, since I'm ranting, the FREE Holiday shipping is BS. That shipping speed is two weeks, so don't get free shipping unless you have a lot of time on your hands.
> 
> 
> 
> I've ordered twice from FrozenCPU so far, and both times they were super legit.
> 
> The first time, I had no idea what I was doing with watercooling and thought the pump had gone bad; they got me a replacement immediately.
> 
> The second time, I ordered the copper block and found out it was too ugly and not full-cover. I called, and they got it straightened out immediately (they said the full-cover nickel/acetal block was going to take 1-2 weeks, but it shipped in about 3 days). Also, when you call you actually speak with someone right away, which is pretty amazing nowadays.
> 
> As far as the shipping being two weeks: I live in PA and they are in NY; they have shipped UPS both times, and the stuff arrives super fast. This includes the holiday shipping... they just shipped me something and it will be here Tuesday; it obviously would have been here a lot sooner if not for the weekend.
> 
> *I have no idea what your experience has been, and I'm not trying to downplay any of your complaints, just wanted to offer a positive perspective.*
> 
> tl;dr: I've had a great experience with FrozenCPU both times, and got 7% off and free shipping on my last order.
> 
> Quote:
> 
> 
> 
> Originally Posted by *Slomo4shO*
> 
> That is why I am looking for something in stock
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On a side note, I will be having nickel-plated quick releases and chrome fittings in my loop. I was planning on DI water with a biocide like methylisothiazolinone. Considering that the rads are copper with steel and bronze casings, should I be worried about corrosion?
> Nah the acetal full cover block.
> 
> 
> oh okay I think they are very close to the same thing.

Yeah, you don't know anything because you haven't been doing this as long. FCPU and even PPCS may be the largest retailers, but they are not the best retailers. You are not dealing with Amazon here; these guys are not customer-centric, and don't ever forget that. And BTW, I've spent thousands of dollars over the last five years at both FCPU and PPCS, so I know their practices intimately.


----------



## HardwareDecoder

Quote:


> Originally Posted by *tsm106*
> 
> Yeah, you don't know anything because you haven't been doing this as long. FCPU and even PPCS may be the largest retailers, but they are not the best retailers. You are not dealing with Amazon here; these guys are not customer-centric, and don't ever forget that. And BTW, I've spent thousands of dollars over the last five years at both FCPU and PPCS, so I know their practices intimately.


So who is the best retailer for watercooling stuff, then? I know Amazon doesn't sell much WC stuff, or the prices suck. Also, it's a bit rude to say I don't know anything; I have made purchases from the company you are bashing. If you don't like them so much, why do you buy anything from them?

I'm not gonna argue any further, since it's clear how you feel and you're entitled to your opinion. It just makes no sense to me that you'd keep spending money with a company you clearly don't like.

Again: I'm really not trying to be confrontational here.


----------



## ImJJames

My new best so far

*1270 Clock, 1500 Memory, Stock cooler, Max temp 75C*
http://www.3dmark.com/3dm11/7626025


----------



## -YC-

Quote:


> Originally Posted by *brazilianloser*
> 
> Seems like there is a bug, or whatever you want to call it... some folks with dual 290/290X setups are having the game only use a single card. I am one of those guys. Yes, ULPS is disabled, and yes, other games use both cards
> 
> 
> 
> 
> 
> 
> 
> ... BF4 on the other hand is gimped to a single card on my setup. I have stopped playing altogether until there is some sort of fix.


But will there be a fix? I will wait, but what if there isn't one? Far Cry 3 still suffers from many issues that will never be fixed. Would there be a huge performance difference if I bought Zotac 4 GB GTX 770s in SLI instead? It's $100 cheaper too. My CPU is an i7-3770K; will Mantle work with an Intel CPU and AMD cards? And when will it be released? Maybe there isn't a lot of difference between the 770 and the R9 290 now, but once Mantle comes out... What would you do? Keep in mind I have been without a gaming PC for 30 days! I am going crazy, so I really don't want to wait, but I also don't want to rush and not get the performance I want out of BF4.


----------



## tsm106

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Yeah, you don't know anything because you haven't been doing this as long. FCPU and even PPCS may be the largest retailers, but they are not the best retailers. You are not dealing with Amazon here; these guys are not customer-centric, and don't ever forget that. And BTW, I've spent thousands of dollars over the last five years at both FCPU and PPCS, so I know their practices intimately.
> 
> 
> 
> So who is the best retailer for watercooling stuff then? I know amazon doesn't sell that much WC stuff or the prices suck

The most standup retailers are Jab-tech and Sidewinder. However, they are smaller and have a minuscule stock selection compared to the big two, FCPU and PPCS, so we are forced to deal with FCPU and PPCS. Whenever I can, I try to buy all my stuff from Jab-tech and Sidewinder, but unfortunately you can't build a loop to your heart's content that way.

Haha, on your edit. It's clear you have no idea about the WC hobby.


----------



## r0cawearz

Guys, I think I completely ruined my card just now. I was reseating my heatsink after installing the stock VRM heatsink and ran out of paste, so I ended up half-assing it with some old AS5. I tried to run Valley after reinstalling it and my screen went completely black, so I took the heatsink apart to check on it and found a crack on the die. Am I completely screwed? Can anyone help me out?


----------



## HardwareDecoder

Quote:


> Originally Posted by *tsm106*
> 
> The most standup retailers are Jab-tech and Sidewinder. However, they are smaller and have a minuscule stock selection compared to the big two, FCPU and PPCS, so we are forced to deal with FCPU and PPCS. Whenever I can, I try to buy all my stuff from Jab-tech and Sidewinder, but unfortunately you can't build a loop to your heart's content that way.
> 
> Haha, on your edit. It's clear you have no idea about the WC hobby.


You really do come off as a bit of an elitist jerk; I mentioned I was very new to watercooling. Everyone starts somewhere. Anyway, this will definitely be my last reply to you.


----------



## magicase

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I've been running tri-fire and now quad-fire; there have been small issues here and there. Why do you ask?


Thinking of going Tri CF once the aftermarket coolers are released.


----------



## Durvelle27

Quote:


> Originally Posted by *Jack Mac*
> 
> Disable tessellation and you'll be over 19k.


Without Tweaks


----------



## King4x4

So are the black screens also happening for CrossFire, tri-fire, and quad-fire users?


----------



## HardwareDecoder

Quote:


> Originally Posted by *r0cawearz*
> 
> Guys, I think I completely ruined my card just now. I was reseating my heatsink after installing the stock VRM heatsink and ran out of paste, so I ended up half-assing it with some old AS5. I tried to run Valley after reinstalling it and my screen went completely black, so I took the heatsink apart to check on it and found a crack on the die. Am I completely screwed? Can anyone help me out?


Wow, you are screwed :-(, sorry. You were reseating which heatsink, a waterblock or the stock one?

I mean, I guess you could put the stock heatsink back on and try to RMA it, and if they say anything about a cracked die, just claim you have no idea: you never took it apart, so it must have overheated severely.

That's a bit dishonest, but oh well... I'd do it, heh.


----------



## ImJJames

New valley best







, yes stock cooler


----------



## r0cawearz

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Wow, you are screwed :-(, sorry. You were reseating which heatsink, a waterblock or the stock one?
> 
> I mean, I guess you could put the stock heatsink back on and try to RMA it, and if they say anything about a cracked die, just claim you have no idea: you never took it apart, so it must have overheated severely.
> 
> That's a bit dishonest, but oh well... I'd do it, heh.


I was reseating a Gelid Icy rev. 2, and I had taken the stock cooler apart to cool the VRMs like some other users here (which did work out really well). But my core temps were pretty bad, so I had to reseat it, which is when I ran out of TIM. My card's an XFX; would they do repairs or a new card for a fee? Or is there no way of salvaging this at all? I knew I should've just waited for better paste..


----------



## bir86

Quote:


> Originally Posted by *ImJJames*
> 
> My new best so far
> 
> *1270 Clock, 1500 Memory, Stock cooler, Max temp 75C*
> http://www.3dmark.com/3dm11/7626025


Try 3DMark 13.


----------



## HardwareDecoder

Quote:


> Originally Posted by *r0cawearz*
> 
> I was reseating a Gelid Icy rev. 2, and I had taken the stock cooler apart to cool the VRMs like some other users here (which did work out really well). But my core temps were pretty bad, so I had to reseat it, which is when I ran out of TIM. My card's an XFX; would they do repairs or a new card for a fee? Or is there no way of salvaging this at all? I knew I should've just waited for better paste..


I really have no idea; I've never had to RMA a video card or owned an XFX. Well, I guess I have some idea: I'm pretty sure that card is dead. If you cracked the die, the actual GPU is done for.

Where did you buy it from? I think you could probably just return it to the retailer and get a refund, as long as the outside doesn't look tampered with.

Man, you are scaring me; I have to put on my first waterblock next week


----------



## tsm106

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> The most standup retailers are Jan-tech and Sidewinder. However they are smaller and have miniscule stock selection compared to the big two fcpu and ppcs. Thus we are forced to deal with fcpu and ppcs. Whenever I can I try to buy all my stuff from jabtech and sidewinder, but you can't build a loop to your hearts content doing that unfortunately.
> 
> Haha, on your edit. It's clear you have no idea about the WC hobby.
> 
> 
> 
> You really do come off as a bit of an elitist jerk, I mentioned I was very new to watercooling. Everyone starts somewhere anyway this will definitely be my last reply to you.

If you spent any time googling, or even searching the WC forums, you would not think I was a jerk, lol.


----------



## DeadlyDNA

Quote:


> Originally Posted by *magicase*
> 
> Thinking of going Tri CF once the aftermarket coolers are released.


Gotcha. I am only going quad over tri to maximize Eyefinity resolutions. There's not much scaling on the 4th card, but tri-fire scales very nicely. The last time I ran tri-fire was with 5870s, and quad-fire with 5770s; that was just for fun, no doubt. I really dig Eyefinity over Surround on this R9 series; it's really easy to set up and turn off as needed. Surround was just a pain in the ass in the way it's enabled, configured, and disabled.


----------



## zephcdj

I have two identical Gigabyte 290x in two separate identical machines. They're both running the latest Catalyst WHQL. One of them reports VRM temps in GPU-Z, but the other does not. Anyone know the reason for the discrepancy?


----------



## r0cawearz

Quote:


> Originally Posted by *HardwareDecoder*
> 
> I really have no idea; I've never had to RMA a video card or owned an XFX. Well, I guess I have some idea: I'm pretty sure that card is dead. If you cracked the die, the actual GPU is done for.
> 
> Where did you buy it from? I think you could probably just return it to the retailer and get a refund, as long as the outside doesn't look tampered with.
> 
> Man, you are scaring me; I have to put on my first waterblock next week


From Newegg. You'll be fine; I was just being an idiot, since I got excited over the low VRM temps from installing the stock heatsinks. Thanks for the responses.

Does anyone know how RMAing a video card with XFX works? Or repairs/replacements for a fee, if that's possible? I don't have the box or the stock cooler intact, and I really don't want to pay shipping/return fees if it's going to fail.


----------



## brazilianloser

Created a profile like DeadlyDNA suggested and tried BF4 with 1x1 first... on Ultra at 1080p it was barely hitting 60 fps, but at least the game seemed to utilize both cards almost identically. Changed it over to AFR-Friendly and the game shoots up to 80-100 fps, but it is still utilizing only one card... Either the Afterburner usage display is broken or something is really messed up with these drivers.


----------



## DeadlyDNA

Quote:


> Originally Posted by *King4x4*
> 
> So are the black screens even happening for crossfire, trifire and quadfire users?


I haven't had any issues with it; in fact, having four of these 290s now, aside from one DOA they have been great. I also haven't been pushing mine like others have, so take it with a grain of salt.


----------



## ImJJames

Quote:


> Originally Posted by *bir86*
> 
> Try 3DMark 13.


For some reason my graphics score varies...

*1270 Clock 1500 Memory* *Graphics score 12831*
http://www.3dmark.com/3dm/1806283?


*1225 Clock 1500 Memory Graphics score 12964*
http://www.3dmark.com/fs/1265096


----------



## DeadlyDNA

Quote:


> Originally Posted by *brazilianloser*
> 
> Created a profile like DeadlyDNA sugested and tried BF4 with 1x1 first... on ultra 1080p was barely hitting 60fps but at least the game seems to utilize both cards almost identically. Changed it over to AFR Friendly and the game shoots up to 80-100fps but it is still utilizing only one card... Either AB usage display is broken or something is really messed up with these drivers.


I wish I had BF4 so I could help. I just hate the way the BF series launches from a browser, and I think EA is the devil... lol, I really hate them and Activision.


----------



## tsm106

Quote:


> Originally Posted by *r0cawearz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HardwareDecoder*
> 
> I really have no idea; I've never had to RMA a video card or owned an XFX. Well, I guess I have some idea: I'm pretty sure that card is dead. If you cracked the die, the actual GPU is done for.
> 
> Where did you buy it from? I think you could probably just return it to the retailer and get a refund, as long as the outside doesn't look tampered with.
> 
> Man, you are scaring me; I have to put on my first waterblock next week
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> From newegg. You'll be fine, I was just being an idiot since I got excited over the low VRM temps from installing the stock heatsinks. Thanks for the responses.
> 
> *Anyone know how RMAing a video card/xfx works? Or even repairs/replacements for a fee if possible? I don't have the box or the stock cooler intact and I really don't want to pay for shipping/return fees if it's going to fail.
> 
> 
> 
> 
> 
> 
> 
> *

Why bother? Do you think you did not cause the damage or something? You cracked the die... it's like crashing your car and blaming GM for your error. I'm not sure about repairs to a die; that's usually the end of a GPU.


----------



## r0cawearz

Quote:


> Originally Posted by *tsm106*
> 
> Why bother? Do you think you did not cause the damage or something? You cracked the die... it's like crashing your car, and blaming GM for your error. I'm not sure about repairs to a die, that's usually the end of a gpu.


No, I stated clearly that I knew I damaged it and have little to no hope for this card. But I figured I'd ask if anyone knows of any way to salvage the situation. Even if I had to pay a $200 fee, it'd be better than nothing.


----------



## the9quad

Quote:


> Originally Posted by *brazilianloser*
> 
> Seems like there is a bug, or whatever you want to call it... some folks with dual 290/290X setups are having the game only use a single card. I am one of those guys. Yes, ULPS is disabled, and yes, other games use both cards
> 
> 
> 
> 
> 
> 
> 
> ... BF4 on the other hand is gimped to a single card on my setup. I have stopped playing altogether until there is some sort of fix.


99.99999999% of the people who only have one of their cards working are playing in windowed mode. CrossFire (and SLI) only work in full screen. So if you were like me and playing in borderless windowed mode, that is why only one card is working. Borderless windowed mode does not equal full screen.


----------



## tsm106

Quote:


> Originally Posted by *r0cawearz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Why bother? Do you think you did not cause the damage or something? You cracked the die... it's like crashing your car and blaming GM for your error. I'm not sure about repairs to a die; that's usually the end of a GPU.
> 
> 
> 
> No, I stated clearly that I knew I damaged it and have little to no hope for this card. But I figured I'd ask if anyone knows of any way to salvage the situation. Even if I had to pay a $200 fee, it'd be better than nothing.

I see, and I understand. Still, the die is the core; if they replace your core, they cannot sell another card. FWIW, from chats with MSIAlex, once a die is mounted it never gets removed.


----------



## DeadlyDNA

Quote:


> Originally Posted by *r0cawearz*
> 
> No, I stated clearly that I knew I damaged it and have little to no hope for this card. But I figured I'd ask if anyone knows of any way to salvage the situation. Even if I had to pay a $200 fee, it'd be better than nothing.


Just curious if you feel up to posting a pic of the cracked chip. Maybe others should be aware this can happen?


----------



## tsm106

Yeah, post a pic. It didn't sound like the card was actually dead yet? Is it just a chip?


----------



## Forceman

Quote:


> Originally Posted by *r0cawearz*
> 
> No, I stated clearly that I knew I damaged it and have little to no hope for this card. But I figured I'd ask if anyone knows of any way to salvage the situation. Even if I had to pay a $200 fee, it'd be better than nothing.


If the damage was your fault, I don't think they will do anything for you at all. They don't repair cards; they just send out replacements and send the dead ones back for recycling or something, so they won't be able to pull the GPU off and put a new one on. If it was a bad capacitor they might repair that, but not a GPU. I don't know if they could even take the GPU off if they wanted to without desoldering the whole card.


----------



## DeadlyDNA

Quote:


> Originally Posted by *tsm106*
> 
> Yea post a pic. It didn't sound like the card was actually dead yet? Is it just a chip?


OT: So I saw you sold your R9s, and you said you were waiting a month... I was just curious what you're waiting on?


----------



## r0cawearz

Here. I'm not sure if it's from the AS5 spreading and zapping the core, or just from a poor heatsink installation. I've installed it a couple of times with a poor installation before, and this never happened.


----------



## RAFFY

Quote:


> Originally Posted by *magicase*
> 
> Does anyone have a 290 Tri CF setup? Any issues?


I will have my setup going in the coming weeks, once my GPU blocks arrive. Keep an eye out for my posts.
Quote:


> Originally Posted by *Slomo4shO*
> 
> Does anyone know of any merchant with blocks in stock? ForzenCPU is suggesting after Christmas for my order


I hope not; I have 3 on order from them!
Quote:


> Originally Posted by *brazilianloser*
> 
> http://www.frozencpu.com/products/21663/ex-blc-1567/EK_Radeon_R9-290X_VGA_Liquid_Cooling_Block_-_Acetal_Nickel_EK-FC_R9-290X_-_AcetalNickel.html?tl=g57c599s2078
> 
> Weird because they show about 6 in stock... just switch to those...


I hope that means my blocks were included when my order shipped!
Quote:


> Originally Posted by *-YC-*
> 
> Well, I play BF4 98% of the time I'm gaming.
> 
> I'm lost; I see people getting 110 fps average in BF4 at 2560x1440 with two cards, and some actually getting worse fps than a single card, lol


110 fps average is spot-on with my setup, and everything is running stock clocks (GPU, CPU, RAM). It's been so steady that, since my monitor is 60 Hz, I have set my frames to cap at 61 fps. What issues are you having?
Quote:


> Originally Posted by *brazilianloser*
> 
> Seems like there is a bug, or whatever you want to call it... some folks with dual 290/290X setups are having the game only use a single card. I am one of those guys. Yes, ULPS is disabled, and yes, other games use both cards
> 
> 
> 
> 
> 
> 
> 
> ... BF4 on the other hand is gimped to a single card on my setup. I have stopped playing altogether until there is some sort of fix.


I haven't heard of people being stuck on a single card in BF4; that is a COD: Ghosts problem. Post your issues and let's fix this.
Quote:


> Originally Posted by *DeadlyDNA*
> 
> Just curious if you feel up to the task of posting a pic of the cracked chip. Maybe others should be aware it can happen?


Post some pictures; I haven't heard of anyone cracking a core in years!


----------



## brazilianloser

Quote:


> Originally Posted by *the9quad*
> 
> 99.99999999% of the people who only have one of their cards working are playing in windowed mode. CrossFire (and SLI) only work in full screen. So if you were like me and playing in borderless windowed mode, that is why only one card is working. Borderless windowed mode does not equal full screen.


Full screen here; like I said before, this is the only game where the usage issue happens.


----------



## DeadlyDNA

Quote:


> Originally Posted by *r0cawearz*
> 
> 
> 
> Here. I'm not sure if its from the AS5 spreading and zapping the core, or if its just from the poor heatsink installation. I've installed it a couple of times with poor installation and this never happened.


Wow, that's bizarre; there are two lines that are perfectly straight... It almost looks like scratches?


----------



## r0cawearz

Another thing I noticed was that the thermal paste (Arctic Silver 5) had a brownish tint before I cleaned it off.

Edit: in that specific area, that is.


----------



## tsm106

Quote:


> Originally Posted by *r0cawearz*
> 
> Another thing I noticed was that the thermal paste (Arctic Silver 5) had a brownish tint before I cleaned it off.
> 
> Edit: in that specific area, that is.


I wonder; that spot looks to have suffered an impact. It seems to spider out from that point.

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Yea post a pic. It didn't sound like the card was actually dead yet? Is it just a chip?
> 
> 
> 
> OT - So i saw you sold your R9's, and you said waiting a month.... was just curious what you were waiting on?

Waiting on a custom that will let me throw an even crazier amount of volts at the card, and also for the custom voltage-regulator design that adds memory voltage.


----------



## Redvineal

Quote:


> Originally Posted by *Redvineal*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Wanna see something crazy?
> 
> I wasn't at all happy with the VRM1 temps (~91C) the Accelero Xtreme III was producing with the rinky-dink heatsinks on my XFX R9 290 (does NOT unlock, btw).
> 
> As a matter of fact, the heat caused the thermal pads to fail and all 3 heatsinks fell off! I didn't want to go with the permanent glue, and here's why I'm glad I didn't:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What you see there is the result of 2 hours of quality time spent with a hacksaw! Yes, a HACKSAW. I don't own a rotary tool, and my local hardware stores were closed when I started around 9:30 PM.
> 
> Essentially, I took inspiration from some folks at Overclockers UK for the VRM1 heat and got to work on the stock heatsink! The back plate covering VRM1 was the easy part (thanks OCUK). On the other hand, the VRM2 plate up front was much more difficult. I had to work my way through the black plate, the copper plate, and finally the aluminum fins. Once I got the VRM2 cover separated from the rest of the sink, I used a pair of pliers to individually tear off the aluminum fins.
> 
> Told you it was crazy!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now here comes some gritty:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Furmark 10 Minute Run 1
> 
> Core: 1100
> Memory: 1300
> Power: +50
> mV: +0
> 
> GPU Max: 57C
> VRM1 Max: 65C
> VRM2 Max: 50C
> 
> 
> 
> Furmark 10 Minute Run 2
> 
> Core: 1150
> Memory: 1500
> Power: +50
> mV: +100
> 
> GPU Max: 65C
> VRM1 Max: 78C
> VRM2 Max: 56C
> 
> 
> 
> 
> 
> Pretty sweet results, right? If you've made it this far, thanks for checking out my work!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'd really like to know what you other R9 290(X) owners think of this Frankenstein of a mod, the results, and what other benches would be good to tax this monstrosity!
> 
> Also, if anyone is interested in more details about how I wound up with the VRM2 plate you see in the pics (where to cut, etc), I'll be happy to share.
> 
> Edit: Forgot to mention my idles
> GPU: 35C
> VRM1: 31C
> VRM2: 33C
> 
> 
> Spoiler: Warning: Spoiler!


Hey all. Today I got the bright (read: crazy and insane) idea to one-up myself!

The spoiler above details the VRM mod I made a few weeks back. While the results were great, I could never help gagging when I looked at the Accelero Xtreme III fans and shroud, and I hated the sound it made running at a constant 12V.

I had a few Corsair SP120 Performance Edition fans lying around, and decided to strap a couple to the Accelero's heatsink. Here is the end result:


Spoiler: Warning: Spoiler!







Sorry for the craptastic phone camera picture, and yes, those are zip ties. I wish I had a better way to attach the fans to the sink, but my wife said the employees at our local Wal-Mart gave her dumb looks when she asked for 3M adhesive tape.









Back to business. The SP120s are connected directly to the PSU. I ran two burn-in tests for 20 minutes each: one with the SP120s at 7V, another with them running (loudly) at 12V.

Now, let's bring forth the gritty!

Furmark 20 Minute Run 1


Spoiler: Warning: Spoiler!



Core: 1150
Memory: 1500
Power: +50
mV: +100
SP120's: 7V

GPU Max: 60C (5C under VRM mod)
VRM1 Max: 71C (7C under VRM mod)
VRM2 Max: 50C (6C under VRM mod)





Furmark 20 Minute Run 2


Spoiler: Warning: Spoiler!



Core: 1150
Memory: 1500
Power: +50
mV: +100
SP120's: 12V

GPU Max: 58C (7C under VRM mod)
VRM1 Max: 66C (12C under VRM mod)
VRM2 Max: 47C (9C under VRM mod)













Those VRM1 temps on the second run make me want to scream like a teenager at a Justin Bieber concert, but realistically I can't run those SP120s at 12V day in and day out due to noise. In all honesty, with the fans at 7V, I'm not convinced I saved any noise over running the Accelero at 12V.

Overall, I'm extremely happy with the results. Hell, the fact that I've done so much to this card (my first time ever modding) and it still works surprises me!









If anyone has tips on a better way to attach the fans, and/or help with the slight board bend I'm getting from the weight, I'd love to hear them.

Either way, in the true spirit of OCN, I challenge you all to one-up me and report back whatever awesome thing you pull off!


----------



## ImJJames

Well I had fun on PT 1 Bios, time to go back to stock. I wish PT 1 was fixed so it would downclock when not in usage.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Redvineal*
> 
> Hey all. Today I got the bright (read crazy and insane) idea to one-up myself!
> 
> The spoiler above details the VRM mod I made a few weeks back. While the results were great, I could never avoid gagging when I looked at the Accelero Xtreme III fans and shroud, and hated to hear the sound it made running at a constant 12V.
> 
> I had a few Corsair SP120 Performance Edition fans lying around, and decided to strap a couple to the Accelero's heatsink. Here is the end result:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Sorry for the craptastic phone camera picture, and yes, those are zip ties. I wish I had a better way to attach the fans to the sink, but my wife said the employees at our local Wal-Mart gave her dumb looks when she asked for 3M adhesive tape.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Back to business. The SP120's are connected directly to the PSU. I ran two burn in tests for 20 minutes each. One with the SP120s at 7V, another with them running (loudly) at 12V.
> 
> Now, let's bring forth the gritty!
> 
> Furmark 20 Minute Run 1
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Core: 1150
> Memory: 1500
> Power: +50
> mV: +100
> SP120's: 7V
> 
> GPU Max: 60C (5C under VRM mod)
> VRM1 Max: 71C (7C under VRM mod)
> VRM2 Max: 50C (6C under VRM mod)
> 
> 
> 
> 
> 
> Furmark 20 Minute Run 2
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Core: 1150
> Memory: 1500
> Power: +50
> mV: +100
> SP120's: 12V
> 
> GPU Max: 58C (7C under VRM mod)
> VRM1 Max: 66C (12C under VRM mod)
> VRM2 Max: 47C (9C under VRM mod)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Those VRM1 temps on the second run make me want to scream like a teenager at a Justin Bieber concert, but realistically I can't run those SP120's at 12V day in and day out due to noise. In all honesty, with the fans at 7V, I'm not convinced I saved any noise over running the Accelero at 12V.
> 
> Overall, I'm extremely happy with the results. Hell, the fact that I've done so much to this card (my first time ever modding), and it still works, surprises me!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If anyone has some tips on a better way to attach the fans, and/or help with the slight board bend I'm getting from the weight, I'd love to hear them.
> 
> Either way, in the true spirit of OCN, I challenge you all to one-up me and report back whatever awesome thing you pull off!


Maybe nab a backplate to help with sag?


----------



## RAFFY

Quote:


> Originally Posted by *Redvineal*
> 
> Hey all. Today I got the bright (read crazy and insane) idea to one-up myself!
> 
> The spoiler above details the VRM mod I made a few weeks back. While the results were great, I could never avoid gagging when I looked at the Accelero Xtreme III fans and shroud, and hated to hear the sound it made running at a constant 12V.
> 
> I had a few Corsair SP120 Performance Edition fans lying around, and decided to strap a couple to the Accelero's heatsink. Here is the end result:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Sorry for the craptastic phone camera picture, and yes, those are zip ties. I wish I had a better way to attach the fans to the sink, but my wife said the employees at our local Wal-Mart gave her dumb looks when she asked for 3M adhesive tape.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Back to business. The SP120's are connected directly to the PSU. I ran two burn in tests for 20 minutes each. One with the SP120s at 7V, another with them running (loudly) at 12V.
> 
> Now, let's bring forth the gritty!
> 
> Furmark 20 Minute Run 1
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Core: 1150
> Memory: 1500
> Power: +50
> mV: +100
> SP120's: 7V
> 
> GPU Max: 60C (5C under VRM mod)
> VRM1 Max: 71C (7C under VRM mod)
> VRM2 Max: 50C (6C under VRM mod)
> 
> 
> 
> 
> 
> Furmark 20 Minute Run 2
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Core: 1150
> Memory: 1500
> Power: +50
> mV: +100
> SP120's: 12V
> 
> GPU Max: 58C (7C under VRM mod)
> VRM1 Max: 66C (12C under VRM mod)
> VRM2 Max: 47C (9C under VRM mod)
>
> Those VRM1 temps on the second run make me want to scream like a teenager at a Justin Bieber concert, but realistically I can't run those SP120's at 12V day in and day out due to noise. In all honesty, with the fans at 7V, I'm not convinced I saved any noise over running the Accelero at 12V.
> 
> Overall, I'm extremely happy with the results. Hell, the fact that I've done so much to this card (my first time ever modding), and it still works, surprises me!
> 
> If anyone has some tips on a better way to attach the fans, and/or help with the slight board bend I'm getting from the weight, I'd love to hear them.
> 
> Either way, in the true spirit of OCN, I challenge you all to one-up me and report back whatever awesome thing you pull off!


Awesomeness!


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *HardwareDecoder*
> 
> You really do come off as a bit of an elitist jerk, I mentioned I was very new to watercooling. Everyone starts somewhere anyway this will definitely be my last reply to you.
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> If you spent anytime googling or even searching in the wc forums, you would not think I was a jerk, lol.
Click to expand...

Pfffffft

LooooooLs


----------



## RAFFY

I just realized that my motherboard (ASUS ROG MAXIMUS VI FORMULA) only does x8/x4/x4 when using three PCI Express slots?!?! From a performance standpoint, how badly will this affect my 2nd and 3rd GPUs in tri-fire?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *RAFFY*
> 
> I just realized that my motherboard (ASUS ROG MAXIMUS VI FORMULA) only does x8/x4/x4 when using three PCI Express slots?!?! From a performance standpoint, how badly will this affect my 2nd and 3rd GPUs in tri-fire?


If you were running X79 it wouldn't be a 'problem', lots of bandwidth on 'em.
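As a rough back-of-the-envelope check of what those lane counts mean (this sketch assumes PCIe 3.0 lanes from the CPU; the actual generation and wiring of each slot vary by board, and the third slot on many Z87 boards runs at PCIe 2.0 from the chipset):

```python
# Illustrative per-direction PCIe bandwidth math; real-world throughput is lower.
GT_PER_S = {"2.0": 5.0, "3.0": 8.0}        # giga-transfers/s per lane
ENCODING = {"2.0": 8 / 10, "3.0": 128 / 130}  # line-code efficiency

def lane_bandwidth_gbs(gen):
    """Usable GB/s per lane (per direction) for a PCIe generation."""
    return GT_PER_S[gen] * ENCODING[gen] / 8  # bits -> bytes

for width in (16, 8, 4):
    print(f"PCIe 3.0 x{width}: {lane_bandwidth_gbs('3.0') * width:.1f} GB/s")
```

So an x4 slot still offers roughly a quarter of the x16 figure; whether that matters in games depends on how much the cards talk across the bus.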


----------



## y2kcamaross

Quote:


> Originally Posted by *the9quad*
> 
> 99.99999999% of the people who only have one of their cards working are playing in window mode. *Crossfire (and SLI)* only work in full screen. So if you were like me and you were playing borderless window mode, then that is why you have only one card working. Borderless window mode, does not equal full screen.


Actually, it's just Crossfire; SLI works fine in windowed mode. On another note, I'm trying to purchase two R9 290's... but they are literally gone everywhere.


----------



## RAFFY

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> If you were running X79 it wouldn't be a 'problem', lots of bandwidth on 'em.


Well you might see me with a 6 core setup then lol


----------



## tsm106

Quote:


> Originally Posted by *RAFFY*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HOMECINEMA-PC*
> 
> If you were running X79 it wouldn't be a 'problem', lots of bandwidth on 'em.
> 
> 
> 
> Well you might see me with a 6 core setup then lol
Click to expand...

You can also power tri hawaii with a 3820 or 4820 no problem.


----------



## DeadlyDNA

Jesus, R9 290's are like $470 now on the Egg, and out of stock!!!!!!!!!! Coin mining must be where it's at.


----------



## RAFFY

Quote:


> Originally Posted by *tsm106*
> 
> You can also power tri hawaii with a 3820 or 4820 no problem.


Yeah but like 2 more cores like makes my like epeen like 5 times bigger!
Quote:


> Originally Posted by *DeadlyDNA*
> 
> Jesus, r9 290's are like 470$ now on the Egg, and out of stock!!!!!!!!!! Coin mining must be where it's at


It's fun you should try it out!


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *tsm106*
> 
> You can also power tri hawaii with a 3820 or 4820 no problem.


Very true. If you're lucky you could land a 3820 that runs the 166 strap and 2666+ on da mem @ 5 gigs too.

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Jesus, r9 290's are like 470$ now on the Egg, and out of stock!!!!!!!!!! Coin mining must be where it's at


290's are going for that price here, but in Aussie dollars. $470 US = $516.20 AUD, and that's what they want for the ASUS 290 with BF4 over here.


----------



## AddictedGamer93

Any news on custom cards? This wait is driving me insane.


----------



## HardwareDecoder

Quote:


> Originally Posted by *AddictedGamer93*
> 
> Any news on custom cards? This wait is driving me insane.


Skip all the air cooling nonsense and grab a reference card + waterblock.

Granted, you need a loop to put it in, but it's the best PC investment I've made so far.

I'll be adding the 290X to my loop Tuesday.


----------



## Emmett

Quote:


> Originally Posted by *tsm106*
> 
> Oh cancel that, you do not want nickel. It will corrode. There's a guy on here who had his corrode in 7 days use.
> 
> I have one last acetal copper block left to sell. It has backplate and is fitted with ultra extremes.
> 
> Where have you been for the last two/three years? Silver is out. If you use nickel at least use a coolant. EK nickel is a joke.


LOL. Not keeping up, I guess. I heard about the EK nickel thing, but thought they had it straightened out?

I am running coolant on that card's loop now.

Damn, the other loop is still fine, no issues. I am likely still gonna drain it and do coolant on it also.

For now the coolant and distilled loops' temps are within 2C of each other, so it's all good.

Working on my overclocks now.


----------



## ImJJames

Welp, I had fun benching my R9 290; now it's time to start mining, baby.


----------



## AddictedGamer93

Quote:


> Originally Posted by *HardwareDecoder*
> 
> skip all the air cooling non sense and grab a reference+waterblock
> 
> 
> 
> 
> 
> 
> 
> 
> 
> granted, you need a loop to put it in but it's the best pc investment i've made so far.
> 
> I'll be adding the 290x to my loop tuesday.


I intend on doing that; look at my sig rig. The problem is dealing with the noise until I get a water block.


----------



## broken pixel

Quote:


> Originally Posted by *AddictedGamer93*
> 
> Any news on custom cards? This wait is driving me insane.


Sometime in January I think.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Emmett*
> 
> LOL. Not keeping up I guess. heard about the EK nickel thing, but thought they had it straightened out?
> 
> I am running coolant on that cards loop now.
> 
> Dam, the other loop is still fine, no issues. I am likely still gonna drain it and do coolant on it also.
> 
> Fow now the coolant and distilled loops duel temps within 2C of each other so its all good.
> 
> working on my overclocks now.


I am new to watercooling as well. I thought that you still had to have some sort of anti-corrosion additive, even with just distilled water and a kill coil?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Emmett*
> 
> LOL. Not keeping up I guess. heard about the EK nickel thing, but thought they had it straightened out?
> 
> I am running coolant on that cards loop now.
> 
> Dam, the other loop is still fine, no issues. I am likely still gonna drain it and do coolant on it also.
> 
> Fow now the coolant and distilled loops duel temps within 2C of each other so its all good.
> 
> working on my overclocks now.


I would also like to know if EK fixed their nickel plating issue. I'm going to buy parts for a loop soon, and I absolutely hate the look of copper and the idea of corrosion and copper stains. However, if copper is better in the long run then I'll go with it.

@TSM, if the EK nickel plated blocks are not so good, what would you recommend? EK copper or an entirely different brand?

I'd love to get an XSPC setup going but I'm having a hard time finding one of their 290 blocks.


----------



## Mr357

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I would also like to know if EK fixed their nickel plating issue. I'm going to buy parts for a loop soon, and I absolutely hate the look of copper and the idea of corrosion and copper stains. However, if copper is better in the long run then I'll go with it.
> 
> @TSM, if the EK nickel plated blocks are not so good, what would you recommend? EK copper or an entirely different brand?
> 
> I'd love to get an XSPC setup going but I'm having a hard time finding one of their 290 blocks.


To my knowledge, EK fixed their nickel issues quite some time ago. If it's any consolation, I'm using an EK nickel block.

As far as an XSPC for the 290/X, I didn't think they existed yet. Last I checked, the only options are EK, Aqua Computer, and Koolance.


----------



## DeadlyDNA

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I would also like to know if EK fixed their nickel plating issue. I'm going to buy parts for a loop soon, and I absolutely hate the look of copper and the idea of corrosion and copper stains. However, if copper is better in the long run then I'll go with it.
> 
> @TSM, if the EK nickel plated blocks are not so good, what would you recommend? EK copper or an entirely different brand?
> 
> I'd love to get an XSPC setup going but I'm having a hard time finding one of their 290 blocks.


I like copper myself; sure, it's not an epic color, but it can look alright sometimes.

Can't forget my bullet heatsink; sadly, copper everywhere.


----------



## iamhollywood5

^ Copper is actually a little bit better at heat transfer.

My only problem with copper is the oxidization, but that can simply be avoided by careful handling.


----------



## chiknnwatrmln

Thanks for the clarification guys.

Copper goes well with some things, but for next build (in sig) I'm doing mostly black + white. White case, white tubing, black + white blocks, white illuminated pump, black wiring etc. Copper would stick out a lot. I like it in that rig posted above, though.

How can one avoid oxidation on copper?

Here's a link to the XSPC 290/x block: https://www.overclockers.co.uk/showproduct.php?prodid=WC-289-XS


----------



## tsm106

Quote:


> Originally Posted by *skupples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> yes, for the 290s, quad is possible. we also want to consider the system to run this many gpus. afaik, you need a X79 system to fully utilize this configuration with a highly oc'ed cpu. with regard to the psu, well, 2 powerful psus for sure. might cause a blackout in the neighborhood. lol.
> 
> 
> 
> you could also do it on a PLX chipped lga 1150 if you like, for much less, with almost no hit (and some times a gain) in gaming performance. It of course won't handle the high end tasks as well as x79. & yes I would HIGHLY recommend dual PSU & *ADD2PSU* for quad SLI Hawaii.
Click to expand...

I keep seeing ppl recommend ADD2PSU. Not sure if you guys realize that there is a simpler and cheaper adapter; heck, you can make one yourself. For example:

http://www.ebay.com/itm/Dual-Power-Supply-Adapter-two-power-supplies-with-one-motherboard-/251012275056?pt=US_Monitor_Power_Supplies&hash=item3a717f5b70

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I would also like to know if EK fixed their nickel plating issue. I'm going to buy parts for a loop soon, and I absolutely hate the look of copper and the idea of corrosion and copper stains. However, if copper is better in the long run then I'll go with it.
> 
> @TSM, if the EK nickel plated blocks are not so good, what would you recommend? EK copper or an entirely different brand?
> 
> I'd love to get an XSPC setup going but I'm having a hard time finding one of their 290 blocks.


EK nickel fails. Koolance nickel strips at the fin structure. EK supposedly fixed their issue, but as evidenced by Emmett's situation, it is not fixed. I recommend you use a coolant, all the time. Whether it be with copper or nickel-plated blocks, use a coolant or coolant/distilled mix. PC watercooling coolants include both biocides and anti-corrosives, covering both needs at the same time. And the performance difference from straight water is negligible.

Quote:


> Originally Posted by *DeadlyDNA*
> 
> i like copper myself, sure it's not epic color but it can look alright sometimes
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> cant forget my bullet heatsink, sadly copper everywhere,


I used to use the same Lepa. How do you have your cards connected? I hope you're not using rail 3?

Quote:


> Originally Posted by *iamhollywood5*
> 
> ^ copper is actually a little bit better with heat transfer
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My only problem with copper is the oxidization, but that can simply be avoided by careful handling.


You can't avoid copper corrosion on the exterior of the block unless you seal it. That's one of the first things I do with my copper blocks.


----------



## RAFFY

Quote:


> Originally Posted by *tsm106*
> 
> I keep seeing ppl recommend ADD2PSU. Not sure if you guys realize that there is a simpler and cheaper adapter, heck you can make one yourself. For ex.
> 
> http://www.ebay.com/itm/Dual-Power-Supply-Adapter-two-power-supplies-with-one-motherboard-/251012275056?pt=US_Monitor_Power_Supplies&hash=item3a717f5b70
> EK nickel fails. Koolance nickel strips at the fin structure. EK supposedly fixed their issue, but as evidenced by Emmet's situation, it is not fixed. I recommend you use a coolant, all the time. Whether it be with copper or nickel plated blocks, use a coolant or coolant/distilled mix. PC watercooling coolants include both biocides and anti-corrosives, achieving both needs at the same time. And the performance difference from straight water is negligible.
> I used to use the same Lepa. How do you have your cards connected? I hope you're not using rail 3?
> You can't avoid copper corrosion on the exterior of the block unless you seal it. That's one of the first things I do with my copper blocks. For ex.


What product do you use to seal them?


----------



## tsm106

I use any gloss clear coat, Rustoleum, whatever is cheap. The coat will also resist vinegar, so you don't have to keep reapplying it between vinegar cleanings. And if for some reason you need to remove it, an acetone bath or wipe takes care of it quick.

Oh, btw, some blocks come clear-coated from the factory, like XSPC. When using an XSPC and cleaning it, don't accidentally clean the coat off the block.


----------



## Slomo4shO

Quote:


> Originally Posted by *Slomo4shO*
> 
> On a side note, I will be having nickle plated quick releases and chrome fittings in my loop. I was planning on DI water with a biocide like Methylisothiazolone. Considering that the rads are copper with steel and bronze casings, should I be worried about corrosion?


Any input on this?


----------



## ihaveworms

Quote:


> Originally Posted by *Mr357*
> 
> To my knowledge, EK fixed their nickel issues quite some time ago. If it's any consolation, I'm using an EK nickel block.
> 
> As far as an XSPC for the 290/X, I didn't think they existed yet. Last I checked, the only options are EK, Aqua Computer, and Koolance.


XSPC has a waterblock. http://www.xs-pc.com/waterblocks-gpu/razor-r9-290x-290


----------



## givmedew

Quote:


> Originally Posted by *ihaveworms*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mr357*
> 
> To my knowledge, EK fixed their nickel issues quite some time ago. If it's any consolation, I'm using an EK nickel block.
> 
> As far as an XSPC for the 290/X, I didn't think they existed yet. Last I checked, the only options are EK, Aqua Computer, and Koolance.
> 
> 
> 
> XSPC has a waterblock. http://www.xs-pc.com/waterblocks-gpu/razor-r9-290x-290
Click to expand...

Still think the Koolance would be the top performer because of the microchannel design. I don't know that for sure, but mine is working much better than I expected it to.


----------



## Emmett

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I am new to watercooling as well, i thought that you still had to have some soft of anti-corrosion additive even with just dist water and kill coil?


I keep seeing things back and forth about using anti-corrosion additives or not with just distilled and a coil.

If I had to do it over I would have gone with

A full copper radiator (No brass at all).
Copper blocks.
Copper fittings.
Distilled.
A few drops of biocide.

The only non-copper in the loop would be whatever metal is used to cover the VRM bridges on the GPU block.


----------



## brazilianloser

Quote:


> Originally Posted by *ihaveworms*
> 
> XSPC has a waterblock. http://www.xs-pc.com/waterblocks-gpu/razor-r9-290x-290


They said it would be on sale this last week here in the US but so far I haven't seen a sign of it.


----------



## skupples

Quote:


> Originally Posted by *Emmett*
> 
> LOL. Not keeping up I guess. heard about the EK nickel thing, but thought they had it straightened out?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I am running coolant on that cards loop now.
> 
> Dam, the other loop is still fine, no issues. I am likely still gonna drain it and do coolant on it also.
> 
> Fow now the coolant and distilled loops duel temps within 2C of each other so its all good.
> 
> working on my overclocks now.


EK has mostly sorted out the nickel corrosion issues, though it's definitely smart to use some coolant, or even better (recommended by JPM) use Red Line Water Wetter anti-corrosion additive. You can get it at your local auto parts store. I JUST tore down my loop, and all three of my nickel blocks (CPU + GPUs) are spotless, zero corrosion. Seems nickel is slightly higher maintenance is all. Gotta check your pH every few months, and run anti-corrosion additive to be sure you aren't destroying your stuff. Something tells me corrosion within 7 days of use is probably user error. He must have used tap water, or flushed his rads with vinegar; who knows.

ADD2PSU is $20 and is considered superior to hotwiring or dual plugs. I have a reason why, but can't remember it in my 2:30 AM stupor.


----------



## MrWhiteRX7

Just for a reference for those having poor GPU usage in BF4 with an 8350... this first pic is with my FX-8350 @ 4.9GHz (1080p, ultra settings).

This is from a moment ago, Intel 2600K @ 4.0GHz, same 290 GPU (1440p ultra, but no MSAA).

Huge, HUUUGE difference in how the game plays, and hell, I get MORE fps at 1440p than I did at 1080p on the AMD rig. I'm not here to start a war, just saying that these 290/X GPUs need horsepower!

Would like to see more comparisons just out of curiosity, although it doesn't matter much anymore; I just purchased Ivy-E.


----------



## DeadlyDNA

TSM:
I used to use the same Lepa. How do you have your cards connected? I hope you're not using rail 3?

I am glad you asked; I couldn't find anywhere what rails are what. Can you tell me which rails? I had originally assumed the red connectors above the mobo plugs were shared; is this incorrect?


----------



## xnxsxx

Count me in!









Sapphire R9 290 flashed to 290X! Stock cooler.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *xnxsxx*
> 
> Count me in!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sapphire R9 290 flashed to 290X! Stock cooler.


Now I am certain mine will unlock as well; I'll get around to it eventually.


----------



## darkelixa

So AMD 8350s are no good with the R9 290s?


----------



## -YC-

Quote:


> Originally Posted by *RAFFY*
> 
> I will have my setup going here in the coming weeks once my GPU blocks arrive. Keep an eye out for my postings.
> I hope not I have 3 on order from them!
> I hope that means when my order shipped it included my blocks!
> 110fps average is spot on with my setup and everything is running stock clocks (GPU,CPU,RAM). Its been so steady that since my monitor 60hz I have set my frames to cap at 61fps. What issues are you having?
> I haven't heard of people running single cards in BF4, that is a COD:Ghosts problem. Post your issues lets fix this.
> Post some pictures I haven't heard of anyone cracking a core in years!


I haven't yet bought the card, because it seems 90% of the people out there are having some serious frame problems with CF 290. I've been waiting 30 days without a PC, mainly because my motherboard had problems and I sent it to MSI; now they are going to give me a new one, and I just hope I get it this year lol. Anyways, I was wondering: doesn't SLI have problems with BF4? Nvidia does more patches so people don't have to put up with this kind of thing all the time. Maybe I should get two 4GB 770s from Zotac (not the AMP! version). But how different will the frames be? Will I get much lower than a perfectly working CF 290? Keep in mind 98% of the time I play BF4, so that's important, and I will buy new cards when the GTX 900 series comes out (and whatever AMD releases at the time). That is, if they release the GTX 800 three or four months into 2014.

If they release close to 2015 I'll get them. Though I think they will, because no single card can run games even at 1920x1080 very well.

Anyways, should I wait until this CF 290 issue gets solved, or go ahead and get SLI 770s (4GB versions)?


----------



## DeadlyDNA

Quote:


> Originally Posted by *-YC-*
> 
> I havent yet bought the card, becouse it seems 90% of the people out there are having some serious frame problems with a CF 290. I waiting 30 days without PC mainly becouse my motherboard had problems and i sent to MSI now they are going to give me a new one i just hope i get it this year lol anyways i was wondering SLI doesnt have problems with BF4 ? Nvidia does more patches so people dont have to put up with this kind of shıit all the time. Maybe i should get two 4GB 770 from Zotac (not amp! verson) But how differant frames will i get ? Will i get much lower than a perfectly working CF 290 ?? Keep in mind 98% of the time i play BF4 so thats important and i will buy new cards when the GTX 900 series comes out (and what ever amd will release at the time) That is if they release GTX 800 three /four months in 2014
> 
> 
> 
> 
> 
> 
> 
> If they release close to 2015 ill get them
> 
> 
> 
> 
> 
> 
> 
> ) Though i think they will becouse no single card can run games even at 1920x1080p very good
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anywasy should i wait untill this CF 290 gets solved or go ahead and get SLI 770 (4GB versions)


I don't play BF4, but in case you aren't aware, the GTX 770 is basically a GTX 680. You might be better off looking at a higher-end single card if you don't want SLI/CF issues.


----------



## -YC-

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I don't play BF4, but in case you aren't aware 770GTX are basicly 680GTX. You might be better off looking at a higher end single card if you don't want SLI/CF issues.


One card is not enough for BF4 for me, though.


----------



## the9quad

Quote:


> Originally Posted by *-YC-*
> 
> I havent yet bought the card, becouse it seems 90% of the people out there are having some serious frame problems with a CF 290. I waiting 30 days without PC mainly becouse my motherboard had problems and i sent to MSI now they are going to give me a new one i just hope i get it this year lol anyways i was wondering SLI doesnt have problems with BF4 ? Nvidia does more patches so people dont have to put up with this kind of shıit all the time. Maybe i should get two 4GB 770 from Zotac (not amp! verson) But how differant frames will i get ? Will i get much lower than a perfectly working CF 290 ?? Keep in mind 98% of the time i play BF4 so thats important and i will buy new cards when the GTX 900 series comes out (and what ever amd will release at the time) That is if they release GTX 800 three /four months in 2014
> 
> 
> 
> 
> 
> 
> 
> If they release close to 2015 ill get them
> 
> 
> 
> 
> 
> 
> 
> ) Though i think they will becouse no single card can run games even at 1920x1080p very good
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anywasy should i wait untill this CF 290 gets solved or go ahead and get SLI 770 (4GB versions)


Everyone I know with crossfired 290/290X's is having zero problems with BF4. I've only seen one person on here mention that it isn't working for them; what does that tell ya? I highly doubt it is an issue with the 290's being crossfired in BF4, because it works for darn near everyone else. I'd say the odds that it is an AMD problem are just as high as someone having a problem with Nvidia SLI in BF4; I've seen a few mention that.


----------



## S410520

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Just for a reference to those having poor gpu usage in BF4 with an 8350.... This first pic is with my FX 8350 @ 4.9ghz (1080p ultra settings)
> 
> 
> 
> This is from a moment ago, INTEL 2600k @ 4.0ghz same 290 gpu (1440p ultra but no MSAA)
> 
> 
> 
> Huge HUUUGE difference in how the game plays and hell I get MORE fps in 1440p than I did in 1080p on the AMD rig. I'm not here to start a war just saying that these 290/ x gpu's need horsepower!
> 
> Would like to see more comparisons just out of curiosity, although doesn't matter much anymore I just purchased IVY-E


I have a friend with an AMD [email protected] and it lags the same way you described. My 2600K does not have this issue @ 4.4GHz.
The AMD rig cannot handle one 290 fully, even though the AMD is a "6-core" and the 2600K is only a quad.

On 3 cores I get about the same performance in BF4 as the AMD "6-core" at full speed.

Just our experience; my friend has had only problems since the BF4 beta, and with AMD hyping everything up you can understand the disappointment with his performance.

Hopefully Mantle will improve this "6-core's" performance (as they have promised it will).

Now where is MANTLE??? Anybody have a real date?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *S410520*
> 
> I have a friend with a AMD [email protected] and it lags the same way you described. My 2600K does not have this issue @4.4Ghz.
> The AMD rig cannot handle 1 290 in full. Even though the AMD is a "6-core" and 2600 is only quad.
> 
> On 3 cores I get about the same performance in BF4 as the AMD "6-core" in full speed.
> 
> Just our experience; my friend had only problems since BF4 beta, with AMD hyping everything up you can understand the disappointment with his performance.
> 
> Hopefully Mantle will improve this "6-cores" performance. (as they have promised it will)
> 
> Now where is MANTLE??? Anybody have a real date?


Exactly, there are quite a few with this issue that I have seen. I only had my FX-8350 8-core for less than 4 weeks and I already went back to my 2600K. I am getting flawless performance at 1440p with 100% GPU usage on my overclocked 290. When I get my X79 setup and two more GPUs, I can only imagine how sick this will be.

I just wanted to post all of that as a visual reference on the same GPU, memory, PSU, Win 8.1, etc... all I did was swap mobo and CPU, refresh the OS and try BF4 out. Boom, 100% better.


----------



## darkelixa

I doubt a CPU is causing such bad fps loss.


----------



## -YC-

Quote:


> Originally Posted by *the9quad*
> 
> Everyone I know with crossfired 290/290x's is having zero problems with bf4. I've only seen one person on here mention that it isn't working for them, what does that tell ya? I highly doubt it is an issue with the 290's being crossfired in bf4, because it works for darn near everyone else. I'd say the odds that it is a amd problem are just as high as someone having a problem with nvidia sli in bf4. I've seen a few mention that.


My head = jsbdfjafbcbfbrbeufbdfneoıthıshrf

I dont know what to do lol

I am so jealous of console players; they spend much, much less, and they just insert a disc and play: no fps problems, no nothing, everyone is equal. But the graphics are very bad, even on the PS4.


----------



## darkelixa

Ive watched quite a few bf4 r9 290/280x videos with no fps issues.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *darkelixa*
> 
> I doubt a cpu is causing such bad fps loss


Frames were pretty good on the 8350 at 1080p with maxed-out quality settings... more than playable, but it had dips here and there and wasn't always consistent. The dips matched the GPU usage drops. The 2600K fixed the usage drops and more.

Now I get the same fps at 1440p as I did at 1080p with the 8350, because now I can OC the card and make use of it 100%.

I still have the 8350 chillin'. I do not have bad things to say about it; the 2600K is just working better for my needs.


----------



## MrWhiteRX7

I had my 7970 clocked at 1200/1675 in BF4 using the 8350 @ 4.9GHz with zero issues with GPU usage. Once I went to the 290 is when it started. The game played fine overall; if someone were watching a video of it you'd probably not notice it, but I could feel the stutter here and there, and the GPU usage drops were definitely there. Again, it's not just me, and I only posted what I found to alleviate the situation. I'm not trying to bash the 8350.


----------



## S410520

Question for the Afterburner users:

Will forcing constant voltage help with OC? I can run 1100 on my unlocked 290 at +38mV = ~1.180V actual max (GPU-Z).
Weird thing is, I can't go much higher even though my temps are good (all below 60).
Voltage bumps never seem to help much past this point; does anybody recognize this?
The highest I had with only slight artifacts is 1140 at 1.230V actual. Not fully stable though.

Is it just the limit of the gpu?


----------



## psyside

Quote:


> Originally Posted by *rv8000*
> 
> Finally got this stupid 290 to break 13k gpu score in firestrike @ 1230/1500, can't go any higher due to vrm temps D:
> 
> http://www.3dmark.com/3dm/1806045


What are the vrm temps at, and at what fan speed/volts?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *S410520*
> 
> Question for the Afterburner users:
> 
> Will forcing constant voltage help with oc? I can run 1100 on my unlocked 290 on +38mv =~1.180v actual max. (gpuz)
> Weird thing is, I can't go much higher even tho my temps are good. (all below 60)
> Voltage bumps never seem to help past this point pretty much, anybody recognize this?
> Highest I had with little artifacts is 1140 on 1.230v actual. Not fully stable tho.
> 
> Is it just the limit of the gpu?


Slide the voltage all the way over to +100mV and start going up on your core clock. It's not linear: +38 may get you to 1100, but that doesn't mean +48 gets you to 1125 or anything. It may take +100 just to get it up to 1150 max. Bump it up and then start messing with the core again.
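A purely illustrative sketch of why overclocking headroom feels nonlinear (all numbers below are made up for demonstration, not measured from any card): dynamic power grows roughly with V² × f, so each further voltage/clock step costs disproportionately more power and heat.

```python
# Hypothetical illustration: relative board power vs. a 1100 MHz / 1180 mV
# baseline, using the rough dynamic-power model P ~ V^2 * f.
def rel_power(v_mv, f_mhz, base_v=1180, base_f=1100):
    """Relative dynamic power compared to the baseline point."""
    return (v_mv / base_v) ** 2 * (f_mhz / base_f)

# Made-up voltage/clock pairs showing the rising cost of each step:
for f, v in [(1100, 1180), (1125, 1230), (1150, 1280)]:
    print(f"{f} MHz @ {v} mV -> {rel_power(v, f):.2f}x baseline power")
```

The takeaway matches the advice above: the last 50 MHz can cost more voltage (and heat) than the first 100 did, which is why maxing the slider and then tuning the core down is a common approach.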


----------



## S410520

Thanks, I will try.


----------



## centvalny

Testing 290X with Hynix memory on cold. CPU and RAM @ stock.



http://imgur.com/WPZPoQm


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *-YC-*
> 
> I havent yet bought the card, becouse it seems 90% of the people out there are having some serious frame problems with a CF 290. I waiting 30 days without PC mainly becouse my motherboard had problems and i sent to MSI now they are going to give me a new one i just hope i get it this year lol anyways i was wondering SLI doesnt have problems with BF4 ? Nvidia does more patches so people dont have to put up with this kind of shıit all the time. Maybe i should get two 4GB 770 from Zotac (not amp! verson) But how differant frames will i get ? Will i get much lower than a perfectly working CF 290 ?? Keep in mind 98% of the time i play BF4 so thats important and i will buy new cards when the GTX 900 series comes out (and what ever amd will release at the time) That is if they release GTX 800 three /four months in 2014
> 
> 
> 
> 
> 
> 
> 
> If they release close to 2015 ill get them
> 
> 
> 
> 
> 
> 
> 
> ) Though i think they will becouse no single card can run games even at 1920x1080p very good
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anywasy should i wait untill this CF 290 gets solved or go ahead and get SLI 770 (4GB versions)


All I can say is this: a single 290 kicks azz in BF4 and COD Ghosts, and I don't even know what the fps are for either game; I can simply tell by how smooth the transition from cut scenes to game is. AB doesn't want me to know the fps, no idea why. But I'm running 2011 and hexy, and that helps big time. No tearing or ripples; must be somewhere around 60fps as a guesstimation. Can't wait to get a 2nd 290. Gonna bench the living daylights outta 'em. Then some gaming.


----------



## cyenz

Anyone tried updating the BIOS? There are several new BIOS versions in the VGA BIOS collection, one of them only 3 days old.

V: 015.042.000.000.003745


----------



## velocityx

Quote:


> Originally Posted by *-YC-*
> 
> Anyway, should I wait until this CF 290 issue gets solved, or go ahead and get SLI 770s (4GB versions)?


I had some issues with CF 290, but I solved it by doing DDU in safe mode; now everything runs plenty fine. Go ahead and get them. Getting 770 4GB in SLI would be the worst way to dump cash right now; totally not worth it to buy old gen for money that can get you current gen. There's so much win with CF 290: frame pacing works wonders in BF4, it works like one card (meaning it's as smooth as one), so go ahead.


----------



## quakermaas

Quote:


> Originally Posted by *cyenz*
> 
> Anyone tried updating the BIOS? There are several new BIOS versions in the VGA BIOS collection, one of them only 3 days old.
> 
> V: 015.042.000.000.003745


Yes, I have it on both my cards; no testing done yet.

I flashed the MSI 290 BIOS to my PowerColor cards.


----------



## psyside

Considering 290s have only Elpida, or most of them do, what memory OC should I expect?

1600 is a no-go I guess, or very hard?









Did anyone notice any improvements from flashing the vBIOS other than unlocked volts/power? Is it worth it?

More stable clocks at the same volts as before? Less throttling? Heat? Is it possible to go back to the stock vBIOS after you update? Thanks.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *centvalny*
> 
> Testing the 290X with Hynix memory on cold. CPU and RAM @ stock.


Yes, yes, that's what I've been waiting to see, someone who's really keen about what you can get outta this card.


----------



## Newbie2009

Quote:


> Originally Posted by *psyside*
> 
> Considering 290s have only Elpida, or most of them do, what memory OC should I expect?
> 
> 1600 is a no-go I guess, or very hard?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Did anyone notice any improvements from flashing the vBIOS other than unlocked volts/power? Is it worth it?
> 
> More stable clocks at the same volts as before? Less throttling? Heat? Is it possible to go back to the stock vBIOS after you update? Thanks.


1450-1500 is what my cards can do before performance goes south.


----------



## SultanOfWalmart

Quote:


> Originally Posted by *psyside*
> 
> Considering 290s have only Elpida, or most of them do, what memory OC should I expect?
> 
> 1600 is a no-go I guess, or very hard?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Did anyone notice any improvements from flashing the vBIOS other than unlocked volts/power? Is it worth it?
> 
> More stable clocks at the same volts as before? Less throttling? Heat? Is it possible to go back to the stock vBIOS after you update? Thanks.


Highest I've seen is 1625 without a severe loss in performance. Average seems to be 1550-1600 for the top end. Mine tops out around 1575 before it quits.


----------



## jomama22

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> Highest I've seen is 1625 without a severe loss in performance. Average seems to be 1550-1600 for the top end. Mine tops out around 1575 before it quits.


Hynix will go up to ~ 1750 if you find the sweetspot.


----------



## Durvelle27

Quote:


> Originally Posted by *jomama22*
> 
> Hynix will go up to ~ 1750 if you find the sweetspot.


Mine doesn't seem to like going above 1450 without a volt bump.


----------



## SultanOfWalmart

Quote:


> Originally Posted by *jomama22*
> 
> Hynix will go up to ~ 1750 if you find the sweetspot.


Yes, Hynix will clock higher. Think he was asking about Elpida though.


----------



## Jack Mac

Quote:


> Originally Posted by *Durvelle27*
> 
> Mine doesn't seem to like going above 1450 without a volt bump.


Isn't the 290/X memory voltage locked? Also, I get the same sort of dips that the 8350 gets with my 3570K; time to get a 3770K I guess.


----------



## jomama22

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yes, yes, that's what I've been waiting to see, someone who's really keen about what you can get outta this card.


That's on LN2. My 3 290Xs hit 1350+ on water. These cards just aren't falling into the right hands.
http://www.3dmark.com/fs/1239528


Here's my FSE 3x 290x @ *1333/1625*
World record right there. That's #1 on FM HOF.

Note that 2/3 are elpidas









1353/1749 - single 290x under water

http://www.3dmark.com/fs/1133097

Tess on, and it would be valid if it weren't a beta.


----------



## jomama22

Quote:


> Originally Posted by *Jack Mac*
> 
> Isn't the 290/X memory voltage locked? Also, I get the same sort of dips that the 8350 gets with my 3570K; time to get a 3770K I guess.


That's correct, no memory voltage control without a hard mod. But the Asus/PT1/PT3 BIOS does allow higher overclocking compared to stock.


----------



## Roy360

I was thinking of buying this card off a friend. This card will unlock right?


----------



## Jack Mac

Even if it doesn't, the 290 and 290X are like a 3-5% difference at the same speed. I haven't even bothered with my 290.


----------



## -YC-

Quote:


> Originally Posted by *velocityx*
> 
> I had some issues with CF 290, but I solved it by doing DDU in safe mode; now everything runs plenty fine. Go ahead and get them. Getting 770 4GB in SLI would be the worst way to dump cash right now; totally not worth it to buy old gen for money that can get you current gen. There's so much win with CF 290: frame pacing works wonders in BF4, it works like one card (meaning it's as smooth as one), so go ahead.


Alright, I will get the 290 CF, though should I wait for the non-reference cards or get the reference? I don't play with a headset, and if it's as loud as people say maybe it will bother me, though I can't know 100% without seeing/testing it, and I have no way to. Anyway, thanks a lot, and do you know when the non-reference versions will come?


----------



## SultanOfWalmart

Quote:


> Originally Posted by *Roy360*
> 
> I was thinking of buying this card off a friend. This card will unlock right?


Yes, it will unlock.


----------



## Durquavian

Quote:


> Originally Posted by *jomama22*
> 
> That's on LN2. My 3 290Xs hit 1350+ on water. These cards just aren't falling into the right hands.
> http://www.3dmark.com/fs/1239528
> 
> 
> Here's my FSE 3x 290x @ *1333/1625*
> World record right there. That's #1 on FM HOF.
> 
> Note that 2/3 are elpidas
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1353/1749 - single 290x under water
> 
> http://www.3dmark.com/fs/1133097
> 
> Tess on, and it would be valid if it weren't a beta.


I think it is valid with 3DMark. One time the same driver was approved and another time not. I have used betas for a while, probably for 99% of these cards' life.


----------



## Falkentyne

Quote:


> Originally Posted by *grunion*
> 
> Both pt1/pt3 bios' lock my core speeds at 1Ghz.
> Any thoughts?


Those are subzero/LN2 BIOSes; they're supposed to do that. They remove the 2D speeds to keep the cards running at 3D clocks.


----------



## Roy360

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> Yes, it will unlock.


Thanks, I couldn't decide between buying off a friend for 500 or just buying one of the new shipments for 460 (taxes included).

This makes my decision a lot easier.


----------



## head9r2k

Hey guys, sorry for my bad English.

Is there a BIOS editor for the R9 290?

I want to stop the card from clocking down in 3D.

When I play Diablo 3 the card goes down to 550MHz and I get stuttering.

Temps are all fine.


----------



## ZealotKi11er

I have Hynix memory on my 290X. Do I need to up the voltage to overclock it? So far I have 1100MHz core at stock voltage.


----------



## kcuestag

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I have Hynix memory on my 290X. Do I need to up the voltage to overclock it? So far I have 1100MHz core at stock voltage.


From my own experience, I've found that OC'ing the memory on these R9 cards is absolutely USELESS. It won't even give you a 1fps increase; simply not worth it.

Stick to the Core OC.


----------



## MojoW

If I use the Asus BIOS with Asus GPU Tweak instead of the ATI BIOS and Afterburner, will I end up with more volts after vdroop?
Cuz I just need a little bit more juice to hit 1200 core, and my memory is already at 1600 (on air).

Edit: Elpida memory, by the way.


----------



## HardwareDecoder

Quote:


> Originally Posted by *kcuestag*
> 
> From my own experience, I've found that OC'ing the memory on these R9 cards is absolutely USELESS. It won't even give you a 1fps increase; simply not worth it.
> 
> Stick to the core OC.


Good info, I'll have to test it for myself of course when I get my legit 290X.


----------



## grunion

Quote:


> Originally Posted by *Falkentyne*
> 
> Those are subzero/LN2 BIOSes; they're supposed to do that. They remove the 2D speeds to keep the cards running at 3D clocks.


Ok here are more details.

Uninstall CAT/AB restart, delete AB folders.
Flash BIOS
Install CAT restart
Install AB restart
Disable ULPS restart
Run AB, 1 core at 2d speeds, 2nd core stuck at 1ghz

Try adjusting core speeds, 1 core stays at 1ghz, the other core maxes at 1030.
At this point I haven't unlocked OD, unlock OD and it shows the same thing.

Tried 3 different CATs same result, using AB .17.
Through all this memory and voltages are adjustable.

BTW, is the ROM only supposed to be 64KB?


----------



## tsm106

Quote:


> Originally Posted by *grunion*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Falkentyne*
> 
> Those are subzero/LN2 BIOSes; they're supposed to do that. They remove the 2D speeds to keep the cards running at 3D clocks.
> 
> 
> 
> Ok here are more details.
> 
> Uninstall CAT/AB restart, delete AB folders.
> Flash BIOS
> Install CAT restart
> Install AB restart
> Disable ULPS restart
> Run AB, 1 core at 2d speeds, 2nd core stuck at 1ghz
> 
> Try adjusting core speeds, 1 core stays at 1ghz, the other core maxes at 1030.
> At this point I haven't unlocked OD, unlock OD and it shows the same thing.
> 
> Tried 3 different CATs same result, using AB .17.
> Through all this memory and voltages are adjustable.
> 
> BTW, is the ROM only supposed to be 64KB?

Did you try adding non-stock BIOS support to AB? A quick test is to run GPU Tweak. These BIOSes are that small because they have PowerTune removed.

http://forums.guru3d.com/showpost.php?p=4700906&postcount=2
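For reference, on AMD cards the 2013-era Afterburner betas gated non-stock BIOS support behind the "unofficial overclocking" switches in MSIAfterburner.cfg (the Guru3D post above has the details); the key names below are from memory of that era and may differ by AB version:

```ini
; MSIAfterburner.cfg — [ATIADLHAL] section (key names may vary by AB version)
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 1
```

Restart Afterburner after editing so the config is re-read.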


----------



## grunion

^^ Forgot to add that I used the unlocked GPU Tweak; same issue.

I will try the AB tweak.


----------



## MojoW

Quote:


> Originally Posted by *MojoW*
> 
> If I use the Asus BIOS with Asus GPU Tweak instead of the ATI BIOS and Afterburner, will I end up with more volts after vdroop?
> Cuz I just need a little bit more juice to hit 1200 core, and my memory is already at 1600 (on air).
> 
> Edit: Elpida memory, by the way.


Or can I use the AB command lines on top of the 100mV I have set already?


----------



## tsm106

Quote:


> Originally Posted by *DeadlyDNA*
> 
> TSM:
> I used to use the same Lepa. How do you have your cards connected? I hope you're not using rail 3?
> 
> I am glad you asked; I couldn't find anywhere what rails are what. Can you tell me which rails? I had originally assumed the red connectors above the mobo plugs were shared; is this incorrect?




The Lepa is difficult because so many rails are shared, and there is only so much power to go around with 360W per rail. Normally it's not a concern, but with what 290s are able to draw with unlocked BIOSes it can be problematic. 12V3 is shared with MB/PCIe aux power too.


----------



## CriticalHit

Quote:


> Originally Posted by *kcuestag*
> 
> From my own experience, I've gathered that OC'ing the Memory on these R9 cards is absolutely USELESS. It will not even provide you 1fps increase, simply not worth it.
> 
> Stick to the Core OC.


I'll attest to that... overclocking memory actually REDUCED fps in Valley for me, sometimes by a significant margin.


----------



## iamhollywood5

Quote:


> Originally Posted by *CriticalHit*
> 
> I'll attest to that... overclocking memory actually REDUCED fps in Valley for me, sometimes by a significant margin.


Probably because your VRAM is unstable when overclocked and ECC kicks in.


----------



## Clockster

OK, well, I transplanted my rig into a Corsair Air 540 and I can say for sure my 290X is definitely running cooler.
BF4 now sits at 68C @ 100% fan speed, where it was 70-72C. Considering I live in a damn hot country, I am pretty impressed.


----------



## Kriant

Is there an OCer's manual on OCing the R9 290/290X?

Mine turned out to be all locked, so I will have to just go with OCing them.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Kriant*
> 
> Is there an OCer's manual on OCing the R9 290/290X?
> 
> Mine turned out to be all locked, so I will have to just go with OCing them.


OC'ing 290s is pretty standard.

Create a custom fan profile to stay cool; Hawaii is temp-sensitive in my experience.

Increase core, test with your preferred method, repeat until artifact/crash.

Then increase voltage until all is good, and increase core again until artifact/crash. Repeat until you hit your highest stable OC, then start OC'ing memory.

If you want more than +100mV, flash to the ASUS BIOS and use GPU Tweak. It allows a max of 1.412V before droop (about 1.3V under load).
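The loop described above can be sketched as code. This is only an illustration of the search order, not a real tuning tool: `is_stable` stands in for whatever stress test you prefer (Valley, BF4, etc.), and the step sizes and +100mV cap are the ones mentioned in this thread.

```python
def find_stable_oc(is_stable, core_start=1000, core_step=10,
                   mv_start=0, mv_step=25, mv_max=100):
    """Sketch of the manual procedure: raise core until unstable,
    then add voltage and retry the same step, up to the software mV cap."""
    core, mv = core_start, mv_start
    best = (core, mv)
    while True:
        if is_stable(core + core_step, mv):
            core += core_step              # step passed: keep the higher clock
            best = (core, mv)
        elif mv + mv_step <= mv_max:
            mv += mv_step                  # instability: bump voltage, retry
        else:
            return best                    # out of voltage headroom

# Toy stability model (purely illustrative): each extra mV buys 1 MHz.
print(find_stable_oc(lambda core, mv: core <= 1050 + mv))  # (1150, 100)
```

In practice each `is_stable` check is minutes of benchmarking, which is why people do exactly this loop by hand.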


----------



## iGameInverted

Finally have everything together. Add me =D

HIS 290x on water. Elpida mem.


----------



## Kriant

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> OC'ing 290's is pretty standard.
> 
> Create a custom fan profile to stay cool, Hawaii is temp sensitive in my experience.
> 
> Increase core, test with preferred method, repeat until artifact/crash.
> 
> Increase voltage until all is good, increase core until artifact/crash. Repeat until you hit your highest stable OC, then start OC'ing memory.
> 
> If you want more than +100mV, flash to ASUS BIOS'es and use GPUTweak. It allows a max of 1.412v before droop (about 1.3v under load).


OK, do I want to set the power limit to +50%, or what's the golden middle?
P.S. I'm on water, so temps aren't as big of an issue.


----------



## chiknnwatrmln

Yes power limit to +50%.


----------



## velocityx

Quote:


> Originally Posted by *-YC-*
> 
> Alright, I will get the 290 CF, though should I wait for the non-reference cards or get the reference? I don't play with a headset, and if it's as loud as people say maybe it will bother me, though I can't know 100% without seeing/testing it, and I have no way to. Anyway, thanks a lot, and do you know when the non-reference versions will come?


Well, I dunno. I game with closed-back headphones, and I'm getting a watercooling loop in the summer so noise isn't an issue for me, but it is a bit noisy. Not crazy loud, just a bit noisy. If time permits, I can record a video with my phone of a small game session in BF4 without headphones so you can see how loud it is.

I'm not promising though; dunno what will come up tomorrow.


----------



## ZealotKi11er

What does the power limit do? Has anyone done any testing?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What does the power limit do? Has anyone done any testing.


The power limit determines how much extra wattage (in %) the card is allowed to draw.

Basically, if it's not set at the max and you try to overclock, your card is probably gonna throttle, especially if you over-volt.
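To make the percentage concrete, here's the arithmetic with a made-up 250 W base PowerTune limit (a placeholder for illustration, not the card's actual number):

```python
def allowed_power(base_watts, limit_percent):
    """PowerTune headroom: the card may draw base * (1 + limit/100) watts
    before it starts pulling clocks down."""
    return base_watts * (1 + limit_percent / 100.0)

# Hypothetical 250 W base limit:
print(allowed_power(250, 0))   # 250.0 — throttles as soon as it hits base power
print(allowed_power(250, 50))  # 375.0 — +50% buys 125 W of headroom
```

That extra headroom is why an over-volted overclock needs the slider maxed: higher voltage and clocks push draw well past the base limit.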


----------



## ZealotKi11er

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> The power limit determines how much extra wattage (in %) the card is allowed to draw.
> 
> Basically, if it's not set at the max and you try to overclock, your card is probably gonna throttle, especially if you over-volt.


I am currently running @ 1100MHz at stock voltage. Should it be fine @ 0%?


----------



## ImJJames

Kraken g10 compatible with r9 290?


----------



## ZealotKi11er

Quote:


> Originally Posted by *ImJJames*
> 
> Kraken g10 compatible with r9 290?


Yes.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am currently running @ 1100MHz at stock voltage. Should it be fine @ 0%?


Power limit is really only going to hurt you if it causes throttling.

I'd keep it set at +50%, even if you don't over-volt the card still needs more wattage for higher clocks.

Edit: You may have to use CCC to increase it; when I used MSI AB it wouldn't actually set the power limit to +50%. GPU Tweak will work also.


----------



## iamhollywood5

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Power limit is really only going to hurt you if it causes throttling.
> 
> I'd keep it set at +50%, even if you don't over-volt the card still needs more wattage for higher clocks.
> 
> Edit: You may have to use CCC to increase it; when I used MSI AB it wouldn't actually set the power limit to +50%. GPU Tweak will work also.


This, you really want it at 150% at all times. I'll tell you this: when I ran the OCCT GPU stress test with the power target at 100%, the clock would throttle all the way down to around 720MHz no matter the temperature (even below 60C), because a stress test like OCCT or Furmark draws so much power. I had to set 150% to get the GPU running at 1000MHz. I wouldn't be surprised if a few games out there pulled enough power to throttle the card below 900MHz. You probably want 150% even if you aren't overclocking at all.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *iamhollywood5*
> 
> This, you really want it at 150% at all times. I'll tell you this: when I ran the OCCT GPU stress test with the power target at 100%, the clock would throttle all the way down to around 720MHz no matter the temperature (even below 60C), because a stress test like OCCT or Furmark draws so much power. I had to set 150% to get the GPU running at 1000MHz. I wouldn't be surprised if a few games out there pulled enough power to throttle the card below 900MHz. You probably want 150% even if you aren't overclocking at all.


Running a heavily modded Fallout 3 at 4800x900 caused my card to throttle from 1100 MHz to ~750 MHz before I properly set the power limit to +50%.


----------



## ImJJames

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yes.


Can't be that easy, is it? Any tutorials?


----------



## Durvelle27

Using the PT1 BIOS on my Sapphire R9 290, and I can say it overclocks much better and is much more stable.


----------



## Neutronman

What's with the R9 290 pricing? I paid $399 for my MSI and $419 for the PowerColor OC version I picked up on Friday. Since then prices have rocketed. Newegg has the same MSI R9 290 that I paid $399 for at $469!!!

Supply and demand I guess, unless AMD finally realized that they can get more money for this powerhouse of a video card....


----------



## chiknnwatrmln

Mining craze. It affects all high-end AMD GPUs to some extent.


----------



## Arizonian

Quote:


> Originally Posted by *bir86*
> 
> XFX 290 unlocked to 290X
> ASIC 73%
> 
> 1270/1400 @ 1,5vCore with PT1 BIOS - Stock fan @100%
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *xnxsxx*
> 
> Count me in!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sapphire R9 290 flashed to 290X! Stock cooler.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added








Quote:


> Originally Posted by *iGameInverted*
> 
> Finally have everything together. Add me =D
> 
> HIS 290x on water. Elpida mem.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## darkelixa

Still waiting for my xfx r9 290


----------



## Durvelle27

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added


I don't see my name on the list


----------



## ehpexs

I sold my 7950s and now I'm waiting on 290s. Hopefully the mining craze that allowed me to almost break even on my 7950s will die off so I can buy a pair of 290s with aftermarket coolers next month.


----------



## Arizonian

Quote:


> Originally Posted by *Durvelle27*
> 
> I don't see my name on the list


#182, scroll down the roster - you're there.


----------



## Durvelle27

Quote:


> Originally Posted by *Arizonian*
> 
> #182, scroll down the roster - you're there.


I don't see it. Think I'm blind.


----------



## Arizonian

Quote:


> Originally Posted by *Durvelle27*
> 
> I don't see it. Think I'm blind.


4th from the bottom currently. You have to focus your pointer in the roster and scroll down; it goes further down than is showing. I had to cut off what shows in the OP at some point.


----------



## Durvelle27

Quote:


> Originally Posted by *Arizonian*
> 
> 4th from the bottom currently. You have to focus your pointer in the roster and scroll down it goes further down than is showing. I had to cut off what shows on OP at some point.


Ahh

Dang it, the wheel broke on my mouse XD


----------



## Arizonian

Quote:


> Originally Posted by *Durvelle27*
> 
> Ahh
> 
> Dang it. wheel broke on my mouse XD


Let me help you.











_Click pic then click original for full size._


----------



## Durvelle27

Quote:


> Originally Posted by *Arizonian*
> 
> Let me help you.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _Click pic then click original for full size._


Thx









Wow, a lot of ppl running stock.


----------



## Arizonian

Quote:


> Originally Posted by *Durvelle27*
> 
> Thx
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wow, a lot of ppl running stock.


This club is going to double when AIB's put out their non-reference 290's this month and 290X next month too.


----------



## Durvelle27

Quote:


> Originally Posted by *Arizonian*
> 
> This club is going to double when AIB's put out their non-reference 290's this month and 290X next month too.


I bet. Sucks I don't have BF4 to enjoy this card though.


----------



## Neutronman

Quote:


> Originally Posted by *Arizonian*
> 
> This club is going to double when AIB's put out their non-reference 290's this month and 290X next month too.


Going to double again when mantle is released.....


----------



## ImJJames

Quote:


> Originally Posted by *Neutronman*
> 
> Going to double again when mantle is released.....


Honestly I don't see AMD catching up to the demand. AMD's profit this year must have gone through the roof.


----------



## brazilianloser

Quote:


> Originally Posted by *ImJJames*
> 
> Honestly I don't see AMD catching up to the demand. AMD's profit this year must have gone through the roof.


Even some of the sites that keep getting them in stock are actually selling open boxes as new cards, due to the high RMA rate at the beginning of the release. My second card was very noticeably one of these: the packaging was beat up and all the internal goodies looked fiddled with, but at least it worked, and so far it's a good OC card for being on rc. So yeah, I also think the demand is just well above what AMD and its sellers can supply.


----------



## Arizonian

Quote:


> Originally Posted by *Neutronman*
> 
> Going to double again when mantle is released.....


Let's hope DICE gets BF4 fixed soon and these issues don't delay Mantle. At least they are making it a priority and not leaving gamers hanging. As far as we know, BF4 is to be the first game to sport Mantle.

*DICE halts development of all future projects as it shifts focus to Battlefield 4 issues*


----------



## brazilianloser

Quote:


> Originally Posted by *Neutronman*
> 
> Going to double again when mantle is released.....


I am really hoping for Mantle to actually do as advertised... right now I can play BF4 on ultra and keep 60fps with just an occasional dip below (once I figured out that for some reason I had to make a custom profile for the game in CCC and pick AFR in order for the game to use both cards). If it buffs up these cards even further, I won't feel obliged to buy a third one.


----------



## velocityx

Yeah, but Mantle is in the hands of Johan Andersson.

I don't think that, with such a high priority for AMD, DICE will be like "hey Johan, go leave Mantle and fix the game bugs". Johan is the tech director, so I bet he's working on Mantle all day, all week ;]


----------



## XSHollywood

Any word about an R9 290/290X-specific* cooler being made / released soon? I don't think I'm gonna wait for the aftermarket solutions...

I'm betting Thermalright or another heatsink company would make a killing releasing a kit to retrofit all the reference cards.

*And I'm not talking about the Gelid or Arctic "compatible with everything" coolers already out...


----------



## ehpexs

Quote:


> Originally Posted by *XSHollywood*
> 
> Any word about an R9 290/290x *specific cooler being made / released soon? I don't think I'm gonna wait for the aftermarket solutions...
> 
> I'm betting thermalright or another heatsink company would make a killing releasing a kit to retrofit all the reference cards.
> 
> *And I'm not talking about the Gelid or Arctic "compatible with everything" coolers already out...


I've heard January; hopefully they'll come out then. I'm thinking about water, but startup costs + block costs + limited resale value are making me think twice.


----------



## broken pixel

Anyone else notice Afterburner beta 17 throttles the clocks like mad?

I was testing my 2x 290X in my mining rig and noticed that when I use Afterburner beta 17 it throttles both cards. One will throttle and then the other.

It's not a heat issue, because my GPUs are under 64C and the VRMs are not overheating.

When I don't use Afterburner and only use the cgminer script with the OC and fan settings, the cards don't throttle the clocks. It's easy to tell because the hash rates are stable, unlike when using Afterburner.

900MHz/1250MHz gives me almost 800KH/s per GPU. 1100/1500 would be great, but my mems do not like going to 1500MHz.



without AB B17


I just tried thread-concurrency of 57000 and it works.
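For anyone wanting to reproduce that setup without Afterburner in the loop, the settings described above map onto a cgminer config roughly like this. The pool URL and credentials are placeholders, and the key names follow the cgminer 3.x scrypt readme; check your version's documentation:

```json
{
  "pools": [
    {
      "url": "stratum+tcp://pool.example.com:3333",
      "user": "worker",
      "pass": "x"
    }
  ],
  "scrypt": true,
  "gpu-engine": "900",
  "gpu-memclock": "1250",
  "thread-concurrency": "57000",
  "gpu-fan": "30-85",
  "intensity": "20"
}
```

With clocks and fan set here, cgminer manages the cards itself and nothing else needs to touch the driver.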


----------



## chiknnwatrmln

Quote:


> Originally Posted by *broken pixel*
> 
> Anyone else notice Afterburner beta 17 throttles the clocks like mad?
> 
> I was testing my 2x 290X in my mining rig and noticed that when I use Afterburner beta 17 it throttles both cards. One will throttle and then the other.
> 
> It's not a heat issue, because my GPUs are under 64C and the VRMs are not overheating.
> 
> When I don't use Afterburner and only use the cgminer script with the OC and fan settings, the cards don't throttle the clocks. It's easy to tell because the hash rates are stable, unlike when using Afterburner.
> 
> 900MHz/1250MHz gives me almost 800KH/s per GPU. 1100/1500 would be great, but my mems do not like going to 1500MHz.
> 
> 
> 
> without AB B17
> 
> 
> I just tried thread-concurrency of 57000 and it works.


In my experience MSI AB's power limit settings don't work, which results in throttling.

You can use either CCC, or flash to an ASUS BIOS and use GPUTweak.


----------



## ZealotKi11er

Can you change mine to water?


----------



## Slomo4shO

Quote:


> Originally Posted by *Durvelle27*
> 
> wow alot of ppl running stock


Many not out of choice









Quote:


> Originally Posted by *Arizonian*
> 
> This club is going to double when AIB's put out their non-reference 290's this month and 290X next month too.


You should consider splitting the spreadsheet into two, one for 290s and another for 290Xs


----------



## Durvelle27

Quote:


> Originally Posted by *Slomo4shO*
> 
> Many not out of choice


Me running aftermarket. Paid $400 total


----------



## Slomo4shO

Quote:


> Originally Posted by *Durvelle27*
> 
> Me running aftermarket. Paid $400 total


Paid $1200 total for the 4 cards after selling the game







, the custom loop is currently at $1500 not including the fans







. My blocks are still on backorder


----------



## RAFFY

Quote:


> Originally Posted by *Slomo4shO*
> 
> Paid $1200 total for the 4 cards after selling the game
> 
> 
> 
> 
> 
> 
> 
> , the custom loop is currently at $1500 not including the fans
> 
> 
> 
> 
> 
> 
> 
> . My blocks are still on backorder


SCREW FANS!!! My 12 Scythe Gentle Typhoons cost a couple hundred bucks!


----------



## Durvelle27

Quote:


> Originally Posted by *Slomo4shO*
> 
> Paid $1200 total for the 4 cards after selling the game
> 
> 
> 
> 
> 
> 
> 
> , the custom loop is currently at $1500 not including the fans
> 
> 
> 
> 
> 
> 
> 
> . My blocks are still on backorder


Ahh, wish I knew, as I want BF4. And yeah, I'm waiting for blocks to come back in stock.


----------



## broken pixel

I have seen some posts of people recommending all these coolants and additives for H2O loops. No need to waste money on that hype. Just use distilled water and a silver coil.

I used coolant for a few years and always had weird things develop inside my loop. Since I stopped using aftermarket coolant, my loop has been clean and clear for over six months.

I am sure some will agree and some will disagree, but it works for me, costs way less, and is cleaner without the toxic stuff in my loop.


----------



## Slomo4shO

Quote:


> Originally Posted by *RAFFY*
> 
> SCREW FANS!!! My 12 Scythe Gentle Typhoons cost a couple hundred bucks!


I am looking at 26 fans total. I have 6 Noctua NF-F12s, so I am deciding whether I want to pick up 20 more of those or 26 Scythe AP-14s. I am leaning towards the Noctuas since I am going to be using Swiftech 8-Way PWM splitters connected to the OC Panel or motherboard.


----------



## tsm106

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Quote:
> 
> 
> 
> Originally Posted by *broken pixel*
> 
> Anyone else notice Afterburner beta 17 throttles the clocks like mad?
> 
> I was testing my 2x 290x in my mining rig and noticed when I use afterburner beta 17 it throttles both cards. One will throttle and then the other.
> 
> Its not a heat issue because my gpus are under 64C and VRMs are not overheating.
> 
> When I do not use afterburner and only use cgminer script with the OC and fan settings the cards do not throttle the clocks. It is easy to tell because the hash rates are stable unlike when using afterburner.
> 
> 900MHz/1250Mhz gives me almost 800KH/s per GPU. 1100/1500 would be great but my mems do not like going to 1500MHz
> 
> 
> 
> without AB B17
> 
> 
> I just tried thread-concurrency of 57000 and it works.
> 
> 
> 
> In my experience MSI AB's power limit settings don't work, which results in throttling.
> 
> You can use either CCC, or flash to an ASUS BIOS and use GPUTweak.

AB is broken right now. That said, you can use one of the unlocked BIOSes like PT1 and not mess with PowerTune at all, since PT1 has no TDP limit. Ergo, it makes the slider not matter.


----------



## psyside

Amazing channel for good binned 290


----------



## givmedew

Quote:


> Originally Posted by *skupples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Emmett*
> 
> LOL. Not keeping up I guess. heard about the EK nickel thing, but thought they had it straightened out?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I am running coolant on that cards loop now.
> 
> Dam, the other loop is still fine, no issues. I am likely still gonna drain it and do coolant on it also.
> 
> Fow now the coolant and distilled loops duel temps within 2C of each other so its all good.
> 
> working on my overclocks now.
> 
> 
> 
> 
> 
> 
> EK has mostly sorted out the nickel corrosion issues, though it's definitely smart to use some coolant, or even better (recommended by JPM) use Red Line Water Wetter anti-corrosion. You can get it @ your local auto parts store. I JUST tore down my loop; all three of my nickel blocks (cpu, gpu's) are spotless, zero corrosion. Seems nickel is slightly higher maintenance is all. Gotta check your pH every few months & run anti-corrosion to be sure you aren't destroying your stuff. Something tells me corrosion within 7 days of use is probably user-error related. He must have used tap water, or flushed his rads with vinegar, who knows.
> 
> ADD2PSU is $20 & is considered superior to hotwiring or dual plugs. I have a reason why, but can't remember in my 2:30 AM stupor.

Using silver with copper can be bad. Using silver with nickel can be worse. I say "can" because it depends on many things; as skupples has pointed out, pH matters. What also matters is the temperature of the water. If you're running a 20C delta with a 30C intake temp, then that 50C water is going to cause corrosion much faster than a 5C delta with 24C intake air (29C).

So DON'T use silver. If you do and you don't have corrosion, you are just lucky, period, or you used an anticorrosive that probably made the silver inert and totally ineffective. I know people have had algae, but I have never seen it EVER in a loop. So why use silver when its safety and effectiveness are questionable? I use either distilled water, or distilled water and ethylene glycol. One of the most respected water cooling companies in the game, Aquacomputer, has a clear coolant that is just deionized water and ethylene glycol. The other most respected company, Swiftech, has been pushing the use of distilled water and ethylene glycol for maybe 10 years. Their HydrX additive is distilled water, ethylene glycol, green dye and possibly some other anti-corrosion packages of the kind seen in vehicles.

Aquacomputer feels it is safe enough to mix aluminum and copper together with just their ethylene glycol mix.

I wanted a pure copper loop and have managed to keep one for a long time. However, this time around I cannot avoid nickel, because I am not going to use a basic old-school-designed waterblock. I want something that is either designed like a CPU block (which is why I had been using universal blocks) or something as close to that design as possible. Since there are no shootouts saying who has the best 290X block, I used my best judgment and figured the Koolance probably has the best temps and the highest restriction, but the restriction is no issue with a D5 and the lowest-restriction high-performance waterblock ever made (DT 5noz).

If I ever see a shootout that says EK's block is just as good, I will sell mine and get a copper EK. Until then I will stick with this horribly ugly waterblock (the ugliest available).

Anyways, my point is: don't be afraid of using nickel, as long as you're not a ******. Use an additive that prevents corrosion (ethylene glycol or Water Wetter). Don't use a kill coil; I am fairly certain that with an anti-corrosion additive the kill coil stops working anyway. Also, ethylene glycol is a biocide that will kill algae. Change your coolant 2-3x per year. One of the reasons people have issues with their loops is that they buy super expensive coolant, which makes them unlikely to change the coolant often. Swiftech's additive is $3, and using half as much as they recommend, it can do 4 changes on a huge loop; it is enough to treat 2 gallons of water. If you want clear ethylene glycol with no other additives, buy Feser base. It is cheap ($7) and can handle 3-4 changes.
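Back-of-envelope on the coolant cost argument above (a rough sketch; the prices, doses, and change counts are the figures claimed in this post, not verified specs):

```python
# Cost per coolant change, using the figures claimed above (assumptions, not verified):
# Swiftech HydrX: $3 bottle, ~4 changes at half the recommended dose (treats ~2 gallons total)
# Feser base: $7 bottle, ~3 changes
additives = {"Swiftech HydrX": (3.00, 4), "Feser base": (7.00, 3)}

for name, (price, changes) in additives.items():
    print(f"{name}: ${price / changes:.2f} per change")
```

Either way it's under a few dollars per change, so there's little excuse not to swap coolant regularly.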


----------



## VSG

Wait, so is it confirmed that afterburner doesn't apply the power limit? I had both my cards on what I thought was +50 power limit and got them up to 1125/1350 without any issues at stock volts. So should I flash the Asus bios and try out GPU tweak?


----------



## Forceman

Quote:


> Originally Posted by *geggeg*
> 
> Wait, so is it confirmed that afterburner doesn't apply the power limit? I had both my cards on what I thought was +50 power limit and got them up to 1125/1350 without any issues at stock volts. So should I flash the Asus bios and try out GPU tweak?


It doesn't work for some people, for some unknown reason. Others have no problem with it. You can also increase the power limit in CCC while still using AB for overclocking.


----------



## tsm106

Quote:


> Originally Posted by *geggeg*
> 
> Wait, so is it confirmed that afterburner doesn't apply the power limit? I had both my cards on what I thought was +50 power limit and got them up to 1125/1350 without any issues at stock volts. So should I flash the Asus bios and try out GPU tweak?


AB is BROKEN in two ways: applying PowerTune and reading GPU usage. Both issues are documented. Ironically, the one way that works without a doubt is using GPUTweak with an ASUS BIOS (stock ASUS if you want idle downclocking). As I described, you can use AB with an unlocked BIOS as a work-around. This however carries a big caveat: no PowerTune, i.e. no idle downclocking. Since PT1 has no TDP limit, it renders the PowerTune TDP slider useless, since all the limits have been gutted from the BIOS. Thus, used with AB, it removes the throttling caused by smacking the TDP limit. The end result is you get to keep the fancy OSD. Btw, the new RTSS works in 64-bit games.


----------



## psyside

Can anyone do short test Metro LL, (4XSSA) or some benchmark?

Stock clocks vs memory only overclock? 70% fan + 50% power tune on both.

I bet there will be difference.


----------



## givmedew

Quote:


> Originally Posted by *geggeg*
> 
> Wait, so is it confirmed that afterburner doesn't apply the power limit? I had both my cards on what I thought was +50 power limit and got them up to 1125/1350 without any issues at stock volts. So should I flash the Asus bios and try out GPU tweak?


It seems to work with the XFX BIOS.


----------



## VSG

Quote:


> Originally Posted by *Forceman*
> 
> It doesn't work for some people, for some unknown reason. Others have no problem with it. You can also increase the power limit in CCC while still using AB for overclocking.


Quote:


> Originally Posted by *tsm106*
> 
> AB is BROKEN in two ways, applying Powertune and reading GPU Usage. Both issues are documented. Ironically, the one way that works w/o a doubt is using gputweak with an asus bios (stock asus if you want idle downclocking). As I described you can use AB with an unlocked bios as a work-around. This however carries a big caveat, no Powertune ie. no idle downclocking. Since pt1 has no tdp limit, it renders Powertune tdp slider useless, since all the limits have been gutted from the bios. Thus using it with AB, it removes the throttling from smacking the TDP limit. The end result is you get to keep the fancy OSD. Btw, the new rtss works in 64bit games.


Thanks guys, that helps a lot. I haven't finished my build (waiting on a reservoir) so I will wait and see if Trixx gets updated. If not, Asus/PT1 bios for me.


----------



## Jpmboy

Quote:


> Originally Posted by *Emmett*
> 
> Hmm, 2 seperate loops for each card. own pumps, 360 Rad.
> Both using Distilled water with kill coil.
> 
> Anyone seen this before?
> 
> This is after only 7-9 days of setting up this cards loop.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> the brown is where the acrylic lies on the nickel, then there is galvanic-looking corrosion on the ends of the fins over the core
> (that is where I have the inlet). Can't believe it happened so fast...
> 
> the other loop looks fine and clean, and has been running much longer.
> 
> no aluminum in loops, but I have copper/brass/nickel/silver kill coil.


Daaaaum, that's nasty. Do the loops share the coolant? If yes, flush the whole system completely... dab plating?

With Cu, brass and nickel (solder-less rads?), WHY add a kill coil? It's redundant. Copper has the same effect (by the same mechanism) on anything biotic. If the plating shows ANY SIGNS of corrosion or pitting, the block is borked. If not: flush completely with tap water - lots. Rinse with distilled, then use only distilled water and 3-10% Red Line Water Wetter.
Check the pH with a pool test kit. No lower than 6.5.


----------



## Arizonian

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Can you change mine to water.


Congrats - updated








Quote:


> Originally Posted by *Slomo4shO*
> 
> Many not out of choice
> 
> 
> 
> 
> 
> 
> 
> 
> You should consider splitting the spreadsheet into two, one for 290s and another for 290Xs


Keeping one chart is much simpler. I consider this one unified club; these cards are so closely matched anyway.


----------



## tsm106

Quote:


> Originally Posted by *geggeg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> It doesn't work for some people, for some unknown reason. Others have no problem with it. You can also increase the power limit in CCC while still using AB for overclocking.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> AB is BROKEN in two ways, applying Powertune and reading GPU Usage. Both issues are documented. Ironically, the one way that works w/o a doubt is using gputweak with an asus bios (stock asus if you want idle downclocking). As I described you can use AB with an unlocked bios as a work-around. This however carries a big caveat, no Powertune ie. no idle downclocking. Since pt1 has no tdp limit, it renders Powertune tdp slider useless, since all the limits have been gutted from the bios. Thus using it with AB, it removes the throttling from smacking the TDP limit. The end result is you get to keep the fancy OSD. Btw, the new rtss works in 64bit games.
> 
> 
> Thanks guys, that helps a lot. I haven't finished my build (waiting on a reservoir) so I will wait and see if Trixx gets updated. If not, Asus/PT1 bios for me.

I see people mention using Overdrive and AB at the same time, but that is a bad habit, as the two can interfere with each other. Once you enable Overdrive, it can wreak havoc on your power states. We went through this years ago with Cayman. I generally avoid using redundant apps, especially ones that can change power states.


----------



## Forceman

Quote:


> Originally Posted by *psyside*
> 
> Can anyone do short test Metro LL, (4XSSA) or some benchmark?
> 
> Stock clocks vs memory only overclock? 70% fan + 50% power tune on both.
> 
> I bet there will be difference.


Okay, I ran the Metro: LL benchmark (D6, 1080p, Very High, 16x AF, SSAA, Very High Tessellation) at 1100/1250 and then again at 1100/1400. 2 runs of each.

1100/1250:
Avg 49.00
Max 105.12
Min 34.04

1100/1400:
Avg 49.50
Max 95.88
Min 34.02

So basically no difference.
Quote:


> Originally Posted by *tsm106*
> 
> I see ppl mention using Overdrive and AB at the same time but that is a bad habit as both can interfere with each other. Once you enable Overdrive it now can wreak havoc on your powerstates. We went thru this years ago with Cayman. I generally avoid using redundant apps especially ones that can change powerstates.


Except that for some people, not setting the power limit in CCC means it doesn't get increased. I'd say that's worse than any problems that might occur from running both.


----------



## RAFFY

Quote:


> Originally Posted by *Slomo4shO*
> 
> I am looking at 26 fans total. I have 6 Noctua NF-F12s so I am deciding whether I want to pickup 20 of those or 26 Scythe AP-14s. I am leaning towards the Noctua since I am going to be using Swiftech 8-Way PWM Splitters connected to the OC Panel or motherboard.


Scythe APs are supposed to be the best. I purchased the AP15s. I'm starting with 12, but I have a feeling I'll end up ordering a couple more. I also purchased two Molex-to-3-pin fan header PCBs.


----------



## psyside

Quote:


> Originally Posted by *Forceman*
> 
> Okay, I ran the Metro:LL benchmark (D6, 1080p, Very High, 16x AF, SSAA, Very High Tesselation) at 1100/1250 and then again at 1100/1400. 2 runs of each.
> 
> 1100/1250:
> Avg 49.00
> Max 105.12
> Min 34.04
> 
> 1100/1400:
> Avg 49.50
> Max 95.88
> Min 34.02
> 
> So basically no difference.
> Except that for some people not setting the power limit in CCC means it doesn't get increased. I'd say that's worse than any problems that might occur from running both.


Rep +









Well, that's disappointing. Is 1400 your limit? Can you do one quick Valley run with maxed-out settings? Or something really short.


----------



## ImJJames

Quote:


> Originally Posted by *tsm106*
> 
> AB is BROKEN in two ways, applying Powertune and reading GPU Usage. Both issues are documented. Ironically, the one way that works w/o a doubt is using gputweak with an asus bios (stock asus if you want idle downclocking). As I described you can use AB with an unlocked bios as a work-around. This however carries a big caveat, no Powertune ie. no idle downclocking. Since pt1 has no tdp limit, it renders Powertune tdp slider useless, since all the limits have been gutted from the bios. Thus using it with AB, it removes the throttling from smacking the TDP limit. The end result is you get to keep the fancy OSD. Btw, the new rtss works in 64bit games.



It's the opposite for me: AB works much better for 24/7 usage than GPU Tweak. For some reason, when I have GPU Tweak open it uses FIXED voltage and I see my temps rise. The fan control in GPU Tweak is also buggy: if you close it, it locks the fan at whatever speed was set when you closed GPU Tweak. So I use GPU Tweak for benching, AB for 24/7.


----------



## bustacap22

.


----------



## skupples

Quote:


> Originally Posted by *givmedew*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Using silver with copper can be bad. Using silver with nickel can be worse. I say can because it depends on many things like "skupples" has pointed out pH matters. What also matter is the temperature of the water. If your running a 20C delta with 30C intake temp then 50C is going to cause corrosion much faster than a 5C delta with 24C intake air (29c).
> 
> So DON'T use Silver if you do and you don't have corrosion you are just lucky period or you used an anticorrosive that probably made the silver inert and totally ineffective. I know people have had algae however I have never seen it EVER in a loop. So why use silver when it's safety and effectiveness are questionable. I use either distilled water or distilled water and ethylene glycol. One of the most respected water cooling companies in the game "aquacomputer" has clear coolant that is just deionized water and ethylene glycol. The other most respected company "swiftech" has been pushing the use of distilled water and ethylene glycol for maybe 10 years? Their Hydrx additive is distilled water, ethylene glycol, green dye and possibly some other anti-corrosion packages that would be seen in vehicles.
> 
> Aquacomputer feels it is safe enough to mix aluminum and copper together with just their ethylene glycol mix.
> 
> I wanted a pure copper loop and have managed to keep one for a long time. However this time around I can not avoid nickel because I am not going to use a basic old school designed waterblock. I want something that is either designed like a CPU block (which is why I had been using universal blocks) or something as close to that design as possible. Since there are no shootouts say who has the best 290x block I used my best judgment and figured the koolance probably has the best temps and the highest restriction but the restriction is no issue with a D5 and the lowest restriction high performance water block ever made (DT 5noz).
> 
> If I see a shootout ever done that says EKs block is just as good I will sell mine and get a copper EK. Until then I will stick with this horribly ugly waterblock (the ugliest available).
> 
> Anyways my point is... don't be afraid of using nickel as long as your not a ******. Use an additive that prevents corrosion (ethylene glycol or water wetter). Don't use a kill coil and I am fairly certain that with an anti corrosion additive that the kill coil stops working. Also ethylene glycol is a biocide that will kill algae. Change your coolant 2-3x per year. One of the reasons people have issues with their loops is because they buy super expensive coolant which makes it so they are unlikely to change the coolant often. Swiftech additive is $3 and using half as much as they recommend it can do 4 changes on a huge loop. It is enough to treat 2 gallons of water. If you want clear ethylene glycol with no other additives then buy feser base, It is cheap ($7) and can handle 3-4 changes.


I'm using about 5% Red Line Water Wetter this time, with another ~5% of commercial refrigerant ethylene glycol. Also, EK blocks are the clear winner when it comes to cooling VRM/VRAM, at least in the FC-Titan series. Though they fall behind the other blocks by 1-3C on the core, I'll take 5-10C cooler VRM/VRAM any day.


----------



## tsm106

Quote:


> Originally Posted by *ImJJames*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> AB is BROKEN in two ways, applying Powertune and reading GPU Usage. Both issues are documented. Ironically, the one way that works w/o a doubt is using gputweak with an asus bios (stock asus if you want idle downclocking). As I described you can use AB with an unlocked bios as a work-around. This however carries a big caveat, no Powertune ie. no idle downclocking. Since pt1 has no tdp limit, it renders Powertune tdp slider useless, since all the limits have been gutted from the bios. Thus using it with AB, it removes the throttling from smacking the TDP limit. The end result is you get to keep the fancy OSD. Btw, the new rtss works in 64bit games.
> 
> 
> 
> 
> Its the opposite for me AB works much better for 24/7 usage than GPU tweak, for some reason when I have GPU tweak open it uses FIXED voltage and I see my temps raise. The fan for GPU tweak is also buggy. If you close it, it sets the fan permanent speed to what it was when you closed GPU tweak. So I use GPU tweak for benching, AB for 24/7.

GPUTweak is an abomination, but things are what they are. And you need to read what you quoted, as most of your complaints are explained in it.


----------



## Forceman

Quote:


> Originally Posted by *psyside*
> 
> Rep +
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well, that's disappointing, is 1400 your limit? can you do one qucik valley with max out settings? or something really short


I've run it at 1450, but I haven't done much testing above that and I didn't want it to crash while I was running this. I ran Valley, which I think is more memory dependent. I realized after the first run that I hadn't reset defaults in CCC, so this is with non-standard AA options.

1100/1250: 45.5
1100/1400: 46.7

So a slight difference there.
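For perspective, the memory bump vs. the FPS gain in those Valley numbers can be worked out quickly (a sketch using only the figures posted above):

```python
# Valley @ 1100 core: memory 1250 -> 1400 MHz, FPS 45.5 -> 46.7 (figures from the post above)
mem_lo, mem_hi = 1250, 1400
fps_lo, fps_hi = 45.5, 46.7

mem_gain_pct = (mem_hi - mem_lo) / mem_lo * 100   # 12.0% more memory clock
fps_gain_pct = (fps_hi - fps_lo) / fps_lo * 100   # ~2.6% more FPS

print(f"memory +{mem_gain_pct:.1f}% -> FPS +{fps_gain_pct:.1f}%")
```

So a 12% memory overclock buys roughly 2-3% FPS in this run - consistent with the observation that memory clocks don't do much on these cards.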


----------



## Raephen

Quote:


> Originally Posted by *geggeg*
> 
> Wait, so is it confirmed that afterburner doesn't apply the power limit? I had both my cards on what I thought was +50 power limit and got them up to 1125/1350 without any issues at stock volts. So should I flash the Asus bios and try out GPU tweak?


I really don't know.

Power limit seems to work for me, as far as I know.

At least, in CCC Graphics Overdrive (not enabled), the power limit does seem to change with what I set in AB.

I'm on Beta 9.2. Maybe with the drivers that disable overclocking, AB loses the power limit too.

Extra voltage seems to be a no go, though. I checked with GPU-Z and I saw no differences between +0 and +100mV with my Sapphire 290.

I'm hoping a new version of Trixx will allow it.

I don't need the extra power just yet, but I seem to be stable at 1200 core, +50% power limit and - apparently - stock voltages. I haven't done much memory OCing - I trust my 512-bit bus will be wide enough at 1250MHz for the time being.


----------



## Ricdeau

Quote:


> Originally Posted by *psyside*
> 
> Rep +
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well, that's disappointing, is 1400 your limit? can you do one qucik valley with max out settings? or something really short


I generally agree that memory overclocks don't really do much with these cards. Running without SSAA netted no real change to speak of, but with SSAA added while in CrossFire I saw a minor increase.

Stock core and stock mem:


Stock core and 1400 mem:


1400 is as high as I can get my memory on the stock BIOS. CrossFire in both runs because I was too lazy to disable it.


----------



## Falkentyne

Quote:


> Originally Posted by *tsm106*
> 
> AB is BROKEN in two ways, applying Powertune and reading GPU Usage. Both issues are documented. Ironically, the one way that works w/o a doubt is using gputweak with an asus bios (stock asus if you want idle downclocking). As I described you can use AB with an unlocked bios as a work-around. This however carries a big caveat, no Powertune ie. no idle downclocking. Since pt1 has no tdp limit, it renders Powertune tdp slider useless, since all the limits have been gutted from the bios. Thus using it with AB, it removes the throttling from smacking the TDP limit. The end result is you get to keep the fancy OSD. Btw, the new rtss works in 64bit games.


Setting power limit works in Afterburner for me.


----------



## Jack Mac

Ugh, BF4 is a stuttering mess. I hope I don't have to upgrade my CPU for just one game. I even tried the unthinkable - turning my graphics settings up - to try and fix my issues, but it did nothing.


Anyone got a fix? Not going to switch to Windows 8; I've tried unparking cores, Task Manager, etc.


----------



## Durvelle27

Anybody with a R9 290 have Crysis 3


----------



## Forceman

Quote:


> Originally Posted by *Jack Mac*
> 
> Ugh, BF4 is a stuttering mess, hope I don't have to upgrade my CPU for just one game, I even tried the unthinkable, turning my graphics settings up to try and fix my issues but it did nothing.
> 
> 
> Anyone got a fix? Not going to switch to Windows 8, tried unparking cores, taskmanager, etc.


You have voltage monitoring turned on in Afterburner? If so, try with it off.


----------



## Jack Mac

Quote:


> Originally Posted by *Forceman*
> 
> You have voltage monitoring turned on in Afterburner? If so, try with it off.


Alright, I'll see if that fixes anything.


----------



## writer21

Quote:


> Originally Posted by *Jack Mac*
> 
> Alright, I'll see if that fixes anything.


You playing with that 3570K? If so, I'm on the same CPU and I'm locked at 100 fps with LightBoost and it's smooth. Overclocked to 4.6 @ 1.312 vcore, though. Try overclocking the CPU more.


----------



## ImJJames

Powerlimit also works for me in AB.


----------



## Jack Mac

Quote:


> Originally Posted by *writer21*
> 
> You playing with that 3570k? If so I'm on the same and I'm locked at 100 fps lightboost and its smooth. Overclocked to 4.6 though @ 1.312 vcore though. Try overclocking the cpu more.


Disabling voltage monitoring did absolutely nothing, and yeah, I'm on the 3570K. I wish it would clock higher, but it's a total dud; 4.4 isn't even stable at 1.32V, so I'm kinda stuck here at 4.2.


----------



## ImJJames

Quote:


> Originally Posted by *Jack Mac*
> 
> Disabling voltage monitoring did absolutely nothing, and yeah, I'm on the 3570k. I wish it would clock higher but it's a total dud, 4.4 isn't even stable at 1.32V..so I'm kinda stuck here at 4.2.


You still shouldn't be stuttering... I suggest you reinstall Windows. Even my FX-6300 didn't stutter with BF4.


----------



## Jack Mac

This is a fairly clean install so I doubt that's the issue. Here is the usage of my CPU+GPU:


----------



## Im Batman

Would anybody have any advice on how the MSI and Gigabyte 290X cards compare, or which one you would pick and why?

Will be picking up either one later today and I'm not sure which to get.

Thanks.


----------



## ImJJames

What kind of wattage would an R9 290 pull @ 1.4+ volts?


----------



## UNOE

Quote:


> Originally Posted by *tsm106*
> 
> AB is broke right now. That said you can use one of the unlocked bios like pt1 and not mess with powertune at all since pt1 has no tdp limit. Ergo, it makes the slider not matter.


Is there one of these BIOSes for the R9 290?


----------



## tsm106

Quote:


> Originally Posted by *ImJJames*
> 
> What kind of watt power consumption would a r9 290 pull @ 1.4+ volts?


Considering it's almost exactly the same as a 290X, and some 290s are even 290Xs in hiding, it should draw the same wattage when using an unlocked BIOS. My cards were pulling, I'm guesstimating, around 350-360W each after power conversion. Two cards at 1300/1600 and the CPU at 5GHz = over 1100W.
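That guesstimate checks out with simple addition (a sketch; the per-card figure is from the post above, while the CPU and platform figures are my own assumptions, not measurements):

```python
# Rough wall-draw estimate for the rig described above.
per_card_w = 355   # midpoint of the quoted 350-360W per card (from the post)
cards = 2
cpu_w = 300        # assumed: heavily overvolted 5GHz CPU
platform_w = 100   # assumed: board, RAM, drives, fans, pumps

total_w = cards * per_card_w + cpu_w + platform_w
print(total_w)  # 1110 - in line with the quoted "over 1100w"
```

Worth keeping in mind when sizing a PSU for two heavily overvolted cards.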


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Durvelle27*
> 
> Anybody with a R9 290 have Crysis 3


Yep


----------



## CHUNKYBOWSER

Ordered a Gigabyte R9 290 from Amazon today. It's out of stock at the moment, but hopefully it will ship out sometime this week.


----------



## Kriant

Any clues what might be causing this, and how to fix it? Temps under load are 40-42C, no in-game stuttering.


----------



## skupples

Quote:


> Originally Posted by *Jack Mac*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Ugh, BF4 is a stuttering mess, hope I don't have to upgrade my CPU for just one game, I even tried the unthinkable, turning my graphics settings up to try and fix my issues but it did nothing.
> 
> 
> Anyone got a fix? Not going to switch to Windows 8, tried unparking cores, taskmanager, etc.


Switching to Windows 8 - you are going to have to do it sooner or later. More & more games are going to start to utilize DX11.1, which means those games will stutter like crazy in Win 7. At least, this has been my experience with BF4. It's a stuttering mess on Win 7 no matter what I do, but it runs like butter on the Win 8 side of my SSD. Going through the BF4 thread, I'm not the only one to report these findings.


----------



## quakermaas

Quote:


> Originally Posted by *Kriant*
> 
> 
> 
> any clues what might be causing this, and how to fix it ? temp under load 40-42c, no in-game stuttering,


MSI AB is not displaying GPU usage correctly, it's the same for everyone.


----------



## RAFFY

Quote:


> Originally Posted by *skupples*
> 
> I'm using about 5% Red Line Water Wetter this time, with another ~5% of commercial refrigerant ethylene glycol. Also, EK blocks are the clear winner when it comes to cooling VRM/VRAM @ least in the FC-Titan series. Though they do fall behind the other blocks by 1-3c on the core, i'll take 5-10c cooler vrm/vram any day.


Quote:


> Originally Posted by *Kriant*
> 
> 
> 
> any clues what might be causing this, and how to fix it ? temp under load 40-42c, no in-game stuttering,


Quote:


> Originally Posted by *quakermaas*
> 
> MSI AB is not displaying GPU usage correctly, it's the same for everyone.


It is actually a new feature MSI added in the latest version of their overclocking software: now, while watching your GPU's performance, you can also try to get to the end of the maze!


----------



## AddictedGamer93

I found some 290's in stock! lol.


----------



## Kriant

Quote:


> Originally Posted by *quakermaas*
> 
> MSI AB is not displaying GPU usage correctly, it's the same for everyone.


Cool! Thanks, I was almost worried that my cards were having a seizure.


----------



## skupples

Quote:


> Originally Posted by *AddictedGamer93*
> 
> I found some 290's in stock! lol.


better grab em before the miners grab em.


----------



## Sgt Bilko

Quote:


> Originally Posted by *tsm106*
> 
> Considering its almost exactly the same as a 290x and some 290 are even 290x in hiding, it should consume/draw the same wattage when using an unlocked bios. My cards were pulling I'm guesstimating around 350w-360w after power conversion. Two cards at 1300/1600 and cpu at 5ghz = over 1100w.


Well, that's good to know; I was a little concerned that 2 cards in my rig might go over 1200W. I bought a "Kill-a-Watt" device (the Australian version has a different name), so I'll get some readings from that and see what my rig is pulling currently.
Quote:


> Originally Posted by *skupples*
> 
> Switching to Windows 8. You are going to have to do it sooner or later. More & more games are going to start to utilize DX11.1, which means those games will stutter like crazy in win 7. @least, this has been my experience with BF4. It's a stuttering mess on win 7 no matter what I do, but it runs like butter on the win 8 side of my SSD. Going through the BF4 thread, i'm not the only one to report these findings.


The start menu has been added back to Win 8, correct? If that's the case, then I'll switch over next time I wipe.

Windows without a start button........Pfff


----------



## Forceman

Quote:


> Originally Posted by *Kriant*
> 
> 
> 
> any clues what might be causing this, and how to fix it ? temp under load 40-42c, no in-game stuttering,


If you have voltage monitoring enabled, disable it. It can cause that.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> The start menu has been added to Win 8 correct? If thats the case then i'll switch over next time i wipe.
> 
> Windows without a start button........Pfff


The start _button_ is back, but it just takes you to the Metro start screen. Install Start8 and you are back to Win 7 operation.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Forceman*
> 
> If you have voltage monitoring enabled, disable it. It can cause that.
> The start _button_ is back, but it just takes you to the Metro start screen. Install Start8 and you are back to Win 7 operation.


Start8?

Will grab that when I install it; I'm about a month or so off. I want to do some Win7 vs Win8 benches, but I need my 290X back first!!


----------



## RAFFY

I really don't see why people need the Windows button. Just pressing Ctrl + the Windows key does the same thing.


----------



## Sgt Bilko

Quote:


> Originally Posted by *RAFFY*
> 
> I really don't see why people need the windows button. Just press ctrl + windows key does the same thing.


Start menu, I should say, then - and it's something I've just gotten used to ever since 3.1.


----------



## RAFFY

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Start menu i should say then, and it's something i've just gotten used to ever since 3.1.


For me, I rarely even use the start menu; anything of importance is in my taskbar. Upgraded to Windows 8.1 - it's awesome, even better on touch devices.


----------



## Sgt Bilko

Quote:


> Originally Posted by *RAFFY*
> 
> For me I rarely even use the start menu. Anything of importance is in my tool bar. Upgraded to windows 8.1 its awesome, even better on touch devices.


I'm willing to try it and reserve judgement until then









Here's hoping I don't need to re-download all my games. 900GB at 2 Mbps with a monthly download cap of 70GB......... not fun, really really not fun.

EDIT: zomg......666 posts, it's a sign....I need a 290x Devil !!!!!


----------



## Newbie2009

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am currently running @ 1100MHz stock voltage. Should be fine @ 0%?


I only hit a TDP wall with +50mV with it @ 0%.


----------



## Emmett

Quote:


> Originally Posted by *Jpmboy*
> 
> Daaaaum, tht's nasty. Do the loops share the coolant? If yes, flush the whole system completely... dab plating?
> 
> With Cu, Brass and nickel (solder-less rads?) WHY add a kill coil? It's redundant. Copper has the same effect (by the same mechanism) on anything biotic. If the plating shows ANY SIGNS of corrosion or pitting, the block is borked. If Not: Flush completely with tap water - lots. Rinse with distilled, then use only distilled water and 3-10% of Redline water Wetter.
> Check the pH with a pool test kit. No less than 6.5.


No, 3 separate loops, just for fun.

I rinsed the block through with distilled, then re-looped with coolant.

Was gonna use Water Wetter but read it's not good to use with acrylic.

Won't ever use a kill coil again.

The next time I block GPUs it will be with all new hardware; I will not reuse any of this for it.

I think I was turned off by coolant because one time I used Feser (pretty sure) and my temps went up a good bit, but I recently read Feser is heavy on the anti-corrosion stuff, so it cools a bit poorer. At least that's how I read it.

Is this decent?

+50 Core and power limits

1120/1475

http://www.3dmark.com/3dm11/7633138



EDIT: Added a bench score


----------



## maynard14

Hmmm, I'm seeing other people's 3DMark11 scores at 12k or higher,

but mine is only http://www.3dmark.com/3dm11/7633317

I don't know if my card is weak or not.

I'm using a 4.4 GHz Core i5 3570K and the latest beta drivers from AMD, 13.11 beta 9.5.

Is my score normal at all-stock clocks and memory?


----------



## rdr09

Quote:


> Originally Posted by *maynard14*
> 
> hmmmm im seeing other peoples 3dmark11 score 12k higher
> 
> but mine is only http://www.3dmark.com/3dm11/7633317
> 
> i dont know if my card is weak or not
> 
> im using 4.4 ghz core i5 3570k and latest beta drivers from amd 13.11 beta 9.5
> 
> is my score normal all stock clocks and memory ?


Compare the graphics scores, not the overall, or compare with closely similar systems. If your GPU is at stock, then that's normal. Here's mine at 1200/1500 (no voltage control, just maxed power limit). ASIC is 79%, if that matters, and it's watercooled...

http://www.3dmark.com/3dm11/7556915


----------



## Durvelle27

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Yep


What FPS do you get, and at what settings?


----------



## Banedox

Quote:


> Originally Posted by *maynard14*
> 
> hmmmm im seeing other peoples 3dmark11 score 12k higher
> 
> but mine is only http://www.3dmark.com/3dm11/7633317
> 
> i dont know if my card is weak or not
> 
> im using 4.4 ghz core i5 3570k and latest beta drivers from amd 13.11 beta 9.5
> 
> is my score normal all stock clocks and memory ?


Your graphics score is over 13000, which is higher than mine, and I'm running a 290 unlocked to a 290X... I'm only pulling about 11000.

Your physics score is CPU (the CPUs pulling 12k plus are probably hyperthreaded, and yours is not).


----------



## Durvelle27

Quote:


> Originally Posted by *maynard14*
> 
> hmmmm im seeing other peoples 3dmark11 score 12k higher
> 
> but mine is only http://www.3dmark.com/3dm11/7633317
> 
> i dont know if my card is weak or not
> 
> im using 4.4 ghz core i5 3570k and latest beta drivers from amd 13.11 beta 9.5
> 
> is my score normal all stock clocks and memory ?


My 290 at stock

http://www.3dmark.com/3dm11/7624361


----------



## psyside

Quote:


> Originally Posted by *Forceman*
> 
> I've run it at 1450, but I haven't done much testing above that and I didn't want it to crash while I was running this. I ran Valley, which I think is more memory dependent. I realized after the first run that I hadn't reset defaults in CCC, so this is with non-standard AA options.
> 
> 1100/1250: 45.5
> 1100/1400: 46.7
> 
> So a slight different there.


Thanks a lot!







rep+

Also, you're using the same display as me. Epic unit; I'll never be happy with anything else.








Quote:


> Originally Posted by *Ricdeau*
> 
> I generally agree the memory overclocks don't really do much with these cards. Running without SSAA netted no real change to speak of, but adding in the SSAA while in crossfire I saw a minor increase.
> 
> 1400 is as high as I can get my memory on the stock BIOS. Crossfire in both runs because I was too lazy to disable.


Thanks buddy, rep + as well.


----------



## Banedox

Quote:


> Originally Posted by *Durvelle27*
> 
> My 290 at stock
> 
> http://www.3dmark.com/3dm11/7624361


Yeah, alright, I think my 290 is garbage; y'all are pulling like 2000 more points on the graphics score than I am, and I have a 290 unlocked into a 290(X)...

This blows. On a good note, I bought it on Amazon, so thanks to the holidays I have a return window until January 30th. Though now I'd just have an EK waterblock sitting around, and I really don't want to pay a 20% restocking fee...


----------



## rdr09

Quote:


> Originally Posted by *Banedox*
> 
> Your Graphics Score is over 13000, which is higher than mine and im running a 290x unlocked.... and am only pulling about 11000..
> 
> Your Physics score is CPU (which all CPUs pulling 12k plus are probably because they are hyperthreaded and yours is not..)


something ain't right.


----------



## maynard14

Quote:


> Originally Posted by *Durvelle27*
> 
> My 290 at stock
> 
> http://www.3dmark.com/3dm11/7624361


Thanks again for all the replies. Maybe my 3570K is affecting my 3DMark11 score... because of the hyper-threading thing.

But then again, maybe not all 290 cards at stock are the same.


----------



## Joeking78

Quote:


> Originally Posted by *maynard14*
> 
> hmmmm im seeing other peoples 3dmark11 score 12k higher
> 
> but mine is only http://www.3dmark.com/3dm11/7633317
> 
> i dont know if my card is weak or not
> 
> im using 4.4 ghz core i5 3570k and latest beta drivers from amd 13.11 beta 9.5
> 
> is my score normal all stock clocks and memory ?


Don't bench with the 9.5 drivers.

I installed them last night and got considerably fewer points in 3DMark compared to the 9.2 drivers.


----------



## Banedox

Quote:


> Originally Posted by *Joeking78*
> 
> Don't bench with 9.5 drivers.
> 
> I installed them last night and got considerably less points in 3dmark compared to 9.2 drivers.


Gosh AMD needs to get their driver crap together, maybe I can heckle my friend who works on their driver testing team....


----------



## Joeking78

Quote:


> Originally Posted by *Banedox*
> 
> Gosh AMD needs to get their driver crap together, maybe I can heckle my friend who works on their driver testing team....


If you're using 9.5, revert back to 9.2 and see the difference.

I thought one of my cards had stopped working when I benched with 9.5... in the second test in 3DMark 11 I was getting 200-215 FPS with 9.5; with 9.2 I was getting 280+.


----------



## Jaju123

http://www.3dmark.com/3dm/1815475?

Does this seem correct to everyone? 2x Stock 290s in crossfire.


----------



## Durvelle27

Quote:


> Originally Posted by *Joeking78*
> 
> Don't bench with 9.5 drivers.
> 
> I installed them last night and got considerably less points in 3dmark compared to 9.2 drivers.


I'm using 13.11 Beta v9.5


----------



## HOMECINEMA-PC

Dang it, man, my Sapphire is locked, but it still clocks and games pretty well for a 69.8% ASIC card. Gonna get another one tomorrow, but not Sapphire; MSI or Gigabyte if I can...


----------



## Durvelle27

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Dang it man my Sapphy is dang locked , but still clocks and games pretty good for 69.8% asic card . Gonna gets another one tomorrow . But not sapphire , if i can msi or giga .......


I also have a Sapphire R9 290 that's not unlockable, with a 71.8% ASIC.


----------



## Joeking78

Quote:


> Originally Posted by *Durvelle27*
> 
> I'm using 13.11 Beta v9.5


Compare them to 9.2... it might just have been an installation problem with 9.5 (though the install log was fine); 9.2 blew 9.5 away in my 3DMark 11 run.


----------



## cowie

Well, I killed my 290X yesterday with 1.47 volts; smoked a middle VRM. It had an H70 on it with custom heatsinks. Was great till I got greedy.







.....next


----------



## MojoW

Damn, I can't flash my 290 to a 290X BIOS (I need it for overclocking beyond where I'm at now).
Every 290X BIOS I flash results in no display output; it doesn't matter if it's the Asus, AMD, Sapphire, or even the modded PT1 or PT3 BIOS.
So is my 290 not compatible with a 290X BIOS?


----------



## Durvelle27

Quote:


> Originally Posted by *MojoW*
> 
> Damn i can't flash my 290 to a 290x bios.(need it for overclocking beyond where i am at now)
> Every 290x bios i flash causes no output it doesn't matter if it's asus, amd, sapphire or even the modded pt1 or pt3 bios.
> So is my 290 not compatible with a 290x bios???


All 290s are compatible, as they're all reference. The PT1 BIOS works great on my 290.

Quote:


> Originally Posted by *Joeking78*
> 
> Compare them to 9.2...It might just have been an installastion problem with 9.5 (but the install log was fine), 9.2 blew 9.5 away in my 3dmark 11 run.


Compare your graphics score to mine.


----------



## Joeking78

Quote:


> Originally Posted by *Durvelle27*
> 
> All 290s are compatible as there all reference. PT1 BIOs works great on my 290
> Compare your scores to mines. Graphics score


I've got three cards; that wouldn't be a fair comparison.









I'll find some old single-card results on my other system.


----------



## GroupB

So I received my 290s last week: two Gigabyte 290s, one with Hynix RAM and one with Elpida RAM. The Elpida one has major coil whine that is louder than the fan at 45%.

I spoke to the shop and they offered to RMA it, but since I really need a reference card for the upcoming waterblock, I'm stuck with it. The shop is out of reference 290s, and their next shipment will probably be the incoming third-party-cooler ones, because they simply removed the reference-card SKU from their website. I don't want to take the chance of sending it back only to get a refund, or to have it replaced by a non-reference card. So I'm a little peeved at spending $900 to get one card that buzzes...

I guess there's no unlock for me.



I tested them; they OC to 1117/1475 without artifacts, so I've settled at 1100/1400 for now. I will try higher with voltage when I receive the waterblocks.

My CPU is old and is my bottleneck right now. I score 14657 in 3DMark11; an overclocked 8350 at 5GHz gets 16k, so this is not bad for a 1090T, but it really makes me think about the next CPU. Spending $200 to upgrade to an 8350 is not the way to go; the gain would be small. I guess I'll be forced to switch to Intel and return to quad core... I enjoy my 6 cores for rendering and such, so going to 4 cores doesn't look so good to me.

Anyway, here's my score with the 1090T OC'd, stable 24/7 @ 1.55V (can't push it any more without going to 1.6V):
http://www.3dmark.com/3dm11/7631695

I use Eyefinity when gaming (one reason why I still have that old CPU), so I tested the Valley benchmark with the 5760x1080 rule

and got 28.4 FPS, score 1189, min 15, max 49.9.

I'm happy I beat that i7 930 / Titan SLI setup by 5.1 FPS.

PS: using the 9.5 driver.


----------



## maynard14

Quote:


> Originally Posted by *rdr09*
> 
> compare the graphics scores not the overall or compare with close to similar systems. if your gpu is at stock, then that there is normal. here is mine at 1200/1500 (no voltage control just maxed power limit). ASIC is 79% if that matters and it is watercooled . . .
> 
> http://www.3dmark.com/3dm11/7556915


Thank you so much for the info, sir.


----------



## Durvelle27

Quote:


> Originally Posted by *Joeking78*
> 
> I got three cards, that wouldn't be a fair comparison
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll fiind some old single card results on my other system.


How many single-card runs do you have?


----------



## MojoW

Quote:


> Originally Posted by *Durvelle27*
> 
> All 290s are compatible as there all reference. PT1 BIOs works great on my 290
> Compare your scores to mines. Graphics score


That's what I thought, but after 20 flash attempts I'm not sure every 290 is compatible with a 290X BIOS.
The strange thing is, I bought mine together with a friend, and his 290 runs a 290X BIOS without problems.
I guess I'm just unlucky.


----------



## GroupB

I have a couple of runs for stock 290s on my weak CPU:

My Hynix card scores 13441 graphics, 6879 physics, 6211 combined, P10671 @ stock.

My Elpida card scores 13402 graphics, 6843 physics, 6158 combined, P10629 @ stock.


----------



## Durvelle27

Quote:


> Originally Posted by *MojoW*
> 
> That's what i thought but after 20 times of flashing i'm not sure that every 290 is compatible with a 290x bios.
> Strange thing is i bought mine together with a friend of mine, and his 290 has a 290x bios on it without problems.
> I guess i'm just unlucky


What are you using to flash?

I used atiwinflash

with this command in an administrative command prompt:

atiwinflash -f -p 0 (Name).rom
Quote:


> Originally Posted by *GroupB*
> 
> I have a couple run for stock 290 on my weak CPU
> 
> my hynix score 13441 graphic -- phy 6879 combi 6211 , P10671 @ stock
> 
> my elpida score 13402 graphic -- phy 6843 combi 6158 P10629 @ stock


Stock

Graphics Score: 13795


----------



## MojoW

Quote:


> Originally Posted by *Durvelle27*
> 
> What are you using to flash.
> 
> I used atiwinflash
> 
> with these commands
> 
> atiwinflash -f -p 0 (Name).rom in CMD Administrative
> Stock
> 
> Graphics Score: 13795


I can flash; I use the latest atiflash, and not in CMD Admin but in DOS.
Flashing doesn't go wrong when I flash a 290 BIOS, because there is display output and the card works.
When I flash a 290X BIOS the card gives no output, but it is recognized by the mobo.
And as I said, I've tried multiple times, so it's not like it was a bad flash.

Edit: I always flash in DOS with atiflash; that's safer.


----------



## rdr09

Quote:


> Originally Posted by *Banedox*
> 
> Gosh AMD needs to get their driver crap together, maybe I can heckle my friend who works on their driver testing team....


Or something went wrong when you flashed it to an X. What are your temps?


----------



## Banedox

Quote:


> Originally Posted by *rdr09*
> 
> or something went wrong when you flashed it to X. what are your temps?


My temps seem to be fine; I max out at 70C with my fan at 80% under full load... I'll give a couple of other BIOSes a shot, but I think it's just the newest AMD driver, since that's what I'm running. Or I could have damaged my card when I ran the PT3 BIOS at 1.5V, because apparently the card didn't return to stock settings when I rebooted...


----------



## Sgt Bilko

Quote:


> Originally Posted by *Banedox*
> 
> Gosh AMD needs to get their driver crap together, maybe I can heckle my friend who works on their driver testing team....


I was using the 13.11 WHQL driver, and it worked better for me than any of the betas (I didn't try 9.5, though).

In other news, RMA Update Number 2!!

My card has been shipped off to the distributor to be tested; hopefully I'll know more in the next few days.


----------



## crun

My Gigabyte R9 290 (unlocked to 290X) has an 81.5% ASIC. Pretty high, lol. Is that normal for the 290 series? Haven't overclocked/undervolted it yet!


----------



## rdr09

Quote:


> Originally Posted by *Banedox*
> 
> My temps seem to be fine, i only max out at max 70C when I have my fan going at 80% under max load... I'll give a couple other bios a shot but Im think it is just the newest amd driver cause that is what Im running, or I could have burned my card, when i ran the PT3 bios at 1.5v cause apparently the card didnt return to stock setting when i rebooted.....


A 100-200 point difference among the different drivers may be normal, but your score is really out of whack, so I don't think it's the driver.


----------



## Banedox

Quote:


> Originally Posted by *rdr09*
> 
> 100 - 200 points difference among the the different drivers may be normal but your score is really out of whack. so, i don't think its driver.


Yeah, I know; sadly, I'm not sure what's up with my card...


----------



## Durvelle27

Comparison between my ole HD 7970 @1280/1850 & R9 290 @1215/1450

http://www.3dmark.com/compare/3dm11/7626071/3dm11/7462575

Pretty happy with my purchase


----------



## chiknnwatrmln

Quote:


> Originally Posted by *MojoW*
> 
> Damn i can't flash my 290 to a 290x bios.(need it for overclocking beyond where i am at now)
> Every 290x bios i flash causes no output it doesn't matter if it's asus, amd, sapphire or even the modded pt1 or pt3 bios.
> So is my 290 not compatible with a 290x bios???


Update your motherboard BIOS. Had the same problem, that fixed it.


----------



## Joeking78

Just noticed I have the AB/CCC power limit bug... it wasn't applied in CCC. No wonder my cards were struggling all this time.


----------



## Newbie2009

Quote:


> Originally Posted by *Joeking78*
> 
> Just noticed I got the AB/CCC bug with power limit...wasn't applied in CCC...no wonder my cards were struggling all this time


Yeah that one left me scratching my head for a while too.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Durvelle27*
> 
> What FPS do you get and what settings


I have only played it @ 1440p, everything on Very High with FXAA. The intro scene where it's raining is where I test, and at this resolution I get an average of 37 FPS, lol.

I can test it at 1080p tonight with MSAA if you want, or just let me know what settings and I'll try them out.

This is with a 2600K @ 4.0GHz; the GPU is at 1150/1500.

Edit: I will say, though, that the first scene you start in is the worst; after that, in-game I get around 45-60 FPS with occasional drops to the mid 30s during certain scenes.


----------



## Slomo4shO

Should I pick up a 4930K + ASUS Rampage IV Black Edition and replace my 4770K + ASUS Maximus VI Extreme for quad-fire 290s that have been unlocked to 290X?

I finally saw the Rampage IV Black Edition back in stock and am wondering if it would be worthwhile to move to a hex-core from my 4.8GHz delidded Haswell.


----------



## Durvelle27

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I have only played it @ 1440p. Everything on very high, FXAA the intro scene where it's raining is where I test and at this resolution I get an avg of 37fps lol.
> 
> I can test it at 1080p tonight with MSAA if ya want or just let me know what settings I'll try it out.
> 
> This is with a 2600k @ 4.0ghz, gpu is at 1150/1500.
> 
> edit: I will say though that the first scene you start in is the worst, after that in game I get around 45 - 60fps with the occasional drops to mid 30's during certain scenes.


1080p Very High w/ FXAA would be nice

I'm running at those settings and @1100/1450 and I'm only getting 40-60 FPS


----------



## velocityx

Quote:


> Originally Posted by *Joeking78*
> 
> Just noticed I got the AB/CCC bug with power limit...wasn't applied in CCC...no wonder my cards were struggling all this time


Quote:


> Originally Posted by *Newbie2009*
> 
> Yeah that one left me scratching my head for a while too.


What is this bug you guys are talking about?

Strangely, my AB has sliders for core voltage and core clock, which is normal, and next to those there are memory clock and fan speed sliders. But instead of the power limit slider, I get an inactive shader clock slider. Weird.


----------



## Durvelle27

Man, this card plus my system draws some serious power. I keep tripping the breaker in my house with this thing highly OC'd.


----------



## Joeking78

Quote:


> Originally Posted by *velocityx*
> 
> what is this bug you guys are talking about?
> 
> strangely, my ab has sliders for core voltage, core clock, and thats normal, next to that, theres memory clock and fan speed slider, but I see that instead of the power limit slider, I get an inactive shader clock slider. weird.


If I set the power limit in AB and then check CCC, on my system it isn't reflected... I have to set it manually in AB, then manually in CCC. No idea why this happens.


----------



## Derpinheimer

That's odd... mine draws about 200W at stock [except memory, at 1500] with CGMiner.

It pushes toward 300W with overvolting...

How could that trip your breaker? People have 1500W systems, lol.
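Derpinheimer's skepticism is easy to check with back-of-envelope arithmetic. A minimal sketch, assuming a typical 15 A / 120 V North American circuit, an 85%-efficient PSU, and ballpark component draws; none of these figures come from the posts themselves:

```python
# Back-of-envelope check: can one overclocked 290 + FX-8350 rig trip a
# household breaker? All numbers below are illustrative assumptions.

BREAKER_AMPS = 15        # common North American household circuit (assumed)
MAINS_VOLTS = 120
PSU_EFFICIENCY = 0.85    # ballpark for an 80 Plus Bronze unit under load

circuit_watts = BREAKER_AMPS * MAINS_VOLTS   # 1800 W available at the wall

# Assumed DC-side draw of a heavily overvolted system:
gpu_watts, cpu_watts, rest_watts = 350, 250, 100
wall_watts = (gpu_watts + cpu_watts + rest_watts) / PSU_EFFICIENCY

print(f"circuit capacity:    {circuit_watts} W")
print(f"estimated wall draw: {wall_watts:.0f} W")
# One rig alone sits well under the limit, so tripping a breaker suggests
# other loads (heater, monitors, etc.) are sharing the same circuit.
```

Even with generous draw estimates the single rig lands under half the circuit's capacity, which is why a tripped breaker usually points at whatever else is plugged into the same circuit.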


----------



## velocityx

I discovered I have a small issue with my 290.

When I enter my UEFI BIOS, I only see the top-left corner of the image, as if the image were too big for the screen. There are also artifacts in the BIOS, like a scrambled background, etc.


----------



## Gero2013

Has anyone else noticed really weird skipping when playing back videos in Media Player Classic? It started after I completely uninstalled all the graphics drivers on my PC with Driver Sweeper and installed the latest AMD drivers.
The problem is with videos recorded at 60fps.

For 30fps videos, playback isn't as buttery smooth as it used to be; it's a little jerky and kind of slow...

Any idea what the cause is and how it can be fixed?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Durvelle27*
> 
> 1080p Very High w/ FXAA would be nice
> 
> I'm running at those settings and @1100/1450 and I'm only getting 40-60 FPS


I'll do a bench of it tonight after work


----------



## ImJJames

My 7970 vs R9 290 comparison; the same CPU and memory clocks were used.

http://www.3dmark.com/compare/fs/1268103/fs/1238668


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Gero2013*
> 
> anyone else notice really weird skipping when playing back videos using Media Player Classic since I completely uninstalled all graphics drivers on my PC with Driver Sweeper and installed the latest AMD drivers.
> The problem is for videos recorded at 60fps.
> 
> For 30fps videos the replay isn't as buttery smooth as it used to be, but rather a little bit jerky and kinda slow....
> 
> Any idea what the cause is and how this can be fixed?


YEP! I use SVM, so my issue could be that, but it didn't seem to do it before. Mine is interpolating, though, and now also playing back at like 96fps, hahaha, so I've gotta sort my other issues out first. But you're not alone.


----------



## Durvelle27

Quote:


> Originally Posted by *Derpinheimer*
> 
> Thats odd.. mine is drawing about 200W, stock [except memory, 1500] with CGMiner.
> 
> Pushes toward 300W with overvolting...
> 
> How could that trip your breaker? People have 1500w systems lol.


I'm not at stock. R9 290 @1215/1450 1.456v and FX-8350 @5.1GHz @1.534v

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I'll do a bench of it tonight after work


OK thank you


----------



## tsm106

Quote:


> Originally Posted by *Slomo4shO*
> 
> Should I pickup a 4930K + ASUS Rampage IV Black Edition and replace my 4770K + ASUS Maximus VI Extreme for quad fire 290s that have been locked to 290X?
> 
> I finally saw the Rampage IV Black Edition back in stock and am wondering if it would be worthwhile to move to a hexcore from my 4.8GHz delidded Haswell.


For the sake of science, you should at least try the Ivy setup first. We don't know how the DMA engines will be affected by the PLX chip. You can compare your benches to a few hexacore quad-card systems out there. If it stinks, then you can make the switch with a very good reason to.


----------



## JimmieRustle

What thermal pads should I get?

I put the EK water block on my 290X, and my VRM1 temps reach over 100C.

I didn't put thermal paste on the VRM pads


----------



## Banedox

Quote:


> Originally Posted by *JimmieRustle*
> 
> What thermal pads should I get?
> 
> I put the EK water block on my 290x and my VRM1 temp reach over 100C
> 
> I didn't put thermal paste on the VRM pads


Those temps seem pretty high; what are your clocks at?


----------



## Gero2013

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> YEP! I use SVM so my issue could be that but it didn't seem to do it before. Mine is interpolating though and now also playing back at like 96fps haahahha so I gotta sort my other issues out first but you're not alone.


Oh, OK. Well, on mine now even the 30fps playback is kind of slow, and the sound drops out every 3s as well...

This is for *.avi*, btw;
*.mp4* is totally fine... seriously, what could be the issue here?

Have you done any tests to check what it could be?

The funny thing is it just randomly happened today; I was watching .avi files last night with no problem.


----------



## ImJJames

Quote:


> Originally Posted by *Durvelle27*
> 
> I'm not at stock. R9 290 @1215/1450 1.456v and FX-8350 @5.1GHz @1.534v
> OK thank you


PT1 BIOS? If so, you're running more like 1.41 volts because of vdroop.


----------



## MojoW

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Update your motherboard BIOS. Had the same problem, that fixed it.


Not the fix for me, as I was already on the latest official BIOS, and I tried the latest beta with the same problem.
But I'm gonna try a different mobo, just to see if the mobo (BIOS) is the problem.
Thanks; now I at least know what the problem could be.


----------



## tsm106

Quote:


> Originally Posted by *JimmieRustle*
> 
> What thermal pads should I get?
> 
> I put the EK water block on my 290x and my VRM1 temp reach over 100C
> 
> I didn't put thermal paste on the VRM pads


You did something terribly wrong. Double check your mount ASAP!


----------



## Derpinheimer

Does anyone know if non-reference cards fix the coil whine issues? I know the 79XX series was plagued by it too, but I'm not sure if the non-reference cards were less prone to it.

I can't stand the whine/buzz on my cards. It's so loud... but it unlocks :/


----------



## Durvelle27

Quote:


> Originally Posted by *ImJJames*
> 
> PT 1 bios? If so, you're running more like 1.41 volts because of Vdroop


Yes, I'm using the PT1 BIOS.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Gero2013*
> 
> oh kk, well on my one now even the 30fps is kinda slow and sound drops every 3s as well.....
> 
> this is for *.avi* btw
> *.mp4* is totally fine.... seriously what could be the issue here??
> 
> have you done any tests to check what it could be?
> 
> the funny thing is it just randomly happened today actually, I was watching .avi files last night no problem


For me 30fps is fine, I only seem to be having the issue when it's trying to interpolate up to 96fps. I think my particular issue is only with SVM and the video file together. I'll test it out again tonight and pinpoint when it actually happens.


----------



## Raephen

Ok, I received a reply from Aquacomputer regarding the freakish delta between my VRM temps with their block.

It appears to be a known issue, with a reasonably simple solution.

Hello Ralph,

some days ago I had a very similar problem with one customer. After some testing he found out that his card is mechanically slightly different from a "normal" card since with a block from another company he faced the same problem.

He solved it by using slightly thicker thermal pads. If you take of the block you should check the marks on the pads. I assume you will see some areas with less impact. Using two layers so you get 1mm would be too much in my opinion. A 1mm pad could be fine but in this case you should watch out for a soft variant which is not as firm as the pads from us.

I would not worry too much about to get a specific pad since the provided pad is fine but in your case just to small in height. Just get a thermal pad which is a bit softer and higher. Even a cheap pad will do a much better job than your current situation.

kind regards
Sven Hanisch

I have some Phobya 1mm-thick 7 W/mK padding lying around, so I thought I'd give it a try.

With success: after some Valley Extreme HD, VRM 1 & 2 top out at 44C and 41C, respectively.

Under an extreme load (i.e. Furmark) the gap between them is still large (73C for VRM1 and 52C for VRM2), but that's roughly a 20C difference instead of 40C; VRM1 topped out at 94C under Furmark before.

And yes, I know Furmark is a rubbish benchmark, but it's good for one thing: cooking your card


----------



## tsm106

^^Seems they have a large degree of variance with their block.


----------



## JimmieRustle

It doesn't matter what clock.

It takes a while to get there when I game, but it gets there pretty fast when mining.

I've undervolted, and it will stay in the low 90s.


----------



## MrWhiteRX7

Are people getting good clocks overall with the Gigabyte 290? I found a supplier that has some in stock; I might go that route, even though I'd rather have matching Sapphires.


----------



## JimmieRustle

Quote:


> Originally Posted by *tsm106*
> 
> You did something terribly wrong. Double check your mount ASAP!


Yeah, that's what I'm going to do, but I probably need new thermal pads. Which ones should I get?


----------



## tsm106

Quote:


> Originally Posted by *JimmieRustle*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> You did something terribly wrong. Double check your mount ASAP!
> 
> 
> 
> yeah that's what i'm going to do, but i probably need new thermal pads. which ones should i get?
Click to expand...

It depends on your goals and budget. A step up from the stock 5 W/mK pads are the white Phobya 7 W/mK. Then you move up to the Fujipoly Extremes at 11 W/mK and higher. Prices go up really fast.


----------



## Derpinheimer

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Are people getting overall good clocks with the Gigabyte 290? I found a supplier that has some in stock, might go that route even though I'd rather have matching sapphy's


AFAIK you won't find a difference in overclockability between reference cards. Brand doesn't matter; it's just luck.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Derpinheimer*
> 
> AFAIK you wont find a difference in overclockability between reference cards. Brand doesnt matter; just luck.


That's what I figured, thanks! Time for tri-fire!!!


----------



## MrWhiteRX7

Orrrrrrrrrr not, lol. They had 14 in stock and all were gone within five minutes.


----------



## y2kcamaross

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> That's what I figured, thanks! Time for tri-fire!!!


Who is your supplier!!!


----------



## MrWhiteRX7

Quote:


> Originally Posted by *y2kcamaross*
> 
> Who is your supplier!!!


They're out of stock already


----------



## Gero2013

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> For me 30fps is fine, I only seem to be having the issue when it's trying to interpolate up to 96fps. I think my particular issue is only with SVM and the video file together. I'll test it out again tonight and pinpoint when it actually happens.


KK, could you try testing different file types?


----------



## MrWhiteRX7

Yep, I'll disable the SVM program first, then test .avi, .mp4, etc.


----------



## JimmieRustle

Quote:


> Originally Posted by *tsm106*
> 
> It depends on your goals and budget. Something slightly better than stock 5w/mk pads are the white Phobya 7w/mk. Then you start to move up to the Fuji Extremes at 11w/mk and higher. Prices go up real fast.


What thickness should I look for with the Fujipoly Extremes?


----------



## dansi

The new BF4 patch is looking good; no more black screen after 1 hr of play. I'm finally able to unleash memory overclocking, as pre-patch the memory was iffy and resulted in black-screen hangs!


----------



## tsm106

Quote:


> Originally Posted by *JimmieRustle*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> It depends on your goals and budget. Something slightly better than stock 5w/mk pads are the white Phobya 7w/mk. Then you start to move up to the Fuji Extremes at 11w/mk and higher. Prices go up real fast.
> 
> 
> 
> what thickness should i look for with fuji extremes?
Click to expand...

Same as stock: 0.5mm for memory and 1mm for the VRM.


----------



## kizwan

I can't wait to join this club. I'm going to get two 290s.


----------



## Raxus

So I decided not to watercool my system, for various reasons. I'll be returning the R9 290X and waiting to see how the aftermarket coolers turn out; I'm once again considering just jumping ship to the 780 clan.

Great card, but man, that cooling solution. Even though my H80 is louder (replacing it shortly), the R9 290X gives it a run for its money.

Remove me from the list for now; I'll apply again if I pick up another one.


----------



## JimmieRustle

Quote:


> Originally Posted by *tsm106*
> 
> Same as stock, .5mm for memory and 1mm for vrm.


How many 1/4 sheets do I need for two 290Xs? (I plan on getting a second 290X when they become available, lol.)


----------



## y2kcamaross

Amazon has the Sapphire R9 290s in stock as of right now, but even they are asking $499.99... no thanks.


----------



## VSG

Quote:


> Originally Posted by *JimmieRustle*
> 
> how many 1/4 sheets do i need for two 290x (i plan on getting a second 290x when they become available lol)?


Buy a quarter sheet of the Fujipoly Extremes (150x100mm, I believe); it will be enough for 3-4 GPUs and can be reused as well.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *y2kcamaross*
> 
> Amazon has the sapphire r9 290's in stock as of right now,but even they are asking $499.99...no thanks.


What the hell... I'm willing to go up to about $459, but not $500.


----------



## GroupB

I'm still doing some tests with the 290 crossfire and I found something odd.

I run all my tests at 5760x1080.

I ran the GRID 2 bench for like 3 hours without a problem: stable temps, clocks, and framerate, no drops, perfect...

Then I tried BF4 again, all ultra, no AA. I got around 95 fps with a few drops into the 70s, but only for the first minute; then the fps dropped badly, down to 40 fps. What happened during that time? Well, the memory filled up slowly: the fps was nice until it hit 3.2 GB and started to drop, and memory filled up completely at 4.2 GB. After that the fps is constant but really bad, around 50 fps.

GRID's memory usage was only in the 2 GB range, with constant FPS.

It's odd, because I tried disabling crossfire and running BF4 on one card, and the same thing happened: the first minute was nice, then performance dropped at around 2 GB; usage stayed around 2.2 GB and did not fill up to 4.2 GB like in the crossfire setup.

I wonder if the drop I see is due to the memory filling up and not having enough left...

It's strange to me that one card loads memory to 2 GB and two cards to 4 GB; after all, memory in crossfire doesn't add up... these should read the same, right?

I'm using Afterburner; maybe there's a bug there, I don't know.


----------



## y2kcamaross

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> What the hell... I'm willing to go up to about 459.00 but not 500


They are even sold by Amazon... price-gouging bastards.


----------



## FragZero

Anyone here who ran any benches with 2x 290 crossfire on gen 2 x8 slots?

I was going to get a 7990, but they can't deliver it anymore, so I'm going for the next step, dual R9 290s, but I'm afraid my motherboard will bottleneck these.

Anyone with some input regarding this topic?


----------



## JimmieRustle

Quote:


> Originally Posted by *geggeg*
> 
> Buy a quarter sheet of the Fujipoly extremes (150x100mm I believe), they will be enough for 3-4 GPU's and can be reused also.


I can't seem to find this in the 17 W/mK. I'm guessing the 17 is better than the 11? Why must they make it so complicated lol


----------



## GroupB

I ran them in 2.0 8X... working except for the prob I just post.

I did the same test on bf3 and the same thing happend , when the memory get in the 3 GB frame drop from 100 to 40

im gonna reinstall splinter cell to test that memory filling thing again see if its only related to DICE game.


----------



## VSG

Quote:


> Originally Posted by *JimmieRustle*
> 
> i can't seem to find this in the 17 W/mk. I'm guessing the 17 is better than the 11? Why must they make it so complicated lol


The 17 W/mK ones are the very expensive Ultra Extremes, and you can only buy them in strips or a full sheet. That was too rich for my taste!


----------



## Slomo4shO

Quote:


> Originally Posted by *tsm106*
> 
> You can compare your benches to a few hexacore quad systems out there. If it stinks, then you can make the switch having a very good reason to.


I am not finding any quad-fire 290 setups on Z77 or Z87 anywhere and my custom loop isn't going to be up until the end of the month. Hmm...


----------



## Durvelle27

Sucks. Can't find any blocks anywhere.


----------



## Slomo4shO

Quote:


> Originally Posted by *JimmieRustle*
> 
> i can't seem to find this in the 17 W/mk. I'm guessing the 17 is better than the 11? Why must they make it so complicated lol


Performance-PCS only carries the Extreme. Frozen CPU has the Ultra Extreme but only in 60x50 sheets.


----------



## HardwareDecoder

What is the max safe voltage on these? Just got my Sapphire 290X (legit 290X) and flashed the PT1 BIOS (unlocked voltage).

Ran Valley @ 1200 core, no memory OC, 1.425 volts.

I don't think I saw any artifacts. Can't really push the GPU any harder with these volts, and I don't want to go up much more till we get some solid info on these cards....

Max temp was 87C with the stock fan @ 70%; sounds like a Rolls-Royce jet engine. Waterblock comes in tomorrow.

Just tested ASIC.... 78.1%. Nice! Never had a card with a decent ASIC score before.... this one might be a keeper.


----------



## Banedox

Quote:


> Originally Posted by *HardwareDecoder*
> 
> What is the max safe voltage on these? Just got my sapphire 290x (legit 290x) and flashed PT1 bios (unlocked voltage)
> 
> ran valley @ 1200 vcore no memory oc 1.425 volts
> 
> 
> 
> don't think I saw any artifacts, Can't really push the gpu any harder with these volts and don't want to go up much more till we get some solid info on these cards....
> 
> max temp was 87C with the stock fan @ 70% sounds like a rolls royce jet engine. Waterblock comes in tomorrow


What the heck settings are you running in Valley? Cause I get about 800 more score than you on my stock unlocked 290.


----------



## OverSightX

Quick question about memory and core clocks for a flashed Sapphire 290: the memory and core clocks continually jump up and down. Is this normal? They never jumped this constantly with my 7970 CF setup. I'm sure it's nothing to worry about, BUT it may be.

Sorry if this has been brought up before. Didn't see anything.

A note about the SS: taken over a remote session, and there are 3 monitors in non-Eyefinity mode connected to the 290.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Banedox*
> 
> What the heck setting are you running in valley, cause I get about 800 more score than you on my stock unlocked 290


are you running @ 1440p w/ everything maxed?


----------



## Forceman

Quote:


> Originally Posted by *HardwareDecoder*
> 
> What is the max safe voltage on these? Just got my sapphire 290x (legit 290x) and flashed PT1 bios (unlocked voltage)
> 
> ran valley @ 1200 vcore no memory oc 1.425 volts
> 
> 
> 
> don't think I saw any artifacts, Can't really push the gpu any harder with these volts and don't want to go up much more till we get some solid info on these cards....
> 
> max temp was 87C with the stock fan @ 70% sounds like a rolls royce jet engine. Waterblock comes in tomorrow
> 
> Just tested asic.... 78.1% Nice! never had a card with a decent asic score before.... this one might be a keeper


1.425V seems like a lot for 1200.
Quote:


> Originally Posted by *OverSightX*
> 
> Quick question about memory and core clocks for an flashed Sapphire 290: The memory and core clocks continually jump up and down. Is this normal? It never jumped so constantly with my 7970CF setup. I'm sure it's nothing to worry about, BUT it may be.
> 
> 
> 
> Sorry if this has been brought up before. Didn't see anything.
> 
> Some note about the ss: Taken off a remote session and there are 3 monitors in non-Eyefinity mode connected to the 290.


Do you have voltage monitoring turned on in AB? If so, try turning it off.


----------



## Gero2013

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Yep, I'll disable the SVM proggie first then test .avi, .mp4, etc..


http://www.overclock.net/t/1449719/lagarith-lossless-codec-and-r290-amd-drivers-incompatible

Check this: I found the issue to be with the Lagarith Lossless Codec. When I uninstalled it, I could play some of the .AVI files with no problem....
but unfortunately I need it for VirtualDub, so it's not really a solution as such...


----------



## Banedox

Quote:


> Originally Posted by *HardwareDecoder*
> 
> are you running @ 1440p w/ everything maxed?


Makes sense then, and no, I run 1920x1080 with everything maxed...

I really wanna get a new monitor, but I really want 3 with super thin bezels...


----------



## OverSightX

Quote:


> Originally Posted by *Forceman*
> 
> 1.425V seems like a lot for 1200.
> Do you have voltage monitoring turned on in AB? If so, try turning it off.


It was on. Now disabled, and same output. Thanks for the response.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Forceman*
> 
> 1.425V seems like a lot for 1200.
> Do you have voltage monitoring turned on in AB? If so, try turning it off.


I'm not using AB right now; I'm using ASUS GPU Tweak, because I flashed the ASUS BIOS first, which is like 1.410, and I got artifacts @ 1200.

Why should I need so much vcore? My ASIC score is pretty good @ 78%. Is this card a dud?

I was gonna put it under water tomorrow, but if it's a total dud I'll be sending it back for a replacement.


----------



## Banedox

Quote:


> Originally Posted by *HardwareDecoder*
> 
> im not using AB right now, I'm using asus gpu tweak cause I flashed the asus bios first which is like 1.410 and I got artifacts @ 1200.
> 
> Why should I need so much vcore my asic score is pretty good @ 78% is this card a dud?
> 
> I was gonna put it underwater tomorrow but if it's a total dud i'll be sending it back for a replacement.


I just don't think the R9 chips are good overclockers; it's not that it's a dud....


----------



## HardwareDecoder

Quote:


> Originally Posted by *Banedox*
> 
> I just dont think the R9 chips are good overclockers, its not that its a dud....


True, and my 7950s only OC'ed to 1150/1500, so I guess I shouldn't be too pissed off....

I just checked, and this card has Elpida memory, so I likely won't be OC'ing the memory too high either.

I saw at least one guy say OC'ing the memory on these tends to hurt fps, because it just makes you have to reduce the core anyway most of the time, but I'll definitely try some memory OCs and see what happens. I think I'll run Valley @ 1200/1500 now just for kicks.

Okay, I need someone to explain this to me.....

Why is a lower ASIC better for water cooling? Are you telling me I got a card with a decent ASIC score (my 7950s were all under 70%) and I actually want a low one this time for when I WC it? .............ugh lol


----------



## Kenshiro 26

W00t! FrozenCPU has shipped my water block, I can finally silence this card!


----------



## mojobear

what the hell?

Newegg has the R9 290 listed at $519... and it was $409 last week, holy crapola. Is it all due to the mining craze?!


----------



## Jack Mac

ASIC doesn't really matter; my 290 does 1200/[email protected]+100mv/+15aux at 79.8% ASIC (marginally better than yours, but doesn't need a modded BIOS for 1200 core). However, I saw a guy with ~60% do 1250 on air with the stock BIOS.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Jack Mac*
> 
> ASIC doesn't really matter, my 290 does 1200/[email protected]+100mv/+15aux at 79.8%ASIC (marginally better than yours but doesn't need a modded bios for 1200 core). However, I saw a guy with ~60% do 1250 on air with stock bios.


Yep... it is seeming like a dud.... glad I paid the $500 for a 290X when I could have just tried my luck with a 290 for less money.

I had a feeling I was going to get a dud.

I don't want to boohoo too much, but I always get duds, from CPUs to GPUs.

Now I have to decide if I really want to RMA this. I have been without a GPU for a week already, and who knows if Newegg even has 290Xs in stock now.


----------



## Neutronman

Quote:


> Originally Posted by *mojobear*
> 
> what the hell?
> 
> Newegg has the r9 290 listed as 519 dollars....and it was 409 last week holy crapola. is it all due to the mining craze?!


Crazy pricing; nabbed a PowerColor R9 290 OC on Friday at $419 to go with the MSI I already have at $399. When I looked at the prices Friday evening they had already gone up to $469, and they're now even higher.

Perhaps I can sell my unopened MSI R9 290 and make some cash, lol....


----------



## Durvelle27

My Sapphire R9 290 OC's like junk. Can't push 1150/1250 even with 1.5v. Artifacts like crazy


----------



## HardwareDecoder

Quote:


> Originally Posted by *Durvelle27*
> 
> My Sapphire R9 290 OC's like junk. Can't push 1150/1250 even with 1.5v. Artifacts like crazy


Yikes, I'm sorry to hear that. I'd be lying if I said that didn't make me feel better about my 290X though.

And wow, you put 1.5 through it, on air or water? I'm scared to do more than 1.425 till I get some word on the max safe vcore on these.


----------



## Durvelle27

Quote:


> Originally Posted by *HardwareDecoder*
> 
> yikes im sorry to hear that, I'd be lying if I said that didn't make me feel better about my 290x though.
> 
> And wow you put 1.5 through it, on air or water? I'm scared to do more than 1.425 till I get some word on the max save vccore on these.


I'm on air (Arctic Accelero Xtreme III).


----------



## Banedox

Quote:


> Originally Posted by *HardwareDecoder*
> 
> yikes im sorry to hear that, I'd be lying if I said that didn't make me feel better about my 290x though.
> 
> And wow you put 1.5 through it, on air or water? I'm scared to do more than 1.425 till I get some word on the max save vccore on these.


Yeah, I put mine at 1.5 on air as well..... still couldn't overclock.. I personally think the Elpida memory is holding these cards way back...


----------



## Jack Mac

Keep your 290X, I bench 1200/1450 but do 24/7 at 1100/1350, I'm sure your 290X will do that at reasonable voltages.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Durvelle27*
> 
> Im on Air (Arctic Accelero Xtreme III )


ahh, I'm not gonna do it till I get my waterblock
Quote:


> Originally Posted by *Banedox*
> 
> yeah I put mine at 1.5 on air..... as well, still couldnt overclock..


hmm, so it seems 1.5v is safe then? I know I'd hit 95C on air if I did it though

ya i got elpida too
Quote:


> Originally Posted by *Jack Mac*
> 
> Keep your 290X, I bench 1200/1450 but do 24/7 at 1100/1350, I'm sure your 290X will do that at reasonable voltages.


ya, likely gonna keep it. Don't feel like going through RMA right now, and Newegg might be out of stock anyway.

So no one has info on the max safe voltage for these things?

edit: wow, seems I can't OC my memory AT ALL with 1200 core


----------



## Banedox

Quote:


> Originally Posted by *HardwareDecoder*
> 
> ahh, I'm not gonna do it till I get my waterblock
> 
> hmm so seems 1.5v is safe then? I know i'd hit 95c on air if I did it though
> 
> ya i got elpida too
> ya likely gonna keep it. dont feel like going through rma right now and newegg might be out of stock anyway
> 
> So no one has info on the max safe voltage for these things?


I wouldn't call 1.5 safe; I say don't go over 1.410v. A guy over in another thread smoked his 290's VRMs at 1.5 volts..

The power limit on the card is 525W, and with the PT1/PT3 BIOS you can go to 2 volts, but that's for things like LN2 cooling.....


----------



## MrWhiteRX7

A guy in here a few pages back just posted about how he popped his card at 1.46v, I think? Be careful!


----------



## Durvelle27

I also have Elpida VRAM


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Banedox*
> 
> I wouldnt call 1.5 safe I say dont go over 1.410v, a guy over on another thread smoked his 290 VRM at 1.5volts..


Yea that guy!


----------



## HardwareDecoder

Quote:


> Originally Posted by *Banedox*
> 
> I wouldnt call 1.5 safe I say dont go over 1.410v, a guy over on another thread smoked his 290 VRM at 1.5volts..


I'm not going over 1.425 for now, and may very likely flash back to the ASUS BIOS @ 1.410 for long-term use.

I doubt 15 mV is gonna make a big difference, but hey, you never know...
Quote:


> Originally Posted by *MrWhiteRX7*
> 
> A guy in here a few pages back I think just posted about how he popped his card at 1.46v I think? Be careful!


YIKES LOL -- okay, I better flash back to ASUS. I gotta run now and get ready for an exam at school. I'll check in with you guys later.

Hopefully I can do 1150 @ 1.410.

I don't even really care if my memory OCs, since that barely gives any fps -- I guess I just gotta forget about any epeen with this card. Was really hoping for a beast for once though. My 670 sucked -- both my 7950s were average.


----------



## Banedox

Quote:


> Originally Posted by *HardwareDecoder*
> 
> I'm not going over 1.425 for now, and may very likely flash back the asus bios @ 1.410 for long term use.
> 
> I doubt 0.015 mV is gonna make a big difference, but hey you never know...


My guess is that guy let his chips get too hot...


----------



## MrWhiteRX7

That's what I'm hoping lol. I benched mine at 1.41 once but don't have the huevos to leave it there yet on air hahaha


----------



## HardwareDecoder

Quote:


> Originally Posted by *Banedox*
> 
> My quess is that guy let his chips get to hot...


I think he probably did, but with voltage you can pop stuff even if it doesn't get too hot, if I understand correctly.

I've been running @ 65% fan; it's a damn jet engine. Good thing my block comes in tomorrow.

Anyway, I'm out for now, adios, ttyl.

Eh, what am I talking about, I have a 4G smartphone; I'll probably be on it during school after the exam.


----------



## Raephen

Quote:


> Originally Posted by *tsm106*
> 
> ^^Seems they have a large degree of variance with their block.


Perhaps, or perhaps not. It could be their hardware, or it could be that AMD has a slight variance in VRM height.

Who knows? Or better yet: who cares? I was happy with the block before and even happier now with the simple mod.

Tsm, you have an EK block, right? Could you do me a favour and run the Furmark burn-in test (15 min.) for me so we could compare numbers?


----------



## Derpinheimer

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> That's what I'm hoping lol. I benched mine at 1.41 once but don't have the juevos to leave it there yet on air hahaha


Y'all got me scared now..

Damn this chip taking 1460mV for 1200MHz.


----------



## crun

Didn't have much time to overclock and test yet (will do it thoroughly tomorrow), and I don't even have the newest MSI AB (that's why I can't adjust voltage, I guess?),
but my Gigabyte R9 [email protected] (ASUS ROM, nearly 82% ASIC) crashes at 1100 core. Stable at 1050 though.

Seems like it is running 1.07V max (according to GPU-Z), so it should clock well, I hope.


----------



## MrWhiteRX7

Found a shop online that I spoke with; they have great ratings and reviews, so let's hope this works well. They have 35 Sapphire 290s on order, and their first shipment comes in on the 21st of this month. I went ahead and pre-ordered two of them; they're still normal price!!!!!! $840.00 shipped for both!

http://www.shopblt.com/cgi-bin/shop/shop.cgi?action=thispage&thispage=0110040015013_BTF3730P.shtml&order_id=!ORDERID!#Availability

p.s. If I'm not allowed to post the link, let me know please and I'll drop it immediately. I just think it's insane that Newegg and Amazon are price gouging like little girls.


----------



## Banedox

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Found a shop online that I spoke with, they have great rating and reviews so lets hope this works well. They have 35 sapphire 290's on order and their first shipment of them comes in the 21st of this month. I went ahead and pre-ordered two of them they're still normal price!!!!!! $840.00 shipped for both!
> 
> http://www.shopblt.com/cgi-bin/shop/shop.cgi?action=thispage&thispage=0110040015013_BTF3730P.shtml&order_id=!ORDERID!#Availability
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> p.s. If I'm not allowed to post the link let me know please I'll drop it immediately. I just think it's insane that newegg and amazon are price gouging like little girls.


They prolly just sold out of stock they don't even have yet.... and ugh, do I order one of these and test my luck, or just return my present unlocked 290 and go with the green team....


----------



## Stay Puft

Just saw the $499 prices on the 290s on Newegg.. WTH


----------



## ImJJames

Quote:


> Originally Posted by *Banedox*
> 
> I wouldnt call 1.5 safe I say dont go over 1.410v, a guy over on another thread smoked his 290 VRM at 1.5volts..
> 
> The power limit on the card is 525W, but with the PT1/PT3 Bios you can go to 2 volts but thats for like LN2 Cooling and such.....


I've gone up to 1.5 volts on my R9 290 on the PT1 BIOS (was trying to get a 1300MHz clock); never had a problem, and my VRMs never went past 70C, stock cooler. Just remember that with the PT1 BIOS you aren't actually pushing 1.5 volts just because GPU Tweak says so. It's actually more like 1.45 volts when you take vdroop into consideration.


----------



## Derpinheimer

Quote:


> Originally Posted by *ImJJames*
> 
> I've went up to 1.5 volts on my r9 290 on pt 1 bios(Was trying to get 1300Mhz clock), never had a problem and my VRM's never went pass 70C, stock cooler. Just remember with PT 1 bios, you aren't actually pushing 1.5 volts just because GPU tweak says so. Its actually more like 1.45 volts when you put vdroop into consideration.


So why use PT1 instead of PT3? Shouldn't PT3 let you have lower idle voltages? I mean, at 1460mV mine droops to ~1290mV.
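The set-versus-load gap the two posts above describe is easy to quantify. A minimal sketch (the helper function is just illustrative; the inputs are the figures quoted in this thread):

```python
def vdroop(set_mv, load_mv):
    """Return the droop in mV and as a percentage of the set voltage."""
    droop = set_mv - load_mv
    return droop, 100.0 * droop / set_mv

# ImJJames' estimate: 1.50 V set reads roughly 1.45 V effective.
d, pct = vdroop(1500, 1450)
print(f"droop: {d} mV ({pct:.1f}%)")   # droop: 50 mV (3.3%)

# Derpinheimer's reading: 1460 mV set droops to ~1290 mV under load.
d, pct = vdroop(1460, 1290)
print(f"droop: {d} mV ({pct:.1f}%)")   # droop: 170 mV (11.6%)
```

The point being that "max safe voltage" arguments in this thread compare set values, while the chip may actually be seeing noticeably less under load.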


----------



## bond32

Quote:


> Originally Posted by *Derpinheimer*
> 
> So why use PT1 instead of PT3? shouldnt pt3 let you have lower idle voltages? I mean at 1460mV mine droops to ~1290mV


I tried PT3 for a while; even on water it can get very hot. Made me nervous lol. Personally I am waiting on a replacement card before I try PT1. Also, I haven't done enough research to find somewhere on the card to check the voltage with a meter, if that's possible. If I find a spot, that should be more accurate than software voltage readings.


----------



## MrWhiteRX7

For those that have been having low GPU usage causing stutter or hiccups in BF4, has the new update helped? My buddy with an 8350 said it made a big difference in GPU usage on his 290.

http://battlelog.battlefield.com/bf4/news/view/bf4-pc-update-today/


----------



## $ilent

Guys, I'm having an issue with my R9 290.

I've got a GTX 670 in PCIe slot 1 and the latest NV drivers.
I've put my R9 290 in the second PCIe slot, but AMD is not showing Overdrive (I have v5 beta), and neither GPU-Z nor MSI AB is showing any figures for the R9 290.

I've had problems with this in the past; anyone got any ideas, please?


----------



## SultanOfWalmart

Man, looking back at my 290 "X", I thought @ 1200/1500 on air it was a dud.... apparently that's on the high-average side of things; good to know, I suppose. Still not sure it would be worth putting it under water.... but maybe I could squeeze a few more MHz out of it haha.


----------



## RAFFY

Quote:


> Originally Posted by *Slomo4shO*
> 
> I am not finding any quad-fire 290 setups on Z77 or Z87 anywhere and my custom loop isn't going to be up until the end of the month. Hmm...


That is because those motherboards will bottleneck the GPUs. This is why I am dumping my ASUS Maximus VI Formula for an ASUS IV Black Edition to run my tri-fire setup.


----------



## Jack Mac

I think you'd be fine with 3.0 x4, it's the same bandwidth as 2.0 x8.
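Jack Mac's equivalence checks out on paper. A quick sketch of approximate per-link bandwidth using the published per-lane transfer rates and line-encoding overheads (real-world throughput is a bit lower once protocol overhead is included):

```python
# Approximate usable PCIe bandwidth per link, in GB/s.
def lane_bw_gbs(gen):
    raw_gts = {1: 2.5, 2: 5.0, 3: 8.0}[gen]               # GT/s per lane
    encoding = {1: 8 / 10, 2: 8 / 10, 3: 128 / 130}[gen]  # 8b/10b vs 128b/130b
    return raw_gts * encoding / 8                         # bits -> bytes

def link_bw_gbs(gen, lanes):
    return lane_bw_gbs(gen) * lanes

print(f"PCIe 2.0 x8: {link_bw_gbs(2, 8):.2f} GB/s")  # 4.00 GB/s
print(f"PCIe 3.0 x4: {link_bw_gbs(3, 4):.2f} GB/s")  # 3.94 GB/s
```

So 3.0 x4 and 2.0 x8 land within about 2% of each other, which is why the two slot configurations behave nearly identically for a single card.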


----------



## RAFFY

Quote:


> Originally Posted by *Jack Mac*
> 
> I think you'd be fine with 3.0 x4, it's the same bandwidth as 2.0 x8.


That's what I was thinking, but it turns out you need X79 to avoid any bottlenecks once the slots drop down to x4.


----------



## Jack Mac

Quote:


> Originally Posted by *RAFFY*
> 
> That's what i was thinking but it turns out that you need the X79 to avoid any bottlenecks once the slots drop down to x4


I think it's because most X79 CPUs are 6+ core, rather than because of the number of PCIe lanes.


----------



## Jpmboy

Quote:


> Originally Posted by *$ilent*
> 
> guys im having an issue with my r9 290
> Ive got a gtx 670 in pcie slot 1 and latest NV drivers.
> Ive put my r9 290 in the second pcie slot but AMD is not showing overdrive (i have v5 beta) and gpuz nor msi AB are showing any figures for the r9 290.
> Ive had problems with this in the past, anyone got any ideas please?


Uh - most folks try to scrub their systems of red when running green and vice versa... you want to run both at the same time? Or just GTX PhysX?


----------



## Taint3dBulge

Just got my refund for my 290X.. Haven't been paying much attention to this thread the last week and a half; way too busy at work.. So the question is: any info at all on a date when "any" custom-PCB 290/290X comes out? Like a real date, not one that is just guesstimated.. I've got 600 bucks burning a hole in my pocket now lol.. I am tempted to look at the Kingpin 780 Ti.. but I'd rather stay with AMD... bah. Back to this battle with my brain: I want it now vs. wait and be rewarded for patience. Anyway, someone has to know something..


----------



## HardwareDecoder

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Just got my refund for my 290x.. Havnt been paying much attention to this thread the last week and ahalf. way to buys at work.. So the question is.. Any info at all on a date when "any" custom pcb 290/290x comes out.. Like a real date, not one that is just guesstamated.. I got 600 bucks burnin a hole in my pocket now lol.. I am tempted to look at the kingping 780ti.. But id rather stay with amd... bah. Back to this battle with my brain.. I want now vs. wait and be pleased by patients. Anyways, someone has to knwo something..


Should have kept the ref model and gotten a waterblock, IMO.


----------



## $ilent

Quote:


> Originally Posted by *Jpmboy*
> 
> Uh - most folks try to scrub their systems of red when running green and vs-versa... you want to run both at the same time? or just GTX physics?


both at the same time baby!

NV for gaming, amd for mining xD


----------



## djsatane

So beta 5 of the drivers still has Overdrive officially missing? How does AMD expect users to boost their fan speed? Yes, I know one can use 3rd-party apps, but this is AMD's flagship card, and I think AMD should let users view and adjust the fan officially.


----------



## mojobear

Guys, I can't get over this, wow....

The Sapphire R9 290 in Canada on newegg.ca was like $429 last week with a $20 discount, so you could pick one up for $409.... now it's freaking $529 one week later... have you guys ever seen anything like this? Wow, hardware appreciating in value.


----------



## SultanOfWalmart

Quote:


> Originally Posted by *djsatane*
> 
> So beta 5 of the drivers have overdrive still officially missing? How does AMD expect users to boost their fan speed?


Beta 5 brought Overdrive back. If you're missing it, it's an issue on your end. Try scrubbing your drivers and reinstalling.


----------



## y2kcamaross

Newegg has ASUS models of the R9 290 in stock now, $499.99.

Same as Amazon's Sapphire ones were this morning, what a joke.


----------



## Nirvana91

Ok guys,

Tomorrow I'm selling my 7990 and going to get a reference R9 290X.

These are the cooling options:

- Kraken G10
- Gelid Icy Vision Rev2
- Accelero Xtreme III
- Accelero Hybrid (don't know if it's compatible)

I must say that I'm leaning toward the Kraken G10, but the lack of ramsinks on the VRM/memory makes me wonder....

The Gelid Icy Vision is much cheaper and I've heard good things.

The Accelero Xtreme III, I think, needs some modding to work.

The Accelero Hybrid, I don't even know if it's compatible...


----------



## Faint

Just got my card today. Haven't really messed with it yet as I have finals this week to worry about.

1. http://www.techpowerup.com/gpuz/g5hk3/
2. Sapphire R9 290
3. Stock cooler


----------



## rdr09

Quote:


> Originally Posted by *y2kcamaross*
> 
> Newegg has Asus models of the r9 290 in stock now, 499.99
> 
> 
> 
> 
> 
> 
> 
> same as amazon's Sapphire ones were this morning, what a joke


this, too, is a joke . . .

http://www.newegg.com/Product/Product.aspx?Item=N82E16814121769

3GB


----------



## Jack Mac

Bet you it comes with a "sick" ROG sticker doe.


----------



## RAFFY

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Just got my refund for my 290x.. Havnt been paying much attention to this thread the last week and ahalf. way to buys at work.. So the question is.. Any info at all on a date when "any" custom pcb 290/290x comes out.. Like a real date, not one that is just guesstamated.. I got 600 bucks burnin a hole in my pocket now lol.. I am tempted to look at the kingping 780ti.. But id rather stay with amd... bah. Back to this battle with my brain.. I want now vs. wait and be pleased by patients. Anyways, someone has to knwo something..


Read through the pages you missed; there's a lot of good info in the thread. Also, you should have kept the 290X, as they now cost $619.


----------



## y2kcamaross

Quote:


> Originally Posted by *rdr09*
> 
> this, too, is a joke . . .
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814121769
> 
> 3GB


$660?

That's more than I paid for mine at launch from there.... before the $150 price cut. I guess Newegg thinks that since they are running a 10% off promo, they can raise the prices 30%!


----------



## rdr09

Quote:


> Originally Posted by *y2kcamaross*
> 
> $660?
> 
> 
> 
> 
> 
> 
> 
> That's more than I paid for mine on launch from there....before the$150 price cut, I guess Newegg thinks since they are running a 10% off promo, they can raise the prices 30%!


I would not even pay $400 for that.


----------



## KyGuy

Just want to add my 2 cents on an issue I have noticed. My Sapphire 290X can achieve a high memory overclock on Elpida RAM, up to 1550 with only a few artifacts. I run it at 1500. My core overclocks terribly with +100mv in Afterburner; only 1100 stable. After I exited Furmark, though, the GPU usage remained pegged at 100% and required a hard reset. This just happened. No issues after the reset, but has anyone been able to reproduce this, or had this issue?


----------



## Heinz68

Quote:


> Originally Posted by *HardwareDecoder*
> 
> yep... it is seeming like a dud.... glad I paid the 500$ for a 290x when I could have just tried my luck with a 290 for less money.
> 
> I had a feeling I was going to get a dud.
> 
> I don't want to boohoo too much but I always get duds from cpu's to gpu's
> 
> Now I have to decide if I really want to rma this, I have been with out a gpu for a week already, and who knows if newegg even has 290x's in stock now.


I don't think your card is a dud. I've noticed the avg OC is around 1150; only very few lucky guys get over 1200. Also, it does not make much sense to RMA a card just because it does not OC higher; I'm sure Newegg is not going to look at it as a valid reason for RMA.

Plus, you should be very happy paying only $500 for the R9 290X. Right now at Newegg all the same non-reference cards are listed for $600, most without the BF4 game included, and none are in stock.
Actually, the R9 290s are now listed at $500.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Heinz68*
> 
> I don't think your cards are dud, I noticed the avg OC is around 1150, only very few lucky guys get over 1200 also it does not make much sense to RMA card just because it does not OC higher. I'm sure Newegg is not going to look at at it as valid reason for RMA.
> 
> Plus you should be very happy guy paying for the R9 290X only $500. Right now at Newegg all same non-reference cards are listed for $600 most have not the BF4 game included and none are in stock.
> Actualy the R9 290 are now listed at $500


you're right


----------



## y2kcamaross

Quote:


> Originally Posted by *rdr09*
> 
> I would not even pay $400 for that.


That's... cool? I'd gladly pay $400 if I didn't already have 2.


----------



## RAFFY

Quote:


> Originally Posted by *y2kcamaross*
> 
> That's...cool? I'd gladly pay 400 if I didn't already have 2


LEAVE THIS THREAD YOU GREEN TEAM SPY

Wish me luck guys! I'm going to UPS to pick up my order from PPCs. They never sent me an email telling me what kind of wait to expect on my GPU blocks and backplates... maybe they are in the box!

If not I will be

when I get home


----------



## ivers

Ok, so I just got my new XFX R9 290, installed the driver, and opened up the configurator to change my display settings... boom, black screen.


----------



## y2kcamaross

Quote:


> Originally Posted by *RAFFY*
> 
> LEAVE THIS THREAD YOU GREEN TEAM SPY
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wish me luck guys! I'm going to UPS to pick up my order from PPCs they never sent me an email informing me of what kind of wait to expect on my GPU blocks and back plates... maybe they are in the box!
> 
> 
> 
> 
> 
> 
> 
> If not I will be
> 
> 
> 
> 
> 
> 
> 
> when I get home










Hey now, I just sold my 2 launch reference Sapphire 7970s for $860 on eBay to fund my 2 new R9 290/X purchases. One should be here Friday, and I'm just waiting on another at a regular price so I can order it.


----------



## Taint3dBulge

Quote:


> Originally Posted by *RAFFY*
> 
> Read through the pages you missed; there's a lot of good info in the thread. Also, you should have kept the 290X, as they now cost $619.


Couldn't keep it... DVI port problems and super loud coil whine, so I couldn't keep it and live with it. I'm going to wait for the customs to come out. I'm willing to pay in the $650 range; I hope it's not more than that, unless the performance is ~15+% over stock and it can OC higher than what reference can do on water. lol. I'm mainly hoping for memory clocks to go up... the reference memory chips suck. Not sure I can read through 40 pages... I'll do some fast skimming, I guess.


----------



## skupples

Quote:


> Originally Posted by *RAFFY*
> 
> That is because those motherboards will bottleneck the GPU's. This is why I am dumping my ASUS Maximus VI Formula for a ASUS IV Black Edition to run my TRI-Fire setup.




good choice, you won't regret it.


----------



## Durvelle27

Anybody have the first gen Sapphire R9 290?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *y2kcamaross*
> 
> $660?
> 
> 
> 
> 
> 
> 
> 
> That's more than I paid for mine at launch from there... before the $150 price cut. I guess Newegg thinks since they are running a 10% off promo, they can raise the prices 30%!


Oh but it's Green Monday!

Lol Newegg prices are becoming more and more of a joke every day. Their "Black Friday" deals were God awful.

This whole "Cyber November" "Black Weekend" "Green Monday" crap needs to stop, it's not even like they're running any special deals lmao...


----------



## Derpinheimer

It's weird, the Sapphire R9 290 is the only one I see covered by Iron Egg RETURN... the others are replacement only.

Have to screen cap that.. lol


----------



## Heinz68

Quote:


> Originally Posted by *Derpinheimer*
> 
> So why use PT1 instead of PT3? shouldnt pt3 let you have lower idle voltages? I mean at 1460mV mine droops to ~1290mV


PT1 has vdroop, while PT3 doesn't. PT3 is much more dangerous and should be used only with LN2, but they both have the potential to destroy the cards. Both are kind of useless for daily use since they don't allow 2D clocks at idle.


----------



## Derpinheimer

Right, but isn't it healthier to use a constant 1.3V rather than 1.45V idle and 1.3V load? As long as you know how much it droops, PT3 seems safer to me.

Also, with the ASUS Unlock BIOS mine doesn't downclock at idle..


----------



## Pesmerrga

Quote:


> Originally Posted by *rdr09*
> 
> this, too, is a joke . . .
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814121769
> 
> 3GB


How about this?

http://www.newegg.com/Product/Product.aspx?Item=N82E16814121671

Even worse than that is this..

http://www.mwave.com/mwave/SKUSearch.asp?scriteria=BF01542&pagetitle=Sapphire%2021226-00-53G%20Radeon%20R9%20290X%204GB%20DDR5%20BattleFie#.UqaX3kCTIdE


----------



## skupples

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Oh but it's Green Monday!
> 
> Lol Newegg prices are becoming more and more of a joke every day. Their "Black Friday" deals were God awful.
> 
> This whole "Cyber November" "Black Weekend" "Green Monday" crap needs to stop, it's not even like they're running any special deals lmao...


See, they are getting way more ignorant people to buy into the false deals during this time of year. This is why they do it. The number of people buying presents for other people far outweighs the enthusiasts browsing for a truly good deal on something. Not to mention that Cyber Monday has pretty much been a joke since its inception.
Quote:


> Originally Posted by *Derpinheimer*
> 
> Right, but isnt it healthier to use a constant 1.3v rather than 1.45v idle and 1.3v load? As long as you know how much it droops, PT 3 seems safer to me.
> 
> Also, with ASUS Unlock bios mine doesnt downclock idle..


Vdroop & LLC exist to protect the GPU. A card running 0% LLC will degrade much faster than one running 50% or 100% LLC.


----------



## Raxus

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127757&nm_mc=AFC-C8Junction&cm_mmc=AFC-C8Junction-_-na-_-na-_-na&cm_sp=&AID=10446076&PID=6202798&SID=

msi non bf4 edition in stock, 599


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Durvelle27*
> 
> 1080p Very High w/ FXAA would be nice
> 
> I'm running at those settings and @1100/1450 and I'm only getting 40-60 FPS


1080p, very high everything, 16x af, FXAA on

Very beginning rain area: 48fps... once past that, through the entire first level I averaged 62fps. I did a few 120-second Fraps benchmarks in game: highs around 90ish and a couple of spots where it dips into the 50s, but most of the time I was easily in the 60s. The game runs super smooth at 1080p.

Cpu: 2600k @ 4.0ghz
Gpu: 290 @ 1150/1500
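For reference, the min/avg/max numbers Fraps reports are just simple stats over the frame-time log; here's a quick illustrative sketch (the frame times below are made-up numbers, not my actual capture):

```python
# Rough sketch of how min/avg/max FPS fall out of a frame-time log.
# The frame times below are made up for illustration, not a real capture.
frame_times_ms = [16.7, 16.7, 20.0, 16.7, 11.1, 16.7, 25.0, 16.7]

fps_per_frame = [1000.0 / t for t in frame_times_ms]

min_fps = min(fps_per_frame)  # worst single frame
max_fps = max(fps_per_frame)  # best single frame
# Average FPS over the run = total frames / total seconds,
# NOT the mean of per-frame FPS (that overweights the fast frames).
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

print(f"min {min_fps:.0f} / avg {avg_fps:.0f} / max {max_fps:.0f}")
```

That's also why a run can "average 62fps" while still showing dips: one 25ms frame drags the minimum down to 40 without moving the average much.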


----------



## Durvelle27

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> 1080p, very high everything, 16x af, FXAA on
> 
> Very beginning rain area 48fps... once past that all through the entire first level avg of 62fps. I did a few fraps 120second benchmarks in game. highs around 90ish and a couple spots where it chunks down in to the 50's but most of the time I was in the 60's easily. Game runs super smooth in 1080p.
> 
> Cpu: 2600k @ 4.0ghz
> Gpu: 290 @ 1150/1500


I get 40-62 FPS with dips in the 30s.


----------



## Derpinheimer

Quote:


> Originally Posted by *skupples*
> 
> See, they are getting way more ignorant people to buy into the false deals during this time of year. This is why they do it. The # of people buying presents for other people far outweighs the enthusiast browsing for a truly good deal on something. Not to mention that cyber monday has pretty much been a joke since conception.
> Vdroop & LLC exists to protect the GPU. a 0% llc will degrade the unit much faster than one running 50% or 100% LLC.


Can you explain this to me?

I thought vdroop was a "natural" drop in voltage from high power draw,
and LLC "artificially" increased voltage as power draw increased, to help compensate for vdroop.

Or am I wrong?
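Just to make sure I have the mental model right, here's the toy arithmetic as I understand it (all numbers made up for illustration, not measured from any card):

```python
# Toy model of vdroop vs. LLC compensation. All numbers are illustrative.
v_set = 1.30            # voltage requested in software (V)
droop_per_amp = 0.002   # hypothetical load-line resistance (V per A)
load_current = 75.0     # hypothetical load draw (A)

# Without LLC: the delivered voltage sags under load (vdroop).
v_load_no_llc = v_set - droop_per_amp * load_current

# With 100% LLC: the VRM counteracts the droop and holds v_set under load
# (real boards overshoot/undershoot during transients; not modeled here).
llc_strength = 1.0
v_load_full_llc = v_set - (1.0 - llc_strength) * droop_per_amp * load_current

print(v_load_no_llc, v_load_full_llc)
```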


----------



## HardwareDecoder

Quote:


> Originally Posted by *skupples*
> 
> See, they are getting way more ignorant people to buy into the false deals during this time of year. This is why they do it. The # of people buying presents for other people far outweighs the enthusiast browsing for a truly good deal on something. Not to mention that cyber monday has pretty much been a joke since conception.


I agree for the most part; however, I did get a really nice laptop for $329 on Black Friday, and a $500 290X on Cyber Monday, both without having to go stand in the cold... and we all know the 290X is like $600+ now.

And the laptop was originally $650 and is an Ivy Bridge i5 with all the other bells and whistles.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Durvelle27*
> 
> I get 40-62 FPS with dips in the 30s.


I don't think it's your card. My buddy has an FX 8350 @ 4.66GHz with the same Sapphire 290 I have, running at 1100/1500 I believe, and he said his average is 45fps at the same settings.


----------



## Durvelle27

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I don't think it's your card, my buddy has an fx 8350 @ 4.66ghz with same sapphire 290 I have running at 1100/1500 I believe and he said his average is 45fps same settings.


I'm averaging about 50+ FPS @ 1100/1450, but I get huge dips into the 30-40 range, with my 8350 at 5GHz.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Durvelle27*
> 
> I'm averaging about 50+ FPS @1100/1450 but I get huge dips between 30-40 @5GHz in my 8350


Yea I had two random dips when playing for about an hour just now... one went down to 46fps for a second and the other was 52fps.

I'm going to OC my 2600k back up to 4.8ghz and see if that does anything.


----------



## Durvelle27

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Yea I had two random dips when playing for about an hour just now... one went down to 46fps for a second and the other was 52fps.
> 
> I'm going to OC my 2600k back up to 4.8ghz and see if that does anything.


Please do. I want to know if it's my CPU holding me back.


----------



## Durquavian

Quote:


> Originally Posted by *Derpinheimer*
> 
> Can you explain this to me?
> 
> I thought vdroop was a "natural" drop in voltage from high power draw
> And LLC "artificially" increased voltage as power draw increased, to help compensate for vdroop.
> 
> Or am I wrong/?


I think vdroop was allowed to exist to keep from overvolting. LLC adds overvolting back into the equation.


----------



## darkelixa

I don't think an AMD 8350 is going to cause bad FPS drops. If I run the Valley benchmark it will stutter in the same spot on my 8350 or my i5 4670K with my GTX 770.


----------



## Durvelle27

Quote:


> Originally Posted by *darkelixa*
> 
> I dont think an amd 8350 is going to cause bad fps drops,if i run valley benchmark it will stutter in the same spot on my 8350 or i5 4670k with my 770gtx


Very bad example, considering Valley is very GPU-limited. The CPU doesn't matter in that bench, or in Heaven.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *darkelixa*
> 
> I dont think an amd 8350 is going to cause bad fps drops,if i run valley benchmark it will stutter in the same spot on my 8350 or i5 4670k with my 770gtx


I disagree; the i5 and an OC'd 8350 are supposedly about the same in terms of performance, yet I picked up a noticeable amount of FPS in all games, and my minimums went up drastically, when I switched from the 8350 @ 4.9GHz back to the 2600K @ 4.0GHz.

Obviously though, YMMV.


----------



## rdr09

Quote:


> Originally Posted by *Durvelle27*
> 
> I get 40-62 FPS with dips in the 30s.


I got that with a 7950. I saw it dip to 37 at the lowest in BF4 MP 64-player, Ultra and 4xMSAA. Are you using 1080p?


----------



## RAFFY

Quote:


> Originally Posted by *y2kcamaross*
> 
> 
> 
> 
> 
> 
> 
> 
> Hey now, I just sold my 2 launch reference Sapphire 7970s for 860 on ebay to fund my 2 new r9 290/x purchases, one should be here Friday and I'm just waiting on another at a regular price so I can order


Quote:


> Originally Posted by *skupples*
> 
> 
> 
> good choice, you won't regret it.


Very nice! I just got my full copper EK CPU block too, same one as yours, just full copper. I can't wait to get it. I have to see if I need to buy a different EK FC bridge.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *rdr09*
> 
> I got that with a 7950. I saw it dipped to 37 as the lowest in BF4 MP 64 Ultra and 4MSAA. you are using 1080?


Crysis 3


----------



## Durvelle27

Quote:


> Originally Posted by *rdr09*
> 
> I got that with a 7950. I saw it dipped to 37 as the lowest in BF4 MP 64 Ultra and 4MSAA. you are using 1080?


I never saw dips that low in the BF4 beta, and yep, 1080p.


----------



## RAFFY

Anyone know the name of the EK Hardware Rep?


----------



## skupples

Quote:


> Originally Posted by *RAFFY*
> 
> Very nice! I just got my full copper EK cpu block too, same one as yours just full copper. I can't wait to get it. I have to see if I need to buy a different EK FC bridge.


Sadly, this block looks like this on the inside.



This is due to my own negligence. Never listen to anyone who tells you distilled water & silver is all you need. It's an utter fallacy. Distilled water seeks out ions when warm, i.e. it EATS METAL. You MUST add an anti-corrosive to keep it from destroying your loop.

The "CSQ" EK-FC bridges are different from the new non-CSQ "EK-FC", so if you are moving from an earlier-series EK block you will need to upgrade the link. They changed the port shape.

I'm replacing all of my blocks with the acetal/copper variants this time around. Clean CSQ.


----------



## HardwareDecoder

Quote:


> Originally Posted by *skupples*
> 
> Sadly, this block looks like this on the inside.
> 
> 
> 
> This is due to my own negligence. Never listen to anyone who tells you distilled water & silver is all you need. It's an utter fallacy. Distilled water seeks out ions when warm, i.e. it EATS METAL. You MUST add an anti-corrosive to keep it from destroying your loop.
> 
> The "CSQ" EK-FC bridges are different from the new non-CSQ "EK-FC", so if you are moving from an earlier-series EK block you will need to upgrade the link. They changed the port shape.


Don't you mean deionized water seeks out ions, thus eating metal? Distilled should be fine.


----------



## skupples

Quote:


> Originally Posted by *HardwareDecoder*
> 
> don't you mean deionized seeks out ions, thus eating metal. distilled should be fine


From what I'm reading, both (DI even more so) have extremely low dissolved-solids content, so both can & will eat your metal unless properly mixed with anti-corrosive and whatnot. Anyways, that picture I linked is ~7 months of just distilled & silver. (No need for dead water with enough raddage; it's redundant, as it's dissolved copper.) Silver also seems to be a bad idea due to the possible production of acid in the loop. I'm reading all sorts of info that conflicts with what I would call "coolant common knowledge".

http://www.overclockers.com/pc-water-coolant-chemistry-part-ii/


----------



## HardwareDecoder

Quote:


> Originally Posted by *skupples*
> 
> From what i'm reading both (Di even more so) have extremely low dissolved solids content, thus both can & will eat your metal unless properly mixed with anti-corrosive & what not. anyways, that picture I linked is ~7 months of just distilled & silver. (no need for dead water, it's redundant, as it's dissolved copper)


Hmmm, I just set up my first loop and I didn't put anything in except the biocide drops. I am actually breaking down my loop tomorrow, as I'm moving to a new case and my 290X block is coming, so what anti-corrosive should I use? I can just order some and add it to the pump/res when it comes.


----------



## skupples

Quote:


> Originally Posted by *HardwareDecoder*
> 
> hmmm just setup my first loop and I didn't put anything in except the biocide drops. I am actually breaking down my loop tomorrow as i'm moving to a new case and my 290x block is coming so what anti-corrosive should I use ? I can just order some an add it to the res pump/res when it comes.


Go with one of the well-known brands; you can make your own from automotive additives, but some of them have bad side effects on acrylic. I'm going with Mayhems' new XT-1 solution; you mix it with distilled. Proper fluid becomes even more important if you are using a nickel-plated product. The EKoolant is also really good stuff, and as it's made by EK it should be fabulous with their nickel products.
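If anyone's mixing concentrate themselves, the dilution math is trivial; here's a quick sketch (the 1:9 ratio below is just a placeholder example, check your bottle's label for the real one):

```python
# Quick dilution helper for coolant concentrate. The 1:9 default ratio is a
# placeholder example; use whatever your concentrate's label specifies.
def concentrate_needed(loop_volume_ml, concentrate_parts=1, water_parts=9):
    """Return (ml of concentrate, ml of distilled) to fill loop_volume_ml."""
    total_parts = concentrate_parts + water_parts
    conc = loop_volume_ml * concentrate_parts / total_parts
    return conc, loop_volume_ml - conc

# e.g. a 1.5 L loop at a 1:9 mix
conc, water = concentrate_needed(1500)
print(conc, water)  # 150.0 ml concentrate, 1350.0 ml distilled
```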
Quote:


> Originally Posted by *HardwareDecoder*
> 
> I agree for the most part however, I did get a really nice laptop for 329$ on black friday, and a 500$ 290x on cyber monday tho both without having to go stand in the cold... and we all know the 290x is like 600$+ now
> 
> and the laptop was originally 650$ and is an ivy bridge i5 with all the other bells and whistles.


Deals definitely exist, but it's almost always on last-gen type stuff. Nice deal on the 290X!

I picked up 4x 128GB Vertex 3 SSDs last year, & some super super cheap RAM. (RAM was ROCK bottom this time last year, half the price it is now or lower.)
Quote:


> Originally Posted by *Derpinheimer*
> 
> Can you explain this to me?
> 
> I thought vdroop was a "natural" drop in voltage from high power draw
> And LLC "artificially" increased voltage as power draw increased, to help compensate for vdroop.
> 
> Or am I wrong/?


I've always had LLC explained to me as a built-in safety switch for voltage fluctuation, the type of fluctuation our hardware monitors don't pick up. Now of course, disabling LLC (0%) will allow for slightly higher voltages, but at the cost of possible degradation.


----------



## RAFFY

Quote:


> Originally Posted by *skupples*
> 
> Sadly, this block looks like this on the inside.
> 
> 
> 
> This is due to my own neglicance. Never listen to anyone who tells you distilled water & silver is all you need. It's an utter fallacy. Distilled water seeks out ions when warm... IE: EATS METAL. You MUST add anti-corrosive to it to keep it from destroying your loop.
> 
> the "CSQ" EK-FC bridges are different from the new non CSQ "EK-FC" so, if you are moving from an earlier series ek block you will need to upgrade the link. They changed the port shape.
> 
> replacing all of my blocks with the acetal/copper variants this time around. Clean CSQ.


Dang, sorry to hear that man! This is my first water-cooling build. I purchased the EK Supremacy Clean CSQ (full copper), 3 EK R9 290X copper/acetal blocks w/ back plates, and the EK FC Terminal Parallel Z77 (which should fit the EK blocks according to the FrozenCPU website & EK). Then for fluid I purchased the EK-Ekoolant Blood RED. The color fits my build perfectly, and after researching cooling temps it cools right up there with the best of them. Now, after purchasing my new CPU and MB, I need to powder coat this 900D white.

Edit: SKUPPLES, have you ever messed around with the jet plate? Or do you know of any threads that discuss this topic?

Also, is one 1mm 50x60 Fujipoly pad and one 0.5mm 50x60 Fujipoly pad enough for 3 GPUs?


----------



## ImJJames

Stock Asus Bios, stock air cooler
1260/1500 Max temp 69C, Max VRM temp 56C @ 90% fan


----------



## HardwareDecoder

Quote:


> Originally Posted by *skupples*
> 
> Go with one of the well knowns, you can make your own from automotive additives, but some of them have bad side effects on acrylic. I'm going with Mayhams new XT-1 solution, you mix it with distilled. Proper fluid becomes even more important if you are using a nickel plated product. The EKoolant is also really good stuff, & as it's made by EK should be fabulous with their nickel producst.
> Deals definitely exist, but it's almost always on last gen type stuff. Nice deal on the 290x!


Heh, this sounds made up, but I was just on Borderlands 2 with a good buddy of mine who happens to have a degree in microbiology and organic chemistry.

He said that any metal in contact with any form of water is going to become corroded eventually due to oxidation (the loss of electrons when two substances come into contact, e.g. water and metal). Oxidation is often called "rust", but that is not exactly correct.

I asked what chemical he would use in a water-cooling loop to avoid this, and he said nothing is going to offer complete protection.

I guess it might still be worth it to get some well-known stuff that doesn't eat acrylic tubing though.
Quote:


> Originally Posted by *ImJJames*
> 
> Stock Asus Bios, stock air cooler
> 1260/1500 Max temp 69C, Max VRM temp 56C @ 90% fan


I want your card







And are your ears ringing from the 90% fan? I haven't even had it over 70% yet and I'm already sick of it, and that's with a noise-cancelling G430 headset. So glad my block comes tomorrow haha.


----------



## tsm106

Quote:


> Originally Posted by *skupples*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RAFFY*
> 
> Very nice! I just got my full copper EK cpu block too, same one as yours just full copper. I can't wait to get it. I have to see if I need to buy a different EK FC bridge.
> 
> 
> 
> Sadly, this block looks like this on the inside.
> 
> 
> 
> This is due to my own neglicance. Never listen to anyone who tells you distilled water & silver is all you need. It's an utter fallacy. Distilled water seeks out ions when warm... IE: EATS METAL. You MUST add anti-corrosive to it to keep it from destroying your loop.
> 
> the "CSQ" EK-FC bridges are different from the new non CSQ "EK-FC" so, *if you are moving from an earlier series ek block you will need to upgrade the link. They changed the port shape.*
> 
> replacing all of my blocks with the acetal/copper variants this time around. Clean CSQ.
Click to expand...

If you are referring to the 290 block design, you only need the appropriate terminal bridge. The terminal mounts directly to the block.

On the topic of stripped ek nickel... brings back memories. LOL.

http://www.overclock.net/t/1299223/two-ek-fc580-acetal-nickel-sold


----------



## robyneil

I'm new here. Thank you for all the info and guides provided. Can you please add me, robyneil.

*Brand:* PowerColor AXR9 290 4GBD5-MDH/OC Radeon
*Store:* Newegg Item #: N82E16814131527
*Purchase Date:* 11/21/2013
*S/N:* AZG1311037XXX
*Cooler:* Stock air
*Fans:* Ultra Rogue M950 6 fans + GPU stock fan + CPU CM 212 EVO fan
*Benchmark Tool:* 3dmark 11
*Benchmark Highest GPU Temp:* 66C
*Benchmark Highest CPU Temp:* 73C
*Overclocking Tool:* MSI AfterBurner
*Quiet Bios:* Stock Powercolor r9 290 Bios
*Uber Bios:* Asus r9 290x Bios

*stock bios r9 290 overclocked to 1000/1250*


*stock bios r9 290 overclocked to 1154/1401*


*unlocked Asus bios r9 290 to r9 290x default settings*


*unlocked Asus bios r9 290 to r9 290x overclocked to 1154/1401*


Initially I had lots of BSODs with hard restarts playing BF4 at stock with no OC, but I'm glad I didn't give up and didn't RMA it. I believe a lot of the RMA'd R9 290s could have worked out. It's been stable after reseating the card, updating drivers, overclocking, and patiently playing BF4 until it got stable (lots of restarts during the 2nd and 3rd day)... I don't really know what fixed it.

For some reason, I can overclock the R9 290 (not as an R9 290X) to 1208/1500 using the PowerColor stock BIOS, but not with the Asus R9 290X BIOS (only up to 1154/1401 with no artifacts).


----------



## skupples

Quote:


> Originally Posted by *HardwareDecoder*
> 
> heh this sounds made up but, I was just on borderlands 2 with a good buddy of mine who happens to have a degree in microbiology and organic chemistry.
> 
> He said that any metal in contact with any water form of water is going to become corroded eventually due to oxidation (the losing of an electron when two substances come in to contact IE water and metal) oxidation is often called "rust" but that is not exactly correct.
> 
> I asked what chemical he would use in a water cooling loop to avoid this, and he said nothing is going to offer complete protection.
> 
> I guess it might still be worth it to get some well known stuff that doesn't eat acrylic tubing though.


That is 100% correct. This is why it's recommended to change your fluid every so often. Mayhems claims you only need to do it every 9 months with the XT-1 glycol additive. I would probably do it every 3 because I'm paranoid & not made of money yet. Do not use PT Nuke; it has bleach in it, and bleach is bad.


----------



## robyneil

By the way, I got a Norton antivirus notification on hawaiiinfo12.exe. I only got it to work a couple of times during R9 290X benching, and then Norton automatically removes it.


----------



## Derpinheimer

Quote:


> Originally Posted by *HardwareDecoder*
> 
> heh this sounds made up but, I was just on borderlands 2 with a good buddy of mine who happens to have a degree in microbiology and organic chemistry.
> 
> He said that any metal in contact with any water form of water is going to become corroded eventually due to oxidation (the losing of an electron when two substances come in to contact IE water and metal) oxidation is often called "rust" but that is not exactly correct.
> 
> I asked what chemical he would use in a water cooling loop to avoid this, and he said nothing is going to offer complete protection.
> 
> I guess it might still be worth it to get some well known stuff that doesn't eat acrylic tubing though.
> I want your card
> 
> 
> 
> 
> 
> 
> 
> and are your ears ringing from the 90% fan? i haven't even had it over 70% yet and I'm already sick of it, and that is with a noise cancelling g430 headset. So glad my block comes tomorrow haha


Scary :O

I've got nothing but distilled water and biocide in mine too..

I'm gonna have to take a look at the blocks.. some time.


----------



## tsm106

Quote:


> Originally Posted by *robyneil*
> 
> By the way I got Norton anti virus notification on hawaiiinfo12.exe. I only got it to work a couple of times during r9 290x benching and then Norton automatically removes it after.


You better wipe your OS and start over then?

lol, add the app to your white list.


----------



## HardwareDecoder

Quote:


> Originally Posted by *skupples*
> 
> That is 100% correct. This is why it's recommended to change your fluid every so often. Mayhem claims you only need to do it every 9 months with the XT-1 glycol additive. I would probably do it every 3 because i'm paranoid & not made of money yet. Do not use PTNUKE, it has bleach in it, bleach is bad.


You aren't this guy? I guess that's a good thing, he looks flammable.


----------



## skupples

Quote:


> Originally Posted by *robyneil*
> 
> By the way I got Norton anti virus notification on hawaiiinfo12.exe. I only got it to work a couple of times during r9 290x benching and then Norton automatically removes it after.


Get rid of Norton. Spend some money on a high-end enterprise-grade antivirus. (That exe is fine; whitelist it as tsm said.)
Quote:


> Originally Posted by *Derpinheimer*
> 
> Scary :O
> 
> I've got nothing but distilled water and biocide in mine too..
> 
> Im gonna have to take a look at the blocks.. some time.


Do yourself a favor and order some additive tomorrow. People need to realize that the advice of water & biocide alone is bad advice.
Quote:


> Originally Posted by *HardwareDecoder*
> 
> you aren't this guy? I guess that's a good thing he looks flammable


Two years, tops.

Good night, club 290X. One day closer to Mantle taking over the world!
Quote:


> Originally Posted by *tsm106*
> 
> If you you are referring to the 290 block design, you only need the appropriate terminal bridge. The terminal mounts directly to the block.
> 
> On the topic of stripped ek nickel... brings back memories. LOL.
> 
> http://www.overclock.net/t/1299223/two-ek-fc580-acetal-nickel-sold


This can be easily avoided these days with proper additives; copper may be able to get away with the abuse of just water & biocide, but nickel cannot. EK seems to want my blocks, as they are still studying & improving on their design. WooT, hopefully this means they will be sending me replacements. I'll pay the fee for gold plated! Want to try those next.


----------



## robyneil

Hmmm... it's probably safer to reinstall everything again... no worries, since it's a new build. Not much personal data in there yet. Thanks.


----------



## tsm106

Quote:


> Originally Posted by *robyneil*
> 
> Hmmm...probably it is safer to reinstall everything again...no worries since it's a new build. Not much personal data in there yet. Thanks.


No, that was a joke lol. It's a false positive.


----------



## robyneil

Ok...thanks.


----------



## HardwareDecoder

Oh yeah, I guess you can add me to the club. Sapphire 290X, gonna be waterblocked tomorrow w/ EK nickel+acetal. I'll post pics when the blocking is done.





and a pic of my dog with demon eyes


----------



## GroupB

I redid my loop last week; it had been running for 1 year and 2 months with only distilled and a silver coil, and my Raystorm was pretty clean, nothing like yours... I did rinse everything before installing, even the tubing. The only bad thing I saw was the Primo LRT tubing, full of plasticizer, but that stayed on the tubing; nothing got into the block or res.


----------



## tsm106

Quote:


> Originally Posted by *skupples*
> 
> This can be easily avoided these days with proper additives, copper may be able to get away with the abuse of just water & biocide, nickel how ever can not. EK seems to want my blocks, as they are still studying & improving on their design. WooT, hopefully this means they will be sending me replacements. I'll pay the fee for gold plated! Want to try those next.


I always use coolant now. It also helps cuz I'm too lazy to change fluid too often lol. Anti-corrosives slow down the natural corrosion in copper, which is great for the lazies out there. Can I get an amen?


----------



## skupples

Quote:


> Originally Posted by *tsm106*
> 
> I use coolant always now. It also helps cuz I'm too lazy to change fluid too often lol. Anti-corrosives slows down the natural corrosion in copper which is great for the lazies out there. Can I get an amen?




This TERRIBLE advice of distilled & silver needs to die, especially for nickel. Silver & nickel DO NOT GET ALONG in warm-temp environments at all. Hell, the terrible advice that distilled & biocide is enough also needs to end. This is a warning to all of you first-time (maybe not) water coolers in this thread...




These are expensive mistakes that are easy to avoid. $12 for a bottle of top-of-the-line Mayhems additive. DO EET NOW if you got the EK nickel blocks; you should do it regardless.


----------



## HardwareDecoder

Quote:


> Originally Posted by *skupples*
> 
> 
> 
> This TERRIBLE advice of distilled & silver needs to die, especially for nickel. Silver & nickel DO NOT GET ALONG in warm-temp environments at all. Hell, the terrible advice that distilled & biocide is enough also needs to end. This is a warning to all of you first-time (maybe not) water coolers in this thread...
> 
> 
> 
> 
> These are expensive mistakes that are easy to avoid. $12 for a bottle of top-of-the-line Mayhems additive. DO EET NOW if you got the EK nickel blocks; you should do it regardless.


I'm gonna order it tomorrow from FrozenCPU since I'm gonna use a nickel block.


----------



## GroupB

Why not go full copper and avoid nickel then?

For a full-copper setup, distilled + a coil does the job.

After the EK nickel thing... EK is now out of my loops forever.


----------



## skupples

Quote:


> Originally Posted by *GroupB*
> 
> Why not go full copper and avoid nickel then?
> 
> For a full-copper setup, distilled + a coil does the job.
> 
> After the EK nickel thing... EK is now out of my loops forever.


EK is the #1 block for cooling VRM & VRAM, so don't write them off altogether. Also, pure distilled (or deionized) water will eat your metal; it's not a matter of if, it's a matter of when. The thing about copper is that it's harder to spot the corrosion/pitting; it's much easier to see on nickel, due to the finish simply coming off. Nickel is fine if properly maintained; copper is just easier, but it's not exempt from the laws of chemistry. Silver can also introduce acidic content into your loop if the water temps get warm enough, and acid is bad for any metal.
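If you want to eyeball the galvanic side of this yourself, textbook standard electrode potentials give a rough ranking of metal pairs; quick sketch below (textbook values, and the 0.3 V "risky" cutoff is only a rule of thumb, not a hard spec; a real galvanic series in loop coolant will differ):

```python
from itertools import combinations

# Rank metal pairs in a loop by galvanic mismatch, using textbook standard
# electrode potentials (V vs. SHE). The 0.3 V cutoff is only a rough rule
# of thumb for "worth worrying about", not a hard specification.
potentials = {
    "silver": 0.80,
    "copper": 0.34,
    "nickel": -0.26,
    "zinc (in brass)": -0.76,
}

pairs = sorted(
    ((abs(potentials[a] - potentials[b]), a, b)
     for a, b in combinations(potentials, 2)),
    reverse=True,
)

for dv, a, b in pairs:
    flag = "risky" if dv > 0.3 else "mild"
    print(f"{a} / {b}: {dv:.2f} V ({flag})")
```

The silver/nickel gap coming out larger than silver/copper is the point here: a silver coil next to nickel plating is a worse pairing than the same coil in an all-copper loop.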


----------



## RavageTheEarth

Quote:


> Originally Posted by *GroupB*
> 
> Why not go full cooper and avoid nickel then?
> 
> For a full cooper setup distilled+coil is doing the job
> 
> after the EK nickel thing.. EK is now out of my loops forever


----------



## DeadlyDNA

I'm using distilled water + (automotive) coolant; do I still need to worry about biocide? I mean, it's almost all copper.
Copper, brass, and the connectors are ???


----------



## HardwareDecoder

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I'm using distilled water + coolant(auto) do i still need to worry about biocide? I mean its almost all copper.
> Copper,brass, and the connectors are ???


I'd imagine that if you are using an automotive coolant, nothing is going to grow in that loop anyway.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *skupples*
> 
> This TERRIBLE advice of distilled & silver needs to die, specially for nickel. Silver & nickel DO NOT GET ALONG in warm temp environments @ all. Hell, the terrible advice that distilled & biocide being enough also needs to end. This is a warning to all of you first time (maybe not ) water coolers in this thread...
> 
> These are expensive & easy to fix mistakes. 12.$ for a bottle of top of the line mayhem's additive. DO EET NOW if you got the EK nickel blocks, should do it regardless.


A link posted earlier in the thread (http://www.overclockers.com/pc-water-coolant-chemistry-part-ii/ ; can't remember who posted it) says that glycol-based additives (Mayhem XT-1 is based on ethylene glycol) "may result in a measurable drop in cooling performance."

Now I don't know if the low concentrations used in a loop would actually have an effect, but they also say that the lowered concentration might not supply all the needed additives.

I've heard good things about Swiftech HydrX, I think I'm gonna go with that.

I'm building my first loop soon, been doing waaay too much research into what parts to buy. Right now I'm probably gonna get EK Copper + Acetal blocks and XSPC or EK rads.


----------



## skupples

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I'm using distilled water + coolant(auto) do i still need to worry about biocide? I mean its almost all copper.
> Copper,brass, and the connectors are ???


Connectors are normally brass (zinc + copper). Red Line Water Wetter is a great automotive additive, but it can eat acrylic if the mixture is too strong. The auto additive you're using likely includes biocides along with anti-corrosion agents.

It's OK. More EK blocks for those of us who realize VRM/VRAM temps matter more than 2-3°C on the core.








Quote:


> Originally Posted by *chiknnwatrmln*
> 
> A link posted earlier in the thread (http://www.overclockers.com/pc-water-coolant-chemistry-part-ii/ can't remember who posted it) says that glycol based additives (Mayhem XT-1 is based on Ethylene Glycol) "may result in a measurable drop in cooling performance."
> 
> Now I don't know if the low concentrations used in a loop would actually have an effect, but they also say that the lowered concentration might not supply all the needed additives.
> 
> I've heard good things about Swiftech HydrX, I think I'm gonna go with that.
> 
> I'm building my first loop soon, been doing waaay too much research into what parts to buy. Right now I'm probably gonna get EK Copper + Acetal blocks and XSPC or EK rads.


This is true; any additive can reduce water's cooling capacity if overmixed. I <3 acetal, it's so much harder to crack than acrylic. Also, clear acrylic lets light into the system, which can promote the growth of bacteria. I'm going to try a bit of Red Line Water Wetter in conjunction with the new Mayhem product, going 5:100 XT-1 and 5:100 Water Wetter. Most of the coolants are close to identical, and some of them are less forthcoming about what they actually use. The point here is that pure water is bad. I was misled by this information years ago, and it's cost me and many of my friends.
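For anyone following along, those 5:100 ratios work out as simple parts-per-100 arithmetic. A minimal sketch, assuming a hypothetical 1 L loop (only the 5:100 figures come from the post above):

```python
# Hypothetical helper: convert a parts-per-100 additive ratio into
# millilitres for a given loop volume. Only the 5:100 ratios come from
# the post above; the 1 L loop volume is an assumed example.
def additive_ml(loop_ml: float, ratio_parts: float, per_parts: float = 100.0) -> float:
    """Millilitres of additive for a loop_ml loop at ratio_parts:per_parts."""
    return loop_ml * ratio_parts / per_parts

loop = 1000.0                     # assumed 1 L loop
xt1_ml = additive_ml(loop, 5)     # 50.0 ml of XT-1 at 5:100
wetter_ml = additive_ml(loop, 5)  # 50.0 ml of Water Wetter at 5:100
```

Scale `loop` to your actual fill volume; the ratio math stays the same.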


----------



## GroupB

I will be getting the XSPC Razor for my 290s. They don't look bad at all; the liquid also reaches VRM2 like the EK, and judging by the photos I think the water goes much deeper into VRM1 than on the EK. I'm pretty sure they're not bad at all.


----------



## skupples

Quote:


> Originally Posted by *GroupB*
> 
> I will be getting the razor XSPC for my 290's they dont look bad at all , the liquid also go to the vrm 2 like the EK and I think the water go way deeper into the vrm 1 than ek, just by the photo you can see, im pretty sure they not bad at all,


They came out with a new revision recently with a bigger VRM channel. Good stuff.

this is the new revision.



OK, I'm going to go hide before I get yelled @ for derailing.

good night, ladies and gentlemen

/endrant


----------



## DeadlyDNA

Cool then. As far as temps go, I'm adding 2 more dual 200 mm rads, so I'll have a 6x (or 12x) 120 mm rad plus 2x dual 200 mm rads. So I'm hoping for decent temps when I'm done, on 4 290s and my i7.
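As a sanity check on a plan like that, a rough back-of-envelope sketch; the ~300 W per card, ~150 W for the i7, and ~100 W per 120 mm radiator section are rule-of-thumb assumptions, not measurements from the post:

```python
# Rough heat-load vs radiator-capacity sketch. All wattage figures are
# rule-of-thumb assumptions for illustration only.
gpu_w = 4 * 300                  # four overclocked 290s, ~300 W each
cpu_w = 150                      # overclocked i7
total_w = gpu_w + cpu_w          # 1350 W worst case

# Express each radiator in "120 mm fan sections" by face area, then
# assume ~100 W of dissipation per section at moderate fan speeds.
dual200_sections = (400 * 200) / (120 * 120)   # ~5.6 sections per dual-200 rad
sections = 6 + 2 * dual200_sections            # 6x120 rad + two dual-200 rads
capacity_w = sections * 100                    # ~1700 W

print(capacity_w > total_w)                    # True: the plan should cope
```

The per-section wattage varies a lot with fan speed and fin density, so treat this as an order-of-magnitude check, not a guarantee.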


----------



## chiknnwatrmln

Quote:


> Originally Posted by *skupples*
> 
> connectors are normally brass (zinc + copper) Red Line water wetter is a great automotive additive, but can eat acrylic if the mixture is too high. The auto additive you are using likely includes biocides along with anti-corrosion.
> 
> It's OK. More EK blocks for those of us who realize vrm/vram temps > 2-3 c on the core.
> 
> 
> 
> 
> 
> 
> 
> 
> This is true, all additives can reduce cooling capacity of water if over mixed. I <3 acetal so much harder to crack than acrylic. Also, clear acrylic lets light into the system, which can promote the growth of bacteria. I'm going to be trying a bit of Red Line water wetter in conjunction with the new mayhem product.


I love the look of acetal, I'm doing a whole black/white thing with my next rig and it fits perfectly.

I originally wanted to do nickel blocks, but I figure copper is less risky and cheaper, and I'm not gonna be seeing the underside of the blocks that often, so it doesn't even matter in the long run.

I'm really looking forward to this build, I'm trying to do it right as I'm dropping >$1k and I'm not even buying any new core components...

Plus I plan on delidding my 3770k and going for 5GHz, that combined with a cool 290 and PT1 BIOS and I can get some sweet benching scores.
Quote:


> Originally Posted by *GroupB*
> 
> I will be getting the razor XSPC for my 290's they dont look bad at all , the liquid also go to the vrm 2 like the EK and I think the water go way deeper into the vrm 1 than ek, just by the photo you can see, im pretty sure they not bad at all,


I kinda want an XSPC loop over EK because I like the look better and it's cheaper. I can't find the 290x block anywhere though, I'm not sure where to find it or if it's even been officially released yet.


----------



## brazilianloser

Quote:


> Originally Posted by *GroupB*
> 
> I will be getting the razor XSPC for my 290's they dont look bad at all , the liquid also go to the vrm 2 like the EK and I think the water go way deeper into the vrm 1 than ek, just by the photo you can see, im pretty sure they not bad at all,


Quote:


> Originally Posted by *skupples*
> 
> They came out with a new revision recently with a bigger VRM channel. Good stuff.
> 
> this is the new revision.
> 
> 
> 
> ok, i'm going to go hide before I Get yelled @ for derailing.
> 
> good night ladies and gentleman
> 
> /endrant


Yeah, though they said it would be available at the beginning of last week... and so far nothing here in the US


----------



## tsm106

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I'm using distilled water + coolant(auto) do i still need to worry about biocide? I mean its almost all copper.
> Copper,brass, and the connectors are ???


You don't want to use an ethylene glycol based coolant, AFAIK. It will work, obviously, but it's throwing the kitchen sink at a simple problem that doesn't need it. You don't need anti-freeze capability, you don't need the 200°F+ temperature capability, you don't need the lubricating ability since PC pumps are designed to be self-lubricating, etc. All you need is a coolant that serves two main purposes: biocide and anti-corrosion. Basically all PC coolants fit this purpose.


----------



## ImJJames

Does the ARCTIC Accelero Xtreme III keep the VRMs cooler than the stock cooler? The stock cooler does an excellent job keeping the VRMs nice and cool.


----------



## geoxile

The prices of R9 290s are literally insane, literally...


----------



## ImJJames

Quote:


> Originally Posted by *geoxile*
> 
> The prices of R9 290s are literally insane, literally...


All AMD cards are lol, supply and demand my friend. You can thank miners for that.


----------



## MrWhiteRX7

A couple pages back I posted a link to a site still selling them for normal price. They get new stock on the 21st. First come first serve!


----------



## Sgt Bilko

Quote:


> Originally Posted by *brazilianloser*
> 
> Yeah they said though it would be available beginning of last week... and so far nothing here in the US


Hmm, looks like they will be arriving tomorrow for us in Aus : http://www.pccasegear.com/index.php?main_page=product_info&cPath=207_160_878_880_1504&products_id=25989

I'm still trying to decide if i want to wait for AIB or go water...........


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Hmm, looks like they will be arriving tomorrow for us in Aus : http://www.pccasegear.com/index.php?main_page=product_info&cPath=207_160_878_880_1504&products_id=25989
> 
> I'm still trying to decide if i want to wait for AIB or go water...........


H2o!!!!


----------



## Sgt Bilko

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> H2o!!!!


I'd love to but by the time i get the cash together for a loop the non-ref cards will be out.........hmm, will have to think about it some more.

but i'm leaning towards getting wet


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'd love to but by the time i get the cash together for a loop the non-ref cards will be out.........hmm, will have to think about it some more.
> 
> but i'm leaning towards getting wet


I'd say if you really only want one 290 then a non ref card would be good but multi gpu water ftw ?


----------



## Arizonian

Quote:


> Originally Posted by *Faint*
> 
> Just got my card today. Haven't really messed with it yet as I have finals this week to worry about.
> 
> 1. http://www.techpowerup.com/gpuz/g5hk3/
> 2. Sapphire R9 290
> 3. Stock cooler


Congrats - added








Quote:


> Originally Posted by *robyneil*
> 
> I'm new here. Thank you for all the info and guides provided. Can you please add me robyneil.
> 
> *Brand:* PowerColor AXR9 290 4GBD5-MDH/OC Radeon
> *Store:* Newegg Item #: N82E16814131527
> *Purchase Date:* 11/21/2013
> *S/N:* AZG1311037XXX
> *Cooler:* Stock air
> *Fans:* Ultra Rogue M950 6 fans + GPU stock fan + CPU CM 212 EVO fan
> *Benchmark Tool:* 3dmark 11
> *Benchmark Highest GPU Temp:* 66C
> *Benchmark Highest CPU Temp:* 73C
> *Overclocking Tool:* MSI AfterBurner
> *Quiet Bios:* Stock Powercolor r9 290 Bios
> *Uber Bios:* Asus r9 290x Bios
> 
> *stock bios r9 290 overclocked to 1000/1250*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> *stock bios r9 290 overclocked to 1154/1401*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> *unlocked Asus bios r9 290 to r9 290x default settings*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> *unlocked Asus bios r9 290 to r9 290x overclocked to 1154/1401*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Initially had lots of BSOD with hard restarts playing BF4 stock no OC but glad I didn't give up and didn't RMA it back. I believe a lot of the RMA'd r9 290 could have worked out. It's been stable after reseating the card, updating drivers, overclocking, and patiently playing BF4 until it got stable (lots of restarts during the 2nd and 3rd day)...don't really know what fixed it.
> 
> For some reason, I can overclock r9 290(not r9 290x) to 1208/1500 using the Powercolor stock bios but not with the Asus r9 290x bios (only up to 1154/1401 with no artifacts).


Congrats - added









Quote:


> Originally Posted by *HardwareDecoder*
> 
> Oh yeah I guess you can add me to the club. sapphire 290x gonna be waterblocked tomorrow w/ ek nickle+acetal ill post pics when the blocking is done
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and a pic of my dog with demon eyes
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Sgt Bilko

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I'd say if you really only want one 290 then a non ref card would be good but multi gpu water ftw ?


I want 2, i'm still waiting on my RMA though so i'll decide after Xmas, so little cash this time of year


----------



## HOMECINEMA-PC

Hey guys just got my 2nd 290 .

But GPU 2 usage is all over the shop, a dog's breakfast. I've read here somewhere that there is a fix for it, but this thread moves so bloody fast I've got no idea where it is. Anyone enlighten me please?









Thanks in advance

Your friendly neighbourhood MADMAN


----------



## Hogesyx

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Hey guys just got my 2nd 290 .
> 
> But gpu 2 usage is all over the shop , dogs breakfast . Ive read here somewhere that there is a fix for it , but this thread moves so bloody fast got no idea where it is . Anyone enlighten me please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks in adavnce
> 
> Your freindly neighbourhood MADMAN


There are a lot of factors that will affect GPU usage of the secondary card: the profiling based on which game you are running, whether a custom profile is made, and which CrossFire mode is in use.


----------



## RAFFY

Quote:


> Originally Posted by *RAFFY*
> 
> *Also is one 1mm 50x60 fujipoly and one .5mm 50x60 fujipoly enough for 3 GPUs?*


Can someone please answer this for me.

Also is anyone here running dual PSU's? If so what multi power supply adapter are you using? ADD2PSU?


----------



## Amhro

Any info about non-ref 290s?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Hogesyx*
> 
> There is a lot of factors that will affect GPU usage of the secondary card due to the profiling etc based on what game you are running and if a custom profile is made, and which crossfire mode.


Thanks for the quick reply









Just trying to bench 3DMark 11. These are my first 'red things', so I'm new to this. How many CrossFire modes are there?


----------



## robyneil

Quote:


> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added


Thank you.


----------



## Widde

Got my MSI R9 290 last thursday









Looking to add another card, but not sure that my PSU can handle it :/ I have a Cooler Master Silent Pro M2 720W; looks like it won't have enough on the 12V rail to feed 2 cards. Any thoughts? Otherwise I was looking at a Corsair RM 1000W.

Image wasn't looking that good; better link: http://piclair.com/mj9wm
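A quick 12 V budget sketch for the two-card question; the ~300 W per 290 and ~200 W for the rest of the system are assumed round numbers for illustration, so check the PSU's actual 12 V label rating:

```python
# Back-of-envelope 12 V rail budget. The per-component wattages are
# assumptions, not measurements from the post.
def amps_at_12v(watts: float) -> float:
    """Current drawn on the 12 V rail for a given wattage."""
    return watts / 12.0

load_w = 2 * 300 + 200        # two 290s plus CPU/board/drives = 800 W
amps = amps_at_12v(load_w)    # ~66.7 A needed on the 12 V rail
# A 720 W unit that delivers most of its power on 12 V offers roughly
# 60 A, which is why a ~1000 W PSU is the safer pick for OC'd CrossFire.
```

Overclocking and voltage bumps push the per-card figure higher still, so headroom matters more than the nameplate wattage.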


----------



## fleetfeather

long time no see







have the 290/290x problems all been sorted out recently?


----------



## Sgt Bilko

Quote:


> Originally Posted by *fleetfeather*
> 
> long time no see
> 
> 
> 
> 
> 
> 
> 
> have the 290/290x problems all been sorted out recently?


Somewhat. The black screen issues seem to have only affected the first batches of cards, and a driver release fixed it for some; I don't think there are many here that still have Day 1 cards, most RMA'd (like myself) for various reasons.

AIB cards are supposed to be due out late Dec to mid Jan from what I've been hearing (would love some confirmation on that though)

Other than that, same old same old


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Somewhat, The black screen issues seem to have only affected the first batches of cards and a driver release fixed it for some, i don't think there is many here that have Day 1 cards anymore, most RMA'd (like myself) for various reasons.
> 
> AIB cards are supposed to be due out late Dec to Mid Jan from what i've been hearing (would love some conformation on that though)
> 
> Other than that, same old same old


This black screen thing you're on about, umm, it's taking a lot of my precious benching time between scenes. Is that what you're on about, yes?


----------



## Arizonian

Quote:


> Originally Posted by *Widde*
> 
> Got my MSI R9 290 last thursday
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Looking to add another card but not sure that my psu can handle it :/ Have a Cooler Master Silent Pro M2 720W, Looks like it wont have enough on the 12v rail to feed 2 cards, any thoughts? Otherwise I was looking at a Corsair RM 1000W
> 
> image wasnt looking that good, better link http://piclair.com/mj9wm


Three things. First welcome to OCN with first post.







Second - congrats - added to club









Third, you're better off with a *Cooler Master V1000*, which I feel is a better PSU and $10 less.









JonnyGuru review: Cooler Master V1000, 9.7 score

Read about the models you're looking for in an easy guide / index here on OCN - *PSU index thread*


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> This black screen thing your on about , Umm this is taking alot of my precious benching time between scenes thats what your on about yes ?


Hard lock-up?

Screen goes Black and stays that way until a hard reset?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Hard lock-up?
> 
> Screen goes Black and stays that way until a hard reset?


No.
A black screen for 5-10 sec between scenes while benchmarking ............. very annoying, like my up-and-down GPU 2 usage


----------



## Widde

Quote:


> Originally Posted by *Arizonian*
> 
> Three things. First welcome to OCN with first post.
> 
> 
> 
> 
> 
> 
> 
> Second - congrats - added to club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Third your better off with a *Cooler Master V1000* which I feel is a better PSU and $10 less.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Johnny Guru Review - Cooler Master V1000 9.7 Score
> 
> Read about the models your looking for in an easy guide / index here on OCN - *PSU index thread*


Thanks







Will have a look at it, doubt my little thing will be enough, maybe for stock but hey where's the fun in that


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> No .
> Black 5-10sec screen between scenes while benchmarking ............. very annoying ,like my up and down gpu 2 useage


I never had CF so i'm not sure about the usage but it sounds like the driver might be messing about somewhere.

have you tried different drivers?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I never had CF so i'm not sure about the usage but it sounds like the driver might be messing about somewhere.
> 
> have you tried different drivers?


No, I updated to the latest beta last night before I got the 2nd one this arvo. It was doing the same thing on one card.


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> No i updated to the latset beta last nite before i got 2nd one this arvo . It was doing the same thing on one card


I take it you've tried swapping the cards around?

If it's always GPU2 doing it then it might be the slot.

EDIT: Seems unlikely seeing as you were running Tri-SLI 760's before.


----------



## brazilianloser

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> No i updated to the latset beta last nite before i got 2nd one this arvo . It was doing the same thing on one card


Just a warning. I would uncheck that apply OC at startup. If you come across a very unstable OC where you can't even get into the desktop before crashing you are going to find yourself in a loop of death. Causing major headaches. Just a simple warning.

And what are you using there to stress those cards? None of my benchmarks or games give me such a solid usage on one of the cards like I am seeing there. They both go around having seizures.


----------



## Widde

Quick question, How do you add it to look like this? http://piclair.com/n0eg8


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *brazilianloser*
> 
> Just a warning. I would uncheck that apply OC at startup. If you come across a very unstable OC where you can't even get into the desktop before crashing you are going to find yourself in a loop of death. Causing major headaches. Just a simple warning.
> 
> And what are you using there to stress those cards? None of my benchmarks or games give me such a solid usage on one of the cards like I am seeing there. They both go around having seizures.


Firestrike extreme


----------



## brazilianloser

Quote:


> Originally Posted by *Widde*
> 
> Quick question, How do you add it to look like this? http://piclair.com/n0eg8


The code is in the very first page. Just copy and paste on your signature under your profile.


----------



## Widde

Quote:


> Originally Posted by *brazilianloser*
> 
> The code is in the very first page. Just copy and paste on your signature under your profile.


Thanks


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Firestrike extreme


So have you tried swapping the cards around?

Might have a dud in the 2nd slot


----------



## Arizonian

Quote:


> Originally Posted by *Widde*
> 
> Quick question, How do you add it to look like this? http://piclair.com/n0eg8


First - list your rig. OCN members like to see what you're playing with; it's especially helpful if you have questions.

*"How to put your Rig in your Sig"*

To add the signature to your profile:

Go to 'My Profile' at top of page / scroll down to 'Edit Signature' / add following by copy and pasting this below:

Code:



Code:


[CENTER] [URL=http://www.overclock.net/t/1436497/amd-r9-290x-290-owners-club#post_21044503][B] :clock:[Official] AMD R9 290X Owners Club :clock: [/B][/URL] [/CENTER]


----------



## Widde

Should be in the sig now







Already loving the site btw







Gonna check around alittle more


----------



## TheSoldiet

Still getting black screens randomly... It is less than before though  I got the day 1 batch btw. Do I have to rma?


----------



## Banedox

Hmm I should try to push my card again...


----------



## brazilianloser

Quote:


> Originally Posted by *TheSoldiet*
> 
> Still getting black screens randomly... It is less than before though  I got the day 1 batch btw. Do I have to rma?


Have you tried the various tweaks (BIOS and such): making sure the card is seated properly, enough power, recent and some older drivers, updated mobo drivers, a clean install? If it's still happening after all that, and worst of all happening at stock settings, then I would RMA it. No point settling for a messed-up card when you could have a good one out of the box like most folks.


----------



## TheSoldiet

Yeah, running the Asus BIOS and it still happens. I have updated chipset drivers, the motherboard BIOS, and GPU drivers. It does not happen very often though, once a day maybe. I haven't tried 13.11 WHQL; I might wanna try that.

Samsung galaxy s3


----------



## sugarhell

@homecinema. Disable ulps. Also untick voltage monitoring on msi AB.


----------



## JordanTr

OK guys, I received my Gelid Icy Vision Rev. 2. Going to mount it tomorrow, 'cause I'm working. Can anyone give me a quick guide to doing this? Which heatsinks should I use for the VRM? Can I put some thermal compound on the VRM before sticking the heatsinks on, or will they not stick to it? Thx in advance.


----------



## Falkentyne

Quote:


> Originally Posted by *Derpinheimer*
> 
> So why use PT1 instead of PT3? shouldnt pt3 let you have lower idle voltages? I mean at 1460mV mine droops to ~1290mV


PT3 is a DANGEROUS BIOS.
Before you even THINK about using PT3, try using Afterburner with the command to disable LLC (the command should be in the AB thread somewhere on guru3d).

I found it,
you can disable vdroop (the same thing that the PT3 bios does) by entering in these commands in afterburner CMD line

/ai4,30,38,7f (enable loadline)

It can be set back with:

/oi4,30,38,80 (default)

IF you are not on the most extreme air cooling or water cooling, you will most likely BLACKSCREEN crash from temps exceeding 100C within 30 seconds in Valley or any other stress test/benchmark.
If you were using PT3 instead, you would have no failsafe protection besides the absolute thermal trip, and the card might just die.


----------



## nemm

@RAFFY

You could get away with 1x 60mm x 50mm x 1mm and 2x 60mm x 50mm x 0.5mm, but not 1 of each. Myself, I would go 2 of each. Memory chips are 12mm x 10mm IIRC, which is 25 chips covered by each sheet, and since you have 48 chips to cover, 2 sheets is fine.
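The chip-count arithmetic above can be checked quickly; the 60 x 50 mm sheet and 12 x 10 mm chip sizes are from the post, the 16-chips-per-card split is an assumption:

```python
# Verify the "25 chips per sheet" figure: a 60 mm x 50 mm Fujipoly
# sheet cut into 12 mm x 10 mm pieces for the memory chips.
sheet_w, sheet_h = 60, 50
chip_w, chip_h = 12, 10
per_sheet = (sheet_w // chip_w) * (sheet_h // chip_h)  # 5 * 5 = 25 pieces

def sheets_needed(chips: int) -> int:
    """Whole sheets required, rounding up (ceiling division)."""
    return -(-chips // per_sheet)

# 48 chips (assumed three cards at 16 chips each) -> 2 sheets
```

This counts pieces by grid cutting only; any waste from imperfect cuts would push the estimate up, which is the poster's argument for buying 2 of each thickness.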


----------



## crun

Very happy with the undervolting results of my R9 290(X). Minimum core voltage in AB (0.98V according to GPU-Z) is stable* at 950/1250 clocks. Temperature maxed out at 83°C @ 45% fan speed in Metro.



*30 minutes of Heaven and 3 runs of Metro benchmark. 1300 memory (from the screenshot) was unstable in Windows (artefacts-lines) and in Metro. Battlefield 4 might further reduce clocks (not much if at all I guess)
Quote:


> Just a warning. I would uncheck that apply OC at startup. If you come across a very unstable OC where you can't even get into the desktop before crashing you are going to find yourself in a loop of death. Causing major headaches. Just a simple warning.


Exactly what happened to me after trying 1400 memory clock @ minimum voltage. Black screen, same thing after booting. Then some windows-loading errors. Solved it after 3 reboots by deleting the MSI AB process quickly and changing the profile manually to a lower memory clock


----------



## Widde

Offtopic, just something that caught my eye http://piclair.com/zf26i is this fine or? Because a friends looked like this http://piclair.com/2nxjz . Later card is a 660ti btw.


----------



## velocityx

Quote:


> Originally Posted by *Widde*
> 
> Offtopic, just something that caught my eye http://piclair.com/zf26i is this fine or? Because a friends looked like this http://piclair.com/2nxjz . Later card is a 660ti btw.


When idle, the PCIe speed goes down, at least in GPU-Z. Hit the benchmark there and it will go up. Funny though, if you'd hit that question mark there, I wouldn't have to reply here ;]


----------



## Widde

Quote:


> Originally Posted by *velocityx*
> 
> when in idle, the pcie speed goes down. atleast in gpu-z. hit the benchmark there and it will go up.


Was about to remove the post, tried furmark and it went to [email protected] ^^


----------



## voldomazta

Hey guys,
Is it okay to tri-fire 2x ASUS and 1x Sapphire 290Xs and expect good OC cooperation between them?
The 2x ASUS ones are on their way to me, and the one left in stock at the local store is a Sapphire one, that's why.
TIA.


----------



## Hattifnatten

Brand has nothing to do with it. Only thing that matters is model. So yes, it will work


----------



## voldomazta

sweet. pulling the trigger nao.


----------



## ghabhaducha

Quote:


> Originally Posted by *voldomazta*
> 
> Hey guys,
> Is it okay to tri-fire 2x ASUS and 1x Sapphire 290x's and expect good OC cooperation between them?
> 2x ASUS ones are on their way to me and one left on stock on the local store is a Sapphire one that's why.
> TIA.


There shouldn't be any problem, especially since they are pretty much the same reference card regardless of the manufacturer.

On another note, I ordered the XFX R9 290 from Tigerdirect, and it's now back-ordered (I got lucky and was able to order it right before the listing disappeared). It says 7-21 days shipping or something along those lines. I wonder how long it actually takes if someone has had experience with this already.


----------



## specopsFI

Quote:


> Originally Posted by *crun*
> 
> Very happy with undervolting results of my R9 290(X). Minimum core voltage in AB (0.98V according to the GPU-Z?) is stable* on 950/1250 clocks. Temperature maxed out at 83c @ 45% fan speed in Metro.
> 
> 
> 
> *30 minutes of Heaven and 3 runs of Metro benchmark. 1300 memory (from the screenshot) was unstable in Windows (artefacts-lines) and in Metro. Battlefield 4 might further reduce clocks (not much if at all I guess)
> Exactly what happened to me after trying 1400 memory clock @ minimum voltage. Black screen, same thing after booting. Then some windows-loading errors. Solved it after 3 reboots by deleting the MSI AB process quickly and changing the profile manually to a lower memory clock


I've tried undervolting too and at first it looked fine (could loop Heaven for ages, cruise around in Sleeping Dogs etc.). Only when I tried 3DMark I started to have problems. No crashes or anything, but it black screened a couple of times right after a test scene started after the loading screen. Mind you, my card has never black screened on stock or overclocked. I came to the conclusion that undervolting (which seems to affect every power state, not just the max clock state) doesn't play nice with the changing power states of which the 3DMark is a fine example. When you have a constant load keeping the clocks at full tilt, then no problem. Once the game or benchmark puts the card under varying load, it might be more problematic.


----------



## Gero2013

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> For those that have been having low gpu usage causing stutter or hiccups in bf4, has the new update helped? My buddy with an 8350 said this made a big difference in gpu usage on his 290.


Wipe the Nvidia drivers with Driver Sweeper in Safe Mode, then install the AMD drivers.
I had the same problem; it's gone now.
Unfortunately the LAGS codec is all messed up now









----------



## Banedox

Anyone still looking for an EK block and backplate? They seem to be sold out just about everywhere......


----------



## Sgt Bilko

Quote:


> Originally Posted by *Banedox*
> 
> Anyone still looking for a EK Block and Backplate, they seem to be sold out just about everywhere......


XSPC, Aquacomputer and Koolance have blocks out if that helps.

EDIT: XSPC also have Backplates as well, Not sure about the other two


----------



## Banedox

Quote:


> Originally Posted by *Sgt Bilko*
> 
> XSPC, Aquacomputer and Koolance have blocks out if that helps.
> 
> EDIT: XSPC also have Backplates as well, Not sure about the other two


Alright, from the results I have seen the EK block performs the best so far... Idk, I've always liked their stuff.
On that note, I actually have the EK 290X block + backplate; I was just seeing if anyone else needed one....


----------



## Sgt Bilko

Quote:


> Originally Posted by *Banedox*
> 
> Alright, from the results I have seen the EK block performs the best so far... Idk always liked their stuff
> On that note I actually have the EK 290x + Backplate, was just seeing if anyone else needed one....


Ah, my bad then









EK's blocks also look the best imo, and they are sold out here too of course


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> So have you tried swapping the cards around?
> 
> Might have a dud in the 2nd slot


Nah its bloody software / driver related .....thank gawd









Quote:


> Originally Posted by *sugarhell*
> 
> @homecinema. Disable ulps. Also untick voltage monitoring on msi AB.


Yep, figured it out, plus got rid of a lot of conflicting programs too .......



http://www.3dmark.com/3dm11/7638360



http://www.3dmark.com/3dm/1821912

Now thats what i call a good start....


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Nah its bloody software / driver related .....thank gawd
> 
> 
> 
> 
> 
> 
> 
> 
> Yep figured it out plus got rid of a lot of conflicting programms too .......
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/7638360
> 
> 
> 
> http://www.3dmark.com/3dm/1821912
> 
> 
> 
> Now thats what i call a good start....


Glad you got it worked out









These 290/Xs are becoming a collector's item it seems..... hopefully I can manage to get my replacement back soon, the 7970 just doesn't feel the same


----------



## Banedox

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ah, my bad then
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EK's blocks also look the best imo, and they are sold out here too of course


They do look the best, but Performance PCs sent me the Nickel/Plexi block. I'm pretty sure I ordered a Nickel/Acetal, but they said I ordered the other one. I think I'm gonna ship it back to them... idk...


----------



## Raephen

Quote:


> Originally Posted by *Sgt Bilko*
> 
> XSPC, Aquacomputer and Koolance have blocks out if that helps.
> 
> EDIT: XSPC also have Backplates as well, Not sure about the other two


Aquacomputer has a backplate, too, with added cooling for the back of the long VRM strip.

But that one looks like a hassle to install, at least on a block already in use: you have to remove the port mount by undoing some hex screws and replace it by the bigger port mount attached to the backplate.

Extra cooling for the long VRM strip can be nice, I guess (if it's not just an epeen enlargement gimmick), but those are cool enough as is... It's the 3 on top that I'd want to cool more.

Though replacing the stock thermal pads with thicker pads did seem to help for the 3 in my case.

After a good gaming session: Core 47C, VRM1 (the 3) 44C, VRM2 (the long strip) 41C.

Though that's Skyrim 1080p 60Hz, and while card usage is 100%, the clockspeed never really tops out.

Could that be because it's a DX9 game, or just because top clocks aren't needed for the performance?

I guess I'll start saving up for a 120Hz+ monitor... Never really thought my monitor could be a bottleneck


----------



## darkelixa

Here's to hoping my xfx 290 gets here before xmas


----------



## HOMECINEMA-PC

Last one for the nite.... I'm spent











http://www.3dmark.com/3dm/1822062


----------



## Sgt Bilko

Quote:


> Originally Posted by *Raephen*
> 
> Aquacomputer has a backplate, too, with added cooling for the back of the long VRM strip.
> 
> But that one looks like a hassle to install, at least on a block already in use: you have to remove the port mount by undoing some hex screws and replace it by the bigger port mount attached to the backplate.
> 
> Extra cooling for the long VRM strip can be nice, I guess (if it's not just an epeen enlargement gimmick), but those are cool enough as is... It's the 3 on top that I'd want to cool more.
> 
> Though replacing the stock thermal pads for thicker pads did seem to help for the 3 in my case.
> 
> After a good gaming session: Core 47C, VRM1 (the 3) 44C, VRM2 (the long strip) 41C.
> 
> Though that's Skyrim 1080p 60Hz, and while card usage is 100%, the clockspeed never really tops out.
> 
> Could that be because it's a DX9 game, or just because top clocks aren't needed for the performance?
> 
> I guess I'll start saving up for a 120Hz+ monitor... Never really thought my monitor could be a bottleneck


Ah, the upgrade bug....... I have far too many options and not enough cash atm: Eyefinity or 120Hz, another 290, a water loop......... too many goodies and not enough cash.


----------



## rdr09

Quote:


> Originally Posted by *Pesmerrga*
> 
> How about this?
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814121671
> 
> Even worse than that is this..
> 
> http://www.mwave.com/mwave/SKUSearch.asp?scriteria=BF01542&pagetitle=Sapphire%2021226-00-53G%20Radeon%20R9%20290X%204GB%20DDR5%20BattleFie#.UqaX3kCTIdE


Miners are willing, lol.

Edit: regular miners, not the miners here on OCN who know better.


----------



## Banedox

Quote:


> Originally Posted by *rdr09*
> 
> miners are willing. lol
> 
> edit: regular miners not the miners here in ocn who know better.


Geez, the 780 Ti is looking darn good right now. I can return my unlocked 290 to Amazon, but I just need to offload my waterblocks before I do that..... I maybe wanted to pick up another 290X, but it doesn't look like that will be possible at manufacturer price..


----------



## Raephen

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ah, the upgrade bug....... I have far too many options and not enough cash atm: Eyefinity or 120Hz, another 290, a water loop......... too many goodies and not enough cash.


Lol! Indeed, I've caught the same bug









Though I'll think and wait on another 290 till after I get something like an Asus VG248QE - or the 27" model.

Now I get a constant 60fps in Skyrim, and perhaps it's my imagination, but I just feel like my card is telling me: I want to go faster, I can go faster!!!

But still, a +1 to you for the lol you gave me


----------



## HardwareDecoder

Quote:


> Originally Posted by *Banedox*
> 
> Geez, the 780 Ti is looking darn good right now. I can return my unlocked 290 to Amazon, but I just need to offload my waterblocks before I do that..... I maybe wanted to pick up another 290X, but it doesn't look like that will be possible at manufacturer price..


So basically you just can't get it stable even @ stock clocks, you say? What is it, low GPU usage or what? I think you said already but I forget.


----------



## rdr09

Quote:


> Originally Posted by *Banedox*
> 
> Geez, the 780 Ti is looking darn good right now. I can return my unlocked 290 to Amazon, but I just need to offload my waterblocks before I do that..... I maybe wanted to pick up another 290X, but it doesn't look like that will be possible at manufacturer price..


Amazon? Sell it.

Get as much as you can, and you might be able to upgrade the rest of your sig (if it is up-to-date).


----------



## Banedox

Quote:


> Originally Posted by *HardwareDecoder*
> 
> So basically you just can't get it stable even @ stock clocks, you say? What is it, low GPU usage or what? I think you said already but I forget.


It definitely seems stable at stock clocks, but I feel like it is not performing as it should..... I haven't really had the time to test stuff until this weekend... damn 10-hour work days and work at 6am...


----------



## VSG

Quote:


> Originally Posted by *RAFFY*
> 
> Can someone please answer this for me.
> 
> Also is anyone here running dual PSU's? If so what multi power supply adapter are you using? ADD2PSU?


I don't think a 50x60mm sheet is enough for 2 GPUs, let alone 3. The VRAMs take up a lot.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Raephen*
> 
> Lol! Indeed, I've caught the same bug
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Though I'll think and wait on another 290 till after I get something like an Asus VG248QE - or the 27" model.
> 
> Now I get a constant 60fps in Skyrim, and perhaps it's my imagination, but I just feel like my card is telling me: I want to go faster, I can go faster!!!
> 
> But still, a +1 to you for the lol you gave me


Well, I've decided to go for Eyefinity. With my net, online shooters are a no-go zone so I don't really need the extra fps, but with 4GB of GDDR goodness I can see a 3 x 1 setup in my future









I already have 2 x 24" screens, just one more to go, and another 290X, then the water loop when I get sick of the Rolls Royces howling









and thank you sir!!


----------



## Slomo4shO

Quote:


> Originally Posted by *RAFFY*
> 
> That is because those motherboards will bottleneck the GPU's. This is why I am dumping my ASUS Maximus VI Formula for a ASUS IV Black Edition to run my TRI-Fire setup.


The Asus Maximus VI Extreme actually uses a PLX chip and runs at x8/x16/x8/x8 in quad-fire. The concern with my system isn't bandwidth but the higher latency because of the PLX chip.


----------



## CriticalHit

Hmmm.. anyone else having issues where Crossfire disables when you alt-tab to the desktop and come back... the only way to get it running again is to close and restart the application?

I've disabled ULPS but it hasn't helped.... only a minor thing, hopefully it's on the to-do list for the driver team..

(beta 9.5)


----------



## Jpmboy

Quote:


> Originally Posted by *Slomo4shO*
> 
> The Asus Maximus VI Extreme actually uses a PLX chip and runs at x8/x16/x8/x8 in quad-fire. The concern with my system isn't bandwidth but the higher latency because of the PLX chip.


I believe you can test the PLX "latency" by checking the concurrent PCIe bandwidth. There are command-level tests for both AMD and NV. So far, in tests with the E11 and X79-E WS there was no difference in response time or bandwidth vs the R4E. I'll compare to the R4BE in a week or so. Should be the same as the R4E.

Search "concurrent bandwidth test"


----------



## cyberwave

Can I ask which driver version are you guys using with your R9 290?

I'm getting this huge cursor issue with 13.11 Beta 9.5 and while it is not causing any performance issues, it sure is annoying...


----------



## Sgt Bilko

Quote:


> Originally Posted by *cyberwave*
> 
> Can I ask which driver version are you guys using with your R9 290?
> 
> I'm getting this huge cursor issue with 13.11 Beta 9.5 and while it is not causing any performance issues, it sure is annoying...


Most people are running the 9.2 or 9.5 Beta drivers from what i've seen.


----------



## cyberwave

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Most people are running the 9.2 or 9.5 Beta drivers from what i've seen.


Anyone else getting this gigantic cursor issue?


----------



## MrWhiteRX7

13.11 whql here


----------



## Sgt Bilko

Quote:


> Originally Posted by *cyberwave*
> 
> Anyone else getting this gigantic cursor issue?


Are you saying that the mouse cursor has grown?

If so, I can't say I've heard of it. I was using the 13.11 WHQL driver with my 290x and I never had anything like that.


----------



## quakermaas

Quote:


> Originally Posted by *Falkentyne*
> 
> PT3 is a DANGEROUS Bios.
> Before you even THINK about using PT3, try using afterburner with the command to disable LLC (should have the command in the AB thread somewhere on guru3d.
> 
> I found it,
> you can disable vdroop (the same thing that the PT3 bios does) by entering in these commands in afterburner CMD line
> 
> /ai4,30,38,7f (enable loadline)
> 
> It can be set back with:
> 
> /oi4,30,38,80 (default)
> 
> IF you are not on the most extreme air cooling or water cooling, you will most likely BLACKSCREEN crash from temps exceeding 100C within 30 seconds in Valley or any other stress /benchmark.
> If you were using PT3 instead, you would have no failsafe protection besides the absolute thermal trip, and the card might just die.


Good advice.

Seems like many noobs are trying it, even on the stock cooler







, it's going to end in tears.


----------



## cyberwave

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Are you saying that the mouse cursor has grown?
> 
> If so, I can't say I've heard of it. I was using the 13.11 WHQL driver with my 290x and I never had anything like that.


Well, it's like my resolution is set to 1080p but the cursor remained the same size as though it were in 800x600. At least 2.5 times the normal size.


----------



## Slomo4shO

Quote:


> Originally Posted by *RAFFY*
> 
> Can someone please answer this for me.
> 
> Also is anyone here running dual PSU's? If so what multi power supply adapter are you using? ADD2PSU?


The single 1mm pad should be enough for the VRMs since they don't take up much surface area. However, the .5mm pads are for memory, and each memory chip seems to be around 16mm by 12.5mm. Based on those dimensions, each 60 by 50 pad will cover ~12 chips. Each card has 16 memory chips, so you will likely need four 60mm by 50mm .5mm pads for the three cards.
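The pad arithmetic above can be sketched in a few lines; the 60x50mm sheet, ~16x12.5mm chip, and 16-chips-per-card figures are the rough estimates from the post, and the sketch assumes chips are cut from each sheet in a simple grid with no rotation:

```python
import math

def pads_needed(pad_mm=(60, 50), chip_mm=(16.0, 12.5),
                chips_per_card=16, cards=3):
    """Estimate how many thermal pad sheets cover all memory chips."""
    across = math.floor(pad_mm[0] / chip_mm[0])    # 60 / 16   -> 3 chips
    down = math.floor(pad_mm[1] / chip_mm[1])      # 50 / 12.5 -> 4 chips
    chips_per_sheet = across * down                # 12 chips per sheet
    total_chips = chips_per_card * cards           # 48 chips for 3 cards
    return math.ceil(total_chips / chips_per_sheet)

print(pads_needed())  # 4 sheets for three cards
```

With those dimensions one sheet yields a 3x4 grid of cutouts, so three cards' worth of memory (48 chips) comes out to four sheets, matching the count in the post.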


----------



## tsm106

Quote:


> Originally Posted by *RAFFY*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RAFFY*
> 
> *Also is one 1mm 50x60 fujipoly and one .5mm 50x60 fujipoly enough for 3 GPUs?*
> 
> 
> 
> Can someone please answer this for me.
> 
> Also is anyone here running dual PSU's? If so what multi power supply adapter are you using? ADD2PSU?
Click to expand...

You could be OK on the .5mm with two sheets if you are very frugal about it. One sheet is more than enough for the VRMs.

Regarding psu, I use this cable.

http://www.ebay.com/itm/Dual-Power-Supply-Adapter-two-power-supplies-one-motherboard-W-load-MADE-IN-USA-/131059676680?pt=US_Adapters&hash=item1e83c3f608


----------



## Jpmboy

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> 13.11 whql here


Same here with 290x.


----------



## Sgt Bilko

Quote:


> Originally Posted by *cyberwave*
> 
> Well, it's like my resolution is set to 1080p but the cursor remained the same size as though it were in 800x600. At least 2.5 times the normal size.


Do a driver wipe then re-install 13.11 WHQL, it seems to be the most stable for people.

As for the cursor......no idea, maybe you need to stop feeding it







(jks)


----------



## Neutronman

Interesting, I just unlocked my Powercolor R9 290 to a 290X using the ASUS 290X rom.
There is less than 3fps difference in Valley at 1000/1250 and Shader Toy Mark went up from 645 to 685.....

AMD really did not hobble the 290 much. Kind of makes a mockery of the $150 price difference imo......

Most likely I will just flash back to the original bios and overclock like usual....


----------



## ImJJames

Quote:


> Originally Posted by *Neutronman*
> 
> Interesting, I just unlocked my Powercolor R9 290 to a 290X using the ASUS 290X rom.
> There is less than 3fps difference in Valley at 1000/1250 and Shader Toy Mark went up from 645 to 685.....
> 
> AMD really did not hobble the 290 much. Kind of makes a mockery of the $150 price difference imo......
> 
> Most likely I will just flash back to the original bios and overclock like usual....


Why flash back?


----------



## quakermaas

Quote:


> Originally Posted by *Neutronman*
> 
> Interesting, I just unlocked my Powercolor R9 290 to a 290X using the ASUS 290X rom.
> There is less than 3fps difference in Valley at 1000/1250 and Shader Toy Mark went up from 645 to 685.....
> 
> AMD really did not hobble the 290 much. Kind of makes a mockery of the $150 price difference imo......
> 
> Most likely I will just flash back to the original bios and overclock like usual....


Why not keep it as a 290x and overclock as usual ?


----------



## specopsFI

Quote:


> Originally Posted by *cyberwave*
> 
> Well, it's like my resolution is set to 1080p but the cursor remained the same size as though it were in 800x600. At least 2.5 times the normal size.


I have this, too. Posted it earlier, no one responded. Seems to be hit and miss. I'm using beta 9.4 and I have it when using my main screen, but not with my TV









If you have a second monitor or a different type of input for your screen, I'd give it a go. I'd reckon it's a compatibility issue between the driver and some monitors.


----------



## RAFFY

Quote:


> Originally Posted by *tsm106*
> 
> You could be ok on the .5mm with two sheets if you are very frugal about it. One sheet is more than enough for the vrms.
> 
> Regarding psu, I use this cable.
> 
> http://www.ebay.com/itm/Dual-Power-Supply-Adapter-two-power-supplies-one-motherboard-W-load-MADE-IN-USA-/131059676680?pt=US_Adapters&hash=item1e83c3f608


Thank you.


----------



## Neutronman

Quote:


> Originally Posted by *quakermaas*
> 
> Why not keep it as a 290x and overclock as usual ?


Good question. I have an MSI 290 that did not unlock, and as I run them in Crossfire it makes no sense to have a 290X and a 290. I may as well have a pair of matched 290s; makes overclocking and volting a lot easier....


----------



## ImJJames

Quote:


> Originally Posted by *Neutronman*
> 
> Good question. I have an MSI 290 that did not unlock, and as I run them in Crossfire it makes no sense to have a 290X and a 290. I may as well have a pair of matched 290s; makes overclocking and volting a lot easier....


Running CF 290x/290 will be equally as easy as running CF 290/290


----------



## cyberwave

Quote:


> Originally Posted by *specopsFI*
> 
> I have this, too. Posted it earlier, no one responded. Seems to be hit and miss. I'm using beta 9.4 and I have it when using my main screen, but not with my TV
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you have a second monitor or a different type of input for your screen, I'd give it a go. I'd reckon it's a compatibility issue between the driver and some monitors.


So when displayed on your TV it's a normal-sized cursor? Weird, cos my only display is a TV and it has a huge cursor









Well, I found a partial solution, and that is to check the 'Enable pointer shadow' option in the mouse settings. My cursor is now normal sized, but when I hover it over text the 'I' beam is still huge. The middle-mouse-click scroll is also huge.

at least now it is semi-annoying instead of completely annoying


----------



## Jack Mac

Quote:


> Originally Posted by *ImJJames*
> 
> Running CF 290x/290 will be equally as easy as running CF 290/290


With no real benefit, the X will just turn into a non-X.


----------



## ImJJames

Quote:


> Originally Posted by *Jack Mac*
> 
> With no real benefit, the X will just turn into a non-X.


Exactly, but I am asking why bother even flashing back... save yourself the 5 minutes it takes to flash back and do something better with that time


----------



## specopsFI

Quote:


> Originally Posted by *cyberwave*
> 
> So when displayed on your TV it's a normal-sized cursor? Weird, cos my only display is a TV and it has a huge cursor
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well, I found a partial solution, and that is to check the 'Enable pointer shadow' option in the mouse settings. My cursor is now normal sized, but when I hover it over text the 'I' beam is still huge. The middle-mouse-click scroll is also huge.
> 
> at least now it is semi-annoying instead of completely annoying


I've done that, too. Helps, but doesn't solve it. Also, I've been having the same issue with my 7790 too. Hoping it will get fixed soon.


----------



## Banedox

So has there been a consensus yet on what the best and most stable bios is?

and on whether to use MSI or ASUS overclocking tools?


----------



## ImJJames

Quote:


> Originally Posted by *Banedox*
> 
> So has there been a consensus yet on what the best and most stable bios is?
> 
> and on whether to use MSI or ASUS overclocking tools?


I'm sure any reference BIOS will work just fine... I have no issues whatsoever using the stock BIOS or the ASUS BIOS.

And for tools:
AB for 24/7, GPU Tweak for benching


----------



## upgraditus

I should have asked this here earlier, as one of you lot is bound to know: is the Hawaii GPU lid recessed (lower than the socket frame) like it was with Tahiti? If so, my plans are foiled!


----------



## tsm106

Quote:


> Originally Posted by *upgraditus*
> 
> I should have asked this here earlier, as one of you lot is bound to know: is the Hawaii GPU lid recessed (lower than the socket frame) like it was with Tahiti? If so, my plans are foiled!


The die is higher, unlike with Tahiti.


----------



## upgraditus

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *upgraditus*
> 
> I should have asked this here earlier, as one of you lot is bound to know: is the Hawaii GPU lid recessed (lower than the socket frame) like it was with Tahiti? If so, my plans are foiled!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The die is higher, unlike with Tahiti.
Click to expand...

Thanks! I have an Accelero Xtreme 7970 skimmed flat (diamond removed) which should work great, just have to watch them VRMs...


----------



## cyberwave

Quote:


> Originally Posted by *specopsFI*
> 
> I've done that, too. Helps, but doesn't solve it. Also, I've been having the same issue with my 7790 too. Hoping it will get fixed soon.


yea hopefully new drivers would solve this problem


----------



## 00Smurf

Quote:


> Originally Posted by *Banedox*
> 
> So has there been a consensus yet on what the best and most stable bios is?
> 
> and on whether to use MSI or ASUS overclocking tools?


Quote:


> Originally Posted by *ImJJames*
> 
> I'm sure any reference BIOS will work just fine... I have no issues whatsoever using the stock BIOS or the ASUS BIOS.
> 
> And for tools:
> AB for 24/7, GPU Tweak for benching


Was just uploading my stock BIOS to TechPowerUp when I ran across a much newer BIOS than any of the other board members' BIOSes.

File Name: MSI.R9290X.4096.131205.rom Build Date: 2013-12-05 01:29:00 Version: 015.042.000.000.003745 (left BIOS, quiet)
File Name: MSI.R9270X.4096.131205.rom Build Date: 2013-12-05 01:27:00 Version: 015.042.000.000.003746 (right BIOS, Ubermode)

Link to download the BIOSes:
http://www.techpowerup.com/vgabios/index.php?did=1002-67B0-1462-8038

Note that the right BIOS says it's from a 270X, but everything else in it matches the left BIOS and every other spec for an R9 290X.

All the other BIOSes I've seen have build dates of Sept or Oct and are 000.006.3518 or 000.006.3515.

FWIW, the stock BIOS for my MSI card is 015.039.000.007.003526 (right BIOS, Ubermode)


----------



## Slomo4shO

Performance-PCS has 11 Koolance VID-AR290X, 8 EK-FC R9-290X, and 4 EK-FC R9-290X left in stock for anyone interested.


----------



## KyGuy

Sapphire 290x's in stock at Newegg. $599


----------



## Banedox

Quote:


> Originally Posted by *Slomo4shO*
> 
> Performance-PCS has 11 Koolance VID-AR290X, 8 EK-FC R9-290X, and 4 EK-FC R9-290X left in stock for anyone interested.


Lol, they prolly have nickel in stock because I have an approved RMA to return my unopened one...


----------



## Amhro

just found this, the time has come... maybe?



http://www.hardware.fr/news/13491/gigabyte-r9-290x-windforce-se-montre.html

-15.9 dB, overclocked to 1040MHz
probably the same cooling as the 780 Ti OC


----------



## Slomo4shO

Quote:


> Originally Posted by *Banedox*
> 
> Lol they prolly have nickel in stock because I have a approved RMA to return my unopened one...


Both acrylic models from EK and the Koolance are available in nickel only. I canceled my backorder with FrozenCPU and just ordered 4 Koolance blocks, since I don't want acrylic blocks.


----------



## RAFFY

Quote:


> Originally Posted by *cyberwave*
> 
> so when displayed on your TV it's a normal sized cursor? weird. cos my only display is a TV and it is a huge cursor
> 
> 
> 
> 
> 
> 
> 
> 
> 
> well i found a partial solution and that is to check the 'Enable pointer shadow' option in the mouse settings. my cursor is now normal sized but when i hover it over text, the 'I' beam is still huge. the middle mouse click scroll is also huge.
> 
> at least now it is semi-annoying instead of completely annoying


You need to go in and change your tv input setting from 16:9 to scan. That will probably fix your issue. If not post back.


----------



## jerrolds

Adding a 2nd 290X to my setup - the current one has an aftermarket Gelid ICY 2, so it takes in air and kinda spreads it around like a Windforce or any normal 2-3 fan setup. The bottom card will be standard blower style - does the top of the card get really hot? Hoping the bottom blower-style card doesn't raise the top card's temps too much.

I'll have the case open, hopefully that'll help.

Guess I'll find out tomorrow.


----------



## jerrolds

Quote:


> Originally Posted by *cyberwave*
> 
> so when displayed on your TV it's a normal sized cursor? weird. cos my only display is a TV and it is a huge cursor
> 
> 
> 
> 
> 
> 
> 
> 
> 
> well i found a partial solution and that is to check the 'Enable pointer shadow' option in the mouse settings. my cursor is now normal sized but when i hover it over text, the 'I' beam is still huge. the middle mouse click scroll is also huge.
> 
> at least now it is semi-annoying instead of completely annoying


Check your scaling options in CCC and make sure overscan is set to 0%. Also make sure your TV isn't doing any overscan.


----------



## mojobear

Hey guys,

Getting my Sapphire R9 290s this week sometime (hopefully)... got them at $420 just hours before prices on Newegg went to $530....









Was wondering what kind of testing I should do, and for how long, before I put the EK 290 blocks on them? Want to ensure the black screen business has been resolved... etc.

Thanks!


----------



## MrWhiteRX7

Battlefield 4 and Firestrike Extreme would be my two main tests for black screen possibilities.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *mojobear*
> 
> Hey guys,
> 
> Getting my Sapphire R9 290s this week sometime (hopefully)... got them at $420 just hours before prices on Newegg went to $530....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Was wondering what kind of testing I should do, and for how long, before I put the EK 290 blocks on them? Want to ensure the black screen business has been resolved... etc.
> 
> Thanks!


I suggest you just game as normal for a week or two.

A couple benches won't hurt, but if you want to use your cards for gaming then the best way to test them is by gaming.


----------



## the9quad

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Do a driver wipe then re-install 13.11 WHQL, it seems to be the most stable for people.
> 
> As for the cursor......no idea, maybe you need to stop feeding it
> 
> 
> 
> 
> 
> 
> 
> (jks)


WHQL are great; they do cause more flashing textures in BF4 than the 9.5 betas do, though. That's the only reason I'm on the betas atm.


----------



## HOMECINEMA-PC

Somethin' I pulled last nite...

Homecinema -Pc [email protected]@2428 CF 290,s [email protected] *P25169*



http://www.3dmark.com/3dm11/7638360


----------



## chiknnwatrmln

That, sir, is an impressive score.

Highest I've gotten with one 290 is just over 15k... Hopefully once I get my loop built and my 3770k delidded I can break 16k.


----------



## tsm106

Run it in extreme.


----------



## jerrolds

How hot does the top of a reference card get? Going to pair up my 2-fan Gelid with a reference blower underneath - wondering if this will kill the temps on the top card.


----------



## Asrock Extreme7

Just got my XFX 290 and it's unlocked to a 290X. Love this card. Just waiting for my EK water block, then I'll kick some ass on BF4 :thumb:


----------



## RAFFY

Quote:


> Originally Posted by *jerrolds*
> 
> How hot does the top of a reference card get? Going to pair up my 2-fan Gelid with a reference blower underneath - wondering if this will kill the temps on the top card.


I'd say they can get to 70C+ no problem


----------



## Asrock Extreme7

Scan have 290s for £293


----------



## ChrisTahoe

Did I miss a memo? I thought the R9 290 was $399 (MSRP, I know Newegg is usually $10-$20 above that), why is it showing $499 on Newegg?


----------



## Forceman

Quote:


> Originally Posted by *ChrisTahoe*
> 
> Did I miss a memo? I thought the R9 290 was $399 (MSRP, I know Newegg is usually $10-$20 above that), why is it showing $499 on Newegg?


Price gouging. Which is complete crap from a big retailer like Newegg or Amazon. All the coin miners are buying up every AMD card in sight.


----------



## jerrolds

Ya, thank the miners for the recent price bump. R9 290s for $529 on Newegg hahaha

Jesus, I knew I should've bought an XFX or Powercolor 290 for $399 2 weeks ago.


----------



## ChrisTahoe

Quote:


> Originally Posted by *Forceman*
> 
> Price gouging. Which is complete crap from a big retailer like Newegg or Amazon. All the coin miners are buying up every AMD card in sight.


*Grumble grumble grumble grumble*


----------



## Sgt Bilko

Quote:


> Originally Posted by *jerrolds*
> 
> Ya, thank the miners for the recent price bump. R9 290s for $529 on Newegg hahaha
> 
> Jesus, I knew I should've bought an XFX or Powercolor 290 for $399 2 weeks ago.


The price hike hasn't hit here yet, good thing too, though: 290s are $499 and 290Xs are $670


----------



## Sazz

This price hike is probably just a strategy so that they can put them on "sale" for last-minute Christmas shoppers, and that "sale" would probably bring them down to what they normally cost.


----------



## bustacap22

Just wondering who here on OCN jumped from a 7970 to a 290X. Asking because a buddy of mine has offered me $400 each for my dual 7970s w/ EK waterblocks. I have been contemplating for the past 2 days whether I should pull the trigger. Wondering about the performance increase compared to the price of the 290X. Yes, if I do pull the trigger I would be getting dual 290Xs. Thoughts???


----------



## Durvelle27

Quote:


> Originally Posted by *bustacap22*
> 
> Just wondering who here on OCN jumped from a 7970 to a 290X. Asking because a buddy of mine has offered me $400 each for my dual 7970s w/ EK waterblocks. I have been contemplating for the past 2 days whether I should pull the trigger. Wondering about the performance increase compared to the price of the 290X. Yes, if I do pull the trigger I would be getting dual 290Xs. Thoughts???


I jumped from an HD 7970 @ 1280/1850 to a Sapphire R9 290 and it's a nice upgrade.


----------



## MrWhiteRX7

I went from a well-overclocking 7970 to a 290 (technically I had 3), but am also getting 3 x 290s.

It was WORTH it for sure. These are fantastically beast cards









A couple people have shown direct comparisons in benchmarks from their heavily overclocked 7970 to 290/x


----------



## Raephen

Prices here in the Netherlands seem stable, too - on average a few € less than the launch day price (on tweakers.nl an average price of €380ish).

Wasn't thinking of getting another for the time being, though... I have to replace my bottleneck monitor first.

When I can't push the max refresh rate anymore, I'll get a second 290.


----------



## stickg1

Sigh, I left this thread for 3 days and I'm 500 posts behind. So I will just ask my question; please excuse my laziness.

Has the MSI AB Beta 17 throttling issue been fixed? I can't get a display using the ASUS BIOS, so GPU Tweak does not give me voltage control on the stock PowerColor 290 BIOS. I can only adjust voltage if I use Beta 17 Afterburner, but since it throttles the core clock it's pretty much useless. Has anyone made any headway on this situation? Thanks in advance.


----------



## smonkie

I got a Powercolor 290X with an Accelero III. Temps are usually fine (I usually play with Vsync), but yesterday I decided to try Crysis 1 maxed (no Vsync in 1.0), and while the GPU reached just 65C after one hour, one of the VRMs hit 83C max, hovering between 78-80. I wonder, is this temp OK for the VRMs? I reckon it's a tad high, but just wondering how hot these folks can get.

I have a mild OC (1100/1350).


----------



## Jpmboy

Quote:


> Originally Posted by *bustacap22*
> 
> Just wondering who here on OCN jumped from a 7970 to a 290X. Asking because a buddy of mine has offered me $400 each for my dual 7970s w/ EK waterblocks. I have been contemplating for the past 2 days whether I should pull the trigger. Wondering about the performance increase compared to the price of the 290X. Yes, if I do pull the trigger I would be getting dual 290Xs. Thoughts???


I just sold my CFX 7970s with AQ water blocks... one 290X is ~75% of their performance (+/- 10%). Two 290Xs will be a major bump in performance!!

reasonable comparison in the same rig:

http://www.3dmark.com/compare/fs/1257003/fs/1187031


----------



## Durvelle27

Do you guys think it's worth waterblocking my Sapphire R9 290?


----------



## Derpinheimer

If you have a watercooling setup, yeah.. but I don't think these cards care about temps as much as Tahiti when it comes to overclocking.


----------



## KyGuy

Quote:


> Originally Posted by *Durvelle27*
> 
> Do you guys think it's worth waterblocking my Sapphire R9 290?


I think it would be worth it if you really don't like the noise as well as the high temps. I am looking into getting mine cooled as well, but I am just afraid I am going to crack the die. The big thing is to read all the stickies on installing the blocks. I have no idea if your CPU is water cooled, so if you're running on air right now, there are plenty of stickies out there that talk about corrosion and certain metals not to mix, i.e. aluminum and ANY OTHER METALS!! I would say if you want quiet performance and possibly a good OC, go for it by all means.


----------



## KyGuy

The 290Xs in stock at Newegg are XFX and Sapphire. They say Out of Stock on the search page, but click through to the actual card page and they will show In Stock. Common issue I found back when I bought my Sapphire 290X with BF4.


----------



## Neutronman

Quote:


> Originally Posted by *KyGuy*
> 
> The 290Xs in stock at Newegg are XFX and Sapphire. They say Out of Stock on the search page, but click through to the actual card page and they will show In Stock. Common issue I found back when I bought my Sapphire 290X with BF4.


Until you put them in your cart, then they show as out of stock. Bug with Newegg website.


----------



## Durvelle27

Quote:


> Originally Posted by *KyGuy*
> 
> I think it would be worth it if you really don't like the noise as well as the high temps. I am looking into getting mine cooled as well, but I am just afraid I am going to crack the die. The big thing is to read all the stickies on installing the blocks. I have no idea if your CPU is water cooled, so if you're running on air right now, there are plenty of stickies out there that talk about corrosion and certain metals not to mix, i.e. aluminum and ANY OTHER METALS!! I would say if you want quiet performance and possibly a good OC, go for it by all means.


Yes, I know water cooling; my 7970 @ 1290/1850 was water cooled and my 8350 is too. Do you guys think that by water cooling I could get better OC results?


----------



## stickg1

Quote:


> Originally Posted by *Durvelle27*
> 
> Yes I know of water cooling as my 7970 @1290/1850 was water cooled and my 8350 is also. Do you guys think by water cooling I could yield better OC results


Yes I believe so. For example, the previous owner of my PowerColor 290 was capable of 1253/1623 with a universal waterblock, VRMs and memory ambient cooled. I am using it with stock cooler and cannot get past 1175MHz because of the heat. It behaves erratically when I get over 75C.


----------



## ImJJames

Quote:


> Originally Posted by *stickg1*
> 
> Yes I believe so. For example, the previous owner of my PowerColor 290 was capable of 1253/1623 with a universal waterblock, VRMs and memory ambient cooled. I am using it with stock cooler and cannot get past 1175MHz because of the heat. It behaves erratically when I get over 75C.


That's insane. My PowerColor R9 290 is clocked at 1200/1500 for 24/7 usage on the stock BIOS lol, and temps never go higher than 70C when playing games like BF4 for hours on end. I'm on the stock cooler, but I do use an aggressive fan profile. I'm a headphone gamer so I don't hear anything.


----------



## KyGuy

Quote:


> Originally Posted by *Durvelle27*
> 
> Yes I know of water cooling as my 7970 @1290/1850 was water cooled and my 8350 is also. Do you guys think by water cooling I could yield better OC results


Water would let you use higher voltage which may let you achieve a better core clock. Memory overclocks are virtually useless on these cards. Only helps with texture loading speed. I tried overclocking my 290x and couldn't even get past 1130 without artifacting with +100mv in Afterburner, but I am still considering water cooling as I want another card for crossfire. I have a terrible card for core overclocking, but it will do 1500 on the memory and 1600 if I raise the AUX voltage even though it isn't directly related to the memory clock.


----------



## Durvelle27

Quote:


> Originally Posted by *stickg1*
> 
> Yes I believe so. For example, the previous owner of my PowerColor 290 was capable of 1253/1623 with a universal waterblock, VRMs and memory ambient cooled. I am using it with stock cooler and cannot get past 1175MHz because of the heat. It behaves erratically when I get over 75C.


Hmmmmm well I guess ill order a block this weekend. Just have to decide which block


----------



## ImJJames

Quote:


> Originally Posted by *KyGuy*
> 
> Water would let you use higher voltage which may let you achieve a better core clock. Memory overclocks are virtually useless on these cards. Only helps with texture loading speed. I tried overclocking my 290x and couldn't even get past 1130 without artifacting with +100mv in Afterburner, but I am still considering water cooling as I want another card for crossfire. I have a terrible card for core overclocking, but it will do 1500 on the memory and 1600 if I raise the AUX voltage even though it isn't directly related to the memory clock.


How high have you raised the memory voltage? I have yet to touch it... what's a safe limit?


----------



## stickg1

Quote:


> Originally Posted by *ImJJames*
> 
> Thats insane my power color r9 290 is clocked at 1175/1500 for 24/7 usage on stock bios lol, temps never go up higher than 70C when playing games like BF 4 for hours on end. Stock cooler, but I do use aggressive fan profile. I am headphone gamer so I don't hear anything.


I haven't played any games on it yet; for me it's Folding@Home and benchmarking. I have seen the screenshots from the previous owner's successful 3DMark runs, very impressive, but I can't get anywhere close without artifacts or crashing. As soon as I jump the voltage over 1.25v it gets hotter than lava.


----------



## KyGuy

Quote:


> Originally Posted by *Neutronman*
> 
> Until you put them in your cart, then they show as out of stock. Bug with Newegg website.


I just added one to cart to test it out and it worked fine. Didn't show as Out of Stock. Maybe someone cancelled their order. No idea to be honest.....I'm a noob around here.


----------



## KyGuy

Quote:


> Originally Posted by *ImJJames*
> 
> How high have you raised the memory voltage? I have yet to touch yet...whats safe limits?


AUX voltage isn't directly related to memory; I only read that it HELPS with some memory overclocks. I raised the AUX voltage +100mv in Afterburner. To access it, you need AB Beta 17 and have to unlock voltage control. You have to click the down arrow to get to the AUX voltage, as it doesn't show on the main page.


----------



## TheRoot




----------



## Derpinheimer

Quote:


> Originally Posted by *KyGuy*
> 
> Water would let you use higher voltage which may let you achieve a better core clock. Memory overclocks are virtually useless on these cards. Only helps with texture loading speed. I tried overclocking my 290x and couldn't even get past 1130 without artifacting with +100mv in Afterburner, but I am still considering water cooling as I want another card for crossfire. I have a terrible card for core overclocking, but it will do 1500 on the memory and 1600 if I raise the AUX voltage even though it isn't directly related to the memory clock.


Hm, AUX voltage is VRM 2, aka the set of 3.

Clearly the voltage for them is not 1.0v default [Or is it?]

Very weird..


----------



## Sgt Bilko

Quote:


> Originally Posted by *Derpinheimer*
> 
> Hm, AUX voltage is VRM 2, aka the set of 3.
> 
> Clearly the voltage for them is not 1.0v default [Or is it?]
> 
> Very weird..


I always thought VRM 2 was the long strip and VRM 1 was the set of 3?


----------



## Derpinheimer

Nope, VRM2 is definitely the small set. I've "heard" it was the memory VRMs..

And it kinda makes sense. I know on my last card, a 7950, increasing mem voltage showed up in VRM 2.

So maybe they aren't memory VRMs anymore, or the memory on this chip runs at 1.0v [???], or it's inaccurately reported as 1.0v..


----------



## ZealotKi11er

Loving 290X @ 1100MHz under water. Only hits 45C in BF4. Runs BF4 @ 1440p Ultra 0x AA like a dream.


----------



## Banedox

So how is this for my Firestrike score? I'm pretty sure my score is low; I've seen other 290/290Xs pull over 13000 at stock settings and I'm overclocked, using the PT1 BIOS... also just did a complete driver wipe.


----------



## Durvelle27

I have a question for people on water. Did going water offer better OC results compared to stock/air, for those who have tried it?


----------



## VSG

Quote:


> Originally Posted by *Banedox*
> 
> So how is this for my Firestrike score, Im pretty sure my score is low, ive seen other 290/290x pull over 13000 at stock setting and im overclocked. using the PT1 bios...also just did a complete driver wipe


Is that with a single card or dual cards?


----------



## Banedox

Quote:


> Originally Posted by *geggeg*
> 
> Is that with a single card or dual cards?


Mine is with a single card... it's only performing as well as a 7970....


----------



## Derpinheimer

Quote:


> Originally Posted by *Durvelle27*
> 
> I have a question for ppl on Water. Did going water offer better OC results compared to stock/air if you have tried it.


Yes, but not massive.

My highest air was 1150, highest water is 1200.

Water could go higher, but voltage is already 1.462v at that clock.


----------



## VSG

Quote:


> Originally Posted by *Banedox*
> 
> Mine is with a single card... its only performing as well as a 7970....


That's pretty good for a single card. On a single 290x, I got just below 10k at stock volts, stock cooler, 1100/1300 clocks. Those 15k scores are with dual cards.


----------



## Banedox

Quote:


> Originally Posted by *geggeg*
> 
> That's pretty good for a single card. On a single 290x, I got just below 10k at stock volts, stock cooler, 1100/1300 clocks. Those 15k scores are with dual cards.


Hmm, the 15k on GPU score? Cause I saw a guy pull 13k on a stock 290... hmmm


----------



## Jpmboy

There's a bunch of scores in this thread. (The table has not been updated in a month...)


----------



## jerrolds

Quote:


> Originally Posted by *bustacap22*
> 
> Asking those who have jumped from 7970 to r290x. Just wondering who here on OCN jumped from 7970 to a 290x. Asking because buddy of mine has offered me $400 each for my dual 7970 w/ EK waterblock. I have been contemplating for the past 2 days if I should pull the trigger. Wondering about performance increase compared to price of the 290x. Yes, if I do pull the trigger would be getting dual 290x. Thoughts???


I went from a Matrix [email protected] core 7000mhz memory to this Sapphire [email protected] core 6000mhz memory - and it's about 40% faster. If you can get a good price on your 7970s it's a decent upgrade gaming-wise.


----------



## Banedox

Quote:


> Originally Posted by *Jpmboy*
> 
> there's a bunch of scores in this thread. (table has not been updated in a month...


Thanks for that!


----------



## Durvelle27

Quote:


> Originally Posted by *Derpinheimer*
> 
> Yes, but not massive.
> 
> My highest air was 1150, highest water is 1200.
> 
> Water could go higher, but voltage is already 1462 at that clock.


Anybody else?


----------



## Banedox

Quote:


> Originally Posted by *Durvelle27*
> 
> Anybody else


I can do 1150 on air at 1.410v in GPUTweak


----------



## Jpmboy

Quote:


> Originally Posted by *Banedox*
> 
> Thanks for that!


----------



## Jpmboy

just ran this... card still warm from BF4


----------



## Forceman

Quote:


> Originally Posted by *smonkie*
> 
> I got a Powercolor 290x with an Accelero III. Temps are usually fine (I use to play with Vsync), but yesterday I decided to try Crysis 1 maxed (no vsync in 1.0), and while GPU reached just 65 after one hour, one of the VRM hit 83º max, hovering between 78-80. I wonder, is this temp ok for the VRM? I reckon is a tad high, but just wondering how hot can these folks be.
> 
> I have a mild Oc (1100/1350).


Those temps are okay for the VRMs, but a little higher than most people would like to see on aftermarket cooling.
Quote:


> Originally Posted by *Durvelle27*
> 
> I have a question for ppl on Water. Did going water offer better OC results compared to stock/air if you have tried it.


Not at the same voltage. It's easier to raise the volts with water though, so it is a little better. But the noise alone is worth waterblocking it for, at least for me. Plus it runs at 50C instead of 90C, which is nice in general.


----------



## Durvelle27

Quote:


> Originally Posted by *Banedox*
> 
> I can do 1150 on Air at 1410 on GPUTweak


Are you on water also?


----------



## CriticalHit

Quote:


> Originally Posted by *Durvelle27*
> 
> Anybody else


Water had 0 effect on my scores....
Max overclock on stock air was 1180 (with hard crashes) and 1120 stable.
Max overclock on water is currently 1180 (with hard crashes) and 1120 stable.

I think my card is having trouble with power delivery - it doesn't deal well with increased voltage / power limit, so the overclockability gain under water is negated. (ASICs of the cards are 70.3 and 70.5, from memory.)


----------



## Durvelle27

Hmmmm, this is really getting at me. My card's ASIC is 71.8%, but I'm debating whether I should let this card go or not.


----------



## Widde

Was just checking around a little, comparing my card to a 780 Ti: http://piclair.com/cfhrm

Thing is, my card doesn't seem to like any more MHz on the core, and I'm already at +100mV in AB.

Guess it's just a poor overclocker


----------



## Arizonian

Quote:


> Originally Posted by *Jpmboy*
> 
> there's a bunch of scores in this thread. (table has not been updated in a month...


I've reached out and will look into what's going on.


----------



## ssiperko

Just finished with the Arctic Twin Turbo install on my HIS 290.

Ran a Fire Strike loop for about 20 mins at 1200/1407, GPU voltage at 1.412v, power target 150, with 100% fan speed.
Max temp 89c.
No tearing.

My score was 10459 - Graphics 12385, Physics 10728.

The fans on this thing are quiet compared to 47% on the standard sink, and that thing would hit 95c right off the bat.

When I ran it at 1100/1300 the temp was 76c.

Had to make a few changes to the mount but I'm happy now.

SS


----------



## grunion

I've given up, just can't get these PT BIOSes to run correctly.

And one more time: are they correct at 64KB in size?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *ssiperko*
> 
> Just finished with Arctic Twin Turbo install on my HIS 290.
> 
> Ran a Fire Strike loop for about 20 mins at 1200/1407 GPU Voltage at 1412 power target 150 with a 100 fan speed.
> Max temp 89c.
> No tearing.
> 
> My score was 10459, Graphics - 12385 and Physics - 10728.
> 
> The fans on this thing are quiet compared to 47% on the standard sink and that thing would hit 95c right off the bat.
> 
> When I ran it at 1100/1300 the temp was 76c.
> 
> Had to make a few changes to the mount but I'm happy now.
> 
> SS


What TIM did you use? 89c seems kinda hot; my Gelid tops out at 65c at max OC, and I've heard the Accelero Xtreme IIIs top out around 60-70c depending on voltage.

What are your VRMs at? I know that on my Gelid (even with bigger heatsinks I bought separately) at 1.412v in GPUTweak, VRM 1 gets up into the 90s... one of the reasons I'm getting this thing under water haha.


----------



## CriticalHit

Quote:


> Originally Posted by *ssiperko*
> 
> Just finished with Arctic Twin Turbo install on my HIS 290.
> 
> Ran a Fire Strike loop for about 20 mins at 1200/1407 GPU Voltage at 1412 power target 150 with a 100 fan speed.
> Max temp 89c.
> No tearing.
> SS


Have the same card - which BIOS are you using?


----------



## Sgt Bilko

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> What TIM did you use? 89c seems kinda hot, my Gelid tops at 65c at max OC, and I've heard that the Accellero Extreme III's top out around 60-70c, depending on voltage.
> 
> What are your VRM's at? I know that on my Gelid (even with bigger heatsinks I bought separately) at 1.412v in GPUTweak VRM 1 gets up into the 90's... One of the reasons I'm getting this thing under water haha.


I used AS5 on my 290X with the AX III and I topped out in benching at 75c (ambient of 28c), so that wasn't too bad.


----------



## Sazz

Quote:


> Originally Posted by *Banedox*
> 
> hmm the 15k on GPU? Cause I saw a guy pull 13k on a stock 290...... hmmm


What are we talking about here? firestrike or 3Dmark11? And GPU Score or overall score?

my runs on 290X max OC under watercooling paired with a FX8350.

Firestrike
3DMark11


----------



## Durvelle27

Quote:


> Originally Posted by *Sazz*
> 
> What are we talking about here? firestrike or 3Dmark11? And GPU Score or overall score?
> 
> my runs on 290X max OC under watercooling paired with a FX8350.
> 
> Firestrike
> 3DMark11


Highest OC I've seen on a 290X so far. What BIOS and volts are you pushing?


----------



## Sazz

Quote:


> Originally Posted by *Durvelle27*
> 
> Highest OC I seen on a 290X so far. What BIOs and volts you pushing


I used PT1 BIOS, PT3 wasn't stable for me. at 1.425v


----------



## Durvelle27

Quote:


> Originally Posted by *Sazz*
> 
> I used PT1 BIOS, PT3 wasn't stable for me. at 1.425v


What's your ASIC? And wow, you've got a clocker.


----------



## ZombieJon

Bad lighting. =S Used a Sigma_Cool to mount a Zalman CNPS20LQ to the 290 (top GPU). The box didn't have the right screws, and I ran out of rigger's tape.

Ran Heaven a few times last night, and temps didn't get anywhere near 65c. Didn't bother looking at VRM temps, and I have to get some thermal adhesive to stick Swiftech MC21s onto it. The thermal tape is nowhere near sticky enough, and the heatsinks fell off when I put the case back upright.

Will take another photo when I glue the heatsinks on.


----------



## Sazz

Quote:


> Originally Posted by *Durvelle27*
> 
> ASIC and wow you've got a clocker


It was 69.9%.

There are a lot out there that can go 1300/1700+


----------



## Forceman

Quote:


> Originally Posted by *Sazz*
> 
> What are we talking about here? firestrike or 3Dmark11? And GPU Score or overall score?
> 
> my runs on 290X max OC under watercooling paired with a FX8350.
> 
> Firestrike
> 3DMark11


Hmm. I got basically the same graphics score (13355) at 1200/1450. Which driver did you use?

http://www.3dmark.com/fs/1155472


----------



## Sazz

Quote:


> Originally Posted by *Forceman*
> 
> Hmm. I got basically the same graphics score (13355) at 1200/1450. Which driver did you use?
> 
> http://www.3dmark.com/fs/1155472


beta 9.2

And yeah, if I had an Intel CPU the GPU score would be better. I've seen it from time to time: people running Firestrike at the same clocks as mine get a better GPU score when using an Intel CPU - probably better scaling on the CPU side.


----------



## tsm106

Quote:


> Originally Posted by *Durvelle27*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sazz*
> 
> What are we talking about here? firestrike or 3Dmark11? And GPU Score or overall score?
> 
> my runs on 290X max OC under watercooling paired with a FX8350.
> 
> Firestrike
> 3DMark11
> 
> 
> 
> *Highest OC I seen on a 290X so far. What BIOs and volts you pushing*


----------



## ZealotKi11er

Why are people using so many volts? My card did 1100MHz on air @ stock volts, ~1.150-1.180v according to MSI AB. If you are getting ~70 fps in a game, each 100MHz will give you 3-4 extra fps. Is it really worth it to bump volts to MAX for 3-4 fps?
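The back-of-envelope math above can be sketched out. This is purely illustrative, not a measurement: the `scaling` factor of 0.5 is an assumed fraction (fps rarely scales 1:1 with core clock), picked only because it reproduces the "3-4 fps per 100MHz" ballpark quoted here.

```python
# Rough estimate of fps gained from a core overclock, assuming fps scales
# with core clock by some fraction < 1 (the 0.5 default is an assumption).
def expected_fps(base_fps, base_mhz, new_mhz, scaling=0.5):
    return base_fps * (1 + scaling * (new_mhz - base_mhz) / base_mhz)

gain = expected_fps(70, 1100, 1200) - 70  # +100MHz on a ~70 fps game
print(f"~{gain:.1f} extra fps")  # ~3.2 extra fps
```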


----------



## Durvelle27

Quote:


> Originally Posted by *Sazz*
> 
> it was 69.9%
> 
> There's a lot out there that can go 1300/1700+


Mine is 71.8% but it OCs terribly


----------



## Sazz

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Why are people using so much volts? My card did 1100MHz in air @ stock volts. ~ 1150-1180v according to MSI AB. If you are getting ~ 70 fps in a game each 100MHz will give you 3-4 extra fps. Is it really worth it to bump volts to MAX for 3-4 fps?


We are not doing it for the "FPS" in games; we are just trying to find out how high we can go with it. I run mine at stock voltage at 1100/1450 as my daily clocks.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Durvelle27*
> 
> Mines is 71.8% but it OCs terrible


Mine is 79% I think (legit 290X) and it does 1175 on the core (seems stable so far, but who knows with OCs) and doesn't OC on memory, like, at all.

I really think ASIC means jack
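For what it's worth, the handful of ASIC/OC pairs floating around this thread can be run through a quick correlation check. The sample below is tiny and partly invented (pairing each reported ASIC% with the max core clock its owner mentioned), so treat it as a toy illustration of how one would test the "ASIC means jack" claim, not as evidence.

```python
# Toy Pearson correlation between ASIC quality (%) and max core clock (MHz).
# Data points loosely echo numbers posted in this thread; they are NOT a
# controlled sample, just an illustration.
def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

cards = [(69.9, 1250), (70.3, 1120), (70.5, 1120), (71.8, 1100), (79.0, 1175)]
r = pearson_r([a for a, _ in cards], [c for _, c in cards])
print(f"r = {r:+.2f}")  # near zero => ASIC% barely predicts max OC here
```

On this (admittedly tiny) sample, r comes out close to zero, which at least doesn't contradict the skepticism above.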


----------



## Sgt Bilko

Quote:


> Originally Posted by *tsm106*


It's ok, we haven't forgotten


----------



## ZealotKi11er

I played 1 hour BF4 @ 1150MHz/1250MHz stock voltage. MAX temp 45C. These cards love low temp just like HD 7970.


----------



## HardwareDecoder

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I played 1 hour BF4 @ 1150MHz/1250MHz stock voltage. MAX temp 45C. These cards love low temp just like HD 7970.


45c with a 290/X? Must have a waterblock on it; I feel like the stock fan couldn't even do that at 100%, and you'd also be deaf.


----------



## Derpinheimer

Can anyone help me understand if this is normal?



The framerate does drop every time the usage goes down..


----------



## prostreetcamaro

Well guys, according to Arizonian I am not allowed to post this new XFX 290 I received today for sale because I don't have enough rep yet, so I guess I will have to post it elsewhere....


----------



## HardwareDecoder

Quote:


> Originally Posted by *prostreetcamaro*
> 
> Well guys according to Arizonian I am not allowed to post this new XFX 290 I received today for sale because I don't have enough rep's yet so I guess I will have to post it else where....


That's dumb; you have been on here longer than me.

Although 64 posts in 3 years isn't very active.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Derpinheimer*
> 
> Can anyone help me understand if this is normal?
> 
> 
> 
> The framerate does drop every time the usage goes down..


that doesn't seem normal at all.


----------



## Mr357

Quote:


> Originally Posted by *Derpinheimer*
> 
> Can anyone help me understand if this is normal?
> 
> 
> 
> The framerate does drop every time the usage goes down..


That is certainly not normal. It could be caused by a number of things unfortunately.


----------



## Derpinheimer

Ugh. I guess I'd hoped that, with people saying GPU usage readings were unreliable, it was normal - but at the same time the frame drops are highly annoying.

I'll try driver re-install first...


----------



## MrWhiteRX7

I've realized that watercooling on these cards is really about noise... I can run this card at +200mv on the stock cooler and it stays around 71c and takes it like a champ. The VRMs barely get hotter than the core lol. It's just loud. I plan to put a water block on all three lol.


----------



## CriticalHit

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I've realized that watercooling on these cards is really about noise... I can run this card at +200mv stock cooler and it stays around 71c and takes it like a champ. VRM's barely get hotter than the core lol. It's just loud. I plan to put a water block on all three lol.


Yep, for all the fuss about the cards' temperatures under the stock cooler, the cards aren't heat limited at all, and appear to comfortably reach their max overclocks with stock cooling in most scenarios.

Voltage to the cards and the silicon lottery / chip quality are the main factors I see in overclocking; alternative cooling is not really finding more gains.

Edit: though I really hope I missed something and this post can be debunked... but I've noticed a few peeps not getting much out of new cooling except for lower temps.


----------



## SamEkinci

I just burned up my second R9 290.. All temperatures were way below the stock values and the card just stopped working. Later I saw that the GPU die was cracked when I was reinstalling the stock heatsink before returning it to Amazon. First the VRM gave out, now the GPU; in both cases the card was not overclocked and was at idle.

Before the card's demise I also figured out that a 750W Gold rated PSU was unable to put out enough power for both CPU and GPU at full load. Whenever I ran the CPU at full load, the power to the GPU started throttling.

On top of all that, the drivers.

With all these negatives on my mind, I decided to return it and exchange for a new one through Amazon Prime and XFX, only to be told that it was no longer in stock and that my money would be refunded. Later I also learned that whenever the card comes back in stock the price is raised to $459, and maybe even increased to $499 - that's over $500 with tax. Needless to say, I am not sure I can afford, or am willing to pay, more for a card that broke down on me at idle twice. It sucks because I really wanted it to work.

Now with the price raises I am not sure I can afford it anyway.


----------



## ImJJames

Quote:


> Originally Posted by *SamEkinci*
> 
> I just burned down my second R9 290.. All temperatures were way below the stock values and the card just stopped working. Later I saw that the GPU was cracked when I was reinstalling the stock heatsink before returning to Amazon. First the VRM gave out now the GPU, both cases the card was not overclocked and at idle.
> 
> Before the cards demise I also figured out that a 750W Gold rated PSU was unable to put out enough power for both CPU and GPU at full load. Whenever I began running the CPU at full load the power to GPU started throttling.
> 
> On top of all that the drivers.
> 
> All these negative points on my head, I decided to return and exchange for a new one through Amazon prime and XFX. Only to be told that it was no longer in stock and that my money would be refunded to me. Later I also learned whenever the card comes back to be on stock the price is raised to 459$ and then maybe even increased to 499$ thats over 500$ with tax. Needless to say I am not sure if I can afford or be willing to pay more for a card that broke down on me at idle twice. It sucks because I really wanted it to work.
> 
> 
> 
> 
> 
> 
> 
> Now with price raises I am not sure if I can afford it anyways.


Sounds like a nightmare; seems like a crappy aftermarket PSU destroyed your graphics card.


----------



## Arizonian

Quote:


> Originally Posted by *SamEkinci*
> 
> I just burned down my second R9 290.. All temperatures were way below the stock values and the card just stopped working. Later I saw that the GPU was cracked when I was reinstalling the stock heatsink before returning to Amazon. First the VRM gave out now the GPU, both cases the card was not overclocked and at idle.
> 
> Before the cards demise I also figured out that a 750W Gold rated PSU was unable to put out enough power for both CPU and GPU at full load. Whenever I began running the CPU at full load the power to GPU started throttling.
> 
> On top of all that the drivers.
> 
> All these negative points on my head, I decided to return and exchange for a new one through Amazon prime and XFX. Only to be told that it was no longer in stock and that my money would be refunded to me. Later I also learned whenever the card comes back to be on stock the price is raised to 459$ and then maybe even increased to 499$ thats over 500$ with tax. Needless to say I am not sure if I can afford or be willing to pay more for a card that broke down on me at idle twice. It sucks because I really wanted it to work.
> 
> 
> 
> 
> 
> 
> 
> Now with price raises I am not sure if I can afford it anyways.


Yeah this is a sad story. Sorry to hear about your bad luck.









I don't know if you have a spare GPU to work with, but I have a feeling prices will go down eventually, especially when non-reference cards come out. Not sure if you can hold out, but it's such a strong card for the money when priced correctly - sad to see you have to settle. Supply/demand really takes hold of vendors: regardless of what AMD is selling them for, vendors react to the pricing other vendors are getting and adjust accordingly.


----------



## ImJJames

Quote:


> Originally Posted by *Forceman*
> 
> Hmm. I got basically the same graphics score (13355) at 1200/1450. Which driver did you use?
> 
> http://www.3dmark.com/fs/1155472


Firestrike has to be reading your clock wrong... because it happens a lot (my Firestrike reads my clock wrong sometimes). I don't see how you're getting a 13.3k graphics score at a 1200 clock.


----------



## RAFFY

Quote:


> Originally Posted by *SamEkinci*
> 
> I just burned down my second R9 290.. All temperatures were way below the stock values and the card just stopped working. Later I saw that the GPU was cracked when I was reinstalling the stock heatsink before returning to Amazon. First the VRM gave out now the GPU, both cases the card was not overclocked and at idle.
> 
> Before the cards demise I also figured out that a 750W Gold rated PSU was unable to put out enough power for both CPU and GPU at full load. Whenever I began running the CPU at full load the power to GPU started throttling.
> 
> On top of all that the drivers.
> 
> All these negative points on my head, I decided to return and exchange for a new one through Amazon prime and XFX. Only to be told that it was no longer in stock and that my money would be refunded to me. Later I also learned whenever the card comes back to be on stock the price is raised to 459$ and then maybe even increased to 499$ thats over 500$ with tax. Needless to say I am not sure if I can afford or be willing to pay more for a card that broke down on me at idle twice. It sucks because I really wanted it to work.
> 
> 
> 
> 
> 
> 
> 
> Now with price raises I am not sure if I can afford it anyways.


Sorry to hear about all your troubles, but it's not the graphics card's fault. It sounds like something else is destroying your cards. As another member mentioned, I would look at your power supply. I would find it very hard to believe you'd have luck so bad that the first card's VRM burns out and the second card's GPU cracks on their own. Check that PSU when you get a chance.

Edit: I saw that you're using an OCZ and a Kingwin in a dual PSU setup. How are you running those two together? I've heard horror stories of people using the Lian Li type of connection only to have power surges fry components.


----------



## prostreetcamaro

Quote:


> Originally Posted by *SamEkinci*
> 
> I just burned down my second R9 290.. All temperatures were way below the stock values and the card just stopped working. Later I saw that the GPU was cracked when I was reinstalling the stock heatsink before returning to Amazon. First the VRM gave out now the GPU, both cases the card was not overclocked and at idle.
> 
> Before the cards demise I also figured out that a 750W Gold rated PSU was unable to put out enough power for both CPU and GPU at full load. Whenever I began running the CPU at full load the power to GPU started throttling.
> 
> On top of all that the drivers.
> 
> All these negative points on my head, I decided to return and exchange for a new one through Amazon prime and XFX. Only to be told that it was no longer in stock and that my money would be refunded to me. Later I also learned whenever the card comes back to be on stock the price is raised to 459$ and then maybe even increased to 499$ thats over 500$ with tax. Needless to say I am not sure if I can afford or be willing to pay more for a card that broke down on me at idle twice. It sucks because I really wanted it to work.
> 
> 
> 
> 
> 
> 
> 
> Now with price raises I am not sure if I can afford it anyways.


750W not enough for CPU and GPU? I am running my 2600K at 5GHz and the 290 just fine on a PC&P 750, and before that I had a Corsair 650 running the same setup without issue. How did you crack the core? Did you have too much pressure on that aftermarket cooling setup, maybe? Just trying to think what could be causing your issues.


----------



## Hogesyx

Not sure if it's too late, but it seems like the VRMs on the 290X don't really need a heatsink to keep them cool?

http://www.legitreviews.com/nzxt-kraken-g10-gpu-water-cooler-review-on-an-amd-radeon-r9-290x_130344/6

Even without the heat plate, a fan blowing directly on them gets similar results for VRM1, and VRM2 (supposedly the 3 pieces at the front) sees a much bigger drop in temperature with the heat spreader removed. This kind of contradicts what some users of the Arctic X3 and Gelid Icy V2 report, where the VRMs overheat even with heatsinks on.


----------



## Forceman

Quote:


> Originally Posted by *ImJJames*
> 
> Firestrike has to be reading your clock wrong...because it happens alot(My Firestrike reads my clock wrong sometimes). I don't see how you're getting 13.3k graphic score on 1200 clock.


It wasn't. That's what I had set in GPU Tweak. If you check the Firestrike thread there are a couple others around 13000.


----------



## quakermaas

Quote:


> Originally Posted by *stickg1*
> 
> Sigh I left this thread for 3 days and I'm 500 posts behind. So I will just ask my question; please excuse my laziness.
> 
> Has MSI-AB BETA 17 throttling issue been fixed? I can't get a display using the ASUS BIOS so GPU Tweak does not give me voltage control on the stock 290 PowerColor BIOS. I can only adjust voltage if I use BETA 17 Afterburner, but since it throttles the core clock it's pretty much useless. Has anyone made any headway on this situation? Thanks in advance.


I have no problems now with MSI AB, after I enabled extended official overclocking limits, disabled ULPS, and left unofficial overclocking disabled. Then I verified that the power limits were being applied in Overdrive (but didn't enable Overdrive).

My power limits were not being applied properly when I used unofficial overclocking mode.

Using unofficial overclocking (power limits not applied properly, card downclocking)



Using extended clocks with unofficial overclocking disabled (power limits now applied correctly)
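For what it's worth, the same switches can be set directly in MSIAfterburner.cfg in the Afterburner install folder. The section and key names below are from my own 3.x install and may differ between versions, so treat this as a sketch and check your own file before relying on it:

```ini
; MSIAfterburner.cfg - relevant AMD switches (names as seen in AB 3.x; verify against your version)
[ATIADLHAL]
; 0 = ULPS off, so a second card never drops into the ultra-low-power state
EnableULPS = 0
; 0 = unofficial overclocking disabled (official ADL/OverDrive path only)
UnofficialOverclockingMode = 0
```

"Extend official overclocking limits" itself is a checkbox in the AB settings dialog, and I believe it needs a restart to kick in.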



Quote:


> Originally Posted by *SamEkinci*
> 
> I just burned down my second R9 290.. All temperatures were way below the stock values and the card just stopped working. Later I saw that the GPU was cracked when I was reinstalling the stock heatsink before returning to Amazon. First the VRM gave out now the GPU, both cases the card was not overclocked and at idle.
> 
> Before the cards demise I also figured out that a 750W Gold rated PSU was unable to put out enough power for both CPU and GPU at full load. Whenever I began running the CPU at full load the power to GPU started throttling.
> 
> On top of all that the drivers.
> 
> All these negative points on my head, I decided to return and exchange for a new one through Amazon prime and XFX. Only to be told that it was no longer in stock and that my money would be refunded to me. Later I also learned whenever the card comes back to be on stock the price is raised to 459$ and then maybe even increased to 499$ thats over 500$ with tax. Needless to say I am not sure if I can afford or be willing to pay more for a card that broke down on me at idle twice. It sucks because I really wanted it to work.
> 
> 
> 
> 
> 
> 
> 
> Now with price raises I am not sure if I can afford it anyways.


That's the second time I have read about cracked die on here in the last few days, can you post a picture ?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Hogesyx*
> 
> Not sure if too late, but seems like VRMs on 290X does not really need a heatsink to keep them cool?
> 
> http://www.legitreviews.com/nzxt-kraken-g10-gpu-water-cooler-review-on-an-amd-radeon-r9-290x_130344/6
> 
> Even without the heat plate, a direct fan blowing is getting similar results for VRM1, VRM2(supposed to be the 3 pieces at the front) get a much higher drop in temperature when heat spreader was removed. This kind of contradicts with some of the users of Arctic x3 and Gelid icy v2 where even with heatsinks on, the VRM gets overheated.


I find that hard to believe... I have two large heatsinks, and when my fans are cranked and I game at 1100/5500 @ 1.262 before droop, my VRM 1 gets up to 80°C.

I don't see how bare VRMs can dissipate more heat than an aluminum heatsink.

Also, I have a feeling that VRM 1 is for the core and VRM 2 is for the memory: if I run the memory OC'd and the core at stock, then VRM 2 gets hotter, and vice versa.


----------



## ImJJames

Quote:


> Originally Posted by *Forceman*
> 
> It wasn't. That's what I had set in GPU Tweak. If you check the Firestrike thread there are a couple others around 13000.


I've checked the Firestrike thread and you won't see any 1200 clocks getting 13.3k... something is off.


----------



## Forceman

Quote:


> Originally Posted by *ImJJames*
> 
> I've checked firestrike thread and you won't see any 1200 clocks getting 13.3k...something is off.


The9quad's is 12800 at 1180 and Arizonian's was 12500 at 1150. So in the ballpark. I've heard Beta 9.5 scores lower; I think mine was run on either 9.2 or 9.4, a couple of weeks ago. I'd run it again on 9.5, but I don't have the Asus BIOS flashed anymore, and I'm not sure the +100mV in AB is enough for 1200.

I ran a couple others also, 13100 at 1180 with the 290X BIOS and 12600 at 1180 with the 290 BIOS.

http://www.3dmark.com/fs/1150821
http://www.3dmark.com/fs/1150183


----------



## Heinz68

Quote:


> Originally Posted by *prostreetcamaro*
> 
> Well guys according to Arizonian I am not allowed to post this new XFX 290 I received today for sale because I don't have enough rep's yet so I guess I will have to post it else where....


Quote:


> Originally Posted by *HardwareDecoder*
> 
> that is dumb you have been on here longer than me.
> 
> although 64 posts in 3 years isn't very active


That's not bad, I have only 37 posts in 4 years, and it sure is dumb he can't get on the owners list. Looks like I will never be allowed on the owners list. And how does one get these reps and unique posts? Can I buy some?









Also, I was trying to start a real news thread and was not allowed.

@prostreetcamaro
I gave you 1 rep, ask Arizonian if it made any difference.


----------



## Hogesyx

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I find that hard to believe.... I have two large heatsinks and when my fans are cranked and I game at 1100/5500 @ 1.262 before droop, my VRM 1 gets up to 80c.
> 
> I don't see how the heck bare VRMs can dissipate more heat than an aluminum heatsink.
> 
> Also, I have a feeling that VRM 1 is for the core and VRM 2 is for memory. If I run memory OC'ed and the core at stock, then VRM 2 gets hotter and vice versa.


I find it hard to believe as well. I am still waiting for my RAM sinks to arrive; maybe I should get a Kraken too and do a test.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Heinz68*
> 
> That's not bad, I have only 37 posts in 4 years and it's sure dumb he can't get on the owners list. Looks like I will never be allowed on the owners list and how does one gets this rep's and unique's posts? Can I buy some?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also I was trying to start some real news thread and was not allowed.
> 
> @prostreetcamaro
> I gave you 1 rep, ask Arizonian if it made any difference.


This should tell you everything you need to know: http://www.overclock.net/t/239915/reminder-the-rep-system-its-proper-usage/0_40

And the rep system does have its upsides and downsides.
Quote:


> Originally Posted by *Hogesyx*
> 
> I find it hard to believe as well, I am still waiting for my ram sink to arrive, maybe I should get a kranken too and do a test.


A friend asked me about the NZXT bracket, and I've noticed that they never showed it off on a 290/X. I really doubt it can cool the VRMs enough.

Besides that, there is nothing to cool the small group of three, so you would still need to modify it to make it work anyway.


----------



## 00Smurf

This is how I solved my throttling issues, and the excessive temperature and noise.

Step 1:




Step 2:





Step 3:




Step 4:


I only have one fan on the H50 atm, but with a push-pull setup I can see this maintaining a steady 55-60°C.


----------



## Heinz68

Quote:


> Originally Posted by *Hogesyx*
> 
> Not sure if too late, but seems like VRMs on 290X does not really need a heatsink to keep them cool?
> 
> http://www.legitreviews.com/nzxt-kraken-g10-gpu-water-cooler-review-on-an-amd-radeon-r9-290x_130344/6
> 
> Even without the heat plate, a direct fan blowing is getting similar results for VRM1, VRM2(supposed to be the 3 pieces at the front) get a much higher drop in temperature when heat spreader was removed. This kind of contradicts with some of the users of Arctic x3 and Gelid icy v2 where even with heatsinks on, the VRM gets overheated.


Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I find that hard to believe.... I have two large heatsinks and when my fans are cranked and I game at 1100/5500 @ 1.262 before droop, my VRM 1 gets up to 80c.
> 
> I don't see how the heck bare VRMs can dissipate more heat than an aluminum heatsink.
> 
> Also, I have a feeling that VRM 1 is for the core and VRM 2 is for memory. If I run memory OC'ed and the core at stock, then VRM 2 gets hotter and vice versa.


Sure it's strange. I remember when the cooler was first published, people were worried that VRM 2 wasn't going to get any air from the fan, and now it doesn't even need a heatsink?


----------



## smokedawg

Quote:


> Originally Posted by *ImJJames*
> 
> I've checked firestrike thread and you won't see any 1200 clocks getting 13.3k...something is off.


I got 13423 gpu score using 1210 / 1625 with a 290x:
http://www.3dmark.com/fs/1206605


----------



## Forceman

Quote:


> Originally Posted by *smokedawg*
> 
> I got 13423 gpu score using 1210 / 1625 with a 290x:
> http://www.3dmark.com/fs/1206605


Don't post that in the Firestrike thread, I don't want to have to go reflash the Asus BIOS.


----------



## CriticalHit

Quote:


> Originally Posted by *Forceman*
> 
> Don't post that in the Firestrike thread, I don't want to have to go reflash the Asus BIOS.


Can I ask what BIOS you're on now, and why you've gone off the Asus one?

I'm trying to figure out exactly where to go with mine at the moment.


----------



## Forceman

Quote:


> Originally Posted by *CriticalHit*
> 
> can i ask what BIOS are you on now ? and why gone off the Asus one ?
> 
> im trying to figure out exactly where to go with mine at the moment.


I switched to the XFX BIOS (it's an XFX card). I didn't really like GPU Tweak and figured if I was going to use AB anyway, it might as well be on an XFX BIOS. But now that I figured out my BF4 problems were CPU related, I'm considering going back to Asus and the 1200/1450 overclock.


----------



## maynard14

Mine can't get to 1100 core clock and 1300 memory clock.

All I can do is 1080 core clock and 1259 memory clock,

power limit at 50 percent,

and I'm using the XFX R9 290X BIOS.

... guess I'm not lucky with my card for overclocking!


----------



## Sherp

Quote:


> Originally Posted by *00Smurf*
> 
> This is how i solved my throttling issues, and excessive temperature and noise.
> 
> Step 1:
> 
> 
> 
> 
> 
> 
> 
> 
> I only have one fan on the h50 atm, but with a push pull setup i can see this maintaing a steady 55-60 C.


What are your VRM temps like?

I managed to get my R9 290 running well with a Gelid Icy Vision. Downclocked to 800/1250, undervolted by 50mV: 70°C core, 85°C VRM1, 65°C VRM2, 815 kH/s.

I used some tiny cable ties to secure the VRM heatsink.


----------



## Banedox

Quote:


> Originally Posted by *CriticalHit*
> 
> can i ask what BIOS are you on now ? and why gone off the Asus one ?
> 
> im trying to figure out exactly where to go with mine at the moment.


I actually found my card works better on the PT1 bios..


----------



## incog

Hey smurf, I'm curious, why did you keep the blower fan on your card? Isn't it useless now?


----------



## Jpmboy

Quote:


> Originally Posted by *smokedawg*
> 
> I got 13423 gpu score using 1210 / 1625 with a 290x:
> http://www.3dmark.com/fs/1206605


link is incorrect

Corrected


----------



## Jpmboy

Quote:


> Originally Posted by *Arizonian*
> 
> I've done a reach out and will look into what's going on.


Thank you!!


----------



## Gero2013

Hey, anyone else having issues playing back .avi videos after installing the R9 290?


----------



## quakermaas

Quote:


> Originally Posted by *00Smurf*
> 
> This is how i solved my throttling issues, and excessive temperature and noise.
> 
> Step 1:
> 
> Step 2:
> 
> Step 3:
> 
> Step 4:
> 
> I only have one fan on the H50 atm, but with a push-pull setup I can see this maintaining a steady 55-60°C.


Well now, that's one way of doing it









So warranty out the window then ?


----------



## brazilianloser

Quote:


> Originally Posted by *quakermaas*
> 
> Well now, that's one way of doing it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So warranty out the window then ?


He cut the plastic lid, so yeah, no warranty.


----------



## amlett

Quote:


> Originally Posted by *Forceman*
> 
> It wasn't. That's what I had set in GPU Tweak. If you check the Firestrike thread there are a couple others around 13000.


Mine too


----------



## ZealotKi11er

So how does the card react when it runs out of power? Does it drop core clocks?


----------



## maynard14

Hi again, OCN.

I recently decided to flash my R9 290 with the XFX 290X BIOS.

Here is a screenshot from HWiNFO,

taken while my PC is idle:

Is my GPU VRM voltage normal for a 290X card?


----------



## quakermaas

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So how does the card react when it runs out of power? Does it drop core clocks?


Yes, it happened to me when power limits were not being applied to my second card using MSI AB.

See here: temperature of both cards below 80, but the second card is throttling.


----------



## maynard14

MSI Afterburner doesn't show my GPU core voltage... please help. I already enabled voltage monitoring in MSI AB, but it still doesn't appear on my Afterburner graph.


----------



## crun

Is there a connection between ASIC quality and stock voltage?

My R9 290(X) has 81% ASIC quality. At stock clocks (1000/1250) it idles at 0.95V (VDDC in GPU-Z) and under load it maxes out at 1.086V.

This might be the reason why my card undervolts nicely (stable at -100mV in MSI AB with 950/1250; voltages: 0.875V idle and 1V load) but overclocks kind of badly? I didn't try increasing the voltage, though.
Quote:


> mine at msi after burner doesnt show my gpu core voltage.... pls help... i already tried the monitor voltage on the msi ab but still it doesnt appear on my graph on msi after burner


what's your version of MSI AB?


----------



## ssiperko

Quote:


> Originally Posted by *CriticalHit*
> 
> Have same card - Which BIOS are you using ?


Asus but I'm going to try the pt1 and the beta drivers.
Quote:


> Originally Posted by *chiknnwatrmln*
> 
> What TIM did you use? 89c seems kinda hot, my Gelid tops at 65c at max OC, and I've heard that the Accellero Extreme III's top out around 60-70c, depending on voltage.
> 
> What are your VRM's at? I know that on my Gelid (even with bigger heatsinks I bought separately) at 1.412v in GPUTweak VRM 1 gets up into the 90's... One of the reasons I'm getting this thing under water haha.


Arctic Silver.

I have the smaller Turbo II version as the Extreme III won't fit my case.

Not sure of the VRM temps ..... have yet to actually find the readings for them.

My goal was/is to keep the temps under the AMD Catalyst 95°C limit at full load, which I achieved while going a full 100MHz higher on the core, and I have yet to max the memory from the stock setup, I think.

I've not found the sweet spot yet .... more testing to be done tonight.

SS


----------



## maynard14

Quote:


> Originally Posted by *crun*
> 
> Is there a connection between ASIC quality and stock voltage?
> 
> My R9 290(X) has 81% ASIC quality. On stock (1000/1250) clocks I'm idling on 0.95V (VDDC in GPU-Z) and during load it maxes out on 1.086V
> 
> This might be the reason why my card undervolts nicely (stable -100mV MSI AB, with 950/1250. Voltages: IDLE: 0.875V and LOAD: 1V ) but overclocks kinda bad? Didn't try increasiny my voltage, though
> what's your version of MSI AB?


MSI Afterburner 3.0.0 Beta 17, sir.

My ASIC quality is only 67.5%; I can only do a 1070/1270 overclock at stock voltage on the stock cooler.

Idle on my 290 flashed to 290X is 0.963V and load is 1.235V;

on my stock 290 BIOS, idle is 0.986V and load is 1.244V.

I don't know why my stock BIOS runs more voltage than my modded 290X BIOS.

Please help me.


----------



## quakermaas

Quote:


> Originally Posted by *crun*
> 
> Is there a connection between ASIC quality and stock voltage?


For the HD 7970 it was like this:

ASIC quality < 75% - 1.1750 V;
ASIC quality < 80% - 1.1125 V;
ASIC quality < 85% - 1.0500 V;
ASIC quality < 90% - 1.0250 V;
ASIC quality ≤ 100% - 1.0250 V.
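To make the banding explicit, here is the HD 7970 ASIC-to-stock-VID table above as a small Python sketch (thresholds and voltages are just the ones quoted; the function name is made up, and the 290/290X may bin differently):

```python
def stock_vid_hd7970(asic_quality: float) -> float:
    """Map ASIC quality (percent) to the quoted HD 7970 stock VID (volts).

    Purely illustrative: values taken verbatim from the table above.
    Lower ASIC quality = leakier chip = higher stock voltage.
    """
    if asic_quality < 75:
        return 1.1750
    elif asic_quality < 80:
        return 1.1125
    elif asic_quality < 85:
        return 1.0500
    else:  # 85% and up all land on the same stock VID
        return 1.0250

# e.g. maynard14's 67.5% chip would sit in the highest-voltage bin,
# while crun's 81% chip lands in a lower one:
print(stock_vid_hd7970(67.5))  # -> 1.175
print(stock_vid_hd7970(81.0))  # -> 1.05
```

This would line up with crun's observation: a high-ASIC card ships with a lower stock VID and tends to undervolt well but overclock modestly.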


----------



## DeadlyDNA

Quote:


> Originally Posted by *Durvelle27*
> 
> I have a question for ppl on Water. Did going water offer better OC results compared to stock/air if you have tried it.


I posted this about my watercooling:

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/8080#


Sorry, I'm so far back in this thread now.


----------



## Durvelle27

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I posted this about my watercooling:
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/8080#
> 
> 
> sorry im so far back in this thread now


So by going water you were able to maintain better clocks?


----------



## DeadlyDNA

Quote:


> Originally Posted by *Durvelle27*
> 
> So by going water you were able to maintain better clocks


Sorry, I meant to link my post on that page. I went from stock air to water; those were the differences I got, with the only change being stock coolers vs. water blocks.


----------



## CriticalHit

@deadlyDNA Had you tried benching your Tri-Fire at 80% to 100% fan speeds, or just stock?

I imagine Tri-Fire at stock fan speed may well heat-throttle the cards. I was testing/benching my overclock headroom in two-card CrossFire at 80% fan before moving to water, with no performance gains.


----------



## DeadlyDNA

Quote:


> Originally Posted by *CriticalHit*
> 
> @deadlyDNA Had u tried benching ur tri fire at 80% to 100% fan speeds .. Or just stock ?
> 
> I imagine tri fire at stock fan speed may well heat throttle cards .. Was testing / benching my overclock headroom in 2 card crossfire at 80% before moving to water with no performance gains


I ran the fans maxed at 70-100%, because if I didn't it would shut down from heat. I also had to space out the top two cards; if not, it would shut down from heat.
My results were with Tri-Fire; one or two cards probably won't suffer this loss.


----------



## 00Smurf

Quote:


> Originally Posted by *brazilianloser*
> 
> He cut the plastic lid so yeah no warranty.


Quote:


> Originally Posted by *quakermaas*
> 
> Well now, that's one way of doing it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So warranty out the window then ?


Lol, yeah, warranty's out, unless I can find another stock cooler. Anyone have one for cheap? lol

I could have not cut up the stock cooler, but I liked the shroud. My next task is to try to tap the fan temp reading from the video card and splice it into the temp lead of the rad fans. I also want to change it out for a low-profile water cooler and a thicker rad in push-pull. But as a proof of concept it works quite well, I'd say, and that's only using zip ties to hold the pump on and some Scythe heatsinks I've had sitting around for years.


----------



## 00Smurf

Quote:


> Originally Posted by *Sherp*
> 
> What are your VRM temps like?
> 
> I managed to get my R9 290 running well with a Gelid Icy Vision. Downclocked to 800/1250, undervolted by 50mV, 70°C core, 85°C VRM1, 65°C VRM2, 815kH/s
> 
> I used some tiny cable ties to secure the VRM heatsink.


GPU fan @ 20%
VRM1: 67°C
VRM2: 57°C

That's hashing under full load, with a GPU temp of 65°C, voltage at 1.135-1.145, 920 core / 1500 mem.

Quote:


> Originally Posted by *incog*
> 
> Hey smurf, I'm curious, why did you keep the blower fan on your card? Isn't it useless now?


I like the look of the shroud. Plus it blows air over the rear PCB and helps dissipate the heat from the aluminum heatsinks, and it still allows me to monitor the fan speed signal. Now that I know this works, I'm going to get a low-profile AIO cooler to replace this large H50 unit (can't run Xfire with it). Plus I want to tap into the fan temp and possibly the voltage signal, so that the GPU will control the radiator fans.


----------



## Clockster

Quote:


> Originally Posted by *00Smurf*
> 
> Lol, yea warrantys out, unless i can find another stock cooler. Anyone have one for cheap? lol
> 
> I could have not cut up the stock cooler, but i like'd the shroud. my next task is to try and get the fan temp reading from the vid card and splice it to the fan temp lead of the rad fans. I also want to change it out to a low profile water cooler and a thicker rad with push pull. But as a proof of concept, it works quite well i'd say. and only using zipties to hold the pump on and some scythe heat sinks i've had sitting around for years.


I love what you did, lol.
Thinking of doing the same as an experiment with an H100 I have lying here.
Lucky for me, I can get a stock cooler from one of my suppliers







lol


----------



## Sherp

Quote:


> Originally Posted by *00Smurf*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sherp*
> 
> What are your VRM temps like?
> 
> I managed to get my R9 290 running well with a Gelid Icy Vision. Downclocked to 800/1250, undervolted by 50mV, 70°C core, 85°C VRM1, 65°C VRM2, 815kH/s
> 
> I used some tiny cable ties to secure the VRM heatsink.
> 
> 
> 
> Gpu Fan @ 20%
> Vrm1: 67C
> vrm2: 57C
> 
> thats Hashing a under full load with a Gpu temp of 65C, with voltage at 1.135-1.145, 920C/1500 Mem

Was confused how your VRM temps were so low, just noticed you chopped up your unisink.

The unisink does a great job of keeping the VRM and RAM cool; it's a shame the vapor chamber seems to be glued to it. If I could combine the black unisink with a Gelid Icy Vision, the temperatures would be fantastic.


----------



## Durvelle27

Quote:


> Originally Posted by *DeadlyDNA*
> 
> sorry I meant to link my post on that page. I was stock on air then water. Those were the differences I got with the only change being stock coolers vs water blocks


So it was throttling


----------



## ImJJames

Quote:


> Originally Posted by *smokedawg*
> 
> I got 13423 gpu score using 1210 / 1625 with a 290x:
> http://www.3dmark.com/fs/1206605


Are there some kind of tweaks I don't know about to get a better Firestrike score?


----------



## Durvelle27

Thinking of getting a GTX 780 if i go water and it still doesn't perform where i want it


----------



## 00Smurf

Quote:


> Originally Posted by *Clockster*
> 
> I love what you did lol
> Thinking of doing the same as an experiment with a H100 I have laying here.
> Lucky for me I can get a stock cooler from one of my suppliers
> 
> 
> 
> 
> 
> 
> 
> lol


Go for it! It honestly took about 20 mins to do.

lol Do they have an extra one?
Quote:


> Originally Posted by *Sherp*
> 
> Was confused how your VRM temps were so low, just noticed you chopped up your unisink.
> 
> The unisink does a great job of keeping the VRM and RAM cool, it's a shame that vapor chamber seems to be glued to it. If I could combine the black unisink with a Gelid Icy Vision, the temperatures would be fantastic.


I tried separating the unisink from the vapor chamber; that's how it ended up snapped, but in the end it worked out. The ideal situation would be the unisink as a full plate with the AIO attached to it, but alas, as you noticed, they are glued together.


----------



## Derpinheimer

Quote:


> Originally Posted by *Sherp*
> 
> Was confused how your VRM temps were so low, just noticed you chopped up your unisink.
> 
> The unisink does a great job of keeping the VRM and RAM cool, it's a shame that vapor chamber seems to be glued to it. If I could combine the black unisink with a Gelid Icy Vision, the temperatures would be fantastic.


How are you underclocking? Is it only possible to underclock with the default BIOS?

With the unlocked 290X BIOS, at stock clocks and 1500 memory, I get 815 kH/s as well? :/


----------



## ImJJames

Quote:


> Originally Posted by *Durvelle27*
> 
> Thinking of getting a GTX 780 if i go water and it still doesn't perform where i want it


I just noticed you're running a 5GHz OC on a 4+2 VRM motherboard... how has it not blown up yet?


----------



## jerrolds

I'm getting a 2nd 290X today, and since the stock cooler is garbage, I'm thinking of slapping on another 3rd-party solution.

Outside of waterblocks, what aftermarket cooling do you guys prefer? I'm currently using the Gelid Icy 2 on my 290X and it's pretty good - @1150MHz my temps are 55°C max on the core, with VRM1 around 85°C.

At 1200+, 65°C on the core and 95°C on VRM1... which is OK... but I'm currently running in between at 1180MHz.

Gelid Icy 2s are sold out, as well as Prolimatech MK-26s.

So that leaves me with the AC Hybrid/Xtreme 3, or possibly custom brackets like the Kraken G10 or Sigma_Cool MK2.

Right now I'm leaning towards the G10; it comes with a fan, and I can get a better AIO cooling solution, possibly much cheaper come Boxing Day.


----------



## Durvelle27

Quote:


> Originally Posted by *ImJJames*
> 
> I just noticed you're running 5ghz OC on a 4+2 VRM motherboard....how did it not blow up yet?


Just for the funnies


----------



## smokedawg

Quote:


> Originally Posted by *ImJJames*
> 
> Some kind of tweaks I don't know about to get better score on firestrike?


Only thing I can think of is setting AMD Catalyst preset to optimal performance. Other than that no clue.


----------



## ImJJames

Quote:


> Originally Posted by *smokedawg*
> 
> Only thing I can think of is setting AMD Catalyst preset to optimal performance. Other than that no clue.


Weird, at 1200/1500 I can't even crack a 12800 graphics score...


----------



## rdr09

Quote:


> Originally Posted by *ImJJames*
> 
> Weird on 1200/1500 I can't even crack 12800 graphic score...


I get almost 12700 with the 290 at those clocks. I don't think it is stable; it should be closer to 13000.


----------



## amlett

Quote:


> Originally Posted by *ImJJames*
> 
> Weird on 1200/1500 I can't even crack 12800 graphic score...


Try a fresh install of Win 8.1. I was getting lower numbers when I installed the card; my OS was a three-year-old Win7 install with too many things on it I wasn't using anymore.


----------



## specopsFI

Quote:


> Originally Posted by *ImJJames*
> 
> Weird on 1200/1500 I can't even crack 12800 graphic score...


Well, you do have a 290, don't you? The 290X does have 10% more shaders. Surely it shows in some benchies, though you did great in Heaven.


----------



## ImJJames

Quote:


> Originally Posted by *rdr09*
> 
> I get almost 12700 with the 290 at those clocks. I don't think it is stable. it should be closer to 13000.


You know what I'm realizing? I think it's because I'm comparing mine to a 290X. I have a 290; the extra shaders may give it that extra 400-point boost in graphics score.


----------



## ImJJames

Quote:


> Originally Posted by *specopsFI*
> 
> Well you do have a 290, don't you? The 290X does have 10% more shaders. Surely it shows in some benchies though you did great on Heaven


Yup, you're right, lol. That's what I just posted above.


----------



## GroupB

Stupid question, guys. I'm currently removing the MCW82s from my 6950s to put back the original heatsink (Sapphire dual-fan), and it got me thinking: do you think a Sapphire 6950 2GB heatsink will fit on the 290? A 6950 is also pretty close to 300W. I just started mining, and the noise and temps are pretty high; I'm trying to find a solution while I wait for the XSPC block to hit the market.


----------



## tsm106

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> It's ok, we haven't forgotten

It's good that some ppl still know what's what in this thread.









Quote:


> Originally Posted by *KyGuy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Durvelle27*
> 
> Yes I know of water cooling as my 7970 @1290/1850 was water cooled and my 8350 is also. Do you guys think by water cooling I could yield better OC results
> 
> 
> 
> Water would let you use higher voltage which may let you achieve a better core clock. *Memory overclocks are virtually useless on these cards. Only helps with texture loading speed.* I tried overclocking my 290x and couldn't even get past 1130 without artifacting with +100mv in Afterburner, but I am still considering water cooling as I want another card for crossfire. I have a terrible card for core overclocking, but it will do 1500 on the memory and 1600 if I raise the AUX voltage even though it isn't directly related to the memory clock.

If you don't know, you just don't know and you should leave it at that.

For ex.

@1225/1375 = 12734 GS

http://www.3dmark.com/fs/1060554

@1225/1500 = 13414 GS

http://www.3dmark.com/fs/1050519

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Derpinheimer*
> 
> Hm, AUX voltage is VRM 2, aka the set of 3.
> 
> Clearly the voltage for them is not 1.0v default [Or is it?]
> 
> Very weird..
> 
> 
> 
> I always thought VRM 2 was the long strip and VRM 1 was the 3?

VRM1 is the triple set of vrms.


----------



## rdr09

Quote:


> Originally Posted by *ImJJames*
> 
> You know what I am realizing? I think its because I'm comparing mine to 290x I have 290 the extra shaders may help it give that extra 400 points boost on graphic score.


there you go. I just tested mine again and got the same result. I guess mine is stable at those clocks in 3DMark at least.


----------



## SamEkinci

Sorry guys this thread is moving too fast.

Unfortunately I was caught up talking to Amazon, so I didn't take a picture.

I was using a Kingwin single-rail, Gold-rated 750W PSU to test power usage and see if one PSU was good enough to run both CPU and GPU overclocked and at load together. It wasn't; the GPU was throttling.

It's interesting, because I turned it on, reset the overclock values to default, ran Kombustor through Afterburner, and heard a slight crack, as the case was open and next to me.

I removed the block and voilà: the top-left corner had a gigantic crack along with multiple tiny ones. I doubt it's mounting stress, because I used the stock springs and bracket, not to mention it worked fine a couple of tests prior.

Really weird; temperatures averaged 50°C at full load when this happened.

Now I'm trying to save for a 290X by Christmas, though I am a bit discouraged after the second one went kaput for a totally different reason. The card is just too new at this stage, I think, and there are still too many unknowns about it.


----------



## Banedox

Quote:


> Originally Posted by *SamEkinci*
> 
> Sorry guys this thread is moving too fast.
> 
> Unfortunately I was out of it talking to amazon so didn't take a picture.
> 
> I was using a kingwin single rail gold rated 750w psu to test power usage and see if one psu was good to run both CPU and gpu over clocked and at load together. It didn't. Gpu was throttling.
> 
> It's interesting because i turned it on, reset the overclock values to default, ran kombustor through afterburner and I heard a slight crack as case was open and next to me.
> 
> Removed the block and voila top left corner had a gigantic crack along with multiple tiny ones. Doubt it's the stress because i used the stock springs and bracket, not to mention it worked fine couple of tests prior.
> 
> Really weird, temperatures were average 50c when this happened at full load.
> 
> Now trying to save for 290x xmas time though I am a bit discouraged after second one went kaput for totally different reason. It's just too new I think at this stage and too many unknowns about the card still.


Wait, you're saying that on a 750W PSU your card was throttling because of the PSU?


----------



## tsm106

Quote:


> Originally Posted by *SamEkinci*
> 
> Sorry guys this thread is moving too fast.
> 
> Unfortunately I was out of it talking to amazon so didn't take a picture.
> 
> I was using a kingwin single rail gold rated 750w psu to test power usage and see if one psu was good to run both CPU and gpu over clocked and at load together. It didn't. Gpu was throttling.
> 
> It's interesting because i turned it on, reset the overclock values to default, ran kombustor through afterburner and I heard a slight crack as case was open and next to me.
> 
> *Removed the block and voila top left corner had a gigantic crack along with multiple tiny ones.* Doubt it's the stress because i used the stock springs and bracket, not to mention it worked fine couple of tests prior.
> 
> Really weird, temperatures were average 50c when this happened at full load.
> 
> Now trying to save for 290x xmas time though I am a bit discouraged after second one went kaput for totally different reason. It's just too new I think at this stage and too many unknowns about the card still.


What the heck are you guys doing that you're cracking dies?! Is that a GPU-only block? Did you compress the springs all the way and keep tightening...?

I've never cracked a die before, and this is my third tri- or quad-AMD-GPU setup: quad 6950s, quad 7970s, and now tri 290X.


----------



## jerrolds

Argh - anyone know the safe storage temps for these cards?

I found a 290X in the local listings, in another city, and got my sister to pick it up and ship it to me. But she didn't make the package require a signature, so it's currently in the community mailbox down the street.

It's -23°C right now and no one will be able to get to it for a couple of hours.

I'm sure it's fine, but it's kind of annoying. It'll sit at room temperature for a few hours before I fire it up, so condensation shouldn't be an issue.

Worst case, it sits there overnight and the temps hit -30°C... will that be a problem even if it's at room temperature for a while afterward?


----------



## iamhollywood5

Quote:


> Originally Posted by *jerrolds*
> 
> Argh - anyone know the safe storing temps for these cards?
> 
> 
> 
> 
> 
> 
> 
> I found a 290X in the locals, in another city - and got my sister to pick it up and ship it to me. But she didnt make the package require a signature - so its currently in the community mailbox down the street.
> 
> It's -23C right now and no one will be able to get to it for a couple hours.
> 
> I'm sure its fine but kinda annoying. Itll sit in room temperature for a few hours before i fire it up so condensation shouldnt be an issue.
> 
> Worst case and its sitting there overnight and the temps hit -30C...will this be a problem even if its in room temps for a while?


I think it's fine; you just don't want it to go from -30°C to 20°C rapidly. Maybe put it in a fridge for a while so the temperature change isn't so rapid? lol


----------



## Forceman

Quote:


> Originally Posted by *jerrolds*
> 
> Argh - anyone know the safe storing temps for these cards?
> 
> 
> 
> 
> 
> 
> 
> I found a 290X in the locals, in another city - and got my sister to pick it up and ship it to me. But she didnt make the package require a signature - so its currently in the community mailbox down the street.
> 
> It's -23C right now and no one will be able to get to it for a couple hours.
> 
> I'm sure its fine but kinda annoying. Itll sit in room temperature for a few hours before i fire it up so condensation shouldnt be an issue.
> 
> Worst case and its sitting there overnight and the temps hit -30C...will this be a problem even if its in room temps for a while?


Quote:


> Originally Posted by *iamhollywood5*
> 
> I think it's fine, you just dont want it to go from -30C to 20C rapidly. Maybe put it in a fridge for a while so the temperature change isn't so rapid? lol


Yeah, I don't think there's any lower limit on storing (within reason). Just let it warm up before you plug it in so you don't get any condensation on it and short something out.
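The condensation worry above comes down to dew point: moisture only condenses while the card's surface is colder than the dew point of the room air. A rough sketch using the Magnus approximation (constants and the 20 °C / 50% RH example are illustrative, not from the thread):

```python
import math

def dew_point_c(temp_c, rel_humidity_pct):
    """Approximate dew point in °C via the Magnus formula."""
    a, b = 17.27, 237.7
    gamma = math.log(rel_humidity_pct / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# In a 20 °C room at 50% relative humidity, any surface colder than
# roughly 9 °C will sweat - hence letting a frozen card warm up first.
print(round(dew_point_c(20.0, 50.0), 1))  # → 9.3
```

So the advice holds: once the card has warmed above the room's dew point, condensation is no longer a risk.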


----------



## sugarhell

Quote:


> Originally Posted by *tsm106*
> 
> What the heck are you guys doing where you are cracking dies!!! Is that a gpu only block? You compressed the springs all the way and kept tightening...?
> 
> I've never cracked a die before and this is my third set of tri or quad amd gpu setup, from quad 6950s, quad 7970s, to tri 290x.


Lol. It's quite difficult to crack a die.

I don't want to see these guys trying to delid an Intel CPU.


----------



## Jpmboy

Quote:


> Originally Posted by *smokedawg*
> 
> Only thing I can think of is setting AMD Catalyst preset to optimal performance. Other than that no clue.


It's that E3G3 mobo! It's a beast! (I have the same one.) And your result is at PCIe 2.0! Good run!


----------



## ImJJames

So I found out the PT1 BIOS was giving me unstable results in 3DMark; using the stock ASUS BIOS I was able to get much better results at lower clocks.

1250/1500, stock ASUS BIOS, stock cooler:
http://www.3dmark.com/fs/1285324


----------



## Banedox

Quote:


> Originally Posted by *Jpmboy*
> 
> It's that E3G3 mobo! It's a beast! (I have the same one.) And your result is at PCIe 2.0! Good run!


I don't think PCI-Express bottlenecks the card yet at 16x.


----------



## Forceman

Quote:


> Originally Posted by *ImJJames*
> 
> So I found out PT 1 bios was giving me unstable results on 3dmark, using stock asus bios I was able to get much better results at lower clocks
> 
> 1250/1500 Stock Asus Bios, Stock cooler
> http://www.3dmark.com/fs/1285324


Glad you got it figured out. Wonder what the PT1 was doing to affect it so much.


----------



## ImJJames

Quote:


> Originally Posted by *Forceman*
> 
> Glad you got it figured out. Wonder what the PT1 was doing to affect it so much.


Not sure. On the PT1 BIOS, even at a 1270 clock I couldn't touch a 13k graphics score. With the stock ASUS BIOS @ 1250 I get over 13k, lol.


----------



## tsm106

Quote:


> Originally Posted by *ImJJames*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> Glad you got it figured out. Wonder what the PT1 was doing to affect it so much.
> 
> 
> 
> Not sure, on PT 1 bios even at 1270 clock I couldnt even touch 13k Graphic score. With Asus stock bios @ 1250 I get over 13k lol
Click to expand...

Reads like a user issue to me, shrugs.


----------



## ImJJames

Quote:


> Originally Posted by *tsm106*
> 
> Reads like a user issue to me, shrugs.


It may be... I think on the PT1 BIOS I was pushing more voltage than the card needed.


----------



## Jpmboy

Quote:


> Originally Posted by *jerrolds*
> 
> Argh - anyone know the safe storing temps for these cards?
> 
> 
> 
> 
> 
> 
> 
> I found a 290X in the locals, in another city - and got my sister to pick it up and ship it to me. But she didnt make the package require a signature - so its currently in the community mailbox down the street.
> 
> It's -23C right now and no one will be able to get to it for a couple hours.
> 
> I'm sure its fine but kinda annoying. Itll sit in room temperature for a few hours before i fire it up so condensation shouldnt be an issue.
> 
> Worst case and its sitting there overnight and the temps hit -30C...will this be a problem even if its in room temps for a while?


Plug that puppy in at that temp and give it the whip!


----------



## Jpmboy

Quote:


> Originally Posted by *Banedox*
> 
> I don't think PCI-Express bottlenecks the card yet at 16x.


Just kidding... But seriously, that $100 mobo has been bulletproof for almost two years, running constantly!


----------



## jerrolds

Quote:


> Originally Posted by *Jpmboy*
> 
> Plug that puppy in at that temp and give it the whip!


It's like a poor man's full-card LN2!


----------



## Jpmboy

Quote:


> Originally Posted by *jerrolds*
> 
> It's like a poor man's full-card LN2!


----------



## Derpinheimer

Quote:


> Originally Posted by *tsm106*
> 
> It's good that some ppl still know what's what in this thread.
> 
> 
> 
> 
> 
> 
> 
> 
> If you don't know, you just don't know and you should leave it at that.
> 
> For ex.
> 
> @1225/1375 = 12734 GS
> 
> http://www.3dmark.com/fs/1060554
> 
> @1225/1500 = 13414 GS
> 
> http://www.3dmark.com/fs/1050519
> VRM1 is the triple set of vrms.


It's not. I've manually removed the heatsink I have on the set of three and watched the VRM2 temp increase. Then, replacing it with a cold heatsink, I watched it drop a good 15°C before rising back up. Unless it's a HWiNFO/GPU-Z discrepancy in labeling one "1" and the other "2", it doesn't make sense.

BTW, have all of you bought 3DMark? Or are you actually running all the tests just to compare Fire Strike in the free version?


----------



## jerrolds

Yeah, I had a feeling VRM1 (as reported by GPU-Z) is the line of VRMs furthest from the output connections.


----------



## chiknnwatrmln

I also agree that VRM 1 is the long line of VRMs.

Changing heatsinks on the cluster (VRM 2) affected VRM 2 temperatures.

Also, I found a thread on Sweclockers where one user pointed an air compressor at the cluster of VRMs while the card was being stress tested... He said VRM 2 temps dropped 40°C while he did that, then went back up.

I don't have the link now, but I posted it earlier in the thread. The page is in Swedish, though, so you'll have to translate it.


----------



## Derpinheimer

Here's a little test demonstrating what I said, running cgminer on a 290X.

Red line -> heatsink removed
Green line -> heatsink replaced (a different one each time)


----------



## prostreetcamaro

Quote:


> Originally Posted by *Derpinheimer*
> 
> Its not. I've manually removed the heatsink I have on the set of 3 and watched VRM2 temp increase. Then replacing it with a cold heatsink and it drops a good 15c before rising back up. Unless its a HWinfo/GPUz discrepancy on labeling one "1" and the other "2", then it doesnt make sense.
> 
> BTW, have all of you bought 3dmark? Or are you actually running all tests just to compare Firestrike in free version?


No way am I buying that terrible benchmark. Many people like to use 3dmark11 because it lets you run just the benchmark instead of forcing you to watch all the other crap like in 3dmark13.


----------



## VSG

What? You can run just the benchmarks in 3dmark 13, just not in the free version.


----------



## rdr09

Quote:


> Originally Posted by *ImJJames*
> 
> So I found out PT 1 bios was giving me unstable results on 3dmark, using stock asus bios I was able to get much better results at lower clocks
> 
> 1250/1500 Stock Asus Bios, Stock cooler
> http://www.3dmark.com/fs/1285324


We are still getting to know Hawaii. It is not like Tahiti, except that it is temperature-sensitive too. I guess all GPUs are, lol.


----------



## ImJJames

Quote:


> Originally Posted by *prostreetcamaro*
> 
> No way am I buying that terrible benchmark. Many people like to use 3dmark11 because it lets you run just the benchmark instead of forcing you to watch all the other crap like in 3dmark13.


Terrible benchmark? With the paid version, which is like $7 on Steam, you can run just Fire Strike... which takes about as long as 3DMark 11.


----------



## Derpinheimer

Quote:


> Originally Posted by *ImJJames*
> 
> Terrible benchmark? The bought version which is like $7 on steam you can run just firestrike...which is just as long as 3dmark11


It's $25?


----------



## Banedox

Quote:


> Originally Posted by *Derpinheimer*
> 
> Its $25?


It was like 75% off during the last Steam sale.


----------



## jerrolds

Wait for Steam sales - the price usually drops a lot. The next one is rumored to run from around the 19th to the 2nd, I believe.


----------



## KyGuy

Quote:


> Originally Posted by *tsm106*
> 
> It's good that some ppl still know what's what in this thread.
> 
> 
> 
> 
> 
> 
> 
> 
> If you don't know, you just don't know and you should leave it at that.
> 
> For ex.
> 
> @1225/1375 = 12734 GS
> 
> http://www.3dmark.com/fs/1060554
> 
> @1225/1500 = 13414 GS
> 
> http://www.3dmark.com/fs/1050519
> VRM1 is the triple set of vrms.


What I meant to get across is that a memory overclock does not lead to a big gain in real-world performance, such as in games. Benchmarks may be different, as they are designed to push every part of the card to its limit. In games, though, memory overclocks do not really do much for you.


----------



## battleaxe

Well, picking up my 290 tomorrow. Getting an MSI. Are those decent?

Okay, I want to use a fan profile on this thing from the get-go. Can I use MSI AB? (meaning I want to be able to hit 100% fan speed)

This PC is in a closet so noise is a non-issue. (no worries - has an A/C vent in the closet)


----------



## Derpinheimer

Yes + yes

Very minute chance of unlocking. But the card is the same as the rest.


----------



## battleaxe

Quote:


> Originally Posted by *Derpinheimer*
> 
> Yes + yes
> 
> Very minute chance of unlocking. But the card is the same as the rest.


Guessing XFX has the best unlock chance? Friday I'll be looking at those, so I may exchange it if an XFX comes in.

What's the decent threshold for OC on air?

I'm going to water, but I want to test on air first, so I'm curious what typical OC I should expect.


----------



## Derpinheimer

XFX has been at around a 50% unlock rate from the start. Sapphire started becoming much more likely to unlock with the BF4 edition. PowerColor has gone from nearly 100% to more like 50% or less.

BTW, ASUS GPU Tweak has been updated:

Max core voltage reduced a ton, to 1410 mV
Now supports downclocking
Now supports power limit
Now supports idle states while overclocked

(All from my own testing.)

I don't see an official changelog yet; I'm pretty sure it just came out.


----------



## Redeemer

Hey guys, I've been following this thread and have found so much useful information. My question: under water, which card is more of a beast, the 290X or the 780 Ti? The GPU is the last piece needed to complete my rig!

Thanks


----------



## jerrolds

Quote:


> Originally Posted by *Derpinheimer*
> 
> XFX has been around 50% from the start. Sapphire started becoming much more likely to unlock with BF4 edition. Powercolor has gone from near 100% to more like 50% or less unlock .
> 
> BTW, ASUS GPU tweak has been updated.
> 
> Max core voltage reduced a ton to 1410mV
> Now supports downclocking
> Now supports power limit
> Now supports idle states while overclocked
> 
> ^^^^^From my testing^^^^
> 
> I dont see a legit changelog yet, pretty sure it just came out.


All those points describe how GPU Tweak has already been behaving for me over the last month and a half.


----------



## jerrolds

Quote:


> Originally Posted by *Redeemer*
> 
> Hey guys been following this thread found so much useful information. My question is under water which card is more beast the 290x or the 780TI? The GPU is the last piece needed to complete my rig!
> 
> Thanks


If price isn't a concern for you, and you are only gaming, then the 780 Ti is better out of the box, and they both have decent overclocking ceilings. It should also run cooler/quieter than the stock 290X.


----------



## tsm106

Quote:


> Originally Posted by *jerrolds*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Redeemer*
> 
> Hey guys been following this thread found so much useful information. My question is under water which card is more beast the 290x or the 780TI? The GPU is the last piece needed to complete my rig!
> 
> Thanks
> 
> 
> 
> If price isnt a concern for you, and you are only gaming - then the 780ti is better out of the box, and they both have decent ceilings for overclocking. It should also run cooler/quieter than the stock 290X.
Click to expand...

How did you get that reversed? Ugh, the 290X is the better pure gaming card. The 780 Ti is better at benching.


----------



## MlNDSTORM

Quote:


> Originally Posted by *Redeemer*
> 
> Hey guys been following this thread found so much useful information. My question is under water which card is more beast the 290x or the 780TI? The GPU is the last piece needed to complete my rig!
> 
> Thanks


I was looking at this guy's videos about his 290 on water, and it has convinced me to go with a 290X and put it on water. It will be the first card I watercool.


----------



## rdr09

Quote:


> Originally Posted by *MlNDSTORM*
> 
> I was looking at this guys videos about his 290 on water and it has convinced to go with a 290x and put it on water. Will be my first card I watercool.
> 
> 
> Spoiler: Warning: Spoiler!


Do you happen to know what resolution the person was using? 'Cause if it's 1080p, you can play that with the 290 at stock, Very High (all), and 8x MSAA. I did, but I have mine paired with an i7 at 4.5 GHz, and I am talking about multiplayer as well as the campaign.


----------



## MlNDSTORM

Quote:


> Originally Posted by *rdr09*
> 
> you happen to know what rez the person was using? 'Cause if 1080 you can play that with the 290 stock at Very High (all) and 8MSAA. i did but i have mine paired with an i7 4.5 GHz and i am talking about multi-player as well as campaign.


Hmmm... not sure; he doesn't state it anywhere in the description, just says to watch it in 1080p. Liking the temps though: from 90°C down to 50°C...


----------



## rdr09

Quote:


> Originally Posted by *MlNDSTORM*
> 
> Hmmm...Not sure he doesn't state anywhere in the description, just says to watch it in 1080p. Liking the temps though, from 90c to 50c....


If it was 1080p, even a 7970 can play those settings without a problem; heck, even a 7950 with a bit of an OC can do it. Now, if it is higher than 1080p, then yeah, OC the 290. But, like I said, at stock the 290 can max out C3 without much concern about temps, especially when watercooled.


----------



## the9quad

Quote:


> Originally Posted by *SamEkinci*
> 
> Sorry guys this thread is moving too fast.
> 
> Unfortunately I was out of it talking to amazon so didn't take a picture.
> 
> I was using a kingwin single rail gold rated 750w psu to test power usage and see if one psu was good to run both CPU and gpu over clocked and at load together. It didn't. Gpu was throttling.
> 
> It's interesting because i turned it on, reset the overclock values to default, ran kombustor through afterburner and I heard a slight crack as case was open and next to me.
> 
> Removed the block and voila top left corner had a gigantic crack along with multiple tiny ones. Doubt it's the stress because i used the stock springs and bracket, not to mention it worked fine couple of tests prior.
> 
> Really weird, temperatures were average 50c when this happened at full load.
> 
> Now trying to save for 290x xmas time though I am a bit discouraged after second one went kaput for totally different reason. It's just too new I think at this stage and too many unknowns about the card still.


I am not trying to be judgmental, but you removed the stock cooler, posted a video of yourself putting a water cooler on it in which you say several times not to tighten it down too much, and then proceeded to keep tightening it yourself while admitting you were probably tightening it too much. Now it's cracked and you are blaming the card? And trying to RMA it?

See the 28-minute mark.


----------



## stickg1

All of a sudden, after some overclocking and benchmarking, my screen goes black when I get into the OS. So I went into safe mode and uninstalled all the overclocking apps. Still black. I took the GPU out, booted up with the iGPU, and uninstalled all the drivers; the screen is still black with the GPU installed. I feel sick to my stomach, as I think I just bricked my $400 GPU.

Anyone have any helpful tips? I'm a little upset right now :/


----------



## 8800GT

Hey guys, I'm from the green team. Anyone wanna run some benches of your 290s and 290Xs at max OC against my 780 at max OC in some popular titles?


----------



## ZealotKi11er

Quote:


> Originally Posted by *stickg1*
> 
> All of the sudden after some overclocking and benchmarking my screen goes black when I get into the OS. So I went into safe mode and uninstalled all the overclocking apps. Still black, took the GPU out and booted up with iGPU and uninstalled all the drivers, screen still black with GPU installed. I feel sick in the stomach as I think I just bricked my $400 GPU.
> 
> Anyone have any helpful tips? I'm a little upset right now :/


I had that problem. I think some people know a workaround for it. You have to delete something. I had to reinstall Windows.


----------



## ImJJames

Quote:


> Originally Posted by *stickg1*
> 
> All of the sudden after some overclocking and benchmarking my screen goes black when I get into the OS. So I went into safe mode and uninstalled all the overclocking apps. Still black, took the GPU out and booted up with iGPU and uninstalled all the drivers, screen still black with GPU installed. I feel sick in the stomach as I think I just bricked my $400 GPU.
> 
> Anyone have any helpful tips? I'm a little upset right now :/


I know this might sound stupid, but have you tried using the BIOS switch to go back to quiet mode? Just to see if it's a bad-flash issue.


----------



## stickg1

Quote:


> Originally Posted by *ImJJames*
> 
> I know this might sound stupid but have you tried using bios switch back to quiet mode? Just to see if it was a bad flash issue.


I never flashed the BIOS on this card. I'm going to reinstall Windows.


----------



## ZealotKi11er

Quote:


> Originally Posted by *stickg1*
> 
> I never flashed the BIOS on this card, I'm going to reinstall windows.


Try uninstalling the drivers in safe mode; then you should be able to boot. Install the driver normally, but set it so CCC does not launch at Windows startup, and see if it works. Use CCleaner to stop CCC from starting up.


----------



## Emmett

Quote:


> Originally Posted by *stickg1*
> 
> All of the sudden after some overclocking and benchmarking my screen goes black when I get into the OS. So I went into safe mode and uninstalled all the overclocking apps. Still black, took the GPU out and booted up with iGPU and uninstalled all the drivers, screen still black with GPU installed. I feel sick in the stomach as I think I just bricked my $400 GPU.
> 
> Anyone have any helpful tips? I'm a little upset right now :/


Have you tried DDU in safe mode, after uninstalling the drivers and deleting the AMD folders?

I had something bad that kept persisting through driver reinstalls until I used DDU; then all was good.


----------



## ihaveworms

To the guys with the first version of the Koolance waterblock (yes, they have already released another revision that doesn't fully cover the card):

During gaming (BF4) my VRM1 temps max out around 58°C. When I tried the OCCT GPU benchmark, the VRM1 temps rose rapidly; I stopped the benchmark when they hit 80°C. I just wanted to know if that is normal. Looking at the underside of the block, I can see two separate metal pieces: one for the RAM, the GPU, and the small set of VRMs, and another chunk over the strip of VRMs. I am not entirely sure whether water makes contact with the metal over the VRM1 strip, because I didn't want to take the block apart. Does anyone know if water comes in contact with that part of the block over the VRM1 strip? If not, that would explain why I was hitting those temps. I am just worried that I placed the thermal pad incorrectly, if water does in fact flow over that piece of the block.


----------



## stickg1

I just reinstalled windows and the problem is solved. Thanks!


----------



## Jack Mac

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *ihaveworms*
> 
> To the guys with the first version of the Koolance waterblock (yes, they already released another revision that doesn't fully cover the card)
> 
> During gaming (BF4) my VRM1 temps max out around 58C. When I tried OCCT GPU Benchmark I found that the VRM1 temps raised rapidly. I stopped the benchmark when my VRM1 temps hit 80C. I just wanted to know if that was normal. When looking at the block underside, I can see how there are two separate metal pieces, one for the RAM, GPU, and the small set of VRMs. Then the other chunk is over the strip of VRMs. I am not entirely sure if water makes contact with the metal over the VRM1 strip because I didn't want to take the block apart. Does anyone know if water comes in contact with that block over the VRM1 strip? If not, that would have explained why I was hitting those temps. I am just worried that I placed the thermal pad incorrectly if water is in fact flowing over that block peice.





What kind of voltage are you running, and what are your ambient temperatures like? That definitely sounds strange, though; my stock-cooled card tops out at 72°C on the hottest VRM at 80% fan speed when benching at 1200/1450 w/ +100mV in a 22°C room.


----------



## RandomHer0

So has it been determined yet whether the BSODs are caused by driver or hardware issues? I got mine yesterday and have given up after 24 hours of trying different things, reinstalling drivers, etc. I'm pretty sure I'll take it back to the shop unless you guys say otherwise.


----------



## ihaveworms

Quote:


> Originally Posted by *Jack Mac*
> 
> What kind of voltage are you running and what are your ambient temperatures like? That definitely sounds strange though, my stock cooled card tops out at 72C on the hottest VRM at 80% fan speed when benching at 1200/1450 w/ +100mv in a 22C room.


+25mV in a 74°F room. I would say that if you ran OCCT on a stock cooler with the settings I was using, your temps would be high as well. In games it is alright.
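For anyone comparing the two ambient figures in this exchange, the 74 °F room converts to roughly the same as the 22 °C room quoted above; a quick sketch of the conversion:

```python
def f_to_c(deg_f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (deg_f - 32) * 5.0 / 9.0

# The 74 °F room is only about 1 °C warmer than the 22 °C room.
print(round(f_to_c(74), 1))  # → 23.3
```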


----------



## Derpinheimer

Quote:


> Originally Posted by *RandomHer0*
> 
> So has it been determined yet whether or not the BSoD are caused by driver or hardware issues? Got mine yesterday and have given up after 24 hours of trying different things with reinstalling drivers etc. Pretty sure I'll take it back to the shop unless you guys say otherwise


The only things that give me a BSOD are running cgminer with bad login credentials or a bad OC (too little voltage).


----------



## RandomHer0

Quote:


> Originally Posted by *Derpinheimer*
> 
> Only thing that give me BSOD are using cgminer with bad login credentials or bad oc (too low voltage)


I haven't OC'd mine at all. It will crash if just left on the desktop. Ironically, the longest period it went without crashing was whilst testing mining.


----------



## DeadlyDNA

Quote:


> Originally Posted by *8800GT*
> 
> Hey guys, I'm from the green team. Anyone wanna do some benches of your 290's and 290x's at max oc and my 780 at max oc in some popular titles?


Not sure I understand the question entirely. Are you asking for benchmarks from controlled testing (similar hardware, multiple GPUs?) or is it anything goes?


----------



## Jack Mac

Quote:


> Originally Posted by *ihaveworms*
> 
> +25mv in 74F room. I would say If you ran OCCT on a stock cooler with the settings I was using your temps would be high as well. In games it is alright.


Maybe so, but I'm on the reference cooler and you're on water; your temperatures should be lower.


----------



## tsm106

Quote:


> Originally Posted by *stickg1*
> 
> All of the sudden after some overclocking and benchmarking my screen goes black when I get into the OS. So I went into safe mode and uninstalled all the overclocking apps. Still black, took the GPU out and booted up with iGPU and uninstalled all the drivers, screen still black with GPU installed. I feel sick in the stomach as I think I just bricked my $400 GPU.
> 
> Anyone have any helpful tips? I'm a little upset right now :/


Quote:


> Originally Posted by *stickg1*
> 
> I just reinstalled windows and the problem is solved. Thanks!


In the future, there are two things you can do, depending on how broken your desktop settings are. What were you overclocking with, btw?

1. Delete the ACE folders in users/you/appdata/local and roaming.

2. If the above doesn't work (rare), delete your user profile.
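The folder cleanup in step 1 can be sketched as follows. On an actual Windows box the folders in question are %LOCALAPPDATA%\ATI\ACE and %APPDATA%\ATI\ACE; this sketch simulates them under a temp directory so the steps can be run anywhere, and the path layout is illustrative only:

```python
import os
import shutil
import tempfile

# Simulate the two Catalyst profile locations under a scratch directory.
workdir = tempfile.mkdtemp()
ace_dirs = [os.path.join(workdir, sub, "ATI", "ACE") for sub in ("Local", "Roaming")]
for d in ace_dirs:
    os.makedirs(d)

# Step 1: delete both ACE folders; Catalyst rebuilds them with
# default settings the next time it starts.
for d in ace_dirs:
    shutil.rmtree(d)

print(all(not os.path.isdir(d) for d in ace_dirs))  # → True

shutil.rmtree(workdir)  # clean up the scratch directory
```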


----------



## SamEkinci

Quote:


> Originally Posted by *the9quad*
> 
> I am not trying to be judgmental but you removed the stock cooler, posted a video of you putting a water cooler on it in which you say several times don't tighten it down to much, and then proceed to keep tightening it yourself while admitting you are probably tightening it too much. Now it's cracked and you are blaming the card? And trying to RMA it>?
> 
> 28 minute mark.


I know you don't mean anything by it, but those two cards were different cards from the one in the video. That one I sold to a friend, and it's still working.

The two other cards had their original blocks/heatsinks; I just replaced the TIM on both of them.

Very observant of you, though.

I have enough experience with the stock 290 cooler to tell you there is no way to overtighten it, and the card worked perfectly fine for 10 minutes before deciding to break.

Sorry, btw, I just can't keep up with the pace of this thread.


----------



## stickg1

Quote:


> Originally Posted by *tsm106*
> 
> In the future, there are two things you can do depending how broken your desktop settings are. What were you overclocking with btw?
> 
> 1. delete the ACE folders in users/you/appdata/local and roaming
> 
> 2. if the above doesn't work (rare) delete your user profile.


I was using MSI AB Beta 17, with CCC to raise the power limit.

I probably should have tooled around some more before reinstalling the OS, but I wanted to switch over my OS anyway. I was just being lazy, and this was a good excuse to start fresh.


----------



## Raephen

Quote:


> Originally Posted by *ihaveworms*
> 
> To the guys with the first version of the Koolance waterblock (yes, they already released another revision that doesn't fully cover the card)
> 
> During gaming (BF4) my VRM1 temps max out around 58C. When I tried OCCT GPU Benchmark I found that the VRM1 temps raised rapidly. I stopped the benchmark when my VRM1 temps hit 80C. I just wanted to know if that was normal. When looking at the block underside, I can see how there are two separate metal pieces, one for the RAM, GPU, and the small set of VRMs. Then the other chunk is over the strip of VRMs. I am not entirely sure if water makes contact with the metal over the VRM1 strip because I didn't want to take the block apart. Does anyone know if water comes in contact with that block over the VRM1 strip? If not, that would have explained why I was hitting those temps. I am just worried that I placed the thermal pad incorrectly if water is in fact flowing over that block peice.


I wouldn't worry about it too much. Benchmarks are not representative of realistic day-to-day use (unless you mine a lot), and I had a similar experience with the 15-minute Furmark burn-in test.

I have an Aquacomputer waterblock, and yes, I highly doubt there is 'direct' water flow over the three VRMs on top; the copper over that section just has water flowing right past it. So I'd think it's not as big an issue as people are making it.

I did contact Aquacomputer about it, and they suggested slightly thicker but malleable thermal padding might help.

Every card and every block has slight manufacturing variances, I'd imagine, so it makes sense that not all supplied padding fits equally well in every instance. And with a slightly bad fit come worse thermals.

I had some Phobya 7 W/mK padding lying around, 1 mm thick, so I tried that instead of AC's supplied 0.5 mm pad. It did the trick for me.


----------



## tsm106

Quote:


> Originally Posted by *SamEkinci*
> 
> Quote:
> 
> 
> 
> Originally Posted by *the9quad*
> 
> I am not trying to be judgmental but you removed the stock cooler, posted a video of you putting a water cooler on it in which you say several times don't tighten it down to much, and then proceed to keep tightening it yourself while admitting you are probably tightening it too much. Now it's cracked and you are blaming the card? And trying to RMA it>?
> 
> 28 minute mark.
> 
> 
> 
> 
> 
> 
> 
> 
> I know you don't mean anything by it but these two cards were different cards than one that was on video. That one I sold to a friend and its still working.
> 
> The two other cards were with original blocks/heatsinks. I just replaced the TIM on both of them.
> 
> Very observant of you though.
Click to expand...

^^Dies don't crack themselves. LOL.


----------



## SamEkinci

No, they don't, especially while working fine with a cooler on them.

Anyway, I was refunded for it, so that means it wasn't my fault. I am sure they wouldn't have just taken it back if it was.


----------



## tsm106

Quote:


> Originally Posted by *SamEkinci*
> 
> No they don't especially while working fine with cooler on them.
> 
> Anyways I was refunded for it so that means it wasn't my fault. I am sure they wouldn't have just taken it back if it was my fault.


Of course you got refunded; you ABUSED Amazon's return policy. It's frauds like you that lead companies like Sapphire to their void-on-sight policies.


----------



## stickg1

Quote:


> Originally Posted by *tsm106*
> 
> Of course you got refunded; you ABUSED Amazon's return policy. It's frauds like you that lead companies like Sapphire to their void-on-sight policies.


Agreed.


----------



## givmedew

Quote:


> Originally Posted by *ihaveworms*
> 
> To the guys with the first version of the Koolance waterblock (yes, they already released another revision that doesn't fully cover the card)
> 
> During gaming (BF4) my VRM1 temps max out around 58C. When I ran the OCCT GPU benchmark I found that the VRM1 temps rose rapidly. I stopped the benchmark when my VRM1 temps hit 80C. I just wanted to know if that was normal. Looking at the block's underside, I can see that there are two separate metal pieces: one for the RAM, GPU, and the small set of VRMs, and the other chunk over the strip of VRMs. I am not entirely sure if water makes contact with the metal over the VRM1 strip, because I didn't want to take the block apart. Does anyone know if water comes in contact with that part of the block over the VRM1 strip? If not, that would explain why I was hitting those temps. I am just worried that I placed the thermal pad incorrectly if water is in fact flowing over that block piece.


The water doesn't need to run directly over that area; the block should have a fairly uniform temperature, so cooling the block anywhere cools the whole block, regardless of whether water passes over that spot. There are definitely no fins in that area, so it wouldn't matter much anyway; unrestricted water flowing over a flat surface isn't going to add much cooling. The fittings are what matter most.
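A rough back-of-the-envelope conduction estimate supports the "fairly uniform temperature" point. Every number below (the VRM heat load, the copper path length, the cross-section) is an illustrative assumption, not a measured Koolance spec; the point is only that solid copper needs just a few degrees of gradient to carry a VRM-sized heat load to the wetted section:

```python
# Rough 1-D conduction sketch: how much temperature difference does it take
# to move the VRM heat load through a solid copper section of the block?
# All values are illustrative assumptions, not Koolance specifications.

K_COPPER = 390.0   # W/(m*K), thermal conductivity of copper
Q_VRM = 30.0       # W, assumed heat dumped by the VRM strip under load
LENGTH = 0.03      # m, assumed conduction path to the water-cooled area
AREA = 0.0005      # m^2, assumed copper cross-section (e.g. 25 mm x 20 mm)

# Fourier's law for steady 1-D conduction: Q = k * A * dT / L  =>  dT = Q*L/(k*A)
delta_t = Q_VRM * LENGTH / (K_COPPER * AREA)
print(f"Temperature drop across the copper section: ~{delta_t:.1f} K")  # ~4.6 K
```

With numbers like these, the metal over the VRM strip sits only a few kelvin above the wetted part of the block, which is why pad contact matters far more than direct flow over that area.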

That said... what are the differences with the new block? Is it better, or did they just find a way to make it cheaper? That really blows either way, because I want a second one of these cards and would like my blocks to match. But I'm not going to buy another Koolance block until I can get another card, which doesn't look like it's going to happen soon.


----------



## xxmastermindxx

Quote:


> Originally Posted by *SamEkinci*
> 
> I know you don't mean anything by it, but these two cards were different cards than the one in the video. That one I sold to a friend, and it's still working.
> 
> The two other cards were with original blocks/heatsinks. I just replaced the TIM on both of them.
> 
> Very observant of you though.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have enough experience with the stock cooler of the 290 that I can tell you there is no way to overtighten it, and the card worked perfectly fine for 10 minutes before deciding to break.
> 
> Sorry btw I just can't keep up with the pace of this thread.


Unbelievable. If you don't know what the hell you're doing (you clearly don't), don't do it. The die just cracked itself, amirite? Too bad Amazon didn't reject your RMA and teach you a lesson.


----------



## VSG

"@amd_roy: Saw Gigabyte's R9 290X yesterday. It is very impressive. I want one! Ships next week and will delight. @AMDRadeon @ryanshrout #gigabyte"


__ https://twitter.com/i/web/status/410883704433565696


----------



## binormalkilla

Quote:


> Originally Posted by *SamEkinci*
> 
> No they don't especially while working fine with cooler on them.
> 
> Anyways I was refunded for it so that means it wasn't my fault. I am sure they wouldn't have just taken it back if it was my fault.


This logic checks out.


----------



## ImJJames

Quote:


> Originally Posted by *geggeg*
> 
> "@amd_roy: Saw Gigabyte's R9 290X yesterday. It is very impressive. I want one! Ships next week and will delight. @AMDRadeon @ryanshrout #gigabyte"
> 
> 
> __ https://twitter.com/i/web/status/410883704433565696


Nice, would love to see how these non-reference cards bench. I personally won't be getting one; I love my reference card lol


----------



## magicase

What performance loss should I expect if I run 290 Tri-CF at x8/x4/x4 compared to x16/x8/x8 @ 1440p?


----------



## RAFFY

Quote:


> Originally Posted by *SamEkinci*
> 
> I know you don't mean anything by it, but these two cards were different cards than the one in the video. That one I sold to a friend, and it's still working.
> 
> The two other cards were with original blocks/heatsinks. I just replaced the TIM on both of them.
> 
> Very observant of you though.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have enough experience with the stock cooler of the 290 that I can tell you there is no way to overtighten it, and the card worked perfectly fine for 10 minutes before deciding to break.
> 
> Sorry btw I just can't keep up with the pace of this thread.


Either *YOU'RE* doing something wrong or *YOUR* computer is messed up. You have destroyed *TWO R9 290's* in the last *couple of weeks*. I think it's time *you* do some digging into your computer itself and some research on how to *properly* mod your hardware. Here on OCN there is a plethora of information on virtually any topic. My advice: stay away from hard volt mods for now. You may burn your house down, and then you'll probably sue the power company for selling you power. Just my two cents, and I'm sure there are more that would agree.

Quote:


> Originally Posted by *magicase*
> 
> What performance loss should I expect if I run 290 Tri-CF at x8/x4/x4 compared to x16/x8/x8 @ 1440p?


I've been reading that there can be some bottlenecking on x4 slots on non-X79 boards due to bandwidth constraints.


----------



## RavageTheEarth

Quote:


> Originally Posted by *SamEkinci*
> 
> No they don't especially while working fine with cooler on them.
> 
> Anyways I was refunded for it so that means it wasn't my fault. I am sure they wouldn't have just taken it back if it was my fault.


I'm guessing you took off all the thermal pads and screwed all of the screws in all of the way?

Am I right??????????????????










EDIT: Never mind I see you did the red mod.


----------



## tsm106

Quote:


> Originally Posted by *RAFFY*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SamEkinci*
> 
> I know you don't mean anything by it, but these two cards were different cards than the one in the video. That one I sold to a friend, and it's still working.
> 
> The two other cards were with original blocks/heatsinks. I just replaced the TIM on both of them.
> 
> Very observant of you though.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have enough experience with the stock cooler of the 290 that I can tell you there is no way to overtighten it, and the card worked perfectly fine for 10 minutes before deciding to break.
> 
> Sorry btw I just can't keep up with the pace of this thread.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Either *YOU'RE* doing something wrong or *YOUR* computer is messed up. You have destroyed *TWO R9 290's* in the last *couple of weeks*. I think it's time *you* do some digging into your computer itself and some research on how to *properly* mod your hardware. Here on OCN there is a plethora of information on virtually any topic. My advice: stay away from hard volt mods for now. You may burn your house down, and then you'll probably sue the power company for selling you power. Just my two cents,
> 
> 
> 
> 
> 
> 
> 
> and I'm sure there are more that would agree.
> 
> Quote:
> 
> 
> 
> Originally Posted by *magicase*
> 
> What performance loss should I expect if I run 290 Tri-CF at x8/x4/x4 compared to x16/x8/x8 @ 1440p?
> 
> Click to expand...
> 
> *I've been reading that there can be some bottlenecking on x4 slots on non-X79 boards due to bandwidth constraints*.
Click to expand...

No problem on X79. The problem is he's on Ivy Bridge or Haswell without a PLX, so he's stuck splitting 16 lanes three ways.
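For rough context on what splitting lanes costs, here is a quick calculation of theoretical one-direction PCIe 3.0 bandwidth per slot width. It assumes 8 GT/s per lane with 128b/130b encoding; real-world throughput is lower, and whether a given game actually saturates an x4 link is a separate question:

```python
# Approximate one-direction PCIe 3.0 bandwidth for different lane widths.
# 8 GT/s per lane, 128b/130b line encoding, 8 bits per byte.

def pcie3_gbps(lanes: int) -> float:
    """Theoretical one-way bandwidth in GB/s for a PCIe 3.0 link."""
    return lanes * 8e9 * (128 / 130) / 8 / 1e9

for lanes in (16, 8, 4):
    print(f"x{lanes}: ~{pcie3_gbps(lanes):.1f} GB/s")  # x16 ~15.8, x8 ~7.9, x4 ~3.9
```

An x4 slot carries exactly a quarter of the x16 figure (~3.9 GB/s vs ~15.8 GB/s), which is where the reports of bottlenecking on x8/x4/x4 CrossFire setups come from.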

On the topic of cracked dies... do I understand this correctly? You FREAKING CRACKED TWO GPU DIES?

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/10640_40#post_21369319


----------



## Derpinheimer

Quote:


> Originally Posted by *Derpinheimer*
> 
> Ugh, I guess I hoped that with people saying GPU usage readings were bad it was normal, but at the same time the frame drops are highly annoying.
> 
> I'll try driver re-install first...


No go.

BF4 is super jittery; multiple times per second it has lag spikes. Another example: SimCity does it at "set intervals", it seems. I tried lowering the CPU clock, and the only difference is that the framerate drops are more dramatic [from the same height]...

Any ideas?


----------



## ImJJames

Quote:


> Originally Posted by *Derpinheimer*
> 
> No go.
> 
> BF4 is super jittery; multiple times per second it has lag spikes. Another example: SimCity does it at "set intervals", it seems. I tried lowering the CPU clock, and the only difference is that the framerate drops are more dramatic [from the same height]...
> 
> Any ideas?


Re-install windows if you have already tried DDU safe mode driver removal.


----------



## magicase

Quote:


> Originally Posted by *RAFFY*
> 
> Either *YOU'RE* doing something wrong or *YOUR* computer is messed up. You have destroyed *TWO R9 290's* in the last *couple of weeks*. I think it's time *you* do some digging into your computer itself and some research on how to *properly* mod your hardware. Here on OCN there is a plethora of information on virtually any topic. My advice: stay away from hard volt mods for now. You may burn your house down, and then you'll probably sue the power company for selling you power. Just my two cents,
> 
> 
> 
> 
> 
> 
> 
> and I'm sure there are more that would agree.
> I've been reading that there can be some bottlenecking on x4 slots on non-X79 boards due to bandwidth constraints.


So to be safe I should go with a MB that has a PLX chip to give x16/x8/x8?


----------



## tsm106

Quote:


> Originally Posted by *magicase*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RAFFY*
> 
> Either *YOU'RE* doing something wrong or *YOUR* computer is messed up. You have destroyed *TWO R9 290's* in the last *couple of weeks*. I think it's time *you* do some digging into your computer itself and some research on how to *properly* mod your hardware. Here on OCN there is a plethora of information on virtually any topic. My advice: stay away from hard volt mods for now. You may burn your house down, and then you'll probably sue the power company for selling you power. Just my two cents,
> 
> 
> 
> 
> 
> 
> 
> and I'm sure there are more that would agree.
> I've been reading that there can be some bottlenecking on x4 slots on non-X79 boards due to bandwidth constraints.
> 
> 
> 
> So to be safe I should go with a MB that has a PLX chip to give x16/x8/x8?
Click to expand...

Nobody knows how Hawaii's dual DMA engines will behave when confined behind a PLX chip, so I would stick with a proven setup: X79. You're quoting RAFFY, but he's confused on this point. It's the opposite; there are no bandwidth limitations on X79.


----------



## RavageTheEarth

Quote:


> Originally Posted by *tsm106*
> 
> On the topic of cracked dies... do I understand this correct, you FREAKING CRACKED TWO GPU DIES?
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/10640_40#post_21369319


Yes he did. I just posted on his Youtube videos warning people that he cracked his die doing this. Not sure why he didn't put out a warning after it happened considering the video has received over 8000 views....


----------



## DeadlyDNA

There have been two GPUs in this thread with cracked dies, from two different users. While I understand the logic of assuming they were broken by the user, it would be nice if, instead of beating someone up, we let them share the information. That could be useful for other folks down the road. Folks are less inclined to share their experiences when chided by everyone. Even if a card was destroyed by abuse, sharing it teaches us what limits to be aware of.


----------



## Derpinheimer

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Yes he did. I just posted on his Youtube videos warning people that he cracked his die doing this. Not sure why he didn't put out a warning after it happened considering the video has received over 8000 views....


I only see that he cracked one. Where is the original?

BTW, he said the one in that video, with zip ties, is a friend's card, and it's still working. I'd have a hard time believing 4 zip ties could exert enough force to break it.
Quote:


> Originally Posted by *ImJJames*
> 
> Re-install windows if you have already tried DDU safe mode driver removal.


Yup, did DDU :/

I'll try windows repair and otherwise I guess thats the only other option.

Thank you


----------



## tsm106

Quote:


> Originally Posted by *DeadlyDNA*
> 
> There have been two GPUs in this thread with cracked dies, from two different users. While I understand the logic of assuming they were broken by the user, it would be nice if, instead of beating someone up, we let them share the information. That could be useful for other folks down the road. Folks are less inclined to share their experiences when chided by everyone. Even if a card was destroyed by abuse, sharing it teaches us what limits to be aware of.


There was no problem till he committed fraud, which hurts us all. The first user who cracked his die didn't go about committing fraud and admitted to it, so that was fair. This Sam guy... he's a real winner: not just one but two!!! What the, I don't even...


----------



## RavageTheEarth

Ahh my bad I remembered it as he did the same mod twice and broke it both times. Wow at my old age of


Spoiler: Warning: Spoiler!



23


my memory seems to be failing me!


----------



## RavageTheEarth

Quote:


> Originally Posted by *tsm106*
> 
> There was no problem till he committed fraud which hurts us all. The first user who cracked his die didn't go about committing fraud and admitted to it, so that was fair. This Sam guy... he's a real winner, not just one but two!!! What the I don't even...


Agreed.

EDIT: Sorry for the double posting. My bad!


----------



## Derpinheimer

Quote:


> Originally Posted by *RavageTheEarth*
> 
> Ahh my bad I remembered it as he did the same mod twice and broke it both times. Wow at my old age of
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 23
> 
> 
> my memory seems to be failing me!


Where?? I went back 7 pages of his post history and only saw one mention of breaking a card. Also the RMA because the XFX 290 didn't unlock.


----------



## binormalkilla

Quote:


> Originally Posted by *Derpinheimer*
> 
> No go.
> 
> BF4 is super jittery; multiple times per second it has lag spikes. Another example: SimCity does it at "set intervals", it seems. I tried lowering the CPU clock, and the only difference is that the framerate drops are more dramatic [from the same height]...
> 
> Any ideas?


With the latest beta drivers and trifire I was getting horrible FPS drops with lots of I/O... BF4 ran horribly. Installing the WHQL drivers from Guru3D worked for me. Runs like a champ now. Well, besides all of the flickering textures. These drivers really are a joke.


----------



## chiknnwatrmln

How hard do you have to tighten a cooler to crack the die?

Geez, I over-tightened my Gelid the first time I put it on; the PCB was slightly bent as a result, but the die didn't crack. Loosening the screws solved the problem. I only do a 1/4 to 1/2 turn with a screwdriver now, once I get it finger-tight.

The heatsink needs to be in contact with the TIM, and the TIM needs to be in contact with the die. You don't need to freaking crush it; it only has to be secure lol.


----------



## RavageTheEarth

Quote:


> Originally Posted by *Derpinheimer*
> 
> Where?? I went back 7 pages of his post history and only saw one mention of breaking a card. Also the RMA because the XFX 290 didn't unlock.


No my memory failed me, remember? I think your memory is also failing you!!


----------



## SamEkinci

So aggressive..









English is not my native language so maybe I am not explaining it right?

Anyways, I am already planning to order another card from TigerDirect. And I will make sure to come here and lie about it if I happen to step on the GPU barefoot and crack it while trying to apply TIM, and while I'm at it, use an air tool to tighten the screws.

Cause you know, I am not as intelligent as you are, and ******ed, but you know me so well I don't even have to explain all that, right?

Fun fun..









Some people just need to take a chill pill. So what if I screwed up and broke the GPU? Maybe I did, maybe I didn't; there's no way for me to know. They took the card and refunded me, and that's all that matters.

Instead of hating you can just say "cool story bro" or something.. After all, we're just having a good old time here..

Edit: the card I was returning (not RMAing) broke down on me the day I posted that, so karma bites you in the arse.


----------



## tsm106

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> How hard do you have to tighten a cooler to crack the die?
> 
> Geez, I over-tightened my Gelid the first time I put it on; the PCB was slightly bent as a result, but the die didn't crack. Loosening the screws solved the problem. I only do a 1/4 to 1/2 turn with a screwdriver now, once I get it finger-tight.
> 
> The heatsink needs to be in contact with the TIM, and the TIM needs to be in contact with the die. You don't need to freaking crush it; it only has to be secure lol.


It's because he used zip ties. Zip ties flex because it's freaking NYLON. Once the CLC is mounted, if you bend a corner of the CLC or put pressure on it wrong, the zip tie will flex, allowing movement. That movement can mean serious damage.


----------



## sugarhell

Oh poor tiger direct.


----------



## tsm106

Quote:


> Originally Posted by *sugarhell*
> 
> Oh poor tiger direct.


What's wrong, he got banned from Amazon?


----------



## Derpinheimer

Quote:


> Originally Posted by *tsm106*
> 
> It's because he used zip ties. Zip ties flex because it's freaking NYLON. Once the CLC is mounted, if you bend a corner of the CLC or put pressure on it wrong, the zip tie will flex, allowing movement. That movement can mean serious damage.


Interesting theory... makes sense. But he says it happened while running in the case. Unless he's like me and pokes and prods at his computer while it's running, which... I'm an idiot for doing (BUT NO DEAD CARDS SO FAR), but I doubt he is... eh? Well, I guess I don't see where the failure could come in.
Quote:


> Originally Posted by *SamEkinci*
> 
> So aggressive..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> English is not my native language so maybe I am not explaining it right?
> 
> Anyways, I am already planning to order another card from TigerDirect. And I will make sure to come here and lie about it if I happen to step on the GPU barefoot and crack it while trying to apply TIM, and while I'm at it, use an air tool to tighten the screws.
> 
> Cause you know I am not as intelligent as you are and ******ed but you know me so well I don't even have to explain all that right?
> 
> Fun fun..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Some people just need to take a chill pill. So what if I screwed up and broke the GPU? Maybe I did, maybe I didn't; there's no way for me to know. They took the card and refunded me, and that's all that matters.
> 
> Instead of hating you can just say cool story bro or something.. After all we just having a good old time here..


Look, it's not "fair" to attack, because it can't be CERTAIN that it was user-induced damage.

However, in all likelihood it was... and when you break something, send it back, and get it replaced, that is fraud, and it hurts us. So no, it's not all that matters.

So even though no one knows, it's no surprise that people will attack.


----------



## SamEkinci

Amazon is out of stock.

To get the story right: the first card I sold. The second card I returned for VRM problems. The third card's die cracked in the middle of a benchmark, with the stock heatsink and stock values.

Also, I really believed it wasn't my fault that the die cracked. I like to think I am a semi-decent person. I really believe I didn't do anything wrong as far as the cracked GPU goes, so I returned it as defective.


----------



## Derpinheimer

Quote:


> Originally Posted by *SamEkinci*
> 
> Amazon is out of stock.
> 
> To get the story right: the first card I sold. The second card I returned for VRM problems. The third card's die cracked in the middle of a benchmark, with the stock heatsink and stock values.
> 
> Also, I really believed it wasn't my fault that the die cracked. I like to think I am a semi-decent person. I really believe I didn't do anything wrong.


Well there we go, it was the stock heatsink. And checking back, his post does appear to have said that (although not super clearly).

AFAIK, you can't overtighten these. I know the 7950 I had came screwed in to the max, and it's very clear when you retighten it that it won't go any farther, unlike custom coolers, which rely on potentially oversized springs that can definitely be overtightened.

Sorry about the troubles man. Good luck with your next card.


----------



## SamEkinci

Thank you, dude. Yeah, that's why I don't think it's my fault: stock heatsink and bracket with springs on the XFX, impossible to overtighten.

So I was pretty shocked, and decided to return it because, seriously, it's the first time a die has ever cracked on me. Like someone else was saying, it's not that easy to crack, as far as I know.


----------



## binormalkilla

It sounds to me like you over tightened the cooler with zip ties instead of springs. Springs allow for thermal expansion and provide a factor of safety. Lesson learned... use the proper hardware for the job. Cutting corners can cause serious problems.
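A toy Hooke's-law calculation makes the spring point concrete. If the die/cooler stack grows slightly as it heats up, the extra clamping force is roughly stiffness times growth; the stiffness and expansion values below are made-up illustrative numbers, not measurements of any real mount:

```python
# Toy model: change in clamping force when the cooled stack expands by DL.
# Extra force = k * DL (Hooke's law). All values are illustrative assumptions.

DL = 20e-6        # m, assumed thermal growth of the stack (20 microns)
K_SPRING = 5e3    # N/m, assumed stiffness of a soft mounting spring
K_RIGID = 5e6     # N/m, assumed stiffness of a stiff, non-compliant fastener

extra_force_spring = K_SPRING * DL  # clamping force barely changes
extra_force_rigid = K_RIGID * DL    # large extra load dumped onto the die

print(f"spring: +{extra_force_spring:.2f} N, rigid: +{extra_force_rigid:.0f} N")
```

A compliant spring absorbs the growth with a negligible force change, while a stiff mount turns the same few microns into a large extra load on the die, which is the factor of safety the post describes.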


----------



## chiknnwatrmln

Quote:


> Originally Posted by *binormalkilla*
> 
> It sounds to me like you over tightened the cooler with zip ties instead of springs. Springs allow for thermal expansion and provide a factor of safety. Lesson learned... use the proper hardware for the job. Cutting corners can cause serious problems.


He said he used the stock heatsink; I don't know how he would use zip ties on the stock cooler.


----------



## SamEkinci

Quote:


> Originally Posted by *binormalkilla*
> 
> It sounds to me like you over tightened the cooler with zip ties instead of springs. Springs allow for thermal expansion and provide a factor of safety. Lesson learned... use the proper hardware for the job. Cutting corners can cause serious problems.


What do you mean?

The card in question with the cracked die had the all-stock heatsink and screws on it.

The red mod was done to another card with zip ties; that card is fine.


----------



## magicase

Sorry about my bad quoting skills.

So PLX chip isn't a viable option for using 290 Tri CF at x16/x8/x8?


----------



## RAFFY

Quote:


> Originally Posted by *tsm106*
> 
> No problem on x79. The problem is he is on Ivy or Haswell without a PLX so he is stuck with splitting 16 lanes three ways.
> 
> On the topic of cracked dies... do I understand this correct, you FREAKING CRACKED TWO GPU DIES?
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/10640_40#post_21369319


This is why I purchased the new Black Edition!
Quote:


> Originally Posted by *magicase*
> 
> So to be safe I should go with a MB that has a PLX chip to give x16/x8/x8?


Get a nice x79 board.
Quote:


> Originally Posted by *tsm106*
> 
> Nobody knows how Hawaii's dual DMA engines will behave when confined behind a PLX chip, so I would stick with a proven setup: X79. You're quoting RAFFY, but he's confused on this point. It's the opposite; there are no bandwidth limitations on X79.


You're the confused one, sir. I was saying that non-X79 boards can have bandwidth limitations when using x4 slots. Same thing you're saying.


----------



## binormalkilla

Quote:


> Originally Posted by *SamEkinci*
> 
> What do you mean?
> 
> The card in question with the cracked die had all stock heatsink and screws on it.
> 
> The red mod was done to another card with zip-ties that card is fine.


Someone mentioned that you attached a cooler with zip ties.
So, did you use a cross pattern when tightening the screws? (Assuming you took the heatsink off and reinstalled it before the fracture.)

I've never seen a GPU crack though... And I've applied a TON of pressure with different waterblocks on various cards.


----------



## SamEkinci

Guys, sorry, I think my English is failing me here, so let me explain what happened one last time. Sure, I can laugh it off, but I like OCN and I don't like to be misunderstood.

Brand new 290, tested working perfectly fine. Removed the heatsink to put on MX-2. Put the stock sink back on, all fine and dandy.

Overclocked and ran a stress test, all fine. Removed the card to put in and test a friend's card, because I made him a custom bracket.

Then I put the card back in, turned it on, and set everything to stock to run one final test. Temperature 50C, fan at 100%, and I heard a crack. Then black screen. Tried a few diagnostic things, nothing.

Removed the heatsink, and there it is: a big crack on the top left with small fracture cracks around it.

That's what happened. No zip ties, nothing. Yes, I have made a bunch of videos of the mod and the red mod. They have no relation to the card with the cracked die. The person who was talking about that zip-tie mod had their facts wrong. Different cards.

Sorry if I still can't explain it clearly; like I said many times, English isn't my native language.


----------



## DeadlyDNA

Quote:


> Originally Posted by *SamEkinci*
> 
> Guys, sorry, I think my English is failing me here, so let me explain what happened one last time. Sure, I can laugh it off, but I like OCN and I don't like to be misunderstood.
> 
> Brand new 290, tested working perfectly fine. Removed the heatsink to put on MX-2. Put the stock sink back on, all fine and dandy.
> 
> Overclocked and ran a stress test, all fine. Removed the card to put in and test a friend's card, because I made him a custom bracket.
> 
> Then I put the card back in, turned it on, and set everything to stock to run one final test. Temperature 50C, fan at 100%, and I heard a crack. Then black screen. Tried a few diagnostic things, nothing.
> 
> Removed the heatsink, and there it is: a big crack on the top left with small fracture cracks around it.
> 
> That's what happened. No zip ties, nothing. Yes, I have made a bunch of videos of the mod and the red mod. They have no relation to the card with the cracked die. The person who was talking about that zip-tie mod had their facts wrong. Different cards.
> 
> Sorry if I still can't explain it clearly; like I said many times, English isn't my native language.


Did you happen to notice any brownish coloring in the TIM around the crack?


----------



## Derpinheimer

Sorry for the accusations. I should have checked back to be sure I knew what you had actually said before jumping in on the bash-wagon.

*No matter what happened I'm sure even if SamEkinci is lying to us, he knows what caused the issue and no amount of flaming will help anyone.*

Now, on a more productive note..

Was it OCCT/Furmark? Just curious, because I think those have been reported to kill GPUs before. And no I'm not saying that makes it your fault. I run OCCT all the time.. admittedly.


----------



## SamEkinci

It was a bit darker in color where the crack was, but I wouldn't say brown. Nothing like this has ever happened to me before, so I'm not sure what to say.

What would it mean if it were brown?

And it was Kombustor. No worries, it's the internet.


----------



## binormalkilla

Quote:


> Originally Posted by *SamEkinci*
> 
> It was a bit darker in color where the crack was but I wouldn't say brown. Nothing like this ever happened to me before so not sure what to say.
> 
> What would that mean if it were brown?
> 
> And it was the kombustor.


If the heatsink doesn't make contact with all of the die, the thermal paste will cook, turning brown. I killed a 2900xt that way.


----------



## tsm106

Quote:


> Originally Posted by *magicase*
> 
> Sorry about my bad quoting skills.
> 
> So PLX chip isn't a viable option for using 290 Tri CF at x16/x8/x8?


I would never buy into a lesser system and have to use a PLX chip to overcome that weakness.

Quote:


> Originally Posted by *magicase*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Nobody knows how Hawaii's dual DMA engines will behave when confined behind a PLX chip, so I would stick with a proven setup: X79. You're quoting RAFFY, but he's confused on this point. It's the opposite; there are no bandwidth limitations on X79.
> 
> 
> 
> You're the confused one, sir. I was saying that non-X79 boards can have bandwidth limitations when using x4 slots. Same thing you're saying.
Click to expand...

We are saying the same thing then?









Quote:


> Originally Posted by *Derpinheimer*
> 
> Sorry for the accusations. I should have checked back to be sure I knew what you had actually said before jumping in on the bash-wagon.
> 
> *No matter what happened I'm sure even if SamEkinci is lying to us, he knows what caused the issue and no amount of flaming will help anyone.*
> 
> Now, on a more productive note..
> 
> Was it OCCT/Furmark? Just curious, because I think those have been reported to kill GPUs before. And no I'm not saying that makes it your fault. I run OCCT all the time.. admittedly.


Fact is. Gpu met Sam. Gpu is now dead.


----------



## SamEkinci

Quote:


> Originally Posted by *binormalkilla*
> 
> If the heatsink doesn't make contact with all of the die, the thermal paste will cook, turning brown. I killed a 2900xt that way.


Gotcha. No, the thermal paste was fairly fresh, but MX-2 is known to work right away with no curing time, if I am not mistaken.

Hah, that's funny. Yep, I haven't met a part I can't break, but I would admit it and laugh it off if that were the case; I just shared to see if anyone had a clue why it might have happened.


----------



## RAFFY

Quote:


> Originally Posted by *tsm106*
> 
> We are saying the same thing then?


YES SIR!!!


----------



## DeadlyDNA

Quote:


> Originally Posted by *r0cawearz*
> 
> 
> 
> Here. I'm not sure if it's from the AS5 spreading and zapping the core, or if it's just from the poor heatsink installation. I've installed it a couple of times with a poor installation, and this never happened.


Quote:


> Originally Posted by *r0cawearz*
> 
> Another thing I noticed was that the thermal paste had a brownish tint before I cleaned it off (with Arctic Silver 5).
> 
> edit: in that specific area, that is.


Was wondering if this was related, in the sense of trying to keep notes. This could be more than just assumed improper heatsink installation. At any rate, sorry for your loss; hope you get it all sorted out.


----------



## SamEkinci

I am not sure what you mean by poor installation of the heatsink, though; XFX heatsinks are fairly straightforward to install. I am not saying it couldn't have been that, but I have also loosely installed heatsinks before, and all it did was make a few degrees of difference.

This card was just working fine, like a champ, a minute before. Just weird. Sure, it's human nature, but like I said, I am the first to point it out and have a good laugh if I mess up; it's all about having a good time, but this is just weird.

It was the weirdest crack; it was clearly audible while Kombustor was running. It's not something you hear coming out of a case every day.

Edit: meant to say loosely and poorly.


----------



## DeadlyDNA

Please understand, I am not implying you installed anything incorrectly. I was referring to all the other posts that assumed you did. There are always other possibilities, such as faulty hardware. This reminds me of the old AMD CPU days, when chips cracked and corners broke off; Socket A CPUs, to be exact. Most of those were due to bad heatsink installations. However, it was also fair to say the chip was a little too fragile.


----------



## tsm106

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Please understand, I am not implying you installed anything incorrectly. I was referring to all the other posts that assumed you did. There are always other possibilities, such as faulty hardware. This reminds me of the old AMD CPU days when the chips cracked and corners broke off: Socket A CPUs, to be exact. Most of those were due to bad heatsink installations. However, it is also fair to say the chip was a little too fragile.


Why are you giving him an out? The old Socket A design was obviously weak, and it has no real connection to Hawaii or Tahiti except that they are exposed-die designs. Imo it is impossible to crack a die while mounting the stock cooler properly. The cooler is wide and flat, which makes it impossible to get the angle needed to put improper forces on one corner. Using a CLC mod, it's really easy to put an imbalanced load on the die. There are other ways to do it wrong, like tightening one screw at a time on the core instead of slowly tightening in a cross pattern, which should be common knowledge to any geek or gear head.


----------



## the9quad

I can say one thing about Sam, he seems like a decent dude. I don't know how many people can take this kind of posting without coming out swinging in retaliation. Good on ya sam


----------



## DeadlyDNA

Quote:


> Originally Posted by *tsm106*
> 
> Why are you giving him an out? The old Socket A design was obviously weak, and it has no real connection to Hawaii or Tahiti except that they are exposed-die designs. Imo it is impossible to crack a die while mounting the stock cooler properly. The cooler is wide and flat, which makes it impossible to get the angle needed to put improper forces on one corner. Using a CLC mod, it's really easy to put an imbalanced load on the die. There are other ways to do it wrong, like tightening one screw at a time on the core instead of slowly tightening in a cross pattern, which should be common knowledge to any geek or gear head.


You are right, Socket A and current-gen GPUs have no real connection; I just said it reminds me of that. We have two different users who had the same thing happen. When is the last time you heard of a GPU die cracking?


----------



## tsm106

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Why are you giving him an out? The old Socket A design was obviously weak, and it has no real connection to Hawaii or Tahiti except that they are exposed-die designs. Imo it is impossible to crack a die while mounting the stock cooler properly. The cooler is wide and flat, which makes it impossible to get the angle needed to put improper forces on one corner. Using a CLC mod, it's really easy to put an imbalanced load on the die. There are other ways to do it wrong, like tightening one screw at a time on the core instead of slowly tightening in a cross pattern, which should be common knowledge to any geek or gear head.
> 
> 
> 
> You are right, Socket A and current-gen GPUs have no real connection; I just said it reminds me of that. We have two different users who had the same thing happen. When is the last time you heard of a GPU die cracking?
Click to expand...

Are you trying to make one of these the absence of evidence is not the evidence of absence arguments?

Both these guys removed their coolers. The GPUs were fine before, presumably. They died after their owners chose to remove the coolers. What is the link? You want to make it some theory that the core is weak, or some gibberish with absolutely no proof?

Let's remember this thread, and man up and take responsibility for our actions.

http://www.overclock.net/t/1444881/sapphire-after-market-coolers-and-warranty/0_40


----------



## DeadlyDNA

Quote:


> Originally Posted by *tsm106*
> 
> Are you trying to make one of these the absence of evidence is not the evidence of absence arguments?
> 
> Both these guys removed their coolers. The GPUs were fine before, presumably. They died after their owners chose to remove the coolers. What is the link? You want to make it some theory that the core is weak, or some gibberish with absolutely no proof?
> 
> Let's remember this thread, and man up and take responsibility for our actions.
> 
> http://www.overclock.net/t/1444881/sapphire-after-market-coolers-and-warranty/0_40


I am not trying to prove anything specific, especially without proof; I am just trying to keep an open mind on the subject. People automatically assumed these users improperly installed the hardware. Perfectly logical thinking, but too quick to judge. I understand people here are passionate about this stuff, and that may be why they are so vocal. These are still new video cards, and we may see more cracked GPUs.


----------



## SamEkinci

Quote:


> Originally Posted by *the9quad*
> 
> I can say one thing about Sam, he seems like a decent dude. I don't know how many people can take this kind of posting without coming out swinging in retaliation. Good on ya sam


Thanks, I try to be; it's the internet, full of people hiding behind their nicknames. I am afraid of my mom, my wife, death... so I don't really see the point of getting mad. I was just curious to see if it happened to anyone else. I guess not.

What I don't like, though, is people relating this to a video I made to help others. That's lame. There isn't even a relation: different cards, different situations.

All in all, I don't know why it happened. If the consensus is that it was user error, then I guess I messed up somewhere; I just don't see where. It's just a bracket and a few screws, and I have done it millions of times.









As far as flaming, meh.. not here for that.


----------



## Forceman

Quote:


> Originally Posted by *tsm106*
> 
> Are you trying to make one of these the absence of evidence is not the evidence of absence arguments?
> 
> Both these guys removed their coolers. The GPUs were fine before, presumably. They died after their owners chose to remove the coolers. What is the link? You want to make it some theory that the core is weak, or some gibberish with absolutely no proof?
> 
> Let's remember this thread, and man up and take responsibility for our actions.
> 
> http://www.overclock.net/t/1444881/sapphire-after-market-coolers-and-warranty/0_40


I thought one of them also said that the TIM wasn't well spread? Maybe there was bad contact on the core and the temp differential was enough to cause it to crack. Seems unlikely, but it also seems like it would be hard to crack the die with the stock cooler, considering how it mounts.


----------



## HOMECINEMA-PC

Just had a quick flick thru .........

I will be going with a proper w/block that's meant for da 290.

That crack.........

......... ain't worth the pain man, especially at $475 AU


----------



## broken pixel

3930K @ 4.9 GHz / 2133 MHz @ 1.376 V
2x XFX 290X @ 1100/1500, +75 mV, on air

Fire Strike score: 19321
Graphics score: 24406
Physics score: 17478
Combined score: 8036

http://www.3dmark.com/fs/1288596


----------



## Connolly

R9 290 flashed to the R9 290X here, running on the stock cooler with a custom profile letting the fan get up to 100% at 75 degrees. I've been using MSI AB and have it set to +100 mV on the core, but the highest core clock speed I can reach is 1130 before I start to experience some artefacts in Valley. Is that kind of low? I'm seeing lots of people reaching 1200+ and wondering how they're doing it. Using Asus GPU Tweak? Or have I just got a fairly poor overclocker?
I won't RMA it or anything if I have; it's good enough for me that it unlocked, and it's already past its 14-day easy-return guarantee anyway. I'm just curious how people are getting such high numbers.

Cheers in advance.


----------



## ImJJames

Quote:


> Originally Posted by *Connolly*
> 
> R9 290 flashed to the R9 290X here, running on the stock cooler with a custom profile letting the fan get up to 100% at 75 degrees. I've been using MSI AB and have it set to +100 mV on the core, but the highest core clock speed I can reach is 1130 before I start to experience some artefacts in Valley. Is that kind of low? I'm seeing lots of people reaching 1200+ and wondering how they're doing it. Using Asus GPU Tweak? Or have I just got a fairly poor overclocker?
> I won't RMA it or anything if I have; it's good enough for me that it unlocked, and it's already past its 14-day easy-return guarantee anyway. I'm just curious how people are getting such high numbers.
> 
> Cheers in advance.


For MSI AB at +100 mV, I start to experience very slight artifacts at anything over a 1200 MHz clock in benchmarks. But my card is probably an above-average overclocker. Try GPU Tweak and give it more volts; see if that gets rid of the artifacts.


----------



## Arizonian

Quote:


> Originally Posted by *broken pixel*
> 
> 3930K @ 4.9 GHz / 2133 MHz @ 1.376 V
> 2x XFX 290X @ 1100/1500, +75 mV, on air
> 
> Fire Strike score: 19321
> Graphics score: 24406
> Physics score: 17478
> Combined score: 8036
> 
> http://www.3dmark.com/fs/1288596


If you don't mind, adding a GPU-Z validation link with your OCN name to that post would be great. In the meantime....

Congrats - added


----------



## broken pixel

Quote:


> Originally Posted by *Arizonian*
> 
> If you don't mind, adding a GPU-Z validation link with your OCN name to that post would be great. In the meantime....
> 
> Congrats - added


Edit:

3930K @ 4.9 GHz / 2133 MHz @ 1.376 V
2x XFX 290X @ 1100/1500, +75 mV

Fire Strike score: 19321
Graphics score: 24406
Physics score: 17478
Combined score: 8036

http://www.3dmark.com/fs/1288596










Sure thing, oops!
broken pixel 2x XFX 290x/ Air


----------



## quakermaas

Got my hands on one block so far, will fit them when I get another and back plates.


----------



## FragZero

My R9 290X from Sapphire just arrived! 3 long hours before I can go home and install it in my PC!

Question: is there any VRM heatsink which uses the mounting holes?

I had some bad experiences with 4850 VRM cooling; in the end I just cut my stock heatplate in half. Same with a 7950: eventually I just glued the sinks on it, but I do not dare use permanent paste on my 500-euro card.

Another option would be a Swiftech Unisink modded to work with air cooling. I've already contacted Swiftech about it; I hope they will make one for the R9 290(X).


----------



## HOMECINEMA-PC

http://www.3dmark.com/3dm11/7648050

Your friendly neighbourhood MADMAN


----------



## Widde

Hmm, getting random black screens. It's not happening often; it comes and goes totally at random. Whenever I open a program on the desktop my screen goes black, but the PC doesn't seem to freeze. After a couple of hard resets and switching the BIOS on the card it goes away. Anyone have any clue?


----------



## conzilla

Hello all, I just got my R9 290 yesterday. I have a question: what do I need to run a VGA monitor as my second monitor?


----------



## Durvelle27

Quote:


> Originally Posted by *conzilla*
> 
> Hello all, I just got my R9 290 yesterday. I have a question: what do I need to run a VGA monitor as my second monitor?


Not possible without some kind of powered adapter, which will cost $35+: an active DisplayPort-to-VGA one.


----------



## conzilla

Ouch, time to just order a Korean 1440p monitor I guess. Thanks, and +rep for the quick answer.


----------



## Durvelle27

Quote:


> Originally Posted by *conzilla*
> 
> Ouch, time to just order a Korean 1440p monitor I guess. Thanks, and +rep for the quick answer.


You're welcome


----------



## cplifj

Hey, sorry, but what crap is this?

I have an Asus 290X and it gets nowhere near the benchmarks posted everywhere.

With Fire Strike the card even sounds like an ESPRESSO MACHINE.

Really loud boiling noises, and a performance to cry about.

The HYPE is over before it even started.


----------



## MojoW

Quote:


> Originally Posted by *Widde*
> 
> Hmm, getting random black screens. It's not happening often; it comes and goes totally at random. Whenever I open a program on the desktop my screen goes black, but the PC doesn't seem to freeze. After a couple of hard resets and switching the BIOS on the card it goes away. Anyone have any clue?


Had this happen and shrugged it off as a bad flash, because after I reflashed the same BIOS the problem was gone.


----------



## Widde

Quote:


> Originally Posted by *MojoW*
> 
> Had this happen and shrugged it off as a bad flash, because after I reflashed the same BIOS the problem was gone.


Haven't flashed anything though. Is this common? :/


----------



## MojoW

Quote:


> Originally Posted by *Widde*
> 
> Haven't flashed anything though. Is this common? :/


Then it's the Overdrive tab: if you use Overdrive and AB together you'll get black screens. Had that happen too, so I just use AB and don't enable Overdrive.


----------



## quakermaas

Quote:


> Originally Posted by *cplifj*
> 
> Hey, sorry, but what crap is this?
> 
> I have an Asus 290X and it gets nowhere near the benchmarks posted everywhere.
> 
> With Fire Strike the card even sounds like an ESPRESSO MACHINE.
> 
> Really loud boiling noises, and a performance to cry about.
> 
> The HYPE is over before it even started.


Post up a screenshot with the benchmark score, CPU-Z, and MSI AB with the monitor open, so we can see what you are complaining about.

Something like this:


----------



## Sherp

Quote:


> Originally Posted by *Derpinheimer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sherp*
> 
> Was confused how your VRM temps were so low, just noticed you chopped up your unisink.
> 
> The unisink does a great job of keeping the VRM and RAM cool, it's a shame that vapor chamber seems to be glued to it. If I could combine the black unisink with a Gelid Icy Vision, the temperatures would be fantastic.
> 
> 
> 
> How are you underclocking? Is it only possible to underclock with default bios?
> 
> With unlocked 290x bios, at stock clocks and 1500 memory I get 815Khash as well? :/
Click to expand...

I'm just using MSI AB beta 17. I believe they're both flashed with the Asus 290X BIOS; neither of them has unlocked shaders though.


----------



## Widde

Tried to disable it in CCC, but then I can't do anything in AB instead :/


----------



## Widde

Quote:


> Originally Posted by *MojoW*
> 
> Then it's the Overdrive tab: if you use Overdrive and AB together you'll get black screens. Had that happen too, so I just use AB and don't enable Overdrive.


Tried to disable it in CCC, but then I can't do anything in AB instead :/

(sorry for double posting)


----------



## MojoW

Quote:


> Originally Posted by *Widde*
> 
> Tried to disable it in CCC, but then I can't do anything in AB instead :/
> 
> (sorry for double posting)


I did a reinstall of my drivers and AB. After that, just accept Overdrive in CCC so you can see everything, but don't enable it (leave it unticked!).
Then use AB to overclock and you will see everything apply in Overdrive too.

Edit: typos


----------



## CriticalHit

Quote:


> Originally Posted by *MojoW*
> 
> I did a reinstall of my drivers and AB. After that, just accept Overdrive in CCC so you can see everything, but don't enable it (leave it unticked!).
> Then use AB to overclock and you will see everything apply in Overdrive too.
> Edit: typos


Mine does not do this... the power limit is always at 0 and has to be set manually, and it resets at every setting change in Afterburner.

Using Afterburner 3.0.0 beta 17, beta driver 9.5.


----------



## brazilianloser

Quote:


> Originally Posted by *Widde*
> 
> Tried to disable it in CCC, but then I can't do anything in AB instead :/
> 
> (sorry for double posting)


If you mean you can't change the voltage in order to get higher clocks in AB, make sure you have this enabled in AB, since it is not enabled by default.


If that is not what you meant, then my bad.


----------



## Widde

Quote:


> Originally Posted by *brazilianloser*
> 
> If you mean you can't change the voltage in order to get higher clocks in AB, make sure you have this enabled in AB, since it is not enabled by default.
> 
> 
> If that is not what you meant, then my bad.


If I disable Overdrive in CCC then I can't adjust any clocks in AB







Will try to reinstall CCC and not enable overdrive


----------



## brazilianloser

Quote:


> Originally Posted by *Widde*
> 
> If I disable Overdrive in CCC then I can't adjust any clocks in AB
> 
> 
> 
> 
> 
> 
> 
> Will try to reinstall CCC and not enable overdrive


That is strange. I would go the route you just mentioned... uninstall it all nice and clean, and start fresh, this time not enabling Overdrive.


----------



## RAFFY

To anyone who has EK products on order from FrozenCPU I was told that they should receive and ship out either Friday or Monday/Tuesday next week.


----------



## Banedox

Woot, I love this card, but I want something more, and NVIDIA has what I want. The card was at a great price too; I got it for $400. But I just sold my XFX 290 that unlocked... now I'm stuck with EK nickel/plexi blocks + a backplate... help!

So long red team, hello green team. If only the Classified cards were on Newegg yet...


----------



## crun

My Gigabyte R9 290 must be the worst overclocker here. At least I was able to unlock it.

Best it can do is 1030 core clock @ stock voltage


----------



## Durvelle27

Quote:


> Originally Posted by *crun*
> 
> My Gigabyte R9 290 must be the worst overclocker here. At least I was able to unlock it.
> 
> Best it can do is 1030 core clock @ stock voltage


Mine can't even do 1020 on stock volts, so you're not alone.


----------



## ivers

Does Afterburner still have a problem with the GPU usage readings? Mine is doing the roller coaster.


----------



## iamhollywood5

Quote:


> Originally Posted by *Durvelle27*
> 
> Mine can't even do 1020 on stock volts, so you're not alone.


It's turning out that the Hawaii series just generally kind of sucks at overclocking


----------



## Clockster

So a guy is offering me a Brand new Asus GTX780 DCII OC Edition + $100 for my Gigabyte R9 290x...Do I trade him? lol xD


----------



## Durvelle27

Quote:


> Originally Posted by *Clockster*
> 
> So a guy is offering me a Brand new Asus GTX780 DCII OC Edition + $100 for my Gigabyte R9 290x...Do I trade him? lol xD


I'm thinking of trading for a GTX 780 Classified but haven't decided yet


----------



## Neutronman

Interesting cards. I now have 2 R9 290's, a Powercolor and an MSI. The Powercolor was flashed and is now a fully functional R9 290X, the MSI was unfortunately locked, so remains vanilla. To be honest it is not a major issue as there is not a whole lot of difference in performance once the R9 290's are running at the same clocks...

Has anyone tested the various R9 290 BIOSes that are available at TechPowerUp? I am curious whether any perform better than others due to higher power limits or voltage options.

Most likely I will just flash both of my cards with the PowerColor BIOS, as this is factory overclocked to 975 MHz....

I did notice both cards throttling at anything less than 55% fan speed when the power limit was set to 150%. Not really a problem for me as I will be watercooling; still, it was interesting to see the core speed drop from 975 MHz to 830-860 MHz when temps hit 94C...

Shame on AMD for releasing a decent card with crappy stock cooler.....
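That throttle behaviour matches how Hawaii's temperature throttling is generally described: once the die reaches its temperature target (94C by default), the core clock is stepped down until temperatures stabilise, and it ramps back up when there is headroom. A rough, illustrative-only sketch of that control idea; the step size and floor clock are made-up numbers, not AMD's actual algorithm:

```python
# Illustrative-only model of Hawaii-style temperature throttling: at or above
# the 94 C target the core clock steps down toward a floor; below target it
# ramps back toward the requested clock. Step sizes are assumed for the demo.

TEMP_TARGET_C = 94

def throttle_step(core_mhz: int, temp_c: float,
                  target_mhz: int = 975, floor_mhz: int = 830,
                  step_mhz: int = 13) -> int:
    """Return the next core clock given the current die temperature."""
    if temp_c >= TEMP_TARGET_C:
        return max(floor_mhz, core_mhz - step_mhz)   # throttle down
    return min(target_mhz, core_mhz + step_mhz)      # recover toward target

clock = 975
for temp in [90, 93, 94, 95, 95, 93]:
    clock = throttle_step(clock, temp)
print(clock)  # ends below 975 after the hot samples
```

This is also why a more aggressive fan profile (or water) stops the throttling: keeping the die below the target means the clock never steps down, which is the 55%-fan threshold effect described above.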


----------



## stickg1

I'm hoping when custom cooler cards arrive someone will buy one with intentions of water cooling but will want to trade for a reference PCB. I'm not crazy about the after market air coolers available (gelid and ac, etc). Not that the performance is bad, just not aesthetically pleasing to my eyes.


----------



## escapedmonk

I think it's more to do with bad drivers and/or a bad BIOS atm. Sometimes I can mine for days at 920/1500 or 1000/1400 on stock volts; other times I lock up or black-screen straight away, or after an hour or so. The same happens with benching.

Sometimes I get a BSOD from atikmdag; sometimes I get corrupted display drivers and have to run DDU in safe mode and fresh-install the drivers.

I'm currently mining downclocked to 800/1000 with -100 mV, getting 650 kH/s, and I'm still able to use the PC and watch videos etc. It's been running for 3 days now with no errors.

Running DDU and a fresh install of drivers fixes most of my problems, for a while at least.


----------



## Neutronman

Quote:


> Originally Posted by *Durvelle27*
> 
> I'm thinking of trading for a GTX 780 Classified but haven't decided yet


Actually, I just sold my EVGA Classified 780 and bought a new R9 290 plus an EK waterblock for the same price....

Having used both, the Classified is definitely the faster card when both are running at stock. I was easily able to overclock the Classified, making it even faster (+140 MHz on the core); however, once the AMD card hits 1100 MHz plus it begins to catch up, and in my experience it is actually faster at 2560x1440 in BF4.

I only play BF4 so went with AMD for Mantle. After using both cards I can say that I prefer the IQ from the R9 290 in this game as there is less texture shimmering...

Before the GTX 780 I had a pair of GTX 670's in SLI and before this I had an HD 5970 (which I did not like)....

I did not really expect to like the R9 290, however I like it so much I bought another for CF and will look at one of the new 4K Dell monitors in 2014....

EDIT: Has anyone determined if the bios switch on the R9 290 actually does anything?


----------



## tsm106

Quote:


> EDIT: Has anyone determined if the bios switch on the R9 290 actually does anything?


It's the same bios on both switches iirc.


----------



## Exidous

Anyone been able to find any of the R9 280X and up cards recently at prices that shouldn't be considered illegal? I'm wanting to buy a couple and am striking out badly. TigerDirect had an MSI limited to one per customer at a normal price, but they won't ship to APO.... :-(


----------



## pkrexer

So I just got my Asus 290X and have used it for a day... but with the prices I see these going for on eBay, I'm really debating selling it and getting a 780 Ti while demand is still high.


----------



## TamaDrumz76

Quote:


> Originally Posted by *pkrexer*
> 
> So I just got my Asus 290X and have used it for a day... but with the prices I see these going for on eBay, I'm really debating selling it and getting a 780 Ti while demand is still high.


Haha, I'm in the same boat. I just got a PowerColor 290 last night (didn't check to see if it can unlock yet), but 780Ti was one of my strong options... Selling the 290 puts me in close grasp of grabbing a 780Ti...

What really grinds me is that the XSPC 290/290x blocks still aren't available anywhere in the US. Meanwhile, the 780Ti block is starting to show up...


----------



## Neutronman

Quote:


> Originally Posted by *TamaDrumz76*
> 
> Haha, I'm in the same boat. I just got a PowerColor 290 last night (didn't check to see if it can unlock yet), but 780Ti was one of my strong options... Selling the 290 puts me in close grasp of grabbing a 780Ti...
> 
> What really grinds me is that the XSPC 290/290x blocks still aren't available anywhere in the US. Meanwhile, the 780Ti block is starting to show up...


What's so good about this block? I have two EK blocks on my 290's and these are excellent.


----------



## Exidous

Wtb 280x or 290

At original retail price. Not this 33% more crap.


----------



## TamaDrumz76

Quote:


> Originally Posted by *Neutronman*
> 
> What's so good about this block? I have two EK blocks on my 290's and these are excellent.


Heh, a couple things (for me anyway).

1) Main one, the multi-port connector.

2) I'm anal... All my blocks, rads, and such are XSPC and based on the newer designs. I particularly like their products (with the exception of the design flaws on the 7970 block - which aren't present on the 290x block).

I've got nothing against the new EK blocks. I've had EK in the past without issue (the acetal ones anyway). The CSQ blocks were extremely ugly though (I hated the circles design).

Besides, the EK blocks aren't in stock anywhere either with the recent demand for these cards. I gotta wait either way, so I'd rather get what I want.


----------



## Derpinheimer

Quote:


> Originally Posted by *Derpinheimer*
> 
> Can anyone help me understand if this is normal?
> 
> 
> 
> The framerate does drop every time the usage goes down..


Ok, to expand on this, here is BF4.



Any ideas? I re-installed drivers with DDU safe mode and also made sure windows had no errors...


----------



## sugarhell

Read the graph. It's easy to understand.

Yellow = CPU


----------



## jerrolds

Are you guys with multiple GPUs able to set settings separately somehow? I'm using GPU Tweak (which I've always hated) with flashed Sapphire/XFX cards, and GT only "recognizes" the bottom card when not in Crossfire mode (this has been an issue since last year).

I'd like to have different settings for each card depending on what I'm using them for.

I tried AB beta 17 and voltage tweaking is greyed out (with voltage control unlocked and enabled). Are there other settings I need to mess around with?


----------



## Derpinheimer

Quote:


> Originally Posted by *sugarhell*
> 
> Read the graph. Its easy to understand.
> 
> Yellow=cpu


Really?
The question was not "what do the squiggly lines mean?!"

It's: how do I fix it?


----------



## UNOE

Quote:


> Originally Posted by *Exidous*
> 
> Wtb 280x or 290
> 
> At original retail price. Not this 33% more crap.


Good luck with that


----------



## quakermaas

Quote:


> Originally Posted by *Derpinheimer*
> 
> Really?
> 
> The question was not "what do the squiggly lines mean?!"
> 
> It's: how do I fix it?


What is wrong with it, what do you see as a problem ?


----------



## Derpinheimer

Yes, every time it spikes the game becomes choppy. And since it does it so frequently it makes the game very jumpy!


----------



## jerrolds

Looks like the CPU is doing something funky every n seconds - have you taken a look at your processes? Is this in Crossfire?


----------



## Derpinheimer

I found the solution - [Disable] Windows 7 Core Parking!

Unbelievable! Perfectly smooth now: 

So yes, the CPU was doing something funky, haha. No Crossfire though.
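For anyone else hitting this, core parking can usually be disabled without third-party tools via `powercfg` in an elevated Windows command prompt. A sketch, assuming the `CPMINCORES` setting alias ("Processor performance core parking min cores"); verify the commands against your own power scheme before relying on them:

```shell
:: Unhide the "core parking min cores" setting in the power plan UI,
:: force 100% of cores to stay unparked on AC power, then re-apply the scheme.
powercfg -attributes SUB_PROCESSOR CPMINCORES -ATTRIB_HIDE
powercfg -setacvalueindex SCHEME_CURRENT SUB_PROCESSOR CPMINCORES 100
powercfg -setactive SCHEME_CURRENT
```

Setting the minimum unparked cores to 100% has the same effect as the registry edits floating around, and it is easy to revert by setting the value back to its default.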


----------



## Tennobanzai

I have a Sapphire R9 290 coming in today and I wanted to know how can I make it run quieter/cooler? Would undervolting do the trick?


----------



## iamhollywood5

Quote:


> Originally Posted by *Tennobanzai*
> 
> I have a Sapphire R9 290 coming in today and I wanted to know how can I make it run quieter/cooler? Would undervolting do the trick?


It would certainly help.


----------



## Forceman

Quote:


> Originally Posted by *Derpinheimer*
> 
> I found the solution - [Disable] Windows 7 Core Parking!
> 
> Unbelievable! Perfectly smooth now:
> 
> So yes the CPU was doing something funky haha. No crossfire though.


I wonder what the root cause of the core-parking thing is, since it seems to help some people while others (like myself) have no issue with it at default. I wonder if some process or something is causing the issue?


----------



## Sazz

So I just got the R9 290 that I replaced my 290X with (sold the 290X for $550 to a friend, bought my 290 for $400: saved $150 xD). And I think the ASIC rating for these R9 290/290Xs is really not that important: my 290X, which had a 69.9% ASIC rating and Elpida memory, OC'd to 1125/1425 at stock voltage; my friend's R9 290 (same XFX) has a 71.4% ASIC rating and OCs to 1120/1500 at stock voltage (Elpida memory as well); and the one I have now has a 76% ASIC and only OCs to 1075/1450 at stock voltage (also Elpida).

Even the memory brand doesn't do much: my very first 290X, which had a defective DisplayPort that I returned for replacement, had a 70.4% ASIC with Hynix memory and OC'd to 1100/1360 at stock voltage. I think an average 290/290X can get up to +100 MHz on the core at stock voltage, and memory OCs land around 1400-1450 MHz on average.

Sad to say my 290 didn't unlock to a 290X, but it doesn't really matter to me; that extra 4-7% performance is a very small difference, and for $150 saved it's a no-brainer.

Imma install my waterblock on Monday (got to do an overhaul cleaning of my rig xD) and see what my max OC is with AB and the Asus BIOS using GPU Tweak (I generally got a higher OC with GPU Tweak when I used it with the 290Xs I have owned).


----------



## Derpinheimer

All of your cards OC'd pretty well on stock voltage.

Mine has a 70.6% ASIC, and the highest I can go is 1080 MHz at stock voltage in OCCT without error.


----------



## BradleyW

My 2 R9 290x's arrived in the mail today! W00t!


----------



## broken pixel

Quote:


> Originally Posted by *iamhollywood5*
> 
> It would certainly help.


Don't overclock it, and try undervolting.


----------



## broken pixel

Quote:


> Originally Posted by *Forceman*
> 
> I wonder what the root cause of the core unparking thing is. Since it seems to help some people, but others (like myself) have no issue with it at default. Wonder if some process or something is causing the issue?


If I enable the overlay in the BF4 user cfg file, even with my CPU not parked, it makes no difference; I can still see the massive CPU spikes, and they seem worse now than before the patch.

It's a game problem. DICE needs to get rolling, and EA needs to stop releasing beta games at retail price.

The beta never had the horrid CPU spikes.


----------



## pkrexer

I was a little bummed to find that my Asus 290X had Elpida memory. Then I started pushing the memory clock and it maxed out Afterburner's 1625 MHz limit... so I guess not all Elpida is crap.







I have still yet to find the limit, but my benchmarks actually get worse after 1500 MHz. I've heard of this happening to other people as well.

ASIC = 72%


----------



## broken pixel

Quote:


> Originally Posted by *Sazz*
> 
> So I just got the R9 290 that I replaced my 290X with (sold the 290X for $550 to a friend, bought my 290 for $400: saved $150 xD). And I think the ASIC rating for these R9 290/290Xs is really not that important: my 290X, which had a 69.9% ASIC rating and Elpida memory, OC'd to 1125/1425 at stock voltage; my friend's R9 290 (same XFX) has a 71.4% ASIC rating and OCs to 1120/1500 at stock voltage (Elpida memory as well); and the one I have now has a 76% ASIC and only OCs to 1075/1450 at stock voltage (also Elpida).
> 
> Even the memory brand doesn't do much: my very first 290X, which had a defective DisplayPort that I returned for replacement, had a 70.4% ASIC with Hynix memory and OC'd to 1100/1360 at stock voltage. I think an average 290/290X can get up to +100 MHz on the core at stock voltage, and memory OCs land around 1400-1450 MHz on average.
> 
> Sad to say my 290 didn't unlock to a 290X, but it doesn't really matter to me; that extra 4-7% performance is a very small difference, and for $150 saved it's a no-brainer.
> 
> Imma install my waterblock on Monday (got to do an overhaul cleaning of my rig xD) and see what my max OC is with AB and the Asus BIOS using GPU Tweak (I generally got a higher OC with GPU Tweak when I used it with the 290Xs I have owned).


Good ASIC scores scream watercooling; bad and medium ASIC scores like voltage and run better on air.


----------



## quakermaas

Quote:


> Originally Posted by *pkrexer*
> 
> I was a little bummed to find that my Asus 290X had Elpida memory. Then I started pushing the memory clock and it maxed out Afterburner's 1625 MHz limit... so I guess not all Elpida is crap.
> 
> 
> 
> 
> 
> 
> 
> I have still yet to find the limit, but my benchmarks actually get worse after 1500 MHz. I've heard of this happening to other people as well.
> 
> ASIC = 72%


1500 is probably the max; past that, GDDR5 error detection and retransmission (EDC) kicks in and causes performance to degrade.
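Because of that error-detect/retransmit behaviour, the fastest memory clock is the one with the best benchmark score, not the highest clock that merely appears stable. A tiny hypothetical helper for picking the knee out of a manual sweep (the clocks and scores in the example are made up):

```python
# Given {memory_clock_mhz: benchmark_score} from a manual sweep, return the
# clock that actually scored best. Past the EDC knee, scores drop even though
# the card still "passes" without visible artifacts.

def best_memory_clock(results: dict) -> int:
    """Return the memory clock with the highest benchmark score."""
    return max(results, key=results.get)

# Made-up sweep shaped like the behaviour reported above:
sweep = {1250: 9400, 1375: 9550, 1500: 9700, 1625: 9520}
print(best_memory_clock(sweep))  # → 1500
```

In practice this just formalises the advice in the thread: benchmark at each memory step and keep the clock that maximises the score, rather than chasing the highest clock that doesn't artifact.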


----------



## Durvelle27

Quote:


> Originally Posted by *broken pixel*
> 
> Good ASIC scores scream watercooling, bad and medium ASIC scores like voltage and run better on Air.


You have this backwards bud

Lower ASIC requires more voltage and is better for water coolers

Higher ASIC requires lower voltages and is mostly good for Air coolers or water


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Durvelle27*
> 
> You have this backwards bud
> 
> Lower ASIC requires more voltage and is better for water coolers
> 
> Higher ASIC requires lower voltages and is mostly good for Air coolers or water


This. I've heard optimal for water cooling is 70-75, which is why I'm so interested in seeing what my card with ASIC 70.3% will do underwater.


----------



## ZealotKi11er

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> This. I've heard optimal for water cooling is 70-75, which is why I'm so interested in seeing what my card with ASIC 70.3% will do underwater.


My card, 77%, does 1150 with stock voltage. The higher the ASIC, the lower the stock voltage it runs at. If you can get a high OC with that, you are golden.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *ZealotKi11er*
> 
> My card, 77%, does 1150 with stock voltage. The higher the ASIC, the lower the stock voltage it runs at. If you can get a high OC with that, you are golden.


On stock the highest I can get is just under 1100 MHz... 1100 is stable but it artifacts past 60c on air.

Now with GPUTweak maxed I can hold 1200 MHz but get slight artifacts from the temps again.

When my new build is complete (Going all-out, 600mm of rad that's 64mm thick with high performance SP120's) I hope that the temps (mostly VRMs) will be substantially lower, allowing me to flash the PT1 BIOS and crank the voltage for benches. I hope to hit at least 1225 MHz, but I guess I'll see in a few weeks.

Really all I want is to be able to game at 1200 MHz without my VRM's breaking 90c, anything after that is a bonus.









Now all I need is for EK to give FrozenCPU some Acetal blocks so I can buy one...


----------



## Korayyy

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Now all I need is for EK to give FrozenCPU some Acetal blocks so I can buy one...


I hear ya there...


----------



## cplifj

So, getting a Firestrike score of around 9245, repeatable.

Stock Asus R9 290X-G-4GD5, screaming like hell with coil whine when doing the 3DMark tests.

Memory size gets reported wrong everywhere the software has to actually look it up rather than read it from a BIOS string. Asus GPU Tweak can't even read memory usage; only GPU-Z can read graphics memory usage.

On an Asus P8Z77-M board with [email protected] and 16GB Corsair Dominator Platinum, powered by a Corsair HX650W.

It's all a bit too little performance from that card. Plus the whine; who can stand that? My cat sure as hell can't.

AND I am wondering how many people BOUGHT this card in an UNsealed box??

Makes me wonder how many guys already tried it before I bought it off the shelf, and might have broken something...

Or maybe AMD made a good first batch, among which are all the press models, and a second batch where they already started saving money, like on other memory, and those cards will never get the same performance?


----------



## jamaican voodoo

Can I join? Just got my R9 290 with EK waterblock yesterday... getting two more in Feb...


----------



## Asrock Extreme7

Well, just finished my watercooling loop. Ran Valley: VRM1 32C, VRM2 36C, max core temp 40C with the EK block.
1st time watercooling. My XFX 290 also unlocked; my chip no. is 215-0852000.


----------



## stickg1

Quote:


> Originally Posted by *cplifj*
> 
> So, getting a Firestrike score of around 9245, repeatable.
> 
> Stock Asus R9 290X-G-4GD5, screaming like hell with coil whine when doing the 3DMark tests.
> 
> Memory size gets reported wrong everywhere the software has to actually look it up rather than read it from a BIOS string. Asus GPU Tweak can't even read memory usage; only GPU-Z can read graphics memory usage.
> 
> On an Asus P8Z77-M board with [email protected] and 16GB Corsair Dominator Platinum, powered by a Corsair HX650W.
> 
> It's all a bit too little performance from that card. Plus the whine; who can stand that? My cat sure as hell can't.
> 
> AND I am wondering how many people BOUGHT this card in an UNsealed box??
> 
> Makes me wonder how many guys already tried it before I bought it off the shelf, and might have broken something...
> 
> Or maybe AMD made a good first batch, among which are all the press models, and a second batch where they already started saving money, like on other memory, and those cards will never get the same performance?


My vanilla 290 scores better than that. I would venture to say your card is no good. But I'm no expert. Which drivers and OS?


----------



## VSG

So it looks like my build will not be completed by this year, and may well be delayed till Mid-January at this rate. I am hesitating on selling my GPUs since they overclocked so well, but I am putting up both cards and EK blocks/backplates on craigslist for now- let's see. I might as well sell them now and buy them (or non-reference versions) back in January.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *geggeg*
> 
> So it looks like my build will not be completed by this year, and may well be delayed till Mid-January at this rate. I am hesitating on selling my GPUs since they overclocked so well, but I am putting up both cards and EK blocks/backplates on craigslist for now- let's see. I might as well sell them now and buy them (or non-reference versions) back in January.


If you don't mind me asking, why the delay? Seems kinda counterintuitive to lose money and get GPUs that don't OC as well.


----------



## Jack Mac

Is coil whine really that common? My sapphire doesn't have it, however the fan does whine a bit at high RPMs.


----------



## VSG

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> If you don't mind me asking, why the delay? Seems kinda counterintuitive to lose money and get GPUs that don't OC as well.


My motherboard requires an RMA and ASUS's response made it seem like it would be a while. Also, I will very likely have to send back my custom reservoir and that also will be another delay. Hopefully I won't lose much given the recent demand for these cards, so let's see.

I am considering eBay also but only as a last resort. Too many sharks over there who could claim they never got the card(s).


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Jack Mac*
> 
> Is coil whine really that common? My sapphire doesn't have it, however the fan does whine a bit at high RPMs.


A decent amount of people have it.

My card has more buzz than whine, when under load it makes a noise like an electric razor. It's gotten a bit quieter over time but is still audible.
Quote:


> Originally Posted by *geggeg*
> 
> My motherboard requires an RMA and ASUS's response made it seem like it would be a while. Also, I will very likely have to send back my custom reservoir and that also will be another delay. Hopefully I won't lose much given the recent demand for these cards, so let's see.


That sucks man, RMA processes are like going to the dentist, even when they go well they're still a hassle.


----------



## cplifj

Quote:


> Originally Posted by *stickg1*
> 
> My vanilla 290 scores better than that. I would venture to say your card is no good. But I'm no expert. Which drivers and OS?


It's running on Win 8.1 x64. I have done a thorough driver cleaning, coming from combined use of a GTX 660 Ti / HD 4000.

But it doesn't make any difference at all. I have tried the newest beta drivers; in short, I have tried pretty much everything I can find and it doesn't make a difference.

The card is indeed performing like a 7-series; I wonder what I will find if I decide to pop the hood and have a look.

Here's a screenshot.

Greetings


----------



## BradleyW

Quote:


> Originally Posted by *cplifj*
> 
> It's running on Win 8.1 x64. I have done a thorough driver cleaning, coming from combined use of a GTX 660 Ti / HD 4000.
> 
> But it doesn't make any difference at all. I have tried the newest beta drivers; in short, I have tried pretty much everything I can find and it doesn't make a difference.
> 
> The card is indeed performing like a 7-series; I wonder what I will find if I decide to pop the hood and have a look.
> 
> Here's a screenshot.
> 
> Greetings


I'd run a game in windowed mode with GPU-Z open to make sure the PCIe slot goes to full speed and bandwidth. With power saving it should read x16 1.1, not x1 1.1.
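GPU-Z is the easy check on Windows; on Linux the same link state is exposed in sysfs via `current_link_speed` and `current_link_width` under the card's PCI device. A small sketch of interpreting those strings (the sample values are illustrative, not read from a real system):

```python
def link_ok(speed: str, width: str,
            want_speed: str = "8 GT/s", want_width: str = "x16") -> bool:
    """Compare sysfs-style PCIe link strings against the expected
    full-speed state (Gen3 = 8 GT/s, Gen2 = 5 GT/s, Gen1 = 2.5 GT/s)."""
    return speed.strip() == want_speed and width.strip() == want_width

# These strings are what /sys/bus/pci/devices/<bdf>/current_link_speed
# and .../current_link_width contain on Linux (illustrative values):
print(link_ok("8 GT/s", "x16"))    # full Gen3 x16 link under load
print(link_ok("2.5 GT/s", "x16"))  # idle power-saving state, not a fault
```

The key point either way: sample the link *under load*, since a healthy card legitimately downshifts to Gen1 speeds at idle.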


----------



## Arizonian

Quote:


> Originally Posted by *jamaican voodoo*
> 
> can i join just got my R9 290 with ek waterblock yesterday...getting two more in feb...
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## cplifj

I'm sorry, let me reshoot that for you:


----------



## Derpinheimer

When people say their card is stable at such-and-such clocks, I suppose that just means running a game or bench?

Because often OCCT's error check will show that it is not in fact stable, or at least not error-free.


----------



## stickg1

Quote:


> Originally Posted by *cplifj*
> 
> i'm sorry , let me reshoot that for you:


I can't see anything in that pic. I can't see anything in a full 1920x1080 pic on this site, let alone a 5760x1080 pic. Either upload it to an image hosting site or show us a close-up of the GPU-Z window confirming that your card is running at x16 3.0 under load.


----------



## cplifj

Just click the image, then right under the picture choose to view the original.

But if you're on a phone: it is running at x16 3.0 PCIe speeds.

So everything should be good, but it's rather disappointing in numbers. Sure, it works and things are smooth, but that's not all marketing said.

I need to see the numbers that get advertised and made me decide to buy this thing. Others seem to get them, so...


----------



## Derpinheimer

I despise the Overclock.net image system. It's so awful on phones...


----------



## Arizonian

Quote:


> Originally Posted by *Derpinheimer*
> 
> When people are saying that their card is stable at such and such clocks, I suppose that just means running a game or bench?
> 
> Because often times OCCT error check will show you that it is not in fact stable, or at least its not error free.


Well, it's safe to say that after you've gamed for many hours without crashing and without artifacts, I'd consider that stable, especially if it holds across many different games. Maybe it's safer to say so confidently only after months without crashing, but I've found that if I can put in 5 hours of gaming, I don't experience instability afterward.

Benching is much more time-limited unless one runs loops for many hours; looping a bench for a long period is another way to check an overclock for stability.

So that's what most OCN'ers judge stability by.

_Suicide bench runs are a completely different creature: they're time-limited, can show artifacts, and can still complete the bench, so they don't prove true stability._

If one is not experiencing artifacts, crashes, or lock-ups at all, does it really matter if OCCT proves you're in fact not stable? And if it does, why would someone lower an overclock that runs games perfectly fine and delivers the benefit of additional FPS?
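The looped-bench approach can be sketched as a toy harness. Here `score_run()` is a hypothetical stand-in for launching a real benchmark (the scores are simulated with noise); the idea is simply that a stable overclock produces a tight cluster of scores across repeats, while instability shows up as outliers or a crash:

```python
# Toy stability check: loop a "benchmark" N times and flag runs whose
# score deviates noticeably from the mean. score_run() is a stand-in
# for launching your actual bench and parsing its score.
import random

random.seed(1)  # deterministic for the example

def score_run():
    # Pretend benchmark: a stable card scores ~9500 with small noise.
    return 9500 + random.uniform(-50, 50)

scores = [score_run() for _ in range(5)]
mean = sum(scores) / len(scores)
outliers = [s for s in scores if abs(s - mean) / mean > 0.03]  # >3% off

print("stable" if not outliers else "suspect")
```

A real harness would wrap the bench launch in error handling, since the failure mode that matters most (a driver crash) never returns a score at all.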


----------



## Durvelle27

Ok need some help guys. Keep getting weird artifacts at stock and black screens.


----------



## Derpinheimer

Quote:


> Originally Posted by *Durvelle27*
> 
> Ok need some help guys. Keep getting weird artifacts at stock and black screens.


Quote:


> Originally Posted by *Arizonian*
> 
> Well it's safe to say after you've gamed at some point for many hours without crashing and don't experience artifacts I'd consider that stable. Especially if it happens across many different games. Maybe it's safer to say this more confidently after months without crashing it's stable but I've found if I can put in 5 hrs of gaming I don't ever experience instability afterward.
> 
> Benching is much less time limited unless one runs loops for many hours. It's also another way to check for stability on an over clock running a bench looped for long period of time.
> 
> So that's what most OCN'ers are judging stability by.
> 
> _Suicide bench runs are a completely different creature, as it's time limited and can actually show artifacts but can still complete a bench but not true stability._
> 
> If one is not experiencing artifacts, crashing, or lock ups at all does it really matter if OCCT proves your in fact not stable? And if it does, Why would someone lower their OC when gaming your running it perfectly fine and getting the benefits of additional FPS from the over clock?


I follow you 100%. I guess I should stop caring about what OCCT says








Quote:


> Originally Posted by *Durvelle27*
> 
> Ok need some help guys. Keep getting weird artifacts at stock and black screens.


What kind of artifacts and in what?

I get black screen with unstable memory... artifacts with unstable core :?


----------



## Sazz

Quote:


> Originally Posted by *Derpinheimer*
> 
> When people are saying that their card is stable at such and such clocks, I suppose that just means running a game or bench?
> 
> Because often times OCCT error check will show you that it is not in fact stable, or at least its not error free.


Yeah, most people don't use OCCT for error checking. I've been using it since I don't know when, LOL.

And Arizonian, can you update mine and downgrade it from 290X to R9 290? Sold my 290X already and got a 290 instead xD, watercooled the same as well.


----------



## Falkentyne

Quote:


> Originally Posted by *Durvelle27*
> 
> You have this backwards bud
> 
> Lower ASIC requires more voltage and is better for water coolers
> 
> Higher ASIC requires lower voltages and is mostly good for Air coolers or water


Wrong

How come my 77.9% ASIC card won't go higher than 1150 MHz at +100 mV?


----------



## Durvelle27

Quote:


> Originally Posted by *Derpinheimer*
> 
> I follow you 100%. I guess I should stop caring about what OCCT says
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What kind of artifacts and in what?
> 
> I get black screen with unstable memory... artifacts with unstable core :?


I'm not OC'd, just stock 947/1250. I was watching YouTube videos and then some white dots went vertically down the screen, then it froze, and then it restarted.


----------



## stickg1

Quote:


> Originally Posted by *Falkentyne*
> 
> Wrong
> 
> How come my 77.9% asic card wont go higher than 1150 MHz at +100 mv?


Because 1150 is a good overclock for an air cooled card. When you check your ASIC with GPU-Z, it tells you exactly what said ASIC value is supposed to mean. Check your ASIC again, read the entire window, and then ask yourself if you really want to argue this point because the definition is staring you directly in the face.

Quote:


> Originally Posted by *Durvelle27*
> 
> I'm not OC'd just stock 947/1250. watching YouTube videos and then it had some white dots vertically go down the screen, then freeze, and then restarted.


This happened to me last night. Your driver is corrupt; you need to uninstall the drivers and clean up with CCleaner, and someone said something about DDU, which IDK what that is. I fixed mine by just reinstalling Windows; since I reformat and back up to other drives often, that wasn't a huge task for me. But this is fixable without a reformat: if you go back a few pages (depending on how many posts per page you display in settings, I use 30), a few people explained how to do it when it happened to me last night. Good luck!


----------



## Derpinheimer

Quote:


> Originally Posted by *Durvelle27*
> 
> I'm not OC'd just stock 947/1250. watching YouTube videos and then it had some white dots vertically go down the screen, then freeze, and then restarted.


Could be a downclocking issue. Try disabling hardware acceleration in Flash, and also try setting force constant voltage in MSI AB.
Quote:


> Originally Posted by *Falkentyne*
> 
> Wrong
> 
> How come my 77.9% asic card wont go higher than 1150 MHz at +100 mv?


It's called a TREND.

Come on.. its not like all asians have sma...

Right?


----------



## ImJJames

Quick question, does performance mode in Windows increase score in 3dmark compared to balance?


----------



## skupples

Quote:


> Originally Posted by *Jack Mac*
> 
> Is coil whine really that common? My sapphire doesn't have it, however the fan does whine a bit at high RPMs.


Some people can't hear it, even if it is present. The older you get, the less likely it is that you will hear that ultra high frequency.


----------



## Jack Mac

Quote:


> Originally Posted by *skupples*
> 
> Some people can't hear it, even if it is present. The older you get, the less likely it is that you will hear that ultra high frequency.


My 670 had coil whine and so did my TX750 v2, I can hear it. My Sapphire doesn't have it.


----------



## The Storm

Just installed my 2 Sapphire 290's today. Here is a screenie, can I be added to the list?

Sapphire R9-290, Stock cooling


----------



## Derpinheimer

Quote:


> Originally Posted by *ImJJames*
> 
> Quick question, does performance mode in Windows increase score in 3dmark compared to balance?


I imagine it should not, but it could yield some minor gains...


----------



## jerrolds

I'm thinking of giving my 2nd 290X to my gf; I'll be setting it up to mine when idle, at 900/1500 on default voltages. Wondering if her Corsair TX550M can handle it.

She has an i5 [email protected], an Antec 620... one HDD.

I'm thinking it'll be close... probably 450W if she decides to game (I doubt her CPU will hit 100%), probably ~400W during mining only, and a bit more during desktop work.

What do you guys think?
Quote:


> Originally Posted by *Haldi*
> 
> Here some tests i've done with a EMR-1 wattage meter. All peak wattage.
> 
> i7-3930K @4.3ghz R9-290 (non X) W7
> 13.250.18.0 aka 13.11 B9.2
> 
> Unigine Heaven 4.0 1920*1080 Idle 185W (Peak Scene 26)
> 975mhz/1250/Stock = 450W 50.2FPS
> 1000mhz/1250/Stock = 455W 51.0FPS
> 1000mhz/1400/Stock = 455W 51.4FPS
> 1075mhz/1400/Stock = 465W 53.8FPS
> 
> Unigine Heaven 4.0 6048*1080 Idle 185W (Peak Scene 26)
> 1000mhz/1250/Stock = 455W 20.2FPS
> 1000mhz/1400/Stock = 455W 20.4FPS
> 1075mhz/1400/Stock = 465W 21.4FPS
> 
> Unigine Valley 1920*1080 Idle 185W (Peak Scene 5)
> 1000mhz/1250/Stock = 455W 57.5FPS
> 1000mhz/1400/Stock = 455W 58.4FPS
> 1075mhz/1400/Stock = 465W 61.1FPS
> 
> Unigine Valley 6048*1080 Idle 190W (Peak Scene 5)
> 1000mhz/1250/Stock = 460W 22.2FPS
> 1000mhz/1400/Stock = 460W 22.4FPS
> 1075mhz/1400/Stock = 475W 23.5FPS
> 
> Metro LL 1920*1080 Idle 185W (Peak) 2 of 2 Runs
> Very High: SSAA On
> 1000mhz/1250/Stock = 475W 42.9FPS
> 1000mhz/1400/Stock = 475W 43.4FPS
> 1075mhz/1400/Stock = 475W 45.2FPS
> 
> Metro LL 6048*1080 29°C Idle 260W (Peak) 2 of 2 Runs
> High: SSAA off
> 1000mhz/1250/Stock = 460W 33.7FPS
> 1000mhz/1400/Stock = 460W 33.8FPS
> 1075mhz/1400/Stock = 465W 34.9FPS
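On the TX550M question, a rough additive budget is easy to sanity-check. Every per-component wattage below is my assumption for illustration, not a measurement (note Haldi's whole-system peaks above include a 3930K, which draws far more than a 2500K):

```python
# Ballpark DC power budget for the proposed rig -- all figures are
# assumptions for illustration, not measurements.
components_w = {
    "i5-2500K @ 4.5GHz, loaded": 150,
    "R9 290X mining @ 900MHz, stock volts": 250,
    "board / RAM / HDD / fans / Antec 620 pump": 60,
}

total = sum(components_w.values())  # estimated peak draw
psu_rating = 550                    # Corsair TX550M
headroom = psu_rating - total

print(f"~{total} W draw, {headroom} W headroom on a {psu_rating} W unit")
```

On paper that's workable but tight: fine for mining-only loads, with little margin left if gaming and mining peaks ever overlap.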


----------



## Forceman

Quote:


> Originally Posted by *Falkentyne*
> 
> Wrong
> 
> How come my 77.9% asic card wont go higher than 1150 MHz at +100 mv?


Because ASIC quality means nothing. AMD themselves said the register it is reading is bogus.

Dave Baumann:
Quote:


> ASIC "quality" is a misnomer propagated by GPU-z reading a register and not really knowing what it results in. It is even more meaningless with the binning mechanism of Hawaii.


http://forum.beyond3d.com/showpost.php?p=1808073&postcount=2130


----------



## Thorteris

I'm so tempted to get an R9 290, but the aftermarket coolers need to come out already. I just can't deal with that noise. What's the cheapest and easiest to install aftermarket cooler, btw?


----------



## Mr357

Quote:


> Originally Posted by *jerrolds*
> 
> I'm thinking of giving my 2nd 290X to my gf, ill be setting it up to mine when idle - @900/1500 default voltages. Wondering if her Corsair TX550M can handle it.
> 
> She has an i5 [email protected], an antec 620...one HDD.
> 
> I'm thinking itll be close...probably 450W if she decides to game doubt her CPU will 100% and probably ~400W during mining only and a bit more during desktop work.
> 
> What do you guys think?


I wouldn't try it. I would say 650 or more to be safe.


----------



## Derpinheimer

Ive got a 290 on a 600W, no problems at all.


----------



## stickg1

I fold my 290 @ 1050MHz 24/7, as well as my 3570K @ 4.7GHz. I have my power strip plugged into a Kill-A-Watt, on the power strip is two monitors, speakers, and a three CFL bulb floor lamp. (and my PC obviously) It reads 410w on full load. If I run something like 3Dmark Firestrike it will jump up to about 480w.

I use a Seasonic X-650 Gold.
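One nuance with wall meters: the Kill-A-Watt reads AC for everything on the strip, and the PSU's DC output is lower than its AC input. A back-of-envelope conversion for the Firestrike figure above (the accessory share and efficiency number are assumptions, not measurements):

```python
wall_w = 480        # Kill-A-Watt reading, whole strip, Firestrike load
accessories_w = 60  # assumed share for two monitors, speakers, CFL lamp
efficiency = 0.90   # rough 80 Plus Gold efficiency around this load

pc_wall_w = wall_w - accessories_w   # AC drawn by the PC alone
dc_w = pc_wall_w * efficiency        # DC actually delivered to components

print(round(dc_w))  # ~378 W on the X-650's rails
```

So a 480W wall reading leaves the X-650 comfortably inside its rating even before subtracting the accessories.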


----------



## Arizonian

Quote:


> Originally Posted by *The Storm*
> 
> Just installed my 2 Sapphire 290's today. Here is a screenie, can I be added to the list?
> 
> Sapphire R9-290, Stock cooling
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - both added


----------



## anubis1127

Quote:


> Originally Posted by *stickg1*
> 
> I fold my 290 @ 1050MHz 24/7, as well as my 3570K @ 4.7GHz. I have my power strip plugged into a Kill-A-Watt, on the power strip is two monitors, speakers, and a three CFL bulb floor lamp. (and my PC obviously) It reads 410w on full load. If I run something like 3Dmark Firestrike it will jump up to about 480w.
> 
> I use a Seasonic X-650 Gold.


410w for science!


----------



## pkrexer

I'm not sure if this has been discussed already, but i'm running into an issue with both Afterburner and GPU tweak.

I can create my profile and it applies correctly upon booting into Windows, and everything is happy. As soon as the computer shuts off my monitors due to inactivity and I wake the screens back up, the voltage specified in Afterburner / GPU Tweak does not get reapplied; my memory clock / core clock are still set correctly, though. Because the voltage increase isn't reapplied, I get screen artifacts as soon as my monitors come back on and I have to quickly apply my profile again, and then things are OK.

I'm using both the latest beta 9.5 drivers and latest version of afterburner and gpu tweak.

Anyone else experiencing this or know how to fix it?


----------



## Forceman

Quote:


> Originally Posted by *stickg1*
> 
> I fold my 290 @ 1050MHz 24/7, as well as my 3570K @ 4.7GHz. I have my power strip plugged into a Kill-A-Watt, on the power strip is two monitors, speakers, and a three CFL bulb floor lamp. (and my PC obviously) It reads 410w on full load. If I run something like 3Dmark Firestrike it will jump up to about 480w.
> 
> I use a Seasonic X-650 Gold.


Sounds about right - mine draws just under 400W running BF4. That's the main system only, no monitor or accessories.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *skupples*
> 
> Some people can't hear it, even if it is present. The older you get, the less likely it is that you will hear that ultra high frequency.


LooooooL
















How true, but for me, I've got a portable A/C unit about 5 feet away from my left earhole, so yeah.


----------



## Widde

How "safe" is it to add the option to get +200mV in AB? I mean adding it through the shortcut, if that even works. If my temps are okay, what are the chances my card is gonna fry if I do that?


----------



## Newbie2009

http://www.3dmark.com/fs/1291545

Unlocked 290

1200/1500 cpu 4.5ghz



http://www.3dmark.com/fs/1291524


----------



## broken pixel

Quote:


> Originally Posted by *Widde*
> 
> How "safe" is it to add the option to get +200mV in AB? The way they are adding it thorugh the shortcut if it even works, If my temps are okey what are the chances my card is gonna fry if I do that?


Crank it up man, you won't fry your card.


----------



## Widde

Quote:


> Originally Posted by *broken pixel*
> 
> Crank it up man, you wont fry your card.


Temps really shot through the roof







Gonna try 156mV instead. Running the ref cooler btw: http://piclair.com/6m3u5
OC scaling on these things seems pretty high.


----------



## ImJJames

Quote:


> Originally Posted by *Newbie2009*
> 
> 
> 1200/1500 cpu 4.5ghz
> 
> 
> 
> http://www.3dmark.com/fs/1291524


Slightly low for a 290X at a 1200 clock; I see a lot of 290Xs at 1200 hitting a 13.3k+ graphics score.


----------



## Newbie2009

Quote:


> Originally Posted by *ImJJames*
> 
> Slightly low for a 290x 1200 clock, I see a lot of 290x with 1200 clock hitting 13.3k+ Graphic score

My CPU score is what's doing it. Really need to do a fresh install.


----------



## Widde

With these settings I'm getting some weird behaviour: http://piclair.com/w3yab

When I go to something fullscreen, my screen goes black for a couple of seconds and my sound goes away, but then the fullscreen program proceeds as normal; same thing when I exit the fullscreen program.

It even does it now at stock settings.


----------



## ImJJames

Quote:


> Originally Posted by *Widde*
> 
> With theses settings I'm getting some weird behaviour http://piclair.com/w3yab
> 
> When I go to something that is fullscreen my screen goes black for a couple of seconds and my sound goes away but then the fullscreen program proceeds as normal, same thing when i exit the fullscreen program
> 
> Even does it now at stock settings


Uninstall drivers using DDU in safe mode, and then reinstall with latest beta driver.


----------



## Widde

Quote:


> Originally Posted by *ImJJames*
> 
> Uninstall drivers using DDU in safe mode, and then reinstall with latest beta driver.


Sorry for the silly question, but what is DDU? I have the latest betas.

Got it working as usual with a reboot.


----------



## Gilgam3sh

Quote:


> Originally Posted by *Widde*
> 
> Sorry for silly question but what is DDU? Have the latest betas
> 
> Got it working as usual with a reboot


DDU

http://www.wagnardmobile.com/DDU/


----------



## darkelixa

How good are the AMD 8350s with the R9 290s at stock? This Monday my R9 290 should come into stock; not sure if I should keep my CPU at stock or risk overclocking it to like 4.7.


----------



## Durvelle27

Quote:


> Originally Posted by *darkelixa*
> 
> How good are the Amd 8350s with the r9 290s at stock, this monday my r9 290 should come into stock, not sure if i should keep my cpu at stock or risk overclocking it to like 4.7


At stock you will get a bottleneck, and it also comes down to the game.


----------



## darkelixa

Is changing over to an Intel i5 going to stop a bottleneck, or do both platforms have to be overclocked?


----------



## Durvelle27

Quote:


> Originally Posted by *darkelixa*
> 
> Is changing over to an intel i5 going to stop a bottleneck or do both platforms have to be overclocked?


Both will need to be OC'd to get good performance


----------



## darkelixa

Did a quick OC boost with the Sabertooth's onboard overclock from 4GHz to 4.3GHz and I received the same score on the 3DMark 11 Extreme preset with my GTX 770. Yes, I know it's an Nvidia card, but it's all I have that's better than my 5850 currently. Unless the 770 doesn't benefit from an overclocked CPU because of the way its boost clock works with power.


----------



## ReHWolution

Hey guys, I'm back







I just received an R9 290 from Sapphire, and an R7 260X too; two more GCN 1.1 cards







I'm almost done testing those high-end GPUs with Fire Strike Extreme; I just need a 280X


----------



## stickg1

Quote:


> Originally Posted by *darkelixa*
> 
> Did a quick oc boost with the sabertooths onboard overclock from 4gh to 4.3gh and i received the same score with 3d mark 11 extreme presets with my 770gtx , yes i know its a nvidia card but its all i have thats better than my 5850 currently . Unless the 770 doesnt benefit from an overclocked cpu becuase of the way the clock works with power


The boost is more noticeable in a gaming situation where the CPU and GPU are stressed simultaneously. In 3DMark your physics and combined score increased correct?


----------



## bronzodiriace

Working on the gpu ( I have to work on the timings of the ram)


----------



## battleaxe

Just want to tell you how to get a 290 for 420.00 if you are in the US and live near a MicroCenter:

Go to a Microcenter and make friends with a sales associate. Give him or her your phone number and ask them to call you on Wednesday or Friday (when their delivery trucks come in) as soon as they see the cards arrive, and to put one on the purchased shelf so you have a chance to get to the store before it sells out. Then drive to the store and pick up your new GPU. It takes about 24 hours for the website to acknowledge that an item is in stock, so this lets you pick it up before it ever shows as in stock on the site. Works famously.

I just did this, this week. The MSI version is at MC for 420.00 right now. Picked mine up yesterday. Is this all I need for proof?


----------



## Durvelle27

Quote:


> Originally Posted by *Derpinheimer*
> 
> Could be a downclocking issue. Try disablinb hardware acceleration in flash, and also try settings force constant voltage in MSI AB.
> Its called a TREND.
> 
> Come on.. its not like all asians have sma...
> 
> Right?


Thx, I did this and it hasn't happened again yet. Hopefully that was indeed the cure.


----------



## Arizonian

Quote:


> Originally Posted by *battleaxe*
> 
> Just want to tell you how to get a 290 for 420.00 if you are in the US and live near a MicroCenter:
> 
> Go to a Microcenter and make friends with a sales associate. Then give him/her your phone number. Then ask them to call you on Wednesday or Friday (when their new delivery trucks come in) Ask them to call you as soon as they see them come in stock. Ask them to put it on the purchased shelf so you have a chance to get to the store before it sells out. Drive to store and pick up your new GPU. It takes about 24 hours for the website to acknowledge that they have an item in stock. This lets you pick it up before it ever goes in stock on the site. Works famously.
> 
> I just did this, this week. The MSI version is at MC for 420.00 right now. Picked mine up yesterday. Is this all I need for proof?
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## stickg1

I'm about to do the red mod with a H55 on my 290. I'll check back in with the results.


----------



## devilhead

So, received 3x XFX 290; all have unlocked to 290X







and have ordered 2x more


----------



## Arizonian

Quote:


> Originally Posted by *devilhead*
> 
> so recieved 3x xfx 290, all have unlocked to 290x
> 
> 
> 
> 
> 
> 
> 
> and have ordered 2x more
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Five total? You must have two rigs....

*"How to put your Rig in your Sig"*


----------



## RAFFY

Quote:


> Originally Posted by *devilhead*
> 
> so recieved 3x xfx 290, all have unlocked to 290x
> 
> 
> 
> 
> 
> 
> 
> and have ordered 2x more


Very nice! I'm hunting for at least 6 R9 290's but am having no luck.


----------



## devilhead

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Five total? You must have two rigs....
> 
> *"How to put your Rig in your Sig"*


at this time i will put just one of 290 in my rig, with backplate and EK waterblock, when i will recieve next 2x xfx i will make watercooled mining rig







so all the cards will be watercooled







now its hard decide which card put under the water, from those 3x (because heve only 1 waterblock and backplate)







testing now 79 asic card, others 74 and 75







Passed the Valley test at 1170/1400 with +100mV (XFX 290X BIOS)


----------



## devilhead

Quote:


> Originally Posted by *RAFFY*
> 
> Very nice! I'm hunting for at least 6 R9 290's but am having no luck.


heh, I had 1 XFX 290 at first (able to unlock), but I overclocked it too hard with the PT3 BIOS on the stock fan and got a bad smell from the memory, I think

it was not able to overclock the memory anymore, so I sent the card back and got my money back

then I ordered 2x Club3D cards, both of them were locked, so I sent them back; after that I bought 2x 290 Sapphire BF4 Edition, both were locked, so I sent those back too

I'm a bad customer


----------



## ReHWolution

Hey guys, tryin' to set up a 290X/290 Crossfire; why can't I? I installed the latest beta drivers but every time I reboot I get a "No videocard found" error from CCC :\ GPU-Z shows 0MHz GPUs :\


----------



## battleaxe

Quote:


> Originally Posted by *devilhead*
> 
> heh, I had 1 XFX 290 at first (able to unlock), but I overclocked it too hard with the PT3 BIOS on the stock fan and got a bad smell from the memory, I think
> 
> it was not able to overclock the memory anymore, so I sent the card back and got my money back
> 
> then I ordered 2x Club3D cards, both of them were locked, so I sent them back; after that I bought 2x 290 Sapphire BF4 Edition, both were locked, so I sent those back too
> 
> I'm a bad customer


Yes you are sir, yes you are! LOL


----------



## stickg1

Just resell your cards if they don't unlock or overclock well. Just take the $25 loss instead of committing fraud.


----------



## Tobiman

Just got a 290 from eBay for $430 shipped. It should end up around $500 after I slap on a G10 and an H60 or something. My 780 Classy will probably go up for sale sometime soon.


----------



## anubis1127

Quote:


> Originally Posted by *stickg1*
> 
> Just resell your cards if they don't unlock or overclock well. Just take the $25 loss instead of committing fraud.


With the current AMD GPU shortage and price gouging going on, you shouldn't even have to take a $25 loss.


----------



## jamor

Quote:


> Originally Posted by *devilhead*
> 
> heh, I had 1 XFX 290 at first (able to unlock), but I overclocked it too hard with the PT3 BIOS on the stock fan and got a bad smell from the memory, I think
> 
> it was not able to overclock the memory anymore, so I sent the card back and got my money back
> 
> then I ordered 2x Club3D cards, both of them were locked, so I sent them back; after that I bought 2x 290 Sapphire BF4 Edition, both were locked, so I sent those back too
> 
> I'm a bad customer


LOL

you're a *horrible* customer! thanks for the laugh


----------



## HardwareDecoder

Quote:


> Originally Posted by *jamor*
> 
> LOL
> 
> you're a *horrible* customer! thanks for the laugh


wow simply wow


----------



## HardwareDecoder

Quote:


> Originally Posted by *devilhead*
> 
> heh, I had 1 XFX 290 at first (able to unlock), but I overclocked it too hard with the PT3 BIOS on the stock fan and got a bad smell from the memory, I think
> 
> it was not able to overclock the memory anymore, so I sent the card back and got my money back
> 
> then I ordered 2x Club3D cards, both of them were locked, so I sent them back; after that I bought 2x 290 Sapphire BF4 Edition, both were locked, so I sent those back too
> 
> I'm a bad customer


no shame at all huh


----------



## broken pixel

Quote:


> Originally Posted by *Widde*
> 
> With these settings I'm getting some weird behaviour http://piclair.com/w3yab
> 
> When I go to something that is fullscreen my screen goes black for a couple of seconds and my sound goes away, but then the fullscreen program proceeds as normal; same thing when I exit the fullscreen program
> 
> It even does it now at stock settings


what drivers are you using?


----------



## Mr357

Winter is simply amazing. I'm seeing 38C under full load on this bad boy.


----------



## bloodkil93

Managed to push 1195/1250 on R9 290 with an Arctic Accelero Xtreme III fitted.

Temps are as follows on max load:

GPU: 56 Degrees C
VRM1: 78 Degrees C
VRM2: 49 Degrees C

Voltage on full load: 1.344

MSI Afterburner used for overclocking with +100mV

Unigine Valley Score:


----------



## ImJJames

Quote:


> Originally Posted by *bloodkil93*
> 
> Managed to push 1195/1250 on R9 290 with an Arctic Accelero Xtreme III fitted.
> 
> Temps are as follows on max load:
> 
> GPU: 56 Degrees C
> VRM1: 78 Degrees C
> VRM2: 49 Degrees C
> 
> Voltage on full load: 1.344
> 
> MSI Afterburner used for overclocking with +100mV
> 
> Unigine Valley Score:


I am noticing that people who add an aftermarket cooler like the Accelero have higher VRM temps compared to the reference cooler.


----------



## Tennobanzai

Noob question, but how do I change voltage? I installed the newest driver and the newest MSI AB, and I checked the 2 voltage control options in AB. I slide the voltage and apply but it goes back to 0. I have a Sapphire 290.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Tennobanzai*
> 
> Noob question but how do I change voltage? I installed the newest driver, newest MSI AB and i checked the 2 voltage control options in AB. I slide the voltage and apply but it goes back to 0. I have a Sapphire 290


Have you changed the settings to allow overclocking?


----------



## Tennobanzai

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Have you changed the settings to allow overclocking?


I'm not sure what you mean by that. I can overclock the core and memory as of right now tho


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Tennobanzai*
> 
> I'm not sure what you mean by that. I can overclock the core and memory as of right now tho


I'm pretty sure there are settings that you have to change to allow voltage tweaks.

Look around in MSI AB's settings; I think the one you want to change is the one that allows unofficial overclocking.
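For what it's worth, on the Afterburner builds of that era the unofficial overclocking switch also lived in the config file; a sketch of the relevant fragment (the section and key names may differ between AB versions, so treat this as illustrative):

```ini
; MSIAfterburner.cfg fragment (section name varies by AB version)
[ATIADLHAL]
; 1 = enable unofficial overclocking without PowerPlay support
UnofficialOverclockingMode = 1
; the EULA line must be typed out exactly for the mode to take effect
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
```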


----------



## bloodkil93

Also, is 1.32V a safe max voltage for the VRMs? I was told by someone this is unsafe. They run at a max temp of 78 degrees C

Cheers!


----------



## HardwareDecoder

Quote:


> Originally Posted by *Clockster*
> 
> You have no idea what you are talking about.
> 
> Very few cards that unlock are True 290X cards, I have owned both a 290 and 290X and the 290X is a faster gpu.
> Synthetic benchmarks don't matter, play BF4 maxed out no AA @1440P with both a 290 and 290X and look at the performance difference.
> 
> Anyway, tempted to flash my cards with the Asus bios, need proper voltage control and AB just isn't giving it to me.


I wasn't able to control voltage with Afterburner either. I enabled everything in it AFAIK and had the latest version.

I have the Asus BIOS flashed to my 290X and use GPU Tweak to mess with voltage.


----------



## anubis1127

Quote:


> Originally Posted by *HardwareDecoder*
> 
> i wasn't able to control voltage with afterburner either. I enabled everything in it afaik and had the latest version.
> 
> I have the asus bios flashed to my 290x and uses gpu tweak to mess with voltage.


You need AB beta 17, or later for voltage control.


----------



## Tennobanzai

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I'm pretty sure there are settings that you have to change to allow voltage tweaks.
> 
> Look around in MSI AB's settings, I think the one you want to change is to allow unofficial overclocking.


Is it the old trick of changing some value from 0 to 1 in Notepad? If not, I'll post some screenshots tonight!


----------



## chiknnwatrmln

Quote:


> Originally Posted by *bloodkil93*
> 
> Also, is 1.32 a safe max voltage for the VRMs as I was told by someone this is unsafe, they run at a max temp of 78 Degrees C
> 
> Cheers!


1.32v before droop or 1.32 actual? If you're running 1.32v actual (during load, you can check via GPU-z) I'd say it's a bit much if you're on air cooling.

1.32v before droop should be fine. Your VRM temps don't seem bad either. Keep them under 85c to be safe, although they can technically go over 100c.
Quote:


> Originally Posted by *Tennobanzai*
> 
> Is it the old trick of changing some value from 0 to 1 in notepad? If not, i'll post some screenshots tonight!


I think so; honestly, I can't stand MSI AB so I stick with GPUTweak or Precision X.


----------



## CravinR1

I want to be in:

Stock cooling:
MSI 290
XFX 290
Sapphire 290

Here is the gpuz for the msi (what I have in this pc) and here is the picture of all 3
http://www.techpowerup.com/gpuz/72b7k/


----------



## Tennobanzai

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> 1.32v before droop or 1.32 actual? If you're running 1.32v actual (during load, you can check via GPU-z) I'd say it's a bit much if you're on air cooling.
> 
> 1.32v before droop should be fine. Your VRM temps don't seem bad either. Keep them under 85c to be safe, although they can technically go over 100c.
> I think, I honestly can't stand MSI AB so I stick with GPUTweak or Precision X.


I like AB because they have the fan control profiles tho


----------



## prostreetcamaro

Quote:


> Originally Posted by *ImJJames*
> 
> I am noticing that people who add aftermarket cooler like Accelero have higher VRM temps compared to reference cooler.


I didn't. Mine was the same or cooler with the Accelero 3


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Tennobanzai*
> 
> I like AB because they have the fan control profiles tho


GPUTweak also has fan profiles.

GPUTweak, MSI AB, and Precision X are all based on the same software I believe (RivaTuner) so they all run similarly. The main differences are the skins, and that GPUTweak allows much more voltage.


----------



## HardwareDecoder

Quote:


> Originally Posted by *CravinR1*
> 
> I want to be in:
> 
> Stock cooling:
> MSI 290
> XFX 290
> Sapphire 290
> 
> Here is the gpuz for the msi (what I have in this pc) and here is the picture of all 3
> http://www.techpowerup.com/gpuz/72b7k/


I saw earlier you didn't get an unlocker out of all of those. That sucks. Did you pay over MSRP for all of them too?


----------



## Tennobanzai

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> GPUTweak also has fan profiles.
> 
> GPUTweak, MSI AB, and Precision X are all based on the same software I believe (RivaTuner) so they all run similarly. The main differences are the skins, and that GPUTweak allows much more voltage.


That's awesome, I didn't know GPUTweak has the custom fan profiles!


----------



## CravinR1

Quote:


> Originally Posted by *HardwareDecoder*
> 
> I saw earlier you didn't get an unlocker out of all of those. That sucks did you pay over MSRP for all of them too ?


Nah only the XFX

Sapphire + BF4 = $395 shipped
MSI = $399 shipped
XFX = $458 shipped (got it a week earlier cause newegg shipping fiasco)

Was very disappointed not one unlocked though

I did notice a disparity between the included accessories from the manufacturers

Sapphire had no covers on the display ports; however, it did come with power adapters, a DisplayPort adapter, and an HDMI cable, plus a BF4 key


----------



## HardwareDecoder

Quote:


> Originally Posted by *CravinR1*
> 
> Nah only the XFX
> 
> Sapphire + BF4 = $395 shipped
> MSI = $399 shipped
> XFX = $458 shipped (got it a week earlier cause newegg shipping fiasco)
> 
> Was very disappointed not one unlocked though


Oh, so you ordered them before the price gouge then, I assume.

Did you buy from three brands hoping to get at least 1-2 unlockers?

Are you going to be selling some/all of them due to not unlocking, or are you gonna run tri-xfire or some such? Just curious.


----------



## CravinR1

Currently I'm mining on them, making 1 LTC a day. I will be adding the 7950 to the mining when I get my PCIe risers
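For context on where a "1 LTC a day" figure comes from: expected daily coins follow from hashrate, network difficulty, and block reward. A rough sketch in Python (the difficulty here is a hypothetical round number for late 2013, not something stated in the thread):

```python
# Back-of-envelope expected LTC per day from scrypt mining.
# Assumed values: a ~2.97 MH/s rig, a HYPOTHETICAL network
# difficulty of 3000, and the 50 LTC block reward of the time.
hashrate_hps = 2.97e6      # hashes per second
difficulty = 3000          # hypothetical network difficulty
block_reward = 50          # LTC per block
seconds_per_day = 86400

# Expected coins/day = hashrate * time * reward / (difficulty * 2^32)
coins_per_day = hashrate_hps * seconds_per_day * block_reward / (difficulty * 2**32)
print(round(coins_per_day, 2))  # roughly 1 LTC/day under these assumptions
```

With difficulty climbing week over week back then, the same rig's output would have dropped steadily, which is why quoted figures like this went stale fast.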


----------



## jerrolds

3.67MH/s with 2 290Xs? How are you getting 1.8MH/s+ per card? 290Xs are around 850-950 kH/s.


----------



## CravinR1

Quote:


> Originally Posted by *jerrolds*
> 
> 3.67Mh/s with 2 290Xs? How are you getting 1.8Mh/s+ per card? 290X are around 850-950Khs.


That's with 3x 290 and a 5870

Hashes:

MSI 870
Sapphire 860
XFX 840
5870 400

Total of 2970 but sometimes give-me-coins spikes higher
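As a quick sanity check, those per-card numbers do sum to the quoted total (a trivial Python check; the card labels are just taken from the list above):

```python
# Per-card scrypt hashrates in kH/s, as listed above
rates = {"MSI 290": 870, "Sapphire 290": 860, "XFX 290": 840, "HD 5870": 400}

total_khs = sum(rates.values())
print(total_khs)  # 2970 kH/s, matching the reported total
```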


----------



## stickg1

Well I just got done. She ain't pretty, but she's nice and quiet now! I might get one of those NZXT brackets. Also I need to order more VRM sinks and RAM sinks.



These are temps while folding:



Then in the spoiler is more pictures of installing, finished, and some Unigine temps. One is 15 minutes running, then the next one is 3 minutes to cool down temps.


Spoiler: More pictures and temps!!


----------



## Arizonian

Quote:


> Originally Posted by *CravinR1*
> 
> I want to be in:
> 
> Stock cooling:
> MSI 290
> XFX 290
> Sapphire 290
> 
> Here is the gpuz for the msi (what I have in this pc) and here is the picture of all 3
> http://www.techpowerup.com/gpuz/72b7k/
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *stickg1*
> 
> Well I just got done, she aint pretty but shes nice and quiet now! I might get one of those NZXT brackets. Also I need to order more VRMsinks and RAMsinks.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> These are temps while folding:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Then in the spoiler is more pictures of installing, finished, and some Unigine temps. One is 15 minutes running, then the next one is 3 minutes to cool down temps.
> 
> 
> Spoiler: More pictures and temps!!


Congrats - added


----------



## Durvelle27

Arizonian you can remove me from the list as i just sold my 290 for $500.


----------



## battleaxe

Quote:


> Originally Posted by *stickg1*
> 
> Well I just got done, she aint pretty but shes nice and quiet now! I might get one of those NZXT brackets. Also I need to order more VRMsinks and RAMsinks.
> 
> 
> 
> These are temps while folding:
> 
> 
> 
> Then in the spoiler is more pictures of installing, finished, and some Unigine temps. One is 15 minutes running, then the next one is 3 minutes to cool down temps.
> 
> 
> Spoiler: More pictures and temps!!


Yup. You win. That is ugly. What about the RAM and VRMs?

Edit: how did your VRM temps go down with no heatsinks on them? Or am I blind? Yes, I am blind.

I see them now. Hey, whatever works. Right?

Wait: you're comparing a 90+% load on stock to a 1% load. What are the VRMs like at 98% load?


----------



## stickg1

There's a 120mm fan pointed directly at them. I think some people overestimate the effectiveness of AMD's stock heatsink for the VRMs and RAM chips. All it is is a big black piece of aluminum-feeling metal that makes contact with that other inadequate heatsink. It's mostly just the airflow of the stock cooler that keeps these things cool. You could watercool this card with a uniblock and still keep the VRMs and RAM chips below the point of overheating with decent ambient temps and airflow inside the case. VRM sinks and RAM sinks are definitely a good idea if you're going to be gaming for long sessions. I did order some but I don't have any right now.


----------



## Arizonian

Quote:


> Originally Posted by *Durvelle27*
> 
> Arizonian you can remove me from the list as i just sold my 290 for $500.


Done.


----------



## broken pixel

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Have you changed the settings to allow overclocking?


Mine did that once. I did a fresh install of the GPU driver after cleaning the drivers with DDU in safe mode. Do a complete uninstall of AB and the OSD; I did not save any AB settings when it asked. Worked for me.


----------



## chiknnwatrmln

Finally got my rig re-connected to the net!

Time to run some more benches...


----------



## Clockster

Mmmm Gigabyte GTX780TI OC Edition for my R9 290X card...So tempted to take the trade...lol


----------



## HardwareDecoder

Needed a break from studying for finals, so I decided to slap on the EK acetal/nickel block

Okay the fun is over







back to anatomy and physiology

factory paste job


ek wb installed


----------



## pkrexer

Quote:


> Originally Posted by *Clockster*
> 
> Mmmm Gigabyte GTX780TI OC Edition for my R9 290X card...So tempted to take the trade...lol


If you don't, I will


----------



## chiknnwatrmln

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Needed a break from studying for finals so I decided to slap on the ek acetal/nickle block
> 
> Okay the fun is over
> 
> 
> 
> 
> 
> 
> 
> back to anatomy and physiology
> 
> factory paste job
> 
> ek wb installed


Looks good, can't wait to get mine!

Good luck on your finals, I finished up all my finals last week so it's time for some Skyrim for me.


----------



## y2kcamaross

You can add me here in a bit; will post a screenshot when I get home. My backup Betty rig now has 1 XFX 290 unlocked to 290X with an Accelero Xtreme III for cooling, one more to come soon


----------



## HardwareDecoder

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Looks good, can't wait to get mine!
> 
> Good luck on your finals, I finished up all my finals last week so it's time for some Skyrim for me.


thx can't wait for it to be over


----------



## rdr09

Quote:


> Originally Posted by *y2kcamaross*
> 
> You can add me here in a bit, will post screen shot when I get home, my back up betty rig now has 1 xfx 290 unlocked to 290x with an accelero xtreme 3 for cooling, one more to come soon


2 290s paired with a Phenom Quad. win!


----------



## battleaxe

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Needed a break from studying for finals so I decided to slap on the ek acetal/nickle block
> 
> Okay the fun is over
> 
> 
> 
> 
> 
> 
> 
> back to anatomy and physiology
> 
> factory paste job
> 
> 
> ek wb installed


You and me both. Just finished my last final last night. Yay.... And now I spent all day screwing around with mining on my 290. BTW, these things are really loud. Looking forward to water. Fan is running 85% to stay under 68c. Good thing it's in the closet, but it's still loud even with the door closed. lol

Oh well, who cares? Not me. Love the card. I can mine on it and browse the internet and it doesn't get overwhelmed.


----------



## jrcbandit

What is the best game to stress test a 290x and cpu overclock other than Battlefield 4 or Crysis 3 which I don't own?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *battleaxe*
> 
> You and me both. Just finished my last final last night. Yay.... And now I spent all day screwing around with mining on my 290. BTW, these things are really loud. Looking forward to water. Fan is running 85% to stay under 68c. Good thing its in the closet, but its still loud even with the door closed. lol
> 
> Oh well, who cares? Not me. Love the card. I can mine on it and browse the internet and it doesn't get overwhelmed.


What kind of coins do you mine? I'm kinda considering trying my hand at mining altcoins; apparently the 290/X's get like 800 kH/s or something like that. I wonder if I could see any real profit from just one card.

To the above, I find that Skyrim (I have various graphics mods + ENB so it's always 100% GPU usage) is the most sensitive. It will artifact even when other games and benches don't. Then comes Fallout 3, and Metro LL is also a good test.


----------



## stickg1

I tightened up the config a little bit, still totally ghetto but it kinda works! It took care of the problem anyway, noise. And I couldn't fold at 1075MHz before this mod because of heat and noise. So I guess it was a success.


----------



## kizwan

I want to join.









2 x Sapphire R9 290 with stock cooling.







Please excuse my cable management. It's just temporary until I get water blocks for my cards. It took me ~8 hours to disconnect the old PSU (non-modular), drain my loop, disconnect the old GPU from the loop, reposition the bottom radiator since it no longer fit because of the new PSU, install the new PSU, install the cards & finally refill my loop.

3dmark 11 @stock clock



http://www.3dmark.com/3dm11/7656003

*[Update - 22 January 2014]*

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/14770#post_21629049


----------



## HOMECINEMA-PC

Mammoth effort there as always


----------



## DeadlyDNA

Being new to the Red team once again, how long do they take in between WHQL drivers? I'm using the last WHQL I could find. I stopped using the beta drivers because I was having distorted sound in 3D games.

My only beef with this WHQL is that on boot up I seem to have to disable and re-enable CF to get proper performance


----------



## jerrolds

For the life of me I can't get Afterburner Beta 17 to unlock voltage control. Using a Sapphire and an XFX 290X, both flashed with the Asus BIOS; currently using GPU Tweak. But GT can't "see" the primary card when not in Crossfire mode, so any time I want to set the memory to 1500 the display gets corrupted in 2D (desktop). Super annoying.

Can someone please tell me what settings I need to check off to get AB17 to unlock voltage controls. Or does it not work with the Asus BIOS?


----------



## Radmanhs

I would like to join 

gpu-z

http://www.techpowerup.com/gpuz/kecq7/

XFX

Stock... hopefully I will be going watercooled after Christmas. It's going to be really awkward for my parents getting a bunch of fittings, pump, res etc.  I did ask for AC4 to sound a little normal though hahaha


----------



## rv8000

I just got so trolled by FrozenCPU D: there was one copper/acrylic EK block in stock, and the second I complete my order it's gone. I will find you, person!


----------



## Heinz68

Quote:


> Originally Posted by *devilhead*
> 
> heh, I had 1 XFX 290 at first (able to unlock), but I overclocked it too hard with the PT3 BIOS on the stock fan and got a bad smell from the memory, I think
> 
> it was not able to overclock the memory anymore, so I sent the card back and got my money back
> 
> then I ordered 2x Club3D cards, both of them were locked, so I sent them back; after that I bought 2x 290 Sapphire BF4 Edition, both were locked, so I sent those back too
> 
> I'm a bad customer


Can't believe you think it's funny.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *devilhead*
> 
> heh, I had 1 XFX 290 at first (able to unlock), but I overclocked it too hard with the PT3 BIOS on the stock fan and got a bad smell from the memory, I think
> 
> it was not able to overclock the memory anymore, so I sent the card back and got my money back
> 
> then I ordered 2x Club3D cards, both of them were locked, so I sent them back; after that I bought 2x 290 Sapphire BF4 Edition, both were locked, so I sent those back too
> 
> I'm a bad customer


If ya had done your homework you should have got another XFX or Powercolor (they unlock).......... and you would not have wasted your time and everyone else's having to deal with ya


----------



## Heinz68

Quote:


> Originally Posted by *jerrolds*
> 
> For the life of me i cant get Afterburner Beta 17 to unlock voltage control. Using an Sapphire and XFX 290X both flashed with Asus BIOS, currently using GPU Tweak. But GT cant "see" the primary card when not in Crossfire mode..so anytime i want to set the memory to 1500 the display gets corrupted in 2D (desktop). Super annoying.
> 
> Can someone please tell me what settings i need to check off to get AB17 to unlock voltage controls. Or does it not work with asus bios'?


Did you unlock the voltage control? You don't need to unlock the monitoring. I don't have the cards yet, but somebody said it's buggy; you can try and see.


----------



## Arizonian

Quote:


> Originally Posted by *kizwan*
> 
> I want to join.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2 x Sapphire R9 290 with stock cooling.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please excuse my cable management. It's just temporary until I get water blocks for my cards. It took me ~8 hours to disconnect the old PSU (non-modular), drain my loop, disconnect the old GPU from the loop, reposition the bottom radiator since it no longer fit because of the new PSU, install the new PSU, install the cards & finally refill my loop.
> 
> 3dmark 11 @stock clock
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/7656003


Congrats - added









Quote:


> Originally Posted by *Radmanhs*
> 
> I would like to join
> 
> 
> 
> 
> 
> 
> 
> 
> 
> gpu-z
> http://www.techpowerup.com/gpuz/kecq7/
> 
> XFX
> 
> Stock... hopefully I will be going Watercooled after christmas, its going to be really awkward for my parents getting a bunch of fittings, pump, res ect
> 
> 
> 
> 
> 
> 
> 
> i did ask for AC4 to sound a little normal though hahaha


Congrats - added


----------



## Technewbie

AMD needs to fix the fan profile for the 290X. In Uber mode my card was at 94C and the fan was only at 33%. (Yes, I use AB 17 but I don't always use it.)


----------



## ImJJames

Anyone else waiting for the Kraken G10 to get back in stock? grrrrrr


----------



## maynard14

Hi there, I was wondering why the latest MSI Afterburner beta does not show my voltage in the monitoring graph?

I already clicked the unlock voltage monitoring option in the AB settings...

Thanks so much for the reply


----------



## CravinR1

95 Celsius is the temp target


----------



## the9quad

Cheeeyah! 14k in extreme.


----------



## Joeking78

Quote:


> Originally Posted by *the9quad*
> 
> Cheeeyah! 14k in extreme.


Nice one









I need to give mine a run now


----------



## Forceman

Quote:


> Originally Posted by *maynard14*
> 
> Hi there.. i was wondering why MSI after burner latest beta could not show my voltage ath the monitor graph of the msi ab?
> 
> i already click the unlock voltage monitor on the ab settings...
> 
> thanks so much for the reply


You may have to turn it on on the monitoring tab as well. There's a list of things being shown in the graph in the middle of that settings page.


----------



## frogzombie

Sapphire R9 290

Stock cooler


----------



## MrWhiteRX7

Quote:


> Originally Posted by *the9quad*
> 
> Cheeeyah! 14k in extreme.


NICE!


----------



## Arizonian

Quote:


> Originally Posted by *frogzombie*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Sapphire R9 290
> 
> Stock cooler


I don't see your screen name in the Validation tab; if you can edit and add it to this post, that would be nice. Just for proof, however.....

Congrats - added


----------



## frogzombie

Quote:


> Originally Posted by *Arizonian*
> 
> I don't see your screen name in Validation tab, if you can edit and add it to this post would be nice. Just for proof, however.....
> 
> Congrats - added


My apologies. I will update it tomorrow. Thanks!


----------



## DeadlyDNA

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Being new to Red team once again, how long do they take in between WHQL drivers? I'm using the last WHQL i could find. I stopped the beta drivers because i was having distorted sound in 3d games.
> 
> My only beef with this WHQL is it seems on boot up, i have to disable and re-enable CF to get proper performance


Scratch that, my issue was with MSI AB beta 17 and the WHQL drivers. Gonna have to look at the Asus BIOS and GPU Tweak I think


----------



## the9quad

My issue is I can't watch Blu-rays without the WHQL drivers (which have never been officially released, btw), and I can't play BF4 with them.

So yeah, they need to hurry up and wrap up the past few betas and release an OFFICIAL WHQL driver.


----------



## RAFFY

Quote:


> Originally Posted by *the9quad*
> 
> My issue is I cant watch blu rays without the WHQL drivers (which have never been officially released btw), and I can't play BF4 with them.
> 
> So yea they need to hurry up and wrap up the past few betas and release an OFFICIAL WHQL driver.


This is why man created PS4 and Xbox One


----------



## maynard14

Quote:


> Originally Posted by *Forceman*
> 
> You may have to turn it on on the monitoring tab as well. There's a list of things being shown in the graph in the middle of that settings page.


What the,,,,,, ahahahah I didn't know that sir... thank you so much!!


----------



## the9quad

Quote:


> Originally Posted by *RAFFY*
> 
> This is why man created PS4 and Xbox One


That man doesn't have to wait til Christmas for his kids to open it.
Seriously though, the PC is in the bedroom. I like to chill in bed sometimes and watch a movie.


----------



## RAFFY

Quote:


> Originally Posted by *the9quad*
> 
> That man doesn't have to wait til christmas for his kids to open it.
> Seriously though the pc is in the bedroom.Like to chill in bed sometimes and watch a movie.


Just let the wife open her stripper pole early! No need for Blu-ray then! I just noticed you live in Madison, Alabama. Do you work on the arsenal or for NASA?


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Mammoth effort there as always


Thanks madman. I already can't wait to put these bad girls under water.









BTW, this upgrade was my birthday gift from my aunt. Best gift ever!


----------



## velocityx

anyone still getting black or red screens in CF from time to time?


----------



## CriticalHit

Quote:


> Originally Posted by *velocityx*
> 
> anyone still getting black or red screens in CF from time to time?


My DisplayPort monitor sometimes goes green. Just turn it off and on to fix it. Monitors on DVI are fine


----------



## battleaxe

Quote:


> Originally Posted by *velocityx*
> 
> anyone still getting black or red screens in CF from time to time?


Only thing I've gotten was a crash screen with multiple tiny pixels of color. Mostly light green. But it was totally my fault for mining it too hard. I knew it was gonna happen. Crashed my Nvidia cards too, so seems totally normal. My 290 has been running fabulously. I have no complaints. This is the deal of the decade if you ask me. (this coming from someone who loves Nvidia cards)

Mine's still stock and is gonna be for at least a month or two. Gonna run her hard 24/7 and see if she holds up. If she does, she's going under water.

These cards are great, and a bargain... Make no mistake about it.


----------



## Widde

Feels like I'm getting a pretty low score for the oc http://piclair.com/d7t44







The core doesn't wanna go higher without more juice =/

Have it at +100mV btw


----------



## The Storm

Ordered my 2 new EK blocks for my 290's last night, I cant wait to get these things on!!!!


----------



## bloodkil93

Quote:


> Originally Posted by *Widde*
> 
> Feels like I'm getting a pretty low score for the oc http://piclair.com/d7t44
> 
> 
> 
> 
> 
> 
> 
> The core doesnt wanna go higher without more juice =/
> 
> Have it at +100mV btw


That voltage seems far too high for that clock; with +100mV I can get nearly 1190 out of mine


----------



## RAFFY

Quote:


> Originally Posted by *velocityx*
> 
> anyone still getting black or red screens in CF from time to time?


Nope, I haven't had this issue since Beta 9.2 drivers. With beta 9.5 and the latest drivers my game is running flawless now.


----------



## prostreetcamaro

Quote:


> Originally Posted by *ImJJames*
> 
> Anyone else waiting for kraken g10 to get on stock? grrrrrr


Not really. Won't work worth a crap with these cards IMO. Absolutely no cooling at all to the front 3 VRMs.


----------



## prostreetcamaro

Quote:


> Originally Posted by *prostreetcamaro*
> 
> Not really. Wont work worth a crap with these cards IMO. Absolutely no cooling at all to the front 3 VRM's. I personally think its a joke. Look at those 3 front VRM's. No active or passive cooling at all.


----------



## Valice

count me in!


----------



## stickg1

All of a sudden I'm getting a bunch of flickering blue pixels. I just uninstalled the drivers with DDU in safe mode, cleaned up with CCleaner, then reinstalled the drivers, and I still have them. Reset the card to stock clocks; is it dying?


----------



## Tobiman

Quote:


> Originally Posted by *stickg1*
> 
> All the sudden I'm getting a bunch of flickering blue pixels. I just uninstalled the drivers with DDU in safe mode, cleaned up with CCleaner, then reinstalled drivers and I still have them. Reset the card to stock clocks, is it dying?


This usually happens when memory is clocked too high or voltage starved.


----------



## stickg1

Quote:


> Originally Posted by *Tobiman*
> 
> This usually happens when memory is clocked too high or voltage starved.


Yeah but everything is stock.


----------



## stickg1

Quote:


> Originally Posted by *stickg1*
> 
> Yeah but everything is stock.


Ugh I shouldn't admit this but I will. The source of the problem seemed to be a loose DVI connection, LOL.


----------



## anubis1127

Quote:


> Originally Posted by *stickg1*
> 
> Ugh I shouldn't admit this but I will. The source of the problem seemed to be a loose DVI connection, LOL.


Haha, well at least it was something simple.


----------



## quakermaas

Quote:


> Originally Posted by *stickg1*
> 
> Ugh I shouldn't admit this but I will. The source of the problem seemed to be a loose DVI connection, LOL.


Can happen, at least it's sorted.


----------



## frogzombie

Quote:


> Originally Posted by *Arizonian*
> 
> I don't see your screen name in Validation tab, if you can edit and add it to this post would be nice. Just for proof, however.....
> 
> Congrats - added


This should be better.


----------



## Derpinheimer

Quote:


> Originally Posted by *prostreetcamaro*
> 
> Not really. Wont work worth a crap with these cards IMO. Absolutely no cooling at all to the front 3 VRM's.


Those ones don't really need any cooling. A tiny 15x15x8mm passive heatsink keeps them under 72°C; 14x14x14mm keeps them under 68°C.

Add a 600RPM 90mm fan and that lowers it a couple more degrees.

As long as there is something on them I wouldn't worry at all. Even just a draft going over them is enough.

The long line of VRMs is the one that burns up when overclocking.


----------



## velocityx

In Legit Reviews' review, they said the temp on that VRM was only 1°C higher.

Actually, I will be buying two Corsair H75s and two brackets for my CF setup.


----------



## Widde

Quote:


> Originally Posted by *bloodkil93*
> 
> That voltage seems far too high for that clock, with +100mV i can get nearly 1190 out of mine


Think I just had some bad luck in the lottery =/ Can't get higher and keep it stable; temps are in check and all. It starts to tear in FurMark above 1115MHz on the core.


----------



## kcuestag

Anyone know what the AMD rep's username is on here? My 290X still seems to be black-screening while playing BF4, even at stock settings...


----------



## HardwareDecoder

So I got my water loop all set up today in my new case. I can't even believe my 290X is at like 40°C max. VRMs are 30-40°C.

I really don't see why anyone would buy one of these and not put it under water.

Also, this thing wanted to go under water really badly... I can do 1200 with +100mV and 1250 with +163mV. I couldn't do over 1175 on +163mV on air, so I gained like 75MHz core and 250MHz memory on water.

I also couldn't OC my VRAM at all on air, and I just ran a 1200/1500 Heaven benchmark with 0 issues.

I tried 1300 core with no memory OC but it crashed. Oh well, I'm happy as a clam with 1250. I'll see if I can push the memory up any higher later.

Super happy with my 290X.


----------



## stickg1

Quote:


> Originally Posted by *prostreetcamaro*
> 
> Not really. Wont work worth a crap with these cards IMO. Absolutely no cooling at all to the front 3 VRM's.


Mine works fine with no heatsinks at all, in fact. I did put 3 little VRM sinks on the bundle of three up front. I just zip-tied mine together and it keeps everything cooler (and quieter) than the stock cooler.


----------



## Jack Mac

Quote:


> Originally Posted by *HardwareDecoder*
> 
> I really don't see why anyone would buy one of these and not put it underwater.


Because it's too expensive and defeats the purpose of buying these cards for price/performance.


----------



## velocityx

Quote:


> Originally Posted by *kcuestag*
> 
> Anyone knows what's the AMD Rep's username in here? My 290X still seems to be black screening while playing BF4, even at stock settings...


My CF 290s seem to black- and red-screen every couple of days too.


----------



## the9quad

Quote:


> Originally Posted by *Jack Mac*
> 
> Because it's too expensive and defeats the purpose of buying these cards for price/performance.


I kind of agree with ya there. It's really true if you haven't previously invested in water yet; you're looking at some serious change to do it. Waterblocks are only part of the equation: by the time you throw in pump(s), reservoir, rad(s), tubing, fittings, and possibly a bigger PSU, it adds up to some serious cash.

On the other hand, if you have already invested in everything but the waterblocks, it's a no-brainer to water cool them.


----------



## hotrod717

Quote:


> Originally Posted by *Jack Mac*
> 
> Because it's too expensive and defeats the purpose of buying these cards for price/performance.


Have to disagree. If you water cool a 290, it's still $150 cheaper than a 290X and still at the same price/performance even counting the water cooling. You also gain OCing potential, and water cooling usually extends the life and performance of a GPU/CPU. That being said, no, water cooling isn't cheap.


----------



## HardwareDecoder

Quote:


> Originally Posted by *the9quad*
> 
> I kind of agree with ya there. It's really true if you haven't previously invested in water yet. Your looking at some serious change to do it. Water blocks are only part of the equation, by the time you throw in pump(s)/reservoir/rad(s)/tubing/fittings/possibly a bigger psu, it adds up to some serious serious cash.
> 
> On the other hand if you have already invested in everything but the waterblocks, its a no brainer to water cool em.


All I know is I'm never going back. The low temps, the quiet from only having radiator fans (I'm only running one rear case fan because I don't need any others anymore). My computer basically makes no noise anymore.

Yeah, I've spent $350 so far to get my CPU + GPU cooled:

XSPC RASA 240mm kit ($150) + a second rad ($50, I think) + $118 for a block and miscellaneous stuff.
Quote:


> Originally Posted by *stickg1*
> 
> Ugh I shouldn't admit this but I will. The source of the problem seemed to be a loose DVI connection, LOL.


I've done that. Freaked out that my monitor / computer was dying.
Quote:


> Originally Posted by *hotrod717*
> 
> Have to diagree. If you watercool, it's still $150 cheaper than a 290X and still at the same price/performance when considering watercooling. You also gain oc'ing potential and watercooling usually extends the life and performance of a gpu/cpu. That being said, no, watercooling isn't cheap.


Did you mean cheaper than a 780 Ti? I already had my RASA kit for the CPU, so it basically cost me $118 for a block and $50 for a second radiator.

You have to take into consideration that now that I have a second radiator, I'll likely never need to buy another one unless I do some ridiculous setup.

I can also resell the block for at least half of what I paid for it eventually, when I'm ready to move on to a new card.

So basically, no, water cooling isn't cheap, but if you value quiet and OCing potential (temps), it cannot be beat.

I spent about the same as a 780 Ti in the end, but I have about as much performance and zero noise.


----------



## Amhro

Quote:


> Originally Posted by *kcuestag*
> 
> Anyone knows what's the AMD Rep's username in here? My 290X still seems to be black screening while playing BF4, even at stock settings...


http://www.overclock.net/u/359086/warsam71


----------



## kcuestag

Quote:


> Originally Posted by *Amhro*
> 
> http://www.overclock.net/u/359086/warsam71


Thanks, sending him a PM hoping he can help me out, really tired of this 290X...


----------



## RAFFY

Quote:


> Originally Posted by *kcuestag*
> 
> Thanks, sending him a PM hoping he can help me out, really tired of this 290X...


Do you get black screens playing other games?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *the9quad*
> 
> I kind of agree with ya there. It's really true if you haven't previously invested in water yet. Your looking at some serious change to do it. Water blocks are only part of the equation, by the time you throw in pump(s)/reservoir/rad(s)/tubing/fittings/possibly a bigger psu, it adds up to some serious serious cash.
> 
> On the other hand if you have already invested in everything but the waterblocks, its a no brainer to water cool em.


I can attest to that: after buying a new PSU, case, blocks, pump, res, rads, fittings, TIM, additive, fans, tubing, etc., I'm over $1.2k on watercooling... and keep in mind the only non-watercooling components bought were the case and PSU. Granted, I'm buying all high-end parts, but still.

I started with a budget of $400 originally, lol.

On the topic of thinking your PC is dead, yesterday I had my new wifi adapter plugged into my mobo, and I guess it was in the flashback port so my PC kept trying to boot from the adapter. It didn't do it the first 5 or so times that I booted so it took me a while to figure it out. First I thought my 290 was dead, then my Sabertooth Z77.

Simply unplugging my adapter made it work again.


----------



## TheRoot

Quote:


> Originally Posted by *HardwareDecoder*
> 
> so I got my waterloop all setup today in my new case. I can't even believe my 290x is at like 40C max. Vrms are 30-40C
> 
> I really don't see why anyone would buy one of these and not put it underwater.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Also this thing wanted to go under water really badly.... I can do 1200 with +100mv and 1250 with +163. I couldn't do over 1175 on +163 on air. so I gained like 75mhz core and 250mhz memory on water.
> 
> I also couldn't oc my vram at all on air and I just ran a 1200/1500 heaven benchmark with 0 issues.
> 
> I tried 1300 core with no mem oc but it crashed, oh well i'm happy as a clam with 1250. I'll see if I can push the memory up any higher later.
> 
> Super happy with my 290x


Because the reference cooler is good enough.


----------



## the9quad

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I can attest to that, after buying a new PSU, case, blocks, pump, res, rads, fittings, TIM, additive, fans, tubing, etc. I'm over $1.2k for watercooling.... Keep in mind no core components were bought except the case and PSU. Granted I'm buying all high-end parts, but still.
> 
> I started with a budget of $400 originally, lol.
> 
> On the topic of thinking your PC is dead, yesterday I had my new wifi adapter plugged into my mobo, and I guess it was in the flashback port so my PC kept trying to boot from the adapter. It didn't do it the first 5 or so times that I booted so it took me a while to figure it out. First I thought my 290 was dead, then my Sabertooth Z77.
> 
> Simply unplugging my adapter made it work again.


And that would be my problem right there. I would want to get high-end stuff as well, and a new case to put it all in, so I'd be looking at roughly the same money as you. It's just hard to justify that much money (to my wife; matter of fact, I wouldn't dare ask. I'd have to sneak it, and it's just too much stuff to sneak, lol).


----------



## RAFFY

Quote:


> Originally Posted by *the9quad*
> 
> And that would be my problem right there, I would want to get high end stuff as well and a new case to put it all in, so I'd be looking at roughly the same money as you there. It's just hard to justify that much money (to my wife, matter of fact I wouldn't dare ask, I'd have to sneak it, and it's just too much stuff to sneak lol)


Think outside the box... send her on a cruise with a girlfriend.


----------



## hotrod717

Quote:


> Originally Posted by *HardwareDecoder*
> 
> All I know is i'm never going back. The low temps, the quiet from only having radiator fans (i'm only running one rear case fan cause I don't need any others anymore.) My computer basically makes no noise anymore.
> 
> Yeah I've spent 350$ so far to get my cpu+gpu cooled.
> 
> XSPC rasa 240mm kit ($150+a second rad($50 I think) + 118$ for a block and miscellaneous stuff.
> I've done that. Freaked out that my monitor / computer was dying.
> did you mean cheaper than a 780ti? I already had my rasa kit for the cpu so it basically cost me 118$ for a block and 50$ for a second radiator.
> 
> You have to take in to consideration that now that I have a second radiator I never need to buy another one likely unless I do some ridiculous setup.
> 
> I can also resell the block for atleast half of what I paid for it eventually, when i'm ready to move on to a new card.
> 
> So basically no water cooling isn't cheap but if you value quiet/ocing potential (temps) it cannot be beat.
> 
> I spent about the same as a 780ti in the end but I have about as much performance and 0 noise.


I quoted and was responding to Jack Mac, as was the9quad. Not sure what made you believe our responses were directed at you? Not trying to be snooty, but we quoted someone else.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *the9quad*
> 
> And that would be my problem right there, I would want to get high end stuff as well and a new case to put it all in, so I'd be looking at roughly the same money as you there. It's just hard to justify that much money (to my wife, matter of fact I wouldn't dare ask, I'd have to sneak it, and it's just too much stuff to sneak lol)


Yes, lol, it's a lot of $$ but I'm hoping it'll be worth it. The way I figure it, if I'm building a loop, I want to do it right.

Regardless, it'll give me something to build. The total cost will go up by another $300 if I mess up de-lidding my CPU, though...


----------



## Derpinheimer

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Yes lol it's a lot of $$ but I'm hoping it'll be worth it. The way I figure is if I'm building a loop, I want to do it right.
> 
> Regardless, it'll give me something to build. The total cost will go up by another $300 if I mess up de-lidding my CPU though...


Exactly. When you read and people say, "Oh, it might only be a 1-2°C difference between X and Y," that's a pretty big deal when you're talking about a 60°C best-case-scenario drop.

I mean, had I not gone with the best stuff right away, I'd only be disappointed and throw away more cash to upgrade it...

SO DON'T CHEAP OUT ON WATERCOOLING, Y'ALL


----------



## HardwareDecoder

Quote:


> Originally Posted by *Derpinheimer*
> 
> Exactly. When you read and people say, "Oh it might only be a 1-2c difference between X and Y" thats a pretty big deal when you are talking about a 60c best case scenario drop.
> 
> I mean, had I not gone with the best stuff right away I'd only be disappointed and throw away more cash to upgrade it..


I didn't buy expensive stuff and my results are phenomenal. Only my GPU block is top of the line.



Just ran that @ 1250/1500; max temp was 45°C.


----------



## Derpinheimer

Nice! What kind of setup? Also, I want your card... how much voltage for 1250? Ever try it under OCCT?

I know you have a "true" 290X... at least I think, right?

Never mind, I see your post now. I really want that card now! xD


----------



## HardwareDecoder

Quote:


> Originally Posted by *Derpinheimer*
> 
> Nice! What kind of setup? Also, I want your card.. how much voltage for 1250? Ever try it under OCCT?
> 
> I know you have a "true" 290x.. at least I think, right?


Just an XSPC RASA kit with a second 240mm radiator and an EK 290X block. Yeah, my card is a legit 290X; got it for $500 from Newegg.

I'm using the ASUS BIOS and I have 1410mV right now (the max you can do without a PT1/PT3 BIOS).

I really want to try a PT1 BIOS with unlocked vcore since my temps are so low, but I don't want to pop my VRMs or something.

I can do 1200 core on the stock BIOS with +100mV; the ASUS one allows +163mV. I don't think I can overclock my memory at all on the stock BIOS, though.

I've used OCCT a few times when I had 2x 7950s; I wasn't a big fan of it.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *HardwareDecoder*
> 
> I didn't buy expensive stuff and my results are phenomenal. Only my gpu block is top of the line.
> 
> 
> 
> just ran that @ 1250/1500 max temp was 45c


What're your CPU temps/speeds? How much rad do you have?

I'm going for 5GHz CPU and 1225MHz GPU.
I don't think I'm gonna go over 1.35V for the CPU, though; over that is the edge of my comfort zone for CPU voltage. I'm currently running 4.6 @ 1.3V after droop all day long, but my H60 lets it max out in the mid-80s temp-wise... A delid with CLU between the IHS and die, some IX between the IHS and block, and a delta-T of (hopefully) under 10°C should yield some interesting results.

I also plan on eventually adding a 2nd 290; my case has room for another ~200mm rad in the future. Gotta love ultra-tower cases.


----------



## HardwareDecoder

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> What're your CPU temps/speeds? How much rad do you have?
> 
> I'm going for 5GHz CPU and 1225 MHz GPU..
> I don't think I'm gonna go over 1.35v for CPU though, over that's the edge of my comfort zone for CPU voltage. I'm currently running 4.6 @ 1.3v after droop all day long but my H60 makes it max in the mid 80's temp wise.... A delid with CLU between the IHS and die with some IX between the IHS and block, combined with a delta t of (hopefully) under 10c should yield some interesting results.
> 
> I also plan on eventually adding a 2nd 290, my case has room for another ~200mm rad in the future. Gotta love ultra-tower cases.


3770K @ 4.8GHz, 1.45 vcore (my chip isn't a great OCer; I might go down to 4.7 soon myself to reduce the voltage some).

I am using CLU on the die also.

I have 2x 240mm rads.

Check out my build log; I'll be adding some final pics soon.

My loop is ugly as heck since I wanted to leave slack in all my tubing for when I change components/do maintenance.

It gives great temps, though.


----------



## Derpinheimer

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Just an xspc rasa kit with a second 240mm radiator and an ek 290x block. Yeah my card is a legit 290x got it for $500 from newegg.
> 
> I'm using the asus bios and I have 1410mv right now (max you can do without a pt1/pt3 bios)
> 
> I really want to try a pt1 bios with unlocked vcore since my temps are so low but I don't want to pop my vrm's or something.
> 
> I can do 1200 core on stock bios with +100mv, asus one is +163. I don't think I can overclock my memory at all on stock bios though.
> 
> I've used occt a few times when I had 2x 7950's I wasn't a big fan of it.


The memory must love the low temps. Mine gets burning hot with passive cooling [universal block].

OCCT GPU is nice not only because it tells you if you have minor errors that you may or may not care about, but because major errors [artifacts] are super easy to spot on truly unstable settings, unlike games or Heaven where they can be hard to confirm and then you end up crashing once a week going ***?

Awesome card though... I can't hit 1200 even with a maxed 1410mV. sooooo jelly









Not that this is a super-technical way to compare rad capacity, but if I multiply mine all out [180x180x45 (x3) + 120x120x80 (x1)] I get about 5.5M mm³. How about yours? Not sure what my general load temp is, but I know I've seen it hit 47°C. Prob OCCT. I can't test it now.

P.S. Before the GPU Tweak update, it let me go all the way to 1800mV??? No one else had that?
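For anyone comparing rad setups the same way, the multiplication above does check out; here's a quick sketch (dimensions in mm, as quoted; note that raw core volume ignores fin density and airflow, so it's only a rough proxy for cooling capacity):

```python
# Rough radiator comparison by core volume (mm^3).
# Dimensions are the ones quoted above: three 180x180x45 rads plus one 120x120x80.
rads = [(180, 180, 45)] * 3 + [(120, 120, 80)]

total_mm3 = sum(w * h * d for (w, h, d) in rads)
print(total_mm3)        # 5526000 -> ~5.5M mm^3, matching the figure above
```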


----------



## HardwareDecoder

Quote:


> Originally Posted by *Derpinheimer*
> 
> Memory must love the low temps. Mine get burning hot with passive cooling [universal block]
> 
> OCCT GPU is nice not only because it tells you if you have minor errors errors that you may or may not care about, but major errors [artifacts] are super east to spot on truly unstable settings, unlike games or heaven where they can be hard to confirm and then you end up crashing once a week going ***?
> 
> Awesome card though.. I cant hit 1200 with maxed 1410mV. sooooo jelly


Yeah, I am getting much better OC results on water. These things love low temps, especially the memory, it seems.

I thought this thing was a dud when I got it too.

I'll have to check out OCCT soon. Usually I just do a few different benchmarks, then play BF4/whatever other games, and I reduce memory/core by 25MHz until I don't crash anymore.

I'm so tempted to use a PT1 BIOS, but I don't want to kill this card since I'm pretty happy with it.


----------



## Forceman

Has anyone else noticed that Afterburner doesn't apply the voltage increase at startup? I just opened my GPU-Z sensors tab and it showed 0.961V (idle, with +50mV applied at startup). Changing the voltage to +100mV, the voltage jumped to 1.061V. Changing it back to +50mV, it now sits at 1.008V. So the original +50mV wasn't effective.

Edit: It appears it does apply at startup, but once the monitor goes to sleep (not the computer) it resets to defaults. Can someone else check and verify this? Apparently I was overclocking with no extra voltage most of the time after all.


----------



## stl drifter

If I put a waterblock on my ASUS R9 290, will it void the warranty?


----------



## ImJJames

The Kraken G10 is in stock, so tempting!!


----------



## Forceman

Quote:


> Originally Posted by *stl drifter*
> 
> If I put a waterblock on my Asus R9 290 will it void the warranty ?


Depends on the company. You'll need to put the stock cooler back on before you RMA it, though.


----------



## stl drifter

That's what I thought. Thanks for the reply.


----------



## jamaican voodoo

Quote:


> Originally Posted by *stl drifter*
> 
> If I put a waterblock on my Asus R9 290 will it void the warranty ?


Yes, pretty much. I put a waterblock on mine; I didn't care too much!!! I'm not trying to set world records by any means, just wanted to keep it cool.


----------



## UNOE

Quote:


> Originally Posted by *velocityx*
> 
> my CF 290 seems to black and red screen every couple of days too.


I'm pretty sure it's drivers, because I have 3 cards and they all seem to do it. Not happy about it, though. Once again I have an AMD launch GPU that won't work right for months after release.


----------



## velocityx

Quote:


> Originally Posted by *UNOE*
> 
> I'm pretty sure its drivers because. I have 3 cards and they all seem to do it. Not happy about it though. Once again I have a AMD launch GPU that won't work for months after release.


Yea, I agree. I will try to get some more juice into my CPU, as was suggested; however, it is otherwise 24/7 stable in BF3 and BF4, and I didn't have these issues with this CPU with my CF 6970s. The only thing that changed was the cards; since I've had them I get those black and red screens. That said, I have had it maybe 6 times total, at random it seems.


----------



## devilhead

So, my experience with 3x XFX 290s: the first card (78.6% ASIC) passes Valley at 1200/1250 with +160mV, with artifacts; the second (74.5% ASIC) passes Valley at 1200/1250 with +130mV; and the third XFX 290 is not the best... if I put more than +60mV on it, or take the memory over stock (overclock over 1300), I get lines and artifacts. The max I can reach with that card is 1130/1250.


----------



## chiknnwatrmln

Anybody here using the newest Beta drivers?

I updated yesterday and my 3DMark11 GPU score has dropped ~400 points... What the heck? Now my overall score is under 15k again.

Gonna try rolling back drivers.

The GPU score before was ~17.4k; now it's around 17k.


----------



## head9r2k

@chiknnwatrmln

Did you disable tessellation in CCC?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *head9r2k*
> 
> @chiknnwatrmln
> 
> Disabled Tesslation in CCC ?


I never disable tess; it causes misleading scores.

Re-ran the bench a couple of times; the highest GPU score I can get now is 17.2k.


----------



## jerrolds

Quote:


> Originally Posted by *Derpinheimer*
> 
> Those ones dont really need any cooling. A tiny 15x15x8mm passive heatsink keeps them under 72c. 14x14x14 keeps em under 68.
> 
> Add a 600rpm 90mm fan and that lowers it a couple more degrees.
> 
> As long as there is something on them I wouldnt worry at all. Or just a draft going over them is enough.
> 
> The long line of VRMs is the one that burns up when overclocking.


This. The cluster of 3 doesn't really heat up; you can just passively cool it with a bit of airflow. It's the line of VRMs that burns up pretty quick. I'm thinking of getting a Kraken, assuming I don't sell this extra XFX 290X.


----------



## chiknnwatrmln

This is the highest I can get now: 15001 (GPU 17210, Physics 11251)
http://www.3dmark.com/3dm11/7661511

Compared with this: 15115 (GPU 17436, Physics 11225)
http://www.3dmark.com/3dm11/7540268

What the heck is going on here? I re-installed the 9.2 driver but I'm still not getting the same performance. And 3DMark won't add any of my results to my results list. Ugh, FM's software is so buggy sometimes.
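For scale, the gap between those two GPU scores is small in relative terms, which fits a driver regression or brief artifact stalls rather than a dying card; a quick check:

```python
# Percent change between the two GPU scores linked above.
old_gpu, new_gpu = 17436, 17210
drop_pct = (old_gpu - new_gpu) / old_gpu * 100
print(round(drop_pct, 2))   # ~1.3% GPU-score regression
```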


----------



## kizwan

Quote:


> Originally Posted by *Jack Mac*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HardwareDecoder*
> 
> I really don't see why anyone would buy one of these and not put it underwater.
> 
> 
> 
> Because it's too expensive and defeats the purpose of buying these cards for price/performance.

I agree, unless you already own a loop & only need to add a water block for the GPU. I personally think, if you want, you can always water cool only the GPU; that way it's not going to cost you much.
Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Anybody here using the newest Beta drivers?
> 
> I updated yesterday and my 3dMark11 scores have dropped ~400 pts in the GPU category... What the heck? Now my overall score is under 15k again.
> 
> Gonna try rolling back drivers.
> 
> GPU score before was ~17.4k, now it's around 17k.


I think I'm using the latest one, 13.11 Beta 9.5. However, I didn't use the previous driver, so I don't know if there is anything wrong with the latest one.


----------



## The Storm

I am currently running 2 Sapphire R9 290s with stock cooling as of now; my EK blocks are on their way. Is it normal for my top card to idle @ 90°C? The bottom card is idling at 45°C. The case is an NZXT Switch 810.


----------



## Asrock Extreme7

Well, just finished my loop and got a 290 that unlocked to a 290X, yes!! The chip says 215-0852000, week 1340.

DSC_0015.JPG 1905k .JPG file

OC: 1200 core / 1500 mem at 1.412V in GPU Tweak; haven't tried higher on the memory in Valley.

Screenshot11.png 126k .png file

so far very happy

DSC_0022.JPG 2209k .JPG file


DSC_0023.JPG 2527k .JPG file

But is flashing to the 290X BIOS safe, or will the card die in 2 weeks?????


----------



## Jack Mac

Quote:


> Originally Posted by *The Storm*
> 
> I am currently running 2 sapphire r9-290's stock cooling as of now, my EK blocks are on there way. Is it normal for my top card to idle @ 90c? Bottom card is idling at 45c. Case is an nzxt switch 810.


No. What I do to make my card idle nice and cool is decrease the core and memory to their minimum clocks in AB, decrease the voltage by 70mV, and up the fan speed from 20% to 35%; it stays below 55°C at idle.


----------



## UNOE

I'm going to be running two PSUs for 4 cards. Can you guys look at this and tell me if you think this will work alright?

I will be using Dual PSU adapter from newegg LINK

*Corsair HX620 Load*
Motherboard 24pin
CPU 6 pin
PCIE power (Sata Connector on Gigabyte boards) *??*
All other Drives and and fans

*Corsair AX1200 Load*
4x 8pin PCIe
4x 6pin PCIe

So basically everything will be running off the HX620 except for the video card PCIe cables, which will all be powered by the AX1200.

The big question is mainly about the PCIe power cable to the motherboard's supplemental PCIe power, which uses a SATA connector to give the slots more power.
I'm wondering: should I use that connector from the AX1200 so it's on the same supply as the video cards, or should I use the SATA PCIe connector from the HX620 so it's on the same power as the rest of the motherboard?
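A rough sanity check on how the load would split, assuming a worst-case draw of ~300W per 290-class card (an assumption, not a measured figure) and the standard 75W-per-slot PCIe limit:

```python
# Back-of-the-envelope load split for the dual-PSU plan above.
# ~300W per card is an assumed worst case, not a measured number.
cards = 4
watts_per_card = 300      # assumed sustained per-card draw
pcie_slot_watts = 75      # max power a PCIe slot can feed from the motherboard

# The AX1200 carries only the 6/8-pin PCIe leads; the slots' 75W share
# comes through the motherboard, i.e. whichever PSU feeds the board.
ax1200_load = cards * (watts_per_card - pcie_slot_watts)
slot_load = cards * pcie_slot_watts

print(ax1200_load)   # 900W on the AX1200 -> within its 1200W rating
print(slot_load)     # 300W of slot power, on top of CPU/board/drives
```

Under these assumptions the AX1200 has plenty of headroom either way; the slot power is the piece that lands on whichever PSU drives the supplemental connector.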


----------



## Forceman

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> is flashing to 290x bios safe or will it go in 2 weeks ?????


Mine's been fine for a month.


----------



## MarvinDessica

Any word on non-reference 290Xs? I haven't been able to keep up because of the holidays, but I'm about ready to just get a 780 Ti.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *The Storm*
> 
> I am currently running 2 sapphire r9-290's stock cooling as of now, my EK blocks are on there way. Is it normal for my top card to idle @ 90c? Bottom card is idling at 45c. Case is an nzxt switch 810.


Absolutely not. 90°C is not an idle temp; that's a full-load temp. What in God's name is it doing that for?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> I agree unless already own one & only need to add water block for the GPU. I personally think if you want, you can always water cool only the GPU. This way it's not going to cost you much.
> I think I'm using latest one 13.11 beta 9.5. However I didn't use previous driver, so I don't know if there is anything wrong with the latest driver.












I went back to 9.4 because I was getting a few BSODs and black/red screen freezes with CF for some bizarre reason.


----------



## ImJJames

Quote:


> Originally Posted by *MarvinDessica*
> 
> Any word on non-reference 290x? I haven't been able to keep up because of the holidays but I'm about ready to just get a 780ti


January


----------



## chiknnwatrmln

I think I know why my card isn't scoring 17.4k anymore...

When I got my highest score (17.4k GPU, 15115 total) I was on the stock cooler at 1200/6500 @ 1.412 in GT.

I'm now running the Gelid; same clocks and voltage, but my core is cooler and my VRMs are hotter. I seem to artifact a bit more with the Gelid, getting checkerboard artifacts and eventually the screen flashing to black for a frame or two after a short while. This didn't happen on the stock cooler, which leads me to believe it's being caused by higher VRM temps (>85°C). These brief black screens must somehow lower my score a bit (~200 points).

Once I get this card under water and the temps under control, I'll re-run the test and see what happens.


----------



## broken pixel

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I went back to 9.4 cause i was getting a few bsods and black and red screen freeze with CF for some bizzare reason


I got one of the red screens with the 13.11 certified driver, with my memory clocked to only 1350MHz. I'm going back to the betas since I'm finished benching 3DMark11.


----------



## The Storm

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Absolutely not 90c is not idle temp thats a full load temp . What in gods name is it doing that for


I have no idea why. At first I thought it might have just been that card, so I swapped them around; the bottom one that was idling at 45 is now up top and, boom, it idles exactly the same as the other, 88-90, and the bottom one is at 45. I have a Z87 board, so there is a gap between the 2 cards, so I don't know. My EK blocks can't get here fast enough; apparently Performance-PCs aren't open on the weekend, so they won't get shipped till Monday.

Quote:


> Originally Posted by *Jack Mac*
> 
> No, what I do to make my card idle nice and cool is decrease core and memory to their minimum clocks in AB, decrease voltage by 70 and up the fan speed from 20-35% and it stays below 55C at idle.


Thanks I will try that.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *The Storm*
> 
> I have no idea why. At first I thought it might have just been that card, so I swapped them around; the bottom one that was idling at 45 is now up top and boom, it idles exactly the same as the other, 88-90, and the bottom one is at 45. I have a Z87 board so there is a gap between the 2 cards, so I don't know. My EK blocks can't get here fast enough; apparently Performance-PCs aren't open on the weekend so they won't get shipped till Monday.
> Thanks I will try that.


Try putting a fan on the side of your case, or run your PC without the side door?

Also make a fan profile if you haven't already, although if you're getting blocks then it's just a temporary problem.

My block shipped today, according to FrozenCPU.

But yeah idling at almost 90c is not good. I don't even wanna know what your load temps are...


----------



## The Storm

Quote:


> Originally Posted by *Jack Mac*
> 
> No, what I do to make my card idle nice and cool is decrease core and memory to their minimum clocks in AB, decrease voltage by 70 and up the fan speed from 20-35% and it stays below 55C at idle.


This worked just like you said, idles 55 now, Thank you!!
Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Try putting a fan on the side of your case, or run your PC without the side door?
> 
> Also make a fan profile if you haven't already, although if you're getting blocks then it's just a temporary problem.
> 
> My block shipped today, according to FrozenCPU.
> 
> But yeah idling at almost 90c is not good. I don't even wanna know what your load temps are...


That's with the side door off. Man, these things are hot!!!


----------



## CHUNKYBOWSER

Got my Sapphire R9 290 BF4 edition today. Unfortunately, I can't unlock it.

http://www.techpowerup.com/gpuz/ebc4k/

Stock cooling! Will have it under water later...


----------



## Forceman

Quote:


> Originally Posted by *The Storm*
> 
> I have no idea why. At first I thought it might have just been that card, so I swapped them around; the bottom one that was idling at 45 is now up top and boom, it idles exactly the same as the other, 88-90, and the bottom one is at 45. I have a Z87 board so there is a gap between the 2 cards, so I don't know. My EK blocks can't get here fast enough; apparently Performance-PCs aren't open on the weekend so they won't get shipped till Monday.


Is the top card going to 2D clocks?


----------



## spitty13

Would two 290s be bottlenecked by an overclocked 2500K?


----------



## The Storm

Quote:


> Originally Posted by *Forceman*
> 
> Is the top card going to 2D clocks?


How do i check?


----------



## Sazz

LMAO! Looks like I am going back to a 290X! I was able to sell my 290 for $650 on eBay (ended up getting $573 after the fees and shipping)... and I still have two BF4 codes to sell. Got the 290 for $400, hah! God damn, I don't even know how it sold, but it did. I will be getting my new 290X on Tuesday or Wednesday xD


----------



## Durvelle27

Quote:


> Originally Posted by *Sazz*
> 
> LMAO! Looks like I am going back to a 290X! I was able to sell my 290 for $650 on eBay (ended up getting $573 after the fees and shipping)... and I still have two BF4 codes to sell. Got the 290 for $400, hah! God damn, I don't even know how it sold, but it did. I will be getting my new 290X on Tuesday or Wednesday xD


Are you serious. I only got $500 for my 290.


----------



## Sazz

Quote:


> Originally Posted by *Durvelle27*
> 
> Are you serious. I only got $500 for my 290.


Yeah, I just listed it overpriced for no reason. I was just like "Wth, might as well list it along these overpriced ones too," then while at work I saw the notification on my phone and I was like "huh? it sold??", and then I saw the instant payment kick in and I was like "ok... weird... but I ain't complaining". Basically I've got three extra BF4 codes (one with the one I just purchased and two that I already have from my first 290X and the 290 that I sold).

So yeah, S> BF4 codes LOLOL!!!

I will probably give one out to a friend of mine then sell the remaining two once the BF4 code bundled 290/290X is gone (too many people selling BF4 codes right now)


----------



## The Storm

I have 2 codes sitting on my desk right now as well; too bad I pre-ordered the game long ago. On another note, both my Sapphire R9 290s unlocked to 290Xs and are stable!!


----------



## pkrexer

Quote:


> Originally Posted by *Forceman*
> 
> Has anyone else noticed that Afterburner doesn't apply the voltage increase at startup? I just opened my GPU-Z sensors tab and it shows 0.961V (idle +50mV applied at startup). Changing the voltage to +100mV and the voltage jumped to 1.061V. Changing it back to +50 and now it is sitting at 1.008. So the original +50 wasn't effective.
> 
> Edit: So it appears it does apply at startup, but once the monitor goes to sleep (not the computer) it resets to defaults. Can someone else check and verify this? Aapparently I was overclocking with no voltage most of the time after all.


Yep, having the same issue. It's annoying that I have to apply my AB profile every time I go to use my computer.
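For anyone else hit by this: one workaround is a small batch file that reapplies the saved profile, fired from Task Scheduler on the "workstation unlock" or "resume" event. This is a sketch; the `-Profile1` command-line switch and the install path below are assumptions you should check against your own Afterburner install and the profile slot you actually saved your overclock to.

```shell
@echo off
rem Reapply MSI Afterburner profile slot 1 (path and switch are assumptions;
rem adjust to your install directory and your saved profile number).
"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" -Profile1
```

Scheduling this on unlock/resume saves you from opening Afterburner by hand every time the monitor sleep resets the voltage.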


----------



## Durvelle27

Quote:


> Originally Posted by *Sazz*
> 
> Yeah, I just listed it overpriced for no reason. I was just like "Wth, might as well list it along these overpriced ones too," then while at work I saw the notification on my phone and I was like "huh? it sold??", and then I saw the instant payment kick in and I was like "ok... weird... but I ain't complaining". Basically I've got three extra BF4 codes (one with the one I just purchased and two that I already have from my first 290X and the 290 that I sold).
> 
> So yeah, S> BF4 codes LOLOL!!!
> 
> I will probably give one out to a friend of mine then sell the remaining two once the BF4 code bundled 290/290X is gone (too many people selling BF4 codes right now)


Man wish I did that


----------



## Forceman

Quote:


> Originally Posted by *The Storm*
> 
> How do i check?


Not sure with Crossfire, but GPU-Z should show it in the sensors tab if there's a way to check individual cards.


----------



## The Storm

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Try putting a fan on the side of your case, or run your PC without the side door?
> 
> Also make a fan profile if you haven't already, although if you're getting blocks then it's just a temporary problem.
> 
> My block shipped today, according to FrozenCPU.
> 
> But yeah idling at almost 90c is not good. I don't even wanna know what your load temps are...


Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Absolutely not 90c is not idle temp thats a full load temp . What in gods name is it doing that for


Funny thing is, now that I have unlocked both my cards to 290Xs they work flawlessly, and the top one idles @ 50C and the bottom at 40C.


----------



## the9quad

Quote:


> Originally Posted by *The Storm*
> 
> Funny thing is, now that I have unlocked both my cards to 290Xs they work flawlessly, and the top one idles @ 50C and the bottom at 40C.


That's normal.


----------



## UNOE

Quote:


> Originally Posted by *UNOE*
> 
> I'm going to be running two PSU's for 4 cards. Can you guys look at this tell me if you think, if this will work alright.
> 
> I will be using Dual PSU adapter from newegg LINK
> 
> *Corsair HX620 Load*
> Motherboard 24pin
> CPU 6 pin
> PCIE power (Sata Connector on Gigabyte boards) *??*
> All other Drives and and fans
> 
> *Corsair AX1200 Load*
> 4x 8pin PCIe
> 4x 6pin PCIe
> 
> So basically everything will be running off the HX620 except for the Video card PCIe cables which will all be powered by the AX1200.
> 
> The big question is mainly about PCIe power cable to the motherboard supplemental PCIE power. That uses the Sata cable to give the slots more power.
> I'm wondering should I use that connector from the AX1200 so its the same as the videocards. Or should I use the sata pcie from the HX620 so it has same power as the rest of the motherboard.


Bump ^


----------



## The Storm

Quote:


> Originally Posted by *the9quad*
> 
> That's normal.


Before I unlocked them the top one was 90c at idle.


----------



## skupples

Quote:


> Originally Posted by *MarvinDessica*
> 
> Any word on non-reference 290x? I haven't been able to keep up because of the holidays but I'm about ready to just get a 780ti


dat kingpin. 2ghz on ln2 =


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *The Storm*
> 
> Funny thing is, now that I have unlocked both my cards to 290Xs they work flawlessly, and the top one idles @ 50C and the bottom at 40C.


Good stuff man









Mine are locked runs standard coolers







but I've kept the BIOS stock and I did this......

HOMECINEMA-PC [email protected]@2400 CF 290's [email protected] *P26246*



http://www.3dmark.com/3dm11/7662579

and



http://www.3dmark.com/fs/1304379


----------



## RAFFY

Quote:


> Originally Posted by *skupples*
> 
> dat kingpin. 2ghz on ln2 =


LINK!


----------



## CHUNKYBOWSER

Quote:


> Originally Posted by *UNOE*
> 
> Bump ^


You may want to ask this in the PSU section, I'm sure someone there will be much more knowledgeable. In my mind, I don't think it matters which way you do it.


----------



## prostreetcamaro

Quote:


> Originally Posted by *y2kcamaross*
> 
> You can add me here in a bit, will post screen shot when I get home, my back up betty rig now has 1 xfx 290 unlocked to 290x with an accelero xtreme 3 for cooling, one more to come soon










Have you had a chance to play around with her yet? Sorta kicking myself for selling it already lmao.


----------



## UNOE

Quote:


> Originally Posted by *CHUNKYBOWSER*
> 
> You may want to ask this in the PSU section, I'm sure someone there will be much more knowledgeable. In my mind, I don't think it matters which way you do it.


Thanks, I just hate making a thread for a small question like this. But I will have to if no one else with more experience with this type of thing chimes in.


----------



## RAFFY

Quote:


> Originally Posted by *UNOE*
> 
> Thanks


PM Shilka


----------



## The Storm

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Good stuff man
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Mine are locked runs standard coolers
> 
> 
> 
> 
> 
> 
> 
> but I've kept the BIOS stock and I did this......
> 
> HOMECINEMA-PC [email protected]@2400 CF 290's [email protected] *P26246*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/7662579
> 
> and
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/1304379


That's awesome man!! My new EK blocks will arrive this week so I will test mine out. I had been disappointed in the 290s until tonight. My idle temps have been crazy high, and in BF4 my frame rates were actually lower than my 7950s, due to the top card throttling something terrible. I decided to take the plunge and unlock them tonight using the Sapphire 290X BIOS; now the idle temps are where they should be, BF4 gets me 60+ fps more than when they were 290s, and there's 0 throttling... it's as if the 290 BIOS was really hurting these things. Both cards are Sapphires btw.


----------



## CHUNKYBOWSER

Quote:


> Originally Posted by *The Storm*
> 
> That's awesome man!! My new EK blocks will arrive this week so I will test mine out. I had been disappointed in the 290s until tonight. My idle temps have been crazy high, and in BF4 my frame rates were actually lower than my 7950s, due to the top card throttling something terrible. I decided to take the plunge and unlock them tonight using the Sapphire 290X BIOS; now the idle temps are where they should be, BF4 gets me 60+ fps more than when they were 290s, and there's 0 throttling... it's as if the 290 BIOS was really hurting these things. Both cards are Sapphires btw.


Wish my Sapphire unlocked. So disappointed. I should just be happy that I got Hynix memory.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *The Storm*
> 
> That's awesome man!! My new EK blocks will arrive this week so I will test mine out. I had been disappointed in the 290s until tonight. My idle temps have been crazy high, and in BF4 my frame rates were actually lower than my 7950s, due to the top card throttling something terrible. I decided to take the plunge and unlock them tonight using the Sapphire 290X BIOS; now the idle temps are where they should be, BF4 gets me 60+ fps more than when they were 290s, and there's 0 throttling... it's as if the 290 BIOS was really hurting these things. Both cards are Sapphires btw.
> 
> Quote:
> 
> 
> 
> Originally Posted by *CHUNKYBOWSER*
> 
> Wish my Sapphire unlocked. So disappointed. I should just be happy that I got Hynix memory.

Thanks man








One's a Sapphy and the other is a Giga.
Don't wanna flash; not up for the bitter disappointment


----------



## kizwan

Quote:


> Originally Posted by *UNOE*
> 
> I'm going to be running two PSU's for 4 cards. Can you guys look at this tell me if you think, if this will work alright.
> 
> I will be using Dual PSU adapter from newegg LINK
> 
> *Corsair HX620 Load*
> Motherboard 24pin
> CPU 6 pin
> PCIE power (Sata Connector on Gigabyte boards) *??*
> All other Drives and and fans
> 
> *Corsair AX1200 Load*
> 4x 8pin PCIe
> 4x 6pin PCIe
> 
> So basically everything will be running off the HX620 except for the Video card PCIe cables which will all be powered by the AX1200.
> 
> The big question is mainly about PCIe power cable to the motherboard supplemental PCIE power. That uses the Sata cable to give the slots more power.
> I'm wondering should I use that connector from the AX1200 so its the same as the videocards. Or should I use the sata pcie from the HX620 so it has same power as the rest of the motherboard.


You can use either one. If I were you I would use the SATA power connector (for EZ_PLUG_1) from the AX1200, for the sole reason that I'd want the GPUs drawing power from the same PSU.
Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I went back to 9.4 'cause I was getting a few BSODs and black and red screen freezes with CF for some bizarre reason


Really? Is it random or...? I haven't experienced any problems yet.


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Thanks man
> 
> 
> 
> 
> 
> 
> 
> 
> One's a Sapphy and the other is a Giga.
> Don't wanna flash; not up for the bitter disappointment


Both of my Sapphys are locked.







...but I never planned to unlock my cards anyway, so it's fine. One card has Hynix & the other has Elpida.



[EDIT] Oops! Sorry, double post.


----------



## HOMECINEMA-PC

Waterblock the Hynix first, I believe that is your uber card..... good mem clocks









I'm getting 'Titan' performance on the stock BIOS


----------



## MrWhiteRX7

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Hell yeah


Your cards are still on the stock cooler, right? How close are they; is there a PCIe space in between them or are they close together? I'm about to have 3 and won't have the waterblocks for another 3 weeks, it seems.







Hoping they won't downclock on me


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Your cards are still on stock cooler right? How close are they, is there a pcie space in between them or are they close together? I'm about to have 3 and won't have the waterblocks for another 3 weeks it seems
> 
> 
> 
> 
> 
> 
> 
> Hoping they won't downclock on me


R4F dude











Got 760 Hawks in that pic


----------



## HardwareDecoder

Is there a BIOS editor for the R9 series yet? I need to set my clocks in the BIOS; I'm tired of having to run ASUS GPU Tweak.


----------



## Sazz

I don't know what the problem is, probably just the game itself. But in BF4 I can feel a lot of frame skips; my frame rate doesn't dip at all, but when you're walking around you can definitely feel the frame skipping. Any other game works just fine...

Knowing the kind of criticism BF4 is getting right now, I wouldn't be surprised if it's a game issue..


----------



## MrWhiteRX7

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> R4F dude
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Got 760 Hawks in that pic


Yea I'm afraid mine being so close together on my R4E without blocks may not bode well lol, oh well I'm going to try to push them anyway! lol


----------



## the9quad

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Your cards are still on stock cooler right? How close are they, is there a pcie space in between them or are they close together? I'm about to have 3 and won't have the waterblocks for another 3 weeks it seems
> 
> 
> 
> 
> 
> 
> 
> Hoping they won't downclock on me


Here is how close mine are, and no throttling. Middle card gets about 10c warmer under load and runs about 10% fan speed faster. You just have to set up a custom fan profile in AB until you get your blocks, but on stock it's fine.


----------



## Joeking78

That's my Trifire setup... they are a little closer together with the Rampage board, a paper-thin gap between each card. (Pics are from when I had my MSI GD65.)

When I'm benching I run 100% fan speed and the cards get to around 70-75C; that's with +200mV in Trixx and 1230/1350.

I have to run my AC full blast to get those temps though, and my case fans are running slow at the moment. I used to be able to control them with my Rampage Extreme but I can't with my Formula.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *the9quad*
> 
> Here is how close mine are, and no throttling. Middle card gets about 10c warmer under load and runs about 10% fan speed faster. You just have to set up a custom fan profile in AB until you get your blocks, but on stock it's fine.


Quote:


> Originally Posted by *Joeking78*
> 
> 
> 
> 
> That's my Trifire setup... they are a little closer together with the Rampage board, a paper-thin gap between each card. (Pics are from when I had my MSI GD65.)
> 
> When I'm benching I run 100% fan speed and the cards get to around 70-75C; that's with +200mV in Trixx and 1230/1350.
> 
> I have to run my AC full blast to get those temps though, and my case fans are running slow at the moment. I used to be able to control them with my Rampage Extreme but I can't with my Formula.


Thanks guys!!!


----------



## rdr09

Quote:


> Originally Posted by *spitty13*
> 
> Would two 290s be bottlenecked by a overclocked 2500k?


yes.


----------



## HOMECINEMA-PC

290 CF usage running COD Ghosts at max settings in game




Stock BIOSes and coolers
Never seen mem usage like this before, AWESOME cards


----------



## psyside

Subscribe to this channel; they are doing amazing split-screen comparison videos.


----------



## rdr09

Quote:


> Originally Posted by *psyside*
> 
> 
> 
> 
> 
> Subscribe to this channel; they are doing amazing split-screen comparison videos.


+rep.


----------



## rdr09

Quote:


> Originally Posted by *Sazz*
> 
> I don't know what the problem is, probably just the game itself. But in BF4 I can feel a lot of frame skips; my frame rate doesn't dip at all, but when you're walking around you can definitely feel the frame skipping. Any other game works just fine...
> 
> Knowing the kind of criticism BF4 is getting right now, I wouldn't be surprised if it's a game issue..


On my i7 as well. I unparked my cores and problem solved.
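For anyone wanting to try the same fix: core parking can be relaxed from an elevated command prompt with `powercfg`, without registry editing. A sketch of the usual approach, assuming the `CPMINCORES` alias (the "processor performance core parking min cores" setting) is available on your Windows version:

```shell
:: Keep 100% of cores unparked on the current power plan (run as admin).
powercfg /setacvalueindex SCHEME_CURRENT SUB_PROCESSOR CPMINCORES 100
:: Re-apply the active scheme so the change takes effect immediately.
powercfg /setactive SCHEME_CURRENT
```

Run `powercfg /aliases` first to confirm the alias exists on your system; the same value can also be set through third-party unparking utilities.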


----------



## Widde

This is bad







I just want to push the card further and further







Can't settle with what I have ^^ Thinking of getting a custom loop, some blocks and another 290 and seeing what I can get out of it, but I'm afraid my 4.5GHz 3570K is gonna bottleneck everything


----------



## CriticalHit

Quote:


> Originally Posted by *Widde*
> 
> This is bad
> 
> 
> 
> 
> 
> 
> 
> I just want to push the card further and further
> 
> 
> 
> 
> 
> 
> 
> Cant settle with what I have ^^ Thinking of getting a custom loop some blocks and another 290 and see what I can get out of it, But I'm afraid my 4.5ghz 3570k is gonna bottleneck everything


Hey Widde... just know that if you have awesome overclockers, water or custom cooling will really set them free, i.e. if you can push a really high frequency on the stock cooler while being heat limited, then a custom loop will let them roar...

However, if you're finding your cards max out in the 1100s on the stock cooler even at max fan speed, water will do little more than allow them to run cooler.

Worth considering your reasons for WC in that respect... my cards are in a loop but I don't even bother overclocking them... too unstable for minimal gains...


----------



## JCviggen

I got an XFX 290 a few days ago. Without going through this whole topic: is memory voltage linked to core voltage somehow?

Really effing annoying how I can't undervolt the core much before the memory starts crapping out... any hard mods for this out there? From what I've seen there's no chance of a software solution on the reference boards.


----------



## Redeemer

I am excited guys available for pre-order!

http://products.ncix.com/detail/msi-radeon-r9-290x-gaming-twin-frozr-iv1040mhz-4gb-5ghz-gddr5-hdmi-dp-2xdvi-pci-e-video-card-9c-92892.htm


----------



## Widde

Quote:


> Originally Posted by *CriticalHit*
> 
> Hey Widde... just know that if you have awesome overclockers, water or custom cooling will really set them free, i.e. if you can push a really high frequency on the stock cooler while being heat limited, then a custom loop will let them roar...
> 
> However, if you're finding your cards max out in the 1100s on the stock cooler even at max fan speed, water will do little more than allow them to run cooler.
> 
> Worth considering your reasons for WC in that respect... my cards are in a loop but I don't even bother overclocking them... too unstable for minimal gains...


Was thinking of a custom loop to get everything to run cooler. And is my CPU bottlenecking my current rig? 3570K at 4.5GHz.







Want to get another 290 and a custom water loop; is it worth getting a 3770K or do I have to switch to a 2011 CPU?


----------



## The Storm

This is a stock run no overclock on my unlocked 290's
http://www.3dmark.com/3dm11/7665394


----------



## Jack Mac

First world problems: My Sapphire 290 doesn't unlock.


----------



## Rar4f

My R9 290 is not showing up in Device Manager > Display adapters, only "Standard VGA adapter".

I have not installed the GPU driver yet, nor any BIOS software for it (if needed).

Working on installing other drivers first. Having a problem with the mobo chipset driver, so working on that.

UD3H Z87, Intel 4770K

So what do you guys think is the reason it's not showing up? The fans are spinning and the GPU seems to be working.


----------



## the9quad

Quote:


> Originally Posted by *Rar4f*
> 
> My r9 290 is not showing up in Device manager>Display adapters
> Only "Standard VGA adapter"
> 
> I have not installed gpu driver yet nor bios software for it (if needed)
> 
> Working on installing other drivers first. Having a problem with mobo chipset driver so working on that.
> 
> UD3h z87, intel 4770k
> 
> So what do you guys think is reason its not showing up? The fans are spinning and the gpu seems to be working.


The fact you haven't installed the drivers is what's doing that.


----------



## Rar4f

GPU drivers or motherboard ones?


----------



## quakermaas

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 290 CF usage running COD Ghosts at max settings in game
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Stock BIOSes and coolers
> Never seen mem usage like this before, AWESOME cards


Wow, that is some memory usage alright


----------



## stickg1

Quote:


> Originally Posted by *Rar4f*
> 
> gpu drivers or motherboard ones?


Your GPU will say Standard VGA Adapter until you install the drivers for it.


----------



## Rar4f

Attempting to install Catalyst, I got an error:
Application install: install package failure

It was with the driver from Sapphire's site.

EDIT: Going to try AMD's site, as that was suggested when I searched for a solution.


----------



## RAFFY

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 290 CF usage running COD Ghosts at max settings in game
> 
> 
> 
> 
> Stock BIOSes and coolers
> Never seen mem usage like this before, AWESOME cards


Just ignore those readings. COD: Ghosts is so messed up right now it isn't even funny. I don't think they have fixed the SLI/CFX problem in the game yet either. I run that game maxed at 1440p, which shouldn't be an issue, and I get "lag" all the same even if I am on low settings.


----------



## devilhead

So I tested my XFX with the waterblock, +100 at 1200/1400, in Battlefield 4 for a couple of rounds







It works perfectly, but I did get a black screen once for 1 second, then everything was OK again


----------



## Heinz68

Quote:


> Originally Posted by *Rar4f*
> 
> Attempting installing of catalyst i got error:
> Application install: install package failure
> 
> It was with driver from Sapphire site.
> 
> EDIT: Going to try AMD's site as that was suggested when i searched for a solution.


The drivers are also listed on the first page of this thread.


----------



## Rar4f

What difference does it make if I download the drivers here or from AMD's site?

I tried AMD's 13.11 driver, released 11/18/2013 for the R9 290 series, and it gave me the same language pack error message.


----------



## quakermaas

Quote:


> Originally Posted by *Rar4f*
> 
> What difference does it make if i download drivers here or from AMD's site.
> 
> I tried AMD's driver 13.11 released 11/18/2013 for R9 290 series, and it gave me same language pack error message.


I always seem to get an error when I install an AMD driver, but it runs fine anyway, and when I look at the report I can't see anything that failed.


----------



## Rar4f

Well for me the device manager still says "Standard VGA adapter"


----------



## cloppy007

Are you all experiencing this problem when the monitor turns off?


----------



## Rar4f

Nope. The R9 290 just does not show anything on the monitor. I've used an HDMI cable on the HDMI port of the motherboard and that works without problems.


----------



## Pesmerrga

Quote:


> Originally Posted by *Rar4f*
> 
> Nope. The R9 290 just does not show anything on the monitor. I've used an HDMI cable on the HDMI port of the motherboard and that works without problems.


I think maybe your motherboard is booting using your on-board video (Intel or AMD) instead of the 290 installed in the PCI-E slot. You have to change the BIOS setting. You most likely have Intel HD video, which is why CCC is not seeing your 290. I can't say exactly where to change the setting as I have no idea what motherboard or BIOS you are running.

Oh, and you should probably put your rig in your sig so it is easier for people to help you.


----------



## cloppy007

Quote:


> Originally Posted by *Rar4f*
> 
> Nope. The R9 290 just does not show anything on the monitor. I've used an HDMI cable on the HDMI port of the motherboard and that works without problems.


Quote:


> Originally Posted by *Pesmerrga*
> 
> I think maybe your motherboard is booting using your on-board video (Intel or AMD) instead of the 290 installed in the PCI-E slot. You have to change the BIOS setting. You most likely have Intel HD video, which is why CCC is not seeing your 290. I can't say exactly where to change the setting as I have no idea what motherboard or BIOS you are running.


Sorry, I did not explain this properly. Sometimes when Windows turns off the screen after X minutes idling (that power saving feature, you know), it shows this when I move the mouse to wake the monitor up.

Edit: never mind, I realised that you are talking about something different


----------



## Rar4f

I did not have the 6 pin cable connected. So problem solved. Thank you


----------



## Pesmerrga

Quote:


> Originally Posted by *Rar4f*
> 
> I did not have the 6 pin cable connected. So problem solved.


LOL.. Totally did not see that coming..


----------



## Rar4f

I still get the error, and the system has not registered the GPU in Device Manager. Still the same "language pack" error or something.
I hoped it was a minor error that did not matter, but Device Manager tells a different story.
EDIT: Tried another driver (probably Sapphire's) and it seems to be working now.

The setup did not give me the error it usually does after running for a short while. Here's to hoping it won't.


----------



## skupples

Quote:


> Originally Posted by *RAFFY*
> 
> LINK!


Quote:


> Originally Posted by *OccamRazor*
> 
> Who wants to drool?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> TIN sent me something...
> Something called: *K|NGP|N*
> 
> ON AIR CPU...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Firestrike single
> 
> 
> 
> Firestrike Extreme
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ed
> 
> (Skyn3t Team)


All that within days of the card coming out; it wasn't a pre-release canned submission...


----------



## pkrexer

Quote:


> Originally Posted by *cloppy007*
> 
> Are you all experiencing this problem when the monitor turns off?


Are you overclocking with AB? If so, your voltage is probably dropping when it detects your monitors turn off.


----------



## cloppy007

Quote:


> Originally Posted by *pkrexer*
> 
> Are you overclocking with AB? If so, your voltage is probably dropping when it detects your monitors turn off.


No! Everything stock.


----------



## Rar4f

I just downloaded Catalyst, and guess what? When I wanted to "check for driver update" it sent me to AMD's driver page.

Then what's the point of this whole software? Meh O_O

EDIT:
Quick question: how do I remove the black frame?

My GPU is registered as a Radeon GPU in Device Manager and I am pretty confident I have installed the driver, but that black frame is still there.

I am using an HDMI cable; perhaps it's the HDMI cable that's causing it?


----------



## Forceman

Quote:


> Originally Posted by *JCviggen*
> 
> I got an XFX 290 a few days ago, without going through this whole topic, is memory voltage linked to core voltage somehow?
> 
> Really effing annoying how I can't undervolt the core much before the memory starts crapping out...any hard mods for this out there? From what i've seen there's no chance of a software solution on the reference boards.


It appears that it is connected, at least for overclocking. I get much better memory overclocks once I increase the core voltage. My guess is that the limiting factor is the new memory controller, not the memory itself, which would explain why bumping the voltage may help.


----------



## JCH979

Quote:


> Originally Posted by *Rar4f*
> 
> I just downloaded catalyst, and guess what? When i wanted to "check for driver update" it sends me to driver page of AMD.
> 
> Then whats point of this whole software? Meh O_O
> 
> EDIT:
> Quick question: How do i remove the black frame?
> 
> My gpu is registered as a Radeon gpu in device manager and i am pretty confident i have installed the driver, but still that black frame is there.
> 
> I am using HDMI cable, perhaps its the HDMI cable thats causing it?


In CCC open up the "My Digital Flat Panels" menu and go to "Scaling Options"


----------



## Rar4f

Quote:


> Originally Posted by *JCH979*
> 
> In CCC open up the "My Digital Flat Panels" menu and go to "Scaling Options"


Thanks (merci)


----------



## JordanTr

Installed the Gelid Icy Vision Rev 2 on my R9 290. Now I'm not disturbed by a jet engine







and temps are alright, running the OC at 1050/1400. GPU 60, VRM1 80, VRM2 60 max. Had to use zip ties for the long strip of VRMs







so good ao far


----------



## Rar4f

Will the audio technology in the R9 290 interfere with my Asus Xonar DG 5.1?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Rar4f*
> 
> Will the audio technology in the R9 290 interfere with my Asus Xonar DG 5.1?


No - if I understand correctly, it actually sends its audio to the Xonar (if you use it as your output device).

I'm also using a Xonar (DX), and I haven't really noticed a change in audio though.


----------



## Forceman

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I'm also using a Xonar (DX) and I haven't really noticed a change in audio though.


That's because nothing is using TrueAudio yet. Thief may be the first game to use it.


----------



## HardwareDecoder

So I asked like 5 pages ago and no one answered, so I'll ask again.

Is there a BIOS editor that works for the R9 series yet? VBE7 doesn't work. I'd like to lock in my overclock settings via the BIOS.


----------



## Rar4f

Quote:


> Originally Posted by *Forceman*
> 
> That's because nothing is using True Audio yet. Thief may be the first game to use it.


I think BF4 is the first to do that.


----------



## velocityx

Quote:


> Originally Posted by *Rar4f*
> 
> I think BF4 is the first to do that.


BF4 is not using TrueAudio; Thief will be the first one to use it.


----------



## Arizonian

Quote:


> Originally Posted by *Valice*
> 
> count me in!
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## TheRoot

Quote:


> Originally Posted by *CHUNKYBOWSER*
> 
> Got my Sapphire R9 290 BF4 edition today. Unfortunately, I can't unlock it.
> 
> http://www.techpowerup.com/gpuz/ebc4k/
> 
> Stock cooling! Will have it under water later...


miss


----------



## UNOE

Quote:


> Originally Posted by *kizwan*
> 
> You can use either one. If I were you I may use SATA power connector (for EZ_PLUG_1) from AX1200. The only reason is because I just want the GPUs drawing power from the same PSU.


Thanks, I was thinking the same thing; it would be better that way. But the bad part is that it doesn't balance the load as well: it would mean 1200 watts on the AX1200 and only 100 watts on the HX620. I'm glad the answer might be either, though. I'd want to try it the other way first, to see if I can put more like 300 watts on the HX620 and 1000 watts on the AX1200.


----------



## Sgt Bilko

Well, it's starting to look like I won't get my 290X back before Xmas. I sent it off on the 4th and it was tested, then sent off to the distributor on the 12th, and I've been told between 1-3 weeks... this really sucks.

Oh well, guess I'm definitely getting a non-reference card then.


----------



## CravinR1

I tried to disable CrossFire in 9.5 and it's a no-go?


----------



## HardwareDecoder

It's normal for these cards' core clocks to fluctuate in a game, right? I swear yesterday it would stay locked at whatever clock I set in GPU Tweak/Afterburner. Today it fluctuates a lot more. It does it at stock settings too, so it isn't an unstable OC or anything.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *HardwareDecoder*
> 
> It's normal for these cards to fluctuate core clocks in a game right? I swear yesterday it would stay locked @ whatever clock I set in gpu tweak/afterburner yesterday. Today it fluctuates alot more. Does it at stock settings too so it isn't an unstable oc or anything.


Depends, do you use Vsync?

I do, I was just playing some Assetto Corsa and my framerate was stuck at 60 FPS, about 75% GPU load. This causes the card to throttle a bit (~1075MHz) and fluctuate voltage, simply because the card does not need to go 100% all out to give 60 FPS in this game.

Now if you don't use Vsync or your card is constantly at 100% load, it shouldn't be throttling. It should be pinned to whatever clock you have it set at.


----------



## stickg1

I just had a delightful gaming session in Sleeping Dogs, 1080p Extreme preset. More or less locked at 60 FPS the whole time and smooth as butter. But I noticed at times my core clock would fluctuate - though I wasn't always at 100% usage.


----------



## HardwareDecoder

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Depends, do you use Vsync?
> 
> I do, I was just playing some Assetto Corsa and my framerate was stuck at 60 FPS, about 75% GPU load. This causes the card to throttle a bit (~1075MHz) and fluctuate voltage, simply because the card does not need to go 100% all out to give 60 FPS in this game.
> 
> Now if you don't use Vsync or your card is constantly at 100% load, it shouldn't be throttling. It should be pinned to whatever clock you have it set at.


I didn't have load % in my Afterburner OSD, so I'm not sure what it was at. I just noticed that indoors in Arkham Origins it would fluctuate clocks as low as 950; outdoors it was usually 1190+ (my OC is 1200 right now, since I found I can do that with just +100mV).

It might just be that the load is not that high indoors, right? And no, I am not running Vsync. It is definitely not temperature throttling either, since my card is under water at 42°C max.

I don't think it is at all starved for voltage with 150% power and 1350mV.

Maybe it is just normal for it to fluctuate with load, right?


----------



## Mr357

Can anyone help me find a picture that shows where on the back of the PCB to put the leads of a DMM? Much appreciated!


----------



## chiknnwatrmln

Quote:


> Originally Posted by *HardwareDecoder*
> 
> I didn't have load % in my afterburner OSD so i'm not sure what it was at. I just noticed indoors in arkham origins it would fluctuate clocks as low as 950. Outdoors it was usually 1190+ (my oc is 1200 right now since I found I can do that with just +100mv)
> 
> It might just be the load is not that high indoors right?and No I am not running vsync. It is definitely not a temperature throttle either since my card is underwater @ 42c max i've seen.
> 
> I don't think it is at all starved for voltage with 150% power and 1350mv.
> 
> Maybe it is just normal for it to fluctuate on load right?


As long as the framerate is high enough it's not a problem.

I have FRAPS display the FPS in the corner of my screen to let me know where my PC struggles. I also suggest using CCC's framerate limiter, there's no point in wasting electricity rendering frames your monitor can't display.

Because I game at 4800x900, most games run 100% GPU load when in big areas so I usually keep an eye on FPS and core clock.


----------



## HardwareDecoder

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> As long as the framerate is high enough it's not a problem.
> 
> I have FRAPS display the FPS in the corner of my screen to let me know where my PC struggles. I also suggest using CCC's framerate limiter, there's no point in wasting electricity rendering frames your monitor can't display.
> 
> Because I game at 4800x900, most games run 100% GPU load when in big areas so I usually keep an eye on FPS and core clock.


Well, I used to be able to do 110Hz on my monitor with my 7950s, but that isn't working on my 290X either; that's a separate issue though.


----------



## Sazz

Quote:


> Originally Posted by *rdr09*
> 
> in my i7 as well. i unparked my cores and problem solved.


I am using an FX-8350 if that matters; how do you unpark the cores? xD


----------



## rdr09

Quote:


> Originally Posted by *Sazz*
> 
> I am using FX-8350 if that matters, how do you unpark the cores? xD


here you go . . .

http://www.overclock.net/t/1439672/want-to-increase-bf4-frames-unpark-your-cores
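For reference, the usual unparking approach in guides like that one is either a registry edit or `powercfg` from an elevated command prompt. A minimal sketch of the `powercfg` route - this is an assumption based on the stock Windows 7/8 power-setting aliases, not taken from the linked guide:

```
rem Keep 100% of cores unparked on AC and battery for the active plan
rem (CPMINCORES = "Processor performance core parking min cores")
powercfg -setacvalueindex scheme_current sub_processor CPMINCORES 100
powercfg -setdcvalueindex scheme_current sub_processor CPMINCORES 100
rem Re-apply the scheme so the change takes effect
powercfg -setactive scheme_current
```

Dedicated tools like the one in the guide just wrap the equivalent registry change in a UI.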


----------



## Raephen

Quote:


> Originally Posted by *HardwareDecoder*
> 
> It's normal for these cards to fluctuate core clocks in a game right? I swear yesterday it would stay locked @ whatever clock I set in gpu tweak/afterburner yesterday. Today it fluctuates alot more. Does it at stock settings too so it isn't an unstable oc or anything.


I guess that depends on the game.

In my case with Skyrim, I had bouncing clocks with my single 290, but that was PowerPlay at work. With my new 144Hz monitor, it's max clocks all the time.

(But I have another issue now: Skyrim is a console port and doesn't like high FPS at fracking high image quality - its max assigned memory is easily swamped by the high-res textures. I'm running an ENB now with a 4GB memory hack, but I liked the vanilla high-res graphics, so I'm still looking for a good solution... but since it seems game-engine related, I suspect that's no easy task.)


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Jack Mac*
> 
> First world problems: My Sapphire 290 doesn't unlock.


LoooL


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Raephen*
> 
> I guess that depends on the game.
> 
> In my case with Skyrim, I had bouncing clocks with my single 290. But that was powerplay at work. With my new 144Hz monitor, it's max clocks all the time.
> 
> (But I had another issue now: Skyrim is a console port and doesn't like high fps at fracking high image quality - it's max assigned memory is easily swamped by the high-res textures. I'm running an enb now with a 4GB memory hack, but I liked vanilla high-res graphics, so I'm still looking for a good solution... But since it seems game-engine related, I suspect that's no easy task)


There are ways around it. You can tweak the .ini files (look on Google for guides; they're everywhere) to up the cache sizes and other things to make the game more stable. I'm currently running 40+ texture packs and 100+ mods, and I regularly see ~3.5GB VRAM usage and rarely have crashes.

Another thing you can do is create a bashed patch with Wrye Bash, if you use mods. You can find Wrye Bash on the Skyrim Nexus, I believe.

Don't use Bethesda's HD texture pack; it's riddled with bugs and errors, and modders have made much more extensive, higher-quality packs.

Lastly, you can mess with ENB's memory manager settings. To do this, go to your Skyrim root folder and open up enbseries.ini. There should be an entire section labeled [Memory]. The settings are pretty self-explanatory; for example, for VRAM you'd set 4096. You can also use the unsafe memory hacks; that actually helps.

I've done a lot of modding for both Fallout games and Skyrim, so I'm pretty familiar with how to make these games stable. If you have any questions let me know, if you want I can send you my .ini configurations.

I also recommend you download this text file from the Nexus, it has a lot of good information in it on how to make your game run smoother.

http://www.nexusmods.com/skyrim/mods/48262/?
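To make the [Memory] step concrete, the section in enbseries.ini looks roughly like the sketch below. Key names differ between ENB versions, so treat this as an illustration and check the names in your own file rather than copying it verbatim:

```ini
[MEMORY]
ExpandSystemMemoryX64=true
ReduceSystemMemoryUsage=true
DisableDriverMemoryManager=false
; the "unsafe" hack mentioned above
EnableUnsafeMemoryHacks=false
ReservedMemorySizeMb=256
; for a 4GB card like the 290/290X
VideoMemorySizeMb=4096
```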


----------



## Derpinheimer

Quote:


> Originally Posted by *HardwareDecoder*
> 
> I didn't have load % in my afterburner OSD so i'm not sure what it was at. I just noticed indoors in arkham origins it would fluctuate clocks as low as 950. Outdoors it was usually 1190+ (my oc is 1200 right now since I found I can do that with just +100mv)
> 
> It might just be the load is not that high indoors right?and No I am not running vsync. It is definitely not a temperature throttle either since my card is underwater @ 42c max i've seen.
> 
> I don't think it is at all starved for voltage with 150% power and 1350mv.
> 
> Maybe it is just normal for it to fluctuate on load right?


My clocks stay stable except for an occasional 25MHz drop or something. I'm not sure what would trigger them to drop aside from temperature [no], the power limit [unless it's not being applied - no], or lack of load [???].

When you say it drops, is it a "spike" or does it stay down for a few readings? If it's just one reading that is lower, it could be the game not giving the card enough load while something is loading - a map, a menu, a graphics settings change, etc.


----------



## HoZy

290X CrossFire now!

Picked up the 2nd baby. Been testing without the block first, just to make sure it's all working properly.

3DMark link:

http://www.3dmark.com/3dm/1865931

Thanks


----------



## Raephen

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> There are ways around it. You can tweak the .ini files (look on google for guides, they're everywhere) to up the cache sizes and other things to make the game more stable. I'm currently running 40+ texture packs and 100+ mods and I regularly see ~3.5gb VRAM usage rarely have crashes.
> 
> Another thing you can do is to create a bashed patch with WryeBash, if you use mods. You can find WryeBash on the Skyrim Nexus, I believe.
> 
> Don't use Bethesda's HD texture pack, it's riddled with bugs and errors, and modders have made much more extensive, higher quality packs.
> 
> Lastly you can mess with ENB's memory manager settings. To do this go to your Skyrim root folder and open up ENBseries.ini. There should be an entire section labeled [Memory]. The settings are pretty self explanatory, for example for VRAM you'd set 4096. You can also use the unsafe memory hacks, that actually helps.
> 
> I've done a lot of modding for both Fallout games and Skyrim, so I'm pretty familiar with how to make these games stable. If you have any questions let me know, if you want I can send you my .ini configurations.
> 
> I also recommend you download this text file from the Nexus, it has a lot of good information in it on how to make your game run smoother.
> 
> http://www.nexusmods.com/skyrim/mods/48262/?


Aye, thanks. I'll look around for alternative texture packs later today.

And I've found those (or at least the ENB's) (V)RAM settings and modded them, but my VRAM still doesn't exceed 1300 to 1500 MB. So much wasted potential, while it still has problems with mist / water / snow... I'll have to keep fooling around a bit more.


----------



## quakermaas

Quote:


> Originally Posted by *HardwareDecoder*
> 
> I didn't have load % in my afterburner OSD so i'm not sure what it was at. I just noticed indoors in arkham origins it would fluctuate clocks as low as 950. Outdoors it was usually 1190+ (my oc is 1200 right now since I found I can do that with just +100mv)
> 
> It might just be the load is not that high indoors right?and No I am not running vsync. It is definitely not a temperature throttle either since my card is underwater @ 42c max i've seen.
> 
> I don't think it is at all starved for voltage with 150% power and 1350mv.
> 
> Maybe it is just normal for it to fluctuate on load right?


I see you are using MSI AB to OC; did you check that the power limits you set in AB are being applied in Overdrive? I had the same problem until I checked and corrected it.


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jack Mac*
> 
> First world problems: My Sapphire 290 doesn't unlock.
> 
> LoooL

I've got a first world problem too: I burned a power socket on my cable extension reel. Actually, one of the sockets (active/red/brown wire) melted. It happened while I was playing BF4. My computer is not located next to a wall socket, so I have to use a power/cable extension reel. I admit it was my mistake to use a regular one.

Do "heavy duty" cable extension reels exist?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> I got first world problem too; I burned the power socket on cable extension reel. Actually one of the socket (active/red/brown wire) is melted. It happen while I'm playing BF4. This is because my computer is not located next to wall socket, so I have to use power/cable extension reel. I admit, it's my mistake to use regular cable extension reel.
> 
> Do "heavy duty" cable extension reel exist?


Yep, tradespeople use those; some even come with a circuit breaker and extra power points. I think you're after a 20A extension cable?

LooooL, you melted a power cable playing BF4 with a 3820 and cards at stock clocks. Well done there.
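For anyone sizing an extension lead, the sanity check is just I = P / V. The sketch below assumes 230V mains; halve the voltage and roughly double the current for 115V regions:

```python
def amps(watts: float, volts: float) -> float:
    """Current drawn at a given power and mains voltage: I = P / V."""
    return watts / volts

# A ~1150W rig (the triple-290 figure quoted elsewhere in the thread):
print(round(amps(1150, 230), 1))  # 5.0 A on 230V mains
print(round(amps(1150, 115), 1))  # 10.0 A on 115V mains
```

Either way a healthy margin below the reel's rating matters more than the exact number, since coiled reels derate badly when left wound up.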


----------



## broken pixel

Quote:


> Originally Posted by *Sazz*
> 
> I don't know what the problem is; probably just the game itself. But in BF4 I can feel a lot of frame skips. My frame rate doesn't dip down at all, but when you're walking around you can definitely feel the frame skipping. Any other game works just fine...
> 
> Knowing the kind of criticism BF4 is getting right now, I wouldn't be surprised if it's a game issue.


I think you are experiencing CPU spikes; it is a bug. Open the BF4 console and type render.performanceoverlay 1.
I think that is the command, but it will show the command in a list once you type render.


----------



## Sazz

Quote:


> Originally Posted by *broken pixel*
> 
> I think you are experincing CPU spikes, it is a bug. Open the BF4 console and type render.performanceoverlay 1
> i think that is the command but it will show the command in a list once you type render.


I did the core unpark thing and played the game again... it seems to be fixed now. The thing that weirds me out is that I didn't do that before and the game played smoothly back then; I just didn't play it for a couple of weeks, and when I got back to it I had that problem. Anyway, it's fixed right now, and core unparking seems to add 5-10 FPS (at 6020x1080 I was getting 35 FPS minimum; now I'm getting 40 FPS as my lowest dip, averaging around 55-60 FPS).


----------



## voldomazta

Hi admin, can you please add me?

Anyway, how do you determine the memory type for each one? I ran MemoryInfo.exe with 3 cards plugged in, and I think it only detected memory for one of the cards (the ASUS one); sadly, it's Elpida.

How do I check the memory on the other ones?


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yep tradespeople use that , some even come with circut breaker and extra power points . I think your after 20amp extension cable ?
> 
> LooooL you melted powercable playing BF4 with 3820 and cards at stock clocks
> 
> Well done there


Yeah, I think so, a 20A cable. 20 amps because of the 20A fuse in the fuse box?

3820 @ 4.7GHz + R9 290s in CrossFire at stock clocks.

Quote:


> Originally Posted by *Sazz*
> 
> I did the core unpark thing, and played the game again.. seems to be fixed now, but the thing that weirds me out is I didn't do that before but the game was playing smooth back then, I just didn't play it for a couple of weeks and then when I got back to it I got that problem.. anyways its fixed right now, and it seems to add 5-10FPS when I did the core unparking. (6020x1080p I was getting 35Fps minimum, now I am getting 40Fps as my lowest dip, averaging around 55-60Fps)


If you're up to it, can you enable CPU parking again and try: 1) setting Power Options to High Performance, 2) disabling C1E, C3 & C6 in the BIOS.
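For step 1, the High Performance plan can also be switched on from an elevated command prompt; `SCHEME_MIN` is Windows' built-in alias for the High performance plan. (The C-state toggles in step 2 live in the motherboard BIOS and vary by vendor.)

```
rem Activate the High performance power plan
powercfg /setactive SCHEME_MIN
rem Confirm which plan is now active
powercfg /getactivescheme
```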


----------



## Clockster

Seems I have found the limit of my 290X: managed to get it stable @ 1190, but that's about it, and I've tried everything under the sun to get it higher, apart from flashing a PT1/PT3 BIOS onto it. Mmmmm.

Not exactly the best clock, but I guess it'll have to do.


----------



## HardwareDecoder

Guys, do any of you on Windows 8.1 have a 290/290X and a QNIX monitor? I'm trying to figure out why I can't overclock past 60Hz anymore; I could do 110Hz on my CrossFire 7950s.

This is the screen I get a lot (the same exact one). Sometimes games work for a minute, then this happens.

I have tried every driver since the last WHQL.


----------



## Mr357

Quote:


> Originally Posted by *Mr357*
> 
> Can anyone help me find a picture that shows where on the back of the PCB to put the leads of a DMM? Much appreciated!


Help?

If anyone finds it, maybe shoot me a PM? Thanks.


----------



## Joeking78

Add me, please.

3 x Sapphire R9 290X, stock cooling.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *HardwareDecoder*
> 
> guys do any of you on windows 8.1 have a 290/290x and a qnix monitor? I'm trying to figure out why I can't overclock past 60hz anymore I could do 110mhz on my xfire 7950's
> 
> this is the screen I get alot (same exact one) Some times games work for a minute then this happens.
> 
> I have tried every driver since the last whql


Win 8.1, 290, and QNIX EVO II 27" @ 96Hz.

I have not had this happen, or any black screening for that matter. Not sure if it makes any difference, but I've been on the 13.11 WHQL driver the whole time.


----------



## HardwareDecoder

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Win 8.1, 290, and Qnix evoII 27" @ 96hz
> 
> I have not had this happen or any black screening for that matter either. Not sure if this makes any difference but I've been on 13.11 whql driver the whole time.


Hmm, that scares me. So it's either my card or my monitor... Mind posting your custom resolution settings for 96Hz?

I have tried 13.11 WHQL as well, with the same results.

I guess I'll do a Windows 8.1 reinstall tonight after class and see if it helps.


----------



## Joeking78

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Hmm, that scares me. so either it's my card or my monitor... Mind posting your custom resolution settings for 96hz ?


Have you tried unticking Enable GPU Scaling in CCC?

I had a similar issue, and it turned out I had to uncheck GPU scaling, then re-check it for some reason.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Joeking78*
> 
> Have you tried unticking Enable GPU Scaling in CCC?
> 
> I had a similar issue and it turned out I had to unclick GPU scaling, then re check it for some reason.


I've never used GPU scaling.

This might all be my fault: I moved to a new motherboard, and since the old and the new have the same chipset, I didn't "need" to reinstall Windows.

I might really need to, as it could somehow be the root of this problem.


----------



## Joeking78

Quote:


> Originally Posted by *HardwareDecoder*
> 
> I've never used gpu scaling.


Mine won't work unless I check GPU Scaling...so maybe worth you trying that.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Joeking78*
> 
> Mine won't work unless I check GPU Scaling...so maybe worth you trying that.


You're saying a QNIX EVO II at over 60Hz won't work without GPU scaling? Hmm, let me run upstairs and try it.

I swear I've never used it though.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Hmm, that scares me. so either it's my card or my monitor... Mind posting your custom resolution settings for 96hz ?
> 
> I have tried 13.11 whql also and Having same results.
> 
> I guess I'll do a win 8.1 reinstall tonight after class and see if it helps.


I just use Automatic - LCD Standard, I'm pretty sure. I didn't do any manual timings, and it all worked right out of the box pretty much. Hopefully a fresh Windows will do you right.

It worked before and then just stopped?


----------



## Joeking78

Quote:


> Originally Posted by *HardwareDecoder*
> 
> qnix evo 2 at over 60hz you are saying won't work without gpu scaling? hmm let me run upstairs and try it
> 
> I swear i've never used it though.


I run the patcher, then CRU, and set my refresh rate... then I restart, open CCC, and enable GPU Scaling; then it shows my max refresh rate in CCC as being 120Hz... then I right-click the desktop > Screen resolution > Advanced settings > Monitor > select 120Hz.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Joeking78*
> 
> I run the patcher, then CRU and set my refresh rate...then I restart, open CCC and enable GPU Scaling, then it shows my max res in CCC as being 120hz...then I right click desktop > screen resolution > advanced settings > monitor > select 120hz.


GPU scaling seems to have worked!!!

So odd to me; I swear I never used any form of GPU scaling before. I selected the tickbox and all of the sub-options were still greyed out. I launched Arkham Origins and beat up 10 or so guys and it was fine, so I'm thinking that was the issue.

I am about 80% sure I never used it on my 7950s, though...

Doesn't GPU scaling add input lag, or no?

Oh, and +rep - thank you so much. I never even thought of GPU scaling, and I was going nuts thinking my monitor/graphics card were screwed.
Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I just use Automatic- LCD Standard I'm pretty sure. Didn't do any manual timings and all worked right out of the box pretty much. Hopefully a fresh windows will do you right
> 
> It worked before and then just stopped?


Well, a lot of things changed since it was working: I sold my 7950s, changed motherboards, and got a 290X, so I wasn't sure what the exact issue was. Apparently it was that I needed GPU scaling enabled (hopefully that was the problem; it seems to have been).

I'm running 110Hz, by the way, because I could never do much over 115 on my 7950s; I'm not sure if that's how high my monitor can OC or if it was a limit of the 7950s. I'll try for 120 later. Just really glad 110 works again.

I might go down to 96 anyway, though, since it's hard to hold 110 frames in a lot of games, so 96 might be better to help reduce some tearing, since I hate the input lag from Vsync.


----------



## Joeking78

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Gpu scaling seems to have worked!!!
> 
> So odd to me, I swear I never used any form of gpu-scaling before. I selected the tickbox, all of the suboptions were still greyed out. I launched arkham origins and beat up 10 or so guys and it was fine so i'm thinking that was the issue.
> 
> I am about 80% sure I never used it on my 7950's though..
> 
> Doesn't gpu scaling add input lag or no?


Glad it worked!

I've not noticed any lag.

I've always had to have that option checked for it to work; I'm not sure why... it's the only way I've ever done it.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Joeking78*
> 
> Glad it worked
> 
> I've not noticed any lag.
> 
> I've always had to have that option checked for it to work, I'm not sure why...it's the only way I've ever done it.


Are the options under GPU scaling greyed out for you too? I can only select the actual GPU scaling checkbox.


----------



## Joeking78

Yep; Maintain aspect ratio, Scale image to full panel size, and Use centered timings are all greyed out.


----------



## Fahrenheit85

So is flashing to the ASUS BIOS and using GPU Tweak still the thing to do? Kinda bored today and I wanna mess with my Gigabyte 290X. Can I flash to the silent BIOS?


----------



## HardwareDecoder

Okay, GPU scaling wasn't the issue; the problem came back.

I know I've tried at stock settings (on the GPU) and it happened, so I don't think it was my OC.

I do have another question about voltage on these: in MSI Afterburner I set +100mV, but in the OSD I see it max out @ 1250, which is stock...

Even during 3DMark my clocks drop to ~1050 and back up to 1200. What gives - not enough voltage? Improper load sensing?

I have too many issues going on right now; I'm overwhelmed.

It SEEMS I can do 120Hz as long as I have zero video card OC. This is really starting to piss me off. I didn't pay $500 for a video card to have to choose between overclocking the card and running @ 60Hz, or running @ 120Hz with no OC (never gonna hit 120 FPS with no OC anyway).

I'm about to sell this and buy a 780 Ti.


----------



## devilhead

Those 290s are power-hungry cards: 3x of them, with all my PC components, draws ~1150W. This was checked while mining, at stock clocks 947/1250, on an AX1200i power supply.


----------



## kcuestag

I just sent my 290X in for RMA, for the 2nd time already... (black screens or signal loss, whatever you want to call it).

This time I asked them to either refund me or give me two R9 290s instead and I'll pay the difference. I don't want another 290X; I'm done with those.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Derpinheimer*
> 
> You could try switching to a 780TI and return it if it doesnt fix your issue.
> 
> However, are you willing to sell waterblock only? Sad day for you to part from an awesome OC'er.
> 
> Im sure the miners on ebay will be happy to buy it.
> 
> EDIT: Have you considered it might be an unstable overclock?


It's not an unstable OC, because it runs fine @ 60Hz; I can play BF4 for hours and run any benchmark, and even if I reduce the OC significantly the problem remains. It's something with the AMD drivers, I'm sure of it.

All I really know for sure is I have to pick between OC'ing the video card or the monitor, and I know the monitor isn't screwed up, because it has ~20 hours of use since I sold my 7950s a few weeks ago, and those were rock solid @ 110Hz while overclocked too.

I'm starting to go a little crazy here and I'm about to smash this card to pieces.


----------



## Derpinheimer

Are you using Vsync in either test, or a frame limiter?


----------



## HardwareDecoder

Quote:


> Originally Posted by *Derpinheimer*
> 
> Are you using Vsync in either test, or a frame limiter?


no I never use vsync


----------



## Derpinheimer

And no frame limiter... hm. Does this happen in all games? Also, why not try:

~
gametime.maxvariablefps 110
enter
~

I think you said yours does 110? So obviously whatever your refresh rate is goes there. And that is a BF4-specific console command.
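If retyping it every session gets old, BF4 also reads a user.cfg placed in the game's install folder at launch, one console command per line. The filename and location here come from community documentation rather than this thread, so verify them:

```
gametime.maxvariablefps 110
perfoverlay.drawfps 1
```

The second line just draws an FPS counter, which is handy when checking whether the limiter is actually taking effect.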


----------



## MrWhiteRX7

If you already had intentions of doing a fresh windows install I'd at least give that the final shot before sending the card back.. then you will know you've exhausted every avenue, at which point go for the 780ti.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Derpinheimer*
> 
> And no frame limiter.. hm... does this happen in all games? also, why not try
> 
> ~
> gametime.maxvariablefps 110
> enter
> ~
> 
> I think you said yours does 110? So obviously whatever your refresh rate is goes there. And that is a BF4 specific command line.


I can do 120Hz @ stock 1000MHz. I just tried 1100MHz and it seemed to work; any higher and it screws up with the lines or that brown image.

I don't get it - I can do like 1250 at 60Hz. The video card shouldn't care what the refresh rate of the monitor is; it's just feeding a signal.

GPU scaling seems to make zero difference, actually.


----------



## MrWhiteRX7

If I go from 60Hz to 96Hz (I've been as high as 110Hz, but the colors looked a little washed out) I can keep all my overclock settings; my highest is 1210/1550 @ +200mV in AB. No issues, zero lines or black/brown screening. You are correct that the monitor overclock shouldn't have an effect on the card clock.


----------



## HardwareDecoder

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> If I go from 60hz to 96hz (have been as high as 110hz but the colors looked a little washed) I can keep all my overclock settings which my highest is 1210 / 1550 @ +200mv in AB. No issues, zero lines or black/brown screening. You are correct that the monitor clocking shouldn't have an effect on the card clock.


How do you get +200mV in AB - the PT1 BIOS?

I know the stock BIOS does +100 and the ASUS one does +163.

I've switched back to the stock BIOS for now, since I'm trying to rule stuff out.

BTW: you can fix the washed-out colors in CCC; go to desktop color and change the gamma to 1.35-1.45 or so.


----------



## broken pixel

Quote:


> Originally Posted by *devilhead*
> 
> those 290 is power hungry cards
> 
> 3x with my all pc components draws ~1150w DD this is checked when mining
> 
> stock clocks 947/1250
> 
> power supply AX1200i


How many KH/s are you getting? My 2 290Xs use fewer watts than my 2 undervolted 7970s. I'm getting almost 800 KH/s with -6 power, 900/1250, I-19. I have not messed with them much. At I-20 I think they were around 560 watts total system, with both cards.


----------



## Derpinheimer

Just make sure it's not game-specific. If it is, then try to find a root cause somewhere else.

It's possible that at lower refresh rates the game isn't fully loading the GPU. I mean, there are some really interesting things that can happen with resolution, framerate, etc., and "real" GPU usage.

Also, please try a frame limiter. No sense in rendering 200 frames when you only need 120.
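The frame-limiter idea above amounts to sleeping off the leftover frame time so the GPU never works ahead of the display. A minimal sketch of that loop, assuming `render()` as a hypothetical stand-in for a game's draw work (real limiters hook the game's present call instead):

```python
import time

def run_frame_limited(limit_fps=120, frames=5, render=lambda: None):
    """Render `frames` frames, sleeping so we never exceed `limit_fps`."""
    frame_budget = 1.0 / limit_fps
    for _ in range(frames):
        start = time.perf_counter()
        render()  # hypothetical stand-in for the game's draw work
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            # Sleep off the spare time instead of rendering another frame
            time.sleep(frame_budget - elapsed)
```

At a 120fps cap the loop spends at most ~8.3ms per frame rendering plus sleeping, which is why a capped card draws less power and runs cooler than one rendering 200fps it can't display.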


----------



## HardwareDecoder

Quote:


> Originally Posted by *Derpinheimer*
> 
> Just make sure its not game specific. If it is, then try to find a root cause somewhere else.
> 
> Its possible that at lower refresh rates, the game isnt fully loading the GPU. I mean, theres some really interesting things that can happen with resolution, framerate, etc, and "real" gpu usage.
> 
> Also please try a frame limiter. No sense in rendering 200 frames when you only need 120.


I'm not rendering 200 since I'm @ 1440p, bro.

I've tried with both Batman and Battlefield so far today.


----------



## broken pixel

I notice vsync is broken with the 9.4 and 9.5 betas. Went back to beta 8 and vsync works fine for [email protected] 100Hz 1440p.


----------



## Derpinheimer

Quote:


> Originally Posted by *HardwareDecoder*
> 
> im, not rendering 200 since i'm @ 1440p bro.
> 
> I've tried with both batman and battlefield so far today


I'm on 1440p and I've seen framerates go well over 150 :?

At this point I would imagine it's an overclock stability issue. Why it only shows up at 120Hz is a mystery.


----------



## broken pixel

Quote:


> Originally Posted by *Sazz*
> 
> I did the core unpark thing, and played the game again.. seems to be fixed now, but the thing that weirds me out is I didn't do that before but the game was playing smooth back then, I just didn't play it for a couple of weeks and then when I got back to it I got that problem.. anyways its fixed right now, and it seems to add 5-10FPS when I did the core unparking. (6020x1080p I was getting 35Fps minimum, now I am getting 40Fps as my lowest dip, averaging around 55-60Fps)


When I checked my cores it said two were parked. Unparking them did not help my CPU spikes. For me the spikes have gotten worse with every patch, and I'm using a 3930K; it makes no diff whether HT is on or off.


----------



## devilhead

Quote:


> Originally Posted by *broken pixel*
> 
> How many KH/s are you getting? My 2 290x use less
> Watts than my 2 7970s undervolted. Im getting almost 800kh/s with -6 power 900/1250, I-19. I have not messed with them much. @ I-20 i think they where around 560 watts x2 total system.


2500 kh/s
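For a rough sense of scale, the wall-power and hash-rate figures quoted in this exchange (~800 kH/s at ~560 W for the two 290Xs, ~2500 kH/s at ~1150 W for the three 290s) can be turned into a hash-per-watt comparison. A quick sketch; wall power includes the rest of each system, so these are only lower bounds on GPU efficiency:

```python
def kh_per_watt(khs, watts):
    """kH/s of scrypt hashing per watt of measured wall power."""
    return khs / watts

# Figures quoted in-thread (total-system wall power, so lower bounds)
two_290x = kh_per_watt(800, 560)     # broken pixel's 2x 290X rig
three_290 = kh_per_watt(2500, 1150)  # devilhead's 3x 290 rig

print(f"2x 290X: ~{two_290x:.2f} kH/s per watt")
print(f"3x 290 : ~{three_290:.2f} kH/s per watt")
```

By these numbers the undervolted 290X pair is trading hash rate for power draw, while the 290 trio is pushed harder for more raw output per watt at the wall.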


----------



## devilhead

Quote:


> Originally Posted by *HardwareDecoder*
> 
> I can do 120hz @ stock 1000mhz. I just tried 1100mhz and seemed to work any higher and it screws up with the lines or that brown image.
> 
> I don't get it I can do like 1250 on 60hz. The video card shouldn't care what the refresh of the monitor is. it's just feeding a signal
> 
> gpu scaling seems to make 0 difference actually.


So that line problem is pi... me too, because I use a 120Hz monitor. When I overclock to 1230/1500 at +130mV it's OK; if I put 1250/1500 then I'm getting those lines! I use an XFX 290 flashed to 290X.
I will try some overclocking at 60Hz, maybe it will help.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Derpinheimer*
> 
> Im on 1440p and I've seen framerates go well over 150 :?
> 
> At this point I would imagine its an overclock stability issue. Why it only shows up at 120hz is a mystery.


Yea, I dunno... I just did a whole round of BF4 at 120Hz @ 1150 and no issues.

How do I make my voltage/clock stay the same all the time while gaming on this card?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *HardwareDecoder*
> 
> how do you get +200mv in AB, PT-1 bios?
> 
> I know the stock does +100 and asus does +163.
> 
> i've switched back to the stock bios for now since i'm trying to rule stuff out
> 
> btw: you can fix the colors being washed out in the CCC go to desktop color and change gamma to 1.35-1.45 or so


Stock Sapphire 290 BIOS. I use the command-line code for AB to allow +200mV and everything else just works great. I honestly love using AB over the other options. I've had zero issues with it.

I use a nice color profile made specifically for 96Hz and it looks so good I just leave it there LOL


----------



## HardwareDecoder

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Stock Sapphire 290 bios. I use the command line code for AB to allow +200mv and everything else just works great. I love using AB over the other options honestly. I've had zero issues with it.
> 
> I use a nice color profile made specifically for 96hz and it looks so good I just leave it there LOL


There might be some hope yet... I just did a few rounds of BF4 @ 1150 core, no mem OC, at 120Hz and didn't have any issues.

What's the command-line code, btw? I'll give it a shot and see if I can do my normal OC at 120Hz with +200mV.

UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 1

It only lets me select +100mV still.
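For anyone trying the same thing: third-party guides for Afterburner's unofficial overclocking mode generally have those two lines go in the `[ATIADLHAL]` section of `MSIAfterburner.cfg` (edited with Afterburner closed, then restarted). The section name and placement here are my assumption from those guides, not something confirmed in-thread:

```ini
; MSIAfterburner.cfg -- section placement per third-party guides (assumption)
[ATIADLHAL]
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 1
```

Note the poster still only sees +100mV afterwards, which suggests unofficial mode extends clock limits rather than the voltage slider's range.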


----------



## Mirob0t

Hey guys,

Just recently (2 weeks ago) I bought a Sapphire R9 290 and I've replaced the cooler with the Gelid Icy rev2.
But man, ever since, I've been having trouble with the performance; I'm getting lower than I expected, especially in BF4.
I feel sad about this tbh and I'm starting to feel like selling the card, but I still hope to find a solution.
If someone could help me, I would love to stay an R9 290 user.

Here are my specs:

Windows 8.1
i5 2500K @ 4.2
8GB 1333 A-Data memory
Asus Maximus IV Gene-Z
Noctua NH-D14
Samsung 840 120GB SSD
1TB HDD
120Hz AW OptX 2310 monitor
Sapphire R9 290 @ Gelid Icy rev2 custom cooler

Peace..


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Mirob0t*
> 
> He guys
> 
> just recently ( 2 weeks ago ) i bought Sapphire r9 290... i'v replaced the cooler with the Gelid Icy rev2
> but mannn. ever since iv been having troubles with the preformance, im gettign lower then i expected... specially on bf4
> i feel sad about this tbh and im starting to feel like selling the card.... but i stil try and hope to find any solution to this
> if someone could help me... i would love to stay as a R9 290 user
> 
> here are my specs
> 
> Windows 8.1
> I5 2500K @ 4.2
> 8 Gig 1333 A-data memory
> Asus maximus Iv gene-z
> Notcua-NH d14
> Samsung 840 120 SSD
> 1T HDD
> 120hz Aw Optx 2310 monitor
> Sapphire r9 290 @ Gelid Icy rev2 custom cooler
> 
> Peace..


What kind of trouble? Performance as in frames per second, or the card overheating, or downclocking, or what? Not sure what you mean.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Mirob0t*
> 
> He guys
> 
> just recently ( 2 weeks ago ) i bought Sapphire r9 290... i'v replaced the cooler with the Gelid Icy rev2
> but mannn. ever since iv been having troubles with the preformance, im gettign lower then i expected... specially on bf4
> i feel sad about this tbh and im starting to feel like selling the card.... but i stil try and hope to find any solution to this
> if someone could help me... i would love to stay as a R9 290 user
> 
> here are my specs
> 
> Windows 8.1
> I5 2500K @ 4.2
> 8 Gig 1333 A-data memory
> Asus maximus Iv gene-z
> Notcua-NH d14
> Samsung 840 120 SSD
> 1T HDD
> 120hz Aw Optx 2310 monitor
> Sapphire r9 290 @ Gelid Icy rev2 custom cooler
> 
> Peace..


Also note that in BF4 you will need to OC that 2500K a good bit more to keep up. It's very CPU-intensive, and the 2500K lags behind a bit compared to the 2600K and other 4-core/8-thread and 8-core CPUs, especially when pushing a 290.


----------



## Mirob0t

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> What kind of trouble? Performance as in frames per second or the card overheating or downclocking or what. Not sure what you mean.


Sorry, my bad, I meant in FPS... heat and noise are not a problem since I'm using the Gelid cooler.
It's just the frames per second: when I change from Ultra to High in BF4 there is almost no difference in frames, maybe 10 or less.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Mirob0t*
> 
> sorry my bad, i meant in FPS... the heat and noise is not a problem since im using the Gelid cooler
> its just the frames per second, when i change from ultra to high in BF4 .. there is almost no difference in frames, maybe like 10 or less


How high can you OC that 2500K? Try to get that thing to around 4.8GHz or higher.


----------



## Mirob0t

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> How high can you OC that 2500k? Try to get that thing around 4.8ghz or higher


Well, I don't know tbh... how high should I put the CPU voltage?


----------



## jerrolds

What are your temps like? Is it throttling? I have a Gelid ICY 2 as well; GPU temps are < 65C, and VRM temps are fine as long as I keep it under 1200MHz on the core. I've since replaced the fans with high-static-pressure 120mm fans and the VRMs are even more in line.

Try upping the power limit to 50% and see if your temps are good and it's not throttling.

Do you notice that your card's speed is not going back down to idle on the desktop? I had a virus that was basically a cryptocoin miner that hijacked my card; I was wondering why Bioshock 1 was only running at 90fps as opposed to 300fps.

Luckily it was only for an evening, but look for something called ie8util.exe (I forget the actual name) running in the background.


----------



## broken pixel

Quote:


> Originally Posted by *devilhead*
> 
> 2500 kh/s


Is that kH/s for each one, not combined? Nevermind, I see you have 3x.

Anyone have the URL for the Hawaii info tool?


----------



## OverSightX

I asked a while back, but I will ask again. The memory clock on my 290 keeps going up and down continuously whatever I'm doing; only when gaming will it stay at 1250.

This is what I mean:


Is this common?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Mirob0t*
> 
> wel i dont know tbh... how much should i put the Cpu Voltage?


12 fps difference at stock... 2500k vs 2600k

You need to OC that 2500k for sure.

Also note what was said above, make sure your card is not downclocking for whatever reason.


----------



## Mirob0t

Quote:


> Originally Posted by *jerrolds*
> 
> Whats your temps like? Is it throttling? I have a Gelid ICY 2 as well and gpu temps are < 65C, and VRM temps are fine as long as i keep it under 1200mhz on the core, ive since replace the fans with high static 120mm fans and VRMs are even more in line.
> 
> Try upping power limit to 50% and see if youre temps are good and not throttling.
> 
> Do you notice that your cards speed is not going back down to idle in desktop mode? I had a virus that was basically a cryptcoin miner that hijacked my card, i was wondering why Bioshock 1 was only running at 90fps as opposed to 300fps.
> 
> Luckily it was only for an evening but look for something called ie8util.exe (i forget the actual name) running in the back ground


What do you mean by throttling? If you want to know the temps, I can jump into Battlefield and play for a while so I can see what the highest temps are.


----------



## sugarhell

Quote:


> Originally Posted by *Mirob0t*
> 
> He guys
> 
> just recently ( 2 weeks ago ) i bought Sapphire r9 290... i'v replaced the cooler with the Gelid Icy rev2
> but mannn. ever since iv been having troubles with the preformance, im gettign lower then i expected... specially on bf4
> i feel sad about this tbh and im starting to feel like selling the card.... but i stil try and hope to find any solution to this
> if someone could help me... i would love to stay as a R9 290 user
> 
> here are my specs
> 
> Windows 8.1
> I5 2500K @ 4.2
> 8 Gig 1333 A-data memory
> Asus maximus Iv gene-z
> Notcua-NH d14
> Samsung 840 120 SSD
> 1T HDD
> 120hz Aw Optx 2310 monitor
> Sapphire r9 290 @ Gelid Icy rev2 custom cooler
> 
> Peace..


Buy better RAM and OC your CPU more. It's nothing to do with the GPU.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Mirob0t*
> 
> What do you mean by throttling?, if you want to know the temps, i can jump in battlefield and play for a while so i can see wath the highest temps are


Are you using hwbot or Afterburner to watch your gpu usage and clocks?


----------



## Mirob0t

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Are you using hwbot or Afterburner to watch your gpu usage and clocks?


I am using Afterburner; my usage is almost always at 100% and my temps are between 65/60/70 under heavy load.

How much shall I OC my 2500K?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Mirob0t*
> 
> I am using After Burner, my usage is almost always at 100%
> my temps are between 65/60/70 on heavy load
> 
> How much shall i Oc my 2500K?


Ok keep AB open and play some BF4 then go and look at your actual core clock. Tell us if it's staying at your desired speed.


----------



## MrWhiteRX7

When I had a 2500K I had that punk at 5.1GHz, but for you, if you have a good enough cooler on it, go for at least 4.8GHz.


----------



## Mirob0t

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Ok keep AB open and play some BF4 then go and look at your actual core clock. Tell us if it's staying at your desired speed.


i just tested BF4,

core clock was running at 947/946
memory clock running at 1250 the whole time


----------



## battleaxe

I thought the 290s didn't have an uber mode or a switch? Mine does. Maybe I'm mistaken? Mine's not an X model.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Mirob0t*
> 
> i just tested BF4,
> 
> core clock was running at 947/946
> memory clock running at 1250 the whole time


Nice, then you're good to go; you just need a stronger CPU or more overclock.

Plus, you might as well bump up the clocks on that GPU! That will make a nice difference.


----------



## battleaxe

Does the base 290 have uber/quiet mode?


----------



## Mirob0t

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Nice then you're good to go, just need a stronger CPU or more overclock
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Plus, might as well bump up the clocks on that gpu! That will make a nice difference


Ahh thanks man, I'm getting my hopes back!!

So how far should I OC my CPU, and which voltage should I use?
Also, what overclock settings shall I use on my R9 290?


----------



## psyside

So does 290's have uber mode as well or only the 290X?


----------



## Mr357

Quote:


> Originally Posted by *psyside*
> 
> So does 290's have uber mode as well or only the 290X?


Probably so. Even if not, you can just set the fan limit to 55%, temperature limit to 95C, and get the same results.


----------



## Jack Mac

Quote:


> Originally Posted by *psyside*
> 
> So does 290's have uber mode as well or only the 290X?


Why does it matter? Using a BIOS switch is more annoying than making a fan curve.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Mirob0t*
> 
> Ahh thanks man, im getting my hopes back!!
> 
> so how far should i OC my Cpu and wich Voltage should i use?
> also what overclock settigns shall i use on my R9 290?
> 
> !


Yea, don't worry, your GPU seems totally fine.

Unlock voltage control in Afterburner and start seeing what you can get out of it. Make sure your fan profile is set so the card doesn't get much over 75C or so. With +100mV added most people can get around 1100-1150 on the core clock. Memory can vary quite a bit, but there's no need to go over 1500 in games.

Just do it little by little.

As for your CPU, I'd scout out the Sandy Bridge owners thread. There should be plenty of BIOS settings with voltage/core comparisons. I haven't had a 2500K in a long while so I don't remember what my voltage was, but I honestly didn't care either. I know it ended up being higher than what a lot of people considered ideal hahahaha. But it was a beast!
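The "little by little" approach above is essentially a step-and-test loop. A sketch of that procedure, where `run_stress_test` is a hypothetical stand-in for whatever you use to check stability (a Valley/3DMark loop or a round of BF4), and the 1000-1150MHz range comes from the advice in this post:

```python
def find_stable_clock(start_mhz=1000, limit_mhz=1150, step_mhz=10,
                      run_stress_test=lambda mhz: mhz <= 1120):
    """Step the core clock up until the stress test fails; return the
    highest clock that passed. The default run_stress_test is only a
    placeholder that 'fails' above 1120 MHz."""
    stable = start_mhz
    clock = start_mhz
    while clock <= limit_mhz:
        if not run_stress_test(clock):
            break  # artifacts or a crash: stop stepping up
        stable = clock
        clock += step_mhz
    return stable

print(find_stable_clock())  # with the placeholder test: 1120
```

The point of the small step size is that the failure you back off from is cheap: one artifacting benchmark run instead of a hard crash far above the stable point.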


----------



## Mirob0t

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Yea dont' worry your gpu seems totally fine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Unlock voltage control in afterburner and start seeing what you can get out of it. Make sure your fan profile is set to where the card doesn't get much over 75c or so. With +100mv added most people can get around 1100 - 1150 on the core clock. Memory can vary quite a bit but no need to go over 1500 in games.
> 
> Just do it little by little.
> 
> As far as your cpu I'd scout out the sandybridge owners thread. Should be plenty of bios settings with voltage/ core comparisons. I haven't had a 2500k in a long while I don't remember what my voltage was but I honestly didn't care either. I know it ended up being higher than what a lot of people considered ideal hahahaha. But it was a beast!


Alright, I just OC'ed my CPU to 4.7 @ 1.39V.
I also OC'ed my GPU to core clock 1125, memory 1450.
After testing for like 5 mins or less on BF4 I had a black screen and had to reset the clocks... hmm, do you think I should go higher with the memory, or lower?


----------



## psyside

Quote:


> Originally Posted by *Jack Mac*
> 
> Why does it matter? Using a BIOS switch is more annoying than making a fan curve.


Quote:


> Originally Posted by *Mr357*
> 
> Probably so. Even if not, you can just set the fan limit to 55%, temperature limit to 95C, and get the same results.


Because I wasn't sure if I can go past 47% if the switch is not in uber mode. If that's possible, I couldn't care less.

My card will be at ~60% most of the time.

Also, when does throttling start? After 85C+? 90C?


----------



## Jack Mac

Quote:


> Originally Posted by *psyside*
> 
> Because i wasn't sure if i can go past 47% if the switch is not in uber mode? if its possible i couldn't care less
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My card will be ~ 60% most of the time.
> 
> Also when does throttling start? after 85c +? 90c?


The switch doesn't matter, and a fixed fan speed when gaming works better than silent/uber/auto. Throttling starts at whatever your temperature target is set to; the default is 95C.
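The temperature-target behavior described here can be pictured as a simple feedback loop: hold the boost clock below the target, step the clock down above it. This is a toy model only; the real PowerTune controller in the 290/290X firmware is far more sophisticated, and the step size and clock floor below are made-up illustration values (only the 95C default target comes from the post):

```python
TEMP_TARGET_C = 95      # default temperature target mentioned above
BOOST_CLOCK_MHZ = 1000
MIN_CLOCK_MHZ = 727     # hypothetical floor, for illustration only
STEP_MHZ = 13           # hypothetical step size

def throttled_clock(current_mhz, temp_c):
    """Return the next core clock given the current GPU temperature."""
    if temp_c > TEMP_TARGET_C:
        # Over target: shed clock (and therefore heat) step by step
        return max(MIN_CLOCK_MHZ, current_mhz - STEP_MHZ)
    # At or under target: recover toward the full boost clock
    return min(BOOST_CLOCK_MHZ, current_mhz + STEP_MHZ)

print(throttled_clock(1000, 97))  # over target: steps down to 987
print(throttled_clock(987, 85))   # under target: recovers to 1000
```

This is also why a fixed, aggressive fan speed "works better": it keeps the card under the target so the controller never has a reason to pull clocks down mid-game.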


----------



## kizwan

Quote:


> Originally Posted by *battleaxe*
> 
> Does the base 290 have uber/quiet mode?


Quote:


> Originally Posted by *psyside*
> 
> So does 290's have uber mode as well or only the 290X?


Yes, the 290 does have a quiet/uber mode switch; my Sapphire has one. For the Sapphire R9 290 with their latest VBIOS, in uber mode the fan speed is capped at 47%.


----------



## Amhro

Quote:


> Originally Posted by *psyside*
> 
> So does 290's have uber mode as well or only the 290X?


As far as I know, 290 does not have it.


----------



## psyside

As I planned, fixed 60-65% it is.

Hope I won't get higher than 85C core / 80C VRM max at these settings, with OC. No crazy volts here...


----------



## Jack Mac

Quote:


> Originally Posted by *psyside*
> 
> As i planed, fixed 60-65% it is
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hope i wont get higher then 85 c core/80c vrm as max at this settings, with oc. No crazy volts here...


Should be more than ok at those settings.


----------



## VSG

FYI: Anyone planning to sell a BF4 code on eBay, be ready to be cheated. I just got a paypal claim filed against me by the buyer citing unauthorized transactions and there is pretty much nothing I can do since digital downloads are not a tangible good.


----------



## HardwareDecoder

Quote:


> Originally Posted by *geggeg*
> 
> FYI: Anyone planning to sell a BF4 code on eBay, be ready to be cheated. I just got a paypal claim filed against me by the buyer citing unauthorized transactions and there is pretty much nothing I can do since digital downloads are not a tangible good.


That is why I just gave mine away on here. Not worth the hassle for $20-40.


----------



## battleaxe

Quote:


> Originally Posted by *Jack Mac*
> 
> The switch doesn't matter, and fixed fanspeed when gaming works better than silent/uber/auto. Throttling starts at whatever your temperature target is set to. Default is 95C.


My question had more to do with why my 290 had a switch at all. I'm already using AB to keep from throttling.


----------



## Derpinheimer

Quote:


> Originally Posted by *OverSightX*
> 
> I asked a while back, but I will ask again. The memory clock on my 290 keeps going up and down continuously whatever I'm doing unless gaming it will stay at 1250.
> 
> This is what I mean:
> 
> 
> Is this common?


Yes, it's normal; why does it matter?


----------



## darkelixa

MSI have just announced their R9 290 and 290X coolers.

http://www.guru3d.com/news_story/msi_radeon_r9_290_photos_and_pcb_pictured.html


----------



## OverSightX

Quote:


> Originally Posted by *Derpinheimer*
> 
> Yes its normal, why does it matter?


Doesn't matter. Just wondering because my 7970's never did this.


----------



## Forceman

Quote:


> Originally Posted by *battleaxe*
> 
> My question had more to do with why my 290 had a switch at all. I'm already using AB to keep from throttling.


Yes, the 290 has the switch because it is the same PCB as the 290X. But it doesn't have a quiet or uber mode, both sides of the switch are the same. It's just a dual BIOS on the 290.


----------



## darkelixa

Anyone got bf4 coupon codes for giveaway/for a price?


----------



## Arizonian

Quote:


> Originally Posted by *HoZy*
> 
> 290x Crossfire Now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Picked up the 2nd baby, Been testing without the block first just to make sure it's all working properly.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 3DMARK LINK
> 
> http://www.3dmark.com/3dm/1865931
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> ]
> 
> 
> 
> Thanks


Congrats - updated









Quote:


> Originally Posted by *voldomazta*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Hi admin, can you please add me?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway, how do you determine the memory type for each one? I ran MemoryInfo.exe while I have 3 cards plugged in and I think it only detected memory for one of the cards (ASUS one), sadly, it's "Elpdia"
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> How do I check memory for the other ones?


Congrats - added









Quote:


> Originally Posted by *Joeking78*
> 
> Add me pls
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 3 x Sapphire R290x, stock cooling.


Congrats - added


----------



## battleaxe

Quote:


> Originally Posted by *Forceman*
> 
> Yes, the 290 has the switch because it is the same PCB as the 290X. But it doesn't have a quiet or uber mode, both sides of the switch are the same. It's just a dual BIOS on the 290.


Ah, cool. Just curious. Thanks mate.


----------



## Pirada

Has there been anymore info on the black screen problem?

The beta drivers have helped greatly but I'm still experiencing a black screen maybe once a week or so (rather than every day), is there anything else I can try?


----------



## Asrock Extreme7

What's the new MSI AB? Is it 3.0.0.0?


----------



## MrWhiteRX7

This is a good read, in case you guys haven't seen it.

http://www.hardocp.com/article/2013/12/13/4_weeks_radeon_r9_290x_crossfire/1#.Uq9_stJDtAo


----------



## sugarhell

No this is a good read.

http://semiaccurate.com/2013/12/16/amds-powertune-2-0/


----------



## Raxus

Quote:


> Originally Posted by *darkelixa*
> 
> MSI have just announced there r9 290s and 290x coolers
> 
> http://www.guru3d.com/news_story/msi_radeon_r9_290_photos_and_pcb_pictured.html


http://videocardz.com/48493/msi-radeon-r9-290x-r9-290-gaming-available-preorder

$699 and $529

seems high.


----------



## SultanOfWalmart

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Okay gpu scaling wasn't the issue, problem came back
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I know i've tried @ stock settings (on the gpu) and it happened so I don't think it was my OC.
> 
> i do have another question about voltage on these. In msi afterburner I set +100mv but In the OSD I see it max out @ 1250 which is stock....
> 
> Even during 3dmark my clocks drop to ~1050 back up to 1200. What gives, not enough voltage? Improper load sensing?
> 
> I have too many issues going on right now i'm overwhelmed.
> 
> SEEMS I can do 120hz as long as I have 0 video card oc. This is really starting to piss me off I didn't pay 500$ for a video card to either be able to OC the video card and run @ 60fps. OR run @ 120fps with no oc ( never gonna hit 120fps with no oc anyway)
> 
> I'm about to sell this and buy a 780ti.


Did you bump your power limit to 150%? If you're doing it in AB, try using GPU Tweak. On my card AB will not apply the power limit no matter what I do, and it throttles. Apply the power limit in GPU Tweak and it's flying at full speed.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Raxus*
> 
> http://videocardz.com/48493/msi-radeon-r9-290x-r9-290-gaming-available-preorder
> 
> $699 and $529
> 
> seems high.


Hmmm... well, I paid $700 AUD for my ref 290X on day 1.

Think it might be inflated due to the mining craze?


----------



## Raxus

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Hmmm......well I paid $700 AUD for my ref 290x on day 1.
> 
> Think it might be inflated due to mining craze?


Who knows, but for that kind of money I'll grab an EVGA 780 Ti Classified for $50 more.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Raxus*
> 
> Who knows, but for that kinda money I'll grab a EVGA 780 ti classified for $50 more.


Good point. AMD's biggest advantage is price, and if the above price is true then a 780 Ti is a clear winner.

Edit: I was thinking about getting 2 non-ref 290Xs, but not at $800 AUD (est) each.


----------



## Jack Mac

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Good point. AMDs biggest advantage is price and if the above price is true then a 780ti is a clear winner
> 
> Edit: I was thinking about getting 2 non ref 290x's but not at $800 AUD (est) each


That's a preorder price and I'm sure when they're available they'll be much cheaper. And don't rule out non-ref 290s, they're only a few percent slower than a 290X.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jack Mac*
> 
> That's a preorder price and I'm sure when they're available they'll be much cheaper. And don't rule out non-ref 290s, they're only a few percent slower than a 290X.


That's true, but I fully expect the non-ref 290Xs to be at least $730 here... eh, I might just throw a few extra bucks in and trade my 290X for 2 non-ref 290s.


----------



## Jack Mac

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Thats true but I fully expect the non ref 290x's to be at least $730 here......eh I might just throw a few extra bucks in and trade my 290x for 2 non ref 290s


Do it, or do 290+4770K.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jack Mac*
> 
> Do it, or do 290+4770K.


I'll go for 290 CrossFire... I like AMD and I'm still hoping Mantle pays off in the end.


----------



## ssiperko

You guys are smoking me!

How are you getting those #'s?






SS


----------



## Raxus

Quote:


> Originally Posted by *Jack Mac*
> 
> That's a preorder price and I'm sure when they're available they'll be much cheaper. And don't rule out non-ref 290s, they're only a few percent slower than a 290X.


I'd be surprised, considering the reference models keep getting more expensive.


----------



## chiknnwatrmln

I must say I'm glad that I bought my 290 when I did...

These prices are not very good anymore; I mean, $500 for a ref 290? Gee, thanks miners.

Can't really blame AMD, they're just making money because they can and they're a business. Just a crappy situation for gamers, I guess.

Oh well, I can live with one card... For now.


----------



## Sgt Bilko

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I must say I'm glad that I bought my 290 when I did...
> 
> These prices are not very good any more, I mean $500 for a ref 290? Gee thanks miners.
> 
> Can't really blame AMD, they're just making money because they can and they're a business. Just a crappy situation for gamers, I guess.
> 
> Oh well, I can live with one card... For now.


I am finding it somewhat humorous that 290s have been $500 since day one here.

But yes, it does suck for people that just want to game with these cards... hopefully the craze dies down soon.


----------



## Raxus

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ill go for 290 crossfire.......I like AMD and im still hoping Mantle pays off in the end.




If Mantle is that big of an increase in performance, Nvidia will likely adopt it.


----------



## Raxus

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I am finding somewhat humorous that 290s have been $500 since day one here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But yes.....it does suck for people that just want to game with these cards....hopefully they craze dies down soon


What you do is wait until the miners move on to the hardware coming out specifically for mining, and pick up used cards for dirt cheap.


----------



## grunion

Here are some tri-fire scores.

3dm11 P26294
3dm11 extreme X12820

Firestrike 17598


----------



## Sgt Bilko

Quote:


> Originally Posted by *Raxus*
> 
> What you do is wait until the miners move on to the hardware coming out specifically for mining, and pick up used cards for dirt cheap.


That would be a great plan for the ref cards... but I have two small problems.

1. I need a new card before mid-January.

2. For miners to buy and then sell used non-ref cards... that's gonna take longer than a month (assuming non-ref drops this week, which is not likely).


----------



## Raxus

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That would be a great plan for the ref cards....but I have two small problems.
> 
> 1. I need a new card before mid January
> 
> 2. For Miners to buy and then sell used non ref cards......thats gonna take long than a month (assuming non ref drops this week (not likely))




After owning a 680, 1x 7970 reference, 2x 7970 Sapphire OC, and 2x 290X, the AMD cards have given me nothing but grief in one way or another.

Having had 2x 290X drop dead on me, unless they drop back down to their previous price I'm gonna have to go with NVIDIA this time around. The reviews of G-Sync intrigue me as well.


----------



## grunion

Valley and Heaven...
Had to drop the Heaven OC to 1100 because my PSU kept shutting off at 1200w on my KAW


----------



## kizwan

Quote:


> Originally Posted by *Forceman*
> 
> Yes, the 290 has the switch because it is the same PCB as the 290X. But it doesn't have a quiet or uber mode, both sides of the switch are the same. It's just a dual BIOS on the 290.


Can you share your source? The two reviews I read said it's an uber/quiet mode switch. However, I think I recall seeing the fan capped at 47% with the switch supposedly in quiet mode. Basically, if I botch a flash, I can switch to the second BIOS to use the good one?
Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I must say I'm glad that I bought my 290 when I did...
> 
> These prices are not very good any more, I mean $500 for a ref 290? Gee thanks miners.
> 
> Can't really blame AMD, they're just making money because they can and they're a business. Just a crappy situation for gamers, I guess.
> 
> Oh well, I can live with one card... For now.


Mine cost me an extra $57, but when it first launched here in Malaysia the price was MYR1599 ($493.44). The Sapphire I got is MYR1480 ($456.72) each. The MSI & ASUS are still listed at MYR1599, but if you bundle with a PSU you can pay slightly less for the GPU.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Raxus*
> 
> After owning a 680, 1 x 7970 reference, 2 x 7970 Sapphire OC and 2 x 290X, the AMD cards gave me nothing but grief in one way or another.
> 
> Having had 2 x 290X drop dead on me, unless they drop back down to their previous price I'm gonna have to go with NVIDIA this time around. The reviews of G-Sync intrigue me as well.


And I'm the opposite. AMD/ATI have all been great for me and I've had issues with Nvidia (the last NV card I had was a 6200 though).

I've owned 2 x 4850s, 2 x 6970s, a 7970 and one 290X, and the 290X has been the only card that's given me grief.

I might go back to the green team when 20nm comes out, but I'd rather build a 2nd Intel system for that and still have an all-AMD rig.


----------



## Forceman

Quote:


> Originally Posted by *kizwan*
> 
> Can you share your source? The two reviews I read said it's an uber/quiet mode switch. However, I think I recall seeing the fan capped at 47% with the switch supposedly in quiet mode. Basically, if a flash goes bad, I can switch to the second BIOS to use the good one?


Firsthand experience is my original source, but if you want something more than that, the Anandtech review also mentions that it is the same on both sides. I think only the Guru3D article calls it quiet/uber and that was just a copy paste from their 290X review.

So yes, if you fail the flash on one side you can just flip the switch and keep going. I have the 290 BIOS on one side, and the 290X BIOS on the other.
Quote:


> Moving on, AMD's dual BIOS functionality is back once more, but unlike the 290X the second BIOS will not be serving any defined purpose. Both BIOSes are identical as AMD doesn't have an uber mode for the 290, so switching between the two will not change the card's performance in any way. In this setup the second BIOS is reduced to serving as a safety net for end-user BIOS flashing.


----------



## darkelixa

My Sapphire R9 290 just shipped today.

They couldn't get the XFX brand.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> My Sapphire R9 290 just shipped today.
> 
> They couldn't get the XFX brand.


Nice.

Did you go through PCCG by chance? I know they only have XFX left in stock now.


----------



## darkelixa

Nah man IJK

http://ijk.com.au/branch/ijk/product_info.php?cPath=353_340&products_id=146897


----------



## darkelixa

To me a few extra dollars for a whole lot more service is worth it. They will usually message me within half an hour any time from open to close, while PCCG will only email me once a day, always at lunch time. Also, the order was picked and packed in less than 20 minutes lol, and I had quite a few lines on my order.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Nah man IJK
> 
> http://ijk.com.au/branch/ijk/product_info.php?cPath=353_340&products_id=146897


Same price really, only a $10 difference between them.

You will be pleased, I think.

I used to think the same with Megaware...but I got stuffed around too many times by them and just gave up. Might have to give these guys some business and see what they are like.


----------



## darkelixa

Now the hard part: waiting two days to receive my goods.

Did you choose a card yet? A lot of them are out of stock.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Now the hard part: waiting two days to receive my goods.
> 
> Did you choose a card yet? A lot of them are out of stock.


Still in the RMA process atm...just waiting for the distributor to test it and then decide if it's a replacement, refund or repair job. I just want a refund...or store credit. Either one.


----------



## darkelixa

Yeah man send them a test email


----------



## psyside

I talked with a highly educated electrician. He said don't go over 85°C on these cards' VRMs; if you can keep them around 50-60°C you are golden. He read the PDF/specs for the chokes, so it's first-hand info.

Keep that in mind.


----------



## Jack Mac

Guess I'm golden then


----------



## Rar4f

Quote:


> Originally Posted by *psyside*
> 
> I talked with a highly educated electrician. He said don't go over 85°C on these cards' VRMs; if you can keep them around 50-60°C you are golden. He read the PDF/specs for the chokes, so it's first-hand info.
> 
> Keep that in mind.


So AMD lied?

Also, is MSI AB the best software for GPU monitoring and overclocking? I got a Sapphire R9 290.


----------



## rquinn19

Quote:


> Originally Posted by *Rar4f*
> 
> *So AMD lied?*
> 
> Also is MSI AB best software for GPU monitoring and overclocking? I got a Sapphire R9 290.


I don't recall AMD saying anything specific regarding VRM temps.

I use AB and I like it, but GPU Tweak also worked well for me. I just didn't like setting fan profiles in it.


----------



## ehpexs

Just got my first 290 (want another); my first thought wasn't so much the speed of the card as "wow, I REALLY need to put this thing under water." Seriously, it's a great card, but the cooler is literally as loud as a hair dryer.


----------



## Joeking78

Quote:


> Originally Posted by *grunion*
> 
> Here are some tri-fire scores.
> 
> 3dm11 P26294
> 3dm11 extreme X12820
> 
> Firestrike 17598


Nice scores.

You got a higher GPU score than me but I had the higher core OC, 1225 vs 1150. I wonder why?

28296 - http://www.3dmark.com/3dm11/7658372

Have you tried running with Frame Pacing off?


----------



## Arizonian

Quote:


> Originally Posted by *ehpexs*
> 
> Just got my first 290 (want another); my first thought wasn't so much the speed of the card as "wow, I REALLY need to put this thing under water." Seriously, it's a great card, but the cooler is literally as loud as a hair dryer.


Nice.....

To be added on the member list please submit the following in your post
1. GPU-Z Link or Screen shot with OCN Name
2. Manufacturer & Brand
3. Cooling - Stock, Aftermarket or 3rd Party Water


----------



## Sgt Bilko

Sweclockers has a small preview of the DCU II 290X:

http://translate.google.se/translate?sl=sv&tl=en&js=n&prev=_t&hl=sv&ie=UTF-8&u=http://www.sweclockers.com/nyhet/18039-asus-gor-radeon-r9-290x-directcu-ii&act=url

I'm still curious to see if HIS can tame the heat on these; they've always had decent coolers from what I've seen.


----------



## MrWhiteRX7

Humbug! ?


----------



## Sazz

Quote:


> Originally Posted by *psyside*
> 
> I talked with a highly educated electrician. He said don't go over 85°C on these cards' VRMs; if you can keep them around 50-60°C you are golden. He read the PDF/specs for the chokes, so it's first-hand info.
> 
> Keep that in mind.


The 95°C temp is for the GPU core, not the VRMs. At stock settings (including fan profile), after 1hr of gaming my VRMs never went over 62°C (VRM2), while VRM1 is about 10°C lower than VRM2; ambient temp 22°C.


----------



## Scotty99

Anyone else find it gross that the 290 went up 100 bucks because of the bitcoin miners? I bet when aftermarket cards come out they will be like 530 bucks, sigh.


----------



## Falkentyne

Quote:


> Originally Posted by *Joeking78*
> 
> Yep, Maintain aspect ratio, Scale image to full panel size & Use centered timings are greyed out.


You can only change these options if you set a LOWER resolution than your monitor's native res. Then you set it back, and the option will still stick (but will be grayed out).


----------



## AddictedGamer93

290X owner here. http://www.techpowerup.com/gpuz/ehghu/

Sapphire 290X BF4 Bundle
Stock Cooling
76.2% ASIC, Elpida

Is it just me or are these cards no louder than a 79xx?


----------



## the9quad

Quote:


> Originally Posted by *ehpexs*
> 
> Just got my first 290 (want another); my first thought wasn't so much the speed of the card as "wow, I REALLY need to put this thing under water." Seriously, it's a great card, but the cooler is literally as loud as a hair dryer.


Literally? Really? More like they are loud, and if I were to use hyperbole, they are as loud as a hair dryer.

If I were to be literal, they are loud but literally not as loud as a hair dryer.

Sorry dude, even with 3 of 'em they are not as loud as a hair dryer. There are 3 women in this house and I know what hair dryers sound like; these aren't even close.

No fun aloud!


----------



## velocityx

Quote:


> Originally Posted by *the9quad*
> 
> Literally? Really? More like they are loud, and if I were to use hyperbole, they are as loud as a hair dryer.
> 
> If I were to be literal, they are loud but literally not as loud as a hair dryer.
> 
> Sorry dude, even with 3 of 'em they are not as loud as a hair dryer. There are 3 women in this house and I know what hair dryers sound like; these aren't even close.
> 
> No fun aloud!


I agree, my 6970 Crossfire was louder than my 290 CF. And there I was thinking, oh god, I won't stand it if it's louder, but it's actually better.

But to be honest, I kinda bought them because it will make me buy a WC setup for the summer, something I always wanted and never felt ready for, so now's the time I guess ;d


----------



## CriticalHit

I don't understand the fuss about the noise. Yes, they ARE noisy, but no more than any other card I remember owning. When I overclocked my crossfired 5850s they required me to up the fan profile and they were also really loud (comparable to a 290 at 70% to 80% fan speed). It was the straw that broke the camel's back and moved me to watercooling.

290s at 47-50% aren't that bad at all. I could have tolerated them in a gaming rig with the stock cooler, but I have put them into my custom loop anyway (to see if they would overclock better, but they didn't).

To tell you the truth, the fan on my single Corsair HX1050 PSU beats the noise the crossfired 290s were making.

Not sure why it's all of a sudden a major issue... can't help but think it's negative hype generated from the other side that has caught on.


----------



## zpaf

My lovely non-X hair dryer @ 1270/1600.


----------



## velocityx

Quote:


> Originally Posted by *Pirada*
> 
> Has there been anymore info on the black screen problem?
> 
> The beta drivers have helped greatly but I'm still experiencing a black screen maybe once a week or so (rather than every day), is there anything else I can try?


Yea, nothing.

Today I updated my BF4 to the latest patch, fired it up, and saw flickering textures in Crossfire. I'm like, gee, thanks DICE. I switch CF off, go in game, and halfway into the round the game goes black; hard reboot.

I really don't have time to waste like that... I wish AMD would say something: is it hardware or is it software? It's not happening every time, but maybe once a day or so...


----------



## Sgt Bilko

Quote:


> Originally Posted by *velocityx*
> 
> I agree, my 6970 Crossfire was louder than my 290 CF. And there I was thinking, oh god, I won't stand it if it's louder, but it's actually better.
> 
> But to be honest, I kinda bought them because it will make me buy a WC setup for the summer, something I always wanted and never felt ready for, so now's the time I guess ;d


Seconded, my 6970 CF was LOUD. I didn't grab a 2nd 290X for CF, but no way it could be louder; these cards are actually quite tame until you go above 65-75% fan speed, which you only need for benching, not 24/7 stuff.


----------



## Forceman

The problem is that noise is totally subjective, not only in volume but also in pitch. For someone with a water cooled system, like me, the 290 was intolerably loud even at 47%, but for someone with a large case running 4 or 6 medium speed fans it would be totally acceptable. The biggest problem I had with the stock cooler was the pitch - the GTX 580 and 680 both made a whooshing air sound, while the 290 had a more whistling tone which was quite annoying. But again, everyone's perception and tolerance is going to vary, the only way to really know how it's going to affect you is to try it yourself.

I do wonder though, if the perception changes based on what card you are coming from. I've seen a lot of people compare it favorably to 6970s and 7970s, with less favorable comparisons from people with 680s and Titans. So maybe AMD coolers just sound different than Nvidia coolers.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Forceman*
> 
> The problem is that noise is totally subjective, not only in volume but also in pitch. For someone with a water cooled system, like me, the 290 was intolerably loud even at 47%, but for someone with a large case running 4 or 6 medium speed fans it would be totally acceptable. The biggest problem I had with the stock cooler was the pitch - the GTX 580 and 680 both made a whooshing air sound, while the 290 had a more whistling tone which was quite annoying. But again, everyone's perception and tolerance is going to vary, the only way to really know how it's going to affect you is to try it yourself.
> 
> I do wonder though, if the perception changes based on what card you are coming from. I've seen a lot of people compare it favorably to 6970s and 7970s, with less favorable comparisons from people with 680s and Titans. So maybe AMD coolers just sound different than Nvidia coolers.


Well, Nvidia and AMD ref coolers have different fan blade designs, so I'd imagine the pitch would be different, along with the way the air interacts with the fins of the heatsink itself.

I haven't had an Nvidia card for a fair while now, and when I did it didn't have a fan.

It is interesting though; my Storm Trooper has 5 120mm fans in it and they are quite noisy, so that and the old 6970 CF setup I had might be a factor in my opinion of the 290/X noise levels.


----------



## Imprezzion

Well, after reviewing a whole bunch of different GTX 780s, I'm glad to have my unlocked 290 back.
It's quite a lot faster in Battlefield 4, which is my main game at the moment, than the GTX 780.
Also, a 1320MHz GTX 780 cannot compete with a 1150MHz 290X. Not even a bit.

My 290 is cooled by an Accelero Hybrid, with the back piece cut out of the stock heatsink and bolted back on for VRM cooling.
Running 1160MHz core, 1500MHz VRAM at +100mV in MSI AB (~1.25V load), the core stays at about 55°C under load. The VRMs, with the Hybrid's fan module on there, barely touch 60°C.

I wish I had some more headroom in voltage, but the card starts to drop the HDMI connection and give short flashes of ''No Input'' when I use GPU Tweak and set any voltage above 1.30V, such as 1.412V.
At 1.412V the card just loses all connections and the screen goes on standby when load is applied to the card.

Anyone know how to fix this?


----------



## psyside

Quote:


> Originally Posted by *Rar4f*
> 
> So AMD lied?
> 
> Also is MSI AB best software for GPU monitoring and overclocking? I got a Sapphire R9 290.


You are good up to ~90°C, but efficiency drops, more noise is produced, and the OC is not as stable. Also, the VRMs are rated up to 120°C, but that's too much; he recommended not exceeding 85°C.


----------



## psyside

Quote:


> Originally Posted by *Sazz*
> 
> The 95°C temp is for the GPU core, not the VRMs. At stock settings (including fan profile), after 1hr of gaming my VRMs never went over 62°C (VRM2), while VRM1 is about 10°C lower than VRM2; ambient temp 22°C.


I know, I'm just saying, because not everyone goes for the same OC. I saw some owners get like 85-90°C on the VRMs with custom cooling.

What OC/fan speed do you run?


----------



## battleaxe

Quote:


> Originally Posted by *Scotty99*
> 
> Anyone else find it gross the 290 went up 100 bucks cause of the bitcoin miners? I bet when aftermarket cards come out they will be like 530 bucks, sigh.


Yeah, it kinda sucks, but I think this is more supply and demand than price gouging (at least on AMD's part). When you are constantly selling something out of stock it makes sense to raise the price to slow the turnover and make a bit more profit. That's business. I think in the end this will be good for everyone: AMD gets a chance to make some money (which will help them make better future products), the miners are happy, and eventually the prices will come down.

Plus, AMD introducing these at such a fair/low price is what forced Nvidia to drop prices. It's really funny what happened to Nvidia, if you ask me. They were the ones that were price gouging, then had to lower prices in order to keep sales volume, and now the miners are buying up every AMD card on the planet. Kinda funny actually. We just gotta hang on till it's all over. I say good for AMD.

You can still get these for a fair price at MC. I posted how to do it a while back here. It works.


----------



## Durvelle27

Quote:


> Originally Posted by *Imprezzion*
> 
> Well, after reviewing a whole bunch of different GTX780's i'm glad to have my unlocked 290 back.
> It's quite a lot faster in Battlefield 4, which is my main game at the moment, then the GTX780.
> Also, a 1320Mhz GTX780 cannot compete with a 1150Mhz 290X. Not even a bit.
> 
> My 290 is cooled by a Accelero Hybrid with the back piece cut out of the stock heatsink and bolted back on for VRM cooling.
> Running 1160Mhz core, 1500Mhz VRAM at +100mV in MSI AB (~1.25v load) and the core stays at about 55c load. VRM's with the Hybrid's fan module on there barely touch 60c.
> 
> I wish I had some more room in the voltage but the card starts to drop HDMI connection and give short flashes of ''No Input'' when I use GPU Tweak and use any voltage above 1.30v such as 1.412v.
> On 1.412v the card just loses all connections and screen goes on stand by when applying load to the card.
> 
> Anyone know how to fix this?


Any test comparisons ?


----------



## Imprezzion

Not with actual written proof, but in Battlefield 4 the R9 290X averages about 80FPS at 2880x1620 (1080p @ 150% resolution scale) with everything on Ultra without AA.

The GTX 780 @ 1306MHz barely touches 70FPS average. And the minimums are much lower.

At 1080p the differences are barely measurable, but as we all know, the 290X is stronger at higher resolutions.
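As a side note, the 150% resolution-scale arithmetic in that post checks out; a quick sketch (my own helper function, not anything from the game or a driver API):

```python
# Hypothetical helper: a BF4-style resolution scale multiplies each axis,
# so the rendered pixel count grows with the square of the scale factor.
def scaled_resolution(width, height, scale_pct):
    factor = scale_pct / 100.0
    return int(width * factor), int(height * factor)

w, h = scaled_resolution(1920, 1080, 150)
print(w, h)                     # 2880 1620, as in the post
print((w * h) / (1920 * 1080))  # 2.25, i.e. 2.25x the pixels of plain 1080p
```

So a "150%" slider is really a 2.25x increase in rendering work, which is why the gap between cards widens at that setting.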


----------



## Durvelle27

Quote:


> Originally Posted by *Imprezzion*
> 
> Not with actual written proof, but in Battlefield 4 the R9 290X averages about 80FPS at 2880x1620 (1080p @ 150% resolution scale) with everything on Ultra without AA.
> 
> The GTX 780 @ 1306MHz barely touches 70FPS average. And the minimums are much lower.
> 
> At 1080p the differences are barely measurable, but as we all know, the 290X is stronger at higher resolutions.


I ask as I sold my Sapphire R9 290 but am now trying to decide between getting another 290 or a GTX 780 Classified. I also only game at 1080p.


----------



## Imprezzion

Well, the 290 can't beat a 780 at 1080p.

A 290X would probably be better.


----------



## battleaxe

Anyone want to sell the stock heatsink off a 290? PM me your price if you do, please. I want to cut this one up for VRM cooling and will need another.


----------



## Durvelle27

Quote:


> Originally Posted by *Imprezzion*
> 
> Well, the 290 can't beat a 780 at 1080p.
> 
> A 290X would probably be better.


The 290X is not an option, too expensive.


----------



## Imprezzion

And the Classy ain't? Get something like an MSI Gaming 780 and you'll be happy.


----------



## psyside

Quote:


> Originally Posted by *Imprezzion*
> 
> Well, the 290 can't beat a 780 at 1080p.
> 
> A 290X would probably be better.


Sure it can. The 290 = 290X with a ~5% difference at the same clocks.


----------



## Durvelle27

Quote:


> Originally Posted by *Imprezzion*
> 
> And the Classy aint... Get something like a MSI Gaming 780 and you'll be happy.


The Classy is only $460 so it's not that expensive, and if I go 780 I'm looking at the best clockers, and the Gaming is not one of them.


----------



## Amhro

Quote:


> Originally Posted by *Imprezzion*
> 
> Well, the 290 can't beat a 780 at 1080p.
> 
> A 290X would probably be better.


Of course it can.


----------



## maynard14

Hmm, help pls.

I think the idle and load voltages of my XFX R9 290 are kinda high?

0.984V at idle and 1.234V under load.

Can someone tell me if this is normal?

Can you guys post your idle and load voltages at stock core clock, stock memory clock and stock voltage?

Thank you so much guys.


----------



## rquinn19

Quote:


> Originally Posted by *Durvelle27*
> 
> *The Classy is only $460* so it's not that expensive, and if I go 780 I'm looking at the best clockers, and the Gaming is not one of them.


Where?

edit: I want to go green but can't justify the price difference...was hoping I'd find someone willing to trade


----------



## grunion

Quote:


> Originally Posted by *Joeking78*
> 
> Nice scores.
> 
> You got a higher GPU score than me but I had a higher core OC...1225 vs 1150, I wonder why?
> 
> 28296 - http://www.3dmark.com/3dm11/7658372
> 
> Have you tried running with Frame Pacing off?


Throttling maybe?

Tighter timings on my memory also, 2400 at 9-10-10-24-1T; 3DM11 likes tighter timings.
And those scores are with FP disabled.


----------



## Imprezzion

Quote:


> Originally Posted by *Amhro*
> 
> Ofcourse it can.


I have reviewed 7 different 780s and I have an unlocked 290 for reference. The 290 came up short in pretty much all game (non-synthetic) benches at 1080p with high AA compared to a 780 at 1137MHz boost. A stock 780 and 290 are pretty close, but a 290 at 1100MHz loses to a 780 at 1200+MHz in pretty much anything except maybe BF3 / BF4 / DiRT / Sleeping Dogs.

What I did notice is that the minimum - average - maximum FPS spread is a lot smaller for the AMD cards. The 780s have larger frame variance.


----------



## Raxus

Quote:


> Originally Posted by *Durvelle27*
> 
> Any test comparisons ?
> 
> I ask as I sold my Sapphire R9 290 but am now trying to decide between getting another 290 or a GTX 780 Classified. I also only game at 1080p.


Wait for aftermarket coolers, shouldn't be more than a couple of weeks.

Or go with a 780 Ti.

The original 780 won't beat out a 290.


----------



## Joeking78

Quote:


> Originally Posted by *grunion*
> 
> Throttling maybe?
> 
> Tighter timings on my memory also, 2400/9-10-10-24-1t, 3dm11 likes tighter timings.
> And those scores are with fp disabled.


Are you on air or water?


----------



## ehpexs

Quote:


> Originally Posted by *velocityx*
> 
> I agree. my 6970 Crossfire was louder than my 290 CF. and there I was thinking oh god I wont stand if it's louder, but it's actually better.
> 
> but to be honest. I kinda bought them because it will make me buy WC setup for the summer, something I always wanted, and never felt ready, so now's the time I guess ;d


Those were my thoughts. Plus, these on water would be amazing.


----------



## Durvelle27

Quote:


> Originally Posted by *Raxus*
> 
> Wait for aftermarket coolers, shouldn't be more than a couple of weeks.
> 
> Or go with a 780 Ti.
> 
> The original 780 won't beat out a 290.


I really don't care about aftermarket coolers as I'm doing watercooling either way.


----------



## prostreetcamaro

Quote:


> Originally Posted by *Imprezzion*
> 
> I have reviewed 7 different 780s and I have an unlocked 290 for reference. The 290 came up short in pretty much all game (non-synthetic) benches at 1080p with high AA compared to a 780 at 1137MHz boost. A stock 780 and 290 are pretty close, but a 290 at 1100MHz loses to a 780 at 1200+MHz in pretty much anything except maybe BF3 / BF4 / DiRT / Sleeping Dogs.
> 
> What I did notice is that the minimum - average - maximum FPS spread is a lot smaller for the AMD cards. The 780s have larger frame variance.


Really? The Nvidia cards do overclock higher, I know that, but I'm not so sure I believe that the Nvidia cards outperform the 290/X at 1080p. You also have to remember the drivers for the 290/X are very immature while the Nvidia drivers are super optimized, and the 290/X still equals or betters the 780 Ti in most games.

Just for example


----------



## Sleeppo

My graphics card died a week ago (my 460 crashed and doesn't work with drivers anymore), and I am a huge gamer, so these are my options:

1. Buy this R9 290 BF4 card, 355€. Good quality/price + BF4, but noisy and hot; maybe later buy aftermarket cooling (voiding the warranty).
http://www.pccomponentes.com/sapphire_r9_290_4gb_gddr5_battlefield_4_edition.html

2. Wait for the new models to get here, Windforce etc.

3. Buy an Nvidia 770 for 305€, or an R9 280X for 275€ + BF4.

I'm gonna play at 1080p. I know an R9 280X would be enough, but I really want an R9 290 with good cooling.

I know you're gonna say that I should wait, because I've made up my mind already, but I don't really want to wait a month to be able to buy a card.


----------



## Gero2013

Quote:


> Originally Posted by *Sleeppo*
> 
> My graphics card died a week ago (my 460 crashed and doesn't work with drivers anymore), and I am a huge gamer, so these are my options:
> 
> 1. Buy this R9 290 BF4 card, 355€. Good quality/price + BF4, but noisy and hot; maybe later buy aftermarket cooling (voiding the warranty).
> http://www.pccomponentes.com/sapphire_r9_290_4gb_gddr5_battlefield_4_edition.html


You should ask the vendors/manufacturers in your country; XFX in the USA, for example, allows you to put on a custom cooler as long as you don't damage a certain sticker.

Also, I was told that if you have to RMA and you do it carefully, you can put the reference cooler back on without a trace.

R9 290 + Accelero Xtreme III + BF4 = €410, and you are on roughly the same level as a GTX 780 (not the uber OC'd editions, the ones for €440).


----------



## Imprezzion

Quote:


> Originally Posted by *prostreetcamaro*
> 
> Really? The Nvidia cards do overclock higher, I know that, but I'm not so sure I believe that the Nvidia cards outperform the 290/X at 1080p. You also have to remember the drivers for the 290/X are very immature while the Nvidia drivers are super optimized, and the 290/X still equals or betters the 780 Ti in most games.
> 
> Just for example


Ok, let me nuance this post a bit.

Stock for stock the 290 can win, but only just. The 290X easily wins.

However, the 290s start to limit clock speeds, while most factory OC'd 780s (which are often cheaper than reference models where I live) boost to roughly 1080MHz. At that boost they can just about equal a 290. Due to the 290's temp throttling, overclocking isn't very useful unless the fan profile is changed or it's non-ref cooled. A 780, on the other hand, can easily do 1100-1137MHz at stock, and with the Skyn3t BIOS 1200MHz+ is manageable even on reference cooling without throttling. A good card will get close to or even go over 1300MHz, like 3 of my 7 780s would do.

At those clocks a 290, 290X or 780 Ti (without overclocking) loses.

My point is, from personal experience, a reference-cooled 290/290X can't easily beat a 780/780 Ti due to the limits in overclocking / temps.

Now that my 290X is cooled by an Accelero Hybrid and a cut-apart stock cooler for the VRMs, I hit 1160MHz core and 1500MHz VRAM stable, and at those clocks it's very competitive on the 290 BIOS with a 1300MHz GTX 780; because it no longer throttles, the 290 might just take the win, even with 150MHz lower clocks.
On the 290X BIOS at 1160MHz there's no competition.


----------



## prostreetcamaro

Quote:


> Originally Posted by *Imprezzion*
> 
> Ok, let me nuance this post a bit.
> 
> Stock for stock the 290 can win, but only just. The 290X easily wins.
> 
> However, the 290s start to limit clock speeds, while most factory OC'd 780s (which are often cheaper than reference models where I live) boost to roughly 1080MHz. At that boost they can just about equal a 290. Due to the 290's temp throttling, overclocking isn't very useful unless the fan profile is changed or it's non-ref cooled. A 780, on the other hand, can easily do 1100-1137MHz at stock, and with the Skyn3t BIOS 1200MHz+ is manageable even on reference cooling without throttling. A good card will get close to or even go over 1300MHz, like 3 of my 7 780s would do.
> 
> At those clocks a 290, 290X or 780 Ti (without overclocking) loses.
> 
> My point is, from personal experience, a reference-cooled 290/290X can't easily beat a 780/780 Ti due to the limits in overclocking / temps.
> 
> Now that my 290X is cooled by an Accelero Hybrid and a cut-apart stock cooler for the VRMs, I hit 1160MHz core and 1500MHz VRAM stable, and at those clocks it's very competitive on the 290 BIOS with a 1300MHz GTX 780; because it no longer throttles, the 290 might just take the win, even with 150MHz lower clocks.
> On the 290X BIOS at 1160MHz there's no competition.


Okay, I get what you are saying, and you are probably right. I had a 290 flashed to 290X with an Accelero Xtreme III and it never downclocked, so my gaming experience was both silent and excellent because it ran so cool and stable.


----------



## rdr09

Quote:


> Originally Posted by *Imprezzion*
> 
> Ok, let me nuance this post a bit.
> 
> Stock for stock the 290 can win, but only just. The 290X easily wins.
> 
> However, the 290s start to limit clock speeds, while most factory OC'd 780s (which are often cheaper than reference models where I live) boost to roughly 1080MHz. At that boost they can just about equal a 290. Due to the 290's temp throttling, overclocking isn't very useful unless the fan profile is changed or it's non-ref cooled. A 780, on the other hand, can easily do 1100-1137MHz at stock, and with the Skyn3t BIOS 1200MHz+ is manageable even on reference cooling without throttling. A good card will get close to or even go over 1300MHz, like 3 of my 7 780s would do.
> 
> At those clocks a 290, 290X or 780 Ti (without overclocking) loses.
> 
> My point is, from personal experience, a reference-cooled 290/290X can't easily beat a 780/780 Ti due to the limits in overclocking / temps.
> 
> Now that my 290X is cooled by an Accelero Hybrid and a cut-apart stock cooler for the VRMs, I hit 1160MHz core and 1500MHz VRAM stable, and at those clocks it's very competitive on the 290 BIOS with a 1300MHz GTX 780; because it no longer throttles, the 290 might just take the win, even with 150MHz lower clocks.
> On the 290X BIOS at 1160MHz there's no competition.


+rep. At the current prices of the 290s in the US (at least), I, for one, cannot recommend them.


----------



## Imprezzion

I'm very happy I bought mine before the mining frenzy started, at just €356 incl. shipping.

The Hybrid was €74 incl. shipping, so for less than €430 I got a full-fledged 290X which is as quiet as can be (except for some pump noise).


----------



## X-oiL

How can you explain this Firestrike Extreme test? Temps are fine, max 82°C on the core and max 70°C on the VRMs, and I haven't changed anything other than the memory clock.

1100/1350 = *9132*

http://www.3dmark.com/fs/1312964

vs

1100/1400 = *8868*

http://www.3dmark.com/fs/1316520


----------



## Durvelle27

Quote:


> Originally Posted by *X-oiL*
> 
> How can you explain this Firestrike Extreme test? Temps are fine, max 82°C on the core and max 70°C on the VRMs, and I haven't changed anything other than the memory clock.
> 
> 1100/1350 = *9132*
> 
> http://www.3dmark.com/fs/1312964
> 
> vs
> 
> 1100/1400 = *8868*
> 
> http://www.3dmark.com/fs/1316520


http://www.3dmark.com/compare/fs/1312964/fs/1316520


----------



## amlett

Quote:


> Originally Posted by *X-oiL*
> 
> How can you explain this Firestrike Extreme test? Temps are fine, max 82°C on the core and max 70°C on the VRMs, and I haven't changed anything other than the memory clock.
> 
> 1100/1350 = *9132*
> 
> http://www.3dmark.com/fs/1312964
> 
> vs
> 
> 1100/1400 = *8868*
> 
> http://www.3dmark.com/fs/1316520


ECC kicks in due to unstable memory frequency --> less performance.
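The regression between the two runs quoted above works out to roughly 3%; a quick check of the arithmetic (my own, not anything from 3DMark):

```python
# Fire Strike Extreme scores from the two runs quoted above.
score_1350, score_1400 = 9132, 8868

# Despite the +50MHz memory clock, the second run scored lower:
drop_pct = (score_1350 - score_1400) / score_1350 * 100
print(round(drop_pct, 1))  # 2.9, i.e. the higher memory clock was ~2.9% slower
```

That direction of change (higher clock, lower score) is the usual symptom of GDDR5 error correction retries eating the bandwidth gain.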


----------



## head9r2k

Anyone know what is OK for 24/7 on an R9 290?

I mean if I run a 1000-1100MHz core clock without increasing voltage.

Can I stay at 1000MHz with no problems?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *head9r2k*
> 
> Anyone know what is OK for 24/7 on an R9 290?
> 
> I mean if I run a 1000-1100MHz core clock without increasing voltage.
> 
> Can I stay at 1000MHz with no problems?


Yes, that's fine.


----------



## X-oiL

Quote:


> Originally Posted by *Durvelle27*
> 
> http://www.3dmark.com/compare/fs/1312964/fs/1316520


Thanks!
Quote:


> Originally Posted by *amlett*
> 
> ECC kicks in due to unstable memory frequency --> less performance


I see, what can I do to prevent that?


----------



## amlett

Your memory seems to be at its limit with that voltage. To push it further you need more voltage.


----------



## rdr09

Quote:


> Originally Posted by *X-oiL*
> 
> How can you explain this Firestrike Extreme test? Temps are fine, max 82 on core and max 70 on VRM, and I haven't changed anything other than the memory clock..
> 
> 1100/1350 = *9132*
> 
> http://www.3dmark.com/fs/1312964
> 
> vs
> 
> 1100/1400 = *8868*
> 
> http://www.3dmark.com/fs/1316520


Prolly the memory is starving for more volts. I read somewhere here that memory OC depends on the core OC and its voltage. Still comes down to the silicon lottery. I can raise my memory to 1500 even though I don't have any way of adjusting its voltage. That's the highest, though. So if you can add just a tiny bit of volts, you might see the same score as the one at 1350, or higher.


----------



## head9r2k

@MrWhiteRX7

Thank you


----------



## X-oiL

Quote:


> Originally Posted by *amlett*
> 
> Your memory seems to be at its limit with that voltage. To push it further you need more voltage.


Quote:


> Originally Posted by *rdr09*
> 
> Prolly the memory is starving for more volts. I read somewhere here that memory OC depends on the core OC and its voltage. Still comes down to the silicon lottery. I can raise my memory to 1500 even though I don't have any way of adjusting its voltage. That's the highest, though. So if you can add just a tiny bit of volts, you might see the same score as the one at 1350, or higher.


It all sounds pretty logical. Should I OC the core first and then start on the memory, or vice versa?


----------



## amlett

Start with core. Mem OC doesn't give as much performance gain as core OC.
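As a rough rule of thumb, the payoff looks something like the sketch below — the 0.8/0.2 weights are purely hypothetical, chosen only to illustrate why core headroom matters more:

```python
# Toy model: overall performance responds much more to core clock than
# to memory clock, so spend your overclocking effort on the core first.
# The weights are illustrative assumptions, not measured Hawaii scaling.
def est_gain_pct(core_oc_pct, mem_oc_pct, core_weight=0.8, mem_weight=0.2):
    """Rough % performance gain from % core / memory overclocks."""
    return core_weight * core_oc_pct + mem_weight * mem_oc_pct

print(est_gain_pct(10, 0))  # 10% core OC -> ~8% gain
print(est_gain_pct(0, 10))  # 10% mem OC  -> ~2% gain
```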


----------



## X-oiL

Thank you sir!


----------



## MrWhiteRX7

LOL, this is kind of funny. AMD giving away the 290 to people that probably have zero clue what a GPU is.


----------



## Widde

Starting to reach the limit of my current rig it seems







http://piclair.com/yqpgo Upgrade time soon









CPU is a crap overclocker and my GPU wasn't that great either


----------



## voldomazta

Hi guys, I have 3x 290X and I'm trying to adjust the OSD data in MSI Afterburner. The problem is only one GPU's temperature graph is detected, because the other two cards are currently turned off due to ZeroCore. GPU 2 & 3 usage can be checked, though, but I need to check their temperatures as well.

The solution (kind of) is to run a game to make all three cards fire up, then restart Afterburner. But I don't want to do that every single time. I hope there is another way to solve this.


----------



## Raephen

Quote:


> Originally Posted by *head9r2k*
> 
> Anyone know what is OK for 24/7? R9 290
> 
> I mean when I run a 1000-1100MHz core clock without increasing voltage.
> 
> Can I stay at 1000MHz with no problems?


Yes, I think you can without a problem.

I run my 290 at 1000MHz without added voltage. I have raised the power limit to +25%, though, but in the runs of Valley I did, I saw no discernible difference between 0, +25% and +50%.


----------



## HardwareDecoder

Quote:


> Originally Posted by *voldomazta*
> 
> Hi guys, I have 3x 290x and I'm trying to adjust OSD data on MSI Afterburner. The problem is only 1 GPU's temperature graph is detected because the other 2 cards are currently turned off due to Zerocore. GPU2&3 usage can be checked though but I need to check their temperatures as well.
> 
> The solution (kind of) is to run a game to make all 3 cards fire up, and restart AfterBurner. But I do not want to do that every single time. I hope there is another way to solve this.


Disable ULPS


----------



## Arizonian

Quote:


> Originally Posted by *voldomazta*
> 
> Hi guys, I have 3x 290x and I'm trying to adjust OSD data on MSI Afterburner. The problem is only 1 GPU's temperature graph is detected because the other 2 cards are currently turned off due to Zerocore. GPU2&3 usage can be checked though but I need to check their temperatures as well.
> 
> The solution (kind of) is to run a game to make all 3 cards fire up, and restart AfterBurner. But I do not want to do that every single time. I hope there is another way to solve this.


Quote:


> Originally Posted by *HardwareDecoder*
> 
> Disable ULPS


Disable ULPS guide by tsm106 on OP first post.
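For anyone who just wants the gist of that guide: ULPS is typically disabled by setting the `EnableUlps` registry value to 0 under each display adapter's class subkey and rebooting. Treat the fragment below as a sketch only — the numbered subkeys (0000, 0001, ...) vary per system, so follow the OP guide rather than pasting this blindly:

```
Windows Registry Editor Version 5.00

; Repeat for each numbered adapter subkey (0000, 0001, ...) under:
; HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Tools like MSI Afterburner can also toggle this for you, which is the safer route.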


----------



## voldomazta

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Quote:
> 
> 
> 
> Originally Posted by *voldomazta*
> 
> Hi guys, I have 3x 290x and I'm trying to adjust OSD data on MSI Afterburner. The problem is only 1 GPU's temperature graph is detected because the other 2 cards are currently turned off due to Zerocore. GPU2&3 usage can be checked though but I need to check their temperatures as well.
> 
> The solution (kind of) is to run a game to make all 3 cards fire up, and restart AfterBurner. But I do not want to do that every single time. I hope there is another way to solve this.
> 
> 
> 
> Disable ULPS
Click to expand...

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *voldomazta*
> 
> Hi guys, I have 3x 290x and I'm trying to adjust OSD data on MSI Afterburner. The problem is only 1 GPU's temperature graph is detected because the other 2 cards are currently turned off due to Zerocore. GPU2&3 usage can be checked though but I need to check their temperatures as well.
> 
> The solution (kind of) is to run a game to make all 3 cards fire up, and restart AfterBurner. But I do not want to do that every single time. I hope there is another way to solve this.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *HardwareDecoder*
> 
> Disable ULPS
> 
> Click to expand...
> 
> Disable ULPS guide by tsm106 on OP first post.
Click to expand...

thanks guys!


----------



## Asrock Extreme7

mines 1.80v load 0.961 idle


----------



## Mr357

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> mines 1.80v load 0.961 idle


*What*? 1.8V is lethal! You've probably already damaged the GPU!


----------



## Asrock Extreme7

Quote:


> Originally Posted by *Mr357*
> 
> *What*? 1.8V is lethal! You've probably already damaged the GPU!


sorry, 1.18. Oops


----------



## grunion

Quote:


> Originally Posted by *Joeking78*
> 
> Are you on air or water?


Air, window cracked, 40°F ambient.
Cards were peaking in the 50s.


----------



## ImJJames

Can't see why anyone would want to go with an aftermarket air cooler on 290s when you can put them under water with a G10 for $80 or less, which will give you much better temps and cooler VRMs compared to the Gelid or Accelero air coolers.


----------



## HardwareDecoder

Quote:


> Originally Posted by *ImJJames*
> 
> Can't see why anyone would want to go with an aftermarket air cooler on 290s when you can put them under water with a G10 for $80 or less, which will give you much better temps and cooler VRMs compared to the Gelid or Accelero air coolers.


I agree. Now that I have my first card ever under water I will never go back. 45C max load, no noise.


----------



## HardwareDecoder

For you guys on waterblocks, what are your VRM temps looking like? Mine are good AFAIK; the only thing that concerns me is that one is 20C higher than the other. I'm pretty sure my block is making good contact with the thermal pads on top of both sets of VRMs.

VRM 1 during BF4 is 53C
VRM 2 during BF4 is 35C

Core maxes at around 45C


----------



## Jack Mac

You're golden, those temps are fine, don't worry about the variation, they're on opposite sides of the PCB lol.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Jack Mac*
> 
> You're golden, those temps are fine, don't worry about the variation, they're on opposite sides of the PCB lol.


Yea, that's what I was thinking. I remember seeing from the side of the card, after I put the waterblock on, that both are definitely making good contact.

A little googling shows they run at different temps.


----------



## robert0507

Quote:


> Originally Posted by *HardwareDecoder*
> 
> For you guys on waterblocks, what are your VRM temps looking like? Mine are good AFAIK; the only thing that concerns me is that one is 20C higher than the other. I'm pretty sure my block is making good contact with the thermal pads on top of both sets of VRMs.
> 
> VRM 1 during BF4 is 53C
> VRM 2 during BF4 is 35C
> 
> Core maxes at around 45C


Full load, my VRM temps are 54 for VRM1 and around 36 for VRM2. I have a Koolance waterblock. What block do you have?


----------



## robert0507

Core is about the same as yours; max is 45 with an OC of 1150 core and 1500 mem.


----------



## HardwareDecoder

Quote:


> Originally Posted by *robert0507*
> 
> Full load, my VRM temps are 54 for VRM1 and around 36 for VRM2. I have a Koolance waterblock. What block do you have?


EK nickel/acetal. OK, our temps are exactly the same on both VRMs and the core.










Seems like I did my first waterblock install correctly then. Not like it was hard.


----------



## jerrolds

Quote:


> Originally Posted by *ImJJames*
> 
> Can't see why anyone would want to go with an aftermarket air cooler on 290s when you can put them under water with a G10 for $80 or less, which will give you much better temps and cooler VRMs compared to the Gelid or Accelero air coolers.


Kraken G10? That was released a good month and a half or so after the 290X.

If I were to get a reference card now, the Kraken or that other user's bracket would be my choice.


----------



## ImJJames

Quote:


> Originally Posted by *jerrolds*
> 
> Kraken G10? That was released a good month and a half or so after the 290X.
> 
> If I were to get a reference card now, the Kraken or that other user's bracket would be my choice.


Well, I mean now, not before. What other user brackets are compatible with 290s?


----------



## escapedmonk

The Accelero Hybrid fits the 290(X), but you'll need some more VRAM sinks; I'd recommend the Alpenföhn passive DRAM/VRAM chip coolers if you can find any in stock


----------



## ImJJames

Quote:


> Originally Posted by *escapedmonk*
> 
> The Accelero Hybrid fits the 290(X), but you'll need some more VRAM sinks; I'd recommend the Alpenföhn passive DRAM/VRAM chip coolers if you can find any in stock


Well, if this works anywhere near as well as the NZXT G10, you won't need heatsinks on the VRMs.


----------



## jerrolds

Quote:


> Originally Posted by *ImJJames*
> 
> Well I mean now, not before. What other user brackets are compatible with 290's?


Off the top of my head these should work with the 290/X: Gelid ICY 2, Prolimatech MK-26, Arctic Cooling Hybrid/Xtreme 3, DWood (without shim), Sigma_Cool MK2 and GPU Cool by richie_2010

The Prolimatech with nice fans or the red Kraken G10 with a nice AIO cooler would look best, I'd say.


----------



## escapedmonk

They can get a bit toasty at high clocks, but it's manageable if you have a good side fan and general airflow. The Alpenföhn heatsinks are much better than the Arctic ones though, especially the big VRM bar seen here


----------



## Jack Mac

Anyone have the MK-26 on their 290? If so, how is it? I'm willing to pay the small premium over the Gelid. Are the VRM temperatures any good with it?


----------



## maynard14

Can I lower the GPU voltage of my card? Please help... mine is a little high. On the stock 290 BIOS, idle is 0.986V and load is 1.244V at stock... I'm a little worried that my card's default voltage is high


----------



## Derpinheimer

Try to get a more stable load; it shouldn't be jumping like that in a game (if it is, that's a problem.. unless it's really not a demanding game and you are using vsync)

Here is what mine looks like:



0.961v idle, 1.188v load

Stock settings.

You have the Asus R9 290x for one BIOS right? Do you get those same 'high' (I say that loosely, that voltage is totally safe) voltages with it?

Voltage could be from your ASIC quality?


----------



## Widde

Does anyone know what this means?

```
Provider Name: Microsoft-Windows-Kernel-Power
Provider Guid: {331C3B3A-2005-44C2-AC5E-77220C37D6B4}
EventID:  41
Version:  3
Level:    1
Task:     63
Opcode:   0
Keywords: 0x8000000000000002
```

Got a black screen and all sound froze, was running furmark at that moment

Got it... "Display driver stopped responding". Could the card have degraded already? +100mV on the core and +13mV on aux; temps never above 85 when gaming


----------



## maynard14

Quote:


> Originally Posted by *Derpinheimer*
> 
> Try to get a more stable load; it shouldn't be jumping like that in a game (if it is, that's a problem.. unless it's really not a demanding game and you are using vsync)
> 
> Here is what mine looks like:
> 
> 
> 
> 0.961v idle, 1.188v load
> 
> Stock settings.
> 
> You have the Asus R9 290x for one BIOS right? Do you get those same 'high' (I say that loosely, that voltage is totally safe) voltages with it?
> 
> Voltage could be from your ASIC quality?


Nope, I'm using the stock BIOS of my XFX R9 290....

Maybe I can lower the voltage a bit?

It's high compared to others... 1.188V load vs my 1.244V


----------



## Raephen

I wouldn't worry about it too much.

Mine tops out at 1.227.

And as mentioned above: it could be related to ASIC quality. Mine is 70.2.


----------



## Asrock Extreme7

Quote:


> Originally Posted by *Raephen*
> 
> I wouldn't worry about it too much.
> 
> Mine tops out at 1.227.
> 
> And as mentioned above: it could be related to ASIC quality. Mine is 70.2.


My ASIC is 72.2, tops out @ 1.186. XFX 290 unlocked to 290X.


----------



## maynard14

Hmmm, so maybe that's why mine is high; my ASIC is only 67 percent... hahah. Maybe that's why my load voltage is 1.244... if I flash to the XFX 290X BIOS my load is only 1.231


----------



## Raephen

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> My ASIC is 72.2, tops out @ 1.186. XFX 290 unlocked to 290X.


Lucky

Ah well, it could be down to the binning of the chips.

I don't worry about it too much - too bad I couldn't unlock my Sapphire 290, and maybe a shame about the slightly higher stock voltage, but I know one thing: 1200MHz seems to be doable at stock volts for me.

I run the card at 1000MHz because, at the moment, I don't need more. But I was curious one day, so I tried some things, ran some checks and played Skyrim a bit. It seemed rock solid stable.

Long live the silicon lottery


----------



## chill24

I'm having a horrible issue of stuttering and up and down gpu usage with 290s in crossfire. Or even without crossfire, for that matter.

I've tried every driver there is, cleaned in safemode and settled on the beta 9.2 as the best so far, but it's still awful. (The 9.5 drivers were giving slow motion static audio in menus with v-sync)

Crysis, BF4 and Unigine Valley all show GPU usage that looks like a lie detector test. And it shows in game: very stuttery, low crappy FPS; can't maintain 60 fps with the exact same settings in the same place that my GTX 670 SLI could.

Same goes for single card use. It's up and down and it shows; it's not just some AB anomaly that can be ignored. (AB beta 17). I'm not trying to overclock and temps never get above 75.... It doesn't even have the chance to.

Mining they will run 100% non stop and top out at 88deg with 50% fan (I have case fans running pretty high).

They are powercolor 290s unlocked to 290x w/ asus bios. Flipping the bios to 290 exhibits the same behavior. (V-sync on and off)

Anybody got an idea what would be causing this? I'm sure its a setting or something somewhere. In any case, every game I've tried they can't touch my gtx 670s and I'm pretty disappointed in them.

Asus V-LK
i7 3770k - OC 4.5
24 gb 1600mhz
xfx 850 watt silver (full 100% load under r9 290 bios mining pulls 730 watts at wall. 290x undervolted pulls 670 at the wall. In game is nowhere close to this. Furmark is ~650 watt)


----------



## maynard14

Yes sir, maybe it's the silicon lottery... I will try to lower the voltage a bit and see if it is stable... thanks guys!


----------



## ebduncan

Quote:


> Originally Posted by *chill24*
> 
> I'm having a horrible issue of stuttering and up and down gpu usage with 290s in crossfire. Or even without crossfire, for that matter.
> 
> I've tried every driver there is, cleaned in safemode and settled on the beta 9.2 as the best so far, but it's still awful. (The 9.5 drivers were giving slow motion static audio in menus with v-sync)
> 
> crysis, bf4, unigine valley all show gpu use that looks like a lie detector test. And it shows in game. Very stuttery and low crappy fps, can't maintain 60 fps with exact same settings in the same place that my gtx 670 sli could.
> 
> Same goes for single card use. It's up and down and it shows; it's not just some AB anomaly that can be ignored. (AB beta 17). I'm not trying to overclock and temps never get above 75.... It doesn't even have the chance to.
> 
> Mining they will run 100% non stop and top out at 88deg with 50% fan (I have case fans running pretty high).
> 
> They are powercolor 290s unlocked to 290x w/ asus bios. Flipping the bios to 290 exhibits the same behavior. (V-sync on and off)
> 
> Anybody got an idea what would be causing this? I'm sure its a setting or something somewhere. In any case, every game I've tried they can't touch my gtx 670s and I'm pretty disappointed in them.
> 
> Asus V-LK
> i7 3770k - OC 4.5
> 24 gb 1600mhz
> xfx 850 watt silver (full 100% load under r9 290 bios mining pulls 730 watts at wall. 290x undervolted pulls 670 at the wall. In game is nowhere close to this. Furmark is ~650 watt)


Are you sure you have both cards installed in the correct PCI-E slots?


----------



## TheRoot

Quote:


> Originally Posted by *CHUNKYBOWSER*
> 
> Got my Sapphire R9 290 BF4 edition today. Unfortunately, I can't unlock it.
> 
> http://www.techpowerup.com/gpuz/ebc4k/
> 
> Stock cooling! Will have it under water later...


Quote:


> Originally Posted by *AddictedGamer93*
> 
> 290X owner here. http://www.techpowerup.com/gpuz/ehghu/
> 
> Sapphire 290X BF4 Bundle
> Stock Cooling
> 76.2% ASIC, Elpida
> 
> Is it just me or are these cards no louder than 79xx?


You missed them, mod


----------



## chill24

Quote:


> Originally Posted by *ebduncan*
> 
> are you sure you have both cards installed in the correct PCI-E slots?


Yes. Same as the 670 SLI; GPU-Z reports PCI-E 3.0 (x8) for both 290s. Verified in BIOS.

At idle, GPU-Z reports the 2nd card's bus as 32-bit, but when doing any work this changes.


----------



## Arizonian

Quote:


> Originally Posted by *CHUNKYBOWSER*
> 
> Got my Sapphire R9 290 BF4 edition today. Unfortunately, I can't unlock it.
> 
> http://www.techpowerup.com/gpuz/ebc4k/
> 
> Stock cooling! Will have it under water later...


Congrats - added









Quote:


> Originally Posted by *AddictedGamer93*
> 
> 290X owner here. http://www.techpowerup.com/gpuz/ehghu/
> 
> Sapphire 290X BF4 Bundle
> Stock Cooling
> 76.2% ASIC, Elpida
> 
> Is it just me or are these cards no louder than 79xx?


Congrats - added









Sorry I missed you guys. I'm in this thread every day, but having to watch five full sections of OCN, and with the speed at which this thread moves, I missed you.


----------



## tuakchucuo

Great. May order two for my 3D/gaming hobbies - maybe when the third-party manufacturers start making higher-end variants.


----------



## mobeious

For anyone looking to get the XSPC 290X/290 waterblock from performancepc.com... they actually do not have them in stock even though they are listed for sale on their website. Found that out today after my order had been pending for 2 days


----------



## stickg1

I ordered a Koolance full cover for my 290! Can't use it for a few weeks though


----------



## bond32

Quote:


> Originally Posted by *stickg1*
> 
> I ordered a Koolance full cover for my 290! Can't use it for a few weeks though


I have the Koolance block, it's pretty awesome. Max temp I have seen was about 56 C, that was mining with my gpu and cpu both at 100% load, warm ambient.


----------



## broken pixel

My XFX x2 290x

```
setx GPU_MAX_ALLOC_PERCENT 100
setx GPU_USE_SYNC_OBJECTS 1
color 04

cgminer --scrypt -I 19 -g 2 -w 256 --thread-concurrency 32765 --lookup-gap 2 --expiry 1 -s 1 --no-submit-stale --queue 0
```

850MHz/1250MHz (-25v)= 814Kh/s *2 @ 570 Watts total system= 1.627Mh/s
66C fan @ 68%

I'm using AB beta 17 to undervolt and underclock, with voltage monitoring disabled.
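As a sanity check on figures like the ones above, mining efficiency is just total hashrate over wall power. A quick sketch, plugging in the numbers quoted in this post:

```python
def kh_per_watt(kh_per_card, n_cards, wall_watts):
    """Scrypt mining efficiency from whole-system wall power."""
    return kh_per_card * n_cards / wall_watts

# Figures from the post above: 814 kH/s per card, two cards, 570 W at the wall
print(round(kh_per_watt(814, 2, 570), 2))  # ~2.86 kH/s per watt
```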


----------



## givmedew

Quote:


> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *stickg1*
> 
> I ordered a Koolance full cover for my 290! Can't use it for a few weeks though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have the Koolance block, it's pretty awesome. Max temp I have seen was about 56 C, that was mining with my gpu and cpu both at 100% load, warm ambient.
Click to expand...


Which version of the Koolance block do you have? I have the older one that goes all the way to the end of the card. I am seeing right now a temperature that is 12.5C above my actual water temperature, which is pretty high because my room is really warm from mining. So right now it is 48C. That is using a temporary installation with a very tiny pump and a 2x120 radiator with (2) 1000RPM fans - the Aquacomputer 2x120 all-in-one stand. The VRMs are running 10C above the GPU for VRM1 and the same temp as the GPU for VRM2, so 58C and 48C.

Wondering if you are seeing similar things with the VRMs.

Once I have my loop back together I will expect 10C lower on all those temps.


----------



## bond32

Yep, those temps are about what I get if I just run my gpu miner. I think the temps are great, been very happy with my koolance block.


----------



## brazilianloser

Quote:


> Originally Posted by *mobeious*
> 
> For anyone looking to get the XSPC 290x/290 waterblock from performancepc.com... they actually do not have them in stock even though they are listed for sale on there website.... found that out today after my order has been pending for 2 days


That is pretty lame of them... Man, I really wish Newegg would sell all the water cooling goods so I wouldn't have to deal with performancepcs and frozencpu


----------



## Stay Puft

Quote:


> Originally Posted by *brazilianloser*
> 
> That is pretty lame of them... Man, I really wish Newegg would sell all the water cooling goods so I wouldn't have to deal with performancepcs and frozencpu


Why the frozencpu hate? They are awesome


----------



## Korayyy

In a recent email, frozencpu said they would have 290x blocks and backplates this Friday and the following Monday.

This was Dec. 13th:

Hello,

I've had my order in for a little while for 2 EK Acetal 290X water blocks which have been back ordered. I was wondering if you had an idea of how long it would be until a new shipment of blocks came in. If you could provide some insight to this I would appreciate it.

Thank you,
K

From Bucky:

Next Friday, and the following Monday we should have more landing to us.

Hope that helps!

EDIT: Sorry I didn't realize you were talking about the XSPC block... Maybe they will arrive around the same time.

I agree Puft, I think frozen is awesome.


----------



## brazilianloser

Quote:


> Originally Posted by *Stay Puft*
> 
> Why the frozencpu hate? They are awesome


There is no hate. They both have their good moments, but they both seem to mess up too. I guess it's the nature of their merchandise, but I would still prefer that a bigger, more reliable wholesaler like Newegg carried this stuff. That was my point.


----------



## utnorris

Any word on Mantle?


----------



## Arizonian

Quote:


> Originally Posted by *utnorris*
> 
> Any word on Mantle?


No official word.

Bummer, because they said mid-December with BF4, but issues with BF4 may have delayed its first implementation, not sure.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Arizonian*
> 
> No official word.
> 
> Bummer, because they said mid-December with BF4, but issues with BF4 may have delayed its first implementation, not sure.


I always thought it was mid-to-late December.

I might need to find the proof for that one, but I'm 90% sure that's the official word.

EDIT: Source here: http://www.slideshare.net/DevCentralAMD/keynote-johan-andersson

Or i can just put it here:


----------



## rquinn19

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I always thought it was mid-to-late December.
> 
> I might need to find the proof for that one, but I'm 90% sure that's the official word.
> 
> EDIT: Source here: http://www.slideshare.net/DevCentralAMD/keynote-johan-andersson
> 
> Or i can just put it here:


They didn't say what year... sorry to be skeptical, but I doubt we'll see Mantle until Thief


----------



## Sgt Bilko

Quote:


> Originally Posted by *rquinn19*
> 
> They didn't say what year... sorry to be skeptical, but I doubt we'll see Mantle until Thief


That is very skeptical, but December means December 2013.

And I seriously doubt they plan to release the Mantle patch a full 14 months after the game was released when they are saying it was only an extra 2 months of work to implement it.


----------



## rquinn19

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That is very skeptical, but December means December 2013.
> 
> And I seriously doubt they plan to release the Mantle patch a full 14 months after the game was released when they are saying it was only an extra 2 months of work to implement it.


I was exaggerating, but no way it's out before New Year's.


----------



## ebduncan

Quote:


> Originally Posted by *Korayyy*
> 
> In a recent email, frozencpu said they would have 290x blocks and backplates this Friday and the following Monday.
> 
> This was Dec. 13th:
> 
> Hello,
> 
> I've had my order in for a little while for 2 EK Acetal 290X water blocks which have been back ordered. I was wondering if you had an idea of how long it would be until a new shipment of blocks came in. If you could provide some insight to this I would appreciate it.
> 
> Thank you,
> K
> 
> From Bucky:
> 
> Next Friday, and the following Monday we should have more landing to us.
> 
> Hope that helps!
> 
> EDIT: Sorry I didn't realize you were talking about the XSPC block... Maybe they will arrive around the same time.
> 
> I agree Puft, I think frozen is awesome.


FrozenCPU is probably the best online resource for computer enthusiast things. I personally know the owner and he's a stand-up guy. I helped him advertise in his beginning stages, back when I was site admin of Inside-Hardware.net. It may seem like a love story, but seriously, FrozenCPU has done a great number of things for folks like us in the computer enthusiast industry. They offer solid prices and a wide product range.

I will order things from Newegg from time to time; however, they are like most other larger e-tailers and will jack up prices because they can get away with it. Look at the R9 290 prices on their site now: $100 over MSRP. So take this into perspective next time you order from them. I personally will shop around and find the best prices from e-tailers who are honest and don't price gouge.


----------



## kizwan

Quote:


> Originally Posted by *velocityx*
> 
> yea, nothing.
> 
> Today I updated my BF4 to the latest patch, fired it up, and saw flickering textures in Crossfire. I'm like, gee, thanks DICE. I switch CF off, go in game, and halfway into the round the game goes black - hard reboot.
> 
> I really don't have time to waste like that... wish AMD said something - is it hardware, is it software? It's not happening every time, but maybe once a day or so...


I suddenly got flickering textures last night when playing BF4. The last thing I did before that happened was disable ULPS. However, enabling ULPS again doesn't solve the problem. When I played on a different server there was no flickering texture whatsoever. One thing I noticed when the flickering textures occur: CPU usage spikes to 80 - 90% (using Open Hardware Monitor to monitor). The overlay graph in BF4 doesn't show any spike, though.

I just started playing BF4 four days ago. I imagine this is a known issue with BF4? Still searching this forum for a similar issue; your post is the first one I found.


----------



## Im Batman

For anyone who is going to water cool.

I wasn't too impressed with the new XSPC R9 290 block I just put in.

They did a bad job machining the threads of the ports, so when putting either the barbs or plugs in I had to basically cut a new thread the whole way, and it left a few plastic shavings in the loop that I just couldn't flush.

One of the ports is almost no good, with the barb about 15 degrees from straight; the plug I replaced it with is the same. This was with XSPC compression fittings.

Still water tight though, so I suppose that's something


----------



## ImJJames

Quote:


> Originally Posted by *kizwan*
> 
> I suddenly got flickering textures last night when playing BF4. The last thing I did before that happened was disable ULPS. However, enabling ULPS again doesn't solve the problem. When I played on a different server there was no flickering texture whatsoever. One thing I noticed when the flickering textures occur: CPU usage spikes to 80 - 90% (using Open Hardware Monitor to monitor). The overlay graph in BF4 doesn't show any spike, though.
> 
> I just started playing BF4 four days ago. I imagine this is a known issue with BF4? Still searching this forum for a similar issue; your post is the first one I found.


No issues with BF4 here; I've been playing it for hours on end every day, overclocked, for weeks.


----------



## voldomazta

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *velocityx*
> 
> yea, nothing.
> 
> Today I updated my BF4 to the latest patch, fired it up, and saw flickering textures in Crossfire. I'm like, gee, thanks DICE. I switch CF off, go in game, and halfway into the round the game goes black - hard reboot.
> 
> I really don't have time to waste like that... wish AMD said something - is it hardware, is it software? It's not happening every time, but maybe once a day or so...
> 
> 
> 
> I suddenly got flickering textures last night when playing BF4. The last thing I did before that happened was disable ULPS. However, enabling ULPS again doesn't solve the problem. When I played on a different server there was no flickering texture whatsoever. One thing I noticed when the flickering textures occur: CPU usage spikes to 80 - 90% (using Open Hardware Monitor to monitor). The overlay graph in BF4 doesn't show any spike, though.
> 
> I just started playing BF4 four days ago. I imagine this is a known issue with BF4? Still searching this forum for a similar issue; your post is the first one I found.
Click to expand...

Are you on multiple cards, perhaps? I'm on trifire and getting the same flickering textures big time. I was scared one of the cards went kaput, so I tried each one individually and they ran without a hitch. I'm not getting this bug in any other game, so I thought it might be BF4-related.


----------



## kizwan

Quote:


> Originally Posted by *voldomazta*
> 
> Are you on multiple cards, perhaps? I'm on trifire and getting the same flickering textures big time. I was scared one of the cards went kaput, so I tried each one individually and they ran without a hitch. I'm not getting this bug in any other game, so I thought it might be BF4-related.


Yes, Crossfire with two cards. It suddenly happened last night; before that they were running fine without flickering. When I connected to a different server there was no flickering texture though - it ran fine. I'll try again tonight.

Did you notice a CPU usage spike when it happened?


----------



## Forceman

Quote:


> Originally Posted by *kizwan*
> 
> Yes, Crossfire with two cards. It suddenly happened last night; before that they were running fine without flickering. When I connected to a different server there was no flickering texture though, it ran fine. I'll try again tonight.
> 
> Did you notice a CPU usage spike when it happened?


It's a known bug with multiple cards. It was introduced with the patch yesterday.


----------



## kizwan

Quote:


> Originally Posted by *Forceman*
> 
> It's a known bug with multiple cards. It was introduced with the patch yesterday.


I see. Thanks for clarifying.


----------



## quakermaas

Quote:


> Originally Posted by *kizwan*
> 
> I see. Thanks for clarifying.


We all have it, SLI and CF, it really is a PITA.







Missing terrain textures is another major problem I've had since the latest patch.

Just got my second Aquacomputer block ordered and two active back plates


----------



## darkelixa

What's the best version of Windows to have installed for the R9 290s? Since I have a new hard drive coming and all: Windows 7, 8, or 8.1?


----------



## velocityx

Anyone having problems with alt-tab? I lose my keyboard and mouse in game whenever I alt-tab out of a game, or it alt-tabs me because of a prompt. I need to quit and rejoin to play.


----------



## Connolly

Is this amount of GPU usage fluctuation normal whilst gaming?



I have v-sync on, if that could make any difference.


----------



## HL2-4-Life

Just scored two R9 290Xs as an early Xmas gift for myself...to replace my three HD7970s


----------



## Sgt Bilko

Quote:


> Originally Posted by *rquinn19*
> 
> I was exaggerating, but no way it's out before New Year's.


It will. AMD can't really afford any more delays, and I know they want to give us an awesome Xmas present


----------



## r0l4n

First 290X with a custom cooler reviewed.

ASUS Radeon R9 290X DirectCU II OC

Quote:


> We measured a top stress temperature of 77 Degrees C on the card, coming from 94 Degrees C on the reference coolers that is rather sweet. Next to that, the card remains to be silent. So that is simply put a massive improvement over the reference cards.


----------



## Sgt Bilko

Quote:


> Originally Posted by *r0l4n*
> 
> First 290X with a custom cooler reviewed.
> 
> ASUS Radeon R9 290X DirectCU II OC


That's looking very promising, especially since it's not a triple-slot behemoth. If the HIS doesn't seem up to the task then I'll grab 2 of these instead









Thanks for posting it


----------



## tdbchess

Hi,

I'm new, I'm from France.
I bought a Sapphire R9 290 and I'm using an Arctic Accelero Xtreme III cooler.

You can see the specs below:



Thanks

Tdbchess


----------



## Jack Mac

Nobody has the MK-26 on a 290/X? Are the included heatsinks good enough for this card?


----------



## kizwan

Quote:


> Originally Posted by *quakermaas*
> 
> We all have it, SLI and CF, it really is a PITA.
> 
> 
> 
> 
> 
> 
> 
> Missing terrain textures is another major problem I've had since the latest patch.
> 
> Just got my second Aquacomputer block ordered and two active back plates
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


I may go with a Koolance waterblock though. No supplier brings in Aquacomputer waterblocks here as far as I know, and EK & XSPC are out of stock, which may take a month+ to restock.

[EDIT] The Koolance water block is only available at the end of next month. I may go with EK & buy directly from them, since my friend told me shipping isn't too bad.


----------



## Stay Puft

Quote:


> Originally Posted by *brazilianloser*
> 
> there is no hate. They both have their good moments but they both seem to mess up too. I guess is the nature of their merchandise but would still prefer that a bigger more reliable wholesaler like Newegg would carry this stuff. That was my point.


Frozen has been around for a long time. I bought my first WCing parts from them back in 2004 and have been loyal ever since. PC WCing isn't a huge money maker, which is why Newegg would never get into it.


----------



## Koniakki

*First OFFICIAL aftermarket custom cooler R9-290X review!*

*ASUS Radeon R9-290X DirectCU II OC*

Already been posted. My bad guys. I got overly excited. Off to read it then.









Also, for those who didn't notice, those 77°C on the card were at 41-43% fan speed.

I'm sure at 70-80% it will drop below 70°C, maybe even 65°C or less.


----------



## RavageTheEarth

Quote:


> Originally Posted by *Connolly*
> 
> Is this amount of GPU usage fluctuation normal whilst gaming?
> 
> 
> 
> I have v-sync on, if that could make any difference.


As long as your games are playing fine it should be alright. If it is bothering you, you can try to go into MSI AB settings and choose unofficial overclocking mode without powerplay support and then restart your computer and see if that helps.
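For anyone wondering where that switch lives: it can also be set by hand in MSIAfterburner.cfg in the Afterburner install folder. A hedged sketch follows, based on Afterburner's config-file keys; the exact section name, EULA wording, and accepted values can differ between versions, so treat this as an outline and verify against your own file:

```ini
; MSIAfterburner.cfg - excerpt (key names assumed; check your version)
[ATIADLHAL]
; 0 = official mode, 1 = unofficial with PowerPlay, 2 = unofficial without PowerPlay
UnofficialOverclockingMode = 2
; Afterburner expects an acknowledgement string before honoring unofficial mode
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
```

Close Afterburner before editing, then restart it (or reboot) so the setting is picked up.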


----------



## TommyMoore

http://www.techpowerup.com/gpuz/dzyz7/

Sapphire R9 290

Adding Waterblock soon.


----------



## Raxus

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That's looking very promising, especially since it's not a triple-slot behemoth. If the HIS doesn't seem up to the task then I'll grab 2 of these instead
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for posting it


Question is, how much will these retail for? The reference models are at $630; I could only imagine these would be closer to $650-700?


----------



## amlett

AMD_Catalyst_13.12_W7_W8_WHQL

http://www.sapphiretech.com/presentation/downloads/?pid=2085&psn=0006&lid=1&os=15


----------



## stickg1

Have you tried those drivers yet? I will try when I get home from work.


----------



## Arizonian

Quote:


> Originally Posted by *quakermaas*
> 
> We all have it, SLI and CF, it really is a PITA.
> 
> 
> 
> 
> 
> 
> 
> Missing terrain textures is another major problem I've had since the latest patch.
> 
> Just got my second Aquacomputer block ordered and two active back plates
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated








Quote:


> Originally Posted by *HL2-4-Life*
> 
> Just score two R9 290X as an early X'mas gift for myself...to replace my three HD7970's
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added








Quote:


> Originally Posted by *tdbchess*
> 
> Hi,
> 
> I'm new, I'm from France.
> I bought a Sapphire R9 290 and I'm using an Arctic Accelero Xtreme III cooler.
> 
> You can see the specs below:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Thanks
> 
> Tdbchess


Congrats - added









Quote:


> Originally Posted by *TommyMoore*
> 
> http://www.techpowerup.com/gpuz/dzyz7/
> 
> Sapphire R9 290
> 
> Adding Waterblock soon.


Congrats - added









This puts us officially at 203 owners on the roster and that's before any non-reference cards came out. Wow.


----------



## Redvineal

Quote:


> Originally Posted by *Jack Mac*
> 
> Nobody has the MK-26 on a 290/X? Are the included heatsinks good enough for this card?


I have an MK-26 waiting for me at home. I plan to install it tomorrow or Friday evening. Full disclosure, though, I won't be using any of the VRM sinks since I went with the stock VRM chop mod:

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/7580#post_21246084

Regardless, I'll be sure to update you on the installation and Furmark burn results.


----------



## Arizonian

Quote:


> Originally Posted by *amlett*
> 
> AMD_Catalyst_13.12_W7_W8_WHQL
> 
> http://www.sapphiretech.com/presentation/downloads/?pid=2085&psn=0006&lid=1&os=15


Thanks for the update - added to the OP.


----------



## Asrock Extreme7

Quote:


> Originally Posted by *RavageTheEarth*
> 
> As long as your games are playing fine it should be alright. If it is bothering you, you can try to go into MSI AB settings and choose unofficial overclocking mode without powerplay support and then restart your computer and see if that helps.


What will unofficial overclocking mode without PowerPlay support do?


----------



## ImJJames

I'm a little hesitant about using the new 13.12 WHQL drivers; someone please give us updates on your experience so far compared to beta 9.5


----------



## Asrock Extreme7

Quote:


> Originally Posted by *ImJJames*
> 
> I'm a little hesitant on using the new 13.12 WHQL drivers, someone please give us updates on your experience so far compared to beta 9.5


It's not on AMD's website


----------



## sf101

I've been messing with my 290X for days now after installing the waterblock.

Is there any gold-standard way at the moment of overclocking these cards with the least amount of issues and throttling?

I reinstalled the OS (Win 8), started up ASUS GPU Tweak, and set the tool to 100-150% power, 1250-1335 mV, 1100 core, 1400 mem, and I'm watching the core sit at 1030 MHz in GPU-Z, not moving above that. GPU at 34°C and VRMs at 30°C and 33°C respectively; it can't be throttling, right?

On my Win 7 install I had massive issues OCing the card past 1125 core. I figured it was partially due to crashing the OS so much and partially due to trying different OC tools / drivers / BIOS revisions, so I reinstalled the OS and went with Win 8 VLK, and I'm hoping not to repeat the same mistake and cause lots of issues before I get solid OCs down.

So far I have not enabled CCC Overdrive, and I followed the guide to disable ULPS via regedit.

Anyone who has a better / bulletproof way to OC this card, I'm all ears at this point.

I should add I'm currently using the PT1.rom BIOS; it seemed the most stable for me. I have tried asus.rom / PT1 / PT3 (that one didn't work well at all) / MSI's newest BIOS, and even the stock XFX 290 BIOS. Same OC issues all round.
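Since the ULPS regedit comes up repeatedly in this thread, here is a hedged sketch of what that tweak usually looks like in .reg form. The {4d36e968-...} class GUID is the standard display-adapter class, but the four-digit subkey (0000, 0001, ...) varies per system and per card, so inspect which subkeys actually hold your AMD entries in regedit before importing anything like this:

```reg
Windows Registry Editor Version 5.00

; Example only: disable ULPS (Ultra Low Power State) on one adapter subkey.
; Repeat for each AMD display-adapter subkey that contains an EnableUlps value.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Reboot after importing; setting the value back to dword:00000001 re-enables ULPS.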


----------



## pkrexer

Quote:


> Originally Posted by *sf101*
> 
> I've been messing with my 290X for days now after installing the waterblock.
> 
> Is there any gold-standard way at the moment of overclocking these cards with the least amount of issues and throttling?
> 
> I reinstalled the OS (Win 8), started up ASUS GPU Tweak, and set the tool to 100-150% power, 1250-1335 mV, 1100 core, 1400 mem, and I'm watching the core sit at 1030 MHz in GPU-Z, not moving above that. GPU at 34°C and VRMs at 30°C and 33°C respectively; it can't be throttling, right?
> 
> On my Win 7 install I had massive issues OCing the card past 1125 core. I figured it was partially due to crashing the OS so much and partially due to trying different OC tools / drivers / BIOS revisions, so I reinstalled the OS and went with Win 8 VLK, and I'm hoping not to repeat the same mistake and cause lots of issues before I get solid OCs down.
> 
> So far I have not enabled CCC Overdrive, and I followed the guide to disable ULPS via regedit.
> 
> Anyone who has a better / bulletproof way to OC this card, I'm all ears at this point.
> 
> I should add I'm currently using the PT1.rom BIOS; it seemed the most stable for me. I have tried asus.rom / PT1 / PT3 (that one didn't work well at all) / MSI's newest BIOS, and even the stock XFX 290 BIOS. Same OC issues all round.


I had to enable Overdrive to allow my overclocks to stick, otherwise they would just keep resetting back to default. I would suggest just using AB for overclocking; the only thing GPU Tweak has over it is the slightly higher voltage limit.


----------



## the9quad

Quote:


> Originally Posted by *ImJJames*
> 
> I'm a little hesitant on using the new 13.12 WHQL drivers, someone please give us updates on your experience so far compared to beta 9.5


About 100 points lower in firestrike extreme. The betas were faster.


----------



## sf101

Any difference in stability though, the9quad?


----------



## X-oiL

Quote:


> Originally Posted by *the9quad*
> 
> About 100 points lower in firestrike extreme. The betas were faster.


9.5 or 9.2?


----------



## sf101

Quote:


> Originally Posted by *pkrexer*
> 
> I had to enable Overdrive to allow my overclocks to stick, otherwise they would just keep resetting back to default. I would suggest just using AB for overclocking; the only thing GPU Tweak has over it is the slightly higher voltage limit.


Afterburner with Overdrive disabled?

Or do you just require Overdrive enabled for ASUS GPU Tweak?

I know I'm being rather specific; it's just that with all the issues I've had with this card so far, I keep thinking I must be doing something wrong. I've never had so many problems clocking a card as this, and I've owned a ton of gfx cards lol


----------



## bond32

So far there seems to be improvement with the latest 13.12 drivers. I can actually OC and not get black-screened... so far...


----------



## MrWhiteRX7

For the people sometimes having video playback issues, this new driver could fix it, according to its release notes:

"Resolves Metro applications experiencing frame drops during playback of interlaced video content"


----------



## Rar4f

What monitoring and overclocking software would be good for a Sapphire R9 290?


----------



## Jack Mac

Quote:


> Originally Posted by *Redvineal*
> 
> I have an MK-26 waiting for me at home. I plan to install it tomorrow or Friday evening. Full disclosure, though, I won't be using any of the VRM sinks since I went with the stock VRM chop mod:
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/7580#post_21246084
> 
> Regardless, I'll be sure to update you on the installation and Furmark burn results.


Thank you so much, do you think the MK-26 heatsinks are enough and does it come with enough memory sinks? And rep.


----------



## Heinz68

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That's looking very promising, especially since it's not a triple-slot behemoth. If the HIS doesn't seem up to the task then I'll grab 2 of these instead
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for posting it


Quote:


> Originally Posted by *Raxus*
> 
> Question is, how much will these retail for? The reference models are at $630; I could only imagine these would be closer to $650-700?


The ASUS R9 290X DirectCU II MSRP is $570, as per this HardwareCanucks review. How much it will sell for might be another story.


----------



## Raxus

Quote:


> Originally Posted by *Heinz68*
> 
> The ASUS R9 290X DirectCU II MSRP is $570, as per this HardwareCanucks review. How much it will sell for might be another story.


What's the MSRP of the reference models? $550?

Safe to say this card will be around $650? Getting really close to Ti price there. Hopefully it comes in lower.


----------



## Forceman

Quote:


> Originally Posted by *Connolly*
> 
> Is this amount of GPU usage fluctuation normal whilst gaming?
> 
> 
> 
> I have v-sync on, if that could make any difference.


If you have voltage monitoring enabled, try disabling it. Seems to help some people.
Quote:


> Originally Posted by *Rar4f*
> 
> What monitoring and overclocking software would be good for a Sapphire R9 290?


MSI Afterburner works for all cards, and is pretty much the standard.


----------



## ImJJames

Quote:


> Originally Posted by *Raxus*
> 
> What's the MSRP of the reference models? $550?
> 
> Safe to say this card will be around $650? Getting really close to Ti price there. Hopefully it comes in lower.


The non-reference R9 290 will be the best buy.


----------



## Rar4f

Quote:


> Originally Posted by *Forceman*
> 
> If you have voltage monitoring enabled, try disabling it. Seems to help some people.
> MSI Afterburner works for all cards, and is pretty much the standard.


Thanks.


----------



## sf101

Well, maybe for me 1100 core / 1400 mem is all I can do, I guess. I just have issues past this point.


----------



## Arizonian

Quote:


> Originally Posted by *sf101*
> 
> Well, maybe for me 1100 core / 1400 mem is all I can do, I guess. I just have issues past this point.


Actually that's a decent overclock on the core and very good on memory, from my experience.


----------



## sf101

Quote:


> Originally Posted by *Arizonian*
> 
> Actually that's a decent overclock on the core and very good on memory, from my experience.


Oh, I thought most were getting like 1200. This is gaming stable; I benched at higher clocks (1225 in 3DMark 11/2013), I just can't run a game at more than 1100 stable. I thought I could do 1140 for a while there, but eventually it flickered, went blurry, and then black-screened. 1125, same deal, it just took even longer to break; I could probably get it stable at 1125 but it would just take too dang long.

Mem might be able to do 1450, but I had issues at 1500 so I gave it 100 MHz of leeway.


----------



## VSG

Nah, game stable at 1100 MHz is pretty great.


----------



## Sgt Bilko

Quote:


> Originally Posted by *sf101*
> 
> Oh, I thought most were getting like 1200. This is gaming stable; I benched at higher clocks (1225 in 3DMark 11/2013), I just can't run a game at 1100+ stable. I thought I could do 1140 for a while there, but eventually it flickered, went blurry, and then black-screened. 1125, same deal, it just took even longer to break; I could probably get it stable at 1125 but it would just take too dang long.
> 
> Mem might be able to do 1450, but I had issues at 1500 so I gave it 100 MHz of leeway.


I ran mine at 1100/1300 for 24/7 clocks and up to 1180/1500 for benches, and I was happy with that (ASIC 70.2%, from memory).

Those clocks are fairly normal and quite good


----------



## stickg1

Anyone load up 13.12? I'm downloading it now...


----------



## darkelixa

My freight just arrived this morning 
















Won't get time to unpack and install until tonight though.

So is Windows 7 better than 8 for the AMD cards?


----------



## jerrolds

I've run as high as 1220 MHz benching on aftermarket air... but my true game stable is 1180/1500, and I only crank it that high for BF4. Other games I'm OK at 1100 or lower.


----------



## UNOE

Anyone think Crossfire will work with risers?


----------



## Sgt Bilko

Quote:


> Originally Posted by *stickg1*
> 
> Anyone load up 13.12? I'm downloading it now...


Someone said a little earlier that it's about 100 points lower than the betas, and another said that it fixed their black screen issues (maybe)
Quote:


> Originally Posted by *darkelixa*
> 
> My freight just arrived this morning
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Won't get time to unpack and install until tonight though.
> 
> So is Windows 7 better than 8 for the AMD cards?


Grats dude!!

I think Win 8 is better (I don't have it so I'm not 100% sure), at least with BF4 and a few other cases... being on Win 7 isn't going to slow you down significantly though


----------



## darkelixa

Yeah man, got my stuff from Ijk in 2 days from Sydney to rural QLD, so really damn good.

Last time I tried Windows 8 it kept giving an odd device connect/device disconnect when it booted into Windows; this didn't happen with 7.


----------



## MrWhiteRX7

Win 8 is a lot faster in all games, from my personal experience with these GPUs.


----------



## Widde

Quote:


> Originally Posted by *amlett*
> 
> AMD_Catalyst_13.12_W7_W8_WHQL
> 
> http://www.sapphiretech.com/presentation/downloads/?pid=2085&psn=0006&lid=1&os=15


Assuming it works with every "brand" since they're reference cards, correct?

Can't find it on AMD's website


----------



## quakermaas

Quote:


> Originally Posted by *Widde*
> 
> Assuming it works with every "brand" since they're reference cards, correct?
> 
> Can't find it on AMD's website


http://support.amd.com/en-us/download/desktop?os=Windows+8+-+64


----------



## Sgt Bilko

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Win 8 is a lot faster in all games from my personal experience with these GPU's.


Quote:


> Originally Posted by *darkelixa*
> 
> My freight just arrived this morning
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Won't get time to unpack and install until tonight though.
> 
> So is Windows 7 better than 8 for the AMD cards?


First hand knowledge right there. Thanks Mr White









I really should make the switch sometime..... gah, so much crap to back up


----------



## Connolly

Quote:


> Originally Posted by *Forceman*
> 
> If you have voltage monitoring enabled, try disabling it. Seems to help some people.
> MSI Afterburner works for all cards, and is pretty much the standard.


Quote:


> Originally Posted by *RavageTheEarth*
> 
> As long as your games are playing fine it should be alright. If it is bothering you, you can try to go into MSI AB settings and choose unofficial overclocking mode without powerplay support and then restart your computer and see if that helps.


Will do tomorrow, thanks for the tip


----------



## stickg1

I got better and more stable performance from my 290 when I switched back to Windows 8.1 (from Win7)

I just installed 13.12 drivers. For me anyway, it fixed my power limit slider so it actually does something. I just benched 3DMark stable at 1150/1550 which I had failed to do on previous drivers. Before I was stuck at 1100MHz because no matter how I changed the slider in CCC or AB it never seemed to take on more power.

I hope these drivers work well for everyone else, I'm pretty happy although my results could merely be a coincidence. I just got them installed 30 minutes ago.

http://www.3dmark.com/3dm/1884494


----------



## H1GHL4ND3R

Hi Guys!
After a long time of waiting I got my EK block for my 290, and after installation I'm not sure it's meant to be like that, but my VRM temps are kind of weird. At stock GPU-Z reports VRM1 at 25°C and VRM2 at 26°C, which I think is low, but under load, especially under the Kombustor stress test, VRM1 skyrockets to 78-80°C and wobbles around there, while VRM2 sits around 60-63°C. That's all on stock clocks, no OC. Is it normal to get almost a 20°C difference between VRM1 and VRM2?
To top it up, this isn't my first block (actually my third), and I checked my thermal pads before screwing the block to the PCB: I set it, pulled it back, and it seems all the pads were compressed, so I gently set it back. And no, I didn't mix them up; I know the thinner pads are for the VRAM and the thicker ones for the VRMs. All the other temps: GPU 29°C on the Windows desktop, dropping to 28°C at moments, and 52°C max after an hour of Kombustor stressing, which is not too bad on a single loop with a CPU at 5 GHz, a single 3 cm thick XSPC triple rad, and an XSPC combo res/pump (750 l/h). So what do you think: reseat the block, are these temps on a 290 totally fine under load, or maybe RMA the card? Anyone?

I want to note that I'm not a native English speaker, so excuse my terrible grammar, and thanks for the help ;P


----------



## HardwareDecoder

Quote:


> Originally Posted by *H1GHL4ND3R*
> 
> Hi Guys!
> After a long time of waiting I got my EK block for my 290, and after installation I'm not sure it's meant to be like that, but my VRM temps are kind of weird. At stock GPU-Z reports VRM1 at 25°C and VRM2 at 26°C, which I think is low, but under load, especially under the Kombustor stress test, VRM1 skyrockets to 78-80°C and wobbles around there, while VRM2 sits around 60-63°C. That's all on stock clocks, no OC. Is it normal to get almost a 20°C difference between VRM1 and VRM2?
> To top it up, this isn't my first block (actually my third), and I checked my thermal pads before screwing the block to the PCB: I set it, pulled it back, and it seems all the pads were compressed, so I gently set it back. And no, I didn't mix them up; I know the thinner pads are for the VRAM and the thicker ones for the VRMs. All the other temps: GPU 29°C on the Windows desktop, dropping to 28°C at moments, and 52°C max after an hour of Kombustor stressing, which is not too bad on a single loop with a CPU at 5 GHz, a single 3 cm thick XSPC triple rad, and an XSPC combo res/pump (750 l/h). So what do you think: reseat the block, are these temps on a 290 totally fine under load, or maybe RMA the card? Anyone?
> 
> I want to note that I'm not a native English speaker, so excuse my terrible grammar, and thanks for the help ;P


Your temps are way too high; something is wrong. Another guy and I (me with an EK block, him with Koolance) both have temps of 45ish on the core, 53°C on VRM1, and 35°C on VRM2.

So yes, it's normal to have a 20°C variance between VRMs, but in general your temps suck


----------



## Sherp

Quote:


> Originally Posted by *Jack Mac*
> 
> Anyone have the MK-26 on their 290? If so, how is it? I'm willing to pay the small premium over the Gelid. Are the VRM temperatures any good with it?


I have a Gelid Icy Vision on my R9 290. At stock, BF4 maxed out @ 1920x1200:

Core MAX 64°C
VRM1 MAX 65°C
VRM2 MAX 59°C

With FurMark:

Core MAX 73°C
VRM1 MAX 82°C
VRM2 MAX 62°C

Not bad for a £33 cooler. Admittedly I did have to sand the bottom of the VRM1 heatsink and secure it with tiny cable ties. In a year or so, when I need to overclock my 290s, I'll set up a custom loop. For now they chew through anything I throw at them.

Though if I could go back in time, I would just wait for the non-ref 290s, as the price would've ended up being roughly the same anyway.


----------



## Jack Mac

The Gelid isn't as nice looking as the black MK-26, and they're similarly priced where I live. I want to know if the VRM and memory heatsinks that come with it are enough (quantity- and quality-wise).


----------



## H1GHL4ND3R

Yeah, I thought so. I'll check tomorrow, but I think I'm gonna take out the pads and instead go with IC Diamond 7. What do you think?


----------



## H1GHL4ND3R

Quote:


> Originally Posted by *HardwareDecoder*
> 
> your temps are way too high, something is wrong me and another guy (me with ek block, him with koolance) both have temps of 45ish on the core and 53C on vrm one, 35c on vrm 2.
> 
> So yes it is normal to have a 20c variance between vrm's but in general your temps suck


Can you also tell me if your 53°C max on VRM1 is when running FurMark or Kombustor, or in gaming? Because those are two totally different things. While gaming my VRM1 doesn't go over 60°C and VRM2 doesn't reach 45°C. Also note that I'm NOT on a thick 6 cm rad but a 3 cm one, and my CPU at 5 GHz also raises the overall temps in a single loop. So please, can you run FurMark or Kombustor and confirm it for me? Thanks a lot in advance.


----------



## Sherp

Quote:


> Originally Posted by *Jack Mac*
> 
> The Gelid isn't as nice looking as the black MK-26, and they're similarly priced where I live. I want to know if the VRM heatsinks and memory heatsinks that come with it are enough (quantity and quality wise).


To be fair, they're both quite ugly coolers. I would imagine performance-wise they're about the same.

According to the website you get 16 RAM sinks. If the black MK-26 is the same price as the Gelid, go for whichever one you like the look of most, I suppose.

In the UK the Gelid (£33) is almost half the price of the MK-26 (£56)


----------



## HardwareDecoder

Quote:


> Originally Posted by *H1GHL4ND3R*
> 
> Can you also tell me if your 53°C max on VRM1 is when running FurMark or Kombustor, or in gaming? Because those are two totally different things. While gaming my VRM1 doesn't go over 60°C and VRM2 doesn't reach 45°C. Also note that I'm NOT on a thick 6 cm rad but a 3 cm one, and my CPU at 5 GHz also raises the overall temps in a single loop. So please, can you run FurMark or Kombustor and confirm it for me? Thanks a lot in advance.


Okay, lol, my VRM1 hit 82°C with FurMark (extreme burn-in test @ 1440p w/ 8x MSAA). Wow, FurMark is crazy. Yeah, in gaming my VRM1 is never over 60°C.

What app are you using to monitor VRM temps? I'm using GPU-Z 0.7.4.

My VRM2 temp never goes over 35°C in anything. Maybe it is reading it wrong?
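Since GPU-Z can log its sensors to a text file ("Log to file" in the Sensors tab), one way to compare readings after a session is to pull the peaks out of that log rather than watching it live. A minimal sketch in Python; the column headers below are hypothetical stand-ins, so match them to whatever the header row of your own GPU-Z log actually says:

```python
import csv
import io

# GPU-Z's "Log to file" option writes a comma-separated sensor log.
# These header names are hypothetical stand-ins -- check the first
# line of your own log and adjust.
SAMPLE_LOG = """\
Date, GPU Temperature [C], VRM Temperature 1 [C], VRM Temperature 2 [C]
2013-12-18 20:00:01, 45.0, 53.0, 35.0
2013-12-18 20:00:02, 46.0, 82.0, 36.0
2013-12-18 20:00:03, 44.0, 60.0, 34.0
"""

def max_temps(log_text):
    """Return {column_name: peak_value} for every temperature column."""
    reader = csv.DictReader(io.StringIO(log_text), skipinitialspace=True)
    maxima = {}
    for row in reader:
        for name, value in row.items():
            if name and "Temperature" in name:
                v = float(value)
                if v > maxima.get(name, float("-inf")):
                    maxima[name] = v
    return maxima

if __name__ == "__main__":
    # Swap SAMPLE_LOG for open("GPU-Z Sensor Log.txt").read() on a real log.
    for name, peak in sorted(max_temps(SAMPLE_LOG).items()):
        print(f"{name}: max {peak:.1f} C")
```

On a real log, read the file instead of the embedded sample; the peaks make it easy to compare FurMark runs against gaming sessions.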


----------



## H1GHL4ND3R

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Okay, lol, my VRM1 hit 82°C with FurMark (extreme burn-in test @ 1440p w/ 8x MSAA). Wow, FurMark is crazy. Yeah, in gaming my VRM1 is never over 60°C.
> 
> What app are you using to monitor VRM temps? I'm using GPU-Z 0.7.4.
> 
> My VRM2 temp never goes over 35°C in anything. Maybe it is reading it wrong?


I'm also using GPU-Z 0.7.4, plus MSI Afterburner while in game, but I need to run games in windowed mode to see VRM temps. Anyway, at max settings in CoD: Ghosts, like I mentioned, I don't go over 60°C on VRM1 and max 44°C (usually 43°C) on VRM2, so it seems it's fine. I'm gonna try the IC Diamond 7 tomorrow; it took almost 4°C off my CPU at 5 GHz, so I hope it will help with the GPU too. Thanks for the help, m8 ;P


----------



## EmZkY

Quote:


> Originally Posted by *H1GHL4ND3R*
> 
> Hi Guys!
> After a long time of waiting I got my EK block for my 290, and after installation I'm not sure it's meant to be like that, but my VRM temps are kind of weird. At stock GPU-Z reports VRM1 at 25°C and VRM2 at 26°C, which I think is low, but under load, especially under the Kombustor stress test, VRM1 skyrockets to 78-80°C and wobbles around there, while VRM2 sits around 60-63°C. That's all on stock clocks, no OC. Is it normal to get almost a 20°C difference between VRM1 and VRM2?
> To top it up, this isn't my first block (actually my third), and I checked my thermal pads before screwing the block to the PCB: I set it, pulled it back, and it seems all the pads were compressed, so I gently set it back. And no, I didn't mix them up; I know the thinner pads are for the VRAM and the thicker ones for the VRMs. All the other temps: GPU 29°C on the Windows desktop, dropping to 28°C at moments, and 52°C max after an hour of Kombustor stressing, which is not too bad on a single loop with a CPU at 5 GHz, a single 3 cm thick XSPC triple rad, and an XSPC combo res/pump (750 l/h). So what do you think: reseat the block, are these temps on a 290 totally fine under load, or maybe RMA the card? Anyone?
> 
> I want to note that I'm not a native English speaker, so excuse my terrible grammar, and thanks for the help ;P


I have the EK block too. I did the thermal pad installation according to the instructions, and I also see these temperature differences. When overclocked, under load while mining, I would get 80°C on VRM1.

Edit: VRM1


----------



## devilhead

Quote:


> Originally Posted by *EmZkY*
> 
> I have the EK block too. I did the thermal pad installation according to the instructions, and I also see these temperature differences. When overclocked, under load while mining, I would get 80°C on VRM1.
> 
> Edit: VRM1


Hmm, I have installed the EK block too. My temps in BF4: core 39°C, VRM1 42°C, VRM2 32°C;

when mining: core 40°C, VRM1 48°C, VRM2 38°C. Between the thermal pads I have applied MX-4.


----------



## HardwareDecoder

Quote:


> Originally Posted by *devilhead*
> 
> Hmm, I have installed the EK block too. My temps in BF4: core 39°C, VRM1 42°C, VRM2 32°C;
> 
> when mining: core 40°C, VRM1 48°C, VRM2 38°C. Between the thermal pads I have applied MX-4.


Can you run the FurMark extreme burn-in test with 8x MSAA @ 1440p? Or even just 1080p if you don't have a 1440p monitor.

Idk how mining relates to FurMark, but I'd be curious to see what you get on it.

Quote:


> Originally Posted by *EmZkY*
> 
> I have the EK block too. I did the thermal pad installation according to the instructions, and I also see these temperature differences. When overclocked, under load while mining, I would get 80°C on VRM1.
> 
> Edit: VRM1


Can you also run FurMark on extreme burn-in?

I put some of the EK thermal paste between the VRMs and the thermal pad, like it said in the instructions. I'm wondering if it would have helped to put it between the block and the thermal pad also?

It seems 80°C VRM temps under mining/FurMark are normal, however. Makes me wonder if these things would explode on the reference cooler. What's the max safe temp for these VRMs anyway?


----------



## Warsam71

Hello everyone (@Arizonian),

Sorry if I'm late...I just wanted to let you know we just released a new driver today (13.12 WHQL), you may download it from here: http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64
(Make sure to read the Release Notes if you can, here is the link: http://support.amd.com/en-us/kb-articles/Pages/AMDCatalyst13-12WINReleaseNotes.aspx)

By the way, wanted to thank those that have PM'ed recently, your feedback/comments have been very helpful.


----------



## ImJJames

Quote:


> Originally Posted by *stickg1*
> 
> I got better and more stable performance from my 290 when I switched back to Windows 8.1 (from Win7)
> 
> I just installed 13.12 drivers. For me anyway, it fixed my power limit slider so it actually does something. I just benched 3DMark stable at 1150/1550 which I had failed to do on previous drivers. Before I was stuck at 1100MHz because no matter how I changed the slider in CCC or AB it never seemed to take on more power.
> 
> I hope these drivers work well for everyone else, I'm pretty happy although my results could merely be a coincidence. I just got them installed 30 minutes ago.


Glad to hear it worked out for you


----------



## H1GHL4ND3R

Quote:


> Originally Posted by *HardwareDecoder*
> 
> can you run furmark extreme burn in test with 8x msaa @ 1440p? Or even just 1080 If you don't have a 1440p monitor.
> 
> Idk how mining relates to furmark but i'd be curious to see what you get on it.
> can you also run furmark on xtreme burn in ?
> 
> I put some of the EK thermal paste in between the vrm and the thermal pad like it said in the instructions. I'm wondering if it would have helped to put it in between the block and the thermal pad also?
> 
> It seems 80c vrm temps under mining/furmark are normal however. Makes me wonder if these things would explode on the reference cooler? Whats the max safe temp for these vrm's anyway ?


Well, it's hard to say without looking at the specs of the exact VRM model, but I'm going to check the model tomorrow when I apply IC Diamond 7. VRMs usually run hot; that's just how these regulators are designed. According to the HWBot forum, a stock 290X hits up to 110C on VRM1 and around 85C on VRM2 in stress testing, so we are well under that. Looking at the temperature drops, the EK block overall drops temps around 30-40C depending on the chip. My 290 during OC testing at 1100MHz with full fan speed didn't go over 90C, which gives a 30C+ drop, and hitting 80C on VRM1 is a similar drop, so I think these temps are fine. I'm going to check the temps again tomorrow after installing an aftermarket thermal compound; the EK compound seems pretty poor to me, it's kind of weird when a paste acts more like a liquid than a paste xD. While applying it I was wondering whether I should use IC Diamond instead; lazy me, now I need to do it all over again ;P. Luckily my GPU is on Swiftech quick-disconnects, so it will be an easy removal ;P


----------



## ImJJames

Does anyone know what "ACP Application" does? Gives you the option to install it with the new 13.12 WHQL drivers.


----------



## Forceman

Quote:


> Originally Posted by *ImJJames*
> 
> Does anyone know what "ACP Application" does? Gives you the option to install it with the new 13.12 WHQL drivers.


I was wondering that also, it's been in the last couple of betas as well.


----------



## H1GHL4ND3R

Quote:


> Originally Posted by *ImJJames*
> 
> Does anyone know what "ACP Application" does? Gives you the option to install it with the new 13.12 WHQL drivers.


It's an audio driver for the built-in HDMI audio; if I'm not mistaken it's for TrueAudio.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Warsam71*
> 
> Hello everyone (@Arizonian),
> 
> Sorry if I'm late...I just wanted to let you know we just released a new driver today (13.12 WHQL), you may download it from here: http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64
> (Make sure to read the Release Notes if you can, here is the link: http://support.amd.com/en-us/kb-articles/Pages/AMDCatalyst13-12WINReleaseNotes.aspx)
> 
> By the way, wanted to thank those that have PM'ed recently, your feedback/comments have been very helpful.


----------



## Stay Puft

Amazon had 5 XFX reference cards for $519. All gone in 5 minutes. I wonder if these people are actually monitoring the price of LTC.


----------



## Warsam71

Quote:


> Originally Posted by *ImJJames*
> 
> Does anyone know what "ACP Application" does? Gives you the option to install it with the new 13.12 WHQL drivers.


Quote:


> Originally Posted by *Forceman*
> 
> I was wondering that also, it's been in the last couple of betas as well.


I'm not 100% sure (in relation to the driver), but it's the Average CPU Power metric, take a look: http://search.amd.com/en-us/Pages/results-all.aspx?k=ACP

I am going to ask my colleagues at work tomorrow (just to make sure I'm not providing you the wrong information)...


----------



## Arizonian

Quote:


> Originally Posted by *Warsam71*
> 
> Hello everyone (@Arizonian),
> 
> Sorry if I'm late...I just wanted to let you know we just released a new driver today (13.12 WHQL), you may download it from here: http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64
> (Make sure to read the Release Notes if you can, here is the link: http://support.amd.com/en-us/kb-articles/Pages/AMDCatalyst13-12WINReleaseNotes.aspx)
> 
> By the way, wanted to thank those that have PM'ed recently, your feedback/comments have been very helpful.


Sweet glad to hear it.
Quote:


> RESOLVED ISSUES
> May resolve intermittent black screens or display loss observed on some AMD Radeon™ R9 290X and AMD Radeon R9 290 graphics cards
> Resolves intermittent crashes seen in legacy DirectX® 9 applications
> AMD Radeon™ R9 290 Series - Power Tune update to reduce variance of fan speed / RPM
> PCI-E bus speed is no longer set to x1 on the secondary GPU when running in an AMD CrossFire configuration
> Resolves incorrect HDMI Audio Driver information being listed in the AMD Catalyst Control Center
> Resolves AMD Steady Video option being grayed out in the AMD Catalyst Control Center
> Resolves intermittent flickering seen on some AMD Radeon R9 270X graphics cards
> Resolves graphics corruption issues found in Starcraft®
> Resolves image corruption seen in Autodesk Inventor 2014
> Resolves flickering water corruption found in World of Warcraft®
> Resolves intermittent black screen when resuming from a S3/S4 sleep-state if the display is unplugged during the sleep-state on systems supporting AMD Enduro™ Technology
> Resolves intermittent crashes experienced with Battlefield 4 on Windows 8 based systems
> Resolves the display turning green when using Windows Media Player to view HD .avi format video in an extended desktop configuration
> Resolves Metro applications experiencing frame drops during playback of interlaced video content
> Resolves video playback corruption of .wmv format files in Windows Media Player


----------



## Durvelle27

Quote:


> Originally Posted by *Arizonian*
> 
> Sweet glad to hear it.


Wow, this really sucks. One reason I sold my 290 was the random black screens, and now it's resolved after I sold it.


----------



## ImJJames

Quote:


> Originally Posted by *H1GHL4ND3R*
> 
> It's an audio driver for the built-in HDMI audio; if I'm not mistaken it's for TrueAudio.


Could be a coincidence, but for some reason the audio on my headphones is now noise-free... I could be tripping though lol


----------



## HardwareDecoder

Okay, so either I have the coolest VRM2 or something is odd; it never even hits 40C, even in FurMark.


----------



## ImJJames

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Okay, so either I have the coolest VRM2 or something is odd; it never even hits 40C, even in FurMark.


Sounds normal to me


----------



## HardwareDecoder

Quote:


> Originally Posted by *ImJJames*
> 
> Sounds normal to me


I'm not so sure it is...


----------



## ImJJames

Quote:


> Originally Posted by *HardwareDecoder*
> 
> im not so sure it is....


Well, when I am benching, mine never goes over 50C, even when I used the PT1 BIOS at over 1.37 volts at 1270/1500, on the reference cooler.


----------



## Forceman

Quote:


> Originally Posted by *Warsam71*
> 
> I'm not 100% sure (in relation to the driver), but it's the Average CPU Power metric, take a look: http://search.amd.com/en-us/Pages/results-all.aspx?k=ACP
> 
> I am going to ask my colleagues at work tomorrow (just to make sure I'm not providing you the wrong information)...


Why would that be installed through the Catalyst drivers, though?

I think the Audio angle may be more likely:
Quote:


> AMD Radeon™ HD 8900M Series Notebook Graphics Cards
> North America - English Hispanoamérica - Español Português Deutsch United Kingdom ... AMD EnduroTM, AMD ZeroCore Power technology, *ACP Audio*, etc.) are applicable to select AMD ...


----------



## the9quad

Quote:


> Originally Posted by *X-oiL*
> 
> 9.5 or 9.2?


9.2; the WHQLs are just the 9.5s.


----------



## Jack Mac

Quote:


> Originally Posted by *ImJJames*
> 
> Well when I am benching mine never goes over 50C even when I used PT 1 bios at over 1.37 volts 1270/1500, reference cooler.


Ambients?


----------



## ImJJames

Quote:


> Originally Posted by *Jack Mac*
> 
> Ambients?


I would say the lowest was around 15C ambient; it's pretty warm here in Southern California.


----------



## HardwareDecoder

Quote:


> Originally Posted by *ImJJames*
> 
> Well when I am benching mine never goes over 50C even when I used PT 1 bios at over 1.37 volts 1270/1500, reference cooler.


Hmm, yeah, I'm only running 1250mV now with +50% PowerTune, since if I go over 1100 on the core it seems to screw up running my monitor @ 110Hz.

I'm starting to worry I did something wrong with the waterblock though, since my VRM1 just hit 90C in the FurMark extreme burn-in test.

I know FurMark is known as a power virus, but I'd think 90C is crazy high. That H1GHL4ND3R guy was saying his hits 80C also, though.

I usually never go over 60-65C in games on VRM1 though, and both VRMs idle @ 30C.

I don't think I could have done the block install wrong though, since all you do is cut the thermal pads to fit (the memory ones are pre-cut for EK blocks).

I made sure I used the correct thickness ones for the VRM like it said, and cut the strip to fit and applied thermal paste between the VRM and the pads like it said. Wondering if I should have used paste on both sides of the pad though?


----------



## ImJJames

Quote:


> Originally Posted by *HardwareDecoder*
> 
> hmm Yea im only running 1250mV anymore with +50% power tune since if I go over 1100 core clock it seems to screw up running my monitor @ 110hz.
> 
> I'm starting to worry I did something wrong with the waterblock though since my vrm1 just hit 90c in furmark xtreme burn in test.
> 
> I know furmark is known as a power virus but I'd think 80c is crazy high. That highlander guy was saying his hits 80c also though.
> 
> I usually never go over 60-65c in games on VRM1 though.
> 
> I don't think I could have done the block install wrong though since all you do is cut the thermal pads to fit (the mem ones are pre-cut for ek blocks)
> 
> I made sure I used the correct thickness ones for the vrm like it said, and cut the strip to fit and applied thermal paste between the vrm and the pads like it said. Wondering if I should have used paste on both sides of the pad though?


I honestly wouldn't touch FurMark; it does nothing but heat up your card. If you want to check for stability, gaming is your best bet (Heaven is good too). Also, remember that when I'm benching I'm running a fan profile at 90%.


----------



## EmZkY

Quote:


> Originally Posted by *HardwareDecoder*
> 
> can you run furmark extreme burn in test with 8x msaa @ 1440p? Or even just 1080 If you don't have a 1440p monitor.
> 
> Idk how mining relates to furmark but i'd be curious to see what you get on it.
> can you also run furmark on xtreme burn in ?
> 
> I put some of the EK thermal paste in between the vrm and the thermal pad like it said in the instructions. I'm wondering if it would have helped to put it in between the block and the thermal pad also?
> 
> It seems 80c vrm temps under mining/furmark are normal however. Makes me wonder if these things would explode on the reference cooler? Whats the max safe temp for these vrm's anyway ?


Yeah, I also used EK paste under the pads. But in retrospect I would guess paste on top of the pads would help as well. Typical to not think things through when building


----------



## HardwareDecoder

Quote:


> Originally Posted by *ImJJames*
> 
> I honestly wouldn't touch furmark, it does nothing but heat up your card. If you want to check for stability, gaming is your best bet.


Yeah, you are right, I need to just uninstall it. It's really not a good example, because the highest I've ever seen my VRM1 while gaming is 70C, so that's at least 20C less than FurMark. It's funny that FurMark doesn't heat my core up much higher than gaming, though; maybe 5C?

I heard FurMark is throttled by the AMD/NVIDIA drivers too, and it seems like it is, because I noticed that when I run it the voltage doesn't get as high as when I'm actually gaming.

I saw a random forum post while googling about the VRMs on these 290/Xs, and it said the VRMs can stand 125C.

So 70C gaming should be perfectly fine

WAIT: I just realized you said you are on stock cooling still!? And your VRM at 1370mV is 50C? I don't see how that is possible; what are your core temps while gaming?


----------



## H1GHL4ND3R

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Okay so either I have the coolest vrm2 or something is odd, It like never even hits 40c even in furmark


I was just speaking with Bang4Buck on Google+ via his YT channel. He is hitting 1220 on the core and 1650 on the memory, and Kombustor or FurMark takes him up to 90C on VRM1 and the low 40s on VRM2. He also confirmed the VRMs on a 290 or 290X can go up to 120C without problems. I managed to get some info from a Polish specialist forum, and they confirmed that VRM2 powers the VRAM and VRM1 runs the GPU, and even on the stock heatsink VRM2 doesn't go above 80C because of its low power draw. So I think we have nothing to worry about until we hit 120C xDDDD, which there is no way we will. I was just playing some BF4 on max settings: VRM1 hit a max of 58C and VRM2 42C. With 60C of headroom before reaching this VRM's Tmax, we can be really calm and have some more fun with OCing ;P


----------



## kizwan

Quote:


> Originally Posted by *UNOE*
> 
> Anyone think Crossfire will work with risers ?


Yes, Crossfire should work with risers. They're only risers anyway; it's like being connected directly to the physical slot. Make sure you're using the correct PCIe slots though.

BTW, regarding VRM temps: whenever the VRMs reach their max operating temp, the card will throttle. So don't worry, you won't burn your card.

I hate miners now. Now I need to wait a month+ to get water blocks for my cards.


----------



## H1GHL4ND3R

Quote:


> Originally Posted by *Warsam71*
> 
> I'm not 100% sure (in relation to the driver), but it's the Average CPU Power metric, take a look: http://search.amd.com/en-us/Pages/results-all.aspx?k=ACP
> 
> I am going to ask my colleagues at work tomorrow (just to make sure I'm not providing you the wrong information)...


It's confirmed on Guru3D as the AMD driver for their TrueAudio co-processor, so I was close. Not exactly for HDMI, but it allows Mantle to create real positional sound in cooperation with this co-processor. To put it short: EAX, AMD style. Welcome back, 100% positional sound ;P


----------



## ImJJames

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Yea you are right, I need to just uninstall it. It's really not a good example cause the highest i've ever seen my VRM1 gaming is 70c so that's atleast 20c less than furmark. It's funny furmark doesn't heat my core up much higher than gaming though, maybe 5c?
> 
> I heard furmark is throttled by amd/nvidia drivers too and it seems like it is cause I noticed when I run it the voltage doesn't get as high as when i'm actually gaming.
> 
> I see a random forum post while googling about the vrms on these 290/x's and it said the vrms can stand 125c.
> 
> So 70c gaming should be perfectly fine
> 
> WAIT: I just realized you said you are on stock cooling still!? and your vrm with 1370mV is 50c? I don't see how that is possible, what are your core temps while gaming?


Core temps never passed 77C @ 70% fan speed on a warm day playing BF4 for hours on end. Here is my fan profile. And yes, reference cooling.







24/7 clocks at 1200/1500 using the Asus stock BIOS. I play CS:GO, BF4, Skyrim, Batman Origins, and more, all game-stable at those clocks.

It does get loud with my aggressive fan profile, but I am a headphone gamer so I don't hear anything.


----------



## psyside

Guys, the VRMs are rated to 120C, but do not exceed 85C if you want to keep some general efficiency. Ideal is around 50-60C; the VRMs lose a lot of their power efficiency (producing noise and lower stability) when they go over 80-85C.

Try to keep them around *50-60C if you can for best OC results.*
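As a toy sketch of the thresholds in the post above (the 120C, 85C, and 50-60C figures are the poster's numbers, not datasheet values), a temp classifier might look like:

```python
# Toy VRM-temperature classifier using the thresholds from the post
# above (120C rated max, ~85C efficiency knee, 50-60C ideal).
# These cutoffs are the poster's figures, not datasheet values.

def vrm_status(temp_c):
    if temp_c >= 120:
        return "over rated max - expect throttling or damage"
    if temp_c > 85:
        return "hot - efficiency and stability degrade"
    if temp_c > 60:
        return "acceptable"
    return "ideal for overclocking"

for t in (45, 70, 90, 125):
    print(t, vrm_status(t))
```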


----------



## Darhant

Hey guys, I've just got a Gigabyte R9 290X and I'm getting graphical artifacts and crashes in most games, and artifacts in Windows in some cases. It has an EK waterblock on it. I've been trying to make it work, but I'm coming up empty. Any ideas?


----------



## ImJJames

Quote:


> Originally Posted by *Darhant*
> 
> Hey guys I've just got a R9 290x Gigabyte version and I'm getting graphical artifacts and crashes in most games and artifacts in windows in some cases? It has a EK waterblock on it, I've been trying to make it work but I'm coming up empty. Any ideas?


Have you done a complete driver uninstall using DDU 9.9 in safe mode and installed the latest 13.12 WHQL drivers that were just released today? How are your core and VRM temps?


----------



## Darhant

OK, they are downloading now; currently using the latest 13.11 beta drivers. Temps are under around 40C and the waterblock cools the VRMs as well. In Windows I sometimes get text all messed up, and in Battlefield 4 it's like it's stretching the models at random.


----------



## ImJJames

Quote:


> Originally Posted by *Darhant*
> 
> OK they are downloading now, currently using latest beta 13.11 drivers. Temps are under around 40 and waterblock cools VRM aswell. In windows I sometimes get text all messed up and in battlefield 4 its like its streching the models at random.


Beta 9.5? Weird, it artifacts running at stock?


----------



## Darhant

Cheers hopefully that fixes the issue......I'm a little worried I've been sold a dud.


----------



## H1GHL4ND3R

Quote:


> Originally Posted by *Darhant*
> 
> Hey guys I've just got a R9 290x Gigabyte version and I'm getting graphical artifacts and crashes in most games and artifacts in windows in some cases? It has a EK waterblock on it, I've been trying to make it work but I'm coming up empty. Any ideas?


What brand and wattage is your PSU?
If it's anything under 750W 80 Plus Bronze and you're OCing your CPU as well, I would recommend hitting "load defaults" in UEFI (BIOS), loading Windows, and checking for artifacts. If there are none, it means your 8-pin or 6-pin power connector is low on power delivery, which means changing your PSU for something stronger; definitely not lower than 750W 80 Plus Silver, and if you run a custom loop from the same PSU I wouldn't go lower than 850-1000W 80 Plus Silver. This card under load can pull up to 380W at stock and well over 420W when OC'd. The problem with low-rated PSUs is that most of them actually work below 80% efficiency; with Bronze you're only guaranteed to hit 750W at 80% efficiency, which gives you around 650W of constant draw, so from simple math you can't run a system with this card without power swings.


----------



## muhd86

tri fire r290 @4.5ghz 3930k


----------



## muhd86

*QUAD SAPPHIRE R290 PAIRED UP 3970X STOCK

*


----------



## chill24

Quote:


> Originally Posted by *Connolly*
> 
> Is this amount of GPU usage fluctuation normal whilst gaming?
> 
> 
> 
> I have v-sync on, if that could make any difference.


You and me both. I have 2 cards that do the same thing, maybe even worse. V-sync on or off makes no difference. Single card or CrossFire, same thing.

If you find out, let me know. Really disappointed in AMD so far.


----------



## ImJJames

Quote:


> Originally Posted by *muhd86*
> 
> *QUAD SAPPHIRE R290 PAIRED UP 3970X STOCK
> 
> *


Holy moly lol, are you planning to run a nuclear reactor with that?


----------



## H1GHL4ND3R

Quote:


> Originally Posted by *psyside*
> 
> Guys, vrms are rated to 120c, but do not exceed 85 if you want to still have some general efficiency left. Ideal is around 50-60, the vrm's lose alot of their power efficiency - producing noise/lower stability when they go over 80-85c.
> 
> Try to keep them ~ *50-60c if you can for best oc results.*


That's why I was concerned about the temps, but only Kombustor or FurMark makes VRM1 go up to 80C, and that's under a really heavy load; in games I don't reach 60C with all settings maxed out, so I think I'm fine with it ;P


----------



## Darhant

Still getting artifacts after upgrading to 13.12; I'm lost here...
Now they show as dark squares in a checker pattern on the screen.


----------



## Darhant

Here is an example of whats happening with 13.12 drivers on a r9 290x


----------



## H1GHL4ND3R

Quote:


> Originally Posted by *Darhant*
> 
> Still has artifacts after upgrading to 13.12 I'm lost here.....
> now they show as dark squares in a checker pattern on the screen


Honestly it looks like nothing but the PSU. Try reseating the GPU in the PCI-E slot again, and replug the 8-pin and 6-pin; don't forget to unplug the power cord from the PSU and wait at least 30 seconds for the current to drain out of the chokes. Check that. I also had an IRQ incompatibility problem with the USB hub on my Cyborg V7 keyboard; unplugged the bugger and all was fine. Heaven knows how that is possible, because I always work all my OCing up starting from a default BIOS, but it did happen. So you might have a similar problem: unplug all USB devices from the native ports on the back of your mobo (except mouse and keyboard of course; leave them in the top USB connectors), try to boot it up, and see what happens. I was also artifacting while having the IRQ issue, so try it, it might help; then you can work around the IRQ setups in BIOS/UEFI.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Darhant*
> 
> Still has artifacts after upgrading to 13.12 I'm lost here.....
> now they show as dark squares in a checker pattern on the screen


Mine started to artifact at stock settings for no real reason; no amount of voltage could clear it up, so I RMA'd it. Probably not the news you wanted to hear, though.


----------



## ImJJames

Quote:


> Originally Posted by *H1GHL4ND3R*
> 
> Honestly it looks like nothing but the PSU. Try reseating the GPU in the PCI-E slot again, and replug the 8-pin and 6-pin; don't forget to unplug the power cord from the PSU and wait at least 30 seconds for the current to drain out of the chokes. Check that. I also had an IRQ incompatibility problem with the USB hub on my Cyborg V7 keyboard; unplugged the bugger and all was fine. Heaven knows how that is possible, because I always work all my OCing up starting from a default BIOS, but it did happen. So you might have a similar problem: unplug all USB devices from the native ports on the back of your mobo (except mouse and keyboard of course; leave them in the top USB connectors), try to boot it up, and see what happens. I was also artifacting while having the IRQ issue, so try it, it might help; then you can work around the IRQ setups in BIOS/UEFI.


This


----------



## jerrolds

Quote:


> Originally Posted by *psyside*
> 
> Guys, vrms are rated to 120c, but do not exceed 85 if you want to still have some general efficiency left. Ideal is around 50-60, the vrm's lose alot of their power efficiency - producing noise/lower stability when they go over 80-85c.
> 
> Try to keep them ~ *50-60c if you can for best oc results.*


Can you provide a source for the 120C VRMs?


----------



## Darhant

I tried all of those things and I'm still getting stuff like this



At random times it displays text like that, its a fresh install of windows 8.1 if that makes any difference.
I'm not sure what else to try.


----------



## kizwan

With the 13.12 drivers, when I set a manual fan speed, the fan on the second card remains at 20% until I run a game or open a website with Flash content.
Quote:


> Originally Posted by *chill24*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Connolly*
> 
> Is this amount of GPU usage fluctuation normal whilst gaming?
> 
> 
> 
> I have v-sync on, if that could make any difference.
> 
> 
> 
> You and me both. I have 2 cards that do the same thing, maybe even worse. V-sync on or off, makes no matter. Single card or cross fire, same thing.
> 
> If you find out, let me know. Really disappointed in AMD so far.

Wait...is this a problem. I got the same "fluctuation" but games running smooth though.

BF3


BF4

Quote:


> Originally Posted by *jerrolds*
> 
> Quote:
> 
> 
> 
> Originally Posted by *psyside*
> 
> Guys, vrms are rated to 120c, but do not exceed 85 if you want to still have some general efficiency left. Ideal is around 50-60, the vrm's lose alot of their power efficiency - producing noise/lower stability when they go over 80-85c.
> 
> Try to keep them ~ *50-60c if you can for best oc results.*
> 
> 
> 
> Can you provide a source for the 120C VRMs?

^^This. Also I would like to know the mosfet use on these cards.
Quote:


> Originally Posted by *Darhant*
> 
> I tried all of those things and I'm still getting stuff like this
> 
> 
> 
> At random times it displays text like that, its a fresh install of windows 8.1 if that makes any difference.
> I'm not sure what else to try.


Is this your rig?
http://www.overclock.net/lists/display/view/id/5550571

Try a different PSU. As far as I know, Aerocool PSUs are among the ones you should avoid.


----------



## ImJJames

Quote:


> Originally Posted by *Darhant*
> 
> I tried all of those things and I'm still getting stuff like this
> 
> 
> 
> At random times it displays text like that, its a fresh install of windows 8.1 if that makes any difference.
> I'm not sure what else to try.


RMA I guess.


----------



## chill24

Right, I have gpu usage like that but awful gameplay. It's very noticeable and I can't keep 60fps on the same games/scenes that my 670 sli cards could.

Even if you don't have problems, it seems really wrong to have usage like that.


----------



## CravinR1

That reminds me of a driver issue I had with a 8800 gts


----------



## Arizonian

Quote:


> Originally Posted by *muhd86*
> 
> *QUAD SAPPHIRE R290 PAIRED UP 3970X STOCK*
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - if you can 'edit' your post and add a screen shot with OCN name like a GPU-Z open to validation tab as proof would be great. Otherwise your quad's added -









Looks like *290X is going up to $629.99* and *290 for $529.99* now in addition to $5.92 shipping cost rather than free shipping. BF4 still included on some.


----------



## SamEkinci

I have made a short video for folks that live outside North America who want a good look at the card's heatsink removal before attempting it and risking voiding the warranty. It's the same for both the 290 and 290X. I used a reference XFX card, which has warranty stickers on the heatsink's main screw heads (two in total).

The only tough parts were, as usual, the fan cable and the warranty stickers, which were fairly easy to remove with a heat gun and tweezers. Do not attempt to remove them without heat; with heat they peel off fairly easily.





Hope it helps folks get an idea. Very simple.

Cheers.


----------



## amlett

Quote:


> Originally Posted by *Widde*
> 
> Assuming it does work with every "brand" since they are reference cards correct?
> 
> 
> 
> 
> 
> 
> 
> Cant find it on amds website


working perfectly with my powercolor 290 unlocked with asus 290x bios


----------



## Bluesubmarine6

Hey guys, just got an XFX R9 290. It says the recommended RAM is 16GB and I have only 8GB. Is there a noticeable difference in performance? Will I gain any more frames or smoother gameplay from the GPU if I have more?


----------



## DizzlePro

IT'S HAPPENING

Sapphire 290 Tri-X OC edition
http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-290-tri-x-oc-review-1600p-ultra-hd-4k/


----------



## Pierce

quick question guys..

I was playing Black Ops 2 earlier (I know, only game I had installed) with no problems, but now I keep crashing.

I disabled OverDrive but that didn't do anything. I checked the memory clock and it keeps jumping up and down; it will randomly go from 150 to like 1250. Right now it's at 150 and I can't seem to increase the clocks... is it a Catalyst issue, or should I reinstall? Thanks.

edit: Now it's showing power at 0%. I don't know what's going on.

What should my settings be under AMD OverDrive?


----------



## Stay Puft

Quote:


> Originally Posted by *muhd86*
> 
> *QUAD SAPPHIRE R290 PAIRED UP 3970X STOCK
> 
> *


Why are you only clocked at 3.2Ghz? Clock it up to 4.6


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Stay Puft*
> 
> Why are you only clocked at 3.2Ghz? Clock it up to 4.6


And run it again if you could please in performance mode









BTW, the new driver is crap! It crashed using my old settings on MK11, Firestrike, and Vantage.









Back to 9.4, all good now.









Waiting on my first GPU block to get here in da mail ........ Merry christmas to me


----------



## King4x4

4x Sapphire 290X ordered from Amazon at $570 a pop.

They will be watercooled!


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Mine started to artifact on stock settings for no real reason, no amount of voltage could clear it up so i RMA'd it, probably not the news you wanted to hear though.


Had the same problem also


----------



## Jaju123

Quote:


> Originally Posted by *chill24*
> 
> You and me both. I have 2 cards that do the same thing, maybe even worse. V-sync on or off, makes no matter. Single card or cross fire, same thing.
> 
> If you find out, let me know. Really disappointed in AMD so far.


I have the same problem, although the game seems to run fine. However, there seems to be much more input lag than when I was on Nvidia. What gives?


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> And run it again if you could please in performance mode
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BTW the new driver is crap ! Crashed using my old settings on mk 11 , f/s , vantage .
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Back to 9.4 all good now .
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Waiting on my first GPU block to get here in da mail ........ Merry christmas to me


Which water block did you get?


----------



## stickg1

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> And run it again if you could please in performance mode
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BTW the new driver is crap ! Crashed using my old settings on mk 11 , f/s , vantage .
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Back to 9.4 all good now .
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Waiting on my first GPU block to get here in da mail ........ Merry christmas to me


Heh, it was the opposite for me. Changing to 13.12 made my power slider work and now I can game, bench, and fold on higher clocks!


----------



## darkelixa

I can't hear my Sapphire R9 290 over my case fans lol, who said they are loud!!!!! Took me a few hours to get my Intel SSD to work properly; it would boot fine, then if I did a restart the BIOS would not recognise it. A second install of Windows fixed the problem.


----------



## kizwan

Quote:


> Originally Posted by *Jaju123*
> 
> I have the same problem, although the game seems to run fine. However, the game seems to have much more input lag than when I was with Nvidia. What gives?


My games are running fine too, no FPS drops that I should worry about. However, the fluctuating GPU usage is very unnerving, to be honest.


----------



## Asrock Extreme7

uber bios 015.039.000.006.003515

quiet bios 015.039.000.006.003518


----------



## Asrock Extreme7

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> uber bios 015.039.000.006.003515
> 
> quiet bios 015.039.000.006.003518


13.12 is the same as 13.11 Beta 9.5


----------



## broken pixel

#2 spot surrounded by Nvidia scores.
http://rog.asus.com/forum/showthread.php?33401-Share-Your-3DMark-Scores(Fire-Strike-Only-Please)

The guy forgot to add my score; it should be 19621, I think?

http://www.3dmark.com/fs/1319834


----------



## Joeking78

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> 13.12 is the same as 13.11 Beta9.5


Are you comparing a GPU BIOS to a GPU driver?


----------



## HardwareDecoder

Quote:


> Originally Posted by *H1GHL4ND3R*
> 
> What brand and Watt-eg You have on your PSU?
> Anything under 750Watt 80+Brown and OCing your CPU aswel , i would recomend hit the load default in UEFI (BIOS) , load windows check for arts , if there is non mean your 8pin or 6 pin power conector is low on power delivery that mean changing your PSu for something stronger , definitly not lower than 750 80+Silver , and *if You run custom loop from the same PSU i wouldnt go lower than 850-1000Watt 80+Silver*, this card in load can suck up to 380Watt on stock , and well over 420 Watt in OC , problem with low rated PSU is that most of them work actualy below 80% efficiency if its Brown u got guarnt that u hit 750Watt with 80% efficiency that give you around 650Watt of constant draw , so from sipmle math u cant run system with this card without power swings .


Sorry, but you are mistaken. I just tested my setup:

3770K @ 1.432 vcore (4.7GHz)
290X @ 1100 core / 1250mV, +50% PowerTune
2 SSDs and an HDD
5x 120mm fans
XSPC 750 V4 pump/res combo unit

*I ran FurMark + Intel Burn Test at the same time! That is about the heaviest load you can get, IMO.*

The most it pulled, measured with a watt meter, was 550W at the wall.

When you do the efficiency calculation for an 80 Plus Bronze PSU (~85% efficient at load), the actual DC load comes out like this:

550W × 0.85 = *~467W*

I'm tired of the misinformation people spread about PSU requirements.

I am running a 650W Seasonic PSU and the air coming out of the PSU isn't even warm. Granted, if it is a garbage unit that can't even put out 550W, I'd consider replacing it.

*Bottom line: a quality 650W PSU is plenty for a highly overclocked i5/i7 (probably most AMD chips too) plus an overclocked 290X and a water-cooling loop.*

You are also wrong about how much power a water-cooled 290/X can pull. My system at the desktop idles at 146W at the wall (without applying the 0.85 efficiency factor). If I load up FurMark, it jumps to 420W.

So that's around 274W for a water-cooled 290X @ 1250mV with +50% PowerTune. Core clock doesn't add that much more power since the voltage is already accounted for. It's either 230W or 274W; I'm not sure whether I should multiply by 0.85 before or after taking the difference. But regardless, it is not 380W.

Edit: actually it is about 255W for the 290X, since the CPU draws ~230W on its own in IBT.

I ran 2x 7950s on this exact same setup for 5-6 months; it drew 100W more and everything was fine.
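The wall-to-DC arithmetic in the post above can be sketched in a few lines. This is just an illustration of the poster's math; the 0.85 efficiency figure is his assumed value for an 80 Plus Bronze unit under load, not a measurement of any specific PSU:

```python
# Sketch of the post's PSU math. The 0.85 efficiency factor is the
# poster's assumption for an 80 Plus Bronze unit under load.

def dc_load(wall_watts: float, efficiency: float = 0.85) -> float:
    """Estimate the DC load the PSU delivers from the draw measured at the wall."""
    return wall_watts * efficiency

full_load = dc_load(550)   # FurMark + IBT: ~467 W of actual DC load
idle_load = dc_load(146)   # desktop idle: ~124 W

# Rough GPU share: wall delta between idle and FurMark, converted to DC.
gpu_estimate = dc_load(420 - 146)  # ~233 W

print(full_load, idle_load, gpu_estimate)
```

One side note on the "multiply by 0.85 before or after the difference" question: the scaling is linear, so `0.85 * (420 - 146)` equals `0.85 * 420 - 0.85 * 146`; the order makes no difference.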


----------



## psyside

Quote:


> Originally Posted by *jerrolds*
> 
> Can you provide a source for the 120C VRMs?


One highly educated electrician and I read a PDF, which one of the OCN users posted, about the reference VRM parts, MOSFETs, and chokes.

95% of the VRMs are rated to run at ~120°C max. Per the parameters listed for temperature, at that temp they lose about 50% of their efficiency and produce noise, hence the crashes.

Stability-wise it's bad to run them over 80°C if you want a top OC. The electrician said it would be best to aim for 50-60°C.








Quote:


> Originally Posted by *H1GHL4ND3R*
> 
> That's why I was concerned about the temps, but only Kombustor or FurMark makes VRM1 go up to 80, under really heavy load. In game I don't reach 60 with all games maxed out, so I think I'm fine with it ;P


Definitely, you are good to go; great, in fact.









What fan speed/OC do you run?


----------



## the9quad

Quote:


> Originally Posted by *Joeking78*
> 
> Are you comparing GPU bios to GPU driver?


I do not know what he is comparing, but he is correct: the Beta 9.5s are the exact same drivers as the new WHQLs.


----------



## dayen666

What is PowerTune in CCC? Like auto voltage?


----------



## bond32

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Sorry, but you are mistaken. I just tested my setup:
> 
> 3770K @ 1.432 vcore (4.7GHz)
> 290X @ 1100 core / 1250mV, +50% PowerTune
> 2 SSDs and an HDD
> 5x 120mm fans
> XSPC 750 V4 pump/res combo unit
> 
> *I ran FurMark + Intel Burn Test at the same time! That is about the heaviest load you can get, IMO.*
> 
> The most it pulled, measured with a watt meter, was 550W at the wall.
> 
> When you do the efficiency calculation for an 80 Plus Bronze PSU (~85% efficient at load), the actual DC load comes out like this:
> 
> 550W × 0.85 = *~467W*
> 
> I'm tired of the misinformation people spread about PSU requirements.
> 
> I am running a 650W Seasonic PSU and the air coming out of the PSU isn't even warm. Granted, if it is a garbage unit that can't even put out 550W, I'd consider replacing it.
> 
> *Bottom line: a quality 650W PSU is plenty for a highly overclocked i5/i7 (probably most AMD chips too) plus an overclocked 290X and a water-cooling loop.*
> 
> You are also wrong about how much power a water-cooled 290/X can pull. My system at the desktop idles at 146W at the wall (without applying the 0.85 efficiency factor). If I load up FurMark, it jumps to 420W.
> 
> So that's around 274W for a water-cooled 290X @ 1250mV with +50% PowerTune. Core clock doesn't add that much more power since the voltage is already accounted for. It's either 230W or 274W; I'm not sure whether I should multiply by 0.85 before or after taking the difference. But regardless, it is not 380W.
> 
> Edit: actually it is about 255W for the 290X, since the CPU draws ~230W on its own in IBT.
> 
> I ran 2x 7950s on this exact same setup for 5-6 months; it drew 100W more and everything was fine.


+ rep. Wish more would read this, as my initial attempt to clear things up failed.


----------



## Joeking78

Quote:


> Originally Posted by *the9quad*
> 
> I do not know what he is comparing, but he is correct the beta 9.5's are the exact same drivers as the new WHQL's.


I get much lower 3DMark scores with 13.12 compared to 9.2 beta... have you tested/compared them yet?


----------



## HardwareDecoder

Quote:


> Originally Posted by *bond32*
> 
> + rep. Wish more would read this, as my initial attempt to clear things up failed.


Thanks. I had a bunch of help from user @TwoCables when I initially wanted to know if my 650W could handle two overclocked 7950s with my OC'ed CPU. I had people telling me I needed a 1000W PSU, so it irks me whenever I see misinformation on the subject.


----------



## the9quad

Quote:


> Originally Posted by *Joeking78*
> 
> I get much lower 3dmark scores with the 13.12 compared to 9.2beta...have you tested/compared them yet?


I was about 100 to 150 points lower, yes.


----------



## BradleyW

Hey everyone. Just wanted to know all your thoughts on Nvidia G-Sync. Is it the end for AMD?


----------



## the9quad

Quote:


> Originally Posted by *BradleyW*
> 
> Hey everyone. Just wanted to know all your thoughts on Nvidia G-Sync. Is it the end for AMD?


Yeah, AMD is done for... sucks to be AMD right now: they're in all of the next-gen consoles, can't keep any of their cards on the shelves, have a new API debuting this month, and have a G-Sync-like feature they're announcing soon... sucks to be AMD...


----------



## BradleyW

Quote:


> Originally Posted by *the9quad*
> 
> Yeah amd is done for....sucks to be amd right now they are stuck in all of the next gen consoles,can't keep any of their cards on the shelf, have a new API debuting this month, and a *gsync like feature they are announcing soon*....sucks to be amd....


Evidence?


----------



## the9quad

Quote:


> Originally Posted by *BradleyW*
> 
> Evidence?


http://www.tomshardware.com/reviews/amd-ama-toms-hardware,3672.html


----------



## jerrolds

Maybe they're officially doing a 2D LightBoost hack thing.

Yeah, G-Sync sounds cool, but it will initially only work as an add-on with supported hardware. AFAIK it's only supported by 1080p TN monitors like the VG248QE or XL2420T(E). So screw that... hopefully next year they'll have it working on 1440p+ IPS/PLS/VA.

Maybe I should visit the Monitors and Displays forum; I haven't browsed there in a while.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *BradleyW*
> 
> Hey everyone. Just wanted to know all your thoughts on Nvidia G-Sync. Is it the end for AMD?


----------



## kot0005

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It will. AMD can't really afford to have any more delays, and I know they want to give us an awesome Xmas present


hey mate, did you get your card back yet? I still haven't gotten mine back


----------



## BradleyW

Well, it says:
"You'll hear more from us on G-Sync soon." Not much help at this point, AMD. Maybe AMD will produce an external box that works on current displays.


----------



## Arizonian

Quote:


> Originally Posted by *BradleyW*
> 
> Hey everyone. Just wanted to know all your thoughts on Nvidia G-Sync. Is it the end for AMD?


Quote:


> Originally Posted by *the9quad*
> 
> Yeah amd is done for....sucks to be amd right now they are stuck in all of the next gen consoles,can't keep any of their cards on the shelf, have a new API debuting this month, and a gsync like feature they are announcing soon....sucks to be amd....


Quote:


> Originally Posted by *BradleyW*
> 
> Evidence?


I feel this question is borderline trolling, as Nvidia G-Sync has no bearing on the AMD owners in this club.

Rather than have club members provide you evidence that it's not the end for AMD, please provide evidence for your statement first. Otherwise, member opinions don't need evidence; your question is purely your own evaluation of a possibility and not based on any facts either.


----------



## Mr357

Quote:


> Originally Posted by *Bluesubmarine6*
> 
> Hey guys just got a XFX R9 290 it says recommended RAM is 16GB and I have only 8GB. Is there a noticeable difference in performance? Will I gain any more frames or smoother gameplay from the GPU is I have more?


No way. 8GB is plenty; you won't see even a tenth of a frame per second increase by adding more RAM.


----------



## Lettuceman

What's your guys' favorite GPU monitoring/overclocking program?

I've always used Afterburner, but I'm seeing if there are any other good options out there. I know there are links in the OP, but I want to hear what people think the best one is.


----------



## jerrolds

AB is my favorite, but I currently cannot get it to unlock voltages on my Sapphire with the Asus BIOS, so for now I'm stuck with GPU Tweak.


----------



## HardwareDecoder

Quote:


> Originally Posted by *jerrolds*
> 
> AB is my favorite, but currently cannot get it to unlock voltages with my Sapphire w/ Asus bios. So currently stuck with GPU Tweak.


You need to set PowerTune in CCC after you apply the +100 or +200mV in MSI AB.


----------



## jerrolds

AB 17 has the voltage slider greyed out for me... no idea why. Tried setting a bunch of checkboxes on/off while investigating... gave up after a while.


----------



## brazilianloser

Quote:


> Originally Posted by *jerrolds*
> 
> AB17 has the voltage slider greyed out for me..no idea why. Tried setting a bunch of checkboxes on/off while investigating...gave up after awhile.




On the same subject, it's been a while since they've updated AB... :/


----------



## jerrolds

Yeah, I tried that; I'll try again using your screenshot. Something seriously weird is going on with my AB. I'll post a screenshot of mine if it doesn't work. Thanks


----------



## brazilianloser

Quote:


> Originally Posted by *jerrolds*
> 
> Yea i tried that - ill try again using your screenshot. Something seriously weird going on with my AB. Ill post a screenshot of mine if it doesnt work. Thanks


"Disable ULPS" down there is for Crossfire, btw... If it doesn't work, I would try uninstalling the drivers and AB, then reinstalling. Make sure Overdrive is disabled and that you have no other AB-like programs running.


----------



## escapedmonk

Quote:


> Originally Posted by *jerrolds*
> 
> Yea i tried that - ill try again using your screenshot. Something seriously weird going on with my AB. Ill post a screenshot of mine if it doesnt work. Thanks


Are you using the afterburner beta 17?


----------



## brazilianloser

Quote:


> Originally Posted by *escapedmonk*
> 
> Are you using the afterburner beta 17?


Quote:


> Originally Posted by *jerrolds*
> 
> *AB17* has the voltage slider greyed out for me..no idea why. Tried setting a bunch of checkboxes on/off while investigating...gave up after awhile.


He said so in his earlier post.


----------



## xJakkx

I know this may seem like a stupid question, but considering that the price of a non-reference R9 290 is now the same as a 780, at least here in the UK, is it worth waiting for the non-reference R9 290s to come out? The main reason I wanted an R9 290 was the 4GB of VRAM, which comes in very handy when playing Skyrim with loads of HD textures; I currently max out my 2GB card and have seen people with 780s hit 2.8GB of VRAM usage when playing Skyrim with HD textures. The R9 290 also has superior performance in a few games. Is it worth waiting for the non-reference R9 290s for that extra VRAM/performance, or should I just get a 780 right now? I also posted this in the 780 owners section, so I'm not worried about biased views.


----------



## brazilianloser

Quote:


> Originally Posted by *xJakkx*
> 
> I know this may seem a stupid question, but considering that the price of an non reference R9 290 is now the same as a 780, at least, here in the UK, is it worth waiting for the non reference R9 290's to come out? The main reason I wanted an R9 290 was the 4GB VRAM which comes in very handy when playing Skyrim with loads of HD textures, as I currently max out my 2GB card and have seen people with 780's go up to 2.8GB VRAM usage when playing Skyrim with HD textures. The R9 290 also has superior performance in a few games. Is it worth waiting for the non reference R9 290's for that extra VRAM/performance or should I just get a 780 right now? I also posted this in the 780 owners section so I'm not worried about biased views.


Are you planning on putting the 290 under water? If so, just get a reference 290 and clock it to outperform a reference 290X at stock, easy. Otherwise just do what pleases you. Playing at 1080p, Surround, or higher res? If 1080p, just go with what you are comfortable with. For Surround I see the 4GB on the 290/X being a huge advantage, since I have actually been able to max out the 4GB in BF4 in Surround by increasing the resolution scale. So yeah, really your decision.


----------



## rdr09

Quote:


> Originally Posted by *xJakkx*
> 
> I know this may seem a stupid question, but considering that the price of an non reference R9 290 is now the same as a 780, at least, here in the UK, is it worth waiting for the non reference R9 290's to come out? The main reason I wanted an R9 290 was the 4GB VRAM which comes in very handy when playing Skyrim with loads of HD textures, as I currently max out my 2GB card and have seen people with 780's go up to 2.8GB VRAM usage when playing Skyrim with HD textures. The R9 290 also has superior performance in a few games. Is it worth waiting for the non reference R9 290's for that extra VRAM/performance or should I just get a 780 right now? I also posted this in the 780 owners section so I'm not worried about biased views.


Do you know the exact price of the non-ref 290? Where did you find that out?

What res are you using?


----------



## jerrolds

Quote:


> Originally Posted by *escapedmonk*
> 
> Are you using the afterburner beta 17?


Yup Beta 17


----------



## kizwan

BF4 GPU1 & GPU2 usage (CFX disabled). My internet connection acting up when running with GPU2.

GPU1


GPU2


----------



## Warsam71

Quote:


> Originally Posted by *H1GHL4ND3R*
> 
> It's confirmed on guru3d; it's the AMD driver for their TrueAudio co-processor, so I was close. Not exactly for HDMI, but it allows Mantle to create real sound in cooperation with this co-processor. To put it short: EAX, AMD-style. Welcome back, 100% positional sound ;P


Quote:


> Originally Posted by *ImJJames*
> 
> Does anyone know what "ACP Application" does? Gives you the option to install it with the new 13.12 WHQL drivers.


Thank you very much for sharing...


----------



## Elfear

I'm putting together a comparison between a 780 Lightning and a 290, and I'm getting very inconsistent results at max clocks with my 290. I have benchmarks in various games at stock clocks, 1150/1450, and 1285/1400. I've tried both the Asus and PT1 BIOSes and still get inconsistent results. Voltage is at +219mV (~1.289V under load), and after looping Heaven for 10 minutes the core is at 45C and the VRMs hit 69C (using an EK block).

In Crysis 3 I get the same fps at 1150/1450 and 1285/1400 (1600p, max settings). At 1285/1400 I can make one 3-minute run-through just fine, but when I go to repeat it I always get a black screen and have to reset my computer. Same thing at 1275/1400.

Core clocks and voltage remain pretty steady while benching so it doesn't appear to be throttling.

Any ideas?


----------



## xJakkx

Quote:


> Originally Posted by *rdr09*
> 
> you know exactly the price of the non-ref 290? where did you find that out?
> 
> what rez are you using?


http://www.overclockers.co.uk/showproduct.php?prodid=GX-243-MS&groupid=701&catid=56&subcat=1752

http://www.overclockers.co.uk/showproduct.php?prodid=GX-337-SP&groupid=701&catid=56&subcat=1752

I am using 1080p. Also, thanks for the reply on the previous page. I don't plan on using any water cooling for my GPU at the moment. I assume the non-reference R9 290 prices will not drop, so I may get the 780; I just hope I don't max out the VRAM on that as well when playing a game with really high-res textures or lots of mods.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> Which water block did you get?












I am quaking in anticipation









Quote:


> Originally Posted by *stickg1*
> 
> Heh, it was the opposite for me. Changing to 13.12 made my power slider work and now I can game, bench, and fold on higher clocks!


Lucky you








Quote:


> Originally Posted by *Asrock Extreme7*
> 
> 13.12 is the same as 13.11 Beta9.5


What?!

Quote:


> Originally Posted by *psyside*
> 
> One highly educated electrician and I read a PDF, which one of the OCN users posted, about the reference VRM parts, MOSFETs, and chokes.
> 
> 95% of the VRMs are rated to run at ~120°C max. Per the parameters listed for temperature, at that temp they lose about 50% of their efficiency and produce noise, hence the crashes.
> 
> Stability-wise it's bad to run them over 80°C if you want a top OC. The electrician said it would be best to aim for 50-60°C.
> 
> Definitely, you are good to go; great, in fact.
> 
> What fan speed/OC do you run?


Now that sounds accurate to me









Quote:


> Originally Posted by *Joeking78*
> 
> I get much lower 3dmark scores with the 13.12 compared to 9.2beta...have you tested/compared them yet?


What, more points on Beta 9.2? ...I'm on to that pronto


----------



## Joeking78

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I am quaking in anticipation
> 
> Lucky you
> 
> What?!
> 
> Now that sounds accurate to me
> 
> What, more points on Beta 9.2? ...I'm on to that pronto


9.2 > 13.12 in 3DMark... I get really bad scores, like 3k less with 13.12; not sure why.

I think I also found out why my scores are not so great... I think my PSU is at its limit when my CPU is at 4.8 and my GPU's at 1230...

I just scored 28492 with 4.7GHz, 1175/1400... http://www.3dmark.com/3dm11/7685166, tess off... compared to 28840 http://www.3dmark.com/3dm11/7658463 with 4.8, 1225/1350.

The only thing I can think of is my PSU being at its limit; when running the later tests I did get some random shutdowns. Must get a bigger PSU now.


----------



## cplifj

Has anybody noticed some strangeness going on with AMD's PowerTune boost technology?

I have posted this about it (at the bottom of the post): http://www.overclock.net/t/1437876/290-and-290x-litecoin-mining-performance/870#post_21423549

It seems something throttles back after half an hour of full load, but I can't get any report on what exactly is happening.

I have read about Tom's experience with the FAULTY boards that had something wrong with them.

It seems a bit like what I am experiencing: after half an hour of load I can clearly hear the fan spin down, yet nothing changes in the monitoring readout anywhere; no fan speed change, no power usage change, no GPU or memory clock changes, NOTHING.

Yet it's very clearly audible, and the calculation speed goes up slightly, so it runs more relaxed all of a sudden.

Might it be AMD's PowerTune pushing too hard? I think it might just be that.

Grtz


----------



## rdr09

Quote:


> Originally Posted by *Joeking78*
> 
> 9.2 > 13.12 in 3DMark...I get really bad scores, like 3k less with 13.12, not sure why.
> 
> I think I found out why my scores are not so great too...I think my PSU is on the limit when my CPU is at 4.8 & my GPU's 1230...
> 
> I just scored 28492 with 4.7ghz, 1175/1400...http://www.3dmark.com/3dm11/7685166, tess off...compared to 28840 http://www.3dmark.com/3dm11/7658463 with 4.8, 1225/1350.
> 
> The only thing I can think of is my PSU being on the limit and when running the later tests I did get some random shut downs, must get a bigger PSU now.


Joe, check your VDDC. It seems that the latest WHQL added about 0.12V to it; at least, that's what GPU-Z showed for my 290 at idle. This also raised my idle temp by about 3C, so I went back to the 13.11 WHQL.


----------



## rdr09

Quote:


> Originally Posted by *xJakkx*
> 
> http://www.overclockers.co.uk/showproduct.php?prodid=GX-243-MS&groupid=701&catid=56&subcat=1752
> 
> http://www.overclockers.co.uk/showproduct.php?prodid=GX-337-SP&groupid=701&catid=56&subcat=1752
> 
> I am using 1080p. Also thanks for the reply on the the previous page, I don't plan on using any water cooling for my GPU at the moment. I assume the non reference R9 290 prices will not drop so I may get the 780, I just hope I don't max out the VRAM on that as well when playing a game with really high res textures or lots of mods.


3GB will be more than enough for 1080p, even with modded Skyrim, methinks.


----------



## The Storm

My water blocks just arrived. Do you guys put TIM on each little thermal pad? I never have, but EK recommends it in their instructions.


----------



## HardwareDecoder

Quote:


> Originally Posted by *The Storm*
> 
> My water blocks just arrived. Do you guys put Tim on each little thermal pad? I never have but EK recommends it with there instructions.


I put TIM between the VRMs and the pad like it said. I'm wondering if it's a good idea to put a tiny drop on top too.


----------



## esqueue

Quote:


> Originally Posted by *The Storm*
> 
> My water blocks just arrived. Do you guys put Tim on each little thermal pad? I never have but EK recommends it with there instructions.


I forgot to put it on the thermal pads. I actually use the spread method instead of the pea method, and after pressing the block on, I turn it back and forth slightly to press out any possible air pockets.

With the GPU OC'd to 1200/1500 +69mV I got great temps, even when ambient was pretty hot here. I've never seen my VRM temps go past 54°C, and the GPU stays in the mid-40°Cs max.

This is with an old cooling setup that's maybe 5+ years old.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kot0005*
> 
> hey mate, did you get your card back yet? I still haven't gotten mine back


Not yet; PCCG's distributor has it atm (I think it's Altech). They should be testing it out today sometime, and hopefully they'll just give me a refund so I can grab a non-ref version instead.

Really hoping the HIS version is good; otherwise it's Asus or Sapphire for me.


----------



## The Storm

Did you guys use the supplied tim? I have some clu here that I may use


----------



## sf101

Too bad they still don't put thermal sensors on the memory modules; it would be handy to know whether your memory was making good or bad contact with the block.


----------



## kot0005

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Not yet, PCCG's Distributer have it atm (i think it's Altech), They should be testing it out today sometime and hopefully just give me a refund so i can grab a non-ref version instead.
> 
> Really hoping the HIS version is good otherwise it's Asus or Sapphire for me.


It's been like 3 weeks and they are testing it now? Wow!


----------



## Sgt Bilko

Quote:


> Originally Posted by *kot0005*
> 
> Its been like 3 weeks and they are testing it now ? wow!


Yeah. PCCG got it 3 days after I posted it, tested it on the 4th day, sent it to Altech on the 7th day, and since then it's been sitting in a queue... I'm gonna call them later today and bug them about it.


----------



## Asrock Extreme7

Quote:


> Originally Posted by *The Storm*
> 
> Did you guys use the supplied tim? I have some clu here that I may use


I did not put TIM on the memory, just the VRMs, and used an X shape for the core. VRMs never go over 40, core 45 max. Very happy with the EK block; used MX-4.


----------



## Redeemer

Quote:


> Originally Posted by *BradleyW*
> 
> Hey everyone. Just wanted to know all your thoughts on Nvidia G-Sync. Is it the end for AMD?


If every Nvidia gamer wants to ditch/mod their perfectly good gaming monitor and pick up a $399 monitor or $100+ mod, then it could be a force!


----------



## carlovfx

AMD 13.12 drivers are out guys.


----------



## Jack Mac

Waiting to see what more people have to say about 13.12 before I go through the trouble of installing it.


----------



## BradleyW

Quote:


> Originally Posted by *Redeemer*
> 
> If every Nvidia gamer want to ditch\mod their perfectly good gaming monitor and pick up a $399/$100+ MOD then it could be a force!


Is there a need for G-Sync for those who are using 144Hz displays?
Cheers.


----------



## carlovfx

Quote:


> Originally Posted by *Jack Mac*
> 
> Waiting to see what more people have to say about 13.12 before I go through the trouble of installing it.


Installing right now, gimme some time to test them.


----------



## opiate5

Help needed here, guys.

I have unlocked my Sapphire 290 BF4 Edition to a 290X and tried various 290X BIOSes, but UEFI never works. Using Win 8.1. Do any of you have any idea what could be wrong? Or maybe someone who has it working in UEFI can provide a vBIOS, so I can rule out the GPU and look into my motherboard or something? Cheers.


----------



## carlovfx

Quote:


> Originally Posted by *Jack Mac*
> 
> Waiting to see what more people have to say about 13.12 before I go through the trouble of installing it.


I installed the new drivers over the old ones, and in Valley I am losing more than 150 points; plus it's stuttering like hell. I am going to uninstall everything and clean in Safe Mode, then reinstall the new ones, so hang on.


----------



## Jack Mac

Quote:


> Originally Posted by *carlovfx*
> 
> I installed the new drivers over the old ones and in Valley I am losing more than 150 points, plus it's stuttering like hell. I am going to disinstall everything and clean in Safe Mode, then reinstall the new ones so hang on.


Thanks, I've been happy with 13.11 WHQL but free performance is always nice.


----------



## carlovfx

Quote:


> Originally Posted by *Jack Mac*
> 
> Thanks, I've been happy with 13.11 WHQL but free performance is always nice.


Uninstalled all AMD software in Safe Mode using DDU, reinstalled the new drivers, and disabled ULPS: the result is the same in Valley. I am going to bench Hitman now.

EDIT: Both Tomb Raider and Hitman benchmarks are stuttering like crazy with the new drivers.

EDIT 2: The drivers work well if Frame Pacing is disabled.


----------



## HeliXpc




----------



## Darhant

I bought a second-hand R9 290X that turned out to be dodgy, since the seller shipped it with the block on and didn't drain it properly; it was artifacting and crashing. I got my money back for the card but kept the EK-FC full-cover block and backplate.

So combine that with the EK-FC full-cover block sitting on the desk staring at me, and I think I will have a good time; even better if it has a 2000-series chip that is unlockable!

Should arrive on Monday, so I'll have a full 2 days to play with it!


----------



## sf101

Quote:


> Originally Posted by *BradleyW*
> 
> Is there are need for G Sync with those who are using 144Hz displays?
> Cheers.


The better question is: is there a benefit for those who can already run a solid 120-145 fps without any drops when it's locked? Most games allow you to lock fps at a targeted speed, so you shouldn't be seeing dips and drops anyway if you're locking it.

For the extra price of a 120Hz monitor with G-Sync plus the Nvidia premium, do you not think you could spend the extra cash on a card, or two cards in Crossfire, to maintain a solid 60/120/144 fps and match your screen's refresh rate?

Most 120-144Hz screens are LightBoost-capable and can be run in 2D LightBoost, which strobes and makes it clear as heck already.

So if you ask me, I don't think it's at all worth it if you have a card that can maintain solid clock speeds.

From what I can see, all G-Sync does is match the refresh rate to the current fps, so it should only work better if your fps is all over the place.


----------



## Derpinheimer

Does the PSU affect overclocking? My 600W Cooler Master PSU is drooping all the way to 11.03V under extreme loads, or 11.2V under heavy load. Might this affect my OC potential?


----------



## tsm106

Quote:


> Originally Posted by *Derpinheimer*
> 
> Does PSU effect overclocking? My 600W Cooler Master PSU is drooping all the way to 11.03v under extreme loads, or 11.2v under heavy load. Might this effect my OC potential?


Is that a serious question? Hell yea that will affect things.


----------



## RAFFY

ALL 290X's in stock at NewEgg


----------



## anubis1127

Quote:


> Originally Posted by *RAFFY*
> 
> ALL 290X's in stock at NewEgg


Wow, $80 over MSRP. Good job newegg.


----------



## Derpinheimer

Quote:


> Originally Posted by *tsm106*
> 
> Is that a serious question? Hell yea that will affect things.


I know people say it does; I'm just wondering why? More work for the VRMs?


----------



## Sgt Bilko

Quote:


> Originally Posted by *anubis1127*
> 
> Wow, $80 over MSRP. Good job newegg.


The price hike has hit here now too: 290s are $520, up from $500, and 290Xs are $710, up from $700 for a BF4-bundled one.

Asus 290X BF4 bundle: $769.00........ Wha!?!?!

On a positive note, it looks like I'm getting my refund today... the downside is that cards are more expensive now than what I originally bought mine for.


----------



## the9quad

Just thought I would mention, since people really love benching these things, that 3DMark (the newest one) is currently on sale on Steam for $12.50.

No excuses now; go get it and start benching Fire Strike Extreme.


----------



## battleaxe

If anyone lives near Columbus, Ohio, there are a few 290s in stock for $420.00, and an open box for $359.00. I may just pick this up tomorrow if none of you beat me to it.

http://www.microcenter.com/search/search_results.aspx?Ntt=290&N=

Jeez... already sold out. The open box is the only one left. Probably a bad OC'er that was unstable. Bummer


----------



## RAFFY

Quote:


> Originally Posted by *the9quad*
> 
> Just thought I would mention, since people really love benching these things, That 3dmark (the newest one) is currently on sale on steam for $12.50.
> 
> No excuses now, go get it and start benching fire strike extreme


Much Thanks.... May the DOGE be with you!


----------



## tsm106

Quote:


> Originally Posted by *Derpinheimer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Is that a serious question? Hell yea that will affect things.
> 
> 
> 
> I know people say it does, im just wondering why? More work for VRM?
Click to expand...

It's failing. Call up the manufacturer and you will get an RMA approval on the spot. Drooping rails equal damage, and you are putting your gear into a brown-out state. Not good.


----------



## ImJJames

The official drivers seem slower in 3DMark compared to 9.5. What's faster in 3DMark, the 9.4 or the 9.2 beta?


----------



## the9quad

Quote:


> Originally Posted by *ImJJames*
> 
> Official drivers seem slower on 3dmark compared to 9.5, whats faster 9.4 or 9.2 beta on 3dmark?


The official drivers are really just the 9.5s, so if they are slower it is just an anomaly, or just... who knows, 3DMark is funny like that. Personally, I found the previous WHQLs to be the fastest for me, but I imagine you just have to try the different ones, run 3DMark, and see, as it can probably be affected by different systems.


----------



## ImJJames

Quote:


> Originally Posted by *the9quad*
> 
> The official drivers are really just the 9.5's so if they are slower, it is just an anomaly or just... who knows, 3dmark is funny like that. Personally, I found the previous WHQL's to be the fastest for me, but I imagine you just have to try the different ones, and run 3dmark and see, as it can probably be affected by different systems..


IDK, I lost almost 300 points at the same exact settings. I'll try again.


----------



## battleaxe

I just bought a 2nd MSI 290 for $381.00 including tax. Open box at MC. Nice. Hopefully it's not a turd. If so, it's getting returned.


----------



## Derpinheimer

Quote:


> Originally Posted by *tsm106*
> 
> It's failing. Call up the manufacturer, and you will get a RMA approval on the spot. Drooping rails equal damage and you are putting your gear into a brown out state, not good.


Oh, I guess I figured I was just overloading it. But I suppose that is true; it shouldn't droop that much even at maximum load.

Only issue is I have no backup PSU. Already sent an RMA request, hopefully they can cross-ship. Thanks.


----------



## stickg1

What model Cooler Master PSU? Have them send you one of their few decent models if anything. Some of those units they sell are nothing but a cat turd with a sticker on the side.

But they do offer a couple quality units.


----------



## battleaxe

Quote:


> Originally Posted by *stickg1*
> 
> What model Cooler Master PSU? Have them send you one of their few decent models if anything. Some of those units they sell are nothing but a cat turd with a sticker on the side.
> 
> But they do offer a couple quality units.


LOL.... cat turds are the best man. Oh and they smell so good when you cook them. Cat turd plus 80c = some mighty fine aromas!

(I think I drank too many beers tonight!)


----------



## UNOE

Quote:


> Originally Posted by *HeliXpc*


Nice way to use your fingers to touch the VRM and memory while it's running.


----------



## psyside

Quote:


> Originally Posted by *Derpinheimer*
> 
> Oh, I guess I figured I was just overloading it. But I suppose that is true, it shouldnt droop that much even at maximum load.
> 
> Only issue is I have no backup PSU. *Already sent an RMA request, hopefully they can cross-ship. Thanks.*


Ask them to offer you an RMA replacement from the V/Vanguard series; even if you have to pay a bit more, they are amazing.


----------



## VSG

Quote:


> Originally Posted by *HeliXpc*


Did you try installing aftermarket heatsinks on the VRAM and VRM?


----------



## Derpinheimer

Quote:


> Originally Posted by *psyside*
> 
> Ask for them to offer you rma from V/Vanguard series, even if you have to pay a bit more, they are amazing


Yeah, I asked if I can pay for an upgraded version. Every other company I've tried this with says no (well, only like 2 tries...).
Thanks for the advice though. Will look into the Vanguard.
Quote:


> Originally Posted by *stickg1*
> 
> What model Cooler Master PSU? Have them send you one of their few decent models if anything. Some of those units they sell are nothing but a cat turd with a sticker on the side.
> 
> But they do offer a couple quality units.


I have a Silent Pro M 600W. I really like it since it's the only truly silent PSU I've found. Every other "silent" PSU either whines, idles the fan at too high an RPM, ticks, or buzzes. This one has developed a faint buzz at load, but nothing major.


----------



## HeliXpc

Quote:


> Originally Posted by *UNOE*
> 
> Nice way to use your fingers to touch the VRM and memory while its running.


Oh yeah, my intent was to blow up the card, because that's what they do when you touch them while the PC is on







Nothing to worry about, been building PCs since 1995


----------



## HeliXpc

Quote:


> Originally Posted by *geggeg*
> 
> Did you try installing aftermarket heatsinks on the VRAM and VRM?


Yeah, I'll be doing this tomorrow. When I did the video I had no Arctic Alumina adhesive; I'll be posting a new video.


----------



## Slomo4shO

Just got all my water cooling parts finally


----------



## broken pixel

Quote:


> Originally Posted by *RAFFY*
> 
> Much Thanks.... May the DOGE be with you!


I made 150 USD with Dogecoin last night. Cryptsy was delaying deposits and manipulating the market with whacked-out high sales on low volume as everyone waited for their deposits to post.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Slomo4shO*
> 
> Just got all my water cooling parts finally


Can I have some noctuas please??


----------



## Mr357

Smoke on HWBot hit 1500/1762 on a 290X under LN2.









He now holds the world record for single-GPU 3DMark 11 Performance.

Link


----------



## Derpinheimer

Wow... that's amazing. Wonder how that chip performs under water.


----------



## ImJJames

Quote:


> Originally Posted by *Mr357*
> 
> Smoke on HWBot hit 1500/1762 on a 290X under LN2.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> He now holds the world record for single GPU 3DMark11 - Performance
> 
> Link


Wow!


----------



## kizwan

GPU usage in Crossfire seems a lot better with Unigine Heaven Benchmark.



@Arizonian, do you think it's a good idea to add a section to the first post for known Crossfire issues (and known fixes), e.g. fluctuating GPU usage in some games, flickering textures in BF4, etc.?

[EDIT]
This is GPU usage monitored using Open Hardware Monitor while playing BF4 in Crossfire.


----------



## Arizonian

@kizwan

Only if someone supplies the entire issue and fix. See the OP for examples from members I did add. If you provide a complete write-up of the issue and fix, I'd be willing to add it to the OP if it's well done.


----------



## kizwan

@Arizonian

Thanks. I'll see what I can do.


----------



## Hogesyx

My EK-290X block and backplate has arrived. Will install and do some benching tonight.

Originally I wanted to go for the NZXT Kraken G10, but my wife promised me a Corsair 900D for Xmas, so I went for a custom loop instead. Hopefully the results of the block won't disappoint.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Hogesyx*
> 
> My EK-290X block and backplate has arrived. Will install and do some benching tonight.
> 
> Originally wanted to go for NZXT Kraken G10, but wife promised me a Corsair 900D for Xmas so went for custom loop instead. Hopefully the results of the block won't disappoint.


Your wife sounds like a really cool lady


----------



## sf101

Any way to make your clock rates stay at what you set them to in Asus GPU Tweak?

I've been OC'ing some in GPU Tweak and it's working really well, except I set it to 1140MHz and it gives me anywhere from 1115 to 1135MHz depending on how it feels. But I'm not getting any of the black-screen stuff I was getting before past 1100, and oddly enough Afterburner started black-screening me at pretty much any overclock; I don't think it was playing well with everything else or something.

I wish they could have just kept the 7970 system for overclocking. It was so much simpler and more reliable: no fluctuations of voltages or clocks, just set it, and if it works, it works.


----------



## velocityx

Dunno what's up, but I was playing BF4 today and noticed my top 290 was at 98°C. What's up with that? I thought there was a limit of 94°C.


----------



## aesma

Hi, I'm new to this forum, but an experienced user/overclocker (mostly CPUs though). I've got 3 brand new Sapphire 290 BF4 editions. I've got no voltage control, no voltage readings even! Last night I flashed them with the Asus 290 BIOS from TechPowerUp. The flash was successful, but this has not changed a thing (nothing in AB, nothing in GPU Tweak, including the Hawaii-modded one). What am I missing? I need undervolting badly, as my ears are bleeding!

Thanks


----------



## rdr09

Quote:


> Originally Posted by *velocityx*
> 
> donno whats up but I was playing bf4 today and noticed my top 290 was at 98 C. donno whats up with that?I thought theres a limit for 94 C.


Apps may not be accurate reading volts, temps, or usage. You may also have to check your case airflow.


----------



## maynard14

http://www.3dmark.com/3dm11/7689670

Huhuhu, my card is very weak at overclocking.

AB settings are +100mV on voltage, and the core clock is only 1140!

Memory is stock at 1250.

XFX R9 290.

Very sad indeed.


----------



## VSG

If you notice the average overclock for the card, you will see that 1140 is actually pretty good.


----------



## TripleTurbo

1140 on the core isn't a poor overclock; I bet the maxed voltage setting is overkill. Try operating at the aforementioned clocks with a range of voltages.

I have twin 290X GPUs performing 1212/1300 @ 1315mV, 1300 being an absolute maximum; higher provides diminished returns in most cases, negative returns in a few. The core clock scales very well to performance gains: at 2560x1440, ultra mode & 4xMSAA in Heaven 4.0, frames delivered ratchet up from 69.7 to 77.3 when taking clocks from 1100MHz to 1200MHz. The cards aren't yet under dihydrogen monoxide; nevertheless my clocks are immune to the plague of throttling, so with higher-capability cooling I'd expect 1250 to be a reasonable expectation for these fantastic cards. AMD is bound to unlock better performance via drivers, too, so the GPU division's high end looks pretty bright these days.
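Those Heaven numbers actually scale slightly better than linearly with core clock. A quick back-of-the-envelope check (just an illustrative sketch; the 69.7/77.3 fps and 1100/1200MHz figures are the ones quoted above):

```python
# Rough check of how a core-clock overclock translated into benchmark gains.
# Figures from the post above: Heaven 4.0 at 1100 MHz -> 69.7 fps, 1200 MHz -> 77.3 fps.

def scaling_efficiency(clk_base, clk_oc, fps_base, fps_oc):
    """Fraction of the relative clock increase that showed up as an fps increase."""
    clock_gain = clk_oc / clk_base - 1.0   # relative clock increase, ~+9.1%
    fps_gain = fps_oc / fps_base - 1.0     # relative fps increase, ~+10.9%
    return fps_gain / clock_gain

eff = scaling_efficiency(1100, 1200, 69.7, 77.3)
print(f"clock +{(1200 / 1100 - 1) * 100:.1f}%, fps +{(77.3 / 69.7 - 1) * 100:.1f}%, "
      f"scaling efficiency {eff:.2f}")
```

Anything close to 1.0 means the benchmark is core-clock bound; here it comes out around 1.2 (slightly superlinear, likely within run-to-run variance), which is why pushing the core pays off so well on these cards.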


----------



## rdr09

shooting for 1300MHz . . .

http://www.3dmark.com/3dm11/7689992

orig bios and oc'ing using just Trixx.


----------



## maynard14

Quote:


> Originally Posted by *geggeg*
> 
> If you notice the average overclock for the card, you will see that 1140 is actually pretty good.


Quote:


> Originally Posted by *TripleTurbo*
> 
> 1140 on the core isn't a poor overclock; I bet the maxed voltage setting is overkill. Try operating at the aforementioned clocks with a range of voltages.
> 
> I have twin 290X GPUs performing 1212/1300 @ 1315mV, 1300 being an absolute maximum; higher provides diminished returns in most cases, negative returns in a few. The core clock scales very well to performance gains: at 2560x1440, ultra mode & 4xMSAA in Heaven 4.0, frames delivered ratchet up from 69.7 to 77.3 when taking clocks from 1100MHz to 1200MHz. The cards aren't yet under dihydrogen monoxide; nevertheless my clocks are immune to the plague of throttling, so with higher-capability cooling I'd expect 1250 to be a reasonable expectation for these fantastic cards. AMD is bound to unlock better performance via drivers, too, so the GPU division's high end looks pretty bright these days.


Oh, I didn't know that 1140 is actually a good overclock given my voltage increase of 100mV. OK guys, I will try to lower the voltage and see if I can keep it stable at a 1140 core clock, and later OC my memory as well.

By the way, when I set my OC at 1140 core clock with +100mV, my max temp is 76°C @ 75 percent fan, and max voltage I think is 1.321 volts.


----------



## mikep577

Hi, I just installed my two R9s with water cooling. Can I be added to this thread?

Brand: Asus BF4 edition
WaterCooling


----------



## maynard14

OK, final OC result.

AB settings:

+100mV

Core clock: 1135

Memory clock: 1400

Fan speed 75 percent, load 76°C

Max voltage is 1.336 volts, idle is 1.086

http://www.3dmark.com/3dm11/7690273

But it's weird, 3DMark 11 is not showing my card's overclocked core and memory clocks, haha.

Is my overclock OK or still weak?


----------



## the9quad

I posted this earlier, but for you guys benching with 3DMark 11 who have been waiting for a sale to upgrade to the new 3DMark:

Steam currently has it on sale for 50% off, so it is now *$12.50.* Well worth it, IMO.


----------



## VSG

I would wait it out, it was 75% off at the summer sale.


----------



## ReHWolution

Quote:


> Originally Posted by *rdr09*
> 
> shooting for 1300MHz . . .
> 
> http://www.3dmark.com/3dm11/7689992
> 
> orig bios and oc'ing using just Trixx.


Which version are you using? (I mean TriXX)


----------



## Derpinheimer

Quote:


> Originally Posted by *rdr09*
> 
> shooting for 1300MHz . . .
> 
> http://www.3dmark.com/3dm11/7689992
> 
> orig bios and oc'ing using just Trixx.


Voltage? Cooling? Temps???????
Quote:


> Originally Posted by *TripleTurbo*
> 
> 1140 on the core isn't a poor overclock; I bet the maxed voltage setting is overkill. Try operating at the aforementioned clocks with a range of voltages.
> 
> I have twin 290X GPUs performing 1212/1300 @ 1315mV, 1300 being an absolute maximum; higher provides diminished returns in most cases, negative returns in a few. The core clock scales very well to performance gains: at 2560x1440, ultra mode & 4xMSAA in Heaven 4.0, frames delivered ratchet up from 69.7 to 77.3 when taking clocks from 1100MHz to 1200MHz. The cards aren't yet under dihydrogen monoxide; nevertheless my clocks are immune to the plague of throttling, so with higher-capability cooling I'd expect 1250 to be a reasonable expectation for these fantastic cards. AMD is bound to unlock better performance via drivers, too, so the GPU division's high end looks pretty bright these days.


1100 to 1200 only increases mine by <5% in the standard Heaven test [1920x1080, extreme preset, full screen]


----------



## Slomo4shO

Quote:


> Originally Posted by *the9quad*
> 
> I posted this earlier, but you guys benching with 3dmark 11, if you have been waiting for a sale to upgrade to the new 3dmark.
> 
> Steam currently has it on sale for 50% off. so it is now *$12.50.* Well worth it imo.


Or just pick em up off eBay for $8.00


----------



## ImJJames

Quote:


> Originally Posted by *rdr09*
> 
> shooting for 1300MHz . . .
> 
> http://www.3dmark.com/3dm11/7689992
> 
> orig bios and oc'ing using just Trixx.


Very nice if true.


----------



## rdr09

Quote:


> Originally Posted by *Derpinheimer*
> 
> Voltage? Cooling? Temps???????
> 1100 to 1200 only increases mine from <5% in the standard heaven test [1920x1080, extreme preset, full screen]


I used the one found in post #10:

http://www.overclock.net/t/1444892/sapphire-290-290x-trixx-voltage-control-missing

Did not check temps; it is watercooled, I had the window open, and it was 45°F outside.

You don't see the PL, but it is maxed at 50%. Based off GPU-Z, my VDDC ends up at 1.18v.


----------



## Xares

Quote:


> Originally Posted by *kizwan*
> 
> With 13.12 drivers, when I set manual fan speed, the fan on second card remain at 20% until I run games or open flash content website.
> Wait...is this a problem. I got the same "fluctuation" but games running smooth though.
> 
> BF3
> 
> 
> BF4
> 
> ^^This. Also I would like to know the mosfet use on these cards.
> Is this your rig?
> http://www.overclock.net/lists/display/view/id/5550571
> 
> Try different PSU. As far as I know Aerocool PSU is among the PSU you should avoid.


Is the problem the fluctuation, or the temperature? What exactly is the problem?


----------



## rdr09

Quote:


> Originally Posted by *ImJJames*
> 
> Very nice if true.


Unbelievable, huh? Here is 1250MHz:

http://www.3dmark.com/3dm11/7648825


----------



## RavageTheEarth

Quote:


> Originally Posted by *rdr09*
> 
> shooting for 1300MHz . . .
> 
> http://www.3dmark.com/3dm11/7689992
> 
> orig bios and oc'ing using just Trixx.


*VERY* nice clocks!


----------



## ImJJames

Quote:


> Originally Posted by *rdr09*
> 
> unbelievable, huh? here is 1250MHz
> 
> http://www.3dmark.com/3dm11/7648825


I can do 1250 also on the stock BIOS, but I thought you said 1300, lol. Which driver version are you using?


----------



## Derpinheimer

Quote:


> Originally Posted by *rdr09*
> 
> unbelievable, huh? here is 1250MHz
> 
> http://www.3dmark.com/3dm11/7648825


Let's trade 290s







[Or 290x, mine unlocks]

Doesn't want to go over 1200 on water. Sound like a deal?


----------



## rdr09

Quote:


> Originally Posted by *RavageTheEarth*
> 
> *VERY* nice clocks!


Thank you. It is a work and gaming PC, so I take my time. I did run Firestrike at 1260 earlier . . .

http://www.3dmark.com/3dm/1895751

Love this 290. The best part is it can max out C3 at 1080p at stock clocks.

edit: No, thanks, @Derpinheimer. lol


----------



## rdr09

Quote:


> Originally Posted by *ImJJames*
> 
> I can do 1250 also on stock bios, but I thought you said 1300 lol, which driver version you using.


I will shoot for 1300, but I have my doubts. 13.11 WHQL it is. I am sure AMD will come up with the driver or drivers that will push these cards farther.


----------



## RavageTheEarth

Quote:


> Originally Posted by *rdr09*
> 
> i will shoot for 1300 but i have my doubts. 13.11 WHQL it is. i am sure AMD will come up with the driver or drivers that will push these cards farther.


What volts are you using for the clocks that you have?


----------



## kizwan

Quote:


> Originally Posted by *Xares*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Wait...is this a problem. I got the same "fluctuation" but games running smooth though.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> BF3
> 
> 
> BF4
> 
> 
> 
> 
> 
> 
> 
> is the fluctuation the problem the temperature? what is the problem?
Click to expand...

Temperatures are fine, and game performance is also fine. The only issue is the fluctuating GPU usage. I tested Crossfire with DiRT 2, BF3 & BF4; all three got fluctuating GPU usage but smooth gameplay. IMO, it should not fluctuate that badly, because GPU usage is a lot better when running the Unigine Heaven benchmark.


----------



## rdr09

Quote:


> Originally Posted by *RavageTheEarth*
> 
> What volts are you using for the clocks that you have?


1.18v according to GPU-Z for 1280MHz, which is correct 'cause I used a +200 offset in Trixx. Idle volts are 0.98v.


----------



## Hogesyx

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Your wife sounds like a really cool lady


Yes she is! She is a closet gadget freak; I got her a Shure SE535 in return.









Anyway, the EK-290X is simply amazing.

Maxed out at 1200MHz core without artifacts @ +100mV in Afterburner. GPU-Z reported a highest of 48°C for the core, 55°C for VRM1, and 41°C for VRM2.

I am still on my temp setup with an XTX360 and a single 120mm fan; temps will definitely be better once I migrate everything to the new case.


----------



## psyside

Quote:


> Originally Posted by *tsm106*
> 
> Is that a serious question? Hell yea that will affect things.


How does this look to you? What should the +12V rail be at idle/load, ideally?

Thanks


----------



## Joeking78

Quote:


> Originally Posted by *rdr09*
> 
> shooting for 1300MHz . . .
> 
> http://www.3dmark.com/3dm11/7689992
> 
> orig bios and oc'ing using just Trixx.


Awesome









1230 is my absolute limit with +200 in Trixx... even then I get artifacts and random shutdowns occasionally. I think my PSU is at the very limit with all three cards and the CPU overclocked. I actually score more now with +100 in AB and 1175 core, lol.

Going to get a 1500W PSU this week; what's your opinion on the Thermaltake 1475 Toughpower?


----------



## ImJJames

Quote:


> Originally Posted by *rdr09*
> 
> i will shoot for 1300 but i have my doubts. 13.11 WHQL it is. i am sure AMD will come up with the driver or drivers that will push these cards farther.


You don't have any problems with games on that old driver, or is it just for benching? That driver is older than the card itself, lol.


----------



## EliteReplay

Hi, is there anyone here with a 3930K/FX-8350 and an R9 290/290X who can let me know their idle and load power consumption while playing, let's say, BF4 at Ultra settings? Rep+ for you if you post a picture.


----------



## Arizonian

Quote:


> Originally Posted by *mikep577*
> 
> Hi just installed my two R9's with water cooling. Can I be added to this thread?
> 
> Brand: Asus BF4 edition
> WaterCooling
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Asrock Extreme7

Quote:


> Originally Posted by *Derpinheimer*
> 
> Lets trade 290s
> 
> 
> 
> 
> 
> 
> 
> [Or 290x, mine unlocks]
> 
> Doesnt want to go over 1200 on water. Sound like a deal?


Quote:


> Originally Posted by *rdr09*
> 
> unbelievable, huh? here is 1250MHz
> 
> http://www.3dmark.com/3dm11/7648825


Your 3DMark says 3,072 MB mem


----------



## Forceman

Quote:


> Originally Posted by *rdr09*
> 
> 1.18v according to GPUz for 1280Mhz, which is correct 'cause i used +200 offset in Trixx. idle volts is 0.98v.


What is your load voltage? 1.18V sounds like it is either the voltage-adjusted idle or the non-voltage-adjusted load. But it certainly isn't the +200mV load.
Quote:


> Originally Posted by *Asrock Extreme7*
> 
> Your 3DMark says 3,072 MB mem


It always says that, for some reason.


----------



## stickg1

Quote:


> Originally Posted by *Derpinheimer*
> 
> Yeah, I asked if I can pay for an upgraded version. Every other company I try this with they say no (Well, only like 2 tries...).
> Thanks for advice though. Will look into the Vanguard.
> I have a Silent Pro M 600W. I really like it since its the only truly silent PSU i've found. Every other "silent" psu either whines, too high idle fan RPM, fan ticking, buzzing, etc. This one has developed a faint buzz at load but nothing major.


The fans on my Seasonic X650 and the AX760i I used to have rarely ever turn on unless I'm maxing out the CPU and GPU simultaneously. When they do spin, they're inaudible over the case fans, which are low-RPM Fractals.


----------



## Sazz

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> Your 3DMark says 3,072 MB mem


Same on mine, but it only shows that way when I use the Asus and PT1 BIOSes; when I use the stock BIOS of the card it shows right.

http://www.3dmark.com/3dm11/7456375


----------



## Asrock Extreme7

Quote:


> Originally Posted by *stickg1*
> 
> The fan on my Seasonic X650 and the AX760i I used to have rarely ever turn on unless I'm maxing out CPU and GPU simultaneously. When it does spin, it's inaudible over the case fans which are low RPM Fractals.


I got an XFX 850W Gold PSU last week. It has a hybrid fan mode, and I've not seen it spin once; starting to think something is wrong with it.


----------



## Asrock Extreme7

Quote:


> Originally Posted by *Sazz*
> 
> same on mine, but it only shows that way when I use Asus and PT1 BIOS, when I use stock bios of the card it shows right.
> 
> http://www.3dmark.com/3dm11/7456375


Strange?

Nice pipe work in the pic.


----------



## Sazz

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> strange?
> 
> nice pipe work on pic


Probably because of the BIOS itself, but I don't really use the Asus BIOS on a daily basis, only for benchmarking that needs better voltage control than AB can do right now.

And thanks.


----------



## punkrocker

*Hi everyone!*

I am the owner of an R9 290. Yesterday I put the Arctic Cooling Xtreme III on it. I am mining with it 24/7. The core temp is great: 73°C @ 42% fan when mining @ 690 KH/s.
I am not sure about the VRM temps. Maybe you can help me.

In IDLE mode, VRM 1 is @ ~55°C and VRM 2 is @ ~50°C.

*When I am starting mining, this happens*:


*After a two minutes:*


I am wondering about the difference between both VRMs. What do you think?

- How can I reduce the VRM 1 temp, and how do I know which VRMs on my R9 290 are connected to VRM 1?
- Any ideas how I can cool down VRM 1?
- Is the difference between VRM 1 and VRM 2 normal?

Best regards!


----------



## sugarhell

690? Sell it and get a 280x for mining


----------



## Strata

EK Block was just delivered. Sadly I don't have time to do the install until Tuesday. Also the block wasn't polished so I'll be doing that as well.

EK Acetal 290X Waterblock
EK 290X Backplate (repacked into the EK box for my travel)
Primochill 1/2 | 3/4 Tubing Dark Blue
Arctic MX-4
Fillport
Swiftech Compression Fittings
Myriad of Swiftech 90° & 45° Rotary Adapters
I & H Silver Coil

Not pictured, but waiting at my holiday destination:

Fujipoly Extreme pads
Indigo Xtreme XS for LGA115X
Fesar Base Corrosion Inhibitor


----------



## SultanOfWalmart

Quote:


> Originally Posted by *punkrocker*
> 
> *Hi everyone!*
> 
> I am the owner of an R9 290. Yesterday I put the Arctic Cooling Xtreme III on it. I am mining with it 24/7. The core temp is great: 73°C @ 42% fan when mining @ 690 KH/s.
> I am not sure about the VRM temps. Maybe you can help me.
> 
> In IDLE mode, VRM 1 is @ ~55°C and VRM 2 is @ ~50°C.
> 
> I am wondering about the difference between both VRM's. What do you think?
> 
> - How can I reduce the VRM 1 temp and how do I know which VRM's on my R9 290 are connected to VRM 1?
> - Any ideas, how I can cool down the VRM 1?
> - Is the difference between VRM 1 and VRM 2 normal?
> 
> Best regards!


- Reduce it with heatsinks and fans; good airflow alone will not be enough, those suckers get hot! VRM1 is related to the GPU core and VRM2 is for the memory. VRM1 is a long strip close to the end of the card. VRM2 is a 3-piece cluster closer to the front of the card (bracket).
- Honestly, to cool it better than the stock HSF, I would go with a watercooling block, especially if you mine. Although I've seen people gut the stock cooler and use it in conjunction with the ACX III.
- The temp difference is normal. VRM1 will usually be hotter (sometimes significantly more so) than VRM2, since the core is more demanding than the memory.


----------



## jerrolds

Anyone with a Kraken G10/Arctic Hybrid/Dwood/Red Mod/etc. cooling VRM1 without sinks, just pointing a fan at them? How's that working out? Or if you did put sinks on, what are your temps?


----------



## Asrock Extreme7

Is mining worth it? How much do you make?


----------



## rdr09

Quote:


> Originally Posted by *Forceman*
> 
> What is your load voltage? 1.18V sounds like it is either voltage adjusted idle or non-volage adjusted load. But it certainly isn't +200mV load.
> It always says that, for some reason.


Yes, at idle after applying the OC settings in Trixx, that's the voltage: 1.18v. At load I did not check.

Quote:


> Originally Posted by *Sazz*
> 
> same on mine, but it only shows that way when I use Asus and PT1 BIOS, when I use stock bios of the card it shows right.
> 
> http://www.3dmark.com/3dm11/7456375


I have never flashed my 290 to any other BIOS. That glitch happens to other cards too, like the Titan. I have a run at 1280 core and it shows the whole 4GB of VRAM.


----------



## jerrolds

Trixx doesn't do on-screen overlay monitoring like AB, does it? Because I'm using both AB and GPU Tweak to monitor and overclock ATM ;p


----------



## battleaxe

How do I flash my 290 to a 290X?

I know it's in this thread somewhere.

Never mind; I found the thread.


----------



## Arizonian

Just saw this on OCN......*GPU-Z v0.7.5*

*0.7.5*
•Fixed crash/hang on NVIDIA Optimus
•Fixed crash during CUDA detection
•Added voltage monitoring support for new Bonaire/Pitcairn SKUs
•Fixed BIOS reading on AMD Mars and Oland
•Added support for AMD Radeon R7 260, HD 7600A, HD 8850M, HD 8400, HD 8240, HD 7620G
•Added support for NVIDIA GeForce 705A, GTX 645, GTX 650 OEM, Grid K260Q, Quadro K5100M, K6000
•Added support for Intel GMA (Bay Trail), HD 4200
*•Windows 8.1 will now be detected properly*
•Sensor data in shared memory will now update once per second, as intended

*0.7.4*
•Added support for AMD R9 290X, R9 290, R9 270, HD 7310, HD 8280
•Added support for NVIDIA GTX 780 Ti, GT 635, Quadro K3100M
•Added release date for AMD R7 260X, R7 250, R7 240
•Fixed release date for AMD R9 280X


----------



## stickg1

Quote:


> Originally Posted by *jerrolds*
> 
> Anyone with a Kraken G10/Artic Hybrid/Dwood/Red Mod/etc cooling VRM1 without sinks and just pointing a fan on them? Hows that working out? Or if you did put sinks, what are your temps as well?


I am currently. I fold on it 24/7: core max 65°C, VRM1 58°C (no sinks, direct high-RPM airflow), VRM2 54°C (heatsinked).

It's a Corsair H55 zip-tied on, with Enzotech 6.5mm x 6.5mm x 14mm VRM sinks on the triple VRMs at the front of the card; they get cooled by case airflow. I didn't have enough sinks and pads to do VRM1. I was going to order more, but my temps are great.

I have Thermalright 120mm high-RPM fans: one on the rad, one zip-tied to the GPU and pointed directly at the VRMs.


----------



## rv8000

Was messing around with Trixx and 3DMark today; stability-wise, OCs seem better. I can do 1215 with +100mV, while in AB I could barely keep 1180 stable with +100mV. I also noticed something not so great: I was monitoring my 12V line while watching VRM temps, and the 12V line is dropping a lot more than I thought. It normally runs at 11.88v and is dropping all the way down to 11.50v with the 290 heavily OC'd and the CPU @ stock. Is it bad that my PSU's 12V line drops so much? Also, 11.50v is close to potentially damaging levels, right?


----------



## tsm106

Quote:


> Originally Posted by *rv8000*
> 
> Was messing around with trixx and 3dmark today, stability wise oc's seem better. I can do 1215 with +100mv while in ab I could barely keep 1180 stable with +100mv. Also noticed something not so great, I was monitoring my 12v line while watching vrm temps and noticed that 12v line is dropping a lot more than i thought. Normally runs at 11.88v, and is dropping all the way down to 11.50v with the 290 heavily oc'd with cpu @ stock. Is it bad my psu's 12v line is dropping so much, also 11.50 is close to potential damaging levels right?


The ATX spec allows a range of ±5%, which is about 11.4v to 12.6v. Your PSU at 11.5v is getting rather too close to the edge of that spec. I would be cautious with it, and at least start thinking about a multimeter soon to confirm your fears.

When your range of droop is severe like below, it's time to replace asap.

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Derpinheimer*
> 
> Does the PSU affect overclocking? My 600W Cooler Master PSU is drooping all the way to 11.03v under extreme loads, or 11.2v under heavy load. Might this affect my OC potential?
> 
> 
> 
> Is that a serious question? Hell yea that will affect things.
Click to expand...
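The ±5% window quoted above is easy to sanity-check against your own readings. A minimal sketch (the 11.88v / 11.50v / 11.03v sample values are the readings reported in this thread; the 1% "marginal" warning band is my own arbitrary choice, not part of the ATX spec):

```python
# Classify a 12 V rail reading against the ATX +/-5% tolerance window
# (11.40 V to 12.60 V for the 12 V rail).

ATX_NOMINAL_12V = 12.0
ATX_TOLERANCE = 0.05  # +/-5% per the ATX spec

def rail_status(reading_v, nominal=ATX_NOMINAL_12V, tol=ATX_TOLERANCE):
    lo = nominal * (1.0 - tol)   # 11.40 V
    hi = nominal * (1.0 + tol)   # 12.60 V
    if not lo <= reading_v <= hi:
        return "out of spec"
    # Arbitrary warning band: within 1% of nominal from either bound.
    if reading_v < lo + nominal * 0.01 or reading_v > hi - nominal * 0.01:
        return "marginal"
    return "in spec"

for v in (11.88, 11.50, 11.03):
    print(f"{v:.2f} V -> {rail_status(v)}")
```

Keep in mind software sensors can be off, so a multimeter reading off the 24-pin is the real confirmation.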


----------



## ImJJames

Quote:


> Originally Posted by *rv8000*
> 
> Was messing around with trixx and 3dmark today, stability wise oc's seem better. I can do 1215 with +100mv while in ab I could barely keep 1180 stable with +100mv. Also noticed something not so great, I was monitoring my 12v line while watching vrm temps and noticed that 12v line is dropping a lot more than i thought. Normally runs at 11.88v, and is dropping all the way down to 11.50v with the 290 heavily oc'd with cpu @ stock. Is it bad my psu's 12v line is dropping so much, also 11.50 is close to potential damaging levels right?


Trixx doesn't even let me change voltage; how did you get it working?


----------



## rv8000

Quote:


> Originally Posted by *ImJJames*
> 
> Trixx doesn't even let me change voltage, how you get it working?


Have you tried the newest beta in that same thread where they posted the first 4.8 beta? I simply installed it, slid the bar, hit apply, and had no issues.

@tsm106 I have a nice DMM lying around; red DMM lead to a yellow wire and black lead to ground, correct? Should I check from a spare 6-pin VGA connector?


----------



## tsm106

http://pcsupport.about.com/od/insidethepc/a/atx-pinout-24-pin-12v-psu.htm

Pin guide for reference. Yes, red to yellow, black to ground. Test off the 24-pin.


----------



## ImJJames

Quote:


> Originally Posted by *rv8000*
> 
> Have you tried the newest beta in that same thread they posted the first 4.8 beta? I simply installed, slid the bar, hit apply, and no issues.


Can you link me please? All I see are really old betas, version 3.


----------



## punkrocker

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> - Reduce it with heatsinks and fans, plus good airflow, as airflow by itself will not be enough, suckers get hot! VRM1 is related to the GPU core and VRM2 is for memory. VRM1 is a long strip that is close to the end of the card. VRM2 is a 3-piece cluster that is closer to the front of the card (bracket).
> - honestly, to cool it better than the stock HSF, I would go with a watercooling block, especially if you mine. Although, I've seen people gut the stock cooler and use it in conjunction with the ACXIII.
> - The temp difference is normal. VRM1 will usually be hotter (sometimes significantly more) than VRM2 since the core is more demanding than the memory is.


Really great answer, thanks. I will try to find a solution to get VRM1 cooler. I already have heatsinks on VRM1 + 2; maybe I need bigger and better ones.

Quote:


> Originally Posted by *sugarhell*
> 
> 690? Sell it and get a 280x for mining


I am mining @ 690 KH/s because I set the intensity to 17, so I can work without lag. This card is great for mining. At intensity 20 I get >800 KH/s.


----------



## X-oiL

Seems like I've reached the limit for now; I'm currently on the latest WHQL 13.12 drivers. Here are my Firestrike Extreme results. It's possible to raise the memory higher, but I'm not getting any extra points by doing so, so I guess I have to switch BIOS if I want to clock any higher.

Not sure if this has been posted before, but disabling frame pacing gives a huge increase in performance.

*8054* stock volt, 947 // 1250 http://www.3dmark.com/fs/1327924

*9779* +100mV, 1195 // 1350 http://www.3dmark.com/fs/1333911


----------



## BradleyW

Hey,
Just wondering, why do I get no mouse lag with a single GPU when Vsync is enabled, but with 2 cards I have enormous input delay?
Thank you.


----------



## Durquavian

Quote:


> Originally Posted by *BradleyW*
> 
> Hey,
> Just wondering, why do I get no mouse lag with a single GPU when Vsync is enabled, but with 2 cards I have enormous input delay?
> Thank you.


Pre-rendering: it occurs with dual GPUs, but not with a single GPU unless selected. Pre-rendered frames cause the lag.


----------



## BradleyW

Quote:


> Originally Posted by *Durquavian*
> 
> Pre-rendering: it occurs with dual GPUs, but not with a single GPU unless selected. Pre-rendered frames cause the lag.


No way around it?


----------



## rdr09

Quote:


> Originally Posted by *ImJJames*
> 
> Can u link me pls, all I see are really old betas version 3


post #53 . . .

http://www.overclock.net/t/1444892/sapphire-290-290x-trixx-voltage-control-missing/50


----------



## VSG

Quote:


> Originally Posted by *rv8000*
> 
> Was messing around with trixx and 3dmark today, stability wise oc's seem better. I can do 1215 with +100mv while in ab I could barely keep 1180 stable with +100mv. Also noticed something not so great, I was monitoring my 12v line while watching vrm temps and noticed that 12v line is dropping a lot more than i thought. Normally runs at 11.88v, and is dropping all the way down to 11.50v with the 290 heavily oc'd with cpu @ stock. Is it bad my psu's 12v line is dropping so much, also 11.50 is close to potential damaging levels right?


How did you measure the PSU voltage? Voltmeter or PSU software? The latter can often be untrustworthy.


----------



## Durvelle27

First review of the non-reference R9 290:

Sapphire Radeon R9 290 Tri-X

http://m.hexus.net/tech/reviews/graphics/63953-sapphire-radeon-r9-290-tri-x/


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> First review of the non reffernece R9 290
> 
> Sapphire Radeon R9 290 Tri-X
> 
> http://m.hexus.net/tech/reviews/graphics/63953-sapphire-radeon-r9-290-tri-x/


I still don't like the Black/Orange design but everywhere else is looking good

EDIT: Looks like PowerColor is going to bring out water-blocked 290X cards:

http://powercolor.com/au/News.asp?id=1207


----------



## Falkentyne

Quote:


> Originally Posted by *rv8000*
> 
> Was messing around with trixx and 3dmark today, stability wise oc's seem better. I can do 1215 with +100mv while in ab I could barely keep 1180 stable with +100mv. Also noticed something not so great, I was monitoring my 12v line while watching vrm temps and noticed that 12v line is dropping a lot more than i thought. Normally runs at 11.88v, and is dropping all the way down to 11.50v with the 290 heavily oc'd with cpu @ stock. Is it bad my psu's 12v line is dropping so much, also 11.50 is close to potential damaging levels right?


This has NOTHING to do with your power supply. On my Seasonic 1000W Platinum PSU, my card's reading also starts at 11.75v, drops to 11.63v, and at highest load hits 11.50v, while HWiNFO detects 11.926v at full load in Prime95. So the card has its own voltage sensor/regulator that droops the reading or something; it's not related to your PSU.


----------



## Derpinheimer

Prime doesn't stress the 12v line.

Where did the first numbers come from? You say it drops to 11.5v in one reading, while HWiNFO says 11.926v.

GPUz and HWInfo are about the same for me:


----------



## kcuestag

Quote:


> Originally Posted by *Derpinheimer*
> 
> Prime doesnt stress the 12v line.
> 
> Where did the first numbers come from? You say it drops to 11.5v in some reading, and HWInfo says 11.926v
> 
> GPUz and HWInfo are about the same for me:
> 
> 
> Spoiler: Warning: Spoiler!


Never trust voltage readings from computer software. If you want to see proper readings, get yourself a multi/volt-meter. I did it because I saw as low as 11.13V on GPU-Z, but the multimeter never gave me below +12.10V.


----------



## Derpinheimer

Quote:


> Originally Posted by *kcuestag*
> 
> Never trust voltage readings from computer software. If you want to see proper readings, get yourself a multi/volt-meter. I did it because I saw as low as 11.13V on GPU-Z, but the multimeter never gave me below +12.10V.


Do you think something this cheap would work properly? http://www.amazon.com/DT830B-Digital-Voltmeter-Ammeter-Multimeter/dp/B005KGCI0Y/ref=sr_1_3?ie=UTF8&qid=1387589430&sr=8-3&keywords=volt+meter

Thanks


----------



## kcuestag

Quote:


> Originally Posted by *Derpinheimer*
> 
> Do you think something this cheap would work properly? http://www.amazon.com/DT830B-Digital-Voltmeter-Ammeter-Multimeter/dp/B005KGCI0Y/ref=sr_1_3?ie=UTF8&qid=1387589430&sr=8-3&keywords=volt+meter
> 
> Thanks


That's exactly the one I bought when I went nuts about the voltage readings on GPU-z.


----------



## VSG

Ya, it will work, but be careful when taking readings straight off the PSU. Do it on a spare Molex connector to be safe.

Edit: A quick google search revealed this thread on another forum so excuse me for a non-OCN link - http://forums.anandtech.com/showthread.php?t=257485


----------



## The Storm

Quote:


> Originally Posted by *rdr09*
> 
> post #53 . . .
> 
> http://www.overclock.net/t/1444892/sapphire-290-290x-trixx-voltage-control-missing/50


Thank you, I needed this as well

+1


----------



## Derpinheimer

Awesome, thanks guys. Prime shipping <3


----------



## Falkentyne

Quote:


> Originally Posted by *Derpinheimer*
> 
> Prime doesnt stress the 12v line.
> 
> Where did the first numbers come from? You say it drops to 11.5v in some reading, and HWInfo says 11.926v
> 
> GPUz and HWInfo are about the same for me:


I asked Unwinder. He said the numbers come from an IC register on the GPU itself.
HWiNFO's +12v comes from a motherboard sensor. They do NOT read the same.

HWiNFO says 12.025v for the PSU/motherboard, while GPU-Z says 11.75v for the GPU (both fully idle).
Someone measured the 12v on the GPU with a DMM; he said the real voltage was 0.06v (or 0.08v, I don't remember) higher than GPU-Z's reading.


----------



## ImJJames

Quote:


> Originally Posted by *The Storm*
> 
> Thank you, I needed this as well
> 
> +1


Seems to be a little more stable than ASUS GPU Tweak; I can bench at 1270 on the stock ASUS BIOS without crashing.


----------



## Roy360

My XFX R9 290 (air) crashed after mining for 3 days. Every time I turn on my PC I see white screens on the desktop, and then it crashes when I open 2-3 applications (i.e. Chrome and Firefox = crash).

Funny thing is, I am still able to mine as long as I use my iGPU for my monitors. The hashrate dropped by 60 kH/s, but besides that the card continues to mine like normal. Yet if I hook up even one monitor to the card, I get white lines across that monitor and the whole thing crashes.

No modded firmware, using stock clocks.


----------



## stickg1

Sounds like it could be corrupt drivers. Try uninstalling in safe mode with DDU, sweep the registry with CCleaner, and reinstall the drivers. Worked for me!


----------



## VSG

The first of the aftermarket Hawaii cards are in stock: http://www.newegg.com/Product/Product.aspx?Item=N82E16814125500

Massive price gouging going on, but at least we have something other than reference cards.


----------



## Durvelle27

Quote:


> Originally Posted by *geggeg*
> 
> The first of the aftermarket hawaii cards are in stock: http://www.newegg.com/Product/Product.aspx?Item=N82E16814125500
> 
> Massive price gouging going on, but at least we got something else other than reference cards.


Holy bologna, that price. I'd rather get an aftermarket GTX 780 or a reference 290.


----------



## brazilianloser

Quote:


> Originally Posted by *Durvelle27*
> 
> Holy Bologna that price. I rather get a Aftermarket GTX 780 or reference 290


No kidding... enough money there if you get your hands on a ref 290 for the original price of 399 to buy a block, back plate and put it under water, assuming you have a loop. Unless that is supposed to be the 290x and they made a mistake on the site...


----------



## Durvelle27

Quote:


> Originally Posted by *brazilianloser*
> 
> No kidding... enough money there if you get your hands on a ref 290 for the original price of 399 to buy a block, back plate and put it under water, assuming you have a loop. Unless that is supposed to be the 290x and they made a mistake on the site...


Hopefully it is a mistake, or that's pretty steep.

Also, guys, there's a seller on eBay with 10+ PowerColor R9 290s for $440-$460 a pop if you're looking.


----------



## Pesmerrga

Quote:


> Originally Posted by *geggeg*
> 
> The first of the aftermarket hawaii cards are in stock: http://www.newegg.com/Product/Product.aspx?Item=N82E16814125500
> 
> Massive price gouging going on, but at least we got something else other than reference cards.


Ugh.. they are still doing the $50 off of $500+ on Neweggbusiness.com and the ASUS 290x Reference is $599. You need a Tax-ID # to make an account though.

http://www.neweggbusiness.com/product/product.aspx?item=9b-14-121-820

Promo code: XB2BYE201350

Looks like there is only 1 in stock.. Might have been a return.

I've been holding off on buying a GPU for the last few months. Waiting to see how Mantle might affect things, waiting for aftermarket coolers, etc.. I'm literally sitting in my Newegg cart with the above Asus card trying to decide to buy it or not.

I know a lot of people have gotten reference cards to watercool, but would it still be better to get an aftermarket card to watercool? I know it's probably not as necessary as the cooler will be better than the stock cooler. But won't the power delivery, chokes, vrm, etc be higher quality and possibly OC better?


----------



## VSG

There were over 10 in stock when I posted this thread, some people have more money than sense.

If the aftermarket versions use a much improved PCB, then they will be worth buying for watercooling, but then you run the risk of waterblocks possibly not being made for them.


----------



## Pesmerrga

Quote:


> Originally Posted by *geggeg*
> 
> There were over 10 in stock when I posted this thread, some people have more money than sense.
> 
> If the aftermarket versions use a much improved PCB, then they will be worth buying for watercooling it but then you run the risk of waterblocks not possibly being made for them.


I probably won't be doing a full loop. I have an Antec 620, a richie's GPU Cool bracket, some Phobya thermal padding, and Enzotech ramsinks. I also have a brand new Arctic Accelero Xtreme III in case the Antec doesn't work out.

I think I'll keep waiting and see what Mantle might bring.. or maybe I'll buy one tomorrow.. lol.. I can't make up my freakin' mind


----------



## Roy360

Thanks, I honestly did not think that would solve my issue. But one thing that is concerning me is that I can no longer reach my old overclocks. When I set the memory clock to 1500 I get an immediate crash; 1400 is the highest I've been able to set. Could the damage to the card have been prevented if it was on water? Or did it get damaged by me pushing it too hard?


----------



## VSG

From everything I read so far, memory overclocking has not been very effective in benchmarks or games and in some cases actually hurts it. So I don't know about you, but I would probably leave it at 1400.


----------



## jerrolds

Quote:


> Originally Posted by *geggeg*
> 
> The first of the aftermarket hawaii cards are in stock: http://www.newegg.com/Product/Product.aspx?Item=N82E16814125500
> 
> Massive price gouging going on, but at least we got something else other than reference cards.


I hope that's not MSRP. Newegg is being super lame about pricing, but they're making a killing, I guess. Hopefully non-reference cooling will normally be +$50 at most. Only Matrix/Lightning/HoF-style cards deserve that kind of markup, if that.


----------



## tsm106

Quote:


> Originally Posted by *geggeg*
> 
> From everything I read so far, memory overclocking has not been very effective in benchmarks or games and in some cases actually hurts it. So I don't know about you, but I would probably leave it at 1400.


Do us all a favor and stop repeating that. Just because you get no gains with a memory OC doesn't mean it's true for everyone or in all cases.

Quote:


> Originally Posted by *tsm106*
> 
> For ex.
> 
> @1225/1375 = 12734 GS
> 
> http://www.3dmark.com/fs/1060554
> 
> @1225/1500 = 13414 GS
> 
> http://www.3dmark.com/fs/1050519


----------



## VSG

Fair enough, you actually have numbers to prove my statement false whereas the posts I was referring to were just words.


----------



## jerrolds

To be fair, I'm curious what the same 9% bump applied to the GPU core would do... I would think it would give more than the 5.34% increase in score. If so, "not very effective" might be somewhat accurate.


----------



## Forceman

Quote:


> Originally Posted by *jerrolds*
> 
> To be fair I'm curious to see what that 9% bump done to the gpu core would do....I would think it would be more than the 5.34% increase in scores. Then "not very effective" might be somewhat accurate.


I'd like to see some game testing also. I checked Metro:LL at 1350 and 1500 and it was 0.5 FPS different (or 1%).

Edit: Just checked Batman:AO at 1325 and 1455 (so 10% increase) and it was 3.8% faster.
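For what it's worth, those two data points reduce to a rough perf-per-clock ratio; a throwaway sketch (the helper is mine, the clocks and gains are from this post):

```python
# Ratio of performance gain to memory-clock gain for the two game tests
# reported above. A ratio near 1.0 would mean perfect scaling.

def scaling_efficiency(clk_before, clk_after, perf_gain_pct):
    """Performance gain (%) divided by clock gain (%)."""
    clk_gain_pct = (clk_after - clk_before) / clk_before * 100
    return perf_gain_pct / clk_gain_pct

# Metro: LL, memory 1350 -> 1500 (~11% OC) for ~1% more FPS
print(round(scaling_efficiency(1350, 1500, 1.0), 2))  # 0.09
# Batman: AO, memory 1325 -> 1455 (~10% OC) for 3.8% more FPS
print(round(scaling_efficiency(1325, 1455, 3.8), 2))  # 0.39
```

So even in the better case (Batman), the memory OC returned well under half its clock gain as FPS in these two games.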


----------



## Roy360

Well, in mining I saw a 60 kH/s increase from overclocking from 1250 to 1500. My only worry is whether I have damaged the card. I'll be voiding the warranty to install a waterblock, so I'm considering RMAing the card while I still have the chance; it's only 3 days old after all. By the way, why does XFX advertise unlocked voltage on their site, yet in Afterburner there is no such option? Do they have their own software?


----------



## Forceman

Quote:


> Originally Posted by *Roy360*
> 
> well in mining I saw a 60Kh/s instease from overclocking from 1250 to 1500. My only worry is whether I have damaged the card. I'll be voiding the warranty to install a waterblock so I"m considering RMAing the card while I still have the chance. Only 3 days old after all. By the way, why does XFX advertise unlocked voltage on their site, yet on afterburner there is no such option. Do they have their own software?


It is unlocked voltage (they all are), and you can use Afterburner. You need Beta 15, and you may need to make a config file change, but it does work.


----------



## tsm106

Quote:


> Originally Posted by *jerrolds*
> 
> To be fair I'm curious to see what that 9% bump done to the gpu core would do....I would think it would be more than the 5.34% increase in scores. Then "not very effective" might be somewhat accurate.


*Core OCs give the greatest ratio of gains, and memory OC is half that, typical variances aside.* I didn't run any benches specifically to get a feel for these metrics, but I have enough benches to draw some knowledge from. The CPUs are different, amongst a host of other variables. All that aside, as a longtime OC'er here, I could never hit my scores in benches or game benches without raising the memory OC along with the core OC. Thus the greatest gains come from balancing both core and memory OCs.

1225/1500 = 13414

http://www.3dmark.com/fs/1050519

1320/1700 = 14473

http://www.3dmark.com/fs/1176944
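Quantifying the two linked runs with a quick percentage helper (the function is mine; the clocks and graphics scores are from the links above), the combined bump of roughly 7.8% core and 13.3% memory bought about 7.9% more graphics score:

```python
# Percentage gains between the two Firestrike runs quoted above.

def pct_gain(before, after):
    return (after - before) / before * 100

print(round(pct_gain(1225, 1320), 1))    # core clock:     7.8 %
print(round(pct_gain(1500, 1700), 1))    # memory clock:  13.3 %
print(round(pct_gain(13414, 14473), 1))  # graphics score: 7.9 %
```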


----------



## Roy360

Quote:


> Originally Posted by *Forceman*
> 
> It is unlocked voltage (they all are), and you can use Afterburner. You need Beta 15, and you may need to make a config file change, but it does work.


In that case I could get my old overclock of 1000/1500 back if I were to apply more voltage. Maybe I ruined a capacitor or something, and that's why it demands more voltage to hit my old clocks?


----------



## Sazz

Quote:


> Originally Posted by *Forceman*
> 
> I'd like to see some game testing also. I checked Metro:LL at 1350 and 1500 and it was 0.5 FPS different (or 1%).
> 
> Edit: Just checked Batman:AO at 1325 and 1455 (so 10% increase) and it was 3.8% faster.


Same result when I tested Tomb Raider at 6020x1080: barely a 0.5 FPS difference changing from the stock 1250 to 1450. So now I don't even OC my memory on my daily clocks anymore; very little improvement.


----------



## kizwan

My second card seems to run cooler than my first card. It doesn't seem to underperform, though, based on FPS and core clock/GPU usage. So I switched GPU2 to the first slot and GPU1 to the second slot; this seems to have balanced out the temperatures for both cards when running in Crossfire.

Last night's BF4 update seems to have fixed two issues for me:
- fixed micro-lag when running a single card.
- fixed texture flickering in Crossfire.


----------



## tsm106

Quote:


> Originally Posted by *Roy360*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> It is unlocked voltage (they all are), and you can use Afterburner. You need Beta 15, and you may need to make a config file change, but it does work.
> 
> 
> 
> In that case I could get my old overclock of 1000/1500 if I were to apply more voltage. Maybe I ruined a capacitor or something? and that's why it demands more power to get my old voltages.
Click to expand...

If yer mining, shouldn't you be underclocking the memory and reducing your voltage usage?


----------



## jerrolds

Scrypt mining depends on a lot of fast memory; most of the 850 kH/s+ configs have memory at 1500MHz. But there's a balance: if the core is too high it actually hurts mining speeds. Usually it's around 900/1500 for a 290X.

Haven't seen too many configs that go much higher than 1500.

But I think it's a combination of miners with Elpida memory and the fact that we're not able to increase memory voltage on reference boards. Maybe non-reference cards will allow for this and 1600MHz+ will be faster.


----------



## RAFFY

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Can I have some noctuas please??


Or would you like some Scythe?


----------



## jassilamba

Finally took my R9 out of the box and installed it tonight; I was running a 690 before. Has anyone seen this:


----------



## Durvelle27

Quote:


> Originally Posted by *RAFFY*
> 
> Or would you like some Scythe?


Ohhh I need some of those o.o


----------



## chiknnwatrmln

Quote:


> Originally Posted by *jassilamba*
> 
> Finally took my R9 out of the Box and installed it tonight. Was running a 690 before. Has anyone seen this:


Your problem is that you're using Precision X.

Use MSI AB Beta 17 or later, or GPU Tweak.


----------



## jassilamba

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Your problem is you're using Precision X.
> 
> Use MSI AB Beta 17 or later, or GPUTweak..


Isn't Precision supposed to not work with AMD cards?


----------



## HoZy

So update on my 290x's

The coil whine I had on my first card is completely gone after a month.

Picked up the second card the other week, Coil whine is NASTYYYYYYYYY above 140fps.

However, after some use it should quieten down and practically go away as well.

Here is how the rig looks; it's running flawlessly on the Cooler Master Silent Pro 850W PSU with 290X Crossfire.


----------



## dansi

The new WHQL driver plus the new BF4 patch lets me overclock my VRAM 45MHz higher without black screens! Good job, mates at AMD/DICE.
Now for Mantle!


----------



## HoZy

290 non-reference cooler cards are coming, with a decent factory overclock as well!

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125500


----------



## Strata

Finished lapping my EK Block, kind of disappointing that they don't polish it on their end, but at least it forced me to try lapping finally.

I'm quite proud of my first lap, it might not be the best job, but looks good to me.

Before:


After (Core only):


After (whole block):


----------



## darkelixa

Still yet to hear my Sapphire R9 290 over my case fans, lol.


----------



## PCBung

What are the best power settings for the 290s so far, if any? I find that setting the power limit to +50% isn't as stable as +30/40%.


----------



## kizwan

*BF4: GPU Clock vs. GPU Usage with Crossfire vs. Single GPU*


----------



## Widde

Quote:


> Originally Posted by *Sazz*
> 
> same result when I tested on Tomb Raider at 6020x1080p, barely .5 FPS difference changing from stock 1250 and 1450. so now I don't even OC my memory on my daily clocks anymore. very little improvement.


This is in Furmark, so I don't know how relevant it is or how the program works compared to games, but when I went from 1100/1250 to 1100/1500 I gained around 30ish FPS at 1024x768 and around 15-20ish in fullscreen at 1920x1080.

----------



## specopsFI

I must leave you behind, for now at least. Sold my 290 for a nice price, but I think both parties got a good deal. Since I wasn't going to water cool it, I'm glad I found someone who would. It was a nice card for sure, unlocked and could do 1200/1400 on air.

See you elsewhere!


----------



## Durvelle27

Quote:


> Originally Posted by *specopsFI*
> 
> I must leave you behind, for now at least. Sold my 290 for a nice price, but I think both parties got a good deal. Since I wasn't going to water cool it, I'm glad I found someone who would. It was a nice card for sure, unlocked and could do 1200/1400 on air.
> 
> See you elsewhere!


Welcome to the Dark Side


----------



## darkelixa

And what did you buy??


----------



## specopsFI

Quote:


> Originally Posted by *Durvelle27*
> 
> Welcome to the Dark Side


It's dark outside, but that's not what you meant, is it?









Quote:


> Originally Posted by *darkelixa*
> 
> And what did you buy??


Another GTX 780. The third one I've had so far. Nothing wrong with the Hawaii, though. Might return with a nice custom card, if a good deal presents itself.


----------



## Durvelle27

Quote:


> Originally Posted by *specopsFI*
> 
> It's dark outside, but that's not what you meant, is it?
> 
> 
> 
> 
> 
> 
> 
> 
> Another GTX 780. The third one I've had so far. Nothing wrong with the Hawaii, though. Might return with a nice custom card, if a good deal presents itself.


Lol nope

I haven't bought a new GPU yet but I am looking at a 780


----------



## stickg1

This has been discussed a few times in this thread, but I've found that memory overclocking does make a difference. Just like it does on every other card I have overclocked in the past two years. It's not as significant as a core clock, but it does make a difference. If I run Valley for example, 1100MHz core with stock 1250MHz memory, my score is a handful of frames lower than it is at say 1100/1500. A handful of frames is nothing to scoff at, and to me, is very significant.

I can take a bunch of screenies and post them if you want, maybe later today.


----------



## Pesmerrga

Well, now they have the Windforce 290X up: $699. Looks like it's in stock and ready to ship, not pre-order.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814125499


----------



## the9quad

Quote:


> Originally Posted by *Pesmerrga*
> 
> Wel, now they have the Windforce 290x up. $699.. Looks like in stock and ready to ship. Not pre-order.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814125499


For that price I'd rather get a 780ti.


----------



## ricklen

Just posted this in another thread, thought it would be interesting to post it here:

Pricing is very weird on Newegg or just very weird in The Netherlands. For example:

Newegg Gigabyte GTX 770 = $ 330 ---> Dutch price € 300

Newegg MSI R9 290 (stock cooler) = $ 550 ---> Dutch price € 360

The difference is huge on the Radeon cards!


----------



## trihy

Guys.... any good place to buy a gelid icy vision heatsink? There are only a few on ebay...


----------



## Tobiman

Well, 290s are back to $450 on ebay and Newegg has been able to keep stock for more than a day. Hopefully, this means that prices will fall back in line soonish. I can only hope.


----------



## stickg1

Quote:


> Originally Posted by *trihy*
> 
> Guys.... any good place to buy a gelid icy vision heatsink? There are only a few on ebay...


Check Superbiiz.com, they have it for about $55 shipped.


----------



## rdr09

Quote:


> Originally Posted by *ricklen*
> 
> Just posted this in another thread, thought it would be interesting to post it here:
> 
> Pricing is very weird on Newegg or just very weird in The Netherlands. For example:
> 
> Newegg Gigabyte GTX 770 = $ 330 ---> Dutch price € 300
> 
> Newegg MSI R9 290 (stock cooler) = $ 550 ---> Dutch price € 360
> 
> The difference is huge on the Radeon cards!


If you had Newegg there, it would be €500.


----------



## kcuestag

Hey @Arizonian, can you update me from a single R9 290X to 2x Sapphire R9 290 (Watercooled)?

Thanks.


----------



## Arizonian

Quote:


> Originally Posted by *Strata*
> 
> Finished lapping my EK Block, kind of disappointing that they don't polish it on their end, but at least it forced me to try lapping finally.
> 
> I'm quite proud of my first lap, it might not be the best job, but looks good to me.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Before:
> 
> 
> After (Core only):
> 
> 
> After (whole block):


Congrats - updated









Quote:


> Originally Posted by *specopsFI*
> 
> I must leave you behind, for now at least. Sold my 290 for a nice price, but I think both parties got a good deal. Since I wasn't going to water cool it, I'm glad I found someone who would. It was a nice card for sure, unlocked and could do 1200/1400 on air.
> 
> See you elsewhere!


Thanks for the update - removed.









Quote:


> Originally Posted by *kcuestag*
> 
> Hey @Arizonian, can you update me from a single R9 290X to 2x Sapphire R9 290 (Watercooled)?
> 
> Thanks.


I sure can buddy - congrats - updated


----------



## the9quad

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - updated
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for the update - removed.
> 
> 
> 
> 
> 
> 
> 
> 
> I sure can buddy - congrats - updated


Update me to 3 btw, I've had 3 for a month now.

see pic in rig builder.

Thanks!


----------



## Gen290X

gen290x - I7 930 stock 2.8 Ghz - Sapphire R9 290X currently at stock 1000 Core / 1250 Memory - Stock Cooling - Catalyst 13.11 beta 9.5

http://www.techpowerup.com/gpuz/3wbzm/

First post on the forum from an old-school overclocker. Damn this PowerTune stuff!


----------



## Arizonian

Quote:


> Originally Posted by *the9quad*
> 
> Update me to 3 btw, I've had 3 for a month now.
> 
> see pic in rig builder.
> 
> Thanks!


Congrats - Updated









Quote:


> Originally Posted by *Gen290X*
> 
> gen290x - I7 930 stock 2.8 Ghz - Sapphire R9 290X currently at stock 1000 Core / 1250 Memory - Stock Cooling - Catalyst 13.11 beta 9.5
> 
> http://www.techpowerup.com/gpuz/3wbzm/
> 
> First post on forum for an older school overclocker. Damn this powertune stuff!


First welcome aboard OCN with your first post.









Very fitting OCN name with your new GPU - congrats - added


----------



## trihy

Quote:


> Originally Posted by *stickg1*
> 
> Check Superbiiz.com, they have it for about $55 shipped.


Nice price; sadly they don't ship outside the US for new accounts.


----------



## stickg1

Quote:


> Originally Posted by *trihy*
> 
> Nice price, sadly they dont ship outside US for new accounts


Oh, I didn't realize you needed international shipping.


----------



## trihy

Quote:


> Originally Posted by *stickg1*
> 
> Oh, I didn't realize you needed international shipping.


Yep, too bad.

Does anyone know if the 280X PCB layout is similar to the 290's? I was thinking of using a 280X Dual-X heatsink on the 290 and putting the reference 290 heatsink on the 280X.

I know the GPU won't be a problem, but the memory and VRM locations could be...


----------



## quakermaas

Finally done, will install them later or tomorrow.

Made a schoolboy error on the first block: after I had it fitted, I remembered I had forgotten to remove the top layer of plastic film on the heat pads.


----------



## Rokku

The non-reference Gigabyte cards are up for sale now on Newegg:
$579.99 for the 290
http://www.newegg.com/Product/Product.aspx?Item=N82E16814125500

$699 for the 290X
http://www.newegg.com/Product/Product.aspx?Item=N82E16814125499


----------



## HeliXpc

Here is the follow up video with the vrm heatsinks installed.


----------



## kcuestag

Quote:


> Originally Posted by *quakermaas*
> 
> Finally done, will install them later or tomorrow.
> 
> Made a schoolboy error on first block, after I had it fitted, I remembered I had forgot to remove the top layer of plastic film on the heat pads
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Nice! Now that I got two 290's instead of one 290X, I have to get myself a 2nd waterblock for the 2nd GPU.


----------



## Derpinheimer

I thought I saw that one of the non-reference 290s or 290Xs came with Elpida 1500MHz VRAM, but I can't find it anymore. Anyone know if I'm crazy?


----------



## Sazz

Quote:


> Originally Posted by *Derpinheimer*
> 
> I thought I saw on one of the non-reference 290s or 290x's that it came with Elpida 1500MHz VRAM but I cant find it anymore. Anyone know if Im crazy?


The 290X I sold to a friend can do 1500 at stock voltage, and it's Elpida memory. TBH I've only handled one Hynix card, but all the Elpida-memory 290X/290s I've handled clocked higher than the Hynix one; I guess the Hynix one I got is just a bad OC'er. I don't really think memory brand matters THAT much anymore.


----------



## chiknnwatrmln

That 290 on Newegg is such a ripoff, lol. Newegg's markups are a joke.

I mean, $580 for a 290? That's ridiculous. That card should be like $430, $450 tops.


----------



## Arizonian

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> That 290 on Newegg is such a ripoff, lol. Newegg's markups are a joke.
> 
> I mean, $580 for a 290? That's ridiculous. That card should be like $430, $450 tops.


It's not just Newegg - Amazon too, as well as many other places.

*Amazon Prices 290 & 290X*

Blame the Bitcoin & Litecoin miners for the supply/demand markup, if anyone.

On a side note: waiting to see what the MSI 290X Lightning and Mantle combo brings. If it surpasses my current GPU in performance (regardless of price) I might take the hook and bite.









----------



## VSG

In that list above, select Amazon as the seller and you will get much better prices. You also get this random listing: http://www.amazon.com/PowerColor-AXPowerColor-290X-4GBD5-MDH-OC/dp/B00G9JAQA0


----------



## BradleyW

Quote:


> Originally Posted by *Durquavian*
> 
> prerendering: occurs with dual GPU but not single unless selected. Prerendered frames cause lag.


Setting an fps limit equal to the refresh rate sort of fixes this. I say sort of because, randomly, you can feel the lag slowly coming back. And then it just vanishes again. This repeats. Any idea why?

Edit: I'm talking about input lag here.


----------



## Slomo4shO

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - updated
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for the update - removed.
> 
> 
> 
> 
> 
> 
> 
> 
> I sure can buddy - congrats - updated


Update me to 4x Sapphire 290, watercooled with Koolance blocks.











Spoiler: Warning: Spoiler!













Just got all my water cooling parts the other day, only to discover that I need 0.7mm pads for the GPU instead of the 0.5mm ones that I picked up. It's unfortunate that Koolance didn't post the pad dimensions on their site.


----------



## jerrolds

Anyone have any idea why AB17 just refuses to unlock voltages for me? I'm currently using the Asus BIOS and GPU Tweak, but since I'm not maxing out the voltage at 1412mV, I want to use one tool for both overclocking and monitoring.

Here are my settings (I've tried others, but this should be it, right?)


----------



## Durvelle27

Quote:


> Originally Posted by *Slomo4shO*
> 
> Update me to 4x Sapphire 290, watercooled with Koolance blocks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just got all my water cooling parts the other day, only to discover that I need 0.7mm pads for the GPU instead of the 0.5mm ones that I picked up. It's unfortunate that Koolance didn't post the pad dimensions on their site.


What fans are those ?


----------



## Slomo4shO

Quote:


> Originally Posted by *Durvelle27*
> 
> What fans are those ?


Noctua NF-F12


----------



## Forceman

Quote:


> Originally Posted by *jerrolds*
> 
> Anyone have any idea why AB17 just refuses to unlock voltages for me? I'm currently using the Asus BIOS and GPU Tweak, but since I'm not maxing out the voltage at 1412mV, I want to use one tool for both overclocking and monitoring.
> 
> Here are my settings (I've tried others, but this should be it, right?)


You can try doing this quick settings file edit:
Quote:


> Guys, I've just noticed that in beta 17 database we unlocked voltage control for reference VGA BIOS only. So those who flash third party BIOS are out of luck. We'll remove reference BIOS restriction shortly and upload updated new version.
> In meanwhile, while revised beta 17 is not uploaded experienced users may unlock voltage control in current beta 17 by adding the following lines to hardware profile file (.\Profiles\VEN_1002.......cfg):
> 
> [Settings]
> VDDC_IR3567B_Detection = 30h
> VDDC_IR3567B_Output = 0
> VDDCI_IR3567B_Detection = 30h
> VDDCI_IR3567B_Output = 1


http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/5460_30#post_21188008

It worked for me after a restart when I was having trouble. The version that's posted now shouldn't have that problem though - did you download it recently?
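In case it helps anyone else, this is roughly how the quoted addition sits once appended to the profile file - a sketch only, since the actual .cfg filename under .\Profiles\ is generated from your card's device IDs (the original post leaves it as VEN_1002.......cfg):

```ini
; MSI Afterburner hardware profile (.\Profiles\VEN_1002.......cfg)
; Appending this [Settings] section is what the quoted post says
; re-enables IR3567B voltage control on non-reference BIOSes in beta 17.
[Settings]
VDDC_IR3567B_Detection = 30h
VDDC_IR3567B_Output = 0
VDDCI_IR3567B_Detection = 30h
VDDCI_IR3567B_Output = 1
```

If the file already has a [Settings] section, the lines go inside it rather than in a second section.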


----------



## Durvelle27

Quote:


> Originally Posted by *Slomo4shO*
> 
> Noctua NF-F12


Amazing fans. Looking to replace my crap ones soon.


----------



## jerrolds

Quote:


> Originally Posted by *Forceman*
> 
> You can try doing this quick settings file edit:
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/5460_30#post_21188008
> 
> It worked for me after a restart when I was having trouble. The version that's posted now shouldn't have that problem though - did you download it recently?


No I haven't - I didn't realize they revised Beta 17. I will give it a shot, thanks!

EDIT: OK, that worked - thanks, +REP.

There's an "Aux Voltage" under Memory Voltage - will that help with overclocking the memory? My Elpida doesn't play nice over 1500MHz.


----------



## Derpinheimer

Quote:


> Originally Posted by *Sazz*
> 
> The 290X I sold to a friend can do 1500 at stock voltage, and it's Elpida memory. TBH I've only handled one Hynix card, but all the Elpida-memory 290X/290s I've handled clocked higher than the Hynix one. I guess the one Hynix I got is just a bad OC'er, but I don't really think memory brand matters THAT much anymore.


Ah, what I mean is that one of them was actually a 1500MHz-rated chip, downclocked to [?]... while every current card ships with 1250MHz-rated Hynix or Elpida.

Hynix was a heck of a lot better back in the 7900 series for overclocking. Maybe Elpida have improved their binning process.


----------



## ImJJames

Quote:


> Originally Posted by *HeliXpc*
> 
> Here is the follow up video with the vrm heatsinks installed.


This is awesome, can't wait to get my G10! I wonder if I can push my overclocks even further with these low temps?


----------



## jerrolds

Quote:


> Originally Posted by *HeliXpc*
> 
> Here is the follow up video with the vrm heatsinks installed.


Nice, what sinks did you use for the VRMs?

EDIT: NM, it's in the video - thanks. Geforce VR005 mem kit, and you used thermal glue... so that kinda sucks. Hmm, I wonder if I should just chance not being able to RMA this card. I can hit 1220MHz on the core, but VRM1 skyrockets to over 100C before I get corruption.


----------



## pounced

Wow, I'm glad I bought my stock Sapphire 290X on day 1 for like $580, and it came with Battlefield 4. It's insane how the prices are jumping up. I might sell this stock one and grab a 3rd-party-cooler version later on.


----------



## stickg1

Yeah, I bought my 290 from an overclocker for $375. He had a couple of them and this one didn't unlock. This was before Litecoin made a huge surge in people buying up all the 7900/280/290 cards they could find, thus inflating the prices.

This is a great card for $375; for $500+ I'd rather have an Nvidia card.


----------



## Slomo4shO

Quote:


> Originally Posted by *stickg1*
> 
> Yeah, I bought my 290 from an overclocker for $375. He had a couple of them and this one didn't unlock. This was before Litecoin made a huge surge in people buying up all the 7900/280/290 cards they could find, thus inflating the prices.
> 
> This is a great card for $375; for $500+ I'd rather have an Nvidia card.


Litecoin difficulty has skyrocketed and the price of the coins has also come down, so prices should settle back to normal soon enough.

For example, Amazon has the PowerColor BF4 Edition R9 290X available for backorder for $580 and the MSI R9 290X for $570 with limited inventory.


----------



## jamaican voodoo

I'm loving this 290!! Watercooled, temps don't go past 45C... currently running @ 1175 core and 1300 mem... 50+


----------



## RAFFY

@Arizonian Please update me to 5x ASUS 290X, 1x Gigabyte 290X, and 2x HIS 290.


----------



## VSG

lol so how much have you paid for those cards so far?


----------



## Arizonian

Quote:


> Originally Posted by *RAFFY*
> 
> @Arizonian Please update me to 5x ASUS 290X, 1x Gigabyte 290X, and 2x HIS 290.


Holy cow - congrats - updated


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Slomo4shO*
> 
> Update me to 4x Sapphire 290, watercooled with Koolance blocks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just got all my water cooling parts the other day, only to discover that I need 0.7mm pads for the GPU instead of the 0.5mm ones that I picked up. It's unfortunate that Koolance didn't post the pad dimensions on their site.


Are you powering all those on that one 1200w psu?


----------



## RAFFY

Quote:


> Originally Posted by *Arizonian*
> 
> Holy cow - congrats - updated


Quote:


> Originally Posted by *geggeg*
> 
> lol so how much have you paid for those cards so far?


2x ASUS 290X $589.99 (launch/BF4), $569.99 each after one key sold
2x ASUS 290X $599.99
1x ASUS 290X $549.99
1x GIGABYTE 290X $599.99
2x HIS 290 $459.99 (BF4)

Total: $4409.92

But these GPUs will yield about 8MH/s if not MOAR!!!!
Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Are you powering all those on that one 1200w psu?


If he is, it won't be for too much longer lol.
Quote:


> Originally Posted by *Arizonian*
> 
> Holy cow - congrats - updated


Thanks, I've become addicted to mining and making money lol.


----------



## broken pixel

Quote:


> Originally Posted by *maynard14*
> 
> http://www.3dmark.com/3dm11/7689670
> 
> huhuhu my card is very weak at overclocking
> 
> ab settings are 100 mv voltages
> 
> and core clock is only 1140!
> 
> and memory stock at 1250
> 
> r9 290 xfx
> 
> very sad indeed


Try 1100/1500


----------



## Asrock Extreme7

How much do you make with one 290X? I haven't tried mining.


----------



## kizwan

I'm just learning about scaling. GPU usage at 200% is much "cleaner" than at 150%.

BF4 1080p Scaling 150%


BF4 1080p Scaling 200%


----------



## broken pixel

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> How much do you make with one 290X? I haven't tried mining.


It all depends on the coin you are mining, the difficulty, and the price of BTC. Mining Bitcoin with GPUs is not profitable unless you have a farm, and even then those farms would be mining alt currency and trading it for BTC.

An ASIC is about the only way to mine BTC. My 7.5GH/s ASIC only mines about $2-7 USD a day.

My scrypt GPU miner makes me anywhere from $25-100+ a day. I made $150 USD in BTC with about 100k Dogecoins that I traded when Dogecoin went on Cryptsy. I still have 90k left.

I'm also sitting on the 25,000 Digitalcoins I mined.


----------



## Slomo4shO

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Are you powering all those on that one 1200w psu?


No, I picked up an ENERMAX MaxRevo 1500W. I just updated the build; will update the pictures eventually...


----------



## VSG

You might still have an issue if you decide to overvolt the cards, if not then you should be golden.


----------



## Slomo4shO

Quote:


> Originally Posted by *geggeg*
> 
> You might still have an issue if you decide to overvolt the cards, if not then you should be golden.


I don't plan on OCing higher than what I can max at stock voltages for everyday use. I have a spare Rosewill Lightning 1300W for benchmarking. I'm just going to have to run an extension cord from an outlet on a different circuit to make it work without overloading the 15A circuit.


----------



## Asrock Extreme7

MSI Afterburner only shows 3GB of VRAM used, but I put 200% scaling on BF4 and ran out of VRAM???


----------



## twirelez

Hi all,

I have a Sapphire R9 290 and attempted to install a Gelid Icy Vision on it, but the VRM1 temps are way too high under load. I started seeing 110C before I was able to shut it down in time.

I read somewhere that this thread contains info on how to re-use the stock reference sink together with the Gelid/Accelero, but after browsing through 500 pages, I'm tired of clicking next.

1. Can someone please point me to the right page number for the mod?

I would really appreciate it!

2. Another question: if I wanted to start playing with watercooling, just for the VGA to begin with but also with room for expansion, which watercooling system would you recommend? If it's not too much trouble, please post/PM me with what kit I should get.

Thanks ALL!


----------



## RAFFY

Quote:


> Originally Posted by *broken pixel*
> 
> It all depends on the coin you are mining, the difficulty, and the price of BTC. Mining Bitcoin with GPUs is not profitable unless you have a farm, and even then those farms would be mining alt currency and trading it for BTC.
> 
> An ASIC is about the only way to mine BTC. My 7.5GH/s ASIC only mines about $2-7 USD a day.
> 
> My scrypt GPU miner makes me anywhere from $25-100+ a day. I made $150 USD in BTC with about 100k Dogecoins that I traded when Dogecoin went on Cryptsy. I still have 90k left.
> 
> I'm also sitting on the 25,000 Digitalcoins I mined.


+1. I'm sitting on 500k Doge and going to dump about 400k of it the next time the price shoots up, for a sweet few hundred bucks.


----------



## Forceman

Quote:


> Originally Posted by *twirelez*
> 
> Hi all,
> 
> I have a Sapphire R9 290 and attempted to install a Gelid Icy Vision on it, but the VRM1 temps are way too high under load. I started seeing 110C before I was able to shut it down in time.
> 
> I read somewhere that this thread contains info on how to re-use the stock reference sink together with the Gelid/Accelero, but after browsing through 500 pages, I'm tired of clicking next.
> 
> 1. Can someone please point me to the right page number for the mod?
> 
> I would really appreciate it!
> 
> 2. Another question: if I wanted to start playing with watercooling, just for the VGA to begin with but also with room for expansion, which watercooling system would you recommend? If it's not too much trouble, please post/PM me with what kit I should get.
> 
> Thanks ALL!


The Gelid should keep the VRMs cooler than that - did you put heatsinks on them?

Most of the kits, like the XSPC Raystorm ones, are set up for CPU cooling, so they come with a CPU block and you'd have to buy a GPU block to go with it. If you just want to start with the GPU, you'd probably be best piecing it together yourself.


----------



## twirelez

I installed the long VRM heatsink that came with the kit (I had to extend the holes to get the screws in), but I don't think it's enough to keep the VRMs cool while mining scrypt.

Also, I noticed that my GPU still gets really hot, upwards of 85C, which should not be the case with this heatsink. I tried using Arctic Silver thermal paste as well, since I thought the paste that came with the kit wasn't good, but it still runs hot. Maybe I got a faulty Gelid Icy Vision...

$60 down the drain, I guess...
I'm willing to invest in watercooling, but I need some guidance. What capacity XSPC Raystorm do I need to buy to be able to handle a CPU + R9 290 + 6870?

Thanks!


----------



## Jack Mac

The Prolimatech MK-26 is $68 on Newegg if anyone wants to try it.


----------



## Roy360

I don't get it. Since upgrading from my 750W Ultra PSU to my 500W Corsair PSU, I was able to regain my 1000/1500 overclock. (I could possibly go higher but never bothered to.) I get no artifacts in Unigine Heaven or in FurMark (overclocked and at stock), yet for some reason my computer always freezes whenever I minimize all my applications.

i.e., I just finished testing overclocked and stock settings, went to enable Eyefinity by opening AMD Catalyst, and it froze. Black screen; I needed to restart the computer to get back in.

Hash rate at 947/1250: 800KH/s
Hash rate at 1000/1500/+50%: 867KH/s

The card doesn't seem to have problems running at full speed, so maybe there is something wrong with its power management? (What does Power Limit do in MSI Afterburner? Maybe I'll keep that at +50% all the time.)


----------



## Derpinheimer

Quote:


> Originally Posted by *twirelez*
> 
> I installed the long VRM heatsink that came with the kit (I had to extend the holes to get the screws in), but I don't think it's enough to keep the VRMs cool while mining scrypt.
> 
> Also, I noticed that my GPU still gets really hot, upwards of 85C, which should not be the case with this heatsink. I tried using Arctic Silver thermal paste as well, since I thought the paste that came with the kit wasn't good, but it still runs hot. Maybe I got a faulty Gelid Icy Vision...
> 
> $60 down the drain, I guess...
> I'm willing to invest in watercooling, but I need some guidance. What capacity XSPC Raystorm do I need to buy to be able to handle a CPU + R9 290 + 6870?
> 
> Thanks!


Feel the heatsink. If it's not burning hot when the GPU is at 85C, it's probably a contact issue.


----------



## twirelez

It's hot.
Is it possible that the GPU temperature is affected by the high VRM1 temp? VRM1 was at 110C before I turned off mining, and it was still rising.


----------



## Jaju123

Guys, I was playing BF4 with 2x R9 290s in Crossfire on 64-player Conquest Large, and it was stuttering a little, so I checked my VRAM usage and this is what it said:



How is it possible that I can use more than what my cards even provide?

Thanks!


----------



## Maracus

Quote:


> Originally Posted by *Jaju123*
> 
> Guys, I was playing BF4 with 2x R9 290s in Crossfire on 64-player Conquest Large, and it was stuttering a little, so I checked my VRAM usage and this is what it said:
> 
> 
> 
> How is it possible that I can use more than what my cards even provide?
> 
> Thanks!


Using "Resolution Scaling"? I'm only using one 290X. I decided to test out the feature at 200% and damn, I went from 50fps or so down to below 10fps lol
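The fps collapse at 200% makes sense once you count pixels: resolution scale multiplies each axis, so 200% renders four times the pixels of native. A quick back-of-envelope sketch (plain arithmetic, not tied to any game API):

```python
# Resolution scale multiplies width and height, so the pixel count
# (and roughly the shading cost) grows with the square of the scale.
def scaled_resolution(width, height, scale_percent):
    s = scale_percent / 100
    return int(width * s), int(height * s)

w, h = scaled_resolution(1920, 1080, 200)   # 200% scale at 1080p
print(w, h)                                  # 3840 2160 - i.e. 4K
print((w * h) / (1920 * 1080))               # 4.0 - four times the pixels
```

That squaring is also why 200% at 1080p chews through VRAM and fps like running native 4K would.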


----------



## Jaju123

Quote:


> Originally Posted by *Maracus*
> 
> Using "Resolution Scaling"? I'm only using one 290X. I decided to test out the feature at 200% and damn, I went from 50fps or so down to below 10fps lol


No haha, just 1440p with 4xmsaa.


----------



## stickg1

Quote:


> Originally Posted by *Jaju123*
> 
> Guys, I was playing BF4 with 2x R9 290s in Crossfire on 64-player Conquest Large, and it was stuttering a little, so I checked my VRAM usage and this is what it said:
> 
> 
> 
> How is it possible that I can use more than what my cards even provide?
> 
> Thanks!


It starts using system memory, which is DDR3 - not nearly as fast as the GDDR5 on your GPU - and that causes the lag. But that shouldn't show up in GPU-Z. Your card is really hot; what's the airflow like in your case?

On second look, your card is at 93C with 43% fan speed, so it's probably just throttling to keep itself from overheating. Either crank up the fan or get a better cooling solution.
Quote:


> Originally Posted by *Roy360*
> 
> I don't get it. Since upgrading from my 750W Ultra PSU to my 500W Corsair PSU. I was able to regain my 1000/1500 overclock. (I could possibly go higher but never bothered too.) I get no artifacts in Unigine Heaven's Benchmark or in Furmark. (Overclocked and on stock) Yet for some reason my computer always freezes whenever I minimize all my applications.
> 
> ie. I just finished testing overclock and stock settings, went to to enable Eyfinity by opening the AMD catalyst, and it freezes. Black Screen, need to restart the computer to get back in.
> 
> HashRate on 947/1250 - 800Kh/s
> HashRate on 1000/1500/50% - 867Kh/s
> 
> The card doesn't seem to have problems running at full speed. So maybe there is something wrong with it's power management? (What does Power limit do in MSI afterburner? maybe I'll keep that on 50% all the time)


I'm pretty sure memory clocks don't do a whole lot for mining. You need to get more out of the core clock to increase your hash rate.


----------



## Derpinheimer

Quote:


> Originally Posted by *stickg1*
> 
> It starts using system memory, which is DDR3 - not nearly as fast as the GDDR5 on your GPU - and that causes the lag. But that shouldn't show up in GPU-Z. Your card is really hot; what's the airflow like in your case?
> I'm pretty sure memory clocks don't do a whole lot for mining. You need to get more out of the core clock to increase your hash rate.


For the 290 and 290X this is actually completely wrong - the opposite is true. More often than not, an underclock to the mid-900s helps hash rate, accompanied by 1500MHz VRAM - at least at intensity 20. At lower intensities, core clock increases help.
Quote:


> Originally Posted by *twirelez*
> 
> It's hot.
> Is it possible that the GPU temperature is affected by the high VRM1 temp? VRM1 was at 110C before I turned off mining, and it was still rising.


Jeez! I hate to ask, but are you sure the fans are working properly?


----------



## twirelez

Quote:


> Originally Posted by *Derpinheimer*
> 
> For the 290 and 290X this is actually completely wrong - the opposite is true. More often than not, an underclock to the mid-900s helps hash rate, accompanied by 1500MHz VRAM - at least at intensity 20. At lower intensities, core clock increases help.
> Jeez! I hate to ask, but are you sure the fans are working properly?


Yes, the fans are both spinning at 2000 RPM. I connected them to the GPU so they're on all the time at a constant speed...


----------



## Gir

http://farm4.staticflickr.com/3833/11488398293_42b585fffa_h.jpg

The chips circled in blue are all of the VRMs, right? 15 in total. Did I miss any?

I picked up a pack of these tiny little heatsinks to go on the chips when I get my NZXT G10 on Monday. They come with thermal tape on them, so I don't have to mess with permanently attaching them. I already spoke to XFX, and they say they will honor the warranty as long as the card can be returned to stock condition and you don't cause physical damage in the process.

I know these things are tiny, but there was a YouTube video of a guy testing his 290X with a G10, with and without VRM heatsinks. He used some big ones and the temp difference was 30-50C at times, so I'm hoping for around a 10-20C difference with these little things, which would keep them out of the 90C range.


----------



## stickg1

Quote:


> Originally Posted by *Derpinheimer*
> 
> For the 290 and 290x this is actually completely wrong. The opposite is actually true. More often than not, an underclock to the mid 900s helps hashrate, accompanied by 1500MHz VRAM. At least, at 20 intensity. At lower intensities core clock increases help.
> Jeez! Hate to ask, but are you sure the fans are working properly?


Interesting. I haven't mined on a 290; I used to mine with my 7950 and 7970, and I would actually underclock the memory to cut down on power, and it had no effect on the hash rate. Core clock was the only thing that mattered. But you learn something new every day!


----------



## VSG

Quote:


> Originally Posted by *Gir*
> 
> http://farm4.staticflickr.com/3833/11488398293_42b585fffa_h.jpg
> 
> The chips circled in blue are all of the VRMs, right? 15 in total. Did I miss any?


Yeah, those are the VRMs.


----------



## Sgt Bilko

3DMark has dropped to $2.50 on Steam atm for those looking to pick it up cheap


----------



## Derpinheimer

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 3DMark has dropped to $2.50 on Steam atm for those looking to pick it up cheap


Quote:


> Originally Posted by *stickg1*
> 
> Interesting. I haven't mined on a 290; I used to mine with my 7950 and 7970, and I would actually underclock the memory to cut down on power, and it had no effect on the hash rate. Core clock was the only thing that mattered. But you learn something new every day!


Same here - it's very odd. When I first tried mining on my unlocked 290, I was super disappointed. At some point I found that lowering the power limit increased the hash rate... Lo and behold, I later found it was from the lower core clock. 938MHz core and 1500MHz memory yield the best results [~890KH/s]; any small deviation reduces it significantly - like 1499MHz RAM drops to the low 700s.
Quote:


> Originally Posted by *twirelez*
> 
> Yes, the fans are both spinning at 2000 RPM. I connected them to the GPU so they're on all the time at a constant speed...


Hm, I don't know what to say. Can you download HWiNFO and check VRM1 Power Out?
Quote:


> Originally Posted by *Sgt Bilko*
> 
> 3DMark has dropped to $2.50 on Steam atm for those looking to pick it up cheap


Great, thanks!







I wouldn't have paid a penny more, but it should be nice for testing overclocks...


----------



## twirelez

Quote:


> Originally Posted by *Derpinheimer*
> 
> Same here - it's very odd. When I first tried mining on my unlocked 290, I was super disappointed. At some point I found that lowering the power limit increased the hash rate... Lo and behold, I later found it was from the lower core clock. 938MHz core and 1500MHz memory yield the best results [~890KH/s]; any small deviation reduces it significantly - like 1499MHz RAM drops to the low 700s.
> Hm, I don't know what to say. Can you download HWiNFO and check VRM1 Power Out?
> Great, thanks!
> 
> 
> 
> 
> 
> 
> 
> Wouldnt have paid a penny more, but it should be nice for testing overclocks..


Going back to my original question: can one of you please point me to the page number in this thread that talks about re-using the stock heatsink for VRM cooling in combination with the Gelid fan?


----------



## TheRoot

Quote:


> Originally Posted by *Arizonian*
> 
> It's not just Newegg - Amazon too, as well as many other places.
> 
> *Amazon Prices 290 & 290X*
> 
> Blame the Bitcoin & Litecoin miners for the supply/demand markup, if anyone.


Why blame the miners? They're just doing their work.


----------



## stickg1

Quote:


> Originally Posted by *twirelez*
> 
> Going back to my original question: can one of you please point me to the page number in this thread that talks about re-using the stock heatsink for VRM cooling in combination with the Gelid fan?




You're going to destroy your stock cooler, which you might need if you have to send the card back for warranty work in the future.

I would just get two packs of these:

http://www.newegg.com/Product/Product.aspx?Item=N82E16835708011

They can probably be found cheaper; Newegg is just hella easy to navigate, lol.


----------



## twirelez

Quote:


> Originally Posted by *stickg1*
> 
> You're going to destroy your stock cooler, which you might need if you have to send the card back for warranty work in the future.
> 
> I would just get two packs of these:
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16835708011
> 
> They can probably be found cheaper; Newegg is just hella easy to navigate, lol.


Thanks stickg1. Has anyone here tried these? Also, do I need to get the sticky tape as well, or do they come with it?


----------



## stickg1

Quote:


> Originally Posted by *twirelez*
> 
> Thanks stickg1. Has anyone here tried these? Also, do I need to get the sticky tape as well, or do they come with it?


I use them for the H55 CLC I zip-tied to the core, and I've used them a bunch on cards that I run a universal waterblock on. They come with thermal adhesive tape, but for it to stick the VRMs have to be really clean. Dip a Q-tip in isopropyl alcohol, clean them off a few times, and you should be good to go.


----------



## twirelez

Quote:


> Originally Posted by *stickg1*
> 
> I use them for the H55 CLC I zip-tied to the core, and I've used them a bunch on cards that I run a universal waterblock on. They come with thermal adhesive tape, but for it to stick the VRMs have to be really clean. Dip a Q-tip in isopropyl alcohol, clean them off a few times, and you should be good to go.


Thank you!!! I will order these and try them.

In case I need to RMA the card, which thermal tape (what thickness) should I get to replace the pads that were on the stock heatsink? (Some pieces got broken.)


----------



## stickg1

Quote:


> Originally Posted by *twirelez*
> 
> Thank you!!! I will order these and try them.
> 
> In case I need to RMA the card, which thermal tape (what thickness) should I get to replace the pads that were on the stock heatsink? (Some pieces got broken.)


Well, technically the card would be broken, and if they did for some reason remove the cooler to inspect it, they could be the ones who ripped the stock adhesive. But for peace of mind, it looks like a 7mm white thermal tape.


----------



## Derpinheimer

Just save the old stuff. If it's being RMA'd, why use anything new? It's only going to get replaced again.


----------



## twirelez

I'd still like to see the post on how to mod the stock heatsink though. Any idea what page it's on?


----------



## stickg1

It was a cunning maneuver, what that user did to reuse the stock heatsink. But the first time I opened up my 290, I realized how inadequately cooled the VRMs are on this card. You're better off using copper heatsinks and direct airflow.


----------



## twirelez

Quote:


> Originally Posted by *stickg1*
> 
> It was a cunning maneuver, what that user did to reuse the stock heatsink. But the first time I opened up my 290, I realized how inadequately cooled the VRMs are on this card. You're better off using copper heatsinks and direct airflow.


I'm planning to use the copper heatsinks that you suggested under the Gelid Icy Vision Rev 2 - do you think that should work fine?


----------



## stickg1

Quote:


> Originally Posted by *twirelez*
> 
> I'm planning to use the copper heatsinks that you suggested under the Gelid Icy Vision Rev 2 - do you think that should work fine?


It should work really well - most likely better than chopping up the stock heatsink. Plus you get to save your stock heatsink in case you need it for warranty, or end up selling the card and reusing your Gelid for a future GPU upgrade.

Hell, I don't even use heatsinks on VRM1; I just use direct airflow and it cools 25C better than the stock heatsink. Those things are suffocated by the stock cooler.


----------



## Derpinheimer

Quote:


> Originally Posted by *stickg1*
> 
> It should work really well. Most likely better than chopping the stock heatsink. Plus you get to save your stock heatsink in case you need it for warranty or end up selling the card and reusing your Gelid for a future GPU upgrade.
> 
> Hell I don't even use heatsinks on VRM1, I just use direct airflow and it cools 25C better than the stock heatsink. Those things are suffocated by the stock cooling.


The stock cooling keeps VRM1 in the low 60s for me ??? Even when core is >80


----------



## twirelez

Quote:


> Originally Posted by *stickg1*
> 
> It should work really well. Most likely better than chopping the stock heatsink. Plus you get to save your stock heatsink in case you need it for warranty or end up selling the card and reusing your Gelid for a future GPU upgrade.
> 
> Hell I don't even use heatsinks on VRM1, I just use direct airflow and it cools 25C better than the stock heatsink. Those things are suffocated by the stock cooling.


Do you have a pic of how you are blowing air on them?


----------



## stickg1

Quote:


> Originally Posted by *Derpinheimer*
> 
> > Originally Posted by *stickg1*
> > 
> > It should work really well. Most likely better than chopping the stock heatsink. Plus you get to save your stock heatsink in case you need it for warranty or end up selling the card and reusing your Gelid for a future GPU upgrade.
> > 
> > Hell I don't even use heatsinks on VRM1, I just use direct airflow and it cools 25C better than the stock heatsink. Those things are suffocated by the stock cooling.
> 
> The stock cooling keeps VRM1 in the low 60s for me ??? Even when core is >80


**Fixed that for ya.

My stock cooler ran my VRM1 in the low 90s while gaming, low 80s while folding.




Quote:


> Originally Posted by *twirelez*
> 
> Do you have a pic of how you are blowing air on them?


----------



## Forceman

Quote:


> Originally Posted by *twirelez*
> 
> I'm willing to invest in watercooling, but I need some guidance. What capacity of XSPC Raystorm do i need to buy to be able to handle cpu + r9 290 + 6870 ?


For all three I'd get a 360 (or 360 total, could be a 240 and a separate 120 if you bought them individually).


----------



## tsm106

Quote:


> Originally Posted by *Jaju123*
> 
> Guys I was playing BF4 with *2x R9 290s* in crossfire on 64 player conquest large, and it was stuttering a little so I checked my VRAM usage and this is what is said:
> 
> 
> 
> *How is it possible that I can use more than what my cards even provide?*
> 
> Thanks!


Memory monitoring is unified, i.e. it's the combined total across all cards.
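That explains the screenshot: if the monitoring tool reports the sum across cards, a CrossFire pair can appear to "use" more VRAM than a single card even has. A trivial sketch — the per-card figures below are made up for illustration, not taken from the post:

```python
# Unified VRAM reporting, sketched: the tool sums usage across all GPUs.
# The 3600 MB per-card figures are assumed for illustration only; in
# CrossFire the same data is mirrored on each card.
per_card_usage_mb = [3600, 3600]  # two R9 290s, mirrored allocations

reported_mb = sum(per_card_usage_mb)
print(reported_mb)  # 7200 - more than the 4096 MB a single card holds
```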


----------



## broken pixel

Quote:


> Originally Posted by *pounced*
> 
> Wow I'm glad I bought my stock Sapphire 290x on day 1 for like $580 that came with battlefield 4. It's insane how the prices are jumping up. I might sell this stock one and grab a 3rd party cooler version later on.


2x 290x Lightnings for 2x XFX 290x. ?


----------



## unlost

Times 2


----------



## maynard14

Quote:


> Originally Posted by *broken pixel*
> 
> Try 1100/1500


hi bro, I flashed my card to the 290X XFX BIOS

overclock with after burner

core clock 1135

memory clock 1400

voltage 100 mv

fan 75 percent

and here is my result

http://www.3dmark.com/3dm11/7699477

i think it's not that bad.. it is stable now









haven't tried 1500 on memory though..


----------



## Scotty99

Newegg has gigabyte 290 for 579.00:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814125500

And 290x for 699.00:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814125499

lol...

200 dollar markups, thanks miners!


----------



## Sazz

Quote:


> Originally Posted by *Scotty99*
> 
> Newegg has gigabyte 290 for 579.00:
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814125500
> 
> And 290x for 699.00:
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814125499
> 
> lol...
> 
> 200 dollar markups, thanks miners!


Those aren't the reference cards tho. BUT! The reference cards are still 130 bucks more than the launch price.


----------



## Scotty99

How often do you seriously see aftermarket coolers go for more than reference tho? Just because AMD messed this launch up, we should expect the aftermarket-cooled cards to go for more? How does that work?


----------



## Derpinheimer

Well, usually they are a bit more. R&D, higher quality PCB components (potentially), and a slightly more expensive HSF.. a $20-$40 markup is totally fair for non-reference. Unless it's an XFX Double Dissipation; those should be like half off, they're so bad - at least with the 6XXX and 7XXX series.


----------



## Scotty99

Quote:


> Originally Posted by *Derpinheimer*
> 
> Well, usually they are a bit more. R&D, higher quality PCB components (potentially), and a slightly more expensive HSF.. a $20-$40 markup is totally fair for non-reference. Unless it's an XFX Double Dissipation; those should be like half off, they're so bad - at least with the 6XXX and 7XXX series.


No, not really. Go on Newegg right now and I bet you irl money you can find aftermarket cards going for the exact same price in every series of GPUs besides the 290s.

Sure, I understand a 10 dollar markup and would gladly pay that, but history has shown you can almost always find aftermarket coolers for the same price as reference cards.


----------



## muhd86

Quote:


> Originally Posted by *ImJJames*
> 
> Holy moly lol, you planning to run a Nuclear reactor with that?


well no ---- but i've always been a quad fan of both nvidia and amd ...


----------



## AddictedGamer93

My overclock isn't doing anything, it's strange. The new clocks show in GPU-Z, but when I run Fire Strike there is zero difference between stock and overclocked results, and the clock speeds are reported as 1000/1250.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *AddictedGamer93*
> 
> My overclock isn't doing anything, it's strange. The new clocks show in GPU-Z but when I run FireStrike there is zero difference between stock and overclocked results and the clocks speeds are reported as 1000/1250.


What are you using to overclock?


----------



## AddictedGamer93

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> What are you using to over clock


Overdrive. I am downloading Afterburner right now, I haven't been particularly fond of it in the past but it seems to be the best option with these cards.


----------



## Forceman

Quote:


> Originally Posted by *AddictedGamer93*
> 
> Overdrive. I am downloading Afterburner right now, I haven't been particularly fond of it in the past but it seems to be the best option with these cards.


I had to enable Overdrive and set an overclock there first; after that, Afterburner could change it and work normally. Until I set it in CCC once, it didn't save the changes.


----------



## PCBung

I've tried the command: MSIAfterburner.exe /sg0 /wi4,30,8d,18 /sg1 /wi4,30,8d,18 but my voltage isn't going above +100mV. Any ideas?


----------



## Forceman

Quote:


> Originally Posted by *PCBung*
> 
> I've tried the command: MSIAfterburner.exe /sg0 /wi4,30,8d,18 /sg1 /wi4,30,8d,18 but my voltage isn't going above the 100mv? Any ideas?


You checking the offset or the actual voltage? I think I heard that it doesn't change the offset numbers, it just adds 100mV to whatever is set there. So +0 is actually +100 and +100 would be +200.


----------



## the9quad

Quote:


> Originally Posted by *AddictedGamer93*
> 
> Overdrive. I am downloading Afterburner right now, I haven't been particularly fond of it in the past but it seems to be the best option with these cards.


There is a new version of trixx in one of these threads, and it does +200 mv.

edit - here it is

http://www.overclock.net/t/1444892/sapphire-290-290x-trixx-voltage-control-missing/50


----------



## PCBung

I just downloaded that software and I'm now able to set higher voltage... Just benching at +160mV @ 1200 core.


----------



## AddictedGamer93

Got it! Though it is a wimpy overclock, it is recognized in 3DMark 13 and there is a performance gain over stock, although small. I'm not going to push this card until I get my water block, then the games begin


----------



## PCBung

+160 wasn't stable, and at +200 I started having issues (screen went fuzzy). +150 gave me 600 points less...

Guess my cards are at their peak @

1165/1375 +100

Shame









I noticed Trixx doesn't have a power limit slider?


----------



## Radmanhs

ok, i am going to use a kraken g10 on my 290, but i heard the vrm's still get very hot

can i get a box of these and use them on the vrm's and ram, or will they only work on the ram?

http://www.amazon.com/Cosmos-Copper-Cooling-Heatsinks-cooler/dp/B00637X42A/ref=cm_cr_pr_product_top

along with this stuff to glue them on

http://www.amazon.com/Arctic-Alumina-Ceramic-Thermal-Adhesive/dp/B001O7ARUQ/ref=pd_sim_sbs_pc_15


----------



## Spartacvs

hi guys, i new in this forum and i have a problem.... i have two 290 sapphire, my problem is this:
my configuration is as follows intel i7 3930K 4.8 ghz, 16 gb ram kingston hyper x 2400MHz, Asus Rampage 4 Extreme, Corsair 1200w Gold Series, I'm trying the video cards in single TRIXX with the program, setting the following parameters gpu 1250 mhz 15000 mhz ram and the test is performed on the 3d mark 11 with a score of 16400, but once finished the test the video card in the graph GPUZ remains at 100 percent .. I do not understand why ... and are forced to restart, this happens to me with both. i hope for your help, thx.
The vga are liquid with the waterblock ek, sorry for my english.


----------



## smoke2

I'm thinking of buying a new R9 290, but one thing surprised me.
A 750W power supply is required.
http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&pid=2044&psn=&lid=1&leg=0#

I currently own a Seasonic X-650, which is 650W.
My rig is an i5-4670K OCed, 2x4GB RAM, 2x Caviar Black 2TB.

I have a chance to sell this PSU and buy a more powerful one, probably the X-750.

But I have a question.
Do you think my current PSU can handle this card without any problems for the whole life of the card?
Wouldn't want to risk anything...

Thanks.


----------



## Forceman

Yes, a decent 650W (and yours is much better than decent) is more than enough. They overrate the PSU requirements because many people have crappy units that can't deliver anything like their rated power. My system draws just under 400W at the wall running BF4, for example.
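The headroom argument can be sanity-checked with rough numbers. The component draws below are assumed ballpark figures (not measurements from this thread), just to show why a quality 650W unit has margin for a single 290:

```python
# Rough PSU headroom check with assumed ballpark draws (watts).
# All figures are illustrative, not measured values.
draws_w = {
    "R9 290 (gaming load)": 290,
    "i5-4670K OC": 120,
    "board/RAM/drives/fans": 60,
}

total_w = sum(draws_w.values())
psu_rating_w = 650
print(total_w, psu_rating_w - total_w)  # 470 W load, 180 W of headroom
```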


----------



## Falkentyne

Quote:


> Originally Posted by *Forceman*
> 
> You checking the offset or the actual voltage? I think I heard that it doesn't change the offset numbers, it just adds 100mV to whatever is set there. So +0 is actually +100 and +100 would be +200.


No, actually you're wrong.
The offset is in hexadecimal, and it adds the direct offset to the default core voltage, not the CURRENT set voltage.
So "10" is 16 in decimal.
Voltage is in 6.25mV steps, so 16 x 6.25 = 100 (100mV).
So you use 10 to set 100mV.

11 would be 106.25mV (17 decimal x 6.25 = 106.25).
8 would be 50mV (8 x 6.25 = 50).

Setting "0" returns it to the default VID with no offset.
And I just checked quickly in GPU-Z: values over "10" (100mV) in the command line DO work and apply. You can check because the idle voltage goes up.
I DO NOT think Afterburner's GUI will show more than 100mV, as the sliders are locked to 100mV max.
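The arithmetic above is easy to tabulate. A small sketch, assuming (as described) that the command-line offset is a hex code in 6.25 mV steps applied on top of the default VID — the helper function name is mine, not part of Afterburner:

```python
# Convert an Afterburner-style hex offset code to a millivolt offset,
# per the 6.25 mV-per-step scheme described above (illustrative helper).
STEP_MV = 6.25

def offset_mv(hex_code: str) -> float:
    return int(hex_code, 16) * STEP_MV

for code in ("0", "8", "10", "11", "20"):
    print(code, offset_mv(code))
# "8" -> 50.0 mV, "10" -> 100.0 mV, "11" -> 106.25 mV, "20" -> 200.0 mV
```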


----------



## voldomazta

Anybody on tri-fire Eyefinity? I want to compare my BF4 ultra-settings performance; mine seems pretty low. I only get 20+ fps on the test range with ultra settings. My resolution is 5760x1200.


----------



## psyside

Well good news everyone!

I talked again with the electrician, this time showing him the mosfets of 290/290X









He said they keep their ideal performance even @100C+, and he also said they are high quality and very good generally - he says that very rarely. He wasn't so impressed by the inductors; he rated them as average and recommended not exceeding 60-70C on them for good performance. He works repairing expensive gear, so for him to say something is high quality means a lot.









So yeah, anyone who plans to buy some non-reference cards, think really hard before you do it. He confirmed to me that if some of these cards don't OC well, there is no - or very, very little - chance the card's VRM design is holding you back.









For now, no card beats the reference ones regarding VRM performance. Maybe the Lightning/Mars will do it, but I'm not sure. The MOSFETs they are using are top notch; he was really surprised to see such linear performance even at very high temps - he didn't expect that from consumer grade products. The IR VRMs on these cards rock!


----------



## Hogesyx

A heads up guys, 3DMark is on discount on Steam - a 90% off flash sale.


----------



## PCBung

Quote:


> Originally Posted by *Hogesyx*
> 
> A heads up guys, 3dmarks on discount on Steam, 90% discount flash sale.


Bought it this morning; finally I can benchmark without the damn demos!


----------



## Hogesyx

Quote:


> Originally Posted by *PCBung*
> 
> Bought it this morning, finally benchmark without the danm demo's !


Haha. I got mine during the 50% sale, so I got another copy as a gift in case any friends need it.


----------



## trihy

Has anyone tried a home-made way to improve temps on these cards?

Maybe removing the plastic cover and putting a 12cm fan there? It would make the stock fan useless, but maybe it would work better..


----------



## kcuestag

Still getting blackscreen issues (more like monitor signal loss, and the cards' fans go full blast). Could it be a software issue? I haven't really formatted my Windows since getting my first 290X; might as well try it and see if it's a corrupted installation causing this...


----------



## Hogesyx

Quote:


> Originally Posted by *trihy*
> 
> Anyone tried a home made way to improve temps on this cards?
> 
> Maybe removing the plastic cover and putting a 12cm fan there? It would make the stock fan useless, but maybe it will work better..


You can't. The heatsink was designed for air pushed sideways through it by the blower fan. You might want to try stacking an 80mm fan on the blower fan; it helps a bit but not much, and is what I did prior to water.


----------



## trihy

Quote:


> Originally Posted by *Hogesyx*
> 
> You can't. The heatsink was designed for air pushed sideways through it by the blower fan. You might want to try stacking an 80mm fan on the blower fan; it helps a bit but not much, and is what I did prior to water.


Yes... you are probably right, I've never seen an improved reference heatsink/cooler, but I need to lower this card's temps, they are completely insane. Most in-game benches give me numbers on par with or below a 7970. I guess it's throttling like hell





















(besides, it never passed 95 degrees)

Only synthetic benches give great numbers...


----------



## Raephen

Quote:


> Originally Posted by *smoke2*
> 
> I'm wondering to buy new R9 290, but one thing surprised me.
> It is required 750W power supply.
> http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&pid=2044&psn=&lid=1&leg=0#
> 
> I'm owning now Seasonic X-650 which is 650W.
> My rig is i5-4670K OCed, 2x4GB RAM, 2x Caviar Black 2TB.
> 
> I have a chance to sell this PSU and to buy more powerful, probably will buy X-750.
> 
> But I have a question.
> Do you think that my current PSU can handle with this card without any problems for the whole life of card?
> Wouldn't want to risk anything...
> 
> Thanks.


I think you should be safe.

I reckoned the X-560 would've been enough to handle my system + the single 290 I have, though that would've been pushing it higher than I liked.

I had the chance to sell it off alongside my old GPU, so I took that chance.

Your 650 gives you a bit more headroom, and I would have felt comfortable running that.

But like you mentioned: you have an option to sell it off, but don't do it to upgrade to 'just' an X-750 if it ends up costing more. You might want to look into their platinum offerings. Like I said: I sold my X-560. I replaced it with the Seasonic 860XP2. And contrary to what I would have expected, it's even more efficient in idle than the X-560 was, on very light loads. (Idle draw: 70-80 W, at the wall, monitor off < 10%).

The difference is maybe between 5 and 10 watts at that load, but from what I know of PSUs, I would've expected it to be the other way around - a bit more draw when idling, not less.


----------



## Hogesyx

Quote:


> Originally Posted by *trihy*
> 
> Yes... probably you are right, I´d never seen an improved reference heatsink/cooler, but I need to lower temps of this card, they are completely insane. Most in-game benchs, gives me number on par or below 7970. I guess is throttling like hell
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (besides it never passed 95 degress)
> 
> Only synthetic benchs gives great numbers...


Which card are you using? My old 290 was a tad hotter compared to the 290X. With both cards I have no throttling at 60% fan speed on stock clocks; my ambient temp is tropical, at 30-34C.


----------



## kizwan

Quote:


> Originally Posted by *Radmanhs*
> 
> ok, i am going to use a kraken g10 on my 290, but i heard the vrm's get very hot still
> 
> can i get a boxes of these and use them on the vrm's and ram, or will they only work on the ram
> 
> http://www.amazon.com/Cosmos-Copper-Cooling-Heatsinks-cooler/dp/B00637X42A/ref=cm_cr_pr_product_top
> 
> along with this stuff to glue them on
> 
> http://www.amazon.com/Arctic-Alumina-Ceramic-Thermal-Adhesive/dp/B001O7ARUQ/ref=pd_sim_sbs_pc_15


That is *permanent* adhesive. The heatsinks will be attached forever.
Quote:


> Originally Posted by *Spartacvs*
> 
> hi guys, i new in this forum and i have a problem.... i have two 290 sapphire, my problem is this:
> my configuration is as follows intel i7 3930K 4.8 ghz, 16 gb ram kingston hyper x 2400MHz, Asus Rampage 4 Extreme, Corsair 1200w Gold Series, I'm trying the video cards in single TRIXX with the program, setting the following parameters gpu 1250 mhz 15000 mhz ram and the test is performed on the 3d mark 11 with a score of 16400, but once finished the test the video card in the graph GPUZ remains at 100 percent .. I do not understand why ... and are forced to restart, this happens to me with both. i hope for your help, thx.
> The vga are liquid with the waterblock ek, sorry for my english.


Did you try the latest driver? 13.11 beta 9.5 / 13.12?
Quote:


> Originally Posted by *PCBung*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Hogesyx*
> 
> A heads up guys, 3dmarks on discount on Steam, 90% discount flash sale.
> 
> 
> 
> Bought it this morning, finally benchmark without the danm demo's !
Click to expand...

Bought mine this morning too.









Quote:


> Originally Posted by *kcuestag*
> 
> Still getting blackscreen issues (More like monitor signal loss and cards fans go into full blast). Could it be a software issue? I haven't really formatted my Windows since getting my first 290X, might as well try it and see if it's a corrupted installation causing this...


It seems like the GPU(s) is/are suddenly "disconnected" from the slot, i.e. the card loses its electrical connection to the PCIe slot. But this doesn't necessarily mean bad PCIe slot(s). I think a clean Windows installation is a good idea. Troubleshooting drivers can be troublesome sometimes.


----------



## trihy

Quote:


> Originally Posted by *Hogesyx*
> 
> Which card are you using? My old 290 was a tad hotter compared to 290X. With both card I have no throttling at 60% fan speed on stock clock, my ambient temp is tropic temp at 30-34%


An XFX one. Probably it's not throttling... and AMD drivers are responsible... not sure. But e.g. in Sleeping Dogs I was getting max 80, avg 60, min 55 on a 7970 GHz (1080p, max settings but AA at high), and on the 290X I'm getting 66-60-58...

Every review states 95 degrees is common for these cards, so I guess I can't blame the card..

I haven't tried a fixed 60% fan speed, which is super loud, but will try later


----------



## kcuestag

Quote:


> Originally Posted by *kizwan*
> 
> It seems like the GPU(s) is/are suddenly "disconnected" from the slot. Meaning like lost electrical connection between the card & PCIe slot. But this doesn't necessarily means bad PCIe slot(s). I think clean Windows installation is good idea. Troubleshooting drivers can be troublesome sometime.


Hoping a clean Windows install helps... It's driving me nuts, I've already gone through two 290X's and now two 290's.









I'm pretty sure every piece of hardware is just fine.

I haven't done a clean Windows install since September, and since then I've gone through two 7970's, then one 290X, which had these issues, then another one from RMA with same issues, and now these two 290's having same issue.

I'm pretty sure 4 cards can't have this issue by hardware, or I hope not.


----------



## smoke2

Quote:


> Originally Posted by *Raephen*
> 
> I think you should be safe.
> 
> I reckoned the X-560 would've been enough to handle my system + the single 290 I have, though that would've been pushing it higher than I liked.
> 
> I had the chance to sell it of alongside my old gpu, so I took that chance.
> 
> Your 650 gives you a bit more headroom, and I would have felt comfortable running that.
> 
> But like you mentioned: you have an option to sell it off, but don't do it to upgrade to 'just' an X-750 if it ends up costing more. You might want to look into their platinum offerings. Like I said: I sold my X-560. I replaced it with the Seasonic 860XP2. And contrary to what I would have expected, it's even more efficient in idle than the X-560 was, on very light loads. (Idle draw: 70-80 W, at the wall, monitor off < 10%).
> 
> Maybe something between 5 to 10 Watts at that load, but from what I know of PSU's, I would've expected the difference to be the other way around - a bit more when idling, not less


My seller doesn't have the 860XP in stock and I'd like to buy before the end of the year, so I can only choose between the X-750 and X-850.
If you were in my case, would you sell your X-650 and buy an X-750 for peace of mind?


----------



## Sgt Bilko

Quote:


> Originally Posted by *kcuestag*
> 
> Hoping a clean Windows install helps... It's driving me nuts, I've already gone through two 290X's and now two 290's.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm pretty sure every piece of hardware is just fine.


Fresh install usually fixes those unknown problems. Hope it all goes well


----------



## kcuestag

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Fresh install usually fixes those unknown problems. Hope it all goes well


I hope so too, I've already gone through 4 R9 cards in total with the very same issue, I refuse to believe they are all faulty.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kcuestag*
> 
> I hope so too, I've already gone through 4 R9 cards in total with the very same issue, I refuse to believe they are all faulty.


Yeah, I really doubt all 4 were faulty... probably some driver conflict or such going on in the bowels of Windows somewhere


----------



## kcuestag

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah, i really doubt all 4 were faulty......probably some driver conflict or such going on in the bowels on Windows there somewhere


Yeah, I remember when those BETA drivers came out, I didn't get ANY issues for well over a week, but then they came back! So maybe it is software indeed.


----------



## Raephen

Quote:


> Originally Posted by *smoke2*
> 
> My seller doesn't have 860XP on stock and would like to buy it to the end of the year, so I can only have choice between X-750 and X-850.
> If you will be in my case, will you sell your X-650 and buy X-750 for better feeling?


If I were in your shoes, no, I wouldn't.

But the same goes with every consumer product: go with what feels right to you.

Have you tested your system draw as it is now? If so, you can draw comparisons based off reviews of the gear you want to replace and extrapolate from there.

While your CPU might be a bit more power hungry than my 3570K, I do think your X-650 is more than enough.

Like already mentioned: the recommended PSU needs take into account people might have crappy ones, so to be safe the recommendations are usually higher than actually needed.


----------



## DraXxus1549

Picked up 3d mark yesterday and ran firestrike extreme.

How are my scores?

http://www.3dmark.com/3dm/1945710

EDIT: This is at 1100/1400.


----------



## Roy360

So no one on this thread has random black screens? Time for me to RMA my card?


----------



## Jaju123

http://www.3dmark.com/3dm/1946920

Is this a normal benchmark on fire strike extreme?

2x R9 290s stock in crossfire
2500K @ 4.8 ghz


----------



## Sgt Bilko

Quote:


> Originally Posted by *DraXxus1549*
> 
> Picked up 3d mark yesterday and ran firestrike extreme.
> 
> How are my scores?
> 
> http://www.3dmark.com/3dm/1945710
> 
> EDIT: This is at 1100/1400.


This is mine compared with yours: http://www.3dmark.com/compare/fs/1362276/fs/1151775

Mine was running at 1170/1430 though so everything looks pretty good there


----------



## ShortySmalls

Furmarking Both cards = massive power requirement









My 900W UPS started going nuts; I looked at the screen and it showed I was pulling 1.2kW with just my 2 290Xs running FurMark at 1150/1400 +50% power.... I can't imagine if I added my CPU into the mix running Prime95. I don't think my EVGA 1300 G2 would have enough power; the CPU and GPUs at idle pull almost 500W from the wall.


----------



## Andrix85

Hi guys...

I have an XFX R9 290 modded to a 290X with an ASUS BIOS, and a max voltage of 1.41V in GPU Tweak (1.117V real in GPU-Z). The BIOS is revision 3515.
Is there another BIOS that allows higher voltage?

Thanks


----------



## VSG

Quote:


> Originally Posted by *Jaju123*
> 
> http://www.3dmark.com/3dm/1946920
> 
> Is this a normal benchmark on fire strike extreme?
> 
> 2x R9 290s stock in crossfire
> 2500K @ 4.8 ghz


Ya, seems fine for stock settings. You can expect a major increase when you overclock the cards.

Edit: I did a quick comparison against my CFX 290x cards (at 1100/1300) when I was testing them out prior to putting on water blocks on everything. Even with my stock 4770k at 3.5 Ghz, my physics score is way higher than your 2500k at 4.8 Ghz. I suspect your CPU will bottleneck those cards quite a bit.

http://www.3dmark.com/compare/fs/1363002/fs/1262528


----------



## kizwan

Quote:


> Originally Posted by *Jaju123*
> 
> http://www.3dmark.com/3dm/1946920
> 
> Is this a normal benchmark on fire strike extreme?
> 
> 2x R9 290s stock in crossfire
> 2500K @ 4.8 ghz


This is mine compared with yours. Your graphics score seems low though. Maybe because of the CPU?

http://www.3dmark.com/compare/fs/1363002/fs/1363682

[EDIT] You're running Extreme benchmark. LOL.


----------



## Jaju123

Quote:


> Originally Posted by *kizwan*
> 
> This is mine compared with yours. Your graphic score seems low though. Maybe because of CPU?
> 
> http://www.3dmark.com/compare/fs/1363002/fs/1363682


Well I did fire strike extreme and you did fire strike normal. Here is my firestrike normal test:

http://www.3dmark.com/3dm/1947989

I got a bit higher graphics score than yours, actually!


----------



## devilhead

Quote:


> Originally Posted by *kcuestag*
> 
> Hoping a clean Windows install helps... It's driving me nuts, I've already gone through two 290X's and now two 290's.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm pretty sure every piece of hardware is just fine.
> 
> I haven't done a clean Windows install since September, and since then I've gone through two 7970's, then one 290X, which had these issues, then another one from RMA with same issues, and now these two 290's having same issue.
> 
> I'm pretty sure 4 cards can't have this issue by hardware, or I hope not.


i get some black screens because of 120Hz; when i switch my monitor to 60Hz, everything looks ok


----------



## Jaju123

Quote:


> Originally Posted by *geggeg*
> 
> Ya, seems fine for a stock setting. You can expect major increase when you overclock the cards.
> 
> Edit: I did a quick comparison against my CFX 290x cards (at 1100/1300) when I was testing them out prior to putting on water blocks on everything. Even with my stock 4770k at 3.5 Ghz, my physics score is way higher than your 2500k at 4.8 Ghz. I suspect your CPU will bottleneck those cards quite a bit.
> 
> http://www.3dmark.com/compare/fs/1363002/fs/1262528


Yeah I expect to upgrade soon to a 3770k, since I can use the same mobo for it. Should be a reasonable upgrade!


----------



## RAFFY

Quote:


> Originally Posted by *kcuestag*
> 
> I hope so too, I've already gone through 4 R9 cards in total with the very same issue, I refuse to believe they are all faulty.


Wow, you should have picked up on that with the first card lol


----------



## stickg1

Man I can't run certain games with high AA, I get the black screen. If I run everything on medium AA I can game for days. It's fine, but it's just lame that I'm maxing out games with ~60% GPU usage and when I go for full eyecandy I get black screens.


----------



## kizwan

Quote:


> Originally Posted by *Jaju123*
> 
> Well I did fire strike extreme and you did fire strike normal. Here is my firestrike normal test:
> 
> http://www.3dmark.com/3dm/1947989
> 
> I got a bit higher graphics score than yours, actually!


This is mine in Extreme compared to yours:

http://www.3dmark.com/compare/fs/1363002/fs/1366317


----------



## Radmanhs

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Radmanhs*
> 
> ok, i am going to use a kraken g10 on my 290, but i heard the vrm's get very hot still
> 
> can i get a boxes of these and use them on the vrm's and ram, or will they only work on the ram
> 
> http://www.amazon.com/Cosmos-Copper-Cooling-Heatsinks-cooler/dp/B00637X42A/ref=cm_cr_pr_product_top
> 
> along with this stuff to glue them on
> 
> http://www.amazon.com/Arctic-Alumina-Ceramic-Thermal-Adhesive/dp/B001O7ARUQ/ref=pd_sim_sbs_pc_15
> 
> 
> 
> That is *permanent* adhesive. The heatsink will attached forever.
Click to expand...

ok, are those heatsinks what i should use for both the ram and vrm's? is there also a non-permanent way of attaching them?


----------



## VSG

Quote:


> Originally Posted by *kizwan*
> 
> This mine in Extreme compared to yours:-
> 
> http://www.3dmark.com/compare/fs/1363002/fs/1366317


Did you overclock the cards at all? Your graphics score suggests they are at stock.


----------



## TheSoldiet

*** happened here.......

http://www.3dmark.com/fs/1097562


----------



## VSG

Did your card throttle down like crazy? Monitor your GPU clocks and temperatures.

My laptop scores more than that, so something was clearly off big time.


----------



## Jack Mac

Quote:


> Originally Posted by *TheSoldiet*
> 
> *** happend here.......
> 
> http://www.3dmark.com/fs/1097562


I have no idea...


----------



## kizwan

Quote:


> Originally Posted by *geggeg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> This mine in Extreme compared to yours:-
> 
> http://www.3dmark.com/compare/fs/1363002/fs/1366317
> 
> 
> 
> Did you overclock the cards at all? Your graphics score suggests they are at stock.
Click to expand...

No, I have not overclocked yet. They're running at stock clocks for now until I get water blocks for them. Out of stock everywhere over here. I'll need to get them directly from EKWB, because unfortunately there is no "reliable" water cooling supplier here.


----------



## TheSoldiet

My card ran fine, it was my CPU. Look at the physics score. My CPU lagged like crazy in those tests.
http://www.3dmark.com/fs/1097562


----------



## Jack Mac

Quote:


> Originally Posted by *TheSoldiet*
> 
> My card ran fine, it was my CPU. Look at the physics score. My CPU lagged like crazy in those tests.
> http://www.3dmark.com/fs/1097562


Even though I don't care much for the FX CPUs, you should definitely be scoring a good bit higher than that. Maybe it's throttling? I'd try running the test again.


----------



## TheSoldiet

Quote:


> Originally Posted by *Jack Mac*
> 
> Even though I don't care much for the FX CPUs, you should definitely be scoring a good bit higher than that. Maybe it's throttling? I'd try running the test again.


Yeah, I'm gonna run it again next Sunday (Christmas holiday). Just got the full version of 3DMark for £2


----------



## Radmanhs

hey guys, i saw you could use this

http://www.amazon.com/Arctic-DCACO-VR00500-GB-VR005-VGA-Heatsink/dp/B0047V031E

for the vrm's on an r9 290, but it was said that you need 4 more ram heatsinks to cover them all. should i just get other heatsinks for the ram, like this pack?

http://www.amazon.com/Cosmos-Aluminum-Cooling-Heatsinks-cooler/dp/B007XACV8O/ref=pd_cp_pc_0

and finally, what would be the best way to attach these that isn't permanent?


----------



## TheSoldiet

I just noticed that in my 3DMark test only 1 of my 8 cores was running........
Quote:


> Physical / logical processors: 1 / 8


----------



## Forceman

Quote:


> Originally Posted by *Falkentyne*
> 
> No, actually you're wrong.
> the offset is in hexadecimal and it adds the direct offset to the default core voltage, not the CURRENT set voltage..
> so "10" is 16 in decimal.
> Voltage is in 6.25mv steps, so 16x6.25=100 (100 mv).
> So you use 10 to set 100 mv.
> 
> 11 would be 106.25 mv. (17 (decimal) x 6.25=106.25
> 8 would be 50mv (8 x 6.25=50).
> 
> Setting "0" returns it back to the default VID with no offset.
> And I just checked quickly in GPU-Z and using values over "10" in the command line (100 mv) DO work and apply. You can check because the idle voltage goes up.
> I DO NOT think that afterburner's GUI will show more than 100 mv, as the sliders are locked to 100mv max.


Isn't that pretty much exactly what I said? The command line doesn't change the offset slider; it adds the voltage you specify (in my example 100mV) to the core voltage directly. So +0 offset is the same as +100 offset with no command line, and +100 offset is the same as +200 offset with no command line. The Afterburner voltage slider won't show the effect of the command line change; only looking at the actual voltage will show you whether it is working.
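The arithmetic being discussed can be sketched like this. This is only an illustration of the hex-offset scheme described above (hex value times 6.25 mV on top of the default VID), not Afterburner's actual code:

```python
# One offset unit equals one 6.25 mV VID step, per the discussion above.
STEP_MV = 6.25

def offset_to_mv(hex_offset: str) -> float:
    """Convert a hex offset string (e.g. "10") to added millivolts."""
    return int(hex_offset, 16) * STEP_MV

def mv_to_offset(mv: float) -> str:
    """Convert a desired extra voltage in mV back to the hex string."""
    return format(round(mv / STEP_MV), "x")

print(offset_to_mv("10"))  # 100.0  ("10" hex = 16 decimal = +100 mV)
print(offset_to_mv("11"))  # 106.25
print(offset_to_mv("8"))   # 50.0
print(mv_to_offset(100))   # 10
```

So "0" means no offset, and anything above "10" goes past what the GUI slider can display.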
Quote:


> Originally Posted by *TheSoldiet*
> 
> I just noticed that on my 3dmark test only 1 of my 8 cores were running........
> Quote:
> 
> 
> 
> Physical / logical processors1 / 8

That means 1 physical processor, 8 logical processors. Not that it is only using 1 out of 8.


----------



## kizwan

Quote:


> Originally Posted by *Radmanhs*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Radmanhs*
> 
> ok, i am going to use a kraken g10 on my 290, but i heard the vrm's get very hot still
> 
> can i get a boxes of these and use them on the vrm's and ram, or will they only work on the ram
> 
> http://www.amazon.com/Cosmos-Copper-Cooling-Heatsinks-cooler/dp/B00637X42A/ref=cm_cr_pr_product_top
> 
> along with this stuff to glue them on
> 
> http://www.amazon.com/Arctic-Alumina-Ceramic-Thermal-Adhesive/dp/B001O7ARUQ/ref=pd_sim_sbs_pc_15
> 
> 
> 
> That is *permanent* adhesive. The heatsink will be attached forever.
> 
> 
> ok, are those heat sinks what I should use for both RAM and VRMs? Is there also a non-permanent way of attaching them?

Quote:


> Originally Posted by *Radmanhs*
> 
> hey guys, i saw you could use this
> 
> http://www.amazon.com/Arctic-DCACO-VR00500-GB-VR005-VGA-Heatsink/dp/B0047V031E
> 
> for the vrm's on a r9 290, but it was said that you need 4 more ram heatsinks to cover them all, should i just get other heatsinks for the ram like this pack?
> 
> http://www.amazon.com/Cosmos-Aluminum-Cooling-Heatsinks-cooler/dp/B007XACV8O/ref=pd_cp_pc_0
> 
> and finally, what would be the best way to attach these that isnt permanent?


You can use double-sided Thermal *Tape* to attach the heatsink. ^^Those heatsinks look all right to me.


----------



## Spartacvs

Quote:


> Originally Posted by *kizwan*
> 
> That is *permanent* adhesive. The heatsink will be attached forever.
> Did you try latest driver? 13.11 beta 9.5 / 13.12?
> Bought mine this morning too.
> 
> 
> 
> 
> 
> 
> 
> 
> It seems like the GPU(s) is/are suddenly "disconnected" from the slot, meaning a lost electrical connection between the card & PCIe slot. But this doesn't necessarily mean bad PCIe slot(s). I think a clean Windows installation is a good idea. Troubleshooting drivers can be troublesome sometimes.


Hello, thank you for your answer. I tried the latest drivers, both the 13.12 certified and the beta, but same problem. Do you think it's the video cards, or is it a software problem? I can't believe it's the VRMs or anything else; they're under liquid and the VRM temperatures are low.


----------



## quakermaas

Blocks installed


----------



## kizwan

Quote:


> Originally Posted by *Spartacvs*
> 
> hello, thank you for your answer, I tried the latest drivers for the 13:12 certificates and beta, but same problem, do you think are the video cards? or is it a software problem? I can not think that they are under the vrm or other liquid and the temperature of the vrm are low.


Instead of TRIXX, try MSI Afterburner beta 17 (link at first post).


----------



## TheSoldiet

Quote:


> Originally Posted by *Forceman*
> 
> That means 1 physical processor, 8 logical processors. Not that it is only using 1 out of 8.


It is supposed to be 8/8, so when it is 1/8 it means that only one core is running. 1 physical using 8 logical threads.


----------



## Spartacvs

Quote:


> Originally Posted by *kizwan*
> 
> Instead of TRIXX, try MSI Afterburner beta 17 (link at first post).


MSI Afterburner does not apply the mV adjustment; TRIXX does.


----------



## Spartacvs

Quote:


> Originally Posted by *kizwan*
> 
> Instead of TRIXX, try MSI Afterburner beta 17 (link at first post).


you see... this is my problem








http://imageshack.com/a/img4/9606/bi5d.png

the first GPU is at full load after the bench....








This also happens with the second card; the second in CrossFire is not at 100 percent.


----------



## Spartacvs

Quote:


> Originally Posted by *kizwan*
> 
> Instead of TRIXX, try MSI Afterburner beta 17 (link at first post).


Quote:


> Originally Posted by *kizwan*
> 
> You can use double sided Thermal *Tape* to attached the heatsink. ^^That heatsinks look all right to me.


Change the BIOS????


----------



## 107Spartan

*Hi ALL!!!...

I have finally finished reading about half of this thread's posts!.... and first let me say, love the info. I started not knowing much, and now I feel much more confident about my decision to eventually hit up watercooling and take this OC hobby head on.

I have an R9 290X. Bought it on Cyber Monday from Newegg for $490...









Here is a screenshot of my GPU-Z.*


----------



## 107Spartan

Been doing some reading on the GPU-Z ASIC quality, and from what I gather a lower ASIC score means a card will OC better on water, while a higher score will OC better on air?

I was hoping my card would rate at least 75%... not sure how thrilled I am that my card gets a "D" in the rating system...

I don't buy cards very often, so the thought of RMA'ing this one just for the score has crossed my mind, but then I got confused about exactly what that score means for real-world gaming OC's if you're going the watercooling route.


----------



## Pesmerrga

Quote:


> Originally Posted by *TheSoldiet*
> 
> It is supposed to be 8/8, so when it is 1/8 it means that only one core is running. 1 physical using 8 logical threads.


I don't think so.. Just ran FS to see what my FX-8350 and GTX 580 max OC'ed could do in comparison to a 290/290x..

I put PhysX on CPU (which actually gets a higher Physics score by a few hundred)



http://www.3dmark.com/3dm/1956719


----------



## bloodkil93

Hey guys,
I'm having some REALLY annoying issues with my R9 290, whenever I even slightly start to overclock, I get blackscreens and occasional bluescreeen, is this normal or should I RMA it?

It also (not very often though) bluescreens/blackscreens on stock

Thanks!


----------



## TheSoldiet

Quote:


> Originally Posted by *Pesmerrga*
> 
> I don't think so.. Just ran FS to see what my FX-8350 and GTX 580 max OC'ed could do in comparison to a 290/290x..
> 
> I put PhysX on CPU (which actually gets a higher Physics score by a few hundred)
> 
> http://www.3dmark.com/3dm/1956719


Then I don't know what is wrong with my CPU. I will try to figure it out on Sunday.


----------



## Forceman

Quote:


> Originally Posted by *TheSoldiet*
> 
> It is supposed to be 8/8, so when it is 1/8 it means that only one core is running. 1 physical using 8 logical threads.


No, you have 1 physical processor with 8 threads. The only way it would say 8/something is if you had it running in an 8-way server. It is # of physical processors / # of logical processors (threads). Someone with a 3570K would see 1/4 instead of 1/8. I've looked at dozens of these in the Firestrike thread, and they always say 1/something.

For example:

http://www.3dmark.com/3dm/1543242


http://www.3dmark.com/3dm/1872175


----------



## TheSoldiet

Quote:


> Originally Posted by *Forceman*
> 
> No, you have 1 physical processor with 8 threads. The only way it would say 8/something is if you had it running in an 8-way server. It is # of physical processors / # of threads. Someone with a 3570 would see 1/4 instead of 1/8. I've looked at dozens of these in the Firstrike thread, and they always say 1/something.
> 
> For example:
> http://www.3dmark.com/3dm/1543242


Yeah, noticed. Thx for clearing it up though.


----------



## Pesmerrga

Question for you guys. Did anyone get a 290 or 290x from Tigerdirect.com? It seems like they haven't had any stock since the first week the card came out. Best "new" AMD card they have for sale is an R7 260x. lol.. I have a credit account with them that is just waiting for stock of reference or aftermarket cards to come in so I can pull the trigger.

Actually, I found an R9 270x that was mis-named as an R9270x.. woohoo!



Oh, and they're just as bad as Newegg.. $449 for a reference 7950

http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=1967386&CatId=7387


----------



## 107Spartan

Anyone know why whenever I try and run the free 3DMark test it keeps crashing on me? Tried to OC the card and now it crashes at the very beginning. I can play BF4 at ultra graphics no problem at 120% and still get 60 FPS... not sure why it crashes in the 3DMark test... any advice?

I'm running it with a custom fan profile to keep it cool with no OC to see if it's a software or hardware issue.


----------



## 107Spartan

....and nope, it crashed at start again.


----------



## Forceman

Quote:


> Originally Posted by *107Spartan*
> 
> ....and nope it crashed a at start again.


Does it pop up an error message as soon as you click the run button? That's what mine does now after I crashed running it the other day.


----------



## 107Spartan

Nope, the first test ran; it made it to the very end and then the audio froze while the rest of the video finished, then it was just a black screen with the stuck audio. I had to hard reboot the system.

The second time I OC'd the card and it just went to a black screen. Had to hard reboot again.

Third time, no OC but an aggressive custom fan profile. It crashed a few mins in to a white screen... and again I had to hard reboot.

Logged into BF4 with my 1100/1400 OC settings and the fan at 70%, then played BF4 at ultra for 20 mins with zero issues, screen tears, or flickers. Played at 1440p on a 27" monitor too.

I'm confused what's wrong with 3DMark.


----------



## Slomo4shO

Can anyone with a Koolance water block advise what pads they are using if they are not using the stock Koolance pads?
Quote:


> The thermal pad sheet should be cut into pieces required for your video block contact areas. Please use the diagram included with your water block to determine the approximate sizes needed. Koolance heat transfer pads can have different thicknesses (0.5mm, *0.7mm, and 1.0mm*)


The diagram included in the box (not listed anywhere online that I can find) only specifies 0.7mm and 1.0mm pads and I am not finding any fujipoly 0.7mm pads anywhere (I already have 0.5mm and 1.0mm fujipoly pads). So after digging around on the web I am somewhat perplexed about what to do. I would like to use a higher quality pad but I am unable to find any. Does anyone have any recommendations?


----------



## Durvelle27

Quote:


> Originally Posted by *Slomo4shO*
> 
> Can anyone with a Koolance water block advise what pads they are using if they are not using the stock Koolance pads?
> The diagram included in the box (not listed anywhere online that I can find) only specifies 0.7mm and 1.0mm pads and I am not finding any fujipoly 0.7mm pads anywhere (I already have 0.5mm and 1.0mm fujipoly pads). So after digging around on the web I am somewhat perplexed about what to do. I would like to use a higher quality pad but I am unable to find any. Does anyone have any recommendations?


I saw some 0.7mm Thermal pads on Performance-PCs.com


----------



## Slomo4shO

Quote:


> Originally Posted by *Durvelle27*
> 
> I saw some 0.7mm Thermal pads on Performance-PCs.com


Are you referring to Koolance Thermal Padding - 0.7mm Thick? Because that is already included with the block. I am looking for some higher quality pads since these are only rated at 1.5 W/mK.


----------



## Durvelle27

Quote:


> Originally Posted by *Slomo4shO*
> 
> Are you referring to Koolance Thermal Padding - 0.7mm Thick? Because that is already included with the block. I am looking for some higher quality pads since these are only rated at 1.5 W/mK.


They weren't those, but I can't remember the name as I was just browsing the site for some good 1.5mm thermal pads.


----------



## stickg1

Quote:


> Originally Posted by *Slomo4shO*
> 
> Are you referring to Koolance Thermal Padding - 0.7mm Thick? Because that is already included with the block. I am looking for some higher quality pads since these are only rated at 1.5 W/mK.


IDK, I plan on just using what came with the block. Looks to be decent enough.


----------



## Jack Mac

In case you guys didn't see this:

Also, if anyone is having issues with 290/X switching to 2D clocks, I recommend using the following settings in CCC, and then reverting to default and applying an OC in AB when you want to game.


Keeps me below 50C and nice and quiet at idle.


----------



## Arizonian

Quote:


> Originally Posted by *Slomo4shO*
> 
> Can anyone with a Koolance water block advise what pads they are using if they are not using the stock Koolance pads?
> The diagram included in the box (not listed anywhere online that I can find) only specifies 0.7mm and 1.0mm pads and I am not finding any fujipoly 0.7mm pads anywhere (I already have 0.5mm and 1.0mm fujipoly pads). So after digging around on the web I am somewhat perplexed about what to do. I would like to use a higher quality pad but I am unable to find any. Does anyone have any recommendations?


Got you updated to water









Quote:


> Originally Posted by *107Spartan*
> 
> *Hi ALL!!!...
> 
> I have finally finished reading about half of this threads posts!.... and First let me say, love the info. Stated not knowing much to now feeling much more confident with my decision to eventually hit up Watercooling and taking this OC hobby head on.
> 
> I have a R9 290X. Bought it on Cyber Monday from Newegg for 490$...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here is a screenshot of my GPU-Z.*
> 
> 


First welcome to OCN, glad the owners club thread helped - congrats - added


----------



## stickg1

Quote:


> Originally Posted by *Jack Mac*
> 
> In case you guys didn't see this:


I applied for a step-up with EVGA about 11 months ago. Haven't heard anything yet. So hopefully if they do, it will actually work in a timely manner.


----------



## chiknnwatrmln

My build is still waiting on an EK 290x block... Ordered it like two weeks ago and still not shipped.

Knowing my luck it's gonna arrive the day I move back to school.


----------



## Sgt Bilko

Quote:


> Originally Posted by *TheSoldiet*
> 
> Then i dont know what is wrong with my cpu. I will try to figure it out on Sunday.


Yeah, you should be getting at least an 8k Physics score on that CPU even at 4.5GHz. Something is really wrong..... C&Q (Cool'n'Quiet) maybe?


----------



## ImJJames

If anyone got their kraken g10 I am willing to pay double the price for it.


----------



## BradleyW

Quote:


> Originally Posted by *Durquavian*
> 
> prerendering: occurs with dual gpu but not single unless selected. Prerenndered frames cause lag.


Setting an FPS limit equal to the refresh rate sort of fixes this. I say sort of because randomly you can feel the lag slowly coming back, and then it just vanishes again. This repeats. Any idea why?

Edit: I'm talking about input lag here.


----------



## chiknnwatrmln

So I think GPUTweak just killed my 290.....

I was installing some Skyrim mods, finished up and went to start the game. Changed to my gaming profile (1100/5500 @ 1.268v before droop, stable since I got the card) and the screen kept flashing and artifacting and my PC reset. This was accompanied by a squealing noise from my GPU. On startup GPUTweak briefly showed 1820mv (!!!) and then the same thing happened.

Started in safe mode, uninstalled GPUTweak; now in regular mode whenever my memory clocks up (even at stock clocks) I get heavy artifacting......

This is really the last thing I need right now.

Okay, I just re-installed GPUTweak and remade my profiles and everything's seemingly back to normal.. what the heck.


----------



## Durquavian

Quote:


> Originally Posted by *BradleyW*
> 
> Setting an fps limit equal to the refresh rate fixes this sort of. I say sort of, because randomly you can feel the lag slowly coming back. And then it just vanishes again. This repeats. Any idea why?
> 
> Edit: I'm talking about input lag here


Depends on the setup, I guess. If you are using no V-sync then there is some pre-rendering going on. In RadeonPro the default for flip queue (render ahead) is 3 frames. That would equate to 48-50ms, which would be somewhat noticeable. CF/SLI will have pre-rendered frames to some degree, with the other card creating the even frames and the first having to wait on them before doing the next odd one. V-sync on creates a whole host of scenarios. DX uses render-ahead, as do quite a few games. Some games, and definitely RadeonPro, use true triple buffering, which does help with the lag. With Nvidia I have little to no relevant experience (used Nvidia 4-5 years ago). But generally you can get minimal lag with any option outside of single GPU with V-sync off.
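The 48-50ms figure above is just queue depth times frame time; a quick sketch of that arithmetic:

```python
# Rough input-lag contribution of a pre-rendered frame queue:
# N queued frames at a given framerate add roughly N frame-times of lag.
def queue_lag_ms(frames: int, fps: float) -> float:
    return frames * 1000.0 / fps

print(round(queue_lag_ms(3, 60.0)))  # 50 -> ~50 ms for a 3-frame queue at 60 fps
print(round(queue_lag_ms(3, 62.5)))  # 48
```

Which is why dropping the flip queue to 1 (or capping FPS so the queue never fills) is the usual remedy.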


----------



## Mr357

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> So I think GPUTweak just killed my 290.....
> 
> I was installing some Skyrim mods, finished up and went to start the game. Changed to my gaming profile (1100/5500 @ 1.268v before droop, stable since I got the card) and the screen kept flashing and artifacting and my PC reset. This was accompanied by a squealing noise from my GPU. On startup GPUTweak briefly showed 1820mv (!!!) and then the same thing happened.
> 
> Started in safe mod, uninstalled GPUTweak, now in regular mode whenever my memory clocks up (even at stock clocks) I get heavy artifacting......
> 
> This is really the last thing I need right now.
> 
> Okay, I just re-installed GPUTweak and remade my profiles and everything's seemingly back to normal.. what the heck.


ASUS's software is notorious for sensor errors. Unless you're on the PT1 or PT3 BIOS, that kind of Vcore jump isn't even possible.


----------



## HardwareDecoder

I have a strange issue. Every single game is rock stable @ 1100mhz all the time. Fallout new vegas however will not hold a stable clock and FPS goes up and down because of this.

I don't have vsync on or anything stupid like that, any ideas ?


----------



## chiknnwatrmln

Fallout uses a crappy single-threaded engine, that's why.

Use .ini tweaks. Google a Fallout ini tweak guide; that will help some.


----------



## HardwareDecoder

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Fallout uses a crappy single threaded engine, that's why.
> 
> Use .ini tweaks. Google fallout ini tweak guide that will help some.


yeah I have this:

bUseThreadedAI=1
iNumHWThreads=4

I could do 8 threads since I have an i7, but figured I was better off with 4. Either way it doesn't make the clock stay steady.


----------



## ZerrethDotCom

Seems like the power limit in AMD OverDrive does nothing. I can run a +18% GPU clock at 0% power limit, but can't get stable above that by upping the power limit.
I'm on a Gigabyte R9 290. I've heard something about a voltage-unlocked Asus BIOS; could it be that voltage is locked down in my Gigabyte BIOS?


----------



## hatlesschimp

Just wondering if I should get one more 290x and have tri fire. I play arma 3 mainly and maybe Dayz SA.


----------



## mojobear

hi guys,

are people still disabling ULPS for CrossFire setups? Just out of curiosity, because I have 2 more R9 290s coming in, hopefully within the next two weeks.

Thanks!


----------



## Derpinheimer

Quote:


> Originally Posted by *Mr357*
> 
> ASUS's software is notorious for sensor errors. Unless you're on the PT1 or PT3 BIOS, that kind of Vcore jump isn't even possible.


Mine offered something like 1800mV+ for awhile. Then it went down to a limit of 1412mV with the latest update.


----------



## Maracus

Quote:


> Originally Posted by *Mr357*
> 
> ASUS's software is notorious for sensor errors. Unless you're on the PT1 or PT3 BIOS, that kind of Vcore jump isn't even possible.


Actually, I remember a bug when running AIDA64 and MSI Afterburner at the same time: it would somehow change the voltage to ridiculous levels, i.e. my 6970 at 1.6V and idling at 92C. I think it had something to do with both tools accessing the VRMs at once.


----------



## mojobear

Hello all, also wanted to ask: is "VDDC Power In" in GPU-Z the power draw of the card? Haven't used GPU-Z with AMD cards in a long time, since my 5850 CrossFire days.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *HardwareDecoder*
> 
> yeah I have this:
> 
> bUseThreadedAI=1
> iNumHWThreads=4
> 
> I could do 8 threads since I have an i7 but figured better off with 4, either way it doesn't make it stay at a steady clock


Setting iNumHWThreads higher than 2 does nothing for performance. The majority of the game runs on one thread, but it supposedly can utilize a max of two. People have actually reported less stability when it's set higher than 2, although I never noticed.

There are way more tweaks than just those two. Other tweaks include audio cache, general caching size, so on and so forth. They won't really help your FPS, but if you tweak right you can pretty much get rid of stutter.

Just don't set the game to use multithreaded audio. That will cause crashing. And backup your .ini files (both of them) every couple of changes so you can narrow down when you've made a mistake.

Use this guide right here, it doesn't describe every setting but helps a lot.

http://www.tweakguides.com/Fallout3_8.html

This also goes for FNV as the game engine is nearly identical and a lot of these settings are also used in Skyrim.

Bethesda really needs to totally abandon the Gamebryo engine; it was outdated five years ago. As a result, even if you tweak everything just perfectly you're still gonna be CPU-bound in some areas with a lot of meshes, AIs, and scripts.
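For reference, the thread-count settings discussed in this exchange live in the `[General]` section of `Fallout.ini`; a minimal sketch per the advice above (leave the rest of the file untouched, and back it up first):

```ini
[General]
bUseThreadedAI=1
; keep at 2: the engine reportedly uses at most two threads,
; and values above 2 have been reported to reduce stability
iNumHWThreads=2
```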


----------



## ShortySmalls

Dis memory bandwidth... I won't get my blocks in until tomorrow (I hope; USPS has delayed it 3 times now). This OC isn't stable, I don't think; it's as high as MSI Afterburner will let me go for core and memory. I just wanted to see how high it would stay stable on the desktop.







I'll try and run Fire Strike like this lol


----------



## Jack Mac

Quote:


> Originally Posted by *ShortySmalls*
> 
> Dis Memory bandwidth... i will not get my blocks in until tomorrow (i hope, USPS has delayed it 3 times now) This OC isn't stable i don't think, its as high as msi afterburner will let me go for core and memory, i just wanted to see how high it would stay stable on the desktop
> 
> 
> 
> 
> 
> 
> 
> Ill try and run fire strike like this lol


It'll say that you're running that memory frequency, but I think it's actually running at a much lower speed. I only got a 1 FPS improvement going from 1500 to max memory in Valley. Proof:


----------



## kizwan

Quote:


> Originally Posted by *Spartacvs*
> 
> you see... this is my problem
> 
> 
> 
> 
> 
> 
> 
> 
> http://imageshack.com/a/img4/9606/bi5d.png
> 
> the first gpu si in full after the bench....
> 
> 
> 
> 
> 
> 
> 
> 
> This would happen even with the second, the second in crossfire is not 100 percent.


Quote:


> Originally Posted by *Spartacvs*
> 
> change the bios ????


Sorry, I don't know what is wrong there. You might want to try a fresh Windows installation.

I don't think it's a BIOS problem, though mine has a slightly different BIOS version.

** Different brands may have different VBIOS versions.* Yours are the Sapphire BF4 versions too, right?


----------



## hotrod717

Here is my good deed for the day. I am not in the market, but XFX 290 for $499.99. http://www.amazon.com/gp/offer-listing/B00H5ML9Z6/ref=dp_olp_0?ie=UTF8&condition=all


----------



## Forceman

Quote:


> Originally Posted by *hotrod717*
> 
> Here is my good dead for the day. I am not in the market, but XFX 290 $499.99. http://www.amazon.com/gp/offer-listing/B00H5ML9Z6/ref=dp_olp_0?ie=UTF8&condition=all


It's a pretty sucky situation when we are celebrating a card being available for $100 over list price. Wonder what AMD's yields are like - are there really that many miners in the world?


----------



## brazilianloser

Quote:


> Originally Posted by *Forceman*
> 
> It's a pretty sucky situation when we are celebrating a card being available for $100 over list price. Wonder what AMD's yields are like - are there really that many miners in the world?


No joke... glad to have purchased my two when they were still reasonably priced.


----------



## sugarhell

Quote:


> Originally Posted by *Forceman*
> 
> It's a pretty sucky situation when we are celebrating a card being available for $100 over list price. Wonder what AMD's yields are like - are there really that many miners in the world?


Some guys have over 60-70k kh/s. That means over 70 GPUs for each of them
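As a rough sanity check of that GPU count (the per-card rate below is an assumed ballpark for a tuned 290 at scrypt, not a figure from this thread):

```python
import math

# Assumed scrypt hashrate of one R9 290 in kH/s -- an illustrative
# ballpark, not something measured here.
KH_PER_CARD = 900.0

def cards_needed(total_kh: float, per_card: float = KH_PER_CARD) -> int:
    """Whole cards needed to reach a given aggregate hashrate."""
    return math.ceil(total_kh / per_card)

print(cards_needed(65_000))  # 73 cards for 65,000 kH/s (i.e. 65 MH/s)
```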


----------



## hotrod717

Quote:


> Originally Posted by *Forceman*
> 
> It's a pretty sucky situation when we are celebrating a card being available for $100 over list price. Wonder what AMD's yields are like - are there really that many miners in the world?


I agree. More of a humorous post than anything. Thankfully I got mine for $400. But when they are going for $550, $400 does seem good?


----------



## Forceman

Quote:


> Originally Posted by *hotrod717*
> 
> I agree. More of a humorist post than anything. Thankfully got mine for $400. But, when they are going for $550, $400 does seems good?


No kidding, to think I was upset that I paid $419 for mine.


----------



## Amhro

I guess it depends on the country, because the price of the 290 has increased by just 1€ since release here in Slovakia







(5€ for 280x)
Hopefully non-ref 290s will come with normal prices too and I'll grab one ASAP


----------



## Koniakki

Quote:


> Originally Posted by *Amhro*
> 
> I guess it depends on a country, because price of 290 increased just by 1€ since release here in slovakia
> 
> 
> 
> 
> 
> 
> 
> (5€ for 280x)
> Hopefully non-ref 290s will come with normal prices too and I'll grab one ASAP


Noooo.. Now everyone is going to start mining in Slovakia and drive the prices up there...


----------



## Amhro

Quote:


> Originally Posted by *Koniakki*
> 
> Noooo.. Now everyone is going to start mining in Slovakia and drive the prices up there...


Haha







Nah, I don't think so, since prices were higher than normal at release too.


----------



## hatlesschimp

Just wondering, can I overclock the HDMI 1.4 port to 4K at 60Hz? Could I drop the color to 8-bit 4:2:0? And how would I do this?
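For what it's worth, the raw numbers behind that question can be sketched as below. This counts active pixels only and ignores blanking intervals, so real requirements are somewhat higher, and whether the port and driver will actually accept such a mode is a separate matter:

```python
# HDMI 1.4 carries ~10.2 Gbit/s TMDS; after 8b/10b encoding that is
# roughly 8.16 Gbit/s of usable pixel data.
HDMI_1_4_DATA_GBPS = 8.16

def needed_gbps(width: int, height: int, hz: int, bits_per_pixel: int) -> float:
    """Raw pixel data rate in Gbit/s, ignoring blanking intervals."""
    return width * height * hz * bits_per_pixel / 1e9

full_444 = needed_gbps(3840, 2160, 60, 24)  # 8-bit 4:4:4 -> 24 bpp
sub_420  = needed_gbps(3840, 2160, 60, 12)  # 8-bit 4:2:0 -> 12 bpp

print(round(full_444, 2))            # 11.94 -- well over the 1.4 budget
print(round(sub_420, 2))             # 5.97
print(sub_420 < HDMI_1_4_DATA_GBPS)  # True: 4:2:0 fits the raw data rate
```

That halving is exactly why 4:2:0 is the trick people reach for to squeeze 4K60 through an HDMI 1.4 link.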


----------



## velocityx

I noticed that every day, when I switch on my PC and play some BF4, the first time I load it up I'll get a black screen within minutes, and then I can go play all day without one. But it's always that first load, consistently. I wonder what it is..


----------



## Slomo4shO

Quote:


> Originally Posted by *sugarhell*
> 
> Some guys have over 60-70k kh/s


I believe the term you are looking for is mh/s


----------



## sugarhell

Quote:


> Originally Posted by *Slomo4shO*
> 
> I believe the term you are looking for is mh/s


....

Fail


----------



## escapedmonk

LOL, yeah, my CPU does 60-70 kH/s


----------



## MrWhiteRX7

There are still e-tailers selling these cards at fair prices. I posted many pages back a link to one of them. Need to know how to scrounge the interwebs for dem deals


----------



## cplifj

Hi,

hmm, still having troubles with this R9 290X.

In Valley it benchmarks at around 2440 points; max framerate = 140, minimum = 26 FPS. I mean, ***. This is at Extreme HD, 1920x1080, 8xAA.

That's on an Intel i7-3770 system with 16GB RAM running stock speeds, XMP profile so that's 1600MHz, with a brand-new 650W Corsair HX650.

This thing just doesn't want to run smooth. Firestrike score = 9107, and it still drops with every new try I make with other drivers.

Asus was kind enough to update the BIOS via its GPU Tweak tool. Did it help? No.

It screams like hell running 3DMark tests, like it's being tortured, and all that for the measly price of 539 euros.

NEVER overclocked; hell, not when it's like this at stock.

SO, ANYBODY WANT TO BUY AN R9 290X, only tried out for 2 weeks??

(This happens pretty much every time I let myself get influenced by press goons: a new card that's supposed to rock, only never in my normal high-end system.)

That's what you get when buying in stores that build and sell their own PCs, I guess. Those guys take the good parts and sell the bad ones off the shelf to the "unsuspecting" customer.

BAH, the press did its worst, the salespeople do their worst, it's GREEEEEED all over. Merry Christmas, AMD.

Can I now have an R9 290X that does what it's supposed to do???

grtz


----------



## battleaxe

Quote:


> Originally Posted by *Roy360*
> 
> So no one one this thread has random black screens? Time for me to RMA my card?


I had a black screen yesterday, but I know it was because I ran my memory at 1500MHz without adding voltage. Since I set it back down to stock it's been fine. Try lowering the memory and test again. If it can't run stock memory settings, or even a slight OC without more volts, then yes, RMA the thing.


----------



## Slomo4shO

Quote:


> Originally Posted by *escapedmonk*
> 
> LOL yeah my CPU has 60-70kh's


He said 60-70k kh/s, not 60-70 kh/s...


----------



## sun100

Wait, so you guys are saying that the R9 290 jumped up quite high in price? I'm in Sweden, and on the online shop where I bought mine it's still at the price I paid a month ago, maybe even sinking. The 290X is still going for the same price, and I noticed some models even went up. A non-reference 290 from Sapphire with the new cooling costs nearly as much as a 290X though. *** is going on with prices? :S

Btw, has anyone managed to get the stock cooler off the reference card (Sapphire R9 290 in my case) and install an aftermarket one without any modifications, meaning just unscrew and screw?


----------



## Sazz

Just got my new 290X! Back on the 290X bandwagon after selling the 290 (which had replaced my first 290X) for 650 bucks LOL..

Testing with OCCT error checking at stock clocks before I start OCing, to see what the limit is at stock volts xD


----------



## pkrexer

Finally got around to benchmarking my 290X. I was getting slight artifacting @ 1180/1500, so I didn't go further. Might try running it again at 1200/1500 for the heck of it. I can go up to 1600+ on the memory, but anything past 1500 hurts my scores. This is all on the stock cooler running +100mV. I have little doubt that with better cooling and a little extra voltage I could get 1200 stable.

http://www.3dmark.com/fs/1374957


----------



## Forceman

Quote:


> Originally Posted by *sun100*
> 
> Btw, anyone finally managed to get stock cooler off the reference card ( Sapphire r9 290 in my case) and install aftermarket one without any modifications, meaning just unscrew and screw ?


The whole cooler comes off easily, just a bunch of screws on the back and a couple on the bracket end. The issue was people trying to separate the baseplate from the cooling fins, which requires major modifications.


----------



## Forceman

Quote:


> Originally Posted by *pkrexer*
> 
> Finally got around to benchmarking my 290x. I was getting slight artifacting @ 1180/1500, so didn't go further. Might try running it again at 1200/1500 for the heck of it. I can go up to 1600+ on the memory, but anything past 1500 hurts my scores. This is all on the stock cooler running +100mv. I have little doubt with better cooling and little extra voltage I could get 1200 stable.
> 
> http://www.3dmark.com/fs/1374957


Good score, you should post that over in the Firestrike Top 30 thread. Not enough 290X scores in there; the Nvidia guys are running the place.

Edit: but you'll have to re-run it with tessellation enabled.


----------



## Sazz

Just saw this score on Firestrike and I was wondering... how can you get a 21k graphics score on a single 290X? LN2 mode? It doesn't help that the clocks aren't showing up. xD


----------



## Arizonian

Quote:


> Originally Posted by *Forceman*
> 
> Good score, you should post that over in the Firestrike Top 30 thread. Not enough 290X scores in there; the Nvidia guys are running the place.


Just added this to the OP for easy reference links for owners and new OCN members with 290 and 290X GPUs.

*OCN Benchmark Links*

Got a good benchmark? Show off your GPU scores with pride and feel free to represent the 290 and 290X. Remember to read the first page and follow those rules when benching and submitting your scores so they can be counted as valid.









*Single GPU Firestrike Top 30*

*Top 30 3d Mark 13 Fire Strike Scores in Crossfire / SLI
*
*Firestrike Extreme Top 30*

*Top 30 3DMark11 Scores for Single/Dual/Tri/Quad*

*[OFFICIAL]--- Top 30 --- Unigine 'Valley' Benchmark 1.0*

*OCN GK110 vs. Hawaii Bench-off thread*


----------



## ImJJames

Quote:


> Originally Posted by *Forceman*
> 
> Good score, you should post that over in the Firestrike Top 30 thread. Not enough 290X scores in there; the Nvidia guys are running the place.


Tess modification is allowed on Firestrike? I thought only on 3DMark11.


----------



## Sazz

Does anyone know how to increase the OC limit in AB? The new 290X I got only OCs to 1110MHz on core (stock voltage), but my memory clock is already at AB's max of 1625 and it's still not even budging. Normally if it's somewhat unstable I get flickers of black screen, but there's not even a single sign of the memory clocks being unstable.

Edit: Never mind, finally got an error at 1625MHz on memory; turning the clocks down to see how high it can go without errors xD


----------



## Durvelle27

Quote:


> Originally Posted by *Sazz*
> 
> Anyone knows how to increase the OC limit on AB? the new 290X I got only OC's 1110Mhz on Core (Stock voltage) but my memory clocks is already at AB's max of 1625 and its still not even budging, normally if its sort of unstable I get flickers of black screens but not even a single sign of memory clocks being unstable.
> 
> Edit: Nevermind, finally got an error at 1625Mhz on memory, turning the clocks down to see how high it can go without errors xD


Have you increased the unofficial OC limits?
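For anyone else hunting for that switch: Afterburner's unofficial overclocking mode lives in MSIAfterburner.cfg. A sketch of the relevant entries as they appeared in Afterburner builds of that era (key names and mode values may differ in your version, so verify against your own config file):

```ini
; MSIAfterburner.cfg (in the Afterburner install folder); edit with AB closed.
[ATIADLHAL]
; The EULA line must be typed out exactly as Afterburner's docs specify.
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
; 0 = disabled, 1 = unofficial mode with PowerPlay, 2 = without PowerPlay
UnofficialOverclockingMode = 1
```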


----------



## ShortySmalls

FIRE STRIKE SCORE
18095 with AMD Radeon R9 290X(2x) and Intel Core i7-4770K

http://www.3dmark.com/fs/1383676

FIRE STRIKE EXTREME SCORE
10228 with AMD Radeon R9 290X(2x) and Intel Core i7-4770K

http://www.3dmark.com/fs/1383424

Stock reference coolers. I wanted to see my scores before going for extreme OCs with my waterblocks installed; I just got them in today!


----------



## Sazz

Quote:


> Originally Posted by *Durvelle27*
> 
> Have you increased the unofficial OC limits?


Already checked that box, nothing happened. Anyway, 1500 is stable so far on memory; at 1625 I didn't get an error until over an hour of testing.


----------



## VSG

Quote:


> Originally Posted by *ShortySmalls*
> 
> FIRE STRIKE SCORE
> 18095 with AMD Radeon R9 290X(2x) and Intel Core i7-4770K
> 
> http://www.3dmark.com/fs/1383676
> 
> FIRE STRIKE EXTREME SCORE
> 10228 with AMD Radeon R9 290X(2x) and Intel Core i7-4770K
> 
> http://www.3dmark.com/fs/1383424
> 
> Stock Reference Coolers, i wanted to see my scores before i went extreme oc's with my waterblocks installed, i just got them in today!


What was the overclock you got on those?

Edit: NVM, I see those on the link. 1215 core on the stock coolers is pretty amazing! Does it have to be a valid driver for 3dmark to display the overclocked numbers? When I was testing out, it only displayed the default clocks.


----------



## Forceman

Quote:


> Originally Posted by *ImJJames*
> 
> tess modification is allowed on firestrike? I thought only on 3dmark11


No, that's my bad - I was looking on my phone and didn't see the tessellation warning.


----------



## Durvelle27

Quote:


> Originally Posted by *Sazz*
> 
> already checked that box, nothing happened. Anyways 1500 is stable so far on memory. at 1625 I didn't get an error til over 1hr of testing it.


I think that's the max then.


----------



## Bartouille

R9 290 at $419 on dell.ca... $480 shipped with tax. I'm tempted, but at the same time my 7950 does a fantastic job even at 1440p.


----------



## Mr357

Quote:


> Originally Posted by *Bartouille*
> 
> R9 290 at 419$ on dell.ca... 480$ shipped w/ tax. I feel tempted but at the same time my 7950 does a fantastic job even at 1440p.


Just hold out for Pirate Islands/Maxwell then


----------



## Ukkooh

Is the reference design limiting overclocks? My 290X is undergoing an RMA and I have a reference WC block ordered. I heard EK is making an Asus DirectCU block and wondered if I should ask for a refund, cancel the block order, and get a DirectCU card instead?


----------



## sf101

Quote:


> Originally Posted by *ShortySmalls*
> 
> FIRE STRIKE SCORE
> 18095 with AMD Radeon R9 290X(2x) and Intel Core i7-4770K
> 
> http://www.3dmark.com/fs/1383676
> 
> FIRE STRIKE EXTREME SCORE
> 10228 with AMD Radeon R9 290X(2x) and Intel Core i7-4770K
> 
> http://www.3dmark.com/fs/1383424
> 
> Stock Reference Coolers, i wanted to see my scores before i went extreme oc's with my waterblocks installed, i just got them in today!


To be honest I gained nothing with my waterblock; in fact, for some reason I started getting more issues if anything, although all the temp readings say it's cool as heck.

So all I really gained was cooler temps and less noise, but nothing as far as OC headroom goes.


----------



## kcuestag

Update on my black screen (or signal loss) issue: after removing ALL my sleeved extensions from the power supply and using just the original cables, I haven't had any issues for the past 24 hours.

I don't want to get my hopes up, but so far it looks good. It appears it may be an extension not making proper contact and causing a short; I don't know which one, and I can't be bothered to try them all one by one, so it's staying without sleeved extensions for a good while to see if the issue comes back.


----------



## kizwan

Quote:


> Originally Posted by *kcuestag*
> 
> Update on my black screen (Or signal loss) issue. After removing ALL my sleeved extensions from the power supply, and using just the original cables, I haven't got any issues for the past 24 hours.
> 
> I don't want to get my hopes up, but so far it looks good, and it appears it maybe an extensions not working properly and causing a short, I don't know which, and can't be bothered to try them all one by one, so it's staying with no sleeving extensions for a good while to see if issue appears again.


This is why I don't like using custom cables. Did you sleeve them yourself or buy them?


----------



## kcuestag

Quote:


> Originally Posted by *kizwan*
> 
> This is why I don't like using custom cable. Did you sleeved yourself or buy?


I bought them; they're BitFenix Alchemy white sleeved extensions. They've always worked fine for me, so maybe I broke one while messing around with the cables when installing my first 290X.


----------



## Arizonian

Quote:


> Originally Posted by *kcuestag*
> 
> I bought them, they're Bitfenix (Alchemy) white sleeved extensions. They've always worked fine for me, maybe I broke one while messing around with cables when installing my first 290X.


I have the same extensions for my PSU and had a similar issue with a 690 when I benched. It turned out one of them was barely making contact and had somehow pulled out of one of the connectors a tad. I only noticed because the white part of the sleeving was torn and I could see the black wire underneath.

On a side note, I'm hoping the MSI 290X Lightning or Galaxy 290X HOF turn out to be powerhouses and my AX850 can handle it. Guessing mid-January.

Any word on the Mantle release for BF4 yet? We're at the end of December now and I thought it would have brought at least some reviews.


----------



## MrWhiteRX7

They said it is pushed back


----------



## kcuestag

Quote:


> Originally Posted by *Arizonian*
> 
> I have the same extensions for my PSU and had a similar issue with a 690 when I benched. It turned out one of them was barely making contact and had somehow pulled out of one of the connectors a tad. I only noticed because the white part of the sleeving was torn and I could see the black wire underneath.
> 
> On a side note, I'm hoping the MSI 290X Lightning or Galaxy 290X HOF turn out to be powerhouses and my AX850 can handle it. Guessing mid-January.
> 
> Any word on the Mantle release for BF4 yet? We're at the end of December now and I thought it would have brought at least some reviews.


DICE promised Mantle for mid December, but that has already passed and I don't have any hope of it coming this year...


----------



## velocityx

Andersson from the Frostbite team said late December, which for me means midnight Dec 31. Nobody said it's pushed back, jesus, stop spreading misinformation. Hate that about the interwebz..


----------



## Sgt Bilko

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> They said it is pushed back


Where did they say that?
Quote:


> Originally Posted by *kcuestag*
> 
> DICE promised Mantle for mid December but that has already passed and I don't have any hopes of it coming this year...


The only time I've ever seen it written down anywhere was "late December", and that means they have until the 31st to bring it out.......


----------



## MrWhiteRX7

Quote:


> Originally Posted by *velocityx*
> 
> Andersson from the Frostbite team said late December, which for me means midnight Dec 31. Nobody said it's pushed back, jesus, stop spreading misinformation. Hate that about the interwebz..


They could be lying... but I did not make anything up. Go do cartwheels in traffic.

http://www.hardwarecanucks.com/news/video/have-battlefield-4s-woes-delayed-amds-mantle-roll-out/


----------



## MrWhiteRX7

You really think they're going to work through the holidays to get us Mantle within the next week, when we've heard nothing more about it at all and with all the BS going on with EA and BF4 in general? I highly doubt it, and then there's the article I just posted. I want it released as badly as anyone; hell, they OWE it to us by now with all the damn hype and hoopla.


----------



## velocityx

Mantle has nothing to do with Battlefield 4; it's a project of AMD and the Frostbite studio, which is not part of DICE per se. So even though projects might have been halted, Mantle is not a DICE project but the Frostbite team's, so the whole piece from Hardware Canucks is just pure speculation. The only person at Frostbite speaking about Mantle is Johan Andersson, aka repi, so if he comes out and says "sorry guys, delay" then it's a delay; anything else, I wouldn't trust.


----------



## MrWhiteRX7

At THIS current time Mantle technically has everything to do with BF4; that's the game we're waiting on it to be implemented in! LOL

Johan said at the beginning of December that it wasn't even close. I'll find the interview to show again that I'm not just talking out my arse here.


----------



## Ukkooh

Quote:


> Originally Posted by *Ukkooh*
> 
> Is the reference design limiting overclocks? My 290X is undergoing an RMA and I have a reference WC block ordered. I heard EK is making an Asus DirectCU block and wondered if I should ask for a refund, cancel the block order, and get a DirectCU card instead?


bump


----------



## MrWhiteRX7

Silicon lottery, buddy. The reference design is not the reason; plenty of reference cards are clocking very, very well, even on air. You may just not have lucked out.


----------



## BradleyW

I would wait another year before Mantle is implemented into anything at a viable level.


----------



## Forceman

Quote:


> Originally Posted by *Ukkooh*
> 
> bump


Impossible to say until custom cards come out, but it appears the limiting factor is going to be the chip itself, not the PCB. Plenty of people are testing on water, and some with modded BIOSes, so it doesn't look like temps or power are holding the cards back much (unless you want crazy high voltages).


----------



## Slomo4shO

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> They could be lying... but I did not make anything up. Go do cartwheels in traffic.
> 
> http://www.hardwarecanucks.com/news/video/have-battlefield-4s-woes-delayed-amds-mantle-roll-out/


It is amazing how hearsay passes as journalism these days...


----------



## hatlesschimp

Just purchased my 3rd Powercolor 290x.


----------



## HardwareDecoder

Quote:


> Originally Posted by *hatlesschimp*
> 
> Just purchased my 3rd Powercolor 290x.


How is the microstutter with two of them? Is GPU usage good on both cards in most games?


----------



## Derpinheimer

Quote:


> Originally Posted by *Ukkooh*
> 
> bump


No, the R9 290(X) is barely temp limited, unlike the 79XX. Low temps barely help core overclocking.


----------



## hatlesschimp

I'm running three 1080p 144Hz monitors in portrait and a 4K Sony TV, although the Sony 4K is out in the living room now because my missus had a cry and demanded it. I'm considering buying another one, or more likely getting the new 2014 version, or another brand if it has DP 1.2. I wasn't getting the frame rates I would like on ultra. I run MSAA at 2x; 4x does nothing, but you can see the difference from off to 2x.

Hard to believe this is a 65" TV lol. It makes multitasking fun!


----------



## BradleyW

Do you get any screen tearing on those 144Hz screens?


----------



## hatlesschimp

A tiny bit, not as bad as on other monitors.


----------



## BradleyW

Quote:


> Originally Posted by *hatlesschimp*
> 
> a tiny bit not as bad as other monitors.


Do you notice less tearing with just two cards active as opposed to three?


----------



## Maracus

Never really thought about the vdroop on these cards until I actually had MSI Afterburner monitoring running during Firestrike. It started at 1.313V and dropped as low as 1.203V. I can't get past 1215/1500; at 1225 it would start to artifact during the vdroop. Oh well, I guess even with water cooling it isn't going to go any higher, seeing as I had the fan at 100% and the temps were fine. Anyhow, still a good card, just not a great overclocker.
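Those numbers put the droop in perspective. A quick sketch of the arithmetic, using the values from the run above:

```python
# Vdroop seen in the Firestrike run above: set 1.313 V, observed low 1.203 V.
v_set = 1.313
v_min = 1.203

droop = v_set - v_min            # absolute sag under load
droop_pct = droop / v_set * 100  # sag relative to the set voltage

print(f"droop: {droop:.3f} V ({droop_pct:.1f}% of the set voltage)")
```

That is roughly an 8% sag, which is why a setting that looks stable at idle can artifact once the card loads up.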


----------



## VSG

So I asked this earlier but it ended up lost as an edit to an old post: does 3DMark not recognize overclocked settings if you are on an invalid (beta) driver? My score pages only show the default clocks.


----------



## devilhead

Tried some OC on my crappy unlocked XFX 290:

http://www.3dmark.com/3dm/1985395


----------



## BradleyW

What's the average OC for 290Xs on water? And what's the highest safe voltage for the core and VRAM?


----------



## Forceman

Quote:


> Originally Posted by *BradleyW*
> 
> What's the average OC on 290x's on water? What is the highest safe voltage for core and vram?


You can't change the memory voltage, and no one really knows the safe core voltage yet but the +100mV you can set with Afterburner seems like it should be plenty safe. I'd reckon the limit on GPU Tweak is probably safe as well, considering the Vdroop, but I'd be careful messing with modded BIOSes at this point. Average overclock is in the 1150-1200 range on water - water helps keep the temps down, but it doesn't seem to help overclocking all that much.


----------



## BradleyW

Quote:


> Originally Posted by *Forceman*
> 
> You can't change the memory voltage, no one really knows the safe core voltage yet but the +100mV you can set with Afterburner seems like it should be plenty safe. I'd reckon the limit on GPU Tweak is probably safe as well, considering the Vdroop, but I'd be careful messing with modded BIOSes at this point. Average overclock is in the 1150-1200 range on water - water helps keep the temps down, but it doesn't seem to help overclocking all that much.


Thank you.


----------



## Mr357

Quote:


> Originally Posted by *BradleyW*
> 
> What's the average OC on 290x's on water? What is the highest safe voltage for core and vram?


I would guess just over 1200MHz on water. As for core voltage, I was told 1.35V for 24/7 use, so I wouldn't ever go above 1.4V without phase change/DICE/LN2.


----------



## Derpinheimer

Quote:


> Originally Posted by *Derpinheimer*
> 
> Prime doesnt stress the 12v line.
> 
> Where did the first numbers come from? You say it drops to 11.5v in some reading, and HWInfo says 11.926v
> 
> GPUz and HWInfo are about the same for me:


Just FYI, the multimeter shows a value ~0.4V higher, whether it's idle or load. So, a minimum of just over 11.4V. Originally it wasn't going lower than 11.6V, but then I used a 2x Molex-to-6-pin and tested the Molex line now that it was loaded, and it dropped another 0.2V.

Still think I need to RMA it.
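For reference, the ATX12V spec allows ±5% on the +12V rail, which puts ~11.4V right at the edge of the band. A quick check, assuming that ±5% tolerance:

```python
# ATX12V spec: the +12 V rail must stay within +/-5% (11.40 V to 12.60 V).
NOMINAL = 12.0
TOLERANCE = 0.05

def in_spec(volts, nominal=NOMINAL, tol=TOLERANCE):
    """True if a rail reading falls inside the ATX tolerance band."""
    return nominal * (1 - tol) <= volts <= nominal * (1 + tol)

for reading in (11.926, 11.6, 11.4, 11.2):
    print(reading, "OK" if in_spec(reading) else "out of spec")
```

So a reading that dips below ~11.4V under load is a reasonable trigger for an RMA.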


----------



## Zaid

There should be a section in the first table for max core/mem OC; that info would be very useful for overclocking.


----------



## Sgt Bilko

Alrighty then, my RMA has gone through now, so I need some opinions: should I put it all toward a reference 290X ($699), one 290 for $519, or just wait until the non-reference cards come in and decide then?


----------



## VSG

Quote:


> Originally Posted by *Zaid*
> 
> There should be a section in the first table to add max OC for core/mem. this info would be very useful for overclocking.


You can't really specify that though, since each card is unique in its overclocking capacity. What can be done is to maintain a database for each user to input his/her max stable OC values and there is another thread in this subforum where that would suit better.


----------



## Durquavian

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Alrighty then. My RMA has gone through now so i need some opinions, Should i use it all on a Ref 290x ($699) 1 x 290 for $519 or just wait till the non-ref cards come in and decide then?


Non-reference is soooo close that waiting may be the best option.


----------



## Arizonian

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Alrighty then. My RMA has gone through now so i need some opinions, Should i use it all on a Ref 290x ($699) 1 x 290 for $519 or just wait till the non-ref cards come in and decide then?


If you're like me, you might be hard pressed to wait until sometime in January for a non-reference 290X.

Count on them being priced higher than the $699 reference-cooled 290X, at roughly $749 or more, depending on how overbuilt the non-reference designs are.


----------



## Durquavian

I'm hoping this mining craze dies down a bit by next year so I can actually afford one at its original price.


----------



## kizwan

Quote:


> Originally Posted by *kcuestag*
> 
> I bought them, they're Bitfenix (Alchemy) white sleeved extensions. They've always worked fine for me, maybe I broke one while messing around with cables when installing my first 290X.


Quote:


> Originally Posted by *Arizonian*
> 
> I have the same extensions for my PSU and had a similar issue with a 690 when I benched. It turned out one of them was barely making contact and had somehow pulled out of one of the connectors a tad. I only noticed because the white part of the sleeving was torn and I could see the black wire underneath.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> On a side note, I'm hoping the MSI 290X Lightning or Galaxy 290X HOF turn out to be powerhouses and my AX850 can handle it. Guessing mid-January.
> 
> Any word on the Mantle release for BF4 yet? We're at the end of December now and I thought it would have brought at least some reviews.


@kcuestag ^^This could explain your problem. Check your sleeved extensions, especially the PCIe power connectors, for bad contact.

Thankfully nothing melted or burned. I've seen a lot of things melt in my life, and in all cases my computers survived. One time my computer took a power surge from lightning; that was a scary one. I could see sparks inside the case. Only the PSU burned though.
Quote:


> Originally Posted by *Ukkooh*
> 
> Is the reference design limiting overclocks? MY 290x is undergoing and RMA and I have a reference wc block ordered. I heard that EK is making a Asus direct cu block and wondered if I should ask for a refund, cancel the block order and get a direct cu card?


It depends on the quality of the components they use in the non-reference designs, especially the VRM. So far the only limit I see on the reference design is temps. Also, it seems water cooling doesn't increase overclocking capability: whatever max clocks you get on air will be the max on water too, maybe a little more stable than on air. I don't see non-reference designs changing any of that, other than running much cooler.

With the reference design, water cooling them is a good idea. The fans at 50% with two cards still run quieter than my reference 5870 at the same speed. However, I need to run the fans at at least 70-80% to keep temps below 90C, since ambient here is pretty high (32-34C indoors), and 70-80% with two cards is really, really loud. Running the fans at default speeds isn't bad though; temps max out at 94C and games still run smooth.

BTW, based on track record (ASUS, for example), are DirectCU cards overall better than reference cards?


----------



## psyside

Quote:


> Originally Posted by *kizwan*
> 
> BTW, based on track record, for example ASUS, does direct cu cards overall better than reference cards?


Most of the time yes, but this time the reference cards have superior MOSFETs. Asus has superior inductors, so it's a mixed result. If you use a custom fan profile (~60%) and don't push more than +100mV, you won't be limited by the reference card.

You will be limited by the core 99 times out of 100, not the card's VRMs, which are more than decent for ~1200/1600, but generally not higher on air.


----------



## Frontside

Hello OCN. Does anyone know if Asus puts "Warranty Void" stickers on their R9 290X cards?


----------



## tsm106

Quote:


> Originally Posted by *psyside*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> BTW, based on track record, for example ASUS, does direct cu cards overall better than reference cards?
> 
> 
> 
> Most of the time yes, but this time the reference cards have superior MOSFETs. Asus has superior inductors, so it's a mixed result. If you use a custom fan profile (~60%) and don't push more than +100mV, you won't be limited by the reference card.
> 
> You will be limited by the core 99 times out of 100, not the card's VRMs, which are more than decent for ~1200/1600, but generally not higher on air.
Click to expand...

Where do you come up with this fiction?


----------



## Tobiman

Quote:


> Originally Posted by *Arizonian*
> 
> I have the same extensions for my PSU and had a similar issue with a 690 when I benched. It turned out one of them was barely making contact and had somehow pulled out of one of the connectors a tad. I only noticed because the white part of the sleeving was torn and I could see the black wire underneath.
> 
> On a side note, I'm hoping the MSI 290X Lightning or Galaxy 290X HOF turn out to be powerhouses and my AX850 can handle it. Guessing mid-January.
> 
> Any word on the Mantle release for BF4 yet? We're at the end of December now and I thought it would have brought at least some reviews.


Galaxy doesn't make AMD cards, IIRC!


----------



## HardwareDecoder

reference cards have better flux capacitors!


----------



## Tobiman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Alrighty then. My RMA has gone through now so i need some opinions, Should i use it all on a Ref 290x ($699) 1 x 290 for $519 or just wait till the non-ref cards come in and decide then?


Gigabyte windforce 3X r9 290X on newegg. GOGOGOGOGO!


----------



## VSG

Quote:


> Originally Posted by *Frontside*
> 
> Hello OCN. Does anyone know if Asus have "Warranty Void" stickers on their R9 290X cards


Not one that says "warranty void", but there is a sticker on one of the screws on the back that is almost impossible to take off without tearing.


----------



## DeadlyDNA

I am sad tonight. I think I was bad this year and Santa gave me more than coal....

While gaming I apparently developed a leak somewhere in my setup and the system shut down. Now I'm just hoping nothing is damaged. I've been taking the rig apart piece by piece; so far I've found fluid on the mobo, the PSU, and all 4 GPUs....
I was holding off on adding radiators and waterblocking the 4th card. I just hope the PSU did its job and shut down before things died lol.


----------



## VSG

Wow, what happened? Pressure build up from hot coolant?


----------



## ShortySmalls

ALL I DO IS WIN!!!!

#1 Firestrike Extreme score now among all 4770K and 2-way 290X crossfire owners!

10386 baby

http://www.3dmark.com/fs/1388657


----------



## sebhitman

Call me stupid, but isn't the liquid in liquid cooling non-conductive?


----------



## HardwareDecoder

Quote:


> Originally Posted by *sebhitman*
> 
> call me stupid but isn't the liquid of liquid cooling non conductive?


Some of the "coolants" they sell might be, but a lot of people, including me, just use distilled water.

Edit: a little googling says that since so much is removed from distilled water, it isn't electrically conductive anymore. Heh, you learn something all the time; makes me feel safer about my loop.


----------



## VSG

Quote:


> Originally Posted by *sebhitman*
> 
> call me stupid but isn't the liquid of liquid cooling non conductive?


It's usually water by itself or with an additive so it is very much conductive.


----------



## Arizonian

Quote:


> Originally Posted by *Tobiman*
> 
> Galaxy don't make AMD cards, IIRC!



Thanks, been a while.


----------



## HardwareDecoder

Quote:


> Originally Posted by *geggeg*
> 
> It's usually water by itself or with an additive so it is very much conductive.


Apparently distilled water is NOT conductive though.

So it seems just using distilled water is proving, time and time again, to be better than pre-mixed "coolants".


----------



## VSG

Distilled water picks up ions though, so it gets conductive.


----------



## ShortySmalls

Quote:


> Originally Posted by *HardwareDecoder*
> 
> apparently distilled water is NOT conductive though.
> 
> So it seems just using distilled water is proving time and time again to be better than pre-mixed "coolants"


Perfectly distilled water is non-conductive, but as soon as any dust or particles mix into it, it becomes conductive again.


----------



## tsm106

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I am sad tonight I think I was bad this year and santa gave me more than coal....
> 
> While gaming I had apparently developed a leak somewhere in my setup and system shutdown.
> Now I am just hoping that nothing is damaged. I have been taking apart the rig piece by piece. So far I found fluid in mobo, psu, gpus all 4 ....
> I was holding off on adding radiators and waterblocking 4th card. I just hope the psu did its job and shutdown before things died lol.


Teardown time. Grab some CRC electrical cleaner and spray it on the die; it pushes out any water still stuck under it. You can do that for any other die areas too: CPU, MB, etc. Remember, any liquid that starts out non-conductive doesn't stay that way very long inside a loop, so make sure everything is dry and free of deposits. That flaky white residue can be conductive too, hence the electrical cleaner. In disasters like this, hope the MB died first and didn't take everything else with it.


----------



## HardwareDecoder

Quote:


> Originally Posted by *geggeg*
> 
> Distilled water picks up ions though, so it gets conductive.


You mean like it leaches them from the metals in the loop?


----------



## VSG

Quote:


> Originally Posted by *HardwareDecoder*
> 
> you mean like leeches them from the metals in the loop?


That, and dust mostly are the sources. Either way, I am really interested in what happened here.


----------



## GOTFrog

never mind got my answer


----------



## tsm106

Quote:


> Originally Posted by *geggeg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HardwareDecoder*
> 
> you mean like leeches them from the metals in the loop?
> 
> 
> 
> That, and dust mostly are the sources. Either way, I am really interested in what happened here.
Click to expand...

Tube prolly popped off up high and the good ole pump pumped till it ran dry. Shakes head... makes me cringe thinking about leak disasters, knock on wood!


----------



## DeadlyDNA

Yep, I was using distilled with an additive. I think I developed a leak at the CPU block. However, I just noticed that on my EK blocks the acrylic (or whatever it is) is not flush with the copper block. Right now I'm trying to make sure those didn't leak as well; one card was spared so far, it was just wet on the waterblock. I have a good feeling the equipment is okay; it did POST again after the shutdown, before I realized what had happened and saw the liquid!!

Still going at it :-/


----------



## HardwareDecoder

I don't see how a lot of dust ends up in the actual loop. I mean, it's a closed system except when you open it up. The only time mine was open was when I was getting the air out and left the res cap open to let the air bubbles escape.

So any dust in it would have had to fall directly into the hole. I don't think a few dust particles would make water conductive; obviously it would if you got enough crap in it. Also, dust is made up mostly of dead skin cells... I don't think dead skin cells conduct electricity very well.
Quote:


> Originally Posted by *DeadlyDNA*
> 
> Yep I was using distilled with additive I think I developed a leak at the cpu block. However I just noticed on my ek blocks the acrylic or whatever is not flush with copper block. Right now I'm trying to make sure they didn't leak as well. one card was spared so far. it was just wet on the wblock. I have a good feeling the equipment is okay it did post again after shuttdown when I realized after I saw liquid!!
> 
> Still going at it :-/


Put anything you can in a bag with plain old white rice; it's amazing at sucking up water. What you don't want to do right now is start powering on stuff that might have water in it, dude.... I know it sucks and you want to test things out, but you really need to give everything 2-3 days in bags full of rice.

I had a friend drop a phone in the sink or something at my house, and he refused to listen to me about not powering it back on. Needless to say it never worked again, when if he had just let it chill in rice it *might* have worked in a few days.


----------



## VSG

Quote:


> Originally Posted by *tsm106*
> 
> Tube prolly popped up high and the good ole pump, pumped till it went dry. Shakes head... makes me cringe thinking about leak disasters, knock on wood!


Ya, just wondering if the coolant got overheated somehow and built up pressure in the loop. Anyway, this is getting into water-cooling territory now, so I'll just wish him good luck; hopefully the components still work.


----------



## DeadlyDNA

Oh, I'm not powering anything on; it's all torn apart. I just powered it on after it shut down initially, thinking I had crashed. While it was POSTing I looked down and saw liquid pooled on top of the PSU..

I'm tearing it all down now, and two of my EK blocks so far have the same strange gap between the acrylic and the copper. I'll post a pic soon if I can; it's hard to see.

Also, I'll refrain from posting a lot here since this is the GPU owners' thread and not a WC disaster zone.


----------



## UNOE

Quote:


> Originally Posted by *HardwareDecoder*
> 
> I don't see how a lot of dust ends up in the actual loop? I mean it's a closed system except when you open it up. The only time mine was open was when I was getting the air out and left the res cap open to let the air bubbles escape.
> 
> So any dust that would be in it would have had to fall directly in to the hole then. I don't think a few dust particles would be able to make water conductive. I mean obviously if you got enough crap in it. Also dust is made up mostly of dead skin cells.... I don't think dead skin cells are going to conduct electricity very well.
> Put anything you can in a bag with plain old white rice. It's amazing at sucking up water. what you don't want to do right now is start powering stuff on that might have water in it dude.... I know it sucks and you want to test stuff out but you really need to give stuff 2-3 days in bags full of rice.
> 
> I had a friend drop a phone in the sink or something at my house and he refused to listen to me about not powering it back on, needless to say it never worked again when if he had just let it chill in rice it *might* have worked in a few days.


As soon as it touches air, or even the motherboard or other parts, it will collect dust. No one's case is that dust free, unless it's all new parts.

Is this CRC electrical cleaner a good thing to have lying around? Can you spray a PCB with it?


----------



## VSG

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Oh I am not powering anything on its all torn apart. I just powered it on after it shutting down initially thinking I had crashed. While posting I looked down and saw liquid pooled on psu top..
> 
> I am tearing it all down now and two of my ek blocks so far have the same strange gap between acrylic and copper. I will post a pic soon if I can its hard to see.
> 
> also ill refrain from posting alot on here since its gpu owners thread and not wc disaster zone


Make a thread in the watercooling forum, I am sure others will be interested as well.


----------



## tsm106

Quote:


> Originally Posted by *UNOE*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *HardwareDecoder*
> 
> I don't see how a lot of dust ends up in the actual loop? I mean it's a closed system except when you open it up. The only time mine was open was when I was getting the air out and left the res cap open to let the air bubbles escape.
> 
> So any dust that would be in it would have had to fall directly in to the hole then. I don't think a few dust particles would be able to make water conductive. I mean obviously if you got enough crap in it. Also dust is made up mostly of dead skin cells.... I don't think dead skin cells are going to conduct electricity very well.
> Put anything you can in a bag with plain old white rice. It's amazing at sucking up water. what you don't want to do right now is start powering stuff on that might have water in it dude.... I know it sucks and you want to test stuff out but you really need to give stuff 2-3 days in bags full of rice.
> 
> I had a friend drop a phone in the sink or something at my house and he refused to listen to me about not powering it back on, needless to say it never worked again when if he had just let it chill in rice it *might* have worked in a few days.
> 
> 
> 
> 
> 
> 
> 
> Soon as it touches air or even the motherboard or other parts it will collect dust. No ones case is that dust free, unless its all new parts.
> 
> Is this CRC Electrical cleaner a good thing to have laying around ? Can you spray a PCB with it ?
Click to expand...

LOL.

Anyways, I always have the stuff lying around. I use it whenever I clean and it's safe for all electronics, but that is the gentler stuff. There are different grades, and the stronger grades eat plastics, which is bad juju on PCBs. I use the CRC QD cleaner for PC stuff as it is plastics-safe. Spray it on, blast it off with some air. Read the can, it tells you how to clean with it too.


----------



## Sazz

I had a similar experience before where I accidentally spilled liquid on my mobo and the system shut down. Dismantled it and used some air pressure to blow all the water out of the crevices. Took me a couple of hours to do so, and everything worked again.

Normally all watercooling parts on the market are pressure tested way beyond anything a normal watercooling system would ever produce, but other things factor in, like the temperature difference between ambient and the system; sudden heating/cooling tends to build pressure inside. That's why I decided to use a pressure valve on my loop (on my reservoir), so if there's too much pressure building up inside, it will be automatically released. And I do use acrylic tubing, so that's a little more reassuring on my end; it won't be so easy to break it just with the pressure the loop will ever produce.

Which comes down to the waterblocks and connections (fittings): normally you would want to replace the O-rings every time you disassemble the blocks and remove fittings, but I tend to use them 3 times before I decide to replace them.


----------



## yawa

Man, I know it's been said, but I cannot even begin to tell you the frustration of getting exactly enough money together to start your new build, only to watch a main component's price shoot wildly out of range.









I was hoping to have everything ready to go (but the A10 7850K, for obvious release date reasons) by the beginning of January, and this price gouging shows no signs of letting up.

So incredibly frustrating.


----------



## HardwareDecoder

Quote:


> Originally Posted by *UNOE*
> 
> Soon as it touches air or even the motherboard or other parts it will collect dust. No ones case is that dust free, unless its all new parts.
> 
> Is this CRC Electrical cleaner a good thing to have laying around ? Can you spray a PCB with it ?


At what point does water actually ever come into contact with anything besides the blocks/tubing/rads? It's a closed system until you open a cap on a res or something, right? I mean, yeah, dust accumulates on the outside of the loop/radiator etc.

So if you have no dust accumulation *IN* any of those components, how does any noticeable level of dust get into the loop? I'm not talking about a particle here or there.

I mean, my loop probably has a black Labrador dog hair in it; 1 hair or a few dust particles isn't going to change the electrical conductivity of the entire loop.


----------



## VSG

You would be surprised, they don't call this an open loop for nothing. I have a conductivity meter in my lab that tells me the conductivity of the DI water I use, and it varies a lot between inside the DI source and just outside in a beaker.
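
For anyone curious why a tiny bit of exposure matters so much: resistivity is just the reciprocal of conductivity (ρ = 1/σ), so even a little dissolved junk collapses the resistivity fast. A quick sketch - the sample values are ballpark textbook figures, not readings from my meter:

```python
# Resistivity is the reciprocal of conductivity: rho = 1 / sigma.
# With conductivity in uS/cm, 1 / (value in uS/cm) comes out in MOhm-cm,
# since 1 uS/cm = 1e-6 S/cm and 1 / (S/cm) = Ohm-cm.

def resistivity_megaohm_cm(conductivity_us_per_cm: float) -> float:
    """Convert conductivity (uS/cm) to resistivity (MOhm-cm)."""
    return 1.0 / conductivity_us_per_cm

# Ballpark figures (not measurements): ultrapure DI water ~0.055 uS/cm,
# DI water after sitting in an open beaker maybe ~1 uS/cm,
# tap water anywhere from ~50 to ~800 uS/cm.
samples = {"fresh DI": 0.055, "DI in open beaker": 1.0, "tap water": 500.0}
for name, sigma in samples.items():
    print(f"{name}: {sigma} uS/cm -> {resistivity_megaohm_cm(sigma):.3f} MOhm-cm")
```

Fresh ultrapure water works out to about 18.2 MOhm-cm; once it's been sitting in open air it can drop a couple of orders of magnitude, which matches what the meter shows.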


----------



## UNOE

Quote:


> Originally Posted by *HardwareDecoder*
> 
> At what point does water actually ever come in to contact with anything besides the blocks/tubing/rads? It's a closed system until you open a cap on a res or something right? I mean yea dust accumulates on the outside of the loop/radiator etc.
> 
> So if you have no dust accumulation *IN* any of those components how does any noticeable level of dust get in to the loop? I'm not talking about a particle here or there.
> 
> I mean my loop probably has a black Labrador dog hair in it, 1 hair or a few dust particles isn't going to change the electrical conductivity of the entire loop.


Well, it's not an issue when it's in the loop. It's only an issue when it is outside on your motherboard or GPUs. So what is your point? I don't think anyone really cares whether it's non-conductive inside a closed loop.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Tobiman*
> 
> Gigabyte windforce 3X r9 290X on newegg. GOGOGOGOGO!


I'd rather not use Gigabyte due to their history of volt-locking their cards.
Quote:


> Originally Posted by *Durquavian*
> 
> Non-reference is soooo close that may be waiting is the best option.


Yeah, the DCUII and Tri-X versions have had some good reviews
Quote:


> Originally Posted by *Arizonian*
> 
> If your like me, you might be hard pressed to wait till sometime in January for non-reference 290X.
> 
> Count on them being higher priced than aftermarket reference cooled 290X $699 at roughly $749 or higher depending on how over built non-reference designs are.


Higher priced of course, but the part I'm struggling with is whether the 290X is really worth the extra... it is, but it isn't... get what I mean?









Argh, I think I'll grab a couple of non-ref 290's then, best price/perf it seems.

Oh, and I need a replacement card before the 11th of Jan, so hopefully they release them soon.


----------



## DeadlyDNA

So here's a quick snap of one of the 290X water blocks after my leak. Notice the water under the acrylic, on the outside of the sealing gasket. Also, I'll get another snap of it sideways, but there is a gap between the acrylic and the copper base that slowly widens from one side of the block to the other.

Weird, all 4 of my cards were pretty much dry, except the top two got a little wet spot near the PCIe and VRMs on the topside (backside) of the PCB. However, all of the water blocks were slightly wet, as seen in the pic. I am perplexed at the moment about what exactly I have here.


----------



## sebhitman

Yikes, I really wouldn't know where to start in ''troubleshooting'' / avoiding shorting something / fixing / cleaning up my system.

Let's hope for the best!
I hope there's no damage caused to anything.


----------



## ShortySmalls

Quote:


> Originally Posted by *DeadlyDNA*
> 
> So quick snap of one of the 290x water blocks after my leak. Notice the water under the acrylic on the outside of the sealing gasket. Also, ill get another snap of it sideways but there is a gap that slowly starts from one side of the block to the other between the acrylic and the copper base.
> 
> Wierd, all 4 of my cards were pretty much dry, except the top two got a a little wet spot near the pcie and vrms on the topside(backside) of the pcb. However all of the water blocks were wet slightly as seen in the pic. I am perplexed at the moment what exactly i have here.


I just installed 2 of those exact blocks lol... I'll be watching them like a hawk for a few days at least.


----------



## Sazz

Quote:


> Originally Posted by *DeadlyDNA*
> 
> So quick snap of one of the 290x water blocks after my leak. Notice the water under the acrylic on the outside of the sealing gasket. Also, ill get another snap of it sideways but there is a gap that slowly starts from one side of the block to the other between the acrylic and the copper base.
> 
> Wierd, all 4 of my cards were pretty much dry, except the top two got a a little wet spot near the pcie and vrms on the topside(backside) of the pcb. However all of the water blocks were wet slightly as seen in the pic. I am perplexed at the moment what exactly i have here.
> 
> 
> Spoiler: Warning: Spoiler!


Maybe that water got there when the leak was spraying all over your system from somewhere else. If you've got an air can, at least you can try to blow those out, then leak test them.
That's what I do first before I even install anything: leak test the components on their own in the tub overnight. If no leaks spring out that are caused by a faulty component, and not just some unfastened fitting, then I'll proceed with installing them onto my system.


----------



## HOMECINEMA-PC

Hey guys









Just wondering if someone knows how you get the 200mV slider to unlock on AB 17?
Any insight would be greatly appreciated.








Quote:


> Originally Posted by *Forceman*
> 
> AB can overvolt any card +100mV; if you want more than that you can flash the Asus BIOS or use the AB command line switch.


Ahh you know what i want


----------



## djsatane

Wondering if those GIGABYTE GV-R929XOC-4GD Radeon R9 290X 4GB cards are still prone to the same black screen faults as random stock design cards, and which memory manufacturer's chips they use - whether it's a specific one or just random like stock.


----------



## velocityx

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> At THIS current time Mantle technically has everything to do with BF4, that's the game we're waiting on it to be implemented in! LOL
> 
> Johan said at the beginning of december that it's not even close. I'll find the interview for you to show again that I'm not just talking out my arse here.


What I meant was, Mantle is an extension and a new renderer for the Frostbite engine. So even though BF4 is buggy, that has nothing to do with it; it's a feature of the engine. All the stuff we're having issues with in BF4 really comes down to numbers in a config. Mantle is deeper, on a different level. So in that sense it has nothing to do with BF4; it's being made for a few games right now, it's just that BF4 is coming first.

As for the interview you're trying to quote, it's probably the Tom's Hardware interview with Andersson from the end of October, posted in November. There Andersson said it was too early for performance numbers.


----------



## Sgt Bilko

OK then, so I've decided to wait until the non-ref cards are released, then go for a couple of R9 290's. With Trixx supporting +200mV now, things are looking good for the future.









Now I've just gotta hope the price won't be at the stupid levels we are seeing atm.


----------



## psyside

Quote:


> Originally Posted by *tsm106*
> 
> Where do you come up with this fiction?


One highly educated electrician analyzed the power delivery on reference design cards... we read a few PDFs and the power ratings of the parts.


----------



## rdr09

Trixx was maxed at +200, so this is it . . .

290 @ 1300/1500

http://www.3dmark.com/3dm11/7711110

I think I'll need more to go any higher.

Happy Holidays!


----------



## hatlesschimp

Just gave my stock crossfired 290X's a go on my Sony 4K TV. Was quite surprised!!! Compared to my tri-SLI Titans I was getting the same frames and less lag. (Notably less lag!)

Pretty consistent around the 45 to 58 fps mark. It was only 7 to 8 minutes of gaming though. But for 30Hz it was very playable, and I can't wait till HDMI 2.0 comes out on a GPU, or maybe a 2014 4K TV with DP.

Here is a 1080p gameplay video.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> Trixx was maxed at +200, so this is it . . .
> 
> 290 @ 1300/1500
> 
> http://www.3dmark.com/3dm11/7711110
> 
> i think i'll need more to go any higher.
> 
> Happy Holidays!


1300?........not bad.....not bad at all


----------



## Ludus

Hi guys, I'm in the club









XFX R9 290 Black Edition cooled by an EK full cover acetal+nickel block and backplate
PT1 BIOS, +225mV
1280 on core and... 7 GHz on memory!!! Thanks Hynix








ASIC 81.3%



I'm really happy.


----------



## TommyMoore

Ek waterblock for my 290 got delivered this morning. Just need to finish work to try it out. =o)


----------



## Arizonian

Quote:


> Originally Posted by *Ludus*
> 
> Hi guys, i'm in the club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Xfx r9 290 black edition cooled by ek full cover acetal+nichel and backplate
> bios pt1, +225mv
> 1280 on core and... 7 ghz on memory !!! thanks hynix
> 
> 
> 
> 
> 
> 
> 
> 
> asics 81,3%
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> i'm really happy.


Didn't see your OCN name in the screenshot. If you can 'edit' your original post above to add a GPU-Z validation link, or a GPU-Z image open to the validation tab with your OCN name, that would be great.

Congrats - added anyway









Side note, gentlemen - [H]ardOCP posted the *ASUS R9 290X DirectCU II OC Video Card Review*
Quote:


> The ASUS R9 290X DCU II OC is sporting ASUS's DIGI+ VRM and Super Alloy Power. The reference R9 290X has a 5 phase GPU PWM and 2 phase Memory/IO PWM. The ASUS R9 290X DCU II has a 6 phase GPU PWM and 4 phase Memory/IO PWM. *This is a 6+2+2 power delivery system*. This should help with overclocking stability as well. ASUS claims 30% more stable overclocking and 15% better power efficiency.


It also comes with a back plate, which I feel is a must for those of us who like aesthetics. However, you're going to have to wait another 30 days before this is out in retail. In performance mode it seems to be on par with a reference GTX 780 Ti.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Arizonian*
> 
> Side note gentlemen - [H]ardOCP review the *ASUS R9 290X DirectCU II OC Video Card Review*
> Also comes with a back plate which I feel is a must for those of us who like aesthetics. However your going to have to wait another 30 days before this is out in retail. Seems to be in performance mode on par with reference GTX 780 Ti.


Looks like I'm going for a DCU II then.

Here's hoping they get them out before mid January.....


----------



## Ludus

Quote:


> Originally Posted by *Arizonian*
> 
> Didn't see your OCN name in the screen shot. If you can 'edit' add to the original post above a GPU-Z validation link or open GPU-Z image with tab open to validation tab with your OCN name would be great.
> 
> Congrats - added anyway


Sorry, I didn't know that rule.
But FYI, I don't have a 290X, I have a 290 non-X, and I'm on water, not the stock cooler.









I also ran a 3DMark without the GPU-Z validation; next time I will follow the rules









290 PT1 BIOS @ 1280/7000 +225mV
i5 4670K at 4.7GHz

http://www.3dmark.com/3dm/1991988


----------



## Jack Mac

Think I'd be OK for about 30 mins of benching with Trixx @ +200mV? I wonder if my card will do 1300. It can do 1210 at +100 in AB.


----------



## Arizonian

Quote:


> Originally Posted by *Ludus*
> 
> Sorry, i didn't know that rule.
> But fyi i haven't a 290x, i have a 290 non x and i'm on water, not with stock cooler.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I ran a 3dmark also without the gpu-z validation, next time i will follow the rules
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 290 pt1 bios @1280/7000 +225mv
> i5 4670k at 4.7ghz
> 
> http://www.3dmark.com/3dm/1991988
> 
> 
> Spoiler: Warning: Spoiler!


Sweet - thanks for all the info. Updated on roster.









Welcome aboard.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *velocityx*
> 
> What i meant was, mantle is an extension and a new renderer to the Frostbite engine. So even tho bf4 is buggy, it has nothing to do with the game, it's a feature of the engine. All the stuff were having issues with bf4 is really down to numbers in config. Mantle is deeper on a different level. So in this way it has nothing to do with bf4, its being made for a few games right now, its just that bf4 is coming first.
> 
> As for the interview youre trying to quote, its prolly a piece from tomshardware interview with anderson back this one was back from end of october and posted in november. There anderson said its too early for performance numbers.


The interview was posted Dec 2nd. Basically there is NO definitive information. Hell, once we got into December they went quiet. If we put it to reason, then logically it makes sense it would be postponed with everything else, considering it's being released with BF4 first and foremost. I know it's by a different team, but DICE still has to put it in and test it, regardless of whether it's numbers in a config, do they not? I don't see how, even though it's an API, you just throw it in and expect it all to work perfectly. With how borked the game currently is, why would they roll out Mantle while other problems still exist? That would be Mantle suicide, since everyone will blame the other problems on the API itself regardless of what the root issue is.









This is just my reasoning on the matter, as you and I both do not have definitive answers on this. They said almost a month ago that it would be late December. Realistically they have a few days.

Again I truly hope it does release but only if it's polished.


----------



## Widde

Have a question: do I enable "force constant voltage" in Sapphire Trixx? Does it help to get a higher max OC on the core? Running +137mV @ 1140/1250 at the moment. Can force constant voltage damage the card, or rather, is the risk high? Going to see how high it goes on this voltage; I think it was an unstable CPU OC that might have caused my previous attempts to fail.


----------



## MrWhiteRX7

Anyone else try out the XSPC water blocks yet? A buddy of mine tested one and says it's working very well but that it was sketchy putting in the fittings. I have to get 3 of them so I want to make the right choice LOL


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Widde*
> 
> Have a question, Do i enable "force constant voltage" in sapphire trixx? Does it help to get a higher max oc on the core? Running +137mV @1140/1250 at the moment, can force constant voltage damage the card or rather is the risk high? Going to see how high it goes on this voltage, think it was a unstable cpu oc that might have caused my previous attempts to fail


You should not need to check that box. I've used Trixx for years with very good overclocks and high voltage, having never touched it.


----------



## Jack Mac

I can't wait to get to my computer, I'm dying to know if I can run 1300MHz valley.


----------



## Widde

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> You should not need to check that box. I've used trixx for years with very good overclocks and high voltage having never touched that.


Starting to lose FPS after 1145MHz







Anyone wanna switch cards?







xD Settled on 1100/1500







Might just get a new psu and another card and do crossfire


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Widde*
> 
> Starting to lose fps after 1145mhz
> 
> 
> 
> 
> 
> 
> 
> Anyone wanna switch cards?
> 
> 
> 
> 
> 
> 
> 
> xD Settled on 1100/1500
> 
> 
> 
> 
> 
> 
> 
> Might just get a new psu and another card and do crossfire


And it's not artifacting or anything? Drop the memory clock back down a little bit but go up on the core. See if that doesn't get the fps going back up.


----------



## Widde

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> And it's not artifacting or anything? Drop the memory clock back down a little bit but go up on the core. See if that doesn't get the fps going back up.


Had the memory on stock when I did the core; yep, it's artifacting







I had to have the core @ 1150 with +150mV, and I wasn't comfy having the voltage up there


----------



## Widde

Quote:


> Originally Posted by *Widde*
> 
> Had the memory on stock when i did the core, yep it's artifacting
> 
> 
> 
> 
> 
> 
> 
> 
> I had to have the core @ 1150 150mV and wasnt comfy having the voltage up there


No artifacts at 1100/1500 at least, doing +75mV

Shoot, forgot to press edit instead of quote -.-


----------



## tsm106

Quote:


> Originally Posted by *psyside*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Where do you come up with this fiction?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One highly educated electrician analyzed power delivery on reference design cards...we read few PDF's and power rating of the parts.
Click to expand...

Pulled that out of your ass huh?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Widde*
> 
> Had the memory on stock when i did the core, yep it's artifacting
> 
> 
> 
> 
> 
> 
> 
> I had to have the core @ 1150 150mV and wasnt comfy having the voltage up there


Quote:


> Originally Posted by *Widde*
> 
> No artifacts at 1100/1500 atleast, doing +75mV
> 
> Shoot, Forgot to press edit instead of quote -.-


Yeah, the only time I've seen a higher clock get lower scores is when it's not stable (i.e. artifacts). 1100/1500 is not bad, sir!
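
The back-and-forth people are doing by hand here (raise the core, watch the score, back off when it drops) is basically a crude hill-climb. A rough sketch of that logic, where `run_benchmark` and `fake_benchmark` are hypothetical stand-ins for launching 3DMark/Valley and reading back a score - nothing here is a real overclocking API:

```python
# Hedged sketch of the "raise clocks until the score stops improving" routine.
# `run_benchmark` is a hypothetical hook: takes a core clock in MHz,
# returns a benchmark score.

def find_best_core_clock(run_benchmark, start_mhz=1000, step_mhz=25, max_mhz=1300):
    best_clock, best_score = start_mhz, run_benchmark(start_mhz)
    clock = start_mhz + step_mhz
    while clock <= max_mhz:
        score = run_benchmark(clock)
        # A higher clock with a *lower* score means instability, not headroom.
        if score <= best_score:
            break
        best_clock, best_score = clock, score
        clock += step_mhz
    return best_clock, best_score

# Made-up score curve for illustration: gains track the clock up to 1150MHz,
# then fall off a cliff as artifacting sets in.
def fake_benchmark(mhz):
    return mhz if mhz <= 1150 else 1150 - (mhz - 1150) * 4

print(find_best_core_clock(fake_benchmark))  # -> (1150, 1150)
```

Obviously a real run would also need an artifact check, since a slightly unstable clock can still post a marginally higher score for one run.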


----------



## Rar4f

Unigine Heaven Benchmark 4.0
FPS: 45.9
Score: 1156
Min FPS: 18.9
Max FPS: 247.4

System
Platform: Windows 7 (build 7600) 64bit
CPU model: Intel(R) Core(TM) i7-4770K CPU @ 3.50GHz (3491MHz) x4
GPU model: AMD Radeon R9 200 Series 13.250.22.0/Standard VGA Graphics Adapter 6.1.7600.16385 (4095MB) x1

Settings
Render: Direct3D11
Mode: 2560x1424 8xAA windowed
Preset: Custom
Quality: Ultra
Tessellation: Normal

If anyone needs a screenshot for documentation purposes or something, let me know and I will supply one.


----------



## Ukkooh

@DeadlyDNA, how long was your loop in use before the leak happened? This made me a bit scared as I'll be building my first custom loop in a few weeks.


----------



## psyside

Quote:


> Originally Posted by *tsm106*
> 
> Pulled that out of your ass huh?


No. It's true, no need to act...









If you don't believe me, fine; what I typed is 100% true.


----------



## sugarhell

And the truth is??


----------



## psyside

Reference cards have very good MOSFETs; the chokes, on the other hand, are average... they lose a huge % of their efficiency/performance when they heat up. He analyzed the VRM design in depth; we read for like 20 mins about MOSFET/choke design and performance - an official PDF from IR, and I can't remember the choke manufacturer.


----------



## sugarhell

Em, they lose efficiency because you let them heat up. That happens to almost every single electronic component. The VRMs are perfectly fine until you hit 1.4+ volts. It's almost the same PCB as the 7970, which was my number one choice. For 99% of the people that get this card, the ref PCB is fine. But the people that aim at 1.4 volts and more should wait for a custom solution.


----------



## psyside

You really think an electrician who repairs high-end audio gear would not know that every part operates similarly? Point being, the inductors on the 290 are decent, but not on the same level as the MOSFETs. He recommends not exceeding 50-60C for best results/performance; he was a bit disappointed when he saw the amps - the rating drops at higher temps. That's all I'm saying: they are good enough, but not great.

Maybe that's because he saw the MOSFETs first, then the chokes; he was impressed by the MOSFETs being on a consumer grade product.

Also, he told me that the core would hold me back in 99 out of 100 cases before the chokes... but still, they are just average.
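
To put numbers on the derating idea: the curve below is purely illustrative (made-up datasheet points, linearly interpolated - not from any real 290 choke datasheet), but it shows why he'd want the VRM area kept around 50-60C if the current rating falls off with temperature:

```python
# Illustrative inductor current derating: linearly interpolate a made-up
# datasheet curve of (temperature in C, allowed current as a fraction of
# the 25C rating). These points are hypothetical, not from a real part.

DERATING_CURVE = [(25, 1.00), (60, 0.95), (90, 0.80), (120, 0.55)]

def derated_current(rated_amps, temp_c, curve=DERATING_CURVE):
    """Allowed current at temp_c for a part rated `rated_amps` at 25C."""
    if temp_c <= curve[0][0]:
        return rated_amps * curve[0][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            frac = f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
            return rated_amps * frac
    return rated_amps * curve[-1][1]  # clamp above the last point

# A hypothetical 40A choke at various VRM temperatures:
for t in (25, 55, 90, 110):
    print(f"{t}C: {derated_current(40, t):.1f}A")
```

On this made-up curve a "40A" choke is still good for ~38A in the 50-60C range, but only ~25A at 110C, which is the kind of drop-off he was complaining about.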


----------



## sugarhell

Only when you overclock high.


----------



## tsm106

Comical. A repair shop worker giving advice on high-end GPU design. Right, totally believable. In my back pocket I have a super experienced engineer too.


----------



## psyside

The only comical thing is your ignorance about his knowledge, without knowing him, and your attitude on this forum after 17K+ posts.









That guy is a real pro, but never mind, think what you like.


----------



## sugarhell

Quote:


> Originally Posted by *psyside*
> 
> The only comical thing is your ignorance about his knowledge, without knowing him, and your attitude on this forum after 17K + posts
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That guy is real pro, but nevermind think what u like.


You know that you're suggesting cooling down the chokes when their purpose is to run hot?


----------



## Koniakki

Quote:


> Originally Posted by *tsm106*
> 
> Comical. A repair shop worker giving advice on high-end gpu design. Right, totally believable. *In my back pocket I have a super experienced engineer too.*


I have NASA's head engineer's phone number on my speed-dial #4.


----------



## kcuestag

Enough guys.


----------



## Falkentyne

Quote:


> Originally Posted by *Widde*
> 
> Have a question, Do i enable "force constant voltage" in sapphire trixx? Does it help to get a higher max oc on the core? Running +137mV @1140/1250 at the moment, can force constant voltage damage the card or rather is the risk high? Going to see how high it goes on this voltage, think it was a unstable cpu oc that might have caused my previous attempts to fail


I don't know if that option even works on 290's. But on 7970 (280X) cards, it would keep the IDLE voltage at the 3D voltage, and prevent the voltage from dropping to the idle setting of 0.850, 0.804 or 0.949, depending on your card.

You're probably asking about how to remove droop. I won't tell you how, as you can DAMAGE your card with it unless you are on cold water or subzero; there are posts earlier that have the link to it through MSI Afterburner.


----------



## Widde

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Yea the only time I've seen a higher clock get lower scores is when it's not stable (ie. artifacts). 1100/1500 is not bad sir!


The memory also gets unstable at pretty much anything higher than 1500; saw a loss in score and FPS in 3DMark, and tearing happening









But I'm satisfied with the card so far for the price







Paid $530 for it, so I can't complain when a 780 here goes for around 650-700 (Sweden)


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Widde*
> 
> Memory also gets unstable at pretty much anything higher than 1500, saw a loss in score and fps in 3dmark and tearing happening
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But I'm satisfied with the card so far for the price
> 
> 
> 
> 
> 
> 
> 
> Paid 530$ for it so cant complain when a 780 here goes for around 650-700 (Sweden)


Yeah, mine starts to lose points after around 1550 on the memory. I game at 1165/1500 and it's PLENTY


----------



## DeadlyDNA

Quote:


> Originally Posted by *Ukkooh*
> 
> @DeadlyDNA, how long was your loop in use before the leak happened? This made me a bit scared as I'll be building my first custom loop in a few weeks.


At least 3 weeks, about a month? Either way, it's my first full-on water cooling build myself. I have done custom CPU loops, not GPUs, let alone 4 GPUs and a CPU. I found what appears to be the culprit right now: I had an O-ring that was compressed more on one side, on the CPU block. Not sure if it just wasn't tight enough and the tubing put pressure on the connector. I am currently testing components one at a time now; first I'm testing the video cards, so far so good! Don't get scared, just read up a lot and ask questions! For instance, everyone mentioned pressure buildup, which is something I hadn't heard of. These O-rings are a pain, because if you over-tighten them they leak, and if you under-tighten them they leak.


----------



## tsm106

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ukkooh*
> 
> @DeadlyDNA, how long was your loop in use before the leak happened? This made me a bit scared as I'll be building my first custom loop in a few weeks.
> 
> 
> 
> At least 3 weeks, about a month? Either way it's my first full on water cooling build myself. Have done cpu custom loops not gpu's or let alone 4 gpus and a cpu. I found what appears to be the culprit right now. I had a o ring that was compressed more on one side, on the cpu block. Not sure if it just wasnt tight enough and the tubing put pressure on the connector. I am currently testing components one at a time now. first i'm testing video cards so far so good! Don't get scared, just read up a lot and ask questions! for instance everyone mentioned pressure buildup that's something i haven't heard of. These o-rings are a pain because you can't over tighten them they leak, and under tighten and leak.
Click to expand...

What fittings are ya using?


----------



## DeadlyDNA

Quote:


> Originally Posted by *tsm106*
> 
> What fittings are ya using?


I bought a lot of EK ones; the ones on the CPU block came with one of the pieces I ordered. Not sure what brand they are; they are going to be replaced with EK ones. I will have to find my receipt to find the brand. Back to testing my junk :-/ Bad timing right before Xmas!!


----------



## MrWhiteRX7

Quote:


> Originally Posted by *DeadlyDNA*
> 
> At least 3 weeks, about a month? Either way it's my first full on water cooling build myself. Have done cpu custom loops not gpu's or let alone 4 gpus and a cpu. I found what appears to be the culprit right now. I had a o ring that was compressed more on one side, on the cpu block. Not sure if it just wasnt tight enough and the tubing put pressure on the connector. I am currently testing components one at a time now. first i'm testing video cards so far so good! Don't get scared, just read up a lot and ask questions! for instance everyone mentioned pressure buildup that's something i haven't heard of. These o-rings are a pain because you can't over tighten them they leak, and under tighten and leak.


How many rads are you using?


----------



## tsm106

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> What fittings are ya using?
> 
> I bought a lot of EK ones; the ones on the CPU block came with one of the pieces I ordered. Not sure what brand they are; they're going to be replaced with EK ones. I'll have to find my receipt to find the brand. Back to testing my junk :-/ Bad timing right before Xmas!!

I have used them all before from BP, Enzo, Ek... and now I only use XSPC compressions. I actually have whole lots (BP/Enzo/etc) of those sets in a bag that I'd been meaning to sell for a year now lol.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *tsm106*
> 
> I have used them all before from BP, Enzo, Ek... and now I only use XSPC compressions. I actually have whole lots (BP/Enzo/etc) of those sets in a bag that I'd been meaning to sell for a year now lol.


I'm building my watercooling setup now, PM me (if you don't mind) what you have! I have a CPU block, three GPU blocks, and two rads lol.


----------



## tsm106

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I have used them all before from BP, Enzo, Ek... and now I only use XSPC compressions. I actually have whole lots (BP/Enzo/etc) of those sets in a bag that I'd been meaning to sell for a year now lol.
> 
> 
> 
> I'm building my watercooling setup now, pm me (if you don't mind) what you have! I have cpu, three gpu blocks, and two rads lol.

I could open my own used WC shop lol. New Monsta 240, bazillion 120 rads, a few diff other 240s, two MCR320s, XSPC 480, enough BP/Enzo 1/2-3/4 compressions for one loop each, 4 new ultra kazes, phobya octa 360 grill, etc etc.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *tsm106*
> 
> I could open my own used WC shop lol. New Monsta 240, bazillion 120 rads, a few diff other 240s, two MCR320s, XSPC 480, enough BP/Enzo 1/2-3/4 compressions for one loop each, 4 new ultra kazes, phobya octa 360 grill, etc etc.


I'm about to order the XSPC Extreme 360 kit with the copper CPU block, 3x 290 XSPC blocks with backplates, a Phobya 1260 rad, and a bunch of tubing and fittings lol. If you have any of that, let me know


----------



## tsm106

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I could open my own used WC shop lol. New Monsta 240, bazillion 120 rads, a few diff other 240s, two MCR320s, XSPC 480, enough BP/Enzo 1/2-3/4 compressions for one loop each, 4 new ultra kazes, phobya octa 360 grill, etc etc.
> 
> 
> 
> I'm about to order the xspc extreme 360 kit with copper cpu block, 3 x 290 xspc blocks with back plates, phobya 1260 rad, and a bunch of tubing and fittings lol. If you have any of that let me know

Oh I have the triple serial ek bridge for your blocks. Ok lemme dig thru my stuff and will pm ya.


----------



## Ukkooh

Quote:


> Originally Posted by *DeadlyDNA*
> 
> At least 3 weeks, about a month? Either way, it's my first full-on water cooling build. I've done CPU custom loops, but not GPUs, let alone 4 GPUs and a CPU. I found what appears to be the culprit just now: an o-ring on the CPU block that was compressed more on one side. Not sure if it just wasn't tight enough, or if the tubing put pressure on the connector. I am currently testing components one at a time now. First I'm testing the video cards, and so far so good! Don't get scared, just read up a lot and ask questions! For instance, everyone mentioned pressure buildup - that's something I hadn't heard of. These o-rings are a pain: over-tighten them and they leak, under-tighten them and they leak.


What block are you using on the cpu? I'm going to use a supremacy with EK compression fittings so hopefully I won't have issues like that. Did you open up the cpu block or was it like that from the factory?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *tsm106*
> 
> Oh I have the triple serial ek bridge for your blocks. Ok lemme dig thru my stuff and will pm ya.


Awesome!


----------



## Strata

I seem to have lost all temperature sensors for my 290X after installing the EK Block. I was hoping to get some load temps for you guys before I post pictures.


----------



## Koniakki

Merry Christmas to everyone even if some are still in the 24th!


----------



## chiknnwatrmln

Still waiting on my block... Geez this is taking painfully long.

Oh and NCIX says the pump I ordered on the 11th (which they said was in stock when I ordered when it really wasn't, but that it'd be re-stocked on the 19th) won't ship out until Jan 15th at the earliest....


----------



## DeadlyDNA

Quote:


> Originally Posted by *Ukkooh*
> 
> What block are you using on the cpu? I'm going to use a supremacy with EK compression fittings so hopefully I won't have issues like that. Did you open up the cpu block or was it like that from the factory?


Heatkiller 3.0 - I did take it apart to clean it, because I bought it from a buddy and it needed cleaning. I think my issue was that one of the barbs connected to the block leaked at the o-ring. I'm pretty sure it's an error on my part: I didn't tighten it enough, and maybe pressure, as mentioned by others. However, examining my EK GPU blocks has me worried - I noticed an uneven gap on two of them. Also, I'm happy to report all my video cards are fine!!!! WOOT!!

I am quick-benching them on the stock coolers. The "Target GPU temperature" slider defaulted to 95C at stock settings. When I move it to, say, 65C, I get a huge decrease in performance. Does this setting intentionally throttle the GPU to keep temps low?

Overdrive disabled: AVPDX11 benchmark default settings

AvP D3D11 Benchmark Report
==========================

**************************************************
* Report Created: 2013-12-24 @ 14:54:16
**************************************************
* Executable Build: V1.03, Apr 19 2010
**************************************************

*DX11 Hardware Detected*

Using Default Video Settings:

Resolution: 1920 x 1080
Texture Quality: 2
Shadow Quality: 3
Anisotropic Filtering: 16
SSAO: ON
Vertical Sync: OFF
DX11 Tessellation: ON
DX11 Advanced Shadows: ON
DX11 MSAA Samples: 1

Benchmark Summary:

Number of frames: 13897
Average Frame Time: 7.5ms
Average FPS: 132.5

Use command-line option '-logframetime' to report performance frame-by-frame.

================================================================================================================================================================================================================

Same run as above but with "target GPU setting @ 65C" and fan speed to 100%

AvP D3D11 Benchmark Report
==========================

**************************************************
* Report Created: 2013-12-24 @ 16:16:15
**************************************************
* Executable Build: V1.03, Apr 19 2010
**************************************************

*DX11 Hardware Detected*

Using Default Video Settings:

Resolution: 1920 x 1080
Texture Quality: 2
Shadow Quality: 3
Anisotropic Filtering: 16
SSAO: ON
Vertical Sync: OFF
DX11 Tessellation: ON
DX11 Advanced Shadows: ON
DX11 MSAA Samples: 1

Benchmark Summary:

Number of frames: 10560
Average Frame Time: 9.9ms
Average FPS: 100.7

Use command-line option '-logframetime' to report performance frame-by-frame.
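
For what it's worth, yes: on these cards PowerTune drops the clocks to hold the temperature target, and the two reports above show exactly that. A quick sketch of the frame-time arithmetic, with the numbers copied from the reports (average FPS is just the reciprocal of average frame time):

```python
# Sanity check of the two AvP runs: average FPS = 1000 / avg frame time (ms).
# The reported 7.5 ms / 9.9 ms values are rounded, hence the small mismatch
# with the reported 132.5 / 100.7 FPS figures.

def avg_fps(frame_time_ms: float) -> float:
    """Convert an average frame time in milliseconds to frames per second."""
    return 1000.0 / frame_time_ms

default_run = avg_fps(7.5)   # 95C target run, report says 132.5 FPS
capped_run = avg_fps(9.9)    # 65C target run, report says 100.7 FPS

# Relative slowdown from capping the temperature target at 65C:
drop = (132.5 - 100.7) / 132.5 * 100
print(f"{default_run:.1f} vs {capped_run:.1f} FPS, ~{drop:.0f}% slower")
```

So the 65C run is roughly a quarter slower, which is consistent with the card throttling clocks to stay under the lower target rather than anything being broken.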


----------



## Sazz

Noob question: this is the first time I've seen "Aux voltage" in any OCing tool. What exactly does it do, and what does it affect? It does help me get stable clocks, but I just wanna learn more about it. Sapphire Trixx doesn't have it, though.


----------



## Forceman

Quote:


> Originally Posted by *Sazz*
> 
> Noob question, This is the first time I've seen "Aux voltage" on any OCing tool, what does this exactly do and what does it affect? it does help me get stable clocks but I just wanna learn more about it. Sapphire TrixX don't have it tho.


I don't think anyone knows for sure, although I've seen reports that it is PLL voltage.


----------



## TheSoldiet

Christmas is over here







(24th) Got a gift card, and i'm using it and my money to get a 840 EVO 250Gb and later doing the red mod to my 290x.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Sazz*
> 
> Noob question, This is the first time I've seen "Aux voltage" on any OCing tool, what does this exactly do and what does it affect? it does help me get stable clocks but I just wanna learn more about it. Sapphire TrixX don't have it tho.


I have my aux voltage set to +13 and it allows me to run a higher memory OC. I'm not saying they are directly related, but without it I can't get over 1500 at all, and some games will artifact at 1500; with it I can go as high as 1600 with no issues.


----------



## Strata

Ok so here's the issue: the 290 doesn't show up in HWMonitor, AB, or CCC... it does show in Device Manager, however

any thoughts?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Anyone else try out the XSPC water blocks yet? A buddy of mine tested one and says it's working very well but that it was sketchy putting in the fittings. I have to get 3 of them so I want to make the right choice LOL


Yes, I am at 28c idle and 36c at 100% load . Nice backplate too........



I'm glad I got it, because nearly all Koolance and EK blocks are sold out on the east coast of Oz, and I've heard reports of EK blocks leaking ....... MERRY XMAS DUDES


----------



## Iniura

Quote:


> Originally Posted by *Jack Mac*
> 
> Think I'd be ok for about 30 mins of benching with Trixx @ 200Mv? I wonder if my card will do 1300. It can do 1210 at +100 in AB.


Does your card show some little blue flickers on the screen, and sometimes horizontal green lines for a split second, when you benchmark? Or is it completely free of artifacts (I don't know if those lines count as artifacts)?

I received an XFX R9 290 today; it doesn't unlock. (I run stock AIR cooling)

I've been using it from this afternoon till now and I experienced no problems, no black screens, nothing. I am running the Catalyst 13.12 driver.

I ran the Valley Extreme HD preset on stock settings



I did a little bit of gaming in Battlefield 4, and it runs on Ultra at 2560x1600 with 2x MSAA above 60 FPS almost all the time.

And then I tried overclocking the card; however, I am very inexperienced at overclocking GPUs, so I have a couple of questions.

I tried running 3DMark Fire Strike Extreme at +100 core voltage in MSI AB Beta 17, 1200 MHz core, 1450 MHz memory, and +50% power limit. It does complete, but at one point there were some green horizontal lines on the screen during the benchmark, and some sort of blue shadow appeared on the right side of the screen.

When I tried a 1500 MHz memory OC it crashed, so the max I can get is 1400 MHz on these settings.

This was the score:



This means my OC isn't stable, right? I downclocked it to 1180 core and had no problems during 3DMark Fire Strike Extreme. I played 20 minutes of BF4 after that and everything was okay; max temperature was 80 degrees Celsius, with VRM1 at 59 degrees and VRM2 at 70 degrees.

These temperatures are okay, right, and I have some room left?

Now I read that I could go a little higher with the core voltage in MSI AB, to +200mV, with this: MSIAfterburner /wi4,30,8d,20.

How much voltage does my card run at with this command?

And how much voltage is it with +100mV? Is GPU-Z correct, and do I need to look at VDDC? It showed 1.281 V under load in BF4 - is that the actual voltage running through my card?

How safe is it to set this in AB? Since I have some temperature headroom left, could I try this command to see if I can get to a 1300 MHz core clock, or 1200 or 1250?

Also, is it better to flash another BIOS onto my card if I want to get some more performance out of it? If I flip the switch on my GPU to the other side, I could flash another BIOS there, and my original XFX BIOS will still be on the other side when I switch back, right?

If anyone would be willing to help me out a little bit with these questions I would be grateful.


----------



## Darhant

Ok so I finally got my XFX R9 290









Please add me - cooled with an EK FC water block and a mostly Swiftech loop.
In terms of unlocking: is it worth trying? As you can see in the pic of the die, I have one of the 2020 chips.

http://www.techpowerup.com/gpuz/va2bw/


----------



## MrWhiteRX7

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yes i am 28c idle and 36c 100% load temp . Nice backplate too........
> 
> 
> 
> Im glad i got it cause nearly all koolance and ek blocks are sold out on east coast of OZ and heard reports of ek blocks leaking ....... MERRY XMAS DUDES


thank you!!!! Merry Christmas!


----------



## darkelixa

Are custom fan profiles even needed for the stock cooling on the Sapphire R9 290? My temps never go over 90 at full load, and I can't hear the GPU over the case fans


----------



## Forceman

Quote:


> Originally Posted by *Iniura*
> 
> These temperatures are okay right and I have some room left?
> 
> Now I read that I could go a little higher with the core voltage in MSI AB to +200mV with this: MSIAfterburner /wi4,30,8d,20.
> 
> At how much voltage does my card run then with this command?
> 
> And how much voltage is it with +100mV, is GPU-z correct and do I need to look at VDDC, this showed 1,281 V during load in BF4 is that the actual voltage I am running though my card?
> 
> How safe is it to set this in AB? when I have some room left qua temperature could I try this command to see if I can get to 1300Mhz Core clock or 1200 or 1250?
> 
> Also is it better to flash another bios on to my card, if I would want to get some more performance out of it, if I switch the switch on my GPU to the other side I could flash another bios in there and my original XFX bios will still be on there when I switch back right?
> 
> If anyone would be willing to help me out a little bit with these questions I would be grateful.


Yes, those temps are okay.

I would not use the +200mV tweak for Afterburner unless you are willing to risk your card. It is unlikely that any damage would occur at +200mV, but if you aren't on water cooling I wouldn't try it. Just my opinion though. If you really do want +200mV though, use the new version of Trixx, which supports it with no tweaks.

There really isn't any reason to flash a different BIOS unless you wanted to put the Asus one on and use GPU Tweak instead of Afterburner (it also allows higher voltage). But you can flash to one side of the switch and still have the default BIOS on the other side, so it is pretty fail-safe if you wanted to fool with it.
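
As a rough illustration of the voltage question a few posts up: GPU-Z's VDDC is the measured core voltage, and the Afterburner/Trixx offset simply adds on top of the card's stock load voltage. The ~1.18 V stock figure below is an assumption for the sake of the arithmetic - actual stock voltage varies per card, and vdroop will pull the measured reading around under load:

```python
# Sketch of how a software voltage offset relates to the GPU-Z VDDC reading.
# STOCK_VID is an assumed stock load voltage; it differs card to card.

STOCK_VID = 1.18  # volts, assumed; check your own card's stock VDDC in GPU-Z

def vddc_with_offset(offset_mv: float) -> float:
    """Stock voltage plus a software offset, ignoring vdroop under load."""
    return STOCK_VID + offset_mv / 1000.0

print(vddc_with_offset(100))  # ~1.28 V, close to the 1.281 V GPU-Z reading
print(vddc_with_offset(200))  # ~1.38 V, the tweaked-AB / new-Trixx territory
```

On that assumed baseline, the 1.281 V GPU-Z showed at +100mV lines up, and +200mV would land near 1.38 V under load - which is why most people only recommend it on water.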


----------



## MrWhiteRX7

Quote:


> Originally Posted by *darkelixa*
> 
> Are custom fan profiles even needed for the stock cooling on the r9 290 sapphire. My temps never go over 90 at full load and I cant hear the gpu over case fans


If it's not throttling, then I'd say no, but when I run mine at stock clocks I run the fan up to about 60% or maybe a bit less to keep temps down, and I still don't hear it outside the case. For me this just ensures my clocks stay up the whole time.


----------



## robert0507

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> thank you!!!! Merry Christmas!


What are your Ben temps like


----------



## robert0507

I meant VRM temps, like


----------



## MrWhiteRX7

My only beef with Trixx is the lack of Aux voltage. I can't hit my same memory clocks on Trixx as I can with AB using aux @ +13









Which sucks because I ONLY used trixx on my 7970's, I love it so much.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *robert0507*
> 
> I meant vrm. Temps l8ke


I don't have the blocks in yet


----------



## darkelixa

Have not had my card throttle once with the 90 degree temps


----------



## Iniura

Quote:


> Originally Posted by *Forceman*
> 
> Yes, those temps are okay.
> 
> I would not use the +200mV tweak for Afterburner unless you are willing to risk your card. It is unlikely that any damage would occur at +200mV, but if you aren't on water cooling I wouldn't try it. Just my opinion though. If you really do want +200mV though, use the new version of Trixx, which supports it with no tweaks.
> 
> There really isn't any reason to flash a different BIOS unless you wanted to put the Asus one on and use GPU Tweak instead of Afterburner (it also allows higher voltage). But you can flash to one side of the switch and still have the default BIOS on the other side, so it is pretty fail-safe if you wanted to fool with it.


Thank you for your reply,

Maybe I will try out the +200mV and +13 aux voltage once I'm under water, to try to hit a 1500 MHz memory clock and over 1200 MHz core.

I like Afterburner - I've never used Trixx - so I'll probably use that command in AB to hit +200mV.

I just ran the Valley Extreme HD preset with the OC (1180/1450, +100mV, +50% power limit) and I am scoring lower than I did when I first got the card out of the box!









Valley must be really broken, or something.

Because my graphics score in Firestrike Extreme has gone up a decent amount from the OC over stock.


----------



## HardwareDecoder

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yes i am 28c idle and 36c 100% load temp . Nice backplate too........
> 
> 
> 
> Im glad i got it cause nearly all koolance and ek blocks are sold out on east coast of OZ and heard reports of ek blocks leaking ....... MERRY XMAS DUDES


Don't take this personally but I somehow doubt your full load is 36c...... I haven't seen anyone with a full load under 45c

Mine personally hits as high as 55c but is normally around 50c


----------



## VSG

Well, he could be using a chiller or have low ambient temperatures; you never know.


----------



## HardwareDecoder

Quote:


> Originally Posted by *geggeg*
> 
> Well he could be using a chiller or has low ambient temperatures, you never know,


Yeah, I'm not saying his temp isn't 36c at full load; I just find it hard to believe, since I haven't seen anyone under 45c at max load, IIRC. As far as ambient temps, it says he is in Australia, so I doubt they are that low.

I didn't even know such a thing as a chiller existed. I'm guessing it's just something you connect to the loop that chills the water somehow? How does that work, exactly?

I should mention mine hitting 55c was at +100mV/1200 core, but I can't run over 1100/1150 MHz and be totally stable, so I just run 1100 at stock volts, which usually tops out at 47-50c


----------



## robert0507

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Yea i'm not saying his temp isn't 36c full load, I just find it hard to believe since I haven't seen anyone under 45c max load iirc. As far as ambient temps he is in Australia it says so I doubt they are that low.
> 
> I didn't even know a thing such as a chiller existed, i'm guessing it is just something you connect to the loop that chills water somehow? How does that work exactly


I wonder what his vrm temps are


----------



## HardwareDecoder

Quote:


> Originally Posted by *robert0507*
> 
> I wonder what his vrm temps are


me too..... I personally hit 60-65c max but I am usually lower @ 55c or so


----------



## HoZy

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Don't take this personally but I somehow doubt your full load is 36c...... I haven't seen anyone with a full load under 45c
> 
> Mine personally hits as high as 55c but is normally around 50c


His temps are similar to mine.

I have XFire 290x in my rig & my brother has single 290x all watercooled & pretty much identical temps to him.

All running EK Blocks and no leaks.

Cheers
Mat


----------



## sugarhell

If you have more than a 10C delta, then your loop isn't up to the job. I've had several GPUs on a loop, and most of them stayed between 30C and 37C
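
Sketching that rule of thumb as arithmetic - load temperature is roughly ambient plus the loop's delta. The ambient figures here are illustrative, not anyone's measurements:

```python
# GPU load temp ~ ambient temperature + loop delta-T.
# A healthy loop keeps the delta under ~10C at load.

def load_temp(ambient_c: float, delta_c: float) -> float:
    """Expected GPU load temperature for a given ambient and loop delta."""
    return ambient_c + delta_c

# In a 20-27C room with a <=10C delta, load temps land in the 30s,
# which is why 36C at full load is plausible without any chiller.
print(load_temp(20, 10))  # 30
print(load_temp(27, 10))  # 37
```

The disagreement in the thread mostly reduces to ambient temperature and radiator capacity, not anyone fibbing about numbers.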


----------



## HardwareDecoder

I really don't know what to say. I mean, my 290x idles at 30c and hits 50c max, very rarely 55c; that seems perfectly normal to me. Like I said, from everything I've seen, you guys just have lower temps than most people.

My ambient is around 70F, so 21C, and my GPU idles @ 30c on a loop it shares with my CPU. Seems fine to me, but I'm a water cooling noob. All I know for sure is that everything is set up correctly.


----------



## rdr09

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Don't take this personally but I somehow doubt your full load is 36c...... I haven't seen anyone with a full load under 45c
> 
> Mine personally hits as high as 55c but is normally around 50c


Believe it. Home's setup . . .


----------



## HardwareDecoder

What am I looking at, exactly - is he piping AC into the case?

Also, I was just playing FC3 for a while and never saw the core get over 47c, so I'm happy


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Don't take this personally but I somehow doubt your full load is 36c...... I haven't seen anyone with a full load under 45c
> 
> Mine personally hits as high as 55c but is normally around 50c


None taken









Sub-ambient cooling, or "air bending": a portable ducted A/C. Next run I will snip some temp screens
Prototype


Now


No airbending COD Ghosts @ [email protected]



And while I'm here I will update my specs

http://www.techpowerup.com/gpuz/aanf/

Giga 290 with XSPC Razor block


----------



## HardwareDecoder

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> None taken
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sub ambient cooling or ' Air bending ' Portable ducted A/C . Next run i i will snip some temps screens
> Prototype
> 
> 
> Now
> 
> 
> No airbending COD Ghosts @ [email protected]


Haha, yeah, so those temps are "with extreme measures" lol. But seriously, isn't piping AC into a computer case dangerous? Condensation could form, right?


----------



## HOMECINEMA-PC

Not if it's insulated/sealed and stays above the dew point - so in summer a 14c-18c case ambient temp, in winter 11c-13c. Lower room ambient helps out a lot
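
For anyone wondering how to check their own margin: the dew point can be approximated with the Magnus formula. This is a general-purpose sketch, not from any vendor tool, and the constants used are one common variant (several sets exist):

```python
# Condensation forms when a surface is colder than the dew point of the
# surrounding air. Magnus-formula approximation with the b=17.62,
# c=243.12 constant set (an assumption; other constant sets exist).
import math

def dew_point_c(temp_c: float, rel_humidity: float) -> float:
    """Approximate dew point in C. rel_humidity is a fraction in (0, 1]."""
    b, c = 17.62, 243.12
    gamma = math.log(rel_humidity) + b * temp_c / (c + temp_c)
    return c * gamma / (b - gamma)

# At 25C room temperature and 50% relative humidity the dew point is ~14C,
# so a 14-18c case ambient sits right at the safe edge; drier air buys margin.
print(round(dew_point_c(25, 0.50), 1))
```

In humid climates the dew point climbs toward room temperature, which is why insulated ducting and staying a few degrees above it matters so much for sub-ambient setups.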


----------



## HardwareDecoder

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Not if its insulated / sealed and stay above dew point so in summer 14c-18c case ambient temp . Winter 11c - 13c . Lower room ambient helps out alot


Ahh, thanks for all the info. I knew with temps that low something else was at work besides a water cooling loop.

edit: what are your VRM temps like @ +125 vddc?


----------



## Forceman

Quote:


> Originally Posted by *Iniura*
> 
> Thank you for your reply,
> 
> Maybe I will try out the +200mV and +13 Aux voltage once i'm under water to try and hit 1500Mhz Memory clock and +1200Mhz Core.
> 
> I like afterburner, I never used Trixx, and probably will use that command in AB to hit +200mV.
> 
> I just ran Valley Extreme HD preset, with the OC 1180, 1450 +100mV +50 Power Limit and I am scoring lower then my score when I just got the card out of the box!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Valley must be really broken, or something.
> 
> Because my graphics score in Firestrike Extreme has gone up a decent amount from the OC over stock.


I got lower scores when my overclock got too high, so that could be contributing.


----------



## Derpinheimer

Quote:


> Originally Posted by *HoZy*
> 
> His temps are similar to mine.
> 
> I have XFire 290x in my rig & my brother has single 290x all watercooled & pretty much identical temps to him.
> 
> All running EK Blocks and no leaks.
> 
> Cheers
> Mat


Bull, lol.

We just saw HomeCinema's setup, and it's clearly so low because of his custom sub-ambient setup.

You aren't going to get a 37c max on a 290x with standard ambient temps. That includes a range, obviously.

Now, on my highly OC'd 7950 I rarely saw it hit 40c - actually just about never. And that was with a copper shim, so two layers of thermal paste between the chip and block. However, this card seems to push toward 50c under extreme loads like OCCT. It still gets to the mid-40s gaming. I suspect a thermal paste issue, but the loop is not lacking in any department.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Derpinheimer*
> 
> Bull, lol.
> 
> We just say HomeCinema's setup and its clearly so low because of his custom sub ambient setup.
> 
> You arent going to get 37c max on a 290x with standard ambient temps. That includes a range, obviously.
> 
> Now on my highly OC'd 7950 I rarely saw it hit 40c, actually just about never. And that was with a copper shim, so two layers of thermal paste between the chip and block. However this card seems to push toward 50c under extreme loads like OCCT. Still gets to mid 40s gaming. I suspect a thermal paste issue, but the loop is not lacking in any department.


That's what I'm sayin'.. everyone I've talked to has said they get about 45-55c, and people come out of the woodwork sayin' nah, I get 36c lol. Well, yeah, HomeCinema does, 'cause he is pumping AC into his case lol. That other dude? I highly doubt it


----------



## Durquavian

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Not if its insulated / sealed and stay above dew point so in summer 14c-18c case ambient temp . Winter 11c - 13c . Lower room ambient helps out alot


I air-conditioned my whole case - actually the cabinet that both computers (a 965BE and an 8350) sit in. I can run as low as 15C, but I'm usually @ 24C when just browsing (which usually consists of just this site) and @ 20C when gaming (all ambients). Being in a closed cabinet also curbs computer noise by a lot: 12 fans, one being a Delta MEGA FAST, aka LOUD as ... Another reason a reference 290X would not bother me at all. And the best part: NO MORE DUST. Got cats, and the litter box that comes with them, and that dust made me clean the cases once a month. Now, never.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Derpinheimer*
> 
> Bull, lol.
> 
> We just say HomeCinema's setup and its clearly so low because of his custom sub ambient setup.
> 
> You arent going to get 37c max on a 290x with standard ambient temps. That includes a range, obviously.
> 
> Now on my highly OC'd 7950 I rarely saw it hit 40c, actually just about never. And that was with a copper shim, so two layers of thermal paste between the chip and block. However this card seems to push toward 50c under extreme loads like OCCT. Still gets to mid 40s gaming. I suspect a thermal paste issue, but the loop is not lacking in any department.



Quote:


> Originally Posted by *HardwareDecoder*
> 
> That's what i'm sayin.. every one I've talked to has said they get about 45-55c and people come out of the wood work sayin nah I get 36c lol. Well yeah home cinema does cause he is pumpin AC in to his case lol, that other dude? I highly doubt it


That first screenshot, with Trixx open, was without A/C, and this one is.........


----------



## VSG

How much was that voltage offset to before vdroop kicked in?


----------



## HOMECINEMA-PC

+125, about 1.36v?


----------



## Novulux

http://www.techpowerup.com/gpuz/a4vxm/
XFX R9 290
Watercooled by NZXT Kraken x40 and G10 bracket.

Apparently I am getting abysmal memory overclocking. :|


----------



## pounced

Without getting any weird graphical glitches, and running stable for a couple of Valley benchmarks, I have my Sapphire 290x sitting at 1175/1450 core/mem with VDDC set to +131. It runs stable, and the custom fan profile I have set up never really lets it get above 80C in games or over 60% fan.

I'll most likely buy an NZXT Kraken setup for this beast.


----------



## hatlesschimp

I have a 3rd 290x arriving on Friday. Will my Corsair AX1200i be OK? I know my tri Titans were pushing it to its limits. What other PSU should I look at?


----------



## sugarhell

Quote:


> Originally Posted by *HardwareDecoder*
> 
> That's what i'm sayin.. every one I've talked to has said they get about 45-55c and people come out of the wood work sayin nah I get 36c lol. Well yeah home cinema does cause he is pumpin AC in to his case lol, that other dude? I highly doubt it


You don't even know how a loop works, but you doubt it. With a Nova and 9 GT15s my delta is under 10C. Add a 20C ambient and you have my load temps


Spoiler: Warning: Spoiler!


----------



## HardwareDecoder

Quote:


> Originally Posted by *sugarhell*
> 
> You dont even know how a loop work but youu doubt it. With a nova and 9 GT15s i have almost lower than 10C delta. Add 20C ambient and you have my load temps
> 
> 
> Spoiler: Warning: Spoiler!


Please don't even bother replying to me in the future. I've seen you around, and you do nothing but try to condescendingly correct people every time. I am no water cooling expert, since this is my first real loop, but trust me when I tell you I don't need/want YOU in particular to teach me anything. I'd prefer to learn it on my own rather than from you.

I could quiz you on the stuff I am going to school for and make you look as dumb as you try to make me look. Everyone is an expert in something.


----------



## Arizonian

Quote:


> Originally Posted by *Darhant*
> 
> Ok so I finally got my XFX R9 290
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please add cooled with an EK FC water block and a mostly swiftech loop.
> In Terms of unlocking is it worth trying as you can see in the pic of the die I have one of the 2020 chips
> 
> http://www.techpowerup.com/gpuz/va2bw/
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> None taken
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sub ambient cooling or ' Air bending ' Portable ducted A/C . Next run i i will snip some temps screens
> Prototype
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Now
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> No airbending COD Ghosts @ [email protected]
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> And while im here i will update specs
> 
> http://www.techpowerup.com/gpuz/aanf/
> 
> Giga 290 with XSPC Razor block
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated








Quote:


> Originally Posted by *Novulux*
> 
> http://www.techpowerup.com/gpuz/a4vxm/
> XFX R9 290
> Watercooled by NZXT Kraken x40 and G10 bracket.
> 
> Apparently I am getting abysmal memory overclocking. :|


Congrats - added


----------



## UNOE

Quote:


> Originally Posted by *hatlesschimp*
> 
> I have a 3rd 290x arriving on friday. Will my corsair ax1200i be ok. I know my tri titans were pushing it to its limits. What other psu should I look at?


I have an AX1200, and with 3 R9 290's on Z77 there was a good amount of room to spare. I think if you're benching a 3930K at 5GHz and trying to push all three cards to the limit, you will eventually hit a wall where your PSU can't go any further. For 4.8GHz and everyday medium-OC gaming, you should never have a problem.

I used the same AX1200 with a 3930K at 5.1GHz and three 7970's at 1320MHz/1800MHz, and I think I hit the PSU limit on those runs somewhere around a 1315 core clock and 1.325v on the 7970's. But like I said, it should be fine unless you're obsessed with breaking records.
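
A back-of-the-envelope budget for the tri-fire question. The per-card, CPU, and rest-of-system wattages here are ballpark assumptions for heavily overvolted hardware, not measured figures:

```python
# Rough PSU budget: GPUs + CPU + rest of system vs. PSU capacity.
# All component wattages below are assumptions for illustration only.

def psu_headroom(psu_watts, gpu_count, watts_per_gpu, cpu_watts, rest_watts=100):
    """Return (estimated draw, remaining headroom); negative headroom = overloaded."""
    draw = gpu_count * watts_per_gpu + cpu_watts + rest_watts
    return draw, psu_watts - draw

# Three heavily overvolted 290Xs at ~350 W each plus a ~250 W overclocked CPU:
draw, headroom = psu_headroom(1200, 3, 350, 250)
print(draw, headroom)  # 1400 -200
```

On those assumptions a 1200 W unit ends up ~200 W short under combined benchmark load, which matches the random-reboot reports above; at everyday clocks the per-card draw falls enough that the same PSU has room to spare.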


----------



## maynard14

Hi guys, I just recently benchmarked my XFX 290.

here it is

@ 290 bios

1120 core clock
1400 memory clock

http://www.3dmark.com/3dm11/7715411

@ 290x bios

1120 core clock
1400 memory clock

http://www.3dmark.com/3dm11/7715522

That's the max I can OC my card to on air with the stock cooler...

So, guys, what do you think - is it great or just normal?

thank you


----------



## 107Spartan

Do you guys recommend that I buy a second 290x card, or instead a water cooling system for my first one? I have a Corsair 850-watt PSU.


----------



## 107Spartan

That is about what my card is doing right now, and I have a 290x too. If I put my memory any higher in BF4 on ultra at 1440p, I get bad screen tearing... It is only 70 degrees, though, so I thought I could take it higher?... This is on air, btw.


----------



## mikep577

I have two 290x Asus equipped with EK block and i want to over-clock them.
Can anyone point me in the right direction on applications to use to OC them?
I did some OC through the AMD CCC panel and managed to gain some (1100/1375), but I guess there's a lot more


----------



## esqueue

Quote:


> Originally Posted by *kcuestag*
> 
> Still getting blackscreen issues (More like monitor signal loss and cards fans go into full blast). Could it be a software issue? I haven't really formatted my Windows since getting my first 290X, might as well try it and see if it's a corrupted installation causing this...


I had a very similar issue on an Nvidia card two cards back. The screen would black out and the fans would go to max speed. The sound continued for a little while before everything would quit.

I found I could stop this from happening by reseating the auxiliary power cables on the card. I gave the PC away, and the new owner couldn't get it running, so it was trashed.


----------



## Joeking78

Quote:


> Originally Posted by *hatlesschimp*
> 
> I have a 3rd 290x arriving on friday. Will my corsair ax1200i be ok. I know my tri titans were pushing it to its limits. What other psu should I look at?


I had 3 290Xs and a Seasonic 1250W... When I overclocked the GPU & CPU and tried to run a benchmark I would get poor results or random shutdowns.

IMO you need 1500W to be on the safe side if you plan on doing any sort of heavy overvolting... I had 4.8GHz @ 1.42v and three 290Xs with +200mv on each; 3DMark would randomly reboot my PC and the scores never reflected what they should have been.


----------



## Forceman

Quote:


> Originally Posted by *mikep577*
> 
> I have two 290x Asus equipped with EK block and i want to over-clock them.
> Can anyone point me in the right direction on applications to use to OC them?
> I did some OC through the AMD CCC panel and managed(1100/1375) to gain some but i guess there's a lot more


Most people use MSI Afterburner (it works on all brands) or Sapphire Trixx. Since you have an Asus card you can also use GPU Tweak.


----------



## TommyMoore

R9 290 now under water



Quick run on Firestrike



3770k @ 4.2 - R9 290 @ 1198 / 1497

Score 10214
Graphics 12717
Physics 10064
Combined 4164


----------



## Scotty99

Linus did a 290 watercooled video, only got a 7% overclock:






Is that about all you guys are getting as well? (asking you guys on water)


----------



## sugarhell

I have a 290 that does 1350 under water. I dont know how he got only 7% oc


----------



## Newbie2009

Quote:


> Originally Posted by *Scotty99*
> 
> Linus did a 290 watercooled video, only got a 7% overclock:
> 
> 
> 
> 
> 
> 
> Is that about all you guys are getting as well? (asking you guys on water)


You would get more than that on air at stock volts.


----------



## maynard14

Quote:


> Originally Posted by *sugarhell*
> 
> I have a 290 that does 1350 under water. I dont know how he got only 7% oc


mine is only at reference cooler and on air...haha

is there any free 3dmark 11 with firestrike hehe


----------



## TheSoldiet

Quote:


> Originally Posted by *maynard14*
> 
> mine is only at reference cooler and on air...haha
> 
> is there any free 3dmark 11 with firestrike hehe


3DMark was £2.14; now it is £12.49

EDIT: On Steam BTW


----------



## maynard14

http://store.steampowered.com/app/205270/

it is 19.99 huhu

ill wait for sale :0

thanks sir

oppps

saw this

http://store.steampowered.com/app/223850/

im gonna buy it! thanks


----------



## TheSoldiet

NP mate. I bought it for £2.14 (or was it £2.19?). It was on a massive 90% sale.


----------



## quakermaas

Quote:


> Originally Posted by *Scotty99*
> 
> Linus did a 290 watercooled video, only got a 7% overclock:
> 
> 
> 
> 
> 
> 
> Is that about all you guys are getting as well? (asking you guys on water)


He only got 7% more overclock on water, than the maximum overclock on air.
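To illustrate why the baseline matters, a quick sketch of how the percentage depends on what you compare against. The clock numbers are assumed for illustration only:

```python
# "7% overclock" depends on the baseline: water vs. max-air clock is a much
# smaller number than water vs. the 947 MHz reference clock.
# Clock values here are illustrative assumptions.

def oc_percent(new_clock, base_clock):
    """Percentage gain of new_clock over base_clock."""
    return (new_clock - base_clock) / base_clock * 100

print(round(oc_percent(1175, 1100), 1))  # 6.8  -- over an assumed 1100 MHz air OC
print(round(oc_percent(1175, 947), 1))   # 24.1 -- over the 947 MHz reference clock
```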


----------



## Derpinheimer

My max air oc was >1150 (never really fine tuned it). My max water OC is <1200. These cards dont care at all about temps.


----------



## Ukkooh

Quote:


> Originally Posted by *Derpinheimer*
> 
> My max air oc was >1150 (never really fine tuned it). My max water OC is <1200. These cards dont care at all about temps.


Most don't but some do. My 290x started artifacting after 80°C when overclocked.


----------



## kizwan

Quote:


> Originally Posted by *Ukkooh*
> 
> Most don't but some do. My 290x started artifacting after 80°C when ocd.


Thankfully I don't have OCD.


----------



## Ukkooh

Quote:


> Originally Posted by *kizwan*
> 
> Thankfully I don't have OCD.


Edited my post just for you <3


----------



## Redeemer

stock volts but he got a very bad 290 anyway


----------



## 107Spartan

So my card seems kinda weak. Getting what I think are low scores on the 3DMark Cloud Gate test. Please let me know if this is normal for how I have it set. At 1100/1300 the score is 16262... without that OC the score is 16165... I would think bumping the core up 100MHz and upping the mem would get me more than an extra 100 points. Is my CPU the problem?


----------



## 107Spartan

Quote:


> Originally Posted by *maynard14*
> 
> http://store.steampowered.com/app/205270/
> 
> its is 19.99 huhu
> 
> ill wait for sale :0
> 
> thanks sir
> 
> oppps
> 
> saw this
> 
> http://store.steampowered.com/app/223850/
> 
> im gonna buy it! thanks


Thanks for this link. gave rep and bought software at the discount price.


----------



## rx7racer

Spartan, what CPU do you run? I can't say for sure, but that does seem a bit weak, not majorly but a tad.

Just noticed it was a 920. Are you monitoring the GPU? Is it dropping clocks maybe?


----------



## tobitronics

Has any manufacturer released a UEFI GOP BIOS for the 290 and 290X yet? It would be very nice to have my ultra-fast boot times back


----------



## xGTx

So, do Powercolor 290Xs Bf4 edition on the egg actually have Hynix or do they just have Elpida ram?


----------



## VSG

All brands have been using both memory types on the reference cards so far, but it has not seemed to matter much on these cards.


----------



## Ukkooh

xGTX, they probably use both, like most of the manufacturers.


----------



## Durquavian

Quote:


> Originally Posted by *xGTx*
> 
> So, do Powercolor 290Xs Bf4 edition on the egg actually have Hynix or do they just have Elpida ram?


Probably to do with 1500 having ample bandwidth that anything higher has diminishing returns.


----------



## xGTx

Quote:


> Originally Posted by *geggeg*
> 
> All brands have been using both memory types on the reference cards so far, but it has not seemed to matter much on these cards.


Quote:


> Originally Posted by *Ukkooh*
> 
> xGTX, they propably use both like most of the manufacturers.


Thanks guys. And regarding overall build quality, I guess they should be the same given that those are reference cards.

Just to have an idea though: What's the brand most people here in general have had more success with? (talking about highest clocks, etc).


----------



## VSG

You can't predict that given the small sample size of cards from various brands. Sapphire by itself probably accounts for 50% or more of the cards sold.

Just go with a brand that has good warranty policy.


----------



## kcuestag

What's the average OC a 290 can achieve under stock volts? Something like 1050MHz? Maybe 1100MHz?


----------



## HardwareDecoder

Quote:


> Originally Posted by *kcuestag*
> 
> What's the average OC a 290 can achieve under stock volts? Something like 1050MHz? Maybe 1100MHz?


mine does 1100


----------



## 107Spartan

Quote:


> Originally Posted by *rx7racer*
> 
> Spartan what cpu do you run. I can't say for sure but that does seem a bit weak, not majorly but a tad.
> 
> Just noticed it was a 920, are you monitoring the gpu, is it dropping clocks maybe?


I've tried upping the voltage to max in Afterburner, putting the core to 1200MHz and upping the memory to 1300 (anything past that and Cloud Gate crashes), and the highest I can get is 16400...

I have the fan all the way up and the GPU temp does not go over 70, and the results do not show any dipping in clocks.

I am not sure why I score so low... I run Cloud Gate at its default settings for my score.

Any more advice would be much appreciated.


----------



## 107Spartan

Also, when I try to run firestrike in extreme it runs fine but always crashes at the physics test. Is it a cpu issue?


----------



## devilhead

Quote:


> Originally Posted by *kcuestag*
> 
> What's the average OC a 290 can achieve under stock volts? Something like 1050MHz? Maybe 1100MHz?


I've had 10 290's; with stock volts all cards were able to do 1100MHz


----------



## rx7racer

Quote:


> Originally Posted by *107Spartan*
> 
> Also, when I try to run firestrike in extreme it runs fine but always crashes at the physics test. Is it a cpu issue?


I haven't dabbled with a 920 for a while, but unless you are hitting 4GHz+ I'd say it's the physics bringing your score down. If you compare your numbers, it will probably be your CPU lagging you behind a bit.

Those i7 920's are starting to show their age. Sounds like you may be unstable as well if it keeps crashing on the physics test.


----------



## Hattifnatten

How's the performance difference between the 290 and 290X when they're under water and overclocked? I don't know if I can justify the price-difference between 290 and 290X.


----------



## zpaf

My R9 290 from start to end on 3dmark firestrike.


Result


----------



## sf101

Anyone double check the memory contact with a EKWB full cover for 290/x? wonder if supplied thermal pads are making good contact and would hate to pull it all apart to confirm.


----------



## rdr09

Quote:


> Originally Posted by *kcuestag*
> 
> What's the average OC a 290 can achieve under stock volts? Something like 1050MHz? Maybe 1100MHz?


1175 on the core here, but power limit is maxed at 50% using Trixx. Benches at 1300.


----------



## SultanOfWalmart

Quote:


> Originally Posted by *rdr09*
> 
> 1175 on the core here but power limit is maxed at 50% using Trixx. Benches at 1300.


I don't think you can do that on stock volts, which is what he was asking.

As for the original question, I would say between 1050-1100 is the average range @stock. Mine does 1090 consistently and 1100 can be artifact free depending on the temps and benchmark. No artifacts while gaming (BF4,skyrim) while @1100/1500.


----------



## rdr09

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> I don't think you can do that on stock volts, which is what he was asking.


that's the question i answered. btw, mine is watered. it could prolly go higher; i tried 1200 but no go.

here is 1300 at +200 . . .

http://www.3dmark.com/3dm11/7716320


----------



## SultanOfWalmart

Quote:


> Originally Posted by *rdr09*
> 
> that's the question i answered. btw, mine is watered. it could prolly do higher but i just tried 1200 but no go.
> 
> here is 1300 at +200 . . .
> 
> http://www.3dmark.com/3dm11/7716320


Dang 1170 at stock volts sounds like you have a keeper. Could you post some shots of benching [email protected] volt? Also, what manufacturer and bios?


----------



## rdr09

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> Dang 1170 at stock volts sounds like you have a keeper. Could you post some shots of benching [email protected] volt? Also, what manufacturer and bios?


i lied. here is 1080 at stock . . .

http://www.3dmark.com/3dm11/7718626



my bad. i'll run it again and add 100.

edit: here we go . . .

http://www.3dmark.com/3dm11/7718654



i saw artifacts, so 1175 it is. it is made by msi and using stock bios. it is not unlockable.


----------



## Forceman

Quote:


> Originally Posted by *Hattifnatten*
> 
> How's the performance difference between the 290 and 290X when they're under water and overclocked? I don't know if I can justify the price-difference between 290 and 290X.


Probably not enough to be worth the price difference - maybe 5% or so.


----------



## bir86

Quote:


> Originally Posted by *sf101*
> 
> Anyone double check the memory contact with a EKWB full cover for 290/x? wonder if supplied thermal pads are making good contact and would hate to pull it all apart to confirm.


They work fine. A drop of MX-4 on top of the pads doesn't hurt either. Just don't make the same mistake I did: I thought I needed to cut the pads myself, but found out later that they were pre-cut to fit the chips perfectly.


----------



## SultanOfWalmart

Quote:


> Originally Posted by *rdr09*
> 
> i lied. here is 1080 at stock . . .
> 
> http://www.3dmark.com/3dm11/7718626
> 
> 
> 
> my bad. i'll run it again and add 100.
> 
> edit: here we go . . .
> 
> http://www.3dmark.com/3dm11/7718654
> 
> 
> 
> i saw artifacts, so 1175 it is. it is made by msi and using stock bios. it is not unlockable.


Nice, do me a favor or, in gpuz monitoring click on the vddc and vddc1 so that it shows max voltage values recorded and run the bench, I'm curious what your vdroop is. Thanks in advance.


----------



## rdr09

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> Nice, do me a favor or, in gpuz monitoring click on the vddc and vddc1 so that it shows max voltage values recorded and run the bench, I'm curious what your vdroop is. Thanks in advance.


the VDDC jumped to 1.22v. i think there is no way to keep it from compensating at load but the way i understand the question is without touching the voltage by manually adjusting it.


----------



## Derpinheimer

Quote:


> Originally Posted by *rdr09*
> 
> the VDDC jumped to 1.22v. i think there is no way to keep it from compensating at load but the way i understand the question is without touching the voltage by manually adjusting it.


Which is why "stock voltage" has always been such a useless comparison. Mine is 1.16v under load, at stock voltage. How does that make sense to compare to someone with 1.1v load, or yours with 1.22v?

I guess I always thought it was idiotic when people said that before, with the 79XX cards. Some came stock at 1.25v and others as low as 0.95v.. of course those with high reference voltages flaunted their high OCs and made everyone else think they got bad cards.


----------



## SultanOfWalmart

That's why I wanted to see the max load voltage in a GPU-Z screenshot. Because if his 290 can do 1170 on stock volts, either something is off or he has the best clocking chip in 290 history, something I find a little hard to believe...


----------



## rdr09

Quote:


> Originally Posted by *Derpinheimer*
> 
> Which is why "stock voltage" has always been such a useless comparison. Mine is 1.16v under load, at stock voltage. How does that make sense to compare to someone with 1.1v load, or yours with 1.22v?
> 
> I guess I always thought it was idiotic when people said that before, with the 79XX cards. Some came stock at 1.25v and others as low as 0.95v.. of course those with high reference voltages flaunted their high OCs and made everyone else think they got bad cards.


the load varies based on the oc. 1.16v for what oc? i'll try it and we'll compare. my 1.22v is for 1180. i think the question concerns highest oc that can run a game or a bench without touching the voltage. can you run 3DMark11 (as an example) at 1180 without touching the voltage? i think the result would dictate how high an oc the gpu can achieve. for my gpu - 1300.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Derpinheimer*
> 
> Which is why "stock voltage" has always been such a useless comparison. Mine is 1.16v under load, at stock voltage. How does that make sense to compare to someone with 1.1v load, or yours with 1.22v?
> 
> I guess I always thought it was idiotic when people said that before, with the 79XX cards. Some came stock at 1.25v and others as low as 0.95v.. of course those with high reference voltages flaunted their high OCs and made everyone else think they got bad cards.


I agree... my 290x can do 1100 core on stock volts -- but my stock volt is only 1150 mV

So him doing 1175 or whatever @ 1.22 isn't really impressive at all.

Honestly the difference between an 1100mhz and a 1200-1250 mhz 290/x is how much 5% performance at most? Memory ocing on these does just about nothing.

I can do 1200 with +50mv @ 60hz but when I run my monitor @ 110hz everything screws up over 1100 so i'm not sure what the real issue is since monitor refresh rate SHOULD have nothing to do with my video oc, but it seems it somehow does.

also wondering if there is a bios editor for these yet, or if someone handy with editing bios could possibly help me out and show me how to change clocks etc?


----------



## Derpinheimer

Quote:


> Originally Posted by *rdr09*
> 
> the load varies based on the oc. 1.16v for what oc? i'll try it and we'll compare. my 1.22v is for 1180. i think the question concerns highest oc that can run a game or a bench without touching the voltage. can you run 3DMark11 (as an example) at 1180 without touching the voltage? i think the result would dictate how high an oc the gpu can achieve. for my gpu - 1300.


1.16 at 1100

Takes +80mV to get 1.22v under load.

This should have no relation to overall overclockability, or minimal at best. Its hard to find a connection with ASIC quality and OC but there does appear to be one.

This would be connecting stock voltage to OC, and stock voltage is likely determined from a number of ASIC pools. [Like 40-60, 61-80, and 81-100]

So the already small connection is even smaller.
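A minimal sketch of that pooling idea. The pool boundaries and voltages below are hypothetical, following the bands suggested above; nothing here is AMD's actual binning:

```python
# Minimal sketch of the "ASIC pool" idea: stock voltage is set per quality
# band rather than per individual chip, so two cards with different leakage
# can ship with the same VID. Boundaries and voltages are hypothetical.

def stock_voltage(asic_quality):
    """Map an ASIC quality percentage to a hypothetical stock VID (volts)."""
    if asic_quality > 80:
        return 1.10   # least leaky bin gets the lowest voltage
    if asic_quality > 60:
        return 1.16
    return 1.22       # leakiest bin gets the most voltage

print(stock_voltage(70.4))  # 1.16 -- a ~70% chip lands in the middle pool
```

Within a pool every card gets the same stock voltage regardless of where its ASIC lands, which is why "max OC at stock volts" compares poorly across cards.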


----------



## rdr09

Quote:


> Originally Posted by *Derpinheimer*
> 
> 1.16 at 1100
> 
> Takes +80mV to get 1.22v under load.
> 
> This should have no relation to overall overclockability, or minimal at best. Its hard to find a connection with ASIC quality and OC but there does appear to be one.
> 
> This would be connecting stock voltage to OC, and stock voltage is likely determined from a number of ASIC pools. [Like 40-60, 61-80, and 81-100]
> 
> So the already small connection is even smaller.


can you run 3DMark11 using 1180 oc without adjusting the voltage using any app including CCC? if you can't, then there is a good possibility that you can't run the same bench at 1300 (adjusting the voltage) using the same app.

now, i am just guessing here . . . it could be related to how high an oc you can run your gpu playing a game like BF4.


----------



## maynard14

Quote:


> Originally Posted by *107Spartan*
> 
> Thanks for this link. gave rep and bought software at the discount price.


yes sir no problem... i ended up cracking it lol haha


----------



## mrsus

Quote:


> Originally Posted by *HardwareDecoder*
> 
> I agree... my 290x can do 1100 core on stock volts -- but my stock volt is only 1150 mV
> 
> So him doing 1175 or whatever @ 1.22 isn't really impressive at all.
> 
> Honestly the difference between an 1100mhz and a 1200-1250 mhz 290/x is how much 5% performance at most? Memory ocing on these does just about nothing.
> 
> I can do 1200 with +50mv @ 60hz but when I run my monitor @ 110hz everything screws up over 1100 so i'm not sure what the real issue is since monitor refresh rate SHOULD have nothing to do with my video oc, but it seems it somehow does.
> 
> also wondering if there is a bios editor for these yet, or if someone handy with editing bios could possibly help me out and show me how to change clocks etc?


My 290 is acting the same with my Qnix monitor when the monitor is overclocked.

It can do 1180 core in AB with +100mV no problem when the monitor is at 60Hz, but overclocking the monitor to 120Hz will corrupt the screen.
At 1100 core the Qnix @120 won't corrupt.
I think it's the Qnix monitor, as I have another Samsung connected and only the Qnix corrupts.
Tried a better DVI-D cable and it didn't help (it does help the Qnix reach 120Hz though).
Card is water cooled and temps seem fine (load temps: Core 53C, VRM1 55C, VRM2 41C).

So the card is running at 1100MHz now, as I have to choose either a higher card OC or the monitor OC, and I can't go back to 60Hz!!!


----------



## HardwareDecoder

Quote:


> Originally Posted by *mrsus*
> 
> My 290 is acting the same with my Qnix monitor when the monitor is overclocked.
> 
> It can do 1180 core in AB with +100mV no problem when the monitor is at 60Hz, but overclocking the monitor to 120Hz will corrupt the screen.
> At 1100 core the Qnix @120 won't corrupt.
> I think it's the Qnix monitor, as I have another Samsung connected and only the Qnix corrupts.
> Tried a better DVI-D cable and it didn't help (it does help the Qnix reach 120Hz though).
> Card is water cooled and temps seem fine (load temps: Core 53C, VRM1 55C, VRM2 41C).
> 
> So the card is running at 1100MHz now, as I have to choose either a higher card OC or the monitor OC, and I can't go back to 60Hz!!!


WOW dude, our cases are exactly the same. THANK YOU so much for posting, glad it isn't just me. I had posted the same info on the Qnix thread here and no one else seemed to have the problem.

Your temps are the same as mine and the overclocking is almost exactly the same too. My options are 1100MHz @ stock volts @ 110Hz (my Qnix won't go higher with any cable, it seems) or 1200MHz w/ +25-50mV @ 60Hz. So I run 1100MHz, as I never want to go back to 60 again.

Here is the screen I get.

or everything will just get super blurry.

What I don't get is that a video card OC should not affect the max monitor OC.


----------



## SandGlass

The prices are still marked up, lowest price so far is at provantage, they're selling the XFX 290x for $559.


----------



## Derpinheimer

Quote:


> Originally Posted by *SandGlass*
> 
> The prices are still marked up, lowest price so far is at provantage, they're selling the XFX 290x for $559.


That's a pretty reasonable price... lol.
Quote:


> Originally Posted by *rdr09*
> 
> can you run 3DMark11 using 1180 oc without adjusting the voltage using any app including CCC? if you can't, then there is a good possibility that you can't run the same bench at 1300 (adjusting the voltage) using the same app.
> 
> now, i am just guessing here . . . it could be related to how high an oc you can run your gpu playing a game like BF4.


Sure, not tonight.

My max OC is somewhere around 1200. For some reason the card has artifacts like this with overvolting: 



 [ over +137mv ]

Not sure if its the PSU or the card. It only happens in battlefield [out of the few games I've tried; benches work fine.. up to a higher voltage, where they then do the same thing]

But you have to remember, if your stock voltage is +60-80mv versus mine, you might be putting some serious volts into it. Even ASUS GPU Tweak, which gives you stated voltages instead of offsets, uses an offset. Its stated voltages are made up.


----------



## hatlesschimp

Why are the ASUS 290X so much more??? Here in Australia the average price for a 290X is $650 - $700, but the ASUS variant is *$850*???


----------



## rdr09

Quote:


> Originally Posted by *Derpinheimer*
> 
> Thats a pretty reasonable price... lol.
> Sure, not tonight.
> 
> My max OC is somewhere around 1200. For some reason the card has artifacts like this with overvolting:
> 
> 
> 
> [ over +137mv ]
> 
> Not sure if its the PSU or the card. It only happens in battlefield [out of the few games I've tried; benches work fine.. up to a higher voltage, where they then do the same thing]
> 
> But you have to remember if your stock voltage is +60-80mv versus mine, you might be putting some serious volts in to it. Even ASUS GPU tweak, which gives you stated voltages instead of offsets, uses an offset. Its stated voltages are made up.


where did you get the +60-80mv? my idle stock volts is 0.984v. when loaded at stock it goes up to 1.188v. a difference of 0.204v.


----------



## DeadlyDNA

Quote:


> Originally Posted by *sf101*
> 
> Anyone double check the memory contact with a EKWB full cover for 290/x? wonder if supplied thermal pads are making good contact and would hate to pull it all apart to confirm.


I just took 3 apart, and I could see the impressions from the memory chips. The contact looked great to me.


----------



## Derpinheimer

Quote:


> Originally Posted by *rdr09*
> 
> where did you get the +60-80mv? my idle stock volts is 0.984v. when loaded at stock it goes up to 1.188v. a difference of 0.204v.


Because mine loaded is 1.16v and yours loaded is 1.22v? 60mv, +? for the fact your OC vDroop droop is likely higher (1100 vs 1180) and +? for that it actually requires me to set +80 for it to be 1.22v. Of course the absolute difference is +60

PS my idle stock volt is .956


----------



## rdr09

Quote:


> Originally Posted by *Derpinheimer*
> 
> Because mine loaded is 1.16v and yours loaded is 1.22v? 60mv, +? for the fact your OC vDroop droop is likely higher (1100 vs 1180) and +? for that it actually requires me to set +80 for it to be 1.22v. Of course the absolute difference is +60
> 
> PS my idle stock volt is .956


hmmm. at 1100 mine went to 1.14v using no offset.. at 1300 it went to 1.39v using +200. i only use Trixx and i am using stock bios.


----------



## maynard14

my r9 290 xfx idle is 0.986 and load is 1.244 huhuh

all stock....

after i unlock my card to 290x bios my idle is 0.969 and load is 1.224

max oc with stock voltage is 1070 core clock and 1260 memory clock

if i add 100mv in AB, max overclock is 1130 core and 1400 memory

max temp when oc'd is only 75c with the fan at 75 percent


----------



## rdr09

Quote:


> Originally Posted by *maynard14*
> 
> my r9 290 xfx idle os 0.986 and load is 1.244 huhuh
> 
> all stock....
> 
> after i unlock my card to 290x bios my idle is 0.969 and load is 1.224
> 
> i can only do max oc with stock voltage at 1070 core clock and memory clock is 1260
> 
> if add to 100 mv on ab max overclock is 1130 and core clock to 1400
> 
> max temp when oc is only 75c and fan to 75 percent


use trixx. you can go all the way to +200 but if you are using the stock cooler - i advise not to oc your gpu at all.


----------



## Derpinheimer

Quote:


> Originally Posted by *maynard14*
> 
> my r9 290 xfx idle os 0.986 and load is 1.244 huhuh
> 
> all stock....
> 
> after i unlock my card to 290x bios my idle is 0.969 and load is 1.224
> 
> i can only do max oc with stock voltage at 1070 core clock and memory clock is 1260
> 
> if add to 100 mv on ab max overclock is 1130 and core clock to 1400
> 
> max temp when oc is only 75c and fan to 75 percent


Odd. I have 70.4% ASIC. You?
Quote:


> Originally Posted by *rdr09*
> 
> hmmm. at 1100 mine went to 1.14v using no offset.. at 1300 it went to 1.39v using +200. i only use Trixx and i am using stock bios.


Sorry man, but you aren't making sense.

You said 1.22 for 1180, stock. Cards don't self-overvolt. Plus, adding 200mV will result in less than 200mV of net "after droop" voltage, so getting a 250mV difference is also impossible.

Gotta clarify, because right now some things aren't making sense.


----------



## rdr09

Quote:


> Originally Posted by *Derpinheimer*
> 
> Odd. I have 70.4% ASIC. You?
> Sorry man, but you arent making sense.
> 
> You said 1.22 for 1180, stock. Cards dont self-overvolt. Plus, adding 200mV will result in less than 200mV in net "after droop" voltage, so getting 250mV difference is also impossible.
> 
> Gotta clarify because right now some things arent making sense.


then GPUz does not make sense. that's what GPUz showed. when i load at stock the VDDC goes to 1.14v. at 1180, as requested by one member earlier to show the max volts, it read 1.22v that was with 0 offset.. i ran 1300 with +200 offset it read a max voltage of 1.39v.


----------



## Derpinheimer

Maybe its because you are using max voltage; I think minimum or "typical" would work better. Max voltage can always jump a bit because of temporary reductions in load. Dropping voltage? Well, it shouldnt find too much more stress in one frame than the last... of course 100% load isnt always the same (furmark vs game), but its still a more comparable metric IMO.

Here is mine, using "typical" (It generally stays at these numbers) No voltage changes, just clockspeed.
MHz:vCore
1000: 1.156
1050: 1.148
1100: 1.141

And here is typical, in OCCT:

1000: 1.094
1050: 1.086
1100: 1.078

So you can see the trend there is down; higher OC -> higher power draw -> greater vDroop.

And just to be sure, here is 1180, with +80mV in a game, since there is no way in hell mine will do that with stock.

1100: 1.219
1180: 1.211

So I don't see that the card has some sort of anti-droop mechanism kicking in at higher clocks.

Finally! With +200mV and 1100MHz core, MAX game voltage is 1.32v

So there seems to be about a 5% increase in voltage for your card over mine. Odd? I really wish some reliable source could give us safe voltages.
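The droop trend in those numbers, worked out as a simple average slope using the in-game readings quoted above:

```python
# Average vDroop slope from the in-game readings quoted above:
# higher core clock -> higher power draw -> lower load voltage.

clocks = [1000, 1050, 1100]          # MHz
vcore  = [1.156, 1.148, 1.141]       # V, "typical" in-game readings

# mV lost per MHz between each pair of adjacent readings
deltas = [(vcore[i] - vcore[i + 1]) / (clocks[i + 1] - clocks[i]) * 1000
          for i in range(len(clocks) - 1)]
droop_per_100mhz = sum(deltas) / len(deltas) * 100

print(round(droop_per_100mhz, 1))  # 15.0 -- about 15 mV of extra droop per 100 MHz
```

That slope is small enough that a card-to-card difference in stock VID (60 mV or more) dwarfs it, which is the point being argued here.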


----------



## rdr09

Quote:


> Originally Posted by *Derpinheimer*
> 
> Maybe its because you are using max voltage; I think minimum or "typical" would work better. Max voltage can always jump a bit because of temporary reductions in load. Dropping voltage? Well, it shouldnt find too much more stress in one frame than the last... of course 100% load isnt always the same (furmark vs game), but its still a more comparable metric IMO.
> 
> Here is mine, using "typical" (It generally stays at these numbers)
> MHz:vCore
> 1000: 1.156
> 1050: 1.148
> 1100: 1.141
> 
> No voltage change, just clockspeed.
> 
> And typical, OCCT:
> 
> 1000: 1.094
> 1050: 1.086
> 1100: 1.078
> 
> So you can see the trend there is down; higher OC -> higher power draw -> greater vDroop.
> 
> And just to be sure, here is 1180, with +80mV since there is no way in hell mine will do that with stock.
> 
> 1100: 1.219
> 1180: 1.211
> 
> So I dont see that the card might have some sort of anti-droop mechanism clicking in at higher clocks.
> 
> Finally! With +200mV and 1100, voltage is 1.32v.


now you've got me confused, too. i do not have to use the offset in Trixx when only benching at 1175, and that is without artifacts. i can do the same up to 1180 but it artifacts. i have to use an offset of +200 in Trixx to bench at 1300. i just checked my minimum VDDC after benching at 1300 and GPUz showed 1.15v (lol). Max was 1.39v.


----------



## Derpinheimer

Right, the reason I use +80mV is because my card wont do it with stock voltage. This is why im calling it a useless "measure", because stock voltage varies from card to card. Yours seems to be 5% higher than mine, at any given voltage.

Also when I say minimum I mean minimum while doing 3d.







1.15v must have been idle mode. .95v+200mv = 1.15v







(Offset voltages of course apply to idle and 3d voltages)

I just watched the GPU-Z voltage and recorded whatever it stabilized at.

P.S. I am totally jelly of your card but that isnt why im debating this with you
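A tiny sketch of that offset arithmetic. The 1.12 V load base below is an assumed example, not a measured value; the 0.95 V idle figure is the one quoted above:

```python
# A software voltage offset shifts both the idle and the load voltage,
# so a "1.15 V" minimum reading with +200 mV applied can just be idle.
# The 1.12 V load base is an assumed example value.

OFFSET = 0.200  # +200 mV offset, in volts

def with_offset(base_v, offset_v=OFFSET):
    """Base voltage plus the software offset, rounded to mV precision."""
    return round(base_v + offset_v, 3)

print(with_offset(0.95))   # 1.15 -- idle voltage + offset
print(with_offset(1.12))   # 1.32 -- an assumed load base + offset
```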


----------



## maynard14

Quote:


> Originally Posted by *rdr09*
> 
> use trixx. you can go all the way to +200 but if you are using the stock cooler - i advise not to oc your gpu at all.


yeah sir, i just tried it to benchmark, but i'm not running my card overclocked when gaming at all.

im happy that i can unlock my card though, but im waiting for the nzxt g10 and vrm cooling

my asic score is only 67.5 percent


----------



## Darhant

Hey, I'm having issues with my XFX R9 290's volt droop.
I'm using the standard BIOS and it is water-cooled, so temps never break 65.

These two pictures are of me running FurMark, and the graphs in Afterburner show a very unstable core clock and vcore.



I've read that flashing the PT1 BIOS can help, but will it work considering my card isn't unlockable?
Could I be due for a new power supply? (currently 850W)
Are there any other solutions?


----------



## maynard14

Quote:


> Originally Posted by *Darhant*
> 
> Hey, I'm having issues with voltage droop on my XFX R9 290.
> I'm using the standard BIOS and it is water-cooled, so temps never break 65.
> 
> In these two pictures I'm running FurMark, and the graphs in Afterburner show a very unstable core clock and vcore.
> 
> 
> 
> I've read that flashing the PT1 BIOS can help, but will it work considering my card isn't unlockable?
> Could I be due for a new power supply? (currently 850W)
> Are there any other solutions?


We're about the same, sir... stock settings or unlocked 290X BIOS, still unstable GPU load and voltage, but I don't know why our cards are like this.

Using the stock cooler for the card


----------



## Arizonian

Quote:


> Originally Posted by *TommyMoore*
> 
> R9 290 now under water
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Quick run on Firestrike
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 3770k @ 4.2 - R9 290 @ 1198 / 1497
> 
> Score 10214
> Graphics 12717
> Physics 10064
> Combined 4164


Congrats - updated


----------



## Sazz

Well, the good news is that prices may go down once every custom PCB gets out. The Asus DCUII (290X) was priced at $569.99 MSRP, so that's 10 bucks lower than the launch price of a reference 290X. The only thing that is really overly marked up is the 290, which is over 100 bucks more than what it cost at launch.


----------



## Iniura

If my card does 1200 Core on +100mV, how high do you think it will clock with +200mV?
Could 1200 still be the highest it goes, or will it most likely be higher? Think I can manage 1300MHz?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Iniura*
> 
> If my card does 1200 Core on +100mV, how high do you think it will clock with +200mV?
> Could 1200 still be the highest it goes?, or will it most likely be higher? Think I can manage 1300 MHz?


Keep the temps down and go for gold! There are people hitting 1250+ with an added 200mV. Not a lot, though.


----------



## Sazz

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Keep the temps down and go for gold! There are people hitting 1250+ with an added 200mV. Not a lot, though.


My first 290X reached 1275 on the core using the Asus BIOS with GPU Tweak. I set GPU Tweak to 1.450V, but from what I have seen, whatever you see in GPU Tweak is not really the voltage you're getting; you are actually getting a lower voltage, if I remember right. I haven't been OCing lately; I still need to put my waterblock back on my new 290X. Procrastinating; just feeling too lazy lately with all this food from the holidays xD


----------



## 107Spartan

For me in BF4, going from 1150 to 1200MHz at 1440p, ultra everything, at 150% supersampling, there is zero increase in fps....


----------



## Forceman

Quote:


> Originally Posted by *Iniura*
> 
> If my card does 1200 Core on +100mV, how high do you think it will clock with +200mV?
> Could 1200 still be the highest it goes?, or will it most likely be higher? Think I can manage 1300 MHz?


Mine didn't scale for crap. 1200 with +100 and 1250 with +200.


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> Where did you get the +60-80mV? My idle stock voltage is 0.984V. When loaded at stock it goes up to 1.188V. A difference of 0.204V.


Both of my 290's also have the same idle voltage, 0.984V, & with the GPU-Z render test, one maxes at 1.227 - 1.234V (ASIC 70.3%, Elpida) & the other maxes at 1.211V (ASIC 77%, Hynix).
Quote:


> Originally Posted by *HardwareDecoder*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> WOW dude our cases are like exactly the same THANK YOU so much for posting glad it isn't just me. I had posted the same info on the qnix thread here and no one else seemed to have the problem.
> 
> Your temps are the same as mine and the over clocking is almost exactly the same too. My options are 1100mhz @ stock volts @ 110hz (my qnix won't go higher with any cable it seems) or 1200mhz w/ +25-50 mV @ 60hz. So I run 1100mhz as I never want to go back to 60 again.
> 
> Here is the screen I get.
> 
> 
> or everything will just get super blurry.
> 
> 
> 
> What I don't get is that a video card OC shouldn't affect the max monitor OC.


Can you point me to where I can find a tutorial to overclock my monitor? I have a Samsung monitor though.
Quote:


> Originally Posted by *Darhant*
> 
> Hey I having issues with my XFX R9 290 volt droop.
> I'm using standard bios and it is water-cooled so temps never break 65.
> 
> In these two pictures is me running furmark and the graphs in afterburner are show very unstable core clock and vcore.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> I've read that flashing the PT1 bios can help but will it work considering my card isn't unlock-able?
> Could I be due for a new power supply?(currently 850w)
> Is there any other solutions?


850W is plenty for one 290. A GPU clock fluctuating while benching/stressing is strange though when the temp is low. This is mine while playing BF4 (one round). I didn't use AB to monitor because I can't scroll the graph.

(top is core clock, bottom is gpu usage)


Quote:


> Originally Posted by *maynard14*
> 
> where about the same sir... stock settings or unlock 290x bios still unstable gpu load and votlage,..but i dont know why are cards are like this..
> 
> using stock cooler for the card


While playing games, did you experience micro-stutter or micro-lag?


----------



## Forceman

Quote:


> Originally Posted by *kizwan*
> 
> Both of my 290's also have the same idle voltage, 0.984V & with GPU-Z render test, one max at 1.227 - 1.234V (ASIC 70.3%, Elpida) & another max at 1.211V (ASIC 77%, Hynix).


Hmm. My 290 flashed to 290X idles at 0.961 and it runs the render test at 1.18 (both at 1000 and at 1100). Tempted to open the case and flip back to the 290 BIOS and see what it does.


----------



## HardwareDecoder

Quote:


> Originally Posted by *kizwan*
> 
> Can you point me to where I can find a tutorial to overclock my monitor? I have a Samsung monitor though.


Idk if you can overclock your monitor, although the QNIX does use a Samsung PLS panel.


----------



## Sgt Bilko

Quote:


> Originally Posted by *hatlesschimp*
> 
> Why are the ASUS 290X so much more? Here in Australia the average price for a 290X is $650 - $700, but the ASUS variant is *$850*?


Well, if you are talking about PCCG, that's because they get stuff through Altech: http://altech.com.au/productinfo.aspx?id=66208&product=ASUS_AMD_R9_290X_PCI-E_3_0_4GB_GDDR5__DVI-d__D-SUB__HDMI

And no idea why Asus is always more expensive. It makes some sense on custom cards, but for the reference design? Just an Asus tax, I guess


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> Both of my 290's also have the same idle voltage, 0.984V & with GPU-Z render test, one max at 1.227 - 1.234V (ASIC 70.3%, Elpida) & another max at 1.211V (ASIC 77%, Hynix).


My GPU's ASIC is 79%, Elpida. That 1.188V was based on the GPU-Z render test as well.


----------



## kcuestag

I have an issue where sometimes in Battlefield 4 I get poor framerate, and I've noticed both GPUs are well underclocked, nowhere close to 947MHz.

Here's the MSI AB graph:



Sometimes it fixes itself after minimizing and maximizing again, but sometimes I have to close the game or even reboot the computer. I'm using the latest 13.12 official driver; any idea?

PS: That's in windowed mode; that's why the 2nd GPU is not used at all. In fullscreen it does the same thing but with both GPUs, not just GPU1.


----------



## steelkevin

So I have my R9 290 under water clocked @1100/1300MHz with everything else @stock.

Does that seem like a decent overclock ?

With Trixx I can get the vcore to +200, while AB only allows +100, but I don't really see the point of it. I could do Firestrike without any artifacting at +200 and power limit +50 with 1220/1750MHz, but the card makes an awful whine, so I wouldn't leave it at that.
Now that I think about it: with my usual overclock (1100/1300MHz with vcore and power limit @stock), on some BF4 maps the card whines a little bit and on others it doesn't at all. What's that about? Not that it really bothers me, considering that when I do play I have my headset on anyway.

Oh, and what are other water-cooling users' temps like? I have two 280mm rads and the GPU stays under 50 if I don't play much (as in about an hour), but I saw it go up to 57°C yesterday. Not much to worry about, but when I see people running those custom heatsinks or AiOs with the NZXT adapter getting only a couple degrees more (with near-stock VRM temps though ^^), it makes me wonder if there's not something wrong with my loop.

Thanks in advance









EDIT: Sooner or later I'll probably be chucking a second one in there so I can fully benefit from a good monitor.


----------



## quakermaas

Quote:


> Originally Posted by *kcuestag*
> 
> I have an issue where sometimes in Battlefield 4 I get poor framerate, and I've noticed both GPUs are well underclocked, nowhere close to 947MHz.
> 
> Here's the MSI AB graph:
> 
> 
> 
> Sometimes it fixes itself after minimizing and maximizing again, but sometimes I have to close the game or even reboot the computer. I'm using the latest 13.12 official driver; any idea?
> 
> PS: That's in windowed mode; that's why the 2nd GPU is not used at all. In fullscreen it does the same thing but with both GPUs, not just GPU1.


What are your temperatures like? It may be throttling due to high temps, but it looks like it is throttling too much for that.


----------



## velocityx

Quote:


> Originally Posted by *kcuestag*
> 
> I have an issue where sometimes in Battlefield 4 I get poor framerate, and I've noticed both GPUs are well underclocked, nowhere close to 947MHz.
> 
> Here's the MSI AB graph:
> 
> 
> 
> Sometimes it fixes itself after minimizing and maximizing again, but sometimes I have to close the game or even reboot the computer. I'm using the latest 13.12 official driver; any idea?
> 
> PS: That's in windowed mode; that's why the 2nd GPU is not used at all. In fullscreen it does the same thing but with both GPUs, not just GPU1.


Quote:


> Originally Posted by *quakermaas*
> 
> What are your temperatures like? It may be throttling due to high temps, but it looks like it is throttling too much for that.


Your GPU fans are barely over 1000/1100 RPM. He is temp throttling there, as fans should only be at that speed at idle, not under 3D loads.


----------



## kcuestag

Quote:


> Originally Posted by *quakermaas*
> 
> What are your temperatures like? It may be throttling due to high temps, but it looks like it is throttling too much for that.


Quote:


> Originally Posted by *velocityx*
> 
> Your GPU fans are barely over 1000/1100 RPM. He is temp throttling there, as fans should only be at that speed at idle, not under 3D loads.


They were barely hitting 86-90ºC when that happened; they weren't at the 95ºC they used to be at full load.

Edit:

Just noticed I had a max fan speed of 47% set in CCC; I remember I dropped that last night. I'll set it back to ~75%. Can't wait to get my 2nd waterblock so that I can slap both under water.


----------



## EliteReplay

I'm so disappointed with AMD.

Hi guys, I really love AMD GPUs and AMD CPUs, but I've got to the point that I'm switching to Intel/Nvidia.

The reason is too much talk and promises not being kept... just empty as always.
Mantle was the main reason I was hoping AMD would do well, but we don't have a clue or a real-world test
that shows us how it is going to change the gaming industry or anything...

I just decided to go 3930K and GTX 780,
after a lot of thinking, and using AMD
since 2006.

The R9 290 / 290X are giving people:

artifacts
random OC potential
no availability
crashes
black screens
more power usage

So guys, goodbye until AMD does well.


----------



## Spartacvs

For overclock testing without vdroop, is it good to flash the PT1 or PT3 BIOS?


----------



## quakermaas

Quote:


> Originally Posted by *EliteReplay*
> 
> I'm so disappointed with AMD.
> 
> Hi guys, I really love AMD GPUs and AMD CPUs, but I've got to the point that I'm switching to Intel/Nvidia.
> 
> The reason is too much talk and promises not being kept... just empty as always.
> Mantle was the main reason I was hoping AMD would do well, but we don't have a clue or a real-world test
> that shows us how it is going to change the gaming industry or anything...
> 
> I just decided to go 3930K and GTX 780,
> after a lot of thinking, and using AMD
> since 2006.
> 
> The R9 290 / 290X are giving people:
> 
> artifacts
> random OC potential
> no availability
> crashes
> black screens
> more power usage
> 
> So guys, goodbye until AMD does well.


Enjoy your new set-up, I'm very happy with my 290s, no major problems at all so far.


----------



## quakermaas

Quote:


> Originally Posted by *Spartacvs*
> 
> For overclock testing without vdroop, is it good to flash the PT1 or PT3 BIOS?


If you are new to overclocking, stick to one of the normal BIOSes. PT1 and PT3 are for extreme overclocking, and if you are not aware of the risks you might damage your card.


----------



## maynard14

Quote:


> Originally Posted by *kizwan*
> 
> Both of my 290's also have the same idle voltage, 0.984V & with GPU-Z render test, one max at 1.227 - 1.234V (ASIC 70.3%, Elpida) & another max at 1.211V (ASIC 77%, Hynix).
> Can you point me where I can find tutorial to overclock my monitor. I have Samsung monitor though.
> 850W is plenty for one 290. GPU clock fluctuating while bench/stress is strange though when temp is low. This is mine while playing BF4 (one round). I didn't use AB to monitor because I can't scroll the graph.
> 
> (top is core clock, bottom is gpu usage)
> 
> While playing games did you experiencing micro-stutter or micro-lag?


No sir, none at all; rock stable, FPS is stable even in benchmarking. My issue is really weird, even on the stock 290 BIOS and the 290X modded BIOS on my XFX R9 290.


----------



## Hattifnatten

Quote:


> Originally Posted by *EliteReplay*
> 
> Mantle was the main reason I was hoping AMD would do well, but we don't have a clue or a real-world test
> that shows us how it is going to change the gaming industry or anything...


But we do. Much information was given at APU13.




 a "demo" from Oxide; very impressive imo, as the unit count just goes up, and up, and up, and still no sign of slowing down.


----------



## kcuestag

I just uninstalled the drivers using DDU (Display Driver Uninstaller) in Safe Mode and installed them again. Before, it didn't show "Catalyst Version" in CCC's Software Info; now it does. So my previous installation was probably broken, hence the framerate drops. I've seen other people complain about the same thing in BF4, so this should help.


----------



## maynard14

haha my card is really weird : (



100 percent fan

stock settings

latest amd driver


----------



## rdr09

Quote:


> Originally Posted by *EliteReplay*
> 
> im so disapointed with AMD
> 
> Hi guys i really love AMD Gpus and AMD CPUs but i got to a point that im switching to intel/nvidia
> 
> the reason is to much talk and promises not being made... just empty as always
> Mantle was the main reason a was hoping amd to do well but we dont have a clue or a real wolrd test
> that show us how is going to change gaming industry or something...
> 
> i just decided to go 3930K and GTX780
> after a lot of thinking and using AMD
> since 2006
> 
> the R9290 / 290X are giving people
> 
> artifacts
> OC pontial is ramdonly
> No availability
> crashes
> black screen
> More Power usage
> 
> So guys Goodbye until AMD do well.


I'll never pair my 290 with an AMD CPU, atm. Enjoy!


----------



## EvolveGamingPC

Just got my sapphire r9 290! 
Reference cooled, but I might get a stick-on aftermarket cooler at some point. Loving it so far!


----------



## steelkevin

Quote:


> Originally Posted by *steelkevin*
> 
> So I have my R9 290 under water clocked @1100/1300MHz with everything else @stock.
> 
> Does that seem like a decent overclock ?
> 
> With Trixx I can get the vcore to +200, while AB only allows +100, but I don't really see the point of it. I could do Firestrike without any artifacting at +200 and power limit +50 with 1220/1750MHz, but the card makes an awful whine, so I wouldn't leave it at that.
> Now that I think about it: with my usual overclock (1100/1300MHz with vcore and power limit @stock), on some BF4 maps the card whines a little bit and on others it doesn't at all. What's that about? Not that it really bothers me, considering that when I do play I have my headset on anyway.
> 
> Oh, and what are other water-cooling users' temps like? I have two 280mm rads and the GPU stays under 50 if I don't play much (as in about an hour), but I saw it go up to 57°C yesterday. Not much to worry about, but when I see people running those custom heatsinks or AiOs with the NZXT adapter getting only a couple degrees more (with near-stock VRM temps though ^^), it makes me wonder if there's not something wrong with my loop.
> 
> Thanks in advance
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: Sooner or later I'll probably be chucking a second one in there so I can fully benefit from a good monitor.


Anybody ?
Water cooled temperatures ?
Whine ?
Water cooled stable OC ?


----------



## amlett

Quote:


> Originally Posted by *steelkevin*
> 
> Anybody ?
> Water cooled temperatures ?
> Whine ?
> Water cooled stable OC ?


I unlocked mine to a 290X.

This one is on stock volts at 1100/1300

http://img24.imageshack.us/img24/2076/rvnn.png

And this with more volts at 1200/1500.

http://img801.imageshack.us/img801/723/pcfd.png

I'm using an EK full block with Fujipoly thermal pads

Max OC on water with the ASUS BIOS is 1225/1600. Not worth it.

Using 1200/1500 for BF4 (120fps maxed at 1050p on a 2233RZ, no vsync), and 1100/1300 or stock for anything else at 1200p with vsync at 60Hz, if the game doesn't need the 1200/1500 (about 100W more than with 1100/1300).

My temps in BF4 are ~53º with 27ºC in the room; one 480 rad for the whole system.

Whine when I use 1200/1500 and the case is open. Can't hear it while playing.


----------



## steelkevin

Quote:


> Originally Posted by *amlett*
> 
> I unlocked mine to a 290X.
> 
> This one is on stock volts at 1100/1300
> 
> http://img24.imageshack.us/img24/2076/rvnn.png
> 
> And this with more volts at 1200/1500.
> 
> http://img801.imageshack.us/img801/723/pcfd.png
> 
> I'm using an EK full block with Fujipoly thermal pads
> 
> Max OC on water with the ASUS BIOS is 1225/1600. Not worth it. Using 1200/1500 for BF4 (120fps maxed at 1050p on a 2233RZ, no vsync), and 1100/1300 or no OC for anything else at 1200p with vsync.
> 
> My temps in BF4 are ~53º with 27ºC in the room; one 480 rad for the whole system.


Alright thanks a lot







.

How about the whine? Are you getting any of that?
And WOW, 120fps with everything maxed out Oo? Seems like a lot. On those new China Rising maps I've seen it as low as 55fps, I think.


----------



## amlett

Edited the post with some things I think I should have included.
Quote:


> Originally Posted by *steelkevin*
> 
> Alright thanks a lot
> 
> 
> 
> 
> 
> 
> 
> 
> .
> 
> How about the whine ? Are you getting any of that ?
> And WOW. 120fps with everything maxed out Oo ? Seems like a lot. On those new China Rising maps I've seen as low as 55fps I think.


It's 1680x1050. With stock clocks you can see ~100 avg fps. Didn't try with the 290 BIOS; it's been a 290X since day 1.

Also, I didn't check the min fps in the new maps. I checked during the first days, when I was ensuring OC stability. I suppose it can go down to fewer fps in some extreme situations, but the game goes incredibly smooth at 1200/1500.


----------



## steelkevin

Quote:


> Originally Posted by *amlett*
> 
> Edited the post with some things I think I should have included.
> It's 1680x1050. With stock clocks you can see ~100 avg fps. Didn't try with the 290 BIOS; it's been a 290X since day 1.


Oh, right. Thought you meant 1080p but had just missed the 8 key and tapped the 5







.
Well that makes sense.

Oh, and I just read the edited parts. Answers it all. Thanks a ton mate


----------



## Roy360

How big of a power supply would I need to run 3 of them to mine? I can't decide between the Corsair 1200AX (Gold) for 199.99, the XFX 1000 (Platinum) for 199.99, or the EVGA 1000 P2/G2 (Platinum/Gold) for 179.99/159.99.

System:
i5 3570K @ 4.6GHz @ 1.195V. I don't think the rest of the stuff matters.


----------



## Roy360

Or even better, could I use an XFX 850W Gold? It costs $104.


----------



## Arizonian

Quote:


> Originally Posted by *EvolveGamingPC*
> 
> Just got my sapphire r9 290!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Reference cooled, but I might get a stick-on aftermarket cooler at some point. Loving it so far!


Congrats - added


----------



## Derpinheimer

Quote:


> Originally Posted by *Forceman*
> 
> Mine didn't scale for crap. 1200 with +100 and 1250 with +200.


Quote:


> Originally Posted by *steelkevin*
> 
> Anybody ?
> Water cooled temperatures ?
> Whine ?
> Water cooled stable OC ?


Water-cooled temps [BF4]
Ambient: 20c
Core max: 42c
VRM1 max: 50c
VRM2 max: 48c

Whine: No, buzz - yes

Stable OC: Not totally sure. Was running 1200MHz @ +137mV, 1350 memory.


----------



## steelkevin

Quote:


> Originally Posted by *Derpinheimer*
> 
> Water cool temp [bf4]
> Ambient: 20c
> Core max: 42c
> VRM1 max: 50c
> VRM2 max: 48c
> 
> Whine: No, buzz - yes
> 
> Stable OC: Not totally sure. Was running 1200MHz @ +137mV, 1350 memory.


42°C Max Oo ?! What rads do you have ?


----------



## Derpinheimer

Well, amlett got 53ºC with 27ºC ambient, so a 26ºC rise.

I got a 22ºC rise. Not a huge difference.

As for rads, I have:
1x 3x180mm, 45mm thick
1x 1x120mm, 80mm thick
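Comparing loops by delta over ambient, as done here, normalizes away room temperature; a trivial sketch of the arithmetic:

```python
def delta_t(core_c, ambient_c):
    """Temperature rise of the core over room ambient, in degrees C."""
    return core_c - ambient_c

# The two loops being compared in this exchange:
print(delta_t(53, 27))  # 26 C rise
print(delta_t(42, 20))  # 22 C rise
```

This is why raw core temps from two systems in different rooms aren't directly comparable, while the rises are.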


----------



## TheSoldiet

I ordered a Seidon 120M for the red mod and a Samsung 840 EVO 250GB







You can change me to water-cooled soon


----------



## VSG

Well, I have done it and sold my baby overclockers to a miner, since it looked unlikely that Asus would have my motherboard fixed within the next 2 weeks. Please forgive me









Arizonian, I may well go ahead and get 2 reference 290s if the price drops soon, so don't remove me from the list just yet. But if I get swayed away by those green guys, I will let you know.

Edit: Listed my waterblocks for sale on the OCN marketplace if anyone is interested.


----------



## EvolveGamingPC

Hey people, my 290 hasn't down-clocked the memory after gaming; it's sitting at 1250MHz instead of the normal 150MHz at idle, and it's hitting 68 degrees idle when it is normally around 44 degrees. Anyone else have this, or a way to fix it?


----------



## Redvineal

What's up party people? Anyone have an idea what the stock cooler VRAM and VRM thermal pad materials are and/or what their thermal conductivity ratings are?

I'm especially interested in the gray stripe that makes contact with VRM1 and how it compares to Fujipoly Extreme (11w/mK) thermal pads.

Thanks!


----------



## Jack Mac

Quote:


> Originally Posted by *EvolveGamingPC*
> 
> Hey people, my 290 hasn't down-clocked the memory after gaming; it's sitting at 1250MHz instead of the normal 150MHz at idle, and it's hitting 68 degrees idle when it is normally around 44 degrees. Anyone else have this, or a way to fix it?


I have this as well. I down clock with CCC and undervolt with AB.


----------



## the9quad

Quote:


> Originally Posted by *EvolveGamingPC*
> 
> Hey people, my 290 hasn't down-clocked the memory after gaming; it's sitting at 1250MHz instead of the normal 150MHz at idle, and it's hitting 68 degrees idle when it is normally around 44 degrees. Anyone else have this, or a way to fix it?


I know mine wouldn't downclock at 120Hz, but it does at 60Hz.


----------



## EvolveGamingPC

Quote:


> Originally Posted by *the9quad*
> 
> I know mine wouldn't downclock at 120 hz but does at 60 hz.


Awesome man, that works. I just disabled my secondary monitor and it downclocked to 150MHz


----------



## alawadhi3000

Hello guys, I just grabbed a Sapphire R9-290.

I installed it in my system (which has a GTX 670 as the main GPU), but AMD CCC doesn't display the performance tab, so I can't overclock.

Even MSI Afterburner and Sapphire Trixx can't read the clocks and fan speed of the card, and the ASUS overclocking tool only reads the GTX 670.

The latest driver is installed (13.12 WHQL); any suggestions?


----------



## HardwareDecoder

Quote:


> Originally Posted by *alawadhi3000*
> 
> Hello guys, I just grabbed a Sapphire R9-290.
> 
> I installed it in my system (which has a GTX 670 as the main GPU), but AMD CCC doesn't display the performance tab, so I can't overclock.
> 
> Even MSI Afterburner and Sapphire Trixx can't read the clocks and fan speed of the card, and the ASUS overclocking tool only reads the GTX 670.
> 
> The latest driver is installed (13.12 WHQL); any suggestions?


Your post is a bit confusing. You are using a 670 + a 290 at the same time? I wonder if you have a driver conflict between Nvidia and AMD.

No point in using a 670 for anything other than PhysX if you are running a 290... the 290 is much more powerful.


----------



## alawadhi3000

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Your post is a bit confusing. You are using a 670+a 290 at the same time? I wonder if you have a driver conflict between nvidia and amd.'
> 
> No point in using a 670 for anything other than phys-x if you are running a 290... 290 is much more powerful.


I know that.

Yes, I'm using both, the GTX670 is for the display and gaming, the R9-290 is for mining.









There are no driver conflicts yet, only the problem I listed.


----------



## Forceman

Quote:


> Originally Posted by *alawadhi3000*
> 
> There are no driver conflicts yet, only the problem I listed.


I'd guess the problem you are having _is_ the driver conflict.


----------



## HardwareDecoder

Quote:


> Originally Posted by *alawadhi3000*
> 
> I know that.
> 
> Yes, I'm using both, the GTX670 is for the display and gaming, the R9-290 is for mining.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> There are no driver conflicts yet, only the problem I listed.


Ahh, I forgot about mining for a minute, but yeah, it's pretty certain you are having a driver conflict....


----------



## alawadhi3000

Any solution or workaround then? AMD cards are unusable when mining; at least on Nvidia you can use the PC with minimal lag for watching videos, browsing, etc.


----------



## Korayyy

Throw me on the list, Christmas present to me! 2x 290



Sorry for the blurry photo, excited/crappy camera makes for that.

Stock cooling, will be under water next week sometime after my blocks come in.

GPU-Z : http://www.techpowerup.com/gpuz/7puuk/ - Asus

http://www.techpowerup.com/gpuz/af3fr/ - Sapphire


----------



## Arizonian

Quote:


> Originally Posted by *Korayyy*
> 
> Throw me on the list, Christmas present to me! 2x 290
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Sorry for the blurry photo, excited/crappy camera makes for that.
> 
> Stock cooling, will be under water next week sometime after my blocks come in.
> 
> GPU-Z : http://www.techpowerup.com/gpuz/7puuk/ - Asus
> 
> http://www.techpowerup.com/gpuz/af3fr/ - Sapphire


Congrats & Merry Christmas - added


----------



## givmedew

Quote:


> Originally Posted by *alawadhi3000*
> 
> Any solution or workaround then? AMD cards are unusable when mining; at least on Nvidia you can use the PC with minimal lag for watching videos, browsing, etc.


You can OC the card with commands in CGMINER.

If you have a 2nd-, 3rd-, or 4th-gen Intel CPU, you can use its integrated graphics for your display with no lag.

Your issue is NOT a driver conflict. You have to have a monitor connected to the card, with a signal going to it, in order to OC it the normal way or to see its temp in GPU-Z.
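For reference, GPU-era cgminer builds (3.7.x and older) took clock and intensity settings either as command-line flags (--gpu-engine, --gpu-memclock, --intensity) or from a cgminer.conf file. A hypothetical config fragment; the values are purely examples, so check them against your own card and your build's --help before using anything like this:

```json
{
  "intensity": "13",
  "gpu-engine": "1000",
  "gpu-memclock": "1250"
}
```

Lower intensity values leave the GPU responsive enough for desktop use at some cost to hashrate, which is the trade-off discussed elsewhere in the thread.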


----------



## Derpinheimer

Quote:


> Originally Posted by *alawadhi3000*
> 
> Any solution or workaround then? AMD cards are unusable when mining; at least on Nvidia you can use the PC with minimal lag for watching videos, browsing, etc.


I can play games while mining on the same GPU.

Turn down the intensity.


----------



## Novulux

Quote:


> Originally Posted by *Derpinheimer*
> 
> I can play games while mining on the same GPU ?
> 
> Turn down intensity..


Yeah, I almost always use my PC while it is mining; I can even play less graphics-intensive games like CS:GO or TF2. Not too much of a hit to the hashrate.


----------



## HOMECINEMA-PC

Did this last night......

HOMECINEMA-PC [email protected]@2428 Giga 290 on wasser [email protected] *P18643*



http://www.3dmark.com/3dm11/7720973

Did my first atiflash with the Asus PT1T BIOS, using Trixx or GPU Tweak. Might try different unlocked BIOSes to see if I can improve


----------



## Derpinheimer

Awesome man!

For a moment I thought that meant 1400..mV, hehe, that would be incredible.

Is that game stable?


----------



## kizwan

Quote:


> Originally Posted by *kcuestag*
> 
> I have an issue where sometimes in Battlefield 4 I get poor framerate, and I've noticed both GPUs are well underclocked, nowhere close to 947MHz.
> 
> Here's the MSI AB graph:
> 
> 
> 
> Sometimes it fixes itself after minimizing and maximizing again, but sometimes I have to close the game or even reboot the computer. I'm using the latest 13.12 official driver; any idea?
> 
> PS: That's in windowed mode; that's why the 2nd GPU is not used at all. In fullscreen it does the same thing but with both GPUs, not just GPU1.


Quote:


> Originally Posted by *kcuestag*
> 
> I just uninstalled the drivers using DDU (Display Driver Uninstaller) in Safe Mode and installed them again. Before it didn't show "Catalyst Version" on CCC's Software Info, now it does, so my previous installation was probably broken and thus the framerate drops, I've seen other people complain about the same thing in BF4, so this should help.


Since you mentioned this, I also got the same issue: poor FPS & fluctuating clock frequency on the second GPU. Re-installing the driver & switching the GPU slots (my Hynix card tends to run hotter than the Elpida when the Hynix is in the first slot) fixed the issue.
Quote:


> Originally Posted by *steelkevin*
> 
> So I have my R9 290 under water clocked @1100/1300MHz with everything else @stock.
> 
> Does that seem like a decent overclock ?
> 
> With Trixx I can get the vcore to +200, while AB only allows +100, but I don't really see the point of it. I could do Firestrike without any artifacting at +200 and power limit +50 with 1220/1750MHz, but the card makes an awful whine, so I wouldn't leave it at that.
> Now that I think about it: with my usual overclock (1100/1300MHz with vcore and power limit @stock), on some BF4 maps the card whines a little bit and on others it doesn't at all. What's that about? Not that it really bothers me, considering that when I do play I have my headset on anyway.
> 
> Oh, and what are other water-cooling users' temps like? I have two 280mm rads and the GPU stays under 50 if I don't play much (as in about an hour), but I saw it go up to 57°C yesterday. Not much to worry about, but when I see people running those custom heatsinks or AiOs with the NZXT adapter getting only a couple degrees more (with near-stock VRM temps though ^^), it makes me wonder if there's not something wrong with my loop.
> 
> Thanks in advance
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: Sooner or later I'll probably be chucking a second one in there so I can fully benefit from a good monitor.


I guess the 290/290X produces a lot of heat. Usually, at stock clocks to a mild overclock, the 40s Celsius is the temp range most GPUs reach, depending on ambient of course. In a high ambient, >30C, temps will be around the 50s Celsius. These are in-game temps though. I guess with the 290/290X the numbers may be slightly higher.
Quote:


> Originally Posted by *quakermaas*
> 
> Enjoy your new set-up, I'm very happy with my 290s, no major problems at all so far.


Same here, no major problem at all. Happy as a clam.
Quote:


> Originally Posted by *maynard14*
> 
> No sir, none at all; rock stable, FPS is stable even in benchmarking. My issue is really weird, even on the stock 290 BIOS and the 290X modded BIOS on my XFX R9 290.


That is good news, I guess. The only thing I can think of is a fresh driver installation: uninstall the current driver using DDU and re-install it again.

Quote:


> Originally Posted by *rdr09*
> 
> i'll never pair my 290 to an AMD cpu, atm. enjoy!


Me neither!
Quote:


> Originally Posted by *Roy360*
> 
> How big of power supply would I need to run 3 to mine? I can't decide between the Corsair 1200AX(Gold) for 199.99, the XFX 1000 (Platinum) for 199.99 or the EVGA 1000P2/G2 (Platinum/Gold) for 179.99/159.99
> 
> System:
> i5 3570k @ 4.6V @ 1.195. I don't think the rest of the stuff matter.


Quote:


> Originally Posted by *Roy360*
> 
> or even better, could I use a XFX 850W Gold? costs 104$


290 or 290X? 850W is too little for three cards. I wouldn't go below 1000W; 1200W should be your minimum.
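The arithmetic behind that kind of recommendation can be sketched like this; the per-component wattages are illustrative assumptions, not measured figures:

```python
# Rough power budget for a three-card mining rig. Every figure here is an
# illustrative assumption, not an official spec -- measure your own draw.
CARD_W = 260              # approx. draw of one R9 290/290X under mining load
CPU_W = 120               # overclocked i5 3570K, ballpark
REST_W = 80               # motherboard, RAM, drives, fans
MAX_LOAD_FRACTION = 0.8   # keep sustained draw at or below ~80% of the rating

total_w = 3 * CARD_W + CPU_W + REST_W
recommended_w = total_w / MAX_LOAD_FRACTION

print(f"Estimated draw: {total_w} W")
print(f"Recommended PSU rating: {recommended_w:.0f} W")
```

With three Hawaii cards mining flat out, even fairly gentle assumptions land well above 850W sustained, which is why 1000W is the floor and 1200W the comfortable choice.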
Quote:


> Originally Posted by *alawadhi3000*
> 
> Hello guys, I just grabbed a Sapphire R9-290.
> 
> I installed it in my system (That has a GTX670 as the main GPU) but the AMD CCC doesn't display the performance tab, I can't overclock.
> 
> Even MSI Afterburner and Sapphire Trixx can't read the clocks and fan speed of the card, ASUS overclocking tool only reads the GTX670.
> 
> Latest driver is installed (13.12 WHQL), any suggestions??


You probably need to disable ULPS (MSI Afterburner has a toggle for it in its settings). MSI AB & TriXX should be able to pick up your card afterwards, as should the Performance tab in CCC.


----------



## hatlesschimp

Well my 3rd 290X arrived today! I placed the order in the afternoon Christmas eve and it arrived today the 27th at 12pm. Excellent!!! Thanks scorptec.com.au in Clayton!!!

Well, what can I say? Not much room between the cards LOL. It looks good though. I wonder how hot these bad boys will get? I probably should finally invest in water cooling, but I have to wait and see whether we build a new house or not.

Well, I ran the new TriFire 290X setup through Firestrike Extreme and loved the results at stock settings. These cards are monsters for the price and I'm very happy with them. I had Tri-SLI Titans and they were good as well, but now, after one benchmark, I can see both sides of the coin: AMD gets better FPS numbers but Nvidia does it a bit smoother. Basically that's the difference I've seen after one benchmark LOL. I'm sure this could all change after some proper gaming time.

*FIRESTRIKE EXTREME RESULTS:*
Trifire 290X - Stock = *x12097* (on air)
http://www.3dmark.com/fs/1411481

Crossfire - Mild/Slight overclock = *x8925* (on air)
http://www.3dmark.com/fs/1188972

Crossfire - Stock = *x8699* (on air)
http://www.3dmark.com/fs/1188953

Single 290X - Stock = *x5004* (on air)
http://www.3dmark.com/fs/1178643

Tri-SLI Titans - Best Overclock with one or two artifacts = *x13246* (I wrung their necks on air)
http://www.3dmark.com/3dm11/6869871


----------



## Iniura

MSI Afterburner's max memory clock is 1625MHz? If someone wanted to take that a little bit higher, how would you accomplish that? Use other software like TriXX or something?

My card can't go higher than 1200MHz core on air, even going from +100mV to +200mV. Should it be possible to go higher once under water, or probably not? Right now the GPU doesn't get hotter than 80 degrees, and VRM1 and VRM2 stay beneath 60 degrees on air. Would the much lower temperatures under water make a difference and help get a little bit more core clock, or not?

Firestrike Extreme -- XFX R9 290 Black OC Edition -- gpu score 5862 -- 1200/1625

http://www.3dmark.com/fs/1410905


----------



## Iniura

Quote:


> Originally Posted by *kcuestag*
> 
> I have an issue where sometimes in Battlefield 4 I have poor framerate and I noticed both GPU's are well underclocked, and not anywhere close to 947MHz.
> 
> Here's the MSI AB graph:
> 
> 
> 
> Sometimes it fixes it's self after minimizing and maximing again, but sometimes I have to close the game, or even reboot the computer. I'm using latest 13.12 Official driver, any idea?
> 
> PS: That's in Windowed mode, that's why 2nd GPU is not used at all, in Fullscreen it does the same thing but with both GPU's, not just GPU1.


I had a similar issue, but only with a single card: my core clock wouldn't reach the clock I had set in MSI AB even though my temps were very good. The fix for me was to set the Power Limit to +50% instead of leaving it at 0%; once I upped the power limit, my clocks under load matched what was set in MSI AB. Maybe that will help you if you haven't done that already.


----------



## battleaxe

Well, just got my 290 under water today. Getting a max of 52°C while mining at 100%. I assume this is about the norm? VRMs: VRM1 = 51°C max, VRM2 = 35°C max. I was surprised how hot the memory got on these; the RAM sinks get hot to the touch unless they have a fan on them. Passive cooling is a no-go.


----------



## Derpinheimer

Yes, for sure. The memory gets very hot.
Of course, without knowing your ambient it's hard to be certain your temps are normal, but 52°C sounds fine.

My memory has already degraded a bit. I used to be able to mine @ 1500 with stock core volts, now it requires +12mV for it to not black screen.

I also have a universal block and passive memory.


----------



## hatlesschimp

How come GPU-Z only shows an ASIC quality for one 290X? The others only show 0.

http://www.3dmark.com/fs/1412050


----------



## battleaxe

Quote:


> Originally Posted by *Derpinheimer*
> 
> Yes, for sure. The memory gets very hot.
> Of course without ambient its hard to be certain your temps are normal, but 52c sounds fine.
> 
> My memory has already degraded a bit. I used to be able to mine @ 1500 with stock core volts, now it requires +12mV for it to not black screen.
> 
> I also have a universal block and passive memory.


Get some coolers on that RAM. I couldn't believe how hot they were passive. No way I'd run them that way long term. I've got 15x15x15mm aluminum sinks on them now. With a 120 fan blowing over the card they stay nice and cool now.


----------



## bayz11

Hello guys, can you add me please? Sapphire R9 290.



Just want to ask a couple of things. I tried flashing the ASUS 290 BIOS onto my Sapphire card and it worked, but I couldn't unlock voltage in ASUS GPU Tweak; the option doesn't appear in the settings.
I also tried the ASUS 290X BIOS (even though that one is unlocked) and it still doesn't show unlocked voltage. What happened? Am I missing something?

Since my PC is water cooled, I also tried flashing the PT1 BIOS, but it says the SSID doesn't match. Did I mess something up? I'm using the latest ASUS BIOS at the moment.

Currently my card runs at 1200/1325, stable at +131mV VDDC. Idle temps are around 36 to 40 depending on my room temp, max around 52 to 56.
And no matter how high the VDDC (tried +200 already), it can't go further than that. My last attempt was 1235/1375 at +150mV VDDC; it was stable only in benchmarks and showed artifacts in games.

Sorry if I'm asking too much. I'm new to OCing my GPU.

Thank you


----------



## Derpinheimer

Quote:


> Originally Posted by *battleaxe*
> 
> Get some coolers on that RAM. I couldn't believe how hot they were passive. No way I'd run them that way long term. I've got 15x15x15mm aluminum sinks on them now. With a 120 fan blowing over the card they stay nice and cool now.


Well, by "passive" I mean:

8 of them have Enzotech copper 14x14x14mm sinks, 6 have some random aluminium 15x14x14mm ones, and 2 have some smaller 15x15x8mm ones (height limited by the tubing).

And I don't see a way to actively cool them, unfortunately. With the 7950 and a universal block I used the dual-fan assembly and just mounted it on. I don't have that anymore, heh.

Any ideas? Maybe a PCI-slot fan? Very space-limited in an FT02 :/


----------



## prostreetcamaro

Guys, I am running into a problem. Has anybody had a 290X or 290 that did not like +200 on the voltage? I just installed a complete XSPC water-cooling setup: res + D5 pump, EX360 + EX240 + EK block on the 290X + Raystorm on the 2600K. I am getting max GPU temps in the mid 40s and VRM temps in the 30s and 40s. My 290X will not run at +200 now without major corruption, even at stock clocks. I can go +100 and up to around 1120 stable, but put the voltage at +200 and it is corruption city.

Could my PC Power & Cooling Silencer 750 not have enough power for everything now that I installed the water cooling and a UV cold cathode? I have tried 2 dual-link DVI cables on my QNIX 27" monitor and that did not help. I have a feeling this PSU might be the culprit. What do you guys think? Also, my scores for these clock speeds seem low.

Here is a video of what it is doing. Sometimes the screen goes funky and never returns and I have to hard reset.


----------



## kizwan

750W should be plenty for an overclocked CPU, a single GPU & water cooling. PC Power & Cooling makes very good PSUs too.


----------



## DeadlyDNA

Well, I tested my 4 GPUs and power supply after my leak. All tested good. Getting around to testing my motherboard next. Been busy, but I think I am safe for now.


----------



## HardwareDecoder

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *prostreetcamaro*
> 
> Guys I am running into a problem. Has anybody had a 290X or 290 that did not like +200 on the voltage? I just installed a complete XSPC water cooling setup. res+D5 pump, EX360 + EX240 + EK block on the 290X + raystorm on the 2600K. I am getting max gpu temps in the mid 40's and vrm temps in the 30's and 40's. My 290X will not run at +200 now with out major corruption even on stock clocks. I can go +100 and up to around 1120 stable but put the voltage at +200 and it is corruption city.
> 
> Could my pc&p silencer 750 not have enough power for everything now that I installed the water cooling and a uv cold cathode? I have tried 2 dvi dual link cables on my qnix 27" monitor and that did not help. I have a feeling this psu might be the culprit. What do you guys think? Also my scores for the clock speeds seem to be low.
> 
> Here is a video of what it is doing. Sometimes the screen goes funky and never returns and I have to hard reset.






I get the same problem: even at 1200MHz w/ +50mV, if I push my volts or clocks too high I get corruption similar to what was happening in your video. I basically have the same setup as you, cooling-wise, give or take a part.

My choice is to run 1100 core / 1400 mem @ 110Hz, or 1200 core / 1400 mem @ 60Hz.

That is three people now who have this issue with 290/290Xs and the QNIX. Something funky is going on here. Are you running over 60Hz on the monitor?


----------



## Arizonian

Quote:


> Originally Posted by *hatlesschimp*
> 
> Well my 3rd 290X arrived today! I placed the order in the afternoon Christmas eve and it arrived today the 27th at 12pm. Excellent!!! Thanks scorptec.com.au in Clayton!!!
> 
> Well what can I say not much room between the cards LOL. It looks good though. I wonder how hot these bad boys will get? I probably should finally invest in water cooling but I have to wait and see if we build a new house or not.
> 
> Well I ran the new TriFire 290x setup with Firestrike Extreme and loved the results on stock settings. These cards are monsters for the price and I'm very happy with them. I had TriSli Titans and they were good as well but now after one benchmark I can see both sides of the coin. AMD get better FPS numbers but Nvidia do it a bit smoother. Basically thats the difference Ive seen after one benchmark LOL. Im sure this could all change after some proper gaming time.
> 
> *FIRESTRIKE EXTREME RESULTS:*
> Trifire 290X - Stock = *x12097* (on air)
> http://www.3dmark.com/fs/1411481
> 
> Crossfire - Mild/Slight overclock = *x8925* (on air)
> http://www.3dmark.com/fs/1188972
> 
> Crossfire - Stock = *x8699* (on air)
> http://www.3dmark.com/fs/1188953
> 
> Single 290X - Stock = *x5004* (on air)
> http://www.3dmark.com/fs/1178643
> 
> Tri Sli Titans - Best Overclock with one or two artifacts = *x13246* (I Ringed their necks on air)
> http://www.3dmark.com/3dm11/6869871
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated









Quote:


> Originally Posted by *battleaxe*
> 
> Well, just got my 290 under water today. Getting a max of 52C while mining at 100%. I assume this is about the norm? VRM's are vrm1 = 51c max and vrm2 =35c max. I was surprised how hot the memory got on these; the RAM sinks get hot to the touch unless they have a fan on them. Passive cooling is a no go.


Congrats - I'll take your word for it. updated









Quote:


> Originally Posted by *bayz11*
> 
> Hello guys, can you add me please Sapphire r9 290
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Just wanna ask couple things, i try to flash asus 290 on my sapphire card, and its work. But couldnt unlock voltage in gpu tweak from asus. It doesnt appear in the setting.
> And i try bios 290x asus as well (even thought it unlocked) it still doesnt show unlock voltage. what happen? do i miss something?
> 
> Cause my pc using watercooling, i try flashing PT1 bios , but it say doesnt match essd.so if i mess up with something?i using the latest asus bios at the moment
> 
> Currently my card run at 1200/1325 stable at 131 vddc. Temp at idle around 36 to 40 depend on my room temp. At max around 52 to 56.
> And no matter how high my vddc (try 200 already), it cant go further that. Last attempt is 1235/1375 at 150 vddc. It stable only in benchmark and show artifact in game.
> 
> Sorry if im asking too much. Im new to oc my gpu
> 
> Thank you


Congrats - added


----------



## stiv

Quote:


> Originally Posted by *HardwareDecoder*
> 
> 
> I get the same problem, even if I do 1200mhz w/ +50mv, basically if I push my volts or clock too high I get stuff similar to what was happening in your video. I basically have the same setup as you also cooling wise give or take a part.
> 
> My choice is to run 1100 core 1400 mem @ 110hz or run 1200 core/1400 mem @ 60hz.
> 
> That is 3 people now that have this issue with 290/x's and the qnix. Something funky is going on here. Are you running over 60hz on the monitor?


Having the same issue as well: I can run 1100 on the core, but anything over that requires me to set my QNIX to 60Hz. I usually run it at 96Hz.


----------



## HardwareDecoder

Quote:


> Originally Posted by *stiv*
> 
> Having the same issue as well, can run 1100 on the core but anything over that requires me to set my qnix to 60hz, usually run it at 96hz


I really want to know why we can't go higher than 1100MHz with the monitor over 60Hz.
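One hedged piece of arithmetic that may be relevant: pushing a 1440p panel past 60Hz raises the pixel clock a great deal, and a higher pixel clock means more stress on the card's display hardware while the core is also being overclocked. The blanking totals below are assumed reduced-blanking values for illustration, not the QNIX's actual timings:

```python
# Back-of-the-envelope pixel-clock check for a 1440p monitor overclock.
# The blanking totals are assumed reduced-blanking values, NOT the QNIX's
# actual timings -- read yours from the EDID or driver for exact math.
H_TOTAL = 2560 + 160   # active width plus assumed horizontal blanking
V_TOTAL = 1440 + 41    # active height plus assumed vertical blanking

def pixel_clock_mhz(refresh_hz: float) -> float:
    """Pixel clock in MHz for the assumed timings at a given refresh rate."""
    return H_TOTAL * V_TOTAL * refresh_hz / 1e6

for hz in (60, 96, 110):
    print(f"{hz:3d} Hz -> {pixel_clock_mhz(hz):.1f} MHz pixel clock")
```

At 96Hz and up, the computed clock is well past the 330MHz that dual-link DVI is nominally specified for, so it would not be surprising if overall overclocking headroom shrinks; this is a guess, not a confirmed explanation.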


----------



## UNOE

I would like to know more about this as well before I put any of these cards in a gaming rig.


----------



## kizwan

Quote:


> Originally Posted by *HardwareDecoder*
> 
> I really want to know why we can't go higher than 1100 MHz with over 60 hz


Seriously, is there a tutorial somewhere on how to overclock the monitor?


----------



## Taint3dBulge

Well, I'm coming to a hard decision... I just can't make up my mind. I am driving myself crazy waiting for the DCII 290X. I see that the 780 Ti Classified is up for sale. Should I wait, or just drop the hammer on that? I was really hoping to stick it out; I love AMD, and I really like the idea of Mantle. But how much better will the 290X be compared to a super-OC'd Classified? Thoughts? Should I wait another 3 weeks, or 3 days and turn to the green side again? I haven't owned an Nvidia product since the 8800 GT...


----------



## Sgt Bilko

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Well, im coming to a hard decision.. I just cant make up my mind.. I am driving myself crazy waiting for the DCII 290x... I see that the 780ti Classified it is up for sale.. Should I wait or just drop the hammer on that.. I was really hoping to stick it out.. I love amd, and I really like the idea of mantel.. But how much better will the 290x be compared to a super OCed classified... Thoughts? Should i wait another 3 weeks or 3 days and turn to the green side again.. havnt owned an nvidia product since the 8800gt...


I'm in the same dilemma, trying to wait it out for a couple of 290 DCU IIs for CrossFire, but the 780 Ti did tempt me.

At the end of the day I'm an AMD fan, so I will probably stick with them, but waiting for these custom cards is torture.


----------



## bayz11

I can run 1200MHz with the QNIX OC'd to 96Hz.


----------



## EvolveGamingPC

Hey guys, anyone here have any problems with the Kraken G10? I was thinking about picking one up for three reasons: having some fun, unchaining this beast from the stock cooler, and of course, bragging rights


----------



## devilhead

Quote:


> Originally Posted by *prostreetcamaro*
> 
> Guys I am running into a problem. Has anybody had a 290X or 290 that did not like +200 on the voltage? I just installed a complete XSPC water cooling setup. res+D5 pump, EX360 + EX240 + EK block on the 290X + raystorm on the 2600K. I am getting max gpu temps in the mid 40's and vrm temps in the 30's and 40's. My 290X will not run at +200 now with out major corruption even on stock clocks. I can go +100 and up to around 1120 stable but put the voltage at +200 and it is corruption city.
> 
> Could my pc&p silencer 750 not have enough power for everything now that I installed the water cooling and a uv cold cathode? I have tried 2 dvi dual link cables on my qnix 27" monitor and that did not help. I have a feeling this psu might be the culprit. What do you guys think? Also my scores for the clock speeds seem to be low.
> 
> Here is a video of what it is doing. Sometimes the screen goes funky and never returns and I have to hard reset.


I have the same problem with my ASUS monitor at 120Hz. The best I can do is 1230/1500; past that, the same artifacts as in your video begin. At 60Hz I can get more, 1270/1500. The 290 doesn't like more than 60Hz.


----------



## kcuestag

Quote:


> Originally Posted by *Iniura*
> 
> I had a similar issue but only with using a single card, my core clock wouldn't reach the core clock I had set in MSI AB even though my temps where very good, the issue for me was I had to enable Power Limit to 50% and don't leave it on 0% once I upped the power limit my clocks where the same under load then what was set in MSI AB. Maybe that will help you if you didn't do that already.


Interesting. I had completely forgotten about the Power Limit; I remember I set it to 50% on my previous 290X, but not on these. I have set it now and we'll see if it does anything.


----------



## smoke2

Planning to buy a single non-reference R9 290.
I was going to carry my old PSU forward, but now I'm wondering whether to buy a new one.
Do you think it is worth buying the X-850 even though I'm not planning on CrossFire?
I was thinking that, with graphics cards drawing ever more power while the manufacturing process stays the same (problems with 20nm chips), there may be a need for ever stronger power supplies.
When I bought my old Enermax 485W in 2006 it was plenty of wattage, but by 2013 it was too little.
The Radeon R9 290 has a minimum requirement of 750W. I know that is overstated, but maybe the next generation will want more watts?
I'm torn between the X-750 and X-850.
I want to stay with it for some years, ideally like my old Enermax (7 years).
In my country the price difference between them is 24 EUR.


----------



## EvolveGamingPC

Quote:


> Originally Posted by *smoke2*
> 
> Planning to buy single R9 290 non-reference design.
> I forward my former PSU.
> Now I'm wondering to buy a new one.
> Do you think it is worth to buy maybe X-850 despite I'm not planning to Crossfire?
> I was wondering that maybe in the future when the power of graphic cards will increasing and the manufacturing process will be the same (problems with 20nm chips), there will be requirment for still stronger power supplies?
> When I bought my former Enermax 485W in 2006 it was plenty enough wattage, but in 2013 it was too low.
> Radeon R9 290 have minimal requirments 750W, I know that is too overwattaged, but maybe next generation will want more watts?
> I'm wondering between X-750 and X-850.
> Want to stay with him some years, ideally like with my former Enermax (7 years
> 
> 
> 
> 
> 
> 
> 
> )
> In my country the price difference between them is 24 EUR.


I would go with the X-850, for a few reasons. Mainly, as you say, you want to keep it for a few years, and extra wattage headroom extends a PSU's effective life... I hate to use the word future-proofing, but it kind of is. Another reason is that a PSU runs most efficiently in the middle of its load range (roughly 40-60% of its rated output), so a single-290 system sits closer to the sweet spot on an 850W unit. For 24 Euros it is definitely worth going to 850 Watts.


----------



## darkelixa

I have a sapphire r9 290 that runs fine on my 750w psu


----------



## sun100

Quote:


> Originally Posted by *darkelixa*
> 
> I have a sapphire r9 290 that runs fine on my 750w psu


I have the same card (reference) and it runs just fine on my Chieftec 650W, a few years old, bought to power my GF 7900 GTX when it came out. I only have 2 HDDs and an i5 2400 CPU though, so there's not much drawing power.


----------



## stickg1

Uhhh, so yeah. This Koolance block works pretty well. New temps while folding @ 1075MHz using 1.23v are GPU: 41C, VRM1: 41C, VRM2: 37C

Old temps (with Corsair H55 mod) were about GPU: 65C, VRM1: 60C, VRM2: 45C


----------



## Hattifnatten

My Hawaiian beauty has arrived


----------



## cplifj

All in all, this must have been the best year for many.

AMD's R9 290X, DICE's Battlefield 4, Microsoft's Xbox.....

We get to buy more and more beta (or is it still alpha?) stuff.

TIME TO QUIT THE SLACKING. I WON'T BUY ANYTHING ANYMORE THE COMING YEARS.

First they have to learn that the CUSTOMER IS KING and not the other way around.

We buy nothing and you all go bust. That's our thanks for treating us the way you did.


----------



## steelkevin

Was there ever any solution to this?



I was playing BF4 and when I quit at the end of the round I noticed that my GPU usage was all over the place.


----------



## prostreetcamaro

Quote:


> Originally Posted by *HardwareDecoder*
> 
> 
> I get the same problem, even if I do 1200mhz w/ +50mv, basically if I push my volts or clock too high I get stuff similar to what was happening in your video. I basically have the same setup as you also cooling wise give or take a part.
> 
> My choice is to run 1100 core 1400 mem @ 110hz or run 1200 core/1400 mem @ 60hz.
> 
> That is 3 people now that have this issue with 290/x's and the qnix. Something funky is going on here. Are you running over 60hz on the monitor?


During these benches I only had it set at 60Hz. I had an ASUS VG278H 120Hz before this monitor and never had any issues. I'm kind of at a loss since it does it even at 60Hz.


----------



## Hattifnatten

Mother of God. I ran 3DMark and compared it to my old cards:
http://www.3dmark.com/compare/fs/1414747/fs/1099072
More than 2000 points difference on FSX


----------



## Ludus

Simply love my 290 more and more









XFX R9 290 (not X) Black Edition (locked)
Hynix memory
EK full-cover block
ASIC quality 81.3%
PT1 BIOS

*Firestrike*

http://www.3dmark.com/3dm/2025020



http://imgur.com/N3ADL6h



*Firestrike Extreme*

http://www.3dmark.com/3dm/2025068



http://imgur.com/gImAmvZ



http://www.techpowerup.com/gpuz/hguxu/


----------



## kizwan

Quote:


> Originally Posted by *stickg1*
> 
> Uhhh, so yeah. This Koolance block works pretty well. New temps while folding @ 1075MHz using 1.23v are GPU: 41C, VRM1: 41C, VRM2: 37C
> 
> Old temps (with Corsair H55 mod) were about GPU: 65C, VRM1: 60C, VRM2: 45C


That's more like it!








Quote:


> Originally Posted by *steelkevin*
> 
> Was there any solution to this ?
> 
> 
> 
> I was playing BF4 and when I quit at the end of the round I noticed that my GPU usage was all over the place.


What resolution are you playing BF4 at? If you're not experiencing any micro-stutter/lag and the fps is good, don't worry about it. I also get fluctuating GPU usage @1080p (I only have a 1080p monitor though). When I set scaling to 150 - 200%, GPU usage is a lot better.

Single GPU - 1080p


Crossfire - 1080p


Crossfire - 1080p + Scaling 200%


----------



## rdr09

Quote:


> Originally Posted by *Ludus*
> 
> Simply love my 290 more and more
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Xfx r9 290 (not x) black edition (locked)
> Hynix memory
> Ek fullcover
> Asics 81,3%
> Pt1 bios
> 
> *Firestrike*
> 
> http://www.3dmark.com/3dm/2025020
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://imgur.com/N3ADL6h
> 
> 
> 
> 
> *Firestrike Extreme*
> 
> http://www.3dmark.com/3dm/2025068
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://imgur.com/gImAmvZ
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/hguxu/


wow. 1800 on the mem.


----------



## bayz11

Quote:


> Originally Posted by *Ludus*
> 
> Simply love my 290 more and more
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Xfx r9 290 (not x) black edition (locked)
> Hynix memory
> Ek fullcover
> Asics 81,3%
> Pt1 bios
> 
> *Firestrike*
> 
> http://www.3dmark.com/3dm/2025020
> 
> 
> 
> http://imgur.com/N3ADL6h
> 
> 
> 
> *Firestrike Extreme*
> 
> http://www.3dmark.com/3dm/2025068
> 
> 
> 
> http://imgur.com/gImAmvZ
> 
> 
> 
> http://www.techpowerup.com/gpuz/hguxu/


If you don't mind, can you tell me how to flash the PT1 BIOS please? Whenever I try to flash it over my Sapphire BIOS it says the SSID number doesn't match. Thank you.


----------



## Ludus

Quote:


> Originally Posted by *bayz11*
> 
> If you dont mind can you tell me how to flash pt1 bios please.whenever i try to flash it to my sapphire bios it said essd number doesnt match.thank you


You have to use this command to skip the SSID check:

atiflash.exe -f -p 0 xxx.rom

where xxx.rom is the name of your ROM file (-f forces the flash past the SSID check, -p 0 programs adapter 0).

----------



## bayz11

Quote:


> Originally Posted by *Ludus*
> 
> u have to use this command to avoid ssid check
> 
> atiflash.exe -f -p 0 xxx.rom
> 
> xxx >>> the name of your rom


Oh, that's why. Thank you mate. If you don't mind, can you tell me which software you use to OC your graphics card please? Thank you.


----------



## Ludus

Quote:


> Originally Posted by *bayz11*
> 
> Ow.thats why.thank you mate.if you dont mind can you tell me which software you use to oc ypur graphic card plz.thank you


With the PT1/PT3 BIOS you should use ASUS GPU Tweak.
Take care: these BIOSes are just for benching and could destroy your card. I recommend not using them for daily purposes.


----------



## bayz11

Quote:


> Originally Posted by *Ludus*
> 
> with pt1/pt3 bios u should use asus gpu tweak
> take care that this bios are just for bench and could destroy your card. i recommend to not use them for daily purpose.


Cheers, I will remember that.


----------



## cplifj

Hey,

Thought I'd ask in this thread.

Does anybody know how to change the BIOS so that I can raise the minimum fan speed on my R9 290X?

I think the low fan speed is really the culprit; at idle it gives hardly any cooling at all. Start a load and the fan takes way too long to spin up, because it waits for 95°C.

This is utterly stupid, as when I manually set a higher idle fan speed the whole card works better and runs cooler for far longer.

I want to change the fan profile in the BIOS so that it simply cools better without noise. It's perfectly possible; why AMD chose to let it get to 95°C is a mystery to me.

Maybe that has something to do with laying the mask on the silicon and the thermal expansion/contraction of the materials?

If the fan runs just a bit faster at idle and ramps up a bit sooner, the card doesn't get all that hot and noisy; the fan cools more than enough without sounding like a blow dryer. As it is, it always comes in late and loses the heat battle every time.

I tried VBE7, but that doesn't work on the 290X BIOS.
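The kind of profile being asked for, a higher idle floor with an earlier ramp rather than waiting for 95°C, amounts to a steeper fan curve. As a sketch, using hypothetical temperature/duty points rather than AMD's actual BIOS fan table:

```python
# Sketch of a steeper fan curve with a raised idle floor. The (temp C, fan %)
# points are hypothetical, not AMD's actual BIOS fan table.
CURVE = [(0, 30), (50, 35), (70, 50), (85, 75), (95, 100)]

def fan_percent(temp_c: float) -> float:
    """Linearly interpolate fan duty between the curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pin at the last duty value above the final point

for t in (30, 60, 80, 95):
    print(f"{t} C -> {fan_percent(t):.0f}% fan")
```

The idea is that the 30% floor keeps airflow moving at idle, so the cooler stays ahead of a new load instead of chasing it after the card is already hot.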

grtz


----------



## velocityx

How is it that my Valley benchmark scores are so random? I'm no benchmarker, but I'm just curious.

Let's take this one, from a week ago: R9 290 CF, stock settings.



It's kinda low, but I found out that when I was setting up my CrossFire I'd had a beer too many and connected my PSU cables wrong (one cable with two plugs, each plug going to a different card, instead of both plugs from the same cable to the same card).

When I noticed that, I redid the connections so each card gets its own cable.

and this is the result



It's a bit higher; I understand that power delivery could be the reason it was lower versus properly connected power plugs.

But today I swapped my cards (the top one went down, the lower one went on top), and now the score is this:


----------



## kcuestag

For those seeing weird GPU Usage (Jumping from 0 to 100% all the time), it's a bug in the AMD Overdrive API, explained by MSI AB's developer (Unwinder) here:

http://forums.guru3d.com/showpost.php?p=4730549&postcount=489
Quote:


> Jumping GPU usage sensor values is a question of driver bug in Overdrive 5/6 API and there is no proper solution for it yet. Currently Overdrive 5/6 API may return erroneous data under certain conditions. So you may either use GPU-Z, which is using old GPU software GPU load performance counter existing since ancient AMD GPUs (but the value won't match with GPU load displayed in CCC because those are different performance counters) or just keep in mind that sensor is broken in AMD driver and wait for fix in future.


----------



## chill24

Do all aftermarket cards with the good coolers use their own custom PCBs? Is there any chance one of them will use the reference pcb with a better cooler, that way it could be fitted to existing reference cards? Or maybe someone will release a cooler for the 290s that really is meant for it so it doesn't take up so much room and isn't so much of a hack.


----------



## chill24

Quote:


> Originally Posted by *chill24*
> 
> Do all aftermarket cards with the good coolers use their own custom PCBs? Is there any chance one of them will use the reference pcb with a better cooler, that way it could be fitted to existing reference cards? Or maybe someone will release a cooler for the 290s that really is meant for it so it doesn't take up so much room and isn't so much of a hack.


Nevermind, I've seen that the Sapphire Tri-X cards will use the reference PCB. Maybe there'll be a way to get a couple of those coolers....


----------



## Hattifnatten

Quote:


> Originally Posted by *chill24*
> 
> Do all aftermarket cards with the good coolers use their own custom PCBs? Is there any chance one of them will use the reference pcb with a better cooler, that way it could be fitted to existing reference cards? Or maybe someone will release a cooler for the 290s that really is meant for it so it doesn't take up so much room and isn't so much of a hack.


The Tri-X uses the reference PCB and, AFAIK, so will the WindForce 3X. The DCII, on the other hand, uses a custom PCB (but EK will make a block for that one and the Lightning anyway; at least they did with the 7970).


----------



## Menphisto

Hey guys,
would this PSU, a Cougar CMX 550, be enough for an R9 290X? 550W (12V1: 28A, 12V2: 20A)
Other stuff in the case:
6 low-speed fans
i5 4670K @ 4.5GHz, 1.23V
16GB Corsair Vengeance 1600MHz
Corsair H80i
Samsung 840 EVO SSD 120GB
500GB Seagate (7200rpm)









GPU at the moment: a GTX 285


----------



## Arizonian

Quote:


> Originally Posted by *Hattifnatten*
> 
> My Hawaiian beauty has arrived
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## HardwareDecoder

Quote:


> Originally Posted by *smoke2*
> 
> Planning to buy a single R9 290 non-reference design.
> I'm passing on my former PSU and now wondering which new one to buy.
> Do you think it is worth buying the X-850 even though I'm not planning to Crossfire?
> I was wondering whether, in the future, when graphics cards get more powerful while the manufacturing process stays the same (problems with 20 nm chips), there will be a requirement for even stronger power supplies.
> When I bought my former Enermax 485 W in 2006 it was plenty of wattage, but in 2013 it was too low.
> The Radeon R9 290 has a minimum requirement of 750 W; I know that is overstated, but maybe the next generation will want more watts?
> I'm wondering between the X-750 and X-850.
> I want to stay with it for some years, ideally like my former Enermax (7 years).
> In my country the price difference between them is 24 EUR.


Quote:


> Originally Posted by *EvolveGamingPC*
> 
> I would go with the X-850, and I say this for a few reasons. Mainly because, as you say, you want to be able to keep it for a few years, and having higher wattage extends a PSU's effective life... I hate to use the word future-proofing, but it kinda is. Another reason is that a PSU is more efficient when the wattage you are drawing sits comfortably between 100 W and its theoretical maximum power output. For 24 Euros it is definitely worth going to 850 watts.


Lol, you guys. I posted a detailed explanation a while back on why, for a single 290, you don't need more than a 650 W PSU, as long as it isn't a crap PSU.

Maybe my post could be added to the OP somewhere, because people keep asking the same questions and recommending overkill PSU wattage.

See my post here with the math to back it up: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/11550#post_21421778

I want to add that getting a bigger PSU can still make sense, like you said, in case newer cards use more power or you want to add a second card later. However, to the other guy: I'm not even hitting 475 W usage on a 650 W PSU (with my heavily OC'ed CPU at 100%, OC'ed GPU at 100%, and a water cooling loop),

so that isn't going to affect the lifespan of the unit at all. Remember, too, these are designed to handle up to their max output 24/7, and it's something of a myth, in my opinion, that running a (quality) PSU at max capacity lowers its lifespan.
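The back-of-the-envelope math behind that claim can be sketched like this; the per-component wattages below are illustrative assumptions for an overclocked single-290 build, not measurements from this thread:

```python
# Rough single-card PSU sizing sketch. All component wattages are assumed
# ballpark figures, not measured values.
def estimate_draw_w(gpu_w, cpu_w, other_w=75):
    """Sum a worst-case DC draw: GPU + CPU + everything else (fans, drives, pump)."""
    return gpu_w + cpu_w + other_w

def headroom_pct(psu_w, draw_w):
    """Percentage of rated PSU capacity left unused at the estimated draw."""
    return round(100 * (psu_w - draw_w) / psu_w, 1)

# ~300 W for a heavily overclocked 290 plus ~150 W for an overclocked quad core:
draw = estimate_draw_w(gpu_w=300, cpu_w=150)
print(draw, headroom_pct(650, draw))  # 525 19.2 -- a 650 W unit still has headroom
```

Swap in your own component figures; the point is simply that a worst-case sum for one card lands well under 650 W.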


----------



## the9quad

Quote:


> Originally Posted by *HardwareDecoder*
> 
> lol you guys. I posted a detailed explanation awhile back on why for a single 290 you don't need more then a 650w psu . as long as it isn't a crap psu.
> 
> maybe my post could be added to the op somewhere because people keep asking the same questions and recommending overkill psu power.
> 
> See my post here with the math to back it up. http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/11550#post_21421778
> 
> I want to add that getting a bigger psu can still make sense like you said in case newer cards use more power, however to the other guy i'm not even hitting 500W usage on a 650W psu so that isn't going to effect the life span on the unit at all. Remember too these are designed to handle up to their max output 24//7 and it's something of a myth in my opinion that running a psu at max capacity lowers it's life span.


Or they can just add the Newegg video to the OP that showed 1-, 2-, 3-, and 4-card configurations and the power they drew as he ran them through benches. No offense, but people are more inclined to believe an outlet (news/retail) over some guy on a forum. Please don't take offense, because I appreciate your posts on the matter; just saying, you know how people are, and that video by Newegg was pretty thorough.


----------



## Menphisto

Quote:


> Originally Posted by *HardwareDecoder*
> 
> lol you guys. I posted a detailed explanation awhile back on why for a single 290 you don't need more then a 650w psu . as long as it isn't a crap psu.
> 
> maybe my post could be added to the op somewhere because people keep asking the same questions and recommending overkill psu power.
> 
> See my post here with the math to back it up. http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/11550#post_21421778
> 
> I want to add that getting a bigger psu can still make sense like you said in case newer cards use more power or you want to add a second card later. However to the other guy i'm not even hitting 475W usage on a 650W psu (with my heavily oc'ed cpu at 100% and oc'ed gpu at 100% and a water cooling loop)
> 
> so that isn't going to effect the life span on the unit at all. Remember too these are designed to handle up to their max output 24//7 and it's something of a myth in my opinion that running a (quality) psu at max capacity lowers it's life span.


So I can't run an R9 290X with my 550 W PSU? D:


----------



## HardwareDecoder

Quote:


> Originally Posted by *the9quad*
> 
> Or they can just add the newegg video to the OP that showed 1,2,3,and 4 card configurations and the power they drew as he ran them through benches...No offense, but people are more inclined to believe, a outlet (news/retail) over some guy on a forum. Please don't take offense, because I appreciate your posts on the matter. Just saying you know how people are, and that video by newegg was pretty thorough.


Zero offense taken, and I know what you mean. I've personally never seen that video, and I like to figure things out myself if possible. That's why I bought the watt meter in the first place.

Can you link the video? I've never seen it.
Quote:


> Originally Posted by *Menphisto*
> 
> so i cant run a r9 290x with my 550w psu D:


I'm sure you could run my same setup on a 550 W PSU, but it would be cutting it a little closer than I'd like. You see, I said it drew 475 W at full load on everything, with a heavily OC'ed CPU, +100 mV on the GPU, and a water cooling loop.


----------



## the9quad

Here is said video; around the 8-minute mark it gives wattage for a typical system as he runs through benches, with draw for 1, 2, 3, and 4 cards.


----------



## ehpexs

How well do you guys reckon dual 290s under water would handle triple 1440s?


----------



## HardwareDecoder

Good video. Yeah, he ends up with even less power draw for the whole system than I do, since he isn't maxing out the CPU with Intel Burn Test and running the "power virus" FurMark like I did. I like to account for the highest-use scenario.

What I really took from that video, though, is that Crossfire is still kind of a losing proposition, IMO. No one running Crossfire take offense or anything, but it seems like in most of the games he tested it doesn't scale well... then add in microstutter.

Especially when you add a 3rd/4th card; that's basically just a waste of money for what little boost you get.
The only game that even scales well with a 3rd or 4th card is BF3. I'm sure BF4 will get there eventually, especially when Mantle drops.

The caveat is that if you are going to run multiple monitors or even 4K, I guess you need to just throw money at the problem, since no single card is gonna cut it.

Edit: LOL, I just got to the part where he says "here is what four of these cards on stock cooling sound like." Wow, it's like a jet engine with four of those cards.

So add in the cost of four waterblocks too, since I don't think anyone could stand that much noise 24/7.


----------



## Arizonian

Quote:


> Originally Posted by *the9quad*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> here is said video, approximately 8 minute mark gives wattage for a typical system as he runs through benches. Gives draw for 1,2,3,and 4 cards.


I'll add this to the second info post of the OP; I don't mind.

In the meantime, here's a water-blocked card that just came out, on sale at Newegg: *PowerColor LCS AXR9 290X WMDHG/OC* (BF4 Edition), $779.99

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131543


----------



## alawadhi3000

Quote:


> Originally Posted by *givmedew*
> 
> You can OC the card with commands in CGMINER.
> 
> If you have a 2nd, 3rd, or 4th gen Intel then you can use that for your display with no lag.
> 
> Your issue IS NOT a driver conflict. You have to have a monitor connected to the card and have a signal going to it in order to OC it the normal way or to see it's temp in GPUz


Yeah, I figured that out myself a few minutes before your post; however, AMD CCC still doesn't display the performance tab, so I had to OC using MSI Afterburner.

Thanks anyway, and REP+.
Quote:


> Originally Posted by *Derpinheimer*
> 
> I can play games while mining on the same GPU ?
> 
> Turn down intensity..


Turning down the intensity lowers the hashrate, so it's pointless.
Quote:


> Originally Posted by *kizwan*
> 
> You probably need to disabled ULPS. MSI AB & TriXX should be able to pick up your card afterwards, as well as Performance tab in CCC.


Tried that yesterday; it doesn't work.


----------



## devilhead

Quote:


> Originally Posted by *Ludus*
> 
> Simply love my 290 more and more
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Xfx r9 290 (not x) black edition (locked)
> Hynix memory
> Ek fullcover
> Asics 81,3%
> Pt1 bios
> 
> *Firestrike*
> 
> http://www.3dmark.com/3dm/2025020
> 
> 
> 
> http://imgur.com/N3ADL6h
> 
> 
> 
> *Firestrike Extreme*
> 
> http://www.3dmark.com/3dm/2025068
> 
> 
> 
> http://imgur.com/gImAmvZ
> 
> 
> 
> http://www.techpowerup.com/gpuz/hguxu/


How much can you get in Valley with those clocks? Extreme HD preset.


----------



## Menphisto

What about the amps?
I have a dual-rail 550 W PSU (xsreviews.co.uk/reviews/power-supply-units/cougar-cmx-preview/)... first rail 28 A, second 20 A... is that a problem?


----------



## ImJJames

Quote:


> Originally Posted by *devilhead*
> 
> how much you can get in valley with those clocks? Extreeme HD preset


Don't even bother with Valley on Hawaii cards.


----------



## Ludus

Quote:


> Originally Posted by *devilhead*
> 
> how much you can get in valley with those clocks? Extreeme HD preset


About 3420 points.


----------



## steelkevin

Quote:


> Originally Posted by *Hattifnatten*
> 
> The Tri-X uses reference PCB and AFAIK, so will the Windforce x3. DCII on the other hand, use a custom pcb (but *EK will make block for that one* and the lightning anyways, atleast, they did that with the 7970).


I really wouldn't rely on that, or at least I wouldn't expect the block any time soon.

They announced they were making one for the DCII 280X back when I had that card (a bit over a week after release, I think) and said it would be "in stores" by mid-November. I'm really glad I sent mine back for a 290, because to this date they still haven't released those blocks.


----------



## kizwan

Quote:


> Originally Posted by *Menphisto*
> 
> Whats about the Amps ?
> i have a dual rail 550w psuxsreviews.co.uk/reviews/power-supply-units/cougar-cmx-preview/ ... first rail 28 amps second 20 amps...is that a problem?


That gives you only 336 W (28 A × 12 V), which is not enough. The second rail is for the CPU.
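The rail arithmetic is just amps times volts; a quick sketch of the 336 W figure (which rail feeds the GPU connectors versus the CPU is the poster's claim, not something verified here):

```python
# Wattage available on a 12 V rail: amps * volts.
def rail_watts(amps, volts=12.0):
    return amps * volts

print(rail_watts(28))  # 336.0 W on the 28 A rail
print(rail_watts(20))  # 240.0 W on the 20 A rail said to feed the CPU
```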


----------



## chiknnwatrmln

Quote:


> Originally Posted by *kizwan*
> 
> That give you only 336W (28A * 12V) which is not enough. The second rail is for CPU.


336 W is enough for a 290. Maybe not with overvolting, but it will handle it at stock.

It cuts things a bit close for comfort, and he'd be better off with a better PSU, but it will handle a 290 if it's the only thing on that rail.

My entire rig (3770K @ 4.6 GHz on 1.3 V actual, an R9 290 @ 1100/5500 on stock voltage, plus 10 or so fans) pulls about 450 W from the wall at full tilt. This is without factoring in the PSU's inefficiencies, of course.

Actually, that number was wrong. Under constant 100% GPU load I get about 410 W average, with spikes up to 430 W.

Overvolting via GPU Tweak adds almost 100 W to those numbers, though.
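Wall readings overstate what the PSU actually has to deliver; here is a sketch of the conversion, with the ~88% efficiency figure being an assumption (roughly 80 Plus Gold territory at this load), not a spec from the thread:

```python
# A watt meter measures AC draw at the wall; the DC load on the PSU is lower
# by the efficiency factor. 0.88 here is an assumed, not measured, efficiency.
def dc_load_w(wall_w, efficiency=0.88):
    """Convert a wall-side reading to the approximate DC load the PSU delivers."""
    return wall_w * efficiency

print(round(dc_load_w(450)))  # ~396 W of actual load when pulling 450 W at the wall
```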


----------



## Tobiman

Just got my R9 290 after it shipped two weeks ago from Canada. First impressions: the card feels a bit heavier than I expected. Stickers all over the place, which is normal for a Sapphire card. Will run tests when I get home. Hope it unlocks.


----------



## devilhead

Quote:


> Originally Posted by *Ludus*
> 
> about 3420pt


So that's already more than 80 FPS in Valley.


----------



## HardwareDecoder

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> 336w is enough for a 290. Maybe not with overvolting, but it will handle it at stock.
> 
> It cuts it a bit close for comfort and he'd be better off with a better PSU, but it will handle a 290 if it's the only thing on that rail.
> 
> My entire rig (3770k @ 4.6 on 1.3v actual and R9 290 @ 1100/5500 on stock voltage + 10 or so fans) pulls about 450w from the wall at full tilt. This is without factoring in the PSU's inefficiencies, of course.
> 
> Actually, that number was wrong. Under constant 100% GPU load I get about 410w average, and spikes up to 430w.
> 
> Overvolting via GPUTweak adds almost 100w to those numbers, though.


about same as me


----------



## Maximization

PowerColor LCS AXR9 290X... oh man, I am getting tempted. I was thinking of waterblocking my two 7870s; now I don't know.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Tobiman*
> 
> Just got my r9 290 after it shipped 2 weeks ago from canada. First impressions, the card feels a bit heavier than I expected. Stickers all over the place which is normal for a Sapphire card. Will run tests when I get home. Hope it unlocks.


My Sapphire 290X had zero stickers.


----------



## Andrix85

Hi guys,
I have a question for you:

I modded my XFX R9 290 to a 290X with an ASUS BIOS. I have a strange problem with settings: if I use the stock 290 BIOS I get good performance in 3DMark 11 (17,000 points), but if I use the 290X BIOS my result drops, especially in the GT3 test. Why? I'm using the 13.12 drivers...

Thanks


----------



## Raephen

Quote:


> Originally Posted by *Tobiman*
> 
> Just got my r9 290 after it shipped 2 weeks ago from canada. First impressions, the card feels a bit heavier than I expected. Stickers all over the place which is normal for a Sapphire card. Will run tests when I get home. Hope it unlocks.


Quote:


> Originally Posted by *HardwareDecoder*
> 
> my sapphire 290x had 0 stickers


Mine had zero as well (in the EU; perhaps location makes a difference...). Unless, of course, you are referring to the sticker on the outside of the stock cooler and not the ones on the mounting screws...


----------



## Taint3dBulge

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm in the same dilemma, Trying to wait it out for a couple of 290 DCU II's for crossfire but the 780 Ti did tempt me.
> 
> At the end of it I'm an AMD fan so i will probably stick with them but waiting for these customs cards is torture


Yes, it is torture. I know the 780 Ti Classified is faster than any AMD card out there. It also clocks huge... it won't, however, see any big gains from newer drivers, well, not like the AMD cards will anyway. Hmm, luckily I'm out of town with no PC, so I have time to think more. Oh, I've also been reading a lot about these QNIX monitors and how 290Xs don't like anything over 60 Hz. That concerns me too.


----------



## Heinz68

Quote:


> Originally Posted by *alawadhi3000*
> 
> Turning down intensity lower the hashrate, therefore its pointless.


You're right when it comes to gaming, but for web surfing or video watching I only need to decrease the intensity by 4 points; on my HD 6990 the hash rate goes down from 950 to 710 kHash/s, which is still worthwhile.

In the meantime I'm waiting to buy a couple of reasonably priced custom-built R9 290Xs, hopefully soon.

Maybe then I can find a cgminer config that would let me mine on only one card while gaming.
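As a quick sanity check on that tradeoff, using the poster's own 950 → 710 kHash/s numbers:

```python
# Fraction of mining throughput kept after lowering cgminer's intensity.
def retained_pct(full_rate, reduced_rate):
    return round(100 * reduced_rate / full_rate, 1)

print(retained_pct(950, 710))  # 74.7 -- about three quarters of the hash rate kept
```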


----------



## kcuestag

Any idea what the "ACP Application" is? It's included with the AMD drivers; I'm wondering what it does and whether I can get rid of it.


----------



## Derpinheimer

Quote:


> Originally Posted by *Heinz68*
> 
> You're right when it comes down to gaming, but for net surfing or video watching I need to decrease the Intensiti only by 4 points and on my HD 6990 the hash rate goes does from 950 to 710 kHash/s, still worthwhile.
> 
> In the meantime I'm waiting for to buy couple reasonably priced custom built R9 290X, hopefuly soon.
> 
> Maybe than I can find cgminer config that would allow me to use for mining only one card when gaming.


I still mine during some games at 12 to 14 intensity. Games usually seem to get priority, so 60 FPS is maintained. Because of that, the hash rate can fluctuate from 200 to 650 while playing. I don't consider that pointless.


----------



## hyp3rtraxx

Will my PSU be able to handle two XFX Radeon R9 280X 3 GB (Tahiti XTL) cards in Crossfire, not overclocked? I'll use them for light gaming (Dota 2) and mining.


----------



## Hattifnatten

Wrong thread, but yes, if you're talking about that 850 W unit.


----------



## Redvineal

Quote:


> Originally Posted by *Redvineal*
> 
> What's up party people? Anyone have an idea what the stock cooler VRAM and VRM thermal pad materials are and/or what their thermal conductivity ratings are?
> 
> I'm especially interested in the gray stripe that makes contact with VRM1 and how it compares to Fujipoly Extreme (11w/mK) thermal pads.
> 
> Thanks!


It seems no one has an answer to the question above. Either that, or it was simply lost in the other conversation.

Anyhow, let me pose a slightly different question. Knowing that I'm using a cut portion of the stock cooler as my VRM1 heatsink, would the Fujipoly Extreme (11 W/mK) thermal pads work well for VRM1 cooling? Specifically, would they work better or worse than the stock gray stripe padding?

Any opinions or facts on the matter? Thanks.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Taint3dBulge*
> 
> yes it is torture. I know the 780ti classy is faster then any amd card out there. It also will clock huge.. it wont however see any big gains in newer drivers. Wellnot like the amds will show anyways. Hmmmmm luckly im out of town and no pc. Have time to think more. Oh also been reading alot on these qnix monitors and how 290xs dont like anything over 60hz. That concerns me too.


They can do over 60 Hz, but only at 1100 MHz or so core... which is really only a 100 MHz loss in possible core clock, since most 290/290Xs aren't stable over 1175-1225 MHz anyway.

It is annoying though, I agree, as I'm suffering from this problem myself. It may be AMD driver related and get fixed eventually... hopefully. Or it might even be an issue on the monitor's PCB end, since they are "overclocked".


----------



## smoke2

Quote:


> Originally Posted by *HardwareDecoder*
> 
> lol you guys. I posted a detailed explanation awhile back on why for a single 290 you don't need more then a 650w psu . as long as it isn't a crap psu.
> 
> maybe my post could be added to the op somewhere because people keep asking the same questions and recommending overkill psu power.
> 
> See my post here with the math to back it up. http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/11550#post_21421778
> 
> I want to add that getting a bigger psu can still make sense like you said in case newer cards use more power or you want to add a second card later. However to the other guy i'm not even hitting 475W usage on a 650W psu (with my heavily oc'ed cpu at 100% and oc'ed gpu at 100% and a water cooling loop)
> 
> so that isn't going to effect the life span on the unit at all. Remember too these are designed to handle up to their max output 24//7 and it's something of a myth in my opinion that running a (quality) psu at max capacity lowers it's life span.


I know, but lifespan isn't the main thing; I'm afraid that newer cards will be even more power hungry, because there are problems with the smaller manufacturing process (now 20 nm), and it could happen that with more powerful cards you'll need a stronger power supply.
The price difference is 24 EUR.


----------



## DeadlyDNA

Woohoo, I'm back up and running, everything is good, and I've been benching. I added two more rads to my loop (Phobya dual 200 mm rads); it's overkill and all, but my cards barely reach 45°C now at full load while benching. I'm just not sure about the WHQL 13.12 drivers yet.

Also, does anyone know if there is more than one version of Fire Strike? I've seen videos and scores that don't match mine, like the number of tests being different.


----------



## Forceman

Quote:


> Originally Posted by *kcuestag*
> 
> Any idea what the "ACP Application" is? It is included in the AMD Drivers, wondering what it does, and if I can get rid of them.


I asked before, and the consensus was that it's some kind of audio feature/function. Possibly TrueAudio?


----------



## Taint3dBulge

Quote:


> Originally Posted by *HardwareDecoder*
> 
> they can do over 60hz but only at 1100 or so core... which is really only a 100mhz loss in possible core clock since most 290/x's aren't stable over 1175-1225 anyway.
> 
> It is annoying though I agree as I am suffering this problem. It may be amd driver related and get fixed eventually...hopefully. Or it might even be an issue on the monitor pcb end since they are "overclocked"


Thanks for the info. I'm gonna have to ask in the QNIX thread if Nvidia shares the same issue.


----------



## Pesmerrga

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Woohoo i'm back up and running everything is good and i have been benching. I added 2 more rads to my loop (phobia dual 200mm rads) it's overkill and all but my cards can barely reach 45C now on full load in benching. I'm just not sure about the WHQL 13.12 drivers yet.
> 
> Also, anyone know if there is more than one version of firestrike? I've seen videos and scores that don't match mine. Like the amount of tests are different.


Are you seeing the demo play first before the actual tests?


----------



## bayz11

Quote:


> Originally Posted by *HardwareDecoder*
> 
> they can do over 60hz but only at 1100 or so core... which is really only a 100mhz loss in possible core clock since most 290/x's aren't stable over 1175-1225 anyway.
> 
> It is annoying though I agree as I am suffering this problem. It may be amd driver related and get fixed eventually...hopefully. Or it might even be an issue on the monitor pcb end since they are "overclocked"


Hello, I wanna ask: does this happen with QNIX only? What about if I use an ASUS or BenQ monitor that fully supports 144 Hz?

I own a QNIX as well, OC'd to 96 Hz with my GPU OC'd to 1200/1325 at +150 mV VDDC, stable, but I couldn't push more even with VDDC at +200 mV.


----------



## HardwareDecoder

Quote:


> Originally Posted by *bayz11*
> 
> Hello.i wanna ask is this happen with qnix only?what about if i use asus and benq monitor that support fully 144p?
> 
> I own qnix as well,oc at 96mhz with my gpu oc at 1200/1325 at vddc 150 stable.but couldnt push more even with vddc at 200.


It seems to be a QNIX-only problem, which leads me to believe it's just something with the QNIX PCB and the 290/290X not talking to each other effectively once both are overclocked significantly, and not an actual issue with the 290/290X.


----------



## Iniura

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Woohoo i'm back up and running everything is good and i have been benching. I added 2 more rads to my loop (phobia dual 200mm rads) it's overkill and all but my cards can barely reach 45C now on full load in benching. I'm just not sure about the WHQL 13.12 drivers yet.
> 
> Also, anyone know if there is more than one version of firestrike? I've seen videos and scores that don't match mine. Like the amount of tests are different.


When you download the Basic Edition (free) of 3DMark you get the Fire Strike benchmark; when you buy the 3DMark Advanced Edition you unlock the Fire Strike Extreme preset.

http://www.futuremark.com/benchmarks/3dmark


----------



## the9quad

Quote:


> Originally Posted by *Iniura*
> 
> When you download the basic edition (free) of 3DMark you get the Firestrike benchmark, when you buy 3DMark advanced edition you will unlock the Firestrike Extreme preset.
> 
> http://www.futuremark.com/benchmarks/3dmark


He really should have bought it from Steam the other day; it was like $2.50.


----------



## stickg1

With the Koolance block installed, I was able to benchmark stable at 1150/1500 with +100 mV in MSI AB. Core temp never got over 40°C.


----------



## DeadlyDNA

Quote:


> Originally Posted by *the9quad*
> 
> HE really should have bought it from steam the other day it was like $2.50


Quote:


> Originally Posted by *Iniura*
> 
> When you download the basic edition (free) of 3DMark you get the Firestrike benchmark, when you buy 3DMark advanced edition you will unlock the Firestrike Extreme preset.
> 
> http://www.futuremark.com/benchmarks/3dmark


Quote:


> Originally Posted by *Pesmerrga*
> 
> Are you seeing the demo play first before the actual tests?


That's the version I have; I bought Advanced through Steam. I just saw a thread for 3DMark (2013) Fire Strike, and when I compared the loops they had a different set of tests. Thought maybe I was just looking at it wrong, or at different versions.

I got this score earlier fooling around; I was trying to compare to similar systems on here. My quad core holds me back against the 6-cores.
4.6 GHz 3820 / 4x 290s. To me it looks low.

http://www.3dmark.com/3dm11/7728169?


----------



## IBIubbleTea

Just ordered the ASUS R9 290 for $449.99. Can't wait to get it.


----------



## Sgt Bilko

And OCN is back online!!!

But the sad news is I'm still waiting on custom 290s.


----------



## The Storm

Quote:


> Originally Posted by *Taint3dBulge*
> 
> yes it is torture. I know the 780ti classy is faster then any amd card out there. It also will clock huge.. it wont however see any big gains in newer drivers. Wellnot like the amds will show anyways. Hmmmmm luckly im out of town and no pc. Have time to think more. Oh also been reading alot on these qnix monitors and how 290xs dont like anything over 60hz. That concerns me too.


I have an X-Star, and I also have two 290s that unlocked to 290Xs, and I have zero issues with my monitor; I can run it at 60, 96, 105, and 120 Hz without any issues at all. Not sure why other people are having so many problems. I'm on Windows 8.1.


----------



## yappy

Hey guys.

How many watts would you say an ASUS R9 290 OC draws? I OC'd mine to 1000 MHz core and 1500 MHz memory with PowerTune at +20%. I cannot adjust the voltage.

I have four of these cards, and my system cannot run all four at full 100% load. I have a Lepa 1600 W PSU; my whole system just shuts off after a couple of minutes.

But I can run three cards at full 100% usage for hours and hours, no problem.
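A rough budget shows why three cards might survive while four trip the supply; the per-card and platform wattages below are assumed ballpark figures, not measurements from this post:

```python
# Why four overclocked 290s can trip a 1600 W PSU while three run fine.
# Both constants are illustrative assumptions, not measured values.
CARD_W = 300        # assumed peak draw per overclocked R9 290
PLATFORM_W = 250    # assumed CPU, motherboard, drives, fans

def system_draw_w(n_cards):
    return n_cards * CARD_W + PLATFORM_W

print(system_draw_w(3))  # 1150 W -- comfortable on a 1600 W unit
print(system_draw_w(4))  # 1450 W -- close enough to the limit that transient
                         # spikes can trigger the PSU's protection circuits
```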


----------



## steelkevin

Quote:


> Originally Posted by *stickg1*
> 
> With the Koolance block installed I was able to benchmark stable at 1150/1500 +100mV MSI-AB. Core temp never got over 40C


Whoa, whoa, wait... with this:



You're only hitting 40°C?!
Is it like -5°C where your computer is? xD
Or perhaps that fan is spinning really fast?


----------



## maynard14

Quote:


> Originally Posted by *The Storm*
> 
> I have an X-Star, I also have 2 290's that unlocked to 290x's and I have 0 issues with my monitor, I can run it at 60, 96, 105, and 120 without issues at all. Not sure why other people are having so many issues. Im on Windows 8.1


how long have you been using your r9 290 as 290x sir?


----------



## The Storm

Quote:


> Originally Posted by *maynard14*
> 
> how long have you been using your r9 290 as 290x sir?


Since Dec 12th. I haven't had a single issue since both cards have been flashed.


----------



## Sgt Bilko

Quote:


> Originally Posted by *steelkevin*
> 
> Wow wow wait... with this:
> 
> 
> 
> You're only hitting 40°C Oo ! ?
> Is it like -5°C where your computer is xD ?


I'm guessing he means the Koolance full-cover 290 block and not just a universal water mod.


----------



## steelkevin

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm guessing he means the Koolance Fullcover 290 block and not just a universal water mod










Obviously he was. No idea how I missed that. It's still really impressive.


----------



## maynard14

Quote:


> Originally Posted by *The Storm*
> 
> Since Dec 12th. I haven't had a single issue since both cards have been flashed.


Nice, thanks sir... I'm thinking of leaving my XFX 290 flashed as a 290X using the XFX 290X BIOS... hehehe.


----------



## darkelixa

The only game I own that makes my card work for its money is Battlefield 4.

It definitely heats the room up, but in Final Fantasy the card runs very quiet and cool.


----------



## Tobiman

Quote:


> Originally Posted by *yappy*
> 
> Hey guys.
> 
> How much watts would you say a ASUS R9 290 OC runs at? I OC'd mine to 1000mhz and my mem clock to 1500mhz with powertune at 20%. I cannot adjust voltage.
> 
> I have 4 of these cards, and my system cannot run all 4 at full load 100% usage. I got a lepa 1600w psu. My whole system just shuts off after couple minutes.
> 
> But I can run 3 cards at full 100% usage for hours and hours no problem.


Either you get a second PSU or you watercool your GPUs. The second option isn't guaranteed to fix the problem, but it will definitely reduce each card's power consumption.


----------



## Sgt Bilko

So, has anyone heard any dates regarding the custom 290 cards?

I've heard some dates floating around but I'm not placing too much stock in them atm:

Asus DCU II = 15th-17th Jan
Sapphire Tri-X = 2nd Jan
Gigabyte W3 = out now
MSI Twin Frozr = out now (haven't seen it yet); also seen 24th Jan
HIS IceQ2 = unknown (but ugly as hell)
and the PowerColor LCS 290X (a 290X shipping with a pre-installed EK block)

Seriously hoping for them to come out next week, though... I'll probably grab two reference cards if they make me wait too long.


----------



## darkelixa

I think the reference cards will be the better option, since they are not factory overclocked and the stock clocks work perfectly before being overclocked.


----------



## VSG

Quote:


> Originally Posted by *Sgt Bilko*
> 
> So has anyone heard any dates regarding the Custom 290 cards?
> 
> I've heard some dates floating around but not placing too much stock in them atm,
> 
> Asus DCU II = 15-17th Jan,
> Sapphire Tri-X = 2nd Jan,
> Gigabyte W3 = out now,
> MSI Twin Frozr = out now (haven't seen it yet), also seen 24th Jan
> HIS IceQ2 = Unknown (but ugly as hell)
> and the Powercolor LCS 290x (290x shipping with a pre-installed EK block)
> 
> Seriously hoping for them to come out next week though......i'll probably grab two Ref cards if they make me wait too long


I am going to wait till CES is over before I make my decision on any cards. What I am waiting for from the AMD side is the MSI 290X Lightning.


----------



## Sgt Bilko

Quote:


> Originally Posted by *geggeg*
> 
> I am going to wait till CES is over before I make my decision about any cards. What I am waiting for from the AMD side is the MSI 290x Lightning


That was my thinking as well, except I need a new card before the 10th (wife wants her 7970 back).

Quote:


> Originally Posted by *darkelixa*
> 
> I think the ref cards will be a better option since they are not factory overclocked and the stock clocks work perfect before being overclocked


Well, yes and no. I don't have an issue with heat, as my case is well ventilated, but it's more the noise from them (I fold at night with the PC about 2 m from my head), so a custom card would be best for noise. But if I don't see any custom cards in stock at PCCG before the 10th, then I'm just grabbing a couple of reference cards (if they still have them in stock, that is).


----------



## darkelixa

I should record my PC for you one day; I can't hear the R9 290 over my case fans or the air con. It's only with BF4 that you can hear the card, and even then it's not that noticeable.

Some gameplay from earlier, lol.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> I should record my pc for you one day, I cant hear the r9 290 over my case fans, or the air con. Its only with Bf4 that you can hear the card but its not noticable
> 
> 
> 
> 
> 
> 
> Some gameplay from earlier lol


I used to own a 290X; I know about the noise. But I also don't have air con here, and we hit close to 40°C ambient in summer...


----------



## darkelixa

Why do you fold at night? What's the benefit?


----------



## darkelixa

Yeah, it gets that hot here in QLD also; I have my air con at about 27°C.


----------



## Joeking78

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I used to own a 290x, i know about the noise, but i also don't have Air Con here and we hit close to 40c ambient in summer......


Same here... Dubai ambient temps are 40-50°C in the summer. I had to turn the A/C up and the GPU fan up, and it's just too noisy. Ended up going with the Giga 780 Ti Windforce; I can have them at 100% fan speed and not hear them, and it keeps the cards plenty cool even without the A/C on.


----------



## darkelixa

My GTX 770 is still holding down paper, as it had micro stutter; swapped to the 290 and the issue is gone.


----------



## the9quad

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Thanks for the info. I'm gonna have to ask in the Qnix thread if Nvidia shares the same issue.


Running 3x 290Xs and a Qnix at 100, 110, and 120 Hz, no problems at all.


----------



## Taint3dBulge

Quote:


> Originally Posted by *The Storm*
> 
> I have an X-Star, I also have 2 290's that unlocked to 290x's and I have 0 issues with my monitor, I can run it at 60, 96, 105, and 120 without issues at all. Not sure why other people are having so many issues. Im on Windows 8.1


Quote:


> Originally Posted by *the9quad*
> 
> Running 3x290x's and a qnix at 100,110,and 120 hz no problems at all.


Are you guys overclocking your video cards, and by how much?


----------



## the9quad

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Are you guys overclocking your video cards, and by how much?


I don't run mine overclocked because I'm on stock cooling, but I've run quite a few benches with them at, I believe, 1175 and 1350 core and RAM respectively. That's about all I can do with my power supply. Hope that helps; it was enough for #41 on the Fire Strike Extreme HOF, which was enough for me.

In all honesty, I really don't need to overclock with three of them; there isn't anything I can't run on my monitor at 100 Hz at 100 fps besides Arma 3. I also don't run at 120 Hz because I can't tell the difference between 100 and 120, and it's just less taxing on the cards and the monitor.

If you're going to buy one card and money isn't an issue, I'd say get a 780 Ti. If money is an issue, get a 290. If you're going multiple cards, well, 290s with water cooling is what I would do, but only if you can get them at MSRP. Can't really make a case for the 290X: it's slower than a Ti and more expensive than the marginally slower 290.

One caveat: if you are going multiple cards and not water cooling, then Tis are your better bet. The 290/290X in Crossfire with stock cooling is very loud and hot.


----------



## Hattifnatten

Quick question: what clocks do people get on stock voltage on the 290? Mine will do 1155/1625 (ECC does not even kick in). I know the memory is above average, but how about the core?


----------



## stickg1

Quote:


> Originally Posted by *steelkevin*
> 
> Wow wow wait... with this:
> 
> 
> 
> You're only hitting 40°C Oo ! ?
> Is it like -5°C where your computer is xD ?
> Or perhaps that fan is spinning really fast ?


No I put on a proper block, the Koolance VID-AR290X


----------



## steelkevin

Quote:


> Originally Posted by *stickg1*
> 
> No I put on a proper block, the Koolance VID-AR290X


That's still very impressive. You only have a 240mm and a 120mm rad, which don't look like they're equipped with fans in push/pull. And your GPU isn't even alone in there; it's running alongside a 3570K (how overclocked is it, btw?).

OR maybe that quick disconnect behind the reservoir goes down to a massive 1080mm rad and not straight to the back of the res/pump combo ^^?

Which pump is it, btw? I know mine (an MCP 350) isn't one of the best and I'll change it in due time for a better one. I doubt it'd make that much of a difference though.


----------



## stickg1

Quote:


> Originally Posted by *steelkevin*
> 
> That's still very impressive. You only have a 240mm and a 120mm rad which don't look like they're equipped with fans in p/p. And you're GPU isnt even alone in there, it's running along side a 3570K (how overclocked is it btw ?).
> 
> OR maybe that Quick Disconnect behind the reservoir goes down to a massive 1080mm rad and not strait to the back of the res/pump combo ^^ ?
> 
> Which pump is it btw ? I know mine's (MCP 350) not one of the best and I'll change it in due time for a better one. I doubt it'd make that much of a difference though.


The R9 290 I run at 1100/1400 w/ +70mV for 24/7 clocks. To be fair, when I run something for over an hour, like a game or Folding@home, the GPU will get up to ~43C, but I have not seen anything over 45C since I put this block on.

The 3570K is 4.7GHz w/ 1.296v. Temps get into the low 60C's while folding.

I use the MCP355, and I will be adding one more 120mm radiator to the front of the case, but I ran out of fittings. In fact, I will be redoing the entire loop with all chrome fittings; that picture was just kind of a test run because I got impatient waiting for all my gear. The quick disconnects were used just so I could transition into other (smaller) fittings, because I had run out of 3/4" OD compressions. But it almost looks like they have a purpose. Hell, I might even keep them; I have extra attachments for them, so I could use them to drain the loop and swap out parts, etc.


----------



## DeadlyDNA

Quote:


> Originally Posted by *yappy*
> 
> Hey guys.
> 
> How much watts would you say a ASUS R9 290 OC runs at? I OC'd mine to 1000mhz and my mem clock to 1500mhz with powertune at 20%. I cannot adjust voltage.
> 
> I have 4 of these cards, and my system cannot run all 4 at full load 100% usage. I got a lepa 1600w psu. My whole system just shuts off after couple minutes.
> 
> But I can run 3 cards at full 100% usage for hours and hours no problem.


I have almost the same setup. Make sure you aren't using the shared rails on your LEPA G1600. There are two shared rails: 12V3 and 12V5 each have two of the red PCIe connectors. I am not at my PC right now, but refer to the picture tsm106 posted a while back.
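A rough way to see why those shared rails matter is a back-of-envelope current check. The numbers below are illustrative assumptions for the sketch (a worst-case ~300 W per overclocked R9 290 and a ~30 A per-rail limit), not LEPA G1600 specifications; check your PSU's label for the real figures. Two cards pulling through one rail trips over-current protection, which looks exactly like "the whole system shuts off under load":

```python
# Back-of-envelope 12 V rail loading for a multi-card R9 290 setup.
# CARD_WATTS and RAIL_LIMIT_AMPS are assumptions for illustration only.
CARD_WATTS = 300.0      # assumed worst-case draw per overclocked R9 290
RAIL_VOLTS = 12.0
RAIL_LIMIT_AMPS = 30.0  # assumed per-rail OCP limit on a multi-rail PSU

def rail_load_amps(cards_on_rail, card_watts=CARD_WATTS):
    """Current one rail must supply if it feeds this many cards' PCIe leads
    (simplified: ignores the ~75 W each card can also pull from the slot)."""
    return cards_on_rail * card_watts / RAIL_VOLTS

print(rail_load_amps(1))  # 25.0 A -> within the assumed 30 A limit
print(rail_load_amps(2))  # 50.0 A -> well past it; OCP shuts the PSU down
```

This is why spreading the red PCIe connectors across independent rails (one card per rail) keeps a four-card load stable while doubling up on a shared rail does not.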


----------



## yappy

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I have the same setup almost. Make sure you arent using shared rails on your lepa g1600.
> There are 2 shared rails 12v3 and 12v5 have 2 of the red pcie connectors. I am not on pc right now but refer to the
> picture tsm106 posted a while back.


I will look for the post. If I need help i'll PM you.


----------



## Derpinheimer

Quote:


> Originally Posted by *Hattifnatten*
> 
> Quick question, what clocks do people get on stock voltages on the 290? Mine will do 1155/1625 (ECC does not even kick in). I know the memory is above average, but how about core?


I don't believe this one bit lol.

You're well aware of how good this chip is if true and are just gloating.


----------



## sugarhell

Quote:


> Originally Posted by *Derpinheimer*
> 
> I don't believe this one bit lol.
> 
> You're well aware of how good this chip is if true and are just gloating.


Stock volts mean nothing. Until you see the card under full volts, you can't say anything.


----------



## skupples

This thread has had 1,500 posts since the last time I opened it. AMAZING!


----------



## HardwareDecoder

Quote:


> Originally Posted by *the9quad*
> 
> Running 3x290x's and a qnix at 100,110,and 120 hz no problems at all.


You aren't having problems because you aren't overclocking the 290Xs.

I'd be willing to bet you get display corruption over 1100 MHz / over 60 Hz like most of us. Sucks for those of us who want to overclock as high as possible.

I'm really starting to wish I had just bought a 780 Ti, even though I do like this card, and the power it has for the $500 I paid is pretty impressive. It just hurts not being able to overclock as high as the card can go, since I can do like 1250 @ 60 Hz but only 1100 @ 110 Hz.


----------



## Hattifnatten

Quote:


> Originally Posted by *Derpinheimer*
> 
> I don't believe this one bit lol.
> 
> You're well aware of how good this chip is if true and are just gloating.


Quote:


> Originally Posted by *sugarhell*
> 
> Stock volts mean nothing. Until you see the card under full volts then you cant say anything


Well, it's not exactly stable, but it can finish some benchmarks without crashing. It is, however, game-stable with +100mV on both core and memory.


Spoiler: Warning: Spoiler!


----------



## rdr09

Quote:


> Originally Posted by *Hattifnatten*
> 
> Well, it's not exactly stable
> 
> 
> 
> 
> 
> 
> 
> , but it can finish some benchmarks without crashing. It is however, gamestable with +100mv on both core and memory.
> 
> 
> Spoiler: Warning: Spoiler!


When you say stock volts, do you mean no voltage tweaks? If that is the case, then say "no voltage tweaks," 'cause Derpinheimer won't buy it. Been there, done that. Mine can game at 1175/1500 without voltage tweaks or manual adjustment of voltage.


----------



## Hattifnatten

No voltage tweaking needed for those settings, although it crashes pretty much immediately when not under load unless I up the memory voltage. Will see how far I can push it with +200mV on the core; hopefully close to 1300.


----------



## the9quad

Quote:


> Originally Posted by *HardwareDecoder*
> 
> you aren't having problems because you aren't overclocking the 290x's.
> 
> I'd be willing to bet you get display corruption over 1100mhz / over 60hz like most of us
> Sucks for us that want to overclock as high as possible
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm really starting to wish I had just bought a 780ti, even though I do like this card and the power it has for the 500$ I paid is pretty impressive. It just hurts me not to be able to overclock as high as the card can go since I can do like 1250 @ 60hz but only 1100 @ 110hz


Maybe you should go read my posts. I don't run overclocked 24/7, but I have run them overclocked at 120 Hz with a 1150/1375 overclock, no problems. I don't overclock higher because, for one, I only have a 1200-watt power supply, and two, with three cards I imagine I would have to be lucky to find three that will all do 1200/1400. So yes, overclocking three of them and running at 120 Hz causes me zero problems.


----------



## rdr09

Quote:


> Originally Posted by *Hattifnatten*
> 
> No voltagetweaking needed for those settings, although it crashes pretty much imidiately when not under load unless I up the memoryvoltage. Will see how far I can push it with +200mV on core, hopefully close to 1300.


Hmmm, you can adjust the voltage for the memory? Mine does not have that feature.


----------



## Hattifnatten

Press the down arrow next to core voltage. Aux voltage is memory, AFAIK, even though there's a memory voltage too.


----------



## rdr09

Quote:


> Originally Posted by *Hattifnatten*
> 
> Press the arrow down next to core voltage. Aux voltage is memory afaik, even though there's memory voltage too.


I only use TriXX for OC'ing, and I think it does not have Aux voltage either. AB does.


----------



## zpaf

With the reference cooler at 100% you can check your card's limits without worrying about temps.


AMD Radeon R9 290 video card benchmark result - Intel Core i7-4770K,ASUSTeK COMPUTER INC. MAXIMUS VI IMPACT


----------



## cyenz

Just a heads up: Gigabyte has a 290X BIOS update, UEFI compatible.

http://www.gigabyte.pt/products/page/vga/gv-r929xd5-4gd-b


----------



## HardwareDecoder

Quote:


> Originally Posted by *the9quad*
> 
> Maybe you should go read my posts, I don't run 24/7 overclocked but I have ran them overclocked no problems at 120hz with 1150/1375 overclock. I don't overclock higher because for one: I only have a 1200 watt power supply, and two: with 3 cards I imagine I would have to be lucky to find 3 that will do 1200/1400. So yes overclocking 3 of them and running at 120hz causes me zero problems.


Sorry, I don't keep track of everyone's posts; I just saw your last one where you said you were not overclocking. Excuse me that I don't really care about your rig. Cool that you have no issues, though; you are in the minority, however.

Edit: sorry, not trying to sound like a jerk, just not happy with my own (and other people's) situation regarding these.

Have you tried each of your cards in the first slot? Since you can only hook up video outputs on the first card, right? Or did they change that from the 7xxx series? My point is, maybe one or two of your cards have no issue but the third would.

Since some people claim no issues and others have problems, it seems to be hit or miss, which leads me to believe it is not driver/Qnix related but related to individual 290s.


----------



## rdr09

Quote:


> Originally Posted by *zpaf*
> 
> With reference cooler at 100% you can check your card limits without worry about temps.
> 
> 
> AMD Radeon R9 290 video card benchmark result - Intel Core i7-4770K,ASUSTeK COMPUTER INC. MAXIMUS VI IMPACT
> 
> 
> Spoiler: Warning: Spoiler!


Nice. Memory OC does help in FS. You've got a very good card there, zpaf.


----------



## the9quad

Quote:


> Originally Posted by *HardwareDecoder*
> 
> sorry I don't keep track of everyones posts I just saw your last one where you said you were not overclocking.........excuse me that I don't really care about your rig. Cool that you have no issues though, you are in the minority however.


I'll chalk that up to you having a bad day or something. I didn't mean my post as antagonizing; it was meant as clarification, because you quoted my post. I was merely trying to help, as you were asking if people had similar problems. So sorry if I offended you; it wasn't meant that way. Hope you get whatever technical issues you're having worked out.

Geez.


----------



## Jack Mac

Quote:


> Originally Posted by *HardwareDecoder*
> 
> sorry I don't keep track of everyones posts I just saw your last one where you said you were not overclocking.........excuse me that I don't really care about your rig. Cool that you have no issues though, you are in the minority however.


I know it's not a Qnix, but I tested my friend's Catleap with my 290 OC'd and it had no problems running 120 Hz. Maybe it's the Qnix at fault here?


----------



## HardwareDecoder

See my edited post, 9quad. Yes, I am honestly having a bad week; not your problem, it's mine. I gotta go hang out with my in-laws today and I don't really like them, lol.


----------



## zpaf

Quote:


> Originally Posted by *rdr09*
> 
> nice. mem oc does help in FS. you've got a very good card there, zpaf.


Thanks a lot.
The memory scales very well as I increase GPU voltage.
At the default voltage of 1.25V the memory does 1450MHz.
At the max of 1.412V (really ~1.3V because of vdroop) the memory can do, as you see, 1625MHz.
Very happy with the card.


----------



## rdr09

Quote:


> Originally Posted by *zpaf*
> 
> Thanks a lot.
> Memory scales very well as I increase gpu voltage.
> At default voltage 1.25v memory do 1450MHz
> At max 1.412v with real ~1.3v cause vdroop memory can do as you see 1625MHz.
> Very happy with card.


I'd imagine with an aftermarket cooler your card will do even better.


----------



## Ized

Installed an Arctic Accelero Hybrid and, well, it sounds like someone is taking a leak inside my system: https://www.dropbox.com/s/9pwdcewdg014e5v/pump2.mp3 The water sounds come and go.

Has there been any word from Arctic about an actual 290 package? There aren't enough heatsinks, and the VRM heatsinks are way too small to be used while mining for coins.


----------



## geoxile

Are there any manufacturers that actually allow aftermarket coolers?


----------



## MrWhiteRX7

Has anyone tested the new Heatkiller water blocks? I might get these, as no one seems to have the XSPC blocks in stock and I want copper. Or if you have seen a good review, point me to it please! I can't seem to find much about them on Google.


----------



## VSG

They look good but are really expensive! If you want an EK copper block, I got one for sale in the marketplace.


----------



## rdr09

Quote:


> Originally Posted by *geoxile*
> 
> Are there any manufacturers that actually allow aftermarket coolers?


This has been asked a number of times and, IIRC, XFX, PowerColor, and MSI do allow it. In case you need to RMA, they just want the original cooler back, of course.

Sapphire is under a "don't ask, don't tell" policy.


----------



## geoxile

Quote:


> Originally Posted by *rdr09*
> 
> this has been asked a number of times and, iirc, xfx, powercolor and msi do allow it. in case you need to rma, they just want the orig cooler back of course.


Don't XFX and MSI have the warranty stickers on their screws? So even if you remove the sticker they'll honor the warranty?


----------



## Ized

Quote:


> Originally Posted by *geoxile*
> 
> Don't XFX and MSI have the warranty stickers on their screws? So even if you remove the sticker they'll honor the warranty?


My XFX 290 had stickers on the screws. They did not appear to be anti-tamper stickers, though, like the PS3 etc. has; just regular small stickers.

My PowerColor did not.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *geggeg*
> 
> They look good but are really expensive! If you want an EK copper block, I got one for sale in the marketplace.


I have three 290s... they seem to be the same price as the XSPC from where I'm looking. Hmmm, I'm sure they're on the same performance level as the others. All the Heatkiller blocks for other cards have reviewed very well. I might go for it, lol. I'll post up results if I do.


----------



## VSG

Well, the way I see it, they have saved money by not nickel plating the blocks, so I am not sure why they are charging over $30 more than the EK blocks. The XSPC version has a pretty complicated design and has their trademark LED design embedded, so that one I can understand being more expensive.


----------



## Forceman

Quote:


> Originally Posted by *Hattifnatten*
> 
> Press the arrow down next to core voltage. Aux voltage is memory afaik, even though there's memory voltage too.


It's not. Unwinder has already said there is no memory voltage control available, and the Aux voltage on that slider changes the VDDCI (as shown in GPU-Z), which is 1.0V. I doubt the memory runs at 1.0V. Changing core voltage does seem to help memory overclocking, though; my guess is that the limiting factor is the new memory controller, not the memory chips themselves.

Quote:


> Originally Posted by *geoxile*
> 
> Don't XFX and MSI have the warranty stickers on their screws? So even if you remove the sticker they'll honor the warranty?


According to XFX, those stickers don't apply/matter in North America.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *geggeg*
> 
> Well the way I see it is they have saved money by not nickel plating the blocks so I am not sure why they are charging over $30 more than the EK blocks. The XSPC version has a pretty complicated design and has their trademark LED design embedded so that I can understand being more expensive.


Hmmm, I didn't realize the EKs were that much cheaper; thanks for the heads up. I honestly haven't given other blocks much of a look, since I was originally waiting to see how the XSPC did when they were released, but it's impossible to get any for probably the next 18-25 days, lol. If you had three blocks I'd probably just grab them from you.


----------



## VSG

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Hmmm I didn't realize the EK's were that much cheaper, thanks for the heads up. I honestly haven't given other blocks much look since I was originally waiting to see how the xspc did when they were released, but it's impossible to get any for probably the next 18 - 25 days lol. If you had 3 blocks I'd probably just grab them from you


PM sent summarizing current 290x waterblock situation from different manufacturers- hope that helps.


----------



## chiknnwatrmln

All I know is I ordered an EK Acetal 290x block over two weeks ago and it still hasn't shipped...


----------



## MrWhiteRX7

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> All I know is I ordered an EK Acetal 290x block over two weeks ago and it still hasn't shipped...


From where?


----------



## The Storm

Quote:


> Originally Posted by *Taint3dBulge*
> 
> Are u guys over clockn ur video cards snd by how much?


I've pushed mine to 1175/1500 and no issues with the monitor.


----------



## prostreetcamaro

Has anybody had better benchmark performance with one driver over the others? My scores seem slightly lower than they should be, and it's not due to heat, because my 290X is under water and maxing out in the low 40s.


----------



## Darklyspectre

Has anybody heard anything about this thing?

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131543

I had read that this PCS LCS 290X was being released soon, but didn't expect it to be now.

Has anybody seen any reviews on it?

I was about to order my EVGA 780 Ti Classy and then, of course, this pops up.


----------



## Derpinheimer

Quote:


> Originally Posted by *Hattifnatten*
> 
> Well, it's not exactly stable
> 
> 
> 
> 
> 
> 
> 
> , but it can finish some benchmarks without crashing. It is however, gamestable with +100mv on both core and memory.
> 
> 
> Spoiler: Warning: Spoiler!


So you lied... lol.


----------



## velocityx

Quote:


> Originally Posted by *cyenz*
> 
> Just an heads up, gigabyte as a 290x bios update, uefi compatible.
> 
> http://www.gigabyte.pt/products/page/vga/gv-r929xd5-4gd-b


Can you elaborate a bit on the difference between the regular BIOS and this one? Is it about some cards having problems displaying the BIOS when it's UEFI? Or what does it do?


----------



## cyenz

Quote:


> Originally Posted by *velocityx*
> 
> can you elaborate a bit whats the difference,between the regular bios and this one? is it about some cards having problems displaying bios when its uefi>? or what it does?


It's not to fix any problem, just to get full compatibility with UEFI BIOS (fast boot and such). Nothing major, I guess; I just posted it because some people in this thread were looking for one.


----------



## velocityx

Quote:


> Originally Posted by *cyenz*
> 
> It´s not to fix any problem, just to get full compatibility with uefi bios (fast boot and such) nothing major i guess, just posted because some people in this thread were looking for one.


I guess if I have the non-X version, it's alright to flash this BIOS on it? ;D


----------



## MrWhiteRX7

Welllllllllll, it seems quiet at the moment... anyone taking bets on whether we see Mantle in the next few days?

I still don't think we'll see it before CES.


----------



## Widde

Does anyone know if MSI will void their warranty if you replace their heatsink with an aftermarket one? I have not found an e-mail address for Europe.

I really want to replace it; it's not so much that it's loud as that it's bad at what it does.


----------



## hotrod717

Please update me: adding a Sapphire 290X, water cooled. Just bought another card from an OCN member for my X79/4930K/R4BE build. Installing drivers now. Woot!

The horrible part is, after getting up early, breaking down the 3770K setup, and getting this together, I'm too tired to do any real overclocking. Definitely need a nap.


----------



## Forceman

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Welllllllllll it seems quiet at the moment... anyone taking bets on whether we see mantle in the next few days?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I still don't think we'll see it before CES


Considering how much they touted it back in Sep/Oct, the fact that they've gone completely silent now, when it is due for release, strongly indicates that it isn't going to be available when they said it would be.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Forceman*
> 
> Considering how much they touted it back in Sep/Oct the fact that they've gone completely silent now, when it is due for release, strongly indicates that it isn't going to be available when they said it would be.


Thank you!!! IMO this is just simple reasoning... they clearly don't want to speak up about it because it's probably not ready, unless this is just going to be some epic surprise, hahaha.


----------



## robert0507

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Welllllllllll it seems quiet at the moment... anyone taking bets on whether we see mantle in the next few days?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I still don't think we'll see it before CES


I hope we get this by Monday. I am starting to get a bit pissed off over the no-information policy AMD or DICE seem to be implementing.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *robert0507*
> 
> I hope we get this by Monday. I am starting to get a bit passed off over the no information policy Amd or dice seem to be implementing


Man, I would be thrilled! I just don't see how.

They were giving pretty consistent updates/news about it up until about three weeks ago, when they just went completely silent. That is never good; no news is usually bad news when it comes to meeting release dates.

Having said this, I'm OK with the delay, because AMD/DICE need some good points here, and if they were to release the patch for Mantle support in BF4 while it's still plagued with all these other issues, it would be overshadowed for sure and not get the praise it may very well deserve.


----------



## zpaf

Quote:


> Originally Posted by *rdr09*
> 
> i'd imagine with an aftermarket cooler your card will do better.


Another one with the same clocks as before but with [email protected],7

http://www.3dmark.com/3dm/2041099

This card desperately needs water and an unofficial BIOS.


----------



## MrWhiteRX7

^^ NICE!!! ^^


----------



## velocityx

So does OC'ing memory bring any performance boost in games? I've heard contradictory opinions on that.


----------



## Forceman

Quote:


> Originally Posted by *velocityx*
> 
> so does ocing memory bring any performance boost in games? I heard contradicting opinions on that.


Pretty minor in most games, at least from the testing I've done and seen. It could depend on the game, though.
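For intuition on what a memory overclock buys on paper: Hawaii's 512-bit GDDR5 bus bandwidth scales linearly with memory clock, so a 1250 → 1500 MHz overclock is a 20% bandwidth uplift in theory. A minimal sketch of the standard formula (GDDR5 transfers 4 bits per pin per clock; the stock and overclocked figures are the commonly cited R9 290/290X numbers):

```python
def gddr5_bandwidth_gbps(mem_clock_mhz, bus_width_bits=512):
    """Theoretical GDDR5 bandwidth in GB/s.

    bytes/s = (bus pins / 8) * clock * 4 transfers-per-clock.
    """
    return bus_width_bits / 8 * mem_clock_mhz * 4 / 1000

stock = gddr5_bandwidth_gbps(1250)  # R9 290/290X stock memory clock
oc = gddr5_bandwidth_gbps(1500)     # a common memory overclock
print(stock, oc)  # 320.0 384.0 (GB/s)
```

Games rarely convert that 20% paper uplift into 20% more fps, since most workloads are bottlenecked on the core or shaders instead, which matches the "pretty minor" gains reported here.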


----------



## IBIubbleTea

What do you guys think about the drivers, have they gotten any better?


----------



## Sinch979

Good morning.
Is there anything to be concerned about with the weight of the XSPC waterblock? I didn't realize how heavy it is until I received it.(my first GPU water block).
Any chance of bending the card over time?
I'm guessing I might need to buy a backplate.Is this correct?
Thanks for your time.


----------



## VSG

Yeah, I would always use a backplate with any water block, just to be on the safe side.


----------



## Derpinheimer

Shouldn't a full cover block prevent bending?


----------



## Sinch979

Thanks for the quick reply, geggeg.
Derp: I should have said "damaged" instead of "bending."


----------



## MrWhiteRX7

Quote:


> Originally Posted by *IBIubbleTea*
> 
> What do you guys think about the drivers, have they gotten any better?


LOL, I'm still using the 13.11 WHQL drivers. I have had no reason to use any betas or even switch to the 13.12 driver.


----------



## stickg1

Any backplates compatible with the Koolance VID-AR290X?


----------



## IBIubbleTea

Quote:


> Originally Posted by *stickg1*
> 
> Any backplates compatible with the Koolance VID-AR290X?


I don't think so. I was planning on getting that block; now I'm getting the EK R9 290X Acetal with a backplate.


----------



## skupples

Hate to say it, mantle won't be here until 2014.


----------



## stickg1

Temps are sweet with my Koolance block. I just think it would look nice with a backplate.


----------



## Arizonian

Quote:


> Originally Posted by *hotrod717*
> 
> Please update me, adding Sapphire 290x watercooled. Just bought another card from ocn member for my x79/4930k/R4BE build. Installing drivers now. Woot!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Horrible part is, after getting up early, breaking down 3770k setup and getting this together, I'm too tired to do any real overclocking. Definately need a nap.


I'm confused... did you swap out your XFX 290 for a Sapphire 290X? If so... updated.

If not, PM me and I'll set the roster straight.

On a side note: seeing some new members in the thread. If you'd like to be added to the OP, please submit your entry. We'd love to add you.

To be added to the member list, please submit the following in your post:

1. GPU-Z link or screenshot with OCN name
2. Manufacturer & brand
3. Cooling - Stock, Aftermarket or 3rd Party Water


----------



## IBIubbleTea

Quote:


> Originally Posted by *stickg1*
> 
> Temps are sweet with my Koolance block. Just think it would look nice with a backplate.


What kind of temps are you getting with that Koolance block? Also, what are your water cooling parts?


----------



## stickg1

Quote:


> Originally Posted by *IBIubbleTea*
> 
> What kind of temps are you getting with that Koolance block? Also what is your water cooling parts?


Folding at 1125MHz w/ +75mV in AB. 480mm worth of thin radiators, with a 3570K @ 4.7GHz in the same loop at 65C. I have not seen my 290 go higher than 45C.


----------



## Forceman

Arizonian, just noticed that you still have me listed with both the 290 and 290X. I settled on the XFX 290 (flashed to 290X) under water.


----------



## maynard14

Quote:


> Originally Posted by *Forceman*
> 
> Arizonian, just noticed that you still have me listed with both the 290 and 290X. I settled on the XFX 290 (flashed to 290X) under water.


Wow, very nice! I hope someday I can water cool my GPU.


----------



## Arizonian

Quote:


> Originally Posted by *Forceman*
> 
> Arizonian, just noticed that you still have me listed with both the 290 and 290X. I settled on the XFX 290 (flashed to 290X) under water.
> 
> 
> Spoiler: Warning: Spoiler!


Heh - you tricked me with that pic showing both. Updated


----------



## aidhanc

Add me, please.










Gigabyte 290 with the stock cooler.


----------



## Arizonian

Quote:


> Originally Posted by *aidhanc*
> 
> Add me, please.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Gigabyte 290 with the stock cooler.


Congrats - welcome aboard - added


----------



## smoke2

I found a review where a VRM temperature is called VRM1 and VRM2:
http://www.hardocp.com/article/2013/12/23/asus_r9_290x_directcu_ii_oc_video_card_review/9
Do you know what VRM1 and VRM2 mean?


----------



## hotrod717

Quote:


> Originally Posted by *Arizonian*
> 
> I'm confused...did you swap out your XFX 290 for a SAPPHIRE 290X? IF so.....updated
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If not PM I'll set the roster straight.
> 
> On a side note: Seeing some new members in the thread. If you'd like to be added to the OP please submit your entry. We'd love to add you.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> To be added on the member list please submit the following in your post
> 
> 1. GPU-Z Link or Screen shot with OCN Name
> 2. Manufacturer & Brand
> 3. Cooling - Stock, Aftermarket or 3rd Party Water



I have both; testing the new 290X right now. You can leave it as a single Sapphire 290X, as the XFX 290 will most likely be sold soon. Thanks!


----------



## Forceman

Quote:


> Originally Posted by *smoke2*
> 
> I found a review where the VRM temperatures are reported as VRM1 and VRM2:
> http://www.hardocp.com/article/2013/12/23/asus_r9_290x_directcu_ii_oc_video_card_review/9
> Do you know what VRM1 and VRM2 mean?


There are two sets of VRMs on the card, the long set near the power connectors (VRM1) and the small cluster of 3 near the display outputs (VRM2). VRM1 controls the core, and VRM2 is for the memory.


----------



## stickg1

Quote:


> Originally Posted by *Forceman*
> 
> There are two sets of VRMs on the card, the long set near the power connectors (VRM1) and the small cluster of 3 near the display outputs (VRM2). VRM1 controls the core, and VRM2 is for the memory.


Both of mine are colder than a polar bear's toenails thanks to my Koolance block!









Heh, sorry, I'm just excited.


----------



## VSG

No worries man, watercooling is exciting









Now overclock those beasts and benchmark them!


----------



## Durvelle27

Wow, this thread is very healthy, lol. How's it going, guys?


----------



## UNOE

Quote:


> Originally Posted by *Durvelle27*
> 
> Wow, this thread is very healthy, lol. How's it going, guys?


We need an AB update, and/or some new tools for voltage. Nothing too exciting to see here till then.


----------



## Sinch979

stocksettings.jpg 664k .jpg file

Basic stats


----------



## Sgt Bilko

Quote:


> Originally Posted by *UNOE*
> 
> We need an AB update, and/or some new tools for voltage. Nothing too exciting to see here till then.


Trixx supports +200mV iirc while AB only supports +100mV


----------



## stickg1

I wish my Powercolor unlocked, oh well..


----------



## alex22808

I got a 290 for Christmas and, as we all know, the cooling is awful. A quick search on the internet brings up lots of suggestions for installing an aftermarket cooler. Now, I don't have a water loop, so water-cooling is out of the question. So what do you all recommend? Also, how easy is it to install? I'm fairly new to PC building and such, having built my first rig in the summer, so are the risks for an inexperienced person like me worth the rewards? Many thanks.


----------



## rx7racer

As of now, most are going aftermarket more for noise suppression than for any real need for extra cooling.

But to answer your question, replacing the heatsink isn't that hard or complicated provided you take your time when doing it. It's really rather simple and straightforward.

The risk, if you make a big enough error, is a dead GPU, which can happen with just a slip of the screwdriver hitting a PCB trace. The rewards are kinda small for the most part until we get some unlocked voltage control.


----------



## VSG

Check out this: http://www.tomshardware.com/reviews/r9-290-accelero-xtreme-290,3671.html

People here have also reported good cooling with the same cooler. Some have also said it is possible to put the Asus DCII 280X cooler onto this, but as far as I know that hasn't been verified by anyone in the tech world yet.


----------



## stickg1

The cooling isn't awful so much as the noise generated by the cooling is awful. Check out the Gelid and Arctic Cooling options. Now, I myself have been taking apart GPUs and other PC components for the better part of a decade, so I'd consider installing an aftermarket cooler pretty straightforward. For someone new to that concept, IDK. If you can operate a manual screwdriver and follow step-by-step directions it shouldn't be an issue at all, but some people have a tendency to surprise me. If you're new to PC building, chances are you aren't doing much overclocking. How bad are the temperatures on your card? How aggressive a fan profile are you running? Does the noise bother you at all?


----------



## alex22808

Thanks for the reply. Seems sensible to wait a bit, then, and see where the overclocking scene gets to. The noise doesn't bother me too much, seeing as all my case fans sound like hair dryers.


----------



## alex22808

Noise is not really a problem. The temps reach the 95 limit due to my house being rather hot, so it begins to throttle fairly quickly; that's what's drawn me towards aftermarket cooling.

edit: apologies for the double post


----------



## rx7racer

Yup, Arctic's cooler is good, just be sure to keep those VRMs in check. Also Gelid's Icy Vision V2.

On a side note, if you can handle the noise then you can ramp up the stock fan. I made a custom profile in AB and I stay pretty much at fan % per degree, so if I hit 70C I am at 70% fan speed. Keeps my temps down and no throttling.

Some have stated they can set their fan at around 60% and keep it under the 95C mark to prevent throttling. You just kinda have to be willing to play around with it and see what you are willing to put up with.

I run my 290X at 1150core and 1500mem with +100mv and stay at around 80c playing BF4 etc, but I wear cans and my fans on my radiator are just about as loud so I'm used to it.
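The "fan % per degree" profile described above boils down to a one-line mapping from core temperature to fan duty cycle. As a rough sketch (the 30% floor and the clamp values are illustrative, not anything Afterburner ships with):

```python
def fan_speed(temp_c, floor=30, ceiling=100):
    """Roughly 1% fan duty per degree C, as in the profile above.

    Clamped to a floor so the fan never stalls at idle, and to a
    100% ceiling. The floor/ceiling defaults are illustrative.
    """
    return max(floor, min(ceiling, temp_c))

# 70C -> 70% fan; a cool 20C idle sits at the 30% floor
```

In Afterburner itself you'd draw the equivalent curve in the fan-speed graph rather than write code, but the shape is the same diagonal line.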


----------



## VSG

Well, be aware that you would likely lose your warranty by replacing the stock heatsink unless specified otherwise by the GPU manufacturer. If you are willing to take the risk, I would go for it: reaching 95C and getting performance throttling in a hot house will not be a good experience with the card.


----------



## alex22808

I've seen it mentioned that XFX will honour the warranty even with a custom cooler. Any truth in that?


----------



## stickg1

Quote:


> Originally Posted by *alex22808*
> 
> noise is not really a problem. the temps reach the 95 limit due to the temperature in my house being rather hot so it begins to throttle fairly quickly, thats whats drawn me towards aftermarket cooling.
> 
> edit: appologies for the double post


The stock fan profile is pretty weak. Make it so the fan goes up to 60% around the 75C mark. I was able to use my card on the stock cooler without throttling, with no issues other than it being too loud for my liking. And since I use it for Folding@Home 24/7, it was a nuisance for watching TV or sporting events while sitting at the PC. I put on a waterblock and added it to my water loop. Now I can't get it over 50C even if I try.


----------



## stickg1

Quote:


> Originally Posted by *alex22808*
> 
> ive seen it mentioned that xfx will honour the warranty even with a custom cooler. any truth in that?


Yeah, but sometimes they like to know in advance. If you have an XFX card, register it with them and shoot them an email telling them that you want to replace the cooler. Nine times out of ten they give you the green light.


----------



## alex22808

Okay, I'll do that now. In terms of which cooler to fit if I choose that route, do you recommend either one in particular, or are they both about equal (Arctic's and Gelid's)?


----------



## VSG

Quote:


> Originally Posted by *alex22808*
> 
> ive seen it mentioned that xfx will honour the warranty even with a custom cooler. any truth in that?


Are you based in North America? If so, yes they will, but it is advised to call them up and confirm first. Note that going with an aftermarket air cooler involves permanently gluing heatsinks to the card, so you can't put the stock cooler back on for an RMA (a requirement).


----------



## alex22808

No, afraid not. I live in the UK, so ringing is not really practical.


----------



## stickg1

I probably wouldn't use the thermal glue that comes with those cooling options anyway. Get a sheet of thermal tape and use that instead, just in case you have to remount the stock cooler for warranty work. Also you might even look at some nice copper RAMsinks and VRMsinks instead of the aluminum ones that come in the kit. I think as long as they are shorter than 12mm they should fit. This was what I found when I was looking to get the Gelid solution anyway. 13-14mm was all the clearance you had. Enzotech makes some nice 9mm copper sinks that should fit the bill.


----------



## alex22808

thanks for all the help people, it really is very much appreciated.


----------



## Sgt Bilko

Quote:


> Originally Posted by *stickg1*
> 
> I probably wouldn't use the thermal glue that comes with those cooling options anyway. Get a sheet of thermal tape and use that instead, just in case you have to remount the stock cooler for warranty work. Also you might even look at some nice copper RAMsinks and VRMsinks instead of the aluminum ones that come in the kit. I think as long as they are shorter than 12mm they should fit. This was what I found when I was looking to get the Gelid solution anyway. 13-14mm was all the clearance you had. Enzotech makes some nice 9mm copper sinks that should fit the bill.


I've had the AX III on my 290X, and I used the thermal pads from the stock heatsink for the VRAM and the pads that came with the AX III for the VRMs. Worked great, no mess, and it's easy to put them back on the stock cooler if you need to return it.


----------



## kizwan

Quote:


> Originally Posted by *Sinch979*
> 
> Good morning.
> Is there anything to be concerned about with the weight of the XSPC waterblock? I didn't realize how heavy it is until I received it.(my first GPU water block).
> Any chance of bending the card over time?
> I'm guessing I might need to buy a backplate.Is this correct?
> Thanks for your time.


Don't worry, it won't damage the card. With a single card, your card may sag a little bit at the back (in a normal ATX case), but it will be fine. A backplate only helps a little and doesn't completely prevent sagging.

[EDIT]
I got a black screen today. Turns out the backlight is off for some reason, and I can't seem to get it back on. I can see the images on the screen very vaguely. If I turn the monitor off and back on, the backlight comes back for a second and then goes off again. Googled it and found that it's likely a couple of failed capacitors. Time to get a new monitor, I think.


----------



## smoke2

Quote:


> Originally Posted by *Forceman*
> 
> There are two sets of VRMs on the card, the long set near the power connectors (VRM1) and the small cluster of 3 near the display outputs (VRM2). VRM1 controls the core, and VRM2 is for the memory.


Then the 108 degrees in quiet mode is mostly on VRM1 for the ASUS? A bit of a fail for them...


----------



## esqueue

Quote:


> Originally Posted by *kizwan*
> 
> Don't worry. It won't damage the card. With single card, your card may sagging a little bit at the back (in normal ATX casing) but it will be fine. Backplate only help a little but doesn't completely prevent it from sagging.
> 
> [EDIT]
> I got black screen today. Turn out the backlight is off for some reason. I can't seems to get the backlight on again. I can see very vaguely the images on the screen. If I turn the monitor off & on back, the backlight is back on for a second & then go off again. Googled & found that it's likely a couple of capacitors failed. Time to get new monitor I think.


I figured that my rig looks like crap, so I decided to just make it functional. I supported my video card using dental floss. I also upgraded my two 120mm fans to two 140mm Noctua NF-A14 PWM fans to cool my heatercore.

As for the monitor, just replace the caps. I've repaired 2 monitors by changing capacitors: one is my brother's, and the other is the one I'm using right now. I just got lazy and ordered the kits from eBay.


----------



## Sgt Bilko

OK, so Overclockers UK have the DCU II 290X listed for a 17th Jan release; should I assume the 290 version will be out around the same time?

Actually, nvm. It's between the Tri-X and DCU II for me now; I'm just gonna grab the first one that releases in Aus.


----------



## blak24

Guys, do you think undervolting the GPU core by -100mV creates problems? I'm trying some mining with the stock cooler. -50mV seems to be OK for now; wondering if -100mV would also be OK to decrease heat.


----------



## darkelixa

Good luck getting a Tri-X in Aus; they still don't have supply of the 280X.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Good luck getting a Tri-X in Aus; they still don't have supply of the 280X.


PCCG had 280X Toxics in for about a day, then they went quickly; just a right place/right time thing as usual.

I'm fairly confident I can grab a couple before the rush hits and they all sell out.

On another note, I was looking at EK's website to find out if the backplate is compatible, and they have the Tri-X PCB pictured along with the other custom designs. It looks like they are using Hynix, so at least on the surface of things it's looking good there.


----------



## Ludus

I'm having some fun this morning with my 290 (not x)

*Firestrike
*
http://www.techpowerup.com/gpuz/hsvd8/
http://www.3dmark.com/3dm/2047141



*Firestrike Extreme*

http://www.techpowerup.com/gpuz/gxvs4/
http://www.3dmark.com/3dm/2047049



*Valley*

http://www.techpowerup.com/gpuz/wqa8v/


----------



## Sgt Bilko

Quote:


> Originally Posted by *Ludus*
> 
> I'm having some fun this morning with my 290 (not x)
> 
> *Firestrike
> *
> http://www.techpowerup.com/gpuz/hsvd8/
> http://www.3dmark.com/3dm/2047141
> 
> 
> 
> *Firestrike Extreme*
> 
> http://www.techpowerup.com/gpuz/gxvs4/
> http://www.3dmark.com/3dm/2047049
> 
> 
> 
> *Valley*
> 
> http://www.techpowerup.com/gpuz/wqa8v/


Those are Nice!









You need to submit them over here: http://www.overclock.net/t/1436635/official-ocns-team-green-vs-team-red-gk110-vs-hawaii/0_40

Get some love for the red team going









EDIT: Get that FSE run validated, that's good enough for a top 50 spot in the FSE Hall of Fame








And the Normal FS run is in for the Top 100 as well


----------



## Durvelle27

Quote:


> Originally Posted by *UNOE*
> 
> We need an AB update, and/or some new tools for voltage. Nothing too exciting to see here till then.


I believe there's an AB mod to force +200mV


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> I believe there's an AB mod to force +200mV


Yeah, there is a command line option for AB to apply more voltage, and Trixx now supports +200mV afaik, so that's another alternative


----------



## Ludus

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Those are Nice!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You need to submit them over here: http://www.overclock.net/t/1436635/official-ocns-team-green-vs-team-red-gk110-vs-hawaii/0_40
> 
> Get some love for the red team going
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: Get that FSE run Validated, thats good enough for a top 50 spot in the FSE Hall of Fame
> 
> 
> 
> 
> 
> 
> 
> 
> And the Normal FS run is in for the Top 100 as well


Should I update to the 13.12 WHQL driver to get them validated? I'm on beta 5 right now.
The overall score could be better with a 4770K, or better yet a 4930K.







no ht, no party in 3dmark physics score









I'll post these scores over there.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Ludus*
> 
> should i update to 13.12 whql driver to get them validated ? now i'm using beta5.
> overall score could be better with a 4770k or even better a 4930k
> 
> 
> 
> 
> 
> 
> 
> no ht, no party in 3dmark physics score
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i'll post these score over there


The last validated driver I used was the 13.11 WHQL driver, but FM are accepting the 13.12 driver:

http://www.futuremark.com/support/benchmark-rules

Radeon R9, R7, HD 7000, 6000, and 5000 series:

Catalyst 13.12


----------



## stickg1

Quote:


> Originally Posted by *esqueue*
> 
> I figured that my rig looks like crap so I decided to just make it functional. I supported my video card using dental floss. I also upgraded my two 120mm fans to two 140mm Noctua NF a14 pwm fan to cool my heatercore.
> 
> As for the monitor, just replace the caps. I've repaired 2 monitors by changing capacitors. One in my brother's and the other is the one that I'm using right now. I just got lazy and ordered the kits from eBay.
> 
> 
> Spoiler: Snip


Oh wow, is that cardboard box thing a radiator box? That's awesome! Post it in the ghetto rigged thread. http://www.overclock.net/t/666445/post-your-ghetto-rigging-shenanigans/2490_30#post_21466896
Quote:


> Originally Posted by *Ludus*
> 
> I'm having some fun this morning with my 290 (not x)
> 
> *Firestrike
> *
> http://www.techpowerup.com/gpuz/hsvd8/
> http://www.3dmark.com/3dm/2047141
> 
> 
> Spoiler: snip!
> 
> 
> 
> 
> 
> *Firestrike Extreme*
> 
> http://www.techpowerup.com/gpuz/gxvs4/
> http://www.3dmark.com/3dm/2047049
> 
> 
> 
> *Valley*
> 
> http://www.techpowerup.com/gpuz/wqa8v/


How are your Valley scores so much higher than mine? I understand you have a better overclock, but we're talking a 20 FPS difference. Am I missing something?


----------



## hotrod717

Quote:


> Originally Posted by *Ludus*
> 
> should i update to 13.12 whql driver to get them validated ? now i'm using beta5.
> overall score could be better with a 4770k or even better a 4930k
> 
> 
> 
> 
> 
> 
> 
> no ht, no party in 3dmark physics score
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i'll post these score over there


Yes, those are nice scores! Update to 13.12 WHQL for a valid result. Check the first page of each thread to determine the rules; some require tess on, some do not, and it makes a big difference in overall score.
What are you using to OC?


----------



## Ludus

Quote:


> Originally Posted by *hotrod717*
> 
> Yes, those are nice scores! Update to 13.12 whql for valid result. Check 1 st page of threads to determine rules. Some require tess on, some do not. Will make a big difference in overall score.
> What are you using to oc?


Tess was on AMD Optimized; 3DMark checks if tess is off and gives an invalid score.

I'm using the PT1 BIOS and GPU Tweak as software.
Quote:


> Originally Posted by *stickg1*
> 
> Oh wow, is that cardboard box thing a radiator box? That's awesome! Post it in the ghetto rigged thread. http://www.overclock.net/t/666445/post-your-ghetto-rigging-shenanigans/2490_30#post_21466896
> How are your Valley scores so much higher than mine? I understand you have a better overclock, but we're talking a 20 FPS difference. Am I missing something?


Follow the Valley topic; the first post lists some optimizations to do.


----------



## hotrod717

Quote:


> Originally Posted by *Ludus*
> 
> tess was on amd optimized. 3dmark check if tess is off and give invalid score.
> 
> i'm using bios pt1 like bios and gpu tweak like sw.
> follow valley topic, at first post there are some optimization to do.


I was giving you advice on submitting scores to other threads, particularly 3DMark Top 30, etc. Some of them follow HWBot rules and allow tess to be modified. I'm very familiar with Valley!


----------



## TheSoldiet

FX-8350 @ 4.5GHz
290X @ 1200MHz, +110mV Vcore offset (air)

http://www.3dmark.com/3dm/2050109

Graphics score at default clocks: 10843


----------



## EvolveGamingPC

What sort of rads are you guys using to water-cool those 290s? I was thinking about getting into the water-cooling "loop" (ba dum chh). I have seen some recommendations for a 120-140mm rad for the single GPU, excluding the CPU block (I already have an AIO loop for that). Or should I go for a 240mm just for the expandability?


----------



## stickg1

Quote:


> Originally Posted by *Ludus*
> 
> tess was on amd optimized. 3dmark check if tess is off and give invalid score.
> 
> i'm using bios pt1 like bios and gpu tweak like sw.
> follow valley topic, at first post there are some optimization to do.


Oh okay, yeah, I didn't do all the steps, but I just adjusted those few things in CCC and now I'm up to 70 FPS instead of 64. Thanks for the tips! I'll try later with my other monitors unplugged, priority set to real-time, and all the other apps/programs/explorer disabled.


----------



## stickg1

Quote:


> Originally Posted by *EvolveGamingPC*
> 
> What sort of rads are you guys using to water cool those 290s? I was thinking about getting into the water cooling "loop" (ba dum chh) I have seen some recommendations for a 120-140mm rad for the single GPU excluding the CPU block (I already have an AIO water loop for that) Or should i go for a 240mm just for the extendibility?


I use a 240mm and a 120mm for my CPU and single 290 loop. Temps are fantastic; I've never seen it get higher than 45C on the core since.

For just a 290, a 120/140 would probably do it, but if you can fit it in your case I would go with a 240mm slim rad like the Alphacool ST30.


----------



## EvolveGamingPC

Quote:


> Originally Posted by *stickg1*
> 
> I use a 240mm and a 120mm for my CPU and single 290 loop. Temps are fantastic, never seen it get higher than 45C on the core since.
> 
> For just a 290, a 120/140 would probably do it, but if you can fit it in your case, I would go with a 240mm slim rad like the Alphacool ST30.


I have the Phantom 530, so I can probably fit a 240mm on top. My main concern is not removing any of my drive cages; 5 drives take up some space. The hairs on the back of my neck stand on end when I think about water-cooling that monster. I would just love to have more PCs to build.


----------



## StonedAlex

Anyone know when any more aftermarket 290Xs are going to be released, and if they are going to be $650+? I've been GPU-less for a couple of weeks and I'm seriously thinking about just getting a 780 Ti today or tomorrow.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *stickg1*
> 
> I use a 240mm and a 120mm for my CPU and single 290 loop. Temps are fantastic, never seen it get higher than 45C on the core since.
> 
> For just a 290, a 120/140 would probably do it, but if you can fit it in your case, I would go with a 240mm slim rad like the Alphacool ST30.


I have EK XTX (64mm thick) 240 and 360 rads here for my 290 and 3770K... I may have gone a bit overkill, hahaha.

Oh well, now I can buy another 290 without needing more rad.

What kind of OC/volts do you use for that temp?


----------



## stickg1

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I have EK XTX (64mm thick) 240 and 360 rads here for my 290 and 3770k.... I may have went a bit overkill, hahaha.
> 
> Oh well, now I can buy another 290 without needing more rad.
> 
> What kinda OC/volts you use for that temp?


I fold 24/7 on it at 1125MHz, +75mV on the MSI AB slider. It's usually around 1.25V but fluctuates a lot.

Which reminds me, I had a question. I guess it's too early to tell for sure because the cards came out so recently, but is it safe to run the card at 100% load 24/7 with +100mV on the voltage slider? Temps never go above 50C, but I sure would hate to kill my card through extended overvolted use...


----------



## Forceman

Quote:


> Originally Posted by *stickg1*
> 
> I fold 24/7 on it 1125MHz +75mV on the MSI-AB slider. It's usually like 1.25v but kinda fluctuates around a lot.
> 
> Which I had a question, I guess its too early to tell for sure because the cards just came out recently. But is it safe to use the card 100% 24/7 with +100mV on the power slider? Temps never go above 50C but I sure would hate to kill my card because extended overvolted use...


I think the consensus is that +100mV is fine for long-term use. With the Vdroop you are still at or under about 1.3V load, which seems pretty safe. Plus, MSI was the one that limited Afterburner to +100mV, which would imply they think that's a safe amount of over-voltage.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *stickg1*
> 
> I fold 24/7 on it 1125MHz +75mV on the MSI-AB slider. It's usually like 1.25v but kinda fluctuates around a lot.
> 
> Which I had a question, I guess its too early to tell for sure because the cards just came out recently. But is it safe to use the card 100% 24/7 with +100mV on the power slider? Temps never go above 50C but I sure would hate to kill my card because extended overvolted use...


+100mV is safe, as long as your temps are in check, which they are.

I can't wait to get my new rig built, flash the PT1 BIOS, and shoot for the 3DMark11 Hall of Fame. My goal is 1225MHz; right now I can hit 1200, but with minor artifacting due to temps, at 1.412V before droop (~1.3V after).


----------



## Forceman

Is 1225 good enough for the HOF? Seems like a lot of people are pushing higher than that with modded BIOSes.


----------



## hotrod717

Quote:


> Originally Posted by *Forceman*
> 
> Is 1225 good enough for the HOF? Seems like a lot of people are pushing higher than that with modded BIOSes.


1225 is top 50 atm, but probably won't be for long. Top-clocking 290/290X cards are reaching 1300+, good cards 1250, and most hit 1175. Luckily my newer card is looking like 1250 is doable!







1225 is good, but not great.
Also, if you see artifacting at 1200 with 1.412V, I don't think volts are going to get that card any higher.


----------



## bond32

Thought I would chime in on refresh rate vs. OC: I have a Sapphire 290X and I can't get any higher than around 1150 with refresh rates of 96Hz and higher. I usually play BF4 at 120Hz on my X-Star 1440p, but it has been quite frustrating to find stable clock rates...


----------



## Sinch979

wow......


----------



## ImJJames

Quote:


> Originally Posted by *hotrod717*
> 
> 1225 is top 50 atm, but probably won't be for long. Top clocking 290/290X cards are reaching 1300+, Good cards 1250, and most hit 1175. Luckily my newer card is looking like 1250 is doable!
> 
> 
> 
> 
> 
> 
> 
> 1225 is good, but not great.
> Also ,if you see artifacting at 1200 with 1.412, don't think volts are going to get that card any higher.


Remember though, it's actually not 1.4 volts because of vdroop.


----------



## justinone

Can one of you guys tell me why my Valley FPS is so low? Extreme HD preset.

I am getting 53 FPS on a 290X @ stock speeds with +50% power.

Temps stay at 75.

That's a little low for a 290X, right?

3770k 4.2ghz
8gb ram
windows 7


----------



## VSG

Seems normal for stock clocks; Nvidia cards are better in general at Valley.


----------



## hotrod717

Quote:


> Originally Posted by *ImJJames*
> 
> Remember though its actually not 1.4 volts because of vdroop.


Absolutely, that's a given, since there is only one BIOS I know of atm where LLC is off, and the majority of people aren't using it. Wait until the Matrix/Lightning comes out; then we can talk about LLC percentage in relation to the amount of vdroop and the VDDCI clock.


----------



## Derpinheimer

Also remember, 1412 is actually a +162mV offset, not +1412.
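To spell out the arithmetic (assuming the ~1.25 V stock VID commonly reported for reference Hawaii cards; the droop figure is likewise illustrative, not measured):

```python
STOCK_VID_MV = 1250  # assumed stock VID for a reference 290/290X, in mV

def offset_for_target(target_mv, stock_mv=STOCK_VID_MV):
    """Voltage offset you must dial in to request a given core voltage."""
    return target_mv - stock_mv

def load_voltage_mv(target_mv, vdroop_mv=110):
    """Approximate voltage actually seen under load after vdroop.

    Real droop varies with load and board; 110 mV is just an example.
    """
    return target_mv - vdroop_mv

# "1412" is a +162 mV offset over the assumed 1250 mV stock VID,
# and with ~110 mV of droop the card sees roughly 1.3 V under load.
```

That lines up with the "1.412v before droop (~1.3v after)" figures mentioned earlier in the thread.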


----------



## Roy360

I want to watercool mine so bad, but I know I have a bad card that won't run 947/1250 on stock voltage, and Dell won't ship my VisionTek R9 290 until February. This waiting period is killing me. I've got all the other parts scattered around my computer table.


----------



## dade_kash_xD

I finally got around to putting both my Sapphire r9-290 GPU's and whole system for that matter, under water!

I'm wondering if my temps are too high though.

Idle: 30c
Load: 67c

The radiators get hot as hell and the air coming out of the system is fairly hot. I get these temps whether I am gaming or mining for dogecoin. Any thoughts?


----------



## HardwareDecoder

Quote:


> Originally Posted by *dade_kash_xD*
> 
> I finally got around to putting both my Sapphire r9-290 GPU's and whole system for that matter, under water!
> 
> I'm wondering if my temps are too high though.
> 
> Idle: 30c
> Load: 67c
> 
> The radiators get hot as hell and the air coming out of the system is fairly hot. I get these temps whether I am gaming or mining for dogecoin. Any thoughts?


What do you have on the loop altogether, and how much radiator do you have?

I'm not a watercooling expert by any means, but that seems a bit high...

I know I get 50C max with one card, and some people even get 42-45 max.

edit: Looked at your rig. Looks like a very nice rig/WC setup! IMO 67 seems high; is that the max on both cards?

Your idle temp is the same as mine, though.


----------



## dade_kash_xD

Quote:


> Originally Posted by *HardwareDecoder*
> 
> what do you have on the loop all together and how much radiator do you have?
> 
> I'm not a watercooling expert by any means but that seems a bit high...
> 
> I know I get 50c max with one card and some people even get 42-45 max.
> 
> edit: looked at your rig. Looks like a very nice rig/WC setup! imo 67 seems high, that is the max on both cards?
> 
> your idle temp is same as mine though.


I have 2 rads, 360mm + 240mm, both Alphacool XT45 in push configuration with Corsair SP120 fans. If I have my air conditioning on I never pass 50 degrees Celsius; however, when my air conditioner is off is when I hit 67 degrees Celsius. Sorry for the run-on sentences, but I'm using the speech-to-text app on my phone while I drive.


----------



## bond32

67 is a bit high... Might need to remount the GPU block. My Koolance block hits around 55, but that's when I slow the pump quite a bit. You may actually need to increase flow, too.


----------



## HardwareDecoder

Quote:


> Originally Posted by *dade_kash_xD*
> 
> I have 2 rads 360+ 240mm both alphacool XT45 in push configuration w corsair sp120 fans. If I have my air conditioning on I never pass 50 degrees Celsius however when my air conditioner is off is when I hate 67 degrees Celsius sorry for the run on sentences but I'm using my speech to text app on my phone while I Drive


lol man OCN can wait, don't do distracted driving.


----------



## Derpinheimer

Yeah, 67C is high, but if you are saying the rads are hot, that can only mean one thing: lack of cooling capacity.

Flow rate and mount must then be fine.

So it could be:

- Fans (too slow?)
- Bubbles (destroying rad cooling capacity)

That's all I can think of.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *hotrod717*
> 
> 1225 is top 50 atm, but probably won't be for long. Top clocking 290/290X cards are reaching 1300+, Good cards 1250, and most hit 1175. Luckily my newer card is looking like 1250 is doable!
> 
> 
> 
> 
> 
> 
> 
> 1225 is good, but not great.
> Also ,if you see artifacting at 1200 with 1.412, don't think volts are going to get that card any higher.


The artifacting is due to temps; there are none until about 58C.

I wish I had a better card. At 1200 core I really can't complain, but I have yet to get a CPU/GPU that gets a crazy OC.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Sinch979*
> 
> wow......


Think about all that noise and heat sheesh!!


----------



## HardwareDecoder

Is it normal to get all-over-the-place GPU usage % as reported by MSI Afterburner in some games? In BF4 it's usually near 100% the entire time, but in most other games the GPU usage % fluctuates wildly.

My core clock stays pretty stable (+/- 20MHz) and I get great FPS in everything, so I suppose it is normal?

This was me playing some BioShock 2 (Infinite sucks)


----------



## hotrod717

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> The artifacting is due to temps, there are none until about 58c.
> 
> I wish I had a better card, at 1200 core I really can't complain but I have yet to get a CPU/GPU that gets a crazy OC.


How are you determining that your temps are the cause of artifacting?


----------



## Forceman

Quote:


> Originally Posted by *HardwareDecoder*
> 
> is it normal to get all over the place gpu usage % as reported by msi afterburner in some games? Like in BF4 it's usually near 100% the entire time but most other games the gpu usage % fluctuates wildly.
> 
> My core clock stays pretty stable +/- 20mhz and I get great fps in everything so I spose it is normal ?
> 
> This was me playing some bioshock 2 (infinite sucks)


Try disabling voltage monitoring in Afterburner and see if that smooths the graph out - I had real spiky charts like that with it enabled. Otherwise the core drops are probably just caused by the lack of proper workload on the card - looks like you had 250+ FPS for a while there.


----------



## Ized

I could use some advice on making a 290 as close to silent as possible when idle. Is it possible? Are stand alone water cooling pumps audible?

Of course I can accept noise when I am gaming or mining coins.

I installed an Arctic Accelero Hybrid (all-in-one watercooler) and when it is working well there is a constant low humming, which stops if I unplug the pump. I also get random water sounds that come and go.

I am really disappointed and will be returning the Accelero unit, but I am unsure what to replace it with, and I am a bit fed up with buying gear that reviewers claim is silent only to discover within seconds that this is not true at all.

Any experiences are most welcome. Would going to a full water cooling setup be the answer to the silence that I seek? What sort of budget would it require to buy pumps and whatever else that don't hum, buzz, click, knock, rattle, etc.?


----------



## ImJJames

Quote:


> Originally Posted by *Ized*
> 
> I could use some advice on making a 290 as close to silent as possible when idle. Is it possible? Are stand alone water cooling pumps audible?
> 
> Of course I can accept noise when I am gaming or mining coins.
> 
> I installed a Arctic Accelero Hybrid (all in one watercooler) and when it is working well there is a constant low humming which stops if I unplug the pump. I also get random water sounds that come and go.
> 
> I am really disappointed and I will be returning the Accelero unit but I am unsure what to replace it with and I am a bit fed up buying gear that reviewers claim are silent only to discover within seconds that this is not true at all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any experiences are most welcome. Would going to a full water cooling setup be the answer to the silence that I seek? What sort of budget would it require to buy pumps and whatever else that don't hum or buzz click knock rattle etc?


G10 (the NZXT Kraken G10 bracket with an AIO)


----------



## chiknnwatrmln

Quote:


> Originally Posted by *hotrod717*
> 
> How are you determining that your temps are the cause of artifacting?


Benching on one screen with GPU-z and GPUTweak on another shows that when I pass that temperature, I get artifacts. It happens when I'm above that threshold, it doesn't if I'm below it. If I open a window then I won't artifact, but my room will be about 40f.

Also happens at 1100MHz, albeit at a higher temperature. I run +18mV for that clockspeed and it is very stable, but if I forget to turn my Gelid's fans up (I have them on a manual dial) and pass ~67c or so, then I'll see a couple of small artifacts. A tad more voltage remedies the problem, but like I said, if I'm under that temp I'm fine.


----------



## hotrod717

Just thought I'd share.


----------



## K1llrzzZ

Can anyone find any benchmarks of 290X Crossfire at PCIe 3.0 8x/8x vs 16x/16x? I know there wasn't any difference in performance in the past, but with this new Crossfire system I would be interested to see whether that changes anything or not. Or PCIe 2.0 16x/16x vs 3.0 16x/16x, since 2.0 16x has the same bandwidth as 3.0 8x.


----------



## bayz11

Quote:


> Originally Posted by *dade_kash_xD*
> 
> I have 2 rads (360 + 240mm), both Alphacool XT45, in push configuration with Corsair SP120 fans. If I have my air conditioning on I never pass 50 degrees Celsius; however, when my air conditioner is off is when I hit 67 degrees Celsius. Sorry for the run-on sentences, but I'm using my speech-to-text app on my phone while I drive.


Are you using one CPU + 2 GPUs?
I'm using 2 x 240 EX rads from XSPC to cool one CPU with a Raystorm block and one GPU with an Aqua Computer block for the R9 290.
My pump is set at 3. Idle is between 36 and 40, and max is 50 to 55 depending on my room temp. So yeah, I think something is wrong with your setup.


----------



## kizwan

Quote:


> Originally Posted by *dade_kash_xD*
> 
> I finally got around to putting both my Sapphire r9-290 GPU's and whole system for that matter, under water!
> 
> I'm wondering if my temps are too high though.
> 
> Idle: 30c
> Load: 67c
> 
> The radiators get hot as hell and the air coming out of the system is fairly hot. I get these temps whether I am gaming or mining for dogecoin. Any thoughts?


Load temp is too high, but it depends on your ambient. You should see it max out in the 50s Celsius if your ambient is around ~28-36C.


----------



## maynard14

Quote:


> Originally Posted by *HardwareDecoder*
> 
> is it normal to get all over the place gpu usage % as reported by msi afterburner in some games? Like in BF4 it's usually near 100% the entire time but most other games the gpu usage % fluctuates wildly.
> 
> My core clock stays pretty stable +/- 20mhz and I get great fps in everything so I spose it is normal ?
> 
> This was me playing some bioshock 2 (infinite sucks)


we have the same problem, sir... my GPU usage, core clock, and memory clock are spiking like hell... haha

they say the cause of the spiking is Afterburner... I read it somewhere here in this thread.

hope that's true, and I hope MSI will release an updated Afterburner compatible with the 290/290X cards


----------



## HoZy

Use GPU-Z to log your usage. You'll notice it doesn't spike like Afterburner does.

Cheers
Mat
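For anyone who wants numbers instead of eyeballing the graph: GPU-Z's sensor log is a comma-separated text file, so a few lines of Python can summarize the load column. A rough sketch — the `GPU Load [%]` header (and the log's exact layout) is an assumption here, so check your own log's header row first:

```python
import csv

def load_stats(lines, column="GPU Load [%]"):
    """Return (min, max, avg) of a numeric column from GPU-Z sensor-log lines.

    `lines` is any iterable of CSV text lines, e.g. open("GPU-Z Sensor Log.txt").
    """
    values = []
    for row in csv.DictReader(lines, skipinitialspace=True):
        try:
            values.append(float(row[column]))
        except (KeyError, TypeError, ValueError):
            continue  # skip malformed or non-numeric rows
    if not values:
        raise ValueError("no samples found for column: " + column)
    return min(values), max(values), sum(values) / len(values)

# tiny fake log for demonstration; a real log has many more columns
sample = [
    "Date , GPU Load [%]",
    "2014-01-02 10:00:00 , 99",
    "2014-01-02 10:00:01 , 45",
]
print(load_stats(sample))  # (45.0, 99.0, 72.0)
```

Point it at your real log with `load_stats(open("GPU-Z Sensor Log.txt"))`; if the minimum and average stay high while Afterburner's graph looks spiky, the spikes are a read-out artifact, not a real usage drop.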


----------



## VSG

Quote:


> Originally Posted by *K1llrzzZ*
> 
> Can anyone find any benchmarks about 290x Crossfire in PCI 3.0 8x/8x vs 16x/16x?I know there wasn't any difference in performance in the past,but with this new Crossfire system,I would be interested to see that is that change anything or not?Or PCI 2.0 16x/16x vs 3.0 16x/16x since 2.0 16x has the same bandwidth as 3.0 8x.


I had my cards at PCI 3.0 8x/8x and the stock benchmark numbers were the same if not higher than what a few guys on x79 were reporting on 16x/16x. So don't worry about it!
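For what it's worth, the bandwidth equivalence in the question checks out on paper: PCIe 2.0 runs 5 GT/s per lane with 8b/10b encoding, while 3.0 runs 8 GT/s with 128b/130b, so 2.0 x16 and 3.0 x8 both land near 8 GB/s per direction. A quick sketch of that arithmetic (the function name is just for illustration; the per-lane rates are from the PCIe specs):

```python
def pcie_bandwidth_gbs(gen, lanes):
    """Approximate one-direction PCIe bandwidth in GB/s for a given generation."""
    # (line rate in GT/s per lane, encoding efficiency)
    specs = {1: (2.5, 8 / 10), 2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}
    rate, eff = specs[gen]
    return rate * eff * lanes / 8  # divide by 8: bits -> bytes

print(round(pcie_bandwidth_gbs(2, 16), 2))  # 8.0  -- PCIe 2.0 x16
print(round(pcie_bandwidth_gbs(3, 8), 2))   # 7.88 -- PCIe 3.0 x8, effectively the same
```

So the only open question is whether XDMA Crossfire's extra PCIe traffic ever needs more than ~8 GB/s per card — which the benchmarks above suggest it doesn't.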


----------



## battleaxe

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Think about all that noise and heat sheesh!!


What do you mean noise? I don't hear anything... no really, I can't hear anything at all now.


----------



## dartuil

Hello,

Can someone explain me what is the difference when the GPU have UEFI?

THanks


----------



## K1llrzzZ

Quote:


> Originally Posted by *geggeg*
> 
> I had my cards at PCI 3.0 8x/8x and the stock benchmark numbers were the same if not higher than what a few guys on x79 were reporting on 16x/16x. So don't worry about it!


Alright then,thanks


----------



## kizwan

When ULPS is disabled, is it normal for the screen to flicker when setting fan setting in AB? For example, when changing from Auto to fixed fan speed & vice versa.


----------



## mojobear

Quote:


> Originally Posted by *maynard14*
> 
> we have the same problem sir...my gpu usage, core clock, and memory clock are spiking like hell...haha
> 
> they say the cause of the spiking is after burner...i read it somewhere here in this thread.
> 
> hope thats true and i hope msi will release a updated after burner compatible to 290/290x card


From what I read... can't find the damn link... MSI Afterburner is not causing actual GPU usage inconsistencies; it is just reading it wrong. You can probably attest to that based on the (hopefully) smooth gameplay you are getting. Good old GPU-Z shows full usage, I bet.


----------



## dartuil

Custom sapphire are on newegg.








Rush it


----------



## kdawgmaster

So I may have come to the conclusion that my 1300-watt PSU might not be enough to power 3 of these cards with a 4820K CPU.


----------



## kizwan

Quote:


> Originally Posted by *kdawgmaster*
> 
> So I may have come to the conclusion that my 1300-watt PSU might not be enough to power 3 of these cards with a 4820K CPU


Did you overclock both CPU & GPUs? What was the total power draw from the wall?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *kdawgmaster*
> 
> So I may have come to the conclusion that my 1300-watt PSU might not be enough to power 3 of these cards with a 4820K CPU


Shouldn't be an issue with your setup unless you have those cards at like 1350/1600 clocks.

Your PSU can run 4 of them at stock clocks; it should definitely not have an issue with 3 with mild overclocks.


----------



## kdawgmaster

Quote:


> Originally Posted by *kizwan*
> 
> Do you overclocked both CPU & GPUs? What was total power draw from the wall?


I did have them both overclocked, but then I reverted back to stock. I don't have anything right now to tell me the power draw from the wall, but if I can, I'm going to get one today.

When it's a non-stressful game, e.g. L4D2, TF2 or something in that regard, the system does great and plays for hours on end. But when it comes to a game that loads ALL of my system, it will hard lock with the GPU fans (all 3 of them) revving up to 100%, and I have to shut down the system. I know this isn't the TYPICAL indication of low power for a system, but when I've tested the parts separately they do fine. I can get through 20 passes of LinX without error, and Heaven and FurMark both run perfectly for hours.

Because I thought it was a cooling problem, I also got some fans for my GPUs and a rad for the CPU. I'm running 4 of the Cooler Master SickleFlow fans, which move a bunch of air, although they produce a lot of noise.

This could be a motherboard fault as well, but I would have figured that would give me errors in LinX too.
Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Shouldn't be an issue with your setup unless you have those cards at like 1350/1600 clocks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Your psu can run 4 of them at stock clocks, it should definitely not have an issue with 3 with mild overclocks.


Not really :/ The R9 290X in the average game takes around 350-370 watts per card. So let's base it off of the 350 mark for the average game; that's 1050 watts in GPUs alone. Now my 4820K, non-overclocked, is 130 watts max, so that's roughly 1180; now factor in mainboard, RAM, CD/DVD, cooling, my 2 HDDs and my 3 SSDs, and I bet you I'm reaching the limits.

While I'm off at work I'm going to load Memtest to make sure this isn't a memory problem.
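The back-of-the-envelope budget above is just a sum, so here it is as a sketch anyone can plug their own numbers into. Note the wattages are the poster's own estimates from this post, not measurements (other replies in the thread report much lower tri-fire draw at stock clocks):

```python
def psu_headroom(psu_watts, draws):
    """Watts left on the PSU after summing estimated component draws."""
    return psu_watts - sum(draws.values())

# kdawgmaster's own numbers from the post above (disputed later in the thread;
# measured tri-fire systems draw considerably less at stock clocks)
estimate = {
    "3x R9 290X": 3 * 350,            # his high-end per-card gaming guess
    "4820K": 130,                     # stock CPU
    "board/RAM/drives/cooling": 120,  # rough allowance for everything else
}
print(psu_headroom(1300, estimate))  # 0 -- no margin at all on these numbers
```

On these pessimistic figures the 1300W unit would indeed be flat out of headroom, which is why measuring the actual wall draw matters before blaming the PSU.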


----------



## DraXxus1549

Anyone know why I am not getting my full clock speed?

I have it set to 1200/1500 using afterburner (+100 mV) and I am only getting ~1000mhz max. Temps are not an issue as I am under water with max temp ~50C.


----------



## kdawgmaster

Quote:


> Originally Posted by *DraXxus1549*
> 
> Anyone know why I am not getting my full clock speed?
> 
> I have it set to 1200/1500 using afterburner (+100 mV) and I am only getting ~1000mhz max. Temps are not an issue as I am under water with max temp ~50C.


Which overclocking tool are you using?


----------



## kizwan

Quote:


> Originally Posted by *kdawgmaster*
> 
> I did have them both overclocked but then i reverted back to stock. I dont have anything right now to tell me power draw from the wall but if i can im going to get one today.
> 
> When its a non stress full game AKA L4D2, TF2 or something in those regards teh system does great and plays for hours on end. *But when it comes to a game that loads ALL my system it will hard lock with the GPU fans ( all 3 of them ) reving up to 100% and i have to shut down the system.* I know this isnt the TYPICAL indication of lower power for a system but when iv tested the parts separate they do fine. I can get through 20 pass of linx without error and heaven and furmark both run perfect for hrs.
> 
> What I also did is because i thought its a cooling problem was get some fans for my GPU's and rad for the CPU. Im running 4 of the coolermaster sickleflo fans which move a bunch of air. although producing alot of noise.
> 
> This could be a motherboard fault as well but I would have figured that to give me errors with Linx as well.


How about BF3 or BF4? Both games utilize both CPU & GPU. Sounds like a lack of power there. Do you use custom extension cables? If you do, you might want to check the cables.

As for the actual power draw, I was looking for a 12V power meter, but it's not available everywhere.


----------



## DraXxus1549

Quote:


> Originally Posted by *kdawgmaster*
> 
> Which overclocking tool are u using?


I'm using MSI afterburner.


----------



## kdawgmaster

Quote:


> Originally Posted by *kizwan*
> 
> How about BF3 or BF4? Both games utilized both CPU & GPU. Sound like lack of power there. Do you use custom extensions cable? If you do, you might want to check the cables though.
> 
> As for the actual power draw, I was looking for 12V power meter but it's not available everywhere though.


Well, just did a run of BF4 and it seemed to be fine. I'll do more testing and see what happens.

Quote:


> Originally Posted by *DraXxus1549*
> 
> I'm using MSI afterburner.


Do you have Open Hardware Monitor downloaded by chance? It will tell you the active clock speeds (max, min, current) so you can ensure it's not just a read-out problem with Afterburner.


----------



## Arizonian

Quote:


> Originally Posted by *hotrod717*
> 
> Just thought I'd share.










Holy smokes. Nice job indeed


----------



## DraXxus1549

Quote:


> Originally Posted by *kdawgmaster*
> 
> Well just did a run of BF4 and it seemed to be fine. Ill do more testing and see what happens
> Do u have open hardware monitor downloaded by chance? this will tell u the active clock speeds ( max, min, current ) so u can ensure its not just a read out problem with afterburner


Just downloaded it and it is giving the same readings as afterburner.


----------



## kdawgmaster

Quote:


> Originally Posted by *DraXxus1549*
> 
> Just downloaded it and it is giving the same readings as afterburner.


Hmm, make sure you have the latest Afterburner downloaded. I think for the R9 290X it's a beta right now, so download the latest beta.

With mine, going into the settings, these options are selected; check if you don't have them already:

Under General:
Start with Windows

Under Compatibility properties:
I have all of them

Under AMD compatibility properties:
I also have "reset display mode", but I don't think that's your problem.

Also go into your CCC, make sure you have Overdrive enabled, and try a minor OC with that and see if it takes.


----------



## kizwan

Quote:


> Originally Posted by *dartuil*
> 
> Hello,
> 
> Can someone explain me what is the difference when the GPU have UEFI?
> 
> THanks


With a UEFI GOP BIOS on the GPU, you can use the Windows 8 Fast Boot feature. No difference in performance, just a faster boot.


----------



## rdr09

Quote:


> Originally Posted by *hotrod717*
> 
> Just thought I'd share.


i think you can do better. Osjur crossed 22K with the 290X. here's mine crossing 21K . . .

http://www.3dmark.com/3dm11/7711090

orig bios and using Trixx to oc.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *kdawgmaster*
> 
> I did have them both overclocked but then i reverted back to stock. I dont have anything right now to tell me power draw from the wall but if i can im going to get one today.
> 
> When its a non stress full game AKA L4D2, TF2 or something in those regards teh system does great and plays for hours on end. But when it comes to a game that loads ALL my system it will hard lock with the GPU fans ( all 3 of them ) reving up to 100% and i have to shut down the system. I know this isnt the TYPICAL indication of lower power for a system but when iv tested the parts separate they do fine. I can get through 20 pass of linx without error and heaven and furmark both run perfect for hrs.
> 
> What I also did is because i thought its a cooling problem was get some fans for my GPU's and rad for the CPU. Im running 4 of the coolermaster sickleflo fans which move a bunch of air. although producing alot of noise.
> 
> This could be a motherboard fault as well but I would have figured that to give me errors with Linx as well.
> Not rly :/ the R9 290x on the average game takes around 350-370 of watts per card. So lets say base off of the 350 mark for the average game and thats 1050 watts just in GPU alone. Now my 4820K none overclocked is 130watts max so thats 1180 roughly, now factor in mainboard, ram, cd/dvd, cooling, my 2 HDD's and my 3 SSD's and i bet u im reaching the limits.
> 
> While im off at work im going to load memtest to make sure this isnt a memory leak problem.


You should invest in a Kill-A-Watt or some other way of checking the actual wattage being pulled from the wall. Many people are running multiple cards and not getting that kind of power draw. Newegg tested up to 4-way under full load (full system) and it was under 1300 watts (stock clocks, of course), and 3-way left plenty of room. I have almost the identical setup to yours, hahaha, and no issues.


----------



## kcuestag

Quote:


> Originally Posted by *kizwan*
> 
> GPU with UEFI GOP BIOS, you can use Windows 8 Fast Boot feature. No difference in performance except faster boot.


How do I know if my GPUs have that UEFI GOP BIOS? Are they official BIOSes? If so, where can I get them?


----------



## thelude

Hey owners of the 290X, would you recommend getting rid of my 2x 7950s for a 290X? I can get $650 for both of my cards. Both cards are watercooled, so I'm selling them with the blocks. Would I see similar performance? Thanks, peeps.


----------



## cyenz

Quote:


> Originally Posted by *kcuestag*
> 
> How do I know if my GPU's have that UEFI GOP BIOS? Are they official BIOS'? If so where can I get them.


Gigabyte has a 290X BIOS on their site that is UEFI compatible. My friend already flashed it and got Fast Boot with Windows 8 working.

http://www.gigabyte.pt/products/page/vga/gv-r929xd5-4gd-b/download/bios


----------



## kcuestag

Quote:


> Originally Posted by *cyenz*
> 
> Gigabyte have 290x bios on theire site that are Uefi compatible. My friend already flashed it. Got fast boot with windows 8 working.


I have a 290 (Non X) though.


----------



## MrWhiteRX7

Just sayin', we all saw this... some just didn't want to admit it.

Let's hope Mantle actually shows up in January. Let me find all those posts that were talking trash to me and name-calling, saying it would be here in December... lulz!!!

http://semiaccurate.com/2013/12/30/happened-eas-battlefield-4-mantle-patch/

EDIT: This link was posted BY AMD


----------



## DraXxus1549

So I believe I figured out the problem I was having. For some reason afterburner was resetting the power limit in overdrive to 0. I updated to the latest AMD drivers and it seems to have fixed the issue. Now with the power limit set to 50% my core clock stays at 1200.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> You should invest in to a kill-a-watt or some way of checking actual wattage being pulled from the wall. Many people are running multiple cards and not getting that kind of power draw. Newegg tested up to 4 way under full load full system and was under 1300watts (stock clocks of course) but 3 way left plenty of room. I have the same setup as you almost identically hahaha and no issues.


Keep in mind though that the power pulled from the wall is greater than the power the PSU is supplying, which is what PSUs are rated for. A Kill-A-Watt doesn't account for this on its own.

For example, my old TX850 is rated at about 80% efficiency near full load, so if my PC is pulling ~800w from the wall then the PSU is supplying 800 x 0.80 = ~640w, which leaves a good amount of headroom.

Now I went and bought an 865w UPS for my PC, but my new PSU is rated at about 92% efficiency under load. This means I can have about 800w on the 12v rail, which is enough for my 3770k, pump, HDD, fans, and 2 decently OC'ed 290's.

My current rig (in sig: 3770k @ 4.6, one 290, and lots of fans) currently maxes out at about 520w from the UPS, which with the TX850's ~80% efficiency means that the PSU is supplying ~415w.
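The conversion above trips people up, so to pin it down: efficiency = DC out / AC in, which means a wall-side reading (Kill-A-Watt or UPS display) overstates the DC load the PSU is actually delivering. A minimal sketch of both directions (the function names are just for illustration):

```python
def dc_load(wall_watts, efficiency):
    """DC power the PSU delivers for a given wall (AC) draw.

    efficiency = DC out / AC in, so DC out = AC in * efficiency.
    """
    return wall_watts * efficiency

def wall_draw(dc_watts, efficiency):
    """Wall (AC) draw needed to deliver a given DC load."""
    return dc_watts / efficiency

print(round(dc_load(800, 0.80), 1))  # 640.0 -- the TX850 example above
print(round(wall_draw(640, 0.92)))   # 696   -- same DC load on a 92% unit
```

Put another way: a 1300w-rated PSU at ~90% efficiency would read roughly 1440w at the wall before it's actually maxed out.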


----------



## hotrod717

Quote:


> Originally Posted by *rdr09*
> 
> i think you can do better. Osjur crossed 22K with the 290X. here's mine crossing 21K . . .
> 
> http://www.3dmark.com/3dm11/7711090
> 
> orig bios and using Trixx to oc.


Thanks for the support, buddy. Just got this into my new rig Saturday, and it's pulling double duty with this new card and a new 4930K. I'm sure I can score better with a little more time. Been seeing good things with Trixx; will have to give it a try.


----------



## kizwan

Quote:


> Originally Posted by *kcuestag*
> 
> How do I know if my GPU's have that UEFI GOP BIOS? Are they official BIOS'? If so where can I get them.


I see you have a Sapphire R9 290 GPU. So far Sapphire has not released a UEFI GOP BIOS for that card yet.


----------



## chiknnwatrmln

OK, so for some reason I can't score as high as I used to. Same driver settings, etc etc.

Before at 1200/6500 I would get ~17.4k GPU score on 3dMark11 with tess on. Now I get ~17000.

I re-ran a test I did before (1150/5600) and same thing, a couple hundred points lower. Different drivers, but I got lower scores using the same drivers too. Here are the links to the two most recent.

New: http://www.3dmark.com/3dm11/7743777
Old: http://www.3dmark.com/3dm11/7533693

What the heck is going on? Has my chip really degraded? I've run +18mv for pretty much the past two months, the only time I've run higher voltage is for benching which I don't do all that much. Highest I've gone is 1412 in GPUTweak, about 1.28v after droop on my card.


----------



## the9quad

Quote:


> Originally Posted by *kdawgmaster*
> 
> I did have them both overclocked but then i reverted back to stock. I dont have anything right now to tell me power draw from the wall but if i can im going to get one today.
> 
> When its a non stress full game AKA L4D2, TF2 or something in those regards teh system does great and plays for hours on end. But when it comes to a game that loads ALL my system it will hard lock with the GPU fans ( all 3 of them ) reving up to 100% and i have to shut down the system. I know this isnt the TYPICAL indication of lower power for a system but when iv tested the parts separate they do fine. I can get through 20 pass of linx without error and heaven and furmark both run perfect for hrs.
> 
> What I also did is because i thought its a cooling problem was get some fans for my GPU's and rad for the CPU. Im running 4 of the coolermaster sickleflo fans which move a bunch of air. although producing alot of noise.
> 
> This could be a motherboard fault as well but I would have figured that to give me errors with Linx as well.
> Not rly :/ the R9 290x on the average game takes around 350-370 of watts per card. So lets say base off of the 350 mark for the average game and thats 1050 watts just in GPU alone. Now my 4820K none overclocked is 130watts max so thats 1180 roughly, now factor in mainboard, ram, cd/dvd, cooling, my 2 HDD's and my 3 SSD's and i bet u im reaching the limits.
> 
> While im off at work im going to load memtest to make sure this isnt a memory leak problem.


Your power supply is fine, I can run my 4930k at 1.4 volts and 3 r9290x's with 1150/1375 overclocks just fine on a 1200 watt psu.

You have something else conflicting that's causing it to happen. It is not the power.

And if you go to page one of this thread you can see the Newegg video with power draws under full system load in tri-fire; it isn't close to 1300 watts.

In the video, with a water-cooled 3970X at 4.5GHz, one SSD, and one hard drive, he is pulling 930 watts at full load with 3 R9 290Xs.


----------



## bond32

Still seems I get black screens in BF4 when I up the memory clocks... This is with a 1440p monitor, 110hz. Anyone else still have this issue?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *bond32*
> 
> Still seems I get black screens in BF4 when I up the memory clocks... This is with a 1440p monitor, 110hz. Anyone else still have this issue?


1440p @ 96hz and in BF4 I run my memory clock at 1500mhz. I've never experienced a black screen issue.


----------



## bond32

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> 1440p @ 96hz and in BF4 I run my memory clock at 1500mhz. I've never experienced a black screen issue.


Which card do you have and when did you get it? I have a sapphire 290x that I got at release.


----------



## RagingCain

Hey guys just a reminder, Team Green vs. Team Red is going on! (I took over for Alatar)!

Unfortunately the river is running green with toxic waste, but I think the rivers are supposed to run Red with blood? Enough of my ******ed-ness. Feel free to stop by and get your benchmark on for OCN!

http://www.overclock.net/t/1436635/official-ocns-team-green-vs-team-red-gk110-vs-hawaii/0_50

R9 290s and 290Xs.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *bond32*
> 
> Which card do you have and when did you get it? I have a sapphire 290x that I got at release.


Sapphire 290, also purchased at launch.


----------



## note235

is it safe to undervolt the 290s?


----------



## crun

General consensus is that Hynix is better than Elpida in terms of overclocking in R9 290/290X, correct?


----------



## MrWhiteRX7

My first one was elpida and I've had it as high as 1600 on the memory stable, but that doesn't "seem" to be the norm.


----------



## jrcbandit

There doesn't seem to be much difference. Even people with Hynix are often stuck with "low" memory overclocks.


----------



## velocityx

Quote:


> Originally Posted by *crun*
> 
> General consensus is that Hynix is better than Elpida in terms of overclocking in R9 290/290X, correct?


http://forums.overclockers.co.uk/showpost.php?p=25174104&postcount=290


----------



## jerrolds

So my bro's H60 is slowly dying, and since I've already bought the XSPC backplate for my 290X (for aesthetic/strengthening purposes originally), I'm thinking this might open the door into a more "custom" water loop for me (I can give him my current Venomous X-RT cooler).

Thinking of the Swiftech H320 as my gateway setup (not looking to spend $400+ for WC, not including the GPU block, quite yet).

The Swiftech rep says the H320 should be able to support an i7 [email protected]+ and a Radeon R9 [email protected]+ rather easily.

I'm wondering if I can get the H320 now for my CPU, then eventually add the 290X to the loop in the future. Not sure if I can mix and match a Swiftech H320 with a GPU with an XSPC Razor block?

Another $140 for the Swiftech now and $100 for the XSPC block (and maybe some more for coolant/tubes/fittings) later, to get my i7 and 290X under water, seems like a good jumping-off point.

What do you guys think?

Or should I spend the extra $100 for the XSPC EX360 kit?

----------



## Falkentyne

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> OK, so for some reason I can't score as high as I used to. Same driver settings, etc etc.
> 
> Before at 1200/6500 I would get ~17.4k GPU score on 3dMark11 with tess on. Now I get ~17000.
> 
> I re-ran a test I did before (1150/5600) and same thing, a couple hundred points lower. Different drivers, but I got lower scores using the same drivers too. Here are the links to the two most recent.
> 
> New: http://www.3dmark.com/3dm11/7743777
> Old: http://www.3dmark.com/3dm11/7533693
> 
> What the heck is going on? Has my chip really degraded? I've run +18mv for pretty much the past two months, the only time I've run higher voltage is for benching which I don't do all that much. Highest I've gone is 1412 in GPUTweak, about 1.28v after droop on my card.


Quote:


> Originally Posted by *MrWhiteRX7*
> 
> My only beef with Trixx is the lack of Aux voltage. I can't hit my same memory clocks on Trixx as I can with AB using aux @ +13
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Which sucks because I ONLY used trixx on my 7970's, I love it so much.


You sure you have everything set the same?
Even something like forcing anisotropic filtering in the driver, or some other obscure setting or something running could slip by....


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Falkentyne*
> 
> You sure you have everything set the same?
> Even something like forcing anisotropic filtering in the driver, or some other obscure setting or something running could slip by....


Yeah, for me I just use Afterburner and I can hit high memory clocks, but I have the aux voltage at +13. In Sapphire Trixx this option does not exist, and I'm only able to get to 1450 when the core is at 1200 at the same voltage. Doesn't really bother me; I actually really like AB currently. I'd just used Trixx so long I was attached to it, lol.


----------



## bond32

Quote:


> Originally Posted by *jerrolds*
> 
> So my bros H60 is slowly dying, and since ive already bought the XSPC Backplate for my 290X (for aesthetic/strengthening purposes originally) - im thinking this might open the door into a more "custom" water loop for me (i can give him my current Venomous X-RT cooler).
> 
> Thinking of the Swiftech H320 as my gateway setup (not looking to spend $400+ for WC, not including the gpu block, quite yet)
> 
> The Swiftech rep says the H320 should be able to support an i7 [email protected]+ and a Radeon R9 [email protected]+ rather easily.
> 
> I'm wondering if i get get the 320 now for my CPU, then eventually add the 290X to the loop in the future. Not sure if i can mix and match a Swiftech H320 with a GPU with a XSPC Razor block?
> 
> Another $140 for swiftech now and $100 for xspc block (and maybe some more for coolant/tubes/fittings) later to get my i7 and 290X "under water" seems like a good jump off point.
> 
> What do you guys think?
> 
> Or should i spend the extra $100 for the XSPC EX360 kit?


The Swiftech kit is great. I swore by my H220 when I had it, and it worked really well to cool an AMD 8350 and a 7970 with 1 additional rad. There are some things you should consider about it though, such as the barbs on the pump not being removable. This means you are stuck with that tube size (don't remember what it was). Also, I had serious clearance issues on my motherboard because of the pump/block.

I would recommend the XSPC kit. It will cool just as well if not better, plus it is even more flexible. Don't get me wrong, the H320 is a fantastic kit; however, in my experience I wish I had just passed on it, as I ultimately wanted a full-blown custom loop (I later got the XSPC RX240 kit, and now have a 480mm rad in push/pull and a thick 420).


----------



## Tennobanzai

Quote:


> Originally Posted by *note235*
> 
> is it safe to undervolt the 290s?


I don't see why not. I've been trying to undervolt mine, but after a couple of restarts it goes back to the 0 default. Not sure why.


----------



## note235

Quote:


> Originally Posted by *Tennobanzai*
> 
> I dont see why not. I've been trying to undervolt mine but after a couple of restarts it goes back to the 0 default. Not sure why.


are you using AB?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Falkentyne*
> 
> You sure you have everything set the same?
> Even something like forcing anisotropic filtering in the driver, or some other obscure setting or something running could slip by....


Yes I'm sure. The only difference was that those scores were on the stock cooler, and now I'm running the Gelid cooler. The core is cooler but the VRM's are about 10c hotter.


----------



## Chimera1970

GPU-Z Link - http://www.techpowerup.com/gpuz/ngeyp/

Vendor - Gigabyte
Cooler - WindForce


----------



## Durvelle27

Quote:


> Originally Posted by *Chimera1970*
> 
> GPU-Z Link - http://www.techpowerup.com/gpuz/mhx28/
> 
> Vendor - Gigabyte
> Cooler - WindForce


Pics


----------



## Tennobanzai

Quote:


> Originally Posted by *note235*
> 
> are you using AB?


Yes latest driver and AB


----------



## Chimera1970

I already installed it LOL, would you settle for a pic of the box?


----------



## Gero2013

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> my asic is 72.2 tops out @ 1.186 xfx290 unlocked to 290x


so is ASIC directly related to overclocking potential?

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Yes I'm sure. The only difference was that those scores were on the stock cooler, and now I'm running the Gelid cooler. The core is cooler but the VRM's are about 10c hotter.


400 is not a significant difference; it could be the new cooler? Would be great if you could keep us updated on any changes!


----------



## Mr357

Quote:


> Originally Posted by *Gero2013*
> 
> so is ASIC directly related to overclocking potential?


No, it means nothing.


----------



## Bartouille

Got an R9 290 for $419 in Canada.

Diamond brand; I hope it OCs well! Going to install my MK-26 on it and keep the unisink. Hoping for around 1.2 GHz.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Gero2013*
> 
> so is ASIC directly related to overclocking potential?
> 400 is not a significant difference, it could be the new cooler? would be great if you could keep us updated on any changes!


That's about 2.3%; my score is consistently about 400 points lower, so this is not inside the margin of error.

I think the higher VRM temps may be causing the card to score lower, although the card isn't downclocking, so I'm not sure. When (if; I've been waiting on a 290X block for three weeks) I get this card under water, I'll see if there is a difference.

If not, back to XFX this card goes. The chip should not have degraded in a few months.
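As a quick sanity check on those numbers, the percentage math works out like this (the 17,400-point baseline below is a made-up example; plug in your actual before/after scores):

```python
# Hypothetical scores for illustration only; a 400-point drop from a
# ~17,400-point baseline is roughly the 2.3% mentioned above.
def pct_drop(before, after):
    """Percentage drop from `before` to `after`."""
    return 100.0 * (before - after) / before

print(round(pct_drop(17400, 17000), 1))  # -> 2.3
```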


----------



## Pheozero

So since I don't quite have the time to read through the 12K+ posts in here, I'm hoping someone wouldn't mind answering my questions while I wait for custom PCB cards to come out.

Was that black screen issue ever fixed? And what about single-GPU performance in CrossFire while playing BF4?


----------



## Forceman

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> That's about 2.3%, my score is repeatedly about 400pts lower so this is not inside the margin of error.
> 
> I think that maybe the higher VRM temps cause the card to get a lower score, although the card's not downlocking so I'm not sure. When (if, I've been waiting on a 290x block for three weeks) I get this card under water I'll see if there is a difference.
> 
> If not, back to XFX this card goes. The chip should not have degraded in a few months.


I think I saw someone else say they got lower performance when the VRM temps got too high, but I don't remember where. Makes some sense though, Powertune could be throttling the card slightly to cut power consumption or lower the temps or something. You have +50% power limit set, yeah? And it shows correctly in CCC?


----------



## jerrolds

Quote:


> Originally Posted by *Pheozero*
> 
> So since I don't quite have the time to read through the 12K+ posts in here, I'm hoping someone wouldn't mind answering my questions while I wait for custom PCB cards to come out.
> 
> Was that black screen issue ever fixed? What about the single GPU performance while in crossfire while playing BF4?


It's better since beta 9.4, but I don't think it's 100% fixed. I haven't gotten a black screen in weeks personally. I do get random bluescreen restarts, but I think it's a MAME thing (it only happens when running the cab), although it did happen on the desktop a few times randomly.

But nothing to really write home about. I am satisfied with the stability.


----------



## Fniz92

Quote:


> Originally Posted by *Pheozero*
> 
> So since I don't quite have the time to read through the 12K+ posts in here, I'm hoping someone wouldn't mind answering my questions while I wait for custom PCB cards to come out.
> 
> Was that black screen issue ever fixed? What about the single GPU performance while in crossfire while playing BF4?


My brother's 290X has been running without any black screens since the 9.4 beta driver; he is now using the latest WHQL with zero issues.


----------



## DeadlyDNA

Kinda off topic, but how do you get your results onto 3DMark's ranking list? Does it automatically submit and add them?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Forceman*
> 
> I think I saw someone else say they got lower performance when the VRM temps got too high, but I don't remember where. Makes some sense though, Powertune could be throttling the card slightly to cut power consumption or lower the temps or something. You have +50% power limit set, yeah? And it shows correctly in CCC?


Yes +50% is set in GPUTweak.

The card is not throttling.


----------



## The Storm

Has anyone ever had luck selling game codes on ebay? I need to sell these 2 BF4 codes.


----------



## Zenophobe

Received my Gigabyte GV-R929XOC-4GD today.

http://www.3dmark.com/3dm11/7743935

Waiting on my new case to install my 4770K and MB, then I will rerun everything.

http://www.3dmark.com/3dm/2064926


----------



## Forceman

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Yes +50% is set in GPUTweak.
> 
> The card is not throttling.


Can you post an Afterburner graph of a run?
Quote:


> Originally Posted by *The Storm*
> 
> Has anyone ever had luck selling game codes on ebay? I need to sell these 2 BF4 codes.


I sold mine no problem. Think I got $40 for it.


----------



## VSG

Quote:


> Originally Posted by *The Storm*
> 
> Has anyone ever had luck selling game codes on ebay? I need to sell these 2 BF4 codes.


Don't do eBay, you are ineligible for seller protection on digital goods and I got cheated last week by a scammer who filed a claim on PayPal against me after using the code. I still have 2 BF4 codes to sell but may put them up on Craigslist.


----------



## mojobear

Hi all... got my CrossFire R9 290s running, but I can't see the sensor readings on the second GPU in GPU-Z, such as the VRMs. Anyone have any ideas how to fix this?

Many thanks!


----------



## the9quad

Quote:


> Originally Posted by *Pheozero*
> 
> So since I don't quite have the time to read through the 12K+ posts in here, I'm hoping someone wouldn't mind answering my questions while I wait for custom PCB cards to come out.
> 
> Was that black screen issue ever fixed? What about the single GPU performance while in crossfire while playing BF4?


Can you clarify what you mean by single gpu performance in crossfire? I did a fraps run last night in siege of shanghai and was like 180 avg fps/100 min @1440p ultra. So crossfire scaling seems pretty good in bf4 as far as I've seen. I can run some more benches tomorrow if ya want. Also black screens are a thing still, my recommendation is if you get them don't waste your time and just rma the card ASAP for a new one. I don't think as many cards are as affected as before though. First one I bought black screened. The three I have now are fine though.

Hope that helps.


----------



## the9quad

Quote:


> Originally Posted by *mojobear*
> 
> Hi all... got my CrossFire R9 290s running, but I can't see the sensor readings on the second GPU in GPU-Z, such as the VRMs. Anyone have any ideas how to fix this?
> 
> Many thanks!


Did you use the drop-down in GPU-Z to select the second card? It only shows one at a time.

Also, for future reference: if you want to see the info in Afterburner for both cards, you have to disable ULPS.
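For anyone who hasn't done the ULPS tweak before: it's controlled by the `EnableUlps` registry value under the Windows display-adapter class key, with one numbered subkey per adapter. This is only a hedged sketch of the lookup logic with invented sample data; on a real system you'd enumerate the live registry (e.g. with Python's `winreg`) and set each `EnableUlps` to 0, or use a utility that toggles it for you.

```python
# The adapter subkeys live under:
#   HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\00NN
# This helper just picks out which subkeys define EnableUlps;
# the sample tree below is made up for illustration.

def subkeys_with_ulps(tree):
    """tree maps subkey name -> {value name: data}; return the subkeys
    that define EnableUlps (registry names are case-insensitive)."""
    return [name for name, values in tree.items()
            if any(v.lower() == "enableulps" for v in values)]

sample = {
    "0000": {"EnableUlps": 1, "DriverDesc": "AMD Radeon R9 290"},
    "0001": {"EnableUlps": 1, "DriverDesc": "AMD Radeon R9 290"},
    "0002": {"DriverDesc": "Some other device"},
}
print(subkeys_with_ulps(sample))  # -> ['0000', '0001']
```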


----------



## jomama22

Quote:


> Originally Posted by *geggeg*
> 
> Don't do eBay, you are ineligible for seller protection on digital goods and I got cheated last week by a scammer who filed a claim on PayPal against me after using the code. I still have 2 BF4 codes to sell but may put them up on Craigslist.


Just an FYI: all you have to do in this case is mail out the game-code paper itself in a cheap envelope with a stamp. Keep proof of mailing with a receipt from the post office. eBay would be none the wiser that you emailed him the code. As long as you have proof of sending, he can't get his money back.


----------



## HardwareDecoder

Quote:


> Originally Posted by *jomama22*
> 
> Just a FYI, all you have to do in this case is mail out the game code paper itself in a cheap envelope and stamp. Have proof of mailing with a receipt from the post office. Ebay would be none the wiser that you emailed him the code. As long as you have proof of sending, he cant get his money back.


Good idea, but I don't think it will fully protect the seller. The buyer can still just say the code was already used / product not as described and put in for a refund.


----------



## mojobear

Quote:


> Originally Posted by *the9quad*
> 
> Did you use the drop down tab in gpuz to select the second card, it only shows one at a time.
> 
> Also for future reference, if you want to see the info in afterburner for both cards you have to disable ULPS


Hi there, yeah I did try that... but GPU-Z (the latest one from Dec 20) was a bit whack and didn't display any readings like the VRMs. I also tried TriXX, but there was no voltage control on the second and third GPU (just the power limit??).


----------



## mojobear

My second and third R9 290 GPUs also list as 2816 shaders... maybe there is still a bug in GPU-Z with the R9 290s?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Forceman*
> 
> Can you post an Afterburner graph of a run?
> I sold mine no problem. Think I got $40 for it.



Here, this is a quick snapshot of what my usage looks like while gaming on my everyday OC. The only time the core clock dropped below 1100 MHz was when Skyrim's GPU usage wasn't exactly 100%, but it was over 95% the entire time.

Temps get a bit higher after an hour or so, around 80°C on VRM1, 55°C on VRM2, and 62°C on the core. Even then there is no throttling, other than when I either hit 60 FPS (vsync) or the game's engine stutters.

The graphs look the same regardless of my OC; neither 1140 nor 1200 MHz throttles, but the VRMs get a bit too hot (they can hit 100°C @ 1200), so I keep my card at 1100. I really can't wait to get this card under water.
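If you want harder evidence than eyeballing the graph, a sensor log can be checked programmatically. A minimal sketch, assuming a CSV-style log with `GPU Clock [MHz]` and `GPU Load [%]` columns (column names vary by tool and version, so adjust them to match your actual log):

```python
import csv
from io import StringIO

def throttle_events(rows, target_mhz=1100, slack_mhz=10, busy_pct=95):
    """Return the samples where the core clock fell below the target
    while the GPU was essentially fully loaded (i.e. real throttling,
    not just idle downclocking)."""
    return [r for r in rows
            if float(r["GPU Load [%]"]) >= busy_pct
            and float(r["GPU Clock [MHz]"]) < target_mhz - slack_mhz]

# Tiny invented log standing in for a real sensor file:
log = StringIO("GPU Clock [MHz],GPU Load [%]\n"
               "1100,99\n"   # full load, on target
               "1050,99\n"   # full load, clock sagging -> flagged
               "300,5\n")    # idle downclock -> ignored
rows = list(csv.DictReader(log))
print(len(throttle_events(rows)))  # -> 1
```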


----------



## dansi

Anyone know how to undervolt a 290X?
I tried with MSI AB, but upon restart the voltages return to defaults.


----------



## Sgt Bilko

Quote:


> Originally Posted by *dansi*
> 
> Anyone know how to undervolt 290x?
> Tried with MSI AB, but upon restart, the voltages return to defaults.


Have you tried checking Force Constant Voltage?

Other than that, you could try a different program: TriXX, GPU Tweak, etc.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *dansi*
> 
> Anyone know how to undervolt 290x?
> Tried with MSI AB, but upon restart, the voltages return to defaults.


Make a profile and re-apply it.

GPUTweak does the same thing, just set it to start up when Windows launches and click the profile you made.


----------



## VSG

Quote:


> Originally Posted by *jomama22*
> 
> Just a FYI, all you have to do in this case is mail out the game code paper itself in a cheap envelope and stamp. Have proof of mailing with a receipt from the post office. Ebay would be none the wiser that you emailed him the code. As long as you have proof of sending, he cant get his money back.


Doesn't work, since the actual item sold is digital and that doesn't qualify for seller protection.


----------



## bir86

I get lower FPS in Hitman: Absolution with an OC. +1 FPS at 1100 MHz and -2 at 1200 MHz on the core. Dafuq?

Anyone else can confirm this?

290X Crossfire
Windows 8.1 64-bit
Asus GPU Tweak
CCC 13.12


----------



## Arizonian

Quote:


> Originally Posted by *Chimera1970*
> 
> GPU-Z Link - http://www.techpowerup.com/gpuz/ngeyp/
> 
> Vendor - Gigabyte
> Cooler - WindForce


Congrats - added

I saw your GPU-Z has your OCN name on it. That'll do.

I assumed, since it has a 1040 MHz clock speed, that it's a *290X*, which I had to look up. If I'm not mistaken, you're the first non-reference cooler with your WindForce, after 216 members on the roster who either stuck with stock or did their own water cooling or aftermarket cooling.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Chimera1970*
> 
> GPU-Z Link - http://www.techpowerup.com/gpuz/ngeyp/
> 
> Vendor - Gigabyte
> Cooler - WindForce


Nice one

First custom card in here

I have to ask, is it volt-locked like most Giga cards?


----------



## Sazz

I got my new R9 290X (which replaced my 290) back on water cooling. I just tried a few overclocking tests with AB and the stock (Sapphire) BIOS, and at +100mV I get a 1210 MHz core clock stable (OCCT error-free; it could go to 1230 with no artifacts showing, but OCCT gets errors).

I tried Sapphire TriXX at +150mV and so far I get 1275 MHz stable; I can probably break 1300 MHz+ at +200mV.

Temp-wise the VRM seems to go as high as 48°C, but I'll do more GPU stressing before I jump to that conclusion.


----------



## dansi

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Make a profile and re-apply it.
> 
> GPUTweak does the same thing, just set it to start up when Windows launches and click the profile you made.


Yep, that's what I'm doing currently... so there is no way to auto-apply on restart?


----------



## Sgt Bilko

XFX has the Double Dissipation 290 for $599 and the 290X for $759 up at PCCG in Aus.

That's a clear $100 over the cost of a 290 reference card and $80 over a 290X reference.


----------



## darkelixa

I couldn't spend an extra 100 bucks on a GPU just because it has a different fan/fans on it.


----------



## Derpinheimer

Well, that and a custom PCB. They also claim it has controllable memory voltage... although I'm guessing that's an error?

Controllable memory voltage + Hynix VRAM = <3


----------



## Sgt Bilko

Quote:


> Originally Posted by *Derpinheimer*
> 
> Well, that and a custom PCB. They also claim it has controllable memory voltage.. although I'm guessing thats an error?
> 
> Controllable memory voltage + hynix vram = <3


I thought Hawaii in general didn't have controllable memory voltage?

Would be nice if it did.


----------



## ssiperko

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I thought Hawaii in General didn't have controllable Mem Voltage?
> 
> Would be nice if it did


Yea, I'd likely be able to smack 1800 if I could mod the mem voltage.

SS


----------



## ImJJames

Quote:


> Originally Posted by *Sazz*
> 
> I got my new R9 290X (that replaced my 290) back on watercooling, just tried a little overclocking tests with AB and stock BIOS (sapphire) and at +100mv I get 1210Mhz core clock stable (OCCT error free, could go 1230 with no artifacts showing but OCCT gets error).
> 
> I tried to use Sapphire Trixx at +150mv so far I get 1275Mhz stable, prolly can break 1300Mhz+ at +200mv.
> 
> Temp wise VRM seems to go as high as 48C, but would do more GPU stressing before I jump on that conclusion.


Very nice, seems like you have a good OC'ing card. Slap the PT1 BIOS on there just to see how far you can push it, since you're on water.


----------



## HardwareDecoder

Quote:


> Originally Posted by *ImJJames*
> 
> Very nice, seems like you have a good OC'ing card, slap PT 1 bios on there just to see how far you can push it since you're on water


I still recommend he be very careful. Killing a card with volts is not only about heat... it is also about, ya know, VOLTS. There has been at least one guy on here who supposedly killed his card with 1450 mV.


----------



## Sgt Bilko

Well, I just put an order in for the XFX Double Dissipation 290 Black Edition (quite a long name); it ships out on the 10th, so I'll let everyone know how it goes then.

Anyone got any experience with XFX?


----------



## Sazz

So far the VRM temps seem to be in great shape at +100mV; the GPU temp barely cracks 44°C after 3 hrs of Valley.

Gonna see what +200mV in TriXX can do on the stock BIOS tomorrow.

Edit: Ran a quick Valley at +200mV with TriXX and got 1300/1500.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Sazz*
> 
> 
> 
> So far vrm temps seems to be in great shape at +100mv, GPU temp barely cracks 44C after 3hrs of valley.
> 
> gonna see what +200mv on trixx can do at stock bios tomorrow.
> 
> Edit: Ran a quick Valley at +200mv with TrixX got 1300/1500


Nice clock

Seems that extra 100 mV with TriXX can push them to 1250/1300 if you can keep it cool enough.


----------



## Sazz

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Nice clock
> 
> Seems that extra 100mV with Trixx can push them to 1250/1300 if you can keep it cool enough.


Yeah, once water cooled, temps aren't even a problem anymore, at least in my case. The GPU core was barely breaking 49°C overclocked at +200mV, and the VRMs are in the same case. It's just a question of whether the hardware can sustain those kinds of voltages for long periods of time. For me, I will only do +200mV for benchmarking and will run at +50mV for my daily clocks until we can confirm that +100mV is safe for daily use.


----------



## kizwan

Quote:


> Originally Posted by *kizwan*
> 
> When ULPS is disabled, is it normal for the screen to flicker when setting fan setting in AB? For example, when changing from Auto to fixed fan speed & vice versa.


Anyone?


----------



## Sgt Bilko

OK, I pulled the trigger on two XFX R9 290 DDs.

Looks like it's gonna be a good new year.


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ok, i pulled the trigger on 2 XFX R9 290 DD's
> 
> Looks like it's gonna be a good new year


Finally!


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> Finally!


Yeah, I've been going back and forth a bit, but I finally decided to grab the first type that was in stock.


----------



## Monkeysphere

OK, so I'm a bit annoyed with this. I have 2x MSI R9 290.

In benchmarks neither card will hit 947 MHz. They are both water cooled, hitting temps of around 40°C. One maxes at 915 MHz and the other at 850 MHz.

Overclocking them sometimes makes them ramp up to 1100, but then I'll reboot and they go all over the place.

One ASICs at 77%, the other at 73%.

I can also only get +100mV in AB; is that all they can go, or is it just the locked-BIOS issue?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Monkeysphere*
> 
> Ok so I'm a bit annoyed with this. I have 2x MSI R9 290.
> 
> In benchmarks neither card will hit 947mhz. They are both watercooled hitting temps of around 40C. One maxes at 915mhz and the other at 850mhz.
> 
> Overclocking them sometimes makes them ramp up to 1100 but then I'll reboot and they go all over the place.
> 
> One ASICs at 77% the other at 73%.
> 
> I can also get +100mv in AB, is that all they can go or is it just the locked BIOS issue?


Well, they aren't temp throttling. Have you set the power limit to +50 in both Afterburner and CCC?

If you have overclocking disabled in CCC, then you should only need to set the power limit in Afterburner.


----------



## Sazz

Quote:


> Originally Posted by *Monkeysphere*
> 
> Ok so I'm a bit annoyed with this. I have 2x MSI R9 290.
> 
> In benchmarks neither card will hit 947mhz. They are both watercooled hitting temps of around 40C. One maxes at 915mhz and the other at 850mhz.
> 
> Overclocking them sometimes makes them ramp up to 1100 but then I'll reboot and they go all over the place.
> 
> One ASICs at 77% the other at 73%.
> 
> I can also get +100mv in AB, is that all they can go or is it just the locked BIOS issue?


AB's limit is +100mV, but I think there's a bypass to make it go further than that.

As for your clocks not hitting the maximum you set: did you raise the board power limit by more than +10%? And what benchmarks are you talking about? I run Valley and my 290X does run at the clocks I set.

I saw this on my 290X and on the 290 that I previously owned: at stock settings (except for the fan curve; I use my own fan profile) the card doesn't sustain its set clocks, but when I set the board power limit to +10 to +15%, the GPU reaches the clocks I set. It also depends on the game/benchmark. Valley can run it at top speed, though not 100% constant; you'll see the clocks move around a bit, especially during transitions between passes.


----------



## TheSoldiet

So Hawaii has more to give, apparently!

http://wccftech.com/amd-hawaii-gpu-die-shot-analyzed-fully-unlocked-chip-48-compute-units-3072-stream-processors/

(Didn't have time to check if anyone else has posted this)


----------



## Sgt Bilko

Quote:


> Originally Posted by *TheSoldiet*
> 
> So hawaii has more to give apparently!
> 
> http://wccftech.com/amd-hawaii-gpu-die-shot-analyzed-fully-unlocked-chip-48-compute-units-3072-stream-processors/
> 
> (Didn't have time to check if anyone else has posted this)


3072?!? Damn, looks like AMD are getting serious


----------



## TheSoldiet

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 3072?!? Damn, looks like AMD are getting serious


Yeah! I also hope that in the future regular 290Xs and unlocked 290s will be able to unlock even further to get all of those SPs. This will be a beast, even more so once the drivers have matured.


----------



## Ros1kk

Quote:


> Originally Posted by *TheSoldiet*
> 
> Yeah! I also hope that in the future regular 290X's and unlocked 290's will be able to unlock themselves even further to get all of these sp's. This will be a beast, and even more when the drivers has matured.


dreams are made to be!!!!!!!!!!


----------



## Chimera1970

Yes, it's the 290X.

The fact that it came with an aftermarket cooler already installed was one of the main reasons I decided to get this model. I had planned on changing out the cooler had I bought a stock card, but after reading some of the (mostly negative installation) reviews AND the fact that all of the "good" coolers were sold out everywhere I looked, I decided on this card. It was the most expensive at $699, but overall I went with what I wanted, and now I have bragging rights LOL. I have to admit, though, that the WindForce cooler on this one seems a little flimsy. One of the fans was rubbing up against something and I had to figure out a way to stop it, so I jerry-rigged a temporary fix until I can figure out a better solution. I should include a picture of my handiwork, maybe later on today.

Hopefully I can do some gaming on it today and see if it can handle BioShock Infinite better than my old 1GB AMD Radeon HD 7700 series card. I also plan on trying out Metro 2033 and Metro: Last Light, both of which I got just yesterday in the ongoing Steam sale.

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> I saw your GPU-Z has your OCN name on it. That'll do.
> 
> I assumed, since it has a 1040 MHz clock speed, that it's a *290X*, which I had to look up. If I'm not mistaken, you're the first non-reference cooler with your WindForce, after 216 members on the roster who either stuck with stock or did their own water cooling or aftermarket cooling.


----------



## Durvelle27

Guys looks like I might be getting another R9 290


----------



## Ros1kk

One question, guys: if I have a reference Sapphire 290X card and I replace the cooler with an Accelero X3, does my warranty go to hell?? Thanks for the support, and have a good new year!


----------



## Durvelle27

Quote:


> Originally Posted by *Ros1kk*
> 
> One question, guys: if I have a reference Sapphire 290X card and I replace the cooler with an Accelero X3, does my warranty go to hell?? Thanks for the support, and have a good new year!


Depends on the brand as some have warranty stickers and others don't


----------



## Sgt Bilko

Quote:


> Originally Posted by *Ros1kk*
> 
> One question, guys: if I have a reference Sapphire 290X card and I replace the cooler with an Accelero X3, does my warranty go to hell?? Thanks for the support, and have a good new year!


Sapphire follows more of a "don't ask, don't tell" policy; just don't glue heatsinks on. Use the thermal pads instead and you will be fine.

And a Happy 2014 to you as well!!

12:35 am here now


----------



## Ros1kk

Quote:


> Originally Posted by *Durvelle27*
> 
> Depends on the brand as some have warranty stickers and others don't


Does Sapphire usually have them? I'll pull my 290 out now to verify.

But I'm going to buy the same card, with an X on it, now!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> Depends on the brand as some have warranty stickers and others don't


Sapphire doesn't have them over any screws.


----------



## Ros1kk

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Sapphire doesnt have them over any screws


Thank you so much, sir!


----------



## Durvelle27

Quote:


> Originally Posted by *Ros1kk*
> 
> Does Sapphire usually have them? I'll pull my 290 out now to verify.
> 
> But I'm going to buy the same card, with an X on it, now!


I can confirm the Sapphire R9 290 does not have any


----------



## Ros1kk

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Sapphire follow more of a "dont ask,dont tell" policy, just dont glue heatsinks on. Use the thermal pads instead and you will be fine.
> 
> And a Happy 2014 to you as well!!
> 
> 12:35am here now


13:35 here at the moment

Thank you so much for the help

Link to a description of the thermal pads? Sorry for my noobness


----------



## TheSoldiet

3.04 PM here

Or 15.04 as we call it here in Norway.


----------



## steelkevin

Quote:


> Originally Posted by *Durvelle27*
> 
> I can confirm the Sapphire R9 290 does not have any


Not that it's needed, but I can give a third confirmation that I didn't encounter any sticker when putting a water block on a Sapphire 290.

I really wish I would've tested it on air first though, just to know what others are enduring (I'd drained my loop the night before getting the card so I wouldn't lose any time; I already had the water block waiting for a week or so).


----------



## TheSoldiet

I can also confirm that the Sapphire 290X does not have any stickers.


----------



## Ros1kk

Quote:


> Originally Posted by *steelkevin*
> 
> Not that it's needed, but I can give a third confirmation that I didn't encounter any sticker when putting a water block on a Sapphire 290.
> 
> I really wish I would've tested it on air first though, just to know what others are enduring (I'd drained my loop the night before getting the card so I wouldn't lose any time; I already had the water block waiting for a week or so).


I will let you know about that! I'm going to buy an X and an Accelero X3 soon. The only question: approximately, what is the average ASIC quality on a 290X? Right now I have a basic 290 card at 68.1% (omg) and I run my signature frequencies without any problems.
Could any "X" owner explain, please? Thank you so much!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Ros1kk*
> 
> 13:35 here at the moment
> 
> Thank you so much for the help
> 
> Link to a description of the thermal pads? Sorry for my noobness


Use the pads from the stock cooler for the VRAM and the pads included with the Accelero Xtreme III for the VRMs.

That's what I did when I had my Sapphire 290X.

EDIT: I'm guessing you already know that you need more heatsinks than what's included with the Accelero III?

You will need some low-profile ones due to clearance.


----------



## Ros1kk

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Use the pads from the stock cooler for the VRAM and the pads included with the Accelero Xtreme III for the VRMs.
> 
> That's what I did when I had my Sapphire 290X.
> 
> EDIT: I'm guessing you already know that you need more heatsinks than what's included with the Accelero III?
> 
> You will need some low-profile ones due to clearance.


I know, but I don't really know where to buy the extra VRM heatsinks.

If it's OK with the stock cooler pads, that sounds really good!


----------



## Derpinheimer

It will work, but re-using pads will increase temperatures for those parts. You can buy thermal pads on eBay; otherwise, search for "Fujipoly thermal pad" at local online computer stores.


----------



## steelkevin

Alright,
I should make a thread with a poll, but people tend not to care about new threads, so I'll ask here and hope that as many people as possible who are involved will answer.

*@All water-cooled R9 290 owners*

If you're running your fans a bit fast you probably won't have noticed and won't be able to answer. If you'd be willing to just turn them down to their lowest speed and launch a game, that would be great. It won't take any more than a minute, if even that.

My card makes a terrible electrical-interference buzz (or whatever) sound when it's under load. I just played some BF4 and noticed the sound changed a bit depending on which presets or resolution I used (I had my case open; you may not be able to distinguish the sounds if yours is closed, but it doesn't really matter anyway).
I'm thinking it could be the card or the PSU, so here are mine:
- Sapphire R9 290 with Hynix memory @ 1100/1300, no volt adjustments
- OCZ StealthXStream II 700W

It may also be related to CPU OC (just throwing a random guess out there; I can't really find a link between the two), so mine's an i7-860 @ 3800 MHz with HT disabled.

I really hate that it makes that noise; it sort of ruins the point of quiet water cooling (if that's what you were going for, which is my case) and I would love to know if others are affected by it too.

*Does your card make such a noise too?*


----------



## Tomalak

Quote:


> Originally Posted by *steelkevin*
> 
> Alright,
> I should make a thread with a poll but people tend not to care about new threads so I'll ask here and hope that as many people as possible involved will answer.
> *@All water cooled R9 290 owners*
> *Does your card make such a noise too ?*


I had two 290Xs (one of them RMA-ed for another reason) - they both have moderate coil whine. You'll find it's more noticeable the higher the FPS is, especially in menus and such.

I don't really mind it since I never had an AMD card without it (5870, 2x6970 and now 290X).


----------



## dspacek

Hello, next week I will buy a Sapphire R9 290, and I will put it under water with this block: http://www.alphacool.cz/produkt/EK_Water_Blocks_EK-FC_R9-290X_-_Plexi.html?arg1=01
Then I will let you know how the overclocking goes and whether it's unlockable with a 290X BIOS. ;-)


----------



## lmclean12

Quote:


> Originally Posted by *steelkevin*
> 
> Alright,
> I should make a thread with a poll but people tend not to care about new threads so I'll ask here and hope that as many people as possible involved will answer.
> 
> *@All water cooled R9 290 owners*
> 
> If you're running your fans a bit fast you probably won't have noticed and won't be able to answer. If you'd be willing to just turn them down to their lowest speed and launch a game, that would be great. It won't take any more than a minute, if even that.
> 
> My card makes a terrible electrical interference / buzz or whatever sound when the card is under load. I just played some BF4 and noticed the sound changed a bit depending on which pre-sets or resolution I used (I had my case open, you may not be able to distinguish the sounds if yours is closed but it doesn't really matter anyway).
> I'm thinking it could be the card or the PSU so here are mine:
> - Sapphire R9 290 with Hynix Memory @1100/1300 no volt adjustments
> - OCZ StealthXStream II 700W
> 
> It may also be related to CPU OC (just throwing a random guess out there, can't really find a link between both) so mine's an i7-860 @3800MHz with HT disabled.
> 
> I really hate that it makes that noise and it sort of ruins the purpose of quiet water cooling (if that's what you were going for which is my case) and would love to know if others are affected by it too.
> 
> *Does your card make such a noise too ?*


I've noticed it on the vast majority of cards I've owned when I push past 60 FPS, and when I exit Heaven or Valley benchmarking.


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Sapphire follow more of a "dont ask,dont tell" policy, just dont glue heatsinks on. Use the thermal pads instead and you will be fine.
> 
> And a Happy 2014 to you as well!!
> 
> 12:35am here now


Here now is 3 hours behind yours.
Quote:


> Originally Posted by *Ros1kk*
> 
> 13:35am here at the moment
> 
> thank u so much for the help needed
> 
> Link for thermal pads description? sorry for my noobness


Here now is 7 hours ahead. News from the future, no nuclear war yet, everything is safe.
Quote:


> Originally Posted by *TheSoldiet*
> 
> 3.04 PM here
> 
> Or 15.04 as we call it here in Norway.


Here now is 7 hours ahead.


----------



## Ros1kk

You already are in 2014! Amazing.


----------



## kizwan

Quote:


> Originally Posted by *Ros1kk*
> 
> you already are in 2014! amazing


Not yet. One minute before 2014.


----------



## dspacek

We are 7 hours from 2014 ;-) So happy New Year to all who are already in and all who will be, and have the best experiences with your AMD cards.


----------



## Ros1kk

Quote:


> Originally Posted by *dspacek*
> 
> we are 7 hours to 2014 ;-) So happy new year to all who are and will be in, and have the best experiences with your AMD cards.


----------



## steelkevin

Quote:


> Originally Posted by *Tomalak*
> 
> I had two 290Xs (one of them RMA-ed for another reason) - they both have moderate coil whine. You'll find it's more noticeable the higher the FPS is, especially in menus and such.
> 
> I don't really mind it since I never had an AMD card without it (5870, 2x6970 and now 290X).


Quote:


> Originally Posted by *lmclean12*
> 
> I've noticed it on the vast majority of cards i've owned when i push past 60 fps and when I exit Heaven or Valley Benchmarking.


Argh... I'm going to buy either the Asus VG248QE or the BenQ XL2411T, so that's quite the annoyance considering I'll probably be changing settings to get 120 fps... By the way, I made a thread with a poll so people could help me decide which screen to get; here it is, if any of you feel like commenting or simply voting. Any vote is helpful.

XL2411T (280€) or VG248QE (300€)?


----------



## kdawgmaster

Well, I'm starting my winter cleaning of my case. I'll post progress for you all =D


----------



## steelkevin

Quote:


> Originally Posted by *kdawgmaster*
> 
> Well im starting my winter cleaning of mu case. Ill post progress for u all =D


From the looks of it, you don't do a winter cleaning every year, do you?


----------



## kdawgmaster

Quote:


> Originally Posted by *steelkevin*
> 
> From the looks of it you don't do a winter cleaning every year now do you



No, I don't. But I live in Canada, and where I live dust is just everywhere. I've had this case for not even a year, and that's how much it's collected.


----------



## Durquavian

Try having cats. Litter dust got so bad I had to clean once a month. That was the biggest reason I built my air-conditioned cabinet. I never have to clean the cards again, and I get stable, steady performance year-round.


----------



## Jack Mac

I keep my dog downstairs and my computer upstairs; no issues with pet hair, just dust from shag carpet.


----------



## mojobear

Hi All,

I posted earlier about GPU-Z not showing sensors like VRM temps for the secondary GPU(s) when CrossFireX is enabled... well, you all probably know already, but it's that ULPS stuff. Disabling it in regedit did the trick; now it shows most data. Only GPU1 reports GPU load, unfortunately, but all other info is present.

Also, I had some issues with Driver Sweeper (uuhhhhh) and had to reinstall the AMD drivers twice. I think most of the issues are fixed... fingers crossed.

Does this 3Dmark result look right to you guys?

http://www.3dmark.com/fs/1447153
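For reference, the ULPS tweak mentioned above is usually done by zeroing the `EnableUlps` value under the display-adapter class keys. A hypothetical sketch of the change as a .reg file (the `0000` sub-key index varies per system, so search the registry for every `EnableUlps` entry, and back up the registry before editing):

```reg
Windows Registry Editor Version 5.00

; Illustrative only: there is one numbered sub-key (0000, 0001, ...) per
; display adapter, so repeat this for each key that contains EnableUlps.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}\0000]
"EnableUlps"=dword:00000000
```

A reboot (or driver restart) is typically needed before GPU-Z shows the secondary card's sensors.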


----------



## psyside

Can anyone share some info about temps on a 290 after prolonged gameplay with a max OC? I would use a custom fan speed, like 60%. Does it get really loud at that point, and are there downclock/throttling issues even with a 60% fan?


----------



## kizwan

Quote:


> Originally Posted by *mojobear*
> 
> Hi All,
> 
> I posted prior about GPUz not showing sensors like..VRM temps etc when you have crossfirex enabled for secondary GPU(s)...well you all probably know already but its all that ULPS stuff. Disabling it in regedit did the trick...now it shows most data - only GPU1 has gpu load unfortunately but all other info is present.
> 
> Also - had some issues with drive sweeper (uuhhhhh) and had to reinstall AMD drivers x 2. Think most of the issues are fixed...fingers crossed.
> 
> Does this 3Dmark result look right to you guys?
> 
> http://www.3dmark.com/fs/1447153


Does your screen flicker whenever you change the fan setting in MSI AB?

Score looks alright to me.
Quote:


> Originally Posted by *psyside*
> 
> Can anyone share some info about temps on 290, after prolonged gameplay with max oc? i would use custom fan, like 60% does it get really loud at that point, and are there downclock- throttling issues even with 60% fan?


Depends on the ambient. If your ambient is >32C, 60% is not enough; you need at least 70-80% fan speed.


----------



## kdawgmaster

Quote:


> Originally Posted by *Durquavian*
> 
> Try having cats. Litter dust gets so bad had to clean once a month. That was the biggest reason I built my airconditioned cabinet. Never have to clean them again and stable steady performance year round.


I have 2 cats and 2 dogs.

(Photo captions:)
Pre video cards =D
Video cards, no power
POWEAAAAAAAAAHAHAHAHAHAHA!!!!!!!!
Back =)

This is 10 times better than the last one.

Quote:


> Originally Posted by *psyside*
> 
> Can anyone share some info about temps on 290, after prolonged gameplay with max oc? i would use custom fan, like 60% does it get really loud at that point, and are there downclock- throttling issues even with 60% fan?


My ambient temps are around 20C, and with 3 overclocked cards on stock coolers I need case fans blowing on the video cards, plus the video card fans themselves at 70%, to stay at 70C.


----------



## mojobear

Quote:


> Originally Posted by *kizwan*
> 
> Did your screen flicker whenever you change fan setting in MSI AB?
> 
> Score look alright to me.
> Depends on the ambient. If your ambient is >32C, 60% is not enough, at least 70 - 80% fan speed.


My cards are under water, so I don't mess around with MSI AB fan settings.


----------



## xP_0nex

Finally got my R9 290 installed.


----------



## Raephen

Quote:


> Originally Posted by *steelkevin*
> 
> Alright,
> I should make a thread with a poll but people tend not to care about new threads so I'll ask here and hope that as many people as possible involved will answer.
> 
> *@All water cooled R9 290 owners*
> 
> If you're running your fans a bit fast you probably won't have noticed and won't be able to answer. If you would be willing just to turn them down to their lowest speed and launch a game that would be great. It won't take any more than a minute if even that much
> 
> 
> My card makes a terrible electrical interference / buzz or whatever sound when the card is under load. I just played some BF4 and noticed the sound changed a bit depending on which pre-sets or resolution I used (I had my case open, you may not be able to distinguish the sounds if yours is closed but it doesn't really matter anyway).
> I'm thinking it could be the card or the PSU so here are mine:
> - Sapphire R9 290 with Hynix Memory @1100/1300 no volt adjustments
> - OCZ StealthXStream II 700W
> 
> It may also be related to CPU OC (just throwing a random guess out there, can't really find a link between both) so mine's an i7-860 @3800MHz with HT disabled.
> 
> I really hate that it makes that noise and it sort of ruins the purpose of quiet water cooling (if that's what you were going for which is my case) and would love to know if others are affected by it too.
> 
> *Does your card make such a noise too ?*


Coil whine. Yes, I've noticed it on my 290 a few times, most often when pushing more than 100 fps in Skyrim. I have my monitor set to 100 Hz now (due to another issue, though) and I don't notice any coil whine anymore.

I should try Dragon Age: Origins with my monitor set at 144 Hz. I tried it this week at 100, and omg: I thought it was great with the HD 6870 / HD 7870 cards I had, but now!!! Too bad I seem to CTD a lot now.


----------



## Raephen

And best wishes to all for 2014!!!


----------



## Forceman

Quote:


> Originally Posted by *steelkevin*
> 
> Alright,
> I should make a thread with a poll but people tend not to care about new threads so I'll ask here and hope that as many people as possible involved will answer.
> 
> *@All water cooled R9 290 owners*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> If you're running your fans a bit fast you probably won't have noticed and won't be able to answer. If you would be willing just to turn them down to their lowest speed and launch a game that would be great. It won't take any more than a minute if even that much
> 
> 
> My card makes a terrible electrical interference / buzz or whatever sound when the card is under load. I just played some BF4 and noticed the sound changed a bit depending on which pre-sets or resolution I used (I had my case open, you may not be able to distinguish the sounds if yours is closed but it doesn't really matter anyway).
> I'm thinking it could be the card or the PSU so here are mine:
> - Sapphire R9 290 with Hynix Memory @1100/1300 no volt adjustments
> - OCZ StealthXStream II 700W
> 
> It may also be related to CPU OC (just throwing a random guess out there, can't really find a link between both) so mine's an i7-860 @3800MHz with HT disabled.
> 
> I really hate that it makes that noise and it sort of ruins the purpose of quiet water cooling (if that's what you were going for which is my case) and would love to know if others are affected by it too.
> 
> 
> *Does your card make such a noise too ?*


Mine also makes a buzzing sound. Not coil whine in the traditional sense, but buzzing. It isn't very loud though, and seems to be getting less noticeable over time.


----------



## Ricey20

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Sapphire doesnt have them over any screws


From reports here on OCN, it doesn't matter that they don't have a warranty-void sticker: Sapphire is known to test the TIM to find out if the cooler has been changed, and deny warranty. MSI and XFX, although they have warranty-void stickers, have posted themselves that it's okay to change the cooler as long as there's no physical damage and you change it back to stock before sending the card in. This also depends on the country, though.


----------



## Arizonian

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Sapphire follow more of a "dont ask,dont tell" policy, just dont glue heatsinks on. Use the thermal pads instead and you will be fine.
> 
> And a Happy 2014 to you as well!!
> 
> 12:35am here now


Quote:


> Originally Posted by *Ros1kk*
> 
> one question guys: if I use a reference sapphire 290x card and i replace the cooler with an accelero x3 my warranty goes to hell?? thanks for support! good new year!


Quote:


> Originally Posted by *kizwan*
> 
> Here now is 3 hours behind yours.
> Here now is 7 hours ahead. News from the future, no nuclear war yet, everything is safe.
> Here now is 7 hours ahead.


Quote:


> Originally Posted by *Raephen*
> 
> And best wishes to all for 2014!!!


Still got eight hours here in the United States.

Like to take a quick moment to wish everyone a Happy New Year - 2014!

What a way to end 2013, with a new GPU for those of us who did get one - got to love that.

For those of you still thinking about getting a new GPU, I couldn't think of a better way to start off 2014.


----------



## disintegratorx

Hey everyone.

I'll be looking to join this club as soon as I get the money for an R9 290X. The one I want is the PowerColor LCS. I have a question for anyone with liquid-cooling experience, since I've never done it before and I'm interested in doing it for the graphics card only. What is a good kit to buy, and what should I know, if anything in particular, about liquid cooling? I ask because I want the card to stay at safe, even preferred, temps at all times. Any advice and extra information would be greatly appreciated; I'm definitely looking to get one of these cards, so any good tips would be very welcome. I'll be reading through this thread for answers, and also because I'm curious how the newest Radeons are working out for everyone. Thank you in advance, and by the way... Happy New Year!!!

Thanks guys,

- The Disintegrator -

Mike


----------



## Sazz

The only time I get coil whine is when I OC over +75 mV with the GPU under load, but not so much that it's annoying or anything; it just makes those sounds from time to time.


----------



## PearlJammzz

I just got an R9 290, and no matter which position I flip the switch to, all my fan settings and everything are the same. Shouldn't CCC show me a 40% fan limit in one and 59% or something in the other? Benchmarks are exactly the same as well.

Maybe someone had this card before, tried flashing it, it didn't work, and then they flashed it back and overwrote the Quiet or Uber BIOS? Sapphire R9 290 BF4 edition, BTW.


----------



## kdawgmaster

Quote:


> Originally Posted by *PearlJammzz*
> 
> I just got a R9 290 and no mater which position I flip the switch, all my fan settings and everything are the same. Shouldn't CCC show me at a 40% fan limit in one and in a 59% or something in the other? Benchmarks are exactly the same as well.
> 
> Maybe someone had this card before, tried flashing it, didn't work, then flashed it back and overwrote the quiet or uber bios? Sapphire R9 290 BF4 edition BTW.


Quote:


> Originally Posted by *psyside*
> 
> Can anyone share some info about temps on 290, after prolonged gameplay with max oc? i would use custom fan, like 60% does it get really loud at that point, and are there downclock- throttling issues even with 60% fan?


The R9 290 doesn't have an Uber or Quiet profile; that's only the R9 290X. Your card does still have a dual BIOS, though, just in case an overclock fails and you need to take it back.

On a side note, I managed to overclock my GPU to 1070 core and 1400 memory, but that seems to be the limit. I had to raise my aux voltage another 100 mV to get it this high without artifacting.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Ricey20*
> 
> From reports here on OCN, it doesn't matter that they don't have the warranty void sticker. Sapphire is known to test the TIM to find out if it's been changed and deny warranty. MSI and XFX, although they have the warranty void stickers have posted themselves that it's okay to change the cooler as long as there's no physical damage and you change it back to stock before sending it back to them. This also depends on country too though.


Tests the TIM? Lol, nah. Just nah.


----------



## PearlJammzz

Quote:


> Originally Posted by *kdawgmaster*
> 
> R9 290 dosnt have a Uber or quiet profile to it thats only the R9 290x. Ur card does still have a " dual Bios " just in case an overclock fails and u need to take it back.
> 
> On a side note. I managed to overclock my GPU to 1070 core and 1400 ram but that seems to be the limit. I had to adjust my aux voltage another 100mvs in order to get it this high without artifacting.


Ahh, I wasn't aware! Thank you!


----------



## HardwareDecoder

Quote:


> Originally Posted by *disintegratorx*
> 
> Hey everyone.
> 
> I'll be looking to join this club as soon as I get the money for a R9 290X. The one that I want to get is the Powercolor LCS. I have a question for anyone with experience with liquid cooling as I've never done it before and am interested in doing it for the graphics card, only. What is a good kit to buy and what should I know, if anything in particular, about liquid cooling? My purpose to know is because I want to have the card stay at safe and even preffered temps at all times. Any advice and extra information about it would be greatly appreciated and I'm looking to definitely get one of these cards, so any good tips would be very much appreciated. I'll be reading through this thread for any answers and also because I'm curious to know how the newest Radeons are working for everyone. Thank you in advance, and by the way... Happy New Year!!!
> 
> Thanks Guys,
> 
> - The Disintegrator -
> 
> Mike


Does that PowerColor LCS come with a block installed? If so, I'd say just get an XSPC kit with at least 360mm of rad, and it will come with a CPU block too. The kits come with basically everything you need, and you would just need two fittings to put the GFX card in the loop.


----------



## kdawgmaster

Quote:


> Originally Posted by *PearlJammzz*
> 
> Ahh, I wasn't aware! Thank you!


Be aware, though, that if you overclock with MSI AB and have the overclock applied on Windows startup, it won't matter whether you have a dual BIOS or not; it will still overclock and mess you up. So if you do overclock, have fun and be safe =D

Question for everyone: what are all your VRM temps at?

I see people reporting 90+C and everything, but mine hardly hit 60C under load for all 3 cards :/


----------



## Forceman

Quote:


> Originally Posted by *kdawgmaster*
> 
> On a side note. I managed to overclock my GPU to 1070 core and 1400 ram but that seems to be the limit. I had to adjust my aux voltage another 100mvs in order to get it this high without artifacting.


Why are you increasing the aux voltage so much? Most people just use core voltage - I don't think I've ever heard of anyone doing 100mV on the Aux. I doubt most people change Aux at all.


----------



## kdawgmaster

Quote:


> Originally Posted by *Forceman*
> 
> Why are you increasing the aux voltage so much? Most people just use core voltage - I don't think I've ever heard of anyone doing 100mV on the Aux. I doubt most people change Aux at all.


It's what I had to do to get mine stable :/

My core will only get to 1070 with the core voltage upped, and the RAM to 1350. With aux I can get the RAM to 1450; I haven't tried higher yet.


----------



## psyside

Quote:


> Originally Posted by *kizwan*
> 
> Did your screen flicker whenever you change fan setting in MSI AB?
> 
> Score look alright to me.
> Depends on the ambient. If your ambient is >32C, 60% is not enough, at least 70 - 80% fan speed.


My ambient is around 22C, and I won't push volts higher than +100 mV max. I'm trying not to go over 85C max on the core, and around 75C on the VRMs. Do you guys *think this is possible with 60% fan settings?*

Is it too loud @60%? Thanks.


----------



## disintegratorx

Yes, and thank you for replying, HardwareDecoder. It also has something called a gold power stabilizer kit, which is another reason I want that model. Yeah, but I've never had a water-cooling system, so I was wondering if there was a specific kind that is best to go with. I'm currently looking at some kits at FrozenCPU.com... I'm totally new to liquid cooling, so I was just wondering if anyone had any recommended parts or brands.


----------



## kdawgmaster

Quote:


> Originally Posted by *psyside*
> 
> My ambient is around 22c, i won't push volts higher then 100 mV + max....i'm trying to not go over 85c as max on core, and around 75c on the vrm's do you guys *think this is possible with 60% fan settings?*
> 
> Is it to loud @60%? thanks.


Overclocked? No, you're more than likely not going to get that. I have to add 1% fan speed for every 1C over 60C; this means 60C is 60% fan speed, to keep mine under 75C, which it doesn't always do depending on the game. Something like BF4 can still push me above 80C with my overclock.

And yes, your fan will be a little loud at 60% fan speed.


----------



## psyside

Quote:


> Originally Posted by *kdawgmaster*
> 
> Overclocked no ur more then likely not going to get that. I have to do 1% for ever 1C for fan speed for anything over 60C, this means 60C is 60% fan speed to keep min under 75C which it dosnt always do depending on the game. Something like BF4 can push me above 80C still with my overclock.


I don't really understand you.

Do you mean that for 85C I must use a 70% fan?


----------



## Jack Mac

Quote:


> Originally Posted by *psyside*
> 
> I don't really understand you
> 
> Do you mean that for 85c i must use 70% fan?


Not with 1 card. 55-60% keeps me below 80C.


----------



## kdawgmaster

Quote:


> Originally Posted by *psyside*
> 
> I don't really understand you
> 
> Do you mean that for 85c i must use 70% fan?


Probably more. I thought what I typed was clear, but let me try again.

For my video card(s) to stay cool while gaming, I have a preset fan profile in MSI AB. My card(s) will essentially go to 60% fan speed at 60C, and for every degree more (meaning 1C) my fans go up 1% to help keep the card cool. Now, yours might not be as special a case as mine, where I have 3 cards overclocked. But even my bottom card, which gets the most air, still has the same fan profile set, and even a game like BF4 will easily push it to 70C, making my fan speed 70%.
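A fan profile like the one described above (60% at 60C, plus 1% per extra degree) is just a linear ramp. As a minimal sketch in Python (the function name and the 100% cap are illustrative, not part of MSI AB):

```python
def fan_speed(temp_c, base_temp=60, base_speed=60, max_speed=100):
    """Linear fan curve: base_speed% at base_temp, +1% per degree C above it."""
    if temp_c <= base_temp:
        return base_speed
    # Ramp 1% per 1C over the base temperature, capped at the maximum speed.
    return min(max_speed, base_speed + (temp_c - base_temp))
```

So 60C maps to 60%, 70C to 70%, and anything past 100C pins the fan at 100%.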

Quote:


> Originally Posted by *Jack Mac*
> 
> Not with 1 card. 55-60% keeps me below 80C.


I find this extremely hard to believe, when a stock card at 55% has a hard enough time keeping its core clock (meaning it hits 95C) and you have a massive overclock on yours. You're either running water or a custom cooler to get that. And I'm talking about the 290X, btw.


----------



## Jack Mac

Quote:


> Originally Posted by *kdawgmaster*
> 
> probably more. I thought what i typed was clear but let me try it again
> 
> For my video card (s) to stay cool while gaming i have a pre set fan profile with MSI AB. My card (s) will essentially go to 60% fan speed at 60C and for every degree more ( meaning 1C) my fans will go up 1% to help the card keep cool. Now yours might not be as special as a case as mine where i have 3 cards overclocked. But even my bottom card, which gets the most air, still has the same fan profile set and even a game like BF4 will push it to 70C easy for me making my fan speed 70%.
> 
> I find this extremely hard to believe when a stock card at 55% has a hard enough time keep its core clock ( which means it hits 95C ) and u have a massive overclock to yours. Ur either running water or a custom cooler to get that. And im talking for the 290x btw


I only need +65 mV for this OC, and it hits 95C in reviews because they don't run a fixed 55% fan speed.


----------



## psyside

@kdawgmaster Ahhh, you've got 3 cards, huh? Now I can breathe more easily, lol! I thought I was doomed.

Edit: You can test this easily: disable 2 cards and test with one.

@Jack Mac Thanks, what voltage does your card have at stock?


----------



## kdawgmaster

Quote:


> Originally Posted by *Jack Mac*
> 
> I only need +65MV for this OC and it hits 95C on reviews because it doesn't run at a fixed 55% fanspeed.


Quote:


> Originally Posted by *Jack Mac*
> 
> Not with 1 card. 55-60% keeps me below 80C.


Your MSI fan profile doesn't tell me you're running the stock cooler, though. What are you testing it with? Throw it under FurMark, or play a game like Crysis 3 maxed out or BF4 at 100% for something like 30 minutes, and see what your temps are then.

Quote:


> Originally Posted by *psyside*
> 
> Ahhh, you got 3 cards, huh now i can breath more easily lol! i tought i'm doomed
> 
> Edit: You can test this easy, disable 2 cards and test with one
> 
> Thanks, what volts are your card have as stock?


It's not just because I have 3 cards. Keep in mind the lowest card (#3) has all the same airflow as if you had a single card, if you have the proper case. I have a Cosmos 2, and I have two 120mm fans feeding my cards 95 CFM each to help cool them.

Card 3 will only run a few degrees above normal because of a little heat off card 2, but it's not enough to be noticeable. Nonetheless, I still don't think you'll be able to do it, because of the amount of power this card needs to run properly.


----------



## Jack Mac

Quote:


> Originally Posted by *kdawgmaster*
> 
> Ur MSI fan profile dosnt tell me ur running a stock card cooler tho. What are ur testign it with. Throw it under furmark and see and play a game like crysis 3 maxed 100% or BF4 100% with something like 30 min into it and see what ur temps are then.
> Its not just because i have 3 cards. Keep in mind the lowest card ( #3 ) has all the same airflow as if u were to have a single card if u have the proper case. I have a cosmos2 and i have 2 120MM fans on my cards feeding them 95cfm each to help cool the cards.
> 
> Card 3 will only run a few degress above normal because of a little heat off of card 2 but its not enough to be noticeable. but none the less i still dont think ull be able to do it because of the amount of power this card needs to run properly.


I'm running the stock cooler. I don't play FurMark or Crysis, but I did test Crysis 2, and temperatures peaked at 78C. Keep in mind that this is with 22C ambients inside an FT02.


----------



## the9quad

One card is easy to keep cool at 60-65% fan at 100% load, in my experience.
Three cards on my mobo is another story. Card 1 will do it, but #2 and #3 are like sammiches and #2 runs hot. That's with a HAF X, not the best case for airflow. No sense upgrading the case until I go water cooling later this year; I need a case with lots of room for rads that won't cost me a thousand bucks.

Generally card 2 peaks at 78C with 75% fan speed in my case, with #1 considerably cooler at 65% and 70C, and #3 at 75C and 70%. That's after hours of BF4.


----------



## psyside

Quote:


> Originally Posted by *the9quad*
> 
> 1 card is easy to keep cool at 60-65 % at 100% load in my experience.
> 3 cards on my mobo, is another story. Card 1 will do it, but #2 and 3 are like sammiches and #2 runs hot. That's with a HAF X. Not the best case for airflow. No sense upgrading the case until i go watercooling later this year. Need a case with lots of room for rads, that wont cost me a thousand bucks.
> 
> generally card 2 peaks at 78C with 75% fan speed in my case with #1 considerably cooler at 65 % and 70C and #3 at 75C and 70%. Thats after hours of BF4.


Thanks. So the conclusion would be: with fans at 60%, a single card will max out around 80C, even with a decent voltage bump of +100 mV and long gaming sessions in intensive games, right? What about noise; is it very bad at around 60%?

What are your VRM/ambient temps? Also, is the HAF X a bad case for airflow? What about the Storm Trooper?


----------



## kdawgmaster

Quote:


> Originally Posted by *psyside*
> 
> Thanks. So what will be the conclusion, with fans at 60% single card will max out around 80c, even with decent voltage bump + 100mV and long gaming session with intensive games right? what about noise, is it very bad at around 60%?
> 
> What are your vrm/ambient temps? Also HAF X is bad case for airflow? what about Storm Trooper?


HAF stands for "High Air Flow", so you don't have a problem there. I personally don't like the Trooper, and it doesn't have as good airflow as the HAF anyway.


----------



## kizwan

Quote:


> Originally Posted by *psyside*
> 
> My ambient is around 22c, i won't push volts higher then 100 mV + max....i'm trying to not go over *85c as max on core*, and around *75c on the vrm*'s do you guys *think this is possible with 60% fan settings?*
> 
> Is it to loud @60%? thanks.


I can't tell for sure. This is mine at stock clock:-
- 33C ambient
- 80% fan
- BF4
- 90C on GPU1 core
Quote:


> Originally Posted by *disintegratorx*
> 
> Yes, and thank you for replying, HardwareDecoder. It also has a something called a gold power stabilizer kit which is another reason why I want that model. Yeah, but I've never had a water cooling system, so I was wondering if there was a specific kind that was the best to go with. I'm currently looking at some kits at FrozenCPU.com... I'm totally new to liquid cooling, so I was just wondering if anyone had any recommended parts to use, or brands.


If you're getting the Powercolor LCS, you'll only need these to complete the loop:
- pump (the best pumps are DDC & D5)
- reservoir (bay reservoir, tube reservoir, etc.)
- tubing (the best soft tubing so far is Primochill Primoflex Advanced LRT)
- barbs or compression fittings (two for each block/radiator/reservoir)

You can get all the above in one complete kit, e.g. XSPC or EK. For example:-

XSPC:-
http://www.frozencpu.com/products/16070/ex-wat-210/XSPC_Raystorm_EX240_Universal_CPU_Water_Cooling_Kit_w_D5_Variant_Pump_Included_and_Free_Dead-Water.html?tl=g59c683s2174

EK:-
http://www.frozencpu.com/products/18964/ex-wat-249/EK_L240_Complete_Dual_120mm_Liquid_Cooling_Kit_EK-KIT_L240.html?tl=g57c607s1948

OR

http://www.frozencpu.com/products/15069/ex-wat-201/Ek_H30_240_HFX_Advanced_Liquid_Cooling_Kit_-_CSQ_EK-KIT_H3O_240_HFX.html?tl=g57c607s1948

The XSPC kit comes with cheap tubing; I recommend replacing it with Primochill Primoflex Advanced LRT. All the kits come with a CPU water block, so you can add the CPU to the loop later without buying any additional parts except two more barbs or compression fittings.

The kits above come with a 240mm radiator, which is enough for cooling a GPU.

The EK kits come with an EK-DCP 2.2 or EK-DCP 4.0 pump. They're not DDC pumps but rebranded Jingway pumps; they should do well, though.


----------



## psyside

Quote:


> Originally Posted by *kdawgmaster*
> 
> Haf stand for " High Air Flow " so u dont have a problem there. I personally dont like the Trooper and it dosnt have as good air flow as the Haf anyways.


Thanks for your help. He said the HAF X is not good for airflow, so I was a bit confused :/


----------



## the9quad

Quote:


> Originally Posted by *kdawgmaster*
> 
> Haf stand for " High Air Flow " so u dont have a problem there. I personally dont like the Trooper and it dosnt have as good air flow as the Haf anyways.


I should clarify that I have a rad at the top, and it pretty much negates the "HAF" part of the HAF, lol, so airflow sucks in my situation with this case... more than likely because I have no idea what I'm doing when I build these things; I just put it together and hope for the best.


----------



## kdawgmaster

Quote:


> Originally Posted by *psyside*
> 
> Thanks for your help, he said that HAF X is not good for airflow, and i was a bit confused : /


The HAF X will have better airflow than the Trooper, although I don't recommend either. What is it that you're looking for in a case?


----------



## VSG

Ahem: http://www.overclock.net/t/1455495/tomshardware-de-gigabyte-to-stop-mass-production-of-windforce-r9-290x-immediately-citing-problem-with-fan-sink/0_50


----------



## the9quad

Quote:


> Originally Posted by *kdawgmaster*
> 
> The Haf x will have better air flow then the Trooper case will. Although i dont recommend either. What is it that ur looking for in case?


I agree; I didn't do any research when I bought mine. I definitely would have put more money and time into a case if I had to do it again. For half the price of the Cosmos 2 it ain't bad, though; there's just no room for enough rads to water cool decently. What's your opinion on the Corsair Obsidian Series 900D?


----------



## kdawgmaster

Quote:


> Originally Posted by *the9quad*
> 
> I agree, I didnt do any research when I bought mine. I definitely would have put more money and time in a case if I had to do it again. for half the price of the cosmos 2 it aint bad though. Just no room forenough rads to watercool decently


I'm looking at swapping my Cosmos 2 for a Corsair Air 540 and getting 3 Kraken G10s with water coolers. That's when I can get the Kraken G10 for less than 100 bucks, because of shipping.


----------



## disintegratorx

Woow.. Thanks for the information about that, kizwan. That's exactly what info I was looking for. I will be looking for those exact parts that you've mentioned and thanks so much for your input!







+1!!!

My introduction to watercooling... Awesome! Thanks!!









I'll be back to join the club, and I'll do my best to post some pics or info when I come back. Thanks again.


----------



## HALOwner97

My brand new R9 290














Le GPUz link:
http://www.techpowerup.com/gpuz/9zpdg/

Brand: Sapphire

Cooler: Stock (with the included awesome Sapphire stickers on it.)


----------



## Forceman

Quote:


> Originally Posted by *disintegratorx*
> 
> Woow.. Thanks for the information about that, kizwan. That's exactly what info I was looking for. I will be looking for those exact parts that you've mentioned and thanks so much for your input!
> 
> 
> 
> 
> 
> 
> 
> +1!!!
> 
> My introductory to to watercooling... Awesome!.. Thanks!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Will be back to join the club! and I'll do my best to post some pics or info when I come back. Thanks again..


If you plan on cooling both an overclocked 3930K and a 290, you might want to think about getting more than a 240mm radiator if your case will allow it.


----------



## psyside

Quote:


> Originally Posted by *kdawgmaster*
> 
> The Haf x will have better air flow then the Trooper case will. Although i dont recommend either. What is it that ur looking for in case?


Decent looks, easy to work with, and most important of all, great cooling.


----------



## stickg1

Is the stock 290(X) at >60% fan speed loud? Have you ever operated a gas-powered chainsaw? If you consider that noise mild, then you shouldn't be bothered.









Water blocks FTW


----------



## mojobear

Hey, is anyone having trouble with Afterburner and BSODs (code A0000001)? Or is it just me being a poor overclocker, haha. Thanks, and happy new year!


----------



## Zenophobe

Quote:


> Originally Posted by *Chimera1970*
> 
> Yes, it's the 290x.
> 
> The fact that it came with an aftermarket cooler already installed was one of the main reasons I decided to get this model. I had planned on changing out the cooler had I bought a stock card, but after reading some of the (mostly negative installation) reviews AND the fact that all of the "good" coolers were sold out everywhere I looked, I decided on this card. It was the most expensive at $699, but overall, I went with what I wanted  and now I have bragging rights LOL. I have to admit though that the Windforce cooler on this one seems a little flimsy. One of the fans was rubbing up against something and I had to figure out a way to stop it, so I jerry-rigged a temporary fix until I can either figure out a better solution. I should include a picture of my handywork, maybe later on today.
> 
> Hopefully I can do some gaming on it today and see if it can handle Bioshock Infinite better than my old 1GB AMD Radeon HD 7700 series. I also plan on trying out Metro 2033 and Metro: Last Light, both of which I got just yesterday in the Steam sale going on.


Same here for my Gigabyte Windforce 290X yesterday. I posted my scores a few pages back. Cool, quiet, and heavy. I'll only OC my card to 1100; my 7970 did 1200.


----------



## Arizonian

Quote:


> Originally Posted by *HALOwner97*
> 
> My brand new R9 290
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Le GPUz link:
> http://www.techpowerup.com/gpuz/9zpdg/
> 
> Brand: Sapphire
> 
> Cooler: Stock (with the included awesome Sapphire stickers on it.)


Congrats - added







Sweet way to start the new year!


----------



## Sgt Bilko

Quote:


> Originally Posted by *geggeg*
> 
> Ahem: http://www.overclock.net/t/1455495/tomshardware-de-gigabyte-to-stop-mass-production-of-windforce-r9-290x-immediately-citing-problem-with-fan-sink/0_50


Well, that's a worry. Good thing I didn't get a Gigabyte then....


----------



## Sazz

Highest score I have on firestrike using Stock BIOS.

http://www.3dmark.com/fs/1449477


----------



## gateh0use

Can anyone tell me which VRM is which? Is VRM1 the group of 3 or the long strip? Thanks.


----------



## kdawgmaster

Quote:


> Originally Posted by *gateh0use*
> 
> can anyone tell me which VRM is which? is VRM the 3 one or the long strip one? Thanks.


Both. The ones near the end of the card are VRM1, and the ones at the front are VRM2.


----------



## gateh0use

Quote:


> Originally Posted by *kdawgmaster*
> 
> Both. The ones near the end of the card is VRM1 and the one at the at the front is VRM2


Sorry for phrasing my question poorly. 'Near the end of the card' is the long strip of VRMs, right (VRM1 in GPU-Z), not the 3-piece cluster (presumably VRM2)?

I ask because my idle temps (currently with only fan cooling, no heatsinks) are VRM1 @ 37C, VRM2 @ 42C, but at full load in a demanding game they hit VRM1 @ 91C (the highest I've seen) and VRM2 @ 56C.

Just wondering which of them I should actually add the heatsinks to.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Sazz*
> 
> 
> 
> Highest score I have on firestrike using Stock BIOS.
> 
> http://www.3dmark.com/fs/1449477


Very nice









Your physics score is lower than mine at 5GHz though... is that clock stable?

EDIT: Link here http://www.3dmark.com/fs/1441733


----------



## rx7racer

Quote:


> Originally Posted by *gateh0use*
> 
> Sorry for phrasing my question poorly. 'Near the end of the card' -- that is the long strip of VRM's right (VRM1 in GPUZ) -- not the 3 piece VRM (presumably VRM2)?
> 
> I ask because my idle temps (currently with only fan cooling, no heatsinks) VRM1 @ 37C , VRM2 @42C.
> But at full load in demanding game (VRM1 @ 91C (Highest i've seen it), VRM2 56C).
> 
> Just wondering which of them I should actually add the heat-sinks to.


You'll need a heatsink on VRM1, which is the long strip closest to the PCI-E power connectors, and you'll want some on VRM2, the group of 3 clustered closest to the Display/DVI ports.

If you don't need the stock heatsink intact, you can use its base plate. There is a good guide by a fellow OCN'er in the AMD/ATI cooling section.


----------



## Sazz

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Very nice
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Your physics score is lower than mine at 5ghz though....is that clock stable?
> 
> EDIT: Link here http://www.3dmark.com/fs/1441733


Yeah, it's stable. I've run it several times because I noticed it being low, but for some reason it keeps getting that score; my old score (here) got a 9.9k physics score.

One thing has changed though: the system hardware monitoring feature of Fire Strike works for me now. Back when I did the first 10k run, system monitoring in Fire Strike didn't work for me.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Sazz*
> 
> yeah its stable, I've ran several times coz I noticed it being low, but for some reason it keeps getting that score, my old score (here) got 9.9k physics score.
> 
> Altho one thing has changed tho, the system hardware monitoring feature of the Firestrike works now for me, back on the time when i ran the first 10k run that I have the system monitoring on firestrike don't work on me.


That's just weird then... hopefully you figure it out. That GPU score mixed with a 10k physics score would be hella nice.


----------



## Raephen

Quote:


> Originally Posted by *psyside*
> 
> Decent looks, easy to work with, and most important of all great cooling


From personal experience, two spring to mind: the Fractal Design Arc Midi and the NZXT Phantom 630.

I owned an Arc Midi rev. 1 and modded it slightly to fit a thick 120mm rad outside on the rear exhaust, along with the 240mm rad in the top. I've also worked with an Arc Midi R2 recently for a friend's system; it looks better than the first version.

The Phantom 630 is a different beast altogether, with a plethora of watercooling options. If anyone is looking for a case with multiple options, I'd recommend looking into the 630.


----------



## Sazz

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Thats just weird then...hopefully you figure it out. That GPU score mixed with a 10k physics score would be hella nice


Yeah, I've been scratching my head and can't figure out why. My combined score is right, but the physics score is almost 1k lower than it should be.


----------



## UnkDev

Just got my XFX 290 today!



Stock air cooling for now. Waiting on my other cards.


----------



## Sgt Bilko

http://www.fudzilla.com/home/item/33480-xfx-showed-r9-290x-with-in-the-house-made-cooler

Specs of the XFX DD 290 and 290x's released:


Spoiler: Warning: Spoiler!



- 7 heatpipes
- Copper base
- Direct contact with the GPU for maximum cooling efficiency, working with the 7 heat pipes
- Ghost 2.0 thermal design
- Open thermal design for maximum airflow, engineered to take weight stress off the PCB for safer shipping
- Significant dB reduction vs. AMD's own cooler: 32dB vs. AMD's 57dB
- Idle noise level of only 15-20dB; whisper quiet
- Dual Double Dissipation 90mm fans, providing superior cooling performance, especially under gaming conditions
- Lower temperature under 3D load: 85C vs. AMD's 95C
- Dual XFX BIOS, voltage unlocked; lets the end user adjust GPU voltage, DRAM voltage, and VDDCI voltage for overclocking or even underclocking
- Digital power, providing cleaner voltage signals that are more stable and accurate than conventional analogue designs (allowing faster and more accurate adjustments when you OC)
- All solid capacitors and 6-phase power, lowering temperatures on the power IC modules by as much as 20% and ensuring stable power delivery to the GPU at all times


----------



## psyside

Quote:


> Originally Posted by *Raephen*
> 
> From personal experience, two spring to mind: Fractal Design Arc Midi and the NZXT Phantom 630.
> 
> I owned a Arc Midi rev.1 and modded it slightly to fit a thick 120mm rad outside on the rear exhaust along with the 120x240mm rad in top. I've also worked with an Arc Midi R2 recently for a friends system - looks better than the first version 1.
> 
> The Phantom 630 is a different beast all together with a plethora of watercool options. If anyone is looking for a case with multiple options, I'd recommend looking into the 630.


Thanks for the help


----------



## Sgt Bilko

Quote:


> Originally Posted by *Sazz*
> 
> yeah been scratching my head, can't figure out why. my combined score is right, but the physics score is almost 1k lower than it should.


Yeah it really is a mystery, my physics at 4.9 was pretty low though for this run:

8267 physics @ 4.9 : http://www.3dmark.com/fs/1151742

9716 physics @ 5.05 : http://www.3dmark.com/fs/1441733

I was running 4 x 4GB of Corsair Dominator at 1866MHz for the first test and got my G.Skill 2400MHz kit afterwards... maybe it's your RAM slowing you down?

I also believe my NB frequency was higher for the second run as well (not 100% sure)... might be worth checking out?


----------



## ohone

Does anyone know why my memory clock is this high when idle? And why it keeps dropping? I'm running the reference cooler on a R9 290.

http://gpuz.techpowerup.com/14/01/01/wc.png


----------



## RAFFY

Can anyone shed some light on why GPU Tweak will not pick up my 4th 290X? I have uninstalled GPU Tweak, run CCleaner, restarted, and reinstalled, and the problem still exists.


----------



## stickg1

Quote:


> Originally Posted by *ohone*
> 
> Does anyone know why my memory clock is this high when idle? And why it keeps dropping? I'm running the reference cooler on a R9 290.
> 
> http://gpuz.techpowerup.com/14/01/01/wc.png


Multiple monitors?


----------



## ohone

Quote:


> Originally Posted by *stickg1*
> 
> Multiple monitors?


Nope, one 1080p. It varies between 1250MHz and 150MHz.

Just figured it out. I was streaming a 720p video in the background. Thanks for the help!


----------



## HardwareDecoder

Quote:


> Originally Posted by *Raephen*
> 
> From personal experience, two spring to mind: Fractal Design Arc Midi and the NZXT Phantom 630.
> 
> I owned a Arc Midi rev.1 and modded it slightly to fit a thick 120mm rad outside on the rear exhaust along with the 120x240mm rad in top. I've also worked with an Arc Midi R2 recently for a friends system - looks better than the first version 1.
> 
> The Phantom 630 is a different beast all together with a plethora of watercool options. If anyone is looking for a case with multiple options, I'd recommend looking into the 630.


. love my r2


----------



## Slomo4shO

Just ordered 2 XFX R9 290 BLACK EDITION from ShopBLT.com yesterday. Seems to be the only reasonably priced merchant currently. They are out of stock on most models but do provide back orders with details of incoming inventory.


----------



## RAFFY

Quote:


> Originally Posted by *the9quad*
> 
> 1 card is easy to keep cool at 60-65 % at 100% load in my experience.
> 3 cards on my mobo, is another story. Card 1 will do it, but #2 and 3 are like sammiches and #2 runs hot. That's with a HAF X. Not the best case for airflow. No sense upgrading the case until i go watercooling later this year. Need a case with lots of room for rads, that wont cost me a thousand bucks.
> 
> generally card 2 peaks at 78C with 75% fan speed in my case with #1 considerably cooler at 65 % and 70C and #3 at 75C and 70%. Thats after hours of BF4.


Nice that's a great deal compared to the rest.


----------



## prostreetcamaro

Quote:


> Originally Posted by *disintegratorx*
> 
> Yes, and thank you for replying, HardwareDecoder. It also has a something called a gold power stabilizer kit which is another reason why I want that model. Yeah, but I've never had a water cooling system, so I was wondering if there was a specific kind that was the best to go with. I'm currently looking at some kits at FrozenCPU.com... I'm totally new to liquid cooling, so I was just wondering if anyone had any recommended parts to use, or brands.


I am loving the new watercooling setup I just installed. I got the XSPC EX360 kit with the D5 Vario pump, plus an additional EX240 radiator and the EK water block. My 290X never goes above 45C no matter how high I clock it or stress it. My 2600K at 1.45V and 4.8GHz will just touch 80C with LinX and stays in the 60s with Prime95. This setup is very quiet even with the pump and fans set at full speed.

I need to do some better tube routing. This is my first try, so please excuse the poor routing, everybody.

I still cannot figure out why setting the voltage on the 290X to anything over +135 corrupts my QNIX monitor. It doesn't do it with my other monitor hooked up. I'm starting to wonder if the cable just can't support the bandwidth.


----------



## stickg1

Quote:


> Originally Posted by *HardwareDecoder*
> 
> . love my r2


I don't think I'll be ditching my R2 anytime soon...


Quote:


> Originally Posted by *prostreetcamaro*
> 
> I am loving my new water cooling setup I just installed. I got the XSPC EX360 kit with the D5 Vario pump plus and additional EX240 radiator and the EK water block. My 290X never goes above 45c no matter how high i clock it or stress it. My 2600K at 1.45 volts and 4.8Ghz will just touch 80c with LinX and stays on the 60's with prime95. This setup is very quiet even with the pump set at full speed and the fans set at full speed.
> 
> I need to do some better tube routing. This is my first try so please everybody excuse the poor tube routing.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I still can not figure out why when I set the voltage on the 290X to anything over +135 it corrupts my QNIX monitor. Doesnt do it with my other monitor hooked up. I am starting to wonder if the cable just cant support the bandwidth.


Maybe spin that top radiator around so the fittings are closer to the front reservoir. That would eliminate one of the long runs.

Also that tube from res to front radiator looks to have a kink in it.


----------



## HardwareDecoder

Quote:


> Originally Posted by *prostreetcamaro*
> 
> I am loving my new water cooling setup I just installed. I got the XSPC EX360 kit with the D5 Vario pump plus and additional EX240 radiator and the EK water block. My 290X never goes above 45c no matter how high i clock it or stress it. My 2600K at 1.45 volts and 4.8Ghz will just touch 80c with LinX and stays on the 60's with prime95. This setup is very quiet even with the pump set at full speed and the fans set at full speed.
> 
> I need to do some better tube routing. This is my first try so please everybody excuse the poor tube routing.
> 
> 
> 
> I still can not figure out why when I set the voltage on the 290X to anything over +135 it corrupts my QNIX monitor. Doesnt do it with my other monitor hooked up. I am starting to wonder if the cable just cant support the bandwidth.


The good news is you aren't alone... I have the same issue, but mine seems to be related to core clock + voltage + refresh rate. I can set +100mV/1100MHz/110Hz even though I can run 1100 on stock volts.

But if I set 1200/+100mV/110Hz it screws up; I can do 1200/+100mV/60Hz though. I ordered a 3 ft cable from Monoprice, which is supposed to be the best QNIX overclocking cable, so we'll see if it helps at all.

Your tube routing is fine, by the way. It looks a lot like mine; forget how it looks, if it works well that's all that matters. Mine has to be kind of long so I can pull the pump/res unit out to fill it, etc.


----------



## psyside

I heard the 290 Tri-X OC sold out on Newegg. Does anyone have some info about VRM temps during OC? I'm torn between the Tri-X OC and the reference card.


----------



## Sazz

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah it really is a mystery, my physics at 4.9 was pretty low though for this run:
> 
> 8267 physics @ 4.9 : http://www.3dmark.com/fs/1151742
> 
> 9716 physics @ 5.05 : http://www.3dmark.com/fs/1441733
> 
> I was running 4 x 4GB of Corsair Dominator at 1866Mhz for the first test and got my G Skill 2400Mhz kit afterwards.........maybe it's your ram slowing you up?
> 
> I also believe my NB Freq was higher for the second run as well (not 100% sure)..........Might be worth checking out maybe?


I am using the same RAM and the exact same setup for my OC on both of those runs. That's why I'm scratching my head right now; the only thing different between the two runs is that the system monitoring feature of 3DMark works for me now, and back then it didn't.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Sazz*
> 
> I am using the same RAM and using the same exact set-up on my OC on those two runs.. that's why I am scratching my head right now coz the only different thing between the two runs is the system monitoring feature of 3Dmark is working now for me, back then it wasn't.


OK, well I'm at a complete loss then







sorry....


----------



## HardwareDecoder

Quote:


> Originally Posted by *stickg1*
> 
> I don't think I'll be ditching my R2 anytime soon...
> 
> 
> Maybe spin that top radiator around so the fittings are closer to the front reservoir. That would eliminate one of the long runs.
> 
> Also that tube from res to front radiator looks to have a kink in it.


I noticed the kink, but I figured I'd keep my mouth shut since his setup was working well.

Yeah, my R2 setup looks nowhere near as good as yours, but then again it's fully functional, so I don't really care.

It's a great case for the $50 I paid for it.

Question: do you know what that vertical PCI-slot-looking thing is for?


----------



## aznever

Quote:


> Originally Posted by *HardwareDecoder*
> 
> question: do you know what that vertical pci slot looking thing is for ?


Probably for a blower-type fan that you can install in a PCI slot.


----------



## AlphaC

Newegg is trolling with 650/1200W PSU combos on their R9 290 series.

MSI R9 290 Gaming $580+shipping http://www.newegg.com/Product/Product.aspx?Item=N82E16814127774
MSI R9 290X Gaming $720+shipping http://www.newegg.com/Product/Product.aspx?Item=N82E16814127773

Sapphire R9 290 Tri-X $550+shipping http://www.newegg.com/Product/Product.aspx?Item=N82E16814202080
Sapphire R9 290X Tri-X $700+shipping http://www.newegg.com/Product/Product.aspx?Item=N82E16814202079

Powercolor LCS $780+shipping http://www.newegg.com/Product/Product.aspx?Item=N82E16814131543


----------



## stickg1

Quote:


> Originally Posted by *HardwareDecoder*
> 
> I noticed the kink but I figured i'd keep my mouth shut if his setup was working good.
> 
> Yeah, my R2 setup looks nowhere near as good as yours but then again it is fully functional so I don't really care
> 
> 
> 
> 
> 
> 
> 
> it's a great case for the $50 I paid for it.
> 
> question: do you know what that vertical pci slot looking thing is for ?


Quote:


> Originally Posted by *aznever*
> 
> probably for a blower type fan that you can install via PCI slot.


That, and there are little PCI-slot fan controllers and things like that. In fact, I think my Arc Mini or my original Arc Midi (maybe both) came with one. That was before the R2s with the fan controller on top of the case.

I can't remember when I bought this R2, but it was around $50-$60 on Newegg, some sort of sale or special. My last three cases have been the Midi, Mini, and Midi R2, so I think it's safe to say I like Fractals. I think they're very stylish but not overwhelming, and they seem to be the perfect size for me. Not hating, but I'm not really into those giant 3-4ft tall cases with room for like 37 radiators and 10 power supplies. Then again, I've never set out to build a PC that would need all that space.


----------



## stickg1

Wow, R9 290s are $580 now? That makes me sad; it was perfectly priced at $400, and I wouldn't pay much more than that for a 290. Although I guess technically I put a $100 waterblock on my card, so I paid $500.

By the way, I'm more stable and got slightly higher clocks after putting the waterblock on. Probably because I can now run +100mV without getting hot; before, at +100mV my VRMs would get into the 90s (C) and the core around 85C.


----------



## HardwareDecoder

Quote:


> Originally Posted by *stickg1*
> 
> Wow, R9 290's are $580 now? That makes me sad, it was perfectly priced when it was $400. I wouldn't pay much more than that for a 290. Although I guess technically I put a $100 waterblock on my card. So I paid $500.
> 
> By the way, I am more stable and got a little higher clocks after putting the waterblock on it. Probably because I can run +100mV no problem without getting hot now. Before if I ran +100mV my VRMs would get in the 90C's and core around 85C.


I wish I could overclock my cards to their full potential, but something funny is going on between these cards and QNIX monitors for a lot of people; not everyone, but a lot of us. Yeah, Fractal is really nice! My next build in about two years is going to be totally insane though, and I'll probably need one of those giant Case Labs cases then.


----------



## the9quad

Realistically, would a 360 and a 240 rad cool 3 of these and a processor?


----------



## HardwareDecoder

Quote:


> Originally Posted by *the9quad*
> 
> Realistically, would a 360 and a 240 rad cool 3 of these and a processor?


I think you'd really want more rad than that for 3 of these beasts and a processor. I'd think a 480 + 360 would be sufficient, or any combo of 240s and 360s totalling around 700-800mm.
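For what it's worth, that ballpark can be sanity-checked with a common rule of thumb of very roughly 150 W of heat dissipated per 120 mm of radiator at fairly aggressive fan speeds. All the wattages below are my own ballpark assumptions, not measurements:

```python
# Rough radiator-sizing sketch. Rule of thumb (assumed, not measured):
# ~150 W of heat per 120 mm radiator section at aggressive fan speeds.
HEAT_PER_120MM_W = 150

# Ballpark heat loads for an overclocked triple-290 rig (assumptions):
loads_w = {
    "R9 290 #1": 300,
    "R9 290 #2": 300,
    "R9 290 #3": 300,
    "CPU (overclocked)": 150,
}

total_w = sum(loads_w.values())        # 1050 W total
sections = total_w / HEAT_PER_120MM_W  # 7.0 radiator sections
needed_mm = sections * 120             # 840 mm of radiator

print(f"{total_w} W -> ~{needed_mm:.0f} mm of radiator (e.g. 480 + 360)")
```

Slower, quieter fans can roughly double the radiator area you'd want, which is why estimates for a rig like this vary so widely.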


----------



## the9quad

Quote:


> Originally Posted by *HardwareDecoder*
> 
> I think you'd really want more rad then that for 3 of these beasts and a processor. I'd think 480+360 would be sufficient. Or any combo of 240's and 360's equalling around 700-800mm


That's what I figured as well. The HAF X is killing me; there's just no room for that many rads.


----------



## HardwareDecoder

Quote:


> Originally Posted by *the9quad*
> 
> Thats what I figured as well. HAFX is killin me just no room for that many rads.


Might be time for a case upgrade then. I upgraded myself because I couldn't fit more than one 240mm rad in my NZXT Tempest 210 and had to top-mount my first 240mm rad since the HDD rack wasn't removable.

Your rig is already nuts; it would be really cool to see you watercool it. It would silence those 3 jet engines you've got, too.

It's going to be pretty expensive though. You'll need a bunch of rads, 3 water blocks, a D5 pump, a res, tubing, a ton of fittings, and a CPU block.

If I were you I'd get this: you'll still need whatever bridges connect between the cards; I wasn't sure what the right ones were....



But damn, that is going to be a lot of money to cool your whole system. I just felt like scheming it all out since I was bored.









FrozenCPU has an OCN discount code; I think it's OCN55.


----------



## Slomo4shO

Quote:


> Originally Posted by *stickg1*
> 
> Wow, R9 290's are $580 now? That makes me sad, it was perfectly priced when it was $400. I wouldn't pay much more than that for a 290.


It depends on where you look; ShopBLT seems to have most models at ~$20 above MSRP, but most are on backorder:
Quote:


> Originally Posted by *Slomo4shO*
> 
> Just ordered 2 XFX R9 290 BLACK EDITION from ShopBLT.com yesterday. Seems to be the only reasonably priced merchant currently. They are out of stock on most models but do provide back orders with details of incoming inventory.


The XFX R9-290A-ENFC is expected to be in stock on 1/5, but most other models are not expected to be available before 1/10.


----------



## Menphisto

What is the maximum reasonably safe voltage for a 290?


----------



## Forceman

Quote:


> Originally Posted by *Menphisto*
> 
> What is the maximum pretty safe voltage for a 290 ?


No one knows for sure, but as long as you don't exceed the +100mV from Afterburner I think you'd be fine. So something around 1.3V under load (after Vdroop).


----------



## Menphisto

OK, I now have +25mV for 1100 core and 1375 memory.


----------



## Sazz

Prices of R9 290s are pretty high, but the 290Xs are not far off from their launch price of $579.99 (the 290's launch price was $399.99). Bitcoin/litecoin mining started it, and retailers probably kept marking the 290 up because the performance difference between the two cards is small while the price difference was huge. At $450-480 I'd call it a good price, but at $500+, just go for a 290X, or a 780 if you're open to the Nvidia camp.

Quote:


> Originally Posted by *Menphisto*
> 
> OK i have now +25 mV for 1100 core and 1375 memory


OCCT error-checking stable?

I wonder how many people actually use OCCT for error checking on their overclocks. I don't get any visual instability at my 1150MHz core clock, but when I run the OCCT error check I get several errors, and I had to back down to 1115MHz (stock voltage).

And I've just confirmed it myself with further testing: black screens are the result of unstable memory clocks. I got black screens playing BF4/Tomb Raider with my memory set at 1500 at stock aux voltage; downclocking it bit by bit, I came to a 1425MHz memory clock and didn't experience black screens in 5 hours of gaming, whereas with an unstable memory clock I get a black screen within a couple of hours.

If you get black screens at stock clocks, then it's probably a bad-memory problem, but if you're OC'ing and getting black screens, tone down your memory clocks.
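That "downclock it bit by bit" hunt can also be done as a binary search between a known-good and a known-bad clock. A minimal sketch; `run_stability_test` here is a hypothetical stand-in for whatever you actually use (hours of BF4, OCCT error checking, etc.):

```python
def find_stable_clock(good_mhz, bad_mhz, run_stability_test, step=5):
    """Binary-search the highest clock that still passes the test.

    good_mhz: a clock known to be stable; bad_mhz: one known to fail.
    run_stability_test: callable(mhz) -> True if that clock is stable.
    """
    while bad_mhz - good_mhz > step:
        mid = (good_mhz + bad_mhz) // 2
        if run_stability_test(mid):
            good_mhz = mid   # passed: the stable floor moves up
        else:
            bad_mhz = mid    # failed: the unstable ceiling moves down
    return good_mhz

# Example with a fake test that "fails" above 1425 MHz:
best = find_stable_clock(1250, 1500, lambda mhz: mhz <= 1425)
print(best)  # prints 1425
```

Each run of the real test is hours long, so halving the search range every pass beats stepping down 25 MHz at a time.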


----------



## Durvelle27

Quote:


> Originally Posted by *HardwareDecoder*
> 
> good news is you aren't alone... I have the same issue, but mine seems to be related to core clock + voltage+refresh I can set +100mV/1100/110hz even though I can run 1100 on my stock volts.
> 
> But if I set 1200 +100mV/110hz it screws up. I can do 1200/+100mV/60hz though. I ordered a 3 ft cable from mono price which is supposed to be the best qnix overclocking cable so we will see if it helps at all.
> 
> your tube routing is fine btw. It looks alot like mine, screw how it looks if it works good that is all that matters. Mine has to be kind of long so I can pull the pump/res unit out to fill it etc.


Is that a Midi R2?


----------



## HardwareDecoder

Quote:


> Originally Posted by *Durvelle27*
> 
> Is that a midi R2


yes


----------



## Mas

Hey guys, sorry for the lazy question, but I've been out of the country on holiday for a couple of weeks and away from the computer, and I'm still badly jetlagged and don't feel like doing research at the moment.

Are the new 13.12 drivers worth installing? I've been golden with absolutely zero problems (no artifacting, no black screens, etc.) on the 13.11 beta 9.2, and quite frankly I'm afraid of changing anything with all the issues people have been experiencing, lol.


----------



## Durvelle27

Quote:


> Originally Posted by *HardwareDecoder*
> 
> yes


I love my R2. Any plans for a theme?


----------



## DeadlyDNA

Quote:


> Originally Posted by *the9quad*
> 
> Realistically, would a 360 and a 240 rad cool 3 of these and a processor?


I had a 6x120 rad for 4 GPUs and the CPU, but I had ordered 2 more rads. Then I had my water leak, which may have been from pressure or my lack of watercooling skills. I then added 2 dual-200mm rads; all of this is of course outside the case. Now I'm around 40-44C at full load. My setup looks like hell right now, but it's a work in progress...

I have an issue with one of my cards; I think it's the Sapphire I ordered last. I sometimes get a blackout while gaming at stock speeds. It doesn't crash, but the screen goes black for 5 seconds and then comes back. What's weird is that it doesn't do it while benchmarking, only sometimes while gaming. I guess I'll need to take some time and use my PCI-E switches to run one card at a time to figure it out.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Durvelle27*
> 
> I love my R2. Any plans for theme


You mean as in making it look good? No, I kind of gave up on that.







I'm more of a functionality kind of guy anyway, and my system runs great even if my color scheme (white/grey/blue mobo with red RAM) is kind of off and my tubing runs look like crap. They had to be long for functionality reasons.


----------



## Durvelle27

Quote:


> Originally Posted by *HardwareDecoder*
> 
> you mean as in making it look good? No, I kind of gave up on that
> 
> 
> 
> 
> 
> 
> 
> I'm more a functionality kind of guy anyway and it my system runs great even if my color scheme (white/grey/blue mobo w/ red ram) is kind of off and my tubing runs look like crap. They had to be long for functionality reasons.


----------



## Gunderman456

I bought straight G1/4 compression fittings for 3/8in ID tubing, and 90-degree G1/4 compression fittings for 3/8in ID, 1/2in OD tubing.

The ring for the straight fittings fits over the tube no problem, but the ring for the 90-degree fittings is too small to go over the tube so that I can twist it tight. Did I buy wrong? What size of 90-degree compression fitting should I have gotten?


----------



## aznever

Probably. You need to check the OD of your tubing and match it with any compression fittings you're trying to use.


----------



## Gunderman456

On the tube box it says 3/8" ID - 5/8" OD.

The straight fittings are these, and the ring fits:

http://www.ncix.ca/products/?sku=28959

And the 90-degree fittings are these, and the ring doesn't fit:

http://products.ncix.com/detail/bitspower-silver-shining-dual-rotary-90-degree-compression-fitting-id-3-8in-od-1-2in-g1-4-ff-74620-1234.htm

So what do I get for 90-degree compression fittings?


----------



## kizwan

Quote:


> Originally Posted by *Gunderman456*
> 
> On the tube box is says 3/8" ID - 5/8" OD.
> 
> The straight fittings are these and the ring fits;
> 
> http://www.ncix.ca/products/?sku=28959
> 
> And the 90 degree fitting are these and the ring don't fit;
> 
> http://products.ncix.com/detail/bitspower-silver-shining-dual-rotary-90-degree-compression-fitting-id-3-8in-od-1-2in-g1-4-ff-74620-1234.htm
> 
> So what do I get for 90 compression fittings?


The 90 degrees you got is in wrong size. You should buy one for 3/8"ID 5/8"OD tubing.

Your straight compression fittings are for *3/8" ID 5/8" OD* tubing, while the 90-degree compression fittings are for *3/8" ID 1/2" OD* tubing.
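For anyone double-checking an order, the rule of thumb is that a compression fitting only works when both its rated ID and OD match the tubing exactly. A minimal sketch of that check (`fitting_fits` is a hypothetical helper, sizes in inches):

```python
from fractions import Fraction

def fitting_fits(tube_id, tube_od, fitting_id, fitting_od):
    """A compression fitting only seals if both its rated ID and OD
    match the tubing exactly -- a 1/2" OD ring will not slide over
    5/8" OD tube."""
    return tube_id == fitting_id and tube_od == fitting_od

# 3/8" ID, 5/8" OD tubing
tube = (Fraction(3, 8), Fraction(5, 8))

# straight fitting rated 3/8" ID, 5/8" OD -- fits
print(fitting_fits(*tube, Fraction(3, 8), Fraction(5, 8)))  # True

# 90-degree fitting rated 3/8" ID, 1/2" OD -- ring too small
print(fitting_fits(*tube, Fraction(3, 8), Fraction(1, 2)))  # False
```

Using exact fractions avoids the floating-point surprises you can get comparing sizes like 0.625 after arithmetic.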


----------



## Gunderman456

Quote:


> Originally Posted by *kizwan*
> 
> The 90-degree fittings you got are the wrong size. You should buy ones for 3/8" ID 5/8" OD tubing.
> 
> Your straight compression fittings are for *3/8" ID 5/8" OD* tubing, while the 90-degree compression fittings are for *3/8" ID 1/2" OD* tubing.


Thanks, I'm glad you were here to help as well as aznever.

I looked at the online stores where I bought these, and none had them, which makes it easy for a newcomer to watercooling to make a mistake, since the right option wasn't there in the first place. It looks like I have to buy those from the US. Oh, Canada...


----------



## kizwan

Quote:


> Originally Posted by *Gunderman456*
> 
> Thanks, I'm glad you were here to help as well as aznever.
> 
> I looked at the online stores where I bought these, and none had them, which makes it easy for a newcomer to watercooling to make a mistake, since the right option wasn't there in the first place. It looks like I have to buy those from the US. Oh, Canada...


Oh, Canada... Did you forget DazMode?

90-Degree *3/8"ID - 5/8"OD* Compression Fitting - Silver
https://www.dazmode.com/store/product/90-degree_3_8_id_-_5_8_od_compression_fitting_-_silver/


----------



## Gunderman456

Yes, I checked and they had it! I finished ordering too!

PS: It's people like you that make this place excellent, by the way. Always help when you can, I say.


----------



## Slomo4shO

Quote:


> Originally Posted by *Gunderman456*
> 
> I looked at the online stores where I bought these, and none had them, which makes it easy for a newcomer to watercooling to make a mistake, since the right option wasn't there in the first place.


You can always opt for a 90-degree adapter instead...


----------



## esqueue

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Might be time for a case upgrade then. I upgraded myself 'cause I couldn't fit more than one 240mm rad in my NZXT Tempest 210 and had to top-mount my first 240mm rad since the HDD rack wasn't removable.
> 
> Your rig is already nuts it would be really cool to see you water cool it. Would silence those 3 jet engines you got too.
> 
> Gonna be pretty expensive though. You are going to need a bunch of rads, 3 water blocks, a D5 pump, a res, tubing, and a ton of fittings.
> 
> If I were you I'd get this; you still need whatever bridges connect between the cards also... I wasn't sure what the right ones were...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> but damn, that is going to be a lot of money to cool your whole system. I just felt like scheming it all out since I was bored
> 
> 
> 
> 
> 
> 
> 
> 
> 
> FrozenCPU has an OCN discount code; I think it is ocn55


He's also going to need fans


----------



## HardwareDecoder

Quote:


> Originally Posted by *esqueue*
> 
> He's also going to need fans


Yeah, the kit comes with 3 fans, so he will need more for the second rad.


----------



## Gunderman456

Is there a tool you use to tighten the round compression fittings other than a face towel and a monkey wrench, which can still cause metal shavings?


----------



## Slomo4shO

Quote:


> Originally Posted by *HardwareDecoder*
> 
> yea, the kit comes with 3 fans so he will need more for the second rad.


The kit also comes with 6 ft of tubing and 6 fittings, would just need a bridge for the cards and 4 more compression fittings.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Slomo4shO*
> 
> The kit also comes with 6 ft of tubing and 6 fittings, would just need a bridge for the cards and 4 more compression fittings.


I added extra tubing because it is always good to have extra. I have never used a bridge, so I overestimated the fittings. Thanks for the info; do you know exactly which bridge someone would need? I wasn't sure...


----------



## Slomo4shO

EK-FC Bridge TRIPLE Serial Z77 CSQ - Acetal or EK-FC Bridge TRIPLE Parallel Z77 CSQ - Acetal for the ASUS Sabertooth X79 board. Also, it would be better to buy the extra tubing by the foot.









I just ordered the 360mm kit yesterday for my planned A10-7850K build


----------



## Forceman

Quote:


> Originally Posted by *Gunderman456*
> 
> Is there a tool you use to tighten the round compression fittings other than a face towel and a monkey wrench, which can still cause metal shavings?


You talking about the ring-like piece that goes over the tube? You can just hand-tighten them; you don't really need to torque them down.


----------



## Gunderman456

No, the part that screws into the reservoir/rads etc... I was told on the water cooling club that hand tight is all I needed.


----------



## VSG

Yeah, hand-tight is enough; the O-ring is enough to grip it and prevent leakage.


----------



## MrWhiteRX7

I'm running 2 of the Phobya 1080 rads for my trifire 290 setup and 4820k. Just waiting on my water blocks but I think I should be good









Got some cool parts from TSM106 as well, really good OCN'er!


----------



## HardwareDecoder

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I'm running 2 of the *Phobya 1080 rads* for my trifire 290 setup and 4820k. Just waiting on my water blocks but I think I should be good
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Got some cool parts from TSM106 as well, really good OCN'er!


is that a 1080mm rad?


----------



## MrWhiteRX7

Each 1080 rad is basically 9x120mm rads put together lol and I have two of those. I have 18 x 120mm fans.


----------



## HardwareDecoder

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Each 1080 rad is basically 9x120mm rads put together lol and I have two of those. I have 18 x 120mm fans.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *HardwareDecoder*


LOL I went this route with the intent of probably adding a 4th 290 gpu. My buddy says I'm overkill but I'd rather have more than necessary when it comes to cooling.


----------



## HardwareDecoder

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> LOL I went this route with the intent of probably adding a 4th 290 gpu. My buddy says I'm overkill but I'd rather have more than necessary when it comes to cooling.


How is the micro stutter on that setup? I just got into watercooling myself, but I'm already totally addicted and already wish I could afford to replace my XSPC 750 pump with a better one, since I think it is holding me back from even better temps. I wish I had bought a kit with a D5 in it, but oh well, it works for now. I don't want to take my whole loop apart though, and my wife is going to literally murder me if I buy any more WC parts.

anyway /end rant nice setup you got man.

As I was typing this I realized I won't have to take much apart just to replace my pump with a D5.......

*starts scheming on how to afford one*


----------



## MrWhiteRX7

Quote:


> Originally Posted by *HardwareDecoder*
> 
> How is the micro stutter on that setup? I just got into watercooling myself, but I'm already totally addicted and already wish I could afford to replace my XSPC 750 pump with a better one, since I think it is holding me back from even better temps. I wish I had bought a kit with a D5 in it, but oh well, it works for now. I don't want to take my whole loop apart though, and my wife is going to literally murder me if I buy any more WC parts.
> 
> anyway /end rant nice setup you got man.


Thank you









It's been a costly setup, but I love it and don't regret it. I'm an addicted hardware enthusiast haha.

Dude I'm just using an external D5, they're not too expensive and it shouldn't be bad adding that to the loop so you can go more extreme









No micro stutter! I've had 4x7970's and hated it so went down to 3x7970's and had some pretty good success but this just absolutely blows it away which is why I'm thinking of adding a 4th now


----------



## HardwareDecoder

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Thank you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's been a costly setup, but I love it and don't regret it. I'm an addicted hardware enthusiast haha.
> 
> Dude I'm just using an external D5, they're not too expensive and it shouldn't be bad adding that to the loop so you can go more extreme
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No micro stutter! I've had 4x7970's and hated it so went down to 3x7970's and had some pretty good success but this just absolutely blows it away which is why I'm thinking of adding a 4th now


what i'm unsure of is do I need to buy this:

http://www.frozencpu.com/products/16144/ex-res-372/XSPC_Dual_525_Bay_Black_Reservoir_-_w_D5_Variant_Pump_Installed.html

or just the pump?

http://www.frozencpu.com/products/19457/ex-pmp-234/XSPC_D5_Vario_Pump_Motor_w_Mount_Ring_-_Single_Edition.html

I don't have a standalone res right now, just the XSPC X2O 750 pump/res combo. I'm wondering if I can just use the 750 as a res or what.

Do you have pics of your setup I could look at?

I had 2x 7950s and got really tired of xfire; I'm gonna stick to one card for a while, plus I can't afford another card.


----------



## MrWhiteRX7

One card is plenty







I'm actually planning to post a nice rig thread once the blocks are in, I'll at least be opening the case up this weekend and starting prep work. I'll snap some shots of how it's all combined in there hahahaha


----------



## MrWhiteRX7

Quote:


> Originally Posted by *HardwareDecoder*
> 
> what i'm unsure of is do I need to buy this:
> 
> http://www.frozencpu.com/products/16144/ex-res-372/XSPC_Dual_525_Bay_Black_Reservoir_-_w_D5_Variant_Pump_Installed.html
> 
> or just the pump?
> 
> http://www.frozencpu.com/products/19457/ex-pmp-234/XSPC_D5_Vario_Pump_Motor_w_Mount_Ring_-_Single_Edition.html
> 
> I don't have a stand alone res right now, just the xspc x2o 750 pump/res combo. I'm wondering if I can just use the 750 as a res or what.
> 
> Do you have pics of your setup I could look at?
> 
> I had 2x 7950s and got really tired of xfire; I'm gonna stick to one card for a while, plus I can't afford another card.


Oh, and I'm not sure about your res/pump combo situation. I almost went that route but decided to do my own res/D5 setup. I have a tube res that basically mounts right onto my external D5.


----------



## HardwareDecoder

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Oh, and I'm not sure about your res/pump combo situation. I almost went that route but decided to do my own res/D5 setup. I have a tube res that basically mounts right onto my external D5.


I see. I think I really don't need to change it right now anyway. I just see people with like 42-45C load temps on the 290/X with +voltage, and mine gets to like 50C in BF4 with no +voltage. So I'm running a bit hotter, and I think it is my pump not being powerful enough to push through a CPU block, two rads, and a GPU block. I should probably just be happy that everything works for now and just upgrade my pump whenever I do a new build.


----------



## kizwan

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> One card is plenty
> 
> 
> 
> 
> 
> 
> 
> I'm actually planning to post a nice rig thread once the blocks are in, I'll at least be opening the case up this weekend and starting prep work. I'll snap some shots of how it's all combined in there hahahaha


Did you order the water block from FrozenCPU or PPC?


----------



## MrWhiteRX7

Yeah, don't tear into your setup just to do a pump right now if nothing else is going to be changed. Those temps are still perfectly fine haha


----------



## HardwareDecoder

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Yeah, don't tear into your setup just to do a pump right now if nothing else is going to be changed. Those temps are still perfectly fine haha


I'm like terrible at never leaving anything alone.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *HardwareDecoder*
> 
> I'm like terrible at never leaving anything alone.


I'm the same way!!!!! It's an expensive problem lol


----------



## HardwareDecoder

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I'm the same way!!!!! It's an expensive problem lol


Yeah, I really need to just leave stuff alone for a year or two; this rig will be perfectly fine for two years if I let it.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Yeah, I really need to just leave stuff alone for a year or two; this rig will be perfectly fine for two years if I let it.


Humbug? Where's the fun in that?


----------



## HardwareDecoder

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Humbug? Where's the fun in that?


well, the fun is in staying alive


----------



## MrWhiteRX7

Quote:


> Originally Posted by *kizwan*
> 
> Did you order the water block from FrozenCPU or PPC?


PPC


----------



## Forceman

Quote:


> Originally Posted by *HardwareDecoder*
> 
> I see. I think I really don't need to change it right now anyway. I just see people with like 42-45C load temps on the 290/X with +voltage, and mine gets to like 50C in BF4 with no +voltage. So I'm running a bit hotter, and I think it is my pump not being powerful enough to push through a CPU block, two rads, and a GPU block. I should probably just be happy that everything works for now and just upgrade my pump whenever I do a new build.


Is that pump speed controllable? It should be powerful enough for what you have, but you might want to try adjusting the speed if you can. I turned my D5 down a little, and I'm getting better temps that way (about 5C on the VRMs), either because the pump is putting less heat in or because the water is moving a little slower and can transfer the heat better.


----------



## Sazz

Quote:


> Originally Posted by *Forceman*
> 
> Is that pump speed controllable? It should be powerful enough for what you have, but you might want to try adjusting the speed if you can. I turned my D5 down a little, and I'm getting better temps that way (about 5C on the VRMs), either because the pump is putting less heat in or because the water is moving a little slower and can transfer the heat better.


The D5 pump doesn't really add much heat compared to the MCP350/355/35X pumps; it's the latter part. You just need to find the right flow rate for your setup: too fast and the liquid doesn't stay long enough in the rad to transfer heat, but if it's too slow the liquid stays too long in the block.
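To put rough numbers on the flow-rate tradeoff: for a fixed heat load, the water's temperature rise across the blocks follows ΔT = P / (ṁ · c_p), so higher flow means a smaller rise per pass, while the same heat still has to leave through the rads. A sketch with illustrative figures (the 600 W load and flow rates are assumptions, not measurements of any loop in this thread):

```python
def coolant_delta_t(heat_load_w, flow_lpm):
    """Temperature rise of water across the heat sources:
    dT = P / (m_dot * c_p), taking water as ~1 kg per litre
    with specific heat c_p ~ 4186 J/(kg*K)."""
    c_p = 4186.0                 # J/(kg*K)
    m_dot = flow_lpm / 60.0      # kg/s (1 L of water ~ 1 kg)
    return heat_load_w / (m_dot * c_p)

# ~600 W loop (CPU + multiple GPUs) at a few plausible pump flow rates
for lpm in (2.0, 4.0, 8.0):
    print(f"{lpm:.0f} L/min -> water dT = {coolant_delta_t(600, lpm):.2f} C")
```

Even at a modest 2 L/min the water only rises a few degrees per pass, which is why loop temperature is dominated by radiator capacity rather than pump speed.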


----------



## kizwan

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> PPC


I'm going to order the parts directly from EK. Some items are on backorder though. (I haven't checked out yet; still looking to see if I need anything else.)


----------



## prostreetcamaro

Quote:


> Originally Posted by *Sazz*
> 
> The D5 pump doesn't really add much heat compared to the MCP350/355/35X pumps; it's the latter part. You just need to find the right flow rate for your setup: *too fast and the liquid doesn't stay long enough in the rad to transfer heat*, but if it's too slow the liquid stays too long in the block.


You guys have got it all wrong. Coming from the high-performance automotive world, I know this for a fact. It can be a real pain to get a cooling setup to perform well on a high-horsepower, highly modified car. Many things go into it, but slowing down the flow so the water stays in the radiator longer to cool off more is the worst thing you can do. While it stays in the radiator longer, what does the water in the engine (or, in this example, the water blocks) do? It stays there longer and heats up higher before it leaves. You want high flow so the water can get in and out of everything quickly.

http://www.arrowheadradiator.com/14_rules_for_improving_engine_cooling_system_capability_in_high-performance_automobiles.htm


----------



## rdr09

Quote:


> Originally Posted by *Mas*
> 
> Hey guys, sorry about lazy question, but I've been out of the country on holidays for a couple of weeks and away from the computer, and am still badly jetlagged and don't feel like doing research at the moment.
> 
> Are the new 13.12 drivers worth installing? I have been golden with absolutely zero problems (no artifacting, no black screens, etc) on the 13.11 beta9.2 and quite frankly am afraid of changing anything with all the issues people have been experiencing lol -_-


I suggest reading the release notes. If none of them apply to what you are looking for, then stick with what works. I know 13.12 raised my VDDC a bit as well as my temps, so I reverted back to 13.11. It works great and plays all my games except NFS ProStreet.







Good thing I have my HD 7770 for that. BF4 is what I bought this card for.


----------



## Slomo4shO

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Each 1080 rad is basically 9x120mm rads put together lol and I have two of those. I have 18 x 120mm fans.


Are you running push/pull on both? I am currently waiting on washers to mount my 6 rads (4x 45mm-thick rads in push/pull and 2x 30mm rads in push) in my 900D, since the screws from Alphacool are too narrow. This build has been running into too many delays.









I would be curious to see how my quadfire loop compares to your trifire temps.


----------



## esqueue

Quote:


> Originally Posted by *prostreetcamaro*
> 
> You guys have got it all wrong. Coming from the high-performance automotive world, I know this for a fact. It can be a real pain to get a cooling setup to perform well on a high-horsepower, highly modified car. Many things go into it, but slowing down the flow so the water stays in the radiator longer to cool off more is the worst thing you can do. While it stays in the radiator longer, what does the water in the engine (or, in this example, the water blocks) do? It stays there longer and heats up higher before it leaves. You want high flow so the water can get in and out of everything quickly.
> 
> http://www.arrowheadradiator.com/14_rules_for_improving_engine_cooling_system_capability_in_high-performance_automobiles.htm


I'm sure you are aware of this, but there are many aspects of cooling something as hot as an engine that don't apply to PC cooling. Pressure capability is one of the major factors in automotive cooling, which is the reason behind pressure-rated caps and the reason an aluminum radiator can perform better than a copper radiator in cars. The extreme temps and dissimilar metals aren't going to be an issue in most PC cooling setups either. A slow-flow setup will never get the water close to boiling, so it can take in quite a bit of heat before going through the radiators, while in a car it will boil and all hell will break loose. This is even more so in a high-HP setup. You can never have too much flow in an automobile setup. Just some FYI for those who care.

Yes, I agree with you. People say that if water is moving too fast, it doesn't pick up enough heat. While there is a limit beyond which a higher-flow pump will not help, if the radiators are set up to remove more heat than the components produce, they will be golden. The water picks up less heat per pass, but there is more water working and each pass has less heat to disperse through the radiator, so it balances out in the end. I don't know how pump heat may interfere with the setup though; a higher-flow pump should generate more heat.

Let's recast the water picking up heat from your components and moving it to a capable radiator as workers moving boxes. Normal flow is 2 workers each carrying 2 boxes, taking a 1-minute round trip to get the boxes to their destination and back. That's 4 boxes per minute between the 2 workers. A high-flow pump is those same 2 workers carrying only 1 box each and taking 30 seconds per trip. The same 4 boxes per minute.

I am definitely not an expert in either field, but I have a good understanding of how things work, especially in automobiles. If someone has some info on a higher-flow pump actually not performing as well as a lower-flow one, I'd love to read up on it. Pump heat may be more of an issue than I thought; it is never an issue in an automobile.
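The box-moving analogy above reduces to throughput = workers × load per trip ÷ trip time, so halving the load per trip while halving the trip time leaves throughput unchanged. A trivial sketch:

```python
def throughput(workers, boxes_per_trip, trip_minutes):
    """Boxes moved per minute = workers * load per trip / round-trip time."""
    return workers * boxes_per_trip / trip_minutes

normal = throughput(2, 2, 1.0)     # 2 workers, 2 boxes each, 1-minute trips
high_flow = throughput(2, 1, 0.5)  # same workers, lighter load, faster trips
print(normal, high_flow)  # 4.0 4.0
```

Same total heat moved per minute either way, which is the point of the analogy: flow rate redistributes the work, it doesn't create or destroy it.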


----------



## the9quad

Esqueue, dissimilar metals are more of a corrosion concern than a heat-transfer concern. It has to do with metals having different electrochemical potentials in an electrolytic solution, which leads to galvanic corrosion. You can combat it by using the same metals and/or sacrificial anodes.

I have never built a water cooled rig, but I have taken and taught my share of heat transfer, fluid flow, materials, and chemistry classes. And I work in a field where keeping things cool is pretty much a necessity.


----------



## PearlJammzz

I am looking to pick up a water block. Are there any to stay away from? One that performs better than the rest? Complete personal preference?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Slomo4shO*
> 
> Are you running push/pull on both? I am currently waiting on washers to mount my 6 rads (4x 45mm-thick rads in push/pull and 2x 30mm rads in push) in my 900D, since the screws from Alphacool are too narrow. This build has been running into too many delays.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I would be curious to see how my quadfire loop compares to your trifire temps.


Pull only. I'll see how that works at first and then go push/pull if need be. LOL, that would be a total of 36 fans.


----------



## prostreetcamaro

Quote:


> Originally Posted by *esqueue*
> 
> I'm sure you are aware of this, but there are many aspects of cooling something as hot as an engine that don't apply to PC cooling. Pressure capability is one of the major factors in automotive cooling, which is the reason behind pressure-rated caps and the reason an aluminum radiator can perform better than a copper radiator in cars. The extreme temps and dissimilar metals aren't going to be an issue in most PC cooling setups either. A slow-flow setup will never get the water close to boiling, so it can take in quite a bit of heat before going through the radiators, while in a car it will boil and all hell will break loose. This is even more so in a high-HP setup. You can never have too much flow in an automobile setup. Just some FYI for those who care.
> 
> Yes, I agree with you. People say that if water is moving too fast, it doesn't pick up enough heat. While there is a limit beyond which a higher-flow pump will not help, if the radiators are set up to remove more heat than the components produce, they will be golden. The water picks up less heat per pass, but there is more water working and each pass has less heat to disperse through the radiator, so it balances out in the end. I don't know how pump heat may interfere with the setup though; a higher-flow pump should generate more heat.
> 
> Let's recast the water picking up heat from your components and moving it to a capable radiator as workers moving boxes. Normal flow is 2 workers each carrying 2 boxes, taking a 1-minute round trip to get the boxes to their destination and back. That's 4 boxes per minute between the 2 workers. A high-flow pump is those same 2 workers carrying only 1 box each and taking 30 seconds per trip. The same 4 boxes per minute.
> 
> I am definitely not an expert in either field, but I have a good understanding of how things work, especially in automobiles. If someone has some info on a higher-flow pump actually not performing as well as a lower-flow one, I'd love to read up on it. Pump heat may be more of an issue than I thought; it is never an issue in an automobile.


I am sure there are some differences, but the basics should still apply in the end. I played hell getting my 600hp Camaro to stay cool in 90+ degree daytime temps. I finally got it dialed in, but man, it beat me up pretty bad getting there. A 600hp pump-gas N/A small block produces a ton of heat to dissipate. 4.56 gears making the engine sing at 3000+ RPM at only 50mph don't help either. I've got new 3.90s sitting here I need to get put in this spring.


----------



## esqueue

Quote:


> Originally Posted by *the9quad*
> 
> Esqueue, dissimilar metals are more of a corrosion concern than a heat-transfer concern. It has to do with metals having different electrochemical potentials in an electrolytic solution, which leads to galvanic corrosion. You can combat it by using the same metals and/or sacrificial anodes.
> 
> I have never built a water cooled rig, but I have taken and taught my share of heat transfer, fluid flow, materials, and chemistry classes. And I work in a field where keeping things cool is pretty much a necessity.


Yeah, I was listing issues you will deal with in almost all automobiles but shouldn't deal with in computers. I was going in one direction but lost track of what I was typing about. That is one of the reasons automobiles need proper coolant, whereas our PCs with any half-decent loop can do fine with distilled water and something to function as a biocide. I'm sure I've seen a few Cooler Master systems that mix copper blocks with aluminum radiators, which is a reason I kind of think of that company as a joke.


----------



## the9quad

New afterburner beta 18 is out

http://www.guru3d.com/files_get/msi_afterburner_beta_download,20.html


----------



## Maximization

silver and copper in computer cooling bad


----------



## Slomo4shO

Quote:


> Originally Posted by *esqueue*
> 
> A slow flow setup will never get water close to boiling so it can take in quite a bit of heat before going through the radiators while in a car it will boil and all hell will break lose.


Nevertheless, the temperature deltas between the block and water, water and radiator, and radiator and air also play a crucial role: larger temperature deltas between two media increase the rate of heat transfer from the hotter medium to the cooler one via conduction.

Fourier's equation of conductive heat transfer is q = k A ΔT / s.
q= heat transfer
k= thermal conductivity
A= surface area
ΔT= temperature difference
s = material thickness

Since k, A, and s are static in a closed loop, one can infer that ΔT has the largest impact on heat transfer. Considering that any given molecule of water will spend the same amount of time within a specific cross-section of the loop over time, I am unsure how relevant flow rate would be in a closed loop if the flow is turbulent throughout and a water wetter is used.
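Plugging illustrative numbers into that conduction relation shows the proportionality to ΔT directly (the copper slab dimensions below are assumed values, not any particular block):

```python
def conductive_heat_transfer(k, area, delta_t, thickness):
    """Fourier's law for a flat slab: q = k * A * dT / s (watts)."""
    return k * area * delta_t / thickness

# Copper base plate: k ~ 400 W/(m*K), 30 mm x 30 mm contact area, 3 mm thick.
# These are illustrative figures, not measurements of any real block.
k, area, s = 400.0, 0.030 * 0.030, 0.003
for dt in (5.0, 10.0, 20.0):
    q = conductive_heat_transfer(k, area, dt, s)
    print(f"dT = {dt:4.1f} C -> q = {q:.0f} W")  # q scales linearly with dT
```

With k, A, and s fixed, doubling ΔT doubles q, which is exactly the inference drawn in the post.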
Quote:


> Originally Posted by *esqueue*
> 
> I'm sure that I've seen a few cooler master systems that mix copper blocks with aluminum radiators which is a reason why I kind of think of that company as a joke.


You mean like every other all-in-one system?


----------



## esqueue

Quote:


> Originally Posted by *Slomo4shO*
> 
> Nevertheless, the temperature deltas between the block and water, water and radiator, and radiator and air also play a crucial role: larger temperature deltas between two media increase the rate of heat transfer from the hotter medium to the cooler one via conduction.
> 
> Fourier's equation of conductive heat transfer is q = k A ΔT / s.
> q= heat transfer
> k= thermal conductivity
> A= surface area
> ΔT= temperature difference
> s = material thickness
> 
> Since k, A, and s are static in a closed loop, one can infer that ΔT has the largest impact on heat transfer. Considering that any given molecule of water will spend the same amount of time within a specific cross-section of the loop over time, I am unsure how relevant flow rate would be in a closed loop if the flow is turbulent throughout and a water wetter is used.
> You mean like every other all-in-one system?
> 
> 
> Spoiler: Warning: Spoiler!


Is the equation you posted above saying that too much flow in a computer system will give negative results? If not, that is what I said in my original statement. You quoted a part where I was trying to say it would not be detrimental to a computer cooling system the way it would be in an automobile.

As for the aluminum radiators, you got me. I guess the all-in-ones use the correct coolant, as used in automobile applications that have mixed metals.

I'm actually leaving right now, which is why I didn't have time to really read the equation you posted, but it all seems like common sense for the most part. I will have to read up on how water wetter helps though, as I've never bothered to read up on how or if it works.

The whole point of my original post was to say that too much flow isn't bad. It seems that you agree too.


----------



## Tennobanzai

Not sure if it was posted yet but new beta AB was released today.

http://www.guru3d.com/files_details/msi_afterburner_beta_download.html


----------



## the9quad

Quote:


> Originally Posted by *Tennobanzai*
> 
> Not sure if it was posted yet but new beta AB was released today.
> 
> http://www.guru3d.com/files_details/msi_afterburner_beta_download.html


LOL, I literally posted that last page, but the more the merrier!










On another note: this is supposed to help with the visual fluctuations people see on their GPUs in AB.

It's purely visual from what I gather in the way AMD handles it, and it doesn't affect performance. There is also an option to display things the way GPU-Z does, which is apparently wrong but looks smoother, if the visual fluctuations still bother ya.


----------



## jerrolds

Quote:


> Originally Posted by *Daveleaf*
> 
> I am using Swiftech MCW 80,
> 
> some Enzotech ramsinks (only had 8, so I ordered more), and a piece of copper stock on the VRM with electrical tape.
> 
> I ordered another 2 boxes of Enzotech ramsinks, and VRM sinks, and some seisuki thermal tape.
> 
> 
> 
> This is my 3rd vid card on this block, and since I go through new vid cards every 6 months, this has saved me a lot on custom blocks.


Damn, why didn't I think of this - putting a straight-up copper bar on w/ duct tape. Can you buy copper bars at hardware stores maybe? Or maybe FrozenCPU/Performance-PCs?


----------



## kdawgmaster

Quote:


> Originally Posted by *esqueue*
> 
> Is the equation you posted above saying that too much flow in a computer system will give negative results? If not, that is what I said in my original statement. You quoted a part where I was trying to say it would not be detrimental to a computer cooling system the way it would be in an automobile.
> 
> As for the aluminum radiators, you got me. I guess the all-in-ones use the correct coolant, as used in automobile applications that have mixed metals.
> 
> I'm actually leaving right now, which is why I didn't have time to really read the equation you posted, but it all seems like common sense for the most part. I will have to read up on how water wetter helps though, as I've never bothered to read up on how or if it works.
> 
> The whole point of my original post was to say that too much flow isn't bad. It seems that you agree too.


In a computer there's negative and positive air pressure. Depending on the case, simply adding more fans will not necessarily give a proper setup. The way that has worked best in everything I've done is to have more fans exhausting air out of the case than pulling air in. This normally makes air transfer through the case more seamless, with fewer stagnant spots, letting heat escape as quickly as possible. My setup in the Cosmos 2 is 1 front (intake), 1 back (exhaust), 2 side (intake), and 3 top (exhaust); this has given me the best results so far. With a water cooler, I've always found it cools better to take in outside air (not case air), as that is ambient-temperature air cooling the rad.
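The positive/negative-pressure idea comes down to comparing total intake versus exhaust airflow; the layout described above (3 intakes, 4 exhausts) nets out slightly negative. A back-of-the-envelope sketch (`net_case_airflow` is a hypothetical helper; 60 CFM per fan is an assumed rating, not a measured value):

```python
def net_case_airflow(intake_cfm, exhaust_cfm):
    """Positive result = positive pressure (more air pushed in than out);
    negative result = negative pressure, with air pulled in through gaps."""
    return sum(intake_cfm) - sum(exhaust_cfm)

# Cosmos 2-style layout: 1 front + 2 side intakes, 1 rear + 3 top exhausts
intakes = [60, 60, 60]
exhausts = [60, 60, 60, 60]
print(net_case_airflow(intakes, exhausts))  # -60 -> slightly negative pressure
```

Real fans move less air against radiator restriction than their rated CFM, so treat this as a rough direction check rather than a measurement.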


----------



## Daveleaf

Quote:


> Originally Posted by *jerrolds*
> 
> Damn, why didn't I think of this - putting a straight-up copper bar on w/ duct tape. Can you buy copper bars at hardware stores maybe? Or maybe FrozenCPU/Performance-PCs?


I got it from a local metal distributor. I called two places: a metal shop wanted 25 dollars; at the distributor it was 8 dollars for a 12" bar, 1/4" x 1-1/4".

So I dremelled a piece off for temporary use, but it turned out to be a great platform piece for this final heatsink layout.


----------



## jerrolds

Quote:


> Originally Posted by *Daveleaf*
> 
> I got it from a local metal distributor. I called two places, a metal shop wanted 25 dollars, at the distributor it was 8 dollar for a 12" bar 1/4" x1 1/4"
> 
> So I dremelled a piece of for temp use, but however I found it great platform pieces for this final heatsink layout.


Nice - I don't have a dremel - can you tell me the final dimensions of that piece? I'm tempted to call around and see what I can get. That long strip of VRM1 gets stupid hot for me even though the core is < 65C @ 1200MHz.

I suppose i can measure it out, but that would require me to reseat the HSF again


----------



## Sazz

Quote:


> Originally Posted by *prostreetcamaro*
> 
> You guys got it all wrong. Coming from the high performance automotive world I know this for a fact. It can be a real pain to get a cooling setup perform well on a high horsepower highly modified car. Many things go into it but slowing down the flow so that it stays in the radiator longer to cool off more is the worst thing you can do. While it stays in the radiator longer what does the water do that is in the engine or in this example the water blocks? It stays there longer and heats up higher before it leaves the water blocks. You want high flow so it can get in and out of everything quickly.
> 
> http://www.arrowheadradiator.com/14_rules_for_improving_engine_cooling_system_capability_in_high-performance_automobiles.htm


Didn't you read what I said? Go ahead read it again, I said "but if its too slow the liquid would stay too long in the block". I did not say to go fully slow on the flow rate, that would have a negative effect as well you just need to find the right balance on your flow rate.


----------



## prostreetcamaro

Quote:


> Originally Posted by *Sazz*
> 
> Didn't you read what I said? Go ahead read it again, I said "but if its too slow the liquid would stay too long in the block". I did not say to go fully slow on the flow rate, that would have a negative effect as well you just need to find the right balance on your flow rate.


Yeah i read over it too quickly. It would be very easy for me to see the difference. My D5 is a variable so I can turn it up and down. I might play around with it when i get back from my cruise. Right now it is running at max speed.


----------



## bond32

The "Force Constant Voltage" option, is this recommended?


----------



## Derpinheimer

Not really. If voltage control works fine without it, all it will do is waste power at idle. Possibly help with overclocking a slight bit.


----------



## northernhorn

Just preordered a Gigabyte Windforce OC R9 290 to go with my soon to arrive QNIX 27" PLS monitor, resolution 2560x1440.

I'm coming from an HD5850 at 1920x1080, and I'm already very pleased with this set up, so this should be a great upgrade for me.


----------



## prostreetcamaro

Quote:


> Originally Posted by *northernhorn*
> 
> Just preordered a Gigabyte Windforce OC R9 290 to go with my soon to arrive QNIX 27" PLS monitor, resolution 2560x1440.
> 
> I'm coming from an HD5850 at 1920x1080, and I'm already very pleased with this set up, so this should be a great upgrade for me.


I thought I had read that Gigabyte stopped production of the Windforce 290 and 290X due to problems with the cooler. You might want to look into that. You don't want to end up with a card that does not cool really well even though you spent all that extra money for the better cooling solution.


----------



## prostreetcamaro

Quote:


> Originally Posted by *Sazz*
> 
> Didn't you read what I said? Go ahead read it again, I said "but if its too slow the liquid would stay too long in the block". I did not say to go fully slow on the flow rate, that would have a negative effect as well you just need to find the right balance on your flow rate.


Just found this.

http://www.xtremerigs.net/2014/01/01/r9-290x-gpu-block-performance-summary/

"As expected cooling performance increases as flow increases. There may be a point at which high flow may hurt performance but this is not shown to be within the normal expected flow rates that users normally run."
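The quoted conclusion lines up with basic heat-transfer arithmetic: for a fixed heat load P, the coolant's temperature rise across a block is dT = P / (mdot * cp), so more flow means a smaller rise and a more uniform loop temperature. A rough back-of-envelope sketch (the 300 W load is an illustrative number, not taken from any benchmark in this thread):

```python
# Hedged sketch, not a loop simulator: temperature rise of water
# flowing through a block carrying a fixed heat load.

CP_WATER = 4186.0  # J/(kg*K), specific heat of water

def delta_t(power_w: float, flow_lpm: float) -> float:
    """Coolant temperature rise across the block, in kelvin."""
    mdot = flow_lpm / 60.0  # L/min -> kg/s (1 L of water is about 1 kg)
    return power_w / (mdot * CP_WATER)

for lpm in (0.5, 1.0, 2.0, 4.0):
    print(f"{lpm:4.1f} L/min -> dT = {delta_t(300.0, lpm):5.2f} K")
```

Doubling the flow halves the rise, which is why "slowing the water down so it cools longer in the rad" loses: whatever extra heat it sheds in the radiator it picks right back up by sitting longer in the block.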


----------



## northernhorn

Quote:


> Originally Posted by *prostreetcamaro*
> 
> I thought I had read gigabyte stopped production of the windforce 290 and 290X due to problems with the cooler. You might want to look into that. You dont want to end up with a card that does not cool really well even thought you spent all that extra money for the better cooling solution.


Thanks for your input. I did read about these issues moments after I ordered my card, but luckily, there's been some case of lost in translation along the way, and it's only the 290x that's affected, and only about 30 units at that. By all accounts, the 290 is fine. Overclockers have January 17th as their eta for these cards, so fingers crossed..


----------



## y2kcamaross

Does the new afterburner fix the power limit setting on the 290/290xs?


----------



## jerrolds

Oh sweet
Quote:


> Hi guys, the Afterburner 3.0.0 Beta 18 is ready for you to download. this version supports the control of NVIDIA and AMD graphics card, if you get the card, you may try to play it.
> 
> Change note:
> • Added low-level I2C access driver for AMD Bonaire, Curacao and Hawaii graphics processors. Low-level I2C access driver provides much faster access to I2C bus than ineffective native AMD ADL I2C access interface and addresses issues with GPU clock throttling when enabling voltage monitoring on AMD RADEON R9 290 series graphics cards
> • Improved AMD ADL access layer with Overdrive 6 support to provide compatibility with future AMD GPUs
> • Regular and SE versions of MSI Afterburner are now merged into single installer. Added voltage control mode selection option to the "Compatibility properties" section in "General" tab. Now you can toggle between reference design, standard MSI and extended MSI voltage control modes
> • Added "boost edition" / "GHz edition" GPU type selection option for reference design AMD RADEON 7970 and AMD RADEON 7950 based graphics cards to "AMD compatibility properties" section in "General" tab
> • Added GPU usage averaging algorithm for Overdrive 6 capable AMD GPUs. Now displayed GPU usage is being averaged by sliding window to smooth GPU usage artifacts occurring due to bug in AMD ADL API on AMD Sea Islands GPU family
> • Added optional unified GPU usage monitoring path via D3DKMT performance counters. Power users may enable it via configuration file as a workaround to replace native vendor's GPU usage monitoring if it is working improperly (e.g. broken GPU usage monitoring in AMD ADL API for AMD Sea Islands GPU family)
> • Added "Use dedicated encoder server" option to "Videocapture" tab
> • RivaTuner Statistics Server has been upgraded to version 6.0.0
> • Included MSI Gaming skins
> 
> Download link:
> http://www.guru3d.com/files_details/..._download.html


I hope he increased the voltage max for 290X cards


----------



## wilflare

is it worth paying an extra $100 to get 290 over the 280x?


----------



## Durvelle27

Quote:


> Originally Posted by *wilflare*
> 
> is it worth paying an extra $100 to get 290 over the 280x?


Yes yes it is.


----------



## wilflare

thanks man! any custom air cooling solution that would work with this?
maybe those corsair watercooling units?


----------



## Durvelle27

Quote:


> Originally Posted by *wilflare*
> 
> thanks man! any custom air cooling solution that would work with this?
> maybe those corsair watercooling units?


Accelero Xtreme III, Accelero Hybrid, Kraken G10 etc...

Read other post in other thread


----------



## bustacap22

Two of my friends recently bought a 290x....One MSI and the other XFX. Both cards are under water and both are having problems with VRM having high temps. Also, I saw somebody here posting similar problems with high VRM temps. I am ready to pull the trigger on purchasing 2 290x but after hearing this I am gun shy now. What am I missing here????


----------



## HardwareDecoder

Quote:


> Originally Posted by *bustacap22*
> 
> Two of my friends recently bought a 290x....One MSI and the other XFX. Both cards are under water and both are having problems with VRM having high temps. Also, I saw somebody here posting similar problems with high VRM temps. I am ready to pull the trigger on purchasing 2 290x but after hearing this I am gun shy now. What am I missing here????


your friends installed the blocks wrong then


----------



## Sgt Bilko

Quote:


> Originally Posted by *bustacap22*
> 
> Two of my friends recently bought a 290x....One MSI and the other XFX. Both cards are under water and both are having problems with VRM having high temps. Also, I saw somebody here posting similar problems with high VRM temps. I am ready to pull the trigger on purchasing 2 290x but after hearing this I am gun shy now. What am I missing here????


Are the pads on the vrm's making good contact with the block?

That's the most common cause I've heard of in here: 1mm pads for the VRMs, 0.5mm for the mem.


----------



## Tennobanzai

Quote:


> Originally Posted by *bustacap22*
> 
> Two of my friends recently bought a 290x....One MSI and the other XFX. Both cards are under water and both are having problems with VRM having high temps. Also, I saw somebody here posting similar problems with high VRM temps. I am ready to pull the trigger on purchasing 2 290x but after hearing this I am gun shy now. What am I missing here????


Sounds like poor contact, if they have full WC blocks.


----------



## Sazz

Quote:


> Originally Posted by *bustacap22*
> 
> Two of my friends recently bought a 290x....One MSI and the other XFX. Both cards are under water and both are having problems with VRM having high temps. Also, I saw somebody here posting similar problems with high VRM temps. I am ready to pull the trigger on purchasing 2 290x but after hearing this I am gun shy now. What am I missing here????


They probably installed the block wrong. Did they have EK blocks? Because the EK block comes with two thermal pads, one for the memory and one for the VRMs, and one is thicker than the other.

I have mine under water with EK blocks and my VRMs never reach over 47C at a +100mV overclock.

Now if they are using the AIO liquid cooler mods, then sure, you would have problems with VRM temps if you do not install a proper heatsink over them. I would only advise the AIO liquid cooler mod if you don't want a full custom loop and you are on a budget.


----------



## Fahrenheit85

Had the block on back order with FrozenCPU for 2 weeks now; ordered it from Performance PCs on Monday and I got it already.


----------



## devilhead

Quote:


> Originally Posted by *bustacap22*
> 
> Two of my friends recently bought a 290x....One MSI and the other XFX. Both cards are under water and both are having problems with VRM having high temps. Also, I saw somebody here posting similar problems with high VRM temps. I am ready to pull the trigger on purchasing 2 290x but after hearing this I am gun shy now. What am I missing here????


My VRMs never go over 48/34C and the core 44C, ambient 23C (mining, gaming), so yes, your friends probably made a mistake with the 0.5mm and 1mm pads.


----------



## mojobear

Quote:


> Originally Posted by *bustacap22*
> 
> Two of my friends recently bought a 290x....One MSI and the other XFX. Both cards are under water and both are having problems with VRM having high temps. Also, I saw somebody here posting similar problems with high VRM temps. I am ready to pull the trigger on purchasing 2 290x but after hearing this I am gun shy now. What am I missing here????


I think they installed the blocks wrong. Got three R9 290s with EK waterblocks at stock VRMs stay less than 46 C and with OC they go up to 56C on VRM1 (hotter of the two)


----------



## prostreetcamaro

Quote:


> Originally Posted by *devilhead*
> 
> My Vram never goes more than 48/34C and core 44C, ambient 23C(mining, gaming), so yes, your friend probably have done mistake with pads 0.5mm and 1 mm


Yup same here with my setup. I went one step further and used FujiPoly ultra extreme on the VRM's and the memory. The EK block is a great water block for sure!


----------



## esqueue

Also chiming in to say that my EK water block does a fine job cooling the VRMs. I even forgot to add thermal paste to the VRMs as recommended in the instructions, and I'm running an outdated heatercore as a radiator, yet the block still does fine.


----------



## Gero2013

my R9 290X idles at 70C. Anyone know why?

XFX R9 290X Quiet BIOS. It's a reference 290 with stock cooler.


----------



## Sazz

Quote:


> Originally Posted by *Gero2013*
> 
> my R9 290X idles at 70C. Anyone know why?
> 
> XFX R9 290X Quiet BIOS. It's a reference 290 with stock cooler.


that's expected with reference cooler, stock fan profile. make your own fan profile to lower it down.
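For reference, a custom fan profile like the ones set in MSI Afterburner is just a piecewise-linear curve mapping temperature to fan duty. A minimal sketch with made-up example points — not a recommendation for any particular card:

```python
# Illustrative fan curve: (temperature C, fan %) points, linearly
# interpolated between them, clamped at both ends.

CURVE = [(40, 25), (60, 45), (75, 70), (85, 100)]

def fan_percent(temp_c: float) -> float:
    """Fan duty for a given temperature, interpolated along CURVE."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # above the last point: pin at max
```

The stock "quiet" profile effectively uses a very shallow curve, which is why the card is allowed to sit warm at idle; steepening the low end trades a little noise for a much lower idle temperature.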


----------



## Forceman

Quote:


> Originally Posted by *Gero2013*
> 
> my R9 290X idles at 70C. Anyone know why?
> 
> XFX R9 290X Quiet BIOS. It's a reference 290 with stock cooler.


Is the card downclocking to 2D clocks?
Quote:


> Originally Posted by *Sazz*
> 
> that's expected with reference cooler, stock fan profile. make your own fan profile to lower it down.


Mine certainly didn't idle at 70C with the stock cooler.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Forceman*
> 
> Is the card downclocking to 2D clocks?
> Mine certainly didn't idle at 70C with the stock cooler.


mine either....


----------



## Sazz

Many other factors could make it go to 70C at idle: one, is it really at "idle"; two, ambient temp; plus the airflow in your case and many others. So I wouldn't be surprised if it's doing 70C idle.

I didn't get a chance to run my card with reference cooler with stock fan profile but with my custom fan profile I made it so that it runs at 40C idle.


----------



## esqueue

I never tested the quiet mode. I'm sure that many here didn't either. I'm guessing that it's extremely high ambient temps combined with bad airflow.


----------



## selk22

Well guys, it is time for me to leave you! Got my 290x at launch and I think it was 100% worth it, but it's not a card I would buy for gaming... It has countless problems with recording using Dxtory that are unsolvable no matter what I've tried. My 7870 records perfectly, as has every other GPU I've owned... But the 290x, loud and hot as it was, did the job while I needed it









This was the tale of my 290x

1st - Buy the card for about 630$ after taxes and shipping.. this was at release with bf4

2nd - Game lightly on it and mine about 180-200$ worth of Litecoin

3rd - Bought a 5850 and 7870 for dedicated mining for about 4 LTC

4th - Sell my 290x for .9 BTC which came to about 720$ when I sold the BTC

5th - Time to find the best deal I can on a 780TI.. Thanks 290x and the mining craze for making a 290x to 780ti trade possible









It was a great card and I wish I had the money to put it under water right now but honestly I miss nvidia and everything that they offer including amazing drivers.. So time to head back to the green team at least for gaming that is









I still have my 5850 and 7870 for mining which is what I like AMD for!

So please remove me from the list. Take it easy everyone, it was a fun launch to be a part of.


----------



## Jack Mac

Best of luck with the 780TI. I'm happy with my $400 290 that will pay for itself in a bit through mining.


----------



## selk22

Quote:


> Originally Posted by *Jack Mac*
> 
> Best of luck with the 780TI. I'm happy with my $400 290 that will pay for itself in a bit through mining.


I was extremely happy that my 290x netted so much profit in such a short amount of time







But now I have a dedicated miner because of the 290x that actually uses less power which is great









I never wanted my gaming PC to be a miner so I am excited to get my computer back to use anytime! It was hard to justify gaming when I could have been mining at the same time.. Now I will always be mining


----------



## Jack Mac

Quote:


> Originally Posted by *selk22*
> 
> I was extremely happy that my 290x netted so much profit in such a short amount of time
> 
> 
> 
> 
> 
> 
> 
> But now I have a dedicated miner because of the 290x that actually uses less power which is great
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I never wanted my gaming PC to be a miner so I am excited to get my computer back to use anytime! It was hard to justify gaming when I could have been mining at the same time.. Now I will always be mining


I mine while I sleep and fold on the CPU (which is one of the reasons why I want a 3930k/4930k). My PC doesn't go to waste and keeps my room nice and warm.


----------



## The Storm

Quote:


> Originally Posted by *Fahrenheit85*
> 
> 
> Had the block on back order with FrozenCPU for 2 weeks now, ordered with from Performance PCs on Monday and I got it already.


What res set up is that?


----------



## HardwareDecoder

Quote:


> Originally Posted by *Jack Mac*
> 
> I mine while I sleep and fold on the CPU (which is one of the reasons why I want a 3930k/4930k). My PC doesn't go to waste and keeps my room nice and warm.


can you tell me how to get started mining coins? I figure I might as well when i'm not using my 290x


----------



## Aggronor

Hello Guys,

I bought a Sapphire 290x a couple of days ago. The card is amazing: 53C at idle and, of course, 85/90C under full load, but everything is running properly... no shutdowns etc...

Find below some tests I´ve performed.

Let me know if there is a SAFE OC profile I can apply to my card to increase performance, or if these results look OK to you, please!!!

Thanks a lot!!!!!!!!

CPU: i7 3770K
PSU: M2 CoolerMaster Silent Pro 720
RAM: 16GB 1600MHz


----------



## Heinz68

Quote:


> Originally Posted by *HardwareDecoder*
> 
> can you tell me how to get started mining coins? I figure I might as well when i'm not using my 290x


You can learn everything about Litecoin mining, including the installation guide and help forum, here: https://www.wemineltc.com/
You can also mine at reduced intensity while using the PC for other work, for example web browsing or watching video.


----------



## Fahrenheit85

Quote:


> Originally Posted by *The Storm*
> 
> What res set up is that?


XSPC Photon with D5 pump


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Fahrenheit85*
> 
> 
> Had the block on back order with FrozenCPU for 2 weeks now, ordered with from Performance PCs on Monday and I got it already.


No way man, I got my Photon 170 yesterday. The first thing that struck me was the weight. These things are real quality, the glass is so nice looking. I can't wait to have it all built.

Still waiting for my 290x block here, I ordered on the 14th and it still hasn't shipped. Last part I'm waiting on, I've been gathering parts for my new build for a bit over a month now.


----------



## HardwareDecoder

cool im getting 850 khash on my 290x


----------



## Aggronor

Quote:


> Originally Posted by *Aggronor*
> 
> Hello Guys,
> 
> I´ve Bought a 290x Sapphire Couple days ago. The card is amazing, 53 in Idle, and of course 85/90 on full mode.. but, everything is running properly.. no shutdowns etc...
> 
> Find below some tests I´ve performed.
> 
> Let me know if there is some SAFE OC profile to apply on my Card.. to increase performance.., or if this results are ok for you !pls!!!
> 
> Thanks a lot!!!!!!!!
> 
> CPU: i7 3770K
> PSU: M2 CoolerMaster Silent Pro 720
> RAM: 16 RAM 1600MHZ


are these results enough? pls help









thx


----------



## prostreetcamaro

Quote:


> Originally Posted by *HardwareDecoder*
> 
> cool im getting 850 khash on my 290x


I get a solid 880 with my 290X and guiminer. Thread concurrency 32765 and worksize 512


----------



## HardwareDecoder

Quote:


> Originally Posted by *prostreetcamaro*
> 
> I get a solid 880 with my 290X and guiminer. Thread concurrency 32765 and worksize 512


--scrypt -I 20 -g 1 -w 512 --thread-concurrency 32765 --lookup-gap 2 --temp-target 50 --gpu-engine 920 --gpu-memclock 1500 --gpu-powertune 20

that's what I was told to use in the litecoin IRC channel.

i'm using the cgminer command line client

I rebooted my machine and im getting 866 khash now
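For anyone copying that invocation, here is the same flag set assembled in a small shell sketch with one comment per option. The annotations reflect the commonly documented cgminer scrypt options; double-check them against your cgminer version's README, since GPU flags changed between releases:

```shell
#!/bin/sh
# Build the cgminer argument list from the post, one flag per line.
ARGS="--scrypt"                          # use the scrypt (Litecoin) kernel
ARGS="$ARGS -I 20"                       # intensity: higher loads the GPU harder
ARGS="$ARGS -g 1 -w 512"                 # one GPU thread, OpenCL worksize 512
ARGS="$ARGS --thread-concurrency 32765"  # scrypt buffer sizing, tuned per card
ARGS="$ARGS --lookup-gap 2"              # trade memory for compute in scrypt
ARGS="$ARGS --temp-target 50"            # target temperature in C
ARGS="$ARGS --gpu-engine 920"            # core clock in MHz
ARGS="$ARGS --gpu-memclock 1500"         # memory clock in MHz
ARGS="$ARGS --gpu-powertune 20"          # +20% power limit
echo "cgminer $ARGS"                     # print the full command instead of running it
```

Thread concurrency and worksize are the two knobs that most affect khash on Hawaii cards, which is why the numbers above keep coming up in this thread.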


----------



## Arizonian

Quote:


> Originally Posted by *Fahrenheit85*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Had the block on back order with FrozenCPU for 2 weeks now, ordered with from Performance PCs on Monday and I got it already.


Congrats - added









Quote:


> Originally Posted by *selk22*
> 
> Well guys it is time for me to leave you! Got my 290x at launch and I think it was 100% worth it but not a card I would buy for gaming... It has countless problems with recording using Dxtory that are unsolvable no matter what iv tried.. 7870 records perfect and every other GPU iv owned... But the 290x was loud hot and did the job while I needed it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> This was the tale of my 290x
> 
> 1st - Buy the card for about 630$ after taxes and shipping.. this was at release with bf4
> 
> 2nd - Game lightly on it and mine about 180-200$ worth of Litecoin
> 
> 3rd - Bought a 5850 and 7870 for dedicated mining for about 4 LTC
> 
> 4th - Sell my 290x for .9 BTC which came to about 720$ when I sold the BTC
> 
> 5th - Time to find the best deal I can on a 780TI.. Thanks 290x and the mining craze for making a 290x to 780ti trade possible
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It was a great card and I wish I had the money to put it under water right now but honestly I miss nvidia and everything that they offer including amazing drivers.. So time to head back to the green team at least for gaming that is
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I still have my 5850 and 7870 for mining which is what I like AMD for!
> 
> So please remove me from the list. Take it easy everyone, it was a fun launch to be a part of.


Thanks for all your help to other members in the club when we were figuring it all out in the beginning.









Quote:


> Originally Posted by *HardwareDecoder*
> 
> can you tell me how to get started mining coins? I figure I might as well when i'm not using my 290x


All mining discussion should be conducted in *Distributing Computing* sections of OCN where all those questions can be answered. You'll find a lot more info there.


----------



## Heinz68

Quote:


> Originally Posted by *HardwareDecoder*
> 
> cool im getting 850 khash on my 290x


That's about good avg from what I read on other forums. I don't have the R9 290X yet, waiting to buy two cards with custom cooling.
My HD 6990 (dual GPU) does 950 khash set at I-20, but it gets very noisy. When I use the PC, like right now, I set the cgminer bat to I-15 and get about 700 khash

EDIT
Sorry Arizonian, did not notice your post.


----------



## TommyGunn123

On PC Case Gear they've posted about the Sapphire R9 290 Tri-X for $579 Aussy dollary-do's

Should've waited, Sgt. Bilko











http://www.pccasegear.com/index.php?main_page=product_info&cPath=416&products_id=26373


----------



## Hogesyx

Added an XSPC AX480 to my rig (alongside an EK XTX360) with 4x GT AP-15; managed to get VRM1 down to 54C max (room is 26C, delta 28C) with cgminer @ 1200/1500 +100mV.









Now saving for another waterblock and PSU so I could transfer the 290X from my HTPC into this box as well.


----------



## Slomo4shO

Quote:


> Originally Posted by *Hogesyx*
> 
> Added an XSPC AX480 to my rig(with an EK XTX360) with 4x GT AP-15, manage to get VRM 1 down to 54 degreeC(room is 26C, delta 28C). max with cgminer @ 1200/1500 +100mV.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now saving for another waterblock and PSU so I could transfer the 290X from my HTPC into this box as well.


How much of a drop did you see in temps by adding the 2nd rad in your loop?


----------



## Hogesyx

Quote:


> Originally Posted by *Slomo4shO*
> 
> How much of a drop did you see in temps by adding the 2nd rad in your loop?


Both GPU core and VRM dropped by 6C and 8C respectively. The 480 feeds the 290X immediately in my loop through the rear port of the EK-FC R9-290X. I am very happy that I managed to keep sub-60C on the VRM during peak.

RES > PUMP > AX480 > 290X > XTX360 > [email protected] > RES


----------



## Sazz

can anyone tell me how to make the rivatuner OSD work for BF4? can't figure it out how to make it work.. -_-


----------



## Hogesyx

Quote:


> Originally Posted by *Sazz*
> 
> can anyone tell me how to make the rivatuner OSD work for BF4? can't figure it out how to make it work.. -_-


Works on BF4 32bit only, 64bit hook is not implemented yet. Alternative would be to use Radeon Pro.


----------



## Sazz

Quote:


> Originally Posted by *Hogesyx*
> 
> Works on BF4 32bit only, 64bit hook is not implemented yet. Alternative would be to use Radeon Pro.


oh nevermind then LOL


----------



## Gero2013

Quote:


> Originally Posted by *Forceman*
> 
> Is the card downclocking to 2D clocks?
> Mine certainly didn't idle at 70C with the stock cooler.


Quote:


> Originally Posted by *Sazz*
> 
> many other factors could have made it go to 70C as idle, one is it really on "idle", 2nd is ambient temp, air flow in your case and many others. So I wouldn't be surprised if its doing 70C idle.
> 
> I didn't get a chance to run my card with reference cooler with stock fan profile but with my custom fan profile I made it so that it runs at 40C idle.


what are 2D clocks?
the case is well vented, I have 4 fans and a good CPU cooler.
Quote:


> Originally Posted by *selk22*
> 
> It has countless problems with recording using Dxtory that are unsolvable no matter what iv tried.


was that a problem with the codecs? I have massive problems with codecs since I put in my 290X ;_;


----------



## TommyGunn123

Quote:


> Originally Posted by *Gero2013*
> 
> what are 2D clocks?


The lower clock speeds your card drops to when it's doing nothing; it shouldn't stay at high clocks 24/7. Check the clocks when you have everything closed, so there's no video interference, to see if it's downclocking on its own


----------



## HardwareDecoder

Quote:


> Originally Posted by *Hogesyx*
> 
> Works on BF4 32bit only, 64bit hook is not implemented yet. Alternative would be to use Radeon Pro.


Actually, if you download the newest RivaTuner beta it does work on 64-bit. Been using it for a few weeks now.


----------



## Gero2013

Quote:


> Originally Posted by *TommyGunn123*
> 
> The lower clock speeds when your card's doing nothing, it shouldn't stay at high clocks 24/7. Check the clocks when you have everything closed so there's no video interference to see if its self de-clocking


ah ok, yes it does downclock to 300Mhz on idle but temps stay hovering around 70C


----------



## Monkeysphere

Ok so my issue with not hitting max clocks is resolved. Certain benchmarks and stress tests do not force the card to full clocks. Using Valley corrected this.

I however am now stumped with a new problem. When I put my cards into CrossfireX the second GPU disappears from the system and all the sensors on it go nuts. GPU-Z reports it still being there, GPU Shark does not.

Return it to non-CrossfireX and it returns, plug my monitor into the second card and Valley will utilise that card and it works fine.

So... has anyone seen this before with these cards, or is it a mobo problem? I admittedly do have an old Asus P67 Sabertooth. I kept it because my i5 2500k still does everything I need and then some, clocked at just shy of 5GHz. But if it is a mobo problem it might be time for Haswell.

EDIT: It may also be worth mentioning that last night I enabled CrossfireX and it showed up in MSI Afterburner fine and ran both cards in BF4. However I gained only 25 or so FPS going from about 60 with one card to 85ish. Usage on both cards was just a series of spikes from 0%-100% a couple times every second. Now after enabling it today I get nothing. Furmark and GPU Shark have always dropped one card saying Crossfire is enabled but listing only GPU 1. GPU-Z has displayed both and confirmed Crossfire enabled but the second card never gets any load. AB has now lost the second card.

Also GPU1 has an earlier BIOS revision than GPU2. Not sure if they need to be the same.


----------



## Forceman

Quote:


> Originally Posted by *Gero2013*
> 
> ah ok, yes it does downclock to 300Mhz on idle but temps stay hovering around 70C


That doesn't sound right. What is the fan RPM/percent when the card is at idle (you can check it in Afterburner)? You didn't force constant voltage in Afterburner settings, did you?


----------



## Sgt Bilko

Quote:


> Originally Posted by *TommyGunn123*
> 
> On PC Case Gear they've posted about the Sapphire R9 290 Tri-X for $579 Aussy dollary-do's
> 
> Should've waited, Sgt. Bilko
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.pccasegear.com/index.php?main_page=product_info&cPath=416&products_id=26373


Wouldn't have mattered; they are now sold out, and I've been out all day so I wouldn't have been able to grab them anyway.

I'll be happy with the XFX cards. Besides, I said I'd grab the first type that became available


----------



## Gero2013

Quote:


> Originally Posted by *Forceman*
> 
> That doesn't sound right. What is the fan RPM/percent when the card is at idle (you can check it in Afterburner)? You didnt force constant voltage in Afterburner settings, did you?


IIRC GPU-Z showed 20% fan speed last night, and (I think) something like 1160rpm


----------



## maynard14

grrr even at latest msi after burner my graph still spikes... any suggestions?


----------



## Sgt Bilko

Full Review to come but it looks like XFX might have done alright with this card:
http://www.fudzilla.com/home/item/33553-xfx-radeon-dd-r9-290x-1000m-previewed


----------



## givmedew

Hello everyone...

I just bought a second 290 and I will obviously want to put it under water. Problem is I have a full size Koolance Water Block and they don't make those anymore.

I would be willing to purchase someone's full size Koolance or I would be willing to sell mine to them so that I could get (2) of the smaller Koolance blocks.

I just want matching blocks. So any of you guys that either have the large Koolance or want the large Koolance, let me know.


That's the one I have


----------



## selk22

Quote:


> Originally Posted by *Gero2013*
> 
> was that a problem with the codes? I have massive problems with codecs since I put in my 290X ;_;


Yeah, I do believe that had to be at least part of my issue. I think possibly bad driver support when it comes to codecs and/or OpenCL/OpenGL.. I hear people did not have my same issues with Fraps, but for me there was no excuse for the card to be performing the way it did. I suppose new drivers may come along that would fix the issue, but I didn't want to put much faith in that like I did with my GTX460 (it was a terrible card)
Quote:


> Originally Posted by *Arizonian*
> 
> Thanks for all your help to other members in the club when we were figuring it all out in the beginning.


Of course I enjoyed it very much and look forward to a new GPU adventure!


----------



## Sgt Bilko

Quote:


> Originally Posted by *selk22*
> 
> Yeah I do believe that had to be at least part of my issue? I think possibly bad driver support when it comes to open codes and or openCL/openGL.. I hear people did not have my same issue's with fraps but for me it was no excuse for the card to be performing the way it did. I suppose new drivers may be in the future that would fix the issue but I didn't want to invest much in faith like I did with my GTX460(it was a terrible card)
> Of course I enjoyed it very much and look forward to a new GPU adventure!


Sad to see you go Selk but Thanks for sticking around for the early parts and giving out advice when needed


----------



## the9quad

Quote:


> Originally Posted by *maynard14*
> 
> grrr even at latest msi after burner my graph still spikes... any suggestions?


My suggestion, and don't take this the wrong way, is to read the release notes and the Guru3D forums. The spiking comes from the way AMD allows that value to be polled, and it's normal. The reason it doesn't do that in GPU-Z is because GPU-Z is polling the wrong value. If it bothers you that much, Afterburner now lets you poll the wrong value as well, but that's not the default.


----------



## the9quad

Quote:


> Originally Posted by *Monkeysphere*
> 
> Ok so my issue with not hitting max clocks is resolved. Certain benchmarks and stress tests do not force the card to full clocks. Using Valley corrected this.
> 
> I however am now stumped with a new problem. When I put my cards into CrossfireX the second GPU disappears from the system and all the sensors on it go nuts. GPU-Z reports it still being there, GPU Shark does not.
> 
> Return it to non-CrossfireX and it returns, plug my monitor into the second card and Valley will utilise that card and it works fine.
> 
> So....seens this before with these or Mobo problem? I admittedly do have an old Asus P67 Sabertooth. I kept it because my i5 2500k still does everything I need and then some clocked at just shy of 5ghz. But if it is a Mobo problem it might be time for Haswell.
> 
> EDIT: It may also be worth mentioning that last night I enabled CrossfireX and it showed up in MSI Afterburner fine and ran both cards in BF4. However I gained only 25 or so FPS going from about 60 with one card to 85ish. Usage on both cards was just a series of spikes from 0%-100% a couple times every second. Now after enabling it today I get nothing. Furmark and GPU Shark have always dropped one card saying Crossfire is enabled but listing only GPU 1. GPU-Z has displayed both and confirmed Crossfire enabled but the second card never gets any load. AB has now lost the second card.
> 
> Also GPU1 has an earlier BIOS revision than GPU2. Not sure if they need to be the same.


Try disabling ULPS; I have a hunch your card is working just fine. Also, you may be CPU limited in BF4.


----------



## maynard14

Quote:


> Originally Posted by *maynard14*
> 
> grrr even at latest msi after burner my graph still spikes... any suggestions?


is my gpu broken or is my cpu bottlenecking my card?

that screenshot was taken while im idle and browsing the net....

pls help. i thought beta 18 would fix the spike graph issue in afterburner..

but it still didn't fix it, or is there something else spiking up the graph?

while playing i don't see any fps decrease


----------



## sugarhell

This spike is normal. Some browsers spike often enough


----------



## Sgt Bilko

Quote:


> Originally Posted by *maynard14*
> 
> is my gpu broken or is my cpu bottlenecking my card?
> 
> that screenshot was taken while im idle and browsing the net....
> 
> pls help. i thought beta 18 would fix the spike graph issue in afterburner..
> 
> but it still didn't fix it, or is there something else spiking up the graph?
> 
> while playing i don't see any fps decrease


There is nothing wrong with your card, it's working just fine.

And a 3570k will never bottleneck a single 290, even overclocked.


----------



## maynard14

Quote:


> Originally Posted by *the9quad*
> 
> My suggestion, and don't take this the wrong way, is to read the release notes and the Guru3D forums. The spiking comes from the way AMD allows that value to be polled, and it's normal. The reason it doesn't do that in GPU-Z is because GPU-Z is polling the wrong value. If it bothers you that much, Afterburner now lets you poll the wrong value as well, but that's not the default.


hi sir... hmmm, so you're saying the graph MSI Afterburner shows me is correct? that's the correct value? but I saw others' Afterburner graphs and their graphs differ from mine, because their graphs are more stable and not spiking.. that's why im worried..

i will read the release notes again

thank you sir and sorry for the trouble.. im just a noob haha

i understand now thanks!


----------



## kizwan

Quote:


> Originally Posted by *Monkeysphere*
> 
> Ok so my issue with not hitting max clocks is resolved. Certain benchmarks and stress tests do not force the card to full clocks. Using Valley corrected this.
> 
> I however am now stumped with a new problem. When I put my cards into CrossfireX the second GPU disappears from the system and all the sensors on it go nuts. GPU-Z reports it still being there, GPU Shark does not.
> 
> Return it to non-CrossfireX and it returns, plug my monitor into the second card and Valley will utilise that card and it works fine.
> 
> So....seens this before with these or Mobo problem? I admittedly do have an old Asus P67 Sabertooth. I kept it because my i5 2500k still does everything I need and then some clocked at just shy of 5ghz. But if it is a Mobo problem it might be time for Haswell.
> 
> EDIT: It may also be worth mentioning that last night I enabled CrossfireX and it showed up in MSI Afterburner fine and ran both cards in BF4. However I gained only 25 or so FPS going from about 60 with one card to 85ish. Usage on both cards was just a series of spikes from 0%-100% a couple times every second. Now after enabling it today I get nothing. Furmark and GPU Shark have always dropped one card saying Crossfire is enabled but listing only GPU 1. GPU-Z has displayed both and confirmed Crossfire enabled but the second card never gets any load. AB has now lost the second card.
> 
> Also GPU1 has an earlier BIOS revision than GPU2. Not sure if they need to be the same.


I would re-seat the GPUs in the PCIe slots. For good measure, swap them: second GPU into the first PCIe slot and first GPU into the second. Then do a fresh driver installation.

GPU usage like that is normal. At what resolution are you playing BF4?


----------



## maynard14

Quote:


> Originally Posted by *Sgt Bilko*
> 
> There is nothing wrong with your card, it's working just fine.
> 
> And a 3570k will never bottleneck a single 290, even overclocked.


yes sir, I get it now...

my graph is normal and now I understand what mr the9quad is saying

my idle graph is normal, and IF im playing a game and the gpu core clock, memory clock, and usage are still spiking, then that's abnormal, right?

tried tomb raider's in-game benchmark, and after the benchmark I looked at my graph in AB and it is a stable straight line on core clock, memory clock and gpu load..

thanks again for helping me! hehe, now i know, thanks


----------



## Sgt Bilko

Ok so i have an idea for my water loop.

Now i just need to wait and find out if the XFX cards are Ref design or not and for the Wife to be in a good mood before i hit that buy button ($830)

EDIT: Ohh, i got a flame, does that mean i'm special now


----------



## jerrolds

Would a single XSPC EX360 rad be enough for an overclocked i7 2600K and an overclocked 290X? I'm thinking it'll be fine, but im seeing people with almost double that for just a cpu + 1 gpu.

Thinking of getting either the EX360 kit, or the 360 Photon kit as my first wc build.


----------



## Mr357

Quote:


> Originally Posted by *jerrolds*
> 
> Would a single XSPC EX360 rad be enough for an overclocked i7 2600K and an overclocked 290X? I'm thinking it'll be fine, but im seeing people with almost double that for just a cpu + 1 gpu.
> 
> Thinking of getting either the EX360 kit, or the 360 Photon kit as my first wc build.


I have the exact same setup (2700K @ 4.8GHz and 290X), but I have both a 360 and 240 in my loop.


----------



## jerrolds

Think the extra 240 rad is overkill?


----------



## Maracus

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ok so i have an idea for my water loop.
> 
> Now i just need to wait and find out if the XFX cards are Ref design or not and for the Wife to be in a good mood before i hit that buy button ($830)
> 
> EDIT: Ohh, i got a flame, does that mean i'm special now


Haha, same. I told the girl I'm buying an EK block etc. Also told her I'm putting a couple hundred away a fortnight for a 900D Haswell-E build when it's released; she merely shrugged. Normally she complains about the amount I spend... but damnit, I work hard for that cash


----------



## kizwan

Quote:


> Originally Posted by *jerrolds*
> 
> Would a single XSPC EX360 rad be enough for an overclocked i7 2600K and an overclocked 290X? I'm thinking it'll be fine, but im seeing people with almost double that for just a cpu + 1 gpu.
> 
> Thinking of getting either the EX360 kit, or the 360 Photon kit as my first wc build.


Quote:


> Originally Posted by *jerrolds*
> 
> Think the extra 240 rad is overkill?


No, definitely not. With both CPU & GPU overclocked, the additional 240mm radiator will be useful. My loop consists of 360mm & 240mm radiators. We'll see how well it's going to cool both the CPU & two 290's when I add the 290's to the loop. No more space in my case for another radiator.


----------



## stickg1

Quote:


> Originally Posted by *jerrolds*
> 
> Would a single XSPC EX360 rad be enough for an overclocked i7 2600K and an overclocked 290X? I'm thinking it'll be fine, but im seeing people with almost double that for just a cpu + 1 gpu.
> 
> Thinking of getting either the EX360 kit, or the 360 Photon kit as my first wc build.


What case are you working in?


----------



## givmedew

Is GPU Tweak the best software to use to OC my 290X?

If so do I need to use an ASUS bios?

If so exactly which link from HERE do I use?

Thanks


----------



## jerrolds

Quote:


> Originally Posted by *stickg1*
> 
> What case are you working in?


Right now it'll be the CM HAF X 942; it's the biggest and probably ugliest case I've owned (got it used/bundled with the i7 2600k) ;p I have enough room for a 360 rad at the top (maybe even a 480) and a 120 on the exhaust, possibly another 120 at the bottom.

Looks aren't too much of a concern now since it's in another room, but my next build may be a wc Bitfenix mini-ITX.


----------



## the9quad

Here is my issue:

I really like using Afterburner for the custom fan control.
I tried using Sapphire TRIXX, but it will only control the fan on one card.
I would like to run benchmarks using Radeonpro.

Is there a way to use only the fan control portion of Afterburner without the OSD (RTSS), so it won't conflict with RadeonPro or, as an alternative, FRAPS?

I just can't figure that out, or figure out how to use Afterburner to benchmark games by itself.

TLDR: I want to use AB for fan control and Radeonpro or FRAPS to benchmark at the same time.


----------



## givmedew

Quote:


> Originally Posted by *stickg1*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jerrolds*
> 
> Would a single XSPC EX360 rad be enough for an overclocked i7 2600K and an overclocked 290X? I'm thinking it'll be fine, but im seeing people with almost double that for just a cpu + 1 gpu.
> 
> Thinking of getting either the EX360 kit, or the 360 Photon kit as my first wc build.
> 
> 
> 
> What case are you working in?

That is absolutely more than enough to handle a 2600K and a 290X. Don't laugh, but a single 120 would be enough... if you have big fans!

With a 360 rad and 3 1500-2000RPM fans you'll be golden. Most people don't really understand how much rad space they need, and the bottom line is that more can't be worse than less!

If you have way too much rad space, you'll find yourself able to run less than 5C above ambient with LOW fan speeds.

Now let's say you have a 360 and, with your given fans, it can handle 350W at 10C above room temp. That would mean if you threw 700W of heat at it, your temps would only be 10C higher. So if your CPU was running a max of 70C before, it would now run at 80C, and if your GPU was running a max of 50C, it would now run at 60C.

This is why even a single 120 with a good fan would be enough for a 290X and a 2600K: running 30C above ambient on one 120 is roughly equivalent to a 6x120 radiator running 5C above ambient, and 30C above ambient means your water would still be under 60C. Of course that is pretty darn hot, but I think it's enough info to get my point across.

It just boils down to how far over ambient you are and how loud your fans might be.
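The scaling described above is a simple linear model: for a fixed radiator and fan setup, the coolant's rise over ambient grows roughly in proportion to the heat load. A minimal sketch of that reasoning (the 350W-at-10C operating point is just the example figure from this post, not measured data):

```python
# Rough linear radiator model (a sketch, not measured data): for a fixed
# radiator + fan combo, coolant rise above ambient scales roughly
# linearly with heat load.

def coolant_delta_t(heat_w, rated_w=350.0, rated_delta_c=10.0):
    """Approximate coolant temperature rise (C) above ambient.

    rated_w / rated_delta_c define one known operating point, e.g. a
    360mm rad dissipating 350 W at 10 C over ambient with given fans
    (the example figures used in the post above).
    """
    return heat_w * (rated_delta_c / rated_w)

for load in (350, 700):
    print(f"{load} W -> ~{coolant_delta_t(load):.0f} C above ambient")
```

On this model, doubling the load from 350W to 700W takes the water from 10C to 20C over ambient, i.e. the "only 10C higher" jump described above.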


----------



## jerrolds

Interesting - i believe air cooled my 2600k hits 75C under IBT, and the 290X at 65C under Heaven loops/BF4 - its the VRM1 thats killing me. Easily hits 90C at 1180mhz on the core.

My bro's H60 is dying, so i might give him my venomous which will open me up to trying decent watercooling. The Swiftech rep said the H320 should be more than enough to handle a 2600k/290X overclocked but i took it with a grain of salt.

I'll be happy keeping the same temps but bringing VRM temps down; hoping to squeeze a bit more out of the 290X though (1230+)

Rad -> 290X -> RES/Pump (Photon) -> 2600k?


----------



## stickg1

Quote:


> Originally Posted by *jerrolds*
> 
> Right now itll be the CM HAF-X 942, its the biggest and probably ugliest case ive owned (got it used/bundled with the i7 2600k) ;p I have enough room for a 360rad at the top (maybe even 480) and a 120 on the exhaust, possibly another 120 at the bottom.
> 
> Looks arent too much of a concern now since its in another room, but my next build may be a wc Bitfenix mini itx.


I think 360mm will be plenty, it really comes down to fans. You'll probably want 1700-2200 RPM. But these can make some noise. If you want super quiet then you might want some more rad space.


----------



## givmedew

Quote:


> Originally Posted by *jerrolds*
> 
> Interesting - i believe air cooled my 2600k hits 75C under IBT, and the 290X at 65C under Heaven loops/BF4 - its the VRM1 thats killing me. Easily hits 90C at 1180mhz on the core.
> 
> My bro's H60 is dying, so i might give him my venomous which will open me up to trying decent watercooling. The Swiftech rep said the H320 should be more than enough to handle a 2600k/290X overclocked but i took it with a grain of salt.
> 
> I'll be happy keeping the same temps but bringing VRM temps down; hoping to squeeze a bit more out of the 290X though (1230+)
> 
> Rad -> 290X -> RES/Pump (Photon) -> 2600k?


It sounds like you don't own any of the water cooling parts yet right?

If that is the case open up a new thread in the water cooling section with your parts, case, proposed build, and anything that you already own that you can re-use (like fans) and what your goals might be.

You want to fit the largest radiator possible inside your case. If your case supports a 3x140 then grab a 3x140; if the biggest it supports is a 3x120 then grab the 3x120. Whatever the largest is that you can fit. After that, I highly recommend you get a KIT instead of putting it together part by part. The absolute best value in kits would be the XSPC 750 kits. So if a 3x120 is the largest you can fit, grab this kit, (2) extra fittings, and an EK block.

For a total of $267.12 + shipping after your 5.1% discount code.

The radiator is more than enough to handle your system, the fans are nice and quiet, the fittings are of great quality and 7/16" tubing on 1/2" barbs means NO LEAKS and the tightest safest fit possible. The pump in that kit has enough power to handle the raystorm and the full coverage 290 block without issue and you could even run a second 290 later without having to upgrade the PUMP.


----------



## jerrolds

The XSPC AX360 Photon Kit comes with 3x1650rpm fans..i also have 2xCM Sickle flow (2k rpm/High static pressure) fans that i can use instead or add on as push/pull.

But ill probably replace 2 of the 3 fans as i dont want to add much bulk noise. For $269 i dont think i can build a kit seperately that will be as good, still have to get tubing/fittings/1xfan if i were to use the sickle flows.

Although.. if i wanted to try rigid tubing since it looks so cool, then maybe it'll be around the same price.


----------



## jerrolds

Quote:


> Originally Posted by *givmedew*
> 
> It sounds like you don't own any of the water cooling parts yet right?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> If that is the case open up a new thread in the water cooling section with your parts, case, proposed build, and anything that you already own that you can re-use (like fans) and what your goals might be.
> 
> You want to fit the largest radiator possible inside your case. If your case supports a 3x140 then grab a 3x140 if the biggest it supports is a 3x120 then grab the 3x120. Whatever the largest you can get it. After that I highly recommend that you get a KIT instead of putting it together part by part. The absolute best value in kits would be the XSPC 750 kits. So if a 3x120 is the largest you can fit then grab this kit , and(2) extra fittings and an EK block.
> 
> For a total of $267.12 + shipping after your 5.1% discount code.
> 
> The radiator is more than enough to handle your system, the fans are nice and quiet, the fittings are of great quality and 7/16" tubing on 1/2" barbs means NO LEAKS and the tightest safest fit possible. The pump in that kit has enough power to handle the raystorm and the full coverage 290 block without issue and you could even run a second 290 later without having to upgrade the PUMP.


Oh nice - i might do that. I dont have anything for WC other than 2 high pressure static fans. But im thinking of the XSPC Photon 360 kit, only cuz it has that cool tube reservoir thing and that i already purchased the XSPC backplate for my 290X (for aesthetic/strengthening purposes).

Can i mix and match a XSPC backplate with an EK block? And will that pump support cpu + 2 gpus? But honestly i dont think ill ever go CF again...maybe SLI though...


----------



## Widde

Is it possible to bolt a 290 backplate onto a 290 with the stock cooler? Just want it to look better ^^

And does an R9 290X backplate fit a 290? It should, since it's basically the same card?


----------



## jerrolds

You probably can if you get extra long screws so the shroud can connect to the backplate. Probably have to sandwich it all together before screwing things down.

290(X) backplates have the same layout.


----------



## stickg1

Quote:


> Originally Posted by *jerrolds*
> 
> The XSPC AX360 Photon Kit comes with 3x1650rpm fans..i also have 2xCM Sickle flow (2k rpm/High static pressure) fans that i can use instead or add on as push/pull.
> 
> But ill probably replace 2 of the 3 fans as i dont want to add much bulk noise. For $269 i dont think i can build a kit seperately that will be as good, still have to get tubing/fittings/1xfan if i were to use the sickle flows.
> 
> Although.. if i wanted to try rigid tubing since it looks so cool, then maybe it'll be around the same price.


Quote:


> Originally Posted by *jerrolds*
> 
> Oh nice - i might do that. I dont have anything for WC other than 2 high pressure static fans. But im thinking of the XSPC Photon 360 kit, only cuz it has that cool tube reservoir thing and that i already purchased the XSPC backplate for my 290X (for aesthetic/strengthening purposes).
> 
> Can i mix and match a XSPC backplate with an EK block? And will that pump support cpu + 2 gpus? But honestly i dont think ill ever go CF again...maybe SLI though...


Oh yeah, I definitely think that's the way to go. You get a much better pump. Of course your way is more expensive than givmedew's suggestion; I think he was just trying to get your feet wet in a budget-friendly way.

I'm pretty sure you will need to use the XSPC 290X block in order to use the XSPC backplate, but I'm not 100% sure. Since you're getting XSPC everything else, it couldn't hurt, right?

Oh wow, that Photon kit comes with compressions too, that's nice; you need two more of those for the GPU. And I think you will want to use Primochill Advanced LRT tubing (7/16" ID). The XSPC stuff that comes with your kit will likely give you plasticizer issues.


----------



## Caanon

Hey all!

Building my first gaming rig since 2006... 8 years-ish? Wow... anyway, I've been toying with a budget ceiling of $1500 to get the highest-performance-I-can-get for that price. I've been following reviews, stories and benchmarks of the R9 series and have built up what I think is a decent PC around the Sapphire Tri-X R9 290 OC: http://pcpartpicker.com/p/2uZxq (note: I have an SSD already and plenty of old large storage drives lying around).

Without the video card, the build comes in right around $840 - depending on daily deals, of course. I've got the Sapphire Tri-X r9 290 on order for $540 shipped (darn you 10% WA sales tax!!!) for a total of ~$1380, but can cancel it if need be. So here's where I'm looking to you experts: I've been toying with the idea of watercooling for over a decade now and am wondering if this may be the build to actually pull the trigger and try it. My priorities are about 70% noise, 30% performance/overclocking; my fiancee is a light sleeper and I want to make sure I can game with headphones and not have a turbojet of a computer wake her up, hence the Tri-X. However, I can find a reference R9 290 for ~$460 shipped, which would leave me with right around $200 left in the build for water cooling which I don't quite think is enough. Thoughts?

I'm also a bit confused about water cooling however, as I've seen some cooler reviews mention that they were just as loud if not louder than the stock coolers (e.g. Kraken G10). I thought the goal of water cooling was most-heat-dissipation-at-least-noise? How noisy are they in general with the pumps, radiator fan etc?


----------



## jerrolds

Quote:


> Originally Posted by *stickg1*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Oh yeah, I definitely think that's the way to go. You get a much better pump. Of course your way is more expensive than givemedew's suggestion, I think he was just trying to get your feet wet in a budget friendly way.
> 
> I'm pretty sure you will need to use the XSPC 290x block in order to use the XSPC backplate. But I'm not 100% sure. But since you're getting XSPC everything else it couldn't hurt right?
> 
> Oh wow, that Photon kit comes with compressions too, that's nice, you need two more of those for the GPU. And I think you will want to use Primochill Advance LRT tubing (7/16" ID). The XSPC stuff that comes with your kit will likely give you plasticizer issues.


Plasticizer issues? I know some of those words! lol. I'll have to post a separate thread in /watercooling or something with all my parts and proposed build when I get to it, I suppose.

But I think the Photon kit and maybe some rigid tubing/heat gun? It comes with 8 fittings, so I should be good if I'm hooking it up to a rad + cpu + gpu + pump?


----------



## stickg1

Quote:


> Originally Posted by *jerrolds*
> 
> Plasticizer issues? I know some of those words! lol. I'll have to post a separate thread in /watercooling or something with all my parts and proposed build when I get to it, I suppose.
> 
> But I think the Photon kit and maybe some rigid tubing/heat gun? It comes with 8 fittings, so I should be good if I'm hooking it up to a rad + cpu + gpu + pump?


Naw the kit you linked comes with 6 fittings. You need two more to hook up the GPU.

Plasticizer is basically when the chemical used in the tubing to make it flexible breaks down over time, causing the inside of the tubing to become foggy; eventually it can gunk up your blocks or rads. There are no issues like this with rigid acrylic tubing, because it's hard and therefore has no plasticizers in it. But you would also need all new fittings (the ones that come with the kit are for flexible tubing). I think for your first time you'd be doing yourself a favor by just getting some Primochill Advanced LRT flexible tubing and two more XSPC black chrome compression fittings.
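The fitting count being discussed follows from a simple rule of thumb (my phrasing, not an official spec): in a simple serial loop, every component has an inlet and an outlet, so you need two fitting ends per component.

```python
# Rule of thumb for a simple serial loop: each component has an inlet
# and an outlet, so the loop needs two fittings per component.

def fittings_needed(components):
    return 2 * len(components)

# Example loop: pump/res, radiator, CPU block, GPU block.
loop = ["pump/reservoir", "radiator", "CPU block", "GPU block"]
print(fittings_needed(loop))  # 8: a kit with 6 fittings leaves 2 to buy
```

That's why adding a GPU block to a CPU-only kit means buying two more fittings.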


----------



## jerrolds

Cool: http://www.overclock.net/t/1380775/what-is-plasticizer - thanks for the advice. I figure I'm only gonna do this once a year at most, so I'll do it right off the bat, since swapping everything out would be annoying.

But.. I'll mull it over some (flexible vs rigid)


----------



## Redvineal

Man, all this water cooling talk has me looking hard at options. I don't think I'm ready to pull the trigger on a full custom loop, so I'm leaning towards an AIO unit with Richie's GPU Cool bracket. I know, I know...a custom loop is the best solution, but for now I think I just want to take the first step with an AIO.

For those familiar with the AIO units, would a 240 or 280 rad unit be overkill for the GPU core? I can save about $25 going with a 120 or 140 unit, but would I see significant gains from going bigger for a GPU core?

Thanks.


----------



## jerrolds

^ I'm flip flopping between not going WC at all vs getting a Swiftech H320 (basically an AIO CLC w/ a 360mm rad that is designed to be expanded) and add a full block ($150 for h320 + $100 for gpu block + fittings) vs going all out XSPC Photon (~$300 + $100 for gpu block)

I should maybe think about saving some money


----------



## gateh0use

Quote:


> Originally Posted by *Redvineal*
> 
> Man, all this water cooling talk has me looking hard at options. I don't think I'm ready to pull the trigger on a full custom loop, so I'm leaning towards an AIO unit with Richie's GPU Cool bracket. I know, I know...a custom loop is the best solution, but for now I think I just want to take the first step with an AIO.
> 
> For those familiar with the AIO units, would a 240 or 280 rad unit be overkill for the GPU core? I can save about $25 going with a 120 or 140 unit, but would I see significant gains from going bigger for a GPU core?
> 
> Thanks.


Not sure how much of an improvement a 240mm+ radiator would be over a 140mm one, but my Corsair H90 keeps my 290X at 50C under full load.


----------



## battleaxe

Quote:


> Originally Posted by *Redvineal*
> 
> Man, all this water cooling talk has me looking hard at options. I don't think I'm ready to pull the trigger on a full custom loop, so I'm leaning towards an AIO unit with Richie's GPU Cool bracket. I know, I know...a custom loop is the best solution, but for now I think I just want to take the first step with an AIO.
> 
> For those familiar with the AIO units, would a 240 or 280 rad unit be overkill for the GPU core? I can save about $25 going with a 120 or 140 unit, but would I see significant gains from going bigger for a GPU core?
> 
> Thanks.


This is what I did. I have an AIO on each of my GTX 670's in SLI and also an AIO on my new 290. Temps on the 290 never go above 54C under full mining load; nowhere near that for gaming, under 50C max. My VRMs are under 55C too. I made some custom coolers out of old CPU coolers by cutting them into pieces to fit the VRMs and RAM, secured with zip ties and thermal tape.

Looks hideous on the 290, but works great. They are all mining so I don't care about aesthetics too much. The 670's actually look really nice, but the 290 looks downright awful.... It does work well though. Overall, I'm very happy I did it this way. I didn't want to drop $200 per card on custom loops; after the blocks and rads it would be at least that much. The AIO solution cost me $60.00 per card, so I saved about $140.00 per card, I figure. Not as nice looking as a custom loop, I admit, but I doubt you'll regret it. I will warn though that seating the 290 with an AIO is kinda hard. I used the H80 and the Intel bracket, clipped the tabs off so they were only 1/4" long (just long enough to secure a zip tie), and used one zip tie on each corner. My first attempt was with the AMD bracket pulling across, and that didn't work at all. Good luck to you.


----------



## kdawgmaster

At work I'm checking with HIS to see if their aftermarket cooler will work with the stock cards. If it will, I'm going to order 3 of them and put them on


----------



## psyside

Can anyone with a reference 290 please tell me what VRM temps you get at 1.4V and 50% power limit? Also point out your fan speed settings, thanks.


----------



## Jack Mac

Quote:


> Originally Posted by *psyside*
> 
> Can anyone with reference card 290 please tell me what vrm temps you get on 1.4v and 50% power limit? also point out your fan speed settings, thanks.


Well, with max voltage in AB (+100mV) I get a max temp of 72C, although that's when benching at 70-85% fan speed. Normal use sees around 63-65C on the hottest VRM at 1175MHz with 55-60% fan speed, using +60mV in a 20-23C room. The actual voltage for my card is about 1.235V with +60mV, if that helps you at all.


----------



## psyside

Thanks bro, can you do one Heaven 4.0 run with max settings and fan at 70%, at the same OC settings?


----------



## Jack Mac

Quote:


> Originally Posted by *psyside*
> 
> Thanks bro, can you do one Heaven 4.0 with max settings, and fan on 70% @ same oc settings?


Not at my PC ATM but yeah I'll run heaven ASAP.


----------



## battleaxe

Quote:


> Originally Posted by *kdawgmaster*
> 
> At work I'm checking with HIS to see if their aftermarket cooler will work with the stock cards. If it will, I'm going to order 3 of them and put them on


What HIS cooler is that? got a link?


----------



## psyside

Quote:


> Originally Posted by *Jack Mac*
> 
> Not at my PC ATM but yeah I'll run heaven ASAP.


Thanks!


----------



## stickg1

Quote:


> Originally Posted by *psyside*
> 
> Can anyone with reference card 290 please tell me what vrm temps you get on 1.4v and 50% power limit? also point out your fan speed settings, thanks.


When I had a stock cooler on mine I would get into the high 80's easily on VRM1 benching and gaming. That was only with 1.275v.


----------



## Jack Mac

Quote:


> Originally Posted by *stickg1*
> 
> When I had a stock cooler on mine I would get into the high 80's easily on VRM1 benching and gaming. That was only with 1.275v.


"Only" 1.275V? That's more than +100mV gets me in AB, and not something most people run every day.


----------



## rdr09

Quote:


> Originally Posted by *Aggronor*
> 
> are these results enough? pls help
> 
> thx


Your results look normal. Can't really recommend OC'ing using the stock cooler, though. You need to watch the VRM temps too, if your card has sensors for them.


----------



## jerrolds

VRM temps on stock are better than 3rd party solutions atm...you can get a decent overclock at quiet fan speeds as long as ambient isnt too bad. 1100mhz w/ 53% fan speed should hover around 90C on the core, and shouldnt throttle. At least for me it didnt.

55% is where it got too loud for me personally.


----------



## Aggronor

Quote:


> Originally Posted by *rdr09*
> 
> your results look normal. can't really recommend oc'ing using the stock cooler. you need to watch the vrms' temps, too, if your card has sensors for them.


well thank you!!!! ... will do


----------



## mojobear

For anyone going CrossFire under water with EK blocks: just a heads up, the EK terminal is a B&$^& to put together with the blocks... the O-rings just do not want to stay in place. Took me a whole hour to fit 3 cards together, and even then I messed one up and there was a leak during the leak test... a lesson in patience.


----------



## Menphisto

My stock boost voltage is 1.163V  is this normal? I thought 1.18V was stock


----------



## stickg1

Quote:


> Originally Posted by *Jack Mac*
> 
> "Only" 1.275V, that's more than 100mV gets me on AB and not something that most people run every day.


That's what I get using +100mV in AB.

I have a full cover Koolance block now though.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *mojobear*
> 
> for anyone going crossfire under water with EK blocks...just as a heads up the EK terminal is B&$^& to put together with the blocks.....the O rings just do not want to stay in place. Took me a whole hour for 3 cards to fit together and even then I messed one up and there was a test leak... lesson in patience.


I'm using the EK Triple terminal. I've heard it's quite a pain! I'll know next week


----------



## kdawgmaster

Quote:


> Originally Posted by *battleaxe*
> 
> What HIS cooler is that? got a link?


http://www.tomshardware.com/news/his-radeon-290x-iceq-x2,25508.html It would be nice if I could get one of these for better cooling.


----------



## smoke2

Quote:


> Originally Posted by *jerrolds*
> 
> VRM temps on stock are better than 3rd party solutions atm...you can get a decent overclock at quiet fan speeds as long as ambient isnt too bad. 1100mhz w/ 53% fan speed should hover around 90C on the core, and shouldnt throttle. At least for me it didnt.
> 
> 55% is where it got too loud for me personally.


You're right.
Sadly the Sapphire Tri-X hits 91 degrees on the VRMs, versus 73 degrees on the reference cooler:
http://ht4u.net/reviews/2013/sapphire_radeon_tri-x_r9_290x_oc_im_test/index9.php
We'll probably have to wait for the Vapor-X or Toxic.


----------



## stickg1

Quote:


> Originally Posted by *kdawgmaster*
> 
> http://www.tomshardware.com/news/his-radeon-290x-iceq-x2,25508.html#his-radeon-290x-iceq-x2%2C25508.html?&_suid=1388785924833012465925225848295 It would be nice if i can get these for better cooling


Hate to break it to you man but I doubt you can get your hands on that cooler without buying the whole card.


----------



## jerrolds

Quote:


> Originally Posted by *smoke2*
> 
> You're right.
> Sadly Sapphire Tri-X is at 91 degrees on VRM, reference on 73 degrees.
> http://ht4u.net/reviews/2013/sapphire_radeon_tri-x_r9_290x_oc_im_test/index9.php
> Probably must to wait for Vapor or Toxic.


Wait..that's Sapphire's non-ref design? And their VRM performance is worse? I was talking about things like the Xtreme3/MK2/G10/Gelid where you have to use dinky little sinks to cool them. The Tri-X should be avoided then...


----------



## Redvineal

Quote:


> Originally Posted by *jerrolds*
> 
> ^ I'm flip flopping between not going WC at all vs getting a Swiftech H320 (basically an AIO CLC w/ a 360mm rad that is designed to be expanded) and add a full block ($150 for h320 + $100 for gpu block + fittings) vs going all out XSPC Photon (~$300 + $100 for gpu block)
> 
> I should maybe think about saving some money


I'm going to go ghetto at first. Richie's GPU Cool bracket seems really nice with its built in fan bracket. I have the full aluminum plate on my card with Fujipoly Extreme underneath on VRAM and VRM. I'm thinking an AIO on the core, and a slim 92mm fan directly cooling the VRM plate will leave me only with the project of finding a way to hide it all (you know, Kraken G10 style). The hiding will be especially necessary if I go with the Noctua slim 92mm fan!









Quote:


> Originally Posted by *gateh0use*
> 
> not sure how much a 240+mm radiator would be an improvement over a 140mm one, but my Corsair H90 keeps my 290x at 50c during full load


Thanks for the info. I can get an H80i for the same price as the H90 right now at $75. The former is 120mm and the latter is 140mm. I'm not too fond of the thickness of the H80i having 2 fans, but I could always use the PWM SP120's I already have on it to match all my case fans and overall theme. A rig has to look good, too, right?









Quote:


> Originally Posted by *battleaxe*
> 
> This is what I did. I have an AIO on each of my GTX670's in SLI and also an AIO on my new 290. Temps on the 290 never go above 54c under full load mining. Nowhere near that for gaming, under 50c max. My VRM's are under 55c too. I made some custom coolers out of old CPU coolers by cutting them into pieces to fit the VRM's and RAM. Used zip ties to secure everything along with thermal tape.
> 
> Looks hideous on the 290, but works great. They are all mining so I don't care about aesthetics too much. The 670's actually look really nice but the 290 looks downright awful.... It does work well though. Overall, I'm very happy I did it this way. I didn't want to drop $200 per card on custom loops. After the blocks and rads it would be at least that much. The AIO solution cost me 60.00 per GPU card. Saved about 140.00 per card I figure. Although, not as nice looking as a custom loop I admit. I doubt you'll regret it. I will warn though that seating the 290 with an AIO is kinda hard. I used the H80 and the intel bracket. Clipped the tabs off so they were only 1/4" long (just long enough to secure a zip tie) and used one zip tie on each corner. My first attempt was with the AMD bracket pulling across and this didn't work at all. Good luck to you.


I plan to use Richie's GPU Cool bracket instead of zip ties. It makes for extremely easy installation and has a fan mount included.

Check out http://www.overclock.net/t/1439434 for more info.

Are you using any fans directed at the heatsinks you chopped up for your VRM?


----------



## Pheozero

Quote:


> Originally Posted by *mojobear*
> 
> for anyone going crossfire under water with EK blocks...just as a heads up the EK terminal is B&$^& to put together with the blocks.....the O rings just do not want to stay in place. Took me a whole hour for 3 cards to fit together and even then I messed one up and there was a test leak... lesson in patience.


Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I'm using the EK Triple terminal. I've heard it's quite a pain! I'll know next week


Try using some silicone grease next time. It helps a bunch.


----------



## Menphisto

I have the feeling my chip is crazy...I can overclock it at stock voltage up to 1075 MHz, then with +25mV I can go up to a stable 1110 MHz, but then it takes +50mV for 1125 MHz and +100mV for 1150 MHz. That's a huge bump I think...is this normal?


----------



## jrcbandit

Quote:


> Originally Posted by *Menphisto*
> 
> I have the feeling my chip is crazy...I can overclock it at stock voltage up to 1075 MHz, then with +25mV I can go up to a stable 1110 MHz, but then it takes +50mV for 1125 MHz and +100mV for 1150 MHz. That's a huge bump I think...is this normal?


That's very normal.. to hit 1200 MHz you'll likely need +150mV, and around +200mV for 1240-1250 MHz.


----------



## Forceman

Quote:


> Originally Posted by *Menphisto*
> 
> I have the feeling my chip is crazy...I can overclock it at stock voltage up to 1075 MHz, then with +25mV I can go up to a stable 1110 MHz, but then it takes +50mV for 1125 MHz and +100mV for 1150 MHz. That's a huge bump I think...is this normal?


Yeah, there's always a point of diminishing returns.
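For what it's worth, the clock/voltage figures posted just above can be laid out to show exactly where the returns fall off. This is a rough Python sketch using those numbers; every chip bins differently, so treat it as illustrative only:

```python
# Voltage offset (mV) vs. stable core clock (MHz), as reported above.
steps = [(0, 1075), (25, 1110), (50, 1125), (100, 1150)]

# MHz gained per extra mV between each pair of settings.
for (v0, c0), (v1, c1) in zip(steps, steps[1:]):
    print(f"+{v1:>3} mV -> {c1} MHz ({(c1 - c0) / (v1 - v0):.2f} MHz/mV)")
```

The scaling drops from 1.40 MHz/mV for the first bump to 0.50 MHz/mV by the last one, which is the diminishing-returns curve in action.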


----------



## Menphisto

OK, then I see my 24/7 sweet spot at 1110 MHz. Do you think the same? (1450MHz mem; downclocking the mem doesn't give me a better core OC.)


----------



## psyside

Guys, the Tri-X OC hits ~70C max on the VRMs with *1.3v* @ 1150/1450


----------



## Tennobanzai

Is 1125MHz at stock voltage really good? I have the power limit set to -1% and I'm thinking of just leaving it at 0%


----------



## psyside

Hmm, do you have a 290 or a 290X? I would use +20% power limit to make sure you are not throttling. It's a good result, yes.


----------



## Tennobanzai

Quote:


> Originally Posted by *psyside*
> 
> Hmm, you got 290/290X? i would use +20% power limit to make sure you are not throttling, its good result yes.


It's a 290. I guess I can try 20% just for benchmarking, but I'd rather stay on the cooler side for normal usage.


----------



## Derpinheimer

Quote:


> Originally Posted by *Forceman*
> 
> Yeah, there's always a point of diminishing returns.


I'm finding that with the PT3 BIOS (aka no vdroop) I can get 1200 with no voltage change... 1225 is +100 and 1250 is impossible? lol, major diminishing returns. Might be the PSU though.
Quote:


> Originally Posted by *Tennobanzai*
> 
> Is 1125mhz with normal voltage really good? I have power limit set to -1% and i'm thinking of just leaving that at 0%


lol, -1%? What point would that have?

Whether or not it's really good depends on what your card's stock voltage is. But it's still good, yeah.


----------



## Tennobanzai

Quote:


> Originally Posted by *Derpinheimer*
> 
> Im finding with PT3 bios aka no vdroop I can get 1200 with no voltage change... 1225 is +100 and 1250 is impossible,? lol major diminishing returns. might be psu though.
> lol -1%? what point would that have?
> 
> whether or not its really good depends on what your cards stock voltage is. but its still good, yeah


I normally have it set to -5% or -6% but for benchmark I put it to 1%.


----------



## Derpinheimer

The odds of 1% affecting anything are very low. The only settings that really make sense are 0/20/50.

Negative? Some might find a use for it, but not at -1%.








0 for those who don't want to push more power than the card was designed for
20 for those who want headroom for an OC but less risk of some crazy power spike killing the card (aka it protects you if you are stupid and set the voltage really high and open FurMark)
50 for those who trust themselves and want no power-limit issues holding them back


----------



## psyside

Agreed, 1% or even 5% is useless; use 20% and you will be fine


----------



## Tennobanzai

Thanks for the replies. So I'm guessing my 1125MHz is stable, and possibly I can get a higher overclock if I went with more power limit + voltage?


----------



## Derpinheimer

The power limit won't really improve your OC capabilities in the way you might think, but it does make high OCs possible. All I mean by that is: when you increase voltage, the chip is actually capable of being stable at a higher clock speed. The power limit will prevent the clock speed from going that high if it's set too low, though.

I've tricked myself multiple times OCing without increasing the power limit. You think the card is stable at a certain clock speed and later realize that, due to the power limit being hit, the card throttled and was actually running at a slower speed than you thought.
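The throttling behaviour described here can be sketched with a toy model. The assumption (dynamic power scaling roughly with clock times voltage squared) and the 250W stock figure are illustrative only, not AMD's actual PowerTune algorithm:

```python
def effective_clock(target_mhz, vddc, power_limit_pct=0,
                    stock_mhz=1000, stock_vddc=1.2, stock_power_w=250):
    """Toy PowerTune-style model: if the requested clock/voltage combo
    would draw more than the power cap, the clock is pulled down until
    the draw fits under the cap (voltage is left alone)."""
    cap = stock_power_w * (1 + power_limit_pct / 100)
    demand = stock_power_w * (target_mhz / stock_mhz) * (vddc / stock_vddc) ** 2
    return target_mhz if demand <= cap else target_mhz * cap / demand

# A 1150 MHz / 1.25 V overclock at the default power limit silently throttles:
print(effective_clock(1150, 1.25))                      # ~922 MHz
print(effective_clock(1150, 1.25, power_limit_pct=50))  # 1150 MHz
```

That's exactly the trap above: the benchmark "passes", but the card was never actually running the clock you dialed in.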


----------



## Tennobanzai

Quote:


> Originally Posted by *Derpinheimer*
> 
> Powerlimit wont really improve your OC capabilities in the way you might think. But it does make high OCs possible. All I mean by that is when you increase voltage, the chip is actually capable of being stable at a higher clockspeed. Power limit will prevent the clock speed from going that high if its too low, though.
> 
> I've tricked myself multiple times OCing without increasing the power limit. You think the card is stable at a certain clockspeed and later realize, due to the power limit being hit, the card throttled and was actually running at a slower speed than you though.


Thanks for the info. Kinda sad now that my overclock might not actually be stable. It made it through 3 loops of Valley, though. I'll find out tonight.


----------



## Forceman

Quote:


> Originally Posted by *psyside*
> 
> Guys, Tri x oc is ~ 70c max on vrm's with *1.3v* @1150/1450


What fan speed was that at?
Quote:


> Originally Posted by *Derpinheimer*
> 
> Powerlimit wont really improve your OC capabilities in the way you might think. But it does make high OCs possible. All I mean by that is when you increase voltage, the chip is actually capable of being stable at a higher clockspeed. Power limit will prevent the clock speed from going that high if its too low, though.
> 
> I've tricked myself multiple times OCing without increasing the power limit. You think the card is stable at a certain clockspeed and later realize, due to the power limit being hit, the card throttled and was actually running at a slower speed than you though.


Yeah, I've done that a few times with the Afterburner power-limit bug. Set it up, run through Firestrike at a high clock, and get worse-than-stock scores. Then the lightbulb goes on, you check CCC, and the power limit is still 0% and the card was running 900MHz the whole time.


----------



## Derpinheimer

Quote:


> Originally Posted by *Forceman*
> 
> What fan speed was that at?
> Yeah, I've done that a few times with the Afterburner power limit bug. Set it up and run through Firestrike at a high clock and get worse than stock scores. Then the lightbulb goes on and you check CCC and the power limit is still 0% and the card was running 900MHz the whole time.


Looks like 70%, or 3300RPM.

I'm loving this PT3 BIOS with the infinite power limit.









Quote:


> Originally Posted by *Tennobanzai*
> 
> Thanks for the info. Kinda sad now that my overclock might not be actually stable. It made it through 3 loops of Valley tho. I'll find out tonight tho


Best of luck to ya. I think you'll be ok. With no voltage increase, power usage doesn't go up all that much.


----------



## Sazz

Quote:


> Originally Posted by *Forceman*
> 
> What fan speed was that at?
> Yeah, I've done that a few times with the Afterburner power limit bug. Set it up and run through Firestrike at a high clock and get worse than stock scores. Then the lightbulb goes on and you check CCC and the power limit is still 0% and the card was running 900MHz the whole time.


Quote:


> Originally Posted by *Sgt Bilko*
> 
> Thats just weird then...hopefully you figure it out. That GPU score mixed with a 10k physics score would be hella nice


I think it has something to do with the BIOS. I just got around to using the PT1 BIOS and I noticed right away that I'm getting a 9.8k Physics score now, but my combined score is down by 2 FPS; with the stock Sapphire BIOS I get 16-17 FPS on the combined run while the Physics run only gets 8.9-9k.

Really freakin' weird. I'm going to try other BIOSes from XFX and the stock Asus BIOS to see if they have any effect. I verified this by switching between PT1 and the stock Sapphire BIOS several times now: every time I have PT1 on, the Physics score is up and the combined score is down, while with the stock Sapphire BIOS the combined score is up and the Physics score is down. Really damn weird.


----------



## esqueue

Just started mining LTC and it seems I missed the train. At approximately $23 per coin and 868 kH/s, that's 0.27 LTC/day, i.e. a bit under 4 days of straight mining for around $25. Not worth the energy bill. I guess 1 card just isn't gonna cut it. Oh well, back to being a gaming card, as I originally purchased it for.
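The post's arithmetic checks out; here it is with an electricity cost added in. The 0.27 LTC/day and $23/LTC figures come from the post; the 300W wall draw and $0.12/kWh rate are assumptions for illustration:

```python
# Figures from the post above.
ltc_per_day = 0.27
usd_per_ltc = 23.0

# Assumptions (not from the post): wall draw for one 290 and power price.
watts = 300
usd_per_kwh = 0.12

gross = ltc_per_day * usd_per_ltc        # ~$6.21/day before power
power = watts / 1000 * 24 * usd_per_kwh  # ~$0.86/day in electricity
print(f"gross ${gross:.2f}/day, power ${power:.2f}/day, net ${gross - power:.2f}/day")
```

Whether the net is "worth it" then comes down to your local power price and how long the coin price holds.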


----------



## battleaxe

Quote:


> Originally Posted by *jerrolds*
> 
> VRM temps on stock are better than 3rd party solutions atm...you can get a decent overclock at quiet fan speeds as long as ambient isnt too bad. 1100mhz w/ 53% fan speed should hover around 90C on the core, and shouldnt throttle. At least for me it didnt.
> 
> 55% is where it got too loud for me personally.


Not entirely true. At least not for me. My VRM temps are under 40C for VRM1 and under 52C for VRM2. I made custom coolers out of old CPU heat sinks by cutting them down to size. They are about 3/8 of an inch wide, cover the length of the VRM sections, and are 1.25" tall, with about 5 fins per sink. With a 120mm fan pushing air over the card and a water cooler on the core I don't get any temps above 53C on anything on the card. Not quite as good as full water, but not too bad.

If you have some old CPU heat sinks and a hacksaw, you can make these heatsinks for the RAM and VRMs on the 290/290X series cards. Not exactly pretty, but if you're mining, who really cares? Not me.


----------



## PillarOfAutumn

Hey everyone. I ran into a bit of a problem. I'm very exhausted from work, so I apologize for not doing the research prior to this. I turned on my PC for the first time in about 2 months after successfully leak testing my rig. After turning it on I wanted to benchmark my Sapphire 290 to see how well it would perform. After setting it to uber mode, going into "Performance" in CCC, and maxing everything out for performance, a few minutes went by and my screen just turned black. I know people have had this happen to them and there's a fix for it. Can someone please tell me what to do?


----------



## Jack Mac

Black screen? That's usually a weird driver crash, at least for me, and requires a reboot to fix.


----------



## battleaxe

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Hey everyone. I ran into a bit of a problem. I'm very exhausted from work so I apologize for not doing the research prior to this. I turned on my PC for the first time in about 2 months after successfully leak testing my rig. After turning it on I wanted to benchmark my Sapphire 290 to see how well it would perform. After setting it to uber mode and then going into "Performance" in CCC and maxing everything out for performance, a few minutes went by and my screen just turned black. I know people have had this problem happen to them and theres a fix for it, Can someone please tell me what to do?


I've only seen black screens when my memory was clocked higher than it could handle... When I lowered the mem, all went back to normal and the black screens went away. Try that first.


----------



## Roy360

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Hey everyone. I ran into a bit of a problem. I'm very exhausted from work so I apologize for not doing the research prior to this. I turned on my PC for the first time in about 2 months after successfully leak testing my rig. After turning it on I wanted to benchmark my Sapphire 290 to see how well it would perform. After setting it to uber mode and then going into "Performance" in CCC and maxing everything out for performance, a few minutes went by and my screen just turned black. I know people have had this problem happen to them and theres a fix for it, Can someone please tell me what to do?


My card does the same thing. (I'm really happy I tested mine before adding it to my loop.) I just RMA'd it and am waiting for a replacement.

If you didn't keep your warranty stickers, increasing your voltage should solve the problem. Downclocking may help, but it doesn't always work. Mine got the black screen even at 947/1100 @ stock voltage.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Jack Mac*
> 
> Black screen? That's usually a weird driver crash, at least for me and requires a reboot to fix.


Quote:


> Originally Posted by *battleaxe*
> 
> I've only seen black screens when my memory was clocked higher than it could handle... When I lowered the mem, all went back to normal and blacks screens went away. Try that first.


Thanks a lot! I think it was some weird driver crash. It did help when I rebooted my PC. Another question: in CCC, if I were to push the circle all the way to the top-right corner, would that count as OCing? Currently I have my card set to a max temp of 95C and max fan speed of 100%, in uber mode. The card is watercooled. My GPU temp averages 34C at idle, and the VRMs are 26 and 28 at idle. Will run 3DMark in a little bit.


----------



## esqueue

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Hey everyone. I ran into a bit of a problem. I'm very exhausted from work so I apologize for not doing the research prior to this. I turned on my PC for the first time in about 2 months after successfully leak testing my rig. After turning it on I wanted to benchmark my Sapphire 290 to see how well it would perform. After setting it to uber mode and then going into "Performance" in CCC and *maxing everything out for performance,* a few minutes went by and my screen just turned black. I know people have had this problem happen to them and theres a fix for it, Can someone please tell me what to do?


Wouldn't that be his problem right there?

Pillar, did you max everything out? If yes: you have to increase the speeds gradually, not just max out everything.


----------



## PillarOfAutumn

Oh okay. So that graph in CCC is equivalent to MSI Afterburner? Sorry, I'm a total noob at this. Should I use Sapphire Trixx instead, since I have a Sapphire card? And before using it, what should I do in CCC so that Trixx gets full control?


----------



## Jack Mac

Use Afterburner or Trixx to overclock, it'll save you some trouble; most issues can usually be shrugged off as just driver issues. Seeing as this is a new card, AMD still has a good bit of polishing left to do. If you do use AB/Trixx, be sure to disable Overdrive in CCC.


----------



## ehpexs

Got my second 290 today. I haven't had any time to game, but they're loud (I need to order some water cooling stuff).

Anyways proof



Rig Pic (Crappy picture, messy case, don't care)



Boxes (Again messy and I really don't care)


----------



## ZealotKi11er

My VRM1 with the EK block hit 67C today. Do I have to do something special?

The core is 51C and VRM2 is 43C.

Also, I am using +15% Power Limit & +50mV Core Voltage.


----------



## Hogesyx

Quote:


> Originally Posted by *ZealotKi11er*
> 
> My VRM1 with EK Block hit today 67C. Do i have to do something special?
> 
> Core is 51C and VRM2 43C
> 
> Also i am using +15% Power Limit & +50mV Core Voltage.


What rad and loop configuration, and what's your ambient?


----------



## ehpexs

Pretty disappointed with the crossfire performance; one card is great, two are like a PowerPoint presentation.


----------



## ReHWolution

Quote:


> Originally Posted by *ehpexs*
> 
> Pretty disappointed with the crossfire performance, one card is great, two are like a powerpoint presentation,


Explain the joke 

However, I've given up on the waterblock idea: 1) it's too expensive, and 2) I usually swap video cards for reviews, so I'd have problems closing the loop around my video card. Instead I ordered an Icy Vision Rev. 2 on eBay; it's coming from the UK to Italy, expected to arrive between the 9th and 14th.


----------



## ehpexs

Quote:


> Originally Posted by *ReHWolution*
> 
> Explain the joke


It's no joke; games ran fine on one card, but on two they are literally a slide show. It's really weird.


----------



## kizwan

Quote:


> Originally Posted by *ReHWolution*
> 
> Explain the joke
> 
> However, I quitted thinking about a waterblock: 1) it's too expensive 2) i usually swap videocards for reviews so i would have some problems closing the loop with my videocard, I ordered a Icy Vision Rev. 2 on eBay in order to get that, it's coming from UK to Italy, expected to arrive between 9th and 14th


You can always use QDCs, but yeah, they're expensive.
Quote:


> Originally Posted by *ehpexs*
> 
> It's no joke, games ran fine on one card, on two they are literally a slide show. It's really weird.


What games? By "slide show" do you mean it lags or stutters?


----------



## ehpexs

Quote:


> Originally Posted by *kizwan*
> 
> What games? By "slide show" you meant it lags or stutters?


Like literally a frame a second


----------



## binormalkilla

I finally got everything leak tested the other night, but I haven't been able to test it out because of work. Still waiting on my RIVBE block to arrive from EK, but in the meantime I'm going to see how these load temps are...


http://www.techpowerup.com/gpuz/9edym/


----------



## Tennobanzai

After I put power limit to 0% I can hit 1130mhz on stock volt. Before with negative power limit 1125mhz was the only stable clock speed I could get. I guess I'll keep trying to go higher


----------



## kizwan

Quote:


> Originally Posted by *ehpexs*
> 
> Like literally a frame a second


Running fine here with BF3, DiRT 2 & BF4. I assume you already tried reinstalling the driver? Do both GPUs run fine as single cards in their respective PCIe slots? (Meaning: with Crossfire disabled, test both cards one at a time by plugging the HDMI/DVI cable into the individual card.)


----------



## Hogesyx

Quote:


> Originally Posted by *ReHWolution*
> 
> Explain the joke
> 
> However, I quitted thinking about a waterblock: 1) it's too expensive 2) i usually swap videocards for reviews so i would have some problems closing the loop with my videocard, I ordered a Icy Vision Rev. 2 on eBay in order to get that, it's coming from UK to Italy, expected to arrive between 9th and 14th


QDCs are designed specifically for that purpose, as well as for ease of draining a loop. WC has a high initial setup cost, but it generally breaks even in around 3-5 years, especially if you use a universal GPU block.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Hogesyx*
> 
> What rad and loop configuration as well as ambient?


24C ambient. More than enough rad: 360 + 240 in push/pull. I had 2 x HD 7970 before.


----------



## binormalkilla

Here are my temps with stock clocks and power settings on Trifire 290X in BF3 for 3 rounds. Reported using AIDA64 and MSI afterburner. Ambient temps ~22C.


----------



## Forceman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> My VRM1 with EK Block hit today 67C. Do i have to do something special?
> 
> Core is 51C and VRM2 43C
> 
> Also i am using +15% Power Limit & +50mV Core Voltage.


Did you use the thermal pads that came with the EK block? Mine also hit around 60C under load - I'm not worried about it, but it is higher than a lot of people report. I get about 50-55C on the core (at 1200/+100). I just used the EK pads.


----------



## ZealotKi11er

Quote:


> Originally Posted by *binormalkilla*
> 
> Here are my temps with stock clocks and power settings on Trifire 290X in BF3 for 3 rounds. Reported using AIDA64 and MSI afterburner. Ambient temps ~22C.


Sucks that we don't know the actual GPU usage on the 290X. I can tell you for sure that it's under 80% because of the low load temps.
Quote:


> Originally Posted by *Forceman*
> 
> Did you use the thermal pads that came with the EK block? Mine also hit around 60C under load - I'm not worried about it, but it is higher than a lot of people report. I get about 50-55C on the core (at 1200/+100). I just used the EK pads.


Yes, I used the stock pads.


----------



## binormalkilla

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Sucks that we dont know actual GPU usage for 290X. I can tell you for sure that is under 80% because of low load temps.
> Yes i used stock pad.


Unless MSI afterburner is inaccurate, it seems like Crossfire is doing some load balancing. I will test with other programs eventually, but for now I need my beer and BF4









If you want me to test with something else let me know. For now I plan on running 3dmark, however BF4 is the only game I have installed ATM.


----------



## ZealotKi11er

Quote:


> Originally Posted by *binormalkilla*
> 
> Unless MSI afterburner is inaccurate, it seems like Crossfire is doing some load balancing. I will test with other programs eventually, but for now I need my beer and BF4
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you want me to test with something else let me know. For now I plan on running 3dmark, however BF4 is the only game I have installed ATM.


Quote:


> Originally Posted by *Forceman*
> 
> Did you use the thermal pads that came with the EK block? Mine also hit around 60C under load - I'm not worried about it, but it is higher than a lot of people report. I get about 50-55C on the core (at 1200/+100). I just used the EK pads.


Test with something that will push 3 x 290X @ 100% for sure. I know BF3 could not even push 2 x HD 7970 @ 100% all the time. At least it never did for me.


----------



## binormalkilla

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Test something that will push 3 x 290X @ 100% for sure. I know BF3 could not even push 2 x HD 7970 @ 100% all the time. Actually i never did for me.


I'll see what I can do.


----------



## kizwan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> My VRM1 with EK Block hit today 67C. Do i have to do something special?
> 
> Core is 51C and VRM2 43C
> 
> Also i am using +15% Power Limit & +50mV Core Voltage.


Is this temp while gaming or stress testing? Remember, binormalkilla's 290Xs are running @ stock clocks; his temps will be lower than yours.


----------



## binormalkilla

Also, I used thermal pads along with the EK Ektotherm TIM on both the GPU and the VRMs. Just a little dab'll do ya


----------



## ZealotKi11er

Quote:


> Originally Posted by *kizwan*
> 
> Is this temp while gaming or stress test? Remember, binormalkilla's 290X's is running @stock clock. Temp will be lower than yours.


Mining. I hit 45C gaming, but I only have 1 card, not 3.


----------



## PillarOfAutumn

I'm playing Sleeping Dogs at 1080p, everything maxed. I have one 360mm (45mm thick) rad and another 120mm (60mm thick). After a few hours of gameplay my temps are:

GPU: 47.6C
VRM1: 45C
VRM2: 39C


----------



## binormalkilla

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Mining. I hit 45C gaming but only have 1 card not 3.


I've been meaning to configure cgminer to (at least) test GPU load temps. I may leave it running if it's worth it. This PC could replace my central heat (almost).


----------



## Yvese

Anyone have the Sapphire Tri-X? Would love a video to see how loud it is at 100%.


----------



## kizwan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Mining. I hit 45C gaming but only have 1 card not 3.


Your temps look fine then. I have two cards & the temps are pretty much the same whether I'm running single or dual cards. VRM1 is going to run hotter since it handles the core. BTW, do you have a backplate on your card?


----------



## Hogesyx

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Mining. I hit 45C gaming but only have 1 card not 3.


Then it is about right. I had 64C at stock voltage with a single 360x60mm rad prior to adding a 480.

Adding 50mV and an additional 240 should put you around your level.


----------



## zpaf

Quote:


> Originally Posted by *Sazz*
> 
> I think it has something to do with the BIOS, I just got around using PT1 BIOS and I noticed right away that I am getting 9.8k Physics score now, but my combined score is down by 2FPS with PT1 BIOS, while with stock Sapphire BIOS I am getting 16-17Fps on combined run while Physics run gets 8.9-9k only.
> 
> really freakin weird. Imma use other BIOS's from XFX and stock Asus BIOS to see if that have any effect on it. I did verify this by switching from PT1 and stock Sapphire BIOS several times now and everytime I have PT1 on Physics score is up, combined score is down while Sapphire stock BIOS combined score is up physics score is down, really damn weird.


Same here.
4770K at defaults

official bios

unofficial bios


----------



## ZealotKi11er

So I had my mining configured with intensity 20 (-I 20) and VRM1 hit 75C. Maybe this is not a typical game load?


----------



## ReHWolution

Quote:


> Originally Posted by *Hogesyx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ReHWolution*
> 
> Explain the joke
> 
> However, I quitted thinking about a waterblock: 1) it's too expensive 2) i usually swap videocards for reviews so i would have some problems closing the loop with my videocard, I ordered a Icy Vision Rev. 2 on eBay in order to get that, it's coming from UK to Italy, expected to arrive between 9th and 14th
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> QDC is design specifically for that purpose, as well as ease of draining a loop. WC has high initial setup cost, but the cost generally will break even at around 3-5 years time, especially true if you use a universal GPU block.

I thought about Quick Disconnect Couplings, but they're expensive, and tbh I can barely afford just the WB + backplate; do you think I can afford WB + backplate + QDCs? Also, in 3 years I'll upgrade my system 3-4 times, and since I meant to get a full-cover WB on my 290X, it wouldn't carry over. So I wouldn't be able to reuse the components once I upgrade :\


----------



## ReHWolution

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So I had my mining configured to intensity 20 (-I 20) and VRM1 hit 75C. Maybe this is not a typical game load?


Definitely not a "typical game load" - litecoin mining is to gaming what LinX is to Cinebench.


----------



## Zenophobe

http://imageshack.us/photo/my-images/69/z5q4.png/

Uploaded with ImageShack.us

Purchased 12/21/2013


----------



## ReHWolution

Quote:


> Originally Posted by *Zenophobe*
> 
> http://imageshack.us/photo/my-images/69/z5q4.png/
> 
> Uploaded with ImageShack.us
> 
> Purchased 12/21/2013


Very nice! How come 51°C on idle?


----------



## ZealotKi11er

Quote:


> Originally Posted by *ReHWolution*
> 
> Definitely not a "typical game load", litecoin mining : game = Linx : cinebench


Still need to do something about it.


----------



## kizwan

The Aquacomputer active backplate helps reduce the VRM1 temp, so attaching a heatsink will help reduce it too. I'm referring to VRM1 (the MOSFETs?) at the back of the card.


----------



## ZealotKi11er

Quote:


> Originally Posted by *kizwan*
> 
> The Aquacomputer active backplate helps reduce the VRM1 temp, so attaching a heatsink will help reduce it too. I'm referring to VRM1 (the MOSFETs?) at the back of the card.


There could still be something wrong with the stock EK pads, because VRMs should not get that hot under water.


----------



## Hogesyx

Quote:


> Originally Posted by *ZealotKi11er*
> 
> There could still be something wrong with the stock EK pads, because VRMs should not get that hot under water.


Well, it's the 290X VRM we are talking about. The VRM is also the main reason I added a 480 rad, to reduce the delta.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Hogesyx*
> 
> Well, it's the 290X VRM we are talking about. The VRM is also the main reason I added a 480 rad, to reduce the delta.


I have more than enough rad: 360 + 240 with GT15s in push/pull.


----------



## Hogesyx

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I have more than enough rad: 360 + 240 with GT15s in push/pull.


I am not saying you have insufficient rad. I am saying the 290X VRM is hot.


----------



## the9quad

Quote:


> Originally Posted by *Hogesyx*
> 
> I am not saying you have insufficient rad. I am saying the 290X VRM is hot.


They're not very hot if you don't overclock and use a custom fan profile.
My hottest VRM gets to 62C and the coolest is at 42C, at stock speeds with stock coolers after long sessions of BF4.
That's with the top card at 70C, the middle at 76C, and the bottom card at 65C.
Fan speeds are 65, 68, and 60% respectively.

Ran BF4 for about 2 hours straight last night while monitoring them, in fact. That's with the cards all smooshed together, so the heat is probably the worst on my rig. I'm sure overclocking throws all of that out the window, though. I was just posting this to clarify that at stock they aren't bad at all. Not to say you're wrong.

I can run a video of it today when I play, to show it.


----------



## smoke2

I've decided to buy a Cooler Master V instead of a Seasonic X, because of the fan.
But I don't know which one to get: 700W, or rather 850W to be on the safe side?
The price here is 110€ for the 700W and 140€ for the 850W.
Not planning to do Crossfire, but I want it to last some years...


----------



## cplifj

Hah,

unfortunately I have an Asus and I'm using GPU Tweak, which hasn't got nearly as many setpoints for a fan profile.

Of course I knew this already; I'd just like to see things done and finished when we customers buy supposedly top gear.

I'm also no fan of installing several different pieces of software on my PC to do the same thing; it mostly just makes a mess of the Windows registry.

Another point I get worked up over: no company ever takes the effort to really clean up all its files on uninstall.

Anyway, I'm not going to get into arguments over this subject; these points are not moot at all.

Grtz


----------



## dspacek

Add me to this club please ;-) On air I was able to reach this OC with Trixx and without touching the voltage. Next week it goes under water with this block: http://www.alphacool.cz/produkt/Watercool_Heatkiller_GPU-X3_AMD_R9-290X.html
Then I'll try an extreme OC with AB.
So how can I set a higher RAM voltage with AB? I'm not able to set it.







Only AUX!

PS: ASIC is 71,4 %


----------



## PillarOfAutumn

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I have more than enough rad: 360 + 240 with GT15s in push/pull.


Keep in mind it can also be your thermal pad. I'm using a Fujipoly thermal pad, which is why I think the VRMs are staying under ~37C while gaming in BF4.


----------



## shilka

Quote:


> Originally Posted by *smoke2*
> 
> I've decided to buy a Cooler Master V instead of a Seasonic X, because of the fan.
> But I don't know which one to get: 700W, or rather 850W to be on the safe side?
> The price here is 110€ for the 700W and 140€ for the 850W.
> Not planning to do Crossfire, but I want it to last some years...


Even 700 watts is overkill for a 290(X).

If you are in the EU, why not grab a Super Flower unit instead?

http://www.caseking.de/shop/catalog/advanced_search_result.php?keywords=Super+Flower


----------



## psyside

Quote:


> Originally Posted by *shilka*
> 
> Even 700 watts is overkill for a 290(X).
> 
> If you are in the EU, why not grab a Super Flower unit instead?
> 
> http://www.caseking.de/shop/catalog/advanced_search_result.php?keywords=Super+Flower


Not overkill if you use extra volts... this card uses tons of power if you OC with added voltage.


----------



## bond32

Quote:


> Originally Posted by *psyside*
> 
> Not overkill if you use extra volts... this card uses tons of power if you OC with added voltage.


Yeah, it's still overkill. Even with extreme overvoltage, the absolute maximum the card can physically draw is somewhere around 350 watts... and the rest of the system does not use another 350 watts.


----------



## shilka

Quote:


> Originally Posted by *psyside*
> 
> Not overkill if you use extra volts... this card uses tons of power if you OC with added voltage.


Yes, I thought that was obvious enough.


----------



## kizwan

Quote:


> Originally Posted by *smoke2*
> 
> I've decided to buy a Cooler Master V instead of a Seasonic X, because of the fan.
> But I don't know which one to get: 700W, or rather 850W to be on the safe side?
> The price here is 110€ for the 700W and 140€ for the 850W.
> Not planning to do Crossfire, but I want it to last some years...


What do you mean "because of fan"?


----------



## chiknnwatrmln

Do I need to use TIM with thermal pads? I'm going to be using Fujipoly Extreme Ultra, so any TIM won't be as thermally conductive as those pads.


----------



## smoke2

Quote:


> Originally Posted by *shilka*
> 
> Even 700 watts is overkill for a 290(X).
> 
> If you are in the EU, why not grab a Super Flower unit instead?
> 
> http://www.caseking.de/shop/catalog/advanced_search_result.php?keywords=Super+Flower


I'm from the EU, but it's practically not sold in my country.
Is it better than the Cooler Master?
Quote:


> Originally Posted by *bond32*
> 
> Yeah, it's still overkill. Even with extreme overvoltage, the absolute maximum the card can physically draw is somewhere around 350 watts... and the rest of the system does not use another 350 watts.


OK, so with 700W I will be safe, with a good reserve?


----------



## psyside

Quote:


> Originally Posted by *bond32*
> 
> Yeah, it's still overkill. Even with extreme overvoltage, the absolute maximum the card can physically draw is somewhere around 350 watts... and the rest of the system does not use another 350 watts.


Not overkill.


----------



## psyside

Quote:


> Originally Posted by *smoke2*
> 
> Is it better than the Cooler Master?


No. As for the fan, it's the highest class on the market right now.


----------



## shilka

Quote:


> Originally Posted by *smoke2*
> 
> I'm from the EU, but it's practically not sold in my country.
> Is it better than the Cooler Master?
> OK, so with 700W I will be safe, with a good reserve?


In some areas the Leadex is better than the V.


----------



## Durvelle27

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Do I need to use TIM with thermal pads? I'm going to be using Fujipoly Extreme Ultra, so any TIM won't be as thermally conductive as those pads.


I use Shin-Etsu X23-7783D on the core & VRMs/VRAM, and it's some of the best TIM while also being non-conductive.


----------



## Forceman

Quote:


> Originally Posted by *psyside*
> 
> Not overkill.


Unless you are planning to use the PT3 (or some other modded BIOS), 700W is plenty for a single card. I've yet to see more than 450W from the wall and I've run mine as high as +200mV with Trixx.


----------



## psyside

Well, we are talking about extreme runs here, but good to know, thanks.


----------



## Forceman

Quote:


> Originally Posted by *psyside*
> 
> Well, we are talking about extreme runs here, but good to know, thanks.


Why go off on tangents and confuse everyone? The original question was whether 700W would be enough for a single card, which it is. I don't know why people always run down the "absolute max" rabbit hole when the only people who intend to do something like that know full well what the requirements and power consumption are.


----------



## psyside

700W is about 100W more than needed. For me, running the PSU at around 80-85% of its full power is not overkill. Don't know about you, but I like my PSUs to be loaded 70-80% at most when I bench or game/OC, not close to the limit, and if we take efficiency into account it's better to have some room left.

Also, I had no idea how the original poster would run his 290X; maybe he'll run 1.4V, who knows... if not, 700W is more than enough.


----------



## bond32

Quote:


> Originally Posted by *psyside*
> 
> 700W is about 100W more than needed. For me, running the PSU at around 80-85% of its full power is not overkill. Don't know about you, but I like my PSUs to be loaded 70-80% at most when I bench or game/OC, not close to the limit, and if we take efficiency into account it's better to have some room left.
> 
> Also, I had no idea how the original poster would run his 290X; maybe he'll run 1.4V, who knows... if not, 700W is more than enough.


Why though? Your idea of an "overloaded" PSU is weird. A PSU rated for 700 watts supplying 600 watts only differs from a PSU rated for 800 watts supplying 600 watts in the power bill. If I had a system that pulled 1000 watts, I would personally buy a 1000-watt PSU... It makes no difference as long as you purchase a quality PSU.


----------



## smoke2

Quote:


> Originally Posted by *psyside*
> 
> 700W is about 100W more than needed. For me, running the PSU at around 80-85% of its full power is not overkill. Don't know about you, but I like my PSUs to be loaded 70-80% at most when I bench or game/OC, not close to the limit, and if we take efficiency into account it's better to have some room left.
> 
> Also, I had no idea how the original poster would run his 290X; maybe he'll run 1.4V, who knows... if not, 700W is more than enough.


I also like to have some reserve.
I will probably be overclocking it.
Quote:


> Originally Posted by *bond32*
> 
> Why though? Your idea of an "overloaded" PSU is weird. A PSU rated for 700 watts supplying 600 watts only differs from a PSU rated for 800 watts supplying 600 watts in the power bill. If I had a system that pulled 1000 watts, I would personally buy a 1000-watt PSU... It makes no difference as long as you purchase a quality PSU.


It's practically the same; only the efficiency will probably be worse.
It's best from 50-60% load, and above about 75% it drops off relatively quickly.
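The efficiency argument above can be sketched numerically. This is a rough illustration only: the `(load, efficiency)` curve points below are made-up placeholders in the general shape of an 80 Plus Gold unit, not measured data for the Cooler Master V or any other specific PSU.

```python
def load_fraction(system_draw_w: float, psu_rating_w: float) -> float:
    """DC-side load as a fraction of the PSU's rated capacity."""
    return system_draw_w / psu_rating_w

def efficiency_estimate(load_frac: float) -> float:
    """Linear interpolation over an assumed Gold-style efficiency curve.
    The (load, efficiency) points are illustrative placeholders."""
    curve = [(0.1, 0.85), (0.2, 0.90), (0.5, 0.92), (1.0, 0.89)]
    if load_frac <= curve[0][0]:
        return curve[0][1]
    for (x0, y0), (x1, y1) in zip(curve, curve[1:]):
        if load_frac <= x1:
            return y0 + (y1 - y0) * (load_frac - x0) / (x1 - x0)
    return curve[-1][1]

# A single overclocked 290X system drawing ~450 W DC:
print(load_fraction(450, 700))   # ~0.64 of a 700 W unit
print(load_fraction(450, 850))   # ~0.53 of an 850 W unit
print(efficiency_estimate(0.64) - efficiency_estimate(0.53))
```

On a curve shaped like this, the 700W and 850W units end up less than one percentage point apart in efficiency at that load, which is the point being made: past mid-load, the size difference barely matters.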


----------



## psyside

Quote:


> Originally Posted by *bond32*
> 
> Why though? Your idea of an "overloaded" PSU is weird. A PSU rated for 700 watts supplying 600 watts only differs from a PSU rated for 800 watts supplying 600 watts in the power bill. If I had a system that pulled 1000 watts, I would personally buy a 1000-watt PSU... It makes no difference as long as you purchase a quality PSU.


I know, and I know you are right, but generally I like to have some headroom. I feel better when I know my PSU runs with a bit to spare.


----------



## esqueue

Quote:


> Originally Posted by *bond32*
> 
> Why though? Your idea of an "overloaded" PSU is weird. A PSU rated for 700 watts supplying 600 watts only differs from a PSU rated for 800 watts supplying 600 watts in the power bill. If I had a system that pulled 1000 watts, I would personally buy a 1000-watt PSU... It makes no difference as long as you purchase a quality PSU.


The following is a basic principle of most electronic devices, and I'm quite certain it also applies to computer power supplies: running any power supply at 100% capacity is always a bad idea. The closer you run it to that capacity, the more wear and tear it takes. A power supply only supplies as much as the load draws, so a more capable PSU running at 60% should not pull much more than one running at 100%.

I guess it boils down to how the manufacturer rates the power supply. If they rate it at the absolute max, that is bad. If they rate it for continuous draw, then purchasing the exact amount should be fine.


----------



## Sazz

Quote:


> Originally Posted by *bond32*
> 
> Why though? Your idea of an "overloaded" PSU is weird. A PSU rated for 700 watts supplying 600 watts only differs from a PSU rated for 800 watts supplying 600 watts in the power bill. If I had a system that pulled 1000 watts, I would personally buy a 1000-watt PSU... It makes no difference as long as you purchase a quality PSU.


Even quality PSUs degrade. If you are pulling 1000 watts and you bought a "just enough" 1000-watt PSU, over time that PSU will have a hard time supplying the 1000 watts needed, especially if it's constantly putting out its maximum rated power. I always get at least 100 watts of headroom so the PSU doesn't have to work its ass off at 100% all the time, plus in case you add something else in the future you won't have to worry about upgrading your PSU.

I got an overkill Seasonic X1250 (1250 watts) PSU for my system, which contains an 8350 @ 5GHz + a single 290X (with all the other stuff like watercooling). Not because I want that MUCH headroom, but because I bought it for 140 bucks (retail on this PSU is 290 bucks); the huge headroom is just a bonus, it's a quality PSU, and another bonus is that if I decide to add another 290X, or maybe two more, I won't have to worry about it. Right now, with the 8350 @ 5GHz and the 290X at my daily clocks, I'm barely using 654 watts under load (Furmark + Cinebench), and if the reviews are right a stock 290X adds around 250-300 watts, so I believe I can add two more 290Xs without worries.


----------



## shilka

I am just going to leave this here

http://www.overclock.net/t/872013/50-load-myth
http://www.overclock.net/t/711542/on-efficiency


----------



## Arizonian

Quote:


> Originally Posted by *binormalkilla*
> 
> I finally got everything leak tested the other night, but I haven't been able to test it out because of work. Still waiting on my RIVBE block to arrive from EK, but in the meantime I'm going to see how these load temps are...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/9edym/


Congrats - added
















Quote:


> Originally Posted by *Zenophobe*
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Uploaded with ImageShack.us
> 
> Purchased 12/21/2013


Congrats - added









Quote:


> Originally Posted by *dspacek*
> 
> Add me to this club please;-) On the air I was able to reach this OC with TRIXX and without touching voltage. Next week it will get under water with this block http://www.alphacool.cz/produkt/Watercool_Heatkiller_GPU-X3_AMD_R9-290X.html
> Then I try extreme OC with AB
> So how can I set higher voltage of RAM with AB? Im not able to set it
> 
> 
> 
> 
> 
> 
> 
> Only AUX!
> 
> PS: ASIC is 71,4 %
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added







Like to see it under water, update us.


----------



## psyside

Quote:


> Originally Posted by *shilka*
> 
> I am just going to leave this here
> 
> http://www.overclock.net/t/872013/50-load-myth
> http://www.overclock.net/t/711542/on-efficiency


Thanks for that.

Now let's talk about mining a bit. If you have a 1000W PSU and you pull only 700W, the 12V rail will degrade much more slowly than if you run it at 900W; this has been proven by many miners by now. Also, you can't have a PSU whose rail won't degrade over time; that's not possible. So yeah, if you push your PSU often, it's still a decent choice to go for about 20% more than you need. It's not necessary, but it's the better option IMHO, because the prices are very close.

The threads you linked, although correct on many points, don't cover extreme-draw usage scenarios. For the casual gamer they're spot on, but for crazy runs and mining I'm not so sure.


----------



## dspacek

Quote:


> Originally Posted by *Forceman*
> 
> Unless you are planning to use the PT3 (or some other modded BIOS), 700W is plenty for a single card. I've yet to see more than 450W from the wall and I've run mine as high as +200mV with Trixx.


So what is your OC on the R9 290X now? Are you water cooled? I'm stable at 1120MHz core and 1380MHz RAM on air. I hope I can reach around 1250-1300MHz core and 1700MHz RAM (Hynix) on water, as my HD7950 could, but I have a locked core.


----------



## Jack Mac

Quote:


> Originally Posted by *dspacek*
> 
> So what is your OC on the R9 290X now? Are you water cooled? I'm stable at 1120MHz core and 1380MHz RAM. I hope I can reach around 1250-1300MHz core and 1700MHz RAM (Hynix), as my HD7950 could, but I have a locked core.


I advise staying at or below +100mV if you want to run an OC like that on the stock air cooler. +200mV should be used sparingly - as in, just for benching.


----------



## sun100

Quote:


> Originally Posted by *bond32*
> 
> Why though? Your idea of an "overloaded" PSU is weird. A PSU rated for 700 watts supplying 600 watts only differs from a PSU rated for 800 watts supplying 600 watts in the power bill. If I had a system that pulled 1000 watts, I would personally buy a 1000-watt PSU... It makes no difference as long as you purchase a quality PSU.


Quote:


> Originally Posted by *darkelixa*
> 
> I have a sapphire r9 290 that runs fine on my 750w psu


Have to agree here; people have a serious misconception about wattage and how much a 1000W PSU really is. 1kW is serious power, and a puny little PC is nothing for it. It's just marketing and/or a bad PSU. I still remember the days at an IT retailer when we ordered a container of PSUs from one of the well-known manufacturers and were asked, as if it were perfectly normal, how much wattage we wanted written on the PSU sticker.

So, as you said, even 600W will run the R9 series just fine, as long as it's a quality PSU.


----------



## dspacek

Quote:


> Originally Posted by *Jack Mac*
> 
> I advise staying at or below +100mV if you want to run an OC like that on the stock air cooler. +200mV should be used sparingly - as in, just for benching.


200mV is no problem. With the HD7950, which was 1.1V stock, I was able to manage 1.3V all day long with no problems, at almost 1300MHz.


----------



## ZealotKi11er

Quote:


> Originally Posted by *dspacek*
> 
> So what is your OC on the R9 290X now? Are you water cooled? I'm stable at 1120MHz core and 1380MHz RAM on air. I hope I can reach around 1250-1300MHz core and 1700MHz RAM (Hynix) on water, as my HD7950 could, but I have a locked core.


It's not going to happen. You'd be lucky to even get 1.2GHz.


----------



## dspacek

Quote:


> Originally Posted by *sun100*
> 
> Have to agree here; people have a serious misconception about wattage and how much a 1000W PSU really is. 1kW is serious power, and a puny little PC is nothing for it. It's just marketing and/or a bad PSU. I still remember the days at an IT retailer when we ordered a container of PSUs from one of the well-known manufacturers and were asked, as if it were perfectly normal, how much wattage we wanted written on the PSU sticker.
> 
> So, as you said, even 600W will run the R9 series just fine, as long as it's a quality PSU.


Yes, that's true. I think a 550W PSU will handle a non-OCed R9 290 absolutely without problems. But I like cool-running PSUs, and I bought my Xilence 800W for $50, so why not?


----------



## dspacek

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It's not going to happen. You'd be lucky to even get 1.2GHz.


1.2GHz on stock voltage?
I'm at 1.12GHz on stock voltage, so I hope more than 1.2GHz will be possible.
And what about memory? What was the maximum for Hynix? I hope I can get around 1700MHz.


----------



## shilka

Quote:


> Originally Posted by *dspacek*
> 
> Yes, that's true. I think a 550W PSU will handle a non-OCed R9 290 absolutely without problems. But I like cool-running PSUs, and I bought my Xilence 800W for $50, so why not?


The reason it was so cheap is that it's so bad.

Buy quality; don't buy wattage you don't need.


----------



## Forceman

Quote:


> Originally Posted by *dspacek*
> 
> So what is your OC on the R9 290X now? Are you water cooled? I'm stable at 1120MHz core and 1380MHz RAM on air. I hope I can reach around 1250-1300MHz core and 1700MHz RAM (Hynix) on water, as my HD7950 could, but I have a locked core.


I run it at 1100/1350 (stock voltages) day to day, but it'll do 1250/1500 at +175mV (water cooled). Can't get much more than that on the memory though. I don't think you should expect 1250-1300 though unless you want to run a modded BIOS. 1200 seems to be about the normal high-end overclock on these.


----------



## Roy360

Quote:


> Originally Posted by *smoke2*
> 
> I've decided to buy a Cooler Master V instead of a Seasonic X, because of the fan.
> But I don't know which one to get: 700W, or rather 850W to be on the safe side?
> The price here is 110€ for the 700W and 140€ for the 850W.
> Not planning to do Crossfire, but I want it to last some years...


Those prices seem a little excessive. In Canada I've seen:
XFX 850W Black Pro (Seasonic KM3) go for 114 CAD
Lepa G850W go for 74 CAD
EVGA 750B go for 29 CAD
Corsair CX750 go for 50 CAD
Corsair AX1200 go for 199 CAD

At the very least, those are all the PSUs I've bought in the last couple of weeks. Power supplies bought in Canada will work even better in Europe than they do in America (they run slightly more efficiently on 230V). Maybe importing would be cheaper?


----------



## Derpinheimer

Quote:


> Originally Posted by *dspacek*
> 
> 1.2GHz on stock voltage?
> I'm at 1.12GHz on stock voltage, so I hope more than 1.2GHz will be possible.
> And what about memory? What was the maximum for Hynix? I hope I can get around 1700MHz.


Haven't seen anyone get over the mid-1600s on memory. Hynix OCs just a bit better.


----------



## velocityx

Quote:


> Originally Posted by *Derpinheimer*
> 
> Haven't seen anyone get over the mid-1600s on memory. Hynix OCs just a bit better.


Put an Asus BIOS on that Elpida and it will overclock like Hynix; at least that's what Gibbo over at overclockers.co.uk found in testing.


----------



## ZealotKi11er

Quote:


> Originally Posted by *dspacek*
> 
> 1.2GHz on stock voltage?
> I'm at 1.12GHz on stock voltage, so I hope more than 1.2GHz will be possible.
> And what about memory? What was the maximum for Hynix? I hope I can get around 1700MHz.


I can get 1150MHz stock voltage. I could not do 1200MHz with +100mV.


----------



## dspacek

Quote:


> Originally Posted by *shilka*
> 
> Reason why it was so cheap is because its so bad
> 
> Buy quality dont buy wattage you dont need


Xilence is bad quality? Since when? The original price was more than $100, and it has nice performance and is very steady. It managed an HD7950 and HD7970 in CF (both OCed) with an i5-2500K at 5.0GHz without blowing out hot air.


----------



## dspacek

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I can get 1150MHz stock voltage. I could not do 1200MHz with +100mV.


That's a pity. I will post my results next week. ;-)


----------



## zpaf

This is with official asus bios.



and this is with unofficial PT1 bios.


----------



## shilka

Quote:


> Originally Posted by *dspacek*
> 
> Xilence is bad quality? Since when? The original price was more than $100, and it has nice performance and is very steady. It managed an HD7950 and HD7970 in CF (both OCed) with an i5-2500K at 5.0GHz without blowing out hot air.


It's not made by Xilence.

Xilence has two 800-watt units, and both are made by Andyson.

Andyson is low quality more than 95% of the time.

Since there are no reviews of it, and given Andyson's track record, I would say you bought a junker.

Not trying to be rude, so don't take it that way.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Do I need to use TIM with thermal pads? I'm gonna be using FujiPoly Extreme Ultra, so any TIM won't be as thermally conductive as those pads.


No. I just stuck my Fujipoly on and that was it. No hassles or problems. Amazing temps on the VRMs, so I'd imagine the memory is doing just as well.


----------



## smoke2

Quote:


> Originally Posted by *Roy360*
> 
> Those prices seem a little excessive. In Canada I've seen:
> XFX 850W Black Pro (Seasonic KM3) go for 114 CAD
> Lepa G850W go for 74 CAD
> EVGA 750B go for 29 CAD
> Corsair CX750 go for 50 CAD
> Corsair AX1200 go for 199 CAD
> 
> At the very least, those are all the PSUs I've bought in the last couple of weeks. Power supplies bought in Canada will work even better in Europe than they do in America (they run slightly more efficiently on 230V). Maybe importing would be cheaper?


Yes, they are really overpriced here, and these prices are the lower ones.
On the other hand, the Sapphire Tri-X 290 is 400€ here.


----------



## shilka

You have not seen overpriced until you have seen the price levels in Denmark, Norway, and Finland.

25% tax on everything, and 37% of my income goes to tax alone.


----------



## ZealotKi11er

Quote:


> Originally Posted by *shilka*
> 
> You have not seen overpriced until you have seen the price levels in Denmark, Norway, and Finland.
> 
> 25% tax on everything, and 37% of my income goes to tax alone.


Do you pay for School?


----------



## TheSoldiet

Quote:


> Originally Posted by *shilka*
> 
> You have not seen overpriced until you have seen the price levels in Denmark, Norway, and Finland.
> 
> 25% tax on everything, and 37% of my income goes to tax alone.


Agreed! Here in Norway everything is utterly expensive.
I paid 763 USD for my 290X in Norway, when the MSRP is actually 550 USD. And Norway hasn't even been affected by miners.

Zealot: No, we don't. Hospitals are free too.


----------



## shilka

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Do you pay for School?


Only private schools

Public schools are free


----------



## INCREDIBLEHULK

Do any of you own a Sapphire card? I was curious how their warranty policy works, and whether you've had good/bad experiences with them.


----------



## PillarOfAutumn

So I'm OCing using Sapphire Trixx. After each conservative bump, I run 3DMark 11 to check for stability. So far I've gone up to 990 core and 1310 memory, with no voltage increase. I feel like 3DMark doesn't last long enough to truly test stability. What should I be using? Unigine? Catzilla?


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> So I'm OCing using Sapphire Trixx. After each conservative bump, I run 3DMark 11 to check for stability. So far I've gone up to 990 core and 1310 memory, with no voltage increase. I feel like 3DMark doesn't last long enough to truly test stability. What should I be using? Unigine? Catzilla?


Throw on a game like Battlefield 4 and try to play for 30 minutes; if it can last through Battlefield, chances are it will get through 90% of any stress test.
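The bump-then-test approach being discussed can be sketched as a simple loop. `run_stress_test` here is a hypothetical placeholder for whatever test you trust (a 30-minute BF4 session, a Valley loop, etc.); neither Trixx nor any benchmark is actually driven by this code:

```python
def find_stable_clock(start_mhz: int, step_mhz: int, limit_mhz: int,
                      run_stress_test) -> int:
    """Step the core clock up and return the last setting that passed.
    run_stress_test(clock_mhz) -> bool is a placeholder for the
    stability test you actually run at each step."""
    stable = start_mhz
    clock = start_mhz + step_mhz
    while clock <= limit_mhz:
        if not run_stress_test(clock):
            break  # first failure: stop and keep the last good clock
        stable = clock
        clock += step_mhz
    return stable

# Example with a fake test that "fails" above 1100 MHz:
print(find_stable_clock(1000, 25, 1300, lambda mhz: mhz <= 1100))  # 1100
```

The design choice is conservative on purpose: it stops at the first failure rather than retrying, since an intermittently unstable clock is exactly what a too-short benchmark pass will miss.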


----------



## psyside

Can anyone do a test for me? BF4 with 2x MSAA + 150% resolution scale and low post AA - what FPS do you get as a minimum like this? Is it still smooth? Thanks.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *psyside*
> 
> Can anyone do a test for me? BF4 with 2x MSAA + 150% resolution scale and low post AA - what FPS do you get as a minimum like this? Is it still smooth? Thanks.


You want OC or no OC? My card is currently at 1000, 1320

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> Throw on a game like Battlefield 4 and try to play for 30 minutes; if it can last through Battlefield, chances are it will get through 90% of any stress test.


Thanks! I'll do that.


----------



## psyside

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> You want OC or no OC? My card is currently at 1000, 1320
> Thanks! I'll do that.


Doesn't matter, whatever you like, but please use 60% fan to avoid throttling.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *psyside*
> 
> Can anyone do a test for me? BF4 with 2x MSAA + 150% resolution scale and low post AA - what FPS do you get as a minimum like this? Is it still smooth? Thanks.


I'm liquid cooled 

Be back in about 15 minutes.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *psyside*
> 
> Doesn't matter, whatever you like, but please use 60% fan to avoid throttling.


Okay, so I didn't run a benchmark or record the FPS with anything; I used the built-in FPS counter instead.

So after about 15 minutes at the settings you asked for, at 90 FOV and with my card at 1000/1320, I averaged around 50-55 FPS. I was above 60 FPS when there was little happening on screen, and near 40 during intense firefights and explosions.


----------



## stickg1

On the topic of power consumption. I have a 290 and a 3570K. The 290 is running 1150/1550 and 3570K @ 4.7GHz. For kicks I ran Prime95 and Valley at the same time. I was pulling 470w from the wall as monitored by a Kill-a-Watt. While my CPU consumption is lower than some of you AMD CPU users, and I am using the ASUS BIOS (not PT1 or PT3), I believe my Seasonic X650 is totally up to the task. I fold both CPU and GPU 24/7. That pulls ~370w from the wall. 700w seems like a lot for one 290(x).
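A Kill-a-Watt measures AC draw at the wall, so the load the PSU actually supplies is lower by the efficiency factor. A minimal sketch of that conversion, assuming roughly 90% efficiency for a Gold unit at this load (an assumption, not a measured figure for the Seasonic X650):

```python
def dc_load_w(wall_draw_w: float, efficiency: float = 0.90) -> float:
    """DC power the PSU delivers for a given wall-side reading.
    The default 0.90 efficiency is an assumed value, not measured."""
    return wall_draw_w * efficiency

print(dc_load_w(470))  # ~423 W DC against the X650's 650 W rating
print(dc_load_w(370))  # ~333 W DC for the 24/7 folding load
```

So the worst-case 470 W wall reading corresponds to roughly 65% load on a 650 W unit, which supports the point that 700W is a lot for one 290(X).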


----------



## psyside

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Okay, so I didn't run a benchmark or record the FPS with anything; I used the built-in FPS counter instead.
> 
> So after about 15 minutes at the settings you asked for, at 90 FOV and with my card at 1000/1320, I averaged around 50-55 FPS. I was above 60 FPS when there was little happening on screen, and near 40 during intense firefights and explosions.


So it's playable. Did you use post AA on low, as well as MSAA + resolution scale? Was it smooth? Thanks, rep+


----------



## battleaxe

Quote:


> Originally Posted by *stickg1*
> 
> On the topic of power consumption. I have a 290 and a 3570K. The 290 is running 1150/1550 and 3570K @ 4.7GHz. For kicks I ran Prime95 and Valley at the same time. I was pulling 470w from the wall as monitored by a Kill-a-Watt. While my CPU consumption is lower than some of you AMD CPU users, and I am using the ASUS BIOS (not PT1 or PT3), I believe my Seasonic X650 is totally up to the task. I fold both CPU and GPU 24/7. That pulls ~370w from the wall. 700w seems like a lot for one 290(x).


Yeah, I'm mining with a 290 and a 620watt PSU. No problems. Even if I overclock the GPU, still no problems.


----------



## Redeemer

So we've got the DCU II, Tri-X OC, Windforce... what else is coming this month?


----------



## PillarOfAutumn

Quote:


> Originally Posted by *psyside*
> 
> So it's playable. Did you use post AA on low, as well as MSAA + resolution scale? Was it smooth? Thanks, rep+


These are my settings:

GPU is OCed to 1000/1320
GPU temp:46
VRM1 Temp: 42
VRM 2 Temp: 38
Card is in Uber mode. No throttling is occurring

Everything on Ultra except Post Processing which is Low and AA which is x2. Resolution is 1080p, scaling is at 150% and FOV is 90.

The average was 50-55, but closer to 50, with dips to 40 during intense parts with a lot going on, and peaks at 60 when there's nothing happening. This is on Domination, Siege of Shanghai. It's playable at these settings, but personally I'd tone them down a bit to hold 60+ FPS. Now, if this were Operation Locker Domination, I'd imagine it'd be a lot worse.

And again, my card is OCed to 1000/1320 and I'm not done OCing it yet. Maybe once I max out the OC, it'll be able to better handle 150% resolution scaling?


----------



## Pesmerrga

Quote:


> Originally Posted by *Redeemer*
> 
> So we got the DCU II, Tri-X OC, Windforce..what else is coming this month?


XFX is releasing/has released the Double Dissipation.

http://www.tomshardware.com/news/xfx-r9-290-r9-290x-double-dissipation,25511.html


----------



## psyside

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> These are my settings:
> 
> GPU is OCed to 1000/1320
> GPU temp:46
> VRM1 Temp: 42
> VRM 2 Temp: 38
> Card is in Uber mode. No throttling is occurring
> 
> Everything on Ultra except Post Processing which is Low and AA which is x2. Resolution is 1080p, scaling is at 150% and FOV is 90.
> 
> The average 50-55, but closer to 50, with dips to 40 during intense parts with a lot going on, and peaks at 60 when there's nothing happening. This is on domination, siege of shanghai. Its playable, at these settings, but personally I'd tone the settings down a bit to accommodate 60+ fps. Now, if this was operation locker domination, I'd imagine it'd be a lot worse.
> 
> And again, my card is OCed to 1000/1320 and I'm not done OCing it yet. Maybe after I can max out the OC, it may be able to better handle 150% resolution scaling?


Great post, thanks a lot. If you can, do another run at max OC with the same settings.

How does the game feel at 40 FPS? Is it starting to feel choppy? And do you see a big improvement in IQ going to 150% res scale?


----------



## PillarOfAutumn

Yeah, I'll do another max OC soon. I'm kind of interested myself because I want to see if this card, OCed, can push BF4 at 1440P maxed, or with a few settings on high/med.


----------



## psyside

Wait, so you're saying you play at 1440p?


----------



## Forceman

Quote:


> Originally Posted by *psyside*
> 
> Anyone can do some test for me? BF4 2XMSAA + 150% resolution scale and low post AA? what fps you got as minimum like this? is it still smooth? thanks.


Isn't MSAA and resolution scaling together kind of redundant? I thought the point of resolution scaling was to use it as SSAA, so you wouldn't want/need to run MSAA also.


----------



## psyside

Quote:


> Originally Posted by *Forceman*
> 
> Isn't MSAA and resolution scaling together kind of redundant? I thought the point of resolution scaling was to use it as SSAA, so you wouldn't want/need to run MSAA also.


No, they work differently.


----------



## Forceman

Quote:


> Originally Posted by *psyside*
> 
> No they work different.


I know they work differently, but the purpose of both is to reduce aliasing, which is why it seems strange that you'd want to do both at the same time. Someone already posted a 140% to 4x MSAA comparison, and the aliasing was about the same on both.


----------



## rdr09

Quote:


> Originally Posted by *stickg1*
> 
> On the topic of power consumption. I have a 290 and a 3570K. The 290 is running 1150/1550 and 3570K @ 4.7GHz. For kicks I ran Prime95 and Valley at the same time. I was pulling 470w from the wall as monitored by a Kill-a-Watt. While my CPU consumption is lower than some of you AMD CPU users, and I am using the ASUS BIOS (not PT1 or PT3), I believe my Seasonic X650 is totally up to the task. I fold both CPU and GPU 24/7. That pulls ~370w from the wall. 700w seems like a lot for one 290(x).


For one 290 a good 700W should be plenty. That's what I have, and I bench at 1320/1620 but game at stock. My i7 SB is OC'ed to 4.9GHz during benches, 4.5GHz for games. Also, I use the stock BIOS.

BTW, I don't have much in my system: 3 fans, a pump, and one HDD aside from the CPU/GPU.
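A quick back-of-the-envelope check on these Kill-a-Watt numbers: a wall meter reads AC draw, and the PSU delivers less than that as DC, so the wall reading overstates the load on the PSU's rating. This is just an illustrative sketch; the 90% efficiency figure is an assumption (roughly 80 Plus Gold territory), not a measurement of anyone's unit.

```python
# Rough PSU headroom sketch for a single 290/290X rig.
# A Kill-a-Watt reads AC draw at the wall; the PSU supplies
# less DC power than that, depending on its efficiency.
def dc_load_from_wall(wall_watts, efficiency=0.90):
    # efficiency=0.90 is an assumed figure, not a measured one
    return wall_watts * efficiency

# stickg1's 470W wall reading is only ~423W of actual DC load,
# comfortably inside a 650W PSU's rating.
print(round(dc_load_from_wall(470)))  # 423
```

Which is why a quality 650W unit handling an overclocked 290 plus CPU load is plausible, as reported above.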


----------



## PillarOfAutumn

Quote:


> Originally Posted by *stickg1*
> 
> On the topic of power consumption. I have a 290 and a 3570K. The 290 is running 1150/1550 and 3570K @ 4.7GHz. For kicks I ran Prime95 and Valley at the same time. I was pulling 470w from the wall as monitored by a Kill-a-Watt. While my CPU consumption is lower than some of you AMD CPU users, and I am using the ASUS BIOS (not PT1 or PT3), I believe my Seasonic X650 is totally up to the task. I fold both CPU and GPU 24/7. That pulls ~370w from the wall. 700w seems like a lot for one 290(x).


Is your card overvolted?

Quote:


> Originally Posted by *psyside*
> 
> Wait, so your saying you play at 1440P?


No, I'm at 1080p. 1440p is about 1.7x 1080p, so I want to see if this card can run the game at 170% scaling, because I guess in a way that'd give me BF4 performance at 1440p?


----------



## Sazz

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> Any of you own a sapphire card? Was curious how their warranty policy works and if you guys have had good/bad experiences with them.


If you use an aftermarket cooler/watercooling on it, it "technically" voids the warranty... that is, if you TELL them you used one. Just put the stock cooler back on before sending it in for RMA and voila.

It's basically the same with CPUs. Intel/AMD will void the warranty if the damage is caused by overclocking or by using a cooler other than the one the CPU came with. But are you gonna tell 'em you overclocked it to its death, or that you used a different cooler? Nope... unless you're way too honest. xD


----------



## stickg1

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Is your card overvolted?


Just the +100mV in AB. It runs at around 1.275v after vDroop.


----------



## Zenophobe

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Yeah, I'll do another max OC soon. I'm kind of interested myself because I want to see if this card, OCed, can push BF4 at 1440P maxed, or with a few settings on high/med.


Ultra everything, disable the last 3. 60 to 80 FPS, with highs of 100 and lows of 50.


----------



## ReHWolution

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Quote:
> 
> 
> 
> Originally Posted by *stickg1*
> 
> On the topic of power consumption. I have a 290 and a 3570K. The 290 is running 1150/1550 and 3570K @ 4.7GHz. For kicks I ran Prime95 and Valley at the same time. I was pulling 470w from the wall as monitored by a Kill-a-Watt. While my CPU consumption is lower than some of you AMD CPU users, and I am using the ASUS BIOS (not PT1 or PT3), I believe my Seasonic X650 is totally up to the task. I fold both CPU and GPU 24/7. That pulls ~370w from the wall. 700w seems like a lot for one 290(x).
> 
> 
> 
> Is your card overvolted?
> 
> Quote:
> 
> 
> 
> Originally Posted by *psyside*
> 
> Wait, so your saying you play at 1440P?
> 
> Click to expand...
> 
> No, I'm at 1080p. 1440p is about 1.7x1080p, so I want to see if this card can run the game at 170% scaling cause I guess in a way that'd give me a BF4 performance at 1440P?
Click to expand...

No, lol. 1440p is 33% higher than 1080p, so 135% should be about the same as 1440p. I tried 150% scaling on a 1440p monitor, resulting in a 4K render resolution, and with a single 290X I was at about 30 FPS, which is in line with other results.


----------



## Redeemer

Quote:


> Originally Posted by *rdr09*
> 
> for one 290 a good 700W should be plenty. that's what i have and i bench at 1320/1620 but i game at stock. my i7 SB is oc'ed to 4.9GHz during benches. 4.5 for games. also, i use the stock bios.
> 
> btw, i don't have much in my system. 3 fans, a pump, and one hdd aside from the cpu/gpu.


Which 290 you got?


----------



## Hogesyx

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> Throw on a game like Battlefield 4 and try to play for 30minutes, if it can last through battlefield chances are it will get through 90% of any stress test


Currently cgminer is my best overnight stress-testing tool; it can break even on your electricity cost too, lol.


----------



## rx7racer

So I finally got a little annoyed at the stock blower fan noise. BF4 is the one that pushed me into craziness, with its fairly high fan speeds at all my targets.

Anyway, I had 2 Zalman VF3000F heatsinks from my GTX 465s. So, me being me, I said why not; fellow member Enzarch made a great post about using the stock cooler's plate for the VRMs and memory. I figured if the Zalman handled Fermi heat it could handle Hawaii. Well, kinda right and kinda wrong.

The max I can really go is 1075 core/1350 mem (can probably go higher on mem); beyond that the cooler just can't handle the heat, it slowly builds up until it hits 95C and throttles. But I have to say, while gaming some BF4 ultra at only 1080p it only hits about 65C. The VRM runs warm, hitting around 70C on long sessions. I tossed on some spare heatsinks from an old AC TTP kit from my 4890 over the vmem spots to try and help. Couldn't squeeze any over the VRM area though.



Installed and running.


I left the stock heatsink in the oven a minute too long and it sure puffed right up to let me know.


Just thought I'd share that for noise suppression there are actually some versatile options for cheap if you've got stuff lying around.

All in all I spent $0 and have a 290X that performs at max with no throttling, and temps are 83C under extreme sessions or mid-60s C for gaming sessions. I figure with some better fans I have lying around it can maybe handle a bit more. Not to mention this 800D is not a great air-cooling case, haha.

So there it is, my 5-hour midnight brainstorm from last night. Not amazing results per se, but at least they're results. I only lost 25MHz on the core compared to what I could run with the stock cooler, but my wife's ears are feeling much better (that means mine are too).

Edit: GPUz Linky


----------



## psyside

Quote:


> Originally Posted by *Forceman*
> 
> I know they work differently, but the purpose of both is to reduce aliasing, which is why it seems strange that you'd want to do both at the same time. Someone already posted a 140% to 4x MSAA comparison, and the aliasing was about the same on both.


I'm an IQ addict, I want the best.

I saw some screenshots; even at 200% RS there are jaggies without MSAA. With 4xMSAA *it was perfect.* So generally the best balance would be 130-150% RS + 2xMSAA, while maintaining playable FPS.


----------



## PillarOfAutumn

I'm overclocking my GPU right now, and with every step I run 3DMark. I've reached 1070/1400 without touching the voltage. How do I know when something is unstable? I'm thinking I may have maxed out the memory for now (haven't run into a problem, though), so I'm just going to push the core clock until something happens, then up the voltage by 15mV and continue OCing. Is this okay?


----------



## Forceman

Quote:


> Originally Posted by *ReHWolution*
> 
> No, lol. 1440p is 33% higher than 1080p, so 135% should be the same for 1440p. I tried doing 150% scaling on a 1440p, resulting in a 4K resolution, and with a single 290X i was @ about 30FPS, that is in line with other results..


Is that the way BF4 calculates it though? Because 1440p has 1.77x as many pixels as 1080p. Not sure what their slider is actually based on.
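The two readings of the slider can be reconciled with a little arithmetic. Assuming the slider multiplies each axis (which is how supersampling-style scaling usually works, though the thread hasn't confirmed it for BF4), the per-axis scale that matches 1440p's pixel load is the square root of the pixel ratio, which lands near the 135% figure mentioned above:

```python
# Sketch: compare 1080p vs 1440p pixel counts, then find the per-axis
# scale factor that matches 1440p's pixel load. Assumes the BF4 slider
# scales each axis, which is an assumption, not confirmed here.
def pixels(w, h):
    return w * h

ratio = pixels(2560, 1440) / pixels(1920, 1080)
print(round(ratio, 2))  # 1.78 -- 1440p has ~1.78x the pixels of 1080p

# If the slider multiplies each axis, pixel count grows with the square
# of the scale, so the 1440p-equivalent setting is sqrt(1.78):
scale = ratio ** 0.5
print(round(scale * 100))  # 133 -- i.e. ~133% per-axis scale
```

So "1.77x the pixels" and "about 135% on the slider" would both be right if the slider is per-axis: 133% squared is 1.78x the pixels.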


----------



## chiknnwatrmln

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> I'm overclocking my GPU right now and with every step, I run 3DMark. So I've reached 1070/1400 without touching the voltage. How do I know something is unstable? I'm thinking that I may have maxed out memory for now (haven't ran into a problem, though), and so I'm just going to push the core clock until something happens and then up the voltage by 15mv, and then continue OCing. Is this okay?


You'll either black screen, have the driver crash, or see artifacting.

Memory OCs don't help much on these cards, so I would stick with 1400MHz or so.

If you bump up the voltage you should be able to hit 1100MHz core easily.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> You'll either black screen, have the driver crash, or see artifacting.
> 
> Memory OC's don't help too much on these cards so I would stick with 1400MHz or so.
> 
> If you bump up the voltage you should be able to hit 1100MHz core easily.


Without touching the voltage, I'm already at 1100/1400. Should I keep pushing the card until I get a driver crash and then up the voltage, and then continue pushing core/mem?


----------



## Derpinheimer

Quote:


> Originally Posted by *velocityx*
> 
> put an asus bios on that elpida and it will overclock like hynix. at least that's what gibbo over at overclockers.co.uk tested


I don't know what you mean by that. The BIOS shouldn't have any effect whatsoever on memory OCing.

Only voltage. And even so, voltage helps Hynix just as much as it helps Elpida.
Quote:


> Originally Posted by *psyside*
> 
> I'm IQ addict i want the best
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I saw some screenshots, even @200% RS, there are jaggies without MSAA. With 4XMSAA *it was perfect.* So generally the best balance would be 130-150% RS + 2XMSAA, while maintaining playable fps


Shouldn't only 0.5x/1x/2x be the useful settings? How would something like 130% help? Isn't that like doing 1080p on a 1440p monitor? There's no way to properly scale it.


----------



## Jack Mac

Quote:


> Originally Posted by *rx7racer*
> 
> I left the stock heatsink in the oven a minute too long and it sure puffed right up to let me know.


lol, why'd you put it in the oven?


----------



## rx7racer

Quote:


> Originally Posted by *Jack Mac*
> 
> lol, why'd you put it in the oven?


Quick and dirty way to separate the heatsink from the base plate on the 290X's cooler setup. Then use the base plate as vrm and vmem cooling surface.


----------



## psyside

Quote:


> Originally Posted by *Derpinheimer*
> 
> I dont know what you mean by that? The bios shouldnt have any effect whatsover on memory OCing.
> 
> Only voltage. And even so, voltage helps hynix just as much as it helps elpida.
> Shouldnt only 0.5x/1x/2x be the useful settings? How would something like 130% help? Isnt that like doing 1080p on a 1440p monitor? Theres no way to properly scale it.


Yes, 2xMSAA + 130-150% res scale should be the best overall IQ-to-performance ratio. Everything helps regarding resolution scale; I think 125% is like 1xSSAA, 150% like 2xSSAA and so on, kinda like downsampling AFAIK.


----------



## rdr09

Quote:


> Originally Posted by *Redeemer*
> 
> Which 290 you got?


MSI.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Without touching the voltage, I'm already at 1100/1400. Should I keep pushing the card until I get a driver crash and then up the voltage, and then continue pushing core/mem?


Go for it.

If you see artifacts (odd-shaped, oddly colored triangles originating from a single point that looks very out of place, checkerboard-pattern squares, or any other graphical anomaly), it means one of two things:
-You need more voltage for that clock. Add 10 or so mV until they don't appear.
-Your chip has hit its max OC. There is a point where more voltage will not help.

Of course, you should only increase the voltage if you have the thermal headroom. I'd try to keep the core under 80C, and the VRMs under 80C also; too high a temp on either can affect your OC's stability. You might need to make a custom fan profile via MSI AB.

Also, FYI: VRM1 provides the core's power while VRM2 provides the memory's power. A high core OC will heat up VRM1 but not VRM2, and vice versa.
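The step-then-bump-voltage procedure being described can be sketched as a simple loop: raise the clock in small steps while the stress test passes, add a little voltage when it fails, and stop when you're out of voltage headroom. Everything here is illustrative; `passes_stress_test` is a stand-in for manually running 3DMark/Valley and watching for artifacts or crashes, and the step sizes and +100mV limit are just example numbers, not tuning advice.

```python
# Illustrative sketch of the incremental OC procedure described above.
# 'passes_stress_test(clock_mhz, mv_offset)' stands in for a manual
# benchmark run; step sizes and the mv limit are example values only.
def find_max_core(passes_stress_test, start_mhz=1000, step=10,
                  mv=0, mv_step=10, mv_limit=100):
    clock = start_mhz
    while True:
        if passes_stress_test(clock, mv):
            clock += step            # stable: try a slightly higher clock
        elif mv + mv_step <= mv_limit:
            mv += mv_step            # unstable: add a little voltage first
        else:
            return clock - step, mv  # out of voltage headroom: back off

# Toy stand-in for a chip whose stability scales with voltage:
result = find_max_core(lambda clk, mv: clk <= 1100 + mv)
print(result)  # (1200, 100)
```

The point of the sketch is the ordering: clock first, voltage only on failure, and a hard cap on how far you'll push voltage (which in practice should also depend on your temps, as noted above).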


----------



## Ultracarpet

YEEEE BOYEEEEEEE... My mother FedExed a Christmas present to me and it arrived a couple days ago... a surprise 290X.

I was like... when I opened it, lol.



It's a powercolor... and I was too excited to wait to get a block for it so i'm just on stock cooling atm.

Soooo jacked omg.


----------



## Mr357

Quote:


> Originally Posted by *Ultracarpet*
> 
> YEEEE BOYEEEEEEE... My mother fedexed a Christmas present to me and it arrived a couple days ago... a surprise 290x
> 
> 
> 
> 
> 
> 
> 
> I was like
> 
> 
> 
> 
> 
> 
> 
> when I opened it lol
> 
> 
> 
> It's a powercolor... and I was too excited to wait to get a block for it so i'm just on stock cooling atm.
> 
> Soooo jacked omg.


Your mother is awesome!


----------



## Arizonian

Quote:


> Originally Posted by *Ultracarpet*
> 
> YEEEE BOYEEEEEEE... My mother fedexed a Christmas present to me and it arrived a couple days ago... a surprise 290x
> 
> 
> 
> 
> 
> 
> 
> I was like
> 
> 
> 
> 
> 
> 
> 
> when I opened it lol
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> It's a powercolor... and I was too excited to wait to get a block for it so i'm just on stock cooling atm.
> 
> Soooo jacked omg.


Sweet gift indeed. You're welcome to join us with a submission post.
1. GPU-Z Link or Screen shot with OCN Name
2. Manufacturer & Brand
3. Cooling - Stock, Aftermarket or 3rd Party Water


----------



## kizwan

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ReHWolution*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PillarOfAutumn*
> 
> No, I'm at 1080p. 1440p is about 1.7x1080p, so I want to see if this card can run the game at 170% scaling cause I guess in a way that'd give me a BF4 performance at 1440P?
> 
> 
> 
> No, lol. 1440p is 33% higher than 1080p, so 135% should be the same for 1440p. I tried doing 150% scaling on a 1440p, resulting in a 4K resolution, and with a single 290X i was @ about 30FPS, that is in line with other results..
> 
> Click to expand...
> 
> Is that the way BF4 calculates it though? Because 1440p is 1.77x as many pixels as 1080p. Not sure what their slider is actually based on.
Click to expand...

I'm playing at 1080p + 200% resolution scale. I wonder whether this means the game is rendering at 3840x2160 (2160p) or at 2160x1920 (2 times as many pixels as 1080p = 4,147,200 pixels). I always thought it was the former: rendering the game at a higher resolution and downsampling to 1080p.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Go for it.
> 
> If you see artifacts (can be odd-shaped and oddly colored triangles originating from a single point that looks very out of place, or checkerboard pattern squares, or any other graphical anomaly) then that means one of two things:
> -You need more voltage for that clock. Add 10 or so mv until they don't appear.
> -Your chip has hit its max OC. There is a point where more voltage will not help.
> 
> Of course you should only increase your voltage if you have the thermal headroom. I'd say try and keep the core under 80c, and VRM's under 80c also. Too high temp on either can affect your OC's stability. You might need to make a custom fan profile via MSI AB.
> 
> Also, FYI, VRM 1 provides the core's power while VRM 2 provides the memory's power. A high core OC will heat up VRM 1 but not VRM 2 and vice versa.


Thank you! I'm actually watercooling, and during BF4 sessions, my max temps are:

GPU: 46 C
VRM1: 43
VRM2: 38


----------



## psyside

Quote:


> Originally Posted by *kizwan*
> 
> I'm playing @1080p + 200% resolution scale.


CF 290's?


----------



## PillarOfAutumn

Quote:


> Originally Posted by *kizwan*
> 
> I'm playing @1080p + 200% resolution scale. I wonder whether this means I'm playing with the games is rendering @3840x2160 (2160p) or @2160x1920 (2 times as many pixels as 1080p = 4147200 pixels). I always thought it's the former; rendering the games at higher resolution & downsampling to 1080p.


Hmm... I think with 200% you're actually playing at the 4K resolution. So it looks like scaling is another form of supersampling, which means you're taking an oversized image and shrinking it to fit your resolution. Essentially, you take each axis and multiply it by the percentage you're changing it by. So if you're playing at 200%, the resolution you're playing at is (1920*2) by (1080*2), which is essentially 4K, or four 1080p monitors.

Likewise, to simulate 1440p resolution on a 1080p screen, you'd have to multiply both the x and y by 1.33.

At my clocks, I'm getting 55 FPS average at 170% scaling, so now I know that this card can definitely handle 1440p or even 1600p..
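If the slider really does work per-axis as described above, the render resolution is easy to compute. A small sketch under that assumption (the per-axis behavior itself is the assumption; the thread hasn't confirmed how BF4's slider is defined):

```python
# Sketch: map a resolution-scale percentage to a render resolution,
# assuming the slider multiplies each axis (as described above).
def render_resolution(width, height, scale_percent):
    f = scale_percent / 100
    return int(width * f), int(height * f)

print(render_resolution(1920, 1080, 200))  # (3840, 2160) -- 4K / 2160p
print(render_resolution(1920, 1080, 133))  # (2553, 1436) -- ~1440p's pixel load
```

Under this reading, 200% at 1080p is a true 4K render (4x the pixels), while matching 1440p's pixel count takes about 133%, not 170%.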


----------



## pounced

Just a quick question: I'm debating selling this reference Sapphire 290X I bought on day one on Craigslist and waiting for the non-reference coolers to come out and forking out some extra cash, or possibly waiting for non-reference 290s and then Crossfiring those. What should I try to sell this card for? I have successfully run it stable at 1175 core and 1450 mem through Valley and FurMark with a +100 VDDC offset in TriXX.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *pounced*
> 
> Just a quick question, I'm debating on selling this Ref Sapphire 290x I bought first day on craigslist and just waiting for the Non Ref coolers to come out and fork out some extra cash or possibly wait for non ref 290s and then cross fire those. What should I try and sell this card for? I have successfully ran it stable at 1175 Core and 1450 mem running valley and Furmark on a +100 vddc offset in Trixx.


Honestly, the only reason I personally bought the reference card is that I was planning on watercooling my rig. And the only time I've recommended the reference card to others is if they're watercooling; otherwise, wait for a non-ref or get an nVidia card. Because honestly, loud noise aside, the heat really throttles the thing, so you don't get its full potential. If you don't plan on watercooling, then certainly sell it. You may be able to break even because of the price gouging that's going on right now.


----------



## Ultracarpet

Quote:


> Originally Posted by *Mr357*
> 
> Your mother is awesome!


I know right?!
Quote:


> Originally Posted by *Arizonian*
> 
> Sweet gift indeed. Your welcome to join us with a submission post.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1. GPU-Z Link or Screen shot with OCN Name
> 2. Manufacturer & Brand
> 3. Cooling - Stock, Aftermarket or 3rd Party Water


Ok, I'll try this again... that first picture was supposed to be a submission post, but my lack of sleep and getting wayyy too excited caused me to skip over the notepad portion of the SS, lol.









Maybe this time...


----------



## kizwan

Quote:


> Originally Posted by *psyside*
> 
> CF 290's?


Yes. 1080p, High settings, 200% scale, stock clock: 80s - 100s FPS (outdoor).
Quote:


> Originally Posted by *PillarOfAutumn*
> 
> hmm...I think with 200%, you're actually playing at the 4k resolution. So it looks like scaling is another form of super sampling. Which means that you're taking an oversized image and shirnking it to fit your resolution. Essentially, you're take each axis and multiplying it by the percent your changing it by. So if you're playing at 200%, that means the resolution you're playing at is (1920*2) by (1080*2), which is essentially 4k res or 4 1080p monitors.
> 
> Like wise, to simulate 1440p resolution on a 1080p screen, you'd have to multiply both the x and y by 1.33.
> 
> At my clocks, I'm getting 55 FPS average at 170% scaling, so now I know that this card can definitely handle 1440p or even 1600p..


This is exactly how I understand it.
Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pounced*
> 
> Just a quick question, I'm debating on selling this Ref Sapphire 290x I bought first day on craigslist and just waiting for the Non Ref coolers to come out and fork out some extra cash or possibly wait for non ref 290s and then cross fire those. What should I try and sell this card for? I have successfully ran it stable at 1175 Core and 1450 mem running valley and Furmark on a +100 vddc offset in Trixx.
> 
> 
> 
> Honestly, the only reason why I personally bought the reference card is because I was planning on watercooling my rig. And the only time I've ever recommended to others to buy the reference card is only if they're watercooling, otherwise wait for a non-ref or get an nVidia card. Because honestly, the loud noise aside, the heat really throttles the thing so you don't get its full potential. If you don't plan on water cooling, then certainly sell it. *You may be able to break even because of the price gouging that's going on right now.*
Click to expand...

You may get to sell it at the original price. I don't think you'll have any problem finding a buyer. Miners turn 18 every day.


----------



## Arizonian

Quote:


> Originally Posted by *Ultracarpet*
> 
> I know right?!
> Ok, I'll try this again... that first picture was supposed to be a submission post, but my lack of sleep and getting wayyy too excited caused me to skip over the notepad portion of the ss. lol
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Maybe this time...
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## flamin9_t00l

Hi guys, I'm now under water with 2 290s (Sapphire and HIS). Please update me, cheers.

Proof


Rig pic


Running these things in Crossfire on the stock coolers with only 1 slot of spacing is really not a long-term option, as the noise was unbearable, although I liked keeping the temps below 80C instead of the ridiculous 95C target.

Anyone thinking of watercooling the reference 290(X): don't hesitate, you'll see 50C knocked off the load temp with decent-sized rad(s).

All the best for the new year


----------



## velocityx

Quote:


> Originally Posted by *Derpinheimer*
> 
> I dont know what you mean by that? The bios shouldnt have any effect whatsover on memory OCing.
> 
> Only voltage. And even so, voltage helps hynix just as much as it helps elpida.


That's actually what I meant. I didn't try it myself, but the Asus BIOS allows voltage tweaking in GPU Tweak? Like I said, I didn't try it myself; it's what Gibbo says here: http://forums.overclockers.co.uk/showpost.php?p=25174104&postcount=290


----------



## dspacek

Quote:


> Originally Posted by *velocityx*
> 
> that's actually what I meant. I didn't try myself but asus bios allows voltage tweaking in gpu tweak ? like I said, I didn't try myself, it's gibbo says here http://forums.overclockers.co.uk/showpost.php?p=25174104&postcount=290


So does the ASUS BIOS work that way on the R9 290? I'll have my Sapphire under water in a week, so what is the best BIOS for OCing a 290 to reach the best results?


----------



## dspacek

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> So I'm OCing using Sapphire-TRIXX. After each conservative bump, I run 3D Mark 11 to check for stability. So far, I've gone up to 990 core, and 1310 memory, no voltage increase. I feel like 3DMark doesn't last long enough to truly test stability. What should I be using? Unigine? Catzilla?


Yes, use a Unigine benchmark (Valley is now the most popular) or FurMark to push your card to its max temps. But be careful on air cooling.


----------



## wand3r3r

I've read a number of the first posts a while back but haven't followed this now massive thread. I have a couple questions.

1. Do the XFX 290x reference cards have a known memory type?

2. What is the power consumption at the wall for two 290x's with say a 4770k or equivalent?

3. For those that have two reference cards: what's your opinion on the noise and how do you handle it?
I'd be interested in custom cards; one would be fine, but I'm afraid two would dump too much heat inside my case.

4. Does MSI afterburner support overvolting now on all 290/x's? (at launch it didn't but was "coming")


----------



## Sazz

Quote:


> Originally Posted by *wand3r3r*
> 
> I've read a number of the first posts a while back but haven't followed this now massive thread. I have a couple questions.
> 
> 1. Do the XFX 290x reference cards have a known memory type?
> 
> 2. What is the power consumption at the wall for two 290x's with say a 4770k or equivalent?
> 
> 3. Then for those that have two reference cards just the opinion on the noise and how you handle it.
> I'd be interested in custom cards, putting one would be fine but I'm afraid two would be too much heat inside my case.
> 
> 4. Does MSI afterburner support overvolting now on all 290/x's? (at launch it didn't but was "coming")


It's mixed Elpida and Hynix; so far the only confirmed all-Hynix card is Asus's DCUII version.

Don't have any answer for the 2nd question as I have no experience with 4770k

I use my own fan curve maxing at 72%, with power limit +10% to prevent it from throttling. At 72% max fan speed (40% fan speed at 40C up to 72% fan speed at 85C), the noise from my perspective is irrelevant, especially when gaming, since I'm focused on the game and not the noise. I guess I'm just good at filtering noise out from what's important to hear at the time; even in a noisy environment I can separate noise from important audio, so it doesn't affect me much. This question would be answered differently by different people, as it's a matter of preference.

And yes, MSI Afterburner now supports overvolting of +100mV on every card, while Sapphire TriXX supports up to +200mV. AB also has an extra option for overvolting the "AUX" voltage, which I'm not sure (and I think no one is actually 100% certain) what it adds voltage to; some say it's probably the PLL voltage. I wouldn't go over +50mV on it until I see confirmation of what exactly it affects, although increasing it does make it easier for me to get a stable core clock.
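A ramp like the one described (40% fan at 40C up to 72% at 85C) is just a linear interpolation between two points, clamped outside the range. This sketch is purely illustrative; real curves are set in Afterburner/TriXX, not code:

```python
# Sketch of a linear fan curve like the one described above:
# 40% fan speed at 40C ramping to 72% at 85C, clamped outside
# that range. Illustrative only; the endpoints are one user's
# preference, not a recommendation.
def fan_speed(temp_c, t_lo=40, t_hi=85, s_lo=40, s_hi=72):
    if temp_c <= t_lo:
        return s_lo
    if temp_c >= t_hi:
        return s_hi
    # linear interpolation between the two curve points
    frac = (temp_c - t_lo) / (t_hi - t_lo)
    return round(s_lo + frac * (s_hi - s_lo))

print(fan_speed(40))    # 40 -- lower clamp
print(fan_speed(85))    # 72 -- upper clamp
print(fan_speed(62.5))  # 56 -- midpoint of the ramp
```

Clamping at the top is what caps the noise: past 85C the fan holds at 72% rather than spinning up further, trading some thermal margin for quiet.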


----------



## MeneerVent

I'm actually waiting for my Kraken G10 to arrive so I can pair it with my Kraken X40 to tame this card. But I couldn't resist moving a few sliders in MSI AB. I got to 1100MHz core clock at stock voltage; is that considered good? I use BF3 to see if it's stable, is that a good choice? When setting it to 1150 I get a "crash" and artifacts. By crash I mean MSI AB not responding at all, my mouse disappearing, and BF3 glitching. Also, has the safe voltage for this card been established yet?


----------



## Jack Mac

Quote:


> Originally Posted by *MeneerVent*
> 
> I actually am waiting for my Kraken G10 to arrive so I can pair it with my Kraken X40 to tame this card. But I couldn't resist moving a few sliders in MSI AB. I got to 1100Mhz core clock with stock voltage, is that considered good? I use BF3 to see if it is stable, is this a good choice? When setting it to 1150 I get a "crash" and artifacts. Crash refers to MSI AB not responding at all, my mouse disappearing and BF3 glitching. And is the safe voltage of this card already found out?


Yeah, you can safely max out the voltage in AB if your temperatures are kept in check, but personally, I do +60mV for my 24/7 clocks.


----------



## dspacek

Hi again, here are my results for BF4 at 1080p, everything Ultra, 2xMSAA and 130% resolution scale + Vsync ON.
GPU settings: 1100MHz core and 1420MHz RAM, with +25% power limit and +31mV VDDC.

bf42014-01-0514-27-54-72fps.csv 6k .csv file


bf42014-01-0514-27-54-72minmaxavg.csv 0k .csv file


----------



## Timx2

Hey there,

I am ready to press the order button on my 290(X), but I am not sure about two things. The 290 and 290X are very close to each other, and some say the 290X is not worth the extra money. Does this still hold when I add it to my water loop? I can only find reviews comparing a reference 290 vs a reference 290X. In other words, does the 290X have more OC potential when placed in a water loop?

Secondly, is there a difference between the Sapphire and Asus BIOS? Are they both voltage unlocked, or do I need a modified BIOS?

Thanks in advance.


----------



## MeneerVent

Quote:


> Originally Posted by *Jack Mac*
> 
> Yeah, you can safely max out the voltage in AB if your temperatures are kept in check, but personally, I do +60mV for my 24/7 clocks.


What about the power limit?


----------



## Jack Mac

Quote:


> Originally Posted by *MeneerVent*
> 
> What about the power limit?


I'm sure the power limit can be safely maxed out; it's like the power limit on Nvidia. I do +45 with my 24/7 OC.


----------



## dspacek

Quote:


> Originally Posted by *Timx2*
> 
> Hey there,
> 
> I am ready to press the order button on my 290(X) but I am not sure about two things. The 290 and 290x are very close to each other and some may say that the 290X is not worth the extra money. Does this also count when I add it into my waterloop? I only find reviews where they compare a reference 290 vs a reference 290x. In other words, has the 290x more oc-potentional when placed in a waterloop?
> 
> Secondly, is there a difference between the sapphire and asus bios? Are they both voltage unlocked or do I need a modified bios?
> 
> Thanks in advance.


I read somewhere, probably on this site, that one guy had a stock 290 that could be overclocked much further than a stock 290X, so it doesn't matter much which version you have if you want to overclock it, and that 4% increase at default is nothing for that money. So it's my opinion ;-)
Here is my SS from Valley with a lightly OCed Sapphire 290 (1100/1420 MHz)


----------



## Timx2

Quote:


> Originally Posted by *dspacek*
> 
> I read somewhere, probably on this site, that one guy had a stock 290 that could be overclocked much further than a stock 290X, so it doesn't matter much which version you have if you want to overclock it, and that 4% increase at default is nothing for that money. So it's my opinion ;-)
> Here is my SS from Valley with a lightly OCed Sapphire 290 (1100/1420 MHz)


Thanks, did you flash another BIOS on that Sapphire? And is the Sapphire BIOS voltage unlocked? And is that score stable in gaming etc.? One completed run of Valley doesn't say much to me. I can get a 2400 score on my 770 but that doesn't make it stable at all. Just curious. I ordered my waterblock but haven't made a choice yet about the card.


----------



## Durvelle27

Quote:


> Originally Posted by *dspacek*
> 
> I read somewhere, probably on this site, that one guy had a stock 290 that could be overclocked much further than a stock 290X, so it doesn't matter much which version you have if you want to overclock it, and that 4% increase at default is nothing for that money. So it's my opinion ;-)
> Here is my SS from Valley with a lightly OCed Sapphire 290 (1100/1420 MHz)


I hate myself


----------



## PillarOfAutumn

So what I've been doing is raising my core by 10 and my memory by 25. Between each increase I run 3DMark, and after every 50 MHz on the core I run Valley for a while or play BF4 for half an hour. Right now I'm at 1200/1500 without touching the voltage. I'm just wondering if this is normal or if this card is "better than average"? What I plan on doing is finding the unstable clocks for core and memory, dropping them down by 25 and 50 respectively, and pushing the voltage offset by 10 mV. Is this an okay plan? And do you guys leave your cards overclocked/overvolted all the time, or do you only overclock when gaming?
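The stepping routine described above can be written out as a small loop; a hypothetical sketch, where `is_stable()` stands in for a real benchmark pass (3DMark, Valley, or half an hour of BF4):

```python
# Hypothetical sketch of the stepping plan above: raise the core by
# 10 MHz and the memory by 25 MHz per pass, stop at the first unstable
# combination, and keep the last good pair. is_stable() is a stand-in
# for an actual benchmark/stress run.

def find_max_clocks(core, mem, is_stable, core_step=10, mem_step=25):
    """Step both clocks up until is_stable() fails; return last good pair."""
    while is_stable(core + core_step, mem + mem_step):
        core += core_step
        mem += mem_step
    return core, mem

# Toy stand-in: pretend the card artifacts above 1150 core / 1500 mem.
stable = lambda c, m: c <= 1150 and m <= 1500
print(find_max_clocks(1000, 1250, stable))  # -> (1100, 1500)
```

Note that in this toy run the memory ceiling is hit first and drags the core result down, which is one reason to tune core and memory separately rather than in lockstep.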


----------



## ImJJames

Something weird is starting to happen. After a while of computer use, Chrome starts to flicker any time I have it as an active window. When I move the mouse it just flickers, and if I keep doing it, it black screens, which forces me to hard reset my computer.







I haven't changed anything, this is just so sudden. Never happens in games...


----------



## Redeemer

http://www.legitreviews.com/asus-r9-290x-directcu-ii-sapphire-r9-290x-tri-x-video-card-reviews_132158


----------



## pbsn

any dual/tri/quad 290x owner?

I had an issue a while ago and I hope someone can test this if possible...

when you enable crossfire on the 290X, there seems to be a graphical lag in Assassin's Creed 4 and Metro: Last Light where the screen looks as if the FPS is somewhere between 3 and 5, but the real FPS is normal and unaffected.

It happens randomly, about once every 2-3 minutes, so you have to watch the game carefully to notice it.

Apparently, it only occurs on Asus wifi motherboards: when the wifi is ON it causes this issue, and when it's OFF the issue is gone.

If anyone has an Asus wifi-enabled mobo such as the Asus Rampage IV Black Edition, kindly enable your wifi and try playing Metro: Last Light or Assassin's Creed IV to see if this same issue also occurs on your PC.

thank you!


----------



## tsm106

Quote:


> Originally Posted by *Durvelle27*
> 
> Quote:
> 
> 
> 
> Originally Posted by *dspacek*
> 
> I read somewhere, probably on this site, that one guy had a stock 290 that could be overclocked much further than a stock 290X, so it doesn't matter much which version you have if you want to overclock it, and that 4% increase at default is nothing for that money. So it's my opinion ;-)
> Here is my SS from Valley with a lightly OCed Sapphire 290 (1100/1420 MHz)
> 
> 
> 
> 
> I hate myself

Man, newegg is seriously taking advantage of the mining situation. They've got the custom cards priced at 700, haha. Freaking newegg, I'm never shocked by their shenanigans.


----------



## Slomo4shO

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> So far, I've gone up to 990 core, and 1310 memory, no voltage increase.


You should be leaving the memory alone until you find the maximum core frequency and then working on the memory clocks.
Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> Spoiler: Warning: Spoiler!


Temperature: 59682 °C









Gotta love Valley


----------



## Durvelle27

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Man, newegg is seriously taking advantage of the mining situation. They've got the custom cards priced at 700, haha. Freaking newegg, I'm never shocked by their shenanigans.


And you're rubbing it in


----------



## tijgert

I'm looking to start some experiments to see which Asus BIOS will work best on my Sapphire 290X. You know, to get the voltage unlock and stuff.
Techpowerup shows several BIOSes, one newer than the other -> http://www.techpowerup.com/vgabios/index.php?architecture=ATI&manufacturer=Asus&model=R9+290X&interface=&memType=&memSize=

Is the latest (highest number is 015.039.000.007.003525) automatically the best? Or is there a reason to prefer another one?

Also: I wanted to try the modded BIOSes at first, but the so-called PT1 and PT3 BIOSes are 64K while the regular BIOSes are 128K... I don't understand why, and it makes me reluctant to try.


----------



## psyside

The Tri-X OC is amazing











*1.4V 1220/1600* look at those temps!!!


----------



## Widde

New PSU on the way







Think mine is starting to fail, showing 11.34V on the 12v rail







Measured with a multimeter. Ordered a Corsair RM 1000w gold and soon another 290







Hopefully I can cram it all inside my antec threehundred







Else the powertools comes out


----------



## shilka

Quote:


> Originally Posted by *Widde*
> 
> New PSU on the way
> 
> 
> 
> 
> 
> 
> 
> Think mine is starting to fail, showing 11.34V on the 12v rail
> 
> 
> 
> 
> 
> 
> 
> Measured with a multimeter. Ordered a Corsair RM 1000w gold and soon another 290
> 
> 
> 
> 
> 
> 
> 
> Hopefully I can cram it all inside my antec threehundred
> 
> 
> 
> 
> 
> 
> 
> Else the powertools comes out


Cancel your order

http://www.overclock.net/t/1455892/why-you-should-not-buy-a-corsair-rm-psu

$20 less and it's miles better than the RM
http://www.newegg.com/Product/Product.aspx?Item=N82E16817438010


----------



## psyside

Quote:


> Originally Posted by *shilka*
> 
> Cancel your order
> 
> http://www.overclock.net/t/1455892/why-you-should-not-buy-a-corsair-rm-psu
> 
> 20$ less and its miles better then the RM
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817438010


----------



## tsm106

Quote:


> Originally Posted by *Durvelle27*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Man, newegg is seriously taking advantage of the mining situation. They've got the custom cards priced at 700, haha. Freaking newegg, I'm never shocked by their shenanigans.
> 
> 
> 
> 
> 
> 
> And you're rubbing it in

No. Unigine uses the CPU a lot more than people think. You're at similar clocks yet there is a 10 fps gap... and that was with my son's 4.6 GHz quad core.


----------



## Widde

Quote:


> Originally Posted by *shilka*
> 
> Cancel your order
> 
> http://www.overclock.net/t/1455892/why-you-should-not-buy-a-corsair-rm-psu
> 
> 20$ less and its miles better then the RM
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817438010


Where I ordered from they only have a Fractal Design Tesla R2 1000W 80+ Gold and it's about 8 bucks cheaper







The Cooler Master V1000 is around $30-40 more expensive







Thought the RM series were supposed to be decent, silly Corsair









Should add that I don't have to pay for the PSU; I'm just happy I get to replace the old one







But the new 290 i have to pay for ^^


----------



## Durvelle27

Quote:


> Originally Posted by *tsm106*
> 
> No. Unigine uses the CPU a lot more than people think. You're at similar clocks yet there is a 10 fps gap... and that was with my son's 4.6 GHz quad core.


You know I'm running the 780, not a 290. The 290 screenie was from a guy I quoted.


----------



## shilka

Quote:


> Originally Posted by *Widde*
> 
> Where i ordered from they only have a Fractal Design Tesla R2 1000W 80+ Gold and it's about 8 bucks cheaper
> 
> 
> 
> 
> 
> 
> 
> The Cooler master V1000 is around 30-40$ more expensive
> 
> 
> 
> 
> 
> 
> 
> Thought the RM series where supposed to be decent, silly corsair


The V1000, despite its higher price, is a much better unit by far

At least the OEM that made it has not forgotten to add parts like CWT did with the RM

Who the hell forgets to add a grounding bridge?


----------



## Mr357

Quote:


> Originally Posted by *shilka*
> 
> The V1000 despite its higher price is a much better unit by far
> 
> At least the OEM that made it has not forgotten to add parts like CWT did with the RM
> 
> Who the hell forgets to add a grounding bridge?


The V1000 is what I got about two weeks ago. Is there anything wrong with the fact that the PCI-E connectors on the PSU end are 6-pin instead of 8? When re-connecting my 290X I used two separate cables instead of two connectors off of one cable, since I figured it would make a difference. Does it actually?


----------



## Slomo4shO

Quote:


> Originally Posted by *Widde*
> 
> New PSU on the way
> 
> 
> 
> 
> 
> 
> 
> Think mine is starting to fail, showing 11.34V on the 12v rail
> 
> 
> 
> 
> 
> 
> 
> Measured with a multimeter. Ordered a Corsair RM 1000w gold and soon another 290
> 
> 
> 
> 
> 
> 
> 
> Hopefully I can cram it all inside my antec threehundred
> 
> 
> 
> 
> 
> 
> 
> Else the powertools comes out


The EVGA Supernova 1000 P2 is currently $190 at NCIX, would be a much better buy than the RM for around the same price.


----------



## shilka

Quote:


> Originally Posted by *Mr357*
> 
> The V1000 is what I got about two weeks ago. Is there anything wrong with the fact that the PCI-E connectors on the PSU end are 6-pin instead of 8? When re-connecting my 290X I used two separate cables instead of two connectors off of one cable, since I figured it would make a difference. Does it actually?


No

They would not put two connectors on one cable if the cable could not handle it
Quote:


> Originally Posted by *Slomo4shO*
> 
> The EVGA Supernova 1000 P2 is currently $190 at NCIX, would be a much better buy than the CM RM for around the same price.


Did not check NCIX so thank you

And yes the EVGA SuperNova P2 makes the RM look like a pathetic joke


----------



## Zenophobe

Make it over 5K!!

http://www.3dmark.com/3dm/2118442


----------



## sun100

Just a quick question: do you guys with an R9 290 (reference design) who use a custom fan profile in Trixx/AB set the fan speed % roughly equal to the temperature (say 60% fan for 60 C), or do you go a bit lower? I'm running my Sapphire R9 290 at about 55% fan speed and 63 C under BF4 load via Trixx and, God, I feel like my PC is going to take off. Holy cow, that thing is L O U D. Any tips for a Trixx custom fan curve on Sapphire's R9 290?
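For reference, a "fan % roughly tracks temperature" profile like the one asked about can be expressed as a few anchor points with linear interpolation between them, the same shape Trixx/Afterburner custom curves use; the point values here are illustrative, not a recommendation:

```python
# Illustrative fan curve: (temp C, fan %) anchor points with linear
# interpolation between them. Anchor values are made up for the example.

def fan_curve(temp_c, points=((40, 30), (60, 55), (75, 75), (85, 100))):
    """Map GPU temperature to fan duty cycle by interpolating anchors."""
    if temp_c <= points[0][0]:
        return points[0][1]
    for (t0, f0), (t1, f1) in zip(points, points[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return points[-1][1]

print(fan_curve(63))  # -> 59.0, close to the "fan % = temp" rule of thumb
```

With a curve like this, running 55% fan at 63 C sits a few points below the "fan equals temp" line, which matches the trade-off being discussed: a little more heat for a lot less noise.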


----------



## FtW 420

Quote:


> Originally Posted by *shilka*
> 
> No
> *
> They would not put two connectors on one cable if the cable could not handle it*
> Did not check NCIX so thank you
> 
> And yes the EVGA SuperNova P2 makes the RM look like a pathetic joke


It depends on the amps on the rail/rails as well. My Silverstone 1500W has the 8-pin + 6-pin connectors on each cable, but with only 25 amps per rail I have to use 2 cables per card if I want to overclock.


----------



## Mr357

Quote:


> Originally Posted by *FtW 420*
> 
> It depends on the amps on the rail/rails as well. My Silverstone 1500W has the 8-pin + 6-pin connectors on each cable, but with only 25 amps per rail I have to use 2 cables per card if I want to overclock.


Well, the V1000 has a single 12V rail supposedly capable of up to 83A.


----------



## zpaf

A good benchmark to find your card limits.


----------



## Forceman

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> So what I've been doing is I've been raising my core by 10 and my memory by 25. Between each increase, I run 3DMark and between every 50 increase in core I run valley for a while or play bf4 for a half hour. Right now, I'm at 1200/1500 without touching the voltage. Now, I'm just wondering if this is normal or if this card I have is "better than average"? What I plan on doing is finding the unstable clocks for core and memory, from there drop it down by 25 and 50 respective, and push the voltage offset by 10mv. Is this an okay plan? And do you guys leave your cards overclocked/volted or do you only overclock when gaming?


You sure the power limit has been raised and it's really running those clocks? 1200/1500 at stock voltages greatly exceeds the norm.
Quote:


> Originally Posted by *psyside*
> 
> The Tri X oc being amazing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *1.4V 1220/1600* look at those temps!!!


Little warm on VRM1 though.


----------



## dspacek

Quote:


> Originally Posted by *zpaf*
> 
> A good benchmark to find your card limits.


Very nice! What are your settings: voltage, BIOS, etc.? Which brand of memory? Are you water cooled?
This is exactly what I want to reach with my 290!


----------



## zpaf

Quote:


> Originally Posted by *dspacek*
> 
> Very nice, what is your settings of voltage, BIOS etc, which brand of memory? Are you water cooled?
> This is exactly what I want to reach with my 290!


I'm at max volts in GPU Tweak, 1.412v, with vdroop to ~1.3v.
My card is an Asus and the BIOS is the second release from Asus.
It's on the reference cooler, fan at 100% of course for the bench.
Memory is Elpida.


----------



## Tobiman

Just got an EVGA 1300w G2 to replace a Seasonic X850 and things are looking pretty good so far. Will be running a 290 and 3 X 270Xs for mining.


----------



## psyside

Quote:


> Originally Posted by *Forceman*
> 
> Little warm on VRM1 though.


Yep but 1.4v on air


----------



## zpaf

Just for fun ...










techPowerUp GPU-Z Validation eshbp


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Forceman*
> 
> You sure the power limit has been raised and it's really running those clocks? 1200/1500 at stock voltages greatly exceeds the norm


After I restarted PC, I did some more benchmarks. I started noticing artifacts and the screen randomly flashing green or red. I turned it down to 1175/1500 and my PC crashed. I turned it down again and I see that it is most stable at 1155/1500. The memory clock may still be a bit high, though. I'm going to play around with the voltages and see how much more I can push it. So to see the voltage limit, can I do the following:

Push the VDDC offset to 200, find the max overclocks, and then tone down the VDDC by 10 until it becomes unstable?

Or should I raise the VDDC in increments of 10, and see how much the clocks can be pushed in each increment?


----------



## dspacek

Quote:


> Originally Posted by *zpaf*
> 
> I am with max volts on gpu tweak 1.412v with vdroop ~1.3v
> My card is asus and the bios is the second release from asus.
> It is with reference cooler fan at 100% of course for bench.
> Memory is Elpida.


Thanks a lot! I hope I will be able to manage this with the stock Sapphire BIOS under water, maybe a little bit better ;-)
Can you upload Valley bench results on Extreme HD settings?


----------



## Aggronor

Guys,

Quick Question...
Which is the most stable config I can get with a reference Sapphire 290X ????? OC ???

1150 - 1300 ?????

Power limit ?????

Should I use MSI Afterburner.. the latest Beta version ? or just Trixx ?

CPU: 3770K
PSU: 750 Seasonic 80 plus bronze
RAM: 16 1600MHZ

THANKS THANKS THANKS!!!!!


----------



## ImJJames

Quote:


> Originally Posted by *Aggronor*
> 
> Guys,
> 
> Quick Question...
> Which is the the most stable config i can get with a reference Sapphire 290x ????? OC ???
> 
> 1150 - 1300 ?????
> 
> Power limit ?????
> 
> Should I use MSI Afterburner.. the lastest Beta version ? or just Trixx ?
> 
> CPU: 3770K
> PSU: 750 Seasonic 80 plus bronze
> RAM: 16 1600MHZ
> 
> THANKS THANKS THANKS!!!!!


I think the latest MSI AB beta 18 is bugged (beta 17 worked fine). After updating I got flickering in Chrome and then a black screen. Since switching to Sapphire Trixx I haven't had any problems.


----------



## the9quad

Quote:


> Originally Posted by *zpaf*
> 
> A good benchmark to find your card limits.


Cool, I was going to overclock until I got into the top 10, but I ran it once and hit the #10 spot at stock clocks, lol. I'm done

38246



http://www.catzilla.com/toplist?page=1&firm=all&res=720&multi=all&ven=all&ctype=all&manual=&submit=Show


----------



## TheSoldiet

can you change me to WC/RED Mod.


----------



## vortex240

Just got a reference 290X straight from AMD Canada (not branded). 81.6% ASIC, Hynix memory - does 1150 core and 1600 mem, +100mV, +13mV aux. Stock cooler with replaced TIM and removed I/O plate.


----------



## Falkentyne

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> After I restarted PC, I did some more benchmarks. I started noticing artifacts and the screen randomly flashing green or red. I turned it down to 1175/1500 and my PC crashed. I turned it down again and I see that it is most stable at 1155/1500. The memory clock may still be a bit high, though. I'm going to play around with the voltages and see how much more I can push it. So to see the voltage limit, can I do the following:
> 
> Push the VDDC offset to 200, find the max overclocks, and then tone down the VDDC by 10 until it becomes unstable?
> 
> Or should I raise the VDDC in increments of 10, and see how much the clocks can be pushed in each increment?


You find the clock you want, then find how much voltage you need for that clock. If it's too high, you lower your desired clock and try again until you can get decent voltage, speed and temps for it.


----------



## Sazz

Quote:


> Originally Posted by *Falkentyne*
> 
> You find the clock you want, then find how much voltage you need for that clock. If it's too high, you lower your desired clock and try again until you can get decent voltage, speed and temps for it.


LOL, I work the other way around: I find what voltage I want to use with my daily clocks (for the 290X I only want to use +50mV), then set a clock I know will be unstable (with artifacts) and work my way down on the clocks. Once no artifacts are visible to me, I use OCCT to verify it's stable xD


----------



## Slomo4shO

Quote:


> Originally Posted by *zpaf*
> 
> Just for fun ...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> techPowerUp GPU-Z Validation eshbp


Nice









What stable clocks were you able to obtain on that card?


----------



## Aggronor

What's the function of VDDC OFFSET when using TRIXX ????


----------



## psyside

Quote:


> Originally Posted by *Slomo4shO*
> 
> The EVGA Supernova 1000 P2 is currently $190 at NCIX, would be a much better buy than the *CM RM* for around the same price.


?


----------



## Forceman

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Push the VDDC offset to 200, find the max overclocks, and then tone down the VDDC by 10 until it becomes unstable?
> 
> Or should I raise the VDDC in increments of 10, and see how much the clocks can be pushed in each increment?


I would pick a voltage you are comfortable with first (I'd stick with the +100mV Afterburner provides, but that's just me) and then find the max clocks that are stable at that voltage. Just work your way up from something reasonable, like 1200 at +100mV, and see how far you get/how far back down you need to go. Then I usually back off another small amount just for stability/safety.
Quote:


> Originally Posted by *Aggronor*
> 
> Whats the function of VDDC OFFSET ??? using TRIXX ????


It increases the core voltage.
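The work-up-then-back-off method described above (pick a voltage, walk the clock up, then step back a little for safety) can be sketched as a short loop; `is_stable()` is a hypothetical stand-in for a stress run at the chosen voltage:

```python
# Sketch of the "fix a voltage, raise the clock until instability,
# then back off a small amount" method. is_stable() stands in for a
# real stress test at the chosen voltage (e.g. +100 mV).

def max_clock_at_voltage(clock, is_stable, step=25, backoff=25):
    """Raise the clock until is_stable() fails, then back off for margin."""
    while is_stable(clock + step):
        clock += step
    return clock - backoff

# Toy stand-in: pretend the card is stable up to 1200 MHz at this voltage.
print(max_clock_at_voltage(1100, lambda c: c <= 1200))  # -> 1175
```

The `backoff` step encodes the "back off another small amount just for stability/safety" part: benchmark-stable and 24/7-stable are rarely the same clock.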


----------



## psyside

1.4v is too much; I'm pretty sure you will be limited by the core before the volts... around 1.3v-1.35v for runs would be my max.

Also, what is the average with these cards? Does the 290X reach higher clocks? AFAIK 1150/1450 should be expected with the majority of 290/290X cards, right?


----------



## Slomo4shO

Quote:


> Originally Posted by *psyside*
> 
> ?


Brainfart, thinking Corsair and typing in CM. I was going to initially recommend the V series but found the higher quality P2 cheaper. Thanks for catching it.


----------



## psyside

Quote:


> Originally Posted by *Slomo4shO*
> 
> Brainfart, thinking Corsair and typing in CM. I was going to initially recommend the V series but found the higher quality P2 cheaper. Thanks for catching it.


Not really sure the EVGA is higher quality than the V series tbh; they use CapXon caps and inferior fans, but anyway it's a great unit


----------



## Forceman

Quote:


> Originally Posted by *psyside*
> 
> Also what is the average with this cards? does the 290X reach higher clocks? afaik 1150/1450 should be expected with the majority of the cards 290/290X right?


Those speeds sound about right. Doesn't really appear to be much difference between 290 and 290X clocks.


----------



## psyside

Yea, that's what i thought.


----------



## HardwareDecoder

Just bought another Sapphire 290 BF4 edition for $450 on eBay... gonna use it for mining.

From what I've seen these cards have a big chance to unlock; let's hope I get lucky.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Forceman*
> 
> I would pick a voltage you are comfortable with first (I'd stick with the +100mV Afterburner provides, but that's just me) and then find the max clocks that are stable at that voltage. Just work your way up from something rasonable, like 1200 at +100mV and see how far you get/how far back down you need to go. Then I usually back off another small amount just for stability/safety.
> It increases the core voltage.


Okay! So instead of increasing by 200 in Trixx, I should instead increase by 100mV, and then find the max clocks stable at that voltage. Afterwards, slowly decrease the voltage until I find where those clocks become unstable, then maybe add 5 to the voltage and drop the core by 10 and the memory by 25 or so. Does that sound okay? And is there little to no return in frequency overclocking beyond a certain voltage? Thank you.


----------



## Aggronor

Should I perform OC (Sapphire 290x) in Uber mode or Quiet mode ????.... or i will get same results on BOTH ?

Thanks


----------



## psyside

Quote:


> Originally Posted by *Aggronor*
> 
> Should I perform OC (Sapphire 290x) in Uber mode or Quiet mode ????.... or i will get same results on BOTH ?
> 
> Thanks


Neither one; use a manual fixed fan at 65% for overclocking. Do not go lower than 60% when you OC. A fixed fan works better than auto btw, even if you set up a curve.


----------



## stickg1

Good Lord!!! I flipped the BIOS switch over to the ASUS BIOS to have some benchmarking fun. It's pretty unstable (my 2nd monitor flickers sometimes), but anyway. Last night I mentioned how on the stock BIOS with +100 mV my card runs at about 1.27v after vdroop, and running 3DMark I would draw about 450w from the wall. Well, with the ASUS BIOS (not sure which one exactly; it came flashed by the previous owner), the voltage stays locked at 1.35v, and running the same 3DMark at a higher core clock I was pulling about 560w from the wall. Rock-n-roll babe!!!


----------



## Forceman

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Okay! So instead of increasing by 200 in Trix, I should instead increase by 100mv, and then find the max clocks stable with that voltage. Afterwards, slowly decrease the voltage until I find at which voltage those clocks are unstable. And then maybe add like 5 to the voltage and drop the core by 10 and memory by 25 or so. Does that sound okay? And is there a little to no return in frequency overclocking beyond a certain voltage? Thank you.


Yes, but if you find the max clock at a given voltage, you probably won't be able to lower the voltage back down much without causing instability. So if you set +100, then find 1200/1500 to be stable, backing down the voltage to +75 may make those same clocks unstable again. You can try it, but you'll probably end up back where you started (+100/1200/1500, for example).
Quote:


> Originally Posted by *stickg1*
> 
> Good Lord!!! I flipped the BIOS switch over to the ASUS BIOS to have some benchmarking fun. It's pretty unstable, my 2nd monitor flickers sometimes, anyway. Last night I mentioned how on the stock BIOS with +100 mV my card runs with about 1.27v after vDroop and running 3Dmark I would draw about 450w from the wall. Well with the ASUS BIOS (not sure which one exactly it came flashed by the previous owner), the voltage stays locked at 1.35v and running the same 3DMark at a higher core clock I was pulling about 560w from the wall. Rock-n-roll babe!!!


If the voltage sticks at 1.35V they may have flashed the PT1/PT3 BIOS on it, instead of a stock BIOS.


----------



## devilhead

Maybe it will be enough to cool 4x 290X







It is three 3x120 rads, 60mm thick







Will run 2x D5 pumps, hope that's enough...


----------



## Widde

Hmm, just noticed something weird: going from stock core to 1095 MHz required zero increase in core voltage, and 1095-1115 is +56 mV (any further is a decrease in fps). But when I increase the memory from 1250 to 1300 I lose fps in Furmark (probably not reliable, but still); I have to take the memory up to 1400 before I'm back at the same fps. Don't know if it matters, but I used a 1024x576 window in Furmark.

Going from 183 fps on stock memory to 177 fps with 1300 on the memory.

And another question (yes, I have many of them): my 12V is sinking below 11.23V at load. This should be below the approved spec, right? Could this be causing my OCCT errors at 1080 MHz with +150mV?
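On the 12 V question: the ATX spec allows +/-5% regulation on the 12 V rail, so the floor is 11.40 V. A quick check (the tolerance figure is the standard ATX one; the function is just for illustration):

```python
# ATX allows +/-5% on the 12 V rail: roughly 11.40 V to 12.60 V.
# An 11.34 V idle reading is already out of spec, and 11.23 V
# under load is worse still.

def rail_in_spec(measured_v, nominal_v=12.0, tolerance=0.05):
    """True if a rail voltage is within +/-tolerance of nominal."""
    lo = nominal_v * (1 - tolerance)
    hi = nominal_v * (1 + tolerance)
    return lo <= measured_v <= hi

print(rail_in_spec(11.34))  # -> False
print(rail_in_spec(11.23))  # -> False
```

So yes, both readings fall below the allowed range, and an out-of-spec rail under load is a plausible contributor to stress-test errors.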


----------



## stickg1

Quote:


> Originally Posted by *Forceman*
> 
> If the voltage sticks at 1.35V they may have flashed the PT1/PT3 BIOS on it, instead of a stock BIOS.


Figured as much, I just didn't know which one for sure. But it's a PowerColor card, and with that BIOS it just says "ASUS" in GPU-Z.

It's a completely different card on that BIOS. Great for quick benches but that's about all it's good for, on my card at least.


----------



## Slomo4shO

Quote:


> Originally Posted by *psyside*
> 
> Not really sure if the EVGA is higher quality then V series tbh, they use CapXon caps, and inferior fans but anyway its great unit


The P2 has all Nippon Chemi-Con caps, is platinum rated with great ripple suppression, and comes with a 10 year warranty. I would consider it superior








Quote:


> Originally Posted by *devilhead*
> 
> maybe it will be enough to cool 4x 290x
> 
> 
> 
> 
> 
> 
> 
> it is three 3x120 rads 60mm thickness
> 
> 
> 
> 
> 
> 
> 
> will run 2x D5 pumps, hope that enough...


Why not just pickup a Phobya G-Changer Xtreme NOVA 1080?


----------



## psyside

Quote:


> Originally Posted by *Slomo4shO*
> 
> The P2 has all Nippon Chemi-Con caps, is platinum rated with great ripple suppression, and comes with a 10 year warranty. I would consider it superior


Yes, I mixed it up with the G2... they are very close; both are great.

Also, a 10-year warranty? Pretty much marketing.


----------



## shilka

Quote:


> Originally Posted by *psyside*
> 
> Not really sure if the EVGA is higher quality then V series tbh, they use CapXon caps, and inferior fans but anyway its great unit


That's ONLY on the modular side, which is not really stressed


----------



## HardwareDecoder

Quote:


> Originally Posted by *HardwareDecoder*
> 
> just bought another 290 sapphire Bf4 edition for $450 on ebay.....Gonna use it for mining.
> 
> From what i've seen these cards have a big chance to unlock let's hope I get lucky.


A 290 + 290X will crossfire together since they are the same chip, right? They just run like 2x 290s, I have heard; is this true?

Like I said, I'm gonna be using them both for mining for at least a few months until they pay for themselves, but if I want to do a little gaming once in a while I'm hoping they will crossfire.


----------



## Slomo4shO

Quote:


> Originally Posted by *HardwareDecoder*
> 
> a 290+290x will xfire together since they are the same chip right? Just run like 2x 290's I have heard, this is true right?


This is correct for crossfire. However, for mining you can configure your rig to run them as a 290X and a 290 and you shouldn't be using crossfire for mining.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Slomo4shO*
> 
> This is correct for crossfire. However, for mining you can configure your rig to run them as a 290X and a 290 and you shouldn't be using crossfire for mining.


ohh I know that much. Thanks for the confirmation on gaming though.

Really hoping it unlocks too; I have a good feeling since it is a BF4 edition, those were winners.


----------



## PillarOfAutumn

I'm very confused by my card...

At stock voltages and running at 1150/1480, my 3DMark score is coming out as 5150.

Once I push the voltage by 100mV and start increasing the core only, I'm getting lower scores:

+100mv VDDC
1160/1500: 4735
1170/1500: 4791
1180/1500: 4826
1190/1500: 4834
1200/1500: 4845

Anyone know what's going on?


----------



## devilhead

Quote:


> Originally Posted by *Slomo4shO*
> 
> The P2 has all Nippon Chemi-Con caps, is platinum rated with great ripple suppression, and comes with a 10 year warranty. I would consider it superior
> 
> 
> 
> 
> 
> 
> 
> 
> Why not just pickup a Phobya G-Changer Xtreme NOVA 1080?


Because I got a nice deal on those rads and D5 pumps


----------



## ZealotKi11er

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> I'm very confused by my card...
> 
> At stock voltages and running at 1150/1480, my 3DMark score is coming out as 5150.
> 
> Once i push the voltage by 100, and start increasing the core only, I'm getting lower scores:
> 
> +100mv VDDC
> 1160/1500: 4735
> 1170/1500: 4791
> 1180/1500: 4826
> 1190/1500: 4834
> 1200/1500: 4845
> 
> Anyone know whats going on?


Could be that the card does not have enough power?


----------



## PillarOfAutumn

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Could be that the card does not have enough power?


Should I push it up to 200mV? Will that do anything to the card?


----------



## velocityx

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Could be that the card does not have enough power?


that's with power limit set to 50?


----------



## PillarOfAutumn

Quote:


> Originally Posted by *velocityx*
> 
> that's with power limit set to 50?


How do I check the power limit? I have everything stock on this card, and it's set to uber mode.


----------



## Slomo4shO

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Really hoping it unlocks too I have a good feeling since it is a bf4 edition, those were winners.


Ironically, my 4 Sapphire BF4 editions unlocked back when none of the Sapphire cards were unlocking a month ago and the two XFX cards I picked up last week are both locked.


----------



## Forceman

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Should I push it upto 200mv? will that do anything to the card?


Check in CCC and make sure the power limit is really changing. Sometimes there's a bug where changing it in Afterburner/Trixx won't actually change it. Sounds like your card is power throttling. I've seen the same thing happen and that was what was happening to mine.

Saw your other post, you definitely need to change the power limit if you want to overclock/overvolt. You can check it on the overdrive page of CCC.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Forceman*
> 
> Check in CCC and make sure the power limit is really changing. Sometimes there's a bug where changing it in Afterburner/Trixx won't actually change it. Sounds like your card is power throttling. I've seen the same thing happen and that was what was happening to mine.
> 
> Saw your other post, you definitely need to change the power limit if you want to overclock/overvolt.


Thanks! I just checked and it's at 0%. What percentage should I raise it by to match +100 mV?


----------



## Forceman

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Thanks! I just checked and its at 0%. What percentage can I raise it by to reflect +100mv?


You can just set it to +50% and not worry about it. Some people set 20 or 30 instead though.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Forceman*
> 
> You can just set it to +50% and not worry about it. Some people set 20 or 30 instead though.


But if I set it to 50%, won't that increase the voltage the card is receiving by 50%?


----------



## Forceman

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> But if I set it to 50%, won't that increase the voltage the card is receiving by 50%?


It's not voltage; it raises the total power the card is allowed to draw by permitting more current. Changing the power limit won't affect the voltage, except that the card may no longer throttle the voltage and clocks to stay below the power limit. But it won't add more on its own.
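A toy numeric sketch of that idea (the 250 W budget and the linear clock-to-power scaling are illustrative assumptions, not AMD's actual PowerTune behavior):

```python
# Toy model of power-limit throttling: raising the limit does not change
# the requested voltage; it only raises the ceiling the card must stay
# under, which it otherwise does by dropping clocks.
def effective_clock(clock_mhz, board_power_w, limit_pct, tdp_w=250.0):
    """Return the sustained clock after throttling. board_power_w is the
    estimated draw at the requested clock/voltage; power is assumed
    roughly linear in clock at a fixed voltage."""
    ceiling = tdp_w * (1.0 + limit_pct / 100.0)
    if board_power_w <= ceiling:
        return clock_mhz  # within budget: run as requested
    return clock_mhz * ceiling / board_power_w  # scale the clock down

# At +0% an overvolted card blowing a 250 W budget throttles to ~1000 MHz:
print(effective_clock(1200, 300.0, 0))   # 1000.0
# At +50% the ceiling is 375 W, so the full 1200 MHz is sustained:
print(effective_clock(1200, 300.0, 50))  # 1200
```

This matches what PillarOfAutumn saw: more voltage at a 0% limit meant more power drawn, hence harder throttling and lower scores.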


----------



## HardwareDecoder

Quote:


> Originally Posted by *Slomo4shO*
> 
> Ironically, my 4 Sapphire BF4 editions unlocked back when none of the Sapphire cards were unlocking a month ago and the two XFX cards I picked up last week are both locked.


Ah, good, that makes me feel better about my chances. I did ask the eBay seller if he had any info on its unlock status, but that wasn't too long ago, so I haven't heard anything yet.

He said he bought 12 of them to mine and his apartment couldn't handle the power draw.







His loss is my gain.

From what I've seen, the BF4 edition Sapphires were super lucky.


----------



## smoke2

Here is a preview of the trio of ASUS, Gigabyte, and Sapphire Tri-X, with the reference 290/290X added for comparison.

The Sapphire is the coolest of them all:
http://www.hardware.fr/news/13513/radeon-r9-290-290x-customs-nos-1ers-tests.html


----------



## Aggronor

This is the best result I could get without artifacting... (Sapphire R9 290X, reference, with a custom fan profile, using Trixx)

1150-1350

CPU: 3770k
PSU: Seasonic 750 80 plus
RAM: 16 GB 1600MHZ








GL


----------



## tijgert

@ Devilhead: That's a nifty rad stand, where did you get it?


----------



## Arizonian

Quote:


> Originally Posted by *flamin9_t00l*
> 
> Hi guys I'm now under water with 2 290's (Sapphire and HIS). Please update me, cheers.
> 
> Proof
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Rig pic
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Running these things in crossfire using stock cooler with only 1 slot spacing is really not a long term option as the noise was unbearable, although I liked to keep the temps below 80c instead of the ridiculous 95c target.
> 
> Anyone thinking of watercooling the reference 290(X) don't hesitate you will see 50c knocked off the load temp with decent size rad(s).
> 
> All the best for the new year


Congrats - updated from one to two







and under water









Quote:


> Originally Posted by *vortex240*
> 
> Just got a reference 290X straight from AMD Canada (not branded). 81.6% ASIC, Hynix memory - does 1150 core and 1600 mem, +100mv, +13mv aux. Stock cooler with replaced TIM and removed I/O plate.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - welcome - added


----------



## PillarOfAutumn

Alright, so I just ran the test again after setting the power limit to 50% and this is what I get for my 3DMark benchmarks:

+100mv
at 1210/1500: 5378
at 1220/1500: 5317
at 1210/1510: 5202

So does this mean that 1210/1500 is my sweetspot? Or is there something else that is limiting my system?

P.S. at stock volts, I get:
1150/1480: 5150
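Reading the sweet spot off those runs is just a max-by-score over the clock pairs, which can be sketched like this (numbers taken from the results above):

```python
# Find the best (core, mem) combination by 3DMark score.
runs = {
    (1210, 1500): 5378,
    (1220, 1500): 5317,
    (1210, 1510): 5202,
    (1150, 1480): 5150,  # stock-voltage baseline
}
best = max(runs, key=runs.get)
print(best, runs[best])  # (1210, 1500) 5378
```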


----------



## Mr357

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> But if I set it to 50%, won't that increase the voltage the card is receiving by 50%?


No, power limit and Vcore are completely separate.


----------



## BradleyW

Is the 290x shipped in quiet mode?
Thanks


----------



## dspacek

Can anyone help me reach higher OC limits in AB? One day I did something in AB that let me run almost unrestricted clocks.
But now I'm locked at a stable 1625 MHz on the RAM and can't try much more.

PLEASE give me some advice on how to unlock the unofficial OC limits.
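For what it's worth, the commonly cited way to lift Afterburner's slider limits on these cards is the "unofficial overclocking" switch in its config file. A sketch of what that edit looks like (close AB first, and verify the exact keys against your own AB version; this is at your own risk and unsupported by MSI):

```ini
; MSIAfterburner.cfg, in the Afterburner install folder
[ATIADLHAL]
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 1
```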


----------



## PillarOfAutumn

Quote:


> Originally Posted by *dspacek*
> 
> Can anyone help me how to reach higher OC limits in AB? one day I did something in AB that allowed me tu run almost unrestricted clocks.
> But now Im locked in stable 1625Mhz RAM and cant try much more
> 
> 
> 
> 
> 
> 
> 
> 
> PLEASE give me an advice how to unlock unoficial OC limits


Shouldn't you increase your power limit to 50%?

And you can increase your clocks as high as you want, but are you getting any return in performance from those OCs?


----------



## dspacek

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Shouldn't you increase your power limit to 50%?
> 
> And you can increase your clocks as high as you want, but are you getting any return in performance from those OCs?


No, I can't go as high as I want.
Yes, that's without touching the power limit or anything else.
Yes, it increases the average by 4 FPS in Valley; it's now 70 FPS avg.


----------



## Forceman

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Alright, so I just ran the test again after setting the power limit to 50% and this is what I get for my 3DMark benchmarks:
> 
> +100mv
> at 1210/1500: 5378
> at 1220/1500: 5317
> at 1210/1510: 5202
> 
> So does this mean that 1210/1500 is my sweetspot? Or is there something else that is limiting my system?
> 
> P.S. at stock volts, I get:
> 1150/1480: 5150


Looks like it - once you start going too far you get artifacts or errors that can reduce performance, even if you can't always see them on screen.
Quote:


> Originally Posted by *BradleyW*
> 
> Is the 290x shipped in quiet mode?
> Thanks


Yes.


----------



## ZealotKi11er

OK, so mining in a pretty hot room I hit 81°C on VRM1. Core is 54°C. Man, EK has a problem here.


----------



## HardwareDecoder

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Ok so mining with a pretty hot room i hit 81C in VRM1. Core is 54C. Man EK has a problem here.


My VRM hits 60°C mining, core is 53°C... is your block mounted right?


----------



## psyside

Quote:


> Originally Posted by *shilka*
> 
> Thats ONLY on the modular which is not really stressed


Don't really understand what you are saying : /


----------



## PillarOfAutumn

So these are the numbers I got:

1150/1480 stock voltage
1210/1500 at 87mv

Playing BF4 at +87 mV, I'm averaging about 80-90 FPS on a 1080p monitor at a supersampled resolution simulating 1440p. Everything is on Ultra except AA, HBAO, and post-processing, which are either on low or off entirely. At these settings, my max temps are:

GPU: 46
VRM1: 43
VRM2: 37

At 1150/1480 my card benchmarks at 5150, and at 1210/1500 I'm at 5378 (3DMark).

My CPU is only OCed to 4.0 GHz, but I'm planning on pushing it to 4.5, which may also lift benchmark and gaming performance. I'm looking to play BF4 at 1440p at 90+ FPS.

I think for my current settings, this is optimal.

Starting tomorrow, I'm going to remove all the piping and drain my system so that I can put in the finishing touches. At this point, I won't be using my PC for about a month. Does anyone want me to run any games or do any tests at these settings?
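On the supersampling above: BF4's resolution scale multiplies each axis, so pixel count grows with the square of the scale. The 133% figure below is my own estimate of what "simulating 1440p on a 1080p panel" works out to, not a setting name from the game:

```python
# Compute the internal render resolution from a resolution-scale percentage.
def render_resolution(width, height, scale_pct):
    return int(width * scale_pct / 100), int(height * scale_pct / 100)

print(render_resolution(1920, 1080, 133))  # (2553, 1436), close to 2560x1440
print(render_resolution(1920, 1080, 200))  # (3840, 2160), 4x the pixel load
```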


----------



## ZealotKi11er

Quote:


> Originally Posted by *HardwareDecoder*
> 
> my vrm hits 60 mining core is 53.... your block on right?


I have mounted many blocks before. Did you use thermal paste? Also, VRM1 or VRM2?


----------



## HardwareDecoder

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I have mounted many block before. Did you use thermal paste? Also VRM1 or VRM2?


vrm1 and yes I did use paste


----------



## shilka

Quote:


> Originally Posted by *psyside*
> 
> Don't really understand what you are saying : /


The CapXon caps you were talking about are only on the modular board; they are not actually inside the PSU itself.

And the modular board is not really stressed at all.


----------



## ZealotKi11er

Quote:


> Originally Posted by *HardwareDecoder*
> 
> vrm1 and yes I did use paste


Did it say so in the instructions? I never had to use paste on RAM or VRMs before.


----------



## BradleyW

Quote:


> Originally Posted by *Forceman*
> 
> Looks like it - once you start going too far you get artifacts or errors that can reduce performance, even if you can't always see them on screen.
> Yes.


Thanks. I will be sure to flick the switch since I have both 290x's under water.


----------



## HardwareDecoder

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Did it say in the instructions. I never had to use paste in ram or vrm before.


yup


----------



## ZealotKi11er

Quote:


> Originally Posted by *HardwareDecoder*
> 
> yup


Thanks. Looks like I will have to redo it.


----------



## utnorris

Quote:


> Originally Posted by *devilhead*
> 
> maybe it will be enough to cool 4x 290x
> 
> 
> 
> 
> 
> 
> 
> it is three 3x120 rads 60mm thickness
> 
> 
> 
> 
> 
> 
> 
> will run 2x D5 pumps, hope that enough...


I use three old-school PA120.3 rads to cool my 4 x 290s while mining and it does fine, but it depends on the room temp, obviously. If I leave the room closed, the room temp gets to 80°F and the GPUs are around 50°C. If I open the window a little to let cool outside air in, I can drop the GPUs to 40°C.


----------



## psyside

Quote:


> Originally Posted by *shilka*
> 
> The CapXon caps you where talking about are only on the modular board they are not actually inside the PSU itself
> 
> And the modular board is not really stressed at all


Ah yes, but it's still a bit of a cut-down. The P2 doesn't feature those caps, so it's all good.


----------



## 00Smurf

Expanded to two rigs and five 290Xs. One rig is a HAF XB EVO with two 290Xs; the other three cards are in a Carbide 540 case running Tri-Fire when not mining.

One card runs an H90, the other four use H55s. Can't beat 54°C mining temps.


----------



## PillarOfAutumn

Wow. How much money have you made mining?


----------



## ShortySmalls

http://www.techpowerup.com/gpuz/cz2ep/
2 Gigabyte 290x
EK Waterblocks


----------



## psyside

Guys, what do you think about adding some more info on what to do and what not to do to the first page?

Things like which drivers are generally the most stable, which vBIOS is best, recommended stability tests, and so on. I know this can differ from one card to another, but still.

And while I'm on the subject, what are the most stable drivers at the moment, and is there any news on when the next version should be available?


----------



## Prexxus

Hey guys my first post here







Just got my R9 290 Sapphire last week, and man, was I impressed! That is, until the black screens started. I've scoured the net for the past week or so looking into this, and it seems to be a pretty widespread problem.

What I can't find is any official comment on it or any real fixes. Has AMD said anything? Is there any real way of fixing this? Does anyone know what's causing this to happen?

I checked the memory type of my GPU and saw that it was Elpida and not Hynix. From what I've read so far, almost all, if not all, of the people with this problem have Elpida memory on their boards.

My question really is: what should I do? Is there anything I can do myself, or should I just send it back and hope to get one with Hynix memory?

I'm exhausted trying to fix this. I can't figure it out myself and I would love any advice/help possible at this point.

Thank you so much in advance.

My system specs, if it matters:

i5 2500k @ 4.2
Radeon R9 290 Sapphire
Sabertooth P67
Rocketfish 700W PS2
8 GB G.Skill RAM @ 1600
Windows 7 64-bit

Using the latest AMD drivers.

Black screens happen when playing certain games at random times; sometimes I can get hours of play in, sometimes not even 5 minutes. Other games like DOTA 2 and BF4 seem to work flawlessly every time, which is quite strange to me, but it might just be random luck.


----------



## 00Smurf

Quote:


> Originally Posted by *psyside*
> 
> Guys, what you think about adding some more info about what to and what not to on the first page?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Like what drivers are generally the most stable, what vBIOS is best, recommended stability tests and so on. I know this can bin different for one card, to another, but still.
> 
> And while i talk about that, what are the most stable drivers atm, and is there any news about when the next version should be available?


I like this idea, especially the vBIOS part.


----------



## Arizonian

Quote:


> Originally Posted by *psyside*
> 
> Guys, what you think about adding some more info about what to and what not to on the first page?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Like what drivers are generally the most stable, what vBIOS is best, recommended stability tests and so on. I know this can bin different for one card, to another, but still.
> 
> And while i talk about that, what are the most stable drivers atm, and is there any news about when the next version should be available?


Quote:


> Originally Posted by *00Smurf*
> 
> i like this idea. Especially the vbios idea.


Submit some info and I'll be glad to post any well constructed info that would benefit owners.








Quote:


> Originally Posted by *Prexxus*
> 
> Hey guys my first post here
> 
> 
> 
> 
> 
> 
> 
> Just got my R9 290 Sapphire last week and man was I impressed! That is until the black screens started. Ive scoured the net for the past week or so looking into this and it seems to be a pretty widespread problem.
> 
> What I can't find is any official comment on it or any real fixes. Has AMD said anything? Is there any real way of fixing this? Does anyone know what's causing this too happen?
> 
> I checked the memory type of my GPU and saw that it was Elpdia and not Hinyx. From what Ive read up to now is that almost all if not all the people with this problem have Elpdia memory on their boards.
> 
> My question really is what should I do? Is there anything I can do myself or should I just send it back and hope to get one with Hinyx memory?
> 
> I'm exhausted trying to fix this. I can't figure it out myself and I would love any advice/help possible at this point.
> 
> Thank you so much in advance.
> 
> My system specs if it matters...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> i5 2500k @ 4.2
> Radeon R9 290 Sapphire
> Sabertooth P67
> Rocketfish 700wPS2
> 8 gigs ram Gskill @ 1600
> Windows 7 64 bit
> 
> 
> 
> Using the latest AMD drivers.
> 
> Black screens when playing certain games at random times, sometimes I can get hours of play in, sometimes not even 5 minutes. Other games like DOTA2 and BF4 seem to work flawlessly every time. Which is quite strange to me but it might just be random luck.


We did have an AMD rep address us in this thread. He had gathered some info regarding the black screen issues we were seeing, and he posted that an incoming update would fix the black screens in the latest driver, but that turned out to be a mixed bag. They know it's happening, and they've acknowledged us.

Sounds like you may have already come across this thread that might help: *290/290X Black Screen Poll*

EDIT - Oh, and welcome to OCN with your first post.


----------



## Sazz

Finally got around to finishing up my rig. For the last 2 months it has sat in an open-case state since it wasn't 100% done and I've been procrastinating on finishing it. Now it's done and I've uploaded updated pics xD

Sucks that I have to let go of this build ASAP before I ship out to basic training.

http://www.overclock.net/g/a/1058261/2013-rig-updated-switched-from-hd7970-to-r9-290x/


----------



## TheSoldiet

I guess this post wasn't enough http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/13240#post_21521316

I will post some pictures later with more convincing proof.


----------



## Sazz

Quote:


> Originally Posted by *TheSoldiet*
> 
> I guess this post wasn't enough http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/13240#post_21521316
> 
> I will post some pictures later with more "believing" proof.


The link to the GPU-Z validation is missing, I believe?


----------



## TheSoldiet

Quote:


> Originally Posted by *Sazz*
> 
> link to gpu-z validation is missing I believe?


Ok, I will do that







I did write my name on the Paint file though


----------



## Sazz

Quote:


> Originally Posted by *TheSoldiet*
> 
> Ok, I will do that
> 
> 
> 
> 
> 
> 
> 
> I did write my name on the Paint file though


Or maybe Arizonian is just away right now and hasn't read your updated post.


----------



## CruxXial

I still wonder how you guys can all easily reach 1500 MHz on the VRAM, and how you confirm it's stable. For me, 1500 MHz and even 1600 MHz work for simple 3DMark benches without any problem, but as soon as I play games that use more than 3 GB of VRAM, I'm forced to downclock all the way to 1350 MHz or I get black screens. Did anybody else notice something like that?


----------



## Sazz

Quote:


> Originally Posted by *CruxXial*
> 
> i still wonder how u guys could all easily reach 1500mhz on the vram and how u confirm it is stable?? for me 1500mhz and even 1600mhz are working for simple 3dmark benches without any problem, but as soon as i am playing games that use more than 3gb of vram i am forced to downclock it all the way to 1350mhz or i do get blackscreens. did anybody else notice something like that???


True. For me, I can go up to 1625 running benchmarks, but as soon as I play games that are demanding on the video card, I get a black screen, anywhere from 1 minute into the game to 2 hours in. I don't see any gain in performance over 1500 MHz on the VRAM, so I just stay at 1500; in fact, my runs at 1550-1600 scored lower than my run at 1500 MHz. So don't worry if you can't reach 1500, the gain isn't much, especially in actual games (a mere 1 FPS difference for me in Tomb Raider and BF4 from stock to 1500 MHz).


----------



## dspacek

*So nobody knows how to reach clocks higher than 1625 MHz on the RAM in AB?*
Benchmarking, for me, means launching Valley etc. and looking for artifacts for a minimum of 30 minutes. *I do not run those few-minute benchmarks; that's an irrelevant stability test!*
Maybe I got lucky with the modules. They look like the same ones I had on my HD7950, which could manage 1700 MHz all day long and 1800 MHz for benches.

But overall, higher RAM clocks mean less restriction on core OC performance and a slightly better overall performance gain.


----------



## CruxXial

Valley, Heaven, and the like do not use much VRAM, no matter how long you keep them running. I kept 3DMark Fire Strike's first sequence looped at Extreme settings, plus Heaven and gaming, to find my max core, but benches are all useless for VRAM testing. I usually keep it at 1350 MHz or even the stock 1250 MHz because of that issue. For example, playing Ghosts with everything maxed out uses up to 3.5 GB of VRAM, and BF4 with 200% scaling uses 3.3 GB.
For the core I can reach 1240 MHz with +200 mV, and that is OK by me. I was just wondering if all those 1500+ MHz clocks are stable with high VRAM use of over 3 GB...


----------



## Roy360

So, for all you people who installed blocks on your R9 290s: how many of you found warranty stickers in the way?

Which vendors are your cards coming from?

And you overclockers, how much of a difference are you getting? I'm going to be setting up CrossFire between a VisionTek R9 290 and an XFX R9 290, both with EK blocks (no backplates), and I'm wondering if I should bother overclocking.

Mined at 1000/1500 for 3 days before my first black screen. Now my hashrate is ~6 kH/s stock or overclocked (I'm guessing HW errors are bogging it down).


----------



## BradleyW

Quote:


> Originally Posted by *Roy360*
> 
> So for all you people who installed blocks on your R9 290s. How many of you found warranty stickers in the way?
> 
> Which vendors are your cards coming from?
> 
> And you overclockers, how much of a difference are you getting? I'm going to be setting a CF between a VIsionTek R9 290 and a XFX R9 290 both with EK blocks, (No backplates), wondering if I should bother overclocking.
> 
> Mined at 1000/1500 for 3 days before my first black screen. Now my hashrate is ~6Kh/s on stock or overclocked (I'm guessing HW errors are bogging it down)


Gigabyte here. No stickers on my cards!


----------



## Csokis

MSI GPUs at CES 2014: Massive R9 290X Lightning and More
Quote:


> Into that mix, MSI is releasing a gigantic R9 290X with their triple-fan and triple-slot GPU.


----------



## devilhead

Quote:


> Originally Posted by *tijgert*
> 
> @ Devilhead: That's a nifty rad stand, where did you get it?


This is the Phobya fan holder, but I don't know where to get the fan holder on its own; I bought it together with those 3 rads:

http://www.aquatuning.de/product_info.php/info/p10005_Phobya-Stand--Bench-Bundle-Nova-1080--mit-9x-Nano-G-12-Silent-Waterproof-1500rpm-Multioption-und-L-fterkabel.html


----------



## BradleyW

I just loaded up Bioshock Infinite and the coil buzzing is unbearable on both cards! How can I reduce the buzz?


----------



## Roy360

Quote:


> Originally Posted by *BradleyW*
> 
> Gigabyte here. No stickers on my cards!


so that's Gigabyte and PowerColor so far, any others?

and for those who did have stickers, how did you remove them?


----------



## TheSoldiet

Quote:


> Originally Posted by *Roy360*
> 
> so that's Gigabyte and PowerColor so far, any others?
> 
> and for those who did have stickers, how did you remove them?


Sapphire got no stickers mate


----------



## Jack Mac

Quote:


> Originally Posted by *BradleyW*
> 
> I just loaded up Bioshock Infinite and the coil buzzing is unbearable on both cards! How can I reduce the buzz?


Personally, I don't have this, but I heard it goes down over time.


----------



## VSG

Quote:


> Originally Posted by *Csokis*
> 
> MSI GPUs at CES 2014: Massive R9 290X Lightning and More


That's the one I am waiting for on the AMD side, I hope someone gets more details on the card soon.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Csokis*
> 
> MSI GPUs at CES 2014: Massive R9 290X Lightning and More


That thing is freaking HUGE!!

It's gonna be a beastly overclocker though: triple-slot cooling plus a 12+3+2 power phase design? Yup, looking forward to someone picking one up and benching it.


----------



## Tomalak

Quote:


> Originally Posted by *BradleyW*
> 
> I just loaded up Bioshock Infinite and the coil buzzing is unbearable on both cards! How can I reduce the buzz?


Buzzing usually increases with FPS (it should be especially noticeable in menus and such). So in theory, anything that limits FPS, like a frame limiter or vsync, should somewhat reduce it.


----------



## Hogesyx

Quote:


> Originally Posted by *Tomalak*
> 
> Buzzing usually increases with FPS (should be especially noticeable in menus and such).So in theory, anything that limits FPS, like a frame limiter or vsync, should somewhat reduce it.


The buzzing/whine seems to be directly related to the voltage (power usage?); I'm not sure if the sound comes from the chip itself or the VRMs. It is especially obvious under water cooling, since the stock fan noise would otherwise mask the whine.


----------



## BradleyW

Quote:


> Originally Posted by *Hogesyx*
> 
> The buzzing/whine seems to be directly related to the voltage(power usage?), not sure if the sounds is from the chip itself or the vrms. It is especially obvious when using watercooling as the stock fan noise will generally block out the whine.


Do all 290x's whine bad on water?


----------



## dspacek

Quote:


> Originally Posted by *CruxXial*
> 
> valley, heaven and stuff like that do not use much vram, no matter for how long u keep it running. i kept 3dmark firestrike first sequence looped at extreme settings plus heaven and gaming to find my max core but benches are all useless for vram testing. i am usually keeping it at 1350mhz or even stock 1250mhz because of that issue. for example playing ghosts with everything maxed out uses up to 3,5gb of vram and bf4 with 200% scaling uses 3,3gb.
> for the core i can reach 1240mhz with +200mv and that is ok to me. i was just wondering if all that 1500+ mhz are stable with high vram use of over 3gb...


Good thoughts, but I'm still stable no matter how much VRAM is used. When I run more apps with almost 4 GB in use, it still works well.
*So how do I unlock higher clocks in AB?* I tried using Trixx and AB together to push the RAM to 1700 MHz through Trixx, but the card can't handle it and the video output shuts down...


----------



## dspacek

Quote:


> Originally Posted by *geggeg*
> 
> That's the one I am waiting for on the AMD side, I hope someone gets more details on the card soon.


This is a monster.
I vote for a water block. I've been on water for almost a year and that's what I recommend to everyone. No dust, no screaming fans, no problems (till now).


----------



## Hogesyx

Quote:


> Originally Posted by *BradleyW*
> 
> Do all 290x's whine bad on water?


I think it whines the same regardless of water or air; it's just much more obvious with no fan noise to cover it. It may also be that we overclock more and feed it more voltage.

Some people have reported that the whine goes away after a while; mine is still pretty obvious if I open up the case.

I can't hear the one in my HTPC though, probably because it's packed into a Fractal Node with 4 HDDs and 5 fans.


----------



## BradleyW

Quote:


> Originally Posted by *Hogesyx*
> 
> I think it whine the same regardless of water or air, just that it is much more obvious since there is no more fan sound. Also maybe the fact that we overclock more adding more voltage to it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Some people reported that the whine goes away after awhile, mine is still pretty obvious if I open up the case.
> 
> I cant hear the one in my HTPC though, probably due to the fact that it is pack in a fractal node with 4hdd and 5 fans.


Well, if they all whine, I might have just spent a lot of money on crap; I should have gotten 780s, which don't tend to whine. Even my new monitor is whining.


----------



## rdr09

Quote:


> Originally Posted by *BradleyW*
> 
> Well if they al whine, I might have just spent a lot of money on crap and should have gotten 780's which don't tend to whine. Even my new monitor is whining.


My 290 does not whine. It does in Fire Strike, but in games... nah. Air or water.


----------



## Sgt Bilko

Quote:


> Originally Posted by *BradleyW*
> 
> Well if they al whine, I might have just spent a lot of money on crap and should have gotten 780's which don't tend to whine. Even my new monitor is whining.


Mine whined for about a week when I first got it, but never after that.

I also think the quality of the power delivered to the cards might play a small part in coil whine as well;
someone else might have more to say on that, though.

And you didn't spend money on "crap"; you spent money on GPUs which have coil whine in some cases and not in others.


----------



## TheSoldiet

Change me to Water/Red mod please









I know, i know... Sloppy handwriting


----------



## BradleyW

Quote:


> Originally Posted by *rdr09*
> 
> my 290 does not whine. it does in Firestrike but games . . . nah. air or water.


Would nail polish work on these cards?


----------



## rdr09

Quote:


> Originally Posted by *BradleyW*
> 
> Would nail polish on these cards?


One of my 7950s whined (more like a knocking sound) after I put the waterblock on. My guess was that a capacitor was touching the wall(s) of the block, causing it to make that sound. I must have run the Valley bench 20 times, and it went away.

I heard glue (non-permanent) might help, but you have to be able to pinpoint exactly which coil (or capacitor) is doing it, unless you apply it to all of them.


----------



## Jack Mac

I'm so glad that my 290 doesn't whine.


----------



## TheSoldiet

Quote:


> Originally Posted by *Jack Mac*
> 
> I'm so glad that my 290 doesn't whine.


My 290X doesn't whine either.


----------



## ZealotKi11er

I think mine did, but that was because I was using an old power supply. Now it does not whine at all.


----------



## jrcbandit

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I think mine did but because i was using a old power supply. Not it does not whine at all.


I'm experiencing that too. I'm temporarily using an older, crappy 750 W power supply and getting coil whine from the video card. With my high-efficiency 1000 W unit, however, I never noticed any coil whine above the fan noise. I have water cooling, so the fan noise is nowhere near the stock cooler's.


----------



## alawadhi3000

I can't monitor VRM temps on my Sapphire R9 290, how come?


----------



## Widde

Just want confirmation, but I suspect my PSU is hurting my GPU OC; my 12 V rail is around 11.26-11.32 V under load and 11.63 V idle :S I have a Seasonic 1000 W Platinum on the way.







(Going 3 R9 290s in the near future)


----------



## Bartouille

Quote:


> Originally Posted by *alawadhi3000*
> 
> I can't monitor VRM temps on my Sapphire R9 290, how come?


Looks like they've already started taking features away from the card; yours is probably not a true reference design. It's not the first time I've heard someone say he can't monitor his VRM temps. Or maybe you're doing something wrong? GPU-Z and HWiNFO should be able to read VRM temps no problem.


----------



## BradleyW

Would nail polish work despite the caps being covered?


----------



## the9quad

Quote:


> Originally Posted by *Widde*
> 
> Just want a confirmation but I suspect my PSU hurting my gpu oc, my 12V is around 11.26-11.32V under load and 11.63V idle :S Have a Seasonic 1000W Platinum on the way
> 
> 
> 
> 
> 
> 
> 
> (Going 3 R9 290s in the near future)


1000 watts is going to be cutting it razor thin with 3 cards. You're going to be at 100% capacity, if not more.
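As a back-of-envelope check (the ~275 W per overclocked card and 150 W for the rest of the system are rough assumptions, not measured values):

```python
# Estimate total DC load for a multi-card rig against the PSU rating.
def estimated_load_w(n_cards, card_w=275, system_w=150):
    return n_cards * card_w + system_w

print(estimated_load_w(3))  # 975 W -- essentially zero headroom on a 1000 W unit
print(estimated_load_w(2))  # 700 W -- a comfortable fit
```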


----------



## Arizonian

Quote:


> Originally Posted by *TheSoldiet*
> 
> Change me to Water/Red mod please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> I know, i know... Sloppy handwriting


Congrats - updated


----------



## Widde

Quote:


> Originally Posted by *the9quad*
> 
> 1000 watts is going to be cutting it razor thin close with 3 cards. Your going to be at 100% capacity if not more with 3 cards.


Okay ^^ Might settle for 2 cards then ^^ Should be more than enough for gaming.


----------



## flamin9_t00l

Quote:


> Originally Posted by *Roy360*
> 
> so that's Gigabyte and PowerColor so far, any others?
> 
> and for those who did have stickers, how did you remove them?


No stickers on either of my cards (Sapphire and HIS). Didn't even have the piece of paper under the stock cooler on the HIS model like some have reported.

As for coil whine I had a little when I first got them but only at stupidly high fps 500/1000+ and now they're watercooled I only notice a slight hiss when running Ice storm/Cloud gate (powered by Enermax Revolution 85+ 1050W - 6x 12v rails, 30A each (both cards have their own rail)).


----------



## alawadhi3000

Quote:


> Originally Posted by *Bartouille*
> 
> Looks like they already started taking away features from the card, your card is probably not TRUE reference. Not the first time I hear someone say he can't monitor his vrm temps. Or maybe you are doing something wrong? GPU-Z and HWiNFO should be able to read vrm temps no problem.


GPU-Z doesn't read the VRM temps, but HWiNFO does. Thanks and REP+.


----------



## flamin9_t00l

Quote:


> Originally Posted by *Widde*
> 
> Just want confirmation, but I suspect my PSU is hurting my GPU OC; my 12V is around 11.26-11.32V under load and 11.63V idle :S Have a Seasonic 1000W Platinum on the way
> 
> 
> 
> 
> 
> 
> 
> (Going 3 R9 290s in the near future)


What are you using to monitor your +12V? I've just noticed that my +12V rails are also reporting 11.6v idle using HWMonitor, which I think is incorrect. Try using the latest HWiNFO64







... idle I'm at 12.1v and average 12.0v.
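For reference, the ATX12V spec allows ±5% on the +12V rail (11.40-12.60 V), so any software or multimeter reading can be sanity-checked against that window. A minimal sketch (illustrative Python; the sample readings are made up):

```python
# ATX12V allows +/-5% on the +12V rail: 11.40 V to 12.60 V.
def within_atx_12v_spec(volts, nominal=12.0, tolerance=0.05):
    """Return True if a +12V reading falls inside the ATX +/-5% window."""
    return nominal * (1 - tolerance) <= volts <= nominal * (1 + tolerance)

# Hypothetical readings - software sensors can be a few percent off,
# so confirm borderline values with a multimeter.
print(within_atx_12v_spec(12.1))   # -> True  (healthy)
print(within_atx_12v_spec(11.26))  # -> False (out of spec, or a bad sensor)
```

Keep in mind software readings come from the motherboard's own sensor chip, which can itself be off; an out-of-spec number is a reason to check with a multimeter, not proof the PSU is failing.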


----------



## Tabzilla

Very nice! Do the blowers keep the VRMS cool, and are you using a custom heatsink on them?


----------



## its-vp

Quote:


> Originally Posted by *Prexxus*
> 
> Hey guys my first post here
> 
> 
> 
> 
> 
> 
> 
> Just got my R9 290 Sapphire last week and man was I impressed! That is, until the black screens started. I've scoured the net for the past week or so looking into this and it seems to be a pretty widespread problem.
> 
> What I can't find is any official comment on it or any real fixes. Has AMD said anything? Is there any real way of fixing this? Does anyone know what's causing this to happen?
> 
> I checked the memory type of my GPU and saw that it was Elpida and not Hynix. From what I've read up to now, almost all if not all the people with this problem have Elpida memory on their boards.
> 
> My question really is what should I do? Is there anything I can do myself or should I just send it back and hope to get one with Hynix memory?
> 
> I'm exhausted trying to fix this. I can't figure it out myself and I would love any advice/help possible at this point.
> 
> Thank you so much in advance.
> 
> My system specs if it matters...
> 
> i5 2500k @ 4.2
> Radeon R9 290 Sapphire
> Sabertooth P67
> Rocketfish 700wPS2
> 8 gigs ram Gskill @ 1600
> Windows 7 64 bit
> 
> Using the latest AMD drivers.
> 
> Black screens when playing certain games at random times, sometimes I can get hours of play in, sometimes not even 5 minutes. Other games like DOTA2 and BF4 seem to work flawlessly every time. Which is quite strange to me but it might just be random luck.


Prexxus, check the temperature of your card.
I don't let mine pass 85C (I use MSI Afterburner and set FAN SPEED % to "User Defined" so it can increase the fan speed as the temperature rises... if you don't, the default ATI Catalyst driver will let your card reach 90/95C and start to decrease the clockspeed to maintain the temperature...).

I used to get blackscreens before I started keeping an eye on the temperature. Note that today I'm running my card (an unlocked R9 290 modded to R9 290X with a BIOS change, with the standard AMD cooler - it is an XFX) at 1080MHz core and 1250MHz memory... sometimes my fan speed reaches 60% - I think the maximum fan speed for the default Catalyst driver is 50%.)

To sum up, try increasing the fan speed and see what happens...









Plz reply if it helped you! (Hope it will...)


----------



## bloodkil93

Just an update, sent my R9 290 back for a refund (due to blackscreens) and got an R9 290x instead.

Old Card: Sapphire R9 290
New Card : Sapphire R9 290X BF4 Edition

Also to add confirmation, the new Sapphire R9 290X card came with Elpida memory, but works absolutely fine, runs happily at 1110/1325 (would go higher, but haven't had enough time).


----------



## its-vp

Quote:


> Originally Posted by *00Smurf*
> 
> Expanded to two rig's and 5 290x's. One in a haf xb evo with 2 290x's. The other three in a carbide 540 case running tri-fire when not mining.
> 
> One card runs a H90, the other 4 use h55's. Can't beat 54C mining temps.


00Smurf, IMPRESSIVE SYSTEM! Congratulations! Hope you have a good surge protector / UPS...









One question: did you use the standard H90 / H55 mounting system, or did you create some kind of adapter to screw them to the PCB?

Thx for the encouraging post!


----------



## cplifj

wonderful thread, this must be the LEET page


----------



## Korayyy

Quick question for you guys.. I just put both of my 290s under water with EK acetal blocks. Now I hear it isn't out of the ordinary for VRM1 to be hotter than VRM2 due to the core being more demanding. But, it kind of caught me off guard when I noticed that VRM1 is 18C hotter than 2. VRM2 is at 48C while 1 is at 66C... But, it is the exact same on both of the cards according to GPU-Z. If this is normal please let me know because I'm thinking about pulling the blocks off of the cards and checking them out. Thanks.


----------



## robert0507

Quote:


> Originally Posted by *Korayyy*
> 
> Quick question for you guys.. I just put both of my 290s under water with EK acetal blocks. Now I hear it isn't out of the ordinary for VRM1 to be hotter than VRM2 due to the core being more demanding. But, it kind of caught me off guard when I noticed that VRM1 is 18C hotter than 2. VRM2 is at 48C while 1 is at 66C... But, it is the exact same on both of the cards according to GPU-Z. If this is normal please let me know because I'm thinking about pulling the blocks off of the cards and checking them out. Thanks.


Those VRM temps seem high. Is that under full load or just idling? I have the Kollanve block and my VRM temps idle at 26 and under full load hit around 55 for VRM1 and 38 for VRM2 (2 hours of BF4)


----------



## robert0507

Quote:


> Originally Posted by *robert0507*
> 
> Those VRM temps seem high. Is that under full load or just idling? I have the Kollanve block and my VRM temps idle at 26 and under full load hit around 55 for VRM1 and 38 for VRM2 (2 hours of BF4)


(Koolance)


----------



## Korayyy

Quote:


> Originally Posted by *robert0507*
> 
> Those VRM temps seem high. Is that under full load or just idling? I have the Kollanve block and my VRM temps idle at 26 and under full load hit around 55 for VRM1 and 38 for VRM2 (2 hours of BF4)


It is while they are mining. At idle they both sit at 31C on both cards. Wonder if mining is stressing the VRM1 more than 2?


----------



## smonkie

Could anyone with a TRI-X 290x upload its Bios? As it appears, flashing it is the solution to get rid of coil whine with Accelero + PWM.


----------



## fh12volvo

1. www.techpowerup.com/gpuz/5zwpp/ - fh12volvo
2. Sapphire R9 290
3. Cooling: Stock
Thanks ! thumb.gif


----------



## mojobear

Quote:


> Originally Posted by *the9quad*
> 
> 1000 watts is going to be cutting it razor thin with 3 cards. You're going to be at 100% capacity, if not more.


What's interesting is I'm running 3 x R9 290s, and with watercooling mine are hitting loads of about 900-950W gaming with a mild overclock of +70mV @ 1150/1350. Haven't checked with Furmark... won't dare and don't see the need. Games like Crysis 3, which in my past experience pull the most wattage, hit the 900-950W range. You can see the rest of my rig in the sig with an OCed i7 4770K.

When I was running stock (3 x R9 290s) in Crysis 3 I was hitting about 850-900W, usually more in the 850W range - I really feel that the watercooling decreases leakage quite a bit... I think some websites have shown this using the NZXT G10 aftermarket cooler and waterblocks.


----------



## Arizonian

Quote:


> Originally Posted by *fh12volvo*
> 
> 1. www.techpowerup.com/gpuz/5zwpp/ - fh12volvo
> 2. Sapphire R9 290
> 3. Cooling: Stock
> Thanks ! thumb.gif


Congrats - added


----------



## BradleyW

Is it safe to set the GPU Voltage to +100?
I'm trying to change the power draw to manipulate the speed of the coils.


----------



## Forceman

Quote:


> Originally Posted by *Korayyy*
> 
> Quick question for you guys.. I just put both of my 290s under water with EK acetal blocks. Now I hear it isn't out of the ordinary for VRM1 to be hotter than VRM2 due to the core being more demanding. But, it kind of caught me off guard when I noticed that VRM1 is 18C hotter than 2. VRM2 is at 48C while 1 is at 66C... But, it is the exact same on both of the cards according to GPU-Z. If this is normal please let me know because I'm thinking about pulling the blocks off of the cards and checking them out. Thanks.


VRM1 is core and VRM2 is memory, so it's normal for there to be a temp split like that. Your temps sound about right for a mining load.


----------



## Jack Mac

Quote:


> Originally Posted by *BradleyW*
> 
> Is it safe to set the GPU Voltage to +100?
> I'm trying to change the power draw to manipulate the speed of the coils.


If you can cool them, most definitely. Personally, I don't run more than +65 24/7 but that's just me. 100 should be safe.


----------



## Sazz

Quote:


> Originally Posted by *Arizonian*
> 
> I can't find your submission post. Did you submit one and I missed it? See OP for what's needed for submission, I'd love to add you to the roster.
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added


I am already in it. xD


----------



## bloodkil93

Quote:


> Originally Posted by *bloodkil93*
> 
> Just an update, sent my R9 290 back for a refund (due to blackscreens) and got an R9 290x instead.
> 
> Old Card: Sapphire R9 290
> New Card : Sapphire R9 290X BF4 Edition
> 
> Also to add confirmation, the new Sapphire R9 290X card came with Elpida memory, but works absolutely fine, runs happily at 1110/1325 (would go higher, but haven't had enough time).


I think you missed me Arizonian, switched from a 290 to a 290X because of RMA'ing and blackscreens


----------



## Widde

Quote:


> Originally Posted by *flamin9_t00l*
> 
> What are you using to monitor you +12V because I've just noticed that my +12V rails are also reporting 11.6v idle using HWMonitor which I think is incorrect. Try using the latest HWiNFO64
> 
> 
> 
> 
> 
> 
> 
> ... idle I'm at 12.1v and average 12.0v.


Multimeter


----------



## Bartouille

First off, sorry for the extremely bad quality picture; it is very hard to see my user name.









Good news is that I bought a R9 290 and received a R9 290X.







And BF4 too, 480$ cdn total haha, I'm so freaking happy right now.







Man these reference cards are so heavy!!

1. the pic
2. AMD R9 290X (actually it's a Diamond but even in GPU-Z it just says ATI in subvendor, so I assume this is a card directly from AMD)
3. Stock


----------



## headbass

how do I get the voltage to more than +100 mV?

I am using MSI Afterburner 3.0.0 beta 18 and no matter what setting I change (extend official overclocking limit, disable ULPS, unofficial overclocking mode), AB won't let me go any higher than +100mV and 1235 core

I got two Sapphire 290 BF4 editions and need to pick the one I keep and install the Accelero on (and install the cooler within the 14-day return period in case anything goes wrong ;o)

both are locked Elpida

one has 72.4 ASIC and one has 71.5 ASIC

when trying to find the max OC on stock voltage I noticed the 71.5 gets up to 1.19v where the 72.4 only gets to 1.15, both at the same 1100MHz, +50 power limit, same game and everything

I need to pick the better one and dunno if it's better to have higher stock volts or a tiny bit higher ASIC... since I'll be using the Accelero I will probably be able to cool a higher voltage than what it's currently letting me

just testing the 71.5 one and anything over 1150 core will start to artefact slightly in games; Valley seemed OK... I am keeping the fan turned up to keep the core from dropping down


----------



## Hattifnatten

Quote:


> Originally Posted by *Anandtech forums*
> ROUTE to my MSI Afterburner Directory:
> CD C: \Program Files (x86)\SYSTEM TOOLS\AMD Radeon R9 290X\MSI Afterburner
> 
> MSIAfterburner.exe /wi4,30,8d,20 ........... +200mv offset This gets HOT
> MSIAfterburner.exe /wi4,30,8d,19 ........... +156mv offset This seems to work BEST
> MSIAfterburner.exe /wi4,30,8d,18 ........... +150mv offset
> MSIAfterburner.exe /wi4,30,8d,17 ........... +144mv offset
> MSIAfterburner.exe /wi4,30,8d,16 ........... +138mv offset
> MSIAfterburner.exe /wi4,30,8d,14 ........... +125mv offset
> MSIAfterburner.exe /wi4,30,8d,12 ........... +113mv offset
> MSIAfterburner.exe /wi4,30,8d,10 ........... +100mv offset AB's Optimum Setting
> MSIAfterburner.exe /wi4,30,8d,0 .............. Return to 0


----------



## BradleyW

Quote:


> Originally Posted by *Jack Mac*
> 
> If you can cool them, most definitely. Personally, I don't run more than +65 24/7 but that's just me. 100 should be safe.


And is it possible to under volt the card?
Also, does nail polish work on these cards despite the caps being closed?


----------



## Hattifnatten

By the way, I just discovered something that will be of interest to anyone who mines. These cards underclock like champs! With lowered clocks (I'm currently trying out 870MHz), I'm able to run it fully stable at 0.975v average. It doesn't even touch 80c on the stock fan curve! And since mining isn't affected by core clock all that much... it's a win-win.

Edit:
Just noticed that the fan was stuck on 44%. That explains a lot. Still, I can barely hear it above my pump.
And another thing, at 800mhz the vcore is down to 0.949


----------



## Arizonian

Quote:


> Originally Posted by *Bartouille*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> First off sorry for extremely bad quality picture, it is very hard to see my user name.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Good news is that I bought a R9 290 and received a R9 290X.
> 
> 
> 
> 
> 
> 
> 
> And BF4 too, 480$ cdn total haha, I'm so freaking happy right now.
> 
> 
> 
> 
> 
> 
> 
> Man these reference cards are so heavy!!
> 
> 1. the pic
> 2. AMD R9 290X (actually it's a Diamond but even in GPU-Z it just says ATI in subvendor, so I assume this is a card directly from AMD)
> 3. Stock


Congrats - added


----------



## ImJJames

Quote:


> Originally Posted by *Hattifnatten*
> 
> By the way, I just discovered something that will be of interest of anyone who mines. These cards underclock like a champ! With lowered clocks(I'm currently trying out 870mhz), I'm able to run it fully stable at 0.975v average. It doesen't even touch 80c on the stock fan curve! And since mining isn't affected by core clock all that much...It's a win-win.
> 
> Edit:
> Just noticed that the fan was stuck on 44%. That explains a lot. Still, I can barely hear it above my pump.
> And another thing, at 800mhz the vcore is down to 0.949


What kind of hash rate are you getting with that setting?


----------



## Hattifnatten

Haven't tested yet, but I guess it's still around the 800 mark.
Edit: ~750 @ 850MHz. Will try 890.
Edit 2: 890 brings me up to about 780. But mining is definitely more taxing than FSX. I'll have to increase the fans in order to keep it from throttling. Which is kind of sad, I really thought I had something going here; the Firestrike results were so promising. 78c, lower fan speed etc.


----------



## headbass

Quote:


> Originally Posted by *Hattifnatten*


thx but I can't get it to work

if I create a shortcut to afterburner and change the target to
"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" /wi4,30,8d,19
it won't start

if I create a bat file and put in it
C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe /wi4,30,8d,19
it won't start either


----------



## Hattifnatten

You have to use command line


----------



## Jack Mac

Quote:


> Originally Posted by *BradleyW*
> 
> And is it possible to under volt the card?
> Also, does nail polish work on these cards despite the caps being closed?


Yeah, undervolting is possible, I do it to decrease idle temperatures, not sure about the cap thing.


----------



## Bartouille

I checked unlock voltage control in MSI Afterburner (3.0.0 Beta 16) and the slider is still grayed out? How come?


----------



## INCREDIBLEHULK

I posted this question in another thread and I am curious to ask you guys.
Please share your experiences or thoughts.

Let's say I buy 10 of these for mining. Expensive card, definitely not a card that runs "COOL" under load. I would assume overheating damage or blown VRMs or anything like that is user error, correct? So what does that leave for the manufacturer's warranty to cover? Random errors on the card?

I see some videos where people have an insane amount of these cards with a fan blowing over them, and I know a card like this running 70-80C GPU (imagine those VRMs!!!!) is not the best thing.

Really curious about the experiences you have had with some manufacturers and the warranty of their product. I know it is much easier to deny an RMA, but what is really left that is not "user error"?


----------



## aznever

Quote:


> Originally Posted by *Bartouille*
> 
> I checked unlock voltage control in MSI Afterburner (3.0.0 Beta 16) and the slider is still grayed out? How come?


You need Afterburner Beta 17 or Beta 18.


----------



## headbass

Quote:


> Originally Posted by *Hattifnatten*
> 
> You have to use command line


cool thx, took me a while to figure out that it actually just changes the cfg and then I have to start the afterburner and it will start with the desired voltage increase

you can actually use a bat file if you put this in it

CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /wi4,30,8d,14
MSIAfterburner.exe

hmmm even with just +125mV, according to GPU-Z the VDDC got to 1.313 max and it will still artefact in game (not much, only once in a few mins) with just 1175 core, memory 1250

since I don't care about benches and just wanna get a stable OC for games, I am not that happy

was kinda hoping for 1170-1200 on 1.25-1.30
ASIC is 71.5 and it's locked with Elpida


----------



## Aggronor

Any idea.. or video or something about Mantle performance on BF4 ???... , how will be this API ? maybe a Catalyst update ?? BF4 Patch?...


----------



## Aggronor

Mantle is Ready guys!!!
Check this out!!!






45% better.... on BATTLEFIELD 4...

GG Nvidia


----------



## the9quad

Quote:


> Originally Posted by *Aggronor*
> 
> Mantle is Ready guys!!!
> Check this out!!!
> 
> 
> 
> 
> 
> 
> 45% better.... on BATTLEFIELD 4...
> 
> GG Nvidia


Just saying, I wasn't impressed. I saw 108 fps drop to 58 fps.... wonder what hardware they were using.


----------



## Remij

Quote:


> Originally Posted by *the9quad*
> 
> Just saying, I wasn't impressed. I saw 108 fps drop to 58 fps.... wonder what hardware they were using.


The fact that that video looked like crap, and that we don't know any specifics means I'm not exactly impressed.

For the WORLD PREMIERE BF4 Mantle showcase, you'd think they'd take more than 20 seconds to show and explain why it's better. That vid showed no action, had drops to 50-70fps for no apparent reason, and then they tell you it was 45% faster.. just take our word for it..

For something that should have been out by now, the lack of a thorough in-depth analysis of what to expect is disappointing.


----------



## the9quad

Quote:


> Originally Posted by *Remij*
> 
> The fact that that video looked like crap, and that we don't know any specifics means I'm not exactly impressed.
> 
> For the WORLD PREMIERE BF4 Mantle showcase, you'd think they'd take more than 20 seconds to show and explain why it's better. That vid showed no action, had drops to 50-70fps for no apparent reason, and then they tell you it was 45% faster.. just take our word for it..
> 
> For something that should have been out by now, the lack of a thorough in-depth analysis of what to expect is disappointing.


I am not a hater by any means, but I do agree with ya. Why do companies not roll stuff out better than this? It should have been more in-depth, like you said. At the very least you'd think they would have picked a smooth part to play so they didn't have the frame drops.

On the other hand, if it was a single card running at 1440p or greater, then yeah, that would be kind of impressive as it stayed locked >100fps most of the time, which is better than any other card by far. But since we don't know the res or the PC specs, it's a useless video.


----------



## BradleyW

Are any of you 290x owners getting strange CFX scaling if you don't put the power draw limit up to +50%?
Benchmarks seem fine but games show strange scaling (usage mismatch and poor fps scaling at random) in games. However, the experience is still smooth. (ULPS disabled)
Is my PSU enough for the sig rig?


----------



## ZealotKi11er

Quote:


> Originally Posted by *BradleyW*
> 
> Are any of you 290x owners getting strange CFX scaling if you don't put the power draw limit up to +50%?
> Benchmarks seem fine but games show strange scaling (usage mismatch and poor fps scaling at random) in games. However, the experience is still smooth. (ULPS disabled)
> Is my PSU enough for the sig rig?


GPU usage readings outside mining are really off. I would not bother with it, since even MSI hasn't got it right how the R9 290X operates.


----------



## pounced

Quote:


> Originally Posted by *Aggronor*
> 
> Mantle is Ready guys!!!
> Check this out!!!
> 
> 
> 
> 
> 
> 
> 45% better.... on BATTLEFIELD 4...
> 
> GG Nvidia


As long as I can play On ultra with 4xsmaa and HBAO without ever going below 60 fps I will be happy. So many times with my 290x I will dip to like 55-59 and I can notice it.


----------



## BradleyW

Quote:


> Originally Posted by *ZealotKi11er*
> 
> GPU usage readings outside mining are really off. I would not bother with it, since even MSI hasn't got it right how the R9 290X operates.


Sorry, do you mean the cards are off right now or MSI?
I was in the Heaven benchmark maxed out, using the free camera. I parked at the stone houses and the smoke chimneys. All was still (other than the smoke). Here was my usage:


Can you see those small dips? See how the usage does not match?


----------



## ZealotKi11er

Quote:


> Originally Posted by *BradleyW*
> 
> Sorry, do you mean the cards are off right now or MSI?
> I was in Heaven benchmark max out using free camera. I parked up at the stone houses and the smoke chimneys. All was still (Other than the smoke). Here was my usage:
> 
> 
> Can you see those small dips? See how the usage does not match?


Added GPU usage averaging algorithm for Overdrive 6 capable AMD GPUs. Now displayed GPU usage is being averaged by sliding window to smooth GPU usage artifacts occurring due to bug in AMD ADL API on AMD Sea Islands GPU family

That's directly from the MSI AB log.
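For what it's worth, "sliding window" averaging just means each displayed sample is the mean of the last N raw samples, which damps the bogus single-sample spikes the ADL bug produces. A toy illustration (plain Python, nothing to do with MSI's actual code):

```python
from collections import deque

def sliding_average(samples, window=5):
    """Smooth a stream of usage samples with a fixed-size window mean."""
    buf = deque(maxlen=window)  # oldest sample falls out automatically
    smoothed = []
    for s in samples:
        buf.append(s)
        smoothed.append(sum(buf) / len(buf))
    return smoothed

# A bogus 0% sample (the ADL artifact) gets damped instead of shown at full depth:
raw = [99, 98, 0, 97, 99, 98]
print(sliding_average(raw, window=3))
```

If that bug is in play, small dips in the AB graph can be the smoothed remnant of spurious samples rather than real drops in load.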


----------



## Imprezzion

I got a problem with my 290 unlocked to 290X...

I overclocked it with MSI AB at first using +100mV and it works just fine at 1150Mhz core and 1500Mhz VRAM but it gives artifacts at 1160Mhz.

Now, I know ASUS GPU Tweak allows for 1.412v as I used the ASUS BIOS.
However, when I up the voltage to anything above, say, 1.30v, the card randomly flickers black screens or my screen just goes to stand-by altogether. It's like it's constantly losing signal at any voltage higher than the +100mV MSI AB can do...

Temps are not an issue. Core is cooled by a Accelero Hybrid and never goes over 65c on low rad fans.
Also, VRM's are cooled with a cut up baseplate of the stock cooler and VRM's never go over 60c.


----------



## Newbie2009

Quote:


> Originally Posted by *Imprezzion*
> 
> I got a problem with my 290 unlocked to 290X...
> 
> I overclocked it with MSI AB at first using +100mV and it works just fine at 1150Mhz core and 1500Mhz VRAM but it gives artifacts at 1160Mhz.
> 
> Now, I know ASUS GPU Tweak allows for 1.412v as I used the ASUS BIOS.
> However, when I up the voltage to anything above, say, 1.30v, the card randomly flickers black screens or my screen just goes to stand-by altogether. It's like it's constantly losing signal at any voltage higher than the +100mV MSI AB can do...
> 
> Temps are not an issue. Core is cooled by a Accelero Hybrid and never goes over 65c on low rad fans.
> Also, VRM's are cooled with a cut up baseplate of the stock cooler and VRM's never go over 60c.


I have the same issue. If you lower the power limit you will be able to put more volts in and reach higher clocks; however, you will obviously hit the wall earlier. TSM mentioned that reaching the limit of the PWM MOSFETs is a possible cause. However, actual 290Xs seem to have a higher limit than unlocked 290s.


----------



## CriticalHit

I also get the flickering with added voltage or increased power limit ... I can't do +30 power limit nor more than +80mv ...
Luck of the draw


----------



## Sazz

Quote:


> Originally Posted by *Imprezzion*
> 
> I got a problem with my 290 unlocked to 290X...
> 
> I overclocked it with MSI AB at first using +100mV and it works just fine at 1150Mhz core and 1500Mhz VRAM but it gives artifacts at 1160Mhz.
> 
> Now, I know ASUS GPU Tweak allows for 1.412v as I used the ASUS BIOS.
> However, when I up the voltage to anything above, say, 1.30v, the card randomly flickers black screens or my screen just goes to stand-by altogether. It's like it's constantly losing signal at any voltage higher than the +100mV MSI AB can do...
> 
> Temps are not an issue. Core is cooled by a Accelero Hybrid and never goes over 65c on low rad fans.
> Also, VRM's are cooled with a cut up baseplate of the stock cooler and VRM's never go over 60c.


You probably are not even stable at 1150MHz; have you run OCCT to confirm you're stable at 1150?

At +50mV I am stable at 1150MHz, but at 1160 I am not stable (OCCT), though I don't get any artifacts at all.

Anyway, you can't really complain about it; it's an "unlock", not a full-blown 290X to begin with. You are lucky enough to unlock it, and at 1150 it's pretty darn good already. (Those are my set daily clocks, 1150/1400.)


----------



## Imprezzion

I know







I am happy with it but I like to maximize performance of my hardware and a little modding or extra volts doesn't bother me








That's why I run my 3770K delidded @ 4.95GHz, as that's the max I can get without running hot. It hits about 60c in daily usage and 75c in full LinX AVX / Prime95 AVX stress (adaptive PWM fan and pump speed, which only ramps up from 40% above 65c).

I'm just a bit bored as I always clock and tweak everything and my CPU and RAM both run at the max of their potential. The only thing I haven't tweaked to the max is the GPU haha.

And stable in OCCT? I don't really care all that much about it. If the games I play work without crashes, FPS drops or artifacts, it's stable.
Since both BF4 and Far Cry 3, the most taxing games out there in terms of OC stability, run for hours and hours without a single artifact, it's fine by me lol.

Point is, with minor artifacting (black and white square patterns popping up once every ~minute or so) I can run as high as 1200MHz, but I wanna get rid of the artifacts, and I feel that with a tad more voltage on the beast I can probably get away with 1200-1225MHz core clocks.

I'm planning to keep the 290X and see what Mantle brings, as BF4 is basically all I play besides ArmA II DayZ Epoch / DayZ Standalone.
If Mantle disappoints in terms of performance or stability I'm out and going back to a GTX 780 / GTX 780 Ti. I kinda miss my 780


----------



## tijgert

For the overvolters;
AB only allows 100mV unless you go command line, GPU Tweak (when flashed with an Asus BIOS, 3518 being the latest) adds up to 162mV, Trixx maxes at 200mV.


----------



## dspacek

Quote:


> Originally Posted by *Hattifnatten*


Can you explain more exactly where to put which file to get this overvoltage?
Is the same possible to do with clocks? I'm stuck at 1625MHz stable on RAM but can't push more.
It's possible only with Trixx, but AB and Trixx cause problems when run together.


----------



## King4x4

Can I join now?


----------



## battleaxe

Quote:


> Originally Posted by *King4x4*
> 
> Can I join now?


Jealousy..... jealousy....

Nice man.


----------



## dspacek

So is there any BIOS utility to build your own BIOS by modding the original one? Are the AB clock limits locked in the program, or are they set by the BIOS?
So how do I reach, for example, 1300MHz core and 1800 RAM? Will I need a modded BIOS or some commands for AB?
I'm still stable at 1625MHz RAM and want to try more...
thanks for advice
I found this for voltage unlocking, so is there something like this for unlocking clocks?
http://www.overclock.net/t/861293/release-msi-afterburner-max-voltage-unlocker


----------



## BradleyW

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Added GPU usage averaging algorithm for Overdrive 6 capable AMD GPUs. Now displayed GPU usage is being averaged by sliding window to smooth GPU usage artifacts occurring due to bug in AMD ADL API on AMD Sea Islands GPU family
> 
> Thats directly from MSI AB log.


So due to the architecture, MSI can only show an estimated GPU usage, thus causing the bumps? So there is actually nothing wrong with my usage? Or have I misunderstood?
Thank you.


----------



## INCREDIBLEHULK

Any thoughts on my post?
http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/13380#post_21529346


----------



## Bartouille

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> I posted this question in another thread and I am curious to ask you guys.
> Please share your experiences or thoughts.
> 
> Let's say I buy 10 of these for mining. Expensive card, definitely not a card that runs "COOL" under load. I would assume overheating damage or blown VRMs or anything like that is user error, correct? So what does that leave for the manufacturer's warranty to cover? Random errors on the card?
> 
> I see some videos where people have an insane amount of these cards with a fan blowing over them, and I know a card like this running 70-80C GPU (imagine those VRMs!!!!) is not the best thing.
> 
> Really curious about the experiences you have had with some manufacturers and the warranty of their product. I know it is much easier to deny an RMA, but what is really left that is not "user error"?


With pci-e risers and decent spacing between the cards you won't overheat. Reference cooler is great for multiple cards and don't worry about the vrms, usually they run cooler than the core itself. 70-80c for r9 290(x) is cool.


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *Bartouille*
> 
> With pci-e risers and decent spacing between the cards you won't overheat. Reference cooler is great for multiple cards and don't worry about the vrms, usually they run cooler than the core itself. 70-80c for r9 290(x) is cool.


Thanks Bartouille, this person also left a great post in another thread

http://www.overclock.net/t/1398250/official-tutorial-how-to-start-mining-litecoins/2140#post_21529538

Made me wonder which companies are easier to deal with than others. Physical damage is one thing, but I know some companies might claim that it's user error to run a card for 5 days 24/7 and that it caused damage to the card.
http://www.overclock.net/t/1398250/official-tutorial-how-to-start-mining-litecoins/2140

This is the other thread I posted in. Just trying to get some thoughts out and have people share stories about past experiences with companies and whatnot. I get worried that when not buying from Newegg or a site like that, and picking up a "new" card from eBay or Amazon, these manufacturers treat the customer differently.


----------



## Bartouille

Does removing the shroud improve performance?


----------



## Jack Mac

Speaking of shrouds, is it possible to put a reference 7970 or 6970 shroud on this card? I prefer the looks of those shrouds over the 290 shroud.


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *Jack Mac*
> 
> Speaking of shrouds, is it possible to put a reference 7970 or 6970 shroud on this card? I prefer the looks of those shrouds over the 290 shroud.


Just make sure you don't have a meh manufacturer.
The last thing you want is a warranty denied because you "physically altered" the card.


----------



## MeneerVent

I just bought Battlefield 4 and tried to play a match. Then I got a whitescreen in the Battlefield 4 window. Anyone know what the problem is? I suspect it's the drivers, so which one is confirmed stable? I am using the latest beta driver btw. Also, I just tried starting up Tomb Raider, but all I got was a black screen. Battlefield 3 works fine though.


----------



## bond32

Quote:


> Originally Posted by *Jack Mac*
> 
> Speaking of shrouds, is it possible to put a reference 7970 or 6970 shroud on this card? I prefer the looks of those shrouds over the 290 shroud.


I would think so... They are almost identical. Try it!


----------



## jerrolds

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Is your card overvolted?
> No, I'm at 1080p. 1440p is about 1.7x 1080p, so I want to see if this card can run the game at 170% scaling, because I guess in a way that'd give me BF4 performance at 1440p?


I play at 1440p, all "High" settings with AA turned off and HBAO on, and I hover around 80-120fps in multiplayer, usually 95-105fps. This is with an i7 [email protected] and [email protected]. ~60fps is doable at Ultra 1440p, if that's what you're looking for, I think.
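For anyone curious about the scaling figure being thrown around, the pixel math is easy to sanity-check yourself. A quick sketch comparing the total pixel counts of the two resolutions:

```python
# Compare total pixel counts of 1440p vs 1080p to sanity-check the
# "1440p is about 1.7x 1080p" estimate from the posts above.
def pixel_ratio(w1, h1, w2, h2):
    """Ratio of pixel counts between two resolutions."""
    return (w2 * h2) / (w1 * h1)

ratio = pixel_ratio(1920, 1080, 2560, 1440)
print(f"2560x1440 has {ratio:.2f}x the pixels of 1920x1080")  # ~1.78x
```

So ~170-178% resolution scaling at 1080p is roughly the same pixel load as native 1440p, GPU-wise at least.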


----------



## Alexbo1101

Arizonian could you update me to water cooling?


Spoiler: Water! :D


----------



## Jack Mac

Quote:


> Originally Posted by *bond32*
> 
> I would think so... They are almost identical. Try it!


I would love to, but I don't have any reference 7970/6970 shrouds lying around.


----------



## TheSoldiet

Quote:


> Originally Posted by *Jack Mac*
> 
> Speaking of shrouds, is it possible to put a reference 7970 or 6970 shroud on this card? I prefer the looks of those shrouds over the 290 shroud.


Not the 6970. The reference 6970 uses a different locking method on the shroud: clips instead of screws.


----------



## Jack Mac

Quote:


> Originally Posted by *TheSoldiet*
> 
> Not the 6970. The reference 6970 uses a different locking method on the shroud: clips instead of screws.


OK, so it'll just be the 7970 shroud that I'll look for to try this on. I love the sleek, glossy look of the 7970 shroud over the 290 shroud.


----------



## jerrolds

Just the shroud part and not the heatsink? I know the VRM placements are different on the 79xx and 290.


----------



## Jack Mac

Quote:


> Originally Posted by *jerrolds*
> 
> Just the shroud part and not the heatsink? I know VRM placements are different for 79xx and 290


Yeah, just the plastic shroud, not the heatsink itself.


----------



## headbass

Quote:


> Originally Posted by *dspacek*
> 
> Can you explain it more exactly where to put which file to get this overvoltage?
> Is that same possible to do with clocks? Im stucked at 1625 Mhz stable on RAM, but cant put more.
> Its possbile only with trixx, but AB and trixx are making problems when run together.


as I said already, those commands that *Hattifnatten* posted change the settings (stored in some ab cfg file I guess) that will be used the next time you start afterburner

you can make a bat file and put these 3 lines in it

Code:



Code:


CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /wi4,30,8d,14
MSIAfterburner.exe

the first line takes you to the afterburner location, so change it according to where it is for you (or I guess if you place the bat in the AB directory you don't need the first line)

the next line is the voltage offset that will be used the next time you start the afterburner (but will not start the afterburner itself)...replace the second line with any of these for your desired voltage:

Code:



Code:


MSIAfterburner.exe /wi4,30,8d,20 ........... +200mv offset This gets HOT
MSIAfterburner.exe /wi4,30,8d,19 ........... +156mv offset This seems to work BEST
MSIAfterburner.exe /wi4,30,8d,18 ........... +150mv offset
MSIAfterburner.exe /wi4,30,8d,17 ........... +144mv offset
MSIAfterburner.exe /wi4,30,8d,16 ........... +138mv offset
MSIAfterburner.exe /wi4,30,8d,14 ........... +125mv offset
MSIAfterburner.exe /wi4,30,8d,12 ........... +113mv offset
MSIAfterburner.exe /wi4,30,8d,10 ........... +100mv offset AB's Optimum Setting
MSIAfterburner.exe /wi4,30,8d,0 .............. Return to 0

the last line starts afterburner which will load with the voltage offset according to the command used in second line....it is sort of a hack and if you touch the voltage slider it will only let you go to +100mV again so set the voltage via the command and do not touch it after you start afterburner

hope it's clear now

edit: btw you don't have to stop at +200mV if you're brave enough and can cool it; just keep increasing the last number in the second line (carefully: increase by 1 and watch the max VDDC in GPU-Z, and keep an eye on the VRM temps)...just tried 21 out of curiosity and got +206mV

also you can set pretty much any freq you want without any limits via the AB profile file....see this post for more info

all this regrettably no longer works in beta 18; you have to use beta 17 for these hacks to work
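for the curious, the offsets in the table look like they follow a simple pattern: the last argument of `/wi4,30,8d,N` is a hex byte worth 6.25 mV per step (0x10 = 16 steps = +100 mV, 0x20 = 32 = +200 mV, and it matches the +206 mV I saw with 21). A little Python sketch of that inferred mapping; this is just my guess from the posted values, not documented Afterburner behavior:

```python
# Inferred from the offset table: the last argument of /wi4,30,8d,N appears
# to be a hex byte where each step is worth 6.25 mV. Guesswork, not official.
MV_PER_STEP = 6.25

def offset_arg(target_mv):
    """Hex argument that lands closest to the desired mV offset."""
    steps = round(target_mv / MV_PER_STEP)
    return format(steps, "x")

def actual_mv(hex_arg):
    """The mV offset a given hex argument actually corresponds to."""
    return int(hex_arg, 16) * MV_PER_STEP

for mv in (100, 125, 156, 200):
    arg = offset_arg(mv)
    print(f"+{mv} mV -> MSIAfterburner.exe /wi4,30,8d,{arg}"
          f" (actually {actual_mv(arg):+.2f} mV)")
```

so "+156mv" with 0x19 is really +156.25 mV, which is why the table values look slightly uneven.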


----------



## Connolly

Quote:


> Originally Posted by *jerrolds*
> 
> I play at 1440p, all "High" settings with AA turned off and HBAO on, and I hover around 80-120fps in multiplayer, usually 95-105fps. This is with an i7 [email protected] and [email protected]. ~60fps is doable at Ultra 1440p, if that's what you're looking for, I think.


Really? I get between 70 and 130 fps at 1080p, with an average of around 85 - 90. That's with a mix of ultra and high settings and AA/HBAO off. Your fps seems very high for 1440p


----------



## Jack Mac

Quote:


> Originally Posted by *headbass*
> 
> as I said already, those commands that *Hattifnatten* posted change the settings (stored in some ab cfg file I guess) that will be used the next time you start afterburner
> 
> you can make a bat file and put these 3 lines in it
> 
> Code:
> 
> 
> 
> Code:
> 
> 
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi4,30,8d,14
> MSIAfterburner.exe
> 
> the first line takes you to the afterburner location, so change it according to where it is for you (or I guess if you place the bat in the AB directory you don't need the first line)
> 
> the next line is the voltage offset that will be used the next time you start the afterburner (but will not start the afterburner itself)...replace the second line with any of these for your desired voltage:
> 
> Code:
> 
> 
> 
> Code:
> 
> 
> MSIAfterburner.exe /wi4,30,8d,20 ........... +200mv offset This gets HOT
> MSIAfterburner.exe /wi4,30,8d,19 ........... +156mv offset This seems to work BEST
> MSIAfterburner.exe /wi4,30,8d,18 ........... +150mv offset
> MSIAfterburner.exe /wi4,30,8d,17 ........... +144mv offset
> MSIAfterburner.exe /wi4,30,8d,16 ........... +138mv offset
> MSIAfterburner.exe /wi4,30,8d,14 ........... +125mv offset
> MSIAfterburner.exe /wi4,30,8d,12 ........... +113mv offset
> MSIAfterburner.exe /wi4,30,8d,10 ........... +100mv offset AB's Optimum Setting
> MSIAfterburner.exe /wi4,30,8d,0 .............. Return to 0
> 
> the last line starts afterburner which will load with the voltage offset according to the command used in second line....it is sort of a hack and if you touch the voltage slider it will only let you go to +100mV again so set the voltage via the command and do not touch it after you start afterburner
> 
> hope it's clear now


This is going to be useful for me because I definitely prefer AB to Trixx. Might do some benching with this later, rep.


----------



## szeged

greetings my red friends, I'll be coming back over soon when the 290X Lightning comes out







i just hope it comes out soon and under $800 lol.


----------



## sugarhell

Quote:


> Originally Posted by *szeged*
> 
> greetings my red friends, ill be coming back over soon when the 290x lightning comes out
> 
> 
> 
> 
> 
> 
> 
> i just hope it comes out soon and under $800 lol.


It's $630, going by the latest news


----------



## szeged

Quote:


> Originally Posted by *sugarhell*
> 
> Its 630 from the latest news


$630 MSRP, so $800 on pricegougersusa.com ....i mean Newegg.









If I can find one on day one I'll definitely grab it.


----------



## VSG

Man, I want to grab 1 or 2 as well, but not if Newegg gouges me; I really want to boycott Newegg permanently now. Hopefully Amazon will have them soon as well.


----------



## Arizonian

Quote:


> Originally Posted by *King4x4*
> 
> Can I join now?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added







If you could pop a GPU-Z validation link with your OCN name, open to the validation tab, in the post, that would be great. Congrats on all FOUR!

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/13400#post_21532521

Quote:


> Originally Posted by *Alexbo1101*
> 
> Arizonian could you update me to water cooling?
> 
> 
> Spoiler: Water! :D


Congrats - Updated


----------



## grunion

Model Name: R9290X-DC2OC-4GD5
ETA: Jan. W2
Web link: http://www.asus.com/Graphics_Cards/R9290XDC2OC4GD5/

· MSRP: $ 589.99

· Part Number: R9290X-DC2OC-4GD5


DirectCU II achieves 20% lower temps with 220% the dissipation area, plus 3X quieter running than reference.
Exclusive CoolTech fan drives wider airflow to keep critical components cool.
1050 MHz engine clock for better performance and an outstanding gaming experience.
DIGI+ VRM with 8-phase Super Alloy Power delivers precise digital power for superior efficiency, reliability, and performance.
GPU Tweak helps you modify clock speeds, voltages, fan performance and more, all via an intuitive interface.
GPU Tweak Streaming lets you share on-screen action in real time, so others can watch live as games are played.
4.7% faster game performance than the reference Radeon R9 290X in Metro: Last Light.


----------



## the9quad

Quote:


> Originally Posted by *grunion*
> 
> Model Name: R9290X-DC2OC-4GD5
> ETA: Jan. W2
> Web link: http://www.asus.com/Graphics_Cards/R9290XDC2OC4GD5/
> 
> · MSRP: $ 589.99
> 
> · Part Number: R9290X-DC2OC-4GD5
> 
> 
> DirectCU II achieves 20% lower temps with 220% the dissipation area, plus 3X quieter running than reference.
> Exclusive CoolTech Fan drives wider airflow to keep critical components cool.
> 1050 MHZ engine clock for better performance and outstanding gaming experience
> DIGI+ VRM with 8-phase Super Alloy Power delivers precise digital power for superior efficiency, reliability, and performance.
> GPU Tweak helps you modify clock speeds, voltages, fan performance and more, all via an intuitive interface
> GPU Tweak Streaming let you share on-screen action in real time - so others can watch live as games are played.
> 4.7% faster game performance than reference Radeon R9 290X in Metro Last Light


man if you could only get them at MSRP lol this would be a beautiful world


----------



## bloodkil93

Update me to an R9 290X from Sapphire. Sorry for the terrible quality; my phone's camera is very broken. This is a replacement for the R9 290 I sent back under RMA due to blackscreens.

Gonna put it under an NZXT Kraken G10 this month.


----------



## bloodkil93

Quote:


> Originally Posted by *the9quad*
> 
> man if you could only get them at MSRP lol this would be a beautiful world


You guys are lucky; you should see the prices here in the UK. The Tri-X R9 290X is £499.99 (~$840 USD) and the reference models are about £420 (~$690 USD), which is what I paid for mine.


----------



## VSG

Well, that has always been the case, irrespective of the electronic item. At least you guys are not gouged even further.


----------



## Gumbi

Quote:


> Originally Posted by *bloodkil93*
> 
> You guys are lucky, should see the prices here in the UK, the Tri X R9 290X is £499.99GBP ($840USD), the reference models are about £420GBP ($690USD) which is how much I paid for mine.


That's because Overclockers can be expensive.

Of course, we pay ~20% tax (VAT) on everything we buy here, included in the sale price.

If you are savvy, you can grab a 290X Tri-X for 500 euro from some German etailers; they are always very competitively priced.

http://geizhals.de/sapphire-radeon-r9-290x-tri-x-oc-11226-00-40g-a1048381.html

Check out the price history; it has been at the 514 euro and 500 euro price points recently.


----------



## alancsalt

ASUS Radeon R9 290X 4GB BF4 Bundle $AU769.00


----------



## bloodkil93

Quote:


> Originally Posted by *Gumbi*
> 
> That's because Overclockers can be expensive.
> 
> Of course, we pay ~20% tax (VAT) on everything we buy here, included in sale price.
> 
> If you are savvy, you can grab a 290X Trix for 500 euro from some German etailers, they are always very competetively priced.


You gotta be careful though as sometimes it's taxed on import.


----------



## Gumbi

Quote:


> Originally Posted by *bloodkil93*
> 
> You gotta be careful though as sometimes it's taxed on import.


Never the case, you pay the tax at sale price. One of the benefits of living in the EU, free trade.

I've had several orders over the years from Hardwareversand (19 euro flat shipping fee no matter how big the order) and have spent probably 2.5k there over 3 years. I've never been taxed.

Why would I be?


----------



## Slomo4shO

Quote:


> Originally Posted by *geggeg*
> 
> Man I want to grab 1 or 2 as well but not if Newegg gouges me, I really want to boycott Newegg permanently now. Hopefully Amazon will have them soon as well.


If they were the only ones I could understand that sentiment... It is practically every US retailer with a few exceptions here and there (ShopBLT.)


----------



## VSG

Amazon has not really gouged prices, they just seem to go out of stock asap. Newegg was gouging the most of the big retailers as well.


----------



## BradleyW

How's the coil whine for your 290X's everyone?
Thank you.


----------



## szeged

Quote:


> Originally Posted by *geggeg*
> 
> Amazon has not really gouged prices, they just seem to go out of stock asap. Newegg was gouging the most of the big retailers as well.


Yeah I'm done buying from newegg. Worst pricing policy ever.


----------



## Jack Mac

My 290 has no coil whine, except the fan makes a whining noise at anything above 75%.


----------



## Widde

Quote:


> Originally Posted by *BradleyW*
> 
> How's the coil whine for your 290X's everyone?
> Thank you.


I have a little noise in my headphones at high fps, no coil whine though. Might be the onboard audio as well









Side note: I'm inches away from pulling the trigger on a second 290. Has anyone had any stutter or latency issues? How is it performance-wise?







Never had dual GPUs before. Just want to do a final check ^^


----------



## bloodkil93

Quote:


> Originally Posted by *BradleyW*
> 
> How's the coil whine for your 290X's everyone?
> Thank you.


My Sapphire R9 290X has very quiet coil whine, and it only happens when getting stupidly high FPS like 300+.

Other than that it performs fine, with no issues or blackscreens.


----------



## Slomo4shO

Quote:


> Originally Posted by *szeged*
> 
> Yeah I'm done buying from newegg. Worst pricing policy ever.


I just let the price dictate who I buy from, so it's a non-issue. Then again, I did score 4 R9 290s from Newegg Business for $320 each ($300 per card after selling the extra games) on Black Friday.


----------



## Jhors2

So I've been trying to read into this as much as possible. I was assuming the DirectCU II would be the best card to buy because of the aftermarket digital power delivery and the PWM heat issues people were seeing, but I want to watercool 3-4 of these and am wondering what the best card to buy will be.


----------



## sun100

Quote:


> Originally Posted by *Gumbi*
> 
> That's because Overclockers can be expensive.
> 
> Of course, we pay ~20% tax (VAT) on everything we buy here, included in sale price.
> 
> If you are savvy, you can grab a 290X Trix for 500 euro from some German etailers, they are always very competetively priced.
> 
> http://geizhals.de/sapphire-radeon-r9-290x-tri-x-oc-11226-00-40g-a1048381.html
> 
> Chec out price history, it has been at 514 euro and 500 euro price points recently.


Yeah, you can also order it from Swedish retailers; it goes for about 460€. Don't forget, you don't pay import customs on shipments coming from the EU.


----------



## dspacek

Quote:


> Originally Posted by *headbass*
> 
> as I said already, those commands that *Hattifnatten* posted change the settings (stored in some ab cfg file I guess) that will be used the next time you start afterburner
> 
> you can make a bat file and put these 3 lines in it
> 
> Code:
> 
> 
> 
> Code:
> 
> 
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi4,30,8d,14
> MSIAfterburner.exe
> 
> the first line takes you to the afterburner location, so change it according to where it is for you (or I guess if you place the bat in the AB directory you don't need the first line)
> 
> the next line is the voltage offset that will be used the next time you start the afterburner (but will not start the afterburner itself)...replace the second line with any of these for your desired voltage:
> 
> Code:
> 
> 
> 
> Code:
> 
> 
> MSIAfterburner.exe /wi4,30,8d,20 ........... +200mv offset This gets HOT
> MSIAfterburner.exe /wi4,30,8d,19 ........... +156mv offset This seems to work BEST
> MSIAfterburner.exe /wi4,30,8d,18 ........... +150mv offset
> MSIAfterburner.exe /wi4,30,8d,17 ........... +144mv offset
> MSIAfterburner.exe /wi4,30,8d,16 ........... +138mv offset
> MSIAfterburner.exe /wi4,30,8d,14 ........... +125mv offset
> MSIAfterburner.exe /wi4,30,8d,12 ........... +113mv offset
> MSIAfterburner.exe /wi4,30,8d,10 ........... +100mv offset AB's Optimum Setting
> MSIAfterburner.exe /wi4,30,8d,0 .............. Return to 0
> 
> the last line starts afterburner which will load with the voltage offset according to the command used in second line....it is sort of a hack and if you touch the voltage slider it will only let you go to +100mV again so set the voltage via the command and do not touch it after you start afterburner
> 
> hope it's clear now


Thanks very much! And does anything like this exist for core and mem overclocking? I would like to set 1700 MHz RAM and around 1300 MHz core... in a couple of days, under water.


----------



## Slomo4shO

Quote:


> Originally Posted by *Jhors2*
> 
> So I've been trying to read into this as much as possible, I was assuming the DirectCu II would be the best card to buy because of the aftermarket digipower delivery and the PWM heat issues people were seeing, but I want to watercool 3-4 of these and am wondering what the best card to buy will be.


The DirectCU II is the only one that currently has a custom block in the works from EK. The models that use the reference boards really don't have anything to justify the costs, and you would be equally served by going with a reference model if you were to water cool one of the custom coolers with a reference board. The best option would be a Lightning or Matrix under water, but they are nowhere in sight in the foreseeable future.


----------



## headbass

Quote:


> Originally Posted by *dspacek*
> 
> Thanks very much, and does anything like this exists for a core and mem overclocking? I would like to set up 1700 mhz RAM and around 1300Mhz COre...
> 
> 
> 
> 
> 
> 
> 
> in couple of days under water


no idea, sorry... I don't deserve any credit for this; all I did was explain in more detail how this voltage hack works to help others get it working ;o]

btw you could try experimenting with editing the profile file for your card

in my AB profiles directory
C:\Program Files (x86)\MSI Afterburner\Profiles
I have file
VEN_1002&DEV_67B1&SUBSYS_0B001002&REV_00&BUS_1&DEV_0&FN_0.cfg

which seems to be the one changed by the hack...looks like this:

Code:



Code:


[I2C_BUS_04_DEV_30]
Offset00=25 1D 24 00 13 18 90 55 68 75 11 66 66 44 FF FF 
Offset10=A2 22 80 10 2A FF 40 00 00 00 00 00 00 00 00 00 
Offset20=00 00 60 60 2B 1E FF 80 06 21 94 2D 37 93 14 23 
Offset30=77 07 FC 7C 79 05 05 1D 81 A0 60 00 00 66 00 00 
Offset40=A8 90 00 00 00 00 00 00 14 14 00 40 80 60 A0 FF 
Offset50=FF 00 30 88 44 88 44 12 2A 02 A8 00 50 25 00 3C 
Offset60=3C 02 4E 60 C0 9C 24 88 80 00 00 00 00 FF 06 FF 
Offset70=FF 20 00 00 00 00 00 00 00 00 00 00 15 15 00 00 
Offset80=00 00 00 00 00 00 00 00 88 88 01 C2 44 10 00 1F 
Offset90=00 05 43 59 58 01 70 70 5E A9 8B 80 05 03 23 24 
OffsetA0=00 00 00 00 00 00 01 01 08 55 80 AE 9F 00 00 0A 
OffsetB0=06 00 00 00 00 00 23 A3 00 00 71 72 AD 15 00 00 
OffsetC0=00 00 01 76 23 00 00 00 00 00 00 07 02 44 00 00 
OffsetD0=00 00 AD 50 0C 00 10 00 3F 00 00 00 23 C0 F0 00 
OffsetE0=00 00 00 10 01 03 01 88 00 00 00 00 00 00 00 00 
OffsetF0=00 00 00 00 00 33 00 00 00 00 00 00 00 00 00 00 
[Defaults]
Format=2
PowerLimit=0
CoreClk=947000
MemClk=1250000
FanMode=1
FanSpeed=25
[Settings]
CaptureDefaults=0
[Startup]
Format=2
PowerLimit=
CoreClk=
MemClk=
FanMode=
FanSpeed=
CoreVoltageBoost=
AuxVoltageBoost=

if it works for the offsets maybe it will work if you try editing the default/startup parameters

since you've asked for the mem freq bump quite a few times in this thread already, it seems you're eager to give it a try... let us know if it worked

I can't see myself needing to extend the AB limits; my card can't even get to 1200 on core without artifacting in games while keeping the max in GPU-Z under 1.35V, which is already more than I would like for 24/7, and I don't care about benches really

just don't blame me if you fry the card since I have no idea how it works really ;o]
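and if you'd rather script the [Defaults] edit than hand-edit the cfg, something like this Python sketch works on the INI-style layout above (clocks are in kHz per the posted file, so 947000 = 947 MHz; your profile filename will differ, and the same disclaimer applies):

```python
# Sketch: bump CoreClk in the [Defaults] section of an Afterburner profile
# like the one posted above. Clock values are in kHz (947000 = 947 MHz).
# The profile filename is card-specific; pure experimentation, no guarantees.
import configparser

def set_default_core_clk(cfg_path, mhz):
    cp = configparser.ConfigParser()
    cp.optionxform = str  # preserve key case (CoreClk, not coreclk)
    cp.read(cfg_path)
    cp["Defaults"]["CoreClk"] = str(mhz * 1000)  # MHz -> kHz
    with open(cfg_path, "w") as f:
        # AB's file uses key=value with no spaces around '='
        cp.write(f, space_around_delimiters=False)
```

same idea should work for MemClk or the [Startup] keys; back up the original cfg first.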


----------



## Jhors2

Quote:


> Originally Posted by *Slomo4shO*
> 
> The DirectCU II is the only one that currently has a custom block in the works from EK. The models that use the reference boards really don't have anything to justify the costs, and you would be equally served by going with a reference model if you were to water cool one of the custom coolers with a reference board. The best option would be a Lightning or Matrix under water, but they are nowhere in sight in the foreseeable future.


Thanks Slomo! Much appreciated!


----------



## Gumbi

Quote:


> Originally Posted by *sun100*
> 
> Yeah, you can also order it from Swedish retailers, goes for about 460€ . Don't forget, you don't pay import customs for shipments coming from EU.


The 290X for 460? I doubt it very much. Sounds like you're talking about the 290, in which case I've seen it as low as 386 euro, currently going for 400 right now. A pretty good price considering it's VAT-inclusive.


----------



## jerrolds

Quote:


> Originally Posted by *Connolly*
> 
> Really? I get between 70 and 130 fps at 1080p, with an average of around 85 - 90. That's with a mix of ultra and high settings and AA/HBAO off. Your fps seems very high for 1440p


No, you're right... I think I was only seeing the high fps and ignoring when it went low, haha. It's around 80fps right now on 64-player Lancang Dam and Siege of Shanghai.

Coulda sworn some maps I was consistently above 100fps... but I guess I'm wrong.


----------



## kizwan

Quote:


> Originally Posted by *bloodkil93*
> 
> You guys are lucky, should see the prices here in the UK, the Tri X R9 290X is £499.99GBP ($840USD), the reference models are about £420GBP ($690USD) which is how much I paid for mine.


In Malaysia, the reference 290X was MYR 2399 (£445.67) when released and is now reduced to MYR 1899 (£352.78), for Sapphire at least. The Sapphire Tri-X R9 290 is MYR 2299 (£427.09) at the moment.

No GST/VAT yet, but there will be soon, in 2015. I hope salaries increase too.

Quote:


> Originally Posted by *Widde*
> 
> I have a little noise in my headphones at high fps, no coil whine though. Might be the onboard audio as well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Side note: I'm inches away from pulling the trigger on a second 290, anyone had any stutter or latency issues? How is it performance wise?
> 
> 
> 
> 
> 
> 
> 
> Never had dual gpus before. Just want to do a final check ^^


I only play a few games: DiRT 2, BF3 & BF4. I can report no stutter issues with these games.


----------



## Arizonian

Quote:


> Originally Posted by *bloodkil93*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Update me to an R9 290X from Sapphire. Sorry for the terrible quality; my phone's camera is very broken. This is a replacement for the R9 290 I sent back under RMA due to blackscreens.
> 
> Gonna put it under an NZXT Kraken G10 this month.


Congrats - updated









Will leave you with aftermarket cooling since you're going to add the Kraken again.


----------



## TommyGunn123

Gigabyte Radeon R9 290X 4GB OC BF4 Bundle out for $769 AUD



http://www.pccasegear.com/index.php?main_page=product_info&cPath=193_1555&products_id=26457&zenid=539fb2d531d3607b0a5c7a53898e5592


----------



## Roy360

That seems expensive.

Why not buy a regular R9 290 for $419 and a waterblock? You could get an H220, tubing, and some barbs too for that price.
http://accessories.dell.com/sna/products/Graphic_Video_Cards/productdetail.aspx?c=ca&l=en&cs=cadhs1&sku=A7389425&baynote_bnrank=0&baynote_irrank=0&~ck=baynoteSearch

no battlefield though


----------



## the9quad

Quote:


> Originally Posted by *Roy360*
> 
> that seems expensive.
> 
> Why not buy a regular R9 290 for 419 and buy a waterblock? You could buy a H220 , tubing, some barbs too for that price
> http://accessories.dell.com/sna/products/Graphic_Video_Cards/productdetail.aspx?c=ca&l=en&cs=cadhs1&sku=A7389425&baynote_bnrank=0&baynote_irrank=0&~ck=baynoteSearch


Nm


----------



## HardwareDecoder

any working bios editor out for these?


----------



## hatlesschimp

Quote:


> Originally Posted by *King4x4*
> 
> Can I join now?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


YUM!!!

I'm waiting on the magical 290X with HDMI 2.0!

WILL IT EVER COME???


----------



## wolfej

Finally got my loop set up; I was running them on air until I got the new case and other junk.



I'm too lazy to get my fiancee's camera to take a proper picture, but _THAR THEY BLOW!_

I've gotten them to 1200/1300 with just 100mV offset as I've only been messing around with the clocks for like 30 minutes. Seems to be completely stable.


----------



## TheSoldiet

OK, I need help. Is the stock thermal pad for the 290X's stock VRM conductive?! I'm worried because I had to reuse it on my Alpenföhn VRM cooler. The thermal pad kinda went all over the place and touched a lot of those little components just beside VRM1. Please answer if you know.

Thx


----------



## kizwan

Quote:


> Originally Posted by *TheSoldiet*
> 
> Ok, i need help. Is the stock thermal pad for the 290x stock VRM conductive?! I'm worried because i had to reuse it on my alpenfohn vrm cooler. The thermal pad kinda went all over the place and touched allot of those little things just besides vrm1. Please answer if you know.
> 
> Thx


The thermal pad is not electrically conductive.


----------



## Matt-Matt

My XFX "Core" (aka reference) 290 refuses to accept a 290X BIOS and will just soft brick no matter what BIOS I use.

Still, it gets great fps in games, but it's oh so loud. What coolers are people using for these, and what are their temps like?

Also, what is the best waterblock? They all cost around $150 apart from the XSPC one for me.


----------



## Amhro

Finally, non-ref 290s showed up in stores!!








How's the Gigabyte? Is it voltage locked? It's 400€ with BF4 (40€ more than the ref model).
There's also the sapphire trix, but it won't fit into my case, I guess.
And I don't plan any high overclocking yet, because of my CPU


----------



## darkelixa

I have no idea how a trixx would not fit; it's low profile and short in length


----------



## ottoore

Quote:


> Originally Posted by *Matt-Matt*
> 
> My XFX "Core" aka reference 290 refuses to accept a 290x BIOS and will just soft brick no matter what BIOS I use.


I have the same problem. You can use any 290 BIOS and, if you want to unlock voltage in GPU Tweak, you can use the pt1t BIOS (you can find it in the unlock thread), which is the pt1 BIOS for the R9 290.


----------



## hotrod717

Asus DCII officially released. Should be available soon. 290X and 290 versions.
http://rog.asus.com/297212014/graphics-cards-2/pr-asus-announces-r9-290x-and-r9-290-directcu-ii-graphics-cards/


----------



## Amhro

Quote:


> Originally Posted by *darkelixa*
> 
> I have no idea how a trixx would not fit, its low profile and low length


Isn't trix like 30,5cm long?


----------



## hotrod717

Quote:


> Originally Posted by *Amhro*
> 
> Finally, non-ref 290s showed up in stores!!
> 
> 
> 
> 
> 
> 
> 
> 
> How's Gigabyte? Is it voltage locked? It's 400€ with BF4 (40€ more than ref model).
> There's also sapphire trix, but it won't fit into my case I guess.
> And I don't plan any high overclocking yet, because of my cpu


Quote:


> Originally Posted by *darkelixa*
> 
> I have no idea how a trixx would not fit, its low profile and low length


Quote:


> Originally Posted by *Amhro*
> 
> Isn't trix like 30,5cm long?


Sapphire Trixx is overclocking software, not a type of hardware you put in your case. It will fit on the smallest of drives.


----------



## darkelixa

http://www.pccasegear.com/index.php?main_page=product_info&cPath=193_1575&products_id=26373

Wrong, its the new line of r9 290


----------



## hotrod717

Quote:


> Originally Posted by *darkelixa*
> 
> http://www.pccasegear.com/index.php?main_page=product_info&cPath=193_1575&products_id=26373
> 
> Wrong, its the new line of r9 290


No, that's the Tri-X, not trix or trixx.


----------



## MIGhunter

Quote:


> Originally Posted by *darkelixa*
> 
> http://www.pccasegear.com/index.php?main_page=product_info&cPath=193_1575&products_id=26373
> 
> Wrong, its the new line of r9 290


Tri-X is the new card by sapphire, triXX is the OC software.


----------



## Amhro

Quote:


> Originally Posted by *hotrod717*
> 
> Sapphire Trixx is overclocking software, not a type of hardware you put in your case. It will fit on the smallest of drives.


Trixx != trix


----------



## dspacek

I have to say a big thanks to *headbass*, who helped me reach unlimited OCing.
Now my AB looks like this, BUT ULPS does not work at high clocks:


----------



## flamin9_t00l

Has anyone tried "Force Constant Voltage" in Trixx, and does it eliminate vdroop?


----------



## MasterT

Heya guys, greetings from the Caribbean. I've been following this thread from page 1, since the launch of these cards, and now I'd like to be added. I've had loads of help from you guys; excellent community here. My cards are Asus R9 290s and they are now under water. Temps after a Heaven run are 52 (top card)/51 on the GPU. More testing to be done.


----------



## flamin9_t00l

Quote:


> Originally Posted by *MasterT*
> 
> Heya guys. Greeting from the Caribbean. Been following this thread from page 1 since the launch of these cards. Now I'll like to be added. Had loads of help from you guys. Excellent community here. My cards are Asus R9 290 and they are now under water. Temps after a Heaven run are 52(top card)/51 on gpu. More testing to be done.


Very nice... Did you apply red strips or paint them red to show they're Radeon cards, or did they come like that?


----------



## ds84

Just got the MSI R9 290 Gaming. OC'd to +50 mV, +50% power limit, 1077 MHz core clock, 1300 MHz mem clock. Ran Heaven and 3DMark and got the following results:

Heaven:
FPS: 54.7
Score: 1377

Min FPS: 22.3
Max FPS: 112.2

3DMark:
8555

My ASIC is 79.6%. How can I ensure my OC is stable?


----------



## MasterT

Got them from a member. He painted them.


----------



## ImJJames

Quote:


> Originally Posted by *ds84*
> 
> Just got the MSI R9 290 Gaming. OC to +50 mV, +50%, 1077mhz Core clock, 1300mhz mem clock. Ran heaven and 3dmark and got the following results:
> 
> Heaven:
> FPS: 54.7
> Score: 1377
> 
> Min FPS: 22.3
> Max FPS: 112.2
> 
> 3DMark:
> 8555
> 
> My ASIC is 79.6%. How can i ensure my OC can run stable?


Game on it


----------



## tobitronics

Is there any benefit to OC capabilities if I flash a different VGA BIOS? Currently I have an Asus 290X under water with BIOS 006.003518. Should I update to 007.003525? And what is the best OC utility? Thanks in advance!


----------



## Ukkooh

Do XFX cards have warranty void stickers on the cooler screws?


----------



## ds84

Quote:


> Originally Posted by *ImJJames*
> 
> Game on it


But I would like to find the highest OC that is stable.


----------



## Ukkooh

Quote:


> Originally Posted by *ds84*
> 
> But, i would like to try to find a high OC which is stable.


Fold on it if gaming stable isn't stable enough for you.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *ds84*
> 
> But, i would like to try to find a high OC which is stable.


If you can't game on it, it's not stable. Simple as that.
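The advice above ("game on it", "fold on it") is really a search procedure: try a clock, keep it if the test passes, back off if it fails. A toy sketch of that search as a binary hunt — `is_stable` stands in for whatever test you trust (an evening of gaming, a folding run), and nothing here talks to a real card:

```python
# Hypothetical sketch: bisect toward the highest stable core clock.
# `is_stable` is whatever stability test you trust; this never calls
# any real overclocking tool.

def find_max_stable(low_mhz, high_mhz, is_stable, step=5):
    """Binary-search the clock range until the window is <= `step` MHz."""
    best = low_mhz
    while high_mhz - low_mhz > step:
        mid = (low_mhz + high_mhz) // 2
        if is_stable(mid):
            best = mid
            low_mhz = mid      # stable: push higher
        else:
            high_mhz = mid     # unstable: back off
    return best

# Toy stand-in: pretend everything up to 1120 MHz passes.
print(find_max_stable(1000, 1200, lambda mhz: mhz <= 1120))  # prints 1118
```

Each failed step costs a full test run, which is why people settle for "game on it" rather than chasing the last 5 MHz.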


----------



## pkrexer

So I was using Kombustor to try and determine my max overclock. I would constantly see artifacts at much lower clocks than I would in games; I figured it was due to Kombustor making my GPU run much hotter. However, after monitoring my voltages (running +25 mV with AB) through GPU-Z, I noticed that my core was only showing around 1.109 V while stressing with Kombustor. When I play a game it's typically around 1.189-1.2 V.

Anyone else notice this? Also, what do you use to test for artifacts besides just games?
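GPU-Z can log its sensor readings to a file, which makes questions like this easier to answer than watching the graphs live. A rough sketch of pulling the minimum core voltage seen under load out of such a log — the column names here are assumptions, so check them against the header row of your own log:

```python
import csv

# Sketch: find the lowest core voltage recorded while the card was under
# load. The column names below are assumed, not guaranteed -- edit them
# to match the header row of your own GPU-Z log file.
CLOCK_COL = "GPU Core Clock [MHz]"
VDDC_COL = "VDDC"

def min_load_voltage(path, load_clock_mhz=900):
    lows = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                clock = float(row[CLOCK_COL])
                vddc = float(row[VDDC_COL])
            except (KeyError, TypeError, ValueError):
                continue  # skip malformed or partial rows
            if clock >= load_clock_mhz:  # only count samples under load
                lows.append(vddc)
    return min(lows) if lows else None
```

Run the log during a game session and again during Kombustor; a noticeably lower minimum in the Kombustor run is the droop being described here.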


----------



## Roy360

Quote:


> Originally Posted by *Ukkooh*
> 
> Do XFX cards have warranty void stickers on the cooler screws?


Yes. Unfortunately, they seem to be the only ones who have those stickers. Apparently they want you to tell them if you are modifying the card.

ASUS, PowerColor and Gigabyte shouldn't have stickers.

Tell me if you know a way to remove the sticker. I thought the extreme temperatures would make it fall off, but that didn't work.


----------



## VSG

Asus also has those stickers- a tiny one on the core backside, real tricky to remove unscathed.


----------



## jerrolds

Quote:


> Originally Posted by *Ukkooh*
> 
> Do XFX cards have warranty void stickers on the cooler screws?


Yes they do; I had an XFX reference 290X briefly and it did have a sticker on the screws behind the GPU. But I believe XFX is one of the only manufacturers that allows modification (you may have to talk to them first).

It's weird, since Sapphire explicitly does not allow modification but has no stickers, heh.


----------



## kpoeticg

Hey guys, I'm about to grab my first 290X for my tri-290X build. I'm trying to wait for prices to drop before I grab the other 2. I was thinking of grabbing the PowerColor cuz of its higher core clock (1030). I'll be OC'ing & gaming, and I plan on having all 3 of my cards under water. Is the PowerColor a good choice? And has anybody started selling cards with strictly Hynix memory yet?


----------



## Ukkooh

Quote:


> Originally Posted by *Roy360*
> 
> Yes. Unfortunately, they seem to be the only one who does have those stickers. Apparently they want you to tell them if you are modifying the card.
> 
> ASUS, PowerColor, Gigabyte, shouldn't have stickers
> 
> Tell me if know a way to remove the sticker. I thought the extreme temperatures would make them fall off. But that didn't work.


I guess I'll have to send the XFX card I have back in the mail then. Thanks.


----------



## VSG

Save yourself some money and grab any reference card with a good warranty, plus the EK block separately. That overclock is pitiful for water cooling; they probably just decided to be conservative. On air you can get 1075 core at least, on average, without added voltage.

No one I know of is selling cards with just Hynix yet, but Elpida on the 290/290X seems to have tighter timings anyway, so it plays out the same more or less.


----------



## kpoeticg

Quote:


> Originally Posted by *geggeg*
> 
> Save yourself some money and grab any reference card with good warranty and the EK block separately. That overclock is pityful for water cooling, and they probably just decided to be conservative. On air you can get 1075 core at least on average without added voltage.
> 
> No one I know of is selling cards with just hynix yet, but elpida on the 290/290x seems to have tighter timings anyway so it plays out the same more or less.


Thanx brotha. Rep!

I'm planning on the AC Kryographics blocks for the 290Xs. I'm going EK with all my other blocks. Love the AC 290X copper/plexi tho. I like the new black version too.

I'll probably just grab a Sapphire for now then. Thanx again.

Edit: I wasn't getting the combo EK/PowerColor, just the regular PowerColor cuz of its higher core clock. But if it doesn't make a difference when OC'ing then I'll just grab a Sapphire. I just need something for now so I can power on my RIVE BE. It's been sitting around for too long =P


----------



## TheHunter

Quote:


> Originally Posted by *MasterT*
> 
> Heya guys, greetings from the Caribbean. Been following this thread from page 1, since the launch of these cards. Now I'd like to be added. Had loads of help from you guys. Excellent community here. My cards are Asus R9 290s and they are now under water. Temps after a Heaven run are 52 (top card)/51 on GPU. More testing to be done.


Nice one. I was wondering what temps you got with the DirectCU II fan, or did you take it off ASAP?
Quote:


> Originally Posted by *ds84*
> 
> Just got the MSI R9 290 Gaming. OC to +50 mV, +50%, 1077mhz Core clock, 1300mhz mem clock. Ran heaven and 3dmark and got the following results:
> 
> Heaven:
> FPS: 54.7
> Score: 1377
> 
> Min FPS: 22.3
> Max FPS: 112.2
> 
> 3DMark:
> 8555
> 
> My ASIC is 79.6%. How can i ensure my OC can run stable?











Hi, what temps did you get with that TF4 @ 50% fan? Thanks


----------



## VSG

Quote:


> Originally Posted by *kpoeticg*
> 
> Thanx brotha. Rep!
> 
> I'm planning on the AC Kryographics Blocks for the 290x's. I'm going EK with all my other blocks. Love the AC 290x Copper/Plexi tho. I like the new black version too.
> 
> I'll probly just grab a Sapphire for now then. Thanx again
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: I wasn't getting the combo EK/PowerColor. Just the regular PowerColor cuz of its higher core clock. But if it doesn't make a difference when OC'ing then i'll just grab a Sapphire. I just need something for now so I can power on my RIVE BE. It's been sitting around for too long =P


No problem, man. Just be aware that the active Aquacomputer backplate for the 290/290x might not be compatible as is (nothing a dremel can't solve) with your RIVBE as some people reported it hit the ram slot covers.


----------



## kpoeticg

Thanx for pointing that out. I was planning on getting the active backplate so good to know. I saw some people talking about removing the ram lever. Didn't know that was why


----------



## Timx2

Wohoo! I got my Sapphire R9 290X y'day (ASIC 78%).

Did a few test benchmarks to see it wasn't DOA and mounted an EKWB waterblock + backplate on it. I just did a 15-minute Valley run and got the following temps:

GPU: 45°
VRM1: 51°
VRM2: 37°

The Valley score was 2527 @ stock settings. I'll be overclocking in a few weeks; I will try the stock settings first for some time.

At idle the temps are around:

GPU: 30°
VRM1: 25°
VRM2: 24°

Are these temps normal for a watercooled loop? I'll post 2 pictures to give an idea of my loop!


----------



## tsm106

Quote:


> I got my Sapphire R290X y'day (ASIC 78%).


No need to list your ASIC, as it's a meaningless metric on Hawaii. Temps look good man.


----------



## tijgert

Quote:


> Is there any benefit to OC capabilities if I flash a different VGA BIOS? Currently I have a Asus 290X under water with BIOS 006.003518. Should I update to 007.003525? And what is the best OC utility? Thanks in advance!


Here's what I found: the 3525 is the same BIOS as on several other 290Xs (check the TechPowerUp BIOS section for that). As such, the 3525 does not allow overvolting with GPU Tweak.
3518 seems to be the latest 'Asus only' BIOS, and it does allow overvolting.

Having said that, GPU Tweak allows only 162 mV extra while TriXX allows 200 mV extra, in case you need it. AB will only let you add 100 mV unless you start fiddling with command lines. Still, AB is the best for graphs. Never understood why TriXX lacked graphs...
Also, my Sapphire had a 3526 BIOS, and that seems quite a few revisions higher than 3518, so I reflashed my Sapphire BIOS and am overvolting with TriXX. Well, I will be once I dunk it under water with my dual 420s with Akasa Vipers.


----------



## Roy360

In case there are some people looking to water cool but can't afford it: buy an H220 or H320 (illegal in the U.S.); they go on sale for $119.99, + $20 for tubing + $20 for 8 barb fittings + $120 for a GPU block. Then ~$30-100 for extra rads if you want a better delta.

The H220 can easily handle GPU+CPU with no problem.
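For what it's worth, the parts above add up like so (prices as quoted, before the optional extra rads):

```python
# Quick sanity check of the quoted parts list (USD, prices as listed above).
parts = {
    "H220 on sale": 119.99,
    "tubing": 20,
    "barb fittings (x8)": 20,
    "GPU block": 120,
}
base = sum(parts.values())
print(f"base loop: ${base:.2f}")  # base loop: $279.99
print(f"with extra rads: ${base + 30:.2f} to ${base + 100:.2f}")
```

So roughly $280 before rads, $310-380 with them — well under a typical full custom loop.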


----------



## VSG

It is not illegal to buy them in the US, just that a patent dispute prevents them being sold in the US. You can still buy them if you find them on eBay or NCIX US and you still get warranty support if bought new from an authorized vendor.


----------



## jerrolds

Quote:


> Originally Posted by *Roy360*
> 
> In case there are some people looking to waterer cool but can't afford it . Buy H220 or H320(illegal in the U.S) they go on sale for 119.99+ 20 for tubing + 20 for 8 barb fittings + 120$ for GPU block. Then ~30-100 for extra rads if you want a better delta.
> 
> The H220 can easily handle gpu+cpu with no problem


I asked the Swiftech rep in the watercooling forums and he said that with my i7 [email protected]+ and R9 [email protected]+ the H220 would be pushing it, but that the H320 would be just fine.

I'm still up in the air about that though... The H320 is $169 CDN; for $100 more I can get a proper XSPC 360 rad Raystorm kit. But I know I'd end up getting rigid tubing, which adds to cost: barbs/fittings, and probably more coolant if you're going to add a GPU to the loop.


----------



## dspacek

Quote:


> Originally Posted by *ds84*
> 
> Just got the MSI R9 290 Gaming. OC to +50 mV, +50%, 1077mhz Core clock, 1300mhz mem clock. Ran heaven and 3dmark and got the following results:
> 
> Heaven:
> FPS: 54.7
> Score: 1377
> 
> Min FPS: 22.3
> Max FPS: 112.2
> 
> 3DMark:
> 8555
> 
> My ASIC is 79.6%. How can i ensure my OC can run stable?


Pure overclock, nobody cares

I'm running this now playing WoT and still stable (pure air stock cooling - water block tomorrow)


----------



## flamin9_t00l

Quote:


> Originally Posted by *Timx2*
> 
> Wohoo! I got my Sapphire R290X y'day (ASIC 78%).
> 
> Did a few test benchmarks to see if it wasn't DOA and mounted a EKWB waterblock + backplate on it. I just did a 15 minutes valley run and I do get the following temps:
> 
> GPU: 45°
> VRM1: 51°
> VRM2: 37°
> 
> The valley score was 2527 @ stock settings. I'll be overclocking in a few weeks. I will try the stock settings first for some time.
> 
> At idle the temps are around
> 
> GPU: 30°
> VRM1: 27°
> VRM2: 25°
> 
> Are these temps normal for a watercooled loop? I post 2 pictures to get an idea of my loop!


It looks like you have your GPU inlet and outlet on the same side of the terminal. I think the inlet should be on one side and the outlet on the other (top or bottom doesn't matter). I would have thought your temps would be very poor like that, as the coolant could just go straight through without circulating the block... unless there is a separator to prevent this; I dunno, didn't check when I installed mine. Your temps seem good though.


----------



## Timx2

Quote:


> Originally Posted by *flamin9_t00l*
> 
> It looks like you have your gpu inlet and outlet on the same side of the terminal. I think inlet should be on one side and outlet on the other (top or bottom doesn't matter). I would have thought your temps would be very poor like that as the coolant could just go straight through without circulating the block... unless there is a separator to prevent this, I dunno didn't check when I installed mine. Your temps seem good though.


Hey,

No, the outlet under the inlet is just a tube for my drain; it doesn't go anywhere. But thanks for your concern.


----------



## staryoshi

I ordered an MSI Gaming R9 290 @ $470, with the caveat that it's temporarily OOS. I'm away from my desktop for a few weeks, so I don't particularly mind waiting. I'm hopeful it will arrive in a few weeks. When it does arrive, expect a thorough Yoshi review like in the days of yore.


----------



## Gumbi

366-euro Windforce 290s are on sale at many German e-tailers: http://geizhals.de/gigabyte-radeon-r9-290-windforce-3x-oc-gv-r929oc-4gd-a1049151.html

The 290 Tri-X is going for as low as 384 now too:
http://geizhals.de/sapphire-radeon-r9-290-tri-x-oc-11227-00-40g-a1048411.html


----------



## headbass

Quote:


> Originally Posted by *tsm106*
> 
> No need to list your asic as its a meaningless metric on Hawaii. Temps look good man.


imho it's quite the opposite, mate... it matters on Hawaii more than on any previous chips.

Stating freq @ voltage offset (without stating ASIC quality as well) is meaningless, since the base voltage depends on the ASIC:

my 71.5% ASIC gets to 1.20 V without any voltage increase
my 72.4% ASIC gets to 1.15 V without any voltage increase

The higher the ASIC, the lower the base voltage. And the higher the voltage, the higher the temps; the higher the temps, the lower the freq (due to throttling to keep it under 95 without going over the max fan duty cycle set by AMD to make the card quieter).

And as always, the higher the ASIC you have, the lower the voltage you need to achieve a certain freq.
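To put the point above in concrete numbers: if the "+mV" slider is an offset on top of a per-chip base voltage, the same slider setting lands at a different absolute voltage on each card. A trivial illustration using the two base voltages quoted above:

```python
# Illustration of offset-on-top-of-base voltage. The base values are the
# two cards quoted above (1.20 V and 1.15 V stock VDDC); nothing here
# queries real hardware.

def effective_vddc(base_v, offset_mv):
    """Absolute core voltage for a given base voltage and AB-style offset."""
    return base_v + offset_mv / 1000.0

for base in (1.20, 1.15):
    print(f"base {base:.2f} V + 50 mV -> {effective_vddc(base, 50):.2f} V")
```

Same +50 mV, but one card ends up at 1.25 V and the other at 1.20 V — which is why quoting "freq @ offset" without the base voltage tells you little.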


----------



## headbass

Quote:


> Originally Posted by *dspacek*
> 
> I have to say big thanks to *headbass*
> 
> 
> 
> 
> 
> 
> 
> who helped me how to reach unlimited OCing
> Now my AB looks like this, BUT ULPS does not work in high clocks:


hehe thx man, it was just a thought, glad it worked...for those who missed the post, it's here


----------



## tsm106

Quote:


> Originally Posted by *headbass*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> No need to list your asic as its a meaningless metric on Hawaii. Temps look good man.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *imho* it's quite the opposite mate...it matters on hawaii more then on any other previous chips
> 
> stating freq @ voltage offset is the meaningless since the base voltage depends on the asic
> 
> my 71.5% asic gets to 1.20v without any voltage increase
> my 72.4% asic gets to 1.15v without any voltage increase
> 
> the higher the asic the lower the base voltage

Which is worth what?

http://forum.beyond3d.com/showpost.php?p=1808073&postcount=2130

Quote:


> ASIC "quality" is a misnomer propagated by GPU-z reading a register and not really knowing what it results in. It is even more meaningless with the binning mechanism of Hawaii.


----------



## Bartouille

Quote:


> Originally Posted by *headbass*
> 
> imho it's quite the opposite mate...it matters on hawaii more then on any other previous chips
> 
> stating freq @ voltage offset (without stating asic quality as well) is the meaningless since the base voltage depends on the asic
> 
> my 71.5% asic gets to 1.20v without any voltage increase
> my 72.4% asic gets to 1.15v without any voltage increase


I don't think lower stock voltage matters much. That's why AB works with an offset.


----------



## hotrod717

Quote:


> Originally Posted by *kpoeticg*
> 
> Hey guys, I'm about to grab my first 290x for my Tri 290x build. I'm trying to wait for prices to drop before I grab my other 2. I was thinking of grabbing the PowerColor cuz of its higher Core Clock (1030). I'll be OC'ing & Gaming. Also I plan on having all 3 of my cards under water. Is the PowerColor a good choice? And has anybody started selling cards with strictly Hynix memory yet?


I would wait a bit. Asus just announced the release of the DCII, the only known card to come with 100% Hynix. EK will also be making a waterblock for it.


----------



## headbass

Quote:


> Originally Posted by *Amhro*
> 
> Isn't trix like 30,5cm long?


yup, it uses the same heatsink as the Sapphire 280X Toxic, which was 308mm; the Tri-X 290/290X is, exactly as you say, 305mm

http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&pid=2023&psn=&lid=1&leg=0
http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&pid=2091&psn=&lid=1&leg=0


----------



## headbass

Quote:


> Originally Posted by *Bartouille*
> 
> I don't think lower stock voltage matters much. That's why AB works with an offset.


the offset is, imho, an offset from the base voltage, which depends on the ASIC


----------



## devilhead

Quote:


> Originally Posted by *hotrod717*
> 
> I would wait a bit. Asus just announced release of DCII. Only known card to come with hynix 100%. EK also will be making waterblock for it.


so the DCII will for sure come with Hynix? Then miners will grab those cards really fast


----------



## hotrod717

Quote:


> Originally Posted by *devilhead*
> 
> so the DCII for sure it will be with hynix? so miners will grab those cards really fast


When you put it like that.... NO, they will definitely not have Hynix memory! Miners, no need to pay attention to this card. It will be horrible for mining and will not make you any cryptocurrency!


----------



## bloodkil93

Quote:


> Originally Posted by *hotrod717*
> 
> When you put it like that....NO, they will definately not have hynix memory! Miners, no need to pay attention to this card. It will be horrible for mining and will not make you any crypto currency!


I'm not being funny or anything, but my Sapphire R9 290X has Elpida memory and has managed to clock at 1500 stable, and could still go higher if I wanted to. Understandably there were blackscreen issues, but they can't be that bad?


----------



## jerrolds

I think most Elpida actually performs worse over 1500MHz - something about the memory controller/voltage. I know at 1600MHz mine scores lower.


----------



## Timx2

Quote:


> Originally Posted by *Timx2*
> 
> Wohoo! I got my Sapphire R290X y'day (ASIC 78%).
> 
> Did a few test benchmarks to see if it wasn't DOA and mounted a EKWB waterblock + backplate on it. I just did a 15 minutes valley run and I do get the following temps:
> 
> GPU: 45°
> VRM1: 51°
> VRM2: 37°
> 
> The valley score was 2527 @ stock settings. I'll be overclocking in a few weeks. I will try the stock settings first for some time.
> 
> At idle the temps are around
> 
> GPU: 30°
> VRM1: 25°
> VRM2: 24°
> 
> Are these temps normal for a watercooled loop? I post 2 pictures to get an idea of my loop!


I am wondering: I only see GPU, VRM1 and VRM2 temps. I can't see the memory temps, or am I missing something?


----------



## bloodkil93

Quote:


> Originally Posted by *jerrolds*
> 
> I think most Eplida actually perform worse over 1500mhz - something about the memory controller/voltage. I know at 1600mhz mine scores lower.


I didn't want to go over 1500 as I was already getting higher FPS with that, so I thought it best to stay put.


----------



## tsm106

Quote:


> Originally Posted by *Timx2*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Timx2*
> 
> Wohoo! I got my Sapphire R290X y'day (ASIC 78%).
> 
> Did a few test benchmarks to see if it wasn't DOA and mounted a EKWB waterblock + backplate on it. I just did a 15 minutes valley run and I do get the following temps:
> 
> GPU: 45°
> VRM1: 51°
> VRM2: 37°
> 
> The valley score was 2527 @ stock settings. I'll be overclocking in a few weeks. I will try the stock settings first for some time.
> 
> At idle the temps are around
> 
> GPU: 30°
> VRM1: 25°
> VRM2: 24°
> 
> Are these temps normal for a watercooled loop? I post 2 pictures to get an idea of my loop!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am wondering. I only see gpu, vrm1 and vrm2 temps. I cant see the memory temps or am I missing something?

That's cuz mem temp and volts are not supported. AFAIK mem temp has never been reported anyway.


----------



## Timx2

Quote:


> Originally Posted by *tsm106*
> 
> That's cuz mem temp and volts are not supported. Afiak mem temp has never been reported anyways as far as I can recall.


Ok, thank you. Then I guess there is no way to check if the memory is properly cooled. At least for the moment...


----------



## Jack Mac

Quote:


> Originally Posted by *Timx2*
> 
> Ok thank you. Then I geuss there is no way to check if the memory is properly cooled. At least for the moment...


You could always touch it...








Although I wouldn't recommend doing that.


----------



## tsm106

Quote:


> Originally Posted by *Timx2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> That's cuz mem temp and volts are not supported. Afiak mem temp has never been reported anyways as far as I can recall.
> 
> 
> 
> Ok thank you. Then I geuss there is no way to check if the memory is properly cooled. At least for the moment...

My mb supports temp sensor leads, 3 of them to be exact. If you have a RIVE or RIVBE board you can add a sensor to your memory, in case you really, really wanted to know.


----------



## dspacek

Quote:


> Originally Posted by *headbass*
> 
> the offset is imho offset from base voltage which depends on asic


Yes, you are right. That's the reason why AB uses an offset for Hawaii, unlike other GPUs.
So here are my latest results, still on air, alias hairdryer cooling:

110/1700 MHz


----------



## tsm106

Quote:


> Originally Posted by *dspacek*
> 
> Quote:
> 
> 
> 
> Originally Posted by *headbass*
> 
> the offset is imho offset from base voltage which depends on asic
> 
> 
> 
> Yes you are right. Thats the reason why AB uses offset for Hawai against other GPU.

Offset is used because the VRM CONTROLLER IS PWM.


----------



## Falkentyne

Quote:


> Originally Posted by *pkrexer*
> 
> So I was using kombustor to try and determine my max overclock. I would constantly see artifacts at much lower clocks then I would in games, I figured it was due to kombustor making my GPU run much hotter. However, after monitoring my voltages(running +25mv with AB) through GPUz I noticed that my core was only showing around 1.109 while stressing with kombustor. When I play a game its typically showing around 1.189 - 1.2.
> 
> Anyone else notice this? Also, what do use to test for artifacts besides just games?


That's vdroop. Kombustor is putting a much higher load on the GPU, so it drops the voltage to keep thermals and stress in check. This is the EXACT SAME THING CPUs do when you stress them with Linpack or Prime95. It's by design, although it's more to stop voltage "overshoots" (when load changes).

If there was no vdroop, or if you remove vdroop by using the command line in MSI Afterburner (check the Guru3D forums for the command), it will make your card run MUCH, MUCH hotter. ONLY advisable on subzero.


----------



## Mr357

Quote:


> Originally Posted by *bloodkil93*
> 
> I'm not being funny or anything, but my Sapphire R9 290X has Elpida memory and has managed to clock at 1500 stable and could still go go higher if I wanted too, understandably there were Blackscreen issues, but they can't be that bad?


My Elpidas do 1700.


----------



## Gunderman456

Forceman previously indicated that one should tighten by hand the compression fittings over the hoses.

Yesterday, while I was connecting the hoses to the compression fittings to get that perfect seal, I could not manage enough hand torque to tighten all the way.

My question then becomes is it ok to not have a perfect seal between the compression fittings when securing the hoses?


----------



## headbass

Quote:


> Originally Posted by *tsm106*
> 
> Offset is used because the VRM CONTROLLER IS PWM.


ahh ok... so the offset is just an increase to what the graphics card fw/BIOS decides is optimal?

I've read some of that Beyond3D forum thread you linked to previously, and it seems it's more complicated than I thought ;o]


----------



## dspacek

Quote:


> Originally Posted by *Mr357*
> 
> My Elpida's do 1700.


My Hynix too..... we'll see what the potential is under water


----------



## Roy360

Quote:


> Originally Posted by *jerrolds*
> 
> I asked the Swiftech rep in the watercooling forums and he said that with my i7 [email protected]+ and R9 [email protected]+ the H220 would be pushing it, but that the H320 would be just fine.
> 
> I'm still up in the air about that though... The H320 is $169 CDN; for $100 more I can get a proper XSPC 360 rad Raystorm kit. But I know I'd end up getting rigid tubing, which adds to cost: barbs/fittings, and probably more coolant if you're going to add a GPU to the loop.


It would be pushing it, which is why I got a custom kit. But I've seen an H220 cool a CPU, two 7950 blocks and VRMs, plus 2 extra radiators in addition to the stock rad. Max temps were ~60. It's somewhere on the LinusTechTips forums, but I can't find it.

The H220 and H320 use the same pump. You will have to add extra radiators for the GPUs.

Now, I paid close to $600 for my kit. My temps barely pass 40 on full load, overclocked.


----------



## headbass

damn, I noticed today that my pc freezes within 5 seconds after I play any video on YouTube... even with everything stock (except a custom fan profile in AB)

also tried increasing the voltage to +100 mV while keeping the speeds stock, and it still freezes... even tried returning the cpu to stock to rule it out... nothing seems to help

and I swapped the graphics cards (still have both of them) and it still freezes with the other 290 as well

I can play games/bench without any problems, OCed or stock... using regular Catalyst 13.12... as always I uninstalled, cleaned with Driver Sweeper (twice to be sure) and then restarted before updating the drivers

according to the gpu-z log it switches to 501/1250.0 and 1.234v when I start the video

I have Aero turned off, which I remember used to keep switching my 6950/6970 between 2D clocks and 2D/3D (clocks kinda halfway between 2D and 3D)

any ideas?

found a thread on OCN about the same issue
http://www.overclock.net/t/1440334/r9-290x-driver-causes-video-playback-to-freeze-pc

but I can play local divx/mp4 files fine (bsplayer+cccp)
also I have the flash hw acceleration turned off iirc, and am using the latest Firefox (non-beta, not Aurora) with flash 10.3


----------



## Hattifnatten

Quote:


> Originally Posted by *Gunderman456*
> 
> Forceman previously indicated that one should tighten by hand the compression fittings over the hoses.
> 
> Yesterday, while I was connecting the hoses to the compression fittings to get that perfect seal, I could not manage enough hand torque to tighten all the way.
> 
> My question then becomes is it ok to not have a perfect seal between the compression fittings when securing the hoses?


I've had fittings with about 1-2mm space between them for 6 months now, haven't had a problem with them. But do try to tighten them as much as possible, I use two hands for more surface area and thus better grip. Can get them half a rotation or so more by doing that.


----------



## ImJJames

Quote:


> Originally Posted by *headbass*
> 
> damn I noticed today that my pc freezes within 5 seconds after I play any video on youtube....even on everything stock (except custom fan profile in AB)
> 
> also tried to increase the voltage to +100mV and keep the speeds stock and still freezes....even tried to return to cpu to stock to rule it out....nothing seems to help
> 
> and I changed the graphic cards (still have both of them) and it still freezes with the other card as well
> 
> I can play games/bench without any problems OCed or stock....using regular catalyst 13.12...as always I did uninstall, clear with driver sweeper (twice to be sure) and then restart before updating the drivers
> 
> according to gpu-z log it switches to 501/1250.0 and 1.234v when I start the video
> 
> I have aero turned off which I remember used to keep switching my 6950/6970 between 2D clocks and 2D/3D (the clocks kinda halfway between 2D and 3D)
> 
> any ideas?


Try using Sapphire TriXX to see if maybe the OC software you're using is causing the problem. Also, what BIOS are you on?


----------



## headbass

Quote:


> Originally Posted by *ImJJames*
> 
> Try using sapphire trixx to see if maybe the OC software your using is causing the problem. Also what BIOS are you on?


happens even with AB not running (and not being started on OS startup)... or with it running at all stock, just with a custom fan curve

I can play local divx/mp4 files fine (bsplayer+cccp)
also I have the flash hw acceleration turned off iirc, and am using the latest Firefox (non-beta, not Aurora) with flash 10.3

the bios is stock, haven't touched it; using the silent mode (at least I guess that's how they are shipped, the fan never got over 48% when I first tested it)

edit: hmm, I just noticed that Chrome (which uses a different flash version than Firefox) plays YouTube fine and has hw acceleration turned on

edit2: using HTML5 playback it runs ok even in Firefox... you can switch to HTML5 at http://www.youtube.com/html5


----------



## ZealotKi11er

Quote:


> Originally Posted by *BradleyW*
> 
> So due to the architecture, MSI can only show an estimate GPU usage, thus causing the bumps? So there is actually nothing wrong with my usage? Or have I misunderstood?
> Thank you.


Yes, that's pretty much it.


----------



## Widde

Placed an order now for a Sapphire 290. Crossfire, here we go!

Gotta move my hdds down a notch though to fit it ^^ It's tight in an Antec Three Hundred (got all the fan slots populated though).


----------



## Forceman

Quote:


> Originally Posted by *Falkentyne*
> 
> That's vdroop. Combuster is putting a much higher load on the GPU so it drops the voltage to keep thermals and stress in check. This is the EXACT SAME THING CPU's do when you stress them with linpack or Prime95. It's by design, although its more to stop voltage "overshoots" (when load changes).
> 
> If there was no vdroop or if you remove vdroop by using the command line in msi afterburner (check the guru3d forms for the command), it will make your card run MUCH, MUCH hotter. ONLY advisable on subzero.


It sounds like you are talking about two different things. Vdroop is the drop in voltage that occurs under a heavy load, due to the power delivery system not adequately compensating for the increased current draw. Purposely reducing the voltage (and clock speed) to control temps and power is PowerTune. You may be seeing both with Furmark/Kombustor, but there are two different things at play there.
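The distinction above suggests a simple rule of thumb when reading sensor logs: if the clock itself dropped, PowerTune (or thermal throttling) is at work; if the clock held but the voltage sagged, that's vdroop. A hedged sketch — the thresholds are arbitrary picks, not anything AMD documents:

```python
# Rule-of-thumb classifier for the vdroop-vs-PowerTune distinction:
# compare a light-load sample against a heavy-load one.

def classify_drop(light, heavy, clock_tol_mhz=10, v_tol=0.02):
    """light and heavy are (clock_mhz, vddc) samples."""
    clock_fell = light[0] - heavy[0] > clock_tol_mhz
    volt_fell = light[1] - heavy[1] > v_tol
    if clock_fell:
        return "PowerTune/thermal throttle"   # clocks pulled down
    if volt_fell:
        return "vdroop"                       # clocks held, voltage sagged
    return "steady"

print(classify_drop((1000, 1.200), (1000, 1.109)))  # vdroop
print(classify_drop((1000, 1.200), (850, 1.109)))   # PowerTune/thermal throttle
```

The first example mirrors the Kombustor readings reported earlier in the thread (~1.2 V in games vs ~1.109 V under the stress test at held clocks).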
Quote:


> Originally Posted by *Gunderman456*
> 
> Forceman previously indicated that one should tighten by hand the compression fittings over the hoses.
> 
> Yesterday, while I was connecting the hoses to the compression fittings to get that perfect seal, I could not manage enough hand torque to tighten all the way.
> 
> My question then becomes is it ok to not have a perfect seal between the compression fittings when securing the hoses?


You don't need the rings to touch the fitting itself, it's fine for there to be a gap there (unless you are trying to eliminate it for aesthetics). You might want to back the rings off and check to be sure you aren't cutting into the hose by tightening them too much.


----------



## alawadhi3000

I moved the Sapphire R9 290 into my other PC, and welcomed these two leafblowers.
The top card gets too hot and throttles to ~700MHz even with the fan @ 65% and 20C ambient.


----------



## Gumbi

Couple of things:

Make sure your case has very good airflow. You have 550 watts of heat being generated in your case by your cards alone, that needs to get out ASAP. A side fan blowing cool air directly onto the GPUs is a must.

Replace the thermal paste on both cards with some aftermarket paste (Arctic MX-4 usually works best). Make sure to clean the chips properly first (you can look up how to do this online; it's pretty simple). Apply a small amount of MX-4 or similar paste to the middle of the chip (a pea-sized amount should do, even a bit less). Tightening the cooler plate back onto it will spread it out evenly.

Gains of 10 degrees or more can be had from this, and combined with proper airflow, you should be able to minimise (or even eradicate) throttling.


----------



## Sgt Bilko

My 2 XFX cards will be shipping out tomorrow









Should have them installed and running by Tuesday. Will post temps etc then


----------



## beejay

It's been a while; got super busy with wedding preps and stuff. Anyway, got 3 more R9 290Xs. I'll pair one with my other 290X and put the other 2 in my mining rig.

Will post new system pic after I move them to a test bench. Need to pull out my old Noctua D14.


----------



## TommyGunn123

Quote:


> Originally Posted by *headbass*
> 
> damn I noticed today that my pc freezes within 5 seconds after I play any video on youtube....even on everything stock (except custom fan profile in AB)


Try turning off hardware acceleration in the flash settings


----------



## Derpinheimer

Quote:


> Originally Posted by *TommyGunn123*
> 
> Try turning off hardware acceleration in the flash settings


I had the same issue; other video players (VLC, WMP) didn't cause it.

I think hardware accel was only part of the issue, but I'm too fcking dumb to remember what else I changed :/


----------



## kizwan

Quote:


> Originally Posted by *Hattifnatten*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gunderman456*
> 
> Forceman previously indicated that one should tighten by hand the compression fittings over the hoses.
> 
> Yesterday, while I was connecting the hoses to the compression fittings to get that perfect seal, I could not manage enough hand torque to tighten all the way.
> 
> My question then becomes is it ok to not have a perfect seal between the compression fittings when securing the hoses?
> 
> 
> 
> I've had fittings with about 1-2mm space between them for 6 months now, haven't had a problem with them. But do try to tighten them as much as possible, I use two hands for more surface area and thus better grip. Can get them half a rotation or so more by doing that.
Click to expand...

Don't tighten them all the way down, and don't overtighten; they'll be difficult to undo later, and there's also a chance you'll cut the tube if you tighten all the way down. Hand-tight using one hand is more than enough.


----------



## Sazz

Can anyone upload their Sapphire 290X Uber mode BIOS? The one I downloaded from the TechPowerUp database won't let me turn off GPU scaling, but other BIOSes like PT1 and the Asus BIOS work fine and let me turn off GPU scaling.


----------



## Mr357

Quote:


> Originally Posted by *Sazz*
> 
> can anyone upload their Sapphire 290X Uber mode BIOS, coz the one I downloaded from techpowerup database won't let me turn off GPU scaling, but the other BIOS like PT1 and Asus BIOS is working fine and lets me turn off GPU scaling.


For clarification, "Quiet" and "Uber" are just fan profiles, and use the same BIOS.

Why would you want the Sapphire BIOS over the ASUS version anyway?


----------



## tsm106

Quote:


> Originally Posted by *Sazz*
> 
> can anyone upload their Sapphire 290X Uber mode BIOS, coz the one I downloaded from techpowerup database won't let me turn off GPU scaling, but the other BIOS like PT1 and Asus BIOS is working fine and lets me turn off GPU scaling.


 SAPUBER.zip 128k .zip file


Rename to *.rom










Meant .rom but wrote .zip rofl, but you get the picture.
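For anyone following along at home, the whole fix is a rename; the attachment is already a raw BIOS image with the wrong extension. A minimal sketch (filenames taken from the post; the placeholder file and the ATIFlash commands in the comments are illustrative only, not run here):

```python
# Sketch of the fix from the post: the attachment is a raw BIOS image
# mislabeled with a .zip extension, so renaming it is the whole fix.
# The dummy file just keeps this runnable without the real download.
import os

if not os.path.exists("SAPUBER.zip"):
    open("SAPUBER.zip", "wb").close()   # placeholder for the real download

os.replace("SAPUBER.zip", "SAPUBER.rom")

# Flashing would then typically be done from a command prompt with
# ATIFlash, e.g. "atiflash -s 0 backup.rom" to back up the current BIOS
# first, and "atiflash -p 0 SAPUBER.rom" to program adapter 0.
```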


----------



## Sazz

Quote:


> Originally Posted by *Mr357*
> 
> For clarification, "Quiet" and "Uber" are just fan profiles, and use the same BIOS.
> 
> Why would you want the Sapphire BIOS over the ASUS version anyway?


I have a Sapphire card. I used the PT1 BIOS, then just re-downloaded the Sapphire BIOS off TechPowerUp, but for some reason it won't let me turn off GPU scaling while using that downloaded BIOS. The other weird thing is I downloaded a stock 290X ASUS BIOS and that one lets me turn off GPU scaling...

Quote:


> Originally Posted by *tsm106*
> 
> SAPUBER.zip 128k .zip file
> 
> 
> Rename to *.zip


Thanks, imma give this a try.


----------



## tsm106

Quote:


> Originally Posted by *Mr357*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sazz*
> 
> can anyone upload their Sapphire 290X Uber mode BIOS, coz the one I downloaded from techpowerup database won't let me turn off GPU scaling, but the other BIOS like PT1 and Asus BIOS is working fine and lets me turn off GPU scaling.
> 
> 
> 
> *For clarification, "Quiet" and "Uber" are just fan profiles, and use the same BIOS*.
> 
> Why would you want the Sapphire BIOS over the ASUS version anyway?
Click to expand...

That's incorrect. The Uber BIOS not only raises the fan speed, as you mention; it also raises the TDP limit.


----------



## Sazz

Quote:


> Originally Posted by *tsm106*
> 
> That's incorrect. The uber bios not only raises the fan speed as you mention but it also raises the TDP limit.


And Quiet mode locks Overdrive for the GPU I believe.


----------



## Mr357

Quote:


> Originally Posted by *tsm106*
> 
> That's incorrect. The uber bios not only raises the fan speed as you mention but it also raises the TDP limit.


I had no idea. Whoops.


----------



## Sazz

Quote:


> Originally Posted by *tsm106*
> 
> SAPUBER.zip 128k .zip file
> 
> 
> Rename to *.rom
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Meant .rom but wrote .zip rofl, but you get the picture.


Is it just me, or is there a problem? Whenever I try to open it, it says the file may be corrupted...

Never mind, I didn't notice I had to rename it to .rom xD


----------



## kizwan

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mr357*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sazz*
> 
> can anyone upload their Sapphire 290X Uber mode BIOS, coz the one I downloaded from techpowerup database won't let me turn off GPU scaling, but the other BIOS like PT1 and Asus BIOS is working fine and lets me turn off GPU scaling.
> 
> 
> 
> For clarification, "Quiet" and "Uber" are just fan profiles, and use the same BIOS.
> 
> Why would you want the Sapphire BIOS over the ASUS version anyway?
> 
> Click to expand...
> 
> That's incorrect. The uber bios not only raises the fan speed as you mention but it also raises the TDP limit.
Click to expand...

It's also dual BIOS, right, like the 290?


----------



## tsm106

I can try actually zipping it. OCN's server might be changing the hash or something...

Quote:


> nevermind, I didn't notice had to rename it to .rom xD


lol, there it is!

My old 290 blocks have a new home inside MasterT's rig.









Quote:


> Leak tested, for about 5hrs. So far, so good. Temps after a Heaven run are 52(top card)/51 on gpu, and *42vrm1,44vrm2*.


Look at those vrm temps, and he's on a puny rad lol.


----------



## Arizonian

Quote:


> Originally Posted by *wolfej*
> 
> Finally got my loop set up as I was running them on air until I got the new case and other junk.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I'm too lazy to get my fiancee's camera to take a proper picture, but _THAR THEY BLOW!_
> 
> I've gotten them to 1200/1300 with just 100mV offset as I've only been messing around with the clocks for like 30 minutes. Seems to be completely stable.


Congrats - updated









Quote:


> Originally Posted by *MasterT*
> 
> Heya guys. Greeting from the Caribbean. Been following this thread from page 1 since the launch of these cards. Now I'll like to be added. Had loads of help from you guys. Excellent community here. My cards are Asus R9 290 and they are now under water. Temps after a Heaven run are 52(top card)/51 on gpu. More testing to be done.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added







Glad to have you aboard.









Quote:


> Originally Posted by *Timx2*
> 
> Wohoo! I got my Sapphire R290X y'day (ASIC 78%).
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Did a few test benchmarks to see if it wasn't DOA and mounted a EKWB waterblock + backplate on it. I just did a 15 minutes valley run and I do get the following temps:
> 
> GPU: 45°
> VRM1: 51°
> VRM2: 37°
> 
> The valley score was 2527 @ stock settings. I'll be overclocking in a few weeks. I will try the stock settings first for some time.
> 
> At idle the temps are around
> 
> GPU: 30°
> VRM1: 25°
> VRM2: 24°
> 
> Are these temps normal for a watercooled loop? I post 2 pictures to get an idea of my loop!
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added
















Quote:


> Originally Posted by *staryoshi*
> 
> I ordered a MSI Gaming R9 290 @ $470 - with the caveat being that it's temporarily OOS. I'm away from my desktop for a few weeks so I don't particularly mind waiting
> 
> 
> 
> 
> 
> 
> 
> I'm hopeful it will arrive in a few weeks. When it does arrive, expect a thorough Yoshi review like in the days or yore


Right on bud. I know you've been contemplating which move, good times. Awaiting your submission and even more so your review.









Quote:


> Originally Posted by *alawadhi3000*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I moved the Sapphire R9 290 into my other PC, and welcomed these two leafblowers.
> The top card gets too hot and throttles to ~700MHz even with the fan @%65 and 20C Ambient.


Congrats - updated









Quote:


> Originally Posted by *beejay*
> 
> It's been a while, got super busy with wedding preps and stuff. Anyway, got 3 more R9290X. I'll pair one to my other 290x and put the other 2 on my mining rig.
> 
> Will post new system pic after I move them to a test bench. Need to pull out my old Noctua D14.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated to four


----------



## Forceman

Quote:


> Originally Posted by *tsm106*
> 
> That's incorrect. The uber bios not only raises the fan speed as you mention but it also raises the TDP limit.


The only place I've seen that is the Guru3D review. All the other reviews state that the only difference is the fan speed. Do you have another source for that?

In other words, is the reason the card draws more power in uber mode because they raised the power limit, or because the higher fan speed keeps the chip cooler, which allows higher power draw. If you fix both modes to 60% fan speed, for example, is there still a power difference for the uber mode?


----------



## Sazz

Quote:


> Originally Posted by *tsm106*
> 
> I can try actually zipping it. OCN's server might be changing the hash or something...
> lol, there it is!


Hmmm, renaming it to .rom doesn't switch it to a ROM file; it still shows up as a zip file.. T_T

edit:

Crap, again.. nevermind... I am going ****** right now, LOLOL

Anyway, that didn't work... my GPU scaling is still on. Whenever I try to turn it off and click "Apply" it just turns back on (the box gets re-checked right away after clicking Apply)... So I'm at a loss here; with other BIOSes it works fine and I can turn it off.


----------



## tsm106

Quote:


> Originally Posted by *Sazz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I can try actually zipping it. OCN's server might be changing the hash or something...
> lol, there it is!
> 
> 
> 
> hmmm, renaming it as .rom doesn't switch it to a rom file it's still under zip file.. T_T
> 
> edit:
> 
> Crap, again.. nevermind... I am going ****** right now, LOLOL
> 
> Anyways that didn't work... my GPU scaling is still on, whenever I try to turn it off and click "apply" it just gets back on (box gets checked right away after clicking apply)... So I am at loss right here, other BIOS it works fine and I can turn it off.
Click to expand...

It's probably the driver. What OS are you on?

Try this...

https://www.systemshock.org/index.php?topic=4439.0


----------



## Bartouille

What is the stock voltage on your 290Xs?


----------



## Sazz

Quote:


> Originally Posted by *tsm106*
> 
> It's probably the driver. What OS are you on?
> 
> Try this...
> 
> https://www.systemshock.org/index.php?topic=4439.0


Thanks!

Solution 2 worked. I had to choose a smaller resolution (1600x900 in my case), then go back to the display Properties panel in CCC, where the three choices under GPU scaling were no longer grayed out. I chose "maintain aspect ratio", applied it, then disabled GPU scaling and it finally turned off. Afterwards I went back to 1080p and it stayed off.


----------



## quakermaas

Anybody know where I can get a basic BF4 code cheap, for a friend?







PM me.

Sorted, don't need more PMs.


----------



## Sazz

Quote:


> Originally Posted by *quakermaas*
> 
> Anybody know were I will get the basic BF4 code cheap, for a friend ?


AnandTech or HardOCP trade forums; there's a bunch of people selling there. I just sold all of mine last week.


----------



## Adglu

very interesting

Can't wait to see a true Toxic card
http://www.tomshardware.com/reviews/r9-290x-case-performance,3710.html


----------



## kpoeticg

Quote:


> Originally Posted by *hotrod717*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kpoeticg*
> 
> Hey guys, I'm about to grab my first 290x for my Tri 290x build. I'm trying to wait for prices to drop before I grab my other 2. I was thinking of grabbing the PowerColor cuz of its higher Core Clock (1030). I'll be OC'ing & Gaming. Also I plan on having all 3 of my cards under water. Is the PowerColor a good choice? And has anybody started selling cards with strictly Hynix memory yet?
> 
> 
> 
> I would wait a bit. Asus just announced release of DCII. Only known card to come with hynix 100%. EK also will be making waterblock for it.
Click to expand...

Thanx for the response. But I coulda SWORN that the only review I've read so far (HardOCP??) stated that all the 290x DCII's will come only with Elpida memory..



http://www.reviewstudio.net/1190-asus-r9-290x-directcu-ii-oc-review-enjoy-the-silence

"Asus chose Elpida memory for this card, and install them around the GPU. There are no memory chips on the back."


----------



## psyside

Tri X?


----------



## kpoeticg

This is why I thought HardOCP. The Advertisement Screenshot from the HardOCP review



All the mem modules are Elpida


----------



## King4x4

Ghetto mods FTW


----------



## Asrock Extreme7

AMD Mantle boosts Battlefield 4 performance by 45 percent. Bring it on!


----------



## smartdroid

Quote:


> Originally Posted by *King4x4*
> 
> Ghetto mods FTW


Nice build









I just bought my third Sapphire R9 290, and it seems they're cheaping out on the card: no more metal bracket to hold the cooler.


----------



## kizwan

Quote:


> Originally Posted by *smartdroid*
> 
> Nice build
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I just bought my third sapphire R9 290 and it seems they are cheaping out on the card, no more metal bracket to hold the cooler


What metal bracket?


----------



## smartdroid

The one on the back of the PCB that holds the stock cooler.


----------



## BradleyW

Quote:


> Originally Posted by *smartdroid*
> 
> the one on the back of the pcb that holds the stock cooler.


I have those on my Gigabyte cards. I expected more from Sapphire. I remember when XFX went cheap on the 4890 and it reduced FPS by at least 10 on average. It was horrid.


----------



## VSG

I am confused here, could either of you take a photo of the bracket you refer to?


----------



## BradleyW

Quote:


> Originally Posted by *geggeg*
> 
> I am confused here, could either of you take a photo of the bracket you refer to?




Look at the AMD card. Can you see the backplate?


----------



## wand3r3r

I may as well join the club.


----------



## VSG

Why is your GPU-Z also detecting the Intel graphics?


----------



## jerrolds

Maybe he's using Lucid virtual monitors or something, or hasn't disabled it, heh.


----------



## Arizonian

Quote:


> Originally Posted by *wand3r3r*
> 
> I may as well join the club.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## bond32

I can't for the life of me get any decent clocks on my 290X... I pretty much use BF4 for stability testing, since I play for hours (many hours) until school starts. Around 1120 on the core and 1350 on the memory seems to be my max. On water too, and temps are all in check. Any higher and I get a black screen, or the display goes wonky and a hard reset is the only way to fix it. Adding voltage doesn't seem to change anything...

I put in a request about 2 months ago to exchange it on Amazon. They only refunded me since stock seems so limited, but I think I'll just keep it, as I paid the real price of $550 and not some $2345782395273.99-for-one-290X price.

Those of you with good clock rates, do you play BF4? Any issues? I'm on Windows 7, using the latest drivers. I've considered going to 8.1; I hoped it might give me more OC room, maybe?


----------



## Mr357

Quote:


> Originally Posted by *bond32*
> 
> I can't for the life of me get any decent clocks on my 290x... Pretty much use BF4 for stability since I play for hours (many hours) until school starts. Seems around 1120 on core, 1350 mem is my max. On water too, temps are all in check. Any higher and I get black screen, or the display goes wonky and a hard reset is the only way to fix it. Adding voltage doesn't seem to change anything...
> 
> I put in about 2 months ago to exchange it on amazon. They only refunded me since stock seems so limited, but I think i'll just keep it as I payed the real price of $550 not some $2345782395273.99 for one 290x price.
> 
> Those of you with good clock rates, do you play bf4? Any issues? I'm on windows 7, using latest drivers. I've considered going to 8.1, hoped it might give me more OC room maybe?


What BIOS are you on? Maybe it's not getting enough juice?


----------



## jerrolds

I also have a Sapphire R9 290X (a day-one card); I flashed to one of the first ASUS BIOSes and use GPU Tweak to overclock.

With an ASIC of 75% (not that it matters), using a 3rd party air cooler I was able to run as high as 1225/1600 but was limited by VRM1 temps. My BF4 clocks are 1170/1500 at 1.37v (~1.28-1.32 after vdroop).

Have you tried flashing the bios and using something other than AB or CCC?

Powertarget +50%
Fan speed 100%, which is only about 21db.


----------



## bond32

I've tried quite a few BIOSes; didn't have much luck, but I can try again. Tried PT1 and PT3, didn't gain anything, so went back to stock.

I really don't expect any crazy clocks, but I would like something in the range you got, jerrolds. I would be happy with 1170/1500, but I get black screens, seemingly as soon as I raise the memory clock over the 1350 range.


----------



## jerrolds

Weird... at high overclocks I don't get black screens, just the usual artifacting/crashing when hitting high temps. The only time it crashes off the bat is when I do something crazy like 1250MHz w/ 1.412v (the max with GPU Tweak and the ASUS BIOS).

I'm not sure if the newest WHQL has the "black screen" fix... does it?

How soon does it black screen? Does it happen in all apps?


----------



## Sgt Bilko

http://www.tomshardware.co.uk/visiontek-radeon-r9-290-cryovenom,news-46819.html

VisionTek R9 290, pre-clocked to 1175MHz... that's impressive for a card straight from the factory.


----------



## bond32

Quote:


> Originally Posted by *jerrolds*
> 
> Weird...at high overclocks i dont get blackscreens, just the usual artifacting/crashing when hitting high temps. Only time it crashes off the bat is when i do something crazy like 1250mhz w/ 1.412v (max with GPU Tweak and ASUS bios)
> 
> I'm not sure if the newest WHQL has the "blackscreen" fix...does it?
> 
> How soon does it blackscreen? This happens on all apps?


I only play BF4, so not sure about other apps. I ran some benches and it didn't happen, but that was with a minor overclock. They claimed it had the black screen fix, but I saw others here still had it too. Maybe it is still a driver issue, not sure.


----------



## hotrod717

Quote:


> Originally Posted by *bloodkil93*
> 
> I'm not being funny or anything, but my Sapphire R9 290X has Elpida memory and has managed to clock at 1500 stable and could still go go higher if I wanted too, understandably there were Blackscreen issues, but they can't be that bad?


With 7970's, hynix were capable of 1900+

Quote:


> Originally Posted by *kpoeticg*
> 
> Thanx for the response. But I coulda SWORN that the only review I've read so far (HardOCP??) stated that all the 290x DCII's will come only with Elpida memory..
> 
> 
> 
> http://www.reviewstudio.net/1190-asus-r9-290x-directcu-ii-oc-review-enjoy-the-silence
> "Asus chose Elpida memory for this card, and install them around the GPU. There are no memory chips on the back."


Damn, you are absolutely right. I checked your source and found those pics as well. I thought I had read another article stating they were using Hynix. Last time around, on the 7970s, both the DCII and the Matrix came exclusively with Hynix. Not sure what the deal is this time. Is Elpida cheaper, or does it perform better than Hynix in a 4GB config? I believe it's a bad choice on their part. Hopefully some card will come exclusively with Hynix; whatever that card might be, they're going to sell a ton of them.
As for all these Hynix vs. Elpida debates: Hynix is believed to be better because it was capable of 1900MHz+ on the 7970s, and I haven't seen any Elpida posting OCs like that. If a card is released with memory voltage control and with both Elpida and Hynix variants, we'll see.


----------



## bond32

To recap, here are the things I just tried:
- Installed the PT1 BIOS
- Played BF4 with the core clocked at 1100 and +50% power = DirectX error
- Just for kicks, clocked the memory at 1400 = black screen

Maybe I should try gpu tweak instead?


----------



## wand3r3r

Quote:


> Originally Posted by *bond32*
> 
> I can't for the life of me get any decent clocks on my 290x... Pretty much use BF4 for stability since I play for hours (many hours) until school starts. Seems around 1120 on core, 1350 mem is my max. On water too, temps are all in check. Any higher and I get black screen, or the display goes wonky and a hard reset is the only way to fix it. Adding voltage doesn't seem to change anything...
> 
> I put in about 2 months ago to exchange it on amazon. They only refunded me since stock seems so limited, but I think i'll just keep it as I payed the real price of $550 not some $2345782395273.99 for one 290x price.
> 
> Those of you with good clock rates, do you play bf4? Any issues? I'm on windows 7, using latest drivers. I've considered going to 8.1, hoped it might give me more OC room maybe?


I don't know if 8.1 will matter for OC ability, but it should help your FPS, assuming you play MP.
Quote:


> Rest assured, this is not a benchmark we enjoy running, and frankly, in future articles, we probably won't use the Siege of Shanghai level. It's too chaotic, and performance changes far too much depending on where your character is on the map. But there's no denying that based on the number of runs we conducted, it's obvious that Windows 8.1 is simply the better operating system for this game. In fact, it seems that Windows 8.1 on its own provides more of a benefit in this game than our 22 percent CPU overclock, and that's saying something. Also keep in mind that at many times during our benchmark runs, our video card was pegged at 99% utilization, meaning that this is not strictly a boost in CPU performance. Windows 8.1 may be working some kind of magic on both the CPU and GPU pipelines in this game. Given the rumors at the time of its release that Battlefield 4 was the first game to truly leverage the enhanced efficiency of Windows 8.1, it's possible that this game is in fact a window, so to speak, into the future of gaming performance under the new Windows OS.


http://www.techbuyersguru.com/windowsgaming4.php


----------



## jerrolds

Crashing at 1100MHz? Wth... I always thought all 290Xs could hit that on stock volts/cooler.

It crashed at 1400MHz memory and stock clocks? You're raising voltages, right?









Try the ASUS BIOS, maybe with GPU Tweak? I don't know; kinda sounds like a bad overclocker.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *bond32*
> 
> To recap, here are the things I just tried:
> - Installed PT1 bios
> - Played bf4, clocked core at 1100 and +50% = direct X error
> - For siggle and ghits, clocked memory at 1400 = black screen
> 
> Maybe I should try gpu tweak instead?


Are you increasing voltage offset at all? If so how high have you tried? Don't be scared to at least try up to +100mv


----------



## Gunderman456

Thanks Hattifnatten, kizwan and Forceman. Yes, I also considered aesthetics for this. The tubes are very thick, so there's no danger of slicing through them. Man, my first custom loop and I've learned so much already.

I've also realized that while many of us tend to cover the bigger things in our how-tos, we often don't talk about the little things, like "take care to hand-tighten all compression fittings to avoid metal shavings, damaged waterblock threads, damaged O-rings and perforated tubes".

I had tightened using a wrench, and luckily no damage was done other than the metal shavings, which I had to clean, but next time I will be hand-tightening everything.

Your combined advice will be included in my new "The Hawaiian Heat Wave" build log.


----------



## bond32

Quote:


> Originally Posted by *jerrolds*
> 
> Crashing at 1100mhz? Wth....i always thought all 290X can hit that on stock volts/cooler.
> 
> It crashed at 1400mhz memory and stock clocks? Youre raising voltages right?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Try ASUS bios maybe w/ GPU Tweak? I dont know - kinda sounds like a bad overclocker.


Just installed it... didn't change the clocks at all, still getting a DirectX error.... WTH


----------



## jerrolds

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Are you increasing voltage offset at all? If so how high have you tried? Don't be scared to at least try up to +100mv


I've pushed to the ASUS BIOS max of 1412mV in GPU Tweak; it worked fine and crashed when I hit the thermal barrier on VRM1 (GPU temp was still < 70C). So it should be fine as long as temps are in check, but I'm currently at 1360mV.


----------



## jerrolds

Quote:


> Originally Posted by *bond32*
> 
> Just installed it... Didn't change clocks at all, still getting a directX error.... WTH


Lol - well judging by your avatar..maybe you still have some nvidia artifacts lying around? Have you tried doing a safe mode DDU run?


----------



## bond32

Quote:


> Originally Posted by *jerrolds*
> 
> Lol - well judging by your avatar..maybe you still have some nvidia artifacts lying around? Have you tried doing a safe mode DDU run?


Ha, no, I always run that with drivers.

One thing I forgot to mention, and I think this is the issue: I play at 1440p @ 110Hz. Since setting it back to 60Hz, no wonky display issues so far.


----------



## jerrolds

I run BF4 at [email protected] without issue, so im not sure thats it. Are your CRU values good?


----------



## bond32

Quote:


> Originally Posted by *jerrolds*
> 
> I run BF4 at [email protected] without issue, so im not sure thats it. Are your CRU values good?


Probably not, but to eliminate that as an issue I'm going to just play at 60hz for now...

BTW thanks for the help!


----------



## nagle3092

So what's the latest AMD driver, 13.12 or 13.11 beta 9.5? I got a reference Gigabyte coming in today that I scored for $534, new but with a damaged box, so I'll see how it goes.


----------



## the9quad

Quote:


> Originally Posted by *nagle3092*
> 
> So whats the latest AMD driver? 13.12 or 13.11beta9.5? I got a reference gigabyte coming in today that I scored for $534, new but damaged box so I'll see how it goes.


afaik, the 13.12 are just whql certified 13.11 beta 9.5's


----------



## nagle3092

Quote:


> Originally Posted by *the9quad*
> 
> afaik, the 13.12 are just whql certified 13.11 beta 9.5's


Thanks. Is AMD still using CAPs? Don't see them on the site...

It's been a while since I had a Radeon.


----------



## the9quad

Quote:


> Originally Posted by *nagle3092*
> 
> Thanks, AMD still using CAPs? Dont see them on the site...
> 
> Its been awhile since I had a radeon.


I think they are built in now.


----------



## VSG

Ya, they've been built into the drivers for a while now. I just updated my laptop 7970M drivers to confirm.


----------



## BradleyW

Quote:


> Originally Posted by *nagle3092*
> 
> Thanks, AMD still using CAPs? Dont see them on the site...
> 
> Its been awhile since I had a radeon.


Profiles are included within the drivers themselves. Profiles still under development are programmed directly into the driver code until a final profile is formulated. AMD pumps out new variations of their betas all the time. I don't think AMD will ever go back to CAP patches given this change in approach.


----------



## bond32

Quote:


> Originally Posted by *jerrolds*
> 
> I run BF4 at [email protected] without issue, so im not sure thats it. Are your CRU values good?


I think that may have been my issue... Also I backed off on my CPU OC; so far so good. Running 1150/5300 atm at 1.273V on the PT1 BIOS. Temps are all in check, BF4 seems to run fine.... Wow, I feel dumb.

Should have backed off the refresh rate long ago.


----------



## Redvineal

Quote:


> Originally Posted by *jerrolds*
> 
> Crashing at 1100mhz? Wth....i always thought all 290X can hit that on stock volts/cooler.
> 
> It crashed at 1400mhz memory and stock clocks? Youre raising voltages right?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Try ASUS bios maybe w/ GPU Tweak? I dont know - kinda sounds like a bad overclocker.


Not sure if it's apples to apples, but my R9 290 (non-X) can run BF4 at up to 1085/1285 on stock volts without issue.

I started out at 1100/1300 and, after occasional DX errors (core) and black screens (memory), dialed it back 5MHz at a time until I reached stability.
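That start-high, step-down approach can be sketched as a tiny search loop. Here `is_stable` is a hypothetical stand-in for a real stress run (e.g. a long BF4 session), passed in by the caller:

```python
# Sketch of the "dial it back by 5 until stable" approach.
# is_stable is a hypothetical predicate standing in for an actual
# stress test; in practice each call would be a lengthy gaming session.

def find_stable_clock(start_mhz, floor_mhz, step_mhz, is_stable):
    """Walk down from start_mhz in step_mhz decrements until is_stable
    passes; give up and return None once we drop below floor_mhz."""
    clock = start_mhz
    while clock >= floor_mhz:
        if is_stable(clock):
            return clock
        clock -= step_mhz
    return None

# Example: pretend this particular card is stable at or below 1085 MHz.
print(find_stable_clock(1100, 1000, 5, lambda c: c <= 1085))  # 1085
```

The same loop works for the memory clock; core and memory are best dialed in separately so a crash can be attributed to one of them.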


----------



## nagle3092

Quote:


> Originally Posted by *the9quad*
> 
> I think they are built in now.


Quote:


> Originally Posted by *geggeg*
> 
> Ya, they have been built into the drivers for a while now. I just updated my laptop 7970m drivers to confirm


Quote:


> Originally Posted by *BradleyW*
> 
> Profiles are included within the actual drivers. Profiles that are developing are directly programmed within the drivers code until a profile is formulated. AMD pump out new variations of their Beta's all the time. I don't think AMD will ever go back to CAPS patches due to this change in protocol.


That's good to know, thanks guys. Going to be interesting having a Radeon again; looking forward to it. Now hopefully the "damaged box" card isn't dead.


----------



## bloodkil93

Quote:


> Originally Posted by *jerrolds*
> 
> I run BF4 at [email protected] without issue, so I'm not sure that's it. Are your CRU values good?


Forgive my ignorance, but what are CRU values?

And on another side note: I can only push my R9 290X to 1130/1550 with a +20% power limit and +88 mV. If I go any higher on the GPU than that, I start getting artifacts. Could I have just got a dud overclocker, or is there something else wrong?


----------



## jrcbandit

Quote:


> Originally Posted by *bloodkil93*
> 
> Forgive my ignorance, but what are CRU values?
> 
> And on another side note, I can only push 1130/1550 on my R9 290X with +20% Power Limit and +88 mV, if I go any higher on the GPU than that I start getting artifacts, could I have just got a dud overclocker, or is there something else wrong?


Try lowering your memory clock; it doesn't add much performance. You definitely should be able to get above 1200 MHz core, but it can require +100 or even +150 mV to be stable. I can benchmark at 1250 core / 1500 memory without any artifacts, but it will artifact in games when I run my 1440p monitor at 96+ Hz. When gaming I turn the memory down to 1400 MHz and the core to 1230 MHz. I use +150 mV core, as +100 was not enough (I haven't thoroughly tested in-between values).


----------



## jerrolds

CRU - Custom Resolution Utility, by ToastyX. There should be tutorials on how to set the timings in the monitor thread, should you wish to overclock your panel (if it's possible). Afaik no one has gotten better timings than my [email protected] 1440p: http://www.overclock.net/t/1384767/official-the-korean-pls-monitor-club-qnix-x-star/6910#post_20798547


----------



## bond32

Quote:


> Originally Posted by *jrcbandit*
> 
> Try lowering your memory clock; it doesn't add much performance. You definitely should be able to get above 1200 MHz core, but it can require +100 or even +150 mV to be stable. I can benchmark at 1250 core / 1500 memory without any artifacts, but it will artifact in games when I run my 1440p monitor at 96+ Hz. When gaming I turn the memory down to 1400 MHz and the core to 1230 MHz. I use +150 mV core, as +100 was not enough (I haven't thoroughly tested in-between values).


Which card/bios are you on?


----------



## jrcbandit

Quote:


> Originally Posted by *bond32*
> 
> Which card/bios are you on?


Sapphire 290X, using the Asus BIOS and TriXX to overclock. The TriXX beta even allows +200 mV.


----------



## Bartouille

Does anyone else sometimes get a complete system shutdown when overclocking the core? I'm running +100 mV with a 75% fan; the VRMs are below 70°C and the core is around 80°C. No memory OC, just core.


----------



## bond32

Quote:


> Originally Posted by *Bartouille*
> 
> Does anyone else sometimes get a complete system shutdown when overclocking the core? I'm running +100 mV with a 75% fan; the VRMs are below 70°C and the core is around 80°C. No memory OC, just core.


I believe that indicates your CPU/memory may not be stable. Have you seen it shut down at stock GPU clocks too?


----------



## Bartouille

Quote:


> Originally Posted by *bond32*
> 
> I believe that indicates your cpu/memory may not be stable. Have you seen it shut down at stock gpu clocks too?


I doubt it's CPU/memory. My overclock is pretty solid, and this has only happened with the core voltage at +100 mV. I haven't played much at stock clocks, but it didn't happen there, only at +100.

Also, it doesn't BSOD, and there's no BSOD code when I restart (Windows usually tells you that you got a BSOD and shows the code). Very weird!


----------



## Durvelle27

Quote:


> Originally Posted by *Bartouille*
> 
> Does anyone else sometimes get a complete system shutdown when overclocking the core? I'm running +100 mV with a 75% fan; the VRMs are below 70°C and the core is around 80°C. No memory OC, just core.


I also had this problem with my TX850 when OCing my 290 up to 1230/1450 at 1.35 V. I could never figure out what caused it, but I think it was OCP.


----------



## headbass

Quote:


> Originally Posted by *jerrolds*
> 
> I also have a Sapphire R9 290X (a day one card) - and i flashed to one of the first ASUS BIOs and using GPU Tweak to overclock.
> 
> With an ASIC of 75% (not that it matters) - using a 3rd party air cooler i was able to run as high as 1225/1600 but was limited by VRM1 temps. My BF4 clocks are 1170/1500 at 1.37v (~1.28-1.32 after vdroop).
> 
> Have you tried flashing the bios and using something other than AB or CCC?
> 
> Powertarget +50%
> Fan speed 100%, which is only about 21db.


Does the Asus BIOS have constant voltage? Or how did you find out your voltage before vdroop? I can get the after-vdroop voltage range from GPU-Z readings, but I have no idea what the base voltage before vdroop is supposed to be.


----------



## battleaxe

Quote:


> Originally Posted by *Bartouille*
> 
> Does anyone else sometimes get a complete system shutdown when overclocking the core? I'm running +100 mV with a 75% fan; the VRMs are below 70°C and the core is around 80°C. No memory OC, just core.


Possibly the memory is too high.


----------



## psyside

Or his PSU is weak.


----------



## Derpinheimer

Quote:


> Originally Posted by *Bartouille*
> 
> I doubt it's CPU/memory. My overclock is pretty solid, and this has only happened with the core voltage at +100 mV. I haven't played much at stock clocks, but it didn't happen there, only at +100.
> 
> Also, it doesn't BSOD, and there's no BSOD code when I restart (Windows usually tells you that you got a BSOD and shows the code). Very weird!


Does the screen go black, or does the system actually shut down and cut power?


----------



## Bartouille

I really don't know what's going on here. For now I've put everything back to stock except the GPU. Maybe my 290X just can't handle +100 mV. I was playing Far Cry 3 for about 15 minutes with no problems and no artifacts at 1160 MHz, and then poof, everything goes down and restarts - no BSOD or anything, even though I started Windows with "no restart on BSOD". Looks like this is OCP or something; I guess I'll have to figure it out.


----------



## Bartouille

Quote:


> Originally Posted by *Derpinheimer*
> 
> Does the screen go black, or does the system actually shut down and cut power?


I know about the screen going black; that happens when OCing memory. In this crash, everything goes down (case fans, etc.) and it restarts.


----------



## Derpinheimer

Quote:


> Originally Posted by *Bartouille*
> 
> I know about the screen going black; that happens when OCing memory. In this crash, everything goes down (case fans, etc.) and it restarts.


Actually that wasn't what I meant - I have an issue where the screen fades to black when overvolting, heh.

If power is cutting, it's almost certainly the PSU. Mine would cut power when I tried to mine on a 7950 and 290(X).


----------



## Bartouille

I'm back to stock CPU speed. So far so good. Looks like 10 passes of IBT at maximum difficulty doesn't cut it.


----------



## axizor

Got a new PowerColor BF4-edition 290X from an eBay seller for $500 Buy It Now. Great success!


----------



## stickg1

Hmm, it's been a while since a Tom's article gave me happy pants: http://www.tomshardware.com/news/visiontek-radeon-r9-290-cryovenom,25621.html


----------



## taem

Do all 290/X cards support UEFI? Sapphire advertises this prominently and Asus has support, but I can't find out whether MSI cards do. All I see on Google is users requesting BIOS uploads with UEFI support. I've preordered an MSI 290 Gaming, and I'm wondering if I should switch.

Also, what is the actual length of the WindForce 290? Gigabyte's site says 294mm, but a comment in one of the reviews says it's 280mm. I'm limited to around 283mm in my case, which is why I can't get the Sapphire Tri-X or the Asus.


----------



## Sgt Bilko

Quote:


> Originally Posted by *stickg1*
> 
> Hmm it's been a while since a Tom's article gave me happy pants: http://www.tomshardware.com/news/visiontek-radeon-r9-290-cryovenom,25621.html


I posted this a little earlier, but yeah... 1175 MHz factory clock... very nice.


----------



## stickg1

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I posted this a little earlier but yeah....1175Mhz factory clock.....very nice


My bad! I'm still a few hundred posts behind. This thread moves quickly.


----------



## Sgt Bilko

Quote:


> Originally Posted by *stickg1*
> 
> My bad! I'm still a few hundred posts behind. This thread moves quickly.


It's no problem.

This thread is part of my daily reading now... otherwise I might miss something.


----------



## Widde

Damn, my PSU is coming after my second card.

Any chance of running two 290s at stock, with an OC'd 3570K at 4.5 GHz, on a Cooler Master 720 W Bronze? I have 3 HDDs as well.

6 hours before I can go get my card.


----------



## jerrolds

At 720 W it'll be really close. I suppose you can try it.

The system will just turn off if the PSU isn't enough... right?


----------



## Widde

Quote:


> Originally Posted by *jerrolds*
> 
> At 720 W it'll be really close. I suppose you can try it.
> 
> The system will just turn off if the PSU isn't enough... right?


Hope so.

Otherwise I'll have to wait until Monday, when I get my new PSU, a Seasonic 1000 W Platinum.

I assume the power target is going to stay untouched if I try with this PSU. I could try jumpering my old 650 W Corsair as well.


----------



## bond32

Well, I got a successful Fire Strike run with no artifacts: http://www.3dmark.com/3dm/2159006

This is at 1175/1350, 1.375 V, +50%. Success, although it seems weak compared to some of your scores...


----------



## BradleyW

Quote:


> Originally Posted by *jerrolds*
> 
> At 720 W it'll be really close. I suppose you can try it.
> 
> The system will just turn off if the PSU isn't enough... right?


Things overheat and throttle when the PSU can't handle it.


----------



## psyside

Quote:


> Originally Posted by *Widde*
> 
> Hope so
> 
> 
> 
> 
> 
> 
> 
> Else i'll have to wait untill monday when I get my new psu a Seasonic 1000w platinum
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Would assume power target is gonna stay untouched if i'm gonna try with this psu, could try and jumper my old 650w corsair aswell


Undervolt - done.


----------



## VSG

I will just leave this here: Does Radeon R9 290X Behave Any Differently In A Closed Case?


----------



## bond32

Oh wow, forgive my excitement, but: http://www.3dmark.com/3dm/2159072

That's quite a jump. I kept the 1175 core and moved the memory clock to 1500. I'm really pleased.


----------



## GenoOCAU

Quote:


> Originally Posted by *geggeg*
> 
> I will just leave this here: Does Radeon R9 290X Behave Any Differently In A Closed Case?


Makes me glad I'm on water.

Side note - Sapphire knocked it out of the park.


----------



## Forceman

Quote:


> Originally Posted by *geggeg*
> 
> I will just leave this here: Does Radeon R9 290X Behave Any Differently In A Closed Case?


Those speed drops are pretty unacceptable though - they really need to dial back PowerTune or something.


----------



## Arizonian

*Announcement*

Seeking someone to take on thread starter role at the *xXCrossXFire ClubXx --Because one's not enough *

Anyone interested, please PM me.


----------



## Sazz

Quote:


> Originally Posted by *Forceman*
> 
> Those speed drops are pretty unacceptable though - they really need to dial back Powertune or something.


Adding 10% to the board power limit when I had the stock cooler reduced the clock drops to a minimum, and when it did drop, it was barely 10-50 MHz for a split second.

Right now I'm under watercooling at stock clocks w/ a +10% board power limit, and I pretty much flatline at my set clocks unless my GPU load drops significantly.


----------



## kelvinchen

R9 290 Crossfire Test stock voltage oc to 1190mhz.


----------



## Arizonian

Quote:


> Originally Posted by *kelvinchen*
> 
> R9 290 Crossfire Test stock voltage oc to 1190mhz.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Forceman

Quote:


> Originally Posted by *Sazz*
> 
> Adding 10% to the board power limit when I had the stock cooler reduced the clock drops to a minimum, and when it did drop, it was barely 10-50 MHz for a split second.
> 
> Right now I'm under watercooling at stock clocks w/ a +10% board power limit, and I pretty much flatline at my set clocks unless my GPU load drops significantly.


Yeah, mine flatlines also, but you would expect a little more consistency out of the box. The power use is even more chaotic - pretty big swings in the space of a few milliseconds.


----------



## Sazz

Quote:


> Originally Posted by *Forceman*
> 
> Yeah, mine is flatline also, but you would expect a little more consistency out of the box. The power use is even more chaotic. Pretty big swings in the space of a few milliseconds.


Well, remember that it is designed to adapt to the load within milliseconds, so seeing rapid changes in power draw at this timeframe is expected. In fact, on previous video cards the clocks would remain at their set maximum even when the card wasn't under 100% load.

On the 290/290X, if the GPU load is under a certain percentage (I can't figure out the exact number), or if the GPU doesn't need to run at 100% power for the task at hand, the clocks automatically adjust to the load. I noticed this playing NFS World with V-Sync on at 60 FPS: my clocks would only go to 800-ish MHz, probably because the card doesn't need all that power to hit 60 FPS. Once I turned V-Sync off, the clocks went to the designated clock I set, because the card was free to render as many frames as it can.
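The load-following behavior described above can be sketched as a toy model. Everything here is illustrative: the `effective_clock` function, the base/set clock numbers, and the linear scaling are assumptions made up for the example, not AMD's actual PowerTune algorithm.

```python
# Toy model of load-proportional clocking: under a frame cap the GPU
# only needs a fraction of its budget, so the clock settles well below
# the set maximum. Numbers are illustrative, not AMD's real behavior.

def effective_clock(set_clock_mhz, base_clock_mhz, gpu_load):
    """Scale between a low base clock and the set clock with load."""
    gpu_load = max(0.0, min(1.0, gpu_load))  # clamp to [0, 1]
    return base_clock_mhz + (set_clock_mhz - base_clock_mhz) * gpu_load

# V-Sync off: card renders as fast as it can (full load) -> set clock.
print(effective_clock(1100, 300, 1.0))   # 1100.0
# V-Sync at 60 FPS: load falls, clock settles around 800 MHz.
print(effective_clock(1100, 300, 0.62))  # ~796
```

This also matches why the measured power draw swings within milliseconds: the clock (and with it the draw) tracks instantaneous load rather than holding a fixed state.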


----------



## Forceman

Quote:


> Originally Posted by *Sazz*
> 
> Well, remember that it is designed to adapt to the load within milliseconds, so seeing rapid changes in power draw at this timeframe is expected. In fact, on previous video cards the clocks would remain at their set maximum even when the card wasn't under 100% load.
> 
> On the 290/290X, if the GPU load is under a certain percentage (I can't figure out the exact number), or if the GPU doesn't need to run at 100% power for the task at hand, the clocks automatically adjust to the load. I noticed this playing NFS World with V-Sync on at 60 FPS: my clocks would only go to 800-ish MHz, probably because the card doesn't need all that power to hit 60 FPS. Once I turned V-Sync off, the clocks went to the designated clock I set, because the card was free to render as many frames as it can.


I understand the concept; I'm just surprised to see such big swings in such a small period of time. I'd like to see a millisecond-level trace like that from an Nvidia card (or a CPU). You'd think they'd try to smooth it out a little - a 200 W swing in power draw within a millisecond, and a dozen of them in 100 ms, seems like a lot (but maybe that's just my analog thinking).


----------



## Sazz

Quote:


> Originally Posted by *Forceman*
> 
> I understand the concept, I'm just surprised to see such big swings in such a small period of time. I'd like to see a millisecond output like that from an Nvidia card (or a CPU). You'd think they'd try to smooth it out a little, a 200W power draw swing in a millisecond, and a dozen of them in 100 ms, seems like a lot (but maybe that's just my analog thinking).


Yeah, when you think about it and look back, they did mention this kind of behavior when they launched the card. It was meant to be like that, so the power is there when it's needed and the card runs low-power when it's not. I wonder if using "Force Constant Voltage" would outright make the voltage flatline as well.


----------



## Forceman

Those kinds of power swings may also explain why there is so much vdroop; I'd imagine overshoot would be a problem with that kind of sudden drop in power draw.


----------



## Widde

Crammed in the 2nd card now: http://piclair.com/ago8h

Did a 3DMark run at stock: http://piclair.com/qzfzn - I don't know if that's a good score; I got around 11k before, but that was with a 1100/1500 OC on the card.

I also seem to get less FPS in BF4; might be due to the PSU being too weak.


----------



## Sazz

Quote:


> Originally Posted by *Widde*
> 
> Crammed in the 2nd card now
> 
> 
> 
> 
> 
> 
> 
> http://piclair.com/ago8h . Did a 3dmark at stock http://piclair.com/qzfzn dont know if that's a good score, got around 11k before but that was with 1100/1500 oc on the card
> 
> Also seem to get less fps in bf4, might be due to the psu being to weak


Or the game itself. I do remember reading about the new patch BF4 had; that patch made multi-GPU configurations get less performance instead of more... yeah, they are screwing up BF4 big time.


----------



## kizwan

Quote:


> Originally Posted by *Widde*
> 
> Crammed in the 2nd card now
> 
> 
> 
> 
> 
> 
> 
> http://piclair.com/ago8h . Did a 3dmark at stock http://piclair.com/qzfzn dont know if that's a good score, got around 11k before but that was with 1100/1500 oc on the card
> 
> Also seem to get less fps in bf4, might be due to the psu being to weak


Your CFX (graphics) score is slightly lower than mine, but that's probably because of the driver. You may need to do a fresh driver installation.

http://www.3dmark.com/fs/1363682

I doubt it's because of a weak PSU; more likely a driver issue than anything else. At what resolution are you playing BF4? Resolution scaling?


----------



## Widde

Quote:


> Originally Posted by *kizwan*
> 
> Your CFX score (graphic score) is slightly lower than mine but probably because of the driver. You may need to do fresh driver installation.
> 
> http://www.3dmark.com/fs/1363682
> 
> I doubt it's because of weak PSU. More like driver issue than anything else. At what resolution you're playing BF4? Resolution scaling?


I noticed my PSU doesn't even get remotely warm when playing BF4. Might try a fresh driver install; I didn't touch anything when I put the other card in. Any other game I can try to stress these cards with?

Might be CPU-limited now though, 3570K at 4.5.


----------



## fragamemnon

Well, I should've posted an application here a lot earlier.

Anyway:
PowerColor R9 290
http://www.techpowerup.com/gpuz/z2ksp/

It did not unlock; the chip is 1338, with Elpida memory.

The core clock has been stable at frequencies up to 1150 MHz; I haven't pushed it harder yet.

For the time being, I'm using the stock cooler/heater, _however..._

I will hopefully have one of these soon:
kryographics Hawaii for Radeon R9 290X and 290 acrylic glass edition, nickel plated version


Spoiler: you might also want to add it to the OP, it's adorable. There are also:



kryographics Hawaii for Radeon R9 290X and 290 acrylic glass edition
kryographics Hawaii for Radeon R9 290X and 290 black edition, nickel plated version
kryographics Hawaii for Radeon R9 290X and 290 black edition



I haven't pushed the card under any games/benchmarks yet, but things are about to change once the block comes here.
I only mine on it for now


----------



## Adglu

Quote:


> Originally Posted by *Forceman*
> 
> Those speed drops are pretty unacceptable though - they really need to dial back Powertune or something.


It's the Metro LL benchmark, so I don't see any significant drops apart from between the loops.


----------



## kizwan

Quote:


> Originally Posted by *Widde*
> 
> Noticed my psu doesnt even get remotely warm when playing bf4, Might try a fresh driver install, didnt touch anything when i put the other card in. Any other game i can try and stress these cards with?
> 
> 
> 
> 
> 
> 
> 
> Might be cpu limited now though, 3570k at 4.5


Can you play BF4 and monitor the CPU usage using your favourite monitoring software? Mine, with a 3820 @ 4.75 GHz + R9 290 CFX @ stock clocks, BF4 High settings @ 1080p + 200% scaling:

- CPU usage in the 80s (%)
- 80 - 100 FPS outdoors

The CPU usage figure was taken after playing BF4, so it may not be accurate (taking into account spikes when the game is loading, etc).
Quote:


> Originally Posted by *fragamemnon*
> 
> Well I should've posted an application here a lot earlier.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway,
> PowerColor R9 290
> http://www.techpowerup.com/gpuz/z2ksp/
> 
> Did not unlock, chip is 1338
> Elpida memory
> 
> Core clock has been stable at frequencies up to 1150MHz, haven't pushed it harder yet.
> 
> For the time being, I'm using the stock coolerheater, _however..._
> 
> I will hopefully have one of these soon:
> kryographics Hawaii for Radeon R9 290X and 290 acrylic glass edition, nickel plated version
> 
> 
> Spoiler: you might also want to add it to the OP, it's adorable. There are also:
> 
> 
> 
> kryographics Hawaii for Radeon R9 290X and 290 acrylic glass edition
> kryographics Hawaii for Radeon R9 290X and 290 black edition, nickel plated version
> kryographics Hawaii for Radeon R9 290X and 290 black edition
> 
> 
> 
> I haven't pushed the card under any games/benchmarks yet, but things are about to change once the block comes here.
> I only mine on it for now


If you're going to get the AC kryographics block, make sure you get the active backplate too.


----------



## fragamemnon

Quote:


> Originally Posted by *kizwan*
> 
> If you're going to get AC Kryographics block, make sure you get active backplate too.


Yeah, I've considered it, but I'm not sure yet. I hope I can afford it at some point in the near future - it's only the price tag that keeps me off of it.


----------



## Amhro

Just ordered gigabyte 290, can't wait to get it


----------



## gkolarov

Does the Alpenföhn Peter cooler fit a reference 290?


----------



## Arizonian

Quote:


> Originally Posted by *Widde*
> 
> Crammed in the 2nd card now
> 
> 
> 
> 
> 
> 
> 
> http://piclair.com/ago8h . Did a 3dmark at stock http://piclair.com/qzfzn dont know if that's a good score, got around 11k before but that was with 1100/1500 oc on the card
> 
> Also seem to get less fps in bf4, might be due to the psu being to weak


Congrats - updated


----------



## Forceman

Quote:


> Originally Posted by *Adglu*
> 
> It's Metro LL benchmark, so i don't see any significant drops apart from between the loops


If it were just the scene changes causing the drops, you'd expect to see some consistency in them, both in number and interval, but you can see how much more frequently the Asus card dropped than the others. In any case, the review says it didn't affect performance, but it looks a little unseemly to see the card drop so far so often (at least the Asus). It would be nice if they had shown a comparison chart with the power limit increased, to see if that changed anything.


----------



## phallacy

Hey, new poster here; I've had my Sapphire 290X since early December. A couple of questions for you experienced OCers and owners. The answers are probably somewhere in this thread, but I've had a hard time tracking down good responses, so I apologize if this has been asked already.

I'm currently overclocking the card on the reference cooler with TriXX, and I've managed to get it to 1125 core and 1425 memory with everything stable at a +45% power limit and a +65 mV voltage offset (or at least stable enough for me: 8 hours in BF4 and 4-6 hours in other games with no crashes, artifacts, etc.). The card runs at about 85-90°C with this setup. I seem to hit a wall at around 1160 core and 1475 memory, though; at that point games work, but flickering is a big problem, so I turn it back down. I'm running at 1440p using a QNIX 2710 that's been unlocked to 110 Hz. GPU-Z reads the ASIC quality as 79.4%, so I guess that's OK for OCing? Are there any things you've tried that worked to get past these OC walls? My card will be under water in a few weeks, but in the meantime, on air cooling, is it even possible to reach, say, 1200/1500 with a proper OC?

My fan speed caps at 80%, so I could raise that, though the sound is deafening already.


----------



## mojobear

Hey guys,

Gotta say I LOVE the R9 290s... extremely smooth gameplay coming from 590 SLI. The extra VRAM is amazing for 5760x1080, and now I can finally turn on Aero in W7 - I don't have to care about conserving VRAM, MUAHAHAHA. To me it's about a 75-100% upgrade going from 590 SLI to trifire R9 290, and the smoothness of the CrossFire experience is pretty amazing as well. This is coming from a guy who ran 5850 CrossFire back in 2009... that experience is what actually drove me to Nvidia for the 590s.

Anyway, enough praise for the R9 290s. One question I had about BF4 for those running CrossFire: does it stutter much for you? Most other games are silky smooth, but BF4 is a hot load of crap. Turning frame pacing on/off doesn't change anything, and I turned scaling down to 100%; I was getting 100+ FPS @ 5760x1080 but it still stutters. I don't think it's the CPU, since I'm running a 4770K OC'd to 4.75 GHz.

As long as other people here on OCN are having the same issue, I'll just sit back and wait for new drivers/patches... otherwise I have some optimizing to do.


----------



## jerrolds

That seems in line with what I was getting; the max I could get with the stock cooler was 1180/1500, but that was at 100% fan speed.

You should get a bit higher with a 3rd-party cooler, but not much, since VRM1 temps with the tiny VRM sinks + thermal tape will limit you. You'll have 65°C on the core but 90°C+ on VRM1. The plus side is that it will be WAY quieter at higher clocks.

To get the VRMs down you'd need to watercool, literally hack up the reference cooler, or use thermal glue on the sinks (which will prevent you from RMAing or watercooling in the future).


----------



## Gunderman456

Quote:


> Originally Posted by *phallacy*
> 
> Hey new poster here and I've had my sapphire 290x since early December. Couple questions for you experienced OCers and owners. The answers to my questions are probably somewhere in this thread but I've had a hard time tracking down good responses, so I apologize if this has been posted already.
> 
> I'm currently overclocking my card using the reference cooler and trixx and I've managed to get the card to 1125 core and 1425 memory with everything stable with power limit at +45% and voltage offset at +65 (or at least stable enough for me, 8 hours in BF 4 and 4-6 hours other games no crashes, artifacts etc.) Card is running about 85-90 C at this setup. I seem to hit a wall at around 1160 core and 1475 memory though. At this stage, games work however flickering is a big problem so I turn it back down. I'm running at 1440p using a qnix 2710 that's been unlocked to 110 hz. GPU Z is reading ASIC quality at 79.4% so I guess that's ok for OCing ? Are there any things you've tried that worked to get past some of these OC problems? My card will be underwater in a few weeks so there's that but in the mean time using air cooling, is it even possible to reach say 1200/1500 with a proper OC?
> 
> My fan speed caps at 80% so I could raise that though the sound is deafening already.


OC your core first, and once you're maxed out stable there, try the memory and see what you can get away with.


----------



## Gunderman456

Quote:


> Originally Posted by *mojobear*
> 
> Hey guys,
> 
> Gotta say LOVE the R9 290s...extremely smooth gameplay coming from 590 SLI....the extra VRAM is amazing for 5760 x 1080 and now I can finally turn on AERO for W7...dont have to care about conserving VRAM MUAHAHAHA. To me its about a 75-100% upgrade going from 590 SLI to trifire R9 290 and the smoothness of the crossfire experience is pretty amazing as well...this is coming from a guy who did 5850 crossfire back in 2009....that experience is what actually drove me to Nvidia for the 590s.
> 
> Anyways, enough praise for the R9 290s...one question I had was about BF4 for those crossfiring...stutter much? Most all other games are silky smooth but BF4 is a hot load of crap. Turning on/off frame pacing doesnt change anything and I turned scaling down to 100%...was getting 100 FPS + @ 5760 x 1080 but still stutters. Dont think its the CPU since Im running 4770k OC to 4.75 GHZ.
> 
> As long as other people here on OCN are having same issue then I'll just sit back and wait for new drivers/patches...otherwise I have some optimizing to do


Some people have had success by unparking their cores in Bios.


----------



## ebduncan

I haven't had any issues with stutters in BF4. Running a single R9-290


----------



## jomama22

Quote:


> Originally Posted by *Gunderman456*
> 
> Some people have had success by unparking their cores in Bios.


They are unparked through the registry, not the BIOS. Very simple to do, and I recommend it.
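For reference, "unparking through the registry" usually means exposing and maxing out Windows' hidden "Processor performance core parking min cores" power setting. Below is a sketch using the equivalent `powercfg` commands rather than a direct registry edit; the GUIDs are the stock processor subgroup and core-parking setting identifiers. Treat this as the common community tweak, not an official AMD fix, and run it from an elevated prompt.

```shell
:: Sketch: unpark all CPU cores on the current Windows power plan.

:: Unhide the "core parking min cores" setting so it appears in Power Options
powercfg -attributes 54533251-82be-4824-96c1-47b60b740d00 0cc5b647-c1df-4637-891a-dec35c318583 -ATTRIB_HIDE

:: Force the minimum number of unparked cores to 100% (AC power)
powercfg -setacvalueindex scheme_current 54533251-82be-4824-96c1-47b60b740d00 0cc5b647-c1df-4637-891a-dec35c318583 100

:: Re-apply the active scheme so the change takes effect
powercfg -setactive scheme_current
```

Registry-editor guides toggle the same setting by changing its `Attributes` value under the matching GUID key; the result is identical.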


----------



## Its L0G4N

*Add me in the club*
http://www.techpowerup.com/gpuz/c4hab/
XFX 290
Stock cooling at the moment. Going to pick up a water cooler sometime next month.


----------



## vieuxchnock

*Add me in the club


*


----------



## Arizonian

Quote:


> Originally Posted by *vieuxchnock*
> 
> *Add me in the club
> *
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> **
> 
> 
> *
> *


Congrats - added to roster.


----------



## ebduncan

meh might as well get added to the roster

http://www.techpowerup.com/gpuz/f745b/


----------



## Bartouille

My card shows some red pixels when overclocked at around 75°C (75% fan). When the fan is at 100%, the temps go down to 70°C and they don't show up. Now I want watercooling.

I have an MK-26 lying around - I'll try it for fun!


----------



## bloodkil93

I'm overclocking, and the highest I can get my 290X is 1130/1550. If I go higher than that on the GPU, I start getting artifacting. I'm using +88 mV and a +20% power limit; am I doing something wrong?

http://www.techpowerup.com/gpuz/w2h9h/

Thanks!

EDIT: It's using Elpida RAM modules, if that makes any difference.


----------



## Mr357

Quote:


> Originally Posted by *bloodkil93*
> 
> I'm overclocking and the highest I can get my 290x to is 1130/1550, if I go higher than that on the GPU I start getting artifacting, I'm using +88mV and +20% Power Limit, am I doing something wrong?
> 
> http://www.techpowerup.com/gpuz/w2h9h/
> 
> Thanks!
> 
> EDIT: It's using Elpida RAM modules if that makes any difference


+20% isn't enough. Bump it to 50% if you want real results.


----------



## VSG

Increase the power limit first and see; you can also go higher on voltage if your temps look good.


----------



## Bartouille

Quote:


> Originally Posted by *bloodkil93*
> 
> I'm overclocking and the highest I can get my 290x to is 1130/1550, if I go higher than that on the GPU I start getting artifacting, I'm using +88mV and +20% Power Limit, am I doing something wrong?
> 
> http://www.techpowerup.com/gpuz/w2h9h/
> 
> Thanks!
> 
> EDIT: It's using Elpida RAM modules if that makes any difference


They say the R9 290(X) is made for 95°C and all that, but it's not true. As soon as you start bumping the clocks, artifacts show up, and to me artifacts are not a sign of instability (unless a memory OC is involved) but more a sign the GPU is running too hot, which then causes instability. I never had any artifacts on my 7950, and it was always running cool. Any WC guys getting artifacts from OCing the core?


----------



## bloodkil93

Quote:


> Originally Posted by *Mr357*
> 
> +20% isn't enough. Bump it to 50% if you want real results.


That's the issue - whenever I go higher, I seem to get lower FPS?


----------



## VSG

That makes no sense; you are only increasing the maximum power available to the card, so why would it lower FPS? Unless of course you are being temperature throttled.


----------



## bloodkil93

OK, I got it up to 1150/5500 with no artifacts and higher FPS, but then got a black screen. Maybe too high on the memory?


----------



## Redvineal

Quote:


> Originally Posted by *bloodkil93*
> 
> Ok, I got it up to 1150/5500 and got no artifacts and FPS was higher, but got blackscreen, maybe too high on the memory?


Every black screen I have encountered so far was caused by memory clocks being too high. Core clock errors usually result in DirectX errors in games, and blocky artifacts in benchmarks.


----------



## bloodkil93

Quote:


> Originally Posted by *Redvineal*
> 
> Every black screen I have encountered so far was caused by memory clocks being too high. Core clock errors usually result in DirectX errors in games, and blocky artifacts in benchmarks.


Ok, should I drop memory down to 1500 then?


----------



## Redvineal

Quote:


> Originally Posted by *bloodkil93*
> 
> Ok, should I drop memory down to 1500 then?


The step amount is a personal, subjective decision. In my opinion, if you're getting a black screen quickly at 1550, go down to 1500 in one shot. If it takes a while to get a black screen at 1550, step down less (maybe 5-10 at a time).

This is where patience comes into play. The closer you get to stability, the longer it could take for you to see a problem and realize you need to tweak more!
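The step-down approach described above amounts to a simple downward search. A minimal sketch, where `is_stable` is a hypothetical stand-in for manually benchmarking each clock (it is not a real tuning API):

```python
# Walk the memory clock down from the failing value in fixed steps until a
# stability check passes. `is_stable` stands in for manual benchmark runs.
def find_stable_clock(start_mhz, floor_mhz, step_mhz, is_stable):
    clock = start_mhz
    while clock >= floor_mhz:
        if is_stable(clock):
            return clock
        clock -= step_mhz
    return floor_mhz  # give up at the known-safe floor

# Example: pretend everything at or below 1520 MHz turns out stable.
print(find_stable_clock(1550, 1500, 10, lambda mhz: mhz <= 1520))  # 1520
```

The smaller the step, the longer each pass takes but the less headroom you leave on the table, which is exactly the patience trade-off described above.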


----------



## Arizonian

Quote:


> Originally Posted by *ebduncan*
> 
> meh might as well get added to the roster
> 
> http://www.techpowerup.com/gpuz/f745b/


Congrats - glad to add you too.


----------



## bloodkil93

Quote:


> Originally Posted by *Redvineal*
> 
> The step amount is a personal, subjective decision. In my opinion, if you're getting a black screen quickly at 1550, go down to 1500 in one shot. If it takes a while to get a black screen at 1550, step down less (maybe 5-10 at a time).
> 
> This is where patience comes into play. The closer you get to stability, the longer it could take for you to see a problem and realize you need to tweak more!


Ok, just for the sake of squeezing out the performance, I'll bring it down in 10s until the black screens are eliminated. God damn Elpida RAM!!


----------



## VSG

Don't blame the RAM; I was able to get 1500 easily on both my 290X cards with Elpida without any overvolting at all. It is just luck of the draw here.


----------



## Bartouille

Quote:


> Originally Posted by *geggeg*
> 
> Don't blame the RAM; I was able to get 1500 easily on both my 290X cards with Elpida without any overvolting at all. It is just luck of the draw here.


I agree. My Elpida is epic, here's a quick run with +100/50mv: http://www.3dmark.com/3dm/2166593


----------



## Its L0G4N

*Add me in the club*
http://www.techpowerup.com/gpuz/c4hab/
XFX 290
Stock cooling at the moment. Going to pick up a water cooler sometime next month.


----------



## Stoffie22

Hi, got 2 Sapphire R9 290X BF4 Edition cards, Hynix memory.
Under water with 2x XSPC waterblocks with backplates, in series.
GPU-Z validation: http://www.techpowerup.com/gpuz/29cym/

Got them now at 1166 core / 1403 (5609MHz) mem.
Vddc offset +112, power limit +50, using Sapphire TriXX.
For some odd reason MSI Afterburner would not take these cards beyond 1125/1400 without throttling at 45-55°C, depending on the fan speed on the rad.

With Sapphire TriXX I can also go beyond the +100mV offset, which is nice.
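(Side note on the two memory numbers quoted in this thread: GDDR5 is quad-pumped, so the "effective" MHz figure is 4x the base clock set in the OC tool; quoted effective figures are sometimes rounded. A purely illustrative one-liner:)

```python
# GDDR5 transfers four bits per pin per clock, so the effective data rate
# often quoted is simply 4x the base memory clock set in the OC tool.
def gddr5_effective_mhz(base_mhz):
    return base_mhz * 4

print(gddr5_effective_mhz(1400))  # 5600
```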


Original Cards


Cards with the water blocks.


Cards in my rig.

So, can i join the club?


----------



## neurotix

How's the black screen issue nowadays?

The 13.12 drivers are supposed to fix it.

I'm getting a R9 290 Tri-X in a few days, when I install it I'll come back and post GPU-Z to be added to the club.


----------



## bloodkil93

Quote:


> Originally Posted by *Stoffie22*
> 
> Hi, got 2 Sapphire R9 290X BF4 Edition cards, Hynix memory.
> Under water with 2x XSPC waterblocks with backplates, in series.
> GPU-Z validation: http://www.techpowerup.com/gpuz/29cym/
> 
> Got them now at 1166 core / 1403 (5609MHz) mem.
> Vddc offset +112, power limit +50, using Sapphire TriXX.
> For some odd reason MSI Afterburner would not take these cards beyond 1125/1400 without throttling at 45-55°C, depending on the fan speed on the rad.
> 
> With Sapphire TriXX I can also go beyond the +100mV offset, which is nice.
> 
> 
> Original Cards
> 
> 
> Cards with the water blocks.
> 
> 
> Cards in my rig.
> 
> So, can i join the club?


Nice rig bro!







Can't wait to throw an Arctic Accelero Xtreme III on my 290x!


----------



## StonedAlex

If anyone has a spare BF4 code I'll trade Arkham Origins and Total War: Rome II codes for it.


----------



## Stoffie22

Quote:


> Originally Posted by *mojobear*
> 
> Hey guys,
> 
> Gotta say LOVE the R9 290s...extremely smooth gameplay coming from 590 SLI....the extra VRAM is amazing for 5760 x 1080 and now I can finally turn on AERO for W7...dont have to care about conserving VRAM MUAHAHAHA. To me its about a 75-100% upgrade going from 590 SLI to trifire R9 290 and the smoothness of the crossfire experience is pretty amazing as well...this is coming from a guy who did 5850 crossfire back in 2009....that experience is what actually drove me to Nvidia for the 590s.
> 
> Anyways, enough praise for the R9 290s...one question I had was about BF4 for those crossfiring...stutter much? Most all other games are silky smooth but BF4 is a hot load of crap. Turning on/off frame pacing doesnt change anything and I turned scaling down to 100%...was getting 100 FPS + @ 5760 x 1080 but still stutters. Dont think its the CPU since Im running 4770k OC to 4.75 GHZ.
> 
> As long as other people here on OCN are having same issue then I'll just sit back and wait for new drivers/patches...otherwise I have some optimizing to do


Nope. I have 2 290Xs @ 1100/1400 in CrossFire and a 3770K @ 4.5 on an Eyefinity setup of 6000x1200, and it is running like a charm (apart from the occasional crash because BF4's code is a mess).
Maybe it's network related?


----------



## Caanon

Man, I've got my i7 4770k rig all up and running... with integrated video. Gah! You guys are making me jealous







I can't even find a Sapphire Tri-x r9 290 OC in stock _anywhere_! Anyone see one, even for stupidly inflated prices?


----------



## neurotix

Yep, Newegg had them in stock for $579 yesterday, I ordered one.

Should be here Monday, I'm really excited.


----------



## VSG

Man I am kinda surprised you are ok with paying $130 more than MSRP.


----------



## Mildcorma

So I've bought a 290 and so far I've gotta say I've had some real issues...

It worked fine to start, no problem. Then I restarted my PC and got a windows logon circle, then a black screen. Weird, I thought!

My adventure so far has included completely flashing my BIOS and upgrading to the newest version. I've installed my old card and downloaded the drivers again, only to hit the same issue as soon as I plug the 290 back in.

I've re-installed windows 8.1 and it's still not working at all

I'm using the beta drivers (13.11 I think? Could be wrong)

Does anyone have anything that's worked for them in the short-term? I've been at this for the last 4 hours and it's doing my head in!

I've read on here that they're releasing updated drivers. Are they out yet? Am I in fact using the drivers that should fix this issue?

Thanks for any help on this, I'll include the pic below.


----------



## kpoeticg

I'm looking forward to being a member next week =)



I ended up just going with the PowerColor


----------



## the9quad

Quote:


> Originally Posted by *Mildcorma*
> 
> So I've bought a 290 and so far I've gotta say I've had some real issues...
> 
> It worked fine to start, no problem. Then I restarted my PC and got a windows logon circle, then a black screen. Weird, I thought!
> 
> My adventure so far has included completely flashing my bios and upgrading to the newest version. I've installed my old card then downloaded the drivers again, only to have the same issue again as soon as I plug in my 290.
> 
> I've re-installed windows 8.1 and it's still not working at all
> 
> I'm using the beta drivers (13.11 I think? Could be wrong)
> 
> Does anyone have anything that's worked for them in the short-term? I've been at this for the last 4 hours and it's doing my head in!
> 
> I've read on here that they're releasing updated drivers, are they out yet? Am I in fact using the drivers that should fix this issue?
> 
> Thanks for any help on this, I'll include the pic below.


You should be using the 13.12 WHQLs; they are the most up-to-date drivers. Also, without any info besides you being on Win 8.1 and having a 290, you're not giving us much to work with. Use Rig Builder (upper right) to put together your system listing; it will help people troubleshoot for you.


----------



## Zaiphon

Hey guys, I'm new and I'm happy to announce that I'm an R9 290 owner since yesterday









Proof:



Brand

PowerColor R9 290 OC

Stock cooling, but planning a Peter 2 + 2x 140mm fans.

Btw, I opened a thread about aftermarket cooling in the ATI/AMD section, check it out.


----------



## Mildcorma

Quote:


> Originally Posted by *the9quad*
> 
> You should be using the 13.12 WHQLs; they are the most up-to-date drivers. Also, without any info besides you being on Win 8.1 and having a 290, you're not giving us much to work with. Use Rig Builder (upper right) to put together your system listing; it will help people troubleshoot for you.


I've updated my rig on here now, thanks. Will try 13.12 tomorrow and see how it goes, fingers crossed!


----------



## Infinite Jest

I was looking forward to getting one of those sapphire 290s, but after seeing the current market, I think I'll wait a bit. A member of the nVidia club for a bit longer, I'm afraid.


----------



## Pheozero

So how would a pair of R9 290(X)'s handle a certain 1440p 120Hz+ monitor that was announced at CES?


----------



## VSG

No G-sync for one, but it would be a good match for 120Hz at that resolution. Why not just get an Overlord 1440p, 120Hz monitor for cheaper though? That one is also available right now.


----------



## the9quad

Quote:


> Originally Posted by *geggeg*
> 
> No G-sync for one, but it would be a good match for 120Hz at that resolution. Why not just get an Overlord 1440p, 120Hz monitor for cheaper though? That one is also available right now.


I LOVE THESE CARDS!

Or a QNIX/X-Star for half the price of the Overlord.


----------



## Pheozero

I'll take a look at it. Thanks.


----------



## mullenium

Only 5 left: Sapphire BF4 Edition 290x @ Amazon for $579


----------



## TheRoot

Quote:


> Originally Posted by *mullenium*
> 
> Only 5 left: Sapphire BF4 Edition 290x @ Amazon for $579


that price







so damn you newegg


----------



## Sazz

Anyone here want a GPU block for the 290/290X? I decided to disassemble my build and transition to a mini-ITX build. I want to get rid of all my watercooling stuff before I start selling my core components (except the 290X).

I've got the EK acetal+copper block w/ backplate for $115 shipped. I also have a thread up over on Anandtech (can't post here, don't have enough rep to post on the marketplace); just check out my forum name there (sazuzaki) if you need any other watercooling stuff I may have.

And is anyone willing to sell their reference cooler? I'm gonna use a modded AIO liquid cooler on my 290X in my mini-ITX build. =P


----------



## steadly2004

Can somebody with a 290X bench at 1150/1350 on 3DMark 11 and post the score? I'm curious about the difference between your score and my 290 at the same clocks. Or we can agree on different clocks. Mostly interested in the graphics score.


----------



## bond32

Quote:


> Originally Posted by *steadly2004*
> 
> Can somebody with a 290X bench at 1150/1350 on 3DMark 11 and post the score? I'm curious about the difference between your score and my 290 at the same clocks. Or we can agree on different clocks. Mostly interested in the graphics score.


This was at 1175 core:http://www.3dmark.com/fs/1508181


----------



## steadly2004

Quote:


> Originally Posted by *bond32*
> 
> This was at 1175 core:http://www.3dmark.com/fs/1508181


Thanks. I'll have to run Fire Strike to compare. I got 16,600 on 3DMark 11 for the graphics score in Performance.


----------



## steadly2004

Quote:


> Originally Posted by *bond32*
> 
> This was at 1175 core:http://www.3dmark.com/fs/1508181


It wouldn't complete at 1175; 1125 went through though. It's a pain if you haven't bought it.

DETAILED SCORES (mine at 1125 / yours at 1175)
3DMark Score: 10223.0 / 10798.0 3DMarks
Graphics Score: 11975.0 / 12447.0
Physics Score: 10818.0 / 13203.0
Combined Score: 4692.0 / 4765.0
Graphics Test 1: 57.8 / 61.0 fps
Graphics Test 2: 47.4 / 48.7 fps
Physics Test: 34.3 / 41.9 fps
Combined Test: 21.8 / 22.2 fps

Those are my scores at 1125 vs. yours at 1175. You have a solid lead of about 500 graphics points. I wonder how wide that lead would be if you ran at 1125? Pretty much just trying to validate my choice of a 290 instead of a 290X, that's all.
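For what it's worth, that gap in graphics scores works out to only a few percent. A quick back-of-the-envelope check on the numbers quoted above:

```python
# Percentage lead of the 290X graphics score (12447) over the 290 (11975),
# using the 3DMark 11 figures quoted in the post above.
mine, yours = 11975.0, 12447.0
lead_pct = (yours - mine) / mine * 100
print(f"{yours - mine:.0f} points, about {lead_pct:.1f}%")  # 472 points, about 3.9%
```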


----------



## Arizonian

Quote:


> Originally Posted by *Its L0G4N*
> 
> *Add me in the club*
> http://www.techpowerup.com/gpuz/c4hab/
> XFX 290
> Stock cooling at the moment. Going to pick up a water cooler sometime next month.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added








Quote:


> Originally Posted by *Stoffie22*
> 
> Hi, got 2 Sapphire R9 290X BF4 Edition cards, Hynix memory.
> Under water with 2x XSPC waterblocks with backplates, in series.
> GPU-Z validation: http://www.techpowerup.com/gpuz/29cym/
> 
> Got them now at 1166 core / 1403 (5609MHz) mem.
> Vddc offset +112, power limit +50, using Sapphire TriXX.
> For some odd reason MSI Afterburner would not take these cards beyond 1125/1400 without throttling at 45-55°C, depending on the fan speed on the rad.
> 
> With Sapphire TriXX I can also go beyond the +100mV offset, which is nice.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Original Cards
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Cards with the water blocks.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Cards in my rig.
> 
> So, can i join the club?


Congrats - added














Nice rig.









Quote:


> Originally Posted by *Mildcorma*
> 
> So I've bought a 290 and so far I've gotta say I've had some real issues...
> 
> It worked fine to start, no problem. Then I restarted my PC and got a windows logon circle, then a black screen. Weird, I thought!
> 
> My adventure so far has included completely flashing my bios and upgrading to the newest version. I've installed my old card then downloaded the drivers again, only to have the same issue again as soon as I plug in my 290.
> 
> I've re-installed windows 8.1 and it's still not working at all
> 
> I'm using the beta drivers (13.11 I think? Could be wrong)
> 
> Does anyone have anything that's worked for them in the short-term? I've been at this for the last 4 hours and it's doing my head in!
> 
> I've read on here that they're releasing updated drivers, are they out yet? Am I in fact using the drivers that should fix this issue?
> 
> Thanks for any help on this, I'll include the pic below.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *Zaiphon*
> 
> Hey guys, I'm new and I'm happy to announce that I'm an R9 290 owner since yesterday
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Proof:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Brand
> 
> PowerColor R9 290 OC
> 
> Stock cooling, but planning a Peter 2 + 2x 140mm fans.
> 
> Btw, I opened a thread about aftermarket cooling in the ATI/AMD section, check it out.


Congrats - added









Right on guys - enjoy your cards.


----------



## Forceman

Quote:


> Originally Posted by *steadly2004*
> 
> 
> It wouldn't complete at 1175; 1125 went through though. It's a pain if you haven't bought it.
> 
> DETAILED SCORES (mine at 1125 / yours at 1175)
> 3DMark Score: 10223.0 / 10798.0 3DMarks
> Graphics Score: 11975.0 / 12447.0
> Physics Score: 10818.0 / 13203.0
> Combined Score: 4692.0 / 4765.0
> Graphics Test 1: 57.8 / 61.0 fps
> Graphics Test 2: 47.4 / 48.7 fps
> Physics Test: 34.3 / 41.9 fps
> Combined Test: 21.8 / 22.2 fps
> 
> Those are my scores at 1125 vs. yours at 1175. You have a solid lead of about 500 graphics points. I wonder how wide that lead would be if you ran at 1125? Pretty much just trying to validate my choice of a 290 instead of a 290X, that's all.


Here's mine at 1150/1350. 17178 for Graphics. I re-ran it and got 16876 though - not sure what the difference was.

http://www.3dmark.com/3dm11/7802510


----------



## steadly2004

Quote:


> Originally Posted by *Forceman*
> 
> Here's mine at 1150/1350. 17178 for Graphics. I re-ran it and got 16876 though - not sure what the difference was.
> 
> http://www.3dmark.com/3dm11/7802510


Very nice, that's a good comparison. 290 at 1150/1350 = 16,600 and 290X at the same clocks = 17,100 or so. Again, about 500 points ahead.

Here is me on Fire Strike at 1339/1390. Not a fair comparison since he had slower memory; he got me again by 400-500 points. Pretty close.
Core clock and GPU usage on my end didn't stay stable throughout that test.


----------



## Aesthetics

Does the XFX 290 Double Dissipation use a reference PCB or a custom PCB? Most water blocks only support the reference 290 PCB.


----------



## DullBoi

Howdy you all







I got 2x PowerColor R9 290, the OC one with a 975MHz core.

Surprisingly, I did not find the AMD logo above the PCI-E connector; instead there's the code LF R29F. I checked, and it has the exact same layout as the reference design, and of course still the same cooler.

Just thought I'd share my discovery, since I got reference blocks and was a little worried driving back from the shop









Cheers


----------



## Sgt Bilko

Quote:


> Originally Posted by *Aesthetics*
> 
> Does the XFX 290 Double Dissipation use a reference PCB or a custom PCB? Most water blocks only support the reference 290 PCB.


AFAIK it's a reference board, but I don't know for sure yet.


----------



## devilhead

So today I received 2x more Sapphire 290X Battlefield 4 Edition; both cards have Hynix memory.


----------



## Durvelle27

Guys, do you know what brands have a higher chance of having Hynix VRAM?


----------



## psyside

Quote:


> Originally Posted by *Durvelle27*
> 
> Guys, do you know what brands have a higher chance of having Hynix VRAM?


Not higher, guaranteed.

Sapphire Tri X.


----------



## Durvelle27

Quote:


> Originally Posted by *psyside*
> 
> Not higher, guaranteed.
> 
> Sapphire Tri X.


Really only want reference


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> Really only want reference


From what I've seen, the Sapphire 290X BF4 Editions have had a good chance of Hynix, but then again mine was Elpida, so it's still a lottery.


----------



## Sazz

HardOCP's review of the Asus DCUII says it's guaranteed Hynix as well. At least that one is guaranteed to get a waterblock, since EK always makes waterblocks for high-end DCUII video cards.


----------



## Durvelle27

Quote:


> Originally Posted by *Sazz*
> 
> HardOCP's review of the Asus DCUII says it's guaranteed Hynix as well. At least that one is guaranteed to get a waterblock, since EK always makes waterblocks for high-end DCUII video cards.


Hmmm, I might look into those. What about the new VisionTek R9 290 w/ pre-installed EK block?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> hmmmm i might look into those. What about the new VisionTek R9 290 w/ Pre-installed EK block


I've seen a review posted in this thread a few pages back that said the DCU II cards were Elpida.

I'm really not sure about the VisionTek cards, though... they probably are Hynix due to the higher OC, but there's no way to guarantee it until someone reviews one.


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I've seen a review posted in this thread a few pages back that said the DCU II cards were Elpida.
> 
> I'm really not sure about the VisionTek cards, though... they probably are Hynix due to the higher OC, but there's no way to guarantee it until someone reviews one.


Guess I'll wait some and keep my eyes peeled.


----------



## Sazz

Quote:


> Originally Posted by *Durvelle27*
> 
> Guess i'll wait some and keep eyes peeled


TBH, Hynix or Elpida, it doesn't matter. I've handled a couple of Hynix cards and 3 Elpida cards, and the Elpida cards actually reached higher clocks for me. The first Hynix I got (my very first 290X, which had a bad DisplayPort) only got to 1325 stable (no matter what voltage I added, that was its max), and the second one, a 290, only maxed out at 1340, while my current 290X maxed out at 1450 and the one my friend owns maxed out at 1500 stable.

The only reason Hynix was probably superior before is memory voltage control; the R9 290/290X has no memory voltage control, so whatever you get at stock voltage is your max memory clock. It's really just a roll of the dice in terms of memory clocks.

if you do end up buying a reference card, I am selling my waterblock+backplate for $115 shipped.


----------



## MeneerVent

I've got my R9 290 cooled by a Kraken X40 and a Kraken G10. It's awesome, but the mounting process was a pain. With +100mV core voltage and a power limit of +50% I get a stable clock speed of 1207MHz; is this considered good? Also, what is considered a safe VRM temperature? GPU-Z shows a max temp of 114C on VRM 1 after a run of FurMark; is this bad?


----------



## Durvelle27

Quote:


> Originally Posted by *Sazz*
> 
> TBH, Hynix or Elpida, it doesn't matter. I've handled a couple of Hynix cards and 3 Elpida cards, and the Elpida cards actually reached higher clocks for me. The first Hynix I got (my very first 290X, which had a bad DisplayPort) only got to 1325 stable (no matter what voltage I added, that was its max), and the second one, a 290, only maxed out at 1340, while my current 290X maxed out at 1450 and the one my friend owns maxed out at 1500 stable.
> 
> The only reason Hynix was probably superior before is memory voltage control; the R9 290/290X has no memory voltage control, so whatever you get at stock voltage is your max memory clock. It's really just a roll of the dice in terms of memory clocks.
> 
> if you do end up buying a reference card, I am selling my waterblock+backplate for $115 shipped.


My last 290, a reference Sapphire R9 290, had Elpida memory, and the max it could do was 1450 on memory.

Why are you selling the block?


----------



## Sgt Bilko

Quote:


> Originally Posted by *MeneerVent*
> 
> I've got my R9 290 cooled by a Kraken X40 and a Kraken G10. It's awesome, but the mounting process was a pain. With +100mV core voltage and a power limit of +50% I get a stable clock speed of 1207MHz; is this considered good? Also, what is considered a safe VRM temperature? GPU-Z shows a max temp of 114C on VRM 1 after a run of FurMark; is this bad?


Well, 114C is not good. Ideally you don't want them going over 70-80C or so, but I've heard they are rated up to 120C.

EDIT: And FurMark isn't good for anything except cooking your cards, really.
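To put numbers on the rule of thumb above (these thresholds are just the figures quoted in the post, not an official AMD spec):

```python
# Bucket a VRM temperature per the rough guidance above: comfortable up to
# ~80C, tolerated up to the ~120C rating mentioned, alarming beyond that.
def vrm_status(temp_c):
    if temp_c <= 80:
        return "comfortable"
    if temp_c <= 120:
        return "hot, but within the quoted rating"
    return "over the quoted rating"

print(vrm_status(114))  # hot, but within the quoted rating
```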


----------



## Sazz

Quote:


> Originally Posted by *Durvelle27*
> 
> My last 290, a reference Sapphire R9 290, had Elpida memory, and the max it could do was 1450 on memory.
> 
> Why are you selling the block?


I am converting my build to a Micro-ATX one to make it easier to haul around. selling all of my watercooling stuff as well.


----------



## Durvelle27

Quote:


> Originally Posted by *Sazz*
> 
> I am converting my build to a Micro-ATX one to make it easier to haul around. selling all of my watercooling stuff as well.


I'll pm you bud


----------



## bond32

Quote:


> Originally Posted by *steadly2004*
> 
> 
> wouldn't complete at 1175, 1125 went through though. It's a pain if you haven't bough it.
> 
> DETAILED SCORES
> 3DMark Score10223.0 10798.0 3DMarks
> Graphics Score11975.0 12447.0
> Physics Score10818.0 13203.0
> Combined Score4692.0 4765.0
> Graphics Test 157.8 61.0 fps
> Graphics Test 247.4 48.7 fps
> Physics Test34.3 41.9 fps
> Combined Test21.8 22.2 fps
> 
> my scores at 1125 vs yours at 1175. you have a lead, by 500 solid graphics score. I wonder how wide that lead would be if you ran at 1125? pretty much trying to validate my choice in going with a 290 instead of 290x. thats all


I'll be happy to run more tests Sunday when I get home. I'm curious about this... I basically bought the 290X on release just because it was available; I really preferred the 290. Then I saw all this "unlock" business, which made me feel bad. Here's my best score: http://www.3dmark.com/fs/1508220


----------



## EliteReplay

Hi, are people still getting black screens on these cards?


----------



## King4x4

Two days of testing... no black screens yet.


----------



## mukmaster

Add me plz











Asus R9 290X, bought for $500 on Newegg.
1150MHz core @ 1.1V
1600MHz mem, Elpida
Stock cooling
RUNS HOT.

Thanks


----------



## Durvelle27

Quote:


> Originally Posted by *mukmaster*
> 
> Add me plz
> 
> 
> 
> Asus R9 290X, bought for $500 on Newegg.
> 1150MHz core @ 1.1V
> 1600MHz mem, Elpida
> Stock cooling
> RUNS HOT.
> 
> Thanks


Need proof bud. GPU-Z validation or pic of card with your name handwritten on a piece of paper in the pic


----------



## tijgert

Quote:


> Need proof bud. GPU-Z validation or pic of card with your name handwritten on a piece of paper in the pic


No offence, but really? I need *hard* evidence of owning a card so I can be added to... what exactly?

Well, until then, just disbelieve me when I say that undervolting by 75mV works really well on my Sapphire 290X; saves me some degrees, I'm sure...


----------



## szeged

Quote:


> Originally Posted by *tijgert*
> 
> No offence, but really? I need *hard* evidence of owning a card so I can be added to.. what exactly?
> 
> Well, until then, just disbelieve me when I say that undervolting 75mV works real well on my Sapphire 290x, saves me some degrees, I'm sure...


I don't think anyone disbelieves you; it's just the rules of the club that you need a GPU-Z validation link or a picture of the card with your OCN name in it to be added to the official owners list.


----------



## devilhead

Made a couple of tests on my new 290Xs, so yeah, Hynix memory is better. I have 3x 290(X) XFX with Elpida, and the max overclock on memory is 1500







And those with Hynix can do more than 1600. One card tested at +100mV did 1200/1720, but I need to watercool them


----------



## kizwan

Quote:


> Originally Posted by *tijgert*
> 
> No offence, but really? I need *hard* evidence of owning a card so I can be added to.. what exactly?
> 
> Well, until then, just disbelieve me when I say that undervolting 75mV works real well on my Sapphire 290x, saves me some degrees, I'm sure...


It's just the rules to join this club. You don't have to join it, though; no one is forcing you, really (wink if you're under duress). It's up to you.


----------



## Mildcorma

Ok, so my card still isn't working. I'm going to go over everything here in great detail and hope that someone has a great answer!

I bought this card a few days ago and it arrived yesterday with no damage. I installed the drivers, then played a game of BF4 to take it for a test run, then I uninstalled Nvidia drivers (I have / had a GTX 660ti). Once I restarted my PC after uninstalling, it wouldn't boot past the windows icon with a swirly circle.

My build in brief:

Windows 8.1
ASUS Sabertooth P67 mobo
GTX 660ti / AMD R9 290
2 x OCZ Vertex 4 120gb each
2 x Samsung Spinpoint F3s 1tb each
8gb G-skill DDR3 1333mhz
i5 2500k not OC'ed at present
600w PSU
I have 2 24 inch monitors that are connected via HDMI (although one is currently DVI for troubleshooting)

I took a look at my BIOS and, surprise, it was out of date. So I downloaded the 4 BIOS updates and cycled through them in order, clearing the CMOS after each one to be certain. Put the 290 in again, but no luck.

I re-installed windows completely (yay for backups right?) and went through the established process once more with the 13.12 drivers, but no luck. I tried the 13.11 drivers, but no luck there either. I've tried the tool listed on page one here to completely erase the drivers, and started again, but no luck at all.

Now, my mate's got a rig and the card works fine in it with no apparent immediate issues, so this is looking more and more like I send the card back and splash out a little more on a GTX 780. I've never had these kinds of issues with new cards before, and as a lifelong Nvidia buyer, the move across to what's an amazing card for the price has been ANYTHING but easy!

Does anyone have any ideas? If not then I have a 290 for sale, £310 if you pay for postage! :S


----------



## Durvelle27

Quote:


> Originally Posted by *Mildcorma*
> 
> Ok so my card still isn't working, going to go over everything here in great detail and hope that someone has a great answer!
> 
> I bought this card a few days ago and it arrived yesterday with no damage. I installed the drivers, then played a game of BF4 to take it for a test run, then I uninstalled Nvidia drivers (I have / had a GTX 660ti). Once I restarted my PC after uninstalling, it wouldn't boot past the windows icon with a swirly circle.
> 
> My build in brief:
> 
> Windows 8.1
> ASUS Sabertooth P67 mobo
> GTX 660ti / AMD R9 290
> 2 x OCZ Vertex 4 120gb each
> 2 x Samsung Spinpoint F3s 1tb each
> 8gb G-skill DDR3 1333mhz
> i5 2500k not OC'ed at present
> 600w PSU
> I have 2 24 inch monitors that are connected via HDMI (although one is currently DVI for troubleshooting)
> 
> I took a look at my BIOS and, surprise, it was out of date. So I downloaded the 4 BIOS updates and cycled through them in order, clearing the CMOS after each one to be certain. Put the 290 in again, but no luck.
> 
> I re-installed windows completely (yay for backups right?) and went through the established process once more with the 13.12 drivers, but no luck. I tried the 13.11 drivers, but no luck there either. I've tried the tool listed on page one here to completely erase the drivers, and started again, but no luck at all.
> 
> Now, my mate's got a rig and the card works fine in it with no apparent immediate issues, so this is looking more and more like I send the card back and splash out a little more on a GTX 780. I've never had these kinds of issues with new cards before, and I build quite a few systems.
> 
> Does anyone have any ideas? If not then I have a 290 for sale, £310 if you pay for postage! :S


Sounds like PSU to me


----------



## Its L0G4N

Quote:


> Originally Posted by *Mildcorma*
> 
> Ok so my card still isn't working, going to go over everything here in great detail and hope that someone has a great answer!
> 
> I bought this card a few days ago and it arrived yesterday with no damage. I installed the drivers, then played a game of BF4 to take it for a test run, then I uninstalled Nvidia drivers (I have / had a GTX 660ti). Once I restarted my PC after uninstalling, it wouldn't boot past the windows icon with a swirly circle.
> 
> My build in brief:
> 
> Windows 8.1
> ASUS Sabertooth P67 mobo
> GTX 660ti / AMD R9 290
> 2 x OCZ Vertex 4 120gb each
> 2 x Samsung Spinpoint F3s 1tb each
> 8gb G-skill DDR3 1333mhz
> i5 2500k not OC'ed at present
> 600w PSU
> I have 2 24 inch monitors that are connected via HDMI (although one is currently DVI for troubleshooting)
> 
> I took a look at my BIOS and, surprise, it was out of date. So I downloaded the 4 BIOS updates and cycled through them in order, clearing the CMOS after each one to be certain. Put the 290 in again, but no luck.
> 
> I re-installed windows completely (yay for backups right?) and went through the established process once more with the 13.12 drivers, but no luck. I tried the 13.11 drivers, but no luck there either. I've tried the tool listed on page one here to completely erase the drivers, and started again, but no luck at all.
> 
> Now, my mate's got a rig and the card works fine in it with no apparent immediate issues, so this is looking more and more like I send the card back and splash out a little more on a GTX 780. I've never had these kinds of issues with new cards before, and I build quite a few systems.
> 
> Does anyone have any ideas? If not then I have a 290 for sale, £310 if you pay for postage! :S


Unplug one of the screens and/or try a different monitor.

I run my R9 290 with a 550W PSU.


----------



## Bartouille

Does anyone know how to get more than 1625MHz on the memory with AB? I checked "extended clock limits" and it didn't do anything.


----------



## Mr357

Quote:


> Originally Posted by *Bartouille*
> 
> Anyone knows how to get more than 1625MHz on the memory with AB? I checked "extended clock limits" and it didn't do anything.


If anyone would know it's TSM, but I know for sure you can go much higher in GPUTweak.


----------



## Durquavian

My AB shows 2245 for max memory clock. Official OCing method.


----------



## shilka

Quote:


> Originally Posted by *Mildcorma*
> 
> Ok so my card still isn't working, going to go over everything here in great detail and hope that someone has a great answer!
> 
> I bought this card a few days ago and it arrived yesterday with no damage. I installed the drivers, then played a game of BF4 to take it for a test run, then I uninstalled Nvidia drivers (I have / had a GTX 660ti). Once I restarted my PC after uninstalling, it wouldn't boot past the windows icon with a swirly circle.
> 
> My build in brief:
> 
> Windows 8.1
> ASUS Sabretooth P67 mobo
> GTX 660ti / AMD R9 290
> 2 x OCZ Vertex 4 120gb each
> 2 x Samsung Spinpoint F3s 1tb each
> 8gb G-skill DDR3 1333mhz
> i5 2500k not OC'ed at present
> 600w PSU
> I have 2 24 inch monitors that are connected via HDMI (although one is currently DVI for troubleshooting)
> 
> I took a look at my Bios and surprise, it was out of date. So i downloaded the 4 bios updates and cycled through them in order, clearing the CMOS after each one to be certain. Put the 290 in again, but no luck.
> 
> I re-installed windows completely (yay for backups right?) and went through the established process once more with the 13.12 drivers, but no luck. I tried the 13.11 drivers, but no luck there either. I've tried the tool listed on page one here to completely erase the drivers, and started again, but no luck at all.
> 
> Now, my mates got a rig and the card works fine in that with no apparent immediate issues also, so this is looking more and more like I send the card back and splash out a little more on a gtx780. I've never had these kinds of issues with new cards before, and as a lifelong Nvidia buyer, the move across to what's an amazing card for the price has been EVERYTHING but easy!
> 
> Does anyone have any ideas? If not then I have a 290 for sale, £310 if you pay for postage! :S


Which PSU? All you list is the wattage.


----------



## Mildcorma

Quote:


> Originally Posted by *shilka*
> 
> Which PSU? All you list is the wattage.


OCZ Stealth Extreme. It's not the PSU; as I said, it was working fine with no issues previously. My PSU is nowhere near capping out.


----------



## shilka

Quote:


> Originally Posted by *Mildcorma*
> 
> OCZ Stealth Extreme, it's not the PSU as I said it was working fine with no issues previously. My PSU is nowhere near capping out.


Stealth Extreme 1 or 2?

Either way, nothing bad about them other than their age


----------



## Mildcorma

Quote:


> Originally Posted by *shilka*
> 
> Stealth Extreme 1 or 2?
> 
> Either way, nothing bad about them other than their age


2


----------



## Marvin82

@Mukmaster

Can you please upload the Asus Uber BIOS?
Thx

@all
How can I see which VRAM I have without opening the card cooler?


----------



## Mildcorma

Quote:


> Originally Posted by *Its L0G4N*
> 
> unplug one of the screens and/or try a different monitor.
> 
> I run my r9 290 with a 550w psu


Monitors work fine on the other card and I've tried my mates 19" as well with no luck. Same as always, gets to the blue windows logo with the circle then goes black.


----------



## Durvelle27

Quote:


> Originally Posted by *shilka*
> 
> Stealth Extreme 1 or 2?
> 
> Either way nothing bad about them other then their age


This would worry me some

[email protected], [email protected], [email protected], [email protected]


----------



## shilka

Quote:


> Originally Posted by *Durvelle27*
> 
> This would worry me some
> 
> [email protected], [email protected], [email protected], [email protected]


I forgot they were multi-rail with really weak 12v rails

Can't remember everything about every single PSU ever made

Pathetic 12v rails


----------



## MeneerVent

Any way to cool the VRMs effectively? I feel my memory clock could still go higher, but the VRM temps are really too high. I don't want to add heatsinks because I don't know where to get them or how to mount them, and I may be selling the card later. I thought of replacing the stock fan on the G10 with a Noctua one; would that help?


----------



## Mildcorma

Quote:


> Originally Posted by *shilka*
> 
> I forgot they were multi-rail with really weak 12v rails
> 
> Can't remember everything about every single PSU ever made
> 
> Pathetic 12v rails


Why would this be a problem? PSUs are, unfortunately, my most neglected area of knowledge :S


----------



## shilka

Quote:


> Originally Posted by *Mildcorma*
> 
> Why would this be a problem? PSUs are, unfortunately, my most neglected area of knowledge :S


If the PSU has 12v rails that weak, they're very easy to overload, and the result is the PSU shutting down

18 amps is very weak

Even the 450 watt Cooler Master V450S has 36 amps on its single 12v rail

The Cooler Master V1000 has 83 amps on its single rail, just to compare
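To put those amp ratings in perspective: a rail's capacity in watts is just volts times amps (P = V × A), so an 18A 12v rail tops out around 216W before overcurrent protection can trip, which is tight for a card that can pull 250W+ on its own. A quick sketch of the arithmetic, using the rail ratings mentioned above (illustrative only, not checked against any datasheet):

```python
def rail_watts(volts: float, amps: float) -> float:
    """Maximum continuous power a rail can deliver: P = V * A."""
    return volts * amps

# 12v rail ratings mentioned in this thread
for label, amps in [("18A multi-rail", 18), ("V450S single rail", 36), ("V1000 single rail", 83)]:
    print(f"{label}: {rail_watts(12, amps):.0f} W")
# 18A multi-rail: 216 W
# V450S single rail: 432 W
# V1000 single rail: 996 W
```

With multiple 18A rails you only avoid trouble if the GPU's power connectors happen to be split across them; put everything on one rail and you can trip OCP long before the PSU's total wattage is reached.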


----------



## Asrock Extreme7

Help: I OC with MSI AB, but when I restart the OC is gone and PowerTune just goes back to stock. I just tried GPU Tweak, same thing. If PowerTune doesn't show the OC, then it doesn't work.


----------



## Mildcorma

Quote:


> Originally Posted by *shilka*
> 
> If the PSU has 12v rails that weak, they're very easy to overload, and the result is the PSU shutting down
> 
> 18 amps is very weak
> 
> Even the 450 watt Cooler Master V450S has 36 amps on its single 12v rail
> 
> The Cooler Master V1000 has 83 amps on its single rail, just to compare


That makes sense, but I don't think it's the issue at hand, as if the PSU shut down then the whole PC would stop. I don't think it's a hardware issue at all; I mean, the PSU is the oldest part of my rig, but it still runs everything with no issues.


----------



## shilka

Quote:


> Originally Posted by *Mildcorma*
> 
> That makes sense, but I don't think it's the issue at hand, as if the PSU shut down then the whole PC would stop. I don't think it's a hardware issue at all; I mean, the PSU is the oldest part of my rig, but it still runs everything with no issues.


If you spread the load out over two 12v rails, it would be harder to overload a rail

So you probably did that, maybe without knowing it

Anyway, you should not try to overvolt on a PSU that has such weak 12v rails


----------



## tijgert

Well peeps, I quickly scribbled my alias on a piece of eh.. paper.
So here's the ironclad proof of my 290x. Still waiting on various parts for my third water cool project.










(Here's the first and second is still waiting to be written... For a few years now actually)


----------



## Hattifnatten

Finally got my 90° fitting today, and I was able to fill the loop up. I've actually had the card blocked for a few days, but thanks to EK putting the inlet/outlet WAY too close to the rear of the case, I was having so much trouble putting it together. Going from the block to the rad in a straight line? Nope. Unless you wanted to restrict the path from the rad to the cpu-block. And after all of that was figured out, I realized I was 2-3cm short of tube. Had to get a 90.


Spoiler: Warning: Spoiler!


----------



## Forceman

Quote:


> Originally Posted by *Marvin82*
> 
> @Mukmaster
> @all
> How can i see witch VRam i have without open the card cooler ?


Use the memory info tool (linked in the first post).

http://www.mediafire.com/download/voj4j1rlk0ucfz4/MemoryInfo+1005.rar


----------



## Stoffie22

Quote:


> Originally Posted by *Marvin82*
> 
> @Mukmaster
> 
> Can you pleas upload the Asus Uber Bios?
> Thx
> 
> @all
> How can i see witch VRam i have without open the card cooler ?


Asus bios:
http://www.techpowerup.com/vgabios/index.php?architecture=ATI&manufacturer=Asus&model=R9+290X&interface=&memType=&memSize=

There is a prog for that: MemoryInfo 1005.
http://www.mediafire.com/download/voj4j1rlk0ucfz4/MemoryInfo+1005.rar


----------



## EliteReplay

Quote:


> Originally Posted by *shilka*
> 
> If you spread the load out on two 12v rails then it would be harder to overload a rail
> 
> So you probably did that maybe without knowing it
> 
> Anyway you should not try and overvolt on a PSU that have so weak 12v rails


why were PSU manufacturers doing that before??


----------



## shilka

Quote:


> Originally Posted by *EliteReplay*
> 
> why were PSU manufacturers doing that before??


I don't understand you?


----------



## EliteReplay

Quote:


> Originally Posted by *shilka*
> 
> I dont understand you?


I mean, why did manufacturers used to divide the rails into two or four 12v rails?


----------



## shilka

Quote:


> Originally Posted by *EliteReplay*
> 
> that why manufactures used to divide the rails into Two rails of 12v or 4 rails of 12v


I really don't know

This thread explains it better than I can

http://www.overclock.net/t/761202/single-rail-vs-multi-rail-explained


----------



## devilhead

Done some fast tests with the 290x at +100mv on air







on monday will put waterblock







max temp was 72C, which for overclocking still looks too high to me personally


----------



## Arizonian

Quote:


> Originally Posted by *devilhead*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> So today received 2x more Sapphire 290x Battlefield 4 edition; both cards are with Hynix memory


Congrats - added









Quote:


> Originally Posted by *mukmaster*
> 
> Add me plz
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Asus R290x bought for 500 on newegg.
> 1150MHZ Core @ 1.1v
> 1600MHZ Mem Elpidia
> Stock Cooling
> RUNS HOT.
> 
> Thanks


Congrats - added









If you could please go back to the linked post and edit your original post to add either a GPU-Z link with the validation tab open and your OCN name showing, a screenshot with your OCN name, or even a pic of the GPU with your OCN name on a piece of paper (which is required), it would be greatly appreciated.
Quote:


> Originally Posted by *tijgert*
> 
> Well peeps, I quickly scribbled my alias on a piece of eh.. paper.
> So here's the ironclad proof of my 290x. Still waiting on various parts for my third water cool project.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s1069.photobucket.com/albums...&current=3e2c357c679bd16c539bdbe491fb7fc8.jpg
> 
> 
> 
> (Here's the first and second is still waiting to be written... For a few years now actually)


Congrats - added


----------



## kizwan

Quote:


> Originally Posted by *Hattifnatten*
> 
> Finally got my 90° fitting today, and I was able to fill the loop up. I've actually had the card blocked for a few days, but thanks to EK putting the inlet/outlet WAY too close to the rear of the case, I was having so much trouble putting it together. Going from the block to the rad in a straight line? Nope. Unless you wanted to restrict the path from the rad to the cpu-block. And after all of that was figured out, I realized I was 2-3cm short of tube. Had to get a 90.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


The tube from the radiator to the GPU is too long. Why didn't you shorten it there?


----------



## Hattifnatten

It was either that, or no tube at all







Trust me, I tried for hours. Only ended up with a massive kink which would have stopped all flow. Could probably have done something if I had two 45s, but I didn't.


----------



## DullBoi

Just to show the new Powercolor R9 290's (LF R29F board) with Hynix H5GQ2H24AFR VRAM




















On the back it still has the AMD sign next to the FCC mark, with Model: C671, which is on the reference card as well.


----------



## dspacek

Hello, I'm under water now and testing OC performance. I'm still stuck with memory at 1700MHz.
But the core starts throttling at 1080MHz, regardless of the power or voltage setup.
Can anyone post their R9 290 BIOS that doesn't throttle?







Or how do I avoid it?
Temps are 50°C MAX, VRM included
Thanks very much

big city

Heatkiller


----------



## dspacek

Quote:


> Originally Posted by *devilhead*
> 
> done some fast test with 290x +100mv on air
> 
> 
> 
> 
> 
> 
> 
> on monday will put waterblock
> 
> 
> 
> 
> 
> 
> 
> max temp was 72C, and still for overclocking for me personally looks to high


Without throttling? NICE! Can you add an AB graph while you're benching?
What BIOS do you use or recommend for the R9 290?


----------



## Darklyspectre

Might join the red side again soon if I can get a proper price for my Ti Classified and if the MSI 290X Lightning doesn't turn into a disappointment.


----------



## Asrock Extreme7

Quote:


> Originally Posted by *dspacek*
> 
> Without throttling? NICE! Can you add an AB graph when you are benching?
> What Bios do you use or recomend for R9-290?



What's your core and mem?


----------



## Asrock Extreme7

Quote:


> Originally Posted by *devilhead*
> 
> done some fast test with 290x +100mv on air
> 
> 
> 
> 
> 
> 
> 
> on monday will put waterblock
> 
> 
> 
> 
> 
> 
> 
> max temp was 72C, and still for overclocking for me personally looks to high


What's your core and mem?


----------






## Forceman

Quote:


> Originally Posted by *dspacek*
> 
> Hello, Im under water now, and testing OC performance. So Im stucked with memory at 1700 Mhz still.
> But the core starts throttling at 1080Mhz without influence on power or Voltage setup.
> Can anyone put here his BIOS for R9-290 which is not throttling?
> 
> 
> 
> 
> 
> 
> 
> Or how to aviod it?
> Temps are 50°C MAX, included VRM
> Tanks Very much
> 
> big city
> 
> Heatkiller


Did you increase the power limit, and make sure it took in CCC? Sometimes when you set it in Afterburner it doesn't actually change it.


----------



## xavier78

Max OC


----------



## jerrolds

Quote:


> Originally Posted by *Mildcorma*
> 
> So I've bought a 290 and so far I've gotta say I've had some real issues...
> 
> It worked fine to start, no problem. Then I restarted my PC and got a windows logon circle, then a black screen. Weird, I thought!
> 
> My adventure so far has included completely flashing my bios and upgrading to the newest version. I've installed my old card then downloaded the drivers again, only to have the same issue again as soon as I plug in my 290.
> 
> I've re-installed windows 8.1 and it's still not working at all
> 
> I'm using the beta drivers (13.11 I think? Could be wrong)
> 
> Does anyone have anything that's worked for them in the short-term? I've been at this for the last 4 hours and it's doing my head in!
> 
> I've read on hear that they're releasing updated drivers, are they out yet? Am I in fact using the drivers that should fix this issue?
> 
> Thanks for any help on this, I'll include the pic below.


Are you able to get into Safe Mode? If so, it's a driver thing. Are you using GPU Tweak to overclock? I had an issue when I updated to Windows 8.1: GPU Tweak services failed to start in Windows and I just ended up with a black screen.

If this is the case, you can disable the GPU Tweak Windows services in Safe Mode.


----------



## devilhead

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> whats your core and mem


1200/1700


----------



## devilhead

Quote:


> Originally Posted by *dspacek*
> 
> Without throttling? NICE! Can you add an AB graph when you are benching?
> What Bios do you use or recomend for R9-290?


I used my stock Sapphire 290x BIOS; of course it's not throttling. I'm using TriXX to overclock at +100mv and +50 power limit, so it was around 1.28v, and I'm monitoring with GPU-Z


----------



## Paulenski

A while ago I had some issues with my 290X: random black screens and shutdowns. Since then I went back to my CF 7850s, but as of Thursday I ordered a Sapphire Tri-X OC. I was hoping to have it today, but I guess Newegg doesn't mention that their Newegg Next Day is only standard FedEx overnight, not the Saturday one. Kinda mad about it; I spent about $30 to get it next day (was thinking Friday, but it took 23 hours to get processed with rush processing).

I'll definitely be making another in-depth voltage and temp chart, and will probably compare it to the stock one I did. Funny thing I saw in one of the reviews of the Tri-X OC: they mentioned the voltage bump taking the card up to 1.297. I was maxing out at that voltage regardless of what you do after 1.3 in GPUTweak. I don't know if there have been any improvements in the custom BIOSes or software tools to actually allow the ref cards to stay stable above 1.3.

Here's my chart for those who might have missed it:
http://i.imgur.com/X1GUJnE.png


----------



## dspacek

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> whats your core and mem


My core can handle 1100 without touching voltage or power; anything more leads to a black screen. If I add voltage and power, it causes big stuttering.
Memory 1700MHz stable


----------



## Durvelle27

Quote:


> Originally Posted by *shilka*
> 
> If the PSU have so weak 12v rails that makes them every easy to overload and the result is the PSU will shut down
> 
> 18 amps is very weak
> 
> Even the 450 watts Cooler Master V450S have 36 amps on its single 12v rail
> 
> Cooler Master V1000 have 83 amps on its single rail just to compare


My PSU has 70A on the single 12v rail


----------



## dspacek

Forceman, yep. I changed drivers so many times because of black screens that I forgot to unlock Overdrive in CCC.
So thanks for the reminder. Maybe it'll help. Now I'm off to bed. Goodbye for today.


----------



## ZealotKi11er

Any solution to maintain stability at 1500MHz memory with stock volts in 2D states? Under load it's fine, but as soon as there is no load the screen starts showing artifacts.


----------



## bond32

I get black screens so often I've lost count... If my 290X is even slightly unstable I get one.


----------



## jomama22

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Any solution to maintain stability with 1500Mhz with stock volts in 2D states? Under load its fine but as soon as there is no load the screen starts having artifacts.


Try just reapplying your voltage/clocks.


----------



## mukmaster

Quote:


> Originally Posted by *Durvelle27*
> 
> Need proof bud. GPU-Z validation or pic of card with your name handwritten on a piece of paper in the pic


I have edited my original post
Quote:


> Originally Posted by *mukmaster*
> 
> Add me plz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Asus R290x bought for 500 on newegg.
> 1150MHZ Core @ 1.1v
> 1600MHZ Mem Elpidia
> Stock Cooling
> RUNS HOT.


----------



## Forceman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Any solution to maintain stability with 1500Mhz with stock volts in 2D states? Under load its fine but as soon as there is no load the screen starts having artifacts.


Only way I could do it was increase the core volts slightly. I think it only took +25mV or so for mine.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> Only way I could do it was increase the core volts slightly. I think it only took +25mV or so for mine.


It takes +50mv for me, but that's a lot of volts when you don't need it for the core under load.


----------



## Widde

Something is wrong with my Unigine Heaven run http://piclair.com/odak8 I have disabled ULPS and all. It keeps saying x1 on the card as well http://piclair.com/wv8pm No idea why they are behaving like this :S The 2nd card isn't even heating up in Heaven


----------



## Bartouille

My card likes low temps but I don't have watercooling.







I was able to stabilize 10MHz higher (not much, I know, but still) by running the fan at 75% instead of 55%. Now all I need is some 3400rpm Deltas and to install this MK-26 to hit some nice clocks at +100. All I can get right now is 1160MHz at +100, but with good cooling I'm hoping to get 1180


----------



## PillarOfAutumn

I ran into a big problem just now. I was running pretty conservative clocks of 1200 and 1490 at 87mv like I always do. For some reason my system just crashed, but that didn't seem like a big deal. I pressed and held the power button to turn everything off, but when I went to power it on, everything started up but there was no display. I have a mechanical backlit keyboard whose lights turn on, and the whole startup process seems normal, just no display. Is my GPU shot?


----------



## Forceman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It takes +50mv for me but thats a lot of volts when u dont need it for the core under load.


You'd think there would be a way to adjust the 2D memory clocks independently.


----------



## smartdroid

Arizonian, please update my entry to 3 Sapphire R9 290


----------



## dorcopio

When will the custom R9 290s come out?
Can't find any on Amazon and so on.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> You'd think there would be a way to adjust the 2D memory clocks independently.


I think there is a problem with AMD's drivers. The card does not need to jump like crazy between 150MHz and 1500MHz in 2D mode. It should stay at 150MHz.


----------



## Jack Mac

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I think there is a problem with AMD drivers. Card does not need to go 150MHz - 1500MHz jumping like crazy in 2D mode. It should stay at 150Mhz.


I know, it's bad for me with two monitors. Hopefully 14.1 BETA fixes it and brings Mantle. We can all dream, right? One step at a time, AMD.


----------



## Arizonian

Quote:


> Originally Posted by *dspacek*
> 
> Hello, Im under water now, and testing OC performance. So Im stucked with memory at 1700 Mhz still.
> But the core starts throttling at 1080Mhz without influence on power or Voltage setup.
> Can anyone put here his BIOS for R9-290 which is not throttling?
> 
> 
> 
> 
> 
> 
> 
> Or how to aviod it?
> Temps are 50°C MAX, included VRM
> Tanks Very much
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> big city
> 
> 
> 
> Heatkiller


Congrats - updated









Quote:


> Originally Posted by *xavier78*
> 
> Max OC
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *smartdroid*
> 
> Arizonian, please update my entry to 3 Sapphire R9 290
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated


----------



## vieuxchnock

*Mine is Crazy. XFX 290 Black Edition running 1100/5400 without any problem. That's a crazy card.







*


----------



## Jack Mac

Quote:


> Originally Posted by *vieuxchnock*
> 
> *Mine is Crazy. XFX 290 Black Edition running 1100/5400 without any problem. That's a crazy card.
> 
> 
> 
> 
> 
> 
> 
> *


Post pics, I like the way the XFX 290s look.


----------



## robert0507

Quote:


> Originally Posted by *Mildcorma*
> 
> Ok so my card still isn't working, going to go over everything here in great detail and hope that someone has a great answer!
> 
> I bought this card a few days ago and it arrived yesterday with no damage. I installed the drivers, then played a game of BF4 to take it for a test run, then I uninstalled Nvidia drivers (I have / had a GTX 660ti). Once I restarted my PC after uninstalling, it wouldn't boot past the windows icon with a swirly circle.
> 
> My build in brief:
> 
> Windows 8.1
> ASUS Sabretooth P67 mobo
> GTX 660ti / AMD R9 290
> 2 x OCZ Vertex 4 120gb each
> 2 x Samsung Spinpoint F3s 1tb each
> 8gb G-skill DDR3 1333mhz
> i5 2500k not OC'ed at present
> 600w PSU
> I have 2 24 inch monitors that are connected via HDMI (although one is currently DVI for troubleshooting)
> 
> I took a look at my Bios and surprise, it was out of date. So i downloaded the 4 bios updates and cycled through them in order, clearing the CMOS after each one to be certain. Put the 290 in again, but no luck.
> 
> I re-installed windows completely (yay for backups right?) and went through the established process once more with the 13.12 drivers, but no luck. I tried the 13.11 drivers, but no luck there either. I've tried the tool listed on page one here to completely erase the drivers, and started again, but no luck at all.
> 
> Now, my mates got a rig and the card works fine in that with no apparent immediate issues also, so this is looking more and more like I send the card back and splash out a little more on a gtx780. I've never had these kinds of issues with new cards before, and as a lifelong Nvidia buyer, the move across to what's an amazing card for the price has been EVERYTHING but easy!
> 
> Does anyone have any ideas? If not then I have a 290 for sale, £310 if you pay for postage! :S


Hey, I think your issue has to do with a Windows update that installed an Intel driver. I was having the same issue until I booted into Safe Mode and uninstalled that update. Now everything is perfect.


----------



## vieuxchnock

*That's the reference body. It's the Black Edition because it's OC'd by XFX at 980.


*


----------



## Mildcorma

Quote:


> Originally Posted by *robert0507*
> 
> Hey, I think your issue has to do with a Windows update that installed an Intel driver. I was having the same issue until I booted into Safe Mode and uninstalled that update. Now everything is perfect.


Thanks, I'll give that a shot! What update was it specifically, can you remember?


----------



## Jack Mac

Quote:


> Originally Posted by *vieuxchnock*
> 
> *That's the reference body.It's the Black Edition because it's OC by XFX at 980.
> 
> 
> *


Oh, I thought it was XFX's non-reference design, still nice looking.


----------



## battleaxe

I want to nominate my 290 rig for "UGLIEST 290 RIG"

AIO cooler. Custom cut heat sinks for VRM and RAM. (read: hacksaw massacre)





We need an official ugly 290 forum. Too bad I had to do this because of the terrible stock heat sink. But whatever. It's under 50°C now all the time, VRMs included. Look at those hideous heat sinks... lol... that's just terrible.

Hey, at least it works and doesn't sound like a jet engine anymore.


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> I want to nominate my 290 rig for "UGLIEST 290 RIG"
> 
> AIO cooler. Custom cut heat sinks for VRM and RAM. (read hack saw massacre)
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> We need an official ugly 290 forum. Too bad I had to do this because of the terrible stock heat sink. But whatever. Its under 50c now all the time. VRM's included. Look at those hideous heat sinks... lol... that's just terrible.
> 
> Hey, at least it works and doesn't sound like a jet engine anymore.


Stock heatsink?

Anyway, as long as it works.


----------



## Darklyspectre

Quote:


> Originally Posted by *Jack Mac*
> 
> Oh, I thought it was XFX's non-reference design, still nice looking.


The Double Dissipation cards look pretty, but I've heard XFX kinda has crappy quality most of the time?

The only XFX card I had was an XFX 6990, but that broke down after half a year, and that is how I got a 680.


----------



## shilka

Quote:


> Originally Posted by *Darklyspectre*
> 
> The double dispensations look pretty but I have heard XFX kinda has ****ty quality most of the time?
> 
> The only XFX I had was a XFX 6990 but that broke down after a half year and that is how I got a 680.


The AMD 6000 series cards with the DD cooler were decent

The AMD 7000 series cards with the XFX DD cooler, on the other hand, were an epic fail

Don't know about the AMD 200 series models with the XFX DD cooler, but they can't be worse than the last gen


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> stock heatsink?
> 
> anyway, so long as it works.


No. My stock heat sink is still untouched. I put it back in the box in case I ever have to RMA.

The coolers are all old CPU coolers that I cut down to fit over the VRMs and RAM sections, plus a few small stick-ons for some of the RAM modules.

It looks like total crap, but it's working great. My temps are very good: under 50°C now, even while mining at 100%. The AIO is a Corsair H80 with push/pull fans.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jack Mac*
> 
> Post pics, I like the way the XFX 290s look.


Give me a couple of days and I'll post pics of mine









Really looking forward to seeing what they can do.


----------



## Durvelle27

Quote:


> Originally Posted by *shilka*
> 
> The AMD 6000 series cards with the DD cooler where decent
> 
> The AMD 7000 series cards with the XFX DD cooler on the hand was an epic fail
> 
> Dont know about the AMD 200 series models with the XFX DD cooler but they cant be worse then the last gen


Hmm, the DD cooler on my HD 7870 was pretty decent


----------



## robert0507

Quote:


> Originally Posted by *Mildcorma*
> 
> Thanks i'll give that a shot! What update was it specifically, can you remember?


Intel Corporation - Graphics Adapter WDDM1.2, Graphics Adapter WDDM1.3 - Intel(R) HD Graphics 4000

Download size: 94.4 MB

You may need to restart your computer for this update to take effect.

Update type: Important

Intel Corporation Graphics Adapter WDDM1.2, Graphics Adapter WDDM1.3 software update released in December, 2013

More information:
http://sysdev.microsoft.com/support/default.aspx

Help and Support:
http://support.microsoft.com/select/?target=hub

This the one I had to uninstall


----------



## SoloCamo

Quote:


> Originally Posted by *Durvelle27*
> 
> hmmm the DD cooler on my HD 7870 was pretty decent


My 7970 GHz Edition DD cooler sucked. I mean, it did decently considering it was a pretty good overclock on the GPU, but still, it struggled even at 100% fan speed and it was far from quiet.


----------



## shilka

Quote:


> Originally Posted by *SoloCamo*
> 
> My 7970ghz edition DD cooler sucked. I mean, it did decent considering it was a pretty good overclock on the gpu but still, it struggled even at 100% fan speed and it was far from quiet.


I had an XFX 7950 DD and it was so loud that it hurt both my head and ears

I was forced to get rid of it


----------



## Durvelle27

Quote:


> Originally Posted by *SoloCamo*
> 
> My 7970ghz edition DD cooler sucked. I mean, it did decent considering it was a pretty good overclock on the gpu but still, it struggled even at 100% fan speed and it was far from quiet.


My HD 7870, which was stock 1000/1250, did 1200/1450 @ 100% fan, silently, and never broke 65°C


----------



## Jack Mac

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Give me a couple of days and ill post pics of mine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Really looking forward to see what they can do.


Awesome, I can't wait to see them.


----------



## Sgt Bilko

Quote:


> Originally Posted by *shilka*
> 
> I had a XFX 7950 DD and it was so loud that it would hurt both my head and ears
> 
> Was forced to get rid of it


From what I understand the DD cooler on the R9 series is a new design. It's working quite well on the 280X, so I have faith that it can tame Hawaii as well


----------



## vieuxchnock

Quote:


> Originally Posted by *Jack Mac*
> 
> Oh, I thought it was XFX's non-reference design, still nice looking.


*I want the reference card because I'm going under water, and with non-ref it's sometimes harder to find a waterblock.*


----------



## TheRoot

this card


----------



## Sazz

Can anyone who has used a modded AIO water cooler on their card tell me which heatsinks you used for the memory and VRM? I am making a transition on my build and will just use AIOs on both the CPU and the GPU. Links to them from FCPU or PPCs would be great.


----------



## Widde

Started to perform a little better after I installed .NET Framework 4.5.1. Here is some GPU usage http://piclair.com/2oggy in BF4 at ultra, 1080p, resolution scale 180%. It's starting to get better.


----------



## BradleyW

Quote:


> Originally Posted by *TheRoot*
> 
> 
> this card


That's just the EK FC block right?


----------



## BackwoodsNC

Quote:


> Originally Posted by *BradleyW*
> 
> That's just the EK FC block right?


Yeah, you can see the EK logo on it.


----------



## Jack Mac

I'm not sure if anyone already posted this, but the MSI 290 Gaming is available for pre-order on Amazon. It says it'll ship in 3-5 weeks, but the price isn't too inflated. Amazon is asking for $470.
http://www.amazon.com/MSI-R9-290-GAMING-4G/dp/B00HPS4AFG/ref=sr_1_10?ie=UTF8&qid=1389502267&sr=8-10&keywords=r9+290


----------



## psyside

The Gaming card's cooling can't take volts/OC, sadly. I recommend the Tri-X.


----------



## ds84

Can anyone give the dimensions of the VRM? I'd like to find some compatible heatsinks for it, but I'm not sure of its dimensions.


----------



## Mr357

Out of curiosity, on no higher than the PT1 BIOS, what's the best stable core clock anyone has reached? Any 1300's?


----------



## phallacy

Would like to add my card to the list.

Sapphire R9 290x Aircooled


----------



## Arizonian

Quote:


> Originally Posted by *phallacy*
> 
> Would like to add my card to the list.
> 
> Sapphire R9 290x Aircooled
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## zpaf

Quote:


> Originally Posted by *mukmaster*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Asus R290x bought for 500 on newegg.
> 1150MHZ Core @ 1.1v
> 1600MHZ Mem Elpidia
> Stock Cooling
> RUNS HOT.
> 
> Thanks


1150/1600 with only 1.1v ?
I want to see a screenshot ...

https://imageshack.com/i/jjb7g1j


----------



## mukmaster

Quote:


> Originally Posted by *zpaf*
> 
> 1150/1600 with only 1.1v ?
> I want to see a screenshot ...
> 
> https://imageshack.com/i/jjb7g1j


I am sorry, I will edit that: +0.1v on a 1.1v stock card.


----------



## Luckael

hi,

Can i join this group?

From Asus R9 280X DCU2T V2 to Sapphire R9 290 Tri X


----------



## zpaf

Playing with the Metro LL benchmark, looking for the lowest voltage at 1100 core.



1.063v


----------



## DullBoi

Please add me to your list

1. 


2. Powercolor R9 290 AXR9 290 4GBD5-IDH/OC

3. Stock cooling


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> No. My stock heat sink is still untouched. I put it back in the box in case I ever have to RMA.
> 
> The coolers are all old CPU coolers that I cut down to fit over the VRM's and RAM sections. Plus a few small stick ons for a few of the RAM modules.
> 
> Looks like total crap, but its working great. My temps are very good. Under 50c now even while mining at 100%. The AOI is a Corsair H80 with push/pull fans.


No, I meant on the CPU. Man, I keep reading about this mining stuff. My 290 sits idle sometimes.


----------



## Asrock Extreme7

Quote:


> Originally Posted by *mukmaster*
> 
> I am sorry, I will edit that: +0.1v on a 1.1v stock card.


Is that 100mV with MSI AB?


----------



## devilhead

Quote:


> Originally Posted by *DullBoi*
> 
> Please add me to your list
> 
> 1.
> 
> 
> 2. Powercolor R9 290 AXR9 290 4GBD5-IDH/OC
> 
> 3. Stock cooling


heh, you need to clean your fingers before touching cards

or don't eat chips at the same time


----------



## dspacek

Hello. I'm now at 1200/1700 MHz (+200mV, +50% power), so I tested the CPU's influence on gaming performance, especially whether GPU performance is restricted by the CPU. Here are the results:
No influence or restriction from the i5-2500K; the GPU gets almost the same score.
My latest Valley bench is added as well.


----------



## Arizonian

Quote:


> Originally Posted by *Luckael*
> 
> hi,
> 
> Can i join this group?
> 
> From Asus R9 280X DCU2T V2 to Sapphire R9 290 Tri X
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - first TRI X - added









Quote:


> Originally Posted by *DullBoi*
> 
> Please add me to your list
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 1.
> 
> 
> 
> 
> 2. Powercolor R9 290 AXR9 290 4GBD5-IDH/OC
> 
> 3. Stock cooling


Congrats - added


----------



## esqueue

OK, I started mining a little while back and determined that the power consumption vs. payoff wasn't worth it. All in all, I felt kind of ripped off, especially by sites like BTC-e.com. They are scam artists, to say the least. They make it easy to transfer coins to their site but have millions of rules to withdraw the stuff, so I will let them have the $5. I also learned that deleting the wallet.dat file equals permanent loss of all coins, so I promptly deleted and unsubscribed from anything related to mining.

I started messing with folding last night, but my GPU sits at 0% load. I doubt it is even being used. Are there some settings or tweaks to make sure my GPU's full strength is being used?


----------



## DullBoi

Quote:


> Originally Posted by *devilhead*
> 
> heh, you need to clean your fingers before touching cards
>
> or don't eat chips at the same time


Haha, I will be slapping some blocks on soon, then I will clean them.


----------



## blazestalker100

Hello.
I've put in hours of testing and wanted to make my own thread, but I was afraid of haters, so I decided to post here.

I believe I got a good overclocker, to be honest.
My card seems to deliver!

So I did a series of tests and noted all the results.
I want to share the results here and would welcome feedback.

And I have a few questions as well:

What's considered a safe overvoltage for the R9 290 non-X?
I did some searching online and found people saying that running 1.2v on the core could put your card at great risk.
If I just knew what the risks are for these cards, I would know how far I'm willing to go with the voltage.
I've figured that anything under 1.2v is fine for me; it doesn't worry me at all.
But if going up to 1.3v wouldn't be too bad, I'd definitely push my card further.

I'm using my card for gaming, and that's why I am overclocking.

My goal is to find a great, stable overclock with no artifacts and good temps.

If anyone could answer my voltage question I'd be glad.
Does anyone know if AMD has any rated max voltages for the card?

Anyway, here is my testing:

RADEON R9 290 Non X (Not unlockable)
SAPPHIRE

Quote:


> 3dmark fire strike.
> 
> Running graphic tests on default settings.
> Afterburner +100mV on all tests (voltages up to 1.27v)
> Temps from 62-75c Using Prolimatech Mk26 cooler.
> 
> Gpu | Memory | _Score_
> _________________________________
> 
> 0947 1350 _10080_ (Stock Gpu)
> 1100 1350 _11321_ (No artifacts)
> 1135 1350 _11635_ (No artifacts)
> 1150 1350 _11825_ (No artifacts)
> 1165 1350 _11883_ (No artifacts)
> 1170 1350 _11882_ (No artifacts)
> 1180 1350 _11862_ (No artifacts)
> 1185 1350 _12007_ (very few or no artifacts)
> 1200 1350 _12200_ (Seeing few artifacts)
> 1215 1350 _12416_ (seeing few artifacts)
> 1235 1350 _12213_ (A lot of artifacts)
> 
> Running 1100 Gpu and testing memory (Elpida)
> 
> Gpu | Memory | _Score_
> _________________________________
> 
> 1100 1250 _11368_
> 1100 1350 Look above
> 1100 1400 _11439_
> 1100 1450 _11598_
> 1100 1475 _11651_
> 1100 1500 _11692_
> 1100 1525 _11505_
> 1100 1550 _11579_
> 1100 1600 Can't.
> 
> Ultimate test using the best but stable clocks:
> 1165 1500 12218 (Final voltage 1.156)
> Forcing constant voltage and using EVGA OC Scanner to heat the card up to 90°C, I eliminated artifacts by adjusting the voltage.
> Final voltage 1.156v
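A quick way to read a clock/score table like the one above is marginal score per extra MHz: where the gain collapses (or goes negative), throttling or artifacts are usually setting in. A minimal sketch, using only a few of the (core clock, score) pairs posted above:

```python
# Marginal Fire Strike score gain per MHz of core clock, using
# (clock, score) pairs copied from the table above. Where the gain
# per MHz collapses, the card has hit its practical limit.
results = [(947, 10080), (1100, 11321), (1150, 11825),
           (1165, 11883), (1200, 12200), (1235, 12213)]

for (c0, s0), (c1, s1) in zip(results, results[1:]):
    gain = (s1 - s0) / (c1 - c0)  # points per extra MHz
    print(f"{c0} -> {c1} MHz: {gain:+.1f} pts/MHz")
```

On these numbers the last step (1200 to 1235) yields well under 1 point per MHz, which matches the "a lot of artifacts" note at 1235.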


----------



## jprovido

I've been thinking of getting the NZXT G10. Would an Antec Kuhler 620 be enough to cool this? I'm also a bit worried about the VRM temps. Where I'm from, there's no way for me to get VRM heatsinks. Can I just ghetto-rig some RAM heatsinks and slap them on with thermal pads?


----------



## Asrock Extreme7

Quote:


> Originally Posted by *blazestalker100*
> 
> Hello.
> I've put in hours of testing and wanted to make my own thread, but I was afraid of haters, so I decided to post here.
>
> I believe I got a good overclocker, to be honest.
> My card seems to deliver!
>
> So I did a series of tests and noted all the results.
> I want to share the results here and would welcome feedback.
>
> And I have a few questions as well:
>
> What's considered a safe overvoltage for the R9 290 non-X?
> I did some searching online and found people saying that running 1.2v on the core could put your card at great risk.
> If I just knew what the risks are for these cards, I would know how far I'm willing to go with the voltage.
> I've figured that anything under 1.2v is fine for me; it doesn't worry me at all.
> But if going up to 1.3v wouldn't be too bad, I'd definitely push my card further.
>
> I'm using my card for gaming, and that's why I am overclocking.
>
> My goal is to find a great, stable overclock with no artifacts and good temps.
>
> If anyone could answer my voltage question I'd be glad.
> Does anyone know if AMD has any rated max voltages for the card?
>
> Anyway, here is my testing:
>
> RADEON R9 290 Non X (Not unlockable)
> SAPPHIRE


What are you reading volts with?


----------



## Asrock Extreme7

What are you reading volts with, and is that under load?


----------



## blazestalker100

Reading with both GPU-Z and Afterburner.
Loading the GPU to 90-100% with EVGA OC Scanner.

EDIT: playing BF4 now and the highest voltage peak I get is 1.18, average 1.16


----------



## smartdroid

I bought an R9 290 BF4 edition but have no idea how I can get the game key. Can anyone help me out?


----------



## zpaf

Quote:


> Originally Posted by *blazestalker100*
> 
> Anyway, here is my testing:
> 
> RADEON R9 290 Non X (Not unlockable)
> SAPPHIRE
> 
> 3dmark fire strike.
> 
> Running 1100 Gpu and testing memory (Elpida)
> 
> Gpu | Memory | Score
> _________________________________
> 
> *1100 1250 11368*
> 1100 1350 Look above
> 1100 1400 11439
> 1100 1450 11598
> 1100 1475 11651
> 1100 1500 11692
> 1100 1525 11505
> 1100 1550 11579
> 1100 1600 Can't.


I don't know if you run Win 7 or 8, but your scores seem low.
Here is mine.

https://imageshack.com/i/m9yd87j
http://www.3dmark.com/3dm/2185746


----------



## steadly2004

Quote:


> Originally Posted by *zpaf*
> 
> I don't know if you run Win 7 or 8, but your scores seem low.
> Here is mine.
> 
> https://imageshack.com/i/m9yd87j
> http://www.3dmark.com/3dm/2185746


Yeah, mine are about the same. I bet he's throttling due to temps or the power limit without knowing it.

Just realized my mem was clocked higher (1350MHz), which probably explains the small difference in graphics score.


----------



## CrossMaximus63

Hi chaps. Please forgive if this question has already been answered in this rather verbose thread.

I'm building up a box with an Asus Maximus Extreme VI and 3x ATI R9 290 video cards. I'm dismayed to note I can't fit these video cards in slots A2, B2, or 3, because the 10 SATA connectors at the far edge of the board, which are stacked 2 high, interfere with the video card. I can put the video cards in slots 1 and 4 only, which is where I presently have them, but this does not allow for CrossfireX configuration. I've ordered some 15cm PCI riser cables for greater flexibility, but am wondering if there's a simpler trick. What is standard practice in this case? Does anyone else have this problem? If so, what did you do to work around it?


----------



## Hattifnatten

Quote:


> Originally Posted by *CrossMaximus63*
> 
> because the 10 SATA connectors at the far edge of the board, *which are stacked 2 high*, interfere with the video card.


That is absolutely normal and they do not interfere with any expansion card.
You can use the risers to give the cards more breathing room though


----------



## zpaf

Quote:


> Originally Posted by *blazestalker100*
> 
> Anyway, so here is my testing:
> 
> RADEON R9 290 Non X (Not unlockable)
> SAPPHIRE
> 
> 3dmark fire strike.
> 
> Running graphic tests on default settings.
> Afterburner +100mV on all tests (voltages up to 1.27v)
> Temps from 62-75c Using Prolimatech Mk26 cooler.
> 
> Gpu | Memory | Score
> _________________________________
> 
> 0947 1350 10080 (Stock Gpu)
> 1100 1350 11321 (No artifacts)
> 1135 1350 11635 (No artifacts)
> 1150 1350 11825 (No artifacts)
> 1165 1350 11883 (No artifacts)
> 1170 1350 11882 (No artifacts)
> *1180 1350 11862 (No artifacts)*
> 1185 1350 12007 (very few or no artifacts)
> 1200 1350 12200 (Seeing few artifacts)
> 1215 1350 12416 (seeing few artifacts)
> 1235 1350 12213 (A lot of artifacts)


Same situation here at 1180/1350.

https://imageshack.com/i/g9uzw8j
http://www.3dmark.com/3dm/2186328


----------



## dspacek

Quote:


> Originally Posted by *zpaf*
> 
> I don't know if you run Win 7 or 8, but your scores seem low.
> Here is mine.
> 
> https://imageshack.com/i/m9yd87j
> http://www.3dmark.com/3dm/2185746


You are almost the same as he is. Look at mine: 12930, on the previous page.








But the drivers are very bad. If I OC it much, I get a black screen on every PC start, so I have to uninstall the drivers (with ATIman) and install them again.
AMD again earns its title of Almost Made Drivers.
I hope better drivers and a better AB update come out in a few weeks, because the GPU at clocks near 1200 MHz really needs more than +100mV.


----------



## dspacek

Hello again, I have some VERY important info for you. After many hundreds of tests, and reinstalling many versions of drivers, Trixx, AB, etc., I found this:
MSI Afterburner version 17 can manage your R9 290's RAM to at least 1700MHz, but the core may become unstable above 1150MHz.
MSI Afterburner version 18 can manage your R9 290's RAM to a max of 1600MHz, but the core is stable to at least 1200MHz.
This means that to get the best OC results you have to use Trixx together with AB to set a higher core voltage, or wait for MSI and hope they'll mix the best of these last versions together.


----------



## mojobear

Quote:


> Originally Posted by *CrossMaximus63*
> 
> Hi chaps. Please forgive if this question has already been answered in this rather verbose thread.
> 
> I'm building up a box with an Asus Maximus Extreme VI and 3x ATI R9 290 video cards. I'm dismayed to note I can't fit these video cards in slots A2, B2, or 3, because the 10 SATA connectors at the far edge of the board, which are stacked 2 high, interfere with the video card. I can put the video cards in slots 1 and 4 only, which is where I presently have them, but this does not allow for CrossfireX configuration. I've ordered some 15cm PCI riser cables for greater flexibility, but am wondering if there's a simpler trick. What is standard practice in this case? Does anyone else have this problem? If so, what did you do to work around it?


Hey, I have exactly the same setup as you, except mine are watercooled. I had no issues with my GPUs in 1, A2, and 3. The SATA ports are at 90 degrees (instead of shooting straight up), so they don't have any clearance issues. Not sure what problem you are having; I think my SATA3 ports are 80% filled and I have no issues.


----------



## zpaf

Quote:


> Originally Posted by *dspacek*
> 
> You are almost the same as he is. Look at mine: 12930, on the previous page.


I think you have to push more ...









https://imageshack.com/i/1n4fhij
http://www.3dmark.com/3dm/2041099


----------



## jomama22

Quote:


> Originally Posted by *dspacek*
> 
> You are almost the same as he is. Look at mine: 12930, on the previous page.
> 
> 
> 
> 
> 
> 
> 
> 
> But the drivers are very bad. If I OC it much, I get a black screen on every PC start, so I have to uninstall the drivers (with ATIman) and install them again.
> AMD again earns its title of Almost Made Drivers.
> I hope better drivers and a better AB update come out in a few weeks, because the GPU at clocks near 1200 MHz really needs more than +100mV.


You need to not set AB to start with Windows, and set profile 1 to your stock settings; this will prevent a bad OC from being forced on (thus causing your boot issues). I personally suggest the asus.rom BIOS and GPU Tweak for these cards if overclocking is the priority. This isn't a driver issue.

After running 11 of these cards, I have yet to have the driver crash without it being my fault (overclocking or otherwise). If you're black screening and crashing when overclocked, what do you think the problem really is?

Second, ATIman IS NOT SUGGESTED ANY LONGER. It mucks things up more than it helps at this point. Use DDU from guru3d.net, the best uninstaller for both AMD and Nvidia.

Last, there is a way to increase the AB voltage limit and remove LLC (vdroop), but seeing the issues you are reporting, I don't think that is safe for you lol.


----------



## rdr09

Quote:


> Originally Posted by *dspacek*
> 
> Hello again, I have some VERY important info for you. After many hundreds of tests, and reinstalling many versions of drivers, Trixx, AB, etc., I found this:
> MSI Afterburner version 17 can manage your R9 290's RAM to at least 1700MHz, but the core may become unstable above 1150MHz.
> MSI Afterburner version 18 can manage your R9 290's RAM to a max of 1600MHz, but the core is stable to at least 1200MHz.
> This means that to get the best OC results you have to use Trixx together with AB to set a higher core voltage, or wait for MSI and hope they'll mix the best of these last versions together.


I only use Trixx, to get 1320/1620 to bench my 290. It comes down to the silicon lottery, I guess. Have you tried lowering your memory OC while increasing your core to get better results in benches?

edit: at 1300 I have to max out the VDDC offset.


----------



## zpaf

Quote:


> Originally Posted by *jomama22*
> 
> You need to not set AB to start with Windows, and set profile 1 to your stock settings; this will prevent a bad OC from being forced on (thus causing your boot issues). *I personally suggest the asus.rom BIOS and GPU Tweak for these cards if overclocking is the priority. This isn't a driver issue.*
>
> After running 11 of these cards, I have yet to have the driver crash without it being my fault (overclocking or otherwise). If you're black screening and crashing when overclocked, what do you think the problem really is?
>
> Second, ATIman IS NOT SUGGESTED ANY LONGER. It mucks things up more than it helps at this point. Use DDU from guru3d.net, the best uninstaller for both AMD and Nvidia.
>
> Last, there is a way to increase the AB voltage limit and remove LLC (vdroop), but seeing the issues you are reporting, I don't think that is safe for you lol.


I have to agree with you.
I own a reference Asus 290, and the best combination is the Asus ROM ver 1 and GPU Tweak.
Asus tried to solve some black screen problems with ROM ver 2, but I get lower scores with it.
Maybe they changed the memory timings in the second BIOS version.


----------



## Jack Mac

Do you guys think a 4.4GHz 3570K would hold back two R9 290s? What about a stock i5?


----------



## mojobear

Hi all. Since I stated here on the 290(X) forums that BF4 was stuttering like mad with my tri-fire R9 290s, I wanted to clarify the issue now that I have mostly solved it, as I don't want to spread misconceptions: the majority of the problem, at least, was not due to the R9 290s or drivers.

Solution: unpark your cores. i7 + Windows 7 = stutter. With the fix I definitely felt a HUGE difference. I didn't believe it for the longest time so I didn't try it, but yeah, it makes a big difference. I still get stutters here and there, but at least the major hiccups are now gone.


----------



## rv8000

Could you update me to water please? Thanks.


----------



## Forceman

Quote:


> Originally Posted by *jprovido*
> 
> I've been thinking of getting the NZXT G10. Would an Antec Kuhler 620 be enough to cool this? I'm also a bit worried about the VRM temps. Where I'm from, there's no way for me to get VRM heatsinks. Can I just ghetto-rig some RAM heatsinks and slap them on with thermal pads?


Yes, a Kuhler 620 would be enough to keep the core cool. The fan alone should keep the VRMs at an acceptable temp, but rigging up some heatsinks from something else would be even better. A couple of people cut up old CPU coolers or the like to make some; you'd just need some thermal tape or similar to hold them on.


----------



## bloodkil93

On my 290X, with the Asus PT1 BIOS, I get some serious vdroop. I set it to 1.412 and it drops to 1.27, which seems to cause artifacting. Everything is at 1200/1500. How can I get rid of this vdroop?


----------



## CrossMaximus63

I found the problem. It wasn't that the cards weren't seated properly; it was this:

http://rog.asus.com/forum/showthread.php?36108-Bad-PCIe-lane
Quote:


> Do you have any of the PCI-E lanes 4 & 6 occupied? (sound card, RAID card etc.)
> 
> If so, lane 3 (B2) will be disabled


Apparently I can't have the cards spaced out; they need to be quite close together, which is a non-starter with the stock coolers. I guess I have to wait for the PCI risers. In the meantime I can use 2 cards, but not in CrossFire config.


----------



## Jack Mac

I hate miners, I want a second 290 for CF. They're not available on Amazon and Newegg has the nerve to charge $100 over MSRP and still charge for shipping.


----------



## jprovido

Quote:


> Originally Posted by *Jack Mac*
> 
> I hate miners, I want a second 290 for CF. They're not available on Amazon and Newegg has the nerve to charge $100 over MSRP and still charge for shipping.


How much can you get an R9 290/290X for now? Where I'm from there's no price gouging or shortage.
Quote:


> Yes, a Kuhler 620 would be enough to keep the core cool. The fan alone should keep the VRMs at an acceptable temp, but rigging up some heatsinks from something else would be even better. A couple of people cut up old CPU coolers or the like to make some; you'd just need some thermal tape or similar to hold them on.


Thanks for the info. I have an Antec 620 here lying around, so that's good to hear. The only thing to do now is to wait for the NZXT G10. It's not available here in Manila yet, but from what I've heard they're coming really soon.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Jack Mac*
> 
> I hate miners, I want a second 290 for CF. They're not available on Amazon and Newegg has the nerve to charge $100 over MSRP and still charge for shipping.


If you can't beat them, join them.


----------



## blazestalker100

Quote:


> Originally Posted by *zpaf*
> 
> I don't know if you run Win 7 or 8, but your scores seem low.
> Here is mine.
> http://www.3dmark.com/3dm/2185746


Using win 8.1


----------



## DeutyDaBeast

I bought the Gigabyte R9 290. Runs quiet and relatively cool, especially compared to all the other ones I saw the videos for on youtube.


----------



## blazestalker100

Quote:


> Originally Posted by *zpaf*
> 
> Same situation here at 1180/1350.
> 
> https://imageshack.com/i/g9uzw8j
> http://www.3dmark.com/3dm/2186328


You got a nice score; I couldn't match it. Using Win 8.1.


----------



## dspacek

Quote:


> Originally Posted by *jomama22*
> 
> You need to not set AB to start with windows and set profile 1 to your stock settings, this will prevent a bad oc from being forced on (thus causing your boot issues). I personally suggest the asus.rom bios and GPUtweak for these cards if overclocking is the priority. This isnt a driver issue.
> 
> After running 11 of these cards, i have yet to have the driver crash without it being my fault (overclocking or otherwise). If your blackscreening and crashing when overclocked, what do you think the problem really is?
> 
> Second, ATIman IS NOT SUGGESTED ANY LONGER. It mucks up more then helps at this point. Use DDU from guru3d.net, best uninstaller for both amd and nvidia.
> 
> Last, there is a way to increase the AB voltage limit and remove llc (vdroop), but seeing the issues you are reporting, I dont think that is safe for you lol.


You're not telling me anything new. I know these things, but the black screen bug is caused by the drivers, especially if the power limit is set manually in CCC and not through AB. Even if I remove AB, there is still a black screen at startup when CCC loads, so CCC can't manage to reset itself to default.

Everything I said about AB is true. Version 17 is absolutely better optimised than 18.

Finally, I have asked here many times which Asus BIOS works well for the 290, but nobody told me or sent a link.

So you can also let me know how to increase the voltage and disable throttling through AB.


----------



## sugarhell

You OC and you get a black screen; you're affecting the ACE files. Nothing to do with the drivers.


----------



## the9quad

Quote:


> Originally Posted by *jomama22*
> 
> After running 11 of these cards, i have yet to have the driver crash without it being my fault (overclocking or otherwise). If your blackscreening and crashing when overclocked, what do you think the problem really is?


Just wanted to say, the black screen was a hardware issue when I had it. It black screened at stock clocks, on the stock BIOS, and very, very frequently. I've received 3 more cards since then, and not one black screen ever, and that was all I changed.

TL;DR: Ever think that maybe you just got 11 cards that didn't have the defect, which is extremely likely?


----------



## kizwan

Well, a black screen can be caused by a few things, such as overheating (including the memory) and insufficient power (either at stock or overclocked). Faulty hardware can cause a black screen too, of course.


----------



## zpaf

Quote:


> Originally Posted by *dspacek*
> 
> You're not telling me anything new. I know these things, but the black screen bug is caused by the drivers, especially if the power limit is set manually in CCC and not through AB. Even if I remove AB, there is still a black screen at startup when CCC loads, so CCC can't manage to reset itself to default.
>
> Everything I said about AB is true. Version 17 is absolutely better optimised than 18.
>
> Finally, I have asked here many times which Asus BIOS works well for the 290, but nobody told me or sent a link.
>
> So you can also let me know how to increase the voltage and disable throttling through AB.


Here http://www.techpowerup.com/vgabios/index.php?architecture=ATI&manufacturer=&model=R9+290&interface=&memType=&memSize= are all the available BIOSes.
You can play with this one: 015.039.000.006.003516


----------



## jomama22

Quote:


> Originally Posted by *the9quad*
> 
> Just wanted to say, the black screen was a hardware issue when I had it. It black screened at stock clocks, on the stock BIOS, and very, very frequently. I've received 3 more cards since then, and not one black screen ever, and that was all I changed.
>
> TL;DR: Ever think that maybe you just got 11 cards that didn't have the defect, which is extremely likely?


If you read what I typed, you would see the "when overclocking" attached to my comment. I am fully aware of the black screen issue some users had at stock; at that point, it was a hardware issue, not the drivers.

I was responding to a comment about overclocking and black screening with the drivers being blamed, which is not the case.


----------



## kpoeticg

Anybody know if the AC Active Backplate is available anywhere in the states yet? Or anywhere with reasonable shipping to the states?


----------



## Jack Mac

Quote:


> Originally Posted by *jprovido*
> 
> How much can you get an R9 290/290X for now? Where I'm from there's no price gouging or shortage.
> Thanks for the info. I have an Antec 620 here lying around, so that's good to hear. The only thing to do now is to wait for the NZXT G10. It's not available here in Manila yet, but from what I've heard they're coming really soon.


They're OOS on Amazon and $500 USD on Newegg.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> If you can't beat them, join them.


I mine while I'm asleep and fold on the CPU. I don't see why people do it, though; the money sucks. I only make around 35 cents per hour mining on my 290.


----------



## zpaf

Some numbers from Metro LL.
The first 3 screenshots are with the official BIOS.
The last one is with the PT1 BIOS.

I would like to see an R9 with more than 57 fps.


----------



## MeneerVent

Quote:


> Originally Posted by *zpaf*
> 
> Some numbers from Metro LL.
> The first 3 screenshots are with the official BIOS.
> The last one is with the PT1 BIOS.
> 
> I would like to see an R9 with more than 57 fps.


Challenge accepted. And I assume that PT1 BIOS allows more voltage?

You win.


----------



## dspacek

Quote:


> Originally Posted by *zpaf*
> 
> Here are http://www.techpowerup.com/vgabios/index.php?architecture=ATI&manufacturer=&model=R9+290&interface=&memType=&memSize= all the available bios.
> You can play with this 015.039.000.006.003516


Yes, this is the BIOS I'm on now, but it allows nothing special. The maximum I got from the stock BIOS was 1200/1700 MHz.
With this Asus BIOS and their Tweak, it's impossible to set anything higher than 1600 on the RAM.
So I'm still voting for AB 17 and a manual +35% power in CCC, because AB often does not change it.
That's why I'm saying that anyone who is not very experienced in OCing should wait for 14.1 CCC drivers and AB 19+ ;-)
because the standard ways, without touching some of AB's files, do not lead to these high OCs. ;-)


----------



## ZealotKi11er

Quote:


> Originally Posted by *Jack Mac*
> 
> They're OOS on Amazon and $500 USD on Newegg.
> I mine while I'm asleep and fold on the CPU. I don't see why people do it, though; the money sucks. I only make around 35 cents per hour mining on my 290.


Well, I mine with my 290X almost 22/7. Still, 35 cents an hour is about $8 a day; in 2 weeks you make up the price hike.
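For what it's worth, the payback arithmetic above checks out. A quick sketch, using only the figures from the posts above (35 cents/hour net of power, ~22 hours/day uptime, and roughly a $100 price hike over MSRP; actual rates vary with difficulty and power cost):

```python
# Rough mining-payback sketch using the figures quoted in this thread.
rate_per_hour = 0.35   # net USD/hour after power, per the posts above
hours_per_day = 22     # "almost 22/7"
price_hike = 100.0     # assumed USD over MSRP

daily = rate_per_hour * hours_per_day
days_to_recoup = price_hike / daily

print(f"~${daily:.2f}/day, price hike recouped in ~{days_to_recoup:.0f} days")
# prints: ~$7.70/day, price hike recouped in ~13 days
```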


----------



## kpoeticg

How much of that $8 a day goes to electricity?


----------



## jomama22

Quote:


> Originally Posted by *dspacek*
> 
> Yes, this is the BIOS I'm on now, but it allows nothing special. The maximum I got from the stock BIOS was 1200/1700 MHz.
> With this Asus BIOS and their Tweak, it's impossible to set anything higher than 1600 on the RAM.
> So I'm still voting for AB 17 and a manual +35% power in CCC, because AB often does not change it.
> That's why I'm saying that anyone who is not very experienced in OCing should wait for 14.1 CCC drivers and AB 19+ ;-)
> because the standard ways, without touching some of AB's files, do not lead to these high OCs. ;-)


Use this Asus BIOS from the 290X:
http://www.techpowerup.com/vgabios/148214/asus-r9290x-4096-131014.html

Then use the latest GPU Tweak. That is the best reference BIOS for any 290/X. It lets me clock up to 1750+ on my Hynix cards and 1625+ on Elpida, and it also gave me the highest actual scores at any set clocks. I'm running 3 290Xs @ 1250/1625 on the asus.rom BIOS and GPU Tweak with 0 issues.

I use the PT3 BIOS w/ GPU Tweak for my suicide benches, which nets me 1350+/1750+.

Also know that with that BIOS and GPU Tweak you should never open CCC Overdrive, as the power limit then actually takes effect.
Quote:


> Originally Posted by *kpoeticg*
> 
> How much of that $8 a day goes to electricity?


That's after costing out power.


----------



## dspacek

Quote:


> Originally Posted by *youngmanblues*
> 
> I getting black screen too.
> 
> Happens randomly when I am playing BF4. It can take 30 mins to 3 hours to pop up, very random. Sound still plays properly but there's no response from the keyboard.
> Cannot Ctrl+Alt+Del; the Caps Lock light doesn't light up when pressed. Have to hard reset the computer to restart.
> 
> 
> CPU and GPU: stock clock
> MB bios: latest
> Gpu bios : stock bios
> Latest beta drivers
> Epida memory
> GPU temp: 57(idle), 78(load), custom fan curve using MSI AB


That's the driver problem; wait for 14.1, it should be solved. I read it at Powertech or somewhere, maybe AMD.com.


----------



## Hattifnatten

Quote:


> Originally Posted by *kpoeticg*
> 
> How much of that $8 a day goes to electricity?


For me, about 30-40 cent.


----------



## dspacek

Quote:


> Originally Posted by *jomama22*
> 
> Use this Asus bios from the 290x.
> http://www.techpowerup.com/vgabios/148214/asus-r9290x-4096-131014.html
> 
> Then use latest GPUtweak. That is the best reference bios for any 290/x. Let's me clock up to 1750+ on my hynix and 1625+ on elpida. Also gave me the highest actual scores at any set clocks. I'm running 3 290xs @ 1250/1625 on Asus.ROM bios and gputweak with 0 issues.
> 
> I use the pt3 bios w/ gputweak for my suicide benches which net me 1350+/1750+.
> 
> Also know that that bios and gputweak prevent you from ever opening CCC overdrive as then power limit actually works.
> That's after costing out power.


But I have the 290, not the X version. I know I can just try it instead of asking, but will it work? If not, which BIOS do you recommend? I've tried many of them....


----------



## jomama22

Quote:


> Originally Posted by *dspacek*
> 
> thats the driver problem, wait for 14.1 should be solved. I red it at Powertech or somewhere, maybe AMD.com


It's not a driver issue, for Christ's sake. It is a hardware issue, as has been proven over and over. Look back 2 pages where someone had a black screen mess at stock, RMA'd, and has since gotten 3 cards that do not black screen.

If it were a driver issue, every card would suffer from it, but that is not the case.

An easy way to prove my point is to have the guy bump his voltage at stock speeds and retest. If that gets rid of the black screens (which I would almost guarantee it would), then he has a hardware "issue", since the card can't do stock clocks at stock voltage.


----------



## jomama22

Quote:


> Originally Posted by *dspacek*
> 
> But I have 290 not X version, i know I can try it instead of asking but, will it work? If not which BIOS do you recomend. I tryied many of them....


Yes, it will work just fine. That is the main bios people used to unlock their 290s to 290Xs. The 290 and 290X are exactly the same except the 290 has fewer shaders; the bios will not harm your card. The worst that could happen is your 290 can't run stock 290X clocks (which I'm 99% sure it can), but that's nothing a small voltage bump wouldn't fix anyway.


----------



## dspacek

Quote:


> Originally Posted by *jomama22*
> 
> Yes it will work just fine. That is the main bios people used to unlock their 290s to 290xs. The 290 and 290x are exactly the same except for less shaders, the bios will not harm your card. The worst that could happen is if your 290 can't run stock 290x clocks (which I'm 99% sure it can), but nothing a small voltage bump wouldn't fix anyway.


So I tried it and it did not work. But no other ASUS BIOS for the 290 offers anything higher than 1235/1625, which is a pity when I know my Hynix memory can handle 1700+ in AB with the Sapphire stock bios.


----------



## velocityx

Kinda funny.

I'm sitting here browsing the internet and thinking to myself, damn, my feet are cold, gotta fire up BF4 to get some heat under my desk... ;d


----------



## BradleyW

Quote:


> Originally Posted by *velocityx*
> 
> kinda funny
> 
> im sitting browsing the internet and thinking to myself, damn my feet are feeling cold, gotta fire up bf4 to get some heat under my desk....;d


Same. Woke up at 8am and it was freezing in my front room so I launched Prime95 whilst the kettle was boiling.


----------



## bloodkil93

Hey guys, I can only get 1165/1500 out of my 290x card with Elpida memory before getting artifacts, I have set +50% Power Limit and +168 Power Limit, what am I doing wrong?

Thanks!


----------



## VSG

Do you mean +168mV core voltage? If so, you have probably hit the max on the core, try increasing the Aux voltage a bit and see if that helps with memory.


----------



## bond32

Quote:


> Originally Posted by *jomama22*
> 
> Use this Asus bios from the 290x.
> http://www.techpowerup.com/vgabios/148214/asus-r9290x-4096-131014.html
> 
> Then use latest GPUtweak. That is the best reference bios for any 290/x. Let's me clock up to 1750+ on my hynix and 1625+ on elpida. Also gave me the highest actual scores at any set clocks. I'm running 3 290xs @ 1250/1625 on Asus.ROM bios and gputweak with 0 issues.
> 
> I use the pt3 bios w/ gputweak for my suicide benches which net me 1350+/1750+.
> 
> Also know that that bios and gputweak prevent you from ever opening CCC overdrive as then power limit actually works.
> That's after costing out power.


Thanks for this, going to give this bios a try...

I'm not convinced the black screens are hardware related... My sapphire 290x, started getting black screens when I started upping the memory frequency. Never got one at stock... Also, and I am still testing, it seems my stable bf4 clocks are 1160/1300. If I move the memory up even a little it will black screen, however I can bench at 1500 memory...
Quote:


> Originally Posted by *bloodkil93*
> 
> Hey guys, I can only get 1165/1500 out of my 290x card with Elpida memory before getting artifacts, I have set +50% Power Limit and +168 Power Limit, what am I doing wrong?
> 
> Thanks!


Those are good clocks, nothing you did wrong I think. That's more than I can get and I'm on water...


----------



## bloodkil93

Quote:


> Originally Posted by *bond32*
> 
> Thanks for this, going to give this bios a try...
> 
> I'm not convinced the black screens are hardware related... My sapphire 290x, started getting black screens when I started upping the memory frequency. Never got one at stock... Also, and I am still testing, it seems my stable bf4 clocks are 1160/1300. If I move the memory up even a little it will black screen, however I can bench at 1500 memory...
> Those are good clocks, nothing you did wrong I think. That's more than I can get and I'm on water...


Meh, I guess some cards are just dud overclockers


----------



## stickg1

Quote:


> Originally Posted by *bloodkil93*
> 
> Meh, I guess some cards are just dud overclockers


Yeah, if they all clocked high then AMD would have shipped them with higher clocks.


----------



## ZealotKi11er

So I am going to remount my EK block because my VRM1 temps hit 78C with +50mV during mining. Just to make sure: all I have to do is add some thermal paste between the VRMs and the thermal pad?


----------



## Sazz

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So i am going to remount my EK Block because my VRM1 temps hit 78C with +50mV during mining. Just to make sure all i have to do is add some thermal paste between the VRMs and the Thermal Pad?


No need for thermal paste on the VRMs. You're using an EK block? You probably used the wrong thermal pad on the VRMs. Two thermal pads come with EK blocks, one 1mm thick and one 0.5mm thick; you need the 1mm pads on the VRMs, I believe, and the 0.5mm ones on the RAM modules.


----------



## Forceman

Quote:


> Originally Posted by *bloodkil93*
> 
> Meh, I guess some cards are just dud overclockers


15% isn't all that much of a dud.


----------



## ImJJames

I'm still getting crazy flickering in Chrome after waking from a long idle. Eventually it will crash to a black screen if I stay in Chrome, unless I restart the computer. Same happens on AB and Trixx. The weird part is I never get flicker anywhere else... just Chrome.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Sazz*
> 
> No need for thermal paste on the VRM's, you are using EK block? you probably used the wrong thermal pad on the VRM, there's two thermal pads that came with EK blocks, one is 1mm thick and one is .5mm thick, you need the 1mm thick thermal pads on the VRM's I believe and the .5 for the RAM modules.


I am pretty sure I used the thicker thermal pad for VRM1, and the same thermal pad for VRM2, which has much lower temps. I will inspect it tonight. My HD 7970, though different, has no problems with VRM1 and VRM2 cooling; they stay relatively close to the core temp.


----------



## Forceman

Quote:


> Originally Posted by *zpaf*
> 
> Some numbers from MetroLL.
> The first 3 screenshots are with official bios.
> The last one is with PT1 bios.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> I would like to see an R9 with more than 57 fps.


Talk to IamJJames (62.2)

http://www.overclock.net/t/1436635/official-ocns-team-green-vs-team-red-gk110-vs-hawaii/1590#post_21523325


----------



## Slomo4shO

Quote:


> Originally Posted by *geggeg*
> 
> I will just leave this here: Does Radeon R9 290X Behave Any Differently In A Closed Case?


Still testing in quiet mode... when will they learn?


----------



## cplifj

It's nice to see my HDMI cable connectors deform and melt from the heat coming off this reference 290X card while gaming.


----------



## headbass

installed accelero today and getting blackscreens immediately after I put load on the card

please check the thread here

EDIT: fixed it....uffff ;o]


----------



## dspacek

Can anyone explain or send me a quality link to how to extend voltage and OC limits in AB v.18?







thanks


----------



## Asrock Extreme7

Is 1150 core at 1300mV in GPU Tweak good? And is 1300mV about the same as +50 in MSI AB?
Thanks all


----------



## ZealotKi11er

So I removed my EK block and there was some kind of oily substance on the VRM1 pad. Could that be the problem? Also, one more time: do I need to put thermal paste on the VRMs?


----------



## headbass

Quote:


> Originally Posted by *dspacek*
> 
> Can anyone explain or send me a quality link to how to extend voltage and OC limits in AB v.18?
> 
> 
> 
> 
> 
> 
> 
> thanks


You can set pretty much anything you want via the profile file, even more than +200mV... though you already tried that:
http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/13420#post_21535735 (look at the edit at the bottom)


----------



## stickg1

Clean the pads and put a small dab of TIM for good measure.


----------



## Slomo4shO

Quote:


> Originally Posted by *CrossMaximus63*
> 
> Hi chaps. Please forgive if this question has already been answered in this rather verbose thread.
> 
> I'm building up a box with an Asus Maximus Extreme VI and 3x ATI R9 290 video cards. I'm dismayed to note I can't fit these video cards in slots A2, B2, or 3, because the 10 SATA connectors at the far edge of the board, which are stacked 2 high, interfere with the video card. I can put the video cards in slots 1 and 4 only, which is where I presently have them, but this does not allow for CrossfireX configuration. I've ordered some 15cm PCI riser cables for greater flexibility, but am wondering if there's a simpler trick. What is standard practice in this case? Does anyone else have this problem? If so, what did you do to work around it?


How are the SATA interfering? I have the same board and have a quadfire installed without any issues...


----------



## Redvineal

Soooo, last night I talked myself into the first steps of planning a custom loop that will include cooling for my R9 290. Right now I'm sort of set on the XSPC Razor block, but can be swayed. Anyone care to post what OC you achieved, temps you observe, and the general parts info you have (radiators, pump, etc.)?

I know this has been mentioned in passing across many previous pages, but a clean summary of that info would certainly help here.

And if there's a better place to ask, please let me know.

Thanks.


----------



## Forceman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So I removed my EK block and there was some kind of oil substance in the VRM1 pad. Could that be the problem. Also one more time. Do I need to put thermal paste on the. VRMs


EK recommends putting a small dab of thermal paste on the VRM before putting the pad on, if I remember correctly. I did and it didn't seem to cause any problems, but I don't know if it helped any.


----------



## BradleyW

Quote:


> Originally Posted by *Forceman*
> 
> EK recommends putting a small dab of thermal paste on the VRM before putting the pad on, if I remember correctly. I did and it didn't seem to cause any problems, but I don't know if it helped any.


I saw this in the manual as well. I decided to go against putting TIM on VRM's and it worked out fine.


----------



## Forceman

Quote:


> Originally Posted by *BradleyW*
> 
> I saw this in the manual as well. I decided to go against putting TIM on VRM's and it worked out fine.


Yeah, I doubt it makes much difference. If I had better pads (not the stock ones) I probably wouldn't have bothered.


----------



## bond32

Blackscreen update here, tried the asus bios linked a few pages back. Slowly working with it, ran bf4 stable at 1170/1375 at 1.32 V. As soon as I went to 1180 I got a black screen. Didn't adjust memory yet... So now running with more voltage, seems to be ok for now at 1180/1400 at 1.375 V.

Edit: Nvm, got a black screen... Still working


----------



## rabidz7

Anyone know how to volt mod the vram?


----------



## esqueue

Quote:


> Originally Posted by *Jack Mac*
> 
> They're OOS on Amazon and $500 USD.
> I do mine while I'm asleep and fold on the CPU. I don't see why people do it though, the money sucks. I only make around 35 cents per hour mining on my 290.


Yeah, I found it a complete waste of time. Your rig is completely tied up, the cards put out quite a bit of heat, and you may end up dealing with criminals such as BTC-e.com if you don't research enough. While your computer is mining, all you can do is browse the internet. There are lots of critical details, such as: if you lose your wallet.dat, you lose everything. I guess there are backup options, but... I found out the hard way.

I honestly found it very stressful, and the few cents per hour wasn't worth it IMO. To each their own though.

I found folding a much more productive use for my H/W.


----------



## headbass

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> is 1150 core with 1300mv gpu tweak good is 1300mv like +50 in msi ab
> thanks all


That's good imho; my 290 with 71.5% ASIC needs +130mV in AB, which is about 1.35v in GPU-Z, to be artifact-free at 1150.
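The offset-vs-absolute arithmetic above lines up if you assume a stock VID somewhere around 1.22V (a sketch; the 1.22V figure is an assumption for illustration — actual stock voltage varies per card, and GPU-Z reports measured VDDC under load, not the set value):

```python
# Converting an Afterburner voltage offset to an approximate absolute voltage.
# stock_vid is assumed for illustration; check your own card's stock VDDC in GPU-Z.
stock_vid = 1.22          # volts, assumed stock VID
offset_mv = 130           # +130mV offset set in Afterburner
loaded_voltage = stock_vid + offset_mv / 1000
print(f"~{loaded_voltage:.2f} V")  # matches the ~1.35v GPU-Z reading quoted above
```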


----------



## Roy360

Quote:


> Originally Posted by *BradleyW*
> 
> Same. Woke up at 8am and it was freezing in my front room so I launched Prime95 whilst the kettle was boiling.


haha I do the same thing. I'm going to miss that feature of the R9 290 once I water cool it.

Going to be receiving my RMA'd XFX card back tomorrow. Anything I should test before voiding my warranty?

P.S. Anyone know of a good way to remove those two stickers on the screws?


----------



## ZealotKi11er

Wow, that's high voltage. I am @ 1.1v stock, 77.9% ASIC.


----------



## grunion

Quote:


> Originally Posted by *CrossMaximus63*
> 
> Hi chaps. Please forgive if this question has already been answered in this rather verbose thread.
> 
> I'm building up a box with an Asus Maximus Extreme VI and 3x ATI R9 290 video cards. I'm dismayed to note I can't fit these video cards in slots A2, B2, or 3, because the 10 SATA connectors at the far edge of the board, which are stacked 2 high, interfere with the video card. I can put the video cards in slots 1 and 4 only, which is where I presently have them, but this does not allow for CrossfireX configuration. I've ordered some 15cm PCI riser cables for greater flexibility, but am wondering if there's a simpler trick. What is standard practice in this case? Does anyone else have this problem? If so, what did you do to work around it?


Quote:


> Originally Posted by *Slomo4shO*
> 
> How are the SATA interfering? I have the same board and have a quadfire installed without any issues...


Ditto to this, I have no issues.


----------



## jprovido

quick question guys. r9 290x reference or sapphire r9 290 tri-x? thanks


----------



## neurotix

Probably Tri-X if you don't plan on water cooling.

I think reference with an Accelero or Gelid Icy might get even better temps but you gotta find and buy one of those coolers and void your warranty.


----------



## Gumbi

Quote:


> Originally Posted by *jprovido*
> 
> quick question guys. r9 290x reference or sapphire r9 290 tri-x? thanks


290 Tri-X without a doubt. Clock for clock the 290X is less than 5% faster than a 290, and a Tri-X 290 is even cheaper than a reference 290X. It should be 450 bucks, but with the mining craze you'll have to pay more.

It's definitely worth it: quieter cooler, allowing better overclocks, etc.


----------



## jprovido

Quote:


> Originally Posted by *Gumbi*
> 
> 290 Trix without a doubt. Clock for clock the 290x is less than 5% faster than a 290, and a Trix 290 is even cheaper than a 290x reference. It should be 450 bucks, but with the mining craze you'll have to pay more.
> 
> It's definitely worth it. Queiter cooler, allowing better overclocks etc etc.


Ty for the info. I was gonna get my R9 290X on Wednesday, but I saw this card get posted just a few hours ago:

http://www.tipidpc.com/viewitem.php?iid=28645412 that's 22,000, or about 480 USD; a reference 290X is about 550 USD where I'm from. There's not much price gouging or shortage here in Manila; you can get these cards anywhere with ease.

I did see someone selling a 1-month-used reference R9 290 for 365 USD. It's tempting me quite a bit as well.








Quote:


> Probably Tri-X if you don't plan on water cooling.
> 
> I think reference with an Accelero or Gelid Icy might get even better temps but you gotta find and buy one of those coolers and void your warranty.


We buy everything here locally, so for changing coolers I just need to inform the retailer I got it from. I know it sounds kinda weird, but you just need to ask for permission to change the cooler so they can apply new warranty stickers with the new cooler. That's what they did with my HD 5970 when I bought an Accelero cooler a few years ago.


----------



## psyside

Quote:


> Originally Posted by *jprovido*
> 
> ty for the info. was gon get my r9 290x on wednesday but I saw this card getting posted just a few hours ago
> 
> http://www.tipidpc.com/viewitem.php?iid=28645412 that's 22,000 or about 480USD, a reference 290x is about 550USD from where I'm from. there's not much price gouging and shortage here in Manila. you can get these cards anywhere with ease.
> 
> I did see someone selling a 1 month used reference r9 290 for 365USD. it's tempting me quite a bit as well
> 
> 
> 
> 
> 
> 
> 
> 
> we buy everything here locally so with changing coolers I just need to inform the retailer that I got it from. i know it sounds kinda weird but you just need to ask for permission to change the coolers so they can apply new warranty stickers with the new cooler. that's what they did with my HD5970 when I bought an accelero cooler a few years ago


The 290 Tri-X can take *1.4v on air*, with VRMs @ 90c and core @ 75c over a 30-min Heaven loop at 70% fan and 1220/1600... so you get the idea. Monster card.


----------



## ZealotKi11er

Added thermal paste on VRM1. Still getting high temps, ~70C. VRM2 is only 41C.


----------



## bond32

Whoa there, what's this... Seems BF4 is stable at 1190/1375.. This is good news. Way better than anything before plus temps all in check. No black screens yet.


----------



## Its L0G4N

Quote:


> Originally Posted by *Roy360*
> 
> haha I do the same thing. I'm going to miss that feature of the R9 290 once I water cool it.
> 
> Going to be receiving my RMA'd XFX card back tomorrow. Anything I should test before voiding my warranty?
> 
> P.S Anyone know of any good ways to remove those two stickers on the screws?


If you live in the U.S., nothing done to the card (except damage) will void the warranty.


----------



## Paulenski

Quote:


> Originally Posted by *Roy360*
> 
> haha I do the same thing. I'm going to miss that feature of the R9 290 once I water cool it.
> 
> Going to be receiving my RMA'd XFX card back tomorrow. Anything I should test before voiding my warranty?
> 
> P.S Anyone know of any good ways to remove those two stickers on the screws?


Those stickers don't matter for RMAs in the US; XFX allows removal of the heatsink for third-party coolers. The stickers on the GPU bracket are only relevant for places outside the US that do not allow third-party coolers. I RMA'd mine after having some VRM heat issues with an MK-26. I spoke with two different people there on the matter and was told about the stickers and what's allowed.


----------



## Forceman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Added Thermal Paste in the VRM1. Still getting high temps. ~ 70C. VRM2 is only 41C.


Which thermal pads are you using? There are two different stock pads, one is 0.5mm and one is 1mm, be sure you used the 1mm ones on the VRMs. Is that at stock voltage?


----------



## Sgt Bilko

Update Please









2 XFX R9 290 DD Black Edition's





Spoiler: More Pics


----------



## psyside

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Update Please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2 XFX R9 290 DD Black Edition's
> 
> 
> 
> 
> 
> Spoiler: More Pics


I hope, you gonna watercool









http://www.overclock.net/t/1458690/fud-xfx-radeon-dd-r9-290x-1000m-4gb-reviewed/10

The cards on air are doing terribly.


----------



## Sgt Bilko

Quote:


> Originally Posted by *psyside*
> 
> I hope, you gonna watercool
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1458690/fud-xfx-radeon-dd-r9-290x-1000m-4gb-reviewed/10
> 
> Cards on air, are being terrible.


Pretty good for me so far. I was playing some BF4 while waiting for the pics to upload (abysmally slow net) and the cores sat at 68c and 71c.

VRMs never went over 75 either, but that's at stock settings.

Mind you, I scoured the net trying to find a review for these cards before they shipped out... and the review comes out on the day they're delivered.


----------



## psyside

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Pretty good for me so far, was playing some BF4 while waiting for the pics to upload (abysmally slow net) and cores sat at 68c and 71c.
> 
> vrm's never went over 75 either but thats at stock settings.
> 
> Mind you i scoured the net trying to find a review for these cards before they shipped out......review comes on the day they are delivered


LOL, talk about bad luck...

Why didn't you go for the Tri-X? A proven beast of a card. Are they out of stock in the USA? I should be getting mine in a few days.


----------



## Arizonian

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Update Please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2 XFX R9 290 DD Black Edition's
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: More Pics


Congrats on the new cards. Updated









IMO aesthetically nice.


----------



## TheRoot

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Update Please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2 XFX R9 290 DD Black Edition's
> 
> 
> 
> 
> 
> Spoiler: More Pics


look at this card


----------



## Sgt Bilko

Quote:


> Originally Posted by *psyside*
> 
> LOL talking about bad luck...
> 
> Why you didn't go for the Tri-X? proven beast of cards. Are they out of stock in the USA? i should be getting mine in few days.


I don't live in the US, and the only times they were in stock in Aus were when I was at work or out of the house.

Doesn't really matter, as I'm water cooling later down the line anyway; I was just curious to see how they would go.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats on the new cards. Updated
> 
> 
> 
> 
> 
> 
> 
> 
> 
> IMO asthetically nice.


Oh yeah, they look amazing, really great feel to them as well. Surprisingly light, but also no need for a backplate; next to no sag at all. Very impressed build-quality-wise.

EDIT: Instead of triple posting i'm just going to edit this one









Both cards do not unlock and they both have Elpida RAM, so that's confirmed. The top card is on average 10c hotter than the bottom one, and that's with 30c ambients...


----------



## Gumbi

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Oh yeah, they look amazing, really great feel to them as well, surprisingly light but also no need for a backplate, next to no sag at all, was very impressed build quality wise.
> 
> EDIT: Instead of triple posting i'm just going to edit this one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Both cards do not unlock and they both have Elphida ram so thats confirmed, the top card is on average 10c hotter than the bottom one and thats with 30c ambients......


That's expected with a CF setup, as one card is feeding the other preheated air. You have 30-degree ambients? Are you sure? That's very high, 8-12 degrees higher than most people's. What are the cards idling at? Do you have a side fan feeding these cards cool air?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gumbi*
> 
> That's expected with a CF setup, as one card is feeding the other preheated air. You have 30-degree ambients? Are you sure? That's very high, 8-12 degrees higher than most people's. What are the cards idling at? Do you have a side fan feeding these cards cool air?


I live in Australia with no air-con; that's why the temps are high. The cards are idling at 41c and 37c.

There is no side fan, but I have taken the sides off my case to let them breathe a bit better. I don't normally run overclocked settings in summer for this reason, but with new cards... I gotta try them out.









Just ran a Firestrike and FSE at stock settings with them.

FS: http://www.3dmark.com/3dm/2191525

FSE: http://www.3dmark.com/3dm/2191757

Side note: I'm drawing a max of 895W during the FSE Combined test, for those interested.


----------



## velocityx

Quote:


> Originally Posted by *Gumbi*
> 
> That's expected with a CF setup, as one card is feeding the other preheated air. You have 30-degree ambients? Are you sure? That's very high, 8-12 degrees higher than most people's. What are the cards idling at? Do you have a side fan feeding these cards cool air?


Yep, my ref 290s do the same; the top one is 15c hotter.


----------



## Luckael

Thanks for adding me in the group


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Oh yeah, they look amazing, really great feel to them as well, surprisingly light but also no need for a backplate, next to no sag at all, was very impressed build quality wise.
> 
> EDIT: Instead of triple posting i'm just going to edit this one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Both cards do not unlock and they both have Elphida ram so thats confirmed, the top card is on average 10c hotter than the bottom one and thats with 30c ambients......


Congrats & enjoy your cards!









Mine had around a ~10C difference too, with the first card running in the 80s Celsius and the second card in the 70s Celsius. Then I swapped the cards (second card to first slot, first card to second slot) and now both run in the 80s Celsius (a couple of degrees between them). Other than swapping the cards, I did re-install the driver though. Anyway, the first card will usually run hotter than the second card.


----------



## ottoore

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Added Thermal Paste in the VRM1. Still getting high temps. ~ 70C. VRM2 is only 41C.


I saw another Italian guy having the same problem. You can try applying 2 pieces of thermal pad (2mm) for better contact; he solved the problem that way.


----------



## ST05

Check ASIC quality in GPU-Z for both cards; there's probably a bigger difference there.








Congrats on the nice couple.


----------



## Arizonian

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Oh yeah, they look amazing, really great feel to them as well, surprisingly light but also no need for a backplate, next to no sag at all, was very impressed build quality wise.
> 
> EDIT: Instead of triple posting i'm just going to edit this one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Both cards do not unlock and they both have Elphida ram so thats confirmed, the top card is on average 10c hotter than the bottom one and thats with 30c ambients......


Ooooh, I had you down for 'stock' reference cooling and almost forgot - I've now got you listed as 'DD Black' for 'cooling' - updated.









Again, congrats - you're the first on the list with XFX DD Black edition cooling.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ST05*
> 
> check ASIC quality on gpu-z for both card, probably has a "bigger" difference on there
> 
> 
> 
> 
> 
> 
> 
> 
> congrat for the nice couple


74.6% and 80.4%, not that it makes much difference really.


----------



## krillz0

Need more info, or is this OK for the add?








Sapphire R9 290 4gb
core: 1100
Mem: 1350
Cooling: Stock :/


----------



## Arizonian

Quote:


> Originally Posted by *krillz0*
> 
> need more info or is this ok for add?
> 
> 
> 
> 
> 
> 
> 
> 
> Sapphire R9 290 4gb
> core: 1100
> Mem: 1350
> Cooling: Stock :/
> 
> 
> Spoiler: Warning: Spoiler!


Perfect - welcome aboard. Congrats - added


----------



## Romeru

My R9 290 Windforce is at 1200MHz on the core with a voltage increase of +80, so far so stable. But MSI Afterburner won't let me overvolt any more. Is there any way to get more out of this chip, as my temperatures aren't very high? Around 77c in Battlefield 4.

I love this card so far anyway.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Romeru*
> 
> My R9 290 Windforce is at 1200mhz on the core with an voltage increase of 80, so far so stable. But msi afterburner wont let me overvolt anymore, is there anyway to get more out of this chip as my temperatures arent very high. around 77c in battlefield 4.
> 
> I love this card so far anyway


Sapphire Trixx allows +200mV compared to Afterburner's +100mV.

And nice clock, you seem to have a winner there


----------



## Aesthetics

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Sapphire Trixx allows +200mV compared to Afterburners +100mV.
> 
> And nice clock, you seem to have a winner there


I also own an XFX 290 DD; what temps are you getting on your VRMs?


----------



## Jack Mac

Wow, those XFX cards are impressive.


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 74.6% and 80.4%, not that it makes much difference really.


I bet the 80.4% card has a lower voltage requirement than the 74.6% card. Mine maxes at 1.227 - 1.234V (ASIC 70.3%, Elpida) and the other maxes at 1.211V (ASIC 77%, Hynix). These voltages are at stock clocks, of course.


----------



## dspacek

Quote:


> Originally Posted by *headbass*
> 
> you can set pretty much anything you want via the profile file, even more than +200mV.....thought you already tried that
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/13420#post_21535735 (look at the edit at the bottom)


Yep, I tried, but for version 18 it does not work; only in version 17 was I able to.


----------



## Sazz

Quote:


> Originally Posted by *kizwan*
> 
> I bet 80.4% card have lower voltage requirement than 74.6% card. Mine max at 1.227 - 1.234V (ASIC 70.3%, Elpida) & another max at 1.211V (ASIC 77%, Hynix). These voltages at stock clock of course.


At stock settings mine runs at 1.145v, and at that stock voltage I can go up to 1100 on the core stable. Memory only goes 1385 stable; I can run up to 1550MHz on the memory, but I get black screens in games that use a lot of VRAM. Benchmarking-wise I don't get black screens at 1550MHz on the VRAM. It has 80.3% ASIC as well.


----------



## Romeru

Well uhm, this is odd. Mind if I ask what fps you guys get in games such as StarCraft 1 (yes, StarCraft: Brood War), Minecraft, and Saints Row 3 and 4?

In Minecraft my fps is just above 60; I was running it happily on my GTX 660 before and was averaging 500-600 fps constantly. And in an old game like StarCraft it feels like my fps is locked at 15 or something; I think it's related to v-sync messing it up.



Minecraft has lots of stuttering; however, other games like Battlefield 4, Battlefield 3, StarCraft II and Crysis 3 work perfectly. Getting very smooth fps there.
I uninstalled my NVIDIA drivers via the Windows control panel and then installed Catalyst 13.12.

Specs:
GA-990FXA-UD3
FX-8350 @ 5Ghz
CORSAIR 8GB 1600 9-9-9-24 RAM
Gigabyte R9 290 (non-x) Windforce 3x @ 1200mhz
CPU cooler: H80i
GPU cooler: Aftermarket
PSU: XFX ProSeries XXX Edition P1-850X-XXB9 850W

Any ideas?


----------



## gordonfo

PowerColor 290X watercooled with an XSPC 290X block and backplate ^^. The ASUS BIOS with GPU Tweak kept crashing my BF4 gaming sessions, so I went back to the stock BIOS and overclocked a bit =D
gpuz link
http://www.techpowerup.com/gpuz/39d9p/







----------



## Sgt Bilko

Quote:


> Originally Posted by *Aesthetics*
> 
> I also own XFX 290 DD, what temps are you getting on your VRM?


My VRM temps max out at 75°C when benching; I haven't got any gaming done yet lol
Quote:


> Originally Posted by *Jack Mac*
> 
> Wow, those XFX cards are impressive.


Yeah, they are a lot better than people were expecting. The glowing XFX logo on the underside actually turns off on the second card until it actually has to work. I'm very pleased with them; even at stock clocks they run circles around anything I can throw at them.
Quote:


> Originally Posted by *kizwan*
> 
> I bet the 80.4% card has a lower voltage requirement than the 74.6% card. Mine maxes at 1.227 - 1.234V (ASIC 70.3%, Elpida) & the other maxes at 1.211V (ASIC 77%, Hynix). These voltages are at stock clocks, of course.


1.211 vs 1.219.

EDIT: I just noticed that for some strange reason my 2nd 290 is showing as PCIe 2.0 x 8 even though it's in a x16 slot, any ideas?


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> my vrm temps max out at 75c when benching, i haven't got any gaming done yet lol
> Yeah, they are a lot better than what people were expecting, The glowing from the XFX on the underside actually turns off on the second card until it actually has to work. I'm very pleased with them, even at stock clocks they run circles around anything i can throw at them.
> 1.211 vs 1.219.
> 
> EDIT: I just noticed that for some strange reason my 2nd 290 is showing as PCIe 2.0 x 8 even though it's in a x16 slot, any ideas?


X8 even at load?

do your cards follow the reference pcb?


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> X8 even at load?
> 
> do your cards follow the reference pcb?


I haven't had the cooler off them, but I'm 98% sure they are reference boards (minus the XFX logo)

I'll tinker a bit more tomorrow; I think a full Windows install might be in order. I've been putting it off for far too long now......


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I haven't had the cooler off them but i'm 98% sure they are Ref Boards (Minus the XFX Logo)
> 
> I'll tinker a bit more tomorrow, i think a full Windows install might be in order, i've been putting it off for far too long now......


Use GPU-Z's render test to put the card under load and check if it switches to x16. Thanks.


----------



## kpoeticg

Quote:


> Originally Posted by *jomama22*
> 
> That's after costing out power.


Quote:


> Originally Posted by *Hattifnatten*
> 
> For me, about 30-40 cent.


Thanx for explaining. +1 to both of you

I'm gonna have 3 290X's in my build, and I'm considering mining in my PC's downtime just to try it out


----------



## Hattifnatten

Power prices may be different for you, but you'll make a profit nevertheless.


----------



## kpoeticg

Sweet. I could definitely use it =)


----------



## BradleyW

Quote:


> Originally Posted by *Sgt Bilko*
> 
> my vrm temps max out at 75c when benching, i haven't got any gaming done yet lol
> Yeah, they are a lot better than what people were expecting, The glowing from the XFX on the underside actually turns off on the second card until it actually has to work. I'm very pleased with them, even at stock clocks they run circles around anything i can throw at them.
> 1.211 vs 1.219.
> 
> EDIT: I just noticed that for some strange reason my *2nd 290 is showing as PCIe 2.0 x 8 even though it's in a x16 slot*, any ideas?


Is the slot x8 electrical?
Set the power plan to High Performance in the Windows power options.
If this fails, ensure the BIOS settings are correct.
Still no luck? Re-seat the card. Each time you re-seat and test, try seating it a little less firmly. Too much pressure from an overly firm install can knock the slot speed down. I know that sounds absolutely ridiculous, but it's true; I discovered this and replicated the problem on other systems.


----------



## Sazz

Hey, just a question. I haven't had an Intel build since 2007, so I don't really have a concrete idea of its power consumption.

My question is: would a 650 watt PSU be enough to power these?

i5 3570K (probably an OC of 4GHz to 4.4GHz depending on temps/voltage)
R9 290X (stock voltage; if I could run it at max +200mV for benchmarking, that would be nice)
two (2) H100i (one for CPU and another for GPU)
one SSD and one HDD
2x4GB RAM modules
4-6 fans


----------



## Durvelle27

Quote:


> Originally Posted by *Sazz*
> 
> Hey just a question, Haven't had an intel build since 2007 so I don't really have a concrete idea on the power consumption of it.
> 
> But my question is, would a 650watts PSU enough to power these..
> 
> i5 3570k (prolly a OC of 4Ghz to 4.4Ghz depending on temps/voltage)
> R9 290X (stock voltage, maybe if I could run it at max +200mV for benchmarking that would be nice)
> two (2) H100i (one for CPU and another for GPU)
> one SSD and HDD
> 2x4Gb RAM module.
> 4-6Fans


It would be fine for a stock 290X, but I can't say once you start overvolting, as these cards can start drawing a lot of power. If you don't go too crazy with the volts you'll be fine.
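To put rough numbers on the question above, here is a quick back-of-the-envelope budget. The wattages are my ballpark assumptions, not measurements; actual draw varies by sample and load:

```python
# Rough DC power budget for the build above
# (assumed ballpark wattages, not measured values)
parts = {
    "i5-3570K @ ~4.4GHz":             120,
    "R9 290X, stock voltage, loaded": 300,
    "2x H100i (pumps + fans)":         30,
    "SSD + HDD":                       15,
    "2x4GB RAM":                       10,
    "4-6 case fans":                   15,
    "motherboard + misc":              50,
}

total = sum(parts.values())
print(total)  # 540
```

On those assumptions a 650W unit leaves only about 110W of headroom, which is why heavy overvolting (a 290X can draw substantially more at +200mV) pushes you toward the 850W option.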


----------



## Sazz

Quote:


> Originally Posted by *Durvelle27*
> 
> It would be fine for a stock 290X but I can't say once you start overvolting the 290X as they can start drawing a lot power. If you don't go to crazy with volts you'll be fine.


I'll probably opt for an 850 watt PSU. If only length weren't an issue: 160mm is the maximum PSU length I can fit in a BitFenix Prodigy, and that's going to be really tight with modular cables.

My Seasonic X1250 won't even fit in it, since it's 190mm long.


----------



## ReHWolution

Update me to aftermarket, Icy Vision Rev. 2 mounted on my 290X


----------



## smoke2

I'm stuck between the Gigabyte Windforce 290 and the Sapphire Tri-X 290.
Which one would you prefer?
http://www.tomshardware.com/reviews/r9-290x-case-performance,3710-2.html
According to this review the Sapphire is better, but isn't it also noisier than the Gigabyte?

Which one would you choose?

I own a Lancool PC-K9X case (not a silenced case):
http://www.lancoolpc.com/en/product/product06.php?pr_index=50&cl_index=1&sc_index=25&ss_index=96&g=f

Thanks.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> Which thermal pads are you using? There are two different stock pads, one is 0.5mm and one is 1mm, be sure you used the 1mm ones on the VRMs. Is that at stock voltage?


Yes, I am using the 1mm ones. It's @ +50mV. I just don't know what will happen if I want to OC more with +200mV.


----------



## jerrolds

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So I removed my EK block and there was some kind of oil substance in the VRM1 pad. Could that be the problem. Also one more time. Do I need to put thermal paste on the. VRMs


Maybe - I know the Gelid Icy install instructions recommend using an eraser to remove any oils before applying the memory sinks.


----------



## Marvin82

Quote:


> Originally Posted by *ReHWolution*
> 
> 
> 
> Update me to aftermarket, Icy Vision Rev. 2 mounted on my 290X


And now? Idle and 100% speeds?
Can the card not regulate (PWM) the Icy? The GTX 680 cannot handle the Icy...


----------



## Xares

Hi, I get "fluctuation" of the GPU core frequency running Battlefield 3 and 3DMark Vantage with vsync off.

Is that normal? The top temperature is 80°C.

Sorry for my bad English.


----------



## rdr09

Quote:


> Originally Posted by *jerrolds*
> 
> Maybe - i know on the Gelid ICY install instructions they recommend using an eraser to remove any oils before applying the memory sinks.


I used alcohol and wiped the VRAM and VRMs clean.

@ Zeal, yes. I used the same thermal paste that came with the EK kit and applied it to the VRAM and VRMs. Next time I'll apply it to both sides of the pads.


----------



## jerrolds

Yeah, I used an eraser, then alcohol (or was it acetone) to make sure the sinks were clean. I figured an eraser couldn't hurt as long as you made sure no debris was left behind.


----------



## ottoore

Quote:


> Originally Posted by *ReHWolution*
> 
> 
> 
> Update me to aftermarket, Icy Vision Rev. 2 mounted on my 290X


Did you change the fans?
Max RPM is 2000 and it shows 4000!
http://www.gelidsolutions.com/products/index.php?lid=1&cid=17&id=52&tab=2


----------



## Raephen

Quote:


> Originally Posted by *Sazz*
> 
> I'll probably opt in w/ 850watts PSU. if only length of the PSU wasn't an issue, 160mm in length is the maximum PSU size I can get into a bitfenix prodigy and that's gonna be really tight with modular cables.
> 
> my Seasonic X1250 won't even fit on it since it's 190mm in length.


The Seasonic XP2-860 is 16cm in length...


----------



## BradleyW

Hey, I have a question about Splinter Cell Blacklist. I know we have a club on the forum, but it's dried up a little. So, I have issues with this game. I seem to be getting lots of stuttering, especially when panning the camera with the mouse. With CFX disabled, the stutter is only reduced a little. It feels like the game is not running my screen at 144Hz or frame pacing is not activating.

CCC 13.12
SC B Settings:

Texture Detail: ULTRA
Shadow Quality: ULTRA
Parallax: ON
Tessellation: ON
AF: 16x
V-Sync: OFF
Dynamic AO: Field AO & HBAO+
AA: FXAA
DX: 11

I've tried Windows 7 and 8.1, AFR mode, Optimize 1x1, CFX off, various graphics settings, and forcing 144Hz via the .ini file.

Thank you for the help.


----------



## Sazz

Quote:


> Originally Posted by *Raephen*
> 
> The Seasonic XP2-860 is 16cm in length...


I am selling my X1250 to get an X850 instead; same length as the one you mentioned.

And if anyone is on the lookout for a waterblock, I've got an EK FC-290X Acetal w/ backplate. I won't be able to do a custom loop with my current watercooling components, so I am just going to get rid of them all and use an AIO mod on my 290X. I'll trade it for an H100i that is under warranty.


----------



## Durvelle27

Quote:


> Originally Posted by *Sazz*
> 
> I'll probably opt in w/ 850watts PSU. if only length of the PSU wasn't an issue, 160mm in length is the maximum PSU size I can get into a bitfenix prodigy and that's gonna be really tight with modular cables.
> 
> my Seasonic X1250 won't even fit on it since it's 190mm in length.


You could get a PSU like mine. I have a Corsair TX850 and it's only 160.02mm


----------



## Arizonian

Quote:


> Originally Posted by *ReHWolution*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Update me to aftermarket, Icy Vision Rev. 2 mounted on my 290X


Nice Congrats - updated.


----------



## sugarhell

Guys, to fix GPU usage monitoring in MSI AB, go to MSIAfterburner.cfg and, in the [ATIADLHAL] section, change UnifiedactivityMonitoring to 1.
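For reference, the relevant fragment of MSIAfterburner.cfg would look something like this after the edit (key name as given above; any other keys in the section stay untouched):

```ini
; MSIAfterburner.cfg - only the relevant section shown
[ATIADLHAL]
UnifiedactivityMonitoring = 1
```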


----------



## shilka

Quote:


> Originally Posted by *Sazz*
> 
> I am selling my X1250 to get a X850 instead, same length as the one you mentioned.
> 
> And if anyone is on the lookout for a waterblock, I got a EK FC-290X Acetal w/ backplate. Won't be able to do custom loop w/ my current watercooling components so I am just gonna get rid of them all and use a AIO mod on my 290X. will trade it for a H100i that is under warranty.


SilverStone has a ton of very small PSUs

http://www.overclock.net/t/1436430/silverstone-power-supplies-information-thread


----------



## dspacek

Quote:


> Originally Posted by *sugarhell*
> 
> Guys to fix gpu usage on msi AB go to MSIafterburner.cfg and in [ATIADLHAL] section change UnifiedactivityMonitoring to 1.


Thanks. And do you know how to increase the clock limits? I got lucky one day when I rewrote a file in the Profiles folder and it was able to manage 1700MHz memory, but now it does not work. Thanks for any advice.
This is my AB now:


----------



## Janus67

290X owner here (bought at launch). Haven't really tried any overclocking; I currently have it mining at 900kh/s at 975/1450 and doing just fine. Wish it were quieter. Currently waiting for a review sample of Arctic's new heatsink that is supposed to be made for the 290/290X (as opposed to using the AC3)


----------



## jerrolds

AC is making a new 290(X)-specific cooling solution? I thought they were just going to leave it at the Xtreme3 and Hybrid (they're listed as compatible)


----------



## BradleyW

To add to my previous post (page 1401), it now seems that MSI Afterburner is showing no usage on my second card.


----------



## sugarhell

Quote:


> Originally Posted by *dspacek*
> 
> Thx, and do you know how to increase clock limits? I had a luck one day when I rewrite a file in Profile folder and it was able to manage 1700Mhz Memory but now it does not work. thx for advice
> This is my AB now:


Why don't you go to settings and extend the overclocking limits?

Also, you should use beta 18. It uses a low-level I2C bus that doesn't affect performance, vs beta 17, which uses the AMD API for sensors and monitoring.


----------



## tsm106

Quote:


> Originally Posted by *BradleyW*
> 
> To add to my previous post (Page 1401) , it now seems that MSI is showing no usage on my second card.


You're on a roll now.


----------



## dspacek

Quote:


> Originally Posted by *sugarhell*
> 
> Why you dont go to settings to extend overclock limits?
> 
> ALso you should use beta 18. It use a low level i2c bus that doesnt affect performance vs beta 17 that use amd api for sensor and monitoring.


That is extended; the maximum is 1625MHz RAM and 1235 core. Previously I could overwrite it in the Profile config, but now that does not work. So explain to me how to reach 1700MHz on the RAM, because the standard and hacking ways are not working.


----------



## Its L0G4N

Quote:


> Originally Posted by *Romeru*
> 
> Well uhm, this is odd. Mind i asking what fps you guys have in games such as Starcraft 1 (Yes starcraft brood war), minecraft and saints row 3 and 4.
> 
> As for minecraft my fps is just above 60, i was running it happily with my gtx 660 before and was averaging 500-600 fps constantly. As for a old game like starcraft is feels like my fps is locked at 15 or something, i think it's related to v-sync messing it up.
> 
> 
> 
> Minecraft has lot's of stuttering, however other games like battlefield 4, battlefield 3, starcraft II and crysis 3 works perfect. Getting very smooth fps there.
> I uninstalled my nvidia drivers via Windwos control panel, and then installed Catalyst 13.12.
> 
> Specs:
> GA-990FXA-UD3
> FX-8350 @ 5Ghz
> CORSAIR 8GB 1600 9-9-9-24 RAM
> Gigabyte R9 290 (non-x) Windforce 3x @ 1200mhz
> CPU cooler: H80i
> GPU cooler: Aftermarket
> PSU: XFX ProSeries XXX Edition P1-850X-XXB9 850W
> 
> Any ideas?


I had the same problem. I believe the solution is to turn off vsync and put it to max fps.


----------



## sugarhell

Quote:


> Originally Posted by *dspacek*
> 
> this is extended, maximum is 1625Mhz RAM and 1235 Core. previous I could overwrite it in Profile config but now it does not work. So explain me how to make 1700 Mhz of RAM. because standard and hackin ways not working.


If you want to bench, why don't you use GPU Tweak or Trixx?


----------



## dspacek

Quote:


> Originally Posted by *sugarhell*
> 
> If you want to bench why you dont use gpu tweak or trixx?


OMG, because Trixx does not handle 1700MHz RAM, and GPU Tweak works only with the ASUS BIOS, which gives me a black screen at startup. So STOP asking me. I only want advice, so if you don't have any, please do not respond with these childish suggestions.


----------



## Hattifnatten

Quote:


> Originally Posted by *Xares*
> 
> hi, i have "fluctuation" on GPU frequency core running battlefield 3 and 3dmark vantage with vsync off,
> 
> Is that normal? the top temperature is 80º
> 
> sorry for my bad english


Are you using afterburner beta 17 with voltage monitoring enabled? That will cause clock fluctuations. Beta 18 fixes this.


----------



## sugarhell

Childish advice, that I told you to use GPU Tweak or Trixx? Mkay


----------



## BradleyW

Quote:


> Originally Posted by *tsm106*
> 
> You're on a roll now.


Not that easy. MSI is showing no usage in all games. When CFX is disabled, I then see usage on the second card. However, with CFX disabled that reading must be false, because the fps is obviously much lower with CFX off.
So back to square one with Splinter Cell Blacklist.


----------



## tsm106

Quote:


> Originally Posted by *dspacek*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> If you want to bench why you dont use gpu tweak or trixx?
> 
> 
> 
> OMG, because trixx does not handle 1700Mhz RAM and GPU tweak is working on ASUS Bios only which makes me a black screen in startup. So STOP ask me. I want only an advice, so if you dont have any please do not react on me with theese childish advices.
Click to expand...

Note to self, do not give advice as it will be construed as childish advices.









PS.
Quote:


> Mantle will bring peace


----------



## kizwan

Quote:


> Originally Posted by *dspacek*
> 
> OMG, because trixx does not handle 1700Mhz RAM and GPU tweak is working on ASUS Bios only which makes me a black screen in startup. So STOP ask me. I want only an advice, so if you dont have any please do not react on me with theese childish advices.


Overclock: side effects include migraine & hotheaded.

Advice: Don't overclock.


----------



## jerrolds

Really wish GPU Tweak had an OSD or something. Currently using GT and AB concurrently for overclocking/on-screen monitoring. Anyone have a better solution?

I suppose I could lower my overclock and fit within the 100mV limit of AB.


----------



## sugarhell

The limits on AB are not 100mV


----------



## Forceman

Quote:


> Originally Posted by *jerrolds*
> 
> Really wish GPU Tweak would have an OSD or something. Currently using GT and AB concurrently for overclocking/onscreen monitoring. Anyone have a better solution?
> 
> I suppose i can lower my overclock and fit within the 100mv limit of AB.


At the risk of giving childish advice, does Trixx have an OSD? Because it can also do +200mV.


----------



## tsm106

Quote:


> Originally Posted by *sugarhell*
> 
> The limits on ab are not 100mv


Use the Force Luke.

The Force being command line switches via desktop shortcuts.


----------



## dspacek

Quote:


> Originally Posted by *tsm106*
> 
> Note to self, do not give advice as it will be construed as childish advices.


OK, so if I tell you "try this program, then this one, then this one", that's constructive advice on how to solve a problem? Ohhh, nice!
I knew how it worked before and could handle 1200 core and 1700 RAM on AB 17; after installing version 18 it does not work, even after reinstalling AB 17. I only want to know how to change it. I don't want to hear "reflash the BIOS to ASUS" or "try GPU Tweak", which is not able to manage 1700 on the RAM stable. I have tried many combinations, but the Sapphire original BIOS with AB 17 was working well on my system. Now something is corrupted and I'm not able to raise the clocks...
If anyone of you was able to reach higher clocks than 1200 core / 1700 RAM on an R9 290, please let me know how you did it, because I can't do it now. And the next problem is that AB can't change the power limit, so I have to install CCC and change it there. That sucks.


----------



## jerrolds

Using the +200mV flag for AB, does that offset the voltage displayed in AB?

1250mV (with the +200mV ***) is as if it's sliding the bar to 1450mV?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *jerrolds*
> 
> Using the +200mv flag for AB, that offsets the voltage displayed in AB?
> 
> 1250mv (with +200mV ***) is as if its sliding the bar to 1450mv?


Yes


----------



## sugarhell

Guys, it's easy to give more volts in MSI AB.

Just use /wi4,30,8d,10 for +100mV. The offset step is 6.25mV and the value is written in hexadecimal: 10 hex = 16 decimal, and 16 × 6.25 = 100mV. For +50mV you need 8. For +200mV you need 20 (20 hex = 32 decimal, and 32 × 6.25 = 200mV).

The easy way to make the change:

Create a txt file on the desktop. Write

CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /wi4,30,8d,10

and then save it as a .bat file. Every time you start this bat file, MSI AB will start with +100mV.

For +50mV: 8
For +100mV: 10
For +125mV: 14
For +150mV: 18
For +175mV: 1C
For +200mV: 20

I wouldn't go over this point, because:
1) You are close to leaving the sweet spot of the reference PCB VRMs' efficiency.
2) These commands add 200mV on top of the 100mV offset available through the AB GUI. That means 300mV total.

By default the /wi command applies to the current GPU only, so if you have 2 or more GPUs you must use the /sg command. The command line then looks something like this:
ex: MsiAfterburner.exe /sg0 /wi4,30,8d,10 /sg1 /wi4,30,8d,10
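If you'd rather compute the last field of the /wi switch than memorize the table, it's just the offset in mV divided by 6.25, written in hex. A minimal sketch (plain Python; the function name is mine, not anything Afterburner ships):

```python
def ab_voltage_arg(offset_mv):
    """Convert a voltage offset in mV to the hex value used as the
    last field of Afterburner's /wi4,30,8d,XX switch (6.25 mV/step)."""
    steps = int(offset_mv / 6.25)
    return format(steps, 'X')

# Reproduces the table above:
for mv in (50, 100, 125, 150, 175, 200):
    print(mv, '->', ab_voltage_arg(mv))
# 50 -> 8, 100 -> 10, 125 -> 14, 150 -> 18, 175 -> 1C, 200 -> 20
```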


----------



## tsm106

^^You know your post will be forgotten and someone will ask the same thing again tomorrow.


----------



## sugarhell

Quote:


> Originally Posted by *dspacek*
> 
> OK so if I tell you try this program, then this and then this one thats a contructive advice how to solve a problem? Ohhh nice!
> When I know how it works before and could handle 1200 core and 1700 RAM on AB 17 so, it after installing that 18version it does not work either, included reinstall back to AB 17. I only want to know how to change it. I don want to hear reflash BIOS to ASUS, and try GPU tweak which is not able to manage 1700 on RAMstable. I have many combinations tried but the Saphire orig. bios with AB 17 was working well for my system. But now is anything corrupted and Im not able to raise clocks...
> If anyone of you was able to reach on R9-290 higher clocks than 1200 core and 1700 ram so let me know please how did you do it. Because I cant do it now. and next problem is, that AB cant manage to change power limit so I have to install CCC and change it in there. That sucks.


If you can't do 1700 in GPU Tweak, then you can't do it in MSI AB either, or it's barely stable. There is nothing different about how MSI AB communicates with the GPU compared to GPU Tweak. The only reasons you can't do it are:
1) driver version
2) temps between these runs.


----------



## dspacek

Quote:


> Originally Posted by *sugarhell*
> 
> Guys its easy to give more volts on msi.
> 
> Just use /wi4,30,8d,10 for 100mv. The offset is 6.25 mv in hexademical. So on decimal is :16*6.25=100 mv. For 50mv you need 8. For 200mv you need 20( 20=32 on dec. So 32 * 6.25=200mv)
> 
> The easy way to do changes:
> 
> Create a txt on desktop. Write
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi4,30,8d,10
> 
> and then save as .bat file. Eveyrtime you start this bat file msi will start with +100mv


Yes, everyone knows this. But how do I reach higher clocks? Because it's limited to 1235/1625MHz.


----------



## jerrolds

1235/1625 is a VERY good overclock. If you can't get higher by throwing more voltage at it, then you've reached the limit of your GPU.

You can try the PT1 BIOS if you haven't already, or PT3, but that's not recommended unless you're going LN2.


----------



## dspacek

Quote:


> Originally Posted by *sugarhell*
> 
> If you cant do 1700 on gpu tweak then you cant do it on msi. Or its barely stable. There is nothing different on msi how it communicate with the gpu compared to gpu tweak. Your only reason that you cant do it is:
> 1) driver version
> 2) temps between these runs.


Maybe in your theory, but my practical experience is this: AB 17 was able to manage 1200 core / 1700 mem with power +30, vcore +156.
AB 18 maxes out at 1180 core / 1625 RAM because of the locked limits, and since 18 I'm not able to break the limits in AB, if you understand me.
Now I have tried the ASUS BIOS (http://www.techpowerup.com/vgabios/148706/asus-r9290-4096-131016.html) on my Sapphire card,
and GPU Tweak shuts down above 1675MHz on the mems.
I know for sure that I could set 1725 on the mem without problems, but I decreased it for better stability on AB 17.
So I don't know where the problem is, or why it's not possible to reach that again with a reinstalled AB....







and I'm really sad about it


----------



## dspacek

Quote:


> Originally Posted by *jerrolds*
> 
> 1235/1625 is a VERY good overclock. If you cant get higher by throwing more voltage at it then youve reached the limit of your GPU.
> 
> You can try PT1 bios if you havent already, or PT3 but thats not recommended unless youre going LN2.


No, they are not. I can swear I was able to be stable at 1700+ mem; maybe there are some pictures from me in this forum ;-)

I'm sorry to all that I'm a little bit rude and angry, but I've been playing with these settings all day and can't find my previous combination of programs, hacks and luck when it worked ;-)


----------



## sugarhell

AB 17 used the ADL files from AMD and used to throttle performance; the communication between the GPU and MSI AB was too slow, so performance was lower. That's maybe why you could do it. Beta 18 now uses a lower-level interface to communicate with the 290, so there's no more throttling of performance.


----------



## BradleyW

Does anyone have any thoughts about my issue stated on the previous pages?
Thank you.


----------



## Xares

Quote:


> Originally Posted by *Hattifnatten*
> 
> Are you using afterburner beta 17 with voltage monitoring enabled? That will cause clock fluctuations. Beta 18 fixes this.


I'm using beta 18 :(


----------



## Snyderman34

Switch me back to air please. My H220 gave out, and I've lost the point of full water cooling. Throwing my EK block and AlphaCool rad on eBay and moving back to GPU on air.


----------



## Paulenski

Update my info please, got a new card









SAPPHIRE TRI-X OC R9 290X


----------



## Loktar Ogar

Quote:


> Originally Posted by *Paulenski*
> 
> Update my info please, got a new card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> SAPPHIRE TRI-X OC R9 290X


Nice card! Does it come with BF4 included?


----------



## Paulenski

Quote:


> Originally Posted by *Loktar Ogar*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Paulenski*
> 
> Update my info please, got a new card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> SAPPHIRE TRI-X OC R9 290X
> 
> Nice card! Does it have a BF4 included?
Click to expand...

Sadly no, but I already got BF4 from my first 290X, ended up RMA'ing the card but I got to keep the code


----------



## ReHWolution

Quote:


> Originally Posted by *Marvin82*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ReHWolution*
> 
> 
> 
> Update me to aftermarket, Icy Vision Rev. 2 mounted on my 290X
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And now? Idle and 100% speed ?
> Can the card not Regulate (pwm) the icy ? The GTX680 cane not handle the iCY ...
Click to expand...

Nope, the heatsink doesn't support PWM, fans run @ 100% :\
Quote:


> Originally Posted by *ottoore*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ReHWolution*
> 
> 
> 
> Update me to aftermarket, Icy Vision Rev. 2 mounted on my 290X
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do you change fans?
> Max rpm is 2000 and it shows 4000!
> http://www.gelidsolutions.com/products/index.php?lid=1&cid=17&id=52&tab=2
Click to expand...

I don't know why - the stock fans are on it, but it reports 4k RPM. I don't wanna say some b.s., but maybe it sums both fans' speeds :\

I'll bench with Fire Strike Extreme in OC to give you results on this


----------



## Hattifnatten

Quote:


> Originally Posted by *Xares*
> 
> I'm using the beta 18: (


You could try raising the power limit, although I do not think that will do the trick :/


----------



## Jack Mac

Anyone have CF 290s with a 2500k/3570k? I want to know if it'll hold back the cards too much and what kind of OC I'd need to run two of these cards.


----------



## BradleyW

Quote:


> Originally Posted by *Jack Mac*
> 
> Anyone have CF 290s with a 2500k/3570k? I want to know if it'll hold back the cards too much and what kind of OC I'd need to run two of these cards.


My 7970s were held back by my 3770K @ 4.6GHz, so yeah, I'd say an upgrade will help in games that demand a lot of CPU power. You'd enjoy an overclocked 4930K! I'd never go back to 4c/4t or 4c/8t


----------



## Jack Mac

Quote:


> Originally Posted by *BradleyW*
> 
> My 7970's where held back by my 3770k @ 4.6GHz so yeah I'd say an upgrade will help in games that require a lot of CPU power. You'd enjoy an overclocked 4930k! I'd never go back to 4c/4th or 4c/8th


Dang it, I hate having to spend more money. I wish I went for the 3930K instead of being impatient and buying a 3570K.


----------



## jerrolds

I don't think an overclocked i5 will bottleneck CF 290X at high resolutions/AA in modern games. BF4 at 1440p Ultra will definitely see gains, while something like CS:GO might not (300fps vs 350fps or something)


----------



## Marvin82

Quote:


> Originally Posted by *ReHWolution*
> 
> Nope, the heatsink doesn't support PWM, fans run @ 100% :\
> I don't know why, stock fans on it, but it gives 4k RPMs, i don't wanna say some b.s. but maybe it sums both fans speeds :\
> 
> I'll bench with Fire Strike Extreme in OC to give you results on this


Ok, thx.
The Icy is very good, but not quiet at 100%. I have it on a 680 and regulate it with a Scythe fan controller.
So no PWM regulation on the 290X either..... it's a no-go for me.
Thx


----------



## jerrolds

Quote:


> Originally Posted by *Jack Mac*
> 
> Dang it, I hate having to spend more money. I wish I went for the 3930K instead of being impatient and buying a 3570K.


A 3570K is a great CPU - I *really* doubt you'll see real-world gains vs a 3930K, or even a faster chip, unless you turn all the image quality options down to low or something.


----------



## BradleyW

Quote:


> Originally Posted by *Jack Mac*
> 
> Dang it, I hate having to spend more money. I wish I went for the 3930K instead of being impatient and buying a 3570K.


It depends what games you play, really. Games like Sleeping Dogs, Battlefield 3/4, Crysis 2/3, Hitman Absolution, Tomb Raider etc. put a lot of work on the CPU. In those, you will be held back. I had gains of up to 30fps in those games moving from a 3770K to a 3930K in CPU-intensive moments.

The downside is that games coded for 3/4 cores might not yield any improvement. Most people will tell you that the 3930K won't do much; that's because the people telling you have never moved from a 4c CPU to a 6c CPU. I can only see games becoming even more CPU-intensive in the future (in a good way, not a badly-coded way), so more CPU power will be better!

Here is a little example. Sleeping Dogs, maxed out at 1080p: in one area I was sat at 42fps. I upgraded to X79, OC'ed the 3930K and loaded the game in the exact same spot; the fps was now 74. Similar story with Crysis 3 and Hitman Absolution.


----------



## Jack Mac

Quote:


> Originally Posted by *jerrolds*
> 
> A 3570K is a great CPU - i *really* doubt youll see real world gains vs a 3930K or even a [email protected] unless you turn all image quality options to low or somethign.


My 3570K is far from great. It's a total dud that's not even worth OCing because of the high voltage required to get a decent clock speed. I just put it on Craigslist; hope it sells. I'm really sick of this thing and just want it gone. I play a lot of BF3/4, so I hope to see a good improvement by upgrading.


----------



## tsm106

Quote:


> Originally Posted by *Jack Mac*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BradleyW*
> 
> My 7970's where held back by my 3770k @ 4.6GHz so yeah I'd say an upgrade will help in games that require a lot of CPU power. You'd enjoy an overclocked 4930k! I'd never go back to 4c/4th or 4c/8th
> 
> 
> 
> Dang it, I hate having to spend more money. I wish I went for the 3930K instead of being impatient and buying a 3570K.

Don't fret dude. Just wait for Mantle, especially in the case of triple-A games that have signed on for support. All CPUs are barely utilized atm, so even when it's good it isn't that good. You dig?


----------



## BradleyW

Quote:


> Originally Posted by *tsm106*
> 
> Don't fret dude. Just wait for mantle,especially in the case of triple A games that have signed on for support. All cpus are barely utilized atm so even when its good it isnt that good. You dig?


Assuming lots of games make full use of mantle, and assuming mantle actually lives up to the hype. Hope it does.


----------



## Jack Mac

Quote:


> Originally Posted by *tsm106*
> 
> Don't fret dude. Just wait for mantle,especially in the case of triple A games that have signed on for support. All cpus are barely utilized atm so even when its good it isnt that good. You dig?


I think my 3570K is actually reaching its limits and I hope mantle takes the strain away from my CPU. I see usage in the high 90s in BF4.


----------



## BradleyW

Quote:


> Originally Posted by *Jack Mac*
> 
> I think my 3570K is actually reaching its limits and I hope mantle takes the strain away from my CPU. I see usage in the high 90s in BF4.


Windows 8.1 would push that to 99% usage


----------



## Luckael

Quote:


> Originally Posted by *Paulenski*
> 
> Update my info please, got a new card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> SAPPHIRE TRI-X OC R9 290X


Congrats, now we are 2 in this group using Sapphire R9 290 Tri-X..  but mine has BF4..


----------



## Jack Mac

Quote:


> Originally Posted by *BradleyW*
> 
> Windows 8.1 would max that out to 99 usage


I was talking about CPU usage; I wish my card got 90+ usage lol. The game stutters when it gets that high, it's annoying.


----------



## Paulenski

Quote:


> Originally Posted by *Luckael*
> 
> Congrats, now we are 2 in this group using Sapphire R9 290 Tri-X..  but mine has BF4..


Where did you order from?


----------



## jerrolds

Quote:


> Originally Posted by *Jack Mac*
> 
> I think my 3570K is actually reaching its limits and I hope mantle takes the strain away from my CPU. I see usage in the high 90s in BF4.


Didn't you hit 4.4GHz? That's about average for that chip.

Meh... according to this http://www.techspot.com/review/734-battlefield-4-benchmarks/page6.html - the difference between an i7 [email protected] and an i5 [email protected] is 1fps at 1920x1200 Ultra/HBAO

Yet it goes from 30fps with a single 7970 to 51fps in Crossfire (2560x1600 Ultra)

Again here http://www.techspot.com/review/645-tomb-raider-performance/page5.html there is 0 difference between an i7 [email protected] and an i5 [email protected], while there are huge gains with better GPUs.

I don't think getting a new CPU will give you the gains you're thinking of... but then again - upgrading is never bad.

With the same card, I think frame-time variance will be the same as long as you have a decent CPU (i5 2500K+)


----------



## phallacy

I would just like to add a small update regarding my 290x. I've been playing stable at 1150 core and 1500 memory with elpida from sapphire. I was going to go more on the core but I had a brief scare when I started it up yesterday morning and the screen was that rainbow colored mess after I logged in. Luckily I restarted and all was fine. I won't really overclock it more until my watercooling set up arrives. So far though, I am really liking the overclock gains from the Hawaii chipset. I am averaging 75-80 fps in multiplayer with ultra and 2x msaa in battlefield 4 on 1440p. Changing the res scale from 100 to 125 drops the FPS to about 50 average in multiplayer.
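A rough sanity check on that res-scale drop: render cost grows roughly with the square of the resolution scale, since both axes get multiplied. A minimal sketch (illustrative numbers only, assuming frame time scales linearly with pixels rendered, which ignores any CPU-bound work):

```python
# Rough estimate of how BF4's resolution scale affects frame rate,
# assuming render time grows linearly with pixels drawn (a simplification).

def pixels_rendered(width, height, res_scale):
    """Resolution scale multiplies each axis, so pixel count grows with its square."""
    return int(width * res_scale) * int(height * res_scale)

def estimated_fps(base_fps, res_scale):
    """Scale a baseline FPS (measured at 100% scale) by the pixel-count ratio."""
    return base_fps / (res_scale ** 2)

# 1440p at 125% scale renders ~56% more pixels than native:
native = pixels_rendered(2560, 1440, 1.0)    # 3,686,400 pixels
scaled = pixels_rendered(2560, 1440, 1.25)   # 5,760,000 pixels
# A ~77 fps average would drop to roughly 77 / 1.5625 ≈ 49 fps,
# close to the ~50 fps average reported above.
```

The agreement with the reported numbers is approximate; in practice MSAA cost and CPU limits shift the result.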


----------



## Sazz

BTW, Arctic cooler/Gelid cooler users, can you tell me what dimensions of VRM/RAM heatsinks you used? I'm just going to use an H100i on my 290X and I need to know the size of heatsinks needed for the VRM and RAM modules.


----------



## kpoeticg

May i be down with the sickness?


----------



## tsm106

Quote:


> Originally Posted by *jerrolds*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jack Mac*
> 
> I think my 3570K is actually reaching its limits and I hope mantle takes the strain away from my CPU. I see usage in the high 90s in BF4.
> 
> 
> 
> Didnt you hit 4.4Ghz? Thats about average for that chip.
> 
> Meh... according to this http://www.techspot.com/review/734-battlefield-4-benchmarks/page6.html - the difference between an i7 [email protected] and an i5 [email protected] is 1fps at 1920x1200 Ultra/HBAO
> 
> Yet goes from 30fps from single 7970 to 51fps in Crossfire (2560x1600 Ultra)
> 
> Again here http://www.techspot.com/review/645-tomb-raider-performance/page5.html there is 0 difference between an i7 [email protected] and an i5 [email protected], while there are huge gains with better GPUS.
> 
> I dont think getting a new CPU will give you the gains youre thinking of..but then again - upgrading is never bad.
> 
> With the same card - i thin frame time variance will be the same as long as you have a decent CPU (i5 2500k+)

You never seen a gpu bound scenario before?

These guys are hilarious. Hello buddy your whole test is weak.
Quote:


> The above data suggests that Tomb Raider isn't particularly CPU-intensive, as our Core i7-3770K offered virtually identical performance when clocked at 2.5GHz as it did at 4.0GHz. In fact, an 80% increase in clock speed to 4.5GHz resulted in a frame rate boost of only 1%.


----------



## Jack Mac

Quote:


> Originally Posted by *jerrolds*
> 
> Didnt you hit 4.4Ghz? Thats about average for that chip.
> 
> Meh... according to this http://www.techspot.com/review/734-battlefield-4-benchmarks/page6.html - the difference between an i7 [email protected] and an i5 [email protected] is 1fps at 1920x1200 Ultra/HBAO
> 
> Yet goes from 30fps from single 7970 to 51fps in Crossfire (2560x1600 Ultra)
> 
> Again here http://www.techspot.com/review/645-tomb-raider-performance/page5.html there is 0 difference between an i7 [email protected] and an i5 [email protected], while there are huge gains with better GPUS.
> 
> I dont think getting a new CPU will give you the gains youre thinking of..but then again - upgrading is never bad.
> 
> With the same card - i thin frame time variance will be the same as long as you have a decent CPU (i5 2500k+)


Yeah, I'm at 4.4 but it takes over 1.3V which I don't like. I put it for trade on CL, hoping to find someone who will take my CPU/MB + cash for a 4770K and motherboard.


----------



## sugarhell

It's 2014 and people still don't know the difference between singleplayer and multiplayer and how it affects performance. Sad


----------



## Jack Mac

Quote:


> Originally Posted by *kpoeticg*
> 
> May i be down with the sickness?


Doesn't it have a nice weight to it? The 290 feels more solid than my 670 PE.


----------



## Kuivamaa

Christ sugar, if your coordinates are accurate you live roughly 1500m from me


----------



## kpoeticg

Hell yeah. Feels solid as a rock. Shame i'm gonna be taking off that cooler on Thursday for my Copper/Plexi Kryographics


----------



## ReHWolution

Quote:


> Originally Posted by *Marvin82*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ReHWolution*
> 
> Nope, the heatsink doesn't support PWM, fans run @ 100% :\
> I don't know why, stock fans on it, but it gives 4k RPMs, i don't wanna say some b.s. but maybe it sums both fans speeds :\
> 
> I'll bench with Fire Strike Extreme in OC to give you results on this
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ok thx
> The icy is very god but not quiet at 100% i have it on a 680 and regulate with scythe fan control
> So on the 290x no PWM regulate too ..... its no go for me
> thx

Yeah, for me too the fans are a bit loud, but you know, compared to the stock cooler it doesn't bother me that much... I need a fan controller too, but first I'll save money for a second 290X








Quote:


> Originally Posted by *kpoeticg*
> 
> May i be down with the sickness?


Disturbed?


----------



## kpoeticg

Only on the weekends


----------



## Arizonian

Quote:


> Originally Posted by *sugarhell*
> 
> Guys its easy to give more volts on msi.
> 
> Just use /wi4,30,8d,10 for 100mv. The last argument is hexadecimal and the offset step is 6.25 mv, so hex 10 = 16 decimal: 16*6.25 = 100 mv. For 50mv you need 8. For 200mv you need 20 (hex 20 = 32 decimal, so 32*6.25 = 200mv)
> 
> The easy way to do changes:
> 
> Create a txt on desktop. Write
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi4,30,8d,10
> 
> and then save it as a .bat file. Every time you run this bat file MSI Afterburner will start with +100mv
> 
> For 50mv: 8
> For 100mv:10
> For 125mv:14
> For 150mv:18
> For 175mv:1C
> For 200mv:20
> 
> I wouldn't go over this point because
> 1)You are close to leave the sweet spot of the ref pcb vrms efficiency
> 2)These commands add 200mv on top of the 100mv offset through AB gui.That means 300mv
> 
> By default the /wi command applies to the current gpu only. So if you have 2 or more gpus you must use the /sg command. That means the command line is something like this
> ex:MsiAfterburner.exe /sg0 /wi4,30,8d,10 /sg1 /wi4,30,8d,10


Quote:


> Originally Posted by *tsm106*
> 
> ^^You know your post will be forgotten and someone will ask the same thing again tomorrow.


Not anymore. It's been added to OP under 'Useful software and info' section. You can refer members to that now. Thanks sugarhell - please PM me if you'd like that worded any other way.
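For anyone who doesn't want to do the hex math by hand, the conversion in sugarhell's post (6.25 mV per step, last /wi argument in hexadecimal) can be sketched like this. The helper name is mine; only the 6.25 mV step and the /wi4,30,8d,&lt;value&gt; layout come from the quoted post:

```python
# Convert a desired voltage offset (mV) into the hex value used as the last
# argument of MSI Afterburner's /wi switch, per the 6.25 mV-per-step rule above.

STEP_MV = 6.25  # one step of the voltage offset register

def wi_offset_arg(offset_mv):
    """Return the hex string for /wi4,30,8d,<value> giving offset_mv extra millivolts."""
    steps = offset_mv / STEP_MV
    if steps != int(steps):
        raise ValueError("offset must be a multiple of 6.25 mV")
    return format(int(steps), "x")

# Reproduces the table from the post:
for mv in (50, 100, 125, 150, 175, 200):
    print(mv, "->", wi_offset_arg(mv))
# 50 -> 8, 100 -> 10, 125 -> 14, 150 -> 18, 175 -> 1c, 200 -> 20
```

The printed values match the table in the quoted post; as noted there, going past +200mv on the reference VRM isn't advisable.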
Quote:


> Originally Posted by *Snyderman34*
> 
> Switch me back to air please. My H220 gave out, and I've lost the point of full water cooling. Throwing my EK block and AlphaCool rad on eBay and moving back to GPU on air.


Sorry to hear that. Updated









Quote:


> Originally Posted by *Paulenski*
> 
> Update my info please, got a new card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> SAPPHIRE TRI-X OC R9 290X
> 
> 


Congrats - looks sweet - added









PS - does the TRI X come with Elpida or Hynix memory?

Quote:


> Originally Posted by *kpoeticg*
> 
> May i be down with the sickness?
> 
> 
> 
> 
> 
> 
> 
> 
> 


oooowah ah ah ah







Congrats - added


----------



## Cool Mike

Everyone may know by now, but the ASUS R9290X-DC2OC-4GD5 (Direct CU) is available at Newegg. $699.99


----------



## Arizonian

Quote:


> Originally Posted by *Cool Mike*
> 
> Everyone may know by now, but the ASUS R9290X-DC2OC-4GD5 (Direct CU) is available at Newegg. $699.99


If I remember correctly that's a 6+8 pin and no extra power phases?


----------



## Cool Mike

Full info on Asus website. Custom 8 phase. 6+8 power

http://www.asus.com/Graphics_Cards/R9290XDC2OC4GD5/


----------



## VSG

Quote:


> 4.7% faster game performance than reference Radeon R9 290X in Metro Last Light


This is their selling point over reference design?


----------



## grunion

Quote:


> Originally Posted by *grunion*
> 
> Model Name: R9290X-DC2OC-4GD5
> ETA: Jan. W2
> Web link: http://www.asus.com/Graphics_Cards/R9290XDC2OC4GD5/
> 
> · MSRP: $ 589.99
> 
> · Part Number: R9290X-DC2OC-4GD5
> 
> 
> DirectCU II achieves 20% lower temps with 220% the dissipation area, plus 3X quieter running than reference.
> Exclusive CoolTech Fan drives wider airflow to keep critical components cool.
> 1050 MHZ engine clock for better performance and outstanding gaming experience
> DIGI+ VRM with 8-phase Super Alloy Power delivers precise digital power for superior efficiency, reliability, and performance.
> GPU Tweak helps you modify clock speeds, voltages, fan performance and more, all via an intuitive interface
> GPU Tweak Streaming let you share on-screen action in real time - so others can watch live as games are played.
> 4.7% faster game performance than reference Radeon R9 290X in Metro Last Light


Quote:


> Originally Posted by *Cool Mike*
> 
> Everyone may know by now, but the ASUS R9290X-DC2OC-4GD5 (Direct CU) is available at Newegg. $699.99


----------



## Cool Mike

Overpriced like every other AMD based GPU. Those Miners


----------



## tsm106

Quote:


> Originally Posted by *Cool Mike*
> 
> Overpriced like every other AMD based GPU. Those Miners


They should make a mining only gpu without outputs, like a mini tesla lol. That way mining sales don't cannibalize desktop sales. The inflated pricing does nothing for AMD's bottom line, it's profit that they don't see.


----------



## Arizonian

Well, the ASUS DCUII was the GPU I was planning to order back in November and have been waiting for. I'm going to wait on Mantle reviews and will be poised for this card. So far this one is my favorite, even if it's $70 overpriced.


----------



## tsm106

Quote:


> Originally Posted by *Arizonian*
> 
> Well the ASUS DCUII was supposed to be the GPU I was going to order in November and have been waiting for. I'm going to be waiting on mantle reviews and will be poised for this card. So far this one is my favorite even if $70 over priced.


Elpida...


----------



## sugarhell

Sapphire tri-x is better


----------



## carlovfx

Today I finished my first custom loop. I used 2 R9 290Xs: 43 degrees under extreme load.


----------



## Arizonian

Quote:


> Originally Posted by *tsm106*
> 
> Elpida...


Darn! I thought ASUS only used Hynix on DCUII.








Quote:


> Originally Posted by *sugarhell*
> 
> Sapphire tri-x is better


Which memory is TRI X? No extra power phases right?


----------



## VSG

Quote:


> Originally Posted by *Arizonian*
> 
> Well the ASUS DCUII was supposed to be the GPU I was going to order in November and have been waiting for. I'm going to be waiting on mantle reviews and will be poised for this card. So far this one is my favorite even if $70 over priced.


You mean $110? Newegg seems to have the most stock so of course they will raise prices as much as they want


----------



## sugarhell

Quote:


> Originally Posted by *Arizonian*
> 
> Darn! I thought ASUS only used Hynix on DCUII.
> 
> 
> 
> 
> 
> 
> 
> 
> Which memory is TRI X? No extra power phases right?


Hynix

http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-290x-tri-x-oc-review-1600p-ultra-hd-4k/3/


----------



## kpoeticg

Dunno if this is noteworthy, but I just took my new PowerColor (http://www.newegg.com/Product/Product.aspx?Item=N82E16814131522) apart to check the memory



Also, it's dropped $20 since I ordered it 3 days ago =\ ($610 => $590)

Quote:


> Originally Posted by *Arizonian*
> 
> oooowah ah ah ah
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanx


Also, as far as the cooling on the members list: I have an Aquacomputer Copper/Plexi block on the way from PPC. Not sure if you need to wait for me to post a pic with the block tagged with my nick. I'll def post a pic when it comes in tho.


----------



## Widde

Quote:


> Originally Posted by *Jack Mac*
> 
> Anyone have CF 290s with a 2500k/3570k? I want to know if it'll hold back the cards too much and what kind of OC I'd need to run two of these cards.


Got a 3570K at 4.5GHz and it's being dragged behind by my 2 290s







At least that's what it feels like ^^ Could be unoptimized drivers as well, can't wait for the 14.1 betas









Went from 70-90 fps to 130-160, then I cranked the resolution scale to 200% and it's keeping at 80-110 fps







(Battlefield 4 @ ultra 0 AA)


----------



## rdr09

Quote:


> Originally Posted by *Widde*
> 
> Got a 3570k at 4.5ghz and it's being draged behind my 2 290s
> 
> 
> 
> 
> 
> 
> 
> Atleast what it feels like ^^ Could be unoptimized drivers aswell, cant wait for the 14.1 betas
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But went from 70-90 fps to 130-160 then i cranked the resolution scale to 200% and keeping it at 80-110 fps
> 
> 
> 
> 
> 
> 
> 
> (Battlefield 4 @ ultra 0 AA)


what rez? I checked my CPU usage at 1080p, BF4 MP maxed, with one 290 . . .




both smooth, so I kept HT off.


----------



## Jack Mac

Quote:


> Originally Posted by *Widde*
> 
> Got a 3570k at 4.5ghz and it's being draged behind my 2 290s
> 
> 
> 
> 
> 
> 
> 
> Atleast what it feels like ^^ Could be unoptimized drivers aswell, cant wait for the 14.1 betas
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But went from 70-90 fps to 130-160 then i cranked the resolution scale to 200% and keeping it at 80-110 fps
> 
> 
> 
> 
> 
> 
> 
> (Battlefield 4 @ ultra 0 AA)


Can you tell me what usage you get with ultra with 200% resolution scale and 0x MSAA? CPU and GPU wise at stock if possible? Also the average FPS. If it'll be a hassle for you, don't bother though. Thanks!


----------



## BradleyW

Quote:


> Originally Posted by *Jack Mac*
> 
> I was taking about CPU usage, I wish that my card got 90+ usage lol. The game stutters when it gets that high, it's annoying.


I know. And I said Windows 8.1 would push it to 99.


----------



## Cool Mike

I bit the bullet and purchased the Asus 290X DCUII. Starting to get cold feet over the Elpida memory, though KitGuru's memory overclock on the Asus was on par with the Sapphire's. I like Asus' power delivery, backplate and color scheme much better.


----------



## Widde

Quote:


> Originally Posted by *rdr09*
> 
> what rez? i checked my cpu usage using 1080 BF4 MP maxed with one 290 . . .


It's at 1080p. I'll check my cpu usage; what programs are good for checking cpu usage? I only have one monitor though.


----------



## Widde

Quote:


> Originally Posted by *Jack Mac*
> 
> Can you tell me what usage you get with ultra with 200% resolution scale and 0x MSAA? CPU and GPU wise at stock if possible? Also the average FPS. If it'll be a hassle for you, don't bother though. Thanks!


You mean the CPU at stock? I'll try it tomorrow, it's 3 at night here atm







The cards aren't OC'd yet, waiting for my new PSU ^^


----------



## ZealotKi11er

So I read that EK thermal pads are not very good in terms of heat transfer. I really don't like seeing 68C on VRM1 when the core is only 45C. What do you guys recommend?


----------



## sugarhell

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So i read EK Thermal Pads are not very good in terms of heat transfer. I really don't like to see 68C VRM1 when core is only 45C. What do you guys recommend?


Buy fujipoly pads


----------



## ZealotKi11er

Quote:


> Originally Posted by *sugarhell*
> 
> Buy fujipoly pads


Is this the right stuff: http://www.frozencpu.com/scan/MM=c6eddef83858bd6b4bdb1d55e353f6a6:30:59:30.html?mv_more_ip=1&mv_nextpage=search_results_list&mv_profile=keyword_search&mv_searchspec=Fujipoly&mv_arg=


----------



## Jack Mac

Quote:


> Originally Posted by *Widde*
> 
> You mean cpu at stock? I'll try it tomorrow, it's 3 at night here atm
> 
> 
> 
> 
> 
> 
> 
> The cards arent ocd yet, waiting for my new psu ^^


Yeah, the CPU at stock please. Thanks.


----------



## sugarhell

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Is this the right stuff: http://www.frozencpu.com/scan/MM=c6eddef83858bd6b4bdb1d55e353f6a6:30:59:30.html?mv_more_ip=1&mv_nextpage=search_results_list&mv_profile=keyword_search&mv_searchspec=Fujipoly&mv_arg=


http://www.frozencpu.com/products/17500/thr-182/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_10_-_Thermal_Conductivity_170_WmK.html?tl=g30c309s1294

I think you need 0.5 mm ones too.


----------



## kpoeticg

So is a 60x50mm Fujipoly recommended per card for the best heat transfer? I'm still fairly new to watercooling and my first 290X block is supposed to be delivered by Thursday. Would I buy a 60x50 and cut it into smaller strips to cover all the VRMs?


----------



## sugarhell

I just posted one sheet. I don't know exactly how much you need for Hawaii because it has a lot more memory chips. But you need one 1mm and one 0.5mm sheet for the memory/VRMs


----------



## kpoeticg

Ok, thanx for the quick response


----------



## tsm106

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Is this the right stuff: http://www.frozencpu.com/scan/MM=c6eddef83858bd6b4bdb1d55e353f6a6:30:59:30.html?mv_more_ip=1&mv_nextpage=search_results_list&mv_profile=keyword_search&mv_searchspec=Fujipoly&mv_arg=
> 
> 
> 
> http://www.frozencpu.com/products/17500/thr-182/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_10_-_Thermal_Conductivity_170_WmK.html?tl=g30c309s1294
> 
> I tihnk you need 0.5 mm ones too.

Quote:


> Originally Posted by *kpoeticg*
> 
> Ok, thanx for the quick response


*You only need the small 1mm strip for the VRMs*. I would get Ultra Extremes for the main VRMs, and for the memory ICs you can go with the lower-cost pads or reuse stock, which are .5mm.


----------



## kpoeticg

Cool. Thanx for the help


----------



## staryoshi

Quote:


> Originally Posted by *sugarhell*
> 
> Sapphire tri-x is better


The Sapphire Tri-X comes with a stronger cooler, but the Asus DCU II is a better card.

I'm in the process of RMA'ing my AC Hybrid - When it comes back I'll get to try it on my eventual R9 290... Or GTX 780


----------



## tsm106

Quote:


> Originally Posted by *staryoshi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> Sapphire tri-x is better
> 
> 
> 
> The Sapphire Tri-X comes with a stronger cooler, but the Asus DCU II is a better card.
> 
> I'm in the process of RMA'ing my AC Hybrid - When it comes back I'll get to try it on my eventual R9 290... Or GTX 780

Afaik only the Lightning has memory voltage control, so it wins by default.


----------



## battleaxe

Quote:


> Originally Posted by *staryoshi*
> 
> The Sapphire Tri-X comes with a stronger cooler, but the Asus DCU II is a better card.
> 
> I'm in the process of RMA'ing my AC Hybrid - When it comes back I'll get to try it on my eventual R9 290... Or GTX 780


You sure? I thought the tri-x had Hynix memory while the DCII has Elpida. The DCII does have better power controllers though right? I'm trying to decide which to go with also.


----------



## staryoshi

Quote:


> Originally Posted by *tsm106*
> 
> Afaik only the Lightning has memory voltage control, so it wins be default.


The Lightning is also biggie-sized.







I've never been particularly interested in them, myself.
Quote:


> Originally Posted by *battleaxe*
> 
> You sure? I thought the tri-x had Hynix memory while the DCII has Elpida. The DCII does have better power controllers though right? I'm trying to decide which to go with also.


I prefer the DCU II board over the reference board by quite a bit. I'm less fussed about the memory modules than others here.


----------



## tsm106

Quote:


> Originally Posted by *staryoshi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Afaik only the Lightning has memory voltage control, so it wins be default.
> 
> 
> 
> The Lightning is also biggie-sized.
> 
> 
> 
> 
> 
> 
> 
> I've never been particularly interested in them, myself.

These half arsed customs have little gain though. You will be limited by the core in 99% of situations and the stock reference vrm is adequate for 1300mhz core. The limiting factor or weakness of the reference design is no memory voltage control.

The only one that gets it right as a true custom is the Lightning.


----------



## sugarhell

Quote:


> Originally Posted by *staryoshi*
> 
> The Sapphire Tri-X comes with a stronger cooler, but the Asus DCU II is a better card.
> 
> I'm in the process of RMA'ing my AC Hybrid - When it comes back I'll get to try it on my eventual R9 290... Or GTX 780


I prefer the Tri-X for the possibility of adding a waterblock. Also, more phases mean nothing when you stay on air.

If you stay on air, the priorities are:

1) Strong cooler
2) Hynix memory
3) Ref PCB, or ref PCB with more phases, so you can add a waterblock later


----------



## battleaxe

What's the benefit of the XFX Black Edition cards, other than being clocked slightly higher?


----------



## tsm106

The Sapphire 290X Tri X = Sapphire 7970 OC before it got the blue crap pcb. The Tri X is the spiritual successor. I love the original OC cards btw, owning two of them still.

Quote:


> Originally Posted by *battleaxe*
> 
> What's the benefit of the XFX black edition cards other than clocked slightly higher?


You don't have to go to the dentist, he comes to you. And going to the dentist sucks!


----------



## sugarhell

Quote:


> Originally Posted by *battleaxe*
> 
> What's the benefit of the XFX black edition cards other than clocked slightly higher?


From what I have seen, their cooler sucks for VRM cooling.


----------



## the9quad

Quote:


> Originally Posted by *sugarhell*
> 
> I prefer tri-x for the possibility of adding a waterblock. Also more phases means nothing when you stay on air.
> 
> If you stay on air the priorities is like that:
> 
> 1)Strong cooler
> 2)Hynix memories
> 3)Ref pcb or ref pcb with more phases so you can add a waterblock after


I didn't even think Hynix is much of a deal maker on these cards, is it? Seems like people are having about the same results with Elpida and Hynix


----------



## staryoshi

Quote:


> Originally Posted by *sugarhell*
> 
> I prefer tri-x for the possibility of adding a waterblock. Also more phases means nothing when you stay on air.
> 
> If you stay on air the priorities is like that:
> 
> 1)Strong cooler
> 2)Hynix memories
> 3)Ref pcb or ref pcb with more phases so you can add a waterblock after


I don't care for custom water-cooling solutions - I prefer AiO setups, because of how often my system changes and because I'm not in dire need of better thermal performance than an AiO offers. Something like the AC Hybrid or NZXT G10 + cooler will work with most non-reference designs as long as the mounting layout is the same as the reference model (and should be usable with many PCB heatsinks).

My ideal solution would be to grab the DCU II and slap my AC Hybrid on it - but at current pricing/availability that's not a likely scenario. (Mainly because I appreciate how they engineer their cards, rather than needing more performance.)

Memory headroom only matters if you're going for max clocks or possibly hashes. For the majority of users it's really not an issue.


----------



## battleaxe

Quote:


> Originally Posted by *tsm106*
> 
> The Sapphire 290X Tri X = Sapphire 7970 OC before it got the blue crap pcb. The Tri X is the spiritual successor. I love the original OC cards btw, owning two of them still.
> You don't have to go to the dentist, he comes to you. And going to the dentist sucks!


WTH?... lol

I'm not even sure how to respond. lol ... meaning what?


----------



## tsm106

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> The Sapphire 290X Tri X = Sapphire 7970 OC before it got the blue crap pcb. The Tri X is the spiritual successor. I love the original OC cards btw, owning two of them still.
> You don't have to go to the dentist, he comes to you. And going to the dentist *sucks*!
> 
> 
> 
> WTH?... lol
> 
> I'm not even sure how to respond. lol ... meaning what?

Hmm, sarcasm aside, the XFX cards that exactly follow the reference design are fine cards. However, the XFX-designed customs are terrible.


----------



## Arizonian

Quote:


> Originally Posted by *the9quad*
> 
> I didn't even think Hynix is much a of a deal maker on these cards is it? Seems like people are having about the same results with elpida and hynix


It just seems that the majority of people having issues with black screen have Elpida memory. I'm not even concerned about OC potential.


----------



## Forceman

Quote:


> Originally Posted by *the9quad*
> 
> I didn't even think Hynix is much a of a deal maker on these cards is it? Seems like people are having about the same results with elpida and hynix


Yeah, I think the limiting factor is the new memory controller, not the chips themselves.


----------



## staryoshi

Quote:


> Originally Posted by *Arizonian*
> 
> It just seems that the majority of people having issues with black screen have Elpida memory. I'm not even concerned about OC potential.


I haven't followed this issue much since the universe has kept the R9 series out of my reach for whatever, probably nefarious, reasons


----------



## Iniura

Quote:


> Originally Posted by *battleaxe*
> 
> What's the benefit of the XFX black edition cards other than clocked slightly higher?


Quote:


> Originally Posted by *sugarhell*
> 
> From what i have seen their cooler sucks for vrm cooling.


The bracket is black instead of the usual shiny silver color (I mean the bracket that holds the DVI-D, HDMI and DisplayPort), so the card also looks a little darker from the side when it's installed in your case. I think it's a nice little touch, but nothing worth buying this one over the normal one for.

Under full load in BF4 at 2560x1600, OC'd to 1150/1350 (+50 power limit, +100mV, not my max OC on air), I am getting 76 degrees on the GPU, VRM1 = 57 degrees and VRM2 = 64 degrees, and this is on air with the stock cooler. I think these are nice temperatures for this card on air, keeping in mind this is the Hawaii card everybody was so worked up over being very hot.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> Yeah, I think the limiting factor is the new memory controller, not the chips themselves.


The memory controller is really good, it's just undervolted to lower power consumption. The GDDR5 is rated for 6GHz.
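To put that 6GHz rating in bandwidth terms, a quick sketch, assuming Hawaii's 512-bit bus and treating the quoted "GHz" figures as effective GDDR5 data rates (5 GT/s stock on the 290/290X):

```python
# Peak theoretical memory bandwidth for a GDDR5 card:
# bus width (bits) / 8 bytes-per-bit-lane * effective data rate (GT/s).

def bandwidth_gbs(bus_bits, effective_gtps):
    """Peak theoretical bandwidth in GB/s."""
    return bus_bits / 8 * effective_gtps

stock = bandwidth_gbs(512, 5.0)   # 290X stock: 1250 MHz -> 5 GT/s effective
rated = bandwidth_gbs(512, 6.0)   # what 6 GT/s-rated chips could deliver
print(stock, rated)  # 320.0 384.0
```

So the chips themselves leave roughly 20% bandwidth headroom over stock, if the controller could be pushed to feed them.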


----------



## ds84

My GPU-Z can't show VRM temps for my MSI R9 290. Is it because my ASRock Z87 Extreme4 doesn't support the readings?


----------



## kizwan

Quote:


> Originally Posted by *Jack Mac*
> 
> Anyone have CF 290s with a 2500k/3570k? I want to know if it'll hold back the cards too much and what kind of OC I'd need to run two of these cards.


I see you overclocked to 4.4GHz. What is the CPU usage when playing BF3 or BF4 with one card? You haven't upgraded to CFX yet, right?

With a 3820 @4.75GHz, single-card CPU usage is in the 50s to 60s % in BF4. With CFX, CPU usage is in the 80s %.
Quote:


> Originally Posted by *jerrolds*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jack Mac*
> 
> Dang it, I hate having to spend more money. I wish I went for the 3930K instead of being impatient and buying a 3570K.
> 
> 
> 
> A 3570K is a great CPU - i *really* doubt youll see real world gains vs a 3930K or even a [email protected] unless you turn all image quality options to low or somethign.

For a single card, yes, no doubt there. For dual cards, with CPU-intensive games like BF3 & BF4, you may need to overclock to at least 4.5GHz to keep up. I think with 3 cards the 3570K will start to struggle.
Quote:


> Originally Posted by *Jack Mac*
> 
> My 3570K is far from great. It's a total dud that's not even worth OCing on because of the high voltage required to get a decent clockspeed. I just put it on Craigslist, hope it sells. I'm really sick of this thing and just want it gone. I play a lot of BF3/4 so I hope to see a good improvement by upgrading.


Silicon lottery. 4.4GHz @1.32V, I'd say it's an average chip.
Quote:


> Originally Posted by *Jack Mac*
> 
> I think my 3570K is actually reaching its limits and I hope mantle takes the strain away from my CPU. I see usage in the high 90s in BF4.


90s %? You have a lot of headroom there for an overclock. Try 4.6 or 4.7GHz at least, but if one card is already in the 90s %, for CFX you'll need to upgrade to a 3770K at least.
Quote:


> Originally Posted by *Luckael*
> 
> Congrats, now we are 2 in this group using Sapphire R9 290 Tri-X..  but mine has BF4..


In Malaysia, both Tri-X & Tri-X OC come with BF4.
Quote:


> Originally Posted by *Widde*
> 
> Got a 3570k at 4.5GHz and it's being dragged behind my 2 290s
> 
> 
> 
> 
> 
> 
> 
> At least that's what it feels like ^^ Could be unoptimized drivers as well, can't wait for the 14.1 betas
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But I went from 70-90 fps to 130-160, then I cranked the resolution scale to 200% and it's keeping at 80-110 fps
> 
> 
> 
> 
> 
> 
> 
> (Battlefield 4 @ ultra 0 AA)


What is the CPU usage with 2 290s in BF4? FPS looks good though, similar to mine @ 200% scaling, so CPU usage must be good.
Quote:


> Originally Posted by *rdr09*
> 
> what rez? i checked my cpu usage using 1080 BF4 MP maxed with one 290 . . .
> 
> 
> 
> 
> 
> both smooth, so i kept the HT off.


Which CPU do you have there? A 3770K?


----------



## Marvin82

Quote:


> Originally Posted by *Luckael*
> 
> Congrats, now we are 2 in this group using Sapphire R9 290 Tri-X..  but mine has BF4..


3... I have one too.
Though mine is a 290X Tri-X OC, not a 290.


----------



## Jack Mac

I don't have headroom to OC. 4.4 is my wall, 4.5 takes 1.4V, which I'm not going to use 24/7.


----------



## kizwan

Quote:


> Originally Posted by *Jack Mac*
> 
> I don't have headroom to OC. 4.4 is my wall, 4.5 takes 1.4V, which I'm not going to use 24/7.


Why don't you want to run 1.4V 24/7?


----------



## Aesthetics

Hey, just a question about watercooling a 290.

The XSPC waterblock (copper) is currently out of stock, and I'm not too keen to wait a week or two.

But the EK nickel waterblocks are in stock. Is it safe to use a nickel waterblock with a copper radiator/CPU block?

http://www.pccasegear.com/index.php?main_page=product_info&cPath=207_160_878_880_1501&products_id=25537
vs
http://www.pccasegear.com/index.php?main_page=product_info&products_id=25989&cPath=1504

Is there any major performance difference between the two?


----------



## psyside

Quote:


> Originally Posted by *Paulenski*
> 
> Sadly no, but I already got BF4 from my first 290X, ended up RMA'ing the card but I got to keep the code


Please share some OC results: volts, VRM temps, core temps, and all of that good stuff. I will be getting my card tomorrow!


----------



## psyside

Quote:


> Originally Posted by *Luckael*
> 
> Congrats, now we are 2 in this group using Sapphire R9 290 Tri-X..  but mine has BF4..


Also, please share info on the card; mine will be the BF4 version as well


----------



## kizwan

Quote:


> Originally Posted by *Aesthetics*
> 
> Hey, just a question about watercooling 290
> 
> XSPC waterblock (Copper) is currently out of stock, I'm not too keen to wait a week or two.
> 
> But EK nickle waterblock are in stock, is it safe to use Nickle waterblock with copper radaitor/cpu block?
> 
> http://www.pccasegear.com/index.php?main_page=product_info&cPath=207_160_878_880_1501&products_id=25537
> vs
> http://www.pccasegear.com/index.php?main_page=product_info&products_id=25989&cPath=1504
> 
> Is their any major performance differences between two?


The difference between them is probably just a couple of degrees. As for whether it's safe to use a nickel-plated block: yes, it's safe. The issue with nickel-plated blocks is that if the plating isn't done properly it may flake, but that doesn't reduce cooling performance.


----------



## psyside

Guys, I did a lot of research: Hynix on the 290/290X offers a better OC 8 out of 10 times. The Tri-X cards, almost all I've seen so far (around ~10 cards), reached 1600+ on the memory.

Elpida, on the other hand, most of the time maxes out at 1450.

Maybe the impressive cooler on the Tri-X and the cooling pads on the memory help too? I guess so.

BTW, has anyone played with Aux voltage? Is it worth using? What exactly does it do?


----------



## Derpinheimer

I don't think the cooling should help much. The memory heatsinks on my card only get hot (I'm estimating 60-70C) when running a miner; in games they are still cool to the touch. I've attempted to OC my dud Elpida from a cold start and it doesn't help, so I doubt it's a temperature limit.

And I do think I found higher Aux voltage to help maintain stability with a memory OC slightly, but that might be placebo... I don't use it anymore.

P.S. Is it confirmed the Tri-X are Hynix only? That would be the first card to ever be Hynix only, 79XX or R9 290(X)... I think even the Lightning 79XX came with both Elpida and Hynix?


----------



## psyside

Quote:


> Originally Posted by *Derpinheimer*
> 
> p.s. is it confirmed Tri-X are hynix only? That would be the first card to ever be hynix only, 79XX or R9 290(x)...


From the ~10 units I've seen so far, like I said, all of them (290/290X) had Hynix; I don't think that's just a coincidence.

Not sure about the 280X/7970. Was there a 7970 with Tri-X? I don't think so...


----------



## Forceman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The memory controller is really good just undervolted so it lower power consumption. The GDDR5 is rated for 6GHz.


Well, it's a new, simpler memory controller than what the 7970 had, and considering people aren't even getting rated speeds from the memory, my guess is that the controller is the issue. Maybe just because of voltage, although I think it runs on core voltage, doesn't it? Because increasing core voltage definitely seems to help with memory overclocks.


----------



## VladimirT

Quote:


> Originally Posted by *Iniura*
> 
> I am getting temperatures under full load in BF4 OC to 1150/1350 (+50 Power limit) +100mV (not my max OC on Air) on 2560x1600 76 degrees GPU temperature and VRM1 = 57 degrees and VRM2 = 64 degrees, and this is on Air with the stock cooler. I think these are nice temperatures for this card on Air.


Fan speed ?


----------



## sugarhell

Quote:


> Originally Posted by *Forceman*
> 
> Well it's a new, simpler memory controller than what the 7970 had, and considering people aren't even getting rated speeds from the memory my guess is that the controller is the issue. Maybe just because of voltage, although I think it runs on core voltage, doesn't it? Because increasing core voltage definitely seems to help with memory overclocks.


I don't think so. Maybe the power limit helps. Either way, Hynix are rated higher. I have seen Elpida max out around 1500; Hynix can go higher. Jomama has a 290X with Hynix at 1750 memory, so the controller is fine.


----------



## psyside

Quote:


> Originally Posted by *sugarhell*
> 
> I don't think so. Maybe the power limit helps. Either way, Hynix are rated higher. I have seen Elpida max out around 1500; Hynix can go higher. Jomama has a 290X with Hynix at 1750 memory, so the controller is fine.


Agreed, the controller is fine; Elpida is the issue most of the time. Or maybe their timings? Anyway, Hynix chips are generally better binned/higher quality, the same way Samsung chips are usually better than Hynix.


----------



## Forceman

Quote:


> Originally Posted by *sugarhell*
> 
> I dont think so.Maybe powerlimit helps. Either way hynix are rated higher. I have seen elpida maxed out around 1500,hyniix can go more.Jomama has a 1750mem 290x hynix so the controller is fine


Where did you see that the Hynix are rated higher? All I've ever seen is that they are both 6 GHz.

Someone ought to collect stats, so far it is all anecdotal and I suspect the Hynix/Elpida history colors the conventional wisdom.
Quote:


> Originally Posted by *psyside*
> 
> Agreed, the controller is fine; Elpida is the issue most of the time. Or maybe their timings? Anyway, Hynix chips are generally better binned/higher quality, the same way Samsung chips are usually better than Hynix.


Again, how do you know that? The memory controller is completely new and there is very little actual data on which memory is overclocking higher, unless you know of a spreadsheet/database I don't.

If the controller is fine then why can't all the cards hit 1500/6000 at default voltage? Unless the theory is that the RAM is being undervolted, in which case all the data kind of goes out the window anyway.

And just to be clear, I'm not saying the memory controller is bad, or weak, just that it may be the limiting factor when it comes to overclocks, especially considering many people can't even get the rated speed out of the memory.


----------



## tsm106

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> Well it's a new, simpler memory controller than what the 7970 had, and considering people aren't even getting rated speeds from the memory my guess is that the controller is the issue. Maybe just because of voltage, although I think it runs on core voltage, doesn't it? Because increasing core voltage definitely seems to help with memory overclocks.
> 
> 
> 
> I dont think so.Maybe powerlimit helps. Either way hynix are rated higher. I have seen elpida maxed out around 1500,hyniix can go more.Jomama has a 1750mem 290x hynix so the controller is fine
Click to expand...

It's not a WEAK controller. It's a 512-bit IMC; I'd assume it's rather stressed, and we have no voltage control over it. We don't really know what its potential is yet. Tahiti's IMC is 384-bit, child's play in comparison. And considering Hawaii's high memory overclocks are in the 1700MHz range (not far from the 1850MHz for Tahiti), plus the fact that there is no voltage control, Hawaii's IMC is doing rather well given the circumstances. Also, you realize the difference in theoretical bandwidth between the two, right? Hawaii is a large step forward. The actual memory IC speed isn't really a huge point when the speeds are already freakishly high.
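The bandwidth point above can be made concrete with a quick back-of-the-envelope calculation. This is just a sketch assuming stock memory clocks (1250MHz for the R9 290X, 1500MHz for the HD 7970 GHz Edition); the function name is made up for illustration:

```python
def mem_bandwidth_gbs(bus_width_bits, mem_clock_mhz, multiplier=4):
    """Theoretical bandwidth in GB/s: bus width x effective data rate / 8 bits."""
    effective_mts = mem_clock_mhz * multiplier  # GDDR5 transfers 4x per command clock
    return bus_width_bits * effective_mts / 8 / 1000

hawaii = mem_bandwidth_gbs(512, 1250)  # R9 290X: 512-bit bus, 1250MHz (5 Gbps)
tahiti = mem_bandwidth_gbs(384, 1500)  # HD 7970 GHz Ed.: 384-bit bus, 1500MHz (6 Gbps)
print(hawaii, tahiti)  # 320.0 288.0 (GB/s)
```

So even with the slower per-IC clock, the wider bus gives Hawaii more total bandwidth, which is why the IC speed alone isn't the whole story.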


----------



## sugarhell

Quote:


> Originally Posted by *Forceman*
> 
> Where did you see that the Hynix are rated higher? All I've ever seen is that they are both 6 GHz.
> 
> Someone ought to collect stats, so far it is all anecdotal and I suspect the Hynix/Elpida history colors the conventional wisdom.


If they use the same Hynix modules as the 7970s, then they are rated for higher speeds.

So I did some research. If you have:
H5GQ2H24AFR-R0C, it's the lower-binned part that runs at 1.5V and is rated for 6GHz.
H5GQ2H24AFR-R2C is 1.6V and 7GHz.

Lower binned doesn't mean it can't run at 7GHz.

It's the same memory the 7970s used. Of the ~20 7970s I had, the Hynix cards could do 2000 max; the Elpida cards could do 1600 max.

Sources: http://www.skhynix.com/products/graphics/view.jsp?info.ramKind=26&info.serialNo=H5GQ2H24AFR
http://www.techpowerup.com/reviews/AMD/HD_7970/5.html

The Elpida part is:
Elpida W2032BBBG

Now if you take into account that the Elpida ones have tighter timings...
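A minimal sketch of how those part-number suffixes map to ratings, assuming the datasheet numbers quoted above are accurate. The lookup table and helper are hypothetical, just to show the relationship between effective speed (Gbps) and the GDDR5 command clock overclocking tools display:

```python
# Ratings quoted above for the two Hynix bins (assumed from the datasheet)
HYNIX_RATINGS = {
    "H5GQ2H24AFR-R0C": {"volts": 1.5, "gbps": 6.0},  # lower bin
    "H5GQ2H24AFR-R2C": {"volts": 1.6, "gbps": 7.0},  # higher bin
}

def rated_mem_clock_mhz(part):
    """Effective Gbps -> GDDR5 command clock in MHz (4 transfers per clock)."""
    return HYNIX_RATINGS[part]["gbps"] * 1000 / 4

print(rated_mem_clock_mhz("H5GQ2H24AFR-R0C"))  # 1500.0
print(rated_mem_clock_mhz("H5GQ2H24AFR-R2C"))  # 1750.0
```

So an R0C part hitting 1500 in an OC tool is only just at its rated speed, while 1750 corresponds to the R2C bin's rating.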


----------



## Iniura

Quote:


> Originally Posted by *VladimirT*
> 
> Fan speed ?


2 GT AP-15s in the front at full speed (SpeedFan reports ~1790 RPM), 2 GT AP-15s in the top also at full speed, and 1 GT AP-15 as rear exhaust at 7V, which I think is something like ~780 RPM off the top of my head.


----------



## psyside

Quote:


> Originally Posted by *Forceman*
> 
> Again, how do you know that? The memory controller is completely new and there is very little actual data on which memory is overclocking higher, unless you know of a spreadsheet/database I don't.
> 
> If the controller is fine then why can't all the cards hit 1500/6000 at default voltage? Unless the theory is that the RAM is being undervolted, in which case all the data kind of goes out the window anyway.
> 
> And just to be clear, I'm not saying the memory controller is bad, or weak, just that it may be the limiting factor when it comes to overclocks, especially considering many people can't even get the rated speed out of the memory.


Dude, of the many cards we've seen by now, the Tri-X ones (all of them) clocked a lot higher than the ones with Elpida. There is a user on YouTube whose 290 runs @ 1700 on air. Now you want to tell me that all of this is just coincidence?

Generally the average Elpida memory OC is around 1400 and the average on Hynix is 1550... also many Elpida cards had the black screen issue, which kind of points to lower-grade products and less stability.


----------



## Paulenski

Quote:


> Originally Posted by *psyside*
> 
> Guys, i did lots of research, Hynix on 290/290X offer better oc 8/10 times. The Tri X cards, almost all i saw till now around ~10 cards, reached 1600+ on the memory
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Elpida on the other hand, most of the times max out at 1450.
> 
> Maybe also the impressive cooler on the Tri X and cooling pads on the memory helps? guess so.
> 
> BTW did anyone played with Aux voltage? worth it to use? what exactly does it do?


Quote:


> Originally Posted by *Derpinheimer*
> 
> I dont think the cooling should help much. Mem heatsinks on my card only get warm (and I actually mean hot, im estimating 60-70c) when running a miner. Games they are still cool to the touch. I mean I've attempted to OC my dud Elpida from a cold start and it doesnt help, so I doubt its a temp limit.
> 
> And I do think I found higher Aux to help maintain stability with memory OC slightly, but that might be placebo.. I dont use it anymore.
> 
> p.s. is it confirmed Tri-X are hynix only? That would be the first card to ever be hynix only, 79XX or R9 290(x)... I think even the Lightning 79XX came with Elpida and Hynix?


Just checked my Tri-X: they're Hynix. They easily reach 1500; 1600+ has been a bit of a struggle. I've only been using Sapphire TriXX, which only has offset controls for the core, no Aux controls like Afterburner.

I got around to making a nifty little chart for those interested in my findings for the Tri-X OC; I would love to get comparisons from other owners. I'm mostly concerned about whether these are normal VRM1 temps, as that's the only thing worrying me. Runs great in AC4; gonna do some more testing of actual game performance.

About BF4: apparently Newegg is no longer offering it with all 290Xs as it was supposed to. I'm gonna contact AMD and see if I can snag another code









The chart:


----------



## Forceman

Quote:


> Originally Posted by *psyside*
> 
> Dude, of the many cards we've seen by now, the Tri-X ones (all of them) clocked a lot higher than the ones with Elpida. There is a user on YouTube whose 290 runs @ 1700 on air. Now you want to tell me that all of this is just coincidence?
> 
> Generally the average Elpida memory OC is around 1400 and the average on Hynix is 1550... also many Elpida cards had the black screen issue, which kind of points to lower-grade products and less stability.


My point is that 1400 is _below_ the rated speed of the Elpida chips, so what's your reasoning for why they can't even do stock speeds? Either they are undervolted or something else is holding them back. Likewise the average Hynix overclock on the 7970 seemed to be quite a bit higher than what we've seen with the 290s. So again, unless the chips are different or undervolted, there is something else that is affecting the overclock - which is why I think the memory controller is the hold up. I'm not saying that Hynix chips aren't binned better, they may well be, but neither of them is clocking as high as they used to with the Tahiti cards.

Neither the Hynix nor the Elpida chips (which according to the part numbers sugarhell posted are both rated the same [1.6/6GHz and 1.6/7GHz]) are clocking as high as they have historically. Which implies that which memory is on the card has less to do with the max memory overclock than it has historically. Anecdotal Youtube data notwithstanding. Has anyone made a Hawaii overclocking thread yet? That'd be a good place to collect this kind of data.
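For anyone who does start collecting stats, aggregation is trivial once reports are gathered. A minimal sketch; the sample reports below are hypothetical, seeded from the rough per-vendor averages quoted in this thread, not real collected data:

```python
from statistics import mean

# Hypothetical (vendor, max stable memory clock in MHz) reports
reports = [
    ("Elpida", 1375), ("Elpida", 1400), ("Elpida", 1425),
    ("Hynix", 1500), ("Hynix", 1550), ("Hynix", 1600),
]

def average_oc(reports):
    """Group reported max memory clocks by vendor and average them."""
    by_vendor = {}
    for vendor, clock in reports:
        by_vendor.setdefault(vendor, []).append(clock)
    return {vendor: mean(clocks) for vendor, clocks in by_vendor.items()}

print(average_oc(reports))  # {'Elpida': 1400, 'Hynix': 1550}
```

With enough submissions, something like this would settle the Hynix-vs-Elpida question with data instead of anecdotes.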


----------



## sugarhell

Quote:


> Originally Posted by *Forceman*
> 
> My point is that 1400 is _below_ the rated speed of the Elpida chips, so what's your reasoning for why they can't even do stock speeds? Either they are undervolted or something else is holding them back. Likewise the average Hynix overclock on the 7970 seemed to be quite a bit higher than what we've seen with the 290s. So again, unless the chips are different or undervolted, there is something else that is affecting the overclock - which is why I think the memory controller is the hold up. I'm not saying that Hynix chips aren't binned better, they may well be, but neither of them is clocking as high as they used to with the Tahiti cards.


With more volts, Hynix cards could actually hit 1800-1900. But we are limited to 1.5V.


----------



## psyside

R9 *290X* Tri X OC



R9 *290* Tri X



Same chips on both 290/290X


----------



## sugarhell

refer to my post about hynix vs elpida.

The R0C version is the lower-binned Hynix version for 1.5V. It should be able to reach 1700 fairly easily.


----------



## psyside

Quote:


> Originally Posted by *Paulenski*
> 
> The chart:


Thanks a lot for this. I would suggest not using over 1.3V (1.35V as an absolute max) on air, and if you do, max out your fans. I would go for 1.3V; I don't think scaling with volts is really great past 1.3V... you're already running like +120mV, if I'm not mistaken.
Quote:


> Originally Posted by *sugarhell*
> 
> refer to my post about hynix vs elpida.
> 
> R0C version is the lower binned version of hynix for 1.5 volts. It should be able to reach 1700 kinda easily


Thanks. What is the stock memory voltage on these cards?


----------



## Paulenski

Quote:


> Originally Posted by *psyside*
> 
> Thanks alot for this. I would suggest to not use over 1.3v-1.35 as absolute max on air, if you do max out your fans, i would go for 1.3v, i dont think scaling with volts is really great after 1.3v...you already run like 120mV + if i'm not mistaken.



I didn't think I could go anymore on the voltage, unless something is wrong with the thermal pads I can't do much else with those VRM1 temps.


----------



## psyside

Quote:


> Originally Posted by *Paulenski*
> 
> I didn't think I could go anymore on the voltage, unless something is wrong with the thermal pads I can't do much else with those VRM1 temps.


Nothing is wrong. 1.4V is too much for air, dude. Did you try 1200/1500 @ 1.3V? Heck, even 1.25V? If it's stable I would stick to that; +20MHz on the core for +100mV? No thanks.









What is your ambient temp, and what case do you have?


----------



## Paulenski

Quote:


> Originally Posted by *psyside*
> 
> Nothing is wrong. 1.4v is to much for air dude. Did you try 1200/1500 @1.3v? heck even 1.25v? if it stable i would stick to that, 20mhz+ on core for 100mV+ ? no thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What is your ambient temp, and what case you got?


Not sure on ambient, it feels nice in my room so prob like 70-80F, doesn't feel warm.

I have the Thor V2 with plenty of cooling. I'm not too concerned with case temps causing any issues.

I figured I would be at the edge of what's possible with air. I've done some benchmarks with BF4 on small and large maps, about 3-5 minutes per bench performed consecutively to avoid temps lowering.

On 64 player map, Lancang Dam (largest outdoor one I could get on)
Temps: Core 65C, VRM1 63C, VRM2 51C
Avg Fraps: 59.8x
1202/1500 @ 125mv offset


----------



## psyside

Quote:


> Originally Posted by *Paulenski*
> 
> On 64 player map, Lancang Dam (largest outdoor one I could get on)
> Temps: Core 65C, VRM1 63C, VRM2 51C
> Avg Fraps: 59.8x
> 1202/1500 @ 125mv offset


Thanks for the info rep +

Those are great temps! You should run a 20-minute Heaven loop at those exact OC settings and see what the max will be on the VRMs/core... would be nice to use 75% fan as well if you like.









Looks like great settings for high oc


----------



## Widde

Quote:


> Originally Posted by *Jack Mac*
> 
> I don't have headroom to OC. 4.4 is my wall, 4.5 takes 1.4V, which I'm not going to use 24/7.


Morning ^^ I've done some testing (brief testing, mind you ^^). I didn't notice much of a dip in fps going from 4.5 to 3.4GHz. I might have to do more testing, as I only did 10-minute runs at the different speeds, and I maybe should have restarted the PC before doing my 4.5 run, as my CPU usage for some reason was higher at that speed. http://piclair.com/rwend (4.5GHz) http://piclair.com/8way5 (3.4GHz)

Did my testing on Rogue Transmission 64 conquest, and it feels like maybe a 10fps drop on average, still keeping fps at 90-120, sometimes dipping into the high 80s with the clock at 3.4. Another run at 4.5 (http://piclair.com/1n0rs) again gained around 10 average fps
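Two runs at different CPU clocks are actually enough for a rough estimate of how CPU-bound the game is, if you model frame time as a fixed GPU part plus a CPU part that scales inversely with clock. A sketch with hypothetical fps figures (picked to mirror the ~10 fps average drop reported above, not Widde's exact numbers):

```python
def cpu_bound_share(fps_hi, clk_hi, fps_lo, clk_lo):
    """Model frame time as t_gpu + C/clk; return the CPU-limited
    fraction of each frame at the higher clock clk_hi."""
    # Solve for C from the change in frame time between the two runs
    c = (1 / fps_lo - 1 / fps_hi) / (1 / clk_lo - 1 / clk_hi)
    return (c / clk_hi) * fps_hi  # CPU time over total frame time

share = cpu_bound_share(fps_hi=105, clk_hi=4.5, fps_lo=95, clk_lo=3.4)
print(f"~{share:.0%} of each frame is CPU-limited at 4.5GHz")  # ~33%
```

A small fps drop for a big clock drop means most of the frame is GPU work, which matches the observation that 3.4GHz barely hurt.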


----------



## Sazz

I've asked this twice, never got an answer..

Anyone that has been using Arctic accelero Xtreme III coolers, what size did you guys use for the VRM and RAM modules???

I need to know ASAP to place an order for them so that I can receive them within this week.


----------



## dspacek

Where is the list of users with clocks and cooling for the R9 290(X)? I would like to see how I compare ;-) Thanks.


----------



## rdr09

Quote:


> Originally Posted by *dspacek*
> 
> Where is the list of users with clocks and cooling for the R9 290(X)? I would like to see how I compare ;-) Thanks.


http://www.overclock.net/t/1447763/amd-r9-290-290x-overclockers-club


----------



## Jack Mac

Quote:


> Originally Posted by *Widde*
> 
> Morning ^^ I've done some testing, brief testing mind you ^^ But didnt notice much of a dip on fps going from 4.5 to 3.4ghz, might have to do more testing as i only did 10 min runs at the different speeds and I maybe should have restarted the pc before doing my 4.5 run as my cpu usage for some reason was higher at that speed. http://piclair.com/rwend (4.5ghz) http://piclair.com/8way5 (3.4ghz)
> 
> Did my testing on Rogue Transmission 64 conquest and feels like a 10fps drop average maybe, still keeping fps at 90-120 some time dipped into the high 80s with the clock at 3.4. Another run at 4.5 http://piclair.com/1n0rs gained again around 10 average fps


Thanks, I'm fine with that FPS, looks like I'll go with CF 290s before upgrading my CPU.


----------



## Sazz

Quote:


> Originally Posted by *dspacek*
> 
> Where is the list of users with clocks and cooling for the R9 290(X)? I would like to see how I compare ;-) Thanks.


I just dismantled my rig and I am back on stock cooler.

I replaced the thermal pads on the VRM sections and used PK-1 for thermal grease. The GPU maxes out at 79C after 1hr of BF4; VRM1 is at 68C while VRM2 is at 65C. I used my leftover Koolance thermal pad (the one with white/pink sides).

And I remember some people here using Corsair AIO coolers on their R9 290/X cards who mentioned brackets available specifically for mounting those AIO coolers on these cards... where can I find those?


----------



## ds84

Quote:


> Originally Posted by *Sazz*
> 
> I just dismantled my rig and I am back on stock cooler.
> 
> I replaced the thermal pads on the VRM parts and used PK1 for thermal grease. GPU maxing out at 79C after 1hr of BF4 and VRM1 is at 68C while VRM2 is 65C. I used my leftover Koolance thermal pad (the one with white/pink sides).
> 
> And I remember some people using corsair AIO coolers here too on their R9 290/X cards, that have mentioned available brackets specifically for these cards to install those AIO coolers.. where can i find those?


NZXT G10...


----------



## Redvineal

Quote:


> Originally Posted by *Sazz*
> 
> ...And I remember some people using corsair AIO coolers here too on their R9 290/X cards, that have mentioned available brackets specifically for these cards to install those AIO coolers.. where can i find those?


Here's a couple options for you:

Richie's GPU Cool Bracket (with optional fan attachment bracket):
http://www.overclock.net/t/1439434

Kepler Dynamics' Sigma__Cool Bracket:
http://www.overclock.net/t/1221722/

The former has a more extensive water block compatibility list since it doesn't exactly fit the round Asetek design. Hope this helps!











Quote:


> Originally Posted by *ds84*
> 
> NZXT G10...


And of course there's the NZXT Kraken G10, if you're willing to wait a month for shipping.


----------



## Sazz

Quote:


> Originally Posted by *ds84*
> 
> NZXT G10...


That thing doesn't do enough for the VRMs.

Quote:


> Originally Posted by *Redvineal*
> 
> Here's a couple options for you:
> 
> Richie's GPU Cool Bracket (with optional fan attachment bracket):
> http://www.overclock.net/t/1439434
> 
> Kepler Dynamics' Sigma__Cool Bracket:
> http://www.overclock.net/t/1221722/
> 
> The former has a more extensive water block compatibility list since it doesn't exactly fit the round Asetek design. Hope this helps!
> 
> 
> 
> 
> 
> 
> 
> 
> And of course there's the NZXT Kraken G10, if you're willing to wait a month for shipping.


Oh, if that's what it is... I can make one myself. I got some materials here and lots of screws/nuts/washers available around xD


----------



## ds84

Get some heatsinks for it? Just wondering, does anyone have the dimensions of VRM1 and VRM2 on the MSI R9 290?


----------



## Sazz

Quote:


> Originally Posted by *ds84*
> 
> Get some heatsinks for it? Jus wondering, does anyone have dimensions for the VRM 1 and VRM 2 for MSI r9 290?



Quote:


> Originally Posted by *Redvineal*
> 
> Here's a couple options for you:
> 
> Richie's GPU Cool Bracket (with optional fan attachment bracket):
> http://www.overclock.net/t/1439434
> 
> Kepler Dynamics' Sigma__Cool Bracket:
> http://www.overclock.net/t/1221722/
> 
> The former has a more extensive water block compatibility list since it doesn't exactly fit the round Asetek design. Hope this helps!
> 
> 
> 
> 
> 
> 
> 
> 
> And of course there's the NZXT Kraken G10, if you're willing to wait a month for shipping.


I am just guesstimating (well, not really) and I got these:

http://www.frozencpu.com/products/5557/vid-86/Swiftech_MC21_Aluminum_MOSFET_Heatsinks_-_21mm_x_6mm_x_10mm_-_4_pack.html?id=W8WFsaTc&mv_pc=2619
http://www.frozencpu.com/products/5518/vid-82/Enzotech_Forged_Copper_VGA_Memory_Heatsink_Multipack_-_ATI_and_nVidia_-_14mm_x_14mm_x_14mm_BMR-C1.html?id=W8WFsaTc&mv_pc=2620


----------



## TheRoot

http://www.techpowerup.com/196794/amd-catalyst-14-1-beta-to-include-mantle-and-trueaudio-runtimes.html
mantle and true audio


----------



## katset

Hi !

I just received my Sapphire R9 290 Tri-X BF4, and after 30 min of testing and a very slight OC I got artifacts everywhere.
Now @ stock there are artifacts even in Firefox... I've tried everything: the other PCI-Express port, reinstalling drivers...

I guess there's nothing left but an RMA now, right?









thanks


----------



## Aggronor

Any news on Mantle, guys? Today is supposed to be the day!


----------



## ds84

Which material is better to get, copper or aluminium?


----------



## Luckael

Quote:


> Originally Posted by *Paulenski*
> 
> Where did you order from?


here in the Philippines.,


----------



## battleaxe

Quote:


> Originally Posted by *katset*
> 
> Hi !
> 
> I just received my sapphire r9 290 tri-X BF4, and after 30 min of testing and very slight OC I got artifacts everywhere.
> Now @ stock, artifacts even under firefox... tried everything, other PCI-express port, reinstalled drivers..
> 
> I guess nothing else than RMA now, right?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> thanks


I would guess. If it can't do stock, get a new one. That's not normal.


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> I see you overclocked to 4.4GHz. What is the CPU usage when playing BF3 or BF4 with one card? You not yet upgrade to CFX right?
> 
> With 3820 @4.75GHz, single card CPU usage is in 50s to 60s % in BF4. With CFX, CPU usage at 80s %.
> For single card, yes no doubt there. For dual cards, with CPU intensive games like BF3 & BF4, you may need to overclock to 4.5GHz at least to keep up. I think with 3 cards, 3570K will start to struggle.
> Silicon lottery. 4.4GHz @1.32V, I say it's average chip.
> 90s %? You have a lot of headroom there for overclock. Try 4.6 or 4.7GHz at least but if one card already at 90s %, for CFX, you'll need upgrade to 3770k at least.
> In Malaysia, both Tri-X & Tri-X OC come with BF4.
> What is the CPU usage with 2 290s in BF4? FPS look good though, similar with mine @200% scaling, so CPU usage must be good.
> Which CPU you have there? 3770k?


2700K. some maps usage are higher.


----------



## Csokis

MSI R9 290 Gaming Edition Mini review

Hynix memory





VRM temps!


----------



## Noufel

Hi to all.
I'll get my 2 290 Tri-Xs Thursday if all goes well. I have two questions:
first, will my 8320 bottleneck these cards, and second, will my PSU be sufficient?

thanks

PS: hope it will be, i have no money for a new one


----------



## shilka

Quote:


> Originally Posted by *Noufel*
> 
> hi to all
> I'll get my 2 290 tri x Thursday if all goes well I have two questions
> first one is my 8320 Will bottleneck these cards and if my psu will be sufficient
> 
> thanks
> 
> PS: hope it will be, i have no money for a new one


You don't need a new PSU unless you are going to overvolt your video cards.

Also, the only bad thing about the one you have is that it's group regulated.


----------



## rdr09

Quote:


> Originally Posted by *Noufel*
> 
> hi to all
> I'll get my 2 290 tri x Thursday if all goes well I have two questions
> first one is my 8320 Will bottleneck these cards and if my psu will be sufficient
> 
> thanks
> 
> PS: hope it will be, i have no money for a new one


The answer to the first question is a resounding yes. Mantle might help, though.


----------



## psyside

Quote:


> Originally Posted by *Csokis*
> 
> MSI R9 290 Gaming Edition Mini review
> 
> VRM temps!


Do an extended loop bench 30 mins, with oc, then post vrm temps


----------



## devilhead

These cards are really weird: if I overclock to 1250/1750 it's OK, but 1280/1750 fails while 1280/1700 is OK







.... need to figure out....


----------



## bloodkil93

Well, my Sapphire 290X completely died yesterday: no POST or any sign of life apart from a spinning fan -.-

This is the second time I've had to RMA...


----------



## ReHWolution

I installed the Icy Vision, but the VRM1 temps are so damn high. I tried to run the Metro LL benchmark and VRM1 went up to 85°C while the GPU reached 71°C. I stopped it before the end since I was just checking what the temps are now. I was @ 1200/1500, +200mV in TriXX, +50% power. Did I mount it the wrong way or what?


----------



## dspacek

Quote:


> Originally Posted by *devilhead*
> 
> those cards is really weird, now if i will overclock 1250/1750, then ok, but if 1280/1750 fails, 1280/1700 ok
> 
> 
> 
> 
> 
> 
> 
> .... need to figure out....


Which BIOS? Which program do you OC with? You have the X version, I suppose...


----------



## Xares

Hi, is this fluctuation in GPU core frequency normal?







It is not a temperature problem.

drivers Catalyst 13.12


----------



## stickg1

Quote:


> Originally Posted by *ReHWolution*
> 
> I installed the Icy Vision but VRM 1 temps are so damn high, I tried to run MetroLL benchmark and it went up to 85°C, and the gpu reached 71°C, I stopped it before the end cause I'm just checking what the temps are now, I was @ 1200/1500, +200mV on TriXX, +50% power, Did I mount it in the wrong way or what?


Doesn't sound too bad for air cooling. You're pumping a bunch of voltage into the card. It's going to get hot at those settings even with a decent air cooler.


----------



## HardwareDecoder

arizonian you can update me to having 2, 290's and 2 290x's now.

2 -- 290's on air / 1 290x on air for mining.

1 290x on water for gaming/mining.

All sapphires

all ref models


----------



## devilhead

Quote:


> Originally Posted by *dspacek*
> 
> Which BIOS? which program to OC? you have X version I suppose...


Stock Sapphire 290X BIOS; I tried PT1, but it's the same







max +200mv


----------



## Arizonian

Quote:


> Originally Posted by *HardwareDecoder*
> 
> arizonian you can update me to having 2, 290's and 2 290x's now.
> 
> 2 -- 290's on air / 1 290x on air for mining.
> 
> 1 290x on water for gaming/mining.
> 
> All sapphires
> 
> all ref models


Sure can - updated


----------



## ReHWolution

Quote:


> Originally Posted by *stickg1*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ReHWolution*
> 
> I installed the Icy Vision but VRM 1 temps are so damn high, I tried to run MetroLL benchmark and it went up to 85°C, and the gpu reached 71°C, I stopped it before the end cause I'm just checking what the temps are now, I was @ 1200/1500, +200mV on TriXX, +50% power, Did I mount it in the wrong way or what?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Doesn't sound too bad for air cooling. You're pumping a bunch of voltage into the card. It's going to get hot at those settings even with a decent air cooler.

104 °C is not a good temperature







VRM1 is the large row at the back of the card, right? +100mV, 1200/1450MHz, CatZilla stable... tMAX for VRM1 is 104°C, tMAX for VRM2 is 49°C, tMAX for GPU is 71°C...


----------



## stickg1

Well before you said +200mV and 85C. 104C on +100mV is not okay. What heatsinks and thermal tape did you use?


----------



## rdr09

Quote:


> Originally Posted by *Xares*
> 
> hi, this fluctuation on GPU frequency core is normal?
> 
> 
> 
> 
> 
> 
> 
> it is not temperature problem.
> 
> drivers Catalyst 13.12


what were the vrm temps? your vantage looks low . . .

http://www.3dmark.com/3dmv/4536111

prolly it's a very old benchmark that does not play well with Hawaii.


----------



## Xares

Quote:


> Originally Posted by *rdr09*
> 
> what were the vrm temps? your vantage looks low . . .
> 
> http://www.3dmark.com/3dmv/4536111
> 
> prolly it's a very old benchmark that does not play well with Hawaii.


the VRM temps are 74°C. It is a 290X Tri-X.

are the core fluctuations in Heaven and Metro normal??

thanks


----------



## rdr09

Quote:


> Originally Posted by *Xares*
> 
> the VRM temps are 74ºC. It is a 290x Tri-X
> 
> the core fluctuation on Heaven and metro are normals??
> 
> thanks


yes, those fluctuations are normal. Actual games will show the same fluctuation at times. I ran Vantage with my 290 at stock and got about the same score. Your VRM temp looks normal as well (there are 2 VRM temps). My Heaven score at stock is 50.


----------



## SkullTrail

Should I upgrade from a 7970 to this?


----------



## rdr09

Quote:


> Originally Posted by *SkullTrail*
> 
> Should I upgrade from a 7970 to this?


yes.


----------



## Jack Mac

I upgraded from a GTX 670, it was definitely worth it. I hope 290 prices drop so I can CF already.


----------



## ImJJames

Quote:


> Originally Posted by *SkullTrail*
> 
> Should I upgrade from a 7970 to this?


I had a 7970 before the 290; through all my gaming and benchmarks, the 290 for me was overall around 50% better. So it's up to you to decide. (I also bought mine when it was only $399 + a 10% discount, so $360 total)


----------



## Xares

OK thanks rdr09


----------



## pilla99

New 290 owner here. I just made the transition from a GTX 690 and have two immediate thoughts.

1. Jesus, this thing runs hot - a 95C temp target by default, and it reaches that temp in a hurry with any game I play.
2. Jesus, this thing is loud. The default 45% fan profile sounds like a box fan compared to my 690, which honestly I could almost never hear.

Aside from water, is there anything that can be done to alleviate either of these issues even a little? Pic of the internals now; planning on doing true water in the future, so I have the room now.


----------



## neurotix

OP, add me to the list.

Sapphire R9 290 Tri-X


Spoiler: Pictures of the new card














Spoiler: More proof








Card does not unlock (F801).

Haven't started overclocking yet but: http://www.3dmark.com/3dm11/7822192

Stock seems to be 1000/1300mhz. Haven't seen the card pass 55C yet running Valley and 3dmark11.

Time to pump the voltage and raise the clocks.

EDIT: If it wasn't apparent, this card has Hynix RAM, just like the other Tri-X we've seen.


----------



## Arizonian

Quote:


> Originally Posted by *pilla99*
> 
> New 290 owner here. I just made the transition from a GTX 690 and have two immediate thoughts.
> 
> 1. Jesus this thing runs hot, 95C temp target by default that reaches that temp in a hurry with any game I play.
> 2. Jesus this thing is loud. Default 45% fan profile sounds like a box fan as compared to my 690, which honestly I could almost never hear.
> 
> Aside from water is there anything that can be done to alleviate either of these issues even a little? Pic of the internals now, planning on doing true water in the future so I have the room now.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 


If you could 'edit' and add a GPU-Z link with your OCN name, or a screenshot with the GPU-Z validation tab open and your OCN name showing, that would be great. In the meantime --- Congrats - added









Quote:


> Originally Posted by *neurotix*
> 
> OP, add me to the list.
> 
> Sapphire R9 290 Tri-X
> 
> 
> Spoiler: Pictures of the new card
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: More proof
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Card does not unlock (F801).
> 
> Haven't started overclocking yet but: http://www.3dmark.com/3dm11/7822192
> 
> Stock seems to be 1000/1300mhz. Haven't seen the card pass 55C yet running Valley and 3dmark11.
> 
> Time to pump the voltage and raise the clocks.


Congrats on the Tri-X - post results on anything when you get 'em. Added


----------



## ReHWolution

Quote:


> Originally Posted by *stickg1*
> 
> Well before you said +200mV and 85C. 104C on +100mV is not okay. What heatsinks and thermal tape did you use?


It was just a preliminary test; I stopped it after a few seconds to check temperatures. The readings from the GPU-Z log (fan speed columns were logged as "-"):

GPU Core Clock: 1200.0 MHz
GPU Memory Clock: 1450.0 MHz
GPU Temperature: 71.0 °C
GPU Load: 100 %
Memory Usage (Dedicated): 1256 MB
12V: 11.50 V
VDDC: 1.180 V
VDDCI: 1.000 V
VDDC Current In: 28.3 A
VDDC Current Out: 213.5 A
VDDC Power In: 327.0 W
VDDC Power Out: 234.5 W
VRM Temperature 1: 104 °C
VRM Temperature 2: 49 °C

That's 1.2 V effective (according to the program, but I set +100mV)... I don't know. I used this guide: http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x/0_50, but instead of 6 small VRM heatsinks I used 5, since one wasn't really in contact with much surface. I'm unmounting this ***** again to see what's wrong, brb :\
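For anyone else digging through GPU-Z sensor logs like the one above: they are plain CSV with one column per sensor, so pulling out peak VRM temps is a few lines of Python. A minimal sketch (the `LOG` string is a cut-down sample in GPU-Z's format, not the full log from the post):

```python
import csv
import io

# Cut-down sample in GPU-Z's sensor-log CSV format: a header row of
# "Name [unit]" columns, then one row of readings per polling interval.
LOG = """\
GPU Core Clock [MHz] , GPU Temperature [°C] , VRM Temperature 1 [°C] , VRM Temperature 2 [°C] ,
1200.0 , 71.0 , 104 , 49 ,
"""

def max_column(log_text, column):
    """Maximum value seen in one sensor column; skips '-' placeholders."""
    reader = csv.reader(io.StringIO(log_text))
    header = [h.strip() for h in next(reader)]
    idx = header.index(column)
    values = []
    for row in reader:
        cell = row[idx].strip()
        if cell and cell != "-":          # GPU-Z logs "-" when a sensor is idle
            values.append(float(cell))
    return max(values)

max_column(LOG, "VRM Temperature 1 [°C]")  # 104.0 from the sample row
```

Running this over a full session log makes it easy to confirm whether a spike like the 104°C VRM1 reading was a one-off or sustained.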


----------



## ZealotKi11er

CF doesn't appeal to me anymore. Most of the time I play games I don't need CF, and the games where CF might see gains can still be played very well on one GPU - and really, no game is worth the price of a second card.


----------



## VSG

Visiontek rep on Linustechtips said their online store will be getting a huge stock of reference 290 and 290x cards this week and they sell at MSRP so check it out if anyone wants a card. It may also mean other US vendors will be getting in more stock since the overseas shipments usually come in bulk together.


----------



## ds84

Quote:


> Originally Posted by *Csokis*
> 
> MSI R9 290 Gaming Edition Mini review
> 
> 
> 
> 
> 
> VRM temps!


Which software did u use to see the memory info?

And what mobo are u using? My ASRock Z87 Extreme4 can't see VRM temps. Or is there any software that can see VRM temps?

Hynix memory


----------



## jerrolds

Quote:


> Originally Posted by *Sazz*
> 
> That thing doesn't have enough for the VRM.


True, but it opens up the card so you can put some sinks on and get some cooler air directly onto them (hack the stock cooler). The air to my VRMs has to go through the heatsink of the Gelid







GPU is 55C but VRM1 is 85C.

Just can't pull the trigger yet on going underwater.


----------



## MlNDSTORM

I want the Tri-X 290x....I have a 780Ti and happy with it...just one of those things where you have money and you want to buy something, even though you don't need it... Just to get the itch off I guess...


----------



## Gumbi

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Oh yeah, they look amazing, really great feel to them as well, surprisingly light but also no need for a backplate, next to no sag at all, was very impressed build quality wise.
> 
> EDIT: Instead of triple posting i'm just going to edit this one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Both cards do not unlock and they both have Elphida ram so thats confirmed, the top card is on average 10c hotter than the bottom one and thats with 30c ambients......


That's expected with a CF setup, as one card is feeding the other preheated air. You have 30 degree ambients? Are you sure? That's very, very high - 8-12 degrees higher than most people's. What are the cards idling at? Do you have a side fan feeding these cards cool air?
Quote:


> Originally Posted by *MlNDSTORM*
> 
> I want the Tri-X 290x....I have a 780Ti and happy with it...just one of those things where you have money and you want to buy something, even though you don't need it... Just to get the itch off I guess...


Hah. Well, if you have the money, go for it. The Ti is plain better than the 290X. Personally I think the 290 is far superior to the 290X - it's only 3-4% slower clock for clock and yet it's a lot cheaper. Much better value IMO.

If I were in the market for a 290x, I'd probably just grab a 780 Lightning or a ti.

Then again, if you have the money and simply want to spend it, go for it







.

Forgive the formatting, am on phone









*StarYoshi Edit: I cleaned it up for ya







*


----------



## Prexxus

Just got my new RMA'd card back and it works like a charm! No more black screens, everything is gravy!

Now I'm contemplating buying an Accelero Xtreme III so I can get some better cooling on it; the sound really doesn't bother me much. But this thing freaks me out! I have never opened up a GPU, and voiding the warranty just seems crazy to me.

But I still want to do it! So I've been reading guides about the installation and such, but I still feel worried. The guide on Tom's Hardware says it does not come with enough heatsinks for all the memory chips. Where do I buy the correct spare parts for this thing? What are they called? Their website has no spare heatsinks for sale from what I can see.

Also, the guides say you should stick some tape on certain parts of the GPU, but they aren't very clear about where, or what those parts are.

There is also the small memory chip near the bottom of the card that is smaller than the others; most people put what seems to be a tiny heatsink on it, with some thermal tape covering the other half. Does that heatsink come included with the Accelero Xtreme III?

As I write this I'm realising this might not be the place to post this. If not I am sorry and please move my post to the correct area of the forums or delete it. I'm new here so I'm still kind of lost.

Any info/advice on getting this thing installed would be greatly appreciated. I've searched YouTube for some videos, but no one seems to have made any installation videos for a 290 with this custom cooler.

Thanks guys.


----------



## Paulenski

I did some extensive testing with higher memory clocks on my tri-x 290x.

Aux voltage does in fact affect memory overclocking. I used Afterburner to increase only aux, since TriXX has the higher core voltage control.

I've run 15+ Valley runs; at a mere +50mV aux (1.047 actual) I went from a max of 1500 to 1690MHz. It also seems to directly affect the VRM2 temps - only a 2C increase with the 190MHz memory increase.

My maximum core clock is 1202MHz @ +125 offset, with VRM1 reaching 90C in the Valley benchmark.

Anyone else with a Tri-X get any higher on core or memory? Would like to know other Tri-X VRM temps

Sent from my SGH-M919 using Tapatalk


----------



## neurotix

Hey.

Anybody having any problems with video playback with these cards?

Playing MPEG video in Windows Media Player or Youtube/Flash in Firefox causes my entire system to hard lock, requiring a reboot.

We found out that using VLC Media Player on the same files lets them play fine. Using Chrome for Youtube also works fine.

It seems to be a driver or codec issue. I tried Catalyst 13.12, 13.11 beta 9.5 drivers. I installed them with all options initially but the problem persisted, so I tried installing 13.12 without the "AMD Media", "AMD Accelerated Video Transcoding" and other video related suboptions. This didn't make a difference. So it leads me to believe there is likely a codec issue.

Does anyone have a fix for this? Should I RMA my card?

I have a Sapphire 290 Tri-X.

After dropping $600 on this thing and seeing impressive performance in games, it baffles me that it cannot play simple MPEG videos without hard locking the system.


----------



## chiknnwatrmln

Memory past 1500 MHz or so provides nearly no extra performance gains in real results... maybe a couple hundred GPU points in 3DMark11, which equates to not even 1 additional fps in most games... IMO you're better off sticking to 1500 MHz and running default aux voltage.


----------



## Arizonian

Quote:


> Originally Posted by *ds84*
> 
> Which software u used to see memory info?
> 
> And what mobo u using? My asrock z87 extreme 4 cant see vrm temps. Or any software can see vrm temps?
> 
> Hynix memory


OP has a link to ' *Memory Info Test* ' - after you install it just make sure you have GPU-Z open when you test it for result.


----------



## Lanvin

What is the general consensus on cooling crossfire 290s without a loop?

Kraken g10 + x40
Or
Accelero xtreme 3 / icy vision.

I have enough slots between the two cards



Tri-x would have been a perfect match for my orange themed pc too.


----------



## Forceman

Quote:


> Originally Posted by *neurotix*
> 
> Hey.
> 
> Anybody having any problems with video playback with these cards?
> 
> Playing MPEG video in Windows Media Player or Youtube/Flash in Firefox causes my entire system to hard lock, requiring a reboot.
> 
> We found out that using VLC Media Player on the same files lets them play fine. Using Chrome for Youtube also works fine.
> 
> It seems to be a driver or codec issue. I tried Catalyst 13.12, 13.11 beta 9.5 drivers. I installed them with all options initially but the problem persisted, so I tried installing 13.12 without the "AMD Media", "AMD Accelerated Video Transcoding" and other video related suboptions. This didn't make a difference. So it leads me to believe there is likely a codec issue.
> 
> Does anyone have a fix for this? Should I RMA my card?
> 
> I have a Sapphire 290 Tri-X.
> 
> After dropping $600 on this thing and seeing impressive performance in games, it baffles me that it cannot play simple MPEG videos without hard locking the system.


I had the same problem and for me it was a conflict with the DisplayLink adapter I was using to run a second USB monitor. So maybe check if you have any other graphics related drivers on there (or maybe run DDU if you had Nvidia in before).


----------



## kizwan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> CF does not appeal me anymore. Most of the time i play games i dont need CF and those games that CF might see gain can still be played very well with one GPU and really the game is not worth the second card price.


If you play only on one monitor, you don't need CFX of course.
Quote:


> Originally Posted by *Gumbi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Oh yeah, they look amazing, really great feel to them as well, surprisingly light but also no need for a backplate, next to no sag at all, was very impressed build quality wise.
> 
> EDIT: Instead of triple posting i'm just going to edit this one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Both cards do not unlock and they both have Elphida ram so thats confirmed, the top card is on average 10c hotter than the bottom one and thats with 30c ambients......
> 
> 
> 
> That's expected with a CF setup, as one card is credit the other preheated air. You have 30 degree ambients? Are you sure, that's very very hi

Why is it hard to believe? @Sgt Bilko lives in NSW, Australia. I heard it's hot there now.

My ambient temp (indoor) is currently 33C. This is a typical ambient temp in Malaysia. At night it can go down to 28/29C (indoor), at which point, with BF3 @ 80% fans, GPU temp can go as high as 70C.

BF3, ambient 28/30C, 80% fans (GPU1 & GPU2)





Can't wait to get water blocks for my cards. They're en route from Slovenia, now at Germany.


----------



## psyside

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Memory past 1500 MHz or so provides nearly no extra performance gains in real results... Maybe a couple hundred GPU points 3dmark11, which equates to like not even 1 additional fps for most games... Imo you're better off sticking to 1500 MHz and running default aux voltage.


Depends on the game. Going from 1250 to 1650 resulted in an 8% increase in a few games I saw. I bet that Crysis 3 will have big gains.
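To put those memory clocks in perspective: Hawaii (290/290X) has a 512-bit GDDR5 bus, and GDDR5 transfers at 4x the clock that GPU-Z/Afterburner report, so the theoretical bandwidth jump is easy to work out. A quick sketch:

```python
# Theoretical memory bandwidth for a GDDR5 card.
# The reported "memory clock" (e.g. 1250 MHz) is the command clock;
# GDDR5's effective data rate is 4x that.

def gddr5_bandwidth_gbs(mem_clock_mhz, bus_width_bits=512):
    """Theoretical bandwidth in GB/s (512-bit bus = Hawaii default)."""
    effective_rate = mem_clock_mhz * 4           # MT/s per pin
    return effective_rate * bus_width_bits / 8 / 1000

gddr5_bandwidth_gbs(1250)  # 320.0 GB/s (stock 290X)
gddr5_bandwidth_gbs(1650)  # 422.4 GB/s
```

So 1250 -> 1650 MHz is a ~32% bandwidth increase on paper, which is why the real-world gain (8% in the games psyside saw) depends entirely on how bandwidth-bound a given game is.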


----------



## staryoshi

Quote:


> Originally Posted by *Lanvin*
> 
> What is the general consensus on cooling crossfire 290s without a loop?
> 
> Kraken g10 + x40
> Or
> Accelero xtreme 3 / icy vision.
> 
> I have enough slots between the two cards
> 
> Tri-x would have been a perfect match for my orange themed pc too.


This looks like the perfect scenario for a couple of Arctic Cooling Accelero Hybrids







(If you can actually find them)
http://www.arctic.ac/us_en/products/cooling/vga/accelero-hybrid.html


----------



## jerrolds

Quote:


> Originally Posted by *Lanvin*
> 
> What is the general consensus on cooling crossfire 290s without a loop?
> 
> Kraken g10 + x40
> Or
> Accelero xtreme 3 / icy vision.
> 
> I have enough slots between the two cards
> 
> 
> 
> Tri-x would have been a perfect match for my orange themed pc too.


I would not without a loop. I believe both the G10 and the Xtreme 3 are dual slot, so you probably won't have room to fit both. I had a Gelid on top and stock on the bottom and they still ran crazy hot - the heat coming off the bottom card is no joke. I think at stock it didn't throttle, or at least only throttled a little. I sold my 2nd 290X after about a week.

The noise from two 290Xs trying to stay under 95C won't be pleasant.


----------



## Heinz68

Quote:


> Originally Posted by *Jack Mac*
> 
> I upgraded from a GTX 670, it was definitely worth it. I hope 290 prices drop so I can CF already.


Try at ShopBLT


----------



## Paulenski

Here are some temps and avg fps for BF4 with a single TRI-X OC 290X

3770k @ 4.3Ghz

Cat. 13.12
1185 core/1650 mem @ 125mv core offset/50mv aux offset

ULTRA Setting, 4xAA, 1080p

Dawnbreaker, Conquest, 64 players
71C Core
74C VRM1
55C VRM2
5 Minute bench - VSYNC ON - AVG FPS: 58.247 - Min: 45 Max: 63
VSYNC OFF - AVG FPS: 70.097 - Min: 45 Max: 144

----

Used slowly lower clocks for these, 1152/1500 @ 25mv core offset

Lancang Dam, Conquest Large, 64 players
65C Core
63C VRM1
51C VRM2
5 Minute Bench - VSYNC ON - AVG FPS: 57.567 - Min: 45 Max: 63

Operation Locker, Deathmatch, 32 players
65C Core
61C VRM1
51C VRM2
5 Minute Bench - VSYNC ON - AVG FPS: 59.87 - Min: 53 Max: 63

Hope someone finds the information useful.
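For anyone comparing against these numbers: the avg/min/max FPS a benchmark overlay reports comes from the frame-time log, and the "average" is total frames over total time, not the mean of per-frame FPS values. A sketch with invented frame times (not Paulenski's data):

```python
# Min/avg/max FPS from a frame-time log (FRAPS/Afterburner style:
# one frame time in milliseconds per entry). Sample values are made up.
frame_times_ms = [16.7, 16.7, 22.2, 16.7, 14.3]

instantaneous_fps = [1000.0 / t for t in frame_times_ms]
min_fps = min(instantaneous_fps)
max_fps = max(instantaneous_fps)

# Reported average = total frames / total time, i.e. the harmonic mean
# of the instantaneous values - this weights slow frames correctly.
avg_fps = len(frame_times_ms) * 1000.0 / sum(frame_times_ms)
```

This is also why a 144 max with a 45 min (as in the VSYNC OFF run above) can still average around 70: the slow frames dominate the total time.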


----------



## psyside

Sure we do, thanks a lot, rep+









Can you run Heaven 4.0 at 1.3V and 1200/6500 in a 15 min loop, fans at 75%, and check your core/VRM temps?

Sorry if I asked you this already - I'm trying to find out if there is some big difference from card to card / batch to batch.


----------



## Marvin82

Quote:


> Originally Posted by *Paulenski*
> 
> I did some extensive testing with higher memory clocks on my tri-x 290x.
> 
> Aux voltage does in fact affect memory overclocking. I used afterburner to only increase aux since trixx has higher core voltage control.
> 
> I've ran over 15+ valley runs, at a mere aux +50mv (1.047 actual increase) I went from a max of 1500 to 1690mhz. It also seems that this almost directly affects the vrm2 temps. Only 2c increase with the 190mhz memory increase.
> 
> My maximum core clock is 1202mhz @ 125 offset with vrm1 reaching 90c in valley benchmark.
> 
> Anyone else with tri-x get any higher on core or memory? Would like to know other tri-x vrm temps
> 
> Sent from my SGH-M919 using Tapatalk


My Tri-X does 1175MHz / 1760MHz with +100mV - I don't push up the aux voltage, only the core voltage.
The max is 1270MHz / 1780MHz with 1.32V.


----------



## psyside

Quote:


> Originally Posted by *Marvin82*
> 
> My Tri X make 1175mhz / 1760mhz with +100mv the Aux Voltage i not push up only the core Voltage


How do you go beyond Afterburner's limits? AFAIK 1700 is the max for memory there?


----------



## Marvin82

Quote:


> Originally Posted by *psyside*
> 
> How do you go beyond afterburner limits? afaik 1700 is max for memory there?


Sapphire Software TRIXX
OpenEnd OC


----------



## Heinz68

Quote:


> Originally Posted by *neurotix*
> 
> Hey.
> 
> Anybody having any problems with video playback with these cards?
> 
> Playing MPEG video in Windows Media Player or Youtube/Flash in Firefox causes my entire system to hard lock, requiring a reboot.
> 
> We found out that using VLC Media Player on the same files lets them play fine. Using Chrome for Youtube also works fine.
> 
> It seems to be a driver or codec issue. I tried Catalyst 13.12, 13.11 beta 9.5 drivers. I installed them with all options initially but the problem persisted, so I tried installing 13.12 without the "AMD Media", "AMD Accelerated Video Transcoding" and other video related suboptions. This didn't make a difference. So it leads me to believe there is likely a codec issue.
> 
> Does anyone have a fix for this? Should I RMA my card?
> 
> I have a Sapphire 290 Tri-X.
> 
> After dropping $600 on this thing and seeing impressive performance in games, it baffles me that it cannot play simple MPEG videos without hard locking the system.


There is nothing wrong with the card or driver; most likely it's some conflict with a codec pack - make sure you don't use more than one.
Check your Firefox add-ons to see if the Flash player is updated, or you can check here: http://helpx.adobe.com/flash-player.html


----------



## psyside

Quote:


> Originally Posted by *Marvin82*
> 
> Sapphire Software TRIXX
> OpenEnd OC


What do you mean by open end?


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> If you play only on one monitor, you don't need CFX of course.
> Why it's hard to believe? @Sgt Bilko live in NSW, Australia. I heard it's hot there now.
> 
> My ambient temp (indoor) currently 33C. This is typical ambient temp in Malaysia. At night it can go down to 28/29C (indoor) at which time with BF3 @80% fans, GPU temp can go as high as 70C.
> 
> BF3, ambient 28/30C, 80% fans (GPU1 & GPU2)
> 
> 
> 
> 
> 
> Can't wait to get water blocks for my cards. They're en route from Slovenia, now at Germany.


Yes, it is hot - it was 40c today and it's going to stay like that for the rest of the week.

Really wish I had air-con









EDIT: accidentally double quoted


----------



## p5ych00n5

Hollah, lords of Radeon 290s - my next upgrade shall be a pair of 290 cards, whether 290 or 290X.
My query is: is it worth purchasing the X over the vanilla 290? I've been doing a bit of research on performance/price.
GPU Boss results:

TLDR: 290X beats 290 by an average of 10 - 15 %

Initial outlay is:


The vanilla 290 is $180 cheaper.

Which then leads to putting both cards in my loop:


So I can get Vanilla 290 + Block + postage cheaper than:



290X no block and no postage


Is the extra 10 - 15% performance superiority / slightly higher fps etc. worth it, or is the fact that I can OC the vanilla version to near-identical speeds and integrate it into my loop the deciding factor?

Peace Out
p5ych00n5


----------



## Paulenski

Quote:


> Originally Posted by *psyside*
> 
> Sure we do, thanks alot rep +
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can you run Heaven 4.0 with 1.3v and 1200/6500 for 15 mins loop and fans on 75% and see your core/vrm temps?
> 
> Sorry if i asked you this already, i'm trying to find out if there is some big difference from card to card/batch to batch


I couldn't do the 75% fan speed - I know the VRMs would rise higher than the threshold I like to keep them at, so I used my custom fan profile just to play it safe. These synthetic benches put the card under way more load than any AAA game I've played (BF4 and AC4).

1202 core 125mv offset / 1650 (6600) mem aux 50mv offset

Fan Profile: 80-83% for first 9~ mins, 83-88% the last 7~ mins

Temps after 16 minutes of Heaven 4.0 Extreme 1080p Fullscreen

GPU Max: 78C (77C average)
VRM1 Max: 92C (88-90C average)
VRM2 Max: 59C

http://i.imgur.com/1uUTvfq.png
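A custom fan profile like the one described is just a piecewise-linear map from GPU temperature to fan duty (the fan % climbing from 80 to 88 over the run reflects the temperature climbing, not a timer). A sketch with made-up breakpoints - the real curve lives in Afterburner/TriXX:

```python
# Piecewise-linear fan curve: (temperature °C, fan duty %) breakpoints.
# These breakpoints are illustrative, not Paulenski's actual profile.
CURVE = [(40, 30), (60, 55), (75, 80), (85, 88), (95, 100)]

def fan_percent(temp_c, curve=CURVE):
    """Fan duty for a given GPU temperature, clamped at the curve ends."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

fan_percent(78)  # interpolates between the 75C and 85C breakpoints
```

Steepening the segment around your VRM danger zone (here 75-95C) is what keeps VRM1 below the throttle point without running the fans flat out at idle.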


----------



## Sgt Bilko

Quote:


> Originally Posted by *p5ych00n5*
> 
> Hollah lords of Radeon 290's, my next upgrade shall be a pair of 290 cards, whether it be 290 or 290X.
> My query is, is it worth purchasing the X over the vanilla 290, I've been doing a bit of research on performance/price.
> 
> GPU Boss results:
> 
> TLDR: 290X beats 290 by an average of 10 - 15 %
> 
> Initial outlay is:
> 
> 
> The vanilla 290 is $180 cheaper.
> 
> Which then leads to putting both cards in my loop:
> 
> 
> So I can get Vanilla 290 + Block + postage cheaper than:
> 
> 
> 
> 290X no block and no postage
> 
> 
> Is the extra 10 - 15% performance superiority/ slightly higher fpu etc etc worth it, or the fact I can OC the vanilla version to close to identical speeds and integrated into my loop the caveat?
> 
> Peace Out
> p5ych00n5


Actually the performance difference is smaller than that (5%, IIRC), and having a 290X myself, I really can't notice a difference between a 290 and a 290X at the same speed - in gaming, that is... small diff in benching.

IMHO get the 290 + waterblock... especially with all the heat down your way


----------



## psyside

Quote:


> Originally Posted by *Paulenski*
> 
> I couldn't do the 75% fan speed, I know the VRM will rise higher than the threshold I like it to be at. I used my custom fan profile just to play it safe. These synthetic benches put the card under way more load than any AAA game I've played (BF4 and AC4).


Thanks a lot for this, rep+

Yeah, I thought that benching loop would result in higher temps. Still, 90c on the VRM is generally OK - a bit on the high side, but still good









What does your card have as stock volts?


----------



## Forceman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Actually the performance difference is smaller than that (5% iirc), and from having a 290x myself i really can't notice a difference between a 290 and 290x at the same speed, In gaming that is.....small diff in benching.
> 
> imho get the 290 + waterblock.......especially with all the heat down your way


I agree. I have a 290 flashed to a 290X and in games there is virtually no difference between them. Use the $100 price difference to cool the cards and overclock a 290.


----------



## p5ych00n5

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Actually the performance difference is smaller than that (5% iirc), and from having a 290x myself i really can't notice a difference between a 290 and 290x at the same speed, In gaming that is.....small diff in benching.
> 
> imho get the 290 + waterblock.......especially with *all the heat down your way*


Bahahahahaha, yeah I thought so, it's still 40 degrees in the HQ







plus I was still rocking Outlast until about 2am this morning and it was still 33 degrees at 3am









This is my first GPU upgrade since I retired my 460s to a mate's rig and replaced them with 6850s. This upgrade will be my second-to-last before new monitors, and I wanted to go with a top-tier card rather than a midrange card like I have in the past (GTS 250, GTX 460, HD 6850). So a second opinion is always most welcome (feel free to flame me everyone







)

BTW you still need to get those badboys underwater


----------



## Paulenski

Quote:


> Originally Posted by *psyside*
> 
> Thanks alot for this, yea thought so that benching loop will result in higher temps, still 90c on vrm is generally ok, a bit on a high side but still good
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What does your card have as stock volts?


Yeah, I thought the same thing; however, I was expecting much lower VRM temps with this cooler. Would be nice to see more temp samples to make sure there's consistency among the cards.

I tried Kombustor earlier today to check stability and, my god, it sent VRM1 right up to 100C.

The stock voltage idles at .938-.945, with load voltage averaging 1.125-1.141


----------



## Sazz

Quote:


> Originally Posted by *p5ych00n5*
> 
> Hollah lords of Radeon 290's, my next upgrade shall be a pair of 290 cards, whether it be 290 or 290X.
> My query is, is it worth purchasing the X over the vanilla 290, I've been doing a bit of research on performance/price.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> GPU Boss results:
> 
> TLDR: 290X beats 290 by an average of 10 - 15 %
> 
> Initial outlay is:
> 
> 
> The vanilla 290 is $180 cheaper.
> 
> Which then leads to putting both cards in my loop:
> 
> 
> So I can get Vanilla 290 + Block + postage cheaper than:
> 
> 
> 
> 290X no block and no postage
> 
> 
> 
> 
> Is the extra 10 - 15% performance superiority/ slightly higher fpu etc etc worth it, or the fact I can OC the vanilla version to close to identical speeds and integrated into my loop the caveat?
> 
> Peace Out
> p5ych00n5


If you ever get a waterblock, and an acetal+copper block w/ backplate from EK is good for you, I've got one for sale =]


----------



## Marvin82

Quote:


> Originally Posted by *psyside*
> 
> What do you mean by open end?


I mean that no card can handle the full range you can adjust for mem and core


----------



## Sgt Bilko

Quote:


> Originally Posted by *p5ych00n5*
> 
> Bahahahahaha, yeah I thought so, it's still 40 degrees in the HQ
> 
> 
> 
> 
> 
> 
> 
> plus I was still rocking Outlast to about 2am this morning and it was still 33 degrees at 3am
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This is my first GPU upgrade since I retired my 460's to a mates rig and replaced them with 6850's. This upgrade will be my second last before new Monitors and I wanted to go a top tier card rather than a midrange card like I have been doing in the past. So a second opinion is always most welcome (feel free to flame me everyone
> 
> 
> 
> 
> 
> 
> 
> )
> 
> BTW you still need to get those badboys underwater


Water has been put on hold, maybe indefinitely now.

Promised my wife i wouldn't spend anymore on the PC for now.









I'm still running fine on air; the DD coolers are doing a great job, considering all the hate against XFX I've seen.

EDIT....I passed 1k posts!!


----------



## psyside

Quote:


> Originally Posted by *Paulenski*
> 
> Yeah I thought the same thing, however, I was expecting to have much lower VRM's with this cooler. Would be nice to see more temp samples to make sure there's consistency among the cards.
> 
> I tried Kombustor earlier today to see about stability and my god, it sent that VRM1 right into 100C.
> 
> The stock voltage idles at .945-.938, with load voltage averaging 1.125-1.141


Yea, if you got like 1.4v, which is very strange and low, maybe you can go with 1.25 instead of 1.3v - the point is to find the best ratio.

And yes, even with a loop, your card's VRMs run a bit hotter than others'; don't know why that is...

One user on OCN hit only 91c at 1.4V, which is like 100mV+, after a 30 min Heaven loop









But he has HAF X...


----------



## p5ych00n5

Quote:


> Originally Posted by *Sazz*
> 
> If you ever get a waterblock and a acetal+copper block w/ backplate from EK is good for you, I got one for sale =]


I still can't trade, buy or sell yet. I really appreciate the thought though.

Oooooh, I forgot the backplates.

And my new list:


----------



## Sazz

Quote:


> Originally Posted by *p5ych00n5*
> 
> I still can't trade, buy or sell yet. I really appreciate the thought though


Oh, no worries, I have it up on eBay as well. Whenever you wanna get one, just check it out on eBay if it's still up xD


----------



## p5ych00n5

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Water has been put on hold, maybe indefinitely now.
> 
> Promised my wife i wouldn't spend anymore on the PC for now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm still running fine on air, the DD coolers are doing a great job considering all the hate against XFX i've seen.
> 
> EDIT....I passed 1k posts!!


I tried to find a pic of the Simpsons "Oh, the wife *wisha wisha*" moment. Eff you, Google.

1K posts =


----------



## devilhead

Valley really doesn't like the 290x....


----------



## rdr09

Quote:


> Originally Posted by *devilhead*
> 
> Valley really doesn't like the 290x....


nice.

This is reasonably priced, isn't it?

http://www.amazon.com/MSI-R9-290-GAMING-4G/dp/B00HPS4AFG/ref=sr_1_3?ie=UTF8&qid=1389793318&sr=8-3&keywords=r9+290


----------



## PureBlue

I made a thread on this but thought I'd post it here as well.

Just wondering if there is anyone out there who has the 290X version of the Tri-X and is able to upload the BIOS?
I want to try flashing my non-X card to see if it unlocks; please PM me or post the BIOS here if you can.

Thanks in advance

*EDIT*

Apparently Trixx crashes when trying to save the BIOS on the Tri-X cards, so please try GPU-Z in case Trixx crashes on you.


----------



## ReHWolution

Quote:


> Originally Posted by *PureBlue*
> 
> I made a thread on this but thought I'd post it here as well.
> 
> Just wondering if there is anyone out there who has the 290X version of the Tri-X and is able to upload the BIOS?
> I want to try flashing my non-X card to see if it unlocks; please PM me or post the BIOS here if you can.
> 
> Thanks in advance
> 
> *EDIT*
> 
> Apparently Trixx crashes when trying to save the BIOS on the Tri-X cards, so please try GPU-Z in case Trixx crashes on you.


I was expecting that BIOS too. Can anybody upload their BIOS from a Tri-X? You can save it through GPU-Z.


----------



## Marvin82

Quote:


> Originally Posted by *PureBlue*
> 
> I made a thread on this but thought I'd post it here as well.
> 
> Just wondering if there is anyone out there who has the 290X version of the Tri-X and is able to upload the BIOS?
> I want to try flashing my non-X card to see if it unlocks; please PM me or post the BIOS here if you can.
> 
> Thanks in advance
> 
> *EDIT*
> 
> Apparently Trixx crashes when trying to save the BIOS on the Tri-X cards, so please try GPU-Z in case Trixx crashes on you.


Please read post #821.
That's me and my Tri-X OC BIOS:
http://www.hardwareluxx.de/community/f305/amd-radeon-r9-290x-hawaii-xt-sammelthread-faq-bei-fragen-startpost-lesen-985505-33.html#post21702358


----------



## PureBlue

Quote:


> Originally Posted by *Marvin82*
> 
> Please read post #821.
> That's me and my Tri-X OC BIOS:
> http://www.hardwareluxx.de/community/f305/amd-radeon-r9-290x-hawaii-xt-sammelthread-faq-bei-fragen-startpost-lesen-985505-33.html#post21702358


Thank you very much dude


----------



## ReHWolution

Quote:


> Originally Posted by *Marvin82*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PureBlue*
> 
> I made a thread on this but thought I'd post it here as well.
> 
> Just wondering if there is anyone out there who has the 290X version of the Tri-X and is able to upload the BIOS?
> I want to try flashing my non-X card to see if it unlocks; please PM me or post the BIOS here if you can.
> 
> Thanks in advance
> 
> *EDIT*
> 
> Apparently Trixx crashes when trying to save the BIOS on the Tri-X cards, so please try GPU-Z in case Trixx crashes on you.
> 
> 
> 
> Please read post #821.
> That's me and my Tri-X OC BIOS:
> http://www.hardwareluxx.de/community/f305/amd-radeon-r9-290x-hawaii-xt-sammelthread-faq-bei-fragen-startpost-lesen-985505-33.html#post21702358
Click to expand...

It's the UEFI one?


----------



## ds84

Quote:


> Originally Posted by *ds84*
> 
> Which software u used to see memory info?
> 
> And what mobo u using? My asrock z87 extreme 4 cant see vrm temps. Or any software can see vrm temps?


Can anyone help me with my 2nd problem?


----------



## Marvin82

Quote:


> Originally Posted by *ReHWolution*
> 
> It's the UEFI one?


I don't know, I think so.


----------



## ReHWolution

Quote:


> Originally Posted by *ds84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ds84*
> 
> Which software u used to see memory info?
> 
> And what mobo u using? My asrock z87 extreme 4 cant see vrm temps. Or any software can see vrm temps?
> 
> 
> 
> Can anyone help me with my 2nd problem?
Click to expand...

You're asking for a software to read VRM temps for your mainboard *in a 290/290X thread*...


----------



## BradleyW

Quote:


> Originally Posted by *devilhead*
> 
> Valley really doesn't like the 290x....


Is that with a single 290X?


----------



## ReHWolution

Quote:


> Originally Posted by *Marvin82*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ReHWolution*
> 
> It's the UEFI one?
> 
> 
> 
> I don't know, I think so.
Click to expand...

I need to know; that's the only thing I was looking for. I can overclock my card by myself xD


----------



## ds84

Quote:


> Originally Posted by *ReHWolution*
> 
> You're asking for a software to read VRM temps for your mainboard *in a 290/290X thread*...


Coz I see other people's GPU-Z able to show VRM 1 and VRM 2 temps whereas mine doesn't. So I thought people here might have an idea... Can't really find a thread about my mobo in the mobo sub-forum.

So I thought it might be a mobo limitation, and I was hoping to see if anyone is in the same boat as me.


----------



## Krusher33

Just got my Sapphire 290X last night. I'm stoked.

Had problems overclocking it last night though. Kept flickering if I opened Afterburner. I was thinking the program was messed up so I uninstalled it. Installed Trixx but didn't try it because I was too tired.

Now checking the OP's post and I see something about setting powertune in CCC if messing with AB. Ugh... why didn't I see that last night?

EDIT: Now I'm confused. I did not have Overdrive enabled but I was having the flickering problems whenever I opened AB.


----------



## ReHWolution

Quote:


> Originally Posted by *ds84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ReHWolution*
> 
> You're asking for a software to read VRM temps for your mainboard *in a 290/290X thread*...
> 
> 
> 
> Coz I see other people's GPU-Z able to show VRM 1 and VRM 2 temps whereas mine doesn't. So I thought people here might have an idea... Can't really find a thread about my mobo in the mobo sub-forum.
> 
> So I thought it might be a mobo limitation, and I was hoping to see if anyone is in the same boat as me.
Click to expand...

Start GPU-Z in admin mode; I usually solve this problem by closing GPU-Z and then starting it again :\


----------



## devilhead

Yes, it is a single card; clocks are 1297/1728.


----------



## BradleyW

Here are my 290x's under water










Quote:


> Originally Posted by *devilhead*
> 
> Yes, it is a single card; clocks are 1297/1728.


Impressive overclock!


----------



## Gumbi

Quote:


> Originally Posted by *devilhead*
> 
> Yes, it is a single card; clocks are 1297/1728.


Very nice. What voltages?


----------



## Arizonian

Quote:


> Originally Posted by *Krusher33*
> 
> Just got my Sapphire 290X last night. I'm stoked.
> 
> Had problems overclocking it last night though. Kept flickering if I opened Afterburner. I was thinking the program was messed up so I uninstalled it. Installed Trixx but didn't try it because I was too tired.
> 
> Now checking the OP's post and I see something about setting powertune in CCC if messing with AB. Ugh... why didn't I see that last night?
> 
> EDIT: Now I'm confused. I did not have Overdrive enabled but I was having the flickering problems whenever I opened AB.


Quote:


> Originally Posted by *BradleyW*
> 
> Here are my 290x's under water
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Impressive overclock!


Love to have both of you on the roster.









To be added on the member list please submit the following in your post

1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
2. Manufacturer & Brand - Please don't make me guess.
3. Cooling - Stock, Aftermarket or 3rd Party Water

Well I'm off today for 11 hr work day. See you guys tonight.


----------



## armartins

Quote:


> Originally Posted by *Lanvin*
> 
> What is the general consensus on cooling crossfire 290s without a loop?
> 
> Kraken g10 + x40
> Or
> Accelero xtreme 3 / icy vision.
> 
> I have enough slots between the two cards
> 
> 
> 
> Tri-x would have been a perfect match for my orange themed pc too.


Sell one, get a Tri-X... sell the other, get a second one... dirt simple.


----------



## Marvin82

Quote:


> Originally Posted by *ReHWolution*
> 
> I need to know; that's the only thing I was looking for. I can overclock my card by myself xD


I think I read it on the box.
I'm at work, writing on my phone.
Look on the homepage; it says it has UEFI:
http://www.sapphiretech.com/presentation/product/product_index.aspx?cid=1&gid=3&sgid=1227&pid=2090&lid=1

I can write it when I'm at home in 3 hours.


----------



## dspacek

Quote:


> Originally Posted by *devilhead*
> 
> Yes, it is a single card; clocks are 1297/1728.


What BIOS? Using Trixx? I could only set the memory that high; the core still maxes out around 1230, then artifacts.


----------



## Asrock Extreme7

How do you change the mem voltage, and what program do you use?


----------



## Csokis

ASUS R9 290 DirectCU II review!

Too hot!


----------



## VSG

This week on Newegg,



Gee, thanks a lot


----------



## Jack Mac

Quote:


> Originally Posted by *geggeg*
> 
> This week on Newegg,
> 
> 
> 
> Gee, thanks a lot


LOL, in other words: "Buy AMD cards here at a $100+ mark-up so we can make lots of money off you idiots!"


----------



## VSG

That, plus GPUs not really being profitable for Bitcoin mining anymore, means this ad is really aimed at fooling customers big time.


----------



## devilhead

Quote:


> Originally Posted by *dspacek*
> 
> What BIOS? Using Trixx? I could only set the memory that high; the core still maxes out around 1230, then artifacts.


For this run I used the PT1 BIOS, 1.41v after vdroop,

so in GPU Tweak, 1450mV.
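For anyone following the voltage numbers here: the gap between what you dial into the OC tool and what the card actually runs under load (vdroop) is simple arithmetic. A rough sketch only; the droop figure is an assumption read off the numbers above (1450mV set in GPU Tweak, ~1.41v under load), and it varies with card and load:

```python
# Estimate what to dial into GPU Tweak / Trixx so the card lands at a
# target load voltage, given an observed vdroop (set voltage minus load
# voltage). The 0.04 V droop is an assumption from the numbers above.

def required_set_voltage(target_load_v: float, droop_v: float) -> float:
    """Voltage to set so that (set - droop) equals the target under load."""
    return round(target_load_v + droop_v, 3)

observed_droop = 1.450 - 1.41                       # ~0.04 V on this card
print(required_set_voltage(1.41, observed_droop))   # 1.45
```

Treat the droop offset as something to re-measure under your own load, not as a constant.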


----------



## MooseHead

Hey guys. My cousin sent me some pics this morning of his setup. Just wanted to share with you guys. He's in the marines so I guess that's justification for his setup??... Overkill much?!?!?









*WARNING: PIC HEAVY!*


----------



## Jack Mac

Quote:


> Originally Posted by *MooseHead*
> 
> Hey guys. My cousin sent me some pics this morning of his setup. Just wanted to share with you guys. He's in the marines so I guess that's justification for his setup??... Overkill much?!?!?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *WARNING: PIC HEAVY!*


So that's where all the R9 290s went...


----------



## VSG

Ya, I share your feelings; I kinda feel upset whenever I see these pictures of people getting tens of 290/290x cards


----------



## psyside

Can anyone tell me this:

1. Why do I get flicker whenever I apply new clocks in AB? I made sure I did everything to avoid this, as shown in the picture below.

2. Why does my core vary from 1000-1150? I set it to 1150... used +100mV and constant voltage, and the clock was all over the place in the Unigine run's AB sensor.

These are my settings...


----------



## pkrexer

Wouldn't it make more sense to buy a dedicated mining machine that does like 1.2 TH/s instead of wasting all the space and power with stacks of video cards?


----------



## sugarhell

Untick reset display mode


----------



## psyside

Quote:


> Originally Posted by *sugarhell*
> 
> Untick reset display mode


Thanks, any ideas about the huge core variation?

Edit: it's worse with it unchecked... total flickering in Chrome and in everything.


----------



## Sazz

Quote:


> Originally Posted by *devilhead*
> 
> yes, it is single, clocks is 1297/1728


w/ volt mods? And how did you get past 1625 on memory?

I don't think any 290/290X is going past 1600 w/o some hard modding to add voltage to the memory.


----------



## sugarhell

Quote:


> Originally Posted by *psyside*
> 
> Thanks, any ideas about the huge core variation?
> 
> Edit: it's worse with it unchecked... total flickering in Chrome and in everything.


Wait you have without powerplay? Do you have 2 profiles one 2d and one 3d?


----------



## devilhead

Quote:


> Originally Posted by *Sazz*
> 
> w/ volt mods? And how did you get past 1625 on memory?
> 
> I don't think any 290/290X is going past 1600 w/o some hard modding to add voltage to the memory.


1.41v after vdroop; that's Hynix memory. My cards with Elpida can only do up to 1600MHz.


----------



## psyside

Quote:


> Originally Posted by *sugarhell*
> 
> Wait you have without powerplay? Do you have 2 profiles one 2d and one 3d?


Yes, without powerplay. No i don't have 2 profiles.


----------



## bond32

Quote:


> Originally Posted by *geggeg*
> 
> Ya I share your feelings, kinda feel upset whenever I see these pictures of people getting tens of 290/290x cards


I'm right there with ya man... I'm broke anyway, but I felt so cool having a 290x. Now not so much lol...

But seriously, this poor college kid needs one more 290x, you crazy miners don't need another (bringing the total to 15+) so send me one


----------



## sugarhell

Quote:


> Originally Posted by *psyside*
> 
> Yes, without powerplay. No i don't have 2 profiles.


Without powerplay, you need to control the clocks yourself; your card can't switch into 3D clocks on its own. The default profile should always be downclocked.


----------



## psyside

Quote:


> Originally Posted by *sugarhell*
> 
> Without powerplay, you need to control the clocks yourself; your card can't switch into 3D clocks on its own. The default profile should always be downclocked.


So I should enable powerplay?


----------



## sugarhell

Quote:


> Originally Posted by *psyside*
> 
> So I should enable powerplay?


You can set a 3D profile on your own, but if you want, just enable powerplay.


----------



## psyside

Quote:


> Originally Posted by *sugarhell*
> 
> You can set a 3D profile on your own, but if you want, just enable powerplay.


I did, I tried everything; flickering galore, bah!

You got PM.


----------



## Eroticus

Add me to the club =)


----------



## Jack Mac

Quote:


> Originally Posted by *Eroticus*
> 
> Add me to club =)


Dat flash, looks like the BF3 sun in the first picture.


----------



## pilla99

Coming from my last GPU that ran at around 80C max, this makes me nervous. What fan settings are still sufficient for cooling but don't sound like a 747? Is running at 90+C while gaming for hours actually good for this card? It feels way too hot, but I mean the default Catalyst settings have the temp target at 95C...
Edit: upped my fan speed to 40%. Just can't take much more than that.


----------



## Jack Mac

Quote:


> Originally Posted by *pilla99*
> 
> Coming from my last GPU that ran at around 80C max, this makes me nervous. What fan settings are still sufficient for cooling but don't sound like a 747? Is running at 90+C while gaming for hours actually good for this card? It feels way too hot, but I mean the default Catalyst settings have the temp target at 95C...
> Edit: upped my fan speed to 40%. Just can't take much more than that.


I don't think you can do that with the stock cooler, but 45-55% is the sweet spot for me; create a custom fan curve in MSI AB.
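A custom curve like the one Afterburner lets you draw is just piecewise-linear interpolation between (temperature, fan %) points. A minimal sketch of that idea; the curve points below are illustrative, not anyone's tested settings:

```python
# Piecewise-linear fan curve, like dragging points in MSI Afterburner's
# fan curve editor. CURVE points are example values, not a recommendation.

CURVE = [(40, 20), (60, 35), (75, 50), (85, 70), (95, 100)]  # (temp C, fan %)

def fan_percent(temp_c, curve=CURVE):
    """Linearly interpolate the fan duty (%) for a GPU temperature."""
    if temp_c <= curve[0][0]:
        return float(curve[0][1])
    if temp_c >= curve[-1][0]:
        return float(curve[-1][1])
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(67.5))  # halfway between (60, 35) and (75, 50) -> 42.5
```

The flat ends below 40C and above 95C mirror how Afterburner clamps the curve at its first and last points.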


----------



## X-oiL

Just started my first WC build with dual 290's, a delidded 4770k, the rare Lian Li PC-D600 chassis, and much more. Check it out *here* if it sounds interesting.


----------



## Asrock Extreme7

How do you change the mem voltage? Help.


----------



## Gumbi

Quote:


> Originally Posted by *pilla99*
> 
> Coming from my last GPU that ran at around 80C max, this makes me nervous. What fan settings are still sufficient for cooling but don't sound like a 747? Is running at 90+C while gaming for hours actually good for this card? It feels way too hot, but I mean the default Catalyst settings have the temp target at 95C...
> Edit: upped my fan speed to 40%. Just can't take much more than that.


Replace the stock paste with some aftermarket stuff, that should buy you a few degrees.


----------



## smoke2

Quote:


> Originally Posted by *Csokis*
> 
> ASUS R9 290 DirectCU II review!
> 
> Too hot!


According to this review, the system with the card has 660W power consumption, over 800W when overclocked.

The card itself consumes 385W.

Are these values real? Why are they so high?
I was planning to buy a 700W supply for my single 290.
Is 700W then too little, with no power reserve?


----------



## psyside

Quote:


> Originally Posted by *smoke2*
> 
> According to this review, the system with the card has 660W power consumption, over 800W when overclocked.
> 
> The card itself consumes 385W.
> 
> Are these values real? Why are they so high?
> I was planning to buy a 700W supply for my single 290.
> Is 700W then too little, with no power reserve?


That is with 1.4v; generally don't use more than 1.3v on air.


----------



## kizwan

Quote:


> Originally Posted by *smoke2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Csokis*
> 
> ASUS R9 290 DirectCU II review!
> 
> Too hot!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> According to this review, the system with the card has 660W power consumption, over 800W when overclocked.
> 
> The card itself consumes 385W.
> 
> Are these values real? Why are they so high?
> I was planning to buy a 700W supply for my single 290.
> Is 700W then too little, with no power reserve?
Click to expand...

The review uses a 3930K CPU; LGA2011 CPUs consume a lot more power when overclocked than Ivy Bridge & Haswell. What are your computer specs? 700W is more than enough for a single 290, even overclocked.


----------



## smoke2

Quote:


> Originally Posted by *psyside*
> 
> That is with 1.4v; generally don't use more than 1.3v on air.


What will the approximate total system power consumption be with a 1.3v OC?
Hope 700W is enough, or do I have to buy an 850W supply to be on the safe side?


----------



## smoke2

Quote:


> Originally Posted by *kizwan*
> 
> The review uses a 3930K CPU; LGA2011 CPUs consume a lot more power when overclocked than Ivy Bridge & Haswell. What are your computer specs? 700W is more than enough for a single 290, even overclocked.


I have an i5-4670K OC'd, with 3x 140mm fans, 2x WD Black HDDs, and about 6 USB peripherals.


----------



## Forceman

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> How do you change the mem voltage? Help.


You can't.
Quote:


> Originally Posted by *smoke2*
> 
> What will the approximate total system power consumption be with a 1.3v OC?
> Hope 700W is enough, or do I have to buy an 850W supply to be on the safe side?


My system (roughly comparable) draws around 400W at the wall running BF4. 700W is plenty.
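To put rough numbers on the 700W question: add up worst-case component draws and keep some headroom. The wattages below are ballpark assumptions (an overclocked 290 near its ~385W worst case, a 4670K around 100W), not measurements:

```python
# Back-of-envelope PSU sizing for a single-290 Haswell system.
# All wattages are rough assumptions, not measured values.

parts = {
    "R9 290 (heavy OC, worst case)": 385,
    "i5-4670K (overclocked)": 100,
    "motherboard + RAM": 50,
    "2x HDD": 20,
    "fans + USB peripherals": 25,
}

total = sum(parts.values())   # worst-case draw
recommended = total * 1.2     # keep ~20% headroom

print(total)               # 580
print(round(recommended))  # 696 -> a quality 700 W unit has enough margin
```

That also matches the ~400W at-the-wall figure above for typical gaming, since games rarely push every component to its worst case at once.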


----------



## kizwan

Quote:


> Originally Posted by *smoke2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> The review uses a 3930K CPU; LGA2011 CPUs consume a lot more power when overclocked than Ivy Bridge & Haswell. What are your computer specs? 700W is more than enough for a single 290, even overclocked.
> 
> 
> 
> I have i5-4670K OC, with 3x140mm fans, 2x WD Black HDD, about 6 USB peripherals.
Click to expand...

Yes, 700W is more than enough for your system with a single 290 card. A 4670K/4770K only consumes around 1XX watts max.


----------



## Gunderman456

Here are my two r9 290s now officially under water!


----------



## Paulenski

Quote:


> Originally Posted by *Sazz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *devilhead*
> 
> Yes, it is a single card; clocks are 1297/1728.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> w/ volt mods? And how did you get past 1625 on memory?
> 
> I don't think any 290/290X is going past 1600 w/o some hard modding to add voltage to the memory.
Click to expand...

My Tri-X 290X has Aux voltage control in AB; I increased that by 50mv and it's stable at 1690 mem.

Sent from my SGH-M919 using Tapatalk


----------



## Paulenski

Quote:


> Originally Posted by *Eroticus*
> 
> Add me to club =)


Press F12 in Valley to get a screenshot; they're located under C:/Users/YourName/Valley/Screenshots

Sent from my SGH-M919 using Tapatalk


----------



## smoke2

OK, and how big is the consumption of the Tri-X 290, and of an OCed Tri-X with higher voltage?
Btw, I was planning to buy a Seasonic X750, but now I have a much better offer on the 50W-weaker Cooler Master V 700W.


----------



## Paulenski

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> How do you change the mem voltage? Help.


Use MSI Afterburner or Sapphire Trixx (it may work on non-Sapphire cards)

Sent from my SGH-M919 using Tapatalk


----------



## Forceman

Quote:


> Originally Posted by *Paulenski*
> 
> Use MSI Afterburner or Sapphire Trixx (it may work on non-Sapphire cards)
> 
> Sent from my SGH-M919 using Tapatalk


There is no memory voltage control on these cards. You can change Aux voltage with AB, but it's not the same thing (even if it does help some people with overclocks).


----------



## Paulenski

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Paulenski*
> 
> Use MSI Afterburner or Sapphire Trixx (it may work on non-Sapphire cards)
> 
> Sent from my SGH-M919 using Tapatalk
> 
> 
> 
> There is no memory voltage control on these cards. You can change Aux voltage with AB, but it's not the same thing (even if it does help some people with overclocks).
Click to expand...

It's worked for me; a 50mv increase on aux allowed me to go from a max of 1500 to 1690.

Sent from my SGH-M919 using Tapatalk


----------



## Asrock Extreme7

thanks will try


----------



## DeutyDaBeast

Just got done making some graphs for my 290 usage:

Usage.PNG 47k .PNG file


Percentages.PNG 60k .PNG file


My build:

Motherboard: ASRock Extreme6 LGA1155
CPU: i5 3570k, OC to 4.0 GHz
GPU: Gigabyte R9 290 with Windforce cooler
RAM: DDR3 1600mhz
PSU: Corsair RM650


----------



## pilla99

Does someone have an afterburner fan curve they could post up so that I could try it out?


----------



## Widde

Quote:


> Originally Posted by *pilla99*
> 
> Does someone have an afterburner fan curve they could post up so that I could try it out?


http://piclair.com/ww9xq

This could VERY well be too loud for most people ^^ I'm running 2 290s overclocked a little, but it keeps 'em at 80C in Battlefield 4.


----------



## pilla99

Quote:


> Originally Posted by *Widde*
> 
> http://piclair.com/ww9xq
> 
> This could VERY well be too loud for most people ^^ I'm running 2 290s overclocked a little, but it keeps 'em at 80C in Battlefield 4.


Tried this curve with a quick gaming test run and my dog came in from the other room to see what was happening. Too loud. ;/


----------



## Widde

Quote:


> Originally Posted by *pilla99*
> 
> Tried this curve with a quick gaming test run and my dog came in from the other room to see what was happening. Too loud. ;/


Yeah, figured.

Gaming with headphones on and no one else around ^^ I turn it to auto when I'm sleeping though; when I'm not playing anything it's not too bad.


----------



## shilka

Quote:


> Originally Posted by *DeutyDaBeast*
> 
> Just got done making some graphs for my 290 usage:
> 
> Usage.PNG 47k .PNG file
> 
> 
> Percentages.PNG 60k .PNG file
> 
> 
> My build:
> 
> Motherboard: ASRock Extreme6 LGA1155
> CPU: i5 3570k, OC to 4.0 GHz
> GPU: Gigabyte R9 290 with Windforce cooler
> RAM: DDR3 1600mhz
> PSU: Corsair RM650


Stay far away from the Corsair RM series

http://www.overclock.net/t/1455892/why-you-should-not-buy-a-corsair-rm-psu

If you already have it then send it back


----------



## neurotix

For any other users having system hard locks while playing YouTube in Firefox or video files in Windows Media Player:

Go to this site: http://www.divx.com/en/software/video-codec

And install DivX.

This fixed YouTube for me in Firefox, as well as files in Windows Media Player.

Credit for this goes to user bridgypoo, my girlfriend. After over a day of trying to find a solution, she found this.


----------



## DeutyDaBeast

Thanks for the heads up. Looks like a PSU is next on the to-do list. This PSU seemed so solid too


----------



## aaroc

My setup for the last 2 months: 2x XFX R9 290 in CFX with stock cooling, no OC:


Today, a new MSI R9 290 for Tri-CFX. The three boxes together:


The new MSI R9 290 on my desk, waiting for action!:


The new MSI R9 290 in the middle of the Tri-CFX sandwich with stock cooling, no OC:


The GPU-Z validation here

The CPU-Z validation here


----------



## psyside

Quote:


> Originally Posted by *smoke2*
> 
> What will the approximate total system power consumption be with a 1.3v OC?
> Hope 700W is enough, or do I have to buy an 850W supply to be on the safe side?


With a single 290 Tri-X you will never need more than ~600W, even with crazy volts during gameplay; synthetic benchmarks are a different story. Don't worry, you won't need more than ~100-120mV+ for maxing the OC. After that you'll need like another 50mV+ for 20MHz, which ain't worth it; just go for a 100mV offset and you are golden, maybe add like 30-50 aux for a higher memory OC.

Keep in mind, you will be buying a very high quality PSU if you go for the CM Vanguard, so even at full load it will handle the 290 with ease.

If the price is very close, go for 850W; otherwise 700W is more than enough.


----------



## kdawgmaster

Got my new silver 540

Without side panel

With side panel

I call this one prying eyes xD


----------



## rx7racer

Welp, I finally bit the bullet and got my 290X under water. Keeps her cool now, topping out at 48c so far, but that's only at 1175MHz core. Gonna try to play with Trixx soon and go above the +100mv default limit in AB.

Oh, sorry for the flash; didn't think of a pic till the very end. It ended up a lot uglier than I wanted it to, gonna have to contemplate a re-do haha.

Using the stock base plate by doing the re-use mod, obviously, since I did it trying my Zalman HSF. And it doesn't do too bad: with a fan over the VRM1 section I hit mid 70's on VRM1 and low 60's on VRM2. Hopefully the RAM is OK; no issues so far, so I guess it is.









Feels good to finally feel like I can play with this thing.









GPUZ


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *rx7racer*
> 
> Welp, I finally bit the bullet and got my 290X under water. Keeps her cool now topping out at 48c so far but that's only at 1175MHz core. Gonna try to play with Trixx soon and go above the +100mv default limit in AB.
> 
> 
> 
> 
> 
> Oh sorry for the flash, didn't think of a pic till very end. It ended up a lot uglier than I wanted it to, gonna have to contemplate a re-do haha.
> 
> Using stock base plate by doing the re-use mod obviously since I did it trying my Zalman hsf. And it doesn't do too bad, with a fan over the vrm1 section I hit mid 70's on vrm1 and low 60's on vrm2. Hopefully ram is ok, no issue so far so I guess they are.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Feels good to finally feel like I can play with this thing.


Glad you got water going through the card.

Don't take this the wrong way, I still don't understand why people buy $500 items and won't spend $100 on a full cover block.

It always hurts me inside.

Anyhow, I got this 290x at a bad price, and with the voided warranty I'm scared to put it under water. What really sucks is seeing this card's idle temperature sit 20C higher than my old watercooled card's load temperature in long benchmarks or games.


----------



## ZealotKi11er

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> Glad you got water going through the card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Don't take this the wrong way, I still don't understand why people buy $500 items and won't spend $100 on a full cover block
> 
> 
> 
> 
> 
> 
> 
> It always hurts me inside
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyhow, I got this 290x I'm using at a bad price and with those voided warrantys I'm scared to put it under water. What really sucks is seeing a card idle temperature being 20C more than my old watercooled card under load for long benchmarks or games


Yeah, and better aesthetics. With the age of current cards you know the 290X will last you at least 1 year. My HD 7970 got a water block on day one, and it lasted 2 years as AMD's top card. 2 years for a water block is a very good investment.


----------



## rx7racer

I feel ya, Hulk, but it's simple really. 1: half the price of a full cover block. 2: the GPU was actually a Christmas gift from the wife. 3: after doing so many full cover blocks and losing on them every time, I think more about my next card than the current card in my rig. I've switched back to air cooling the GPU for my past 2 GPU setups, since my 480 full cover and yada yada way back when.

I almost went full cover, but in the end it was gonna be $80 more adding it to my loop than going with a universal block. I did it on the cheap and it solved my noise and temp issues, so me happy.









And warranty, hah, if it works for at least 2 weeks it's good; after that I know if it dies it's my fault.

Don't need no stinkin' warranty


----------



## the9quad

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> Glad you got water going through the card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Don't take this the wrong way, I still don't understand why people buy $500 items and won't spend $100 on a full cover block
> 
> 
> 
> 
> 
> 
> 
> It always hurts me inside
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyhow, I got this 290x I'm using at a bad price and with those voided warrantys I'm scared to put it under water. What really sucks is seeing a card idle temperature being 20C more than my old watercooled card under load for long benchmarks or games


Multiply that x3 and throw in rads, pump, reservoir, tubing, and a new case so it will all fit, and it adds up to a ton of $.


----------



## Sazz

Hey guys, a completely unrelated question: on 3-pin fan connectors, which one is the RPM lead? The left, right, or middle? The H100 that I got for free is not getting the RPM sensor right (my mobo thinks there's no fan connected). I believe the RPM lead is in the wrong spot; he did mention the connector broke off before and he just put it back.

Sorry about this, I just figured I'd ask here coz this thread is very active. xD


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *the9quad*
> 
> Multiply that x3 and throw in rads, pump, reservoir, tubing, and a new case so it will all fit, and it adds up to a ton of $.


Did the universal GPU block's tubing connect to a magical pump/res/rad?

When the money is already spent on the loop, a universal or full cover GPU block doesn't add to what was already spent.
Quote:


> Originally Posted by *rx7racer*
> 
> I feel ya Hulk, but it's simple really. Half the price of full cover block for 1, 2 the gpu was actually a christmas gift from the wife. 3 would be after doing so many full cover blocks and losing on them everytime I think more about my next card than current card in my rig. Switched back to air cooling gpu for past to 2 gpu setups since my 480 full cover and yada yada way back when.
> 
> I almost went full cover, but in the end it was gonna be $80 adding it to my loop than going with universal block. I did it on the cheap and it solved my noise and temp issue so me happy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And warranty, hah, if it works for at least 2 weeks it's good, after that I know if it dies it's my fault
> 
> 
> 
> 
> 
> 
> 
> Don't need no stinkin warranty


Hahaha, yeah I hear ya. I'm a bit iffy with throwing blocks straight on the card; it took me a few weeks of mildly using my old card before I took the step.

Time to try out this Battlefield 4 and see how this card does, then time to see how this Haswell OCs. Wish me luck!


----------



## rx7racer

Quote:


> Originally Posted by *Sazz*
> 
> Hey guys, in a completely un-related question, on the 3 pin fan connectors, which one is the RPM lead? the left? right? or middle? the H100 that I got for free is not getting the RPM sensor right (my mobo thinks theres no fan connected) I believe the RPM lead is in the wrong spot, he did mention the connector broke off before and he just put it back.
> 
> Sorry about this, I just figured to ask here coz this thread is very much active. xD


here ya go, pretty simple actually.
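For anyone finding this later without the diagram, the standard 3-pin fan pinout can be summarized as below. This is a minimal sketch following the common Molex KK convention; the wire colors are typical but not guaranteed, so go by pin position (pin 1 sits on the side with the keying/locking tab on the header), not color.

```python
# Standard 3-pin fan connector pinout (Molex KK style). Pin 1 is on
# the side with the keying/locking tab; wire colors vary by maker,
# so trust pin position over color.
FAN_PINOUT = {
    1: "GND (usually black)",
    2: "+12V (usually red or yellow)",
    3: "Tach / RPM sense (usually yellow or green)",
}

def rpm_lead_position() -> int:
    """Return the pin number carrying the RPM (tach) signal."""
    return next(pin for pin, role in FAN_PINOUT.items() if "RPM" in role)

print(rpm_lead_position())  # the edge pin on the opposite side from GND
```

So if a tach wire was re-inserted into the GND or +12V position after breaking off, the board will read no RPM at all, which matches the symptom described.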


----------



## INCREDIBLEHULK

So this card has me baffled; I have a lot of reading that needs to be done.
Just ran FurMark: almost hit 80C, but the VRMs were 58 and 62C, and the blower on the card sounds like a leaf blower









Confused why the card kept going from 900-915MHz and didn't run its advertised stock 1000MHz clock.

On the other hand, it's showing me 9xx-92x core / 1250MHz memory with ~1.000V on the GPU. Is this card really doing these numbers on that low a voltage?
My mind has been blown; it's throttling from 0.977V to 1.016V on VDDC


----------



## rx7racer

The only time I have had issues like that, unless the GPU was not under 100% load, was when AB freaked out a bit and it took a restart to straighten up.

I've noticed if GPU usage drops even slightly, it will auto-drop clocks, and VDDC goes right along with them. I haven't used FurMark, as it's typically an unrealistic load.

Since I haven't used it, I can't really help you much there on your 290X's behavior.

Is it doing this under all loads?


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *rx7racer*
> 
> Only time I have had issues like that unless gpu was not under 100% load was when AB freaked out a bit and it took a restart to straighten up.
> 
> I've noticed if the gpu usage drops even the slightest it will decide to auto drop clocks which vddc goes right along with. Haven't used furmark as it's unrealistic usage typically.
> 
> Since i haven't used it can't really help you much there on your 290X's behavior.
> 
> Is it doing this under all loads?


I was trying to get MSI Kombustor working, but since I just reformatted I didn't try further. (First time I personally used FurMark.)

Did however get BF4 installed to test the card out; hitting 1000/1250, and max VDDC has been 1.141V


----------



## Forceman

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> Confused why the card kept going from 900-915mhz and didn't run its stock 1000mhz advertised clock


Probably power throttling. Did you check and make sure the power limit was increased? Sometimes AB doesn't set it right and you have to set it in CCC.


----------



## Arizonian

Quote:


> Originally Posted by *Eroticus*
> 
> Add me to club =)
> 
> 
> Spoiler: Warning: Spoiler!


I need you to go back to your original post and 'edit' your proof and add #1. In the meantime - congrats - added









Per OP request:
*To be added on the member list please submit the following in your post

1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
2. Manufacturer & Brand - Please don't make me guess.
3. Cooling - Stock, Aftermarket or 3rd Party Water*

Quote:


> Originally Posted by *Gunderman456*
> 
> Here are my two r9 290s now officially under water!
> 
> 
> Spoiler: Warning: Spoiler!


Nice rig you got there bud.








Quote:


> Originally Posted by *aaroc*
> 
> My setup for the last 2 months a 2x XFX R9 290 CFX with stock cooling, no OC:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Today a new MSI R9 290 for TriFX, The three boxes together:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> The new MSI R9 290 on my desktop waiting for action!:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> The new MSI R9 290 in the middle of the TriFX Sandwich with stock cooling, no OC:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> The GPU-Z validation here
> 
> The CPU-Z validation here


Congrats - added







Thanks for your proper submission.









Quote:


> Originally Posted by *rx7racer*
> 
> Welp, I finally bit the bullet and got my 290X under water. Keeps her cool now topping out at 48c so far but that's only at 1175MHz core. Gonna try to play with Trixx soon and go above the +100mv default limit in AB.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh sorry for the flash, didn't think of a pic till very end. It ended up a lot uglier than I wanted it to, gonna have to contemplate a re-do haha.
> 
> Using stock base plate by doing the re-use mod obviously since I did it trying my Zalman hsf. And it doesn't do too bad, with a fan over the vrm1 section I hit mid 70's on vrm1 and low 60's on vrm2. Hopefully ram is ok, no issue so far so I guess they are.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Feels good to finally feel like I can play with this thing.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GPUZ


Congrats - added


----------



## kizwan

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> So this card has me baffled, I have a lot of reading that needs to be done.
> Just ran furmark, almost hit 80C but vrms were 58 and 62C, the blower on the card sounds like a leafblower
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Confused why the card kept going from 900-915mhz and didn't run its stock 1000mhz advertised clock
> 
> On the other hand, it's showing me 9xx-92x core / 1250mhz memory with V1.000 ish on GPU, is this card really doing these numbers on that low of a voltage?
> My mind has been blown, it's throttling from 0.977V to 1.016V on VDDC


Is this while playing games? My cards were acting the same way too when I first set them up. However, after reinstalling the drivers, the GPU core & memory clocks run at full speed while playing BF3 & BF4; MSI AB, GPU-Z & Open Hardware Monitor all show them running at full speed. When running benchmarks, e.g. Unigine or 3DMark, I think it's normal for the clocks to fluctuate.


----------



## rx7racer

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> I was trying to get msi kombuster working but since I just reformatted I didn't try further. (first time i personally used furmark)
> 
> did however get the bf4 installed to test card out, hitting 1000/1250 and max vddc has been 1.141


Sounds about right from what I've seen on mine for BF4 at stock or near stock.

Haven't used Kombustor either; a few hours of gaming sessions with WoT or BF4, plus a dozen or so runs of the Valley bench, haven't shown any instability so far. Which I might add has been like zilch: I think once I had a hard freeze, and the rest of the time it was easily spotted artifacts.

Edit: Oh, and thanks Arizonian


----------



## Joining

Can any of you guys help me? I have two 290s in Crossfire and I seem to be getting random rainbow vertical-line freezes with audio loops (similar to a black screen). Both my cards have Elpida RAM and I can't figure out for the life of me if it's driver related or hardware related. I'm running the latest beta and my PSU is an AX860. Thanks a lot


----------



## kizwan

Quote:


> Originally Posted by *Joining*
> 
> Can any of you guys help me? I have two 290s in crossfire and i seem to be getting random rainbow vertical line freezes with audio loops (similar to that of a black screen). Both my cards have elpida ram and i can't figure out for the life of me if it's driver related or hardware related. I'm running the latest beta and my PSU is an ax860. Thanks a lot


Try undervolting and underclocking your card to rule out a power problem. If the problem still persists, it's most likely a hardware problem. I have a similar setup (are your specs in your sig?). My 3820 is running @ 4.75GHz and the R9 290's at stock clocks.


----------



## Joining

Quote:


> Originally Posted by *kizwan*
> 
> Try undervolt and underclock your card to rule out power problem. If the problem still persist, then most likely hardware problem. I have similar setup (specs in your sig?). My 3820 running @4.75GHz & R9 290's at stock clock.


I'll try that. Can rainbow vertical lines be caused by power problems?


----------



## kizwan

Quote:


> Originally Posted by *Joining*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Try undervolt and underclock your card to rule out power problem. If the problem still persist, then most likely hardware problem. I have similar setup (specs in your sig?). My 3820 running @4.75GHz & R9 290's at stock clock.
> 
> 
> 
> I'll try that. Can rainbow vertical lines be caused by power problems?

Not 100% sure. Usually black screen.


----------



## smoke2

Quote:


> Originally Posted by *psyside*
> 
> With single 290 tri X you will never need more then ~ 600W, even with crazy volts during gameplay. Synthetic benchmarks are different story. Don't worry, you won't need more then ~100-120mV + for maxing the oc, after that you will need like 50mV + for 20mhz ain't worth it, just go for 100mV offset and you are golden, maybe add like 30- 50 aux for higher memory oc
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Keep in mind, you will buy very high quality PSU if you go for CM Vanguard, so even at full load it will handle the 290 with ease.
> 
> If the price is very close go for 850W otherwise 700W more then enough.


The price here is 114€ for the 700W Vanguard and 152€ for the 850W.
I was wondering about 850W for the future, when GPUs may be more powerful and more power hungry, but who knows...?








Which one will you buy for these prices?
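Purely on price per watt, the two options quoted above work out as follows; a quick sketch using the prices from the post, nothing more:

```python
# Cost-per-watt comparison for the two PSU models and prices quoted
# above (wattage -> price in EUR).
prices_eur = {700: 114.0, 850: 152.0}

for watts, price in prices_eur.items():
    print(f"{watts}W: {price / watts:.4f} EUR per watt")

# The 700W unit is cheaper per watt (~0.163 vs ~0.179 EUR/W); the
# extra cost of the 850W buys headroom, not better value per watt.
```

So the 850W premium only makes sense if the future-proofing headroom itself is worth roughly 38€ to you.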


----------



## kizwan

Quote:


> Originally Posted by *smoke2*
> 
> The price for 700W Vanguard is here 114€ and for 152€ for 850W.
> I was wondering about 850W into the future, when maybe GPU's will be more powerful and more power hungry, but who knows...?
> 
> 
> 
> 
> 
> 
> 
> 
> Which one will you buy for these prices?


I would go highest capacity that I can afford. I like to overkill as long as it's within my budget.


----------



## Sazz

Never mind, found my answer. Anyway, any suggestions on which 80/92mm PWM fans I should use on my upcoming mod of the 290X? Gonna use them to cool the VRM.


----------



## Sgt Bilko

Quote:


> Originally Posted by *smoke2*
> 
> The price for 700W Vanguard is here 114€ and for 152€ for 850W.
> I was wondering about 850W into the future, when maybe GPU's will be more powerful and more power hungry, but who knows...?
> 
> 
> 
> 
> 
> 
> 
> 
> Which one will you buy for these prices?


Crossfire XFX 290's; this is 1100/1300 in FSE with an 8350 @ 5.0GHz, just to give you an idea


----------



## smoke2

Thanks!
Is an 8350 @ 5.0GHz as power hungry as an OCed i5-4670K?
Is that the peak value on the wattmeter?
Is FSE Furmark?
What is your power consumption with a single 290 OCed to 1100/1300?
Please try to test it in whichever benchmark draws the most power...


----------



## ZealotKi11er

Quote:


> Originally Posted by *smoke2*
> 
> Thanks!
> Is 8350 @ 5.0GHz same power hungry as i5-4670K OCed?
> It's peak value on wattmeter?
> FSE is Furmark?
> What is your power consumption with single 290 OCed on 1100/1300 ?
> Please, try to test it in some benchmark where is power consumption the biggest...


The FX CPU draws maybe 2-3x more power.


----------



## Sgt Bilko

Quote:


> Originally Posted by *smoke2*
> 
> Thanks!
> Is 8350 @ 5.0GHz same power hungry as i5-4670K OCed?
> It's peak value on wattmeter?
> FSE is Furmark?
> What is your power consumption with single 290 OCed on 1100/1300 ?
> Please, try to test it in some benchmark where is power consumption the biggest...


That was the Firestrike Extreme Combined Test; Furmark is silly.

I'll run it again sometime later with a single card, and the 8350 will easily chew more power than an i5.


----------



## Gero2013

Hey guys happy (belated) new year!









I am back and missed some of the discussion.
Was anyone able to find out why some cards (mine included) don't drop below 60C at *idle*?

Btw, is a 550W PSU enough for a 290X OCed to 1200MHz? Rest of my system per sig (except the 670, which is no longer there).

Thanks !


----------



## Csokis

AMD Catalyst 13.35 BETA Driver With Mantle and HSA Support Scheduled For End of January


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gero2013*
> 
> Hey guys happy (belated) new year!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am back and missed some of the discussion.
> Was anyone able to find out why some cards (mine including) don't drop below 60C on *idle*?
> 
> Btw is a 550W PS enough for a 290X OC to 1200Hz ? Rest of my system per Sig (except for the 670 it's not there).
> 
> Thanks !


Depends how much voltage you need for 1200MHz. It will be very close, though. My system uses 440W with only the GPU under load.


----------



## Gero2013

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Depends how much voltage you need for 1200Mhz. It will be very close though. My System uses 440W with GPU only under load.


I think the last time I ran it at 1150-1170, GPU-Z showed 1.300V and a 300W draw. Hence my concern : O

What will actually happen when the system draws more power than my PSU can supply?
My PSU is a Seasonic G550, btw.
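As a rough sanity check against a 550W rating, the headroom math looks like this. Only the ~300W GPU figure comes from the post above; the CPU and platform numbers are ballpark assumptions for illustration, not measurements:

```python
# Rough system-draw estimate vs. a 550W PSU. The GPU figure is from
# the post above; the other components are ballpark assumptions.
psu_rating_w = 550

estimated_draw_w = {
    "R9 290X @ ~1170MHz, 1.3V": 300,  # GPU-Z reading quoted above
    "overclocked quad-core CPU": 120,  # assumption
    "board, RAM, drives, fans": 60,    # assumption
}

total = sum(estimated_draw_w.values())
headroom = psu_rating_w - total
print(f"estimated load: {total}W, headroom: {headroom}W "
      f"({100 * total / psu_rating_w:.0f}% of rating)")
```

Under these assumptions the system sits near 90% of the PSU's rating, which is workable for a quality unit but leaves little margin for transient spikes.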


----------



## smoke2

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That was the Firestrike Extreme Combined Test, Furmark is silly.
> 
> I'll run it again later sometime with a single card and the 8350 will chew more power than an i5 easily.


Is Furmark more power hungry than the Firestrike Extreme Combined test?

Then please post your results with a single card; I'll make my decision based on your measurement









Thanks


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gero2013*
> 
> I think last time I ran it at 1150-1170 GPU-Z showed 1.300V and a 300W draw. Hence my concern : O
> 
> What will actually happen when the system draws more power than my PS can supply?
> My PS is a Seasonic G550 btw.


Good PSUs will work even after you pass the limit but you don't know for how long.


----------



## Gero2013

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Good PSUs will work even after you pass the limit but you don't know for how long.


What do you mean, "don't know for how long"? ^^ Will it explode?
It's a very good PSU with all the protections, OCP, OHP, OCD, etc etc...


----------



## shilka

Quote:


> Originally Posted by *Gero2013*
> 
> what do you mean "don't know for how long" ^^ will it explode ?
> it's a very good PSU with all the protections OCP, OHP, OCD etc etc...


What he means is that some of the better PSUs can do more than they are rated for, but they won't last forever when you do it


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gero2013*
> 
> what do you mean "don't know for how long" ^^ will it explode ?
> it's a very good PSU with all the protections OCP, OHP, OCD etc etc...


I bought an AX1200 just so I don't have to deal with PSU problems. I've had a PSU smoke before; it did not kill anything else.


----------



## shilka

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I bought a AX1200 just so i dont have to deal with PSU problems. I had before a PSU smoke before. I did not kill anything else.


You could have gotten half that wattage and been fine


----------



## ZealotKi11er

Quote:


> Originally Posted by *shilka*
> 
> You could have gotten half that wattage and be fine


Yeah, but at one point I was running 3 cards.


----------



## Sazz

Quote:


> Originally Posted by *shilka*
> 
> You could have gotten half that wattage and be fine


Yeah, sure he would be, but personally I'd rather have more than what I need, so that in the future, if I end up upgrading components, adding a GPU or whatever, I won't need a new PSU just to make that upgrade happen.

And besides, if the efficiency rating is right and you are only using 50% of your PSU's capacity, you are at the sweet spot of the PSU, which means it's running as efficiently as it can.


----------



## shilka

Quote:


> Originally Posted by *Sazz*
> 
> And besides if the efficiency rating is right, if you are only using 50% of your PSU's capacity, that means you are on the sweet spot of the PSU, which means it's running as efficient as it can.


That's a myth

http://www.overclock.net/t/872013/50-load-myth


----------



## Gero2013

Quote:


> Originally Posted by *shilka*
> 
> Thats a myth
> 
> http://www.overclock.net/t/872013/50-load-myth


Haha Shilka, do you have a parser running or something? Whenever PSUs are mentioned, you are on it









I actually was going to ask you for your opinion. I started a new thread for this as not to clutter this one.

Would be much obliged if you could have a look:

http://www.overclock.net/t/1459604/550w-enough-for-i7-4770-4-6ghz-and-r9-290x-1200mhz


----------



## shilka

Quote:


> Originally Posted by *Gero2013*
> 
> haha Shilka, you have a parser running or something? Whenever PSUs are mentioned you are on it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I actually was going to ask you for your opinion. I started a new thread for this as not to clutter this one.
> 
> Would be much obliged if you could have a look:
> 
> http://www.overclock.net/t/1459604/550w-enough-for-i7-4770-4-6ghz-and-r9-290x-1200mhz


Already answered.

And I am so bored right now that I just sit on the main page of OCN and press F5 once in a while


----------



## Krusher33

Quote:


> Originally Posted by *Gero2013*
> 
> Quote:
> 
> 
> 
> Originally Posted by *shilka*
> 
> Thats a myth
> 
> http://www.overclock.net/t/872013/50-load-myth
> 
> 
> 
> haha Shilka, you have a parser running or something? Whenever PSUs are mentioned you are on it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I actually was going to ask you for your opinion. I started a new thread for this as not to clutter this one.
> 
> Would be much obliged if you could have a look:
> 
> http://www.overclock.net/t/1459604/550w-enough-for-i7-4770-4-6ghz-and-r9-290x-1200mhz

He's just a dark knight lurking in the shadows ready to help when someone's in trouble.


----------



## shilka

Quote:


> Originally Posted by *Krusher33*
> 
> He's just a dark knight lurking in the shadows ready to help when someone's in trouble.


So I am Batman now?


----------



## Krusher33

Would you rather be Bolt?


----------



## Sgt Bilko

Quote:


> Originally Posted by *smoke2*
> 
> Is Furmark more power hungry than Firestrike Extreme Combined ?
> 
> Then please post your results with single card, I've made a decision by your measurment
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks


I'm sorry, I won't be able to run any more tests until Monday or late Sunday... I don't have my volt monitor with me.

I do remember drawing a 630W spike (580W usually) with my rig plus a Giga 7970 GHz at 1000/1375 (volt locked), so you might be able to draw some conclusions based on that.

There is a link in the OP to a thread created by Sonda measuring the power draw of a single 290X IIRC; Tsm106 also recorded some power figures earlier in the thread.

EDIT: http://www.overclock.net/t/1441118/290x-psu-power-output-tests/0_40#post_21156921

600W for a 290X clocked at 1200/1600 in Furmark with a 4770K etc.


----------



## Hogesyx

Most entry-level PSUs do peak their efficiency at around 40-60% load. With higher-end stuff (80+ Gold, Platinum), however, the difference is too small to matter. Just look at the AX1200 example: the moment it hits 30% load and above, the efficiency graph remains more or less constant.
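To put numbers on that flat curve: wall draw is just DC load divided by efficiency. The sketch below uses illustrative efficiency points typical of an 80+ Platinum unit (not measured data for any specific PSU):

```python
# Wall draw = DC load / efficiency. The efficiency points below are
# illustrative of a typical 80+ Platinum curve, not measured data.
efficiency_at_load = {0.20: 0.90, 0.50: 0.92, 0.90: 0.90}

psu_rating_w = 1200  # e.g. an AX1200-class unit
for frac, eff in efficiency_at_load.items():
    dc_load = frac * psu_rating_w
    wall = dc_load / eff
    print(f"{int(frac * 100)}% load: {dc_load:.0f}W DC -> "
          f"{wall:.0f}W at the wall ({wall - dc_load:.0f}W wasted as heat)")
```

With a curve this flat, running at 90% load instead of 50% costs only a couple of percentage points of efficiency, which is why the "always size for 50% load" rule is a myth for higher-end units.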


----------



## smoke2

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm sorry i won't be able to run any more tests until Monday or late Sunday......i don't have my volt monitor with me.
> 
> I do remember drawing a 630w (580w usually) spike with my rig + a Giga 7970 Ghz 1000/1375 (volt locked) So you might be able to draw some conclusions based on that.
> 
> There is a link on the OP to a thread created by Sonda measuring the power draw on a single 290x iirc, Tsm106 also recorded some power figures as well earlier in the thread.
> 
> EDIT: http://www.overclock.net/t/1441118/290x-psu-power-output-tests/0_40#post_21156921
> 
> 600w for a 290x clocked at 1200/1600 in Furmark with a 4770k etc.


Thanks for the useful post.
No problem, I will wait until Sunday or Monday








Quote:


> Originally Posted by *Hogesyx*
> 
> Most entry level PSU do peak their efficiency at around 40-60% load. However higher end stuff(80+ Gold, Platinum) the difference are too little to make a difference. Just look at AX1200 example, the moment it hits 30% and above, the graph remains more or less constant.


That's positive, that the efficiency is still good at about 90% load.
But then, isn't it risky to use a PSU at 90% load? Doesn't that rapidly shorten its lifetime?


----------



## Hogesyx

Quote:


> Originally Posted by *smoke2*
> 
> Thanks for useful post.
> No problem. I will be waiting to Sunday or a Monday
> 
> 
> 
> 
> 
> 
> 
> 
> Thats's positive the efficiency is still good at about 90% of load.
> But then, isn't risky to use a PSU at 90% of load or it is not rapidly shorten of his lifetime?


That is one issue with PSUs: there is no proper way to judge their life span. I've had a PSU where I could sort of sense it was failing, where a stable overclock suddenly trips the PSU, requiring you to reduce the voltage, and the 12V rail goes unstable; I've also had a PSU that just suddenly went "pop". I currently have a 550W Platinum-rated PSU that I constantly stress at 90% and even beyond its 550W rating, at ~610W; so far it is still pretty much kicking after 2 years.

Most high-end PSUs have an insane warranty period anyway, 5+ years; EVGA even has a 10-year extended warranty. So there's no harm getting the most out of it unless you see your 12V rail fluctuating or spiking.


----------



## psyside

So, can anyone test Metro LL for me? Something strange is going on.

These are the settings I would like to replicate.



My clocks are as follows:

1180 core, 1480 memory, +100mV, 40mV aux, and 50% power limit. Thanks


----------



## ImJJames

Quote:


> Originally Posted by *psyside*
> 
> So anyone can test Meto LL for me, something strange is going on.
> 
> These are the settings, i would like to replicate.
> 
> 
> 
> My clocks are as follow.
> 
> 1180 core, 1480 memory, 100mV+ 40 mV aux and 50% power limit, thanks


I get 60 FPS @ 1200/1500 with those settings


----------



## tijgert

Quote:


> anyways any suggestions on which 80/92mm PWM fans should I use on my upcoming mod on the 290X, gonna use em to cool VRM.


Get a 140mm Akasa Viper and a shroud to funnel that air into 92 or 80mm... it'll blow real good and still be ultra quiet.


----------



## psyside

Quote:


> Originally Posted by *ImJJames*
> 
> I get 60 FPS @ 1200/1500 with those settings


Well, something is off; I will try to tweak it.


----------



## Sazz

Quote:


> Originally Posted by *tijgert*
> 
> Get a 140mm Akasa Viper and a shroud to funnel that air into 92 or 80mm... it's blow real good and still be ultra quiet.


I would, but I forgot to mention space is limited; I am fitting it into a BitFenix Prodigy case xD

Anyway, I just bought Arctic Cooling F9 fans for it, so in case my stock cooler fan mod doesn't work (or I mess up cutting things), I have a back-up plan. xD


----------



## Falkentyne

I posted this in the blackscreen thread, but hopefully someone with read points and a DMM can find out just what is going on with the memory voltage, and this should be posted over here, too.

I may have some information which may or most likely, may NOT be related to people having black screens at STOCK, but should explain some things about black screens when overclocking the memory, or possibly solving them by either underclocking the memory or raising the core voltage.

I know, from testing, that my card (Hynix) runs stable at 1400 MHz memory with 0.961v in idle (2d/desktop). That's +0mv (default)

I also know that 1450 MHz and +0mv seems game stable but may trigger a completely random black screen at idle, when any GPU usage causes the clocks to go to 1450 mhz.

I also know that 1500 MHz +0mv will **ALWAYS** instantly trigger a black screen on the desktop as soon as any load whatsoever causes the memory clocks to rise; something as simple as putting the mouse over a desktop icon will cause this.

It seems like the black screen is simply a GPU shutdown from an error fault (which can be anything from VRM, too high GPU temps, memory issue, etc), but an unstable GPU (from overclocking) will either VPU recover (crash), or artifact.

It also SEEMS that the black screen from a memory related fault ONLY happens during a switch from 150 MHz to boost speeds. There could be other causes, as I doubt Battlefield 4 would ever cause the memory to downclock, but maybe the current core voltage drops too low in a cpu limited area..

With this, I found out a few things:

1) 1500 MHz and +25mv (0.984v idle desktop) is stable, and I can't seem to trigger the black screen.
2) The GPU speed/overclock speed seems to be irrelevant, but I've only tested this at 1100 mhz. I say it is irrelevant because the GPU never even clocks past 400 MHz when the black screen happens.

With this, I tried something.
1) I set my monitor (VG248QE) to 144 Hz, which forces maximum RAM clocks at all times (it seems both NV and AMD cards have this or a similar quirk at these refresh rates).

2) I then set the memory directly to 1500 MHz at +0mv (default). So now my memory was 1500 MHz in windows without any downclocking.

What I noticed was bizarre.

The entire screen was flickering in horizontal lines, wildly. I could read the desktop, but the image was jumping around with flickering horizontal lines all over the place. HOWEVER IT DID NOT BLACK SCREEN! (gpu idle voltage=0.963v)

I then set the core voltage to +25mv.
The image INSTANTLY stabilized itself and became crystal clear.(0.984v GPU).

Then I loaded the final fantasy benchmark XIV in borderless mode to avoid any downclocking.
While it was running, I alt tabbed to MSI afterburner (which ran over the top of the running benchmark like an overlay) and set the core voltage back to +0.

There was zero flickering and the benchmark just kept running. I left it running for 15 minutes like this. There was no noticeable corruption or flickering, EXCEPT momentarily when the benchmark "reset" itself after all the scenes were done.

At this point I realized something.

It seems that the CURRENT CORE VOLTAGE (whatever the GPU is currently running at--NOT really what you actually set) somehow relates or influences whether the memory is stable or not. Almost as if the memory voltage, memory bus (which is supposed to be Aux voltage/VDDCI) is somehow either feeding the memory, or feeding it as some sort of skew or added voltage.

This may explain why -many- people have said they can get higher memory overclocks by raising the core voltage.

What someone needs to do is to take a multimeter (DMM), find the readpoints for the memory GDDR5 voltage (which is supposed to be spec'd at 1.5v), then see if raising or lowering the core voltage has any effect on the direct memory voltage itself, or if its somehow related to the memory controller itself (which may be different from the Aux voltage).

For you guys who are having black screens on the desktop when NOT overclocking,
Try one of these two things first:

1) If you are using a 120/144 hz monitor, try setting the highest refresh rate, or anything that can lock the memory speed into boost speeds. Or use a utility (even the old GPU-z 0.7.3) that forces 3D clocks in windows. Does this stop the black screens?

2) Raise your core voltage by +25 or +50mv instead of underclocking the memory. Then check the current gpu voltage in gpu-z. Does this stop the blackscreens?

Before we can say the cause is flaky memory, someone first needs to find the memory GDDR5 read point and see if the core voltage is making this go higher or lower than 1.5v.

(Note: raising Aux voltage even to +50mv did NOT help. My case seems to be purely a problem of insufficient core voltage in 2D to stop the memory/controller/RAM from erroring out past 1400 MHz; I suspect 1425 MHz at 0.963v should be fine too.)
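The stability boundaries reported above can be condensed into a tiny lookup. This is a minimal sketch encoding the empirical numbers from this one Hynix card only, so treat the thresholds as anecdotes, not rules for other cards:

```python
# Empirical 2D-idle stability data from the post above (a single
# Hynix R9 290X): (memory MHz, idle core voltage, observed stable?)
OBSERVATIONS = [
    (1400, 0.961, True),    # stable at default (+0mv) voltage
    (1450, 0.961, False),   # random black screens at idle
    (1500, 0.961, False),   # instant black screen on any load
    (1500, 0.984, True),    # +25mv core offset: stable
]

def predicted_stable(mem_mhz: float, idle_core_v: float) -> bool:
    """Crude summary of the observations: memory clocks above
    1400 MHz appear to need a higher *current* core voltage
    (~0.984v idle on this sample) to stay stable."""
    if mem_mhz <= 1400:
        return True
    return idle_core_v >= 0.984

# Sanity check: the rule reproduces all four observations.
for mem, volts, observed in OBSERVATIONS:
    assert predicted_stable(mem, volts) == observed
print("rule matches all four observations")
```

Which is exactly why the DMM measurement matters: if the GDDR5 rail really tracks the core voltage, this rule has a physical explanation; if it doesn't, something else (e.g. the memory controller's supply) is the coupling.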


----------



## e1m088

Hi guys!

I'm having a problem with my R9 290 reference card. It's throttling, but it's not thermal throttling, as the temperature is nowhere near the target; I verified that by manually running the fan at 62% speed.
So basically, the core clock (and memory clock) drop every now and then to around 300MHz, as if they were idle. I can't find the cause. Could it have something to do with my PSU (Thermaltake 630W)? Or is the card faulty?

Other components in my setup are:

ASUS P8Z77V_LX2
i5-3570k
8GB DDR3

GPU-Z when the clocks are normal (although you can see the very fast drops as white lines):
http://aijaa.com/NCvR9X

GPU-Z exactly when there is a drop:
http://aijaa.com/pntucK


----------



## Mr357

Quote:


> Originally Posted by *e1m088*
> 
> Hi guys!
> 
> I'm having a problem with my R9 290 reference card. It's throttling, but it's not thermal throttling as the temperature is no where near the target. I verified that by running the fan manually 62% speed.
> So basically the core clock (and memory clock) drop every now and then to around 300MHZ, like if they were idle. I can't find the cause. Could it have something to do with my PSU (thermaltake 630W)? Or is the card faulty?
> 
> Other components in my setup are:
> 
> ASUS P8Z77V_LX2
> i5-3570k
> 8GB DDR3
> 
> GPU-Z when the clocks are normal (although you can see the very fast drops as white lines):
> http://aijaa.com/NCvR9X
> 
> GPU-Z exactly when there is a drop:
> http://aijaa.com/pntucK


Enabling voltage monitoring in a recent (not the latest) version of Afterburner caused the issue you're having. Is that it?

Although 630W is enough, you might need a new PSU depending on the 12V rail configuration of the one you have.


----------



## shilka

Which Thermaltake 630 watts?

If it's the non-modular version of the Smart, then no; kill your PSU with FIRE!

The semi-modular version is made by another OEM and is decent.

The only other 630 watts is in the Germany series.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Falkentyne*
> 
> I posted this in the blackscreen thread, but hopefully someone with read points and a DMM can find out just what is going on with the memory voltage, and this should be posted over here, too.
> 
> I may have some information which may or most likely, may NOT be related to people having black screens at STOCK, but should explain some things about black screens when overclocking the memory, or possibly solving them by either underclocking the memory or raising the core voltage.
> 
> I know, from testing, that my card (Hynix) runs stable at 1400 MHz memory with 0.961v in idle (2d/desktop). That's +0mv (default)
> 
> I also know that 1450 MHz and +0mv seems game stable but may trigger a completely random black screen at idle, when any GPU usage causes the clocks to go to 1450 mhz.
> 
> I also know that 1500 MHz +0mv will **ALWAYS** instantly trigger a black screen on the desktop as soon as any load whatsoever causes the memory clocks to rise--something as simple is putting the mouse over a desktop icon will cause this.
> 
> It seems like the black screen is simply a GPU shutdown from an error fault (which can be anything from VRM, too high GPU temps, memory issue, etc), but an unstable GPU (from overclocking) will either VPU recover (crash), or artifact.
> 
> It also SEEMS that the black screen from a memory related fault ONLY happens during a switch from 150 MHz to boost speeds. There could be other causes, as I doubt Battlefield 4 would ever cause the memory to downclock, but maybe the current core voltage drops too low in a cpu limited area..
> 
> With this, I found out a few things:
> 
> 1) 1500 MHz and +25mv (0.984v idle desktop) is stable, and I can't seem to trigger the black screen.
> 2) The GPU speed/overclock speed seems to be irrelevant, but I've only tested this at 1100 mhz. I say it is irrelevant because the GPU never even clocks past 400 MHz when the black screen happens.
> 
> With this, I tried something.
> I set my monitor (VG248QE) to 144 hz, which forces maximum RAM clocks at all times (seems both NV and AMD cards have this or similar quirk at these refresh rates).
> 
> I then set the memory directly to 1500 MHz at +0mv (default). So now my memory was at 1500 MHz in Windows without any downclocking.
> 
> What I noticed was bizarre.
> 
> The entire screen was flickering in horizontal lines, wildly. I could read the desktop, but the image was jumping around with flickering horizontal lines all over the place. HOWEVER IT DID NOT BLACK SCREEN! (gpu idle voltage=0.963v)
> 
> I then set the core voltage to +25mv.
> The image INSTANTLY stabilized itself and became crystal clear.(0.984v GPU).
> 
> Then I loaded the final fantasy benchmark XIV in borderless mode to avoid any downclocking.
> While it was running, I alt tabbed to MSI afterburner (which ran over the top of the running benchmark like an overlay) and set the core voltage back to +0.
> 
> There was zero flickering and the benchmark just kept running. I left it running for 15 minutes like this. There was no noticeable corruption or flickering, EXCEPT momentarily when the benchmark "reset" itself after all the scenes were done.
> 
> At this point I realized something.
> 
> It seems that the CURRENT CORE VOLTAGE (whatever the GPU is currently running at--NOT really what you actually set) somehow relates or influences whether the memory is stable or not. Almost as if the memory voltage, memory bus (which is supposed to be Aux voltage/VDDCI) is somehow either feeding the memory, or feeding it as some sort of skew or added voltage.
> 
> This may explain why -many- people have said they can get higher memory overclocks by raising the core voltage.
> 
> What someone needs to do is to take a multimeter (DMM), find the readpoints for the memory GDDR5 voltage (which is supposed to be spec'd at 1.5v), then see if raising or lowering the core voltage has any effect on the direct memory voltage itself, or if its somehow related to the memory controller itself (which may be different from the Aux voltage).
> 
> For you guys who are having black screens on the desktop when NOT overclocking,
> Try 1 of 3 things first:
> 
> 1) If you are using a 120/144 hz monitor, try setting the highest refresh rate, or anything that can lock the memory speed into boost speeds. Or use a utility (even the old GPU-z 0.7.3) that forces 3D clocks in windows. Does this stop the black screens?
> 
> 2) Raise your core voltage by +25 or +50mv instead of underclocking the memory. Then check the current gpu voltage in gpu-z. Does this stop the blackscreens?
> 
> Before we can say the cause is flaky memory, someone first needs to find the memory GDDR5 read point and see if the core voltage is making this go higher or lower than 1.5v.
> 
> (Note: raising the Aux voltage even to 50mv did NOT help; my case seems to be purely a problem of insufficient core voltage in 2D to stop the memory/controller/RAM/whatever from erroring out past 1400 MHz. I suspect 1425 MHz at 0.963v should be fine too.)


It has nothing to do with memory voltage. You are simply overvolting the memory controller, which is severely undervolted to lower power consumption on these cards.


----------



## BradleyW

Hey!
I really could do with some help please on this thread.
http://www.overclock.net/t/1459697/i7-3930k-bottlebeck
Thank you.


----------



## psyside

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It has nothing to do with memory voltage. You are simply overvolting the memory controller, which is severely undervolted to lower power consumption on these cards.


AUX is MC voltage?


----------



## Falkentyne

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It has nothing to do with memory voltage. You are simply overvolting the memory controller, which is severely undervolted to lower power consumption on these cards.


Pardon?
The core voltage and the memory controller voltage are the same thing? (Then what the hell is Aux voltage?)
........
Any proof or documentation of that?


----------



## sugarhell

The AUX voltage is like pll on intel.


----------



## Widde

God! I'm sitting here waiting and waiting for the new drivers.

Want to see Mantle and just better frame pacing!

Not that I've noticed any stuttering or any form of lag in anything yet.


----------



## Bartouille

Quote:


> Originally Posted by *sugarhell*
> 
> The AUX voltage is like pll on intel.


It's the memory controller voltage according to this (MSI Afterburner release note):

Voltage control layer has been seriously revamped to give additional freedom to extreme overclockers with new custom design MSI graphics cards. Now MSI Afterburner is able to control up to 3 voltages on custom design MSI Fermi and other future custom design MSI graphics cards. New adjustable voltages include memory voltage and special multi-purpose auxiliary voltage feeding either memory bus (also known as VDDCI on AMD graphics cards) or PCIE bus and crystal (PEXVDD on NVIDIA graphics cards)

http://forums.guru3d.com/showthread.php?t=326936


----------



## sugarhell

Memory bus =/= memory controller. There isn't programmable memory voltage control at all on the 290X reference PCB.


----------



## Bartouille

Quote:


> Originally Posted by *sugarhell*
> 
> memory bus =/= memory controller. There isnt a programmable voltage control at all in the 290x ref pcb


My bad, I meant the memory bus.

Aux is pretty useful though; I was able to squeeze some serious MHz out of the memory with it at +50mv. I haven't checked whether it has an impact on the core OC.


----------



## psyside

How do I get AB to unlock +100mV? I need +30 for my maximum OC.


----------



## sugarhell

Look in the op


----------



## neurotix

How do you get an AUX slider in Afterburner?

I have the latest beta and it's not showing up.


----------



## INCREDIBLEHULK

what version of msi AB are you guys running?
Is 2.2.1 still the preferred ?


----------



## sugarhell

Huh? That's like 1-2 years old. Use version 3.0 beta 18.

If you want the AUX voltage you need to press the arrow next to core voltage.


----------



## psyside

Quote:


> Originally Posted by *neurotix*
> 
> How do you get an AUX slider in Afterburner?
> 
> I have the latest beta and it's not showing up.


press on this


----------



## BlackPH

Speaking of a UEFI GOP-compatible BIOS for the 290/290X, here is what I got from PowerColor support:

Code:


Subject : UEFI Ready BIOS for non PCS+ R9 290 card 
Description: Hello! I have reference Powercolor 290 OC card and want to ask you for UEFI GOP compatible bios (I`m experienced user so flashing is not a problem). Other vendors like Sapphire allow users to upgrade their reference boards so I expect you to give us little favor also. https://www.sapphireforum.com/showthread.php?32844-Bad-support

Code:


Dear User,

Your card is AMD MBA graphic card. The BIOS is supplied by AMD. We can't do any change on it. Sorry for your inconvenient.

Best Regards

So what are we (ref users) supposed to do?

And BTW, I tried to flash that Sapphire Tri-X BIOS, with no effect on UEFI boot. I still get 5 beeps with the "Ultra Fast" boot option in my motherboard BIOS if I use the 290 as the primary adapter (with the Intel HD 2500 iGPU as primary, everything is OK).


----------



## psyside

Quote:


> Originally Posted by *sugarhell*
> 
> Look in the op


Thanks. Any more advanced tips for a card that doesn't score well in Metro LL? What's usually the culprit, an unstable OC? AFAIK 3DMark 11 is far harder stability-wise, and I passed a 15-minute loop, yet in Metro LL I lose about 25% compared to others.


----------



## sugarhell

Quote:


> Originally Posted by *psyside*
> 
> Thanks. Any more advanced tips for a card that doesn't score well in Metro LL? What's usually the culprit, an unstable OC? AFAIK 3DMark 11 is far harder stability-wise, and I passed a 15-minute loop, yet in Metro LL I lose about 25% compared to others.


A lot of things can give an increase, from running the game off an SSD to high-speed memory with tight timings; a balanced system, basically. If you want to see whether you are stable, you should start at stock and compare the increase vs. the clocks. The increase should be linear: a ~10% OC should give a ~10% increase.
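That rule of thumb is easy to turn into a quick sanity check. A minimal sketch (the clock and score numbers below are made up for illustration, not anyone's real results):

```python
# Rule of thumb: a GPU-bound benchmark should scale roughly linearly
# with the core overclock, so compare % clock gain vs % score gain.

def oc_scaling(stock_clock, oc_clock, stock_score, oc_score):
    """Return (clock gain %, score gain %) for easy comparison."""
    clock_gain = (oc_clock - stock_clock) / stock_clock * 100
    score_gain = (oc_score - stock_score) / stock_score * 100
    return clock_gain, score_gain

# Hypothetical example: 1000 -> 1100 MHz core, Valley score 2600 -> 2830
clock_gain, score_gain = oc_scaling(1000, 1100, 2600, 2830)
print(f"clock +{clock_gain:.1f}%, score +{score_gain:.1f}%")
# If the score gain is far below the clock gain, the run is likely
# CPU/system-bound -- or the overclock is throttling or unstable.
```

If the score gain tracks the clock gain within a few percent, the OC is probably behaving; a big shortfall points at throttling, instability, or a system bottleneck.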


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *sugarhell*
> 
> Huh? Thats like 1-2 years old. Version 3.0 beta 18
> 
> If you want AUX voltages you need to press the arrow next to core voltage


Yeah, I was assuming newer versions still had issues where you could set the voltage in MSI AB but it wouldn't actually change on the card.


----------



## sugarhell

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> Yeah, I was assuming newer versions still had issues where you could set the voltage in MSI AB but it wouldn't actually change on the card.


Did you just press apply? Because I follow Unwinder on Guru3D through almost every single beta, and this was never a bug.


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *sugarhell*
> 
> Did you just press apply? Because I follow Unwinder on Guru3D through almost every single beta, and this was never a bug.


Is pressing apply the obvious way to apply a setting you changed, or does it have something to do with the widely known bug in the MSI AB 2.x versions, which you could probably search for on Guru3D?


----------



## sugarhell

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> Is pressing apply the obvious way to apply a setting you changed, or does it have something to do with the widely known bug in the MSI AB 2.x versions, which you could probably search for on Guru3D?


I am talking about version 3. The 2.x versions are from the Tahiti release, maybe earlier; that means 2 years old.


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *sugarhell*
> 
> I am talking about version 3. The 2.x versions are from the Tahiti release, maybe earlier; that means 2 years old.


It's been a long time since I used MSI AB.

I was only trying to figure out whether people were using previous versions due to glitches like they had in the past.

As we both know, the "newest" or "beta" versions aren't always the best to use.


----------



## BradleyW

Are my cards under performing? Can anyone else compare?
Valley 1.0
Extreme HD preset
1080p
Stock GPU's


----------



## Widde

Something just occurred to me: http://piclair.com/8jnjm Why does it say "LogMeIn Mirror Driver 7.1.542.0/AMD Radeon R9 200 Series 13.251.0.0/AMD Radeon R9 200 Series 13.251.0.0 (4095MB) x1"? Shouldn't it be x2 at the end? And is the score equivalent to two cards?


----------



## Falkentyne

Quote:


> Originally Posted by *sugarhell*
> 
> memory bus =/= memory controller. There isnt a programmable voltage control at all in the 290x ref pcb


Thanks, but what exactly is the core voltage doing to the memory?

As I explained in my long post above, using the default voltage (0.961 or 0.956, fluctuating) at 1500 MHz memory causes a black screen as soon as there is any 2D load (which causes the memory clocks to jump from 150 to 1500; this jump causes the crash, as the memory/whatever is unstable). But if you lock the memory clocks at 3D speeds on the desktop (like on a 144 Hz monitor), instead of black screening you see massive horizontal flickering. Raise the core voltage while it's happening and it instantly stabilizes.


----------



## sugarhell

Quote:


> Originally Posted by *Falkentyne*
> 
> Thanks, but what exactly is the core voltage doing to the memory?
> 
> As I explained in my long post above, using the default voltage (0.961 or 0.956, fluctuating) at 1500 MHz memory causes a black screen as soon as there is any 2D load (which causes the memory clocks to jump from 150 to 1500; this jump causes the crash, as the memory/whatever is unstable). But if you lock the memory clocks at 3D speeds on the desktop (like on a 144 Hz monitor), instead of black screening you see massive horizontal flickering. Raise the core voltage while it's happening and it instantly stabilizes.


Power limit and more current affect stability. With no voltage increase you get a black screen, but with a small bump you don't; more current helps with stability. You need to better understand the PWM voltage controller.

Also, it's not something new; when tsm got his cards, he said the same thing like 2 months ago.


----------



## Falkentyne

I understand, but I don't think ANYONE has read what I wrote *closely*, as everyone so far has *ignored* the details I put in.

At 0.953v - 0.961v, the screen flickers horizontal lines massively at 1500 MHz if you freeze the memory at 1500 MHz on the desktop.
If there is FULL 3D load, there is NO flickering.

And, at 0.953-0.961v, at 150 MHz, the system will black screen as soon as the memory jumps to 1500 MHz. It won't flicker; it will just instantly black screen.

At 0.984v, the system is stable (in Windows), as this is a +25mv offset.

Thus you can easily figure out that if you are at 3D load/speeds at 1500 MHz, you won't get flickering or a black screen (at least for this set of examples), because the CORE VOLTAGE will be anywhere between 1.1v and ~1.2v (without any overclocking offset).

Don't you see what I'm trying to get at here?

Does the MEMORY CONTROLLER use the CORE VOLTAGE as the supply voltage for BOTH the core AND the controller?

This seems to point to a simple (or careless) design issue and some samples of substandard parts somewhere (like there being no separate offset/control for 2D and 3D voltage; otherwise, why would the memory jump to 1500 MHz in 2D when the core is only at 300-400 MHz?).

Seems like a BIOS keeping the memory clocks at 150 MHz unless the core reaches 500 MHz or higher (since the core voltage would probably be around 1.03v-ish there) would solve 95% of the problems...
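The fix being proposed here amounts to gating the memory P-state on the core clock. A minimal sketch of that policy, purely for illustration (the threshold and clock values come from the post above; the real logic would live in the BIOS/PowerPlay tables, not in software like this):

```python
# Proposed policy: don't let the memory jump to boost clocks until the
# core (and therefore the shared core voltage) has ramped up first.

IDLE_MEM_MHZ = 150      # 2D/idle memory clock on the 290 series
BOOST_MEM_MHZ = 1250    # full 3D memory clock (stock 290X)
CORE_GATE_MHZ = 500     # illustrative threshold from the post above

def target_mem_clock(core_clock_mhz):
    """Boost memory only once the core clock (and its voltage) is up."""
    if core_clock_mhz >= CORE_GATE_MHZ:
        return BOOST_MEM_MHZ
    return IDLE_MEM_MHZ

print(target_mem_clock(300))   # light 2D load: memory stays at idle
print(target_mem_clock(947))   # 3D load: memory boosts
```

Under this policy a light 2D load (core at 300-400 MHz, low voltage) would never drag the memory to 1500 MHz, which is exactly the transition that triggers the black screens described above.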


----------



## sugarhell

... Your GPU is in 2D and you force higher memory clocks than stock. The memory bus also downclocks, and so does the PCIe bus. It means that you are not stable at idle. It's the same as CPUs: you can be stable under load but black screen at idle.


----------



## kizwan

Quote:


> Originally Posted by *BradleyW*
> 
> Are my cards under performing? Can anyone else compare?
> Valley 1.0
> Extreme HD preset
> 1080p
> Stock GPU's
> 
> 
> Spoiler: Warning: Spoiler!


----------



## Forceman

Quote:


> Originally Posted by *Falkentyne*
> 
> Does the MEMORY CONTROLLER use the CORE VOLTAGE as the supply voltage for BOTH the core AND the controller?
> 
> This seems to point to a simple (or careless) design issue and some samples of substandard parts somewhere (like there being no separate offset/control for 2D and 3D voltage; otherwise, why would the memory jump to 1500 MHz in 2D when the core is only at 300-400 MHz?).
> 
> Seems like a BIOS keeping the memory clocks at 150 MHz unless the core reaches 500 MHz or higher (since the core voltage would probably be around 1.03v-ish there) would solve 95% of the problems...


I think that is what is happening. The memory controller, being integrated on the core, is running at core voltage (otherwise what voltage is it running at?). So when it is undervolted at idle and the memory bus is overclocked, you get flickering. I get the exact same flickering on my card (also at 1400) and it also goes away with a core voltage bump. Likewise many people have found they can run higher memory overclocks with higher core voltage. The only explanation that makes sense is that the memory controller needs the extra voltage bump to run the higher clock speeds. It can't be DDR voltage itself, because that isn't controllable.

And you are exactly right - there is no reason for the memory to go to full speed while in 2D mode, they really need intermediate steps.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> I think that is what is happening. The memory controller, being integrated on the core, is running at core voltage (otherwise what voltage is it running at?). So when it is undervolted at idle and the memory bus is overclocked, you get flickering. I get the exact same flickering on my card (also at 1400) and it also goes away with a core voltage bump. Likewise many people have found they can run higher memory overclocks with higher core voltage. The only explanation that makes sense is that the memory controller needs the extra voltage bump to run the higher clock speeds. It can't be DDR voltage itself, because that isn't controllable.
> 
> And you are exactly right - there is no reason for the memory to go to full speed while in 2D mode, they really need intermediate steps.


Quote:


> Originally Posted by *sugarhell*
> 
> ... Your GPU is in 2D and you force higher memory clocks than stock. The memory bus also downclocks, and so does the PCIe bus. It means that you are not stable at idle. It's the same as CPUs: you can be stable under load but black screen at idle.


I think the 290 is missing a memory clock step. It only has 150MHz and 1250MHz. I believe the HD 7970 had 3, if I am not mistaken.


----------



## Derpinheimer

Correct; I wouldn't doubt that having 3 steps would fix this problem.

Maybe once a BIOS editor comes out we can add another one in, but I wouldn't count on it...


----------



## PillarOfAutumn

Stupid question, but is the R9 290 compatible with the X-STAR DP2710? And does the monitor come with the appropriate wires and cables or will I need to purchase them myself?


----------



## kizwan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> I think that is what is happening. The memory controller, being integrated on the core, is running at core voltage (otherwise what voltage is it running at?). So when it is undervolted at idle and the memory bus is overclocked, you get flickering. I get the exact same flickering on my card (also at 1400) and it also goes away with a core voltage bump. Likewise many people have found they can run higher memory overclocks with higher core voltage. The only explanation that makes sense is that the memory controller needs the extra voltage bump to run the higher clock speeds. It can't be DDR voltage itself, because that isn't controllable.
> 
> And you are exactly right - there is no reason for the memory to go to full speed while in 2D mode, they really need intermediate steps.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> ... Your gpu is on 2d and you force a higher memory than stock. The memory bus also downclock and the pcie also. It means that you are not stable on idle. Its the same as cpus you can be stable on load but on idle you blackscreen.
> 
> Click to expand...
> 
> I think 290 is missing a memory clock step. It only has 150MHz and 1250Mhz. I believe HD 7970 had 3 if i am not mistaken.
Click to expand...

I can see 4 steps in the graph while in 2D mode; 425MHz, 700MHz, 975MHz & 1250MHz.


----------



## brazilianloser

FrozenCpu has shipped my blocks and the remainder of my loop parts. Should have mine under water by the end of next week if everything works right.


----------



## Falkentyne

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I think 290 is missing a memory clock step. It only has 150MHz and 1250Mhz. I believe HD 7970 had 3 if i am not mistaken.


It wasn't missing a step; it actually had core speed stops:
1) 300/150 MHz. 150 MHz memory was used for any core load under 500 MHz, but I don't remember whether the core jumped straight from 300 to 501 or not.

2) 501/1375 MHz (1500 MHz on the GHz Edition; this used the Overdrive speed, though. There was a BUG where UVD clocks for HW-accelerated video FORCED stock memory speeds if you had overclocked the memory, which was known to cause scrambled screens if your 2D memory speeds were already running past stock--common on 144 Hz monitors.)
3) Overdrive speed

Besides the scrambled-screen UVD/accelerated-video bug, memory remained at 150 MHz for any light 2D load.
On the 290 series, the memory jumps to full speed at any GUI load, while the core can use any clock it wants.


----------



## Forceman

Quote:


> Originally Posted by *kizwan*
> 
> I can see 4 steps in the graph while in 2D mode; 425MHz, 700MHz, 975MHz & 1250MHz.


On a 290? Strange, mine looks like a square wave - it's either 150 or 1350, with nothing in between.


----------



## Falkentyne

Quote:


> Originally Posted by *kizwan*
> 
> I can see 4 steps in the graph while in 2D mode; 425MHz, 700MHz, 975MHz & 1250MHz.


As forceman said, I've never seen anything except 150 and 1250 (or whatever overdrive is set to) on the memory.


----------



## e1m088

Quote:


> Originally Posted by *Mr357*
> 
> Enabling voltage monitoring in a recent (not latest) version of Afterburner caused the issue you're having. Is that it?
> 
> Although 630W is enough, you might need a new PSU depending on the 12V rail configuration of the one you have.


Thanks for the idea, but I don't have AB installed. I'm using GPU Tweak and TriXX; uninstalling those didn't change anything.
The PSU is a Thermaltake Berlin 630W (Germany series). If I'm not mistaken it should deliver about 530W through the 12V rail, and that should be enough.


----------



## kizwan

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I can see 4 steps in the graph while in 2D mode; 425MHz, 700MHz, 975MHz & 1250MHz.
> 
> 
> 
> On a 290? Strange, mine looks like a square wave - it's either 150 or 1350, with never an in-between.
Click to expand...

I could post the screenshot here, but the graph is too big (I can't adjust the y-axis).

BTW, I thought I'd share my CPU usage in BF4 at 1080p High settings + 200% resolution scale (64-player map, "Siege of Shanghai", if that makes any difference). Maybe useful to anyone with the same CPU & CrossFire (dual) setup.


----------



## Paulenski

Quote:


> Originally Posted by *BradleyW*
> 
> Are my cards under performing? Can anyone else compare?
> Valley 1.0
> Extreme HD preset
> 1080p
> Stock GPU's


For stock clocks that looks about normal; I've seen around 2600 in Valley, so with CF it'll be close to double. Valley doesn't produce the best scores for AMD cards; Nvidia, on the other hand, tends to do much better at comparable clock speeds.

Here's highest I've achieved in valley, 1218/1690, Extreme Preset

Single GPU


----------



## Derpinheimer

Quote:


> Originally Posted by *Falkentyne*
> 
> As forceman said, I've never seen anything except 150 and 1250 (or whatever overdrive is set to) on the memory.


He's right - quick test:


----------



## kizwan

This is what I got (GPU memory clock), reported by Open Hardware Monitor. I have to take two screenshots because I can't re-size the y-axis.


----------



## Forceman

Quote:


> Originally Posted by *Derpinheimer*
> 
> Hes right - quick test:


What program is that?

Edit: Just logged mine with HWiNFO for a couple of minutes and saw almost nothing but 149.5 and 1324, with the exception of two readings of 300. I need to up the polling rate and try it again. Okay, out of 300 readings it was 98 @ 1324, 184 @ 149.5, 11 @ 662, and 3 @ 299. The 299 and 662 readings are always singles, never more than one in a row, so they may be measurement errors as the card switches frequencies. Or maybe it does have four modes and just hardly ever goes to two of them.


----------



## kizwan

With BF4 at 1080p Ultra settings and 200% resolution scale, at 40-60 FPS, CPU usage is lower than at High settings. ("Rogue Transmission", a 64-player map.)


----------



## Paulenski

Quote:


> Originally Posted by *ImJJames*
> 
> I get 60 FPS @ 1200/1500 with those settings


I ran it too, this was my average result for 3 runs

Average Framerate: 55.00
Max. Framerate: 108.03
Min. Framerate: 12.03

1202/1690 @ 125 mv core / 50mv aux

Figured I would get better scores with slightly higher clocks, guess not lol


----------



## 2advanced

Did the latest AB Beta remove the extended overclocking? I updated and I can only go as high as 1235 on the core?!?!?!?


----------



## lol.69

Have you already seen this?


----------



## e1m088

Quote:


> Originally Posted by *e1m088*
> 
> Hi guys!
> 
> I'm having a problem with my R9 290 reference card. It's throttling, but it's not thermal throttling, as the temperature is nowhere near the target. I verified that by running the fan manually at 62% speed.
> So basically the core clock (and memory clock) drop every now and then to around 300MHz, as if they were idle. I can't find the cause. Could it have something to do with my PSU (Thermaltake 630W)? Or is the card faulty?
> 
> Other components in my setup are:
> 
> ASUS P8Z77V_LX2
> i5-3570k
> 8GB DDR3
> 
> GPU-Z when the clocks are normal (although you can see the very fast drops as white lines):
> http://aijaa.com/NCvR9X
> 
> GPU-Z exactly when there is a drop:
> http://aijaa.com/pntucK


SOLVED: It seems that even though I didn't have AB before, once I *downloaded the latest beta of AB* I stopped getting the drops. Really weird.

EDIT: It seems GPU-Z causes the drops. Weird. I don't get the drops if I'm not monitoring through GPU-Z (using GPU-Z 0.7.5).


----------



## rdr09

Quote:


> Originally Posted by *Falkentyne*
> 
> As forceman said, I've never seen anything except 150 and 1250 (or whatever overdrive is set to) on the memory.


I saw you post your idle voltage for the 290X. My 290's idle voltage is 0.984v. Interesting that the X has a lower setting.

Quote:


> Originally Posted by *e1m088*
> 
> SOLVED: It seems that even though I didn't have AB before, when I *downloaded the latest beta of AB*. I don't get the drops anymore. Really weird.


Are you saying that AB was the culprit? So you uninstalled it and the issue went away?


----------



## e1m088

Quote:


> Originally Posted by *rdr09*
> 
> i saw you post your idle voltage for the 290X. my 290 idle voltage is 0.984v. interesting that the X has a lower setting.
> are you saying that AB was the culprit? so, you uninstalled it and the issue went away?


It seemed that installing (not uninstalling) the latest beta of AB solved the issue, but now that I've done more tests, it's GPU-Z that's causing the problems.
As soon as I start GPU-Z (0.7.5) the drops start, and the moment I close it the drops disappear.


----------



## rdr09

Quote:


> Originally Posted by *e1m088*
> 
> It seemed that the installation (not uninstallation) of latest beta of AB solved the issue, but now that I've done more tests it is GPU-Z that is causing the problems.
> As soon as I start GPU-Z (0.7.5) the drops start and the moment when I close it the drops disappear.


I see. Not sure about your GPU, but my VRM temps can only be read by GPU-Z. Guess I won't open it during a game or a bench. Thanks, I'll watch mine too and see if it does the same.


----------



## psyside

Quote:


> Originally Posted by *sugarhell*
> 
> A lot of things can give an increase, from running the game off an SSD to high-speed memory with tight timings; a balanced system, basically. If you want to see whether you are stable, you should start at stock and compare the increase vs. the clocks. The increase should be linear: a ~10% OC should give a ~10% increase.


This card is going back; the owner of the store offered me a replacement, so I hope all will be fine with the other one.

Also, a tip for owners of the Tri-X:

Do not use GPU-Z; replace it with HWiNFO. GPU-Z is freezing systems with this card, I guess a vBIOS issue. It's happening on 4 cards my friends have, and also for a few users on the forum.


----------



## BradleyW

Quote:


> Originally Posted by *Paulenski*
> 
> For stock clocks that looks about normal, I've seen about 2600~ in valley, so with CF it'll be close to double. Valley doesn't provide the best scores for amd cards, nvidia in the other hand tends to do much better in comparative clock speeds.
> 
> Here's highest I've achieved in valley, 1218/1690, Extreme Preset
> 
> Single GPU


Thanks very much for this result. Could you repeat the test using stock GPU clocks please?

Thanks!


----------



## psyside

Quote:


> Originally Posted by *Paulenski*
> 
> Here's highest I've achieved in valley, 1218/1690, Extreme Preset
> 
> Single GPU


Tessellation off?


----------



## Arizonian

Quote:


> Originally Posted by *psyside*
> 
> This card is going back, the owner of the store offer me replacement so i hope all will be fine with the other
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also tip for owners of Tri X,
> 
> Do not use GPU-Z replace it with HWinFO, GPU-z is freezing systems for this card i guess vBIOS issue, happening to 4 cards, my friends have them, and also on few users on the forum.


What was wrong with your TRI X for you to return it?


----------



## devilhead

some Valley score with 2x290x


----------



## BradleyW

Quote:


> Originally Posted by *devilhead*
> 
> some Valley score with 2x290x


What do you get on stock speeds?


----------



## Sazz

Quote:


> Originally Posted by *BradleyW*
> 
> What do you get on stock speeds?


You do know that's an OC run and not stock.

Not to mention he got one of the lucky cards that can reach a high memory OC.


----------



## devilhead

Quote:


> Originally Posted by *BradleyW*
> 
> I think I will be returning my GPU's then. I get FPS 108 on Extreme Preset 1080P.


Heh, but my CPU is a 3930K at 5.2GHz with 32GB of 2400MHz memory, and my GPU clocks are 1280/1680.


----------



## BradleyW

Quote:


> Originally Posted by *devilhead*
> 
> heh but my CPU is 3930K at 5.2ghz with 32gb 2400mhz memory
> 
> 
> 
> 
> 
> 
> 
> and gpu clocks 1280/1680
> 
> 
> 
> 
> 
> 
> 
> )


What Vcore do you need for 5.2GHz?


----------



## devilhead

Quote:


> Originally Posted by *BradleyW*
> 
> What Vcore do you need for 5.2GHz?


1.5v


----------



## Sazz

Quote:


> Originally Posted by *devilhead*
> 
> 1.5v


isn't that kinda high... what's the safe voltage for the Ivy's anyways?


----------



## Jack Mac

Guys, I strongly advise not using GPU-Z, just use HWInfo64 to monitor your card. I don't know why but GPU-Z was causing my BF4 stuttering. I was able to replicate it and the problem went away as soon as I closed GPU-Z.


----------



## kizwan

Quote:


> Originally Posted by *Sazz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *devilhead*
> 
> 1.5v
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> isn't that kinda high... what's the safe voltage for the Ivy's anyways?
Click to expand...

That is SB-E. If you want to push it, 1.5V should be the limit. Higher than that, the CPU may degrade quickly; I found that happens around 1.55V and higher.


----------



## Derpinheimer

Quote:


> Originally Posted by *Jack Mac*
> 
> Guys, I strongly advise not using GPU-Z, just use HWInfo64 to monitor your card. I don't know why but GPU-Z was causing my BF4 stuttering. I was able to replicate it and the problem went away as soon as I closed GPU-Z.


I once had a similar issue with MSI Afterburner w. Monitoring and another game... might have been BF4, not totally sure.


----------



## devilhead

Quote:


> Originally Posted by *Sazz*
> 
> isn't that kinda high... what's the safe voltage for the Ivy's anyways?


My daily overclock is 4.8GHz at 1.37v; for benching I use 5.2GHz at 1.5v, and that's even low for 5.2GHz, I think.


----------



## psyside

OK guys, these are my new tests of the R9 *290* Tri-X. Can someone please run the same benches at the same settings? I want to know whether the OC scaling is good.

1170/1300...


----------



## BradleyW

Quote:


> Originally Posted by *devilhead*
> 
> 1.5v


I can't get 4.7GHz at 1.5v, never mind 5GHz+.
What tool will check my MB temps? It could be the MB overheating, which would be an easy fix.


----------



## Jack Mac

Quote:


> Originally Posted by *psyside*
> 
> Ok guys, this are my new tests R9 *290* Tri x, can someone please do the same benches on same settings, i want to know if the oc scaling is good
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1170/1300...


I'll run heaven for you at those clocks in 30 mins or so.


----------



## psyside

Quote:


> Originally Posted by *Jack Mac*
> 
> I'll run heaven for you at those clocks in 30 mins or so.


Thanks bro, I run them for around 10 mins+.


----------



## rdr09

Quote:


> Originally Posted by *psyside*
> 
> Ok guys, this are my new tests R9 *290* Tri x, can someone please do the same benches on same settings, i want to know if the oc scaling is good
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1170/1300...
> 
> 
> Spoiler: Warning: Spoiler!


i think you missed the question Arizonian asked about why you are returning that card. Why?

my score is slightly lower . . .



love my EK block. the VRM temps are always lower than the core . . .


----------



## kizwan

I'm dying to know as well.


----------



## Sgt Bilko

Quote:


> Originally Posted by *psyside*
> 
> Ok guys, this are my new tests R9 *290* Tri x, can someone please do the same benches on same settings, i want to know if the oc scaling is good
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1170/1300...


Well, your graphics score in 3DMark 11 is 4539; my XFX card at the same settings gets 4536, so I'd say that's pretty bang on.









http://www.3dmark.com/3dm11/7835261?

I'd download Heaven and run it as well.
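Side note for anyone eyeballing score deltas like this: the check is just a relative-difference calculation. A quick Python sketch using the two scores above (the function name is just for illustration):

```python
def percent_diff(a: float, b: float) -> float:
    """Relative difference of score a versus score b, as a percentage of b."""
    return (a - b) / b * 100.0

# Graphics scores from the two runs being compared
tri_x = 4539
xfx = 4536

delta = percent_diff(tri_x, xfx)
print(f"{delta:.3f}% apart")
```

A fraction of a percent like this is well within normal run-to-run variance, so the two cards are effectively tied.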


----------



## psyside

Quote:


> Originally Posted by *rdr09*
> 
> i think you missed the question asked by Arizonian about why you are returning that card. why?


It was an average overclocker... and the temps were kinda high for a Tri-X. The second card is better I guess; not 100% sure yet, but I think it is.









rep + to you for the results as well








Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well your Graphics score in 3DMark 11 is 4539, my XFX card at the same settings is 4536 so i'd say that's pretty bang on
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/7835261?
> 
> I'd download Heaven and run it as well.


Thanks bro, so I guess 1170 on the core is stable.







rep +


----------



## Sgt Bilko

Quote:


> Originally Posted by *psyside*
> 
> Was average overclocker....and the temps was kinda high for Tri x, the second card is better i guess, not 100% sure yet but i think it is
> 
> 
> 
> 
> 
> 
> 
> 
> 
> rep + to you for the results as well
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks bro so i guess 1170 on core is stable
> 
> 
> 
> 
> 
> 
> 
> rep +


I'd say so, mine was fine at those clocks.

Here is Heaven btw:



I probably could have squeezed a few more fps out of it running at 5GHz but meh, it's close enough, you are good.


----------



## Jack Mac

Alright, here's the Heaven results at 1170/1300, same settings as you.


I think you're good.


----------



## Krusher33

Why can't I unlock the voltage control in AB? I've been doing it successfully since HD 6850's release but with this card... is something different?

I did try to follow TSM's guide but still to no avail.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Krusher33*
> 
> Why can't I unlock the voltage control in AB? I've been doing it successfully since HD 6850's release but with this card... is something different?
> 
> I did try to follow TSM's guide but still to no avail.


AB has been funny for me as well with voltage control; sometimes closing it down then starting it up again works, and other times I need to re-install it.


----------



## tobitronics

How should I set up my AMD Catalyst settings to get a good OC? I mean, I want to prevent Catalyst and Trixx from interfering with each other. And does Trixx always have to run in order to enforce my OC?


----------



## psyside

Quote:


> Originally Posted by *Jack Mac*
> 
> Alright, here's the Heaven results at 1170/1300, same settings as you.
> 
> 
> I think you're good.


Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'd say so, mine was fine at those clocks.
> 
> Here is Heaven btw:
> 
> 
> 
> I probably could have squeezed a few more fps out of it running at 5Ghz but meh, it's close enough, you are good


Thanks both, rep+. Really happy it's stable, but Metro is very bad on my cards; the new one is also weak. AFAIK another Tri-X owner had lower scores in Metro LL too, dunno why. We lose like 10 fps, it's very very strange...

Can any of you guys max out Metro LL as well? This is my score..


----------



## Jhors2

Are Sapphire and Asus the only cards guaranteed to come with Hynix memory chips? I was going to pull the trigger on the reference version MSI cards, but not if they only come with Elpida.
Thanks!


----------



## VSG

None of the reference cards are guaranteed to come with Hynix, both my Sapphire cards were Elpida as was the Asus I had.


----------



## psyside

The only guaranteed Hynix cards are Tri X for now.


----------



## Paulenski

Quote:


> Originally Posted by *tobitronics*
> 
> How should I have set up my AMD Catalyst settings in order to get a good OC? I mean that I want to prevent AMD and Trixx interfering with each other. And does Trixx always have to run in order to force my OC?


Turn off overdrive, if you haven't done anything yet, just don't touch it.
The only optimizations you can use to improve performance are the 3D settings for that specific app from Catalyst.

Sent from my SGH-M919 using Tapatalk


----------



## Sgt Bilko

Quote:


> Originally Posted by *psyside*
> 
> Thanks both, rep + really happy its stable, but Metro is very bad on my cards, the new one is also weak, AFAIK another owner on Tri X had lower scores in Meto LL, dunno why, we lose like 10 fps, its very very strange...
> 
> Can anyone of you guys do max out Metro LL as well? this is my score..




It's roughly the same as mine, minus the extra fps i'd get from running the 8350 at 5.0Ghz


----------



## Gunderman456

Quote:


> Originally Posted by *psyside*
> 
> Thanks both, rep + really happy its stable, but Metro is very bad on my cards, the new one is also weak, AFAIK another owner on Tri X had lower scores in Meto LL, dunno why, we lose like 10 fps, its very very strange...
> 
> Can anyone of you guys do max out Metro LL as well? this is my score..


I got tons of MetroLL benches/scores in my new build log for "The Hawaiian Heat Wave" (in sig).


----------



## Krusher33

Quote:


> Originally Posted by *Jhors2*
> 
> Are Sapphire and Asus the only cards you are guaranteed to get the Hynix memory chips? I was going to pull the trigger on the reference version MSI cards, but not if they only come with Elpida.
> Thanks!


Quote:


> Originally Posted by *geggeg*
> 
> None of the reference cards are guaranteed to come with Hynix, both my Sapphire cards were Elpida as was the Asus I had.


His Sapphires had Elpida and mine has Hynix. So it's not guaranteed, but it is possible. Like a lotto ticket, as usual.


----------



## Paulenski

Quote:


> Originally Posted by *Derpinheimer*
> 
> > Originally Posted by *Jack Mac*
> > 
> > Guys, I strongly advise not using GPU-Z, just use HWInfo64 to monitor your card. I don't know why but GPU-Z was causing my BF4 stuttering. I was able to replicate it and the problem went away as soon as I closed GPU-Z.
> 
> I once had a similar issue with MSI Afterburner w. Monitoring and another game... might have been BF4, not totally sure.

Does HWInfo64 show VRM temps? That's the only reason I use it, and AIDA64 is limited-time trial use.

Sent from my SGH-M919 using Tapatalk


----------



## Sgt Bilko

Quote:


> Originally Posted by *Paulenski*
> 
> Does HWInfo64 show vrm temps? Only reason I use it and aida64 is limited time use.
> 
> Sent from my SGH-M919 using Tapatalk


It's showing them for me


----------



## psyside

Quote:


> Originally Posted by *Gunderman456*
> 
> I got tons of MetroLL benches/scores in my new build log for "The Hawaiian Heat Wave" (in sig).


Thanks, that thread is amazing. Unfortunately there are no settings listed for Metro LL, so I don't know how you tested.


----------



## psyside

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 
> 
> It's roughly the same as mine, minus the extra fps i'd get from running the 8350 at 5.0Ghz


So you are the 3rd user who got a much lower score on the Metro LL benchmark?


----------



## Sgt Bilko

Quote:


> Originally Posted by *psyside*
> 
> So you are the 3rd user who got much lower score on Metro LL benchmark?


What's a normal score for Metro LL?


----------



## Gunderman456

Quote:


> Originally Posted by *psyside*
> 
> Thanks that thead is amazing, unfortunately there are no settings for Metro LL so i don't know how you tested


Yes, come to think of it, I never kept that screen.

Well, I tested at 1200 resolution, with everything at max except for tessellation, which was on normal.


----------



## the9quad

With 3x 290Xs I get 76 fps avg @1080p with all settings on max and physics off, and 53 fps avg @1440p. I don't think it scales that well with CrossFire.


----------



## psyside

Quote:


> Originally Posted by *Gunderman456*
> 
> Yes, come to think of it, I never kept that screen.
> 
> Well, I tested at 1200 resolution, with everything at max, except for tesselation which was on normal.


Sadly my tests are on extreme







Quote:


> Originally Posted by *Sgt Bilko*
> 
> Whats a normal score for Metro LL?


Based on the users here, with ~1200/1600 they get around 60-65 fps maxed out, which is far from our scores.

Ok guys, can anyone test 3DMark 11 Extreme with the core at 1000 and memory at 1600? Post a screenshot so we can see the graphics score, because the CPU makes a difference to the total score.

Thanks, and really sorry for bothering, but on the internet I find very confusing results.


----------



## jerrolds

Is it possible to log fps in BF4? I know you can normally log it using RivaTuner, but I'm not sure it'll work with the game.

Maybe Fraps?


----------



## the9quad

Quote:


> Originally Posted by *jerrolds*
> 
> Is it possible to log fps in BF4? I know you can normally log it using Rivatuner, but im not sure itll work with the game.
> 
> Maybe Fraps?


Fraps is the easiest way to do it.
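If you want more than the on-screen counter, Fraps' benchmark mode writes a frametimes CSV you can post-process. A rough sketch, assuming the usual layout (one header row, then frame number and cumulative milliseconds per row); the filename below is made up:

```python
import csv

def average_fps(frametimes_csv: str) -> float:
    """Average fps from a Fraps-style frametimes CSV.

    Assumes one header row, then rows of (frame number, cumulative ms).
    """
    with open(frametimes_csv, newline="") as f:
        reader = csv.reader(f)
        next(reader)  # skip the "Frame, Time (ms)" header
        times = [float(row[1]) for row in reader if row]
    elapsed_ms = times[-1] - times[0]
    # Frames rendered per second of elapsed wall-clock time
    return (len(times) - 1) / (elapsed_ms / 1000.0)

# e.g. average_fps("bf4 frametimes.csv")  # hypothetical filename
```

Handy for comparing runs after the fact instead of staring at the overlay.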


----------



## Loktar Ogar

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Stupid question, but is the R9 290 compatible with the X-STAR DP2710? And does the monitor come with the appropriate wires and cables or will I need to purchase them myself?


My friend. Check out this thread: http://www.overclock.net/t/1384767/official-the-korean-pls-monitor-club-qnix-x-star

I'm sure it's compatible, and you don't have to buy cables.


----------



## Paulenski

Quote:


> Originally Posted by *psyside*
> 
> Tesselation off?


I believe it's on, Extreme HD Preset is on right?

I used that for the score ranking for here (http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0)


----------



## Ar2r4eg

Hello, does anybody know how I can tell which mode (quiet or uber) my 290X is set to without looking at the card? I installed an EK waterblock and bridge (CrossFire). It's hidden from view now and I can't see the switch...


----------



## Paulenski

Quote:


> Originally Posted by *psyside*
> 
> So you are the 3rd user who got much lower score on Metro LL benchmark?


Not sure if you saw my result

3 runs - SSAA on

Average Framerate: 55.00
Max. Framerate: 108.03
Min. Framerate: 12.03

1202/1690 @ 125 mv core / 50mv aux

I've looked around on some forums for the MetroLL bench and it seems SSAA is kind of a bad indicator of performance since you have no control on what level of AA it uses.


----------



## the9quad

Here is how bad the scaling is on Metro LL:
First TR almost 100% usage on all 3 cards:



Next Hitman almost 100% usage on all 3 cards:



Now frickin Metro LL 28%,77%, 50% usage:


----------



## Forceman

Quote:


> Originally Posted by *Ar2r4eg*
> 
> Hello, anybody know how i can know, what mode (quiet or uber) set in my 290x without visual ? I install ek wb and brige (crossfire). Hi is close my look and i can see the switch...


Reset the defaults in Catalyst Control Center and see what the fan speed limit is. Quiet mode will be 40% and Uber will be 55%. That won't work for a 290 though, since both are the same.


----------



## psyside

Quote:


> Originally Posted by *Paulenski*
> 
> I believe it's on, Extreme HD Preset is on right?
> 
> I used that for the score ranking for here (http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0)


Did you turn off tessellation in the drivers? Because you got like 13 fps more than me; that's an insane difference, and I run my card @1170 core / 1300 memory, which is much faster than stock anyway.

I wonder how there can be such a big difference. Is Extreme HD the same as Ultra with everything maxed out?


----------



## zpaf

Quote:


> Originally Posted by *psyside*
> 
> Based on the users here, with ~1200/1600 they get around 60-65 fps maxed out, which is far from our scores.


https://imageshack.com/i/0l0gwwj

60-65? No way on these clocks with an R9 290 non-X.


----------



## psyside

Quote:


> Originally Posted by *Paulenski*
> 
> Not sure if you saw my result
> 
> 3 runs - SSAA on
> 
> Average Framerate: 55.00
> Max. Framerate: 108.03
> Min. Framerate: 12.03
> 
> 1202/1690 @ 125 mv core / 50mv aux
> 
> I've looked around on some forums for the MetroLL bench and it seems SSAA is kind of a bad indicator of performance since you have no control on what level of AA it uses.


Didn't see it, thanks. I guess my scores are fine then? Because I benched without a memory OC... here is the result.


Quote:


> Originally Posted by *zpaf*
> 
> https://imageshack.com/i/0l0gwwj
> 
> 60-65 no way on these clocks with R9 290 non X.


I could swear someone said/posted 65 fps with everything maxed out in Metro LL with a highly overclocked 290/290X ~1250/1650, but I can't find the post.


----------



## zpaf

Quote:


> Originally Posted by *psyside*
> 
> Didn't see it, thanks. I guess my scores are fine then? because i benched without memory oc...here is the result.
> 
> 
> I could swear someone said/posted 65 fps with everything maxed out in Metro LL, with highly overclocked 290/290X ~ 1250/1650 but can't find the post.


Here it is...
Quote:


> Originally Posted by *ImJJames*
> 
> 290 @ 1250 Core / 1500 Mem
> 4770k @ 4.3
> All settings max Phsyx off
> 1080P
> 62.22
> 
> 
> 
> Heres a run at 1200 core 1500 mem 60.23


----------



## psyside

Quote:


> Originally Posted by *zpaf*
> 
> here is ...


So... how does he achieve such high scores, man? Huge difference from us...


----------



## zpaf

Quote:


> Originally Posted by *psyside*
> 
> So....how does he achieve such high scores man? huge difference from us...


Only a 290*X* with the PT1/3 BIOS can do these numbers without tweaks.


----------



## psyside

So ~20% higher score just from the vBIOS?


----------



## sugarhell

Quote:


> Originally Posted by *psyside*
> 
> So ~ 20% higher score from vBIOS?


http://www.3dmark.com/fs/1425869

1345/1700-A friend's 290


----------



## the9quad

That's impressive.


----------



## Widde

Quote:


> Originally Posted by *Derpinheimer*
> 
> I once had a similar issue with MSI Afterburner w. Monitoring and another game... might have been BF4, not totally sure.


I've had similar issues with GPU-Z. At first I thought it was my overclock; reverted to stock and still got stuttering. Closed GPU-Z and it went away instantly.


----------



## zpaf

Quote:


> Originally Posted by *psyside*
> 
> So ~ 20% higher score from vBIOS?


Nope.
I said a 290*X*, not a Pro.
Metro LL likes shaders, so I said that a 290X without power throttling can touch these numbers.


----------



## psyside

Quote:


> Originally Posted by *zpaf*
> 
> Nope.
> I said 290*X* no Pro.
> Metro LL likes shaders so I said that a 290X without power throttle can touch these numbers.


I understand, but even with more shaders, IMHO it's impossible to get 20% higher scores... at similar clocks.

Maybe his score is with tweaks?

Also, does this look normal for 3DMark 11 Extreme? Without a memory OC I get this..

1170 core



With the memory OC at 1600 but the core @1150, I get this...



Is that a low increase for 300MHz+ on memory, or is it normal?


----------



## Paulenski

Quote:


> Originally Posted by *psyside*
> 
> our scores.
> 
> Did you turn off tessellation in the drivers? because you got like 13 fps more then me, that's insane difference and i run mine card @1170 core, 1300 memory which is much faster then stock anyway.
> 
> I wonder how come there is so big difference, is Extreme HD same as ultra with everything maxed out?


The same thing, I think. I have tessellation on "application specific" in Catalyst for Valley.exe. I don't think Valley does tessellation, considering it has no controls for it, and I thought only Heaven 4.0 was geared towards tessellation testing.


----------



## psyside

No idea really. Can you do a 3DMark 11 Extreme run @1150/1600 for me please? Thanks.


----------



## tsm106

Quote:


> Originally Posted by *sugarhell*
> 
> > Originally Posted by *psyside*
> > 
> > So ~ 20% higher score from vBIOS?
> 
> http://www.3dmark.com/fs/1425869
> 
> 1345/1700-A friend's 290

I still beat him.
















Side note, seems the X maintains its lead unlike previously with unlocked 6950s vs 6970s.


----------



## Sgt Bilko

Quote:


> Originally Posted by *sugarhell*
> 
> http://www.3dmark.com/fs/1425869
> 
> 1345/1700-A friend's 290


Sad thing about that is it's very close to my CF 290 score (stock clocks).......all dem Physics huh?









http://www.3dmark.com/compare/fs/1531853/fs/1425869


----------



## vieuxchnock

*I tried that this afternoon:



Do you think I can keep those settings 24/7?
I'm on air for now, but I will switch to water in the near future.*


----------



## bond32

Quote:


> Originally Posted by *vieuxchnock*
> 
> *I tried that this afternoon:
> 
> 
> 
> Do you think I can keep those settings 24/7?
> I'm on air for now but I will switch to water in a near future.
> *


Wish my card on water could do those clocks... If temps are in check, I'm sure you can run that 24/7.


----------



## zpaf

Hitman Absolution @ 1080p maxed out.



My reference card needs water to hit 1300 in Hitman.


----------



## vieuxchnock

Quote:


> Originally Posted by *bond32*
> 
> Wish my card on water could do those clocks... If temps are in check im sure you can run that 24/7


*According to AB, the max temp was 75. Not so bad!!!*


----------



## bond32

Quote:


> Originally Posted by *vieuxchnock*
> 
> *According to AB, the max temp was 75. Not so bad!!!*


I would be more concerned with VRM temps. I imagine if your core was 75, the VRMs won't be far behind, but just keep an eye on it to be sure.


----------



## psyside

Does this look good to you guys? Just did this.


----------



## INCREDIBLEHULK

Does anyone have a Sapphire 290X?
I can't get MSI AB to undervolt the card even with the voltage unlocked.

Don't get me wrong, it's passing stress tests at 1.000-1.016v at 925/1250MHz. I just want to see what's the highest clock I can get out of the card at 0.950v or so.

Either way, the mV value in MSI AB just resets when I try a negative offset.


----------



## lethal343

Boys boys boys.. and girls...

How has everybody been?! I had 2x R9 290Xs on release and returned them via RAGE to obtain store credit due to the intermittent BLACK SCREEN problem.

I didn't want to return them because the cards are BEAUTIFUL in PERFORMANCE....

Anyways....

I've decided to go for 3x non-reference R9 290s (non-X).

Could someone PLEASE tell me if the black screen issues have been fixed?!

What's going on with the 290 chipsets?
Would 3x of these cards deliver outstanding performance?

I am very interested, please let me know....

http://www.gigabyte.com.au/products/product-page.aspx?pid=4884#ov

GV-R929OC-4GD


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *lethal343*
> 
> Boys boys boys.. and girls...
> 
> how has everybody been?!?!. I had 2x R9 290X's on release and returned via RAGE to obtain store credit due to the intermittent BLACK SCREEN problem .
> 
> I didn't want to return them because the cards are BEAUTIFUL in PERFORMANCE....
> 
> anyways....
> 
> ive decided to go for 3X non-reference r9 290's (non-x)
> 
> Could someone PLEASE tell me if the blackscreen issues have been fixed?!
> 
> whats going on with the 290 chipsets
> would 3X of these cards deliver outstanding performance?
> 
> I am very interested please let me know....
> 
> http://www.gigabyte.com.au/products/product-page.aspx?pid=4884#ov
> 
> GV-R929OC-4GD


Was the black screen from BIOS flashing? Or just a random black screen?


----------



## bond32

The black screen issue has not been fixed... The only thing we really know for sure is that it is 99.9% caused by Elpida memory. If you get black screens at stock clocks, you might try dropping the memory clock and see if that helps.


----------



## lethal343

REALLY??!!
The black screen problem hasn't been fixed yet?!.... Man, AMD are hopeless..

The black screen wouldn't happen at idle; it would happen during some specific intense games...
From testing I found that the cards were most prone to black screens when switching from 2D to 3D apps.

Damn....

Ok, well, do you guys reckon that 3x of those cards are a powerful option?!

Much stronger than 2x R9 290Xs??

Cheers...


----------



## jerrolds

Quote:


> Originally Posted by *bond32*
> 
> Wish my card on water could do those clocks... If temps are in check im sure you can run that 24/7


You can't hit 1150 under water? I thought it was a 120Hz issue you were having?


----------



## psyside

Quote:


> Originally Posted by *lethal343*
> 
> Boys boys boys.. and girls...
> 
> how has everybody been?!?!. I had 2x R9 290X's on release and returned via RAGE to obtain store credit due to the intermittent BLACK SCREEN problem .
> 
> I didn't want to return them because the cards are BEAUTIFUL in PERFORMANCE....
> 
> anyways....
> 
> ive decided to go for 3X non-reference r9 290's (non-x)
> 
> Could someone PLEASE tell me if the blackscreen issues have been fixed?!
> 
> whats going on with the 290 chipsets
> would 3X of these cards deliver outstanding performance?
> 
> I am very interested please let me know....
> 
> http://www.gigabyte.com.au/products/product-page.aspx?pid=4884#ov
> 
> GV-R929OC-4GD


No black screens on any Tri-X without an OC; the only black screen I saw was with an OC, from memory instability. Dunno about Gigabyte, but the Tri-X IMHO is by far the best series for now. They also give you guaranteed Hynix chips.


----------



## bond32

Quote:


> Originally Posted by *jerrolds*
> 
> You cant hit 1150 underwater? I thought it was a 120hz issue you were having?


I can do 1150 ok; it's the memory clocks that are my issue. Anything over 1325 causes black screens. I can do 1200 stable on the core. And the refresh rate was a problem, but with the corrected CRU values it doesn't seem to be a problem anymore.


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> Does anyone have a Sapphire 290x ?
> I can't get MSI-AB to undervolt the card even with the voltage unlocked.
> 
> Don't get me wrong, It's doing stress tests at v1.000-v1.016 925/1250mhz I just want to try and see whats the most clock I can get out of the card at v0.950 or so.
> 
> Eitherway, mV on msi-ab just resets when I try negative for mV


Anyone? Or is BIOS editing the only way to actually undervolt?


----------



## devilhead

2x Sapphire 290X, stock BIOS, +200mv, FIRE STRIKE EXTREME: http://www.3dmark.com/fs/1559910


----------



## bayz11

Quote:


> Originally Posted by *bond32*
> 
> I can do 1150 ok, its the memory clocks that are my issue. Anything over 1325 causes black screens. I can do 1200 stable on core. And the refresh rate was a problem but with the corrected CRU values doesn't seem to be a problem anymore.


Can you post your CRU settings for 120Hz at your 1200 core clock please? My QNIX can only do 120Hz at an 1100 core clock; more than that causes artifacts.

And I found out that it seems like my Sapphire 290 can only do a 120Hz refresh rate at stock voltage. But funny enough, I can run 96Hz at a 1200 core clock.

Thank you


----------



## ZealotKi11er

I have a small problem. I use my GPU for mining, but when I game I close the mining program. After I finish playing the game, CGMiner is very slow. It does not set the GPU to 3D mode and tries to mine with 2D memory clocks for some reason. I have to restart my PC every time because of this. Anyone else have this problem?


----------



## Roy360

Just got my XFX card back from RMA. Been mining for 1 day straight with no black screens. Very happy

Just finished running the Heaven benchmark and got a score of 2437 at stock settings; no overclocking until this guy is under water

Might as well ask this now: for those who are using the EK full-cover water blocks, are you having any problems with cooling VRM1? If so, how did you resolve it?


----------



## Heinz68

I made a preorder for two SAP RADEON R9 290X TRI-X OC 4GB cards for $622.06 each at ShopBLT.
The same card is listed at Newegg for $699.99.

I don't need 2 cards for gaming till I upgrade to 4K UHD, but in the meantime they're going to be good for mining


----------



## the9quad

Quote:


> Originally Posted by *devilhead*
> 
> 2x Sapphire 290X Stock bios, +200mv FIRE STRIKE EXTREME: http://www.3dmark.com/fs/1559910


nice job


----------



## psyside

2 of my friends use 2x 290 Tri-X for mining; they say they are epic


----------



## Heinz68

Quote:


> Originally Posted by *psyside*
> 
> 2 of my friends use 2x 290 Tri x for minning, they say they are epic


Nice to hear







I just placed the order for them; I should get them early next week.

EDIT
Anybody have a spare Battlefield 4 for sale?


----------



## Kenshiro 26

Please change me over to


----------



## rdr09

Quote:


> Originally Posted by *Roy360*
> 
> Just got back my XFX card from RMA. Been mining for 1 day straight with no black screens,. Very happy
> 
> Just finished running Heaven's Benchmark got a score of 2437 on stock settings, no overclocking until this guy is under water
> 
> Might as well ask this now, but for those who are using the EK full water blocks. Are you having any problems with cooling VRM1? If so how did you resolve it?


yes, it's too cold . . .



so, played BF4.









edit: my VRM1 is always lower than VRM2, and VRM2 is always lower than the core, load or no load.


----------



## Tobiman

Anyone know if ShopBLT sticks to their pre-order price when the item is shipped, and do they actually wait until it's shipped before charging me?


----------



## Arizonian

Just a heads up. My SSD died, it seems. I won't be up and running for a week at best and won't be able to update the list until then. I'll be keeping tabs on new members in the meantime.

In the meantime if anyone has any suggestions....

http://www.overclock.net/t/1460051/windows-wont-boot-and-when-it-does-its-frozen-help#post_21607082


----------



## 2advanced

Quote:


> Originally Posted by *Kenshiro 26*
> 
> Please change me over to
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


So pretty!







+rep


----------



## Sazz

Quote:


> Originally Posted by *Kenshiro 26*
> 
> Please change me over to


Is it just me, or are the top fans in the "intake" position?

Those are Silverstone Air Penetrators, right?


----------



## Paulenski

Quote:


> Originally Posted by *bond32*
> 
> I can do 1150 ok, its the memory clocks that are my issue. Anything over 1325 causes black screens. I can do 1200 stable on core. And the refresh rate was a problem but with the corrected CRU values doesn't seem to be a problem anymore.


You need to increase the AUX voltage in AB to get higher memory clocks if it's stalling on you. I couldn't go above 1500 on my Tri-X without a 50mv bump in AUX; it maxes at 1690 and is somewhat stable at 1700. It works nicely.


----------



## Paulenski

Quote:


> Originally Posted by *psyside*
> 
> Does this look good to you guys? just done this.


Looks good to me, this was my best at 1202/1500


----------



## Paulenski

Quote:


> Originally Posted by *psyside*
> 
> No idea really, can you do a 3D Mark 11 extreme @1150/1600 for me please? thanks.


I actually don't own 3DMark 11, and the demo version only offers 720p testing. I have the latest 3DMark: Ice Storm/Cloud Gate/Fire Strike.


----------



## zpaf

Can anyone with a Sapphire Radeon R9 290 Tri-X share their BIOS?


----------



## psyside

Quote:


> Originally Posted by *Paulenski*
> 
> Looks good to me, this was my best at 1202/1500


You got a 290X or a 290? You got a higher score than me at almost the same settings. Does the CPU make a difference, or are my clocks unstable so I get a decrease in fps?


----------



## Sazz

Quote:


> Originally Posted by *psyside*
> 
> You got 290X or 290? you got higher score then me, at almost the same settings. Does cpu makes difference, or my clocks are unstable so i get decrase in fps?


Unigine scores do get affected by the CPU and CPU clocks more than other benchmarks. If you guys have the same CPU, maybe you're running at different clocks?


----------



## psyside

Quote:


> Originally Posted by *Sazz*
> 
> Unigine scores do get affected by CPU and CPU clocks more than other benchmarks. If you guys have same CPU maybe you guys are running at different clocks?


I just saw some tests: 0 difference going from a 1-core i7 to 4 cores + HT.


----------



## Paulenski

Quote:


> Originally Posted by *psyside*
> 
> You got 290X or 290? you got higher score then me, at almost the same settings. Does cpu makes difference, or my clocks are unstable so i get decrase in fps?


290X, 3770k @ 4.3Ghz boost


----------



## psyside

Ah, that explains the difference, so my scores are spot on.

What is the most efficient stability test for Hawaii? I've got no time to bench a million of them; a few are OK, but 10 different ones, each at different settings, is very boring.


----------



## BlackPH

OCCT 4.4.0 GPU test with the "error checking" option. 95% of the OC results people post here are not 100% rock solid. Try 1-2 minutes, it will be enough, and don't be afraid: the power-limit feature will protect your power circuitry. But watch temps carefully.


----------



## psyside

AFAIK OCCT/Furmark are not good for gaming OC tests, only for power delivery/temp testing?


----------



## BlackPH

No, the "error check" option makes the difference. But some people prefer to take the risk. Anyway, a GPU is not a CPU, and if there are errors it will not always crash the app. It may show up as flickering, wrong textures, etc. What I'm trying to say is that GPU errors are usually much less fatal.


----------



## psyside

So let's say I bench or game Crysis: if I see 2-3 flickers here and there during 30 mins, that means the card is unstable, I know. But do you always lose fps, or is it just the flicker?

Sorry, I've got no experience with Hawaii GPUs.


----------



## sugarhell

Flickers = memory errors


----------



## Kenshiro 26

Quote:


> Originally Posted by *Sazz*
> 
> is it just me, or the top fans are in "intake" position?
> 
> those are silverstone air penetrators right?


Yes, and yes.


----------



## Tobiman

Quote:


> Originally Posted by *Heinz68*
> 
> I made a preorder for two SAP RADEON R9 290X TRI-X OC 4GB for $622.06 each at ShopBLT
> Same card is listed at Newegg $699.99
> 
> I don't need for gaming 2 cards till I upgrade to 4K UHD, but in the meantime it's going to be good for mining


Do you have any earlier experience with shopBLT?


----------



## Heinz68

Quote:


> Originally Posted by *Tobiman*
> 
> Do you have any earlier experience with shopBLT?


Sorry, I don't, but I remember at least one member on this forum said they're OK. See PM.


----------



## Redeemer

Guys, how's the MSI Gaming 290X?


----------



## cyenz

Please add me back to the owners club; I sold my 780 Ti and bought two 290Xs. Both on stock cooling (jet engines ftw) for now. One Gigabyte and one Asus.


----------



## rdr09

Quote:


> Originally Posted by *the9quad*
> 
> nice job


^this. good job, devilhead.
Quote:


> Originally Posted by *Kenshiro 26*
> 
> Please change me over to


almost missed this beauty.


----------



## Kuat

deleted.


----------



## psyside

LOL BF4 is such a good stability tester.


----------



## KyGuy

Hey guys, I have had some severe lagging on YouTube lately using Google Chrome. I doubt it is something with the card, because it played fine for over 2 months and then just started lagging yesterday. Could it be a Flash issue? Running Catalyst 13.12, and I've had no black screens. I have noticed that the core clock speeds do not go above 450MHz. Could the card be stuck in 2D clocks? It runs games fine, by the way.....KG


----------



## ZealotKi11er

Quote:


> Originally Posted by *KyGuy*
> 
> Hey Guys, I have had some severe lagging in youtube lately using Google Chrome. Doubt it is something with the card, because it has played fine for over 2 months, and then it just started lagging yesterday. Could it be a flash issue? Running Catalyst 13.12, and have had no black screens. I have noticed that the core clock speeds do not go above 450mhz. Could it be the card is stuck in 2D Clocks. Runs games fine by the way.....KG


Same thing here. It's just YouTube.


----------



## PillarOfAutumn

Someone with Crossfire 290s: what is the total wattage used by your system, overclocked and stock? I have the Seasonic X-750 Gold, and on it I'm running a CPU OCed to 4.3GHz and the Sapphire 290 OCed to 1190/1480 and overvolted +87mV. The maximum spike I've seen while playing BF4 is 430W; otherwise it usually stays near 380W. I'm wondering how well the PSU will handle 2x 290 not overclocked. Thanks.


----------



## ebduncan

You should be fine on a good 700 watt PSU. It will be close with 290/X crossfire.

I've been tinkering with my card some more. Max stable clocks on my XFX Black Edition R9 290 are 1165 core / 1500 mem, with Hynix memory. The fan is on a custom profile up to 70%, and the card stays under 90C; this is with +100mV and +50% PowerTune. I can't push higher on the core without getting artifacts in games and benches. I'll have to repeat the testing when the water block gets here. I could probably go higher if I maxed out the fan speed and applied better thermal paste, but I don't want to take the time when I have a waterblock on the way and would have to redo it anyway.

Overall I'm quite happy with the results so far: a 19% overclock on the core and 20% on the memory. I'm sure the memory can go higher; I just stopped testing at 1500. I was working in 25MHz steps and re-benching to see if the score goes up or down. It is rather time-consuming to test each clock 3 times, average the scores, then test the next clock.

Seems from what I have seen my card is a decent overclocker compared to the other results. ASIC is 73.9%. Hopefully under water it unleashes a whole new beast like my 7950's did.
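That step-and-average search can be sketched as a small loop. This is only an illustration of the method described above; `run_benchmark` is a hypothetical stand-in for whatever benchmark you actually run by hand:

```python
from statistics import mean

def find_best_clock(clocks, run_benchmark, runs=3):
    """Walk clocks in ascending order, benchmarking each `runs` times.

    Keeps the clock with the best average score and stops once the
    average drops, since on Hawaii a falling score usually means error
    correction is kicking in even though nothing visibly crashes.
    """
    best_clock, best_score = None, float("-inf")
    for clock in clocks:
        avg = mean(run_benchmark(clock) for _ in range(runs))
        if avg < best_score:  # score regressed: past the sweet spot
            break
        best_clock, best_score = clock, avg
    return best_clock, best_score

# Toy stand-in for a real bench: scores rise to 1500 MHz, then regress.
scores = {1450: 100.0, 1475: 101.5, 1500: 102.0, 1525: 99.0}
clock, score = find_best_clock(sorted(scores), lambda c: scores[c])
print(clock, score)  # -> 1500 102.0
```

Stopping at the first score regression matters here because an unstable clock often doesn't crash anything; the score just quietly drops instead.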


----------



## brazilianloser

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Someone with a Crossfire 290, what is the total wattage used by your system overclocked and stock? I have the Seasonic X-750 gold and on it I have running a CPU OCed to 4.3 GHz, and the Sapphire 290 OCed to 1190/1480 and overvolted to +87. The maximum spike I've seen while playing BF4 is 430, otherwise its usually stayed near 380. I'm wondering how well will the PSU be able to handle 2x 290 not overclocked. Thanks.




During stress with oc to 1100/1250 and cpu @ 4.5. 1 ssd, 1 hdd, 6 fans...


----------



## KyGuy

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Same thing here. Its just Youtube.


I thought that's what it was. Maybe it was an automatic Flash Player update. Who knows. Everything else on my 290X is fine...


----------



## ZealotKi11er

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I have a small problem. I use my GPU for mining but when i game i run close mining program. After i finish playing the game CGMiner is very slow. It does not set the GPU to 3D mode and tries to mine with 2D memory clocks for some reason. I have to restart my PC every time because of this. Anyone have this problem?


----------



## bond32

Quote:


> Originally Posted by *ZealotKi11er*


I have this problem too, I think. Or at least my 290X gets around 500-600 khash. I looked at memory usage and it's only using 2GB. No idea what to do...


----------



## ebduncan

Quote:


> Originally Posted by *bond32*
> 
> I have this problem too I think. Or at least with my 290x it gets around 500-600 khash. I look at memory usage and its only using 2gb. No idea what to do...


Your settings are wrong. I get 880 kH/s on a 290 (WU is 820). Settings: PowerTune -20, GPU clock 1050, memory 1400, thread concurrency 24450, worksize 512, intensity 20.

I can start and stop CGMiner at any time and resume as I wish. Some people seem to have trouble if they quit CGMiner using the Q command; I've always just closed the command prompt by clicking the X to close the window.
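For what it's worth, those settings map onto cgminer's command-line flags roughly like this. This is only a sketch based on cgminer 3.7.x-era scrypt options; the pool URL and worker credentials are placeholders, not real endpoints:

```shell
# Hypothetical cgminer invocation mirroring the settings above.
# Pool URL and worker credentials are placeholders.
cgminer --scrypt \
  --url stratum+tcp://pool.example.com:3333 \
  --userpass worker.1:password \
  --gpu-powertune -20 \
  --gpu-engine 1050 \
  --gpu-memclock 1400 \
  --thread-concurrency 24450 \
  --worksize 512 \
  --intensity 20
```

The same values can also go in `cgminer.conf` if you'd rather not type them every time.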


----------



## ZealotKi11er

Quote:


> Originally Posted by *bond32*
> 
> I have this problem too I think. Or at least with my 290x it gets around 500-600 khash. I look at memory usage and its only using 2gb. No idea what to do...


It gets 30 kH/s, not 600 kH/s.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *brazilianloser*
> 
> 
> 
> During stress with oc to 1100/1250 and cpu @ 4.5. 1 ssd, 1 hdd, 6 fans...


Thanks a lot! Can you please do a stress test without overclocking the 290? Do you think non-OCed 290s in CrossFire are enough to run games like BF4 at 1440p at 120+ FPS, or at 4K at 60+ FPS?


----------



## Loktar Ogar

Quote:


> Originally Posted by *vieuxchnock*
> 
> *I tried that this afternoon:
> 
> 
> 
> Do you think I can keep those settings 24/7?
> I'm on air for now but I will switch to water in a near future.
> *


We have the same clocks but mine has lower Volts (+69mV) to keep under 1.3v!









Going to 1160 on the core requires more voltage to avoid any artifacts in Valley, and I think this is my sweet spot for gaming.


----------



## vieuxchnock

Quote:


> Originally Posted by *Loktar Ogar*
> 
> We have the same clocks but mine has lower Volts (+69mV) to keep under 1.3v!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Going 1160 in core clocks require more voltage to avoid any artifacts in Valley and i think this is my sweet spot in gaming.


*I checked in GPU-Z and it never goes over 1.25v. I never touch 1.3v, so I think I will stay like that as it runs smoothly.*


----------



## battleaxe

Weird. After 2 weeks of running 100%, my core and VRM temps have gone down. The core started at about 52C and is now sitting at 44C. Under water, obviously. I guess the TIM must have seated in or something? Strange. Never seen a change that far apart.


----------



## Caanon

Quote:


> Originally Posted by *battleaxe*
> 
> Weird. After 2 weeks of running 100% my core temps and VRM temps have gone down. Starting at about 52c core now sitting at 44c on the core. Under water obviously. I guess the TIM must have seated in or something? Strange. Never seen a change that far apart.


Change in ambient maybe?


----------



## battleaxe

I guess it's possible. But my furnace is set to the same setting as last week when it was 52C. IDK.

I did move some fans around. Maybe the block shifted slightly and settled in a bit closer to the GPU.


----------



## Loktar Ogar

Quote:


> Originally Posted by *vieuxchnock*
> 
> *I checked in GPUZ and it never goes over 1.25v. I never touch 1.3v, so I think I will stay like that as it runs smoothly.*


Also, check your Max Volts value. My alternate setting is Core Clock 1160 (+mV81), Mem Clock 1500.
My target is Below 1.3v Max and VRM1 below 90c Max.

It's still your preference by the end of the day.


----------



## ZealotKi11er

My VRM1 temps are still horrible even after adding some thermal paste. Still getting 74C with only 50mV added. Can't imagine if I do 200mV. This is under water, which really sucks.


----------



## ebduncan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> My VRM1 temps are still horrible even after adding some thermal paste. Still getting 74C with only 50mV added. Cant image if i do 200mV. This is under water which really sucks.


What water block are you running? Those temps are high for under water. Did you use thermal paste on the VRMs as well as the pads, like in the EK instructions?


----------



## ZealotKi11er

Quote:


> Originally Posted by *ebduncan*
> 
> what water block are you running? those temps are high for under water. Did you use thermal paste on the vrm as well as the pads? like in the EK instructions


It's the EK block. I first used just the 1mm thermal pad, which gave me about 78C VRM1 temps. I then added thermal paste on VRM1 only, then placed the pad and then the block, so VRM1 - paste - pad - block.


----------



## Mr357

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Its the EK block. I just used the 1 mm Thermal Pad first. That gave me about 78C VRM1 temps. I then added thermal paste on the VRM1 only then placed the pad and then the block so VRM1-Paste-Pad-Block.


Strange, I'm using the EK block as well but didn't put any paste on the pads. Despite that, I still haven't seen higher than 70C on the VRMs, and that was with 1.4V (more like 1.3-1.35V) set in the PT1 BIOS, running Valley for about 30 minutes straight. My fans were only at about 50% too.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Mr357*
> 
> Strange, I'm using the EK block as well, but didn't put any paste on the pads. Despite that, I still have seen no higher than 70C on the VRM's, and that was with 1.4V (more like 1.3 - 1.35V) set in the PT1 BIOS, running Valley for about 30 minutes straight. My fans were only at about 50% too.


Well, I have not been able to game or bench the card. I am just mining. Have you tried to mine?


----------



## psyside

Guys, avoid using aux voltage; from my experience on 2 cards, it increases VRM temps A LOT.


----------



## Jack Mac

That's very odd, and higher temps than I get on stock air with +65mV (more volts) in 22C ambient temps.


----------



## ZealotKi11er

Quote:


> Originally Posted by *psyside*
> 
> Guys, avoid using aux voltage, from my experience on 2 cards, it increases vrm temps ALOT.


I have that at stock.


----------



## Loktar Ogar

Quote:


> Originally Posted by *psyside*
> 
> Guys, avoid using aux voltage, from my experience on 2 cards, it increases vrm temps ALOT.


I agree, especially on VRM2. I've noticed VRM1 temps are affected most when the core voltage (mV) increases, but I'm sure you guys already know about this.


----------



## Derpinheimer

Quote:


> Originally Posted by *psyside*
> 
> Guys, avoid using aux voltage, from my experience on 2 cards, it increases vrm temps ALOT.


It doesn't seem to affect mine much? Only VRM2.


----------



## vieuxchnock

Quote:


> Originally Posted by *Loktar Ogar*
> 
> Also, check your Max Volts value. My alternate setting is Core Clock 1160 (+mV81), Mem Clock 1500.
> My target is Below 1.3v Max and VRM1 below 90c Max.
> 
> It's still your preference by the end of the day.


I checked the max volts and it went to 1.305V, so I lowered the core voltage to +69mV; now the max is 1.289V and the VRMs are 68C max while the core is at 80C.


----------



## Paulenski

Quote:


> Originally Posted by *psyside*
> 
> Guys, avoid using aux voltage, from my experience on 2 cards, it increases vrm temps ALOT.


For me it only increased VRM2 by 2C when I used +50mV.

Sent from my SGH-M919 using Tapatalk


----------



## Arizonian

Quote:


> Originally Posted by *Kenshiro 26*
> 
> Please change me over to
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Nice work - congrats updated









Quote:


> Originally Posted by *cyenz*
> 
> Please add me back to owners club, sold my 780ti and bought two 290x. Both at stock cooling (jet engines ftw) for now. One Gigabyte and one Asus.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - re-added


----------



## cyenz

Quote:


> Originally Posted by *Arizonian*
> 
> Nice work - congrats updated
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - re-added


Thanks! Just missing the "X" on 290


----------



## GenoOCAU

Just played BF4 for 2 hours straight: VRM1 max was 50 deg C,
VRM2 - 38 deg.

Currently going through a heatwave here in Western Australia too, running EK blocks - not sure how people are getting such high temps on their VRMs under water, given the climate in the US atm.


----------



## Derpinheimer

If you mean me, my VRMs are air cooled (and vrm2 is basically not cooled at all).

Plus that is OCCT, so high power consumption.


----------



## rdr09

Quote:


> Originally Posted by *GenoOCAU*
> 
> Just played BF4 for 2 hours straight, VRM 1 - Max was 50 deg C
> VRM 2 - 38 Deg
> 
> Currently going through a heatwave here in Western Australia too, running EK blocks - not sure how people are getting such high temps on their VRMs under water given the climate in US atm.


I am only cooling one 290. My VRM1 is about 3C cooler than VRM2 and about 5C cooler than the core. I used CLU for the core and Fujipoly for the VRMs.


----------



## IBIubbleTea

Is this normal..? When I apply my memory clock, I get these horizontal lines for like a second. Even if I set it to 1251, I get the lines and when I set it back to 1250 I still get the lines when I apply the clocks... Core clocks are 947 and power limit is 50.

My card is ASUS R9 290 with Hynix memory.


----------



## Jack Mac

Quote:


> Originally Posted by *IBIubbleTea*
> 
> Is this normal..? When I apply my memory clock, I get these horizontal lines for like a second. Even if I set it to 1251, I get the lines and when I set it back to 1250 I still get the lines when I apply the clocks... Core clocks are 947 and power limit is 50.
> 
> My card is ASUS R9 290 with Hynix memory.


I get them too, don't worry.


----------



## IBIubbleTea

Quote:


> Originally Posted by *Jack Mac*
> 
> I get them too, don't worry.


ohh okay.

What should I be looking for when I overclock my memory?


----------



## GenoOCAU

Quote:


> Originally Posted by *rdr09*
> 
> i am only cooling one 290. my vrm1 is about 3C cooler than vrm2 and about 5C cooler than the core. i used CLU for the core and Fujipoly for the vrms.


I literally used everything EK supplied, their paste and pads. Not sure why there is such a variance between my VRM temps. They really don't get hot enough for me to investigate further.


----------



## BradleyW

Sorry to butt in, could someone tell me if my score is on target?
3D Mark 11
Performance Preset
GPU's Stock
CPU 4.5GHz HT

Graphics Score = 28,539

GT1 = 128
GT2 = 146
GT3 = 187
GT4 = 82

Physics = 13,400
Combined = 10,500

Thank you.


----------



## Loktar Ogar

@Arizonian

Please add me up: Sapphire R9 290 TRI-X 4GB GDDR5 OC

http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&lid=1&pid=2091&leg=0

GPU-Z Validation:









http://www.techpowerup.com/gpuz/2vu3e/


----------



## Sazz

I just got my H80i today that I was planning to use for the 290X mod, but the block part was too big. Any suggestions on an AIO with a smaller block footprint than the H80i?

Something similar in size to the Accelero Hybrid - or if anyone knows where I can get an Accelero Hybrid.

And if anyone is interested in an H80i, PM me as well xD


----------



## Jack Mac

Quote:


> Originally Posted by *IBIubbleTea*
> 
> ohh okay.
> 
> What should I be looking for when I overclock my memory?


When testing stability, look for artifacts and loss of performance; that'll tell you if you're unstable.


----------



## Paulenski

Quote:


> Originally Posted by *Loktar Ogar*
> 
> @Arizonian
> 
> Please add me up: Sapphire R9 290 TRI-X 4GB GDDR5 OC
> 
> http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&lid=1&pid=2091&leg=0
> 
> GPU-Z Validation:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/2vu3e/


Congrats on the Tri-X card, they're pretty badass. I'm loving mine


----------



## Paulenski

Quote:


> Originally Posted by *IBIubbleTea*
> 
> ohh okay.
> 
> What should I be looking for when I overclock my memory?


Here's what those lines mean in my experience: you're approaching your memory overclock limit on stock aux voltage. The lines are related to memory instability. I use 1690 as my stable max memory clock because it causes no lines, unlike 1695/1700. Literally 1MHz above 1700 will cause a driver crash in the Valley/Heaven bench.


----------



## zpaf

Just playing with the BIOS from Sapphire.


SAPPHIRE R9 290 4GB GDDR5 BATTLEFIELD 4 EDITION

With [email protected] this is my best score on Fire Strike @ 1200/1600.


http://www.3dmark.com/3dm/2246371


----------



## kizwan

@BradleyW, disabling HT on my quad core caused high CPU usage (in the 90s %) in BF4. However, my internet connection is acting up (ping fluctuating); it lags/stutters pretty badly, but the FPS doesn't show any sign of a CPU bottleneck. Also, disabling C1E, C3, C6 & C7 may help lower CPU usage a little in BF4. It seems smoother too.

I just watched a YouTube video showing a 3820 @stock running BF4 @Ultra. The CPU doesn't look like a bottleneck @stock though; the FPS range is good. Going to try this later.

*[EDIT]*
Reset to stock clocks (3.6/3.8GHz). Everything stock, including RAM running at DDR3-1600. This is my CPU usage in BF4 at 1080p Ultra & 100% resolution scale. My internet connection is still pretty bad, so I can't tell for sure whether the CPU is a bottleneck or not. I wonder why I overclock at all. I think I'd better get faster memory, because if I'm not mistaken BF4 performs better with faster RAM.


----------



## dansi

This is strange.

I can reach 1100/1400 with +70mV, benched stable without artifacts and without clock throttling.
But I got higher 3DMark11 scores at 1050/1400.

I know recent Radeons have memory ECC: when your VRAM OC goes past the point of stability it will not crash, but your performance takes a hit.

This is the first time I've seen it happen with a CORE OC.


----------



## Arizonian

Quote:


> Originally Posted by *cyenz*
> 
> Thanks! Just missing the "X" on 290


Whoops - fully enabled update now









Quote:


> Originally Posted by *Loktar Ogar*
> 
> @Arizonian
> 
> Please add me up: Sapphire R9 290 TRI-X 4GB GDDR5 OC
> 
> http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&lid=1&pid=2091&leg=0
> 
> GPU-Z Validation:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://www.techpowerup.com/gpuz/2vu3e/


Congrats - added









Tri-X is sweet enjoy


----------



## PillarOfAutumn

Quote:


> Originally Posted by *dansi*
> 
> This is strange.
> 
> i can reach 1100/1400 with +70mv benched stable without artifacts without clock throttling.
> But i got higher 3dmark11 scores at 1050/1400.
> 
> I know recent radeons have Memory ECC when your VRAM oc past the point of stability and while it will not crashed, your performance takes a hit.
> 
> This is first time i seen it happen with CORE oc.


Similar thing happened to me. Open up CCC, and in the performance tab make sure that "Power Usage" or something like that is set to its max. I believe the max is 50%. Just push the slider on the x axis all the way to the right. After doing that, then bench it.


----------



## dansi

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Similar thing happened to me. Open up CCC and in the performance tab, make sure that "Power Usage" or something like that is set to it's max. I believe the max is 50%. Just push the slider on the x axis all the wah to the right. After doing that, then bench it.


Changed the power limit in CCC and AB. It doesn't seem to affect the performance; I got a higher overall score, but the individual tests' FPS is lower.


----------



## Marvin82

Quote:


> Originally Posted by *zpaf*
> 
> Anyone with Sapphire Radeon R9 290 Tri-X can share bios ?


It's me and my Tri-X OC uber BIOS:
http://www.hardwareluxx.de/community/f305/amd-radeon-r9-290x-hawaii-xt-sammelthread-faq-bei-fragen-startpost-lesen-985505-33.html#post21702358

Edit..
Sorry, you need the 290 Tri-X BIOS.
Mine is the 290X Tri-X OC BIOS.


----------



## Marvin82

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Its the EK block. I just used the 1 mm Thermal Pad first. That gave me about 78C VRM1 temps. I then added thermal paste on the VRM1 only then placed the pad and then the block so VRM1-Paste-Pad-Block.


You must use 2 pads: 1mm on the voltage regulators and 0.5mm on the memory.


----------



## ottoore

Quote:


> Originally Posted by *rdr09*
> 
> i used CLU for the core and Fujipoly for the vrms.


I'll do the same. One piece of advice needed: did you take precautions for the electrical contacts around the core?
I mean these



Thank you.


----------



## hotrod717

Any word on when the 290X Lightning will be available???


----------



## 00Smurf

Quote:


> Originally Posted by *Sazz*
> 
> I just got my H80i today that I was planning to use for the 290X mod, but the block part was too big, any suggestions on which AIO to use that has a smaller block footprint than H80i?
> 
> Something similar to the size of the Accelero Hybrid, or if anyone knows where I can get a accelero hybrid.
> 
> And if anyone interested on an H80i pm me as well xD


You can use the H80, it just takes a different approach to mounting.
I prefer the H55 myself (and it's only $54.59 at Microcenter).
I have 5 290Xs equipped with the H55; it works wonders.


----------



## 00Smurf

Anyone able to get Afterburner to go over 1625MHz on the memory?
I know I could use GPU Tweak, but I prefer Afterburner and hope there is a mod.
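There was historically an "unofficial overclocking" switch in Afterburner's config file that lifts the slider limits on AMD cards. The excerpt below is a sketch from memory of Afterburner 3.x: the section name and keys are assumptions, the EULA string must exactly match the one your version documents in its bundled readme, and you should back up `MSIAfterburner.cfg` before touching it:

```
; MSIAfterburner.cfg - hypothetical excerpt for AMD driver-level access
[ATIADLHAL]
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 1
```

Afterburner needs a restart after editing the file, and unofficial mode bypasses the driver's sanity limits, so step clocks up carefully.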


----------



## Sazz

Quote:


> Originally Posted by *00Smurf*
> 
> you can use the h80, just takes a different approach to mounting.
> I prefer the h55 myself (nd its only $54.59 at microcenter.
> I have 5 290x's equipped with the h55, it works wonders.


Yeah, the H55 is slimmer than the H80. I got a buyer for the H80i anyway, so once I get rid of that I'm gonna get the H55. Do you have an installation thread for that? I wanna see how you did it so I can implement it on mine.


----------



## trihy

Any VRM heatsink solution available?


----------



## rdr09

Quote:


> Originally Posted by *ottoore*
> 
> I'll do the same. An advice: did you take precautions for electrical contacts around the core?
> I mean these
> 
> 
> 
> Thank you.


Use painter's or electrical tape to cover the perimeter of the core (the areas you highlighted). Just apply a very thin layer (spread it), so the paste doesn't spill over after installing the block.

I watched this . . .


----------



## Gero2013

Just tried to OC my XFX 290X to 1180/1250 with +200mV in Valley, but it's giving me artifacts and then crashes.
Any ideas how I could get it higher?

This is on the reference cooler but with a very aggressive fan profile; max temp was 85C.


----------



## Arizonian

Quote:


> Originally Posted by *hotrod717*
> 
> Any word on when the 290X Lightning will be available???


No official word. Buzz from CES 2014 was no later than Feb time frame.


----------



## SultanOfWalmart

Those of you who use HWMonitor...does yours report VRM temps? Mine only seems to show GPU temp.


----------



## aaroc

Quote:


> Originally Posted by *SultanOfWalmart*
> 
> Those of you who use HWMonitor...does yours report VRM temps? Mine only seems to show GPU temp.


For the first GPU, yes. If you have ULPS active (the second GPU and above shut down when not in use), HWiNFO does not touch those cards, to avoid problems. You have to disable ULPS, or have all GPUs in use (e.g. running GPU-Z) when starting HWiNFO, for it to show the data. Hope it helps.
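Disabling ULPS is usually done in the registry. The command below is a sketch assuming Catalyst-era drivers: the instance number (0000, 0001, ...) varies per system, there is one key per display adapter, and you should only touch instances that already contain an `EnableUlps` value. Back up the registry first:

```shell
:: Hypothetical example for adapter instance 0001 (elevated prompt).
:: Repeat for each AMD adapter instance that already has EnableUlps.
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0001" /v EnableUlps /t REG_DWORD /d 0 /f
```

A reboot is needed for the change to take effect, and a driver reinstall may reset it.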


----------



## ottoore

Quote:


> Originally Posted by *rdr09*
> 
> use painter's or electrical tape to cover the perimeter (areas you highlighted) of the core. just apply a very thin layer (spread), so as to avoid the paste from spilling over after installing the block.
> 
> i watched this . . .


Thank you, I watched that video. I'm worried about the spread caused by the pressure of the heatsink.

Did you use classic electrical tape? Is it not going to liquefy at the temperature of the core?


----------



## 00Smurf

Quote:


> Originally Posted by *Sazz*
> 
> Yeah the H55 is slimmer than the H80, I got a buyer for the H80i anyways so once I get rid of that imma get the H55, do you have any installation thread for that? I wanna see how you did it to implement it on mine.


No install thread, but I can write you up a quick DIY; it takes all of 10-15 mins. Here are some pics of mine in the meantime.


----------



## rdr09

Quote:


> Originally Posted by *ottoore*
> 
> Thank you i watched that video. I'm worried about the spread caused by the pressure of the heatsink.
> 
> Do you use classic electrical tape? Ii it not going to liquefy with the temperature of the core?


Yes, I used ordinary electrical tape. Not sure if it liquefies, but I know it works great. I may have to put some on my CPUs, because it is really easy to apply. I used the brush that came with the kit.


----------



## PillarOfAutumn

Quick question. I'm about to buy an X-Star 1440p monitor. Which DVI cable will I need to connect my 290 to the monitor? Are the appropriate cables included in the shipment as well?


----------



## rdr09

Quote:


> Originally Posted by *GenoOCAU*
> 
> I literally used everything ek supplied, their paste and pads. Not sure why there is such a variance between my vrm temps. They really don't get hot enough for me to investigate further.


I take those figures back. VRM1 temp does get higher than VRM2 at load. Here's about an hour of BF4 . . .

This is with just one 120 rad and fan. Ambient is pretty low, and I'd imagine temps in summer will go up by about 5-7C, maybe more.


----------



## Jack Mac

I don't get it, my VRM1 is always cooler than my VRM2.


----------



## bond32

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Quick question. I'm about to buy an X-star 1440p monitor. which dvi cable will I need in order to connect my 290 to the monitor? Are the appropriate cables included in the shipment as well?


It comes with one. Although if you have a spare power cable, it's a plus to use it in the power brick; the one they send you has an international adapter.


----------



## VSG

FYI: XSPC PSA regarding 290x waterblock


----------



## Sazz

Quote:


> Originally Posted by *00Smurf*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> No install thread, but i can write ya up a quick diy. it takes like all of 10-15 mins. heres some pics in the meantime of mine.


What kind of temps are you getting on the VRM's?


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> I take those figures back. VRM1 temp does get higher than VRM2 at load. Here was about an hour of BF4 . . .
> 
> 
> 
> This is with just one 120 rad and fan. Ambient is pretty low and I'd imagine temps in summer will go up by about 5 -7C. might be more.


I am going to test my card in BF4 and see what I get for VRM1 temps.

45C core, 53C VRM1, 40C VRM2.
+50mV; core voltage jumps between 1.15-1.2V, mostly 1.15V.


----------



## prostreetcamaro

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Quick question. I'm about to buy an X-star 1440p monitor. which dvi cable will I need in order to connect my 290 to the monitor? Are the appropriate cables included in the shipment as well?


Quote:


> Originally Posted by *bond32*
> 
> I can do 1150 ok, its the memory clocks that are my issue. Anything over 1325 causes black screens. I can do 1200 stable on core. And the refresh rate was a problem but with the corrected CRU values doesn't seem to be a problem anymore.


Comes with everything you need. Just be aware a few of us with those monitors have issues overclocking the 290/X. For some reason, when you raise the voltage above a certain point it will corrupt the display. If I reconnect my ASUS 120Hz monitor and run it at 60Hz, I can raise the voltage +200 no problem and OC to 1270. With the ASUS at 120Hz I get similar display corruption to the QNIX. With my QNIX connected I can only raise the voltage +140 with a max of 1220 on the core before it starts to corrupt the display. This leads me to believe the cable is at fault. I just ordered a top-of-the-line 3' Gefen cable to see if that solves the issue.


----------



## IBIubbleTea

May I join the club?




Brand - ASUS
Cooling - Air

Looks like I got Hynix memory. I think this is a new batch of cards, because I had to wait 3 weeks to pick up the card from NCIX. I got the card for $449.99 and it came with Battlefield 4.


----------



## Arizonian

Quote:


> Originally Posted by *IBIubbleTea*
> 
> May I join the club?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Brand - ASUS
> Cooling - Air
> 
> Looks like I got Hynix memory, I think this is a new batch of cards because I had to wait 3 weeks to pickup the card from NCIX. I got the card for $449.99 and it came with Battlefield 4.


You sure may - Congrats - added


----------



## Redeemer

Quote:


> Originally Posted by *IBIubbleTea*
> 
> May I join the club?
> 
> 
> 
> 
> Brand - ASUS
> Cooling - Air
> 
> Looks like I got Hynix memory, I think this is a new batch of cards because I had to wait 3 weeks to pickup the card from NCIX. I got the card for $449.99 and it came with Battlefield 4.


You going to try unlocking it?


----------



## IBIubbleTea

Quote:


> Originally Posted by *Redeemer*
> 
> You going to try unlocking it?


I don't think I can flash my card; I just checked the R9 290 --> R9 290X thread and my numbers don't match... Oh well.


----------



## kizwan

Quote:


> Originally Posted by *Jack Mac*
> 
> I don't get it, my VRM1 is always cooler than my VRM2.


Mine too, on both cards, but I figured that's because mine are running @stock, no overvolt. Seeing that yours is overclocked, maybe my VRM1 will keep running cooler than VRM2 after an overclock?! We'll see soon.


----------



## Redeemer

Hey guys, is the MSI Gaming 4G 290X any good?


----------



## jorgitin02

Geez, when are these cards gonna drop in price?


----------



## rdr09

Quote:


> Originally Posted by *Jack Mac*
> 
> I don't get it, my VRM1 is always cooler than my VRM2.


Maybe it has to do with flow; I mean, which port the water enters first.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am going to test my card in BF4 and see what i get for VRM1 temps.
> 
> 45C Core, VRM1 53C, VRM2 40C
> +50 mV. Core jump between 1.15-1.2v mostly 1.15v


Your core looks very good. VRM1, though, is not as good as the others. Are you sure you used the right screws in that area, and are they flush with the washer/PCB?

Edit: looking at my GPU installed . . . water enters the right port (the one near the power plugs).


----------



## BradleyW

Quote:


> Originally Posted by *kizwan*
> 
> @BradleyW, disabling HT on my quad core causing high CPU usage @90s % in BF4. However, my internet connection acting up (ping fluctuating), it lag/stutter pretty badly but FPS doesn't show any sign of CPU bottleneck. Also, disabling C1E, C3, C6 & C7 may help lower CPU usage a little bit in BF4. Seems smoother too.
> 
> I just watch a youtube video showing 3820 @stock running BF4 @Ultra. Doesn't look like CPU is bottleneck @stock though, FPS range are good. Going to try this later.
> 
> *[EDIT]*
> Reset to stock clock (3.6/3.8GHz). Everything stock including RAM also running at DDR3-1600. This is my CPU usage in BF4 1080p Ultra & 100% resolution scale. My internet connection still pretty bad. I can't tell for sure whether CPU is bottleneck or not. I wonder why I overclock at all. I think better get faster memory because if I'm not mistaken BF4 perform better with faster RAM.


Thank you for this data. I've worked it out. You are on Windows 7 right? This means DX11.2 is not tapping into your CPU as much, which explains why you are not being bottlenecked.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> Maybe has to do with flow. I mean, which port the water enter first.
> your core looks very good. VRM1, though, is not as good as the others. you sure you used the right screws in the area and are they flushed to the washer/pcb?
> 
> edit: looking at my gpu installed . . . water enters the the right port (one near the power plugs).


I have tried water flow in both directions. Same temps. In games VRM1 temps are fine, but when mining it goes as high as 76C.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *jorgitin02*
> 
> geez when are these cards gonna drop in price ?


I'm looking to buy a second Sapphire 290 for XF. I refuse to buy a 290 for $500+ when I bought the first one for $400 and AMD's current MSRP is still $399. This is obviously because of the bitcoin mining boom and all that, and I don't suspect the prices of AMD cards will drop any time soon, if ever.



This price looks like a steal now.


----------



## the9quad

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> I'm looking to buy a second Sapphire 290 for XF. I refuse to buy a 290 for $500+ when I bought the first one for $400 and AMD's current MSRP is still $399. This is obviously because of the bitcoin mining boom and all that and I don't suspect the prices of AMD cards will drop any time soon or ever.
> 
> 
> 
> This price looks like a steal now.


http://accessories.dell.com/sna/products/graphic_video_cards/productdetail.aspx?c=ca&l=en&s=dhs&cs=cadhs1&sku=a7389425


----------



## PillarOfAutumn

Is VisionTek reputable?


----------



## Redeemer

Quote:


> Originally Posted by *the9quad*
> 
> http://accessories.dell.com/sna/products/graphic_video_cards/productdetail.aspx?c=ca&l=en&s=dhs&cs=cadhs1&sku=a7389425


I saw this too, bro, but 3+ weeks shipping, damn!


----------



## staryoshi

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Is visiontek reputable?


Yes. They've been around for ages.


----------



## smoke2

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm sorry i won't be able to run any more tests until Monday or late Sunday......i don't have my volt monitor with me.
> 
> I do remember drawing a 630w (580w usually) spike with my rig + a Giga 7970 Ghz 1000/1375 (volt locked) So you might be able to draw some conclusions based on that.
> 
> There is a link on the OP to a thread created by Sonda measuring the power draw on a single 290x iirc, Tsm106 also recorded some power figures as well earlier in the thread.
> 
> EDIT: http://www.overclock.net/t/1441118/290x-psu-power-output-tests/0_40#post_21156921
> 
> 600w for a 290x clocked at 1200/1600 in Furmark with a 4770k etc.


Just a reminder.
Please don't forget to measure your rig with a single 290 for me








Thanks.


----------



## kizwan

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> @BradleyW, disabling HT on my quad core causing high CPU usage @90s % in BF4. However, my internet connection acting up (ping fluctuating), it lag/stutter pretty badly but FPS doesn't show any sign of CPU bottleneck. Also, disabling C1E, C3, C6 & C7 may help lower CPU usage a little bit in BF4. Seems smoother too.
> 
> I just watch a youtube video showing 3820 @stock running BF4 @Ultra. Doesn't look like CPU is bottleneck @stock though, FPS range are good. Going to try this later.
> 
> *[EDIT]*
> Reset to stock clock (3.6/3.8GHz). Everything stock including RAM also running at DDR3-1600. This is my CPU usage in BF4 1080p Ultra & 100% resolution scale. My internet connection still pretty bad. I can't tell for sure whether CPU is bottleneck or not. I wonder why I overclock at all. I think better get faster memory because if I'm not mistaken BF4 perform better with faster RAM.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thank you for this data. I've worked it out. You are on Windows 7 right? This means DX11.2 is not tapping into your CPU as much, which explains why you are not being bottlenecked.

I see. Can you point me to where you got this information? Thanks.








Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> Maybe has to do with flow. I mean, which port the water enter first.
> your core looks very good. VRM1, though, is not as good as the others. you sure you used the right screws in the area and are they flushed to the washer/pcb?
> 
> edit: looking at my gpu installed . . . water enters the the right port (one near the power plugs).
> 
> 
> 
> I have tried water in both directions. Same temps. Game VRM1 temps are fine but when mining it goes as high as 76C.

I was wondering, what was your VRM1 temp with the stock cooler when mining?


----------



## Koniakki

Guys, I have a sincere question. I was "arguing" with a friend about the BF4 performance of the 780/780Ti vs the 290/290X.

Can someone who knows, or has tested both, the 780/780Ti at 1200-1250MHz vs the 290/290X at 1110-1200MHz give me some info on their BF4 performance please?

I've read the reviews etc., but I'm asking for real-world performance from members here.


----------



## cam51037

Question for those of you with Sapphire 290 Tri-X cards, how is the VRM cooling on those cards, and what about noise levels if you try and keep the card at around 70C while gaming?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Koniakki*
> 
> Guys I have sincere question. I was "arguing" with a friend about BF4 performance of 780/780Ti vs 290/290X.
> 
> Can someone who knows or have tested both the 780/780Ti 1200-1250Mhz vs 290/290X 1110-1200Mhz give me some info on their BF4 performance please?
> 
> I read the reviews etc, but I asking for real world performance from members here.


I think they are about the same in BF4. Stock 290X matches 780 Ti Stock.


----------



## Jack Mac

Does anyone here think +75mV is safe for long term use? I can hit 1190MHz at +75.


----------



## Forceman

Quote:


> Originally Posted by *Jack Mac*
> 
> Does anyone here think +75mV is safe for long term use? I can hit 1190MHz at +75.


I do. What is that giving you for a load voltage?


----------



## BradleyW

Quote:


> Originally Posted by *kizwan*
> 
> I see. Can you point me where you got this information? Thanks.
> 
> 
> 
> 
> 
> 
> 
> 
> I was wondering what was your VRM1 temp with stock cooler when mining?


I tested it personally. There are also a ton of reviews and discussions about it if you google BF4 Win 7 vs Win 8.1.
Cheers.


----------



## sugarhell

Quote:


> Originally Posted by *BradleyW*
> 
> I tested it personally. There is also a ton of reviews and discussions about it if you google BF4 win 7 vs win 8.1.
> Cheers.


http://www.overclock.net/t/1433904/comparison-of-windows-7-vs-windows-8-8-1-ht-enabled-vs-disabled-on-battlefield-4


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> http://www.overclock.net/t/1433904/comparison-of-windows-7-vs-windows-8-8-1-ht-enabled-vs-disabled-on-battlefield-4


I'm seeing a lot of usage on that quad core in 8.1.


----------



## sugarhell

Quote:


> Originally Posted by *BradleyW*
> 
> I'm seeing a lot of usage on that quad core in 8.1.


That's a quad; you have a six-core. And with HT you should have around 40-50% usage.


----------



## Jack Mac

Quote:


> Originally Posted by *Forceman*
> 
> I do. What is that giving you for a load voltage?


Around 1.227 in BF4.


----------



## neurotix

Quote:


> Originally Posted by *cam51037*
> 
> Question for those of you with Sapphire 290 Tri-X cards, how is the VRM cooling on those cards, and what about noise levels if you try and keep the card at around 70C while gaming?


VRM temps are great.

My VRM1 is usually only 5C hotter than my core. VRM2 is cooler.

Here's the GPU-Z sensor log of a Unigine Valley run at 1200/1500MHz, 50% power limit, +168mV, 100% fans.

GPU-ZSensorLog.txt 87k .txt file


For those who can't read it, max core temp was 69C and max VRM1 temp was 75C.

I can't measure the noise but with the fans at 100%, they aren't that bad. If I'm gaming and I have my headphones on I can't even hear my system.
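If you'd rather not eyeball a sensor log by hand, the per-column maxima can be pulled out with a short script. This is a minimal sketch: the column names and the excerpt below are made up, modelled loosely on GPU-Z's comma-separated log rather than its exact format.

```python
def max_temps(log_text):
    """Per-column maxima for any column whose name mentions 'Temp'.

    Assumes a comma-separated log with a header row; column names here
    are illustrative, not GPU-Z's exact headers.
    """
    lines = [ln for ln in log_text.strip().splitlines() if ln.strip()]
    header = [h.strip() for h in lines[0].split(",")]
    maxima = {}
    for line in lines[1:]:
        for name, raw in zip(header, line.split(",")):
            if "Temp" not in name:
                continue
            try:
                value = float(raw)
            except ValueError:
                continue  # skip non-numeric cells (dates, blanks)
            maxima[name] = max(maxima.get(name, value), value)
    return maxima

# Hypothetical excerpt of a sensor dump.
sample = """Date , GPU Temperature [C] , VRM Temperature 1 [C] , Fan Speed (%)
2014-01-25 , 67.0 , 72.0 , 48
2014-01-25 , 69.0 , 75.0 , 52
"""
hottest = max_temps(sample)  # maxima for the two temperature columns
```

Run it over the whole log and `hottest` gives you the peak core and VRM readings without scrolling through thousands of rows.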


----------



## kizwan

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I see. Can you point me where you got this information? Thanks.
> 
> 
> 
> 
> 
> 
> 
> 
> I was wondering what was your VRM1 temp with stock cooler when mining?
> 
> 
> 
> I tested it personally. There is also a ton of reviews and discussions about it if you google BF4 win 7 vs win 8.1.
> Cheers.

Thanks. Using the info available in the thread linked by sugarhell, even though CPU usage is higher in 8.1 than 7, your CPU still shouldn't bottleneck your 290X's CFX.

I wish I had a 3930K to play with. The upgrade bug is biting me pretty hard now... no no no, I should control myself.

Can you overclock your CPU to 4GHz & 4.5GHz? You can record CPU utilization using MSI AB (right-click in the graph window & click "Log history to file"). The file should be located at *C:\Program Files (x86)\MSI Afterburner\HardwareMonitoring.hml*. You can rename it to *HardwareMonitoring.csv* because the file uses a comma-delimited format. From there you can plot a CPU usage graph. This way you can see how much CPU usage drops going from stock to 4GHz & from 4GHz to 4.5GHz.
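Once the log is renamed to .csv, crunching it takes only a few lines of Python. A minimal sketch: the column name "CPU usage" and the excerpt are assumptions, since real Afterburner logs carry many more columns and a vendor header.

```python
import csv
import io

def cpu_usage_stats(csv_text, usage_col="CPU usage"):
    """Average and peak of one usage column from a comma-delimited log.

    The column name is an assumption -- check your own log's header row.
    """
    reader = csv.DictReader(io.StringIO(csv_text))
    samples = [float(row[usage_col]) for row in reader if row.get(usage_col)]
    if not samples:
        return 0.0, 0.0
    return sum(samples) / len(samples), max(samples)

# Hypothetical excerpt -- a real HardwareMonitoring log has more columns.
log = """timestamp,CPU usage,GPU usage
00:01,45,99
00:02,62,98
00:03,58,100
"""
avg, peak = cpu_usage_stats(log)  # avg 55.0, peak 62.0
```

Comparing `avg`/`peak` across runs at stock, 4GHz, and 4.5GHz shows how much headroom each overclock buys.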
Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BradleyW*
> 
> I tested it personally. There is also a ton of reviews and discussions about it if you google BF4 win 7 vs win 8.1.
> Cheers.
> 
> 
> 
> http://www.overclock.net/t/1433904/comparison-of-windows-7-vs-windows-8-8-1-ht-enabled-vs-disabled-on-battlefield-4

I can replicate it on mine. I'm sure I can lower CPU usage when HT off if I run with 200% resolution scale.

HT on


HT off


----------



## cam51037

Quote:


> Originally Posted by *neurotix*
> 
> VRM temps are great.
> 
> My VRM1 is usually only 5C hotter than my core. VRM2 is cooler.
> 
> Here's the GPU-Z sensor log of a Unigine Valley run at 1200/1500mhz, 50% power limit, +168mv, 100% fans.
> 
> GPU-ZSensorLog.txt 87k .txt file
> 
> 
> For those who can't read it, max core temp was 69C and max VRM1 temp was 75C.
> 
> I can't measure the noise but with the fans at 100%, they aren't that bad. If I'm gaming and I have my headphones on I can't even hear my system.


+Rep! I'll read through the log tomorrow morning but for now I'll take your word. I guess I'll make a final decision in the coming days then. To buy, or not to buy!


----------



## Loktar Ogar

Quote:


> Originally Posted by *cam51037*
> 
> Question for those of you with Sapphire 290 Tri-X cards, how is the VRM cooling on those cards, and what about noise levels if you try and keep the card at around 70C while gaming?


My share. Ambient temps probably around 22-25C. I only have Valley and BF4 to test. Auto fan settings in AB Beta 18. Noise levels are decent; sorry, no fan % included, but it was below 50%. For FPS results, you can refer to reviews already made by various websites.

Valley Benchmark 1.0 (Extreme HD-No Tweaks) run once only: // Temps Max Value: Valley / BF4 (1440P Ultra 4xMSAA). MP one round.

*All Stock (1000/1300) and default voltage*.

FPS: 58.2 // GPU: 73c / 73c
Score: 2434 // VRM1: 74c / 74c
Min: 29.3 // VRM2: 57 / 57
Max: 110.8 // 1.197 V / 1.224 V

*(1100/1330) and default voltage. Power limit +50.*

FPS: 63 // GPU: 75c / 76c
Score: 2635 // VRM1: 76c / 79c
Min: 25.3 // VRM2: 57 / 59
Max: 119.8 // 1.175 V / 1.227 V

*(1130/1400) and +38 mV. No Aux Voltage. Power limit +50.*

FPS: 64.3 // GPU: 76c / 77c
Score: 2690 // VRM1: 79c / 82c
Min: 30.2 // VRM2: 57 / 59
Max: 123.4 // 1.234 V / 1.253 V

*(1140/1500) and +63 mV. No Aux Voltage. Power limit +50.*

FPS: 66.4 // GPU: 77c / 79c
Score: 2779 // VRM1: 82c / 84c
Min: 32 // VRM2: 57 / 60
Max: 125.6 / 1.257 V / 1.272 V

*(1150/1500) and +69 mV. No Aux Voltage. Power limit +50.*

FPS: 67.6 // GPU: 78c / 80c
Score: 2827 // VRM1: 83c / 85c
Min: 31.6 // VRM2: 57 / 60
Max: 128.2 // 1.277 V / 1.294 V

No significant gains at 1160/1500, just higher temps. This test took a long day to finish, so please don't ask for more. This is just for your reference; I'm not competing on scores.









Added info: A quick fix to lower GPU and VRM1 temps is to set a good fan profile. I only used Auto for this test.









EDIT: I've added some small info on a quick fix for the temps; the power limit increase was +50.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Loktar Ogar*
> 
> My Share. Ambient temps probably around 22c-25c. I only have Valley and BF4 to test. Auto Fan settings in AB Beta 18. Noise levels are decent and sorry no fan % included but was below 50%. For FPS results and you can refer to reviews already made by various websites.
> 
> Valley Benchmark 1.0 (Extreme HD-No Tweaks) run once only: // Temps Max Value: Valley / BF4 (1440P Ultra 4xMSAA). MP one round.
> 
> *All Stock (1000/1300) and default voltage*.
> 
> FPS: 58.2 // GPU: 73c / 73c
> Score: 2434 // VRM1: 74c / 74c
> Min: 29.3 // VRM2: 57 / 57
> Max: 110.8 // 1.197 V / 1.224 V
> 
> *(1130/1400) and +38 mV. No Aux Voltage.*
> 
> FPS: 64.3 // GPU: 76c / 77c
> Score: 2690 // VRM1: 79c / 82c
> Min: 30.2 // VRM2: 57 / 59
> Max: 123.4 // 1.234 V / 1.253 V
> 
> *(1140/1500) and +63 mV. No Aux Voltage.*
> 
> FPS: 66.4 // GPU: 77c / 79c
> Score: 2779 // VRM1: 82c / 84c
> Min: 32 // VRM2: 57 / 60
> Max: 125.6 / 1.257 V / 1.272 V
> 
> *(1150/1500) and +69 mV. No Aux Voltage.*
> 
> FPS: 67.6 // GPU: 78c / 80c
> Score: 2827 // VRM1: 83c / 85c
> Min: 31.6 // VRM2: 57 / 60
> Max: 128.2 // 1.277 V / 1.294 V
> 
> No significant gains for 1160/1500 but just got higher temps. This test took a long day to finish so please don't ask for more. This is just for your reference and i'm not competing with scores.


Do you use your card for mining? Very curious to see VRM1 temps.


----------



## Loktar Ogar

No, just for gaming and normal use only.


----------



## kizwan

Quote:


> Originally Posted by *Loktar Ogar*
> 
> No, just for gaming and normal use only.


Good man!


----------



## Sgt Bilko

Quote:


> Originally Posted by *smoke2*
> 
> I'm only reminding.
> Please, don't forget on me to measure your rig with single 290
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks.


I've just gotten home from work,

I've lowered my CPU clock to 4.4GHz, so it shouldn't chew as much juice now.

What would you like tested?

I can run them and tell you the results, but I won't be able to upload any pics due to my net being capped (back to 56k speed







)


----------



## neurotix

I should note that my ambient temps for what I posted earlier are 21C (70F).


----------



## Forceman

Quote:


> Originally Posted by *Jack Mac*
> 
> Around 1.227 in BF4.


I was going to say anything less than 1.3V is plenty safe for 24/7 ops; if you're at 1.23 I wouldn't be worried at all.


----------



## kizwan

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jack Mac*
> 
> Around 1.227 in BF4.
> 
> 
> 
> I was going to say anything less than 1.3V is plenty safe for 24/7 ops, if you are at 1.23 I wouldn't be worried at all.

What is likely to happen if one runs, let's say, 1.32V 24/7?


----------



## smoke2

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I've just gotten home from work,
> 
> I've lowered my CPU clock to 4.4Ghz now so it shouldn't chew as much juice now.
> 
> What would you like tested?
> 
> I can run them and tell you the results but i won't be able to upload any pics due to my net being capped (back to 56k speed
> 
> 
> 
> 
> 
> 
> 
> )


You can also test it at 5GHz to leave some reserve









A power-consumption benchmark would be great, maybe FurMark? Or whichever is the most power-hungry.

Then some game with high-end graphics, maybe Battlefield?

And please, measure peak consumption, not average









No problem without pics


----------



## Forceman

Quote:


> Originally Posted by *kizwan*
> 
> What is going/likely to happen if one, let say running 1.32V 24/7?


Well, the higher the voltage, the higher the chance of degradation and the eventual failure of the chip. But there really is no hard and fast rule for what voltage is going to be "unsafe", since it's a gradual thing. 1.3V (and even higher) seemed to be fine for Tahiti chips, so it would be reasonable to assume it will be fine for Hawaii chips as well, since it's the same process, but only time will tell for sure.


----------



## Sgt Bilko

Quote:


> Originally Posted by *smoke2*
> 
> You can test it also on 5GHz frequency to leave some reserve
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Great be some power consumpting benchmark, maybe Furmark?
> Or some who is the most power consuming.
> 
> And then some game with high end graphic, maybe Battlefield 4?


I'm loaded into my 5GHz profile now, actually. I'll run some 3DMark 11 and Tomb Raider, Metro, etc. (doing some runs for the bench thread).

I'll run Crossfire and single card for all of them, just for comparison's sake.


----------



## kizwan

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> What is going/likely to happen if one, let say running 1.32V 24/7?
> 
> 
> 
> Well the higher the voltage the higher the chance for degradation and the eventual failing of the chip. But there really is no hard and fast rule for what voltage is going to be "unsafe" since it is a gradual thing. 1.3V (and even higher) seemed to be fine for Tahiti chips, so it would be reasonable to assume it would be fine for Hawaii chips also since it is the same process, but only time will tell for sure.

Has anyone killed or degraded their GPU yet with 1.3V & higher?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *the9quad*
> 
> Here is how bad the scaling is on Metro LL:
> First TR almost 100% usage on all 3 cards:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Next Hitman almost 100% usage on all 3 cards:
> 
> 
> 
> Now frickin Metro LL 28%,77%, 50% usage:


Is this with 13.12 Catalyst or 13.11 WHQL? I've been stubborn and haven't switched yet, lol


----------



## Sgt Bilko

Quote:


> Originally Posted by *smoke2*
> 
> You can test it also on 5GHz frequency to leave some reserve
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Great be some power consumpting benchmark, maybe Furmark?
> Or some who is the most power consuming.
> 
> Then some game with high end graphic, maybe Battlefield?
> 
> And please, measure a peak consumption, not average
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No problem without pics


Alrighty, for Tomb Raider I measured a peak power usage of 650W with one R9 290 at 1200/1300 +100mV in AB (GPU usage was 100%)



EDIT: Just ran FurMark with one card at stock clocks (980/1250), got a max peak of 660W power draw...


----------



## Forceman

Quote:


> Originally Posted by *kizwan*
> 
> Does anyone have killed or degrade their GPU yet with 1.3V & higher?


Not that I've seen or heard of.


----------



## smoke2

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Alrighty, for Tomb Raider i measured a peak power usage of 650w with 1 R9 290 at 1200/1300 +100 mV in AB (GPU usage was 100%)
> 
> 
> 
> EDIT: Just ran FurMark with one card at stock clocks 980/1250, got a max peak of 660w power draw......


Thank you so much.

I want to buy the Sapphire Tri-X, which is factory-OCed.
Do you think a 700W PSU is too weak for a single 290 and my rig (i5-4670K)?
Would you buy an 850W instead?


----------



## Sgt Bilko

Quote:


> Originally Posted by *smoke2*
> 
> Thank you so much.
> 
> I want to buy Sapphire Tri-X, which is OCed.
> Do you think then 700W PSU is too weak for single 290 and my rig (i5-4670K)?
> Will you buy 850W?


I'd say 750W to be safe for one card (I'm not sure about the i5's power draw though); my 8350 will draw more power than your 4670K will.

Even with another 100mV on the card I wasn't exceeding 700W during gaming... so with 750W you should be plenty clear.

If you want to know more about the PSU you are looking at, take a look at some of Shilka's PSU info threads; they will tell you everything you need to know and then some









Have Fun!!


----------



## smoke2

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'd say 750w to be safe for one card (I'm not sure about i5's power draw though), my 8350 will draw more power than your 4670k will.
> 
> even with another 100mV on the card i wasn't exceeding 700W during gaming.......so with 750w you should be plenty clear.
> 
> If you want to know more about the PSU you are looking at then take a look at some of Shilka's PSU info threads, they will tell you everything you need to know and some more
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Have Fun!!


He wrote me that 700W is plenty, but your measurements say it's almost 700W.

Please try, if you can, to OC your card to 1050MHz with default RAM frequency and the CPU at 4.5GHz.
That's the frequency of my future graphics card


----------



## Sgt Bilko

Quote:


> Originally Posted by *smoke2*
> 
> He wrote me 700W is plenty, but your mesaurments say it's almost 700W.
> 
> Please, try if you can to OC your card at 1050MHz with default RAM frequency and CPU at 4.5 GHz.
> That's the frequency of my future graphic card


You might need to ask someone who has a Kill-A-Watt and an Intel CPU to run a test or two for you.

My AMD chip chews a lot more power than an i5 will. Besides... if Shilka says 700W is enough, I'd believe him









EDIT: I should also add that the Tomb Raider test was done at a 1200MHz core clock with an extra 100mV applied through Afterburner... if you are going to be running stock clocks then 700W will be enough, I promise you that.


----------



## Forceman

Quote:


> Originally Posted by *smoke2*
> 
> Thank you so much.
> 
> I want to buy Sapphire Tri-X, which is OCed.
> Do you think then 700W PSU is too weak for single 290 and my rig (i5-4670K)?
> Will you buy 850W?


My overclocked 290X (using +100mV) and fairly high voltage 4770K (1.35V) drew right around 400W playing BF4. So I'd say a 700W would be fine unless you plan on running some crazy voltages.

But I can re-download Tomb Raider and run that benchmark to give you apples to apples if you want.


----------



## givmedew

Quote:


> Originally Posted by *smoke2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Alrighty, for Tomb Raider i measured a peak power usage of 650w with 1 R9 290 at 1200/1300 +100 mV in AB (GPU usage was 100%)
> 
> 
> 
> EDIT: Just ran FurMark with one card at stock clocks 980/1250, got a max peak of 660w power draw......
> 
> 
> 
> Thank you so much.
> 
> I want to buy Sapphire Tri-X, which is OCed.
> Do you think then 700W PSU is too weak for single 290 and my rig (i5-4670K)?
> Will you buy 850W?

700W on a good PSU is significantly more than you need. The 290X can exceed 300 watts, but the rest of what you have cannot. A 600W PSU is plenty.

That said, if you already have a 700W then keep it. If you do not have a 700W, then an 850W is a total waste of money. It is way too much for one 290/290X and only just barely enough for two 290/290X cards.

If you are not mining, then while gaming the eXtreme Power Supply Calculator has you at 500W needed / 550W recommended with a 290X at stock clocks and a 4670K @ 4.5GHz and 1.38V, with 10% capacitor aging and a total 85% system load (probably more than you would see gaming). If you were mining, then with one card you can expect a 350-450W total system load, and with two cards you can expect 675-900W. I would never recommend running a PSU at 80% of its capacity for days on end unless it was truly a high-end PSU. I use the original PC P+C 1K-SR for mining with two 290s and two 5870s, and that thing is breathing fire, but it is server grade and weighs 2x as much as a 1200W PSU.

If only gaming, it is OK to run your PSU near its limits, since it isn't going to be loaded 24/7, 365.

So anyways, my recommendation is: if you have a 700W then stick with it. If you don't have a 700W, then get a good 550-650W if you never plan to run two. If you have any plans on running two of these things, then 900-1200W should be what you go looking for. 1000-1200W will let you run two with breathing room, and a 1200W will let you run three no problem.

Hope that helps.
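givmedew's sizing rules can be sketched as a quick headroom check. This is a minimal illustrative sketch, not an electrical specification: the 10% capacitor-aging derate and the 80% cap for sustained 24/7 loads come from the post above, while the function name, interface, and defaults are hypothetical.

```python
def psu_headroom_ok(psu_watts, load_watts, sustained=False,
                    aging=0.10, sustained_cap=0.80):
    """True if an estimated load fits under a derated PSU capacity.

    Derates 10% for capacitor aging, and caps 24/7 loads (e.g. mining)
    at 80% of capacity, per the rules of thumb in the post above.
    """
    effective = psu_watts * (1.0 - aging)
    if sustained:
        effective *= sustained_cap
    return load_watts <= effective

# A 700W unit vs. a ~500W gaming load (290X + overclocked i5):
gaming_ok = psu_headroom_ok(700, 500)        # 500 <= 630, fits
mining_ok = psu_headroom_ok(700, 550, True)  # 550 > 504, too tight
```

Under these assumptions a 700W unit clears a single-290X gaming rig comfortably but falls short for a sustained mining load at the same wattage, which matches the advice above.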


----------



## Epsi

Quote:


> Originally Posted by *smoke2*
> 
> Thank you so much.
> 
> I want to buy Sapphire Tri-X, which is OCed.
> Do you think then 700W PSU is too weak for single 290 and my rig (i5-4670K)?
> Will you buy 850W?


I'm running with a 650W, 3770K @ 4.6 - 1.224 Vcore,
XFX R290 unlocked @ 1075/1350 +25mV.

Pulling around 400W during BF4.


----------



## smoke2

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You might need to ask someone that has a kill-a-watt and an Intel CPU to run a test or two for you.
> 
> My AMD Chip chews a lot more power than an i5 will, besides.......if Shilka says 700w is enough, i'd believe him
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: I should also add that the Tomb Raider test was done at 1200Mhz core clock with an extra 100mV applied through Afterburner......if you are going to be running stock clocks then 700w will be enough, i promise you that.


Maybe Shilka can say something about your results








Quote:


> Originally Posted by *Forceman*
> 
> My overclocked 290X (using +100mV) and fairly high voltage 4770K (1.35V) drew right around 400W playing BF4. So I'd say a 700W would be fine unless you plan on running some crazy voltages.
> 
> But I can re-download Tomb Raider and run that benchmark to give you apples to apples if you want.


If you can test it, that would be great!
Try to test it overclocked to 1050MHz core / default RAM, and also with the slight overclock and +100mV you have.
If you don't have Tomb Raider installed, then BF4 will be equivalent, I think? But I'd appreciate it if you can also run FurMark.
And please, measure peak consumption, not average.

Thank you kindly


----------



## smoke2

Quote:


> Originally Posted by *Epsi*
> 
> I'm running with a 650W, 3770k @ 4.6 - 1.224Vcore
> XFX R290 unlocked @ 1075 / 1350 +25mV
> 
> Pulling arround 400W during BF4.


Is 400W the peak consumption or the average consumption?


----------



## smoke2

Quote:


> Originally Posted by *givmedew*
> 
> 700w on a good PSU is significantly more than you need. The 290X can exceed 300 watts... But the rest of what you have can not. A 600w PSU is plenty.
> 
> That said if you already have a 700 then keep it. If you do not have a 700w then a 850 is a total waste of money. It is way to much for 1 290/290x and only just barely enough for (2) 290/290x cards.
> 
> If you are not mining then while gaming eXtreme Power Supply Calculator has you at 500w needed 550 recommended with a 290X stock clocks and a 4670K @ 4.5GHz and 1.38v with 10% capacitor aging and a total 85% system load (probably more than you would see gaming). If you where mining then with 1 card you can expect 350-450w total system load and if 2 cards you can expect 675-900. I would never recommend running a PSU at 80% of it's capacity for days on end unless it was truly a high end PSU. I use the original PC P+C 1K-SR for mining with (2) 290s and (2) 5870s and that thing is breathing fire but it is server grade and weighs 2x as much as a 1200w PSU.
> 
> If only gaming it is ok to run your PSU near it's limits since it isn't going to be 24/7 365.
> 
> So anyways... my recommendation is if you have a 700 then stick with it. If you don't have a 700 then get a good 550-650 if you never plan to run (2). If you have any plans on running (2) of these things then 900-1200 should be what you go looking for. 1000-1200 will let you run 2 with breathing room and a 1200 will let you run 3 no prob.
> 
> Hope that helps.


Thanks for the long reply.
Then how can Sgt Bilko measure around 650W?
Is his CPU that power-hungry?
Or can peaks be that high?


----------



## Epsi

Quote:


> Originally Posted by *smoke2*
> 
> 400W is peak consumption or it is average consumption?


That's average; it's floating between 390 and 410.


----------



## smoke2

Can you please measure a peak consumption?


----------



## Epsi

Sure, what do you want me to run? Some sort of 3DMark?


----------



## ottoore

Quote:


> Originally Posted by *Epsi*
> 
> Sure what do u want me to run? Some sort of 3DMark?


Unigine Valley if you can. Can you also run it with +100mV in AB?

Thank you.


----------



## Epsi

Quote:


> Originally Posted by *ottoore*
> 
> Unigine Valley if you can. Can you also run it with +100mV in AB?
> 
> Thank you.


Unigine Valley running at Extreme HD preset.

1100/1350 +100mV



It was jumping between 450 and 470.


----------



## smoke2

Yes, please run some benchmark like 3DMark, and maybe FurMark for heavy power consumption.
And also, if you have time, some game with good graphics like Battlefield 4.
And don't forget to measure the peak.

Thanks


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *smoke2*
> 
> The price for 700W Vanguard is here 114€ and for 152€ for 850W.
> I was wondering about 850W into the future, when maybe GPU's will be more powerful and more power hungry, but who knows...?
> 
> 
> 
> 
> 
> 
> 
> 
> Which one will you buy for these prices?
> 
> 
> 
> Crossfire XFX 290's, this is 1100/1300 in FSE with an 8350 @ 5.0Ghz just to give you an idea

Quote:


> Originally Posted by *smoke2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *givmedew*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 700w on a good PSU is significantly more than you need. The 290X can exceed 300 watts... But the rest of what you have can not. A 600w PSU is plenty.
> 
> That said if you already have a 700 then keep it. If you do not have a 700w then a 850 is a total waste of money. It is way to much for 1 290/290x and only just barely enough for (2) 290/290x cards.
> 
> If you are not mining then while gaming eXtreme Power Supply Calculator has you at 500w needed 550 recommended with a 290X stock clocks and a 4670K @ 4.5GHz and 1.38v with 10% capacitor aging and a total 85% system load (probably more than you would see gaming). If you where mining then with 1 card you can expect 350-450w total system load and if 2 cards you can expect 675-900. I would never recommend running a PSU at 80% of it's capacity for days on end unless it was truly a high end PSU. I use the original PC P+C 1K-SR for mining with (2) 290s and (2) 5870s and that thing is breathing fire but it is server grade and weighs 2x as much as a 1200w PSU.
> 
> If only gaming it is ok to run your PSU near it's limits since it isn't going to be 24/7 365.
> 
> So anyways... my recommendation is if you have a 700 then stick with it. If you don't have a 700 then get a good 550-650 if you never plan to run (2). If you have any plans on running (2) of these things then 900-1200 should be what you go looking for. 1000-1200 will let you run 2 with breathing room and a 1200 will let you run 3 no prob.
> 
> Hope that helps.
> 
> 
> 
> 
> 
> 
> Thanks for long reply.
> Then how Sgt Bilko can measure around 650W?
> His CPU is so power hungry?
> Or peaks can be so high?

Sgt Bilko is running an AMD FX-8350 CPU. These AMD CPUs are power-hungry. An FX-8150 alone at stock clock already draws more than 200W (overall system power consumption with only the CPU being stressed). A Haswell CPU like yours, if overclocked to 4.2GHz, draws less than 100W.

Combined with a power-hungry CPU like the FX-8350 & dual 290s, his rig's power consumption is definitely going to be a lot higher than yours.

I found interesting info about my 3820's power draw at stock clock: over 150W.
Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Has anyone killed or degraded their GPU yet with 1.3V & higher?
> 
> 
> 
> Not that I've seen or heard of.
Click to expand...

My 290s' Elpida & Hynix cards' max voltages are 1.227V & 1.219V before Vdroop respectively. Do you think it's a bad idea to set the voltage to +100mV when overclocking?

Voltages after Vdroop are 1.219V & 1.203V respectively.


----------



## ottoore

Quote:


> Originally Posted by *smoke2*
> 
> Yes, please run some benchmarks like 3DMark; maybe also Furmark, which has heavy power consumption.
> And also, if you have time, some game with good graphics like Battlefield 4.
> And don't forget to measure the peak.
> 
> Thanks


Furmark isn't a benchmark. It's a useless piece of software that kills cards by stressing the VRM phases.
Its power draw can't be matched by any other software, game, or benchmark.

@Epsi thank you


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> Sgt Bilko is running AMD CPU FX-8350 CPU. These AMD CPUs are power hungry CPU. FX-8150 alone at stock clock already draw more than 200W (overall system power consumption with only CPU is being stress). Haswell CPU like yours, if overclocked to 4.2GHz, CPU power consumption is less than 100W.
> 
> Combined with power hungry CPU like FX-8350 & dual 290's, definitely his rig power consumption is going a lot higher than yours.


Oh yeah, this thing loves power more than Mynocks do.

I still wouldn't trade it though. I just want AMD to bring out a truly great CPU sometime within the next 3 years...


----------



## Forceman

Quote:


> Originally Posted by *smoke2*
> 
> Maybe Shilka can say something about your results.
> 
> If you can test it, it would be great!
> Try testing it overclocked to 1050MHz core / default RAM, and also with the slight overclock and +100mV you have.
> If you don't have Tomb Raider installed then BF4 will be comparable, I think? But I'd appreciate it if you could also run Furmark.
> And please measure peak consumption, not average.
> 
> Thank you kindly


I hit 340 watts running Tomb Raider in both Ultra and Ultimate settings. That's at 1100/1325/+100mV. Furmark was 370 at those settings. At 1200/1400/+100mV Tomb Raider was 380.

That's with a single card - I don't understand how Sgt Bilko is drawing 250W more than that, even with an AMD CPU. I don't know what settings he was using for his benchmark run, but I get 66 FPS on Ultimate, so either he was running something lower than that or it was with Crossfire enabled. Hopefully he'll chime in with his benchmark settings.
Quote:


> Originally Posted by *kizwan*
> 
> I found interesting info about my 3820 power draw at stock clock. Over 150W power draw at stock clock.
> My 290's Elpida & Hynix cards max voltage are 1.227V & 1.219V before Vdroop respectively. Do you think it's not good idea when overclock to set voltage to +100mV?
> 
> Voltages after Vdroop are 1.219V & 1.203V respectively.


I'd check it with the offset under load just to be sure, but as long as the VRM/core temps are okay I don't think it would be a problem - I'm assuming that's going to be around 1.3 or 1.31V load after droop.


----------



## anteante

Any tips or tricks to try to reduce coil whine from my PC [email protected] card?


----------



## kizwan

Quote:


> Originally Posted by *Forceman*
> 
> I'd check it with the offset under load just to be sure, but as long as the VRM/core temps are okay I don't think it would be a problem - I'm assuming that's going to be around 1.3 or 1.31V load after droop.


I'm thinking of setting the voltage to the max, +100mV, and then reducing it slowly until it's no longer stable. +100mV will be my max for now until I learn more.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Forceman*
> 
> I hit 340 watts running Tomb Raider in both Ultra and Ultimate settings. That's at 1100/1325/+100mV. Furmark was 370 at those settings. At 1200/1400/+100mV Tomb Raider was 380.
> 
> That's with a single card - you realize Sgt Bilko is running Crossfire, right?
> I'd check it with the offset under load just to be sure, but as long as the VRM/core temps are okay I don't think it would be a problem - I'm assuming that's going to be around 1.3 or 1.31V load after droop.


Actually that test was a single card; I disabled CF for it, just in CCC though, I didn't actually remove the card.

I'll run a Tomb Raider CF test now and let you know what it takes.

At 1100/1300 on both cards, avg fps was 91.8 and I drew a max of 630W.

The only thing I can put that down to is Crossfire not working that well in Tomb Raider and/or my CPU bottlenecking them hard.

Furmark draws 950W on both cards at 1100/1300 by comparison.


----------



## Forceman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Actually that test was a single card, i disabled CF for it, just in CCC though, i didn't actually remove the card.
> 
> I'll run a Tomb Raider CF test now and let you know what it takes.


I ninja edited you. What benchmark settings are you running?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Forceman*
> 
> I ninja edited you. What benchmark settings are you running?


ninja'd

I was running Ultra settings (same as Ultimate, just no TressFX); with Ultimate I get 58 fps avg.


----------



## Forceman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> ninja'd
> 
> I was running Ultra settings (same as Ultimate, just no TressFX); with Ultimate I get 58 fps avg.


Just realized I was looking at the min FPS and not the average. At 1175/1400/+100mV I get 143 on Ultra and 88 on Ultimate, at 1920x1200. 380W max both ways.


----------



## Epsi

Quote:


> Originally Posted by *smoke2*
> 
> Yes, please run some benchmarks like 3DMark; maybe also Furmark, which has heavy power consumption.
> And also, if you have time, some game with good graphics like Battlefield 4.
> And don't forget to measure the peak.
> 
> Thanks


Some quick runs; I wrote down the peak wattage.

3DMark13:

Running at 1100 / 1350 +100mV:

Test #1: *469W* (GPU test)
Test #2: *458W* (GPU test)
Test #3: *264W* (CPU test)
Test #4: *496W* (Combined test)

Furmark:

Stock 1000 / 1250 *479W*
1075 / 1350 +25mV *560W*
1075 / 1350 + 50mV *587W*
1075 / 1350 +75mV: system turns off / reboots, unstable settings
1075 / 1350 +100mV: system turns off / reboots, unstable settings; poor clocking card, can't handle much.

Unigine Valley:

1100 / 1350 +100mV *468W*

BF4 multiplayer 48p server, max settings:

1075 / 1350 +25mV (24/7 OC) *404W*


----------



## Sgt Bilko

Quote:


> Originally Posted by *Forceman*
> 
> Just realized I was looking at the min FPS and not the average. At 1175/1400/+100mV I get 143 on Ultra and 88 on Ultimate, at 1920x1200. 380W max both ways.


Yeah, my 8350 is holding my cards back a bit but then again that's what mantle is supposed to change right?









I really need to get another Firestrike submission in as well......3DMark keeps crashing on launch, re-downloaded and all


----------



## ottoore

Quote:


> Originally Posted by *Epsi*
> 
> Some quick runs i wrote down the peak wattage.
> 
> 3DMark13:
> 
> Running at 1100 / 1350 +100mV:
> 
> Test #1: *469W* (GPU test)
> Test #2: *458W* (GPU test)
> Test #3: *264W* (CPU test)
> Test #4: *496W* (Combined test)
> 
> Furmark:
> 
> Stock 1000 / 1250 *479W*
> 1075 / 1350 +25mV *560W*
> 1075 / 1350 + 50mV *587W*
> 1075 / 1350 +75mV: system turns off / reboots, unstable settings
> 1075 / 1350 +100mV: system turns off / reboots, unstable settings; poor clocking card, can't handle much.
> 
> Unigine Valley:
> 
> 1100 / 1350 +100mV *468W*
> 
> BF4 multiplayer 48p server, max settings:
> 
> 1075 / 1350 +25mV (24/7 OC) *404W*


Your PSU has lowish efficiency, so 496W at the wall is 496 * 0.85 = 421.6W of actual DC load on the PSU.
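As a quick sketch of the arithmetic above (the ~85% efficiency figure is the one used in the post; real efficiency varies with load and the PSU model):

```python
def wall_to_dc_load(wall_watts: float, efficiency: float = 0.85) -> float:
    """Convert measured wall draw to the DC load the PSU actually delivers.

    0.85 is just the figure used in the post; check your PSU's efficiency
    curve at the relevant load point for a better number.
    """
    return wall_watts * efficiency

# The 496W wall peak from the 3DMark combined test:
print(round(wall_to_dc_load(496), 1))  # 421.6
```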


----------



## Sgt Bilko

So it turns out there was a folder that was messing up my 3DMark key; deleted that and I was back in business.

Managed to beat my previous best there, even though the physics score is over 1k lower than I've had before.


----------



## cam51037

Quote:


> Originally Posted by *Loktar Ogar*
> 
> My Share. Ambient temps probably around 22c-25c. I only have Valley and BF4 to test. Auto Fan settings in AB Beta 18. Noise levels are decent and sorry no fan % included but was below 50%. For FPS results and you can refer to reviews already made by various websites.
> 
> Valley Benchmark 1.0 (Extreme HD-No Tweaks) run once only: // Temps Max Value: Valley / BF4 (1440P Ultra 4xMSAA). MP one round.
> 
> *All Stock (1000/1300) and default voltage*.
> 
> FPS: 58.2 // GPU: 73c / 73c
> Score: 2434 // VRM1: 74c / 74c
> Min: 29.3 // VRM2: 57 / 57
> Max: 110.8 // 1.197 V / 1.224 V
> 
> *(1100/1330) and default voltage.*
> 
> FPS: 63 // GPU: 75c / 76c
> Score: 2635 // VRM1: 76c / 79c
> Min: 25.3 // VRM2: 57 / 59
> Max: 119.8 // 1.175 V / 1.227 V
> 
> *(1130/1400) and +38 mV. No Aux Voltage.*
> 
> FPS: 64.3 // GPU: 76c / 77c
> Score: 2690 // VRM1: 79c / 82c
> Min: 30.2 // VRM2: 57 / 59
> Max: 123.4 // 1.234 V / 1.253 V
> 
> *(1140/1500) and +63 mV. No Aux Voltage.*
> 
> FPS: 66.4 // GPU: 77c / 79c
> Score: 2779 // VRM1: 82c / 84c
> Min: 32 // VRM2: 57 / 60
> Max: 125.6 / 1.257 V / 1.272 V
> 
> *(1150/1500) and +69 mV. No Aux Voltage.*
> 
> FPS: 67.6 // GPU: 78c / 80c
> Score: 2827 // VRM1: 83c / 85c
> Min: 31.6 // VRM2: 57 / 60
> Max: 128.2 // 1.277 V / 1.294 V
> 
> No significant gains for 1160/1500 but just got higher temps. This test took a long day to finish so please don't ask for more. This is just for your reference and i'm not competing with scores.


That's a ton of information, thanks a bunch! It looks pretty good but it's a shame the VRM1 doesn't have great cooling.


----------



## smoke2

Quote:


> Originally Posted by *kizwan*
> 
> Sgt Bilko is running AMD CPU FX-8350 CPU. These AMD CPUs are power hungry CPU. FX-8150 alone at stock clock already draw more than 200W (overall system power consumption with only CPU is being stress). Haswell CPU like yours, if overclocked to 4.2GHz, CPU power consumption is less than 100W.
> 
> Combined with power hungry CPU like FX-8350 & dual 290's, definitely his rig power consumption is going a lot higher than yours.
> 
> I found interesting info about my 3820 power draw at stock clock. Over 150W power draw at stock clock.
> My 290's Elpida & Hynix cards max voltage are 1.227V & 1.219V before Vdroop respectively. Do you think it's not good idea when overclock to set voltage to +100mV?
> 
> Voltages after Vdroop are 1.219V & 1.203V respectively.


If that's right, then his OCed FX-8150 CPU has almost 300W consumption? Could that be possible?
Quote:


> Originally Posted by *Forceman*
> 
> I hit 340 watts running Tomb Raider in both Ultra and Ultimate settings. That's at 1100/1325/+100mV. Furmark was 370 at those settings. At 1200/1400/+100mV Tomb Raider was 380.
> 
> That's with a single card - I don't understand how Sgt Bilko is using 250W more than that, even with an AMD CPU. I don't know what settings he was using for his benchmark run, but I get 66 FPS on Ultimate so either he was running something lower than that, or it was with Crossfire enabled. Hopefully he'll chime in with his benchmark settings.
> I'd check it with the offset under load just to be sure, but as long as the VRM/core temps are okay I don't think it would be a problem - I'm assuming that's going to be around 1.3 or 1.31V load after droop.


Are the 340W and 380W peak values?


----------



## kizwan

Quote:


> Originally Posted by *smoke2*
> 
> If its right then his OCed FX-8150 CPU have almost 300W consumption? Could it be possible?


Power draw from the wall can exceed 300W if an FX-8350 is overclocked to 4.8GHz. *[Source]*

Most reviews use a Kill-A-Watt to measure power draw from the wall. From there you can guesstimate the CPU power consumption. So far the only reviews I have found that measure CPU power consumption almost accurately are Sin0822's OC guides. For example, 6-core SB-E CPUs can consume more than 300W (*not* power draw from the wall) if overclocked to at least 4.8GHz. For Haswell CPUs overclocked to 5GHz, CPU power consumption is only 182W.
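The guesstimate method described above (two Kill-A-Watt readings, scaled by PSU efficiency) can be sketched like this; the 0.87 efficiency and the sample readings are assumptions for illustration, not measured figures:

```python
def estimate_component_power(wall_loaded: float, wall_idle: float,
                             psu_efficiency: float = 0.87) -> float:
    """Guesstimate a component's extra DC draw from two wall readings.

    Very rough: PSU efficiency shifts with load, and the component's own
    idle draw stays hidden inside wall_idle.
    """
    return (wall_loaded - wall_idle) * psu_efficiency

# Hypothetical readings: 310W with only the CPU stressed, 90W at idle
print(round(estimate_component_power(310, 90), 1))  # 191.4
```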


----------



## the9quad

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Is this with 13.12 catalyst? or 13.11 whql? I've been stubborn and haven't switched yet lol


Sorry for the late reply; it's on the latest WHQLs, 13.12.


----------



## smoke2

Quote:


> Originally Posted by *kizwan*
> 
> Power draw from the wall can exceeds 300W if FX-8350 is overclocked to 4.8GHz. *[Source]*
> 
> Most review use Kill-A-Watt to measure power draw from wall. From there you can guestimate the CPU power consumption. So far the only review I found that measure almost accurate CPU power consumption is Sin0822's OC guides. For example 6-cores SB-E CPUs can consume more than 300W (*not* power draw from the wall) if overclocked to at least 4.8GHz. For Haswell CPUs, when overclocked to 5GHz, CPU power consumption is only 182W.


What is the power consumption of a Haswell overclocked to 4.5GHz?
Is the maximum power consumption of the R9 290 300W?
Do you think 700W is enough for my rig?


----------



## Mr357

Quote:


> Originally Posted by *smoke2*
> 
> What is the power consumption of a Haswell overclocked to 4.5GHz?
> Is the maximum power consumption of the R9 290 300W?
> Do you think 700W is enough for my rig?


Power consumption depends on voltage and current draw, not just frequency. If you're not overvolting the 290, 700W should be fine.
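A rough way to see why voltage matters so much: dynamic switching power scales roughly as C·V²·f, so a voltage bump costs quadratically while a clock bump alone costs only linearly. The baseline wattage and ratios below are toy numbers for illustration, not measured 290 data:

```python
def dynamic_power(base_watts: float, v_ratio: float, f_ratio: float) -> float:
    """Scale a baseline dynamic power by the classic C*V^2*f relation.

    Leakage is ignored, so treat the result as a rough lower bound.
    """
    return base_watts * v_ratio ** 2 * f_ratio

STOCK = 250.0  # assumed stock board power; illustrative only

# +10% core clock at stock voltage vs. the same clock with ~8% more voltage
print(round(dynamic_power(STOCK, 1.00, 1.10), 1))  # 275.0
print(round(dynamic_power(STOCK, 1.08, 1.10), 1))  # 320.8
```

The point: frequency alone adds ~25W in this toy model, but the overvolt needed to hold that frequency adds nearly twice that again.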


----------



## kizwan

Quote:


> Originally Posted by *smoke2*
> 
> What is the power consumption of a Haswell overclocked to 4.5GHz?
> Is the maximum power consumption of the R9 290 300W?
> Do you think 700W is enough for my rig?


This should give you an idea of Haswell power consumption. *[Source]*

Regarding the R9 290, maybe. I don't know the actual R9 290 power consumption; I can only find power-draw-from-the-wall figures.

You have a 4670K & R9 290, right? Yes, 700W is more than enough.


----------



## Krusher33

My FX-8350 @ 1.46V, 290X @ stock volts, and custom water cooling were only drawing about 580W at the wall with both under full load. So 700W for one 290 is plenty and then some.


----------



## rdr09

I am using a 700W PSU and it can handle my i7 SB at 4.9GHz and my 290 at 1320/1620 in benches. 700W is plenty.

Unless the 700W PSU is made by Inland or Raidmax.


----------



## smoke2

OK, I've finally made the decision to go with 700W.

And I'm planning to buy a new non-reference Radeon R9 290.
I'm stuck between the Sapphire Tri-X and the Gigabyte Windforce.

I prefer the quieter card.
The Sapphire is a little pricier (about 25€) and has only a 2-year warranty versus Gigabyte's 3 years.
Which one do you recommend, and why?

Ideally from someone who has experience with both.


----------



## brazilianloser

Quote:


> Originally Posted by *rdr09*
> 
> I am using a 700W psu and it can handle my i7 SB at 4.9GHz and my 290 at 1320/1620 in benches. 700 is plenty.
> 
> unless the 700w psu is made by inland or raidmax.


True. My 860 only pulls a max of ~750W from the wall with two cards.


----------



## ZealotKi11er

My system (3770K @ 4.6GHz @ 1.35V + 290X @ +50mV) has pulled a max of 580W in BF4 and mining.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *brazilianloser*
> 
> True. Since my 860 only pulls max 750~... with two cards from the wall.


Are you overvolted? My rig rebuild will have the same PSU, full watercooling, and eventually a second 290...

I think I should be good for some pretty decent OC'ing, but you can never be too sure. This will be paired with a 3770K at or just under 5GHz.


----------



## the9quad

I have no idea what I'm pulling, but it's working so I don't care.


----------



## brazilianloser

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Are you overvolted? My rig re-build will have the same PSU, full watercooling and eventually a 2nd 290...
> 
> I think I should be good for some pretty decent OC'ing, but you can never be too sure. This will be paired with a 3770k at or just under 5GHz.


When I ran my test using a Kill-A-Watt: two cards, i7 3770K @ 4.5-4.6, 6 fans, 1 SSD, 1 HDD, both cards running at 1100/1250, power limit set to +50%. It peaked at 750-760W.
On the weekend I will be receiving all my water loop parts and will run the tests once again, at stock and OC'd, to see whether I'll need an upgrade.


----------



## bond32

Quote:


> Originally Posted by *ZealotKi11er*
> 
> My System 3770K @ 4.6GHz @ 1.35v + 290X @ +50mV has pulled MAX 580W in BF4 and Mining.


This was with one card? Quite a bit more than mine with one card... What else are you running?


----------



## BradleyW

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Alrighty, for Tomb Raider i measured a peak power usage of 650w with 1 R9 290 at 1200/1300 +100 mV in AB (GPU usage was 100%)
> 
> 
> 
> EDIT: Just ran FurMark with one card at stock clocks 980/1250, got a max peak of 660w power draw......


Wait wait wait..... That's just with 1 card? That min fps is the same as my min fps in CFX.


----------



## Forceman

Quote:


> Originally Posted by *BradleyW*
> 
> Wait wait wait..... That's just with 1 card? That min fps is the same as my min fps in CFX.


I get about the same minimum (I think it was 116) and average with my single card. Maybe it's a CPU limitation or a benchmark glitch that is making the min the same. Your average is right for CFX though?


----------



## BradleyW

Quote:


> Originally Posted by *Forceman*
> 
> I get about the same minimum (I think it was 116) and average with my single card. Maybe it's a CPU limitation or benchmark glitch that is making the min the same. You average is right for CFX though?


Not sure. I get half the min fps in the benchmark when CFX is disabled!


----------



## ZealotKi11er

Quote:


> Originally Posted by *bond32*
> 
> This was with one card? Quite a bit more than mine with one card... What else you running?


580W is the max reading; it probably never goes that high. Mining, which really pushes the GPU, is 440W.


----------



## devilhead

3x 290s (947/1250) + 1x 290X (1025/1500), watercooled, pull 1400W from the wall while mining.

So with an overclock... you need a lot of power.


----------



## smoke2

Quote:


> Originally Posted by *ZealotKi11er*
> 
> My System 3770K @ 4.6GHz @ 1.35v + 290X @ +50mV has pulled MAX 580W in BF4 and Mining.


Quite high...
Is that a single 290X?
Do you have a waterblock or many hard drives?


----------



## BradleyW

Can someone test power draw with 290X CFX and 3930K? I'm wondering if my 860 is enough now.


----------



## Mr357

Quote:


> Originally Posted by *BradleyW*
> 
> Can someone test power draw with 290X CFX and 3930K? I'm wondering if my 860 is enough now.


Hard to say, even if you're running them at stock.


----------



## neurotix

Can someone please logically explain to me the reason why people say ASIC quality doesn't matter on Hawaii?

My card has 80% ASIC, which is on the high side, and it does seem to overclock better than a lot of cards I see other people with. So, I'm wondering the actual logic behind why ASIC doesn't matter on Hawaii. The biggest reason I've heard is that it's because ASIC was coded for Tahiti, so it works properly with Tahiti, but since Hawaii is a new architecture it doesn't apply, even though it will still let you read the ASIC quality.

Don't get me wrong, I'm well aware that ASIC says nothing about a card, and that each card is individual and will clock differently. You have some 7970s that can do 1300mhz with high ASIC and others that can do it with low ASIC, and there's even some with very high ASIC that can run nowhere near 1300mhz without crashing, and so on.

Just curious why people say it doesn't apply for 290/X. Thanks.


----------



## ZealotKi11er

Quote:


> Originally Posted by *BradleyW*
> 
> Can someone test power draw with 290X CFX and 3930K? I'm wondering if my 860 is enough now.


Stock GPUs are 300W max, plus 260W for the rest of the system. I would say you are hitting 800-900W.
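That back-of-the-envelope budget can be written out explicitly (all figures are the post's rough numbers, not measurements):

```python
# Rough DC power budget for a 290X CFX + 3930K rig, per the post's figures
gpu_max_each = 300     # stock 290X worst case, as estimated in the post
rest_of_system = 260   # CPU, board, drives, fans; the post's estimate
total_dc = 2 * gpu_max_each + rest_of_system
print(total_dc)  # 860 -- inside the 800-900W range quoted
```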


----------



## devilhead

Quote:


> Originally Posted by *BradleyW*
> 
> Can someone test power draw with 290X CFX and 3930K? I'm wondering if my 860 is enough now.


I have a 3930K and 290Xs. At mining, with 2x 290X at 1025/1500, +28mV, +20 power, it pulls around 800W (CPU at 4.8GHz on offset, so it idles at 1200MHz and 0.85V).

The cards are watercooled. I didn't check how much it pulled at 1300/1700 with the 3930K at 5.2GHz and 1.5V in Firestrike, but I think it was much more than 900W. I have an AX1200i; that's enough.


----------



## ZealotKi11er

Quote:


> Originally Posted by *smoke2*
> 
> Quite high...
> It's a single 290X?
> You have waterblock or many hard drives?


Actually, ~450W when playing BF4.
Yes, it's a single 290X.

Edit:

Does the 290X handle V-Sync differently? I was not able to notice much input lag in BF4 with V-Sync ON. With 2x HD 7970 I could easily notice it. I checked fps with MSI AB and it's a steady 60 fps straight line. The core fluctuates between 950-1000MHz.


----------



## BradleyW

Quote:


> Originally Posted by *ZealotKi11er*
> 
> GPUs stock are 300W MAX. 260W for the rest of the system. I would say you are hitting 800-900W.


Quote:


> Originally Posted by *devilhead*
> 
> I have a 3930K and 290Xs. At mining, with 2x 290X at 1025/1500, +28mV, +20 power, it pulls around 800W (CPU at 4.8GHz on offset, so it idles at 1200MHz and 0.85V).
> 
> The cards are watercooled. I didn't check how much it pulled at 1300/1700 with the 3930K at 5.2GHz and 1.5V in Firestrike, but I think it was much more than 900W. I have an AX1200i; that's enough.


Thank you.


----------



## cam51037

Are there any MSI 290 Gaming edition owners in this thread?

If so, next time you run a game for a while, would you mind monitoring the VRM temperatures with GPU-Z and reporting back?

Right now I could purchase a Sapphire Tri-X 290, but I've heard mixed things about the VRM cooling. I think it would be fine if Sapphire allowed you to take the heatsink off; I'd put some heatsinks on the VRMs, but unfortunately that would void the warranty.


----------



## Krusher33

Quote:


> Originally Posted by *neurotix*
> 
> Can someone please logically explain to me the reason why people say ASIC quality doesn't matter on Hawaii?
> 
> My card has 80% ASIC, which is on the high side, and it does seem to overclock better than a lot of cards I see other people with. So, I'm wondering the actual logic behind why ASIC doesn't matter on Hawaii. The biggest reason I've heard is that it's because ASIC was coded for Tahiti, so it works properly with Tahiti, but since Hawaii is a new architecture it doesn't apply, even though it will still let you read the ASIC quality.
> 
> Don't get me wrong, I'm well aware that ASIC says nothing about a card, and that each card is individual and will clock differently. You have some 7970s that can do 1300mhz with high ASIC and others that can do it with low ASIC, and there's even some with very high ASIC that can run nowhere near 1300mhz without crashing, and so on.
> 
> Just curious why people say it doesn't apply for 290/X. Thanks.


I understood ASIC differently. It has nothing to do with how well your card overclocks. It has more to do with leakage: if the ASIC is low, the chip leaks more and needs more juice, and thus runs hotter, which is why the chart says it's better for watercooling.

So the card's overclocking potential still rests on the GPU. But hypothetically, if two GPUs had the same overclocking potential, the one with the lower ASIC % would need more volts than the other.
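The trade-off described above can be put in toy-model form: same clock ceiling, but the lower-ASIC part needs extra voltage to offset leakage. Every constant and the linear relation below are invented purely for illustration; real silicon does not follow a neat straight line:

```python
def required_voltage(target_mhz: float, asic_pct: float,
                     v_base: float = 1.10, mv_per_100mhz: float = 50.0,
                     leak_mv_per_pct: float = 2.0) -> float:
    """Toy model: voltage needed for a target clock, penalised by low ASIC.

    All parameters are hypothetical; 85% is treated as a 'neutral' ASIC.
    """
    clock_term = (target_mhz - 1000.0) / 100.0 * mv_per_100mhz / 1000.0
    leakage_term = max(85.0 - asic_pct, 0.0) * leak_mv_per_pct / 1000.0
    return v_base + clock_term + leakage_term

# Two cards with the same 1150MHz ceiling but different ASIC quality:
print(round(required_voltage(1150, 80), 3))  # 1.185
print(round(required_voltage(1150, 70), 3))  # 1.205 -- lower ASIC, more volts
```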


----------



## Loktar Ogar

Quote:


> Originally Posted by *cam51037*
> 
> Are there any MSI 290 Gaming edition owners in this thread?
> 
> If so, next time you start up a game for awhile, mind monitoring the VRM temperatures with GPU-z and reporting back?
> 
> Right now I could purchase a Sapphire Tri-X 290 but I've heard mixed things about the VRM cooling. I think it would be fine if Sapphire allowed you to take the heatsink off, I'd put some heatsinks on the VRM's, but unfortunately that would void my warranty.


I'd like to see some GPU and VRM temp readings with the MSI 290 Gaming cards as well. The GPU and VRM temps are easily fixed by making a fan profile in AB; my previous test was all about Auto fan settings and how well they perform.

Also, as mentioned in another thread, you no longer have to put VRM1 heatsinks on the Tri-X because the heatsink is already in contact with VRM1 via a thermal pad. http://www.overclock.net/t/1452307/tpu-sapphire-announces-the-radeon-r9-290x-290-tri-x-graphics-cards/480#post_21596378


----------



## cam51037

Quote:


> Originally Posted by *Loktar Ogar*
> 
> I'd like to see some GPU and VRM temp readings as well with MSI 290 Gaming cards. The GPU and VRM temps are easily fixed by making a fan profile in AB. In my previous test it was all about Auto fan settings and how well it will perform.
> 
> Also, as mentioned in other thread that you no longer have to put VRM1 heatsinks on TRI-X because the heatsink is already in contact VRM1 with padding. http://www.overclock.net/t/1452307/tpu-sapphire-announces-the-radeon-r9-290x-290-tri-x-graphics-cards/480#post_21596378


It's really interesting that it has contact with the cooler via a pad, yet it still gets so hot.

For my setup, the quieter, the better. Hopefully I'd watercool the card sooner or later (the system is all ready for it) but it's nice to have a decent stock cooler as well.


----------



## alawadhi3000

I'm running three stock R9 290s on a Corsair RM 1000W.


----------



## neurotix

Quote:


> Originally Posted by *Krusher33*
> 
> I understood ASIC differently. It has nothing to do with how well your card overclocks and what not. It has more to do with leakage. If low ASIC, then it'll need more juice, thus run hotter which is why the chart says better for watercooling.
> 
> So the card's overclocking potential still rests on the GPU. But for hypothetical sake, if 2 GPU had the same overclocking potential, the one with lower ASIC % will need more volts than the other one.


Yep, I understand this perfectly.

Lower ASIC has higher electrical leakage and thus needs more voltage for the same clock compared to a higher-ASIC card. Still, the max clock it can reach depends on the chip itself.

This still doesn't explain why people have said ASIC doesn't matter for Hawaii chips, though.


----------



## shilka

Quote:


> Originally Posted by *alawadhi3000*
> 
> I'm running three stock R9 290 on a Corsair RM 1000W.


Are you aware of all the shortcuts that were taken in its build quality?

http://www.overclock.net/t/1455892/why-you-should-not-buy-a-corsair-rm-psu


----------



## devilhead

Quote:


> Originally Posted by *alawadhi3000*
> 
> I'm running three stock R9 290 on a Corsair RM 1000W.


Not for long if you are gaming or mining.

If you're just looking at the desktop, then it's OK.


----------



## Bartouille

Quote:


> Originally Posted by *neurotix*
> 
> Can someone please logically explain to me the reason why people say ASIC quality doesn't matter on Hawaii?
> 
> My card has 80% ASIC, which is on the high side, and it does seem to overclock better than a lot of cards I see other people with. So, I'm wondering the actual logic behind why ASIC doesn't matter on Hawaii. The biggest reason I've heard is that it's because ASIC was coded for Tahiti, so it works properly with Tahiti, but since Hawaii is a new architecture it doesn't apply, even though it will still let you read the ASIC quality.
> 
> Don't get me wrong, I'm well aware that ASIC says nothing about a card, and that each card is individual and will clock differently. You have some 7970s that can do 1300mhz with high ASIC and others that can do it with low ASIC, and there's even some with very high ASIC that can run nowhere near 1300mhz without crashing, and so on.
> 
> Just curious why people say it doesn't apply for 290/X. Thanks.


ASIC quality doesn't matter. As you said, there are good clocking cards with both low and high ASIC. Heck, even Hynix vs Elpida doesn't matter! I've seen bad clocking Hynix and bad clocking Elpida, and vice versa.

Also, stability is subjective. I see a lot of people running Firestrike at 1300MHz+, but the truth is it probably wouldn't last 5 seconds in a real game. People play different games and stability differs: for example, with my 7950 I completed Tomb Raider 2013 at 1230/1850MHz and thought my OC was solid, then I was extremely disappointed when I found out it couldn't even hold 1.2GHz in Far Cry 3. Now I don't waste my time any more when trying to find a solid OC; I just play Far Cry 3 for hours, and maybe some day I will find an even more stressful game.

The other thing people don't seem to understand is that achieving an overclock at a lower voltage doesn't mean the card is a better OC card than one that needs more volts. What matters is HOW MUCH voltage you added FROM the stock voltage. One card may have a 1.175V stock voltage and another 1.23V (just some random numbers); 1.175V isn't any better than 1.23V. Both cards will consume about the same amount of power and achieve more or less the same clocks without adding any voltage, and the 1.23V card will NOT run hotter than the 1.175V one.


----------



## Heinz68

Quote:


> Originally Posted by *Loktar Ogar*
> 
> Also, as mentioned in other thread that you no longer have to put VRM1 heatsinks on TRI-X because the heatsink is already in contact VRM1 with padding. http://www.overclock.net/t/1452307/tpu-sapphire-announces-the-radeon-r9-290x-290-tri-x-graphics-cards/480#post_21596378


This might not be true; read this:
http://www.overclock.net/t/1456488/290-tri-x-non-x-version-silly-sapphire-tri-x-are-for-kids-some-temp-test-for-those-who-want-to-know/10#post_21548010 (also read the second post after that one).
Anyway, I'm getting my cards in a couple of days and will check it out.

EDIT
After further reading, maybe there is contact with the heatsink, but not the direct contact with the heatpipes that the GPU has.


----------



## Loktar Ogar

This has to be confirmed by DamnedLife; I just had a quick look at the card and never attempted to open it up due to the warranty seal. My observations of the GPU/VRM1 temps are not too far apart with mild overclocks, so my guess is there must be contact on top of VRM1...


----------



## mojobear

Quote:


> Originally Posted by *devilhead*
> 
> Not for long if you are gaming or mining.
> 
> If you're just looking at the desktop, then it's OK.


If you are cooling the R9 290s sufficiently with H2O etc., you get a lot less leakage...

I run an i7 4770K OC'd to 4.75GHz and 3x R9 290s. I measured watts from the wall:

Gaming: OC'd to 1150/1350 +75mV on all three cards nets me around 950W peak (this is Crysis 3 and BF4).

Mining: 930/1375 with no OC I get 870W peak.

My PSU is the Corsair AX1200.


----------



## lakigucci

Is there a way to prevent the Tri-X from throttling (as you can see in the GPU-Z screenshot)?
I overclocked the card on stock voltage and +0% power limit and it runs perfectly fine.
But when I monitor the clock speed it never stays at the 1100MHz I overclocked it to, and it just annoys me...
I just want it to run at 1100 no matter the GPU load...


----------



## rx7racer

I'm not aware of any way yet to lock the clocks on these newfangled GPUs NV and AMD are putting out. All the fuss about power consumption has put them on a save-power-whenever-possible kick.

Personally I hate it; the core voltage fluctuates way too much for my liking, along with the core clock. If there is a way to lock it, it is probably a BIOS mod. I haven't dug that deep yet since, for the most part, in demanding workloads where the CPU is feeding it enough data, it will stay at the set clock rate as long as power and temps are within the envelope.

Does someone want to point to a BIOS mod to remedy it? I wouldn't mind that myself.


----------



## ZealotKi11er

Quote:


> Originally Posted by *lakigucci*
> 
> Is there a way to prevent the Tri-X from throttling (like you see in the GPU-Z screenshot)?
> I overclocked the card on stock voltage and +0% power limit and it runs perfectly fine.
> But when I monitor the clock speed it never stays at the 1100MHz which I overclocked it to, and it just annoys me...
> I just want it to run @1100 no matter the GPU load...


Why do you want to run at 1100MHz? If the game/app can fully use the card, it will clock up to 1100MHz; if not, the card has no reason to run at a high clock and produce more heat and power consumption.


----------



## Widde

Quote:


> Originally Posted by *lakigucci*
> 
> Is there a way to prevent the Tri-X from throttling (like you see in the GPU-Z screenshot)?
> I overclocked the card on stock voltage and +0% power limit and it runs perfectly fine.
> But when I monitor the clock speed it never stays at the 1100MHz which I overclocked it to, and it just annoys me...
> I just want it to run @1100 no matter the GPU load...


Try upping the power limit a little; it looks like it doesn't get enough juice.

Or do you mean you want it at 1100 when at idle?

For some reason these cards don't like to be at max frequency when they don't need to be, or I could be totally wrong ^^ http://piclair.com/9gm7c my card


----------



## lakigucci

Cause I'm used to having a fixed clock, I guess, and I think the fluctuation makes my BF4 stutter when I spawn...


----------



## lakigucci

Quote:


> Originally Posted by *Widde*
> 
> Try upping the power limit a little; it looks like it doesn't get enough juice.
> 
> Or do you mean you want it at 1100 when at idle?
> 
> For some reason these cards don't like to be at max frequency when they don't need to be http://piclair.com/9gm7c my card


No, I mean that I want it to run @1100 when I'm playing BF4, is all...


----------



## ZealotKi11er

Quote:


> Originally Posted by *lakigucci*
> 
> No, I mean that I want it to run @1100 when I'm playing BF4, is all...


If the GPU is not being fully used it will not clock up to 1100MHz. It also will not clock up if it hits its power limit.

In BF4 you should be seeing 1100MHz clocks unless the CPU is a bottleneck, you don't have the power limit increased, or you have V-Sync on.


----------



## rx7racer

I totally understand, and wish to god that once the core entered its 3D power state it stayed there, instead of dropping in the middle of a match because my CPU couldn't keep up for 2 seconds and then going "oh, more data, let me get that for you."

I have the worst case of this with World of Tanks. It's fine if I play consistently for a bit, but when I do one match, step away, and just sit in the garage, then go back into a match, I stutter every 3 seconds because the clocks are doing the moonwalk to match the GPU usage rate. The reality is that applications and programs make bad use of code; it gets ugly, and I need my hardware to be stable and steady so it can do what is needed instantly.

WoT is horribly optimized and is essentially single-core. They say multi, but it's just a secondary thread for the audio, so there's no real multithreading of the engine itself until Havok gets utilized.

Way off course, but I agree: there are users who don't care about the power saving, and for certain applications a steadily clocked piece of hardware is more beneficial. And yes, even though the time it takes a GPU to change power states is technically not noticeable, the constant fluctuation can have knock-on effects that the user does notice in the end product displayed.


----------



## Widde

Quote:


> Originally Posted by *lakigucci*
> 
> No i mean that i want it to run @1100 when im playing BF4 is all..


If that is in BF4, you should try upping the power limit to 25% and see; I see 100% on 2 290s at 1100MHz.


----------



## lakigucci

I just set the power limit to the max of +50%, and it averages more towards 1100 (1080-1100), which is an improvement.
But it still won't lock onto 1100MHz. Maybe it's just in my head; I just don't want it to drop whilst gaming.

If I up the voltage besides that with +50mV, it sticks to the higher 1095-1099 range.

Edit2: I noticed that in Heaven on Extreme HD it doesn't throttle the clocks back that often and actually sticks to 1100 during a full scene. Could it be that the GPU load in BF4 isn't enough?


----------



## Sazz

Quote:


> Originally Posted by *lakigucci*
> 
> I just set the power limit to the max of +50%, and it averages more towards 1100 (1080-1100), which is an improvement.
> But it still won't lock onto 1100MHz. Maybe it's just in my head; I just don't want it to drop whilst gaming.
> 
> If I up the voltage besides that with +50mV, it sticks to the higher 1095-1099 range.
> 
> Edit2: I noticed that in Heaven on Extreme HD it doesn't throttle the clocks back that often and actually sticks to 1100 during a full scene. Could it be that the GPU load in BF4 isn't enough?


You mean the clocks dropping? I am currently using the stock cooler w/ a custom fan curve and +15 power limit, and the most the clock drops in BF4 is down to 1050 (using 1100/1350 daily clocks). But 99% of the time it stays at 1100, and temps stay at 75C.


----------



## Falkentyne

Quote:


> Originally Posted by *neurotix*
> 
> Can someone please logically explain to me the reason why people say ASIC quality doesn't matter on Hawaii?
> 
> My card has 80% ASIC, which is on the high side, and it does seem to overclock better than a lot of cards I see other people with. So, I'm wondering the actual logic behind why ASIC doesn't matter on Hawaii. The biggest reason I've heard is that it's because ASIC was coded for Tahiti, so it works properly with Tahiti, but since Hawaii is a new architecture it doesn't apply, even though it will still let you read the ASIC quality.
> 
> Don't get me wrong, I'm well aware that ASIC says nothing about a card, and that each card is individual and will clock differently. You have some 7970s that can do 1300mhz with high ASIC and others that can do it with low ASIC, and there's even some with very high ASIC that can run nowhere near 1300mhz without crashing, and so on.
> 
> Just curious why people say it doesn't apply for 290/X. Thanks.


Because my 290X has 77% ASIC and needs +13mV to be fully stable (+25mV to be certain) at 1100MHz, although +0mV is stable in Heaven and most other programs (but I can get artifacting in 3DMark or Skyrim when temps rise).


----------



## the9quad

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If GPU is not being fully used it will not clock up to 1100MHz. Also it will not clock up if it runs out of power draw.
> 
> In BF4 you should be seeing 1100Mhz clocks unless CPU is bottleneck, You dont have power increased or have V-Sync ON.


Are you sure about that? That's not the way it works for me; just because GPU usage isn't 100% doesn't mean the clocks throttle. I cap frames in BF4, so I'm rarely if ever at 100% GPU usage on my cards, but they still run at max core and RAM frequency. They never throttle.


----------



## psyside

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Do you use your card for mining? Very curios to see VRM1 temps.


*24/7 max performance setting 900kh/s.*



*24/7 max efficiency setting 830kh/s.*



*
24/7 CM 850W Vanguard voltage regulation statistic
*


----------



## Forceman

Quote:


> Originally Posted by *neurotix*
> 
> This still doesn't explain why people have said ASIC doesn't matter for Hawaii chips, though.


Because the register that GPU-Z is reading isn't ASIC quality on the Hawaii cards. AMD guys have already said that GPU-Z isn't reading what it thinks it is reading, and that the value it is pulling out of there has nothing to do with the card quality. Here's the quote:
Quote:


> ASIC "quality" is a misnomer propagated by GPU-Z reading a register and not really knowing what it results in. It is even more meaningless with the binning mechanism of Hawaii.


http://forum.beyond3d.com/showpost.php?p=1808073&postcount=2130


----------



## PillarOfAutumn

I bought a Korean monitor that will use one of the DVI-D ports. I have two laptop panels that will be converted to external monitors using a control board that supports HDMI and DVI-I input. So my question is: how do I connect the two DVI-I monitors to the 290? Will I need 1 HDMI + 1 DVI-D, or can I use one DVI-D output with a splitter going to the two monitors?


----------



## Derpinheimer

Quote:


> Originally Posted by *psyside*
> 
> *24/7 max performance setting 900kh/s.*
> 
> *24/7 max efficiency setting 830kh/s.*
> 
> *
> 24/7 CM 850W Vanguard voltage regulation statistic
> *


Very nice! Hynix vram?

Love that voltage regulation BTW.
Cooler Master M600 -> Max: 11.84 Min: 11.33
Max VRM1 in for GPU: 180w (same as you)


----------



## ZealotKi11er

Quote:


> Originally Posted by *the9quad*
> 
> you sure about that? that's not the way it works with me, just because GPU usage isn't 100% doesn't mean the clocks throttle. I cap frames in bf4 so am rarely if ever at 100% gpu usage on my cards but they still are at max core and ram frequency. They never throttle.


I was just playing BF4 with V-Sync and the clocks dropped to ~950MHz. Dota 2, which does not need a lot of GPU power, plays at ~700-800MHz.


----------



## alawadhi3000

Quote:


> Originally Posted by *shilka*
> 
> Are you aware of all the shortcuts that was taken in build quality
> 
> http://www.overclock.net/t/1455892/why-you-should-not-buy-a-corsair-rm-psu


Yeah, I read that thread a few days ago; the RM 1000W was the best PSU I could find locally.
I could've imported a 1000W Tachyon for the same price, but I'd have to wait 7-8 days, which is a lot; I can make $150-$200 in that time.
Quote:


> Originally Posted by *devilhead*
> 
> not for long if you are gaming or mining
> 
> if just looking at the desktop, then ok


I doubt I'm using more than 900W on that PC.


----------



## Loktar Ogar

Quote:


> Originally Posted by *Heinz68*
> 
> This might not be true, read this
> http://www.overclock.net/t/1456488/290-tri-x-non-x-version-silly-sapphire-tri-x-are-for-kids-some-temp-test-for-those-who-want-to-know/10#post_21548010 also read the second post after this post.
> Anyway I'm getting my cards in couple days and will check it out.
> 
> EDIT
> After further reading it, maybe there is contact with the heatsink but not direct contact with the heatpipes as the GPU has.


Quote:


> Originally Posted by *Loktar Ogar*
> 
> This has to be confirmed by DamnedLife, i just had a quick view of the card and never attempted to open it up due to the warranty seal with my card. My observation regarding the GPU/VRM1 temps are not too far away with mild overclocks, and my guess there must be a contact on top of the VRM1...


I did look at it out of curiosity but did not disassemble it. I can safely confirm that the whole VRM1 row has a gray thermal pad with a large heatsink base area pressed onto it. Surprisingly, that heatsink runs across all the VRAMs and VRMs, but it seems there is no thermal pad on the VRAMs, just regular black padding. I don't have a camera to take a pic of it, but just refer to AnandTech's review below.









Sapphire Radeon R9 290 Tri-X OC Review: http://www.anandtech.com/show/7601/sapphire-radeon-r9-290-review-our-first-custom-cooled-290

Highlighted in Yellow is where the thermal pads were placed to cool those VRMs!











EDIT: Added pic and review link.


----------



## blak24

Hello everyone, is there any news about aftermarket coolers for the card? Or are the Accelero and Icy Vision v2 still the only ones at a fair price?


----------



## kcuestag

Quote:


> Originally Posted by *blak24*
> 
> Hello everyone, are there any news about aftermarket coolers for the card? Or only the Accelero and Icy Vision v2 for a fair price?


If you mean separate coolers to buy, I believe those are the only two options right now (Unless you go water).

But there are cards with custom coolers already like the Gigabyte Windforce or the Sapphire Tri-X, and I believe there should be an ASUS DirectCUII soon?


----------



## Thanos1972

I don't see guys recommending the Prolimatech MK-26 for this card... why not?


----------



## the9quad

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I was just playing BF4 with V-Sync and clocks dropped to ~ 950MHz. Dota 2 which does not need a lot of GPU power plays ~ 700-800MHz.


That does not happen for me at all. The only time my cards throttle is on temps, which never happens since I use a custom fan profile to prevent it. So even if my card's usage drops to 5%, the frequency will stay maxed on core and memory. I can post a video later today if you wish.


----------



## psyside

Quote:


> Originally Posted by *Derpinheimer*
> 
> Very nice! Hynix vram?
> 
> Love that voltage regulation BTW.
> Cooler Master M600 -> Max: 11.84 Min: 11.33
> Max VRM1 in for GPU: 180w (same as you)


Thanks, the CM Vanguard is absolutely insane. My friend has the 850W as well; during mining with *2x 280X and 1x 6870,* the 12V rail doesn't drop a single mV, same as on my unit lol!! Measured with HWiNFO and a multimeter as well, absolutely incredible.

Yes Hynix vram.


----------



## ottoore

Quote:


> Originally Posted by *Thanos1972*
> 
> i do not see guys recommending prolimatech Mk 26 for this card...why not?


Tomorrow I'll receive the Prolimatech MK-26. I'll mount it with 4x 140mm fans and Coollaboratory Liquid Pro on the GPU.


----------



## shilka

Why is the CM V listed as Vanguard in some places?

Always makes me think of the HMS Vanguard fast battleship.


----------



## psyside

Quote:


> Originally Posted by *shilka*
> 
> Why is the CM V listed as Vanguard in some places?


Vanguard is the original name afaik? : /


----------



## Jack Mac

It makes me think of this:


----------



## PillarOfAutumn

Can someone please help?

I bought a Korean monitor that will use one of the DVI-D ports. I have two laptop panels that will be converted to external monitors using a control board that supports HDMI and DVI-I input. So my question is: how do I connect the two converted monitors to the 290? Will I need 1 HDMI + 1 DVI-D, or can I use one DVI-D output with a splitter going to the two monitors?


----------



## Timx2

I've had my Sapphire 290X with an EKWB waterblock for a few weeks now and I've done some overclocking.

At stock settings I get the following results:

Valley: 2.552
Firestrike graphic score: 11.238
Firestrike extreme graphic score: 5.166

Without touching any voltage I can get 1100Mhz on the core and 1425Mhz on the memory. The results are:

Valley: 2.789 (+9%)
Firestrike graphic score: 12.303 (+9%)
Firestrike extreme graphic score: 5.708 (+10%)

These settings have proven absolutely stable for me (2-hour Valley loop, 50+ Firestrike Extreme first-scene loops, and 3+ hours of BF4). Firestrike Extreme especially was hard to get fully stable (for once Valley wasn't hard at all; Valley was even stable @1150MHz core and 1425MHz memory).

When I touch the voltage (I use TriXX) I can get 1200MHz on the core and 1425MHz on the memory with +143mV on the core. The results are:

Valley: 2.935 (+15%)
Firestrike graphic score: 13.090 (+16%)
Firestrike extreme graphic score: 6.067 (+17%)

The temps are good: GPU 48C, VRM1 58C, VRM2 39C with the fans around 50%.

As above, these settings are absolutely stable for me (2-hour Valley loop, 50+ Firestrike Extreme first-scene loops, and 3+ hours of BF4).

With MSI Afterburner I could get the memory to 1575MHz with +50mV on the AUX, but then I had to lower my core to 1175MHz (since I can't go higher than +100mV on the core there).

I hope my information helps fellow overclockers. And by the way, for those who care, my ASIC is 78%.
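For anyone checking the math, the percentage gains quoted above follow directly from the raw scores (a quick sketch; the dotted numbers are read as thousands, e.g. 2.552 → 2552):

```python
# Percent improvement of each overclock over stock, using the scores
# quoted above (dots read as thousands separators, e.g. 2.552 -> 2552).
def pct_gain(stock, oc):
    return round((oc / stock - 1) * 100)

stock   = {"Valley": 2552, "Firestrike": 11238, "FS Extreme": 5166}
oc_1100 = {"Valley": 2789, "Firestrike": 12303, "FS Extreme": 5708}
oc_1200 = {"Valley": 2935, "Firestrike": 13090, "FS Extreme": 6067}

for bench in stock:
    print(bench, pct_gain(stock[bench], oc_1100[bench]),
          pct_gain(stock[bench], oc_1200[bench]))
```

This reproduces the +9/+9/+10 gains at 1100MHz and +15/+16/+17 at 1200MHz.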


----------



## Loktar Ogar

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Timx2*
> 
> I've got my Sapphire 290X with EKWB waterblock for a few weeks now and I've done some overclocking.
> 
> At stock settings I get the following results:
> 
> Valley: 2.252
> Firestrike graphic score: 11.238
> Firestrike extreme graphic score: 5.166
> 
> Without touching any voltage I can get 1100Mhz on the core and 1425Mhz on the memory. The results are:
> 
> Valley: 2.789 (+9%)
> Firestrike graphic score: 12.303 (+9%)
> Firestrike extreme graphic score: 5.708 (+10%)
> 
> These settings proof to be absolute stable for (2 hours Valley loop and 5+ firestrike extreme first scene loop & BF4 3+ hours). Especially Firestrike Extreme was hard to get fully stable (for once Valley wasn't hard at all. Valley was even stable @1150Mhz on core and 1425Mhz on the memory). Also BF4 is stable for at least 3+ hours.
> 
> When I touch the voltage (I use Trixx) I can get 1200Mhz on the core and 1425Mhz on the memory with +137mv on the core. The results are:
> 
> Valley: 2.935 (+15%)
> Firestrike graphic score: 13.090 (+16%)
> Firestrike extreme graphic score: 6.067 (+17%)
> 
> The temps are good. GPU: 48C, VRM1: 58C, VRM2: 39 with the fans arround 50%.
> 
> Like above these settings are absolute stable for me (2 hours Valley loop and 5+ firestrike extreme first scene loop & BF4 3+ hours).
> 
> With MSI Afterburner I could get the memory to 1575Mhz with +50mv on the AUX but then I had to lower my core to 1175Mhz (cause I can't get it higher then +100mv).
> 
> I hope my information contributes to other fellow overclockers. And by the way, for those who care my ASIC is 78%).






Best combination (R9 290/290x + watercooling). Thanks!


----------



## alancsalt

The vanguard is at the front, the advance unit.

If ASIC really meant something significant, wouldn't they be charging a premium for high ASIC cards?


----------



## sugarhell

High-ASIC cards are awful for watercooling/overclocking.

The 7990 has really high-ASIC chips; based on the VDDC, I think they're 85%+ ASIC. And you paid a premium for the 7990.


----------



## kizwan

@Arizonian,

Please update me. My 290's are now underwater.


----------



## Brian18741

Guys, any thoughts on the Asus R9 290 DCII or the MSI R9 290 Gaming cards? Gonna hit the order button this evening if there's nothing wrong with either of them!


----------



## blak24

Quote:


> Originally Posted by *kcuestag*
> 
> If you mean separate coolers to buy, I believe those are the only two options right now (Unless you go water).
> 
> But there are cards with custom coolers already like the Gigabyte Windforce or the Sapphire Tri-X, and I believe there should be an ASUS DirectCUII soon?


Yep, I was referring to aftermarket coolers for reference cards. I bought an unlockable Gigabyte more than a month ago; it's precious now


----------



## Arizonian

Quote:


> Originally Posted by *kizwan*
> 
> @Arizonian,
> 
> Please update me. My 290's are now underwater.
> 
> 
> Spoiler: Warning: Spoiler!


Looks great. Congrats - updated









Quote:


> Originally Posted by *Brian18741*
> 
> Guys, any thoughts on the Asus R9 290 DCII or the MSI R9 290 Gaming cards? Gonna hit the order button this evening if there's nothing wrong with either of them!


Nothing wrong on either.

*ASUS R9 290X DirectCU II OC Video Card Review* [H]ardOCP

Personally I'd choose the DCUII for the back plate.

MSI has a quieter fan on the Gaming.


----------



## hellkama

I just got my stock R9 290 (Gigabyte) and it seems I am not able to overclock the memory at all.

I can even undervolt the GPU and still overclock the core at the same time. But no matter the voltages, I cannot overclock the memory. Adding even +5MHz to the stock memory clock brings performance down, and any more crashes the card. I've tried upping both the core voltage and the AUX voltage (MSI Afterburner Beta 18). Has anyone else experienced this? Does this make any sense at all?

The core is clocking nicely (-50mV, ~1.1V, 1050MHz), but I am planning to go water cooling with this card, so it would be great to get more out of the memory too.


----------



## Hattifnatten

Could you update me to water? You missed it last time I wrote it








Quote:


> Originally Posted by *Hattifnatten*
> 
> Finally got my 90° fitting today, and I was able to fill the loop up. I've actually had the card blocked for a few days, but thanks to EK putting the inlet/outlet WAY too close to the rear of the case, I was having so much trouble putting it together. Going from the block to the rad in a straight line? Nope. Unless you wanted to restrict the path from the rad to the CPU block. And after all of that was figured out, I realized I was 2-3cm short of tube. Had to get a 90.
> 
> 
> Spoiler: Warning: Spoiler!


----------



## smoke2

Wouldn't my PSU be noisy with one Tri-X OC'd and a 4560K OC'd?
I'm planning to buy a Cooler Master V700 (700W).
Will it also have good efficiency?


----------



## Sazz

So I just bought THIS VRM heatsink from PPCS to use w/ my H55 mod on my 290X, and I just received it.

I gotta say the quality of the heatsink is good, although the mounting holes don't match; one hole is off by 3-5mm, which is actually pretty doable in terms of drilling a second hole on one side.

I'm gonna do the work probably tomorrow and see how it goes. If I can make it work, VRM temps should go down significantly, hopefully at least to the level the stock cooler is achieving right now, which is 65C at 75% fan speed, but I honestly think this will cut that down even further. Hopefully the thickness of the thermal pad I bought is enough.

Now I gotta buy an 80mm fan to use on it too xD


----------



## Xyro TR1

Soon!


----------



## cam51037

I purchased my 290 today from WTCR for around $460 CAD, a Sapphire Tri-X 290. Hopefully it'll be in tomorrow for some mining goodness!

After it's paid itself off with mining, I'm planning to put it in my main machine for gaming.


----------



## bond32

Quote:


> Originally Posted by *cam51037*
> 
> I purchased my 290 today from WTCR for around $460 CAD, a Sapphire Tri-X 290. Hopefully it'll be in tomorrow for some mining goodness!
> 
> After it's paid itself off with mining, I'm planning to put it in my main machine for gaming.


Good luck with that. The last quote I saw said a 290 running for 6 months straight will pay for itself at the current difficulty...


----------



## ZealotKi11er

Quote:


> Originally Posted by *bond32*
> 
> Good luck with that. Last quote said a 290 running for 6 months straight will pay for itself with the current difficulty...


My 290X makes 0.009 BTC per day. That's 0.009*31 ≈ 0.28 BTC ≈ $200 a month. 2-3 months is a more realistic figure with free electricity.


----------



## Widde

I just had a screen freeze and rebooted my PC, and when I got into Windows a Catalyst window popped up informing me that I had no AMD display driver installed :S And in the event log I had this: "The LogMeIn Kernel Information Provider service failed to start due to the following error: The system cannot find the path specified." What is that? :S The only thing it resembles is Hamachi, which I have uninstalled, and I can't figure out what that has to do with graphics.

And now that I have uninstalled everything using DDU, taken out one card, installed everything, and put the 2nd card back in, I get this :s http://piclair.com/p53eb


----------



## bond32

Quote:


> Originally Posted by *ZealotKi11er*
> 
> My 290X makes 0.009 BTC per day. That's 0.009*31 ≈ 0.28 BTC ≈ $200 a month. 2-3 months is a more realistic figure with free electricity.


At the current difficulty, a hashrate of 800 kh/s gives 6.04 LTC/month. Assuming that remains constant (which it won't), that translates to $151/month. Assuming someone paid around $450 for their 290, that would take 3 months of the card running 24/7. That also assumes power costs are not a factor. Sorry, but unless you have a dedicated mining rig, statements like "earn back the cost of the card" really don't seem reasonable to me.


----------



## Sazz

Quote:


> Originally Posted by *Widde*
> 
> I just had a screen freeze and rebooted my pc and when I got into windows a catalyst windows popped up informing me that I had no amd display driver installed :S And in the event log I had this "The LogMeIn Kernel Information Provider service failed to start due to the following error:
> The system cannot find the path specified." Wth is that? :S The only thing it resembles is Hamachi which I have uninstalled and cant figure out what that has to do with graphics


That usually happens when the driver gets corrupted, there's an error during installation, or the hard drive somehow got corrupted. Have you tried re-installing the driver?


----------



## ZealotKi11er

Quote:


> Originally Posted by *bond32*
> 
> At the current difficulty, a hashrate of 800 kh/s gives 6.04 LTC/month. Assuming that remains constant (which it won't), that translates to $151/month. Assuming someone paid around $450 for their 290, that would take 3 months of the card running 24/7. That also assumes power costs are not a factor. Sorry, but unless you have a dedicated mining rig, statements like "earn back the cost of the card" really don't seem reasonable to me.


I don't do LTC. I mine on a multipool, and the average difficulty does not really change much. I make as much BTC as I made in October with the same hashrate.


----------



## Widde

Quote:


> Originally Posted by *Sazz*
> 
> That usually happens when the driver gets corrupted, there's an error during installation, or the hard drive somehow got corrupted. Have you tried re-installing the driver?


Used DDU 2 times already.

1st time without taking out the 2nd card; gonna try and redownload the drivers, I've had the 13.12 drivers on the HDD since they came out.

It still happens after a fresh download from AMD's website.

And it says at the end of the installation that warnings occurred http://piclair.com/64zp3 and this is in the log http://piclair.com/sdyrs But no error messages.

Think I got it working now. It still said warnings at the end of the installation, but now everything seems to work; CrossFire works and I can open CCC at least.


----------



## Krusher33

Quote:


> Originally Posted by *Widde*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sazz*
> 
> That usually happens when driver gets corrupted or have error on installation or hard drive got somehow corrupted. Have you tried re-installing the driver?
> 
> 
> 
> Used DDU 2 times already
> 
> 
> 
> 
> 
> 
> 
> 1st time without taking out the 2nd card, gonna try and redownload the drivers, had the 13.12 drivers on the hdd since they came out
> 
> It still happens after a fresh download from amds website
> 
> 
> 
> 
> 
> 
> 
> And it says at the end of the installation warnings occured http://piclair.com/64zp3 and this is in the log http://piclair.com/sdyrs But no error messages
Click to expand...

Are you doing DDU in safe mode?


----------



## Widde

Quote:


> Originally Posted by *Krusher33*
> 
> Are you doing DDU in safe mode?


Yes sir ^^ Tried both in safe mode and regularly


----------



## Sgt Bilko

Ok, so it looks like the Asus DCUII 290X is confirmed for Elpida memory.



Maybe they will use Hynix on the Matrix?

I used to like Asus, even with the price premium....


----------



## Arizonian

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ok, so it looks like the Asus DCUII 290X is confirmed for Elpida memory.
> 
> 
> 
> Maybe they will use Hynix on the Matrix?
> 
> I used to like Asus, even with the price premium....


So we now know *Sapphire Tri-X* is Hynix and *ASUS DCUII* is Elpida.

Did we confirm what *MSI Gaming* and *XFX DD* come with?


----------



## VSG

So apparently HIS has not given any retailer their IceQ 290x yet despite pre-orders stating stock availability on the 24th. Not sure what's going on with HIS.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Arizonian*
> 
> So we now know *Sapphire Tri-X* is Hynix and *ASUS DCUII* is Elpida.
> 
> Did we confirm what *MSI Gaming* and *XFX DD* come with?


Both of my XFX DD R9 290 Black Editions (hell of a name) came with Elpida; the Fudzilla review also had Elpida.

MSI I'm not sure of as yet; I think someone in here is getting one soon, IIRC.


----------



## Sgt Bilko

Quote:


> Originally Posted by *geggeg*
> 
> So apparently HIS has not given any retailer their IceQ 290x yet despite pre-orders stating stock availability on the 24th. Not sure what's going on with HIS.


I was going with HIS until they changed the colour of the IceQ cooler from black to gold.

The blue PCB I can deal with... but both at once?


----------



## VSG

lol maybe they are changing the color of the PCB and/or cooler based on feedback so far. One can only hope!


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Both of my XFX DD R9 290 Black Edition's (hell of a name) came with Elpida, The Fudzilla review also had Elpida.
> 
> MSI i'm not sure of as yet, i think someone in here is getting one soon iirc


What's the big deal about whether the card has Elpida or Hynix? Is it the black screen issue? I know miners will want the ones with Hynix.

To me, I would rather have a 290 with Elpida that can do at least 1500 but with a core that can go up to 1300. When it comes to OC'ing for benches or games, the core speed will matter most. What's the use of having Hynix when your core can only go as high as 1250?


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> What's the big deal about whether the card has elpida or hynix? Is it the blackscreen issue? I know miners will want the ones with hynix.
> 
> To me, i would rather have a 290 with elpida that can do at least 1500 but with a core that can go up to 1300. when it comes to oc'ing for benches or games . . . the core speed will matter most. What's the use of having a hynix when your core can only go as high as 1250?


I can go 1200 on the core with AB; I haven't tried TriXX yet, but my memory won't budge over 1350 in most cases.

People like to know these things. Me personally, I will be running mine at stock 98% of the time, so I don't mind.


----------



## cam51037

Quote:


> Originally Posted by *bond32*
> 
> Good luck with that. Last quote said a 290 running for 6 months straight will pay for itself with the current difficulty...


Not sure where you're getting 6 months from. According to Coinwarz, a PC that does 900kh/s, draws 400W at $0.11 per kWh, and cost $460 to purchase would be paid off within 95 days.

I'll also be running it with a 7950, so hopefully that should bring the ROI time down a little, to around 2 months at current rates.
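The Coinwarz-style estimate can be reproduced with a short sketch (the gross revenue per day is an assumption back-derived from the 95-day claim, since the actual Coinwarz figure changed daily):

```python
# Sketch of the Coinwarz-style payback estimate from the post above.
# gross_per_day is an assumed figure; actual mining revenue varied daily.
card_cost = 460.0          # USD (from the post)
power_kw = 0.400           # 400 W while mining (from the post)
price_per_kwh = 0.11       # USD (from the post)
gross_per_day = 5.90       # assumed gross mining revenue, USD/day

power_cost_per_day = power_kw * 24 * price_per_kwh
net_per_day = gross_per_day - power_cost_per_day
payback_days = card_cost / net_per_day
print(round(power_cost_per_day, 2), round(payback_days))
```

Electricity eats about $1.06/day at those rates, so the claimed 95-day payback implies roughly $5.90/day gross revenue.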


----------



## devilhead

Quote:


> Originally Posted by *rdr09*
> 
> What's the big deal about whether the card has elpida or hynix? Is it the blackscreen issue? I know miners will want the ones with hynix.
> 
> To me, i would rather have a 290 with elpida that can do at least 1500 but with a core that can go up to 1300. when it comes to oc'ing for benches or games . . . the core speed will matter most. What's the use of having a hynix when your core can only go as high as 1250?


In my case Hynix is much better: Elpida can do 1500 (sometimes 1600, but then the score falls off) while Hynix can do 1700. The core clock depends on chip quality; I have 2x 290X with Hynix and 3x 290X with Elpida. Those with Hynix can do 1300 on the core, but with Elpida max 1280 (maybe because my 290X with Elpida is an unlocked 290).

If you are gaming, then yes, memory overclock doesn't make much sense; the core clock is most important. But if you are an overclocker running benchmarks like Firestrike or Valley, then memory will matter.


----------



## bond32

Quote:


> Originally Posted by *cam51037*
> 
> Not sure where you're getting 6 months from. According to Coinwarz, a PC that does 900KH/s, takes 400W at $0.11 per kW/h and cost $460 to purchase would be paid off within 95 days.
> 
> I'll also be running it with a 7950, so hopefully that should bump the ROI time down a little bit, to around 2 months at current rates.


If you're so convinced that 1. the difficulty will remain the same, 2. your hash rate will remain a constant 900 KH/s, and 3. you will be using this brand new card for absolutely nothing except mining for 95 days straight, then more power to you. Sorry, but it just seems silly.


----------



## battleaxe

Okay, so is the Tri-X really worth it over a basic reference design? I can get a Tri-X 290 for $460, or I can get a basic 290 reference design and add an AIO water cooler for about the same price. Temps on the reference card end up much cooler than the Tri-X, I am sure, but the Tri-X guarantees I get Hynix RAM.

So which one?


----------



## Sazz

Quote:


> Originally Posted by *battleaxe*
> 
> Okay, so is the Tri-X really worth it over a basic reference design? I can get a Tri-X 290 for $460, or I can get a basic 290 reference design and add an AIO water cooler for about the same price. Temps on the reference card end up much cooler than the Tri-X, I am sure, but the Tri-X guarantees I get Hynix RAM.
> 
> So which one?


The problem with the AIO mod is the VRM temps. If you can put a heatsink on them that is good enough to get them down to 60 or lower, then good. The Tri-X cooler can't achieve that unless you ramp the fan up to really high RPMs, but at the same time even 100% fan speed is still in the acceptable noise range, so it's really up to you.

I bought a reference card, but at the time there was no custom cooler option. I am about to put an AIO on it with a big heatsink on VRM1 (I believe that's the long strip of VRMs that the core uses). I also bought an extra reference stock cooler to cut up for this mod, so that if I ever need to RMA I can pop the stock cooler back on and send it in.


----------



## battleaxe

Quote:


> Originally Posted by *Sazz*
> 
> The problem about the AIO mod is the VRM temps, if you can put a heatsink on it that is good enough to put it down to 60 or lower then good. Tri-X cooler can't achieve that unless you ramp up the fan to really high RPM's but at the same time even at 100% fan speed it's still in the acceptable noise range. so it's really up to you.
> 
> I bought a reference card but at the time there's no custom cooler option. I am about to put a AIO on it w/ a big ass heatsink on the VRM1 (I believe, whatever is the Long strip of VRM that the core uses). I also bought a extra reference stock cooler to cut up in this mod so that if I need to RMA I can always pop the stock cooler back in and send for RMA.


Yeah, I have coolers on my VRMs and RAM. I don't have any temps over 55C at any time, no matter how hard I run it. My core is under 45C too. So IDK, I guess it doesn't really matter. The Tri-X would have better resale though, right? I imagine it would anyway.


----------



## Sazz

Quote:


> Originally Posted by *battleaxe*
> 
> Yeah, I have coolers on my VRM's and RAM. I don't have any temps over 55c at any time no matter how hard I run it. My core is under 45c too. So IDK, I guess it doesn't really matter. The Tri-X would have better resale though right? I imagine it would anyway.


Funnily enough, over the past 2 years I've seen reference cards have higher resale value than custom-PCB ones, probably because it's easier to find waterblocks/custom coolers for them. But I believe the Tri-X still uses the reference PCB design, so it should have slightly better resale value than a card with the reference cooler.


----------



## leetmode

Are stores still jacking up the prices of these cards? I need to upgrade from a 5970, but I just can't justify buying an R9 card knowing places like Newegg are asking ~$100 over MSRP.


----------



## trihy

The Gelid VRM prototype is too small. It's the same heatsink that comes with the kit, but with the holes in-line. If anyone is testing the prototype, please send an email to Gelid asking for a bigger solution.

That one will reach 90 or 95 degrees easily. With a good heatsink that barely reaches 60 degrees on the GPU, it's a shame to have such a poor VRM solution, limiting OC among other things.


----------



## Krusher33

Quote:


> Originally Posted by *leetmode*
> 
> Are stores still jacking up the prices of these cards? I need to upgrade from a 5970 but I just can't justify buying a r9 card knowing places like newegg are asking ~$100 over MSRP


Yeah, we have to wait for the cryptocurrency mining craze to die down a bit.


----------



## battleaxe

Quote:


> Originally Posted by *Sazz*
> 
> Funny enough the past 2years, I've seen reference card have higher re-sale value than custom PCB, probably because it's easier to find waterblocks/custom coolers for them. but I believe Tri-X is still under the same reference PCB design? so that should have a slightly better re-sale value than reference cooler.


Well, that might make the Tri-X worth the extra cost then. Guaranteed Hynix RAM. I might remove the cooler and put an AIO on it anyway: better temps, longer lifespan, plus better OC ability overall. Planning to mine with it, so the Hynix matters to me.


----------



## battleaxe

Quote:


> Originally Posted by *trihy*
> 
> The Gelid VRM prototype is too small. It's the same heatsink that comes with the kit, but with the holes in-line. If anyone is testing the prototype, please send an email to Gelid asking for a bigger solution.
> 
> That one will reach 90 or 95 degrees easily. With a good heatsink that barely reaches 60 degrees on the GPU, it's a shame to have such a poor VRM solution, limiting OC among other things.


I chopped an old CPU cooler into a thin slice and placed it over the VRM. It's about 1.15 inches tall, so no, it won't work if you have to put an air cooler over top of it, but it keeps the VRMs under 55C even under the worst conditions. Works fantastically with an AIO water cooler on the die.


----------



## alawadhi3000

Just installed the Accelero Xtreme III on one of my R9 290s: low noise and low temperatures.

The problem is that the temp on VRM1 goes up to 100C. People with the AX3, how did you install the VRM1 heatsinks? I just glued them on; do I need a thermal pad?


----------



## kcuestag

One of my R9 290s has a weird coil whine issue. It only happens at idle and while the monitor is on; if I turn off the monitor, the coil whine is gone, and if I start a game I can't hear it either.

Anyone else ever suffered this? It's weird, I've never had a card coil whine at idle...


----------



## Ragen

Quote:


> Originally Posted by *alawadhi3000*
> 
> Just installed the Accelero Xtreme III on one of my R9 290, low noise and temperatures.
> 
> The problem is that the temp on VRM1 goes up to 100C, people with the AX3 how did you install the VRM1 heatsinks?? I just used the glue and glued it, do I need a thermal pad??


Hey, maybe I can help. I have the same cooler installed on my 290X, and I can tell you that your VRM1 temps should be nowhere near that high unless something isn't installed correctly. Also make sure you are running the fan speed at 100 percent; it's still very quiet, and it's necessary, as the air from the cooler helps cool the VRMs as well. Also, what is your case airflow like? Do you have good case airflow?

In my setup VRM1 temps never exceed 79C, and that is with my card running a huge overclock and overvolt; I'm at 1.328v core for 1200MHz on the core. When I'm running stock clocks I only see VRM1 get up to around 60-65C tops.

BTW, the glue is just fine; it's what I'm using.


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I can go 1200 on the core with AB, haven't tried Trixx yet but my memory won't budge over 1350 in most cases.
> 
> People like to know these things, me personally i will be running mine at stock 98% of the time so i don't mind.


yes, try trixx. make sure to disable overdrive and do not open ab. like you, i game at stock. mine does 1175/1500 without voltage adjustments but i have to max out the PL to 50%. i have elpida, so i knew it won't do much but it did 1620 in some benches. however, my core can do 1300 with +200 offset in some benches.

Quote:


> Originally Posted by *devilhead*
> 
> In my case Hynix is much better: Elpida can do 1500 (sometimes 1600, but then the score starts falling) while Hynix can do 1700. Core clock depends on chip quality. I have 2x 290X with Hynix and 3x 290X with Elpida; the ones with Hynix can do 1300 on the core, but the Elpida ones max out at 1280 (maybe because my 290X with Elpida is an unlocked 290).
> If you are gaming, then yes, the memory overclock doesn't make much sense; the core clock is most important. But if you are an overclocker running tests like Firestrike, Valley and so on, then the memory will matter.


that would be ideal - getting the core up to 1300+ and having hynix. but i would not mind having elpida, so long as the core can oc as high. maybe if i mine, i can get back the money i used to buy this card. actually, i sold my 7970, which paid for this 290.


----------



## VSG

Well lads, it's been fun here and I was really hoping for the 290x Lightning to release soon. But anyway:



Arizonian, could you please remove me from the owners club? Have fun, everyone!


----------



## kdawgmaster

Quote:


> Originally Posted by *rdr09*
> 
> yes, try trixx. make sure to disable overdrive and do not open ab. like you, i game at stock. mine does 1175/1500 without voltage adjustments but i have to max out the PL to 50%. i have elpida, so i knew it won't do much but it did 1620 in some benches. however, my core can do 1300 with +200 offset in some benches.
> that would be ideal. getting the core up to 1300 + and having hynix but i would not mind having elpida, so long as the core can oc as high. maybe if i mine, so i get the money i used to buy this card. actually, i sold my 7970, which paid for this 290.


I need to figure out how i can adjust more voltage on my HIS 290x :/ AB only allows for the +100mv and for me i think i need to go higher to get better overclocks


----------



## Ragen

Quote:


> Originally Posted by *rdr09*
> 
> yes, try trixx. make sure to disable overdrive and do not open ab. like you, i game at stock. mine does 1175/1500 without voltage adjustments but i have to max out the PL to 50%. i have elpida, so i knew it won't do much but it did 1620 in some benches. however, my core can do 1300 with +200 offset in some benches.
> that would be ideal. getting the core up to 1300 + and having hynix but i would not mind having elpida, so long as the core can oc as high. maybe if i mine, so i get the money i used to buy this card. actually, i sold my 7970, which paid for this 290.


Does Trixx manage memory overclocks better than AB? I ask because I can't really go over 1350MHz memory either without eventually getting a black screen; 1350 seems rock solid though.

Also, I have Elpida RAM too. My card was originally a PowerColor R9 290, but I was the first one here to discover that it unlocked to a 290X. Boy, I was given some grief around here until others confirmed the same thing, regardless of all the proof that I offered.


----------



## sugarhell

Quote:


> Originally Posted by *kdawgmaster*
> 
> I need to figure out how i can adjust more voltage on my HIS 290x :/ AB only allows for the +100mv and for me i think i need to go higher to get better overclocks


I don't know why you don't read the OP.


----------



## Ragen

Quote:


> Originally Posted by *geggeg*
> 
> Well lads, it's been fun here and I was really hoping for the 290x Lightning to release soon. But anyway:
> 
> 
> 
> Arizonian, could you please remove me from the owners club? Have fun, everyone!


Dual Nvidia cards should be better anyway; plus, those cards are beasts.


----------



## rdr09

Quote:


> Originally Posted by *Ragen*
> 
> Dual cards nvidia should be better anyway plus those cards are beast.


for benching - yes.


----------



## Jonsu

Hi, please add me. MSI R9 290 Gaming @ 1177/1500 Hynix memory


----------



## VSG

Quote:


> Originally Posted by *Ragen*
> 
> Dual cards nvidia should be better anyway plus those cards are beast.


Quote:


> Originally Posted by *rdr09*
> 
> for benching - yes.


Yes, make no mistake about those cards. The 290X cards are still better bang for the buck, but not by much at current prices.

My decision was swayed by G-Sync, EVGA's ridiculously awesome customer support for water coolers, and the guarantee of a highly binned card with Samsung memory. This does not mean I will start spouting off against the red side though; AMD is still my first choice in both laptop and desktop discrete graphics.


----------



## rdr09

Quote:


> Originally Posted by *Ragen*
> 
> Does trixx manage the memory overlclocks better than AB? I ask because I cant really go over 1350mhz memory either without getting a black screen eventually, 1350 seems rock solid though.
> 
> Also I have Elpida ram too, my card was originally a powercolor R9 290 but i was the first one here to make the discovery that it unlocked to a 290x, boy i was given some grief around here until others confirmed the same thing regardless of all the proof that I offered.


i only oc for benching not gaming. i get the same results from ab and trixx if all i need to oc is +100 offset. i'm not into flashing bioses and editing apps. i use trixx for higher oc's that need +200 offset. i read ab can provide +300. at +200 with trixx i saw my load core voltage go as high as 1.41v. to me that is the limit.

i think it comes down to silicon lottery on both core and memory.

you can't get any higher than 1350 with the original bios or when flashed to X?


----------



## trihy

Quote:


> Originally Posted by *battleaxe*
> 
> I chopped an old CPU cooler into a thin slice and placed it over the VRM. Its about 1.15 inches tall, so no it won't work if you have to put an air cooler over top of it, but it keeps the VRM's under 55c even under the worst conditions. Works fantastic with an AIO water cooler on the die.


That's cool... but having the heatsink and the cooler on top is a problem. You need to be creative.

I have some ideas with an Xbox 360 GPU heatsink, but I don't have the tools to make all the mods to the heatsink.


----------



## Ragen

Quote:


> Originally Posted by *rdr09*
> 
> for benching - yes.


I would say for gaming especially, crossfire support can be pretty crappy.


----------



## alawadhi3000

Quote:


> Originally Posted by *Ragen*
> 
> Hey maybe I can help, I have the same cooler installed on my 290X and I can tell you that your vrm 1 temps should be no where near that high unless something isnt installed correctly, also make sure you are running the fan speed at 100 percent, its still very quiet and is necessary as the air from the cooler helps to cool the vrms as well. Also what is your case airflow like? do you have good case airflow?
> 
> In my setup VRM 1 temps never exceed 79C and that is with my card running a huge overclock and overvolt, im at 1.328vc for 1200mhz on the core. When im running stock clocks I only see VRM1 get up to around 60-65c tops.
> 
> BTW the glue is just fine, its what im using.


Welcome to OCN.

Case airflow is not the issue; I even put a 12-inch box fan on it and VRM1 only went down to ~95C. I'm pretty sure I installed everything correctly, but I'll take the cooler apart to inspect.


----------



## Ragen

Quote:


> Originally Posted by *rdr09*
> 
> i only oc for benching not gaming. i get the same results from ab and trixx if all i need to oc is +100 offset. i'm not into flashing bioses and editing apps. i use trixx for higher oc's that need +200 offset. i read ab can provide +300. at +200 with trixx i saw my load core voltage go as high as 1.41v. to me that is the limit.
> 
> i think it comes down to silicon lottery on both core and memory.
> 
> you can't get any higher than 1350 with the original bios or when flashed to X?


I could try to flash it back to a 290 and see.


----------



## Ragen

Quote:


> Originally Posted by *alawadhi3000*
> 
> Welcome to OCN.
> 
> Case airflow is not the issue, I did put a 12 inch box fan and that VRM1 went down to ~95C only, I'm pretty sure I installed everything correctly but I'll take apart the cooler to inspect.


There is a huge discrepancy in our temps, so something is up. I know VRM1 temps give a lot of people trouble, but it usually comes down to poor airflow or an incorrect installation. Not saying you have these issues, but it's a possibility. Either way, good luck and I hope you find the problem.


----------



## rdr09

Quote:


> Originally Posted by *Ragen*
> 
> I would say for gaming especially, crossfire support can be pretty crappy.


things are changing, especially in single card setup . . .

http://www.overclock.net/t/1455171/amd-r9-290-nvidia-gtx-780-ti-bench-off-first-wave-complete-with-graphs-specviewperf-added


----------



## Krusher33

Quote:


> Originally Posted by *alawadhi3000*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ragen*
> 
> Hey maybe I can help, I have the same cooler installed on my 290X and I can tell you that your vrm 1 temps should be no where near that high unless something isnt installed correctly, also make sure you are running the fan speed at 100 percent, its still very quiet and is necessary as the air from the cooler helps to cool the vrms as well. Also what is your case airflow like? do you have good case airflow?
> 
> In my setup VRM 1 temps never exceed 79C and that is with my card running a huge overclock and overvolt, im at 1.328vc for 1200mhz on the core. When im running stock clocks I only see VRM1 get up to around 60-65c tops.
> 
> BTW the glue is just fine, its what im using.
> 
> 
> 
> Welcome to OCN.
> 
> Case airflow is not the issue, I did put a 12 inch box fan and that VRM1 went down to ~95C only, I'm pretty sure I installed everything correctly but I'll take apart the cooler to inspect.
Click to expand...

You sure you got all the VRMs? Not sure on the 290X, but the 7970s had one in an obscured place that folks have missed.


----------



## Ragen

Quote:


> Originally Posted by *rdr09*
> 
> things are changing, especially in single card setup . . .
> 
> http://www.overclock.net/t/1455171/amd-r9-290-nvidia-gtx-780-ti-bench-off-first-wave-complete-with-graphs-specviewperf-added


Quote:


> Originally Posted by *alawadhi3000*
> 
> Welcome to OCN.
> 
> Case airflow is not the issue, I did put a 12 inch box fan and that VRM1 went down to ~95C only, I'm pretty sure I installed everything correctly but I'll take apart the cooler to inspect.


Single card I think is actually better with AMD at the moment; Nvidia hasn't had a really good driver since 306.96.


----------



## battleaxe

Quote:


> Originally Posted by *trihy*
> 
> Thats cool...but having the heatsink and cooler on top.. is a problem. You need to be creative
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have some ideas with an xbox 360 gpu heatsink, but dont have the tools to make all the mods into the heatsink..


Uhhh... and how is it a problem? It's not a problem for me. Mine is working great.

I'm guessing you mean it's a problem for you?


----------



## trihy

Yes; as I'm using an air cooler, it's not so easy to fit a heatsink onto the VRM.


----------



## alawadhi3000

Quote:


> Originally Posted by *Ragen*
> 
> There is a huge discrepancy in our temps so something is up, I know vrm 1 temps give a lot of people trouble but it usually comes down to poor airflow or an incorrect instillation. Not saying you have these issues but its a possibility. Either way good luck and I hope you find the problem.


I've removed the VRM1 heatsinks and reapplied them; they're glued tight and correctly, but temps still shoot up to 100C.
Quote:


> Originally Posted by *Krusher33*
> 
> You sure you got all the VRM's? Not sure on the 290X but the 7970's had 1 in a obscured place that folks have missed.


Yes I got them all covered.
Quote:


> Originally Posted by *Ragen*
> 
> Single card I think is actually better with AMD at the moment, nvidia has not had a really good driver since 306.96.


I just came from a GTX 670; Nvidia drivers are better, and I had no issues with any updates.


----------



## battleaxe

Quote:


> Originally Posted by *trihy*
> 
> Yes, as Im using an air cooler.. it's not so easy to fit a heatsink onto the vrm


Ahh... yeah. Bummer. I imagine it's pretty tight under there.
Quote:


> Originally Posted by *alawadhi3000*
> 
> I've removed the VRM1 heatsink and reapplied them, they're glued tight and correct but still temps shoot up to 100C.
> Yes I got them all covered.
> I just came from a GTX670, nVidia drivers are better and I had no issues with any updates.


I've still got my 670s and love them. But I think the 290s are sweet cards. Can't be beat for the money if you ask me. Kinda like the 670 (best bargain card) was when it came out, if you know what I mean.

Edit: wait... you don't remember the 320 drivers? The famous card killers? Those were fun drivers.


----------



## kdawgmaster

Quote:


> Originally Posted by *sugarhell*
> 
> Guys its easy to give more volts on msi.
> 
> Just use /wi4,30,8d,10 for 100mv. The offset step is 6.25 mv, and the argument is in hexadecimal. So in decimal that is 16*6.25 = 100 mv. For 50mv you need 8. For 200mv you need 20 (20 hex = 32 dec, so 32*6.25 = 200mv).
> 
> The easy way to do changes:
> 
> Create a txt on desktop. Write
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi4,30,8d,10
> 
> and then save it as a .bat file. Every time you start this bat file, MSI AB will start with +100mv.
> 
> For 50mv: 8
> For 100mv:10
> For 125mv:14
> For 150mv:18
> For 175mv:1C
> For 200mv:20
> 
> I wouldn't go over this point because
> 1)You are close to leave the sweet spot of the ref pcb vrms efficiency
> 2)These commands add 200mv on top of the 100mv offset through AB gui.That means 300mv
> 
> By default /wi command apply to current gpu only. So if you have 2 or more gpus you must use /sg command. That means the command line is something like that
> ex:MsiAfterburner.exe /sg0 /wi4,30,8d,10 /sg1 /wi4,30,8d,10


I tried to do this and nothing happens at all.

this is my command line

CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /sg0 /wi4,30,8d,18 /sg1 /wi4,30,8d,18 /sg2 /wi4,30,8d,18
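For what it's worth, the hex value in the quoted `/wi4,30,8d,<offset>` switch is just the desired extra millivolts divided by the 6.25 mV step, written in hexadecimal. A quick sketch (hypothetical helper name, assuming the 6.25 mV step quoted above) that reproduces sugarhell's table:

```python
def ab_offset_arg(extra_mv):
    """Hex string for the last field of Afterburner's /wi4,30,8d,<offset>
    switch, given a desired voltage offset in mV (6.25 mV per step)."""
    steps = int(extra_mv / 6.25)  # number of 6.25 mV steps
    return format(steps, 'x')     # hex, without the '0x' prefix

for mv in (50, 100, 125, 150, 175, 200):
    print(mv, "mV ->", ab_offset_arg(mv))  # 8, 10, 14, 18, 1c, 20
```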


----------



## sugarhell

Monitor your volts. Or your path is wrong


----------



## kdawgmaster

Quote:


> Originally Posted by *sugarhell*
> 
> Monitor your volts. Or your path is wrong


OK, so I removed the volts part and then it launched, so now it looks like this:

CD C:\Program Files (x86)\MSI Afterburner
MsiAfterburner.exe

Now, what I noticed is you said it adds on top of the existing 100mv, so if I did 50 it would max out at 150.


----------



## alawadhi3000

Quote:


> Originally Posted by *battleaxe*
> 
> I"ve still got my 670's and love them. But I think the 290's are sweet cards. Can't be beat for the money if you ask me. Kinda like the 670 (best bargain card) was when it came out if you know what I mean.
> 
> Edit: wait... you don't remember the 320 drivers? The famous card killers? Those were fun drivers.


I installed those and never had any issues with them.


----------



## sugarhell

Quote:


> Originally Posted by *kdawgmaster*
> 
> K so removed the volts and then it launched so now it looks like this
> 
> CD C:\Program Files (x86)\MSI Afterburner
> MsiAfterburner.exe
> 
> Now what I noticed is you said it adds the already 100mv so if i did 50 it would go max to 150.


You're not gonna have more than 100mv in AB. The GPU will start with +50mv, and then you can add 100mv from MSI AB. If MSI AB can't start with these settings, something is wrong with your file path.

You can always add the code to the MSI AB shortcut on the desktop: after the target, add the code for the volts.


----------



## kdawgmaster

Quote:


> Originally Posted by *sugarhell*
> 
> You will not gonna have more than 100mv on AB. The gpu will start with +50mv and then you can add 100mv from msi. If the msi cant start with these settings something is wrong with your path file
> 
> You can always add the code on msi ab shortcut on desktop. After the target add the code for the volts


Same deal with the target :/ I have 3 GPUs, so I imagine the list would be sg0, sg1, sg2, would it not?


----------



## sugarhell

Quote:


> Originally Posted by *kdawgmaster*
> 
> same deal with the target :/


Try the code with just the first GPU; don't use the sg command. You need to be careful with the spaces and everything.


----------



## kdawgmaster

Quote:


> Originally Posted by *sugarhell*
> 
> Try with the code just for the first gpu. dont use sg command. you need to care with the space and everything


Just tried your basic one " /wi4,30,8d,10 " and nope, nothing on either of them.


----------



## velocityx

Since today I've realized the latest MSI AB beta is giving me hard freezes in BF4.


----------



## sugarhell

Quote:


> Originally Posted by *kdawgmaster*
> 
> just tried your basic one " /wi4,30,8d,10 " and nope nothing in either of them


MSI AB opens? If it doesn't open, then you are doing something wrong. Are your HIS cards reference cards?


----------



## kdawgmaster

Quote:


> Originally Posted by *sugarhell*
> 
> Msi opens? If it doesnt open then you are doing something wrong. Your HIS are ref cards?


They are, yes. And these cards seem to be very limited in what they let you do for voltage. I wonder if this is my issue.

Sorry, and MSI AB doesn't open.


----------



## sugarhell

Quote:


> Originally Posted by *kdawgmaster*
> 
> They are yes. And these cards seem to be very limited on what they let you do for voltage stuff. I wonder if this is my issue.
> 
> sry and MSI dosnt open.


If MSI AB doesn't open, you are writing something wrong. Watch for a space or something like that.


----------



## kdawgmaster

Quote:


> Originally Posted by *sugarhell*
> 
> If msi doesnt open you are writing something wrong. Care for space or something like that


Yeah, this is a weird one for me. I've done batch files for my game servers and everything, and I know it's very particular about how things are written, but everything from what was mentioned looks to be proper :/

Does it affect anything that I'm using the latest beta version?


----------



## kdawgmaster

Quote:


> Originally Posted by *kdawgmaster*
> 
> yeah this is a weird one for me. Ive done batch files for my game servers and everything and i know its very particular how its placed. but everything from what was mentioned looks to be proper :/
> 
> Does it effect that im using the latest beta version?


Think I figured it out. Putting this " \ " instead of this " / " will launch it.

So now for me it looks like this:

C:\Program Files (x86)\MSI Afterburner
MsiAfterburner.exe \sg0\wi4,30,8d,10 \sg1\wi4,30,8d,10 \sg2\wi4,30,8d,10

Which launches it, but it would appear that the core voltage stayed at 0. Is this proper?


----------



## sugarhell

Quote:


> Originally Posted by *kdawgmaster*
> 
> Think i figured it out. Place this " \ " instead of this " / " will launch it
> 
> So now for me it looks like this
> 
> C:\Program Files (x86)\MSI Afterburner
> MsiAfterburner.exe \sg0\wi4,30,8d,10 \sg1\wi4,30,8d,10 \sg2\wi4,30,8d,10
> 
> Which launches it but it would appear that the core voltage stayed at 0 is this proper?


Yeah the 100mv adds on top of the command


----------



## kdawgmaster

Quote:


> Originally Posted by *sugarhell*
> 
> Yeah the 100mv adds on top of the command


OK, so now when I bump up the voltage, MSI AB still only seems to read the normal 100mv bump at idle. Will I only notice this voltage bump under load?


----------



## alawadhi3000

Is VRM1 the straight line of VRMs to the right, or the cluster of three small VRMs near the display ports?


----------



## battleaxe

Quote:


> Originally Posted by *alawadhi3000*
> 
> VRM1 is the straight line of VRMs to the right or the triple small VRM near the display ports??


Quote:


> Originally Posted by *alawadhi3000*
> 
> I've removed the VRM1 heatsink and reapplied them, they're glued tight and correct but still temps shoot up to 100C.
> Yes I got them all covered.
> I just came from a GTX670, nVidia drivers are better and I had no issues with any updates.


VRM1 is the long line to the right of the GPU die (on the other side of the RAM modules, of course). The small cluster of 3 close to the display ports is VRM2.


----------



## Ragen

Quote:


> Originally Posted by *alawadhi3000*
> 
> I just came from a GTX670, nVidia drivers are better and I had no issues with any updates.


I had a 670, a 680, and then eventually 670s in SLI. For me and many others the experience with AMD is the exact opposite. After driver 306.97, Nvidia seemed to introduce some stupid random stuttering issue (especially in BF3) which to this day people still complain about. With my HD 7970 and R9 290X it has been smooth sailing ever since.


----------



## Ragen

Quote:


> Originally Posted by *battleaxe*
> 
> VRM 1 is the long line to the right of the GPU die (on other side of the RAM modules of course). The small cluster of 3 close to display port are VRM2


Yep.


----------



## Ragen

Quote:


> Originally Posted by *alawadhi3000*
> 
> I installed those and never had any issues with them.


Oddly enough neither did I and I used them too.


----------



## Ragen

Quote:


> Originally Posted by *velocityx*
> 
> since today I realized, msi ab latest beta, is giving me hard freezes in bf4.


Not having that issue myself.


----------



## kdawgmaster

Quote:


> Originally Posted by *velocityx*
> 
> since today I realized, msi ab latest beta, is giving me hard freezes in bf4.


Is your R9 290X overclocked?


----------



## Ragen

Quote:


> Originally Posted by *kdawgmaster*
> 
> If ur R9 290x overclocked?


Well mine is, 1200mhz on the core using the latest beta from msi and I have no such issues.


----------



## bond32

Quote:


> Originally Posted by *Ragen*
> 
> Well mine is, 1200mhz on the core using the latest beta from msi and I have no such issues.


Curious, which card and bios?


----------



## kizwan

Core & VRM temperatures: 2 x R9 290 + EK-FC R9-290X Acetal waterblock + EK-FC R9-290X backplate

BF4 1080p @200% resolution scale, 3820 @3.8GHz

GPU1



GPU2


----------



## bond32

Fantastic info man! I only have one card, and using the Koolance block, but I get almost identical temps to yours. What bios and clocks are you on?


----------



## kizwan

Quote:


> Originally Posted by *bond32*
> 
> Fantastic info man! I only have one card, and using the Koolance block, but I get almost identical temps to yours. What bios and clocks are you on?


Is your ambient (indoor) temp also 29C? If the temps are similar, then we're both doing it right.

Mine is on the stock BIOS (Sapphire) and stock clocks. Is yours overclocked?


----------



## 4zp1r1na

Just ordered a single R9 290 because I was not sure if my Corsair AX760i could handle 2 of them, so let's see how it goes. Hope I made the right decision coming from a 780 SC ACX, and hoping I can unlock it to a 290X, haha. Posting pictures when it arrives.


----------



## Ragen

Quote:


> Originally Posted by *bond32*
> 
> Curious, which card and bios?


Powercolor R9 290 that has been flashed to a 290X using the unmodded Asus bios.


----------



## Ragen

One thing I cannot figure out: I have my R9 290X plugged into my monitor with a DVI cord, and I also like to use my 40-inch HDTV from time to time over HDMI. But if I try to leave both of them plugged in, the TV doesn't seem to work right. I mean, it will load up Windows, but the only thing I can see is the taskbar; the desktop programs seem to have disappeared, and the mouse and keyboard do nothing.

This is easily solved by disconnecting the DVI cable from the monitor, but I wanted to leave them both plugged in at the same time.

EDIT: Just found out how to do it through the CP.


----------



## kdawgmaster

Quote:


> Originally Posted by *Ragen*
> 
> Well mine is, 1200mhz on the core using the latest beta from msi and I have no such issues.


Yeah, mine's at 1125, which seems to be the best I can get for core. But for velocityx, keep in mind BF4 has some memory leak issues with AMD video cards right now, and if the card is overclocked it will crash. I know this because in all my other games but BF4 I can get 1400 stable on the memory without it crashing.


----------



## Jack Mac

How come 1150MHz feels so much smoother on BF4 than 1100MHz for me? I just noticed this today when playing with different clockspeeds and voltages.


----------



## Ragen

Quote:


> Originally Posted by *kdawgmaster*
> 
> Yeah mines at 1125 which seems to be the best i can get for core. But for Velocity keep in mind BF4 has some memory ram leak issues with AMD video cards right now and if the card is overclocked it will crash. I know this because in all my other games but BF4 i can get 1400 stable on the memory without it crashing,


Well, I'm clocked at 1200MHz core and 1350MHz on the RAM and have no issues. However, other games do seem fine with 1400MHz memory for me as well.


----------



## ZealotKi11er

Quote:


> Originally Posted by *kcuestag*
> 
> One of my R9 290 has a weird coil whine issue, it only happens on IDLE and while the monitor is ON. If I turn off the monitor, coil whine is gone, if I start a game, I can't hear it either.
> 
> Anyone else ever suffered this? It's weird, never had a card coil whine on idle...


Have you tried leaving the cards in a game menu that has high FPS? Something like Crysis, I believe. Also, I had a lot of coil noise, but it was caused by having an old Ultra X3 PSU. What PSU do you have? I also noticed the PCIe cables that come with a ferrite choke seem to help with coil whine.


----------



## Sazz

I just got done doing my AIO mod on my R9 290X using an H55 and a cut-up/modded Thermalright VRM heatsink (I had to do a lot more cutting on it than I expected).

Here's a temp so far running a quick valley pass, currently keeping valley on background to see what temps I get after an hour.
currently using stock voltage at +15% power limit at 1100/1350Mhz OC.

I have an Arctic Cooling 90mm fan installed on the VRM part and will add another one tomorrow in the extra space. I haven't glued any chip sinks onto the faceplate because I'm still trying to see what kind of temps I get with this set-up. Although the thermal pad I used on VRM1 isn't thick enough, so I had to stack two Koolance thermal pads, and this probably affects the temps quite significantly, so I'm ordering a thicker thermal pad to see if that improves them.

Core temp is not a worry, I think; the H55 is keeping it down pretty well. The problem is gonna be the VRM temps.

I'll post my temps after one hour, and will update on my next testing w/ different set-up on the heatsinks.

(Sorry for crappy pics, camera phone xD)

Edit: Forgot to put up the temp screenshot LOL!


----------



## lethal343

OK guys, I got my R9 290 OC Windforce cards from Gigabyte (3 of them).
I'm just wondering: what's the voltage the cards are MEANT to be running at under full load? (1.15V from memory?)


----------



## BradleyW

Nice VRM cooling. Hope you never go CFX


----------



## Falkentyne

Um, guys, if you're using beta 18 of Afterburner, you need to use /wi6 for the I2C commands to add voltage, not /wi4. There was a note on the Guru3D forums about the address changes from beta 17 to 18.


----------



## Sazz

Quote:


> Originally Posted by *BradleyW*
> 
> Nice VRM cooling. Hope you never go CFX


Me? No plans. I am using it in a mini-ITX build in a BitFenix Prodigy, so nope, no plans for CrossFire. I prefer a single-card configuration to be rid of the headaches that CrossFire/SLI brings when it's not working properly (especially in new games).

And so far almost half an hour has passed and it's maxed out at 71C on VRM1. Should be able to take a few degrees off that with a proper thermal pad.

And VRM2 is at 63C; not bad, since I basically only have the faceplate on it and nothing else. I'll put some chip sinks on the faceplate once I get everything dialed in.


----------



## sugarhell

Quote:


> Originally Posted by *Falkentyne*
> 
> Um, guys if you're using beta 18 of afterburner, you need to use /wi6 for the IC commands to add voltage, not wi4. There was a note in guru3d forums about the address changes from beta 17 to 18.


Hmm, didn't see that. Probably because of the new low-level I2C. So that needs to be changed.


----------



## BradleyW

Quote:


> Originally Posted by *Sazz*
> 
> Me? no plans, I am using it on a mini-ITX build in a bitfenix prodigy. so nope, no plans for crossfire. I prefer single card configuration to rid of the headaches that crossfire/sli brings when it's not properly working (specially on new games).
> 
> And so far almost half an hour has passed and its maxed out at 71C on VRM1.. should be able to take few degrees on that w/ a proper thermal pad.
> 
> And VRM2 is at 63C, not bad since I basically only have the face plate on it and nothing else. would put some chip sinks on the face plate once I get everything dialed down.


I'd zip-tie an 80mm fan to the hanging heatsink.


----------



## lethal343

So what's the stock vcore voltage they're meant to be running at under load?


----------



## Forceman

Quote:


> Originally Posted by *kdawgmaster*
> 
> K so now when i bump up the voltage MSI still only seems to read the normal 100mv bump on idle, so will i only notice this bump of voltage under load?


Have you thought about just using Trixx? It does +200mV by default, and works with all cards.


----------



## Arizonian

Quote:


> Originally Posted by *4zp1r1na*
> 
> just order a single r9 290 because i was not sure if my Corsair AX760i could handle 2 of them, so let´s see how it goes, hope i made the right decision coming from a 780 SC ACX
> and hoping i can unlock it to a 290x haha. posting pictures when it arrives.


Definitely a strong card; I'm sure you'll enjoy it either way. When you get it, post a proof submission and I'll add you to the roster.









Good luck unlocking it.


----------



## VSG

Quote:


> Originally Posted by *geggeg*
> 
> Well lads, it's been fun here and I was really hoping for the 290x Lightning to release soon. But anyway:


Arizonian, could you please remove me from the owners club? Have fun, everyone!


----------



## Arizonian

Quote:


> Originally Posted by *geggeg*
> 
> Arizonian, could you please remove me from the owners club? Have fun, everyone!


Will do.









Don't be a stranger.


----------



## battleaxe

Quote:


> Originally Posted by *Sazz*
> 
> Me? no plans, I am using it on a mini-ITX build in a bitfenix prodigy. so nope, no plans for crossfire. I prefer single card configuration to rid of the headaches that crossfire/sli brings when it's not properly working (specially on new games).
> 
> And so far almost half an hour has passed and its maxed out at 71C on VRM1.. should be able to take few degrees on that w/ a proper thermal pad.
> 
> And VRM2 is at 63C, not bad since I basically only have the face plate on it and nothing else. would put some chip sinks on the face plate once I get everything dialed down.


Are you passively cooling the VRM cooler? If you put a small fan on it, even at slow RPM, I think your temps will drop drastically. My temps are lower than this and my cooler is nowhere near that large on the VRMs, but I have a fan on them too. Big difference. I bet you'll see the low 40s or maybe even 30s if you add a fan.


----------



## Sazz

Quote:


> Originally Posted by *battleaxe*
> 
> Are you passively cooling the VRM cooler? If you put a small fan on it, even on slow rpm I think your temps will drop drastically more. My temps are lower than this and my cooler is no where near that large on the VRM's, but I have a fan on them too. Big difference. I bet you'll see low 40 or even 30's maybe if you add a fan.


Quote:


> Originally Posted by *BradleyW*
> 
> I'd zip tie a 80mm fan to the hanging heatsink.


I got a 90mm fan on the VRM part itself, and an 80mm fan is coming tomorrow that will be mounted on the hanging part of the heatsink.

So far 45 mins in and I'm at 71C.


----------



## battleaxe

Quote:


> Originally Posted by *Sazz*
> 
> I got a 90mm fan on the VRM part itself and a 80mm fan will come tomorrow that would be mounted on the hanging part of the heatsink
> 
> So far 45mins in and im at 71C.


Well, you did say you were running at 1100MHz, so that's probably the difference. Still, 71C ain't bad at all. For the size of that thing, though, I kinda expected a bit lower. But maybe the OC makes that impossible. I'll have to mess around with an OC on mine and see how I fare comparatively.


----------



## VSG

Quote:


> Originally Posted by *Arizonian*
> 
> Will do.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Don't be a stranger.


Fat chance! I will remain subscribed to the thread


----------



## Sazz

Quote:


> Originally Posted by *battleaxe*
> 
> Well, you did say you were running at 1100mhz. So that's probably the difference. Still 71c aint bad at all. For the size of that thing though, I kinda expected a bit lower. But maybe the OC makes that impossible. I'll have to mess around with an OC on mine and see how I fare comparatively.


Well, I am using stacked-up thermal pads; I didn't have a pad of the right thickness, so I think the stacking hurts its effectiveness. I just ordered some and will receive them next week.


----------



## battleaxe

Quote:


> Originally Posted by *Sazz*
> 
> Well I am using a stacked up thermal pad, I didn't have the right thick thermal pad for it, so I think the stacking of thermal pads affects its effectiveness. I just ordered some and will receive it next week.


Yeah, let us know how it does after the strip is on there. I looked at that cooler too; that's why I'm curious about it.

Judging by the pics, it looks to me like you might be able to use those in CrossFire? The way they curl up on top like that?


----------



## Sazz

Well, 1hr has passed and here are my temps.



Peaking at 74C, averaging 70C for that 1hr run. Imma try to see how the overclocks do (w/ voltage increase).


----------



## Caanon

*pokes head in thread* Hey guys, nice thread ya got goin' on here. Can I join in the fun?


----------



## kizwan

Quote:


> Originally Posted by *Sazz*
> 
> Well 1hr has passed and here's my temps.
> 
> 
> 
> peaking at 74C, averaging 70C for that 1hr run. imma try see how overclocks does. (w/ voltage increase.)


What is your ambient (indoor) temp?


----------



## ryboto

Just got an R9 290X Tri-X. So far, a quick OC result is 1175MHz core, 1500MHz RAM, +100mV offset, and +50 power. In Valley I'm seeing VRM1 temps just over 90C, so I'm hesitant to push this thing further without upping the fan speed; currently it's set to automatic.

Any suggestions on improving the OC? I'm using GPU-Z, Trixx, Valley and 3DMark to test and monitor. Is that still a standard set of tools? I don't really like using FurMark.


----------



## Loktar Ogar

Use Fire Strike or play BF4 to see if it's stable. Usually those OCs need more volts for both.


----------



## taem

Quote:


> Originally Posted by *leetmode*
> 
> Are stores still jacking up the prices of these cards? I need to upgrade from a 5970 but I just can't justify buying a r9 card knowing places like newegg are asking ~$100 over MSRP


I pre-ordered the MSI 290 Gaming from Amazon for $470. That's not so bad; MSRP for the reference card is $400, and you expect to pay $30+ extra for a custom cooler, so the markup is acceptable IMHO.


----------



## ryboto

Quote:


> Originally Posted by *Loktar Ogar*
> 
> Use Firestrike or play BF4 to see if it's stable. Usually, those OC needed more volts for both.


Tried Fire Strike earlier with +68mV at 1150MHz on the core for a few loops, no issues. Will try this new OC. I've also changed the fan profile in Trixx to custom, so VRM1 only hit 70C in Valley, which seemed to heat the card more than Fire Strike.

GPU-Z is showing VDDC as 1.18V; should I trust that?


----------



## Sazz

Quote:


> Originally Posted by *kizwan*
> 
> What is your ambient (indoor) temp?


22-25C ambient temp.

Tried +50mV @ 1150/1350 and +100mV @ 1200/1350 (no aux voltage) and i got 79C and 85C temps on valley run.


----------



## Loktar Ogar

*ryboto*, Good. We have the same settings: +68mV, 1150/1500, +50 PL.

If I go higher I will require more mV, which produces higher VRM temps, which I don't want personally.


----------



## ryboto

Quote:


> Originally Posted by *Loktar Ogar*
> 
> *ryboto*, Good. We have the same settings as to +68mV 1150/1500 +50 PL.
> 
> 
> 
> 
> 
> 
> 
> If i go higher i will require more mV which produces more VRM temps which i don't want personally.


This cooler seems to be doing its job, even more so when I up the fan speed. I didn't test to see how low I could go at 1150 with the voltage offset, mostly because I want to see how high I can go, and also to match the OC on my 7950.


----------



## kizwan

Quote:


> Originally Posted by *Sazz*
> 
> 22-25C ambient temp.
> 
> Tried +50mV @ 1150/1350 and +100mV @ 1200/1350 (no aux voltage) and i got 79C and 85C temps on valley run.


Temp is for Core or VRM1?


----------



## kdawgmaster

Well, I just ran 20-30 min worth of Valley on my 3 R9 290Xs with a core of 1125 and RAM of 1300; the VRMs on all cards never got past 65C, and core temps are sitting around 76C after running maxed out at 2560x1440.


----------



## kizwan

Quote:


> Originally Posted by *kdawgmaster*
> 
> well i just ran 20-30 min worth of valley and my 3 R9 290x with a core of 1125 and ram of 1300 my VRM's for all cards never got past 65C and core temps are sitting around 76C after running maxed at 2560X1440


What is your ambient (indoor) temp?


----------



## kdawgmaster

Quote:


> Originally Posted by *kizwan*
> 
> What is your ambient (indoor) temp?


Don't have anything to measure it with, but I would say no more than 22C.


----------



## Sazz

Quote:


> Originally Posted by *kizwan*
> 
> Temp is for Core or VRM1?


VRM1.

Haven't seen the core temp go over 50C so far; the H55 is doing its job quite well. The problem with AIO mods is VRM cooling. Imma aim to cut the VRM temp down to the 50s at stock voltage, 1100/1350.

Edit:

I just re-seated the VRM1 heatsink and am currently running Valley again for 1hr. Instead of stacking thermal pads I just used one piece; it doesn't have as much pressure on it as before, but so far temps are looking a bit better than with the stacked pads.


----------



## kizwan

Quote:


> Originally Posted by *Sazz*
> 
> VRM1
> 
> Haven't seen core temp go over 50C so far, H55 doing it's job quite well. the problem w/ AIO mods are VRM cooling, Imma aim to cut the VRM temp down to 50's at stock voltage 1100/1350.
> 
> Edit:
> 
> I just re-seated the VRM1 heatsink and currently running valley again for 1hr. Instead of stacking the thermal pad I just used one piece, doesn't have the pressure on it as before but so far temps are looking a bit better than the stacked thermal pad temps.


I'm thinking of trying your OC, +50mV @ 1150/1350. How about the power limit? Instead of 1 hour, I think running the benchmark in Extreme HD should suffice. What do you think?


----------



## Sazz

Quote:


> Originally Posted by *kizwan*
> 
> I'm thinking trying your OC "+50mV @ 1150/1350". How about power limit? Instead of 1 hour, I think running benchmark in Extreme HD should suffice. What do you think?


Oh, at +50mV 1150/1350 I put the power limit at +50%.

And so far I am 15 minutes into this Valley run with the single thermal pad on it, and it's about 9C lower than earlier. Interestingly enough, VRM2 has gone to 68C, where earlier it only hit 62C max.


----------



## Redvineal

Quote:


> Originally Posted by *Thanos1972*
> 
> i do not see guys recommending prolimatech Mk 26 for this card...why not?


Quote:


> Originally Posted by *ds84*
> 
> NZXT G10...


I have the MK-26 on my 290 and it's great!

You probably don't see many MK-26 mentions because it's not as widely available as the Arctic AXIII / Gelid Icy Vision. Oh, and Tom's published that article about the AXIII + 290X a few months ago, which probably steered a lot of people towards it.


----------



## kdawgmaster

Quote:


> Originally Posted by *Redvineal*
> 
> I have the MK-26 on my 290 and it's great!
> 
> You probably don't see many MK-26 mentions because it's not as widely available as the Arctic AXIII / Gelid Icy Vision. Oh, and Tom's published that article about the AXIII + 290X a few months ago, which probably steered a lot of people towards it.


I can't use either the Gelid or the AXIII because I have 3 cards xD It kinda sucks.


----------



## kizwan

Quote:


> Originally Posted by *Sazz*
> 
> oh at +50mV 1150/1350 I put the power limit at +50%
> 
> And so far I am 15minutes in on this valley run w/ single thermal pad on it and its about 9C lower than earlier. interesting enough VRM2 has gone to 68C which earlier only did 62C max.


I got square-dot artifacts & Valley not responding after a few seconds. Do you think I should increase the voltage? I already tried up to +60mV.


----------



## kdawgmaster

Quote:


> Originally Posted by *kizwan*
> 
> I got square dots artifact & valley not responding after few seconds. Do you think I should increase voltage? I already tried up to +60mV.


You should incrementally increase your voltage until the artifacts stop.
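
The trial-and-error loop being described can be sketched as code. Note `test_stable` is a hypothetical stand-in for whatever manual benchmark run (Valley, OCCT, etc.) you use to check for artifacts, and the step and cap values are only examples, not recommendations:

```python
# Hypothetical sketch of stepping the voltage offset up until a given
# clock passes a stability check. Not a real tuning tool: test_stable
# stands in for a manual benchmark run (Valley, OCCT, ...).
def find_stable_offset(test_stable, start_mv=0, step_mv=10, max_mv=100):
    """Return the first offset (in mV) that tests stable, or None."""
    offset_mv = start_mv
    while offset_mv <= max_mv:
        if test_stable(offset_mv):
            return offset_mv
        offset_mv += step_mv
    return None  # out of safe headroom; lower the clocks instead
```

The cap matters: as posters later in the thread put it, most people here stop around +100mV for daily use.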


----------



## Sazz

Quote:


> Originally Posted by *kizwan*
> 
> I got square dots artifact & valley not responding after few seconds. Do you think I should increase voltage? I already tried up to +60mV.


If you already got artifacts showing at 1150/1350 with +50mV, you're prolly gonna need around +65 to +75mV for the same clocks. Even if you don't get artifacts at that voltage, you might wanna check with OCCT; my clocks are stable through OCCT's 1hr error-checking test.

I already know what clocks my card can do, since I used to have a full waterblock on it when I still had my full ATX build. I'm still refraining from running at +200mV 1300/1500; I wanna know what kind of temps my VRMs would get at the lower voltages.


----------



## Redvineal

Quote:


> Originally Posted by *kdawgmaster*
> 
> I cant use either the Gelid or the AXIII because i have 3 cards xD it kinda sucks


Eww yea, the MK-26 is WAAAY out of the question for you, then!

I hope you're doing some mining with those triplets.


----------



## Arizonian

Quote:


> Originally Posted by *Caanon*
> 
> *pokes head in thread* Hey guys, nice thread ya got goin' on here. Can I join in the fun?


Congrats - welcome aboard - added


----------



## kdawgmaster

Quote:


> Originally Posted by *Redvineal*
> 
> Eww yea, the MK-26 is WAAAY out of the question for you, then!
> 
> I hope you're doing some mining with those triplets.


nope xD all gaming all the time.


----------



## TommyGunn123

Oh, c'mon MSI, they finally get into Australia and they're around $50 more than the Gigabyte version :/

MSI: http://www.pccasegear.com/index.php?main_page=product_info&cPath=416&products_id=26611

Gigabyte: http://www.pccasegear.com/index.php?main_page=product_info&cPath=416&products_id=26600

I was hoping for around the same price so it could go with my red/black theme (I know, kinda default), but that premium is fairly extreme :/


----------



## Sazz

Just got done with 1hr of temp testing on my re-seated VRM heatsink, this time with just one thermal pad instead of two stacked.

Temps are down from a max of 74C / avg 71C to a max of 63C / avg 62C, but interestingly enough VRM2 went up from 62C to 69C.

Gonna do some testing with +50mV and +100mV in a sec.


----------



## kizwan

Quote:


> Originally Posted by *kdawgmaster*
> 
> You should incrementally increase your voltages until it stops.


Quote:


> Originally Posted by *Sazz*
> 
> If you already got artifacts showing at 1150/1350 at +50mV you prolly gonna need around +65 to +75mV for the same clocks, even if at that voltage you don't get artifacts you might wanna check on OCCT. my clocks is stable thru OCCT 1hr error checking test.
> 
> I already know what clocks my card can do since I used to have full waterblock on it when I still have my full ATX build. I'm still refraining from running at +200mV 1300/1500, wanna know what kind of temps my VRM would get in the lower voltages.


Thanks. +rep for both of you.

Up to +100mV, the artifacts are still there, so I reduced to 1100/1300. I could probably do 1125/1350, for example, but I haven't tried it yet.

My result in CFX - 1100/1300 +100mV, power limit +50, ambient (indoor) 32.5C

GPU1


GPU2


----------



## Derpinheimer

Quote:


> Originally Posted by *Sazz*
> 
> Just got done w/ 1hr of temp testing on my re-seated VRM heatsink, this time w/ just one thermal pad instead of stacking two.
> 
> Temps are down from a max of 74C/Avg. 71C to 63Cmax/62C Avg, but interestingly enough VRM2 gone up from 62 to 69C.
> 
> Gonna do some testing w/ +50mV and +100mV in a sec.


That's really odd that the VRM2 temp would increase. Are you sure it's not a testing error? Or perhaps a small issue with contact?

7C is "a lot", but really, those temps are fine, and VRM2 doesn't matter that much. The 10C improvement on VRM1 is much more important.


----------



## Sazz

Quote:


> Originally Posted by *Derpinheimer*
> 
> Thats really odd VRM2 temp would increase. You sure its not testing error? Or perhaps a small issue with contact?
> 
> 7c is "a lot", but really.. those temps are fine, and VRM2 doesnt matter that much. That 10c improvement on VRM1 is muccccchhh more important.


I just noticed I forgot to put a couple of screws back in the faceplate. I will put those back in after these tests; VRM2 doesn't really get up to the danger zone anyway.

I just got done with quick ExtremeHD Valley runs at +50mV @ 1150/1350 and +100mV @ 1200/1350, and it's a HUGE improvement over the earlier run, which was +50mV = 79C and +100mV = 85C.

Now at +50mV my VRM temps didn't change from stock voltage, maxing out at 64C, and at +100mV my VRM1 temps maxed out at 71C, down from 85C.



I wonder how +200mV would do. xD

Edit:

Did a quick Valley ExtremeHD run at +200mV. One thing that's definitely different between a full block and this mod is the max core and memory clocks achievable: with a full EK block I was able to get 1300/1500 at +200mV, but with this mod I got up to 1275/1350. Anything more than 1275 and artifacts show up; anything over 1350 on memory and I get a black screen.

VRM1 temps hit a max of 91C, which is not bad considering it's +200mV on a modded VRM heatsink, and I wouldn't really run at +200mV much, only for benchmarking.


----------



## kizwan

@Sazz, what do you think about my temps?


----------



## esqueue

Quote:


> Originally Posted by *Redvineal*
> 
> Eww yea, the MK-26 is WAAAY out of the question for you, then!
> 
> I hope you're doing some mining with those triplets.


Quote:


> Originally Posted by *kdawgmaster*
> 
> nope xD all gaming all the time.


----------



## Sazz

Quote:


> Originally Posted by *kizwan*
> 
> @Sazz, what do you think about my temp?


You're under full-block watercooling, right? Temps shouldn't be a problem for you.


----------



## kizwan

Quote:


> Originally Posted by *Sazz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> @Sazz, what do you think about my temp?
> 
> 
> 
> You're under full block watercooling right? temps shouldn't be a problem for you.

Please see screenshots in my previous post - #post_21640458.


----------



## Sazz

Quote:


> Originally Posted by *kizwan*
> 
> Please see screenshots in my previous post - #post_21640458


Temps are about right, since you've got a high ambient temp. My ambient temp is around 22-25C, and when I had a full EK block on it my GPU maxed out at 47C and the VRMs maxed out at 55C.


----------



## kizwan

Quote:


> Originally Posted by *Sazz*
> 
> temps is about right, since you got high ambient temp. my ambient temp is around 22-25C and when I had a full EK block on it my GPU maxes out at 47C, VRM's maxes out at 55C.


Thanks. I'll take the lowest ambient, 22C. So basically your core is 25C above ambient & your VRMs are 33C above ambient.

Mine: ambient is 32.5C, so the core (65C) is 32.5C above ambient & VRM1 is 40.5C above ambient. Factoring in that I use a poor thermal pad & I'm running the GPUs in parallel, which adds a couple of degrees, my temps look somewhat OK, I guess.
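
For anyone who wants to run the same comparison on their own rig, the arithmetic above is just temperature minus ambient; a trivial Python sketch using the figures quoted in this exchange:

```python
# Delta-over-ambient: a fairer way to compare temps between rigs sitting
# in rooms at different temperatures. Figures are from the posts above.
def delta_over_ambient(temp_c, ambient_c):
    """How far a component runs above room temperature, in C."""
    return temp_c - ambient_c

sazz_core   = delta_over_ambient(47, 22)    # full EK block: 25C over ambient
sazz_vrm    = delta_over_ambient(55, 22)    # 33C over ambient
kizwan_core = delta_over_ambient(65, 32.5)  # 32.5C over ambient
```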


----------



## ottoore

Quote:


> Originally Posted by *Sazz*
> 
> temps is about right, since you got high ambient temp. my ambient temp is around 22-25C and when I had a full EK block on it my GPU maxes out at 47C, VRM's maxes out at 55C.


You get 65 degrees on VRM2 without any fan on it?!
Can you try +100 aux voltage in MSI AB?

Is there software which shows VRAM temperature?


----------



## Sazz

Quote:


> Originally Posted by *ottoore*
> 
> Do you have 65 degrees on vrm2 without any fan on it!??!?!??!
> Can you try +100aux voltage in msi ab?
> 
> Is there a sw which shows vram temperature?


I don't really bother using AUX voltage; it doesn't affect my clock stability at all.

And yes, VRM2 doesn't have a fan on it, just the faceplate, that's it. I am planning to put chip sinks all over the faceplate surface and maybe a small 40mm fan over VRM2. But VRM2 feeds the memory, and since the memory doesn't have voltage control, anything under 70C should be perfectly fine (though lower is better).

The main concern is VRM1, which feeds the core, and since the core is usually the part that gets over-volted, VRM1 gets hotter than the rest of the components on the PCB.


----------



## ottoore

Quote:


> Originally Posted by *Sazz*
> 
> I don't really bother using AUX voltage, it doesn't affect my clocks stability at all.
> 
> And yes VRM2 don't have fan on it, just the faceplate that's it. I am planning to put chipsinks all over the faceplate surface and maybe a small 40mm fan over the VRM2, but VRM2 is for Memory and since memory don't have voltage control anything under 70C should be perfectly fine, but lower will be better.
> 
> Main concern is VRM1 which is used by the Core, and since core is usually the one that gets over-volted the VRM1 is the one that gets hotter than the rest of the components in the PCB.


OK, thank you.

You could put chip sinks on the other side of the PCB along the line of VRM1. You could gain a couple of degrees.


----------



## cam51037

I think I need to RMA or return my R9 290 sometime soon. It's a Tri-X, and the fans rattle around 44%, but just over or under that there are no issues.

That, and when I set the power limit to anything higher than 0, without doing anything else the PC instantly has flashes of black pixels in random places on the screen. Doesn't seem good to me. :/


----------



## ryboto

The 290X is at 1200MHz core, 1500MHz RAM, +143mV, +50 power. Ran Crysis 3 for 15 minutes; core hit 70C, VRM1 87C. Safe temps? I've only allowed the fan speed to reach 80% at 70C; I have no desire to listen to 100% fans while gaming.


----------



## Sazz

Quote:


> Originally Posted by *ryboto*
> 
> 290x is at 1200mhz core, 1500mhz ram, +143mV, +50 power. Ran Crysis 3 for 15 minutes, core hit 70C, VRM1 87C. Safe temps? I've only allowed the fan speed to reach 80% at 70C, haven't the desire to listen to 100% fans while gaming.


Personally, I don't like putting VRM1 above 80C at any clocks. Besides, at 1100/1350 on stock voltage the card is quite overkill for every game.

Sometimes I run mine at +50mV 1150/1350 just to see if I can feel the difference.


----------



## ryboto

Quote:


> Originally Posted by *Sazz*
> 
> personally, I don't like putting the VRM1 above 80C on any clocks. besides at 1100/1350 stock voltage the card plays every game quite overkill.
> 
> Sometimes I run mine at +50mV 1150/1350 just to see the difference if I can feel it.


Agreed, the card definitely doesn't need the boost from a clock bump, but I'm still interested in a close-to-max air OC. Is there a reason you chose 80C as a limit? They're rated for much higher temperature operation, aren't they?


----------



## Sazz

Quote:


> Originally Posted by *ryboto*
> 
> Agreed, the card definitely doesn't necessarily need the boost from a clock bump, but I'm still interested in a close to max air OC. Is there a reason you choose 80C as a limit? I know they're rated for much higher temperature operation, aren't they?


I just feel more comfortable knowing that my temps are nowhere near the VRMs' rated max. If I remember right, most VRMs on these GPUs can withstand 120C, and some GPUs have VRMs rated up to 150C, but of course running them at those temps degrades them quicker.

I wanna run them as low as possible so that they last as long as they can.


----------



## battleaxe

Quote:


> Originally Posted by *ryboto*
> 
> Agreed, the card definitely doesn't necessarily need the boost from a clock bump, but I'm still interested in a close to max air OC. Is there a reason you choose 80C as a limit? I know they're rated for much higher temperature operation, aren't they?


The issue has more to do with efficiency. There was a chart floating around here somewhere that showed how much efficiency goes down as heat goes up, so as you push these harder it does take a toll on them. The idea is they just can't last as long at higher temps; how long is anyone's guess. After about 70C the efficiency starts dropping pretty drastically, which is why 80C makes sense as a maximum. If you don't care much about longevity, then I'd say it doesn't really matter, but for maximum life, below 80C makes a lot of sense. I keep mine under 60C just to be safe. That's maybe a bit overkill; I just want this card to last a long time. I like collecting computer parts, kind of a hobby for me.
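
A rough way to see why falling efficiency hurts: every point of efficiency lost turns directly into extra heat inside the VRM, which then pushes the temperature (and losses) up further. The numbers below are purely illustrative, not measured 290X figures:

```python
# Illustrative only: waste heat dissipated by a voltage regulator at a
# given conversion efficiency. A hot, less-efficient VRM turns more of
# its input power into heat, feeding back into even higher temps.
def vrm_heat_watts(output_w, efficiency):
    """Watts of waste heat for a given output power and efficiency."""
    return output_w / efficiency - output_w

cool_loss = vrm_heat_watts(250, 0.90)  # ~27.8 W of waste heat at 90%
hot_loss  = vrm_heat_watts(250, 0.80)  # 62.5 W at 80%, over twice as much
```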


----------



## kdawgmaster

Quote:


> Originally Posted by *ryboto*
> 
> 290x is at 1200mhz core, 1500mhz ram, +143mV, +50 power. Ran Crysis 3 for 15 minutes, core hit 70C, VRM1 87C. Safe temps? I've only allowed the fan speed to reach 80% at 70C, haven't the desire to listen to 100% fans while gaming.


The VRMs are technically able to take up to 100C, but I wouldn't feel comfortable doing that. If you're OK with it, the card will be fine, just running a little hot; if you're not, then find better cooling or lower the clocks and voltages.


----------



## Heinz68

HardwareCanucks Sapphire R9 290 4GB TRI-X OC Review

Nice Overclock Results
Quote:


> TRIXX is a relatively straightforward application which, for the time being, allows for modification of core / memory frequencies and core voltage. This is basically what AMD's PowerTune offers but in a highly simplified and user-friendly manner.
> 
> By using TRIXX we were able to get the Sapphire R9 290 Tri-X OC to hit 1251MHz on the core and 5666MHz on the GDDR5 with .085V of extra voltage. This vastly outstrips the levels attained by our reference samples and brings actual performance to some incredible levels.


----------



## BradleyW

Quote:


> Originally Posted by *Heinz68*
> 
> HardwareCanucks Sapphire R9 290 4GB TRI-X OC Review
> 
> Nice Overclock Results


From review:
Quote:


> This vastly outstrips the levels attained by our reference samples


I wonder why that is? Oh yeah...because reference have crap coolers.


----------



## jerrolds

Whoa, 1251MHz on the core, on air? That's pretty good. Can't find any mention of VRM temps though.


----------



## Loktar Ogar

I wish I could achieve that OC too... or at least get 1200 on the core clock stable. I'm having trouble getting 1170 on the core, and the increased +100mV is not enough in AB... Maybe that card has a high ASIC score, if that matters, or it's just a really good overclocker.

This gives me a hint to use TRIXX later today and test it. By the way, what VDDC offset is ".085V of extra voltage"?


----------



## carlovfx

Quote:


> Originally Posted by *BradleyW*
> 
> From review:
> I wonder why that is? Oh yeah...because reference have crap coolers.


I have an R9 290 unlocked to a 290X and another R9 290X, in CrossFire.
With Sapphire Trixx I cannot go over 1170MHz on the core and around 1300 on the memory; these cards are just not good overclockers.
I forgot to mention, I run my own custom loop and temps are always below 50 degrees under stress, so it's not a temp issue.


----------



## Ukkooh

Got my new 290X today. Haven't had the time to install it yet, but I hope it does 1150 at stock voltage. I really hope it hits 1200 once I get it under water.


----------



## Jack Mac

Quote:


> Originally Posted by *Ukkooh*
> 
> Got my new 290x today. Haven't had the time to install it yet but I hope it does 1150 @stock voltage. I really hope it hits 1200 once I get it under water.


1150 is a bit much to ask for at stock voltage.


----------



## PillarOfAutumn

What's the max voltage you should aim for? Is it okay if I push it up to 200mV and see what max clocks I get? My card is under water.


----------



## Jack Mac

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Whats a max voltage you should aim for? Is it okay if I push it upto 200mv and see what the max clocks I will get? My card is under water.


TBH, no higher than 100mV for 24/7 use. Benching 200mV is ok, but a bit risky. For water, I'd do 145mV at the most for 24/7.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Jack Mac*
> 
> TBH, no higher than 100mV for 24/7 use. Benching 200mV is ok, but a bit risky. For water, I'd do 145mV at the most for 24/7.


Thank you! I usually have the Sapphire profile running in the background. I always have it set to stock clocks, but when I start to game, I load one of the profiles I have, finish my gaming, and either turn my PC off or reset the clocks/voltage when I need my PC for less demanding work. Right now, at the max clocks I'm running, I get the same amount of stability at 100mV vs 84mV, so I just keep it at 87mV. I will push it up to 120mV and see how far I can get.


----------



## ryboto

Has anyone been successful with downsampling on these cards? I read in a thread from a few days ago that the most recent 13.3 betas work with the downsampling tools. Has anyone tried?


----------



## Jack Mac

What screws do I need to unscrew to remove the shroud only, not the heatsink?


----------



## 4zp1r1na

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *4zp1r1na*
> 
> just order a single r9 290 because i was not sure if my Corsair AX760i could handle 2 of them, so let´s see how it goes, hope i made the right decision coming from a 780 SC ACX
> 
> and hoping i can unlock it to a 290x haha. posting pictures when it arrives.
> 
> 
> 
> Definitely strong card I'm sure you'll enjoy it either way. When you get it post proof submission and I'll add you to the roster.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Good luck unlocking it.

Thanks Arizonian

And my card has arrived, it's here!

Some pics







I actually like the look of the cooler and it goes well with my system.

I'm still toying with the idea of getting a second one, but I'm not sure if my AX760i can handle it, and I don't really want to replace it because it arrived just one week before my 290, haha.

I still haven't checked temperatures and noise, or even whether it's unlockable, but so far I think a waterblock will be a good friend to my 290.


----------



## Loktar Ogar

Quote:


> Originally Posted by *Loktar Ogar*
> 
> I wish i can achieve that OC too... or at least i can get a CC 1200 stable... I'm having trouble getting CC 1170 and the increased mV of +100 is not enough in AB... Maybe that card has a high ASIC score if that matters or a really good OC.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This gives me a hint to use TRIXX later today and test it. By the way how many VDDC offset is (.085V of extra voltage)?


Follow-up question: what is the average OC for an R9 290 anyway? 1150 or 1200 on the core?


----------



## battleaxe

Quote:


> Originally Posted by *Jack Mac*
> 
> What screws do I need to unscrew to remove the shroud only, not the heatsink.


They are the tiny ones on the sides of the plastic, not the screws on the back of the PCB. Leave the PCB screws in place. There might be two small ones on the chrome faceplate too; I can't remember on that one. Either way, you'll see once you start. Be careful when removing the shroud: I think the fan is attached to the plastic and the wires are short. (Not entirely sure though, can't remember that either... lol)
Quote:


> Originally Posted by *Loktar Ogar*
> 
> Follow up question: What is the average OC of R9 290 anyway? CC 1150 or CC 1200?


Average would be much closer to 1150 or even 1100 than 1200. Not too many can do 1200. Very few can do 1250 and only a handful reach 1300.


----------



## Knight26

Quote:


> Originally Posted by *4zp1r1na*
> 
> Thanks Arizonian
> 
> And my card has arrive, it here
> 
> Some pics
> 
> 
> 
> 
> 
> 
> 
> I actually like the look of the cooler and it goes well with my system,
> 
> I'm still with the idea of getting a second one but not sure if my AX760i Can handle it, and I don't really want to change it because just arrived one week before my 290 haha.
> 
> I still don't check temperatures and noise, not even if it's unlockeable but so far I think a Waterblock will be a good friend to my 290.


You would probably still be fine running 2x 290s with the AX760i as long as you're not planning on overvolting. I'm running 2x 290s @ 1108MHz right now with an OC'd 3770K @ 1.4V, and my system pulls a little less than 650W running BF4 at 5760x1080 mostly maxed out. I have a 12V water pump, 8 fans, 2 SSDs, and a mechanical hard drive running as well. So your power draw will depend on what else you have in your system.
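As a rough sanity check on the kind of budget described above, the math can be sketched like this. Every wattage below is an assumed, illustrative figure, not a measurement; use a wall meter for real numbers:

```python
# Back-of-envelope PSU headroom check. All wattages here are rough
# assumptions for illustration -- measure at the wall for real figures.
components = {
    "2x R9 290 under gaming load": 480,   # assumed ~240 W each
    "overclocked i7-3770K @ 1.4 V": 120,  # assumed
    "board, RAM, drives, pump, fans": 40, # assumed
}

total = sum(components.values())
psu_watts = 760  # Corsair AX760i continuous rating
headroom = psu_watts - total
print(f"Estimated draw: {total} W; headroom: {headroom} W of {psu_watts} W")
```

Under these assumed numbers a 760 W unit has maybe 100-150 W of headroom at stock volts, which is why overvolting two 290s on it gets dicey.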


----------



## owcraftsman

I've had the card since Nov 2013, just found this thread and would like to join the club.

Just put on water a couple weeks ago and looking forward to pushing the OC a bit higher with some core voltage bumpage.

Looks like there is some good advice to be had here.

Thanks to the OP and contributors.

Max temps gaming/OC'd: 48°C, with 26°C ambient at the time of testing and 32°C case internal, playing BF4 MP.



http://www.techpowerup.com/gpuz/vyr5h/

or


----------



## ryboto

Quote:


> Originally Posted by *battleaxe*
> 
> Average would be much closer to 1150 or even 1100 than 1200. Not too many can do 1200. Very few can do 1250 and only a handful reach 1300.


Sweet. That means I lucked out with this card, as it can do 1200MHz. Ambient is pretty low now, so I might have to crank the Tri-X to 100% later in the year. Meanwhile my 3570K is mediocre.


----------



## Matt-Matt

So guys,

What are some good waterblocks for these?
So far I can get EK Acetal blocks for around $20 cheaper than anything else (well, the same price as XSPC).

I'm thinking that's the best, yes?


----------



## VSG

Yup.


----------



## Mr357

Quote:


> Originally Posted by *Matt-Matt*
> 
> So guys,
> 
> What are some good waterblocks for these?
> So far I can get EK Acetal's for around $20 cheaper then anything else (Well, the same price as XSPC).
> 
> I'm thinking that is the best yes?


The EK blocks are great, so I say go for it.


----------



## Krusher33

I'm going to buy a waterblock sometime next week. Which is the best?


----------



## rx7racer

Quote:


> Originally Posted by *Krusher33*
> 
> I'm going to buy a waterblock sometime next week. Which is the best?


I don't know that enough are out in the wild and compiled to really say which is best. But they are all fairly equal and do a fine job. Heatkiller if you like the weighty copper look and feel; Aqua Computer just 'cause it's sexy for some reason. EK and Koolance seem to be the main go-tos, though they look average to me.

All the numbers you find for them are respectable and well within margin of error depending on rad/fan etc. setup, imo.

Maybe someone has a link to some more comparative articles that can show for sure. My quick glances suggest VRM temps vary a bit, but for the core, if your loop is good, they all (even universal blocks) keep the 290X plenty cool.

Simple way: let budget dictate; if it allows, then go by what tickles your fancy looks-wise.


----------



## Krusher33

Quote:


> Originally Posted by *rx7racer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Krusher33*
> 
> I'm going to buy a waterblock sometime next week. Which is the best?
> 
> 
> 
> I don't know that enough are out and about and compiled to really say which is best. But they are all fairly equal and do a fine job. Heatkiller if you like the weighty copper look and feel, Aqua computer just cause it's sexy for some reason. EK, and Koolance are the main go to it seems though but look avg to me.
> 
> All the numbers you find for them are respectable and well within margin or error depending on rad,fan etc setup imo.
> 
> Maybe someone has a link to a bit more comparative articles that can show for sure. My quick glances and looks are that VRM temps vary a bit but for the core part if your loop is good they all and even universal blocks keep the 290X plenty cool.
> 
> Simple way is let budget dictate, if it allows then go by what tickles your fancy looks wise.

k, and thanks.

BTW I was sooooo addicted to that game in your avy. (back in da day)


----------



## Mr357

Quote:


> Originally Posted by *Krusher33*
> 
> I'm going to buy a waterblock sometime next week. Which is the best?


EK, Koolance, and XSPC are all great, but I've always been told that Heatkiller blocks are the cream of the crop. If you like that clean copper look, go with one of those. They are, however, a little more expensive than the other brands, but for a reason, I would hope.


----------



## chiknnwatrmln

LOL good luck getting an EK Acetal block...

I ordered one from FrozenCPU Dec 14th and I'm still waiting on it to ship. I've talked with the reps and they said that EK is way behind on their order, one customer service rep said specifically that he has orders for over 70 blocks that EK is behind on.

The price isn't bad and the Acetal is sexy but this wait is killing me. I've had every other part for my rebuild for weeks now.


----------



## Loktar Ogar

Just for fun and got bored. Tried TRIXX utility and got these results. I enjoyed the scores but not the voltage increase. lol









Ambient temps around 25°C to 27°C. I clicked on custom fan profile in TRIXX but did not configure one.

1200/1500 +168 mV and Power limit +50

Firestrike 1.1 Score: 10358

http://www.3dmark.com/3dm/2287107

1200/1600 +168 mV and Power limit +50

Firestrike 1.1 Score: 10440*

http://www.3dmark.com/3dm/2287125

* For some reason the clocks did not register

I don't feel like pushing the core clock past 1200 due to the volts. These R9 290 Tri-X cards would probably be good under water.


----------



## Sazz

Quote:


> Originally Posted by *Mr357*
> 
> EK, Koolance, and XSPC are all great, but I've always been told that Heatkiller blocks are the cream of the crop. If you like that clean copper look, go with one of those. They are, however a little more expensive than the other brands, but for a reason I would hope.


Heatkiller and Koolance are always at the top of the chain in terms of cooling performance, with XSPC and EK right next to them at a difference of 1-2°C at most. But the difference in flow rate (especially with the Koolance block) is big: XSPC normally has the best flow rate, EK is next, Heatkiller comes after those two, and Koolance is the most restrictive block you'll see.


----------



## Arizonian

Quote:


> Originally Posted by *4zp1r1na*
> 
> Thanks Arizonian
> 
> And my card has arrive, it here
> 
> Some pics
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I actually like the look of the cooler and it goes well with my system,
> 
> I'm still with the idea of getting a second one but not sure if my AX760i Can handle it, and I don't really want to change it because just arrived one week before my 290 haha.
> 
> I still don't check temperatures and noise, not even if it's unlockeable but so far I think a Waterblock will be a good friend to my 290.


You're right, it does match nicely with that touch of red. Congrats - added









Quote:


> Originally Posted by *owcraftsman*
> 
> I've had the card since Nov 2013, just found this thread and would like to join the club.
> Just put on water a couple weeks ago and looking forward to pushing the OC a bit higher with some core voltage bumpage.
> Looks like there is some good advice to be had here.
> Thanks to the OP and contributors.
> 
> Max temps gaming/oced 48c 26c ambient at time of testing 32c case internal playing BF4 MP
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/vyr5h/
> 
> or
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## neurotix

Quote:


> Originally Posted by *Loktar Ogar*
> 
> Just for fun and got bored. Tried TRIXX utility and got these results. I enjoyed the scores but not the voltage increase. lol
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1200/1500 +168 mV and Power limit +50
> 
> Firestrike 1.1 Score: 10358
> 
> http://www.3dmark.com/3dm/2287107
> 
> snip


My R9 290 Tri-X takes the exact same amount of voltage to do 1200MHz stably: +168mV. Anything less and I get errors in OCCT. With +168mV I can run it for 10 minutes and not get errors, so it seems good to me. I've tested it in games a lot over the last few days: no crashes or lockups, no display driver crashes.

Anything more than 1200MHz on the core causes artifacts in Valley. Well, I can get away with 1215MHz with no artifacts, but I just leave it at 1200 because it's an even number and I'd rather have total stability when gaming than a tiny bit more performance.

You're lucky your RAM does 1600MHz. Mine will do 1550 fine in Valley, but at 1600 I black screen immediately, even with +50 aux voltage in Afterburner.

Overall I'm VERY happy with the performance of my card at 1200/1500, and the only problem I've had at these clocks is artifacts in Skyrim using ENBSeries. If I turn the core clock down they go away. This is the only game so far that does this, and it doesn't artifact without ENB at those clocks. The performance and temps of this thing are great, and it powers through most games I play, in Eyefinity no less. Any Unreal Engine 3 game seems to run at 60 fps with everything maxed out; I've tried Unreal Tournament 3 and Bulletstorm. Other games run great too, like Just Cause 2. Really the only thing it can't max is Crysis 3: I have to turn a few post-processing options to "medium" to get good FPS, but I can't tell the difference between medium and very high anyway. I also have the Metro games but I've yet to try them.


----------



## Falkentyne

Try going into Afterburner and setting unofficial overclocking mode to "without PowerPlay support". Make sure your idle clocks are then set to 1000/1250 at idle (instead of 150/300), and then try to set your RAM to 1600MHz.

You won't black screen. At worst, the screen will flicker, since the 2D idle voltage will now be the 3D voltage.
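For reference, the unofficial overclocking switch lives in Afterburner's config file and can also be set by hand. This is a sketch from memory; the exact file location, section name, and EULA wording may differ between Afterburner versions, so back the file up and edit it with Afterburner closed:

```ini
; MSIAfterburner.cfg, in the Afterburner install directory
[ATIADLHAL]
; 0 = disabled, 1 = with PowerPlay support, 2 = without PowerPlay support
UnofficialOverclockingMode = 2
; the EULA line must be filled in for the mode to take effect
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
```

With mode 2 the card no longer drops to 150/300 at idle, which is where the 2D flicker mentioned above comes from.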


----------



## bayz11

Quote:


> Originally Posted by *Krusher33*
> 
> I'm going to buy a waterblock sometime next week. Which is the best?


This is my aquacomputer block for your reference


----------



## kizwan

I can get 7 W/mK rated thermal pads locally. Is 7 W/mK enough? I know people suggest Fujipoly but it's not available locally.
Quote:


> Originally Posted by *Loktar Ogar*
> 
> Just for fun and got bored. Tried TRIXX utility and got these results. I enjoyed the scores but not the voltage increase. lol
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1200/1500 +168 mV and Power limit +50
> 
> Firestrike 1.1 Score: 10358
> 
> http://www.3dmark.com/3dm/2287107
> 
> 1200/1600 +168 mV and Power limit +50
> 
> Firestrike 1.1 Score: 10440*
> 
> http://www.3dmark.com/3dm/2287125
> 
> * For some reason the clocks did not registered
> 
> I don't feel like doing CC 1200+ due to volts. This R9 290 TRI-X are probably good in water.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


What is your ambient (indoor) temp?
Quote:


> Originally Posted by *Matt-Matt*
> 
> So guys,
> 
> What are some good waterblocks for these?
> So far I can get EK Acetal's for around $20 cheaper then anything else (Well, the same price as XSPC).
> 
> I'm thinking that is the best yes?


Quote:


> Originally Posted by *rx7racer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Krusher33*
> 
> I'm going to buy a waterblock sometime next week. Which is the best?
> 
> 
> 
> I don't know that enough are out and about and compiled to really say which is best. But they are all fairly equal and do a fine job. Heatkiller if you like the weighty copper look and feel, Aqua computer just cause it's sexy for some reason. EK, and Koolance are the main go to it seems though but look avg to me.
> 
> All the numbers you find for them are respectable and well within margin or error depending on rad,fan etc setup imo.
> 
> Maybe someone has a link to a bit more comparative articles that can show for sure. My quick glances and looks are that VRM temps vary a bit but for the core part if your loop is good they all and even universal blocks keep the 290X plenty cool.
> 
> Simple way is let budget dictate, if it allows then go by what tickles your fancy looks wise.

All the blocks perform close to each other, but the best _out-of-the-box_, I'd say, is the Aquacomputer Kryographics block with its active backplate. The active backplate does a good job of cooling VRM1.

_[source]_


To be fair, the stock thermal pads that come with the EK block are not good. So if you're going to order an EK block, order some Fujipoly thermal pads too. With those, you'll get better cooling on VRM1, maybe close to the AC active backplate's performance, if not better.

http://www.frozencpu.com/products/17499/thr-181/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_05_-_Thermal_Conductivity_170_WmK.html?tl=g30c309s1294

http://www.frozencpu.com/products/17500/thr-182/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_10_-_Thermal_Conductivity_170_WmK.html?tl=g30c309s1294
Quote:


> Originally Posted by *chiknnwatrmln*
> 
> LOL good luck getting an EK Acetal block...
> 
> I ordered one from FrozenCPU Dec 14th and I'm still waiting on it to ship. I've talked with the reps and they said that EK is way behind on their order, one customer service rep said specifically that he has orders for over 70 blocks that EK is behind on.
> 
> The price isn't bad and the Acetal is sexy but this wait is killing me. I've had every other part for my rebuild for weeks now.


I bought mine on EKWB website. Ordered on 8 January & received mine on 20 January. I waited one week.
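On the 7 W/mK question above: the temperature rise across a pad scales as ΔT = P·t/(k·A), so you can get a rough feel for how much conductivity matters. The power and pad dimensions below are assumptions for illustration, not measured values for these cards:

```python
# Rough temperature rise across a VRM thermal pad: dT = P * t / (k * A).
# All inputs below are illustrative assumptions, not measurements.

def pad_delta_t(power_w, thickness_mm, k_w_mk, area_mm2):
    """Temperature drop in C across a pad conducting power_w watts."""
    t = thickness_mm / 1000.0   # mm -> m
    a = area_mm2 / 1e6          # mm^2 -> m^2
    return power_w * t / (k_w_mk * a)

# Assume ~30 W through VRM1, a 1.0 mm pad, and a 15 x 60 mm contact patch.
for k in (7.0, 17.0):           # 7 W/mK local pad vs 17 W/mK Fujipoly
    print(f"{k:>4.0f} W/mK pad: {pad_delta_t(30, 1.0, k, 15 * 60):.1f} C rise")
```

Under these assumptions the jump from 7 to 17 W/mK only saves a few degrees across the pad itself, so a 7 W/mK pad is likely serviceable; mounting pressure and pad thickness matter at least as much.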


----------



## MrWhiteRX7

I know where a few BNIB XSPC 290 waterblocks and backplates are for sale, ready to ship!


----------



## dimwit13

Hey guys, I just bought a Sapphire 290 and an XSPC Razor R9 290X/290 full-cover waterblock and backplate.
Looking forward to seeing what I can get out of it.
I will be back in a week or so to ask all kinds of dumb, already-answered questions. Thanks - lol

-dimwit-
Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I know where a few bnib Xspc 290 water blocks and backplates are for sale ready to ship!


Performance PCs-where I bought mine.


----------



## steelkevin

Quote:


> Originally Posted by *HardwareDecoder*
> 
> I see. I think I really don't need to change it right now anyway. I just see people with like 42-45c load temps on the 290/x with + voltage and mine gets to like 50c in BF4 with no + voltage. So i'm running a bit hotter and I think it is my pump not being that powerful to go through CPU Block 2 rads and a gpu block. I should probably just be happy that everything works for now and just upgrade my pump whenever I do a new build.


You posted this a while back, but I stopped reading this thread a while ago and am 2000 pages behind, so I just had to answer it.

I also have a Midi R2 and was getting 50-55°C on load with a slight OC but no voltage bumps.
All I did was remove all the dust filters from the case, and now I'm getting 10-15°C less on the CPU cores and am only hitting 36°C on the GPU.


----------



## Krusher33

Spoiler: Everyone who responded to my waterblock question



Quote:


> Originally Posted by *Mr357*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Krusher33*
> 
> I'm going to buy a waterblock sometime next week. Which is the best?
> 
> 
> 
> EK, Koolance, and XSPC are all great, but I've always been told that Heatkiller blocks are the cream of the crop. If you like that clean copper look, go with one of those. They are, however a little more expensive than the other brands, but for a reason I would hope.

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> LOL good luck getting an EK Acetal block...
> 
> I ordered one from FrozenCPU Dec 14th and I'm still waiting on it to ship. I've talked with the reps and they said that EK is way behind on their order, one customer service rep said specifically that he has orders for over 70 blocks that EK is behind on.
> 
> The price isn't bad and the Acetal is sexy but this wait is killing me. I've had every other part for my rebuild for weeks now.


Quote:


> Originally Posted by *Sazz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mr357*
> 
> EK, Koolance, and XSPC are all great, but I've always been told that Heatkiller blocks are the cream of the crop. If you like that clean copper look, go with one of those. They are, however a little more expensive than the other brands, but for a reason I would hope.
> 
> 
> 
> heatkiller and Koolance are always on the top of the chain in terms of cooling performance and XSPC and EK is right next to them w/ a difference of 1-2C at most. but the margin on flow rate (specially Koolance block) is big, XSPC normally have better flow rate, EK next to it and heatkiller would come up after the two, and Koolance being the most restrictive blocks you would see.

Quote:


> Originally Posted by *bayz11*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Krusher33*
> 
> I'm going to buy a waterblock sometime next week. Which is the best?
> 
> 
> 
> This is my aquacomputer block for your reference

Quote:


> Originally Posted by *kizwan*
> 
> I can get 7W/MK rated thermal pad locally. Does 7W/MK is enough? I know people suggest Fujipoly but it's not available locally.
> Quote:
> 
> 
> 
> Originally Posted by *Loktar Ogar*
> 
> Just for fun and got bored. Tried TRIXX utility and got these results. I enjoyed the scores but not the voltage increase. lol
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1200/1500 +168 mV and Power limit +50
> 
> Firestrike 1.1 Score: 10358
> 
> http://www.3dmark.com/3dm/2287107
> 
> 1200/1600 +168 mV and Power limit +50
> 
> Firestrike 1.1 Score: 10440*
> 
> http://www.3dmark.com/3dm/2287125
> 
> * For some reason the clocks did not registered
> 
> I don't feel like doing CC 1200+ due to volts. This R9 290 TRI-X are probably good in water.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What is your ambient (indoor) temp?
> Quote:
> 
> 
> 
> Originally Posted by *Matt-Matt*
> 
> So guys,
> 
> What are some good waterblocks for these?
> So far I can get EK Acetal's for around $20 cheaper then anything else (Well, the same price as XSPC).
> 
> I'm thinking that is the best yes?
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *rx7racer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Krusher33*
> 
> I'm going to buy a waterblock sometime next week. Which is the best?
> 
> 
> I don't know that enough are out and about and compiled to really say which is best. But they are all fairly equal and do a fine job. Heatkiller if you like the weighty copper look and feel, Aqua computer just cause it's sexy for some reason. EK, and Koolance are the main go to it seems though but look avg to me.
> 
> All the numbers you find for them are respectable and well within margin or error depending on rad,fan etc setup imo.
> 
> Maybe someone has a link to a bit more comparative articles that can show for sure. My quick glances and looks are that VRM temps vary a bit but for the core part if your loop is good they all and even universal blocks keep the 290X plenty cool.
> 
> Simple way is let budget dictate, if it allows then go by what tickles your fancy looks wise.
> 
> 
> All blocks performs close to each other but the best _out-of-the-box_ I can say is the Aquacomputer Kryographics blocks with their active backplate. The active backplate does a good job in cooling VRM1.
> 
> _[source]_
> 
> 
> To be fair, the stock thermal pad that come with the EK block is not good. So, if you're going to order EK block, order some Fujipoly thermal pads. With this thermal pad, you'll get better cooling on VRM1. Maybe close to the AC active backplate performance if not better.
> 
> http://www.frozencpu.com/products/17499/thr-181/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_05_-_Thermal_Conductivity_170_WmK.html?tl=g30c309s1294
> 
> http://www.frozencpu.com/products/17500/thr-182/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_10_-_Thermal_Conductivity_170_WmK.html?tl=g30c309s1294
> Quote:
> 
> 
> 
> Originally Posted by *chiknnwatrmln*
> 
> LOL good luck getting an EK Acetal block...
> 
> I ordered one from FrozenCPU Dec 14th and I'm still waiting on it to ship. I've talked with the reps and they said that EK is way behind on their order, one customer service rep said specifically that he has orders for over 70 blocks that EK is behind on.
> 
> The price isn't bad and the Acetal is sexy but this wait is killing me. I've had every other part for my rebuild for weeks now.
> 
> 
> I bought mine on EKWB website. Ordered on 8 January & received mine on 20 January. I waited one week.





Ok thanks everyone. My card/waterblock is pending payment today so I may be ordering one sooner than I thought.


----------



## dspacek

Quote:


> Originally Posted by *Krusher33*
> 
> I'm going to buy a waterblock sometime next week. Which is the best?


I'm using the Heatkiller since Alphacool hasn't made one for the R9 290 yet. So I think it's the best choice: very good performance. And it doesn't have that infantile coconut tree on it. Let's see it.


----------



## syniad

Recently got 2 R9 290s running in CrossFire at 1100/1400, and performance can be pretty random/stuttery. By far the worst game seems to be Skyrim with Unreal ENB. FPS dips as low as 18 and hovers around 30-55, which is pretty lame considering I was getting a nice smooth 40-50 fps with my single GTX 670 on the same settings. Turning CrossFire off I still get around 30 fps, but it doesn't stutter and lag so much.

With the ENB off I get around 110 fps, so something is definitely weird with the ENB and CrossFire 290s, or maybe my overall mod setup just works better with Nvidia cards. Not sure.


----------



## AlDyer

Does anybody else have extremely touchy Elpida memory? If I OC my Elpida memory to something like 1500, for example, my card just black screens and I have to flip the BIOS switch to get it working again. Do you know of any BIOS that has memory control and allows higher core voltage? I'm still stuck with the stock cooler since I can't find a block in stock anywhere in Finland, so I don't really want to overvolt too much right now.


----------



## syniad

Quote:


> Originally Posted by *AlDyer*
> 
> Does anybody else have extremely touchy Elpida memory? My card just black screens and I have to change the BIOS switch to get it working again, if I OC my Elpida memory to something like 1500 for example. Do you know any BIOS that has memory control and the possibility of higher core voltage? I am still stuck with stock cooler since I can't find a block in stock anywhere in Finland so I don't really want to overvolt too much right now


I also have Elpida memory on my cards, and if I overclock the memory much past around 1500 I start to get really strange issues, such as the entire screen flashing whenever I drag something on the desktop. Not sure what causes this; it could be something to do with my monitor (one of the Korean PLS ones) running pretty close to its refresh rate peak. Granted, I haven't done any voltage increase yet (I have a feeling voltage is locked on my cards anyway). I'll try once I get in and see what happens.


----------



## vieuxchnock

*Do you think we can "remove" (cancel, take off, call it whatever you want) a DVI port on a 290? I need a single-slot card. LOL*


----------



## Krusher33

Quote:


> Originally Posted by *dspacek*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Krusher33*
> 
> I'm going to buy a waterblock sometime next week. Which is the best?
> 
> 
> 
> Im using Heatkiller since Alphacool did not made them for R9-290 yet. So i thing its the best choice, very good performance. And doesnt have that infantill kokonut tree on it. Lets see it.

Thanks. I do like the look of heatkillers. Just unsure how it would look on a motherboard with gold heatsinks.


----------



## dspacek

Quote:


> Originally Posted by *vieuxchnock*
> 
> *Do you think we can "remove". (cancel, take off call it whatever you want) a DVI port on a 290? I need a single slot card. LOL*


Absolutely yes, it's only an output, so as long as you don't short-circuit it, you can desolder it ;-)


----------



## vieuxchnock

Quote:


> Originally Posted by *dspacek*
> 
> Absolutly Yes, its only output, so if you do not short circuit of that output, you can solder it out ;-)


*Thanks for your fast answer. It's because I have an mATX board and I want to CrossFire 2x 290s, but I have a RevoDrive between them, so if I can make the bottom card single-slot it will work.







*


----------



## Roy360

Quote:


> Originally Posted by *dspacek*
> 
> Absolutly Yes, its only output, so if you do not short circuit of that output, you can solder it out ;-)


I would love to do this. But is there a way to do it without voiding the warranty?


----------



## AlDyer

Stock cooler; can't wait to see what I can achieve with water! Is this along the lines of what you guys have been getting?

http://www.3dmark.com/3dm/2290894


----------



## kizwan

When overclocking & overvolting, is the second GPU also supposed to overvolt? I'm asking because mine is not. See the pic below (I added labels to show peak VDDC on both cards). The first time I installed these cards, with the Hynix card in the first slot & the Elpida card in the second slot, the Elpida card seemed to "lag" behind. When I swapped the cards so that the Hynix card was in the second slot, both cards performed better. Now, when I installed the EK terminal to interconnect the water blocks on both cards, I mistakenly put the Hynix card in the first slot again, so the Elpida card seems to be "lagging" behind again. No biggie, I'm going to tear them down later anyway.


----------



## esqueue

Quote:


> Originally Posted by *Roy360*
> 
> I would love to do this. But it there a way to do this video voiding the warranty?


Well, of course they would void the warranty if you desolder a component from the video card. If a manufacturer didn't, I would avoid them, as bad decisions like that will lead them to bankruptcy. Voiding warranties for better cooling is another issue: it can reduce temps and actually help the video card last longer and perform better (no matter what the blind follower told me much earlier in this thread), provided the correct cooling is used and installed correctly. Many aftermarket coolers fail to properly address the VRMs and RAM.


----------



## Krusher33

Quote:


> Originally Posted by *syniad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *AlDyer*
> 
> Does anybody else have extremely touchy Elpida memory? My card just black screens and I have to change the BIOS switch to get it working again, if I OC my Elpida memory to something like 1500 for example. Do you know any BIOS that has memory control and the possibility of higher core voltage? I am still stuck with stock cooler since I can't find a block in stock anywhere in Finland so I don't really want to overvolt too much right now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I also have Elpida memory on my cards and if I overclock the memory much past around 1500 I start to get really strange issues, such as the entire screen flashing whenever I drag something on the desktop. Not sure what causes this; it could be something to do with my monitor (one of the Korean PLS ones) running pretty close to its refresh rate peak. Granted, I've not done any voltage increase yet (I have a feeling voltage is locked on my cards anyway). I'll try once I get in and see what happens.

Adding voltage has fixed it for many people.


----------



## dspacek

Quote:


> Originally Posted by *AlDyer*
> 
> Stock cooler, can't wait to see what I can achieve with water! Is this in the lines of what you guys have been getting?
> 
> http://www.3dmark.com/3dm/2290894


My GPU score is 12930 with a nothing-special 1200/1700 overclock


----------



## AlDyer

Quote:


> Originally Posted by *Krusher33*
> 
> Adding voltage has fixed it for many people.


Yeah, I got the memory working perfectly; now I just need my waterblock








Quote:


> Originally Posted by *dspacek*
> 
> My GPU score is 12930 with nothing special overclock 1200/1700


Is that with stock cooling? With what voltage?


----------



## vieuxchnock

*In North America, if you buy an XFX card you can change the cooler without voiding the warranty. I have an XFX 290 Black Edition and asked them if I could change the cooler, and they told me that since I bought my card in North America, I can do it without voiding the warranty.*


----------



## tsm106

Quote:


> Originally Posted by *Roy360*
> 
> Quote:
> 
> 
> 
> Originally Posted by *dspacek*
> 
> Absolutely yes, it's only an output, so as long as you don't short-circuit that output, you can solder it out ;-)
> 
> 
> 
> I would love to do this. *But is there a way to do this without voiding the warranty*?

Is that a trick question? I'd be more worried about frying the card though.


----------



## Loktar Ogar

Quote:


> Originally Posted by *neurotix*
> 
> My R9 290 Tri-X takes the exact same amount of volts to do 1200mhz stably, +168mv.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Anything less and I get errors in OCCT. With +168mv I can run it for 10 minutes and not get errors, so it seems good to me. Tested it in games a lot over the last few days, no crashes or lockups, no display driver crashes.
> 
> Anything more than 1200mhz on the core causes artifacts in Valley. Well, I can get away with 1215mhz with no artifacts but I just leave it at 1200 because it's an even number and I'd rather have total stability when gaming over a tiny bit more performance.
> 
> You're lucky your ram does 1600mhz. Mine will do 1550 fine in Valley, but 1600 I black screen immediately, even with +50 aux voltage in Afterburner.
> 
> Overall I'm VERY happy with the performance of my card at 1200/1500, and the only problem I've had at these clocks is artifacts in Skyrim using ENBSeries. If I turn the core clock down they go away. This is the only game so far that does this, and it doesn't artifact without ENB at those clocks. The performance and temps of this thing are great and it powers through most games I play, in Eyefinity no less. Any Unreal Engine 3 game seems to run at 60 fps with everything maxed out; I've tried Unreal Tournament 3 and Bulletstorm. Other games run great too, like Just Cause 2. Really the only thing it can't max is Crysis 3; I have to turn a few post-processing options to "medium" to get good FPS, but I can't tell the difference between having them on medium or very high anyway. I also have the Metro games but I've yet to try them.


Thanks for sharing your experiences as well on your R9 290 TRI-X.









In my experience, the aux voltage in AB does not have any effect on the OC. Maybe it will be fixed in beta 19? I only tested this with Fire Strike since the VRM1 temps don't go over 80C; in BF4, even with an 1150MHz core clock, I'll have a VRM1 temp of 80-83C max. I have a feeling I'll get some artifacts in other benchmarks and games in those tests as well when adding more mV.









Quote:


> Originally Posted by *kizwan*
> 
> What is your ambient (indoor) temp?


Around 25C to 27C. I clicked on custom fan profile in TriXX but did not configure one. I'll update my post.


----------



## dspacek

Quote:


> Originally Posted by *AlDyer*
> 
> Yeah I got the memory working perfectly now, I just need my waterblock now
> 
> 
> 
> 
> 
> 
> 
> 
> Is that with stock cooling? With what voltage?


Water cooled, 1.4 V


----------



## AlDyer

Add me to the club:

http://www.techpowerup.com/gpuz/9uzsb/

XFX R9 290

Stock cooling


----------



## AlDyer

Anybody have a good BIOS suggestion? I am running stock XFX BIOS.


----------



## kdawgmaster

Quote:


> Originally Posted by *AlDyer*
> 
> Anybody have a good BIOS suggestion? I am running stock XFX BIOS.


Why are you looking for a new BIOS? If it's in hopes of a better overclock, it's not a sure thing that it will help.


----------



## AlDyer

Quote:


> Originally Posted by *kdawgmaster*
> 
> Why are u looking for a new bios? If its in the hopes of a better overclock its not a 100% thing it will help with.


To maximize mining hashrate


----------



## Krusher33

Quote:


> Originally Posted by *AlDyer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kdawgmaster*
> 
> Why are u looking for a new bios? If its in the hopes of a better overclock its not a 100% thing it will help with.
> 
> 
> 
> To maximize mining hashrate









You're in the wrong thread for that.

http://www.overclock.net/t/1437876/290-and-290x-litecoin-mining-performance/0_50


----------



## AlDyer

Quote:


> Originally Posted by *Krusher33*
> 
> 
> 
> 
> 
> 
> 
> 
> You're in the wrong thread for that.
> 
> http://www.overclock.net/t/1437876/290-and-290x-litecoin-mining-performance/0_50


Yeah I asked there, but I did sin and asked in 2 threads. Oh mighty admin, lord of OCN, forgive me









Anyway, I am also interested in which BIOS gives the most freedom. I am guessing people haven't tried out too many different BIOSes yet?


----------



## battleaxe

Quote:


> Originally Posted by *AlDyer*
> 
> Yeah I asked there, but as I did sin and asked in 2 threads. Oh mighty admin, lord of OCN, forgive me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway I am also interested in which bios gives the most freedom. I am guessing people haven't tried out too many different BIOSes yet?


I forgive you son.









LOL


----------



## Chomuco

-amd-r9-290 chip









2020


----------



## kdawgmaster

Quote:


> Originally Posted by *AlDyer*
> 
> To maximize mining hashrate


Asus and MSI are your best bet then. But from my understanding, the best hashrate comes from how well you can clock the RAM on the R9 290X, not from the BIOS.
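Since hashrate questions keep coming up: on these cards scrypt hashrate is tuned from the miner's command line, not the BIOS. A typical cgminer launch for a 290/290X looked roughly like the sketch below. The pool URL, worker name, and every tuning number are placeholders to adjust per card, not known-good values:

```shell
# cgminer 3.7.x was the last branch with GPU/scrypt support.
# -I                    intensity (higher = more GPU load, laggier desktop)
# --thread-concurrency  sized to the card's shaders/RAM; tune per card
# -g / -w               threads per GPU and worksize
# --gpu-engine / --gpu-memclock   core and memory clocks in MHz
cgminer --scrypt -o stratum+tcp://pool.example.com:3333 -u worker -p x \
    -I 20 -g 1 -w 256 --thread-concurrency 32765 \
    --gpu-engine 1000 --gpu-memclock 1500
```

On Hawaii, raising `--gpu-memclock` usually moves the hashrate more than the core clock does, which matches the point above about clocking the RAM.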


----------



## BlackPH

Sapphire support can provide a UEFI GOP BIOS by request: https://www.sapphireforum.com/showthread.php?32778-UEFI-GOP-vbios-for-R9-290

If you get one, can you share it with the community?


----------



## Sazz

Quote:


> Originally Posted by *Chomuco*
> 
> -amd-r9-290 chip
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2020


I hate Gelid thermal grease; EK blocks come with it. I tried it on my 8350 back then, and boy, it sucks. I'm talking about a 10C difference between the Gelid and my PK-1 thermal grease, not to mention it's a lot harder to apply.

Might wanna pick yourself up some PK-1 or something else; heck, even AS5 performed better than the Gelid.


----------



## Arizonian

Quote:


> Originally Posted by *AlDyer*
> 
> Add me to the club:
> 
> http://www.techpowerup.com/gpuz/9uzsb/
> 
> XFX R9 290
> 
> Stock cooling


Congrats - added









Quote:


> Originally Posted by *Chomuco*
> 
> -amd-r9-290 chip
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2020
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Loktar Ogar

Quote:


> Originally Posted by *Chomuco*
> 
> -amd-r9-290 chip
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 2020


I'm interested in the results. Hope you can post the temps before and after application.


----------



## yawa

I am now the proud owner of a soon to be Waterblocked, zero coil whine (thank the heavens) Diamond R9 290X.

Please add me. I have a photo of GPU-Z validation as well as an AIDA64 GPGPU bench. Funny thing is, I don't have a 2nd GPU; I have a 7850K APU, and the AIDA64 benchmark used all the GCN cores for it. As in ALL OF THEM. As in, HSA works with a 290X when paired with Kaveri BEFORE the drivers to enable such a combo have been released.

Just thought you'd like to know.



Anyway, let me know if you need anything else.


----------



## bond32

I have seen others tell of extreme display lines going across the screen... Anyone know the cause of this? It seems to be an unstable OC...


----------



## Ukkooh

My new 290X does 1100-1120 core with stock volts (~1.18V) and has Hynix memory, but I haven't tried memory OCing yet. How likely is it that I'll be able to hit 1200 with a decent voltage like 1.25V?


----------



## Krusher33

Quote:


> Originally Posted by *BlackPH*
> 
> Sapphire support can provide you UEFI GOP BIOS by request https://www.sapphireforum.com/showthread.php?32778-UEFI-GOP-vbios-for-R9-290
> 
> If you got one - can you share it with community?


Meh... what's the big deal with UEFI anyway? Lets you boot in less than a second?


----------



## kpoeticg

UEFI is a more interactive BIOS with better software support.


----------



## nagle3092

Quote:


> Originally Posted by *Ukkooh*
> 
> My new 290x does 1100-1120 core with stock volts (~1.18V) and has hynix mem but haven't tried memory ocing yet. How likely it is that I'll be able to hit 1200 with decent voltage like 1.25V?


Hard to say, mine does about the same on stock volts but locks up at 1200 with +100


----------



## Forceman

Quote:


> Originally Posted by *Ukkooh*
> 
> My new 290x does 1100-1120 core with stock volts (~1.18V) and has hynix mem but haven't tried memory ocing yet. How likely it is that I'll be able to hit 1200 with decent voltage like 1.25V?


Honestly, probably not good. Not too many can do 1200 at that voltage.


----------



## Jack Mac

Quote:


> Originally Posted by *Forceman*
> 
> Honestly, probably not good. Not too many can do 1200 at that voltage.


Guess I'm lucky then, I do 1200 on 1.24. I wonder how much higher I could push with 200mV but I don't want to push my card that hard.


----------



## Sazz

I just re-did my cooler mod. It's kinda hard to explain exactly what I did, so I'll just show you guys the pics.

Re-used the stock cooler fan and shroud for better VRM cooling (hopefully better than my previous mod set-up), drilled a hole in the faceplate where it sits over VRM1, and put in a copper wire/rod connected to a heatsink fin (with Arctic Cooling thermal adhesive on the copper wire/rod to ensure full contact with the faceplate).

Currently testing temps; just started a 1hr Valley run at stock voltage @ 1100/1350 clocks, to compare against my previous mod set-up's temps.

Will post results after I am done.
(Currently running stock voltage @ 1100/1350 at 50% fan speed, since at this speed the fan noise is tolerable for the majority of people.)


----------



## Iniura

Quote:


> Originally Posted by *BlackPH*
> 
> Sapphire support can provide you UEFI GOP BIOS by request https://www.sapphireforum.com/showthread.php?32778-UEFI-GOP-vbios-for-R9-290
> 
> If you got one - can you share it with community?


Damn, what timing - I was just looking for a UEFI BIOS but don't know much about it.

I have an XFX R9 290 Black OC edition; that's a legacy BIOS, right?

Also, I thought I read somewhere that there is an ASUS UEFI BIOS as well - is this correct?

Is it safe to flash a Sapphire or ASUS UEFI BIOS to one of the two BIOS positions on my XFX R9 290?

If so, can someone post a UEFI-enabled BIOS here? Because yeah, I am also interested


----------



## Roy360

Just finished overclocking and testing my XFX R9 290.
Load temps:
GPU 46C
VRM1 53C
VRM2 38C

http://www.techpowerup.com/gpuz/bp3kc/

Here are my results from overclocking, using Heaven Benchmark 4.0 (Basic preset) to bench and MSI Afterburner to overclock.

Core clock runs:

| Core | Memory | Power | +mV | Score |
|------|--------|-------|-----|-------|
| 947 | 1250 | 10% | 0 | 1233 |
| 947 | 1250 | 50% | 0 | 1234 |
| 1000 | 1250 | 50% | 0 | 1285 |
| 1100 | 1250 | 50% | 0 | 1370 |
| 1150 | 1250 | 50% | 0 | 1407 |
| 1175 | 1250 | 50% | 25 | 1425 |
| 1200 | 1250 | 50% | 63 | 1433 |
| 1235 | 1250 | 50% | 100 | 1455 (artifacts) |

Memory clock runs:

| Core | Memory | Power | +mV | Score |
|------|--------|-------|-----|-------|
| 1200 | 1300 | 50% | 63 | Unigine crashed |
| 1175 | 1300 | 50% | 25 | 1397 (lower??!) |
| 1175 | 1350 | 50% | 25 | 1424 |
| 1175 | 1450 | 50% | 44 | black screen |
| 1175 | 1450 | 50% | 63 | black screen |
| 1175 | 1450 | 50% | 100 | black screen |
| 1175 | 1400 | 50% | 100 | 1430 |
| 1100 | 1450 | 50% | 100 | 1394 |
| 1100 | 1500 | 50% | 100 | 1414 |
| 1100 | 1600 | 50% | 100 | black screen |
| 1000 | 1600 | 50% | 100 | black screen |

24/7 overclock:

| Core | Memory | Power | +mV | Score |
|------|--------|-------|-----|-------|
| 1175 | 1350 | 25% | 25 | 1437 |


----------



## Sazz

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Sazz*
> 
> I just re-did my cooler mod, It's kinda hard to explain exactly what I did so I'll just show you guys the pics.
> 
> re-used the stock cooler fan and shroud for better VRM cooling (hopefully better than my previous mod set-up) drilled a hole on the part where the faceplate is over the VRM1 and put in a copper wire/rod that is connected to a heatsink fin (and used thermal adhesive (arctic cooling) on the copper wire/rod to ensure full contact on the faceplate).
> 
> currently testing temps. just starded 1hr run for valley at stock voltage @1100/1350 clocks. to compare it to my previous mod set-up temps.
> 
> Will post results after I am done.
> (currently running stock voltage @ 1100/1350 at 50% fan speed since at this speed the fan noise is tolerable for majority of people)






Just got done with the 1hr temp test: VRM1 peaked at 68C (averaging 65C) and the core peaked at 59C (averaging 57C). The core ran 7-9C hotter (probably due to the heat surrounding the block getting blown into it) and VRM1 ran 1-4C hotter than with my previous mod, and I could probably shave off another 1-2C once the 80mm fan gets here to be installed on the external heatsink I made.

And I can always crank the blower fan speed higher if I want to.

Gonna test +50 and +100mV Valley passes (ExtremeHD) to see how the temps go at the same fan speed.

Edit:

Just got done with +50mV and +100mV: +50mV peaked at 73C and +100mV peaked at 90C (VRM1 temps).

I will also try it without the shroud, and instead of the stock fan I'mma use a 90mm Arctic Cooling fan over VRM1 and see how that works.


----------



## Falkentyne

Quote:


> Originally Posted by *syniad*
> 
> I also have Elpida memory on my cards and if I overclock the memory much past around 1500 I start to get really strange issues, such as the entire screen flashing whenever I drag something on the desktop. Not sure what causes this; it could be something to do with my monitor (one of the Korean PLS ones) running pretty close to its refresh rate peak. Granted, I've not done any voltage increase yet (I have a feeling voltage is locked on my cards anyway). I'll try once I get in and see what happens.


That's because the memory controller can't handle the memory speed you set it to when the IDLE volts are not high enough for the controller/GPU; it isn't because of unstable memory itself.
Raising the idle volts by overvolting the GPU (which, of course, also raises the load voltage) should make you stable.


----------



## noob.deagle

Seeing as you are all owners of 290Xs, I was wondering what your opinions are on selling a GTX 690 to replace it with a 290X.

My main reasoning is the 690's low VRAM limit and SLI-related problems; I'm also quite interested in Mantle.

Do you think this is a good idea, or should I hold out?

I'm only going ahead if I can sell and buy with no loss.


----------



## BradleyW

That looks cool Sazz, best of luck buddy.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *noob.deagle*
> 
> seeing as you are all owners of 290x's i was wondering what your opinions on selling a gtx690 to replace it with a 290x
> 
> my main reasoning is the low vram limit and SLI based problems. also im quite interested in mantle.
> 
> do you think this is a good idea ? or should i hold out.
> 
> im only going ahead if i can sell and buy with no loss.


If you can get $500+ for the 690 then go for it.


----------



## rcoolb2002

Hi guys. Just picked up an MSI 290X reference + EK waterblock & backplate. Replacing my crossfire 7970 setup with just the one card for right now. Hopefully with a healthy overclock it won't disappoint!


----------



## blue1512

Guys, what should I do when my XFX 290X ref just refuses to boot? The LED on the mainboard reports a VGA error and it's stuck at a black screen, even before the mainboard BIOS setup screen.









It happened after a power failure. I tried flashing a number of BIOSes but the card is still a brick. What should I do now? Please help.


----------



## Sazz

Quote:


> Originally Posted by *BradleyW*
> 
> That looks cool Sazz, best of luck buddy.


I re-seated the faceplate and I am getting lower temps.

stock voltage 1100/1350 50% fan speed = 61C max average 60C
+50mV 1150/1350 50% fan speed = 68C max average 66C
+100mV 1175/1350 50% fan speed = 76C Max 74C average


----------



## Forceman

Quote:


> Originally Posted by *blue1512*
> 
> Guys, what should I do when my XFX 290X ref just refuses to boot? The LED on mainboard reported a VGA error and stucked with a black screen, even before the mainboard BIOS setup screen
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It happened after a power failure. I tried to flash a number of BIOS but the card is still a brick. What should I do now, please help


Any other symptoms? Had it ever black screened before? My first card failed similarly, a couple of black screen crash/reboots and then it was dead. It stuck on the VGA initialization post code and no boot. I returned it.


----------



## blue1512

Quote:


> Originally Posted by *Forceman*
> 
> Any other symptoms? Had it ever black screened before? My first card failed similarly, a couple of black screen crash/reboots and then it was dead. It stuck on the VGA initialization post code and no boot. I returned it.


It sometimes had black screen crashes before. And now, exactly as you described, it sticks on the VGA initialization POST code. I changed the cooler, so the warranty was voided, as stated by XFX.























Is there any solution for my $600 brick? Please help.









Strangely, the card still appears (wrongly) in GPU-Z when I put it in the secondary slot, and I can flash a BIOS onto it. I really hope it is still alive in some way.


----------



## Sazz

Quote:


> Originally Posted by *blue1512*
> 
> It sometimes had black screen crash before. And now, exactly as you described, it stucks on VGA initialization post code. I changed the cooler so the warranty was voided, as said by XFX
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is there any solution for my 600$ brick, please help


If you're in the US/Canada, XFX allows the use of aftermarket coolers. If not, then tough luck.


----------



## blue1512

Quote:


> Originally Posted by *Sazz*
> 
> if you're in the US/Canada XFX allows the useage of aftermarket coolers. if not then tough luck.


Australia here, so tough luck for me


----------



## Sgt Bilko

Quote:


> Originally Posted by *blue1512*
> 
> Australia here, so tough luck for me


I've sent them an e-mail asking if I can change the paste without voiding my warranty, but I'm not holding my breath......


----------



## rdr09

Quote:


> Originally Posted by *yawa*
> 
> I am now the proud owner of a soon to be Waterblocked, zero coil whine (thank the heavens) Diamond R9 290X.
> 
> Please add Me. I have a photo of GPU-Z valadation as well as an AIDA 64 GPGPU Bench. Funny thing is I don't have a 2nd GPU, I have a 7850K APU. And the AIDA 64 Benchmark used all the GCN cores for it. As in ALL OF THEM. As in HSA works with a 290X when paired with Kaveri BEFORE the drivers to enable such a combo have been released.
> 
> Just thought you'd like to know.
> 
> 
> 
> Anyway, let me know if you need anything else.


do you mind explaining more about the significance of this? thanks.


----------



## kcuestag

Quote:


> Originally Posted by *blue1512*
> 
> Australia here, so tough luck for me


Quote:


> Originally Posted by *Sgt Bilko*
> 
> I've sent them an e-mail asking if i can change the paste without voiding my warranty but i'm not holding my breath......


Guys, just make sure you keep the stock cooler and original screws in proper condition; they will never notice, and any RMA you may need will be accepted.

I RMA'd a 290X twice that I had watercooled, and I've done it before on other parts too.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kcuestag*
> 
> Guys, just make sure you keep the stock cooler and original screws in proper condition, they will never notice and any RMA you may need will be accepted.
> 
> I RMA'd a 290X twice which I had watercooled, and I've done it before on other parts too.


Stickers are over the screws. I have the DD coolers on mine and they are working fine; I just want to change the paste on them.......so I sent an email.


----------



## battleaxe

Quote:


> Originally Posted by *bond32*
> 
> I have seen others tell of extreme display lines going on the screen... Anyone know the cause of this? It seems to be an unstable OC...


I've seen that too, whenever I don't have enough voltage running to it: crazy horizontal colored lines. It makes it almost impossible to shut down without doing a hard reset. On mine there is a small section at the top of the screen that still responds, so if I put AB up at the top of the screen where I can click the "reset" button, the screen will return to normal.

The other thing I've noticed is that if the core is under load, it is more resistant to doing this. So if I put the card under load first and then decrease voltage (such as by starting the miner), it will hold the OC with lower voltage. But as soon as I stop the miner, the fragmented screen starts up again. Basically, as long as I leave 50mV of extra voltage when OCing the RAM to 1500, it won't do this. So it must be a voltage and OC issue.

I saw one thread where it was explained that these cards jump straight from 300MHz to their max memory clock, whether that's 1500MHz or whatever. So when it's set higher than stock (1250MHz), that jump causes the crashing or artifacts, and it takes extra voltage to keep that from happening.

What we need is for someone to develop a BIOS with more than two steps, so it ramps up progressively to reach maximum frequency.


----------



## battleaxe

Quote:


> Originally Posted by *Sazz*
> 
> I just re-did my cooler mod, It's kinda hard to explain exactly what I did so I'll just show you guys the pics.
> 
> re-used the stock cooler fan and shroud for better VRM cooling (hopefully better than my previous mod set-up) drilled a hole on the part where the faceplate is over the VRM1 and put in a copper wire/rod that is connected to a heatsink fin (and used thermal adhesive (arctic cooling) on the copper wire/rod to ensure full contact on the faceplate).
> 
> currently testing temps. just starded 1hr run for valley at stock voltage @1100/1350 clocks. to compare it to my previous mod set-up temps.
> 
> Will post results after I am done.
> (currently running stock voltage @ 1100/1350 at 50% fan speed since at this speed the fan noise is tolerable for majority of people)


Holy Shinzu... hackimus maximus!


----------



## BlackPH

Quote:


> Originally Posted by *Iniura*
> 
> Damnz what a timing, I was just looking for a UEFI bios but don't know much about it.
> 
> I have a XFX R9 290 Black overclock edition, that is a legacy bios right?
> 
> Also I thought I read somewhere that there is also a ASUS UEFI bios is this correct?
> 
> Is it safe to flash a Sapphire or ASUS UEFI bios to one of the two positions on my XFX R9 290?
> 
> If so can someone post a UEFI enabled bios here because yeah I am also interested


Is the "Black OC" based on the reference design? If so, try flashing this one: http://www.techpowerup.com/vgabios/149574/msi-r9290-4096-131205.html.
I had success with it but started to get BSODs during idle (web browsing etc.) - probably because the 2D voltages were too low for my PowerColor 290. Maybe you'll have more luck with your card.
Have you asked XFX support about a UEFI BIOS?

To everyone: UEFI maybe isn't a big deal, but all my other hardware supports Fast Boot - why should I have to hold back just because my video card vendor hasn't bothered to create a compatible BIOS?
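If it helps anyone trying one of these BIOSes, the usual ATIFlash routine from an elevated command prompt is sketched below. The adapter index `0` and the filenames are placeholders; always check the adapter list and save the current ROM first, so the second position of the BIOS switch can rescue you if the flash goes wrong:

```shell
# list adapters and note the index of the card you want to flash
atiflash -i
# back up the existing BIOS from adapter 0 before anything else
atiflash -s 0 backup.rom
# program adapter 0 with the new image
atiflash -p 0 new_uefi.rom
# -f forces the flash past a subsystem-ID mismatch (cross-vendor BIOS);
# only add it if you understand the risk
atiflash -p 0 new_uefi.rom -f
```

Reboot after a successful flash; if the card doesn't POST, flip to the untouched BIOS switch position and flash `backup.rom` back.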


----------



## Arizonian

Quote:


> Originally Posted by *yawa*
> 
> I am now the proud owner of a soon to be Waterblocked, zero coil whine (thank the heavens) Diamond R9 290X.
> 
> Please add Me. I have a photo of GPU-Z valadation as well as an AIDA 64 GPGPU Bench. Funny thing is I don't have a 2nd GPU, I have a 7850K APU. And the AIDA 64 Benchmark used all the GCN cores for it. As in ALL OF THEM. As in HSA works with a 290X when paired with Kaveri BEFORE the drivers to enable such a combo have been released.
> 
> Just thought you'd like to know.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Anyway, let me know if you need anything else.


Congrats - added









@Chomuco - you missed the GPU-Z validation link and brand. If you could go back to your original post (linked on the roster) and add a GPU-Z validation with your OCN name as proof, that would be great. You also didn't add your brand; by default, for those who forget or where I cannot tell from the post, it's listed as Sapphire. If it's different, please PM me and I'll update the roster.


----------



## yawa

Thanks Arz. Couple of things though.



I have a 290X.

Also I have a Koolance Waterblock on it. (check my idle temps in the screenshot)

Temps are amazing btw, and am about to flash a new bios to work with a voltage overclock. Get back to you soon.


----------



## Arizonian

Quote:


> Originally Posted by *yawa*
> 
> Thanks Arz. Couple of things though.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I have a 290X.
> 
> Also I have a Koolance Waterblock on it. (check my idle temps in the screenshot)
> 
> Temps are amazing btw, and am about to flash a new bios to work with a voltage overclock. Get back to you soon.










Sweet on both accounts - corrected.


----------



## Jiiks

Got mine yesterday







http://www.techpowerup.com/gpuz/fbg2z/

Sapphire 290 Tri-X, aircooled


----------



## Arizonian

Quote:


> Originally Posted by *Jiiks*
> 
> Got mine yesterday
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/fbg2z/
> 
> Sapphire 290 Tri-X, aircooled


Congrats on the Tri-X - added


----------



## cam51037

Here's my validation: http://www.techpowerup.com/gpuz/5ru25/

Sapphire 290 Tri-X as well. I think I have a memory voltage issue though, I have quick multicoloured artifacts on the screen whenever I overclock anything, I've heard it's a common problem. Any fixes, or should I be RMAing this card? I'm working on putting up a video of this issue now.


----------



## Arizonian

Quote:


> Originally Posted by *cam51037*
> 
> Here's my validation: http://www.techpowerup.com/gpuz/5ru25/
> 
> Sapphire 290 Tri-X as well. I think I have a memory voltage issue though, I have quick multicoloured artifacts on the screen whenever I overclock anything, I've heard it's a common problem. Any fixes, or should I be RMAing this card? I'm working on putting up a video of this issue now.


Congrats - added









Though I'm sorry to hear about your artifacts; not a good sign, and with Hynix memory to boot. Not sure if adding voltage will fix it, but it might be worth a try. I think you're the first of the few Tri-X owners to have this issue.

If you're under the 30-day return period, I'd go with a replacement from the vendor over an RMA with Sapphire.


----------



## taem

I'm so torn on which custom 290 to get.

$500. Sapphire 290 Tri-X
$470. MSI 290 Gaming
$570. Asus 290*X* DCU2

The Sapphire Tri-X seems to be the best cooler, and I think I'd prefer that over the MSI for $30 more. But $70 more gets me to an Asus 290X; I wonder if that's worth it. Plus, the Asus and MSI cards would ship sooner - I'm hearing Amazon is expecting up to 3 months before they ship out Sapphire 290s.
Quote:


> Originally Posted by *Roy360*
> 
> Just finished overclocking and testing my XFX R9 290
> Load temp
> GPU 46
> VRM1 53
> VRM2 38


Is this reference under water or the custom DD? Has to be water right?


----------



## Roy360

Quote:


> Originally Posted by *taem*
> 
> I'm so torn on what custom 290 to get.
> 
> $500. Sapphire 290 Tri-X
> $470. MSI 290 Gamng
> $570. Asus 290*X* DCU2
> 
> Sapphire Tri-X seems to be the best cooler and I think I'd prefer that over the MSI FOR $30 more. But $70 more gets me to an Asus 290x; wonder if that's worth it. Plus Asus and Msi cards would ship sooner. I'm hearing Amazon is expecting up to 3 months before they ship out Sapphire 290s.
> Is this reference under water or the custom DD? Has to be water right?


EK waterblock, old temps were over 95 degrees


----------



## Sgt Bilko

Quote:


> Originally Posted by *taem*
> 
> I'm so torn on what custom 290 to get.
> 
> $500. Sapphire 290 Tri-X
> $470. MSI 290 Gamng
> $570. Asus 290*X* DCU2
> 
> Sapphire Tri-X seems to be the best cooler and I think I'd prefer that over the MSI FOR $30 more. But $70 more gets me to an Asus 290x; wonder if that's worth it. Plus Asus and Msi cards would ship sooner. I'm hearing Amazon is expecting up to 3 months before they ship out Sapphire 290s.
> Is this reference under water or the custom DD? Has to be water right?


The DDs aren't that cool: mine idle around 36C and 33C, and they have hit 90C on the core and 85C on the VRMs whilst benching at 1200/1350 with +100mV in 35C ambients.









They are a good cooler though, a lot better than I thought they were going to be.


----------



## taem

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The DD's aren't that cool,
> 
> mine idle around 36 and 33c and they have hit 90c on the core and 85c on the vrms whilst benching at 1200/1350 with +100mV in 35c ambients
> 
> 
> 
> 
> 
> 
> 
> 
> 
> They are a good cooler though, alot better than i thought they were going to be.


I really want the XFX DD cards for aesthetics; IMHO the best-looking video cards ever made. But I want to crossfire in an air-cooled case, and I don't think the VRM cooling on these is good enough for that.


----------



## Falkentyne

Quote:


> Originally Posted by *battleaxe*
> 
> I"ve seen that too. Whenever I don't have enough voltage running to it. Crazy horizontal colored lines. Makes it almost impossible to shut down without doing a hard reset. On mine I have a small section at the top of the screen that does respond. So if I put AB up at the top of the screen where I can click the "reset" button then the screen will return to normal.
> 
> The other thing I've noticed is that if the core is under load it is more resistant to doing this. So if I put the card under load first, and then decrease voltage (such as starting the miner), then it will hold the OC with lower voltage. But as soon as I stop the miner, the fragmented screen starts up again. But basically, as long as I leave 50mv of extra voltage when OCing to 1500 on the ram it won't do this. So it must be a voltage and OC issue.
> 
> I saw one thread where it explained that these cards go straight from 300mhz to their max whether 1500mhz or whatever on the memory. So when its set higher than stock (1250mhz) it causes the crashing or artifacts. It takes extra voltage to keep this from happening.
> 
> What we need is for someone to develop a bios that has more than two steps. Like it ramps up progressively to reach maximum frequency.


Yes I mentioned that the reason you get those lines (or black screens) is when the memory jumps from 150 MHz to full speed at IDLE core voltage (OR when the memory is locked at full speed (you can do this by disabling powerplay in afterburner's unofficial overclocking mode), while the core is running at idle voltage (0.9xx) , because the core also controls the IMC, and the IMC's voltage is too low to handle those memory clocks; its NOT unstable memory that's causing it. If it was, the screen would not 'fix' itself instantly when the core voltage is raised or when you press reset.

You can also push your memory overclocks to where they "should" be by disabling PowerPlay, which forces the core to run at full 3D voltage (1.19x-1.2x V) even in 2D. That will always stop the black screens, and you should then be able to push your memory to 1600 MHz and higher.
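The failure mode described above can be sketched as a toy model. All numbers below (voltage thresholds, scaling) are made-up illustrative values, not measured ones; the point is only that stability tracks the core/IMC voltage, not the memory itself:

```python
# Toy model of the two-step memory behaviour described above.
# The memory P-state jumps straight from idle to full speed, and the
# IMC -- fed by the core voltage -- must keep up. All thresholds here
# are invented for illustration.

IDLE_VCORE = 0.95   # rough idle core voltage (the 0.9xx range above)
LOAD_VCORE = 1.20   # rough full 3D core voltage (1.19x-1.2x)

def imc_stable(mem_mhz, core_v):
    """Assume the IMC needs a little more core voltage for every
    MHz of memory clock above the stock 1250 MHz (assumed scaling)."""
    required = 0.90 + max(0, mem_mhz - 1250) * 0.0004
    return core_v >= required

# Full-speed overclocked memory while the core sits at idle voltage
# is the black-screen case; the same clock is fine at 3D voltage.
print(imc_stable(1500, IDLE_VCORE))  # False
print(imc_stable(1500, LOAD_VCORE))  # True
```

Which matches the symptom: the screen "fixes" itself the moment the core voltage ramps, without the memory clock changing at all.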


----------



## Sgt Bilko

Quote:


> Originally Posted by *taem*
> 
> I really want the XFX dd cards for aesthetics. IMHO the best looking video cards ever made. But I want to crossfire in an air cooled case and I don't think the vrm cooling is good enough on these.


Depends on what temps you have in mind and what ambients/clock/volts etc you would be running.

I run a 1000/1250 overclock (more than enough for me) and they sit around 60C or so in BF4; VRMs are 60-ish as well.

And yes, they look great, best looking cards on the market atm imo.


----------



## yawa

Just an aside, but I was really shocked by this Diamond card. Considering that, according to this list, I'm apparently the only person on this board with one, I'd like to share something.

Before I put her under water I played some games and ran a few benches. Even with a reference cooler it didn't go above 80C, which was quite shocking. Hell the fan didn't even throttle.

Zero coil whine as well.

Anyway, she's under water now and peaking at about 42C under full load. That said, I need some advice: what BIOS should I use to unlock voltage and start my magical, mystical journey to 1300 MHz?

Been awhile since I had an AMD card so I'm curious.


----------



## Abyssic

Hi guys, I'm about to buy a Sapphire 290X Tri-X but I want to put an EK backplate on it (EK-FC R9-290(X)). Do you think this is possible? Just to make it clear: I want to leave the Tri-X cooler on and put the backplate on with it.


----------



## Arizonian

Quote:


> Originally Posted by *Abyssic*
> 
> hi guys, i'm about to buy a Sapphire 290X Tri-X but i want to put a EK backplate on it (EK-FC R9-290(X)). do you think this is possible? Just to make it clear: i want to leave the Tri-X cooler on and put the Backplate on with it.


In the OP's water block section I made sure to add "(not a stand-alone)", meaning it needs the full block. It won't work.

http://www.ekwb.com/shop/blocks/vga-blocks/fc-backplates/ek-fc-r9-290x-backplate-black.html
Quote:


> Please note:
> - The backplate does not serve as a standalone unit!


Too bad because EK would have sold a lot of those!


----------



## Abyssic

Quote:


> Originally Posted by *Arizonian*
> 
> On the OP in the water block section I made sure to add (Not a stand alone) meaning it needs the full block. It won't work.
> 
> http://www.ekwb.com/shop/blocks/vga-blocks/fc-backplates/ek-fc-r9-290x-backplate-black.html
> Too bad because EK would have sold a lot of those!


aww damn, that's a shame. thanks for the info.
and yes, ek would make a lot of money if they somehow made it compatible....


----------



## Iniura

Quote:


> Originally Posted by *BlackPH*
> 
> Is "Black OC" based on reference design? If so try to flash this one http://www.techpowerup.com/vgabios/149574/msi-r9290-4096-131205.html.
> I had success with this but started to get BSODs during idle (web browsing etc.) - probably cos 2D voltages was too low for my Powercolor 290. Mb you will have more luck with your card.
> Have you ask XFX support about UEFI bios?
> 
> 2 All
> UEFI mb is not a big deal but all my stuff support Fast boot why I have to hold back just cos my videocard vendor not bothered with creating compatible BIOS?


Thanks, I will try that one in the near future when I get a new mobo.
I didn't contact XFX support about a UEFI BIOS.
Yeah, I also want to use ultra fast boot in the upcoming weeks, which is why I'm searching for a UEFI BIOS too; post in this thread when you find a good one, ty.


----------



## Forceman

Quote:


> Originally Posted by *yawa*
> 
> Anyway she's under water now and peaking at about 42C under full load. That being said I need some advice, what bios should I use to unlock voltage and start my magical, mystical journey to 1300Mhz?


All the BIOSes should allow voltage control through Afterburner (+100mV) or Trixx (+200mV), or you can flash an Asus BIOS and use GPU Tweak. If you are feeling adventurous you can put the PT1 BIOS on and use that for even more voltage/LLC control.


----------



## bond32

In Afterburner, next to the "Unlock Voltage Control" box, there's a pulldown that lists "standard MSI", "reference design"... What is that? I assume with a reference card "Reference" should be selected?


----------



## Forceman

Quote:


> Originally Posted by *bond32*
> 
> In afterburner, next to the "Unlock Voltage Control" box, the pulldown that states "standard MSI", "reference design"... What is that? I assume with a reference card "Reference" should be selected?


If you hover over the drop down box a tooltip will pop up. It says reference is for reference cards only, Standard MSI is for reference and custom MSI cards, and extended MSI is for reference, custom MSI, and extended voltage range for some models. Mine seemed to have defaulted to standard MSI (and works fine), since I never changed it.


----------



## bond32

Quote:


> Originally Posted by *Forceman*
> 
> If you hover over the drop down box a tooltip will pop up. It says reference is for reference cards only, Standard MSI is for reference and custom MSI cards, and extended MSI is for reference, custom MSI, and extended voltage range for some models. Mine seemed to have defaulted to standard MSI (and works fine), since I never changed it.


Didn't notice that. Have any idea what it actually does?


----------



## Forceman

Quote:


> Originally Posted by *bond32*
> 
> Didn't notice that. Have any idea what it actually does?


No idea. Maybe it just adds support for additional voltage regulators? I don't know why you'd need a selection for it though; it seems like selecting extended MSI should cover pretty much everything, and I don't know what the downside of that would be.


----------



## Nightgamerx

Hey everyone, I'm about to order the Sapphire Tri-X 290 (non-X) from one of my hardware vendors through my IT office; I can get it for $447.00.







I'm wondering, though, with the updates to Eyefinity and this card having 4 video outputs, whether you can now use all 4 outputs simultaneously (similar to Nvidia Surround + 1 aux display). I currently have a Korean PLS 27" 1440p monitor and two 24" Samsung 1080p screens on the sides of it, and I wanted to hook my 58" TV up to the HDMI output and run all four at the same time. Meaning the following: 1440p monitor --> dual-link DVI port, 1080p monitor #1 --> second DVI port, 1080p monitor #2 --> full-size active DP-to-DVI adapter (already have it), and my 58" TV through the remaining HDMI port. Somewhat confusing, I know; please let me know what you think. I can't wait to upgrade from my current HD 5870 with 1 GB of VRAM!!!


----------



## aaroc

Quote:


> Originally Posted by *Nightgamerx*
> 
> Hey everyone, I'm about to order the Sapphire Tri-X 290(non-X) from one of my hardware vendors through my IT office, I can get it for $447.00
> 
> 
> 
> 
> 
> 
> 
> I'm wondering though, with the updates to how you can use Eyefinity and this card having 4 video outputs, if you can now use all 4 outputs simultaneously? (Similar to Nvidia Surround +1 Aux display) I currently have a Korean PLS 27" 1440p monitor and two 24" Samsung 1080p screens on the sides of it, I wanted to hook up my 58' TV to the HDMI out put and run all four at the same time. Meaning the following, 1440p monitor --> Dual Link DVI port, 1080p monitor #1 --> Second DVI port, 1080p monitor #2 --> Full Size DP to DVI Active Adapter (Already have), and my 58' TV through the remaining HDMI port. Somewhat confusing, I know, please let me know what you think. I can't wait to upgrade from my current HD5870 with 1Gb of VRAM!!!


From what I read before buying my R9 290, you can connect up to 6 devices across its connectors: 1 on HDMI, 2 on dual-link DVI, and 3 on DisplayPort with a hub. Not having to buy active adapters was the main reason for my upgrade from 2x AMD HD 7870 CFX.


----------



## Sazz

Quote:


> Originally Posted by *aaroc*
> 
> From what I read before buying my R9 290, you can connect to any connector up to 6 devices: 1 HDMI, 2 DVI DL, 3 DP with hub. Not having to buy active adapters was the main reason of my upgrade from 2x AMD HD 7870 CFX.


Yes, the card can already use all of its outputs for Eyefinity; no need for active adapters. The only time you would need a hub is if you're going to run 6 monitors: 3 would come off the single DisplayPort via a DisplayPort hub, and the other 3 off the two DVIs and the HDMI.
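To make the port arithmetic concrete, here is a minimal counting sketch (port counts assumed for a reference 290/290X: two DVI, one HDMI, one DisplayPort, with up to three displays per DP through an MST hub; this models only the totals, not driver behaviour):

```python
# Counting sketch: each physical connector drives one display, except
# the DisplayPort, which can drive up to three via an MST hub.
# Port counts below are the assumed reference-board layout.

PORTS = {"dvi": 2, "hdmi": 1, "dp": 1}
DP_MST_LIMIT = 3  # displays per DP connector when using a hub

def max_displays(use_mst_hub=False):
    per_dp = DP_MST_LIMIT if use_mst_hub else 1
    return PORTS["dvi"] + PORTS["hdmi"] + PORTS["dp"] * per_dp

print(max_displays())                  # 4 -- e.g. three monitors + a TV
print(max_displays(use_mst_hub=True))  # 6 -- the six-monitor case above
```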


----------



## Nightgamerx

Quote:


> Originally Posted by *Sazz*
> 
> yes, the card already can use all of its output for eyefinity. no need for active adapters. The only time you would need an adapter is if you are gonna do 6monitors w/ a display port hub, 3 of which would come off the single Display port and the other 3 would be coming off the two DVI and the HDMI.


In my case, I will not be using Eyefinity. I only game on the middle monitor and would use the others to display streams/chat/web. So do the displays need to be in an Eyefinity group to have more than 3 in use at the same time?


----------



## Sazz

Quote:


> Originally Posted by *Nightgamerx*
> 
> In my case, I will not be using Eyefinity. I only game on the middle monitor and would use the others to display streams/chat/web.So do the displays need to be in an eyefinity group to have more than 3 in use at the same time?


Oh, you just have to plug them in and play; no need to set anything up in Catalyst Control Center, except maybe identifying which monitor is which (left, mid, right), and that's it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Nightgamerx*
> 
> In my case, I will not be using Eyefinity. I only game on the middle monitor and would use the others to display streams/chat/web.So do the displays need to be in an eyefinity group to have more than 3 in use at the same time?


I have 2 x 24" screens and a 32" TV for movies; the two 24" run off DVI and the 32" TV off HDMI. I just set them as extended desktops, not Eyefinity, and it works great.


----------



## blue1512

Quote:


> Originally Posted by *blue1512*
> 
> It sometimes had black screen crash before. And now, exactly as you described, it stucks on VGA initialization post code. I changed the cooler so the warranty was voided, as said by XFX
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is there any solution for my 600$ brick, please help
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Strangely the card still appears (wrongly) in GPUz when I put it in secondary slot, and I can flash BIOS into it. I really hope that it is still alive in some way.


Could you guys please give me a final verdict for my card? Is it dead or not?


----------



## Sazz

Quote:


> Originally Posted by *blue1512*
> 
> Could you guys please give me a final verdict for my card? Is it dead or not?


Normally that shows when the driver is not installed. Can you try the card in a different system? If it works there, then somewhere along the line the driver installation got messed up.


----------



## aaroc

Quote:


> Originally Posted by *blue1512*
> 
> Could you guys please give me a final verdict for my card? Is it dead or not?


You could try using the BIOS switch on the card. Try the other position, maybe your current BIOS has some problems.


----------



## kizwan

Quote:


> Originally Posted by *noob.deagle*
> 
> seeing as you are all owners of 290x's i was wondering what your opinions on selling a gtx690 to replace it with a 290x
> 
> my main reasoning is the low vram limit and SLI based problems. also im quite interested in mantle.
> 
> do you think this is a good idea ? or should i hold out.
> 
> im only going ahead if i can sell and buy with no loss.


If you can sell your GTX690 at good price, then go ahead.
Quote:


> Originally Posted by *rcoolb2002*
> 
> Hi guys. Just picked up a MSI 290x reference + EKWB & backplate. Replacing my xfire 7970 setup with just one for right now. Hopefully with a healthy overclock it won't dissapoint!


You won't be disappointed! Don't forget to pick up Fujipoly thermal pads, because EK's stock thermal pads won't cool the VRMs well enough.

http://www.frozencpu.com/products/17499/thr-181/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_05_-_Thermal_Conductivity_170_WmK.html?tl=g30c309s1294

http://www.frozencpu.com/products/17500/thr-182/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_10_-_Thermal_Conductivity_170_WmK.html?tl=g30c309s1294
Quote:


> Originally Posted by *blue1512*
> 
> Guys, what should I do when my XFX 290X ref just refuses to boot? The LED on mainboard reported a VGA error and stucked with a black screen, even before the mainboard BIOS setup screen
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It happened after a power failure. I tried to flash a number of BIOS but the card is still a brick. What should I do now, please help


It's not a BIOS problem unless the power failure happened while you were flashing the card. The 290X should have dual BIOS like the 290, right? If it's a BIOS problem, you can always switch to the second BIOS & boot from there.

If the power failure occurred during normal usage & the card failed to work afterwards, most likely the card is faulty. I don't think you can do anything except RMA it. If you use a custom cooler, put the stock cooler back on & RMA the card.

Based on the GPU-Z screenshot you posted after this (your) message, it seems the memory modules on the card are faulty.
Quote:


> Originally Posted by *yawa*
> 
> Thanks Arz. Couple of things though.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I have a 290X.
> 
> Also I have a Koolance Waterblock on it. (check my idle temps in the screenshot)
> 
> Temps are amazing btw, and am about to flash a new bios to work with a voltage overclock. Get back to you soon.


What is your ambient (indoor) temp? What TIM did you use?
Quote:


> Originally Posted by *Arizonian*
> 
> http://www.ekwb.com/shop/blocks/vga-blocks/fc-backplates/ek-fc-r9-290x-backplate-black.html
> Quote:
> 
> 
> 
> Please note:
> - The backplate does not serve as a standalone unit!
> 
> 
> 
> Too bad because EK would have sold a lot of those!
Click to expand...

Do you know what screw size the reference card uses? My stock cooler's in its box & the box is under a pile of boxes; kinda lazy right now to check myself.









If the stock cooler uses M3 screws then the backplate should be compatible with reference cards. Regarding Tri-X cards....

This is a reference (Sapphire) R9 290 card:
_(source)_


This is a Tri-X R9 290X card:
_(source)_


Mounting hole locations look identical. If the Tri-X uses M3 screws, the EK backplate might be compatible. Still, until someone tries this, we should assume they're likely not compatible.


----------



## Abyssic

Quote:


> Originally Posted by *blue1512*
> 
> Could you guys please give me a final verdict for my card? Is it dead or not?


You should try to find someone who knows about PCBs and send the card to them; maybe a cap is burnt or something along those lines.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> If you can sell your GTX690 at good price, then go ahead.
> You won't disappoint! Don't forget to picked up Fujipoly thermal pad because EK stock thermal pad won't cool the VRMs good enough.
> 
> http://www.frozencpu.com/products/17499/thr-181/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_05_-_Thermal_Conductivity_170_WmK.html?tl=g30c309s1294
> 
> http://www.frozencpu.com/products/17500/thr-182/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_10_-_Thermal_Conductivity_170_WmK.html?tl=g30c309s1294
> It's not BIOS problem unless the power failure happen while you're flashing the card. 290X should have dual BIOS like 290 right? If it's BIOS problem, you can always switch to second BIOS & boot from there.
> 
> If it's just normal usage while the power failure occurred & the card failed to work afterwards, most likely the card is faulty. I don't think you can do anything except RMA it. If you use custom cooler, put back the stock cooler & RMA the card.
> 
> Based on the screenshot of GPU-Z you posted after this (your) message, it seems the memory modules on the card is faulty.
> What is your ambient (indoor) temp? What TIM you use?
> Do you know what screw size the reference card use? My stock coolers in the box & the box is under a pile of boxes. Kinda lazy right now to check myself.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If the stock cooler use M3 screws then the backplate should be compatible with reference cards. Regarding Tri-X cards....
> 
> This is reference (Sapphire) R9 290 card:-
> _(source)_
> 
> 
> This is Tri-X R9 290X card:-
> _(source)_
> 
> 
> Mounting holes locations look identical. If the Tri-X use M3 screws, the EK backplate might be compatible. Still, until someone try this, we should consider they likely not compatible.


I'm saving this post. Nice work kizwan, Rep+


----------



## Sazz

Quote:


> Originally Posted by *kizwan*
> 
> If you can sell your GTX690 at good price, then go ahead.
> You won't disappoint! Don't forget to picked up Fujipoly thermal pad because EK stock thermal pad won't cool the VRMs good enough.
> 
> http://www.frozencpu.com/products/17499/thr-181/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_05_-_Thermal_Conductivity_170_WmK.html?tl=g30c309s1294
> 
> http://www.frozencpu.com/products/17500/thr-182/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_10_-_Thermal_Conductivity_170_WmK.html?tl=g30c309s1294
> It's not BIOS problem unless the power failure happen while you're flashing the card. 290X should have dual BIOS like 290 right? If it's BIOS problem, you can always switch to second BIOS & boot from there.
> 
> If it's just normal usage while the power failure occurred & the card failed to work afterwards, most likely the card is faulty. I don't think you can do anything except RMA it. If you use custom cooler, put back the stock cooler & RMA the card.
> 
> Based on the screenshot of GPU-Z you posted after this (your) message, it seems the memory modules on the card is faulty.
> What is your ambient (indoor) temp? What TIM you use?
> Do you know what screw size the reference card use? My stock coolers in the box & the box is under a pile of boxes. Kinda lazy right now to check myself.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If the stock cooler use M3 screws then the backplate should be compatible with reference cards. Regarding Tri-X cards....
> 
> This is reference (Sapphire) R9 290 card:-
> _(source)_
> 
> 
> This is Tri-X R9 290X card:-
> _(source)_
> 
> 
> Mounting holes locations look identical. If the Tri-X use M3 screws, the EK backplate might be compatible. Still, until someone try this, we should consider they likely not compatible.


As far as I know, Tri-X cards are reference boards with updated coolers.


----------



## Abyssic

Quote:


> Originally Posted by *kizwan*
> 
> Mounting holes locations look identical. If the Tri-X use M3 screws, the EK backplate might be compatible. Still, until someone try this, we should consider they likely not compatible.


i might try it out once i have mine. could be in around 2 weeks


----------



## blue1512

Quote:


> Originally Posted by *kizwan*
> 
> It is not BIOS problem unless the power failure happen while you're flashing the card. 290X should have dual BIOS like 290 right? If it's BIOS problem, you can always switch to second BIOS & boot from there.
> 
> If it's just normal usage while the power failure occurred & the card failed to work afterwards, most likely the card is faulty. I don't think you can do anything except RMA it. If you use custom cooler, put back the stock cooler & RMA the card.
> 
> Based on the screenshot of GPU-Z you posted after this (your) message, it seems the memory modules on the card is faulty.
> What is your ambient (indoor) temp? What TIM you use?


Thanks. I tried a few BIOSes and can confirm that it is not a BIOS problem. So I guess you're right about the memory; maybe I screwed up when putting heatsinks on the memory modules.








Because it's XFX, I can't do an RMA in Australia. Do you think a computer store can fix this?


----------



## psyside

I'm here to say that AMD PowerPlay is the worst thing ever in human history: I tried literally every guide and setting one can imagine, and no, the power limit still won't apply.

I ended up using OverDrive to apply it, lol! Neither the unofficial overclocking method without PowerPlay nor reinstalling the drivers (DDU) and using AB to increase the power limit helped; as soon as I changed any setting, the screen flickered like mad. Three driver reinstalls and 30+ minutes of different settings, and nothing HELPS.

If anyone has a similar issue, just use OverDrive to increase it; it will most likely work. Also, there is a bug in AB where you can't apply exact volts/clocks... it snaps back by 5 or 10 either way.


----------



## rdr09

Quote:


> Originally Posted by *psyside*
> 
> I'm here to say, that AMD powerplay is the worst thing ever in human history, tried literally every guide one can imagine, every setting, and no! power limit still ain't apply.
> 
> I ended up using the overdrive to apply it lol! neither unofficial overclocking method without powerplay helped, or reinstalling the drivers (DDU) and using AB to increase power limit, at once as i changed any setting, the screen flickers like mad, 3 driver reinstalls, and 30 mins + of different settings, nothing HELPS.
> 
> If anyone have similar issue, just use overdrive to increase it, it will work most likely. Also there is bug in AB, where you cant apply the exact volts/clocks.....it goes back to -/+ 5 or 10....


Why not use Trixx?


----------



## psyside

I'm used to AB; for now I've got it working decently at least, so it's OK I guess.


----------



## kizwan

Quote:


> Originally Posted by *blue1512*
> 
> Thanks. I tried few BIOS and can confirm that it is not BIOS problem. So I guess that you are right about the memory, maybe I screwed up when putting heatsinks on the memory modules
> 
> 
> 
> 
> 
> 
> 
> 
> Because it's XFX, I can't do RMA in Australia. Do you think that a computer store can fix this?


Can you RMA the card through the store where you purchased it?

If it's a regular computer store, no, I don't think they can fix it.


----------



## brazilianloser

And here comes the end of January and still no beta Mantle driver... :/


----------



## bond32

Those of you that set your clock profiles manually, what are your 2D clocks?


----------



## chiknnwatrmln

I keep my 2D clocks at stock, 150 MHz core and 300 MHz mem; no real need to change them.

For everyday use I set my 3D clocks to 1100 MHz core and 5500 MHz mem at +18mV.


----------



## bond32

Thanks. I'm experimenting with a few things right now, still trying to figure out the black screen issue. I'm still a bit confused about the Power Limit slider... I was under the impression all it did was *allow* the card to pull more current... So should it always be maxed at +50%, provided temps are good?

Another question, I have a batch file to provide more voltage with AB. Here is what it has:

CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /wi6,30,8d,10

When I try to run it the prompt just immediately closes, what would cause that?


----------



## EliteReplay

Is there any news on AMD releasing a revision of the R9 290/290X with better thermals and power consumption?


----------



## AlDyer

Quote:


> Originally Posted by *bond32*
> 
> Thanks. I'm experimenting with a few things right now, still trying to figure out the black screen issue. I'm a bit confused still about the Power Limit slider... I was under the impression all it did was *allowed* the card to pull more current... So should it always be maxed at +50% provided temps are good?


Yeah, it should. It actually increases performance on my card even without changing any clock speeds (talking mainly about mining performance; I haven't tested whether it increases frames in games). The black screen issue is a pain in the butt; AUX voltage seems to help for me, while it doesn't help for everyone. I never had a really bad black screen issue, but I had very similar stuff going on, and these cards are a bit weird in other ways too. I'm finally starting to get somewhere with it: got the memory to 1500 with the AUX upped, and can probably get higher with core increases and other tweaks, plus WC of course, whenever I get my block.


----------



## anubis1127

Quote:


> Originally Posted by *EliteReplay*
> 
> are there any news on AMD releasing a revision of the R9 290/290X that has better thermal and power consumption?












That is a good one.


----------



## Forceman

Quote:


> Originally Posted by *bond32*
> 
> Another question, I have a batch file to provide more voltage with AB. Here is what it has:
> 
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi6,30,8d,10
> 
> When I try to run it the prompt just immediately closes, what would cause that?


Have you tried using Trixx? It allows +200mV standard.


----------



## AlDyer

Quote:


> Originally Posted by *anubis1127*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That is a good one.


Yeah, because it was "designed" to run at 95C and throttle.


----------



## fit949

http://www.microcenter.com/product/425128/Radeon_R9_290X_Express_Video_Card

They have a few in stock. I got lucky and grabbed a Diamond 290X; they had one left, woohoo!

I'll post some info and temps when I get it up and running.


----------



## bond32

Quote:


> Originally Posted by *Forceman*
> 
> Have you tried using Trixx? It allows +200mV standard.


I have, but wouldn't it be better to use AB, since it offers the unofficial OC methods?


----------



## Forceman

Quote:


> Originally Posted by *bond32*
> 
> I have, but wouldn't it be better to use AB since it offers the unofficial OC methods?


I don't know. I've used Trixx and it seemed to work fine. I don't know what the unofficial overclocking mode does in AB though, I've never used it. Is it needed for Crossfire or something, or have I been missing out on something by not using it?


----------



## JMCB

I'm getting two more 290X video cards, and was curious on what power supply I should go for, as I know the current PS that I have won't do the job.


----------



## yawa

I've slowly been upping the ante on both the GPU and APU for a few benches, trying to achieve stability.

So here is 3DMark 11 with the 290X at 1179/1291 and +100mV, and Kaveri at 4.6 GHz at 1.488 V:

P9615
G: 15611
P: 4587
C: 4301



Kind of annoyed with GPU Tweak, BTW. Even though I've flashed the ASUS BIOS for voltage control, it says it won't let me adjust my voltage since I don't have an ASUS card. I achieved the above bench using MSI Afterburner.

Considering my temps are amazing ATM (42C under heavy load), I really wanna attempt the mythical 1300 at 1.4 V. Anyone else have these issues with GPU Tweak?


----------



## psyside

Quote:


> Originally Posted by *JMCB*
> 
> I'm getting two more 290X video cards, and was curious on what power supply I should go for, as I know the current PS that I have won't do the job.


This one; it's actually a Platinum-rated Seasonic unit.


----------



## tsm106

Quote:


> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> Have you tried using Trixx? It allows +200mV standard.
> 
> 
> 
> I have, but wouldn't it be better to use AB since it offers the unofficial OC methods?
Click to expand...

Trixx uses unofficial mode.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *JMCB*
> 
> I'm getting two more 290X video cards, and was curious on what power supply I should go for, as I know the current PS that I have won't do the job.


With 4 x 290X, people have had success using the Lepa G1600. I am using the EVGA 1300 G2 with 3 x 290s and have no issues.


----------



## tsm106

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JMCB*
> 
> I'm getting two more 290X video cards, and was curious on what power supply I should go for, as I know the current PS that I have won't do the job.
> 
> 
> 
> With 4 x 290x people have had success using the Lepa G1600. I am using the EVGA 1300g2 with 3 x 290's and no issue.
Click to expand...

The Lepa 1600 isn't enough to push quad 290Xs with big ol' overclocks, especially once you start using the unlocked BIOS and modded GPU Tweak.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *tsm106*
> 
> Lepa 1600 isn't enough to push quad 290x and big ole overclocks, especially when you start using the unlocked bios and modded gputweak.


Good to know if I go for a fourth!







Any recommendation? Dual PSU I assume would be best no?


----------



## tsm106

I use two EVGA G2 PSUs with a simple dual-PSU cable.


----------



## shilka

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Good to know if I go for a fourth!
> 
> 
> 
> 
> 
> 
> 
> Any recommendation? Dual PSU I assume would be best no?


Get two V850s and slap them together


----------



## MrWhiteRX7

Quote:


> Originally Posted by *tsm106*
> 
> I use two Evga G2 psu with a simple dual psu cable.


Ahhhhhhhhhhhh yeah! I just need one more lol


----------



## tsm106

Just remember to plug them into outlets on different circuits.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *tsm106*
> 
> Lepa 1600 isn't enough to push quad 290x and big ole overclocks, especially when you start using the unlocked bios and modded gputweak.


I heard the same thing, man. An old mate's Lepa chucked a wobbly with quad 580s; he RMA'd that expensive sucker.


----------



## bond32

So, still looking for answers and learning a lot here. My 290X has always been extremely poor at mining. After extensive testing: if I set my clocks to 1300/1500 with +200 in Trixx, the card will run at about 945 core / 1500 mem at 1.28 V. Why is the clock rate that, rather than what I set? Is it different steps/profiles? I am starting from scratch, so I put the stock BIOS back on the card.

Next question is still the power limit. I don't understand what it does... I set clocks in Trixx with 0 added to the power limit, then ran the miner and saw the power draw max out around 313 W. This is the reading in GPU-Z; I know it isn't the most accurate. But then I upped the power limit in Trixx to +50 and saw no change in the draw in GPU-Z. Maybe I am missing something? Still confused as to what the power limit actually does...


----------



## Forceman

For your first question, it sounds like your card is either thermal or power throttling.

For the second: upping the power limit just increases the amount of power the card is allowed to use, but if it doesn't need the extra power it won't use it. So raising the power limit may not cause actual power draw to go up. You may as well set +50% and forget about it, so you don't need to worry about it. However, sometimes setting the power limit in AB doesn't take (it doesn't actually change), so check in CCC to be sure it was actually increased. If it didn't work through AB, you can just set it in CCC instead; that's what I had to do.
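The explanation above can be written as a toy model (the 250 W stock board power here is an assumed, illustrative figure, not AMD's specified limit):

```python
# Toy model: the power limit is a cap, not a target. Raising it only
# changes anything when the workload would otherwise hit the cap.
# The 250 W stock limit below is an assumed illustrative number.

def effective_draw(demand_w, power_limit_pct=0, stock_limit_w=250):
    """Return (actual draw in watts, whether the card is power-throttling)."""
    cap = stock_limit_w * (1 + power_limit_pct / 100)
    return min(demand_w, cap), demand_w > cap

# A load that fits under the stock cap draws the same at +0% and +50%:
print(effective_draw(200, power_limit_pct=0))   # (200, False)
print(effective_draw(200, power_limit_pct=50))  # (200, False)
# A heavier load throttles at +0% but runs free at +50%:
print(effective_draw(320, power_limit_pct=0))   # (250.0, True)
print(effective_draw(320, power_limit_pct=50))  # (320, False)
```

Which is why a +50% slider can show no change in GPU-Z: if the workload never reaches the stock cap, the extra headroom is simply never used.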


----------



## bond32

Quote:


> Originally Posted by *Forceman*
> 
> For your first question, it sounds like you card is either thermal or power throttling.
> 
> For the second, upping the power limit just increases the amount of power the card can use, but if it doesn't need the extra power it won't use it. So changing the power limit may not cause actual power draw to go up, if the card doesn't need the extra power. You may as well just set +50% and forget about it, that way you don't need to worry about it. However, sometimes setting the power limit in AB doesn't take (it doesn't actually change) so check in CCC and be sure it was actually increased. If it didn't work through AB you can just set it in CCC instead, that's what I had to do.


Thanks! +rep


----------



## syniad

I'm having a strange issue while overclocking my 2 R9 290s (non x).

Sometimes when Heaven benchmark 4.0 freezes from an unstable overclock and I have to reset the computer, I get a message saying AMD drivers are not installed. It's like the drivers uninstall themselves after they crash. At first I thought Windows 8.1 was doing some kind of weird automatic system restore, but it still does it with System Restore disabled.

I'm also thinking my 850W PSU is not enough, as I start to see the core throttling when I increase voltage.

Anyone seen this before or know what causes it?

Driver is 13.12, going to try the 13.30 version later to see if that makes a difference.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *syniad*
> 
> I'm having a strange issue while overclocking my 2 R9 290s (non x).
> 
> Sometimes when Heaven benchmark 4.0 freezes from an unstable overclock and I have to reset the computer I get the message saying AMD drivers are not installed. It's like the drivers uninstall themself after they crash, at first I thought Windows 8.1 was doing some kind of weird automatic system restore but it still does it with system restore disabled.
> 
> I'm also thinking my 850w PSU is not enough as I start to see the core throttling when I increase voltage.
> 
> Anyone seen this before or know what causes it?
> 
> Driver is 13.12, going to try the 13.30 version later to see if that makes a difference.


Driver 13.3?


----------



## syniad

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Driver 13.3?


Yeah it's this one:

http://forums.guru3d.com/showthread.php?t=385614


----------



## MrWhiteRX7

Quote:


> Originally Posted by *syniad*
> 
> Yeah it's this one:
> 
> http://forums.guru3d.com/showthread.php?t=385614


Awesome thank you!


----------



## kizwan

What do you guys think about my score? OK or not OK?

1150/1600 MHz, +100mV, power limit +50, ambient (indoor) 28.5C


GPU1


GPU2


VDDC


----------



## tsm106

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Quote:
> 
> 
> 
> Originally Posted by *syniad*
> 
> Yeah it's this one:
> 
> http://forums.guru3d.com/showthread.php?t=385614
> 
> 
> 
> Awesome thank you!
Click to expand...

Eh... not sure you wanna thank anybody just yet?

Quote:


> Originally Posted by *USAdystopia;4749943*
> Drivers functioned well in single gpu and in CF in ARMA 3...non-functional in COD: Ghosts with CF enabled in any resolution even with in-game "optimal video settings" enabled...repeatedly crashing to desktop.
> 
> *No one will believe this...the drivers automatically uninstalled themselves and left me with only part of the CCC*.


Bolded is freaky, and its not the only occurrence that I've seen.


----------



## BackwoodsNC

Quote:


> Originally Posted by *kizwan*
> 
> What do you guys think about my score? OK or not OK?
> 
> 1150/1600 MHz, +100mV, power limit +50, ambient (indoor) 28.5C
> 
> 
> GPU1
> 
> 
> GPU2
> 
> 
> VDDC


What's the score with just one gpu enabled? At the same clock of course


----------



## aaroc

Quote:


> Originally Posted by *kizwan*
> 
> What do you guys think about my score? OK or not OK?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 1150/1600 MHz, +100mV, power limit +50, ambient (indoor) 28.5C
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> GPU1
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> GPU2
> 
> 
> 
> 
> VDDC


How do you get those GPU graphics?


----------



## kdawgmaster

Quote:


> Originally Posted by *aaroc*
> 
> How do you get those GPU graphics?


You have to use GPU-Z; it has an option to log the information over a period of time, which you can then display in a graph.


----------



## kizwan

Quote:


> Originally Posted by *BackwoodsNC*
> 
> What's the score with just one gpu enabled? At the same clock of course


I'll run Valley with one GPU enabled. Maybe tonight.
Quote:


> Originally Posted by *aaroc*
> 
> How do you get those GPU graphics?


Do you mean the graphs? I record the data using GPU-Z, which saves it to a *.txt* file. The content is comma-delimited, so I just rename it from *.txt* to *.csv*, open it with Excel, remove the blank rows, and plot the graphs from the data.
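If you'd rather script that clean-up step than do it by hand in Excel, here's a minimal sketch in Python using only the standard library. The file name and column header below are just examples, not guaranteed GPU-Z names; check them against the header row of your own log:

```python
import csv

def load_gpuz_log(path, column):
    """Parse a comma-delimited GPU-Z sensor log and return one
    column as floats, skipping blank rows."""
    values = []
    with open(path, newline="") as f:
        reader = csv.DictReader(f, skipinitialspace=True)
        for row in reader:
            cell = (row.get(column) or "").strip()
            if cell:  # drop blank/empty rows instead of hand-deleting them
                values.append(float(cell))
    return values

# Hypothetical usage, then plot with any tool you like:
# clocks = load_gpuz_log("GPU-Z Sensor Log.txt", "GPU Core Clock [MHz]")
```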


----------



## Aesthethc

Someone is offering me a Sapphire 290x.

Whats the highest clock possible for a Sapphire 290x? And would that beat a maximum overclocked 780 classy under a full waterblock (with vrm's watercooled) ?

I was going with a 780 classy but i hear some good things about 290x on water. Does water really allow it to OC high? The owner selling me it says its hit 1100 on air, but im not sure if it will hit 1200 or more on water? or could i overvolt it?

Please, someone help me decide! I just want highest framerates possible as i have 120hz monitor.


----------



## aaroc

Quote:


> Originally Posted by *kizwan*
> 
> I'll run Valley with one GPU enabled. Maybe tonight.
> Do you means the graphs? I record the data using GPU-Z. It will save them in *.txt* file. The content is comma delimited format, so I just rename from *.txt* to *.csv*. Open with Excel, removed the blank rows & plot the graphs from the data.


Thanks! I thought there was an easier way to do the graphs; I use the same method.


----------



## darwing

Is the PowerColor a reference design, so that a waterblock will fit on it?

http://products.ncix.com/detail/powercolor-radeon-r9-290x-oc-1030mhz-4gb-5-0ghz-gddr5-2xdvi-hdmi-displayport-bf4-pci-e-video-card-9d-92940-1382.htm#Specifications

Powercolor Radeon R9 290X OC 1030MHZ 4GB 5.0GHZ GDDR5 2xDVI HDMI DisplayPort


----------



## Sgt Bilko

Quote:


> Originally Posted by *darwing*
> 
> Is the power color reference design to fit a waterblock on it
> 
> http://products.ncix.com/detail/powercolor-radeon-r9-290x-oc-1030mhz-4gb-5-0ghz-gddr5-2xdvi-hdmi-displayport-bf4-pci-e-video-card-9d-92940-1382.htm#Specifications
> 
> Powercolor Radeon R9 290X OC 1030MHZ 4GB 5.0GHZ GDDR5 2xDVI HDMI DisplayPort


Yes, it's a reference design, so all waterblocks made for the reference PCB will fit.

The only waterblock coming out that I know of for the non-reference designs is for the Asus DCU II.


----------



## darwing

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yes it's a Reference design so all waterblocks made for it will fit.
> 
> Only waterblock coming out that i know of for the Non-Ref designs is the Asus DCU II


+1 rep for the quick and informative reply







thank you!


----------



## kizwan

Quote:


> Originally Posted by *BackwoodsNC*
> 
> What's the score with just one gpu enabled? At the same clock of course


With one GPU enabled.

Elpida card


Hynix card


----------



## Aesthethc

Guys, i need help and advice from any 290x owner here that overvolts and overclocks.

I am leaning towards a 290X after seeing these results. Is this a normal result for a 290X @ 1.35V? Also, how easy is it to overvolt? Is it as easy as 780 Classy overvolting? Is 1200MHz easily achievable on a 290X, or would I need to overvolt or do fancy things?

Please help me someone! I have to make a decision very very soon X_X


----------



## AlDyer

Quote:


> Originally Posted by *Aesthethc*
> 
> Guys, i need help and advice from any 290x owner here that overvolts and overclocks.
> 
> I am leaning towards a 290x after seeing these results. Is this a normal result for a 290x @ 1.35v? Also how easy can you overvolt? Is it just as easy as 780 classy overvolting? Is 1200mhz easily achievable by 290x? or would i need to overvolt or do fancy things?
> 
> Please help me someone! I have to make a decision very very soon X_X


You slide a slider to overvolt, just like any other card. 1200 MHz is easy to achieve on water. You would most likely need to overvolt, but nothing fancy.


----------



## Aesthethc

Quote:


> Originally Posted by *AlDyer*
> 
> You slide a slider to overvolt just like any other card. 1200 MHz is easy to achieve on water. You would need to overvolt most likely, nothing fancy


No BIOS flashing on the 290X cards? Just slide the slider all the way to max? It's that easy?


----------



## AlDyer

Quote:


> Originally Posted by *Aesthethc*
> 
> No BIOS Flashing on the 290x cards? Just slide the slider all the way to max? its that easy?


Yup


----------



## Aesthethc

Quote:


> Originally Posted by *AlDyer*
> 
> Yup


Wow, that's easy lol.

Anyone here have a Sapphire 290X that they put an aftermarket cooler on, and then was able to do a successful RMA by slapping the original cooler back on?

I'm scared that if I put a waterblock on my Sapphire 290X it will void the warranty. A Sapphire rep said I couldn't even change the TIM; would they know I changed it? I'll probably be putting on some Coollaboratory Liquid Ultra TIM.


----------



## kdawgmaster

Quote:


> Originally Posted by *Aesthethc*
> 
> No BIOS Flashing on the 290x cards? Just slide the slider all the way to max? its that easy?


There's a lot more to it than just that. I had to mod my MSI AB (thanks to someone here) to allow my card more voltage to get a stable clock of 1125 on the core and 1300 on the RAM. How much your card will overclock is determined by your luck (what we call the silicon lottery) and the brand you go with. Right now the HIS cards seem to overclock the worst, because the chips they buy from AMD are not as highly binned (meaning they are cheap), so overclocking them can be tricky.


----------



## AlDyer

Quote:


> Originally Posted by *kdawgmaster*
> 
> Theres alot more to it then just that. I had to mod my MSI AB ( thanks fto someone here ) to allow my card more voltages to get a stable clock of 1125 on the core and 1300 on the ram. How much your card will overclock will be determined by your luck ( what we call the silicon lottery ) and brand you go with. Right now the HIS cards seem to overclock the worse because the chips they buy from AMD are not as highly binned ( meaning they are cheap ) so overclocking on them can and may be tricky.


Or you can just use Trixx, which allows higher voltage without any modding. And yes, removing the cooler will void the warranty; they will know you removed it, because there are stickers or whatever covering the screws.


----------



## Aesthethc

Quote:


> Originally Posted by *kdawgmaster*
> 
> Theres alot more to it then just that. I had to mod my MSI AB ( thanks fto someone here ) to allow my card more voltages to get a stable clock of 1125 on the core and 1300 on the ram. How much your card will overclock will be determined by your luck ( what we call the silicon lottery ) and brand you go with. Right now the HIS cards seem to overclock the worse because the chips they buy from AMD are not as highly binned ( meaning they are cheap ) so overclocking on them can and may be tricky.


Yes, I am aware of the silicon lottery. I am worried that if I go with the Sapphire 290X I may not be lucky and get an average card, when I could get a 780 Classified with a binned GPU that will clock high for sure....

Can you link me to a brief tutorial of what you need to do to modify MSI AB? I believe you have to do that for the GK110 cards as well, right? (I could be wrong.)

Thanks kdawgmaster









So the highest you've achieved is 1125 on the core with more voltage? Is 1200 on the core considered an "above average" clocker, or is that easy if I max out the voltage?
Quote:


> Originally Posted by *AlDyer*
> 
> Or you can just use Trixxx which allows higher voltage without any modding. And yes removing the cooler will void the warranty, *they will know you removed the cooler, because it has stickers or whatever covering the screws*.


AWWW ):

So sad..............


----------



## kdawgmaster

Quote:


> Originally Posted by *AlDyer*
> 
> Or you can just use Trixxx which allows higher voltage without any modding. And yes removing the cooler will void the warranty, they will know you removed the cooler, because it has stickers or whatever covering the screws.


I've used Trixx, and no, it wouldn't let me do anything past the extra +100mV, while I'm needing something like +200mV right now for my card to be stable.

Again, I'm using HIS cards, so the chips aren't as good and overclocking them was a complete pain. My RAM will only go to 1350 tops, which will run everything but BF4.
Quote:


> Originally Posted by *Aesthethc*
> 
> Yes i am aware of silicon lottery. I am worried if i go with the Sapphire 290x i may not be as lucky with silicon lottery and get an average card when i could be going to get a 780 classified and get a "binned" GPU that will for sure clock high....
> 
> Can you link me and show me like a brief tutorial of what you need to do to modify MSI AB? I believe you have to do that for the GK110 cards as well right? (i could be wrong) .
> 
> Thanks kdawgmaster
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So the highest youve achieved is 1125 on the core with more voltages? Is 1200 on the core considered an "above average" clocker? or is that easy if i max out the voltage ?
> AWWW ):
> 
> So sad..............





Spoiler: ADD more volts to MSI AB Guide



*Source*

Just use /wi4,30,8d,10 for 100mV. The offset step is 6.25mV and the value is in hexadecimal: 10 hex = 16 decimal, and 16 * 6.25 = 100mV. For 50mV you need 8. For 200mV you need 20 (20 hex = 32 decimal, and 32 * 6.25 = 200mV).

The easy way to do changes:

Create a txt on the desktop. Write
CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /wi4,30,8d,10

and then save it as a .bat file. Every time you start this bat file, MSI AB will start with +100mV.

For 50mv: 8
For 100mv:10
For 125mv:14
For 150mv:18
For 175mv:1C
For 200mv:20

I wouldn't go over this point because:
1) You are close to leaving the sweet spot of the reference PCB VRMs' efficiency.
2) These commands add 200mV on top of the 100mV offset available through the AB GUI. That means 300mV total.

By default the /wi command applies to the current GPU only, so if you have 2 or more GPUs you must use the /sg command. The command line then looks something like this:
ex: MsiAfterburner.exe /sg0 /wi6,30,8d,10 /sg1 /wi6,30,8d,10
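That hex arithmetic is easy to get wrong by hand, so here's a small illustrative sketch that computes the value for a given offset. The 6.25mV step and the /wi4,30,8d register address are taken from the guide above, not verified independently; treat the output as a sanity check, not gospel:

```python
STEP_MV = 6.25  # one register step in mV, per the guide above

def wi_argument(offset_mv, prefix="/wi4,30,8d"):
    """Translate a voltage offset in mV into the hex-suffixed
    /wi argument described in the guide."""
    steps = offset_mv / STEP_MV
    if steps != int(steps):
        raise ValueError("offset must be a multiple of 6.25 mV")
    return "%s,%x" % (prefix, int(steps))

# Reproduces the guide's table (50->8, 100->10, 125->14, ... 200->20):
for mv in (50, 100, 125, 150, 175, 200):
    print(mv, wi_argument(mv))
```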




This was posted on page 1

I never did this before, but +1 to sugarhell for helping me with the OC.


----------



## chiknnwatrmln

1200 core is a pretty good OC. I'd say average is around 1150 to 1180 MHz. Your limiting factor will be VRM temps if you're not on the reference cooler, and if you are, then most likely noise. Even at 100% fan the reference cooler's core temps aren't all that great, and in my experience Hawaii is temp sensitive.

With my Gelid, at +18mV, VRM1 hits around 80C in extended gaming, but 1200 core is attainable (+168mV, about 1.28V actual) until VRM1 passes about 90C, at which point I get instability. Hoping to remedy that with a waterblock soon.


----------



## AlDyer

Quote:


> Originally Posted by *kdawgmaster*
> 
> Ive used Trixxx and no it wouldnt let me do anything past the extra +100mv which im needing something like +200mv right now for my card to make it stable.
> 
> Again using an HIS card so the chips arent as good on them so overclocking them was a complete pain. My ram will only go to tops 1350 which will run everything but BF4.


I got 1175 and 1500 just with +100, you got unlucky I guess







At least the card is powerful right out the box anyway


----------



## Aesthethc

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> 1200 core is a pretty good OC. I'd say average is around 1150 to 1180 MHz. Your limiting factor will be VRM temps if you're not on reference cooler, and if you are then most likely noise. Even at 100% fan, the ref cooler's core temps aren't all that great and in my experience Hawaii is temp sensitive.
> 
> With my Gelid, at +18mV VRM 1 hits around 80c for extended gaming, but 1200 core is attainable (+168mV, about 1.28v actual) until my VRM 1 passes about 90c, at which I get instability. Hoping to remedy that with a waterblock soon.


It will be on an EK full block, so air cooling is out of the question.

OK, so what you guys are saying is 1150-1180 is average, and 1200 is considered attainable as long as I have good VRM cooling and don't have an HIS card? The guy trading me says his 290X can hit 1100 on air; does that suggest it could go higher on water?

I'm trying to decide if I want the warranty or not..... the 290X is a good card, but I'm sad that Sapphire will void my warranty.... need to think about this more.....


----------



## kdawgmaster

Quote:


> Originally Posted by *Aesthethc*
> 
> It will be on an EK full block so air cooling is out of the question.
> 
> okay so what you guys are saying is 1150-1180 is avg and 1200 is considered "attainable" as long as i have good VRM cooling and as long as i dont have a HIS card? The guy trading me says the 290x can hit 1100 on air does that say anything that it could possibly be higher on water?
> 
> Im trying to think if i want warranty..... or not..... 290x is a good card but im sad that Sapphire will void my warranty.... need to think about this more.....


From everything I've seen so far, the best chips come from Asus and MSI. Not saying Sapphire isn't still a good choice; I just haven't seen the same overclocking success with them as with the other two.

Also, HIS will technically "void" the warranty if the cooler has been tampered with, however there is no warranty-void-if-removed sticker on any of my screws, so I did it anyway and changed the thermal paste on one of my cards. I'm thinking I'll do the other 2 soon.
Quote:


> Originally Posted by *AlDyer*
> 
> I got 1175 and 1500 just with +100, you got unlucky I guess
> 
> 
> 
> 
> 
> 
> 
> At least the card is powerful right out the box anyway


I overclocked all my cards and it got me within the top 50 spots of Fire Strike Extreme on 3 GPUs, so I'm (somewhat) happy. The bottleneck is my CPU for synthetic stuff.


----------



## sugarhell

You can't bin chips, only ASIC quality.


----------



## kdawgmaster

Quote:


> Originally Posted by *sugarhell*
> 
> You cant bin chips.Only asic quality.


I guess it's not binning, but from where I work (a computer retail store) everything points to HIS almost always being the more inexpensive, lower-quality option of the bunch, because their cards never last as long. Now, I'm not saying they are bad, as I have a bunch and I'm quite happy with them.


----------



## Forceman

Quote:


> Originally Posted by *AlDyer*
> 
> I got 1175 and 1500 just with +100, you got unlucky I guess
> 
> 
> 
> 
> 
> 
> 
> At least the card is powerful right out the box anyway


I wouldn't say "just" +100mV, since that's all most people are comfortable running 24/7.
Quote:


> Originally Posted by *Aesthethc*
> 
> It will be on an EK full block so air cooling is out of the question.
> 
> okay so what you guys are saying is 1150-1180 is avg and 1200 is considered "attainable" as long as i have good VRM cooling and as long as i dont have a HIS card? The guy trading me says the 290x can hit 1100 on air does that say anything that it could possibly be higher on water?
> 
> Im trying to think if i want warranty..... or not..... 290x is a good card but im sad that Sapphire will void my warranty.... need to think about this more.....


I thought Sapphire was one of the companies that didn't put stickers on the screws. It was all discussed a while back when they first came out, I'll see if I can find the post.

And 1200 is an above average overclock unless you are comfortable running higher than average voltages. Most cards will do 1150 or so with AB voltages (so +100mV), but 1200 may take more than that.

Edit: And Sapphire definitely does not have any stickers on the screws, at least in North America - I forgot that my first card was a Sapphire. They may still say it voids the warranty to remove the cooler, but as long as you put it back on I don't think they'll hassle you. Very few manufacturers overtly say it is okay to mess with the cooler.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Forceman*
> 
> I wouldn't say "just" +100mV, since that's all most people are comfortable running 24/7.
> I thought Sapphire was one of the companies that didn't put stickers on the screws. It was all discussed a while back when they first came out, I'll see if I can find the post.
> 
> And 1200 is an above average overclock unless you are comfortable running higher than average voltages. Most cards will do 1150 or so with AB voltages (so +100mV), but 1200 may take more than that.


No, Sapphire does NOT have stickers over the screws, I can confirm that from when I had one. Mine wasn't a very good clocker though: with +100mV I could only hit 1150 stable, though that's not a representation of Sapphire.









Quote:


> Originally Posted by *Aesthethc*
> 
> Wow thats easy lol.
> 
> Anyone here have a Sapphire 290x that they put an aftermarket cooler on? and then was able to do a successful RMA? by slapping the original cooler on and RMA'ing?
> 
> Im scared if i put a waterblock on my sapphire 290x it will void warranty? Sapphire rep said i couldnt even change TIM, would they know i changed the TIM? Ill probably be putting on some Coolaboratory liquid ultra TIM


I took the cooler off my Sapphire 290X and replaced it with an Accelero Xtreme III and then had to RMA; it was all good, I just didn't tell them that I removed it.


----------



## AlDyer

Quote:


> Originally Posted by *Forceman*
> 
> I wouldn't say "just" +100mV, since that's all most people are comfortable running 24/7.
> I thought Sapphire was one of the companies that didn't put stickers on the screws.


I thought he was going to watercool, though. The stock coolers are hideous, and I said "only" because I for one am certainly going to put a waterblock on this baby.







Also, it is a bit of a gamble with the warranty if you remove the cooler; it depends a lot on the person and company, I guess. I don't know about RMAs in the States so I can't say too much.


----------



## Jiiks

Quote:


> Originally Posted by *kizwan*
> 
> With one GPU enabled.
> 
> Elpida card
> 
> 
> Hynix card


I did a quick run with your clocks:



and another one with my own clocks for comparison. 1175/1450



Might have to mess around a bit more to get that 2fps


----------



## kizwan

Quote:


> Originally Posted by *Jiiks*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> With one GPU enabled.
> 
> Elpida card
> 
> 
> Hynix card
> 
> 
> 
> 
> 
> 
> 
> 
> I did a quick run with your clocks:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> and another one with my own clocks for comparison. 1175/1450
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Might have to mess aroung a bit more to get that 2fps
Click to expand...

I can say the CPU does affect the score a little bit though. Did you overclock your CPU? Mine is running @ 4.75GHz. Going from 3.8GHz (stock) to 4.75GHz, I gained ~8.5 FPS (cards at stock clock). [EDIT] The ~8.5 FPS gain is in Crossfire; I didn't test how many FPS I gain with a single card.


----------



## Jiiks

Quote:


> Originally Posted by *kizwan*
> 
> I can say CPU does effect the score a little bit though. Did you overclocked your CPU? Mine running @4.75GHz. Going from 3.8GHz (stock) to 4.75GHz, I gain ~8.5 FPS (cards at stock clock. [EDIT] this values in Crossfire. I didn't test how many FPS I can gain with single card though.).


Yeah, I'm running @ 3.9GHz (3.5 stock). I also had the power limit at +20, so I'll try another run in a moment with +50 and see if it makes any difference.

Edit:
Didn't really have any effect:


----------



## Widde

Quote:


> Originally Posted by *Forceman*
> 
> I wouldn't say "just" +100mV, since that's all most people are comfortable running 24/7.
> I thought Sapphire was one of the companies that didn't put stickers on the screws. It was all discussed a while back when they first came out, I'll see if I can find the post.
> 
> And 1200 is an above average overclock unless you are comfortable running higher than average voltages. Most cards will do 1150 or so with AB voltages (so +100mV), but 1200 may take more than that.
> 
> Edit: And Sapphire definitely does not have any stickers on the screws, at least in North America - I forgot that my first card was a Sapphire. They may still say it voids the warranty to remove the cooler, but as long as you put it back on I don't think they'll hassle you. Very few manufacturers overtly say it is okay to mess with the cooler.


Can confirm that Sapphire don't have stickers on the screws in Sweden as well ^^


----------



## syniad

Getting some very high VRM1 temps on my first card (over 100 degrees), whereas my second card seems much better, maxing out at 70 degrees. I probably need to re-seat the waterblock on the first card, which is a pain because I will have to drain the whole system.


----------



## kizwan

Quote:


> Originally Posted by *syniad*
> 
> Getting some very high VRM1 temps on my first card (over 100 degrees) whereas my 2nd card seems to be much better maxing out at 70 degrees. I probably need to re-seat the waterblock on the first card which is a pain because I will have to drain the whole system


GPU1 seems to be doing all the work; 100C is too high. Mine only maxes in the 80s Celsius in BF4, and my ambient (indoor) temp is high (30+ Celsius). You can try a thicker thermal pad on VRM1; it may help reduce the temp. I re-seated the waterblocks on my cards a couple of days ago with a thicker thermal pad on VRM1. Temps so far are better in Valley, but I haven't tested in BF4 yet. BF4 will push VRM1 harder than Valley. We'll see tonight.

BTW, is there a huge difference in GPU temp between GPU1 & GPU2?


----------



## steelkevin

Quote:


> Originally Posted by *syniad*
> 
> Getting some very high VRM1 temps on my first card (over 100 degrees) whereas my 2nd card seems to be much better maxing out at 70 degrees. I probably need to re-seat the waterblock on the first card which is a pain because I will have to drain the whole system


Wow, even 70°C is excessive for watercooled GPUs.
I played 5 hours straight of BF4 yesterday afternoon when I got home (it had been a while since I'd been able to play that much without getting bored), and both VRMs hit a maximum of 35°C while the GPU didn't go above 40°C. That's with an ambient of 21.5°C.
Make sure you used the right thermal pads, although I doubt that would explain a 65°C difference :/


----------



## phallacy

What are people's general thoughts on the XFX R9 290X? I got one this weekend, so now I have one from Sapphire and one from XFX. They are both overclocked to 1100/1500 with +60mV and +50 power limit. These were my stable settings on the Sapphire, but it looks like the XFX has much more headroom. The fan is fixed at 80% when gaming.

The only game where the cards reach 90+ is Far Cry 3 with 8x MSAA and everything on high at 1440p. I haven't really noticed any throttling.

The Sapphire card has Elpida memory and the XFX I received has Hynix. One problem I'm running into using reference coolers is that at idle my primary card (the one the display is connected to, the Sapphire) runs at close to 75-80 just sitting on the desktop or watching a movie in VLC, whereas the second card is more normal at 55ish. Is this normal? I disabled ULPS btw, and I use AB to monitor but Trixx to overclock.

I'm thinking about making the XFX the primary card and moving the Sapphire into the second slot.


----------



## dspacek

Hello, I have noticed some strange behaviour in World of Tanks. My card, with not much of an OC, is throttling like hell. Any suggestions as to what causes it?
I tried many BIOSes, tweaking utilities, etc., and nothing worked to keep it stable at OC clocks.
Furmark, Heaven, and almost all other games run OK......


----------



## ArchieGriffs

Quote:


> Originally Posted by *phallacy*
> 
> What are peoples' general thoughts of the XFX r9 290x? I got one this weekend so now I have one from sapphire and one from XFX. They are both overclocking to 1100/1500 with a +60 mV and 50 power limit. These were my stable settings on the sapphire but it looks like the XFX has much more headroom. Fan is fixed at 80% when gaming.
> 
> Only game where the cards seem to be reaching 90+ is in Far Cry 3 with 8x msaa and high everything at 1440p. Haven't really noticed any throttling.
> 
> The Sapphire card has elpida memory and the XFX I received has hynix. One problem I am running into using reference coolers is that at idle my primary card which the display is connected to, the sapphire r9, is running at close to 75-80 just hanging out on the desktop or maybe watching a movie on VLC where as the second card is more normal at 55ish. Is this normal? I disabled ULPS btw and use AB to monitor but Trixx to overclock.
> 
> I'm thinking about switching the XFX as the primary card and moving the Sapphire one into the second slot.


75-80 isn't normal for idle temperatures; maybe it has something to do with your OC? Try them both at default settings when you're at the desktop and compare the two cards to each other. Heck, even 55ish isn't good at idle, but that's probably overclocked, so I would say that's not out of the ordinary.
Quote:


> Originally Posted by *dspacek*
> 
> Hello, I have noticed a strange behaviour in World of tanks. My card with not much OC is throttling like a hell. Any suggestions what causes it?
> I tried many BIOSES, Tweaking utilities etc. and nothing worked to get it stable on OC clocks.
> Furmark, Heaven, almost other games run OK......


Probably because it's World of Tanks more than it is your graphics card. That game is terribly coded and optimized. If other games run fine with an OC, you always have the option of going back to the default settings for WoT; an R9 290 should be more than enough to run it on max settings with really good FPS, and unless you're trying to hold 120Hz on max settings there's not much need to OC. Still, it's extremely strange that you would throttle in that game. I won't be able to test whether mine throttles with an OC in WoT until the 17th or 18th of Feb, so I can't really provide any personal experience as far as that game goes.


----------



## dspacek

Quote:


> Originally Posted by *ArchieGriffs*
> 
> 75-80 isn't normal for idle temperatures, maybe it has something to do with your OC? Try them both on default settings when you're at the desktop and compare the two cards to each other. Heck even 55ish isn't good at idle, but that's probably overclocked, so I would say that's not out of the ordinary.
> Probably because it's World of Tanks more than it is your graphics card. That game is terribly coded and optimized. If other games are running fine on a OC you always have the option of going back to the default settings for WoT, a R9 290 should be more than enough to run it on max settings with really good FPS, unless you're trying to get 120hz on max settings there's not much of a need to OC. Still extremely strange that you would throttle on that game. I won't be able to test if mine throttles on a OC until the 17th or 18th of Feb on WoT, so I can't really provide any personal experience as far as that game goes.


I saw a little bit of throttling in Heaven, because I had the graphs set to refresh every 3s; with 1s it does show throttling, but much, much less than WoT. The problem is that WoT on max settings runs slowly on huge maps with many trees etc.; it can only manage 40FPS.
My experience is this: with AB 18, almost all games were throttling, and Furmark too. With GPU Tweak, Furmark is OK. I'm watercooled, but I read somewhere that throttling can be caused by the fan setup. So the question is:
can throttling be caused by the fan not being present? And is there any utility to make my own BIOS yet?


----------



## phallacy

Quote:


> Originally Posted by *ArchieGriffs*
> 
> 75-80 isn't normal for idle temperatures, maybe it has something to do with your OC? Try them both on default settings when you're at the desktop and compare the two cards to each other. Heck even 55ish isn't good at idle, but that's probably overclocked, so I would say that's not out of the ordinary.
> Probably because it's World of Tanks more than it is your graphics card. That game is terribly coded and optimized. If other games are running fine on a OC you always have the option of going back to the default settings for WoT, a R9 290 should be more than enough to run it on max settings with really good FPS, unless you're trying to get 120hz on max settings there's not much of a need to OC. Still extremely strange that you would throttle on that game. I won't be able to test if mine throttles on a OC until the 17th or 18th of Feb on WoT, so I can't really provide any personal experience as far as that game goes.


Thanks, I will try them both at stock when I'm home from work. Weird thing is, when I first installed them they were both at 55 or so idle; then, after about 5-10 resets due to CoreTemp constantly freezing my PC (and switching to RealTemp), GPU-Z started showing one card much hotter than the second at idle. Will also try to re-enable ULPS and see if that is what's causing it.

EDIT: I enabled ULPS again in both TriXX and Afterburner, or rather unchecked "disable ULPS", and temps are now 48 on the primary and 38 on the second card. Only downside is GPU-Z can't read the VRM temps of the second card and only shows 24gb as the memory bandwidth. When I alt-tab during a game and check, I can read everything and the readings are correct. Overall, glad to have lower temps again.


----------



## Jhors2

I picked up 3 R9 290Xs. Currently on air, soon to be on water (these are way too loud). I got really lucky: all of them came with Hynix memory and seem to be decently binned. Currently running 1100 core and 1425 memory with +35mv and +25% power limit. When you increase the voltage on the core, it also increases memory voltage; is that correct? Still working on overclocking, but putting it on a soft hold until my EK blocks come in. Proof below:


----------



## Ukkooh

Does anyone know the stock voltage range for 290X GPUs? My 1st one had ~1.28V at stock and this one has 1.18V stock voltage. Was wondering if some have even higher stock voltages, to get an idea of safe voltages for the GPU. From my experience, at least 1.3V should be safe for 24/7, as that is barely even an overvolt compared to the stock voltage of my 1st 290X. And how high should I try to push my mem? I have Hynix memory and I just set 1500MHz memory clocks thinking it would crash fast, but so far it seems to be stable.


----------



## Mr357

Quote:


> Originally Posted by *Jhors2*
> 
> I got really lucky and all of these came with Hynix memory and seem to be decently binned. Currently running 1100 core and 1425 memory with +35mv +25% power-limit. When you increase the voltage on the core it also increases memory voltage is that correct?


I don't know that increasing the Vcore does anything to the memory voltage, but at the least higher Vcore allows for higher memory clocks.


----------



## MrWhiteRX7

Only thing I've seen that directly helps me with memory OC is the "aux voltage" in MSI Afterburner. With that I can run as high as 1200 core / 1650 mem with +160mv, but in TriXX, even at +200mv, I can't get the memory over 1525-1550ish.


----------



## pilla99

At this point I'm just trusting my card with high temps until I can afford water and another card, and being a poor college student that probably means a year or two. I don't overclock it, but it runs at its temp target of 95C basically half of the day most days when I'm using it.

This is idle,

compare this to my CPU..... Sometimes I miss the 32C idle of my 690.


----------



## battleaxe

Quote:


> Originally Posted by *pilla99*
> 
> At this point I'm just trusting my card with high temps until I can afford water and another card, and being a poor college student that probably means a year or two. I don't overclock it, but it runs at its temp target of 95C basically half of the day most days when I'm using it.
> 
> This is idle,
> 
> compare this to my CPU..... Sometimes I miss the 32C idle of my 690.


Put it on water and you'll have it back. Mine never go above 52c.


----------



## Cobrah




----------



## Krusher33

Quote:


> Originally Posted by *kizwan*
> 
> You won't disappoint! Don't forget to picked up Fujipoly thermal pad because EK stock thermal pad won't cool the VRMs good enough.
> 
> http://www.frozencpu.com/products/17499/thr-181/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_05_-_Thermal_Conductivity_170_WmK.html?tl=g30c309s1294
> 
> http://www.frozencpu.com/products/17500/thr-182/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_10_-_Thermal_Conductivity_170_WmK.html?tl=g30c309s1294


Geez those are expensive. Is there not any that are pretty good but easier on the wallet?


----------



## Roy360

Quote:


> Originally Posted by *Krusher33*
> 
> Geez those are expensive. Is there not any that are pretty good but easier on the wallet?


What's wrong with the EK ones? My VRMs are 56 and 39 at full load, and VRM1 peaks at 60 after +100mv and a 1350 overclock. Considering these are meant to run at over 125 degrees, I think 60 is pretty good (and this is after some water leaked into the card).

For those who removed the DVI port to make it a single-slot card, where did you get the single-slot bracket? All the ones I found are for 7990s and GTX cards. I haven't seen a single DVI, HDMI, DisplayPort one yet.


----------



## Krusher33

Quote:


> Originally Posted by *Roy360*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Krusher33*
> 
> Geez those are expensive. Is there not any that are pretty good but easier on the wallet?
> 
> 
> 
> What's wrong with the EK ones? My VRAMS are 56 and 39. On full load, and VRAM 1 peaks to 60 after +100mv and 1350 Overclock.
> 
> For those who removed the DVI port to make it a single slot card, where did you get single bracket? All the ones I found are for 7990s and GTX cards. haven't seen a single DVI, HDMI , Displayport one yet
Click to expand...

Apparently they were compared to the ones from other brands and they didn't fare as well.

But it's good to know yours was ok. I think I'll just use them, try it out and see for myself.


----------



## rdr09

Quote:


> Originally Posted by *Krusher33*
> 
> Geez those are expensive. Is there not any that are pretty good but easier on the wallet?


yes, these are cheaper but still way better than the stock ones that come with the EK kit . . .

http://www.frozencpu.com/products/16883/thr-168/Fujipoly_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_10_-_Thermal_Conductivity_110_WmK.html?tl=g30c309s1294

http://www.frozencpu.com/products/16882/thr-167/Fujipoly_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_05_-_Thermal_Conductivity_110_WmK.html?tl=g30c309s1294


----------



## Roy360

Quote:


> Originally Posted by *Krusher33*
> 
> Apparently they were compared to the ones from other brands and they didn't fare as well.
> 
> But it's good to know yours was ok. I think I'll just use them, try it out and see for myself.


Just make sure to use thermal paste on the thermal pads; I used up an entire tube on my card.

I will mention, however, that my R9 290 block came with a Gelid-type paste instead of the regular runny paste. Or maybe I mixed it up with another EK block's paste.


----------



## Raephen

Quote:


> Originally Posted by *Krusher33*
> 
> Geez those are expensive. Is there not any that are pretty good but easier on the wallet?


Phobya 7W/mK pads also do the job admirably. I use them with my AC kryographics block.


----------



## Jhors2

Quote:


> Originally Posted by *Roy360*
> 
> Just make sure to use thermal paste on the thermal pads, I used up an entire tube on my card.
> 
> I will mention however, my R9 290 block came with a Gelid variant paste, instead of the regular runny paste. Or maybe I mixed it up with other EK block's paste.


Really? This is the first time I've ever heard of using paste on top of the pads.


----------



## Krusher33

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Krusher33*
> 
> Geez those are expensive. Is there not any that are pretty good but easier on the wallet?
> 
> 
> 
> yes, these are cheaper but still way better than the stocks that come the ek kit . . .
> 
> http://www.frozencpu.com/products/16883/thr-168/Fujipoly_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_10_-_Thermal_Conductivity_110_WmK.html?tl=g30c309s1294
> 
> http://www.frozencpu.com/products/16882/thr-167/Fujipoly_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_05_-_Thermal_Conductivity_110_WmK.html?tl=g30c309s1294
Click to expand...

Quote:


> Originally Posted by *Raephen*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Krusher33*
> 
> Geez those are expensive. Is there not any that are pretty good but easier on the wallet?
> 
> 
> 
> Phobya 7W/mk pads also do the job admirably. I use them with my AC kryographics block.
Click to expand...

Ok, thanks. I saw those but was concerned they weren't good enough, given the lower cost.


----------



## Thorteris

I won an R9 290X; it should be shipping in 3-4 days


----------



## PillarOfAutumn

My system completed!






I posted this up on reddit, and someone was saying something about my PSU (750 watts):
https://www.reddit.com/r/1w6nwq/build_complete_the_dark_knight/cezk6cl
With how he's talking, I was wondering if someone else can read it and try to make sense out of it?


----------



## the9quad

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> My system completed!
> 
> 
> 
> 
> 
> 
> I posted this up on reddit, and someone was saying that my
> 
> __
> https://www.reddit.com/r/1w6nwq/build_complete_the_dark_knight/cezk6cl
> (750 watts). With how he's talking, I was wondering if someone else can read it and try to make sense out of it?


Can't help there, but your PC looks FANTASTIC, nice job


----------



## MrWhiteRX7

Seriously that is so damn clean!


----------



## VSG

Don't bother listening to such comments. I have read hundreds of them online, courtesy of people like Linus advocating that most PSUs are overkill while assuming they know what each person intends to do with their system: overclocking, overvolting, going with SLI/CFX in the future, and so forth.


----------



## syniad

Quote:


> Originally Posted by *kizwan*
> 
> GPU1 seems doing all the work. 100C too high. Mine only max at 80s Celsius in BF4 & my ambient (indoor) temp is high (30+ Celsius). You can try thicker thermal pad on VRM1, it may help reducing the temp. I just re-seat water blocks on my cards a couple of days ago with thicker thermal pad on VRM1. Temp so far better in Valley but have not test it in BF4. BF4 will push VRM1 more than Valley. We'll see tonight.
> 
> BTW, is there huge difference on GPU temp between GPU1 & GPU2?


I'm not sure how accurate the GPU usage graphs are anymore, but yeah, my 2nd card's usage goes all over the place.

The actual GPU core temps average about 47-50 degrees on both cards and sometimes go as high as 60 if they have been under load a while.
The only real worry for me is VRM1 on the first card: if it's hitting over 100 degrees just in a benchmark, after a long gaming session that's gonna get much worse. 70 on the 2nd is much more reasonable for me.
I may try putting thermal paste on the pads as well when I get round to it; that may be accounting for some of the heat.

Score from 1200/1625, not sure if this is normal for 2 R9 290s at 1440p


----------



## Roy360

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> ....
> I posted this up on reddit, and someone was saying that my
> 
> __
> https://www.reddit.com/r/1w6nwq/build_complete_the_dark_knight/cezk6cl
> (750 watts). With how he's talking, I was wondering if someone else can read it and try to make sense out of it?


Well, I ran my R9 290 with a CX500 before upgrading to my AX1200 to support trifire.
The card takes up 260W (+10%) + 107W (+140%), and then you have your HDDs, SSDs and USB drives. Many review sites suggest at least 800W to run these cards in Crossfire.
http://www.guru3d.com/articles_pages/radeon_r9_290_crossfire_review_benchmarks,4.html
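Numbers like these can be turned into a rough sizing sketch. A minimal back-of-the-envelope in Python, assuming the ~260 W R9 290 board-power figure quoted above; the CPU and drive wattages and the 20% headroom factor are illustrative guesses, not measurements:

```python
# Rough PSU sizing sketch for a CrossFire R9 290 system.
# CPU/drive figures and headroom are illustrative assumptions.

CARD_TDP_W = 260          # approximate R9 290 board power
POWER_LIMIT = 0.10        # +10% power-limit slider
CPU_W = 150               # overclocked quad-core, rough guess
DRIVES_FANS_W = 50        # HDDs, SSDs, fans, USB devices

def estimated_draw(num_cards: int) -> float:
    """Estimated DC load in watts for the whole system."""
    gpus = num_cards * CARD_TDP_W * (1 + POWER_LIMIT)
    return gpus + CPU_W + DRIVES_FANS_W

draw = estimated_draw(2)
headroom = 0.20           # spare capacity so the PSU never sits at its limit
recommended = draw * (1 + headroom)
print(f"estimated load: {draw:.0f} W, recommended PSU: {recommended:.0f} W")
```

With these assumptions the estimate lands just above 900 W, which lines up with the review sites' "at least 800 W" advice for CF 290s.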

Your PC does look very nice though


----------



## Sgt Bilko

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> My system completed!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I posted this up on reddit, and someone was saying that my
> 
> __
> https://www.reddit.com/r/1w6nwq/build_complete_the_dark_knight/cezk6cl
> (750 watts). With how he's talking, I was wondering if someone else can read it and try to make sense out of it?


The guy's an idiot; 850w seems perfect for CF 290's and your 3570k OC'd.

I know I'm running an 8350 rig, but I've pulled up to 950w from the wall with my rig overclocked.....I have a number of HDDs/fans/SSDs etc etc, but still.
You done good.

Build looks pretty damn good too, fantastic job bud


----------



## VSG

Ummm.. He has a single card and a 750W PSU but yes- the other guy is a misled idiot I would imagine.


----------



## Sgt Bilko

Quote:


> Originally Posted by *geggeg*
> 
> Ummm.. He has a single card and a 750W PSU but yes- the other guy is a misled idiot I would imagine.


From what I gathered from the link, he is planning on CF 290's sometime in the future, and the PSU is a SeaSonic X Series 850W.

Unless that changed from the OP in the link


----------



## VSG

I was just looking at his sig rig lol but an 850W PSU for CF 290s might be cutting it a bit close if overclocking/volting. I switched over to a 1200W PSU from an 860W one when numbers started coming for CFX 290x setups.


----------



## Sgt Bilko

Quote:


> Originally Posted by *geggeg*
> 
> I was just looking at his sig rig lol but an 850W PSU for CF 290s might be cutting it a bit close if overclocking/volting. I switched over to a 1200W PSU from an 860W one when numbers started coming for CFX 290x setups.


I've had a 1200w PSU since i had CF 6970's lol

It might be getting to the point where i need to replace it but it hasn't failed me yet


----------



## VSG

Well, I got mine for dual MSI 290X Lightnings, but they seem to be vaporware at this time, so I went with dual EVGA 780 Ti Classified KPEs. For both those cases, I am now thinking that a 1200W PSU may not be enough for max performance, though


----------



## Sgt Bilko

Quote:


> Originally Posted by *geggeg*
> 
> Well I got mine for dual MSI 290x Lightnings but they seem to be vaporware at this time so I went with dual EVGA 780 Ti Classified KPEs. For both those cases, I am now thinking that a 1200W PSU may be too less for max performance though


Ah, I remember you making the switch, how are the Ti's going?

I'd imagine they are pretty beastly


----------



## VSG

I missed the UPS guy by 15 min today so not very pleased lol. I am yet to get my waterblocks though, so it's ok. I was going to directly test under water anyway.


----------



## tsm106

Quote:


> Originally Posted by *geggeg*
> 
> Well I got mine for dual MSI 290x Lightnings but they seem to be vaporware at this time so I went with dual EVGA 780 Ti Classified KPEs. For both those cases, I am now thinking that a 1200W PSU may be too less for max performance though


Shoulda waited... shakes head.


----------



## VSG

Quote:


> Originally Posted by *tsm106*
> 
> Shoulda waited... shakes head.


Well if they prove worth the change and available at a decent price, who's to say I won't rejoin?


----------



## tsm106

Quote:


> Originally Posted by *geggeg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Shoulda waited... shakes head.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well if they prove worth the change and available at a decent price, who's to say I won't rejoin?
Click to expand...

True. Nods. Them Lightnings should wreak some havoc with the third voltage rail for memory volts. The waiting is killing me too.


----------



## Jack Mac

Imagine Hawaii at 1300-1400MHz, that would be amazing..


----------



## Clexzor

Hey all, picked up an Asus R9 290 reference at Microcenter today for $460. I could have gotten a 280X, but since I'm on 1440p and planned on adding another at some point, I figured the 512-bit bus and 4GB were worth the extra coin....anyway, did I get a decent buy at $460? I see Newegg is at like $530.

Also, I don't plan on running it overclocked much since I mostly play MMOs etc. I can downvolt it at stock to 1.16v.

I also tried maxing the card out (lol) several times at 1.4v. Was this safe??? I hear they vdroop pretty hard. I managed to keep temps below 85c, and the VRMs never went past 69c.
Max I could get was 1145/1600, so it's not the best overclocker lol; ASIC score was 81%.

The fan noise doesn't bother me as I keep it mostly below 40%, and it doesn't break 85c at 1.25v; max was 89c at 1.4v.


----------



## sugarhell

Quote:


> Originally Posted by *Jack Mac*
> 
> Imagine Hawaii at 1300-1400MHz, that would be amazing..


That's low. Ref 290Xs already do 1300-1400ish


----------



## chiknnwatrmln

Quote:


> Originally Posted by *sugarhell*
> 
> Thats a lie
> 
> Thats low. Already ref 290x do 1300-1400ish


Can't tell if serious or not...

If so, 1300 MHz is really uncommon, especially so for reference. I've not even heard of any Hawaii chips doing over 1350 without extreme cooling solutions.

Most Hawaii chips, in my experience, will do 1150-1250 MHz on either water or air.


----------



## sugarhell

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Can't tell if serious or not...
> 
> If so, 1300 MHz is really uncommon, especially so for reference. I've not even heard of any Hawaii chips doing over 1350 without extreme cooling solutions.
> 
> Most Hawaii chips, in my experience, will do 1150-1250 MHz on either water or air.


Are you sure? I am only saying that if a 290X Lightning can do only 1300-1400 then it's kinda low, because experienced OCers have already hit 1350-1370 with the ref PCB. And this is on water


----------



## ArchieGriffs

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> My system completed!
> I posted this up on reddit, and someone was saying that my
> 
> __
> https://www.reddit.com/r/1w6nwq/build_complete_the_dark_knight/cezk6cl
> (750 watts). With how he's talking, I was wondering if someone else can read it and try to make sense out of it?


_But where is your PSU_? So confused. Sick looking rig, the case looks massive though.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ArchieGriffs*
> 
> _But where is your PSU_? So confused. Sick looking rig, the case looks massive though.


http://pcpartpicker.com/p/2HN9w

it's in the OP on reddit.


----------



## tsm106

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> My system completed!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I posted this up on reddit, and someone was saying that my
> 
> __
> https://www.reddit.com/r/1w6nwq/build_complete_the_dark_knight/cezk6cl
> (750 watts). With how he's talking, I was wondering if someone else can read it and try to make sense out of it?


I didn't follow what was going on at first lol. Had to read it a few times.

About the only sense I made out of it was to ignore whomever that was on reddit. Your psu is perfectly suited to your build imo and with a lil bit of room to spare. Ya don't want to run psus near their limits 24/7, it makes for short lived psus.


----------



## Thorteris

Will my PSU be able to handle a single R9 290X? I know it isn't the best, but I wanted to know if it's good enough for an i7-4770K at 4.4GHz and an OC'd R9 290X.


----------



## Redvineal

Hey all. I'm really close to pulling the trigger on a dual CF setup. Will my AX860 PSU work well for 2 x R9 290's + i5 4670K with mild OC? I don't want a "sufficient" PSU that I always have to worry if I'm pulling too many W. I'd rather trade up now and have peace of mind.

Next question, of course, is what do I get if the AX860 will be too close for comfort with the setup above?

Thanks.


----------



## kizwan

Quote:


> Originally Posted by *Krusher33*
> 
> Apparently they were compared to the ones from other brands and they didn't fare as well.
> 
> But it's good to know yours was ok. I think I'll just use them, try it out and see for myself.


That is what I said to myself. Maybe they're doing it wrong.

With the stock cooler, I max at 53 - 54C on VRM1. With the water block + EK thermal pad, it goes up to 64C. Definitely VRM1 is not being cooled very well with the EK thermal pad.
Quote:


> Originally Posted by *Raephen*
> 
> Phobya 7W/mk pads also do the job admirably. I use them with my AC kryographics block.


Can you share your max temp (VRM1) when gaming, benching (e.g. Valley) or mining? The Phobya is available locally here. If it cools well, I'm thinking of getting one. The EK thermal pad is rated 3 - 5 W/mK.
Quote:


> Originally Posted by *syniad*
> 
> I'm not sure how accurate the gpu usage graphs are anymore but yeah my 2nd cards usage goes all over the place.
> 
> The actual core gpu temps average about 47-50 degrees on both cards and sometimes go as high as 60 if they have been under load a while.
> The only real worry for me is the VRM1 on the first card if its hitting over 100 degrees just in a benchmark after a long gaming session that's gonna get much worse. 70 on the 2nd is much more reasonable for me.
> I may try putting thermal paste on the pads as well when I get round to it, that may be accounting for some of the heat.
> 
> Score from 1200/1625 not sure if this is normal for 2 R9 290s at 1440p
> 
> 
> Spoiler: Warning: Spoiler!


Even 70 is too high with water cooling. They should be in the 40s range. You're probably mistakenly using a thinner thermal pad; that's why you're seeing 100C. If the Phobya 7W/mK thermal pad can cool better, get that one.


----------



## tsm106

Quote:


> Originally Posted by *Thorteris*
> 
> Will my PSU be able to handle a single r9-290x? I know it isn't the best but I wanted to know if its good enough for a i7-4770k at 4.4 ghz and a OC r9-290x.


Can't speak to the quality of your unit, but 750w is more than enough for a single GPU.

Quote:


> Originally Posted by *Redvineal*
> 
> Hey all. I'm really close to pulling the trigger on a dual CF setup. Will my AX860 PSU work well for 2 x R9 290's + i5 4670K with mild OC? I don't want a "sufficient" PSU that I always have to worry if I'm pulling too many W. I'd rather trade up now and have peace of mind.
> 
> Next question, of course, is what do I get if the AX860 will be too close for comfort with the setup above?
> 
> Thanks.


Your psu is generally on the low side, but depending on your overclocks (mild) you can skate by, especially given your cpu. With mild overclocks and your system running full tilt, you will probably dip into your psu's unlisted reserves. Your psu is sufficient, not overkill. If you want no lingering doubts, a psu around 1000w and up covers that.


----------



## Redvineal

Quote:


> Originally Posted by *tsm106*
> 
> Your psu is generally on the low side but depending on your overclocks(mild) you can skate by especially given your cpu. With mild overclocks and your system running full tilt, you will probably dip into your psus unlisted reserves. Your psu is sufficient not overkill. If you want to not have lingering doubts, a psu around 1000w and up cover that.


Thanks for the info. Not what I was hoping to hear, but glad I found out sooner than later. Isn't it generally bad to run a PSU above a certain percentage for long periods of time (read "mining")?

Now what about tri-fire...1200 or 1500?


----------



## tsm106

Quote:


> Originally Posted by *Redvineal*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Your psu is generally on the low side but depending on your overclocks(mild) you can skate by especially given your cpu. With mild overclocks and your system running full tilt, you will probably dip into your psus unlisted reserves. Your psu is sufficient not overkill. If you want to not have lingering doubts, a psu around 1000w and up cover that.
> 
> 
> 
> Thanks for the info. Not what I was hoping to hear, but glad I found out sooner than later. Isn't it generally bad to run a PSU above a certain percentage for long periods of time (read "mining")?
> 
> Now what about tri-fire...1200 or 1500?
Click to expand...

Yea it is generally bad to run a psu at peak 24/7, but then again it is generally bad to run anything not intended for serious duty, 24/7.

Moving up to trifire you will want something in the 1300w range and up, depending on your cpu draw and loop size. If the latter two are bigger you'll want to compensate.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Redvineal*
> 
> Thanks for the info. Not what I was hoping to hear, but glad I found out sooner than later. Isn't it generally bad to run a PSU above a certain percentage for long periods of time (read "mining")?
> 
> Now what about tri-fire...1200 or 1500?


Heavy clocks = 1500w

Mild clocks = 1200w (just)

i'd rather make sure and run the 1500w

Tsm106 would know best though


----------



## tsm106

^^It's not a bad idea to just jump right to 1500w-ish hehe.

**The cost difference from 1300w to 1500w is like 80 bucks, but the cost of buying a 1300w and then realizing you need 1500w... it's a lot more expensive than 80 bucks, and it bruises your woulda-shoulda-coulda 20/20 hindsight...


----------



## ArchieGriffs

Quote:


> Originally Posted by *Sgt Bilko*
> 
> http://pcpartpicker.com/p/2HN9w
> 
> it's in the OP on reddit.


Nononono I was wondering where it was in the case. I shouldn't have been lazy, all I had to do was look where the cables led to. Never seen a PSU mounted in the front like that, plus it was covered up. Anyways too neat a case to comprehend hehe.


----------



## Redvineal

Quote:


> Originally Posted by *tsm106*
> 
> ^^It's a not a bad idea to just jump right to 1500w ish hehe.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> **The cost difference from 1300w to 1500w is like 80 bucks, but the cost from buying a 1300w and then realizing you need 1500w... it's a lot more expensive than 80 bucks and it bruises your woulda shoulda coulda 20/20 hindsight...


Thanks for all the great responses tsm and Bilko. I'm sort of thinking along these lines now. Why stop at 1200 when I might need more down the road? I thought 860 was more than enough a mere 4 months ago...

Any tips on good places to sell slightly used PSU's and graphics cards?


If I don't sell my current stuff soon after tying up money in new stuff, my wife is going to expect purses!


----------



## the9quad

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Heavy clocks = 1500w
> 
> Mild clocks = 1200w (just)
> 
> i'd rather make sure and run the 1500w
> 
> Tsm106 would know best though


I can run trifire at 1135 and 1375 with my power supply.


----------



## Sgt Bilko

Quote:


> Originally Posted by *the9quad*
> 
> I can run trifire at 1135 and 1375 with my power supply.


What wattage are you pulling like that?


----------



## the9quad

Quote:


> Originally Posted by *Sgt Bilko*
> 
> What wattage are you pulling like that?


who knows, lol. Probably close enough to not do it, so I normally run at stock. I'd also recommend more than a 1200 watt for tri or quad, to be safe.


----------



## Sgt Bilko

Quote:


> Originally Posted by *the9quad*
> 
> who knows, lol. Probably close enough to not do it, so I normally run at stock. I'd also recommend more than a 1200 watt for tri or quad, to be safe.


Just to be on the safe side, yeah. I've pulled 950w from 2 x 290's at +100mV with my FX rig, so I'd have a hard time recommending anyone use a 1200w PSU for tri-fire regardless of AMD/Intel CPU.

Better to be prepared.............and on that note, I'm kinda looking at these Kogan 4k TV's that are coming out, sub $1k.....gotta be a reason for that, right?

I mean, I'd probably never watch TV on it, but for gaming it could work

EDIT: Ah, nvm.....no DisplayPort, and I'm pretty sure that 4K needs HDMI 1.2 or something, I can't remember now.


----------



## BackwoodsNC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> EDIT: Ah, nvm.....no Displayport and i'm pretty sure that 4K need HDMI 1.2 or something, i can't remember now.


Yeah, it will only be 30Hz with the current HDMI standard. If it has DP 1.2, then it will be able to do 60Hz, and at a very short distance (cable length).
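The 30Hz limit is down to link bandwidth. A quick sanity check in Python (ignoring blanking intervals, so real requirements are somewhat higher; the effective-throughput figures are the commonly cited post-encoding numbers for each link):

```python
# Why 4K@60 doesn't fit over HDMI 1.4 but does over DP 1.2:
# compare raw pixel data rate against each link's effective throughput.
# Blanking intervals are ignored, so real requirements are a bit higher.

def pixel_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s for a given mode."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

HDMI_1_4_GBPS = 8.16   # 340 MHz TMDS x 3 channels x 8 data bits
DP_1_2_GBPS = 17.28    # 4 lanes of HBR2 after 8b/10b encoding

for hz in (30, 60):
    need = pixel_rate_gbps(3840, 2160, hz)
    print(f"4K@{hz}: {need:.1f} Gbit/s "
          f"(HDMI 1.4 ok: {need <= HDMI_1_4_GBPS}, "
          f"DP 1.2 ok: {need <= DP_1_2_GBPS})")
```

4K@30 needs roughly 6 Gbit/s and squeezes into HDMI 1.4's 8.16 Gbit/s; 4K@60 needs roughly 12 Gbit/s, which only DP 1.2 (or the then-new HDMI 2.0) can carry.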

Where is the 290X Lightning? It seems as though it is taking forever. Might just grab a 780 Classy for gaming since it's taking its sweet time to retail.


----------



## Forceman

Quote:


> Originally Posted by *sugarhell*
> 
> Are you sure? I am only saying that if 290x lightning can do only 1300-1400 then its kinda low because EXPERIENCE ocers already hit 1350-1370 with ref pcb. And this is with water


If overclockers are hitting 1350+ on water, they aren't posting about it here. There have been a couple (like 2 or 3) people hit 1300+, but almost all the results I've seen in the benchmark threads are in the 1250 range. HWBot shows something similar.


----------



## Falkentyne

Quote:


> Originally Posted by *Mr357*
> 
> I don't know that increasing the Vcore does anything to the memory voltage, but at the least higher Vcore allows for higher memory clocks.


Increasing core voltage also increases voltage to the IMC; this allows the memory to run at your overclocked speed when the system is **IDLE**. It's the low idle voltage combined with high memory clocks that causes black screens on the desktop.

You can avoid that problem and push the memory to its upper limits by using unofficial overclocking mode WITHOUT PowerPlay support in Afterburner, which forces 3D clocks (with 3D voltages) on the desktop. That stops the 0.9xx idle voltage with full-speed memory that leads to the black screen.


----------



## Sgt Bilko

Quote:


> Originally Posted by *BackwoodsNC*
> 
> Yeah, it will only be 30Hz with the current HDMI standard. If it has DP1.2, then it will be able to 60Hz and at a very short distance(cable length).
> 
> Where is the 290x Lightning? It seems as though it is taking forever. Might just grab a 780 classy for gaming since it is taking it's sweet time to retail.


Thanks for the info. I will check it when it launches, but I'm pretty sure it's just going to launch with 4 x HDMI ports from what I've seen so far.

Guess I could always go rob a few banks for one of these though, huh
Spoiler: Ultimate 4k inside!!



http://www.amazon.com/dp/B00CMEN95U/ref=tsm_1_fb_lk?tag=nerdgsm-20


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Thanks for the info, I will check it when it launches but i'm pretty sure it's just going to launch with 4 x HDMI ports from what i've seen so far.
> 
> Guess i could always go rob a few banks for one of these though huh
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Ultimate 4k inside!!
> 
> 
> 
> http://www.amazon.com/dp/B00CMEN95U/ref=tsm_1_fb_lk?tag=nerdgsm-20


Actually, HDMI 2.0 supports 60Hz as well, not only DP 1.2


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> Actually HDMI 2.0 supports 60Hz also not only DP 1.2


The 290 series only supports HDMI 1.4a.......not sure if that can be changed with a driver or if it's hardware-based, though.


----------



## ArchieGriffs

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Thanks for the info, I will check it when it launches but i'm pretty sure it's just going to launch with 4 x HDMI ports from what i've seen so far.
> 
> Guess i could always go rob a few banks for one of these though huh
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Ultimate 4k inside!!
> 
> 
> 
> http://www.amazon.com/dp/B00CMEN95U/ref=tsm_1_fb_lk?tag=nerdgsm-20


Those reviews had me laughing my ass off. If you haven't read them, do it now, especially the first one.

Edit: After reading more, I literally laughed to tears. If you haven't read the reviews, do it now, and make sure there isn't a baby nearby that you might wake up.


----------



## bond32

Quote:


> Originally Posted by *Falkentyne*
> 
> Increasing core voltage increases voltage to the IMC--this will allow the memory to run at your overclocked speed when the system is **IDLE**---it's low idle voltage combined with high memory clocks that cause black screens on the desktop.
> 
> You can avoid that problem and push the memory to its upper limits by using unofficial overclocking mode WITHOUT powerplay support in afterburner, which forces 3D clocks on the desktop, with 3d voltages (which stops the 0.9xx idle voltage with full speed memory=black screen)


That might prevent black screens on the desktop, but it doesn't solve the black screens in game, at least for me.


----------



## Durvelle27

Yeah, they have HDMI 1.4a, but they're able to push the bandwidth needed for 4K@60Hz. The cable was the main limitation. You also have DP-to-HDMI as an option.


----------



## PillarOfAutumn

Sorry if this has been answered before. I just did a clean install of windows on my PC. I wanted to go back to my OC of 1150/1450. I tried setting the power to +50% in CCC but it just resets itself. Any ideas on how to go about fixing this without installing MSI AB? I'm using Trixx to OC.


----------



## mojobear

Quote:


> Originally Posted by *the9quad*
> 
> who knows, lol. Probably close enough to not do it, so I normally run at stock. I'd also recommend more than a 1200 watt for tri or quad, to be safe.


I'm also running trifire 290s under water. Stock, with the i7 4770K in sig, it pulls 870 - 880 watts from the wall in games like BF3/4 and Crysis 3... so at 93% efficiency (worst-case scenario) the system requires about 820W from the PSU...

OC'd to 1150/1350 +75mV I get around 970 watts from the wall, so at the same efficiency the system draws about 900W from the PSU.

Mind you, it might be because it's under water that there's less leakage from the GPUs due to lower temps.
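The efficiency arithmetic here can be sketched in a few lines (assuming the quoted 93% efficiency figure; the wattages are the watt-meter readings from the post):

```python
# Estimate the DC load a PSU is delivering from a wall-side watt-meter
# reading: wall watts * efficiency = watts actually fed to the components.

def psu_dc_load(wall_watts: float, efficiency: float) -> float:
    """Estimated DC-side load in watts."""
    return wall_watts * efficiency

stock = psu_dc_load(880, 0.93)   # ~818 W at stock clocks
oced = psu_dc_load(970, 0.93)    # ~902 W at 1150/1350 +75mV
print(f"stock: {stock:.0f} W, overclocked: {oced:.0f} W")
```

The same relation run the other way (wall = DC load / efficiency) is why a ~900 W load is already uncomfortably close to a 1000 W unit's rating.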


----------



## tsm106

Quote:


> Originally Posted by *mojobear*
> 
> Quote:
> 
> 
> 
> Originally Posted by *the9quad*
> 
> who knows, lol. Probably close enough to not do it, so I normally run at stock. I'd also recommend more than a 1200 watt for tri or quad, to be safe.
> 
> 
> 
> I'm also running trifire 290s under water. Stock, with the i7 4770K in sig, it pulls 870 - 880 watts from the wall in games like BF3/4 and Crysis 3... so at 93% efficiency (worst-case scenario) the system requires about 820W from the PSU...
> 
> OC'd to 1150/1350 +75mV I get around 970 watts from the wall, so at the same efficiency the system draws about 900W from the PSU.
> 
> Mind you, it might be because it's under water that there's less leakage from the GPUs due to lower temps.

I almost draw more power than you with two stock aircooled cards with my cpu at 5ghz iirc. Is your rig working right?

I run dual psu and the cards were connected to PSU1 which for the test only powers cpu/mb/dual gpu. Everything else is connected to my 2nd PSU.

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> Two cards on psu with my cpu and nothing else connected, ie. no drives, no loop, nothing else. Running 3dm11 extreme at stock with +50 on air, so it's throttling a bit as expected. 903w x .89 at that wattage = *803w.* This number should blow up with blocks and serious overclocking. Ya need to remember that on the stock bios PT is trying to keep the tdp of the cards very very low.


----------



## esqueue

Quote:


> Originally Posted by *AlDyer*
> 
> Yeah, because it was "designed" to run at 95 C and throttle


Yeah, some people actually believed that they were purposefully designed that way.
Quote:


> Originally Posted by *tsm106*
> 
> Again, more noobery. It's obvious you want a cool setup and the 290s are not that. Is it because the cooler just plain sucks or is maybe because AMD decided in their wisdom to keep the card in a specific temp range on purpose? Maybe you know more than AMD? You know,
> 
> 
> 
> 
> 
> 
> 
> if you had a better cooler, it would take the card out of this specific range and now make the card expend more energy to maintain the lower temp and it also forces the card to go thru many more heat cycles in a given timeframe. Just because everyone thinks they know what the hell they think in their head is right, doesn't make it so.


----------



## Redvineal

Soooo, I'm seeing these XFX DD R9 290's. One is standard, the other is "Black Edition". Besides the obvious minuscule overclock the Black Edition offers, are there any other differences?


----------



## amped24

Will be joining the club Wednesday


----------



## Sgt Bilko

Quote:


> Originally Posted by *ArchieGriffs*
> 
> Those reviews had me laughing my ass off. If you haven't read them, do it now.. especially the first one.
> 
> Edit: After reading more, I literally laughed to tears. If you haven't read the reviews do it now, and make sure there isn't a baby nearby that you might wake up.


I have, they are just awesome lol

Quote:


> Originally Posted by *Redvineal*
> 
> Soooo, I'm seeing these XFX DD R9 290's. One is standard, the other is "Black Edition". Besides the obvious minuscule overclock the Black Edition offers, are there any other differences?


Nope, not a thing. I went for the Black Edition cards (only $10 more here) for a "chance" at some better cards.

If the price difference is minuscule, then get the Black Editions, if that's the way you wanna go









You will have some smexy looking cards though


----------



## Heinz68

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Just to be on the safe side, yeah. I've pulled 950w from 2 x 290's at +100mV with my FX rig, so I'd have a hard time recommending a 1200w PSU for Tri-Fire regardless of AMD/Intel CPU.
> 
> Better to be prepared.............and on that note, I'm kinda looking at these Kogan 4k TV's that are coming out, sub $1k.....gotta be a reason for that right?
> 
> i mean, i'd probably never watch TV on it but for gaming it could work
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: Ah, nvm.....no Displayport and i'm pretty sure that 4K need HDMI 1.2 or something, i can't remember now.


I'm looking for a 50 inch 4K UHD TV/monitor and was trying to find out more about the Kogan TV's connectivity; it looks like it doesn't have DisplayPort 1.2a, which only a few TVs have.
Current graphics cards only support 4K at 60Hz over DisplayPort, or 30Hz over HDMI 1.4. The latest HDMI 2.0, now on some recent 4K TVs, also supports 60Hz, but no graphics card has an HDMI 2.0 connection yet.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Heinz68*
> 
> I'm looking for a 50 inch 4K UHD TV/monitor and was trying to find out more about the Kogan TV's connectivity; it looks like it doesn't have DisplayPort 1.2a, which only a few TVs have.
> Current graphics cards only support 4K at 60Hz over DisplayPort, or 30Hz over HDMI 1.4. The latest HDMI 2.0, now on some recent 4K TVs, also supports 60Hz, but no graphics card has an HDMI 2.0 connection yet.


Well, I was just searching around, and on Club3D's website they claim HDMI 2.0 support for the reference 290/X, so that would mean they all have it









We might just be in luck here: http://www.club-3d.com/index.php/products/reader.en/product/radeon-r9-290x.html (about 1/3 down)

"Display Connectors

Everything you need for your PC

Built in HDMI 1.4a with *2.0 support*, DisplayPort 1.2 and Dual DVI-D"


----------



## Heinz68

*XFX R9 290X Double Dissipation Edition CrossFire Review at [H]ard|OCP*, January 26, 2014
Quote:


> When we look at performance versus NVIDIA GeForce GTX 780 Ti SLI we also find the XFX R9 290X DD CrossFire coming out on top. This we did not expect. We figured GTX 780 Ti SLI might provide some faster performance in some cases. However, it really didn't. When playing demanding games at high resolutions and aiming for those high in-game settings R9 290X CrossFire pulled through as the faster solution. When we cranked up AA settings, again R9 290X CrossFire pulled through. There wasn't one scenario in our gaming experiences that GTX 780 Ti SLI was superior in gameplay experience to R9 290X CrossFire configuration from XFX.
> 
> We have a lot of good to say about the XFX R9 290X Double Dissipation Edition video cards. If we could give an award to the "best looking" video card, it would be these from XFX.


The above quote is a small part of page 11, Conclusion & The Bottom Line.

BTW, I'll be joining the club in about 2 days with Sapphire R9 290X 4GB Tri-X OC Crossfire.
I hope my Corsair HX1000 80 Plus can handle voltage overclocking with 2 cards, any input on that?


----------



## Heinz68

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well i was just searching around and on Club3D's website they claim HDMI 2.0 support for the Ref 290/x so that would mean that all have it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> We might just be in luck here: http://www.club-3d.com/index.php/products/reader.en/product/radeon-r9-290x.html (about 1/3 down)
> 
> "Display Connectors
> 
> Everything you need for your PC
> 
> Built in HDMI 1.4a with *2.0 support*, DisplayPort 1.2 and Dual DVI-D"


I know about that; somebody pointed out that Club3D site about 2 months ago. I emailed their support and the reply was "no support for HDMI 2.0". Too bad I deleted the email. You can try yourself and point to the website quote, which is a little misleading.

The HDMI 2.0 standard was finalized on September 4, 2013, too late for the latest graphics cards already in production. Hopefully some AIB partner will add it on, or we'll have to wait for the next generation of cards.

EDIT
BTW, the Kogan TV you posted about only lists 4 HDMI connections. Not all TVs support HDMI 2.0, and they always point it out when they do.

I think a better choice is the Vizio 50 inch 4K UHD at $999.99 (MSRP), with full-array LED backlighting and support for the latest HDMI standard. Now we only have to wait and hope some graphics card will support it soon.


----------



## Raephen

Quote:


> Originally Posted by *kizwan*
> 
> Can you share your max temp (VRM1) when gaming or benching (e.g. Valley) or mining? The Phobya is available locally here. If it cool well, I'm thinking getting one. The EK thermal pad is rated 3 - 5 W/mK.
> Even 70 is too high with water cooling. They should be in 40s range. You probably mistakenly using thinner thermal pad, that's why you're seeing 100C. If Phobya 7W/mK thermal pad can cool better, get that one.


I'm not much of a bencher, nor have I clocked my 290 heavily (1. I don't need it yet, and 2. my VRAM doesn't like to be clocked (?), so I run a simple 1100/1250 with no voltage increase and +50% power limit).

I did, however, pick up 3dmark last year when it was on sale on Steam.

I ran FS-E to heat the card up a bit from a cold boot, and then just ran the full gamut of tests.

GPU max: 47C
VRM1 max: 48C
VRM2 max: 41C

These results seem consistent with what I've seen after a good gaming session with a heavily modded Skyrim with .ini tweaks.

I should mention that I did use a dab of TIM when installing the pads on the VRMs. But mind you, you only need a very small amount, because as with CPU coolers: too much is almost as bad as, if not worse than, too little or none at all.

TIM is just there to take the place of air in spots that aren't entirely smooth (air is a very bad thermal conductor).

The way I did it on my VRMs: I took the cap off my MX4 (wonderful stuff), touched each VRM with the end of the spout, and whatever TIM stuck on would do. I then placed my pads, put a dab of MX4 on the VRAM chips*, and installed the block.

* Aquacomputer Kryographics blocks don't use thermal pads on the VRAM, so there a dab of TIM is enough.


----------



## syniad

Quote:


> Originally Posted by *kizwan*
> 
> That is what I said to myself. Maybe they doing it wrong.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> With the stock cooler, I max at 53 - 54C on VRM1. With water block + EK thermal pad, it go up to 64C. Definitely the VRM1 not cooling very well when using EK thermal pad.
> Can you share your max temp (VRM1) when gaming or benching (e.g. Valley) or mining? The Phobya is available locally here. If it cool well, I'm thinking getting one. The EK thermal pad is rated 3 - 5 W/mK.
> Even 70 is too high with water cooling. They should be in 40s range. You probably mistakenly using thinner thermal pad, that's why you're seeing 100C. If Phobya 7W/mK thermal pad can cool better, get that one.


Wouldn't be surprised if I had used the wrong pads; I do stupid stuff like that often, lol. I'll order some of the Phobya 7W/mK pads; as you say, they're better than EK's anyway. The Phobya 7W/mK pads seem to come in 3 thicknesses (0.5-1.5mm); does anyone know which size(s) are correct?


----------



## Raephen

Quote:


> Originally Posted by *syniad*
> 
> Wouldn't be surprised if I had used the wrong pads; I do stupid stuff like that often, lol. I'll order some of the Phobya 7W/mK pads; as you say, they're better than EK's anyway. The Phobya 7W/mK pads seem to come in 3 thicknesses (0.5-1.5mm); does anyone know which size(s) are correct?


I know from experience that I used a 1mm one for my AC block on the VRM2 spot. Stock was 0.5mm, but possibly due to a machining error it didn't quite make full contact. When I contacted Sven from Aquacomputer, he mentioned other customers had similar issues, and also that thicker might suffice if the pad isn't too stiff.

The Phobyas are pretty pliable, and I seem to recall EK blocks use 1mm for the VRMs and 0.5mm for the VRAM (or was it the other way around? Might want to check EK's site if it's listed...)


----------



## rdr09

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> My system completed!
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> I posted this up on reddit, and someone was saying that my
> 
> __
> https://www.reddit.com/r/1w6nwq/build_complete_the_dark_knight/cezk6cl
> (750 watts). With how he's talking, I was wondering if someone else can read it and try to make sense out of it?


The thing is a beauty. What's the ID of the tubing?


----------



## Dispersion

This gem just popped up at a Swedish retailer. I wonder how well it would cool the R9 290/290X; it appears to be a pretty solid chunk of aluminium cooling the VRMs/RAM chips.









Link for those of you interested, not much info available though...


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well i was just searching around and on Club3D's website they claim HDMI 2.0 support for the Ref 290/x so that would mean that all have it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> We might just be in luck here: http://www.club-3d.com/index.php/products/reader.en/product/radeon-r9-290x.html (about 1/3 down)
> 
> "Display Connectors
> 
> Everything you need for your PC
> 
> Built in HDMI 1.4a with *2.0 support*, DisplayPort 1.2 and Dual DVI-D"


Read earlier message


----------



## kizwan

Quote:


> Originally Posted by *Raephen*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Can you share your max temp (VRM1) when gaming or benching (e.g. Valley) or mining? The Phobya is available locally here. If it cool well, I'm thinking getting one. The EK thermal pad is rated 3 - 5 W/mK.
> 
> 
> 
> I'm not much of a bencher, nor have I clocked my 290 heavily (1. I don't need it yet, and 2. my VRAM doesn't like to be clocked (?), so I run a simple 1100/1250 with no voltage increase and +50% power limit).
> 
> I did, however, pick up 3dmark last year when it was on sale on Steam.
> 
> I ran FS-E to heat the card up a bit from a cold boot, and then just ran the full gamut of tests.
> 
> GPU max: 47C
> VRM1 max: 48C
> VRM2 max: 41C
> 
> These results seem consistent with what I've seen after a good gaming session with a heavily modded Skyrim with .ini tweaks.
> 
> I have to mention, I did use a dab of TIM when installing the pads on the VRM's. But mind you, you just need a very small amount, because like with CPU coolers: too much is almost as bad, if not worse, than too little or none at all.
> 
> TIM is just there to take the place of air in spots that aren't entirely smooth (air is a very bad thermal conductor).
> 
> The way I did it on my VRM's is I took the cap of my MX4 (wonderful stuff) and touched the VRM bit with the end of the spout and whatever TIM stuck on would do. I then placed my pads, put a dab of MX4 on on the VRAM chips* and installed the block.
> 
> * Aquacomputer Kryographics blocks don't use thermal padding on the VRAM, so there a dab of TIM is enough.

I use the same way to apply very small amount of TIM on the VRMs.









I have 4 tubes of MX4, 3 of them still sealed and untouched. I bought them a couple of years ago. Somehow it doesn't work well for me. The Shin-Etsu G751, which has half the thermal conductivity of MX4, worked well for me. That's why I hesitate to use MX4; I'll only use Shin-Etsu or IC Diamond 7 TIM.
Quote:


> Originally Posted by *Raephen*
> 
> Quote:
> 
> 
> 
> Originally Posted by *syniad*
> 
> Wouldn't be surprised if I had used the wrong pads; I do stupid stuff like that often, lol. I'll order some of the Phobya 7W/mK pads; as you say, they're better than EK's anyway. The Phobya 7W/mK pads seem to come in 3 thicknesses (0.5-1.5mm); does anyone know which size(s) are correct?
> 
> 
> 
> I know from experience that I used a 1mm one for my AC block on the VRM2 spot. Stock was 0,5, but possibly due to machining error it didn't quite make fully enough contact. In contact with Sven from Aquacomputer, he mentioned other customers had similar issues, and also that thicker might suffice, if not too stiff a pad.
> 
> The Phobya's are pretty pliable, and I seem to recall EK blocks use 1mm for VRM and 0,5mm for VRAM (or was it the other way around? might want to check on EK's site if it's listed...)

1mm for the VRMs & 0.5mm for the memory (VRAM). @syniad, if 1mm isn't available, you can always get 1.5mm. No problem using 1.5mm there.


----------



## syniad

Quote:


> Originally Posted by *kizwan*
> 
> 1mm for the VRMs & 0.5mm for the memory (VRAM). @syniad, if 1mm not available, you can always get 1.5mm. No problem using 1.5mm there.


Excellent thank you









On a side note, I take it the thermal paste that comes with EK's waterblocks is non-electrically conductive? Just want to make sure it's safe to use on the VRAM.


----------



## Gumbi

Quote:


> Originally Posted by *Dispersion*
> 
> 
> 
> This gem just popped up at a swedish retailer, wonder how well it would cool the R9 290/290X's, appears to be a pretty solid chunk of aluminium cooling the VRMs/RAM chips
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Link for those of you interested, not much info available though...


Wow, super sexy... That's an Xtreme IV. I haven't seen any info about it anywhere else, though. Veeery nice VRM/VRAM cooling.


----------



## gpson

Has anyone succeeded in undervolting an R9 290X for use in Linux? On Windows there are tweaking tools, but in Ubuntu the card can't be undervolted with any software, and it's eating a lot of watts...

Gigabyte R9 290X
vBIOS VER015.041.000.001.003745

I saved the original BIOS files to USB using the atiflash utility. How could I edit the ROMs using a hex editor? Which values should I change?
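One practical note for anyone hand-editing a dump like this: a vBIOS image is a legacy PCI expansion ROM, so after any hex edit the image checksum must be re-patched or the flash may be rejected (byte 0x02 gives the image length in 512-byte blocks, and the bytes of the image must sum to zero mod 256 per the PCI spec). A rough sketch of that last step; it knows nothing about where Hawaii's voltage tables actually live, and the demo image below is synthetic (in practice you'd load your atiflash backup):

```python
# Verify/fix the legacy PCI expansion-ROM checksum after hex-editing a vBIOS
# dump. Per the PCI spec: the image starts with the 0x55 0xAA signature,
# byte 0x02 is the image length in 512-byte blocks, and every byte of the
# image must sum to 0 mod 256.

def fix_checksum(rom: bytearray) -> bytearray:
    assert rom[0] == 0x55 and rom[1] == 0xAA, "not an expansion-ROM image"
    size = rom[2] * 512                      # image length in bytes
    image = rom[:size]                       # slicing copies; rom is untouched
    delta = sum(image) % 256
    image[-1] = (image[-1] - delta) % 256    # adjust the checksum byte so the
    return image                             # whole image sums to zero mod 256

# Synthetic one-block image standing in for a real dump
# (in practice: rom = bytearray(open("backup.rom", "rb").read())).
rom = bytearray(512)
rom[0], rom[1], rom[2] = 0x55, 0xAA, 1
rom[0x40] = 0x7F                             # pretend we hex-edited a value here
patched = fix_checksum(rom)
assert sum(patched) % 256 == 0
```

The checksum byte's exact slot varies by vendor; patching the last byte of the image is the common convention, but compare against your untouched backup first.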


----------



## kizwan

Quote:


> Originally Posted by *syniad*
> 
> Excellent thank you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On a side note, I take it the thermal paste that comes with EK's waterblocks is non-electrically conductive? Just want to make sure it's safe to use on the VRAM.


The thermal pad is non-electrically conductive.


----------



## Heinz68

Quote:


> Originally Posted by *Durvelle27*
> 
> Yeah, they have HDMI 1.4a, but they're able to push the bandwidth needed for 4K@60Hz. The cable was the main limitation. You also have DP-to-HDMI as an option.


You are definitely wrong; maybe it's a good idea to read a little about it before posting, because there is always a danger somebody else will repeat the misleading advice, or, even more dangerous, buy something that will not work.

HDMI 1.4 can't push 4K@60Hz, and the cable is (was) not the limitation, since the cables and connectors were not updated for HDMI 2.0. It's all about the hardware and electronics. Sony had a couple of models that were HDMI 2.0-ready via firmware update; also, HDMI 2.0 is not only about bandwidth.
At present there is no DisplayPort 1.2a to HDMI 2.0 adapter.
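A quick back-of-the-envelope check supports this (counting active pixels only and assuming 8-bit RGB; a real link also carries blanking overhead): 4K60 needs roughly 12 Gbit/s of video data, while HDMI 1.4's ~10.2 Gbit/s link carries only about 8.16 Gbit/s of payload after 8b/10b encoding, which is why it tops out at 30Hz:

```python
# Back-of-envelope: video bandwidth needed for 4K @ 60 Hz with 8-bit RGB,
# counting active pixels only (a real link also carries blanking intervals).

def video_gbps(width: int, height: int, hz: int, bits_per_pixel: int = 24) -> float:
    """Raw active-pixel data rate in Gbit/s."""
    return width * height * hz * bits_per_pixel / 1e9

need_60 = video_gbps(3840, 2160, 60)   # ~11.9 Gbit/s
need_30 = video_gbps(3840, 2160, 30)   # ~6.0 Gbit/s
hdmi_14_payload = 10.2 * 0.8           # 8b/10b encoding leaves ~8.16 Gbit/s usable

print(f"4K60 needs {need_60:.1f} Gbit/s, 4K30 needs {need_30:.1f} Gbit/s; "
      f"HDMI 1.4 payload is ~{hdmi_14_payload:.2f} Gbit/s")
```

HDMI 2.0 raises the link to 18 Gbit/s, which is what makes 4K60 fit.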


----------



## mojobear

Quote:


> Originally Posted by *tsm106*
> 
> I almost draw more power than you with two stock aircooled cards with my cpu at 5ghz iirc. Is your rig working right?
> 
> I run dual psu and the cards were connected to PSU1 which for the test only powers cpu/mb/dual gpu. Everything else is connected to my 2nd PSU.


Whoa... that's really interesting. My rig is working right.

My cores were loaded up to 100% on all three in BF3/Crysis 3. Also with mining, which for sure loads them up 100%: at stock I'm pulling 870 W from the wall. It might be that water makes a big difference, as some reviews have shown that lowering temps has a pretty big impact on R9 290 power draw. I'm also probably not throwing as many volts into the CPU as you are, as I'm at 4.75 GHz with 1.25V.

The only way the readings could be off is if the watt meter is off, but I doubt that, as it was new and a friend also brought his watt meter over and got the same readings.


----------



## Heinz68

Quote:


> Originally Posted by *Dispersion*
> 
> 
> 
> This gem just popped up at a swedish retailer, wonder how well it would cool the R9 290/290X's, appears to be a pretty solid chunk of aluminium cooling the VRMs/RAM chips
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Link for those of you interested, not much info available though...


Read Tom's Hardware's "Fixing The Radeon R9 290 With Arctic's Accelero Xtreme III" review and installation guide; I also remember at least a couple of people in this thread used that cooler.

Oops, I just checked your retail link and that's the new Arctic Cooling Accelero Xtreme IV. I don't know anything about it and can't find any reviews on Google yet. Anyway, it looks like a beast of a cooler.


----------



## jerrolds

It'll work well enough: greatly lower temps ([email protected] +100mV w/ 20C ambient and good airflow) as well as making the noise much more tolerable, silent by comparison. The only problem is that VRM1 temps will be hard to keep in check if those dinky sinks with thermal tape are used; most of the time overclocks won't be limited by core temp but by VRM1 temps, I would think.


----------



## tsm106

Quote:


> Originally Posted by *mojobear*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I almost draw more power than you with two stock aircooled cards with my cpu at 5ghz iirc. Is your rig working right?
> 
> I run dual psu and the cards were connected to PSU1 which for the test only powers cpu/mb/dual gpu. Everything else is connected to my 2nd PSU.
> 
> 
> 
> Woha...thats really interesting. My rig is working right
> 
> 
> 
> 
> 
> 
> 
> My cores were loaded up to 100% on all three in BF3/Crysis 3. Also with mining, which for sure loads them up 100%: at stock I'm pulling 870 W from the wall. It might be that water makes a big difference, as some reviews have shown that lowering temps has a pretty big impact on R9 290 power draw. I'm also probably not throwing as many volts into the CPU as you are, as I'm at 4.75 GHz with 1.25V.
> 
> The only way the readings could be off is if the watt meter is off...but I doubt it as it was new and a friend also bought his watt meter in and got the same readings.

You are getting two-card power consumption numbers with three cards, man. Even [H] is getting numbers closer to what I got at stock. Hell, my 7970s draw over 1300w in trifire benching, but your tri Hawaii draws less than my dual Hawaii? It doesn't add up imo. **Btw, my two-card numbers are with the GPUs on air and the CPU under water.

http://www.hardocp.com/article/2014/01/26/xfx_r9_290x_double_dissipation_edition_crossfire_review/10


----------



## Sonikku13

I ordered a Radeon R9 290X from Taiwan for mining purposes. Problem is, I haven't received my Radeon R9 290X yet. It's stuck in... _customs!_ Needless to say, I'm angry that it's taking this long. It's stuck in Chicago O'Hare International Airport. I ordered it on January 13th.


----------



## mojobear

Quote:


> Originally Posted by *tsm106*
> 
> You are getting two card power consumption numbers with three cards man. Even [H] is getting numbers closer to what I got stock. Hell my 7970s draw over 1300w in trifire benching but your tri hawaii draws less than my dual hawaii? It doesn't add up imo. **Btw, my two card numbers are with gpus on air and cpu under water.
> 
> http://www.hardocp.com/article/2014/01/26/xfx_r9_290x_double_dissipation_edition_crossfire_review/10


If you look at their system load for CrossFire 290X, it's about 707 W at full load on HardOCP.

I was getting full load on all three cards, not two. I'm fairly confident about this with Crysis 3 and BF3: reviews show good scaling with trifire and even quadfire, and MSI Afterburner was showing all 3 GPUs at 100%. To add to this, mining pegs my GPUs at 100%, with a hash rate consistent with 3 x 290s: about 820 kH/s per card, 2460 kH/s total.

As for my theory that my lower power consumption is due to water being a lot more power-efficient for R9 290s, I would point to this review (there was another one with similar results, but I can't find it):

http://www.legitreviews.com/nzxt-kraken-g10-gpu-water-cooler-review-on-an-amd-radeon-r9-290x_130344/5

Total system power consumption went down from 512W to 468W going from air to the G10 Kraken. That's an impressive 8-9% decrease in TOTAL SYSTEM LOAD, but factor in that only the GPU cooling was changed: assuming a worst-case scenario where the R9 290X draws about 300W on air, the Kraken-cooled card is drawing about 256W, meaning the change from stock to G10 Kraken cooling reduced the card's power consumption by about 15%.

All this, plus the fact that the stock cooler is likely throttling the R9 290X at load (reducing the stock card's measured power consumption), which obviously favors stock vs. the G10 Kraken.
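The estimate in that post, spelled out (the 300 W on-air draw is the post's worst-case assumption, not a measured figure):

```python
# Attribute a whole-system power drop to the GPU alone: only the GPU's
# cooling changed between the two measurements, so the delta is assigned
# to the card.

def gpu_power_saving(system_before: float, system_after: float, gpu_before: float):
    delta = system_before - system_after       # watts saved overall
    system_pct = delta / system_before * 100   # as % of total system load
    gpu_pct = delta / gpu_before * 100         # as % of the card's own draw
    return delta, system_pct, gpu_pct

delta, sys_pct, gpu_pct = gpu_power_saving(512, 468, 300)
print(f"{delta:.0f} W saved: {sys_pct:.1f}% of system load, {gpu_pct:.1f}% of the card")
# 44 W saved: 8.6% of system load, 14.7% of the card
```

Note the caveat tsm106 raises still applies: some of that 44 W delta may belong to the cooler swap itself, not to reduced GPU leakage.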


----------



## Krusher33

Quote:


> Originally Posted by *Sonikku13*
> 
> I ordered a Radeon R9 290X from Taiwan for mining purposes. Problem is, I haven't received my Radeon R9 290X yet. It's stuck in... _customs!_ Needless to say, I'm angry that it's taking this long. It's stuck in Chicago O'Hare International Airport. I ordered it on January 13th.


The beastliness of it probably made them think it was some highly advanced bomb.


----------



## tsm106

Quote:


> Originally Posted by *mojobear*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> You are getting two card power consumption numbers with three cards man. Even [H] is getting numbers closer to what I got stock. Hell my 7970s draw over 1300w in trifire benching but your tri hawaii draws less than my dual hawaii? It doesn't add up imo. **Btw, my two card numbers are with gpus on air and cpu under water.
> 
> http://www.hardocp.com/article/2014/01/26/xfx_r9_290x_double_dissipation_edition_crossfire_review/10
> 
> 
> 
> If you look at their system load for crossfire 290x its about 707 W full load....on HardOCP.
> 
> I was getting full load on all three cards, not two...fairly confident about this with crysis 3 and bf3 - reviews show good scaling with trifire and even quadfire + MSI afterburner was showing all 3 GPUs at 100%. To add to this, mining pegs my gpus at 100%...hash rate consistent with 3 x 290s...about 820 per card...total 2460 khash.
> 
> As for my theory that my lower power consumption is due to water being a lot more power efficient for R9 290s...I would point to this review..and there was another one that also had similar results but I cant find it.
> 
> http://www.legitreviews.com/nzxt-kraken-g10-gpu-water-cooler-review-on-an-amd-radeon-r9-290x_130344/5
> 
> Total system power consumption went down from 512W -> 468W from air to G10 Kraken. That would be an impressive 8-9% decrease in TOTAL SYSTEM LOAD, but if you factor in the fact that only the GPU cooling was changed, and assuming a worst-case scenario where the R9 290x is drawing about 300W, we are then looking at the G10 Kraken R9 290x drawing 256W, making the change from stock to G10 Kraken cooling reduce the power consumption of the R9 290x by about 15%

Games don't load both the CPU and GPU at the same levels nor at the same times. [H]'s numbers are from gaming; my number is from benching. The difference between our numbers is understandable given the differences in testing content. What I don't get is how low your numbers are; it makes no sense to me. Btw, it should be said that GPU load does not equal output. ALSO, the Kraken review is stupid: they removed the GPU cooler and slapped on the Kraken but neglected to account for the "cost" of powering the Kraken in their power consumption test. It's an error in testing, lol. The Kraken-cooled GPU gets a free pass on cooling, while the reference card's cooling is counted in its consumption numbers.


----------



## mojobear

Quote:


> Originally Posted by *tsm106*
> 
> Games don't load both cpu and gpu at the same levels nor at the same times. [H]'s numbers are from gaming. My number is from benching. The difference between our numbers is understandable given the differences in testing content. What I don't get is how low your numbers are, makes no sense to me. Btw, it should be said that GPU load does not = output. ALSO, the kraken review is stupid. They removed the gpu cooler and slapped on the kraken but neglected to account for the "cost" of powering the kraken given their power consumption test. It is an error lol in testing. The kraken cooled gpu gets a free pass on cooling where the ref card's cooling is counted in its consumption numbers.


Yeah, I understand that the Kraken's cooler is not taken into account, but it can't be 44 W, lol! That would be a monstrous cooler.

Yes, you are benching differently so we can't compare, and yes, 100% load in games vs. Furmark is different, but I'm pretty damn confident about my results.

I don't like to spread misinformation, but the numbers add up IF you believe that a better cooler for the VRMs and GPU reduces GPU draw. Also, I think my numbers are more typical for an average user, not someone throwing 1.4V into three cards, which would of course draw tremendous power.


----------



## Sonikku13

Quote:


> Originally Posted by *Krusher33*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sonikku13*
> 
> I ordered a Radeon R9 290X from Taiwan for mining purposes. Problem is, I haven't received my Radeon R9 290X yet. It's stuck in... _customs!_ Needless to say, I'm angry that it's taking this long. It's stuck in Chicago O'Hare International Airport. I ordered it on January 13th.
> 
> 
> 
> The beastly of it probably made them think it was some highly advanced bomb.

Hah! That's funny.

Hopefully it'll get through customs before scrypt ASICs come out. If it doesn't, I'll be pissed.


----------



## magicdave26

Got an MSI R9 290 Gaming Twin Frozr today, loving it so far.

Seems its max OC is 1100/1350.

Trying for 1150 gave artifacts right up to +75mV.

1450 vRAM black-screened.

Temps haven't gone over 72C yet, even after a couple hours of benching and gaming.


----------



## jerrolds

Quote:


> Originally Posted by *tsm106*
> 
> Games don't load both cpu and gpu at the same levels nor at the same times. [H]'s numbers are from gaming. My number is from benching. The difference between our numbers is understandable given the differences in testing content. What I don't get is how low your numbers are, makes no sense to me. Btw, it should be said that GPU load does not = output. ALSO, the kraken review is stupid. They removed the gpu cooler and slapped on the kraken but neglected to account for the "cost" of powering the kraken given their power consumption test. It is an error lol in testing. The kraken cooled gpu gets a free pass on cooling where the ref card's cooling is counted in its consumption numbers.


Even with the flawed test, how much power does a 12V 92mm fan take? 5W at most? http://www.nzxt.com/product/detail/138-kraken-g10-gpu-bracket.html claims 2.6W (it lists 12V * 0.18A).

At 300W that's ~0.9%, which isn't even significant. Practically, at $0.10/kWh that's about $0.19 per month, so pretty much free. The stock fan probably draws even less. Either way it's negligible for most intents and purposes.

Maybe, since it's being cooled more effectively, it's more electrically efficient, meaning it requires less power to operate at the same speed; it probably saves more power than the fan takes to operate.
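The arithmetic above is easy to sanity-check. A minimal sketch, using the figures quoted in this post (a 2.6W fan, $0.10/kWh, a 300W card) plus an assumed 30-day month of continuous load:

```python
# Monthly running cost of a small fan at continuous load.
# Figures are the ones quoted above: 2.6W draw, $0.10/kWh, 300W card.
fan_watts = 2.6
rate_per_kwh = 0.10
hours_per_month = 24 * 30  # assume a 30-day month, running 24/7

kwh_per_month = fan_watts * hours_per_month / 1000  # watt-hours -> kWh
cost_per_month = kwh_per_month * rate_per_kwh
share_of_card = fan_watts / 300  # fraction of a 300W GPU load

print(f"{kwh_per_month:.2f} kWh/month -> ${cost_per_month:.2f}/month, "
      f"{share_of_card:.1%} of the card's draw")
```

At under 2 kWh a month, the fan really is lost in the noise next to the card itself.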


----------



## chiknnwatrmln

I notice that when my VRM temps go up, my PC's power usage goes up also.

For example, when I first start a bench at 1100/5500 at +18mV, I pull about 410W from the wall (specs in sig; I also have one monitor plugged into my UPS), but after my card warms up it goes up to about 430W.

At +168mV I pass 500W when the card is hot. This is with an 80+ Bronze PSU and a 3770K @ 1.3V.

If the card stayed cooler constantly, I suspect it'd pull less power. I can test this when I get my rig rebuilt (hopefully this weekend, depending on how much homework I get).


----------



## SLADEizGOD

Hey, I found this on Legit Reviews about the new AMD 13.35 beta. Here's the link: http://www.legitreviews.com/amd-catalyst-13-35-beta-driver-leaked-mantle-trueaudio-support_134753
I don't have an AMD GPU yet, but I'm thinking of upgrading. Hope this helps you guys get a good look at what this new driver can do.


----------



## mojobear

I wish I could find that second review that also noticed lower temps = less power draw... oh well. I'd believe OCN testing from multiple sources more anyway.


----------



## the9quad

Quote:


> Originally Posted by *SLADEizGOD*
> 
> Hey I found this on legit news about the new Beta for AMD 13.35..here's the link: http://www.legitreviews.com/amd-catalyst-13-35-beta-driver-leaked-mantle-trueaudio-support_134753
> I don't have an AMD GPU yet But thinking of upgrading. Hope this help you guys get a good look on what this new driver can do.


Yesterday's news, and those drivers do not contain Mantle, sorry. I don't mean that in a bad way; I was just clarifying that those drivers leaked yesterday, and by today it's already been noted that they do not contain Mantle.


----------



## mojobear

Quote:


> Originally Posted by *mojobear*
> 
> Yah, I understand that the Kraken's cooler is not taken into account, but it can't be 44W LOL! That would be a monstrous cooler.
> 
> Yes you are benching different and we cant compare and yes 100% load on games vs furmark are different, but my results I am pretty damn confident about
> 
> 
> 
> 
> 
> 
> 
> I don't like to spread misinformation but the numbers add up IF you believe that a better cooler for the VRMs and GPU reduces GPU draw. Also my numbers I think are more typical for an average user, not someone throwing 1.4V into three cards which would of course draw tremendous power.


Wait a second, I think I'm being stupido... hahaha. It's total system draw the Legit review was measuring, which of course would include the Kraken G10 somewhere in there, unless they ran it off a separate outlet or something (extremely unlikely). So the -44W system draw would actually be the difference between stock GPU/VRM temps vs. G10 GPU/VRM temps, unless of course the G10 cooler is 44W more power efficient than the stock cooler.


----------



## tsm106

Quote:


> Originally Posted by *mojobear*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mojobear*
> 
> Yah, I understand that the Kraken's cooler is not taken into account, but it can't be 44W LOL! That would be a monstrous cooler.
> 
> Yes you are benching different and we cant compare and yes 100% load on games vs furmark are different, but my results I am pretty damn confident about
> 
> 
> 
> 
> 
> 
> 
> I don't like to spread misinformation but the numbers add up IF you believe that a better cooler for the VRMs and GPU reduces GPU draw. Also my numbers I think are more typical for an average user, not someone throwing 1.4V into three cards which would of course draw tremendous power.
> 
> 
> 
> wait a second i think im stupido...hahaha. Its total system draw the legit review was measuring which of course would include the Kraken G10 somewhere in there..unless they ran it off a separate outlet or something (extremely unlikely)....sooo the - 44W system draw would actually be the difference between Stock temp GPU/VRM vs G10 GPU/VRM....unless of course the G10 cooler is 44W more power efficient than the stock cooler

That's why I wrote that their numbers are wacky. The Kraken's pump and fan cost power to run, a lot more power than just the GPU's cooling fan, so their numbers don't add up, you see. Then if you look at their "calculated" consumption numbers, miraculously that wattage is lost somewhere... but where? The overall consumption drops too, but again, how?

Fact: if the core is working harder because it's no longer being throttled, it is consuming MORE power to do so. If you say the core is working at 100% vs. 80% while throttled, and at 100% usage it is consuming less power, that is magic right there.

**Btw, I couldn't give two cents about some Kraken review; I'm only in this topic because you think that running cooler consumes less power. You should also realize AMD differs on this viewpoint. I've never found that better cooling = lower consumption, because the better cooling always has a price. In my setups, the price is really freaking high, but that's OK because I'm willing to pay it.


----------



## mojobear

Quote:


> Originally Posted by *tsm106*
> 
> That's why I wrote their numbers are whacky. The kraken's pump and fan cost power to run, a lot more power than just the gpu's cooling fan. Their numbers don't add up you see. Then if you look at their "calculated" consumption numbers, miraculously that wattage or whatever is lost somewhere... but where? The overall consumption drops too, but again how?
> 
> Fact, if the core is working more due to not being throttled, it is consuming MORE power to do so. If you say the core working 100% vs 80% whilst being throttled, and at 100% usage it is consuming less power. That is magic right there.
> 
> **Btw I could give two cents about some kraken review, I'm only this topic because you think that running cooler consumes less power. You should also realize AMD differs on this viewpoint. I've never found that better cooling = lowered consumption because the lower cooling always has a price. In my setups, the price is really freaking high, but that's ok because I'm willing to pay it.


Meh, we all have our theories; I at least have some sort of proof to back mine up. I use the Kraken G10 as an example for my theory that significant drops in temp decrease GPU power draw. Really, it's hard to come up with some other explanation for why the G10 installation resulted in 44W less system draw. Stock and G10 both had the same GPU frequencies and the same test bench; the only difference was the cooling solution. So unless you can give me some confounders that could cause a 44W drop, you can infer causation between the G10 installation and -44W at load.

Stating it's a flawed review without giving good explanations of the confounders is not justification for calling it a bad review. I at least have given you some proof to back up my idea. Can you prove that lowering the GPU to 50°C has no effect on power draw?


----------



## jerrolds

Couldn't find it on the NZXT site, but Xbit Labs says the Kraken X40 uses 7.5W: http://www.xbitlabs.com/articles/coolers/display/nzxt-kraken-x40_3.html

Again, with the 92mm fan that's about ~10W total, not that much.

If I had a Kill-A-Watt it would be easy enough for me to test: I have two 120mm fans and can easily turn one off, raising temps from ~60°C to ~80°C. It seems counter-intuitive that running cooler would actually require less power, but maybe it's possible.


----------



## Forceman

Quote:


> Originally Posted by *tsm106*
> 
> That's why I wrote their numbers are whacky. The kraken's pump and fan cost power to run, a lot more power than just the gpu's cooling fan. Their numbers don't add up you see. Then if you look at their "calculated" consumption numbers, miraculously that wattage or whatever is lost somewhere... but where? The overall consumption drops too, but again how?
> 
> Fact, if the core is working more due to not being throttled, it is consuming MORE power to do so. If you say the core working 100% vs 80% whilst being throttled, and at 100% usage it is consuming less power. That is magic right there.
> 
> **Btw I could give two cents about some kraken review, I'm only this topic because you think that running cooler consumes less power. You should also realize AMD differs on this viewpoint. I've never found that better cooling = lowered consumption because the lower cooling always has a price. In my setups, the price is really freaking high, but that's ok because I'm willing to pay it.


It is well established that lower temps mean less power consumption, both because of reduced electrical resistance and increased efficiency. The VRMs in particular are much more efficient at lower temps. There have been numerous charts showing this posted around (the VRM one has been posted in this thread); I'm on my phone or I'd find them and link them. Anand has posted a few for CPUs in the past, including one for Haswell. I have no trouble believing that putting an AIO cooler on the GPU would reduce power consumption, above and beyond what is needed for the pump/fan.

Edit: Here's an extensive look at it for CPUs:

http://forums.anandtech.com/showthread.php?t=2281195


----------



## Redvineal

Quote:


> Originally Posted by *tsm106*
> 
> ^^It's a not a bad idea to just jump right to 1500w ish hehe.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> **The cost difference from 1300w to 1500w is like 80 bucks, but the cost from buying a 1300w and then realizing you need 1500w... it's a lot more expensive than 80 bucks and it bruises your woulda shoulda coulda 20/20 hindsight...


Not much stock available to me in the US. It seems like this is the best I can get that's in stock, fits in my Air 540, and is listed in this PSU thread:

http://www.newegg.com/Product/Product.aspx?Item=N82E16817153154&ignorebbr=1

Thoughts on reliability with up to 3 x R9 290s down the road?

(I really don't want to go with Newegg, but Amazon is bone dry on high-output PSUs, and everything else for that matter)


----------



## shilka

Quote:


> Originally Posted by *Redvineal*
> 
> Not much stock available to me in the US. It seems like this is the best I can get that's in stock, fits in my Air 540, and is listed in this PSU thread:
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817153154&ignorebbr=1
> 
> Thoughts on reliability with up to 3 x R9 290's down the road?
> 
> (I really don't want to go with Newegg, but Amazon is bone dry on high output PSU's, and everything else for that matter)


The EVGA SuperNova G2 1300 watts is a much better PSU.

The one you picked is a bit meh, and it's really overpriced for what you get.

That does not mean it's bad.


----------



## Redvineal

Quote:


> Originally Posted by *shilka*
> 
> EVGA SuperNova G2 1300 atts is a much better PSU
> 
> That one you picked is a bit meh and its really overpriced for what you get
> 
> That does not mean its bad


The problem is I'm limited to what's in stock. And it seems like most high-output PSUs are out of stock!

I think I might just run two R9 290s at stock clocks with my AX860 until I can get hold of a beast PSU. I probably have a decent amount of headroom to support a second card right now.

Thanks for the info. While we're on the topic of brands, would you say EVGA PSUs are your favorite? Who's the most highly revered in the 1300-1500W range if price is no issue?


----------



## shilka

Quote:


> Originally Posted by *Redvineal*
> 
> The problem is I'm limited to what's in stock. And it seem like most high output PSU's are out of stock!
> 
> I think I might just run two R9 290's at stock clocks with my AX860 until I can get a hold of a beast PSU. I probably have a decent amount of headroom to support a second card right now.
> 
> Thanks for the info. While we're on the topic of brand, would you say EVGA PSU's are your favorite? Who's the most highly revered in the 1300-1500 range considering price is no issue?


EVGA does not make the PSUs they sell.

In fact every EVGA PSU besides the G2 and P2 models sucks in one way or another.

Also, there are not really any PSUs above 1300 watts that don't have one flaw or another.


----------



## kdawgmaster

Quote:


> Originally Posted by *shilka*
> 
> EVGA does not make the PSU´s they sell
> 
> In fact every EVGA PSU besides the G2 and P2 models suck in one way or another
> 
> Also there are not really any PSU´s above 1300 watts that does not have one flaw or another


I have the SuperNova 1300 watt. Please point out its flaws to me?


----------



## shilka

Quote:


> Originally Posted by *kdawgmaster*
> 
> I have the nova 1300 watts. Please point out its flaws to me??


I said above 1300 watts.

The only thing I can complain about with the G2 1300 watts is nitpicking.

Like the CapXon caps on the modular board: they don't really matter in any way, but I still don't like CapXon.

And the manual could have been much better, but that's not even nitpicking the PSU itself.


----------



## the9quad

Quote:


> Originally Posted by *shilka*
> 
> I said above 1300 watts
> 
> Only thing i can complain about with the G2 1300 watts is nitpicking
> 
> Like the Capxon caps on the modular board does not really matter in any way but it still dont like Capxon
> 
> And the *manuel* could have been much better but thats not even nitpicking the PSU


Manuel would like a word with you


----------



## shilka

Quote:


> Originally Posted by *the9quad*
> 
> Manuel would like a word with you


Sory was a typo


----------



## the9quad

Quote:


> Originally Posted by *shilka*
> 
> Sory was a typo


Manuel forgives no one!

J/k man I was just teasing ya.


----------



## shilka

Quote:


> Originally Posted by *the9quad*
> 
> Manuel forgives no one!
> 
> J/k man I was just teasing ya.


It's 01:16 here, so maybe I should just go to bed.

Really nice that my internet only works when it feels like it: this late it works fine, but during the day, nope.

So to anyone who wants to PM me: if I don't get back to you, I am either sleeping or cursing that my internet doesn't work.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *the9quad*
> 
> Manuel would like a word with you


Hahahaha


----------



## Roy360

I'm thinking of picking up a third R9 290. Only problem is, I already have a universal block that I'm using on my 6950.

Do you guys think the EK Supreme block + several low-profile Chinese copper heatsinks and 3 GTs will be enough to cool it?

I have one of these in my last slot: http://www.ebay.ca/itm/Video-Card-Heatsinks-Cooling-Fans-Mount-Rack-for-12cm-9cm-Dual-fan-Bracket-/171181890352?pt=US_Video_Card_GPU_Cooling&hash=item27db3c5b30&_uhb=1

Two GTs under the card, and one GT at the rear blowing air towards the outputs.

GT = Gentle Typhoon 2150rpm @ 7V


----------



## tsm106

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> That's why I wrote their numbers are whacky. The kraken's pump and fan cost power to run, a lot more power than just the gpu's cooling fan. Their numbers don't add up you see. Then if you look at their "calculated" consumption numbers, miraculously that wattage or whatever is lost somewhere... but where? The overall consumption drops too, but again how?
> 
> Fact, if the core is working more due to not being throttled, it is consuming MORE power to do so. If you say the core working 100% vs 80% whilst being throttled, and at 100% usage it is consuming less power. That is magic right there.
> 
> **Btw I could give two cents about some kraken review, I'm only this topic because you think that running cooler consumes less power. You should also realize AMD differs on this viewpoint. I've never found that better cooling = lowered consumption because the lower cooling always has a price. In my setups, the price is really freaking high, but that's ok because I'm willing to pay it.
> 
> 
> 
> It is well established that lower temps equals less power consumption, both because of reduced electrical resistance and increased efficiency. The VRMs in particular are much more efficient at lower temps. There have been numerous charts that show this posted around (the VRM one has been posted in this thread) - I'm on my phone or I'd find them and link them. Anand has posted a few for CPUs is the past, including one for Haswell. I have no trouble believing that putting a AIO cooler on the GPU would reduce power consumption, above and beyond what is needed for the pump/fan.
> 
> Edit: Here's an extensive look at it for CPUs
> 
> http://forums.anandtech.com/showthread.php?t=2281195

In ICD's article it's true that lowering temps drops consumption, but the problem is that the energy consumed to lower those temps isn't accounted for. Generally, after the cost of lowering said temps, there isn't a whole lot of gain left overall. In some of his graphs with the largest power-consumption drops, if you note the gap between high and low, it's a whopping 70°C temp drop on a CPU; in that case it netted a drop in consumption of 28W at 1.3V. To go from 95°C to 25°C you'd need sub-ambient cooling AFAIK. There's a cost to that which isn't, as far as I can tell, in the discussion. It costs a helluva lot of watts to get sub-ambient/ambient-level cooling, whether it's phase change or TEC.

And this is part of my question regarding that article: they achieve X cooling, but where are the costs associated with X cooling? The only thing I see is that the consumption dropped a whopping 44W. That level of cooling is beyond what's in ICD's article, depending on the range taken. Again I wonder, if only to myself: where's the beef? It's like magic.


----------



## Forceman

I think you are overestimating how much power an AIO cooler uses. I'd also mention that the ICD comparison was on a ~150W CPU; I'd expect proportionally larger gains on a 300W GPU.

At 1.3V it looks like about a 20W drop in power for a 30°C drop in temp (from 105 to 75). Considering that an AIO can drop GPU temps by 30°C, and factoring in that a GPU uses twice the total power (or more; in that test the CPU was 135W), I'd say double the power-consumption reduction is reasonable to expect, and that puts you at 40W. So then it comes down to how much the AIO cooler draws (which someone said was 7.5W), and you are in the ballpark of what that test found.

But it should be easy enough for someone with a Kill-A-Watt to test. I'd do it, but my rad fans are already running as low as possible, so I can't really do much to raise the GPU temps in a controlled way.
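The scaling argument above can be written out as a back-of-envelope calculation. A rough sketch, where the inputs (20W saved for a 30°C drop on a 135W CPU, a 7.5W pump/fan) come from this discussion, and the saving-scales-with-total-power step is itself an assumption:

```python
# Scale the CPU leakage saving up to a GPU, per the reasoning above.
cpu_saving_w = 20.0   # power drop seen for a 30C temp reduction on the CPU
cpu_power_w = 135.0   # CPU load power in the ICD test
gpu_power_w = 300.0   # an R9 290/290X draws roughly double that

# Assumption: the temperature-related saving scales with total draw.
gpu_saving_w = cpu_saving_w * (gpu_power_w / cpu_power_w)
pump_fan_w = 7.5      # Kraken X40 figure quoted earlier in the thread

net_saving_w = gpu_saving_w - pump_fan_w
print(f"gross saving ~{gpu_saving_w:.0f}W, net of pump/fan ~{net_saving_w:.0f}W")
```

Under those assumptions the gross figure lands in the same ballpark as the 44W the review measured, with the cooler's own draw taken off the top.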


----------



## Cool Mike

The PowerColor PCS+ AXR9 290 just hit Newegg, I just picked up two for my Dell 4K monitor.

Card is $539.99. Lower than I thought they would be.


----------



## VSG

Geez, that's what a non-reference 290X should have been priced at.


----------



## Cool Mike

I really like the cooler on the PCS+. The 290/290X needs a hefty cooler, and the PCS+ has one. A 2.5-slot card.


----------



## VSG

Let us know how it performs, it looks very promising!


----------



## Germanian

Those PowerColors look tempting, but I got myself some nice 3x XFX R9 290 reference cards for watercooling.


----------



## Redvineal

Anyone have the MSI R9 290 GAMING? If so, how do you like it and can you share temps?

(edit: apologies for not channeling my search-fu before asking. found a few users and sent PM's directly)

I was so close to pulling the trigger on 2 x XFX DD 290's, but got to reading too many negatives around here.

One way or another, I want to put in an order for 2 290's. The hard part is deciding before I change my mind!

...and lol what's up with the free R7 240 that comes with the Sapphire Tri-X?


----------



## taem

Damn, all the PSU talk now has me worried my XFX PRO Black 850W won't cut it for 290 CrossFire with a 4670K @ 4.6.

If I lower the CPU clock and voltage and run the 290s at stock, that ought to be OK, but I also have six 140mm fans and six 4TB WD Black HDDs.

How's it looking for me? Aside from not wanting to spend $200+ on a new PSU, I'm also limited to 160mm in my case. There's basically one 1000W PSU I can get, a SilverStone Strider Plus, though I believe that's a good PSU.
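For what it's worth, this kind of worry can be roughed out with a quick power budget. All the per-component wattages below are ballpark assumptions, not measurements:

```python
# Back-of-envelope PSU headroom check for this build.
# Every wattage here is an assumed ballpark figure.
components = {
    "4670K @ 4.6GHz (OC)": 140,
    "2x R9 290 at stock": 2 * 250,
    "6x 140mm fans": 6 * 3,
    "6x 4TB WD Black HDDs": 6 * 10,
    "motherboard/RAM/misc": 60,
}
psu_watts = 850

total_watts = sum(components.values())
print(f"Estimated load: {total_watts}W on a {psu_watts}W PSU "
      f"({total_watts / psu_watts:.0%} of rated capacity)")
```

At ~90% of the label rating under full load there is little headroom left, which is why dropping the CPU overclock or stepping up the PSU both make sense.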


----------



## psyside

Quote:


> Originally Posted by *Falkentyne*
> 
> Increasing core voltage increases voltage to the IMC--this will allow the memory to run at your overclocked speed when the system is **IDLE**---it's low idle voltage combined with high memory clocks that cause black screens on the desktop.
> 
> You can avoid that problem and push the memory to its upper limits by using unofficial overclocking mode WITHOUT powerplay support in afterburner, which forces 3D clocks on the desktop, with 3d voltages (which stops the 0.9xx idle voltage with full speed memory=black screen)


This doesn't work for me; I tried everything I could. With unofficial OC mode, the screen jumps again when I apply my clocks = PowerTune resets. Nothing helps. I ended up using CCC Overdrive for setting the power limit, but that has a different downside: my fan settings and such apply at boot, so it's an instant jump to 70%, and if I set the profile with AB, then again I get the power-limit reset whenever I press apply on any setting.


----------



## rdr09

Quote:


> Originally Posted by *Cool Mike*
> 
> The PowerColor PCS+ AXR9 290 just hit Newegg, I just picked up two for my Dell 4K monitor.
> 
> Card is $539.99. Lower than I thought they would be.


That's pretty much the amount I spent on my 290 reference plus a waterblock. I think it's priced reasonably considering all the mining craze.


----------



## kizwan

Quote:


> Originally Posted by *Raephen*
> 
> I'm not much of a bencher, nor have I clocked my 290 heavily (1. Don't need it yet and 2. my vram doesn' like to be clocked (?), so I run a simple 1100 / 1250 with no voltage increase, +50% power limit).
> 
> I did, however, pick up 3dmark last year when it was on sale on Steam.
> 
> I ran FS-E to heat the card up a bit from cold boot, and then just ran the full gambit of tests.
> 
> GPU max: 47C
> VRM1 max: 48C
> VRM2 max: 41C
> 
> These results seem consistent with what I've seen after a good gaming session with a heavily modded Skyrim with .ini tweaks.


Forgot to ask you. What is your ambient (indoor) temp when running the tests?


----------



## Arizonian

Quote:


> Originally Posted by *Jonsu*
> 
> Hi, please add me. MSI R9 290 Gaming @ 1177/1500 Hynix memory
> 
> 
> 
> Spoiler: Warning: Spoiler!


Apologies, Jonsu - slotted in order of your submission. Congrats - added.


----------



## MrWhiteRX7

Man, so it seems no retailer will be getting reference 290s back in stock for quite a while. Sadface.

I was wanting to add another card.


----------



## kdawgmaster

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Man so it seems no retailer will be getting reference 290's back in stock for quite a while. Sadface
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was wanting to add another card


You had to expect this eventually. More than likely the next cards won't be flashable either.


----------



## Raephen

Quote:


> Originally Posted by *kizwan*
> 
> Forgot to ask you. What is your ambient (indoor) temp when running the tests?


I'd have to guesstimate, but 20C would be a safe bet.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *kdawgmaster*
> 
> u had to expect this eventually. It will now more then likely be that the next cards wont be flashable either.


Literally found a place that had two left in stock on Amazon, LOL. They just popped up on my stock-watch alert and they're on the way! WOOOOOOOOOOO!


----------



## kdawgmaster

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Literally found a place that had two left in stock on amazon LOL they just popped up on my stock watch alert, they're on the way! WOOOOOOOOOOO!


I'm always one not to buy something that didn't pass the first round of testing, which is essentially what an R9 290 is right now: an R9 290X that didn't make the cut (for AMD) to be a 290X.


----------



## Ukkooh

I have a few questions before doing my first waterblock install ever, on a 290X with an EK block. Roughly how much should I tighten the screws? I'm afraid of crushing the die, but wouldn't want to leave them too loose either. I've heard that EK's stock thermal pads are pretty bad and limit VRM cooling performance. Would the Phobya Ultra thermal pads be better than the EK stock ones? Also, if I use TIM on the thermal pads, should I put it on both sides or only on the underside that touches the VRM? Thanks in advance.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *kdawgmaster*
> 
> Im always one to not get something that didnt pass the first round of testing which is what a R9 290 essentially is right now a R9 290x that did pass ( for amd ) to be a 290x.


The three I have now are fantastic and OC very well. I have no reason not to stick with 290s.


----------



## Jonsu

Quote:


> Originally Posted by *Arizonian*
> 
> Apologies Jonsu - slotted in order to your submission. Congrats - added


No problemo, nothing a simple PM can't fix. Thanks


----------



## kizwan

Quote:


> Originally Posted by *Raephen*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Forgot to ask you. What is your ambient (indoor) temp when running the tests?
> 
> 
> 
> I'd have to guesstimate, but 20C would be a safe bet.

I see. My ambient (indoor) temp is 10+ degrees Celsius higher than yours.

These figures were taken while playing BF4 at stock clocks in a 29°C ambient. Valley & the other benchmarks I have here don't push the GPU like BF4 does.
GPU core: 56C
VRM1: 58C
VRM2: 50C

Taking into account my ambient & your overclock to 1100, mine is running a couple of degrees higher than yours.
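Since the two rooms differ by roughly 10°C, it's easier to compare the rise over ambient than the absolute readings. A quick sketch with the numbers from both posts (Raephen's 20°C ambient is his own guesstimate):

```python
# Compare the two cards by temperature rise over ambient.
raephen = {"ambient": 20, "gpu": 47, "vrm1": 48, "vrm2": 41}  # 1100 MHz OC
kizwan = {"ambient": 29, "gpu": 56, "vrm1": 58, "vrm2": 50}   # stock clocks

def deltas(temps):
    """Each sensor's rise over that room's ambient temperature."""
    return {k: v - temps["ambient"] for k, v in temps.items() if k != "ambient"}

print("Raephen:", deltas(raephen))
print("kizwan: ", deltas(kizwan))
```

On delta-over-ambient the two cards sit within a degree or two of each other, which matches the "couple degrees higher" reading once the overclock is taken into account.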


----------



## esqueue

Quote:


> Originally Posted by *Ukkooh*
> 
> I have a few questions before doing my first waterblock install ever on a 290x with an ek block. How much approximately should I tighten the screws? I'm afraid of crushing the die but wouldn't want to leave them too loose neither. I've heard that EK's stock thermal pads are pretty bad and limit the performane on VRM cooling. Would the Phobya Ultra thermal pads be better than the EK stock ones? Also if I use TIM on the thermal pads should I put it on both sides of it or only on the underside that touches the VRM? Thanks in advance.


I have the EK block on my 290X, and what I did was snug all the screws down, then went back and tightened them all fully. I think the screws bottom out so they will not damage the board. I don't remember having any problem with how tight to make them.


----------



## rdr09

Quote:


> Originally Posted by *Ukkooh*
> 
> I have a few questions before doing my first waterblock install ever on a 290x with an ek block. How much approximately should I tighten the screws? I'm afraid of crushing the die but wouldn't want to leave them too loose neither. I've heard that EK's stock thermal pads are pretty bad and limit the performane on VRM cooling. Would the Phobya Ultra thermal pads be better than the EK stock ones? Also if I use TIM on the thermal pads should I put it on both sides of it or only on the underside that touches the VRM? Thanks in advance.


What I did was start with the screws for the core and work my way out. Try to go in an X pattern and let each screw just touch the washer/PCB; you can then go back and turn each screw a quarter turn to tighten them down a bit.

For the thermal pads, see if you can find Fujipoly Extreme pads, and make sure you get the right sizes: the memory ICs take 0.5mm and the VRMs 1mm. I used the paste that came with the kit between the VRMs and the pad. I should have applied it on both sides, which you can.


----------



## kizwan

Quote:


> Originally Posted by *Ukkooh*
> 
> I have a few questions before doing my first waterblock install ever on a 290x with an ek block. How much approximately should I tighten the screws? I'm afraid of crushing the die but wouldn't want to leave them too loose neither. I've heard that EK's stock thermal pads are pretty bad and limit the performane on VRM cooling. Would the Phobya Ultra thermal pads be better than the EK stock ones? Also if I use TIM on the thermal pads should I put it on both sides of it or only on the underside that touches the VRM? Thanks in advance.


There is one spot above the GPU die with tiny resistors near the screw holes; be careful there. When you remove the stock cooler, note how easily the screws come out. That is about how tight you want them going back in; basically, don't over-tighten.

The EK thermal pad is rated 3-5 W/mK and performs poorly. If you get a Phobya thermal pad, make sure it's rated at least 7 W/mK. The recommended pads are Fujipoly 11 or 17 W/mK, but they cost more.

You only need to put a very little TIM on the VRMs. No need to put any on the water-block side.


----------



## Raephen

Quote:


> Originally Posted by *kizwan*
> 
> I see. My ambient (indoor) is 10+ Celsius higher than yours.
> 
> This figure taken when playing BF4 at stock clock in 29C ambient. Valley & other benchmarks I have here doesn't push the GPU like BF4 does.
> GPU core: 56C
> VRM1: 58C
> VRM2: 50C
> 
> Taking into account my ambient & yours overclock to 1100, mine is running a couple degrees higher than yours.


No two cards are the same, so a couple of degrees is well within the margin of error.

Mind you, it's winter here in the Netherlands now. On a hot, damp summer day, with the lingering afternoon heat in my apartment building, I'd expect my card to show numbers like yours, if not higher...


----------



## SgtMunky

I'm about to place an order for the Sapphire Tri-X 290 at £365.00.

Plus a SilverStone Strider 750W Gold Evolution to give myself some more room for upgrades this year.

And finally a Corsair H100i to stuff into the top of my TJ07 somehow.


----------



## AlDyer

Is there really any perf/OC hit with the EK pads? Does it truly matter that much?


----------



## Krusher33

I took the stock cooler off my Sapphire 290X in preparation for the block I'm getting tomorrow. The pads over the memory were still gummy, but the ones over the VRMs... they were flaky. Like dried-out flaky. Is that typical? The VRM temps were fine when I ran it last week.


----------



## jerrolds

Not sure if that's typical, but the VRM pads on my Sapphire were gummy.


----------



## VSG

FYI, reference 290/290X cards are now EOL in most parts of the world.


----------



## sugarhell

Quote:


> Originally Posted by *geggeg*
> 
> FYI, reference 290/290x are now EOL in most parts on the world.


Nope


----------



## rdr09

Quote:


> Originally Posted by *sugarhell*
> 
> Nope


^This. At least Europe doesn't seem to be running out of stock...

http://www.scan.co.uk/search.aspx?q=r9+290


----------



## SgtMunky

Quote:


> Originally Posted by *geggeg*
> 
> FYI, reference 290/290x are now EOL in most parts on the world.


What?


----------



## VSG

I meant that what you see now is probably the last of the stock; OcUK and a Newegg rep both told me the same thing.

Edit: Relevant posts from OcUK
Quote:


> Originally Posted by *Gibbo*
> Bad news these are now end of life, so the stock won't be around much longer on the 290's and then the 290X's shall also run out.


Quote:


> Originally Posted by *Finners*
> why are reference cards now EOL?


Quote:


> Originally Posted by *Gibbo*
> AMD stopped making them. Just like they did on 7900 and 6900 series, nothing new, same old.
> 
> Going forward it shall only be custom designs once all reference stock is sold out.


Source


----------



## SgtMunky

But I haven't seen anything regarding the next revisions of the fully custom boards, or have I missed something?


----------



## Krusher33

Quote:


> Originally Posted by *geggeg*
> 
> I meant what you see now is probably the last stock, OcUK and a NewEgg rep both told me the same thing.
> 
> Edit: Relevant posts from OcUK
> Quote:
> 
> 
> 
> Originally Posted by *Gibbo*
> Bad news these are now end of life, so the stock won't be around much longer on the 290's and then the 290X's shall also run out.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Finners*
> why are reference cards now EOL?
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gibbo*
> AMD stopped making them. Just like they did on 7900 and 6900 series, nothing new, same old.
> 
> Going forward it shall only be custom designs once all reference stock is sold out.
> 
> 
> Source

I hate that from a watercooling perspective.

Like, if I decide I want to get another to Crossfire with mine... I'm SOL then?


----------



## VSG

You will still find cards that use the reference PCB, don't worry.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Krusher33*
> 
> I hate that for the watercooling purpose.
> 
> Like if I decide I wanna get another to crossfire mine... I'm SOOL then?


Sapphire Tri-X, XFX DD, MSI Gaming, etc. are all reference PCBs, IIRC.


----------



## Krusher33

Quote:


> Originally Posted by *geggeg*
> 
> You will still find cards that use the reference PCB, don't worry.


That was such a PITA with 7970s.

"Hey! X card uses the reference board!" ... "Oh wait, no it doesn't, they changed it recently."

I got lucky finding a seller with a used one, but I'm more partial to buying new for RMA reasons. It's too much hassle and too time-consuming to deal with RMAs on a used card.


----------



## MrWhiteRX7

It's true; Sapphire mentioned this in an email a week ago when I was asking when more would be available. They are only doing custom cards now. I knew this would happen, but it seems pretty soon to me.


----------



## BradleyW

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> It's true, Sapphire had mentioned this in an email a week ago when I was asking when more would be available. They are only doing custom cards now. I knew this would happen, but it seems pretty soon to me.


So what happens if a water cooler like me needs a replacement reference card in the future via RMA? (Think water blocks here).


----------



## Jack Mac

Wow, they better not discontinue reference yet, I need another for matching CF.


----------



## BackwoodsNC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Sapphire Tri-X, XFX DD, MSI Gaming etc, all Ref iirc


Looks like they are all using the reference PCB layout but with different components.


----------



## Krusher33

All it takes is a single cap to be in the way or something like that.


----------



## VSG

Quote:


> Originally Posted by *BradleyW*
> 
> So what happens if a water cooler like me needs a replacement reference card in the future via RMA? (Think water blocks here).


Companies always keep aside a good number of cards for warranty and RMA, but if they run out (highly unlikely), you get an equivalent or better product instead. This doesn't mean water block compatibility will be there; remember that most AMD vendors don't cover water cooling under warranty as it is.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *geggeg*
> 
> Companies always keep aside a good number for warranty and RMA, but if they run out (highly unlikely), you get an equivalent or better product instead. This doesn't mean water block compatibility will be there- remember that most AMD vendors don't have warranty for water cooling as it is.


Also, people will be getting refurbs under warranty.


----------



## jamaican voodoo

This is getting worse by the moment: $579 for a 290 on Newegg. I was planning on getting two more, but by the looks of it I won't be... I'm highly irritated. I hope this mining crap goes to hell.


----------



## tsm106

Quote:


> Originally Posted by *BackwoodsNC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Sapphire Tri-X, XFX DD, MSI Gaming etc, all Ref iirc
> 
> 
> 
> looks like they are all using reference pcb layout but with different components.

As long as it fits the block all is good.


----------



## VSG

^ Yup, the smaller board partners (PowerColor, VisionTek) also sell pre-fitted water-cooled cards, so I don't think anyone needs to press the alarm button anytime soon. EK will also be making blocks for the Asus DCU2 and the MSI 290X Lightning at the very least.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Krusher33*
> 
> All it takes is a single Cap to be in the way or something like that.


Quote:


> Originally Posted by *Jack Mac*
> 
> Wow, they better not discontinue reference yet, I need another for matching CF.


Quote:


> Originally Posted by *BradleyW*
> 
> So what happens if a water cooler like me needs a replacement reference card in the future via RMA? (Think water blocks here).


http://www.coolingconfigurator.com/waterblock/3831109868539

This should clear up a few things.


----------



## Krusher33

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Krusher33*
> 
> All it takes is a single Cap to be in the way or something like that.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jack Mac*
> 
> Wow, they better not discontinue reference yet, I need another for matching CF.
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *BradleyW*
> 
> So what happens if a water cooler like me needs a replacement reference card in the future via RMA? (Think water blocks here).
> 
> 
> http://www.coolingconfigurator.com/waterblock/3831109868539
> 
> This should clear up a few things.

That just tells me "we don't know for sure, but it LOOKS like it could fit."

I'm sorry, I'm just frustrated, because I know it'll be many months before I can consider getting another one, and by then the companies will do what they've done in the past: change things around so that everything we know now is outdated. You buy a card and the block won't fit because so-and-so moved a cap or something. I just wish they would keep watercoolers in mind when they make changes like that.

Or is that why? Maybe there were too many RMAs where they suspected the card went faulty because a watercooler goofed, so they changed something to make it unable to take a block anymore?


----------



## SgtMunky

So I should wait before buying a 290 now then? Either pick up one of the new releases, or grab a Tri-X if it is the last bit of stock at a new lower price?


----------



## tsm106

Quote:


> Originally Posted by *Krusher33*
> 
> That just tells me "we don't know for sure but it LOOKS like it could fit".
> 
> I'm sorry, I'm just frustrated because I know it'll be many months before I could consider getting another one and by that time, all the companies will do what they had done in the past, change the way they had things and so everything we know now is outdated. Buy a card and a block won't fit because so-so changed where a cap or something was. I just wish they would consider keeping watercoolers in mind when they do make changes like that.
> 
> Or is that WHY? Maybe too many RMA's and they suspect the card went faulty because of a watercooler goofed and so they decide to change something to make it not able to take a block anymore?


This is how it's always been done, dude. The rollout of reference cards is subsidized by AMD in a way, and it's done to ensure that you receive consistency. It also allows time for AIB partners to gear up production of their own designs. There is not as much profit for AIBs in selling reference cards; the profit is in producing the whole card.

On the flip side, there were so many posts complaining that there were no custom cards, and now that the ref card has reached EOL, people are disappointed that there will only be customs.

Make up your minds, hehe.

Quote:


> Originally Posted by *SgtMunky*
> 
> So I should wait before buying a 290 now then? Either pick up one of the new releases, or grab a Tri-X if it is the last bit of stock at a new lower price?


When ref cards go, they are gone. If you want a "new" reference, it's best to get it while they are available. That said, the Tri-X isn't going anywhere unless Sapphire does a revision on it and re-produces it on those crap blue PCBs... I'm just remembering the 7970 days, lol.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Krusher33*
> 
> That just tells me "we don't know for sure but it LOOKS like it could fit".
> 
> I'm sorry, I'm just frustrated because I know it'll be many months before I could consider getting another one and by that time, all the companies will do what they had done in the past, change the way they had things and so everything we know now is outdated. Buy a card and a block won't fit because so-so changed where a cap or something was. I just wish they would consider keeping watercoolers in mind when they do make changes like that.
> 
> Or is that WHY? Maybe too many RMA's and they suspect the card went faulty because of a watercooler goofed and so they decide to change something to make it not able to take a block anymore?


Well, from what I can see there are still reference designs from multiple AIBs available in the US, so I'd grab one now instead of waiting.

If not, then sell the one you have and grab another two later.


----------



## Krusher33

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Krusher33*
> 
> That just tells me "we don't know for sure but it LOOKS like it could fit".
> 
> I'm sorry, I'm just frustrated because I know it'll be many months before I could consider getting another one and by that time, all the companies will do what they had done in the past, change the way they had things and so everything we know now is outdated. Buy a card and a block won't fit because so-so changed where a cap or something was. I just wish they would consider keeping watercoolers in mind when they do make changes like that.
> 
> Or is that WHY? Maybe too many RMA's and they suspect the card went faulty because of a watercooler goofed and so they decide to change something to make it not able to take a block anymore?
> 
> 
> 
> This is how it's always been done dude. The roll out of reference cards are subsidized by AMD in a way and its done to ensure that you receive consistency. It also allows time for AIB partners to gear up their production of their own designs. There is not as much profit for AIBs to sell reference cards. The profit is in producing the whole card.
> 
> On the flip side there were so many posts complaining that there were no custom cards, and now that the ref card has reached EOL, ppl are disappointed that there will only be customs.
> 
> Make up your minds hehe.
> 
> Quote:
> 
> 
> 
> Originally Posted by *SgtMunky*
> 
> So I should wait before buying a 290 now then? Either pick up one of the new releases, or grab a Tri-X if it is the last bit of stock at a new lower price?
> 
> 
> When ref cards go, they are gone. If you want a "new" reference, it's best to get it while they are available. That said the Tri X isn't going anywhere unless Sapphire do a revision on it and re-produce it on that crap blue pcbs... I'm just remembering the 7970 days lol.

That's exactly what I was referring to.

I guess I was just hoping that things would have changed by now, now that watercooling is gaining popularity. At least that's how I feel.

Edit: in other news, GPU-Z now detects the memory vendor. Yay!


----------



## SgtMunky

I was expecting to see an MSI Lightning 290 or a Vapour-X 290; that's what I was referring to. Shall I just grab a Tri-X and be done with it then?

As long as it wipes the floor with a 2080p monitor for quite some time, I'll be happy either way.

Shame about the yellow on it, but I've heard great things and seen great reviews of that cooler.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *SgtMunky*
> 
> I was expecting to see a MSI Lightning 290 or a Vapour-X 290, whats what I was referring to. Shall I just grab a Tri-X and be done with it then
> 
> 
> 
> 
> 
> 
> 
> As long as it wipes the floor with a 2080p monitor for quite some time, I'll be happy either way
> 
> Shame about the yellow on it, but I've heard great things and seen great reviews on that cooler


If you mean 1080p then yes, you'll be set for a while


----------



## VSG

Quote:


> Originally Posted by *SgtMunky*
> 
> I was expecting to see a MSI Lightning 290 or a Vapour-X 290, whats what I was referring to. Shall I just grab a Tri-X and be done with it then
> 
> 
> 
> 
> 
> 
> 
> As long as it wipes the floor with a 2080p monitor for quite some time, I'll be happy either way
> 
> Shame about the yellow on it, but I've heard great things and seen great reviews on that cooler


There won't be a Lightning 290, only a 290X. Unsure about the Vapor-X, but then we haven't really heard any news about the other flagships (Toxic, Matrix, IceQ) at all.


----------



## Sgt Bilko

Quote:


> Originally Posted by *geggeg*
> 
> There won't be a Lightning 290, only 290x. Unsure about the Vapor-X but then we haven't really heard any news from the other flagships (Toxic, Matrix, IceQ) at all.


The IceQ is out, but I'm in no rush to buy it... well, if I didn't care about the way it looked, then sure.


----------



## rdr09

Quote:


> Originally Posted by *SgtMunky*
> 
> I was expecting to see a MSI Lightning 290 or a Vapour-X 290, whats what I was referring to. Shall I just grab a Tri-X and be done with it then
> 
> 
> 
> 
> 
> 
> 
> As long as it wipes the floor with a 2080p monitor for quite some time, I'll be happy either way
> 
> Shame about the yellow on it, but I've heard great things and seen great reviews on that cooler


Yah, if you mean 1080, then just get a 290. If you're into benching, then get the 290X. The thing is... once you start using the 290, you'll feel the urge to upgrade from 1080.


----------



## VSG

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The IceQ is out but i'm in no rush to buy it.......well, If i never cared about the way it looked then sure


No retailer ever got stock for sale, it ain't out just yet!


----------



## Loktar Ogar

Quote:


> Originally Posted by *rdr09*
> 
> yah, if you mean 1080, then just get a 290. if you're into benching, then get the 290X. the thing is . . . ones you start using the 290, you'll feel the urge to upgrade from 1080.


This is true.

I just have to agree based on experience.


----------



## rdr09

Quote:


> Originally Posted by *Loktar Ogar*
> 
> This is true.
> 
> 
> 
> 
> 
> 
> 
> I just have to agree based on experience.


I had a typo, but you beat me to it. Anyway, I plan on going Eyefinity. What monitor do you have in mind?


----------



## Sgt Bilko

Quote:


> Originally Posted by *geggeg*
> 
> No retailer ever got stock for sale, it ain't out just yet!


I stand corrected then. Pretty strange for it to be delayed; it's not like it's a new cooler or anything.


----------



## MIGhunter

Are any of you gaming on a single 290X with triple monitors? Is this doable? How's the FPS with 23" or 27" monitors?


----------



## Ukkooh

Any reviews out for the 290X IceQ? I tried googling for them but only found some news about it.


----------



## VSG

As mentioned above, the HIS cards are vaporware at this point.


----------



## Ukkooh

I just had an odd issue with my GPU. I was taking the dog out, and shortly after I came back I lost signal on my monitor and the GPU fan started running at 100%. Is this something to be worried about?


----------



## Redvineal

Question for you smart power supply folks...

I'm set to receive my 2nd and 3rd R9 290s on Friday. As I mentioned in previous posts, I have a Corsair AX860 PSU that can handle two cards for a while. In order to power the 3rd card, I'm considering using an older 850W PSU I have lying around.

Is it safe for me to use PCIe power connectors from 1 PSU on 2 cards, and PCIe power connectors from a second PSU to power the 3rd card?

I've seen builds with dual PSUs, but not sure about the risks and considerations involved.

(edit: btw, this is all just a temporary solution until I can find a nice high output PSU in stock somewhere)

Thanks!


----------



## Abyssic

Hey guys, what are your opinions on the MSI R9 290X Gaming?
It's a bit cheaper than the Sapphire Tri-X in my country, but I don't want to look only at the money... which one is the better choice in terms of noise and overclocking?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Abyssic*
> 
> hey guys, what are your opinions on the MSI R9 290X Gaming?
> it's a bit cheaper than the sapphire tri-x in my country but i don't wanna look only at the money... wich one is the better choice in terms of noise and overclocking?


I don't think many people have the MSI card yet. From what I've seen, the Tri-X has the best cooling and comes with Hynix memory; the MSI card also has decent cooling and Hynix memory, so I'm guessing it comes down to noise and aesthetics with both.

You can't trust review sites for noise readings; I just looked a few up, and the Tri-X ranges from 36 dB to 41 dB on load depending on which site you visit.


----------



## Abyssic

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I don't think that many people have the MSI card yet, From what i've seen the Tri-X has the best cooling and also comes with Hynix mem, the MSI card also has decent cooling and comes with Hynix mem so i'm guessing it's down to noise and aesthetics with both.
> 
> Can't trust review sites for noise reading, i just looked a few up and the Tri-X ranges from 36dB on load to 41dB depending on what site you visit


Thanks for the reply. Maybe I can find someone who has the MSI one ^^


----------



## Abyssic

The review from Guru3D shows a really good result for the cooling but not so much for overclocking. I am a bit concerned, though, because there is no mention of the case they used for the tests. If those tests were done on an open bench table, it would be a whole other result :/


----------



## Xyro TR1

My MSI has been on backorder for nearly a month; they say it'll get to me by February 15th.

I'll let you know by then!


----------



## ArchieGriffs

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I don't think that many people have the MSI card yet, From what i've seen the Tri-X has the best cooling and also comes with Hynix mem, the MSI card also has decent cooling and comes with Hynix mem so i'm guessing it's down to noise and aesthetics with both.
> 
> Can't trust review sites for noise reading, i just looked a few up and the Tri-X ranges from 36dB on load to 41dB depending on what site you visit


The Guru3D MSI review only shows the core clock going up to 1187 MHz. I'm assuming the reviewer isn't the most avid overclocker, and their Tri-X review is limited, not showing overclocking results, the system specs used for benching, noise/temps, etc. So for that site there's no direct comparison, but it's kind of disappointing to see 1187 MHz, especially considering there's always the possibility of a hand-picked card. If it comes with Hynix, that sounds really promising. Anyway, we need a direct comparison between the two, or more solid reviews of the MSI Gaming card, before I would pick one or the other. Because we do know about the Tri-X, though, I can tell you it's worth the purchase; not necessarily worth buying over the MSI, but it's the best non-reference 290 that we know of.


----------



## Abyssic

Quote:


> Originally Posted by *ArchieGriffs*
> 
> it's the best 290 non reference that we know of.


OK, you totally got me with that statement ^^ The Tri-X was my original thought, plus I can get it from my favourite retailer (not the MSI one), and I don't want to wait weeks or possibly months because I already sold my two 7950s and I'm running an Nvidia 9600 GT now xD

Sidenote: my favourite retailer doesn't sell any MSI products because they seem to be pretty troublesome in terms of RMA management.


----------



## SgtMunky

Thanks guys, a 290 it may be then, as the 780 is more than £50 more. Yes, I did mean 1080, and if I do go up in resolution it won't be for six-plus months, and it will only ever be a single monitor.


----------



## Derpinheimer

Quote:


> Originally Posted by *Ukkooh*
> 
> I just got an odd issue with my gpu. I was taking the dog out and shortly after I came back I lost signal on my monitor and the gpu fan started running at 100%. Is this something to get worried about?


This means the core temp went above 95°C. Very worrying.


----------



## velocityx

Quote:


> Originally Posted by *Derpinheimer*
> 
> This means the core temp went above 95*c, very worrying


Sometimes in BF4 I see temps go over 95. Dunno why it happens; I've seen it go as high as 98°C. MSI AB monitoring.


----------



## bond32

Quote:


> Originally Posted by *velocityx*
> 
> sometimes in bf4, I see temps go over 95. donno why it happens. seen it go as high as 98C. msi ab monitoring.


Not a temp problem; sounds like the black screen issue. It's a known issue with these cards, and there's a large thread about it:
http://www.overclock.net/t/1441349/290-290x-black-screen-poll


----------



## Ukkooh

I never had that fan speed issue before with my old 290X, which black screened occasionally. I hope it won't happen again.


----------



## Loktar Ogar

I just realized that the R9 290/290X GPU runs hot (ref and non-ref)! It makes my room warm, and the exhaust blows directly onto my feet since I don't have a PC chassis yet.

What more if you run it in Crossfire or Tri-fire?

I guess it doesn't matter if you are using a watercooling setup, and the ambient temp results will be the same?


----------



## Derpinheimer

Quote:


> Originally Posted by *bond32*
> 
> Not a temp problem, sounds like a black screen issue. Its a known issue with these cards, there's a large thread about it.
> http://www.overclock.net/t/1441349/290-290x-black-screen-poll


The black screen issue is memory instability and doesn't cause the fan to ramp up.

OTP (over-temperature protection) triggers an instant power-off and 100% fan. Maybe it's 100°C; I thought it was 96°C+.


----------



## esqueue

Quote:


> Originally Posted by *Loktar Ogar*
> 
> I just realized that the R9 290/290x GPU is hot (ref and non-ref)! It makes my room warm and the exhaust blows directly to my feet since i don't have a PC chassis yet.
> 
> 
> 
> 
> 
> 
> 
> 
> What more if you run this in X-fire or Tri-fire?
> 
> 
> 
> 
> 
> 
> 
> I guess it doesn't matter if you are using a watercooling setup and the ambient temp results will be the same?


Don't let a watercooling setup trick you into thinking these cards aren't putting out lots of heat either. It's the same amount of heat; the airflow is just spread over a wider area, which makes it feel cooler when you put your hand in front of the fans.

In my brief experience wasting money and time on mining, the radiator (a Pontiac heater core) actually got pretty warm. When you step into the room after it's been running for a while, you DEFINITELY know it's putting out lots of heat. My power supply fan going haywire is another indication that it's working hard.


----------



## Derpinheimer

Quote:


> Originally Posted by *Loktar Ogar*
> 
> I just realized that the R9 290/290x GPU is hot (ref and non-ref)! It makes my room warm and the exhaust blows directly to my feet since i don't have a PC chassis yet.
> 
> 
> 
> 
> 
> 
> 
> 
> What more if you run this in X-fire or Tri-fire?
> 
> 
> 
> 
> 
> 
> 
> I guess it doesn't matter if you are using a watercooling setup and the ambient temp results will be the same?


It warms my room up even with my watercooling setup. I have to keep a window open in sub-zero temps (not fully open, depending on the temp) to keep from cooking.


----------



## Loktar Ogar

Quote:


> Originally Posted by *esqueue*
> 
> Don't let a watercooling setup trick you into thinking that they aren't putting out lots of heat either. It's putting out the same amount of heat. Just with more air flow over a wide area makes it feel cooler in a when you put your hand in front of the fans.
> 
> My brief experience with wasting money and time mining made the radiator actually get pretty warm. A pontiac heater core. When you step into the room after it's been running for a while you DEFINITELY know that it is putting out lots of heat. Also my power supply fan going haywire is another indication that it is working hard.


Quote:


> Originally Posted by *Derpinheimer*
> 
> It warms my room up with my watercooling setup. I have to keep a window open in sub zero temps (not fully, depend on temp), to keep from cooking


I'm glad I posted this and got good answers from you guys. I appreciate it! Also, thanks in advance to anyone else with good responses as well.


----------



## devilhead

Quote:


> Originally Posted by *Loktar Ogar*
> 
> I just realized that the R9 290/290x GPU is hot (ref and non-ref)! It makes my room warm and the exhaust blows directly to my feet since i don't have a PC chassis yet.
> 
> 
> 
> 
> 
> 
> 
> 
> What more if you run this in X-fire or Tri-fire?
> 
> 
> 
> 
> 
> 
> 
> I guess it doesn't matter if you are using a watercooling setup and the ambient temp results will be the same?


I run 4-way Crossfire, all cards watercooled with EK blocks. Just for cooling those cards I'm using 3x 360 rads of 60 mm thickness and 2x D5 pumps at speed 4.

All cards are overclocked to 1000/1500 at +30 mV, and the mining temperatures are 49°C on the core and 63°C on the VRAM.

It makes the room damn hot; I need to open a window.

The whole system takes ~1500W from the wall.
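Essentially every watt a rig pulls from the wall ends up as heat in the room, so ~1500W is literally space-heater territory. A quick conversion sketch (the 1 W ≈ 3.412 BTU/hr figure is the standard conversion, not something from this thread):

```python
W_TO_BTU_PER_HR = 3.412  # standard conversion: 1 watt ~= 3.412 BTU/hr

def room_heat_btu_hr(wall_watts):
    """Nearly all electrical draw becomes room heat; return it in BTU/hr."""
    return wall_watts * W_TO_BTU_PER_HR

print(f"{room_heat_btu_hr(1500):.0f} BTU/hr")  # ~5118 BTU/hr from a ~1500 W rig
```

That is roughly the output of a typical portable space heater, which is why the window has to come open.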


----------



## Loktar Ogar

Quote:


> Originally Posted by *devilhead*
> 
> i run 4x way crossfire, all cards are watercooled EK, just for cooling those cards using 3x360 rads 60mm thicknes, 2xD5 pumps at 4 speed
> 
> 
> 
> 
> 
> 
> 
> all cards overclocked 1000/1500 +30mv, and the mining temperatures is 49C on core and 63C on vram
> 
> 
> 
> 
> 
> 
> 
> and it makes room damn hot, need to open window
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> all system takes ~1500W from wall


I'm speechless.

Thanks for pointing out the load (~1500W) of running a 4-way Crossfire setup.


----------



## NirHahs

PowerColor PCS+ R9 290 or Sapphire Tri-X? It seems Sapphire has had a better response compared to other 290s, but I'm very attracted to the PowerColor's backplate.


----------



## ryboto

Question regarding power supplies: if my system, under load in Crysis 3, BF4, and 3DMark/3DMark 11, never draws more than ~420W from the wall, I can get away with a 550W PSU, correct? I am considering upgrading the PSU in my SG08 to an S12G-550. I was considering the 650, but with the lack of space in the case and the extra cables the higher-power unit comes with, I'd rather drop to the 550W.

For reference, the 600W PSU in the Silverstone is 80 Plus Bronze rated. The trouble with it is the low 12V rail, which drops to 11.4V under load. Unless GPU-Z shouldn't be trusted for 12V monitoring?
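Worth noting when doing this math: wall draw and PSU load aren't the same number. The wall figure includes the PSU's own conversion losses, so the DC load on the unit is roughly wall draw times efficiency. A rough sketch, where the ~85% figure is an assumption for an 80 Plus Bronze unit around half load (real curves vary by model):

```python
def dc_load_watts(wall_draw_w, efficiency=0.85):
    """DC power the PSU actually delivers for a given wall (AC) draw.

    efficiency=0.85 is an assumed figure for an 80 Plus Bronze unit
    at roughly half load; check the review curve for a specific model.
    """
    return wall_draw_w * efficiency

load = dc_load_watts(420)           # ~357 W of actual DC load
headroom = (550 - load) / 550       # spare fraction of a 550 W unit
print(f"DC load: {load:.0f} W, headroom on a 550 W PSU: {headroom:.0%}")
```

By that estimate, a 420W wall draw works out to roughly 357W of DC load, leaving a 550W unit with about a third of its capacity in reserve.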


----------



## esqueue

Quote:


> Originally Posted by *ryboto*
> 
> Question regarding power supplies- If my system, under load using Crysis 3, BF4, 3dmark/11 never draws more than ~420W from the wall, I can get away with a 550W PSU, correct? I am considering upgrading the PSU in my SG08 to an S12G-550. I was considering the 650, but with the lack of space in the case and extra cables the higher power unit comes with, I'd rather drop to the 550W.
> 
> For reference, the 600W PSU in the Silverstone is 80Plus Bronze rated. Trouble with it is the low 12v, drops to 11.4V under load. Unless GPU-Z shouldn't be trusted for 12v monitoring?


If space is an issue, get a modular power supply (removable cables). I have an Antec 620M power supply that I love.


----------



## Loktar Ogar

Quote:


> Originally Posted by *ryboto*
> 
> Question regarding power supplies- If my system, under load using Crysis 3, BF4, 3dmark/11 never draws more than ~420W from the wall, I can get away with a 550W PSU, correct? I am considering upgrading the PSU in my SG08 to an S12G-550. I was considering the 650, but with the lack of space in the case and extra cables the higher power unit comes with, I'd rather drop to the 550W.
> 
> For reference, the 600W PSU in the Silverstone is 80Plus Bronze rated. Trouble with it is the low 12v, drops to 11.4V under load. Unless GPU-Z shouldn't be trusted for 12v monitoring?


If you don't have plans for Crossfire in the near future, I wouldn't suggest getting more than a 600W PSU if you're only pulling 420W at your OC settings; save the money for your next upgrade.

My best tip is to get a high-efficiency, very good quality PSU if you really want a PSU upgrade, as I have been told before by a PSU guru.


----------



## shilka

Quote:


> Originally Posted by *ryboto*
> 
> Question regarding power supplies- If my system, under load using Crysis 3, BF4, 3dmark/11 never draws more than ~420W from the wall, I can get away with a 550W PSU, correct? I am considering upgrading the PSU in my SG08 to an S12G-550. I was considering the 650, but with the lack of space in the case and extra cables the higher power unit comes with, I'd rather drop to the 550W.
> 
> For reference, the 600W PSU in the Silverstone is 80Plus Bronze rated. Trouble with it is the low 12v, drops to 11.4V under load. Unless GPU-Z shouldn't be trusted for 12v monitoring?


A 550W unit is enough if you don't overvolt.


----------



## Redvineal

Quote:


> Originally Posted by *Redvineal*
> 
> Question for you smart power supply folks...
> 
> I'm set to receive my 2nd and 3rd R9 290's on Friday. As I mentioned in previous posts, I have a Corsair AX860 PSU that can handle 2 cards for a while. In order to power the 3rd card, I'm considering using an older 850 PSU I have lying around.
> 
> Is it safe for me to use PCIe power connectors from 1 PSU on 2 cards, and PCIe power connectors from a second PSU to power the 3rd card?
> 
> I've seen builds with dual PSUs, but not sure about the risks and considerations involved.
> 
> (edit: btw, this is all just a temporary solution until I can find a nice high output PSU in stock somewhere)
> 
> Thanks!


Anyone? I hate to bump, but I can't seem to find any good info about what I'd like to do...


----------



## shilka

Quote:


> Originally Posted by *Redvineal*
> 
> Anyone? I hate to bump, but I can't seem to find any good info about what I'd like to do...


You could run all three cards on the AX860, but only if you leave your whole system at stock.

After you get a bigger PSU you can go nuts.

Or you could ghetto-rig that TX850 into your system.


----------



## Redvineal

Quote:


> Originally Posted by *shilka*
> 
> You could run all 3 cards on the 860 but only if you leave your whole system on stock
> 
> After you get a bigger PSU you can go nuts
> 
> Or you could ghetto rig that TX850 into your system


My single card system right now on full load uses about 430-440W at stock. Adding two cards will probably bust the AX860 easily.

I don't mind the ghetto rig for a few days/weeks. I just need to know if I can safely jump the TX850 and use only its PCIe connectors on 1 or 2 cards. Just to supplement the power. Mainly, I'm concerned that I'm not considering something with the ghetto rig which could cause GPU failure, fire, or whatever else is worse than those!

Has anyone tried this before? Am I the first around here to come up with such an idea?

Every OC post I've read about dual PSU's were people wanting to use 2x1200W units or something crazy like that. The only responses were "man, that's overkill, save your money" type responses.
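For a rough sanity check of the plan above, here's a split-load budget sketch in Python - the per-component wattages are assumptions for illustration, not measurements:

```python
# Sketch of how the load would split across the two supplies.
# Per-component wattages below are assumptions, not measurements.
CARD_W = 250   # assumed peak gaming draw per R9 290 at stock
REST_W = 180   # assumed CPU + motherboard + drives + fans

ax860_load = REST_W + 2 * CARD_W   # system plus two cards on the AX860
tx850_load = 1 * CARD_W            # third card's PCIe power from the TX850

print(f"AX860: ~{ax860_load}W of 860W rated")  # ~680W
print(f"TX850: ~{tx850_load}W of 850W rated")  # ~250W
```

One caveat: the third card's ~75W of PCIe slot power still comes through the motherboard, i.e. from the primary PSU, so only the 6/8-pin connectors are actually offloaded onto the second unit.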


----------



## Sazz

Quote:


> Originally Posted by *ryboto*
> 
> Question regarding power supplies- If my system, under load using Crysis 3, BF4, 3dmark/11 never draws more than ~420W from the wall, I can get away with a 550W PSU, correct? I am considering upgrading the PSU in my SG08 to an S12G-550. I was considering the 650, but with the lack of space in the case and extra cables the higher power unit comes with, I'd rather drop to the 550W.
> 
> For reference, the 600W PSU in the Silverstone is 80Plus Bronze rated. Trouble with it is the low 12v, drops to 11.4V under load. Unless GPU-Z shouldn't be trusted for 12v monitoring?


You can't trust any voltage monitoring software; if you want a precise voltage measurement you need a multimeter. Heck, I have the Asus Probe utility installed and it says my 12v drops to 0.12v - at that voltage my system should have crashed, LOL. It even claims my CPU voltage goes as low as 0.05v.

They generally just give you an idea of where your voltages sit, not the exact values.
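As a point of reference, the ATX spec allows ±5% on the 12V rail, i.e. roughly 11.40-12.60V; a quick check of the readings mentioned above:

```python
# Check whether a reported 12V rail reading falls inside the ATX +/-5% tolerance.
ATX_12V_NOMINAL = 12.0
ATX_TOLERANCE = 0.05  # ATX spec: +/-5% on the 12V rail

def rail_within_spec(reading_v, nominal=ATX_12V_NOMINAL, tol=ATX_TOLERANCE):
    low, high = nominal * (1 - tol), nominal * (1 + tol)
    return low <= reading_v <= high

print(rail_within_spec(11.4))   # True - right at the -5% floor
print(rail_within_spec(0.12))   # False - a reading like this is sensor garbage
```

So an 11.4V under-load reading, if accurate, is technically within spec but with no margin, while a 0.12V reading just means the sensor or software is wrong.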


----------



## jerrolds

Quote:


> Originally Posted by *esqueue*
> 
> Don't let a watercooling setup trick you into thinking the cards aren't putting out lots of heat either. It's the same amount of heat; more air flow over a wide area just makes it feel cooler when you put your hand in front of the fans.
> 
> My brief experience with wasting money and time mining made the radiator (a Pontiac heater core) actually get pretty warm. When you step into the room after it's been running for a while you DEFINITELY know it is putting out lots of heat. My power supply fan going haywire is another indication that it is working hard.


I dunno - I think with better cooling the heat is released more slowly, letting the surroundings dissipate it more easily, and the card may not even get as hot to begin with (since it'll be more electrically efficient when cooler). I have a spare bedroom that houses my PC, and when the 290X had the reference cooler it was noticeably hotter in there than with my Gelid.

It was like a damned hair dryer - pretty hot when I opened the door after a long gaming session. I remember being shocked at how much cooler it was with the Gelid. Don't think ambient temps changed much over those two weeks.


----------



## Forceman

Quote:


> Originally Posted by *Redvineal*
> 
> My single card system right now on full load uses about 430-440W at stock. Adding two cards will probably bust the AX860 easily.
> 
> I don't mind the ghetto rig for a few days/weeks. I just need to know if I can safely jump the TX850 and use only its PCIe connectors on 1 or 2 cards. Just to supplement the power. Mainly, I'm concerned that I'm not considering something with the ghetto rig which could cause GPU failure, fire, or whatever else is worse than those!
> 
> Has anyone tried this before? Am I the first around here to come up with such an idea?
> 
> Every OC post I've read about dual PSU's were people wanting to use 2x1200W units or something crazy like that. The only responses were "man, that's overkill, save your money" type responses.


Yes, you can run two PSUs like that to spread the load. They sell adapter things so that the second one comes on with the first also, should be able to find one at FrozenCPU or Performance PCs, somewhere like that, but I can't remember the name of the adapter thing off-hand.


----------



## VSG

ADD2PSU


----------



## esqueue

Quote:


> Originally Posted by *jerrolds*
> 
> I dunno - i think with better cooling the heat is let off more slowly letting the surroundings dissipate it easier, may not even get as hot to begin with (since itll be more electrically efficient due to better cooling) - i have a spare bedroom that houses my pc, and when the 290X had the reference cooler it was noticeably hotter in there than with my gelid.
> 
> It was like a damned hair dryer, pretty hot when i opened the door after a long gaming session - i remember being shocked at how much cooler it was with the gelid. Dont think ambient temps changed too much those 2 weeks.


I understand what you're saying about efficiency meaning less energy wasted as heat, but it's not by much, and the room will still get hot. I've never run the card very hard on air, but my room did get hot as hell when I mined. I actually just turned off [email protected] because the room was getting too hot.


----------



## alancsalt

Add2Psu

Dual PSU 24-Pin Adapter Cable


----------



## Derpinheimer

Quote:


> Originally Posted by *jerrolds*
> 
> I dunno - i think with better cooling the heat is let off more slowly letting the surroundings dissipate it easier, may not even get as hot to begin with (since itll be more electrically efficient due to better cooling) - i have a spare bedroom that houses my pc, and when the 290X had the reference cooler it was noticeably hotter in there than with my gelid.
> 
> It was like a damned hair dryer, pretty hot when i opened the door after a long gaming session - i remember being shocked at how much cooler it was with the gelid. Dont think ambient temps changed too much those 2 weeks.


There will be a minute difference in the amount of heat released between the two, most likely in favor of the watercooling setup, which, depending on the amount of water, may be holding quite a bit of heat; the small air heatsink, though scorching hot to the touch, isn't really storing that much energy.

And there's a good 10-30W difference between a cool and a hot GPU, I've found, but the pump will negate most of those power savings.


----------



## Redvineal

Quote:


> Originally Posted by *Forceman*
> 
> Yes, you can run two PSUs like that to spread the load. They sell adapter things so that the second one comes on with the first also, should be able to find one at FrozenCPU or Performance PCs, somewhere like that, but I can't remember the name of the adapter thing off-hand.


Quote:


> Originally Posted by *geggeg*
> 
> ADD2PSU


Thank you! And thanks to the others that chimed in earlier, as well!

I don't think I'll need an adapter since my computer stays on pretty much 24/7.


----------



## kizwan

Quote:


> Originally Posted by *Ukkooh*
> 
> I just got an odd issue with my gpu. I was taking the dog out and shortly after I came back I lost signal on my monitor and the gpu fan started running at 100%. Is this something to get worried about?


Probably just crashed. No need to worry. If it's persistent then you need to check your PSU. If you use a modular PSU and/or PCIe power extension cables, check the cables. Sometimes the pins don't make proper contact even though the connector seems seated properly. If the problem still persists with a different PSU and the cables are confirmed OK, then maybe something is wrong with the card.
Quote:


> Originally Posted by *Redvineal*
> 
> Question for you smart power supply folks...
> 
> I'm set to receive my 2nd and 3rd R9 290's on Friday. As I mentioned in previous posts, I have a Corsair AX860 PSU that can handle 2 cards for a while. In order to power the 3rd card, I'm considering using an older 850 PSU I have lying around.
> 
> Is it safe for me to use PCIe power connectors from 1 PSU on 2 cards, and PCIe power connectors from a second PSU to power the 3rd card?
> 
> I've seen builds with dual PSUs, but not sure about the risks and considerations involved.
> 
> (edit: btw, this is all just a temporary solution until I can find a nice high output PSU in stock somewhere)
> 
> Thanks!


Yes you can. It's definitely safe. You can get a dual power supply connector/adapter - it's optional, but it lets you turn both power supplies on and off at the same time. You can also DIY the connector yourself.



Quote:


> Originally Posted by *Loktar Ogar*
> 
> I just realized that the R9 290/290x GPU is hot (ref and non-ref)! It makes my room warm and the exhaust blows directly to my feet since i don't have a PC chassis yet.
> 
> 
> 
> 
> 
> 
> 
> 
> What more if you run this in X-fire or Tri-fire?
> 
> 
> 
> 
> 
> 
> 
> I guess it doesn't matter if you are using a watercooling setup and the ambient temp results will be the same?


My room is hotter when I'm using the stock cooler because the heat is concentrated at one spot in the room. I have dual GPUs; you can imagine how much heat they produce. With a watercooling setup the heat is removed by the radiator(s). It still warms up my room, but not like the stock cooler, because the heat is no longer concentrated at one spot in the room.


----------



## ryboto

Quote:


> Originally Posted by *esqueue*
> 
> If space is an issue, get a modular power supply (removable cables). I have an Antec 620m power supply that I love.


I'm limited to a 140mm long PSU. A 140mm modular PSU would only fit if I didn't use any of the cables. My case is the SG08, I have an mITX system.
Quote:


> Originally Posted by *Loktar Ogar*
> 
> If you don't have plans of doing X-fire in the near future, i will not suggest getting more than 600W PSU if you only pulling 420W at your OC settings to save money for your next upgrade.
> 
> My best tip is to get a high efficiency and very good quality PSU if you really want a PSU upgrade as i have been told before by a PSU guru.


Well, yea, I generally go that route. The SG08 comes with a 600W Silverstone 80Plus Bronze PSU. I am not sure if I can trust it as far as voltage output, so I'm considering a Seasonic as a replacement. As this is the rig in my sig, it's mITX, so no x-fire.

Quote:


> Originally Posted by *shilka*
> 
> A 550 watts is enough if you dont overvolt


I do overvolt, and those power draw numbers I posted are with an overclocked and overvolted GPU/CPU under in-game load.
Quote:


> Originally Posted by *Sazz*
> 
> You can't trust any of the voltage monitor software, if you want a precise measurement of voltage you would need a multi-meter. heck I got Asus Probe Utility installed and it says my 12v drops to 0.12v and at that voltage my system should have crashed, LOL Heck it even says my CPU voltage goes as low as 0.05v LOLOL
> 
> They generally just give you an idea on how much voltage you are using but not the exact amount.


Well, I think GPU-Z is measuring the 12v supply to the card, whereas all the motherboard monitoring software I have reads the 12v as 1.26v. The Kill-A-Watt is what I'm using to measure power draw.


----------



## esqueue

Quote:


> Originally Posted by *kizwan*
> 
> My room is hotter when I'm using stock cooler because the heat is concentrated at one spot in the room. I have dual GPU, you can imagine how much heat it produce. With watercooling setup, the heat being removed by radiator(s). It still warm up my room but not like when using stock cooler because the heat no longer concentrated at one spot in the room.


A BTU is the same whether it's put out in a concentrated area or over a wide area. A watercooling system may take longer, since the water can store quite a bit of heat, but it will put out exactly the same heat as an air cooler. A soldering iron feels hotter to the touch because its heat is concentrated, but the computer is generating more heat overall and will heat up a room far more than a 40 watt soldering iron ever will. Even at 15 watts a soldering iron is impossible to touch.
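To put numbers on that: 1 W ≈ 3.412 BTU/h, and a PC dumps essentially all of its electrical draw into the room as heat regardless of the cooler:

```python
# Convert electrical draw to heat output. Whatever the cooler, a PC
# releases essentially all of its input power into the room as heat.
W_TO_BTU_PER_H = 3.412

def btu_per_hour(watts):
    return watts * W_TO_BTU_PER_H

print(round(btu_per_hour(440)))  # 1501 BTU/h - a 440W gaming load
print(round(btu_per_hour(40)))   # 136 BTU/h - the 40W soldering iron
```

For scale, 1500 BTU/h is in the same ballpark as a small space heater, which matches the "hair dryer room" experience above.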


----------



## PillarOfAutumn

Question: I have one R9 290 running, OC'd to 1150/1450 and +87mV. I was playing BF4 and I usually get 90-120 fps. However, today I connected a 1440x900 monitor, and without even using it (it's just connected via DVI and powered on), my frames drop to 40-60 in game. Is this normal? Any way I can improve this?

Thanks.


----------



## Arizonian

I was reading about the beta driver for Mantle. I thought it was going to be a complete, release-ready project. Looks like the bigger improvements are going to come from lower-performing CPUs, not so much on the higher end. I wonder if it's in beta stages only for the Battlefield 4 implementation, or whether it includes other supported upcoming games like Thief and Star Citizen?

*AMD Catalyst 14.1 Beta Driver Brings Mantle Support, Frame Pacing Phase 2, HSA*
*
AMD*
Quote:


> We want to convey that this is only the initial release of Mantle. Mantle will continue to grow, evolve and improve in the months ahead. As an initial release, however, there is a list of known issues we are tracking. Everyone in the Mantle ecosystem is working to identify the root cause of these problems, and to resolve them as quickly as possible. As they are resolved, we and our partners will be issuing new drivers and patches as necessary!
> 
> We felt it best to get users working with and providing feedback on Mantle as soon as possible, rather than hold the entire launch for select scenarios that aren't performing up to our expectations. The known issues will be posted for all to see at www.amd.com/mantleknownissues, and attached you will find the complete list.


*Stats*
Quote:


> •*Core i7-4960X CPU + R9 290X GPU
> ◦1080p, Ultra Preset, 4xAA: 9.2% improvement with Mantle
> ◦1600p, Ultra Preset, 4xAA: 10% improvement with Mantle*
> 
> •Core i7-4960X CPU + R7 260X GPU
> ◦1080p, Ultra Preset, 4xAA: 2.7% improvement
> ◦1600p, Ultra Preset, 4xAA: 1.4% improvement
> 
> •*A10-7700K CPU + R9 290X GPU
> ◦1080p, Ultra Preset, 4xAA: 40.9% improvement
> ◦1600p, Ultra Preset, 4xAA: 17.3% improvement*
> 
> •A10-7700K CPU + R7 260X GPU
> ◦1080p, Ultra Preset, 4xAA: 8.3% improvement
> ◦1600p, Low Preset: 16.8% improvement


Hope we get some Mantle results soon to confirm these gains in actual gaming.
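For a feel of what those percentages mean in frames, here's the trivial arithmetic (the 60 fps baseline is a made-up example, not from AMD's numbers):

```python
# Translate a percentage uplift into a frame rate.
def fps_with_uplift(baseline_fps, pct_improvement):
    return baseline_fps * (1 + pct_improvement / 100)

# Hypothetical 60 fps baseline:
print(round(fps_with_uplift(60, 9.2), 1))    # 65.5 - the 4960X + 290X pairing
print(round(fps_with_uplift(60, 40.9), 1))   # 84.5 - the A10-7700K + 290X pairing
```

Which is why the gains look modest on a big Intel chip but dramatic on an APU: the slower the CPU, the bigger the bottleneck Mantle removes.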


----------



## Derpinheimer

Quote:


> Originally Posted by *Redvineal*
> 
> My single card system right now on full load uses about 430-440W at stock. Adding two cards will probably bust the AX860 easily.
> 
> I don't mind the ghetto rig for a few days/weeks. I just need to know if I can safely jump the TX850 and use only its PCIe connectors on 1 or 2 cards. Just to supplement the power. Mainly, I'm concerned that I'm not considering something with the ghetto rig which could cause GPU failure, fire, or whatever else is worse than those!
> 
> Has anyone tried this before? Am I the first around here to come up with such an idea?
> 
> Every OC post I've read about dual PSU's were people wanting to use 2x1200W units or something crazy like that. The only responses were "man, that's overkill, save your money" type responses.


860w is likely fine for 3 cards, as long as there is no overvolting.


----------



## DamnedLife

Can you add me?


It's a Sapphire TRI X R9 290X OC









GPUZ Validation: http://www.techpowerup.com/gpuz/c6nvr/


----------



## Ukkooh

Quote:


> Originally Posted by *kizwan*
> 
> Probably just crashed. No need to worry. If it's persistent then you need to check your PSU. If you use modular PSU and/or using extension PCIe power cables, check the cables. Sometime the pins doesn't make proper contact even though the connector seems connected properly. If the problem still persist with different PSU & cables are confirmed ok, then maybe something is wrong with the card.
> Yes you can. It definitely safe. You can get dual power supply connector/adapter. It's optional though but you can turn on/off both power supply at the same time. You also can DIY the connector yourself.


I guess I'll check the PSU cables today. Thanks for the tip.


----------



## Prozillah

Hey guys - don't have mine yet but hopefully soon. Quick heads up: 27" monitor @ 1440p 120Hz, 2x GTX 670 OC models, 1000W AURUM PSU.

I'm looking to go 2x 290's - I'm working out trying to decide which card to get! Options include:

Saph Tri X
Gigabyte Windforce
Powercolour PCS+

Question:
Will my 1000w PSU handle my OC'd CPU which is a 4670k @ 4.4ghz 1.27v AND 2x 290's OC?

Which card would be best for crossfire / heat dissipation / OC ability? I don't care about noise or price too much, I just don't want the things to throttle under heat etc. with a nice healthy OC?

Cheers,


----------



## Arizonian

Quote:


> Originally Posted by *DamnedLife*
> 
> Can you add me.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> It's a Sapphire TRI X R9 290X OC
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GPUZ Validation: http://www.techpowerup.com/gpuz/c6nvr/


Congrats - added









Post back with your TRI-X overclocking results. I'm interested in how it does.


----------



## kizwan

Quote:


> Originally Posted by *Prozillah*
> 
> Hey guys - dont have mine yet but hopefully soon. Quick heads up - 27" mon @ 1440p 120hz. 2x GTX 670 OC models. 1000w AURUM PSU.
> 
> I'm looking to go 2x 290's - I'm working out trying to decide which card to get! Options include:
> 
> Saph Tri X
> Gigabyte Windforce
> Powercolour PCS+
> 
> Question:
> Will my 1000w PSU handle my OC'd CPU which is a 4670k @ 4.4ghz 1.27v AND 2x 290's OC?
> 
> Which card would be the best to Xfire / heat dissipation / OC ability? Don't care about noise or price to much just dont want the things to throttle under head etc with a nice healthy OC?
> 
> Cheers,


Your 1000W PSU can definitely support 2 x 290's OC'd. My 1050W PSU handles a 3820 @ 4.75GHz and 2 x 290's @ 1150 (core) / 1600 (memory) MHz just fine.


----------



## Asrock Extreme7

Well, first go at mining: 645KH/s mining 42coin, hope I find one. Core 915, mem 1400, temp 48C - will running this 24/7 kill my card?


----------



## broken pixel

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> well first go at mining 645kh mining 42 coin hope I find one // core 915 mem 1400 temp 48c will this kill my card 24/7


48C is a good temp, man.

Try 880/1250 at -25mV; that gives my XFX 290X 845KH/s @ 65C each at 63% fan, with -I 20 --thread-concurrency 32765.

If you are solo mining I suggest you find a pool to mine with, because 645KH/s is not enough hashrate for a coin that has been out for a while.
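On the pool-vs-solo point: expected earnings scale with your share of the network hashrate, which is why a small solo miner rarely finds a block. A sketch with placeholder network figures (not real stats for any coin):

```python
# Expected coins per day is proportional to your share of total network hashrate.
# Network figures below are placeholders, not real stats.
def coins_per_day(my_khs, network_khs, block_reward, blocks_per_day):
    return (my_khs / network_khs) * block_reward * blocks_per_day

# e.g. 845 KH/s against a hypothetical 50 MH/s network,
# 50-coin block reward, 576 blocks per day:
print(round(coins_per_day(845, 50_000, 50, 576), 1))  # 486.7
```

In a pool that share arrives as a steady trickle of payouts; solo, the same long-run average arrives in rare whole-block lumps, which for a low hashrate can mean weeks between blocks.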


----------



## DamnedLife

Quote:


> Originally Posted by *Prozillah*
> 
> Hey guys - dont have mine yet but hopefully soon. Quick heads up - 27" mon @ 1440p 120hz. 2x GTX 670 OC models. 1000w AURUM PSU.
> 
> I'm looking to go 2x 290's - I'm working out trying to decide which card to get! Options include:
> 
> Saph Tri X
> Gigabyte Windforce
> Powercolour PCS+
> 
> Question:
> Will my 1000w PSU handle my OC'd CPU which is a 4670k @ 4.4ghz 1.27v AND 2x 290's OC?
> 
> Which card would be the best to Xfire / heat dissipation / OC ability? Don't care about noise or price to much just dont want the things to throttle under head etc with a nice healthy OC?
> 
> Cheers,


The PowerColor PCS+ isn't out yet, so no comment, other than that it may be a 3-slot card and therefore harder to crossfire?
But I would go Sapphire Tri-X, most definitely.


----------



## syniad

So I finally got around to putting in the new thermal pads on my 290s and what a difference!
VRM temps right down to less than 50 degrees, much improved from the 100 degrees I was hitting before lol.

Overclocked to 1200/1600 with +200mV and it's 100% stable in Heaven benchmark 4.0, but for some reason when I play Hitman: Absolution in crossfire I get an occasional system freeze. Not sure if it's the game itself, driver problems, or a slightly unstable overclock. I haven't had much chance to try other games; gonna have to investigate later.

Also my XFX r9 290 is being really weird with GPU usage going all over the place, whereas the Sapphire card stays relatively level. I don't know if this has anything to do with running on PCI-e 2.0 8x/8x but it only happens to the XFX card no matter what slot it's in. I might try flashing the XFX with the sapphire's bios when I get in to see if that changes anything.


----------



## Widde

http://battlelog.battlefield.com/bf4/forum/threadview/2979150493815503479/ yay ^^ Drivers shouldnt be far away now ^_^
http://battlelog.battlefield.com/bf4/news/view/bf4-mantle-live/

http://piclair.com/odzjp

"We are putting the finishing touches on the AMD Catalyst™ 14.1 Beta driver, which enables support for Mantle and many other fabulous features. We appreciate your patience and will update you as soon as it's ready for download!" - from AMD's Facebook. Wiiiee, feels like I've developed some OCD over this Mantle thing







Just want moar performance but trying to be realistic


----------



## PillarOfAutumn

I have one R9 290 running, OC'd to 1150/1450 and +87mV. I was playing BF4 and I usually get 90-120 fps. However, today I connected a 1440x900 monitor, and without even using it (it's just connected via DVI and powered on), my frames drop to 40-60 in game. Is this normal? Any way I can improve this?


----------



## rdr09

After the last patch... EA is recommending I move on from 13.11. Last time I installed 13.12, my core voltage on the 290 went a tad higher and so did the temp.


----------



## Stuuut

So just ordered a Sapphire r290x Triple-X







yay, the BF4 version was cheaper than the one without it - dunno how that's possible, but so be it.


----------



## Krusher33

Quote:


> Originally Posted by *Arizonian*
> 
> I was reading about the beta driver for mantle. I thought it was going to be a completed ready project. Looks like the bigger improvements are going to come from lower performing CPU's not so much on higher end. I wonder if it's in beta stages for Battlefield 4 implementation or includes other supported upcoming games like Thief and Star Citizen?
> 
> *AMD Catalyst 14.1 Beta Driver Brings Mantle Support, Frame Pacing Phase 2, HSA*
> *
> AMD*
> Quote:
> 
> 
> 
> We want to convey that this is only the initial release of Mantle. Mantle will continue to grow, evolve and improve in the months ahead. As an initial release, however, there is a list of known issues we are tracking. Everyone in the Mantle ecosystem is working to identify the root cause of these problems, and to resolve them as quickly as possible. As they are resolved, we and our partners will be issuing new drivers and patches as necessary!
> 
> We felt it best to get users working with and providing feedback on Mantle as soon as possible, rather than hold the entire launch for select scenarios that aren't performing up to our expectations. The known issues will be posted for all to see at www.amd.com/mantleknownissues, and attached you will find the complete list.
> 
> 
> 
> *Stats*
> Quote:
> 
> 
> 
> •*Core i7-4960X CPU + R9 290X GPU
> ◦1080p, Ultra Preset, 4xAA: 9.2% improvement with Mantle
> ◦1600p, Ultra Preset, 4xAA: 10% improvement with Mantle*
> 
> •Core i7-4960X CPU + R7 260X GPU
> ◦1080p, Ultra Preset, 4xAA: 2.7% improvement
> ◦1600p, Ultra Preset, 4xAA: 1.4% improvement
> 
> •*A10-7700K CPU + R9 290X GPU
> ◦1080p, Ultra Preset, 4xAA: 40.9% improvement
> ◦1600p, Ultra Preset, 4xAA: 17.3% improvement*
> 
> •A10-7700K CPU + R7 260X GPU
> ◦1080p, Ultra Preset, 4xAA: 8.3% improvement
> ◦1600p, Low Preset: 16.8% improvement
> 
> 
> Hope we get some Mantle results soon and confirm these gains gaming.

I have 7850K + R9 290X. I'm getting the GPU waterblock tonight but it may take me tonight and tomorrow night to do the acrylic tubing. I suspect by Saturday we'll have results from other people already.


----------



## Aggronor

I have an i7 3770K + Sapphire R9 290X, and I will get just 9% better performance?.. Where is the super 45% increase?..

Still waiting..


----------



## Widde

Quote:


> Originally Posted by *Aggronor*
> 
> 
> 
> 
> 
> 
> 
> 
> , I have an i7 3770k + Sapphire R9 290X, and i will get just 9% of better performance?.. where is the super 45% percent of increase ?..
> 
> still waiting..


It was "up to 45%" ^_^


----------



## rage fuury

Quote:


> Originally Posted by *Aggronor*
> 
> 
> 
> 
> 
> 
> 
> 
> , I have an i7 3770k + Sapphire R9 290X, and i will get just 9% of better performance?.. where is the super 45% percent of increase ?..
> 
> still waiting..


Dude, are you real? Do you know how much you have to overclock your card for a "mere" 9% increase in performance? They never said you'd get a 40% increase in all scenarios anyway ("up to 40%" means something else), and I have a feeling you are a troll. If not, please show us proof that you have a 290X...


----------



## EliteReplay

Quote:


> Originally Posted by *Aggronor*
> 
> 
> 
> 
> 
> 
> 
> 
> , I have an i7 3770k + Sapphire R9 290X, and i will get just 9% of better performance?.. where is the super 45% percent of increase ?..
> 
> still waiting..


Yeah, because Intel gives you a 9% improvement every year and you still upgrade for less than that... this one is a free performance upgrade.


----------



## kdawgmaster

Quote:


> Originally Posted by *Aggronor*
> 
> 
> 
> 
> 
> 
> 
> 
> , I have an i7 3770k + Sapphire R9 290X, and i will get just 9% of better performance?.. where is the super 45% percent of increase ?..
> 
> still waiting..


Get a second R9 290X and you can get up to 50%, apparently, even with a 3960X. Proof: http://battlelog.battlefield.com/bf4/news/view/bf4-mantle-live/


----------



## Aggronor

Quote:


> Originally Posted by *rage fuury*
> 
> Dude, are you real? Do you Know how much you have to overclock your card for a "mere" 9% increase in performance? They never said you'll have 40% increase in performance in all circumstances anyway ("up to 40%" means something else) and I have a feeling you are a troll. If not please show us a prove that you have a 290X...


Troll??... OK, here is the proof:

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/12930#post_21505227

And.. yes, I understand.. but I was expecting something extraordinarily better.. anyway, you are right.

I will wait and run my own tests with my PC config.


----------



## rage fuury

Quote:


> Originally Posted by *Aggronor*
> 
> troll??... ok here is the prove:
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/12930#post_21505227
> 
> and.. yes I understand.. but.. i was specting something extaordinary better.. , anyway you are right.
> 
> I will wait to perform my own tests, with my PC cfg.


OK, please accept my apologies... Regarding the performance, I think you should just take anything that's given to you, especially if it's free...


----------



## Krusher33

I knew this would happen. When AMD first demonstrated mantle and HSA, they indeed said "*up to*" 45%.

I had said at the time that we'll probably only see the 45% on AMD's flagship CPU's and GPU's and then we gonna have a herd of Intel owners all having a rage fit because they only see 10%.

Looks like I'm right.


----------



## tsm106

Quote:


> Originally Posted by *Krusher33*
> 
> I knew this would happen. When AMD first demonstrated mantle and HSA, they indeed said "*up to*" 45%.
> 
> I had said at the time that we'll probably only see the 45% on AMD's flagship CPU's and GPU's and then we gonna have a herd of Intel owners all having a rage fit because they only see 10%.
> 
> Looks like I'm right.












Intel hexacore and two 290Xs = 58% gain in BF4 MP. MP causes CPU bottlenecks due to the large load so many players put on the CPU.


----------



## the9quad

Quote:


> Originally Posted by *Krusher33*
> 
> I knew this would happen. When AMD first demonstrated mantle and HSA, they indeed said "*up to*" 45%.
> 
> I had said at the time that we'll probably only see the 45% on AMD's flagship CPU's and GPU's and then we gonna have a herd of Intel owners all having a rage fit because they only see 10%.
> 
> Looks like I'm right.


Scroll up, 3960x is getting a 50% increase in crossfire in one of those bf4 benches.


----------



## kdawgmaster

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Intel hexacore and two 290x = 58% gain bf4 MP. MP causes cpu bottlenecks due to the large loads on cpu from so many users.


The 58% Battlelog posted was for single player.
Quote:


> Originally Posted by *the9quad*
> 
> 3960x is getting a 50% increase in crossfire in one of those bf4 benches.


Cant wait to see what our 3 R9 290x get =D


----------



## tsm106

Single was 10% iirc.


----------



## Krusher33

Well then what the hell are some of these people freaking complaining about?! Geez, get over yourselves.


----------



## tsm106

Quote:


> Originally Posted by *Krusher33*
> 
> Well then what the hell are some of these people freaking complaining about?! Geez, get over yourselves.


You will see their true faces now. Free gains - who would complain, right? For example, a 10% gain in SP BF4 is like going from a 290 to a 290X, and you save $150 MSRP lol.


----------



## kdawgmaster

Quote:


> Originally Posted by *tsm106*
> 
> You will see their true faces now. Free gains, who would complain right? For ex. 10% gain in SP BF4, thats like going from 290 to 290x and you save $150 bucks msrp lol.


Dont be surprised if AMD cards become more expensive again tho :/


----------



## tsm106

Quote:


> Originally Posted by *kdawgmaster*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> You will see their true faces now. Free gains, who would complain right? For ex. 10% gain in SP BF4, thats like going from 290 to 290x and you save $150 bucks msrp lol.
> 
> 
> 
> Dont be surprised if AMD cards become more expensive again tho :/

True. Highly valued products sell well. On the plus side I won't have to run quads anymore lol.


----------



## Krusher33

For realz.

It's been like this for every generation of GPU/CPU for any brand.

A driver update improves performance by 5%... "OH MY GERRD! YOU SUCK AMD/INTEL/NVIDIA! NEVER BUYING YOUR CRAP AGAIN!"

1 year later, they buy flagship of the same brand.


----------



## battleaxe

Quote:


> Originally Posted by *tsm106*
> 
> True. Highly valued products sell well. On the plus side I won't have to run quads anymore lol.


LOL... like you really need quads now?









It'll just make your overkill a bit more serious than before.


----------



## AlDyer

Free fps! Here, please! Anybody who doesn't want their extra FPS can send it right my way! All types of FPS accepted! Can't wait to see what I get or don't get


----------



## kdawgmaster

Quote:


> Originally Posted by *tsm106*
> 
> True. Highly valued products sell well. On the plus side I won't have to run quads anymore lol.


You might as well - it seems Mantle benefits crossfire EXTREMELY well. To me this would be more of a reason to get more GPUs XD

I'm wondering if this might mean Mantle gets access to the rest of the RAM on the other cards, allowing some of this performance increase.


----------



## tsm106

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> True. Highly valued products sell well. On the plus side I won't have to run quads anymore lol.
> 
> 
> 
> LOL... like you really need quads now?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It'll just make your overkill a bit more serious than before.

144hz triple 1080... go try that out and lemme know.


----------



## kdawgmaster




----------



## battleaxe

Quote:


> Originally Posted by *tsm106*
> 
> 144hz triple 1080... go try that out and lemme know.


Oh... yeah that.


----------



## Krusher33

Are there any good 120-144hz monitors for under $200? It's like the last piece of the puzzle for me in my new system.


----------



## LazarusIV

Quote:


> Originally Posted by *kdawgmaster*


Glorious!!!
Quote:


> Originally Posted by *Krusher33*
> 
> Are there any good 120-144hz monitors for under $200? It's like the last piece of the puzzle for me in my new system.


The QNIX in my sig rig. I love this thing, some day I'll have 3 of them! Ultra easy to OC, too.


----------



## Roy360

so mantle gets better performance on 6 core i7s? What about us common folks with overclocked i5s?


----------



## kdawgmaster

Quote:


> Originally Posted by *Krusher33*
> 
> Are there any good 120-144hz monitors for under $200? It's like the last piece of the puzzle for me in my new system.


No, you won't find any for less than 300 bucks.
Quote:


> Originally Posted by *LazarusIV*
> 
> Glorious!!!


Right


----------



## Jack Mac

Quote:


> Originally Posted by *Krusher33*
> 
> Are there any good 120-144hz monitors for under $200? It's like the last piece of the puzzle for me in my new system.


Maybe an open box VG248QE on Newegg if you can find one, otherwise you're going to need $250-300 to buy a 120/144Hz monitor.


----------



## BradleyW

Hello.
Is there any difference in gaming performance between Hynix and Elpida ICs on the 290X?
Thank you.


----------



## Krusher33

Quote:


> Originally Posted by *LazarusIV*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kdawgmaster*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Glorious!!!
> Quote:
> 
> 
> 
> Originally Posted by *Krusher33*
> 
> Are there any good 120-144hz monitors for under $200? It's like the last piece of the puzzle for me in my new system.
> 
> 
> The QNIX in my sig rig. I love this thing, some day I'll have 3 of them! Ultra easy to OC, too.

Quote:


> Originally Posted by *kdawgmaster*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Krusher33*
> 
> Are there any good 120-144hz monitors for under $200? It's like the last piece of the puzzle for me in my new system.
> 
> 
> 
> No, you won't find any for less than 300 bucks.
> Quote:
> 
> 
> 
> Originally Posted by *LazarusIV*
> 
> Glorious!!!
> 
> 
> Right

Quote:


> Originally Posted by *Jack Mac*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Krusher33*
> 
> Are there any good 120-144hz monitors for under $200? It's like the last piece of the puzzle for me in my new system.
> 
> 
> 
> Maybe an open box VG248QE on Newegg if you can find one, otherwise you're going to need $250-300 to buy a 120/144Hz monitor.

K, thanks guys. Guess I need to save up some more dough.


----------



## VSG

Quote:


> Originally Posted by *BradleyW*
> 
> Hello.
> Is there any difference in gaming performance between Hynix and Elpida ICs on the 290X?
> Thank you.


Nothing that you would notice.


----------



## BradleyW

Quote:


> Originally Posted by *geggeg*
> 
> Nothing that you would notice


Could you expand on this? What sort of numbers are we talking?
Cheers.


----------



## VSG

I just meant that the difference in memory overclocks, especially game-stable clocks, between Samsung, Hynix and Elpida would not really be noticeable in games. Core overclocks will impact your FPS numbers more than memory overclocks.


----------



## chiknnwatrmln

People make a huge deal of Hynix vs Elpida when in reality it almost doesn't matter.

For example, my Elpida memory clocks to 6500 MHz, higher than some Hynix.

Anything over 6000 MHz will not really provide any tangible benefit; maybe a hundred or so points in 3DMark, but that's it.


----------



## BradleyW

Quote:


> Originally Posted by *geggeg*
> 
> I just meant that the difference in memory overclocks, especially game-stable clocks, between Samsung, Hynix and Elpida would not really be noticeable in games. Core overclocks will impact your FPS numbers more than memory overclocks.


Fair point.
I tested Tomb Raider with the following settings:

Stock core, stock memory (Elpida): min fps = 82
Stock core, memory @ 1350 (Elpida): min fps = 82
Core @ 1120, stock memory (Elpida): min fps = 90
Core @ 1120, memory @ 1350 (Elpida): min fps = 90

No benefit from the memory OC.
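A quick sketch of what those runs show, as arithmetic (hypothetical helper name; the fps figures are the ones from the test above):

```python
# Compare min-fps scaling from the core OC vs the memory OC (Tomb Raider runs above).
def pct_gain(baseline: float, overclocked: float) -> float:
    """Percent change in min fps between two runs."""
    return (overclocked - baseline) / baseline * 100.0

mem_gain = pct_gain(82, 82)    # memory @ 1350: 0.0% change in min fps
core_gain = pct_gain(82, 90)   # core @ 1120: roughly 9.8% higher min fps
print(mem_gain, round(core_gain, 1))
```

All of the gain tracks the core clock; the memory OC contributes nothing measurable here.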


----------



## sugarhell

Quote:


> Originally Posted by *BradleyW*
> 
> Fair point.
> I tested Tomb Raider with the following settings:
> 
> 
> Stock clocks Elpida memory
> Min fps = 82
> Stock Core, Memory @ 1350 Elpida
> Min fps = 82
> Stock memory Elpida, Core Clock @ 1120
> Min fps = 90
> Memory @ 1350 Elpida, Core @ 1120
> Min fps = 90
> 
> No benefit from the memory OC.


Min fps is most of the time CPU dependent


----------



## Widde

I'm dropping fps the moment I touch the memory frequency







xD

Never ever in my life had any luck with silicon >:| http://piclair.com/2v6c1 That's what I'm stuck with


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> Min fps is most of the time CPU dependent


True, but for this benchmark, it shows that the memory speed did not help the fps at all. That means if I switched to Hynix cards, the min fps should be the same. It seems the only real benefit of Hynix is for those who want the very best crypto-mining performance.


----------



## sugarhell

Quote:


> Originally Posted by *BradleyW*
> 
> True, but for this benchmark, it shows that the memory speed did not help the fps at all. That means if I switched to Hynix cards, the min fps should be the same. It seems the only real benefit of Hynix is for those who want the very best crypto-mining performance.


Your min fps is already 90. It's CPU limited if you OC your card and you don't see a difference


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> Your min fps is already 90. It's CPU limited if you OC your card and you don't see a difference


Absolutely not in this case.
Core @ 1150 gives min fps of 92.

I'm trying to show that the speed of the VRAM made no difference to the fps.


----------



## VSG

Well, it will help out; it's just that without numbers from benchmarks you would very likely not notice it. The effect will vary from game to game. What I said was that the difference in overclocks from Elpida to Hynix would be even less noticeable, and it's not worth worrying over memory type as long as the card runs fine


----------



## DamnedLife

You are right: the R9 290(X) has a 512-bit bus, so you can't for the life of you saturate it with measly Full HD, 1440p or 3-monitor Eyefinity. A memory clock increase on these cards means nothing until you use a 4K monitor.
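For reference, a sketch of the bandwidth arithmetic behind that claim (stock 290/290X memory runs 1250 MHz GDDR5, quad-pumped to an effective 5 Gbps per pin):

```python
# Theoretical memory bandwidth of the R9 290/290X at stock memory clocks.
bus_width_bits = 512      # Hawaii memory bus width
effective_gbps = 5.0      # 1250 MHz GDDR5, quad-pumped
bandwidth_gb_s = (bus_width_bits / 8) * effective_gbps
print(bandwidth_gb_s)     # 320.0 GB/s of theoretical bandwidth
```

That 320 GB/s headroom is why memory overclocks show so little at 1080p/1440p.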


----------



## DamnedLife

Take a look: http://forums.overclockers.co.uk/showthread.php?t=18572559

Memory overclocking at low res (yeah, Full HD is now low res) is not helping at all


----------



## sugarhell

Quote:


> Originally Posted by *BradleyW*
> 
> Absolutely not in this case.
> Core @ 1150 gives min fps of 92.
> 
> I'm trying to show that the speed of the VRAM made no difference to the fps.


Marginal. And the memory OC makes a huge difference. Your fps is already high. The 2 fps increase going to 1150 is wrong too. I don't know why you don't see it: CPU limited.

Go bench 3DMark and check the memory OC. You will see the difference there


----------



## reb0rn

If anyone gets their hands on a PowerColor 290 PCS+, please check the memory type (they should have the BEST cooler so far, plus a VRM heatsink). I need it for mining and they are not yet in stock here


----------



## Ukkooh

Quote:


> Originally Posted by *DamnedLife*
> 
> You are right: the R9 290(X) has a 512-bit bus, so you can't for the life of you saturate it with measly Full HD, 1440p or 3-monitor Eyefinity. A memory clock increase on these cards means nothing until you use a 4K monitor.


Well, my Valley score increased by ~80 when I OC'd the memory from 1250 to 1500.


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> Marginal. And the memory OC makes a huge difference. Your fps is already high. The 2 fps increase going to 1150 is wrong too. I don't know why you don't see it: CPU limited.
> 
> Go bench 3DMark and check the memory OC. You will see the difference there


Well, if I increase the core even more, the min fps continues to increase. Like I said, NOT CPU limited in this particular benchmark.
In addition, I am only talking about this benchmark, in which the memory speeds made no difference. I'm sure 3DMark will show a slight increase with higher memory clocks; I never disputed this. But if you read my posts, I am purely talking about one particular benchmark. The only game I know that loves VRAM speed and bandwidth is Metro 2033: increasing the core speed did not increase fps much at all, but increasing VRAM speed did push the fps higher.


----------



## Roy360

Quote:


> Originally Posted by *Ukkooh*
> 
> Well, my Valley score increased by ~80 when I OC'd the memory from 1250 to 1500.


Valley is strange.

1175/1250/50%/25 1425

Upped the Memory

1175/1300/50%/25 1397

Other example

1200/1250/50%/63 1433
1100/1500/50%/100 1414

I've got a whole list of Valley scores with their respective overclocks if you want to see them, but I don't think memory clock matters; at least not at 1080p


----------



## Amnyz

Hello,
Does anyone know if a reference-PCB waterblock will fit the new MSI R9 290X Twin Frozr 4S Gaming?

Why? Because it's the 290X I could buy from my local reseller (I'm waiting on his answer as to whether he can get a non-custom card).

According to EKWB it would fit, based on PCB pictures.
Guru3D's review talked about a custom PCB... I'm a bit confused.

Thank you.


----------



## AlDyer

The memory issue is there, black screens etc. Also in mining it makes a MASSIVE difference: Hynix gets probably 100khash/s less on average. And there's a couple of FPS difference in some cases, but gaming- and benching-wise it isn't that big of a deal. Hynix memory has tighter timings too, by the way.


----------



## vieuxchnock

Quote:


> Originally Posted by *Amnyz*
> 
> Hello,
> Does anyone know if a reference PCB waterblock will fit on the new MSI r9 290x twin frozr 4s gaming ?
> 
> Why ? because it's the 290x i could buy from my local reseller ( i'm waiting his answer if he could get non-custom card )
> 
> According to EKWB it would fit, based on PCB pictures.
> The review of Guru3D talked about custom PCB.. im a bit confused
> 
> Thank you.


On the EK website, it is indicated that it is a reference design board. I think that EK knows what they are talking about.


----------



## VSG

Thing is, manufacturers can well change components around during the lifetime of a card and not change the model name. So EK can't guarantee against that.


----------



## tsm106

Quote:


> Originally Posted by *geggeg*
> 
> Thing is- manufacturers can well change around components during the lifetime of a card and not change the name of the model. So EK can't guarantee against that.


This. The only one in control of or responsible for your loop is you, so be your own advocate. The EK database is not always updated or even correct. Do your own visual match before ordering a block. There are many threads from people who got the wrong block or PCB simply because they assumed the EK site was correct.


----------



## Krusher33

And you can't go by pictures of the PCB on retail sites.


----------



## Widde

== Results ================================================
Test Duration: 120 Seconds
Total Frames: 1200

Average FPS: 9.99
Average Unit Count: 2895
Maximum Unit Count: 4455
Average Batches/MS: 709.03
Maximum Batches/MS: 816.99
Average Batch Count: 73691
Maximum Batch Count: 136885

Star Swarm RTS on extreme with I5 3570K 4.6ghz and 2 290s at 1060/1250, tanks the fps pretty nice xD


----------



## jerrolds

Was hoping for 15-20%, not surprised that it can be sub-10%. Hopefully tests later tonight will show more.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Widde*
> 
> == Results ================================================
> Test Duration: 120 Seconds
> Total Frames: 1200
> 
> Average FPS: 9.99
> Average Unit Count: 2895
> Maximum Unit Count: 4455
> Average Batches/MS: 709.03
> Maximum Batches/MS: 816.99
> Average Batch Count: 73691
> Maximum Batch Count: 136885
> 
> Star Swarm RTS on extreme with I5 3570K 4.6ghz and 2 290s at 1060/1250, tanks the fps pretty nice xD


== Configuration ==========================================
API: DirectX
Scenario: ScenarioRTS.csv
User Input: Disabled
Resolution: 1920x1080
Fullscreen: True
GameCore Update: 16.6 ms
Bloom Quality: High
PointLight Quality: High
ToneCurve Quality: High
Glare Overdraw: 16
Shading Samples: 64
Shade Quality: Mid
Motion Blur Frame Time: 16
Motion Blur InterFrame Time: 2
Detailed Frame Info: Off
===========================================================

== Results ================================================
Test Duration: 120 Seconds
Total Frames: 874
Average FPS: 7.28
Average Unit Count: 2354
Maximum Unit Count: 4125
Average Batches/MS: 402.37
Maximum Batches/MS: 447.13
Average Batch Count: 57406
Maximum Batch Count: 128855
===========================================================

Same test and I win









7.28fps ftw!!!


----------



## Ukkooh

Quote:


> Originally Posted by *AlDyer*
> 
> The memory issue is there, black screens etc. Also in Mining it makes a MASSIVE difference. Hynix gets probably 100khash/s less on average. And there's a couple of FPS difference in some cases, but gaming and benching wise it isn't such big of a deal. Hynix memory has tighter timings too by the way..


Are you saying that Hynix has tighter timings and still a lower hashrate at the same clocks? Is it supposed to say Elpida somewhere in there?


----------



## DamnedLife

Quote:


> Originally Posted by *reb0rn*
> 
> If anyone get a hand on Powercolor 290 PCS+ pls (they should have the BEST cooler so far + vrm heatsink) check the memory type, I need it for mining and here they are not yet on stock


I hope you can't find one, since it's for mining! We gamers need it to game. That card is a great gamer card, but I hope it has Elpida memory, to spite you and keep it from being used for mining hahahahaha


----------



## BradleyW

Quote:


> Originally Posted by *AlDyer*
> 
> *The memory issue is there*, black screens etc. Also in Mining it makes a MASSIVE difference. Hynix gets probably 100khash/s less on average. And there's a couple of FPS difference in some cases, but gaming and benching wise it isn't such big of a deal. Hynix memory has tighter timings too by the way..


Both my 290Xs are Elpida. No black screens at all.


----------



## Roy360

Just a general question about video card cooling: how do you know if you are cooling the VRAM properly? I have a 6950 whose VRAM was cooled only by the air running through the heatsink (no contact). While my R9s are mining I plan on using my 6950 for daily usage, and was wondering if I could get away without any heatsinks. The VRM is burning when I touch it. Maybe a piece of cardboard acting as a shroud would help? I eventually plan on replacing the 6950 with an R9 290 using the same universal GPU block.


----------



## Widde

Starting to get worried







Been at the computer all day long since 11 this morning, waiting and waiting, and now it's 01:45 Friday here :< I have a problem; I'm aware of that







xD


----------



## Abyssic

Quote:


> Originally Posted by *Widde*
> 
> Starting to get worried
> 
> 
> 
> 
> 
> 
> 
> Been at the computer all day long since 11 this morning waiting and waiting and now it's 01:45 here friday :< I have a problem I'm aware of that
> 
> 
> 
> 
> 
> 
> 
> xD


hey same here xD I'm about to go to sleep now. We're apparently in the same timezone


----------



## BradleyW

Hopefully when I wake up, AMD will have released the secret weapon!


----------



## Clexzor

I was at Microcenter today getting some Blu-ray discs, and I know one of the guys in the PC area... anyways, I was chatting with him about some stuff as the store just opened... he also showed me the order of Asus 290s and 290Xs they just got in lol... about 10 mins later two guys rolled up, his first customers of the day. They each purchased a 290 and a 290X, as it's one per customer only lol. Microcenter is selling them for $100+ less than online... so I grabbed one of the Asus reference 290s from him for $460, as Newegg has them at $570. Figured I could make a quick buck or maybe just keep it, idk


----------



## Widde

Quote:


> Originally Posted by *Abyssic*
> 
> hey same here xD i'm about to go to sleep now. we're appearently in the same timezone


Hehe ^_^ I'm up for the next 24 hours if that's what it takes


----------



## BradleyW

Sorry if this has been answered already, but is it possible to increase the VRAM voltage on a 290X with Elpida memory?
Is it OK to run different VRAM clocks in 290X CFX?

Thank you.


----------



## MojoW

Quote:


> Originally Posted by *BradleyW*
> 
> Sorry if this has been answered already but, is it possible to increase the VRAM voltage for 290X with Elpida Memory?
> Is it ok to run different VRAM clocks on 290X CFX?
> 
> Thank you.


No VRAM voltage control, but you can run different memory clocks; I just don't see the point.


----------



## Hogesyx

Quote:


> Originally Posted by *Sgt Bilko*
> 
> == Configuration ==========================================
> API: DirectX
> Scenario: ScenarioRTS.csv
> User Input: Disabled
> Resolution: 1920x1080
> Fullscreen: True
> GameCore Update: 16.6 ms
> Bloom Quality: High
> PointLight Quality: High
> ToneCurve Quality: High
> Glare Overdraw: 16
> Shading Samples: 64
> Shade Quality: Mid
> Motion Blur Frame Time: 16
> Motion Blur InterFrame Time: 2
> Detailed Frame Info: Off
> ===========================================================
> 
> == Results ================================================
> Test Duration: 120 Seconds
> Total Frames: 874
> Average FPS: 7.28
> Average Unit Count: 2354
> Maximum Unit Count: 4125
> Average Batches/MS: 402.37
> Maximum Batches/MS: 447.13
> Average Batch Count: 57406
> Maximum Batch Count: 128855
> ===========================================================
> 
> Same test and i win
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 7.28fps ftw!!!


Wait, these numbers can't be right? I am using a single 290X.

===========================================================
Oxide Games
Star Swarm Benchmark - ©2013
C:\Users\XXXX\Documents\Star Swarm\Benchmark_14_01_31_1139.txt
Version 0.95
01/31/2014 11:39
===========================================================

== Hardware Configuration =================================
GPU: AMD Radeon R9 200 Series
CPU: GenuineIntel
Intel(R) Core(TM) i5-3570K CPU @ 3.40GHz
Physical Cores: 4
Logical Cores: 4
Physical Memory: 8471621632
Allocatable Memory: 140737488224256
===========================================================

== Configuration ==========================================
API: DirectX
Scenario: ScenarioRTS.csv
User Input: Disabled
Resolution: 1920x1080
Fullscreen: True
GameCore Update: 16.6 ms
Bloom Quality: High
PointLight Quality: High
ToneCurve Quality: High
Glare Overdraw: 16
Shading Samples: 64
Shade Quality: Mid
Motion Blur Frame Time: 16
Motion Blur InterFrame Time: 2
Detailed Frame Info: Off
===========================================================

== Results ================================================
Test Duration: 120 Seconds
Total Frames: 1599

Average FPS: 13.31
Average Unit Count: 3225
Maximum Unit Count: 5235
Average Batches/MS: 1120.50
Maximum Batches/MS: 1346.80
Average Batch Count: 85486
Maximum Batch Count: 164292
===========================================================


----------



## Roy360

Quote:


> Originally Posted by *Clexzor*
> 
> I was at Microcenter today getting some Blu-ray discs, and I know one of the guys in the PC area... anyways, I was chatting with him about some stuff as the store just opened... he also showed me the order of Asus 290s and 290Xs they just got in lol... about 10 mins later two guys rolled up, his first customers of the day. They each purchased a 290 and a 290X, as it's one per customer only lol. Microcenter is selling them for $100+ less than online... so I grabbed one of the Asus reference 290s from him for $460, as Newegg has them at $570. Figured I could make a quick buck or maybe just keep it, idk


Microcenter, the only reason why I'd ever want to live in the US


----------



## Cool Mike

Received my two PowerColor PCS+ 290s today. Very happy! The PCS+ has cooler temps than the Asus DirectCU (I owned one). As we all know, CrossFire runs hotter, especially the top card. Running 1085 core and 1500 (6000 effective) memory, +75mV on the core and power at +20%. The memory has direct-touch heatsinks, so I'm very happy with the memory overclock. After a Valley run for 15 min @ 1440p with 4xAA, the top card hit 86C and the bottom card hit 75C. Neither core ever throttled down.









fire strike run


----------



## Sgt Bilko

Quote:


> Originally Posted by *Hogesyx*
> 
> Wait, the numbers cant be right? I am using a single 290x
> 
> == Hardware Configuration =================================
> GPU: AMD Radeon R9 200 Series
> CPU: GenuineIntel
> Intel(R) Core(TM) *i5-3570K* CPU @ 3.40GHz


It's all dependent on the CPU in this,

Check this one out:
Quote:


> Originally Posted by *Durquavian*
> 
> Ok this is weird.
> 
> == Configuration ==========================================
> API: DirectX
> Scenario: ScenarioRTS.csv
> User Input: Disabled
> Resolution: 1920x1080
> Fullscreen: True
> GameCore Update: 16.6 ms
> Bloom Quality: High
> PointLight Quality: High
> ToneCurve Quality: High
> Glare Overdraw: 16
> Shading Samples: 64
> Shade Quality: Mid
> Motion Blur Frame Time: 16
> Motion Blur InterFrame Time: 2
> Detailed Frame Info: Off
> ===========================================================
> 
> == Results ================================================
> Test Duration: 120 Seconds
> Total Frames: 875
> 
> Average FPS: 7.28
> Average Unit Count: 2358
> Maximum Unit Count: 4148
> Average Batches/MS: 400.84
> Maximum Batches/MS: 465.19
> Average Batch Count: 57265
> Maximum Batch Count: 127554
> ===========================================================
> Our frames are the same but you have a GPU that is 9000 times better. Gotta be a CPU heavy demo. I am FX 8350 4.84Ghz.


Now compare that to mine: exactly the same FPS, even though Durquavian is running CrossFire 7770s.

It's basically a CPU bench


----------



## reb0rn

Quote:


> Originally Posted by *Cool Mike*
> 
> Received my two Powercolor PCS+ 290's today. Very happy! the PCS+ has cooler temps than the Asus direct CU. (I owned one). As we all know Crossfire runs hotter especially the top card. Running 1085 Core and 1500 (6000 Eff,) memory. +75mV on the core and power at 20%. The memory has direct touch heat sinks, so very happy with the memory overclock. After a valley run for 15 min @ 1440P with AA4 top card hit 86C and bottom card hit 75C. Both cores never throttled down.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> fire strike run


Please can you check the VRM temps (GPU-Z should show them) under load?

Also, I am interested in the memory type it uses: is it Elpida or Hynix?

MemoryInfo should show the memory type:
www.mediafire.com/download/voj4j1rlk0ucfz4/MemoryInfo+1005.rar

I have a Sapphire Tri-X and the VRM is overheating at ~100C while mining









Big thanks


----------



## IBIubbleTea

Is RadeonPro good recording software, or should I use Dxtory?


----------



## kizwan

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *AlDyer*
> 
> *The memory issue is there*, black screens etc. Also in Mining it makes a MASSIVE difference. Hynix gets probably 100khash/s less on average. And there's a couple of FPS difference in some cases, but gaming and benching wise it isn't such big of a deal. Hynix memory has tighter timings too by the way..
> 
> 
> 
> Both my 290x's are Elpida. No black screens at all.

If I'm not mistaken, it only black-screens if you overclock past a certain MHz. Also it's pretty random; for example, you may not get a black screen for a couple of BF4 sessions.


----------



## Cool Mike

I will answer your questions tomorrow. Getting late. Thinking the memory is Hynix, but I need to make sure.
Thx


----------



## The Mac

Quote:


> Originally Posted by *IBIubbleTea*
> 
> Is Radeon pro a good recording software or should I use Dxtory?


should be easy enough to look up the kraken's power usage...

RP is pretty good.


----------



## IBIubbleTea

Quote:


> Originally Posted by *The Mac*
> 
> should be easy enough to look up the kraken's power usage...
> 
> RP is pretty good.


Kraken's power usage...? Wat?


----------



## NirHahs

Should I get the Sapphire Tri-X R9 290, the PowerColor PCS+ R9 290, or the MSI R9 290 Gaming?


----------



## Sgt Bilko

The Asus DCU II 290x has landed in Aus,

You have to pay the Asus tax of course: $829.00 vs $749.00 for a Tri-X...

https://www.pccasegear.com/index.php?main_page=product_info&products_id=26730


----------



## Kaapstad

Who says AMD cards are no good in the Heaven 4 bench?



4 x 290Xs @1200/1600

4930k @4.8

Roll over Beethoven.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Kaapstad*
> 
> Who says AMD cards are no good on the Heaven 4 bench
> 
> 
> 
> 4 x 290Xs @1200/1600
> 
> 4930k @4.8
> 
> Roll over Beethoven.


Oh wowowie


----------



## Maracus

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The Asus DCU II 290x has landed in Aus,
> 
> You have to pay Asus-tax of course: $829.00 vs $749.00 for a Tri-X........
> 
> https://www.pccasegear.com/index.php?main_page=product_info&products_id=26730


Yeah, Tri-X it is then. I just received my EK block from PCCaseGear today, so I'll have a few beers and whack that baby on later


----------



## kpoeticg

Hey guys, I'm looking for a little advice. I've had my card since Jan 10th but hadn't gotten my build to the point of testing until last night.

My setup: Rampage IV Black Edition, 4930K, 4x8GB Trident-X, PowerColor 290X (Hynix).

Booting on the RIVE BE gives me no display. The VGA LED is lit up and I get POST code B2. Also my keyboard doesn't light up.

I originally thought it was the board, because the fan spins on the 290X, but when I hook it up to my old LGA775 board the exact same things happen.

It's still under the 30-day RMA window; is the card dead, or do I need to try flashing a BIOS?


----------



## Maracus

Quote:


> Originally Posted by *kpoeticg*
> 
> Hey guys, i'm lookin for a little advice. I've had my card since Jan 10th but haven't gotten my build to the point of testing until last night.
> My setups Rampage IV Black Edition, 4930k, 4x8GB Trident-X, PowerColor 290x (Hynix)
> 
> Booting on the RIVE BE gives me no display. The VGA LED is lit up and i get POST CODE B2. Also my keyboard doesn't light up.
> I originally thought it was the board because the fan spins on the 290x, but when i hook it up to my old LGA775 board the same exact things happen.
> 
> It's still under the 30Day RMA Window, is the card dead or do i need to try flashing a bios?


Hmm, I was thinking RAM. Try booting with one stick of RAM and see if it at least boots; other than that I'm out of ideas


----------



## kpoeticg

Yeah, I went through all that when I was thinking it was the mobo. It's definitely not the RAM.

The 290X did the same thing on my LGA775 board. I just want to check if there's any hope of flashing it and making it work before I send it back


----------



## Sazz

Quote:


> Originally Posted by *kpoeticg*
> 
> Yeah i went through all that when i was thinking it was the mobo. It's definitely not the ram.
> 
> The 290x did the same thing on my LGA775 board. I just wanna check if there's any hope to flash it and make it work before i send it back


First question: did the no-display happen after a Windows update and driver update got installed? Depending on your answer, I may have encountered the same thing you are facing right now.


----------



## AlDyer

Quote:


> Originally Posted by *BradleyW*
> 
> Both my 290x's are Elpida. No black screens at all.


Mine is fine too, but there are A LOT of Elpidas that have black screen issues. Only a handful of Hynix cards, AFAIK.


----------



## kpoeticg

Quote:


> Originally Posted by *Sazz*
> 
> First question, did the no display happened after windows update and drivers update got installed? depending on your answer I may have encountered the same thing you are facing right now.


I've never gotten my board past POST because it's LGA2011: no iGPU.

It's a fresh build; I've been getting my hardware together over the past couple of months, but last night was the first time I tried booting. Since the 290X is the only GPU I have, I haven't even seen the BIOS for my motherboard yet. I spent all night last night and all day and night today troubleshooting the issue.

I was about to chalk it up to maybe my socket having bent pins before I decided to try the GPU on my old mobo. And the exact same thing.

On my LGA775 I was obviously able to get into the BIOS because of the iGPU, but no display out of the 290X. And then when I set priority to the dGPU in the BIOS it acted the same as my RIVE BE: couldn't POST, keyboard wouldn't light up, but the 290X fan was spinning. I've also tested the solder points on the card connected to the 8+6-pin, and all the voltages were on point as well.

Seems strange...


----------



## Connolly

What PSU are you using?


----------



## kpoeticg

Antec HCP-1300 Platinum. I have the 290X on its own 12V/50A rail


----------



## BradleyW

Quote:


> Originally Posted by *MojoW*
> 
> No VRAM voltage control but you can do different memory clocks, only i don't see the point.


Yeah, I think I agree. Unless you really want a slightly higher number in a silly benchmark or higher hash rates for cryptocurrency, there is no point in OC'ing the VRAM, since it's far faster than anything we need!


----------



## BackwoodsNC

Quote:


> Originally Posted by *kpoeticg*
> 
> I've never gotten my board past POST because it's LGA2011, no iGPU.
> 
> It's a fresh build, i've been getting my hardware together over the past cpl months but last night was the first time i tried booting. Since the 290x is the only GPU i have, i haven't even seen the BIOS for my motherboard yet. I spent all night last night and all day and night today troubleshooting the issue.
> 
> I was about to chalk it up to maybe my socket has bent pins before i decided to try the GPU on my old mobo. And the exact same thing.
> 
> On my LGA775 i was obviously able to get into bios because of the iGPU, but no display out of the 290x. And then when i set priority to dGPU in the bios it acted the same as my RIVE BE. Couldn't post, keyboard wouldn't light up, but the 290x fan was spinning. I've also tested the solder points on the card connected to the 8+6Pin, and all the voltages were on point as well
> 
> Seems strange...


Well, if you tried the card on another board and the same thing happens, I would think it is the card. Get a used spare card off Craigslist or something and RMA that sucker.


----------



## devilhead

Quote:


> Originally Posted by *Kaapstad*
> 
> Who says AMD cards are no good on the Heaven 4 bench
> 
> 
> 
> 4 x 290Xs @1200/1600
> 
> 4930k @4.8
> 
> Roll over Beethoven.


Yes, they're good cards; it's just that for CrossFire, TriFire and quadfire you need a huge CPU overclock to squeeze those cards







I feel that my 3930K at 5.2GHz still bottlenecks my 290Xs


----------



## Maracus

Add me to water club









So after like 13 beers and a lot of backtracking because I was rushing, it's finally wet.

EK block/backplate 360rad/D5 pump etc. Just ran stock Firestrike Extreme 1000/1250

Ambient temp *30c*
Max Core temp *48c*
Max VRM1 temp *40c*
Max VRM2 temp *42c*

I will overclock with voltage later and see how the temps fare; it's rather hot here atm. I'm glad I got the VRMs sorted; I was kinda worried after reading a couple of posts.


----------



## kizwan

Quote:


> Originally Posted by *kpoeticg*
> 
> Hey guys, i'm lookin for a little advice. I've had my card since Jan 10th but haven't gotten my build to the point of testing until last night.
> My setups Rampage IV Black Edition, 4930k, 4x8GB Trident-X, PowerColor 290x (Hynix)
> 
> Booting on the RIVE BE gives me no display. The VGA LED is lit up and i get POST CODE B2. Also my keyboard doesn't light up.
> I originally thought it was the board because the fan spins on the 290x, but when i hook it up to my old LGA775 board the same exact things happen.
> 
> It's still under the 30Day RMA Window, is the card dead or do i need to try flashing a bios?


I think your card is dead. I assume you already tried booting by switching to the second BIOS on the card?
Quote:


> Originally Posted by *Maracus*
> 
> Add me to water club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So after like 13 beers, alot of back tracking because i was rushing its finally wet.
> 
> EK block/backplate 360rad/D5 pump etc. Just ran stock Firestrike Extreme 1000/1250
> 
> Ambient temp *30c*
> Max Core temp *48c*
> Max VRM1 temp *40c*
> Max VRM2 temp *42c*
> 
> I will overclock with voltage later see how the temps fair, rather hot here atm. I'm glad i got the VRMs sorted i was kinda worried after reading a couple of posts.
> 
> 


Did you use Phobya or Fujipoly thermal pad? Good temps btw.


----------



## Sonikku13

I finally got my Radeon R9 290X. And I can't mine on it because I realized my board keeps messing up. The PC crashes due to the board. So I'm gonna replace the board and processor in one shot, which will require me to get a new copy of Windows 8.1.


----------



## battleaxe

Quote:


> Originally Posted by *Sonikku13*
> 
> I finally got my Radeon R9 290X. And I can't mine on it because I realized my board keeps messing up. The PC crashes due to the board. So I'm gonna replace the board and processor in one shot, which will require me to get a new copy of Windows 8.1.


No it won't. You can change your existing Win8 over to the new build. Takes like 5 minutes to enable it using the phone option in the control panel under "System". I've done it a hundred times.


----------



## Krusher33

Y'all didn't warn me that I'd need 2 of the .5mm pads. -_- I ended up covering 2 of the memory chips with EK's pads.


----------



## esqueue

Quote:


> Originally Posted by *Sonikku13*
> 
> I finally got my Radeon R9 290X. And I can't mine on it because I realized my board keeps messing up. The PC crashes due to the board. So I'm gonna replace the board and processor in one shot, which will require me to get a new copy of Windows 8.1.


If your copy is legit, you will NOT require a new copy. You will be required to re-validate it, either by calling a phone number or online. I've even managed to fully upgrade systems without having to reinstall the operating system; I loaded the new mobo's HDD controller drivers before swapping boards.


----------



## Krusher33

Quote:


> Originally Posted by *esqueue*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sonikku13*
> 
> I finally got my Radeon R9 290X. And I can't mine on it because I realized my board keeps messing up. The PC crashes due to the board. So I'm gonna replace the board and processor in one shot, which will require me to get a new copy of Windows 8.1.
> 
> 
> 
> If your copy is legit, you will NOT require a new copy. You will be required to re-validate it, either by calling a phone number or online. I've even managed to fully upgrade systems without having to reinstall the operating system; I loaded the new mobo's HDD controller drivers before swapping boards.

I didn't see this comment, but esqueue is right. You don't have to get another copy of Windows if you're replacing the board. It's going to recognize that you have a different board and ask you to re-activate. Most of the time for me that means calling Microsoft and following an automated system.


----------



## King4x4

Well I just finished testing my little brother's new R9 290 Tri-X.... This card is just smoking!

Reference R9 290X card temps after running for 2 hours:
Core: 93°C
Core clock: 840-940MHz
VRM1: 77°C, VRM2: 88°C
Sound level: 47dBA (I have a sound sensor)

Sapphire R9 290 Tri-X temps after running for 2 hours:
Core: 74°C
Core clock: 1000MHz, STABLE
VRM1: 63°C, VRM2: 77°C
Sound level: 37dBA

Performance difference? 1 fps.
Price difference: over $120
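To put the 47dBA vs 37dBA gap in perspective: a 10dB difference is 10x the sound power, and by the usual rule of thumb about twice the perceived loudness. A quick sketch (the rule-of-thumb exponents are the only assumption here, not figures from the post):

```python
# Rough comparison of two sound-level readings in dB.
# A 10 dB difference corresponds to a 10x change in sound power,
# and (common rule of thumb) roughly double the perceived loudness.

def power_ratio(db_a: float, db_b: float) -> float:
    """Ratio of sound power between two dB readings."""
    return 10 ** ((db_a - db_b) / 10)

def perceived_loudness_ratio(db_a: float, db_b: float) -> float:
    """Rule of thumb: perceived loudness doubles every ~10 dB."""
    return 2 ** ((db_a - db_b) / 10)

reference, tri_x = 47.0, 37.0  # dBA figures from the post above
print(power_ratio(reference, tri_x))               # 10x the sound power
print(perceived_loudness_ratio(reference, tri_x))  # ~2x as loud
```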


----------



## kpoeticg

Quote:


> Originally Posted by *BackwoodsNC*
> 
> Well if you tried the card on another board and the same thing happens, I would think it is the card. Get a used spare card off of craigslist or something and RMA that sucker.


 Quote:



> Originally Posted by *kizwan*
> I think your card is dead. I assume you already tried booting by switching to the second BIOS on the card?


Thanx for replying guys. Yeah, I tried switching the bios. I figured it was RMA time, just thought it would be worth posting in here to see if I was missing something.


----------



## Krusher33

Quote:


> Originally Posted by *King4x4*
> 
> Well I just finished testing my little brothers new R9 290 Tri-X.... This card is just smoking!
> 
> Refernce R9 290x Card temps running for 2 hours:
> Core: 93'C
> Core: 840-940mhz
> Vrams 1= 77'C 2=88'C
> Sound Level: 47Dba (Have a sound sensor)
> 
> Sapphire R9 290 temps after running for 2 hours:
> Core: 74'C
> Core: 1000mhz STABLE
> Vrams 1= 63'C 2=77'C
> Sound Level: 37Dba
> 
> Performance difference? 1 fps.
> Price Difference = Over 120$
> *Epeen = priceless*


Fixed.


----------



## ArchieGriffs

Quote:


> Originally Posted by *King4x4*
> 
> Well I just finished testing my little brothers new R9 290 Tri-X.... This card is just smoking!
> 
> Refernce R9 290x Card temps running for 2 hours:
> Core: 93'C
> Core: 840-940mhz
> Vrams 1= 77'C 2=88'C
> Sound Level: 47Dba (Have a sound sensor)
> 
> Sapphire R9 290 temps after running for 2 hours:
> Core: 74'C
> Core: 1000mhz STABLE
> Vrams 1= 63'C 2=77'C
> Sound Level: 37Dba
> 
> Performance difference? 1 fps.
> Price Difference = Over 120$


Mostly because the Tri-X is meant to overclock and be stable while the reference can't reliably maintain its core clock. There really isn't any performance difference until you overclock, but most of the benefit of this card isn't a performance increase, it's lower temps + noise.


----------



## Thorteris

My R9 290X came in today. But I wasn't at home and they require a signature, so they are bringing it back on Monday







......Who expects a working person to be home at 10:00 AM??


----------



## JimmyBR

Hello, could someone help me? My XFX R9 290 Black Edition at times in BF4 loses a lot of FPS, and GPU usage drops to 0%... 30%... 60%... and then goes back to 100%, and this continues for some time. Tested on 2 computers. My setup: [email protected], 8GB DDR3 Corsair Vengeance, ASUS P7P55D-Pro, OCZ Vertex 3 120GB SATA 2 SSD, Corsair 750W PSU, AMD 13.12 WHQL driver. The other machine tested: i5-3330, 8GB DDR3, ASRock Fatal1ty, Antec 620W, OCZ Vertex 3 120GB SATA 3 SSD. Both have the same problem. Is this driver good for these cards, or is it something with my R9 290? Thank you!


----------



## Krusher33

Quote:


> Originally Posted by *Thorteris*
> 
> My r9-290x came in today. But I wasn't at home and they require a signature so they are bringing it back on Monday
> 
> 
> 
> 
> 
> 
> 
> ......Who expects a working person to be home at 10:00 AM??


I had a package like that recently but didn't think anything of it because my wife would be home to sign for it. This was UPS, and I think I remember being able to go online and change the time. But the windows were big blocks, and the last one ran from before I get off work through dinner time. Wish they could make smaller blocks, but then I realized they're delivering to everyone, not just me.


----------



## Maracus

Quote:


> Originally Posted by *kizwan*
> 
> Did you use Phobya or Fujipoly thermal pad? Good temps btw.


I just used the ones that came with the EK block (1mm), I also used the TIM on top of the pads, had just enough to cover GPU.

At 1200/1450 +100mv, core max was *52c*, VRM1 *42c*, VRM2 *48c*


----------



## Thorteris

Anybody know how well an H55 would cool an overclocked R9 290X with a G10? Or do I need a stronger cooler?


----------



## Sonikku13

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sonikku13*
> 
> I finally got my Radeon R9 290X. And I can't mine on it because I realized my board keeps messing up. The PC crashes due to the board. So I'm gonna replace the board and processor in one shot, which will require me to get a new copy of Windows 8.1.
> 
> 
> 
> No it wont'. You can change your existing Win8 over to the new build. Takes like 5 minutes to enable it using the phone option in the control panel under "system". I've done it a hundred times.
Click to expand...

My existing copy of Windows 8 is still gonna be used by my current PC. I'm effectively fixing a PC with Vista OEM on it, which is tied to the old mobo, which means I need a new OS.


----------



## BackwoodsNC

Quote:


> Originally Posted by *Thorteris*
> 
> My r9-290x came in today. But I wasn't at home and they require a signature so they are bringing it back on Monday
> 
> 
> 
> 
> 
> 
> 
> ......Who expects a working person to be home at 10:00 AM??


I use the pick-up-at-depot option for expensive items. They're normally open past 5pm, until around 7pm I think. I log into UPS and change the delivery option. I do hate that you've got to wait till Monday.

Still waiting on the 290X Lightning... the ASUS DC2 is looking nice though. Get off your butts MSI and do [email protected]


----------



## battleaxe

Quote:


> Originally Posted by *Thorteris*
> 
> Anybody know how well a h55 would cool a r9-290x overclocked with a g10? Or do I need a stronger cooler?


It should do fine. My h80 is overkill for sure.


----------



## Bartouille

I want to replace the thermal pads on my stock 290X cooler. What thickness should I use? Some say 0.5mm on the VRAM and 1mm for the VRMs, is that ok? Or can I use 1mm everywhere (save some money)? Also, what are the dimensions of a memory chip? THANKS!


----------



## Forceman

The worry with using 1mm on the VRAM would be it being thick enough to keep the rest of the heatsink from making good contact. I'm not sure how likely that is, but if you are going to the trouble to change it you may as well put the right ones on.


----------



## Bartouille

Quote:


> Originally Posted by *Forceman*
> 
> The worry with using 1mm on the VRAM would be it being thick enough to keep the rest of the heatsink from making good contact. I'm not sure how likely that is, but if you are going to the trouble to change it you may as well put the right ones on.


Yeah, I want to do it right once and for all. I plan on keeping this card for a while, so yeah... The EK website says 0.5mm for VRAM and 1mm for VRMs; I don't know if this applies to the stock cooler though. What I fear with pads that aren't thick enough is poor contact; thicker ones can get squeezed, so that's not as bad. The best would be to measure them, but I'm way too lazy lol


----------



## kizwan

Quote:


> Originally Posted by *kpoeticg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BackwoodsNC*
> 
> Well if you tried the card on another board and the same thing happens, I would think it is the card. Get a used spare card off of craigslist or something and RMA that sucker.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I think your card is dead. I assume you already tried booting by switching to the second BIOS on the card?
> 
> 
> Thanx for replying guys. Yeah, i tried switching the bios. I figured it was RMA time, just thought it would be worth posting in here to see if i was missing something.

It's a good idea to get a second opinion. I figured you hadn't missed anything after reading your posts.
Quote:


> Originally Posted by *ArchieGriffs*
> 
> Quote:
> 
> 
> 
> Originally Posted by *King4x4*
> 
> Well I just finished testing my little brothers new R9 290 Tri-X.... This card is just smoking!
> 
> Refernce R9 290x Card temps running for 2 hours:
> Core: 93'C
> Core: 840-940mhz
> Vrams 1= 77'C 2=88'C
> Sound Level: 47Dba (Have a sound sensor)
> 
> Sapphire R9 290 temps after running for 2 hours:
> Core: 74'C
> Core: 1000mhz STABLE
> Vrams 1= 63'C 2=77'C
> Sound Level: 37Dba
> 
> Performance difference? 1 fps.
> Price Difference = Over 120$
> 
> 
> 
> Mostly because the Tri-X is meant to overclock and be stable while the reference can't reliably maintain its core clock. There really isn't any performance difference until you overclock, but most of the benefit of this card isn't a performance increase, it's lower temps + noise.

Either at stock or overclocked, my reference 290s can maintain max clock in both games & benching.
Quote:


> Originally Posted by *JimmyBR*
> 
> Hello .. could someone help me? My R9 290 XFX Black Edition at times in BF4 loses much FPS and GPU usage drops to 0% .. 30% .. 60% ... and then back to 100% and is for some time .. tested on 2 computers .. my setup: [email protected] .. 8GB DDR3 corsair vengeance .. MB ASUS P7P55D-Pro .. SSD OCZ Vertex 3 120GB SATA 2 .. Corsair 750W power .. 13.12whql amd driver and the other tested i5-3330 .. 8GB DDR3 .. MB Asrock Fatality .. Antec 620W .. SSD OCZ Vertex 3 120GB SATA 3 .. both with the same problem .. this driver is good for these cards? or is it something with my r9 290? thank you!


It sounds like the CPU is bottlenecking. Did you check CPU usage? If CPU usage is at 100% and GPU usage drops, followed by a sudden FPS drop, that shows the CPU is bottlenecking.
Quote:


> Originally Posted by *Maracus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Did you use Phobya or Fujipoly thermal pad? Good temps btw.
> 
> 
> 
> I just used the ones that came with the EK block (1mm), I also used the TIM on top of the pads, had just enough to cover GPU.
> 
> At 1200/1450 +100mv core max was *52c* VRM1 *42c* VRM *48c*

I just noticed you ran Firestrike Extreme at stock clocks. Nice temps anyway.
Quote:


> Originally Posted by *Bartouille*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> The worry with using 1mm on the VRAM would be it being thick enough to keep the rest of the heatsink from making good contact. I'm not sure how likely that is, but if you are going to the trouble to change it you may as well put the right ones on.
> 
> 
> 
> Yeah I want to do it right once and for all. I plan on keeping this card for a while so yea... EK website says 0.5mm for vram and 1mm for vrms, I don't know if this applies to stock cooler tho. What I fear with not thick enough pads is that contact is not good enough, thicker ones can get squeezed so it's not so bad. The best would be to measure them but I'm way too lazy lol

I have a double-stacked thermal pad on VRM1, making it a total of 2mm thick. I'm using an EK waterblock anyway, but even with double thickness it doesn't prevent the block from making good contact with the GPU core & VRAM, because the thermal pads get squeezed when you fasten the screws (not over-tightened, of course). In my case 1.5mm would be the perfect thickness for VRM1. I personally don't recommend 2mm or double-stacked thermal pads on VRM1; I only did that because I don't have an extra 0.5mm pad.

I don't know the correct thickness for stock cooling, but 0.5mm & 1mm for the VRAM & VRMs respectively would be my best guess too.
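A small sketch of the stacking arithmetic (the ~30% compression figure is an assumption for illustration; only the 0.5/1/1.5/2mm pad thicknesses come from the posts above):

```python
# Hypothetical numbers for illustration; thermal pads compress when
# the block screws are fastened, which is why a double-stacked 2 mm
# pad can still allow the block to seat on the die.

def compressed_thickness(pad_mm: float, compression: float = 0.3) -> float:
    """Thickness of a pad after being squeezed by ~30% (assumed figure)."""
    return pad_mm * (1 - compression)

# Double-stacking two 1 mm pads (2 mm total) vs a single 1.5 mm pad:
print(compressed_thickness(2.0))  # ~1.4 mm after compression
print(compressed_thickness(1.5))  # ~1.05 mm after compression
```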


----------



## Forceman

Quote:


> Originally Posted by *Bartouille*
> 
> Yeah I want to do it right once and for all. I plan on keeping this card for a while so yea... EK website says 0.5mm for vram and 1mm for vrms, I don't know if this applies to stock cooler tho. What I fear with not thick enough pads is that contact is not good enough, thicker ones can get squeezed so it's not so bad. The best would be to measure them but I'm way too lazy lol


Yeah, I don't know how squishy the pads you want to buy are. The stock ones are pretty soft, but the ones that came with the EK waterblock are stiffer. Not sure about the Fujipoly or others though.


----------



## JimmyBR

Quote:


> Originally Posted by *kizwan*
> 
> It sounds like the CPU is bottlenecking. Did you check CPU usage? If CPU usage is at 100% and GPU usage drops, followed by a sudden FPS drop, that shows the CPU is bottlenecking


Hello kizwan. I am monitoring the CPU usage with MSI Afterburner... CPU usage is at 60%... 80%... 88%... never 100%, even during the stuttering. Yesterday I did some testing; it happens only with Battlefield 4.
In other games it works perfectly, with GPU usage at 100%: Metro 2033, Crysis 2, Crysis 3, Medal of Honor Warfighter...


----------



## Matt-Matt

Just got my 290 running at 1075/1250MHz; I can't seem to get the vRAM to run at the 1500 it's rated for.. Maybe I need some auxiliary voltage?

Anyway, I did a fan curve, and the max I get after 2 hours of mining is about 87c, and that's stable with a voltage offset of +13.

What am I doing wrong with the memory? I never seem to get cards whose memory clocks well :/


----------



## rdr09

Quote:


> Originally Posted by *JimmyBR*
> 
> hello kizwan.. I am monitoring the cpu usage with msi afterburner ... cpu usage is at 60% .. 80% .. 88% .. never 100% ... even in stuttering .. yesterday did some testing .. happens only with battlefield 4 ..
> In other games works perfect...gpu usage is at 100% .. in metro 2033 .. crysis 2 ... crysis 3 .. medal of honor warfighter ...


CPU cores do not have to reach 100% to bottleneck the GPU(s) in some cases.
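One rough way to spot that pattern in logged data (a sketch of my own, not an Afterburner feature; the threshold numbers are arbitrary): scan exported CPU/GPU usage samples for moments where the CPU is busy while the GPU starves.

```python
# Flag likely CPU-bottleneck moments from usage logs (e.g. values
# exported from MSI Afterburner's monitoring log). The 85%/60%
# thresholds are illustrative choices, not measured cutoffs.

def bottleneck_samples(cpu_usage, gpu_usage,
                       cpu_floor=85.0, gpu_ceiling=60.0):
    """Indices of samples where the CPU is busy but the GPU is starved."""
    return [i for i, (c, g) in enumerate(zip(cpu_usage, gpu_usage))
            if c >= cpu_floor and g <= gpu_ceiling]

# Synthetic example: two samples where GPU usage collapses while the
# CPU stays loaded, even though the CPU never hits 100%.
cpu = [60, 88, 90, 70, 86]
gpu = [99, 40, 30, 98, 97]
print(bottleneck_samples(cpu, gpu))  # [1, 2]
```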


----------



## flamin9_t00l

Just sharing the final daily OC I've got dialled in now.

1100MHz on the core with +50mV core voltage
1400MHz on the vram (5600MHz effective) with +25mV aux voltage.

Settings synchronized for both cards in AB (they're very well matched for OC'ability).

Guys, get your OCs tested in Sleeping Dogs and see if they hold up ok, especially vram OCs. I managed to get my first and only black screen using this game, and a freeze as well, when everything else was spot on (BF4, Bioshock, TR, 3DMark etc). This game seems to hammer the 290 series (highest I've seen my VRM1 temps too @ 55c; usually around 45-50c).

Did a quick bench too, tested against TTL's 780Ti Phantoms SLI from OC3D.

Sleeping Dogs (1440p), TTL's 780Ti SLI (2-card) Gainward Phantoms (I think they were boosting higher than my 290's clocks as well, iirc)



My Radeon R9 290 non-Xs in XFire (2-card), EK waterblocks (1100MHz core, 1400MHz vram).



The 780Tis managed a couple of frames higher on the avg and max, but my min was 4fps higher.

So nice to see them keeping up with those cards, considering they're £600 each.

Also can't wait to see Mantle performance on my hexa with XFire.
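On the "5600MHz effective" figure above: GDDR5 transfers data four times per memory-clock cycle, so the effective rate is simply the real clock times four. A trivial sketch:

```python
# GDDR5 is quad-pumped: tools report either the real memory clock
# (e.g. 1400 MHz) or the "effective" rate four times higher.

def effective_clock_mhz(real_mhz: float) -> float:
    return real_mhz * 4  # GDDR5 quad data rate

print(effective_clock_mhz(1400))  # 5600, the OC quoted above
print(effective_clock_mhz(1250))  # 5000, the 290/290X stock memory clock
```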


----------



## Asrock Extreme7

Just mining some aliencoin; litecoin takes too long. Getting 900KH/s.


----------



## rdr09

Quote:


> Originally Posted by *flamin9_t00l*
> 
> Just sharing my final daily OC I got dialled in now.
> 
> 1100MHz on the core with +50mV core voltage
> 1400MHz on the vram (5600MHz effective) with +25mV Aux voltage.
> 
> Settings syncronized for both cards in AB (They're very well matched for oc'ability).
> 
> Guys get your OC's tested on Sleeping dogs and see if they hold up ok especially vram oc's. I managed to get my first and only black screen using this game and a freeze aswell when everything else was spot on (BF4, Bioshock, TR, 3dmark etc). This game seems to hammer the 290 series (highest I've seen my VRM1 temps too @ 55c - usually around 45-50c).
> 
> Done a quick bench too - tested against TTL's 780Ti Phantoms SLI from OC3D.
> 
> Sleeping dogs (1440p) TTL's 780Ti SLI (2 card) Gainward Phantoms (I think they were boosting higher that my 290's clocks aswell iirc)
> 
> 
> 
> My Radeon R9 290 non-Xs XFire (2 card) EK Waterblocks. (1100MHz core, 1400MHz vram).
> 
> 
> 
> 780Ti's managed couple of frames higher on the avg and max but my min was 4fps higher.
> 
> So nice to see them keeping up with these cards considering they're £600 each.
> 
> Also can't wait to see mantle performance on my hexa with Xfire.


+rep. minimum matters most.


----------



## Darklyspectre

Honestly, I wouldn't mind going with ATI for my next computer build beginning next year, which is when I go full-blown mental on it again.

Shame that ATI doesn't have a company like EVGA that is totally okay with you putting watercooling on their cards, since I am planning to go dual (maybe triple) SLI/crossfire, so air cooling is way out of the possible options.

But as far as I know there isn't any company that deals in ATI cards that has nothing against watercooling. Okay, you've got some companies selling reference 290s with watercooling slammed on, but who cares about reference; I am looking for a good overclocker.

The 290X Lightning (whenever it comes out) will be an amazing card; too bad MSI is kinda iffy about watercooling it, and apparently it depends on who is handling your RMA.


----------



## JimmyBR

Quote:


> Originally Posted by *rdr09*
> 
> cpu cores do not have to reach 100% to bottleneck the gpu(s) in some cases.


But even an i5-3330? Do I need an i7 then?


----------



## mojobear

Hey all!

Legit Reviews has Mantle benchmarks... but it's only for an i5 3570 + R7 260X

http://www.legitreviews.com/amd-mantle-api-real-world-bf4-benchmark-performance-catalyst-141_134959

They still see a 15% improvement







More reviews will stream in today.... I think most review sites are getting the drivers now, so Sunday/Monday at the latest for us (fingers crossed)

Source: http://www.tomshardware.com/news/amd-mantle-drivers-download,25927.html

Edit:

Some Polish source... not sure what to make of it yet







Just trying to stir up interest

http://pclab.pl/art55953-2.html
http://pclab.pl/art55953-3.html


----------



## ArchieGriffs

Quote:


> Originally Posted by *kizwan*
> 
> Either at stock or overclocked, my reference 290's can maintain at max clock in both games & benching.


I'm assuming you've either changed the thermal paste, added VRAM thermal pads, have an epic case as far as cooling goes, run the fan at 70%+, or have put a waterblock on it. I've heard over and over that reference cards, once they reach a certain temperature, can't maintain a consistent FPS/core clock. I guess with a bit of tweaking of the fan profile you'd be able to get better results too, so there are ways around the temperature problem; I still would imagine your rig is pretty loud though (assuming it isn't water cooled). I personally wouldn't ever want my card to run that hot or loud, which is why I like the Tri-X. It's like what, a $40 difference where I'm buying between the reference and the non-reference cards? That isn't a huge deal for me given the advantages it provides.


----------



## Pheozero

Anyone know when those CryoVenom cards from VisionTek are going to be released?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Pheozero*
> 
> Anyone know when those CryoVenom cards from VisionTek are going to be released?


They already have, haven't they?

EDIT: http://www.visiontekproducts.com/index.php/component/virtuemart/graphics-cards/visiontek-cryovenom-liquidcooled-series-r9-290-detail?Itemid=0

Says they are out of stock, so no idea if they sold out already, although I think they have...


----------



## Pheozero

They have? Besides their site, where can I find them at?


----------



## pounced

So I've got my Sapphire 290X reference version running at 1200MHz core / 1400MHz mem with a custom fan curve, and it never gets above 80C, but I also have the VDDC offset in TriXX set to +200.

Is it smart to run a +200 VDDC offset, or will it damage the card in the long run? I can run +130 VDDC and get 1175 stable if need be.


----------



## Forceman

Quote:


> Originally Posted by *pounced*
> 
> So I've got my Sapphire 290x Ref version running at 1200 MHZ Core / 1400 Mhz Mem with a custom fan curve and it never gets above 80C but I also have in trixx the VDDC Offset set to 200+
> 
> Is it smart to run 200+ VDDC Offset or will it damage the card in the longrun? I can run 130 VDCC and get 1175 Stable if need be.


I'd be more comfortable with +130/1175 if it was me.


----------



## kizwan

Quote:


> Originally Posted by *pounced*
> 
> So I've got my Sapphire 290x Ref version running at 1200 MHZ Core / 1400 Mhz Mem with a custom fan curve and it never gets above 80C but I also have in trixx the VDDC Offset set to 200+
> 
> Is it smart to run 200+ VDDC Offset or will it damage the card in the longrun? I can run 130 VDCC and get 1175 Stable if need be.


What is your max voltage with +200mV?


----------



## pounced

Quote:


> Originally Posted by *kizwan*
> 
> What is your max voltage with +200mV?


I'll have to check, but I might tone it down so I don't burn out the card.

In GPU-Z, VDDC Power In maxes out around 240 watts while I'm playing World of Warcraft sitting in the middle of a major city. I'll have to test running Battlefield 4.


----------



## Arizonian

Wonder where the Sapphire Vapor-X series is?

After the Tri-X success I'm hoping for the same from the Vapor-X, and I hope it comes with Hynix memory too.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Arizonian*
> 
> Wonder where the Sapphire Vapor-X series is?
> 
> After Tri-X success I'm hoping same for Vapor X and hope it's with hynix memory too.


Sapphire are probably still working on it; really looking forward to seeing what they do with the Toxic though...... 4 fans?


----------



## King4x4

The Tri-X is BIG... 305mm on a single chip?

Toxic will be what? 340mm?!


----------



## VSG

Push-pull fans, 4 slot card


----------



## Sgt Bilko

They might even do an AIO option like the Ares II

That could be interesting


----------



## ArchieGriffs

I feel like it will be a custom PCB with the same cooler, though that wouldn't be much of an improvement, nor would it explain why it's taken them so long to announce it.


----------



## DamnedLife

Custom PCB, Three fans, one in the middle is bigger&better, alternating color fans yellow black yellow, Sapphire logo in led with status colors. That is all...


----------



## bloodkil93

Quote:


> Originally Posted by *Arizonian*
> 
> Wonder where the Sapphire Vapor-X series is?
> 
> After Tri-X success I'm hoping same for Vapor X and hope it's with hynix memory too.


I've put my 290X under an H55 with an NZXT Kraken G10 bracket. Can you update my profile, or do you want images?

Cheers!


----------



## Thorteris

Quote:


> Originally Posted by *bloodkil93*
> 
> I've put my 290X under a H55 with an NZXT Kraken G10 bracket, can u update my profile, or u want images?
> 
> Cheers!


How are the temps? I'm planning to use the exact same cooler on my R9 290X when it comes in on Monday. And did the H55 and G10 mount easily on the card?


----------



## Redvineal

Hey all. Sorry if this was asked and answered before, but I'm on my phone and can't navigate easily.

Anyone know if the MSI R9 290 GAMING cards have a reference PCB? I'm looking to put my two under water this week!

Thanks.


----------



## bloodkil93

Quote:


> Originally Posted by *Thorteris*
> 
> How are the temps? I'm planning to use the exact same cooler on my r9-290x when it comes in on monday. And did the h55 and g10 mount easily on the card?


In 3 consecutive runs of Unigine Valley, these were my max temps (GPU is overclocked to 1200/1680):

GPU: 55C
VRM1: 78C
VRM2: 58C

And it was a bit fiddly to get it on, but it's very simple to do. Make sure you have heatsinks on VRM1, as it gets quite hot, and check the clearance from the bracket; there isn't room for massive heatsinks. Those VRM2 temps were without heatsinks.


----------



## Arizonian

Quote:


> Originally Posted by *bloodkil93*
> 
> I've put my 290X under a H55 with an NZXT Kraken G10 bracket, can u update my profile, or u want images?
> 
> Cheers!


Show us a pic to show it off, but I believe you









I've got you updated


----------



## taem

Quote:


> Originally Posted by *Redvineal*
> 
> Hey all. Sorry if this was asked and answered before, but I'm on my phone and can't navigate easily.
> 
> anyone know if the MSI R9 290 GAMING cards have a reference PCB? I'm looking to put my two under water this week!
> 
> Thanks.


It's a reference layout but custom components, so I dunno if a reference block would fit.


----------



## Sgt Bilko

Mantle Beta Driver: http://support.amd.com/en-us/download/desktop?os=Windows%208

Have fun


----------



## Arizonian

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Mantle Beta Driver: http://support.amd.com/en-us/download/desktop?os=Windows%208
> 
> Have fun


Updated to OP


----------



## Aggronor

OMG!!!!!!!!!!!!!!!!!!!!!!!!! excellent!!!!!


----------



## kizwan

I'm guessing everyone is trying the driver now??!! What is the verdict?









[EDIT] It's already 1 hour since @Sgt Bilko posted about the driver. So, either it's going great, or it's gone wrong, or everyone is still hungover. Which is it? I can only test the driver in a couple of hours.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Arizonian*
> 
> Updated to OP


AMD has officially listed it now









http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx

I'm downloading it at 9.2KB/s........... gonna take me a while unfortunately


----------



## bond32

Well, only time BF4 lagged was aiming down sight occasionally, don't have numbers but no lag now.


----------



## Sazz

Do we just need to install the driver and that's it, or do I need to turn on something in Catalyst Control Center? I switched the API in the game itself and I don't see any improvement at all.


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> AMD has officially listed it now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
> 
> I'm downloading it at 9.2KB/s...........gonna take me awhile unfortunately


Capped? Also, is the link quality affected too, like ping going from 77 to 222 to 1000 and back to single digits every few seconds?


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> Capped? Also does the link quality effected too, like ping going from 77 to 222 to 1000 & go back to single digit in every few seconds?


not capped.

This should answer your question though:

http://www.speedtest.net/my-result/3276350908


----------



## Germanian

What's the best undervolting software for 3x XFX R9 290 reference editions?


----------



## Aggronor

Quote:


> Originally Posted by *Sazz*
> 
> do we just need to install the driver and thats it? or do I need to turn on something on the Catalyst Control? I switched the API on the game itself and I don't see any improvement at all.


Same problem here. I've tested and got just a 9 fps improvement (compared to DX11):

i7 3770K + Sapphire 290X.

I switched the API in the game also, but dunno if we have to change something in Catalyst :S...


----------



## Sazz

Quote:


> Originally Posted by *Aggronor*
> 
> Same problem here.., Ive tested and just 9 fps improvement (compared dx11):
> 
> i7 3770k + Sapphire 290X.
> 
> I switched the API on the game also.., but dunno if we have to change something on catalyst :S...


You see an improvement, but mine is showing 6 FPS less. Dunno if I have to turn something on or if I'm doing something wrong.


----------



## Aggronor

Quote:


> Originally Posted by *Sazz*
> 
> You see an improvement but mine is showing 6FPS less. dunno if I got to turn something on or am I doing something wrong.


Mmmmmmmmmmmm.... it's weird. Config? VGA?
I read that the improvements differ between AMD models; anyway, I did not change anything in Catalyst Control Center, everything was changed in the game.


----------



## Sazz

Quote:


> Originally Posted by *Aggronor*
> 
> Mmmmmmmmmmmm.... its weird.. , Config?? VGA ??
> I read that the improvements are different within ATI models.., anyway I did not change anything from catalyst center.. all was changed on the game.


I got a 290X

All I changed on Catalyst that I already have is switched the Catalyst AI to performance and standard settings to performance.

I will do some testing later on, currently watching UFC xD

I ran a quick test, turned on and off on the same spot and no changes on FPS, no gain nor loss of FPS.

I got BF4 on Ultra preset nothing else is touched.

Edit:
Is there anything we need to turn on in Catalyst Control? Because I am at a loss as to how this thing works.

2nd Edit:

After removing the driver, re-downloading, and re-installing it, Mantle finally works. I am seeing a 12 FPS improvement over DX11 with a 290X paired with a 3570K.


----------



## kizwan

*Star Swarm Benchmark: DirectX (13.12 WHQL) vs. Mantle (14.1 beta 1.6)*

*[EDIT]* Oops! One run used the Follow scenario and the other used Attract. I'll re-run the benchmark using Attract.



Random screenshot.


----------



## zpaf

Welcome Mantle ...


----------



## kizwan

*Star Swarm Benchmark: DirectX (13.12 WHQL) vs. Mantle (14.1 beta 1.6)*
_I don't know why the previous DirectX run was only 120 seconds, though._


*Star Swarm Benchmark: DirectX (14.1 beta 1.6) vs. Mantle (14.1 beta 1.6)*


----------



## Fahrenheit85

Ok, so I ordered some of that Fujipoly to use with my EK water block. Can I get a final verdict on whether I should use TIM with it on the RAM and the VRMs?

If I do use it, is the stack chip - Fujipoly - TIM - block, or chip - TIM - Fujipoly - block, or do I use TIM on both sides of the Fujipoly?


----------



## kizwan

Quote:


> Originally Posted by *Fahrenheit85*
> 
> Ok, so I ordered some of that Fujipoly to use with my EK water block. Can I get a final verdict on whether I should use TIM with it on the RAM and the VRMs?
> 
> If I do use it, is the stack chip - Fujipoly - TIM - block, or chip - TIM - Fujipoly - block, or do I use TIM on both sides of the Fujipoly?


You only need to apply TIM on the VRMs: VRMs - TIM - Fujipoly - block.

Just apply a very thin layer of TIM to the VRMs.


----------



## Fahrenheit85

Quote:


> Originally Posted by *kizwan*
> 
> You only need to apply TIM on the VRMs: VRMs - TIM - Fujipoly - block.
> 
> Just apply a very thin layer of TIM to the VRMs.


Thanks homie. Waiting on the Fujipoly from the UPS man any day this week.

Follow-up question: the EK backplate calls for a 1.5mm thick thermal pad over the GPU. Would it be safe to stack a piece of my 1mm and a piece of my 0.5mm to get that thickness, or should I just use the included pad?


----------



## MrWhiteRX7

Guys when you switch the API setting to Mantle make sure you restart BF4 before testing.


----------



## ottoore

Quote:


> Originally Posted by *bloodkil93*
> 
> In 3 consecutive runs of Unigine Valley, these were my max temps: (GPU is overclocked to 1200/1680)
> 
> GPU: 55 C
> VRM1: 78C
> VRM2: 58 C
> 
> And it was a bit fiddly to get it in, but it's very simple to do, make sure you have sinks on VRM1 as it gets quite hot and that u check the clearance from the bracket and there isn't room for massive heatsinks, those temps on VRM2 were without Heatsinks.


Ahahah, this is not true.
I have a Prolimatech MK-26 with 2 pieces of the reference plate on the VRMs (and Fujipoly Sarcon 17 W/mK), 2x 140mm fans on the Prolimatech and 2x 140mm fans on the side.
I'll post photos, but I can say that with stock voltage I get 55C on VRM2 and 68C on VRM1.

You cannot get 78C with small heatsinks on VRM1 (or with no heatsinks at all). The reason I did not choose a liquid AIO cooler for the R9 290 is the poor heat dissipation on VRM1.
A believable result with, I imagine, +150mV (at least for 1200/1680) and an NZXT G10 is 95C on the VRMs.

Bye


----------



## K1llrzzZ

Ok guys, now I'm mad... I downloaded the new Mantle driver, installed it, and restarted my PC just to make sure. I start BF4 and... BSOD... atikmpag.sys or something, so yeah, a driver issue. Ok, let's try again. It starts this time, great! Must be a one-time thing! I click on Options... it crashes (the exe stops working). Let's try again... same thing. Ok, they said there are some issues with Crossfire, so I turn it off and try with one card... same error. I turn my second card back on and start monitoring FPS on the test range... -30... Everything was maxed out, so it couldn't have been the effect turning on. I can't see my options, of course, because then the game crashes... but I guess it's still in DirectX, and I lost around 30 FPS... WHY? WHAT THE ........ AMD!?!?!


----------



## quakermaas

Just in case anybody missed it:

"*Installing The AMD Catalyst**™** Software Driver*

Current driver *MUST* be *uninstalled* before updating to AMD Catalyst™ 14.1 Beta driver. For detailed instructions on how to correctly uninstall or install the AMD Catalyst Software Suite"


----------



## Widde

Need a new cpu and a 3rd card







http://piclair.com/pi3ce Nah but thinking of picking up a 3770k just because it might oc better than my 3570k


----------



## K1llrzzZ

I did uninstall everything, restarted, and installed the new one. When I start BF4: BSOD...


----------



## syniad

Anyone else getting core throttling after installing the 14.1 driver? I had to downclock from 1200/1500 to 1100/1400 just to make the cards stop throttling themselves. My old OC was working fine with +100mV, but even going up to +200mV with this new driver it still throttles the cores.


----------



## kizwan

Quote:


> Originally Posted by *Widde*
> 
> Need a new cpu and a 3rd card
> 
> 
> 
> 
> 
> 
> 
> http://piclair.com/pi3ce Nah but thinking of picking up a 3770k just because it might oc better than my 3570k


Star Swarm only utilized one card. At least that is what I'm seeing here.
Quote:


> Originally Posted by *syniad*
> 
> Anyone else getting core throttling after installing the 14.1 driver? I had to downclock from 1200/1500 to 1100/1400 just to make the cards stop throttling themselves. My old OC was working fine with +100mV, but even going up to +200mV with this new driver it still throttles the cores.


Throttle when doing what? If I had to guess, though, it's probably normal.


----------



## quakermaas

Quote:


> Originally Posted by *syniad*
> 
> Anyone else getting core throttling after installing the 14.1 driver? I had to downclock from 1200/1500 to 1100/1400 just to make the cards stop throttling themselves. My old OC was working fine with +100mV, but even going up to +200mV with this new driver it still throttles the cores.


Have you checked that the power limits are set and being applied? Are temps OK?


----------



## Widde

Quote:


> Originally Posted by *kizwan*
> 
> *Star Swarm Benchmark: DirectX (13.12 WHQL) vs. Mantle (14.1 beta 1.6)*
> _I don't know why previous run with DirectX is only 120 seconds though._
> 
> 
> *Star Swarm Benchmark: DirectX (14.1 beta 1.6) vs. Mantle (14.1 beta 1.6)*


Star Swarm Updated
An update for Star Swarm is now available on Steam. Changes include the following:

Removed activation requirement
*Changed default duration to 360 seconds*
Added DeferredContexts flag to ini files (disabled by default - use at your own risk)
Default scenario changed to 'Follow' to yield more consistent results per run
Changed 'Follow' scenario to follow a different unit to help prevent crazy spinning
Throttled frame rate on the launcher to reduce CPU overhead while it's sleeping
Added tooltip descriptions to the scenarios

Saw these changes on steam


----------



## mboner1

Quote:


> Originally Posted by *syniad*
> 
> Anyone else getting core throttling after installing the 14.1 driver? I had to downclock from 1200/1500 to 1100/1400 just to make the cards stop throttling themselves. My old OC was working fine with +100mV, but even going up to +200mV with this new driver it still throttles the cores.


Yes, and it is frustrating the crap out of me. I'm thinking it's something to do with the i7 CPUs, because I have been searching and we seem to be the main ones affected. Turning off hyperthreading has some effect, but to be honest it seems pretty random. At first I was downclocking in games; then I sorted that out and I wouldn't downclock at the desktop.

I think MSI Afterburner might be having issues with the driver as well: the fan tachometer no longer shows for me, and if I click "enable overdrive" in CCC it kicks something into action in conjunction with MSI Afterburner running. Before ticking that in CCC I was not downclocking at the desktop; after ticking it I now downclock, but I still get drops in core clock and GPU usage, although nowhere near as bad as before. I was dropping between 0 and 100% GPU usage and the core clock was going down to 300 in game (BF4); now I'm dropping to 75% GPU usage and around 800 on the core on and off.

Something is off, and it is really starting to grate on my nerves after having to replace a faulty R9 290 Windforce. This was happening on previous drivers for me as well. It is also not temp related: I'm not going over 80 degrees, and the VRMs are staying under 84 and 62 respectively.

We are not the only ones; it seems to be pretty prevalent, so hopefully it's something that can be sorted. FTR, I didn't have these issues with a reference R9 290, and I think it only affects the non-reference cards.


----------



## devilhead

So I tried to play Battlefield 4 with a 3930K at 4.8GHz and 4x 290s at stock clocks.







So I don't see any difference between Mantle and DirectX; they need to work on it a bit more.


----------



## Widde

Btw, how come Star Swarm isn't working with Crossfire for me? I'm getting 0% usage on my 2nd card, but it works in everything else and ULPS is disabled in regedit. Anyone else having this problem?


----------



## kcuestag

Quote:


> Originally Posted by *Widde*
> 
> Btw, how come Star Swarm isn't working with Crossfire for me? I'm getting 0% usage on my 2nd card, but it works in everything else and ULPS is disabled in regedit. Anyone else having this problem?


You should pay more attention to the release notes.









It does not support CFX yet.


----------



## BradleyW

Quote:


> Originally Posted by *Widde*
> 
> Btw, how come Star Swarm isn't working with Crossfire for me? I'm getting 0% usage on my 2nd card, but it works in everything else and ULPS is disabled in regedit. Anyone else having this problem?


It does not support CFX yet. Patch and drivers should address this soon.


----------



## petedread

I have a reference R9 290X and the throttling is happening to me too, in BF4 and Valley.


----------



## ebduncan

just installed the drivers on mine.

ran the star swarm benchmark, with and with out mantle.

With Mantle: Follow, Settings Extreme, Stress Test 6 mins
Test Duration: 360 Seconds
Total Frames: 21091

Average FPS: 58.59
Average Unit Count: 4585
Maximum Unit Count: 5844
Average Batches/MS: 912.86
Maximum Batches/MS: 2144.20
Average Batch Count: 18815
Maximum Batch Count: 117833

Without Mantle Follow, Settings Extreme, Stress Test 6 mins

Test Duration: 360 Seconds
Total Frames: 11935

Average FPS: 33.14
Average Unit Count: 4120
Maximum Unit Count: 5482
Average Batches/MS: 482.22
Maximum Batches/MS: 984.68
Average Batch Count: 17164
Maximum Batch Count: 124215

I didn't get around to testing BF4; it was getting late, and I wanted to test out mining before I went to bed so it could mine overnight. Sadly, the 14.1 beta 6 driver results in terrible mining performance, and I started to get black squares on the screen along with other graphical errors. I tried a clean install of 14.1 after using the AMD driver remover, but it still didn't resolve the issues. I reverted back to 13.12 for now, and everything works fine again. I'm sure the AMD software guys are busy right now getting this update out. The performance benefits are there, though, at least in Star Swarm. I hope they release a better driver in the future that doesn't cripple mining performance and still gives me access to the Mantle goodness.


----------



## kizwan

I just played one session of BF4 on a 64-player server with Mantle enabled (restarted the game after enabling it). With Crossfire enabled I did notice intermittent stuttering. This is at stock clocks, though. The game feels smoother, and CPU usage is also down from just under 80% to just under 70%.

3820 @4.75GHz, Ultra, 100% resolution scale, 2 x 290's CFX @stock clock.


----------



## Porter_

I'm also getting gpu core throttling with the 14.1 drivers. Before updating (was using 13.11 betas) my 290x would stay rock solid at 1100/1200 while playing BF4. Now with the 14.1 drivers I'm throttling down to ~low 900's on the core using both Mantle and DX11. Tried a more aggressive fan profile and it didn't help; card throttled back even at 84 C.


----------



## petedread

Does it take a while for the memory to "burn in"? When I first got my card, 1500 was the best memory OC I could get in AB with +100mV. I noticed yesterday that my memory will now go up to 1625. The core OC hasn't changed; it is still 1180 max. I would also like to add (because it was mentioned a few pages back) that a higher memory OC improves Valley scores, for me at least.

My Valley scores have dropped from 3485 to 2880; I'm assuming this is due to throttling, which is being caused by the 14.1 drivers. Just like Porter above, heat is not the cause.
Is it the general consensus that the 14.1 drivers are causing the throttling? Or could it be the new drivers clashing with Afterburner?


----------



## Jhors2

This is some strange behavior; I'm seeing it myself. When I first got my cards I could only clock the memory to 1425, and now I can get 1525 with the exact same voltage settings. All of my cards are Hynix, FWIW.


----------



## jamaican voodoo

Well, I'm happy to report that I have a 10-12 FPS boost in BF4 with my 290. In areas where I usually get 60-65 FPS I'm now getting 75-81, sometimes 90. My minimum FPS increased too; it didn't drop below 59 FPS the whole time, whereas before it would drop to 53 or so. Overall it was smooth. Keep in mind this was 64-player Paracel Storm, 1500 tickets, Conquest Large, with all settings maxed except for post AA (high) and resolution scaling.


----------



## bloodkil93

Quote:


> Originally Posted by *ottoore*
> 
> Ahahah this is not true.
> I have prolimatech mk-26 with 2 pieces of the reference plate on vrms ( and Fujipoly Sarcon 17W/mK). 2*140mm fans on the prolimatech, 2*140mm fans on the side.
> I'll post photos but i can say that, with stock voltage i achieve 55C on vrm2, 68C on vrm1.
> 
> You cannot get 78C with little heatsinks on vrm1 ( or with not heatsinks at all). The reason why i don't choose liquid aio cooler for r9 290 is bad dissipation of Vrm1.
> A believable result with, i imagine, +150mV ( at least for 1200/1680), and Nzxt g10 is 95C on Vrm.
> 
> Bye


Just to quote you on that: VRM1 is fitted with a large heatsink that I purchased in the Alpenfohn heatsink pack. I can give you a screenshot if you would like.


----------



## cam51037

Tomorrow I ship out my 290 for replacement, fingers crossed that my replacement doesn't have a fan rattle like my current card.

Really tempted to try out 14.1, but I'm going to be packing this card up in a few hours anyway and using a GTX 670 as a backup card instead.


----------



## mboner1

Quick update on my situation: I had pretty bad GPU usage and the core was downclocking before, and I had tried everything with only minor improvements. I had no issues with the original install, but decided to have another crack at re-installing 14.1.

With the second attempt I got a yellow question mark thing in CCC but no errors, and still had the same old issues. I booted to safe mode, ran Driver Sweeper, and re-installed again: same issues.

In the end it took about 5 tries, but wowee, the result was worth it. I believe the install that completed properly for me took a tad longer than all the others that had issues (even though some of the attempts reported no problems). I now don't get any fluctuations in BF4: GPU usage stays at 100% and the core clock stays at 1080, which is all I have it set to atm. ZERO fluctuations. It is soooo damn smooth. Compared to what I initially thought Mantle was, this is crazy good.

My advice is that the GPU usage and core downclocking issues aren't a bug with Mantle or this driver that needs to be fixed in a later driver by AMD, but an issue with the install that can be rectified by us with some persistence.

Good luck!


----------



## sugarhell

Hmm we need a mantle bug thread


----------



## Jack Mac

Quote:


> Originally Posted by *mboner1*
> 
> Quick update on my situation: I had pretty bad GPU usage and the core was downclocking before, and I had tried everything with only minor improvements. I had no issues with the original install, but decided to have another crack at re-installing 14.1.
> 
> With the second attempt I got a yellow question mark thing in CCC but no errors, and still had the same old issues. I booted to safe mode, ran Driver Sweeper, and re-installed again: same issues.
> 
> In the end it took about 5 tries, but wowee, the result was worth it. I believe the install that completed properly for me took a tad longer than all the others that had issues (even though some of the attempts reported no problems). I now don't get any fluctuations in BF4: GPU usage stays at 100% and the core clock stays at 1080, which is all I have it set to atm. ZERO fluctuations. It is soooo damn smooth. Compared to what I initially thought Mantle was, this is crazy good.
> 
> My advice is that the GPU usage and core downclocking issues aren't a bug with Mantle or this driver that needs to be fixed in a later driver by AMD, but an issue with the install that can be rectified by us with some persistence.
> 
> Good luck!


Hm, I guess that would explain why some people like me get freezes and dips with Mantle while others have a perfect experience. Thing is, I had no issues installing the 14.1 beta.


----------



## esqueue

Anyone know if it will affect mining performance? I have an odd setup and am afraid of even upgrading: a 650 Ti as the main video card and an R9 290X as a full-time miner. Nothing is connected to the 290X at the moment, so it is mining pretty fast; that was the only way I could get ~900 kH/s with Elpida memory.


----------



## mboner1

Quote:


> Originally Posted by *Jack Mac*
> 
> Hm, I guess that would explain why some people like me get freezes and dips with Mantle while others have a perfect experience. Thing is, I had no issues installing the 14.1 beta.


Yeah, I didn't think I did either, but check this out now; it was all over the shop before. The only dip is at the end, and that's because I had to use the Snipping Tool instead of Fraps and it took a few seconds to get the pic.


----------



## sugarhell

I created a Mantle bug thread. Please take a look and report if you have an issue:

http://www.overclock.net/t/1464054/mantle-bug-thread


----------



## taem

Anybody know anything about the PowerColor PCS+ 290? Whether the heatsink contacts the VRAM and VRMs, etc.? Also, dimensions? Official specs state 270mm x 115mm x 38mm, but is that right? Everyone says what a huge 3-slot card this is, and that is not huge. My 280X Vapor-X was about 285mm x 145mm x 44mm, and that was called a 2.5-slotter.


----------



## petedread

I have just discovered that I cannot play BF4 with Afterburner; I have to close AB now. I think I read Unwinder say something about a new beta or an update to beta 18 being on its way any day now. Anyone else having trouble with BF4/AB/Mantle?


----------



## EliteReplay

Quote:


> Originally Posted by *petedread*
> 
> I have just discovered that I cannot play BF4 with Afterburner; I have to close AB now. I think I read Unwinder say something about a new beta or an update to beta 18 being on its way any day now. Anyone else having trouble with BF4/AB/Mantle?


This doesn't have anything to do with Mantle or the drivers; I'm getting the same thing with my GTX 460. With AB open, the game just freezes if you alt-tab, and I have to force close it in Task Manager.


----------



## bond32

Appears I am getting some sort of random video lag in BF4 with Mantle. Not sure of the cause yet... I would say stuttering, but that term is used loosely. Every second or so I'll get a random burst of video lag. This is with a 290X.


----------



## kizwan

This is what happens when playing BF4 with Mantle & Crossfire enabled: intermittent stuttering (a known issue). As shown in the second pic, the GPU2 core clock is fluctuating.

This is recorded data from one BF4 session. 2 x 290's Crossfire @1150/1400, 1080p, Ultra settings, 100% resolution scale, Mantle.

GPU1 core/memory clock


GPU2 core/memory clock


----------



## petedread

EliteReplay is saying that it is happening on his Nvidia card too, so there could be big problems with the Battlefield update rather than Mantle.


----------



## tobitronics

Is it safe to flash a 015.041.000.002.000000 bios from a ASUS DCII OC 290X onto a ASUS Stock (now watercooled) 290X? Or will this kill my card?


----------



## Aggronor

Does anyone know a program to record the frame rate (FPS) with Mantle? Fraps and Afterburner are not working.







I want to run some benchmarks in BF4 with my R9 290X comparing DX11 and Mantle.

Any ideas? Thanks.


----------



## Asrock Extreme7

Anyone mining AlienCoin? Start now; it's going to be worth more than Litecoin.


----------



## bond32

Quote:


> Originally Posted by *Aggronor*
> 
> Does anyone know a program to record the frame rate FPS, with Mantle ??? Fraps and Afterburner are not working
> 
> 
> 
> 
> 
> 
> 
> because i want to perform some benchs on BF4 with my r9 290x between dx11 and mantle...
> 
> any ideaS? Thanks.


Quote:


> Originally Posted by *the9quad*
> 
> After a round of 64 player Paracel Storm with mantle:
> 
> 1440p True Ultra aka w/hbao 4msaa/ post high/ render 100%
> 
> Min:129.53 (4.21ms)
> Max:237.53(7.72ms)
> Avg:146.82(6.811ms)
> 
> Here is how to generate the benchmark log and analyze it, from what I think I've figured out.
> 
> Guide (ignore the quotation marks, do not type them). In game:
> 
> 1) Press "`"
> 
> 2) Type "perfoverlay.framefilelogenable 1"
> Do not add this to your user cfg, as the data it generates while the game loads is useless; wait until you're in game and spawned.
> 
> 3) Prior to exiting the game, type "perfoverlay.framefilelogenable 0"
> If you don't do this, the data at the end of your file will have huge, useless numbers.
> 
> 4) Once out of the game, navigate to the Documents/Battlefield 4 folder and you should see a FrameTimeLog.csv file.
> 
> 5) Open it with Excel or LibreOffice Calc (it's free).
> 
> You should have a file that looks like the screenshot.
> 
> The first column, A, is all you really care about (I think, for FPS), so just use the following formulas to get the min, max, and avg, where XX is the number of your last row in column A. In an empty cell type:
> 
> "=MIN(A1:AXX)"
> 
> In another empty cell type:
> "=MAX(A1:AXX)"
> 
> In another empty cell type:
> "=AVERAGE(A1:AXX)"
> 
> In the screenshot above it would look like:
> =MIN(A1:A14)
> 
> *Oh, and just divide 1000 by the min, max, etc. to get FPS.*
> 
> Hope that helps. Feel free to correct and/or add info, and yes, you could graph it as well if you want. I'm sure someone more knowledgeable will clarify what all 3 variables really are and how to make use of them.
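The spreadsheet steps quoted above can also be scripted. A minimal Python sketch, assuming (as the guide describes) that column A of FrameTimeLog.csv holds per-frame render times in milliseconds; the file name and layout come from the quoted post:

```python
# Parse BF4's FrameTimeLog.csv and compute min/max/avg FPS.
# Assumes the first CSV column holds per-frame render times in milliseconds.
import csv

def frame_stats(path="FrameTimeLog.csv"):
    with open(path, newline="") as f:
        times_ms = [float(row[0]) for row in csv.reader(f) if row and row[0].strip()]
    if not times_ms:
        raise ValueError("no frame times found in %s" % path)
    avg_ms = sum(times_ms) / len(times_ms)
    # FPS = 1000 / frame time (ms); the longest frame gives the minimum FPS
    return {
        "min_fps": 1000.0 / max(times_ms),
        "max_fps": 1000.0 / min(times_ms),
        "avg_fps": 1000.0 / avg_ms,
    }
```

Point it at your own log, e.g. `frame_stats(r"C:\Users\you\Documents\Battlefield 4\FrameTimeLog.csv")`, and it returns the same three numbers the spreadsheet formulas produce.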


----------



## flamin9_t00l

I have just figured out how to calculate the average FPS. Let's say column A runs from row 1 to 7000.

In any cell (let's use G5) type the following: =SUM(A2:A7000)

Then in a new cell type:

=7000/G5*1000

where 7000 is the number of frames and G5 is the total rendering time in milliseconds; multiplying by 1000 gives the average FPS.

I tested this method against some older FRAPS minmaxavg.csv benchmarks and it's correct, although I don't know how to get the min and max FPS yet.

Edit: check the following post for a much easier way to get the min, max, and avg.
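The same calculation in code form, as a quick sketch (the frame times below are made up for illustration, not benchmark data):

```python
# Average FPS = frame count / total render time (ms) * 1000,
# i.e. the code equivalent of =7000/G5*1000 with G5 = SUM(A2:A7000).
def avg_fps(frame_times_ms):
    total_ms = sum(frame_times_ms)      # the =SUM(...) step
    return len(frame_times_ms) / total_ms * 1000.0

# Example with made-up frame times: three frames over 40 ms total
print(avg_fps([10.0, 20.0, 10.0]))  # 75.0
```

This is equivalent to taking 1000 divided by the mean frame time, which is why it matches the FRAPS averages.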


----------



## Forceman

For the frame times that BF4 records (in ms), just divide 1000 by the frame time. That works for the average, the min, or the max, pretty much exactly as the9quad showed. A 16.7ms frame time ≈ 60 FPS (1000/16.7).


----------



## flamin9_t00l

Quote:


> Originally Posted by *Forceman*
> 
> For the frame times that BF4 records (in ms), just divide 1000 by the frame time. That works for the average, the min, or the max, pretty much exactly as the9quad showed. A 16.7ms frame time ≈ 60 FPS (1000/16.7).


Yep, spot on... my bad, I got the min and the max mixed up when checking my results. Guess I went the long way around with that one, lol.


----------



## brazilianloser

They haven't updated Afterburner in a while... Maybe with the new drivers out they will finally give us an update soon.


----------



## Thorteris

I can't wait, I will be joining the club tomorrow. Should've been able to on Friday though... stupid shipping rules.


----------



## kot0005

hi Ariz, can you update my card please?

Using R9 290X Sapp. Tri X OC.

Valid here.

Also I am almost done with my build


----------



## Arizonian

Quote:


> Originally Posted by *kot0005*
> 
> hi Ariz, can you update my card please?
> 
> Using R9 290X Sapp. Tri X OC.
> 
> Valid here.
> 
> Also I am almost done with my build
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats on the Tri-X and your build, updated


----------



## chiknnwatrmln

Gonna formally join the club in a day or two, and you're gonna have to put me down for water.









I've run into so many snags in my rebuild, but I'm almost done. All that's left is to reapply CLU to my CPU's die, put on the CPU block, run the tubing, and leak test.


----------



## Ukkooh

It seems that my card likes to black screen at idle when the memory is overclocked to 1500MHz. Has anyone figured out a fix for this? This card's black screens are very odd, as the GPU fan goes to 100% when it black screens.


----------



## Sazz

Quote:


> Originally Posted by *Ukkooh*
> 
> It seems that my card likes to black screen at idle when the memory is overclocked to 1500MHz. Has anyone figured out a fix for this? This card's black screens are very odd, as the GPU fan goes to 100% when it black screens.


Black screens mean an unstable memory OC. If you get black screens with an OC, it means the OC is unstable; if you get them at stock clocks, it means you need to RMA.

My card can go 1600MHz on the memory, but once I put some load on it I get black screens. Under air my max is 1350; under water (full block) my max is 1450.


----------



## Ukkooh

Quote:


> Originally Posted by *Sazz*
> 
> Black screens mean an unstable memory OC. If you get black screens with an OC, it means the OC is unstable; if you get them at stock clocks, it means you need to RMA.
> 
> My card can go 1600MHz on the memory, but once I put some load on it I get black screens. Under air my max is 1350; under water (full block) my max is 1450.


As I said, it happens only at idle and seems to be stable under load.


----------



## Sazz

Quote:


> Originally Posted by *Ukkooh*
> 
> As I said, it happens only at idle and seems to be stable under load.


If it doesn't happen at stock clocks, that means the OC is unstable. That's the end of it.


----------



## Ukkooh

So are there any simple fixes for it? I think it goes to full memory clock at idle volts and crashes.


----------



## Sazz

There's no memory voltage control on this card. Right now I only get those black screens if my memory clock is unstable; otherwise I've never gotten them at stock clocks. You need to try lower clocks on your memory until you go a full day without experiencing any black screens. That's how I did it with mine xD


----------



## Sgt Bilko

Top AMD CPU based Dual GPU Score for FS and FSE









FS
http://www.3dmark.com/fs/1644041

FSE
http://www.3dmark.com/fs/1644094


----------



## Sazz

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Top AMD CPU based Dual GPU Score for FS and FSE
> 
> 
> 
> 
> 
> 
> 
> 
> 
> FS
> http://www.3dmark.com/fs/1644041
> 
> FSE
> http://www.3dmark.com/fs/1644094


Cool, my previous build is still top single-GPU on FS and 3DMark 11. Surprised no one has passed those old runs now that a lot of waterblocks are already out there.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Sazz*
> 
> Cool, my previous build is still top on the single GPU on FS and 3Dmark 11. Surprised no one passed those old runs now that a lot of waterblocks already out there.


Well, I was really surprised when I looked it up, considering my CPU is on an H100i and my 290s are air-cooled.

Single card is my next target though. I need to wait till like 4am for cool enough ambients (33C in my room atm), so give me another 4-5 days and I'll go for a single-card run.

Can you send me a link to your run? (Just wanna know what you were running.)


----------



## Sazz

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well I was really surprised when i looked it up considering my CPU is on an H100i and my 290's are Air-Cooled.......
> 
> Single Card is my next target though, need till wait till like 4am for cool enough ambients though (33c in my room atm) So gimme another 4-5 days and i'll go for a Single card run
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can you send me a link to your run? (just wanna know what you were running
> 
> 
> 
> 
> 
> 
> 
> )


Firestrike
http://www.3dmark.com/fs/1468653

3Dmark 11
http://www.3dmark.com/3dm11/7765060

On Firestrike there's this one score at 11k, and if you check the GPU score it's 21k. I think either it was misread as a single-GPU config or it was under LN2? I dunno, so I don't really consider that score valid, because a 21k graphics score on a single card is insane.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Sazz*
> 
> Firestrike
> http://www.3dmark.com/fs/1468653
> 
> 3Dmark 11
> http://www.3dmark.com/3dm11/7765060
> 
> On firestrike there's this one score at 11k, and if you check the GPU score its 21k.. I think either it was mis-read as single GPU config or under LN2? I dunno, so I don't really consider that score to be valid coz 21k graphics score on a single card is insane..


Wow, 1300... not sure I can match that on air. I've tested 1250 and that holds long enough for FS, but it's nothing for everyday use of course.

And 21k? Yeah, that must be a misread score. My 290s in CF at 1200/1400 are 23k, so I doubt even LN2 would push it that high.

I'm hitting a 9.3k FS score atm, but I know I can go a bit higher... just need the colder ambient temps.









I want to try and do it with an AIO and air though... just wanna push the limits.


----------



## Ukkooh

I just got an odd idea: would it work if I flashed my reference card with a Sapphire Tri-X BIOS? It would be nice to know if they have some BIOS magic that explains their good OCs.


----------



## Sazz

Quote:


> Originally Posted by *Ukkooh*
> 
> I just got an odd idea. Would it work if I flashed my reference card with a sapphire tri-x bios? Would be nice to know if they have some bios magic which explains their good ocs.


Given that the Tri-X uses the same reference PCB, the only thing I can think of that would be different in the BIOS is the fan speeds. And they are handpicked to have Hynix memory, I believe? In general Hynix cards usually OC better, but not always.

But you can try it; maybe it does have something else different. I'm gonna try it myself to see if I get any different results on my OC.


----------



## mboner1

I flashed a Windforce R9 290 to the Tri-X BIOS and woke up to a dead card the next morning. Both BIOS switches were dead though. It was also running higher than expected temps on the VRMs and overall before flashing, so I'm not sure if it was related.


----------



## Sazz

Quote:


> Originally Posted by *mboner1*
> 
> I flashed a windforce r9 290 to the trix bios and woke up to a dead card the next morning. Both bios switches were dead tho. And it was having higher than expected temps on the vrm's and overall before flashing so not sure if it was related.


just flashed my 290X to the Tri-X BIOS, currently testing memory clocks to see if I don't get black screen above 1350 on reference cooler. I know for a fact under full waterblock my card can run 1450 but under reference so far my best clock would be 1350 on memory.

Edit:

using Tri-X BIOS I went thru 1 BF4 session w/o black screen under stock cooling at 1450mhz on memory.. gonna try few more games and then move up the clocks.

Edit 2:

Just got done w/ 2 BF4 matches and 2 star swarm run w/ memory clock at 1500Mhz on Tri-X BIOS and so far no black screen. I did try 1550Mhz on memory and I got black screen w/ in 2minutes. Looks like Tri-X BIOS have something different, I was using BF4 edition BIOS prior to flashing my card to Tri-X BIOS. Somebody else should test it out, the more the better so that we can confirm if using Tri-X BIOS really make memory clock more stable.

I will test it further later today (I need to go to sleep, it's almost 4AM LOL). I won't jump to conclusions just yet; maybe it just happened not to black-screen so quickly at these clocks. But then again, prior to flashing I usually got a black screen at 1400MHz on the stock cooler. (When I had the waterblock on I could run 1450MHz stable; 1500 was an instant black screen, or black lines appeared on screen.)


----------



## sun100

Now that Mantle is out for BF4, have any of you guys with an OC'd R9 290 tried it at stock yet? I've downclocked mine to stock speeds and voltage just to try the game on Mantle, and it runs BF4 on Ultra with 4xMSAA at 1080p with no hiccups. However, the card is getting a serious run for its money. It's about to take off from my table

I set a custom fan speed curve in MSI AB, and in-game it runs at around 65% fan speed, the first game to push it that high (usually they sit around 55-60%).
Should I maybe let it run a bit warmer? (Right now fan speed and temps are set proportionally, so 65% corresponds to 65-70°C.)
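For what it's worth, the "proportional" mapping described above is just a piecewise-linear curve between temperature/duty breakpoints. A minimal sketch of the idea (the breakpoint values are made up for illustration, not MSI Afterburner's actual defaults):

```python
# Piecewise-linear fan curve: each breakpoint is (temp °C, fan duty %).
# The breakpoint values here are illustrative, not Afterburner defaults.
def fan_speed(temp_c, points=((40, 30), (65, 65), (90, 100))):
    if temp_c <= points[0][0]:
        return points[0][1]
    for (t0, s0), (t1, s1) in zip(points, points[1:]):
        if temp_c <= t1:
            # interpolate linearly between the neighbouring breakpoints
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return points[-1][1]  # above the last breakpoint, pin to max duty
```

With these breakpoints, 65°C maps to 65% duty, matching the proportional behaviour described; letting the card run warmer just means shifting the breakpoints to the right.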


----------



## BradleyW

Quote:


> Originally Posted by *sun100*
> 
> Now that Mantle is out on BF4 any of you guys with OCd R9 290 stock tried it yet ? I've downclocked mine to stock speeds and voltage just to try the game on Mantle and it runs bf4 on ultra with 4xMSAA and 1080p with no hickups, however the card is getting some serious run for its money. It's taking off my table
> 
> 
> 
> 
> 
> 
> 
> I did custom fan speed curve on MSI AB and while ingame it runs at around 65% of speed, first game to do so yet (usually they go to 55 ish to 60%).
> Should i maybe let it run a bit warmer (now fan speed and temps are set to proportional, so 65% is for 65-70-ish C) ?


The card is fine at 90°C, so watch the temps if you reduce the fan speed.
Mantle is excellent. DirectX must die, and quickly. Mantle is THE BEST API. I'm sick of dropping to 70 fps in every game I play because of stupid CPU bottlenecks.
Anyway, here are my results: BF4 - Ultra - 85 FOV - 1080p
Siege of Shanghai, looking at tower from the water, 64 man server - lowest fps @ 65.
MANTLE, same conditions - lowest fps @ 129.
CPU usage reduced, GPU usage increased.


----------



## the9quad

Quote:


> Originally Posted by *flamin9_t00l*
> 
> I have just figured out how to calc average fps. Let's say column A runs from rows 1 - 7000
> 
> In any cell (lets use G5) type the following =SUM(A2:A7000)
> 
> then in a new cell type:
> 
> =7000/G5*1000
> 
> where 7000 is the amount of frames, G5 is the total rendering time in milliseconds, then multiplied by 1000 to give average FPS.
> 
> Tested this method against some older FRAPS minmaxavg.csv benchmarks and it's correct, although I don't know how to get min and max FPS yet.
> 
> edit: check following post for a much easier way to get the min, max, avg.


I posted a guide on this in about 7 different threads here yesterday.
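For anyone who'd rather not do this in a spreadsheet, the same min/max/avg math can be done in a few lines (a minimal sketch assuming a log with one per-frame render time in milliseconds per line; the file name below is a placeholder, not from the thread):

```python
# Compute average, minimum and maximum FPS from a frame-time log
# (one render time in milliseconds per line).
def fps_stats(path):
    with open(path) as f:
        times_ms = [float(line) for line in f if line.strip()]
    avg_fps = len(times_ms) / sum(times_ms) * 1000  # same as =N/SUM(...)*1000
    # The LONGEST frame gives the LOWEST fps, and vice versa.
    min_fps = 1000 / max(times_ms)
    max_fps = 1000 / min(times_ms)
    return avg_fps, min_fps, max_fps

# avg, lo, hi = fps_stats("frametimes.csv")  # placeholder file name
```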


----------



## Kriant

Anyone else having huge issues with the 14.1 v1.6 beta?
My tri-fire goes crazy with them in most apps.


----------



## kizwan

I'm still confused about the BF4 in-game benchmark ("PerfOverlay.FrameFileLogEnable 1"). First you find the MIN, MAX & AVERAGE frame times, then divide 1000/(MIN/MAX/AVERAGE). However, after dividing, the MIN becomes the MAX & the MAX becomes the MIN.
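That swap is expected rather than a bug: FPS is the reciprocal of frame time (1000 ms divided by the frame time), so the smallest frame time corresponds to the largest FPS and vice versa. A quick sketch with made-up values:

```python
# FPS = 1000 / frametime_ms, so MIN and MAX trade places after conversion.
frametimes_ms = [8.0, 16.0, 40.0]  # illustrative values only

max_fps = 1000 / min(frametimes_ms)  # shortest frame -> highest FPS (125.0)
min_fps = 1000 / max(frametimes_ms)  # longest frame  -> lowest FPS  (25.0)
```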


----------



## Widde

Quote:


> Originally Posted by *Kriant*
> 
> Anyone else having huge issues with 14.1 1.6 beta?
> My tri-fire goes crazy with them in most apps.


My 290s began to stutter in league of legends on windowed fullscreen mode, never did that with 13.12 drivers


----------



## bond32

Mantle has been a disaster for me. DirectX was fine; the game played fine on my rig. With Mantle enabled I get random video lag spikes, maybe 4-5 times per game. And yes, I've reinstalled the drivers multiple times.


----------



## kizwan

14.1 beta v1.6 is a beta driver & the first one that supports Mantle. I like to think it's not 100% perfect yet.


----------



## quakermaas

Quote:


> Originally Posted by *bond32*
> 
> Mantle has been a disaster for me. Direct x was fine, played the game fine in my rig. With mantle enabled I get random video lag spikes maybe 4-5 times per game. And yes I've reinstalled drivers multiple times.


Most people are getting them, but I am also getting huge FPS increases. Remember, this is a beta.

I get the video lag spike about once a minute; when I turned MSAA to x4 and resolution scale to 150%, the spikes came every second.


----------



## BradleyW

Quote:


> Originally Posted by *quakermaas*
> 
> Most people are getting them, but I am getting huge FPS increases and remember this is a beta.
> 
> I get the video lag spike about once a minute, I turned MSAA to x4 and RC. to 150% and the spikes was every second.


Same here, but I get them once every 10 minutes on a bad day. FPS increased up to 50% in CPU limited areas. I know people might not believe me, but I don't care because I'm too busy enjoying the increase.


----------



## kizwan

Quote:


> Originally Posted by *BradleyW*
> 
> Same here, but I get them once every 10 minutes on a bad day. FPS increased up to 50% in CPU limited areas. I know people might not believe me, but I don't care because I'm too busy enjoying the increase.


I believe you.


----------



## quakermaas

Quote:


> Originally Posted by *BradleyW*
> 
> Same here, but I get them once every 10 minutes on a bad day. FPS increased up to 50% in CPU limited areas. I know people might not believe me, but I don't care because I'm too busy enjoying the increase.


I believe you, because I am getting the same

It makes DX look like a dinosaur


----------



## mojobear

Quote:


> Originally Posted by *Kriant*
> 
> Anyone else having huge issues with 14.1 1.6 beta?
> My tri-fire goes crazy with them in most apps.


Hey, yeah, my tri-fire does not like the Mantle drivers... it crashes immediately after the load screen in BF4, and my iGPU is turned off by default on my Asus M6E... It also made DX11 weird... so I reverted back to 13.12.


----------



## Arizonian

XFX R9 290X Black OC Edition Graphics Card Review with Mantle
by Stuart Davidson - 3rd February 2014

http://www.hardwareheaven.com/reviews/1930/pg1/xfx-r9-290x-black-oc-edition-graphics-card-review-with-mantle-introduction.html


----------



## anubis1127

XFX have really stepped it up in the aesthetics department with this generation of GPUs. I would almost consider getting one (if I didn't have horrible prior experiences with their customer service).


----------



## Krusher33

Quote:


> Originally Posted by *mojobear*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Kriant*
> 
> Anyone else having huge issues with 14.1 1.6 beta?
> My tri-fire goes crazy with them in most apps.
> 
> 
> 
> hey yah my trifire does not like mantle drivers....it crashes immediately after load screen on BF4...and my iGPU is turned off by default on my Asus M6E...It also made Dx11 weird....so I reverted back to 13.12

The driver notes say it does not support CrossFire yet.


----------



## Ukkooh

Quote:


> Originally Posted by *Sazz*
> 
> 
> 
> Spoiler: Warning: Wall of text!
> 
> 
> 
> just flashed my 290X to the Tri-X BIOS, currently testing memory clocks to see if I don't get black screen above 1350 on reference cooler. I know for a fact under full waterblock my card can run 1450 but under reference so far my best clock would be 1350 on memory.
> 
> Edit:
> 
> using Tri-X BIOS I went thru 1 BF4 session w/o black screen under stock cooling at 1450mhz on memory.. gonna try few more games and then move up the clocks.
> 
> Edit 2:
> 
> Just got done w/ 2 BF4 matches and 2 star swarm run w/ memory clock at 1500Mhz on Tri-X BIOS and so far no black screen. I did try 1550Mhz on memory and I got black screen w/ in 2minutes. Looks like Tri-X BIOS have something different, I was using BF4 edition BIOS prior to flashing my card to Tri-X BIOS. Somebody else should test it out, the more the better so that we can confirm if using Tri-X BIOS really make memory clock more stable.
> 
> I will further test it later today (I needa go to sleep it's almost 4AM LOL), I will not jump to conclusion just yet maybe it just happens to not get black screen so quick at these clocks. But then again prior to flashing I usually get black screen at 1400Mhz on stock cooler. (when I had waterblock on I can go 1450Mhz stable, 1500 instant black screen/or screen black lines appears).


Thank you for trying it out! I'm glad that I was lucky enough to get hynix memory so I guess I won't have memory related problems. I don't feel like creating a custom fan profile though so I'll try it out next weekend when I'll finally put my card under water.


----------



## quakermaas

Quote:


> Originally Posted by *Krusher33*
> 
> The driver notes says it does not support crossfires yet.


I thought that, until I tried it.


----------



## Sazz

Quote:


> Originally Posted by *Ukkooh*
> 
> Thank you for trying it out! I'm glad that I was lucky enough to get hynix memory so I guess I won't have memory related problems. I don't feel like creating a custom fan profile though so I'll try it out next weekend when I'll finally put my card under water.


Interestingly enough, my card has Elpida memory =P


----------



## BradleyW

Quote:


> Originally Posted by *Ukkooh*
> 
> Thank you for trying it out! I'm glad that I was lucky enough to get hynix memory so I guess I won't have memory related problems. I don't feel like creating a custom fan profile though so I'll try it out next weekend when I'll finally put my card under water.


What memory-related problems? You mean a lower hash rate on crypto-currency? Nothing wrong with Elpida.


----------



## Ukkooh

Quote:


> Originally Posted by *BradleyW*
> 
> What memory related problems? You mean less Hash on Crypto-Currency? Nothing wrong with Elpida.


We were talking about the Sapphire Tri-X BIOS; using it with Elpida memory could cause some odd issues, since the Tri-X cards are Hynix-only.
I didn't mean to bash Elpida in general.


----------



## kizwan

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ukkooh*
> 
> Thank you for trying it out! I'm glad that I was lucky enough to get hynix memory so I guess I won't have memory related problems. I don't feel like creating a custom fan profile though so I'll try it out next weekend when I'll finally put my card under water.
> 
> 
> 
> What memory related problems? You mean less Hash on Crypto-Currency? Nothing wrong with Elpida.

The black screen issue in games when overclocked past a certain MHz. For example, read Sazz's post.


----------



## battleaxe

Quote:


> Originally Posted by *Ukkooh*
> 
> So are there any simple fixes to it? I think it goes to full mem clock on idle volts and crashes.


Bump a tad more voltage to the core. It will stabilize your memory.


----------



## Asrock Extreme7




----------



## Sazz

And another issue I've encountered in BF4 on Mantle: sometimes when I start the game I get a bug where BF4 becomes super laggy and won't respond, and Windows says my graphics card is too slow and asks me to turn down my graphics settings.

When I get that I have to restart to get the game starting normally again.


----------



## mojobear

Quote:


> Originally Posted by *Krusher33*
> 
> The driver notes says it does not support crossfires yet.


Yeah, but I thought it was worth a shot, since DICE posted 290X CrossFire scores for BF4 with Mantle and others with CrossFire 290s have seen benefits. Oh well, I'll wait some more!


----------



## jamaican voodoo

I've got 2 more 290's on the way from the Egg. Man, can I say expensive. Gosh, I wish they were 290Xs for the price I paid, lol
Bitcoin, go to hell already


----------



## brazilianloser

Quote:


> Originally Posted by *jamaican voodoo*
> 
> i got 2 more 290's on the way from the egg man can i say expensive, gosh i wish they were 290x for the price i paid lol
> bitcoin go to hell already


No kidding man... I was contemplating getting a third card, but since there's an increase of almost 200 bucks over the original prices I paid, I'll just pass.


----------



## Paul17041993

Well, you can officially count me in. I need a better camera... and desk...
(reference Sapphire 290X)

As for clocks, she seemed to run 1100/1500 through Heaven fine, though she then flickered out and eventually hung on a black screen after I locked the computer. 1050/1350 seems perfectly fine though. Not going to worry about overvolting yet; for one thing it's too hot, so I'll have to wait until it gets closer to winter, but I'll likely be trying to get a water loop set up by then too.

The amount of air that comes out the back of these at full tilt is kinda scary, like a vacuum cleaner almost...

Edit: here's the GPU-Z link:
http://www.techpowerup.com/gpuz/wr32g/


----------



## Widde

Have a problem: in CrossFire my 290s don't want to go above 1250 on the memory (both Elpida); I'm losing fps in both FurMark and OCCT. But the MSI one clocked up to 1300 when it was alone; now I have a Sapphire reference in there as well.


----------



## Blameless

Quote:


> Originally Posted by *BradleyW*
> 
> What memory related problems? You mean less Hash on Crypto-Currency? Nothing wrong with Elpida.


A significant portion of 290/290X cards with Elpida memory are not stable at stock speeds, and the majority of reference cooled samples will eventually get black screen crashes with any memory OC, especially in conjunction with core OCs.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Blameless*
> 
> A significant portion of 290/290X cards with Elpida memory are not stable at stock speeds, and the majority of reference cooled samples will eventually get black screen crashes with any memory OC, especially in conjunction with core OCs.


This is not true.

Most Elpida cards work perfectly fine; some, like mine, can hold 1650 MHz mem without a problem. Hynix provides only a minuscule performance increase; it's really only noticeable for mining.

Most black screens result from unstable OC's and people not understanding that a card running at max mem clocks on idle voltages can cause problems.


----------



## dspacek

Hello friends, I have a VERY strange issue now. I don't know exactly whether it started after trying the 13.30 Catalyst, but I think since then I've been getting this throttling issue (see picture).
Performance is massively turned down with every MHz of OC on the core. You can see the throttling caused by changing the clock from stock to a 1000MHz OC: the real clock throttles around 770MHz. Then I set it back in AB to the stock 947MHz and it throttles around 880-890MHz.
Is this issue caused by the driver, AB, or the BIOS? I'm still on the ASUS BIOS, and the same issue occurs with the original Sapphire BIOS.

Thanks for all your help.

PS: I thought the system had been through many driver changes and that could be the reason, but I reinstalled the system, and the card is still throttling like crazy with the 14.1 drivers and every OC utility I know... Screw AMD!

I've never seen slightly higher clocks, or even stock clocks, lead to lower real clocks... especially under water!!!


----------



## Arizonian

Quote:


> Originally Posted by *Paul17041993*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> well you can officially count me in, I need a better camera... and desk...
> 
> as for clocks, she seemed to run 1100/1500 through heaven fine, though then flickered out and eventually hanged on a black screen after I locked the computer, 1050/1350 seems perfectly fine though, not going to worry about overvolting yet as for one thing its too hot, have to wait till it gets closer to winter, but Ill likely be trying to get a water loop set up by then too.
> 
> the amount of air that comes out the back of these on full-tilt is kinda scary, like a vacuum cleaner almost...


Congrats - added

If you could add a GPU-Z validation link with your OCN name showing to that original post for verification, that would be great. The pic looked like a 290X?

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/15650#post_21713338

As for the airflow out of the rear bracket, I'd agree. Pretty hot. I had mine almost up against a wall and it heated the wall up quite a bit. I felt compelled to move it further away. Now I'm waiting on the Lightning / Vapor-X or Toxic, and to see how Mantle plays out.


----------



## anubis1127

Quote:


> Originally Posted by *Paul17041993*
> 
> well you can officially count me in, I need a better camera... and desk...
> 
> as for clocks, she seemed to run 1100/1500 through heaven fine, though then flickered out and eventually hanged on a black screen after I locked the computer, 1050/1350 seems perfectly fine though, not going to worry about overvolting yet as for one thing its too hot, have to wait till it gets closer to winter, but Ill likely be trying to get a water loop set up by then too.
> 
> the amount of air that comes out the back of these on full-tilt is kinda scary, like a vacuum cleaner almost...


Congrats on the new GPU!


----------



## BradleyW

Quote:


> Originally Posted by *Blameless*
> 
> A significant portion of 290/290X cards with Elpida memory are not stable at stock speeds, and the majority of reference cooled samples will eventually get black screen crashes with any memory OC, especially in conjunction with core OCs.


What do you mean by significant portion? What numbers are we talking here?


----------



## kizwan

With Mantle enabled I can load & play BF4 with Crossfire enabled. However, when I disable Crossfire to play on only one GPU, it gets stuck at "LOGGING IN" & I can see the BF4.exe executable stop loading when memory usage reaches "41X,XXX K" (in Task Manager, column "Memory (Private Working Set)").


----------



## phallacy

Put both my 290Xs and CPU in a loop. Used the XSPC Photon res with a 360 and a 280 rad and the XSPC Razor blocks. I was able to take both my cards from 1100/1500 to 1200/1625 with the same +75 mV and 50 power limit. Tried a push to 1300 but Heaven wouldn't start and I had to reset. It could run at 1250 but there were lots of artifacts. I didn't go over +100 mV though, so I might try again later.

I still haven't downloaded the new Mantle beta drivers. It seems Crossfire is having a lot of problems because it's not optimized yet. Anybody think I should hold off until the WHQL ones arrive?


----------



## Thorteris

My R9 290X finally came in, and it seems like Gigabyte just trolled me.....
I won a reference R9 290X, mind you, but it came in a Windforce box so I got all happy. I then opened the box to find a reference R9 290X.

I got it for free and was planning to add a Kraken G10 to it anyway, but hey.


----------



## lol.69

Hi people,
Just bought the R9 290X today (Asus DC2).

Installed the latest Catalyst beta (14) and in BF3 I have stutter with vsync on (MP). Anyone experienced this?

Also, I am planning to mess with overclocking... Is the max safe voltage for the core 1.3? Do you add any voltage to the memory?

I will use FurMark to test stability... Is FurMark a good tool for testing stability?

Sorry for my English...


----------



## Krusher33

Quote:


> Originally Posted by *Thorteris*
> 
> My r9-290x finally came in. And it Seems like Gigabyte just trolled me.....
> I won a reference r9-290x mind you but it came in a windforce box so I got all happy
> 
> 
> 
> 
> 
> 
> 
> I then open the box to find a reference r9-290x. I got it for free and was planning to add a kraken g10 to it anyways but hey.


Ha ha ha, that's funny.


----------



## Paul17041993

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you could add a GPU-Z validation link with OCN name showing would be great on that original post for verification. The pic looked like a 290X?
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/15650#post_21713338


Oh whoops, forgot to mention: yes, it's a 290X. Haven't had much time to fiddle with it, but I'll run GPU-Z this arvo if I don't forget...


----------



## ImJJames

Quote:


> Originally Posted by *lol.69*
> 
> Hi people ,
> just bought today the r9-290x (Asus dc2) .
> 
> Instaled the last version of catalyst Beta (14) and in BF3 i have stutter with vsync on (MP) . Anyone experienced this?
> 
> Also I am planning to mess with overclock... Max safe voltage for core is 1.3? Do you add any volatge to memory?
> 
> I will use Furmark to test stability .... Is furmark a good tool to test stability?
> 
> Sorry for my english...


Don't use FurMark; just play games to find stability. 1.3 volts is very safe. No need to add voltage to the memory. Don't use Vsync.


----------



## lol.69

Thanks for the response. I turned Vsync off.. For me 60 is what I need (and at 60 fps the card runs so cool... XD). Maybe the next beta can fix this..
Anyone with an Asus DC2 to tell me an average OC?


----------



## ImJJames

Quote:


> Originally Posted by *lol.69*
> 
> Thanks for response . I turned Vsync off .. For me 60 is what i need ( and at 60 fps the card runs so cold ...XD). MAybe the next beta can fix this..
> Anyone with a Asus DC2 to tell me average OC ?


I never understood people who use v-sync. If you're going to spend big money on a GPU, you want it working as hard as possible, producing all the FPS it can. lol

But at the same time, I run a 144Hz monitor


----------



## lol.69

Quote:


> Originally Posted by *ImJJames*
> 
> I never understood people who use v-sync. For me if you're going to spend big money on a GPU, I want it working as hard as possible producing all the FPS it can. lol
> 
> But at the same time I run @ 144hz monitor


hehe.

Well, my monitor is 60Hz. For me, if the card can run the game at max settings and at the monitor's Hz, everything's good.


----------



## bond32

Quote:


> Originally Posted by *Ukkooh*
> 
> Thank you for trying it out! I'm glad that I was lucky enough to get hynix memory so I guess I won't have memory related problems. I don't feel like creating a custom fan profile though so I'll try it out next weekend when I'll finally put my card under water.


On the topic of the Tri-X BIOS, according to this: http://www.techpowerup.com/vgabios/151536/sapphire-r9290x-4096-131212.html it has Elpida memory support too.

Honestly though (someone with good BIOS knowledge please answer this), is there any difference between BIOSes and manufacturers for reference-design cards? If not, my guess is the Tri-X BIOS is identical to reference except slightly OC'ed (since it uses a reference-design card).


----------



## Sazz

Quote:


> Originally Posted by *bond32*
> 
> On the topic of the tri-X bios, according to this: http://www.techpowerup.com/vgabios/151536/sapphire-r9290x-4096-131212.html it has Elpida memory capability too.
> 
> Honestly though, and someone with good bios knowledge please answer this, is there any difference between bios' and manufactures for reference design cards? If no, my guess is the tri-X bios is identical to reference except slightly OC'ed (since it uses a reference design card).


Speaking of which, I've been testing the Tri-X BIOS and I finally got a black screen at 1500MHz after 4hrs of BF4 play. But prior to flashing, at 1500MHz I would black-screen w/in minutes of playing BF4. Slowly downclocking my memory (currently testing 1480MHz) to see what my max memory clock is at stock voltage. Although when I increased my core voltage to +50mV (what I use when I run an 1150MHz core clock), I didn't get any black screen at a 1500MHz memory clock. When I got the black screen I was using stock voltage w/ a core clock of 1100MHz (memory at 1500MHz, of course).

I will test further later at 1500MHz at +50mV to see if I get any black screens w/ a bump on core voltage.


----------



## bond32

Quote:


> Originally Posted by *Sazz*
> 
> Speaking of which, I've been testing Tri-X BIOS and I finally got a black screen at 1500Mhz after 4hrs of BF4 play. But prior to flashing at 1500Mhz I would get black screen on BF4 w/ in minutes of playing BF4. slowly downlclocking my memory clocks (currently testing 1480Mhz) to see what's my max memory clock at stock voltage. altho when I increased my core voltage to +50mV (what I use when I have 1150Mhz core clock) at 1500Mhz memory clock i didn't get any black screen. When I got the black screen I was using stock voltage w/ core clocks of 1100Mhz (memory is at 1500Mhz of course).
> 
> I will further test it later at 1500mhz at +50mV see if I get any black screens w/ a bump on core votlage.


Awesome, thanks for the info. Mine won't even open chrome at 1500 memory and stock voltage. Maybe I should try the triX too...


----------



## Krusher33

Quote:


> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ukkooh*
> 
> Thank you for trying it out! I'm glad that I was lucky enough to get hynix memory so I guess I won't have memory related problems. I don't feel like creating a custom fan profile though so I'll try it out next weekend when I'll finally put my card under water.
> 
> 
> 
> On the topic of the tri-X bios, according to this: http://www.techpowerup.com/vgabios/151536/sapphire-r9290x-4096-131212.html it has Elpida memory capability too.
> 
> Honestly though, and someone with good bios knowledge please answer this, is there any difference between bios' and manufactures for reference design cards? If no, my guess is the tri-X bios is identical to reference except slightly OC'ed (since it uses a reference design card).

Stilt over at the litecointalk forum has been modifying BIOSes for miners there. He found that some BIOSes were inefficient in how they handle the memory and VRMs. His modifications not only improved hashing rates but lowered VRM temps in a lot of cases.

His threads have recently been locked though, and he's no longer taking PMs for some reason.

But yeah... there are differences between the BIOSes of each card, even from the same line.


----------



## bond32

Quote:


> Originally Posted by *Krusher33*
> 
> Stilt over at bitcointalk forum has been modifying bios's for miners there. He found that some bios was inefficient in regards to how it handles memory and VRM's. His modifications not only improved hashing rates, but lowered VRM temps in a lot of cases.
> 
> His threads have recently locked up though and he's no longer taking pm's for some reason.
> 
> But yeah... there are differences between the bios of each card even from the same line.


Awesome, +rep.

I took a look over there a few times, I didn't know he released anything.


----------



## Krusher33

Quote:


> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Krusher33*
> 
> Stilt over at bitcointalk forum has been modifying bios's for miners there. He found that some bios was inefficient in regards to how it handles memory and VRM's. His modifications not only improved hashing rates, but lowered VRM temps in a lot of cases.
> 
> His threads have recently locked up though and he's no longer taking pm's for some reason.
> 
> But yeah... there are differences between the bios of each card even from the same line.
> 
> 
> 
> Awesome, +rep.
> 
> I took a look over there a few times, I didn't know he released anything.

This was the 280X thread: https://litecointalk.org/index.php?topic=12369.0

There was no blanket BIOS for all cards. People uploaded their vanilla BIOS, he reviewed it, and if it was terrible he modified it and sent it back.


----------



## Gilles3000

I'm looking into getting a ref. R9 290; atm I'm choosing between the Gigabyte and the Sapphire. Is there a higher chance of getting Hynix memory with either brand, or does it just depend on the batch?


----------



## Jack Mac

Probably depends on the batch.


----------



## Paul17041993

Quote:


> Originally Posted by *ImJJames*
> 
> I never understood people who use v-sync. For me if you're going to spend big money on a GPU, I want it working as hard as possible producing all the FPS it can. lol
> 
> But at the same time I run @ 144hz monitor


I generally have vsync on for the most part so my hardware isn't doing more work and producing extra heat that isn't needed. During winter I don't exactly care and may turn it off. Games like TF2 (or anything Source-based, for that matter) I let run at 120-300 FPS, as input polling suffers horribly with vsync on.


----------



## cam51037

I turn V-sync on with my 290 mostly to get the fan to slow down a little and to cut down on heat and power consumption. Mind you, with V-sync on in BF4 it only drops from 75 FPS to 60 FPS on high, so it's not a huge drop.


----------



## Thorteris

Quote:


> Originally Posted by *Gilles3000*
> 
> I'm looking into getting a ref. r9 290, atm i'm choosing between the gigabyte and the sapphire, is there a higher chance of getting hynix memory with either of the brands or does it just depend on the batch?


I just got my Gigabyte reference r9-290x it has hynix.


----------



## Thorteris

Sign me up

Stock cooling for now.
http://www.techpowerup.com/gpuz/69s79/


----------



## SkyNetSTI

What's up guys. Thinking of joining the club by flipping my reference MSI GTX 780 for an R9 290X (Sapphire 21226-00-53G). Any comments about that exact GPU? I got a deal on it.


----------



## sugarhell

I never use vsync. Use the dynamic framerate cap from RadeonPro if you have a problem with the heat.


----------



## jamaican voodoo

I'm so ready to put this build together, just waiting for those cards and a few other parts to arrive.
My lonely waterblocks and backplates are waiting

The rig is a bit messy right now!!! Going to replace those Delta fans, I can't stand the noise anymore


----------



## lol.69

What is the max safe temperature on the VRMs? GPU-Z reported 100°C (!) on VRM1!! The core never passes 70°C, but the VRM temperature is scary.
This was while doing a stress test with FurMark; only 5 minutes.
My card is the Asus 290X DC2 version.


----------



## Mr357

Quote:


> Originally Posted by *lol.69*
> 
> What is the max safe temperature on Vrm's ? Gpu-z reported 100ºc (!) on VRM1 !! The core never passes 70ºc but the vrm temperature is scary.
> This was while doing a stress test with Furmark ; only 5 minutes.
> My card is asus 290x Dc2 version


AMD says 125C. Whether you trust that or not is up to you.


----------



## sugarhell

Quote:


> Originally Posted by *Mr357*
> 
> AMD says 125C. Whether you trust that or not is up to you.


Asus custom cards use rebranded stuff, so I don't know what they use. AMD suggests 125°C for their VRMs (I think they use Volterra). Asus calls theirs DIGI+, but 100°C on a custom cooler is really bad. It's not dangerous, but it kills any OC potential, and I wouldn't like to run my VRMs at 100°C every time I play.


----------



## Tobiman

I just picked up a PowerColor 290 with the triple cooler for $560 from NCIX.US. Better hurry while stock lasts.


----------



## cam51037

Quote:


> Originally Posted by *Tobiman*
> 
> I just picked up a powercolor 290 with the triple cooler for $560 from NCIX.US Better hurry while stock lasts.


Jeez $560 doesn't really seem like a deal, but I guess it's in stock.

Just for reference, I managed to grab a Sapphire 290 Tri-X new for $480 shipped, which is waaay closer to the MSRP of these cards.


----------



## Tobiman

Quote:


> Originally Posted by *cam51037*
> 
> Jeez $560 doesn't really seem like a deal, but I guess it's in stock.
> 
> Just for reference, I managed to grab a Sapphire 290 Tri-X new for $480 shipped, which is waaay closer to the MSRP of these cards.


Where at?


----------



## taem

Quote:


> Originally Posted by *Tobiman*
> 
> I just picked up a powercolor 290 with the triple cooler for $560 from NCIX.US Better hurry while stock lasts.


What are the dimensions? Is it actually 270mm x 115mm x 38mm? That seems small but that's what the web site says.

I want to order this right now but I know nothing about this card, no reviews. How are temps?


----------



## cam51037

Quote:


> Originally Posted by *Tobiman*
> 
> Where at?


I purchased it from WTCR.ca a couple weeks back.


----------



## Paul17041993

Better than the 630 AUD I paid for mine.


----------



## Tobiman

Quote:


> Originally Posted by *cam51037*
> 
> I purchased it from WTCR.ca a couple weeks back.


Is that a website or a store?


----------



## anubis1127

Quote:


> Originally Posted by *Tobiman*
> 
> Is that a website or a store?


Sounds like a radio station. Coming to you live from the WTCR broadcast booth..


----------



## cam51037

Quote:


> Originally Posted by *Tobiman*
> 
> Is that a website or a store?


It's a store. Computer parts for bitcoins, but it's been down the past few hours. Hopefully it'll be back by tomorrow morning.

At the time of order, I paid $480 in bitcoins, which is equal to around $488 now.

lol @anubis1127 WTCR stands for Winnipeg Technology and Computer Renewal.


----------



## DraXxus1549

Anyone getting artifacts on the desktop with the latest beta drivers?

I was stable at 1100/1400 forever on the previous drivers, now I can't even run the stock clocks without artifacts.


----------



## Krusher33

I can vouch for WTCR. The tricky part was that I had originally ordered 3x 290s and the page didn't say they were out of stock. It wasn't till I emailed them about it that they let me know it was actually on back order and they weren't sure when they'd get more. They were nice about it via email and we settled on a 290X + 2x 280X + a refund. They shipped pretty quickly and I got it fast considering they're in Canada and I'm in Tennessee.


----------



## Arizonian

Quote:


> Originally Posted by *jamaican voodoo*
> 
> i got 2 more 290's on the way from the egg man can i say expensive, gosh i wish they were 290x for the price i paid lol
> bitcoin go to hell already


Congrats - Updated








Quote:


> Originally Posted by *Thorteris*
> 
> Sign me up
> 
> 
> 
> 
> 
> 
> 
> Stock cooling for now.
> http://www.techpowerup.com/gpuz/69s79/


Congrats - welcome aboard - added









I didn't see manufacturer so by default it's Sapphire. If it's any different let me know via PM I'll be happy to update.


----------



## ottoore

Quote:


> Originally Posted by *bloodkil93*
> 
> Just to quote you on that, VRM1 is fitted with a large heatsink that I purchased in the Alpenfohn heatsink pack, I can give you a screenshot if you would like?


Yes please.
Take also a pic of the modded card.


----------



## ottoore

Quote:


> Originally Posted by *lol.69*
> 
> What is the max safe temperature on Vrm's ? Gpu-z reported 100ºc (!) on VRM1 !! The core never passes 70ºc but the vrm temperature is scary.
> This was while doing a stress test with Furmark ; only 5 minutes.
> My card is asus 290x Dc2 version


Stop using FurMark. It can damage the card and it doesn't test real-world stability. Use Valley, Heaven, 3DMark, or a long gaming session.


----------



## Matt-Matt

Quote:


> Originally Posted by *Paul17041993*
> 
> better then the 630AUD I paid for mine.


Wow, that's actually pretty good for Australia.. My 290 was $499 @ PCCG.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Matt-Matt*
> 
> Wow, that's actually pretty good for Australia.. My 290 was $499 @ PCCG.


The Sapphire Ref 290x was $599 at PCCG on sale, They are all sold out now though.

XFX ref cards are $649 though


----------



## Paul17041993

Spoiler: post



Quote:


> Originally Posted by *Paul17041993*
> 
> 
> 
> well you can officially count me in, I need a better camera... and desk...
> (reference sapphire 290X)
> 
> as for clocks, she seemed to run 1100/1500 through heaven fine, though then flickered out and eventually hanged on a black screen after I locked the computer, 1050/1350 seems perfectly fine though, not going to worry about overvolting yet as for one thing its too hot, have to wait till it gets closer to winter, but Ill likely be trying to get a water loop set up by then too.
> 
> the amount of air that comes out the back of these on full-tilt is kinda scary, like a vacuum cleaner almost...
> 
> edit; here's the GPU-Z link;
> http://www.techpowerup.com/gpuz/wr32g/






ok updated it with the GPU-Z link, also seems I have Hynix ICs.


----------



## Abyssic

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The Sapphire Ref 290x was $599 at PCCG on sale, They are all sold out now though.
> 
> XFX ref cards are $649 though


if i convert euro into dollar, i paid 700$ here in germany for my 290X Tri-X...









a good 780ti would cost 840$ btw xD


----------



## Sgt Bilko

Quote:


> Originally Posted by *Abyssic*
> 
> if i convert euro into dollar, i paid 700$ here in germany for my 290X Tri-X...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> a good 780ti would cost 840$ btw xD


I can beat that, EVGA GeForce GTX 780 Ti Superclocked ACX 3GB = $909

yeah.........and they are sold out as well


----------



## lol.69

Quote:


> Originally Posted by *sugarhell*
> 
> Asus custom cards use rebrand stuffs so i dont know what they use. AMD suggest 125C for their vrm (i think they use volterra).Asus call them DIGI+ but 100C for a custom cooler is really bad. Its not dangerous but it kills any oc potential and i wouldnt like to run my vrms 100C everytime that i play


While gaming for about an hour (BF3 MP), VRM1 reached 84°C at stock clocks.

I wonder now why most website reviews don't mention VRM temps.

I mean, I paid 585€ for a card that can't cool itself properly...

If I replace the cooler I lose my warranty, right?


----------



## Abyssic

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I can beat that, EVGA GeForce GTX 780 Ti Superclocked ACX 3GB = $909
> 
> yeah.........and they are sold out as well


i'm so happy right now ^^ i just looked at my e-tailer's website and found out they managed to locate a card that another customer canceled, and they reserved it for me. The price I paid was 520€. And now the availability date for other customers has slipped a few days, and they have to pay 600€ xD

i may be getting my 290x this week







FINALLY! i'm stuck with a 9600GT right now... what a torture


----------



## The Mac

Quote:


> Originally Posted by *lol.69*
> 
> While Gaming , about one hour ( bf3 MP) the vrm1 reached 84 ºc at stock clocks.
> 
> I wonder now why most websites reviews do not mentioned Vrm temp.
> 
> I mean I paid 585€ for a card that cant cool properly...
> 
> If i replace the cooler I loose my warranty right?


84C is perfectly fine.

They are rated to 125C.

90C and over is common on AIB coolers.


----------



## Sazz

Quote:


> Originally Posted by *The Mac*
> 
> 84c is perfectly fine.
> 
> They are rated to 125c,
> 
> 90c and over is common on AIB coolers.


Yeah, they are rated to run at higher temps, but they lose efficiency and shorten their life cycle if run close to max temp for too long. Personally I don't like my VRM1 going over 80C, so when I'm using the reference cooler I use my own fan curve, which maxes out at 72% fan speed. GPU core maxes out at 82C while VRM1 maxes out at 61C at stock voltage with an OC of 1100/1350.


----------



## BradleyW

I seem to get strange GPU usage when using an fps limiter. Anyone know a fix?



When card 1 drops, card 2 shoots up. Cycle repeats.


----------



## Asrock Extreme7

mining nut coin goina be big£££££ get yours well its easy


----------



## devilhead

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> mining nut coin goina be big£££££ get yours well its easy


hey man, decide for yourself: before you said alien coin would be da best, now it's nut coin







 go to mining thread


----------



## kizwan

Looks like my 2nd 290 (Hynix) is dying. Playing BF4 in Crossfire & overclocked with the 14.1 beta drivers, when BF4 crashed/BSOD the 2nd GPU disappeared: gone from Device Manager, and GPU-Z doesn't detect it either. However, if I shut down the computer, turn off the PSU & disconnect both PCIe cables, then turn the computer on 5 minutes later (with all cables re-connected), the 2nd GPU is detected again.

Looks like I need to RMA the card.


----------



## Asrock Extreme7

Quote:


> Originally Posted by *devilhead*
> 
> hey man, so decide for your self, before said that alien coin will be da best, now nut coin
> 
> 
> 
> 
> 
> 
> 
> go to mining thread


I know im mining both:thumb:


----------



## Arizonian

Quote:


> Originally Posted by *Asrock Extreme7*
> 
> mining nut coin goina be big£££££ get yours well its easy
> 
> 
> 
> Spoiler: Warning: Spoiler!


Quote:


> Originally Posted by *Asrock Extreme7*
> 
> I know im mining both:thumb:


Best place for mining discussion should be posted in the *Distributed Computing* section.


----------



## IceAero

Anyone getting desktop artifacts using the new 14.1 drivers?


----------



## kcuestag

Quote:


> Originally Posted by *BradleyW*
> 
> I seem to get strange GPU usage when using an fps limiter. Anyone know a fix?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> When card 1 drops, card 2 shoots up. Cycle repeats.


Is the performance okay when that happens? If so, ignore it. It's been said countless times that GPU Usage is reported wrong due to an AMD Overdrive 5/6 API bug that needs to be fixed via drivers.


----------



## Paul17041993

Quote:


> Originally Posted by *lol.69*
> 
> While Gaming , about one hour ( bf3 MP) the vrm1 reached 84 ºc at stock clocks.
> 
> I wonder now why most websites reviews do not mentioned Vrm temp.
> 
> I mean I paid 585€ for a card that cant cool properly...
> 
> If i replace the cooler I loose my warranty right?


VRMs being, well, basic VRMs, they don't mind high temps; 90C should be fine even for long periods of use.

It's only if you let them get up to 120C that you can warp them off the PCB and make your card behave very poorly, if not kill it.


----------



## kizwan

I found out that the second GPU's (Crossfire) core clock only fluctuates when overclocked (14.1 beta 1.6 driver). Running at stock clocks, both 290s' core & memory clocks run at max all the time while playing BF4. This was with Mantle.



With Mantle, CPU usage also dropped which help boost FPS in BF4.



Spoiler: For reference, this is BF4 CPU usage with DirectX (driver 13.12 WHQL)









Spoiler: GPU Core/VRM1/VRM2 Temperature



*Ambient 28.5C*

*GPU1*
Core: 55C
VRM1: 55C
VRM2: 49C

*GPU2*
Core: 56C
VRM1: 57C
VRM2: 51C



I've already recorded frame times playing BF4 with Mantle, but I can't post them here yet because I don't have DirectX data for comparison. I'll play BF4 with DirectX & get the frame time data for comparison later.
Quote:


> Originally Posted by *IceAero*
> 
> Anyone getting desktop artifacts using the new 14.1 drivers?


Happens over here too, on the Windows taskbar & in Chrome. I just re-installed the driver (uninstall & re-install); no artifacts so far.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Paul17041993*
> 
> VRMs being, well, basic VRMs they don't mind high temps, 90C should be fine even for long use periods.
> 
> its only if you let them go up to 120C you can warp them off the PCB and make your card behave very poorly if not kill it.


High VRM temps (>80c) can negatively affect overclocks. They can generally reduce stability, and make the card draw more power because the VRM's are much less efficient when hot. For example, my last PC drew about 410w under full GPU load when cold; when fully warmed up (core ~ 60c, VRM 1 ~ 80c) the PC would draw about 430w.
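The power numbers quoted imply the hot-VRM penalty directly. Rough arithmetic, assuming nothing else in the system changed between the two wall readings:

```python
# Treat the extra wall draw when warmed up as VRM conversion loss.
cold_draw = 410.0   # watts at the wall, card cold
warm_draw = 430.0   # watts at the wall, core ~60C, VRM1 ~80C
extra_loss = warm_draw - cold_draw
loss_fraction = extra_loss / cold_draw
print(f"{extra_loss:.0f} W extra draw when hot ({loss_fraction:.1%})")
```

Roughly a 5% penalty at the wall just from the VRMs warming up, which is why cooler VRMs both save power and leave more headroom.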


----------



## taem

Anyone with a Powercolor PCS+ 290? Please share your take on this card.


----------



## kpoeticg

Quote:


> Originally Posted by *taem*
> 
> Anyone with a Powercolor PCS+ 290? Please share your take on this card.


I had a PowerColor 290X that I just sent back to the Egg for RMA today. It arrived DOA: it wouldn't output video from any of the ports even though the fan would spin and the voltages checked out on a multimeter.

Not much else I can say about it till I get my replacement.


----------



## DampMonkey

Here's my 290X Crossfire frame pacing as of the 14.1 driver... not exactly optimal:
4770K @ 4.5GHz
Gigabyte Z87 UD5H
840 EVO SSD for OS and games
16GB DDR3 1866
stock GPU clocks, watercooled, maxing at 44°C

Did a DDU clean in safe mode before these drivers

*1080p single card*


*1080p crossfire*


*1440p single card*


*1440p crossfire*


----------



## Paul17041993

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> High VRM temps (>80c) can negatively affect overclocks. They can generally reduce stability, and make the card draw more power because the VRM's are much less efficient when hot. For example, my last PC drew about 410w under full GPU load when cold; when fully warmed up (core ~ 60c, VRM 1 ~ 80c) the PC would draw about 430w.


Well, generally yes; in overclocking you want everything as cool as possible. VRM efficiency doesn't change as much as the core's, but keeping them very cool (under water or sub-zero) still improves OC stability through better voltage filtering.

That said, you don't want your chokes too cold either, or they stop regulating properly and you'll overvolt uncontrollably. That's only really a problem with liquid nitrogen and helium, though.


----------



## IBIubbleTea

When I apply my memory clock I see weird lines across my screen for like a second, Is this normal?


----------



## Sazz

Quote:


> Originally Posted by *IBIubbleTea*
> 
> When I apply my memory clock I see weird lines across my screen for like a second, Is this normal?


I only get that when the memory clock I entered is unstable. When using Sapphire TriXX, the display shuts off and on quickly when applying any overclocks.


----------



## rdr09

Quote:


> Originally Posted by *IBIubbleTea*
> 
> When I apply my memory clock I see weird lines across my screen for like a second, Is this normal?


Only when you apply; then it's normal. It happened on mine too when I was benching. I play at stock clocks.


----------



## Jack Mac

Quote:


> Originally Posted by *IBIubbleTea*
> 
> When I apply my memory clock I see weird lines across my screen for like a second, Is this normal?


Yes, it happens with me as well, I wouldn't worry about it as long as it only happens when you apply the change.


----------



## devilhead

Quote:


> Originally Posted by *IBIubbleTea*
> 
> When I apply my memory clock I see weird lines across my screen for like a second, Is this normal?


thats a sign, that you need to bump voltage a bit


----------



## BradleyW

Quote:


> Originally Posted by *Jack Mac*
> 
> Yes, it happens with me as well, I wouldn't worry about it as long as it only happens when you apply the change.


That's ok, but if you see the lines after the initial bump, you must reduce the MHz.


----------



## IBIubbleTea

Quote:


> Originally Posted by *devilhead*
> 
> thats a sign, that you need to bump voltage a bit


But it's only when I apply it.... I did some testing and It looks fine to me.


----------



## Jack Mac

Quote:


> Originally Posted by *IBIubbleTea*
> 
> But it's only when I apply it.... I did some testing and It looks fine to me.


You're fine, you should only start to back off the OC if you crash or see artifacts when under load.


----------



## Thorteris

Do asics really matter with Hawaii GPUs? Mines at 74.8% and I wonder how that compares.


----------



## Jack Mac

Quote:


> Originally Posted by *Thorteris*
> 
> Do asics really matter with Hawaii GPUs? Mines at 74.8% and I wonder how that compares.


I'd say it's a decent place to start for estimating your OCs: my 79.8% ASIC card does 1250MHz at +175mV, but my 24/7 clocks are 1175/1450 at +70mV, which is around 1.212-1.22V under load.


----------



## The Mac

Quote:


> Originally Posted by *Thorteris*
> 
> Do asics really matter with Hawaii GPUs? Mines at 74.8% and I wonder how that compares.


Sure they do.

The higher the ASIC, the less leakage and the more efficient the core, which translates into more OC headroom on air.

The lower the ASIC, the more leakage and the less efficient the core, which translates into needing a lot more cooling (water) for decent OCs.

Now if you really mean "do OCs really matter with Hawaii GPUs", that's a different question.


----------



## Roy360

How are you guys cooling your cards with universal GPU blocks?

I already have this block and ram sinks:

https://www.dazmode.com/store/product/ek-vga-supremacy-bridge-edition-acetal/

and

http://www.ebay.ca/itm/370655913100?ssPageName=STRK:MEWNX:IT&_trksid=p3984.m1497.l2649
http://www.ebay.ca/itm/350705273019?ssPageName=STRK:MEWNX:IT&_trksid=p3984.m1497.l2649

not sure if that's enough though


----------



## Ukkooh

Quote:


> Originally Posted by *Thorteris*
> 
> Do asics really matter with Hawaii GPUs? Mines at 74.8% and I wonder how that compares.


Nope. Gpu-z actually doesn't read the asic on hawaii chips accurately.


----------



## INCREDIBLEHULK

GV-R929OC-4GD is voltage locked?


----------



## Paul17041993

OK, for the ASIC game: mine is 77.5%, and the voltage seems to be 1180mV (same as my older 7970 DCIIT, funnily enough).

Though is it normal for the clocks and power to jump around so much while warming up?



I would think they would stay more solid than that, though it could just be that the 768p render window isn't demanding enough to keep the card in full 3D mode...

EDIT: OK, never mind. I forgot Valley has the single-thread bottleneck; yeah, it's just not getting fed enough.


----------



## quakermaas

Quote:


> Originally Posted by *IBIubbleTea*
> 
> But it's only when I apply it.... I did some testing and It looks fine to me.


It's just the memory jumping frequency, it can happen even at normal clock speeds, like going from idle to normal memory speed. It is very normal to see this for a second when applying a different frequency.


----------



## Connolly

I've just upgraded to the Mantle drivers, and wow, what a difference in BF4. Beyond-wildest-dreams stuff... Now that I can max out my fps at 120+ in pretty much all conditions, I've started looking into a frame rate limiter. Should I use third-party software to limit the frames? If so, which? Should I have V-Sync on or off? And should I be using triple buffering? If there's anything else to add, feel free.

I know this isn't officially the correct place to ask, but I've read so many conflicting views on the subject that I'd love someone to give me some definitive information. I'm using an Eizo Foris FG2421 as my monitor, so I don't need any advice about Lightboost settings for motion clarity etc.

Edit: Also, what should I set the FPS limiter to? I've read anything between 118 and 121 being the sweet spot, bearing in mind that I'm using a 120Hz display.


----------



## quakermaas

Quote:


> Originally Posted by *Connolly*
> 
> I've just upgraded to the mantle drivers, and wow what a difference in BF4. Beyond wildest dreams stuff... Now that I can max out my fps at 120+ in pretty much all conditions I've started to look into playing with a frame rate limiter. Should I use some third party software to limit the frames? If so, which? Should I have V-Sync On/Off? And should I be using triple buffering? If there's anything else to add, feel free.
> 
> I know this isn't officially the correct place to ask this, but I've read so many conflicting views about the subject I'd love someone to be able to give me some definitive information. I'm using a Eizo Foris FG2421 as my monitor, so don't need any advice about lightboost settings for motion clarity etc.
> 
> Edit: Also, what should I be setting the FPS limiter to, I've read anything between 118 - 121 being the sweet spot, baring in mind that I'm using a 120hz display


In the console, or make a user config and put it in your BF4 folder:

*GameTime.MaxVariableFPS 65* (sets the framerate limit)

Experiment and find what feels good to you.
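For reference, the same command can live in a plain-text user.cfg placed next to the BF4 executable so it applies on every launch. A minimal sketch (the 118 cap is just an example for a 120Hz display, not a recommendation from this thread):

```
GameTime.MaxVariableFPS 118
```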


----------



## Connolly

Thanks for the reply, but I'm pretty sure that I don't want to set the maxvariablefps to 65 on a 120hz monitor.


----------



## quakermaas

Quote:


> Originally Posted by *Connolly*
> 
> Thanks for the reply, but I'm pretty sure that I don't want to set the maxvariablefps to 65 on a 120hz monitor.


of course, set it to what ever you want...it was just an example


----------



## phallacy

So I downloaded the Mantle drivers with a Crossfire config, and I'm really not noticing any FPS difference between Mantle and DX11. Star Swarm showed a jump from 38 fps on DX11 to 51 fps on Mantle, but I don't know how accurate that bench is. Originally, after updating, both BF4 and Star Swarm were crashing immediately when running with Mantle, but after disabling the Intel HD4000 it seems to be stable. Haven't noticed any frame stuttering or crashes during gameplay... yet.


----------



## Slomo4shO

Has anyone got their hands on a PowerColor PCS+ R9 290 or 290X? Curious to know about the noise and temps of these cards.


----------



## quakermaas

Quote:


> Originally Posted by *phallacy*
> 
> So I downloaded the mantle drivers with a crossfire config and I'm really not noticing any difference in FPS with mantle or dx11. Starswarm showed a difference from 38 fps on dx11 to 51 fps on mantle but I don't know how accurate that bench is. Originally after downloading both BF4 and starswarm were crashing immediately when running with mantle but after disabling the intel hd4000 it seems to be stable. Haven't noticed any frame stuttering or crashes during gameplay..yet.


Did you restart BF4 after switching between DX11/Mantle ?


----------



## phallacy

Quote:


> Originally Posted by *quakermaas*
> 
> Did you restart BF4 after switching between DX11/Mantle ?


Yeah, I restarted the game and even rebooted my computer to make sure. This isn't an in-depth benchmark by any means: I loaded into the same server, same map, with DX11 and Mantle; with DX11 I average around 95-100 fps, and with Mantle I was noticing about the same after 15 minutes of gameplay. If you have any recommendations I'm all ears. : ) Running 64-bit BF4 so I can't use AB to monitor (at least not since getting the new driver); I'm relying on the PerfOverlay feature.

I use these graphics options:

1440p 120hz
Ultra all
field of view 89
Res Scale 135%
MSAA 2x
HBAO


----------



## battleaxe

I just ordered an ASUS 290x DCII card. Should have it within 3 weeks. This was about the only card I could get my hands on for a decent price. I really only wanted a 290 and reference would have been fine, but they were ridiculously priced.

What's the deal with the DCII? Are they good cards? Any chance for Hynix memory on the 290x?


----------



## VSG

As far as I know, the DCUII has Elpida memory and the cooler is not the best; they just took the 780 Ti cooler and modified it slightly.


----------



## battleaxe

Quote:


> Originally Posted by *geggeg*
> 
> As far as I know, the DCUII has Elpida memory and the cooler is not the best- they just took the 780Ti cooler and modified it slightly.


Great. Saying I just got taken? Crud...


----------



## battleaxe

Quote:


> Originally Posted by *battleaxe*
> 
> Great. Saying I just got taken? Crud...


I couldn't get a Tri-X for another 2 mos. Guess I'll have to try my luck. I'll sell it on FleaBay to miners if it sucks. LOL


----------



## BakerMan1971

Where are you located, battleaxe? I'm in the market for an MSI Gaming 290; I know the launch price was £330, so I'm waiting for it to drop back to that ballpark from £360 and to come back in stock. Stock comes through in dribs and drabs.
As for the Tri-X, there is some stock at Overclockers in the UK (or was when I checked earlier).


----------



## p5ych00n5

14.1 not supported by 3D Mark


----------



## VSG

What do you mean by "not supported"? Does it not run at all or does it just give you a message saying the driver was not verified because it is a beta driver?


----------



## battleaxe

Quote:


> Originally Posted by *BakerMan1971*
> 
> Where are you located battleaxe? I am in the market for an MSI Gaming 290, I know the launch price was £330 so I am waiting for it to drop to that ballpark from £360 also to come back in stock, stock comes through in dribs and drabs.
> As for the tri-x there is some stock at overclockers in the uk. (or was when I checked earlier)


I'm in the US. Bummer.


----------



## Ukkooh

Umm, is it normal that my reference card runs at 90°C at idle with the stock fan profile and clocks? I just randomly checked it. I guess this is the reason for my random idle crashes with a black screen and the fan running at 100%... Well, at least I can finally put it under water on Friday.

Edit: It only goes down to 50°C with the fan at 100%. Seems to be a really sloppy mount on the stock cooler.


----------



## battleaxe

Quote:


> Originally Posted by *Ukkooh*
> 
> Umm is it normal that my reference card runs at 90°C at idle while at stock fan profile and clocks? I just randomly checked it. I guess this is the reason for my random idle crashing with black screen and fan running at 100%... Well atleast I can finally put it under water on friday.
> 
> Edit: Only goes down to 50°C with the fan at 100%. Seems to be a real sloppy mount on the stock cooler.


No, that's not normal. My MSI did not do that before I put it on water.


----------



## Connolly

Is anyone getting flawless BF4 game play with the new mantle drivers? I'm loving the increase in av/max fps, but the sudden spikes that make you freeze for 0.2 of a second or so aren't really worth the gain. It only seems to happen a couple of times a round, with V-Sync on or off.


----------



## Loktar Ogar

Quote:


> Originally Posted by *Ukkooh*
> 
> Nope. Gpu-z actually doesn't read the asic on hawaii chips accurately.


Glad to hear this; I don't think the ASIC readings are accurate either... But it still feels good if you have a high % score.


----------



## The Mac

They aren't really meant to be accurate, just an indication of efficiency...


----------



## Krusher33

Quote:


> Originally Posted by *Connolly*
> 
> Is anyone getting flawless BF4 game play with the new mantle drivers? I'm loving the increase in av/max fps, but the sudden spikes that make you freeze for 0.2 of a second or so aren't really worth the gain. It only seems to happen a couple of times a round, with V-Sync on or off.


No, there are some other bugs with it too, like the sky being whiter in Paracel Storm, the fogginess in Flood Zone, and some others.

I get those half-second freezes once or twice per round.

Some of us in the BF4 thread have noticed that some maps are much brighter; Zavod for me is a lot brighter.

I did notice HUGE gains in FPS though.


----------



## quakermaas

Quote:


> Originally Posted by *Connolly*
> 
> Is anyone getting flawless BF4 game play with the new mantle drivers? I'm loving the increase in av/max fps, but the sudden spikes that make you freeze for 0.2 of a second or so aren't really worth the gain. It only seems to happen a couple of times a round, with V-Sync on or off.


Everybody is getting the spikes to some degree; it's a beta driver, but it looks very promising, I think.


----------



## Widde

Quote:


> Originally Posted by *quakermaas*
> 
> Everybody is getting the spikes to some degree, it's a beta driver, but looks very promising I think


Also getting random freezes, really often though :/ Can't really tell if Mantle has done anything for me yet, but it's still early beta ^^ But 200% resolution scale and 95-120 fps







Dips into the low 80s sometimes, but I only have a 60Hz monitor, so meh


----------



## phallacy

Quote:


> Originally Posted by *Widde*
> 
> Also getting random freezes, really often though :/ Cant really tell if mantle have done anything for me yet but it's still early beta ^^ But 200% resolution scale and 95-120 fps
> 
> 
> 
> 
> 
> 
> 
> Dips into low 80s sometimes but only have a 60hz monitor so meeh


Really?! What resolution do you normally play at? I'm at 1440p, and with 135% res scale I can do 100 fps average with Crossfire 290Xs. I tried bumping it to 200% and my fps dropped to like 5; getting back into the options menu to turn it back down was a huge, choppy hassle. My computer specs are very similar to yours as well.


----------



## Ukkooh

Quote:


> Originally Posted by *The Mac*
> 
> They arent really meant to be accurate, just an indication of their efficiency...


I meant not accurate *at all*.
http://forum.beyond3d.com/showpost.php?p=1808073&postcount=2130


----------



## Widde

Quote:


> Originally Posted by *phallacy*
> 
> Really?! What resolution do you normally play at? I'm at 1440p and with 135% res scale I can do 100 fps avg with crossfire 290x. I tried bumping it to 200% and my fps dropped to like 5. Getting back into the options menu was a huge hassle and choppy just to turn it back down. My comp specs are very similar to yours as well


1080p ^_^ Turned off AA though as I think the jaggies are pretty much destroyed with 200% scaling


----------



## the9quad

Quote:


> Originally Posted by *phallacy*
> 
> Yeah, I restarted the game and even rebooted my computer to make sure. This is not an in depth benchmark though by any means. I loaded into the same server same map with dx11 and mantle and with dx11 I average around 95-100 fps and mantle I was noticing about the same after 15 minutes of gameplay. If you have any recommendations i'm all ears. : ) Running 64bit BF4 so I can't use AB to monitor (at least not since getting the new driver), relying on the PerfOverlay feature
> 
> I use these graphic options:
> 
> 1440p 120hz
> Ultra all
> field of view 89
> Res Scale 135%
> MSAA 2x
> HBAO


Are you running four 290Xs? I don't get that with three 290Xs @ 135% of 1440p on 64-player maps, cards at stock with a 4930K @ 4.4GHz and DDR3-2400. With Mantle, yes, but not DX11.2.


----------



## phallacy

Quote:


> Originally Posted by *Widde*
> 
> 1080p ^_^ Turned off AA though as I think the jaggies are pretty much destroyed with 200% scaling


OK, gotcha, lol. That's equivalent to 4K downsampled back to 1080p, I believe, which should equate to a 150% res scale if I'm at 1440p. I'll try turning off the AA and testing at 150%. I was like, whoa! You have some good cards.
Quote:


> Originally Posted by *the9quad*
> 
> Are you running 4 290x's? I don't get that with 3 290x's @135% of 1440p on 64 player maps. Cards at stock with a 4930k @4.4ghz and ddr3 2400mhz. With mantle yea, but not dx11.2


No, just 2 290Xs overclocked at 1200/1625. And yes, DX11.2, averaging 100 fps on the conquest large servers. It will dip to 75-80 when I'm caught in a big firefight or outside, but inside, walking/running and shooting, it stays at about 100.
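The resolution-scale equivalence is easy to check: render resolution is just display resolution times the scale factor, per axis. A quick sketch:

```python
# Render resolution = display resolution * resolution scale (per axis).
def render_res(width, height, scale):
    return round(width * scale), round(height * scale)

print(render_res(1920, 1080, 2.0))   # 200% of 1080p -> (3840, 2160), i.e. 4K
print(render_res(2560, 1440, 1.5))   # 150% of 1440p -> (3840, 2160), same pixel count
```

So 200% at 1080p and 150% at 1440p both render the same 3840x2160 internally, which is why the GPU load is comparable.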


----------



## kizwan

Quote:


> Originally Posted by *Widde*
> 
> 1080p ^_^ Turned off AA though as I think the jaggies are pretty much destroyed with 200% scaling


Ultra or High?


----------



## the9quad

Quote:


> Originally Posted by *phallacy*
> 
> Ok gotcha lol. That's equivalent to 4k which is downsampled back at 1080p I believe. That should equate to a 150% res scale if i'm at 1440p. Will try turning off the AA and see at 150%. I was like whoa! you have some good cards
> No just 2 290x that are overclocked at 1200/1625. And yes dx 11.2 100fps averaging on the conquest large servers. It will dip to 75-80 when I'm caught in a big firefight or outside but inside and walking/running shooting will stay at about 100


Can you run Paracel Storm 64 player for five minutes with the new logging feature and post your average then? That is the true test. Also, with the logging feature you can open up the csv file and get a true avg fps. I bet you'll be surprised at what you're averaging then.
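For anyone pulling a "true average" out of the log: the CSV layout below is an assumption (one frame time in milliseconds per row — adjust the column to whatever the PerfOverlay logger actually writes), but the point stands that the honest average is total frames divided by total seconds, not the mean of per-frame FPS readings:

```python
import csv

def average_fps(csv_path, column=0):
    """Average FPS from a frame-time log.

    Assumes one frame time in milliseconds per row (column index adjustable).
    Averaging per-frame FPS values would overweight the fast frames; dividing
    total frames by total elapsed time gives the figure you actually played at.
    """
    with open(csv_path, newline="") as f:
        times_ms = [float(row[column]) for row in csv.reader(f) if row]
    return len(times_ms) / (sum(times_ms) / 1000.0)
```

E.g. four frames at 10, 20, 10 and 20 ms total 60 ms, so the average is 4 / 0.06 ≈ 66.7 fps, even though the mean of the per-frame FPS values would claim 75.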


----------



## Widde

Quote:


> Originally Posted by *kizwan*
> 
> Ultra or High?


Ultra, Post processing and regular AA turned off, Found them to just make the image more blurry with 200% res scale







Cards at 1065/1400 atm, had to lower the core to be able to touch the memory


----------



## kizwan

Quote:


> Originally Posted by *Widde*
> 
> Ultra, Post processing and regular AA turned off, Found them to just make the image more blurry with 200% res scale
> 
> Cards at 1065/1400 atm, had to lower the core to be able to touch the memory


Is this with Mantle or DirectX? I tried 1150/1400 but BF4 tends to crash (complete system crash) with 14.1 beta (regardless of Mantle or DirectX). It should be stable; I don't know why it isn't. Currently at stock clock with Mantle. Stable so far.


----------



## phallacy

Quote:


> Originally Posted by *the9quad*
> 
> Can you run Paracel storm 64 player for five minutes with the new logging feature and post your average than? That is the true test. Also with the logging feature you can open up the csv file and get a true avg fps. I bet you'll be surprised at what your averaging than.


Sure, I will send you a PM later this week with what I get. I'm redoing my loop anyways over the weekend and moving to a 4770k platform but will run it before I switch it out. If you say that 3 290xs are struggling to get that, I believe you, my anecdotal evidence may not be accurate like you say.


----------



## Widde

Quote:


> Originally Posted by *kizwan*
> 
> Is this with Mantle or DirectX? I tried 1150/1400 but BF4 tends to crash with 14.1 beta. It should be stable; I don't know why it isn't. Currently at stock clock with Mantle. Stable so far.


It is with mantle, but I am getting random 0.5 seconds freezes pretty often, Hoping for a driver update soon


----------



## Kriant

Need advice:
Ever since I installed the 14.1 betas and reverted back, half of my games now black-screen and freeze the PC, or gray-screen, when running Crossfire or Tri-Fire.
Before I go for a whole reinstall of the system, any advice?


----------



## the9quad

It's bf4 anything is possible! Mine could be messed up. Thanks for checking though!


----------



## Ukkooh

Quote:


> Originally Posted by *Kriant*
> 
> Need advice:
> Ever since I installed the 14.1 betas and reverted back, half of my games now black-screen and freeze the PC, or gray-screen, when running Crossfire or Tri-Fire.
> Before I go for a whole reinstall of the system, any advice?


Have you tried DDU?


----------



## jerrolds

Quote:


> Originally Posted by *Widde*
> 
> It is with mantle, but I am getting random 0.5 seconds freezes pretty often, Hoping for a driver update soon


I get those random half-second stutters as well - this is on a freshly installed Windows 8.1 machine, made sure ULPS is disabled, and cores unparked. Beta 14.1 is very beta indeed.


----------



## kizwan

Quote:


> Originally Posted by *Widde*
> 
> It is with mantle, but I am getting random 0.5 seconds freezes pretty often, Hoping for a driver update soon


Same here.

Regarding the overclock, I could take it easy on the core clock. Maybe just go with 1000/1400. Pretty annoying when my PC crashes. The second GPU will disappear & only come back if I remove the PCIe cables for ~5 minutes.

BTW, did you notice VRAM usage with Mantle is ~6.4GB? DirectX only 4GB though.


----------



## Widde

Quote:


> Originally Posted by *kizwan*
> 
> Same here.
> 
> Regarding the overclock, I could take it easy on the core clock. Maybe just go with 1000/1400. Pretty annoying when my PC crashes. The second GPU will disappear & only come back if I remove the PCIe cables for ~5 minutes.
> 
> BTW, did you notice VRAM usage with Mantle is ~6.4GB? DirectX only 4GB though.


Having to do 100mV so I might go down on the core indeed ^_^


----------



## the9quad

Does Mantle utilize memory differently than DirectX? For instance, with two 4GB cards you utilize only 4GB in DX. Is Mantle able to utilize all 8GB in Crossfire? Maybe that is why we see the memory usage in Mantle being doubled? Curious.


----------



## BradleyW

Quote:


> Originally Posted by *the9quad*
> 
> Does Mantle utilize memory differently than DirectX? For instance, with two 4GB cards you utilize only 4GB in DX. Is Mantle able to utilize all 8GB in Crossfire? Maybe that is why we see the memory usage in Mantle being doubled? Curious.


I don't think this is the case. It would only add to latency and scaling issues if this was actually happening. Just my opinion.


----------



## kizwan

*BF4, Flood Zone 64 player, 1080p Ultra settings (AA & AA POST OFF), 200% resolution scale, 2 x 290's Crossfire*

*Frame Time - DirectX*


*Frame Time - Mantle*


*FPS - DirectX*
MIN: 30.68425898
MAX : 113.6363636
AVERAGE: 78.75183923


*FPS - Mantle*
MIN: 38.925652
MAX : 137.9310345
AVERAGE: 91.42697436


----------



## quakermaas

Quote:


> Originally Posted by *kizwan*
> 
> *BF4, Flood Zone 64 player, 1080p Ultra settings (AA & AA POST OFF), 200% resolution scale, 2 x 290's Crossfire*
> 
> *Frame Time - DirectX*
> 
> *Frame Time - Mantle*
> 
> *FPS - DirectX*
> MIN: 30.68425898
> MAX : 113.6363636
> AVERAGE: 78.75183923
> 
> *FPS - Mantle*
> MIN: 38.925652
> MAX : 137.9310345
> AVERAGE: 91.42697436

Nice


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> GV-R929OC-4GD is voltage locked?


Do any of you have it / know?

Maybe my search is broken :S can't find mentions of it in this thread, and off Google I keep reading that it is on one forum's thread & isn't on another
Quote:


> Originally Posted by *geggeg*
> 
> As far as I know, the DCUII has Elpida memory and the cooler is not the best- they just took the 780Ti cooler and modified it slightly.


Yeah, I plugged in a 290X DCUII and I've never been more sad to say: fancy backplate, fancy cooler with fans, and I still miss my Sapphire 290X reference that was able to keep the card cool (at the cost of running the dB of a leafblower inside my case).

I've never wanted to buy an EK block for a non-reference card more than now


----------



## VSG

Good thing EK is making one for it then


----------



## Paul17041993

The original 79x0 DCII(T) and Matrix(P) cards are notoriously horrible in many ways; only a few people have been happy with theirs (though they haven't OC'ed far at all). TBH they're only really good in pre-assembled/OEM rigs, if they were reliable...

The newer 280X version (the two-slot, not the "V2" rebadge) seemed like an improvement, but I've never seen one on these forums, let alone overclocked. Does the 290X one really have a cooling issue? I'm half wondering if ASUS pulled another QC fail by not tightening the screws correctly... (the 7970 DCII on release had most cards RMA'ed due to the coolers not being attached to the core)

Could also mention the original DCIIT like I had didn't cover all the memory ICs correctly, but really that shouldn't be an issue on the 290(X) as the ICs don't need much cooling at all...


----------



## Thorteris

Should I overclock my card on the stock cooler?


----------



## Widde

Quote:


> Originally Posted by *Thorteris*
> 
> Should I overclock my card on the stock cooler?


If you can live with the added noise ^_^ You will most likely want to do a custom fan profile to avoid throttling. I'm running 2 290s in crossfire at 1000/1500 atm on stock coolers
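For the curious, a custom fan profile is just a temperature-to-fan-speed curve; Afterburner does this in its GUI, and the points below are made up purely for illustration:

```python
def fan_speed(temp_c, curve=((40, 30), (60, 50), (75, 70), (85, 100))):
    """Fan duty (%) for a core temperature, linearly interpolated between
    (temp_C, fan_%) points - the same idea as Afterburner's custom curve.
    The default points here are illustrative, not a recommendation."""
    if temp_c <= curve[0][0]:
        return float(curve[0][1])
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # interpolate within this segment of the curve
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return float(curve[-1][1])  # pinned at max above the last point
```

The anti-throttling trick is simply making the curve steeper than the stock one near the 94C throttle point, so the fan ramps up before the card has to pull clocks.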


----------



## The Mac

yikes, that must be loud..


----------



## Widde

Quote:


> Originally Posted by *The Mac*
> 
> yikes, that must be loud..


Not easily bothered by sound ^_^ Cards rarely go over 84 degrees in BF4







http://piclair.com/58q0u pic of fan profile


----------



## Thorteris

Quote:


> Originally Posted by *Widde*
> 
> If you can live with the added noise ^_^ You will most likely want to do a custom fan profile to avoid throttling. I'm running 2 290s in crossfire at 1000/1500 atm on stock coolers


Sound doesn't bother me I game with a headset 99% of the time.


----------



## Widde

Quote:


> Originally Posted by *Thorteris*
> 
> Sound doesn't bother me I game with a headset 99% of the time.


Same here ^_^ And I just put it on auto when I'm gonna sleep


----------



## Kriant

Quote:


> Originally Posted by *Ukkooh*
> 
> Have you tried DDU?


8 times total.

I ran 3DMark 13 with the Extreme preset for an hour today - 0 issues.
BF4 -> lag lag lag...freeze....yellowish color screen ---> reboot
Same for AC4

Going to reinstall windows tomorrow.

I think either 14.1 or Atiman 8.3.5 screwed something over really bad


----------



## BradleyW

Having issues with 14.1 here. Instant crashes and vertical lines. Issue happens as soon as I boot into any game. Issue present on both cards. Issue not present on 13.12 CCC. Any ideas?

Thank you.


----------



## IBIubbleTea

Quote:


> Originally Posted by *BradleyW*
> 
> Having issues with 14.1 here. Instant crashes and vertical lines. Issue happens as soon as I boot into any game. Issue present on both cards. Issue not present on 13.12 CCC. Any ideas?
> 
> Thank you.


Try disabling your onboard CPU graphics ("Intel Integrated Graphics") in Device Manager. I haven't tried this method yet; I just went into my BIOS and disabled CPU graphics.


----------



## IBIubbleTea

Not sure if this overclock is good... I'm kind of a noob when it comes to overclocking gpu.

ASUS R9 290.
ASIC Quality - 76.8%
Core Voltage - +100
Power Limit - +50
Core Clock - 1150
Memory Clock - 1625
AUX Voltage - +13

Memory is Hynix.


----------



## Paul17041993

Being a desktop, you shouldn't need the Intel iGPU at all, so that's one of the first things you should have disabled in the BIOS. Otherwise you end up with unneeded driver complexity.


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *Paul17041993*
> 
> original 79x0 DCII(T) and Matrix(P) cards are notoriously horrible in many ways, only a few people have been happy with theirs (though they haven't OC'ed far at all), TBH only really good in pre-assembled/OEM rigs, if they were reliable...
> 
> the newer 280X version (the two slot, not the "V2" rebadge) seemed like an improvement, but Ive never seen one on these forums let alone overclocked, does the 290X one really have a cooling issue? I'm half wondering if ASUS pulled another QC fail at not tightening the screws correctly... (the 7970DCII on release had most cards RMA'ed due to the coolers not being attached to the core)
> 
> could also mention the orig DCIIT like I had didn't cover all the memory ICs correctly, but really that shouldn't be an issue on the 290(X) as the ICs don't need much cooling at all...


I probably should not have given the impression that it is terrible.

I did do multiple testing, benches, games, and mining.

This card did hold up pretty well in BF4; I think it topped 70-72C with 70% fan (max is 80% at 3000rpm).
What I did not like is the VRMs being 10-15C higher.

I had a reference Sapphire 290X; sure, the blower is loud, but it really does cool the card to an extreme in any situation.

The DCII is going to be great for a casual gamer or inside a nice computer; it will cool the card in most situations with good temps. Since I tested it to an extreme, I personally was not satisfied - this card and cooling unit are targeted at everything but the extreme potential of the card. They might as well have made it a tri-slot card, given it some bigger heatsinks or fans, and allowed it further cooling potential. (I don't even want to run OCCT for 30 minutes because I'm scared to see whether the card's cooling unit can actually cool it once it reaches max fan.)

In all honesty, I am considering putting my own heatsinks on top of that backplate because it gets pretty hot....

On a side note, how about that GV 290 OC







Do any of you own it / have the facts on voltage?


----------



## MrWhiteRX7

Finally got my new fittings in, so finishing up redoing my tri-fire rig lol. Now I can see what all this Mantle fuss is about


----------



## ArchieGriffs

Can't they send my 290 Tri-X digitally through the SSD I just bought? /sarcasm. They shipped it a week earlier than I was anticipating and now for whatever reason I'm super impatient. Actually... NO, take your time, I won't get to beta test a certain game again if it gets here Friday. GAHHHH my head is going to explode. Get here when you get here.


----------



## chiknnwatrmln

Why can't I adjust voltage in MSI AB?

I've been searching for an hour, I did the unofficial overclocking crap in the config and changed all the settings in the program, even flashed an MSI bios but the voltage slider is still grayed out. All other OC programs are not running.

This is really frustrating, I can hold 1225 MHz at 40c with +162mV using GPUTweak but I want more.


----------



## bond32

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Why can't I adjust voltage in MSI AB?
> 
> I've been searching for an hour, I did the unofficial overclocking crap in the config and changed all the settings in the program, even flashed an MSI bios but the voltage slider is still grayed out. All other OC programs are not running.
> 
> This is really frustrating, I can hold 1225 MHz at 40c with +162mV using GPUTweak but I want more.


Check the box for "unlock voltage control"?


----------



## chiknnwatrmln

Yes, I said I already changed MSI AB's settings. I tried forcing constant voltage, extending limits, disabling ULPS, the whole nine yards.

Not trying to come off rude, but this problem is all over the net with nobody posting their solutions.

All I want is a program that can give me more than 1.412V...

Using Beta 17, but for some reason it says this version expires this Sunday.


----------



## Matt-Matt

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Yes, I said I already changed MSI AB's settings. I tried forcing constant voltage, extending limits, disabling ULPS, the whole nine yards.
> 
> Not trying to come off rude, but this problem is all over the net with nobody posting their solutions.
> 
> All I want is a program that can give me more than 1.412V...
> 
> Using Beta 17, but for some reason it says this version expires this Sunday.


You have a reference card, yes?
I can get +100 in Afterburner just by checking the box to allow voltage control.

I just tried unlocking it and I'm still limited to +100 and 1235MHz... Hmmm

Seems it's broken; I'm sure MSI will fix it soon.


----------



## INCREDIBLEHULK

Where are you gigabyte owners


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Matt-Matt*
> 
> You have a reference card yes?
> I can get +100 in Afterburner with just checking the box to allow voltage control.
> 
> I just tried unlocking it and i still am limited to +100 and 1235MHz... Hmmm
> 
> Seems its broken, I am sure that MSI may fix it soon,


Yes, XFX ref card.


----------



## Forceman

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Yes, I said I already changed MSI AB's settings. I tried forcing constant voltage, extending limits, disabling ULPS, the whole nine yards.
> 
> Not trying to come off rude, but this problem is all over the net with nobody posting their solutions.
> 
> All I want is a program that can give me more than 1.412mV...
> 
> Using Beta 17, but for some reason it says this version expires this Sunday.


There's a Beta 18 available. There were some issues with early versions of Beta 17 not unlocking voltage control on some cards. Try 18 and see if it helps.


----------



## kizwan

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> Where are you gigabyte owners


Which Gigabyte are you looking for? Did you try looking for a review or two of that particular card?


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *kizwan*
> 
> Which Gigabyte you're looking for? Did you try look for a review or two for that particular card?


GV-R929OC-4GD

Been getting too much mixed information from different forums, so I gave up the hunt









Was hoping a OCN owner would confirm


----------



## Sgt Bilko

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> GV-R929OC-4GD
> 
> Been getting too much mixed information from different forums and now I gave up the hunt
> 
> Was hoping a OCN owner would confirm


Check out the owners list in the OP and PM those that own it; from memory I think 2 people have the Windforce 3 290/X.


----------



## Abyssic

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> GV-R929OC-4GD
> 
> Been getting too much mixed information from different forums and now I gave up the hunt
> 
> Was hoping a OCN owner would confirm


I don't own it, but here's what I know from reviews: they reused the GTX 780 Windforce cooler, which gets a bit louder and hotter than the Sapphire Tri-X, but it's overall a decent card.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Abyssic*
> 
> I don't own it, but here's what I know from reviews: they reused the GTX 780 Windforce cooler, which gets a bit louder and hotter than the Sapphire Tri-X, but it's overall a decent card.


He was more interested in whether it was volt-locked, iirc


----------



## Abyssic

Quote:


> Originally Posted by *Sgt Bilko*
> 
> he was more interested if it was Volt Locked iirc


oh... uhm ok, then nvm me xD


----------



## BakerMan1971

Giddy as can be, just popped in my order for an MSI R9 290 Gaming, on offer at £340 with free next day delivery. Gone green to red for the first time since 2007.


----------



## Hogesyx

Quote:


> Originally Posted by *BakerMan1971*
> 
> Giddy as can be, just popped in my order for an MSI R9 290 Gaming, on offer at £340 with free next day delivery. Gone green to red for the first time since 2007.


Welcome back to the red camp!


----------



## BradleyW

Quote:


> Originally Posted by *IBIubbleTea*
> 
> Try disabling your on cpu onboard graphics " Intel Integrated Graphics" in device manager. I haven't tried this method yet. I just went into my bios and disabled cpu graphics.


I'm on X79 platform.

@everyone
Can everyone see my sig rig? Whenever I ask for help, people ask me what I'm running or provide solutions that don't fit my current hardware. If you can't see it, how do I make it visible?
Thank you.


----------



## Sgt Bilko

Quote:


> Originally Posted by *BradleyW*
> 
> I'm on X79 platform.
> 
> @everyone
> Can everyone see my sig rig? Whenever I ask for help, people ask me what I'm running or provide solutions that don't fit my current hardware. If you can't see it, how do I make it visible?
> Thank you.


I can see it....


----------



## BradleyW

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I can see it....


Oh good.
Anyone else have suggestions on my 14.1 issues? Games crash and driver stops responding. Tested on both cards.


----------



## Sgt Bilko

Quote:


> Originally Posted by *BradleyW*
> 
> Oh good.
> Anyone else have suggestions on my 14.1 issues? Games crash and driver stops responding. Tested on both cards.


I have similar issues, but I'm putting it down to the 14.1 beta driver being a beta more than anything.

Re-install 13.11 and wait till the next one is my advice


----------



## BradleyW

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I have similar issues, but I'm putting it down to the 14.1 beta driver being a beta more than anything.
> 
> Re-install 13.11 and wait till the next one is my advice


It's interesting, because we both have Elpida IC's and we both have issues with 14.1.


----------



## Sgt Bilko

Quote:


> Originally Posted by *BradleyW*
> 
> It's interesting, because we both have Elpida IC's and we both have issues with 14.1.


Well, I've got a free hour or so......what games etc. are you having issues with? I'll run them (if I have them) and test


----------



## quakermaas

Quote:


> Originally Posted by *BradleyW*
> 
> Oh good.
> Anyone else have suggestions on my 14.1 issues? Games crash and driver stops responding. Tested on both cards.


I'm on 14.1 and a similar setup; I have only tried BF4 since installing the 14.1 driver, but I'm not having any problems so far, other than the documented issues with Mantle.

I can test a few different games in about an hour and let you know if I had any problems.

Our computers are similar; I have Elpida RAM on my GPUs and am on Win7


----------



## BradleyW

Quote:


> Originally Posted by *Sgt Bilko*
> 
> well i've got a free hour or so......what games etc are you having issues with and i'll run them (if i have them) and test?


Quote:


> Originally Posted by *quakermaas*
> 
> I'm on the 14.1 and similar set up, have only tried BF4 since I install the 14.1 driver, but not having any problems so far, other than the documented issues with Mantle.
> I can test a few different games in about an hour, and let you know if I had any problems.
> 
> Our computers are similar, I have Elpida RAM on my GPUs and on win7


Thanks to both of you. I don't think I need any testing now since quakermaas is using Elpida ICs and he is not having these issues.
But just in case you still want to check things out, I tried BF4, Batman AO and the Valley benchmark.


----------



## Sgt Bilko

Quote:


> Originally Posted by *BradleyW*
> 
> Thanks to both you. I don't think I need any testing now since quackermass is using Elpida IC's and he is not having these issues.
> But just in case you still want to check things out, I tried BF4, batman AO and Valley benchmark.


I was just in the BF4 Test Range messing about with a joystick running Mantle and had no issues. Don't have Arkham Origins, and I ran Valley a few days ago and it was fine.......BF4 has played up on me, but a quick restart usually fixes it


----------



## BradleyW

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I was just in the BF4 Test Range messing about with a Joystick running Mantle and no issues, Don't have Arkham Origins and i ran Valley a few days ago and it was fine.......BF4 has played up on me but a quick restart usually fixes it for me


It would be interesting to see if I get the same issues in Windows 7!


----------



## Fahrenheit85

Having a little issue with Afterburner, running 3.0 Beta 18 with my 290x and it only lets me go to +100mv. I thought at this point we could go +200? Running a reference gigabyte card.


----------



## rdr09

Arizonian, thanks for the wonderful job maintaining this thread even though you have Nvidia.


----------



## rdr09

Quote:


> Originally Posted by *Fahrenheit85*
> 
> Having a little issue with Afterburner, running 3.0 Beta 18 with my 290x and it only lets me go to +100mv. I thought at this point we could go +200? Running a reference gigabyte card.


You want to try Trixx? It gives +200, which I only recommend for benching. I saw my core voltage go as high as 1.4V at that setting. Set the Power Limit to max.


----------



## Fahrenheit85

Quote:


> Originally Posted by *rdr09*
> 
> You want to try Trixx? it gives +200, which i only recommend for benching. i saw my core voltage go as high as 1.4v at that setting. set the Power Li (power limit) to max.


Don't have the power limit option in Trixx 4.8.2. It does let me go to 200 though.


----------



## rdr09

Quote:


> Originally Posted by *Fahrenheit85*
> 
> Dont have the power limit option in Trixx on 4.8.2 It does let me go to 200 though.


Hmmm. You used the slider on the right?

You are not using the stock cooler, are you?


----------



## battleaxe

Well, I successfully cancelled my order for the ASUS 290X DCII and replaced it with a Sapphire 290 Tri-X. Very happy.

Awesome.


----------



## Fahrenheit85

Quote:


> Originally Posted by *rdr09*
> 
> hmmm. you used the slider on the right?
> 
> you are not using stock cooler, are you?


EK water block for cooling. Trixx does have the slider; I just had to scroll down to see it. Stupid me. Hmm, seems like if I push over 100mV I get display corruption even though I'm not changing the clocks at all. I'll be running my core at 1150; at 100mV it's fine but at 150mV it has issues. Wonder why that is


----------



## Frontside

Hey, guys. Quick question here. Can I trust "AMD reference design" as checked by EK? I'm considering buying the MSI R9 290X Gaming or the Sapphire R9 290X Tri-X card


----------



## Csokis

I ordered a *Sapphire R9 290 Tri-X* card, but I'm scared. Black screen, coil whine, etc.







Am I paranoid, or?!


----------



## battleaxe

Quote:


> Originally Posted by *Csokis*
> 
> I ordered a *Sapphire R9 290 Tri-X* card, but i'm scared. Black Screen, Coil Whine, etc.
> 
> I'm paranoid, or?!


Why are you scared? I have an MSI 290 and haven't had a single problem with it. Black screens only happen when the memory is running too high. If it can't do stock on mem, just RMA. No worries.

Edit: I just ordered one too.


----------



## rdr09

Quote:


> Originally Posted by *Fahrenheit85*
> 
> EK water block for cooling, though Trixx does have the slider, just have to scroll down to see it. Stupid me. Hmm seems like if I push over 100MV I get display corruptions even though i'm not changing the clocks at all. Ill be running my core at 1150 and at 100mv its fine but 150mv it has issues. Wounder why that is


Good. I have an above-average clocker 290 but I play it at stock. For benches I can go all the way up to 1175/1500 just by maxing out the PL. Try the same OC using just the PL and leave the vcore alone. Don't use Furmark to check for stability; use 3DMark and games. I don't mine so I can't help you there. Watch all your temps even if yours is watered. I use GPU-Z but it sometimes conflicts with other apps.

Make sure to disable CCC Overdrive, and use only one app to OC.


----------



## quakermaas

Quote:


> Originally Posted by *BradleyW*
> 
> Thanks to both you. I don't think I need any testing now since quackermass is using Elpida IC's and he is not having these issues.
> But just in case you still want to check things out, I tried BF4, batman AO and Valley benchmark.


Tried BF4, BF3 and Alan Wake... OK

Far Cry 3 and F1 2012 blue screen within a minute of starting


----------



## Krusher33

14.1 driver worked fine for me in Tomb Raider and DayZ last night.


----------



## Fahrenheit85

Quote:


> Originally Posted by *rdr09*
> 
> good. i have an above average clocker 290 but i play with it at stock. benches, i can go all the way up to 1175/1500 just maxing out the PL. try to the same oc just using the PL and leave the vcore. don't use furmark to check for stability. use 3DMarks and games. i don't mine so can't help you there. watch all your temps even if yours is watered. i use gpuz but it sometimes conflict with other apps.
> 
> make sure to Disable CCC Overdrive. use only one app to oc.


Hard lock as soon as I backed down the power. At +100mV I can run 1150/1550 all day in Crysis 3, so I guess I'll be happy with that.

Now if I could get this coil whine to go away. Man, I've had bad luck with coil whine on my past 3 cards: the 570, 7950 and my 290X all have horrible whine. Now with the water block it's all I hear. Welp, better luck next go around


----------



## lol.69

Quote:


> Originally Posted by *Csokis*
> 
> I ordered a *Sapphire R9 290 Tri-X* card, but i'm scared. Black Screen, Coil Whine, etc.
> 
> I'm paranoid, or?!


Well, my ASUS R9-290X DCII has started to black screen randomly.
Had to return it.
It stayed about 5 hours in my PC...

I asked the vendor if he can give me the Tri-X version instead.

Just need to wait.

Also: what is the max voltage for these cards (1.4V)?

The "only" way we can break the card is by putting too much voltage through it for a long period of time, right?


----------



## chiknnwatrmln

MSI AB Beta 18 lets me adjust voltage but I can't use the voltage hack...

I want +200mV, I feel that I can break 1250MHz no problem.


----------



## rdr09

Quote:


> Originally Posted by *Fahrenheit85*
> 
> Hard lock soon as I backed down the power. At +100mv i can run 1150/1550 all day in Crysis 3 so I guess Ill be happy with that.
> 
> Now if I could get this coil whine to go away. Man I had bad luck with coil whine on my past 3 card. 570, 7950 and my 290x all have horrible whine. Now with the water block its all I hear. Welp better luck next go around


I had a 7950 that had coil whine. I joined the Valley bench, and after running it about 20 times the coil whine died down. At stock you can run Valley continuously for a few hours if you like and see if it helps.


----------



## Fahrenheit85

Quote:


> Originally Posted by *rdr09*
> 
> i had a 7950 that had coil whine. i joined the Valley bench and after running that it about 20 times the coil whine died down. at stock you can run Valley continously for a few hours if you like and see if it helps.


Yeah, I have like 100-plus hours of hard gaming time on the card already. It ain't going away lol. Is what it is


----------



## Ukkooh

Quote:


> Originally Posted by *lol.69*
> 
> Well my asus r9-290x dcII has started to do black screens randomly.
> Had to return it.
> It stayed about 5 hours in my Pc...
> 
> I asked the vendor if he can give me the Tri-X version instead.
> 
> Just need to wait.
> 
> Also: what is the max voltage for these cards ( 1.4v)?
> 
> The "only" way we can break the card is if we put too much voltage for long period of time right?


It is the combination of temps and voltage that kills the card. I would do 1.4V only with water cooling or better, and purely for benching. Around 1.3V should be safe for 24/7.
Quote:


> Originally Posted by *Fahrenheit85*
> 
> Yeah have like 100 plus hours of hard gaming time on the card already, It ant going away lol. Is what it is


RMA the card if it annoys you. My current 290X has no coil whine at all, but my 1st was louder than my AP-15s.


----------



## rdr09

Quote:


> Originally Posted by *Fahrenheit85*
> 
> Yeah have like 100 plus hours of hard gaming time on the card already, It ant going away lol. Is what it is


It does suck when you have a loop expecting it to be quiet and you hear the whine. I'd use headphones if I were you. Sorry.


----------



## Fahrenheit85

Quote:


> Originally Posted by *Ukkooh*
> 
> It is the combination of temps and voltage that kill the card. I would do 1.4V only with water cooling or better and purely for benching. Around 1.3V should be safe for 24/7.
> RMA the card if it annoys you. My current 290x has no coil whine at all but my 1st was louder than my ap-15s.


I would like to, if I'm honest, but I have no card to replace it with. I guess I could run integrated graphics for a while, but I don't feel like pulling down the loop. It is what it is.


----------



## BradleyW

Thanks for the feedback everyone. I tried 14.1 on Windows 7. Instant driver crash and system lockup. I really can't get 14.1 to work on my system. 13.12 are flawless.


----------



## Arizonian

Quote:


> Originally Posted by *Csokis*
> 
> I ordered a *Sapphire R9 290 Tri-X* card, but i'm scared. Black Screen, Coil Whine, etc.
> 
> 
> 
> 
> 
> 
> 
> Am I paranoid, or what?!


I wouldn't worry about it. Right now the Tri-X is one of the top cards IMO. Good choice.
Quote:


> Originally Posted by *battleaxe*
> 
> Why are you scared? I have an MSI 290 and haven't had a single problem with it. Black screens only happen when the memory is running too high. If it can't do stock on the mem, just RMA. No worries.
> 
> Edit: I just ordered one too.


I agree.

When you guys get them, submit your proof and join us.


----------



## Widde

http://piclair.com/9sl0y My gpu usage with 14.1 in bf4 seems to have gone spastic O_O or is it afterburner that needs a new patch?


----------



## kizwan

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I can see it....
> 
> 
> 
> Oh good.
> Anyone else have suggestions on my 14.1 issues? Games crash and driver stops responding. Tested on both cards.
Click to expand...

Quote:


> Originally Posted by *BradleyW*
> 
> Thanks for the feedback everyone. I tried 14.1 on Windows 7. Instant driver crash and system lockup. I really can't get 14.1 to work on my system. 13.12 are flawless.


BF4 crashed a couple of times. For me, it will crash only under either of these conditions:

- Memory overclocked to at least 1400MHz. So far I have tested separate core & memory overclocks, 1100/1250 MHz & 947/1350 MHz; both stable (in BF4) so far.
- Core clock on one or both cards stuck at a fixed frequency, e.g. 947 MHz or whatever you overclocked it to.

The other night, BF4 was crashing for no reason whenever I tried to launch it. Fortunately, restarting the computer allowed me to launch & play BF4 without any problem. For the stuck frequency, disabling & re-enabling Crossfire helps get the frequency un-stuck.

Regarding the disable-IGPU comment/advice, it's somewhat valid advice. If you're trying to run with Crossfire disabled, you might want to try disabling the other 290 card in Device Manager.

Using your sig as reference, I'm guessing you're trying with Crossfire enabled. I can run BF4 with Mantle in Crossfire, but YMMV to be honest. The only issue I have with the 14.1 driver, apart from the stuck frequency & everything listed in the 14.1 driver's known issues, is that when overclocked, the core clock on the second GPU fluctuates. This may or may not contribute to the stability issues I encountered when overclocked. Interestingly, BF4-with-Mantle-in-Crossfire stability is also listed in the 14.1 driver's known issues.

BTW, I'm using Windows 7.


----------



## The Mac

GPU usage reporting is broken.

There is a bug where, if the sensors are polled too rapidly, they return garbage.

If you are using AB, there is a cfg setting to force RivaTuner to use DX polling instead.


----------



## Widde

Getting MASSIVE fps drops as well at random, and all sound begins to stutter. Do I need to reinstall the drivers? It can stutter at random on the desktop as well.


----------



## IBIubbleTea

Quote:


> Originally Posted by *BradleyW*
> 
> I'm on X79 platform.
> 
> @everyone
> Can everyone see my sig rig? Whenever I ask for help, people ask me what I'm running or provide solutions that don't fit my current hardware. If you can't see it, how do I make it visible?
> Thank you.


What..? wow I'm a noob... Sorry about that.


----------



## battleaxe

Quote:


> Originally Posted by *Arizonian*
> 
> I wouldn't worry about it. Right now the Tri-X is one of top cards IMO, good choice.
> I agree.
> 
> When you guys get them, submit your proof and join us.


I've already joined. This would be my second card. Will advise when I get it.


----------



## ebduncan

To those using the 14.1 driver, I would suggest going back to 13.12. The 14.1 drivers aren't exactly the best solution to your driver problems.

but if you really wish to use them, then be sure to use a driver cleaner such as this one https://forums.geforce.com/default/topic/550192/geforce-drivers/display-driver-uninstaller-ddu-v11-1-released-01-23-14-/

- Reboot/Start Windows in Safe mode, and use DDU to uninstall the previous AMD drivers.
- Reboot to Windows in normal mode.
- Install 14.1 driver (Run as Administrator)
- Reboot
- Delete all the folders in %user%/documents/battlefield 4
- Enable Crossfire in CCC (It also helps to disable ULPS).
- Start BF4 in Test range, and enable Mantle.
- Exit the game, and restart again in Test range.
- At this point you can change any graphics settings. Use a resolution scale of at least 125%
- Make sure Vsync is disabled
- If you want to make ANY changes to your graphics settings, only do them in the Test range; do not do it while in a game/match.
- Enjoy!

This seems to get Crossfire working with the 14.1 drivers as well. Honestly though, just use 13.12 for now, until AMD resolves a considerable list of problems.
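The settings-wipe step above is the one people are most nervous about, so here's a minimal, hedged sketch of it, demonstrated on a throwaway directory rather than the real one. To use it for real you'd set `BF4DIR` to your actual "Documents/Battlefield 4" folder; the mock folder names and the `.bak` backup copy are my additions, not part of the original steps:

```shell
# Sketch of the "delete all the folders in Documents/Battlefield 4" step.
# Demonstrated on a temp directory; point BF4DIR at the real settings
# folder to use it for real. The .bak copy is an extra safety net.
BF4DIR="$(mktemp -d)"                         # stand-in for the BF4 settings dir
mkdir -p "$BF4DIR/cache" "$BF4DIR/PROFSAVE"   # mock folders (names hypothetical)
touch "$BF4DIR/notes.txt"                     # loose files are left alone
cp -r "$BF4DIR" "${BF4DIR}.bak"               # back everything up first
find "$BF4DIR" -mindepth 1 -maxdepth 1 -type d -exec rm -rf {} +
ls -A "$BF4DIR"                               # only notes.txt remains
```

The `find ... -type d` restriction deletes only the folders, matching the step as written; BF4 rebuilds its caches on the next launch.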


----------



## ArchieGriffs

Guessing my Tri-X will get here Monday. I'll update with pics and later do a couple of benches. I'm pretty curious what the VRAM temps + OCing capability of my card are, but I have no intention of pushing it harder than I need to to sate my curiosity, so don't expect crazy voltages.


----------



## chiknnwatrmln

Arizonian, add me please.


Spoiler: Warning: Spoiler!


----------



## phallacy

Does anybody else have trouble with Trixx showing either the vddc offset or power limit slider when using Crossfire? Sometimes both are shown for both of my 290Xs, but most of the time the power limit/vddc only shows on the main card in my first PCIe slot, and one of them won't show up (it's usually the power limit slider). I have the sync multi-GPU config box checked, so I believe whatever I set on the main card is applied to the second card as well, but it's annoying not being able to make sure it is.


----------



## HOMECINEMA-PC

Finally I got my 2nd 290 blocked and up and running in c/f. No end of dramas to begin with: driver crashes 10 sec after I get into Windows.







Lots of safe mode, and I ditched Trixx. Both cards have modded Asus BIOSes. So tonight I'm hoping to at least crack 27k on mk 11, with a view to 28k.


----------



## tian105

i have 2x 290X; while mining (CF disabled), GPU2's memory clock would drop from 1500MHz to 150 for a split second every few seconds.

Anyone know if this is normal or is my GPU2 defective?


----------



## Widde

Hmm, I installed 2 random optional Windows updates and rebooted, and now I run Mantle hassle-free; all other problems seem resolved.


----------



## BradleyW

Quote:


> Originally Posted by *Widde*
> 
> Hmm installed 2 random optional windows updates and rebooted and now I run mantle hassle free and all other problems seem resolved


I just updated windows but I'm still having major issues with the new drivers. Glad you have it working on your end anyway.


----------



## Noskcaj

I've just bought a Gigabyte (custom cooler) R9 290. Two questions:
1. How can I overclock it in Linux?
2. Can my card be unlocked to a 290X?


----------



## BradleyW

Quote:


> Originally Posted by *Noskcaj*
> 
> I've just bought a Gigabyte (custom cooler) R9 290. Two questions:
> 1. How can I overclock it in Linux?
> 2. Can my card be unlocked to a 290X?


1)
http://www.overclock.net/t/517861/how-to-overclocking-ati-cards-in-linux
2)
Probably not; unlocking only works on reference designs, to my knowledge.
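For the Linux question, the linked guide works through the proprietary fglrx driver's `aticonfig` OverDrive interface. A dry-run sketch of the basic sequence, echoed rather than executed; the clock values are illustrative examples, not recommendations:

```shell
# Dry-run sketch of overclocking under the old fglrx driver, per the
# guide linked above. The run() wrapper only echoes the commands; drop
# it to actually execute them on a system with fglrx installed.
run() { echo "would run: $*"; }
run aticonfig --od-enable                # unlock the OverDrive interface
run aticonfig --od-getclocks             # inspect current & peak core/mem clocks
run aticonfig --od-setclocks=1000,1300   # set core,mem in MHz (example values)
```

Note this only applies to fglrx; the open-source radeon driver of the time exposed clocks differently (via sysfs), so check which driver you're actually running first.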


----------



## Tokkan

Quote:


> Originally Posted by *phallacy*
> 
> Does anybody else have trouble with Trixx showing either the vddc offset or power limit slider when using Crossfire? Sometimes both are shown for both of my 290Xs, but most of the time the power limit/vddc only shows on the main card in my first PCIe slot, and one of them won't show up (it's usually the power limit slider). I have the sync multi-GPU config box checked, so I believe whatever I set on the main card is applied to the second card as well, but it's annoying not being able to make sure it is.


Disable ULPS?


----------



## Widde

Quote:


> Originally Posted by *BradleyW*
> 
> I just updated windows but I'm still having major issues with the new drivers. Glad you have it working on your end anyway.


Nvm, BF4 just hardlocked so I had to restart the PC.


----------



## headbass

btw, I suggest any reference 290 owner give the Tri-X BIOS a try.

I flashed that BIOS when trying to sort out the hard freezing when playing HW-accelerated Flash content in FF, and noticed the card runs a lot cooler.

On the standard BIOS my card with stock cooling would get to 95 and even downclock at 48% fan.
With the Tri-X BIOS it runs stable at 1000/1300, and max vddc was only 1.203v, so it stays at 79-80 and usually runs at 46% (the max I've seen in the Tri-X auto mode was 49%).

With the stock BIOS it was running at 947/1250 at 1.26v and 95 degrees, and couldn't even hold 947 at the same fan speeds.

I used the Tri-X BIOS for about a week before I installed my Accelero, and I'm a lot happier not having to bump the fan to 60% to keep it around 80 as I did with the older BIOS.


----------



## pompss

The 290s' prices are crazy high: 550.









I remember when the price was 400.
I sold my GTX 780 Ti for 650, and I was looking for a 290 to get some money back due to some unexpected bills.
Instead I bought a GTX 780 Classified for 450 on eBay.
Also, the GTX 780 Ti is not worth the 200-dollar difference in performance.

The GTX 780 Classified beats the 290 easily, so right now the deal is a GTX 780 Classy.


----------



## rdr09

Quote:


> Originally Posted by *pompss*
> 
> the 290's price are crazy high. 550
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i remember the price was 400.
> I sold my gtx 780 ti for 650 and i was looking for 290 to get some money back due some unexpected bills
> Instead i bought a gtx 780 classified for 450 on ebay
> ALso the gtx 780 ti its not worth the 200 dollars difference in power.
> 
> The gtx 780 classifed beat the 290 easy So right now the deal is a gtx 780 classy.


^this. if you can oc your 780 to 1400 on the core, in games you won't see a difference between the two. just oc the 780 a bit.


----------



## Widde

Quote:


> Originally Posted by *pompss*
> 
> the 290's price are crazy high. 550
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i remember the price was 400.
> I sold my gtx 780 ti for 650 and i was looking for 290 to get some money back due some unexpected bills
> Instead i bought a gtx 780 classified for 450 on ebay
> ALso the gtx 780 ti its not worth the 200 dollars difference in power.
> 
> The gtx 780 classifed beat the 290 easy So right now the deal is a gtx 780 classy.


Here a GTX 780 Classy goes for 750







and a regular 780 is 670ish$, a 290 530$


----------



## pompss

Quote:


> Originally Posted by *rdr09*
> 
> ^this. if you can oc your 780 to 1400 on the core. games you won't see a difference between the two. just oc the 780 a bit.


Yeah, not worth waiting for the R9 290 to become available at 450 if the 780 Classy does better or equal even against the 290X.
Quote:


> Originally Posted by *Widde*
> 
> Here a gtx780 classy goes for 750
> 
> 
> 
> 
> 
> 
> 
> and a regular 780 670ish$, a 290 530$


I know.
Prices in Europe are too high compared to the USA.
I was living in Italy; it's not possible to have a decent life there.
Thank god I got lucky and moved here to Miami, Florida.
I love Italy but I will never go back. Europe is going down, and Italy further down than the rest of the euro zone.


----------



## psyside

Quote:


> Originally Posted by *Fahrenheit85*
> 
> Yeah, I have 100-plus hours of hard gaming time on the card already. It ain't going away, lol. It is what it is.


Might be your psu.


----------



## Widde

Quote:


> Originally Posted by *pompss*
> 
> yeah not worth wait for the r9 290 comes available at 450 if the 780 classy get better or equal even against the 290x.
> I know
> Price in Europa are too high instead USA.
> I was living in Italy.Its not possible have a decent live there.
> thanks god i get lucky and moved here in Florida Miami.
> Love Italy but i Will never go back. Europa Is going down and Italy more down then rest of euro.


The import taxes are silly :S I live in Sweden and we have been pretty untouched by the mining craze with these cards, but the taxes for bringing them in, geeez :S


----------



## rdr09

Quote:


> Originally Posted by *pompss*
> 
> yeah not worth wait for the r9 290 comes available at 450 if the 780 classy get better or equal even against the 290x.
> I know
> Price in Europa are too high instead USA.
> I was living in Italy.Its not possible have a decent live there.
> thanks god i get lucky and moved here in Florida Miami.
> Love Italy but i Will never go back. Europa Is going down and Italy more down then rest of euro.


it might match a 290 but not the 290X. unless the latter is a bugger.

edit: that is a reason why the Ti is more expensive. it will top benches for sure.

dang, i might go to vienna. how's vienna?


----------



## psyside

Quote:


> Originally Posted by *pompss*
> 
> The gtx 780 classifed beat the 290 easy So right now the deal is a gtx 780 classy.


Classy is amazing, and well worth 450. But it only beats the custom 290 cards if you get it to 1300+; the Tri-X for example @1150 is as fast as [email protected] So keep that in mind.


----------



## Red1776

well this is going to be fun

4x R290X MSI 4GB


----------



## rdr09

Quote:


> Originally Posted by *Red1776*
> 
> well this is going to be fun
> 
> 4x R290X MSI 4GB


how's your better half doing?


----------



## pompss

Quote:


> Originally Posted by *psyside*
> 
> Classy is amazing, and well worth 450. But, it only beats the custom 290 cards if you get it to 1300+ the tri x for example @1150 is as fast as [email protected] So keep that in mind.


Yeah, I will keep the Classy with the stock BIOS.
For sure it's gonna be faster than an R9 290 with stock BIOS.
Also, I tried a 290X a while ago, and a GTX 780.
No big difference in games with stock BIOSes.


----------



## rdr09

Quote:


> Originally Posted by *pompss*
> 
> yeah i will keep the classy with stock bios.
> For sure its gonna be more faster then r290 with stock bios.
> also i try the 290x time ago and a gtx 780.
> No big difference with stock bios in games.


i am using the original bios on my 290. not all 290s will clock this high, though . . .

http://www.3dmark.com/3dm/2072325

and a 780

http://www.3dmark.com/fs/1673294


----------



## Jack Mac

Quote:


> Originally Posted by *Red1776*
> 
> well this is going to be fun
> 
> 4x R290X MSI 4GB


Are you planning on watercooling or just maxing the fans?


----------



## brazilianloser

You may now update me up to water


----------



## Red1776

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> well this is going to be fun
> 
> 4x R290X MSI 4GB
> 
> 
> 
> 
> how's your better half doing?
Click to expand...

Still struggling, but has begun a new drug therapy, so we are hopeful.. thanks









Quote:


> Originally Posted by *Jack Mac*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> well this is going to be fun
> 
> 4x R290X MSI 4GB
> 
> 
> 
> 
> Are you planning on watercooling or just maxing the fans?
Click to expand...

Ordering the Heatkillers as we speak!


----------



## Jack Mac

Nice, good luck with Quad CF, I'm waiting on a payment from Admin before I order my second 290.


----------



## Iniura

Is there anyone here who has an XFX R9 290 Black OC Edition with an EK-FC R9-290X waterblock on it? If so, could you please confirm?


----------



## bond32

Quote:


> Originally Posted by *Iniura*
> 
> Is there anyone here who has a XFX R9 290 Black OC Edition with a EK-FC R9-290X waterblock on it? If so could you please confirm.


No block for that. Doubtful one will be released. I could be wrong, but I believe that is a non-reference pcb.


----------



## Arizonian

Quote:


> Originally Posted by *Red1776*
> 
> well this is going to be fun
> 
> 4x R290X MSI 4GB
> 
> 
> Spoiler: Warning: Spoiler!


Pop a GPU-Z validation link with OCN name in your post and welcome aboard. Congrats - added
















http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/15880#post_21733824

Quote:


> Originally Posted by *brazilianloser*
> 
> You may now update me up to water
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Sure will. Congrats, looks great. Updated


----------



## Frontside

Quote:


> Originally Posted by *Red1776*
> 
> well this is going to be fun
> 
> 4x R290X MSI 4GB


Congratulations, Red. I have just ordered one myself, and will get another one or two as soon as I can. I'll have to hide them from my wife, though she has tonnes of cosmetics which cost more than any hi-end rig.
Do they have those nasty warranty-void stickers on the screws, and sweet Hynix memory onboard?


----------



## chiknnwatrmln

Hey Arizonian, not sure if you missed my post but can you add me please? Post is on page 1587. Thanks.


----------



## Arizonian

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Arizonian, add me please.
> 
> 
> Spoiler: Warning: Spoiler!


Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Hey Arizonian, not sure if you missed my post but can you add me please? Post is on page 1587. Thanks.


I did and I'm sorry.







You've been slotted in order.

Sweet OC and congrats - added


----------



## chiknnwatrmln

No worries, I know that posts can get lost in threads that move as fast as this one does. Thanks.


----------



## axiumone

I have two questions.

First is - I used to run some 7970 cards in eyefinity landscape and I loved the ease of use compared to my current 780 surround set up. I'm looking to possibly go for a 290x trifire setup.

However, I have gone from landscape to portrait now and I'm appreciating Nvidia's bezel correction in that config. Do the current Catalyst drivers support bezel correction/compensation in portrait or portrait-flipped modes? This is all in Win 8.1.

Second is - where are the 290X Lightnings? Seems like they were shown off a month ago... but there is absolutely no news.


----------



## Falkentyne

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> MSI AB Beta 18 lets me adjust voltage but I can't use the voltage hack...
> 
> I want +200mV, I feel that I can break 1250MHz no problem.


Try /wi6 instead of /wi4 in beta 18.


----------



## kizwan

Quote:


> Originally Posted by *Widde*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BradleyW*
> 
> I just updated windows but I'm still having major issues with the new drivers. Glad you have it working on your end anyway.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nvm bf4 just hardlocked so I had to restart the pc
Click to expand...

If I'm not mistaken, you're overclocked, right? Run BF4 at stock clocks or take it easy on the memory overclock. In my case BF4 will crash with the memory overclocked to 1400 MHz, even though that was stable with the 13.12 WHQL driver. So far 1350 MHz is stable for me. Also, did you check whether the core clock on the second GPU fluctuates when overclocked? Mine did, so I just run my card at stock clocks. So far BF4 is running without any problem.
Quote:


> Originally Posted by *brazilianloser*
> 
> You may now update me up to water
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Nice work! I can't ignore a beauty when I see one.







How are your temps, especially VRM1?

My rig in its current state. I still need to change the cables to sleeved ones, but that may have to wait, because I think I want to change the front radiator to a HL Black Ice SR-1 240.


----------



## Widde

Quote:


> Originally Posted by *kizwan*
> 
> If I'm not mistaken, you're overclocked right? Run BF4 at stock clock or take easy on the memory overclock. In my case BF4 will crash if memory overclock to 1400 MHz, even though it was stable with 13.12 WHQL driver. So far 1350 MHz stable for me. Also, did you check whether the core clock on second GPU is fluctuating when overclocked? Mine did. So I just run my card at stock clock. So far BF4 running without any problem.
> Nice work!
> 
> 
> 
> 
> 
> 
> 
> How is your temp, especially VRM1?


Gonna go with stock until the next driver, or until I can be bothered to try out the PT1 BIOS. Might multiple monitors mess stuff up? I have two 24-inch screens but only game on one.


----------



## Redvineal

@Arizonian, I've been a busy man over the last month and made several changes to my box. Please update me to 2 x MSI R9 290 GAMING cards, now under water!



















Spoiler: Warning: Spoiler!









And for those that have asked on previous pages, I can confirm the XSPC full cover block is compatible with this card model!


----------



## kizwan

Quote:


> Originally Posted by *Widde*
> 
> Gonna go with stock untill next driver or if I can be bothered trying out the pt1 bios, Might multi monitors mess stuff up? Having 2 24 inch screens but only gaming on 1


I don't think so. I'm only using one monitor because the other one is broken (backlight).
Quote:


> Originally Posted by *Redvineal*
> 
> @Arizonian, I've been a busy man over the last month and made several changes to my box. Please update me to 2 x MSI R9 290 GAMING cards, now under water!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> And for those that have asked on previous pages, I can confirm the XSPC full cover block is compatible with this card model!


Nice work!







Can you shift the top radiator toward the back a little? That way you could fit the 3rd fan on your 360.


----------



## Matt-Matt

Quote:


> Originally Posted by *Csokis*
> 
> I ordered a *Sapphire R9 290 Tri-X* card, but i'm scared. Black Screen, Coil Whine, etc.
> 
> 
> 
> 
> 
> 
> 
> Am I paranoid, or what?!


Don't be worried at all! My card has been 100% fine unless I try to unlock it or overclock it too much!


----------



## Redvineal

Quote:


> Originally Posted by *kizwan*
> 
> Nice work!
> 
> 
> 
> 
> 
> 
> 
> Can you shift the top radiator toward the back a little? This way you can put the 3rd fans on your 360.


Thank you!

My case doesn't have proper holes for it, but I did consider it. Once I measure the clearance between the top radiator's fans and the 140mm mounted in the back, I may bust out the drill and make it happen.

First step before that is replacing those terrible, terrible power cables with individually sleeved ones!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Iniura*
> 
> Is there anyone here who has a XFX R9 290 Black OC Edition with a EK-FC R9-290X waterblock on it? If so could you please confirm.


I have them, and afaik they are a ref-design PCB, but I can't be 100% sure without taking the coolers off, and I'm still waiting for an answer from XFX about that.

And I think I'm the only person in this club with the DD cards, so the only thing you could do is compare high-res pics of the PCBs.

EK's website has them.


----------



## Red1776

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Iniura*
> 
> Is there anyone here who has a XFX R9 290 Black OC Edition with a EK-FC R9-290X waterblock on it? If so could you please confirm.
> 
> 
> 
> I have them and afaik they are a Ref design PCB but i can't be 100% sure without taking the coolers off and i'm still waiting for an answer from XFX about that.
> 
> And i think I'm the only person in this club with the DD cards so the only thing you could do is compare High Res pic's of the PCB's.
> 
> EK's website has them
Click to expand...

Here is a ref/ref-based PCB R9 290/290X:


----------



## flamin9_t00l

@kizwan I noticed you have both your rads as intake. Is that to keep dust down, or something else? I would imagine it gets toasty in there with the side panel on and only 1 rear exhaust. Super rig, tho.

@Redvineal Very nice compact, tidy rig; great job. I routed some tubing behind the mb tray also; very handy to have that extra space back there... and the black and white looks mint IMO (if you get my meaning lol). I was going to ask where your PSU is, but just noticed it's housed in the Air 540.


----------



## flamin9_t00l

Anyone think Assassin's Creed IV will improve with future drivers?

You'd think CFX 290s would handle max graphics on all current games. I've been using AFR-friendly mode combined with D3DOverrider triple buffering, and it runs a bit better, although not with maxed-out graphics. I've only done 8% of the game, hoping for smoother play with new drivers before I continue so I can see it in its full glory.

Also tried V-sync a couple of times with CFX in a couple of games (can't remember which); I thought I had issues with a single card (got that sorted), but with Xfire it just refuses to work properly.

The only problems I've had with my new R9s, other than the new beta and BF4 of course:

1. AC4
2. V-sync

Anyone running AC4 or V-sync with CrossfireX 290(X) without problems?


----------



## Sgt Bilko

Quote:


> Originally Posted by *flamin9_t00l*
> 
> Anyone think Assassins Creed IV will improve with future drivers?
> 
> CFX 290's should handle max graphics on all current games you'd think. I've been using AFR friednly mode combined with D3DOverrider triple buffering and it runs a bit better although not maxed out graphics. Only done 8% of the game hoping for smoother play with new drivers before I continue so I can see the full glory.
> 
> Also tried v-sync a couple of times with CFX with a couple of games can't remember which and I thought I had issues with a single card (got that sorted) but with Xfire it just refuses to work properly.
> 
> Only problems I've had with my new R9's other than the new beta and BF4 of course.
> 
> 1. AC4
> 2. V-sync
> 
> Anyone running AC4 or V-sync with CrossfireX 290(X) Without probs?


I could run AC4 with a single 290X maxed out. I'll run it now with CF on and see what happens; will report back soon.

EDIT: My AC4 install file has vanished for some reason now......... Sorry, not going to be able to run that test any time soon; gimme a few hours. Lucky I actually buy retail games.


----------



## TommyGunn123

Looky what PCCG has put up recently







Should be bloody good with a giant (what looks like 2.5-slot) heatsink and 3 fans, and at that price!



http://www.pccasegear.com/index.php?main_page=product_info&cPath=416&products_id=26799&zenid=dc5e9e51b8ae549766454c8078029dd5


----------



## Sgt Bilko

Quote:


> Originally Posted by *TommyGunn123*
> 
> Looky what PCCG has put up recently
> 
> 
> 
> 
> 
> 
> 
> Should be bloody good witha giant (what looks lke 2.5 slot) heatsink and 3 fans, and that price!
> 
> 
> 
> http://www.pccasegear.com/index.php?main_page=product_info&cPath=416&products_id=26799&zenid=dc5e9e51b8ae549766454c8078029dd5


That's an impressive price for a card with a cooler that size on it. Makes the DCU II look like a bad joke.


----------



## Matt-Matt

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Thats an impressive price for a card with a cooler that size on it, Makes the DCU II look like a bad joke


Wow.. makes me wish I'd held out for a non-ref (with ref PCB) and gotten one of those instead of this fully-ref XFX card. Kind of wanted a shot at getting an unlockable one.


----------



## Duvar

What do you guys think about this: http://www.caseking.de/shop/catalog/Grafikkarten/VGA-Kuehler-Heatpipes/VGA-Kuehler-Arctic-Cooling/Arctic-Accelero-Xtreme-IV-VGA-Kuehler::26095.html

The PCS+ is a good card too; here in Germany some guys got it, the temps are fine (Tri-X level), and it isn't too loud.
These guys are very happy with the PCS+.
Actually everybody here is now recommending the Tri-X or the PCS+, no Gigabyte/ASUS/MSI, only those two, because they are the best custom solutions. Sorry for my bad English.


----------



## psyside

Can anyone post a HWiNFO screenshot during mining or load? I'm not sure if my GPU voltage in/out readings are as they should be.


----------



## ottoore

Quote:


> Originally Posted by *pompss*
> 
> yeah not worth wait for the r9 290 comes available at 450 if the 780 classy get better or equal even against the 290x.
> I know
> Price in Europa are too high instead USA.
> I was living in Italy.Its not possible have a decent live there.
> thanks god i get lucky and moved here in Florida Miami.
> Love Italy but i Will never go back. Europa Is going down and Italy more down then rest of euro.


I'm Italian and I can say you're completely right.


----------



## kizwan

Quote:


> Originally Posted by *flamin9_t00l*
> 
> @kizwan I noticed you have both your rads intake, is that to keep dust down or something else as I would imagine it'll get toasty in there with the side panel on with only 1 rear exhaust, supr rig tho.


Both are intake because I want both radiators to get fresh, cooler air. In water cooling, a lower water temp (or lower delta T) is what you want, as it lets your watercooled components achieve good thermal performance. I saw GPU temps improve by around 5-6 Celsius when moving the top radiator from exhaust to intake.

Regarding dust, I have a dust filter at the front but not at the top. The front filter works really well: when I cleaned my top radiator one or two weeks ago, I didn't need to clean the front radiator at all. I really should get a dust filter for the top radiator, but dust in the case is not really serious; I can clean my case every two or three weeks.

The ambient inside the case isn't even toasty. I didn't notice any change in temps, maybe just a couple of degrees higher.


----------



## INCREDIBLEHULK

edit


----------



## Besty007

I've got the black-screening issue with my MSI Gaming 290X; I've had it for a little over a week. I'm running stock clocks and it just happens randomly, sometimes when loading a game, sometimes when loading something that requires admin rights. Any help for this noob? :/


----------



## BradleyW

Quote:


> Originally Posted by *Besty007*
> 
> I've got the black screening issue with my MSI gaming 290x, had it for a little over a week. I'm running stock clocks and it just happens randomly, sometimes when loading a game sometimes when loading something that requires admin rights. Any help for this noob? :/


Try 13.12 WHQL drivers if you've not tried already.


----------



## Besty007

Quote:


> Originally Posted by *BradleyW*
> 
> Try 13.12 WHQL drivers if you've not tried already.


Thanks, will report back with results when I can.


----------



## Iniura

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I have them and afaik they are a Ref design PCB but i can't be 100% sure without taking the coolers off and i'm still waiting for an answer from XFX about that.
> 
> And i think I'm the only person in this club with the DD cards so the only thing you could do is compare High Res pic's of the PCB's.
> 
> EK's website has them


Hey, I am not talking about the XFX DD model; I am talking about the XFX Black OC Edition (XFX Radeon R9 290 Black Edition 4GB GDDR5, R9-290A-ENBC).

According to the EK Cooling Configurator, an EK-FC R9-290X should fit, but I would like confirmation from someone who has already fitted this block to his card.


----------



## flamin9_t00l

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I could run AC4 with a single 290x maxed out, i'll run it now with CF on and see what happens, will report back soon
> 
> EDIT, My AC4 install file has vanished for some reason now.........Sorry, not going to be able to run that test any time soon, gimme a few hours


I could run it quite well with one 290 pretty much maxed (except I turned down god rays and shadows, iirc), although not as smooth as I would have liked. It has to feel butter smooth for me or I'm not happy lol. With Crossfire, certain parts were butter smooth, but there were many stuttery parts, screen tearing etc., and V-sync screwed everything.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> lucky i actually buy Retail games


I got retail Crysis 3 as an Xmas pressie, and when I tried to install it through Origin the damn thing told me the key used was not what was on the disc and I had to download the game off the net. It was still Crysis 3, but that's not what was on the disc according to them... so the retail disc for that is useless (if you try to reinstall it using the disc it just says the key has been used and it won't install).


----------



## flamin9_t00l

Quote:


> Originally Posted by *kizwan*
> 
> Both intake, because I want both radiators to get fresh, cooler air. In water cooling a lower water temp (or lower delta T) is what you want, and this lets your watercooled components get good thermal performance. I saw GPU temps improve by around 5-6 Celsius when moving the top radiator from exhaust to intake.
> 
> Regarding dust, I have a dust filter at the front but not at the top. The dust filter on the front has worked really well: when I cleaned my top radiator a week or two ago, I didn't need to clean the front radiator at all. I really should get a dust filter for the top radiator, but dust in the case is not really serious. I clean my case every two or three weeks.
> 
> The ambient inside the case isn't even toasty. I didn't notice any changes in temp, though. Maybe just a couple degrees higher.


I think if I tried this my northbridge would cook itself during gaming sessions. I've already put a mini fan on it to control temps, which works, but the X58 chip gets hot, especially under the Sabertooth heatsink. If I thought my motherboard VRM heatsink and northbridge wouldn't be affected I would definitely be doing this, but I have some canny volts going through the CPU and the northbridge is hot enough as it is.


----------



## Sgt Bilko

Quote:


> Originally Posted by *flamin9_t00l*
> 
> I could run it quite well with 1 290 pretty much maxed (except I turned down god rays and shadows iirc) although not as smooth as I would have liked. It has to feel butter smooth for me or I'm not happy lol. With crossfire certain parts were butter smooth but there was many stutter parts and screen tearing etc and v-sync screwed everything.
> I got retail Crysis 3 for a xmas pressie and when I tried to install it through Origin the damn thing told me the key used was not what was on the disk and I had to download the game off the net, It was still Crysis 3 but that's not what was on the disk according to them... so the retail disk for that is useless (if you try to reinstall it using the disk it just says the key has been used and it won't install).


Well, Uplay being Uplay, it won't let me install AC4 atm for some reason; I'll give it a go in the morning with a clear head.

But on the retail disc install thing, I really don't have any issues at all like that: pop the disc in... hit install and wait.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Iniura*
> 
> Hey I am not talking about the XFX DD model I am talking about the XFX Black OC Edition (XFX XFX Radeon R9 290 Black Edition 4GB GDDR5 (R9-290A-ENBC).
> 
> According the EK coolingconfigurator an EK-FC R9-290X should fit, bit I would like confirmation from someone who already fitted this block to his card.


The Black OC Edition is a ref card: ref cooler, just different clocks.

So yes, a block will fit it.

Here is the Black OC Edition: http://www.techpowerup.com/gpudb/b2672/xfx-r9-290-black-edition.html


----------



## kizwan

Quote:


> Originally Posted by *flamin9_t00l*
> 
> I think if I tried this my northbridge would cook itself over gaming sessions. Allready put a mini fan on it to control temps which works but the X58 chip gets hot especially on Sabertooth heatsink. If I thought my MB VRM's heatsink and northbridge wouldn't be affected I would definitely be doing this but I have some canny volts going through the CPU and the northbridge is hot enough as it is.


Honestly, you're overestimating that. It's not going to cook the X58 chipset. I know the X58 chipset runs hot, but having intake instead of exhaust will not cook it. BTW, what is the X58 temperature when playing games?


----------



## axiumone

Quote:


> Originally Posted by *axiumone*
> 
> I have two questions.
> 
> First is - I used to run some 7970 cards in eyefinity landscape and I loved the ease of use compared to my current 780 surround set up. I'm looking to possibly go for a 290x trifire setup.
> 
> However, I have gone from landscape to portrait now and I'm appreciating nvidias bezel correction in that config. Do the current catalyst drivers have support for bezel correction/compensation in portrait or portrait flipped modes? This is all in win 8.1.
> 
> Second is - where are the 290x Lightnings? Seems like they were shown off a month ago... but there is absolutely no news.


No one knows either answer?


----------



## Redvineal

Quote:


> Originally Posted by *axiumone*
> 
> No one knows either answer?


Can't help you with the Eyefinity question, but I know MSI has been really hush about the R9 290 Lightning since CES. All the news I can find is related to it being spotted at CES. Most users here are calling it vaporware at this point. I'm still a believer, but I haven't seen any recent mention of it, especially release info!


----------



## phallacy

Quote:


> Originally Posted by *Tokkan*
> 
> Disable ULPS?


Yes, that's been disabled for a while now; I've even checked regedit multiple times. It's weird that I can't get it to show in either Trixx or Afterburner, but Afterburner is way worse and I use it only to monitor. I'm redoing my loop this weekend with some new parts and I'll see if a reseat fixes it.


----------



## Fahrenheit85

Okay, so I knew my card had coil whine before I put on my water block, but now that the stock fan is gone the coil whine is much more noticeable. It's a Gigabyte reference 290X from mid December. If I did try to get an RMA, should I tell them it's for coil whine? If I put the stock cooler back on they won't notice I had it open, except that the thermal pads/paste are different. Also, any idea on the turnaround time?


----------



## Mr357

Quote:


> Originally Posted by *Fahrenheit85*
> 
> Okay so while i knew my card had coil whine before I put on my water block but now that the stock fan is gone the coil whine is much more noticeable. Now its a gigabyte reference 290x from mid December. If I did try and get an RMA should I tell them its for coil whine? If I put back on the stock cooler they won't notice I had it open except that the thermal pads/paste is different. Also any idea on the turn around on it?


Definitely don't let them know that you had it blocked. I do believe they'll accept RMAs for coil whine, but there's no guarantee that the replacement will be any better. If your card can do 1200+ core on a stock BIOS, I would keep it and just try to deal with the noise. Locking your frame rate can help a lot in reducing whine.

My card has some pretty bad whine too, but I simply got used to the subtle ambient noise, and locked all of my games to 60 FPS.


----------



## BakerMan1971

Well, it's arrived, so here's a couple of photos for proof. No messing with the cooling or anything, just unceremoniously shoved into my system















Finally gaming on ultra. Now to go update the old sig rig









UPDATED:



sorry that looks awful, here is a closeup of gpuz


----------



## Arizonian

Quote:


> Originally Posted by *BakerMan1971*
> 
> Well it's arrived, so here's a couple of photos for proof, no messing with the cooing or anything just unceremoniously shoved in my system
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Finally gaming on ultra. now to go update the old sig rig


Congrats - add a GPU-Z validation link with OCN name showing to your post for proof if you would and you've been added to the roster.









http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/15930#post_21737112


----------



## BakerMan1971

Done & Cheers










sorry that looks awful, here is a closeup of gpuz


----------



## pkrexer

Finally under water and loving it







45c under full load and silence, ahhhh.

So now that I'm keeping it cool, is it normal for games to just crash if it's clocked too high, versus getting screen artifacts like I would on air? I can definitely clock it higher now, but I notice it'll just lock up if I happen to push it too high. It does make it easier to know I have a stable clock.

Validation:

http://www.techpowerup.com/gpuz/7ydq9/

Asus 290x
Cooling: XSPC Razor full water block


----------



## BradleyW

I'd like to ask some questions about the GIGABYTE WindForce R9 290X:

1. What are those chips highlighted?
2. How can that black frame be removed from the PCB?
3. Is this card suitable for water cooling, or are some components on the PCB reliant on air cooling (like those chips highlighted in red)?
4. Why are those closed caps being cooled? They are not cooled on other 290Xs!

Thank you.


----------



## kizwan

Quote:


> Originally Posted by *BradleyW*
> 
> I'd like to ask some questions about the GIGABYTE WindForce R9 290X:
> 
> 1. What are those chips highlighted?
> 2. How can that black frame be removed from the PCB?
> 3. Is this card suitable for water cooling? (or are some components on the PCB reliant on air cooling? (Like those chips highlighted in red)?
> 4. Why are those closed caps being cooled? They are not cooled on other 290x's!
> 
> Thank you.


If you look carefully, the ones highlighted in red are not being cooled at all. The two thermal pads were for VRM1 & the chokes (the grey rectangular chips). Check the pictures in *[this review]* for reference.

The reference PCB doesn't have those highlighted chips, but it does have capacitors there, next to the chokes.



2. I can see a few screws holding the frame to the board. You just need to remove the screws, I think.

3. Components changed, but the EK coolingconfigurator tells me you can use the EK R9 290X water block.


----------



## ebduncan

So, am I limited to just +100mV for voltage control using MSI Afterburner? I'm running at 1200 core / 1600 memory right now. I know I can get some more out of the core if I have more voltage.

Here is a 3DMark Fire Strike Extreme run; not quite sure why the displayed clocks are shown as stock right now.

http://www.3dmark.com/fs/1674942


----------



## BradleyW

Quote:


> Originally Posted by *kizwan*
> 
> If you look carefully, the ones highlighted with red are not being cool at all. The two thermal pads was for VRM1 & chokes (the grey rectangle chips). Check pictures in *[this review]* for reference.
> 
> Reference PCB don't have those highlighted chips but reference PCB have capacitors there, next to the chokes.
> 
> 
> 
> 2. I can see few screws holding the frame to the board. Just need to remove the screws I think.
> 
> 3. Components changed but EK coolingconfigurator tell me you can use EK R9 290X water block.


This is very helpful, thank you.
Edit: I take it those sealed boxes next to the grey chips are the chokes, which don't really require cooling? (EK FC blocks have a cutout where they would be.)
Also, those chips that are not being cooled are still sort of cooled by the airflow from the fans. I think I touched on this in my first post about this matter. Hence, using a water block would leave these particular chips without any cooling at all.


----------



## rdr09

Quote:


> Originally Posted by *Red1776*
> 
> Still struggling, but has begun a new drug therapy, so we are hopeful..thanks
> 
> 
> 
> 
> 
> 
> 
> 
> Ordering the Heatkillers as we speak!


glad to hear that Red and . . . 4? Rebuild Log.


----------



## kizwan

Quote:


> Originally Posted by *BradleyW*
> 
> This is very helpful, thank you.


You're welcome!

The reference cooler doesn't cool the chokes, but the EK backplate does, at the back of the PCB (using the thermal pad included with the backplate). However, the EK backplate is not compatible with the Windforce 290X card. As far as I know the chokes don't need cooling, though.


----------



## BradleyW

Quote:


> Originally Posted by *kizwan*
> 
> You're welcome!
> 
> Reference card doesn't cooled the chokes but EK backplate does, at the back of the PCB (using thermal pad included with the backplate).


Those chips highlighted in red... surely they will be cooled by the WindForce fans' airflow? The EK block will provide no cooling for these chips. The question is, what are they, and do they require any form of cooling?
Thank you for your kind support.


----------



## kizwan

Quote:


> Originally Posted by *BradleyW*
> 
> Those chips highlighted in red....surely they will be cooled by the wind force fans airflow? The EK block will provide no cooling for these chips. Question is, what are they and do they require any form of cooling?
> Thank you for your kind support.


If it needs any cooling at all, just airflow is enough, I think. Basically it doesn't need a heatsink to run cool. I don't know what those chips are for.

BTW, the EK backplate is not compatible with the Windforce card; only the EK water block is. The chokes (the grey rectangular chips) don't need cooling as far as I know.


----------



## BradleyW

Quote:


> Originally Posted by *kizwan*
> 
> If it need any cooling at all, just air flow is enough I think. Basically it doesn't need heatsink to run cool. I don't know what are those chips for.
> 
> BTW, EK backplate is not compatible with Windforce card, only EK water block compatible. Chokes (the grey rectangle chips) don't need cooling as far as I know.


So what are those sealed boxes near the little grey choke chips? Sealed coils? Do they require cooling? EK have a cut out where those would be.

Edit: Here is what would be cooled when the EK FC R9 290X WC block is installed.
Blue = cooled
Red = not cooled


----------



## Tobiman

Quote:


> Originally Posted by *TommyGunn123*
> 
> Looky what PCCG has put up recently
> 
> 
> 
> 
> 
> 
> 
> Should be bloody good with a giant (what looks like a 2.5-slot) heatsink and 3 fans, and that price!
> 
> 
> 
> http://www.pccasegear.com/index.php?main_page=product_info&cPath=416&products_id=26799&zenid=dc5e9e51b8ae549766454c8078029dd5


Mine just shipped today from NCIX. I will update thread when I get it sometime next week.


----------



## iCrap

Can the 290 do 5x1 Eyefinity? I don't see how it could going by the ports on the back of it.. but i cannot find a definitive answer.


----------



## kizwan

Quote:


> Originally Posted by *BradleyW*
> 
> So what are those sealed boxes near the little grey choke chips? Sealed coils? Do they require cooling? EK have a cut out where those would be.


If you're referring to the "chips" to the right of the chokes, they're VRM1, which needs cooling. If you're referring to the ones on the left, no, I don't think so, because they also aren't attached to the Windforce cooler with a thermal pad or TIM.
Quote:


> Originally Posted by *BradleyW*
> 
> Edit: Here is what would be cooled when the EK FC R9 290X WC block is installed.
> Blue = cooled
> Red = not cooled


Yes, that is correct.


----------



## tsm106

They just replaced the standard caps with surface-mount caps.


----------



## devilhead

Damn, I had some fun yesterday. I forgot to switch on the fans on my rads and left my 4x 290s mining for 15 minutes; after that I could already smell some weird stuff








So in my mining rig room there was a lot of Mayhems liquid on the floor







Even the 2x D5 pumps had fallen out of the reservoir







And I was lucky that the liquid didn't touch my mining rig, because the res and pumps were outside the rig







:thumb: The 4x 290s were at 94C


----------



## iPDrop

Can someone help me out here? It's been a while since I played BF4, and I just got on today after installing the new 13.14 patch with the Mantle thing. I forgot what kind of fps I was getting before, but now with one R9 290 I'm getting a mere 30fps on ultra settings at 2560x1440, and only barely 60fps on low settings. Is this right? It seems a little low to me.


----------



## Tobiman

Quote:


> Originally Posted by *devilhead*
> 
> damn i had some fun yesterday, forgot to switch on fans on my rads, and left mining my 4x290's 15 minutes, after that i can smell already some weird stuff
> 
> 
> 
> 
> 
> 
> 
> 
> so at my mining rig room, was a lot mayhems liquid on the floor
> 
> 
> 
> 
> 
> 
> 
> even 2xD5 pumps have felt out from reservoir
> 
> 
> 
> 
> 
> 
> 
> 
> and i was lucky that the liquid didn't touched my mining rig, because the res and pumps was outside rig
> 
> 
> 
> 
> 
> 
> 
> :thumb:4x 290's was at 94C










Damn, jafar must have got your back. LoL


----------



## Tobiman

Quote:


> Originally Posted by *iPDrop*
> 
> Can some one help me out here its been a while since I played BF4 and I just got on today after installing the new 13.14 patch with mantle thing. I forgot what kinds of fps I was getting before but now with one R9 290 I'm getting a mere 30fps in ultra settings 2560x1440, and only barely 60fps on low settings. Is this right? It seems a little low for me.


Do you have 14.1 installed? I'd use DDU to uninstall drivers and reinstall them.


----------



## VSG

Relevant thread and info on the Powercolor PCS cards: Link

In short, the PCS is excellent for gaming but not for mining- at least not without a BIOS fix.


----------



## pkrexer

Anyone else getting odd performance drops with the new drivers? Sometimes when I start playing BF4 my frame rates will almost be cut in half, like it's being throttled (but it's not, because I'm monitoring my clocks with AB). If I restart my computer, performance is back to normal.


----------



## AddictedGamer93

Quote:


> Originally Posted by *pkrexer*
> 
> Anyone else getting odd performance drops with the new drivers? Some times when I start playing BF4 my frame-rates will almost be cut in half, like its being throttled (but its not, because I'm monitoring my clocks with AB). If I restart my computer, performance is back to normal.


Yep, not just in BF4 either.


----------



## HOMECINEMA-PC

Blocked the 290's added asus 290x bios's and some tlc.........

HOMECINEMA-PC [email protected]@2436 R9 290 CF on wasser @ 1265 @1500 *27149*
Asus 290x PT1T bios .....











http://www.3dmark.com/3dm11/7937211

Broke that 27k









HOMECINEMA-PC [email protected]@2436 CF R9 290 on water [email protected] *10759*











http://www.3dmark.com/fs/1674523

Things can only get better....


----------



## vieuxchnock

*I have ordered my waterblock for my 290

http://www.servimg.com/image_preview.php?i=463&u=17159996

Coming in 2 days. LOL*


----------



## Besty007

The driver change fixed the issues I was having, damned beta drivers. Thanks BradleyW.


----------



## jamaican voodoo

here are some pics of the XFX DD with EK water blocks
it's a reference PCB... flashed the cards with the Tri-X 290 BIOS
running nicely..


----------



## Asrock Extreme7




----------



## Asrock Extreme7

Quote:


> Originally Posted by *Asrock Extreme7*


get some nuts


----------



## HOMECINEMA-PC

Here's a few pics of my 290 C/F setup on water using XSPC blocks


----------



## Thorteris

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Heres a few pics of my 290 C/F setup on water using XSPC blocks
> 
> 
> Spoiler: Warning: Spoiler!










I wish I could do water.


----------



## Arizonian

Quote:


> Originally Posted by *jamaican voodoo*
> 
> here some pics of the xfx dd with ek water-blocks
> it's reference pcb....flash the cards with tri-x 290 bios
> running nicely..
> 
> 
> Spoiler: Warning: Spoiler!


Looks great. Nice work.









Quote:


> Originally Posted by *Asrock Extreme7*
> 
> 
> 
> Spoiler: Warning: Spoiler!


*Distributed Computing* section you'll find all your nuts.

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Heres a few pics of my 290 C/F setup on water using XSPC blocks
> 
> 
> Spoiler: Warning: Spoiler!


I had you down for one - updated to two under water - nice.


----------



## reedy777

Quote:


> Originally Posted by *instream*
> 
> Ok, thanks for your help!
> 
> 
> 
> 
> 
> 
> 
> 
> I was going to put an after market cooler on it too, but since I'm not 100% sure that the card is not broken I don't dare... :-/
> If it's not too much to ask, could you run the game a bit more?
> 
> 
> 
> 
> 
> 
> 
> Sometimes I have been able to play Mount and Blade for an hour or so, so it could be that you are just lucky right now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is that 13.11 whql driver the same as beta9.2 or something else?


I have the same problem.
All was fine, then I updated the drivers to the latest beta, clocked in at 1150 and ran Firestrike so I could see how it fares against the Tis, and in an instant it shut down and restarted. I thought it was a power surge due to the wet weather. Then I ran some benches and performance was halved. So I uninstalled the beta drivers and reinstalled my second card and the 9.2 drivers (I have Xfire, so I need these to avoid the whole vsync audio-chopping thing). Anyhow, I went to run a bench just to see if all was smooth again, and while opening GPU Tweak it happened again. Thing is, I'd had it for a week with no issues; the only difference is I installed MSI AB while GPU Tweak was still installed but not running. This might be the cause, I hope.
My setup is:
Win 7 64
13.11 9.2 beta
2600K
Asus ME V BIOS 1903
1200W Dark Power Pro (in single rail mode)
Custom water loop on CPU; GPUs waiting for blocks
2 x 290X
But like I said, this problem only started when I updated the drivers and had MSI AB and GPU Tweak installed at the same time (but not both running). I will uninstall AB and see how I get on; this might narrow it down.


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Blocked the 290's added asus 290x bios's and some tlc.........
> 
> HOMECINEMA-PC [email protected]@2436 R9 290 CF on wasser @ 1265 @1500 *27149*
> Asus 290x PT1T bios .....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/7937211
> 
> Broke that 27k
> 
> 
> 
> 
> 
> 
> 
> 
> 
> HOMECINEMA-PC [email protected]@2436 CF R9 290 on water [email protected] *10759*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/1674523
> 
> Things can only get better....


Is that 1.493V? Nice!







So far I only go up to 1.432V. Can my card overclock better with Asus or Tri-X BIOS?


----------



## AddictedGamer93

Quote:


> Originally Posted by *kizwan*
> 
> Is that 1.493V? Nice!
> 
> 
> 
> 
> 
> 
> 
> So far I only go up to 1.432V. Can my card overclock better with Asus or Tri-X BIOS?


Would that even be safe? My voltages are in the 1.29-1.33 range, and it makes me paranoid.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Thorteris*
> 
> 
> 
> 
> 
> 
> 
> 
> I wish I could do water.


It's the only way to tame these beasts' temps, get more stable o/clocks, and have silence








Quote:


> Originally Posted by *Arizonian*
> 
> Looks great. Nice work.
> 
> 
> 
> 
> 
> 
> 
> 
> *Distributed Computing* section you'll find all your nuts.
> I had you down for one - updated to two under water - nice.


Thanks mate








I had a 3-slot connector for it but it's a 4-slot space








So I pulled out a couple of 1/2"-5/8" EK comp fittings and a very small piece of hose to suit








I've got a Giga 290 with an ASIC of 69% and a Sapphy with an ASIC of 69.8%. Both are locked 290s. And they clock like Titans, 780s, 780 Tis and KPs LoooooL









Quote:


> Originally Posted by *kizwan*
> 
> Is that 1.493V? Nice!
> 
> 
> 
> 
> 
> 
> 
> So far I only go up to 1.432V. Can my card overclock better with Asus or Tri-X BIOS?


1.493v minus LLC droop = about 1.45 - 1.46v. This is the Asus 290X PT1T BIOS on switch 2, and the Asus 290X BIOS with 1.41v set in GPU Tweak on switch 1.
Haven't read up on the Tri-X BIOS just yet, but the PT1T BIOS will do the trick









Quote:


> Originally Posted by *AddictedGamer93*
> 
> Would that even be safe? My voltages are in the 1.29-1.33 range, and it makes me paranoid.


Waterblocks are the only way to fly with these


----------



## AddictedGamer93

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Waterblocks are the only way to fly with these


I am water cooled, btw. EK block.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *AddictedGamer93*
> 
> I am water cooled, btw. EK block.


They can handle the volts, it's just the drivers that leave a lot to be desired


----------



## cam51037

Does anybody here have a 290(X) Tri-X edition?

If so, have you experienced fan rattling with it before?


----------



## Paul17041993

Quote:


> Originally Posted by *BradleyW*
> 
> I'd like to ask some questions about the GIGABYTE WindForce R9 290X:
> 
> 1. What are those chips highlighted?
> 2. How can that black frame be removed from the PCB?
> 3. Is this card suitable for water cooling? (or are some components on the PCB reliant on air cooling? (Like those chips highlighted in red)?
> 4. Why are those closed caps being cooled? They are not cooled on other 290x's!
> 
> Thank you.


Quote:


> Originally Posted by *kizwan*
> 
> If you look carefully, the ones highlighted with red are not being cool at all. The two thermal pads was for VRM1 & chokes (the grey rectangle chips). Check pictures in *[this review]* for reference.
> 
> Reference PCB don't have those highlighted chips but reference PCB have capacitors there, next to the chokes.
> 
> 
> 
> 2. I can see few screws holding the frame to the board. Just need to remove the screws I think.
> 
> 3. Components changed but EK coolingconfigurator tell me you can use EK R9 290X water block.


OK, to straighten out any confusion: the pink cylinders are your solid capacitors (the Gigabyte board uses low-profile ones that look similar to diodes), the grey "boxes" are your choke coils, and the smaller chips are your regulators, known here as VRMs.


----------



## iCrap

Quote:


> Originally Posted by *iCrap*
> 
> Can the 290 do 5x1 Eyefinity? I don't see how it could going by the ports on the back of it.. but i cannot find a definitive answer.


Bumping my previous question. Does anyone know?


----------



## tsm106

Quote:


> Originally Posted by *iCrap*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iCrap*
> 
> Can the 290 do 5x1 Eyefinity? I don't see how it could going by the ports on the back of it.. but i cannot find a definitive answer.
> 
> 
> 
> Bumping my previous question. Does anyone know?
Click to expand...

?

Get an MST DisplayPort 1.2 hub and you can connect three 1080p panels at 60Hz to it. There's your 5x1. Btw, you probably want to situate the 3 panels off the hub as the center three.


----------



## iPDrop

Quote:


> Originally Posted by *Tobiman*
> 
> Do you have 14.1 installed? I'd use DDU to uninstall drivers and reinstall them.


Is Driver Fusion okay for deleting old drivers? That's what I've always used.


----------



## psyside

Quote:


> Originally Posted by *iPDrop*
> 
> is driver fusion okay to delete old drivers? That's what I've always used.


No, use DDU (google)


----------



## neurotix

Quote:


> Originally Posted by *cam51037*
> 
> Does anybody here have a 290(X) Tri-X edition?
> 
> If so, have you experienced fan rattling with it before?


Yes. I do have slight fan rattling with the fans at a higher RPM (3000). It almost sounds like coil whine but I know it's not coil whine, because the only time my card makes coil whine is when exiting the Valley benchmark.

This was happening when I was mining with 70% fans. My case uses thumbscrews for the VGA cards. I took my rig apart last night to do some cable management. I made sure to actually tighten the thumbscrews as tight as I could using a screwdriver (they have slits for a screwdriver). So far, I haven't noticed any rattling since tightening the screws down hard. Not sure if it will come back or not.


----------



## cam51037

Quote:


> Originally Posted by *neurotix*
> 
> Yes. I do have slight fan rattling with the fans at a higher RPM (3000). It almost sounds like coil whine but I know it's not coil whine, because the only time my card makes coil whine is when exiting the Valley benchmark.
> 
> This was happening when I was mining with 70% fans. My case uses thumbscrews for the VGA cards. I took my rig apart last night to do some cable management. I made sure to actually tighten the thumbscrews as tight as I could using a screwdriver (they have slits for a screwdriver). So far, I haven't noticed any rattling since tightening the screws down hard. Not sure if it will come back or not.


I had the same problem, a loud rattling at random fan speeds. WTCR (the place I bought it from and am RMAing with) can't seem to find the problem I'm talking about, and the noise was LOUD. I sure hope that if they don't send me a new card, tightening the screws will fix the issue; I'll be so mad if I receive it back and can re-create the problem within minutes.


----------



## Red1776

Quote:


> Originally Posted by *iCrap*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iCrap*
> 
> Can the 290 do 5x1 Eyefinity? I don't see how it could going by the ports on the back of it.. but i cannot find a definitive answer.
> 
> 
> 
> Bumping my previous question. Does anyone know?

Yes, but you may need a DisplayPort 1.2/1.4 adapter.
Here is some more info on that.

http://www.amd.com/us/products/technologies/amd-eyefinity-technology/for-consumers/Pages/what-is-eyefinity.aspx

http://blogs.windows.com/windows/b/extremewindows/archive/2014/01/23/ces2014-amd-showcases-new-technology-for-2014.aspx


----------



## the9quad

Quote:


> Originally Posted by *iPDrop*
> 
> is driver fusion okay to delete old drivers? That's what I've always used.


DDU and Driver Fusion are comparable; DDU is free, though. You know how it is when there are two or more of something: one person will swear by one, and you'll be a noob if you use anything else, followed by much forum ranting and raving over why one is better than the other. I've used both, and both work fine. You do have to be careful, though; you can end up deleting stuff you need.


----------



## Arizonian

Quote:


> Originally Posted by *pkrexer*
> 
> Finally under water and loving it
> 
> 
> 
> 
> 
> 
> 
> 45c under full load and silence, ahhhh.
> 
> So now that I'm keeping it cooled, is it normal to just have games crash if its clocked to high vs. getting screen artifacts like I would on air? I can definitely clock it higher now, but I notice it'll just lock up now if I happen to push it to high. It does make it easier to know I have a stable clock.
> 
> Validation:
> 
> http://www.techpowerup.com/gpuz/7ydq9/
> 
> Asus 290x
> Cooling: XSPC Razor full water block
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Matt-Matt

Yo,

Just wondering if all 290X blocks will fit all reference 290s? I got stuffed over last time with 7950s and 7970 blocks, so I just want to be sure..


----------



## Bartouille

I just finished installing my AP-30s (4250rpm) on my MK-26 290X. MAN!! I've been running a couple of Valley loops: 54c max core, 41c max VRM1 and 32c max VRM2!! Stock speed, just testing. The card also has less vdroop when kept cool, which is great! Those VRM temps are like watercooling lol. So happy.







Only downside is that the fans run at full speed all the time because I have them on a molex... molex to 3-pin, does that exist? But even so, I'm not sure my fan controller could handle it; this fan requires 1.36A to start, so around 20w. Got some pics coming too.
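For what it's worth, the "around 20w" figure checks out as a conservative round-up. A quick sketch of the arithmetic (assuming the standard 12V molex rail and the 1.36A start-up current quoted above):

```python
# Back-of-envelope start-up power for the fan described above.
# Assumed values: standard 12 V molex rail, 1.36 A quoted start-up current.
fan_voltage = 12.0      # volts (molex 12 V rail)
startup_current = 1.36  # amps (quoted start-up draw)

startup_power = fan_voltage * startup_current  # P = V * I
print(f"Start-up draw: {startup_power:.1f} W")
```

So roughly 16W at spin-up; budgeting ~20W per channel leaves a sensible margin when sizing a fan controller.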


----------



## chiknnwatrmln

Question, maybe somebody experience like TSM can answer.

What is the max voltage I should run while benching? I'm currently at 1260/1625 with +220mV, this gives me 1.31v actual. I'm artifacting like crazy so I need a bit more, and I think I should be good up to 1.35v but somebody please correct me if I'm wrong.

Temps are below 40 core and 30 VRMs if I have my rad fans on.

Fun fact: according to GPU-Z my 290 is drawing 331W, and my entire PC is using 525W at the plug...
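Those two numbers are roughly consistent, for what it's worth: wall power includes PSU conversion losses, so assuming something like 90% efficiency (my assumption, not a measured figure) the DC side and the rest-of-system draw work out like this:

```python
# Back-of-envelope: wall draw vs. GPU draw, assuming a PSU efficiency figure.
wall_watts = 525.0   # measured at the plug
gpu_watts = 331.0    # GPU-Z reading for the 290
efficiency = 0.90    # assumed PSU efficiency at this load (not measured)

dc_watts = wall_watts * efficiency       # power the PSU actually delivers
rest_of_system = dc_watts - gpu_watts    # CPU, board, drives, pump, fans, etc.
print(f"DC side: {dc_watts:.0f} W, rest of system: {rest_of_system:.0f} W")
```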


----------



## rdr09

Quote:


> Originally Posted by *Matt-Matt*
> 
> Yo,
> 
> Just wondering if all 290x blocks will fit all reference 290's? I got stuffed over last time with 7950's and 7970 blocks, I just want to be sure..


The 290 PRO and 290X use the same waterblock.

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Question, maybe somebody experienced like TSM can answer.
> 
> What is the max voltage I should run while benching? I'm currently at 1260/1625 with +220mV, this gives me 1.31v actual. I'm artifacting like crazy so I need a bit more, and I think I should be good up to 1.35v but somebody please correct me if I'm wrong.
> 
> Temps are below 40 core and 30 VRMs if I have my rad fans on.


+220mV as in a +220 offset? I used Trixx and applied a +200 offset for benching, and GPU-Z showed 1.42V at load.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *rdr09*
> 
> the 290 PRO and 290X use the same waterblock.
> +220mV as in +220 offset? i used trixx and applied +200 offset for benching and gpuz showed 1.42v at load.


Yes, I tried +231 mV but still artifacting.

Each card has a different voltage after droop depending on ASIC; my card has a great ASIC for wc'ing and it really benefits from a block... I went from a 1200MHz max OC to 1260. Before, on air, I couldn't hold 1200 for more than a minute due to temps, but now I don't have a problem.

At stock clocks my VDDC is about 1.14v actual.


----------



## rdr09

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Yes, I tried +231 mV but still artifacting.
> 
> Each card has different voltage after droop depending on ASIC, my card has great ASIC for wc'ing and it really benefits from a block... I went from 1200MHz max OC to 1260, before while on air I couldn't hold 1200 for more than a minute due to temps but now I don't have a problem.


If it is only showing 1.31V and you are on water, then you can go higher. I'd say 1.4V max, and make sure you monitor the core and the VRM temps even on water.

you flashed your cards?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *rdr09*
> 
> if it is only showing 1.31v and you are in water, then you can go higher. i'd say 1.4v max and make sure you monitor the core and the vrm temps even in water.
> 
> you flashed your cards?


Sweet. My card is currently on an MSI 290x BIOS (thought it would help when I was having issues with AB, I'm too lazy to flash it back) but does not unlock to a 290x. I always keep an eye on temps, don't worry lol.

My card whines like a 15 year old girl when I bench... These scores are amazing though. 15.6k so far for 3dmark11.


----------



## rdr09

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Sweet. My card is currently on an MSI 290x BIOS (thought it would help when I was having issues with AB, I'm too lazy to flash it back) but does not unlock to a 290x. I always keep an eye on temps, don't worry lol.
> 
> My card whines like a 15 year old girl when I bench... These scores are amazing though. 15.6k so far for 3dmark11.


I see artifacts for anything higher than 1300 on the core. I am using the original BIOS, and I use Trixx, which only gives +200. Heaven and Valley are more demanding, from my benching experience.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *rdr09*
> 
> i see artifacts for anything higher than 1300 on the core. i am using original bios. i use Trixx which only gives +200. heaven and valley are more demanding from my experience benching.


Nice, I don't currently have Valley and I'm too lazy to update Heaven..

Too bad I didn't get a better card; my clocks are pretty good, but with these temps, if I had gotten a bit luckier I could break 1300MHz. Oh well, I could have ended up with a dud OCer that can't break 1200.

Edit: +250mV causes my PC to instantly restart...


----------



## psyside

Quote:


> Originally Posted by *psyside*
> 
> Can anyone post HWinFO ss during mining or load? not sure if my GPU voltage in/out readings are as they should?


Anyone please ?


----------



## iPDrop

Quote:


> Originally Posted by *Besty007*
> 
> The driver change fixed the issues I was having, damned beta drivers. Thanks BradleyW.


What issues were you having, and which drivers did you change to and from? I've been getting sporadic performance ever since I installed the new 14.1 drivers


----------



## brazilianloser

My first gaming stretch with my 290s under water, and there's nothing like an almost dead silent build that keeps both cards in the low 50s after a full hour of Crysis 3 maxed out. Not OC'd atm, but will be again soon.


----------



## Arizonian

Quote:


> Originally Posted by *Redvineal*
> 
> @Arizonian, I've been a busy man over the last month and made several changes to my box. Please update me to 2 x MSI R9 290 GAMING cards, now under water!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And for those that have asked on previous pages, I can confirm the XSPC full cover block is compatible with this card model!


I missed this originally. Congrats. Some nice water blocked rigs as of late. Nice.


----------



## Sgt Bilko

Quote:


> Originally Posted by *jamaican voodoo*
> 
> here some pics of the xfx dd with ek water-blocks
> it's reference pcb....flash the cards with tri-x 290 bios
> running nicely..
> 
> 
> Spoiler: Warning: Spoiler!


Nice!

That's really good to know. A few people weren't sure if the DD cards were a reference-design PCB or not, and I guess this clears it up


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 1.493V minus LLC droop = about 1.45 - 1.46V. This is the Asus 290X PT1T BIOS on switch 2 and the Asus 290X BIOS with 1.41V on GPU Tweak on switch 1.
> Haven't read up on the Tri-X BIOS just yet, but the PT1T BIOS will do the trick


I see, BIOSes from the overclock club. Will try those. I'm also interested in trying the Tri-X BIOS.


----------



## tsm106

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Question, maybe somebody experienced like TSM can answer.
> 
> What is the max voltage I should run while benching? I'm currently at 1260/1625 with +220mV, this gives me 1.31v actual. I'm artifacting like crazy so I need a bit more, and I think I should be good up to 1.35v but somebody please correct me if I'm wrong.
> 
> Temps are below 40 core and 30 VRMs if I have my rad fans on.
> 
> Fun fact, according to GPUz my 290 is taking 331w, and my entire PC is using 525w at the plug...


I've pushed a lot more voltage than that before and the cards did not die.

There is no definitive max voltage with these cards because the limits can be removed by using either of the unlocked BIOSes, and then it comes down to how far you are willing to go. I have taken my cards past 1.5V loaded without blowing up lol. However, I would not recommend anyone venture that far down the rabbit hole; I was more or less testing the limits of the reference design. It's pretty stout imo. Also, at high voltages the software monitors start to lose their accuracy. Anyways, I would say stay under 1.40V loaded for most situations.

Here's some things I wrote down from my experience overclocking this silicon.

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/6120_40#post_21205990


----------



## Sazz

Arizonian, you can update me back to full watercooling again; I gave my AIO kit to my friend and went back to a full EK block.

Was worried about my temps at first since I am only using a single 240mm radiator (EK XT240) for both my CPU and GPU, coz I don't have the space to work with in my case (Bitfenix Prodigy), but surprisingly my temps are the same as in my previous build that my 290X was in, which used two 240mm radiators: under 45C. Used an EK LTX Supreme for the CPU and it's keeping it under 74C (3570K at 1.375V, 4.7GHz), which is surprisingly good.

Sorry for the crappy pic, camera phone xD


And is it just me, or with Beta Driver 14.1 do the core clocks not go down to 2D clocks?


----------



## kizwan

Quote:


> Originally Posted by *AddictedGamer93*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Is that 1.493V? Nice!
> 
> 
> 
> 
> 
> 
> 
> So far I only go up to 1.432V. Can my card overclock better with Asus or Tri-X BIOS?
> 
> 
> 
> Would that even be safe? My voltages are in the 1.29-1.33 range, and it makes me paranoid.
Click to expand...

It's called a _suicide run_ or _suicide benching_. (I prefer the former.) It's for short benching, not for 24/7 use. We don't know the voltage limit on these cards, so it's hard to say whether it's safe or not. A _suicide run_ is not for the fainthearted. I don't recommend going over 1.4V for 24/7; I might be willing to run below 1.45V 24/7 though. I've already done back-to-back benching @1.422 - 1.432V.
Quote:


> Originally Posted by *Sazz*
> 
> Arizonian you can update me back to full watercooling again, I gave my AIO kit to my friend and I went back to full EK block.
> 
> Was worried about my temps at first since I am only using a single 240mm radiator (EK XT240) on both my CPU and GPU coz I don't have the space to work with my case (Bitfenix Prodigy) but surprisingly enough my temps are the same as my previous build that my 290X was in that used two 240mm radiators which is under 45C. Used EK LTX Supreme for the CPU and its keeping it under 74C (3570k at 1.375v 4.7Ghz) which is surprisingly well.
> 
> Sorry for the crappy pic, camera phone xD
> 
> 
> And is it just me or w/ Beta Driver 14.1 the core clocks doesn't go down to 2D clocks.


That looks better. I didn't tell you before, but I honestly don't understand why you planned to use an AIO when downsizing to a smaller case. A proper full waterblock works just fine in a small form factor case.









It's not just you, but it doesn't happen frequently. In my case I have an easy trick to un-stick the core clock: disable & re-enable Crossfire. I don't know how to un-stick the core clock with one card though. Did you try restarting the computer or a full power cycle?


----------



## Arizonian

Quote:


> Originally Posted by *Sazz*
> 
> Arizonian you can update me back to full watercooling again, I gave my AIO kit to my friend and I went back to full EK block.
> 
> Was worried about my temps at first since I am only using a single 240mm radiator (EK XT240) on both my CPU and GPU coz I don't have the space to work with my case (Bitfenix Prodigy) but surprisingly enough my temps are the same as my previous build that my 290X was in that used two 240mm radiators which is under 45C. Used EK LTX Supreme for the CPU and its keeping it under 74C (3570k at 1.375v 4.7Ghz) which is surprisingly well.
> 
> Sorry for the crappy pic, camera phone xD
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> And is it just me or w/ Beta Driver 14.1 the core clocks doesn't go down to 2D clocks.


You got it bud - updated


----------



## Sazz

Quote:


> Originally Posted by *kizwan*
> 
> That looks better. I didn't tell you before, but I honestly don't understand why you planned to use an AIO when downsizing to a smaller case. A proper full waterblock works just fine in a small form factor case.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's not just you, but it doesn't happen frequently. In my case I have an easy trick to un-stick the core clock: disable & re-enable Crossfire. I don't know how to un-stick the core clock with one card though. Did you try restarting the computer or a full power cycle?


Nah, it only happened when I installed 14.1. Not really affecting anything, maybe besides slightly higher idle power consumption.

Figuring out how to lay out my loop was a bit of a challenge. I ran into problems with things not fitting (especially the tubing on the GPU); good thing I had some extra parts lying around from my previous builds.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *tsm106*
> 
> I've pushed a lot more voltage than that before and the cards did not die.
> 
> There is no definitive max voltage with these cards because the limits can be removed by using either of the unlocked bios' and well then it comes down to how far you are willing to go. I have taken my cards past 1.5v loaded without blowing up lol. However I would not recommend anyone venture that far down the rabbit hole. I was more or less testing the limits of the reference design. Its pretty stout imo. Also, at high voltages the software monitors start to lose their accuracy. Anyways, I would say stay under 1.40v loaded for most situations.
> 
> Here's some things I wrote down from my experience overclocking this silicon.
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/6120_40#post_21205990


Awesome, thanks for the info. I knew you had run crazy volts at some point.

I keep running into an issue where at high volts (+250mV or higher, about 1.33V under load for my card) I get instant PC reboots, not sure why. Also, if I try anything higher than 1250MHz, then halfway through a 3DMark11 run the clocks reset to stock... 1145MHz is fine though, even at much lower voltage.

PSU is a brand new Corsair AX860, so I don't think lack of power is the problem.


----------



## kizwan

Quote:


> Originally Posted by *Sazz*
> 
> Nah, *only happened when I installed 14.1*. Not really affecting me on anything maybe besides the slightly higher idle power consumption.


Yes, of course. It rarely happens though. In my case it does affect stability when overclocking. If I don't un-stick the core clock, it will be unstable when benching or gaming.


----------



## Sazz

Quote:


> Originally Posted by *kizwan*
> 
> Yes, forgot to mention this. It only happens with the 14.1 beta driver, and rarely at that. In my case it does affect stability when overclocking. If I didn't un-stick the core clock, it would be unstable when benching or gaming.


It's been like this ever since I installed 14.1, and it hasn't really affected my stability or anything.


----------



## psyside

Quote:


> Originally Posted by *tsm106*
> 
> I was more or less testing the limits of the reference design. Its pretty stout imo.


And to think, someone was saying the reference PCB was weak


----------



## Sgt Bilko

http://www.hardwareheaven.com/reviews/1933/pg11/sapphire-tri-x-r9-290x-graphics-card-review-power-temps-and-overclocking.html

Seems that XFX did do a decent job with the cooling, no vrm temp readings though


----------



## pounced

Wow I just checked Newegg and the sapphire 290x I bought the day it launched @ $583 is now $750........... Why on earth


----------



## Sazz

Quote:


> Originally Posted by *pounced*
> 
> Wow I just checked Newegg and the sapphire 290x I bought the day it launched @ $583 is now $750........... Why on earth


Simple answer: greed.


----------



## chiknnwatrmln

No, supply and demand.

LTC and other altcoins got hugely popular, AMD cards are good for mining, so demand went up.

Supply went down, and prices went up to compensate.

This screws over the average gamer who wants to buy AMD, but if/when mining dies out the market will be flooded with used AMD GPUs and they will hit an all time low...

Now that's not to say some e-tailers haven't been doing some price gouging. You can find near-MSRP prices if you look hard enough, but sites like Newegg are gonna try and screw you.

Hard to believe I got my 290 for $380 with BF4 shortly after release. I've seen the same card for about $575 on eBay at one point.


----------



## Sazz

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> No, supply and demand.
> 
> LTC and other altcoins got hugely popular, AMD cards are good for mining, so demand went up.
> 
> Supply went down, and prices went up to compensate.
> 
> This screws over the average gamer who wants to buy AMD, but if/when mining dies out the market will be flooded with used AMD GPUs and they will hit an all time low...
> 
> Now that's not to say some etailers have been doing some price gouging. You can find near MSRP prices if you look hard enough, but sites like Newegg are gonna try and screw you.
> 
> Hard to believe I got my 290 for $380 with BF4 shortly after release. I've seen the same card for about $575 on eBay at one point.


Another fact to add is that parts for these cards are scarce. AMD themselves said so; that's why the supply of these cards is low. Newegg takes advantage of this and of coin mining, and is now pricing these cards $200 over their MSRP. If you can't call that greed, I don't know what is.

I had a 290, sold it for 650 bucks (didn't expect it to sell but it did) and bought a 290X on a forum for 575 bucks, which worked out pretty well in my case.

Only reason I still buy some stuff from Newegg is that I got a 1-year free ShopRunner membership from a promotion they had, and Newegg doesn't charge me tax.

But once my ShopRunner membership runs out and the law that requires all e-tailers to charge tax passes, imma start looking somewhere else.


----------



## psyside

No, it's ALL AMD's fault, no such thing as supply and demand. Ask Altair


----------



## Paul17041993

Quote:


> Originally Posted by *psyside*
> 
> No, its ALL AMD fault, no such thing as supply and demand. Ask Altair


If only that were true, but at least it's not nearly as bad as Nvidia...


----------



## ottoore

Quote:


> Originally Posted by *Bartouille*
> 
> I just finished installing my AP-30s (4250rpm) on my MK-26 290x. MAN!! I've been running a couple valley loops, 54c max and 41c max vrm1 and 32c vrm2!! stock speed, just testing. The card has less vdroop when kept cool too that's great! Those vrms are like watercooling lol So happy.
> 
> 
> 
> 
> 
> 
> 
> Only downside is that the fan run at full speed all the speed because I have it to a molex... molex to 3 pin, does that exists? But even so I'm not sure my fan controller could handle it, this fan requires 1.36A to start, so around 20w. Got some pics coming too.


Can you take a pic of your card?

I'm running an MK-26 too, with 2 USV14s on it and 2x140mm 1200rpm fans on the side, but I cannot get below 75C on the core during Valley.
I tried Coollaboratory Liquid Pro and Prolimatech PK-3; the difference is 2-3 degrees.

I noticed that the temperature is not uniform on the MK-26. The short side is much hotter (25 or more degrees) than the other side.


----------



## quickinfi

I have a question. I got myself an R9 290 Tri-X and I'm having some problems with it. Under high load (around 80° temp), for example when running FurMark or GW2, my monitor loses the signal from the GPU and the GPU's fans go to what sounds like 100%... I need to shut down by holding the power button. I read about the black screen problems and installed various drivers (newest, older, betas, WHQL...). I checked all the cables and manually set the fans to 80% and higher to see if the card somehow isn't getting enough power. The computer is all new and worked perfectly (with the iGPU) before I installed the GPU.
I really think it has something to do with temperature because it always happens around 80°-82°.
Faulty card, or is there a known fix for this?

(Note: I'm from Switzerland, sorry for the crappy English)


----------



## kizwan

To anyone using the Tri-X BIOS with their 290/290X, what is the consensus about it? Did you get a good overclock with the Tri-X BIOS? I'm thinking of using the 290X Tri-X BIOS on my Sapphire R9 290's. Comparing the Tri-X and ASUS/PT1 BIOSes, which do you think is better?

@HOMECINEMA-PC, your feedback/comment on this will be useful too.

With stock BIOS, I can do 1200/1600 MHz @+200mV (1.422 - 1.432V) & I can run Valley (5115), Heaven (2965) & Firestrike Extreme (9660, graphics score 11340, physics score 12025) without any problem. Please let me know if my score is too low.


----------



## Fahrenheit85

To anyone that has RMA'ed a card back to Gigabyte (or anyone, for that matter), how did you send it? I have the Gigabyte box with my card, but I don't have a shipping box big enough to fit it in. Think putting it in its anti-static bag in a box full of packing peanuts will be enough, or should I really try and hunt down a box?


----------



## Bartouille

Quote:


> Originally Posted by *ottoore*
> 
> Can you take a pic of your card?
> 
> I'm running mk-26 too with 2 usv14 on it and 2*140mm fans 1200rpm on the side but i cannot get less then 75C during valley on hte core.
> I tried with Coolaboratory liquid pro and Prolimatech pk-3. The difference is 2-3 degrees.
> 
> I notice that temperature is not uniform on the mk-26. Short piece is really hotter ( 25 or more degrees) than the other side.






Keep in mind these fans are very loud. I have to run them at 100% even at idle because my fan controller can't handle them. Even the MK-26 struggles to keep the card cool. These fans are about as loud as the reference cooler at 75%. I also used CLP. I'll have some screenshots of temps coming.









Can you update me to aftermarket Arizonian?


----------



## rdr09

Quote:


> Originally Posted by *quickinfi*
> 
> I have a question. I got myself an R9 290 Tri-X and I'm having some problems with it. Under high load (around 80° temp), for example when running FurMark or GW2, my monitor loses the signal from the GPU and the GPU's fans go to what sounds like 100%... I need to shut down by holding the power button. I read about the black screen problems and installed various drivers (newest, older, betas, WHQL...). I checked all the cables and manually set the fans to 80% and higher to see if the card somehow isn't getting enough power. The computer is all new and worked perfectly (with the iGPU) before I installed the GPU.
> I really think it has something to do with temperature because it always happens around 80°-82°.
> Faulty card, or is there a known fix for this?
> 
> (Note: I'm from Switzerland, sorry for the crappy english)


Most of us here will not recommend the use of FurMark. Anyway, I would like to see your vcore at idle and load. You can use GPU-Z to see these readings and more. It should look like these . . .

Idle



Load temps. You have to monitor the *vrms temps* just as much . . .





I would like to compare your voltages with mine. And, yes, use GPU-Z to load your GPU by clicking the "?" and Start. To stop (if temps are getting too hot), just close the render window.


----------



## Bartouille

Quote:


> Originally Posted by *rdr09*
> 
> Most of us here will not recommend the use of furmark. anyway, I would like to see your vcore at idle and load. you can use GPUZ to see these readings and more. It should look like these . . .
> 
> Idle
> 
> 
> 
> Load temps. You have to monitor the *vrms temps* just as much . . .
> 
> 
> 
> 
> 
> I would like to compare your voltages with mine. And, yes, use GPU-Z to load your GPU by clicking the "?" and Start. To stop (if temps are getting too hot), just close the render window.


I'm pretty sure everyone gets 1.18v in that GPU-Z rendering test. What I am curious about is when you say you get 1.41v after vdroop when you apply +200mv... are you sure that it is really 1.41 during the bench or just a spike (like when you close Valley and you render 1000+ frames). On my card when I apply +100mv my actual voltage is more like 1.18-1.21 depending on the game.


----------



## rdr09

Quote:


> Originally Posted by *Bartouille*
> 
> I'm pretty sure everyone gets 1.18v in that GPU-Z rendering test. What I am curious about is when you say you get 1.41v after vdroop when you apply +200mv... are you sure that it is really 1.41 during the bench or just a spike (like when you close Valley and you render 1000+ frames). On my card when I apply +100mv my actual voltage is more like 1.18-1.21 depending on the game.


It seems quick's 290 is having issues like black screens, and I suspect the vcore might be too low. Just speculating.

I just saw the 1.4V using GPU-Z and am not sure if there was vdroop or not. BTW, I had problems using AB and GPU-Z at the same time, so now I only use Trixx to OC.

edit: 9000 posts


----------



## quickinfi

Quote:


> Originally Posted by *rdr09*
> 
> Most of us here will not recommend the use of furmark. anyway, I would like to see your vcore at idle and load. you can use GPUZ to see these readings and more. It should look like these . . .
> 
> Idle
> 
> [...]
> 
> Load temps. You have to monitor the *vrms temps* just as much . . .
> 
> [...]
> 
> I would like to compare your voltages with mine. And, yes, use GPU-Z to load your GPU by clicking the "?" and Start. To stop (if temps are getting too hot), just close the render window.


Okay, no more FurMark then








Thank you for your answer. I did what you told me; here are the results:

Idle:



Load:





Load after some minutes:


----------



## Gabkicks

I just ordered 2 AMD R9 290 Tri-x cards







.


----------



## rdr09

Quote:


> Originally Posted by *quickinfi*
> 
> Okay no more furmark then
> 
> 
> 
> 
> 
> 
> 
> 
> Thank you for your answer I did what you told me here are the results:
> 
> Idle:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Load:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Load after some minutes:
> 
> 
> Spoiler: Warning: Spoiler!


I am just gonna guess here, but your 12V is dipping down to 11.5V. GPU was stock, right? These apps, though, may not be too accurate, so I may be wrong to advise you to look at a better PSU.

What PSU do you have?

edit: VDDC at load is 1.16V. hmmmm


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> [/SPOILER]
> 
> i am just gonna guess here but your 12V is dipping down to 11.5v. Gpu was stock, right? this apps, though, may not be too accurate, so i may be wrong to advise you to look at a better psu.
> 
> what PSU do you have?
> 
> edit: VDDC at load is 1.16v. hmmmm


What should the 12v reading be at?

Mine is 11.75V and dips to 11.63V during the GPU-Z render test


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> What should the 12v reading be at?
> 
> Mine is 11.75V and dips to 11.63V during the GPU-Z render test


Mine is the same as yours. Tolerance is +/- 5%, so it should not go over 12.6V or below 11.4V. 11.5V is cutting it close, and with the app not being accurate, it could very well be dipping below that with FurMark or heavy games.

@quick, I recommend testing your GPU, if possible, on another rig that has a known good working PSU.
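The ±5% window rdr09 quotes is the standard ATX tolerance for the 12V rail; a reading can be checked against it like this (quick Python sketch, using the 11.5V dip from the screenshots above as the example input):

```python
# ATX 12V rail tolerance check: nominal 12V, +/- 5%.
nominal = 12.0
tolerance = 0.05

low_limit = nominal * (1 - tolerance)    # 11.4 V
high_limit = nominal * (1 + tolerance)   # 12.6 V

reading = 11.5  # the dip seen in quickinfi's GPU-Z log
in_spec = low_limit <= reading <= high_limit
print(f"window: {low_limit:.1f}-{high_limit:.1f} V, 11.5 V in spec: {in_spec}")
```

Note 11.5V is technically in spec, just close to the floor, which is rdr09's point: software sensors aren't accurate enough to trust a reading that near the limit.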


----------



## Arizonian

Quote:


> Originally Posted by *Bartouille*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Keep in mind these fans are very loud. I have to handle them at 100% even at idle because my fan controller can't handle them. Even MK-26 struggles to keep the card cool. These fans are about as loud as reference cooler at 75%. I also used CLP. I'll have some screenshots coming of temps.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can you update me to aftermarket Arizonian?


Updated









Quote:


> Originally Posted by *Gabkicks*
> 
> I just ordered 2 AMD R9 290 Tri-x cards
> 
> 
> 
> 
> 
> 
> 
> .


Sweet. When you get them I'll add you to the roster.









To be added on the member list please submit the following in your post
1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
2. Manufacturer & Brand - Please don't make me guess.
3. Cooling - Stock, Aftermarket or 3rd Party Water


----------



## devilhead

Quote:


> Originally Posted by *rdr09*
> 
> [/SPOILER]
> 
> i am just gonna guess here but your 12V is dipping down to 11.5v. Gpu was stock, right? this apps, though, may not be too accurate, so i may be wrong to advise you to look at a better psu.
> 
> what PSU do you have?
> 
> edit: VDDC at load is 1.16v. hmmmm


My AX1200i with one 290X runs at 11.5-11.63


----------



## rdr09

Quote:


> Originally Posted by *devilhead*
> 
> My AX1200i with one 290X works at 11.5-11.63


I'd rather see your benchmarks.









what are the VDDCs?


----------



## devilhead

Quote:


> Originally Posted by *rdr09*
> 
> I rather see your benchmarks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> what are the VDDCs?


1.188-1.195 in those render tests








I can hit 1300/1720 with those cards with the stock BIOS and +200mV


----------



## quickinfi

Quote:


> Originally Posted by *rdr09*
> 
> mine is same as yours. tolerance is +/- 5%. it should not go over 12.6v and not lower than 11.4v. 11.5v is cutting it and with the app not being accurate - it could very well be dipping below that using furmark or heavy games.
> 
> @quick, I recommend testing your gpu on another rig if possible that has a known good working psu.


Hm... testing on another rig would be a possibility, but I'd need a few days for that because I don't have another rig in my house :/
But I have an additional 400W PSU which I bought accidentally (same brand, same model). Is there a way to use it only for the GPU and run the rest of the system off the normal PSU?


----------



## rdr09

Quote:


> Originally Posted by *devilhead*
> 
> 1.188-1.195 at those render test
> 
> 
> 
> 
> 
> 
> 
> 
> i can hit 1300/1720 wit those card with stock bios and +200mv


A 290X at 1300: something I can only dream of.
Quote:


> Originally Posted by *quickinfi*
> 
> Hm... testing on another rig would be a possibility but I need a few days for that because I don't have another rig in my house :/
> But I have an additional 400W PSU which I bought accidentally (same brand, same model). Is there a possibility to use it only for the GPU and run the rest of the system over the normal PSU?


See devilhead's VDDC? That is a good working PSU. I would not use anything lower than 500W with an 80 Plus cert for a 290.









I think the minimum amperage required is about 41A. Do not use that 400W unit.
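That 41A figure lines up with the 500W floor above, since nearly all of a 290's draw comes off the 12V rail (quick sketch; 41A is the figure quoted in the post, not something I've checked against AMD's spec sheet):

```python
# Why ~41A on the 12V rail implies roughly a 500W-class PSU.
rail_volts = 12.0
min_amps = 41.0   # suggested minimum combined 12V amperage, per the post above

min_12v_watts = rail_volts * min_amps
print(f"12V rail capacity needed: {min_12v_watts:.0f} W")  # ~492 W
```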


----------



## rcoolb2002

Quote:


> Originally Posted by *rdr09*
> 
> a 290X at 1300. something I can only dream of.
> see, devilhead's VDDC? that is a good working PSU. I would not use any lower than 500W and 80plus cert for a 290.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think the minimum AMP required is about 41A. do not use that 400W.


Will my 1300 G2 support my 290X?









mining rig gone bad haha


----------



## battleaxe

Quote:


> Originally Posted by *quickinfi*
> 
> Hm... testing on another rig would be a possibility but I need a few days for that because I don't have another rig in my house :/
> But I have an additional 400W PSU which I bought accidentally (same brand, same model). Is there a possibility to use it only for the GPU and run the rest of the system over the normal PSU?


Yes, you can do this. You just need an adapter. Some even use a paper clip on the PSU, but I'm not sure which pins.

Edit: assuming the PSU can handle the load, of course.


----------



## quickinfi

Quote:


> Originally Posted by *rdr09*
> 
> a 290X at 1300. something I can only dream of.
> see, devilhead's VDDC? that is a good working PSU. I would not use any lower than 500W and 80plus cert for a 290.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think the minimum AMP required is about 41A. do not use that 400W.


Hm... okay.
I forgot to answer your previous question:
My PSU is a be quiet! Pure Power L8-600W. I thought it would be a good PSU :/


----------



## Heinz68

Arizonian please add me to the club
Sapphire R9 290 4GB TRI-X OC in Crossfire
Both Hynix memory
TPU Validation


----------



## MrWhiteRX7

Quote:


> Originally Posted by *rcoolb2002*
> 
> Will my 1300g2 support my 290x
> 
> 
> 
> 
> 
> 
> 
> 
> 
> mining rig gone bad haha


I have a 1300g2 supporting 3 x 290's


----------



## pkrexer

Pretty happy with my results on water. 1250 / 5000, could bench higher but starts to artifact badly above 1260. Temps < 45c load









My daily clocks are 1120 / 4000 on stock volts.

http://www.3dmark.com/fs/1683601


----------



## itomic

Did anyone compare FX-8350 vs. i7 gaming performance with the R9 290X?


----------



## mojobear

Hey all, I've read through the forum that 1-2 people have flashed their 290 BIOS to Tri-X and recommend it. Does anyone know if it is indeed better? Also, if you have Elpida or Hynix memory, do you need an Elpida- or Hynix-specific BIOS, or does it detect automatically and change timings etc. accordingly? THANKS!


----------



## Jack Mac

Sgt Bilko, if it's not too much trouble for you, can I ask what temperatures you get on your CF XFX 290s? Core and VRMs and what fan speed/RPM you need to keep temperatures in check. I'm thinking about selling my reference card and getting the XFX DD because it looks nice and apparently cools well.


----------



## rcoolb2002

Quote:


> Originally Posted by *Jack Mac*
> 
> Sgt Bilko, if it's not too much trouble for you, can I ask what temperatures you get on your CF XFX 290s? Core and VRMs and what fan speed/RPM you need to keep temperatures in check. I'm thinking about selling my reference card and getting the XFX DD because it looks nice and apparently cools well.


It is quite a beauty! If I wasn't doing water I would be all over that card!


----------



## Jack Mac

Quote:


> Originally Posted by *rcoolb2002*
> 
> It is quite a beauty! If i wasnt doing water I would be all over that card!


Yeah it does look really nice and I'd be able to see the glowing XFX logo clearly through the window of my FT02, which makes it even more tempting.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jack Mac*
> 
> Sgt Bilko, if it's not too much trouble for you, can I ask what temperatures you get on your CF XFX 290s? Core and VRMs and what fan speed/RPM you need to keep temperatures in check. I'm thinking about selling my reference card and getting the XFX DD because it looks nice and apparently cools well.


28C ambients atm (4:42am). Cores are 67C and 59C, VRMs are 64C/60C and 58C/57C.

Playing Sanctum 2 at 1000/1250 on auto fan speed (aggressive profile). Still no louder than my case fans tbh.

I have hit 90C max on the core (top card) playing BF4 on a hot day though, and I mean hot, ambients of 38-40C, so I'm not surprised tbh.

From what the reviews are saying the DD cards are as good as the Tri-X cards at cooling the core, just not quite as good on the VRMs, and I can say that's true.

Mind you, they look about 400% better (scientific opinion is valid).

EDIT: As for the fan speed question, 100% is not loud by any means; at 100% they spin at 1575 RPM according to HWiNFO64. I make them bump up to 80% as soon as they pass 65C though, and then 100% at 75C.
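That stepped profile can be sketched as a tiny function; the 65C/80% and 75C/100% thresholds are from the post above, while the function name and the 50% auto baseline are my own placeholders:

```python
def fan_speed(core_temp_c, auto_speed=50):
    """Fan duty (%) following the stepped profile described above:
    bump to 80% past 65C, 100% past 75C, otherwise the auto value."""
    if core_temp_c > 75:
        return 100
    if core_temp_c > 65:
        return 80
    return auto_speed

print(fan_speed(60))  # below both thresholds -> 50 (auto)
print(fan_speed(70))  # past 65C -> 80
print(fan_speed(80))  # past 75C -> 100
```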


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jack Mac*
> 
> Yeah it does look really nice and I'd be able to see the glowing XFX logo clearly through the window of my FT02, which makes it even more tempting.


Yes they do

Spoiler: More Pics

Just cos....you know, they look smexy and all


----------



## Jack Mac

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 28C ambients atm (4:42am). Cores are 67C and 59C, VRMs are 64C/60C and 58C/57C.
> 
> Playing Sanctum 2 at 1000/1250 on auto fan speed (aggressive profile). Still no louder than my case fans tbh.
> 
> I have hit 90C max on the core (top card) playing BF4 on a hot day though, and I mean hot, ambients of 38-40C, so I'm not surprised tbh.
> 
> From what the reviews are saying the DD cards are as good as the Tri-X cards at cooling the core, just not quite as good on the VRMs, and I can say that's true.
> 
> Mind you, they look about 400% better (scientific opinion is valid).
> 
> EDIT: As for the fan speed question, 100% is not loud by any means; at 100% they spin at 1575 RPM according to HWiNFO64. I make them bump up to 80% as soon as they pass 65C though, and then 100% at 75C.


Thanks. Seeing as my ambients are usually below 28C, and I'd only ever reach 28C during the summer, I should be fine with the XFX 290s in CF with a mild OC.


----------



## NirHahs

Anyone here with the MSI R9 290 Gaming? I want to see the temps while OCing and gaming, and also the performance. I'm going to purchase this card very soon, but since there are no reviews yet, it's hard for me to decide.


----------



## maynard14

Hi sir,

I'm planning to buy a Corsair H50 and an NZXT G10 for my reference R9 290 card.

My question is: can I safely use Coollaboratory Liquid Pro on the card?

And I heard the G10 has some flaws, like the VRM temps. Is it true that if I connect the included 92mm fan to a motherboard fan header I'll be able to max the fan speed so the VRM temps get lower? Thanks again!


----------



## chiknnwatrmln

CLP should be fine to use, as long as you are gentle when you remove it. That means using a Q-tip with isopropyl alcohol and removing it carefully, as liquid metal will scratch the die. Trust me, I scratched the hell out of my 3770K's die.

You will need heatsinks for the VRM/VRAM modules. I hear the Enzotech copper heatsinks do the job pretty well, but you will also need either thermal tape or thermal glue.

An alternative is to hack up the stock cooler and use the metal plate to cool the VRMs, but say goodbye to RMA'ing if you do that.

VRMs without heatsinks will burn up very quickly, especially if you OC. My old Gelid air cooler could only hold my VRM1 at 80C at +18mV, and that was with 100% fan, excellent case airflow, and cool ambients.


----------



## Sgt Bilko

Quote:


> Originally Posted by *NirHahs*
> 
> anyone here with MSI r9 290 gaming? i want to see the temp in OC and gaming, also the performance. im going to purchase this card very soon but since no reviews yet, it is hard for me to decide


There are a few members that have that card; I'm sure they will chime in with some useful info.

Quote:


> Originally Posted by *maynard14*
> 
> Hi sir,
> 
> im planning to buy a h50 corsair and nzxt g10 for my reference r9 290 card.
> 
> my question is can i safely use the coollaboratory liquid pro to the card ?
> 
> and i heard that g10 has some flaw like the vrm temps.. is it true that if i connect tje included 92 mm fan to the motherboard fan header ill be able to max the fan speed so that vrm temps will get lower? thanks again!


The G10 looks great, but the VRM cooling is less than stellar from what I've heard.

If you are going to use it, then grab some heatsinks for the VRMs and mount a fan on the front of the card as well as the one included on the back.

As for the TIM, you can use any compound/paste you want; just be careful applying it. Only enough for the die itself and no more; too much and you risk frying it.

If you do get the G10 though, let me know what it's like. I have a few friends that are very interested in it.


----------



## Paul17041993

If anyone's ever wondered what happens when you have a 7970 and a 290X in the same rig, you get this in CCC:


Spoiler: CCC overdrive

Only problem is I can't adjust the max fan speed now, but it looks like I could set it to a fixed speed instead...


----------



## maynard14

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> CLP should be fine for use, as long as when you remove it you are gentle. This means using a qtip with iso alcohol and carefully removing it, as liquid metal will scratch the die. True me I scratched the hell out of my 3770k's die.
> 
> You will need heatsinks for the VRM's/VRAM modules. I hear that enzotech copper heatsinks do the job pretty well, but you will also need either thermal tape or thermal glue.
> 
> An alternative is to hack up the stock cooler and use the metal plate to cool the VRM's, but say goodbye to RMA'ing if you do that.
> 
> VRM's without heatsinks will burn up very quickly, especially if you OC. My old Gelid air cooler used to only hold my VRM 1 at 80c on +18mV, and this was with 100% fan, excellent case airflow, and cool ambients.


Quote:


> Originally Posted by *Sgt Bilko*
> 
> There are a few members that have that card, I'm sure they will chime in with some useful info
>
> the G10 looks great but the vrm cooling is less than stellar from what i've heard.
> 
> If you are going to use it then grab some heatsinks for the vrms and mount a fan on the front of the card as well as the one included on the back.
> 
> As for the TIM, you can use any compound/paste you want just be careful in applying it, only enough for the die itself and no more, too much and you risk frying it.
> 
> If you do get the G10 though let me know what it's like, I have a few friends that are very interested in it


Thank you both for the replies. I guess I'll think some more about whether I should buy the G10 and void the warranty on my XFX card. The one big problem is the VRM heatsinks: I live here in the Philippines, and VRM heatsinks are hard to find, which is why I'm holding off on the G10. I thought running the G10's 92mm fan at 100 percent would do a decent job of lowering the VRM temps, so I guess I should find VRM heatsinks first. And lastly, I won't OC my card, but I will run it with the 290X modded BIOS.


----------



## Davschall

Quote:


> Originally Posted by *jamaican voodoo*
> 
> here some pics of the xfx dd with ek water-blocks
> it's reference pcb....flash the cards with tri-x 290 bios
> running nicely..


Thank you so much for this!! I have been scouring the web looking for this!!! I ordered the block and backplate, and then ShopBLT canceled my order for a reference-design Gigabyte 290. I asked on the EKWB Facebook page and on Tom's Hardware if it would fit the DD, but got no answer. It's so awesome to know this works. Definitely +rep!!


----------



## battleaxe

Quote:


> Originally Posted by *NirHahs*
> 
> anyone here with MSI r9 290 gaming? i want to see the temp in OC and gaming, also the performance. im going to purchase this card very soon but since no reviews yet, it is hard for me to decide


I have one. It's on water now, but I remember what it was doing on air too: VRMs 72C max while mining, core 70C max at about 83% fan speed. Stable on stock volts at 1100MHz core and 1450 memory. It's gone up to 1500MHz on memory, but I have not tried higher than 1100 on the core. I needed +50mV to get the memory stable at 1500MHz.

I've only played about 15 minutes of BF4, but I noticed the temps were about 8C lower. Mining is harder on these cards than gaming, for sure, so my results should be indicative of a worst-case scenario. Obviously it is very loud at these settings, but it stays cool. My PC is in a basement closet, so noise isn't a huge issue.


----------



## jamaican voodoo

Quote:


> Originally Posted by *Davschall*
> 
> Thank you so much for this!! I have been scouring the web looking for this!!! I ordered the Block and Backplate and then shopblt canceled my order for a reference design gigabyte 290. I aksed on the ekwb facebook page and on tomshardware if it would fit the DD, but got no answer. Its so awesome to know this works. Definitely +rep!!


I'm glad it helped you. I wasn't sure myself until I did some research on the PCB of the XFX DD vs. reference; it turns out to have the exact same PCB layout as reference. The only difference is the XFX PCB is darker and matte-looking.


----------



## psyside

Hey guys some info for you.

Do not count on GPU-Z's 12V reading; it's the card's VRM voltage readout (from the IR controller), not the main +12V rail, so it's all good.

HWiNFO is a better choice for reading the +12V rail on your PSU.


----------



## Davschall

Quote:


> Originally Posted by *jamaican voodoo*
> 
> the only difference is the xfx pcb is darker and matt looking


Lol, that's exactly what made me wonder. I had two pictures pulled up, one of that and one of reference, and I was holding my backplate trying to match all the cutouts to the back of the PCB. It didn't really work... Really sweet to be able to confirm, and I was able to pick one up today! Once again, thanks, and super sick setup!


----------



## Arizonian

Quote:


> Originally Posted by *Heinz68*
> 
> Arizonian please add me to the club
> Sapphire R9 290 4GB TRI-X OC in Crossfire
> Both Hynix memory
> TPU Validation
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added Tri-X x 2


----------



## Fahrenheit85

Well, just took off my water block and got my 290X all packed up to RMA. I know you guys said it was silly to RMA for coil whine, but the main reason I went with water cooling was for a silent rig anyway.

EDIT - Well, I'll do some more testing before I drop it off at FedEx. 3 weeks without my card sounds like a long, long time.


----------



## yawa

So, quick question about enabling more than the standard +100mV in Afterburner.

Do I need a specific version of it, or will it work with any?

Also, just for the record, I'm running the ASUS modded BIOS to get voltage access, and I've tried to use GPU Tweak (to get the biggest voltage bump), but it refuses to open, saying my card is not compatible. Is there a trick I'm missing?

My card does great since I put it under water (1182MHz with zero artifacts, never exceeding 48C at +100mV), so I'm dying to try to push 1300MHz. I'm pretty sure I can reach that speed with another 100mV, and seeing as temps aren't an issue, I'd like the voltage overhead in case scaling stops being so good.

As such, any help is greatly appreciated.


----------



## Fahrenheit85

Quote:


> Originally Posted by *yawa*
> 
> So quick question when enabling above the standard +100mv in afterburner.
> 
> Do I need a specific version of it? Or will it work with any?
> 
> Also just for the record I'm running the ASUS modded bios to get voltage access, and I've tried to use GPUtweak (to get the most voltage bump) and it rejects me opening it saying my card is not compatible. Is there a trick to this I'm missing?
> 
> My card does great since I put it under water (1182Mhz with zero artifacts never exceeding 48 C at +100mv) so I'm dying to try to push 1300Mhz and am pretty sure I can push up to that speed with another 100mv voltage bump, but seeing as how temps aren't an issue, would like the voltage overhead in case scaling stops being so good.
> 
> As such any help is greatly appreciated.


Trixx lets me push +200mv without issue


----------



## Paul17041993

Quote:


> Originally Posted by *Fahrenheit85*
> 
> EDIT - Well i'll do some more testing before I drop it off at fedex. 3 weeks without my card sounds like a long long time


Don't worry, my 7970DCIIT took 3+ months on its first RMA...


----------



## taem

So I think my 850W PSU will be fine for CrossFiring these. Testing without GPUs, total system wattage for the 4670K @ 4.6, 1.25V is 145W. If I run a pair of 290s at stock on top, that should be fine, right? I might downclock my CPU also -- what do you guys think is a good 4670K clock for 290 CrossFire?
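As a rough sanity check, assuming the 145W measured system draw above plus a ballpark ~275W per stock 290 (that per-card number is my assumption, not a measurement):

```python
def psu_headroom(psu_watts, system_watts, gpu_watts_each, gpu_count):
    """Rough remaining PSU headroom in watts for a multi-GPU build.
    Inputs are worst-case estimates, not measured draws."""
    total_draw = system_watts + gpu_watts_each * gpu_count
    return psu_watts - total_draw

# 850W PSU, 145W system, two 290s assumed at ~275W each
print(psu_headroom(850, 145, 275, 2))  # -> 155 (watts to spare)
```

~155W of margin is thin once you start overvolting, which is why the thread keeps steering CrossFire builds toward 1kW units.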


----------



## Fahrenheit85

Quote:


> Originally Posted by *Paul17041993*
> 
> don't worry my 7970DCIIT took 3+months on its first RMA...


Ugh, the more I think about it, the more I think I'll keep it till I get my second 290X, just so I don't have to run Intel graphics.


----------



## Strata

What would be the recommended wattage to run two 290Xs (OCed, maybe 1300/1500 max), a 4770K @ 5.0, and a full custom loop? 650W seems barely enough to run my 290X at stock and [email protected] 1.3v with a simple loop.

Secondly, what rad setup would you recommend? Is 2x280mm fine? My MCR220QP-Res v2 is not enough for the current single card + CPU loop, by any stretch.


----------



## psyside

Quote:


> Originally Posted by *Strata*
> 
> What would be the recommended Wattage to run 2 290Xs (OCed, maybe 1300/1500 max) a 4770k @ 5.0, and a full custom loop? 650W seems barely enough to run my 290X stock and [email protected] 1.3v with a simple loop.
> 
> Secondly, what rad setup would you recommend? Is 2x280mm fine? My MCR220QP-Res v2 is not enough for the current single card + cpu loop, by any stretch.


This


----------



## Jack Mac

May have found a buyer for my R9 290 on Craigslist; I'll be selling for $550 and buying the XFX 290 DD. It's going to suck having to run on integrated for 1-2 weeks and giving up a good OCer, though.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jack Mac*
> 
> May have found a buyer for my R9 290 on Craigslist, I'll be selling for $550 and buying the XFX 290 DD. It's going to suck having to run on integrated for 1-2 weeks and giving up a good OCer though.


My top card can run 1200/1500 with +100mV and the bottom needs +120mV for the same clocks. I can bench 1250/1500 with +150mV on the top card... haven't tried both yet, waiting for better temps.


----------



## Jack Mac

Quote:


> Originally Posted by *Sgt Bilko*
> 
> My top card can run 1200/1500 with +100mV and the bottom needs +120mV for the same clocks, I can bench 1250/1500 with +150mV on the top card......haven't tried Both yet, waiting for better temps


My max clocks aren't that great, 1265 w/ +185mV, but my 24/7 clocks on this card are pretty good: 1175/1450 at +70mV. The 780 Classified is also looking pretty tempting, but idk really.


----------



## Strata

Quote:


> Originally Posted by *psyside*
> 
> This


So then 1kW is fine for 2 OC'd 290Xs and an OC'd 4770K with all the WC accoutrements?


----------



## psyside

Quote:


> Originally Posted by *Strata*
> 
> So then 1kW is fine for 2 OC 290X and and OC 4770K with all the WC accoutrements?


Yes. Or maybe Leadex Platinum 1200W.


----------



## Strata

Ok, awesome, that saves some serious cash. What about radiators? If I go up to a 280mm from my current 240mm, will that be enough for both CPU and GPU, or would it be better to do something like add a 120mm to the loop? Currently, with the GPU added, my CPU maxes at 85C under load during BF4 (I think I need to redo my TIM too), and the 290X at stock hits 65C.

I assume that adding a second card would require at least another 280mm in the loop; these bad boys are quite hot...


----------



## psyside

I'm far from an expert on WC, sorry man.


----------



## rcoolb2002

Quote:


> Originally Posted by *Strata*
> 
> Ok awesome, that saves some serious cash. What about Radiators? If I go up to a 280mm from my current 240mm will that be enough for both CPU and GPU, or would it be better to do something like add a 120mm to the loop? Currently the addition of the GPU has my CPU maxing at 85C at load (think I need to redo my TIM though too) during BF4, and the 290X at stock hits 65C.
> 
> I assume that adding a second card would require at least a 280mm added to the loop, these bad boys are quite hot...


For 2 GPUs + a 4670K you would want at least 360mm plus 120mm of overhead. That's the standard rule for BASIC cooling. You might be able to sneak through with a Monsta 360 in push/pull. It's really all in how much you overvolt, etc.


----------



## Paul17041993

Quote:


> Originally Posted by *Strata*
> 
> So then 1kW is fine for 2 OC 290X and and OC 4770K with all the WC accoutrements?


Keep in mind that water also improves the efficiency of the card, so your wattage will drop significantly compared to air too.


----------



## Sazz

Quote:


> Originally Posted by *rcoolb2002*
> 
> 2GPU + 4670 you would prefer at least 360mm + 120mm overhead. Thats the standard rule for BASIC cooling. You might be able to sneak through with a MONSTA 360 in push pull. Its really all in how much you overvolt etc.


The basic watercooling rule of thumb is that every block needs 120mm of radiator space, but depending on how hot a component gets, you may need 240mm for that block. With that said, the two GPU blocks definitely need at least 240mm of radiator space, while the CPU can get by with 120-240mm, so a single 360mm radiator can handle CPU + 2 GPU blocks, and going up to 480mm of radiator space would get the most out of the loop. Any more than that isn't really going to gain you much in temps, maybe 1C at most.

I don't have space in my mini-ITX build, so I had to settle for a single 240mm radiator for my GPU + CPU blocks. So far my CPU has stayed under 74C through 2 hours of Prime95, and my GPU under 45C in Kombustor (VRM1 at 48C); during a 2-hour BF4 session my CPU maxed out at 71C (averaging 58C) while the GPU maxed out at 51C (VRM1 48C).

I am using an EK Supreme LTX on my 3570K @ 4.7GHz 1.375V, and on my 290X (clocked at 1100/1450 at stock voltage) an EK acetal+copper waterblock, with a single EK XT240 radiator; ambient temps are 22-25C. I'm pretty surprised at the temps I'm getting; I was expecting 5-7C higher knowing I barely have enough radiator. An extra-thick 120mm radiator would probably have cut temps by about 5C on the CPU (the GPU would probably only drop 1-2C).

Quote:


> Originally Posted by *Paul17041993*
> 
> keeping in mind water also improves the efficiency of the card, so your wattage will drop significantly compared to air too.


Not to mention noise, LOL. Especially if you're coming from the reference cooler.


----------



## Paul17041993

Quote:


> Originally Posted by *Sazz*
> 
> Not to mention noise, LOL. Specially if you're coming from the reference cooler.


Is that not the idea of watercooling anyway? (Unless you're using 10k+ RPM fans for whatever reason.)


----------



## Sazz

Quote:


> Originally Posted by *Paul17041993*
> 
> is that not the idea of watercooling anyway
>
> (unless you're using 10k+RPM fans for whatever reason)


Not all high-static-pressure fans are quiet, and even with a quiet fan, a radiator with too-dense fins will cause noise as well. If the parts haven't been chosen properly for low noise, even watercooling can get noisy. So no, going watercooling is not AUTOMATICALLY quiet. The cooling performance is of course superb, that goes without saying, but noise is another thing; if you don't select the proper components, you may end up with a setup as loud as the air cooler you had.


----------



## Strata

Oh, I went on water ASAP with the reference design (acetal + copper full block); the noise difference is worth the loss of CPU headroom on the loop. But I do want to be able to go back to OCing the card, so the puny MCR220QP just won't cut it. Sounds like the best headroom for a 2-card CrossFire + CPU would be along the lines of 280mm + 280mm or 360mm + 240mm, depending on what will fit (though I can't say I won't try for 420mm + 280mm anyway).


----------



## Sazz

Quote:


> Originally Posted by *Strata*
> 
> Oh I went on Water ASAP with the Ref Design, Acetal + Copper Full Block, the Noise difference is worth the loss of CPU head room on the loop, but I do wan to be able to go back to OCing the card, so the puny MCR220QP just wont cut it. Sounds like the best headroom for a 2Fire + CPU would be along the lines of 280mm + 280mm or 360mm + 240mm, depending on what will fit (though I cant say I wont try for the 420mm + 280mm anyway)


Do you use that rad as a reservoir too, or do you have a separate reservoir or a pump/reservoir combo? XSPC AX radiators are great, although their housings are bigger; I had an AX240 and would have used it instead of the XT240, but it was so much bigger that it didn't fit inside the Prodigy case.

XSPC RX radiators would be great. A 480mm radiator should be fine, but I would personally go up to 720mm of total radiator space if you can with that setup (a total of 240mm of radiator for each block).


----------



## Paul17041993

Quote:


> Originally Posted by *Sazz*
> 
> Not all high static pressure fans can be quiet, and even if you got a quiet fan if the radiator you are using got too dense fins it will cause noise as well. if part's used hasn't been considered properly to be of low noise even watercooling can get noisy. So no, even if you go watercooling it's not AUTOMATICALLY quiet. Cooling performance is of course goes superb w/o saying but noise is another thing, if you don't select the proper component you may end up having a loud set up as the air cooler you had.


Yeah, I know first-hand that high-density radiators can be horribly noisy, especially with cheap 2k RPM fans; sounds like a V8. But usually people go for multiple large radiators with fairly low fin density and decent fans.


----------



## Strata

Quote:


> Originally Posted by *Sazz*
> 
> Do you use that rad as reservoir too? or do you have a separate reservoir or pump/reservoir combo? XSPC AX radiators are great, altho their housing are bigger, I had an AX240 and would have used that instead of the XT240 but it was a lot bigger than my XT240 that it doesn't fit inside the Prodigy case.
> 
> XSPC RX radiators would be great. 480mm radiator should be fine but I would personally go for upto 720 total radiator space if you can w/ that set-up (total of 240mm radiator for each block.)


Mine is the QP Res version, so it has a (tiny) built-in res.

720mm total... not bad. I'm much less daunted by that prospect now that I have found the Phanteks Enthoo Primo case. Sounds like a fun project for the future, once 290X prices drop (I hope).


----------



## Newbie2009

What drivers launched with the 290x? I'm looking for drivers that don't have the vsync/sound issue?


----------



## Fahrenheit85

Quote:


> Originally Posted by *Newbie2009*
> 
> What drivers launched with the 290x? I'm looking for drivers that don't have the vsync/sound issue?


Maybe I missed the boat, but what is the vsync/sound issue? I haven't noticed anything on my end.


----------



## NirHahs

I'm very new to this kind of stuff; I only have experience with overclocking CPUs. My questions are:

1. What is the best software for GPU overclocking? (CPU OCing is best done in the BIOS.)
2. Does the GPU voltage need to be set manually, like with CPU OCing?
3. What are the max temp and VOLTAGE for GPU OCing?

I'm planning to get the MSI R9 290 Gaming, so I need your tips. Thanks.


----------



## Forceman

Quote:


> Originally Posted by *Newbie2009*
> 
> What drivers launched with the 290x? I'm looking for drivers that don't have the vsync/sound issue?


The 13.12 WHQL are probably your best bet. The sound thing is new with the 14.1s I believe.


----------



## psyside

Quote:


> Originally Posted by *NirHahs*
> 
> Im very new with this kind of stuff, i only had experience with overclocking CPU. my question is
> 
> 1. what is the best software for GPU overclocking? (best ocing CPU is with BIOS)
> 2. does GPU voltage require to set manually like CPU ocing?
> 3. what is the max temp and VOLTAGE for GPU ocing?
> 
> im planning to get MSI r9 290 gaming. so i need ur tips. thanks


1. MSI AB and Trixx.

2. Yes, depending on your clocks. For ~1050/1350, say, you won't need any extra voltage most of the time, even for 1100; for 1150/1500 you will need ~+100mV.

3. Do not go over 85C on the core, and 85-*90C max on the VRMs; for best OC results, try to keep it around 75*


----------



## Jack Mac

Ugh, I hate Craigslist. I said $550 firm, and some guy told me he was interested and asked where we could meet. We agreed on a location and time, and I was getting ready to leave when he tells me, "Yeah, I can only pay $400 for your 290." And he tells me this after I got dressed and ready and had the card in the original packaging.


----------



## NirHahs

Quote:


> Originally Posted by *psyside*
> 
> 1. MSI Ab and Trixx.
> 
> 2. yes, depends on your clocks, for ~ 1050/1350 lets say you wont need any volts most of the times, even for 1100, for 1150/1500 ~ 100mV + you will need.
> 
> 3. Do not go over 85c on the core, and 85-*90 max on vrm, for best oc results, try to keep it around 75*


So for 1050/1350 there's no need to touch the volts. What do you mean by 100mV? Did you mean 0.1V?


----------



## rcoolb2002

Quote:


> Originally Posted by *Jack Mac*
> 
> Ugh I hate Craigslist. I said $550 firm and some guy told me he was interested and asked where we could meet, we agreed on a location and time and I was getting ready to leave when he tells me "Yeah I can only pay $400 for your 290." And he tells me this after I get dressed and ready and I had the card in the original packaging.


Totally sucks when that happens! I'll give you $425 shipped if that makes you feel any better.


----------



## Jack Mac

Quote:


> Originally Posted by *rcoolb2002*
> 
> Totally sucks when that happens! Ill give you $425 shipped if that makes you feel any better


Lol, I think I'll pass on that; I paid $412 for my card on Amazon ($400 + next-day delivery). I thought my deal was fair too; I even offered to toss in a TX750 power supply I don't use.


----------



## ebduncan

For the folks talking about water cooling:

I have an XSPC RX240 (push) and an Alphacool XT45 280 (push/pull) cooling both GPU and CPU. I wouldn't mind having more radiator at all. I understand the rule of thumb of 120mm of rad per part; that's fine and handy if you just want temperatures under control. But if you're watercooling for the benefit of better cooling, then that rule must be thrown out the window. Under full load (CPU and GPU) my liquid temp gets to 40C, GPU core temp gets to around 58C, and CPU temp stays below 54C. Ambient temperatures are kinda high at around 24C.

Everything is safe and being cooled well. Keep in mind this is one GPU and one CPU; if you had 2 GPUs and 1 CPU, my radiator space would BARELY be enough.

Also keep in mind I have the AMD FX [email protected], which puts out a good bit more heat than other CPUs.

So here is what I recommend:

1 CPU + 2 GPUs = at least 500+mm of radiator space
1 CPU + 1 GPU = at least 360mm of radiator space

Best results for 1 CPU + 2 GPUs: 600+mm of radiator space (two 360s, or a 480+240, or 360+240)
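As a rough sketch of the sizing rules in this and the surrounding posts (the 120mm-per-block rule of thumb plus extra padding for hot, overvolted cards; the function itself is just an illustration, not an established formula):

```python
def min_radiator_mm(cpus, gpus, generous=False):
    """Approximate total radiator length (in mm of 120mm-fan slots).
    Base rule of thumb: ~120mm per block plus one 120mm of overhead;
    `generous` instead pads each GPU, for OC'd 290/290X-class cards."""
    base = 120 * (cpus + gpus)
    padding = 120 * gpus if generous else 120
    return base + padding

print(min_radiator_mm(1, 1))                 # -> 360, matches "at least 360mm"
print(min_radiator_mm(1, 2))                 # -> 480, bare minimum (500+mm advised above)
print(min_radiator_mm(1, 2, generous=True))  # -> 600, the "best results" figure
```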


----------



## bond32

Quote:


> Originally Posted by *ebduncan*
> 
> for the folks talking about water cooling.
> 
> I have a XSPC RX240 (push) and a Alphacool XT45 280(push/pull) cooling both gpu and cpu. I wouldn't mind having more radiator at all. I understand the rule of thumb of 120mm of rad per part. That is fine and handy if you want temperatures under control. However if you doing water cooling for the benefits of better cooling then that rule must be thrown out the window. Under full load (cpu and gpu) my liquid temp gets to 40c. Gpu core temp gets to around 58c and Cpu temp stays below 54c. Ambient temperatures are kinda high at around 24c.
> 
> Everything is safe and being cooled well. Keep in mind this is one gpu and one cpu. If you have 2 gpus and 1 cpu my radiator space would BARELY be enough.
> 
> Keep in mind I have the AMD FX [email protected] which puts out a good bit more heat than other cpus.
> 
> So here is what I recommend.
> 
> 1 Cpu+ 2 GPU's = at least 500+mm radiator space
> 1 CPu+ 1 GPU= at least 360mm radiator space
> 
> Best results 1Cpu+2Gpus= 600+mm radiator Space (two 360's, or a 480+240, or 360+240)


What about a thick 420 + 480, both in push/pull?


----------



## ebduncan

Quote:


> Originally Posted by *bond32*
> 
> What about a thick 420 + 480 both in push/pull


More than enough; with that you could do quadfire.


----------



## mojobear

Hi Arizonian,

Can I join the club please?

These were 3 reference Sapphire R9 290s... now with EK acetal copper blocks. Thanks!


----------



## Arizonian

Quote:


> Originally Posted by *mojobear*
> 
> Hi Arizonian,
> 
> Can I join the club please
>
> These were reference 3 sapphire R9 290s...now with EK acetal copper blocks. Thanks!
> 
> 
> Spoiler: Warning: Spoiler!


You sure may! Thanks for all the info too.

Congrats - added x3


----------



## iPDrop

So I'm getting literally no difference in performance in BF4 with Mantle when I switch back and forth from the DirectX 11 setting in the video tab of the options menu. So glad I gave AMD a chance. I've tried reinstalling 14.1, wiping the old drivers with both Driver Fusion and DDU.


----------



## Jack Mac

Restart the game and I'm sure you'll get some extra performance.


----------



## bloodkil93

Quote:


> Originally Posted by *iPDrop*
> 
> So I'm getting literally no difference in BF4 performance with Mantle when I switch back and forth from the DirectX 11 setting in the video tab of the options menu. So glad I gave AMD a chance. I've tried reinstalling 14.1, wiping old drivers with both Driver Fusion and DDU.


It does say clearly at the bottom that a restart is required when switching APIs.


----------



## bloodkil93

Quote:


> Originally Posted by *Arizonian*
> 
> You sure may! Thanks for all the info too.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added x3


Mind updating mine to water? I've got a Kraken G10 and an H55 now.


----------



## Ukkooh

You can update me to under water.


Spoiler: Warning: Bad quality pictures inside!


----------



## Arizonian

Quote:


> Originally Posted by *Ukkooh*
> 
> You can update me to under water.
> 
> 
> Spoiler: Warning: Bad quality pictures inside!


Congrats - updated


----------



## cam51037

I don't think I've ever requested to join this club.

I'm in for a single Sapphire 290 Tri-X OC. Here is an album with some pictures of the computer:


http://imgur.com/yIBdL


Although, technically I don't have the card right now, it's in for an RMA currently.


----------



## maynard14

Quote:


> Originally Posted by *bloodkil93*
> 
> Mind updating mine to water? I've got a Kraken G10 and an H55 now.


Hi sir, can you tell me how effective the G10 and H55 are on the 290?


----------



## bloodkil93

Quote:


> Originally Posted by *maynard14*
> 
> Hi sir, can you tell me how effective the G10 and H55 are on the 290?


This is my Sapphire 290X BF4 Edition at 1200/1680 (Hynix VRAM) using an NZXT Kraken G10 and a Corsair H55 AIO. I'm using the VRM heatsink from the stock cooler, which I cut up and screwed in:

The temps after 3 runs of Unigine Valley:
VRM1: 70C
VRM2: 61C
GPU: 48C

The card is using an overvolt of +200mV and a Power Limit of 30%

I'm using the ASUS PT1 BIOS so the GPU clock lowers a little bit when it's idle.

If you want any more benchmarks, please tell me and I'll get them!


----------



## maynard14

Quote:


> Originally Posted by *bloodkil93*
> 
> This is my Sapphire 290X BF4 Edition at 1200/1680 (Hynix VRAM) using an NZXT Kraken G10 and a Corsair H55 AIO. I'm using the VRM heatsink from the stock cooler, which I cut up and screwed in:
> 
> The temps after 3 runs of Unigine Valley:
> VRM1: 70C
> VRM2: 61C
> GPU: 48C
> 
> The card is using an overvolt of +200mV and a Power Limit of 30%
> 
> I'm using the ASUS PT1 BIOS so the GPU clock lowers a little bit when it's idle.
> 
> If you want any more benchmarks, please tell me and i'll get them!


ooooohhhh nice! But sir, can you take a picture of the "VRM heatsink from the stock cooler which I cut up and screwed in"?

pls pls... haha, because I don't know where to buy VRM heatsinks for the 290. I'm planning to also buy a G10 and a Kraken X40.


----------



## Paul17041993

Quote:


> Originally Posted by *maynard14*
> 
> ooooohhhh nice! But sir, can you take a picture of the "VRM heatsink from the stock cooler which I cut up and screwed in"?
> 
> pls pls... haha, because I don't know where to buy VRM heatsinks for the 290. I'm planning to also buy a G10 and a Kraken X40.


Usually you can get heatsink kits which have everything, e.g. this one:
http://www.arctic.ac/worldwide_en/products/cooling/spare-parts/heatsink-accelero-xtreme-7970.html


----------



## bloodkil93

Quote:


> Originally Posted by *maynard14*
> 
> ooooohhhh nice! But sir, can you take a picture of the "VRM heatsink from the stock cooler which I cut up and screwed in"?
> 
> pls pls... haha, because I don't know where to buy VRM heatsinks for the 290. I'm planning to also buy a G10 and a Kraken X40.


Yeah sure, give me about 30 mins


----------



## maynard14

Quote:


> Originally Posted by *bloodkil93*
> 
> Yeah sure, give me about 30 mins


all right bro! thank you so much


----------



## maynard14

Quote:


> Originally Posted by *Paul17041993*
> 
> Usually you can get heatsink kits which have everything, e.g. this one:
> http://www.arctic.ac/worldwide_en/products/cooling/spare-parts/heatsink-accelero-xtreme-7970.html


Thanks sir, but the problem is I live here in the Philippines and I don't have a credit card to buy online, so VRM heatsinks are my major problem; I don't know where to buy them here in my country.


----------



## Paul17041993

Quote:


> Originally Posted by *maynard14*
> 
> Thanks sir, but the problem is I live here in the Philippines and I don't have a credit card to buy online, so VRM heatsinks are my major problem; I don't know where to buy them here in my country.


I see, though you could always get a block of copper and grind your own one


----------



## maynard14

Quote:


> Originally Posted by *Paul17041993*
> 
> I see, though you could always get a block of copper and grind your own one


ahmm do you have any links for a tutorial sir? thanks again


----------



## maynard14

Lastly sir, which one of these can I use on the 290?

http://www.cdrking.com/index.php?productstype=All+Products&searchvalue=heatsink&x=0&y=0&mod=products&type=search


----------



## psyside

Quote:


> Originally Posted by *NirHahs*
> 
> So, for 1050/1350 there's no need to touch the volts. What do you mean by 100mV? Did you mean 0.1?


I mean +100mV on the core in Afterburner or Trixx.


----------



## iPDrop

Quote:


> Originally Posted by *bloodkil93*
> 
> It does say clearly at the bottom that a restart is requited when switching API's


Yeah, I have, and there's still no difference in fps.


----------



## Widde

Quote:


> Originally Posted by *iPDrop*
> 
> Yeah I have and still there is no difference in fps


Then something is off. I'm getting a decent chunk of a performance increase, 80-90ish fps up to 100-120ish. I'd have to run something that records my average fps, but the increase is very, very noticeable.


----------



## iPDrop

Quote:


> Originally Posted by *Widde*
> 
> Then something is off. I'm getting a decent chunk of a performance increase, 80-90ish fps up to 100-120ish. I'd have to run something that records my average fps, but the increase is very, very noticeable.


Sometimes I start the game up and my frame rate will be cut in half.


----------



## Heinz68

*VisionTek CryoVenom R9 290 Liquid Cooling edition buy directly from VisionTek $550*

I got this email from VisionTek; since I already bought two Sapphire R9 290X 4GB Tri-X OC cards, it's too late for me.
Quote:


> Dear CryoVenom Enthusiast,
> 
> Thank you for your interest in purchasing our liquid cooled CryoVenom R9 290 graphics card. Because of an unforeseen limited supply of required
> components and the card's hand-built process, small batches of CryoVenom cards will be released on an on-going basis.
> 
> *The next limited availability of cards will be on Monday, February 10, 2014 on a first come, first served basis starting at 10:00AM Central Time.
> *
> To purchase a CryoVenom R9 290 card, please visit:
> http://www.visiontekproducts.com/index.php/component/virtuemart/graphics-cards/visiontek-cryovenom-liquidcooled-series-r9-290-detail?Itemid=0
> 
> If you are unable to purchase a card on the above date, or find that CryoVenom is out of stock by the time you place your order, don't worry! We
> will keep you on this exclusive mailing list unless you purchase a card or unsubscribe.


----------



## SeanEboy

So... I'm going for a 290. Which one is "the" one to get? I will be putting my own waterblocks on them. I'm thinking this means XFX for the lifetime warranty? Am I right on that?

That VisionTek sounds good, but the 1-year warranty kind of blows compared to XFX, no?


----------



## Heinz68

Quote:


> Originally Posted by *Heinz68*
> 
> Arizonian please add me to the club
> Sapphire R9 290 4GB TRI-X OC in Crossfire
> Both Hynix memory
> TPU Validation

Arizonian, I made a mistake when submitting the above club registration: I have the X version of the R9 290 cards. Can you please correct it? Thank you, Heinz


----------



## taem

Quote:


> Originally Posted by *Heinz68*
> 
> Arizonian I made a mistake when submitting the above club registration, I have the X version of the R9 290 cards, can you please correct it, Thank You Heinz


What temps on the cards? I want to crossfire 290 tri-xs but I worry my air cooled mid tower won't be able to dissipate the heat. And what's your case/case fan setup?


----------



## Arizonian

Quote:


> Originally Posted by *Heinz68*
> 
> Arizonian I made a mistake when submitting the above club registration, I have the X version of the R9 290 cards, can you please correct it, Thank You Heinz


Nice - updated


----------



## L36

Well, my card does 1300 core / 1550 memory with a +200mV offset. VRMs never break 60C during Valley benchmark runs, although this is at stock CPU clock; maybe I should do a proper run... Anyway, glad I went with a water solution.


----------



## Matt-Matt

Quote:


> Originally Posted by *rdr09*
> 
> the 290 PRO and 290X use the same waterblock.
> +220mV as in a +220 offset? I used Trixx and applied a +200 offset for benching, and GPU-Z showed 1.42v at load.


Just looked it up, all reference stuff fits between cards.


----------



## aiyaaabatt

I just bought two used (1 month old, and tested working properly) XFX reference r9 290x's for $800 on craigslist. Both cards came with original box, all accessories, and unredeemed game codes.

I won, right? Or was it just lukewarm?


----------



## Matt-Matt

Quote:


> Originally Posted by *aiyaaabatt*
> 
> I just bought two used (1 month old, and tested working properly) XFX reference r9 290x's for $800 on craigslist. Both cards came with original box, all accessories, and unredeemed game codes.
> 
> I won right? Or was it just luke-warm?


You won; that is the regular 290's MSRP. You did good, real good.


----------



## Ukkooh

Quote:


> Originally Posted by *aiyaaabatt*
> 
> I just bought two used (1 month old, and tested working properly) XFX reference r9 290x's for $800 on craigslist. Both cards came with original box, all accessories, and unredeemed game codes.
> 
> I won right? Or was it just luke-warm?


So the mining craze is ending?


----------



## Arizonian

Quote:


> Originally Posted by *aiyaaabatt*
> 
> I just bought two used (1 month old, and tested working properly) XFX reference r9 290x's for $800 on craigslist. Both cards came with original box, all accessories, and unredeemed game codes.
> 
> I won right? Or was it just luke-warm?


You sure did good on that one. When you get the chance look at OP and submit the info needed to be added to the roster.








Quote:


> Originally Posted by *Ukkooh*
> 
> So the mining craze is ending?


Highly doubt it. I guess at some point all the miners will have had their GPUs, and if sales slow down it may force a price drop again. I'd like to see a 290X with non-reference cooling go down to $600.


----------



## Prozillah

Hi guys,

I'm in a pickle here: I've got the option to buy either 2x Gigabyte Windforce 290 OC or the PowerColor PCS+ 290 OC. Can't decide, as the reviews aren't out on the PowerColor, but they are exactly the same price and people have been raving about how good the cards are.

Which card is better??


----------



## psyside

Quote:


> Originally Posted by *Prozillah*
> 
> Hi guys,
> 
> I'm in a pickle here: I've got the option to buy either 2x Gigabyte Windforce 290 OC or the PowerColor PCS+ 290 OC. Can't decide, as the reviews aren't out on the PowerColor, but they are exactly the same price and people have been raving about how good the cards are.
> 
> Which card is better??


PCS.


----------



## Cool Mike

Prices have gotten crazy. Look at Newegg: mid-$700s now. Pure greed.

Go away Miners!


----------



## Cool Mike

Hello,

I purchased two PCS+ 290s last week. Running very well and cool. Even the top card (crossfire) never gets above 88C, and that's playing BF4 with Ultra settings at 4K resolution.









You need two for 4K resolution.


----------



## psyside

Quote:


> Originally Posted by *Cool Mike*
> 
> Hello,
> 
> I purchased two PCS+ 290s last week. Running very well and cool. Even the top card (crossfire) never gets above 88C, and that's playing BF4 with Ultra settings at 4K resolution.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You need two for 4K resolution.


Can you please do one test for us?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Cool Mike*
> 
> Hello,
> 
> I purchased two PCS+ 290s last week. Running very well and cool. Even the top card (crossfire) never gets above 88C, and that's playing BF4 with Ultra settings at 4K resolution.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You need two for 4K resolution.


4K at what fps?

I'm curious about this as i'm stuck between going 4K or a 120hz/1440p screen for my next monitor


----------



## Widde

How do people get these images with frame times and average fps?







Fraps .csv files are "Insert profanity here"
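Those graphs usually come from FRAPS' benchmark log (frametimes.csv), which records one cumulative millisecond timestamp per frame; the viewer tools just plot that file. A minimal sketch of computing average fps from it yourself (the column layout is assumed from FRAPS' usual output):

```python
import csv
import io

# FRAPS frametimes log: header "Frame, Time (ms)" followed by one
# cumulative timestamp per frame. Sample data stands in for a real log.
sample = io.StringIO(
    "Frame, Time (ms)\n"
    "1, 0.000\n"
    "2, 16.667\n"
    "3, 33.333\n"
    "4, 50.000\n"
)

def average_fps(csv_file):
    reader = csv.reader(csv_file)
    next(reader)  # skip the header row
    times = [float(row[1]) for row in reader]
    intervals = len(times) - 1          # frames rendered between timestamps
    elapsed_ms = times[-1] - times[0]   # total duration of the capture
    return intervals / (elapsed_ms / 1000.0)

print(round(average_fps(sample)))  # 16.67 ms frame times -> 60 fps
```

The per-frame times for the frametime graph itself are just the successive differences of that same column.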


----------



## Csokis

My Sapphire Tri-X card arrived, but I've had a black screen 3 times.









1: Desktop
2: Running GPU-Z
3: Running 3DMark

Bad card and RMA?














Or driver?


----------



## rdr09

Quote:


> Originally Posted by *Csokis*
> 
> Arriving the Sapphire Tri-X card. But 3 times BS screen.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1: Desktop
> 2: Running GPU-Z
> 3. Running 3DMark
> 
> Bad card and RMA?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Or driver?


Follow this thread, especially the last post . . .

http://www.overclock.net/t/1441349/290-290x-black-screen-poll/590#post_21750825


----------



## Cool Mike

Running two 290s with BF4 ultra settings, no AA (at 4K you don't need it):
Low FPS - 38
High - 76
Average - 48


----------



## maynard14

hi again

Just benchmarked my R9 290 that's unlockable to a 290X.

With the stock R9 290 BIOS:



Stock voltage of the card, OC'd to 1070 core clock and 1270 memory clock:



And lastly, here's the unlocked 290X.

Stock XFX 290X BIOS:



And OC'd using stock voltage, core clock 1070, memory clock 1260:



A nice jump in fps, right? My only problem now is the temp...







Gotta get an NZXT G10 and Kraken X40 and check the VRM temps.


----------



## Csokis

Quote:


> Originally Posted by *rdr09*
> 
> Follow this thread, especially the last post . . .
> 
> http://www.overclock.net/t/1441349/290-290x-black-screen-poll/590#post_21750825


Thanks, but it did not help.







Always black screens, randomly.


----------



## rdr09

Quote:


> Originally Posted by *Csokis*
> 
> Thanks, But it did not help.
> 
> 
> 
> 
> 
> 
> 
> Always BS, randomly.


I doubt it's your PSU. I would RMA it if it black screens at stock. If mine did, it would be boxed for shipping within the hour.

We have a hardware rep here on OCN. I think his name is VaporX.

edit: http://www.overclock.net/t/1429534/greetings-ocn-from-sapphire


----------



## Kriant

Quote:


> Originally Posted by *Csokis*
> 
> Thanks, But it did not help.
> 
> 
> 
> 
> 
> 
> 
> Always BS, randomly.


Which drivers are you using?
My second card black screens with 14.1, but runs like a champ with the 13.11 9.5 beta.


----------



## BradleyW

Quote:


> Originally Posted by *Kriant*
> 
> Which drivers are you using?
> My second card BS with 14.1, but runs like a champ with 13.11 9.5 beta.


Would this second card happen to have Elpida ICs?


----------



## Kriant

Quote:


> Originally Posted by *Kriant*
> 
> Which drivers are you using?
> My second card BS with 14.1, but runs like a champ with 13.11 9.5 beta.


P.S. With 14.1 it black screens in 3DMark, BF4, and even on the plain desktop.


----------



## Kriant

Quote:


> Originally Posted by *BradleyW*
> 
> Would this second card happen to have Elpida IC's?


Yuuuup.
But the first one doesn't black screen, and it also has Elpida ICs.

What's the catch you were referring to?


----------



## Widde

Got some unusual stuff happening in BF4 now. I haven't changed anything at all, and now all of a sudden my frames have dropped to 40ish on Ultra with MSAA and post-AA off, 200% res scale :S Restarting the PC was to no avail.

http://piclair.com/g5c4e - GPU usage before the BF4 crash


----------



## BradleyW

Quote:


> Originally Posted by *Kriant*
> 
> Yuuuup.
> But the first one doesn't black screen, and it also has Elpida ICs.
> 
> What's the catch you were referring to?


Thanks for that information.
Any idea why just one of your cards is having issues? Both my cards crash or blue screen instantly on 14.1. I have Elpida on both.


----------



## maynard14

I think the 14.1 driver has a lot of bugs. Mine too: after I use my PC for a long period, my fps drops to 30ish in Tomb Raider and Crysis 3, and when it crashes, Event Viewer shows me "Display driver amdkmdap stopped responding and has successfully recovered."

Now I'm using 13.12 with no issues.


----------



## rdr09

Quote:


> Originally Posted by *BradleyW*
> 
> Thanks for that information.
> Any idea why just one of your cards is having issues? Both my cards crash or blue screen instantly on 14.1. I have Elpida on both.


I only have one 290 and it has Elpida. The only issue I had with 14.1 was that I was not able to play BF4; it crashes after "Transitioning". Reverted back to trusty 13.11.


----------



## Widde

Quote:


> Originally Posted by *maynard14*
> 
> I think the 14.1 driver has a lot of bugs. Mine too: after I use my PC for a long period, my fps drops to 30ish in Tomb Raider and Crysis 3, and when it crashes, Event Viewer shows me "Display driver amdkmdap stopped responding and has successfully recovered."
> 
> Now I'm using 13.12 with no issues.


Ah, saw that mine stopped responding as well and caused BF4 to crash, at stock clocks I should add.


----------



## Csokis

Quote:


> Originally Posted by *rdr09*
> 
> I doubt it's your PSU. I would RMA it if it black screens at stock. If mine did, it would be boxed for shipping within the hour.
> 
> We have a hardware rep here on OCN. I think his name is VaporX.
> 
> edit: http://www.overclock.net/t/1429534/greetings-ocn-from-sapphire


I'm not overclocking! Stock i7-3770 and stock R9 290 Tri-X card. Booting Win 8.1, and black screen.







13.12 and 14.1 give the same results.







Going to RMA...


----------



## BradleyW

Quote:


> Originally Posted by *maynard14*
> 
> I think the 14.1 driver has a lot of bugs. Mine too: after I use my PC for a long period, my fps drops to 30ish in Tomb Raider and Crysis 3, and when it crashes, Event Viewer shows me "Display driver amdkmdap stopped responding and has successfully recovered."
> 
> Now I'm using 13.12 with no issues.


I also had that issue! I too am running the WHQL drivers right now.


----------



## Kriant

Quote:


> Originally Posted by *BradleyW*
> 
> Thanks for that information.
> Any idea why just one of your cards is having issues? Both my cards crash or blue screen instantly on 14.1. I have Elpida on both.


No clue.
I know that all 3 of my cards are Elpida.
I can't take them out completely, since they are in a loop, but I can turn PCI-E slots off via mobo control.
So I ran the first card -> 3DMark, Heaven, BF4 -> no issues; sat on the desktop for hours -> no issues.
Turned off the first and third PCI-E slots, left only the 2nd live -> card two goes crazy with 14.1 no matter the OC, PCI-E 3.0 setting on/off, Eyefinity on/off.
AND any combination of cards 1+2, 1+3, 1+2+3 black screens in games with the 14.1 drivers.


----------



## BradleyW

Quote:


> Originally Posted by *Kriant*
> 
> No clue.
> I know that all 3 of my cards are Elpida.
> I can't take them out completely, since they are in a loop, but I can turn pci-e slots off via mobo control.
> So I've ran first cards -> 3d mark, heaven, BF4 -> no issues, set on desktop for hours -> no issues.
> Turned off the first pci-e slot and 3rd, left only 2nd live -> card two goes crazy with 14.1 no matter the OC, PCI-E 3.0 setting on/off, eyefinity on/off.
> AND any combination of card 1+2, 1+3, 1+2+3 blackscreens in games with 14.1 drivers.


It sounds like that card is faulty, but it can't be since 13.12 is fine.


----------



## Kriant

Quote:


> Originally Posted by *BradleyW*
> 
> It sounds like that card is faulty, but it can't be since 13.12 is fine.


I know, right?
I reverted back to 13.11 and ran FurMark for 2h, ran Heaven 4.0 for 2h, played BF4, and ran 3DMark 13 on Extreme multiple times.
Not a single issue.


----------



## BradleyW

Quote:


> Originally Posted by *Kriant*
> 
> I know, right?
> I've reverted back to 13.11 and ran furmark for 2h, ran heaven 4.0 for 2h, played BF4, ran 3dmark 13 on extreme multiple times.
> Not a single issue.


Maybe the new drivers are tapping into more of the card, which is bringing out the fault? If that's right, both my cards will have to go back!


----------



## Kriant

Quote:


> Originally Posted by *BradleyW*
> 
> Maybe the new drivers are tapping into more of the card, which is bringing out the fault? If that's right, both my cards will have to go back!


Same =\
I've reported it to AMD, and I'll wait till the next drivers; taking the loop apart is always a pain.


----------



## mojobear

14.1 beta (...aka alpha) is full of bugs and not worth the trouble. If your cards were fine with 13.12, then your cards are fine. I have 3 Elpida 290s as well; they crash instantly after the load screen in BF4. Tried for 20 min, then said F&^% it and I'm sticking to 13.12 until it's fixed. Really, you lose nothing with 13.12 over 14.1; you potentially gain some FPS with 14.1, but it's stuttery FPS, which also sucks.

I'm excited about Mantle, but it's a poor showing for a first driver. Smells of meeting deadlines and releasing an unpolished product, even for a beta (alpha). You can blame DICE as well, but really, when you break Crossfire even for DirectX games with 14.1, that's AMD's problem. If you had a SINGLE 290(X) it might be worth troubleshooting, but 14.1 Crossfire is a lost cause.

Just wait for a better release in a month or so and jump on the wagon then.


----------



## axiumone

Quote:


> Originally Posted by *Heinz68*
> 
> *VisionTek CryoVenom R9 290 Liquid Cooling edition buy directly from VisionTek $550*
> 
> I got this Email from VisionTek,since I already bought two Sapphire R9 290X 4GB TRI-X OC it's too late for me.


Servers are hammered to hell. Getting 503 errors trying to order.


----------



## pkrexer

I too am seeing all kinds of problems with the 14.1 betas (should they be called alphas, maybe?)... BF4 will randomly crash or lock up. I'm getting random performance drops (fps cut in half) that are only fixable by rebooting my computer. I can't even run Fire Strike without it crashing. Metro LL locks up before it even gets to the game menu.

The WHQL drivers work fine, and I'm sticking with those until maybe a few more revisions into the 14s.


----------



## Flisker_new

Hi guys,

Any owner of an ASUS DCUII R9 290 who could tell me the max voltage available on that card without a physical volt mod?

edit: Oh, and how are the VRM temps, please?

Thanks


----------



## Abyssic

So I finally got my 290X ^^



AMD R9 290X Sapphire Tri-X Edition on air

But I've already had some issues:







From time to time some games just freeze and stop responding for no reason. I've always checked the readings after this occurred; nothing suspicious in temps, clocks, or usage. Anyone having similar issues? (BTW: the newest beta driver (Mantle) is installed.)


----------



## jamaican voodoo

It seems almost everyone is having some sort of issue with the 14.1 beta driver. I went back to 13.12 and I'm gaming fine!!!


----------



## MrWhiteRX7

Realized I've never added my other cards... this is a terribad picture, and yes, I know it needs cable/hose management, but ehhhh, it works great.









This was before I added the quad channel ram kit, I'll post an updated pic when I get home from work lol.


----------



## Widde

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Realized I've never added my other cards... this is a terribad picture and yes I know it needs cable/ hose management but ehhhh it works great
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This was before I added the quad channel ram kit, I'll post an updated pic when I get home from work lol.


The want factor is high on this one


----------



## trihy

Anyone testing COD: Ghosts?

With 14.1 I can't get into the game; the video scenes work, but as the engine starts... the image freezes with some squares on screen.

With 13.12 and MSAA x4 I get the occasional black screen.

With SMAA, no problems so far...


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Widde*
> 
> The want factor is high on this one


LOL thanks! It's ugly as hell but it works









One day I'll go all out and make a good looking version!


----------



## Abyssic

Does anyone know when AMD is going to release a new official (non-beta) driver?


----------



## Widde

Quote:


> Originally Posted by *Abyssic*
> 
> does someone know anything about when amd is going to release a new official driver? (non beta)


Probably (big maybe) sometime in March, I hope.


----------



## corsairfan

Will a Corsair RM 750W be enough to power an i7 4770K and R9 290X, both overclocked to the max with a custom loop?


----------



## phallacy

Quote:


> Originally Posted by *corsairfan*
> 
> will corsair rm 750w be enough to power i7 4770k and r9 290x ,both overclocked to max with custom loop?


I used an HX750 to power an OC'd i5 3570K and a 290X without any problems. I'm not sure about the RM series, though, as there was this thread with a lot of consensus on the relatively bad quality of the RM series of PSUs. Take a look.
http://www.overclock.net/t/1455892/why-you-should-not-buy-a-corsair-rm-psu


----------



## szeged

Have there been any sightings of an ASUS 290X Matrix yet? Or any rumors of it? If so, I'd easily sell/trade my Kingpin 780 Ti to get one as fast as possible lol.


----------



## VSG

Tired of the KPE even before you benched it? That's a new record for you I think


----------



## szeged

Quote:


> Originally Posted by *geggeg*
> 
> Tired of the KPE even before you benched it? That's a new record for you I think


Lol, I have two KPEs; I'd just trade one in and keep the better one.







Nah, I'm not tired of them at all; I'm just a major fanboy of the Matrix 7970 and want a 290X Matrix lol.


----------



## Jack Mac

Quote:


> Originally Posted by *szeged*
> 
> Lol, I have two kpes I'd just trade one in and keep the better one
> 
> 
> 
> 
> 
> 
> 
> nah I'm not tired of them at all, I'm just a major fanboy of the matrix 7970 and want a 290x matrix lol.


lol I found szeged, just replace weaponry with video cards. "You can never have enough."


Spoiler: Warning: Spoiler!










Hopefully he doesn't trip lol


----------



## VSG

rofl that's amazing!


----------



## HOMECINEMA-PC

A very hyped and overpriced piece of tech........ here in oz anyways


----------



## Abyssic

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> A very hyped and overpriced piece of tech........ here in oz anyways


what exactly are you talking about?


----------



## Arizonian

Quote:


> Originally Posted by *Abyssic*
> 
> So I finally got my 290X ^^
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> AMD R9 290X Sapphire Tri-X Edition on air
> 
> but i've already had some issues
> 
> 
> 
> 
> 
> 
> 
> From time to time some games just freeze and stop responding for no reason. I've always checked the readings after this occurred; nothing suspicious in temps, clocks, or usage. Anyone having similar issues? (BTW: the newest beta driver (Mantle) is installed.)


Congrats - added


----------



## rabidz7

All that the lazies at ATi had to do to slay a TITAN was have 3072 streamies. But they just sat on the couch and watched TV while NVIDIA pounded them. Was it too hard to add 1750MHz VRAM?


----------



## szeged

1750MHz RAM on a 512-bit bus at stock? I doubt that would happen even on custom boards.


----------



## Paul17041993

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Realized I've never added my other cards... this is a terribad picture and yes I know it needs cable/ hose management but ehhhh it works great
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This was before I added the quad channel ram kit, I'll post an updated pic when I get home from work lol.


Needs more fans.

Though more seriously: try getting a good rack/stand to mount the radiators on, set the fans up on a dedicated controller that adjusts to the temperature of the water coming into the rads, get a few quick-disconnects and a few meters of tubing, and you can then locate them wherever you want!


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Paul17041993*
> 
> needs more fans.
> 
> though more seriously, try getting a good rack/stand to mount the radiators on, set the fans up on a dedicated controller that adjusts on the temperature of the water coming into the rads, get a few quick-disconnects and a few meters of tubing and you can then locate them wherever you want!


Thanks for the advice. I actually have two shroud/stands and was thinking about some quick-disconnects, but I just really wanted to get it back up and running first. I'll go through and make it beast mode soon enough. I really want to run hardlines.


----------



## mojobear

Hey all,

Noticed something interesting. I was thinking about flashing to the Tri-X BIOS (*anyone have any experience?!?!?!*) and was looking at TechPowerUp ROMs for Sapphire R9 290s.

http://www.techpowerup.com/vgabios/index.php?architecture=ATI&manufacturer=&model=R9+290&interface=&memType=&memSize=

Older ones have this:

Flags: Optimal Perf
Memory Support
4096 MB, GDDR5, Autodetect
4096 MB, GDDR5, Hynix H5GQ2H24AFR
4096 MB, Unknown, Elpida EDW2032BBBG

New ones have this:

Memory Support
4096 MB, GDDR5, Autodetect
4096 MB, GDDR5, Hynix H5GQ2H24AFR
4096 MB, GDDR5, Elpida *EDW2032BBBG_DEBUG2*

So maybe the black screens and stuff really were due to the Elpida memory? I know this issue is probably long fixed, but still... thought it was interesting.


----------



## Abyssic

Hey guys, I thought I'd share this.


----------



## JordanTr

Coming back to the 14.1 drivers: I got issues too, like Lineage 2 performance cut in half and black screens playing Crysis 2 every 10-15 minutes, sometimes even after 2-3 mins, even at stock speeds. I used a driver cleaner to revert back to 13.12, and all issues are gone like a bad dream.


----------



## Heinz68

Quote:


> Originally Posted by *taem*
> 
> What temps on the cards? I want to crossfire 290 tri-xs but I worry my air cooled mid tower won't be able to dissipate the heat. And what's your case/case fan setup?




The case is a Cooler Master Cosmos S. I added the bottom fan on the back by removing all the spare slot covers and using sticky foam pads to attach the fan on the outside.
All the fans except the side panel are Cougar 12cm Blue LED Hydraulic (Liquid) Bearing, 1200RPM, 64.4CFM, 16.6dBA.

The setup, including the CPU and mobo, is new, so I didn't have much time to run many benches or do any overclocking, except that the GPU memory is set at 1500MHz. The CPU is at stock speed.
Ambient temperature was 25C on all tests. Display driver 13.12.

*After couple runs 3DMARK 11 Performance* - Score P20465 LINK
HWiNFO64 Sensors =
Top card - GPU temp 76C, VRM1 79C, VRM2 62C, Fan 50%
Bottom Card - GPU temp 71C, VRM1 71C, VRM2 54C, Fan 50%

*Unigine Valley Benchmark 1.0 after 10 minutes pre warming*
HWiNFO64 Sensors
Top card - GPU temp 79C, VRM1 82C, VRM2 64C, Fan 50%
Bottom Card - GPU temp 73C, VRM1 73C, VRM2 56C, Fan 50%



*Litecoin mining* is a real stress test. I have the memory OC'd at 1500MHz to get a 970 kH/s hashrate on each card. In order to stop the top card from throttling and the occasional black screen, I increased the voltage by 15 and the power limit by 30.

HWiNFO64 Sensors
Top card - GPU temp 85C, VRM1 99C, VRM2 71C, Fan 80%
Bottom Card - GPU temp 80C, VRM1 79C, VRM2 62C, Fan 68%

For the mining I have custom fan profile set in the MSI Afterburner.
The Kill-A-Watt shows an 860-watt pull from the wall. I usually run the top card at lower intensity to avoid the high VRM1 temps, getting about a 700 kH/s hashrate for that card, plus being able to use the PC for other work at the same time.

My cards are not sandwiched; there is an empty double slot between them. I think you might need water cooling with 3 cards sandwiched if you are looking for any voltage overclocking.


----------



## Paul17041993

Quote:


> Originally Posted by *rabidz7*
> 
> All that the lazies at ATi had to do to slay a TITAN was have 3072 streamies. But they just sat on the couch and watched TV while NVIDIA pounded them. Was it too hard to add 1750MHz VRAM?


Pretty sure only 172 of the total 192 units in the chip are enabled on the 290X, for both optimization and redundancy. 1750MHz GDDR5 is only possible on the power-tolerant 384-bit (triple channel) memory controller; the 512-bit (quad) controller in the 290/X is much more power efficient and better suited for the shader count, high-res textures and ultimately Eyefinity, 1440p and 4K monitor setups.

If you manage to get the memory clock to 1600MHz, that's 409.6GB/s max potential memory bandwidth, almost 22% faster than the 780 Ti's memory at its stock 1750MHz.
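The bandwidth figure above checks out; as a quick sketch (GDDR5 transfers 4 bits per pin per memory-clock cycle, so peak bandwidth = clock × 4 × bus width in bytes):

```python
def gddr5_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak GDDR5 bandwidth in GB/s: clock (MHz) x 4 transfers x bytes per cycle."""
    return mem_clock_mhz * 4 * (bus_width_bits / 8) / 1000

r9_290_oc = gddr5_bandwidth_gbs(1600, 512)   # overclocked 290/290X
gtx_780ti = gddr5_bandwidth_gbs(1750, 384)   # stock 780 Ti
print(r9_290_oc, gtx_780ti)                  # 409.6 vs 336.0 GB/s, ~22% apart
```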


----------



## BradleyW

Quote:


> Originally Posted by *JordanTr*
> 
> Coming back to the 14.1 drivers. I got issues too: Lineage 2 performance cut in half, black screens playing Crysis 2 every 10-15 minutes, sometimes even after 2-3 minutes at stock speeds. Used Driver Cleaner to revert back to 13.12, and all issues were gone like a bad dream


Do you run Hynix or Elpida? I think Elpida is having issues on 14.1 drivers. (I've conducted a poll for over 10 days and Elpida users are suffering on 14.1).


----------



## Abyssic

Quote:


> Originally Posted by *BradleyW*
> 
> Do you run Hynix or Elpida? I think Elpida is having issues on 14.1 drivers. (I've conducted a poll for over 10 days and Elpida users are suffering on 14.1).


i've got hynix and i've also got issues


----------



## ebduncan

Quote:


> The Litecoin mining is the real stress test. I have the memory OC'd at 1500 MHz to get a 970 kH/s hashrate on each card. To stop the top card from throttling and the occasional black screen, I increased the voltage by 15 and the power limit by 30.


Yes, mining is the real stress test. I have power increased by 40, voltage by 30mv; my R9 290 is running at 50C full load under water. 1140 core/1600 memory mining, getting hash rates just over 1000 kH/s. Took a while to find the right settings for mining. Adding core MHz didn't help past 1140, and memory MHz gains stopped at 1500MHz. Between 1500-1600 was a mixed bag. Then magically, at 1600MHz memory and only 1140 core, I managed to break the 1000 kH/s barrier.

Current settings for me are tc-24600, worksize-512, vectors-1, gpu threads-1, gpu lockup-1, intensity 19 (intensity 20 reduced hash rate)

using cgminer 3.5

Catalyst version 13.12

I would like to note I have issues with 14.1; my R9 gets black artifacts all over the screen. I have Hynix memory. The 14.1 drivers are truly beta drivers, and I will wait for later revisions.
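The settings listed above map roughly onto a cgminer JSON config file. A hedged sketch: key names follow cgminer's config format as I understand it for the scrypt era, "tc" is assumed to mean thread-concurrency, and "gpu lockup" is omitted since it doesn't correspond to an option I can confirm; verify everything against your cgminer build's README.

```python
import json

# ebduncan's reported cgminer scrypt settings expressed as a config fragment.
# All key names are assumptions to be checked against your cgminer version.
settings = {
    "intensity": "19",            # 20 reduced his hash rate
    "worksize": "512",
    "thread-concurrency": "24600",
    "vectors": "1",
    "gpu-threads": "1",
}
print(json.dumps(settings, indent=2))  # paste into cgminer.conf after checking keys
```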


----------



## neurotix

Quote:


> Originally Posted by *Paul17041993*
> 
> Pretty sure only 172 of the total 192 units in the chip are enabled on the 290X, for both optimization and redundancy. 1750MHz GDDR5 is only possible on the power-tolerant 384-bit (triple channel) memory controller; the 512-bit (quad) controller in the 290/X is much more power efficient and better suited for the shader count, high-res textures and ultimately Eyefinity, 1440p and 4K monitor setups.
> 
> If you manage to get the memory clock to 1600MHz, that's 409.6GB/s max potential memory bandwidth, almost 22% faster than the 780 Ti's memory at its stock 1750MHz.


176 TMUs on a 290X are enabled.

160 TMUs are enabled on the 290.

Close, but not quite accurate.


----------



## maynard14

Quote:


> Originally Posted by *neurotix*
> 
> 176 TMUs on a 290X are enabled.
> 
> 160 TMUs are enabled on the 290.
> 
> Close, but not quite accurate.


why would AMD not use all 192 units? I'm curious







thanks


----------



## ebduncan

Quote:


> Originally Posted by *maynard14*
> 
> why would AMD not use all 192 units? I'm curious
> 
> 
> 
> 
> 
> 
> 
> thanks


Production yields.

If they disable a few units, more dies can be sold even if they have defects; i.e. they can still sell a die with a bad stream processor, because not all of them are activated in the final product.

It's a rather common practice for increasing yields in semiconductor manufacturing
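The yield point can be made concrete with a toy binomial model. The 44/40 unit counts match Hawaii's CU counts on the 290X/290, but the 2% per-unit defect rate is a made-up illustrative number, not real fab data:

```python
import math

def sellable_fraction(total_units: int, enabled_units: int, p_defect: float) -> float:
    """Fraction of dies usable when up to (total - enabled) defective units
    can be fused off: P(defects <= spares) under a binomial defect model."""
    spares = total_units - enabled_units
    return sum(
        math.comb(total_units, k) * p_defect**k * (1 - p_defect) ** (total_units - k)
        for k in range(spares + 1)
    )

full_die = sellable_fraction(44, 44, 0.02)  # every CU must be perfect
binned   = sellable_fraction(44, 40, 0.02)  # up to 4 bad CUs tolerated
print(f"{full_die:.1%} vs {binned:.1%}")    # redundancy rescues most dies
```

Even this crude model shows why vendors fuse off units: tolerating a handful of bad units more than doubles the sellable fraction at this defect rate.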


----------



## maynard14

Quote:


> Originally Posted by *ebduncan*
> 
> production yields.
> 
> if they disable a few units it means more samples can be sold, if they have defects. Ie they can still sell a die if it has one bad stream processor, because not all of them are activated in the final product.
> 
> Its rather common practice for increasing yields in semiconductor manufacturing


ohhh i see,. so thats why. thank you sir


----------



## Paul17041993

Quote:


> Originally Posted by *neurotix*
> 
> 176 TMUs on a 290X are enabled.
> 
> 160 TMUs are enabled on the 290.
> 
> Close, but not quite accurate.


Quote:


> Originally Posted by *Paul17041993*
> 
> Pretty sure only 172 of the total 192 units in the chip are enabled on the 290X, for both optimization and redundancy.


There would be spare units to allow for binning; this is especially needed on such a massive chip. Only AMD knows the exact amount inside, but the rumor is 192, which makes sense as it's 64*3 (16 * 12), exactly 150% of a 7970.


----------



## axiumone

Put in an order for 4x visiontek cryovenom 290's today. They charged my card. So hopefully the order will be fulfilled.


----------



## tsm106

Quote:


> Originally Posted by *axiumone*
> 
> Put in an order for 4x visiontek cryovenom 290's today. They charged my card. So hopefully the order will be fulfilled.


Damn, you gone all in on this hand ya. Let it ride baby!


----------



## szeged

Quote:


> Originally Posted by *tsm106*
> 
> Damn, you gone all in on this hand ya. Let it ride baby!


i know this is completely off topic, but do you happen to know what your avatar is from? without the edited in mantle logo crashing down that is







it looks familiar but i just cant place it.


----------



## ebduncan

Quote:


> Originally Posted by *axiumone*
> 
> Put in an order for 4x visiontek cryovenom 290's today. They charged my card. So hopefully the order will be fulfilled.


nice. I've been thinking about placing a order with them as well. Still pricing things out and deciding if I want to make a gamble on the mining scene.


----------



## tsm106

Quote:


> Originally Posted by *szeged*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Damn, you gone all in on this hand ya. Let it ride baby!
> 
> 
> 
> i know this is completely off topic, but do you happen to know what your avatar is from? without the edited in mantle logo crashing down that is
> 
> 
> 
> 
> 
> 
> 
> it looks familiar but i just cant place it.



https://www.reddit.com/r/q0ruv/my_new_favorite_gif/


----------



## szeged

Quote:


> Originally Posted by *tsm106*
> 
> 
> __
> https://www.reddit.com/r/q0ruv/my_new_favorite_gif/


yep thats it! thanks


----------



## tsm106

Thank sugarhell. I stole it from him. He posted it in the mantle news thread and got infracted for trolling ironically lol.


----------



## szeged

if that gif is trolling, then why the hell haven't the thousands of posts that are blatant insults and trolling been deleted in 99% of the AMD or Nvidia threads?







sometimes my faith in humanity is running on fumes.


----------



## axiumone

Quote:


> Originally Posted by *tsm106*
> 
> Damn, you gone all in on this hand ya. Let it ride baby!


Yeah, they are the only decently priced cards out there. The performance difference shouldn't be huge between 290x and 290, but the cheapest 290x with a block is almost $800! I'm also secretly hoping that some of these cards unlock...

Quote:


> Originally Posted by *ebduncan*
> 
> nice. I've been thinking about placing a order with them as well. Still pricing things out and deciding if I want to make a gamble on the mining scene.


I didn't get them for mining. I'm only interested in gaming. With electricity here at $0.20/kWh, it just isn't profitable.


----------



## szeged

4 290s for gaming, thats gonna be a powerful system







/jealous


----------



## Red1776

Quote:


> Originally Posted by *szeged*
> 
> 4 290s for gaming, thats gonna be a powerful system
> 
> 
> 
> 
> 
> 
> 
> /jealous


mine already showed up hehehe
just waiting for the WB's


----------



## szeged

very sexy









why did you grab the gaming edition cards just to pull that beast cooler off ? just curious.


----------



## Red1776

Quote:


> Originally Posted by *szeged*
> 
> very sexy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> why did you grab the gaming edition cards just to pull that beast cooler off ? just curious.


good question.
AMD sent me the R290x's for a project/story I am doing. with the demand for the 290x I suspect it was what MSI had available. I am going to test them with the stock coolers and then with WB's as well.


----------



## ArchieGriffs

And to remove warranty stickers on the screw too. Definitely not the cards I would choose to water cool, though I guess it's the cheapest 290 with guaranteed Hynix.


----------



## szeged

Quote:


> Originally Posted by *Red1776*
> 
> good question.
> AMD sent me the R290x's for a project/story I am doing. with the demand for the 290x I suspect it was what MSI had available. I am going to test them with the stock coolers and then with WB's as well.


either way very nice and now youll have some awesome wall decorations once you remove the coolers


----------



## aiyaaabatt

Hey guys,

Do you know if the SeaSonic 750X (Gold) PSU will be able to power crossfired (reference) r9 290x's at stock settings?

I know it is cutting it close, but the Seasonic X750 is highly rated and has been tested to output 850W easily.

Am I running any risk by adding the second card? Hurry and stop me before I do it!!


----------



## ArchieGriffs

Quote:


> Originally Posted by *aiyaaabatt*
> 
> Hey guys,
> 
> Do you know if the SeaSonic 750X (Gold) PSU will be able to power crossfired (reference) r9 290x's at stock settings?
> 
> I know it is cutting it close, but also the Seasonic x750 is highly rated and has been tested to output 850w easily.
> 
> Am I running any risk by adding the second card? Hurry and stop me before I do it!!


850W is what it can manage for short periods, certainly not what it's recommended to run continuously. It also depends on whether you have a power-hungry CPU, and if you're OCing it.

R9 290s draw 250-300W each depending on how heavily they're being used, and even more than 300W with OCing. Unless you have a power-hungry CPU, the rest of your system including the CPU should be around 200W. So anywhere from 700-800W is what two cards plus the system should be drawing, assuming you aren't pushing them with OCing. If it were 675-775W I would say you're fine, but that's a pretty close call and it isn't worth the risk.

I don't have a power hungry CPU, and I have a 750W PSU, so I'm considering not upgrading my PSU should the event arise that I need to upgrade (I hope it is a very long time from now), but even then I kind of want a 1000W just so I wouldn't ever have to worry. 750W was a terrible move in hindsight, I should have waited for a better deal on a 850W.
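The envelope math above can be sketched out directly; the per-card and rest-of-system wattages are the rough estimates from the post, not measured values:

```python
def system_draw_w(gpu_each_w: float, n_gpus: int, rest_w: float = 200.0) -> float:
    """Rough DC-side draw estimate: GPUs plus the rest of the system."""
    return gpu_each_w * n_gpus + rest_w

low  = system_draw_w(250, 2)   # 700 W at the low end of the band
high = system_draw_w(300, 2)   # 800 W at the high end
headroom = 750 - high          # a 750 W unit is underwater at the top of the band
print(low, high, headroom)
```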


----------



## Arizonian

@Red1776

Post a pic with OCN name along side those beauties or a GPU-Z validation link when you get them set up with OCN name showing and I'll add you to roster.


----------



## taem

Quote:


> Originally Posted by *ArchieGriffs*
> 
> 850w is what it can manage for short periods of time, certainly not what it's recommended to do. It also depends on if you have a power hungry CPU, and if you're OCing it.
> 
> R9 290s draw 250-300W each depending on how heavily they're being used, and even more than 300W with OCing. Unless you have a power-hungry CPU, the rest of your system including the CPU should be around 200W. So anywhere from 700-800W is what two cards plus the system should be drawing, assuming you aren't pushing them with OCing. If it were 675-775W I would say you're fine, but that's a pretty close call and it isn't worth the risk.
> 
> I don't have a power hungry CPU, and I have a 750W PSU, so I'm considering not upgrading my PSU should the event arise that I need to upgrade (I hope it is a very long time from now), but even then I kind of want a 1000W just so I wouldn't ever have to worry. 750W was a terrible move in hindsight, I should have waited for a better deal on a 850W.


I've seen several reviews that have 290/x drawing 200-220w at game load. I seriously doubt any 6 + 8 pin 290/x will draw anywhere near 300w.

My 4670k @ 4.6/1.25v with Asus z87 pro, 16gb ram, ssd, odd, 8 fans, and 2 7200rpm hdds draws 145w. If I down clock to stock cpu my system draws 100w.

But yeah I'd say 650 psu is pushing it too close. But if you're going to buy a new psu with crossfire 290 in mind, I'd say get a 1000w. I have an 850 now and I'm thinking of moving up so I can oc.


----------



## taem

Quote:


> Originally Posted by *Heinz68*
> 
> 
> 
> The case is CoolerMaster Cosmos S, I added the bottom fan on the back by removing all the spare slot covers and using sticky foam pads to attach the fan on the outside.
> All the fans except the side panel are COUGAR 12CM Blue LED Hydraulic (Liquid) Bearing 1200RPM, 64.4CFM, 16.6dBA
> 
> The set up including the CPU and mobo is new so I didn't have much time to run many benches or do any overclocking except the GPU memory is set at 1500 Mhz. The CPU is at stock speed.
> Ambient temperature 25C on all tests. Display Driver 13.12
> 
> *After couple runs 3DMARK 11 Performance* - Score P20465 LINK
> HWiNFO64 Sensors =
> Top card - GPU temp 76C, VRM1 79C, VRM2 62C, Fan 50%
> Bottom Card - GPU temp 71C, VRM1 71C, VRM2 54C, Fan 50%
> 
> *Unigine Valley Benchmark 1.0 after 10 minutes pre warming*
> HWiNFO64 Sensors
> Top card - GPU temp 79C, VRM1 82C, VRM2 64C, Fan 50%
> Bottom Card - GPU temp 73C, VRM1 73C, VRM2 56C, Fan 50%
> 
> 
> 
> *The Litecoin mining* is the real stress test. I have the memory OC'd at 1500 MHz to get a 970 kH/s hashrate on each card. To stop the top card from throttling and the occasional black screen, I increased the voltage by 15 and the power limit by 30.
> 
> HWiNFO64 Sensors
> Top card - GPU temp 85C, VRM1 99C, VRM2 71C, Fan 80%
> Bottom Card - GPU temp 80C, VRM1 79C, VRM2 62C, Fan 68%
> 
> For the mining I have custom fan profile set in the MSI Afterburner.
> The KILL-A-WAT pull from the wall shows 860 Watts. I usually run the top card at less intensity to avoid the high VRM1 temps, getting about 700 Kh/s hashrate for that card plus being able to use the PC for other work at the same time.
> 
> My cards are not sandwiched, there is empty double slot between them, I think you might need water-cooling with 3 cards sandwiched if you are looking for any voltage overclocking.


Hmm. Given that your case has significantly better airflow than my Define R4, I'm still worried about temps. Ambient here is usually low so that's a plus. And my board has pcie 16x at slots 2 and 5, so good spacing. Still, I don't know if my system can handle the temps. I should give up on the 290 and get a reference 780 probably. Or get a 290x and go for an oc'd single gpu.


----------



## Red1776

Quote:


> Originally Posted by *Arizonian*
> 
> @Red1776
> 
> Post a pic with OCN name along side those beauties or a GPU-Z validation link when you get them set up with OCN name showing and I'll add you to roster.


here you go Arizonian



Thanks


----------



## Arizonian

Quote:


> Originally Posted by *Red1776*
> 
> here you go Arizonian
> 
> 
> Spoiler: Warning: Spoiler!


Well I mean your OCN name, but that's proof enough for me...you can always update that post.

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/16210#post_21757826

Congrats - added









Wish I did own two.


----------



## TheOCNoob

Quote:


> Originally Posted by *Red1776*
> 
> here you go Arizonian
> 
> 
> 
> Thanks


lel


----------



## Red1776

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> here you go Arizonian
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well I mean your OCN name, but that's proof enough for me...you can always update that post.
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/16210#post_21757826
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wish I did own two.

You have the number of GPUs wrong, Ariz; there are 4 of them:


----------



## Arizonian

Fixed


----------



## Red1776

Quote:


> Originally Posted by *Arizonian*
> 
> 
> 
> 
> 
> 
> 
> 
> Fixed


Thank you sir


----------



## tsm106

What cpu are you gonna use on that setup?


----------



## ArchieGriffs

Quote:


> Originally Posted by *taem*
> 
> I've seen several reviews that have 290/x drawing 200-220w at game load. I seriously doubt any 6 + 8 pin 290/x will draw anywhere near 300w.
> 
> My 4670k @ 4.6/1.25v with Asus z87 pro, 16gb ram, ssd, odd, 8 fans, and 2 7200rpm hdds draws 145w. If I down clock to stock cpu my system draws 100w.
> 
> But yeah I'd say 650 psu is pushing it too close. But if you're going to buy a new psu with crossfire 290 in mind, I'd say get a 1000w. I have an 850 now and I'm thinking of moving up so I can oc.


Is that from the wall? Or from monitoring software on computers?


----------



## Paul17041993

Quote:


> Originally Posted by *aiyaaabatt*
> 
> Hey guys,
> 
> Do you know if the SeaSonic 750X (Gold) PSU will be able to power crossfired (reference) r9 290x's at stock settings?
> 
> I know it is cutting it close, but also the Seasonic x750 is highly rated and has been tested to output 850w easily.
> 
> Am I running any risk by adding the second card? Hurry and stop me before I do it!!


stock settings, yea should be fine, but you're not going to have much room for any OCs...


----------



## wolfej

Has anyone had problems with Heaven 4.0 crashing in windowed mode, but not in fullscreen mode? It makes no sense to me. I even disabled xfire and put the cards at stock clocks. The program crashes as soon as the program is done loading. Fullscreen works without a hitch. . . I'm baffled.


----------



## chiknnwatrmln

At +210mV, according to GPUZ, my 290 takes over 325w at load.

My entire PC pulls about 530w at the wall with 100% gpu load. This is with a 3770k and a platinum plus power supply.


----------



## ebduncan

An R9 290/R9 290X without an OC will pull 220-250 watts from the wall. OC'd R9s creep past 300 watts.

So with a crossfire setup the min power supply I would use is a 800 watt unit. A High quality 850+ is recommended.


----------



## neurotix

With my FX-8350 at 4.7ghz and my R9 290 Tri-X at 1200/1500mhz +168mv, my kill a watt read 680W while running Just Cause 2 in Eyefinity.

The machine at idle was drawing 200W, likely from the oc'ed FX-8350. That's 200W x .85 = 170W. With the R9 290 overclocked and running a game it drew 680. 680 x .85 = 578.

578 - 170 = 408W.

So, my R9 290 easily accounts for over 400W of load when overclocked to such a high degree.

This is why I recently upgraded to a 1000W PSU in anticipation of getting another R9 290 Tri-X with mining money. I think if they're both overclocked along with my FX-8350 I will come really close to, if not exceed the 1000w rating of my new V1000.

These cards draw quite a lot over their TDP in my testing.

Also, for some reason Just Cause 2 in Eyefinity draws WAY more power than any other game I tested. Bulletstorm at the same clocks drew 530W. Crysis 3 drew about 600W. For some reason JC2 draws a lot. Not sure why.
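The arithmetic above converts wall (AC) draw to DC load via PSU efficiency; a sketch using the same figures (the flat 85% efficiency is the poster's assumption, real efficiency varies with load):

```python
def dc_load_w(wall_w: float, efficiency: float = 0.85) -> float:
    """DC power delivered to components for a given wall (AC) draw."""
    return wall_w * efficiency

# load reading minus idle reading isolates the game + overclocked GPU
gpu_and_game_load = dc_load_w(680) - dc_load_w(200)  # about 408 W
print(gpu_and_game_load)
```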


----------



## kskwerl

Can anyone suggest or can I get a few opinions actually on which 290X I should buy?

Thanks for any input!


----------



## szeged

Quote:


> Originally Posted by *kskwerl*
> 
> Can anyone suggest or can I get a few opinions actually on which 290X I should buy?
> 
> Thanks for any input!


my personal favorite thats released right now is the DCUII from asus, but if you can wait a couple weeks the 290x lightning from msi is probably going to be an unstoppable monster.


----------



## Paul17041993

Quote:


> Originally Posted by *neurotix*
> 
> Also, for some reason Just Cause 2 in Eyefinity draws WAY more power than any other game I tested. Bulletstorm at the same clocks drew 530W. Crysis 3 drew about 600W. For some reason JC2 draws a lot. Not sure why.


hows the cpu usage between them? and it could be due to JC2 having a massive open map, more vertexes for the gpu to process.


----------



## taem

Question was 290/x at stock. At stock draw will be 250w tops probably significantly less. Yes if you oc especially if you tweak power the draw will skyrocket. The two 280x's I tried drew less than 200w at game load. And cpu never drew full load during gaming.

Quote:


> Originally Posted by *ArchieGriffs*
> 
> Is that from the wall? Or from monitoring software on computers?


I use a Belkin Conserve Insight unit plugged in between the wall and the PSU.


----------



## kskwerl

Quote:


> Originally Posted by *szeged*
> 
> my personal favorite thats released right now is the DCUII from asus, but if you can wait a couple weeks the 290x lightning from msi is probably going to be an unstoppable monster.


Is there an official release date


----------



## szeged

Quote:


> Originally Posted by *kskwerl*
> 
> Is there an official release date


not yet, we have heard from msi reps " middle of february" though, and guess what, thats soon


----------



## kskwerl

Quote:


> Originally Posted by *szeged*
> 
> not yet, we have heard from msi reps " middle of february" though, and guess what, thats soon


That is soon! I'm only going to be using the 290x for mining though


----------



## rdr09

Quote:


> Originally Posted by *taem*
> 
> So I think my 850w psu will be fine for cross firing these. Testing without Gpus, total system wattage for 4670k @ 4.6, 1.25v is 145w. If I run a pair of 290s at stock on top that should be fine right? I might down clock my cpu also -- what do you guys think is a good 4670k clock for 290 crossfire?


i know it will sound taboo but i've been reading members with i5 bottlenecking 2 290s.


----------



## taem

Quote:


> Originally Posted by *rdr09*
> 
> i know it will sound taboo but i've been reading members with i5 bottlenecking 2 290s.


Well it's a $200 cpu, not exactly high end, so that wouldn't outrage me. But, 4.6 is a good clock speed, and no game I've played takes usage over 50%, so I wonder where the bottleneck could come from. Not enough cores?


----------



## kskwerl

Quote:


> Originally Posted by *taem*
> 
> Well it's a $200 cpu, not exactly high end, so that wouldn't outrage me. But, 4.6 is a good clock speed, and no game I've played takes usage over 50%, so I wonder where the bottleneck could come from. Not enough cores?


would this bottleneck occur if I'm strictly building a mining rig? I was under the impression I should just buy the cheapest CPU for the 1150 socket (board I was planning to use has 1150 socket)


----------



## ebduncan

Quote:


> Originally Posted by *kskwerl*
> 
> would this bottleneck occur if I'm strictly building a mining rig? I was under the impression I should just buy the cheapest CPU for the 1150 socket (board I was planning to use has 1150 socket)


cpu doesn't matter for mining. So no it would not bottle neck mining.


----------



## rdr09

Quote:


> Originally Posted by *taem*
> 
> Well it's a $200 cpu, not exactly high end, so that wouldn't outrage me. But, 4.6 is a good clock speed, and no game I've played takes usage over 50%, so I wonder where the bottleneck could come from. Not enough cores?


do you have a 290 now? here was BF4 MP with a single 290 at stock and an i7 SB @ 4.5 HT off . . .



look at the usage. i keep the HT off, though, 'cause it is still smooth but with 2 . . . i may have to turn on HT. In games the 290 is only a tad slower than the 290X. A 50 - 70MHz oc on the 290 will match a 290X is my estimate.


----------



## Mr357

Quote:


> Originally Posted by *ebduncan*
> 
> cpu doesn't matter for mining. So no it would not bottle neck mining.


That's not entirely true. Even if it's not being stressed, your CPU needs to meet a certain level of performance to not bottleneck the card. At equal clocks, a 290 driven by a 4770K will get a higher graphics score than one paired with an 8150.


----------



## Paul17041993

It's really only games like BF4 that use 8+ threads where that would be an issue, and for said game Mantle is supposed to help with that too and use less CPU, but that's still experimental...


----------



## ebduncan

Quote:


> Originally Posted by *Mr357*
> 
> That's not entirely true. Even if it's not being stressed, your CPU needs to meet a certain level of performance to not bottleneck the card. At equal clocks, a 290 driven by a 4770K will get a higher graphics score than one paired with an 8150.


CPU speed doesn't matter for mining. It matters for games, but not mining. People use Semprons in mining rigs.


----------



## taem

Oh I forgot to mention, my cpu usage gaming doesn't exceed 50% with a *280x*.

BF4 is a rarity though, right, in that it's far more cpu intensive than most? With most games, say Metro LL or Far Cry 3, Arkham series, etc, will a 4670k @ 4.6 bottleneck 290 crossfire? How low could I take the clock on a 4670k and not bottleneck for most games?


----------



## Mr357

Quote:


> Originally Posted by *ebduncan*
> 
> CPU speed doesn't matter for mining. It matters for games, but not mining. People use Semprons in mining rigs.


Does it work well? I can't imagine so.


----------



## ebduncan

Quote:


> Originally Posted by *Mr357*
> 
> Does it work well? I can't imagine so.


works fine, cpu is the least of your concerns with coin mining.


----------



## the9quad

Quote:


> Originally Posted by *Paul17041993*
> 
> only really games like BF4 that use 8+threads that would really be an issue, but for said game mantle is supposed to help with that too and use less CPU, but that's still experimental...


I think max is 8 threads in BF4. That's all I see at the top when I'm running it with a 4930k


----------



## Arizonian

Wow pricing on the egg is at an all time high.

290X TRI-X selling for $799.99








290X MSI Gaming for $749.99








290X ASUS DCUII for $749.99









Unless you're mining and plan to make it up, this is insane. Even the miners have to work longer now to break even. Truly a sad state of affairs for gamers.


----------



## szeged

wow really, 290x's going for $799? someone shoot me this cant be real. stupid newegg needs to burn already.


----------



## Sazz

This makes me wanna sell my 290X and just go for a 780Ti..


----------



## szeged

Quote:


> Originally Posted by *Sazz*
> 
> This makes me wanna sell my 290X and just go for a 780Ti..


lol wait a month and sell it for $800 when newegg pushes the prices up to $875 for a new one zzzzzzzzzzzzzzzzzzz


----------



## beejay

Ridiculous. Luckily prices outside the US are still acceptable. I bought three r9290x Reference last January for $620 a pop.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Arizonian*
> 
> Wow pricing on the egg is at an all time high.
> 
> 290X TRI-X selling for $799.99
> 
> 
> 
> 
> 
> 
> 
> 
> 290X MSI Gaming for $749.99
> 
> 
> 
> 
> 
> 
> 
> 
> 290X ASUS DCUII for $749.99
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Unless you're mining and plan to make it up, this is insane. Even the miners have to work longer now to break even. Truly a sad state of affairs for gamers.


Kinda sad when the US starts charging more than Aus does.....

Well i shouldn't say US, i should say select major online retailers


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Abyssic*
> 
> what exactly are you talking about?


780, 780 Ti, KPE, Classies etc. See previous posts.
I can pull same scores as them in most benchmarks with my R9 290's


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Kinda sad when the US starts charging more than Aus does.....
> 
> Well i shouldn't say US, i should say select major online retailers


LoooL
I LooLed and LooLed when I saw pc case gear advertise ROG Asus MARS 760 for $830AU ........


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> LoooL
> I LooLed and LooLed when I saw pc case gear advertise ROG Asus MARS 760 for $830AU ........


Yeah i didn't really know what to think when i saw that cept maybe


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah i didn't really know what to think when i saw that cept maybe


more like maybe not........

I got 5 760's and 2 are Hawks. The Mars won't let you mod the power level or do the AB soft mod ...... definitely not


----------



## Sazz

NCIX has the XFX DD 290X version for $670ish + shipping, which is not bad since the launch price was $580; that's about 100 bucks more for a custom cooler, which is probably what you would spend on an Arctic Cooling solution anyway. That's the only one I see reasonably priced.


----------



## beejay

And, I think, a lot of peeps would be selling their R9s at low prices, maybe a month from now. It's hard mining right now. I'm mining with three R9 290Xs and I'm calculating 7 months until I break even. Unless there would be an increase in the current coin value.
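A break-even estimate like the 7-month figure above comes from a calculation along these lines; every input in this sketch is an illustrative placeholder, not beejay's actual numbers:

```python
def break_even_months(hw_cost: float, daily_revenue: float,
                      wall_watts: float, price_per_kwh: float) -> float:
    """Months until mining income, net of electricity, repays the hardware."""
    daily_power_cost = wall_watts / 1000 * 24 * price_per_kwh
    net_per_day = daily_revenue - daily_power_cost
    return hw_cost / (net_per_day * 30) if net_per_day > 0 else float("inf")

# e.g. $1800 of cards, $12/day mined, 900 W at the wall, $0.10/kWh
months = break_even_months(1800, 12.0, 900, 0.10)
```

Note how sensitive the result is to coin value: if daily revenue drops below the power cost, break-even never arrives at all.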


----------



## szeged

ive been trying to find a r9 280x for a decent price for the past month for an all asus ROG build, too bad everyone and their mother wants $600+ for a 2 year old gpu.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sazz*
> 
> NCIX has the XFX DD 290X version for $670ish + shipping, which is not bad since the launch price was $580; that's about 100 bucks more for a custom cooler, which is probably what you would spend on an Arctic Cooling solution anyway. That's the only one I see reasonably priced.


That's $720AU + shipping . Go a reference card and w/block it . 40c less temp at full load . All up $630AU for 290 , block and backplate plus shipping


----------



## Sazz

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> That's $720AU + shipping . Go a reference card and w/block it . 40c less temp at full load . All up $630AU for 290 , block and backplate plus shipping


I am not really talking about AU market, coz that would be a different story.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sazz*
> 
> I am not really talking about AU market, coz that would be a different story.


Its about the dollars AU or not . Just comparing but I get ya drift


----------



## devilhead

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Its about the dollars AU or not . Just comparing but I get ya drift


In Norway you can get 290x for 650US dollar


----------



## rdr09

Quote:


> Originally Posted by *taem*
> 
> Oh I forgot to mention, my cpu usage gaming doesn't exceed 50% with a *280x*.
> 
> BF4 is a rarity though, right, in that it's far more cpu intensive than most? With most games, say Metro LL or Far Cry 3, Arkham series, etc, will a 4670k @ 4.6 bottleneck 290 crossfire? How low could I take the clock on a 4670k and not bottleneck for most games?


I can't really say for sure 'cause my idea is based off one 290 and my i7 HT off (basically an i5). i've seen some members pair an i5 with Titans and others will say - what? like you said, a $200 cpu paired with $2K worth of gpu.

now, this is just a reference - here is Titan that's prolly using old drivers so its graphics score is kinda low . . .

http://www.3dmark.com/3dm11/7594343

here is my 290 . . .

http://www.3dmark.com/3dm11/7648825

compare the graphics scores. i am not saying that the 290 is equal to a Titan but what i am saying is that it is up there.


----------



## Abyssic

Quote:


> Originally Posted by *taem*
> 
> Oh I forgot to mention, my cpu usage gaming doesn't exceed 50% with a *280x*.
> 
> BF4 is a rarity though, right, in that it's far more cpu intensive than most? With most games, say Metro LL or Far Cry 3, Arkham series, etc, will a 4670k @ 4.6 bottleneck 290 crossfire? How low could I take the clock on a 4670k and not bottleneck for most games?


i had two 7950s with a i5 3450. the only game where i was bottlenecked by the cpu was crysis 3 in the grass areas.


----------



## rdr09

Quote:


> Originally Posted by *Abyssic*
> 
> i had two 7950s with a i5 3450. the only game where i was bottlenecked by the cpu was crysis 3 in the grass areas.


i had to crossfire my 7950 and 7970 at 1080 to max out C3, Very High and 8xMSAA. now i can do it with a single 290 at stock but i have to turn on HT. Turning HT on raises my cpu temp by 5-7C, which raises my gpu temp in the same loop. BF3 and 4 SP have no issues with HT off. MP is a different animal. Some maps will require either a higher oc, or turning HT on helps a bit.


----------



## hatlesschimp

I think my reference 290x's cost $675 to 699 each. Still I've found the performance similar to my 3 Titans, and for $550 less per card lol!

Aussie Prices for tech is terrible!!!!!


----------



## Bartouille

Wow, I still can't believe how picky these cards are about the way you mount the cooler. I have an MK-26 with some heavy fans, and I think this time I did a perfect mounting job, and I got something to hold my card because it's so heavy, and now my card doesn't artifact anymore. 1120MHz on stock voltage, just running Valley, very impressed. I mounted it once and it artifacted within seconds at 1090MHz. Even on the stock cooler I don't remember getting 1120 with no artifacts.


----------



## kot0005

Quote:


> Originally Posted by *Arizonian*
> 
> Wow pricing on the egg is at an all time high.
> 
> 290X TRI-X selling for $799.99
> 
> 
> 
> 
> 
> 
> 
> 
> 290X MSI Gaming for $749.99
> 
> 
> 
> 
> 
> 
> 
> 
> 290X ASUS DCUII for $749.99
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Unless your mining and plan to make it up this is insane. Even the miners have to work it longer now to break even. Truly sad state of affairs for gamers.


Newegg def being over-greedy.... Tri-X is $729 AUD in Australia...

http://www.pccasegear.com/index.php?main_page=product_info&cPath=193_1555&products_id=26476


----------



## Ukkooh

Quote:


> Originally Posted by *Bartouille*
> 
> Wow, I still can't believe how picky these cards are in the way you mount the cooler. I have a MK-26 with some heavy fans, and I think this time I did a perfect mounting job and I got something to hold my card because it's so heavy and now my card doesn't artifact anymore. 1120Mhz on stock voltage, just running valley, very impressed. I mounted once and it artifacted within seconds are 1090MHz. Even on stock cooler I don't remember getting 1120 no artifacts.


Sounds like you didn't put on enough TIM and some parts of the die weren't cooled properly. What did it look like when you took the cooler off again?


----------



## Abyssic

guys, i know there are some ppl here that also have a tri-x 290x. what overclocks did you get out of it? i fiddled a bit with it yesterday and i basically wasn't able to oc at all. +10mhz on the core and +15mhz on the memory is all i could get without artifacts







(hynix memory, 78% asic)


----------



## kizwan

Quote:


> Originally Posted by *devilhead*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HOMECINEMA-PC*
> 
> Its about the dollars AU or not . Just comparing but I get ya drift
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In Norway you can get 290x for 650US dollar
Click to expand...

In Malaysia, you can get ref 290X for USD$571 & TRI-X R9 290 for USD$566.
Quote:


> Originally Posted by *Abyssic*
> 
> guys, i know here are some ppl that also have a tri-x 290x. what overclocks did you get out? i fiddled a bit with it yesterday and i basically wasn't able to oc at all. +10mhz on the core and +15mhz on the memory is all i could get out without artifacts
> 
> 
> 
> 
> 
> 
> 
> (hynix memory, 78% asic)


Are you trying to see how high you can go with stock voltage?


----------



## Bartouille

Quote:


> Originally Posted by *Ukkooh*
> 
> Sounds like you didn't put enough tim and some parts of the die weren't cooled properly. What did it look like when you took the cooler off again?


Actually I never fully unmounted the cooler again; I just released the screws enough for the cooler to stay in place, then readjusted the torque. I also tried applying more torque than I have right now and I was getting artifacts too; now I didn't apply too much and held the cooler. The problem with these cards is that the core sits higher than the rest, so all the force you apply is on the core. I preferred the 79xx where the core sat a little lower than the rest.


----------



## darwing

Quote:


> Originally Posted by *rdr09*
> 
> i had to crossfire my 7950 and 7970 using 1080 to max out C3. Very High and 8MSAA. now i can do it with a single 290 at stock but i have to turn on HT. Turning HT on raises my cpu temp @ 5-7C, which raises my gpu temp in the same loop. BF3 and 4 SP have no issues with HT off. MP is a different animal. Some maps will require either a higher oc or turning HT helps a bit.


in all reality you bottlenecked your 7970 by adding a 7950...

a single OC 7970 on a single monitor has no issues running BF4 at ultra


----------



## Abyssic

Quote:


> Originally Posted by *kizwan*
> 
> Are you trying to see how high you can go with stock voltage?


not entirely. most of my tests were on stock volts but i also did some overvolting during the process. i didn't go too far or too deep into it (it was about an hour), but i could already see that this card overclocks the worst of everything i've seen so far. i will start a new attempt maybe this evening when i go back to the last drivers. it could even be that those buggy beta drivers are giving me trouble


----------



## chronicfx

I am running three 290x on a z77 ud5h gigabyte board with gen3 x8,x8,x4 bandwidth. I know this may be somewhat of a bottleneck for these cards. I have played a few games in trifire and am coming from 7970 trifire. My 7970 trifire was butter smooth in all games with the same hardware setup (only upgraded the gpu's). Now i get a lot of stutter - well, not a lot, but it is noticeable and kills my gaming joy. My question is: will a small pcie bottleneck on hawaii impact framerate, frametime, or both? I have not tested just two cards, but one card is very smooth. I also use an afterburner fan profile and the cards stay under 80c.


----------



## Abyssic

Quote:


> Originally Posted by *chronicfx*
> 
> I am running three 290x on a z77 ud5h gigabyte board with gen3 x8,x8,x4 bandwidth. I know these may provide some bottleneck to these cards. I have played a few games in trifire and am coming from 7970 trifire. My 7970 trifire was butter smooth in all games with same hardware setup (only upgraded gpu's). Now i get alot of stutter, well not alot but it is noticeable and kills my gaming joy. My question is will having a small pcie bottleneck on hawaii impact framerate, frametime, or will it impact both? I have not tested just two cards, but one card is very smooth. I also use an afterburner fan profile and cards stay under 80c.


i don't know too much about that but maybe the drivers aren't as optimized yet for those cards


----------



## chronicfx

I guess to clarify: I don't mind if I lose 5 to 7% of my frame rate, because I have three cards and a massive framerate anyway, but if the bandwidth limitation is causing my cards not to sync up and run crossfire correctly then it is a bigger matter. Thoughts?


----------



## chronicfx

I am also having trouble with the driver for some funny reason.. Every time I choose uninstall from the install manager it goes through the uninstall but the driver then remains on the list. That never happened with my 7970's, and I uninstalled dozens of times.


----------



## hatlesschimp

I gave the 14.1 mantle drivers a go and nothing but problems!!!!

I was getting 7fps in 3DMark11 when I'm normally up around the 80fps mark lol

And most games didn't load.


----------



## Red1776

Quote:


> Originally Posted by *chronicfx*
> 
> I am running three 290x on a z77 ud5h gigabyte board with gen3 x8,x8,x4 bandwidth. I know these may provide some bottleneck to these cards. I have played a few games in trifire and am coming from 7970 trifire. My 7970 trifire was butter smooth in all games with same hardware setup (only upgraded gpu's). Now i get alot of stutter, well not alot but it is noticeable and kills my gaming joy. My question is will having a small pcie bottleneck on hawaii impact framerate, frametime, or will it impact both? I have not tested just two cards, but one card is very smooth. I also use an afterburner fan profile and cards stay under 80c.


Crossfire load-balances the rendering across the cards, so something else is happening there.
have a look at this :

http://www.techpowerup.com/reviews/Intel/Ivy_Bridge_PCI-Express_Scaling/1.html

If you were to block all but a single PCIe lane, you would still get 70% of the card's performance.
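As a rough rule of thumb, PCIe 3.0 carries about 985 MB/s of usable bandwidth per lane after encoding overhead, so the slot widths being discussed work out something like this (sketch only, per-lane figure approximate):

```python
# Approximate usable bandwidth per PCIe 3.0 lane (~985 MB/s after 128b/130b encoding).
PCIE3_MBPS_PER_LANE = 985

def slot_bandwidth_mbps(lanes):
    """Approximate one-way bandwidth of a PCIe 3.0 slot with the given lane count."""
    return lanes * PCIE3_MBPS_PER_LANE

# Compare a full x16 slot with the x8 and x4 slots discussed above.
for lanes in (16, 8, 4):
    print(f"x{lanes}: ~{slot_bandwidth_mbps(lanes)} MB/s")
```

Even the x4 slot still has several GB/s on tap, which is why the framerate hit from a narrower link is usually much smaller than the lane count alone suggests.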


----------



## Krusher33

Quote:


> Originally Posted by *hatlesschimp*
> 
> I gave the 14.1 mantle drivers ago and nothing but problems!!!!
> 
> I was getting 7fps in 3DMark11 when im normally up around the 80fps mark lol
> 
> And most games didnt load.


I don't think the benchmarks have updated for the new beta driver yet? I haven't kept up with them so I may be wrong.

The 14.1 drivers work for me in BF4, DayZ, Tomb Raider, Civ 5, and I recently played the ESO beta for an hour or 2. No issues graphically... just game bugs.


----------



## hatlesschimp

it was like my PC was in limp mode.


----------



## mojobear

Quote:


> Originally Posted by *chronicfx*
> 
> I am running three 290x on a z77 ud5h gigabyte board with gen3 x8,x8,x4 bandwidth. I know these may provide some bottleneck to these cards. I have played a few games in trifire and am coming from 7970 trifire. My 7970 trifire was butter smooth in all games with same hardware setup (only upgraded gpu's). Now i get alot of stutter, well not alot but it is noticeable and kills my gaming joy. My question is will having a small pcie bottleneck on hawaii impact framerate, frametime, or will it impact both? I have not tested just two cards, but one card is very smooth. I also use an afterburner fan profile and cards stay under 80c.


Are you running with vsync on? That can create some stutter with the frame pacing in catalyst.


----------



## velocityx

Is anyone having a problem with cold boots? Every single cold boot in the morning or during the day, my games run as if they had half the gpu power, for example bf4 and witcher 2.

I boot my pc in the morning, and both of those games (haven't checked other games) run badly. The difference is almost 50 percent in fps. Witcher 2 for example: boot the first time, fire up the game, 30fps; reboot, suddenly 60-70 fps in the same location. For bf4 it's around 40-50 fps on the first boot, and 80-90 after the second boot.

I have no idea what's up with these cards.


----------



## Maxxa

Quote:


> Originally Posted by *velocityx*
> 
> Is anyone having a problem with cold boots? every single cold boot in the morning or during the day, my games are running as if they had half the gpu power. for example bf4 and witcher 2.
> 
> I boot my pc in the morning, and both of those games (havent check other games) are running bad. the difference is almost 50 percent in fps. witcher 2 for example, boot first time, fire up the game, 30fps, reboot, suddenly 70-60 fps same location. For bf4 its around 40-50 fps first boot, and 80-90 second boot.
> 
> I have no idea whats up with those cards.


Have you tried un-installing then re-installing your drivers? Are you running the 14.1 beta drivers or the last official ones?


----------



## chronicfx

Quote:


> Originally Posted by *mojobear*
> 
> Are you running with vsync on? That can create some stutter with the frame pacing in catalyst.


I messed around a little with in-game vs CCC vs off. Off is smoothest but was noticeably tearing, and the other two stuttered. Although I have not messed much with frame pacing while vsync is off. I am also mining on the cards, so I am trying to lay off and play Playstation 3 whenever possible. When Thief comes out I will probably play it, so I would like to get this solved by then. What would you suggest as smoothest?


----------



## velocityx

Quote:


> Originally Posted by *Maxxa*
> 
> Have you tried un-installing then re-installing your drivers? Are you running the 14.1 beta drivers or the last official ones?


this happens on 14.1 / 13.11 v9.5 and obviously 13.12 whql. don't remember how previous 13.11 behaves.


----------



## cplifj

the halved framerate only occurs on 14.1 on my system. i really think BF4 somehow pushed, or is itself, one big piece of malware. We all got rooted somehow. now to find the proof.

i also tend to see a correlation between the number of BF4 players and litecoin miners active. but that's just my paranoid 2 cents.


----------



## phallacy

Quote:


> Originally Posted by *hatlesschimp*
> 
> I gave the 14.1 mantle drivers ago and nothing but problems!!!!
> 
> I was getting 7fps in 3DMark11 when im normally up around the 80fps mark lol
> 
> And most games didnt load.


This might not solve your problem, but have you tried disabling the intel hd graphics adapter? I did this after running into similar problems as you where I couldn't load anything. After disabling the on board graphics, games and benchmarks were able to load like before. FWIW I didn't see a huge gain with mantle so I switched back to the WHQL for now.


----------



## velocityx

Quote:


> Originally Posted by *cplifj*
> 
> the halfed framerate only occurs in 14.1 on my system, i really think somehow BF4 pushed or is itself one big piece of malware. We all got rooted somehow. now to find the proof.
> 
> i also tend to see correlation between nr of BF4 players and litecoin minners active. but thats just my paranoid 2 cents.


well, before, on 13.11/13.12 the first cold boot was giving me a hard lockup in bf4 within minutes, so I donno if it is better that it's just running badly on 14.1. I still need to reboot to get performance/stability.


----------



## Abyssic

Quote:


> Originally Posted by *velocityx*
> 
> well, before in the 13.11/13.12 first cold boot was giving me a hard lockup in bf4 within minutes so I donno if it is better that it's running bad in 14.1. I still need to reboot to get performance/stability.


this sounds somewhat PSU-related


----------



## Abyssic

Quote:


> Originally Posted by *Abyssic*
> 
> not entirely. most of my tests were on stock volts but i also did some overvolting during the process. i didn't go too far or too deep in it. (was about an hour) but i could already see that this card overclocks the worst from everything i've seen so far. i will start a new attempt maybe this evening when i go back to the last drivers. it could even be that those buggy beta drivers are giving me troubles


so i now went back to the whql driver and tried again. i got more out of it than last time but still not really good. i'm not finished yet, but so far it's this: +80mv on the core, 1150mhz core, 1450mhz mem, and i still get artifacts in some applications.

are those 290x's extremely voltage hungry? anyone got some advice for overclocking them? (i know how to oc in general but i'm not experienced, especially with amd)


----------



## cplifj

the reboot to get full performance has nothing to do with bad power or the psu.

it's purely related to card+drivers. i have said it before, i suspect a rootkit has been pushed. since that moment the system just has quirks that can't be pinned down.

i am back with an old symptom i had on previous pc's:

whenever you boot, it's like a different windows is booted, like there are 2 windows installs on your machine and every other boot it switches between those 2.

i notice this by changing settings that don't stick on reboot; on the subsequent bootup they are back where i set them, and in between it uses the previous settings.

NO reinstall or wipe will fix this, it remains even with a full clean setup. So something is very wrong again. Either a rootkit or the crash from BF4 destroyed something.


----------



## velocityx

Quote:


> Originally Posted by *Abyssic*
> 
> this sounds somewhat psu related


Quote:


> Originally Posted by *velocityx*
> 
> well, before in the 13.11/13.12 first cold boot was giving me a hard lockup in bf4 within minutes so I donno if it is better that it's running bad in 14.1. I still need to reboot to get performance/stability.


I would say the same thing, but when I had my first 290, I had a 650W bronze psu, and it was giving me the same error. Now that I have two 290's, I upgraded to a Seasonic 850W gold unit, and the same thing happens, so I think it's either amd powertune being buggy or something in the drivers is messed up.


----------



## rdr09

Quote:


> Originally Posted by *darwing*
> 
> in all reality you bottlenecked your 7970 by adding a 7950...
> 
> a single OC 7970 on a single monitor has no issues running BF4 at ultra


i never crossfired my 7900 cards in BF4, just BF3. and i know that a single 7900 card can max out BF4, which i did with both the 7950 and 7970. It is C3 that needs crossfire.


----------



## ebduncan

Quote:


> Originally Posted by *cplifj*
> 
> the reboot to get full performance has nothing to do with bad power or psu.
> 
> its pure related to card+drivers. i have said it before, i suspect a rootkit has been pushed. since that moment the system just has quircks that cant be pinned.
> 
> i am back with my old symptom i had on previous pc's:
> 
> whenever you boot, it's like a different windows is booted, like there are 2 windows on your machine and every other boot it switches between those 2.
> 
> i notice this by changing settings that dont stick on reboot, subsequent bootup they are back where i set them, in between it uses the previous settings.
> 
> NO reinstall or wipe will fix this, it remains even with a full clean setup. So something is very wrong again. Either a rootkit or the crash from BF4 destroyed something.


how many hard drives do you have? you could have an issue with the boot manager. Sounds to me like you upgraded at some point, moved your primary hard drive to a storage drive, and put in a new primary boot drive. However, when you change boot drives, your boot manager/system files stay on the other drive. In order to do a proper clean install, you must remove the other hard drive from the system and install fresh - meaning only one drive is connected at the time, your new primary drive. This will re-install the boot manager/system files on the primary drive, and allow you to format the other drive and remove the system files/boot manager from it.


----------



## cplifj

Quote:


> Originally Posted by *velocityx*
> 
> I would say the same thing, but when I had my first 290, I had a 650 bronze psu, and it was giving me the same error. now that I have two 290's, i upgraded to a seasonic 850 gold unit, and same thing happens, so I think it's either amd powertune being buggy or something in the drivers is messed up.


yes, i agree with that. i use a single 290X with a Corsair HX650 watt, on a mini-ATX board with just 1 SSD and 1 HDD, 6 fans, and 1 wc pump (Corsair H110 cooler). i don't think power is the problem; it wasn't a problem before 14.1 got installed.

maybe each new driver changes something in the card's UEFI firmware regarding this powertune thing of AMD's. or maybe just this driver, and it doesn't change back with previous drivers.


----------



## Abyssic

are there any risks in maxing out the power limit slider?


----------



## cplifj

Quote:


> Originally Posted by *ebduncan*
> 
> how many hard-drives do you have? you could have a issue with the boot MGR. Sounds to me you might have upgraded at some point in time and moved your primary hard drive to a storage drive and put in a new primary boot drive. However when you change boot drives, your boot manager/system files stay on the other drive. In order to properly do a new clean install, you must remove the hard drive from the system, and install fresh. Meaning you can only have one drive connected at this time and its your new primary drive. This will re-install the boot manger/ system files on the primary drive, and allow you to format the other drive and remove the system files/boot manager from the other drive.


system is the same as it was when put together over a year ago, with the 290X came the HX 650 watts, thats all that changed.

system was running before with a GTX660 Ti on a zalman 460 watts without any problem ever.


----------



## pompss

guys, i'm looking for some benchmarks of the r9 290 with a custom bios on water or air.
i just ordered the visiontek r9 290 and i plan to put a full waterblock on it and return my gtx 780 classy.


----------



## Maxxa

Quote:


> Originally Posted by *velocityx*
> 
> well, before in the 13.11/13.12 first cold boot was giving me a hard lockup in bf4 within minutes so I donno if it is better that it's running bad in 14.1. I still need to reboot to get performance/stability.


My BF4 locks up all the time now with mantle. I'm going to switch back to DX and see if it stops locking up in game. I got a day of decent play out of it after un-installing and re-installing the drivers, but now it does it all over again.


----------



## Paul17041993

Quote:


> Originally Posted by *chronicfx*
> 
> I am running three 290x on a z77 ud5h gigabyte board with gen3 x8,x8,x4 bandwidth.


what monitor/s and what framerates? provided the primary card has at least enough bandwidth for each slave card you should be fine, but how's that x4 lane attached to everything? if you're using lanes from the chipset that could be a performance concern...


----------



## Clockster

Well I got rid of my Gigabyte GTX780Ti Windforce OC Edition and ordered 2x Gigabyte R9 290 Windforce Editions, will post pics and rejoin the club lol


----------



## Ukkooh

Quote:


> Originally Posted by *Clockster*
> 
> Well I got rid of my Gigabyte GTX780Ti Windforce OC Edition and ordered 2x Gigabyte R9 290 Windforce Editions, will post pics and rejoin the club lol


May I ask why? Needed some money from reselling it or did you have issues?


----------



## Clockster

Quote:


> Originally Posted by *Ukkooh*
> 
> May I ask why? Needed some money from reselling it or did you have issues?


I actually got a really good price for my card and then got a damn good deal on the 2x 290 cards.
I also had endless issues with DX errors while playing BF4.
Running [email protected] and 2x 290's are def a whole lot faster than a single 780ti.


----------



## Abyssic

so here's what i finally got out of my poor card:

+38mv
core: 1110mhz
mem: 1450mhz

any further and i get artifacts, and that is with voltage up to +150mv (i don't want to go any higher)

so ultimately: my gpu cannot handle more than 1110mhz

the worst overclocker i've had so far...


----------



## phallacy

Quote:


> Originally Posted by *Abyssic*
> 
> so here's what i finally got out of my poor card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> +38mv
> core: 1110mhz
> mem: 1450mhz
> 
> any further and i get artifacs and that is with volage up to +150mv (i don't want to go any higher)
> 
> so ultimately: my gpu can not handle more that 1110mhz
> 
> 
> 
> 
> 
> 
> 
> the worst overclocker i had so far...


Have you tried going lower on the memory and higher on the core? Memory clocks haven't been doing much for me. I ran Heaven with my clocks at 1200/1300 and then again at 1200/1625 and the fps difference was like 5-6; it scored from 2940 to 2982.


----------



## Abyssic

Quote:


> Originally Posted by *phallacy*
> 
> Have you tried going lower on the memory and higher on the core? Memory clocks haven't been doing much for me. I ran heaven with my clocks at 1200/1300 and then again at 1200/1625 and the fps difference was like 5-6 scored from 2940 to 2982


yes, i first left everything on stock, then i went up on the core until it needed more voltage (that's where i realized that i can only get 1110mhz no matter what voltage), and after i finished the core, i went for the memory.


----------



## mojobear

Quote:


> Originally Posted by *chronicfx*
> 
> I messed around a little with ingame vs ccc vs off. Off is smoothest but was noticeably tearing and the other two stuttered. Although i have not messed with framepacing on off too much. I am also mining on the cards so i am trying to lay off and play playstation 3 whenever possible. When theif comes out i will probably play it so i would like to get this solved by then. What would you suggest as smoothest?


I run mine without vsync for 2 reasons. First was the stutter I experienced and second I get audio stutter with vsync on. I got the best experience with Vsync off in game and leaving CCC alone.


----------



## chronicfx

What is the latest and best driver uninstaller?


----------



## MojoW

Quote:


> Originally Posted by *chronicfx*
> 
> What is the latest and best driver uninstaller?


DDU (Display Driver Uninstaller) in my humble opinion.


----------



## Abyssic

Quote:


> Originally Posted by *MojoW*
> 
> DDU(Display Driver Uninstaller) in my humble oppinion.


yep, just used it today


----------



## chronicfx

Quote:


> Originally Posted by *Paul17041993*
> 
> what monitor/s and what framerates? provided the primary card has at least enough bandwidth for each slave card you should be fine, but how's that x4 lane attached to everything? if you're using lanes from the chipset that could be a performance concern...


What is the latest and best driver uninstaller?

I use a 1440p Catleap. Am considering triple monitor by summer, but this for now. I don't know how the pcie lanes work out; might be a question for @sin0822 because he is the king of the gigabyte boards. Framerates are not monitored, as i haven't messed with RadeonPro in a while; 7970 trifire did not need it (trifire eliminated the stutter). But the stutter is back with the more powerful cards. Now I am trying to reason out why and how to fix it. Perhaps i need to explore that avenue again.

Edit: gigabyte z77x-ud5h

I bought it based on sin's posts and this review of the trifire performance from AnandTech, and I was wrong; it looks to be x8/x4/x4 gen3

http://www.anandtech.com/show/6108/gigabyte-gaz77xud5h-review-functionality-meets-competitive-pricing/10


----------



## Abyssic

Quote:


> Originally Posted by *chronicfx*
> 
> What is the latest and beat driver uninstler?


see above


----------



## Loktar Ogar

Quote:


> Originally Posted by *Abyssic*
> 
> yes, i first left everything on stock, then i went up on the core until it needed more voltage (that's were i realized that i can only get 1110mhz no matter what voltage) and after i finished the core, i went for the memory.


What's your ASIC% score? I know this is not that relevant, but i just wanted to know. Try the OC settings again at +38mv, core 1110mhz and mem at default, restart the PC and start from there: OC the core, don't OC the mem, and check if you're still getting artifacts... Also, i'm not sure if using a different BIOS will help you OC better.
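That walk-the-core-up procedure can be sketched as a simple loop; this is only an illustration of the manual process, with `is_stable` standing in for actually running a stress test (Valley, Firestrike, etc.) at each step:

```python
# Step the core clock up until the stress test fails, then keep the last
# stable value - the same idea as the manual OC procedure described above.
def find_max_stable(start_mhz, step_mhz, is_stable, limit_mhz=1400):
    clock = start_mhz
    while clock + step_mhz <= limit_mhz and is_stable(clock + step_mhz):
        clock += step_mhz
    return clock

# Illustrative stand-in: pretend the card artifacts above 1110 MHz.
best = find_max_stable(1000, 10, lambda mhz: mhz <= 1110)
print(best)  # 1110
```

In practice you'd reboot between steps as suggested above, since stale OC settings can mask instability.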


----------



## Abyssic

Quote:


> Originally Posted by *Loktar Ogar*
> 
> What's your ASIC% score? I know this is not relevant but i just wanted to know. Try again the OC settings for +38mv core: 1110mhz and mem: default and restart PC and start from there and OC the core and don't OC the mem check if still having some artifacts... Also, i'm not sure if using a different BIOS will help you OC better.


75.9%

so you think i should just leave it at the settings i have right now, except the mem back to stock, and try overclocking the core more once i've rebooted?


----------



## hotrod717

What the..... $800-$850 for a new 290X at Newegg. I really don't know what to say other than I hope the crypto market falls on its face.


----------



## kskwerl

What's the best 290X to buy at the moment in terms of mining? Can anyone chime in on this?


----------



## chronicfx

No idea. I have a sapphire and two xfx at 875/1375 that do 850khash each.


----------



## taem

Quote:


> Originally Posted by *szeged*
> 
> wow really, 290x's going for $799? someone shoot me this cant be real. stupid newegg needs to burn already.


What's really great is only newegg ever seems to have cards in stock. The few outlets that don't mark up, amd doesn't ship to those guys. Just to newegg.


----------



## Loktar Ogar

Quote:


> Originally Posted by *Abyssic*
> 
> 75,9%
> 
> so you think i sould just leave it at the settings i have right now, except the mem back to stock, and try overclocking the core more once i rebooted?


You have a higher ASIC% than mine at 70%, but i can reach up to 1200 on the core... Tried it for fun using Firestrike 1.1. Anyway, it was doable. Yeah, try leaving the settings but with the mem back to stock, and try the OC after a reboot. Sometimes i get glitches in the OC settings if i don't reboot... or try closing your AB/TriXX software after every OC change and opening it up again.


----------



## Abyssic

Quote:


> Originally Posted by *Loktar Ogar*
> 
> You have higher ASIC% than mine at 70% but i can reach up to CC1200... Tried it for fun using Firestrike 1.1. Anyway, it was doable. Yeah, try leaving the settings except mem back to stock and try OC after reboot. Sometimes i'm having glitch in OC settings if i don't reboot... or try closing your AB/TRIXX software every OC settings and open up again.


thanks for the advice. i will come back tomorrow with results ^^


----------



## kskwerl

Quote:


> Originally Posted by *chronicfx*
> 
> No idea. I have a sapphire and two xfx at 875/1375 that do 850khash each.


Are there any calculators that measure khash, all I can find is megahash and up


----------



## chronicfx

Quote:


> Originally Posted by *kskwerl*
> 
> Are there any calculators that measure khash, all I can find is megahash and up


Could be wrong, but Mhash is used for SHA-256, which is not profitable on gpu anymore. Khash is used for scrypt mining, so you would want to use that as your measure. Of course I am probably a long way away from being a great source on this. Perhaps a visit to the bitcoin cryptocurrency thread would be more helpful. Ivanlabrie is the guy who would know.
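For converting between the units, the prefixes are just factors of 1000 (1 MH/s = 1000 kH/s), so feeding a scrypt rig's numbers into an MH/s-based calculator is a one-liner; a quick sketch using the 850 kH/s per-card figure from the earlier post:

```python
# Scrypt hashrate unit conversion: 1 MH/s = 1000 kH/s.
def khash_to_mhash(khs):
    return khs / 1000.0

# Three cards at 850 kH/s each, as mentioned earlier in the thread.
total_khs = 3 * 850               # 2550 kH/s
print(khash_to_mhash(total_khs))  # 2.55 (MH/s)
```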


----------



## Jack Mac

If all goes as planned, I'll be selling my 290 for $525 on Friday and buying an XFX 290. Can't wait, I'm looking forward to having another toy to play with and hopefully it'll be quieter/cooler. Thing is, I'll be on integrated for over a week...


----------



## devilhead

just now saw that the new gpu-z shows the type of memory, in my case hynix

was it always like that?


----------



## BackwoodsNC

Quote:


> Originally Posted by *szeged*
> 
> wow really, 290x's going for $799? someone shoot me this cant be real. stupid newegg needs to burn already.


Well, it is tax time. Newegg sees sales increase on items like these. I have noticed that every year around tax time newegg raises prices on certain items. This year we just see it more with these cards. They are asking almost double MSRP on these, major price gouging. Newegg needs some online retail competition.


----------



## Mr357

Quote:


> Originally Posted by *devilhead*
> 
> just now saw, that the new gpu-z show type of memory, in my case hynix
> 
> 
> 
> 
> 
> 
> 
> it was always that?


Came with the latest update


----------



## Kenshiro 26

OT, but I figured some in here might want to know about it.

www.titanfall.com/beta


----------



## chronicfx

Quote:


> Originally Posted by *devilhead*
> 
> just now saw, that the new gpu-z show type of memory, in my case hynix
> 
> 
> 
> 
> 
> 
> 
> it was always that?


Darnit... two Elpidas and a Hynix. That is why I can't get my memory any higher than 1475 without seeing lines across my desktop.


----------



## FarmerJo

Hey guys! I am looking into getting a 290X (switching from a Titan) and was just wondering what are considered good clocks for memory, and at what voltage? Wanting any help I can get. Also, is there a BIOS that you recommend? I will be using an EK water block and looking for the best clocks I can get at a safe voltage. Also, what is considered a safe voltage for this card? Thanks for all the help!


----------



## phallacy

Quote:


> Originally Posted by *FarmerJo*
> 
> hey guys! i am looking into getting a 290x (switching from titan) and was just wondering what are considered good clocks for memory at what voltage? wanting any help i can get. also is there a bios that you recommend? i will be using a EK water block and looking for the best clocks i can get at a safe voltage. also what is considered a safe voltage for this card? thanks for all the help!


If you can get your memory to 1600+ you have one of the better cards, and 1700+ is among the best of the current ones out, based on results here and in the crossfire thread. For the core, 1200+ is good; 1300+ means a prime card. Voltage-wise, I'm going to say 1.35 or under is good for stability. I've never gone over 1.3 and don't really feel the need right now; that is using a +75 mV offset in Trixx. There are various BIOSes out now, but I think they may be more tailored for mining? Can't say for certain; supposedly the ASUS BIOS has the only voltage unlock.


----------



## Gabkicks

Well, my R9 290 is here, and it hasn't been working properly. I tried mining with it alongside my 780 and it wouldn't work, so then I tried gaming with just the R9 290 and the screen went black and the keyboard stopped responding. I tried mining again and after 20 seconds I get a bunch of strange artifacts (foreground windows break apart). I tried using DDU a few times, to no avail. Should I try formatting my OS drive, or is the card bad?


----------



## ebduncan

Quote:


> Originally Posted by *chronicfx*
> 
> Darnit... Two elpidas and a Hynix. That is why I can't get my memory any higher than 1475 without seeing lines across my desktop


Actually, the RAM being used doesn't really make a difference. It's widely believed the Hynix memory is better, but I've had both types of mem on the same cards, and it's really just luck of the draw.
Quote:


> Originally Posted by *FarmerJo*
> 
> hey guys! i am looking into getting a 290x (switching from titan) and was just wondering what are considered good clocks for memory at what voltage? wanting any help i can get. also is there a bios that you recommend? i will be using a EK water block and looking for the best clocks i can get at a safe voltage. also what is considered a safe voltage for this card? thanks for all the help!


Anything over 1200 core and 1500 mem is considered great. Voltage-wise, anything from stock to +100mV is considered safe; +200mV is safe as well. Much higher than that, and I hope you have some good cooling.


----------



## ArchieGriffs

Clearly I need to take off my tinfoil hat: installing AMD Catalyst Control Center, Windows pops up with a message, "Trust Advanced Micro Technologies?" "Who the hell is that? Is AMD trying to install 3rd-party software?" Biggest facepalm ever in the few seconds it took me to figure that out.

Anyways, it's finally here, and I'd rather show my swag on some forum than actually test the card.
I'd appreciate it if you could add me to the list, Arizonian. Thanks in advance; do it whenever you feel like it, don't feel compelled to rush, I'm not in any hurry. I'll likely be glued to the games I'm playing for the next few days/weeks, until I die of sleep deprivation.

R9 290, Tri-X, Sapphire, aftermarket.
http://www.techpowerup.com/gpuz/8uwfx/


----------



## Paul17041993

Quote:


> Originally Posted by *chronicfx*
> 
> What is the latest and best driver uninstaller?
> 
> I use a 1440p catleap. Am considering triple monitor by summer but this for now. I don't know how the pcie lanes work out might be a question for @sin0822 because he is the king of the gigabyte boards. Framerates are not monitored as i haven't messed with radeon pro in a while as 7970 trifire (trifire eliminated the stutter) did not need it. But the stutter is back with the more powerful cards. Now I am trying to reason out why and how to fix it. Perhaps i need to explore that avenue again.
> 
> Edit: gigabyte z77x-ud5h
> 
> I bought it based on sins and this review of the trifire performance from anandtech, and I was wrong looks to be x8/x4/x4 gen3
> 
> http://www.anandtech.com/show/6108/gigabyte-gaz77xud5h-review-functionality-meets-competitive-pricing/10


OK, yeah, if it's 8/4/4 all PCIe 3.0 split out from the CPU, you're golden even for Eyefinity or 4K setups.

Drivers-wise, I only ever do manual file removal and CCleaner; I never touch driver removers anymore, as they don't always do things nicely...

So the stuttering must be either drivers, or your CPU is bottlenecking very heavily...


----------



## Paul17041993

OK, did a quick voltage bump on air at 100% fan: with +100mV my card can do 1150/1625 (memory slider maxed), and she seems somewhat stable in Heaven at that. Any higher on the core, though, and artifacts appear even at stock memory, so it would definitely need more voltage there. Not going to bother unlocking it further, as the blower can't keep up, but this definitely holds potential for an eventual water upgrade...

http://gpuz.techpowerup.com/14/02/12/eab.png


Spoiler: heaven







likely would crash in skyrim on these clocks and voltage but hey, runs heaven without artifacts...


----------



## Ukkooh

TBH, Heaven is not a very good stability test for these cards. In my experience, CS:GO shows artifacts earlier than Heaven, Valley, and BF4.


----------



## Maracus

Quote:


> Originally Posted by *chronicfx*
> 
> Darnit... Two elpidas and a Hynix. That is why I can't get my memory any higher than 1475 without seeing lines across my desktop


Yeah, they're tough to work out. I'm able to get the mem stable (Elpida) at *1250/1550* @ +175mV, +50mV aux. For a while there, above 1450 mem it would wig out on the desktop with no load; as soon as I loaded a benchmark it was fine. I think it had something to do with the clocks dropping down at idle.

I haven't tried +200mV, but I can only guess that if I'm lucky, *1270* core will be my limit, if it even makes that.

Arizonian

Could you add me to water please.


----------



## Gabkicks

My mistake was trying to use the beta drivers. I rolled back to the official ones and now the card seems to be working fine alone. I'll try running both cards again tomorrow, maybe, heh. I want to play some actual games with the 290 before my second one gets here.


----------



## Paul17041993

Quote:


> Originally Posted by *Ukkooh*
> 
> TBH heaven is not a very good stability test for these cards. From my experience cs:go shows artifacts earlier than heaven, valley and bf4.


Yeah, I wouldn't expect it to be stable at that anyway; it's just that ATM I can't actually do proper testing, as it will hit 95C after a couple of minutes. Have to wait till water before I'll know the true potential of this card...


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *Gabkicks*
> 
> My mistake was trying to use the beta drivers. i rolled back to official and now the card seems to be working fine alone. I'll try running both cards again tomorrow maybe heh. I want to play some actual games with the 290 before my 2nd one gets here.


Just sent you a PM about your card









Was actually curious if any others have an ASUS 290X DCUII; curious about the BIOSes on it. It seems the BIOS on mine has only been uploaded to TechPowerUp from a Sapphire card.


----------



## Gabkicks

Ooh, that is an ASUS GTX 780 DC2.


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *Gabkicks*
> 
> ooh that is a ASUS gtx 780 DC2.










excuse my derpiness


----------



## Arizonian

Quote:


> Originally Posted by *ArchieGriffs*
> 
> Clearly I need to take off my tinfoil hat, installing AMD catalyst control center, and windows pops up with a message "trust Advanced Micro Technologies?" "Who the hell is that, is AMD trying to install 3rd party software?". Biggest facepalm ever the few seconds after it took me to figure out that.
> 
> Anyways, it's finally here and I'd rather show my swag on some forum than I would actually test the card.
> I'd appreciate it if you could add me to the list Arizonian, thanks in advance, do it whenever you feel like it, don't feel compelled to rush, I'm not in any hurry, I'll likely be glued to the games I'm playing for the next few days weeks, until I die of sleep deprivation.
> 
> R9 290, Tri-X, Sapphire, aftermarket.
> http://www.techpowerup.com/gpuz/8uwfx/
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - welcome aboard - added









Quote:


> Originally Posted by *Maracus*
> 
> Ya they tough to work out I'm able to get the mem stable (Elpida) at *1250/1550*@ +175mv +50mv aux. For a while there above 1450mem would wig out on the desktop with no load, soon as i loaded a benchmark it was fine, think it had something to do with dropping down of the clocks during idle.
> 
> I haven't tried +200mv, but I can only guess if im lucky *1270*core will be my limit, if it even makes that.
> 
> Arizonian
> 
> Could you add me to water please.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated









Quote:


> Originally Posted by *Gabkicks*
> 
> My mistake was trying to use the beta drivers. i rolled back to official and now the card seems to be working fine alone. I'll try running both cards again tomorrow maybe heh. I want to play some actual games with the 290 before my 2nd one gets here.
> 
> 
> Spoiler: Warning: Spoiler!


Appreciate seeing the pics side by side and how they look in rig.









Post back with either a GPU-Z validation link or GPU-Z screen shot with validation tab open with your OCN name on it and I'll get you added to roster.

I like the Tri-X cards when they are side by side. Do you have a favorite of the two? I know it's just luck, but which one overclocks better?


----------



## taem

Quote:


> Originally Posted by *Gabkicks*
> 
> My mistake was trying to use the beta drivers. i rolled back to official and now the card seems to be working fine alone. I'll try running both cards again tomorrow maybe heh. I want to play some actual games with the 290 before my 2nd one gets here.


Does the overhang on the Tri-X really make a difference? An additional bit of heatsink, OK, maybe. But the shroud hanging over it and adding 10mm of length: is that really necessary? OCUK is reporting that the PowerColor PCS+, at 35mm less length, gets 3-5C better temps. And that card will actually fit in my case too. Sapphires are just too freakin' long.

I have the option of $510 at superbiiz or $650 at newegg. What to do, what to do...


----------



## Arizonian

Quote:


> Originally Posted by *taem*
> 
> Does the overhang on the tri-x really make a difference? Additional bit of heatsink, ok maybe. But the shroud hanging over it and adding 10mm of length, is that really necessary? OCUK is reporting that the Powercolor pcs+ at 35mm less length gets 3-5c better temps. And that card will actually fit in my case too. Sapphires are just too freakin long.
> 
> I have the option of $510 at superbiiz or $650 at newegg. What to do, what to do...


Go for the Superbiiz price, hands down. Newegg or any retailer asking those high prices should be forced to keep those cards on their shelves until they're priced accordingly. Good for Superbiiz getting your business.


----------



## Gabkicks

I just tried to OC the Tri-X without changing voltage. I bumped memory up to 1400 and the screen started going crazy D: so I went back to stock speeds. The ASUS is a GTX 780; it has Elpida RAM, so I didn't expect any OC out of its RAM. I thought the 290 with Hynix RAM would be able to hit 1480/1500 or so without touching voltage. Maybe my card is faulty?

I ordered one 290 from Newegg for $705 shipped and one from Superbiiz for $620 shipped. I already have the one from Newegg, and Superbiiz just emailed me that they have shipped it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gabkicks*
> 
> i just tried to oc the tri-x without changing voltage. i bumped memory up to 1400 and screen started going crazy D: so i went back to stock speeds. the asus is a gtx 780. it has elpida ram so i didnt expect any oc out of the ram. i thought that the 290 with hynix ram would be able to hit 1480/1500 or so without touching voltage. Maybe my card is faulty?
> 
> I ordered one 290 from newegg for $705 shipped and one from superbizz for $620 shipped. I alreayd have the one from newegg and superbiiz just emailed me that they have shipped it.


From what I've seen, the core voltage helps the memory clocks a little.

Try adding a little bit of voltage to the core and see if that helps?


----------



## MojoW

Quote:


> Originally Posted by *ArchieGriffs*
> 
> Clearly I need to take off my tinfoil hat, installing AMD catalyst control center, and windows pops up with a message "trust Advanced Micro Technologies?" "Who the hell is that, is AMD trying to install 3rd party software?". Biggest facepalm ever the few seconds after it took me to figure out that.
> 
> Anyways, it's finally here and I'd rather show my swag on some forum than I would actually test the card.
> I'd appreciate it if you could add me to the list Arizonian, thanks in advance, do it whenever you feel like it, don't feel compelled to rush, I'm not in any hurry, I'll likely be glued to the games I'm playing for the next few days weeks, until I die of sleep deprivation.
> 
> R9 290, Tri-X, Sapphire, aftermarket.
> http://www.techpowerup.com/gpuz/8uwfx/


Can you upload the BIOS of your card? That's a newer BIOS than anything in the TechPowerUp BIOS collection.


----------



## taem

Quote:


> Originally Posted by *Arizonian*
> 
> Go for the superbiz price hands down. Newegg or any retailer asking those high prices should be forced to keep those on their shelves until it's priced accordingly. Good for superbiz getting your business.


Heh, yeah, I went with Superbiiz. They're just up the highway from me, I've ordered from them before, and they're solid. If there were a small price difference and it was an Iron Egg deal, I might have gone with Newegg. But not for +$140. My god, those people.

Anyway, I went to PowerColor's website, and they don't have an in-house overclocking utility. Is that a bad sign, lol? Is this a rinky-dink outfit?

So, which software should I use, I wonder. I use HWiNFO and RTSS for the OSD, so I don't need AB for that, but is AB the best option even without that?


----------



## ArchieGriffs

Quote:


> Originally Posted by *MojoW*
> 
> Can you upload the bios of your card? ? that's a newer bios than anything on techpowerup bios collection.


I won't have the time to do it until early evening tomorrow, but I can try then.


----------



## MojoW

Quote:


> Originally Posted by *ArchieGriffs*
> 
> I won't have the time to do it until early evening tomorrow, but I can try then.


No problem and thanks.


----------



## Besty007

Quote:


> Originally Posted by *iPDrop*
> 
> what issue were you having and which drivers did you change to and from? I'm getting some sporadic performance ever since I installed the new 14.1 drivers


Was on the 14.1 drivers, now back on the 13.12 drivers. I was getting black screens when launching some games.


----------



## Sgt Bilko

Quote:


> Originally Posted by *taem*
> 
> Heh yeah I went with superbiiz. They're just up the highway from me, have ordered from them before, they're solid. If there was a small price difference and it was an iron egg I might have gone with newegg. But not for +$140. My god those people.
> 
> Anyway, went to powercolor web site, they don't have an in-house overclocking utility. Is that a bad sign lol? Is this a rinky dink outfit?
> 
> So, which software should I use I wonder. I use hwinfo and RTSS for osd so I don't need AB for that, but is AB the best option even without that?


Either Afterburner or Sapphire Trixx works. Afterburner is better all-round but only goes up to +100mV unless you add commands to it.

Trixx supports +200mV right off the bat, but you don't get the same graphs AB gives you.


----------



## Paul17041993

Quote:


> Originally Posted by *Gabkicks*
> 
> i just tried to oc the tri-x without changing voltage. i bumped memory up to 1400 and screen started going crazy D: so i went back to stock speeds. the asus is a gtx 780. it has elpida ram so i didnt expect any oc out of the ram. i thought that the 290 with hynix ram would be able to hit 1480/1500 or so without touching voltage. Maybe my card is faulty?
> 
> I ordered one 290 from newegg for $705 shipped and one from superbizz for $620 shipped. I alreayd have the one from newegg and superbiiz just emailed me that they have shipped it.


No, the newer memory controller is more lightweight and very sensitive to memory clocks. Pump it with +100mV and you should be able to take it to 1500 easily; mine, for example, seems to handle 1625 OK with 1150 core, though I haven't had the chance to fully test and push it yet. The extra volts should make 1100/1500 stable at least.


----------



## rdr09

Quote:


> Originally Posted by *Gabkicks*
> 
> well, my r9 290 is here , and It hasnt been working properly
> 
> 
> 
> 
> 
> 
> 
> . I tried mining with it with my 780 and it wouldnt work. so then i tried gaming with just the r9 290 and computer screen went back and kb stopped responding. I tried mining and after 20 seconds i get a bunch of strange artifacts (foreground windows break apart). I tried using DDU a few times to no avail. Should i try Formatting my OS drive? or is the card bad?


competing gpus in one system - not good.


----------



## Abyssic

Quote:


> Originally Posted by *Abyssic*
> 
> thanks for the advice. i will come back tomorrow with results ^^


thanks to Loktar Ogar ^^

With your rebooting method, I managed to squeeze out 1140MHz completely stable with +100mV. I don't want to go above +100mV for peace of mind. My memory actually works flawlessly at 1600MHz.







I'm satisfied now.


----------



## Ukkooh

Abyssic, how much actual voltage are you getting at a +100mV offset? I hate this offset trend in this club, because that data is meaningless on its own. If your actual voltage is below 1.3V and you have proper cooling, just feed it some more volts.
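The point about offsets can be made concrete: an offset only tells you anything relative to the card's stock VID, which varies sample to sample. A rough sketch (the 1.150 V and 1.250 V stock values below are illustrative assumptions, not measured figures; read the real value from a sensor tool like GPU-Z or HWiNFO):

```python
# A software offset only means something relative to the card's stock
# VID, which varies from sample to sample.
def estimated_load_voltage(stock_vid_v: float, offset_mv: float) -> float:
    """Rough upper-bound estimate of load VDDC from an offset.

    Ignores vdroop/LLC; always confirm against an actual sensor
    readout (GPU-Z, HWiNFO).
    """
    return stock_vid_v + offset_mv / 1000.0

# Two cards with the same +100 mV offset can sit at very different
# actual voltages:
print(round(estimated_load_voltage(1.150, 100), 3))  # 1.25
print(round(estimated_load_voltage(1.250, 100), 3))  # 1.35
```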


----------



## Abyssic

I was just too lazy to write down the actual voltage ^^ I get around 1.2V at max. I would add a bit more voltage, but I also want to stay with AB. Is there any way to exceed the limits of AB?


----------



## MojoW

To set +150mV offset:
MSIAfterburner.exe /wi4,30,8d,18

To restore original voltage:
MSIAfterburner.exe /wi4,30,8d,0

These are AB command-line switches, but the setting will not stick after a restart.
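Since the offset is lost on restart, one option is to reapply it automatically at login. An untested sketch (the install path is an assumption; adjust it to your system, and drop a shortcut to the file into the `shell:startup` folder):

```shell
@echo off
rem Reapply the +150mV offset at login; place a shortcut to this file
rem in the shell:startup folder.
rem NOTE: the path below is an assumption - adjust to your install.
cd /d "C:\Program Files (x86)\MSI Afterburner"
MSIAfterburner.exe /wi4,30,8d,18
```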


----------



## Ukkooh

Quote:


> Originally Posted by *Abyssic*
> 
> i was just too lazy to write down the actual voltage ^^ i get around 1.2v at max. i would add a bit more voltage but i also want to stay with AB is there any way to exeed the limits of AB?


Use trixx. It allows +200mV.


----------



## Abyssic

Thanks guys, but I just did a new reading and realized that my voltage is actually ~1.3V under load, so I won't increase it any further anyway. I guess I have to accept that I don't have a good sample ^^ Well, +140MHz over reference is still better than nothing ^^
(max temps around 80°C)


----------



## ArcticZero

Quick question: Can you flick the BIOS switch while the computer is powered on? I was thinking of flashing a mining BIOS on one side, and gaming BIOS on the other.


----------



## rdr09

Quote:


> Originally Posted by *Abyssic*
> 
> thanks guys but i just did a new reading and i realized that my voltage actually is ~1.3v under load so i won't increase it any further anyways. i guess i have to accept that i don't have a good sample ^^ well, +140mhz over reference are still better than nothing ^^
> (max temps around 80°C)


1140 stable? That's like 1200 on a 290, so I'd say that's very good. Have you played with the Power Limit?

Quote:


> Originally Posted by *ArcticZero*
> 
> Quick question: Can you flick the BIOS switch while the computer is powered on? I was thinking of flashing a mining BIOS on one side, and gaming BIOS on the other.


yes, just be careful not to touch anything else.


----------



## ArcticZero

Quote:


> Originally Posted by *rdr09*
> 
> yes, just be careful not to touch anything else.


Great, thanks! Rep+


----------



## rdr09

Quote:


> Originally Posted by *ArcticZero*
> 
> Great, thanks! Rep+


You have a 290 (non-X)? I believe one BIOS is locked for safety; the other is flashable. Make sure to save the BIOS first.


----------



## Abyssic

Quote:


> Originally Posted by *rdr09*
> 
> 1140 stable? that's like 1200 on a 290. so, I'd say that's very good. you played with the Power Limit?


I was aiming for 1200 ^^ but not too bad. I set the power limit to max. No need for limitations ^^


----------



## MojoW

Quote:


> Originally Posted by *rdr09*
> 
> you have a 290(nonX)? I believe one bios is locked for safety. the other is flashable. make sure to save the bios first.


Nope, I flashed both, and both are flashable in my experience.
And my 290 is locked, so it's not a 290X undercover.


----------



## rdr09

Quote:


> Originally Posted by *Abyssic*
> 
> i was aiming for 1200 ^^ but not too bad. i set the power limit to max. no need for limitations ^^


no, not bad at all.

Quote:


> Originally Posted by *MojoW*
> 
> Nope i flashed both and both are flashable in my experience.
> And my 290 is locked so it's not a 290x undercover.


Wow, that is dangerous. Mine is locked too, according to an app. I am such a wuss about flashing GPUs. Mobos, yes, that is easy.


----------



## chronicfx

Does the power limit help with memory instability when overclocking the 290X? If not, how do you increase mem voltage, and what is safe? I cannot get past 1450 without lines across my desktop whenever I move my mouse.


----------



## MojoW

Quote:


> Originally Posted by *chronicfx*
> 
> Does power limit help memory instability when overclocking the 290x. If not how do you increase mem voltage and what is safe? I cannot get past 1450 without line across my desktop whenever i move my mouse.


Nope, only core voltage will help with memory stability.


----------



## MojoW

Quote:


> Originally Posted by *rdr09*
> 
> no, not bad at all.
> wow, that is dangerous. mine is locked, too, according to an app. i am such a wuss in flashing gpus. mobo's, yes, that is easy.


Lol, not at once, but I was looking for the best BIOS for my card.
And if one BIOS is working, you don't have to worry about bricking, as you can easily unbrick with the working BIOS.


----------



## Coree

Hey people, got a brand new Sapphire R9 290X for only 415€. (Remember, here in Finland prices are roughly 1€ = $1.) So the price was pretty awesome.
So, I modified the cooler. I used the cheap Accelero S1 Plus (19€), which was left over from my 7870 LE. Even though the S1 Plus doesn't officially list 290X support, it really does fit, as the screw holes line up perfectly.
I combined the cooler with 2x quiet 140mm TY-147 fans, which are rated at 21dBA / 73CFM (got these for free), and screwed them into place. I left the VRAM passive, as I'm not OCing the memory. (I believe the ASUS DC II 290/X has passively cooled memory chips too.) For the VRMs, I chopped up a stock AMD heatsink. (The stock heatsinks aren't useless!)
So, the most interesting part: temperatures at stock?!
Core dropped from 94C -> 73C load (BF3, no throttling, uber BIOS enabled).
VRMs: VRM1 82C -> 70C; VRM2 66C -> 62C.
So, not bad? And the noise levels: these fans are quiet, as I said, even at max speed.
Hope this helped you people, thanks for reading









GPU validation: http://www.techpowerup.com/gpuz/7nuek/

Pics:










Sorry for the quality :L


----------



## disintegratorx

Yup, just ordered the PowerColor LCS 290X.. hope it's worth it, lol. It was over $900 with the cost of Newegg's next-day shipping..


I'll order the waterblock for it tomorrow. Going with what was recommended by a member here.







Can't wait until its all set up, then its time to take a few days off and get properly acquainted with some BF4.









I will be asking to join the club within the next few days. Til then guys!


----------



## phallacy

Quote:


> Originally Posted by *disintegratorx*
> 
> Yup. Just ordered the Powecolor LCS 290x.. Hope its worth it. lol It was over 900$ with the cost of Newegg's next day shipping..
> 
> 
> I'll order the waterblock for it tomorrow. Going with what was recommended by a member here.
> 
> 
> 
> 
> 
> 
> 
> Can't wait until its all set up, then its time to take a few days off and get properly acquainted with some BF4.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will be asking to join the club within the next few days. Til then guys!


Nice.. but doesn't that already come with an EK waterblock on it? If you're going to switch it out, you should try just getting a reference card and save some money.


----------



## disintegratorx

Yeah, it does.. I'm getting the cooling kit for it separately though, because I'm waiting to get more money from my paycheck. lol


----------



## Abyssic

Quote:


> Originally Posted by *chronicfx*
> 
> Does power limit help memory instability when overclocking the 290x. If not how do you increase mem voltage and what is safe? I cannot get past 1450 without line across my desktop whenever i move my mouse.


Well, you can try to fiddle with the aux voltage if your card allows it, but at your own risk. Aux voltage is a bit mysterious ^^ some say it helps keep overclocks stable, others say it does nothing, and others say it can cause serious damage.


----------



## Abyssic

Quote:


> Originally Posted by *Coree*
> 
> Hey people, got a brand new Sapphire R9 290X for only 415e. (Remember, here in Finland 1€ = 1$.) So the price was pretty awesome.
> So, I modified the cooler. I used the cheap Accelero S1 Plus (19€) , which was left behind from my 7870LE. Even if the S1 Plus doesn't have 290X support, it really does, as the screwholes line up perfectly.
> Combined the cooler with 2x quiet 140mm TY147 fans, which are rated @ 21dba 73CFM (Got these for free). Screwed 2 of them into place. I left the VRAM's passive, as i'm not OCing the memory. (I believe the Asus DC II 290/X has passively cooled memory chips too) For the VRMs, I chopped a stock AMD heatsink. (The stock heatsinks aren't useless!
> 
> 
> 
> 
> 
> 
> 
> )
> So the most interesting part, temperatures @ stock?!
> Core dropped from 94C -> 73C load. (BF3, no throttling, uber bios enabled)
> VRM's: VRM1 82C -> 70C. VRM2 66C -> 62C.
> So, not bad? And the noise levels, these fans are quiet as I said, even at max. speeds.
> Hope this helped you people, thanks for reading
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GPU validation: http://www.techpowerup.com/gpuz/7nuek/
> 
> Sorry for the quality :L


Very nice DIY cooling ^^ pretty impressive results there.


----------



## kizwan

When sharing temps, please don't forget to include ambient (indoor) temp. Without it the number is useless.
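One way to make temps comparable across posts is to report the delta over ambient rather than the absolute reading alone; a minimal sketch:

```python
# Reporting the rise over ambient makes temps comparable between
# different rooms and seasons; the absolute reading alone is not.
def delta_t(core_c: float, ambient_c: float) -> float:
    """Temperature rise of the core over ambient, in degrees C."""
    return core_c - ambient_c

# The same 73C load reading reflects very different cooling
# performance at different ambients:
print(delta_t(73, 20))  # 53
print(delta_t(73, 30))  # 43
```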


----------



## rdr09

Quote:


> Originally Posted by *disintegratorx*
> 
> Yup. Just ordered the Powecolor LCS 290x.. Hope its worth it. lol It was over 900$ with the cost of Newegg's next day shipping..
> 
> 
> I'll order the waterblock for it tomorrow. Going with what was recommended by a member here.
> 
> 
> 
> 
> 
> 
> 
> Can't wait until its all set up, then its time to take a few days off and get properly acquainted with some BF4.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will be asking to join the club within the next few days. Til then guys!


wut?!!!!!!!!!!!!! cancel that and get this . . .

http://www.superbiiz.com/detail.php?name=PC-R9290PP

GET 2.


----------



## kizwan

The card from Superbiiz is a *PCS+* card, while @disintegratorx wants to buy the *LCS* card. The LCS card comes with the waterblock installed.


----------



## Ukkooh

So far my 290x seems to be stable at 1200/1600 with +75mV offset (1.273V max). I guess I got a slightly better than average card.


----------



## Loktar Ogar

Quote:


> Originally Posted by *Abyssic*
> 
> thanks to Loktar Ogar ^^
> 
> With your rebooting method, i managed to squeeze out 1140mhz completely stable with +100mv. i don't want to go above +100mv for peace of mind. my memory actually works flawlessly at 1600mhz
> 
> 
> 
> 
> 
> 
> 
> i'm satisfied now


Glad it worked my friend!







I had the same error and just learned from experience.


----------



## Ukkooh

Quote:


> Originally Posted by *disintegratorx*
> 
> Yup. Just ordered the Powecolor LCS 290x.. Hope its worth it. lol It was over 900$ with the cost of Newegg's next day shipping..
> 
> 
> I'll order the waterblock for it tomorrow. Going with what was recommended by a member here.
> 
> 
> 
> 
> 
> 
> 
> Can't wait until its all set up, then its time to take a few days off and get properly acquainted with some BF4.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will be asking to join the club within the next few days. Til then guys!


I hope you didn't order the block yet as the powercolor lcs has a EK block preinstalled.

Edit: NVM this was already mentioned.


----------



## kizwan

Quote:


> Originally Posted by *Ukkooh*
> 
> So far my 290x seems to be stable at 1200/1600 with +75mV offset (1.273V max). I guess I got a slightly better than average card.


That is good.







My 290s can run at 1180/1250 with +100mV.







I can run the Valley benchmark @ 1200/1600 with +100mV, but it's not stable in Firestrike or BF4. What apps or games did you test at that clock?


----------



## Ukkooh

Quote:


> Originally Posted by *kizwan*
> 
> That is good.
> 
> 
> 
> 
> 
> 
> 
> My 290's can run at 1180/1250 with +100mV.
> 
> 
> 
> 
> 
> 
> 
> I can run Valley benchmark @1200/1600 with +100mV but it's not stable for Firestrike nor BF4. What apps or games did you test at that clock?


CS:GO and Valley. For some odd reason, CS gives artifacts more easily than BF4 or Valley with these cards. Do you know of any better stress-testing games for Hawaii?


----------



## ebduncan

Quote:


> Originally Posted by *kizwan*
> 
> That is good.
> 
> 
> 
> 
> 
> 
> 
> My 290's can run at 1180/1250 with +100mV.
> 
> 
> 
> 
> 
> 
> 
> I can run Valley benchmark @1200/1600 with +100mV but it's not stable for Firestrike nor BF4. What apps or games did you test at that clock?


Have you tried lowering the voltage? Sometimes more voltage is not better.

Quote:


> Originally Posted by *Ukkooh*
> 
> So far my 290x seems to be stable at 1200/1600 with +75mV offset (1.273V max). I guess I got a slightly better than average card.


Certainly better than average; I wouldn't call it a great sample.

My card does 1200/1625 with an average VDDC of 1.160 (+50mV set in Trixx). The card is water cooled, ambient temp 24C, card temp 50C at full load.

On air this card would only do 1150 core / 1500 mem with +100mV at 65% fan speed (reference cooler).

Dropping the temps from 90C to 50C full load with the water block was certainly worth it. I haven't tried for more than 1200/1600 yet; might do that tonight.


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> The card from superbiiz is *PCS+* card while @disintegratorx want to buy *LCS* card. LCS card come with waterblock installed.


I understand that, but $900? $500 for a 290 plus a $120 block is still a better deal. In benches the 290X might always come out on top, but in games: no difference.

Besides, not many 290X cards can beat my 290 in benches. It is all silicon lottery.

Edit: $900 - I'd get me a Titan Black whenever. Oh wait, I play Battlefield, so I have to stick with AMD.


----------



## kizwan

Quote:


> Originally Posted by *Ukkooh*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> That is good.
> 
> 
> 
> 
> 
> 
> 
> My 290's can run at 1180/1250 with +100mV.
> 
> 
> 
> 
> 
> 
> 
> I can run Valley benchmark @1200/1600 with +100mV but it's not stable for Firestrike nor BF4. What apps or games did you test at that clock?
> 
> 
> 
> 
> 
> 
> 
> CS:GO and valley. For some odd reason cs gives artifacts easier than bf4 or valley with these cards. Do you know of any better stress testing games for hawaii?

IMO if it passes Firestrike Extreme, 3DMark 11, BF3 & BF4 (or any games you're playing), then I say it's stable.
Quote:


> Originally Posted by *ebduncan*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> That is good.
> 
> 
> 
> 
> 
> 
> 
> My 290's can run at 1180/1250 with +100mV.
> 
> 
> 
> 
> 
> 
> 
> I can run Valley benchmark @1200/1600 with +100mV but it's not stable for Firestrike nor BF4. What apps or games did you test at that clock?
> 
> 
> 
> 
> 
> 
> 
> have you tried to lower the voltage? sometimes more voltage is not better.

I did try lower voltages, but my cards do need more voltage to get stable. I can even set the voltage to +200mV and try any clocks below 1200 core & below 1600 memory, and they're all stable.


----------



## chronicfx

Quote:


> Originally Posted by *MojoW*
> 
> Nope only core voltage will help with memory stability.


Thank you. How do I raise my core voltage? I use Afterburner for my fan speeds, but the slider on top is blank where it should let me adjust core voltage - is there something I have to check? GPU-Z has VDDC for all three cards at 1.045V. I assume I have some headroom? 1.2ish volts max for 24/7, or can I go higher?


----------



## ebduncan

Quote:


> Originally Posted by *rdr09*
> 
> i understand that but $900? $500 for a 290 plus a $120 block is still a better deal. In benches, the 290X might always come on top but in games - no difference.
> 
> besides, not many 290X can beat my 290 in benches. it is all silicon lottery.
> 
> edit: $900 - I'll get me a Black Titan whenever. Oh wait, I play Battlefield so have to stick to AMD.


The 290X has 10% more stream processors: 2816 vs 2560. On average the 290X is 4-5% faster at the same clocks, i.e. you could hit 1300MHz on your 290 and a 290X would match it at ~1235. I do agree to an extent that it's more about the silicon lottery, though. Some 290s will beat 290Xs when comparing overclocked results, i.e. when a great overclocker goes up against a weak overclocker.
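A quick back-of-the-envelope sketch of the clock-for-clock comparison above. The shader counts are the official specs; the ~4.5% per-clock advantage is the observed average from the post, not a measurement of mine:

```python
# Rough clock-equivalence math for Hawaii: 290 vs 290X.
# Real scaling (~4-5%) is less than the raw 10% shader-count difference,
# which is why the break-even clock gap is smaller than you'd expect.

SP_290 = 2560
SP_290X = 2816

raw_advantage = SP_290X / SP_290 - 1   # theoretical ceiling from shaders alone
observed_advantage = 0.045             # ~4-5% in practice (per the post)

def equivalent_290x_clock(clock_290_mhz, advantage=observed_advantage):
    """Clock a 290X needs to roughly match a 290 at clock_290_mhz."""
    return clock_290_mhz / (1 + advantage)

print(f"raw shader advantage: {raw_advantage:.1%}")                  # 10.0%
print(f"1300 MHz 290 ~= {equivalent_290x_clock(1300):.0f} MHz 290X")  # ~1244
```

With a 4.5% per-clock advantage the break-even clock works out to ~1244MHz, close to the ~1235 figure quoted above.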

Quote:


> Thank you. How do i raise my core voltage? I use afterburner for my fan speeds but the slider on top is blank where it should be able to adjust core voltages, is there something i have to check? Gpu-z has vddc for all three cards at 1.045v. I assume i have some headroom? 1.2ish volts max for 24/7 or can i go higher?


You must be using an older version of Afterburner. You need beta 18, or use another program such as TriXX. 1.2V is fine for 24/7 as long as temps are under control. Hawaii is just like Tahiti, only bigger, so up to 1.3V is considered safe.


----------



## Duvar

For everyone searching for a new cooler http://extreme.pcgameshardware.de/grafikkarten/319198-neuer-arctic-cooling-hybrid-ii-ist-angekommen-8.html#post6154976


----------



## rdr09

Quote:


> Originally Posted by *ebduncan*
> 
> the 290x has 10% more stream processors. 2816 stream vs 2560 On average the 290x is 4-5% faster at the same clocks. Ie you could hit 1300 mhz on your 290 and a 290x would match it at 1235. I do agree to an extent it is more about silicon lottery though. Some 290s will beat 290xs when comparing overclocked results ie when a great overclockers goes up against a weak overclocker.
> you must be using a older version of after burne. You need beta 18, or use another program such as Trix. 1.2 volts is fine for 24/7 as long as temps are under control. Hawaii is just like Tahiti just bigger so up to 1.3 volts is considered safe.


prolly 1250 on the 290X . . .

http://www.3dmark.com/3dm11/7748882

to match that. Like I said, in games - zero difference, 'cause most 290s will OC at least 50-70MHz.

edit: again, $900? I hope that's not BS.


----------



## Ukkooh

Quote:


> Originally Posted by *rdr09*
> 
> i understand that but $900? $500 for a 290 plus a $120 block is still a better deal. In benches, the 290X might always come on top but in games - no difference.
> 
> besides, not many 290X can beat my 290 in benches. it is all silicon lottery.
> 
> edit: $900 - I'll get me a Black Titan whenever. Oh wait, I play Battlefield so have to stick to AMD.


How high does your 290 go then?


----------



## MojoW

Quote:


> Originally Posted by *chronicfx*
> 
> Thank you. How do i raise my core voltage? I use afterburner for my fan speeds but the slider on top is blank where it should be able to adjust core voltages, is there something i have to check? Gpu-z has vddc for all three cards at 1.045v. I assume i have some headroom? 1.2ish volts max for 24/7 or can i go higher?


In the settings tab of AB you can enable voltage control.
I would say 1.2V is safe for 24/7, but it all depends on your temps; 1.3V if you are under water.
Yes, I think you have some headroom, but you'll have fun finding out. Good luck.

Edit: Indeed, as ebduncan said, make sure you have AB 3.0.0 beta 18.


----------



## Abyssic

Quote:


> Originally Posted by *Ukkooh*
> 
> CS:GO and valley. For some odd reason cs gives artifacts easier than bf4 or valley with these cards. Do you know of any better stress testing games for hawaii?


I tested a lot of benchmarks and games, and Far Cry 3 was the one that showed artifacts even when the others didn't, so I used it for validation.
(The other apps were: 3DMark 11, Heaven, Valley, Crysis 1 & 3, Trine 2, BF4, Hitman: Absolution, Sleeping Dogs.)


----------



## rdr09

Quote:


> Originally Posted by *Ukkooh*
> 
> How high does your 290 go then?


Uk, I play my games at stock. I only OC'ed during the benches. 1320/1620 is my highest in 3DMark.


----------



## disintegratorx

Yeah, the Titan may win in overall FPS when it comes to BF4, but if you look at the 99th-percentile frame-time benchmarks, the AMD 290X scores much better, thanks to the Mantle API.








So what I gather from that is that AMD has the overall best scores in low-latency performance.


----------



## rdr09

Quote:


> Originally Posted by *disintegratorx*
> 
> Yeah, the Titan may win in overall FPS when it comes to BF4, but if you look at the scores from the 1 in 99th frame benchmarks, the AMD 290x card scores much better, thanks to their MANTLE API.
> 
> 
> 
> 
> 
> 
> 
> 
> So what I gather from that is that AMD have the overall best scores in lowest latency performance.


Even without Mantle, AMD for Battlefield. Forget about who gets the highest FPS; there are other, more important things in play. IMO, AMD for Battlefield games.

have you cancelled yet?


----------



## jerrolds

Quote:


> Originally Posted by *Abyssic*
> 
> 75,9%
> 
> so you think i sould just leave it at the settings i have right now, except the mem back to stock, and try overclocking the core more once i rebooted?


Have you tried GPU Tweak to overclock? If it's a reference 290, I found the ASUS BIOS worked best for me (for light overclocking, ~1200MHz).


----------



## Abyssic

Quote:


> Originally Posted by *jerrolds*
> 
> Have you tried GPU Tweak to overclock? If its a reference 290 then i find the ASUS BIOS worked best for me (for light overclocking ~1200mhz)


First of all, it's a 290X ^^ Thanks for your help, but I'm already done overclocking this card. I'm leaving it at 1140 core and 1600MHz mem @ 1.3V because I don't want to go higher on the voltage, and I would have to pump in much more voltage to get near 1200MHz.


----------



## Gabkicks

Here is a pic of my sapphire 290 and asus gtx 780. My 2nd 290 should be here monday.








I guess I can use the GTX 780 as a PhysX card for my 290s


----------



## jerrolds

Quote:


> Originally Posted by *Abyssic*
> 
> first of all, it's a 290x ^^ thanks for your help but i'm already done overclocking this card. i'm leaving it at 1140 core and 1600mhz mem @ 1.3v becasue i don't want to go higher on the voltage and i would have to pump much more voltage to get near to 1200mhz.


Ah, then 1140 isn't bad. For me to get anywhere close to 1200 I have to max out GPU Tweak at 1.410V, heh - I'm comfortably at 1170/1500 +80mV. This is at 75.0% ASIC, if it matters.


----------



## Paul17041993

Quote:


> Originally Posted by *Ukkooh*
> 
> So far my 290x seems to be stable at 1200/1600 with +75mV offset (1.273V max). I guess I got a slightly better than average card.


whats your ASIC?


----------



## the9quad

I'm pretty sure it's been shown that none of the programs can read the ASIC quality correctly on the 290/290X, so those numbers mean nothing.


----------



## Abyssic

Quote:


> Originally Posted by *jerrolds*
> 
> AH then 1140 isnt bad, for me to get anywhere close to 1200 i have to max out gpu tweak at 1410mv heh - im comfortably at 1170/1500 +80mv. This is at 75.0% ASIC if it matters.


ASIC matters only insofar as it's accurate ^^ you can certainly get some rough information from it ^^. I got 75% ASIC as well, so you see it's not really an accurate way to measure things.


----------



## Ukkooh

Quote:


> Originally Posted by *Paul17041993*
> 
> whats your ASIC?


According to GPU-Z, 79.3%. Oddly enough my first 290X had 73.9%. AFAIK ASIC readings on Hawaii are not accurate though.
http://forum.beyond3d.com/showpost.php?p=1808073&postcount=2130


----------



## Paul17041993

Heh, so I tried TriXX and had it pumping +200mV at 1200/1700 before she blackscreened. I can only assume the VRMs overheated, as I didn't get the chance to check GPU-Z...








Quote:


> Originally Posted by *Ukkooh*
> 
> According to gpu-z 79.3%. Oddly enough my first 290x had 73.9%. AFAIK asic readings on hawaii are not accurate though.
> http://forum.beyond3d.com/showpost.php?p=1808073&postcount=2130


hm yea there could be a slight trend similar to the 79x0s, likely still too early to tell though...


----------



## Ukkooh

If you blackscreened, it is most likely your VRAM being unstable at 1700. I doubt the VRMs are going to overheat at idle. What cooler do you have on your 290X?


----------



## Abyssic

Hey guys, I've uploaded some new pictures of my current setup. Feel free to rate and hate xD


----------



## chronicfx

First there were two then there were three


----------



## Jedson3614

I have a question about these cards, or any R-series card. I was browsing Newegg, just looking at the specs of different R9 GPUs. I noticed MSI had one with boost clocks similar to GeForce cards - something like OC mode, gaming mode, and silent mode. Almost every other R-series card just lists a static clock rate; I thought the idea of these newer cards was that they had some form of boost feature. Apparently they don't. So when overclocking these cards, do you just increase one static clock and switch profiles for 2D/3D? How is this handled in the Radeon world lately? I have a GTX 670 and have not owned an AMD card in quite some time.


----------



## Abyssic

Quote:


> Originally Posted by *chronicfx*
> 
> 
> 
> 
> 
> First there were two then there were three


woah those colors xD


----------



## chronicfx

Quote:


> Originally Posted by *Abyssic*
> 
> woah those colors xD


Haha, it all glows differently when the UV is on. I have four 6-inch sticks attached to the door. Orange fans and UV blue tube with an orange Raystorm. I like it.


----------



## Abyssic

Quote:


> Originally Posted by *chronicfx*
> 
> Haha it all glows different when the uv is on. I have 4 6 inch sticks attached to the door. Orange fans and uv blue tube with orange raystorm. I like it


Oh yeah, I can imagine that with UV. But it looks a bit hideous in daylight ^^


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Arizonian*


Now I can officially be updated lol


----------



## Paul17041993

Quote:


> Originally Posted by *Ukkooh*
> 
> If you blackscreened it is most likely your vram being unstable on 1700. I doubt the VRMs are going to overheat at idle. What cooler do you have on your 290x?


Nah, it was under load. The core was reaching 95C even with the fan (stock blower) at full tilt. I went to check GPU-Z for the VRMs, but she blacked out just before I could click the sensors tab. Might try it again tomorrow morning if it's cold enough, just to see...


----------



## Abyssic

Quote:


> Originally Posted by *Paul17041993*
> 
> (stock blower)


there's the problem xD


----------



## Paul17041993

Quote:


> Originally Posted by *Abyssic*
> 
> there's the problem xD


obviously


----------



## Widde

Just had a hard freeze in BF4 with the 13.12 driver. Restarted the PC and CCC didn't load - it said I didn't have any compatible hardware installed. Rebooted into safe mode, used DDU, restarted, installed 14.1, disabled ULPS, and rebooted again. Now it won't find my 2nd card, and GPU-Z shows this: http://piclair.com/yqgrh. Don't know what's wrong.

And changing EnableULPS in regedit doesn't seem to work, but it finds the 2nd card now.

Nvm xD Apparently you're not allowed to "shut down" - you have to restart. Rebooted and now it works.


----------



## conwa

Can I be added to the club please? (XFX 290X on water)


----------



## Paul17041993

Quote:


> Originally Posted by *Widde*
> 
> Just had a hard freeze in bf4 with the 13.12 driver, restarted the pc and ccc didnt load, it said I didnt have any compatible hardware installed, rebooted in safe mode used DDU restarted installed 14.1 disabled ULPS and rebooted again, Now it wont find my 2nd card and gpu-z shows this http://piclair.com/yqgrh. Dont know whats wrong


Could you test the card by itself? I don't remember if 14.1 was CrossFire-happy; hopefully the card hasn't blown a phase or IC...


----------



## Widde

Quote:


> Originally Posted by *Paul17041993*
> 
> could you test the card by itself? and I don't remember if 14.1 was crossfire-happy, hopefully the card hasn't blown a phase or IC...


Works now for some reason - rebooted a 2nd time and now it works. I also have a problem with the 13.12 drivers: at the end of the installation it says something about warnings and the icon is yellow, but everything seems to work. In the log it just says "Advanced Micro Devices" something, and no errors. It seems happy for now.


----------



## szeged

The Gigabyte 290X is up to $899.99 on Newegg now, the XFX 290X is up to $849.99, and the MSI 290X is at $799.99.

Prices going up and up and up overnight.

http://www.newegg.com/Desktop-Graphics-Cards/SubCategory/ID-48?Order=PRICED


----------



## cam51037

Quote:


> Originally Posted by *szeged*
> 
> gigabyte 290x is up to $899.99 on newegg now, xfx 290x is up to $849.99, msi 290x is at $799.99
> 
> prices going up and up and up overnight.
> 
> http://www.newegg.com/Desktop-Graphics-Cards/SubCategory/ID-48?Order=PRICED


The Sapphire 290 Tri-X is listed as $700 when you search for it on Newegg... on the product page it's $750. That's insane!


----------



## Cool Mike

Prices are beyond crazy now. Getting to the point of needing a price-gouging investigation.


----------



## szeged

Who's up for calling the BBB on Newegg? lol


----------



## the9quad

Quote:


> Originally Posted by *szeged*
> 
> whos up for calling the BBB on newegg lol.


I am hoping the price keeps going up so I can sell these cards for $700 each and get some 780ti's.


----------



## Eljoka

Quote:


> Originally Posted by *cam51037*
> 
> The Sapphire 290 Tri-X is listed as $700 when you search for it on Newegg... on the product page it's $750. That's insane!


In the meantime, I just bought a 780 for $400 (Canada). I'm OK with paying $400 less and passing on mining crypto with it. Hell, I could outright buy a GTX 780 and $400 worth of Bitcoin/Litecoin/Dogecoin instead of buying an R9 290X...


----------



## Cool Mike

Wonder if AMD has increased pricing on their end. Everyone taking advantage of the opportunity.


----------



## Derpinheimer

Quote:


> Originally Posted by *the9quad*
> 
> I am hoping the price keeps going up so I can sell these cards for $700 each and get some 780ti's.


If this Bitcoin "crash" doesn't reverse, expect the prices to drop.

Note that a Bitcoin price drop results in a much sharper drop in the prices of altcoins.

E.g. if Bitcoin goes down 20%, the exchange rate of, say, Ultracoin:BTC will not go up 20% to retain the same value in USD, nor will it stay the same in terms of BTC - it will drop too. So a 20% price drop in Bitcoin may translate to a 40% drop in the price of a random scrypt coin. Check out Cryptsy and see how a great many of the coins have been dropping hard since the "crash", the main exceptions right now being Dogecoin (up 66%) and Infinitecoin (up 25%).
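The compounding effect described above is just multiplication of the two drops. A minimal sketch, with made-up illustrative prices (the 20%/25% figures are hypothetical, not market data):

```python
# Illustrative only: how a BTC drop compounds with an altcoin/BTC
# ratio drop into a larger USD-denominated loss. All numbers made up.

btc_usd_before = 800.0    # hypothetical BTC price in USD
alt_btc_before = 0.0010   # hypothetical altcoin price in BTC

btc_drop = 0.20           # BTC falls 20% in USD terms
ratio_drop = 0.25         # altcoin also falls 25% against BTC

btc_usd_after = btc_usd_before * (1 - btc_drop)
alt_btc_after = alt_btc_before * (1 - ratio_drop)

alt_usd_before = alt_btc_before * btc_usd_before
alt_usd_after = alt_btc_after * btc_usd_after

total_drop = 1 - alt_usd_after / alt_usd_before
print(f"altcoin USD drop: {total_drop:.0%}")   # 40%
```

The two drops multiply (0.80 x 0.75 = 0.60 of the original USD value), so a 20% BTC drop plus a 25% ratio drop lands at the ~40% figure mentioned above.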


----------



## pkrexer

And I was thinking $549 was a lot when I bought my 290x


----------



## King4x4

Good thing I hunted down my 290Xs and bought 4 in one shot for my gaming rig









Was priced at $570, but that was cheap compared to what they are priced at now!


----------



## SeanEboy

I have looked high and low... and still cannot find the difference between the R9-290A-ENFC (Core Edition) and the R9-290A-ENBC (Black Edition)?! What the hell? Can someone please enlighten me as to which one I should buy if I'm going to watercool? Dammit, it's so frustrating...


----------



## MrWhiteRX7

If you're water cooling it won't matter


----------



## Durquavian

Quote:


> Originally Posted by *Cool Mike*
> 
> Wonder if AMD has increased pricing on their end. Everyone taking advantage of the opportunity.


This has always been a good question, and quite overlooked, to be honest. It's possible they have now, but we would be guessing. But I can guarantee the first Bitcoin boom with these cards and the ensuing price hikes gave not one single extra cent to AMD. Generally AMD sells according to orders, so say Newegg (moneygrubbingmotha...) ordered an original amount - for argument's sake, let's say 1000 units. They sold 100 before the craze and then had 900 left, on which they decided to increase the price. Because they had already placed the purchase order with AMD for the 1000 at a certain cost, they would not be required nor obligated to pass the extra profit gained from the hike to AMD. I feel it should be illegal for such a thing to transpire, but (this is the unfortunate part) it isn't.
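The who-keeps-the-markup point above can be sketched with the post's hypothetical 1000-unit order. The wholesale cost and both retail prices below are assumed figures for illustration, not real AMD or Newegg numbers:

```python
# Hypothetical: retailer orders 1000 units at a fixed wholesale cost.
# AMD's revenue is locked in at order time; any later markup stays retail.

units_ordered = 1000
wholesale_cost = 500.0   # assumed per-unit price paid to AMD

msrp = 550.0             # assumed launch retail price
hiked_price = 800.0      # assumed price after the mining craze

sold_at_msrp = 100
sold_at_hike = units_ordered - sold_at_msrp

amd_revenue = units_ordered * wholesale_cost   # unchanged by the hike
retail_profit = (sold_at_msrp * (msrp - wholesale_cost)
                 + sold_at_hike * (hiked_price - wholesale_cost))

print(f"AMD revenue: ${amd_revenue:,.0f}")       # same with or without the hike
print(f"retailer profit: ${retail_profit:,.0f}")
```

Whatever the retail price does after the order is placed, `amd_revenue` is a constant; only `retail_profit` moves, which is the argument being made.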


----------



## Spectre-

Hey guys, I got my NZXT G10 and am running it with my Corsair H55, and I can't get above 1080MHz on the core.

Could it be that it's the limit of my card, since at stock it can't get above 1080MHz either?


----------



## SeanEboy

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> If you're water cooling it won't matter


Thanks for that... but still, what the hell is the difference? It's only $10 more for the Black. What can that even be?


----------



## chiknnwatrmln

So glad I got my 290 for $380... It was a good deal then, it's a steal now.


----------



## Redvineal

Quote:


> Originally Posted by *SeanEboy*
> 
> Thanks for that.. But still, what the hell is the difference? They're only $10 more for the black.. What can that even be?


Best I can tell the Black Edition comes slightly (VERY slightly) overclocked on the core, and scores you some "benefits" with XFX like early announcements and access to special community resources and events (whatevs). One theory is that XFX uses slightly better components in the Black Edition, but that's just a rumor that I've never seen proven.

In terms of performance, as stated before, if you're water cooling it doesn't matter. As a matter of fact, if you're *not* water cooling, it STILL doesn't matter.







Personally, I see it as marketing fluff.


----------



## MrWhiteRX7

I have 3 reference cards on water and they all do well. I wouldn't worry about it.


----------



## rdr09

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> So glad I got my 290 for $380... It was a good deal then, it's a steal now.


I feel like selling mine for $600?


----------



## SeanEboy

Cool, I guess I should just pull the effin' trigger already... $489 for XFX 290s... Not bad.. Not great, but not bad... XFX has lifetime warranty right?


----------



## Widde

Hmmm, just got a blackscreen at stock on the 14.1 drivers. Pumping up the voltage to +63mV (1.281V on card 1 and 1.234V on the 2nd); let's see if it helps ^^


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Widde*
> 
> Just had a hard freeze in bf4 with the 13.12 driver, restarted the pc and ccc didnt load, it said I didnt have any compatible hardware installed, rebooted in safe mode used DDU restarted installed 14.1 disabled ULPS and rebooted again, Now it wont find my 2nd card and gpu-z shows this http://piclair.com/yqgrh. Dont know whats wrong
> 
> And changing EnableULPS in regedit doesnt seem to work but it finds the 2nd card now
> 
> Nvm xD Aparently you're not allowed to "shut down" you have to restart, rebooted and now it works


I was having all sorts of dramas with 14.1 beta and bloody CrossFire. It would recognise both cards but only bench one???? Did the DDU removal and went back to the 13.11 beta 9.2. Will test when I get home; hopefully I will be back to benching 3DMark 11 at 200fps+.

Quote:


> Originally Posted by *SeanEboy*
> 
> Thanks for that.. But still, what the hell is the difference? They're only $10 more for the black.. What can that even be?


Spend the extra $10 and not worry about it


----------



## L36

Does anyone know if it's possible to enable VRM temperature monitoring in Afterburner for the 290/X?


----------



## Maracus

Quote:


> Originally Posted by *Spectre-*
> 
> hey guys i got my NZXT G10 and running it with my Corsair H55 and i cant get it above 1080mhz on core
> 
> could it be that its the limit of my card since on stock clock they cant get above 1080mhz as well


Depends - are you adding extra mV to the core, or just stock volts? Mine only hits 1100MHz at stock voltage, then needs +175mV to get 1250.


----------



## Arizonian

Quote:


> Originally Posted by *Coree*
> 
> Hey people, got a brand new Sapphire R9 290X for only 415e. (Remember, here in Finland 1€ = 1$.) So the price was pretty awesome.
> So, I modified the cooler. I used the cheap Accelero S1 Plus (19€) , which was left behind from my 7870LE. Even if the S1 Plus doesn't have 290X support, it really does, as the screwholes line up perfectly.
> Combined the cooler with 2x quiet 140mm TY147 fans, which are rated @ 21dba 73CFM (Got these for free). Screwed 2 of them into place. I left the VRAM's passive, as i'm not OCing the memory. (I believe the Asus DC II 290/X has passively cooled memory chips too) For the VRMs, I chopped a stock AMD heatsink. (The stock heatsinks aren't useless!
> 
> 
> 
> 
> 
> 
> 
> )
> So the most interesting part, temperatures @ stock?!
> Core dropped from 94C -> 73C load. (BF3, no throttling, uber bios enabled)
> VRM's: VRM1 82C -> 70C. VRM2 66C -> 62C.
> So, not bad? And the noise levels, these fans are quiet as I said, even at max. speeds.
> Hope this helped you people, thanks for reading
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GPU validation: http://www.techpowerup.com/gpuz/7nuek/
> 
> Pics:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sorry for the quality :L


Congrats - added









Quote:


> Originally Posted by *chronicfx*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> First there were two then there were three


I went to look for your name on the roster and either I missed you or you didn't properly make a submission. Would love to add you. Check OP to see what's needed or please link me to your post where you did.









Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Now I can officially be updated lol
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated








Quote:


> Originally Posted by *conwa*
> 
> Can I be added to the club please? (XFX 290X on water)
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Ukkooh

Quote:


> Originally Posted by *Cool Mike*
> 
> Wonder if AMD has increased pricing on their end. Everyone taking advantage of the opportunity.


No, they haven't, because the problem doesn't happen outside the USA. The pricing here has been the same from launch to this day.


----------



## Sazz

Ok, now I am gonna sell my 290X and get a 780Ti.. then get a 290X later on once price resumes to normal. LOLOL!


----------



## MrWhiteRX7

I really like my 780ti not going to lie, but 90% of the time I'm on my 290 rig









For a single GPU the Ti is a beast. You honestly can't go wrong either way. If you already have a 290/X, I don't see a real reason to go through the trouble of selling, ordering, replacing, etc...


----------



## kizwan

Quote:


> Originally Posted by *Widde*
> 
> Just had a hard freeze in bf4 with the 13.12 driver, restarted the pc and ccc didnt load, it said I didnt have any compatible hardware installed, rebooted in safe mode used DDU restarted installed 14.1 disabled ULPS and rebooted again, Now it wont find my 2nd card and gpu-z shows this http://piclair.com/yqgrh. Dont know whats wrong
> 
> And changing EnableULPS in regedit doesnt seem to work but it finds the 2nd card now
> 
> Nvm xD Aparently you're not allowed to "shut down" you have to restart, rebooted and now it works


Looks like I'm not alone. With the 14.1 beta drivers, whenever it crashed or blue-screened, my 2nd GPU also tended to disappear. Even after restarting the computer a couple of times, the 2nd GPU remained MIA. It would eventually come back if I shut down the computer & disconnected the PCIe power cables from the card for a couple of minutes. I also figured out that whenever it BSODs and auto-restarts (or it freezes & I push the restart button), the 2nd GPU will disappear. If it freezes or BSODs & I quickly push the power button to shut down, the 2nd GPU doesn't disappear. So I disabled auto-restart, & this has proved to work well for me.

Last night I got a BSOD (same bug check code as with the 14.1 beta driver, 0xa0000001, which shows it was caused by the ATI driver) with the 13.12 WHQL driver, & I did shut down the computer by pressing the case on/off button. When I turned my computer on later, the 2nd GPU had disappeared again. This time I did things differently. I uninstalled the driver in Control Panel; after the restart the 2nd GPU was still missing in Device Manager, but while Windows was installing the generic driver I heard the new-hardware-found sound & saw the 2nd GPU reappear in Device Manager. For good measure I uninstalled the remaining driver with DDU in safe mode & reinstalled a fresh 13.12 WHQL driver afterwards.

I never had a 0xa0000001 BSOD, or any BSOD, before except when using the 14.1 beta driver, but that only happened when I overclocked, & 14.1 is a buggy driver anyway. This is the first BSOD with a WHQL driver. I hope it's just a driver issue. Prior to the BSOD, I had reinstalled the 13.12 driver on top of the existing 13.12 driver because the driver version didn't show up in CCC.
Quote:


> Originally Posted by *Cool Mike*
> 
> Wonder if AMD has increased pricing on their end. Everyone taking advantage of the opportunity.


I doubt it, though. Prices in Malaysia are still normal, still MSRP.


----------



## chronicfx

Quote:


> Originally Posted by *Arizonian*
> 
> I went to look for your name on the roster and either I missed you or you didn't properly make a submission. Would love to add you. Check OP to see what's needed or please link me to your post where you did.


Sure


----------



## chronicfx

Quote:


> Originally Posted by *Abyssic*
> 
> woah those colors xD


It looks different with the UV on. The tubes actually match all the blue LED fans on my HAF case, and the orange fans on my two rads and Raystorm are a perfect match to each other. I still can't get the blue to show up on my iPhone exactly as it looks in the case... maybe my three-year-old has been messing with the colors in the camera app again. But rest assured it is definitely not the purple it looks like in daylight. It's a match to the color of the fans in the HAF blue editions, if you have seen them. Oh, and funny: those red EVGA PSU cables actually match the Raystorm and fans under UV light - they turn orange. What luck.


----------



## kizwan

Quote:


> Originally Posted by *chronicfx*
> 
> Sure


One Hynix & two Elpida?! I have one Hynix & one Elpida.


----------



## chronicfx

Quote:


> Originally Posted by *kizwan*
> 
> One Hynix & two Elpida?! I have one Hynix & one Elpida.


I wish they were all Hynix. But I am not sad; I am buying my awesome overclock by having three cards.

Well, really I thought it would eliminate the two-card stutter through trifire. We will see; so far Crysis 3 and BF4 are butter smooth, but Far Cry 3, Skyrim, and The Witcher 2 are a stutter fest (which I was hoping to replay before 3 comes out, with uber enabled this time). What I don't get is that Far Cry 3 was smooth as a baby's bottom with my 3 7970s... Maybe I have to go back to Windows 7, because I updated to 8.1 along with the card install? I did the mouse-lag registry fix...


----------



## kizwan

Quote:


> Originally Posted by *chronicfx*
> 
> I wished they would all be Hynix. But I am not sad, I am buying my awesome overclock by having three cards
> 
> 
> 
> 
> 
> 
> 
> Well, really I thought it would eliminate the two card stutter through trifire. We will see so far crysis 3 and BF4 are butter smooth but Far Cry 3, skyrim and the witcher 2 are stutter fest (Which I was hoping to replay before 3 comes out with uber enabled this time). Far Cry 3 was smooth as a babies bottom with my 3 7970 is what I don't get... Maybe I have to go back to windows 7 because I updates to 8.1 with the card install? I did the mouse lag registry fix...


Yeah, I was hoping for both Hynix too, but both are good cards, albeit ones that need a lot of volts when overclocking + benching. As far as I know, CPU usage in Windows 8/8.1 is slightly higher than in Windows 7. I don't know whether Far Cry 3 is also a CPU-intensive game like BF4, though. You might want to monitor CPU usage when it stutters.


----------



## MrWhiteRX7

I haven't tested The Witcher 2 with my triple 290s. I'll reinstall it and see if it's smooth or a stutter fest.

One of my cards is Elpida and the other two are Hynix. All three will hit 1100MHz on stock volts.


----------



## Arizonian

Quote:


> Originally Posted by *chronicfx*
> 
> Sure
> 
> 
> Spoiler: Warning: Spoiler!


Sweet. Congrats - added x 3


----------



## chronicfx

Quote:


> Originally Posted by *kizwan*
> 
> Yeah, I was hoping both Hynix too but both are good cards though, albeit need a lot of volts when overclock + benching. As far as I know, CPU usage in Windows 8/8.1 slightly higher than Windows 7. I don't know whether Far Cry 3 also CPU intensive game like BF4 though. You might want to monitor CPU usage when it stutter.


Very good point. I have been wondering if my CPU is handling these cards OK. Three of them is a lot; my CPU usage was high running my 7970s in quadfire on Win 7 before I put them on mining duty. My i5 is struggling now... I haven't done much gaming tweaking and monitoring because I am mining on them a lot. As much as people say the i5 is more CPU than you will ever need, they don't run three or four cards. I expect to play Thief in a few weeks, so I will have a better idea of where my performance is with newer games then, too.


----------



## Spectre-

Quote:


> Originally Posted by *Maracus*
> 
> Depends if you're adding extra mV to the core, or just stock volts? Mine only hits 1100MHz at stock voltage, then needs +175mV to get 1250


I am pushing a 150% power target and running the fans at full speed, and I still can't get above my old score.


----------



## Ukkooh

VRM temps were 46°C and 35°C with +125mV. I guess I could bench at 1300 core with this card.


----------



## Mercy4You

Quote:


> Originally Posted by *Ukkooh*
> 
> 
> 
> VRM temps were 46°C and 35°C with +125mV. I guess I could bench at 1300 core with this card.


My Sapphire R9290XTriX at 1150/1350 +30% Power +.75v

Unigine Valley Benchmark 1.0
Score:
*4793
*


----------



## Ukkooh

Quote:


> Originally Posted by *Mercy4You*
> 
> My Sapphire R9290XTriX at 1150/1350 +30% Power +.75v
> 
> Unigine Valley Benchmark 1.0
> FPS:
> 98.7
> Score:
> *4131
> *Min FPS:
> 22.8
> Max FPS:
> 153.8


Did you use the extreme hd preset?


----------



## Mercy4You

Quote:


> Originally Posted by *Ukkooh*
> 
> Did you use the extreme hd preset?


Let's see, uh.... Custom! Score 4793...


----------



## Ukkooh

Mine was done with extreme hd preset. Can you do a run with it for me?


----------



## Mercy4You

Quote:


> Originally Posted by *Ukkooh*
> 
> Mine was done with extreme hd preset. Can you do a run with it for me?


Ok, I'll give it a try!


----------



## Mercy4You

Quote:


> Originally Posted by *Ukkooh*
> 
> Mine was done with extreme hd preset. Can you do a run with it for me?


Here it is at 1150/1350 130% Power +.75V:

Extreme HD preset:

FPS:
67.1
Score:
*2809
*Min FPS:
31.1
Max FPS:
128.9


----------



## Abyssic

Quote:


> Originally Posted by *SeanEboy*
> 
> I have looked high and low.. And still cannot find the difference between the R9-290A-ENFC (Core Edition), and the R9-290A-ENBC (Black Edition)...?! What the hell? Can someone please enlighten me as to which one I should buy if I'm going to watercool? Dammit it's so frustrating...


XFX claims they use "GPU edging" for their Black Editions, which means they cherry-pick GPUs, but when I bought my 7950 BE I had to return it two times because it was faulty xD


----------



## kizwan

Quote:


> Originally Posted by *Mercy4You*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ukkooh*
> 
> 
> 
> VRM temps were 46°C and 35°C with +125mV. I guess I could bench at 1300 core with this card.
> 
> 
> 
> My Sapphire R9290XTriX at 1150/1350 +30% Power +.75v
> 
> Unigine Valley Benchmark 1.0
> Score:
> *4793
> *
Click to expand...

Quote:


> Originally Posted by *Mercy4You*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ukkooh*
> 
> Mine was done with extreme hd preset. Can you do a run with it for me?
> 
> 
> 
> Here it is at 1150/1350 130% Power +.75V:
> 
> Extreme HD preset:
> 
> FPS:
> 67.1
> Score:
> *2809
> *Min FPS:
> 31.1
> Max FPS:
> 128.9
Click to expand...

LOL. I almost had a heart attack.


----------



## Mercy4You

Quote:


> Originally Posted by *kizwan*
> 
> LOL. I almost had a heart attack.


LOL, I thought: let's drop a bomb on this thread









Valley is rather new to me; I hadn't seen the preset choices yet


----------



## Abyssic

Hey guys, could someone take a look at my log file to make sure everything is alright? I just want some other opinions. I also have to note that I've seen VRM temps over 90°C; is this dangerous?

Download the Log.txt here


----------



## kizwan

Quote:


> Originally Posted by *Abyssic*
> 
> hey guys, could someone take a look at my log file to make sure everything is alright? i just want some other opinions. and i also have to note that i've seen vrm temps over 90°C, is this dangerous?
> 
> Download the Log.txt here


What log is that? Can you upload it as attachment?


----------



## Heinz68

Battlefield 4 PC Patch coming out today, it includes some fix for Mantle.

http://mp1st.com/2014/02/13/incoming-battlefield-4-pc-patch-sneak-peak-battlelog-loadout-presets/


----------



## Mercy4You

Quote:


> Originally Posted by *Abyssic*
> 
> hey guys, could someone take a look at my log file to make sure everything is alright? i just want some other opinions. and i also have to note that i've seen vrm temps over 90°C, is this dangerous?
> 
> Download the Log.txt here


Why overclock your memory to 1600? I did not see any gain past 1400...


----------



## Mercy4You

Quote:


> Originally Posted by *kizwan*
> 
> What log is that? Can you upload it as attachment?


I downloaded it...

GPU-ZSensorLog.txt 177k .txt file


----------



## cam51037

Yeesh! Newegg Canada is charging $800 for the 290 Tri-X: http://www.newegg.ca/Product/Product.aspx?Item=N82E16814202080

I'm thinking maybe I should sell my 290 when I receive it back from RMA and purchase dual 780's instead...


----------



## Maracus

Quote:


> Originally Posted by *Spectre-*
> 
> i am pushing 150% power target and running full speed on fans and i still cant get above my old score


Then you need to add some voltage to it, keeping VRM and core temps in check.


----------



## kizwan

Quote:


> Originally Posted by *Abyssic*
> 
> hey guys, could someone take a look at my log file to make sure everything is alright? i just want some other opinions. and i also have to note that i've seen vrm temps over 90°C, is this dangerous?
> 
> Download the Log.txt here


The core clock is fluctuating. Based on the memory usage I'm guessing you recorded this while running Valley; is that correct? I recommend setting the fan at a fixed speed when gaming or benching. Your fan(s) only go up to 41%; try setting them to 60% and see whether that improves temps.
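For anyone who wants to sanity-check their own logs the same way, here's a minimal sketch of pulling the max VRM temps, max fan speed and core-clock spread out of a GPU-Z sensor log. The column names below are assumptions for illustration; real headers vary between GPU-Z versions, so adjust them to match your file.

```python
import csv
import io

# Hypothetical excerpt of a GPU-Z sensor log; real logs are similar
# comma-separated text, but the exact header names vary by version.
SAMPLE_LOG = """\
Date,GPU Core Clock [MHz],VRM Temperature #1 [C],VRM Temperature #2 [C],Fan Speed (%) [%]
2014-02-13 20:01:00,1000.0,78.0,74.0,35
2014-02-13 20:01:01,947.0,91.0,85.0,41
2014-02-13 20:01:02,1000.0,92.0,86.0,41
"""

def summarize(log_text):
    """Return (max VRM temp, max fan %, core-clock spread) from a log."""
    rows = list(csv.DictReader(io.StringIO(log_text)))
    vrm = [float(r["VRM Temperature #1 [C]"]) for r in rows]
    vrm += [float(r["VRM Temperature #2 [C]"]) for r in rows]
    fan = [int(r["Fan Speed (%) [%]"]) for r in rows]
    core = [float(r["GPU Core Clock [MHz]"]) for r in rows]
    return max(vrm), max(fan), max(core) - min(core)

max_vrm, max_fan, clock_spread = summarize(SAMPLE_LOG)
print(max_vrm, max_fan, clock_spread)  # 92.0 41 53.0
```

A large clock spread with the fan pinned well below 100% is the pattern kizwan describes: the card is throttling while thermal headroom is still available.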


----------



## Maracus

Quote:


> Originally Posted by *Abyssic*
> 
> hey guys, could someone take a look at my log file to make sure everything is alright? i just want some other opinions. and i also have to note that i've seen vrm temps over 90°C, is this dangerous?
> 
> Download the Log.txt here


To me that's a pretty high VRM temp. Even on the stock ref cooler I never saw it that high with an AB fan curve (the VRMs stayed cooler than the core). Just me, but I don't think that's healthy...


----------



## Kaapstad

Four 290Xs @1220/1625

4930k @4.8


----------



## Mercy4You

Quote:


> Originally Posted by *Maracus*
> 
> To me thats pretty high VRM temp, Even on stock ref cooler I never saw that high with AB fan curve( Vrms stayed cooler than the core). Just me but I don't think that's healthy...


I just did a Valley run, my VRM hit 92 C while GPU maxed out on 77 C...


----------



## Spectre-

Quote:


> Originally Posted by *Maracus*
> 
> Then you need to add some voltage to it, keeping VRM and core temps in check.


How do I add more voltage?

I use ASUS GPU Tweak and all I can do is set 150% power and max out the fan speeds.

All I see in ASUS GPU Tweak is Core, Memory, Power and Fan.


----------



## Mercy4You

Quote:


> Originally Posted by *Abyssic*
> 
> hey guys, could someone take a look at my log file to make sure everything is alright? i just want some other opinions. and i also have to note that i've seen vrm temps over 90°C, is this dangerous?
> 
> Download the Log.txt here


Here's my log, VRM 92 C

GPU-ZSensorLogSapphireTriX.txt 199k .txt file


----------



## SeanEboy

Quote:


> Originally Posted by *Abyssic*
> 
> xfx claims that they are using "gpu edging" for their black editions wich means they cherrypick gpus but when i bought my 7950 BE, i had to return it two times because it was faulty xD


Thanks for that info... Crap, now I wonder if I should go black, or not...? I guess I'll take a crack at it...


----------



## psyside

You guys use stock fans???


----------



## Pesmerrga

Unbelievable.. all 290x's at $899 (even a reference XFX)

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%20600473877%20600473871&IsNodeId=1

And they finally started to inflate the 270/270x prices..


----------



## SeanEboy

Anyone know anyplace I can buy xfx 290s with paypal, at around $490?


----------



## Maracus

Quote:


> Originally Posted by *Spectre-*
> 
> How do I add more voltage?
> 
> I use ASUS GPU Tweak and all I can do is set 150% power and max out the fan speeds.
> 
> All I see in ASUS GPU Tweak is Core, Memory, Power and Fan.


Play around with "core". I'd start with +50mV and increase from there; just keep temps below 90C. Use MSI Afterburner if you can't adjust core voltage with ASUS GPU Tweak, but you should be able to.


----------



## noles1983

Quote:


> Originally Posted by *Pesmerrga*
> 
> Unbelievable.. all 290x's at $899 (even a reference XFX)
> 
> http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%20600473877%20600473871&IsNodeId=1
> 
> And they finally started to inflate the 270/270x prices..


GD miners, wish that bubble would burst already


----------



## Sgt Bilko

Quote:


> Originally Posted by *Spectre-*
> 
> How do I add more voltage?
> 
> I use ASUS GPU Tweak and all I can do is set 150% power and max out the fan speeds.
> 
> All I see in ASUS GPU Tweak is Core, Memory, Power and Fan.


Is that the newest GPU Tweak?

you should be able to change the core voltage on GPU Tweak, if not you can try Afterburner or Trixx.


----------



## Mercy4You

Quote:


> Originally Posted by *Abyssic*
> 
> hey guys, could someone take a look at my log file to make sure everything is alright? i just want some other opinions. and i also have to note that i've seen vrm temps over 90°C, is this dangerous?


Ok guys, like Abyssic I'm a bit alarmed now by VRM temps over 90 C. Here's my logfile with 2 Valley runs in 1 log.

GPU-ZSensorLogSapphire2.txt 549k .txt file


First part 1150/1350/+.75v VRM temps maxed at 92 C

Second part 1125/1350/+.25v VRM temps maxed out at 85 C

Should this mean that my max overclock should stay at 1125/1350/+.25v?

How bad is VRM at 90 plus C?

Will it throttle back at a certain VRM temp or only at Core temp?


----------



## Chimera1970

Damn... and I bought the same exact card (Gigabyte R9 290X) not a month ago for $699.
Quote:


> Originally Posted by *Pesmerrga*
> 
> Unbelievable.. all 290x's at $899 (even a reference XFX)
> 
> http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%20600473877%20600473871&IsNodeId=1
> 
> And they finally started to inflate the 270/270x prices..


----------



## Falkentyne

Quote:


> Originally Posted by *MojoW*
> 
> To set +150mV offset:
> MSIAfterburner.exe /wi4,30,8d,18
> 
> To restore original voltage:
> MSIAfterburner.exe /wi4,30,8d,0
> 
> AB commands but this will not stick after restart.


Shouldn't this be wi6? wi4 was only for beta 17.
Afterburner beta 18 changed the access to the IC commands, so wi4 didn't work anymore.
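For reference, the last number in those commands is just the VID offset expressed in the voltage controller's step size. Assuming the 6.25 mV per step that is commonly reported for the IR3567B controller on reference 290/290X boards (an assumption, not something confirmed in this thread), a hypothetical helper to compute that hex value looks like this; whether the prefix is /wi4 or /wi6 depends on the Afterburner beta, as noted above.

```python
STEP_MV = 6.25  # assumed VID-offset step of the IR3567B voltage controller

def offset_register(offset_mv):
    """Convert a millivolt offset to the hex value Afterburner's
    /wi command takes as its last argument (e.g. /wi6,30,8d,18)."""
    return format(round(offset_mv / STEP_MV), "x")

print(offset_register(150))  # '18' -> MSIAfterburner.exe /wi6,30,8d,18
print(offset_register(0))    # '0'  -> restore stock voltage
```

So the `18` in the quoted command is simply 0x18 = 24 steps, i.e. 24 x 6.25 mV = 150 mV.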


----------



## Abyssic

Quote:


> Originally Posted by *Mercy4You*
> 
> Ok guys, like Abyssic I'm bit alarmed now by VRM temps over 90 C. Here my logfile with 2 runs Valley in 1 log.
> 
> GPU-ZSensorLogSapphire2.txt 549k .txt file
> 
> 
> First part 1150/1350/+.75v VRM temps maxed at 92 C
> 
> Second part 1125/1350/+.25v VRM temps maxed out at 85 C
> 
> Should this mean that my max overclock should stay at 1125/1350/+.25v?
> 
> How bad is VRM at 90 plus C?
> 
> Will it throttle back at a certain VRM temp or only at Core temp?


welcome onboard xD

do you also have a sapphire tri-x?


----------



## Mercy4You

Quote:


> Originally Posted by *Abyssic*
> 
> welcome onboard xD
> 
> do you also have a sapphire tri-x?


Thx!

Yep, Sapphire R9290XTriX...

http://www.techpowerup.com/gpuz/e5pee/


----------



## Mercy4You

Found this on the AMD forum regarding high VRM temps on the R9 290X:

"In addition I had a little chat with the support at Sapphire's "Select Club"... I asked about the temps and attached a screenshot of my VRM temps during load, and this is the reply I got:

2013-11-12 [10:07]
Yes, VRM temp normally over 100c and sometime goes to 120c. Yours just fine."

I'm curious what you guys have to say about that...


----------



## psyside

For all who ask about vrm temps.

1. What is your ambient temps?

2. What case?

3. What fan speed?

4. What clocks/volts.power limit?

5. How exactly do you test? How long, and with what benchmark?


----------



## jamaican voodoo

Quote:


> Originally Posted by *chronicfx*
> 
> Very good point. I have been wondering if my CPU is handling these cards OK. Three of them is a lot; my CPU usage was high running my 7970 quadfire on Win 7 before I put them to mining duty, and my i5 is struggling now. I haven't done much gaming tweaking and monitoring because I am mining on them a lot. As much as people say the i5 is more CPU than you will ever need, they don't run three or four cards. I expect to play Thief in a few weeks, so I will have a better idea of where my performance is with newer games then too.


That's why I rock an i7... the i5s are good with two cards, but once you go beyond that, the i7s are the way to go, period...


----------



## jamaican voodoo

I got to say, after a week of playing BF3, BF4 and GRID 2, trifire 290s are a beast. It's so smooth in those games and the scaling is amazing; I can't go back to the green team after this unless their scaling is just as good... but so far I'm happy


----------



## Mercy4You

Quote:


> Originally Posted by *psyside*
> 
> For all who ask about vrm temps.
> 
> 1. What is your ambient temps?
> 
> 2. What case?
> 
> 3. What fan speed?
> 
> 4. What clocks/volts.power limit?
> 
> 5. How exactly do you test, how long what benchmark?


1: 22 C

2: Lian Li A75WX

3: Up to 45% Tri X fans

4: 1150/1350/130% Power/+.75v

5: Valley extreme HD preset, 1 'warm' run (50 C) and VRM temp at 92 max.

I noticed Valley is heavy on the VRMs; when I run 3DMark 11 the VRM temp stays well below 80 C, even over multiple runs...


----------



## Abyssic

Quote:


> Originally Posted by *Mercy4You*
> 
> 1: 22 C
> 
> 2: Lian Li A75WX
> 
> 3: Up to 45% Tri X fans
> 
> 4: 1150/1350/130% Power/+.75v
> 
> 5: Valley extreme HD preset, 1 'warm' run (50 C) and VRM temp at 92 max.
> 
> I noticed Valley is heavy on VRM, when I run 3D Mark 11 the VRM temp stays well below 80 C, even multiple runs...


1: 18°C

2: Corsair Carbide Air 540 with 6 fans

3: also up to 45% tri-x cooler

4: 1140/1600/+50% Power/+100mV

5: Crysis 3 various run times up to 2h


----------



## BakerMan1971

Quote:


> Originally Posted by *jamaican voodoo*
> 
> I got to say, after a week of playing BF3, BF4 and GRID 2, trifire 290s are a beast. It's so smooth in those games and the scaling is amazing; I can't go back to the green team after this unless their scaling is just as good... but so far I'm happy


Just go with whichever team is right when you upgrade, there is no need to take sides








My GTX 570 served me well for around 3.5 years; now it's the turn of my 290.


----------



## Arizonian

Quote:


> Originally Posted by *Mercy4You*
> 
> Thx!
> 
> Yep, Sapphire R9290XTriX...
> 
> http://www.techpowerup.com/gpuz/e5pee/


Welcome aboard - congrats - added


----------



## Mercy4You

Quote:


> Originally Posted by *Arizonian*
> 
> Welcome aboard - congrats - added


Thx


----------



## mojobear

Quote:


> Originally Posted by *chronicfx*
> 
> I wished they would all be Hynix. But I am not sad, I am buying my awesome overclock by having three cards
> 
> 
> 
> 
> 
> 
> 
> Well, really I thought it would eliminate the two-card stutter through trifire. We will see; so far Crysis 3 and BF4 are butter smooth, but Far Cry 3, Skyrim and The Witcher 2 are a stutter fest (which I was hoping to replay before 3 comes out, with uber enabled this time). Far Cry 3 was smooth as a baby's bottom with my 3 7970s, which is what I don't get... Maybe I have to go back to Windows 7, because I updated to 8.1 with the card install? I did the mouse lag registry fix...


Oh dude, don't judge performance with Far Cry 3... that game is a stuttering mess for me too with trifire on W7


----------



## chronicfx

Quote:


> Originally Posted by *mojobear*
> 
> Oh dude, don't judge performance with Far Cry 3... that game is a stuttering mess for me too with trifire


The only reason I judge is because I had no stutter, and I mean NO stutter, with three 7970s. I played all the way up to where I just switched islands to go after Hoyt, then I bought the 290Xs, but the 290Xs have made the game unplayable and I haven't touched it for a month now.







It was a really great game too.


----------



## mojobear

Quote:


> Originally Posted by *chronicfx*
> 
> The only reason I judge is because I had no stutter, and I mean NO stutter, with three 7970s. I played all the way up to where I just switched islands to go after Hoyt, then I bought the 290Xs, but the 290Xs have made the game unplayable and I haven't touched it for a month now.
> 
> 
> 
> 
> 
> 
> 
> It was a really great game too.


I know... I've been wanting to play that game for months. Initially I had quad GTX 590s; they sucked hard with Surround so I didn't bother. Now the R9s are stuttering as well. Most every other game is awesome, so I can't really complain.

What's interesting is that Blood Dragon uses the same engine but runs butter smooth... I don't get it. When I get the urge to play Far Cry 3 I use my laptop with a GTX 765M.









If your 7970s ran smooth, then there is hope, I guess. More than Far Cry 3, I wish they would fix crossfire for AC4 Black Flag.


----------



## szeged

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709&IsNodeId=1&Description=r9%20290x&name=Desktop%20Graphics%20Cards&Order=PRICED&Pagesize=20&isdeptsrh=1

5 290x's have hit $899.99

up and up and up we go.


----------



## mojobear

Quote:


> Originally Posted by *szeged*
> 
> http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709&IsNodeId=1&Description=r9%20290x&name=Desktop%20Graphics%20Cards&Order=PRICED&Pagesize=20&isdeptsrh=1
> 
> 5 290x's have hit $899.99
> 
> up and up and up we go.


That's so weird... Newegg Canada still has the DD XFX 290X at 807.99... and our dollar is what, 0.91 USD?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *chronicfx*
> 
> Very good point. I have been wondering if my CPU is handling these cards OK. Three of them is a lot; my CPU usage was high running my 7970 quadfire on Win 7 before I put them to mining duty, and my i5 is struggling now. I haven't done much gaming tweaking and monitoring because I am mining on them a lot. As much as people say the i5 is more CPU than you will ever need, they don't run three or four cards. I expect to play Thief in a few weeks, so I will have a better idea of where my performance is with newer games then too.


I have been saying this for a few years now. Ever since I started running more than two GPUs at a time, I quickly realized the i5 had met its match. With 3 x 7970s I went from a 2500k to a 2600k and picked up a very noticeable amount of fps (especially minimums). Now with a 4820k OC'd and 3 x 290s, games run butter smooth. The board you use can play a role as well, which is why I went X79.

So it's not just in your head; it's just that most people defending the i5 are on a single GPU.


----------



## Ukkooh

Quote:


> Originally Posted by *szeged*
> 
> http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709&IsNodeId=1&Description=r9%20290x&name=Desktop%20Graphics%20Cards&Order=PRICED&Pagesize=20&isdeptsrh=1
> 
> 5 290x's have hit $899.99
> 
> up and up and up we go.


I never thought it would end up being more expensive than in Finland.


----------



## Mr357

Quote:


> Originally Posted by *szeged*
> 
> http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709&IsNodeId=1&Description=r9%20290x&name=Desktop%20Graphics%20Cards&Order=PRICED&Pagesize=20&isdeptsrh=1
> 
> 5 290x's have hit $899.99
> 
> up and up and up we go.


For that much I got my 290X and EK block on launch night, a 240 res, some new tubing, fittings, a fan controller, and 5 AP-15 GT's.


----------



## Flisker_new

Hey guys, just got my hands on a reference Gigabyte R9 290 and I'm not impressed







The card has serious coil whine, not sure if that's normal or not, and when I start up the jet (fan at 90%) and try some OC it works until around 1140, then it throttles somehow even though the power limit is at +50% (sounds like BS to me...). GPU temp is 69 and both VRMs are under 60.

Could anyone please tell me if all of this is normal?

Thanks in advance









Edit: this is what happens when I start BF4: it sits at 1175 core for a minute and then it starts throttling...


----------



## taem

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I have been saying this for a few years now. Ever since I started running more than two GPUs at a time, I quickly realized the i5 had met its match. With 3 x 7970s I went from a 2500k to a 2600k and picked up a very noticeable amount of fps (especially minimums). Now with a 4820k OC'd and 3 x 290s, games run butter smooth. The board you use can play a role as well, which is why I went X79.
> 
> So it's not just in your head; it's just that most people defending the i5 are on a single GPU.


So 290 crossfire (TWO cards, not three) is going to bottleneck a 4670k @ 4.6? Well that sucks. By how much?


----------



## kizwan

Quote:


> Originally Posted by *taem*
> 
> So 290 crossfire (TWO cards, not three) is going to bottleneck a 4670k @ 4.6? Well that sucks. By how much?


A 4670k should be able to handle two GPUs just fine.

I'm interested to see 4670k CPU usage in BF4 with two 290/290X.


----------



## phallacy

Quote:


> Originally Posted by *Flisker_new*
> 
> Hey guys, just got my hands on a reference Gigabyte R9 290 and I'm not impressed
> 
> 
> 
> 
> 
> 
> 
> The card has serious coil whine, not sure if that's normal or not, and when I start up the jet (fan at 90%) and try some OC it works until around 1140, then it throttles somehow even though the power limit is at +50% (sounds like BS to me...). GPU temp is 69 and both VRMs are under 60.
> 
> Could anyone please tell me if all of this is normal ?
> 
> Thanks in advance
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: this is what happens when I start BF4: it sits at 1175 core for a minute and then it starts throttling...


I ran into this same problem. But first, which program are you using to OC? I use Trixx for the OC and AB to monitor. So usually when I run a game I first open Afterburner, THEN I open Trixx and set the values through the presets I saved. I found that if AB had control over the cards they would go UP to my desired OC clocks, whereas with Trixx they would be static at the values I set and only downclock if I ever reached 95 C. Because my cards are WC'd now, this never happens. A little more info about your card settings would help though.


----------



## sugarhell

https://www.skroutz.gr/c/55/kartes-grafikwn-vga.html?from=catspan&keyphrase=290x

Meanwhile in Greece.


----------



## Flisker_new

Quote:


> Originally Posted by *phallacy*
> 
> I ran into this same problem. But first, which program are you using to OC? I use trixx for the OC and AB to monitor. So usually when I run a game I first open Afterburner THEN I open trixx and set the values through the presets I saved. I found that if AB had control over the cards they would go UP to my desired OC clocks where as with trixx they would be static at the values I set and only downclock if I ever reached 95. Because my cards are WCd now, this never happens. A little more info about your card settings would help though.


I tried both, and it's throttling with AB the same as with Trixx; that picture was with AB just as a monitor, as you say... then I launched Trixx and used it to set frequency, voltage etc.

So you're saying that when you got water blocks everything started working fine? My guess is that the throttling is somehow driven by memory temp, because I don't know those temps and it always runs fine for a few minutes. If everything else is fine then it's either that, or the 50% power limit is BS.


----------



## phallacy

Quote:


> Originally Posted by *Flisker_new*
> 
> I tried both, and it's throttling with AB the same as with Trixx; that picture was with AB just as a monitor, as you say... then I launched Trixx and used it to set frequency, voltage etc.
> 
> So you're saying that when you got water blocks everything started working fine? My guess is that the throttling is somehow driven by memory temp, because I don't know those temps and it always runs fine for a few minutes. If everything else is fine then it's either that, or the 50% power limit is BS.


Try this: restart your computer, then before you open your game, open Afterburner and give it a sec to show up. Make sure your OSD is set so that you can see your clock speeds, mem speeds, temp, voltage and GPU usage for your card while playing. Then open Trixx, adjust your levels, and if you have a fan curve, try changing it to a fixed fan speed of 80% or 90%, whatever you prefer. Once Trixx has applied the new settings, open your game of choice. I never needed to go past 80% using 1100/1450 while on air. It was loud as hell but it kept the cards under the thermal limit. See if that does anything for you.

In regards to my waterblocks, they just keep the VRMs and GPU core MUCH cooler, allowing more OC room. The problem was fixed before I even put watercooling on them; it's just that it only happened when Afterburner had control of the card settings, which only happened when I opened AB after Trixx and it overrode the Trixx settings, I guess. The power limit just ensures your card can draw the power when needed; it should work fine unless it's not being applied, but I doubt that. If you can say what your full OC settings are, that would help.


----------



## Paul17041993

+50% power limit is the max because 375W is the maximum the (reference) power delivery allows, which is even 75W above what the card is designed for, so you can run the risk of damaging your mobo if it's low-end enough (generally why most boards have the extra power connectors these days).

I just did another push run, this time more softly, and found I needed to set the power to +50% in CCC for it to work properly (not sure why Trixx doesn't have it). 1200/1600 with +150mV seemed to run nicely, but I noticed a streak of corrupt pixels appearing on the screen. Pushed to +200mV and it seemed stable, but it was peaking at about 350W in and the VRMs started going high, so I pushed the reset button. Oddly, the drivers crashed at that point, but at least it wasn't a full crash or lock...
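Paul's 375W figure lines up with the reference board's connector budget. A rough sketch, assuming the reference slot + 6-pin + 8-pin layout and a ~250W default board-power target (both are assumptions on my part, not figures confirmed in this thread):

```python
# Rated delivery of the assumed reference connector layout, in watts,
# per the usual PCIe figures: 75 W slot, 75 W 6-pin, 150 W 8-pin.
RATED = {"PCIe slot": 75, "6-pin": 75, "8-pin": 150}

rated_total = sum(RATED.values())  # what the connectors are specced for
at_plus_50 = 250 * 1.50            # board power at the +50% limit

print(rated_total)               # 300
print(at_plus_50)                # 375.0
print(at_plus_50 - rated_total)  # 75.0 W over spec, the overdraw Paul warns about
```

The extra 75W has to come from somewhere; on boards without reinforced slot power delivery, that is the risk Paul describes.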


----------



## Iniura

Quote:


> Originally Posted by *SeanEboy*
> 
> Thanks for that.. But still, what the hell is the difference? They're only $10 more for the black.. What can that even be?


The bracket is black instead of a shiny silver color (I mean the bracket which holds the DVI-D, HDMI and DisplayPort), so it also looks a little bit darker from the side view when it's installed in your case. I think it's a nice little touch to the cards, but nothing worth buying this one over the normal one for.


----------



## Duvar

Here's my Firestrike score with an R9 290 Tri-X: http://www.3dmark.com/3dm/2449196 (+200 mV), ASIC 81.5

My PSU is only a 480W BeQuiet E9, so I had to lower my CPU clocks from 4.8 to 4.4


----------



## VSG

rofl I just saw this on OcUK and told you that run was amazing


----------



## Brian18741

Add me to the club please!











Quick question though: I'm getting different readings for the 2 cards in GPU-Z.

2816 shaders on one card, 2560 on the other.
176 GTexel/s vs 160.
Memory: GDDR5 (autodetect) vs (Elpida).
Bus width: 32-bit vs 512-bit!
Bandwidth: 20.2 GB/s vs 322.6 GB/s.

Can anyone explain this to me?! Is GPU-Z just bugged, or is there something up with one of my cards? Running the latest beta drivers.


----------



## Ukkooh

One of them is running PCI-E at x1, which is perfectly normal at idle. It's part of the power-saving features and it will ramp back up under load.

Edit: NVM, it normally doesn't cause issues like that. Have you tried unplugging the other one and checking with just one card in your PC? Any performance issues?
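On the bandwidth mismatch itself: GPU-Z derives that figure from bus width and memory clock, so both numbers are consistent with one card's bus width simply being misread. A quick check, assuming GDDR5's 4 transfers per pin per clock and the ~1260 MHz memory clock that both reported figures imply (the clock is inferred, not stated in the post):

```python
def gddr5_bandwidth_gbs(bus_width_bits, mem_clock_mhz):
    """Theoretical GDDR5 bandwidth: bus bytes * 4 transfers per clock."""
    return bus_width_bits / 8 * mem_clock_mhz * 4 / 1000

# Correctly detected card (512-bit bus):
print(round(gddr5_bandwidth_gbs(512, 1260), 1))  # 322.6
# Misread card: same memory clock, bogus 32-bit bus width:
print(round(gddr5_bandwidth_gbs(32, 1260), 1))   # 20.2
```

Both of GPU-Z's numbers fall out of the same formula, which points at a detection glitch rather than a genuinely crippled memory bus.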


----------



## Davschall

Mine just arrived!! Lol, the UPS man managed to tip over an indoor tree and spilled dirt everywhere.. Pics soon!


----------



## chronicfx

Quote:


> Originally Posted by *Brian18741*
> 
> Add me to the club please!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quick question though, I'm getting different readings for the 2 cards on GPUz.
> 
> 2816 shaders on one card, 2560 on the other.
> 176GTexel/s vs 160.
> Memory GDDR5 (autodetect) vs (elpida)
> Bus Width 32bit vs 512bit!
> Bandwidth 20.2 GB/s vs 322.6GB/s
> 
> Can anyone explain this to me?! Is it just GPUz is bugged or is there something up with one of my cards? Running latest beta drivers.


Something's screwy. The PCI-E says 3.0 x1 on one of the cards.


----------



## chronicfx

Quote:


> Originally Posted by *jamaican voodoo*
> 
> i got to say after week of playing bf3, bf4 and grid 2 i can say trifire 290s are beast so smooth in those games the scaling is amazing i can't go back to the green team after this unless they scaling is just as good...but so far im happy


What drivers are you using?


----------



## devilhead

Quote:


> Originally Posted by *sugarhell*
> 
> https://www.skroutz.gr/c/55/kartes-grafikwn-vga.html?from=catspan&keyphrase=290x
> 
> Meanwhile on greece.


Same in Norway: 290 -> 370 €, 290X -> 475 €


----------



## devilhead

Quote:


> Originally Posted by *Kaapstad*
> 
> Four 290Xs @1220/1625
> 
> 4930k @4.8


Try Valley at the same clocks, Extreme HD preset.


----------



## rdr09

Quote:


> Originally Posted by *Ukkooh*
> 
> One of the is running pci-e at x1 which is perfectly normal in idle. It is a part of the power saving things and it will ramp up in load.
> 
> Edit: NVM it normally doesn't cause issues like that. Have you tried unplugging the other one and just checking with one card in your pc? Any performance issues?


If those are 290s, then one has been unlocked to an X.

Congrats on your OC.


----------



## Kaapstad

Quote:


> Originally Posted by *devilhead*
> 
> try valley at same clocks ExtremeHD preset


No point. Valley is a horrible CPU bottleneck, so there is no point using any more than two cards.


----------



## Mercy4You

Quote:


> Originally Posted by *Abyssic*
> 
> hey guys, could someone take a look at my log file to make sure everything is alright? i just want some other opinions. and i also have to note that i've seen vrm temps over 90°C, is this dangerous?


Did some reading tonight. The VRMs can handle temps up to 120-125 C. To be safe it's best to keep them below 90-95 C...

Your logfile (and mine) shows normal temps for overclocking the Tri-X; the high VRM readings come with that. To push my card from 1125 core to 1150 I have to raise the voltage to +75 mV, which gives slightly better performance but puts the VRMs above 90 C.

So, I decided to lower my clocks to 1125, which only needs +25 mV. The VRMs now stay below 85-90 C...

You could gain a lot by lowering your memory clock. I doubt 1600 performs much better than, say, 1400.
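Condensing those rules of thumb into one place, a trivial classifier might look like this. The thresholds are taken straight from the figures above; actual ratings vary by VRM controller, so treat these bands as ballpark guidance, not a spec.

```python
def vrm_status(temp_c):
    """Classify a VRM temperature using the thread's rules of thumb:
    rated to roughly 120-125 C, best kept under about 90-95 C."""
    if temp_c < 90:
        return "fine"
    if temp_c <= 120:
        return "high: lower volts/clocks or raise fan speed"
    return "out of spec"

print(vrm_status(85))   # fine
print(vrm_status(92))   # high: lower volts/clocks or raise fan speed
print(vrm_status(125))  # out of spec
```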


----------



## Duvar

Quote:


> Originally Posted by *geggeg*
> 
> rofl I just saw this on OcUK and told you that run was amazing


Oh ok









 Bench Clocks

Here with ASIC  Gaming Clocks


----------



## AmcieK

Hello guys, I bought an XFX R9 290 DD today and have a question about temps. At idle I have 40-44 C; is that OK or is something wrong? After 3 hours in BF4 it's about 75 C. I am worried about the idle temp.


----------



## MojoW

Quote:


> Originally Posted by *Duvar*
> 
> Oh ok
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bench Clocks
> 
> Here with ASIC  Gaming Clocks


Just the BIOS I need! Can you upload it, pretty please? That's the newest BIOS out; it isn't even in TechPowerUp's BIOS collection.


----------



## devilhead

Quote:


> Originally Posted by *Kaapstad*
> 
> No point, Valley is a horrible CPU bottleneck and there is no point using anymore than two cards.


Heh, I have tried with 4x 290s, but got just 152.5 at 1100/1500. I don't remember whether the CPU was at 4.8GHz or 5.2GHz; it wasn't worth pushing those cards further.

And here's 5.2GHz with 2x 290Xs.


----------



## Duvar

Quote:


> Originally Posted by *MojoW*
> 
> Just the bios i need can you upload that pretty please? that's the newest bios out, it isn't even on techpowerup's bios collections.


Quote:


> Originally Posted by *geggeg*
> 
> rofl I just saw this on OcUK and told you that run was amazing


Hawaii.rom


----------



## MojoW

Quote:


> Originally Posted by *AmcieK*
> 
> Hello guys i buy today xfx r9 290 DD and have question about temp , in idle i have 40 - 44 C Its OK ? or something wrong . After 3 hours in bf 4 is about 75 C . I am worried about the temp in idle


No problem, it's supposed to ramp down at idle.
Mine is at 43/44 C idle with an ambient of 23.5 C.


----------



## MojoW

Quote:


> Originally Posted by *Duvar*
> 
> Hawaii.rom


Thank you very much.


----------



## jamaican voodoo

Quote:


> Originally Posted by *chronicfx*
> 
> What drivers u using?


I'm running the 13.12 drivers... the 14.1 betas are terrible for CrossFire users, nothing but problems.


----------



## chronicfx

Quote:


> Originally Posted by *jamaican voodoo*
> 
> i'm running 13.12 drivers...14.1 beta are terrible for crossfire users nothing but problems


I will give those a try. 14.1 roughly halved my kHash mining rate for some reason. Mantle is nice, but I have enough power to go without it, so those are a no. Need to find a compromise.


----------



## GenoOCAU

Quote:


> Originally Posted by *Duvar*
> 
> Hawaii.rom


First BIOS I've tried that resulted in a black screen on restart.


----------



## Paul17041993

Quote:


> Originally Posted by *Brian18741*
> 
> Add me to the club please!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quick question though, I'm getting different readings for the 2 cards on GPUz.
> 
> 2816 shanders on one card, 2560 on the other.
> 176GTexel/s vs 160.
> Memory GDDR5 (autodetect) vs (elpida)
> Bus Width 32bit vs 512bit!
> Bandwidth 20.2 GB/s vs 322.6GB/s
> 
> Can anyone explain this to me?! Is it just GPUz is bugged or is there something up with one of my cards? Running latest beta drivers.


Both are 290s, but one's got a 290X BIOS, or was boxed wrong...? But at the same time, the 290X DCII runs at 1050/1350, not 1000/1280, so that's confusing; try benching them separately and see if one is faster than the other.

Also, don't worry about the PCIe stuff. That's just link state power management: the links power down to x1 @ 1.1 instead of x8/x16 @ 3.0. You can disable that via Windows' power profiles, unless of course you plugged the [second] card into the wrong slot...


----------



## Heinz68

Quote:


> Originally Posted by *Brian18741*
> 
> Add me to the club please!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quick question though, I'm getting different readings for the 2 cards on GPUz.
> 
> 2816 shanders on one card, 2560 on the other.
> 176GTexel/s vs 160.
> Memory GDDR5 (autodetect) vs (elpida)
> Bus Width 32bit vs 512bit!
> Bandwidth 20.2 GB/s vs 322.6GB/s
> 
> Can anyone explain this to me?! Is it just GPUz is bugged or is there something up with one of my cards? Running latest beta drivers.


Try opening only one GPU-Z and use the box at the bottom to switch from one card to the second card.


----------



## kizwan

Quote:


> Originally Posted by *Brian18741*
> 
> Add me to the club please!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Quick question though, I'm getting different readings for the 2 cards on GPUz.
> 
> *2816 shanders on one card, 2560 on the other.
> 176GTexel/s vs 160.
> Memory GDDR5 (autodetect) vs (elpida)
> Bus Width 32bit vs 512bit!
> Bandwidth 20.2 GB/s vs 322.6GB/s*
> 
> Can anyone explain this to me?! Is it just GPUz is bugged or is there something up with one of my cards? Running latest beta drivers.
> 
> 
> Spoiler: Warning: Spoiler!


Your second GPU is in ULPS (power-saving) mode. This is why the _"Shaders"_, _"Texture Fillrate"_, _"Bus Width"_ & _"Bandwidth"_ fields report wrong values (e.g. 32-bit & 20.2 GB/s) on the second card. Disable ULPS using MSI Afterburner 3 beta 18.
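For anyone curious what "disabling ULPS" actually does: tools like Afterburner set the `EnableUlps` DWORD to 0 under each display adapter's subkey of `HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}`. A minimal sketch of just the toggle logic, with the registry modeled as a plain dict; on a real Windows box you'd walk the keys with `winreg`:

```python
# Toggle logic only: force EnableUlps = 0 on every adapter that has it.
# Subkey names mirror the real display-class subkeys (0000, 0001, ...)
# but the data here is illustrative, not read from a real registry.
def disable_ulps(adapters):
    """Return a patched copy of per-adapter settings with ULPS forced off."""
    patched = {}
    for subkey, values in adapters.items():
        values = dict(values)          # don't mutate the caller's data
        if "EnableUlps" in values:
            values["EnableUlps"] = 0   # 1 = ULPS on, 0 = ULPS off
        patched[subkey] = values
    return patched

adapters = {"0000": {"EnableUlps": 1}, "0001": {"EnableUlps": 1}}
print(disable_ulps(adapters))
```

A reboot (or driver restart) is needed after the real registry change before the second card stops dropping into ULPS.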


----------



## GenoOCAU

Is it possible to boot on my working VBIOS and then flick the switch while in atiflash, i.e. flick the switch to the broken VBIOS while the cards are on?

That is: boot into atiflash with the BIOS switch on position 1 on both cards, move the switch to the bad BIOS while powered on, then reflash the last working BIOS?


----------



## MojoW

Quote:


> Originally Posted by *GenoOCAU*
> 
> Is it possible to boot on my working vbios then flick the switch while in atiflash? Ie flick the switch to the broken vbios while the cards are on ?
> 
> Ie boot into atiflash with bios switch 1 on both cards, move switch to bad bios while on then reflash to last working bios?


Yes you can.


----------



## Arizonian

Quote:


> Originally Posted by *Brian18741*
> 
> Add me to the club please!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Quick question though, I'm getting different readings for the 2 cards on GPUz.
> 
> 2816 shanders on one card, 2560 on the other.
> 176GTexel/s vs 160.
> Memory GDDR5 (autodetect) vs (elpida)
> Bus Width 32bit vs 512bit!
> Bandwidth 20.2 GB/s vs 322.6GB/s
> 
> Can anyone explain this to me?! Is it just GPUz is bugged or is there something up with one of my cards? Running latest beta drivers.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Also, at #277 you're the first DCUII on the roster.


----------



## Abyssic

Quote:


> Originally Posted by *Mercy4You*
> 
> Did some reading tonight. VRM's can handle temps up to 120-125 C. To be safe it's best to keep it below 90-95 C...
> 
> Your logfile (and mine) gives normal temps for overclocking the Tri-X, but high VRM is due to that. To help my card from 1125 Core to 1150, I have to raise the voltage to +75 mV, which gives me a slightly better performance, but it will result in VRM above 90 C.
> 
> So, I decided to lower my clocks to 1125 which only needs +25 mV. VRM stays now below 85-90 C...
> 
> You could gain a lot by lowering your memory clock. I doubt if 1600 is performing much better than say 1400


Thanks for the information. Yeah, I'll also see if I can decrease those temps a bit. I noticed that I can run 1120 at +20 mV, but I need +100 mV for 1140 xD


----------



## Abyssic

Quote:


> Originally Posted by *chronicfx*
> 
> Somethings screwy. The pcie says 3.0 x1 on one of the cards


That's normal. The power-saving technology even decreases the number of lanes used at idle.


----------



## SeanEboy

Quote:


> Originally Posted by *Iniura*
> 
> The bracket is black silver instead of a shiny silver color, I mean the bracket which hold the DVI-D, HDMI and Displayport, so it looks a little bit darker also from the side view when it's installed in your case.I think It's a nice little touch to the cards, but nothing worth buying this one over the normal one for.


Annd.. so lame... Hahah, thanks for the info...


----------



## Paul17041993

Quote:


> Originally Posted by *kizwan*
> 
> Second GPU is in ULPS mode (power saving mode). This is why the _"Shaders"_, _Texture Fillrate"_, _"Bus Width"_ & _"Bandwidth"_ reported wrong value (e.g. 32bit & 20.2 GB/s) on the second card. Disabled ULPS using MSI Afterburner 3 beta 18.


The PCIe, bus and bandwidth readings are power saving; the rest isn't. You can't enable or disable shaders on-the-fly.


----------



## Heinz68

Quote:


> Originally Posted by *kizwan*
> 
> Second GPU is in ULPS mode (power saving mode). This is why the _"Shaders"_, _Texture Fillrate"_, _"Bus Width"_ & _"Bandwidth"_ reported wrong value (e.g. 32bit & 20.2 GB/s) on the second card. Disabled ULPS using MSI Afterburner 3 beta 18.


He has R9 290 cards; it can't show 2816 shaders on one card unless he got lucky flashing it to the X version, and I'm sure if he had, he wouldn't wonder why.
The only other way is if somebody put an R9 290X card in the wrong box.

EDIT
In that case the card with the 2816 shaders would also show a 1050 MHz GPU clock and 1350 MHz memory, but it doesn't; it shows the R9 290 clocks for both cards.


----------



## maynard14

I contacted an XFX rep to ask whether removing the warranty stickers from my stock reference XFX R9 290 would void the warranty.

Here is the reply:

Your question has been discussed by our management and the conclusion is not allowed.

It means it will be no warranty if you use aftermarket cooler.

Therefore, if you want to buy the card, please think twice.

Guess I'll wait two years for the warranty to expire, huhuhuhu, to be able to replace the stock cooler.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *taem*
> 
> So 290 crossfire (TWO cards, not three) is going to bottleneck a 4670k @ 4.6? Well that sucks. By how much?


With a healthy overclock I think a 4670k will do fine with two 290's. The gain from single to xfire will be a healthy jump, but with a 4770k you would get an even greater jump possibly. I made the switch when I went tri-fire 6970's years ago.









Edit: Back then I went from 2500k @ 4.9ghz to 2600k @ 4.7ghz and still got more performance with the multi-gpu setup. Especially when BF3 came out.


----------



## Iniura

Quote:


> Originally Posted by *maynard14*
> 
> tried to contact xfx rep for if i remove the warranty stickers for my stock reference xfx r9 290 will it remove the warranty
> 
> and here is the reply
> 
> Your question has been discussed by our management and the conclusion is not allowed.
> 
> It means it will be no warranty if you use aftermarket cooler.
> 
> Therefore, if you want to buy the card, please think twice.
> 
> guess ill wait fot 2 years for the warranty to expire huhuhuhu to be able to replace the stock cooler


Where did you find this XFX rep or how did you contact him?

I made a ticket on the XFX support website on February 8th and I still haven't received any reply. I asked if they could provide me with a UEFI GOP BIOS for the R9 290, because I want to use ultra fast boot.


----------



## Forceman

Quote:


> Originally Posted by *maynard14*
> 
> tried to contact xfx rep for if i remove the warranty stickers for my stock reference xfx r9 290 will it remove the warranty
> 
> and here is the reply
> 
> Your question has been discussed by our management and the conclusion is not allowed.
> 
> It means it will be no warranty if you use aftermarket cooler.
> 
> Therefore, if you want to buy the card, please think twice.
> 
> guess ill wait fot 2 years for the warranty to expire huhuhuhu to be able to replace the stock cooler


Was this in North America? As far as I know, that's the only place they allow removal of the cooler.


----------



## axiumone

Quote:


> Originally Posted by *maynard14*
> 
> tried to contact xfx rep for if i remove the warranty stickers for my stock reference xfx r9 290 will it remove the warranty
> 
> and here is the reply
> 
> Your question has been discussed by our management and the conclusion is not allowed.
> 
> It means it will be no warranty if you use aftermarket cooler.
> 
> Therefore, if you want to buy the card, please think twice.
> 
> guess ill wait fot 2 years for the warranty to expire huhuhuhu to be able to replace the stock cooler


That's very strange... I thought xfx was one of the few manufacturers that allowed installation of waterblocks and aftermarket coolers.

I would call them, because this quote is straight from the XFX WARRANTY webpage.
Quote:


> ** XFX has carefully selected the optimal thermal or fansink component for your graphics card model. We do not encourage the removal of components due to damage that may result in the process. XFX understands that some enthusiasts may choose to replace the original component with their own cooling solution. To support the gaming community, we recommend that you contact XFX prior to any modifications so that we can update your profile and product registration to avoid potential issues with warranty support. In addition, XFX support will be able to walk through the installation with you or provide feedback and pointers on available options for your specific product. You may even consider shipping your components to XFX and allow the technicians at XFX to perform the modification for you (shipping charges to XFX apply).


----------



## Iniura

Quote:


> Originally Posted by *Forceman*
> 
> Was this in North America? As far as I know, that's the only place they allow removal of the cooler.



Hey Forceman, your question doesn't make sense, hehe. They said it would void his warranty if he removed the cooler. I know it's very late (3:19 AM here), so maybe I'm making a mistake, but until now I think the mistake is on you, haha.


----------



## Forceman

Quote:


> Originally Posted by *Iniura*
> 
> Hey Forceman your question doesn't make sense hehe, they said it would void his warranty if he would remove the cooler. I know it's very late it's 3:19 AM here so maybe I am making a mistake but untill now I think the mistake is on you haha


If he is in Europe and asked the rep, they would correctly tell him that he can not remove it, because it is only allowed in NA. If he is in NA and they told him that, then that represents a change in policy, or a misinformed rep.


----------



## chronicfx

OK OK.. Forget everything I said about stuttering, for now... I wiped my drivers correctly using DDU and put on 13.12.. Crysis 3, Far Cry 3 (100+ FPS), BF3, BF4, and The Witcher 2 (Maxed with uber enabled, getting 70-90 FPS @ 1440p) ALL ACCEPTABLY SMOOTH









Only lame thing that has sprung up since I started messing with the drivers is that now my headphones crackle when vsync is on. I use a soundblaster ZX with aurvana live headphones. I googled the problem for a sec and it seems pretty damn common. Is there anyway to fix it?

Super excited I got these synced up and working; I was looking at Ivy-E mobo/proc combos. Saved myself $700 by installing the drivers right (not sure if I'm happy about not buying a new processor, though).


----------



## Iniura

Quote:


> Originally Posted by *Forceman*
> 
> If he is in Europe and asked the rep, they would correctly tell him that he can not remove it, because it is only allowed in NA. If he is in NA and they told him that, then that represents a change in policy, or a misinformed rep.


Oh yeah, I know what you meant now. Reading their warranty statement again, it could also be that depending on which department of XFX you contact, they would deny the warranty if you remove the cooler, but if you contact another department first, they would mark your account and pass it along to the warranty department so that some specific users are allowed to remove it.

Dunno, it was just a thought that came up in my head.

I also think it's probably hit and miss with these companies, depending on how well informed their reps are; I get the idea that it just depends on which rep you contact.


----------



## maynard14

Quote:


> Originally Posted by *Iniura*
> 
> Where did you find this XFX rep or how did you contact him?
> 
> I made a ticket on the XFX support website on february the 8th and I still haven't received a reply nothing, I asked if they could provide me with a UEFI GOP bios for the R9 290, because I want to use ultra fast boot.


I see. Just create another ticket, sir, if you want.

Here is the site, sir:

http://www.xfx.com/. Just log in and create a ticket with your question, sir. Hope they will reply ASAP.


----------



## Iniura

Quote:


> Originally Posted by *maynard14*
> 
> I see, just create another ticket sir if you want,
> 
> here is the site sir.
> 
> http://www.xfx.com/, then just log in and creae a ticket for your question sir,. hope they will reply asap


Yeah, I already have my ticket pending there since the 8th. I guess I'll wait a little longer. Thanks for letting me know.


----------



## maynard14

Oops, I missed the updated replies, sir. Sorry for the late reply.

I created a ticket from http://www.xfx.com

for the US region, but I live here in the Philippines,

so when I created the ticket with my question about removing the stock cooler, I set my region to Asia.

Maybe the Asia region is not covered by XFX's stock-cooler-removal warranty...

Thanks guys. Maybe I'll just forget the warranty; I can't bear the stock cooler and the temps. I can't even use the 290X unlock flash because my temps are too high: 77 C at 75% fan speed.


----------



## Iniura

Just replace it, and when you need your warranty, put the stock cooler back. Make sure you've flashed the default BIOSes on both switch positions, and I don't think there would be a problem claiming your warranty.


----------



## maynard14

Quote:


> Originally Posted by *Iniura*
> 
> Just replace it and when you need your warranty change the stock cooler back make sure you flashed the default bioses on both sides and I dont think their would be a problem to claim your warranty.


Thank you for the advice, sir. But what about the stickers that seal the screws? Is there any tutorial on removing the stickers without damaging them?


----------



## Iniura

Quote:


> Originally Posted by *maynard14*
> 
> thank you sir for the advice,. but how about those stickers that seals the screw ? are there any tutorial to remove the stickers without damaging them?


You could heat them with a hairdryer and use a razor blade to get them off. Make sure you keep them somewhere safe, preferably on a sheet of sticker backing paper (I don't know what you call it in English). Also, I'm typing this on my iPhone, so please excuse any spelling mistakes.


----------



## Paul17041993

I actually just realized mine doesn't have the X-bar. Yay, no warranty void...? I'd have to question how much it's actually needed, though...


----------



## maynard14

Quote:


> Originally Posted by *Iniura*
> 
> You could heat them with a hairdryer and use a razorblade to get them of make sure you keep them somewhere save, preferrable on a (I dont know how you call this in english but a sticker piece of paper). Also im typing this one my iphone so please excuse my spelling if any.


Whoa, I didn't know that, sir... really nice! Thank you so much for your advice, it really helps. I'll try the hairdryer and remove them safely. Yes sir, I got what you meant; sorry also for my English. Lastly, sir: I have Coollaboratory here. Can I use that on the GPU die? Is it safe? I don't have an NZXT G10 or VRM heatsinks for now, but I'm planning to reapply the stock thermal paste on my card.


----------



## Iniura

Quote:


> Originally Posted by *maynard14*
> 
> whoaa i dont know that sir,.. really nice! thank you so much for your advice it really help me sir,.. ill try to blow dry them and have remove them safely,,. yes sir i got what you meant sir,. sorry also for my english..... lastly sir,. i have coollaboratory here, can i use that on the gpu die? is it safe.. ? i dont have nzxt g10 and vrm hearsinks as for now,. but im planning to re apply the stock thermal paste of my vcard..


Yes sir, it's excellent for low temps, but be very careful when applying it. I've never used it myself, but I think you have to make sure it doesn't go anywhere other than the GPU die. If someone else can verify what I said, that would be great; the last thing I want to do is misinform people.


----------



## jamaican voodoo

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> With a healthy overclock I think a 4670k will do fine with two 290's. The gain from single to xfire will be a healthy jump, but with a 4770k you would get an even greater jump possibly. I made the switch when I went tri-fire 6970's years ago.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Back then I went from 2500k @ 4.9ghz to 2600k @ 4.7ghz and still got more performance with the multi-gpu setup. Especially when BF3 came out.


Yes, the i7 does make a difference for setups with more than two GPUs.


----------



## jamaican voodoo

Quote:


> Originally Posted by *chronicfx*
> 
> OK OK.. Forget everything I said about stuttering, for now... I wiped my drivers correctly using DDU and put on 13.12.. Crysis 3, Far Cry 3 (100+ FPS), BF3, BF4, and The Witcher 2 (Maxed with uber enabled, getting 70-90 FPS @ 1440p) ALL ACCEPTABLY SMOOTH
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Only lame thing that has sprung up since I started messing with the drivers is that now my headphones crackle when vsync is on. I use a soundblaster ZX with aurvana live headphones. I googled the problem for a sec and it seems pretty damn common. Is there anyway to fix it?
> 
> Super excited I got these synced up and working, I was looking at IVY-E mobo/proc combos. Saved myself $700 by installing the drivers right (not sure if I am happy about not getting buying a new processor though).


If you can get an i7, it will make a big difference in FPS with tri-fire... your i5 is kinda choking right now!! But it'll do for now.


----------



## maynard14

Quote:


> Originally Posted by *Iniura*
> 
> Yes sir its excellent for low temps but be very carefull when applying it, I've never used it but I think you have to make sure it doesnt go anywhere else then the gpu die, also if someone else can verify this what I said that would be great The last thing I want to do is misinforming people.


Thank you once again, sir! I'll try removing stickers from other things first so I won't mess up my card's stickers, haha. OK sir, thank you so much!


----------



## Davschall

Quote:


> Originally Posted by *maynard14*
> 
> thank you sir for the advice,. but how about those stickers that seals the screw ? are there any tutorial to remove the stickers without damaging them?


I actually used needle-nose pliers and gripped the outside of the screw from the top down. I tried a razor blade, but didn't like the thought of that, lol. Good thing I was too smart for them. Be really careful with the needle-nose; you don't want to slip.


----------



## maynard14

Quote:


> Originally Posted by *Davschall*
> 
> I actually used a needle nose pliers and gripped the outside of the screw from the top down. I tried a razor blade but I think the thought of that lol. Good thing I was to smart for them. Be really careful with the needle nose dont want to slip.


Wow, another useful technique. Thanks, sir! I'll think about which one is the best and safest for the card.







thanks


----------



## Davschall

Quote:


> Originally Posted by *maynard14*
> 
> wowow another useful technique ,.. thanks sir! ill think about it first which is the best and safe for the vcard
> 
> 
> 
> 
> 
> 
> 
> thanks


NP, lol. I just laughed because I had this problem just a few hours ago, so I figured I might as well be useful. FYI, the tension on the screws (well, on my card anyway) was so light it was incredibly easy. Just use small needle-nose pliers; too big and you run a higher risk of slipping, which might not be bad... buuut it also could be.


----------



## TheRoot

Quote:


> Originally Posted by *Cool Mike*
> 
> Wonder if AMD has increased pricing on their end. Everyone taking advantage of the opportunity.


Absolutely not.







It's all Newegg greed (or something else); they use our cards to make mining kits at double the price.


----------



## MrWhiteRX7

Yeah, I got my last two 290s on Amazon for just over retail price, still under $500.


----------



## kizwan

Quote:


> Originally Posted by *Paul17041993*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Second GPU is in ULPS mode (power saving mode). This is why the _"Shaders"_, _Texture Fillrate"_, _"Bus Width"_ & _"Bandwidth"_ reported wrong value (e.g. 32bit & 20.2 GB/s) on the second card. Disabled ULPS using MSI Afterburner 3 beta 18.
> 
> 
> 
> 
> 
> 
> 
> the pcie, bus and bandwidth are power saving, the rest isn't, you cant enable or disable shaders on-the-fly.

Quote:


> Originally Posted by *Heinz68*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Second GPU is in ULPS mode (power saving mode). This is why the _"Shaders"_, _Texture Fillrate"_, _"Bus Width"_ & _"Bandwidth"_ reported wrong value (e.g. 32bit & 20.2 GB/s) on the second card. Disabled ULPS using MSI Afterburner 3 beta 18.
> 
> 
> 
> 
> 
> 
> 
> He has R9 290 cards, it can't show 2816 shaders on one card unless he got lucky flashing it to X version, I'm sure if he did he would not wonder why.
> The only other way is if somebody put R9 290X card in the wrong box.
> 
> EDIT
> In that case the card with the 2816 shaders would also show GPU Clock 1050 MHz and Memory 1350MHz, but it doesn't, it shows the R9 290 clocks for both cards.

This is screenshot of my cards when ULPS is disabled.


This is screenshot of my cards when ULPS is enabled.


@Heinz68, enable ULPS on your cards & see what you get in GPU-Z.


----------



## Spectre-

http://www.3dmark.com/3dm11/7972229

Hey guys, I want to bump my score even more.

I have a feeling I am throttling, considering my clocks are 1150/1325 @ +125 mV VDDC offset

and I am only getting a 16000 graphics score.

Thank you for your time.
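One quick way to confirm throttling is to log the core clock during the run (GPU-Z or Afterburner can write a log) and count how often it dips below the clock you set. A minimal sketch of that check; the sample values and the 10 MHz slack are made up for illustration:

```python
# Count how often the logged core clock dips below the target clock.
# A run with no dips is holding its clock; frequent dips mean the card
# is pulling back (power or temperature limit).
def throttle_ratio(samples_mhz, target_mhz, slack=10):
    """Fraction of log samples more than `slack` MHz below the target."""
    dips = sum(1 for c in samples_mhz if c < target_mhz - slack)
    return dips / len(samples_mhz)

log = [1150, 1150, 1090, 1150, 1040, 1150, 1150, 980]  # illustrative data
print(f"throttled on {throttle_ratio(log, 1150):.0%} of samples")
```

If the ratio is well above zero during the graphics tests, raising the power limit (or fan speed) usually helps more than adding voltage.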


----------



## mojobear

Quote:


> Originally Posted by *Brian18741*
> 
> Add me to the club please!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quick question though, I'm getting different readings for the 2 cards on GPUz.
> 
> 2816 shanders on one card, 2560 on the other.
> 176GTexel/s vs 160.
> Memory GDDR5 (autodetect) vs (elpida)
> Bus Width 32bit vs 512bit!
> Bandwidth 20.2 GB/s vs 322.6GB/s
> 
> Can anyone explain this to me?! Is it just GPUz is bugged or is there something up with one of my cards? Running latest beta drivers.


Just disable ULPS and they should show up as 290s. ULPS messes up GPU-Z.


----------



## psyside

Quote:


> Originally Posted by *Mercy4You*
> 
> 1: 22 C
> 
> 2: Lian Li A75WX
> 
> 3: Up to 45% Tri X fans
> 
> 4: 1150/1350/130% Power/+.75v
> 
> 5: Valley extreme HD preset, 1 'warm' run (50 C) and VRM temp at 92 max.
> 
> I noticed Valley is heavy on VRM, when I run 3D Mark 11 the VRM temp stays well below 80 C, even multiple runs...


Let me tell you at once: running the fans below 60% while overclocked is a bad idea. With your settings, use at least 60%.
Quote:


> Originally Posted by *Abyssic*
> 
> 1: 18°C
> 
> 2: Corsair Carbide Air 540 with 6 fans
> 
> 3: also up to 45% tri-x cooler
> 
> 4: 1140/1600/+50% Power/+100mV
> 
> 5: Crysis 3 various run times up to 2h


Same goes for you: at least 60% fan and you'll be fine, cheers.


----------



## Heinz68

Quote:


> Originally Posted by *kizwan*
> 
> This is screenshot of my cards when ULPS is disabled.
> 
> 
> This is screenshot of my cards when ULPS is enabled.
> 
> 
> @Heinz68, enabled ULPS on your cards & see what you get in GPU-Z.


I have Sapphire *R9 290X* 4GB Tri-X OC cards in CrossFire, so they both correctly show 2816 shaders and a 512-bit bus width. It makes no difference whether ULPS is disabled or not; it does not affect the basic information about the cards.

ULPS disabled or enabled only affects the sensor readings on the next page, and for those to read correctly you should have only one instance of GPU-Z open at a time.


----------



## Derpinheimer

Quote:


> Originally Posted by *Davschall*
> 
> NP Lol i just laughed cause I had this problem just a few hours ago, figured I might as well be useful. FIY the tension on the screws (well on my card anyway) was so light it was incredibly easy. Just use a small needle nose to big and you run a higher risk of slipping which might not be bad....buuut it also could be.


Clever. I used a razor and only succeeded with 1 of the screws. The other tore. Wonder how that works.. lol.


----------



## Maracus

Quote:


> Originally Posted by *Derpinheimer*
> 
> Clever. I used a razor and only succeeded with 1 of the screws. The other tore. Wonder how that works.. lol.


Heh I stuffed around trying to get them off, in the end i just drove the screwdriver into the guts of it and mangled both of them...


----------



## kizwan

Quote:


> Originally Posted by *Heinz68*
> 
> I have Sapphire *R9 290X* 4GB TRI-X OC in CrossFire, so they both correctly show 2816 shaders and 512 Bus With. Does not make any difference if the ULPS is disabled or not. It does not effect the basic information about the cards.
> 
> The ULPS DIsabled or Enabled only effects the sensors on the next page and for that to read correctly you should have only one GPU-Z open at the same time.


I got the information from your sig; it says you have 290s. You have 290Xs, so it's not a good example. ULPS does make GPU-Z read wrong information from the second card: on 290 CrossFire, GPU-Z will read a wrong shader count from the second card.

It doesn't matter whether I'm running one or two instances of GPU-Z. With one instance, it will still read wrong values from the second card if ULPS is enabled.


----------



## Snyderman34

Need to be removed from the club. Switching from my Sapphire 290 to a 780 Ti SC


----------



## Arizonian

Quote:


> Originally Posted by *Snyderman34*
> 
> Need to be removed from the club. Switching from my Sapphire 290 to a 780 Ti SC


Done.


----------



## sporti

Quote:


> Originally Posted by *Snyderman34*
> 
> Need to be removed from the club. Switching from my Sapphire 290 to a 780 Ti SC


Lol, i switched from a 780ti SC ACX to a 290X Tri-x....








All the 780 Tis I have tried had very loud coil whine...

In direct comparison, the image quality from the 290X is much better than the NVIDIA 780/780 Ti...
Anybody else noticed that???

My new Tri-X 290X is running at default voltage at [email protected] [email protected] Power Target +25%. ASIC is about 78.xx%. Temps are low: GPU max 75 C, VRM1 max 83 C, VRM2 max 65 C.

Is that OC result okay?


----------



## kizwan

Quote:


> Originally Posted by *sporti*
> 
> Lol, i switched from a 780ti SC ACX to a 290X Tri-x....
> 
> 
> 
> 
> 
> 
> 
> 
> All the 780ti i have tried had much and very loud coil whining...
> 
> In direct comparison, the image quality from the 290x is much better than NVIDIA 780/780ti...
> Anybody else noticed that ???
> 
> My new Tri-x 290X is running with default Voltage at [email protected] [email protected] Power Target +25%. Asic is about 78,xx%. Temps are low GPU max 75C VRM1 max. 83Ĉ VRM2 max 65C.
> 
> Is that OC result okay ?


I'm just wondering, did you run OCCT to check stability? I'm asking because when benching I only make sure it's stable for that bench; for gaming I use OCCT.


----------



## sporti

Quote:


> Originally Posted by *kizwan*
> 
> I'm just wondering, did you run OCCT to check stability? I'm asking because when benching I only make sure it stable for that benching & for gaming I use OCCT.


Why do you wonder? Is the result not okay?

I have checked the result with 3DMark Firestrike, 3DMark11, Tomb Raider, Heaven and Valley...
I have owned the card for about 14 days and have played BF4 the whole time with these OC parameters without any problem or crash, so I think it is stable.

Is it normal that the benchmark results for 3DMark11, Heaven or Valley are lower than an NVIDIA 780 Ti?
Under the Firestrike Extreme benchmark I get nearly the same result as my 780 Ti SC [email protected] BIOS 1200/3900...


----------



## Paul17041993

Quote:


> Originally Posted by *kizwan*
> 
> This is screenshot of my cards when ULPS is disabled.
> 
> 
> This is screenshot of my cards when ULPS is enabled.
> 
> 
> @Heinz68, enabled ULPS on your cards & see what you get in GPU-Z.


huh, guess that [gpu-z] bug should be reported then...


----------



## Brian18741

Quote:


> Originally Posted by *kizwan*
> 
> This is screenshot of my cards when ULPS is disabled.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> This is screenshot of my cards when ULPS is enabled.
> 
> 
> 
> 
> @Heinz68, enabled ULPS on your cards & see what you get in GPU-Z.


Thanks man. This looks to be exactly the problem I'm having. I'm in work now, so I'll have a look later when home! I briefly looked for ULPS in Afterburner but no sign of it. I'm using 2.3.0; what version are you running?

And thanks to everybody else for your suggestions and input! Will explore them more if the power saving thing doesn't resolve the issue.


----------



## BakerMan1971

Quote:


> Originally Posted by *sporti*
> 
> Lol, i switched from a 780ti SC ACX to a 290X Tri-x....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In direct comparison, the image quality from the 290x is much better than NVIDIA 780/780ti...
> Anybody else noticed that ???


I think the image quality difference has always been present between vendors; of course, in recent years with just two major players it's mostly between AMD and Nvidia.

I have always found the ATI/AMD image to appear sharper and more vivid, which was always put down to better DACs. It's largely a preference that differs greatly from person to person.


----------



## psyside

Don't use the auto fan profile when testing an OC......


----------



## rcoolb2002

I'm ready to join









MSI R-290x DDR5 Reference Non-BF4

EK-Acetal Block + Backplate


----------



## Arizonian

Quote:


> Originally Posted by *rcoolb2002*
> 
> I'm ready to join
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MSI R-290x DDR5 Reference Non-BF4
> 
> EK-Acetal Block + Backplate
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## rdr09

Quote:


> Originally Posted by *sporti*
> 
> Why do you wonder? Is the result not okay?
> 
> I have checked the result with 3DMark Fire Strike, 3DMark 11, Tomb Raider, Heaven and Valley...
> I've owned the card for about 14 days and have played BF4 the whole time with these OC parameters without any problem or crash... so I think it is stable.
> 
> Is it normal that the benchmark results for 3DMark 11, Heaven or Valley are lower than an NVIDIA 780 Ti?
> In the Fire Strike Extreme benchmark I get nearly the same result as my 780 Ti SC [email protected] BIOS at 1200/3900...


I noticed the 290 Pro, clock for clock, keeps up with the 780 Ti in some synthetic benches, except Heaven 4.0 and Valley.


----------



## kizwan

Quote:


> Originally Posted by *Brian18741*
> 
> Thanks man. This looks to be exactly the problem I'm having. I'm in work now, so I'll have a look later when home! I briefly looked for ULPS in Afterburner but no sign of it. I'm using 2.3.0; what version are you running?
> 
> And thanks to everybody else for your suggestions and input! Will explore them more if the power saving thing doesn't resolve the issue.


It's not actually a problem or bug. ULPS is a power-saving feature where the secondary GPU in CrossFire goes into a low-power state when idle; it only wakes up when you're playing games or running 3D apps that use CrossFire. If you want to overclock, you'll need to disable ULPS because it can prevent the second GPU from being overvolted. ULPS also prevents GPU-Z from reading correct information from the second GPU when idle. I use the latest beta, MSI Afterburner 3.0 beta 18, with ULPS disabled in its settings.
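For reference, ULPS can also be toggled without Afterburner by editing the `EnableUlps` value under the AMD display driver's registry keys. This is a commonly shared tweak, but the `0000`/`0001` subkey index varies per system (it's whichever `00xx` key under the display-class GUID contains `EnableUlps`), so treat this as a sketch and back up the registry first:

```
Windows Registry Editor Version 5.00

; Disable ULPS on the secondary GPU. The "0001" subkey index here is an
; example; check each 00xx subkey under the display-class GUID for one
; that contains an EnableUlps value.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0001]
"EnableUlps"=dword:00000000
```

A reboot is needed for the change to take effect; Afterburner's own "Disable ULPS" option reportedly does the same thing under the hood.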


----------



## Sgt Bilko

A friend just picked up an XFX R9 290 reference card from PCCG in Aus and it unlocked to a 290X!

There are still a few around, it seems


----------



## Heinz68

Quote:


> Originally Posted by *kizwan*
> 
> I got the information from your sig. Your sig said you have 290s; you have 290X's, so it's not a good example. ULPS does make GPU-Z read wrong information from the second card. With 290 CrossFire, GPU-Z will read the wrong shader count from the second card.
> 
> It doesn't matter if I'm running one or two instances of GPU-Z. With one GPU-Z, it will still read wrong values from the second card if ULPS is enabled.


There was a mistake in my signature, thanks for pointing it out and I just corrected it. Both my cards are R9 290X.

I have no idea why GPU-Z does not describe one of your cards correctly. As I said before, the main page shows only static info about the card; ULPS should not change it, since the card is the same whether ULPS is on or off.

Do you have any other voltage-setting or monitoring application running at the same time? MSI Afterburner strongly recommends against it, and there is a warning in its tooltips about it.
Finally, did you try flashing the card? Maybe it's an R9 290X if it shows 2816 shaders.

Read more about it here

EDIT
Or there is a bug in GPU-Z, as 'Paul17041993' already posted here, and you should report it.
I can't duplicate the error with my 2 R9 290X cards, otherwise I would report it myself.


----------



## Mercy4You

Quote:


> Originally Posted by *psyside*
> 
> Let me tell you right away: running the fans below at least 60% while overclocking is a bad idea. With your settings, use at least 60%.
> Same goes for you: at least 60% fan and you will be fine, cheers.


Thanks Psyside! Instead of raising the fans on this oh-so-quiet card, I decided to lower my clocks to the point that they don't need any voltage adjustment. I can do 1100/1400 with only +30% power. Did some VRM testing and now it stays under 85 C


----------



## Mercy4You

Quote:


> Originally Posted by *Abyssic*
> 
> thanks for the information, yeah i will also see if i can decrease those temps a bit. i also noticed that i can run 1120 at +20mV but i need +100mV for 1140 xD


Abyssic, I decided to run my Tri-X at 1100/1400 with +30% power and *NO* voltage adjustment; VRM temps now remain below 80-85 C. I've read some nasty stories about burned VRMs while the chips were intact.. xD


----------



## standardhlozek

GIGABYTE R9 290 GV-R929D5-4GD-B

AFTERMARKET COOLING - Prolimatech MK26 and Alpenhof heatsinks


----------



## kizwan

Quote:


> Originally Posted by *Mercy4You*
> 
> Abyssic, I decided to run my Tri-X at 1100/1400 with +30% power and *NO* voltage adjustment; VRM temps now remain below 80-85 C. I've read some nasty stories about *burned VRMs* while the chips were intact.. xD


Where did you read that? Related to 290/290X?


----------



## Arizonian

Quote:


> Originally Posted by *standardhlozek*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> GIGABYTE R9 290 GV-R929D5-4GD-B
> 
> AFTERMARKET COOLING - Prolimatech MK26 and Alpenhof heatsinks


Congrats - added









Also see it's your 18th post on OCN welcome aboard.


----------



## Mercy4You

Quote:


> Originally Posted by *kizwan*
> 
> Where did you read that? Related to 290/290X?


No, not related to 290/290X but yes to older AMD cards...


----------



## jerrolds

Quote:


> Originally Posted by *Abyssic*
> 
> Hey guys, could someone take a look at my log file to make sure everything is alright? I just want some other opinions. I should also note that I've seen VRM temps over 90°C; is this dangerous?
> 
> Download the Log.txt here


90C for the VRMs is safe, and normal for people using smaller heatsinks/thermal tape with aftermarket coolers like the Accelero III, Hybrid, G10, etc. Over 100C I wouldn't be comfortable, but I think they're rated for at least 125C, maybe even 150C.

Obviously lower is better, but 90C is fine.
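The ballpark thresholds being thrown around here (fine below ~90C, uncomfortable past 100C, rated for maybe 125C) can be captured in a trivial helper. Note the cutoffs below come from this discussion, not from any official datasheet:

```python
# Rough VRM temperature sanity check using the ballpark figures from this
# thread (NOT official specs): under ~90C is normal, 90-100C is worth
# watching, and the parts are reportedly rated to around 125C.
def vrm_status(temp_c: float) -> str:
    """Classify a VRM temperature reading in degrees Celsius."""
    if temp_c < 90:
        return "normal"
    elif temp_c < 100:
        return "warm - keep an eye on it"
    elif temp_c < 125:
        return "hot - improve cooling"
    else:
        return "over rated limit - back off the overclock"

if __name__ == "__main__":
    for t in (63, 92, 110, 130):
        print(t, "->", vrm_status(t))
```

Feed it your GPU-Z or Afterburner VRM1 reading; anything in the first two buckets matches what people in this thread consider acceptable.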


----------



## Ukkooh

Quote:


> Originally Posted by *jerrolds*
> 
> 90C for the VRMs is safe, and normal for people using smaller heatsinks/thermal tape with aftermarket coolers like the Accelero III, Hybrid, G10, etc. Over 100C I wouldn't be comfortable, but I think they're rated for at least 125C, maybe even 150C.
> 
> Obviously lower is better, but 90C is fine.


Though I've heard they lose efficiency fast above 80°C or so.


----------



## Mercy4You

Quote:


> Originally Posted by *Ukkooh*
> 
> Though they lose efficiency fast after 80°C or so I've heard.


And how about lifespan when running the VRMs at 90-95 C? When buying a new GPU every 2 years, like most of us here do, I guess it will not be a problem


----------



## Ukkooh

Quote:


> Originally Posted by *Mercy4You*
> 
> And how about lifespan when running VRM at 90-95 C ? When buying a new GPU every 2 years, like most of us do here I guess it will not be a problem


If they break you can always RMA it









Edit: I have proof that Sapphire supports overclocking and overvoltage in their warranty if you use TriXX to OC with the stock BIOS.


----------



## Mercy4You

Quote:


> Originally Posted by *Ukkooh*
> 
> If they break you can always RMA it


Really? Aren't we all overclocking here


----------



## Ukkooh

Quote:


> Originally Posted by *Mercy4You*
> 
> Really? Aren't we all overclocking here


I added something to my earlier post that might be of interest to you.


----------



## chronicfx

Quote:


> Originally Posted by *Mercy4You*
> 
> And how about lifespan when running VRM at 90-95 C ? When buying a new GPU every 2 years, like most of us do here I guess it will not be a problem


Being a scientist myself, and seeing how long we torture-test a lot of stuff, I would bet they have already demonstrated in AMD's own labs that these cards do not break at 95C, if that is what they are publicly claiming.


----------



## Cybertox

Got my R9 290X running in Uber Mode and so far I am a proud owner


----------



## Mercy4You

Quote:


> Originally Posted by *Ukkooh*
> 
> If they break you can always RMA it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: I have proof that Sapphire supports overclocking and overvoltage in their warranty if you use TriXX to OC with the stock BIOS.


That is indeed interesting and good to know, because I really thought overclocking meant voiding the warranty.

Let's overclock







Anyway, I've bought nearly a new card every year for the last 4 years; hell, even 2 R9 290Xs in 2 months' time!
So, how long should a card last...?


----------



## Pesmerrga

There still hasn't been any further info on the 290X Lightnings, right? They would probably cost $999 at this point...









Newegg is such a bunch of d-bags.. XFX 290X DD - $899, vs. Tigerdirect XFX 290X DD - $649


----------



## xProxius

Hey guys, I'm going to order a 290 in a few days. But so far all I can find for a decent price is XFX R9s. How are XFX these days? I know a few years ago they were to be avoided.


----------



## MrWhiteRX7

If it's a reference card, don't worry about it







I had XFX 7970 Black Editions in the past and they were awesome


----------



## Sgt Bilko

Quote:


> Originally Posted by *Xaphan187*
> 
> Hey guys, I'm going to order a 290 in a few days. But so far all I can find for a decent price is XFX R9s. How are XFX these days? I know a few years ago they were to be avoided.


Nothing wrong with mine


----------



## jamaican voodoo

Quote:


> Originally Posted by *sporti*
> 
> Lol, i switched from a 780ti SC ACX to a 290X Tri-x....
> 
> 
> 
> 
> 
> 
> 
> 
> All the 780ti i have tried had much and very loud coil whining...
> 
> In direct comparison, the image quality from the 290x is much better than NVIDIA 780/780ti...
> Anybody else noticed that ???
> 
> My new Tri-x 290X is running with default Voltage at [email protected] [email protected] Power Target +25%. Asic is about 78,xx%. Temps are low GPU max 75C VRM1 max. 83Ĉ VRM2 max 65C.
> 
> Is that OC result okay ?


Yes, I noticed that going from my GTX 570 SLI to HD 7970 CrossFire... the scaling on AMD's side was much better too, IMO... that's why I stick with AMD cards


----------



## rcoolb2002

Is it normal for VRM temp to be higher than core temp?

First time with a custom loop. Core sitting @ 52c with VRM 1 floating @ 63C


----------



## MrWhiteRX7

Quote:


> Originally Posted by *rcoolb2002*
> 
> Is it normal for VRM temp to be higher than core temp?
> 
> First time with a custom loop. Core sitting @ 52c with VRM 1 floating @ 63C


What water block? My VRMs stay under my core temp


----------



## chiknnwatrmln

Quote:


> Originally Posted by *rcoolb2002*
> 
> Is it normal for VRM temp to be higher than core temp?
> 
> First time with a custom loop. Core sitting @ 52c with VRM 1 floating @ 63C


What rads do you have? And is your CPU in your loop?

I run a 240 and a 360 rad, along with my 3770K. My card tops out at 48c core and 40c VRM1 at +200mV, about 1.31V under load.

I also used FujiPoly Ultra Extreme pads on the VRMs, so that could be part of the reason.

What block do you have? I have an EK Acetal.


----------



## rcoolb2002

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> What rads do you have? And is your CPU in your loop?
> 
> I run a 240 and a 360 rad, along with my 3770k. My card tops at 48c core and 40c VRM 1 on +200mV, about 1.31v under load.
> 
> I also used FujiPoly Ultra Extreme pads on the VRMs, so that could be part of the reason.
> 
> What block do you have? I have an EK Acetal.


A 240, with only the 290X on it. I used Fujipoly on the memory chips, but didn't get any 1.0mm pads for the VRMs


----------



## jamaican voodoo

Quote:


> Originally Posted by *Xaphan187*
> 
> Hey guys, I'm going to order a 290 in a few days. But so far all I can find for a decent price is XFX R9s. How are XFX these days? I know a few years ago they were to be avoided.


I heard the same thing too, but I took a leap of faith and went ahead and ordered 2 XFX DD editions... both watercooled, and they're running just fine with the Sapphire 290 Tri-X BIOS.....


----------



## kizwan

Quote:


> Originally Posted by *jamaican voodoo*
> 
> I heard the same thing too, but I took a leap of faith and went ahead and ordered 2 XFX DD editions... both watercooled, and they're running just fine with the Sapphire 290 Tri-X BIOS.....


What are the advantages of using the Sapphire 290 Tri-X BIOS?
Quote:


> Originally Posted by *rcoolb2002*
> 
> Is it normal for VRM temp to be higher than core temp?
> 
> First time with a custom loop. Core sitting @ 52c with VRM 1 floating @ 63C


Mine, with the same block & the stock EK thermal pad on the VRMs: VRM1 is usually a few degrees higher than core (depending on how high I overclock), but the difference between them is less than 10 degrees. At stock clocks, core & VRM1 are almost equal.
Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rcoolb2002*
> 
> Is it normal for VRM temp to be higher than core temp?
> 
> First time with a custom loop. Core sitting @ 52c with VRM 1 floating @ 63C
> 
> 
> 
> What water block? My vrms stay under my core temp

@rcoolb2002 uses an EK-Acetal block + backplate.

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/16600#post_21780210


----------



## xProxius

Quote:


> Originally Posted by *jamaican voodoo*
> 
> I heard the same thing too, but I took a leap of faith and went ahead and ordered 2 XFX DD editions... both watercooled, and they're running just fine with the Sapphire 290 Tri-X BIOS.....


Nice, those are the ones I was looking at. How are your VRM temps?


----------



## jamaican voodoo

Quote:


> Originally Posted by *Xaphan187*
> 
> Nice, those are the ones I was looking at. How are your VRM temps?


After an hour and a half of BF4, my VRM temps on all three cards were 47-49 max... I'm happy with the results

@kizwan: the advantage is that you might be able to overclock the card a little further using a different BIOS than stock... also, it's a free overclock: 1000MHz core / 1300MHz mem vs. the stock 947MHz core / 1250MHz mem
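For what it's worth, that "free overclock" works out to a fairly modest bump on paper. A quick sanity check of the clocks quoted above (1000/1300 on the Tri-X BIOS vs. the reference 947/1250):

```python
# Percentage uplift of the Tri-X BIOS clocks over the reference 290
# clocks quoted in the post above (1000/1300 vs. 947/1250).
def uplift_pct(new_clock: float, old_clock: float) -> float:
    """Percent increase of new_clock over old_clock."""
    return (new_clock - old_clock) / old_clock * 100

core = uplift_pct(1000, 947)   # core clock uplift
mem = uplift_pct(1300, 1250)   # memory clock uplift
print(f"core: {core:.1f}%, memory: {mem:.1f}%")  # core: 5.6%, memory: 4.0%
```

So roughly a 5-6% core and 4% memory bump before you touch any sliders; any extra headroom beyond that would come from the BIOS's voltage/power-limit defaults.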


----------



## MrWhiteRX7

@kizwan

Did you add thermal paste on both sides of the pads?


----------



## sporti

Quote:


> Originally Posted by *sporti*
> 
> My new Tri-x 290X is running with default Voltage at [email protected] [email protected] Power Target +25%. Asic is about 78,xx%. Temps are low GPU max 75C VRM1 max. 83Ĉ VRM2 max 65C.
> 
> Is that OC result okay ?


What about my card? Is the OC result good? What is your max OC with default voltage...?

Is there a newer/better BIOS for the 290X Tri-X....
My BIOS version is 015.042.000.000.000000.


----------



## mojobear

Quote:


> Originally Posted by *jamaican voodoo*
> 
> After an hour and a half of BF4, my VRM temps on all three cards were 47-49 max... I'm happy with the results
> 
> @kizwan: the advantage is that you might be able to overclock the card a little further using a different BIOS than stock... also, it's a free overclock: 1000MHz core / 1300MHz mem vs. the stock 947MHz core / 1250MHz mem


Have you noticed that you actually could overclock higher with the Tri-X BIOS? Curious... I may attempt it depending on your info


----------



## Mercy4You

Quote:


> Originally Posted by *sporti*
> 
> What about my Card, is the OC result good ? What are your max. OC with default Volatage...?
> 
> Is there a new/better Bios for 290X Tri-X....
> My Bios Version is 015.042.000.000.000000.


It's a bit boring: I have the same card, same BIOS, same OC with the same default voltage









...and the same temps!

But I dare say the Tri-X is the best R9 290X on the market right now








.


----------



## Paul17041993

Quote:


> Originally Posted by *Mercy4You*
> 
> It's a bit boring: I have the same card, same BIOS, same OC with the same default voltage
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ...and the same temps!
> 
> But I dare say the Tri-X is the best R9 290X on the market right now
> 
> 
> 
> 
> 
> 
> 
> 
> .


It still uses the reference PCB too, doesn't it?


----------



## Widde

Got a freeze in BF4 AGAIN, now on the test range :< Had to push the button to restart







This is on the 14.1 drivers; I'm getting this problem on both 13.12 and 14.1









And my fps on Ultra with 200% res scale is down to 40


----------



## Sgt Bilko

Quote:


> Originally Posted by *Paul17041993*
> 
> It still uses the reference PCB too, doesn't it?


The Tri-X and XFX DD both do, and the MSI Gaming does; not sure about the Gigabyte or PowerColor cards yet (assuming they do). The Asus DCUII is a non-reference design though.


----------



## Mercy4You

Quote:


> Originally Posted by *Paul17041993*
> 
> It still uses the reference PCB too, doesn't it?


Yep, but I've read the AMD build quality is very good...


----------



## kizwan

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> @kizwan
> 
> Did you add thermal paste on both sides of the pads?


Currently TIM only on the VRMs, & I'm using thicker pads. Does your VRM1 remain below core even when overclocked?


----------



## Paul17041993

Quote:


> Originally Posted by *Mercy4You*
> 
> Yep, but I read the AMD quality is very good...


Yeah, it's pretty good, much like the 7970s were; it's just that in this case the cooler can't keep up.

No idea if the DCII is any good this time round apart from cooling though...


----------



## chronicfx

Quote:


> Originally Posted by *Widde*
> 
> Got a freeze in bf 4 AGAIN now on test range :< Had to push the button to restart
> 
> 
> 
> 
> 
> 
> 
> this is on 14.1 drivers, getting this problem on both 13.12 and 14.1
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And my fps on ultra with 200% res scale is down to 40


What is this res scale stuff that I can use to artificially lower my framerate and force me to buy more GPUs?


----------



## kpoeticg

Has the Tri-X BIOS helped anybody with random black screens (desktop, Firefox, unable to open games)? I had to RMA my Hynix PowerColor because it was DOA, and got an Elpida card back. I used DDU in safe mode and switched to the 13.12 WHQL; I also have my board set up to boot the card under Legacy. I'm running the MSI BIOS right now with Overdrive off and Afterburner running. It helps but doesn't solve the problem: less random stuff, but it still happens. And I still can't open games


----------



## Coree

ARGH! I can't get enough pressure on VRM1. The pad has contact with the VRMs and the heatsink, but the problem is that if I try to press it, the resistor-type thingy bulges too much and I can't do anything with it. I'm reaching 78C core, 82C VRM1 and 64C VRM2 @ 1100/1250 under load (BF3). Any ideas? Should I just order a thicker VRM pad... Really, this VRM1 is bugging me, as it is for some of you too


----------



## Sgt Bilko

Quote:


> Originally Posted by *chronicfx*
> 
> What is this res scale stuff that I can use to artificially lower my framerate and force me to buy more GPUs?


I use it to actually make my GPUs do something









At normal 1080p my CPU bottlenecks them so hard they sit at 30% usage with 70 fps lol
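For anyone wondering what res scale actually costs: the slider scales each axis, so the pixel count grows with the square of the percentage. That matches how BF4 behaves at 200%, where 1080p renders internally at 3840x2160 (4x the pixels); other games may interpret the slider differently, so take this as a rough calculator:

```python
# Effective render resolution at a given resolution-scale percentage,
# assuming the scale applies per axis (as BF4 does: 200% at 1080p
# renders 3840x2160, i.e. 4x the pixels).
def scaled_resolution(width: int, height: int, scale_pct: int):
    factor = scale_pct / 100
    return round(width * factor), round(height * factor)

def pixel_cost(width: int, height: int, scale_pct: int) -> float:
    """How many times more pixels are rendered vs. native."""
    w, h = scaled_resolution(width, height, scale_pct)
    return (w * h) / (width * height)

if __name__ == "__main__":
    print(scaled_resolution(1920, 1080, 200))  # (3840, 2160)
    print(pixel_cost(1920, 1080, 200))         # 4.0
```

So 200% res scale is roughly 4x the GPU load of native 1080p, which is why it turns a CPU bottleneck back into a GPU one.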


----------



## AmcieK

Guys, why do I have 100% usage but only 907.2MHz? See the screenshot. And next question: why only 50 fps, with settings set by the game to High?



http://imgur.com/Onh9Wo3


----------



## Mercy4You

Quote:


> Originally Posted by *AmcieK*
> 
> Guys, why do I have 100% usage but only 907.2MHz? See the screenshot. And next question: why only 50 fps, with settings set by the game to High?


I think you have to set the Power Limit to +25% in CCC, otherwise it's not loading the core fully...


----------



## chiknnwatrmln

Also, you don't have 100% GPU usage; MSI AB's readings are notoriously inaccurate. Use GPU-Z's readings.


----------



## kcuestag

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Also, you don't have 100% GPU usage; MSI AB's readings are notoriously inaccurate. Use GPU-Z's readings.


That's not MSI AB's fault though; it's been explained quite a few times that it's a bug in the AMD Overdrive 5/6 API, and it needs to be fixed by AMD.


----------



## Maracus

Quote:


> Originally Posted by *kizwan*
> 
> Currently TIM only on the VRMs, & I'm using thicker pads. Does your VRM1 remain below core even when overclocked?


Mine doesn't usually, but only playing BF4; for example, I just tried *1210/1450 +150*mV. A bit of overkill on the voltage; that's with EK 1mm pads with TIM on top of the pads.


----------



## Jack Mac

This made me lol


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jack Mac*
> 
> This made me lol


Wait wait wait..... $840 US for an R9 290X?

And that's on SALE?

Scary thing is they are $750 AUD for the Black Edition.

My my, the world has indeed gone mad......

You know all those $1k Titan jokes?........ what goes around comes around?


----------



## Jack Mac

Well, I sold my 290 today for $525 and ordered this:

Oh lawd, I hope it doesn't take a month or so to ship; idk how long I can live on integrated...
Edit:
I'm too impatient and I want something new. While I do absolutely love the 290, I don't like the prices or having to wait to get another card, so I just pulled the trigger on the 780 SC. I am really sorry to leave; I'll definitely miss being a part of this club. If you could remove me from the roster, that'd be great. Thanks.


----------



## phallacy

^ Traitor, haha j/k.. Enjoy the 780 SC! Are you going to go SLI?

I just did a Heaven run with both cards at 1200/1600, but using a 120 mV offset instead of the usual 75. I beat my previous Heaven score and finally passed 3000, which is the number I was looking for. I had a higher run going, but at scene 25/26 my system locked up =/


----------



## Jack Mac

Actually, I changed my mind again lol. I went on NCIX to check the prices and ordered an XFX 290 DD for $2 more than the 780 would have cost me on Amazon, now that Amazon charges tax in NC.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jack Mac*
> 
> Actually I changed my mind again lol, I went on NCIX to check the prices and ordered an XFX 290 DD for $2 more than the 780 would have cost me on Amazon now that Amazon charges tax in NC.


Welcome back!!


----------



## taem

Ok, my PowerColor PCS+ 290 arrives in an hour or so. Single GPU, 4670K, Windows 8.1 64-bit, 1440p Korean IPS screen. I do not play BF4 and won't be buying Thief until a big Steam sale.

Should I use Catalyst 13.12 or the 14.1 beta?

I plan to CrossFire fairly soon; I have a 290 Tri-X on back order, but might cancel that and get another PowerColor PCS+ if I like this card.


----------



## Jack Mac

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Welcome back!!


A very brief, 20-minute leave lol. I really don't want to go back to Nvidia.


----------



## SeanEboy

I have the option to buy (2) 290Xs @ $650 each (damn tax)... Thoughts on that? Do I pull the trigger? Or wait until I find 290s, specifically without the damn tax...? Will I be able to make some of it back mining?


----------



## jamaican voodoo

Quote:


> Originally Posted by *Jack Mac*
> 
> Actually I changed my mind again lol, I went on NCIX to check the prices and ordered an XFX 290 DD for $2 more than the 780 would have cost me on Amazon now that Amazon charges tax in NC.


That was a quick return lol, I got a good laugh outta this!!!


----------



## jamaican voodoo

Quote:


> Originally Posted by *Jack Mac*
> 
> A very brief, 20 minute leave lol. *I really don't want to go back to Nvidia*.


You and me both... it's not that I will never go back, but right now my experience with AMD cards has been nothing but amazing, with a few hiccups here and there..... between image quality, scaling and raw GPU power, what's not to like


----------



## Jack Mac

I think I don't want to go back because I can't fathom paying for a slower card to save $2.


----------



## SeanEboy

So, I pulled the trigger... Dammit, I'm so pissed.. (2) 290Xs at $650 each... Some guy bought all (4) 290s he had in stock THIS MORNING... Bastids! Well, I guess I can only hope for Hynix, high hashes, and high resale value, h-right? ;c)


----------



## taem

Quote:


> Originally Posted by *Jack Mac*
> 
> I think I don't want to go back because I can't fathom paying for a slower card to save $2.


I picked the 290 also, but is the 780 a slower GPU? When the 290 released, the selling point was that it is *almost* as fast as a 780 but costs $100 less. Have the custom coolers changed this equation?


----------



## jamaican voodoo

Trust me, the 290 is faster, but not by much


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> Currently TIM only on the VRMs, & I'm using thicker pads. Does your VRM1 remain below core even when overclocked?


Are you using TIM and thermal pads? If so, where on the PCB would you add TIM, and on which pads?


----------



## pkrexer

My VRM1 temp is almost always higher than my core temp on water. After 1 hour of BF4, I'm usually sitting around 42c core / 54c VRM. XSPC water block here.


----------



## axiumone

Still waiting for my VisionTek orders to ship. Ordered on Monday and the orders are still in "confirmed" status. The site says it takes 5-8 days to ship. I can hardly wait.


----------



## kizwan

Quote:


> Originally Posted by *Maracus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Currently TIM only on the VRMs & I'm using thicker pads. Your VRM1 remain below core even if overclocked?
> 
> 
> 
> Mine doesn't usually but only playing BF4, for example I just tried *1210/1450+150*mv. Bit of overkill on the voltage, thats with EK 1mm pads with TIM on top of pads.

What is your ambient (indoor) temp? Ambient (indoor) here without aircon is 30+ Celsius 24/7. We had cooler nights for a couple of weeks, 28C, but not anymore. When my cards are at stock clocks, core & VRM1 are usually very close to each other. It's when I overclock that VRM1 starts getting higher than core, but the difference is less than 10C.
Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Currently TIM only on the VRMs & I'm using thicker pads. Your VRM1 remain below core even if overclocked?
> 
> 
> 
> Are you using tim and thermal pads ? If so where on the pcb you would add tim and on what pads ?

[EDIT] I don't know whether you need to do this on XSPC blocks though. For EK blocks, EK recommends putting TIM on the VRMs before laying the thermal pad there.

Yes, I'm using TIM & thermal pads. I put very little TIM on the VRMs & put a thermal pad on top of them. Some people like to put TIM on both sides of the thermal pad. Right now I use double-stacked 1mm thermal pads on VRM1, which means 2mm there. I personally don't recommend a 2mm pad there, only for pros... hehe

Ambient (indoor) without aircon is 30+ Celsius 24/7. So, when overclocked, both core & VRM1 will be in the 60s Celsius. I'll run some benches or games with the aircon on to see what I get at a cooler ambient.
Quote:


> Originally Posted by *pkrexer*
> 
> My VRM 1 temp is usually always higher then my core temp for me on water. After 1 hour of BF4, I'm usually sitting around 42c core / 54c VRM. XSPC water block here.


Same here, but the difference between core & VRM1 on my cards is less than 10C.


----------



## Widde

A new problem just showed itself :S In the opening scene of The Witcher 2, when the elf or whatever it is is getting chased through the woods, right when you see the 2 guys running towards him and the screen turns black, the game freezes and I have to reboot the PC. Is this driver related, or do I have a bad card? Running CrossFire. The same thing happens in Battlefield 4 after a while, but never in any other game.

Starting to consider selling these 2 and going 780 Ti :S Don't want to, but soon it seems to be the only way


----------



## kizwan

Quote:


> Originally Posted by *Widde*
> 
> A new problem just showed itself :S In the opening scene in the witcher 2 when the elf or whatever it is is getting chased through the woods, when you see the 2 guys running towards him and when the screen turns black the game freezes and I have to reboot the pc. Is this driver related or do I have a bad card? Running crossfire. Same thing happens in battlefield 4 after a while but never in any other game
> 
> Starting to consider selling these 2 and going 780ti :S Dont want to but soon it seems to be the only way


If this is happening at stock clocks, I think you should consider RMA'ing them. I can tell you a black screen can also happen if the VRAM/memory is running hot, so make sure the VRAM/memory is getting adequate cooling. I would try re-seating the cooler (if not using the stock cooler) before RMA'ing them.


----------



## Maracus

Quote:


> Originally Posted by *kizwan*
> 
> What is your ambient (indoor) temp? Ambient (indoor) here without aircon is 30+ Celsius 24/7. We had cooler nights for a couple of weeks, 28C, but not anymore. When my cards are at stock clocks, core & VRM1 are usually very close to each other. It's when I overclock that VRM1 starts getting higher than core, but the difference is less than 10C.


It's usually around 30 (indoor) as well; I may have had the air con on, but I don't remember (sweaty tropics). Anyhow, I didn't really plan water cooling the card that well: I had no brackets to attach the radiator to, so I just slapped it on top of the exhaust of the 360 radiator for the CPU (so when playing BF4 temps get a little high).

Messy build, I know, but I'm just holding out for Haswell-E and a 900D case, and then I will take a bit of pride in the way it looks.

As for the VRM temps, very similar to yours when adding voltage, only maybe 1-2c higher than core. I'll give it +200mV when I can get the room cool enough to see how it goes.


----------



## Widde

Quote:


> Originally Posted by *kizwan*
> 
> If this is happening at stock clock, I think you should consider RMA them. I can tell you black screen also can happen if VRAM/memory running hot. Make sure VRAM/memory getting adequate cooling. I would try re-seat the cooler (if not using stock cooler) before RMA them.


It is at stock clocks. I've had a problem with stutter when GPU-Z has been open for a while; that might have been fixed. I will check VRM temps.

I have a custom fan profile that keeps the core below 85c though
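A custom fan profile like that is just a piecewise-linear temperature-to-fan-speed curve, interpolated between points the way Afterburner does it. A sketch of how such a curve behaves; the points below are made up for illustration, not anyone's actual profile:

```python
# Piecewise-linear fan curve: (temp_C, fan_pct) points, linearly
# interpolated between neighbors and clamped at both ends.
# These points are illustrative only.
CURVE = [(40, 30), (60, 45), (75, 70), (85, 100)]

def fan_speed(temp_c: float, curve=CURVE) -> float:
    """Fan duty (%) for a given core temperature (C)."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # clamp below the first point
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # clamp above the last point
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(50))  # midway between 30% and 45% -> 37.5
print(fan_speed(90))  # past the last point -> clamped to 100
```

The steep segment near the top is what keeps the core pinned under the target (85c here) at the cost of noise.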


----------



## taem

Got and installed my Powercolor PCS+ 290 and wanted to share looks at the cooling especially. Pic flood inbound:

Intro:


Now on to the good stuff:

General look at cooler:


Vrm1:


Vrm2:


Thermal pad contact between heatsink and memory chips:


Looks pretty good, no? Haven't tested it yet; right now it's idling at 29c core, 24c VRM1, 25c VRM2, fan at 30%/1400rpm.

http://www.techpowerup.com/gpuz/8k7n8/


----------



## Roaches

Best looking Radeon card this gen. Looking forward to the load temp results


----------



## Arizonian

Quote:


> Originally Posted by *taem*
> 
> Got and installed my Powercolor PCS+ 290 and wanted to share looks at the cooling especially. Pic flood inbound:
> 
> Intro:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Now on to the good stuff:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> General look at cooler:
> 
> 
> Vrm1:
> 
> 
> Vrm2:
> 
> 
> Thermal pad contact between heatsink and memory chips:
> 
> 
> 
> 
> Looks pretty good, no? Haven't tested it yet; right now it's idling at 29°C core, 24°C VRM1, 25°C VRM2, fan at 30%/1400 RPM.
> 
> http://www.techpowerup.com/gpuz/8k7n8/


Congrats - added


----------



## kizwan

Quote:


> Originally Posted by *Maracus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> What is your ambient (indoor) temp? Ambient (indoor) here without aircon is 30+ Celsius 24/7. We had cooler nights for a couple of weeks, 28°C, but not anymore. When my cards are at stock clock, core & VRM1 are usually very close to each other. It's when I overclock that VRM1 starts getting higher than the core, but the difference is less than 10°C.
> 
> 
> 
> It's usually around 30 (indoor) as well; I may have had the air con on but don't remember (sweaty tropics). Anyhow, I didn't really plan water cooling the card that well. I had no brackets to attach the radiator to, so I just slapped it on top of the outtake of the 360 radiator for the CPU (so when playing BF4, temps get a little high).
> 
> Messy build, I know, but I'm just holding out for Haswell-E and a 900D case, and then I will take a bit of pride in the way it looks.
> 
> As for the VRM temps, very similar to yours when adding voltage, only maybe 1-2°C higher than core. I'll give it +200mV when I can get the room cool enough to see how it goes.

I don't worry about whether my watercooling looks pretty or not. My priority is performance, not aesthetics. Just make sure your radiator(s) get fresh air. Don't stack radiators, because thermal performance will be poor.

I was just benching my GPUs at +200mV (voltage maxing at 1.414V & 1.422V) in 34-36°C ambient yesterday.







Finally I got my max stable clocks, passing the OCCT stress test (15 minutes), at 1175/1600 MHz. I can bench 1200/1600 with +200mV but the stable clock is 1175/1600.
Quote:


> Originally Posted by *Widde*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> If this is happening at stock clock, I think you should consider RMAing them. I can tell you black screens can also happen if the VRAM/memory is running hot. Make sure the VRAM/memory is getting adequate cooling. I would try re-seating the cooler (if not using the stock cooler) before RMAing them.
> 
> 
> 
> It is at stock clock. I've had problems with stutter when GPU-Z had been open for a while; that might have been fixed. I will check VRM temps.
> 
> Have a custom fan profile that keeps the core below 85°C though.

Not VRM. I'm referring to *VRAM*. An overheating VRM can lead to throttling, while overheating VRAM can lead to black screens.


----------



## Widde

Quote:


> Originally Posted by *kizwan*
> 
> I don't worry about whether my watercooling looks pretty or not. My priority is performance, not aesthetics. Just make sure your radiator(s) get fresh air. Don't stack radiators, because thermal performance will be poor.
> 
> I was just benching my GPUs at +200mV (voltage maxing at 1.414V & 1.422V) in 34-36°C ambient yesterday.
> 
> 
> 
> 
> 
> 
> 
> Finally I got my max stable clocks, passing the OCCT stress test (15 minutes), at 1175/1600 MHz. I can bench 1200/1600 with +200mV but the stable clock is 1175/1600.
> Not VRM. I'm referring to *VRAM*. An overheating VRM can lead to throttling, while overheating VRAM can lead to black screens.


Ah, they're reference cards


----------



## taem

Quote:


> Originally Posted by *Roaches*
> 
> Best-looking Radeon card this gen. Looking forward to load temp results


Just ran a quick test of 10 runs in the Metro LL bench, everything maxed. Not thrilled with the result, tbh. Ambient is 21.5°C.



77c vrm1 temp









Oh btw Elpida ram. ASIC 70%.

On the plus side visual quality was flawless. May not be a high bar at stock but my 280x Vapor X gave me artifacts on this bench even at stock.

Was glancing at power draw on my Belkin Conserve Insight (measures at the wall); with the card at stock and the 4670K @ 4.6 1.25V, it draws 360-385W.


----------



## Mercy4You

Quote:


> Originally Posted by *kizwan*
> 
> An overheating VRM can lead to throttling, while overheating VRAM can lead to black screens.


Does anyone know if there's overheat protection on the VRMs? Does throttling only occur on high core temps, or also on high VRM temps?


----------



## Paul17041993

Quote:


> Originally Posted by *Mercy4You*
> 
> Originally Posted by *kizwan*
> 
> An overheating VRM can lead to throttling, while overheating VRAM can lead to black screens.
> 
> Does anyone know if there's overheat protection on the VRMs? Does throttling only occur on high core temps, or also on high VRM temps?


That's probably what mine was doing. Yeah, I think it usually throttles if the VRMs start to heat up too much, and would likely cut out (black screen) if left too hot for too long...


----------



## Mercy4You

Quote:


> Originally Posted by *Paul17041993*
> 
> That's probably what mine was doing. Yeah, I think it usually throttles if the VRMs start to heat up too much, and would likely cut out (black screen) if left too hot for too long...


Then that could be the reason my reference R9 290X was blackscreening, I guess. Looking at my Tri-X, when the core is around 80°C overclocked, VRM1 is hitting 90+°C.

My reference card was doing about 90°C on the core at stock, so VRM1 was probably doing 100+°C.
I never checked the VRM temps on that card, too bad...

*Edited*: Underclocking the memory made it more stable, and that's because the VRM would heat up less, right?


----------



## Forceman

The memory is powered by a different set of VRMs (VRM2) so reducing the memory clock isn't likely doing anything for your VRM temps (plus it's still the same voltage).


----------



## Jack Mac

Yay, NCIX isn't taking as long as I thought they would. I really want this card now, I'm really impatient if you haven't guessed already. I just have to wait for shipping now.


----------



## Widde

This was my GPU usage in FurMark: http://piclair.com/husar Noticed it was dropping to 0% on and off. RMA needed, or drivers/program messing it up? Nvm, in 3DMark it seems fine


----------



## Mercy4You

Quote:


> Originally Posted by *Forceman*
> 
> The memory is powered by a different set of VRMs (VRM2) so reducing the memory clock isn't likely doing anything for your VRM temps (plus it's still the same voltage).


Well, I wasn't totally sure that reducing memory clocks would give more stability, but I tried about everything to get the damn thing not to blackscreen








What I still don't get is that AMD declared that 95°C on the core (and 100+°C on the VRMs) would be absolutely fine...

Why are we all here trying to lower our temps, then? Is it because we know that not all parts can handle those temps very well, despite what AMD is trying to let us believe?

BTW: What's the function of VRM1?


----------



## Paul17041993

Quote:


> Originally Posted by *Mercy4You*
> 
> *Edited*: Underclocking the memory made it more stable, and that's because the VRM would heat up less, right?


Quote:


> Originally Posted by *Forceman*
> 
> The memory is powered by a different set of VRMs (VRM2) so reducing the memory clock isn't likely doing anything for your VRM temps (plus it's still the same voltage).


It would and it wouldn't. Higher clocks pull more power, but I doubt it's enough to change the RAM VRM behaviour; usually it's only the main VRMs you have to worry about. At the same time, though, the RAM VRMs collect heat coming off the rest of the hardware, so they can still overheat fairly easily.

Quote:


> Originally Posted by *Mercy4You*
> 
> Well, I wasn't totally sure that reducing memory clocks would give more stability, but I tried about everything to get the damn thing not to blackscreen
> 
> 
> 
> 
> 
> 
> 
> 
> What I still don't get is that AMD declared that 95°C on the core (and 100+°C on the VRMs) would be absolutely fine...
> 
> Why are we all here trying to lower our temps, then? Is it because we know that not all parts can handle those temps very well, despite what AMD is trying to let us believe?
> 
> BTW: What's the function of VRM1?


The temps at stock are fine really, but when overclocking you want your hardware as cool as possible for the best results; even a few degrees can mean better regulation, less droop and better clocks. At the same time, the sheer size of these chips means that if you dropped the temp to <40°C instead of 95°C, you would actually lose some 20W or more of your power draw while running higher clocks. Here's an example:
http://www.legitreviews.com/nzxt-kraken-g10-gpu-water-cooler-review-on-an-amd-radeon-r9-290x_130344/4
(pretty sure I or someone else already posted it somewhere)


----------



## kizwan

Quote:


> Originally Posted by *jamaican voodoo*
> 
> @kizwan the advantage is that you might be able to overclock the card a little further using a different BIOS than stock... also it's a free overclock: 1000MHz core / 1300MHz mem vs. the stock 947MHz core / 1250MHz mem


Flashed my cards with Sapphire 290 TRI-X OC VBIOS just now.











Quote:


> Originally Posted by *Forceman*
> 
> The memory is powered by a different set of VRMs (VRM2) so reducing the memory clock isn't likely doing anything for your VRM temps (plus it's still the same voltage).


Voltage will be the same, but current draw will be less, power consumption will be less & heat output will be less. This will definitely help lower the VRM2 temp. Also, we know core voltage helps memory overclocking too, so it will help lower VRM1 temps as well.
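The reasoning above follows from switching power scaling roughly linearly with clock at a fixed voltage (P ≈ C·V²·f); a rough illustrative sketch with made-up constants, not measured values:

```python
# Illustrative dynamic-power scaling: at a fixed voltage, switching power
# scales roughly linearly with clock (P ≈ C_eff * V^2 * f), so dropping the
# memory clock cuts current draw and VRM heat even at the same voltage.
# The capacitance constant and voltage here are arbitrary placeholders.

def dynamic_power(c_eff: float, volts: float, freq_mhz: float) -> float:
    """Approximate switching power: P = C_eff * V^2 * f."""
    return c_eff * volts ** 2 * freq_mhz

p_stock = dynamic_power(0.01, 1.5, 1250)  # stock memory clock
p_under = dynamic_power(0.01, 1.5, 1000)  # underclocked, same voltage

print(round(p_under / p_stock, 3))  # 0.8 -> ~20% less power, same voltage
```

This is only the switching component; static leakage (which depends on temperature rather than clock) is ignored, which is why the real-world saving is an approximation.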
Quote:


> Originally Posted by *Widde*
> 
> This was my gpu usage in furmark http://piclair.com/husar Noticed it was dropping to 0% on and off, RMA needed? or drivers/program messing? nvm in 3dmark it seems fine


Don't worry about the GPU usage. It doesn't reflect the cards' actual performance at all.

You can try swapping the cards (second GPU to first slot & first GPU to second slot) and see whether it evens out the GPU usage for both cards.


----------



## lol.69

Well, VRM1 temps are something that concerns me a lot.

At stock clocks, playing BF3 MP, after about 2 hours my VRM1 temp was 90+°C... (these were some tests I did last week before I ran into black screens)

Last night I tested some overclocking again. I used Afterburner, set voltage and power to 100, and put the core at 1150.
With Vsync on, the card never passed 64°C on the core. Played about 1h30, then I turned Vsync off and the core temp jumped to 83°C after about 1h..

Now I am afraid to install GPU-Z to check the VRM temps, because I was having black screens, and since I uninstalled CPU-Z and GPU-Z the problem disappeared..

I have searched but still *could not find* any hints about the max temps of these DIGI+ things...

Also I need some time to mess with Afterburner; overclocking is very new to me..

My card is an Asus 290X DCII.


----------



## Mercy4You

Quote:


> Originally Posted by *lol.69*
> 
> At stock clocks, playing BF3 MP, after about 2 hours my VRM1 temp was 90+°C... (these were some tests I did last week before I ran into black screens)
> My card is an Asus 290X DCII.


Let me guess, you have Elpida memory...








download here: http://www.techpowerup.com/downloads/2326/asus-radeon-memoryinfo-1-005/

And forget about programs like CPU-Z or GPU-Z or whatever causing blackscreens.
The blackscreens are random, so you can never tell what causes them.

I changed cards 3 weeks ago and haven't seen a single BS since, while overclocking and running multiple programs.

You can read more here: http://www.overclock.net/t/1441349/290-290x-black-screen-poll


----------



## lol.69

Quote:


> Originally Posted by *Mercy4You*
> 
> Let me guess, you have Elpida memory...
> 
> 
> 
> 
> 
> 
> 
> 
> download here: http://www.techpowerup.com/downloads/2326/asus-radeon-memoryinfo-1-005/
> 
> And forget about programs like CPU-Z or GPU-Z or whatever causing blackscreens.
> The blackscreens are random, so you can never tell what causes them.
> 
> I changed cards 3 weeks ago and haven't seen a single BS since, while overclocking and running multiple programs.
> 
> You can read more here: http://www.overclock.net/t/1441349/290-290x-black-screen-poll/530


I will check when I get home, thanks!

Regarding black screens: I bought the card, and with Catalyst 13.12 or 14.1 I had black screens or screen tearing randomly. I went to the retailer and they ran tests (Valley, it seems) and called me after four days saying they had not encountered any problem, and asked for my PC to test with. They tested on Win 7 x64 and then on my PC, two more days of waiting, and nothing... They uninstalled Asus GPU Tweak (forgot to mention that earlier). I really don't know what happened. I also checked the way they routed the power cables from the PSU to the card; everything was the same.

And yes, after reading that thread I'm not that convinced this is a software issue.. Need to test more..
In Portugal you simply can't return things.. I'm pissed, because the card was connected to my PC for 6 hours at most. I recorded some videos and sent them to them, and they said they could replace this one with the Tri-X version, and then after four days called me saying they had not encountered any problems.


----------



## THE_Shev

Received my 2nd Sapphire R9 290 Tri-X OC this morning.
GPU-Z shows 2560 shaders for the 1st card, 2816 for the 2nd.
I ordered the regular version, but this suggests it's an X version.
So very confused


----------



## Mercy4You

Quote:


> Originally Posted by *lol.69*
> 
> I will check when I get home, thanks!
> 
> Regarding black screens: I bought the card, and with Catalyst 13.12 or 14.1 I had black screens or screen tearing randomly.
> 
> And yes, after reading that thread I'm not that convinced this is a software issue.. Need to test more..
> In Portugal you simply can't return things.. I'm pissed, because the card was connected to my PC for 6 hours at most. I recorded some videos and sent them to them, and they said they could replace this one with the Tri-X version, and then after four days called me saying they had not encountered any problems.


Some of us 'blackscreeners' got the same cards returned after RMA. That's because the blackscreens occur randomly, like once a day or once a week, and in testing that can easily be missed.

I had about 25 blackscreens in 2 months' time, and I am rather sure it's a hardware problem or hardware related (memory/temp).

I advise you to insist on replacing the card. You have Elpida memory, but most cards with Elpida are good. There's probably been a bad batch of memory modules; Elpida claims there are counterfeit modules on the market.

If they offer you the Tri-X instead (which is on Hynix memory), grab that card out of their hands before they think twice


----------



## Abyssic

Quote:


> Originally Posted by *Mercy4You*


Hey man, I just tested my card at stock voltage (which is still 1.2V Oo) with 1100MHz core and 1400MHz mem, and my VRM temps just went down about 7°C :/

And something else: when I booted my PC yesterday, it blackscreened as soon as AB started up to apply my OC. My card ran fine with 1600MHz on the mem, but now it immediately blackscreens at 1500MHz or more (that's why I now run 1400MHz). Should I RMA it?


----------



## THE_Shev

Quote:


> Originally Posted by *THE_Shev*
> 
> Received my 2nd Sapphire R9 290 Tri-X OC this morning.
> GPU-Z shows 2560 shaders for the 1st card, 2816 for the 2nd.
> I ordered the regular version, but this suggests it's an X version.
> So very confused


Ran the card separately and it showed 2560 shaders, so it's a bug in GPU-Z then.


----------



## kizwan

Quote:


> Originally Posted by *THE_Shev*
> 
> Quote:
> 
> 
> 
> Originally Posted by *THE_Shev*
> 
> Received my 2nd Sapphire R9 290 Tri-X OC this morning.
> GPU-Z shows 2560 shaders for the 1st card, 2816 for the 2nd.
> I ordered the regular version, but this suggests it's an X version.
> So very confused
> 
> 
> 
> Ran the card separately and it showed 2560 shaders, so it's a bug in GPU-Z then.

When ULPS is enabled, which it is by default, it makes GPU-Z read the wrong shader value. Disable ULPS using MSI Afterburner v3 beta 18; after ULPS is disabled, GPU-Z will detect the correct shader count.

Also, when ULPS is enabled, the fans on the 2nd card in Crossfire will not spin when idle (or when not running 3D apps that use Crossfire); at least this is true for a reference card with the reference cooler.


----------



## Brian18741

What sort of temps are people getting in Crossfire? I benched Heaven 4.0 and my top card hit 93°C with the fans running over 60%... loud!! The bottom card was mid 80s. This is after just one 2- or 3-minute run of Heaven 4.0!

I'm using two Asus R9 290 DirectCU II.

Also, what PSUs are people running these on? Currently running them on an XFX 750W Pro series, but my power monitor shows about 650W (peaked at 720W after 10 min of FurMark) being drawn from the wall. I'm looking at replacing it with an 850W... but should I stretch to 1000W?
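As a rough sanity check on PSU sizing: wall-meter readings include the PSU's conversion losses, so the DC load the PSU actually has to deliver is lower than the wall figure. An illustrative sketch using the ~720W wall peak mentioned (the ~90% efficiency is an assumption, not a measured value):

```python
# Rough PSU headroom math. Wall draw includes conversion losses, so
# DC load = wall draw * efficiency. Efficiency figure is assumed, not measured.

def dc_load(wall_watts: float, efficiency: float) -> float:
    """Estimated DC power delivered to components for a given wall draw."""
    return wall_watts * efficiency

def headroom(psu_rating: float, wall_watts: float, efficiency: float) -> float:
    """Rated capacity left over after the estimated DC load."""
    return psu_rating - dc_load(wall_watts, efficiency)

load = dc_load(720, 0.90)   # assume ~90% efficiency at this load
print(round(load))          # 648 W of actual DC load
print(round(headroom(850, 720, 0.90)))  # ~202 W spare on an 850 W unit
```

By this estimate an 850W unit already has real headroom at today's load; the case for 1000W is future overclocking, which raises both draw and transient spikes.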


----------



## Mercy4You

Quote:


> Originally Posted by *Abyssic*
> 
> Hey man, I just tested my card at stock voltage (which is still 1.2V Oo) with 1100MHz core and 1400MHz mem, and my VRM temps just went down about 7°C :/
> 
> And something else: when I booted my PC yesterday, it blackscreened as soon as AB started up to apply my OC. My card ran fine with 1600MHz on the mem, but now it immediately blackscreens at 1500MHz or more (that's why I now run 1400MHz). Should I RMA it?


Not so fast







It blackscreens now at stock voltage, but with a *+20% overclock on memory*?!

I haven't even tried 1500 memory. I run mine now at stock voltage, 1100/1400/130% power.

My temps: core 75°C and VRM1 under 85°C. So I'm a happy gamer









Think your card's just fine


----------



## Sgt Bilko

Quote:


> Originally Posted by *Brian18741*
> 
> What sort of temps are people getting in Crossfire? I benched Heaven 4.0 and my top card hit 93°C with the fans running over 60%... loud!! The bottom card was mid 80s. This is after just one 2- or 3-minute run of Heaven 4.0!
> 
> I'm using two Asus R9 290 DirectCU II.
> 
> Also, what PSUs are people running these on? Currently running them on an XFX 750W Pro series, but my power monitor shows about 650W (peaked at 720W after 10 min of FurMark) being drawn from the wall. I'm looking at replacing it with an 850W... but should I stretch to 1000W?


If you are doing any sort of benching or running both cards at 100% then you want 70-80% fan speed minimum.

As for the PSU, I'd recommend a 850w at least, spring for the 1000w if it's decent quality and you have the cash for it though

never cheap out on your PSU









BTW, what are the VRM temps like on the DCU II models?


----------



## Mercy4You

Quote:


> Originally Posted by *Brian18741*
> 
> What sort of temps are people getting in crossfire? I benched Heaven 4.0 and my top card hit 93*C with the fans running over 60% ....... loud!! Bottom card was mid 80s. This is after just one 2 or 3 minute run benching Heaven 4.0!
> 
> I'm using two Asus R9 290 DirectCU


Honestly, I now think Crossfire or SLI should be under water. I almost did what you've just done, but then I read this review:
http://www.guru3d.com/articles_pages/radeon_r9_290_crossfire_review_benchmarks,23.html

I was not fully aware that an idle card producing, say, 32 dB(A), which is very quiet, is in no way comparable to 2 of those cards together. Let alone 2 cards under load, when the lower one is heating up the upper one..

Conclusion: you have to deal with heat, a.k.a. *noise*


----------



## Brian18741

Quote:


> Originally Posted by *Sgt Bilko*
> 
> If you are doing any sort of benching or running both cards at 100% then you want 70-80% fan speed minimum.
> 
> As for the PSU, I'd recommend a 850w at least, spring for the 1000w if it's decent quality and you have the cash for it though
> 
> never cheap out on your PSU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> btw, whats the vrm temps like on the DCU II models?


I'm kinda broke after recent upgrades (2x R9 290s and a QNIX 1440p monitor!) but can get a Corsair RM1000 for the not unreasonable price of around €160. I can get a nice 850W for about €100, though. Might just grab the bull by the horns, get the RM1000 and hope it doesn't end in divorce!









Can't remember VRM temps off the top of my head, in work now, will check later and upload the results.
Quote:


> Originally Posted by *Mercy4You*
> 
> Honestly, I now think crossfire or sli should be under water. I almost did what you've just done, but then I read this review :
> http://www.guru3d.com/articles_pages/radeon_r9_290_crossfire_review_benchmarks,23.html
> 
> Conclusion, you have to deal with heat a.k.a. *noise*


This was my original plan, stick to 1080p and get one 290 and WC the lot. But the upgrade bug got a hold of me and I got a bit carried away! Eventually I will probably put it all under water but realistically it will be 6 months minimum before that's an option financially.


----------



## Abyssic

Quote:


> Originally Posted by *Mercy4You*
> 
> Not so fast
> 
> 
> 
> 
> 
> 
> 
> It blackscreens now at stock voltage, but with a *+20% overclock on memory*?!
> 
> I haven't even tried 1500 memory. I run mine now at stock voltage, 1100/1400/130% power.
> 
> My temps: core 75°C and VRM1 under 85°C. So I'm a happy gamer
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Think your card's just fine


Nah xD, it blackscreened with my "old" settings (+100mV, 1140MHz core, 1600MHz mem), but also with my current settings (stock volts, 1100MHz core) when I pushed the mem over 1500MHz. Remember, the settings I mentioned first worked for a day or so under all kinds of stress.


----------



## Mercy4You

Quote:


> Originally Posted by *Abyssic*
> 
> Nah xD, it blackscreened with my "old" settings (+100mV, 1140MHz core, 1600MHz mem), but also with my current settings (stock volts, 1100MHz core) when I pushed the mem over 1500MHz. Remember, the settings I mentioned first worked for a day or so under all kinds of stress.


I've heard about cards blackscreening that did not BS before...

If it's fully stable at 1100/1400, I wouldn't RMA it. Keep it at these settings for a while and see what happens. Did it blackscreen while gaming? Because I had some blackscreens when returning from sleep mode...


----------



## Iniura

Quote:


> Originally Posted by *Brian18741*
> 
> I'm kinda broke after recent upgrades (2x R9 290's and QNIX 1440p monitor!) but can get a Corsair RM 1000 for the not unreasonable price of around €160. I can get a nice 850w for about €100 though. Might just grab the bull by the horns and get the RM1000 and hope it doesn't end in divorce!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can't remember VRM temps off the top of my head, in work now, will check later and upload the results.
> This was my original plan, stick to 1080p and get one 290 and WC the lot. But the upgrade bug got a hold of me and I got a bit carried away! Eventually I will probably put it all under water but realistically it will be 6 months minimum before that's an option financially.


If I were you, I would search for some info on the RM1000 here on OCN; it is a very bad PSU and totally not worth the money, if I recall correctly. Ask for advice in the ''FAQ: Recommended Power Supplies'' thread.


----------



## Abyssic

Quote:


> Originally Posted by *Mercy4You*
> 
> I've heard about cards blackscreening that did not BS before...
> 
> If it's fully stable at 1100/1400 I wouldn't RMA it. Keep it at these setting for a while and see what happens. Did it blackscreen while gaming, because I had some blackscreens when returning from sleeping mode...


The first blackscreen was a few minutes after booting, and from then on, every time I booted, the OC got applied, then blackscreen. I had to boot into safe mode and uninstall AB to get rid of the OC; then I could reboot normally with stock settings.


----------



## taem

Quote:


> Originally Posted by *Brian18741*
> 
> What sort of temps are people getting in crossfire? I benched Heaven 4.0 and my top card hit 93*C with the fans running over 60% ....... loud!! Bottom card was mid 80s. This is after just one 2 or 3 minute run benching Heaven 4.0!
> 
> I'm using two Asus R9 290 DirectCU II.
> 
> Also what PSU's are people running these on? Currently running them on an XFX 750w Pro series but my power monitor shows about 650w (peaked at 720w after 10 min for FurMark) being drawn form the wall. I'm looking at replacing it with an 850w ... but should I stretch to 1000W?


I want to Crossfire too, which is why I waited so long trying to find the best cooler. I think the PowerColor PCS+ is good, on par with the Tri-X. I assume 15-20°C higher temps on the GPU with Crossfire, so a core in the mid 60s should work.

I'd get a 1000W. With the 4670K at 4.6 and the 290 at stock, my draw peaks at 386W. OC will take that up a lot. The CoolerMaster V1000 is the PSU to get, imho. Can find it for $189-190.


----------



## Mercy4You

Quote:


> Originally Posted by *Abyssic*
> 
> The first blackscreen was a few minutes after booting, and from then on, every time I booted, the OC got applied, then blackscreen. I had to boot into safe mode and uninstall AB to get rid of the OC; then I could reboot normally with stock settings.


That's definitely a memory issue!
I had the same problem with my previous card when I tried to overclock the memory from 1250 to 1400MHz...

It could also be driver corruption. Try reinstalling 13.12, and use Display Driver Uninstaller first.


----------



## Falkentyne

Quote:


> Originally Posted by *Abyssic*
> 
> Nah xD, it blackscreened with my "old" settings (+100mV, 1140MHz core, 1600MHz mem), but also with my current settings (stock volts, 1100MHz core) when I pushed the mem over 1500MHz. Remember, the settings I mentioned first worked for a day or so under all kinds of stress.


The reason it worked before at 1500 or 1600 MHz is because the 2D voltage was running at 3d volts--there was either load on the card, or a monitoring program was causing the card to run at 3D clocks/speeds instead of 2D. You blue screen when the 2D idle voltage is TOO LOW for the current memory voltage when it ramps up.


----------



## Abyssic

Quote:


> Originally Posted by *Mercy4You*
> 
> That's definitely a memory issue!
> I had the same problem with my previous card when I tried to overclock the memory from 1250 to 1400MHz...
> 
> It could also be driver corruption. Try reinstalling 13.12, and use Display Driver Uninstaller first.


Already did that before I applied my new OC settings, and it still blackscreened at 1500MHz right away.


----------



## Bartouille

Does anyone know why these cards artifact on the desktop when the memory is clocked too high, and why it eventually goes away when you start a game? Also, why does it increase your idle voltage when you increase the voltage? To me this is one of the most stupid things about these cards, and I feel like selling mine and getting a 780 Ti or something. This is a step backward compared to the previous-gen cards. Why???


----------



## Abyssic

Quote:


> Originally Posted by *Falkentyne*
> 
> The reason it worked before at 1500 or 1600 MHz is because the 2D voltage was running at 3d volts--there was either load on the card, or a monitoring program was causing the card to run at 3D clocks/speeds instead of 2D. You blue screen when the 2D idle voltage is TOO LOW for the current memory voltage when it ramps up.


Sorry, I'm afraid I don't quite understand (I'm German xD).

MSI Afterburner with its RivaTuner implementation was running, but it normally doesn't bring the card to load clocks when it should be idling. And just to make it clear: it's going into a black screen (with a brief image distortion before), not a BSOD.


----------



## Forceman

Quote:


> Originally Posted by *Bartouille*
> 
> Does anyone know why these cards artifact on the desktop when the memory is clocked too high, and why it eventually goes away when you start a game? Also, why does it increase your idle voltage when you increase the voltage? To me this is one of the most stupid things about these cards, and I feel like selling mine and getting a 780 Ti or something. This is a step backward compared to the previous-gen cards. Why???


It artifacts because it is trying to run the higher memory clock speed at the idle voltage, which is too low. Once it goes to 3D volts the problem goes away. They really need an intermediate memory frequency to use in 2D; there's no reason for it to need 1250 memory to run Firefox or YouTube.

Increasing the voltage increases the idle volts because it is an offset voltage that is applied across all frequency/voltage ranges. On the plus side, that often fixes the first problem of the too-low idle volts for the memory.
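The offset behaviour described above can be pictured as adding a constant to every entry in the card's clock/voltage (power-state) table, which is why the idle state rises along with the 3D state; a sketch with hypothetical table values, not the card's real DPM table:

```python
# Sketch of an offset overvolt: the slider adds a constant to EVERY
# power state, idle included. MHz -> volts values below are made up.

DPM_TABLE = {300: 0.968, 500: 1.050, 947: 1.211}

def apply_offset(table: dict, offset_mv: int) -> dict:
    """Return a new clock/voltage table with the offset applied to all states."""
    return {clk: round(v + offset_mv / 1000.0, 3) for clk, v in table.items()}

bumped = apply_offset(DPM_TABLE, 100)  # e.g. +100 mV in Afterburner
print(bumped[300])  # 1.068 -> the idle state's voltage rose too
```

That side effect is exactly why a big offset can cure desktop artifacting from an aggressive memory clock: the idle state is no longer starved of voltage.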


----------



## Mercy4You

Quote:


> Originally Posted by *Abyssic*
> 
> Sorry, I'm afraid I don't quite understand (I'm German xD).
> 
> MSI Afterburner with its RivaTuner implementation was running, but it normally doesn't bring the card to load clocks when it should be idling. And just to make it clear: it's going into a black screen (with a brief image distortion before), not a BSOD.


Lol, I didn't understand it either (I'm Dutch)









You know Abyssic, *A LOT* of R9 290X owners would kill for a memory speed of 1400MHz!


----------



## Brian18741

Quote:


> Originally Posted by *Iniura*
> 
> If I were you, I would search for some info on the RM1000 here on OCN; it is a very bad PSU and totally not worth the money, if I recall correctly. Ask for advice in the ''FAQ: Recommended Power Supplies'' thread.


Quote:


> Originally Posted by *taem*
> 
> I'd get a 1000w. With 4670k at 4.6 and 290 at stock my draw peaks at 386w. OC will take that up a lot. CoolerMaster V1000 is the psu to get imho. Can find it for $189-190.


Thanks for the help guys. I asked in the FAQ: Recommended Power Supplies thread and was steered towards the Cooler Master V850. Been doing a bit of research and it seems like a winner! Given the temps I'm getting, I won't be overclocking enough to require more voltage, so we should be good at 850W


----------



## kcuestag

So what's a basic OC on these 290's to start at with default voltage? I want to see what they can do. Maybe 1050? 1100?


----------



## Ukkooh

It should do at least 1050 on the core at stock voltage unless it is a really bad one, so I'd start there.


----------



## kcuestag

Quote:


> Originally Posted by *Ukkooh*
> 
> It should do at least 1050 on the core at stock voltage unless it is a really bad one, so I'd start there.


So similar to 290X's then, cheers.









I'll give them a try with some Battlefield 4.


----------



## Pooms

Hi everyone, I need some help with Mantle in the Star Swarm test:

I have an R9 290 + FX-8150 @ 4.2GHz + 8GB RAM (1x8) at 1600 + a CoolerMaster 725W Extreme2 PSU

and I got only 45 avg FPS...!! I saw other rigs' results saying they got 70 avg FPS!!!

What is the problem? Is it CPU or GPU bottlenecking, or is that a normal result?

===========================================================
Oxide Games
Star Swarm Stress Test - ©2013
C:\Users\Pooms\Documents\Star Swarm\Output_14_02_15_0342.txt
Version 1.10
02/15/2014 03:42
===========================================================

== Hardware Configuration =================================
GPU: AMD Radeon R9 200 Series
CPU: AuthenticAMD
AMD FX(tm)-8150 Eight-Core Processor
Physical Cores: 4
Logical Cores: 8
Physical Memory: 8587038720
Allocatable Memory: 140737488224256
===========================================================

== Configuration ==========================================
API: Mantle
Scenario: ScenarioFollow.csv
User Input: Disabled
Resolution: 1920x1080
Fullscreen: True
GameCore Update: 16.6 ms
Bloom Quality: High
PointLight Quality: High
ToneCurve Quality: High
Glare Overdraw: 16
Shading Samples: 64
Shade Quality: Mid
Deferred Contexts: Disabled
Temporal AA Duration: 16
Temporal AA Time Slice: 2
Detailed Frame Info: C:\Users\Pooms\Documents\Star Swarm\FrameDump_14_02_15_0342.csv
===========================================================

== Results ================================================
Test Duration: 360 Seconds
Total Frames: 16435

Average FPS: 45.65
Average Unit Count: 4276
Maximum Unit Count: 5576
Average Batches/MS: 737.44
Maximum Batches/MS: 3571.49
Average Batch Count: 19221
Maximum Batch Count: 151856
===========================================================
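For what it's worth, the Average FPS line in these logs is just Total Frames divided by Test Duration, so a pasted result can be sanity-checked directly; a quick Python sketch using the numbers from the log above:

```python
# Sanity-check a Star Swarm results block: the reported Average FPS
# should equal Total Frames / Test Duration.
test_duration_s = 360      # "Test Duration: 360 Seconds"
total_frames = 16435       # "Total Frames: 16435"
reported_avg_fps = 45.65   # "Average FPS: 45.65"

computed_fps = total_frames / test_duration_s
print(round(computed_fps, 2))  # 45.65, matches the reported average
```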


----------



## Ukkooh

Quote:


> Originally Posted by *kcuestag*
> 
> So similar to 290X's then, cheers.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll give them a try with some Battlefield 4.


BF4 isn't a good stability test for these cards. I was able to play BF4 for hours with clocks that artifacted in two minutes in Valley.


----------



## Abyssic

Quote:


> Originally Posted by *Ukkooh*
> 
> BF4 isn't a good stability test for these cards. I was able to play BF4 for hours with clocks that artifacted in two minutes in Valley.


Try Far Cry 3. Make sure Post FX is at maximum and you can see the slightest instability in the mountains afar. There will be little red dots flickering around with a slightly-too-high overclock, or the usual checkerboard artifacts if the OC is way too high.


----------



## Abyssic

Quote:


> Originally Posted by *Mercy4You*
> 
> Lol, I didn't understand it either (I'm Dutch)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You know Abyssic, *A LOT* of R9290X owners would kill for a memory speed of 1400Mhz!


oh? good to know ^^


----------



## taem

Quote:


> Originally Posted by *Abyssic*
> 
> Try Far Cry 3. Make sure Post FX is at maximum and you can see the slightest instability in the mountains afar. There will be little red dots flickering around with a slightly-too-high overclock, or the usual checkerboard artifacts if the OC is way too high.


Far Cry 3 is the most demanding OC stability test I've found, lol. I can pass a bunch of synthetics like Heaven, Valley and 3DMark, and play other games, but Far Cry 3 will crash.
Quote:


> Originally Posted by *Brian18741*
> 
> Thanks for the help guys. I asked in the FAQ: Recommended Power Supplies thread and was steered towards the Cooler Master V850. Been doing a bit of research and it seems like a winner! Given the temps I'm getting I won't be overclocking enough to require more voltage, so we should be good at 850W.


Well, I'll just say I thought the same thing and got an 850W; it was on sale for $109, and it's a well-reviewed Seasonic rebrand. But now I find myself having to upgrade to a 1000W. A PSU will last you several builds, so I would still suggest a 1000W. What price can you get the 850W for? The Cooler Master V1000 gets a 9.7 from jonnyGURU and costs $185 at NCIX and SuperBiiz, $209 on Amazon. If the price difference isn't big I would seriously urge you to go for the 1000W.


----------



## AmcieK

http://imgur.com/9ncK0me


Guys, can you help me? I have random FPS drops in BF4, look at the screenshot. It's on high settings, scale 100%, V-Sync on, 13.12.


----------



## the9quad

Quote:


> Originally Posted by *AmcieK*
> 
> 
> 
> http://imgur.com/9ncK0me
> 
> 
> Guys, can you help me? I have random FPS drops in BF4, look at the screenshot. It's on high settings, scale 100%, V-Sync on, 13.12.


You don't have a problem; it's just the way the card reports usage to Afterburner. Read the AB release notes for beta 18 and there's a way you can make it look flatter.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *AmcieK*
> 
> 
> 
> http://imgur.com/9ncK0me
> 
> 
> Guys, can you help me? I have random FPS drops in BF4, look at the screenshot. It's on high settings, scale 100%, V-Sync on, 13.12.


Also, any time you cap FPS the card's usage will look like this. But yeah, AB can't read usage properly.
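A rough way to see why a capped card reports low usage: if a frame renders faster than the cap's frame interval, the GPU idles for the rest of that interval. A toy Python model (the function name and the 10ms render time are made up for illustration, not measured on a 290):

```python
# Toy model of average GPU utilization under an FPS cap (e.g. V-Sync at 60Hz).
# A frame that renders faster than the cap's frame interval leaves the GPU
# idle for the remainder, so reported "usage" sits well below 100%.
def utilization_under_cap(render_time_ms: float, cap_fps: float) -> float:
    frame_interval_ms = 1000.0 / cap_fps  # time budget per frame under the cap
    return min(1.0, render_time_ms / frame_interval_ms)

# Hypothetical example: a card that renders a frame in 10ms, capped at 60fps
print(utilization_under_cap(10.0, 60.0))  # ~0.6 -> the graph hovers near 60%
```

With no cap (or a render time longer than the interval) the model saturates at 1.0, which is why uncapped benchmarks show a flat 100% line.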


----------



## AmcieK

OK, then why only 49 FPS? A bug in the game? I have more strange drops.


----------



## sugarhell

A bug because you have a frame drop? Look, your CPU is at almost 100% at some points. You also have high settings with V-Sync. What do you expect? Do you even know what V-Sync does?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *AmcieK*
> 
> OK, then why only 49 FPS? A bug in the game? I have more strange drops.


Looks like your card is down clocking


----------



## MrWhiteRX7

Just realized he says "random" FPS drop..... it's all good, man. This is normal. Your CPU is maxing out in these areas.


----------



## kizwan

Quote:


> Originally Posted by *the9quad*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *AmcieK*
> 
> 
> 
> http://imgur.com/9ncK0me
> 
> 
> Guys, can you help me? I have random FPS drops in BF4, look at the screenshot. It's on high settings, scale 100%, V-Sync on, 13.12.
> 
> 
> 
> 
> 
> 
> 
> You don't have a problem; it's just the way the card reports usage to Afterburner. Read the AB release notes for beta 18 and there's a way you can make it look flatter.
Click to expand...

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *AmcieK*
> 
> 
> 
> http://imgur.com/9ncK0me
> 
> 
> Guys, can you help me? I have random FPS drops in BF4, look at the screenshot. It's on high settings, scale 100%, V-Sync on, 13.12.
> 
> 
> 
> 
> 
> 
> 
> Also, any time you cap FPS the card's usage will look like this. But yeah, AB can't read usage properly.
Click to expand...

@AmcieK is asking about FPS drop, not GPU usage drop.

@AmcieK, your CPU usage is almost 100%. The FPS drops because of a CPU bottleneck. Strange, I thought an i5 could handle one GPU. Try overclocking your i5.


----------



## MrWhiteRX7

Are you on Windows 7? Sorry if it's in the sig I'm on mobile while playing Titanfall lol.


----------



## Brian18741

Quote:


> Originally Posted by *Sgt Bilko*
> 
> btw, whats the vrm temps like on the DCU II models?


After an hour and 10 minutes gaming my temps were the following.

Card 1 89°C
VRM 1 89°C
VRM 2 77°C

Card 2 75°C
VRM 1 78°C
VRM 2 72°C

How do they look? It got quite toasty in here, had to turn the heating off!


----------



## AmcieK

8.1 64-bit. I'll try overclocking my processor tomorrow and see what happens. My last card was a GTX 760; I played on high settings and didn't have as many drops as now. I formatted the HDD after buying the new card and installed Catalyst 14.1; it was worse than it is now on 13.12. Maybe my PSU is weak?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Brian18741*
> 
> After an hour and 10 minutes gaming my temps were the following.
> 
> Card 1 89°C
> VRM 1 89°C
> VRM 2 77°C
> 
> Card 2 75°C
> VRM 1 78°C
> VRM 2 72°C
> 
> How do they look? It got quite toasty in here, had to turn the heating off!


That's not much different from my DD cards.....

I would have thought that Asus would have done better than that.

What ambient was that at?


----------



## Abyssic

Quote:


> Originally Posted by *kizwan*
> 
> @AmcieK is asking about FPS drop, not GPU usage drop.
> 
> @AmcieK, your CPU usage is almost 100%. The FPS drops because of a CPU bottleneck. Strange, I thought an i5 could handle one GPU. Try overclocking your i5.


My setup is handling BF4 just fine, no CPU limit encountered, so his CPU limiting would be very strange. Oo


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Brian18741*
> 
> After an hour and 10 minutes gaming my temps were the following.
> 
> Card 1 89°C
> VRM 1 89°C
> VRM 2 77°C
> 
> Card 2 75°C
> VRM 1 78°C
> VRM 2 72°C
> 
> How do they look? It got quite toasty in here, had to turn the heating off!
> 
> 
> 
> That's not much different from my DD cards.....
> 
> I would have thought that Asus would have done better than that.
> 
> What ambient was that at?
Click to expand...

Should have run the fans at 100% for a fair comparison. That way you're comparing the full cooling capability of both coolers.

/rant on
Seriously, not one manufacturer has gotten the cooling right on the 290/290X, especially VRM1. Everyone failed except Aqua Computer.
/rant off

Well, if I don't overclock, my VRM1 temps are pretty good. They're in the 50s Celsius when playing BF4 at 31°C ambient.
Quote:


> Originally Posted by *Abyssic*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> @AmcieK is asking about FPS drop, not GPU usage drop.
> 
> @AmcieK, your CPU usage is almost 100%. The FPS drops because of a CPU bottleneck. Strange, I thought an i5 could handle one GPU. Try overclocking your i5.
> 
> 
> 
> My setup is handling BF4 just fine, no CPU limit encountered, so his CPU limiting would be very strange. Oo
Click to expand...

Yes, that is very strange. CPU usage in Windows 8 will be higher than Windows 7 but it shouldn't bottleneck with one card. What is your CPU usage when playing BF4?


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> Should have run the fans at 100% for a fair comparison. That way you're comparing the full cooling capability of both coolers.
> 
> /rant on
> Seriously, not one manufacturer has gotten the cooling right on the 290/290X, especially VRM1. Everyone failed except Aqua Computer.
> /rant off


Good point, forgot to ask about fan speed.

I figured that Asus would have done the cooling right though, considering they seem to have a higher price tag than everyone else.

I'm really tempted to just take the cooler off these cards and damn the warranty, just to see what's going on in terms of cooling.


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> *I figured that Asus would have done the cooling right though, considering they seem to have a higher price tag than everyone else.*
> 
> I'm really tempted to just take the cooler off these cards and damn the warranty, just to see what's going on in terms of cooling


I fully agree with you. They should perform better with that price tag.

XFX still hasn't gotten back to you on whether you can take the cooler off?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Pooms*
> 
> Hi everyone, please, I need some help with Mantle on the Star Swarm test:
> 
> I have an R9 290 + FX-8150 @ 4.2GHz + 8GB RAM (1x8GB, 1600MHz) + a CoolerMaster Extreme2 725W PSU,
> 
> and I've got only 45 avg FPS! I saw other rigs' results saying they got 70 avg FPS!
> 
> What is the problem? Is the CPU or GPU bottlenecking, or is this a normal result?
> 
> ===========================================================
> Oxide Games
> Star Swarm Stress Test - ©2013
> C:\Users\Pooms\Documents\Star Swarm\Output_14_02_15_0342.txt
> Version 1.10
> 02/15/2014 03:42
> ===========================================================
> 
> == Hardware Configuration =================================
> GPU: AMD Radeon R9 200 Series
> CPU: AuthenticAMD
> AMD FX(tm)-8150 Eight-Core Processor
> Physical Cores: 4
> Logical Cores: 8
> Physical Memory: 8587038720
> Allocatable Memory: 140737488224256
> ===========================================================
> 
> == Configuration ==========================================
> API: Mantle
> Scenario: ScenarioFollow.csv
> User Input: Disabled
> Resolution: 1920x1080
> Fullscreen: True
> GameCore Update: 16.6 ms
> Bloom Quality: High
> PointLight Quality: High
> ToneCurve Quality: High
> Glare Overdraw: 16
> Shading Samples: 64
> Shade Quality: Mid
> Deferred Contexts: Disabled
> Temporal AA Duration: 16
> Temporal AA Time Slice: 2
> Detailed Frame Info: C:\Users\Pooms\Documents\Star Swarm\FrameDump_14_02_15_0342.csv
> ===========================================================
> 
> == Results ================================================
> Test Duration: 360 Seconds
> Total Frames: 16435
> 
> Average FPS: 45.65
> Average Unit Count: 4276
> Maximum Unit Count: 5576
> Average Batches/MS: 737.44
> Maximum Batches/MS: 3571.49
> Average Batch Count: 19221
> Maximum Batch Count: 151856
> ===========================================================


Where have you seen other people getting 70fps?

I get about 45fps on the Follow scenario as well.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> I fully agree with you. They should perform better with that price tag.
> 
> XFX still hasn't gotten back to you on whether you can take the cooler off?


Not yet they haven't. I think I might just grab a pair of needle-nose pliers and attempt it that way.

I've still got some small aluminium heatsinks lying around from my AX III, so I should be able to do something for the VRMs.


----------



## Paul17041993

Quote:


> Originally Posted by *AmcieK*
> 
> 
> 
> http://imgur.com/9ncK0me
> 
> 
> Guys, can you help me? I have random FPS drops in BF4, look at the screenshot. It's on high settings, scale 100%, V-Sync on, 13.12.


your CPU is pegged, so the card is flipping between idle and load states waiting for the CPU to feed it after rendering a frame, causing a messy graph.


----------



## Abyssic

Quote:


> Originally Posted by *kizwan*
> 
> Yes, that is very strange. CPU usage in Windows 8 will be higher than Windows 7 but it shouldn't bottleneck with one card. What is your CPU usage when playing BF4?


around 70-80%


----------



## AmcieK

Stock


http://imgur.com/z0Ki5ka

OC 4.4GHz


http://imgur.com/VGLlxF2


----------



## pkrexer

Any way to keep my voltage from reverting back to stock once my monitors go to sleep?


----------



## Sgt Bilko

Quote:


> Originally Posted by *pkrexer*
> 
> Any way to keep my voltage from reverting back to stock once my monitors go to sleep?


Disable ULPS


----------



## chiknnwatrmln

Does anyone know if flashing a different BIOS can increase OC potential?

I heard somewhere it can, but I'm kind of skeptical.


----------



## Spectre-

Pushed my GPU to 1217/1250 with my G10 and H55.

http://www.3dmark.com/3dm11/7978422

Made it to the top 30 for 3DMark 11 in the Top 30 3DMark11 Scores thread.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Spectre-*
> 
> Pushed my GPU to 1217/1250 with my G10 and H55.
> 
> http://www.3dmark.com/3dm11/7978422
> 
> Made it to the top 30 for 3DMark 11 in the Top 30 3DMark11 Scores thread.


I think you can push that card more.

You should be able to beat my score, especially considering you have a hexy. http://www.3dmark.com/3dm11/7941592

This was at 1245/1625 with my 290.

How are your temps?


----------



## Spectre-

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I think you can push that card more.
> 
> You should be able to beat my score, especially considering you have a hexy. http://www.3dmark.com/3dm11/7941592
> 
> This was at 1245/1625 with my 290.
> 
> How are your temps?


I get throttling after 1220. Temps are OK; the highest I have seen the VRMs get is 75.


----------



## lol.69

So I set the memory to 1500MHz on my Asus R9 290X DCII (Elpida RAM).
Also set the core to 1150MHz, and the fan to 100%.

Played BF3 MP for about half an hour with V-Sync off when some artifacts appeared. VRM1 was at 103°C when the artifacts showed up.

My question is: if I put my card on a better cooling solution I probably won't have artifacts, right? If the memory couldn't do those clocks it would crash instantly, right?

Will try 1200 on the core later with memory at stock.


----------



## Paul17041993

Quote:


> Originally Posted by *lol.69*
> 
> So I set the memory to 1500MHz on my Asus R9 290X DCII (Elpida RAM).
> Also set the core to 1150MHz, and the fan to 100%.
> 
> Played BF3 MP for about half an hour with V-Sync off when some artifacts appeared. VRM1 was at 103°C when the artifacts showed up.
> 
> My question is: if I put my card on a better cooling solution I probably won't have artifacts, right? If the memory couldn't do those clocks it would crash instantly, right?
> 
> Will try 1200 on the core later with memory at stock.


Yeah, the artifacts are likely instability from the hot VRMs. And no (to the second question): a lot of the time, clocks that seem stable end up unstable under certain loads or with certain games; there's generally a certain amount of headroom needed to counter the vdroop from clock state changes.


----------



## Brian18741

Quote:


> Originally Posted by *Paul17041993*
> 
> your CPU is pegged, so the card is flipping between idle and load states waiting for the CPU to feed it after rendering a frame, causing a messy graph.


Does this mean the CPU is bottlenecking the GPU? My usage graph looks like that in AB as well, but I have a 3570K @ 4.5GHz; surely that's not a bottleneck?

:EDIT: Hmm, I'm getting driver crashes while watching YouTube clips now. Temps well under control. No real usage on the GPUs. The screen turns to static and flashes a few times until the drivers reset. 13.12


----------



## Paul17041993

Quote:


> Originally Posted by *Brian18741*
> 
> Does this mean the CPU is bottlenecking the GPU? My usage graph looks like that in AB as well, but I have a 3570K @ 4.5GHz; surely that's not a bottleneck?


Not entirely sure, though you do have two 290s, correct? I've heard a lot about BF4 being more CPU-bound than anything else in Crossfire/SLI. Are there any settings in there you could try turning down, like particles and physics?


----------



## rdr09

Quote:


> Originally Posted by *Brian18741*
> 
> Does this mean the CPU is bottlenecking the GPU? My usage graph looks like that in AB as well, but I have a 3570K @ 4.5GHz; surely that's not a bottleneck?
> 
> :EDIT: Hmm, I'm getting driver crashes while watching YouTube clips now. Temps well under control. No real usage on the GPUs. The screen turns to static and flashes a few times until the drivers reset. 13.12


Right-click the video and go to Settings. Make sure to disable hardware acceleration (uncheck it).


----------



## Paul17041993

Just did a couple of Star Swarm runs; nearly a twofold perf increase.
[email protected], 1866 RAM, ref. 290X with 40% fan profile


Spoiler: Star Swarm results



for both;
== Configuration ==========================================

Scenario: ScenarioFollow.csv
User Input: Disabled
Resolution: 1920x1080
Fullscreen: True
GameCore Update: 16.6 ms
Bloom Quality: High
PointLight Quality: High
ToneCurve Quality: High
Glare Overdraw: 16
Shading Samples: 64
Shade Quality: Mid
Deferred Contexts: Disabled
Temporal AA Duration: 16
Temporal AA Time Slice: 2
Detailed Frame Info: Off
===========================================================

direct3D;
== Results ================================================
Test Duration: 360 Seconds
Total Frames: 9486

Average FPS: 26.35
Average Unit Count: 3978
Maximum Unit Count: 5697
Average Batches/MS: 380.51
Maximum Batches/MS: 693.89
Average Batch Count: 15608
Maximum Batch Count: 75970
===========================================================

mantle;
== Results ================================================
Test Duration: 360 Seconds
Total Frames: 18392

Average FPS: 51.09
Average Unit Count: 4380
Maximum Unit Count: 5662
Average Batches/MS: 1010.75
Maximum Batches/MS: 3724.72
Average Batch Count: 22772
Maximum Batch Count: 119590
===========================================================
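The near-twofold claim checks out directly from the two result blocks above: both runs lasted 360 seconds, so the total frame counts (or the reported average FPS) give the Mantle-over-D3D speedup:

```python
# Compare the Direct3D and Mantle runs above. Same 360s test duration,
# so total frames (or average FPS) yield the speedup directly.
d3d_frames, mantle_frames = 9486, 18392      # "Total Frames" from each run
speedup = mantle_frames / d3d_frames
print(round(speedup, 2))  # 1.94 -> just shy of a literal 2x
```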


----------



## Forceman

Quote:


> Originally Posted by *lol.69*
> 
> So I set the memory to 1500MHz on my Asus R9 290X DCII (Elpida RAM).
> 
> Also set the core to 1150MHz, and the fan to 100%.
> 
> Played BF3 MP for about half an hour with V-Sync off when some artifacts appeared. VRM1 was at 103°C when the artifacts showed up.
> 
> My question is: if I put my card on a better cooling solution I probably won't have artifacts, right? If the memory couldn't do those clocks it would crash instantly, right?
> 
> Will try 1200 on the core later with memory at stock.


I think you would be better off dropping the memory clock and pushing the core instead. These cards have crazy memory bandwidth.


----------



## Roboyto

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Not yet they haven't. I think I might just grab a pair of needle-nose pliers and attempt it that way.
> 
> I've still got some small aluminium heatsinks lying around from my AX III, so I should be able to do something for the VRMs


For the US market your warranty is not void if you remove the stock HSF. I have the R9 290 BE card. Just make sure to keep all the stock parts; if you need to make a warranty claim you must return the card to stock for RMA purposes.

Here is verification from an XFX rep through a NewEgg review response.



The 18th review from the top, with reviews listed in order of date posted:

http://www.newegg.com/Product/Product.aspx?Item=9SIA2RY1C10908


----------



## Abyssic

Quote:


> Originally Posted by *Roboyto*
> 
> For the US market your warranty is not void if you remove the stock HSF. I have the R9 290 BE card. Just make sure to keep all the stock parts; if you need to make a warranty claim you must return the card to stock for RMA purposes.
> 
> Here is verification from an XFX rep through a NewEgg review response.
> 
> 
> 
> The 18th review from the top, with reviews listed in order of date posted:
> 
> http://www.newegg.com/Product/Product.aspx?Item=9SIA2RY1C10908


And if you're living somewhere else, you have to contact their support first; they will most likely give you a "go" to change the cooler, but you still void the warranty if they don't.


----------



## taem

So my Powercolor PCS+ 290, at stock settings of 1040 core 1350 mem, Valley ExtremeHD preset:



Gpu z of temps:



Note the power draw of 220.3W.

This is a good score for stock, IMHO. The GTX 780 blows the 290 away on this bench, though.

I hope to see more Valley scores. Please use the ExtremeHD preset, not custom settings. I hate how reviews always use custom settings.

What's the preferred setting for Heaven, btw?


----------



## Roboyto

XFX R9 290 BE 1215/1675 +125mV +50% Power - Score 3010
With Stock Cooler @ 75% Fan - Core Max 85C - VRMs 71/75 Max



Tested again once the water loop was in, but I wasn't saving score files; I have an Excel spreadsheet for all my OC testing though. Only squeaked another 45MHz out of the core, but temps are significantly better.

XFX R9 290 BE 1260/1675 +125mV +50% Power - Score 3070
XSPC Razor - GPU 44C Max - VRMs 64/38

Full Loop: XSPC Raystorm - XSPC Razor - XSPC Dualbay Res - MCP655B - (Top) XSPC EX240 Yate-Loon Thin 20mm in Push - (Front) XSPC EX240 Corsair SP120 in Push/Pull


----------



## keikei

Hi Guys,

I just wanted to ask everyone what frame rates you're getting in BF4? I'm looking to upgrade (when prices drop). Appreciated.


----------



## taem

Prelim oc experimentation. Ran Valley at 1200 core 1400 mem, +100mV +50% power limit.



Temps:



Gonna run a loop of MetroLL bench now.
Quote:


> Originally Posted by *Roboyto*
> 
> XFX R9 290 BE 1215/1675 +125mV +50% Power - Score 3010
> With Stock Cooler @ 75% Fan - Core Max 85C - VRMs 71/75 Max


Edit: Oh, that's a reference card; no wonder your VRM cooling is so good. Your score is so much higher than mine given the minimal core clock difference. Big diff in mem clock of course, and it's always been my understanding that Valley loves mem overclocks, but still, that's a big difference. You are in GTX 780 territory.


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> Prelim oc experimentation. Ran Valley at 1200 core 1400 mem, +100mV +50% power limit.
> 
> Gonna run a loop of MetroLL bench now.
> Edit: Oh, that's a reference card; no wonder your VRM cooling is so good. Your score is so much higher than mine given the minimal core clock difference. Big diff in mem clock of course, and it's always been my understanding that Valley loves mem overclocks, but still, that's a big difference. You are in GTX 780 territory.


Quote:


> I hope to see more Valley scores. Please use ExtremeHD preset, not custom settings. I hate how reviews always use custom settings.


Yes, the BE is a reference card.

I did get Hynix RAM; I definitely wasn't expecting the 5GHz modules to push to nearly 7GHz! The best RAM clock I've gotten stable in anything so far is a 3DMark11 Performance run @ 1260/1725. Score of P17636; the graphics score is 21037. I'd be tickling P20000 if I had a 6-core CPU









http://www.3dmark.com/3dm11/7981805

Also did extensive testing in a few other benches...best scores listed

3DMark11 Extreme Presets: 1215/1625 - Score X5255
3DMark11 Extreme Presets: 1255/1700 - Score X6487 *Tessellation Off*

Unigine Heaven Extreme Presets: 1255/1675 - Score 1620 FPS: 64.3/132.1/29.0
Unigine Heaven Extreme Presets: 1255/1675 - Score 2186 FPS: 86.8/149.2/34.6 *Tessellation Off*

FFXIV Bench: 1255/1675 - Score 17181


----------



## taem

Quote:


> Originally Posted by *Roboyto*
> 
> Yes, the BE is a reference card.
> 
> I did get Hynix RAM; I definitely wasn't expecting the 5GHz modules to push to nearly 7GHz! The best RAM clock I've gotten stable in anything so far is a 3DMark11 Performance run @ 1260/1725. Score of P17636; the graphics score is 21037. I'd be tickling P20000 if I had a 6-core CPU


What OC software are you using? AB 13 beta 18 maxes out at +100mV and +50% (but I just now remembered I didn't tick "exceed OC limits"). I can't adjust mem volts at all. The MetroLL bench froze at 1200/1400 +100mV +50%, so I need to try higher volts. I do have the temp overhead at 65 on the core, though I have to watch VRM1; I don't like to go over 80-85°C.

I installed the latest Trixx earlier, but it doesn't use +mV for voltage, it uses an offset, which I don't know how to use. And it doesn't have mem volt adjustment.

Your card is an insane overclocker, IMHO. I've never taken a card to those clocks.


----------



## SeanEboy

Here's a question... Is an unlockable 290 the same as owning a 290X? For example, I have an offer to buy an unlockable 290 for $500, or a 290X for $620... Should I get the unlockable one over paying the $120 extra for the 290X? Perhaps I should just go stupid and buy both...


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> What OC software are you using? AB 13 beta 18 maxes out at +100mV and +50% (but I just now remembered I didn't tick "exceed OC limits"). I can't adjust mem volts at all. The MetroLL bench froze at 1200/1400 +100mV +50%, so I need to try higher volts. I do have the temp overhead at 65 on the core, though I have to watch VRM1; I don't like to go over 80-85°C.
> 
> I installed the latest Trixx earlier, but it doesn't use +mV for voltage, it uses an offset, which I don't know how to use. And it doesn't have mem volt adjustment.
> 
> Your card is an insane overclocker, IMHO. I've never taken a card to those clocks.


I had problems with Afterburner right away; ditch it and try Trixx.

You can compare voltages in GPU-Z to see how the offset voltage in Trixx applies compared to AB.

The +125mV I listed is the offset in Trixx.

Insane indeed... ~33% on the core and ~34% on the RAM... I got a winner!
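Those figures line up with AMD's reference 290 clocks (947MHz core / 1250MHz memory; the BE's factory clocks are a bit higher, so the gain over its shipping clocks would be slightly smaller):

```python
# Overclock gain relative to AMD's reference R9 290 clocks (not the BE's
# factory-overclocked shipping clocks).
stock_core, stock_mem = 947, 1250    # MHz, reference 290 clocks
oc_core, oc_mem = 1260, 1675         # MHz, clocks reported above

core_gain = 100 * (oc_core - stock_core) / stock_core
mem_gain = 100 * (oc_mem - stock_mem) / stock_mem
print(round(core_gain), round(mem_gain))  # 33 34
```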


----------



## Sgt Bilko

Quote:


> Originally Posted by *Roboyto*
> 
> For US market your warranty is not void if you remove stock HSF. I have the R9 290 BE card. Just make sure keep all stock parts. If you need to make a warranty claim you must put card back to factory for RMA purposes.
> 
> Here is verification from an XFX rep through a NewEgg review response.
> 
> 
> 
> 18th Review from the top is listed in order of date posted
> 
> http://www.newegg.com/Product/Product.aspx?Item=9SIA2RY1C10908


I'm in Australia, hence why I'm still waiting for XFX to reply.


----------



## Roboyto

Quote:


> Originally Posted by *SeanEboy*
> 
> Here's a question... Is an unlockable 290 the same as owning a 290X? For example, I have an offer to buy an unlockable 290 for $500, or a 290X for $620... Should I get the unlockable one over paying the $120 extra for the 290X? Perhaps I should just go stupid and buy both...


Just make sure you have proof that the 290 actually unlocks; that is definitely the smarter buy. The only quirk is having to flash back to the stock BIOS if you have to RMA at some point. Get the unlocked one and buy a Gelid Icy Vision or Arctic Accelero.

Dual 290s are cool if you have a use for them. One of these cards is overkill for 1080p gaming, great for 1440p or 1600p, and likely sufficient for 5760x1080 surround.

Also have to consider powering two of these beasts, as well as keeping two of them cool. Xfire without watercooling will likely 'cook' the top card.


----------



## Roboyto

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm in Australia, hence why I'm still waiting for XFX to reply.


Didn't realize that until after I posted. My bad.


----------



## SeanEboy

Quote:


> Originally Posted by *Roboyto*
> 
> Just make sure you have proof that the 290 actually unlocks; that is definitely the smarter buy. The only quirk is having to flash back to the stock BIOS if you have to RMA at some point. Get the unlocked one and buy a Gelid Icy Vision or Arctic Accelero.
> 
> Dual 290s are cool if you have a use for them. One of these cards is overkill for 1080p gaming, great for 1440p or 1600p, and likely sufficient for 5760x1080 surround.
> 
> Also have to consider powering two of these beasts, as well as keeping two of them cool. Xfire without watercooling will likely 'cook' the top card.


Thanks for that info! Yeah, I figured something like that was in play... However, is there a benefit to buying the 290X if I already have two 290Xs on the way? Or would the 290 running as a 290X work just as well as the legit 290Xs?

As for the PSU, I have a Seasonic 1000W Platinum on the way...

I'm going for 1440p, wondering if I should take a crack at 1600p - didn't really think about that. I'm hoping for ultra settings in BF4 on one of those 'Korean special' monitors....

EDIT: How can I get proof that it is indeed unlocked? Couldn't the guy just rip any old screenshot off the web?


----------



## taem

Quote:


> Originally Posted by *Roboyto*
> 
> I had problems with AfterBurner right away, ditch it and try Trixx.
> 
> You can compare voltages in GPU-Z to see how the Offset voltage in Trixx applies compared to AB.
> 
> My +125mV I listed is Offset in Trixx.


So I just look at VDDC in GPU-Z to see the effect that the Trixx offset has? I'm not sure how this differs from a straight +mV; I'll have to do some research. Thanks for the tip. So you're not adjusting mem voltage at all? And still hitting those clocks? Magic memory modules, holy smokes.
Quote:


> Originally Posted by *Roboyto*
> 
> Dual 290s are cool if you have a use for them. One of these cards is overkill for 1080p gaming, great for 1440p or 1600p, and likely sufficient for 5760x1080 surround.
> 
> Also have to consider powering two of these beasts, as well as keeping two of them cool. Xfire without watercooling will likely 'cook' the top card.


I don't know that a single 290 is great for 1440p. That's the screen I have, and I have to tone down a lot of settings. I find that I don't need much AA at that res, so it's not too bad. But I really feel like I want Crossfire 290s for 1440p.

I hope you're wrong about Xfire on air, 'cos I plan on it; I have a 290 Tri-X on preorder to mate with this PCS+. I know it's an issue, and that's why I waited and did research to get the best cooler. I think I picked the two cards that are neck and neck for the title. I know for sure I couldn't Xfire the MSI Gaming 290; a friend had that setup and GPU1 heat-throttles in the same case I have, albeit with weaker case cooling.


----------



## Roboyto

Quote:


> Originally Posted by *SeanEboy*
> 
> Thanks for that info! Yeah, I figured something like that was in play... However, is there a benefit to buying the 290X if I already have two 290Xs on the way? Or would the 290 running as a 290X work just as well as the legit 290Xs?
> 
> As for the PSU, I have a Seasonic 1000W Platinum on the way...
> 
> I'm going for 1440p, wondering if I should take a crack at 1600p - didn't really think about that. I'm hoping for ultra settings in BF4 on one of those 'Korean special' monitors....


If the 290 truly unlocks then it will work just like a 290X. There were a few people who received unlockable cards and verified, by a number stamped on the cards, that the manufacturers were simply flashing a 290X to a 290. When these cards first released, sales of the 290X were slow because of the $150 price difference versus the 290. Companies were flashing 290Xs simply to move their stock.

1000W Seasonic Platinum









I don't think you'll have any issues with ultra settings in BF4 with two of these monstrosities. Not sure how forgiving BF4 has been with Xfire though; I know it has been kind of sketchy in numerous regards since it launched. Take a look at the Hawaiian Heat Wave build log. Super in-depth, tons of useful information for an Xfire 290(X) system. Gunderman456 is one thorough SOB!

http://www.overclock.net/t/1442038/build-log-the-hawaiian-heat-wave


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> So I just look at VDDC in GPU-Z to see the effect that the Trixx offset has? I'm not sure how this differs from straight up +mV, will have to do research. Thanks for the tip. So you're not adjusting mem voltage at all? And still hitting those clocks? Magic memory modules, holy smokes.
> 
> I don't know that a single 290 is great for 1440p. That's the screen I have and I have to tone down a lot of settings. I find that I don't need much AA at that res so it's not too bad. But I really feel like I want crossfire 290 for 1440p.
> 
> I hope you're wrong about xfire on air, cos I plan on it, I have a 290 Tri-X on preorder to mate with this PCS+. I know it's an issue and that's why I waited and did research to get the best cooler. I think I picked the two cards that are neck and neck for the title. I know for sure the MSI Gaming 290 I couldn't xfire, a friend had that setup and gpu1 heat throttles in the same case I have, albeit with weaker case cooling.


For voltage, see where your VDDC sits at stock and at +100mV with AB, then compare when you adjust voltage with the offset in Trixx. Then you know how far to push in Trixx if you don't want to exceed the voltage that AB applied.

For 1440p I was just going off memory of reading numerous reviews of these cards. In many games at 2560x1600 with some AA they were keeping at least 30fps in the most demanding titles, if not low-to-mid 40s in most others, and that was at stock clocks. Give it some OC and the frame rates should be far from choppy. Although if you're talking about a 120/144Hz monitor, or never dipping under 60fps, then a second card would be necessary.

http://www.techpowerup.com/reviews/Sapphire/R9_290X_Tri-X_OC/9.html

Xfire temps will depend on two things...Space between the cards, and if you have sufficient airflow getting to the cards.

You can always go the route of strapping an AIO onto the GPUs if you have room for a couple 120mm rads


----------



## the9quad

Quote:


> Originally Posted by *taem*
> 
> So I just look at vdcc in gpu z to see the effect that trixx offset has? I'm not sure how this differs from straight up +mV, will have to do research. Thanks for the tip. So you're not adjusting mem voltage at all? And still hitting those clocks? Magic memory modules, holy smokes.
> I don't know that a single 290 is great for 1440p. That's the screen I have and I have to tone down a lot of settings. I find that I don't need much AA at that res so it's not too bad. But I really feel like I want crossfire 290 for 1440p.
> 
> I hope you're wrong about xfire on air, cos I plan on it, I have a 290 Tri-X on preorder to mate with this PCS+. I know it's an issue and that's why I waited and did research to get the best cooler. I think I picked the two cards that are neck and neck for the title. I know for sure the MSI Gaming 290 I couldn't xfire, a friend had that setup and gpu1 heat throttles in the same case I have, albeit with weaker case cooling.


As long as your case has good airflow and you don't mind using a custom fan profile you can cfx on air no problem. *It will be loud though*.

For reference on my pc with reference coolers, after an hour of:

BF4: my temps range from 67C/57% fan speed on the coolest card to 74C/64% fan speed on the hottest.
Crysis 3: my temps will be 79C/69% fan speed on the coolest and 87C/88% fan speed on the hottest, though that was after about 3 hours.
Crysis 3 Crossfire usage is crazy good, with the 3 GPUs essentially pegged at 100% the whole time. In BF4 I limit frames to 105fps since my monitor is 100Hz, so the cards don't get as much of a workout.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *keikei*
> 
> Hi Guys,
> 
> I just wanted to ask everyone what frames are you getting in BF4? I'm looking to upgrade (when prices drop). Appreciated.


with my 290's in CF with a 1150 / 1400 on mem about 140 - 180fps on ultra settings


----------



## kizwan

Quote:


> Originally Posted by *Roboyto*
> 
> XFX R9 290 BE 1215/1675 +125mV +50% Power - Score 3010
> With Stock Cooler @ 75% Fan - Core Max 85C - VRMs 71/75 Max
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Tested again once water loop was in but wasn't saving score files; I have an excel spreadsheet for all my OC testing though. Only squeaked another 45MHz on the core, but temps are significantly better.
> 
> XFX R9 290 BE 1260/1675 +125mV +50% Power - Score 3070
> XSPC Razor - GPU 44C Max - VRMs 64/38
> 
> Full Loop: XSPC Raystorm - XSPC Razor - XSPC Dualbay Res - MCP655B - (Top) XSPC EX240 Yate-Loon Thin 20mm in Push - (Front) XSPC EX240 Corsair SP120 in Push/Pull


You have MCP655B pump? Not the MCP655 or D5 Vario?


----------



## nightfox

Quote:


> Originally Posted by *Roboyto*
> 
> Dual 290s is cool if you have a use for them. One of these cards is overkill for 1080 gaming, great for 1440 or 1600, and likely sufficient for 5760x1080 surround.
> 
> Also have to consider powering two of these beasts, as well as keeping two of them cool. Xfire without watercooling will likely 'cook' the top card.


A single 290 is not great for 1600p; I can confirm that. I need to tone down settings if I want more than 50 fps. 50 fps is playable but I don't like it. Two is good: ultra settings, no AA. With 3 x 290, ultra, 4x MSAA, 200% res scale, it never drops below 60 fps. But man, these things are loud. My AP182 fan is even louder though, and I don't mind; my wife doesn't like it though. I have to close the room to play.









Oh, one more thing: 3 x 290 is not as loud as people think. I would say 2 x 290 and 3 x 290 are almost comparable in noise.

Oh, one more thing: the fps I mentioned for BF4 were with everything at stock, CPU and GPU. An overclocked 4770K can really drive 4 x 290s; with 3 x 290s a stock 4770K can cope. No bottlenecking observed.


----------



## kizwan

Just to make sure my eyes are working properly: Fujipoly 11W/mK is $7.49 while Fujipoly 17W/mK is $25.99 at FrozenCPU. Why the huge difference? Is it always like this? If I go with 11W/mK, do I still get a huge improvement in VRM1 temp?

http://www.frozencpu.com/products/16883/thr-168/Fujipoly_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_10_-_Thermal_Conductivity_110_WmK.html
http://www.frozencpu.com/products/17500/thr-182/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_10_-_Thermal_Conductivity_170_WmK.html
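For a rough sense of whether the 17 W/mK pad buys much over the 11 W/mK one, simple 1-D conduction math is a useful sanity check. This is only a sketch: the wattage, pad thickness, and contact area below are assumed figures, not measurements, and contact resistance is ignored.

```python
def pad_delta_t(power_w, thickness_m, k_w_mk, area_m2):
    """Steady-state temperature drop across a thermal pad:
    dT = P * t / (k * A)  (1-D conduction, contact resistance ignored)."""
    return power_w * thickness_m / (k_w_mk * area_m2)

# Assumed figures: ~20 W of VRM heat through a 1.0 mm pad over ~10 cm^2.
# k = 3 stands in for a generic stock-ish pad; 11 and 17 are the Fujipoly grades.
for k in (3, 11, 17):
    print(k, "W/mK ->", round(pad_delta_t(20, 1.0e-3, k, 10e-4), 2), "C across the pad")
```

Under these assumptions the 11 to 17 W/mK step saves well under a degree; the big win comes from replacing a low-conductivity stock pad at all, so the cheaper Fujipoly should capture most of the improvement.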


----------



## Roboyto

Quote:


> Originally Posted by *kizwan*
> 
> You have MCP655B pump? Not the MCP655 or D5 Vario?


Yup, the one without speed control.

http://www.swiftech.com/mcp655-b.aspx


----------



## taem

Do you guys force constant voltage? My VDDC fluctuates a lot. At a +112mV offset, the peak is uncomfortably high for air, around 1.315V, but it usually hovers between 1.220 and 1.250.

And does ULPS have to be disabled for crossfire? I'd like to keep that. That's like adaptive voltage right? Adaptive saves me a lot of wattage.


----------



## kizwan

Quote:


> Originally Posted by *taem*
> 
> And does ULPS have to be disabled for
> crossfire? I'd like to keep that. That's like adaptive voltage right? Adaptive saves me a lot of wattage.


If you overclock then you want ULPS disabled. It can prevent the 2nd GPU from overvolting, and this will affect stability. If you're always running at stock clocks then you can leave ULPS enabled.

ULPS puts the 2nd GPU into an ultra-low power state. Basically it's like putting the 2nd GPU to sleep.
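For reference, ULPS is usually disabled either from the overclocking tool's settings (e.g. Afterburner) or by setting the `EnableUlps` registry value to 0 for the secondary card. A hedged sketch of the registry route; note the subkey index (`0000`, `0001`, ...) varies per system, so this exact path is an assumption:

```reg
Windows Registry Editor Version 5.00

; Hypothetical example -- the subkey index under this display-adapter class
; key depends on which entry belongs to your secondary Radeon GPU.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0001]
"EnableUlps"=dword:00000000
```

A reboot is needed afterward; setting the value back to 1 re-enables ULPS.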


----------



## Roboyto

Quote:


> Originally Posted by *nightfox*
> 
> single 290 is not great for 1600p. I can confirm you that. I need to tone down settings if I want more than 50 fps. 50 fps is playable but dont like it. 2 is good. Ultra settings, no AA. 3 x 290, ultra 4x MSAA, 200% res scale, never go down less 60 fps. But man, these things are loud. But my AP182 fan is even more louder though and I dont mind. My wife dont like it though. Have to close room to play
> 
> 
> 
> 
> 
> 
> 
> 
> 
> oh one more thing, 3 x 290 is not so loud as people think. I would say 2 x 290 and 3x 290 is almost noise comparable.
> 
> oh one more thing, the fps i mentioned in playing BF4, everything is at stock. CPU and GPU.overclocked 4770k can really drive with 4 x 290's. with 3 x 290's, stock 4770k can cope up. No bottlenecking observed.


At 1600 with 200% resolution scale I would imagine it wouldn't be very smooth. That's essentially doubling the load, right?

When you're at 1440/1600, AA is hurting your performance more than it's making a noticeable difference in how things look. Personally, I would turn down the res scale and AA to get better frames. 50fps on one card, as you reported, is more than acceptable by a lot of people's standards.

Not many people here run their CPU/GPU at stock settings. BF4 takes advantage of CPU and RAM overclocking more than just about any other game out presently. A mild overclock for the CPU, RAM, and 2 cards would yield some pretty good, if not great, performance gains.

You must have a high tolerance for noise! One card at anything over 55% was beyond unacceptable for me. But I'm used to the silence of my water cooling so I'm slightly spoiled I suppose.


----------



## taem

Quote:


> Originally Posted by *kizwan*
> 
> If you overclock then you want ULPS disabled. It can prevent the 2nd GPU from overvolting & this will effect stability. If you always running at stock clocks then you can leave ULPS enabled.
> 
> ULPS put 2nd GPU into ultra low power state. Basically it like putting 2nd GPU to sleep.


And that only happens if you oc? That seems odd. But I guess I probably won't oc when I crossfire. My temps should be fine then.

Let me ask this -- what is the consequence of turning ULPS off for 290 crossfire? Just higher power consumption?


----------



## Roboyto

Quote:


> Originally Posted by *kizwan*
> 
> If you overclock then you want ULPS disabled. It can prevent the 2nd GPU from overvolting & this will effect stability. If you always running at stock clocks then you can leave ULPS enabled.
> 
> ULPS put 2nd GPU into ultra low power state. Basically it like putting 2nd GPU to sleep.


Yup, should disable ULPS for Xfire.

ULPS is ultra low power state. It turns off second GPU if it's not being used, but has been known to interfere with performance especially when overclocking.

Yes, force constant voltage to help stability.


----------



## Paul17041993

Quote:


> Originally Posted by *taem*
> 
> And that only happens if you oc? That seems odd. But I guess I probably won't oc when I crossfire. My temps should be fine then.
> 
> Let me ask this -- what is the consequence of turning ULPS off for 290 crossfire? Just higher power consumption?


Maybe 4W vs 10W per card at idle; the cores already go into a fairly low power state. ULPS just shuts off the fan and half the actual core, which poses problems at times; Windows 7 particularly hates ULPS and will have a fit a lot of the time.


----------



## kizwan

Anyone done anything crazy lately? Well, look what I just did.







I'm not rich, but the first shipping option would take a long time to arrive in Malaysia, while the difference between the second & third shipping options is only a few bucks.

Quote:


> Originally Posted by *taem*
> 
> And that only happens if you oc? That seems odd. But I guess I probably won't oc when I crossfire. My temps should be fine then.
> 
> Let me ask this -- what is the consequence of turning ULPS off for 290 crossfire? Just higher power consumption?


With ULPS disabled, power consumption at idle will be slightly higher. Just slightly, I imagine; not worth worrying about. When my cards were still on the stock cooler I left ULPS enabled. No adverse effect that I could tell, and no problem playing games like BF3/BF4 (at stock clocks). The 2nd card had no problem exiting & entering ULPS mode whenever I launched and exited games.


----------



## taem

Heh I'm just continuing to move the slider and Valley lets me. 1230 core, 1500 memory. I could keep going I guess but I don't see the point, I don't think this is game stable. The artifacting was very mild actually, just sprinkles of tiny blue squares here and there every few frames, but Far Cry 3 for sure will crash on this.





Temps are tolerable. Core and VRM2 are great no matter what I do; it's just VRM1 as always. A lot of guys could live with 90C on VRM1, I guess; I personally like to stay in the 70s.

Ah one more question: do you guys disable CCC? I had problems running that alongside Trixx or AB with my 280x. Not having any issues so far but I was curious.

Incidentally, Trixx gets me slightly higher scores at the same clocks than AB. The difference is small but noticeable, and it happens consistently. Does that make any sense at all?


----------



## Shine6

Hi,

May I join the club?



Sapphire R9 290 - NZXT Kraken G10 + Corsair H75



Shine6


----------



## nightfox

Quote:


> Originally Posted by *Roboyto*
> 
> At 1600 with 200% resolution scale I would imagine it wouldn't be very smooth. That's essentially doubling the load, right?
> 
> When you're at 1440/1600, AA is hurting your performance more than it's making a noticeable difference in how things look. Personally, I would turn down the res scale and AA to get better frames. 50 for 1 card as you claimed is more than acceptable by a lot of people's standards.
> 
> Not many people here run their CPU/GPU at stock settings. BF4 takes advantage of CPU, and RAM Overclocking more than just about any other game out presently. A mild overclock for CPU, RAM, and 2 cards would yield some pretty good, if not great, performance gains.
> 
> You must have a high tolerance for noise! One card at anything over 55% was beyond unacceptable for me. But I'm used to the silence of my water cooling so I'm slightly spoiled I suppose.


The 1600p / 200% res scale figure is for 3 x 290, ultra settings, 4x MSAA. The cards really have to work a lot, but it plays well.

As for noise, I'm using headphones so it doesn't bother me at all, and like I said, once I set my AP182 fans the high-pitch noise from the 290s disappears. I can only hear the whoosh sound from my case's AP182 fans.

Oh, one more thing: 3 x 290 noise is not the same as 1 x 290 times 3. If a single 290 makes 40 dB of noise, it does not mean 3 x 290s will be 120 dB. You may not believe me, but that is my experience. Would you believe that 2 x 290 is almost the same noise as 3 x 290s?
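The observation that three cards aren't three times the noise matches how sound levels combine: incoherent sources like fans add in acoustic power, not in decibels, so n identical sources add 10*log10(n) dB. A quick sketch, using a hypothetical 40 dB per card:

```python
import math

def combined_spl(levels_db):
    """Combine incoherent noise sources (e.g. GPU fans) by summing
    their acoustic power, then converting back to decibels."""
    total_power = sum(10 ** (db / 10) for db in levels_db)
    return 10 * math.log10(total_power)

# Three 40 dB cards land around 44.8 dB, not 120 dB,
# and two cards (~43.0 dB) are indeed close to three.
print(round(combined_spl([40, 40, 40]), 1))
print(round(combined_spl([40, 40]), 1))
```

Going from two cards to three adds under 2 dB here, which lines up with the "2 x 290 sounds almost like 3 x 290" impression.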


----------



## Arizonian

Quote:


> Originally Posted by *Shine6*
> 
> Hi,
> 
> May I join the club?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Sapphire R9 290 NZXT Kraken G10 + Corsair H75
> 
> Shine6


You sure may. Congrats - added









I see you're new to OCN; welcome aboard too.









Here's a simple guide to get you started listing your rig as well - *How to put your Rig in your Sig*

On a side note: woke up to get my Titanfall Beta download started and noticed prices dropped in the US on Newegg.

R9 290X - $699.99 Ref

R9 290 - $599.99 Ref

EDIT: Though I just noticed - nothing is in stock.


----------



## the9quad

Quote:


> Originally Posted by *nightfox*
> 
> the 1600p res 200scale is for 3x 290. ultra settings 4x msaa. card really works alot but plays well
> 
> as for noise, i am using headphone so it doesnt bother me at all and like i said if i set my ap182 fans, high pitch noise frm 290 dis appears. i could only hear the woosh sound from my case ap182 fan
> 
> oh one more thing, 3 290 noise is not same as 1 290 x 3. if single 290 has a 40 dcb noise, it does not mean 3 290s will be 120 dcb. you may not believe me but that is what my experience. would u believe that 2x 290 is almost same noise as 3 290s .......


It is still considerably louder than one, and I wouldn't call it quiet by any stretch. It's not an annoying loud, just the sound of fans.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Arizonian*
> 
> You sure may. Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I see your new to OCN, welcome aboard too.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here's a simple guide to get you started listing your rig as well - *How to put your Rig in your Sig*
> 
> On a side note: Woke up to get my Titantall Beta download started and noticed, prices dropped in the US on Newegg.
> 
> R9 290X - $699.99 Ref
> 
> R9 290 - $599.99 Ref
> 
> EDIT: Though I just noticed - nothing is in stock.


Holy overpriced VGA Batman









And to think I paid $475AU each for my 290s last year ($430US).


----------



## Paul17041993

Quote:


> Originally Posted by *kizwan*
> 
> Anyone done anything crazy lately? Well, look what I just did.
> 
> 
> 
> 
> 
> 
> 
> I'm not rich but first shipping option will take long time to arrived in Malaysia while the difference between second & third shipping options are only few bucks.


my god...

Quote:


> Originally Posted by *taem*
> 
> Heh I'm just continuing to move the slider and Valley lets me. 1230 core, 1500 memory. I could keep going I guess but I don't see the point, I don't think this is game stable. The artifacting was very mild actually, just sprinkles of tiny blue squares here and there every few frames, but Far Cry 3 for sure will crash on this.
> 
> Temps are tolerable. Core and vrm2 are great no matter what I do, it's just vrm1 as always. But a lot of guys could live with 90c vrm1 I guess. I personally like to stay in the 70s.
> 
> Ah one more question: do you guys disable CCC? I had problems running that alongside Trixx or AB with my 280x. Not having any issues so far but I was curious.
> 
> Incidentally Trixx gets me slightly higher scores at the same clocks than AB. DIfference is small but noticeable, and it happens consistently. Does that make any sense at all?


Heh, doing what I was doing, eh? Yeah, GCN is a very tolerant architecture; a lot of 7970s could be taken as high as 1400 on the core with good cooling (~1.4V, stable). It's just that here the reference power delivery isn't quite able to keep up for the most part. The Lightning was the best of the 7970s for high overclocks, really, and I think it will be the same case with the 290/Xs.

I think Trixx may do some extra little things, like reducing the throttle aggression, which can improve performance too; something Afterburner wouldn't likely have due to being designed for a vast range of both AMD and Nvidia hardware...


----------



## nightfox

Quote:


> Originally Posted by *the9quad*
> 
> It is till considerably louder than one, and I wouldn't call it quiet by any stretch. It's not an annoying loud, just the sound of fans.




True, it is louder than one, but wearing headphones hides that. I will enjoy winter now with these; they serve as my heater in my PC room lol...

In summer I'll start water cooling. I like these cards, and AMD drivers are definitely getting better.


----------



## THE_Shev

Got my 2nd Sapphire Tri-X OC yesterday and saw in AMD Catalyst Control Center that my GPUs have different BIOS versions.
Would it be better to have both on the same BIOS version? I had a few blue screens yesterday, but I think they were down to software.


----------



## Marvin82

Take this with the newest release date.


----------



## Monkeysphere

I have tried everything with these cards. It really disappoints me to say it but my last cards were Nvidia and my next ones will be Nvidia no matter the price difference.

2x MSI R9 290 on water running at 42deg, and pretty much every game I have gives me fewer frames in Crossfire than with a single card. In Valley they almost double the performance of a single card, yet every game I play either won't run Crossfire at all, crashes, or gets several frames fewer than running just a single card.

I'm stumped and disappointed









----------



## rdr09

Quote:


> Originally Posted by *Monkeysphere*
> 
> I have tried everything with these cards. It really disappoints me to say it but my last cards were Nvidia and my next ones will be Nvidia no matter the price difference.
> 
> 2x MSI R9290 on water running at 42deg and pretty much every game I have gives me less frames in Crossfire then with a single card. In Valley they almost double their performance from a single card yet every game I play either won't run Crossfire at all, crashes or gets several frames lower than running just a single card.
> 
> I'm stumped and disappointed
> 
> 
> 
> 
> 
> 
> 
> .


Last cards were Nvidia? Are you sure you cleared all traces of those drivers? What CPU and OC? And what driver are you currently using?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Monkeysphere*
> 
> I have tried everything with these cards. It really disappoints me to say it but my last cards were Nvidia and my next ones will be Nvidia no matter the price difference.
> 
> 2x MSI R9290 on water running at 42deg and pretty much every game I have gives me less frames in Crossfire then with a single card. In Valley they almost double their performance from a single card yet every game I play either won't run Crossfire at all, crashes or gets several frames lower than running just a single card.
> 
> I'm stumped and disappointed
> 
> 
> 
> 
> 
> 
> 
> .


The drivers are still very immature for these cards in Xfire, but they have enormous grunt; that's what really impresses me.


----------



## Monkeysphere

Quote:


> Originally Posted by *rdr09*
> 
> last cards were Nvdia? you sure you cleared all traces of those drivers? what cpu and oc? and what driver are you currently using?


Fresh Windows install. 2500K @ 4.7GHz. 13.12 driver.

I'm hoping my issues are largely driver related. Single-card performance is great. The problem is Valley gets great frames over a single card, almost double, but all my games suck, with quite a few needing Crossfire disabled just to be playable.


----------



## nightfox

Quote:


> Originally Posted by *Monkeysphere*
> 
> I have tried everything with these cards. It really disappoints me to say it but my last cards were Nvidia and my next ones will be Nvidia no matter the price difference.
> 
> 2x MSI R9290 on water running at 42deg and pretty much every game I have gives me less frames in Crossfire then with a single card. In Valley they almost double their performance from a single card yet every game I play either won't run Crossfire at all, crashes or gets several frames lower than running just a single card.
> 
> I'm stumped and disappointed
> 
> 
> 
> 
> 
> 
> 
> .


If you're playing Nvidia-optimized games, then yes, expect that. It's a dirty trick by Nvidia, and AMD is slow to optimize their drivers for Nvidia-optimized games.


----------



## rdr09

Quote:


> Originally Posted by *Monkeysphere*
> 
> Fresh windows install. 2500k @ 4.7ghz. 13.12 Driver.
> 
> I'm hoping my issues are largely driver related. Single card performance is great. The problem is Valley gets great frames over a single card, almost double but all my games suck with quite a few having to disable Crossfire to even get them playable.


There are games that really do not benefit from Crossfire; others like BF3, BF4, and C3 do. When I crossfired a 7950/7970, my i7 at 4.5GHz with HT off (basically an i5) was pegged at 100% usage in BF3 MP 64; during huge explosions I saw my system lag. Those two are a bit faster than a single 290, and I can even play with HT off in these same games.

Check your CPU usage. A 290 is about as fast as a Titan clock for clock; you are driving basically 2 Titans with an i5.


----------



## Monkeysphere

Quote:


> Originally Posted by *rdr09*
> 
> there are games that really do not benefit in crossfire. others like BF3, BF4, and C3 do. when i had crossfired 7950/7970 my i7 at 4.5HT off (basically an i5) was pegged at 100% usage in BF3 MP 64. during huge explosions i saw my system lagged. those 2 are a bit faster than a single 290 and i can even play with HT off in these same games.
> 
> check your cpu usage. a 290 is about as fast as a Titan clock for clock. you are driving basically 2 Titans with an i5.


CPU usage is fine, about 80-90% across all 4 cores in BF4 64 MP. A single 290 runs BF4 at about 60fps and when I enable Crossfire I drop to about 50fps. Not a CPU issue. Also no reason to be going down in frames with Crossfire enabled and an AMD optimized game at that.

Also not a PCIE or RAM bottleneck issue since Valley benches just fine. All benchmarks in fact are great. In game performance however is disappointing.

The nice thing I suppose is that 1 card gets me by at 2560x1440 so I have some time for drivers to maybe improve the situation.


----------



## rdr09

Quote:


> Originally Posted by *Monkeysphere*
> 
> CPU usage is fine, about 80-90% across all 4 cores in BF4 64 MP. A single 290 runs BF4 at about 60fps and when I enable Crossfire I drop to about 50fps. Not a CPU issue. Also no reason to be going down in frames with Crossfire enabled and an AMD optimized game at that.
> 
> Also not a PCIE or RAM bottleneck issue since Valley benches just fine. All benchmarks in fact are great. In game performance however is disappointing.
> 
> The nice thing I suppose is that 1 card gets me by at 2560x1440 so I have some time for drivers to maybe improve the situation.


What driver? We have members here playing the same game with a similar setup just fine.

Check out my usage in BF4 MP 64 with HT off and on . . .



It could very well be the driver.


----------



## Brian18741

Asus R9 290 DCU II Crossfire, stock clocks. The top card can hit 94°C at stock clocks, so no OCing on air. If you're planning on running these cards in Crossfire, I would seriously suggest against it unless you are putting them under water. The heat is too much.

1080p


1440p


----------



## rdr09

Quote:


> Originally Posted by *Brian18741*
> 
> Asus R9 290 DCU II crossfire. Stock clocks. The top cards can hit 94°C on stock clocks so no OCing on air. If you're planning on running these cards in crossfire I would seriously suggest against it unless you are putting them under water. The heat is too much.
> 
> 1080p
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 1440p
> 
> 
> Spoiler: Warning: Spoiler!



That, too; one card's temps could be getting up there. In some benches, though, like Valley, the CPU does not make much of a difference compared to some games. Benchmarks like 3DMark behave like some games, where the overall score depends on both the CPU and the GPU(s).


----------



## Monkeysphere

Quote:


> Originally Posted by *rdr09*
> 
> what driver? we have members here playing the same game using similar setup just fine.
> 
> checkout my usage in BF4 MP 64 with HT off and on . . .
> 
> it could be very well be driver.


Using 13.12 driver. Tried reinstalling plenty of times.


----------



## rdr09

Quote:


> Originally Posted by *Monkeysphere*
> 
> Using 13.12 driver. Tried reinstalling plenty of times.


From one AMD card to another AMD card I normally just use the driver itself to uninstall the previous version. You may have to use this in Safe Mode . . .

http://www.overclock.net/t/1461011/ddu-version-11-0-rc1-released-for-testing

Not sure if that is the latest version. Try the 13.11 driver and make sure your temps are all below 80C (core and VRMs).

13.12 is fine but raised my vcore a bit, as well as my temp. Could be just my card.


----------



## nightfox

Quote:


> Originally Posted by *Monkeysphere*
> 
> Using 13.12 driver. Tried reinstalling plenty of times.


Reinstalling won't really help; a fresh install is better. Use DDU to remove all video drivers: Intel, AMD, Nvidia. Then install 13.12. Stay away from the latest 14.1 beta. Oh, one more thing: if Windows Update shows an AMD update, DO NOT update. It messed up my drivers; I can't enable Crossfire even though Device Manager shows I have 3 cards.


----------



## Monkeysphere

Quote:


> Originally Posted by *rdr09*
> 
> from amd to another amd card I normally just use the driver itself to uninstall previous version. you may have to use this in SAFE mode . . .
> 
> http://www.overclock.net/t/1461011/ddu-version-11-0-rc1-released-for-testing
> 
> not sure if that is the latest version. try 13.11 driver and make sure your temps are all below 80C (core and VRMS).
> 
> 13.12 is fine but raised my vcore a bit as well as my temp. could be just my card.


Sorry to be clear by reinstall I meant with DDU.

Also cards are at max 45℃ with power +50 and ULPS off.


----------



## Abyssic

Quote:


> Originally Posted by *Monkeysphere*
> 
> Sorry to be clear by reinstall I meant with DDU.
> 
> Also cards are at max 45℃ with power +50 and ULPS off.


You have some weird issue going on. Have you tried the 2nd GPU on its own?


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> Heh I'm just continuing to move the slider and Valley lets me. 1230 core, 1500 memory. I could keep going I guess but I don't see the point, I don't think this is game stable. The artifacting was very mild actually, just sprinkles of tiny blue squares here and there every few frames, but Far Cry 3 for sure will crash on this.
> 
> 
> 
> 
> 
> Temps are tolerable. Core and vrm2 are great no matter what I do, it's just vrm1 as always. But a lot of guys could live with 90c vrm1 I guess. I personally like to stay in the 70s.
> 
> Ah one more question: do you guys disable CCC? I had problems running that alongside Trixx or AB with my 280x. Not having any issues so far but I was curious.
> 
> Incidentally Trixx gets me slightly higher scores at the same clocks than AB. DIfference is small but noticeable, and it happens consistently. Does that make any sense at all?


Your core speed of 1230 is exactly where I started getting artifacting in Valley before I went under water, and I suspect it was due to VRM temps, because I ended up with 1260 as my best core speed for Valley. Try backing your core speed down 5-10MHz and see if you can squeeze some more out of the memory. Some benches/games don't rely on RAM speed as much, especially with the enormous 512-bit bus. I did check memory scaling in my testing, using my best core speeds while keeping the memory clock at stock.

Here's the breakdown of the score drop with memory at stock:


Heaven: 6.86% decrease in score
Valley: 8.72% decrease in score
3DMark11 Performance Run: 1.98% decrease in score
3DMark11 Xtreme Run: 3.72% Decrease in score
FFXIV Benchmark: 4.47% Decrease in score

I'm with you on VRM1 temps; I believe they're rated for 120C, if I recall correctly? Before I had all the fans on my radiators, my VRM1 temps were maxing at 79C, which was the same as the stock blower! I was slightly baffled at this until I got all 6 fans in and the system was completely bled of air.



I have also been pondering a theory that my front rad fans are so close to the GPU and the VRM1 that the air they are blowing across the back of card and under/over the backplate is lowering VRM1 temps....I'm going to test this now!

If your VRM2 temps are still in check, why not try to push the memory a little more? This is OCN after all









What do you mean by disable CCC? I don't have OverDrive enabled in CCC. The only thing I use it for is to force tessellation off to see how much can be squeezed out of certain benchmarks.

Difference from AB to Trixx? Hard to say really... Did you have ULPS disabled with AB? Does AB have a force constant voltage option? A smidge of voltage can make a difference in a score. It's plausible that it was getting enough volts not to have issues, but that little extra helped keep frames up, thus slightly increasing your score.


----------



## Abyssic

So this is what I finally settled on for 24/7 usage. I could push the core a tad higher but I want to keep my VRM temp under 90°C.

Core: 1125
Mem: 1450
Power limit: +50%
Voltage: +30mV = ~1.24V


----------



## Roboyto

Quote:


> Originally Posted by *Abyssic*
> 
> you have some weird issue going on. have you tried the 2nd gpu on its own?


I'm with Abyssic on this one Monkeysphere. Did you test both cards individually before putting the blocks on them? Maybe your 2nd card has issues?

Your GPU core temp is spot on with mine. How are your VRM temps looking, just out of curiosity?


----------



## Roboyto

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> I have also been pondering a theory that my front rad fans are so close to the GPU and VRM1 that the air they blow across the back of the card and under/over the backplate is lowering VRM1 temps... I'm going to test this now!


Valley seems to give me highest VRM temps out of the benches I have been using so that's what I went with to test my theory.

1255/1675 | +125mV | 50% Power | GPU Idle 30C | VRMs Idle 26C/26C

1st Run: GPU 45C VRMs 65C/39C

Set fans to maximum via Thermal Radar 2 utility and got idle temperatures back to where they were prior to first run.

2nd Run: GPU 45C VRMs 65C/39C

No change. The higher temperatures I was seeing initially were undoubtedly due to not having all the fans on the radiators and not having all the air bled out of the loop.

Maybe there could be a slight decrease in temperature if I didn't have the fans set to silent in Thermal Radar 2...?


----------



## Mercy4You

Quote:


> Originally Posted by *Abyssic*
> 
> So this is what I finally got for 24/7 usage. I could push the core a tad higher, but I want to keep my VRM temp under 90°C
> 
> Core: 1125
> Mem: 1450
> Power limit +50%
> Voltage +30mV = ~1.24V


LOL, my core 1125, memory 1400, Power +30%, voltage +25mV, and my Valley score was..... 2809
Sapphire really did their best to pick equal chips









Unigine Valley Benchmark 1.0
FPS: 67.1
Score: 2809
Min FPS: 31.1
Max FPS: 128.9

System
Platform: Windows 8 (build 9200) 64bit
CPU model: Intel(R) Core(TM) i7-4930K CPU @ 3.40GHz (3399MHz) x6
GPU model: AMD Radeon R9 200 Series 13.251.0.0 (4095MB) x1

Settings
Render: Direct3D11
Mode: 1920x1080 8xAA fullscreen
Preset: Extreme HD

Powered by UNIGINE Engine
Unigine Corp. © 2005-2013


----------



## SeanEboy

Hey guys, someone just told me my Seasonic 1000w PSU isn't going to cut it with (3) 290x, is that true? My build: 4930k, RIVBE, 4x4GB Dominator GT, (2 - potentially 3) 290x, 256GB 840 Pro SSD, all watercooled - most likely 1 pump. I'm looking to mine when I'm not gaming.. Thoughts on that?


----------



## pkrexer

I have the same water block and our temperatures are pretty much the same, though my card doesn't overclock as well.

1220 / 1500 | +125mV | 50% Power | GPU Idle 31c | VRMs Idle 27c / 30c

Valley Run - GPU 44C / VRMs 64c / 37c

1250 / 1500 | +150mV | 50% Power

Valley Run - GPU 45C / VRMS 68c / 38c

Quote:


> Originally Posted by *Roboyto*
> 
> Valley seems to give me highest VRM temps out of the benches I have been using so that's what I went with to test my theory.
> 
> 1255/1675 | +125mV | 50% Power | GPU Idle 30C | VRMs Idle 26C/26C
> 
> 1st Run: GPU 45C VRMs 65C/39C
> 
> Set fans to maximum via Thermal Radar 2 utility and got idle temperatures back to where they were prior to first run.
> 
> 2nd Run: GPU 45C VRMs 65C/39C
> 
> No change. The higher temperatures I was seeing initially undoubtedly were due to not having all fans on radiators and not having all the air bled out of the loop.
> 
> Maybe there could be a slight decrease in temperature if I didn't have the fans set to silent in Thermal Radar 2...?


----------



## Mercy4You

Quote:


> Originally Posted by *SeanEboy*
> 
> Hey guys, someone just told me my Seasonic 1000w PSU isn't going to cut it with (3) 290x, is that true? My build: 4930k, RIVBE, 4x4GB Dominator GT, (2 - potentially 3) 290x, 256GB 840 Pro SSD, all watercooled - most likely 1 pump. I'm looking to mine when I'm not gaming.. Thoughts on that?


Sorry, but that is *very true*!

1000W isn't even going to handle 2 R9 290Xs (overclocked)


----------



## SeanEboy

Well crapola.. Time to go cancel my order.. 1250? Duals?


----------



## Mercy4You

Quote:


> Originally Posted by *SeanEboy*
> 
> Well crapola.. Time to go cancel my order.. 1250? Duals?


You can calculate it accurately here:

http://www.extreme.outervision.com/psucalculatorlite.jsp


----------



## Roboyto

Quote:


> Originally Posted by *Roboyto*
> 
> Valley seems to give me highest VRM temps out of the benches I have been using so that's what I went with to test my theory.
> 
> 1255/1675 | +125mV | 50% Power | GPU Idle 30C | VRMs Idle 26C/26C
> 
> 1st Run: GPU 45C VRMs 65C/39C
> 
> Set fans to maximum via Thermal Radar 2 utility and got idle temperatures back to where they were prior to first run.
> 
> 2nd Run: GPU 45C VRMs 65C/39C
> 
> No change. The higher temperatures I was seeing initially undoubtedly were due to not having all fans on radiators and not having all the air bled out of the loop.
> 
> Maybe there could be a slight decrease in temperature if I didn't have the fans set to silent in Thermal Radar 2...?




Same methodology this time with fans set to Full Speed in Thermal Radar 2. Using a mousepad to block air blowing across the GPU.

1255/1675 | +125mV | 50% Power | GPU Idle 30C | VRMs Idle 26C/26C

1st Run no mousepad: GPU 42C VRMs 60C/36C
- Temperatures are down a few degrees as expected due to Full Speed on fans compared to Silent.

2nd Run w/ mousepad: GPU 42C VRMs 63C/37C
- VRM temperature increase! 3C on VRM1 and 1C on VRM2. GPU unchanged, which I was anticipating.

Is 3C worth the amount of noise the fans make on Full Speed compared to Silent? Not at all, IMHO. But we have found out that even a full-cover block with active VRM cooling can benefit from some airflow across it.


----------



## SeanEboy

Quote:


> Originally Posted by *Mercy4You*
> 
> You can calculate it accurately here:
> 
> http://www.extreme.outervision.com/psucalculatorlite.jsp


Thanks, I used one before, but another 290x fell into my lap, couldn't resist...


----------



## Mercy4You

Quote:


> Originally Posted by *SeanEboy*
> 
> Thanks, I used one before, but another 290x fell into my lap, couldn't resist...










(hope you can deal with heat and noise)


----------



## Roboyto

Quote:


> Originally Posted by *pkrexer*
> 
> I have the same water block and our temperatures are pretty much the same. Though my card doesn't overclock as good.
> 
> 1220 / 1500 | +125mV | 50% Power | GPU Idle 31c | VRMs Idle 27c / 30c
> 
> Valley Run - GPU 44C / VRMs 64c / 37c
> 
> 1250 / 1500 | +150mV | 50% Power
> 
> Valley Run - GPU 45C / VRMS 68c / 38c


Very nice









I just thought of something else as well that I changed that may or may not be affecting VRM1 temps. I was thoroughly bothered that XSPC did not include any hardware to fill the last two holes on the backplate.



I made a run to Ace Hardware and picked up some nuts.



Used red paper shims on the bottom.



Now the look is complete!



The extra pressure that is being applied at the end of the backplate from the 2 screws/nuts may be enough for better thermal pad contact on VRM1?


----------



## SeanEboy

Quote:


> Originally Posted by *Mercy4You*
> 
> 
> 
> 
> 
> 
> 
> 
> (hope you can deal with heat and noise)


Haha, coming from Mercy4You, perfect post.. ;c)

I'll be watercooling, so ideally that shouldn't be an issue.. (2) SR1 360s on push/pull. Wondering if I should throw another one in there now...


----------



## Roboyto

Quote:


> Originally Posted by *SeanEboy*
> 
> Hey guys, someone just told me my Seasonic 1000w PSU isn't going to cut it with (3) 290x, is that true? My build: 4930k, RIVBE, 4x4GB Dominator GT, (2 - potentially 3) 290x, 256GB 840 Pro SSD, all watercooled - most likely 1 pump. I'm looking to mine when I'm not gaming.. Thoughts on that?


I used that extreme PSU calculator link and you will likely need a bigger PSU.

The boxes I checked were:

CPU 4930k 100% TDP OC to 4.5GHz, High End MoBo, 4 DIMMs of RAM, (3) 290X, Only (1) SSD drive, and a guess of Swiftech MCP655 pump.

This doesn't include other drives, lights, fans or fan controllers, external devices via USB, or any add-in cards if you have them.

Total came to 1200W recommended.

Remove CPU OC and it came down to 1114W.

These calculators aren't definitive, but you'd likely be pushing a 1kW PSU to or beyond its limits with (3) 290X.

http://www.hardocp.com/article/2013/11/01/amd_radeon_r9_290x_crossfire_video_card_review/9#.UwDsqIV0Yj8

Take this HardOCP review as an example. They tested with (2) 290X in Xfire: total system load of 780W, idle of 129W.
780 - 129 = 651W under load
651 / 2 = 325.5W per card
This of course includes the CPU load, but you can likely count on at least another 250W by adding the 3rd card. That would put it at 1030W.
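That back-of-the-envelope estimate can be written out as a small sketch. The 780W/129W figures come from the HardOCP CrossFire review linked above; the 250W allowance for a third card is a guess from the post, not a measured number:

```python
# Figures from the HardOCP 290X CrossFire review cited above;
# the 3rd-card allowance is an estimate, not a measurement.
load_2x290x = 780          # total system load with 2x 290X, watts
idle = 129                 # idle system draw, watts
per_card = (load_2x290x - idle) / 2   # ~325.5W attributable to each card
third_card_allowance = 250            # conservative guess for card #3
total_3x290x = load_2x290x + third_card_allowance  # ~1030W with three cards
print(per_card, total_3x290x)
```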

If you're considering watercooling, the scenario could change slightly, as the cooler the cards run, the more efficient they become.

http://www.legitreviews.com/nzxt-kraken-g10-gpu-water-cooler-review-on-an-amd-radeon-r9-290x_130344/5

See this review, where attaching a single-radiator AIO watercooler produced a 44W decrease in power consumption. 44W * 3 cards is a 132W savings.

If you have room for AIO rads this could be cheaper than buying a new PSU. It would also cut down on noise from the cards and undoubtedly give you OC ability. Hell even if you only attached watercoolers to 2 of them it would help drastically.


----------



## brazilianloser

Quote:


> Originally Posted by *Roboyto*
> 
> Very nice
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I just thought of something else as well that I changed that may or may not be affecting VRM1 temps. I was thoroughly bothered that XSPC did not include any hardware to fill the last two holes on the backplate.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I made a run to Ace Hardware and picked up some nuts.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Used red paper shims on the bottom.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Now the look is complete!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> The extra pressure that is being applied at the end of the backplate from the 2 screws/nuts may be enough for better thermal pad contact on VRM1?


My thoughts exactly. I'll keep that in mind as well for the next time I open up my system for some polishing and to add another rad that is on its way.


----------



## pkrexer

I was wondering that myself, I may have to try that whenever I end up draining my loop.

Quote:


> Originally Posted by *Roboyto*
> 
> Very nice
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I just thought of something else as well that I changed that may or may not be affecting VRM1 temps. I was thoroughly bothered that XSPC did not include any hardware to fill the last two holes on the backplate.
> 
> 
> 
> I made a run to Ace Hardware and picked up some nuts.
> 
> 
> 
> Used red paper shims on the bottom.
> 
> 
> 
> Now the look is complete!
> 
> 
> 
> The extra pressure that is being applied at the end of the backplate from the 2 screws/nuts may be enough for better thermal pad contact on VRM1?


----------



## Roboyto

Quote:


> Originally Posted by *pkrexer*
> 
> I was wondering that myself, I may have to try that whenever I end up draining my loop.


Quote:


> Originally Posted by *brazilianloser*
> 
> My thoughts exactly. Have that in mind as well for the next time I open up my system for some polishing and adding another rad that it is on its way.


A pair of needle-nose pliers, the nuts, and a fair level of patience could probably get them in there. If you have Allen key screwdriver bits, it makes it a lot easier to tighten the screws while holding the nut with the needle-nose pliers.


----------



## SeanEboy

Quote:


> Originally Posted by *Roboyto*
> 
> I used that extreme PSU calculator link and you will likely need a bigger PSU.
> 
> The boxes I checked were:
> 
> CPU 4930k 100% TDP OC to 4.5GHz, High End MoBo, 4 DIMMs of RAM, (3) 290X, Only (1) SSD drive, and a guess of Swiftech MCP655 pump.
> 
> This doesn't include other drives, lights, fans or fan controllers, external devices via USB, or any add-in cards if you have them.
> 
> Total came to 1200W recommended.
> 
> Remove CPU OC and it came down to 1114W.
> 
> These calculators aren't definitive, but you'd likely be pushing a 1kW PSU to or beyond it's limits with (3) 290X.
> 
> http://www.hardocp.com/article/2013/11/01/amd_radeon_r9_290x_crossfire_video_card_review/9#.UwDsqIV0Yj8
> 
> Take this HardOCP review as an example. They tested with (2) 290X in Xfire. Total System load of 780W, and idle of 129W.
> 780-129 = 651
> 651/2 = 325.5
> This of course includes the CPU load, but you can likely count on at least another 250W by adding the 3rd card. That would put it at 1030W.
> 
> If you were considering watercooling the scenario could change slightly as the cooler the cards can run the more efficient they become.
> 
> http://www.legitreviews.com/nzxt-kraken-g10-gpu-water-cooler-review-on-an-amd-radeon-r9-290x_130344/5
> 
> See this review here where by attaching an AIO single radiator watercooler they observed a 44W decrease in power consumption. 44W * 3 cards is 132W savings.
> 
> If you have room for AIO rads this could be cheaper than buying a new PSU. It would also cut down on noise from the cards and undoubtedly give you OC ability. Hell even if you only attached watercoolers to 2 of them it would help drastically.


Well, I was going for a custom loop, so no AIO needed... So, I guess I'm looking at a monster PSU - 1250 and up? Should I just go 1500w EVGA? I wanted to do the Seasonic Platinum, but they only have 1250 in the X series - gold. I'd like to be able to add another 290x down the line without having to swap PSUs... Plus, I have some money to burn.. ;c)


----------



## Roboyto

Quote:


> Originally Posted by *SeanEboy*
> 
> Well, I was going for a custom loop, so no AIO needed... So, I guess I'm looking at a monster PSU - 1250 and up? Should I just go 1500w EVGA? I wanted to do the Seasonic Platinum, but they only have 1250 in the X series - gold. I'd like to be able to add another 290x down the line without having to swap PSUs... Plus, I have some money to burn.. ;c)


If you're planning on adding a 4th one down the line, then it's probably best to buy the 1500W now. 1250W should do you for 3 cards with a little headroom so the PSU runs efficiently, but adding that 4th would put you in the same predicament you're in now going from 2 cards to 3.

My card under a heavy OC of 1255/1675 is pulling ~260W, not including up to 75W from the PCI-E slot.

The 260 I got from 1.352 * 193.5 = 261.61W
Max with PCI-E slot = 336.61W

That's 1.352V VDDC and 193.5A VDDC Current Out in GPU-Z during the Valley benchmark. Is a Valley bench a similar load to coin mining? I'm unsure.
You likely wouldn't have 4 cards that could all clock this high, so I can test at stock or with a slight OC to see.

Stock settings for my card are 980/1250. Mine's a 290, yours are 290Xs. There's slight variance to account for, but this can give you an idea.

1.172 * 142.5 = 167.01W; + 75W = 242.01W

Nearly 100W lower with stock settings compared to large OC.
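The per-card arithmetic above (VDDC times VDDC Current Out from GPU-Z, plus up to 75W from the PCI-E slot) can be sketched like this. The voltage/current readings are the ones quoted in the post; the 75W slot figure is the PCI-E spec maximum, and the whole thing is a rough estimate, not a measurement:

```python
def card_power(vddc_v: float, current_a: float, slot_w: float = 75.0):
    """Estimate card draw from GPU-Z VDDC and VDDC Current Out readings."""
    vrm_w = vddc_v * current_a      # power delivered through the VRM rail
    return vrm_w, vrm_w + slot_w    # (VRM only, VRM + PCI-E slot maximum)

oc_w, oc_max = card_power(1.352, 193.5)        # heavy OC at 1255/1675
stock_w, stock_max = card_power(1.172, 142.5)  # stock 980/1250
print(f"OC: {oc_w:.2f}W ({oc_max:.2f}W max)")
print(f"Stock: {stock_w:.2f}W ({stock_max:.2f}W max)")
```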


----------



## SeanEboy

Quote:


> Originally Posted by *Roboyto*
> 
> If you're planning on adding a 4th one down the line, then it's probably best to buy the 1500W now. 1250 should do you for 3 cards with a little headroom so PSU is running efficiently, but adding that 4th would put you in the same predicament you are in now going from 2 cards to 3.
> 
> My card under heavy OC of 1255/1675 is pulling ~ 260, not including up to 75W from PCI-E slot.
> 
> 260 I got from 1.352*193.5 = 261.61
> Max with PCI-E Slot = 336.61
> 
> That's 1.352V VDDC and 193.5A VDDC Current Out in GPU-Z during the Valley benchmark. Is a Valley bench a similar load to coin mining? I'm unsure.
> You likely wouldn't have 4 cards that could clock this high, so I can test at stock or with slight OC to see.


Thanks a lot for all your assistance... It's tough being a newb, really tough.. Especially when you're trying to jump into a higher tier build like this...


----------



## Brian18741

I'm considering returning my Asus DCU II 290's. They're just too hot in Crossfire, with the top card hitting 94°C. If I run the fans at 100%, the top card maxes out around 85°C, but the noise is ridiculous. I play with headphones on, but the PC is in the sitting room and I don't think the wife will appreciate the noise while she's watching TV etc!

What non-reference-cooler cards have a reference PCB? MSI Gaming Edition? It's gonna take me ages to save up for watercooling, so I need semi-decent aftermarket cooling while I'm saving!


----------



## Roboyto

http://www.techpowerup.com/gpuz/k7gsh/

XFX R9 290 Black Edition - Hynix Memory w/ beast OCability

XSPC Razor Waterblock


----------



## Arizonian

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/k7gsh/
> 
> XFX R9 290 Black Edition - Hynix Memory w/ beast OCability
> 
> XSPC Razor Waterblock


Congrats - added


----------



## Roboyto

Quote:


> Originally Posted by *Brian18741*
> 
> I'm considering returning my Asus DCU II 290's. They're just too hot in crossfire. Top card hitting 94°C. If I play with the fans on 100% the top card maxes out around 85°C but the noise is ridiculous. I play with headphones on but the PC is in the sitting room and I don't think the wife will appreciate the noise of it while watching TV etc!
> 
> What non-reference-cooler cards have a reference PCB? MSI Gaming Edition? It's gonna take me ages to save up for watercooling so I need semi-decent aftermarket cooling while I'm saving!


According to EK's CoolingConfigurator.com, I have found a few:

Gigabyte Windforce 3X (GV-R929OC-4GD Rev.1.0)
MSI Twin Frozr IV 4S (V308-002R)
PowerColor PCS+ (AXR9 290 4GBD5-PPDHE)
Sapphire Tri-X (11227-00)
XFX Black Double Dissipation (R9-290A-EDBD)

If you have the DC2 version of the 290, have you thought about contacting EK to see if the DC2 290X block will fit on your card? From looking at pictures of both the DC2 290 and 290X with HSF removed, they seem identical to me...but an inquiry to them would be your best bet.

You can examine both yourself if you'd like

TechPowerUp with the 290X
http://www.techpowerup.com/reviews/ASUS/R9_290X_Direct_Cu_II_OC/images/front.jpg

KitGuru with the 290
http://www.kitguru.net/wp-content/uploads/2014/02/ACC_5415_DxO.jpg
http://www.kitguru.net/components/graphic-cards/zardon/asus-r9-290-direct-cu-ii-oc-review-1600p-ultra-hd-4k/2/


----------



## Roboyto

Quote:


> Originally Posted by *Brian18741*
> 
> I'm considering returning my Asus DCU II 290's. They're just too hot in crossfire. Top card hitting 94°C. If I play with the fans on 100% the top card maxes out around 85°C but the noise is ridiculous. I play with headphones on but the PC is in the sitting room and I don't think the wife will appreciate the noise of it while watching TV etc!
> 
> What non-reference-cooler cards have a reference PCB? MSI Gaming Edition? It's gonna take me ages to save up for watercooling so I need semi-decent aftermarket cooling while I'm saving!


Clean rig









Was looking at pics of your rig and maybe some additional airflow to the cards will help. DC2 cards usually do a decent job of cooling.

Do you have a fan spot available on your side panel? If it's positioned properly it could give much needed fresh air to your top card.
Do you normally have the side panel off? Are your front fans pulling fresh air in? With the side panel on it may help air flow through the case and evacuate the heat better than with it off.

I see an open bottom 120mm fan spot. Try adding a fan there to bring some cool air to the bottom card. If the bottom card runs cooler, so will the top.

How does that 750W PSU fare with the two cards? I wouldn't have imagined 750W would hold up for 2 of these cards.


----------



## Paul17041993

Quote:


> Originally Posted by *Monkeysphere*
> 
> CPU usage is fine, about 80-90% across all 4 cores in BF4 64 MP. A single 290 runs BF4 at about 60fps and when I enable Crossfire I drop to about 50fps. Not a CPU issue. Also no reason to be going down in frames with Crossfire enabled and an AMD optimized game at that.
> 
> Also not a PCIE or RAM bottleneck issue since Valley benches just fine. All benchmarks in fact are great. In game performance however is disappointing.
> 
> The nice thing I suppose is that 1 card gets me by at 2560x1440 so I have some time for drivers to maybe improve the situation.


BF4 is CPU-bound; your render thread is being flooded with everything else and can't feed the cards quickly enough. The same would happen on an SLI setup. Mantle, however, skips the software pre-processing in the drivers and would perform a lot better, but you might still have to wait a bit for it to be ready on Crossfire setups...


----------



## Roboyto

Quote:


> Originally Posted by *Paul17041993*
> 
> BF4 is CPU-bound; your render thread is being flooded with everything else and can't feed the cards quickly enough. The same would happen on an SLI setup. Mantle, however, skips the software pre-processing in the drivers and would perform a lot better, but you might still have to wait a bit for it to be ready on Crossfire setups...


Yup. BF4 will suck up every MHz of every component you have. CPU and RAM clocks help significantly with BF4. A jump to an i7 would likely help you out as well.

http://www.overclock.net/t/1442038/build-log-the-hawaiian-heat-wave/120

Page 13, second post. Going from 1333 to 2400 RAM speed netted a 60% gain in average frame rates!


----------



## Mercy4You

Quote:


> Originally Posted by *Roboyto*
> 
> Yup. BF4 will suck up every MHz of every component you have. CPU and RAM clocks help significantly with BF4. A jump to an i7 would likely help you out as well.
> 
> http://www.overclock.net/t/1442038/build-log-the-hawaiian-heat-wave/120
> 
> Page 13, second post. Going from 1333 to 2400 RAM speed netted a 60% gain in average frame rates!


Do you know if there's any gain to be made by overclocking my 2133MHz RAM?


----------



## Roboyto

Quote:


> Originally Posted by *Mercy4You*
> 
> Do you know if there's any gain to be made by overclocking my 2133MHz RAM?


You're already a fair amount ahead of 1333 or 1600, but there is always room for improvement.







This is OCN after all.

You must also remember, not every game will scale like BF4 since it is coded to take advantage of modern systems.

This post from Xbitlabs is nearly 2 years old and covers Ivy Bridge, but it still gets the point across, as it applies to Ivy/Sandy/Haswell. They say 2133 is the best bang for the buck since you get most of the performance increase.

http://www.xbitlabs.com/articles/memory/display/ivy-bridge-ddr3_4.html#sect0

Faster memory reads/writes can help with everything running better. It's just that not everything will see the ridiculous gains that BF4 does.


----------



## Brian18741

Quote:


> Originally Posted by *Roboyto*
> 
> Clean rig
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Was looking at pics of your rig and maybe some additional airflow to the cards will help. DC2 cards usually do a decent job of cooling.
> 
> Do you have a fan spot available on your side panel? If it's positioned properly it could give much needed fresh air to your top card.
> Do you normally have the side panel off? Are your front fans pulling fresh air in? With the side panel on it may help air flow through the case and evacuate the heat better than with it off.
> 
> I see an open bottom 120mm fan spot. Try adding a fan there to bring some cool air to the bottom card. If the bottom card runs cooler, so will the top.
> 
> How does that 750W PSU fare with the two cards? I wouldn't have imagined 750W would hold up for 2 of these cards.


Thanks man









Airflow-wise, the case is always closed. Fan controller maxed whilst gaming. I have two 120mm intakes on the front, one 140mm intake on the side blowing directly onto the cards, one 120mm rear exhaust, and two 120s exhausting up through the H100 rad on top. Might throw in another 120mm intake on the floor as you suggested and see if it makes a difference.

The 750W PSU seems fine with them, tbh. I've been keeping an eye on it with a power monitor, and the whole system typically pulls between 550W and 650W from the wall whilst gaming, so there's a good bit of room to play with. I did see a brief spike up to about 730W after a 10-min run of Furmark, but to be fair, that wouldn't be typical use. I was worried about it myself, but I spent some time asking around the Power Supply forum here and they reckon the 750 will be fine!

Finally, I looked on the EK site; the configurator shows WBs for the DCU II, but you can't select them and there is no price. Do they exist today? I've done a bit of googling and can only find "EK _announce_ WB for DCU II" etc.


----------



## taem

Quote:


> Originally Posted by *Roboyto*
> 
> Your core speed of 1230 is exactly where I started getting artifacting in Valley before I went under water, and I suspect it is due to VRM temps because I did end up with 1260 as best core speed for Valley. Try backing your core speed down 5-10 and see if you can squeeze some more out of the memory. Some benches/games don't rely on RAM speed as much...
> 
> If your VRM2 temps are still in check, why not try to push the memory a little more? This is OCN after all


Are you saying the artifacts are due to vrm1 temp? Shouldn't be, I don't like 90c but vrm1s are rated for well above that. Mem oc, I personally don't see a real world benefit to higher mem clocks. Like JJ from Asus likes to say, "core is king." But I have found, maybe it's my imagination, that higher mem can help core oc stability. This is an Elpida card though, dunno how much higher than 1500 I can go, everyone is always saying elpida does not oc well.

Anyway, just gamed a while at 1200 core, 1500 mem, I appear to be stable, but vrm1 temp way too high -- hit 99c at the end of a 2 hour Metro LL session. No artifacts though. But I'm thermal limited with my oc here. Maybe vrm1 sink isn't on right. Core hit 72c, vrm2 hit 50c. Lots of thermal headroom there, except I run into that vrm1 brick wall.
Quote:


> Originally Posted by *Roboyto*
> 
> Was looking at pics of your rig and maybe some additional airflow to the cards will help. DC2 cards usually do a decent job of cooling.
> 
> Do you have a fan spot available on your side panel? If it's positioned properly it could give much needed fresh air to your top card.
> Do you normally have the side panel off? Are your front fans pulling fresh air in? With the side panel on it may help air flow through the case and evacuate the heat better than with it off.
> 
> I see an open bottom 120mm fan spot. Try adding a fan there to bring some cool air to the bottom card. If the bottom card runs cooler, so will the top.
> 
> How does that 750W PSU fare with the two cards? I wouldn't have imagined 750W would hold up for 2 of these cards.


The DCU II is great for the 780, but not so great for AMD R9s imho. I had a 280X DCU II, the cooler doesn't fit quite right. They just slapped the cooler designed for the 780 on these cards. Very quiet at stock and built like a tank though. Side fan and bottom fan make a huge difference for me on temps. Bottom fan especially. But side fan lowers my temps 3-5c. If you don't oc you can easily run 290 crossfire on a 750w psu imho. At stock, my factory overclocked card only draws 220w.
Quote:


> Originally Posted by *rdr09*
> 
> some benches, though, like Valley - the cpu does not make much of a difference compared to some games


Valley might be less cpu dependent than other benches but cpu still makes enough of a difference that you can't cross compare. GrinderCAN has the same card I do, but he has a 8350 while I have a 4670k, I get 4-5 fps more than he does at stock. 5 fps in Valley is a big difference.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Mercy4You*
> 
> Do you know if there's any gain to be made by overclocking my 2133MHz RAM?


It's not so much the DRAM speed; it's the tighter DRAM timings and lower latency that do it.
So the answer to your question is YES


----------



## Roboyto

Quote:


> Originally Posted by *Brian18741*
> 
> Thanks man
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Airflow wise the case is always closed. Fan controller maxed whilst gaming. I have two 120mm intakes on the front. One 140mm intake on the side directly onto the cards. One 120mm rear exhaust and two 120's exhausting up through the H100 rad on top. Might throw in another 120mm intake on the floor as you suggested and see if it makes a difference.
> 
> The 750W PSU seems fine with them tbh. I've been keeping an eye on it with a power monitor and the whole system typically pulls between 550W and 650W from the wall whilst gaming, so a good bit of room to play with. I did see a brief spike up to about 730W after a 10 min run on Furmark but to be fair, that wouldn't be typical use. I was worried about it myself but I spent some time asking around the Power Supply forum here and they reckon the 750 will be fine!
> 
> Finally, I looked on the EK site; the configurator shows WBs for the DCU II but you can't select them and there is no price. Do they exist today? Done a bit of googling and can only find "EK _announce_ WB for DCU II" etc


Yeah I don't even bother using FurMark to judge anything anymore. It just pulls unrealistic amounts of power to create unrealistic amounts of heat.









Maybe you can fab up something to bring a fan closer to the end of both cards? Attach one to your HDD cage somehow?

Your other option is an AIO cooler on one or both cards. You could probably fit a rad at the rear 120 and another at that bottom 120 spot









Kraken G10 sold out everywhere at the moment
https://www.nzxt.com/product/detail/138-kraken-g10-gpu-bracket.html

Option #2
http://keplerdynamics.com/sigmacool/radeonmkii
You'd need sinks on the VRMs and airflow over them with this one.

Not sure what happened to TriptCC, can't find his site anymore.


----------



## Abyssic

Quote:


> Originally Posted by *Roboyto*
> 
> Not sure what happened to TriptCC, can't find his site anymore.


maybe nzxt has something to do with this...


----------



## Mercy4You

Quote:


> Originally Posted by *Roboyto*
> 
> You're already a fair amount ahead of 1333 or 1600, but there is always room for improvement.
> 
> 
> 
> 
> 
> 
> 
> This is OCN after all.
> 
> You must also remember, not every game will scale like BF4 since it is coded to take advantage of modern systems.
> 
> This post from Xbitlabs is nearly 2 years old and regarding Ivy Bridge, but it can still get the point across as it applies for Ivy/Sandy/Haswell. They say 2133 is best bang for buck since you get most of the performance increase.
> 
> http://www.xbitlabs.com/articles/memory/display/ivy-bridge-ddr3_4.html#sect0
> 
> Faster memory reads/writes can help with everything running better. It's just that not everything will see the ridiculous gains that BF4 does.


Thx for the link with physical proof







I did see better results in some benchmarks when I overclocked my RAM, but I was not aware of the gain to be made in BF4. In that game I can use every FPS I can squeeze out of my system to reach my target of 120 FPS


----------



## Roboyto

Quote:


> Originally Posted by *Brian18741*
> 
> Thanks man
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Airflow wise the case is always closed. Fan controller maxed whilst gaming. I have two 120mm intakes on the front. One 140mm intake on the side directly onto the cards. One 120mm rear exhaust and two 120's exhausting up through the H100 rad on top. Might throw in another 120mm intake on the floor as you suggested and see if it makes a difference.
> 
> The 750W PSU seems fine with them tbh. I've been keeping an eye on it with a power monitor and the whole system typically pulls between 550W and 650W from the wall whilst gaming, so a good bit of room to play with. I did see a brief spike up to about 730W after a 10 min run on Furmark but to be fair, that wouldn't be typical use. I was worried about it myself but I spent some time asking around the Power Supply forum here and they reckon the 750 will be fine!
> 
> Finally, I looked on the EK site; the configurator shows WBs for the DCU II but you can't select them and there is no price. Do they exist today? Done a bit of googling and can only find "EK _announce_ WB for DCU II" etc.


Block probably isn't out yet, but I would shoot EK an e-mail to see if they can possibly give you an ETA and some information if it will fit on the 290.

Have you swapped out the stock thermal paste yet? That has always made a good change for me on any GPU I've done it to.


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> Are you saying the artifacts are due to vrm1 temp? Shouldn't be, I don't like 90c but vrm1s are rated for well above that. Mem oc, I personally don't see a real world benefit to higher mem clocks. Like JJ from Asus likes to say, "core is king." But I have found, maybe it's my imagination, that higher mem can help core oc stability. This is an Elpida card though, dunno how much higher than 1500 I can go, everyone is always saying elpida does not oc well.


I know they are rated beyond 90C, but I couldn't push my core past 1215 in Valley before my card went under water. The core wasn't heat limited at that point as it was maxing at 88C on stock blower. VRMs could just react better when temps are lower.

Never know how 'bad' the Elpida is until you try and push it. It won't affect your other temps so you may as well give it a go


----------



## Roboyto

Quote:


> Originally Posted by *Abyssic*
> 
> maybe nzxt has something to do with this...


Bought him out?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Roboyto*
> 
> I know they are rated beyond 90C, but I couldn't push my core past 1215 in Valley before my card went under water. The core wasn't heat limited at that point as it was maxing at 88C on stock blower. VRMs could just react better when temps are lower.
> 
> Never know how 'bad' the Elpida is until you try and push it. It won't affect your other temps so you may as well give it a go


Back when I had my Gelid on my 290, I couldn't even hold 1200MHz at +162mV because VRM1 temps would skyrocket to 90C+.

On the stock blower I could hold 1200 no problem; now I can hold 1215 or so at the same voltage.

VRM1 temps have a huge effect on stability, as well as power draw. Higher temps make the VRMs much less efficient, drawing more power and creating more heat.

Now my VRM1 temps don't go over 40C, my core doesn't go over 48C, and I can hold 1240MHz with +200mV no problem.

Also, people who say all Elpida clocks poorly don't know what they're talking about. My old 7950 had Elpida and clocked up to 1700MHz no problem; my 290 has Elpida and can hold 1625MHz. I didn't max out either of these, so I still don't know the actual upper limit of my 290's RAM speed.

There's not a massive performance increase, so I usually keep it at 1500MHz, but for 3DMark11 a couple hundred points can be had by raising my memory OC.
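The efficiency point above can be put in rough numbers. A toy sketch (the load and efficiency figures are assumptions for illustration, not measurements from any card):

```python
# Toy model: a VRM delivering a fixed load draws more input power,
# and sheds more waste heat, as its efficiency drops with temperature.
def vrm_input_power(load_w, efficiency):
    """Input power needed to deliver load_w at a given efficiency."""
    return load_w / efficiency

load_w = 250.0                          # assumed GPU core load in watts
cool = vrm_input_power(load_w, 0.90)    # assumed efficiency when cool
hot = vrm_input_power(load_w, 0.80)     # assumed efficiency when hot

print(f"cool: {cool:.1f} W in, {cool - load_w:.1f} W wasted as heat")
print(f"hot:  {hot:.1f} W in, {hot - load_w:.1f} W wasted as heat")
```

With these made-up numbers, a 10-point efficiency drop more than doubles the waste heat, which is the "more power, more heat" feedback loop described above.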


----------



## Abyssic

Quote:


> Originally Posted by *Roboyto*
> 
> Bought him out?


possibly. i could totally imagine that - it was after all his design which they sold...


----------



## Roboyto

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Back when I had my Gelid on my 290, I couldn't even hold 1200MHz on +162mV because VRM1 temps would skyrocket to 90c+.
> 
> On the stock blower I could hold 1200 no problem, now I can hold 1215 or so on the same voltage.
> 
> VRM1 temps have a huge effect on stability, as well as power draw. Higher temps make the VRM's much less efficient, drawing more power, and creating more heat.
> 
> Now my VRM1 temps don't go over 40c, and my core doesn't go over 48c, and I can hold 1240MHz with +200mV no problem.
> 
> Also, people who say all Elpida clocks poorly don't know what they're talking about. My old 7950 had Elpida and clocked up to 1700MHz no problem, my 290 has Elpida and can hold 1625MHz. I didn't max out both of these, either. I still don't know what the actual upper limit of my 290's ram speed is.
> 
> There's not a massive performance increase, so I usually keep it at 1500MHz, but for 3dMark11 a couple hundred points can be had by raising my memory OC.


40C on VRM1 is impressive. Mine is maxing at 63 now under benchmarks and low 50s while gaming. Did you upgrade the thermal pads? I wonder what brand of thermal pad XSPC includes with the Razor blocks.

From one of my earlier posts:
Quote:


> Some benches/games don't rely on RAM speed as much, especially with enormous 512-bit bus. However, I did check in my testing using my best core speeds while keeping memory clock at stock.
> 
> Here's the break down:
> 
> Heaven: 6.86% decrease in score
> Valley: 8.72% decrease in score
> 3DMark11 Performance Run: 1.98% decrease in score
> 3DMark11 Xtreme Run: 3.72% Decrease in score
> FFXIV Benchmark: 4.47% Decrease in score


~2%-9% in this small handful of benchmarks. I'll take any free performance I can get
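The percentages in the quoted breakdown are just relative score drops; a minimal sketch of the arithmetic (the scores below are hypothetical placeholders, not the actual benchmark results):

```python
def pct_decrease(oc_mem_score, stock_mem_score):
    """Percent of score lost by running memory at stock instead of OC'd."""
    return (oc_mem_score - stock_mem_score) / oc_mem_score * 100.0

# hypothetical scores for illustration only
score_with_mem_oc = 1000
score_stock_mem = 930
print(f"{pct_decrease(score_with_mem_oc, score_stock_mem):.2f}% decrease")
```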


----------



## NathG79

Hi guys,

I'll try and keep this short before I impulse buy. LOL!

I have a Sabertooth Z87 running a 4770K, which I upgraded to from a Crosshair IV Formula with a 1090T six-core ticking over at 4.2GHz, and two XFX 7970 DD BE in CrossFire on a 2560x1440 monitor. I've had the Sabertooth/4770K for about two months now, and one of my 7970s recently packed up. I had just bought a Kraken G10 for one of them on an Antec Kühler 620 and fitted it, purchased another Antec 620 for the second card, and then it packed up. I'm in the process of selling the faulty 7970 on eBay.

I now have one 7970 running a Kraken G10 with an Antec Kühler 620, which ticks over at 29C, full load about 65C, which I'm very pleased with.

Shall I buy two new Sapphire R9 290X Tri-X GPUs?

Shall I buy one R9 280X and another Kraken G10 and CrossFire?

Tell me what I should do.


----------



## Roboyto

Quote:


> Originally Posted by *NathG79*
> 
> Hi Guys.
> 
> I`ll try and keep this short before I impulse buy. LOL!
> 
> I have a sabertooth z87 running a 4770K which I uprgraded from a Crosshair Formula Iv. It had a 1090t six core ticking over a 4.2Ghz.. Oh, and two XFX 7970 DD BE in Xfire on a 2560 x1440p monitor. I have had the sabertooth/4770K for about 2months now, one of my 7970`s recently packed up. I had just bought the Kraken G10 for one of them, on a antec kuler 620. fitted it. purchased another antec 620 for the second card, and it packed up. I am in the process of selling the faulty 7970 on ebay.
> 
> I now have one 7970 running a Kraken G10 with an antec kuler 620, which ticks over at 29c, full load about 65c. which im very pleased with.
> 
> Shall I buy two new sapphire R9 290x tri-x GPU`s?
> 
> Shall I buy one r9 280x and another Kraken G10 and Xfire?.
> 
> tell me what I should do.


Can't RMA the 7970 DD? They have lifetime warranty no?

Prices are sky high still, so it's hard to say. If you have the money, and for future proofing purposes, the 290 or 290X is a good choice.


----------



## Abyssic

Quote:


> Originally Posted by *NathG79*
> 
> Hi Guys.
> 
> I`ll try and keep this short before I impulse buy. LOL!
> 
> I have a sabertooth z87 running a 4770K which I uprgraded from a Crosshair Formula Iv. It had a 1090t six core ticking over a 4.2Ghz.. Oh, and two XFX 7970 DD BE in Xfire on a 2560 x1440p monitor. I have had the sabertooth/4770K for about 2months now, one of my 7970`s recently packed up. I had just bought the Kraken G10 for one of them, on a antec kuler 620. fitted it. purchased another antec 620 for the second card, and it packed up. I am in the process of selling the faulty 7970 on ebay.
> 
> I now have one 7970 running a Kraken G10 with an antec kuler 620, which ticks over at 29c, full load about 65c. which im very pleased with.
> 
> Shall I buy two new sapphire R9 290x tri-x GPU`s?
> 
> Shall I buy one r9 280x and another Kraken G10 and Xfire?.
> 
> tell me what I should do.


i'd go for the 290x tri-x. i bought one myself to move away from 7950 crossfire. i guess prices are ok since you're in the uk?


----------



## Brian18741

Quote:


> Originally Posted by *Roboyto*
> 
> Block probably isn't out yet, but I would shoot EK an e-mail to see if they can possibly give you an ETA and some information if it will fit on the 290.
> 
> Have you swapped out the stock thermal paste yet? That has always made a good change for me on any GPU I've done it to.


Thanks man, just emailed EK now and asked about the waterblock. Sent them a message on Facebook as well. Will update with results if I get a reply.

Just ordered another 120mm fan for the bottom of the case as well, so hopefully that won't be long getting delivered.

Didn't think about changing the TIM. I have some upstairs; might give it a go during the week if I get a chance.

Will the Kraken work on CrossFire cards? Will it actually fit?


----------



## NathG79

No lifetime warranty in the UK... it's 20 days out of a 24-month warranty.


----------



## NathG79

Quote:


> Originally Posted by *Abyssic*
> 
> i'd go for the 290x tri-x. i bought it myself to move away from 7950
> crossfire. i guess prices are ok since you are in the uk?


No lifetime warranty in the UK... it's 20 days out of a 24-month warranty.

Prices are crap: £450 for a Tri-X R9 290X.


----------



## Roboyto

Quote:


> Originally Posted by *NathG79*
> 
> no lifetime warranty in uk..its 20 days out of 24month warranty.












I keep forgetting to look at people's location


----------



## Roboyto

Quote:


> Originally Posted by *Brian18741*
> 
> Thanks man, just emailed EK there now and asked about the waterblock. Sent em a message on FaceBook as well. Will update results if I get a reply.
> 
> Just ordered another 120mm fan for the bottom of the case as well so hopefully that won't be long getting delivered.
> 
> Didn't think about changing the TIM. have some upstairs, might give it a go during the week if I get a chance.
> 
> Will the kracken work on crossfire cards? Will it actually fit?


My bro has CrossFire 7950s with TriptCC brackets that are very similar to the Kraken G10... I would imagine they would fit? Just found this from another user on here:

http://www.overclock.net/content/type/61/id/1829875/

I'm going to text him and ask him to e-mail me a picture of his rig.

Swap out that TIM! I will bet good money that your temps will come down! Got some MX-4 or similar that doesn't need a cure/break-in period? I've grown quite fond of the Xigmatek PTI-G4512 as it doesn't need a break-in; instantly lower temps. It's $10 and has a 25% coupon at Newegg right now:
http://www.newegg.com/Product/Product.aspx?Item=N82E16835233069


----------



## NathG79

Quote:


> Originally Posted by *NathG79*
> 
> no lifetime warranty in uk..its 20 days out of 24month warranty.
> 
> 24month warranty is bs.. you can spend 400pound plus on a card..it will go wrong in a year.replacement rma lasts just over a year.boom..no gpu..no warranty. no nuffin...supplier walks..speaking from experience.
> prices are crap 450gbp for a tri-x r9 290x


No lifetime warranty in the UK... it's 20 days out of a 24-month warranty.


----------



## NathG79

Sorry... on XFX hate at the moment. Never again.


----------



## Abyssic

Quote:


> Originally Posted by *NathG79*
> 
> sorry..on xfx hate at the moment.never again


i've also had very bad experiences with xfx and i won't buy anything from them again xD


----------



## hotrod717

Quote:


> Originally Posted by *Roboyto*
> 
> My bro has Xfire 7950's with TriptCC brackets that are very similar to the Kraken G10...I would imagine they would fit? Just found this from another user on here
> 
> http://www.overclock.net/content/type/61/id/1829875/
> 
> I'm going to text him and ask him to e-mail me a picture of his rig.
> 
> Swap out that TIM! I will bet good $ that your temps will come down! Got some MX4 or similar that doesn't need a cure/break in period? I've grown quite fond of the Xigmatek PTI-G4512 as it doesn't need a break in, instantly lower temps
> 
> 
> 
> 
> 
> 
> 
> . It's $10 and has 25% coupon at NewEgg right now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16835233069


EK recommends Gelid and so do I. Top tier. Anything better is metallic.


----------



## Roboyto

Quote:


> Originally Posted by *Brian18741*
> 
> Will the kracken work on crossfire cards? Will it actually fit?


Pics from my Bro





I forgot he has one with a fan mount and the other without. However, if you have two empty PCI-E slots' worth of spacing you should be alright, it seems.

Also, if you've never tried the Arctic TIM remover I would highly suggest it! It's also 25% off right now.

http://www.newegg.com/Product/Product.aspx?Item=N82E16835100010


----------



## Roboyto

Quote:


> Originally Posted by *hotrod717*
> 
> Ek recommends Gelid and so do I. Top tier. Anything better is metallic.


Gelid GC Extreme is also solid stuff. Also 25% off at the Egg:

http://www.newegg.com/Product/Product.aspx?Item=N82E16835426020


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Roboyto*
> 
> 40C on VRM1 is impressive. Mine is maxing at 63 now under benchmarks and low 50s while gaming. Did you upgrade thermal pads? I wonder what brand thermal pad XSPC includes with the Razor blocks


Thanks, my goal was to get as low temps as possible.

I used FujiPoly Ultra Extreme pads on the VRMs instead of the default EK pads. I used regular Ultra pads on the memory to save cash. I also have 600mm of 65mm-thick rad space for only a 290 and a 3770K at 1.4V. I purposely left room for a second 290, but with prices the way they are it's gonna be a while til I go CF.

Also, my UPS is only capable of outputting 865 watts, so I'll have to upgrade that. I don't want to run all this expensive hardware on potentially dirty power, and a rig with 2 heavily oc'ed 290's will pull way more than that from the wall. Currently I've pulled about 540 watts max during a suicide run of 3dMark11. This was with like +225mV I think.
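A quick back-of-the-envelope check of that UPS concern (the 865 W limit and 540 W single-card draw come from the post above; the extra draw assumed for a second overclocked 290 is a guess):

```python
# Rough UPS headroom estimate; negative headroom means overload.
ups_capacity_w = 865       # UPS output limit (from the post)
current_draw_w = 540       # max wall draw with one OC'd 290 (from the post)
second_290_w = 350         # assumed additional draw for a second OC'd 290

estimated_total_w = current_draw_w + second_290_w
headroom_w = ups_capacity_w - estimated_total_w
print(f"estimated total: {estimated_total_w} W, headroom: {headroom_w} W")
```

Under those assumptions the two-card rig would exceed the UPS's output, which supports the plan to upgrade it before going CrossFire.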


----------



## Roboyto

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Thanks, my goal was to get as low temps as possible.
> 
> I used FujiPoly Ultra Extreme pads on the VRM's instead of the default EK pads. I used regular Ultra pads on the memory to save cash. I also have 600mm of 65mm thick rad space for only a 290 and a 3770k at 1.4v. I purposely left room for a 290, but with prices the way they are it's gonna be a while til I go CF.
> 
> Also, my UPS is only capable of outputting 865 watts, so I'll have to upgrade that. I don't want to run all this expensive hardware on potentially dirty power, and a rig with 2 heavily oc'ed 290's will pull way more than that from the wall. Currently I've pulled about 540 watts max during a suicide run of 3dMark11. This was with like +225mV I think.


I was looking at those on the FrozenCPU site: $26 for the 60mm x 50mm x 1mm pad. Checked my installation instructions; the XSPC Razor block uses 1mm for all pads.

Is one of these 60x50 pads big enough to do the whole card? I just bought a vernier caliper but left it at work, otherwise I'd be measuring the pads on the stock HSF right now.

I'm thinking I'm going to have to order up some of this wondrous thermal pad.


----------



## taem

3dmark results for Powercolor PCS+ 290 at 1200 core 1500 mem.





I'm stable beyond this but VRM1 is going into the 90s, so these are my target clocks; now I need to lower the VDDC offset. If I can't get VRM1 down to the low 80s at most, I will downclock the core.

Does it matter where I have the power limit set? Right now I have it at +50, but doesn't this only kick in if needed? Any downside to keeping it at +50?


----------



## Abyssic

Quote:


> Originally Posted by *taem*
> 
> 3dmark results for Powercolor PCS+ 290 at 1200 core 1500 mem.
> 
> 
> 
> 
> 
> I'm stable beyond this but vrm1 is going into 90s so these are my target clocks, now need to lower vddc offset. If I can't get vrm1 down to low 80s at most I will downlock the core.
> 
> Does it matter where I have Power Draw set? Right now I have it at +50 but doesn't this only kick in if needed? Any downside to keeping it at +50?


i will download 3D Mark now to compare ^^ i'm interested in how the lower cpu but higher gpu will work out. how is your cpu clocked?
i just looked up my old scores with two 7950s and they are plain even with yours ^^

btw i also have that question about the power limit


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Roboyto*
> 
> I was looking at those on FrozenCPU site. $26 for for the 60mm*50mm*1mm pad. Checked my installation instructions, XSPC Razor block uses 1mm for all pads.
> 
> Is one of these 60*50 big enough to do the whole card? I just bought a Vernier Caliper but left it at work, otherwise I'd be measuring the pads on the stock HSF right now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm thinking I'm going to have to order up some of this wonderous thermal pad


Nope. FCPU actually ended up giving me more than I bought (I bought the 100x15x1mm Ultra Extreme and got about 120x25mm; for the .5mm I bought 150x100 and got about 160x125mm), so I had more than enough. However, I recommend the 100x15 Ultra Extreme because it's better to have one continuous sheet for the VRMs than two pieces. I used a good chunk of the 100x15. It will be nowhere near enough for all the RAM chips, but I only used about 1/3 of the 150x100 sheet. I figured I'd be better off with more rather than less, and if I mess up a cut or something I won't have to wait for a new order.

I don't really think the RAM modules need Ultra Extreme pads as they don't get nearly as hot as the VRMs.

The Ultra Extreme 100x15x1mm was $20 when I got it, the 150x100x.5mm was $17, but I can't imagine the 1mm would be much more.


----------



## taem

Quote:


> Originally Posted by *Abyssic*
> 
> i will download 3D Mark now to compare ^^ i'm interested how the lower cpu but higher gpu will work out. how is your cpu clocked?
> i just looked up my old scores with two 7950s and they are plain even with yours ^^
> 
> btw i also have that question about the power limit


CPU is a 4670K @ 4.6.

Btw, I have a question about these test results. In the details my CPU OC clock is shown, but the GPU OC clocks are not.

Does that mean the test ran at stock clocks? Or does this screen just not show GPU OC clocks?


----------



## Monkeysphere

Quote:


> Originally Posted by *Abyssic*
> 
> you have some weird issue going on. have you tried the 2nd gpu on its own?


Tried them both with no issues but will disable pcie slots and try them again individually for the sake of it. I know the 2 cards have different BIOS versions so maybe that's it?

VRM temps are around 50℃ to 60℃.

I am however running a 1000w PSU which I would have thought would be fine but maybe not?

Also saw some people talking about TIM. One card has EK Ecotherm and the other has Gelid GC Extreme. There is 0 difference in temps.


----------



## pkrexer

Think I might pick up some of those pads as well. I'm already assembling a laundry list of things I want to do when I drain my loop... guess I'll add to it.
Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Nope. FCPU actually ended up giving me more than I bought (I bought the 100x15x1mm Ultra Extreme, I got about 120x25mm. For the .5mm I bought 150x100, got about 160x125mm) so I had more than enough. However, I recommend the 100x15 Ultra Extreme because it's better to have one continuous sheet for VRM's then two pieces. I used a good chunk of the 100x15. It will be nowhere near enough for all the RAM's, but I only used about 1/3 of the 150x100 sheet. I figured I'd be better off with more than less, and if I mess up a cut or something I won't have to wait for a new order.
> 
> I don't really think the RAM modules need Ultra Extreme pads as they don't get nearly as hot as the VRM's.
> 
> The Ultra Extreme 100x15x1mm was $20 when I got it, the 150x100x.5mm was $17 but I can't imagine the 1mm would be much more.


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> Cpu is 4670k @ 4.6
> 
> Btw I have a question about these test results. In details, my cpu oc clock is shown, but gpu oc clocks are not.
> 
> 
> 
> Does that mean the test ran with stock clocks? Or this screen just doesn't show gpu oc clocks?


3DMark doesn't always seem to report the correct clocks for the CPU or GPU. Check GPU-Z to see usage throughout the benchmark; I highly doubt you were at stock clocks. You can always run the bench at stock clocks to make sure there is a performance increase.


----------



## Roboyto

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Nope. FCPU actually ended up giving me more than I bought (I bought the 100x15x1mm Ultra Extreme, I got about 120x25mm. For the .5mm I bought 150x100, got about 160x125mm) so I had more than enough. However, I recommend the 100x15 Ultra Extreme because it's better to have one continuous sheet for VRM's then two pieces. I used a good chunk of the 100x15. It will be nowhere near enough for all the RAM's, but I only used about 1/3 of the 150x100 sheet. I figured I'd be better off with more than less, and if I mess up a cut or something I won't have to wait for a new order.
> 
> I don't really think the RAM modules need Ultra Extreme pads as they don't get nearly as hot as the VRM's.
> 
> The Ultra Extreme 100x15x1mm was $20 when I got it, the 150x100x.5mm was $17 but I can't imagine the 1mm would be much more.


Yeah the price for this stuff is quite high. I'll go with enough Ultra Extreme for the VRMs only. Then which did you say for RAM?


----------



## Abyssic

Quote:


> Originally Posted by *taem*
> 
> Cpu is 4670k @ 4.6
> 
> Btw I have a question about these test results. In details, my cpu oc clock is shown, but gpu oc clocks are not.
> 
> 
> 
> Does that mean the test ran with stock clocks? Or this screen just doesn't show gpu oc clocks?


it sometimes doesn't display the right clocks, don't worry.

so i just did the test and i'm a bit sad xD i lost around 2000 gpu points when i went away from cf 7950s.

my firestrike standard test was 9654.

i'm not running my usual cpu overclock right now, so that's definitely making the score worse.
but it's a bit weird that i only lost around 300 P points in 3D Mark 11 (from P13490 to P13159), which is around a 2.5% loss, compared to a 6.2% loss in 3D Mark (2013)... god i hate that "name"


----------



## SeanEboy

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> http://www.techpowerup.com/gpuz/k7gsh/
> 
> XFX R9 290 Black Edition - Hynix Memory w/ beast OCability
> 
> XSPC Razor Waterblock


I wonder if the black edition only comes with hynix???


----------



## kpoeticg

That would be awesome to know. I still have 2 more 290x's to order soon, and my Powercolor with Elpida is driving me friggin loco with all these random black screens.


----------



## vieuxchnock

*I have just installed my Kryographics waterblock.

http://www.servimg.com/image_preview.php?i=475&u=17159996

Do you think I can paint those ugly pink caps black, maybe with a brush?
*


----------



## kpoeticg

Plastidip would probably do the trick, or some type of vinyl or stickers.


----------



## vieuxchnock

*Thanks. Sure I will try something because in my Prodigy, you see those caps thru the side window.*


----------



## Roboyto

Quote:


> Originally Posted by *SeanEboy*
> 
> I wonder if the black edition only comes with hynix???


I ordered a pair of them on a NewEgg combo deal for $899 back in January. The two cards were 5 serial numbers apart, both were locked, but both had Hynix memory.

Also worth mentioning the first card was a lousy overclocker, unlike the beauty that I kept


----------



## Paul17041993

Quote:


> Originally Posted by *vieuxchnock*
> 
> I have just installed my Kryographics waterblock.
> 
> http://www.servimg.com/image_preview.php?i=475&u=17159996
> 
> Do you think I can paint those ugly pink caps in black maybe with a brush?


sure, you can paint them however you wish; however, I'm not sure how the manufacturer would feel about it if you ever needed to RMA. try to record the cap models if you can, or use something easily removable...

side note: what's with the centered, blue and bold text...?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Roboyto*
> 
> Yeah the price for this stuff is quite high. I'll go with enough Ultra Extreme for the VRMs only. Then which did you say for RAM?


I think it's called FujiPoly Extreme. They have 3 different varieties, Ultra Extreme is the best and Extreme is number 2.


----------



## vieuxchnock

Quote:


> Originally Posted by *Paul17041993*
> 
> sure you can paint them however you wish, however I'm not sure how the manufacturer would enjoy it if you ever needed to RMA, try to record the cap models if you can or use something easily removable...
> 
> side note; whats with the center, blue and bold text...?


I think I will try Plastidip


----------



## SeanEboy

Quote:


> Originally Posted by *vieuxchnock*
> 
> *I have just installed my Kryographics waterblock.
> 
> http://www.servimg.com/image_preview.php?i=475&u=17159996
> 
> Do you think I can paint those ugly pink caps in black maybe with a brush?
> *


Which block is that? This one? http://www.frozencpu.com/products/22333/ex-blc-1603/Aquacomputer_Kryographics_Hawaii_Radeon_R9_290X_Full_Coverage_Liquid_Cooling_Block_-_Nickel_Acrylic_23581.html?tl=g30c357s1657


----------



## vieuxchnock

*Yes that's the one.*


----------



## SeanEboy

Has anyone seen any benchmarks between a 290x, and an unlocked 290 to compare performance?

Here are screens of the unlocked 290 card I might buy:



Anyone have any insight on that?


----------



## Paul17041993

Quote:


> Originally Posted by *SeanEboy*
> 
> Has anyone seen any benchmarks between a 290x, and an unlocked 290 to compare performance?
> 
> Anyone have any insight on that?


you'll have to get the seller to run proper benches to actually know if it's a true 290X core or just a lasered 290 with a placebo 290X BIOS.


----------



## SeanEboy

Quote:


> Originally Posted by *Paul17041993*
> 
> you'll have to get the seller to actually run proper benches to actually know if its a true 290X core or just a lasered 290 with a placebo 290X BIOS.


Ok, and which bench should I have him run? To be honest, it's a really fair price, he's asking $500. Perhaps I should just pull the trigger?


----------



## Roboyto

Quote:


> Originally Posted by *SeanEboy*
> 
> Ok, and which bench should I have him run? To be honest, it's a really fair price, he's asking $500. Perhaps I should just pull the trigger?


Fair chance he purchased when the cards were $399. Find out when he bought it; it wouldn't make sense to sell it for $500 with demand right now unless he bought at $399.

Is he including the receipt and all original packaging?

What brand is the card?

Is it registered already? If so, can its registration be transferred to you? Will the warranty still be valid if you get it?


----------



## SeanEboy

Quote:


> Originally Posted by *Roboyto*
> 
> Fair chance he purchased when the cards were 399. Find out when he bought it. It wouldn't make sense to sell it for $500 with demand right now unless he bought when it was $399.
> 
> Is he including receipt and all original packaging?
> 
> What brand card?
> 
> Is it registered already? If so can it's registration be transferred to you? Will warranty still be valid if you get it?


I don't know about receipt and packaging. It's an XFX. Just asked about registration...


----------



## cam51037

Not sure if any of you guys have seen me before in this thread, but I used to own a Sapphire 290 Tri-X but had to RMA it due to fan rattling issues.

Good news, my new card should be in on Tuesday! I can't wait to try it out again, it was great while I had it.


----------



## Widde

Need advice: should I RMA the cards due to them locking up in BF4 and The Witcher 2? It should be noted that The Witcher 2 always locks up at the EXACT same scene, the one in the beginning when the elf or w/e is chased through the woods; when he sees the two humans it freezes and I have to reset the computer. Defective card or driver related? Using 14.1 atm, but I had a crash in BF4 on 13.12 that forced me to reinstall drivers afterwards, and temps are in check. Just to make sure, I upped the voltage at stock clocks to +45mV.

Anyone wanna swap for a GTX 780 Ti Classy?


----------



## Paul17041993

Quote:


> Originally Posted by *Widde*
> 
> Need advice, Should I RMA the cards due to them locking up in bf4 and the witcher 2, should be noted that the witcher 2 always locks up at the EXACT same scene, the one in the beginning when the elf or w/e is chased through the woods and when he sees 2 humans it freezes and I have to reset the computer. Defect card or driver related? Using 14.1 atm but had a crash in bf4 on 13.12 that caused me to have to reinstall drivers afterwards and temps are in check. Just to make sure I upped the voltage at stock clocks to +45mV


have you tested each card separately?


----------



## cam51037

Quote:


> Originally Posted by *Widde*
> 
> Need advice, Should I RMA the cards due to them locking up in bf4 and the witcher 2, should be noted that the witcher 2 always locks up at the EXACT same scene, the one in the beginning when the elf or w/e is chased through the woods and when he sees 2 humans it freezes and I have to reset the computer. Defect card or driver related? Using 14.1 atm but had a crash in bf4 on 13.12 that caused me to have to reinstall drivers afterwards and temps are in check. Just to make sure I upped the voltage at stock clocks to +45mV
> 
> Anyone wanna swap for a gtx 780ti classy?


I personally would test out different drivers, IMO it sounds like a driver issue if it crashes at the same spot every time.


----------



## Widde

Quote:


> Originally Posted by *cam51037*
> 
> I personally would test out different drivers, IMO it sounds like a driver issue if it crashes at the same spot every time.


Valley, Heaven and 3DMark 11, and games like LoL, Metro: Last Light and CS:GO, run perfectly fine. I hope it's driver related; will try to test the cards separately later today.


----------



## Forceman

Quote:


> Originally Posted by *SeanEboy*
> 
> Ok, and which bench should I have him run? To be honest, it's a really fair price, he's asking $500. Perhaps I should just pull the trigger?


Have him download and run CompuBench or ShaderToyMark - they're both very shader-heavy and will quickly highlight the difference between a shader-unlocked 290X and a 290 running at 290X speeds. The 290 unlock thread has a bunch of scores posted you can compare to as well.

http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/0_30

You can also have him run the Hawaii Info tool posted in that thread, if the results don't match the results from an unlockable card, you'll know it isn't really unlocked. That tool has a 100% success rate in determining unlockable cards.

But really, considering the going prices, $500 for even a locked 290 isn't bad.


----------



## aaroc

Quote:


> Originally Posted by *SeanEboy*
> 
> Hey guys, someone just told me my Seasonic 1000w PSU isn't going to cut it with (3) 290x, is that true? My build: 4930k, RIVBE, 4x4GB Dominator GT, (2 - potentially 3) 290x, 256GB 840 Pro SSD, all watercooled - most likely 1 pump. I'm looking to mine when I'm not gaming.. Thoughts on that?


My PC with 3 R9 290s used 950-1010W, as reported by Corsair Link from a Corsair AX1200i, running benchmarks/mining. All stock, no OC.


----------



## SeanEboy

Quote:


> Originally Posted by *Forceman*
> 
> Have him download and run Compubench or ShaderToyMark - they both are very shader heavy and will quickly highlight the difference between a shader unlocked 290X and a 290 running at 290X speeds. The 290 unlock thread has a bunch of scores posted you can compare to also.
> 
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/0_30
> 
> You can also have him run the Hawaii Info tool posted in that thread, if the results don't match the results from an unlockable card, you'll know it isn't really unlocked. That tool has a 100% success rate in determining unlockable cards.
> 
> But really, considering the going prices, $500 for even a locked 290 isn't bad.


Thanks for that info.. Yeah, that's kind of what I thought.. For the price, I should just buy it before someone else does... ;c)
Quote:


> Originally Posted by *aaroc*
> 
> My PC with 3x R9 290 used 950-1010W, as reported by Corsair Link from a Corsair AX1200i, while running benchmarks/mining. All stock, no OC.


Yeah... Gonna need a bigger PSU! ;c)

Of course, I cannot find any 1500w PSU's in stock anywhere...


----------



## Redvineal

Quote:


> Originally Posted by *SeanEboy*
> 
> Yeah... Gonna need a bigger PSU! ;c)
> 
> Of course, I cannot find any 1500w PSU's in stock anywhere...


If you're in the US:
http://www.newegg.com/Product/Product.aspx?Item=N82E16817194106

(The site says in stock but also not available, yet still lets you add it to the cart; not sure which is the real indicator.)

-or-

http://www.performance-pcs.com/catalog/index.php?main_page=product_info&products_id=35336


----------



## The Mac

I prefer Corsair's platinum-certified units myself...

http://www.newegg.com/Product/Product.aspx?Item=N82E16817139039

Expensive, but it has everything you could ever want...


----------



## Redvineal

Except that extra 300W a good 1500W unit can offer. I'm with you about Corsair in general, though; I try like hell to stick with their products as much as possible, as you can see in my sig rig. The only problem is that their AX1500i isn't widely available yet, is extra long, and runs ~$450!

For 3x 290s the AX1200i is great, though it leaves little room for power expansion. To get more, you have to look elsewhere.


----------



## ericz

http://www.techpowerup.com/gpuz/dkhgv/

MSi 290X Gaming Edition LE
Stock cooling


----------



## Tobiman

I get my powercolor PCS+ 290 today.


----------



## Belkov

I want to join the club:

*Belkov - I5 2500K 4.2 Ghz - GB R9 290 OC 1040 CORE / 1250 MEMORY - Stock Cooling - 13.12 WHQL Drivers*

*GPU-Z Validation Link*


----------



## taem

Quote:


> Originally Posted by *Tobiman*
> 
> I get my powercolor PCS+ 290 today.


Now we are three! Me, you, and GrinderCAN. I've settled on 1200 core / 1500 mem; I can go a bit higher, but this is good for air. Nothing I do can make the core go over 72C. This is a nice cooler!


----------



## Arizonian

Quote:


> Originally Posted by *ericz*
> 
> http://www.techpowerup.com/gpuz/dkhgv/
> 
> MSi 290X Gaming Edition LE
> Stock cooling


Welcome to OCN with first post and the club - Congrats - added








Quote:


> Originally Posted by *Belkov*
> 
> I want to join the club:
> 
> *Belkov - I5 2500K 4.2 Ghz - GB R9 290 OC 1040 CORE / 1250 MEMORY - Stock Cooling - 13.12 WHQL Drivers*
> 
> *GPU-Z Validation Link*


Congrats - added








Quote:


> Originally Posted by *Tobiman*
> 
> I get my powercolor PCS+ 290 today.


Post a submission; I'd love to add you to the list.


----------



## Belkov

I want to ask you something. I thought I knew what ASIC quality (from GPU-Z) means, but now I'm not so sure. Some say it doesn't matter; others say higher is better, or that lower is better. Now I'm a little confused. My old card had only 58.4% and I wasn't very satisfied because of that. It wasn't a great clocker, but it was sufficient. My new card has a pretty high ASIC quality, but now I'm not sure what that means.

Please, can you advise me on what I can expect from my GPU with such an ASIC quality?

And I want to apologize for my English; I don't practice it enough...


----------



## Sgt Bilko

Quote:


> Originally Posted by *SeanEboy*
> 
> I wonder if the black edition only comes with hynix???


Nope,

I'm assuming that the Black Edition is actually a reference card that's been factory overclocked.

Mine are the DD Black Editions and they came with Elpida; not that it really makes much difference.


----------



## Paul17041993

Quote:


> Originally Posted by *Belkov*
> 
> I want to ask you something. I thought I knew what ASIC quality (from GPU-Z) means, but now I'm not so sure. Some say it doesn't matter; others say higher is better, or that lower is better. Now I'm a little confused. My old card had only 58.4% and I wasn't very satisfied because of that. It wasn't a great clocker, but it was sufficient. My new card has a pretty high ASIC quality, but now I'm not sure what that means.
> 
> Please, can you advise me on what I can expect from my GPU with such an ASIC quality?
> 
> And I want to apologize for my English; I don't practice it enough...


At this point we haven't really determined how much ASIC quality affects overclocking, but I think I've noticed cases where the classic character of the 7970s comes through: a high ASIC meant less voltage was needed, but the card still drew the same wattage and produced the same heat at the same clocks, despite the voltage difference between high- and low-ASIC cards.

For the most part, though, overclocking depends on how cool you can keep the whole card. Full-cover water will give the best results; if you're on air or a hybrid, try your best to cool the VRMs as well as the core and memory. These cards also have quite a bit of voltage droop from the sheer power draw. A fair few people are waiting for the legendary Lightning PCB for more extreme overclocking.


----------



## Forceman

Quote:


> Originally Posted by *Belkov*
> 
> I want to ask you something. I thought I knew what ASIC quality (from GPU-Z) means, but now I'm not so sure. Some say it doesn't matter; others say higher is better, or that lower is better. Now I'm a little confused. My old card had only 58.4% and I wasn't very satisfied because of that. It wasn't a great clocker, but it was sufficient. My new card has a pretty high ASIC quality, but now I'm not sure what that means.
> 
> Please, can you advise me on what I can expect from my GPU with such an ASIC quality?
> 
> And I want to apologize for my English; I don't practice it enough...


It means nothing because it isn't reading anything to do with ASIC quality. It's just reading a register and making some assumption about quality based on that, but AMD people have already said that it isn't reading what GPU-Z thinks it is reading. So it is completely unimportant.
Quote:


> ASIC "quality" is a misnomer propobated by GPU-z reading a register and not really knowing what it results in. It is even more meaningless with the binning mechanism of Hawaii.


http://forum.beyond3d.com/showpost.php?p=1808073&postcount=2130


----------



## Belkov

OK. Thank you for your fast responses.
Best regards.


----------



## HOMECINEMA-PC

Here's the highest clock I've pulled out of a 290 [email protected]@1.5v....... and a W/R for a single 290 in 3DMark 11 Performance on HWBOT





http://www.3dmark.com/3dm11/7725420


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Heres the highest clock ive pulled outta 290 [email protected]@1.5v....... and W/R for single 290 / MK 11 P on HWBOT
> 
> 
> http://www.3dmark.com/3dm11/7725420


Nice, Very nice indeed









Congrats


----------



## MasterT

Wow nice OC.. But can it play minecraft?














Well done man.


----------



## Brian18741

Quote:


> Originally Posted by *Brian18741*
> 
> Thanks man, just emailed EK there now and asked about the waterblock. Sent em a message on FaceBook as well. Will update results if I get a reply.


EK have come back to me saying their 290X DCU II waterblock is in the testing phase and should be available for sale in about 4 weeks. They didn't answer whether it would fit both the 290 and 290X cards, so I'm going to reply and ask the same. I'll also fire an email to Asus to see if they can confirm the same PCB for their DCU II 290 and 290X. That would be the deciding factor, would it not?


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Heres the highest clock ive pulled outta 290 [email protected]@1.5v....... and W/R for single 290 / MK 11 P on HWBOT
> 
> 
> http://www.3dmark.com/3dm11/7725420


Nice score indeed!







BTW, did you notice the latest 3DMark 11 update causing lower scores than before?


----------



## HOMECINEMA-PC

Here's some good clocks in CrossFire as well: 1280/1500. Still more to go in these two, but as usual the drivers are very sketchy for CrossFire.



3rd place on the HWBOT







http://www.3dmark.com/3dm11/7942448


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> Nice score indeed!
> 
> 
> 
> 
> 
> 
> 
> BTW, did you notice the latest 3DMark 11 update causing lower scores than before?


At least 70pts


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Brian18741*
> 
> EK have come back to me saying their 290X DCU II waterblock is in the testing phase and should be available for sale in about 4 weeks. They didn't answer whether it would fit both the 290 and 290X cards, so I'm going to reply and ask the same. I'll also fire an email to Asus to see if they can confirm the same PCB for their DCU II 290 and 290X. That would be the deciding factor, would it not?


Now that has grabbed my attention ..... interesting


----------



## GrinderCAN

Quote:


> Originally Posted by *taem*
> 
> Now we are three! Me you and GrinderCan. I've settled on 1200 core 1500 mem, can go a bit higher but this is good for air. Nothing I do can make the core go over 72c this is a nice cooler!


Well, since there are only 3 of us...



http://www.techpowerup.com/gpuz/5vbmu/

Powercolor PCS+ R9 290 with stock cooler.

Mine isn't as nice as taem's, with +100/+50 it starts to artifact around 1165 core, doesn't seem to be a heat issue. I'm running 1125/1550 with +56mv while I decide whether to install my EK waterblock... As taem pointed out in another thread, it seems a waste of a good cooler. I have another rig where I could use this, and I'd prefer to find a card with a better core and Hynix ram, but it is awfully tempting to start setting up my water loop as I'm currently in limp mode on stock air with an FX-8350.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Heres the highest clock ive pulled outta 290 [email protected]@1.5v....... and W/R for single 290 / MK 11 P on HWBOT
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/7725420


http://www.3dmark.com/3dma11/7941592
Right behind you, for GPU score at least. Your hexy destroys my 3770k.

Oddly enough, this was at 1245/1625. I don't know how it's only 200 points behind you if it's a whole 100 MHz slower.


----------



## chiknnwatrmln

This is the correct link, OCN's been acting up and won't let me edit my posts.

http://www.3dmark.com/3dm11/7941592


----------



## MrWhiteRX7

@HOMECINEMA-PC might be getting some downclocking?









Both are great gpu scores though!


----------



## rdr09

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> This is the correct link, OCN's been acting up and won't let me edit my posts.
> 
> http://www.3dmark.com/3dm11/7941592


I told Home the same. Did you flash your 290? Here's mine using the original BIOS...

http://www.3dmark.com/3dm11/7748506


----------



## Heinz68

*3DMark 11(P) Score 23939
3DMark11 HOF Rank # 6 (SLI/CrossFire)
3DMark11 HOF Rank # 21 Overall*

Not bad for air cooling, and in a benchmark where Nvidia dominates due to tessellation.

I'm no expert on overclocking, but I believe I should be able to push the *i7-4930K* past the maximum turbo clock of *4,381 MHz* once I learn more about overclocking on the ASUS Rampage IV Black Edition.

I don't think I'll be able to push the *Sapphire R9 290X 4GB TRI-X OC* any higher than *Core 1,225 MHz, Memory 1,650 MHz* unless I use more than a +200 mV offset, which is the max in the Sapphire TriXX OC tool.


----------



## phallacy

Anybody with a watercooled, overclocked 290X: can you tell me what your VDDC input power looks like? The max I've seen under 100% load is 220W, which seems a little low to me.


----------



## Tobiman

Just picked up my pcs+ 290. Here are some pics.






Ok, so my PC before I had her installed.


Then after.


Added a fan for vrms. Apparently, it only helps VRM2 on the 290.


So much lean on that last card. Will fix after I post this.




AB usage and stock stats


Will play some games, run cgminer and post temps later.

P.S. Forgot validation. http://www.techpowerup.com/gpuz/9khbf/


----------



## taem

Quote:


> Originally Posted by *Tobiman*
> 
> Just picked up my pcs+ here are some pics


You should probably black out serial numbers when posting pics of the backplate.

I posted pics of these earlier but have you checked out the VRM sinks and thermal pads on memory? They did this card right.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *rdr09*
> 
> i told Home's the same. you flashed your 290? here is mine using original bios . . .
> 
> http://www.3dmark.com/3dm11/7748506


Running a 290x BIOS, but it didn't unlock. 2560 shaders still.


----------



## Roboyto

Quote:


> Originally Posted by *Brian18741*
> 
> EK have come back to ma saying their 290x DCU II waterblock is in the testing phase and should be available for sale ina bout 4 weeks. They didn't answer if it would fit both the 290 and 290x cards. I'm going to replay asking the same. Will also fire an email to Asus and seeif they can confirm the same PCB for their DCU II 290 and 290x. That would be deciding factor would it not?


That would be the deciding factor :-D


----------



## Roboyto

Quote:


> Originally Posted by *phallacy*
> 
> Anybody with watercooled 290x that are overclocked can anyone tell me what your VDDC ins are looking like? Max i've seen under 100% load is 220 W which seems a little low to me.


My 290 OC to 1255/1675 does around 220-230. I can post pic later when I get home. Just another perk to watercooling :-D


----------



## MacDone

Small question about BIOS and its update.
I have a Sapphire R9 290 Tri-X and found that there is a newer BIOS. What's normally fixed in these? Does it make sense to update it if everything is running fine?


----------



## Roboyto

Quote:


> Originally Posted by *MacDone*
> 
> Small question about BIOS and its update.
> I have a Sapphire R9 290 Tri-X and found that there is a newer BIOS. Whats normally fixed there? Does it make sense to update it if everyting is running fine?


Better stability, better OC headroom, maybe less voltage needed than before. Hard to say, really, until you start fiddling around with it.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *phallacy*
> 
> Anybody with watercooled 290x that are overclocked can anyone tell me what your VDDC ins are looking like? Max i've seen under 100% load is 220 W which seems a little low to me.


For wattage in, with +200mV I usually get around 300w full load.

During suicide runs I get around 325w. This is at a bit over 1.3v after droop.


----------



## Abyssic

Quote:


> Originally Posted by *MacDone*
> 
> Small question about BIOS and its update.
> I have a Sapphire R9 290 Tri-X and found that there is a newer BIOS. Whats normally fixed there? Does it make sense to update it if everyting is running fine?


Which BIOS do you have, and which one do you have in mind?


----------



## Roboyto

Quote:


> Originally Posted by *phallacy*
> 
> Anybody with watercooled 290x that are overclocked can anyone tell me what your VDDC ins are looking like? Max i've seen under 100% load is 220 W which seems a little low to me.


I don't believe this takes into consideration the up to 75W from the PCI-E slot. Can anyone confirm this?


----------



## Abyssic

Quote:


> Originally Posted by *Roboyto*
> 
> I don't believe this takes into consideration the, up to, 75W from PCI-E slot. Can anyone confirm this?


I can't really confirm it, but I also think that's the case; it's about 200W with my not-so-overclocked 290X.


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> 3dmark results for Powercolor PCS+ 290 at 1200 core 1500 mem.
> 
> 
> 
> 
> 
> I'm stable beyond this *but vrm1 is going into 90s so these are my target clocks, now need to lower vddc offset. If I can't get vrm1 down to low 80s* at most I will downlock the core.


Since Chiknwatrmln is getting 20C+ lower VRM1 temps than me with the Fujipoly Ultra Extreme thermal pads, I'm ordering them as soon as I'm done with this post. Need to see how much of a difference they will make. Partially curious if the EK block is outperforming my XSPC as well.

Anyway...I'm willing to bet that upgrading thermal pads on the custom air cooled cards will help similarly to how it does with a waterblock. I will be posting results once I get them installed. Could be a missing link for the air coolers and the volcanic VRMs on these cards.


----------



## Tobiman

The PCS+ might be the best 290 to date. Here are my temps after 10 minutes of mining at the highest intensity. Ambient is around 25-29 degrees Celsius. Those who are familiar with mining know that the temps shoot up fast, yet this card found it difficult to go over 70C at 53% fan speed, which makes it barely audible compared to my Antec 620 pump or Corsair SP120 fans at mid blast. I have great airflow in my case with 10 fans at the moment.


----------



## Abyssic

Quote:


> Originally Posted by *Roboyto*
> 
> Anyway...I'm willing to bet that upgrading thermal pads on the custom air cooled cards will help similarly to how it does with a waterblock. I will be posting results once I get them installed. Could be a missing link for the air coolers and the volcanic VRMs on these cards.


I would like to swap them (also the TIM, of course), but thanks to Sapphire, that minor change would void the warranty, so I guess I'm not changing anything for the next few months, or until I have a new toy ^^


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *MacDone*
> 
> Small question about BIOS and its update.
> I have a Sapphire R9 290 Tri-X and found that there is a newer BIOS. Whats normally fixed there? Does it make sense to update it if everyting is running fine?


If it ain't broke, don't fix it.


----------



## Roboyto

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Nope. FCPU actually ended up giving me more than I bought (I bought the 100x15x1mm Ultra Extreme and got about 120x25mm; for the .5mm I bought 150x100 and got about 160x125mm), so I had more than enough. However, I recommend the 100x15 Ultra Extreme because it's better to have one continuous sheet for the VRMs than two pieces. I used a good chunk of the 100x15. It will be nowhere near enough for all the RAM chips, but I only used about 1/3 of the 150x100 sheet. I figured I'd be better off with more than less, and if I mess up a cut or something I won't have to wait for a new order.
> 
> I don't really think the RAM modules need Ultra Extreme pads as they don't get nearly as hot as the VRM's.
> 
> The Ultra Extreme 100x15x1mm was $20 when I got it, the 150x100x.5mm was $17 but I can't imagine the 1mm would be much more.


Thanks for the info, just ordered mine up with rush processing and 2 day shipping











Eager to see the temperature difference from the Fuji pads, I will post results as soon as I have them, and the difference between EK and XSPC temps for VRM1.

+Rep


----------



## Roboyto

Quote:


> Originally Posted by *Abyssic*
> 
> I would like to swap them (also the TIM, of course), but thanks to Sapphire, that minor change would void the warranty, so I guess I'm not changing anything for the next few months, or until I have a new toy ^^


That's the reason I didn't order a Sapphire 290. I was quite shocked when I got the e-mail back from their support saying it would void warranty. I have had very good luck with Sapphire in the past, so I was slightly disappointed. My HD4890 Vapor-X was a mammoth OCer under water. I ran it with heavy OC for 3 years, put stock cooler back on it, and sold it to a friend..still working to this day.


----------



## UNOE

How can I keep my GPU core from downclocking without a BIOS flash?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Roboyto*
> 
> Thanks for the info, just ordered mine up with rush processing and 2 day shipping
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Eager to see the temperature difference from the Fuji pads, I will post results as soon as I have them, and the difference between EK and XSPC temps for VRM1.
> 
> +Rep


No problem. Not sure if I asked you, but how much rad do you have?
Quote:


> Originally Posted by *Roboyto*
> I don't believe this takes into consideration the, up to, 75W from PCI-E slot. Can anyone confirm this?


I'm not certain, but I would assume that because the measurement is taken at the VRM's that it includes the power from PCI-e. At least, that would make the most sense.


----------



## Abyssic

Quote:


> Originally Posted by *UNOE*
> 
> How can I keep my GPU core from downclocking without a BIOS flash?


what are your temps and clockings?


----------



## disintegratorx

Yeah, I lost a mobo in the process, but I have this 290X all hooked up now with my previous setup, which was an i7 950, 12GB of triple-channel RAM, and an Asus Sabertooth X58. It's running pretty nicely with it, and Windows 8.1. I will be getting another motherboard to replace my last one; I think I got some liquid spillage into the PCI-E slot.

Anyway, how do I join the club? I'll look at the front page and see if there are rules to sign up. Thanks guys!


----------



## Forceman

Quote:


> Originally Posted by *Roboyto*
> 
> Since Chiknwatrmln is getting 20C+ lower VRM1 temps than me with the Fujipoly Ultra Extreme thermal pads, I'm ordering them as soon as I'm done with this post. Need to see how much of a difference they will make. *Partially curious if the EK block is outperforming my XSPC as well*.
> 
> Anyway...I'm willing to bet that upgrading thermal pads on the custom air cooled cards will help similarly to how it does with a waterblock. I will be posting results once I get them installed. Could be a missing link for the air coolers and the volcanic VRMs on these cards.


It may be the bolded part. I thought one of the knocks on the XSPC blocks was that they had a somewhat smaller VRM cooling channel than the EKs do.


----------



## sterob

Anyone know how to determine if a 290 (non-X) card has Hynix memory before buying?


----------



## Tobiman

Quote:


> Originally Posted by *sterob*
> 
> anyone know how to determine if a 290 (non-x) card have Hynix memory before buying?


Probably need to practice some magic to be able to pull that off.


----------



## Abyssic

Quote:


> Originally Posted by *sterob*
> 
> anyone know how to determine if a 290 (non-x) card have Hynix memory before buying?


the only possible way is human interaction and we all know that this doesn't work


----------



## disintegratorx

Ok, my Validation ID is frurx, and my GPU-Z looks like this:






And my card, of course is a Powercolor LCS Radeon R9 290X - Clocked at 1123/1450... (On liquid)

And its running great.









Oh...


There it is..


----------



## UNOE

Quote:


> Originally Posted by *Abyssic*
> 
> what are your temps and clockings?


The cards start throttling before 80C; anything over 950 core will throttle the card that's connected to the monitor, and anything over 985 core makes all the cards start throttling.

It doesn't make much sense at all. Full fan speed doesn't help either, so it doesn't seem to be temp-related.


----------



## Heinz68

Quote:


> Originally Posted by *Tobiman*
> 
> The PCS+ might be the best 290 to date. Here are my temps after 10 mins of mining at the highest intensity. Ambient is around 25-29 degrees celsius. Those who are familiar with mining know that the temps shoot up fast. This card found it difficult to go over 70 celsius at 53% fan speed which makes it barely audible compared to my antec 620 pump or corsair SP 120 fans at mid blast. I have great airflow in my case with 10 fans atm.


I think your hash rate is too low for intensity 20; I don't think it should be so much lower than my Sapphire R9 290*X* 4GB TRI-X OC.
Did you try increasing the memory MHz? I had the same results as you before I increased the memory to 1500 MHz.

I have two cards set up, not in CrossFire, for mining, so I can do other work on my PC while mining. The top card, set at I-14, gets 657 Kh/s; the bottom card, set at I-20, gets 976 Kh/s.

I kept increasing the memory by 50 MHz, and every time the hash rate increased, up to 1500 MHz memory. After that, any memory increase actually decreased the hash rate.
I also increased the power limit +30 and voltage +25 mV to avoid artifacts and the occasional black screen.

Of course, with 2 cards the temps are much higher: top card 81C, bottom card 78C after many hours running, with a 25C ambient. Fan speed is a 70% custom auto profile.
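The stepping procedure described above (raise the memory clock 50 MHz at a time while the hash rate improves, back off once it drops) amounts to a simple hill climb. A sketch, where `measure_hashrate` is a hypothetical stand-in for an actual cgminer benchmark run:

```python
# Hill-climb the memory clock: step up while the measured hash rate keeps
# improving, stop and keep the best setting as soon as it drops.

def tune_memory_clock(measure_hashrate, start_mhz=1250,
                      step_mhz=50, max_mhz=1700):
    best_mhz, best_rate = start_mhz, measure_hashrate(start_mhz)
    mhz = start_mhz + step_mhz
    while mhz <= max_mhz:
        rate = measure_hashrate(mhz)
        if rate <= best_rate:          # past the peak: back off
            break
        best_mhz, best_rate = mhz, rate
        mhz += step_mhz
    return best_mhz, best_rate

# Toy stand-in: hash rate peaks at 1500 MHz, as reported above.
fake = lambda m: 900 - abs(m - 1500) * 0.5
print(tune_memory_clock(fake))   # -> (1500, 900.0)
```

Each "measurement" should of course be a long enough mining run to give a stable Kh/s average, and artifacts or a black screen at a step mean backing off regardless of the rate.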


----------



## Belkov

Quote:


> Originally Posted by *Tobiman*
> 
> The PCS+ might be the best 290 to date. Here are my temps after 10 mins of mining at the highest intensity. Ambient is around 25-29 degrees celsius. Those who are familiar with mining know that the temps shoot up fast. This card found it difficult to go over 70 celsius at 53% fan speed which makes it barely audible compared to my antec 620 pump or corsair SP 120 fans at mid blast. I have great airflow in my case with 10 fans atm.


What are your settings? That's a very low speed. With intensity 17 I make 850 Kh/s, and with intensity 20 almost 940. By the way, I don't mine, but I wanted to test temps...







Max 72C for the core and max 75C for VRM1. VRM2 is always much lower, max 65C. These are at stock clocks.


----------



## standardhlozek

Quote:


> Originally Posted by *standardhlozek*
> 
> 
> 
> GIGABYTE R9 290 GV-R929D5-4GD-B
> 
> AFTERMARKET COOLING - Prolimatech MK26 and Alpenföhn heatsinks


----------



## standardhlozek




----------



## Paul17041993

Quote:


> Originally Posted by *standardhlozek*


Push the submit button and give us the link? What cooler is that, anyway? And be sure your VRMs and memory ICs have adequate heatsinks and airflow too.








Never mind, just noticed the heatsinks popping out the side...


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Heres the highest clock ive pulled outta 290 [email protected]@1.5v....... and W/R for single 290 / MK 11 P on HWBOT
> 
> 
> http://www.3dmark.com/3dm11/7725420


1210/1625 MHz @ +200mV is the best my card (Hynix) can do.

http://www.3dmark.com/3dm11/7990333


----------



## standardhlozek

The cooler is a Prolimatech MK26. The heatsinks are Alpenföhn. The backplate is part of the water cooling, drilled for the screws. The backplate holds the card up, because the cooler weighs about 600 g and pushes the card down about 2 cm on the unsupported side. Looks terrible.

Temps don't exceed 70 degrees at 100% load.


----------



## kpoeticg

Can anybody tell me exactly how much thermal pad is needed of each thickness for full coverage on a ref 290x (Specifically for AC Kryographics Blocks with Passive Backplate)


----------



## standardhlozek

Is it true that the 290 doesn't have an Uber/Quiet BIOS? I read it just has two identical BIOSes.


----------



## Abyssic

Quote:


> Originally Posted by *standardhlozek*
> 
> cooler is prolimatech mk26. Heatsinks are *Alpenhof*. The backplate is a part of water coolin, drilled for coss . Bacplate hold the card becasuse the cooler is about 600 g wight and pushing card down about 2 cms on not fitted side. Looks terible.
> 
> Temps doesnt exceed 70 degs on 100 % load.


wut? never heard of that. do you mean Alpenföhn?


----------



## standardhlozek

Yes. Alpenföhn


----------



## Belkov

Quote:


> Originally Posted by *standardhlozek*
> 
> Is it true that the 290 doesn't have an Uber/Quiet BIOS? I read it just has two identical BIOSes.


They are indeed the same, excluding fan profiles. By the way, I use my silent-mode BIOS for reflashing and testing BIOSes. It's unusable otherwise because the temps go too high under load.


----------



## kpoeticg

I use silent mode for flashing on my 290x too.


----------



## Roboyto

Quote:


> Originally Posted by *Forceman*
> 
> It may be the bolded part. I thought one of the knocks on the XSPC blocks was that they had a somewhat smaller VRM cooling channel than the EKs do.


That's what we're going to find out :-D


----------



## Roboyto

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> No problem. Not sure if I asked you, but how much rad do you have?
> I'm not certain, but I would assume that because the measurement is taken at the VRM's that it includes the power from PCI-e. At least, that would make the most sense.


I have (2) XSPC EX240.

Front rad has Corsair SP120s in push/pull.

Top rad has Yate Loon 20mm fans in push.

Good to know about the power measurement. I was looking on another forum regarding that wattage measurements; It stated it didn't account for PCI-E wattage. At work at the moment and don't have time to look it up. I'll see if I can find it again when I get home.


----------



## Jack Mac

Just cancelled my NCIX order because they were taking too long to ship (was painless reaching their CS though). Amazon should have my XFX 290 here in two days.


----------



## Paul17041993

Quote:


> Originally Posted by *standardhlozek*
> 
> Is it true that the 290 doesn't have an Uber/Quiet BIOS? I read it just has two identical BIOSes.


Yeah, because the 290 doesn't need high fan speeds and doesn't throttle most of the time.


----------



## Mercy4You

I have a question about GPU-Z.

When I run the program it shows a max VDDC of 1.24V, while in the logfile the max VDDC is 1.195V...

Anyone knows what's going on?


----------



## Forceman

Quote:


> Originally Posted by *Mercy4You*
> 
> I have a question about GPU-Z.
> 
> When I run the program it shows for VDDC max 1.24V, while in the logfile the VDDC is max 1.18V...
> 
> Anyone knows what's going on?


I noticed the same thing last night. The log file was about 0.04V less than the display.


----------



## primal92

Hello everyone. I've managed to get hold of an MSI reference 290 (non-X) for a very good price, second hand with invoices from Scan. Once I sell my current DCUII 7970, the upgrade should only cost about £30-40. Is there anything I should be aware of before using this card? Any issues cropping up with them? My 7970 has been great; should I expect the same from this card, or more issues (drivers, reliability)? And is there anything I should do, e.g. modding, to keep temps down or gain more performance?

Thanks


----------



## Mercy4You

Quote:


> Originally Posted by *Forceman*
> 
> I noticed the same thing last night. The log file was about 0.04V less than the display.


A closer look at my file and the difference is indeed 0.045V...
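One way to double-check the logfile side is to parse GPU-Z's sensor log yourself and take the column maximum. A sketch, assuming a comma-separated log with a header row containing "VDDC" (the exact header text and layout are assumptions; adjust them to match your file):

```python
# Find the maximum VDDC recorded in a GPU-Z style sensor log (CSV).
import csv

def max_vddc(path, column="VDDC"):
    values = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for key, val in row.items():
                # GPU-Z pads headers and values with spaces, so match
                # loosely on the header and strip the value.
                if key and column in key:
                    try:
                        values.append(float(val.strip()))
                    except (ValueError, AttributeError):
                        pass  # skip blank or non-numeric cells
    return max(values) if values else None
```

If the parsed max agrees with GPU-Z's logged max but not the on-screen max, the display is most likely catching short voltage spikes between log samples, rather than the log being wrong.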


----------



## Roboyto

Quote:


> Originally Posted by *sterob*
> 
> anyone know how to determine if a 290 (non-x) card have Hynix memory before buying?


You can look at a professional review of a card on any number of websites; TechpowerUP, HardwareCanucks, TomsHardware, Guru3D, AnandTech, etc. See what RAM chips were observed in their studies of the GPU. Then cross your fingers and toes, and hope that they didn't run out of Hynix RAM when the card was manufactured.









Buy one at a local retailer, use Hawaii Info to see if it has Hynix or Elpida. Test the card and see how it clocks, even though most cards with Elpida don't clock that well, it's not ALL of them. If it doesn't do well return/exchange.

I bought an XFX 7950 DD from Newegg once upon a time, and returned it because it was hardware voltage-locked and had *extremely* poor OC ability. They said they were going to charge me a restocking fee, but didn't, so it only cost me return shipping.

Buy one on eBay or Craigslist that has already been examined.


----------



## rdr09

Quote:


> Originally Posted by *primal92*
> 
> Hello everyone, I've managed to get hold of an MSI reference 290 non-X for a very good price, second hand with invoices from Scan. Once I sell my current DCUII 7970 the upgrade should only cost about £30-40. Is there anything I should be aware of before using this card? Any issues cropping up with them? My 7970 has been great; should I expect the same from this card, or more issues (drivers, reliability)? And is there anything I should do, e.g. modding, to keep temps down or gain more performance?
> 
> Thanks


got one here. bought at launch. mine can bench at 1175/1500 without voltage tweaks - just max the power limit using Trixx. leave mine stock 24/7. you have to watercool reference or use aftermarket coolers.

sticking to 13.11 'cause 13.12 raised my vcore a bit and so my temps. i have zero issues with mine even with elpida vram.


----------



## The Mac

GPU-Z now shows memory manufacturer as well


----------



## Tobiman

Quote:


> Originally Posted by *Heinz68*
> 
> I think your hash rate is too low for 20 intensity, don't think it should be so much lower than my Sapphire R9 290*X* 4GB TRI-X OC.
> Did you try to increase the memory Mhz? I had the same results like you before I increased the memory to 1500 MHz.
> 
> I have two cards set not in crossfire for mining so I can do other work on my PC when mining. The top card set at I-14 gets 657 Kh/s the bottom card set I-20 gets 976 Kh/s
> 
> I kept increasing the memory by 50 MHz and every time the hash rate increased, up to 1500 MHz memory. After that, any memory increase actually decreased the hash rate.
> I also increased the power limit +30 and voltage +25 mV to avoid artifacts and occasional blackscreen.
> 
> Of course with 2 cards the temps are much higher: top card 81C, bottom card 78C after many hours running, ambient temp 25C. Fan speed 70% custom set auto profile.


My card isn't a 290X. It's a 290 with Elpida, which is unbelievably ****ty at any clock other than 1250mhz. Normal hash rates for 290s with Hynix are around 870ish, so I'm not far off by any means. Maybe one can dip into the 900s with "The Stilt" bios. I'll get to try that later.
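Heinz68's tuning procedure quoted above - raise the memory clock in 50 MHz steps and stop once the hash rate drops - is a simple hill climb. A minimal sketch in Python; `measure_hashrate` is a hypothetical stand-in for an actual benchmark run (e.g. timing cgminer at each clock), and the numbers are illustrative only:

```python
# Hill-climb memory tuning: step the memory clock up in 50 MHz
# increments, keep the best setting, stop once the hash rate stops
# improving. measure_hashrate() is a hypothetical stand-in for a
# real benchmark at each clock.

def tune_memory_clock(measure_hashrate, start_mhz=1250, step_mhz=50, limit_mhz=1625):
    best_clock, best_rate = start_mhz, measure_hashrate(start_mhz)
    clock = start_mhz + step_mhz
    while clock <= limit_mhz:
        rate = measure_hashrate(clock)
        if rate <= best_rate:  # past the peak: stop stepping
            break
        best_clock, best_rate = clock, rate
        clock += step_mhz
    return best_clock, best_rate

if __name__ == "__main__":
    # Fake response curve peaking at 1500 MHz, like Heinz68 observed:
    fake = lambda mhz: 900 - abs(mhz - 1500) * 0.5
    print(tune_memory_clock(fake))  # -> (1500, 900.0)
```

In practice you would also bump the power limit and voltage between steps, as Heinz68 notes, to keep the higher memory clock artifact-free before trusting its hash rate.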


----------



## Roboyto

Quote:


> Originally Posted by *primal92*
> 
> Hello everyone, I've managed to get hold of an MSI reference 290 non-X for a very good price, second hand with invoices from Scan. Once I sell my current DCUII 7970 the upgrade should only cost about £30-40. Is there anything I should be aware of before using this card? Any issues cropping up with them? My 7970 has been great; should I expect the same from this card, or more issues (drivers, reliability)? And is there anything I should do, e.g. modding, to keep temps down or gain more performance?
> 
> Thanks


I have had great luck so far with my XFX R9 290 Black Edition. Great overclocker, stable, crushing benches and games. Drivers have been stable thus far in my experience. Your biggest hurdle will be that reference blower if you plan to get the full potential out of your purchase!

Reference cooler does a pretty awful job of cooling the card *quietly*.

They are capable of cooling the card even under high overclocks, but the sound is quite outrageous. I pushed my XFX 290 BE, with the stock blower, to 1215/1650 with the fan cranked up to 75%. The card maxed at 88C and never throttled. It also kept the VRMs cool enough to sustain these clocks and bench/test very well; never broke 79C on VRM1.

I haven't seen any information for what changing TIM and/or thermal pads can do on a reference cooler...it may help, but the bottom line is the reference blower is inadequate; especially if you plan to OC.

Your best bet with these cards is upgrading the cooling solution and making sure you *keep the main VRMs cool*. Main VRMs are highlighted in yellow and secondary VRMs circled in orange.


You have a couple options...

*Full cover waterblock* if you are set up for water cooling already. Best solution for keeping everything on the card cool, especially the main VRMs.

*Upgraded air cooler* The two most popular I've seen are the Arctic Accelero Extreme III and the Gelid Icy Vision.

*Arctic Accelero Extreme III*
Arctic is currently sold out at NewEgg because of its popularity and ability to tame these cards and overclock them. The kit includes heatsinks for RAM and VRMs.
http://www.newegg.com/Product/Product.aspx?Item=N82E16835186068
To give you an idea of how well the Arctic works here's a link:
http://www.tomshardware.com/reviews/r9-290-accelero-xtreme-290,3671-4.html

*Gelid Icy Vision*
It's in stock and a bit cheaper, with several reviews stating it works well. The kit includes heatsinks for RAM and VRMs.
http://www.newegg.com/Product/Product.aspx?Item=N82E16835426026
One of our own has installation instructions and results from this cooler.
http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x

*Attach an AIO CPU water cooler*. It will do a fantastic job of cooling the GPU, but you will need to attach heatsinks to the RAM and VRMs. You have a couple options here as well.


*Arctic Accelero Hybrid*, the most expensive of the lot - if you can find one, as it seems they may be discontinued. They are listed as compatible with the 290(X) and have a built-in fan to assist in cooling the RAM and VRMs. The kit includes the heatsinks for RAM and VRMs.
http://www.arctic.ac/us_en/products/cooling/vga/accelero-hybrid.html

*NZXT Kraken G10* This is a bracket designed to attach Asetek-style AIO coolers to the card. It also has a fan holder on it to assist in cooling the RAM and VRMs. I believe these are still on backorder because of their high demand and stylish looks.








You must purchase heatsinks for RAM and VRMs
https://www.nzxt.com/product/detail/138-kraken-g10-gpu-bracket.html
Asetek AIO coolers are the round ones like this. You can also see a compatibility list on NZXT website.


*KeplerDynamics* AIO adapter bracket. This bracket is compatible with the Asetek coolers as well. The downside is there is no fan attachment. You will still need to attach heatsinks to the RAM and VRMs, and then make sure you have sufficient airflow over the heatsinks/card.
Orders from here can sometimes take a little while. A friend of mine ordered a few for his 7970 miner cards and it took several weeks for them to arrive. He is very happy with the results though.








http://keplerdynamics.com/sigmacool/radeonmkii
*Heatsinks for RAM and VRMs*
You will require two sizes: larger for the RAM and smaller for the VRMs.

*Enzotech copper sinks*
My brother has these on his Xfire 7950's that have AIO coolers on them.
Larger Sinks for RAM - http://www.newegg.com/Product/Product.aspx?Item=N82E16835708012
Smaller Sinks for VRMs - http://www.newegg.com/Product/Product.aspx?Item=N82E16835708011


*Akust RAM Sinks*
Aluminum - http://www.newegg.com/Product/Product.aspx?Item=9SIA3TR18A6835
Copper - http://www.newegg.com/Product/Product.aspx?Item=9SIA3TR18A6835
You would require more than one package of the copper, as there are 16 RAM chips.


*Ebay Heatsinks*
Here is one example, there are many more you can choose from if you search for RAM heatsink. The prices are very low on eBay and I have seen people mention them working well.
http://www.ebay.com/itm/8pcs-Aluminium-Heatsink-For-Motherboard-DDR-VGA-RAM-Memory-IC-Chipset-Cooler-W-/370655913100?pt=US_Memory_Chipset_Cooling&hash=item564cd0648c


----------



## SeanEboy

I may have a used accelero extreme III available for sale shortly. What's a used one worth?


----------



## Roboyto

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Heres the highest clock ive pulled outta 290 [email protected]@1.5v....... and W/R for single 290 / MK 11 P on HWBOT
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/7725420


Nice score









I've got you beat on graphics score...damn your 6-core CPU!









XFX BE 290 @ 1260/1725
http://www.3dmark.com/3dm11/7991063


----------



## Raephen

Quote:


> Originally Posted by *kpoeticg*
> 
> Can anybody tell me exactly how much thermal pad is needed of each thickness for full coverage on a ref 290x (Specifically for AC Kryographics Blocks with Passive Backplate)


The Kryographics blocks use the least thermal padding of all waterblocks: only for the VRMs. The VRAM is cooled by direct contact with the machined copper/nickel, so only some TIM is needed there.

Officially, the padding is 0.5mm thick, but if the padding you're getting is more pliable than the stock ones, 1mm is also good.

I've got some Phobya 7w/mk and it does the job very nicely.


----------



## kpoeticg

Thanx for the reply +1

I was gonna order the Fujipoly Ultra Extreme since that's what everybody seems to recommend. I noticed that the instructions for the block only recommended TIM on the RAM and not pads, but I wasn't sure if pads would help. I'm stuck with a BSOD Elpida, so I'm trying to get as much out of it as possible. I figured Fujipoly would be better than the strip that's in the package, but I might not order it now if I don't need it for the RAM at all.


----------



## Roboyto

Quote:


> Originally Posted by *Roboyto*
> 
> Good to know about the power measurement. I was looking on another forum regarding that wattage measurements; It stated it didn't account for PCI-E wattage. At work at the moment and don't have time to look it up. I'll see if I can find it again when I get home.


Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I'm not certain, but I would assume that because the measurement is taken at the VRM's that it includes the power from PCI-e. At least, that would make the most sense.


W1zzard is like *the* guy behind GPU-Z, right?
http://www.techpowerup.com/forums/threads/use-gpu-z-to-attain-wattage.160143/
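If the wattage GPU-Z shows is derived from the VRM telemetry discussed above, it's simply the product of the reported VDDC and VDDC current - which is why the PCI-E slot question matters, since anything not fed through those VRMs wouldn't be counted. A trivial sketch of that arithmetic (numbers below are illustrative only, not measured from any card):

```python
# GPU core power as derived from VRM telemetry: VDDC (volts) times
# VDDC current (amps). Whether slot power is included depends on what
# the measured VRMs actually feed - the open question in the thread.

def core_power_watts(vddc_volts, vddc_current_amps):
    return vddc_volts * vddc_current_amps

# e.g. 1.18 V at a hypothetical 180 A load:
print(round(core_power_watts(1.18, 180.0), 1))  # -> 212.4
```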


----------



## Roboyto

Quote:


> Originally Posted by *kpoeticg*
> 
> Thanx for the reply +1
> 
> I was gonna order the Fujipoly Ultra Extreme since that's what everybody seems to recommend. I noticed that the instructions for the block only recommended TIM on the ram and not pads, but wasn't sure if pads would help. I'm stuck with a BSOD Elpida so trying to get as much out of it as possible. I figured Fujipoly would be better than the strip that's in the package, but might not order it now if i don't need it for the ram at all.


Not sure what pads the Kryographics block comes with, but I have some Fujipoly Ultra Extreme on its way. Stay tuned and I'll have results from upgrading on my XSPC Razor. The thermal pads that came with mine are blue.


----------



## pkrexer

Quote:


> Originally Posted by *Roboyto*
> 
> Not sure what pads kryographics block comes with, but I have some Fujipoly Ultra Extreme on its way. Stay tuned and I'll have results from upgrading on my XSPC Razor. Thermal pads that came with mine are blue.


Hopefully you have good results, since I already bought them for my xspc block as well







Are you planning on putting a thin layer of tim on yours, like I heard others doing?


----------



## Roboyto

Quote:


> Originally Posted by *pkrexer*
> 
> Hopefully you have good results, since I already bought them for my xspc block as well
> 
> 
> 
> 
> 
> 
> 
> Are you planning on putting a thin layer of tim on yours, like I heard others doing?


I don't think so. If it was necessary I think the manufacturers would be suggesting it. Besides the stuff is too expensive to potentially ruin with TIM.


----------



## kpoeticg

The AC blocks come with a gray strip. EK recommends putting TIM between the pads and the RAM/VRMs.


----------



## Arizonian

Quote:


> Originally Posted by *GrinderCAN*
> 
> Well, since there are only 3 of us...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/5vbmu/
> 
> Powercolor PCS+ R9 290 with stock cooler.
> 
> Mine isn't as nice as taem's, with +100/+50 it starts to artifact around 1165 core, doesn't seem to be a heat issue. I'm running 1125/1550 with +56mv while I decide whether to install my EK waterblock... As taem pointed out in another thread, it seems a waste of a good cooler. I have another rig where I could use this, and I'd prefer to find a card with a better core and Hynix ram, but it is awfully tempting to start setting up my water loop as I'm currently in limp mode on stock air with an FX-8350.


Congrats - added









Quote:


> Originally Posted by *Tobiman*
> 
> Just picked up my pcs+ 290. Here are some pics.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ok, so my PC before I had her installed.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Then after.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Added a fan for vrms. Apparently, it only helps VRM2 on the 290.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> So much lean on that last card. Will fix after I post this.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> AB usage and stock stats
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Will play some games, run cgminer and post temps later.
> 
> P.S. Forgot validation. http://www.techpowerup.com/gpuz/9khbf/


Congrats - added









Quote:


> Originally Posted by *disintegratorx*
> 
> Ok, my Validation ID is frurx, and my GPU-Z looks like this:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And my card, of course is a Powercolor LCS Radeon R9 290X - Clocked at 1123/1450... (On liquid)
> 
> And its running great.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> There it is..


Congrats - added









.......and three Powercolor PCS in a row!


----------



## taem

Quote:


> Originally Posted by *Arizonian*
> 
> .......and three Powercolor PCS in a row!


For the average user, rather than an extreme overclocker, I think it's the best custom 290, along with the Tri-X: fairly quiet and delightfully cool at stock, enough thermal headroom for a decent OC, shorter than the Tri-X, backplate, reference PCB with no custom components for maximum compatibility with blocks, metal shroud. Talking PCS+, not the WC model.


----------



## Belkov

Quote:


> Originally Posted by *Paul17041993*
> 
> yea, cause 290 doesn't need high fan speeds and doesn't throttle most of the time.


Mine has silent mode, or let me say it that way - it HAD silent mode...


----------



## Roboyto

Quote:


> Originally Posted by *kpoeticg*
> 
> The AC blocks come with a gray strip. EK recommends putting tim between the pads and ram/vrm's










Intriguing

I know instructions for mine did not recommend this. I still don't think I'm going to do it because I need to fairly compare temperatures with the stock pads as I did not put any TIM down for them. Also, I do not want to have to clean TIM off of the card, especially around the VRMs.


----------



## kpoeticg

It probably varies from manufacturer to manufacturer, like other people are saying, since the blocks aren't designed exactly the same.

http://www.ekwb.com/shop/EK-IM/EK-IM-3831109868539.pdf


----------



## Roboyto

Quote:


> Originally Posted by *kpoeticg*
> 
> It's probly from manufacturer to manufacturer like other people are saying, since the blocks aren't designed exactly the same
> 
> http://www.ekwb.com/shop/EK-IM/EK-IM-3831109868539.pdf


Yeah, just different recommendations per manufacturer.

I asked chiknnwatrmln if he used TIM for his VRM pads and he said no. Will be an equal comparison between EK and XSPC blocks when I put the Fujipoly on mine


----------



## MacDone

Quote:


> Originally Posted by *Abyssic*
> 
> what bios do you have and wich one do you have in mind?


Hi,

I've got 015.042.000.000.000000 and I've seen that there is a 015.043.000.001.000000


----------



## DeadlyDNA

So yeah, I don't know what I was thinking, but I loaded the 14.1 drivers and the ONLY thing that worked was Star Swarm in "Mantle" mode.

Hahaha, every game/benchmark I tried crashed with BSODs. Yeah, so umm, back to 13.12...


----------



## K Man

Hey all

I recently got 2 R9 290's - two Powercolor PCS+ ones. Wonderful cards, very quiet. I did a quick write-up in this forum for anyone interested.

Anyway, I was only testing the one card, so today I got around to plugging in the second one - it appears to be a 290X. It has more shaders than the other one (2816 vs 2560).

Also, the second card says it has a bus width of 32-bit, which I seriously doubt. I ran Firestrike - it scored mid-15K points - and there have been no issues with crossfire; the 2 cards seem to be playing nicely together. Only now MSI Afterburner can't read the data off either card (temp, core clock etc.)



Do you guys think there may be an Issue with that second card?

Also I'd like to join the club - Validation

http://www.techpowerup.com/gpuz/du9zh/



Cheers,
K Man


----------



## Arizonian

Quote:


> Originally Posted by *K Man*
> 
> Hey all
> 
> I recently got 2 R9 290's - two Powercolor PCS+ ones. Wonderful cards, very quiet. I did a quick write-up in this forum for anyone interested.
> 
> Anyway, I was only testing the one card, so today I got around to plugging in the second one - it appears to be a 290X. It has more shaders than the other one (2816 vs 2560).
> 
> Also, the second card says it has a bus width of 32-bit, which I seriously doubt. I ran Firestrike - it scored mid-15K points - and there have been no issues with crossfire; the 2 cards seem to be playing nicely together. Only now MSI Afterburner can't read the data off either card (temp, core clock etc.)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Do you guys think there may be an Issue with that second card?
> 
> Also Id like to join the club - Validation
> 
> http://www.techpowerup.com/gpuz/du9zh/
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Cheers,
> K Man


Congrats - added









and your cards make the fourth and fifth Powercolor PCS back to back in one day.


----------



## Sgt Bilko

I'm liking those PCS+ cards; they look pretty nice and from all reports actually manage to keep things fairly cool.

One thing though: are they a 2-slot or a 2.5-slot card by chance?


----------



## Sgt Bilko

Quote:


> Originally Posted by *K Man*
> 
> Hey all
> 
> I recently got 2 R9 290's - two Powercolor PCS+ ones. Wonderful cards, very quiet. I did a quick write-up in this forum for anyone interested.
> 
> Anyway, I was only testing the one card, so today I got around to plugging in the second one - it appears to be a 290X. It has more shaders than the other one (2816 vs 2560).
> 
> Also, the second card says it has a bus width of 32-bit, which I seriously doubt. I ran Firestrike - it scored mid-15K points - and there have been no issues with crossfire; the 2 cards seem to be playing nicely together. Only now MSI Afterburner can't read the data off either card (temp, core clock etc.)
> 
> 
> 
> Do you guys think there may be an Issue with that second card?
> 
> Also Id like to join the club - Validation
> 
> http://www.techpowerup.com/gpuz/du9zh/
> 
> 
> 
> Cheers,
> K Man


When you enable/disable crossfire you might need to close down Afterburner then start it again.

And disable ULPS in AB as well; that's what causes GPU-Z to read the shaders, clocks, mem bus etc. wrong on your second card.


----------



## K Man

2.5 slots I'm afraid.

Temps on the top card went from 61C stock to 78C in Crossfire. Looks like my G10s and Kraken X40s won't be wasted after all. It may in fact be a good candidate, as it comes with VRM heatsinks, a backplate and another plate that covers the RAM modules (which are both Hynix BTW).

Thanks for the heads-up with MSI AB, and ULPS is done. Problem solved - both reading correctly as 290's now.


----------



## Sgt Bilko

Quote:


> Originally Posted by *K Man*
> 
> 
> 
> 2.5 slots Im afraid.
> 
> Temps on the top card went from 61C stock to 78C in Crossfire. Looks like my G10s and Kraken x40's wont be wasted after all. It may in fact be a good candidate as it comes with VRM heat sinks, a backplate and another plate that covers the RAM modules (which are both Hynix BTW)
> 
> Thanks for the Heads up with MSI AB and ULPS done. Problem Solved, both reading correctly as 290's now.


No problem









I don't have an issue with 2.5 slots, handy info to have though, cheers


----------



## K Man

Yea 2.5 slots was fine for me, I bought my motherboard with the possibility of that in mind.

I have tried a bit of overclocking: one gets to 1150 - past that it artefacts - while the other gets slightly less. It is based off the reference PCB, but I can't tell if it has better components.

The price was a steal - $530 from PC Case Gear. Couldn't believe it was only $30 more than reference, especially compared to the ASUS and Gigabyte models. So I snapped it up immediately.


----------



## Sgt Bilko

Quote:


> Originally Posted by *K Man*
> 
> Yea 2.5 slots was fine for me, I bought my motherboard with the possibility of that in mind.
> 
> I have tried a bit of overclocking: one gets to 1150 - past that it artefacts - while the other gets slightly less. It is based off the reference PCB, but I can't tell if it has better components.
> 
> The price was a steal - $530 from PC Case Gear. Couldn't believe it was only $30 more than reference, especially compared to the ASUS and Gigabyte models. So I snapped it up immediately.


I got my XFX DD Blacks from PCCG as well - the first custom 290's in stock. I agree, the Asus cards are overpriced for no reason; even the Asus ref cards were $30-$40 over others, even though they are exactly the same.

My top card will bench at 1250/1500 (haven't tried the second too much atm). I can game with them at 1150/1400 comfortably, though there's no real need as my 8350 bottlenecks them.


----------



## Mercy4You

Quote:


> Originally Posted by *Roboyto*
> 
> I have had great luck so far with my XFX R9 290 Black Edition. Great overclocker, stable, crushing benches and games. Drivers have been stable thus far in my experience. Your biggest hurdle will be that reference blower if you plan to get the full potential out of your purchase!
> 
> Reference cooler does a pretty awful job of cooling the card *quietly*.
> 
> They are capable of cooling the card even under high overclocks, but the sound is quite outrageous. I pushed my XFX 290 BE, with the stock blower, to 1215/1650 with the fan cranked up to 75%. The card maxed at 88C and never throttled. It also kept the VRMs cool enough to sustain these clocks and bench/test very well; never broke 79C on VRM1.
> 
> Your best bet with these cards is upgrading the cooling solution and make sure you *keep the main VRMs cool*. Main VRMs are highlighted in yellow and secondary VRMs circled in orange.
> 
> 
> You have a couple options...


Thx for this very complete cooling guide, I'll save it for when things run too hot for me


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *standardhlozek*


Mother of God









Quote:


> Originally Posted by *Paul17041993*
> 
> push that submit button and give the link? what cooler is that though anyway? and be sure your VRMs and memory ICs have adequate heatsinks and airflow too
> 
> 
> 
> 
> 
> 
> 
> 
> nvm just noticed the heatsinks popping out the side...


LoooooL









Quote:


> Originally Posted by *kizwan*
> 
> 1210/1625 MHz @+200mV is the best one of my card (Hynix) can do.
> 
> http://www.3dmark.com/3dm11/7990333


You need to smash in about 1.5v for more









Quote:


> Originally Posted by *Roboyto*
> 
> Nice score
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've got you beat on graphics score...damn your 6-core CPU!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> XFX BE 290 @ 1260/1725
> http://www.3dmark.com/3dm11/7991063


Thanks maaaate








It's more a combo of my mad skillz 50%, Hexy physics score 40% and crappy AMD drivers 10%
Quote:


> Originally Posted by *Roboyto*
> 
> Not sure what pads kryographics block comes with, but I have some Fujipoly Ultra Extreme on its way. Stay tuned and I'll have results from upgrading on my XSPC Razor. Thermal pads that came with mine are blue.
> Quote:
> 
> 
> 
> Originally Posted by *pkrexer*
> 
> Hopefully you have good results, since I already bought them for my xspc block as well
> 
> 
> 
> 
> 
> 
> 
> Are you planning on putting a thin layer of tim on yours, like I heard others doing?

You two need to tell me all about your findings. Got it









Quote:


> Originally Posted by *Sgt Bilko*
> 
> I got my XFX DD Blacks from PCCG as well - the first custom 290's in stock. I agree, the Asus cards are overpriced for no reason; even the Asus ref cards were $30-$40 over others, even though they are exactly the same.
> 
> My top card will bench at 1250/1500 (haven't tried the second too much atm). I can game with them at 1150/1400 comfortably, though there's no real need as my 8350 bottlenecks them.


block those beasties



Cant auto focus for some reason


----------



## Mercy4You

Quote:


> Originally Posted by *Mercy4You*
> 
> I have a question about GPU-Z.
> 
> When I run it the display for Show Highest Reading VDDC gives 1.24V, while in the file the VDDC is logged at max 1.195V...


Found my own answer









GPU-Z logs once per second. The highest reading happened in a shorter timeframe, so it was not logged...
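The explanation above in miniature: a 1 Hz logger can miss a sub-second spike that a faster-polling display catches. A toy Python sketch with a synthetic voltage trace (the waveform and numbers are illustrative only, not GPU-Z internals):

```python
# A once-per-second log misses a brief voltage spike that a faster
# poll (like the live display) catches, so the logged maximum sits
# below the peak shown on screen. Synthetic waveform for illustration.

def vddc(t):
    # steady 1.18 V with a 0.2 s spike to 1.24 V starting at t = 2.5 s
    return 1.24 if 2.5 <= t < 2.7 else 1.18

display_peak = max(vddc(t / 100) for t in range(0, 1000))  # ~100 Hz poll
logged_peak = max(vddc(t) for t in range(0, 10))           # 1 Hz log

print(display_peak, logged_peak)  # -> 1.24 1.18
```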









Amazing that my GPU is doing 1125/1400 stable with mostly 1.15V - 1.18V.
It must be an overclock beast, too bad it isn't under water...


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> block those beasties
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Cant auto focus for some reason


Working on it









Gonna cost me $1k for a full loop though


----------



## Mercy4You

I have an interesting 1-hour Furmark test for my blackscreening reference R9 290X in Uber mode here:

HWFurmarkR9290Xtest.png 3311k .png file


The VRM temps stay below 82 C the whole time. Why the card blackscreened so often I don't know...

I just did a short Furmark run for my Sapphire R9 290X Tri-X, *and my VRM1 hit 98 C within 8 minutes* (core at that time 75 C).
I should mention the GPU was overclocked to 1125/1400 +25mV.


----------



## Paul17041993

Quote:


> Originally Posted by *DeadlyDNA*
> 
> So yeah, i don't know what i was thinking but i loaded 14.1 drivers and the ONLY thing that worked was star swarm in "mantle" mode.
> 
> Hahaha, every game/benchmark i tried crashed with BSOD's. Yeah, so umm back to 13.12 ..............


so what were these BSODs may I ask? sounds like a library link got broken when you updated for whatever reason...


----------



## maynard14

Pls help - I just bought a G10 Kraken and these VRM heatsinks:

Dimensions: 22 x 22 x 10 mm
• Material: Aluminum Alloy
• Accessories: Thermal Tape : 3M9448



I'm confused whether these VRM heatsinks are compatible with the card, and do I also need to put heatsinks on the parts highlighted by the red circle in this picture?

thank you so much


----------



## Sgt Bilko

Quote:


> Originally Posted by *maynard14*
> 
> pls help i just bought a g10 kraken and this vrm heatsinks,..
> 
> Dimensions: 22 x 22 x 10 mm
> • Material: Aluminum Alloy
> • Accessories: Thermal Tape : 3M9448
> 
> 
> 
> im confuse if this vrm heatsinks are compatible with the card,.. and do i need also to put heatsinks on the highlight red circle on this picture?
> 
> thank you so much


What you highlighted in red is the VRAM (video memory), and yes, heatsinks on the VRAM as well as the VRMs is a very good idea.


----------



## Paul17041993

Quote:


> Originally Posted by *maynard14*
> 
> pls help i just bought a g10 kraken and this vrm heatsinks,..
> 
> Dimensions: 22 x 22 x 10 mm
> • Material: Aluminum Alloy
> • Accessories: Thermal Tape : 3M9448
> 
> 
> 
> im confuse if this vrm heatsinks are compatible with the card,.. and do i need also to put heatsinks on the highlight red circle on this picture?
> 
> thank you so much


OK, what you have there in that picture are memory heatsinks; you put those on the 16 memory ICs, which are the large flat black chips located all around the GPU core.

The units in the orange circle are your memory VRMs, and the ones in the yellow highlighted rectangle are your core VRMs. Most importantly, they both need cooling, so you need long heatsinks to fit them correctly; the memory VRMs need a heatsink similar to the memory heatsinks.


----------



## maynard14

Quote:


> Originally Posted by *Sgt Bilko*
> 
> What you higlighted in red is the Vram (Video Memory) and yes, heatsinks on the Vram as well as the Vrm's is a very good idea


Thank you again for your reply, sir. I only have 24 VRM heatsinks... is that enough, sir? I'm sorry for being a noob.


----------



## maynard14

Quote:


> Originally Posted by *Paul17041993*
> 
> ok what you have there in that picture are memory heatsinks, you put those on the 16 memory ICs, which are the large flat black things located all around the GPU core.
> 
> the units in the orange circle are your memory VRMs, the ones in the yellow highlighted rectangle are your core VRMs, they both need cooling most importantly, so you need long heatsinks to fit them correctly, the memory VRMs need a heatsink similar to the memory heatsinks.


Oh no, I only found that type of VRM heatsink here in my country... I can't find any long VRM heatsinks. But can I still use the shorter ones (the ones I posted) on the core VRMs?


----------



## Sgt Bilko

Quote:


> Originally Posted by *maynard14*
> 
> Thank you again sir for you reply , i only have 24 vrm heatsinks,.. is that enough sir? im sorry for being noob


Paul has answered that for you









In short, yes, you "should" have enough: 16 for the VRAM and 6 for the VRMs. Take the cooler off and place them on the card without any thermal paste/pads first, to be 100% sure.








Quote:


> Originally Posted by *Paul17041993*
> 
> ok what you have there in that picture are memory heatsinks, you put those on the 16 memory ICs, which are the large flat black things located all around the GPU core.
> 
> the units in the orange circle are your memory VRMs, the ones in the yellow highlighted rectangle are your core VRMs, they both need cooling most importantly, so you need long heatsinks to fit them correctly, the memory VRMs need a heatsink similar to the memory heatsinks.


----------



## maynard14

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Paul as answered that for you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> in short, yes, you "should" have enough, 16 for the Vram and 6 for the vrm's, Take the cooler off and place them on it without any thermal paste/pads to be 100% sure


Yes sir, when I posted my reply to you I hadn't seen sir Paul's reply... but thank you, both of you. I'm just so excited to try it and mod it myself.









Thank you so much OCN! You really helped me.

I'll be very careful later when I put on those VRM/RAM heatsinks.


----------



## Brian18741

Quote:


> Originally Posted by *K Man*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 2.5 slots Im afraid.
> 
> Temps on the top card went from 61C stock to 78C in Crossfire. Looks like my G10s and Kraken x40's wont be wasted after all. It may in fact be a good candidate as it comes with VRM heat sinks, a backplate and another plate that covers the RAM modules (which are both Hynix BTW)
> 
> Thanks for the Heads up with MSI AB and ULPS done. Problem Solved, both reading correctly as 290's now.


Man, those temps are very good. 78C on the top card? My Asus 290 DCU II hits 94C in crossfire, and the bottom card is mid-80s. This is on auto fan speed.

Will you bench Heaven 4.0 with your fans manually set to 100% and update with your temps? I'll do the same later when I'm home.

I'm finding the heat of xfiring these cards unreal on air. So much so that I'm going to _have_ to put them under water!


----------



## phallacy

Thanks for the responses regarding the VDDC, Roboyto and chiknnwatrmln - I guess mine is within normal range too. That's quite a difference in terms of heat generated from water vs air.

I'm adding a third 290x to my rig and will update when it's in! : )


----------



## Imprezzion

I have some mad throttling issues which my previous card didn't have. Any input?

I've got a reference XFX R9 290 flashed to 290X. It unlocks properly btw; benches and GPU-Z confirm it.

Cooled by an Accelero Xtreme III and the stock cooler's VRM parts, which I cut out.

I use MSI AB or ASUS GPU Tweak, and in GPU Tweak for example, with a 150% Power Limit, 1.412v and 1100MHz, the core barely gets to 1000MHz because it throttles like mad.

Temps for the core: ~60C, VRM1 ~65C, VRM2 ~55C.

No need for it to throttle... so why is it?

I bought it secondhand with this BIOS pre-applied, so I'll just go ahead and flash the PT1 BIOS from the OP, as that's the one I ran on my Club3D unlocked card, which didn't have any throttling but clocked dismally to begin with.


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> 1210/1625 MHz @+200mV is the best one of my card (Hynix) can do.
> 
> http://www.3dmark.com/3dm11/7990333
> 
> 
> 
> 
> 
> 
> 
> 
> You need to smash in about 1.5v for more

I have a serious heat issue with my loop. The GPUs are actually doing fine; core & VRM1 max in the 60s & 70s Celsius respectively (31-32C ambient/room temp). The problem is the water in my loop feels pretty hot. I need to solve this before going any further. I just attached temp sensors on the radiators & on the tube (can't get a proper water temp sensor because of stock issues). I'll get some numbers tonight.
Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> block those beasties
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Can't auto focus for some reason


----------



## Imprezzion

Never mind my first post. It had a stock ROM loaded on it.
Flashed the PT1.ROM and it works fine with no throttling at a 1.400v setpoint (~1.20v load). Testing 1200MHz now @ 290X shaders.

Update: Hahaha at least now I know one thing









No running a 1.40v setpoint on unlimited power targets with air-cooled VRMs









Core hit about 70C but the VRM temps stabilized at VRM1 102C, VRM2 61C.
Dropping the AUX voltage to -100mV helped a tad and brought VRM1 down to 96C. Even though they are ''rated'' for 125C constant, I'm not going to run 100C on them long term.

Dropped the OC a slight bit to 1175MHz @ 1.30v (load ~1.15v).
Simply went back to MSI AB at +100mV, which runs at 83C VRM1 and 55C VRM2 with -100mV AUX.
I can live with 83C VRM1 in Valley... Now let's see what kind of load temps BF4 @ Mantle can bring up.

EDIT2: BF4 @ Ultra with Mantle gets to about 65C core and 80C VRMs, flatlined. Is that runnable 24/7?


----------



## brazilianloser

So as of this week... when I uninstalled the Mantle beta and went back to the WHQL driver, all my games have distorted sound. I have updated the BIOS and driver to the most recent, and all other drivers as well. Anyone aware of this problem or experiencing it too? My system is below.


----------



## Mercy4You

Quote:


> Originally Posted by *brazilianloser*
> 
> So as of this week... when I uninstalled the mantle beta and went back to whql driver all my games have a distorted sound. I have updated the bios and driver to the most recent and all other drivers as well. Anyone aware of this problem or experiencing it as well? My system is below.


Catalyst 14.1 installed an AMD High Definition Audio (TrueAudio) device on your PC; maybe there is a conflict with your (onboard) sound drivers...
I've heard other people complaining about that too; it did not affect me when I rolled back to 13.12.


----------



## brazilianloser

Quote:


> Originally Posted by *Mercy4You*
> 
> With 14.1 Catalyst installed AMD High Definition Audio (True Audio) device on your pc, maybe there is a conflict with your (onboard) sound drivers...
> I've heard other people complaining about that too, did not affect me when rolled back to 13.12.


Yeah, I rolled back using DDU in safe mode... so I would guess that DDU does not yet remove TrueAudio. I'll look around my system to see if I find it, to make sure it is not sticking around.


----------



## Mercy4You

Quote:


> Originally Posted by *Imprezzion*
> 
> I can live with 83c VRM1 in Valley.. Now let's see what kind of load temps BF4 @ Mantle can bring up.
> 
> EDIT2: BF4 @ Ultra with Mantle gets to about 65c core and 80c VRM's flat line. Is that runnable 24/7?


Are you going to play BF4 24/7?


----------



## Roboyto

Quote:


> Originally Posted by *Mercy4You*
> 
> Thx for this very complete cooling guide, I'll save it for when things run too hot for me


No problem


----------



## lombardsoup

Fan speed on my R9 270 isn't going up on load with auto fan speed. Checked this in BF4, fan speed stayed at 18% 1000RPM while core temps rose to a roasty 65c. Made sure auto was selected in Afterburner. Manual speed works without issue.


----------



## Mercy4You

Quote:


> Originally Posted by *lombardsoup*
> 
> Fan speed on my R9 270 isn't going up on load with auto fan speed. Checked this in BF4, fan speed stayed at 18% 1000RPM while core temps rose to a roasty 65c. Made sure auto was selected in Afterburner. Manual speed works without issue.


65C for an R9 270 is far from roasty, so don't worry about your fan speed! I think it's fine


----------



## rdr09

Quote:


> Originally Posted by *lombardsoup*
> 
> Fan speed on my R9 270 isn't going up on load with auto fan speed. Checked this in BF4, fan speed stayed at 18% 1000RPM while core temps rose to a roasty 65c. Made sure auto was selected in Afterburner. Manual speed works without issue.


65 isn't anything to be alarmed about, but if it stays at 18% at 80C or above, then that is a problem.

Use the GPU-Z render test (click the ?) and open another instance of it for the sensor tab. Make sure you watch the temp closely. A 270, right? Not a 290?

http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-270-owners-club


----------



## Imprezzion

Quote:


> Originally Posted by *Mercy4You*
> 
> Are you going to play BF4 24/7


Haha no but you get the point









The problem is, with the ''PT1.ROM'' BIOS installed the card won't downclock when idle. I don't like this, as I spend quite some time idle as well, just like all of us.
Went back to an original XFX 290X BIOS, since this card is an XFX after all.

When using +100mV it seems to throttle a little even with a +50 power limit. However, after dropping the voltage to +50mV the throttling is gone.

This card responds to voltage very badly. It seems to run 1150MHz at 1.1-1.12v load (+50mV), going as far as +200mV still won't get 1200MHz stable without artifacts, and it needs +100mV to get to 1175MHz.

So, I decided to see if 1150MHz is actually long-term stable at +50mV with the stock 290X BIOS. This drops the VRMs into the low-mid 70s in Valley and probably high 60s to low 70s in BF4.
Going to try it now









The only issue I have with the card so far is that it has Elpida RAM, so it's a huge miracle it runs 1400MHz stable so far... Can't do 1500MHz without random freezes though.

Still running Aux (PLL) at -100mV. Lowest it'll go.


----------



## Roboyto

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> You two need to tell me all about your findings . Got it


Roger that


----------



## Roboyto

Quote:


> Originally Posted by *Mercy4You*
> 
> Found my own answer
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GPU-Z logs once each second. The highest reading happened in a shorter timeframe, so it was not logged...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Amazing that my GPU is doing 1125/1400 stable with mostly 1.15V - 1.18V.
> It must be an overclock beast, too bad it isn't under water...
> .


Crank your fans to 100% and see how far you can push it! I got nearly all of my OC with stock blower. Maxed at 1215/1625, 88C on core, and 79C on VRM1

If you have a strong contender it will be worth upgrading cooling!
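On the sampling point raised above: since GPU-Z only writes one row per second, a true peak can fall between samples, and all a log can tell you is the highest value it happened to record. A minimal sketch of pulling those recorded peaks out of a sensor log, assuming a simple CSV layout; the column headers and the helper `max_readings` are illustrative, not GPU-Z's actual header names (those vary by card and version):

```python
import csv
import io

def max_readings(csv_text, columns):
    """Return the highest value recorded for each named sensor column."""
    reader = csv.DictReader(io.StringIO(csv_text))
    peaks = {c: float("-inf") for c in columns}
    for row in reader:
        for c in columns:
            value = (row.get(c) or "").strip()
            try:
                peaks[c] = max(peaks[c], float(value))
            except ValueError:
                continue  # skip blank or malformed cells
    return peaks

# Hypothetical log excerpt; real GPU-Z column headers differ by card/version.
sample = """GPU Temperature [C],VRM Temperature 1 [C],VDDC [V]
75,92,1.148
76,98,1.172
74,95,1.156
"""

peaks = max_readings(sample, ["VRM Temperature 1 [C]", "VDDC [V]"])
print(peaks)
```

Even then, remember the reported peak is only the per-second peak; short spikes between samples are invisible, exactly as noted in the quote.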


----------



## Roboyto

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Working on it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Gonna cost me $1k for a full loop though


When I first put my loop together I didn't buy compression fittings, as they add up $$$ VERY quickly. You can likely cut $200-$300 by using herbie clips initially and then upgrading to compression fittings later.


----------



## Roboyto

Quote:


> Originally Posted by *Mercy4You*
> 
> I have an interesting 1-hour Furmark test of my blackscreening reference R9 290X in Uber mode here:
> 
> HWFurmarkR9290Xtest.png 3311k .png file
> 
> 
> The VRM temps stayed below 82C the whole time. Why the card blackscreened so often, I don't know...
> 
> I just did a short Furmark run on my Sapphire R9 290X Tri-X, *and my VRM1 hit 98C within 8 minutes* (core at that time: 75C)
> I should mention the GPU was overclocked to 1125/1400 +25mV


Not fond of Furmark these days personally... the temperatures it generates are unrealistic. Run any number of 3DMark benches, or Heaven/Valley; FFXIV is one of my favorites.


----------



## Roboyto

Quote:


> Originally Posted by *maynard14*
> 
> pls help, I just bought a G10 Kraken and these VRM heatsinks...
> 
> Dimensions: 22 x 22 x 10 mm
> • Material: Aluminum Alloy
> • Accessories: Thermal Tape: 3M9448
> 
> 
> 
> I'm confused whether these VRM heatsinks are compatible with the card... and do I also need to put heatsinks on the highlighted red circle in this picture?


It looks like those are sized for the memory and may be a little too large for the VRMs. If you check my original post that you quoted from, I had a link for Enzotech copper sinks that are smaller.
It is possible to cut/modify the larger sinks you have listed though. I have taken a dremel cutting wheel to Arctic heatsinks before to make them the size I needed.


----------



## Roboyto

Quote:


> Originally Posted by *Brian18741*
> 
> Man those temps are very good. 78C on top card? My Asus 290 DCU II hits 94C in crossfire. The bottom card is mid 80s. This is on auto fan speed.
> 
> Will you bench Heaven 4.0 with your fans manually set to 100% and update with your temps? I'll do the same later when home.
> 
> I'm finding the heat of xfiring these cards unreal on air. So much so that I'm going to _have_ to put them under water!


Don't forget about trying to upgrade the TIM. Maybe your card has poor mounting for some strange reason; unlikely, but possible









Water is always better though


----------



## Brian18741

I'll be changing the TIM at the weekend. Don't have time with work this week unfortunately. Hopefully this, along with an extra 120mm fan, will improve temps, but I can't imagine them improving that much. Watercooling is the only way to go; EK confirmed their WB for the DCU II is in trials and should be out in about 4 weeks, but they have not confirmed it will fit both the 290X and non-X versions.


----------



## Roboyto

Quote:


> Originally Posted by *Brian18741*
> 
> I'll be changing the TIM at the weekend. Don't have time with work this week unfortunately. Hopefully this, added with an extra 120mm fan, will improve temps but I can't imagine them improving that much. Watercooling is the only way to go, EK confirmed their WB for the DCU II is in trials and should be out in 4 weeks but have not confirmed it will fit both 290 X and non X version.


I'd be willing to bet 10-15C with a TIM that doesn't require curing: MX-4, GC Extreme, Xigmatek PTI, etc. I've always seen good results when swapping out stock paste; ASUS, MSI, XFX and Sapphire cards all showed great returns.


----------



## Paul17041993

Furmark is one of those ultra peak burn tests that keeps every single shader and cache at absolute 100% load, so it gives you the absolute maximum possible heat and power draw. Applications almost never punch a GPU that hard, so it's almost useless, though it's sometimes useful for testing PSU headroom and your cooling.
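On the PSU headroom point, a back-of-the-envelope sum of component draws against the PSU rating is the usual sanity check before running a burn test. A rough sketch; every wattage below is an illustrative assumption, not a measurement from this thread:

```python
# Rough PSU headroom estimate. All wattages are assumed, illustrative
# figures; check your own components' spec sheets for real numbers.
components = {
    "R9 290X (stock board power)": 290,
    "overclocked quad-core CPU": 150,
    "motherboard, RAM, drives, fans": 80,
}

psu_rating_w = 750
draw_w = sum(components.values())          # total estimated system draw
headroom_w = psu_rating_w - draw_w         # what's left for OC/transients

print(f"estimated draw: {draw_w} W, headroom: {headroom_w} W on a {psu_rating_w} W unit")
```

A burn test like Furmark will push you toward (and past) the card's rated board power, which is why it's a worst-case check rather than a realistic gaming load.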


----------



## taem

Quote:


> Originally Posted by *Paul17041993*
> 
> furmark is one of those ultra peak burn tests that keeps every single shader and cache at absolute 100% load, so it gives you the absolute maximum possible heat and power draw, but applications almost never punch a GPU that hard and its almost useless, but sometimes useful for testing PSU headroom and your cooling.


The Metro LL bench at max settings on my loop gets me higher temps than I've ever seen in a game; that's my go-to these days for temp/fan curve testing. My games aren't terribly demanding though; I'd say Metro LL and Far Cry 3 are the most demanding games I have installed ATM. I imagine BF4 is more taxing.


----------



## ste.ru

I bought an MSI 290X reference for my case (Silverstone TJ08). Which is the best cooler? I'm thinking of buying the Gelid Solutions Icy Vision Rev. 2


----------



## Cool Mike

Newegg is completely sold out of 290 and 290x.


----------



## Imprezzion

Ok, my unlocked XFX is quite a sad overclocker. It will only do 1125MHz @ +50mV and 1150MHz @ +100mV.
The Elpida RAM won't go any higher than 1350MHz either... lol.

Oh well, at least with Mantle it tears through BF4


----------



## Prozillah

Anyone running the Gigabyte Windforce 290/290X cards in Xfire? Very keen (and it's urgent) to know what kind of temps you're getting under the 3x cooler.


----------



## Mercy4You

Quote:


> Originally Posted by *Paul17041993*
> 
> furmark is one of those ultra peak burn tests that keeps every single shader and cache at absolute 100% load, so it gives you the absolute maximum possible heat and power draw, but applications almost never punch a GPU that hard and its almost useless, but sometimes useful for testing PSU headroom and your cooling.


Quote:


> Originally Posted by *Roboyto*
> 
> Not fond of Furmark these days personally...unrealistic temperatures are generated. Run any number of 3DMark benches, Heaven/Valley, FFXIV is one of my favorites.


Thx guys, good to know; I don't want to push my GPU over the edge. I'll take a look at FFXIV


----------



## Mercy4You

Quote:


> Originally Posted by *Imprezzion*
> 
> Ok, my unlocked XFX is quite a sad overclocker. Will only do 1125Mhz @ +50mV and 1150Mhz @ +100mV.
> RAM Elpida won't go any higher then 1350Mhz either.. Lol.
> 
> Oh well, at least @ Mantle it tears through BF4


Hey Dutchy ("it giet net oan" — it's not on!), it's not so bad, 1150MHz @ +100mV!

Did you see much gain from Mantle in BF4?


----------



## Imprezzion

Yeah, Mantle runs quite well in BF4. Stable, and great FPS; it doesn't drop under 90 even in Gulf's sandstorm @ 64p CQ.


----------



## taem

Quote:


> Originally Posted by *K Man*
> 
> 
> 
> 2.5 slots I'm afraid.
> 
> Temps on the top card went from 61C stock to 78C in Crossfire. Looks like my G10s and Kraken x40's won't be wasted after all. It may in fact be a good candidate as it comes with VRM heatsinks, a backplate and another plate that covers the RAM modules (which are both Hynix BTW)
> 
> Thanks for the heads up with MSI AB and ULPS; done. Problem solved, both reading correctly as 290's now.


Huh, you have Hynix RAM on your PCS+ cards? GrinderCan and I got Elpida. Mine went to 1500 without a murmur though, and clears 3DMark and Unigine benches at 1600; I have yet to fully test 1600 in-game though, or try anything higher.

Thanks for the pic. I plan to crossfire, but I figured the PCS+ at 38mm is too thick, so I got 1 Tri-X and 1 PCS+. The Tri-X is much thinner, so it might do better as GPU1. Hope so anyway.


----------



## maynard14

hey guys... getting ready for my NZXT G10 and XFX R9 290 reference card









i have another question: I'm deciding what thermal paste to use on the GPU core, and I'll be using an Antec Kuhler H2O 620

i think copper is compatible with Liquid Pro



so which is the best? i have both right now



or this



thank you again


----------



## taem

Posted this on the drivers and apps subforum but got no replies; hopefully someone here can help.

My core clock is fluctuating with Afterburner 13 beta 18. It stays constant with Trixx. Hoping it's a settings issue that can be resolved, or maybe a cfg edit. Here are my Afterburner settings:


----------



## Ukkooh

@maynard14:
Liquid Pro is better, but you'll have to be careful with it as it conducts electricity. Basically, that means the slightest bit of it outside the die could cause problems or kill your card.


----------



## maynard14

Quote:


> Originally Posted by *Ukkooh*
> 
> @maynard14:
> Liquid pro is better but you'll have to be careful with it as it conducts electricity. Basically that means that the slightest bit of it outside the die could cause problems or kill your card.


oh I see, I'm a bit scared right now... hmm, thanks... I guess I have to search a lot of videos on how to spread it on the die very carefully. I have done this on my delidded 3570k, but that was last year... maybe I'll just buy a small paint brush and use it to very carefully spread the CLP; I think using Q-tips is dangerous


----------



## Ukkooh

Quote:


> Originally Posted by *maynard14*
> 
> oh i see, im a bit scared right now,. hmm thanks,.. i guess i have to search alot of videos on how to spread it on the die very carefully. i have done this on my 3570kdelided but that was last yr,. maybe ill just by a small paint brush, and use it to very carefully spread the clp, i think using q tips is dangerous


If you already did it with your 3570k, I'm sure you'll be able to do it. Just don't slide the cooler at all when mounting it, to make sure your Liquid Pro stays on the die.


----------



## maynard14

Quote:


> Originally Posted by *Ukkooh*
> 
> If you already did it with your 3570k, I'm sure you'l be able to do it. Just don't slide the cooler at all when mounting it to make sure that your liquid pro stays on the die.


thank you sir, thank you so much... I'll post the result when I get it done later!


----------



## primal92

Quote:


> Originally Posted by *rdr09*
> 
> got one here. bought at launch. mine can bench at 1175/1500 without voltage tweaks, just max the power limit using Trixx. leave mine stock 24/7. you have to watercool reference or use aftermarket coolers.
> 
> sticking to 13.11 'cause 13.12 raised my vcore a bit, and so my temps. i have zero issues with mine, even with Elpida vram.


I'll try that. I'm on 13.12; the Mantle beta was horrendous, I couldn't play anything. Is it safe to max the power limit? Won't it push my VRMs to high temps? I also noticed a buzzing noise when gaming, and it's definitely not the fan, so coil whine? It stops when I switch to the desktop. Should I RMA?

So far I have only been able to do 1050 at +25 power limit.

Thanks


----------



## Brian18741

Quick update on temps for the crossfired DCU II. Gamed for 1 hour with fans on 100%

Top card 83°C. VRM1 78°C. VRM2 69°C.

Bottom card 67°C. VRM1 62°C. VRM2 62°C.


----------



## Paul17041993

Quote:


> Originally Posted by *maynard14*
> 
> oh i see, im a bit scared right now,. hmm thanks,.. i guess i have to search alot of videos on how to spread it on the die very carefully. i have done this on my 3570kdelided but that was last yr,. maybe ill just by a small paint brush, and use it to very carefully spread the clp, i think using q tips is dangerous


get some masking tape and stick it around the silicon; it will help you get it even on the die. Then you take it off after placing the block on at least once, and you should get a perfect coat without getting any on the outer components. You could possibly even leave the tape on if it's non-conductive and handles heat well...


----------



## VSG

Quote:


> Originally Posted by *Brian18741*
> 
> Quick update on temps for the crossfired DCU II. Gamed for 1 hour with fans on 100%
> 
> Top card 83°C. VRM1 78°C. VRM2 69°C.
> 
> Bottom card 67°C. VRM1 62°C. VRM2 62°C.


How much spacing did you have between the cards? What about case air flow? Those factors will influence your GPU temps a lot on stock cooling.


----------



## JordanTr

Quote:


> Originally Posted by *ste.ru*
> 
> I buy a MSI 290X reference for my case( Sivlerstone TJ-08) which are the best cooler? i think to buy the Gelid Solutions Icy Vision Rev.2


I went for the same cooler. However, you should use thermal glue for the VRMs if you don't want the heatsinks to fall off. Another thing: the fans always run at 2200rpm even at idle, so in the daytime it seems quiet enough, but if you want to keep your PC on at night it becomes annoying. So I bought 2x be quiet! Shadow Wings 92mm (1800rpm, 18dB), zip-tied them to the Gelid's radiator, and now it's fine; you can hear only whispers from your PC at night







Temps stayed the same: most of the time under 70 gaming, 75-76 through benchmarks. Keep in mind I've got good airflow in my Fractal Design XL R2. I hope it helps.


----------



## maynard14

Quote:


> Originally Posted by *Paul17041993*
> 
> get some masking tape and stick it around the silicon, it will help to get it even on the die, then you take it off after placing the block on at least once and you should get a perfect coat without getting any on the outer components. You could possibly even leave tape on if its non-conductive and handles heat well...


hi sir, thanks for the advice... is electrical tape enough for this mission? haha, I mean, can I leave the electrical tape on the GPU? or just ordinary masking tape?


----------



## sugarhell

If you mess up with the Liquid Pro/Ultra then don't power up the card. Just clean it off thoroughly with rubbing alcohol and it's okay


----------



## Brian18741

Quote:


> Originally Posted by *geggeg*
> 
> How much spacing did you have between the cards? What about case air flow? Those factors will influence your GPU temps a lot on stock cooling.


This is mine below. Spacing is about standard for crossfire really. Have my case fans on full tilt: 2 120s front intake, a 140 intake on the side panel directly onto the cards, 3 120 exhaust. Have another 120 on the way that I will put on the floor intake, and will replace the TIM at the weekend, so fingers crossed that improves it. But realistically I'm going to have to start saving to put them under water.


----------



## VSG

Ya, I can't really fault your cooling there. The Asus cooler for the Hawaii cards has been a bit underwhelming unfortunately, but luckily EK is preparing blocks for them. You could always try out the NZXT G10 + Asetek cooler + aftermarket heatsinks for memory/VRMs in the meantime.


----------



## Brian18741

Yea, I actually emailed EK about the waterblocks the other day and got a reply saying they should be ready for sale in about 4 weeks. Not that it matters, as it will be months before I can save up for a full loop!

So what's the deal with the Kraken? You pick one up for about $30 and you can fit _any_ 120mm AIO cooler? Also, would two of them fit, is the next question?


----------



## Roboyto

Quote:


> Originally Posted by *Brian18741*
> 
> Yea I actually emailed EK about the waterblocks the they day and got a reply saying they should be ready for sale in about 4 weeks. Not that it matters as it will be months before I can save up for a full loop!
> 
> So whats the deal with the Krackn? You pick one up for about $30 and you can fit _any_ AIO 120mm cooler? Also would two of them fit is the next question?


Quote:


> Originally Posted by *Roboyto*
> 
> Your best bet with these cards is upgrading the cooling solution and make sure you *keep the main VRMs cool*. Main VRMs are highlighted in yellow and secondary VRMs circled in orange.
> 
> 
> *Attach an AIO CPU water cooler*. Will do a fantastic job of cooling the GPU, but you will need to attach heatsinks to RAM and VRMs. You have a couple options here as well.
> 
> 
> *Arctic Accelero Hybrid* which is the most expensive of the lot, if you can find one as it seems they may be discontinued. They are listed as compatible with the 290(X). These have a built in fan to assist in cooling the RAM and VRMs. The kit includes the heatsinks for RAM and VRMs
> http://www.arctic.ac/us_en/products/cooling/vga/accelero-hybrid.html
> 
> *NZXT Kraken G10* This is a bracket designed to attach Asetek style AIO coolers to the card. It also has a fan holder on it to assist in cooling the RAM and VRMs. I believe these are still on backorder because of their high demand and stylish looks
> 
> 
> 
> 
> 
> 
> 
> 
> You must purchase heatsinks for RAM and VRMs
> https://www.nzxt.com/product/detail/138-kraken-g10-gpu-bracket.html
> Asetek AIO coolers are the round ones like this. You can also see a compatibility list on NZXT website.
> 
> 
> *KeplerDynamics* AIO adapter bracket. This bracket is compatible with the Asetek coolers as well. Downside is there is no fan attachment. You will still need to attach heatsinks to RAM and VRMs and then make sure you have sufficient air flow over the heatsinks/card.
> Orders from here can sometimes take a little while. A friend of mine ordered a few for his 7970 miner cards and it took several weeks for them to arrive. He is very happy with the results though
> 
> 
> 
> 
> 
> 
> 
> 
> http://keplerdynamics.com/sigmacool/radeonmkii
> *Heatsinks for RAM and VRMs*
> You will require two sizes. Larger for RAM and smaller for VRMs.
> 
> *Enzotech copper sinks*
> My brother has these on his Xfire 7950's that have AIO coolers on them.
> Larger Sinks for RAM - http://www.newegg.com/Product/Product.aspx?Item=N82E16835708012
> Smaller Sinks for VRMs - http://www.newegg.com/Product/Product.aspx?Item=N82E16835708011
> 
> 
> *Akust RAM Sinks*
> Aluminum - http://www.newegg.com/Product/Product.aspx?Item=9SIA3TR18A6835
> Copper - http://www.newegg.com/Product/Product.aspx?Item=9SIA3TR18A6835
> You would require more than one package of the copper as there are 16 RAM chips
> 
> 
> *Ebay Heatsinks*
> Here is one example, there are many more you can choose from if you search for RAM heatsink. The prices are very low on eBay and I have seen people mention them working well.
> http://www.ebay.com/itm/8pcs-Aluminium-Heatsink-For-Motherboard-DDR-VGA-RAM-Memory-IC-Chipset-Cooler-W-/370655913100?pt=US_Memory_Chipset_Cooling&hash=item564cd0648c


----------



## Abyssic

Quote:


> Originally Posted by *Brian18741*
> 
> This is mine below. Spacing is about standard for crossfire really. Have my case fans on full tilt. 2 120s front intake, 140 intake on side panel directly onto cards, 3 120 exhaust. Have another 120 on the way that I will put on the floor intake and will replace TIM at the weekend so fingers crossed that improves it. But realistically I going to have to start saving to put them under water.


nice system man ^^ have you swapped out your TIM?


----------



## VSG

Well that is a handy post to have


----------



## Paul17041993

Quote:


> Originally Posted by *maynard14*
> 
> hi sir, thanks for the advice,. is electric tape enough for this mission? aha i mean can i leave the electric tape on the gpu ? or just ordinary masking tape ?


I think either should be fine; electrical would be less likely to conduct, though I don't know how conductive masking tape usually is. But like sugarhell said, you just clean off any overflow before using it; the tape just makes it slightly easier by catching the overflow instead of letting it touch the surrounding components. You shouldn't really need to leave the tape there when using the card if you placed the block correctly beforehand.

so in a nutshell:
> place tape around the die/silicon, over the little components but not on the silicon itself
> place and spread paste all over the silicon
> place the block on correctly
> any overflow should be caught on the tape; remove said tape and you should be able to place the block again without any overflow
> place and remove the block again to be sure, and you should then have a perfect coating ready for use.

keeping in mind though, if the tape is thick enough it could stop the block from going all the way down, so it's good to test without it there to be sure. Someone had a picture of theirs here before but I can't remember how far back in the topic it was...

EDIT: ah here we are;
Quote:


> Originally Posted by *tsm106*
> 
> CLU for the win. Masking tape for applying and removal. Use a baby wipe to wipe it off.


----------



## Roboyto

Quote:


> Originally Posted by *Brian18741*
> 
> Also would two of them fit is the next question?


Quote:


> Originally Posted by *Roboyto*
> 
> Pics from my Bro
> 
> 
> 
> 
> 
> I forgot he has one with fan mount and the other not. However, if you have 2 empty PCI-E slot spacing you should be alright it seems.


The pics are using TriptCC brackets, which are essentially Kraken G10s. You can Xfire with them









You just have to have a spot to fit the two 120mm AIO rads. Important to keep in mind that an AIO rad is taller than a 120mm fan!

I've made the mistake of thinking I could install one in an older case, and it wouldn't attach to the rear 120mm fan spot because there wasn't enough height. He had no optical drive, so I ended up using industrial strength velcro to hold the radiator in place in the optical drive bay area and pointed the fan to exhaust out the front of the case... worked very well actually lol


----------



## neurotix

Bumping this thread in my list


----------



## rdr09

Quote:


> Originally Posted by *Brian18741*
> 
> This is mine below. Spacing is about standard for crossfire really. Have my case fans on full tilt. 2 120s front intake, 140 intake on side panel directly onto cards, 3 120 exhaust. Have another 120 on the way that I will put on the floor intake and will replace TIM at the weekend so fingers crossed that improves it. But realistically I going to have to start saving to put them under water.


Very nice rig.

edit: wait, 750W?


----------



## Gabkicks

My 2nd R9 290 came today







2 Sapphire R9 290 Tri-X cards


----------



## spinejam

$436.17 & FS =









http://www.amazon.com/gp/product/B00GRNULDA/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=ATVPDKIKX0DER


----------



## maynard14

guys... sorry for a lot of questions lately... pls bear with me









still deciding: what's the best budget cooler for the NZXT G10?

NZXT : Kraken X60, Kraken X40
Corsair : H105, H110, H90, H75, H55 , H50
Antec : KUHLER H2O 920V4, KUHLER H2O 620V4, KUHLER H2O 920, KUHLER H2O 620
Thermaltake : Water 3.0 Extreme, Water 3.0 Pro, Water 3.0 Performer, Water 2.0 Extreme, Water 2.0 Pro, Water 2.0 Performer
Zalman : LQ-320, LQ-315, LQ-310

will the Kuhler H2O 620 be enough for the GPU? or should I just save up some more and buy the Kraken X60?


----------



## Spectre-

Quote:


> Originally Posted by *maynard14*
> 
> guys,,, sorry for alot of questions lately... pls bear with me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> still deciding what best budget cooler for the nzxt g10?
> 
> NZXT : Kraken X60, Kraken X40
> Corsair : H105, H110, H90, H75, H55 , H50
> Antec : KUHLER H2O 920V4, KUHLER H2O 620V4, KUHLER H2O 920, KUHLER H2O 620
> Thermaltake : Water 3.0 Extreme, Water 3.0 Pro, Water 3.0 Performer, Water 2.0 Extreme, Water 2.0 Pro, Water 2.0 Performer
> Zalman : LQ-320, LQ-315, LQ-310
> 
> will the kuhler h20 620 enough for the gpu? or just save some money and buy the kraken x60?


lol, I have the H55 and I am doing 1150 @ +75mV on the core all the time now

i think adding a 140mm cooler or a thick 120mm would defs be better than the H55


----------



## maynard14

Quote:


> Originally Posted by *Spectre-*
> 
> lol i have the H55 and i am doing 1150 @75mv on core all the time now
> 
> i think adding a 140mm cooler or a thick 120mm would defs be better than the H55


i see. what are your temps using the H55, bro? thanks


----------



## Roboyto

Quote:


> Originally Posted by *maynard14*
> 
> guys,,, sorry for alot of questions lately... pls bear with me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> still deciding what best budget cooler for the nzxt g10?
> 
> NZXT : Kraken X60, Kraken X40
> Corsair : H105, H110, H90, H75, H55 , H50
> Antec : KUHLER H2O 920V4, KUHLER H2O 620V4, KUHLER H2O 920, KUHLER H2O 620
> Thermaltake : Water 3.0 Extreme, Water 3.0 Pro, Water 3.0 Performer, Water 2.0 Extreme, Water 2.0 Pro, Water 2.0 Performer
> Zalman : LQ-320, LQ-315, LQ-310
> 
> will the kuhler h20 620 enough for the gpu? or just save some money and buy the kraken x60?


A friend of mine is using an Antec 620 on (3) 7970s with heavy overclocks. Get a high static pressure fan like the Corsair SP120 and I'm willing to bet it will be more than sufficient to cool one of these.


----------



## maynard14

Quote:


> Originally Posted by *Roboyto*
> 
> Friend of mine using Antec 620 on (3) 7970s with heavy overclocks. Get high static pressure fan like Corsair SP120 and I'm willing to bet it will be more than sufficient to cool one of these.


thank you, i will buy the kuhler 620 later from a friend, it cost me 44 dollars.. i think thats cheap. ill buy 2x SP120 fans also. im so excited, cant wait to assemble it later


----------



## Roboyto

Quote:


> Originally Posted by *maynard14*
> 
> thank you , i will buy the kuhler 620 later from a friend which cost me 44 dollars,.. i think thats cheap and buy 2x 120 sp fans also. im so excited cant wait to assemble it later


Push/Pull with those fans works very well. He has AIOs for all his CPUs and GPUs and they all have Corsair Air Series fans on them, most (if not all) of the GPUs are running with push/pull.

For gaming my bro's 7950s don't break 50C with a single SP120. The 290 runs hotter, but I'd wager you'll be at or around the 60C mark while gaming; benches may push you higher than that.

Since you'll be running push/pull you could even use the SP120 High Performance edition with the 7V step-down resistors to keep the noise down. I have the HP ones and they make some noise at full blast...not terrible, but more than I'm comfortable with. I spent $100s on custom water cooling for the benefit of *silence*, so I have to have silence!


----------



## Roboyto

I think I have found the absolute limit of my GPU. Temps are still solid too. GPU 47C, VRMs 71/40C

XFX R9 290 BE @ 1315/1700 - P17940

200mV offset in Trixx came out to 1.422V

...Maybe time to try and push the 4770k a little more







I want P18000!

http://www.3dmark.com/3dm11/7996244



Check out this ridiculous wattage though!


----------



## K Man

Quote:


> Originally Posted by *Brian18741*
> 
> Man those temps are very good. 78C on top card? My Asus 290 DCU II hits 94C in crossfire. The bottom card is mid 80s. This is on auto fan speed.
> 
> Will you bench Heaven 4.0 with your fans manually set to 100% and update with your temps? I'll do the same later when home.
> 
> I'm finding the heat of xfiring these cards unreal on air. So much so that I'm going to _have_ to put them under water!


Yes I can.

Stock clocks, 100% fan, top card 67C, bottom card 53C max.

That was with everything maxed out in Heaven.


----------



## rdr09

Quote:


> Originally Posted by *Roboyto*
> 
> I Think I have found the absolute limit of my GPU. Temps are still solid too. GPU 47 VRMs 71/40
> 
> XFX R9 290 BE @ 1315/1700 - P17940
> 
> 200mV offset in Trixx came out to 1.422V
> 
> ...Maybe time to try and push the 4770k a little more
> 
> 
> 
> 
> 
> 
> 
> I want P18000!
> 
> http://www.3dmark.com/3dm11/7996244
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Check out this ridiculous wattage though!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


congrats. you beat my 290.


----------



## Arizonian

Quote:


> Originally Posted by *Gabkicks*
> 
> My 2nd R9 290 came today
> 
> 
> 
> 
> 
> 
> 
> 2 Sapphire R9 290 Tri-X cards
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added x2









Quote:


> Originally Posted by *maynard14*
> 
> thank you , i will buy the kuhler 620 later from a friend which cost me 44 dollars,.. i think thats cheap and buy 2x 120 sp fans also. im so excited cant wait to assemble it later
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Nice. That's all I need - updated


----------



## Roboyto

Quote:


> Originally Posted by *rdr09*
> 
> [/SPOILER]
> 
> congrats. you beat my 290.


Thanks! 290(X) are sparse in that list, as well as quad core i7. What's your 2700k running at?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Roboyto*
> 
> I Think I have found the absolute limit of my GPU. Temps are still solid too. GPU 47 VRMs 71/40
> 
> XFX R9 290 BE @ 1315/1700 - P17940
> 
> 200mV offset in Trixx came out to 1.422V
> 
> ...Maybe time to try and push the 4770k a little more
> 
> 
> 
> 
> 
> 
> 
> I want P18000!
> 
> http://www.3dmark.com/3dm11/7996244
> 
> 
> 
> Check out this ridiculous wattage though!


Wow, good clocks. I wish my card clocked that high, the point where voltage no longer helps is at 1245MHz core. The jump from 1215 to 1230 without artifacts alone requires an additional 30mV...

Your VRM1 temps seem pretty high though. I ran +200mV today while gaming Skyrim for a while (my triple monitor setup keeps me under constant 100% load; finally got my 3rd monitor back from RMA today) and my VRM1 topped out at like 45c with the core around 47c.

Did you get the right thickness thermal pads for your block?

For some reason my card's voltage droops quite a bit, even with +200mV I'm barely above 1.3v actual. I wish there was a BIOS with LLC and downclocking, the only one with LLC that I know of is PT3 and I don't want to run my card at 1200+MHz all day..


----------



## Roboyto

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Wow, good clocks. I wish my card clocked that high, the point where voltage no longer helps is at 1245MHz core. The jump from 1215 to 1230 without artifacts alone requires an additional 30mV...
> 
> Your VRM1 temps seem pretty high though. Running +200mV today while gaming Skyrim for a while (triple monitor setup keeps me under constant 100% load, finally got my 3rd monitor back from RMA today) and my VRM1 topped at like 45c with core around 47c.
> 
> Did you get the right thickness thermal pads for your block?
> 
> For some reason my card's voltage droops quite a bit, even with +200mV I'm barely above 1.3v actual. I wish there was a BIOS with LLC and downclocking, the only one with LLC that I know of is PT3 and I don't want to run my card at 1200+MHz all day..


Fujipoly Ultra Extreme coming tomorrow







Hopefully they help.

Thermal pads were pre-cut.

It's not like 71C is bad for the VRMs, but it could be better. 71 is 8C higher than at my previous clocks, but that's also an additional 75mV to get only 50MHz more.

If I could give it more juice I might have a little more to squeeze out of it.

Having downclocking is definitely necessary. I could always load the LLC bios just to test and see...

What monitor did you have RMA'd?


----------



## Ukkooh

Quote:


> Originally Posted by *Roboyto*
> 
> Thanks! 290(X) are sparse in that list, as well as quad core i7. What's your 2700k running at?


Quote:


> Originally Posted by *Roboyto*
> 
> I Think I have found the absolute limit of my GPU. Temps are still solid too. GPU 47 VRMs 71/40
> 
> XFX R9 290 BE @ 1315/1700 - P17940
> 
> 200mV offset in Trixx came out to 1.422V
> 
> ...Maybe time to try and push the 4770k a little more
> 
> 
> 
> 
> 
> 
> 
> I want P18000!
> 
> http://www.3dmark.com/3dm11/7996244
> 
> 
> Spoiler: Warning: Image!
> 
> 
> 
> 
> 
> 
> 
> Check out this ridiculous wattage though!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Image!


Thanks to you I really feel like trying how my card ocs with +200mV now.


----------



## maynard14

is it a bad idea to use this kuhler for the gpu?



huhu


----------



## Sgt Bilko

Quote:


> Originally Posted by *maynard14*
> 
> is it a bad idea to use this kuhler for the gpu?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> huhu


You should be fine with using any AIO with Hawaii


----------



## Sgt Bilko

Quote:


> Originally Posted by *Roboyto*
> 
> I Think I have found the absolute limit of my GPU. Temps are still solid too. GPU 47 VRMs 71/40
> 
> XFX R9 290 BE @ 1315/1700 - P17940
> 
> 200mV offset in Trixx came out to 1.422V
> 
> ...Maybe time to try and push the 4770k a little more
> 
> 
> 
> 
> 
> 
> 
> I want P18000!
> 
> http://www.3dmark.com/3dm11/7996244
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Check out this ridiculous wattage though!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


That just makes me want water cooling more now









At +200mV i can bench at 1250 on the core, haven't tested higher yet.........need cooler temps


----------



## maynard14

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You should be fine with using any AIO with Hawaii


i mean, should it be ok if i use that cooler sir? thats the actual cooler im gonna use. i tried lapping it and thats the result. is it fine?


----------



## Sgt Bilko

Quote:


> Originally Posted by *maynard14*
> 
> i mean, should it be ok if i use that cooler sir? thats the actual cooler im gonna use. i tried lapping it and thats the result. is it fine?


I don't see any issue with it, looks good


----------



## Panther Al

Coming over to spend some time with the Red Team:

2 Sapphire 290x's, Stock all around (Will eventually go under water, because my god are they loud).


----------



## maynard14

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I don't see any issue with it, looks good


thank you so much! im in the middle of cleaning the gpu core.. will post some pics later wahoo


----------



## maynard14

im not confident on what i have done,.....


----------



## ste.ru

Quote:


> Originally Posted by *JordanTr*
> 
> I went for the same cooler. However you should use thermal glue for the vrms if you dont want the heatsinks to fall off. Another thing - the fans always run at 2200rpm even on idle, so in the daytime it seems quiet, but if you want to keep your pc on at night it becomes annoying, so i bought 2x be quiet Shadow Wings 92mm 1800rpm, 18db, zip tied them to the Gelid's radiator and now its fine, you can hear only whispers from your pc at night
> 
> 
> 
> 
> 
> 
> 
> temps stayed the same, most of the time gaming under 70, through benchmarks 75-76, keep in mind i got good airflow in my fractal design xl r2. I hope it helps.


thanks







i think to buy it..


----------



## maynard14

ahmm but is it ok to put the heatsinks near the capacitors even though they are really close? will it short out the capacitors?


----------



## maynard14

help pls.. im still not turning it on till i know it is safe that the vrm heatsinks are touching the capacitors...


----------



## Prozillah

Guys am I doing something seriously wrong here?

Just picked up my 2x new Gigabyte 290 Windforce OC cards - swapped them in for my 2x GTX 670s, and on the same settings in BF4 it is nowhere near as smooth or nice???? Also - when monitoring temps in Afterburner I notice that one card has a temp display while the other shuts off completely on the desktop - the card only kicks in once i load a game etc. I guess it's a power saving feature??? Anyway - I figure it is just a matter of setting them up correctly or using the magic recipe with BF4 - because I would damn well expect 2x WINDFORCE OC 290s 4GB to kick 2x GeForce 670s 2GB ass...


----------



## bloodmaster

I have a Sapphire Tri-X 290 and i flashed it with the PowerColor PCS BIOS.

After flashing i discovered that the PowerColor PCS BIOS has +50mV on VDDC by default!!

Sapphire Tri-X 290 default clocks under load are 1000/1300 at 1.148-1.152V vs the PowerColor PCS R9 290's 1040/1350 at 1.195-1.205V.
The PowerColor PCS BIOS also has more aggressive fan settings.


----------



## Ukkooh

Quote:


> Originally Posted by *bloodmaster*
> 
> I have a Sapphire tri-x 290 and i flash it with the PowerColor PCS bios .
> 
> After flashing i discover that the PowerColor PCS bios has by default +50mv in VDDC !!
> 
> Sapphire tri-x 290 default clocks under load is 1000/1300 with 1,48-1,52v vs PowerColor PCS r290 1040/1350 with 1,95-2,05v.
> Also Powercolor PCS bios has more aggressive fan settings.


I think there is something wrong with your voltage readings. What program did you use to check them?


----------



## Mercy4You

Quote:


> Originally Posted by *maynard14*
> 
> help pls,.. still im not turning it on till i know it is safe that the vrm heatsinks are touching the capacitors...


Hmm, i see your point! Think the VRM heatsinks are a little too big... but I'm a noob at this


----------



## Abyssic

Quote:


> Originally Posted by *Prozillah*
> 
> Guys am I doing something seriously wrong here?
> 
> Just picked up my 2x new gigabyte 290 windforce OC cards - swapped them out from my 2x GTX670's and on the same settings in BF4 it is no where near as smooth or nice???? Also - when monitoring temps in afterburner I notice that one card has a temp display while the other shuts off completely on desktop - the card only kicks in once i load a game etc. I guess its a power saving feature??? Anyway - I figure it is just a matter of setting them up correctly or using the magic recipe with BF4 - because I would damn well expect 2x WINDFORCE OC 290's 4gb to kick 2x geforce 670's 2gb ass...


the turning-off thing is a feature called ultra low power state (ULPS). you can and should turn it off using Trixx or Afterburner. don't know about your performance issues though
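For anyone doing it by hand instead of through Trixx/Afterburner, the commonly reported registry tweak behind this looks roughly like the sketch below. This is an assumption-laden example, not official AMD documentation: the `EnableUlps` value name is the one forums usually cite, and the `0000`/`0001` adapter subkeys vary per system, so check which subkeys actually hold your AMD cards first, and back up the registry before touching anything.

```shell
:: Hedged sketch: disable ULPS on the first two display adapters.
:: Run from an elevated (administrator) command prompt.
:: {4d36e968-e325-11ce-bfc1-08002be10318} is the standard display-adapter class key;
:: the 0000/0001 subkey numbers are assumptions - verify yours in regedit first.
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000" /v EnableUlps /t REG_DWORD /d 0 /f
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0001" /v EnableUlps /t REG_DWORD /d 0 /f
```

Reboot afterwards so the driver picks up the change; setting the value back to 1 re-enables ULPS.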


----------



## bloodmaster

Quote:


> Originally Posted by *Ukkooh*
> 
> I think there is something wrong with your voltage readings. What program did you use to check them?


I use HWiNFO64 and GPU-Z.0.7.7


----------



## Paul17041993

Quote:


> Originally Posted by *maynard14*
> 
> im not confident on what i have done,.....


oh my, should be fine though. you may have to watch the memory IC on the top-right corner, and be sure you put a heatsink on that bottom end of the VRMs.
Quote:


> Originally Posted by *maynard14*
> 
> ahmm but is it ok to put the heatsinks near the capacitors even they are really close? will it short out the capacitors?


they should be fine, the caps are not designed to send charge through the casing, nor should the other components be able to receive charge from the heatsinks.


----------



## Ukkooh

Quote:


> Originally Posted by *bloodmaster*
> 
> I use HWiNFO64 and GPU-Z.0.7.7


Something is wrong, because if your card's voltage had really been 1.9V it would have fried your card in seconds on air cooling. I think you should flash the original bios back.


----------



## Prozillah

Quote:


> Originally Posted by *Abyssic*
> 
> the turning off thing is a feature called ultra low power state (ULPS) you can and should turn it off using trixx or afterburner. don't about your performance issues though


Beauty - did that through the registry and we are good on that front. More specifically on the performance issues - I'm using the 14.1 beta drivers, and in-game BF4 switched to Mantle. still was ****. Been reading a lot about frame pacing so I turned that **** off - started up BF4 again and BAM - freeze & 0x116 BSOD. Again - looks like other people were experiencing this.

So would I be better at this stage to revert back to the WHQL drivers and stay on the DirectX API - can anyone confirm if that will improve xfire 290 performance in bf4?


----------



## bloodmaster

Quote:


> Originally Posted by *Ukkooh*
> 
> Something is wrong because if your card's voltage would have really been 1.9V it would have fried your card in seconds on air cooling. I think you should flash the original bios back.


You are right! I typed it wrong and have corrected it.
The main point hasn't changed though: the PowerColor BIOS does overvolt.


----------



## Forceman

Quote:


> Originally Posted by *maynard14*
> 
> help pls,.. still im not turning it on till i know it is safe that the vrm heatsinks are touching the capacitors...


Touching the barrel caps is fine, but you don't want anything touching the flat surface mount caps (they look like two dots of solder with a brown thing between them - you can see them next to the bottom right VRMs in this pic).


----------



## ReXtN

Can I join the Owners Club?
http://www.techpowerup.com/gpuz/kngrn/

MSI R9 290X Referance design with EK Full Cover Waterblock.


----------



## the9quad

Quote:


> Originally Posted by *Prozillah*
> 
> Beauty - did that through registry and we are good on that front. More specifically put around the performance issues - I'm using the 14.1 beta drivers, in game bf4 switched to mantle. still was ****. Been reading alot about frame pacing so I turned that **** off - started up BF4 again and BAM - freeze & x0116 BSOD. Again - looks like other people were experiencing this.
> 
> So would I be better at this stage to revert back to the WHQL drivers and stay using the DirectX APU - can anyone confirm if that will improve xfire 290 performance in bf4?


I have yet to see anyone with a 290/290x using the 14.1's with crossfire who hasn't had one issue or another. I suggest using the whql's until the next beta release.


----------



## Sgt Bilko

Quote:


> Originally Posted by *the9quad*
> 
> I have yet to see anyone with a 290/290x using the 14.1's with crossfire who hasn't had one issue or another. I suggest using the whql's until the next beta release.


I agree, Single card worked alright but Crossfire was messed up, in pretty much every game i tried to run.

I'm back on the 13.12 drivers for now till the Mantle WHQL driver or a promising Beta comes out


----------



## Widde

Can I get updated to msi/sapphire? ^^ http://piclair.com/vju0d

2 R9 290s ^_^


----------



## BradleyW

What's the truth about these black screen issues?


----------



## Ukkooh

Quote:


> Originally Posted by *BradleyW*
> 
> What's the truth about these black screen issues?


Some of the initial batches had faulty cards that blackscreened but the later batches seem to be all good.


----------



## rdr09

Quote:


> Originally Posted by *Roboyto*
> 
> Thanks! 290(X) are sparse in that list, as well as quad core i7. What's your 2700k running at?


it runs 4.9GHz in benches and 4.5 for 24/7 use. saw Osjur's 290X? it crossed 22K in graphics. imagine if we had X79 systems.


----------



## the9quad

Quote:


> Originally Posted by *BradleyW*
> 
> What's the truth about these black screen issues?


Personal experience my first card bought on launch day black screened. Have since bought 3 new ones and they all work fine. I deduced that in my situation it was a hardware issue.


----------



## Roboyto

Quote:


> Originally Posted by *Ukkooh*
> 
> Thanks to you I really feel like trying how my card ocs with +200mV now.


Quote:


> Originally Posted by *Sgt Bilko*
> 
> [/SPOILER]
> 
> That just makes me want water cooling more now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> At +200mV i can bench at 1250 on the core, haven't tested higher yet.........need cooler temps


I'm glad I could inspire you guys


----------



## Roboyto

Quote:


> Originally Posted by *rdr09*
> 
> it runs 4.9GHz in benches and 4.5 for 7/24 use. saw Osjur's 290X? it crossed 22K in graphics. imagine we have X79 systems.


4770k is aggravating. I'm direct-to-die cooling with a Raystorm block and the temps are great, but it hasn't helped me get past 4.5GHz. I was at the same speeds/voltages with the delid and IHS still on while using an Antec 620 with one fan.



I didn't see that yet, impressive!

I know right...Damn 6 core CPUs! We'd be much higher on the list if we had, roughly, a 50% physics score boost.


----------



## Roboyto

Quote:


> Originally Posted by *rdr09*
> 
> Very nice rig.
> 
> edit: wait, 750W?


I asked him the same question a couple days ago. He's presently running at stock clocks on both cards and the 750W has been holding up well for him.


----------



## Roboyto

Quote:


> Originally Posted by *Prozillah*
> 
> Beauty - did that through registry and we are good on that front. More specifically put around the performance issues - I'm using the 14.1 beta drivers, in game bf4 switched to mantle. still was ****. Been reading alot about frame pacing so I turned that **** off - started up BF4 again and BAM - freeze & x0116 BSOD. Again - looks like other people were experiencing this.
> 
> So would I be better at this stage to revert back to the WHQL drivers and stay using the DirectX APU - can anyone confirm if that will improve xfire 290 performance in bf4?


Make sure you cleaned out Nvidia drivers thoroughly.

Seems everyone is experiencing issues with those 14.1 drivers. I would say revert back to what was working better. I'm still on 13.12, with a single 290, and haven't had issues with anything yet.

ULPS off should definitely help your cause. Trixx has option to force constant voltage, this may help as well.

How are your temps looking in Xfire?

They're still ironing out the kinks with Mantle unfortunately. Should have been expected IMHO; it's an entirely new API so there's bound to be speed bumps and pot holes along the way. Another thing to consider is that Mantle wasn't designed around enthusiast level hardware like the 290(X). Its primary goals are to help boost performance for AMD APUs, weaker CPUs, and lower tier GCN graphics cards. I'm not sure if folks with the lower-end hardware are having these same issues?

They will get it functioning properly though, just need to give it a little time.


----------



## Mercy4You

Quote:


> Originally Posted by *BradleyW*
> 
> What's the truth about these black screen issues? I think if it was a hardware issue, I too would have black screen issues since I use Elpida, but I don't.


Quote:


> Originally Posted by *Ukkooh*
> 
> Some of the initial batches had faulty cards that blackscreened but the later batches seem to be all good.


It's amazing to learn that there are still people believing the blackscreens are not hardware related...


----------



## rdr09

Quote:


> Originally Posted by *Roboyto*
> 
> I asked him the same question a couple days ago. He's presently running at stock clocks on both cards and the 750W has been holding up well for him.


750w for 2 290s. tsk, tsk.

http://www.3dmark.com/3dm11/7510934


----------



## Roboyto

Quote:


> Originally Posted by *rdr09*
> 
> 750w for 2 290s. tsk, tsk.
> 
> http://www.3dmark.com/3dm11/7510934




I figured the same thing. He won't be able to push the cards to their max with only 750W, but he hasn't had issues aside from temps on the cards. Maybe a higher wattage, more efficient PSU could bring his temps down a bit? I know he is going to be changing TIM in a couple days to see where he ends up.

His PSU isn't terrible. Manufactured by Seasonic for XFX. Hardware Secrets got it to belt out 931W @ 47C ambient with 79% efficiency.
http://www.hardwaresecrets.com/article/XFX-PRO-750-W-Power-Supply-Review/1182/9
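Quick sanity check on what that efficiency figure means at the wall (my arithmetic, not from the review): delivering 931W at 79% efficiency implies roughly 1178W drawn from the outlet.

```python
# Wall draw implied by the Hardware Secrets result:
# input power = output power / efficiency
output_w = 931        # watts delivered to the system
efficiency = 0.79     # 79% efficiency at that load

input_w = output_w / efficiency
print(round(input_w))  # ~1178 W pulled from the wall
```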

I won't be able to top 18774 I don't think. Kick a$$







score representing AMD though









Interesting that nearly 5.4GHz on a 2600k only scores 605pts higher for physics than my 4770k @ 4.5GHz; a 4.9% higher score for a 19.6% clock speed increase.

Graphics score comparison... 22683/21578...5.1% better. There's the advantage of those extra cores on the X.
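For anyone who wants to redo those percentages, it's just relative gains. Numbers come from the scores above; the 5.38GHz figure is my reading of "nearly 5.4GHz", so treat it as approximate:

```python
def pct_gain(new, old):
    """Percent improvement of new over old."""
    return (new - old) / old * 100

# Clock speeds: 2600k near 5.4GHz vs 4770k @ 4.5GHz
print(round(pct_gain(5.38, 4.5), 1))     # ~19.6% clock increase
# Graphics scores: 22683 vs 21578
print(round(pct_gain(22683, 21578), 1))  # ~5.1% better
```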


----------



## rdr09

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> I figured same thing. Won't be able to push cards to their max with only 750, but he hasn't had issues aside from temps on the cards. Maybe a higher wattage and more efficient PSU could bring his temps down a bit? I know he is going to be changing TIM in a couple days to see where he ends up.
> 
> His PSU isn't terrible. Manufactured by Seasonic for XFX. Hardware secrets got it to belt out 931W @ 47C ambient with 79% efficiency.
> http://www.hardwaresecrets.com/article/XFX-PRO-750-W-Power-Supply-Review/1182/9
> 
> I won't be able to top 18774 I don't think. Kick a$$
> 
> 
> 
> 
> 
> 
> 
> score representing AMD though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Interesting that nearly 5.4GHz on a 2600k only scores 605pts higher for physics than my 4770k @ 4.5GHz.; 4.9% higher score for 19.6% clock speed increase.
> 
> Graphics score comparison... 22683/21578...5.1% better. There's the advantage of those extra cores on the X.


what did you get with tess on? i only use Trixx . . .

http://www.3dmark.com/3dm11/7748882


----------



## Roboyto

Quote:


> Originally Posted by *rdr09*
> 
> what did you get with tess on? i only use Trixx . . .
> 
> http://www.3dmark.com/3dm11/7748882


Haven't tested with tessellation on lately, but my best score so far is:

P16132 | 1265/1725 | +175mV | GPU 44 VRMs 62/37 |

http://www.3dmark.com/3dm11/7998588


----------



## rdr09

Quote:


> Originally Posted by *Roboyto*
> 
> Haven't tested with tesselation on lately, but best score so far is:
> 
> P16132 | 1265/1725 | +175mV | GPU 44 VRMs 62/37 |
> 
> http://www.3dmark.com/3dm11/7998588


your card has hynix? nice memory oc, and it matters most in benches. i think miners like those, too.

let's see what a newer driver can do. when i had the 7900 series cards, a newer driver increased graphics scores by at least 1000 pts.


----------



## Kenshiro 26

You guys able to use Mantle in BF4 with the 14.1 v6 beta driver?

I'm getting crashing and freezing with it, it's stable when I revert to the WHQL 13.12 driver.


----------



## VSG

Quote:


> Originally Posted by *parm123*
> 
> Does someone need Powercolor 290x lcs cards?
> http://www.powercolor.com/ru/products_features.asp?id=521
> 
> I have 2 cards, new, not used, ever not unpacked. If someone interested, please send me private message


You can't post items for sale outside of the marketplace, and to do that you need at least 35 rep for a good reason. Sorry man!


----------



## Roboyto

Quote:


> Originally Posted by *rdr09*
> 
> what did you get with tess on? i only use Trixx . . .
> 
> http://www.3dmark.com/3dm11/7748882


Quote:


> Originally Posted by *Roboyto*
> 
> Haven't tested with tesselation on lately, but best score so far is:
> 
> P16132 | 1265/1725 | +175mV | GPU 44 VRMs 62/37 |
> 
> http://www.3dmark.com/3dm11/7998588


http://www.3dmark.com/3dm11/7998665

Just ran again.

P16307 | 1295/1700 | +200mV | GPU 48 VRMs 68/38 |



Tried with the same settings I used with tessellation off, 1315/1715, and there was significant artifacting/flickering. The score also dropped 6 points to P16301.

Tried again @ 1315/1700, and the artifacting/flickering got worse and the score dropped to P16293. Definitely have hit the wall for the card.


----------



## Roboyto

Quote:


> Originally Posted by *Kenshiro 26*
> 
> You guys able to use Mantle in BF4 with the 14.1 v6 beta driver?
> 
> I'm getting crashing and freezing with it, it's stable when I revert to the WHQL 13.12 driver.


You're not the only one. Keep that 13.12 for a little while longer.


----------



## Mercy4You

Quote:


> Originally Posted by *Kenshiro 26*
> 
> You guys able to use Mantle in BF4 with the 14.1 v6 beta driver?
> 
> I'm getting crashing and freezing with it, it's stable when I revert to the WHQL 13.12 driver.


Yep, here too. Back to 13.12


----------



## Gabkicks

14.1 doesn't work well with my pc either


----------



## maynard14

hi there ! got some questions









hope you could help me. just today i installed the nzxt g10 on my 290x card, and so far great temps with the antec kuhler 620.

im using 2 corsair sp fans in push/pull to pull hot air out of the case.

do you think this is a good set up for the fans? or are there better options.. thank you


----------



## VSG

See if you get better CPU/GPU temps by having the AIO units pulling in cool air from the outside and switching over the case fans to exhaust instead.


----------



## rdr09

Quote:


> Originally Posted by *Roboyto*
> 
> You're not the only one. Keep that 13.12 for a little while longer.


^this or 13.11. you got a nice card Rob.


----------



## maynard14

Quote:


> Originally Posted by *geggeg*
> 
> See if you get better CPU/GPU temps by having the AIO units pulling in cool air from the outside and switching over the case fans to exhaust instead.


ok thanks sir ill try to change my fan settings on the gpu and see if it will give some changes,. thanks

btw heres my cureent set up:


----------



## VSG

At the very least, it makes sense to have the GPU rad on intake. Try having both on intake first and see if anything changes.


----------



## maynard14

Quote:


> Originally Posted by *geggeg*
> 
> At the very least, it makes sense to have the GPU rad on intake. Try having both on intake first and see if anything changes.


ok sir i will change it now







be back shortly


----------



## Roboyto

Quote:


> Originally Posted by *maynard14*
> 
> hi there ! got some questions
> 
> 
> 
> 
> 
> 
> 
> 
> 
> hope you could help me, just today i installed the nzxt g10 for my 290x card,.. and so far great temps with kuhler 620 from antec,.
> 
> im using 2 corsair sp fans as push pull to pull out hot air in the case,..
> 
> do you think this a good set up for the fans? or any other much better options.. thank you




Is it possible to fit the Antec 620 up where I marked? Push/pull exhausting out the back of the case. Then put a fan where the radiator is presently and have all 3 fans in front pull fresh air in and across the card.

Also, it looks like with how the fans line up and where your main PCI-E slot is, you don't get great airflow across the card. If your MoBo will run PCI-E 3.0 x16 in the lower slot then this may help with your temperatures as well?


----------



## VSG

I personally try my best to have rads on intake as much as possible, even if it comes at the cost of higher case temperatures.


----------



## maynard14

wowow it made a big diff sir! while playing crysis 3 max load was 59c, now on intake max temp is 55c!! also my vrm went from 71c to 61c! cant believe it!

hmm i will try the antec kuhler up there where u marked it sir, and see if it makes a diff.

no i cant run the card at 16x on the 3rd or 2nd slot.. it will be 8x only

yes i think intake is best for rads for the air flow! thank you


----------



## VSG

No problem, glad it worked out for you


----------



## Roboyto

Quote:


> Originally Posted by *rdr09*
> 
> your card has hynix? nice memory oc and it matters most in benches. i think miners like those, too.
> 
> let's see what newer driver can do. when i had the 7900 series cards - newer driver increased graphics scores at least 1000 pts.


Quote:


> Originally Posted by *rdr09*
> 
> ^this or 13.11. you got a nice card Rob.


Yes, Hynix RAM. Unbelievable, the 5GHz chips are tickling 7GHz









I did hit the silicon lottery again. I've been extremely fortunate in this regard with nearly all AMD cards I have had.

HIS HD7950 IceQ: super solid with the IceQ blower. 1200+ on core with almost no additional voltage. Never OC'd the RAM as it was a miner card I owned only briefly.

ASUS DC2 7870 & 7770; Both fantastic 1250+ on core and 1500+ on RAM. 7870 had EKFC and 7770 w/ DC2 cooler.

Sapphire HD6790 with stock cooler; fantastic overclocker. Sapphire Vapor-X HD4890; nearly maxed CCC clocks for core/RAM with no extra voltage w/ EKFC block.

I bought my 290 as a pair with another identical card. The first card only went to 1080/1350 with +125mV, compared to this one's 1215/1675 with +125mV.

Only a couple bad experiences with AMD/ATi...
I had an XFX 7970 DD card that was hardware voltage locked and had no OC ability whatsoever. Sapphire HD7970 that was DOA.

And the *worst* card I ever had...

EVGA GTX 670 Superclocked 4GB.

I tested the card with stock blower before putting EKFC on it and it did great with factory boost settings so I figured it was cool. Put EKFC Acetal/Nickel on it and as soon as there was ANY 3D load the thing (coil) whined like a little baby! I couldn't get 10MHz over factory boost clocks on the core, and was so salty I never even attempted to OC the RAM. The card ended up being worthless after about 6 months and would BSOD or black screen under any gaming load. I waited until I was doing an upgrade and got my store credit at MicroCenter. Put the $375 towards my 4770k and Z87 Gryphon board I'm using now.

The last time I had an Nvidia card before that GTX 670 was when I was running SLI 7600 GT, which were great. I used them for years and sold them...still working as far as I know.
*BUT* I don't think I'll ever stray from AMD again for GPUs...


----------



## Roboyto

Quote:


> Originally Posted by *maynard14*
> 
> wowow it made a big diff sir,.. while playing crysis 3 max load is 59c now on intake max temp 55c!! also my vrm , from 71c to 61c! cant believe it!
> 
> hmm i will try the antec kuler up there where u mark it sir,. and see if will make a diff,.
> 
> no i cant run the card at 16x on the 3rd or 2nd slot,.. it will be 8x only
> 
> yes i think best for rads are intake for the air flow! thank you


Nice temp drops









Always best to tinker around with fan orientations. Free performance can be found just by turning/flipping/moving things a little bit


----------



## Roboyto

Quote:


> Originally Posted by *maynard14*
> 
> no i cant run the card at 16x on the 3rd or 2nd slot,.. it will be 8x only


The real question is: is PCIe 3.0 x8 going to bottleneck your card? I don't think it will, so if it gives you better temps it could be worth moving it down.

PCIe 3.0 x8 has the same bandwidth as PCIe 2.0 x16.



http://www.techpowerup.com/reviews/Intel/Ivy_Bridge_PCI-Express_Scaling/11.html

This test used a 7970/680, and there was nearly no measurable gain from PCIe 2.0 x16 vs. PCIe 3.0 x16. You have more horsepower than a 7970 or 680, so...

Here's a GTX 690 for a better comparison:

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/53901-nvidia-geforce-gtx-690-review-25.html

Nearly no advantage even with 16GB/sec. You shouldn't see a performance loss in the x8 slot.
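As a rough sanity check on that bandwidth claim, the per-lane rates can be multiplied out (a sketch only; real-world throughput is a bit lower due to protocol overhead):

```python
# Per-lane raw rates: PCIe 2.0 signals at 5 GT/s with 8b/10b encoding,
# PCIe 3.0 at 8 GT/s with 128b/130b encoding.
def pcie_bandwidth_gbs(lanes, gen):
    per_lane_gbit = {2: 5.0 * 8 / 10, 3: 8.0 * 128 / 130}  # usable Gbit/s per lane
    return per_lane_gbit[gen] * lanes / 8  # GB/s across all lanes

print(pcie_bandwidth_gbs(16, 2))           # 8.0 GB/s for PCIe 2.0 x16
print(round(pcie_bandwidth_gbs(8, 3), 2))  # 7.88 GB/s for PCIe 3.0 x8
```

So a 3.0 x8 slot gives effectively the same pipe as a 2.0 x16 slot, which is why those reviews show no meaningful difference.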


----------



## maynard14

Quote:


> Originally Posted by *Roboyto*
> 
> The real question is the 3.0 8x going to bottleneck your card? I don't think it will, so if it gives you better temps it could be worth trying to move it down.
> 
> PCIe 3.0 x8 is same bandwidth as PCIe 2.0 x16.
> 
> 
> 
> http://www.techpowerup.com/reviews/Intel/Ivy_Bridge_PCI-Express_Scaling/11.html
> 
> This test shows a 7970/680. And there was nearly no measurable gain from PCIe 2.0 x16 ~VS~ PCIe 3.0 x16. You have more horsepower than a 7970 or 680...so
> 
> We look here a GTX 690 for a better comparison.
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/53901-nvidia-geforce-gtx-690-review-25.html
> 
> Nearly no advantage with 16GB/sec. You shouldn't see a performance loss in the x8 slot


last year i had a 7970, i tried putting it in the 3rd slot sir.. and my fps dropped significantly.. i dont know why.. but i remember my board is 16x, 8x and 4x.. but im no expert at this kind of stuff


----------



## VSG

So I came across an interesting video on YouTube today; it's from CES, and the Powercolor rep in it mentions their MSRP for the non-reference 290/290X cards:





Their MSRP is itself very high and in line with pricing from retailers. I tried calling Powercolor to confirm but haven't gotten through to a rep yet.


----------



## Roboyto

Quote:


> Originally Posted by *maynard14*
> 
> last year i have 7970, i tried putting it on the 3rd slot sir,.. and my fps drops significantly,.. i dont j\know why,.. but i remember my board it 16x 8x and 4x,.. but im no expert to this kind of stuff,.


Well, if the bottommost slot is x4, then that would likely bottleneck you, since it's half the bandwidth of x8.

Looks like your board is either 16/4/4 or 8/8/4. You may have to change a BIOS setting to get the second slot to be x8.


----------



## SeanEboy

Quote:


> Originally Posted by *Forceman*
> 
> Have him download and run Compubench or ShaderToyMark - they both are very shader heavy and will quickly highlight the difference between a shader unlocked 290X and a 290 running at 290X speeds. The 290 unlock thread has a bunch of scores posted you can compare to also.
> 
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/0_30
> 
> You can also have him run the Hawaii Info tool posted in that thread, if the results don't match the results from an unlockable card, you'll know it isn't really unlocked. That tool has a 100% success rate in determining unlockable cards.
> 
> But really, considering the going prices, $500 for a even a locked 290 isn't bad.


Here is what I got as a response:
http://i.imgur.com/GJG3ZHk.png


http://imgur.com/xPRjA


Firestrike Extreme

http://www.3dmark.com/3dm/2415455

Graphics Score: 5776

Physics Score: 8148

Combined Score: 2483

Firestrike

http://www.3dmark.com/3dm/2415482

Graphics Score: 12505

Physics Score: 8117

Combined Score: 4825


----------



## Tobiman

Quote:


> Originally Posted by *maynard14*
> 
> hi there ! got some questions
> 
> 
> 
> 
> 
> 
> 
> 
> 
> hope you could help me, just today i installed the nzxt g10 for my 290x card,.. and so far great temps with kuhler 620 from antec,.
> 
> im using 2 corsair sp fans as push pull to pull out hot air in the case,..
> 
> do you think this a good set up for the fans? or any other much better options.. thank you


If you have problems with vram temps, get a fan to blow some air on it.


----------



## Brian18741

Quote:


> Originally Posted by *Roboyto*
> 
> _VRM diagram_


Quote:


> Originally Posted by *Roboyto*
> 
> Pics are using TriptCC brackets, which are essentially Kraken G10s. You can Xfire with them
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just have to have a spot to fit the two 120mm AIO rads. Important to keep in mind the AIO rad dimension is taller than a 120mm Fan!
> 
> I've made the mistake of thinking I could install one in an older case and it wouldn't attach to the rear 120mm fan spot because there wasn't enough height. He had no optical drive, so I ended up using industrial strength velcro to hold the radiator in place in the optical drive bay area and pointed the fan to exhaust out the front of the case...worked very well actually lol


Very helpful, thanks man! Sorry I missed this the first time, this thread moves fast! Will look into it! +rep









Quote:


> Originally Posted by *Abyssic*
> 
> nice system man ^^ have you swapped out your tim?


Thanks man! I plan to swap the TIM out this weekend, don't have time this week with work, will update results!

Quote:


> Originally Posted by *rdr09*
> 
> Very nice rig.
> 
> edit: wait, 750W?


Quote:


> Originally Posted by *rdr09*
> 
> 750w for 2 290s. tsk, tsk.


Thanks. Yeah, 750W. I have it plugged into a power monitor and I draw an average of 550W to 650W during gaming. I did manage a brief spike to about 720W after 10 min of Furmark, but that's fairly unrealistic for everyday use, including 3D gaming. I will eventually upgrade, as I want to overvolt these bad boys when they're under water and see what they're made of, but for now, at stock voltage, 750W is fine. And this is with an i5 3570K at 4.5GHz!

Quote:


> Originally Posted by *Gabkicks*
> 
> My 2nd R9 290 came today
> 
> 
> 
> 
> 
> 
> 
> 2 Sapphire R9 290 Tri-X cards
> 
> 
> Spoiler: Warning: Spoiler!


Nice! What are your temps like?


----------



## maynard14

Quote:


> Originally Posted by *Roboyto*
> 
> Well if the bottom most slot is x4 than that would likely bottleneck you since it's half as much bandwidth as x8.
> 
> Looks like your board is either 16/4/4 or 8/8/4. You may have to change a BIOS setting to get the second slot to be x8.


i see.. i will search for it tomorrow and test if my 290x won't be bottlenecked by my mobo's pcie slots. thanks again sir


----------



## maynard14

Quote:


> Originally Posted by *Tobiman*
> 
> If you have problems with vram temps, get a fan to blow some air on it.


i already tried sir :



but vrm 1 still goes up to 80c while in 290x mode.. while in 290 mode 71c is my max vrm temp

guess i have to really buy some vrm heatsinks and paste them permanently onto the gpu's board


----------



## Roboyto

Quote:


> Originally Posted by *Brian18741*
> 
> Very helpful, thanks man! Sorry I missed this the first time, this thread moves fast! Will look into it! +rep


No problem, glad it's useful information. Thanks for the rep!

Looking forward to seeing where your temps end up this weekend.

Just got my Fujipoly Ultra Extreme pads in the mail! Will be tearing down shortly to put them in.

This thread does move fast, lots of good information though.


----------



## Roboyto

Quote:


> Originally Posted by *maynard14*
> 
> i already tried sir :
> 
> 
> 
> but still vrm 1 still goes unto 80c while on 290x mod,.. while on 290 mode 71c is my max vrm temp,.
> 
> guess i have to really buy some vrm heatsinks and paste them permenently into the gpus board


What kind of clocks/voltages are you pushing with VRMs at that temperature? Those VRMs get really hot, especially when you start pushing past factory speeds/voltages.

My VRMs are in the low 60s with +125mV and clocks at 1255/1675, and that's with a full-cover waterblock.


----------



## phallacy

Roboyto, are the clocks you are running stable both in the games you play and 3dmark11 / heaven / valley ? I seem to be stuck at 1200/1600 with +75mV offset and 50 power limit. I tried going up to the max of +200mV and bumping up to 1250/1700 but I was artifacting like crazy. Would love some advice on how to break through this OC wall if possible : )


----------



## Prozillah

Quote:


> Originally Posted by *Roboyto*
> 
> Make sure you cleaned out Nvidia drivers thoroughly.
> 
> Seems everyone is experiencing issues with those 14.1 drivers. I would say revert back to what was working better. I'm still on 13.12, with a single 290, and haven't had issues with anything yet.
> 
> ULPS off should definitely help your cause. Trixx has option to force constant voltage, this may help as well.
> 
> How are your temps looking in Xfire?
> 
> They're still ironing out the kinks with Mantle unfortunately. Should have been expected IMHO, it's an entirely new API so there's bound to be speed bumps and pot holes along the way. Another thing to consider as well is Mantle wasn't designed around enthusiast level hardware like 290(X). It 's primary goals are to help boost performance for AMD APUs, weaker CPUs, and lower tier GCN graphics cards. I'm not sure if folks with the lower-end hardware are having these same issues?
> 
> They will get it functioning properly though, just need to give it a little time.


Thanks man - ended up doing a full clean install back to the 13.12 drivers and that made all the difference! I've got really good airflow in my case with ambient temps around 24c, and during a full Unigine Heaven run with both cards at 100% load the max was 84c on the top card, 70c on the bottom. Aggressive fan curve though - I've set them to 100% once it hits 75c.

Gaming in BF4 the top card temp was bouncing between 63-65, so there's definitely headroom to overclock a bit. Truthfully though I haven't had the chance to have a good long gaming sesh yet. But so far, after getting past that driver issue, I'm pretty chuffed with the performance. Back on 13.12 on DirectX, I mean.


----------



## Jurge92

Hi.

I have a Sapphire R9 290 Tri-X, and want to OC it for the first time. What would be the recommended software and clocks for it? I currently have both MSI Afterburner as well as TriXX software.
My current temps never go beyond 50 degrees Celsius at load (normally around 40 under load). Idle is around ~30 (sometimes around 25) degrees.

My usage is only for gaming at the time being.


----------



## Roboyto

Quote:


> Originally Posted by *phallacy*
> 
> Roboyto, are the clocks you are running stable both in the games you play and 3dmark11 / heaven / valley ? I seem to be stuck at 1200/1600 with +75mV offset and 50 power limit. I tried going up to the max of +200mV and bumping up to 1250/1700 but I was artifacting like crazy. Would love some advice on how to break through this OC wall if possible : )


The only 2 games I've played with high OC have been FFXIV and Elder Scrolls Online Beta. Both are pretty demanding titles as far as GPU is concerned.

There is only a small difference in Core/RAM max speeds between benchmarks. Some will allow a little more here or there...just requires time to test.

I can't get on ESO anymore, but for FFXIV I'm 100% stable with 1255/1675 +125mV offset. I haven't tried playing with my new found highest clocks with +200mV.

I think FFXIV is a great overall system test as I end up with slightly lower stable clocks compared to 3DMark/Unigine.

My suggestion is:

+50% power limit
*don't try to make large jumps in MHz like that*.
If you're 100% good at 1200/1600 then *work the core then memory individually* in 10MHz jumps.
When you start to have issues at a certain core speed, go back to your previously good setting and then start working on the RAM.
When you encounter issues there, then it is time for more voltage.
*Add voltage in small increments*; I was working in 5-7mV increases.
Once you have applied more voltage you *then start over again*. Small jumps in core frequency and then RAM, then more volts.
Rinse and repeat.

*It will be time consuming*, but *your clock speeds are already pretty good all things considered*. I worked all settings up little by little to help keep temperatures in check since I did 2/3 of my benchmark runs with the stock blower fan @ a deafening 75%.

Up until the last few days my best clocks were achieved at +125mV; I was somewhat unsure of what would happen with +200mV until I saw a few other people post with that much voltage. Then I started to experiment past +125mV. I did get some extra performance going up to +200mV, but I won't ever be running the card constantly at +200mV, since the performance gain was minimal and I don't want my card blowing up anytime soon; I plan on keeping this thing for years to come.
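The stepping approach above is basically a hill-climb: raise one knob in small steps until instability, back off, then move to the next knob, then add a little voltage and repeat. A minimal sketch (the stability checks here are hypothetical stand-ins for an actual benchmark pass, not a real API):

```python
def step_up(start_mhz, step_mhz, is_stable):
    """Raise a clock in small increments until the stability check fails,
    then return the last known-good value."""
    clock = start_mhz
    while is_stable(clock + step_mhz):
        clock += step_mhz
    return clock

# Hypothetical card: at this voltage it holds 1250 core / 1660 mem at most.
core = step_up(1200, 10, lambda mhz: mhz <= 1250)
mem = step_up(1600, 10, lambda mhz: mhz <= 1660)
print(core, mem)  # 1250 1660 -- then add a small voltage bump and start over
```

In practice `is_stable` is a full benchmark or gaming run at each step, which is exactly why the process is so time consuming.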

I've run the following benches at least this many times, from what I have recorded in my spreadsheet:


FFXIV Benchmark: 30
Unigine Heaven: 8
Unigine Valley: 42
3DMark11 Performance Presets: 61
Unigine Valley: 41
3DMark11 Extreme Presets: 38

If I had to guesstimate, between the GPU and fine-tuning my 4770k to minimal voltage with RAM @ 2400MHz, I'm probably 100 hours into tweaking.


----------



## phallacy

Quote:


> Originally Posted by *Roboyto*
> 
> The only 2 games I've played with high OC have been FFXIV and Elder Scrolls Online Beta. Both are pretty demanding titles as far as GPU is concerned.
> 
> I can't get on ESO anymore, but for FFXIV I'm 100% stable with 1255/1675 +125mV offset. I haven't tried playing with my new found highest clocks with +200mV.
> 
> I think FFXIV is a great overall system test as I end up with slightly lower stable clocks compared to 3DMark/Unigine.
> 
> My suggestion is:
> 
> 
> *don't try to make large jumps in MHz like that*.
> If you're 100% good at 1200/1600 then *work the core then memory individually* in 10MHz jumps.
> When you start to have issues at a certain core speed, go back to your previously good setting and then start working on the RAM.
> When you encounter issues there, then it is time for more voltage.
> *Add voltage in small increments*; I was working in 5-7mV increases.
> Once you have applied more voltage you *then start over again*. Small jumps in core frequency and then RAM, then more volts.
> Rinse and repeat.
> 
> *It will be time consuming*, but *your clock speeds are already pretty good all things considered*. I worked all settings up little by little to help keep temperatures in check since I did 2/3 of my benchmark runs with the stock blower fan @ a deafening 75%.
> 
> Up until the last few days my best clocks were achieved at +125mV, I was somewhat unsure of what would happen with +200mV until I saw a few other people post with that much voltage. Then I started to experiment past +125mV. I did get some extra performance going up to +200mV, but I won't ever be running the card constantly at +200mV since the performance gain was minimal and I don't want my card blowing up anytime soon.; I plan on keeping this thing for years to come.
> 
> I've ran the following benches at least this many times from what I have recorded in my spreadsheet:
> 
> FFXIV Benchmark: 30
> 
> Unigine Heaven: 8
> 
> Uningine Valley: 42
> 
> 3DMark11 Performance Presets: 61
> 
> Uningine Valley: 41
> 
> 3DMark11 Extreme Presets: 38
> 
> If I could guesstimate between the GPU and fine tuning my 4770k to minimal voltage with RAM @ 2400MHz., I'm probably 100 hours into tweaking.


Cool, much appreciated + rep. Will try those steps this weekend when I add my third 290x into the loop. From looking over your build, it seems we have pretty similar specs. What kind of OC are you running on your 4770k with voltage? Did that affect your gpu OC at all?


----------



## Roboyto

Quote:


> Originally Posted by *Prozillah*
> 
> Thanks man - ending up doing a full clean install back to 13.12 drivers and that made all the difference! I've got really good air flow in my case with ambient temps around 24c and during a full unigine heaven run both running 100% load it was 84c on the top, 70c on the bottom was max. Aggressive fan curve though ive set them to 100% once it hits 75c.
> 
> Gaming in bf4 top card temp was bouncing betwen 63 - 65 so definitely headroom to overlock a bit. Truthfully though i havent had the chance to have a good long gaming sesh yet. But so far after getting past that driver issue I'm pretty chuffed with the performance. Back on 13.12 on direct x i mean.


Glad you're back to functioning. This hobby can be extremely frustrating at times!

If you're only at 65 on the top card while gaming you should have a fair bit of room to work with as far as OC. The downside to Xfire is that overclocking both cards simultaneously can be a PITA. I've never personally had a Xfire system, but from what I've read and seen with my friend's Xfire 7970s, it is usually best to find the limits of each card individually. Once you know what each card can do, you can decide on a baseline to run both cards at, then work up from there. Odds are you won't be able to run both cards at the same frequencies they did individually, especially when you factor in heat.

Make sure you keep an eye on VRM1 temps as well. The core can take the heat, but once those VRMs start climbing to 90+ you can start having stability issues.

A great read for OCing 290(X) Xfire is Gunderman456's build log - he is *EXTREMELY* thorough, and lots of information can be found there:

http://www.overclock.net/t/1442038/build-log-the-hawaiian-heat-wave

I know how you feel with not having a decent gaming session. If you scope out my build log, I've had a looooong road getting to where I'm at presently and I still have a bit more work to do.


----------



## taem

My Powercolor PCS+ 290 is a little screamer! 1200 core, 1500 mem, 4670k @ 4.6

3dmark11 performance:

http://www.3dmark.com/3dm11/7991243

Fire Strike:

http://www.3dmark.com/fs/1740273

Fire Strike Extreme:

http://www.3dmark.com/fs/1740238

Ranks at 3dmark for 4670k + single 290, valid driver:
#16 for 3dmark11
#6 for fire strike
#18 for fire strike extreme

Valley ExtremeHD & Heaven Extreme:


Temps are pretty awesome too; here they are after 50 loops of the Metro LL bench at maxed settings, ambient 22-23:
Stock: 65c core, 74c vrm1, 49c vrm2
OC 1200 core/1500 mem: 67c core, 85c vrm1, 49c vrm2



Vrm1 at 85c at oc clocks may seem high but I've yet to find a game that gets me temps as high as MetroLL bench maxed on loop. Gaming vrm1 temp is much lower.


----------



## Paul17041993

Quote:


> Originally Posted by *Mercy4You*
> 
> It's amazing to learn that there are still people believing the blackscreens are not hardware related...


Generally it's what the card does to prevent damage, similar to how a BSOD works; it just cuts the power and leaves the fan running.

It can be caused by a bad OC, defective units, overheating, or a driver sending bad values.


----------



## cam51037

Well my replacement 290 Tri-X came in today - no fan rattle so far! Success!!!

I tested turning the fans up slowly in Trixx from 20-100%, then played a round of BF4 - no rattling yet. Oh, and an ASIC score of 79.3%, compared to the old card's 73.2%.


----------



## Roboyto

Quote:


> Originally Posted by *phallacy*
> 
> Cool, much appreciated + rep. Will try those steps this weekend when I add my third 290x into the loop. From looking over your build, it seems we have pretty similar specs. What kind of OC are you running on your 4770k with voltage? Did that affect your gpu OC at all?


I had my CPU overclock/voltage figured out before I even delidded or had it under the full water loop surprisingly enough. The delid, and then direct-to-die mount only helped temperatures.

I found my current CPU settings with an Antec 620 using (1) Corsair SP120.

Presently:

4.5GHz 1.259V 100*45
2x8GB Dominator Platinum 2400MHz 1.65V
Intel Burn Test Temps with fans on Silent mode through Thermal Radar 2: 72 | 76 | 75 | 70
During my craziest benchmark for 3dMark11 the CPU maxed at: 57 | 57 | 57 | 55
During gaming load the CPU and GPU don't break 50C.

The CPU hasn't affected anything with the GPU clocking, since I knew it was 100% stable before I tested the GPU at all.


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> My Powercolor PCS+ 290 is a little screamer! 1200 core, 1500 mem, 4670k @ 4.6
> 
> 3dmark11 performance:
> 
> http://www.3dmark.com/3dm11/7991243
> 
> Fire Strike:
> 
> http://www.3dmark.com/fs/1740273
> 
> Fire Strike Extreme:
> 
> http://www.3dmark.com/fs/1740238
> 
> Ranks at 3dmark for 4670k + single 290, valid driver:
> #16 for 3dmark11
> #6 for fire strike
> #18 for fire strike extreme
> 
> Valley ExtremeHD & Heaven Extreme:
> 
> Vrm1 at 85c at oc clocks may seem high but I've yet to find a game that gets me temps as high as MetroLL bench maxed on loop. Gaming vrm1 temp is much lower.


Looking good!

I'm about to shut down and swap out thermal pads for VRMs...I'll let you know what kind of temp benefits there are with this Fujipoly Ultra Extreme.


----------



## Roboyto

Quote:


> Originally Posted by *cam51037*
> 
> Well my replacement 290 Tri-X came in today - no fan rattle so far! Success!!!
> 
> I tested turning the fans up in Trixx slowly from 20-100%, and a round of BF4 and no rattling yet.
> 
> 
> 
> 
> 
> 
> 
> Oh, and an ASIC score of 79.3%, compared to the old card's 73.2%.


Glad to hear you're issue-free so far with the new card









We'd all love to see some benchmark and temperature results


----------



## cam51037

Quote:


> Originally Posted by *Roboyto*
> 
> Glad to hear you're issue free so far with new card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> We'd all love to see some benchmark and temperature results


Sure thing! Just from my BF4 testing here are some results:

Max temp: 71C

Max fan speed: 64%

VRM1 Max temp: 69C

VRM2 Max temp: 54C

Stock clocks and voltage, gets around 55FPS max settings BF4, 1440p









EDIT: Also just ran Fire Strike on it, here's the result: http://www.3dmark.com/3dm/2495065

2600k @ 4.6GHz and R9 290 @1150/1475


----------



## Tobiman

Quote:


> Originally Posted by *taem*
> 
> My Powercolor PCS+ 290 is a little screamer! 1200 core, 1500 mem, 4670k @ 4.6
> 
> Ranks at 3dmark for 4670k + single 290, valid driver:
> #16 for 3dmark11
> #6 for fire strike
> #18 for fire strike extreme
> 
> Valley ExtremeHD & Heaven Extreme:
> 
> 
> Temps are pretty awesome too, here after 50 loop of Metro LL bench maxed settings, ambient 22-23:
> Stock: 65c core, 74c vrm1, 49c vrm2
> OC 1200 core/1500 mem: 67c core, 85c vrm1, 49c vrm2
> 
> Vrm1 at 85c at oc clocks may seem high but I've yet to find a game that gets me temps as high as MetroLL bench maxed on loop. Gaming vrm1 temp is much lower.


Those are pretty good numbers. As for me, I'm having a hard time getting my memory to be 1500mhz stable (it's elpida). I even tried increasing core voltage but no bueno.


----------



## Coree

I've been thinking about this for a while.. why is the VRAM rated to work @ 1500MHz 1.5V (6GHz effective) on all R9 290s? Is there some reason why they are downclocked.. and is there any way to control the memory voltage? 384 GB/s of 100% stable memory would be awesome though.


----------



## taem

Quote:


> Originally Posted by *Tobiman*
> 
> Those are pretty good numbers. As for me, I'm having a hard time getting my memory to be 1500mhz stable (it's elpida). I even tried increasing core voltage but no bueno.


The one thing I don't like about this card is locked mem voltage, we might be able to bump mem clock more with a tweak. Since it's dual bios I'm thinking about trying to find a bios that will unlock mem voltage tweaking. Dunno if that's possible. I need to find out if this card has identical bios on both slots, hope so.

Right now I'm playing with undervolting since I'll be pairing this with a Tri-X and this card has +50mV core at stock while the Tri-X does not. What I'm finding is that I can increase core clock while undervolting but not mem. Core is king though so that's much much better than the other way around, mem clocks are really only significant on some benches, no practical effect in games.


----------



## Tobiman

Quote:


> Originally Posted by *taem*
> 
> The one thing I don't like about this card is locked mem voltage, we might be able to bump mem clock more with a tweak. Since it's dual bios I'm thinking about trying to find a bios that will unlock mem voltage tweaking. Dunno if that's possible. I need to find out if this card has identical bios on both slots, hope so.
> 
> Right now I'm playing with undervolting since I'll be pairing this with a Tri-X and this card has +50mV core at stock while the Tri-X does not. What I'm finding is that I can increase core clock while undervolting but not mem. Core is king though so that's much much better than the other way around, mem clocks are really only significant on some benches, no practical effect in games.


Exactly what I was thinking. My core is stable at 1150 with the stock +50mV but the memory just falls on its head whenever I try to change it. Seems we now have both the same processor and gpu, so I'll be comparing my scores to yours when I get a stable setup going.


----------



## Heinz68

Quote:


> Originally Posted by *Roboyto*
> 
> http://www.3dmark.com/3dm11/7998665
> 
> Just ran again.
> 
> P16307 | 1295/1700 | +200mV | GPU 48 VRMs 68/38 |
> 
> 
> 
> Tried with same settings I used with tessealtion off, 1315/1715, and there was significant artifacting/flickering. Score also dropped 6 points to P16301.
> 
> Tried again @ 1315/1700, and artifacting/flickering got worse and score dropped to P16293. Definitely have hit the wall for the card.


Very nice GPU overclock. I only maxed at 1,225/1,650 MHz with +200 mV - not too bad, I think, with Crossfire on air cooling.

If you added the P16132 to the 3DMARK 11(P) HOF comparison you would get the *# 23 (single card)* rating. Pretty good on a benchmark dominated by Nvidia cards due to the tessellation. I'm sure with an i7-3930K or 4930K you would easily hit the top 10 with a much higher Physics Score.

I ran 3DMARK 11(P) with tessellation on and got rated *# 8 in the SLI/Crossfire chart* - as I said, not too bad for air cooling. It was actually # 6 for at least a couple of days.


----------



## Paul17041993

Quote:


> Originally Posted by *Coree*
> 
> I've been thinking this for a while.. why the VRAM is rated to work @ 1500Mhz 1.5V (6Ghz effective) on all R9 290's? Is there some reason why they are downclocked.. and are there any ways to control the memory voltage? 384 GB/s 100% stable memory would be awesome though.


Power efficiency and binning; and 320GB/s on the 512-bit bus is enough for 4K, so you only really need to OC it for extra frames at that res or to get the most out of benchmarks. There isn't really much that makes use of ~400GB/s of memory bandwidth, but it can still help with various things. Whether or not you want to run the memory that high is up to you, and AMD generally gives people the option by using 1500MHz-rated ICs on the reference PCB.
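For reference, those numbers fall straight out of the GDDR5 math - four bits per pin per clock, times the bus width (a quick sketch; clocks in MHz):

```python
def gddr5_bandwidth_gbs(mem_clock_mhz, bus_width_bits):
    """GDDR5 is quad-pumped: effective per-pin rate is 4x the memory clock."""
    per_pin_gbit = mem_clock_mhz * 4 / 1000   # Gbit/s per pin
    return per_pin_gbit * bus_width_bits / 8  # GB/s across the bus

print(gddr5_bandwidth_gbs(1250, 512))  # 320.0 GB/s at the stock 1250MHz
print(gddr5_bandwidth_gbs(1500, 512))  # 384.0 GB/s at the ICs' rated 1500MHz
```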

Quote:


> Originally Posted by *taem*
> 
> The one thing I don't like about this card is locked mem voltage, we might be able to bump mem clock more with a tweak. Since it's dual bios I'm thinking about trying to find a bios that will unlock mem voltage tweaking. Dunno if that's possible. I need to find out if this card has identical bios on both slots, hope so.


Memory voltage control isn't really needed on these cards; the majority of the ICs can handle 1600MHz (100 over spec) at the stock 1.5V without issues, and most 7970s could go up to 1700. It's more limited by the controller than the actual ICs.

For 1800+MHz, on the other hand (entering subzero territory), a PCB and/or BIOS mod would likely be in play anyway.


----------



## Arizonian

Quote:


> Originally Posted by *Panther Al*
> 
> Coming over to spend some time with the Red Team:
> 
> 2 Sapphire 290x's, Stock all around (Will eventually go under water, because my god are they loud).
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *ReXtN*
> 
> Can I join the Owners Club?
> http://www.techpowerup.com/gpuz/kngrn/
> 
> MSI R9 290X Referance design with EK Full Cover Waterblock.


Congrats - added
















Quote:


> Originally Posted by *Widde*
> 
> Can I get updated to msi/sapphire? ^^ http://piclair.com/vju0d
> 
> 2 R9 290s ^_^


All good - updated


----------



## maynard14

Quote:


> Originally Posted by *Roboyto*
> 
> What kind of clocks/voltages are you pushing with VRMs at that temperature? Those VRMs get hot hot, especially when you start pushing past factory speeds/voltages.
> 
> My VRMs are in the low 60s with +125mV and clocks at 1255/1675 and that's with full cover waterblock.


hi sir, good morning. i am running the stock xfx 290x bios and my vrms are so high @ 80c, so i think my vrm heatsinks suck.. haha, but i will try to re-apply them and see if i get a better vrm temp result


----------



## Roboyto

Quote:


> Originally Posted by *Heinz68*
> 
> Very nice GPU overclock. I only maxed at 1,225 /1,650 MHz with +200 mV, think not to bad with Crossfire on air cooling.
> 
> If you added the P16132 to compare 3DMARK *11(P) HOF you would get # 23 (single card)* rating. Pretty good on benchmark dominated by Nvidia cards due to the tessellation. I'm sure with i7-3930K or 4930K you would easy hit top 10 with much higher Physics Score.
> 
> I run the 3DMARK 11(P) tessellation on and got rated *# 8 in SLI/Crossfire chart* as I said not too bad for air cooling. Was actually # 6 for at least couple days


Yeah, I've already noticed that. A Socket 2011 CPU is just too expensive to justify.

What air cooler are you using? Impressive to hold those clocks/volts on Air and keep VRM temps in check.


----------



## Roboyto

Quote:


> Originally Posted by *maynard14*
> 
> hi sir goodmorning, i am running stock xfx 290x bios and my vrm are so high @ 80c so i think my vrm heatsinks sucks,.. haha but i will try to re apply them and see if ii get a better vrm temp result


80C is still well within operating temperature, but it never hurts to try and bring the temps down


----------



## taem

Quote:


> Originally Posted by *Paul17041993*
> 
> memory voltage control isn't really needed on these cards, the majority of ICs can handle 1600MHz (100 over spec) on the 1.5V without issues, most 7970s could go up to 1700, its more limited by the controller then the actual ICs.


Really? This is my first 290. But historically I've found that tweaking mem volt has allowed me to clock higher. And higher mem clocks have allowed me to get to higher core clocks.
Quote:


> Originally Posted by *Roboyto*
> 
> Looking good!
> 
> I'm about to shut down and swap out thermal pads for VRMs...I'll let you know what kind of temp benefits there are with this Fujipoly Ultra Extreme.


Dude! Your temps are frosty already! Stop it and use the money for a burger or something. What was the graphics sub score on your insane 3dmark11 score btw?


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> Dude! Your temps are frosty already! Stop it and use the money for a burger or something. What was the graphics sub score on your insane 3dmark11 score btw?


Core and VRM2 were frosty. VRM1 could use a little improvement. Chiknnwatrmln's VRM1 temps are 20C under mine, and he's using the Fujipoly Ultra Extreme pads on the VRMs. I want to know if they're worth the price premium, and whether the EK block has better VRM1 cooling than my XSPC.

It was something like 21528... you can go back and find it, or I'll check once I'm back up and running. I'm bleeding air out of the loop right now.

I'm half Italian and from Chicago, I'd prefer pizza :-D


----------



## IBIubbleTea

What kind of kh/s are you able to get with an R9 290?


----------



## Roboyto

Quote:


> Originally Posted by *IBIubbleTea*
> 
> What kind of kh/s are you able to get with an R9 290?


You can find lots of information here:

http://www.overclock.net/t/1437876/290-and-290x-litecoin-mining-performance


----------



## maynard14

Quote:


> Originally Posted by *Roboyto*
> 
> 80C is still well within operating temperature, but it never hurts to try and bring the temps down


yeah sir... tried this setup and here's my temp on the VRMs while playing Crysis 3 for at least 10 min





and you were right sir, it does give me normal use on my GPU even in the 2nd slot. It's weird; when I'm using the 7970 my fps drops significantly in the 2nd slot



and in the 2nd slot my 290 shows x8


----------



## Roboyto

Quote:


> Originally Posted by *maynard14*
> 
> yeah sir... tried this setup and here's my temp on the VRMs while playing Crysis 3 for at least 10 min
> 
> and you were right sir, it does give me normal use on my GPU even in the 2nd slot. It's weird; when I'm using the 7970 my fps drops significantly in the 2nd slot


Looking good. What clocks/voltage you at?

That is odd that it doesn't work correctly with the 7970... not sure about that one *scratches head*


----------



## maynard14

Quote:


> Originally Posted by *Roboyto*
> 
> Looking good. What clocks/voltage you at?


hi sir, I'm using the stock 290 BIOS, which is 0.984V idle and 1.242V load, at 947/1250 speeds. I will try the 290X BIOS later and see the VRM temps using this setup







thanks again


----------



## Roboyto

Quote:


> Originally Posted by *maynard14*
> 
> hi sir, I'm using the stock 290 BIOS, which is 0.984V idle and 1.242V load, at 947/1250 speeds. I will try the 290X BIOS later and see the VRM temps using this setup
> 
> 
> 
> 
> 
> 
> 
> thanks again


No need to call me sir, haha


----------



## Mercy4You

Quote:


> Originally Posted by *Mercy4You*
> 
> It's amazing to learn that there are still people believing the blackscreens are not hardware related...


Quote:


> Originally Posted by *Paul17041993*
> 
> generally it's what the card does to prevent damage, similar to how a BSOD works; it just cuts the power and leaves the fan running.
> 
> Can be caused by a bad OC, defective units, overheating or a driver sending bad values.


Sure, and that's the reason AMD is dead silent about the black screens affecting at least a few hundred cards. What do you think would happen if they admitted the reason (which I am 100% sure they know) is, say, faulty memory modules, regulators or another piece of hardware?


----------



## pkrexer

Quote:


> Originally Posted by *Roboyto*
> 
> Core and VRM2 were frosty. VRM1 could use a little improvement. Chiknnwatrmln's VRM1 temps are 20C under mine, and he's using the Fujipoly Ultra Extreme pads on the VRMs. I want to know if they're worth the price premium, and whether the EK block has better VRM1 cooling than my XSPC.
> 
> It was something like 21528... you can go back and find it, or I'll check once I'm back up and running. I'm bleeding air out of the loop right now.
> 
> I'm half Italian and from Chicago, I'd prefer pizza :-D


So when are you going to slap those new pads on? I'm anxious to see your results, as my pads won't be arriving for a few days yet.


----------



## Roboyto

Quote:


> Originally Posted by *pkrexer*
> 
> So when are you going to slap those new pads on? I'm anxious to see your results, as my pads won't be arriving for a few days yet.


Pads are on








Been bleeding air out of the loop for about the last hour or so... almost done.

I'll have temperature results posted in a couple hours at best along with some other information about installing the pads.


----------



## Paul17041993

Quote:


> Originally Posted by *Mercy4You*
> 
> Sure, and that's the reason AMD is dead silent about the black screens affecting at least a few hundred cards. What do you think would happen if they admitted the reason (which I am 100% sure they know) is, say, faulty memory modules, regulators or another piece of hardware?


they would end up with bad attention from the media etc., as usually happens when a product has a common problem. That's generally why they changed their driver plans to stop releasing a version every month; it was releasing too many bugs into the public domain. They also have an extensive bug-report form system for the beta drivers in case issues arise from the patches in the experimental beta releases.

Though I've yet to encounter a black-screen fault outside of unstable overclocking, so I haven't had the chance to debug it myself...


----------



## Mercy4You

Quote:


> Originally Posted by *Mercy4You*
> 
> Sure, and that's the reason AMD is dead silent about the black screens affecting at least a few hundred cards. What do you think would happen if they admitted the reason (which I am 100% sure they know) is, say, faulty memory modules, regulators or another piece of hardware?


Quote:


> Originally Posted by *Paul17041993*
> 
> they would end up with bad attention from the media etc., as usually happens when a product has a common problem. That's generally why they changed their driver plans to stop releasing a version every month; it was releasing too many bugs into the public domain. They also have an extensive bug-report form system for the beta drivers in case issues arise from the patches in the experimental beta releases.


Yes, and many people who ever saw anything that looked like a black screen would send back their cards, because of the limited warranty.

Hell, even people who didn't have an issue yet would send back their cards. Now it's a couple of hundred RMAs; then it would be a couple of thousand...


----------



## MapRef41N93W

So I purchased a Tri-X 290X from TigerDirect for 649.99 today. Just wondering, though: if I purchase an XFX 290 and unlock it to a 290X, would it work in CrossFire with the Sapphire? Unlocking it does technically make it a 290X, right?


----------



## Paul17041993

Quote:


> Originally Posted by *MapRef41N93W*
> 
> So I purchased a Tri-X 290X from TigerDirect for 649.99 today. Just wondering, though: if I purchase an XFX 290 and unlock it to a 290X, would it work in CrossFire with the Sapphire? Unlocking it does technically make it a 290X, right?


it would work in CrossFire regardless. And no, a 290X BIOS doesn't actually unlock the shaders UNLESS you are actually LUCKY enough to get a true 290X flashed with a 290 BIOS; no idea how many are left out there, if any...


----------



## MapRef41N93W

Quote:


> Originally Posted by *Paul17041993*
> 
> it would work in CrossFire regardless. And no, a 290X BIOS doesn't actually unlock the shaders UNLESS you are actually LUCKY enough to get a true 290X flashed with a 290 BIOS; no idea how many are left out there, if any...


Right, I forgot that with CrossFire you can run both cards as essentially 290s. From reading the unlock thread on here, it seems like there are still some XFX 290s floating around that can unlock.


----------



## Roboyto

Quote:


> Originally Posted by *kpoeticg*
> 
> Thanx for the reply +1
> 
> I was gonna order the Fujipoly Ultra Extreme since that's what everybody seems to recommend. I noticed that the instructions for the block only recommended TIM on the ram and not pads, but wasn't sure if pads would help. I'm stuck with a BSOD Elpida so trying to get as much out of it as possible. I figured Fujipoly would be better than the strip that's in the package, but might not order it now if i don't need it for the ram at all.


Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> You two need to tell me all about your findings . Got it


Quote:


> Originally Posted by *pkrexer*
> 
> So when are you going to slap those new pads on? I'm anxious to see your results, as my pads won't be arriving for a few days yet.


Quote:


> Originally Posted by *Forceman*
> 
> It may be the bolded part. I thought one of the knocks on the XSPC blocks was that they had a somewhat smaller VRM cooling channel than the EKs do.


Quote:


> Originally Posted by *Fahrenheit85*
> 
> Waiting on the Fujipoly from the UPS man any day this week.


Quote:


> Originally Posted by *kizwan*
> 
> Just to make sure my eyes are working properly: Fujipoly 11W/mK is $7.49 while Fujipoly 17W/mK is $25.99 at FrozenCPU. Why the huge difference? Is it always like this? If I go with 11W/mK, do I still get a big improvement in VRM1 temp?


Quote:


> Originally Posted by *Krusher33*
> 
> Geez those are expensive. Is there not any that are pretty good but easier on the wallet?


Quote:


> Originally Posted by *rdr09*
> 
> i am only cooling one 290. my vrm1 is about 3C cooler than vrm2 and about 5C cooler than the core. i used CLU for the core and *Fujipoly for the vrms*.


Quote:


> Originally Posted by *ZealotKi11er*
> 
> Is this the right stuff: http://www.frozencpu.com/scan/MM=c6eddef83858bd6b4bdb1d55e353f6a6:30:59:30.html?mv_more_ip=1&mv_nextpage=search_results_list&mv_profile=keyword_search&mv_searchspec=Fujipoly&mv_arg=


Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I think it's called FujiPoly Extreme. They have 3 different varieties, Ultra Extreme is the best and Extreme is number 2.


Quote:


> Originally Posted by *Gunderman456*
> 
> So knowing all this, I only need to purchase one Fujipoly Extreme System Builder Thermal Pad - 1/4 Sheet - 150 x 100 x 1.0mm - Thermal Conductivity 11.0 W/mK , which I just did by the way for $22.99 from FrozenCPU. With shipping and entering the OCN 5% savings code it came to $29.81.
> 
> You just saved me some money since I was about to buy 2 x 1mm and 2 x 0.5 mm sheets (thinking I'd also have to cover the VRAM) and now knowing this, I can get by with one 1mm sheet which will do both cards nicely.


*Upgraded thermal pads on XSPC Razor Block*

Alright, so I have installed the Fujipoly Ultra Extreme 17.0 W/mK pads on both VRM1 and VRM2. I have also installed the Fujipoly Extreme 11.0 W/mK on the RAM chips.

I did not use any TIM to assist in thermal conductivity on the pads. Chiknnwatrmln also did not use any TIM with his EK block. With the Ultra Extreme he is seeing roughly 20C lower temps on VRM1. So I am doing this write up to see what kind of a difference there is going from stock XSPC pads to FUE, and to see which block does a better job cooling VRM1.

*Here's what it looked like with stock pads.*


*I ordered the FUE in 15x100 for the VRMs and FE in the 150x100 for the RAM chips.*

*The 150x100 is enough to probably do 3 whole cards with a little extra left over. If you wanted to get better than stock results and not spend $60 for both types you could easily get away with one of these.*


*The 15x100 FUE is more than enough to do the VRMs on these cards. If you cut them thin enough you might have enough to do 3 cards.*

*I examined the stock pad contact and found some spots that weren't being 100% covered.*


*I took the XSPC stock pads off and laid them on the Fujipoly to use them as a template and enlarged as needed.*







*My awesome Husky 'titanium' coated scissors worked marvelously to cut both versions of the pads.*



*Minimal trimming was needed to make all pieces fit on RAM really nice.*





*At this point I thought it would be nice for everyone else to have a template of their own, so I put the pads on my scanner







*

R9290XThermalPadTemplate.jpg 1077k .jpg file


R9290XThermalPadTemplate.pdf 1055k .pdf file

*Either should print on a whole sheet of 8.5x11 paper so they are the correct size. I would imagine these would work on other blocks aside from XSPC, but that will be up to others to explore.*

*Time to clean the VRMs and RAM chips*

*I chose to use Arcticlean TIM remover/purifier as I figured it could easily remove the residue from the other pads. Chiknnwatrmln suggested Isopropyl Alcohol or a Pencil Eraser.*


*Time to lay the pads onto the card.*

*One more cut was required when I got to this point. I did not want to pick up the Fujipoly Extreme after it was on the card as it felt somewhat fragile. I ended up taking a precision flat head screwdriver to cut the pad where I needed. It required very little force to cut through the FE.*



*Tada!*


*A few miscellaneous pointers before I end this rant...*
*
Wash your hands before handling these. The less oil from your skin the better; you could even wear gloves I suppose.

The Fujipoly Ultra Extreme has thinner, more pliable plastic on both sides and is easy to remove from either side.

The Fujipoly Extreme is like all other thermal pads I have used in the past. One side is thin, pliable, has a checkered pattern, and is easy to remove. The other side has a thicker, harder, clear plastic and is more difficult to remove. Take the hard plastic off first so you don't struggle removing it and touch the pads more than necessary.*


*Temperature results coming shortly







*


----------



## VSG

Great post, +1

I usually cut the pads to fit individual memory chips. While this takes time, it saves a lot of expensive pad material that eventually can't be reused. The pads are easy to tear but easy to re-join as well.


----------



## brazilianloser

Yeah man good post. Got my sheet of Fujipoly on the way to put on my two 290s.


----------



## phallacy

Looking forward to your temp results. If it's as much as a 20C drop on VRM1 I will definitely consider going the Fujipoly route.


----------



## VSG

If it helps, I have sold Fujipoly Extreme pads to other R9 290X owners along with the EK blocks, and they said they noticed an average 10C drop on the VRMs with these pads compared to the stock EK pads. Right now I have them on my 780 Ti KPEs and so far so good; I can't compare them against anything, but I am pretty happy.


----------



## tsm106

Quote:


> Originally Posted by *Roboyto*


Dude, that's a serious waste of Fujis. You're not supposed to cut them out using the block as a template. XSPC does it that way because they use cheap-ass pads, so the waste is nothing and the time saved is worth it.

People, use the memory chips and VRM chips as the template. You will make your Fuji sheets last a lot longer this way. The Fuji pads DO NOT LAST LONG, so don't waste them. Once mounted and heat-cured, they will not come off the card easily; most often they crack when someone tries to take them off. And you don't have to put TIM on them either. That said, I hate using TIM anyway because it often stains the memory chips, leaving behind absolute proof you put a block on the card, which matters if you have a non-friendly warranty.


----------



## VSG

It's not that bad; if you are patient you can piece most of the torn pads together and reuse them. But I agree on cutting for each chip separately and not using any TIM.


----------



## Roboyto

Quote:


> Originally Posted by *tsm106*
> 
> Dude, that's a serious waste of Fujis. You're not supposed to cut them out using the block as a template. XSPC does it that way because they use cheap-ass pads, so the waste is nothing and the time saved is worth it.
> 
> People, use the memory chips and VRM chips as the template. You will make your Fuji sheets last a lot longer this way. The Fuji pads DO NOT LAST LONG, so don't waste them. Once mounted and heat-cured, they will not come off the card easily; most often they crack when someone tries to take them off. And you don't have to put TIM on them either. That said, I hate using TIM anyway because it often stains the memory chips, leaving behind absolute proof you put a block on the card, which matters if you have a non-friendly warranty.


I don't think you really read my post very well...

I did not use the block as a template. I took the XSPC pads and used them as a template. They weren't making 100% contact in certain areas, which I highlighted in one of the pictures. I enlarged the cuts accordingly to make sure everything was completely covered by thermal pad.

You may think it's wasteful, but cutting 16 individual squares for the RAM would be a PITA. The 150x100 sheet is enough to cover the RAM chips, cut the way I did, at least 3 times.

Cutting tiny individual squares for the VRMs would be even more of a PITA since they would have to be so small.

I did not use TIM to apply the thermal pads.
Quote:


> Originally Posted by *Roboyto*
> 
> *I did not use any TIM to assist in thermal conductivity on the pads.*
> 
> *I ordered the FUE in 15x100 for the VRMs and FE in the 150x100 for the RAM chips.*
> 
> *The 150x100 is enough to probably do 3 whole cards with a little extra left over. If you wanted to get better than stock results and not spend $60 for both types you could easily get away with one of these.*
> 
> 
> *The 15x100 FUE is more than enough to do the VRMs on these cards. If you cut them thin enough you might have enough to do 3 cards.*
> 
> *I examined the stock pad contact and found some spots that weren't being 100% covered.*


----------



## tsm106

You cut pads along the contact surfaces of the block. Looks like using the block as a guide to me.

The VRM strip... you simply cut a thin strip the width of the VRMs. Pretty straightforward, no?


----------



## Roboyto

*Temperature Results from Fujipoly Ultra Extreme on VRMs*

Screenshot of VRM temps with FUE for 3DMark11 Performance Presets 1295/1700 +200mV


All observed temperatures, using the same settings before and after the thermal pad switch


*On average for 3DMark11 Performance/Xtreme Presets, Unigine Heaven/Valley Extreme Presets, and FFXIV Benchmark Maximum Settings, temperatures decreased:

VRM1 23.25% VRM2 12.15%*
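For anyone who wants to redo the math on their own logs: each run's drop is (before − after) / before × 100, averaged across runs. A minimal sketch (the readings below are placeholders, not the actual logged temps, and `avg_percent_drop` is just an illustrative name):

```python
# Sketch: average percent temperature drop across several benchmark runs.
# The before/after readings here are made-up placeholders; substitute your own.

def avg_percent_drop(before: list[float], after: list[float]) -> float:
    """Mean of per-run percent decreases: (before - after) / before * 100."""
    drops = [(b - a) / b * 100 for b, a in zip(before, after)]
    return sum(drops) / len(drops)

vrm1_before = [80, 82, 78]   # stock-pad temps (C), hypothetical
vrm1_after  = [60, 63, 61]   # Fujipoly temps (C), hypothetical
print(round(avg_percent_drop(vrm1_before, vrm1_after), 2))
```

Feed it one before/after pair per benchmark preset and it gives the same kind of averaged figure as above.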


----------



## Roboyto

Quote:


> Originally Posted by *tsm106*
> 
> You cut pads along the contact surfaces of the block. Looks like using the block as a guide to me.
> 
> The VRM strip... you simply cut a thin strip the width of the VRMs. Pretty straightforward, no?


I did cut a strip for the VRMs.



If you skim the lengthy post, you could assume I cut pads along the contact surface of the block, but that is not how I went about it.




I have plenty of extra left over, no worries.

Good to know that this stuff dries out though. How long should I expect before that happens?


----------



## combine1237

Hello, got my card 2 weeks ago and just wanted to post it on here.

1. GPU-Z validation link:

http://www.techpowerup.com/gpuz/w6fua/

2. XFX 290X Double Dissipation Black Edition R9-290X-EDBD

3. Double Dissipation Aftermarket.

Edit: Spelling


----------



## nightfox

Oh, can't believe I didn't join this yet.

Can I join?

http://www.techpowerup.com/gpuz/vsm7c/

All ref R9 290s

Unfortunately not even one of them unlocks


----------



## Arizonian

Quote:


> Originally Posted by *combine1237*
> 
> Hello, got my card 2 weeks ago and just wanted to post it on here.
> 
> 1. Gpuz Validation link:
> 
> http://www.techpowerup.com/gpuz/w6fua/
> 
> 2. Xfx 290x Double Dissipation Black edition R9-290X-EDBD
> 
> 3. Double Dissipation Aftermarket.
> 
> Edit: Spelling


Congrats - added









Quote:


> Originally Posted by *nightfox*
> 
> Oh, can't believe I didn't join this yet.
> 
> Can I join?
> 
> http://www.techpowerup.com/gpuz/vsm7c/
> 
> All ref R9 290s
> 
> Unfortunately not even one of them unlocks


Never too late. Congrats - added


----------



## standardhlozek

Hi guys. Does anybody know what the critical temps are on the R9 290's VRMs? I have a Prolimatech MK-26 with Alpenföhn heatsinks and 140mm fans. I plug the fans into the card; before that I controlled them via an external controller. My issue is that I am getting black screens when gaming. The temp of the chips doesn't exceed 70 degrees under heavy load, so I think that is not the problem. I worry that the issue is that damn Elpida memory. The GPU is a Gigabyte, OC'd to 1000 on the core, running the 13.11 beta 9.5 driver (this driver solved my issue with black screens in BF4).


----------



## Mercy4You

Quote:


> Originally Posted by *standardhlozek*
> 
> Hi guys. Does anybody know what the critical temps are on the R9 290's VRMs?


126 C


----------



## taem

Quote:


> Originally Posted by *standardhlozek*
> 
> Hi guys. Does anybody know what the critical temps are on the R9 290's VRMs? I have a Prolimatech MK-26 with Alpenföhn heatsinks and 140mm fans. I plug the fans into the card; before that I controlled them via an external controller. My issue is that I am getting black screens when gaming. The temp of the chips doesn't exceed 70 degrees under heavy load, so I think that is not the problem. I worry that the issue is that damn Elpida memory. The GPU is a Gigabyte, OC'd to 1000 on the core, running the 13.11 beta 9.5 driver (this driver solved my issue with black screens in BF4).


What are your mem clocks and voltage? Power limit? I sometimes get black screens when I'm trying to find stable clocks and minimum voltage and set the clock too high or the volts too low. Like when I set my clocks at 1300 core, 1700 mem, on stock voltage: went black screen the moment I launched Valley lol.

Everyone hates on Elpida. So many say Hynix is better; shrug, I take them at their word. But I have an Elpida 290 right now, and the mem clocks to 1600 easy. My last two cards were a Sapphire 280X with Hynix and an Asus 280X with Hynix, and I couldn't make either of their mem clocks budge without screen glitches. Maybe Hynix is better, but this whole issue is so overblown.


----------



## Mercy4You

Quote:


> Originally Posted by *taem*
> 
> Everyone hates on Elpida. Maybe Hynix is better, but this whole issue is so overblown.


I'm an Elpida hater







, but the truth is that most Elpida are just fine









I don't think the whole issue is overblown. There is a lot of speculation about these memory modules; it started here, I guess...
http://www.overclock.net/t/1441349/290-290x-black-screen-poll

AMD chose to give it very little attention, so we all had to find our own solutions. That's too bad, because in my opinion Elpida got roasted while AMD was to blame. They obviously should have done more testing with their flagship instead of rushing it to market.


----------



## Paul17041993

Well, one thing I just found out today with the 14.1 beta drivers:

Minecraft loses its tile and model rendering... how odd...


----------



## BradleyW

Elpida is fine. I've seen just as many Hynix users have issues. But when OCN says something is bad, everyone follows that opinion, even if they have no idea what's going on.


----------



## Prozillah

Update on CrossFire Gigabyte 290 Windforce temps:

BF4 64-player multiplayer seems sweet, but 5 passes of the Metro Last Light bench kicks their arse. At stock clocks I get a slight amount of throttling on the top card; it dips about 10-20MHz. It maxes at 84C and doesn't go hotter, but VRM1 maxes at 91C, so that's definitely getting up there. The cards in CrossFire will do 1120 on the core with no power % increase or extra volts, but then the throttle becomes much more noticeable, with drops to around 900-1000MHz and so on.

I'm going to try a nice powerful 120mm fan on the side of the case to see if I can pull as much hot air away as possible (blowing at my feet isn't really going to be enjoyable, but I'll deal with it...) and see if that helps temps at all.

HOWEVER, considering I've always tuned my OCs to the programs I use and play, I may increase the OC if the temps stay low in BF4. The goal is the highest core clock I can run without reaching 83C.

If anyone has a pair of these and can offer any tips or tricks, send them my way!


----------



## Imprezzion

Well, on an Nvidia GTX 7xx you absolutely don't want Elpida, as it barely leaves room for +100MHz of OC headroom...

On my unlocked 290 (XFX), however, the Elpida ain't even half bad and runs and clocks just fine.

Btw, does anyone know why my card scales so poorly with voltage? When I use the ASUS BIOS and do +50mV (about 1.12V load) and unlimited power limit, it does 1140MHz core. Not too bad. However, when I go all the way to, say, +200mV (1.21V load) it still won't get anywhere near 1200MHz. Like, 1180MHz or so is the max, if that's even stable to begin with..

Now, why would it respond SO badly to voltage??

It's cooled with an Accelero Xtreme III and a cut-up stock cooler for the VRMs.
Temps @ +50mV 1140Mhz core:
Core: ~60c
VRM1: ~72c
VRM2: ~ 54c

Temps @ +200mV 1175Mhz core:
Core: ~70c
VRM1: ~85c, high but well within limits afaik
VRM2: ~ 56c


----------



## BradleyW

Quote:


> Originally Posted by *Imprezzion*
> 
> Well, on an Nvidia GTX 7xx you absolutely don't want Elpida, as it barely leaves room for +100MHz of OC headroom...
> 
> On my unlocked 290 (XFX), however, the Elpida ain't even half bad and runs and clocks just fine.
> 
> Btw, does anyone know why my card scales so poorly with voltage? When I use the ASUS BIOS and do +50mV (about 1.12V load) and unlimited power limit, it does 1140MHz core. Not too bad. However, when I go all the way to, say, +200mV (1.21V load) it still won't get anywhere near 1200MHz. Like, 1180MHz or so is the max, if that's even stable to begin with..
> 
> Now, why would it respond SO badly to voltage??
> 
> It's cooled with an Accelero Xtreme III and a cut-up stock cooler for the VRMs.
> Temps @ +50mV 1140Mhz core:
> Core: ~60c
> VRM1: ~72c
> VRM2: ~ 54c
> 
> Temps @ +200mV 1175Mhz core:
> Core: ~70c
> VRM1: ~85c, high but well within limits afaik
> VRM2: ~ 56c


You might have just reached your card's overclocking limit. Too much voltage can sometimes cause instability.


----------



## Sgt Bilko

Looks like MSI have the Lightning finished









"We know a lot of you have been waiting for it and it's almost here! This upcoming beauty has what you need. MORE power, MORE speed, MORE performance We've finished our work on the fastest R9 290X and it could be in your rig in just a few weeks."



Source: https://www.facebook.com/photo.php?fbid=648935501809797


----------



## Maracus

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Looks like MSI have the Lightning finished
> 
> 
> 
> 
> 
> 
> 
> 
> 
> "We know a lot of you have been waiting for it and it's almost here! This upcoming beauty has what you need. MORE power, MORE speed, MORE performance We've finished our work on the fastest R9 290X and it could be in your rig in just a few weeks."
> 
> 
> 
> Source: https://www.facebook.com/photo.php?fbid=648935501809797


Nice, can't wait to see what they can achieve OC-wise; hopefully they can iron out some of the vdroop.


----------



## szeged

oh my goooooooddddddddddddddd msi 290x lightning! finally! my kingpin card is about to make a red brother.


----------



## Sgt Bilko

Quote:


> Originally Posted by *szeged*
> 
> oh my goooooooddddddddddddddd msi 290x lightning! finally! my kingpin card is about to make a red brother.


Thought you might be happy about that









EDIT: So "a few weeks" sounds like mid-to-late March then.


----------



## Durquavian

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Thought you might be happy about that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: So a few weeks time sounds about mid-late March then.


Another guy got an email from them saying late March or early April, but they were shooting for late March.


----------



## Ukkooh

Any news on WC blocks for the Lightnings?


----------



## VSG

I hope the card releases at a fair price, but I realize there is a good chance it will end up running at stock clocks, undervolted, in a mining rig as well.


----------



## anubis1127

Quote:


> Originally Posted by *Ukkooh*
> 
> Any news for wc blocks for the lightnings?


EK will probably make one for it, but nothing on their website about it yet.


----------



## VSG

Tiborr and Derick had both alluded to it, but the Asus 290X DCUII is still not out yet, so I wouldn't hold my breath.


----------



## Brian18741

Quote:


> Originally Posted by *Roboyto*
> 
> temperatures decreased:
> 
> VRM1 23.25% VRM2 12.15%


Excellent results man, thanks for sharing! Hopefully I can get even half that result on the core with the TIM change!

In other news, EK have confirmed the DCU II waterblock will fit both the 290x and non X version and should be out in about 4 weeks!


----------



## kizwan

Quote:


> Originally Posted by *taem*
> 
> My Powercolor PCS+ 290 is a little screamer! 1200 core, 1500 mem, 4670k @ 4.6
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 3dmark11 performance:
> 
> http://www.3dmark.com/3dm11/7991243
> 
> Fire Strike:
> 
> http://www.3dmark.com/fs/1740273
> 
> Fire Strike Extreme:
> 
> http://www.3dmark.com/fs/1740238
> 
> 
> 
> Ranks at 3dmark for 4670k + single 290, valid driver:
> #16 for 3dmark11
> #6 for fire strike
> #18 for fire strike extreme
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Valley ExtremeHD & Heaven Extreme:
> 
> 
> Temps are pretty awesome too; here, after 50 loops of the Metro LL bench at maxed settings, ambient 22-23C:
> Stock: 65c core, 74c vrm1, 49c vrm2
> OC 1200 core/1500 mem: 67c core, 85c vrm1, 49c vrm2
> 
> 
> 
> 
> 
> VRM1 at 85C at OC clocks may seem high, but I've yet to find a game that gets temps as high as the Metro LL bench maxed on loop. Gaming VRM1 temps are much lower.


For 3DMark 11 (P):
- #71 overall
- #19 for single card
- #39 for dual/crossfire
Quote:


> Originally Posted by *Coree*
> 
> I've been thinking this for a while.. why the VRAM is rated to work @ 1500Mhz 1.5V (6Ghz effective) on all R9 290's? Is there some reason why they are downclocked.. and are there any ways to control the memory voltage? 384 GB/s 100% stable memory would be awesome though.


Probably the memory voltage is only set to 1.35V, but I could be wrong, because we can do a 1500MHz OC easily and stably too.
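Coree's 384 GB/s figure is just the GDDR5 arithmetic: the effective rate is 4x the memory clock, multiplied by the 290/290X's 512-bit bus. A quick sketch (the function name is my own, not from any tool mentioned in this thread):

```python
# GDDR5 peak bandwidth: the memory transfers 4 bits per pin per clock,
# and the 290/290X has a 512-bit bus. Function name is illustrative.

def gddr5_bandwidth_gbps(mem_clock_mhz: float, bus_width_bits: int = 512) -> float:
    """Peak bandwidth in GB/s for GDDR5 (quad data rate)."""
    effective_rate_hz = mem_clock_mhz * 1e6 * 4          # e.g. 1250 MHz -> 5 GHz effective
    return effective_rate_hz * bus_width_bits / 8 / 1e9  # bits/s -> bytes/s -> GB/s

print(gddr5_bandwidth_gbps(1250))  # stock 290X: 320.0 GB/s
print(gddr5_bandwidth_gbps(1500))  # 1500 MHz OC: 384.0 GB/s
```

So a 1500MHz (6GHz effective) memory clock does land exactly on the 384 GB/s mentioned above.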
Quote:


> Originally Posted by *Roboyto*
> 
> *Temperature Results from Fujipoly Ultra Extreme on VRMs*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Screenshot of VRM temps with FUE for 3DMark11 Performance Presets 1295/1700 +200mV
> 
> 
> All observed temperatures before and after using same settings before and after thermal pad switch
> 
> 
> 
> 
> *On average for 3DMark11 Performance/Xtreme Presets, Unigine Heaven/Valley Extreme Presets, and FFXIV Benchmark Maximum Settings, temperatures decreased:
> 
> VRM1 23.25% VRM2 12.15%*


Thanks. That is a good result. VRM1 is able to stay in the 50s with FUE vs. the 80s with the stock pads.

I didn't get the 17 W/mK, but I got the 11 W/mK instead. For the amount I paid for shipping, it has still been circulating around NY for 4 days.







I don't know how many sorting facilities it needs to go through.


----------



## Redvineal

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *Upgraded thermal pads on XSPC Razor Block*
> 
> Alright, so I have installed the Fujipoly Ultra Extreme 17.0 W/mK pads on both VRM1 and VRM2. I have also installed the Fujipoly Extreme 11.0 W/mK on the RAM chips.
> 
> I did not use any TIM to assist in thermal conductivity on the pads. Chiknnwatrmln also did not use any TIM with his EK block. With the Ultra Extreme he is seeing roughly 20C lower temps on VRM1. So I am doing this write up to see what kind of a difference there is going from stock XSPC pads to FUE, and to see which block does a better job cooling VRM1.
> 
> *Here's what it looked like with stock pads.*
> 
> 
> *I ordered the FUE in 15x100 for the VRMs and FE in the 150x100 for the RAM chips.*
> 
> *The 150x100 is enough to probably do 3 whole cards with a little extra left over. If you wanted to get better than stock results and not spend $60 for both types you could easily get away with one of these.*
> 
> 
> *The 15x100 FUE is more than enough to do the VRMs on these cards. If you cut them thin enough you might have enough to do 3 cards.*
> 
> *I examined the stock pad contact and found some spots that weren't being 100% covered.*
> 
> 
> *I took the XSPC stock pads off and laid them on the Fujipoly to use them as a template and enlarged as needed.*
> 
> 
> 
> 
> 
> 
> 
> *My awesome Husky 'titanium' coated scissors worked marvelously to cut both versions of the pads.*
> 
> 
> 
> *Minimal trimming was needed to make all pieces fit on RAM really nice.*
> 
> 
> 
> 
> 
> *At this point I thought it would be nice for everyone else to have a template of their own, so I put the pads on my scanner
> 
> 
> 
> 
> 
> 
> 
> *
> 
> R9290XThermalPadTemplate.jpg 1077k .jpg file
> 
> 
> R9290XThermalPadTemplate.pdf 1055k .pdf file
> 
> *Either should print on a whole sheet of 8.5x11 paper so they are the correct size. I would imagine these would work on other blocks aside from XSPC, but that will be up to others to explore.*
> 
> *Time to clean the VRMs and RAM chips*
> 
> *I chose to use Arcticlean TIM remover/purifier as I figured it could easily remove the residue from the other pads. Chiknnwatrmln suggested Isopropyl Alcohol or a Pencil Eraser.*
> 
> 
> *Time to lay the pads onto the card.*
> 
> *One more cut was required when I got to this point. I did not want to pick up the Fujipoly Extreme after it was on the card as it felt somewhat fragile. I ended up taking a precision flat head screwdriver to cut the pad where I needed. It required very little force to cut through the FE.*
> 
> 
> 
> *Tada!*
> 
> 
> *A few miscellaneous pointers before I end this rant...*
> *
> Wash your hands before handling these. The less oil from your skin the better; you could even wear gloves I suppose.
> 
> The Fujipoly Ultra Extreme has thinner, more pliable plastic on both sides and is easy to remove from either side.
> 
> The Fujipoly Extreme is like all other thermal pads I have used in the past. One side is thin, pliable, has a checkered pattern, and is easy to remove. The other side has a thicker, harder, clear plastic and is more difficult to remove. Take the hard plastic off first so you don't struggle removing it and touch the pads more than necessary.*
> 
> 
> *Temperature results coming shortly
> 
> 
> 
> 
> 
> 
> 
> *


+rep coming your way for such detailed info, and results in a later post. You've certainly convinced me to drop the cash on some new pads for my 3 blocks.

A few suggestions: since your post will likely be linked and referenced by many, it may be good to mention the pad thickness (I'm assuming 1.0mm across the board) in the opening statement. Also, maybe some links to the exact pads you bought would be helpful for those looking for a quick reference.









anyways, great work!


----------



## taem

Quote:


> Originally Posted by *Prozillah*
> 
> Update on Xfire Gigabyte 290's Windforce temps:
> 
> BF4 64-player multi seems sweet, but 5 passes of the Metro: Last Light bench kicks their arse. At stock clocks I get a slight amount of throttling on the top card: it dips about 10-20MHz. It maxes at 84C and doesn't go hotter, but VRM1 maxes at 91C, so that's definitely getting up there. Now the cards in Crossfire will do 1120 on the core with no power % increase or extra volts, but the throttling becomes much more noticeable, with drops to around 900-1000MHz and so on.
> 
> I'm going to try a nice powerful 120mm fan on the side of the case to see if I can pull as much hot air away as possible (blowing at my feet isn't really going to be enjoyable, but I'll deal with it...) and see if that helps temps at all.
> 
> HOWEVER - considering I've always run my OC's to the programs I use & play I may increase the OC if the temps stay low in BF4. The goal being the highest core clock I can go without reaching 83c.
> 
> If anyone has a pair of these and can offer any tips or tricks send it my way!


I'm on single Powercolor PCS+ card, waiting on 290 Tri-X, but a few thoughts.

When running the Metro: LL bench as a test, you need more than 5 passes. If I get a problem, it's almost always at pass 8-10, and then again at 15-18; happens consistently. As for throttling, are you sure it's actually throttling? What OC app are you using, and what are your settings? I had Afterburner set up wrong and the clock was fluctuating: you need "Extend Official Overclocking Limits" off and "Unofficial Overclocking Mode" set to disabled. I was seeing the same thing, clock set to 1200, which I knew was stable, but operating at 900-1000. It has to do with PowerTune.
Quote:


> Originally Posted by *Imprezzion*
> 
> Well, on an Nvidia GTX 7xx you absolutely don't want Elpida, as it barely leaves room for +100MHz of OC...
> 
> On my unlocked 290 (XFX), however, the Elpida ain't even half bad and runs and clocks just fine.
> 
> Btw, does anyone know why my card scales so poorly with voltage? When I use the ASUS BIOS and do +50mV (about 1.12V load) and an unlimited power limit, it does 1140MHz core. Not too bad. However, when I go all the way to, say, +200mV (1.21V load), it still won't get anywhere near 1200MHz. 1180MHz or so is the max, if that's even stable to begin with..
> 
> Now, why would it respond SO bad on voltage??
> 
> It's cooled with a Accelero Xtreme III and cut up stock cooler for VRM's.
> Temps @ +50mV 1140Mhz core:
> Core: ~60c
> VRM1: ~72c
> VRM2: ~ 54c
> 
> Temps @ +200mV 1175Mhz core:
> Core: ~70c
> VRM1: ~85c, high but well within limits afaik
> VRM2: ~ 56c


Yeah, I also find that clock gains from increased voltage scale very poorly on my 290. So much so that I went back down to stock voltage (the Powercolor PCS+ is set to +50mV in BIOS anyway) and am running 1150 core / 1550 memory. It's not worth going to +100mV just to run 1200 core / 1600 mem when the gains are quite small for noticeably higher temps and power draw. Also, I have Elpida and it runs 1600 perfectly.
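The poor payoff both of you are seeing matches the usual CMOS rule of thumb that dynamic power grows with frequency times voltage squared, so a small clock bump bought with a big voltage bump costs disproportionate heat. A quick sketch (the 1150MHz/1.15V baseline and the +100mV figure are assumptions for illustration, not anyone's measured VID):

```python
def relative_power(freq_mhz, vcore, base_freq_mhz=1150.0, base_vcore=1.15):
    """Rule-of-thumb dynamic power scaling: P ~ f * V^2, relative to a baseline."""
    return (freq_mhz / base_freq_mhz) * (vcore / base_vcore) ** 2

# Hypothetical example: +50MHz core bought with +100mV
gain = relative_power(1200.0, 1.25)
print(round(gain, 3))  # ~1.233, i.e. roughly 23% more power for ~4% more clock
```

Leakage also rises with temperature, so the real-world increase is usually worse than this estimate.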

Your temps are great! Why do some folks bash XFX? The cards look great, have the best warranty, and look at your temps! I guess my Powercolor with its massive cooler isn't so special lol. The core and VRM2 are actually frosty, but VRM1 runs hot; I wonder if the heatsink isn't seated right. I know VRM1 always runs hottest, but it's out of range relative to the core and VRM2.


----------



## SeanJ76

LOL at the 290x jumping $350.00 in price this past week!!! The 780Ti kills it at $150.00 less!!! AMD must be having some issues with yields to have to do this.
http://www.anandtech.com/show/7758/radeon-r9-290x-retail-prices-hit-900


----------



## rdr09

Quote:


> Originally Posted by *SeanJ76*
> 
> LOL at the 290x jumping $350.00 in price this past week!!! The 780Ti kills it at $150.00 less!!! AMD must be having some issues with yields to have to do this.
> http://www.anandtech.com/show/7758/radeon-r9-290x-retail-prices-hit-900


i know, right?

my 290 was $400 and i might be able to sell it for $500 in the bay.

i remember the 780 was selling for $700 . . . now $450.


----------



## anubis1127

Quote:


> Originally Posted by *rdr09*
> 
> i know, right?
> 
> my 290 was $400 and i might be able to sell it for $500 in the bay.
> 
> i remember the 780 was selling for $700 . . . now $450.


Probably more like $550-600. R9 280Xs (7970s) are up to $500 (even the crappy blue-PCB Sapphire models).

Crazy cryptocurrency world we are living in.


----------



## rdr09

Quote:


> Originally Posted by *anubis1127*
> 
> Probably more like $550-600. R9 280Xs (7970s) are up to $500 (even crappy blue pcb sapphire models).
> 
> Crazy cryptocurrency world we are living in.


I was going to buy the 670, but the 320.18 driver came out and turned me off. I ended up with this . . .



greedy. sorry.


----------



## pkrexer

Quote:


> Originally Posted by *Roboyto*
> 
> *Temperature Results from Fujipoly Ultra Extreme on VRMs*
> 
> Screenshot of VRM temps with FUE for 3DMark11 Performance Presets 1295/1700 +200mV
> 
> 
> All observed temperatures before and after using same settings before and after thermal pad switch
> 
> 
> *On average for 3DMark11 Performance/Xtreme Presets, Unigine Heaven/Valley Extreme Presets, and FFXIV Benchmark Maximum Settings, temperatures decreased:
> 
> VRM1 23.25% VRM2 12.15%*


Excellent results, makes me happy I went ahead and bought them. I'm assuming that with a little cure time, the temps will possibly even get a bit better?


----------



## anubis1127

Haha, I should have sold my 7950 on eBay, but I get so scared of scammer buyers, and of morons who ruin cards flashing the vBIOS for mining performance, that I sold it on OCN for not very much.


----------



## chronicfx

I hate eBay so much. I have my 7990 up right now, and all I can think about is someone buying it and then, 5 days later: "I would like a refund."


----------



## Roboyto

Quote:


> Originally Posted by *Brian18741*
> 
> Excellent results man, thanks for sharing! Hopefully I can get even half that result on the core with the TIM change!
> 
> In other news, EK have confirmed the DCU II waterblock will fit both the 290x and non X version and should be out in about 4 weeks!


You should see a solid benefit from it. I would be very surprised if temps didn't come down a helpful amount.

Glad to hear that it will fit the 290 as well! They looked identical but it's always good to have verification.


----------



## MrWhiteRX7

I sold my old 7970s and 7950s on Amazon, and sure enough, a few weeks later I get a guy saying "ohh, it won't post, I have tried everything"... That stupid punk obviously flashed a BIOS and hard-locked it, and of course AMAZON AWARDS HIM buyer protection. I get the card back and it's got a different BIOS. Amazon doesn't care.

I'm never selling a GPU again, except maybe on here, and that's it.


----------



## rdr09

Quote:


> Originally Posted by *anubis1127*
> 
> Haha, I should have sold my 7950 on ebay, I just get scared with scammer buyers, and morons that ruin cards flashing vbios for mining performance, that I sold it on OCN for not very much.


Mine went smoothly. The 7950 lasted for 2 minutes. Sold. So I posted my 7970 for $375 and was surprised it lasted 24hrs.

I tried to sell the 7970 here on OCN for $250, but the buyer backed out after finding out about PowerColor's warranty policy.

edit: anubis, that 7950 had a $30 rebate, and both cards came with 3 games each.


----------



## chronicfx

I sold my 7990 2 months ago to a litecoin miner with a large farm. He sent it back and it never worked the same again, saying it hard-locked on him. I had to RMA it, only getting it back yesterday, and they replaced my nice Malta with a three-slot monstrosity. Who is going to want this?


----------



## anubis1127

Quote:


> Originally Posted by *rdr09*
> 
> mine went smooth. the 7950 lasted for 2 mins. Sold. so, i posted my 7970 for $375 and i was surprised it lasted 24hrs.
> 
> i tried to sell the 7970 here in ocn for $250 but the buyer backed out after finding out powercolor's warranty policy.
> 
> edit: anubis, that 7950 had a $30 rebate and both cards have 3 games each.


Haha, nice, did you actually send in the rebate?

Quote:


> Originally Posted by *chronicfx*
> 
> I sold my 7990 2 months ago to a litecoin miner with a large farm. He sent it back and it never worked the same again, saying it hard-locked on him. I had to RMA it, only getting it back yesterday, and they replaced my nice Malta with a three-slot monstrosity. Who is going to want this?


Yeah, hearing stories like that are what dissuaded me from selling on ebay during the mining craze.

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I sold my old 7970's and 7950's on Amazon and sure enough a few weeks later I get a guy saying "ohh it won't post. I have tried everything"... that stupid punk obviously flashed a bios and hardlocked it and of course AMAZON AWARDS HIM buyer protection. I get the card and it's got different bios. Amazon doesn't care.
> 
> I'm never selling a gpu again, except maybe on here and that's it.


Ouch, that is a bummer, sounds about par for the course. Buyer Protection could easily be renamed Scam the Seller.


----------



## rdr09

Quote:


> Originally Posted by *anubis1127*
> 
> Haha, nice, did you actually send in the rebate?
> 
> Yeah, hearing stories like that are what dissuaded me from selling on ebay during the mining craze.
> Ouch, that is a bummer, sounds about par for the course. Buyer Protection could easily be renamed Scam the Seller.


rebates are for real . . .

http://images10.newegg.com/uploadfilesfornewegg/rebate/SH/MSI10MIRsAug1Aug3113ej11.pdf


----------



## The Mac

I sold my 7970 Dual-X on eBay to some dude in Georgia for $415 back in December.

He never returned it.

I looked at his buying record and didn't see any other GPUs being purchased, so I guess he wasn't a miner.

I still wonder why he paid so much for it.


----------



## Roboyto

Quote:


> Originally Posted by *name*
> 
> Thanks, that's a good result. VRM1 is able to stay in the 50s with the FUE vs. the 80s with the stock pad.
> 
> I didn't get the 17 W/mK pads; I got 11 W/mK instead. And for what I paid for shipping, the package has still been circulating around NY for 4 days.
> 
> 
> 
> 
> 
> 
> 
> I don't know how many sorting facilities it needs to go through.


That's what I put on the RAM; I still need to push it a little harder to see if it helped the RAM's capabilities.

With the way the weather has been, if it shipped USPS or FedEx you'll likely be waiting. Everything I have ordered via FedEx or USPS since the snow started has been delayed.


----------



## Roboyto

Quote:


> Originally Posted by *Redvineal*
> 
> +rep coming your way for such detailed info, and results in a later post. You've certainly convinced me to drop the cash on some new pads for my 3 blocks.
> 
> A few suggestions: since your post will likely be linked and referenced by many, it may be good to mention the pad thickness (I'm assuming 1.0mm across the board) in the opening statement. Also, maybe some links to the exact pads you bought would be helpful for those looking for a quick reference.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> anyways, great work!


Thanks for constructive criticism. I will update the post later today.


----------



## Roboyto

Quote:


> Originally Posted by *pkrexer*
> 
> Excellent results, makes me happy I went ahead and bought them. I'm assuming that with a little cure time, the temps will possibly even get a bit better?


No one has mentioned better temps over time. But that's a great idea, I will monitor them in the weeks to come.


----------



## pkrexer

Quote:


> Originally Posted by *Roboyto*
> 
> That's what I put on the RAM, I still need to try and push it s little harder to see if it helped RAM capabilities.
> 
> With the way the weather has been if it shipped USPS or FedEx you'll likely be waiting. Everything I have ordered, via FedEx or USPS, since the snow started has been delayed.


Did you gain any stability by lowering your VRM1 temps?


----------



## Roboyto

Quote:


> Originally Posted by *pkrexer*
> 
> Did you gain any stability by lowering your VRM1 temps?


I only tested at the same exact clocks/voltages in each bench to make an accurate comparison of temperatures. I will try my luck in the days to come. I'll be on vacation this coming week, so it will give me time to play around a little more... Not sure there is much left to squeeze out of this one, though; ~1300/~1700 is an immense overclock for a reference card.


----------



## cam51037

I too am thinking about selling my 290 Tri-X. I may have a buyer at $650-$700; I'd then plan to go for dual GTX 780s for $890 shipped from EVGA (B-stock).

It's really tempting... $200 more for a ton more performance.


----------



## phallacy

Quote:


> Originally Posted by *cam51037*
> 
> I too am thinking about selling my 290 Tri-X. I may have a buyer for $650-$700. I'd then plan to go for dual GTX 780's for $890 shipped from EVGA. (B-stock)
> 
> It's really tempting... $200 more for a ton more performance.


Do the 780s really provide that much more performance than the 290s? In all the benchmarks I've seen, it's either the 290 up by a few fps or the 780 up by a few fps, depending on the game. Same thing with the 780 Ti vs. the 290X.


----------



## cam51037

Quote:


> Originally Posted by *phallacy*
> 
> Do the 780s really provide that much more performance compared to the 290s? All the benchmarks I've seen it's either been 290 up by a few fps or 780s up by a few fps depending on game. Same thing with 780 ti vs 290x.


Indeed the 290 is the better performer, it's just that for $200 I can upgrade my single R9 290 to *dual* GTX 780's.


----------



## phallacy

Quote:


> Originally Posted by *cam51037*
> 
> Indeed the 290 is the better performer, it's just that for $200 I can upgrade my single R9 290 to *dual* GTX 780's.


Ooh, gotcha; I thought you meant 2 290s vs. 2 780s. Worth it then, if you can sell it. I bought a 290X Tri-X OC and a 290X DD Black; I'm going to see which one is the better OCer and return the one I don't need. I'll waterblock it and add it to my config anyway.


----------



## anubis1127

Quote:


> Originally Posted by *cam51037*
> 
> Indeed the 290 is the better performer, it's just that for $200 I can upgrade my single R9 290 to *dual* GTX 780's.


Do it nao! xD


----------



## cam51037

Quote:


> Originally Posted by *anubis1127*
> 
> Do it nao! xD


:O

I'll think about it for the rest of the day and then make my final decision tomorrow.


----------



## taem

Quote:


> Originally Posted by *phallacy*
> 
> Do the 780s really provide that much more performance compared to the 290s? All the benchmarks I've seen it's either been 290 up by a few fps or 780s up by a few fps depending on game. Same thing with 780 ti vs 290x.


Not sure this is the best site for info, but it provides the graphics sub-score, which most reviews don't, and the CPU makes such a huge difference:


http://www.legitreviews.com/nvidia-geforce-gtx-780-ti-video-card-review_128012/8


----------



## Roboyto

Quote:


> Originally Posted by *cam51037*
> 
> :O
> 
> I'll think about it for the rest of the day and then make my final decision tomorrow.


I don't know if I would want to gamble on B stock. Is there a reduced warranty or return policy if they don't perform as you'd like?


----------



## cam51037

Quote:


> Originally Posted by *Roboyto*
> 
> I don't know if I would want to gamble on B stock. Is there a reduced warranty or return policy if they don't perform as you'd like?


Well, if I'm reading it right, you get a one-year warranty on the cards instead of two years; that's the only real difference.


----------



## taem

Quote:


> Originally Posted by *cam51037*
> 
> Well if I'm reading it right you have a one year warranty on the cards instead of two years, thats the only real difference.


I would guess a lot of these are poor overclockers, if that's important to you. It probably doesn't matter for SLI. But given that SLI doesn't scale 100% in games, the price advantage diminishes somewhat versus a single GPU with good clocks. The price advantage will also be undercut over time by power consumption: an extra 200-250W isn't trivial.
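To put a number on that last point, here's a back-of-the-envelope running-cost sketch; the wattage gap, daily hours, and electricity price are all hypothetical inputs, not measured figures:

```python
def annual_power_cost(extra_watts, hours_per_day, price_per_kwh):
    """Yearly cost of the extra draw from a second card (all inputs assumed)."""
    kwh_per_year = extra_watts / 1000.0 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

# e.g. 225W extra, 4 hours of gaming a day, $0.12/kWh:
print(round(annual_power_cost(225, 4, 0.12), 2))  # ~$39 a year
```

Not huge on its own, but it compounds with extra case heat and the PSU headroom a second card demands.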


----------



## Jack Mac

I'm so mad right now. I cancelled my NCIX order for an XFX 290 to order from a third-party Amazon seller because it was $30 cheaper; today I received an e-mail from the Amazon seller saying the XFX 290 is on back order for 2-3 weeks. I am not going to use flipping integrated graphics for two to three WEEKS. And they never mentioned that it wouldn't ship for that long. Ugh, why did I sell my good-clocking 290?







What should I do?


----------



## szeged

Quote:


> Originally Posted by *Jack Mac*
> 
> I'm so mad right now, I cancelled my NCIX order of an XFX 290 to order from a third party Amazon seller because it would be $30 cheaper, today I received an e-mail from the Amazon seller saying that the XFX 290 is on back order for 2-3 weeks. I am not going to use flipping integrated for two to three WEEKS. And they never mentioned that it wouldn't ship for that long. Ugh, why did I sell my good clocking 290?
> 
> 
> 
> 
> 
> 
> 
> What should I do?


Grab a 780 classy from the evga b stock page for $450 imo lol


----------



## Jack Mac

Quote:


> Originally Posted by *szeged*
> 
> Grab a 780 classy from the evga b stock page for $450 imo lol


I was thinking of that, but I could also get the 780 SC from Amazon tomorrow lol. I don't know if a returned 780 Classy would even clock well at all.


----------



## rdr09

Quote:


> Originally Posted by *cam51037*
> 
> Indeed the 290 is the better performer, it's just that for $200 I can upgrade my single R9 290 to *dual* GTX 780's.


Don't you mine? I read they can mine as well as the 7900 series now.


----------



## szeged

Quote:


> Originally Posted by *Jack Mac*
> 
> I was thinking of that but I could also get the 780SC from Amazon tomorrow lol. I don't know if a returned 780 Classy would even clock well at all.


Probably just got returned because someone got mean looks from the wife for buying another gpu lol.


----------



## cam51037

Quote:


> Originally Posted by *rdr09*
> 
> don't you mine? i read they can mine as good as the 7900 series now.


Yes I mine, except these cards would be in my main computer, and I don't mine on my main computer so they wouldn't see any mining time.


----------



## rdr09

Quote:


> Originally Posted by *cam51037*
> 
> Yes I mine, except these cards would be in my main computer, and I don't mine on my main computer so they wouldn't see any mining time.


for $200 extra - go for it.


----------



## Roboyto

Quote:


> Originally Posted by *cam51037*
> 
> Well if I'm reading it right you have a one year warranty on the cards instead of two years, thats the only real difference.


The only Nvidia card I have purchased recently was an EVGA GTX 670 Superclocked 4GB, an open-box item from my local Micro Center. The card ran just fine at stock clocks, even with the factory blower, but as soon as I put a waterblock on it I could hear the horrendous coil whine it was generating. I definitely know I was the first one to remove the stock HSF, so that wasn't the problem. The card wouldn't take any kind of an overclock, and within 6 months it was BSODing or black-screening within a few minutes of 3D load, especially anything with PhysX. I know I probably just got a bad card, but the experience was enough to make me not turn my back on the Red Team again. The only positive was that I had the walk-in exchange warranty, so I got all of my money back to put towards a mobo/CPU.


----------



## LTC

So I'm new to this whole 290 thing. I just received my 290 DCUII and was wondering what results people are getting; right now I have the power limit at +50%, but I still seem to crash in BF4 at 1050/1300 :S


----------



## anubis1127

Quote:


> Originally Posted by *Roboyto*
> 
> The only Nvidia card I have purchased recently was a EVGA GTX 670 Superclock 4GB, and it was an open box item from my local MicroCenter. Card ran just fine at stock clocks even with factory blower. As soon as I put a waterblock on it I could hear the horrendous coil whine it was generating. I definitely know that I was the first one to remove the stock HSF so that wasn't the problem. The card wouldn't take any kind of an overclock. Within 6 months it was BSOD or black screen within a few minutes of 3D load, especially something that had PhysX. I know I probably got a crappy card, but the experience was enough to make me not turn my back on the Red Team again. Only positive was I had the walk-in exchange warranty so I got all of my money back to put towards a MoBo/CPU.


Stahp.


----------



## Paul17041993

Quote:


> Originally Posted by *LTC*
> 
> So I'm new to this whole 290 thing. I just received my 290 DCUII and was wondering what results people are getting; right now I have the power limit at +50%, but I still seem to crash in BF4 at 1050/1300 :S


Sounds about right.

Jokes aside, could you try adding a little core voltage? The DCUIIs may need a little extra compared to reference...

Quote:


> Originally Posted by *Roboyto*
> 
> The only Nvidia card I have purchased recently was a EVGA GTX 670 Superclock 4GB, and it was an open box item from my local MicroCenter. Card ran just fine at stock clocks even with factory blower. As soon as I put a waterblock on it I could hear the horrendous coil whine it was generating. I definitely know that I was the first one to remove the stock HSF so that wasn't the problem. The card wouldn't take any kind of an overclock. Within 6 months it was BSOD or black screen within a few minutes of 3D load, especially something that had PhysX. I know I probably got a crappy card, but the experience was enough to make me not turn my back on the Red Team again. Only positive was I had the walk-in exchange warranty so I got all of my money back to put towards a MoBo/CPU.


Should I mention my laptop has a GT 540M...?

Worst card ever, honestly... (they were flogged-off defective 550Ms; failures galore, and they had no thermal protection...)


----------



## LTC

Quote:


> Originally Posted by *Paul17041993*
> 
> sounds about right.
> 
> jokes aside could you try adding a little core voltage? the DCIIs may need a little extra compared to reference...


How is that done? The slider is greyed out in AB, even with unofficial OC enabled...


----------



## anubis1127

Quote:


> Originally Posted by *LTC*
> 
> How is that done? The slider is greyed out in AB, even with unofficial OC enabled...


Have you tried GPU Tweak? I thought you needed to use that software with the Asus cards, I could be wrong though, somebody please correct me if that is the case.


----------



## phallacy

^ If "unlock voltage control" and "enable voltage monitoring" are checked in AB, pretty much every 290 should be able to take higher voltage via the offset slider. Same with Trixx (my preferred OC utility for these cards).


----------



## Paul17041993

Quote:


> Originally Posted by *LTC*
> 
> How is that done? The slider is greyed out in AB, even with unofficial OC enabled...


ASUS locks the voltage controller so it can only be used with GPU Tweak, and likely only with specific versions of said software, too.


----------



## taem

Quote:


> Originally Posted by *LTC*
> 
> How is that done? The slider is greyed out in AB, even with unofficial OC enabled...


Go to Settings / General and check the boxes for Unlock Voltage Control and Unlock Voltage Monitoring. Restart.

You don't want Unofficial Overclocking Mode; set it to disabled.
Quote:


> Originally Posted by *Paul17041993*
> 
> ASUS locks the controller to be used only with GPUTweak, likely specific versions of said software too.


Seriously?? I had an Asus 280X DCUII and I could tweak volts with Afterburner. So this is new with the Asus 290s? What a horrible thing to do. It's not like they even put any effort into GPU Tweak.


----------



## disintegratorx

Wow guys, I JUST got my card working how it should, and now I'm FLYING around BF4 with NO HITCHES...
This is easily the best card ATI/AMD has made yet. I'm posting a snapshot from MSI Afterburner so everyone can see the settings I'm using. Mantle is definitely the BEST thing ATI/AMD has created. I never thought I could play the game so smoothly after everything I went through installing the water cooling system. And today, just a couple of hours ago, I set my card to a lower PowerPlay setting and slightly lowered the memory frequency. Dudes, THIS card IS worth every penny. Thank you SO much, AMD! You guys outdid yourselves.







Here they are...












And I'm getting stunning results with my old quad-core i7 950 proc.. un-REAL.. lol









Also, I have a 700mV offset on my CPU PLL voltage, LLC enabled, and only a slight OC on the CPU...
Now I have to make it a personal mission to get my new chipset back up and running again, even though I don't feel I need to at all.
Still, my curiosity about this series has been re-ignited. It can only get better!?! lol AWESOOOOOOOOOME!!!!!!

Oh, and I would like to post a good link for you all to see, if you're interested in PC tweaking. From one club member to another.








Performance tweaks: http://forums.guru3d.com/showthread.php?t=327922 . And be sure to let me know if there's anything else I can weigh in on... My favorite hobby is what you're reading about on THIS forum. Game on!


----------



## Slomo4shO

Quote:


> Originally Posted by *Roboyto*
> 
> *Upgraded thermal pads on XSPC Razor Block*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Alright so I have installed the Fujipoly Ultra Extreme 17.0 W/mK pads on both VRM1 and VRM2. I have also installed the Fujipoly Extreme 11.0 W/mK on the RAM chips.
> 
> I did not use any TIM to assist in thermal conductivity on the pads. Chiknnwatrmln also did not use any TIM with his EK block. With the Ultra Extreme he is seeing roughly 20C lower temps on VRM1. So I am doing this write up to see what kind of a difference there is going from stock XSPC pads to FUE, and to see which block does a better job cooling VRM1.
> 
> *Here's what it looked like with stock pads.*
> 
> 
> *I ordered the FUE in 15x100 for the VRMs and FE in the 150x100 for the RAM chips.*
> 
> *The 150x100 is enough to probably do 3 whole cards with a little extra left over. If you wanted to get better than stock results and not spend $60 for both types you could easily get away with one of these.*
> 
> 
> *The 15x100 FUE is more than enough to do the VRMs on these cards. If you cut them thin enough you might have enough to do 3 cards.*
> 
> *I examined the stock pad contact and found some spots that weren't being 100% covered.*
> 
> 
> *I took the XSPC stock pads off and laid them on the Fujipoly to use them as a template and enlarged as needed.*
> 
> 
> 
> 
> 
> 
> 
> *My awesome Husky 'titanium' coated scissors worked marvelously to cut both versions of the pads.*
> 
> 
> 
> *Minimal trimming was needed to make all pieces fit on RAM really nice.*
> 
> 
> 
> 
> 
> *At this point I thought it would be nice for everyone else to have a template of their own, so I put the pads on my scanner
> 
> 
> 
> 
> 
> 
> 
> *
> 
> R9290XThermalPadTemplate.jpg 1077k .jpg file
> 
> 
> R9290XThermalPadTemplate.pdf 1055k .pdf file
> 
> *Either should print on a whole sheet of 8.5x11 paper so they are the correct size. I would imagine these would work on other blocks aside from XSPC, but that will be up to others to explore.*
> 
> *Time to clean the VRMs and RAM chips*
> 
> *I chose to use Arcticlean TIM remover/purifier as I figured it could easily remove the residue from the other pads. Chiknnwatrmln suggested Isopropyl Alcohol or a Pencil Eraser.*
> 
> 
> *Time to lay the pads onto the card.*
> 
> *One more cut was required when I got to this point. I did not want to pick up the Fujipoly Extreme after it was on the card as it felt somewhat fragile. I ended up taking a precision flat head screwdriver to cut the pad where I needed. It required very little force to cut through the FE.*
> 
> 
> 
> *Tada!*
> 
> 
> *A few miscellaneous pointers before I end this rant...*
> *
> Wash your hands before handling these. The less oil from your skin the better; you could even wear gloves I suppose.
> 
> The Fujipoly Ultra Extreme has thinner, more pliable plastic on both sides and is easy to remove from either side.
> 
> The Fujipoly Extreme is like all other thermal pads I have used in the past. One side is thin, pliable, has a checkered pattern, and is easy to remove. The other side has a thicker, harder, clear plastic and is more difficult to remove. Take the hard plastic off first so you don't struggle removing it and touch the pads more than necessary.*
> 
> 
> *Temperature results coming shortly
> 
> 
> 
> 
> 
> 
> 
> *


Hmm, I also have Fujipoly Extreme pads on my XSPC-blocked R9 290s. The only difference is that I applied MX-4 to both sides of the pads; I'd be curious to see how your temps compare to mine:


----------



## Forceman

Quote:


> Originally Posted by *LTC*
> 
> How is that done? The slider is greyed out in AB, even with unofficial OC enabled...


Make sure you are using AB Beta 18; earlier versions don't work.
Quote:


> Originally Posted by *Paul17041993*
> 
> ASUS locks the controller to be used only with GPUTweak, likely specific versions of said software too.


AB worked just fine when I had my card flashed with the Asus BIOS, unless that's a DCII limitation.


----------



## LTC

Voltage control seems to work now, so how much is safe? 0-100 doesn't really give me any sense of how much it is overvolting. Is it 0-100mV?


----------



## VSG

@Slomo4shO I made the same potential mistake first time around. MX4 has a worse conductivity than those pads and makes reusing them much harder. I didn't do it on my current cards.

I also replaced MX4 with PK3 and got a further temp drop on the CPU and GPU cores.


----------



## Jack Mac

100mV should definitely be OK for 24/7 use provided that temperatures are kept in check; any more than that and things start to get flaky.


----------



## anubis1127

Quote:


> Originally Posted by *LTC*
> 
> Voltage control seems to work now, so how much is safe? 0-100 doesn't really give me any sense of how much it is overvolting. Is it 0-100mV?


Correct, 100mV. It should be fine at the max setting in AB.


----------



## pkrexer

Quote:


> Originally Posted by *geggeg*
> 
> @Slomo4shO I made the same potential mistake first time around. MX4 has a worse conductivity than those pads and makes reusing them much harder. I didn't do it on my current cards.
> 
> I also replaced MX4 with PK3 and got a further temp drop on the CPU and GPU cores.


I've been using PK3 on all my blocks; good stuff.


I debated about throwing some CLU on the gpu, but ended up sticking with PK3.


----------



## LTC

Quote:


> Originally Posted by *anubis1127*
> 
> Correct, 100mV. It should be fine at the max setting in AB.


So can you tell me the difference between PL and overvoltage? Is it safe to have the power limit at 50% all the time, and regulate with the voltage?


----------



## Jack Mac

A 50% power limit just prevents power-based throttling. I'm 99.9% sure +50% is safe, as it's just like the power limit on Nvidia cards; I ran my 290 at +50% 24/7 with no issues.
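Mechanically, the power limit can be thought of as raising the card's board-power cap before throttling kicks in. A toy model in Python (the 250 W cap and the proportional throttle are illustrative assumptions, not AMD's actual PowerTune algorithm):

```python
def effective_clock(target_mhz, draw_w, tdp_w=250.0, power_limit_pct=0):
    """Toy PowerTune-style model: if board power draw exceeds the cap
    set by the power limit slider, scale the clock down to fit."""
    cap_w = tdp_w * (1 + power_limit_pct / 100)
    if draw_w <= cap_w:
        return target_mhz                   # enough power headroom
    return target_mhz * cap_w / draw_w      # proportional throttle

# A 300 W load on a "250 W" card throttles at +0% but not at +50%:
print(effective_clock(1100, 300, power_limit_pct=0))   # ~917 MHz
print(effective_clock(1100, 300, power_limit_pct=50))  # 1100 MHz
```

Under this model a higher limit adds nothing at modest draw; it only raises the ceiling before throttling, which is consistent with +50% being safe for 24/7 use.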


----------



## taem

Quote:


> Originally Posted by *Jack Mac*
> 
> A 50% power limit just prevents power-based throttling. I'm 99.9% sure +50% is safe, as it's just like the power limit on Nvidia cards; I ran my 290 at +50% 24/7 with no issues.


I've always wondered about power limit. In theory since the additional % is only called on as needed and max voltage is set elsewhere it should be safe to leave this at +50. But some users swear going above 25 creates instability.


----------



## LTC

Quote:


> Originally Posted by *Jack Mac*
> 
> A 50% power limit just prevents power-based throttling. I'm 99.9% sure +50% is safe, as it's just like the power limit on Nvidia cards; I ran my 290 at +50% 24/7 with no issues.


Thanks! Is there anything I should disable/enable? I disabled ULPS, but what about downclocking and so on?


----------



## LTC

Quote:


> Originally Posted by *taem*
> 
> I've always wondered about power limit. In theory since the additional % is only called on as needed and max voltage is set elsewhere it should be safe to leave this at +50. But some users swear going above 25 creates instability.


Huh, back when I had my 6970, I always had it at max as well...


----------



## taem

Quote:


> Originally Posted by *LTC*
> 
> Thanks! Is there anything I should disable/enable? I disabled ULPS, but what about downclocking and so on?


I'm finding that unless I disable Extend Official Overclock Limits and Unofficial Overclocking Mode, my core will throttle/downclock regardless of temps. They should both be disabled by default in AB Beta 18.


----------



## Slomo4shO

Quote:


> Originally Posted by *geggeg*
> 
> @Slomo4shO I made the same potential mistake first time around. MX4 has a worse conductivity than those pads and makes reusing them much harder. I didn't do it on my current cards.
> 
> I also replaced MX4 with PK3 and got a further temp drop on the CPU and GPU cores.


MX-4 is slightly worse than the Extremes (8.5 W/mK vs 11 W/mK; I didn't use the 17 W/mK Ultra Extreme) but, theoretically, applying the paste should enable better contact and thus better heat transfer. Did you ever compare the temps of each method in the same build?

Also, I used CLU on the die...
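For a rough sense of what those conductivity numbers mean, the conductive resistance of a pad is R = t / (k·A). A quick sketch (the 1.0 mm thickness and 10x10 mm VRM area are assumed values for illustration, not measurements from the card):

```python
def pad_resistance_k_per_w(thickness_mm, k_w_mk, area_mm2):
    """Conductive thermal resistance R = t / (k * A), in K/W."""
    t_m = thickness_mm / 1000.0      # mm -> m
    a_m2 = area_mm2 / 1e6            # mm^2 -> m^2
    return t_m / (k_w_mk * a_m2)

# Hypothetical 1.0 mm pad over a 10 mm x 10 mm VRM package:
for name, k in [("MX-4-class 8.5 W/mK", 8.5), ("Extreme 11 W/mK", 11.0),
                ("Ultra Extreme 17 W/mK", 17.0)]:
    print(name, round(pad_resistance_k_per_w(1.0, k, 100.0), 2), "K/W")
```

Going from 11 to 17 W/mK cuts the pad's resistance by about a third; a paste layer only helps if it actually fills contact gaps, otherwise it just adds another lower-conductivity layer in series.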


----------



## VSG

I don't have any paste on my pads anymore, didn't seem worth the hassle later on honestly. The performance with the Fujipoly extreme pads themselves has been excellent as it is.


----------



## Slomo4shO

Quote:


> Originally Posted by *geggeg*
> 
> I don't have any paste on my pads anymore, didn't seem worth the hassle later on honestly. The performance with the Fujipoly extreme pads themselves has been excellent as it is.


I had extra MX-4 lying around that I got for free, so it wasn't much of a hassle.


If I had elected for the Ultra Extreme, I wouldn't have used the MX-4.


----------



## VSG

lol I was referring to the hassle of removing the pads and reusing them.


----------



## Slomo4shO

Quote:


> Originally Posted by *geggeg*
> 
> lol I was referring to the hassle of removing the pads and reusing them.


Meh, when it's time to upgrade I'll just sell the cards with the blocks; the new cards will need new blocks and pads anyway, so no hassle.


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> I've always wondered about power limit. In theory since the additional % is only called on as needed and max voltage is set elsewhere it should be safe to leave this at +50. But some users swear going above 25 creates instability.


I haven't tried testing with power under +50, but it is not a cause for instability from my testing. Instability is from clocking too high, or not enough voltage.


----------



## Slomo4shO

Quote:


> Originally Posted by *Roboyto*
> 
> I haven't tried testing with power under +50, but it is not a cause for instability from my testing. Instability is from clocking too high, or not enough voltage.


While mining, if the power limit is left at 0, it will actually cause the card to throttle. Increasing the power limit ensures that there is enough current to maintain the desired frequencies at load. Added voltage lets you increase the maximum frequency.


----------



## LTC

So what are some of you guys settings? What increments do you adjust voltage in?


----------



## Prozillah

Would you recommend running the power limit at 50% 24/7 even with stock clocks for bf4 gaming?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Prozillah*
> 
> Would you recommend running the power limit at 50% 24/7 even with stock clocks for bf4 gaming?


Shouldn't need to touch PL at all with stock clocks. As long as you keep temps down


----------



## taem

Quote:


> Originally Posted by *Roboyto*
> 
> I haven't tried testing with power under +50, but it is not a cause for instability from my testing. Instability is from clocking too high, or not enough voltage.


I just set it to +50% and forget about it. But there are quite a few people who say this, including here at OCN. I've read several times that you should not go over +25%.


----------



## Paul17041993

Quote:


> Originally Posted by *taem*
> 
> You don't want Unofficial Overclock, set it to disabled.
> Seriously?? I had Asus 280x dcu ii, I could tweak volts with afterburner. So this is new with asus 290s? What a horrible thing to do. It's not like they even put any effort into gpu tweak.


It's an old thing with all their cards, so don't update your BIOS at all if you want to continue using Afterburner. Usually the first couple of generations are decent; afterwards they lock the controller, BIOS, and voltages. Don't ask why, because it makes no sense to me either as it doesn't do them any good...


----------



## Roboyto

Quote:


> Originally Posted by *Slomo4shO*
> 
> While mining, if the power limit is left at 0 it will actually cause the card to throttle. Increasing the power limit ensures that there is enough current to maintain the desired frequencies at load. Added voltages allow you to increase the maximum frequency


Good to know. The power slider has been cranked to +50% since I hit roughly the +75mV mark for OC testing. Haven't tried any maximum OC runs with it at 0. Now I'm curious to see how it will affect my benchmark scores.

Oh yes, I know all about moAr volts = moAr MHz


----------



## Slomo4shO

Quote:


> Originally Posted by *Roboyto*
> 
> Good to know. The power slider has been cranked to +50% since I hit roughly the +75mV mark for OC testing. Haven't tried any maximum OC runs with it at 0. Now I'm curious to see how it will affect my benchmark scores.
> 
> Oh yes, I know all about moAr volts = moAr MHz


The power slider adds moAr Amps for moAr consistent frequencies


----------



## LTC

So how do you guys figure out your max overclock? Is it core first, then memory, or the other way around? Or maybe both at the same time? Also what is that thing about the memory being difficult to overclock because of hangs on the desktop and so on?


----------



## Derpinheimer

You can do either one first; it doesn't really matter. But the core returns are greater, so generally you would do the core first to determine the max stable core clock, and then the memory (because a higher memory clock could mean a lower max core clock), and then you'll know if that is the case.

Memory will give black screens if OC'd too far; I don't know anything about hanging.


----------



## LTC

Quote:


> Originally Posted by *Derpinheimer*
> 
> You can do either one first; it doesn't really matter. But the core returns are greater, so generally you would do the core first to determine the max stable core clock, and then the memory (because a higher memory clock could mean a lower max core clock), and then you'll know if that is the case.
> 
> Memory will give black screens if OC'd too far; I don't know anything about hanging.


Thanks, what is the preferred stability testing software nowadays? Still furmark?


----------



## Davschall

May I join? XFX DD 290 underwater.


----------



## Loktar Ogar

Quote:


> Originally Posted by *LTC*
> 
> Thanks, what is the preferred stability testing software nowadays? Still furmark?


From my experience: Firestrike / Firestrike Extreme and BF4.


----------



## Arizonian

Quote:


> Originally Posted by *Davschall*
> 
> May I join? XFX DD 290 underwater.
> 
> 
> Spoiler: Warning: Spoiler!


Looks great. Congrats - added


----------



## GenoOCAU

Quote:


> Originally Posted by *Loktar Ogar*
> 
> From my experience: Firestrike / Firestrike Extreme and BF4.


I would have to disagree given how unstable BF4 can be even on a good day.


----------



## Roboyto

Quote:


> Originally Posted by *LTC*
> 
> Thanks, what is the preferred stability testing software nowadays? Still furmark?


Quote:


> Originally Posted by *Roboyto*
> 
> The only 2 games I've played with high OC have been FFXIV and Elder Scrolls Online Beta. Both are pretty demanding titles as far as GPU is concerned.
> 
> There is only a small difference in Core/RAM max speeds between benchmarks. Some will allow a little more here or there...just requires time to test.
> 
> FFXIV I'm 100% stable with 1255/1675 +125mV offset. I haven't tried playing with my newfound highest clocks with +200mV.
> 
> I think FFXIV benchmark is a great overall system test as I end up with slightly lower stable clocks compared to 3DMark/Unigine.
> 
> My suggestion is:
> 
> +50% power limit
> *don't try to make large jumps in MHz*.
> If you're 100% good at certain clock speeds then *work the core then memory individually* in 10MHz jumps.
> When you start to have issues at a certain core speed, go back to your previously good setting and then start working on the RAM.
> When you encounter issues there, then it is time for more voltage.
> *Add voltage in small increments*; I was working in 5-7mV increases.
> Once you have applied more voltage you *then start over again*. Small jumps in core frequency and then RAM, then more volts.
> Rinse and repeat.
> 
> *It will be time consuming*. I worked all settings up little by little to help keep temperatures in check since I did 2/3 of my benchmark runs with the stock blower fan @ a deafening 75%.
> 
> Up until the last few days my best clocks were achieved at +125mV; I was somewhat unsure of what would happen with +200mV until I saw a few other people post with that much voltage. Then I started to experiment past +125mV. I did get some extra performance going up to +200mV, but I won't ever be running the card constantly at +200mV since the performance gain was minimal and I don't want my card blowing up anytime soon; I plan on keeping this thing for years to come.
> 
> I've run the following benches at least this many times, from what I have recorded in my spreadsheet:
> 
> FFXIV Benchmark: 30
> Unigine Heaven: 8
> Unigine Valley: 42
> 3DMark11 Performance Presets: 61
> Unigine Valley: 41
> 3DMark11 Extreme Presets: 38
> 
> If I had to guesstimate, between the GPU and fine-tuning my 4770K to minimal voltage with RAM @ 2400MHz, I'm probably 100 hours into tweaking.


Furmark gives you *absolute maximum power draw and temperatures*, but isn't a great gauge for how the computer will act while playing a game. 3DMark, Unigine, and FFXIV are my go-to benchmarks. These are likely more strenuous than playing a game, but a better judge than Furmark.
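The quoted procedure above is essentially a greedy hill-climb: push one knob in small steps, back off on failure, then spend a little voltage and start over. A sketch of that loop, where `stable()` is a stand-in for your actual benchmark runs and the step sizes are the ones suggested above:

```python
def find_max_clocks(stable, core=1000, mem=1250, mv=0,
                    core_step=10, mem_step=10, mv_step=5, mv_max=100):
    """Greedy OC search: work the core, then the memory, then add a
    small voltage bump and repeat. `stable(core, mem, mv)` should run
    your benchmark suite and return True/False."""
    while True:
        while stable(core + core_step, mem, mv):   # core first
            core += core_step
        while stable(core, mem + mem_step, mv):    # then memory
            mem += mem_step
        if mv + mv_step > mv_max:                  # out of voltage headroom
            return core, mem, mv
        mv += mv_step                              # more volts, start over

# Fake stability model just to make the sketch runnable:
# each +1 mV buys ~3 MHz core and ~1 MHz memory headroom.
fake = lambda c, m, v: c <= 1100 + 3 * v and m <= 1400 + v
print(find_max_clocks(fake))  # -> (1400, 1500, 100) with this fake model
```

In reality each `stable()` call is a round of FFXIV/Valley/3DMark runs, which is where the "100 hours of tweaking" above comes from.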


----------



## Slomo4shO

Quote:


> Originally Posted by *Arizonian*
> 
> Looks great. Congrats - added


Can you update my total to 7 cards: the 4 reference Sapphire R9 290 BF4 Editions that unlocked to R9 290X, under water in 4-way CrossFire; 2 XFX R9 290 Black Editions under water in CrossFire; and an XFX R9 290 DD Black Edition on air?

Hmm, looking forward to better Nvidia GPUs that can mine so all my builds aren't riddled with red


----------



## LTC

Quote:


> Originally Posted by *Roboyto*
> 
> Furmark gives you *absolute maximum power draw and temperatures*, but isn't a great gauge for how the computer will act while playing a game. 3DMark, Unigine, and FFXIV are my go-to benchmarks. These are likely more strenuous than playing a game, but a better judge than Furmark.


Very informative post, thanks! But how do you get a 125mV increase? The highest I can go in AB is 100mV


----------



## Roboyto

Quote:


> Originally Posted by *Derpinheimer*
> 
> You can do either one first; it doesn't really matter. But the core returns are greater, so generally you would do the core first to determine the max stable core clock, and then the memory (because a higher memory clock could mean a lower max core clock), and then you'll know if that is the case.
> 
> Memory will give black screens if OC'd too far; I don't know anything about hanging.


Core returns are definitely greater on these cards, especially with the 512-bit bus. I've tested all my benchmarks using my maximum core overclock with stock memory clock. The loss in performance, depending on benchmark, hasn't been over 10%.

Memory will give black screens; absolutely. Usually locks the computer up, sometimes sound continues to operate normally or will freeze as well.

Core pushed too far will first generate artifacting/flickering, decrease in bench scores/frames, or lock ups.

Strangely enough I haven't encountered a single BSOD from pushing the card too far.


----------



## Roboyto

Quote:


> Originally Posted by *LTC*
> 
> Very informative post, thanks! But how do you get a 125mV increase? The highest I can go in AB is 100mV


Sapphire Trixx with Offset Voltage


----------



## chiknnwatrmln

Geez this thread moves fast... Been busy with schoolwork for a few days and I missed like 30 some odd pages...

Anyway, I saw your temps decrease a lot Roboyto, that's good. I also use Trixx, as starting up AB with hax every time is a pain, and when I'm not gaming I downclock my card to 900/1000 with stock voltage because my daily OC (1210/1437.5 on +168mV) gives like 1.125V at idle.


----------



## taem

Quote:


> Originally Posted by *LTC*
> 
> Very informative post, thanks! But how do you get a 125mV increase? The highest I can go in AB is 100mV


If you click the vertical bar to the right of the core voltage slider, 2 more sliders appear. One is the Aux Voltage slider that lets you add another +100mV.


----------



## GenoOCAU

Quote:


> Originally Posted by *taem*
> 
> If you click the vertical bar to the right of the core voltage slider, 2 more sliders appear. One is the Aux Voltage slider that lets you add another +100mV.


I just tried +125 in Trixx and then +100mV in Afterburner with +25 on the aux, and it didn't work as you suggested.


----------



## taem

Quote:


> Originally Posted by *GenoOCAU*
> 
> I just tried +125 in trixx and then +100mv in afterburner with +25 on the aux and it didn't work as you suggested?


They're not a 1-to-1 correlation: AB is VDDC adjust and Trixx is VDDC offset. You can get to the same place with either, but the number will be slightly different.


----------



## Forceman

Quote:


> Originally Posted by *Roboyto*
> 
> I haven't tried testing with power under +50, but it is not a cause for instability from my testing. Instability is from clocking too high, or not enough voltage.


I wonder if the reason some people find +25% to be more stable than +50% is because at +25% with high overclocks the card is power throttling to a lower clock speed. I know when I was first testing mine I was getting crazy clocks until I realized the power limit change hadn't taken (was still at 0% even though I set 50%) and I was throttling to 925 or something instead of the 1250 I had set. Fixed the power limit and bang, crash.
Quote:


> Originally Posted by *taem*
> 
> If you click the vertical bar to the right of the core voltage slider, 2 more sliders appear. One is the Aux Voltage slider that lets you add another +100mV.


That AUX voltage is not added to the core voltage. It is a different thing. So putting +100mV in core and +100mV in AUX will not give you +200mV on the core.


----------



## taem

Quote:


> Originally Posted by *Forceman*
> 
> AUX voltage is not added to the core voltage. It is a different thing. So putting +100mV in core and +100mV in AUX will not give you +200mV on the core.


OIC. It adjusts mem bus among other things. It's the vddci that's moving, not vddc. This is interesting, it's been my experience that raising mem clock sometimes allows a higher stable core clock, maybe that's what was happening. Aux volt +25 lets me get to 1220 core, without it I max at 1200, on AB.


----------



## Roboyto

Quote:


> Originally Posted by *Forceman*
> 
> I wonder if the reason some people find +25% to be more stable than +50% is because at +25% with high overclocks the card is power throttling to a lower clock speed. I know when I was first testing mine I was getting crazy clocks until I realized the power limit change hadn't taken (was still at 0% even though I set 50%) and I was throttling to 925 or something instead of the 1250 I had set. Fixed the power limit and bang, crash.


I'll have to give it a try. All my OC has been done with +50. I just figured it could use as much as possible when pushing the card to the brink.


----------



## Forceman

Quote:


> Originally Posted by *Roboyto*
> 
> I'll have to give it a try. All my OC has been done with +50. I just figured it could use as much as possible when pushing the card to the brink.


That's what I've done as well. Might try lower and see what happens also. I've never heard any reason for why lower would be more stable, unless the power delivery system couldn't handle the load for some reason, but unless it throttled at 25% the power should be the same.


----------



## Roboyto

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Geez this thread moves fast... Been busy with schoolwork for a few days and I missed like 30 some odd pages...
> 
> Anyway, I saw your temps decrease a lot Roboyto, that's good.


Yes they did. With my milder settings of 1255/1675 +125mV, core and VRM1 temps are now pretty much identical in the mid-to-upper 40s. Big improvement from VRM1 normally being 20C higher than core temp. On a few of my runs core temp was up 1-3C... Not sure if this is attributable to the lower VRM1 temps or a smidge of air left in the loop, as I was anxious to get to results. It ran all day and there's not a single air bubble or gurgle sound, so I might have to rerun the tests just to make sure.

Gunderman456 should have the Fujipoly Extreme 11 W/mK pads in a day or two to install on his EK blocks. Between your temps, mine, and his we should be able to find a trend in performance for the Fujipoly pads. Plus it will be nice to see if the Ultra Extremes are worth their premium price tag.


----------



## phallacy

Do you think adding the Fujipoly ultra extreme on the VRMs could take the place of more rads for multi gpu setup? Adding another gpu to my setup for 3 way, possibly 4 later on. However I don't have enough space to accommodate extra rads other than maybe a 240mm. If I replace the pads, you think it would work without getting much hotter?


----------



## Paul17041993

Quote:


> Originally Posted by *taem*
> 
> OIC. It adjusts mem bus among other things. It's the vddci that's moving, not vddc. This is interesting, it's been my experience that raising mem clock sometimes allows a higher stable core clock, maybe that's what was happening. Aux volt +25 lets me get to 1220 core, without it I max at 1200, on AB.


The power draw is likely creating extra vdroop on the memory, allowing the +25 memory voltage to make it stable again.


----------



## VSG

Quote:


> Originally Posted by *phallacy*
> 
> Do you think adding the Fujipoly ultra extreme on the VRMs could take the place of more rads for multi gpu setup? Adding another gpu to my setup for 3 way, possibly 4 later on. However I don't have enough space to accommodate extra rads other than maybe a 240mm. If I replace the pads, you think it would work without getting much hotter?


You really can't compare those two. Better thermal transfer takes heat away from a localized area like the VRMs and distributes it throughout the loop, raising the average loop temperature. A radiator helps dissipate some of this heat from the loop to the environment. So the two work hand in hand and don't really replace each other.
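To put that in numbers: at steady state the coolant warms up until the radiators shed exactly as much heat as the blocks dump in, roughly T_loop = T_ambient + Q/C for a combined radiator dissipation coefficient C. A sketch with made-up coefficients (not real radiator data):

```python
def loop_temp_c(ambient_c, heat_in_w, rad_w_per_c):
    """Steady-state coolant temp: the loop rises above ambient until
    radiator dissipation (rad_w_per_c * delta-T) matches heat input."""
    return ambient_c + heat_in_w / rad_w_per_c

# Better pads move more VRM heat into the water; with the same rads
# the whole loop simply runs warmer:
print(loop_temp_c(25.0, 600.0, 60.0))  # 35.0 C
print(loop_temp_c(25.0, 650.0, 60.0))  # same rads, more heat, warmer loop
```

In other words, better pads only change where the heat goes; radiator area and airflow are what change how much heat leaves the loop.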


----------



## pkrexer

Quote:


> Originally Posted by *phallacy*
> 
> Do you think adding the Fujipoly ultra extreme on the VRMs could take the place of more rads for multi gpu setup? Adding another gpu to my setup for 3 way, possibly 4 later on. However I don't have enough space to accommodate extra rads other than maybe a 240mm. If I replace the pads, you think it would work without getting much hotter?


If anything, you'll need more rad space by adding the pads. It's not the pads cooling the vrms, it's just allowing heat to transfer to your block more efficiently... which in turn will transfer more heat through your rads...


----------



## phallacy

Quote:


> Originally Posted by *geggeg*
> 
> You really can't compare those two. Better thermal transfer takes heat away from a localized area like the VRMs and distributes it throughout the loop, raising the average loop temperature. A radiator helps dissipate some of this heat from the loop to the environment. So the two work hand in hand and don't really replace each other.


Quote:


> Originally Posted by *pkrexer*
> 
> If anything, you'll need more rad space by adding the pads. It's not the pads cooling the vrms, it's just allowing heat to transfer to your block more efficiently... which in turn will transfer more heat through your rads...


Hmm ok thanks for the info. I'll have to see how much I can add with my current config.


----------



## Bluemustang

So I got some 290s on the way and, if I get lucky, I'm going to try to unlock them to 290Xs. I'd guess this would void the warranty, so if for some reason I had to RMA a card in the future, can I flash the BIOS back and lock it to a 290 before sending it in?


----------



## cplifj

R9 Eunuch-X: worst card ever; it isn't actually any good at anything, only making air move. I immediately correct myself, as I do have comfortably warm feet.


----------



## taem

Quote:


> Originally Posted by *Paul17041993*
> 
> the power draw is likely creating extra vdroop on the memory, allowing +25 memory voltage to make it stable again.


As a practical matter does this mean you can achieve higher mem clocks with AB, due to this mem bus voltage adjust?


----------



## Paul17041993

Quote:


> Originally Posted by *taem*
> 
> As a practical matter does this mean you can achieve higher mem clocks with AB, due to this mem bus voltage adjust?


Possibly? I haven't fiddled with mine enough to really know...


----------



## Roboyto

Quote:


> Originally Posted by *pkrexer*
> 
> If anything, you'll need more rad space by adding the pads. It's not the pads cooling the vrms, it's just allowing heat to transfer to your block more efficiently... *which in turn will transfer more heat through your rads*...


Quote:


> Originally Posted by *geggeg*
> 
> You really can't compare those two. Better thermal transfer takes heat away from a localized area like the VRMs and distributes it throughout the loop, *raising the average loop temperature*. A radiator helps dissipate some of this heat from the loop to the environment. So the two work hand in hand and don't really replace each other.


Quote:


> Originally Posted by *phallacy*
> 
> Do you think adding the Fujipoly ultra extreme on the VRMs could take the place of more rads for multi gpu setup? Adding another gpu to my setup for 3 way, possibly 4 later on. However I don't have enough space to accommodate extra rads other than maybe a 240mm. If I replace the pads, you think it would work without getting much hotter?


In the very limited results comparison that I posted, 10 runs, I did notice GPU core temp was higher sometimes. It averages out to 0.8C hotter on the GPU. Not much, but still measurable. I will continue to tweak and tinker to see if this trend stays.


----------



## rdr09

Quote:


> Originally Posted by *cplifj*
> 
> R9 Eunuch-X: worst card ever; it isn't actually any good at anything, only making air move. I immediately correct myself, as I do have comfortably warm feet.


Mine plays BF4 beautifully. Have you ever thought of an Xbox?


----------



## Ukkooh

Quote:


> Originally Posted by *cplifj*
> 
> R9 Eunuch-X: worst card ever; it isn't actually any good at anything, only making air move. I immediately correct myself, as I do have comfortably warm feet.


You shouldn't have bought reference if you aren't going to watercool it.


----------



## Forceman

Quote:


> Originally Posted by *taem*
> 
> As a practical matter does this mean you can achieve higher mem clocks with AB, due to this mem bus voltage adjust?


The AUX voltage isn't the memory voltage. The default voltage for what it changes is 1.0V so that's not memory (unless the memory bus is running at something other than the memory voltage, although I don't know if that is possible). I've heard it is PLL voltage, but I don't know what impact raising that would have on overclocking.


----------



## Paul17041993

Irony being, the top of my Raven gets hot enough to burn your hand, or at least warm a cheese sandwich; I couldn't imagine what trifire of these would be like...


----------



## Arizonian

Quote:


> Originally Posted by *Slomo4shO*
> 
> Can you update my total to 7 cards: the 4 reference Sapphire R9 290 BF4 Editions that unlocked to R9 290X, under water in 4-way CrossFire; 2 XFX R9 290 Black Editions under water in CrossFire; and an XFX R9 290 DD Black Edition on air?
> 
> Hmm, looking forward to better Nvidia GPUs that can mine so all my builds aren't riddled with red


Nice builds you got going on there. Updated


----------



## taem

Quote:


> Originally Posted by *Forceman*
> 
> The AUX voltage isn't the memory voltage. The default voltage for what it changes is 1.0V so that's not memory (unless the memory bus is running at something other than the memory voltage, although I don't know if that is possible). I've heard it is PLL voltage, but I don't know what impact raising that would have on overclocking.


Like I said earlier, adding aux voltage allows me to increase core clock. The phenomenon is repeatable and consistent: set the aux to 0 and the MetroLL bench will crash and Valley will artifact; set it to +25 and it will complete a looped run with no artifacts in Valley. Same result every time I try. But I've only tried with mem at 1500. In any event this setting is affecting core clock stability; I'm absolutely positive about this.

I happen to be using Trixx right now for the profile based fan curves, I don't like the AB universal fan curve. But I might go back and play with this now that I know a bit more about this.

Edit: aux volt affects mem bus voltage, among other things. On Nvidia cards it affects PCIe bus voltage.


----------



## Roboyto

Quote:


> Originally Posted by *Slomo4shO*
> 
> Can you update my total to 7 cards: the 4 reference Sapphire R9 290 BF4 Editions that unlocked to R9 290X, under water in 4-way CrossFire; 2 XFX R9 290 Black Editions under water in CrossFire; and an XFX R9 290 DD Black Edition on air?
> 
> Hmm, looking forward to better Nvidia GPUs that can mine so all my builds aren't riddled with red


Should run some 3DMark and get it on this:

http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad


----------



## Forceman

Quote:


> Originally Posted by *taem*
> 
> Like I said earlier, adding to aux volt allows me to increase core clock. The phenomenon is repeatable and consistent, set the aux to 0 and MetroLL bench will crash and Valley will artifact. Set it to +25 and it will complete a looped run and no artifacts in Valley. Same result every time I try. But, I only tried with mem at 1500. In any event this setting is affecting core clock stability, absolutely positive about this.
> 
> I happen to be using Trixx right now for the profile based fan curves, I don't like the AB universal fan curve. But I might go back and play with this now that I know a bit more about this.
> 
> Edit, aux volt affects mem bus voltage, and other things. On nvidia cards it affects pcie bus voltage.


I never said it didn't affect core stability, many people have said that it does. But I don't know what you mean by mem bus voltage. The memory runs at 1.5V (or maybe 1.35V), so AUX/VDDCI certainly isn't affecting that since it is 1.0V at default (plus Unwinder already said there is no programmable memory voltage control on these cards). I would assume that the memory controller runs at core voltage, which is why increasing core voltage can help with memory overclocks, but I'm not certain about that. Do you think VDDCI is memory controller voltage?


----------



## taem

Quote:


> Originally Posted by *Forceman*
> 
> I never said it didn't affect core stability, many people have said that it does. But I don't know what you mean by mem bus voltage. The memory runs at 1.5V (or maybe 1.35V), so AUX/VDDCI certainly isn't affecting that since it is 1.0V at default (plus Unwinder already said there is no programmable memory voltage control on these cards). I would assume that the memory controller runs at core voltage, which is why increasing core voltage can help with memory overclocks, but I'm not certain about that. Do you think VDDCI is memory controller voltage?


I glanced at info on what aux volt does and that's what it says, that it affects mem bus voltage, not mem voltage. I have no idea what the technical relationship is between aux volt and core clocks and how it all intersects with mem clocks so I'm not claiming anything. But for my air cooled system, the core voltage limit isn't the +100 max in AB, it's temps, so if I can squeeze out some more clock out of AB via aux voltage, then it might be worthwhile for me to use AB over Trixx. Same might apply for other air cooled folks.

I do understand why MSI says this can have the same practical effect as core voltage, since you're raising core voltage to get higher clocks, and if aux voltage can do the same for you, you arrive at the same place.


----------



## Paul17041993

AUX is likely part of the memory controller. Since it's lower profile than its predecessor in the 79x0s, it likely needs less voltage than what you can feed from the core rail, and works better and more efficiently with its own supply.

memory would be locked to 1.5V; the Lightning would likely be the only one able to control the memory power, which is somewhat needed for record runs anyway.


----------



## LTC

My card seems to crash in Unigine with stock clocks with Trixx open. After a reboot it seems to run fine again, no overclock in trixx, only PL at 50%, could that cause the problem?


----------



## Mercy4You

Quote:


> Originally Posted by *LTC*
> 
> My card seems to crash in Unigine with stock clocks with Trixx open. After a reboot it seems to run fine again, no overclock in trixx, only PL at 50%, could that cause the problem?


The answer is not so easy to give without other information about your card. What are your temps?


----------



## LTC

Quote:


> Originally Posted by *Mercy4You*
> 
> The answer is no...


Huh, well, seems weird then...


----------



## Mercy4You

Quote:


> Originally Posted by *LTC*
> 
> Huh, well, seems weird then...


Lol, I was too fast and you too


----------



## Mercy4You

Quote:


> Originally Posted by *Mercy4You*
> 
> Lol, I was too fast and you too


You should test it with GPU-Z logging...
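Mercy4You's suggestion is a good one: GPU-Z's "Log to file" option writes a CSV of sensor readings, and the rows just before a crash usually show whether the clock or voltage did anything odd. Here's a minimal sketch of skimming the tail of such a log; the column names ("GPU Core Clock [MHz]", "GPU Temperature [C]") and the sample rows are assumptions, so check the header line of your own log, since GPU-Z versions vary:

```python
# Sketch: skim the last few rows of a GPU-Z sensor log before a crash.
# Column names and sample data are illustrative, not from a real log.
import csv
import io

SAMPLE_LOG = """Date , GPU Core Clock [MHz] , GPU Temperature [C] , VDDC [V]
2014-02-10 20:01:01 , 947.0 , 71.0 , 1.211
2014-02-10 20:01:02 , 947.0 , 72.0 , 1.211
2014-02-10 20:01:03 , 300.0 , 72.0 , 0.977
"""

def tail_readings(log_text, n=5):
    """Return the last n sensor rows as dicts, with whitespace stripped
    from both the column names and the values (GPU-Z pads them)."""
    reader = csv.DictReader(io.StringIO(log_text))
    rows = [{k.strip(): v.strip() for k, v in row.items()} for row in reader]
    return rows[-n:]

for row in tail_readings(SAMPLE_LOG):
    print(row["GPU Core Clock [MHz]"], row["GPU Temperature [C]"])
```

If the clock or VDDC collapses in the final row (as in the fake sample above), that points at a driver/clock-state problem rather than heat.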


----------



## LTC

Quote:


> Originally Posted by *Mercy4You*
> 
> The answer is not so easy to give without other information about your card. What are your temps?


It's a DCUII 290, temps shouldn't be a problem as I have a pretty aggressive custom fan profile. Also, Unigine crashes right after the first scene, where it changes to the next.


----------



## Mercy4You

Quote:


> Originally Posted by *LTC*
> 
> It's a DCUII 290, temps shouldn't be a problem as I have a pretty aggressive custom fan profile. Also, Unigine crashes right after the first scene, where it changes to the next.


Is this the only problem or does this also happen at other moments?


----------



## LTC

Quote:


> Originally Posted by *Mercy4You*
> 
> Is this the only problem or does this also happen at other moments?


Haven't tried yet, could it be the 14.1 drivers?


----------



## Mercy4You

Quote:


> Originally Posted by *LTC*
> 
> Haven't tried yet, could it be the 14.1 drivers?


Ah, 14.1









I had the same problem with 14.1, so I rolled back to 13.12


----------



## LTC

Quote:


> Originally Posted by *Mercy4You*
> 
> Ah, 14.1
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I had the same problem with 14.1, so I rolled back to 13.12


But.. but no Mantle in BF4 then








Anyway, it gives me a little hope that my card isn't completely bad.


----------



## Mercy4You

Quote:


> Originally Posted by *LTC*
> 
> But.. but no Mantle in BF4 then
> 
> 
> 
> 
> 
> 
> 
> Anyway, it gives me a little hope that my card isn't completely bad.


Don't worry, it's not your card. I must admit I only saw this problem in benchmarking.

You could choose to keep 14.1 for Mantle and stop benchmarking.

I rolled back 13.12 because I saw little gain in FPS with Mantle.
My FPS was good without it (100-150)...


----------



## LTC

Quote:


> Originally Posted by *Mercy4You*
> 
> Don't worry, it's not your card. I must admit I only saw this problem in benchmarking. You could choose to keep 14.1 for Mantle and stop benchmarking.
> I rolled back 13.12 because I saw little gain in FPS with Mantle. My FPS was good without it (100-150)...


I did get some weird crashes in BF4 with an overclock yesterday too, however that could just be caused by too little voltage, gonna test that later today.


----------



## Mercy4You

Quote:


> Originally Posted by *LTC*
> 
> I did get some weird crashes in BF4 with an overclock yesterday too, however that could just be caused by too little voltage, gonna test that later today.


Now you mention it, I also got some crashes in BF4 with 14.1 which I didn't get with 13.12. I guess that's why I rolled back so fast (same day)


----------



## LTC

Quote:


> Originally Posted by *Mercy4You*
> 
> Now you mention it, I also got some crashes in BF4 with 14.1 which I didn't get with 13.12. I guess that's why I rolled back so fast (same day)


Sounds good, well sorta







Anyway, I will try to roll back to 13.12 and see what happens...


----------



## Paul17041993

Quote:


> Originally Posted by *LTC*
> 
> It's a DCUII 290, temps shouldn't be a problem as I have a pretty aggressive custom fan profile. Also, Unigine crashes right after the first scene, where it changes to the next.


State jump: unstable clocks, or not enough voltage to counter the droop when the clock state jumps.


----------



## Mercy4You

Quote:


> Originally Posted by *LTC*
> 
> Sounds good, well sorta
> 
> 
> 
> 
> 
> 
> 
> Anyway, I will try to roll back to 13.12 and see what happens...


Don't forget to use DDU first in safemode!


----------



## LTC

Quote:


> Originally Posted by *Paul17041993*
> 
> State jump: unstable clocks, or not enough voltage to counter the droop when the clock state jumps.


Not enough voltage at stock clocks with PL at 50%?


----------



## maynard14

guys help..

i think my card is throttling, using the xfx 290x bios. my gpu core temp at load is 61c and vrm1 is 75c, and vrm2 max load temp is 59c

im using xfx 290x bios:

max voltage 1.234 and all my settings on the gpu is default, no oc no voltage tweak.

while playing crysis 3 my fps is still low and in MSI Afterburner my gpu load is not 100 percent stable,

what can i do to make it stable? pls help, coz this is the first time i saw my card throttle but temps are still acceptable right


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *cam51037*
> 
> Indeed the 290 is the better performer, it's just that for $200 I can upgrade my single R9 290 to *dual* GTX 780's.


JUST DO IT ALREADY


----------



## nightfox

Quote:


> Originally Posted by *maynard14*
> 
> guys help..
> 
> i think my card is throttling, using the xfx 290x bios. my gpu core temp at load is 61c and vrm1 is 75c, and vrm2 max load temp is 59c
> 
> im using xfx 290x bios:
> 
> max voltage 1.234 and all my settings on the gpu is default, no oc no voltage tweak.
> 
> while playing crysis 3 my fps is still low and in MSI Afterburner my gpu load is not 100 percent stable,
> 
> what can i do to make it stable? pls help, coz this is the first time i saw my card throttle but temps are still acceptable right


nope its not throttling. you can see the core clock. it is steady.

thats the GPU usage you are looking at, which depends on the load of the game.

when you say low fps how many are you talking about?
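nightfox's distinction can be checked mechanically: throttling shows up as the core clock sagging below the set speed while under load, whereas GPU usage bouncing around is just the game's load varying. A rough sketch with made-up sample clocks; the 500 MHz idle cutoff and the 10 MHz tolerance are my assumptions, not anything from GPU-Z or Afterburner:

```python
# Sketch: "throttling, or just light load?" from a list of logged core clocks.

def is_throttling(clocks_mhz, target_mhz, tolerance_mhz=10):
    """True if any under-load sample sags below target by more than tolerance.
    Samples below 500 MHz are treated as idle power states and ignored."""
    under_load = [c for c in clocks_mhz if c > 500]  # assumed idle cutoff
    return any(target_mhz - c > tolerance_mhz for c in under_load)

steady = [1000, 1000, 999, 1000]    # usage may still bounce around: not throttling
sagging = [1000, 947, 892, 1000]    # classic PowerTune-style clock sag
print(is_throttling(steady, 1000))   # False
print(is_throttling(sagging, 1000))  # True
```

At maynard14's 61c core / 75c VRM1 the steady-clock case is what you'd expect to see.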


----------



## Roboyto

Quote:


> Originally Posted by *LTC*
> 
> It's a DCUII 290, temps shouldn't be a problem as I have a pretty aggressive custom fan profile. Also, Unigine crashes right after the first scene, where it changes to the next.


We have seen issues with the DC2 cooler on the 290, especially in Xfire; it doesn't seem to be up to snuff. The Powercolor PCS has been showing the most impressive air cooling results from what I've seen in this thread.

I have done over 40 Unigine Valley runs testing my 290 and if you are crashing then you either need more voltage, to lower your clocks, or reduce operating temps.

*Edit* Scrub those 14.1 Drivers and roll back to 13.11 or 13.12.


----------



## rdr09

Quote:


> Originally Posted by *Ukkooh*
> 
> You shouldn't have bought reference if you aren't going to watercool it.


cp's issue is blackscreen. i would be upset myself.


----------



## Ilsanto86

*Hi All can i join the Club?
*

Gigabyte R9 290X OC WF3 + Asus Bios Mod


----------



## rdr09

Quote:


> Originally Posted by *Ilsanto86*
> 
> *Hi All can i join the Club?
> *
> 
> Gigabyte R9 290X OC WF3 + Asus Bios Mod
> 
> 
> Spoiler: Warning: Spoiler!


seems everybody got better looking rig than i do.









Gigabyte can oc, huh. no coil whine?

It is a beauty.


----------



## Prozillah

Sweet, cracked it: 1140 core, stock mem, 50% PL, +20mv, max 76c temps, 10x passes of Metro at 1440p maxed. Best part: crossfire Gigabyte 290 Windforces ON AIR Hahaha.

I found something interesting. My top mounted cpu rad is set up in push-pull, pushing the hot air out the top of the case. When I didnt have it cranking, my cards would max at 84c and throttle badly, and that was only at 1120 with no volts added. As soon as I cranked up the big suction fans, xfire temps have been soooooo noticeably better it's not even funny! Cant stress enough how important case vortex airflow is. And no, it wouldn't win any awards for silent operation...


----------



## Spectre-

Quote:


> Originally Posted by *rdr09*
> 
> seems everybody got better looking rig than i do.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Gigabyte can oc, huh. no coil whine?


i got coil whine on my reference gigabyte R9 290


----------



## rdr09

Quote:


> Originally Posted by *Spectre-*
> 
> i got coil whine on my reference gigabyte R9 290


i had a giga 7950 that whined. disappeared though after 50 something runs of Valley. scored 51, too.


----------



## Sgt Bilko

This seems to be all i can squeeze out of my cards for now.

Single 290: http://www.3dmark.com/fs/1663793

Crossfire FS: http://www.3dmark.com/fs/1644041

Crossfire FSE: http://www.3dmark.com/fs/1644094


----------



## maynard14

Quote:


> Originally Posted by *nightfox*
> 
> nope its not throttling. you can see the core clock. it is steady.
> 
> thats the GPU usage you are looking at, which depends on the load of the game.
> 
> when you say low fps how many are you talking about?


yes sir, i just realized that crysis 3 is not that optimized. i got 40 to 58 fps on the post human level, and are my temps acceptable?


----------



## Sgt Bilko

Quote:


> Originally Posted by *maynard14*
> 
> yes sir, i just realized that crysis 3 is not that optimized. i got 40 to 58 fps on the post human level, and are my temps acceptable?


I'd call them acceptable, So long as the VRM's and Core stay under 85-90c you will be fine


----------



## maynard14

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'd call them acceptable, So long as the VRM's and Core stay under 85-90c you will be fine


thank you, im using an antec h620 and nzxt g10. the core clock temp for me is very good, but the vrms are not that good, but its natural coz i didnt put any vrm heatsinks on


----------



## rdr09

Quote:


> Originally Posted by *maynard14*
> 
> yes sir, i just realized that crysis 3 is not that optimized. i got 40 to 58 fps on the post human level, and are my temps acceptable?


what do you mean not optimized? to me it runs very well, it is just demanding. check out my fps with crossfire 7950/7970 at stock (slightly faster than a 290) . . .



see how it dips to the 30s outdoors. last three were indoors. btw, that's MP.

that was maxed out using 1080. Very high and 8msaa. i can play the same settings with a 290 at stock but i have my HT on on my i7 4.5GHz. i don't monitor my fps. no need to.


----------



## maynard14

Quote:


> Originally Posted by *rdr09*
> 
> what do you mean not optimized? to me it runs very well, it is just demanding. check out my fps with crossfire 7950/7970 at stock (slightly faster than a 290) . . .
> 
> 
> 
> see how it dips to the 30s outdoors. last three were indoors. btw, that's MP.
> 
> that was maxed out using 1080. Very high and 8msaa. i can play the same settings with a 290 at stock but i have my HT on on my i7 4.5GHz. i don't monitor my fps. no need to.


wow nice. my settings are the same as yours except fxaa only, and what i mean is that some areas just suddenly dip down to the mid 40s or 30s, thats why i call it not optimized. im only playing campaign mode sir, so the dips are normal?


----------



## LTC

So after rolling back to 13.12 I seem to be able to overclock, overclock well even. Right now I'm at 1100/1400 at stock voltage with temps going to 79/82/74 (Core/VRM1/VRM2), are those temps safe? Getting a score of 2598 in the Valley ExtremeHD benchmark.


----------



## rdr09

Quote:


> Originally Posted by *maynard14*
> 
> wow nice. my settings are the same as yours except fxaa only, and what i mean is that some areas just suddenly dip down to the mid 40s or 30s, thats why i call it not optimized. im only playing campaign mode sir, so the dips are normal?


you're playing 1080 60hz? if so, why fxaa? i got my card for $400 - it better max out that game and it does. put some msaa. dips in SP? not sure.


----------



## rdr09

Quote:


> Originally Posted by *LTC*
> 
> So after rolling back to 13.12 I seem to be able to overclock, overclock well even. Right now I'm at 1100/1400 at stock voltage with temps going to 79/82/74 (Core/VRM1/VRM2), are those temps safe? Getting a score of 2598 in the Valley ExtremeHD benchmark.


i find 13.11 better for higher clocks.


----------



## LTC

Quote:


> Originally Posted by *rdr09*
> 
> i find 13.11 better for higher clocks.


Well, I'm just curious to see how high I can go without changing voltage, not going for any records or anything







However a 5fps increase in BF4 would be nice.


----------



## rdr09

Quote:


> Originally Posted by *LTC*
> 
> Well, I'm just curious to see how high I can go without changing voltage, not going for any records or anything
> 
> 
> 
> 
> 
> 
> 
> However a 5fps increase in BF4 would be nice.


you can stick with 13.12. my experience with it was that it raised my vcore along with the temp, so i went back to 13.11. it is up to you. you might be able to do 1150 at stock.


----------



## LTC

Quote:


> Originally Posted by *rdr09*
> 
> you can stick with 13.12. my experience with it was that it raised my vcore along with the temp, so i went back to 13.11. it is up to you. you might be able to do 1150 at stock.


How about memory though, how much is a normal clock for that?


----------



## maynard14

Quote:


> Originally Posted by *rdr09*
> 
> you're playing 1080 60hz? if so, why fxaa? i got my card for $400 - it better max out that game and it does. put some msaa. dips in SP? not sure.


im playing it on 1080p only and 120hz. i think when i tried 8x msaa my fps dropped significantly, thats why i just put fxaa. haha i like 60 fps and up. wow, nice price at $400. mine i bought for about 480 dollars here in the philippines, but i got lucky, it unlocks to a 290x


----------



## rdr09

Quote:


> Originally Posted by *LTC*
> 
> How about memory though, how much is a normal clock for that?


try little steps like 1300, 1400, 1500. i doubt 1600 without a vcore increase. it seems vcore helps not just the core but also the mem oc. it is silicon lottery. i can get mine to bench at 1175/1500 with just raising the PL to max with Trixx; the vcore adjusts itself automatically or dynamically.
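rdr09's "little steps" method is essentially a linear search that stops at the first failure. A sketch of the bookkeeping under that assumption, where the set-clock-and-stress step is left as a stand-in callback, since in practice you change the clock by hand in Trixx/Afterburner and loop Valley/Heaven yourself:

```python
# Sketch: walk the memory clock up in steps, keep the last one that passes.
# run_stress is a hypothetical callback standing in for a real stress run.

def find_max_stable(steps, run_stress):
    """Return the highest clock in `steps` whose stress run passes,
    or None if even the first step fails."""
    best = None
    for clock in steps:
        if run_stress(clock):
            best = clock
        else:
            break  # first failure: back off, no point testing higher
    return best

# Stand-in for a real run: pretend artifacts appear above 1500 MHz.
fake_stress = lambda clock: clock <= 1500
print(find_max_stable([1300, 1400, 1500, 1600], fake_stress))  # 1500
```

Stopping at the first failure matches the thread's advice: once you artifact, more clock only makes it worse at the same voltage.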


----------



## DA77

I would like to join the club

here is my GPU-Z validation

Powercolor R9 290 PCS+


----------



## rdr09

Quote:


> Originally Posted by *maynard14*
> 
> im playing it on 1080p only and 120hz. i think when i tried 8x msaa my fps dropped significantly, thats why i just put fxaa. haha i like 60 fps and up. wow, nice price at $400. mine i bought for about 480 dollars here in the philippines, but i got lucky, it unlocks to a 290x


120Hz you need 3 290s lol. i am sorry maynard, but C3 is one of the best looking games out there, so take advantage of higher settings.

well, $400 plus the block comes to $540 total. so, the new ones with non-ref coolers should not be priced any higher. $600 for a 290 is loco.


----------



## LTC

Quote:


> Originally Posted by *rdr09*
> 
> try little steps like 1300, 1400, 1500. i doubt 1600 without a vcore increase. it seems vcore helps not just the core but also the mem oc. it is silicon lottery. i can get mine to bench at 1175/1500 with just raising the PL to max with Trixx; the vcore adjusts itself automatically or dynamically.


How much do you game at? I'm wondering if those clocks are stable?


----------



## maynard14

Quote:


> Originally Posted by *rdr09*
> 
> 120Hz you need 3 290s lol. i am sorry maynard, but C3 is one of the best looking games out there, so take advantage of higher settings.


really sir... for 120hz?? my monitor is samsung s27a950d,.. i see, but i will try to put 8xmsaa later, im at work right now haha.

ahhh, mine is cooled with nzxt g10 and antec h620 only,







i cant afford a full block water cooled gpu


----------



## rdr09

Quote:


> Originally Posted by *maynard14*
> 
> really sir... for 120hz?? my monitor is samsung s27a950d,.. i see, but i will try to put 8xmsaa later, im at work right now haha.
> 
> ahhh, mine is cooled with nzxt g10 and antec h620 only,
> 
> 
> 
> 
> 
> 
> 
> i cant afford a full block water cooled gpu


Idk, just guessing. I am your supervisor - GO BACK TO WORK.

yah, try 60Hz and set it to at least Very High and 4MSAA. 8 if your cpu can take it.


----------



## rdr09

Quote:


> Originally Posted by *LTC*
> 
> How much do you game at? I'm wondering if those clocks are stable?


I am sorry for multiple posts. I just saw your ?.

I play at stock 'cause I only have 1080. here is FF 14 . . .



although, I tried BF4 MP 64 with no crashes. it is watercooled.


----------



## LTC

Can't seem to get 1150/1450 benchmark stable, even with a 50mV increase, also the VRM's hit a frightening 90C at that voltage...


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *LTC*
> 
> How much do you game at? I'm wondering if those clocks are stable?
> 
> 
> 
> I am sorry for multiple posts. I just saw your ?.
> 
> I play at stock 'cause I only have 1080. here is FF 14 . . .
> 
> 
> 
> although, I tried BF4 MP 64 with no crashes. it is watercooled.

I have a different FFXIV official benchmark. How many official benchmarks have they released?


----------



## Sgt Bilko

Quote:


> Originally Posted by *maynard14*
> 
> really sir... for 120hz?? my monitor is samsung s27a950d,.. i see, but i will try to put 8xmsaa later, im at work right now haha.
> 
> ahhh, mine is cooled with nzxt g10 and antec h620 only,
> 
> 
> 
> 
> 
> 
> 
> i cant afford a full block water cooled gpu


i just ran the first SP mission for you. Crossfire R9 290's at 1000/1250 with my 8350 @ 4.8. Game settings were Very High with MSAA x8 and i was getting 70 fps avg with both cards at 100%

Crysis 3 will bring a lot of rigs to their knees


----------



## maynard14

Quote:


> Originally Posted by *rdr09*
> 
> Idk, just guessing. I am your supervisor - GO BACK TO WORK.
> 
> yah, try 60Hz and set it to at least Very High and 4MSAA. 8 if your cpu can take it.


ahahah i work @ ups, matching some peoples packages, kinda bored hahaha. im just googling some tips on how to increase fps. those damn vrm heatsinks are hard to find here in my country. i tried +100mv in afterburner and only got a max oc of 1150 core and 1450 memory on the 290x bios, but my vrm temps at +100mv are killing me, max i have seen today is 90c, so i didnt bother using the overclock on my gpu.

thanks again. havent tried 60hz, ill try later. i wanna go home WAHAH


----------



## rdr09

Quote:


> Originally Posted by *LTC*
> 
> Can't seem to get 1150/1450 benchmark stable, even with a 50mV increase, also the VRM's hit a frightening *90C* at that voltage...


that's why.

Quote:


> Originally Posted by *kizwan*
> 
> I have a different FFXIV official benchmark. How many official benchmarks have they released?


i only know of 2. character and exploration. stand alone benches.


----------



## maynard14

Quote:


> Originally Posted by *Sgt Bilko*
> 
> i just ran the first SP mission for you. Crossfire R9 290's at 1000/1250 with my 8350 @ 4.8. Game settings were Very High with MSAA x8 and i was getting 70 fps avg with both cards at 100%
> 
> Crysis 3 will bring a lot of rigs to their knees


hahah thats crazy... crossfire 290s on crysis 3, thats just a dream for me, never can i afford another 290 hahah. you're lucky sir haha, im jealous. is your monitor 120hz? now i know why my fps dips down to the mid 30s to 40s on crysis max settings except fxaa... hahah damn crysis 3


----------



## Sgt Bilko

Quote:


> Originally Posted by *maynard14*
> 
> hahah thats crazy... crossfire 290s on crysis 3, thats just a dream for me, never can i afford another 290 hahah. you're lucky sir haha, im jealous. is your monitor 120hz? now i know why my fps dips down to the mid 30s to 40s on crysis max settings except fxaa... hahah damn crysis 3


Nope, just a 60hz for me.


----------



## pkrexer

Are there any other BIOSes besides the Powercolor PCS+ that have a built-in voltage increase? Since I'm water cooled, it'd be nice to just have the voltage set via hardware and not have to worry about the software increasing it.


----------



## LTC

Quote:


> Originally Posted by *rdr09*
> 
> that's why.


I actually thought that the VRM's were designed for higher temps?


----------



## maynard14

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Nope, just a 60hz for me.


i wonder if 60 hz will increase my fps ?


----------



## rdr09

Quote:


> Originally Posted by *LTC*
> 
> I actually thought that the VRM's were designed for higher temps?


the cooler the better and that i think applies to all gpus. 80C is fine for the core and vrms.


----------



## Slomo4shO

Quote:


> Originally Posted by *Arizonian*
> 
> Nice builds you got going on there. Updated


Thanks








Quote:


> Originally Posted by *Roboyto*
> 
> Should run some 3DMark and get it on this:
> 
> http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad


Been meaning to prioritize some benchmarking into the mix but haven't gotten around to it... I usually end up gaming instead, gotta get through that 200+ Steam backlog


----------



## Sgt Bilko

Quote:


> Originally Posted by *maynard14*
> 
> i wonder if 60 hz will increase my fps ?


Your panels refresh rate won't increase or decrease your fps.

All you can do is decrease some settings or raise your OC as high as you are comfortable with


----------



## maynard14

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Your panels refresh rate won't increase or decrease your fps.
> 
> All you can do is decrease some settings or raise your OC as high as you are comfortable with


i see, so i have no choice but to oc my gpu. gotta order some vrm heatsinks from amazon then haha. thank you for your info sir


----------



## rdr09

Quote:


> Originally Posted by *maynard14*
> 
> i see, so i have no choice but to oc my gpu. gotta order some vrm heatsinks from amazon then haha. thank you for your info sir


if you go 60Hz with C3 you don't need to maintain a 60 average to stay smooth. this and BF4 are similar in that if the minimum fps dips to the 30s (just don't go to 35 or below) they stay playable. In BF3 if your min dips to the 40s it becomes ugly. i think it is the way the games are coded. those are just my observations.
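rdr09's point, that the minimum fps matters more than the average, is easy to check against a frametime log: FRAPS-style tools dump one frametime in ms per frame, and 1000/frametime is that frame's instantaneous fps. A sketch with illustrative numbers (the frametime list here is made up, not from a real capture):

```python
# Sketch: judge playability by minimum FPS, not the average.

def fps_stats(frametimes_ms):
    """Return (minimum, average) instantaneous FPS from per-frame times in ms."""
    inst_fps = [1000.0 / ft for ft in frametimes_ms]
    return min(inst_fps), sum(inst_fps) / len(inst_fps)

frametimes = [16.7, 16.7, 25.0, 28.6, 16.7]  # a brief outdoor dip
low, avg = fps_stats(frametimes)
print(round(low, 1))  # 35.0, right at the "don't go to 35 or below" line
print(avg > 45)       # True: the average alone hides the dip
```

The average hovering around 50 while the minimum touches 35 is exactly the pattern rdr09 describes for C3/BF4 outdoor scenes.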


----------



## Mercy4You

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Your panels refresh rate won't increase or decrease your fps.


That is to be debated. On a 60 Hertz monitor, you will not get more than 60 FPS. A 120 Hertz monitor will deliver up to 120 FPS.

The graphic card will be able to work harder on a 120 Hertz monitor than on a 60 Hertz monitor.

I've experienced this when I got my 120 Hertz monitor while using the same GPU...


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mercy4You*
> 
> That is to be debated. On a 60 Hertz monitor, you will not get more than 60 FPS. A 120 Hertz monitor will deliver up to 120 FPS.
> 
> The graphic card will be able to work harder on a 120 Hertz monitor than on a 60 Hertz monitor.
> 
> I've experienced this when I got my 120 Hertz monitor while using the same GPU...


It's an interesting point.

I always thought that the GPU(s) rendered frames regardless of the display rate of said frames.

Maybe you can run a few tests?

Can you run a few game benches at the same clocks etc with your panel at 60 and 120hz?

Just note things like GPU and CPU usage maybe?


----------



## Slomo4shO

Quote:


> Originally Posted by *Mercy4You*
> 
> That is to be debated. On a 60 Hertz monitor, you will not get more than 60 FPS. A 120 Hertz monitor will deliver up to 120 FPS.
> 
> The graphic card will be able to work harder on a 120 Hertz monitor than on a 60 Hertz monitor.
> 
> I've experienced this when I got my 120 Hertz monitor while using the same GPU...


Frames rendered doesn't change, frames viewed does...
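For what the debate above hinges on: with vsync enabled, the displayed framerate is capped at the panel's refresh, so the same GPU really does get to draw more frames on a 120 Hz panel; with vsync off, the render rate is independent of the panel, which is Slomo4shO's point. A toy sketch of that relationship (the numbers are illustrative):

```python
# Sketch: displayed FPS as a function of refresh rate and vsync.

def displayed_fps(gpu_capable_fps, refresh_hz, vsync=True):
    """With vsync, output is capped at the panel refresh; without it,
    the GPU renders (and tears) at whatever rate it can manage."""
    return min(gpu_capable_fps, refresh_hz) if vsync else gpu_capable_fps

print(displayed_fps(110, 60))              # 60:  GPU loafs on a 60 Hz panel
print(displayed_fps(110, 120))             # 110: same GPU, 120 Hz panel, full load
print(displayed_fps(110, 60, vsync=False)) # 110: vsync off, refresh irrelevant
```

This would square both experiences: Mercy4You's hotter, louder card on the 120 Hz panel is what you'd expect with vsync on, while an uncapped benchmark behaves the same on either panel.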


----------



## Arizonian

Quote:


> Originally Posted by *Ilsanto86*
> 
> *Hi All can i join the Club?
> *
> 
> Gigabyte R9 290X OC WF3 + Asus Bios Mod
> 
> 
> Spoiler: Warning: Spoiler!


Way to make an entrance with your first post on OCN, looks great. Congrats - added









Here's *"How to put your Rig in your Sig"*.

Quote:


> Originally Posted by *DA77*
> 
> I would like to join the club
> 
> here is my GPU-Z validation
> 
> Powercolor R9 290 PCS+


Congrats to you as well. Added


----------



## taem

Quote:


> Originally Posted by *DA77*
> 
> I would like to join the club
> 
> here is my GPU-Z validation
> 
> Powercolor R9 290 PCS+


Did you get Hynix or elpida?


----------



## Mercy4You

Quote:


> Originally Posted by *Slomo4shO*
> 
> Frames rendered doesn't change, frames viewed does...


Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's an interesting point.
> 
> I always thought that the GPU(s) rendered frames regardless of the display rate of said frames.


Forgot to mention that I experienced this during gaming, not in benchmarking. My GPU got much hotter, fans were louder...


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mercy4You*
> 
> Forgot to mention that I experienced this during gaming, not in benchmarking. My GPU got much hotter, fans were louder...


I gathered that much, that's why i asked if you could run game benchmarks, not synthetic ones.


----------



## Mercy4You

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I gathered that much, that's why i asked if you could run game benchmarks, not synthetic ones.


Do you have an idea how this works in gaming then?


----------



## DA77

Quote:


> Originally Posted by *taem*
> 
> Did you get Hynix or elpida?


Elpida


----------



## thothx87

hello!


----------



## revro

so would you recommend two Gigabyte 290s with the blower or with the Windforce cooler for my case? i have a lot of fans and great air cooling, the 2 side fans are blowing air inside and 2 are blowing it outside, i just dont have that photo in my rig

thanks
best
revro


----------



## Slomo4shO

Quote:


> Originally Posted by *revro*
> 
> so would you recommend two Gigabyte 290s with the blower or with the Windforce cooler for my case? i have a lot of fans and great air cooling, the 2 side fans are blowing air inside and 2 are blowing it outside, i just dont have that photo in my rig
> 
> thanks
> best
> revro


The reference models run hot and loud. You would be better off going with the Windforce models.


----------



## Roboyto

Quote:


> Originally Posted by *Slomo4shO*
> 
> The reference models run hot and loud. You would be better off going with the Windforce models.


You definitely don't want blowers!

Get those Windforce, Sapphire Tri-X, or Powercolor PCS. Guys with the Powercolor seem to be getting best temps overall.


----------



## Roboyto

Quote:


> Originally Posted by *maynard14*
> 
> guys help..
> 
> i think my card is throttling, using the xfx 290x bios. my gpu core temp at load is 61c and vrm1 is 75c, and vrm2 max load temp is 59c
> 
> im using xfx 290x bios:
> 
> max voltage 1.234 and all my settings on the gpu is default, no oc no voltage tweak.
> 
> while playing crysis 3 my fps is still low and in MSI Afterburner my gpu load is not 100 percent stable,
> 
> what can i do to make it stable? pls help, coz this is the first time i saw my card throttle but temps are still acceptable right


Why not try overclocking with your regular BIOS?


----------



## maynard14

Quote:


> Originally Posted by *Roboyto*
> 
> Why not try overclocking with your regular BIOS?


ahhh because i want to unlock to a 290x sir. my card is a reference xfx 290, i unlocked it with the xfx 290x bios







thats why im using the modded bios sir. will it make a difference if i use a different 290x bios when overclocking?


----------



## Roboyto

Quote:


> Originally Posted by *maynard14*
> 
> ahhh because i want to unlock to a 290x sir. my card is a reference xfx 290, i unlocked it with the xfx 290x bios
> 
> 
> 
> 
> 
> 
> 
> thats why im using the modded bios sir. will it make a difference if i use a different 290x bios when overclocking?


If I'm not mistaken the ASUS bios is a popular one for the unlocking.


----------



## Prozillah

Quote:


> Originally Posted by *revro*
> 
> so would you recommend two Gigabyte 290s with the blower or with the Windforce cooler for my case? i have a lot of fans and great air cooling, the 2 side fans are blowing air inside and 2 are blowing it outside, i just dont have that photo in my rig
> 
> thanks
> best
> revro


Windforce definitely. Just try to create a really powerful vortex. The Windforce cooler is excellent at pulling the heat away from the chips, but it dumps it inside the case, so strong exhaust is required to push that hot air out. I run a push-pull 280mm top mount rad on full power as well as a side panel 200mm fan all sucking hot air out. The room heats up something chronic!


----------



## chiknnwatrmln

The amount of heat that these cards dump into the air is ridiculous...

Even with only one 290, if I game for an hour or so the room I'm in (280sq ft dorm room with 12ft high ceiling) heats up noticeably...


----------



## revro

Quote:


> Originally Posted by *Prozillah*
> 
> Windforce definitely. Just try to create a really powerful vortex. The Windforce cooler is excellent at pulling the heat away from the chips, but it dumps it inside the case, so strong exhaust is required to push that hot air out. I run a push-pull 280mm top mount rad on full power as well as a side panel 200mm fan all sucking hot air out. The room heats up something chronic!


check out my rig, there is no stronger air tunnel i could imagine, aside from going for a corsair 900d







case cost 130eur and another 72eur in a lot of other high performance akasa fans







my 780oc maxes out at 72C after hours of gaming

best
revro


----------



## maynard14

Quote:


> Originally Posted by *Roboyto*
> 
> If I'm not mistaken the ASUS bios is a popular one for the unlocking.


thanks sir, i already tried that asus bios but i read somewhere here, if im not mistaken, that even at idle the clocks dont go down. correct me if im wrong. and the voltage seems higher than average.


----------



## ozzy1925

guys i bought a new r9 290 tri-x oc and i ran the heaven benchmark. when i checked i saw one of my vrms maxed at 77c, the other maxed at 55c. Is there something wrong?


----------



## pkrexer

Quote:


> Originally Posted by *ozzy1925*
> 
> guys i bought a new r9 290 tri-x oc and i ran the heaven benchmark. when i checked i saw one of my vrms maxed at 77c, the other maxed at 55c. Is there something wrong?


That looks pretty normal to me.


----------



## ozzy1925

Quote:


> Originally Posted by *pkrexer*
> 
> That looks pretty normal to me.


I thought both VRM temperatures should be the same. There's a 22C difference between them.


----------



## Roboyto

Quote:


> Originally Posted by *maynard14*
> 
> thanks sir,. i already tried that asus bios but i read somewhere here if im not mistaken that even in idle the clocks dont go down,.. correct me if im wrong,.. and the voltage seems higher than the average,.


It could be, I don't have any personal experience with unlocking. I'm just going off memory from what I've read on here.

Could likely be a better BIOS for extreme overclocking, but not having the ability to automatically down clock is definitely not worth it.


----------



## Roboyto

Quote:


> Originally Posted by *ozzy1925*
> 
> guys i bought a new r290 tri-x oc and i runned heaven benchmark when i checked i see one of my vrm max 77c other one max 55c.Is there something wrong?


Quote:


> Originally Posted by *ozzy1925*
> 
> i tought both vrm temperatures should be the same.There are 22C difference between them


The main VRM1 is always hotter as it deals with voltage for the core.

VRM2 is for memory and doesn't get nearly as much of a workout.

A 20C difference is par for the course, especially on air cooling.


----------



## taem

So I'd always run my side fan at low rpms because I thought it'd be bad to hurl the gpu exhaust right back into the heatsink. I just tried maxing the side fan. Temps plummeted. Vrm1 went from 85c to 79c and that was a brief peak, stabilized at 77c.

Btw, to address a comment Roboyto made, I'm not sure the PCS+ runs cooler than the Tri-X. They seem about the same, maybe the Tri-X even slightly better. Of course the PCS+ ships with higher clocks and a +50mV adjustment at stock. This is deffo a good cooler though. It ought to be, for a triple slotter!!
Quote:


> Originally Posted by *ozzy1925*
> 
> i tought both vrm temperatures should be the same.There are 22C difference between them


Afaik vrm1 always runs hottest. Been my experience anyway. On my Powercolor PCS+ 290 vrm1 runs ~15c hotter than core, and a staggering ~35c hotter than core at my highest oc clocks.


----------



## maynard14

Quote:


> Originally Posted by *Roboyto*
> 
> It could be, I don't have any personal experience with unlocking. I'm just going off memory from what I've read on here.
> 
> Could likely be a better BIOS for extreme overclocking, but not having the ability to automatically down clock is definitely not worth it.


Thanks sir. Yeah, I'm OK with the XFX BIOS I'm using. I'm still very happy with my card: a free 290X and a bit of a good clocker. At stock voltage I can push the GPU core to 1075 and the memory to 1300.

Temps are still fine: the max GPU core I've seen is 61C; VRM1 is the toughest to cool at 80C, and VRM2 is 61C max under load.

I'll experiment later and put a 140mm fan underneath my video card.







Let's see if it cools things down.


----------



## chiknnwatrmln

The normal ASUS BIOS runs like normal, obviously.

PT1 BIOS allows up to 2v, I believe it has no powertune and no downclocking.

PT3 is PT1 with LLC to alleviate vdroop.


----------



## Roboyto

Quote:


> Originally Posted by *maynard14*
> 
> thanks sir, yeah im ok with the xfx bios im using, im still very much happy with my card free 290x and a bit of good clocker for stock voltage i can up the gpu core to 1075 and memory to 1300
> 
> temps are still at gpu core max i have seen 61c and vrm1 is the toughest to cool down 80c and vrm 2 61c max load
> 
> ill try to experiment later and put a 140 fan under neath my vcard
> 
> 
> 
> 
> 
> 
> 
> let see if its cool down


I'd probably say it would.
Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Normal ASUS BIOS run like normal, obviously.
> 
> PT1 BIOS allows up to 2v, I believe it has no powertune and no downclocking.
> 
> PT3 is PT1 with LLC to alleviate vdroop.


Haven't experimented with flashing my BIOS yet since I've achieved such phenomenal results with stock.

Can I use that PT3 on my 290 black edition? I wonder if LLC would help me squeak a little more out of this thing


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> Of course pcs+ is higher clocks and +50mv adjust at stock. This is deffo a good cooler though. Ought to be for a *triple slotter*!!
> Afaik vrm1 always runs hottest. Been my experience anyway. On my Powercolor PCS+ 290 vrm1 runs ~15c hotter than core, and a staggering ~35c hotter than core at my highest oc clocks.


Hadn't really paid attention to the fact that the PCS+ is a triple slotter; that makes sense now.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Roboyto*
> 
> Haven't experimented with flashing my BIOS yet since I've achieved such phenomenal results with stock.
> 
> Can I use that PT3 on my 290 black edition? I wonder if LLC would help me squeak a little more out of this thing


Don't see why not. You could give it a shot, but read up on it first. LLC will allow the card to feed the GPU more voltage than you tell it to. Your cooling is good, but if you make a mistake, I don't think 1.5v+ would be good for the card.

TSM told me he has had his cards running 1.5v for benches and they're fine, but I'm certainly never going that high. My GPU hovers around 1.3v after droop.

Off topic, trig substitution in calc 2 sucks. I have a midterm coming up and I've been studying for hours and this is still really tough.


----------



## Roboyto

Quote:


> Originally Posted by *Roboyto*
> 
> I Think I have found the absolute limit of my GPU. Temps are still solid too. GPU 47 VRMs 71/40
> 
> XFX R9 290 BE @ 1315/1700 - P17940
> 
> 200mV offset in Trixx came out to 1.422V
> 
> ...Maybe time to try and push the 4770k a little more
> 
> 
> 
> 
> 
> 
> 
> I want P18000!
> 
> http://www.3dmark.com/3dm11/7996244
> 
> 
> 
> Check out this ridiculous wattage though!


Quote:


> Originally Posted by *chiknnwatrmln*
> 
> *My GPU hovers around 1.3v after droop.*
> 
> Off topic, trig substitution in calc 2 sucks. I have a midterm coming up and I've been studying for hours and this is still really tough.


How do I get the reading for voltage after Vdroop? That was my highest voltage reading at 1.422

I remember calc 2...PITA!


----------



## phallacy

Hot damn that is a good score for a single GPU. Roboyto what are your 4770k settings like? I've only been able to get to 11.5k at the same 4.5 OC.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Roboyto*
> 
> How do I get the reading for voltage after Vdroop? That was my highest voltage reading at 1.422
> 
> I remember calc 2...PITA!


Highest voltage means nothing on its own...

What happens is when the card kicks into 3d mode it clocks up and voltage is increased to max. In your case 1.422v, around the same for me.

Due to vdroop, as GPU usage (and current) increases the voltage decreases. Each GPU will droop a different amount, depending on chip quality.

To figure out actual voltage, run a bench like Heaven that will give you constant 100% usage and look at your VDDC in GPUz as you're running the bench. The VDDC will jump a bit (usually like .1v in either direction) but will stay around the same number... Mine goes from 1.29v to 1.31v usually so I call it 1.3v after droop.

If you're not familiar with vdroop, if I remember right it's a safety feature implemented by resistors. Because of changes in voltage as the GPU kicks into 3d mode, if those resistors weren't there then a massive yet quick change in voltage would be present, and over time could damage the GPU. The same principle applies to CPU's, which is why CPU's also have droop.

LLC (load-line calibration) basically feeds more voltage when the CPU/GPU is under load... I'm hesitant to use the PT3 BIOS because, as far as I know, there's no way to control LLC. With my motherboard, on the other hand, I can adjust LLC so that at 100% clock speed with no load I have exactly the same voltage as at 100% clock speed and full load, 1.39v.

This link has graphs that will help if I didn't explain it clearly. http://en.wikipedia.org/wiki/Voltage_droop
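The behaviour described above boils down to a simple load-line model: the voltage the GPU actually sees falls linearly with load current. Here is a minimal sketch; the load-line resistance and current figures below are assumptions for illustration only, not measured 290/290X values.

```python
# Illustrative load-line (vdroop) model. The resistance and current values
# are assumptions for illustration, not measured figures for any card.

def loaded_voltage(v_set, i_load_amps, r_loadline_ohms=0.0005):
    """Voltage at the GPU once load current flows through the load line."""
    return v_set - i_load_amps * r_loadline_ohms

# 1.422 V set: compare no load against a heavy bench drawing a large current
print(round(loaded_voltage(1.422, 0), 3))    # set voltage, no droop
print(round(loaded_voltage(1.422, 250), 3))  # drooped voltage under load
```

With these illustrative numbers, a 1.422V set voltage ends up around 1.3V under load, which matches the kind of drop people report between the max reading and the loaded VDDC.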

I hear multivariable calc is actually easier than calc 2... I guess I'll find out in the fall.


----------



## phallacy

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> ...
> 
> I hear multivariable calc is actually easier than calc 2... I guess I'll find out in the fall.


That it is, I've honestly never used calc 2 material in my current work (quant. finance / i banking) but multivariate, PDE and stats all come into play heavily. Good luck with your midterms


----------



## Roboyto

Quote:


> Originally Posted by *phallacy*
> 
> Hot damn that is a good score for a single GPU. Roboyto what are your 4770k settings like? I've only been able to get to 11.5k at the same 4.5 OC.


Thanks! I've been running benchmarks and tweaking for the last month between the i7 and the (2) 290s I had originally. I'm easily 100 hours into benching/tweaking...it has paid off however! Sold the other one as it only had ~10% OCability on core and RAM.

4.5GHz | 1.259V | 2x8GB Dominator Platinum 2400MHz | 1.65V

CPU is delidded and naked under my Raystorm with phenomenal temperatures. Link in my sig if you want to see how I did that.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *taem*
> 
> So does constant voltage do anything then? Because I'm trying to keep my air cooled 290 under 1.3v but I'm wondering what number is the determinative one. Max vddc is much higher than what I'm seeing in gpu z 99% of the time in a given run. Do I use the gpu z avg?


No, look at VDDC while your card is under 100% load. Max voltage will give you the voltage before droop, and average won't be accurate.

You can either run a bench or game in windowed mode and take a look at GPUz, or run something to give you 100% GPU usage then quickly exit and check your VDDC when your GPU was fully loaded.

Here, this was with my card underclocked so droop isn't that apparent, but still there.


----------



## taem

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> No, look at VDDC while your card is under 100% load. Max voltage will give you the voltage before droop, and average won't be accurate.
> 
> You can either run a bench or game in windowed mode and take a look at GPUz, or run something to give you 100% GPU usage then quickly exit and check your VDDC when your GPU was fully loaded.
> 
> Here, this was with my card underclocked so droop isn't that apparent, but still there.


Even using the correlated graphs in Afterburner, VDDC still fluctuates at 100% usage. At one point it will read 1.156 @ 100%, then 1.188 @ 100% usage. And the max will say 1.254.


----------



## Prozillah

I believe I'm out of juice on my 1000W PSU. Currently my 80 Plus Gold 1000W FSP Aurum PSU is powering 1 mobo, 2x 290s, 2x 2400MHz 4GB RAM modules, an H110 water pump, 2x 140mm fans, 8x 120mm fans, 1x 200mm fan, 2x HDDs, 1x SSD, 1x DVD-RW drive, 5x USB devices, and some LED lighting. I did a few online calcs and they all recommend a 1200W PSU.

I'm getting some random freezing/hard locks when gaming in BF4. I should also mention I've OC'd the CPU to 4500 @ 1.36, and my Crossfire cards are at 1140 core, +50% PL, +20mV.

I passed 10 runs of Metro: Last Light no problem, so I'm thinking it's the intensity of 64-player multiplayer that's maxing everything out.

Do I need to buy a new 1200W PSU?


----------



## LTC

What would you guys say is a safe temp for the VRMs? My VRM1 gets to 85C in BF4...


----------



## Mercy4You

Quote:


> Originally Posted by *LTC*
> 
> What would guys say are a safe temp for the VRM's? My VRM1 gets to 85C in BF4...


If you ask the WC guys here, they'll say 85C is very high









They're rated for at least 125C, so 90-100C will be fine. I like to keep mine below 90C; while gaming it's 70-85C.

BTW, did you enable a frame limit in BF4? I found that dropped my temps about 10C


----------



## chiknnwatrmln

Quote:


> Originally Posted by *taem*
> 
> Even using the correlated graphs on afterburner, vddc still fluctuates even at 100% usage. Like at one point it will read 1.156 @ 100%, then 1.188 @ 100% usage. And the the max will say 1.254.


It will fluctuate a bit, like I said mine is usually from 1.29 to 1.31v so I call it 1.3v...

Yours seems like it's 1.14 or so.

As for VRMs, 85C is kinda high. They will be fine, but high VRM temps (over 70C or so) can affect OC stability as well as efficiency.


----------



## Prozillah

Quote:


> Originally Posted by *Mercy4You*
> 
> You ask this WC guys here they say 85C is very high
> 
> 
> 
> 
> 
> 
> 
> 
> 
> They are rated for at least 125C so 90-100C will be fine. I like to keep it below 90C, while gaming it's 70- 85C.
> 
> BTW, did you enable a frame limit in BF4? I found that dropped my temps about 10C


By frame limit do you mean Vsync? I found that in Crossfire it brings my avg FPS way down..?


----------



## taem

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> As for VRM's, 85c is kinda high. They will be fine but high VRM temps (over 70 or so) can affect OC stability as well as efficiency.


Dude! I like to keep VRMs at 80C max, but do you know how hard it is to keep a ceiling VRM temp of 70C in an air-cooled system? On my last card, a 280X Vapor-X, I had to run stock clocks and undervolt to get VRM1 under 70C. I'm targeting 80C with this PowerColor PCS+ 290, and to get that I can't go close to my max stable clocks.

If you've got stability, and efficiency isn't an issue, what would you say is a good VRM temp? Is 90C damaging to a card long term?

Speaking of card damage, what is a safe max fan speed to run 24/7? Fan rattle and failure are the most common GPU problems I know from personal experience.


----------



## Mercy4You

Quote:


> Originally Posted by *Prozillah*
> 
> by fame limited to do mean vsync? I found in xfire it brings my avg FPS waay down..?


Nope! You can limit your frames in the console to any value you want. As for me, I want 120 FPS, so anything higher is a waste of GPU power and just causes more heat. Vsync causes latency, and I don't like that.


----------



## Prozillah

Quote:


> Originally Posted by *Mercy4You*
> 
> Nope! You can limit your frames in the console to a value you want. As for me, I want 120 FPS, so anything higher is a waist of GPU power and is causing more heat. Vsync is causing latency, and that I don't like.


Hrrmmm, very good point! What's the console command to run it? And I'd imagine it's probably best to put it in the user file?


----------



## Mercy4You

Quote:


> Originally Posted by *Prozillah*
> 
> Hrrmmm very good point! What's the command line to run it and I would imagine probably best to edit it in the user file?


Yep,







you know what to do:

gametime.maxvariablefps 122

That's what I did. If you have a 60 Hz monitor you want 59, 61, or 62 frames; just try what's best for you to avoid tearing. 122 works best for my 120 Hz monitor.
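For anyone following along: the same limit can be made persistent by putting the command in a `user.cfg` file in the game's install folder, which BF4 reads at launch, instead of retyping it in the console each session (122 is the value for a 120 Hz panel as above; adjust it to your own refresh rate):

```
gametime.maxvariablefps 122
```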


----------



## LTC

Quote:


> Originally Posted by *Mercy4You*
> 
> You ask this WC guys here they say 85C is very high
> 
> 
> 
> 
> 
> 
> 
> 
> 
> They are rated for at least 125C so 90-100C will be fine. I like to keep it below 90C, while gaming it's 70- 85C.
> 
> BTW, did you enable a frame limit in BF4? I found that dropped my temps about 10C


Thanks, no, do you mean Vsync?

Sorry, I didn't read thoroughly before posting.


----------



## Mercy4You

Quote:


> Originally Posted by *LTC*
> 
> Thanks, no, do you mean Vsync?
> 
> Sorry didn't read thoroughly before posting.


It works in BF3 too, and probably other games as well... With the rigs we have here, more than 60/120+ FPS is often seen while gaming, and it's a total waste of power = heat = noise...

And it's absolutely latency free


----------



## LTC

Quote:


> Originally Posted by *Mercy4You*
> 
> Works also in BF3 and probably other games as well... With the rigs we have here more than 60/120+ FPS is often seen while gaming and it's a total waist of power=heat=noise ...
> 
> And absolutely latency free


My overclock is behaving very weirdly :S Sometimes I can play BF4 for hours without a problem, and other times it just hangs two seconds into a match


----------



## Paul17041993

~100 posts in 15 hours, too much...

Anywho, just did an underclock and undervolt test: -100mV at 800/1100, and the card uses no more than 85W in Heaven and under 50W in TF2. Efficient, no? So ~25% less performance but nearly half the power draw...


----------



## Widde

Completely wiped all my drives and reinstalled my OS ^^ I was finally able to update to Windows 8.1. Rolling with 13.12 atm and everything seems fine ^^ Much better, in fact







Only trouble now is the freeze into sound loop in bf4 but that is on their end









Had lots of weird stuff happening before, but I hadn't reinstalled the OS in over a year, so it was about time ^_^


----------



## LTC

Quote:


> Originally Posted by *Widde*
> 
> Completely wiped all my drives and reinstalled my OS ^^ Was finally able to update to windows 8.1. Rolling with 13.12 atm and everything seems fine ^^ Much better in fact
> 
> 
> 
> 
> 
> 
> 
> Only trouble now is the freeze into sound loop in bf4 but that is on their end
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Had lots of weird stuff happening before, but hadnt reinstalled the OS in over a year so about time ^_^


What is that freeze you are talking about? Is it in the start of a round?


----------



## Widde

Quote:


> Originally Posted by *LTC*
> 
> What is that freeze you are talking about? Is it in the start of a round?


It happens at random; the last time was on Operation Metro. The game just freezes and the sound loops, so you have to bring up Task Manager and close it.

BTW, it feels like something is off during my Heaven runs :S http://piclair.com/eidz2

Overclocked to 1000/1300, R9 290 Crossfire, 3570K CPU at 4.5GHz


----------



## LTC

Quote:


> Originally Posted by *Widde*
> 
> Happens at random, happened now last time on operation metro, the game just freezes and the sound loops so you have to bring up the task manager and close it.


Okay, I don't know if it is a bad overclock or just the game freezing now :S


----------



## Widde

Quote:


> Originally Posted by *LTC*
> 
> Okay, I don't know if it is a bad overclock or just the game freezing now :S


I was running stock if it's of any help







and that freeze has been around in BF4 for quite some time


----------



## LTC

Quote:


> Originally Posted by *Widde*
> 
> I was running stock if it's of any help
> 
> 
> 
> 
> 
> 
> 
> and that freeze have been around in bf4 for quite some time


Seems like a bad overclock then..


----------



## Prozillah

I had the same issue with SLI 670s and my OC'd CPU when BF4 first launched. Exactly the same thing was happening; I had to take everything back to stock, which fixed all issues until the game was patched a couple more times. Then I was able to run my OC no problem. HOWEVER, since installing my 290s I'm getting 101 BSODs, restarts/freezes, and hardlocks, so I am running a test right now at completely stock. There are a couple of things to consider: 1. PSU, I am possibly over my PSU's limit with my OCs etc.; 2. BF4 optimization, which could be off and need more dev work on the patch side of things; or 3. drivers, and then god knows where to start on that...


----------



## Roboyto

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> No, look at VDDC while your card is under 100% load. Max voltage will give you the voltage before droop, and average won't be accurate.
> 
> You can either run a bench or game in windowed mode and take a look at GPUz, or run something to give you 100% GPU usage then quickly exit and check your VDDC when your GPU was fully loaded.
> 
> Here, this was with my card underclocked so droop isn't that apparent, but still there.




HWINFO gives a lot more information. Is this correct then for Vdroop?

Don't know what changed between Wednesday and today, but the card isn't stable at the clocks/voltages I was previously running, especially in 3DMark11. Any ideas? Haven't had much time to tinker with it, but it's kind of disturbing...


----------



## LTC

Quote:


> Originally Posted by *Prozillah*
> 
> had the same issue with sli 670's and my OC cpu when BF4 firstr launched. Exactly the same thing happening - had to take everything back to stock which fixed all issues until the wat patched a couple more times. Then I was able to run my OC no problem. HOWEVER - since installing my 290's I'm getting 101 BSODS and restarts/freezes and hardlocks. So I am doing a test right now on completely stock. There is a couple of things to consider - 1. PSU - I am possibly over my PSU limit with my OC's etc. 2. BF4 optimization, could be off and needs more dev work on the patch side of things or 3. drivers and then god knows where to start on that...


My PSU shouldn't be the problem. I just got the problem twice with stock clocks as well...


----------



## chiknnwatrmln

Quote:


> Originally Posted by *taem*
> 
> Dude! I like to keep vrms at 80c max but do you know how hard it is to get a ceiling vrm temp of 70c in an air cooled system? My last card, 280x vapor x, I had to run stock clocks and undervolt to get <70c vrm1. I'm targeting 80c with this powercolor pcs+ 290, and to get that I can't go close to my max stable clocks.
> 
> If you've got stability and efficiency is not an issue, what would you say is a good vrm temp? Is 90c damaging to a card long term?
> 
> Speaking of card damage what is a safe max fan speed to run 24/7? Fan rattling and failure are the most common gpu problems I know from personal experience.


I realize this; usually for air you should aim for under 80C VRMs, 85C max for long-term usage.

Honestly, what would you rather replace, a faulty fan or an entire graphics card? I usually run aggressive fan profiles because most fans are incredibly durable (except Sapphire fans, but Sapphire has a crappy warranty so I don't buy anything from them any more) and fans are inexpensive. Besides, fans are rated for run time at max speed; most will run for at least 30k hours.
Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> HWINFO gives a lot more information. Is this correct then for Vdroop?
> 
> Don't know what changed between Wednesday and today but card isn't stable at clocks/voltages I was previously running; especially 3Dmark11. Any ideas? Haven't had much time to tinker with it, but it is kind of disturbing...


No, first stick with GPUz as it's easier to read with graphs.

All you do is look at your VDDC while your card is under 100% load.

I'm not sure about the overclock, sometimes it takes a while to discover instability in an OC.


----------



## Paul17041993

BF4 is a hardware butcher, so whatever OC you think may be stable, BF4 will likely show otherwise as the draw just pushes your power hard and the vdroop drops you out.

That, and BF4 likely uses some little parts of the chip that other games and benches don't, which crash when those parts aren't 100% stable. Or you may have a bad spot of memory that's usually fine with textures but crashes when used for particles and physics...


----------



## Roboyto

Quote:


> Originally Posted by *LTC*
> 
> My PSU shouldn't be the problem. I just got the problem twice with stock clocks as well...


If you're on driver 14.1, roll back to 13.12; 14.1 is causing lots of issues.


----------



## LTC

Quote:


> Originally Posted by *Roboyto*
> 
> If you're on driver 14.1 roll back to 13.12. 14.1 causing lots of issues


I'm on 13.12 already.


----------



## Prozillah

Successfully running BF4 at completely stock clocks right now, coming up on 1 continuous hour of play. So far so good! If the hour passes I will go back to overclocking the GPUs again. I still think my PSU is bottlenecking me...???


----------



## Roboyto

Quote:


> Originally Posted by *Prozillah*
> 
> Successfully running bf4 completely stock clocks right now coming up on 1 continuous hour of play. So far so good! If 1 hour passes I will go back to overclocking the gpus again. I still think my psu is bottlenecking me...???


Do you happen to have an extra PSU lying around? You could hook it up to one of the cards and see if the problem persists at those overclocks.


----------



## Prozillah

Quote:


> Originally Posted by *Roboyto*
> 
> Do you happen to have any extra PSU laying around? Could hook the extra GPU up to one of the cards and see if the problem persists at those overclocks.


Good point. I'll find one and try it.


----------



## hks215

Does anybody have memory voltage control on the MSI R9 290 Gaming?


----------



## Widde

Wow, these scores feel like they have dropped :S http://piclair.com/zezuq 1120/1250







It seemed to run better on regular W8 than on 8.1


----------



## nightfox

Quote:


> Originally Posted by *Paul17041993*
> 
> ~100 posts in 15 hours, too much...
> 
> anywho, just did an underclock and undervolt test; -100mV, 800/1100 and the card uses no more then 85W in heaven, >50W in TF2, efficient no? so ~25% less performance but nearly half the power draw...


Is this possible? Less power draw means less heat, basically, right? I'll try this later.


----------



## Forceman

Quote:


> Originally Posted by *Paul17041993*
> 
> BF4 is a hardware butcher, so whatever OC you think may be stable, BF4 will likely show otherwise as the draw just pushes your power hard and the vdroop drops you out.


We really need LLC control.
Quote:


> Originally Posted by *hks215*
> 
> has anybody have memory voltage on the msi r9 290 gaming


None of the cards released so far have memory voltage control.


----------



## Paul17041993

Quote:


> Originally Posted by *nightfox*
> 
> is this possible? less power draw less heat basically right? ill try this later.


Of course. Both reducing voltage and reducing clocks will cut power draw significantly, and the heat produced is directly proportional to the power draw; that's where all the power goes.

So undervolt and underclock to your heart's content. Just be wary that the voltage still has to be enough for the clocks, and underclocking too far will give some performance issues. -100mV at 800/1100 seems perfectly stable for me; 1000/1250 with -100mV likely will not be, and some games may perform a lot worse than expected.

Miners and folders usually undervolt for efficiency.
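The "nearly half the power draw" result is plausible from first principles: dynamic power scales roughly with frequency times voltage squared. A quick sketch (the stock 947 MHz / ~1.15 V reference point and the ~1.05 V effective voltage after the -100mV offset are assumptions for illustration; static leakage, which also falls with voltage, is ignored):

```python
# Rough dynamic-power model: P scales with f * V^2. Reference point and
# voltages are illustrative assumptions, not measured card figures.

def relative_power(f_mhz, v_core, f_ref=947.0, v_ref=1.15):
    """Dynamic power of an operating point relative to the reference."""
    return (f_mhz / f_ref) * (v_core / v_ref) ** 2

# 800 MHz with a -100 mV offset (assume ~1.05 V effective)
print(f"{relative_power(800, 1.05):.0%} of stock dynamic power")
```

That lands around 70% of stock dynamic power from the f·V² term alone; with reduced leakage and less boost/fan overhead on top, a near-halving of measured draw is believable.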


----------



## brazilianloser

It probably won't be the same for everyone, but at least on my end I saw a 5-10 FPS increase after going from W7 to W8. Other factors, like a freshly installed OS and such, have an effect too, but I had a hell of an easy time getting Crossfire to work properly on W8. Just thought I'd share my experience.


----------



## Paul17041993

Quote:


> Originally Posted by *brazilianloser*
> 
> Probably won't be the same for everyone but at least on my end here, I saw a 5 to 10fps increase after going from w7 to w8. Some other factors like newly installed os and such have an affect but had a hell of an easy time getting crossfire to work properly on w8. Just thought I would share my experience.


Well, it *is* what the cards and drivers are optimized for, plus some little reworks here and there make 8 slightly more efficient and a fair bit more stable than 7; you can now leave it running for longer than a month without the reboot that 7 needed...


----------



## Tobiman

I'm having a bit of a problem while running benchmarks like 3DMark, Heaven, and Valley. They tend to have much, much longer scene transitions: the screen usually goes black for a second, but it's taking about 2-3 minutes. I'm currently running 14.1 and will be going back to 13.12 soonish to see if it's the same story with that driver too.

P.S. I can confirm that 13.12 WHQL is much, much more stable than 14.1. Ran all benchmarks and games at 1175/1500MHz. Will be pushing for higher stable clocks.


----------



## nightfox

Quote:


> Originally Posted by *Paul17041993*
> 
> of course, both reducing voltage and reducing clocks will reduce power draw significantly, and the heat produced is exactly proportional to the power draw, that's where the power all goes.
> 
> so undervolt and underclock to your heart's content, just be weary that the voltage still has to be enough for the clocks, and underclocking too far will give some performance issues, -100mV, 800/1100 seems perfectly stable for me, 1000/1250 with -100mV will likely not be, though some games may perform a lot worse then expected.
> 
> miners and folders usually undervolt for efficiency.


Well, I did try undervolting and underclocking. I got a black screen past -50mV in AB.

Here are my settings:

Core voltage: -31
Power limit: -50%
Core clock: 800
Memory clock: 1250

I ran a 3DMark11 benchmark:

http://www.3dmark.com/3dm11/8012349

I was surprised to see that my earlier 3DMark11 run at stock clocks (975) was a bit lower:

http://www.3dmark.com/3dm11/8001317

http://www.3dmark.com/compare/3dm11/8001317/3dm11/8012349

shocking


----------



## Roboyto

Quote:


> Originally Posted by *nightfox*
> 
> Well I did try undervolting and underclocking. I got black screen past -50mV on AB
> 
> here is my settings
> 
> Core voltage -31
> Powerlimit -50%
> Core clock 800
> Memory Clock 1250
> 
> I did ran a 3dmark11 benchmark
> 
> http://www.3dmark.com/3dm11/8012349
> 
> *I was so surprised that my 3dmark test before at stock clocks (975) was abit lower*
> 
> http://www.3dmark.com/3dm11/8001317
> 
> http://www.3dmark.com/compare/3dm11/8001317/3dm11/8012349
> 
> shocking


The cards were likely throttling quite a bit at stock clocks.


----------



## nightfox

Quote:


> Originally Posted by *Roboyto*
> 
> Likely cards were throttling quite a bit at stock clocks.


Temps are fine. I use an AB custom fan profile and monitor the clocks. Temps never go above 75C, so I don't think it was throttling. I don't think it's a CPU bottleneck either, because I monitored CPU usage.


----------



## Roboyto

Quote:


> Originally Posted by *nightfox*
> 
> temps are fine. I use AB custom fan profile and monitoring clocks. Temps never go up more than 75 so I dont think it was throttling. I dont think its CPU bottleneck as well cause I monitored CPU usage,












Hmmm, is your PSU new? Maybe struggling to give good power at stock clocks? eXtreme PSU calculator comes to 938w recommended

Best brainstorm I could come up with, and it's not that great lol

If you have an extra PSU, maybe hook it up to the 3rd card alone and see if your performance increases?


----------



## Paul17041993

Quote:


> Originally Posted by *nightfox*
> 
> Well I did try undervolting and underclocking. I got black screen past -50mV on AB
> 
> here is my settings
> 
> Core voltage -31
> Powerlimit -50%
> Core clock 800
> Memory Clock 1250
> 
> I did ran a 3dmark11 benchmark
> 
> http://www.3dmark.com/3dm11/8012349
> 
> I was so surprised that my 3dmark test before at stock clocks (975) was abit lower
> 
> http://www.3dmark.com/3dm11/8001317
> 
> http://www.3dmark.com/compare/3dm11/8001317/3dm11/8012349
> 
> shocking


I think dropping the memory slightly may allow a little more drop in core voltage, but being trifire that's likely already the most optimal settings, definitely didn't expect an _increase_ in 3dmark scores like that though, even your physics is almost 400 points higher...

I wonder what your total system power consumption is now compared to before though, this might become a new trend for cooling-limited setups or the power concerned...


----------



## taem

Quote:


> Originally Posted by *Tobiman*
> 
> Ran all benchmarks and games at 1175/1500mhz. Will be pushing for higher stable clocks.


On my powercolor pcs+ 290 I'm finding that past 1170/1500, the vddc adjust needed to step up gets rather steep. +31 for 1050/1450; +75 for 1170/1500; +88 for 1180/1500; +125 for 1200/1500; +200 for 1230/1650 which seems the max stable clocks for me.

Temp wise this is ok because I'm discovering that this card isn't so great at delivering low temps at load; but it excels at keeping temps tolerable even when you push it. The bigger issue is wattage.

Btw I have yet to game on this card lol. Just testing and testing. I think I've found the two profiles I like tho: 1050/1450 at +31 (the card comes +50 stock, so that's actually an undervolt) with fans set to 50% max as my quiet profile, and 1180/1500 at +88 with fans set to 80% max as my OC profile.
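Illustrating those offsets: the marginal voltage cost per extra MHz climbs quickly past 1170. A quick sketch in Python, using only the numbers from the post above:

```python
# Core clock (MHz) -> VDDC offset (+mV) pairs reported above.
steps = [(1050, 31), (1170, 75), (1180, 88), (1200, 125), (1230, 200)]

# Marginal voltage cost between consecutive steps, in mV per extra MHz.
rates = [(v1 - v0) / (c1 - c0) for (c0, v0), (c1, v1) in zip(steps, steps[1:])]

print([round(r, 2) for r in rates])  # [0.37, 1.3, 1.85, 2.5]
```

Each step costs several times more voltage per MHz than the last, which is why the quiet profile sits well below the knee of that curve.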


----------



## nightfox

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hmmm, is your PSU new? Maybe it's struggling to deliver good power at stock clocks? The eXtreme PSU calculator comes to 938W recommended.
> 
> Best brainstorm I could come up with, and it's not that great lol
> 
> Have an extra PSU? Maybe hook it up to the third card alone and see if your performance increases.


I don't think it's a lack of power, because I tried overclocking both the GPU and CPU. Here is my 3DMark 11:

http://www.3dmark.com/3dm11/7967107

Quote:


> Originally Posted by *Paul17041993*
> 
> I think dropping the memory slightly may allow a little more drop in core voltage, but being trifire those are likely already the most optimal settings. I definitely didn't expect an _increase_ in 3DMark scores like that, though; even your physics is almost 400 points higher...
> 
> I wonder what your total system power consumption is now compared to before, though. This might become a new trend for cooling-limited setups or the power-conscious...


I was also surprised about it. For physics, it could be because I OC'd my memory a bit, from stock 1600 to 1866. I can only guess.

I was really shocked at the graphics score:

30772.0 stock

31708.0 underclocked.

I can only guess that it is because of the PLX latency?

I'll try lowering memory to 1100 and running a benchmark.


----------



## Belkov

Quote:


> Originally Posted by *Prozillah*
> 
> Windforce definitely. Just try to create a really powerful vortex. Thw windforce cooler is excellent at pulling the heat away from the chips but it dumps it inside the case so a power suction is required to dump that hot air out. I run a push pull 280mm top mount rad on full power as well as a side panel 200mm fan all sucking hot air out. The room heats up something chronic!


I have a question related to your answer...








My case is a cheap Elite 430 but I am pleased with its cooling so far. I have one 120mm fan at the front, which blows in; two 120mm fans at the top, which blow out; one 120mm fan at the back, which blows out; one 120mm fan at the bottom, which blows in; and one 120mm fan at the side panel, which blows in. My card is a GB R9 290 OC WF3 and it runs at stock with a max temp of 75 degrees for the core and max 78 for VRM1. Do you think the temps will drop if I turn the side panel fan to suck air out? My fan configuration is set up like this due to a graphic of the cooling solution on the CM Elite 430's page. I will appreciate your answers. Thank you in advance.

Edit: I tested both situations and there is no change in the GPU temps. So do you think I should keep the side panel fan blowing in, or is it better to turn it to blow out?


----------



## Roboyto

Quote:


> Originally Posted by *nightfox*
> 
> I don't think it's a lack of power, because I tried overclocking both the GPU and CPU. Here is my 3DMark 11:
> 
> http://www.3dmark.com/3dm11/7967107
> I was also surprised about it. For physics, it could be because I *OC'd my memory a bit, from stock 1600 to 1866. I can only guess.*
> 
> I was really shocked at the graphics score
> 
> 30772.0 stock
> 
> 31708.0 underclocked.
> 
> I can only guess that it is because of the PLX latency?
> 
> I'll try lowering memory to 1100 and running a benchmark.


I'd wager this would make a difference. Try the underclocked settings with 1600 memory.


----------



## nightfox

OK, now I'm more than stumped.

Settings as follows:

-37mV core
-50% power limit
800 / 1100

and look at my 3DMark 11 score:

http://www.3dmark.com/3dm11/8012631

IT EVEN WENT UP A BIT. lol

PHYSICS went up!!

Comparing with the previous one:

http://www.3dmark.com/compare/3dm11/8012618/3dm11/8012349

- the only changes were core voltage from -31 to -37mV and GPU memory from 1250 to 1100

Graphics went down a bit.

Now I am more than puzzled.....


----------



## nightfox

Quote:


> Originally Posted by *Roboyto*
> 
> I'd wager this would make a difference. Try underclocked settings with 1600 memory


Ok, my bad. RAM was at stock. That was a few days ago when I tried to OC everything. Now the CPU and RAM are both at stock; GPU underclocked and undervolted.


----------



## Roboyto

Quote:


> Originally Posted by *nightfox*
> 
> Ok, my bad. RAM was at stock. That was a few days ago when I tried to OC everything. Now the CPU and RAM are both at stock; GPU underclocked and undervolted.


Hmmm, change of drivers lately?

Is it consistently testing higher underclocked? Did you disable tessellation?


----------



## nightfox

Quote:


> Originally Posted by *Roboyto*
> 
> Hmmm, change of drivers lately?
> 
> Is it consistently testing higher underclocked? Did you disable tessellation?


Nope, I always use the same drivers, 13.12. And no, I didn't disable tessellation. I don't even know how to disable it.







I didn't change any settings at all.


----------



## Roboyto

Quote:


> Originally Posted by *nightfox*
> 
> OK, now I'm more than stumped.
> 
> Settings as follows:
> 
> -37mV core
> -50% power limit
> 800 / 1100
> 
> and look at my 3DMark 11 score:
> 
> http://www.3dmark.com/3dm11/8012631
> 
> IT EVEN WENT UP A BIT. lol
> 
> PHYSICS went up!!
> 
> Comparing with the previous one:
> 
> http://www.3dmark.com/compare/3dm11/8012618/3dm11/8012349
> 
> - the only changes were core voltage from -31 to -37mV and GPU memory from 1250 to 1100
> 
> Graphics went down a bit.
> 
> Now I am more than puzzled.....


Your graphics score is up with the 1250 RAM speed; that makes sense.

Your whole score went down because you lost 1.0 FPS in the physics test. I have seen this fluctuation in the 100+ 3DMark runs I've done testing my 4770k and 290.

1.0 FPS doesn't seem like much, but when you're referencing 29.0, 1.0 is over a 3% drop. Your 28.0 FPS score of 8843 + 3% is very close to your physics score at 29.0 FPS.
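That back-of-the-envelope check can be written out in a few lines. The numbers come from the runs quoted above; the linear FPS-to-sub-score scaling is an assumption:

```python
# Physics test figures quoted above.
fps_low, fps_high = 28.0, 29.0
score_low = 8843  # physics sub-score at 28.0 FPS

# A 1.0 FPS swing relative to 29.0 FPS:
drop = (fps_high - fps_low) / fps_high
print(f"{drop:.1%}")  # 3.4%, i.e. over a 3% change

# If the sub-score scales roughly linearly with FPS, the 29.0 FPS
# run should land near:
estimate = score_low * fps_high / fps_low
print(round(estimate))  # 9159
```

So a single FPS of run-to-run noise in the physics test plausibly accounts for the whole swing in the sub-score.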


----------



## Roboyto

Quote:


> Originally Posted by *Tobiman*
> 
> P.S. I can confirm that 13.12 WHQL is much much more stable than 14.1. Ran all benchmarks and games at 1175/1500mhz. *Will be pushing for higher stable clocks.*


Yes, those 14.1 drivers are no bueno presently... and more hertz please 8)


----------



## Shurtugal

Hey guys, hopefully I'll be joining soon. I've ordered my Gigabyte Windforce 290; with luck it will come on Monday, and without, the following Monday.
This is replacing my 280X Matrix, as I'm changing to ITX using the RVZ01.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Shurtugal*
> 
> Hey guys, hopefully I'll be joining soon, I've ordered my Gigabyte w/Windforce 290, with luck it will come on Monday, without, the following Monday.
> This is replacing my 280X Matrix, as I'm changing to ITX using the RVZ01.


Atra esterní ono thelduin, Shur'tugal

Hope it all goes well!


----------



## Shurtugal

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Atra esterní ono thelduin, Shur'tugal
> 
> Hope it all goes well!


Haha, I'm impressed!








And thanks, I'm really hoping for this monday!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Shurtugal*
> 
> Haha, I'm impressed!
> 
> 
> 
> 
> 
> 
> 
> 
> And thanks, I'm really hoping for this monday!


I'm a big fan of the Inheritance Cycle.









Been a while since I read them through; I probably should again soon.

I take it you ordered from PCCG?


----------



## nightfox

Quote:


> Originally Posted by *Roboyto*
> 
> Your graphics score is up with the 1250 RAM speed; that makes sense.
> 
> Your whole score went down because you lost 1.0 FPS in the physics test. I have seen this fluctuation in the 100+ 3DMark runs I've done testing my 4770k and 290.
> 
> 1.0 FPS doesn't seem like much, but when you're referencing 29.0, 1.0 is over a 3% drop. Your 28.0 FPS score of 8843 + 3% is very close to your physics score at 29.0 FPS.


Actually, the whole score went up:

P18822 at 800 / 1100 vs P18690 at 800 / 1250.

And I lowered my memory further just for a quick test:

800 / 1000 at -50mV/-50%

and surprisingly the graphics score and physics went up further:

http://www.3dmark.com/3dm11/8012696

This is the comparison with the previous one:

http://www.3dmark.com/compare/3dm11/8012696/3dm11/8012618

Oh, and when I lowered memory even more, to 900, the result went down:

http://www.3dmark.com/3dm11/8012731

I think I will settle for 800 / 1000.

With these settings, the cards run a bit cooler.


----------



## Shurtugal

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm a big fan of the inheritance cycle
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Been a while since i read them through, probably should again soon.
> 
> I take it you ordered from PCCG?


I'm the same, they are great books!

I'm ordering the rest of my rig from PCCG. I got the 290 from Centrecom, mainly because I was hoping they would have one in stock so I could try it out over the weekend.








Generally speaking, I order almost everything else from PCCG!


----------



## Mercy4You

Quote:


> Originally Posted by *LTC*
> 
> Seems like a bad overclock then..


I think it's a bad overclock. Had the same issue with my previous GPU in BF3.

BF4 had multiple crashes in the beginning, but after 10+ updates it's completely stable now...


----------



## Sgt Bilko

Quote:


> Originally Posted by *Shurtugal*
> 
> I'm the same, they are great books!
> 
> I'm ordering the rest of my Rig from PCCG, I got the 290 from Centrecom, mainly because I was hoping they would have one in stock so I could try it out over the weekend
> 
> 
> 
> 
> 
> 
> 
> 
> Generally speaking, I order almost everything else from PCCG!


As do I.
I'm really having to push my CPU to keep up with the 290's, so I might need to go for a custom loop soon....


----------



## Shurtugal

Quote:


> Originally Posted by *Sgt Bilko*
> 
> As do i,
> I'm really having to push my CPU to keep up with the 290's so i might need to go for a custom loop soon....


Yes, haha, is your CPU running constantly at 5.0 GHz? And if so, I'm guessing an AIO WC unit?
This is assuming you are using the 8350 from your sig rig.









Edit: Just saw the H100i in sig rig.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Shurtugal*
> 
> Yes, haha, is your CPU running constantly at 5.0 GHz, and if so I'm guessing an AIO WC unit?
> This is assuming you are using the 8350 from your sig rig


In summer I have to drop it back to 4.6 or 4.8, but generally I run it at 5.0 all the time.

The H100i just can't keep it cool enough









I can bench at 5.1 provided I get some nice cool temps though.


----------



## Shurtugal

Quote:


> Originally Posted by *Sgt Bilko*
> 
> In summer i have to drop it back to 4.6 or 4.8 but generally i run it at 5.0 all the time.
> 
> The H100i just can't keep it cool enough
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can bench at 5.1 provided i get some nice cool temps though.


Fair enough. I was using an H100i for my 3570k, and I got that up to 4.5GHz fairly comfortably; I didn't really try to go higher though because of the voltage.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Shurtugal*
> 
> Fair enough, I was using a H100i for my 3570k, and I got that up to 4.5GHz fairly comfortably, I didn't really try to go higher though because of the Voltage.


The FX chips pump out some serious heat once you start climbing. Here's hoping the next series they bring out stays a little cooler.


----------



## Shurtugal

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The FX chips pump out some serious heat once you start climbing, Here's hoping the next series they bring out likes to stay a little cooler


Hopefully! I've always been Intel/Nvidia, but I quite liked my 280X; it and the 290 are priced well against the 770 and 780. The new APUs are looking pretty good too.


----------



## Matt-Matt

Does anyone else find COD: Ghosts unplayable? Tried both overclocked and stock and it's the same issue: join a game for 2 seconds, crash (black screen), and then I have to open Task Manager and it's all good again. Well, sort of; it still shows 100% GPU usage, but it's closed.

Maybe it's just the 14.1's that I'm on.. Not sure.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Matt-Matt*
> 
> Does anyone else get COD: Ghosts being unplayable? Tried both overclocked and stock and both are the same issue.. Join a game for 2 seconds, crash (black screen) and then I have to open task manager and it's allgood again. Well sort of, it still is using 100% GPU usage, but it's closed.
> 
> Maybe it's just the 14.1's that I'm on.. Not sure.


It's either 14.1 or Ghosts in general, I've heard some funny stuff goes on with that game.......


----------



## Matt-Matt

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's either 14.1 or Ghosts in general, I've heard some funny stuff goes on with that game.......


Yeah, my housemate has a 4670k and a 760 and it runs pretty fine apart from the sound cutting in/out.
It's only a free weekend on steam, happy I didn't buy it!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Matt-Matt*
> 
> Yeah, my housemate has a 4670k and a 760 and it runs pretty fine apart from the sound cutting in/out.
> It's only a free weekend on steam, happy I didn't buy it!


Yeah, I saw that......not really worth wasting my data cap on it tbh.









maybe when it drops to $5 or something but not within the next 12 months


----------



## rdr09

Quote:


> Originally Posted by *nightfox*
> 
> Well I did try undervolting and underclocking. I got a black screen past -50mV in AB.
> 
> Here are my settings:
> 
> Core voltage -31
> Powerlimit -50%
> Core clock 800
> Memory Clock 1250
> 
> I ran a 3DMark 11 benchmark:
> 
> http://www.3dmark.com/3dm11/8012349
> 
> I was so surprised that my earlier 3DMark test at stock clocks (975) was a bit lower:
> 
> http://www.3dmark.com/3dm11/8001317
> 
> http://www.3dmark.com/compare/3dm11/8001317/3dm11/8012349
> 
> shocking


Could be that the bottleneck is lessened when the GPUs are slowed down; the combined score shows it.


----------



## Matt-Matt

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah i seen that......not really worth wasting my data cap on it tbh
> 
> 
> 
> 
> 
> 
> 
> 
> 
> maybe when it drops to $5 or something but not within the next 12 months


We have unlimited here, it works out quite cheap when there are 3 people. Plus my other housemate copied it off me









But yeah, totally not worth the download if you have a limit, it'd be good if you knew someone with the disc though.


----------



## Belkov

Quote:


> Originally Posted by *Belkov*
> 
> I have a question related to your answer...
> 
> 
> 
> 
> 
> 
> 
> 
> My case is a cheap Elite 430 but I am pleased with its cooling so far. I have one 120mm fan at the front, which blows in; two 120mm fans at the top, which blow out; one 120mm fan at the back, which blows out; one 120mm fan at the bottom, which blows in; and one 120mm fan at the side panel, which blows in. My card is a GB R9 290 OC WF3 and it runs at stock with a max temp of 75 degrees for the core and max 78 for VRM1. Do you think the temps will drop if I turn the side panel fan to suck air out? My fan configuration is set up like this due to a graphic of the cooling solution on the CM Elite 430's page. I will appreciate your answers. Thank you in advance.
> 
> Edit: I tested both situations and there is no change in the GPU temps. So do you think I should keep the side panel fan blowing in, or is it better to turn it to blow out?


Would someone give me some advice about the quote above?


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> Could be that the bottleneck is lessened when the GPUs are slowed down; the combined score shows it.


Want to see something weird?

http://www.3dmark.com/compare/3dm11/8013125/3dm11/8013171

Overclocking my 290's netted me a better Physics score.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Matt-Matt*
> 
> We have unlimited here, it works out quite cheap when there are 3 people. Plus my other housemate copied it off me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But yeah, totally not worth the download if you have a limit, it'd be good if you knew someone with the disc though.


70GB a month.....have to be very picky


----------



## rdr09

Quote:


> Originally Posted by *Belkov*
> 
> Would someone gives me an advice about the quoted over?


Leave it blowing out. The GPU cooler sucks air from the bottom and blows it out the top, out toward the side panel. Have you tried turning off the bottom fan? It could be sucking hot air from the PSU.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Want to see something weird?
> 
> http://www.3dmark.com/compare/3dm11/8013125/3dm11/8013171
> 
> Overclocking my 290's netted me a better Physics score.


In the case of two 290s, yes; I've seen that happen when I CrossFired the 7900 series cards. It could be a different case with three. Not sure.


----------



## JordanTr

Hello everybody. Yesterday I went to the Sapphire website to check if there is any BIOS update for my reference design card (it's on a Gelid + be quiet! fans now). After I found nothing new I checked the drivers for Win 8.1 x64, and I found there are drivers called v13.12c with a release date of February 12th. Any idea about them? Because they don't exist on AMD's website.


----------



## nightfox

Quote:


> Originally Posted by *rdr09*
> 
> Could be that the bottleneck is lessened when the GPUs are slowed down; the combined score shows it.


CPU bottleneck? Well, I have monitored my CPU usage before when I overclocked my GPUs, played BF4 multiplayer, and ran 3DMark 11; CPU usage doesn't even reach 95%.

I am really puzzled after seeing it, to be honest. And as for the PSU, I have tried running Unigine Heaven (full screen, for trifire) + Cinebench, and I don't experience a shutdown. I have tried
Unigine + OCCT (power supply) as well, with no shutdown. That is why I am so confident about my PSU.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Want to see something weird?
> 
> http://www.3dmark.com/compare/3dm11/8013125/3dm11/8013171
> 
> Overclocking my 290's netted me a better Physics score.


A bit weird on your results. I can see that the clock settings and memory are the same. Same settings but different results.


----------



## Belkov

Quote:


> Originally Posted by *rdr09*
> 
> Leave it blowing out. The GPU cooler sucks air from the bottom and blows it out the top, out toward the side panel. Have you tried turning off the bottom fan? It could be sucking hot air from the PSU.


I installed the bottom fan two days ago and I'm seeing an improvement of 2 degrees for VRM1, so it is OK. But there is no difference between the side fan blowing out and blowing in.


----------



## Paul17041993

Quote:


> Originally Posted by *JordanTr*
> 
> Hello everybody. Yesterday I went to the Sapphire website to check if there is any BIOS update for my reference design card (it's on a Gelid + be quiet! fans now). After I found nothing new I checked the drivers for Win 8.1 x64, and I found there are drivers called v13.12c with a release date of February 12th. Any idea about them? Because they don't exist on AMD's website.


The latest driver release is 13.12 and the beta is 14.1 (haven't checked the specific build lately). You get these from the AMD site only; don't bother with the manufacturer sites for drivers (for BIOS and software, however, it's good to look at your manufacturer's site).

13.12 is highly recommended as it currently has the best stability; 14.1 has a lot of broken parts and should only be used to test Mantle:
http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64
This is for Win 7/8/8.1 x64.


----------



## Sgt Bilko

Quote:


> Originally Posted by *nightfox*
> 
> CPU bottleneck? Well, I have monitored my CPU usage before when I overclocked my GPUs, played BF4 multiplayer, and ran 3DMark 11; CPU usage doesn't even reach 95%.
> 
> I am really puzzled after seeing it, to be honest. And as for the PSU, I have tried running Unigine Heaven (full screen, for trifire) + Cinebench, and I don't experience a shutdown. I have tried
> Unigine + OCCT (power supply) as well, with no shutdown. That is why I am so confident about my PSU.
> A bit weird on your results. I can see that the clock settings and memory are the same. Same settings but different results.


Sorry, the higher result is 1200/1600 for the 290's; CPU/RAM etc. is all the same.


----------



## nightfox

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Sorry, the higher result is 1200/1600 for the 290's; CPU/RAM etc. is all the same.


Yeah, I figured Futuremark's SystemInfo didn't detect the hardware info well.


----------



## rdr09

Quote:


> Originally Posted by *nightfox*
> 
> CPU bottleneck? Well, I have monitored my CPU usage before when I overclocked my GPUs, played BF4 multiplayer, and ran 3DMark 11; CPU usage doesn't even reach 95%.


Bottleneck can occur even if the CPU is not pegging, AFAIK. I know you are underclocking and leaving the CPU at stock, but I've seen scores higher than that with just two 290s. Even higher. They may be OC'ed, but still. Your combined score is about 2600 points lower than mine with an OC'ed 290 and an i7 SB.

Quote:


> Originally Posted by *Belkov*
> 
> I installed the bottom fan two days ago and I'm seeing an improvement of 2 degrees for VRM1, so it is OK. But there is no difference between the side fan blowing out and blowing in.


Your temp is not crazy high, but it sucks that you might not be able to OC if you want to and if the GPU is overclockable. Have you played with the fans on top?


----------



## LTC

Running my 1100/1400 clocks in BF4 again without a problem; let's see how long that keeps working.


----------



## Ilsanto86

Hi Arizonian, my R9 290X is on water, not Windforce. Can you change me please?


----------



## Belkov

Quote:


> Originally Posted by *rdr09*
> 
> Your temp is not crazy high, but it sucks that you might not be able to OC if you want to and if the GPU is overclockable. Have you played with the fans on top?


What do you mean? To turn them to blow in? I am not sure this would be right; after all, the hot air goes up. The GPU is overclockable. I can make 1150/1500 easily with +100mV, with temps around 82 for the core and 85-88 for VRM1.


----------



## pkrexer

Crysis 3 is a better stress test in my personal experience. My GPU is typically 3-4C hotter than in BF4 after about an hour of gameplay. If I can survive an hour of Crysis 3, I'm pretty satisfied hehe.


----------



## Jack Mac

Quote:


> Originally Posted by *pkrexer*
> 
> Crysis 3 is a better stress test in my personal experience. My GPU is typically 3-4C hotter than in BF4 after about an hour of gameplay. If I can survive an hour of Crysis 3, I'm pretty satisfied hehe.


Yep, Crysis 3 is a great benchmark.


----------



## JordanTr

Quote:


> Originally Posted by *Paul17041993*
> 
> The latest driver release is 13.12 and the beta is 14.1 (haven't checked the specific build lately). You get these from the AMD site only; don't bother with the manufacturer sites for drivers (for BIOS and software, however, it's good to look at your manufacturer's site).
> 
> 13.12 is highly recommended as it currently has the best stability; 14.1 has a lot of broken parts and should only be used to test Mantle:
> http://support.amd.com/en-us/download/desktop?os=Windows+7+-+64
> This is for Win 7/8/8.1 x64.


I know all that; they also have the regular 13.12. I just thought maybe they made some improvements to it, or put Mantle support on top of 13.12 and named it 13.12c.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jack Mac*
> 
> Yep, Crysis 3 is a great benchmark.


Only game that puts both my 290's at 100% usage consistently.

The 8350 bottlenecks them everywhere else.


----------



## rdr09

Quote:


> Originally Posted by *Belkov*
> 
> What do you mean? To turn them to blow in? I am not sure this would be right; after all, the hot air goes up. The GPU is overclockable. I can make 1150/1500 easily with +100mV, with temps around 82 for the core and 85-88 for VRM1.


Your temps are still acceptable even OC'ed, so I would not be too concerned. 1150 is fast. The "hot air goes up" effect is easily negated by the fans in such a small area. With the number of fans you have, no hot air will stay in your case long enough to matter.


----------



## Roboyto

Quote:


> Originally Posted by *nightfox*
> 
> Actually, the whole score went up:
> 
> P18822 at 800 / 1100 vs P18690 at 800 / 1250.




Yes, the overall score went up because your physics score improved on the 800/1100 run; the 800/1250 run did have a better graphics score. 299 points on the physics sub-score has more bearing on the overall score than 242 points on the graphics sub-score.

I'm not sure what causes the variation in the physics test, but I have experienced this as well. Mine fluctuates between 35.8 and 37.1 for my 4770k @ 4.5GHz with 2400MHz RAM, and those settings haven't changed at all across every 3DMark 11 run I've done. I know it isn't temperature, because my card is naked and temps don't exceed 55C in benchmarks.
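For the curious: 3DMark 11's overall P score is a weighted harmonic mean of the sub-scores, which is why the much smaller physics sub-score can have more leverage per point than the graphics sub-score. A rough sketch below; the 0.835/0.085/0.08 weights are my recollection of the 3DMark 11 whitepaper and the sub-scores are made-up round numbers in the ballpark of the posted runs, so treat both as assumptions:

```python
# Assumed 3DMark 11 sub-score weights (verify against the whitepaper).
W = {"graphics": 0.835, "physics": 0.085, "combined": 0.080}

def p_score(sub):
    # Weighted harmonic mean: small sub-scores drag the total down hardest.
    return sum(W.values()) / sum(W[k] / sub[k] for k in W)

# Illustrative sub-scores, roughly in the range discussed above.
base = {"graphics": 31000, "physics": 8900, "combined": 9000}

def gain(key, delta=100):
    # How much the overall score moves per +delta points in one sub-score.
    bumped = dict(base, **{key: base[key] + delta})
    return p_score(bumped) - p_score(base)

print(gain("physics") > gain("graphics"))  # True for these numbers
```

With a harmonic mean the sensitivity to a sub-score goes as weight/score², so even with a small weight the ~8900-point physics score moves the total more per point than the ~31000-point graphics score.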


----------



## vieuxchnock

*What is the maximum safe temperature for the VRMs under load on a 290 (1175/1500)? I reach 80 degrees in 3DMark Fire Strike. I'm under water.*


----------



## rdr09

Quote:


> Originally Posted by *vieuxchnock*
> 
> *What is the maximum safe temperature for the VRMs under load on a 290 (1175/1500)? I reach 80 degrees in 3DMark Fire Strike. I'm under water.*


What block? You should not see even 60 in that bench.


----------



## vieuxchnock

That's what I have.


Kryographics


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Matt-Matt*
> 
> Yeah, my housemate has a 4670k and a 760 and it runs pretty fine apart from the sound cutting in/out.
> It's only a free weekend on steam, happy I didn't buy it!
> 
> 
> 
> Yeah, I saw that......not really worth wasting my data cap on it tbh.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> maybe when it drops to $5 or something but not within the next 12 months

My unlimited data plan has been changed to a 10GB cap for no reason.







Their excuse is enforcing a "Fair Usage Policy" which did not yet exist when I signed up for the unlimited plan years ago.

My 290's have been idling for 2 - 3 days now. Not in the mood to play any games because the internet connection is very poor. Surfing & downloading are very good, but in gaming the ping fluctuates very badly.









Regarding the weird benching results, my 290's are somehow getting a lower score in Valley at the same clocks. For example, for 1200/1600 I got 5115, but now if I'm lucky I can only get a 50XX score. Average FPS was 122.3 but now it has dropped to 120 or 119.











----------



## Roboyto

Quote:


> Originally Posted by *vieuxchnock*
> 
> *What is the maximum safe temperature for the VRMs under load on a 290 (1175/1500)? I reach 80 degrees in 3DMark Fire Strike. I'm under water.*


They are rated for 120C I believe, but once you cross the 90-degree mark you can start to hurt stability or overclockability.

Full cover water or attached AIO? 80 is a tad on the high side for full cover water from my experience. I was maxing at 72 with a heavy overclock of 1295/1700 +200mV offset.

If you're full cover and you feel like spending a few dollars, and the time to remove/reinstall your block, the Fujipoly Ultra Extreme thermal pads made a significant difference for me; about 23% lower temperatures for VRM1. My VRM1 temps now are nearly always identical to core temperature.

check out this link for my post:

http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures#post_21821258


----------



## rdr09

Quote:


> Originally Posted by *vieuxchnock*
> 
> That's what I have.
> 
> 
> Kryographics


You sure you used the right size thermal pads? What sizes are they? I have an EK that uses two sizes, 1mm and 0.5mm.

I just ran the bench with the same OC, but I had just turned on my rig, so that may have affected the low temp, and I have HT off on the CPU . . .



It could also be the amount of vcore. How much?

I'll run it again showing max temp. Wait a sec.

EDIT: it does not even see 40. Room temp is about 24 . . .



You need to remount or check your flow.


----------



## Belkov

Can someone recommend me a VGA BIOS editor? I want to check two BIOSes.


----------



## Brian18741

So I swapped out the TIM on my R9 290 DCU IIs today, using MX-4. So far the results leave a lot to be desired. After an hour and a bit of gaming:

*Old results;*

Top Card
Core 81°C
VRM 1 77°C
VRM 2 67°C

Bottom Card
Core 66°C
VRM 1 62°C
VRM 2 61°C

*New results;*

Top Card
Core 87°C
VRM 1 80°C
VRM 2 69°C

Bottom Card
Core 68°C
VRM 1 63°C
VRM 2 62°C

The core on the top card peaked at 87°C but hovered around 83°C for the most part, which is still 2°C higher than yesterday. That seems pretty typical across the board; I'm seeing an increase of about 2-3°C everywhere.


----------



## kizwan

Quote:


> Originally Posted by *Brian18741*
> 
> So I swapped out the TIM on my R9 290 DCU IIs today, using MX-4. So far the results leave a lot to be desired. After an hour and a bit of gaming:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *Old results;*
> 
> Top Card
> Core 81°C
> VRM 1 77°C
> VRM 2 67°C
> 
> Bottom Card
> Core 66°C
> VRM 1 62°C
> VRM 2 61°C
> 
> *New results;*
> 
> Top Card
> Core 87°C
> VRM 1 80°C
> VRM 2 69°C
> 
> Bottom Card
> Core 68°C
> VRM 1 63°C
> VRM 2 62°C
> 
> 
> 
> The core on the top card peaked at 87°C but hovered around 83°C for the most part, which is still 2°C higher than yesterday. That seems pretty typical across the board; I'm seeing an increase of about 2-3°C everywhere.


Based on my experience with MX-4, I don't think it works well with a hot card/chip like the 290's. I have a first-gen Intel Core quad-core processor that always runs hot & MX-4 doesn't help it either. Switching to Shin-Etsu G751 TIM, it runs a lot cooler. It's not a bad mount, because I already tried hundreds of mounts using different methods. G751 has half the thermal conductivity of MX-4, but G751 is thicker than MX-4. Try a different TIM, preferably thicker than MX-4, like Gelid GC-Extreme.

I've run out of G751 TIM. Going to get ICD 7 TIM later. Will use this on my 290's cores when my Fujipoly pads arrive.


----------



## LTC

Quote:


> Originally Posted by *Brian18741*
> 
> So I swapped out the TIM on my R9 290 DCU IIs today, using MX-4. So far the results leave a lot to be desired. After an hour and a bit of gaming:
> 
> *Old results;*
> 
> Top Card
> Core 81°C
> VRM 1 77°C
> VRM 2 67°C
> 
> Bottom Card
> Core 66°C
> VRM 1 62°C
> VRM 2 61°C
> 
> *New results;*
> 
> Top Card
> Core 87°C
> VRM 1 80°C
> VRM 2 69°C
> 
> Bottom Card
> Core 68°C
> VRM 1 63°C
> VRM 2 62°C
> 
> The core on the top card peaked at 87°C but hovered around 83°C for the most part, which is still 2°C higher than yesterday. That seems pretty typical across the board; I'm seeing an increase of about 2-3°C everywhere.


Won't removing the cooler to replace the thermal paste void your warranty?


----------



## vieuxchnock

Quote:


> Originally Posted by *rdr09*
> 
> You sure you used the right size thermal pads? What sizes are they? I have an EK that uses two sizes, 1mm and 0.5mm.
> 
> I just ran the bench with the same OC, but I had just turned on my rig, so that may have affected the low temp, and I have HT off on the CPU . . .
> 
> 
> 
> It could also be the amount of vcore. How much?
> 
> I'll run it again showing max temp. Wait a sec.
> 
> EDIT: it does not even see 40. Room temp is about 24 . . .
> 
> 
> 
> You need to remount or check your flow.


Aquacomputer doesn't use thermal pads on the core, only compound; they use thermal pads on the VRMs only.

This is my setting:


----------



## rdr09

Quote:


> Originally Posted by *vieuxchnock*
> 
> Aquacomputer uses compound rather than thermal pads, with pads on the VRMs only.
> 
> This is my setting:


same with ek, but ek uses one size of thermal pad for the vrms and another for the memory. follow this thread . . .

http://www.overclock.net/t/1438042/vc-aqua-computer-adds-r9-290x-full-cover-block-to-product-line/60

Shoggy is Aqua Computer's rep here on OCN. 80s are temps for air-cooled 290s.


----------



## Brian18741

Quote:


> Originally Posted by *kizwan*
> 
> Based on my experience with MX-4, I don't think it will work well with a hot card/chip like the 290s. I have a first-gen Intel Core quad-core processor that always runs hot, and MX-4 didn't help there either. After switching to Shin-Etsu G751 TIM, it runs a lot cooler. It's not a bad mount, because I've already tried hundreds of mounts using different methods. G751 has about half the thermal conductivity of MX-4, but it's a thicker compound. Try a different TIM, preferably thicker than MX-4, like Gelid GC-Extreme.
> 
> I've run out of G751. I'm going to get some ICD 7 later and will use it on my 290's core when my Fujipoly pads arrive.


Thanks for the tip, dude. I just ordered some Gelid GC-Extreme and will swap it in as soon as it arrives to see how we get on. I'd forgotten to install the extra 120mm fan I bought, so I popped it in a while ago and we're down another bit: the core is the same, but the VRMs are down 2°C on the top card, and 3°C and 5°C on the bottom.

Quote:


> Originally Posted by *LTC*
> 
> Won't removing the cooler to replace the thermal paste void your warranty?


Probably; there was a sticker over one of the screws that I had to break. But I'll be putting them under water eventually, so it was going either way!


----------



## vieuxchnock

I have ordered an active backplate and am waiting for it. From the reading I've done, I think it should help. At idle the GPU temp is 39°C and the CPU is 37°C. Maybe adding another rad will also help.


----------



## rdr09

Quote:


> Originally Posted by *vieuxchnock*
> 
> I have ordered an active backplate and am waiting for it. From the reading I've done, I think it should help. At idle the GPU temp is 39°C and the CPU is 37°C. Maybe adding another rad will also help.


I think even without the active backplate your temp should not go that high. those thermal pads have plastic shields on both sides - you took them off, right?

anyway, pm Shoggy and see if he can assist you further. i'd recommend putting your gpu back to stock, though.


----------



## vieuxchnock

Quote:


> Originally Posted by *rdr09*
> 
> I think even without the active backplate your temp should not go that high. those thermal pads have plastic shields on both sides - you took them off, right?
> 
> anyway, pm Shoggy and see if he can assist you further. i'd recommend putting your gpu back to stock, though.


I've just done a 3DMark run at stock settings; the highest VRM temp was 73°C and the GPU 63°C.


----------



## Zell84

New to the forums. Just wanted to validate my R9 290X. Here's a screenshot of GPU-Z with my name. I look forward to the great info in this thread.


----------



## Cybertox

As an owner of an R9 290X, I still don't understand why they would provide a switch on the card to toggle between two different BIOS modes. For someone like me who goes for performance, it's obvious that I want my GPU to use its full capacity, so why would I need a mode with lower fan speed and a lower clock rate? Next time, just ship lower operating temperatures and quieter fans so that users won't have to decide.
Plus, a user can be misinformed, and then their GPU would perform slightly worse in games, meaning lower FPS, ranging from 1 to 8.


----------



## Belkov

For me this is truly stupid. Nobody who buys a 290(X) will use it just for movies, the net and so on... so this silent mode is completely useless. I've already flashed another BIOS over it.


----------



## Paul17041993

Quote:


> Originally Posted by *Cybertox*
> 
> As an owner of an R9 290X, I still don't understand why they would provide a switch on the card to toggle between two different BIOS modes. For someone like me who goes for performance, it's obvious that I want my GPU to use its full capacity, so why would I need a mode with lower fan speed and a lower clock rate? Next time, just ship lower operating temperatures and quieter fans so that users won't have to decide.
> Plus, a user can be misinformed, and then their GPU would perform slightly worse in games, meaning lower FPS, ranging from 1 to 8.


the dual-BIOS is an AMD feature on their top-end cards, intended to allow flashing without the risk of losing the card to a bad BIOS; I found it very useful on my 7970.

due to the character of the 290X, AMD also decided to ship them with different fan profiles by default, so people aren't forced to use CCC to switch between silence and performance, though mine only has a 40% fan cap on both BIOSes for whatever reason...


----------



## Arizonian

Quote:


> Originally Posted by *Ilsanto86*
> 
> Hi Arizonian my R9 290x is on water not windforce, can you change me please?










...... Corrected









Quote:


> Originally Posted by *Zell84*
> 
> New to the forums. Just wanted to validate my R9 290X. Here's a screenshot of GPU-Z with my name. I look forward to the great info in this thread.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









*"How to put your Rig in your Sig"*


----------



## Roboyto

Quote:


> Originally Posted by *Brian18741*
> 
> So swapped out the TIM on my R9 290 DCU II's today. Using MX-4 TIM. So far results leave a lot to be desired. After an hour and a bit of gaming
> 
> The core on the top cards peaked at 87°C but hovered around 83°C for the most part which is still 2°C higher than yesterday which seems pretty typical across the board, I'm seeing an increase of about 2 - 3°C everywhere.


Surprising results indeed, especially considering the Arctic Accelero Xtreme III comes with MX-4 pre-applied; we know the Xtreme III works well on these cards.

Hopefully the Gelid will give better results. If it doesn't then we'll really know how inadequate the DC2 cooler is on the 290.


----------



## Roboyto

Quote:


> Originally Posted by *Brian18741*
> 
> I forgot to install the extra 120mm fan bought so popped it in there a while ago and we're down another bit, the core is the same but the VRMs are down 2°C on top and 3 and 5°C lower on the bottom.
> 
> Probably, there was a sticker over one of the screws that I had to break. But I'll be putting them under water eventually so it was going to go either way!


Glad to see the bottom fan made noticeable improvements.

I would check with ASUS regarding their warranty policy. I recently RMA'd my HD 7870 DC2 that was previously under water. I just put the stock cooler back on and there were no issues with the RMA whatsoever. Calling in an RMA also makes things move much quicker than an online submission.


----------



## Roboyto

Quote:


> Originally Posted by *vieuxchnock*
> 
> I have ordered an active backplate and waiting for it.After the reading I think is should help.On Idle the temp is 39 and the CPU is 37. Maybe adding another rad will also help.


Have some pics of your rig? Maybe changing orientation of fans could help lower temperatures for you.

How much rad space are you working with, and what does your loop consist of?


----------



## Roboyto

Quote:


> Originally Posted by *Belkov*
> 
> What do you mean? To turn them to blow in? I'm not sure that's right - after all, the hot air goes up. The GPU is overclockable. I can do 1150/1500 easily with +100mV, with temps around 82 for the core and 85-88 for VRM1.




Do you have a fan on your side panel, or exhaust fans up top?

Can you change the orientation of your CPU cooler to exhaust upwards?

You would likely benefit from a fan on the bottom of the case to increase airflow.

A bigger improvement would be to get some more airflow coming from the front of the case above the first fan to get a breeze coming across the front of the card for the VRM1, and across the backside of the card. It doesn't look like you have a mount for one, but rigging something up temporarily could be worth seeing what kind of results are possible and if it's worth it to make a modification to accommodate another fan there permanently.


----------



## Jack Mac

Well, after owning both a reference 290 and now a reference 780, I must say that they are much closer than people think, but the 290 is still the faster card. 290 --> 780 is like going from 290 to 290X. Mainly switched because of noise (family was complaining) and it'd be cheaper than watercooling my 290, plus I wanted to have something new to play with. Ideally, I would have preferred an aftermarket 290, but I wasn't about to use integrated for 2-3 weeks because they're backordered, OOS, or overpriced.


----------



## Roboyto

Quote:


> Originally Posted by *Jack Mac*
> 
> Well, after owning both a reference 290 and now a reference 780, I must say that they are much closer than people think, but the 290 is still the faster card. 290 --> 780 is like going from 290 to 290X. Mainly switched because of noise (family was complaining) and it'd be cheaper than watercooling my 290, plus I wanted to have something new to play with. Ideally, I would have preferred an aftermarket 290, but I wasn't about to use integrated for 2-3 weeks because they're backordered, OOS, or overpriced.


It's unfortunate that cryptocurrency has thrown everything out of whack.

It's pretty amazing the 290(X) performs so well compared to their Green counterparts, especially when you take into consideration the die size and transistor count.


----------



## Jack Mac

Yeah it's pretty impressive. I loved my 290.


----------



## Belkov

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> Do you have a fan on your side panel, or exhaust fans up top?
> 
> Can you change the orientation of your CPU cooler to exhaust upwards?
> 
> You would likely benefit from a fan on the bottom of the case to increase airflow.
> 
> A bigger improvement would be to get some more airflow coming from the front of the case above the first fan to get a breeze coming across the front of the card for the VRM1, and across the backside of the card. It doesn't look like you have a mount for one, but rigging something up temporarily could be worth seeing what kind of results are possible and if it's worth it to make a modification to accommodate another fan there permanently.


This is an old pic:
Quote:


> Originally Posted by *Belkov*
> 
> I have a question related to your answer...
> 
> 
> 
> 
> 
> 
> 
> 
> My case is a cheap Elite 430, but I am pleased with its cooling so far. I have one 120mm fan at the front, which blows in; two 120mm fans at the top, which blow out; one 120mm fan at the back, which blows out; one 120mm fan at the bottom, which blows in; and one 120mm fan on the side panel, which blows in. My card is a GB R9 290 OC WF3 and it runs at stock with a max core temp of 75 degrees and a max of 78 for VRM1. Do you think the temps will drop if I turn the side panel fan to exhaust? My fan configuration follows a cooling diagram on the CM Elite 430 page. I will appreciate your answers. Thank you in advance.
> 
> Edit: I tested both situations and there is no change in the GPU temps. So do you think I should keep the side panel fan blowing in, or turn it around to blow out?




And please don't mention the dust - I know I must clean it soon...


----------



## Roboyto

Quote:


> Originally Posted by *Belkov*
> 
> This is an old pic:
> 
> And please don't mention the dust - i know i must clean it soon...


Sorry, I hadn't seen that original post; this thread moves quickly.









I still think your best bet is to get more air to the front end of the card by VRM1.



Maybe move your HDDs up to the 5.25" bays with some adapters, or some craftiness, so the airflow from your single front fan isn't restricted. Also, if I read correctly, your case can fit a 140mm in the front; if so, an upgrade there may help as well.


----------



## Belkov

Well, I don't think I'll buy any more fans, because I will be changing the whole case in the near future...







The big question all along was:

Quote:


> Originally Posted by *Belkov*
> 
> I have a question related to your answer...
> 
> 
> 
> 
> 
> 
> 
> 
> My case is a cheap Elite 430, but I am pleased with its cooling so far. I have one 120mm fan at the front, which blows in; two 120mm fans at the top, which blow out; one 120mm fan at the back, which blows out; one 120mm fan at the bottom, which blows in; and one 120mm fan on the side panel, which blows in. My card is a GB R9 290 OC WF3 and it runs at stock with a max core temp of 75 degrees and a max of 78 for VRM1. *Do you think the temps will drop if I turn the side panel fan to exhaust?* My fan configuration follows a cooling diagram on the CM Elite 430 page. I will appreciate your answers. Thank you in advance.
> 
> Edit: I tested both situations and there is no change in the GPU temps. *So do you think I should keep the side panel fan blowing in, or turn it around to blow out?*


But I appreciate your advice and I will try some of it.








Thank you.


----------



## Roboyto

Quote:


> Originally Posted by *Belkov*
> 
> Well, I don't think I'll buy any more fans, because I will be changing the whole case in the near future...
> 
> 
> 
> 
> 
> 
> 
> The big question all along was:
> But I appreciate your advice and I will try some of it.
> 
> 
> 
> 
> 
> 
> 
> 
> Thank you.


No problem, I hope they help.


----------



## Matt-Matt

Quote:


> Originally Posted by *kizwan*
> 
> *My unlimited data plan has been changed to a 10GB cap for no reason.
> 
> 
> 
> 
> 
> 
> 
> Their excuse is enforcing a "Fair Usage Policy" which did not yet exist when I signed up for the unlimited plan years ago.*
> 
> My 290's have been idling for 2 - 3 days now. I'm not in the mood to play any games because the internet connection is very poor. Surfing & downloading are fine, but when gaming the ping fluctuates very badly.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Regarding the weird benching results, my 290's are somehow getting lower scores in Valley at the same clocks. For example, at 1200/1600 I got 5115, but now if I'm lucky I can only get a 50XX score. Average FPS was 122.3, but now it has dropped to 120 or 119.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Triple


You may want to ring them up and say "Fair enough you want to limit me, but can we make a more reasonable amount of data?"


----------



## taem

So I finally started gaming on my Powercolor PCS+ 290 after spending days and days figuring out clocks temps etc.

Here's how it looks after 2 hours of Far Cry 3:


Core 1170, mem 1500, vddc offset +75, power limit +50%, ambient ~22.

Those peaks really aren't indicative, throughout play the hwinfo/rtss osd showed lower temps. So, pretty good temps for those clocks. Downside, if I take clocks and vddc higher, vrm1 hits 80-85. I so want to tinker with the vrm1 heatsink but I will wait until warranty has expired. What is the best method for improving vrm1 heat dispersion btw, for an open air cooler? What type/brand heatsink, what material to apply between sink and vrm area, etc? I've heard folks say removing the heatsink entirely and just letting the fan blow on it gets the best result.

All in all a good result, nice clocks and temps for air cooled. Otoh for a single card setup all the custom 290s seem pretty good on temps. But I'm happy with this card.


----------



## rdr09

Quote:


> Originally Posted by *taem*
> 
> So I finally started gaming on my Powercolor PCS+ 290 after spending days and days figuring out clocks temps etc.
> 
> Here's how it looks after 2 hours of Far Cry 3:
> 
> 
> Core 1170, mem 1500, vddc offset +75, power limit +50%, ambient ~22.
> 
> Those peaks really aren't indicative, throughout play the hwinfo/rtss osd showed lower temps. So, pretty good temps for those clocks. Downside, if I take clocks and vddc higher, vrm1 hits 80-85. I so want to tinker with the vrm1 heatsink but I will wait until warranty has expired. What is the best method for improving vrm1 heat dispersion btw, for an open air cooler? What type/brand heatsink, what material to apply between sink and vrm area, etc? I've heard folks say removing the heatsink entirely and just letting the fan blow on it gets the best result.
> 
> All in all a good result, nice clocks and temps for air cooled. Otoh for a single card setup all the custom 290s seem pretty good on temps. But I'm happy with this card.


Impressive. I lean toward keeping those VRMs sinked.


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> So I finally started gaming on my Powercolor PCS+ 290 after spending days and days figuring out clocks temps etc.
> 
> Here's how it looks after 2 hours of Far Cry 3:
> 
> Core 1170, mem 1500, vddc offset +75, power limit +50%, ambient ~22.
> 
> Those peaks really aren't indicative, throughout play the hwinfo/rtss osd showed lower temps. So, pretty good temps for those clocks. Downside, if I take clocks and vddc higher, vrm1 hits 80-85. I so want to tinker with the vrm1 heatsink but I will wait until warranty has expired. What is the best method for improving vrm1 heat dispersion btw, for an open air cooler? What type/brand heatsink, what material to apply between sink and vrm area, etc? I've heard folks say removing the heatsink entirely and just letting the fan blow on it gets the best result.
> 
> All in all a good result, nice clocks and temps for air cooled. Otoh for a single card setup all the custom 290s seem pretty good on temps. But I'm happy with this card.


Temps are solid for air cooling.







That triple slotter is a BEAST

I have to agree with rdr09 with keeping heatsinks on the VRMs.

Biggest question is how is/are the VRM sink(s) attached presently? That will play a big role in how you go about upgrading. I'm having trouble locating a picture of the PCS+ without the cooler on it to see what's going on under the hood.

We know for sure that with a waterblock, upgrading to Fujipoly Ultra Extreme thermal pads can make a big difference in VRM temps. I observed a 23% temperature drop from a $20 part.

http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures

Will this help as much on air cooling? Not sure; I'm waiting to hear from a user or two about upgrading the pads on their air coolers... can't remember exactly who it was though. Stay tuned, I'm sure someone will have information posted sooner rather than later.
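One footnote on quoting percentage drops: they're most meaningful against the temperature rise over ambient (or coolant), not the raw sensor value, since the rise is what the cooling actually controls. A quick sketch with hypothetical numbers (not the ones from the linked thread):

```python
# Cooling improvements are best quoted against the temperature rise over
# ambient (or coolant), since that's what the cooler actually controls.
# All temperatures here are hypothetical examples.

def improvement_pct(t_before: float, t_after: float, t_ambient: float) -> float:
    """Reduction in temperature rise over ambient, as a percentage."""
    return 100.0 * (t_before - t_after) / (t_before - t_ambient)

# e.g. a VRM falling from 65C to 55C in a 22C room
print(improvement_pct(65.0, 55.0, 22.0))  # ~23.3% lower rise over ambient
```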


----------



## Prozillah

Can someone explain why it seems like overall I was achieving better FPS from my 2x 670 cards as opposed to my new 290's? I feel it might be CPU related, as in some areas of BF4 I'm getting 50 - 60 FPS regardless of whether I'm on ultra settings or low.

Also, it seems the CPU is working A LOT harder all of a sudden - now when loading BF4 I'm getting short bursts where CPU load hits 100% and the mouse freezes up for a second.

Could it be that these cards demand so much more of the CPU just to run?


----------



## BradleyW

Quote:


> Originally Posted by *Prozillah*
> 
> Can someone explain why it seems like overall I was achieving better FPS from my 2x 670 cards as opposed to my new 290's? I feel it might be CPU related, as in some areas of BF4 I'm getting 50 - 60 FPS regardless of whether I'm on ultra settings or low.
> 
> Also, it seems the CPU is working A LOT harder all of a sudden - now when loading BF4 I'm getting short bursts where CPU load hits 100% and the mouse freezes up for a second.
> 
> Could it be that these cards demand so much more of the CPU just to run?


Get Windows 8.1 and the Mantle driver! That should clear all this up.


----------



## vieuxchnock

Quote:


> Originally Posted by *Roboyto*
> 
> Have some pics of your rig? Maybe changing orientation of fans could help lower temperatures for you.
> 
> How much rad space you working with; What does your loop consist of?


This is my rig. I have CPU and GPU in the loop with a 180 mm rad.


----------



## rdr09

Quote:


> Originally Posted by *vieuxchnock*
> 
> This is my rig. I have CPU and GPU in the loop with a 180 mm rad.


looks good. yes, prolly another 120 rad after the cpu.


----------



## vieuxchnock

I will put a 240 in push/pull in the top after the CPU


----------



## Prozillah

Quote:


> Originally Posted by *Prozillah*
> 
> Can someone explain why it seems like overall I was achieving better FPS from my 2x 670 cards as opposed to my new 290's? I feel it might be CPU related, as in some areas of BF4 I'm getting 50 - 60 FPS regardless of whether I'm on ultra settings or low.
> 
> Also, it seems the CPU is working A LOT harder all of a sudden - now when loading BF4 I'm getting short bursts where CPU load hits 100% and the mouse freezes up for a second.
> 
> Could it be that these cards demand so much more of the CPU just to run?


Quote:


> Originally Posted by *BradleyW*
> 
> Get Windows 8.1 and the Mantle driver! That should clear all this up.


But why would that be the case?


----------



## Roboyto

Quote:


> Originally Posted by *Prozillah*
> 
> Can someone explain why it seems like overall I was achieving better FPS from my 2x 670 cards as opposed to my new 290's? I feel it might be CPU related, as in some areas of BF4 I'm getting 50 - 60 FPS regardless of whether I'm on ultra settings or low.
> 
> Also, it seems the CPU is working A LOT harder all of a sudden - now when loading BF4 I'm getting short bursts where CPU load hits 100% and the mouse freezes up for a second.
> 
> Could it be that these cards demand so much more of the CPU just to run?


BF4 is very CPU dependent and will utilize whatever number of cores you have available to throw at it. I doubt a 4670K would be bottlenecking even two of these beasts; what clock are you running?


----------



## Roboyto

Quote:


> Originally Posted by *vieuxchnock*
> 
> This is my rig. I have CPU and GPU in the loop with a 180 mm rad.


That green looks sweeeeeet!







Especially with that Sniper board. What case is that?

If you only have the 180 then you are definitely lacking in the radiator department. As rdr09 said, at least an additional 120/140 to bring temps in check. If you wanna join the cool







kids club, pun fully intended, then a 240, as you said, will get your temps where you want to be.

I have all my fans daisy-chained off one fan header, running in silent mode through Thermal Radar, with (2) XSPC EX240s. My top 240 uses thin 20mm Yate Loons in push, and the front 240 is in push/pull with Corsair SP120 HPs. Temperatures are solid, with worst bench temps for the CPU @ 55, GPU core/VRM1 peaking @ 52, and VRM2 peaking @ 42. Gaming load CPU/GPU/VRM1 is mid 40s.


----------



## taem

Quote:


> Originally Posted by *Roboyto*
> 
> Temps are solid for air cooling.
> 
> 
> 
> 
> 
> 
> 
> That triple slotter is a BEAST


The more I think about it, yeah, this is pretty beastly cooling, because it's on clocks of 1170/1500. I've seen the Tri X achieve better temps across the board but that's at stock, I don't know how it does with a +75 vddc offset and 1170 core on a 947 reference. But I have a Tri-X on the way so I'll see soon enough.
Quote:


> I have to agree with rdr09 with keeping heatsinks on the VRMs.
> 
> Biggest question is how is/are the VRM sink(s) attached presently? That will play a big role in how you go about upgrading. I'm having trouble locating a picture of the PCS+ without the cooler on it to see what's going on under the hood.


I don't want to void warranty so I can't take the cooler off but here are some closeups of the cooling design for the PCS+ 290.

Memory:



Vrm2:


Now the one spot where I'd like to improve, vrm1:


Can't really see how it's attached. Just glued on?

I wondered if it was something I should worry about, but performance is rock solid; it's just glue that slopped over, right? Purely cosmetic? Looks awful tho lol. Good thing it's on the mobo side.


----------



## vieuxchnock

Quote:


> Originally Posted by *Roboyto*
> 
> That green looks sweeeeeet!
> 
> 
> 
> 
> 
> 
> 
> Especially with that Sniper board. What case is that?
> 
> If you only have the 180 then you are definitely lacking in the radiator department. As rdr09 said, at least an additional 120/140 to bring temps in check. If you wanna join the cool
> 
> 
> 
> 
> 
> 
> 
> kids club, pun fully intended, then a 240, as you said, will get your temps where you want to be.
> 
> I have all my fans daisy-chained off one fan header, running in silent mode through Thermal Radar, with (2) XSPC EX240s. My top 240 uses thin 20mm Yate Loons in push, and the front 240 is in push/pull with Corsair SP120 HPs. Temperatures are solid, with worst bench temps for the CPU @ 55, GPU core/VRM1 peaking @ 52, and VRM2 peaking @ 42. Gaming load CPU/GPU/VRM1 is mid 40s.


That's a G1 Sniper M5 mATX in a Prodigy mITX. I modded it to lay the board horizontally to show off the graphics card. If I was able to fit an mATX board in an mITX case, I am sure I can fit a 240 on top under my Koolance shroud.


----------



## Roboyto

Quote:


> Originally Posted by *Prozillah*
> 
> But why would that be the case? ?


I apologize if I've mentioned any of this to you before, things start to blend together after a while.

There is supposed to be a performance boost going to Win8. However, a fresh install with Windows 7 could just as likely fix your issue too if it has been a while since you formatted.

Drivers - Scrubbed with DDU? Using 14.1 drivers? They're causing lots of issues presently.

Is there a frame limiting setting for BF4? Maybe it's set for 60?

Using AMD Frame Pacing? I don't know much about this, but what I do know is that I have only seen/read bad things here about it.

.Net Framework on the latest version? Should be 4.5.1, check your installed updates list for it.

This post from reddit seems to have some information that could be worth a shot:


https://www.reddit.com/r/1q14u9/bf4_i_have_multiple_fixes_that_make_the_game_90/

I wish I had more definitive answers, but I don't play BF4.


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> The more I think about it, yeah, this is pretty beastly cooling, because it's on clocks of 1170/1500. I've seen the Tri X achieve better temps across the board but that's at stock, I don't know how it does with a +75 vddc offset and 1170 core on a 947 reference. But I have a Tri-X on the way so I'll see soon enough.
> I don't want to void warranty so I can't take the cooler off but here are some closeups of the cooling design for the PCS+ 290.
> 
> Now the one spot where I'd like to improve, vrm1:
> 
> 
> Can't really see how it's attached. Just glued on?
> 
> I wondered if it was something I should worry about but performance is rocksolid, it's just glue that slopped right? Purely cosmetic? Looks awful tho lol. Good thing it's on the mobo side.


VRM1 is over closer to the power connectors, with the aluminum sink on it. What you have circled are solid chokes, I believe; can anyone confirm?





You can definitely see a thermal pad in there though under the sink. Once the HSF is removed it shouldn't be hard to remove the aluminum sink for the VRM1 and change the pad to see if it helps.

Another option, whenever you get to the point of taking it apart would be try swapping to copper sinks. Copper is better at conducting heat, but aluminum at dissipating it; hence why heatsinks are usually copper with aluminum fins. Having the copper to more effectively suck heat off the VRM could be an improvement over the aluminum that is there.
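The conduction side of that tradeoff is easy to put in numbers: resistance through the sink base is R = t / (k·A), and copper's higher conductivity roughly halves it. A quick sketch with assumed (not measured) sink dimensions; the conductivities are textbook values, and fin dissipation depends on geometry and airflow more than on material:

```python
# Conductive resistance of a small VRM heatsink base, R = t / (k * A).
# Conductivities are textbook values; the sink dimensions are assumptions,
# not measurements of the PCS+ sink.

K_ALUMINUM = 205.0  # W/m.K
K_COPPER = 390.0    # W/m.K

def base_resistance(thickness_m: float, k_w_mk: float, area_m2: float) -> float:
    """Resistance of heat conducting through the sink base, in K/W."""
    return thickness_m / (k_w_mk * area_m2)

AREA = 10e-3 * 40e-3  # assumed 10 mm x 40 mm VRM sink footprint
THICK = 3e-3          # assumed 3 mm base

r_al = base_resistance(THICK, K_ALUMINUM, AREA)
r_cu = base_resistance(THICK, K_COPPER, AREA)
print(f"aluminum base: {r_al:.4f} K/W, copper base: {r_cu:.4f} K/W")
```

Copper roughly halves the base resistance, but how much that shows up at the sensor depends on how large the base resistance is relative to the fin-to-air resistance.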


----------



## Roboyto

Quote:


> Originally Posted by *vieuxchnock*
> 
> That's a G1 Sniper M5 mATX in a Prodigy mITX. I modded it to lay the board horizontally to show off the graphics card. If I was able to fit an mATX board in an mITX case, I am sure I can fit a 240 on top under my Koolance shroud.


Sweet deal! Didn't know there was a Sniper mATX board.

Let us know what happens when you add the second rad. While it's apart, the thermal pad upgrade would likely be worth the effort.


----------



## vieuxchnock

Quote:


> Originally Posted by *Roboyto*
> 
> Sweet deal! Didn't know there was a Sniper mATX board.
> 
> Let us know what happens when you add the second rad. While it's apart, the thermal pad upgrade would likely be worth the effort.


What do you mean by the thermal pad upgrade? Did I miss something?


----------



## Prozillah

Quote:


> Originally Posted by *Roboyto*
> 
> BF4 is very CPU dependent and will utilize whatever number of cores you have available to throw at it. I doubt a 4670K would be bottlenecking even two of these beasts; what clock are you running?


Yeah, 4670K at 4.2GHz currently. I dunno, maybe it's drivers or something else? I just checked the wattage at the plug and my system is pulling up to 1380 watts! How can my PSU handle that if it is only 1000W?


----------



## Mr357

Quote:


> Originally Posted by *Prozillah*
> 
> Yeah, 4670K at 4.2GHz currently. I dunno, maybe it's drivers or something else? I just checked the wattage at the plug and my system is pulling up to 1380 watts! How can my PSU handle that if it is only 1000W?


PSUs are never 100% efficient. They pull AC power from the wall and convert a percentage of that wattage (usually 80-90%) into DC power. If it was 1380W from the wall and the PSU was really delivering its rated 1000W, you'd be getting only about 72% efficiency.
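In other words, DC output = wall draw × efficiency, so either number can be estimated from the other. A quick sketch, assuming a typical ~85% efficient unit for the first estimate:

```python
# A PSU's wattage rating is DC output; the wall meter reads AC input.
# DC output = AC input x efficiency, so either number can be estimated
# from the other. The 85% figure is an assumed typical efficiency.

def dc_output(wall_watts: float, efficiency: float) -> float:
    """Estimated DC power delivered to the components."""
    return wall_watts * efficiency

def implied_efficiency(wall_watts: float, dc_watts: float) -> float:
    """Efficiency implied by a wall reading and a known DC load."""
    return dc_watts / wall_watts

print(dc_output(1380, 0.85))           # ~1173 W DC at the assumed 85%
print(implied_efficiency(1380, 1000))  # ~0.72 if only the rated 1000 W is delivered
```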


----------



## IBIubbleTea

Can someone explain to me what ULPS is? Also, is it better to disable it?


----------



## Paul17041993

Quote:


> Originally Posted by *taem*
> 
> Now the one spot where I'd like to improve, vrm1:
> 
> 
> Can't really see how it's attached. Just glued on?
> 
> I wondered if it was something I should worry about but performance is rocksolid, it's just glue that slopped right? Purely cosmetic? Looks awful tho lol. Good thing it's on the mobo side.


that doesn't look right... but if the card's running fine then yeah, it's just a cosmetic "defect". if you start to get coil whine, however, that means one of the chokes has a fractured casing...
Quote:


> Originally Posted by *IBIubbleTea*
> 
> Can someone explain to me what ULPS is? Also, is it better to disable it?


Ultra Low Power State. it's a state in which the card's core and fans are shut off almost completely and only ~1W of power is used, but it only activates when the screen turns off and/or on the secondary card in crossfire setups. generally good to turn off, as on windows 7 it can cause all sorts of issues.


----------



## kizwan

Quote:


> Originally Posted by *Prozillah*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roboyto*
> 
> BF4 is very CPU dependent and will utilize whatever # of cores you have available to throw at it. I doubt a 4670k would be bottlenecking even 2 of these beasts, what clock you running?
> 
> 
> 
> Yeah 4670k at 4.2 currently. I duno maybe its drivers or something else? I just checked my wattage from plug and my system is pulling upto 1380watts! How can my psu handle that if it is only 1000w?

Try overclocking higher; BF4 likes multithreaded CPUs, and I think a 4-core CPU @ 4.2GHz is on the low side for it. 1000W is not the amount of wattage your PSU can pull from the wall; it's the maximum wattage your PSU can deliver stably to the components. Some PSUs can deliver more than they're rated for. Not all of the energy gets used; some is wasted as heat, so the actual power delivered will be less than the amount pulled from the wall.

1380W? Really? That seems too high for two 290's, even with the core clock at 1040MHz, which I believe is at stock voltage. Do you have anything else plugged in besides your PC?
Quote:


> Originally Posted by *IBIubbleTea*
> 
> Can someone explain to me what ULPS is? Also, is it better to disable it?


ULPS stands for Ultra Low Power State. It only activates if you have CrossFire, meaning more than one GPU. ULPS puts the secondary GPU into a low-power state when not in use (it's like the GPU is turned off or in sleep/standby). If you're overclocking, it's recommended to disable ULPS because it can prevent the secondary GPU from overvolting, which will affect stability.

As an example, the graph below shows GPU2 in a CrossFire setup (overclocked) still running at stock voltage while ULPS is enabled.


----------



## IBIubbleTea

Quote:


> Originally Posted by *Paul17041993*
> 
> Ultra Low Power State, its a state of the card that the core and fans are shut off almost 100% and only ~1W of power is used, but this only activates when the screen turns off and/or in crossfire setups, generally good to turn off as in windows 7 it can cause all sorts of issues.


I have disabled sleep and the screen blanking when idle. I'm using Windows 8.1; will I encounter any problems?


----------



## Paul17041993

Quote:


> Originally Posted by *IBIubbleTea*
> 
> I have disabled sleep and the black screen when not using. I'm using windows 8.1, will I encounter any problems?


It shouldn't. I haven't had any problems with ULPS since I started testing the Windows 8 beta, and I'm on 8.1 now with the 13.12 drivers.


----------



## alancsalt

Quote:


> Originally Posted by *vieuxchnock*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roboyto*
> 
> Sweet deal! Didn't know there was a Sniper mATX board.
> 
> Let us know what happens when you add the second rad. While its apart, the thermal pad upgrade would likely be worth the effort
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What do you mean by the thermal pad upgrade? Did I miss something?
Click to expand...

Probably not what you meant, but made me curious.

Is there a thermal pad roundup/comparison?

This is 17 W/m.K, so presumably some pads are better than others.

AMD Thermal Interface Material Comparison: Thermal Pads vs. Thermal Grease

Interesting. Thin Copper with TIM may be more effective than a pad. (unless electrical conductivity is an issue? Pads are dielectric.) I guess you'd anneal (soften by heating) the copper first .... if trying it.


----------



## standardhlozek

I had black screen issues with my Gigabyte. I now recognize that this is not a software issue but probably the first-batch Elpida memory issue. The solution for me was to downclock the memory speed, and then I ran Valley for three hours without a black screen. Now I want to compensate for the performance loss by overclocking the core, but I don't know how far I can go. Voltage, power limit, etc.? I'm using Afterburner. Thanks.


----------



## kizwan

Quote:


> Originally Posted by *IBIubbleTea*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Paul17041993*
> 
> Ultra Low Power State, its a state of the card that the core and fans are shut off almost 100% and only ~1W of power is used, but this only activates when the screen turns off and/or in crossfire setups, generally good to turn off as in windows 7 it can cause all sorts of issues.
> 
> 
> 
> 
> 
> 
> 
> I have disabled sleep and the black screen when not using. I'm using windows 8.1, will I encounter any problems?
Click to expand...

Quote:


> Originally Posted by *Paul17041993*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *IBIubbleTea*
> 
> I have disabled sleep and the black screen when not using. I'm using windows 8.1, will I encounter any problems?
> 
> 
> 
> 
> 
> 
> 
> shouldn't, I haven't had any problems with ULPS since I started testing 8 beta, I'm on 8.1 now with 13.12 drivers.
Click to expand...

Paul, you have one 290X, right? With CrossFire, ULPS may affect stability when overclocking because it can prevent the secondary GPU from overvolting.

An example with the GPUs overclocked and ULPS enabled: GPU2 refuses to overvolt.


@IBIubbleTea, if you have one 290/290X, then you don't need to worry about ULPS.


----------



## Forceman

Quote:


> Originally Posted by *kizwan*
> 
> @IBIubbleTea, if you have one 290/290X, then you don't need to worry about ULPS.


I'd still turn it off. ULPS caused my card to sometimes lock up when the monitor goes to sleep, forcing a reboot to fix it.


----------



## kizwan

Quote:


> Originally Posted by *Forceman*
> 
> I'd still turn it off. ULPS caused my card to sometimes lock up when the monitor goes to sleep, forcing a reboot to fix it.


I don't have that problem. BTW, when the screen goes to sleep, does your 290X fan stop spinning (with ULPS enabled, of course)?


----------



## Forceman

Quote:


> Originally Posted by *kizwan*
> 
> I don't have that problem. BTW, when screen goes to sleep, does your 290X fan stop spinning (with ULPS enabled of course)?


I don't know, it's on water. Maybe that has something to do with it.


----------



## IBIubbleTea

Thanks for the quick answers. Rep+.

Another question: I'm planning a water-cooled build. Should I get the EK R9 290X Acetal block in copper or nickel? I'm planning on using distilled water with PT Nuke. I heard about changing the thermal pads when applying the water block; what kind of thermal pads should I get and what thickness is needed? Also, what is a good thermal paste to use? Right now I have AS5 and Swiftech TIM Mate...


----------



## Forceman

Quote:


> Originally Posted by *IBIubbleTea*
> 
> Thanks for the quick answers. Rep+.
> 
> Another question, planning on doing a water cooled build, Should I get a EK R9 290x Acetal + copper or nickel? Planning on using distilled water with PT nuke. I heard about changing the thermal pad when applying the water block, What kind of thermal pads should I get and what thickness is needed? What is a good thermal paste I should use, Right now I have AC5 and swiftech tim mate...?


Unless you are looking for nickel for aesthetics, I'd just get the copper. They seem to have fixed the issues with the nickel flaking, but copper is one less thing to worry about.

The stock EK pads that come with the block are 0.5mm for the VRAM and 1.0mm for the VRM, but if you intend to get Fujipoly you can just get 1.0mm all the way around. Saves ordering two sheets. But you can get good enough performance with the stock pads, unless you really want to push for that last 10 degrees or so (on the VRMs, won't affect the core any). I wouldn't use AS5 (I assume that's what you meant) on the core, either use the Swiftech you have or pick up something like MX-4.


----------



## Prozillah

Quote:


> Originally Posted by *kizwan*
> 
> Try overclock higher. BF4 like multithreading CPU. I think 4 cores CPU @4.2GHz is too low for BF4. 1000W is not the amount of wattage your PSU can pull from the wall but it's the max amount of wattage your PSU can deliver stably to the components. Some PSU can deliver more than what was it rated for. Not all energy will be use, some will be wasted as heat. The actual power consumption will be less than the amount of wattage pulled from the wall.
> 
> 1380W? Really? It seems too high for two 290's, even with core clock at 1040MHz which I believe at stock voltage. Do you have anything else plugged in other than your PC?
> ULPS stand for Ultra Low Power State. It only activated if you have Crossfire, meaning more than one GPU. ULPS put secondary GPU to low power state when not in use (it's like the GPU is turn off or sleep/standby). If you're overclocking, it's recommend to disable ULPS because it can prevent secondary GPU from overvolting which will effect stability.
> 
> An example, graph below show GPU2 in crossfire (overclocked) is still running at stock voltage when ULPS is still enabled.


Yeah, I think I figured it out! The drivers were not loaded correctly because ULPS stopped a component from loading properly. So I fixed that, kept it all at stock speeds, and it's working quite well. Going to try to OC my CPU again now.

As for my system - it is:

2x 140mm fans
8x 120mm fans (some LED lighting)
1x 200mm fan
1x H110 water pump
2x Gigabyte 290 Windforces
1x 4670k OC
2x 4GB G Skill Trident X 2400mhz
4x USB devices
1x DVDRW drive
2x Storage HDD
1x SSD

So yeah, the draw is relevant to the numbers being pulled! I have an 80 Plus Gold 1000W PSU, so in theory my max PC load according to my gauge should be 1250W from the wall - is that how it works?? Something still tells me I need a 1200W PSU...


----------



## Paul17041993

Quote:


> Originally Posted by *kizwan*
> 
> Paul, you have one 290X right? With crossfire, ULPS may effect stability when overclock because it can prevent secondary GPU from overvolting.
> 
> An example when GPU's are overclocked with ULPS enabled. GPU2 refused to overvolt.
> 
> 
> @IBIubbleTea, if you have one 290/290X, then you don't need to worry with ULPS.


Yeah, ULPS can have some strange effects on CrossFire. Was this on Windows 7 or 8, though?

And yeah, only one 290X. I don't do CrossFire; I just prefer a single GPU for rendering.


----------



## kizwan

Quote:


> Originally Posted by *IBIubbleTea*
> 
> Thanks for the quick answers. Rep+.
> 
> Another question, planning on doing a water cooled build, Should I get a EK R9 290x Acetal + copper or nickel? Planning on using distilled water with PT nuke. I heard about changing the thermal pad when applying the water block, What kind of thermal pads should I get and what thickness is needed? What is a good thermal paste I should use, Right now I have AC5 and swiftech tim mate...?


Either copper or nickel should be OK. Nickel depends on the quality of the plating process, though. I got the copper block and I'm using Mayhems Pastel coolant.

The recommended thermal pad is Fujipoly Ultra Extreme 17 W/mK, but Fujipoly Extreme 11 W/mK should be OK too; I'll know for sure when I receive mine. Use 0.5mm for the memory and 1mm for the VRMs. The memory (VR*A*M) doesn't run hot, so you can use the stock EK thermal pads there. The 17 W/mK pad is more expensive than the 11 W/mK one; you can get a 15 x 100 mm piece of the 17 W/mK pad, which is cheaper and more than enough for the VRMs.

AS5 should be fine. Gelid GC-Extreme is a good TIM too. I'm going to replace mine with ICD 7 TIM.
Quote:


> Originally Posted by *Prozillah*
> 
> So yea the draw is relevant to the numbers being pulled! I have a 80% Gold1 000w PSU so in theory my max pc load according to my gauge should be 1250 from the wall - is that how it works?? Something still tells me I need a 1200w PSU...


If the power meter is not broken, 1380W is the amount of wattage your PSU pulled from the wall. However, I don't know how much is actually being consumed by your PC: it may be 1000W (at ~72% efficiency) or 1104W (at 80% efficiency). Efficiency is not constant; it depends on the load AFAIK.

I don't think you need a 1200W PSU. If your PSU were insufficient, your PC would shut down (power cut-off) or black screen constantly under load.
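The wall-draw vs. delivered-power arithmetic above can be sketched as a quick calculation; the efficiency figures are the same illustrative ones used in the reply (real efficiency varies with load, and 80 Plus Gold units are typically higher than 80%):

```python
def dc_load(wall_watts: float, efficiency: float) -> float:
    """Estimate the DC power the PSU delivers given wall draw and efficiency."""
    return wall_watts * efficiency

print(dc_load(1380, 0.80))  # ~1104 W delivered at 80% efficiency
print(dc_load(1380, 0.72))  # ~994 W delivered at 72% efficiency
```

So a 1380W wall reading does not mean the components are drawing 1380W from the PSU; the difference is lost as heat in the conversion.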


----------



## IBIubbleTea

Quote:


> Originally Posted by *Forceman*
> 
> Unless you are looking for nickel for aesthetics, I'd just get the copper. They seem to have fixed the issues with the nickel flaking, but copper is one less thing to worry about.
> 
> The stock EK pads that come with the block are 0.5mm for the VRAM and 1.0mm for the VRM, but if you intend to get Fujipoly you can just get 1.0mm all the way around. Saves ordering two sheets. But you can get good enough performance with the stock pads, unless you really want to push for that last 10 degrees or so (on the VRMs, won't affect the core any). I wouldn't use AS5 (I assume that's what you meant) on the core, either use the Swiftech you have or pick up something like MX-4.


What do you think of IC Diamond ?


----------



## Forceman

Quote:


> Originally Posted by *IBIubbleTea*
> 
> What do you think of IC Diamond ?


I wouldn't use it on a bare die (like the GPU), it'll scratch it.


----------



## IBIubbleTea

Quote:


> Originally Posted by *Forceman*
> 
> I wouldn't use it on a bare die (like the GPU), it'll scratch it.


This might be a dumb question, but what's wrong with scratching the die? I know it voids the warranty. Does it shorten the life of the GPU?


----------



## Forceman

Quote:


> Originally Posted by *IBIubbleTea*
> 
> This might be a dumb question but.. Whats wrong with scratching the die? I know it void the warranty. Does it shorten the life of the gpu?


I think the concern is that the tiny scratches may weaken the die and lead to cracks. I don't know how likely that really is, but it also pretty much irrevocably voids your warranty, while just changing the cooler probably doesn't (depending on the manufacturer).


----------



## taem

Quote:


> Originally Posted by *Paul17041993*
> 
> that doesn't look right... but if the card's running fine then yea its just a cosmetic "defect", if you start to get coil whine however that means one of the chokes has a fractured casing...


Dude!! Don't freak me out like that. I don't want to RMA this card because it's an above average performer. A cracked choke wouldn't ooze a gelatinous material like that would it? That looks like glue to me. But now you've got me worried.
Quote:


> Originally Posted by *Roboyto*
> 
> VRM1 is over closer to the power connectors, with the aluminum sink on them. What you have circled are solid chokes I believe; anyone confirm?.


Lol, yeah, I circled the booger on the choke to ask what it might be, but I used profanity and a mod edited that out, so it wasn't clear.


----------



## maynard14

How about Noctua NT-H1, sir? Will it scratch the GPU core?


----------



## Matt-Matt

Quote:


> Originally Posted by *Forceman*
> 
> I think the concern is that the tiny scratches may weaken the die and lead to cracks. I don't know how likely that really is, but it also pretty much irrevocably voids your warranty, while just changing the cooler probably doesn't (depending on the manufacturer).


Any scratches to the die can damage the core; I actually scratched one of my 7950s with the end of the Liquid Pro stick.
It still worked, and that card is long gone now. Still, be careful! Scratches can damage the core, and they will definitely void your warranty, as you mentioned.


----------



## kizwan

Applying IC Diamond 7 is easy, and if you install the heatsink/waterblock carefully, it won't scratch the die; it's very unlikely you'll scratch it during installation unless you're very clumsy. However, ICD 7 is not easy to remove/clean, and a lot of people have scratched their CPU/GPU by not cleaning it properly. Scratched, and your warranty is gone. When removing/cleaning it, work carefully to avoid scratching the die, using a solvent like acetone or ArctiClean to re-liquefy the TIM.

*[EDIT]* On second thought, I may re-mount the blocks a couple of times until I'm satisfied, so I'm getting the trusty Shin-Etsu G751 instead.


----------



## Matt-Matt

Quote:


> Originally Posted by *kizwan*
> 
> Applying IC Diamond 7 is easy. If you install the heatsink/waterblock carefully, it won't scratch the die. It's very unlikely you're going to scratch the die during installation unless you're very clumsy. It's not easy to remove/clean ICD 7 & lot of people scratched their CPU/GPU when they don't clean it properly. Scratched & your warranty gone. It's advised when removing/cleaning, do it carefully to avoid scratching the die, using solvent like acetone or ArctiClean to re-liquefy the TIM.
> 
> *[EDIT]* On second thought, I may re-mount the blocks a couple of time until I satisfy. So, I'm getting the trusty Shin-Etsu G751 instead.


That's what I mean about scratching it: suppose you have to RMA the card. You have to take the block off and apply some more normal-looking TIM, which is where the chance of scratching it is a lot higher. Going with Shin-Etsu is a good choice!


----------



## Paul17041993

Quote:


> Originally Posted by *taem*
> 
> Dude!! Don't freak me out like that. I don't want to RMA this card because it's an above average performer. A cracked choke wouldn't ooze a gelatinous material like that would it? That looks like glue to me. But now you've got me worried.


Don't worry, you should be fine. But *if* you start to get coil whine, that glue must have come out of one of the chokes; otherwise it's just spill from something...


----------



## Forceman

Quote:


> Originally Posted by *Paul17041993*
> 
> dw you should be fine, but *if* you start to get coil whine then that glue must have come out of one of the chokes, otherwise its just spill from something...


I've never broken one open but I don't think the stuff inside is liquid, is it? Even if the casing was cracked I don't think it would leak out like that. More likely it was just something that got on the outside during manufacturing.


----------



## Noufel

Finally, they are here, and they run fine on my 850W CM PSU. Waiting for a new one for overclocking.


----------



## cyenz

Just a question: is OCCT-stable the rule for GPU stability? I can game all day in BF4, Crysis 3, etc. at 1150MHz +75mV, but OCCT will throw errors. I can only be OCCT-stable at 1125 +100mV, and I can't even test for more than five minutes without the VRMs reaching 100C+, while in long gaming sessions they hit about 80C max.


----------



## Maracus

Quote:


> Originally Posted by *Noufel*
> 
> 
> 
> 
> finaly they are here and they run fine on my 850 watt cm PSU, waiting a new one for overclocking


Nice, let's hope they clock well, although both overclocked may put some strain on your PSU.


----------



## Noufel

Quote:


> Originally Posted by *Maracus*
> 
> Nice lets hope they clock well, although both over clocked may put some strain on your PSU.


Thanks, I ordered a 1000W CM V; it should be sufficient.


----------



## Roboyto

Quote:


> Originally Posted by *vieuxchnock*
> 
> What do you mean by the thermal pad upgrade? Did I miss something?


Sorry, check this out:

http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures


----------



## Belkov

I just noticed something: on the GB site it's written that a 600W PSU is required, while other brands have much higher PSU requirements. Do you think my Seasonic SS-620GB Bronze is enough? I don't have problems so far, except some artifacting with too-high clocks, but that's normal...


----------



## Matt-Matt

Quote:


> Originally Posted by *Belkov*
> 
> I just notices something - on GB site it is written that 600W PSU is required. Other brands have a lot higher requirements for PSUs. Do you think my SEASONIC SS-620GB Bronze is enough? I don't have problems so far, except some artifacting with too high clocks, but this is normal...


It should be, being a Seasonic, but check the amperage on your specific PSU; it should be on the label somewhere. You're looking for the amperage on the 12V rail. You really want 40A+, but I've found a bit under is usually okay.


----------



## vieuxchnock

Quote:


> Originally Posted by *Roboyto*
> 
> Sorry, check this out:
> 
> http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures


With my block (Aquacomputer Aquagraphics) there are thermal pads only under the VRMs, not over the memory chips. Should I use Fujipoly everywhere? If yes, what thickness? Is it the same thickness for VRM and memory?


----------



## Roboyto

Quote:


> Originally Posted by *Paul17041993*
> 
> if you start to get coil whine however that means one of the chokes has a fractured casing...


My card has some coil whine, but only when I'm pushing it really hard; It can be hard/impossible to hear with an air cooler. I started fiddling around again to find suitable clocks for gaming and decided on +100mV offset and I'm stable at 1200/1500 with no apparent coil whine.

Quote:


> Originally Posted by *alancsalt*
> 
> Probably not what you meant, but made me curious.
> 
> *Is there a thermal pad roundup/comparison?*
> 
> This is 17 W/m.K, so presumably some pads are better than others.
> 
> AMD Thermal Interface Material Comparison: Thermal Pads vs. Thermal Grease
> 
> Interesting. Thin Copper with TIM may be more effective than a pad. (unless electrical conductivity is an issue? Pads are dielectric.) I guess you'd anneal (soften by heating) the copper first .... if trying it.


No roundup that I'm aware of, but I'm waiting to hear from Gunderman456 about what kind of difference the Fujipoly Extreme 11 W/mK makes compared to EK stock. I observed 23% lower temps switching from XSPC stock to Fujipoly Ultra Extreme.

http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures
Quote:


> Originally Posted by *IBIubbleTea*
> 
> Thanks for the quick answers. Rep+.
> 
> Another question, planning on doing a water cooled build, Should I get a EK R9 290x Acetal + copper or nickel? Planning on using distilled water with PT nuke. I heard about changing the thermal pad when applying the water block, What kind of thermal pads should I get and what thickness is needed? What is a good thermal paste I should use, Right now I have AC5 and swiftech tim mate...?


I have used EK Acetal Nickel blocks for HD4890, HD7870, and GTX 670 with no issues at all. The benefit of the nickel is obviously the bling factor and it cleans very easily. Bare copper will end up getting stained from TIM like a CPU heatsink.

*See my link above; Fujipoly Ultra Extreme made a BIG difference for me.*

EK uses 2 different pad thicknesses: 0.5mm for the RAM and 1mm for the VRMs. The RAM doesn't get very hot, so save your $ and use the supplied EK pads there. I upgraded the RAM thermal pads to the Fujipoly Extreme and observed *no gains from the additional money I spent.*

Just buy the $20 Fujipoly Ultra Extreme 15mmx100mm piece and upgrade just the VRM pads.



EK recommends EK TIM Ectotherm, Arctic Cooling MX-2 ™, MX-4 ™ or GELID GC-Extreme. I've had very good luck with Xigmatek PTI-G4512. Others like Noctua NT-H1.
Quote:


> Originally Posted by *cyenz*
> 
> Just a question, is occt stable the rule for GPU stability? Because i can game all day BF4, crysis 3, etc at 1150mhz +75mv, but occt will trow errors, i can only be occt stable at 1125 +100mv and i cant even test more than 5 five minutes without vrm´s reach 100c+, in long gaming sessions they do about 80c max.


A benchmark/stress test is going to load everything more heavily than a game and cause higher temps. If you have no issues while gaming I wouldn't be very concerned; if your VRMs are maxing at 80C, you should be just fine.


----------



## Belkov

Quote:


> Originally Posted by *Matt-Matt*
> 
> It should be being a Seasonic, but check the amperage on your specific PSU; It should be on the label somewhere. You're looking for amperage on the 12v rail. You want 40+ really but a bit under is usually okay i've found.


It is written: +12V1 - 24A and +12V2 - 24A; -12V - 0.8A. Are these OK?

And one more thing: my max temp for the core is 75 degrees and 77 for VRM1, at the stock voltage of 1.156V, under full load for more than an hour of playing or benching Heaven and Valley. What do you think?


----------



## Roboyto

Quote:


> Originally Posted by *vieuxchnock*
> 
> With my block (Aquacomputer Aquagraphics) there is thermal pads only under VRM but not over memory chips. Should I use Fujipoly everywhere ? If yes , what thickness? Is it the same thickness for VRM and Memory?


I don't know for sure, I was trying to locate the pad thickness for the Aquacomputer block but it wasn't listed in the instructions. To be on the safe side I would say use 1mm for both the RAM and VRMs. If possible, try and measure the pads you were given to see how thick they are?

I wouldn't want to put TIM on the RAM chips, too messy to clean up IMHO. EK has a different procedure as well, recommending TIM used *with* the thermal pads on the VRMs. I just think that's too much hassle to clean up for likely minimal improvement in performance.


----------



## Roboyto

Quote:


> Originally Posted by *Belkov*
> 
> It is writen - +12V1 - 24A and +12V2 - 24A; -12V - 0.8A. Are these ok?
> 
> And one more thing - my max temp for the core is 75 degrees and 77 for VRM1. This is with stock voltage of 1.156V. max under full load for more than an hour playing or benching heaven and valley. What do you think about?


Temps are definitely acceptable for core and VRM1. If you are at 77 for VRM1 on stock clocks, you may not have much room to overclock or add more voltage. The general opinion/consensus in this thread seems to be that 80C is where people start becoming uncomfortable for that VRM1 temp. They are rated for ~120C I believe, but for stability and longevity best to keep them as cool as possible.


----------



## kizwan

Quote:


> Originally Posted by *vieuxchnock*
> 
> With my block (Aquacomputer Aquagraphics) there is thermal pads only under VRM but not over memory chips. Should I use Fujipoly everywhere ? If yes , what thickness? Is it the same thickness for VRM and Memory?


You don't need a thermal pad on the memory. Follow the instructions that come with the block; if I remember correctly, you only put TIM on the memory and thermal pads only on the VRMs. I don't know the thickness of the thermal pads the Aquacomputer kryographics uses. Doesn't the manual mention the thickness?

*[EDIT]* Found it. The Kryographics stock thermal pad is 0.5mm thick.
Quote:


> Originally Posted by *Raephen*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kpoeticg*
> 
> Can anybody tell me exactly how much thermal pad is needed of each thickness for full coverage on a ref 290x (Specifically for AC Kryographics Blocks with Passive Backplate)
> 
> 
> 
> The Kryographics blocks use the least thermal padding of all waterblocks: only for the VRM's. The VRAM is cooled by direct contact with the machined copper/nickel, so there only some TIM.
> 
> Officially, the padding is 0,5mm thick, but if the padding you're getting is more pliable than the stock ones, 1mm is also good.
> 
> I've got some Phobya 7w/mk and it does the job very nicely.
Click to expand...

The kryographics waterblock is very good at cooling VRM1 compared to other blocks on the market. The stock thermal pad should be more than enough, but you can always use a better thermal pad if you want. @Raephen uses Phobya 7 W/mK and it works well for him.
_(Source)_


----------



## Belkov

By the way, my stock clocks are perfectly stable with a -37mV offset, and we're talking much lower temps: 69-70 for the core and 71-72 for VRM1. I think GB decided to play it safe...







And by the way, at stock voltage I can clock 1100/1500MHz, so it's not a big problem.

And what about:
Quote:


> Originally Posted by *Belkov*
> 
> I just notices something - on GB site it is written that 600W PSU is required. Other brands have a lot higher requirements for PSUs. Do you think my SEASONIC SS-620GB Bronze is enough? I don't have problems so far, except some artifacting with too high clocks, but this is normal...


Quote:


> Originally Posted by *Matt-Matt*
> 
> It should be being a Seasonic, but check the amperage on your specific PSU; It should be on the label somewhere. You're looking for amperage on the 12v rail. You want 40+ really but a bit under is usually okay i've found.


Quote:


> Originally Posted by *Belkov*
> 
> It is writen - +12V1 - 24A and +12V2 - 24A; -12V - 0.8A. Are these ok?


----------



## vieuxchnock

Thanks kizwan:thumb:


----------



## kizwan

Quote:


> Originally Posted by *Belkov*
> 
> By the way my stock clocks are perfectly stable with -37mV offset. And we are talking for much lower temps - 69-70 for the core and 71-72 for VRM1. I thing that GB decided to reinsure...
> 
> 
> 
> 
> 
> 
> 
> And by the way on stock i can clock 1100/1500MHz, so it is not so big problem.
> 
> And what about:
> Quote:
> 
> 
> 
> Originally Posted by *Belkov*
> 
> I just notices something - on GB site it is written that 600W PSU is required. Other brands have a lot higher requirements for PSUs. Do you think my SEASONIC SS-620GB Bronze is enough? I don't have problems so far, except some artifacting with too high clocks, but this is normal...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Matt-Matt*
> 
> It should be being a Seasonic, but check the amperage on your specific PSU; It should be on the label somewhere. You're looking for amperage on the 12v rail. You want 40+ really but a bit under is usually okay i've found.
> 
> Click to expand...
> 
> Quote:
> 
> 
> 
> Originally Posted by *Belkov*
> 
> It is writen - +12V1 - 24A and +12V2 - 24A; -12V - 0.8A. Are these ok?
> 
> Click to expand...
Click to expand...

+12V2 is for the CPU (ATX/EATX12V/EPS12V) and +12V1 is for everything else. That means you have 288W for the GPU and the other components (except the CPU) in your PC. Huh... did you ever overvolt your 290 before?
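The 288W figure is just P = V × I applied to the rail amperages quoted above; a quick sketch (the summed-rail line is illustrative only, since real multi-rail PSUs usually cap the combined 12V output below the sum of the individual rails - check the label):

```python
def rail_watts(volts: float, amps: float) -> float:
    """Power available on a PSU rail: P = V * I, in watts."""
    return volts * amps

print(rail_watts(12, 24))       # 288 W on the 24A +12V1 rail
print(rail_watts(12, 24 + 24))  # 576 W if both 12V rails were simply summed
```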


----------



## Belkov

Yes - with +100mV I benched for an hour at 1150/1500MHz. Slightly high temperatures, but the GPU was stable.


----------



## Belkov

I found that it's listed as compatible even with the 290X:

AMD Radeon R9 290X Compatible PSUs

And more details for the Seasonic S12II-620 80 Plus Bronze 620W:

So the +12V output is 48A.


----------



## Roboyto

Quote:


> Originally Posted by *kizwan*
> 
> *[EDIT]* Found it. The Kryographics stock thermal pad is 0.5mm thick.
> Kryographics waterblock is very good in cooling VRM1 compared to other blocks in the market. The stock thermal pad should be more than enough but you can always use better thermal pad if you want. @Raephen use Phobya 7w/mk & it worked well for him.
> _(Source)_


Thanks, I'm going to add that pad thickness to my VRM thermal pad write up.

Impressive cooling for Aquacomputer even with their passive, and more so with active, backplate.

Regarding the graph from ExtremeRigs.net: before I upgraded thermal pads, I was right in line with their results for the EK + backplate, in the 40C-delta area. After the upgrade I'm right down there with the Aquacomputer at a 25C delta.

Curious as to what pads the Aquacomputer includes with their blocks?


----------



## Heinz68

Quote:


> Originally Posted by *Prozillah*
> 
> Yeah 4670k at 4.2 currently. I duno maybe its drivers or something else? I just checked my wattage from plug and my system is pulling upto 1380watts! How can my psu handle that if it is only 1000w?


Quote:


> Originally Posted by *Mr357*
> 
> PSU's are never 100% efficient. It's pulling AC voltage from the wall, and then converting a percentage of that wattage (usually 80-90%) into DC voltage. If it was 1380 from the wall, you're possibly getting less than 70% efficiency.


My system shows pulling up to 1260W from the wall. I believe it's preventing me from OCing the i7-4930K further than the 4381MHz max I have now.

I have Sapphire R9 290X TRI-X OC in CrossFire at 1235/1650MHz with a +200mV offset. This setup is only for benching, but I would like to push it even more, at least the CPU.

Thinking about upgrading my present (about 6 years old) Corsair HX1000W PSU to a Corsair AXi series AX1200i 1200W.


----------



## Roboyto

Has anyone else seen this? VisionTek R9 290 CryoVenom with factory installed 'custom' EK full cover block, MSRP of $550. Currently sold out obviously, but what a deal!

http://www.visiontekproducts.com/index.php/component/virtuemart/graphics-cards/visiontek-cryovenom-liquidcooled-series-r9-290-detail?Itemid=0


----------



## kizwan

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> *[EDIT]* Found it. The Kryographics stock thermal pad is 0.5mm thick.
> Kryographics waterblock is very good in cooling VRM1 compared to other blocks in the market. The stock thermal pad should be more than enough but you can always use better thermal pad if you want. @Raephen use Phobya 7w/mk & it worked well for him.
> _(Source)_
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks, I'm going to add that pad thickness to my VRM thermal pad write up.
> 
> Impressive cooling for Aquacomputer even with their passive, and more so with active, backplate.
> 
> Regarding the graph form ExtremeRigs.net. Before I upgraded thermal pads, I was right on with their results from the EK + Backplate, in the 40C delta area. After the upgrade I'm right down there with the Aquacomputer at 25C delta.
> 
> *Curious as to what pads the Aquacomputer includes with their blocks?*
Click to expand...

No idea, but they're thinner (0.5mm) than EK's (1mm). Like TIM, a thinner thermal pad has better thermal performance than a thicker one, provided it gets good pressure from the heatsink/waterblock. This shows in the graph, where the Aqua block without a backplate performs almost the same as (slightly better than) EK + backplate. With the Aqua backplate, the VRM1 temp drops a lot. Based on the pic shared by an OCN member @ the OCN watercooling club thread, the Aqua backplate has thermal pads for cooling the chokes and capacitors (one next to the chokes and the other for the 4 capacitors near VRM1).

My Fujipoly pad still stuck in NY (since Feb 20). Apparently it already left "ISC NEW YORK NY(USPS)" on Feb 20, in transit to destination.
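For anyone wondering why thickness matters so much here: conduction through a pad follows R = t / (k * A), so at the same conductivity, halving the thickness halves the pad's thermal resistance. A minimal sketch (the pad footprint and the 5 W/mK figure are made-up illustration numbers, not measurements from any of these blocks):

```python
def pad_resistance(thickness_m, conductivity_w_mk, area_m2):
    """Conduction resistance of a thermal pad: R = t / (k * A), in K/W."""
    return thickness_m / (conductivity_w_mk * area_m2)

# Made-up illustration numbers: a 15mm x 100mm pad strip at 5 W/mK
area = 0.015 * 0.100  # m^2

r_1mm = pad_resistance(0.001, 5.0, area)    # 1mm pad (EK-style thickness)
r_05mm = pad_resistance(0.0005, 5.0, area)  # 0.5mm pad (Aqua-style thickness)

print(r_1mm / r_05mm)  # 2.0 - halving the thickness halves the resistance
```

This is only the pad itself, of course; in practice mounting pressure and contact quality matter too, which is why a thinner pad only wins if the block still seats properly.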


----------



## taem

Quick FYI, the listed dimensions for the Powercolor PCS+ 290/X are *way* off. They're listed as 270mm x 110mm x 38mm.

Actual dimensions are 294mm x 110mm x 52mm.

Biggest gpu *ever*. Or maybe lightning and matrix are bigger. But this thing is huge. The cooling justifies it though.


----------



## Roboyto

Quote:


> Originally Posted by *kizwan*
> 
> *No idea, but they're thinner (0.5mm) than EK's (1mm)*. Like TIM, a thinner thermal pad has better thermal performance than a thicker one, provided it gets good pressure from the heatsink/waterblock. This shows in the graph, where the Aqua block without a backplate performs almost the same as (slightly better than) the EK + backplate. With the Aqua backplate, VRM1 temps drop a lot. Based on the pic shared by an OCN member in the OCN watercooling club thread, *the Aqua backplate has thermal pads* for cooling the chokes & capacitors (one next to the chokes & another for the 4 capacitors near VRM1).
> 
> My Fujipoly pad is still stuck in NY (since Feb 20). Apparently it already left "ISC NEW YORK NY(USPS)" on Feb 20, in transit to the destination.


The XSPC block also uses 1mm. The difference between XSPC and EK, though, is that XSPC uses 1mm for both VRM and RAM, while EK uses 1mm for VRM and 0.5mm for RAM. Considering both my RAM and VRM pads are 1mm, I probably could have used 0.5mm on the RAM and seen even better results.

Thermal pads on backplate, very interesting. Wonder if I can further increase cooling on mine by adding thermal pads to my backplate.


----------



## vieuxchnock

@kizwan
So you're telling me I should not use thermal pads on the RAM, only on the VRMs, on my Aqua block for my 290? Right?


----------



## IBIubbleTea

Quote:


> Originally Posted by *Roboyto*
> 
> *See my link above; Fujipoly Ultra Extreme make BIG difference for me.*
> 
> EK use 2 different thickness pads, 0.5mm for RAM and 1mm for VRMs. RAM doesn't get very hot so save your $ and use the supplied EK pads for that. I upgraded the thermal pads for RAM to the Fujipoly Extreme and observed *no gains from the additional money I spent.*
> 
> Just buy the $20 Fujipoly Ultra Extreme 15mmx100mm piece and upgrade just the VRM pads.
> 
> 
> 
> EK recommends EK TIM Ectotherm, Arctic Cooling MX-2 ™, MX-4 ™ or GELID GC-Extreme. I've had very good luck with Xigmatek PTI-G4512. Others like Noctua NT-H1.
> Benchmark/Stress test are going to load everything more heavily than a game and cause higher temps. If you have no issues while gaming I wouldn't be very concerned, especially if your VRMs are maxing at 80C, you should be just fine.


Just out of curiosity, What were the temperature differences for the ram when using stock and Fujipoly?

Fujipoly Ultra Extreme System Builder Mosfet Block
What does it mean when it says "Mosfet Block"? Is it the same as Fujipoly Ultra Extreme System Builder

http://www.frozencpu.com/products/1..._15_x_10_-_Thermal_Conductivity_170_WmK.html?


----------



## Roboyto

Those two items are the same product, just in different sizes. The 15x100 is enough to do a card at least twice, possibly a 3rd time, though that would be a stretch.

There's no way to monitor the VRAM temperature as far as I know...so I'm not sure. I wasn't able to overclock the memory any further so the extra expense may very well not be worth it.

Quote:


> Originally Posted by *IBIubbleTea*
> 
> Just out of curiosity, What were the temperature differences for the ram when using stock and Fujipoly?
> 
> Fujipoly Ultra Extreme System Builder Mosfet Block
> What does it mean when it says "Mosfet Block"? Is it the same as Fujipoly Ultra Extreme System Builder
> 
> http://www.frozencpu.com/products/1...hermal_Conductivity_170_WmK.html?


----------



## Paul17041993

Quote:


> Originally Posted by *Forceman*
> 
> I've never broken one open but I don't think the stuff inside is liquid, is it? Even if the casing was cracked I don't think it would leak out like that. More likely it was just something that got on the outside during manufacturing.


Nah, it isn't, or at least not when it's cold, but if the casing had a crack or wasn't sitting right when being filled, you could get a result similar to that.
Most likely it's just overflow/spillage that wasn't cleaned off...


----------



## Roboyto

Quote:


> Originally Posted by *vieuxchnock*
> 
> @kizwan
> So you're telling me I should not use thermal pads on the RAM, only on the VRMs, on my Aqua block for my 290? Right?


Since that is what Aquacomputer suggests, it is likely the best way to go.

After I thought about it, adding thermal pads to the RAM could add unwanted spacing causing the block to not make good contact with the GPU core or VRMs.

You would have to take measurements of the block to confirm this. If the block is thinner where it contacts the VRMs to compensate for thermal pads then you'd have a pretty good idea that adding pads to the RAM could lead to performance loss elsewhere.


----------



## Blameless

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Most black screens result from unstable OC's and people not understanding that a card running at max mem clocks on idle voltages can cause problems.


Most reports of blackscreen crashes I've seen are talking about stock cards.
Quote:


> Originally Posted by *BradleyW*
> 
> What do you mean by significant portion? What numbers are we talking here?


Enough to regularly hear complaints about bone stock 290X parts locking up and crashing with a black screen during normal gaming.

I've had two R9 290s, both Elpida, and neither worked correctly out of the box. Stock clocks, with enough of a boost to power limit to prevent throttling, would almost invariably result in a black screen crash during VRAM intensive games. Most early BIOSes I've tried had similar issues.

The problems largely disappear with the right BIOS, reduced temps, or increased core voltage.
Quote:


> Originally Posted by *Krusher33*
> 
> Stilt over at the bitcointalk/litecointalk forums has been modifying BIOSes for miners there. He found that some BIOSes were inefficient in regards to how they handle memory and VRMs. His modifications not only improved hashing rates, but lowered VRM temps in a lot of cases.


It's becoming pretty clear that most Hawaii BIOSes have iffy memory tables and/or load lines, especially on the Elpida samples. The default Hynix tables are either better written or Hynix memory tends to be more tolerant of sloppy settings.

It's not the memory itself that is ultimately the problem, it's the BIOSes.
Quote:


> Originally Posted by *sugarhell*
> 
> Asus custom cards use rebranded stuff, so I don't know what they use. AMD suggests 125C for their VRMs (I think they use Volterra). Asus calls them DIGI+, but 100C with a custom cooler is really bad. It's not dangerous, but it kills any OC potential, and I wouldn't want to run my VRMs at 100C every time I play


Many custom coolers cool the VRMs worse than the reference cooler, but the reference cooler is pretty terrible at cooling the core.
Quote:


> Originally Posted by *ottoore*
> 
> Stop using FurMark. It ruins the VGA and it doesn't test VGA stability. Use Valley, Heaven, 3DMark or a long game session.


If your part fails from running FurMark, it was probably broken when you got it.

It's not a great stability test, not having an artifact scanner or anything, but it's not dangerous to a non-defective part that isn't already pushed beyond its limits.
Quote:


> Originally Posted by *BradleyW*
> 
> I[/IMG]
> When card 1 drops, card 2 shoots up. Cycle repeats.


Is this unusual for AFR?


----------



## Heinz68

Quote:


> Originally Posted by *Roboyto*
> 
> Has anyone else seen this? VisionTek R9 290 CryoVenom with factory installed 'custom' EK full cover block, MSRP of $550. Currently sold out obviously, but what a deal!
> 
> http://www.visiontekproducts.com/index.php/component/virtuemart/graphics-cards/visiontek-cryovenom-liquidcooled-series-r9-290-detail?Itemid=0


I was on the mailing list and was informed when the cards were in stock. At that time I had already bought two Sapphire R9 290X 4GB TRI-X, so I posted about the cards being in stock in this forum. Some people got lucky and got some; I believe one guy got four. They sold out fast.

Visiontek will restock again; get on the mailing list if you're interested in getting some.


----------



## taem

Quote:


> Originally Posted by *Paul17041993*
> 
> dw, you should be fine, but *if* you start to get coil whine then that glue must have come out of one of the chokes; otherwise it's just spill from something...


There is no glue in chokes, is there? I'm hoping that's an intentional epoxy coating on the solid state chokes to prevent whine and vibration. I've seen that before, just not on a GPU. But I took the card out and took a closer look; that goo is on every single one of those chokes, so it's not a spill, and it's *highly* unlikely every single choke along that row cracked. The downside is this probably means they used cheap chokes; Sapphire and Asus etc. don't need coatings on the outside of their chokes.


----------



## vieuxchnock

Quote:


> Originally Posted by *Roboyto*
> 
> Since that is what aquacomputer suggests it is likely the best way to go.
> 
> After I thought about it, adding thermal pads to the RAM could add unwanted spacing causing the block to not make good contact with the GPU core or VRMs.
> 
> You would have to take measurements of the block to confirm this. If the block is thinner where it contacts the VRMs to compensate for thermal pads then you'd have a pretty good idea that adding pads to the RAM could lead to performance loss elsewhere.


I think I will order Fujipoly Ultra Extreme 0.5mm for the VRMs and use Gelid GC Extreme for the RAM. I already ordered an active backplate from Aqua, so it should be OK. Also, I've ordered an Alphacool XT45 rad for the case top and 2 Corsair SP120 High Performance fans. After that, the temperatures should be OK


----------



## rdr09

Quote:


> Originally Posted by *vieuxchnock*
> 
> I think I will order Fujipoly Ultra Extreme 0.5mm for the VRMs and use Gelid GC Extreme for the RAM. I already ordered an active backplate from Aqua, so it should be OK. Also, I've ordered an Alphacool XT45 rad for the case top and 2 Corsair SP120 High Performance fans. After that, the temperatures should be OK


your pump is good?


----------



## vieuxchnock

I have a DDC 3.25 and a Watercool pump top


----------



## Roboyto

Quote:


> Originally Posted by *vieuxchnock*
> 
> I think I will order Fujipoly Ultra Extreme 0.5mm for the VRMs and use Gelid GC Extreme for the RAM. I already ordered an active backplate from Aqua, so it should be OK. Also, I've ordered an Alphacool XT45 rad for the case top and 2 Corsair SP120 High Performance fans. After that, the temperatures should be OK










Temps will be much better.

Not sure what your noise tolerance is for fans, but the SP120s at full blast can be slightly intrusive. The included 7V resistors work wonders if you don't have fan control some other way.


----------



## vieuxchnock

I have a fan controller


----------



## Roboyto

Quote:


> Originally Posted by *vieuxchnock*
> 
> I have a fan controller


Good to go then


----------



## Prozillah

Quote:


> Originally Posted by *Heinz68*
> 
> My system shows pulling up to 1260W from the wall. I believe it's preventing me from OCing the
> i7-4930k further than the 4381 MHz max I have now.
> 
> I have Sapphire R9 290X TRI-X OC in Crossfire 1235/1650 MHz + 200 mV offset. This set up is only for benching, but I would like to push it even more at least the CPU.
> 
> Thinking about upgrading my present (about 6 years old) Corsair HX1000W PSU to CORSAIR AXi series AX1200i 1200W.


Yep, definitely the PSU in this case. I tested overclocking the CPU and GPUs individually and they are stable. When combined, I get black screens, restarts, freezes, and power-related BSODs within minutes of heavy loads. With one card it would be fine, but start overclocking two and man, do those bad boys suck the power! FSP Aurum 1200W, here I come! Might grab some Corsair SP120 Performance fans as front intakes too; the stock Corsair Carbide 500 front intake fans are pathetic..
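Rough math on why a 1000W unit runs out of headroom here (assuming roughly 88% PSU efficiency near full load, which is a guess for illustration, not a measured or spec figure): the DC-side load is approximately the wall draw times efficiency.

```python
def dc_load_from_wall(wall_watts, efficiency):
    """Estimate the DC-side load on a PSU from a wall (AC) power reading.

    wall_watts: AC draw measured at the wall
    efficiency: assumed PSU efficiency at that load (0..1)
    """
    return wall_watts * efficiency

wall = 1260.0  # W at the wall, as reported above
eff = 0.88     # assumed efficiency near full load (a guess, not a spec value)

dc = dc_load_from_wall(wall, eff)
print(round(dc))   # ~1109 W of DC load
print(dc > 1000)   # already past an HX1000's 1000 W rating
```

So even before overclocking further, the implied DC load is over the HX1000's rating, which fits the crash symptoms described.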


----------



## Tobiman

Quote:


> Originally Posted by *Prozillah*
> 
> Yep, definitely the PSU in this case. I tested overclocking the CPU and GPUs individually and they are stable. When combined, I get black screens, restarts, freezes, and power-related BSODs within minutes of heavy loads. With one card it would be fine, but start overclocking two and man, do those bad boys suck the power! FSP Aurum 1200W, here I come! Might grab some Corsair SP120 Performance fans as front intakes too; the stock Corsair Carbide 500 front intake fans are pathetic..


That's insane power draw.


----------



## FuriousPop

Ladies & Gents,

finally sold my 2x7970's and got myself 2x Sapphire R9 290 Tri-X in crossfire. Hey look no CF bridge!









I've been trying to test these as much as possible, but with very limited time I've been forced to leave the cards running overnight.

I left Heaven 3.0 rolling and came back early in the morning (approx 8 hours of running), at which point the cards were at approx 74 degrees, both at 100% usage (running 7560x1600 with the standard options already in).

With the 13.12 drivers I seem to notice that every now and then, after running something for a prolonged time, once I quit and exit the application/game the PC hardlocks and I'm forced to reset. Is the issue driver related?

I don't think it was power, since it happens at very random times and I cannot consistently replicate the issue.
I'm running all at stock for the time being, since the RMA option with the reseller store only lasts another day or 2.

If anyone can recall another person bringing up this issue in the thread, please point me to the page and I'll have a look; I've only managed to read about 10 pages so far... thanks in advance,


----------



## taem

Quote:


> Originally Posted by *FuriousPop*
> 
> Ladies & Gents,
> 
> finally sold my 2x7970's and got myself 2x Sapphire R9 290 Tri-X in crossfire. Hey look no CF bridge!
> 
> 
> 
> 
> 
> 
> 
> 


How is the build and feel of the card? I've got one on the way to mate to my powercolor pcs+ but I dislike the idea of a plastic shroud.


----------



## magicase

I currently have 2 Tri-X 290s and I'm wondering if I should sell them and buy the Powercolor PCS+ instead. I'm after the best cooling performance and don't care about the best clocks.


----------



## SimonKaz

Hey guys, I mounted accelero xtreme 3 just now and it works wonders. I was wondering though, which one is the VRM1? I'll be fiddling around with it to get lower temps, although current ones are satisfactory (core 50, vrm 1 70, vrm2 55 at 100% use with radeon pro for 1 hour).


----------



## FuriousPop

Quote:


> Originally Posted by *taem*
> 
> How is the build and feel of the card? I've got one on the way to mate to my powercolor pcs+ but I dislike the idea of a plastic shroud.


i love these cards - playing at 7560x1600 - and i only start to hear them after playing for 2+ hours, but even then the sound is minimal since i've got headphones on.

truly great cards. i've only been playing GW2, Metro (the 1st one), the BioShocks (all 3) and some other older titles - tried Skyrim but i need to edit the .ini for my res since it plays across all 3 screens at the same time! games are at max settings and i'm a happy customer... the FSAA etc. i tend to keep low since i can barely tell the diff at my res in certain games.

compared to the 2x 7970's, the R9's are quieter, faster and cooler... so far i am happy with my purchase... only now starting to become very tempted to put everything under water; my only concern is the noise from the pumps etc... must research more..

but my minor (i hope they are minor, eg driver related) issues are the only thing concerning me at the moment - however i did have a lot more trouble with the 7970's...


----------



## Forceman

Quote:


> Originally Posted by *SimonKaz*
> 
> Hey guys, I mounted accelero xtreme 3 just now and it works wonders. I was wondering though, which one is the VRM1? I'll be fiddling around with it to get lower temps, although current ones are satisfactory (core 50, vrm 1 70, vrm2 55 at 100% use with radeon pro for 1 hour).


VRM1 is the line near the power connectors and supplies core power. VRM2 is the three near the display connectors and powers the VRAM.


----------



## Arizonian

Quote:


> Originally Posted by *Noufel*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> finally they are here and they run fine on my *850 watt* CM PSU; waiting on a new one for overclocking


Congrats - added









If you don't add voltage you should be fine, let us know how it goes. Enjoy crossfire Tri-x, sweet.


----------



## SimonKaz

hehe, that's interesting. I missed one heatsink on VRM2 too then (whoops); it still performs insanely well (never above 50) so yeah... haha.

I currently have the Xtreme 3 heatsinks on VRM1 (attached with Akasa thermal tape). Is there any better solution? I'll probably get a side fan as well.


----------



## Tobiman

Quote:


> Originally Posted by *FuriousPop*
> 
> i love these cards - playing at 7560x1600 - and i only start to hear them after playing for 2+ hours, but even then the sound is minimal since i've got headphones on.
> 
> truly great cards. i've only been playing GW2, Metro (the 1st one), the BioShocks (all 3) and some other older titles - tried Skyrim but i need to edit the .ini for my res since it plays across all 3 screens at the same time! games are at max settings and i'm a happy customer... the FSAA etc. i tend to keep low since i can barely tell the diff at my res in certain games.
> 
> compared to the 2x 7970's, the R9's are quieter, faster and cooler... so far i am happy with my purchase... only now starting to become very tempted to put everything under water; my only concern is the noise from the pumps etc... must research more..
> 
> but my minor (i hope they are minor, eg driver related) issues are the only thing concerning me at the moment - however i did have a lot more trouble with the 7970's...


D5 pumps are pretty much the best bang for buck you can get nowadays.


----------



## taem

Quote:


> Originally Posted by *FuriousPop*
> 
> i love these cards - playing at 7560x1600 - and i only start to hear them after playing for 2+ hours, but even then the sound is minimal since i've got headphones on.
> 
> truly great cards. i've only been playing GW2, Metro (the 1st one), the BioShocks (all 3) and some other older titles - tried Skyrim but i need to edit the .ini for my res since it plays across all 3 screens at the same time! games are at max settings and i'm a happy customer... the FSAA etc. i tend to keep low since i can barely tell the diff at my res in certain games.
> 
> compared to the 2x 7970's, the R9's are quieter, faster and cooler... so far i am happy with my purchase... only now starting to become very tempted to put everything under water; my only concern is the noise from the pumps etc... must research more..
> 
> but my minor (i hope they are minor, eg driver related) issues are the only thing concerning me at the moment - however i did have a lot more trouble with the 7970's...


I don't doubt the performance, this is the consistently most highly rated custom 290 currently available. But I'm wondering about the physical feel of the card. Is the shroud flimsy feeling with a lot of flex? Is there any reinforcement along the pcb to minimize sag? Any fan rattle? (The one criticism I've heard about the Tri-X.)


----------



## kizwan

Quote:


> Originally Posted by *FuriousPop*
> 
> Ladies & Gents,
> 
> finally sold my 2x7970's and got myself 2x Sapphire R9 290 Tri-X in crossfire. Hey look no CF bridge!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've been trying to test these as much as possible, but with very limited time I've been forced to leave the cards running overnight.
> 
> I left Heaven 3.0 rolling and came back early in the morning (approx 8 hours of running), at which point the cards were at approx 74 degrees, both at 100% usage (running 7560x1600 with the standard options already in).
> 
> With the 13.12 drivers I seem to notice that every now and then, after running something for a prolonged time, once I quit and exit the application/game the PC hardlocks and I'm forced to reset. Is the issue driver related?
> 
> I don't think it was power, since it happens at very random times and I cannot consistently replicate the issue.
> I'm running all at stock for the time being, since the RMA option with the reseller store only lasts another day or 2.
> 
> If anyone can recall another person bringing up this issue in the thread, please point me to the page and I'll have a look; I've only managed to read about 10 pages so far... thanks in advance,


Did you try completely removing the drivers using DDU & re-installing fresh drivers (13.12)? If you already did, you might want to try a fresh Windows installation before RMA'ing the cards. If the trouble you had with the 7970's is similar to the current problem, there might be something else broken.


----------



## esqueue

Haven't checked in here in ages. Is 14.1 still detrimental to mining?


----------



## Paul17041993

Quote:


> Originally Posted by *taem*
> 
> There is no glue in chokes is there? I'm hoping that's an intentional epoxy coating on the solid state chokes to prevent whine and vibration. I've seen that before. Just, not on a gpu. But I took the card out and took a closer look, that goo is on every single one of those chokes, so it's not a spill, and it's *highly* unlikely every single choke along that row cracked. Downside is this probably means they used cheap chokes, Sapphire and Asus etc don't need coatings on the outside of their chokes.


huh, that's weird, I wonder why they have that...


----------



## Matt-Matt

Quote:


> Originally Posted by *Belkov*
> 
> I found that it is compatible even for 290X:
> 
> AMD Radeon R9 290X Compatible PSUs
> 
> And more details for Seasonic S12II-620 80 Bronze 620W.
> 
> So +12V Output is 48A.


Well that's good!


----------



## brazilianloser

For you folks with the XSPC block... did you guys have a hell of a time fitting the LED in the exhaust slot? If so, any tricks other than just keep trying until it goes in?

When I was putting it in the first time, I just gave up after trying for a long time and the damn thing just wouldn't go in... But I will be taking the system apart this week again for some additions and minor mods, so I'm going to give it a second round. Any tips would be greatly appreciated.


----------



## MapRef41N93W

Hi, I'd like to join:

http://www.techpowerup.com/gpuz/mdzsa/

Sapphire Tri-X 290x with just the regular Tri-X cooler.


----------



## Arizonian

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Hi, I'd like to join:
> 
> http://www.techpowerup.com/gpuz/mdzsa/
> 
> Sapphire Tri-X 290x with just the regular Tri-X cooler.


Congrats - added. Welcome.


----------



## taem

Quote:


> Originally Posted by *Paul17041993*
> 
> huh, that's weird, I wonder why they have that...


Tobiman's card has it too, so it is by design. It's epoxy to minimize vibration. I've seen this on TV sets, but never before in a gpu.


----------



## Noufel

Thanks arizonian,
Got the famous black screen, but it was a driver issue with the 14.1; reinstalled 13.12 and no more black screens. At 100% load on both GPUs I got:
- top GPU temps 72, VRM1 temps 73, VRM2 temps 58
- second GPU temps 65, VRM1 63, VRM2 temps 55
I have the HAF X case with the 200mm fan side intake cooling the GPUs (I couldn't put in the 120mm air duct because of the Tri-X length; it did well cooling my 7950s' VRMs)


----------



## Paul17041993

Quote:


> Originally Posted by *taem*
> 
> Tobiman's card has it too, so it is by design. It's epoxy to minimize vibration. I've seen this on TV sets, but never before in a gpu.


It wouldn't make sense to put it outside the casing; that's what confuses me...

Inside the casing is an epoxy or heat-based glue designed to stop the coil from colliding with itself and creating the well-known "whine"; the casing just keeps said material contained in a set shape for ease of manufacture. Other types of chokes use a ceramic or iron base, which are completely solid and whine-less (usually), but much more expensive to manufacture, and they can have less capacity (meaning more phases needed).


----------



## taem

Quote:


> Originally Posted by *Paul17041993*
> 
> It wouldn't make sense to put it outside the casing; that's what confuses me...
> 
> Inside the casing is an epoxy or heat-based glue designed to stop the coil from colliding with itself and creating the well-known "whine"; the casing just keeps said material contained in a set shape for ease of manufacture. Other types of chokes use a ceramic or iron base, which are completely solid and whine-less (usually), but much more expensive to manufacture, and they can have less capacity (meaning more phases needed).


Epoxy dipped chokes are common. You've seen them all over the place.



Typically you see them in cheap stuff though, not a $500 GPU







I didn't think that was a possibility until I realized all the chokes on that row are epoxy coated on the same two sides.


----------



## Paul17041993

Quote:


> Originally Posted by *taem*
> 
> Epoxy dipped chokes are common. You've seen them all over the place.
> 
> 
> 
> Typically you see them in cheap stuff though, not a $500 GPU
> 
> 
> 
> 
> 
> 
> 
> I didn't think that was a possibility until I realized all the chokes on that row are epoxy coated on the same two sides.


Of course, but not on the OUTSIDE of the casing; the casing doesn't move at all, so why would you...

First time I've ever seen something like that, excluding open chokes, because of course you would see the glue if there's no case there...

I couldn't find any good pictures of these common chokes taken apart, people don't ever seem to do it for whatever reason, but here are the other two common types;

ceramic chokes on the ASUS 7970 Matrix;


and iron chokes on the MSI 7970 Lightning; (heavy gold paint to prevent rusting)


----------



## Prozillah

Quick question here, guys - I finally thought I had my system running butter smooth, 100%, after an hour of beautiful BF4 play / Metro LL runs & Unigine Heaven benches. However, I've come back to the PC & sat down and noticed a distinct, regular lag or interference with the system. It is noticeable when running the cursor across the desktop. I've got a 120Hz monitor so it's very noticeable, and it seems to "skip a beat" every second or so. Once I load up a BF4 game it is even more noticeable, and moving is horribly jumpy!

I go into CCC and disable xfire and it's back to buttersmooth - go to re-enable it and then it's still buttersmooth. So it's like a very strange intermittent lag that randomly kicks in for some reason until I disable/re-enable xfire in CCC.

Does anyone else have this issue or can shed any light on it?


----------



## Noufel

Quote:


> Originally Posted by *Prozillah*
> 
> Quick question here, guys - I finally thought I had my system running butter smooth, 100%, after an hour of beautiful BF4 play / Metro LL runs & Unigine Heaven benches. However, I've come back to the PC & sat down and noticed a distinct, regular lag or interference with the system. It is noticeable when running the cursor across the desktop. I've got a 120Hz monitor so it's very noticeable, and it seems to "skip a beat" every second or so. Once I load up a BF4 game it is even more noticeable, and moving is horribly jumpy!
> 
> I go into CCC and disable xfire and it's back to buttersmooth - go to re-enable it and then it's still buttersmooth. So it's like a very strange intermittent lag that randomly kicks in for some reason until I disable/re-enable xfire in CCC.
> 
> Does anyone else have this issue or can shed any light on it?


I had the same problem + black screens until I disabled xfire in CCC, removed the 14.1 and installed the 13.12 WHQL; no more problems


----------



## magicase

14.1 is still buggy so use it with caution.


----------



## Prozillah

Quote:


> Originally Posted by *Prozillah*
> 
> Quick question here, guys - I finally thought I had my system running butter smooth, 100%, after an hour of beautiful BF4 play / Metro LL runs & Unigine Heaven benches. However, I've come back to the PC & sat down and noticed a distinct, regular lag or interference with the system. It is noticeable when running the cursor across the desktop. I've got a 120Hz monitor so it's very noticeable, and it seems to "skip a beat" every second or so. Once I load up a BF4 game it is even more noticeable, and moving is horribly jumpy!
> 
> I go into CCC and disable xfire and it's back to buttersmooth - go to re-enable it and then it's still buttersmooth. So it's like a very strange intermittent lag that randomly kicks in for some reason until I disable/re-enable xfire in CCC.
> 
> Does anyone else have this issue or can shed any light on it?


Quote:


> Originally Posted by *Noufel*
> 
> I had the same problem + black screens until I disabled xfire in CCC, removed the 14.1 and installed the 13.12 WHQL; no more problems


Quote:


> Originally Posted by *magicase*
> 
> 14.1 is still buggy so use it with caution.


Just yesterday I did a full DDU uninstall and re-install of the 13.12 drivers; I wasn't even using 14.1, haha. Ahh, it's so random, I do see why some people revert back to Nvidia... not that I'm giving up! I did read somewhere it may be caused by multi-monitor setups? I am using a second screen for stats etc.


----------



## Roboyto

Quote:


> Originally Posted by *brazilianloser*
> 
> For you folks with the XSPC block... did you guys have a hell of a time fitting the LED in the exhaust slot? If so, any tricks other than just keep trying until it goes in?
> 
> When I was putting it in the first time, I just gave up after trying for a long time and the damn thing just wouldn't go in... But I will be taking the system apart this week again for some additions and minor mods, so I'm going to give it a second round. Any tips would be greatly appreciated.


It's a tight squeeze back there, even when installing that LED with the card/block out of the case. My suggestion is to bend the wires/resistor at a 90° angle so you can more easily install it. Bend slowly, though, otherwise you could break the wires. It's hard to get the LED to line up straight and slide in; I thought there was something wrong at first as well.


----------



## Belkov

Something interesting for the GB R9 290 WF3 - it seems that GB decided to play it safe (after all, their cards have the highest stock core clock) and set the voltage to 1.156V. My max temps with everything at stock are 75 degrees for the core and 78 degrees for VRM1. I've just tried undervolting my card by -50mV and ran the Heaven bench for an hour. Well, the max temp for the core was 70 degrees and 69 for VRM1...







And the card was absolutely stable and of course less noisy...
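That temp drop lines up with the first-order CMOS dynamic power model, P ∝ f * V², so at the same clocks a -50mV drop from 1.156V cuts dynamic power by roughly 8.5%. A quick sketch (simple model only; it ignores leakage and load-line effects):

```python
def relative_dynamic_power(v_new, v_old):
    """First-order CMOS dynamic power model: P is proportional to f * V^2,
    so at a fixed clock the power ratio is just (V_new / V_old)^2."""
    return (v_new / v_old) ** 2

stock = 1.156                # stock voltage reported for the WF3
undervolted = stock - 0.050  # the -50mV offset applied

ratio = relative_dynamic_power(undervolted, stock)
print(f"{(1 - ratio) * 100:.1f}% less dynamic power")  # roughly 8.5%
```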

Can someone tell me what the stock voltage of the Tri-X and PCS+ is?


----------



## axiumone

Two out of four of these bad boys arrived at my door step today.






I'd be installing them right now if performance-pc didn't completely screw the pooch on my fittings order. They won't reply to my emails or phone calls.


----------



## pkrexer

Quote:


> Originally Posted by *axiumone*
> 
> Two out of four of these bad boys arrived at my door step today.
> 
> 
> 
> 
> 
> 
> I'd be installing them right now if performance-pc didn't completely screw the pooch on my fittings order. They won't reply to my emails or phone calls.


Very nice!

Seems like performance-pc screws up a lot of orders. Makes me think twice about ordering from them again (though I haven't had an issue yet). Their prices are generally lower than everyone else's, though.


----------



## taem

Quote:


> Originally Posted by *Belkov*
> 
> Something interesting for the GB R9 290 WF3 - it seems that GB decided to play it safe (after all, their cards have the highest stock core clock) and set the voltage to 1.156V. My max temps with everything at stock are 75 degrees for the core and 78 degrees for VRM1. I've just tried undervolting my card by -50mV and ran the Heaven bench for an hour. Well, the max temp for the core was 70 degrees and 69 for VRM1...
> 
> 
> 
> 
> 
> 
> 
> And the card was absolutely stable and of course less noisy...
> 
> Can someone tell me what the stock voltage of the Tri-X and PCS+ is?


Powercolor PCS+ 290 here. Just ran a Valley loop at stock settings. Note that stock settings for this card are 1040 core, 1350 memory, *+50mV vddc*.

I can't get a fixed voltage reading, it fluctuates, even with constant voltage on. So I don't quite know how to determine the precise # you're looking for.


----------



## Roboyto

I got thermal pad sizes for all the 290(X) water blocks for reference cards. HeatKiller and Koolance were very quick to respond to e-mail inquiries.

If you didn't catch temperature drops with Fujipoly Ultra Extreme you can see it here:

http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures

Thermal pad sizes are as follows:

XSPC Razor: VRMs 1mm | RAM 1mm
EK FC: VRMs 1mm** | RAM 0.5mm
Koolance: VRMs 0.7mm* | RAM 0.7mm*
HeatKiller: VRMs 1mm | RAM 0.5mm
AquaComputer: VRMs 0.5mm | RAM only uses non-conductive TIM

*From Koolance Tech Support: "As any one manufacturer's card could have slight differences from another, we often include more than one thickness so that the customer can achieve the best contact on their block. Ideally you use as thin as possible, but some cards have shorter chips and demand the thicker pads."
**EK suggests using non-conductive TIM *with* the VRM pads
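The same sizes as a quick lookup, if you want them in one place (a sketch that simply restates the list above; thicknesses are in mm, and None marks where the block uses non-conductive TIM instead of a pad):

```python
# Thermal pad thicknesses (mm) for reference 290/290X water blocks,
# restated from the list above.
PAD_SIZES_MM = {
    "XSPC Razor":   {"vrm": 1.0, "ram": 1.0},
    "EK FC":        {"vrm": 1.0, "ram": 0.5},  # EK: non-conductive TIM *with* VRM pads
    "Koolance":     {"vrm": 0.7, "ram": 0.7},  # diagram calls for 0.7mm; 1mm also included
    "HeatKiller":   {"vrm": 1.0, "ram": 0.5},
    "AquaComputer": {"vrm": 0.5, "ram": None},  # RAM: non-conductive TIM only
}

def pad_for(block, area):
    """Return the pad thickness in mm for a block and area ('vrm' or 'ram')."""
    return PAD_SIZES_MM[block][area]

print(pad_for("Koolance", "vrm"))  # 0.7
```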

Koolance responded to my questions about using TIM with the thermal pads, what kind of pad they include and their specific 0.7mm thickness pads.

_"Koolance Tech [email protected]

12:00 PM (43 minutes ago)

We include 0.7mm and 1mm thick pads but the diagram included with the block calls for 0.7mm. We do not suggest it as we don't believe it to be necessary but there's nothing stopping you from doing it. You won't void your warranty or anything. Just make sure to get the paste nice and clean so that it doesn't drip or ooze down onto components and cause them to short.

-Dylan
Koolance Technical Support

12:30 PM (18 minutes ago)
Hello Bob,

The thermal pads we include are our own proprietary pad.

-Dylan
Koolance Technical Support_

Robert Melchert
12:34 PM (1 hour ago)
Thanks for lightning fast response again. One last question for you: *Is it possible to use a different thickness thermal pad than 0.7mm?* Would a 0.5mm or 1mm affect how the block mounts to the card and cause contact issues with the core?

Koolance Tech

1:22 PM (51 minutes ago)
As any one manufacturers card could have slight differences from another, we often include more than one thickness so that the customer can achieve the best contact on their block. *Ideally you use as thin as possible, but some cards have shorter chips and demand the thicker pads.
*
-Dylan
Koolance Technical Support


----------



## Belkov

No software can show you the real voltage used by the card, but I decided to use GPU-Z for this purpose, since it seems to be closest to reality. So I checked my max voltages with it. At stock my card runs at 1.156V max under load at 1040/1250MHz, but it is perfectly stable at 1.109V. That's why I wanted to know what the usual voltages are for other 290s.


----------



## taem

Quote:


> Originally Posted by *Belkov*
> 
> No software can show you the real voltage used by the card, but I decided to use GPU-Z for this purpose, since it seems to be closest to reality. So I checked my max voltages with it. At stock my card runs at 1.156V max under load at 1040/1250MHz, but it is perfectly stable at 1.109V. That's why I wanted to know what the usual voltages are for other 290s.


Well if you're looking for max voltage it's 1.220v. It's stable at less, my quiet profile is 1060 core 1450 mem +31 vddc (which is a -19 vddc from stock). I can go lower but there is no point since this quiet profile runs at 50% fan which is inaudible and <70c across the board.
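Spelled out, that offset arithmetic looks like this (a minimal sketch; values in mV, taken from the posts above, where the PCS+ ships with a +50mV factory vddc offset):

```python
# The PCS+ 290 ships with a +50mV factory vddc offset (per the post above),
# so a user-set +31mV is actually 19mV below factory stock.
FACTORY_OFFSET_MV = 50  # PCS+ stock vddc offset

def offset_vs_factory(user_offset_mv, factory_offset_mv=FACTORY_OFFSET_MV):
    """How far a user-set vddc offset sits from the card's factory offset (mV)."""
    return user_offset_mv - factory_offset_mv

print(offset_vs_factory(31))  # -19, i.e. 19mV under factory stock
```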


----------



## Belkov

Quote:


> Originally Posted by *taem*
> 
> Well if you're looking for max voltage it's 1.220v. It's stable at less, my quiet profile is 1060 core 1450 mem +31 vddc (which is a -19 vddc from stock). I can go lower but there is no point since this quiet profile runs at 50% fan which is inaudible and <70c across the board.


Yes, I saw your max voltage. Thank you.









And very nice temps, I must say...







This PCS+ cooling solution is one of the best I've ever seen. I love the WF3 but I must admit that the PCS+ is superior.


----------



## taem

So what do you guys think is a safe max voltage for a 290 for everyday operation? I can control the heat, but I wonder about running max volts well into the 1.3x range for prolonged periods.
Quote:


> Originally Posted by *Belkov*
> 
> Yes, i saw your max voltage. Thank you.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And very nice temps, I must say...
> 
> 
> 
> 
> 
> 
> 
> This PCS+ cooling solution is one of the best i've ever seen. I love WF3 but i must admit that PCS+ is superior.


It's loud though. Noticeable at 60%, loud at 80%. At lower fan speeds it cools about the same as other customs. It does excel at cooling if you can take the noise. It's also huge, at 52mm thick. And it seems optimized for high-RPM cooling; I wonder if that's a consequence of the fat heatsink. The Tri-X has a long, flat sink and seems to do much better in a low-clock, low-RPM environment, as does the MSI Gaming. I wish it had better power delivery than 5+1+1, since it does handle high clocks well temperature-wise.


----------



## rv112

Hi guys! I got a 290 Tri-X. I'm using it with the 13.12 WHQL driver and MSI Afterburner. Everything is at default, but the CPU usage is jumping like crazy between 0-100% in Afterburner. Why?! I can't read the right usage of the GPU. Same system and drivers; with my old 7950 I didn't have that issue.
I also tested the 14.1 Beta 6 driver and set up the power limit.

Anybody can help?


----------



## Roboyto

Quote:


> Originally Posted by *rv112*
> 
> Hi guys! I got a 290 Tri-X. Using it with the 13.12 WHQL and MSI Afterburner. Everything to default, but the CPU usage is jumping like crazy between 0-100% in Afterburner. Why?! I can't read the right usage of the GPU. Same System and drivers, with my old 7950 I don't have that issue.
> I also tested the 14.1 Beta 6 driver and set up the power limit.
> 
> Anybody can help?


Did you scrub drivers between installs? If everything isn't removed properly it can cause some issues. Use Display Driver Uninstaller to make sure it's squeaky clean.


----------



## rv112

I've already used the AMD uninstaller, yes. Reinstalled driver and Afterburner, but still the same







.


----------



## Mercy4You

Quote:


> Originally Posted by *rv112*
> 
> I've already used the AMD uninstaller, yes. Reinstalled driver and Afterburner, but still the same
> 
> 
> 
> 
> 
> 
> 
> .


The AMD uninstaller is not the right way to go. You should use DDU to be completely sure you get a clean install after that.

I have seen very strange behavior of my reference R9 290X when updating the drivers without using DDU first.


----------



## Paul17041993

Quote:


> Originally Posted by *rv112*
> 
> Hi guys! I got a 290 Tri-X. Using it with the 13.12 WHQL and MSI Afterburner. Everything to default, but the CPU usage is jumping like crazy between 0-100% in Afterburner. Why?! I can't read the right usage of the GPU. Same System and drivers, with my old 7950 I don't have that issue.
> I also tested the 14.1 Beta 6 driver and set up the power limit.
> 
> Anybody can help?


Looks like a bottleneck of some sort. What are you running, and how's your CPU behaving? You mention CPU usage, but I can only assume you mean GPU, as that's what's in the picture.

Driver-wise, stick to 13.12; 14.1 is very, very buggy on multiple levels and in multiple areas.


----------



## rv112

I use 13.12 WHQL right now. I only get this issue with MSI Afterburner. GPU-Z shows the right GPU usage, it seems. But why? Can anybody else please check with Afterburner? It only happens if you limit the FPS.


----------



## FuriousPop

Quote:


> Originally Posted by *taem*
> 
> I don't doubt the performance, this is the consistently most highly rated custom 290 currently available. But I'm wondering about the physical feel of the card. Is the shroud flimsy feeling with a lot of flex? Is there any reinforcement along the pcb to minimize sag? Any fan rattle? (The one criticism I've heard about the Tri-X.)


The card itself does feel pretty flexible. With my first slot almost touching the Noctua (it just barely touches the clips that attach the fan to the heatsink and stick out a few mm), I could hear it vibrate, and putting my finger on the top corner of the card absorbed the sound immediately - so it's more a positioning thing than a faulty card.

The cards do not sag at all; they seem to sit nicely in the board/case. No fan rattle at the moment, and the highest fan speed I have seen is 70%, which was at 74 degrees.

I've heard about the fan rattle from other users, but also found that whether the issue appears can depend on which drivers you run.


----------



## FuriousPop

Quote:


> Originally Posted by *kizwan*
> 
> Did you try completely remove the drivers using DDU & re-install fresh drivers (13.12)? If you already did, you might want to try it with fresh windows installation before RMA the cards. If the trouble you had with 7970's is similar to the current problem, there might be something else broken.


Yes - I previously had my 7970s in there. I uninstalled Catalyst + all drivers, restarted, did a driver sweep, restarted, then tested and looked for any remaining files but could not locate anything at all. I shut down, installed the new cards, installed the 13.12 drivers and set up Eyefinity.

I had a lot of trouble with the 7970s, however it was all driver related, and I managed to reinstall Windows close to 30+ times - there has to be a better way, hence why I have become pretty confident in the above method of using the driver sweeper.

However, I installed GPU-Z 0.7.7 and found a concern.

The 1st GPU shows the Hynix memory, HOWEVER the 2nd GPU shows (GDDR5 (Autodetect)) - wondering if anyone else has come across this one?
I also noticed that the bus width and shaders were different between the two GPUs.

GPU & mem clocks were still identical to each other: 1000/1300.

Any ideas? Or is this a GPU-Z issue?


----------



## taem

Quote:


> Originally Posted by *FuriousPop*
> 
> The card itself does feel pretty flexible. Having my first slot almost touching the Noctua (just barely touches the clamps of the fan which clip into the heat sink and stick out just a few mm's) i could hear it vibrate and putting my finger on the top corner of the card absorbed the sound immediately - so its more a positioning thing than a faulty card.
> 
> Cards do not Sag at all, they seem to be sitting nicely into the board/case.


I have one on the way like I said. I just worry because I'm looking at pics and there seems to be no pcb support at all. I think current cards don't sag like the cards of old, but still, I would have liked to see a metal reinforcement along one of the edges, or better yet, a backplate.




It's a great looking card though. The XFX DD is my favorite, but I think this is second best out of the custom 290s. But I happen to love yellow and orange, lol. Edit to add: the weight of the card is 1022g.

Edit: I guess there is a full contact plate between the heatsinks and the PCB. Would this prevent sag? Looks like it might have the same effect as a backplate.


----------



## Jack Mac

Quote:


> Originally Posted by *rv112*
> 
> Hi guys! I got a 290 Tri-X. Using it with the 13.12 WHQL and MSI Afterburner. Everything to default, but the CPU usage is jumping like crazy between 0-100% in Afterburner. Why?! I can't read the right usage of the GPU. Same System and drivers, with my old 7950 I don't have that issue.
> I also tested the 14.1 Beta 6 driver and set up the power limit.
> 
> Anybody can help?


If you're using the Afterburner from the MSI website, it's probably outdated and the GPU usage is a software bug. If your gameplay is smooth don't worry about it and you can actually fix it by downloading AB Beta 18 from Guru3D. I had that problem when I had my 290 before I updated Afterburner.


----------



## FuriousPop

Quote:


> Originally Posted by *Prozillah*
> 
> Quick question here guys - I finally thought I had my system running butter smooth 100% after an hour of beautiful BF4 play / Metro LL runs & Unigine Heaven benches. However, I've come back to the PC, sat down, and noticed a distinct, regular lag or interference in the system. It is noticeable when running the cursor across the desktop. I've got a 120Hz monitor so it's very noticeable, and it seems to "skip a beat" every second or so. Now once I load up a BF4 game it is even more noticeable and moving is horribly jumpy!
> 
> I go into CCC and disable xfire and it's back to buttersmooth - go to re-enable it and then still buttersmooth. So it's like a very strange intermittent lag that randomly kicks in for some reason until I disable/re-enable my xfire in CCC.
> 
> Does anyone have this issue or can anyone shed any light on it?


I had this issue as well.

When trying to run the Heaven 4.0 benchmark at a lower res than my Eyefinity setup (7560x1600) - I tried to run at 1920x1080 (to compare scores) and it would project to all 3 screens (I forgot to make the setting change) - once I did, it was like everything went into slow-mo... disabling and re-enabling the xfire snapped it out immediately.

The 2nd time around, the DVI-D cable was a little loose - when trying to push it back in, it came out first (I slipped, lol), then I pushed it back in and the same issue occurred again; same fix, and it has worked without an issue since.


----------



## FuriousPop

Quote:


> Originally Posted by *taem*
> 
> I have one on the way like I said. I just worry because I'm looking at pics and there seems to be no pcb support at all. I think current cards don't sag like the cards of old, but still, I would have liked to see a metal reinforcement along one of the edges, or better yet, a backplate.
> 
> 
> 
> 
> Its a great looking card though. The XFX DD is my favorite, but I think this is second best out of the custom 290s. But I happen to love yellow and orange lol. edit to add, weight of card is 1022g.


I just love the fact that they are 2-slotters - my previous bricks, the 3-slot 7970s, heated up my room faster than any heater I have in the house!!!

2-slotters are like cotton - they breathe!

The weight - well, it can only get better coming from the 7970s.......


----------



## Tobiman

Just did a firestrike run and thought I broke the bank until I saw 780TI scores. Really makes me want to get a 4930k and put this badboy under water when my tax return arrives.

http://www.3dmark.com/3dm/2534258


----------



## taem

Quote:


> Originally Posted by *Tobiman*
> 
> Just did a firestrike run and thought I broke the bank until I saw 780TI scores. Really makes me want to get a 4930k and put this badboy under water when my tax return arrives.
> 
> http://www.3dmark.com/3dm/2534258


You're running 1250/1600? 4670k @ 4.4? What's your vddc offset?

Here's mine running 1200/1500 with 4670k @ 4.6, Asus z87 pro. http://www.3dmark.com/fs/1740273


----------



## Tobiman

My cpu is actually at 4.6ghz and 1250/1600 for the 290. I think I just put that sucker at 200mv in trixx.


----------



## Matt-Matt

Does anyone here have an Aquacomputer kryographics block?
If so, what is it like overall?


----------



## Shurtugal

Hey, will a 450w Silverstone Gold Rated SFX PSU run my 290 + i5? No overclocking.


----------



## Tobiman

Quote:


> Originally Posted by *Shurtugal*
> 
> Hey, will a 450w Silverstone Gold Rated SFX PSU run my 290 + i5? No overclocking.


That's borderline but you should be fine as long as you aren't overclocking.


----------



## Falkentyne

Quote:


> Originally Posted by *Forceman*
> 
> We really need LLC control.
> None of the cards released so far have memory voltage control.


You can enable LLC in Afterburner with an IC command. You'll have to find the command for yourself.
In beta 18, the addressing is different, so the wi4 commands to change voltage from beta 17 now use wi6 instead of wi4.
Keep that in mind when you find the IC command for LLC. (No, I don't know or remember the command, nor do I know the beta 18 adjustment.)


----------



## Paul17041993

Quote:


> Originally Posted by *rv112*
> 
> It's only if you limit the FPS.


OK, that's why: Afterburner uses a raw graph whereas GPU-Z will average it out, so you're effectively seeing the sleep times in Afterburner.
Quote:


> Originally Posted by *Shurtugal*
> 
> Hey, will a 450w Silverstone Gold Rated SFX PSU run my 290 + i5? No overclocking.


In theory, yes, but it's very borderline; check that your 12V rail can supply a minimum of 350W, for one thing.


----------



## FuriousPop

Guys, would anyone know why, in GPU-Z with crossfired (Sapphire) 290s, one card shows Hynix memory and the other is on autodetect?

Edit: OMG, the ULPS - that must be the issue........ can someone confirm this has to be turned off after EACH driver update/remove/install???

I totally forgot about that - but I could have sworn I made the change in my registry for my 7970s. Does it revert back to default when you change drivers!?? That's news to me..


----------



## rickyman0319

How do I air cool the R9 290? When I am mining, the temp is around 70-75 degrees. Is that normal?


----------



## tsm106

Quote:


> Originally Posted by *FuriousPop*
> 
> Guys, would anyone know why, in GPU-Z with crossfired (Sapphire) 290s, one card shows Hynix memory and the other is on autodetect?
> 
> Edit: OMG, the ULPS - that must be the issue........ can someone confirm this has to be turned off after EACH driver update/remove/install???
> 
> I totally forgot about that - but I could have sworn I made the change in my registry for my 7970s. Does it revert back to default when you change drivers!?? That's news to me..


Yep, you have to re-disable ULPS, and redo any other reg edits you've made, after every driver install.
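For reference, ULPS is controlled by the EnableUlps DWORD under the display-adapter class key in the registry. Here's a sketch that only builds the reg.exe commands rather than running them (the class GUID is the standard Windows display-adapter one, but the numbered subkeys, 0000, 0001 and so on, vary per machine, so check yours in regedit first):

```python
# Sketch: build (but do not run) the reg.exe commands that set EnableUlps=0
# for each display-adapter subkey. The GUID below is the standard Windows
# display-adapter class key; the numbered subkeys vary per machine.
DISPLAY_CLASS = (r"HKLM\SYSTEM\CurrentControlSet\Control\Class"
                 r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

def ulps_disable_cmds(subkeys):
    """Return reg.exe commands disabling ULPS for the given adapter subkeys."""
    return ['reg add "{}\\{}" /v EnableUlps /t REG_DWORD /d 0 /f'
            .format(DISPLAY_CLASS, sk) for sk in subkeys]

for cmd in ulps_disable_cmds(["0000", "0001"]):
    print(cmd)
```

Remember that, as noted above, the value resets on every driver install, so you'd re-run whatever commands apply to your machine each time.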


----------



## Paul17041993

Quote:


> Originally Posted by *rickyman0319*
> 
> how do I air cool the r9 290? when I am mining, the temp is around 70-75 degree. is that normal?


What cooler? 75C is really quite low for these cards. Is the fan on a high setting or just the stock profile?


----------



## rickyman0319

Quote:


> Originally Posted by *Paul17041993*
> 
> What cooler? 75C is really quite low for these cards. Is the fan on a high setting or just the stock profile?


I have an XFX R9 290. The fan speed is around 4000 RPM at 85%.


----------



## Roboyto

Quote:


> Originally Posted by *Matt-Matt*
> 
> Has anyone here for a Aquacomputer kyrographics block?
> If so what is it like overall?


I don't have one personally, but I have seen quite a few on here. They look quite nice with dyed/colored coolant running through them. Performance-wise they're matched pretty evenly with the others for GPU core cooling. However, they seem to have the best VRM cooling of the lot (while using the supplied thermal pads), especially if you use their backplate. Either will work, as they have two, but the active backplate, with a heatpipe to cool VRM1 and the chokes from the backside, is very effective and pretty sweet looking.




Check out this waterblock roundup to see excellent VRM cooling:

http://www.xtremerigs.net/2014/01/01/r9-290x-gpu-block-performance-summary/

Here is one pretty sweet example with green coolant:
Quote:


> Originally Posted by *vieuxchnock*
> 
> This is my rig. I have CPU and GPU in the loop with a 180 mm rad.


----------



## rv112

Quote:


> Originally Posted by *Jack Mac*
> 
> If you're using the Afterburner from the MSI website, it's probably outdated and the GPU usage is a software bug. If your gameplay is smooth don't worry about it and you can actually fix it by downloading AB Beta 18 from Guru3D. I had that problem when I had my 290 before I updated Afterburner.


I already use Beta 18 of Afterburner. I also reinstalled it.


----------



## Shurtugal

Quote:


> Originally Posted by *Tobiman*
> 
> That's borderline but you should be fine as long as you aren't overclocking.


Quote:


> Originally Posted by *Paul17041993*
> 
> OK, that's why: Afterburner uses a raw graph whereas GPU-Z will average it out, so you're effectively seeing the sleep times in Afterburner.
> In theory, yes, but it's very borderline; check that your 12V rail can supply a minimum of 350W, for one thing.


Quote:


> Originally Posted by *Tobiman*
> 
> That's borderline but you should be fine as long as you aren't overclocking.


This is the one here:
450w SFX Edit: Forgot to add the link!








It says "Class-leading single +12V rail with 37A", if that means anything?
Am I able to get an SFX PSU bigger than 450W?
Thanks for your help. I ordered the graphics card and had ordered the PSU, but an issue with the motherboard being unavailable has put things on hold until I choose another motherboard.

Thanks for your help!


----------



## Paul17041993

Quote:


> Originally Posted by *Shurtugal*
> 
> *single +12V rail with 37A*


12*37 = 444W peak, so you should be golden, and with an SFX build I can only assume you'd have just a couple of drives too, so that shouldn't be a concern.
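Spelled out (a sketch; the 350W figure is the minimum suggested earlier in the thread, not an official spec):

```python
# A single +12V rail rated at 37A can deliver 12 * 37 = 444W, which clears
# the ~350W 12V minimum suggested earlier for a 290 + i5 at stock clocks.
def rail_watts(volts, amps):
    """Peak power a rail can deliver, in watts."""
    return volts * amps

capacity = rail_watts(12, 37)
print(capacity)          # 444
print(capacity >= 350)   # True, meets the suggested minimum
```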


----------



## Shurtugal

Quote:


> Originally Posted by *Paul17041993*
> 
> 12*37 = 444W peak, so you should be golden, and with an SFX build I can only assume you'd have just a couple of drives too, so that shouldn't be a concern.


Yeah, just an SSD, a HDD, no disc drive, i5 CPU, 8GB ram, Gigabyte 290. Thanks!


----------



## Arizonian

Quote:


> Originally Posted by *axiumone*
> 
> Two out of four of these bad boys arrived at my door step today.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'd be installing them right now if performance-pc hadn't completely screwed the pooch on my fittings order. They won't reply to my emails or phone calls.


Congrats - added #300
















If I could ask you please edit and place a GPU-Z validation link in that post for reference with OCN name showing.
http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/17690#post_21844280

Hope you get your fittings situation figured out.


----------



## axiumone

Thanks Arizonian. I will post a validation along with a pic of the 4 cards once I have them all together.









Here are my findings on these CryoVenoms so far. I'm only able to get 1127 on the core and 1480 on the memory. Considering they advertise these cards as capable of 1175, and that 1128MHz on the core with any amount of additional voltage produces artifacts in 3DMark, that's a little disappointing.

I'm not sure what to make of these EK waterblocks. Core temp tops out at 45c during heavy benchmarking and in the high 30's gaming. What really bothers me are the vrm temps. Vrm 1 can get up to 82c in benches, while vrm 2 stays around 40c. Can anyone with an EK block chime in please? Is this normal? This is all with an ambient temp of 21c.

Edit - also, forgot to mention. Both cards so far have Hynix memory and are locked.


----------



## chiknnwatrmln

No, during extended gaming sessions at sustained 100% load at 1205/1437 with +168mV I get max of 51c core and VRM1.

I also have FujiPoly Ultra Extreme pads, but 80c is too high.


----------



## axiumone

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> No, during extended gaming sessions at sustained 100% load at 1205/1437 with +168mV I get max of 51c core and VRM1.
> 
> I also have FujiPoly Ultra Extreme pads, but 80c is too high.


Thanks for a quick reply! I wonder then if it's a case of substandard thermal material from the factory, like always.


----------



## Forceman

Quote:


> Originally Posted by *axiumone*
> 
> I'm not sure what to make of these EK waterblocks. Core temp tops out at 45c during heavy benchmarking and in the high 30's gaming. What really bothers me are the vrm temps. Vrm 1 can get up to 82c in benches, while vrm 2 stays around 40c. Can anyone with an EK block chime in please? Is this normal? This is all with an ambient temp of 21c.
> 
> Edit - also, forgot to mention. Both cards so far have Hynix memory and are locked.


I'm using the stock pads and I get VRM1 temps around 60C under a full load (BF4, Valley, etc). My core is low to mid-50s at the same time, so I'd say something is off with yours. You used the 1mm pads on the VRMs?


----------



## Roboyto

Quote:


> Originally Posted by *axiumone*
> 
> Thanks for a quick reply! I wonder then if it's a case of substandard thermal material from the factory, like always.


The highest VRM1 temps I get are usually in Valley. The highest VRM1 temp I have recorded was 81C @ 1315/1675 with a +200mV offset during an FFXIV bench, before swapping pads.

What's the warranty policy for those cards - can you remove the block? If so, you should consider changing the thermal pads for VRM1 at least. I observed a 23% temp drop and am now getting the same temps as chiknnwatrmln.

http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures

Since you're getting 4 cards you'd need to order 2 of these:

http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html?id=XfebVm5d&mv_pc=358


----------



## Coree

Quote:


> Originally Posted by *Shurtugal*
> 
> Hey, will a 450w Silverstone Gold Rated SFX PSU run my 290 + i5? No overclocking.


I have an E3-1230V3 + 290X @ 1093/1325, not overvolted, paired up with a SuperFlower GG 450W PSU; it's enough. No hard lockups or anything. Even at full load, the PSU is quiet. Prolly draws ~380W.


----------



## kizwan

Quote:


> Originally Posted by *axiumone*
> 
> Thanks Arizonian. I will post a validation along with a pic of the 4 cards one I have them all together.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here are my findings on these CryoVenoms so far. I'm only able to get 1127 on the core and 1480 on the memory. Considering they advertise these cards as capable of 1175, and that 1128MHz on the core with any amount of additional voltage produces artifacts in 3DMark, that's a little disappointing.
> 
> I'm not sure what to make of these EK waterblocks. Core temp tops out at 45c during heavy benchmarking and in the high 30's gaming. What really bothers me are the vrm temps. Vrm 1 can get up to 82c in benches, while vrm 2 stays around 40c. Can anyone with an EK block chime in please? Is this normal? This is all with an ambient temp of 21c.
> 
> Edit - also, forgot to mention. Both cards so far have Hynix memory and are locked.


Since you mentioned heavy benching, low 80s Celsius for VRM1 is pretty normal with the EK water block & stock EK thermal pads. Mine managed to go up to 82C when benching at 1200/1600 +200mV (max 1.422V, average 1.344 - 1.359V) in 32C ambient (indoor).
Quote:


> Originally Posted by *axiumone*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *chiknnwatrmln*
> 
> No, during extended gaming sessions at sustained 100% load at 1205/1437 with +168mV I get max of 51c core and VRM1.
> 
> I also have FujiPoly Ultra Extreme pads, but 80c is too high.
> 
> 
> 
> 
> 
> 
> 
> Thanks for a quick reply! I wonder then if it's a case of substandard thermal material from the factory, like always.

Substandard? No. I used an EK water block with the stock EK thermal pads before, on a previous-gen AMD card. It kept the VRMs running cool without too much hassle. The reason is simply that the 290's VRM1 runs hot.


----------



## Shurtugal

Quote:


> Originally Posted by *Coree*
> 
> I have a E3-1230V3 + 290X @ 1093/1325 not overvolted paired up with a SuperFlower GG 450W PSU, it's enough. No hard lockups or so. Even at full load, the PSU is quiet. Prolly draws ~380W


Awesome, gives me some peace of mind, thanks!








+ Rep


----------



## Paul17041993

Quote:


> Originally Posted by *Coree*
> 
> I have a E3-1230V3 + 290X @ 1093/1325 not overvolted paired up with a SuperFlower GG 450W PSU, it's enough. No hard lockups or so. Even at full load, the PSU is quiet. Prolly draws ~380W


Quote:


> Originally Posted by *Shurtugal*
> 
> Awesome, gives me some peace of mind, thanks!
> 
> 
> 
> 
> 
> 
> 
> 
> + Rep


Yeah, the sheer size of these chips makes them pull a crapload of power with even small overclocks, but at stock they are pretty efficient, especially under water. I'm yet to see my (reference) 290X at stock pull any more than 200W, though I only have a 1050p screen atm, so it's not really being used to its potential yet...


----------



## Csokis

Catalyst 14.2 Beta is all about Thief
Quote:


> AMD will roll out Catalyst 14.2 Beta in a few hours and the new driver is all about Thief. The game launches today and it's part of AMD's Gaming Evolved programme.


----------



## kpoeticg

Just got my daily black screen with nothing open except GPU Tweak + Windows 7 Desktop.

Just thought I'd share my happiness with everybody from my laptop instead of my $4-5k and growing PC.









In 3D I have memory at stock (5GHz), core at 1050, core voltage at 1300mV, power +150.

In 2D I have the same with the core at 1000 instead of 1050. Both fan curves lean towards high RPMs, with two 120mm fans pointing straight at the card.

Overdrive disabled, 13.12 WHQL (or 13.25??), ASUS BIOS flashed, PowerColor reference card. Nothing installed except CCC & GPU Tweak.

Anybody have any insight? I really wanna put this AC block + backplate on, but not if I'm gonna have to take it off again. I feel like I'm missing something. I can game in AC IV til I get sick of it, quit to desktop, then 30 minutes later the display driver just stops. Or it can happen 20 minutes into gaming. Or I can just turn on my PC and wait for blackness; it'll happen before I tire of waiting for it....

/rant


----------



## rdr09

Quote:


> Originally Posted by *kpoeticg*
> 
> Just got my daily black screen with nothing open except GPU Tweak + Windows 7 Desktop.
> 
> Just thought i'd share my happiness with everybody from my laptop instead of my $4-5k and growing PC
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In 3D i have memory at stock (5GHz), core at 1050, core V at 1300mV, power +150
> In 2D i have the same with core at 1000 instead of 1050. Both fan curves leaning towards high rpm's with 2 120 fans pointing straight at the card.
> 
> Overdrive disabled, 13.12 WHQL (or 13.25??) ASUS bios flashed, Powercolor reference card. Nothing installed except CCC & GPU Tweak.
> 
> Anybody have any insight? I really wanna put this AC Block + Backplate on, but not if i'm gonna have to take it off again. I feel like i'm missing something. I can game in AC IV til i get sick of it, quit to desktop, then 30min's later the display driver just stops. Or it can happen 20 minutes into gaming. Or just turning on my PC and waiting for Blackness, it'll happen before i tire of waiting for it....
> 
> /rant


Mercy and I told you what you need to do last week, but it is just a recommendation. Do not put the block on.


----------



## vortex240

Flash back to the stock BIOS and uninstall GPU Tweak. Did you disable ULPS? Run Driver Cleaner and reinstall 13.12. Have you reseated the card in the PCIe slot? There could be some dust in there; I've had that happen before.

Sounds like a software issue.


----------



## SimonKaz

Can someone explain to me why I'm seeing dips to 40 FPS in Tomb Raider with not-really-maxed-out settings (TressFX off, AA turned down to FXAA) when there are videos of people playing on similar rigs at an average of 80 FPS? I'm looking at this video in particular:




My rig is in the sig.

What am I doing wrong?


----------



## kpoeticg

Quote:


> Originally Posted by *rdr09*
> 
> Mercy and I told you what you need to do last week, but it is just a recommendation. Do not put the block on.


You mean the RMA?

Newegg pulled a fast one on me, if you remember, and gave me a VERY limited return window. So I would have to send it to PowerColor or AMD. Which would you recommend? I can very easily see them having my card for 2 weeks and then sending it back to me because it "powers on and shows a display just fine".

Do you think they'll actually replace it for me?

Quote:


> Originally Posted by *vortex240*
> 
> Flash back to stock bios, uninstall GPU Tweak. Did you disable ULPS? Run driver cleaner and reinstall 13.12 - have you reseated the card in the PCIe slot? Could be some dust in there, I've had that happen before.
> 
> Sounds like a software issue.


I've flashed every bios on there except the PT1 & PT3. I stayed on the Asus because the voltage unlock seems to help a bit. Still get a cpl black screens a day though.

I didn't disable ULPS because i only have the 1 card so far. I plan on a Triple 290x build, but only have the one so far. I've definitely used DDU more times than i can count, and also reseated the card AND the cooler =\

Edit: Also, thanx for helping guys. I know I've brought this up multiple times in multiple threads recently.


----------



## rdr09

Quote:


> Originally Posted by *kpoeticg*
> 
> You mean the RMA?
> Newegg pulled a fast one on me if you remember and gave me a VERY limited return window. So i would have to send it to Powercolor or AMD. Which would you recommend? I just can very easily see them having my card for 2 weeks and then sending it back to me because it "Powers On and shows display just fine"
> Do you think they'll actually replace it for me?
> 
> I've flashed every bios on there except the PT1 & PT3. I stayed on the Asus because the voltage unlock seems to help a bit. Still get a cpl black screens a day though.
> I didn't disable ULPS because i only have the 1 card so far. I plan on a Triple 290x build, but only have the one so far. I've definitely used DDU more times than i can count, and also reseated the card AND the cooler =\
> 
> Edit: Also, thanx for helping guys. I know I've brought this up multiple times in multiple threads recently.


k, send it back to powercolor but take good pictures of it from all sides. you have no choice.

i only use the Egg as a search function. i buy from SuperBiiz or Provantage.


----------



## kpoeticg

Cool, i'll definitely take good pics. I just hope they actually do something about it. Googling the issue has brought me across a cpl threads where manufacturers just sent the card back stating that there's nothing wrong with it.

I used to always stick up for Newegg on here because i always used them for my main components. This (+ a cpl other recent transactions) has soured me to them. Gonna have to join Amazon Prime. Do SuperBiiz and Provantage have good reputations for return policies and warranties?


----------



## rdr09

Quote:


> Originally Posted by *kpoeticg*
> 
> Cool, i'll definitely take good pics. I just hope they actually do something about it. Googling the issue has brought me across a cpl threads where manufacturers just sent the card back stating that there's nothing wrong with it.
> 
> I used to always stick up for Newegg on here because i always used them for my main components. This (+ a cpl other recent transactions) has soured me to them. Gonna have to join Amazon Prime. Do SuperBiiz and Provantage have good reputations for return policies and warranties?


like i said, you have no choice. these two sellers are better than the Egg. Provantage bent over backwards to help with my last purchase. i don't get taxed, either.


----------



## kpoeticg

SuperBiiz usually seems to have fair prices for 290x's also. Maybe i'll just grab my other 2 cards to keep my pc running in the meantime. Gonna have to check out Provantage, never really browsed them before. It's really attractive when companies don't play games with return policies. The whole reason i chose Powercolor to begin with is because they don't void warranties for watercooling. Hopefully they handle this black screen issue professionally. I'll be sure to post feedback in here about how everything goes


----------



## kizwan

Good luck @kpoeticg! I hope they solve the problem. I would be mad too if mine black screened daily.


----------



## Matt-Matt

Quote:


> Originally Posted by *Roboyto*
> 
> I don't have one personally but I have seen quite a few on here. They look quite nice with dyed/colored coolant running through them. Performance wise they're matched pretty evenly with the others for GPU core cooling. However, they seem to have the best VRM cooling out of the lot(while using supplied thermal pads), especially if you use their backplate. Either will work, as they have 2, but the active backplate with heatpipe to cool VRM1 and chokes from the backside is very effective and pretty sweet looking.
> 
> 
> 
> 
> Check out this waterblock roundup to see excellent VRM cooling:
> 
> http://www.xtremerigs.net/2014/01/01/r9-290x-gpu-block-performance-summary/
> 
> Here is one pretty sweet example with green coolant:


Sorry I missed this yesterday!

I did see the green build somewhere else, but yes it looks really nice!

I think I am just going to go with it, it's worth it and the active backplate only costs like an extra $20 so even if it's not that much better it's only $20.


----------



## kpoeticg

Quote:


> Originally Posted by *kizwan*
> 
> Good luck @kpoeticg! I hope they solved the problem. I would be mad too if mine got daily black screen.


Thanx brotha. It's the Newegg game-playing that really has me pissed. They sent me a card that was DOA first; neither bios would boot. Then i RMA'd and got this card back with a note on the receipt saying "You have 6 days to RMA this card". That's why i'm stuck with the manufacturer warranty.

Quote:


> Originally Posted by *Matt-Matt*
> 
> Sorry I missed this yesterday!
> 
> I did see the green build somewhere else, but yes it looks really nice!
> 
> I think I am just going to go with it, it's worth it and the active backplate only costs like an extra $20 so even if it's not that much better it's only $20.


Stren (I think) tested the AC Backplates (Passive/Active)

The Backplate itself showed a HUGE improvement. But Active vs Passive only showed like a 1C improvement. As long as you get the AC Backplate you'll be good. Active or Passive is up to you; the Active doesn't make a lot of difference though


----------



## Roboyto

Quote:


> Originally Posted by *SimonKaz*
> 
> Can someone explain to me why I'm seeing dips to 40 fps in Tomb Raider with not-really-maxed-out (TressFX off, AA turned down to FXAA) settings when there are videos of people playing with similar rigs with average 80 fps? I'm looking at this video in particular
> 
> 
> 
> 
> My rig is in the sig.
> 
> What am I doing wrong?


It always helps to provide as much information as possible about an issue, so you can help us help you









I didn't see that video display the FPS performance, unless I missed it. What exactly do you mean by 'dipping' into the 40s? What FPS do you see most often? You can try running Fraps while you play for an hour or so to get an accurate measurement of FPS over that time frame. It's plausible the frame rates could drop and recover at certain points in the game.
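If you do log a session, a quick script can turn the raw frametimes into min/avg numbers. A minimal sketch, assuming cumulative millisecond timestamps like the ones Fraps' frametimes CSV records (the `sample` list here is made-up data showing one 25ms hitch, not a real log):

```python
def fps_stats(times_ms):
    """Compute FPS statistics from cumulative frame timestamps in milliseconds."""
    # Per-frame render times are the deltas between consecutive timestamps
    deltas = [b - a for a, b in zip(times_ms, times_ms[1:])]
    fps = [1000.0 / d for d in deltas]
    # Average FPS = frames rendered / total elapsed time
    avg = 1000.0 * len(deltas) / (times_ms[-1] - times_ms[0])
    return {"min": min(fps), "avg": avg, "max": max(fps)}

# Hypothetical sample: steady 10 ms frames (100 fps) with one 25 ms spike (40 fps)
sample = [0, 10, 20, 45, 55]
print(fps_stats(sample))  # min 40.0, max 100.0
```

A momentary dip like that barely moves the average, which is why a full log beats eyeballing an FPS counter.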

Hardware is the first half of the equation here and yours is certainly capable of not bottlenecking the 290, but:


what clocks/voltages do you have for CPU, RAM and GPU?
Your monitor is 1440p, I would assume you are running at that resolution? 
Looks like that monitor's refresh rate can be overclocked quite successfully, are you higher than 60Hz refresh rate?

I see you have AE III on your card, are your temperatures in check for core *and* VRM1? Stability and other issues can arise when VRM1 crosses the ~90C mark

Software is the 2nd half of the battle in this hobby, so what about software and drivers?


When was the last time you did a fresh Windows install? I know it has helped me before to get correct performance from hardware, and it seems to be a popular fix in this thread in particular.
What driver version are you running? 14.1 has been horrible for doing just about anything; 13.11 or 13.12 are the way to go presently. 14.2 came out today, that may be better, don't know yet. *EDIT: See page 1781/1782*
What GPU did you have prior to the 290, Nvidia or AMD?
Did you scrub old drivers, even if they were AMD, with Display Driver Uninstaller? Driver conflicts can cause all sorts of issues. http://www.guru3d.com/files_details/display_driver_uninstaller_download.html

Are your motherboard chipset drivers up to date? Is there a new BIOS for your board that could help?
If you're OCing the GPU which utility, and version of that utility, are you running?


----------



## Widde

New driver is up







http://support.amd.com/en-us/download/desktop?os=Windows%208%20-%2064


----------



## kpoeticg

OK so in preparation for RMA, i've flashed back to stock Powercolor BIOS. Also uninstalled GPU Tweak and ran DDU in safe mode. Then installed 14.1 Beta fresh. The silent mode bios BS'd right away. The UBER bios hasn't yet though.

I haven't installed any tweaking apps. I enabled Overdrive just to up the max fan speed, but haven't adjusted anything else yet. I've been running stress tests one after the other (Firestrike Extreme, AVP, Valley, IBT) Haven't had a BS yet.

The Max Fan doesn't seem to be working in Overdrive, because my GPU diode has broken 80C and VRM2 has broken 70C without the fan really speeding up. No black screens yet though. This is strange cuz i know i've tried this combination before. 14.1 was the very first driver i tried, before flashing any different bios or overclocking. I don't remember if i used DDU back then though.

Let's see if i can leave my pc running all day with firefox open. I may end up finding religion if my pc doesn't black screen!!!!!

Edit: And just after i hit Submit, I got a black screen and am now typing this on my laptop. HAHAHAHAHA

Well it's official. I've tried everything!!!! RMA Time!!!


----------



## brazilianloser

Quote:


> Originally Posted by *Widde*
> 
> New driver is up
> 
> 
> 
> 
> 
> 
> 
> http://support.amd.com/en-us/download/desktop?os=Windows%208%20-%2064


This time I will be waiting to see others experiences. The last beta kind of messed up my system for a while.


----------



## kpoeticg

Just submitted my ticket with Powercolor...


----------



## phallacy

Taken from AnandTech forums

http://translate.google.com/translate?hl=pt-BR&sl=de&tl=en&u=http%3A%2F%2Fwww.computerbase.de%2Fnews%2F2014-02%2Fmantle-benchmarks-mit-2-x-r9-290x-in-battlefield-4%2F

14.2 tests from ComputerBase in Germany. Noticeable FPS improvements for single/multi GPU configs with Mantle and an overall performance boost. This is only with BF4, but reassuring at least. I'll weigh in with my 2 cents when I get a chance to try out 14.2


----------



## Roboyto

*How much performance do you get from GPU RAM overclocking on the 290(X) Cards?*

*AMD chose the 512-bit bus and only 5GHz RAM modules for a reason, they didn't leave much on the table from my findings. Pushing the 5.0GHz RAM to 6.5+GHz gives you a boost, but nothing like the core.*

I have a full spreadsheet of nearly all the benches I have run. If you'd like to see that you can check out my build log:

http://www.overclock.net/t/1456279/honey-i-shrunk-the-ultra-tower-beast-my-journey-to-creating-a-more-compact-pc-with-an-r9-290/30

https://docs.google.com/spreadsheet/pub?key=0Ag6Cx4-NkihQdG5CVENEUDRyOFVqZnMwRUg2SEVaSmc&output=html&widget=true


----------



## Roboyto

Quote:


> Originally Posted by *kpoeticg*
> 
> OK so in preparation for RMA, i've flashed back to stock Powercolor BIOS. Also uninstalled GPU Tweak and ran DDU in safe mode. *Then installed 14.1 Beta fresh.* The silent mode bios BS'd right away. The UBER bios hasn't yet though.
> 
> I haven't installed any tweaking apps. I enabled Overdrive just to up the max fan speed, but haven't adjusted anything else yet. I've been running stress tests one after the other (Firestrike Extreme, AVP, Valley, IBT) Haven't had a BS yet.
> 
> The Max Fan doesn't seem to be working in Overdrive because my GPU Diode has broke 80+C and VRM2 has broken 70+C without the fan really speeding up. No black screens yet though. This is strange cuz i know i've tried this combination before. 14.1 was the very first driver i tried before flashing any different bios or overclocking. I don't remember if i used DDU back then though.
> 
> Lets see if i can leave my pc running all day with firefox open. I may end up finding religion if my pc doesn't black screen!!!!!
> 
> Edit: And just after i hit Submit, I got a black screen and am now typing this on my laptop. HAHAHAHAHA
> 
> Well it's official. I've tried everything!!!! RMA Time!!!


I would try with 13.12 drivers, 14.1 have been horrible


----------



## rdr09

Quote:


> Originally Posted by *kpoeticg*
> 
> Just submitted my ticket with Powercolor...


have a spare gpu? if not, get a cheap HD7770. it plays all the games my 290 plays, including Metro 2033 in DX11.

wish you well and update us.


----------



## kpoeticg

Quote:


> Originally Posted by *Roboyto*
> 
> I would try with 13.12 drivers, 14.1 have been horrible


I've been using the 13.12 for pretty much the whole time i was trying to fix the black screen issue. That 14.1 install was just a last ditch attempt to not have to rma









Quote:


> Originally Posted by *rdr09*
> 
> have a spare gpu? if not, get a cheap HD7770. it plays all the games my 290 plays, including Metro 2033 in DX11.
> 
> wish you well and update us.


Lol, when my first 290x was DOA i went and bought an R5 230 (i think it was that) for $60 from Best Buy, just to make sure it wasn't my mobo/cpu/socket. When my Elpida card came back from Newegg i returned the R5 for a Samsung EVO









I'm planning a triple 290x build, so i'm gonna need to order more eventually. Gonna try to find a decent price on an XFX or Powercolor while this card's going through RMA process.

Thanx for the help (and kicks in the a## to get this RMA started)

I'll definitely keep you guys posted


----------



## Roboyto

Quote:


> Originally Posted by *kpoeticg*
> 
> I'll definitely keep you guys posted


Look on the bright-ish side, at least you get another shot at a good overclocker...and maybe the RMA will get you a good one in return as well


----------



## kpoeticg

Yeah i'm definitely keeping my fingers crossed for some Hynix cards. Still can't believe nobody's released a Hynix-ONLY version!!!

I'm just worried about the random-ish nature of black screens & hoping Powercolor honors the replacement. I'm gonna be HEATED if they keep my card tied up for a month and then send it back to me because there's "No Problems Found"

Edit: Anybody know MSI's policy on tampering with the cooler or adding waterblocks? Do they have tamper seals on the Twin Frozr 290x? Also, it's a reference PCB, correct?


----------



## Imprezzion

Guys... How can I fix my overclocks constantly being reset after a reboot?

Using MSI AB B18 or the latest TriXX - doesn't matter. All the 14.x drivers have it.


----------



## SimonKaz

Thanks for the reply, below is my answer.

Quote:


> Originally Posted by *Roboyto*
> 
> It always helps to provide as much information as possible about an issue, so you can help us help you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I didn't see that video display the FPS performance, unless I missed it. What exactly do you mean by 'dipping' into the 40s? What FPS do you see most often? You can try running Fraps while you play for an hour or so to get an accurate measurement of FPS over that time frame. It's plausible the frame rates could drop and recover at certain points in the game.


The FPS is listed in the description. This is just an example, I've heard other users mention similar results.

Quote:


> Originally Posted by *Roboyto*
> 
> Hardware is the first half of the equation here and yours is certainly capable of not bottlenecking the 290, but:
> 
> what clocks/voltages do you have for CPU, RAM and GPU?
> Your monitor is 1440p, I would assume you are running at that resolution?
> Looks like that monitor's refresh rate can be overclocked quite successfully, are you higher than 60Hz refresh rate?
> 
> I see you have AE III on your card, are your temperatures in check for core *and* VRM1? Stability and other issues can arise when VRM1 crosses the ~90C mark


CPU - OC'd to 4.4GHz, I'll take a look at the details once I'm back from work.
RAM - left as is
GPU - left as is with +50% power

- 1440p, stock refresh rate 60hz

- temps are:
60 for GPU
77 for VRM1
55 for VRM2

But my results were the same with stock cooler at 60% speed (all temps under 80 with that)

Quote:


> Originally Posted by *Roboyto*
> 
> Software is the 2nd half of the battle in this hobby, so what about software and drivers?
> 
> When was the last time you did a fresh Windows install? I know it has helped me before to get correct performance from hardware, and it seems to be a popular fix in this thread in particular.
> What driver version are you running? 14.1 has been horrible for doing just about anything; 13.11 or 13.12 are the way to go presently. 14.2 is coming out today I believe, that may be better, don't know yet.
> What GPU did you have prior to the 290, Nvidia or AMD?
> Did you scrub old drivers, even if they were AMD, with Display Driver Uninstaller? Driver conflicts can cause all sorts of issues. http://www.guru3d.com/files_details/display_driver_uninstaller_download.html
> 
> Are your motherboard chipset drivers up to date? Is there a new BIOS for your board that could help?
> If you're OCing the GPU which utility, and version of that utility, are you running?


- I did a fresh windows install when I got the SSD, which was at the beginning of December.
- I'm using 14.1, but the test was also run with 13.12. I've only installed 14.1 recently.
- I didn't have any GPU prior to that (unless you count the integrated Intel HD one which I used before the R9 290)
- I'll have a look at the motherboard chipset drivers, haven't updated those. Might be the culprit here.
- I'm not OCing the GPU, I only raised the power limit yesterday to 50% to see if it would help, it didn't.


----------



## taem

Quote:


> Originally Posted by *SimonKaz*
> 
> Can someone explain to me why I'm seeing dips to 40 fps in Tomb Raider with not-really-maxed-out (TressFX off, AA turned down to FXAA) settings when there are videos of people playing with similar rigs with average 80 fps? I'm looking at this video in particular
> 
> 
> 
> 
> My rig is in the sig.
> 
> What am I doing wrong?


Dipping to the 40s in gameplay or in the bench? And who's averaging 80?! With what settings? Here's how my 290 looks:




Tressfx is on so that's pretty good imho. You should definitely not be getting 40 min in the bench if Tressfx is off.


----------



## SimonKaz

Quote:


> Originally Posted by *taem*
> 
> Dipping to 40s in gameplay or in the bench? And who's averaging 80?! With what settings? Here's how my 290 looks:
> 
> 
> 
> 
> Tressfx is on so that's pretty good imho. You should definitely not be getting 40 min in the bench if Tressfx is off.


My settings are different from yours, I had all the settings maxed out and tressFX off. I'll be doing more testing today in a couple of hours.

EDIT: Oh yeah, i'm running the benchmark, haven't played the game yet.


----------



## Tobiman

Powercolor PCS+ 290X review by Guru3D.

http://www.guru3d.com/articles_pages/powercolor_radeon_r9_290x_pcs_review,1.html


----------



## Katamarino

From the review:
Quote:


> while at the same time being less noisy and more quiet


The PCS+ is both less noisy, and more quiet. Sign me up!


----------



## taem

Quote:


> Originally Posted by *SimonKaz*
> 
> My settings are different from yours, I had all the settings maxed out and tressFX off. I'll be doing more testing today in a couple of hours.
> 
> EDIT: Oh yeah, i'm running the benchmark, haven't played the game yet.


*Everything* maxed out? Including triple buffer vsync? Because if I turn on triple buffer vsync my min fps drops to 29.9 no matter what the other settings are, because the bench starts at 29.9 with that on before increasing. With double buffer vsync, everything else maxed, tressfx off, I get:


Oh and just noticed I get the exact same bench this way as earlier. What? Lol. Is there something wrong with this bench?


----------



## syniad

Just been testing the new 14.2 driver and I have to say it's disappointing at best.
The cores on my 290s are still downclocking for no reason, as they did with 14.1, even 50MHz below what was stable in 13.12.

Considering it's been almost a month since 14.1 was released, it really doesn't seem like AMD has been doing much, going from the release notes.
Back to 13.12 it is.


----------



## Roboyto

Quote:


> Originally Posted by *kpoeticg*
> 
> Yeah i'm definitely keeping my fingers crossed for some Hynix cards. Still can't believe nobody's released a Hynix-ONLY version!!!
> 
> I'm just worried about the random-ish nature of black screens & hoping Powercolor honors the replacement. I'm gonna be HEATED if they keep my card tied up for a month and then send it back to me because there's "No Problems Found"
> 
> Edit: Anybody know MSI's policy on tampering with the cooler or adding waterblocks? Do they have tamper seals on the Twin Frozr 290x? Also, it's a reference PCB, correct?

The Tri-X 290 & 290X are reference













A trick I have used for a few years now when removing a factory heatsink/fan: run a bench/stress test on the GPU for a solid 20 minutes and let it get up to a decent temperature to soften up the factory TIM. Once you have done this, give it a minute, maybe 2, for the core temp to calm down from its full load temp. Shut down the computer, pull the GPU out and remove the factory HSF. The cooler will lift off and most, if not all, of the factory TIM should still be in place on the HSF. Let me see if I can find a pic.



*Nearly all TIM still on factory HSF. If you ever need to send it back, just plop the stock cooler back on and you should be alright.*



MSI Warranty: Like most others, you must reinstall the factory cooler and/or re-flash the stock BIOS and all will be well, as long as there is no physical damage to the card. I just did an RMA with ASUS for an HD7870 that was previously under water and it went well. *First time I ever called in an RMA and it made things go A LOT faster than an online submission.*

"*Limited Warranty*


Removal and/or damaging of the SN/PN sticker(s) on all MSI products will void all warranties associated with that product.
When sending in your defective product for service, a copy of the RMA e-mail MUST be included in the package or the RMA may be declined and shipped back (at the customer's expense).
Products sent in for RMA will be repaired or replaced with a product of equal or greater performance based on availability.
Repaired, replaced or exchanged Product will be warranted for the remainder of the original warranty.
Please be sure *ONLY* send in the defective product. Keep all accessories and driver discs with you unless specified.
The MSI product *MUST* be free of any physical damage due to improper installation or modification of *ANY* kind (this includes installing aftermarket parts) or the warranty *WILL* be *VOID*.
The product MUST be returned to MSI in the original factory configuration and condition. All aftermarket modifications must be reversed prior to sending in the product for repair or replacement.
MSI reserves the right to inspect and verify the defects of any product(s) returned and further reserves the right to claim for service charge from the customer for any product returned incomplete or modified if product requires repair or replacement or when the customer is not entitled to any coverage under this limited lifetime warranty.
Products returned with customer-induced damage (including, but not limited to physical damage) will be charged for out of warranty repair fee.
MSI reserves the right to change this policy without advance notice."

I don't know this for certain, but I originally had (2) XFX Black Edition cards and both had Hynix; they were only 5 serial numbers apart, however. It's worth letting you know the card I sold to a co-worker had poor RAM OCability even with the Hynix; I was only able to get +100MHz. It could be possible that XFX uses Hynix for their BE cards...? Worth looking into to see what others have gotten with BE cards.

I have seen a few people with decent OC with Elpida memory, so it wouldn't be the end of the world especially considering the GPU RAM clocks don't play as much of a role in the performance of these beasts with the 512-bit bus.

This is what I've come to find:



Spoiler: Spreadsheet GPU RAM



Quote:


> Originally Posted by *Roboyto*
> 
> *How much performance do you get from GPU RAM overclocking on the 290(X) Cards?*
> 
> *AMD chose the 512-bit bus and only 5GHz RAM modules for a reason, they didn't leave much on the table from my findings. Pushing the 5.0GHz RAM to 6.5+GHz gives you a boost, but nothing like the core.*
> 
> I have a full spreadsheet of nearly all the benches I have run. If you'd like to see that you can check out my build log:
> 
> http://www.overclock.net/t/1456279/honey-i-shrunk-the-ultra-tower-beast-my-journey-to-creating-a-more-compact-pc-with-an-r9-290/30
> 
> https://docs.google.com/spreadsheet/pub?key=0Ag6Cx4-NkihQdG5CVENEUDRyOFVqZnMwRUg2SEVaSmc&output=html&widget=true


----------



## Roboyto

Quote:


> Originally Posted by *SimonKaz*
> 
> Thanks for the reply, below is my answer.
> The FPS is listed in the description. This is just an example, I've heard other users mention similar results.
> CPU - OC'd to 4.4GHz, I'll take a look at the details once I'm back from work.
> RAM - left as is
> GPU - left as is with +50% power
> 
> - 1440p, stock refresh rate 60hz
> 
> - temps are:
> 60 for GPU
> 77 for VRM1
> 55 for VRM2
> 
> But my results were the same with stock cooler at 60% speed (all temps under 80 with that)
> - I did a fresh windows install when I got the SSD, which was at the beginning of December.
> - I'm using 14.1, but the test was also run with 13.12. I've only installed 14.1 recently.
> - I didn't have any GPU prior to that (unless you count the integrated Intel HD one which I used before the R9 290)
> - I'll have a look at the motherboard chipset drivers, haven't updated those. Might be the culprit here.
> - I'm not OCing the GPU, I only raised the power limit yesterday to 50% to see if it would help, it didn't.


Alrighty, this is what we need









CPU or RAM won't be hurting your performance


2133 isn't the fastest, but it gets a big boost in performance compared to 1600. 
Read here if you'd like about RAM speeds: http://www.xbitlabs.com/articles/memory/display/ivy-bridge-ddr3_2.html

The power adjustment only gives the card more amperage headroom. If you're having issues with frame rates, it'd be better to give it extra voltage along with the power adjustment.

Scrub away those 14.1 drivers immediately and use 13.12! http://www.guru3d.com/files_details/display_driver_uninstaller_download.html

The only thing the factory blower does halfway decently, especially at high fan speed, is cool VRM1. VRM1 is directly in front of the fan, so this, as well as the poor GPU core cooling, makes a lot of sense.

Your temperatures seem high for stock clocks.


TomsHardware review of the Extreme III put the GPU Core @50C and VRM1 @65C under stock clocks/voltages. 
VRMs were in the low 70s with the core at 1150. 
http://www.tomshardware.com/reviews/r9-290-accelero-xtreme-290,3671-4.html
If you have your Accelero using the 7V step down resistor I would suggest removing it; those coolers aren't loud even at 100% fan. 
Make sure all of the sinks are properly attached to the VRMs. 

Could we see some pics of your rig? Getting fresh air to these cards while air cooled is a necessity.


I know the R4 is designed around silence, but it could be starving the card for air. 
Having the front door closed could hinder air flow through the front of the case.
If you have all of the HDD cages installed this will also hinder air flow through the front of the case.
Utilize those ModuVents if you haven't already.
If you have the windowed version this could be hurting you by not having a side inlet fan.
It looks like there is a bottom 120/140 fan spot, others have seen a few degree improvement by installing a fan there.


----------



## Paul17041993

Quote:


> Originally Posted by *Roboyto*
> 
> How much performance do you get from GPU RAM overclocking on the 290(X) Cards?


In a nutshell: not much at all, unless you have Eyefinity or 4K.
But it does drop overall buffer latency and increase cycles, so in bench scores and GPGPU tasks it can give a significant gain.


----------



## pkrexer

I'm going to be remounting my water block so I can replace the thermal pads. Question: do you think it's worth the risk to use CLU on the core, or should I just stick with my PK3 TIM?


----------



## Roboyto

Quote:


> Originally Posted by *Paul17041993*
> 
> in a nutshell; not much at all unless you have eyefinity or 4K.
> but it does drop overall buffer latency and increases cycles so in bench scores and GPGPU tasks it can give significant gain.


Did you see the spreadsheet with that post?

At 1080P going from 5GHz to 6.5+GHz was on average, for the 5 benches I listed, 5.5%

I will re-examine this once I get my computer build fully tied up and am on 5760x1080 resolution.
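For context on why the gain is modest: going from 5.0 to 6.5GHz is a 30% raw bandwidth bump on paper, yet only ~5.5% showed up in those benches, so the card is rarely bandwidth-bound at 1080p. A quick back-of-the-envelope check (plain arithmetic, nothing card-specific assumed beyond the bus width and effective rates):

```python
def bandwidth_gbps(bus_bits, effective_ghz):
    # Peak theoretical bandwidth = bus width in bytes x effective transfer rate
    return bus_bits / 8 * effective_ghz

stock = bandwidth_gbps(512, 5.0)      # 320 GB/s at stock
oc = bandwidth_gbps(512, 6.5)         # 416 GB/s overclocked
uplift = (oc - stock) / stock * 100   # 30% more raw bandwidth
print(f"{stock:.0f} GB/s -> {oc:.0f} GB/s ({uplift:.0f}% uplift)")
```

When a 30% bandwidth increase only buys ~5.5% FPS, the core clock is clearly where the money is.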


----------



## Roboyto

Quote:


> Originally Posted by *pkrexer*
> 
> I'm going to be remounting my water block so I can replace the thermal pads. Question: do you think it's worth the risk to use CLU on the core, or should I just stick with my PK3 TIM?


This will be a matter of opinion, so here is my 2 cents:
CLU does provide better temperatures than even the best performing 'traditional' TIMs in paste/grease form.

All of the waterblocks available for these cards do a great job of keeping the core cool, even under maximum overclocks with 'traditional' paste/grease.

Are the couple of degrees you will drop worth spreading liquid metal on the block, with the potential problem of getting it where it shouldn't be or having difficulty cleaning it off?

Not to me.


----------



## taem

Quote:


> Originally Posted by *Roboyto*
> 
> Did you see the spreadsheet with that post?
> 
> At 1080P going from 5GHz to 6.5+GHz was on average, for the 5 benches I listed, 5.5%
> 
> I will re-examine this once I get my computer build fully tied up and am on 5760x1080 resolution.


Looking forward to that. I was really surprised by that spreadsheet; I'd always read RAM speed makes no discernible difference for GPU or CPU.


----------



## b0x3d

I bought 2x GB WF 290x from Scan. An upgrade from 2x 6970s. When I opened one of them I heard a rattling noise inside and when I turned the card over a screw fell out! I can't actually see anything different from the two cards. Apart from when running on idle, the fans wobble ever so slightly on one of them and I thought maybe the screw came from the fan housing - but I think this is normal?

The temps / fan speeds / noise are almost identical. It's not obvious where the screw came from. It's quite big, about 5mm long with a philips dome shaped (as opposed to flat) head. Any ideas?

Could it just be a random screw that found its way in there during manufacturing or packaging? Or maybe I got a card someone returned and they accidentally left a screw in there?

Bit of a mystery!


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> Looking forward to that. I was really surprised by that spreadsheet; I'd always read RAM speed makes no discernible difference for GPU or CPU.


I knew there wouldn't be much difference, but I was really curious as to exactly how much. I had already run all the benches many, many, times...so what was one more with the RAM at default speed? lol

It piqued my interest because I remember people discussing AMD's choice of only 5GHz RAM chips, especially when some of the 770s have Samsung 7GHz chips in them!

With the 512-bit bus, the need for premium, high speed, RAM modules is partly negated. This likely also helped keep production costs down...not that it really helped the consumer any with this stupid cryptocurrency inflation!

I'm looking forward to seeing if the big, bad, 290(X)s come equipped with higher base speed RAM. Gigabyte SOC, ASUS TOP/Matrix, Lightning, etc.
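The bandwidth math backs this up. A quick sketch (peak theoretical numbers only; the 770 figures are for the 7GHz Samsung variants mentioned above):

```python
# Peak memory bandwidth = (bus width in bits / 8 bytes) * effective data rate
def bandwidth_gbps(bus_bits: int, data_rate_gtps: float) -> float:
    """Peak theoretical bandwidth in GB/s."""
    return bus_bits / 8 * data_rate_gtps

r9_290x = bandwidth_gbps(512, 5.0)  # 5GHz-effective chips on a 512-bit bus
gtx_770 = bandwidth_gbps(256, 7.0)  # 7GHz chips on a 256-bit bus

print(r9_290x)  # 320.0 GB/s
print(gtx_770)  # 224.0 GB/s
```

So even with the slower 5GHz chips, the wide bus leaves the 290X with roughly 40% more raw bandwidth, which is why RAM overclocks move the needle less here.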


----------



## Roboyto

Quote:


> Originally Posted by *b0x3d*
> 
> I bought 2x GB WF 290x from Scan. An upgrade from 2x 6970s. When I opened one of them I heard a rattling noise inside and when I turned the card over a screw fell out! I can't actually see anything different from the two cards. Apart from when running on idle, the fans wobble ever so slightly on one of them and I thought maybe the screw came from the fan housing - but I think this is normal?
> 
> The temps / fan speeds / noise are almost identical. It's not obvious where the screw came from. It's quite big, about 5mm long with a philips dome shaped (as opposed to flat) head. Any ideas?
> 
> Could it just be a random screw that found it's way in there during manufacturing or packaging? Or maybe I got a card someone returned and they accidentally left a screw in there?
> 
> Bit of a mystery!


Pic please! Someone with a gigabyte card who has disassembled them may be able to identify the screw for you


----------



## taem

Quote:


> Originally Posted by *Roboyto*
> 
> I knew there wouldn't be much difference, but I was really curious as to exactly how much. I had already run all the benches many, many, times...so what was one more with the RAM at default speed? lol


Imho a 5.x% boost in performance is really good and not trivial at all, given that the cost of overclocking RAM is minimal.


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> Imho a 5.x% boost in performance is really good and not trivial at all, given the cost of overclocking ram, which is minimal.


It's definitely a fair bump, *IF* your card will clock the RAM as high as mine does.

I've started to dial in my 'everyday' clocks/voltages, and it's looking like 1200/1500 with a +100mV offset is where I'll be. Still a 1GHz (effective) boost to the RAM.

Compared to my maximum overclocks, using up to +200mV, I'm seeing ~96% of the total Core/RAM OC gains. This makes me very happy that I get nearly all of the performance without abusing the card too much with maxed out voltages.


----------



## SimonKaz

Quote:


> Originally Posted by *taem*
> 
> Dipping to 40s in gameplay or in the bench? And who's averaging 80?! With what settings? Here's how my 290 looks:
> 
> 
> 
> 
> Tressfx is on so that's pretty good imho. You should definitely not be getting 40 min in the bench if Tressfx is off.


Reran the test with those settings; I'm getting

39.6 minimum
62.1 max
56.7 average

What gives :/ I'll install 14.2 tomorrow and see if it helps.


----------



## Roboyto

Quote:


> Originally Posted by *SimonKaz*
> 
> Rerun the test with those settings, im getting
> 
> 39.6 minimum
> 62.1 max
> 56.7 average
> 
> What gives :/ I'll install 14.2 tomorrow and see if it helps.


Is the TR bench downloadable, or must I install the game? I think I have a game coupon around here somewhere


----------



## roudabout6

What is the best brand of 290X? I'm buying two of them and want the best. I will not be putting them under water, though they will be 2 slots away from each other, if that makes a difference. Also, can you buy backplates for them? I have one on my 280X and love it.


----------



## Paul17041993

Quote:


> Originally Posted by *taem*
> 
> I'd always read ram speed makes no discernible difference for gpu or cpu.


It depends on platform, channels, and load. More memory throughput on the CPU side allows faster (CPU) physics and faster feeding of buffers to the GPU. For the same CPU, gains from memory clock speed on dual-channel DDR3 will be higher clock-for-clock than on quad-channel, since as you raise clocks on quad-channel you become CPU-bound rather than memory-bound. The same applies to these 512-bit (quad 128-bit) memory GPUs.
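As a rough sketch of the channel math (DDR3-1600 used purely as an example; each DDR3 channel is 64 bits, i.e. 8 bytes, per transfer):

```python
# System RAM bandwidth scales with channel count:
# bandwidth (GB/s) = channels * 8 bytes/transfer * transfer rate (MT/s) / 1000
def ddr3_bandwidth(channels: int, mtps: int) -> float:
    """Peak theoretical DDR3 bandwidth in GB/s."""
    return channels * 8 * mtps / 1000

dual = ddr3_bandwidth(2, 1600)  # 25.6 GB/s
quad = ddr3_bandwidth(4, 1600)  # 51.2 GB/s

# With quad-channel already supplying double the bandwidth at the same
# clock, raising the RAM clock helps less: the workload tends to become
# CPU-bound before it becomes memory-bound.
print(dual, quad)
```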


----------



## SimonKaz

Quote:


> Originally Posted by *Roboyto*
> 
> Is the TR bench downloadable, or must I install the game? I think I have a game coupon around here somwhere


I'm pretty sure it's part of the game...


----------



## taem

Quote:


> Originally Posted by *SimonKaz*
> 
> Rerun the test with those settings, im getting
> 
> 39.6 minimum
> 62.1 max
> 56.7 average
> 
> What gives :/ I'll install 14.2 tomorrow and see if it helps.


My 4670k is at 4.7 could that be it?


----------



## SimonKaz

Quote:


> Originally Posted by *taem*
> 
> My 4670k is at 4.7 could that be it?


Apparently the CPU has a small impact. I read that somewhere, in some benchmark


----------



## Widde

Hmm, this was odd: I was playing Thief and it doesn't utilize my 2nd card :S http://piclair.com/6tp9e and my 2nd GPU sat at a flat 55C while I was playing :<

I'm on 14.2 but haven't had any problems or crashes so far.


----------



## vieuxchnock

Quote:


> Originally Posted by *Roboyto*
> 
> I don't know this for certain, but I originally had (2) XFX Black edition cards and both had Hynix; they were only 5 serial numbers apart however. It's worth letting you know the card I sold to a co-worker had poor RAM OCability even with the Hynix; was only able to get +100 MHz. It could be possible that XFX uses Hynix for their BE cards...? Worth looking into to see what others have gotten with BE cards?
> 
> I have seen a few people with decent OC with Elpida memory, so it wouldn't be the end of the world especially considering the GPU RAM clocks don't play as much of a role in the performance of these beasts with the 512-bit bus.
> 
> This is what I've come to find:


I have an XFX 290 BE with Hynix ram and the OC of the ram is 1500:


----------



## Matt-Matt

Quote:


> Originally Posted by *kpoeticg*
> 
> Thanx brotha. It's the Newegg Game-Playing that really has me pissed. They sent me a card that was DOA first. Neither Bios would boot. Then i RMA'd and got this card back with a note on the receipt saying "You have 6 days to rma this card". That's why i'm stuck with manufacturer warranty.
> 
> Stren (I think) tested the AC Backplates (Passive/Active)
> The Backplate itself showed a HUGE improvement. But the Active vs Passive only showed like 1C improvement. As long as you get the AC Backplate you'll be good. Active or Passive is up to you, the Active doesn't make alot of difference though


Yes, I read this last night. I might go with active because it's technically better and looks better. Stren was running a bit overclocked too; I plan to go to the extremes


----------



## nightfox

14.2 beta gives a huge 3DMark11 score compared to 13.12. Same settings, same hardware, same OS:

+ almost 5000 graphics score
+ 400 PhysX score
+ almost 2000 overall score

http://www.3dmark.com/compare/3dm11/8032261/3dm11/8001317

Had a quick run of BF4 single-player performance.

With DirectX 11 it is smoother than Mantle.
I observed some fps drops / spiking with Mantle; overall DirectX 11 is still smoother, although there's not much difference in fps in BF4.

I did observe, though, that with these drivers and with Mantle, BF4 uses 5-10% more CPU compared to the DirectX 11 API.

One more thing: the RPM sensor is still broken







or not being detected at all in GPU-Z and AB


----------



## Roboyto

Quote:


> Originally Posted by *nightfox*
> 
> 14.2 beta gives a huge 3dmark11 score compared to 13.12. Same settings, same hardware same OS
> 
> + almost 5000 Graphic score
> + 400 physix score
> almost 2000 overall score
> 
> http://www.3dmark.com/compare/3dm11/8032261/3dm11/8001317
> 
> Had a quick run BF4 single performance.
> 
> With direct 11, it is smoother than mantle.
> observed some fps drop / spiking with mantle, overall directx 11 still smoother although not so much difference in fps with BF4
> 
> I do observed though that with these driver and with mantle, BF4 uses more the CPU compared to direct11 API +5-10%.
> 
> one more thing, RPM sensor still broken
> 
> 
> 
> 
> 
> 
> 
> or not detecting at all GPUZ and AB


Holy smokes! What a jump in 3dMark11! I think I'm going to have to update and possibly move myself up that Top 30 3DMark ladder, shooting for top 10 if you got a 15% boost in graphics and a 6% boost in physics score.

More CPU power...hmm that's going in the opposite direction of what they wanted


----------



## taem

Quote:


> Originally Posted by *Roboyto*
> 
> I've started to locate my 'everyday' use clocks/voltages and it's looking like 1200/1500 +100mV offset is going to be where I'm at. Still a 1GHz boost to RAM.
> 
> Compared to my maximum overclocks, using up to +200mV, I'm seeing ~96% of the total Core/RAM OC gains. This makes me very happy that I get nearly all of the performance without abusing the card too much with maxed out voltages.


I'm definitely not pushing my card to max due to the voltage issue. This card can actually handle the heat (the Powercolor PCS+ on the 290/X might be the best cooler *ever*) but the voltage sort of freaks me out. The max I am using is +112 vddc offset. Think that's too high? The clocks are actually stable on everything at +100 but I get this weird glitch where when I boot up, the Windows 8 Start Screen jumps around a bit before stabilizing. At +112 it doesn't do that. Which is odd since at that point Trixx hasn't even loaded -- I have it on a 30 second delay in Task Scheduler because it always loads before CCC and CCC resets the clocks.

I've actually been meaning to start a thread just on this issue of what voltages folks are comfortable with. At +112 vddc offset the max VID is 1.34 or so, but most of the time it runs in the ~1.25 range. But the performance gain is not that huge over my second highest profile which runs +75 vddc. And I could live with my +31 vddc clocks too. Wonder if I should. But then the water cooling guys run insane voltages and while water cooling obviously removes the heat issue, the voltage effect on the card remains the same right?
Quote:


> Originally Posted by *SimonKaz*
> 
> apparently the CPU has a small impact. Read that somewhere, some benchmark


Come to think of it, I don't know if I ran that bench at stock clocks. I meant to, but those numbers seem too high for stock. That doesn't account for the difference in minimum fps though, I don't think; higher clocks don't get you that much improvement, especially in minimum fps.


----------



## nightfox

Quote:


> Originally Posted by *Roboyto*
> 
> Holy smokes! What a jump in 3dMark11! I think I'm going to have to update and possibly move myself up that Top 30 3DMark ladder, shooting for top 10 if you got a 15% boost in graphics and a 6% boost in physics score.
> 
> More CPU power...hmm that's going in the opposite direction of what they wanted


Yeah, tell me about it. I was also really surprised, and I restarted my PC, reran it, and the result was the same.


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> I'm definitely not pushing my card to max due to the voltage issue. This card can actually handle the heat (the Powercolor PCS+ on the 290/X might be the best cooler *ever*) but the voltage sort of freaks me out. The max I am using is +112 vddc offset. *Think that's too high?* The clocks are actually stable on everything at +100 but I get this weird glitch where when I boot up, the Windows 8 Start Screen jumps around a bit before stabilizing. At +112 it doesn't do that. *Which is odd since at that point Trixx hasn't even loaded -- I have it on a 30 second delay in Task Scheduler because it always loads before CCC and CCC resets the clocks.*
> 
> I've actually been meaning to start a thread just on this issue of what voltages folks are comfortable with. At +112 vddc offset the max VID is 1.34 or so, but most of the time it runs in the ~1.25 range. *But the performance gain is not that huge over my second highest profile which runs +75 vddc.* And I could live with my +31 vddc clocks too. Wonder if I should. But then the water cooling guys run insane voltages and while water cooling obviously removes the heat issue, *the voltage effect on the card remains the same right?*


The voltage you can push depends on your cooling...if the temperatures are in check then you certainly can do it.

I don't have Trixx applying overclocks when Windows boots. Trixx will open and be in the task bar, but not changing any settings. Disable overdrive and I don't think it will reset your clocks.

Benchmarking and pushing your hardware to the brink is fun, but it's always best IMHO to have settings to use on a regular basis. I have tried all the benchmarks I have been running with +200mV and the gains are minimal...they are measurable but minimal.

For example you can use my gaming settings and maximum benchmark settings: 1200/1500 +100mV ~compared to~ 1295/1700 +200mV

*When running at 1200/1500, I'm seeing ~96% of the performance* of pushing my GPU to its absolute maximum. Is it worth adding another +100mV for 95MHz of core clock? Not in my opinion; it's just more wear and tear on the GPU as a whole.

The effect of extra voltage is lessened when you can keep temperatures under control, which is why lowering the temperature will usually yield higher clocks. Same as with a CPU: will they degrade over the long haul with additional voltage? Probably.
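Putting the clock deficits next to that 96% number (a quick sketch; the ratios are just raw clock arithmetic, the 96% comes from measured scores):

```python
# Everyday vs. maximum overclocks from the post above (MHz)
daily_core, daily_mem = 1200, 1500
max_core, max_mem = 1295, 1700

core_ratio = daily_core / max_core  # ~0.927 -> running ~7% lower core clock
mem_ratio = daily_mem / max_mem     # ~0.882 -> running ~12% lower memory clock

# Despite giving up 7-12% of clock speed (and a full +100mV), the measured
# benchmark scores only dropped ~4% -- diminishing returns at the top end.
print(f"core: {core_ratio:.3f}, mem: {mem_ratio:.3f}")
```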


----------



## Roboyto

Quote:


> Originally Posted by *nightfox*
> 
> yeah tell me about it. I was also really surprised. and I restart my PC re run and result was same.


Just scrubbed and installed 14.2

Time to put my benchmarking hat on


----------



## IBIubbleTea

What do you guys think of using NT-H1 over MX-4 and EK's stock thermal paste?


----------



## Roboyto

Quote:


> Originally Posted by *IBIubbleTea*
> 
> What do you guys think of using nt-h1 over MX-4 and EK stock thermal paste?


Likely to have similar results. Core temp isn't the trouble maker for these cards... It's those pesky VRMs you should be more concerned with


----------



## Roboyto

Quote:


> Originally Posted by *nightfox*
> 
> yeah tell me about it. I was also really surprised. and I restart my PC re run and result was same.


They must have worked on Xfire a lot, because not only did I not see gains, my scores are down 3.83% with Tessellation ON, and 2.98% with Tessellation OFF.

Hrumph!

Have to see if it helps with Unigine or FFXIV...I'll be back


----------



## Roboyto

Quote:


> Originally Posted by *nightfox*
> 
> yeah tell me about it. I was also really surprised. and I restart my PC re run and result was same.


Rebooted like you did to double-check my scores, and now my GPU clock isn't taking during the benchmark. This time my score is down 9.98%








14.2 Beta Single GPU









Back to 13.12 me thinks

Set to 1225 in Trixx, only registered 1129 during the whole bench


----------



## nightfox

Quote:


> Originally Posted by *Roboyto*
> 
> Rebooted like you did to double check my scores, and now my GPU clock isn't taking during the benchmark. This time my score is down is 9.98%
> 
> 
> 
> 
> 
> 
> 
> 
> 14.2 Beta Single GPU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Back to 13.12 me thinks
> 
> Set to 1225 in Trixx, only registered 1129 during the whole bench


awww. that sucks.

I'll try benching with a single GPU later. I'll run a benchmark with my 3770K on 13.12 and 14.2.

The gain I saw earlier could be because AMD has concentrated on multi-GPU.


----------



## Roboyto

Quote:


> Originally Posted by *nightfox*
> 
> awww. that sucks.
> 
> Ill try later benching with a single GPU. I run a benchmark with my 3770K, 13.12 and 14.2.
> 
> The benchmark I had earlier could be because AMD has concentrated on multi GPU.


Rebooted again and checked Trixx: it had been reset to defaults. ULPS wasn't disabled and it wasn't forcing constant voltage.

As soon as I checked the two boxes my screen started flickering. Trixx prompted for reboot to enable these settings. I did. Rebooted and as soon as I loaded an overclock profile screen flickering like crazy. Set back to default, still flickering.

Scrubbing, back to 13.12 it is.

Reverted to 13.12 and all is back to normal. Score even came up a smidge for my gaming OC.


----------



## MrWhiteRX7

Just did a back to back to back test of BF4 with 13.12, 14.2 d3d11, and 14.2 mantle

14.2 d3d11 is actually the best BY FAR lol

Mantle is awful still. My fps is slightly higher but my lows are way lower. It's a stuttery mess, whereas the d3d11 version with 14.2 is incredibly smooth and my fps is very high and way more consistent. I went to a single spot in the center of Firestorm and hung out for a sec, on the catwalk right in the middle facing towards all the smoke and fire. With Mantle on I get around 115fps but with d3d11 I get 168fps lol.

I'll peg the 200fps cap in a lot of places in that same map with mantle but it will go under 80fps and always feels like 30fps. The fog is definitely gone the game looks the same in mantle and d3d11. My highs aren't consistently as high in d3d11 but the lows never go under 130fps. Lamesauce Mantle!!!

1440p, ultra with msaa and all turned up, fov 90 etc...

Ah well, still going back to 13.12


----------



## Roboyto

Took a pic earlier and was fiddling around with photo editing on my phone. Accidentally clicked on make negative and this is what I got


----------



## Roboyto

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Just did a back to back to back test of BF4 with 13.12, 14.2 d3d11, and 14.2 mantle
> 
> 14.2 d3d11 is actually the best BY FAR lol
> 
> Mantle is awful still. My Fps is slightly higher but my lows are way lower. It's a stuttery mess where as the d3d11 version with 14.2 is incredibly smooth and my fps is very high and way more consistent. I went to a single spot in the center of Firestorm and hung out for a sec. On the catwalk right in the middle facing towards all the smoke and fire. With mantle on I get around 115fps but with d3d11 I get 168fps lol.
> 
> I'll peg the 200fps cap in a lot of places in that same map with mantle but it will go under 80fps and always feels like 30fps. The fog is definitely gone the game looks the same in mantle and d3d11. My highs aren't consistently as high in d3d11 but the lows never go under 130fps. Lamesauce Mantle!!!
> 
> 1440p, ultra with msaa and all turned up, fov 90 etc...
> 
> Ah well, still going back to 13.12


Just had bad experience myself trying to run 3DMark11 with 14.2







Didn't even make it to trying to use mantle with my screen flickering like crazy.

At least they're Beta drivers doing this.


----------



## tsm106

Quote:


> Originally Posted by *nightfox*
> 
> 14.2 beta gives a huge 3dmark11 score compared to 13.12. Same settings, same hardware same OS
> 
> + almost 5000 Graphic score
> + 400 physix score
> almost 2000 overall score
> 
> http://www.3dmark.com/compare/3dm11/8032261/3dm11/8001317


Hmm, that's interesting. Ignoring all scores except GS, three stock 290s equal two highly clocked 290Xs.

http://www.3dmark.com/compare/3dm11/7515121/3dm11/8032261

Adding to my list of things that make you say huh: 4 medium-clocked 290Xs are slightly faster than 4 highly clocked 7970s. Who knew, lol?

http://www.3dmark.com/compare/3dm11/7303785/3dm11/7894543

Btw, I would not draw any conclusions about driver bench gains unless you are really pushing the clocks. Running at stock leaves a lot of room to trim the fat, in a manner of speaking.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Roboyto*
> 
> Just had bad experience myself trying to run 3DMark11 with 14.2
> 
> 
> 
> 
> 
> 
> 
> Didn't even make it to trying to use mantle with my screen flickering like crazy.
> 
> At least they're Beta drivers doing this.


Yea I'm glad they're still working on it, I can't be upset. Everything runs absolutely great with my tri-fire and 13.12 whql drivers


----------



## tsm106

You want to run 14.2 though; just don't switch BF4 to Mantle if you don't want to beta test for AMD.


----------



## nightfox

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Just did a back to back to back test of BF4 with 13.12, 14.2 d3d11, and 14.2 mantle
> 
> 14.2 d3d11 is actually the best BY FAR lol
> 
> Mantle is awful still. My Fps is slightly higher but my lows are way lower. It's a stuttery mess where as the d3d11 version with 14.2 is incredibly smooth and my fps is very high and way more consistent. I went to a single spot in the center of Firestorm and hung out for a sec. On the catwalk right in the middle facing towards all the smoke and fire. With mantle on I get around 115fps but with d3d11 I get 168fps lol.
> 
> I'll peg the 200fps cap in a lot of places in that same map with mantle but it will go under 80fps and always feels like 30fps. The fog is definitely gone the game looks the same in mantle and d3d11. My highs aren't consistently as high in d3d11 but the lows never go under 130fps. Lamesauce Mantle!!!
> 
> 1440p, ultra with msaa and all turned up, fov 90 etc...
> 
> Ah well, still going back to 13.12


Had the same experience, but Mantle is not really that awful now. D3D11 feels a lot smoother for me than 13.12, though I'd say the difference isn't that big.

Quote:


> Originally Posted by *tsm106*
> 
> Hmm, that's interesting. Ignoring scores except for GS, three stock 290s equals two highly clocked 290x.
> 
> http://www.3dmark.com/compare/3dm11/7515121/3dm11/8032261
> 
> Adding to my list of things that make you say huh, 4 medium clocked 290x is slightly faster than 4 highly clocked 7970s. It's ironic, who knew lol?
> 
> http://www.3dmark.com/compare/3dm11/7303785/3dm11/7894543
> 
> Btw, I would not make any conclusions on driver bench gains unless you are really pushing the clocks. Running at stock leaves a lot of room to trim the fat in a manner of speaking.


I think one reason is that my mobo has a PLX chip. We all know that PLX adds latency, and it's really noticeable in synthetic benchmarks but not so much during gaming. That could be the reason.

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Yea I'm glad they're still working on it, I can't be upset. Everything runs absolutely great with my tri-fire and 13.12 whql drivers


Yeah, I had a good experience with 13.12, but I may stick with 14.2 for a while. Need to play more games.


----------



## MrWhiteRX7

Yea 14.2 using d3d11 in bf4 was definitely better. Not a huge jump by any means but I picked up some fps







I'm just sticking with ol' trusty 13.12 for now


----------



## nightfox

Quote:


> Originally Posted by *Roboyto*
> 
> Rebooted like you did to double check my scores, and now my GPU clock isn't taking during the benchmark. This time my score is down is 9.98%
> 
> 
> 
> 
> 
> 
> 
> 
> 14.2 Beta Single GPU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Back to 13.12 me thinks
> 
> Set to 1225 in Trixx, only registered 1129 during the whole bench


had a test with 3770k single GPU.

I run 3dmark11 test with 13.12. uninstalled 13.12 using DDU, installed 14.2, did some test,

The result is the same. Well, not completely identical: the overall score, graphics score, and PhysX score went down, but only by about 1%.

http://www.3dmark.com/compare/3dm11/8032761/3dm11/8032738


----------



## Roboyto

Quote:


> Originally Posted by *nightfox*
> 
> had a test with 3770k single GPU.
> 
> I run 3dmark11 test with 13.12. uninstalled 13.12 using DDU, installed 14.2, did some test,
> 
> result is same. well not completely identical but overall score, graphic score and physx score went down but only a bit +/- 1%
> 
> http://www.3dmark.com/compare/3dm11/8032761/3dm11/8032738


Variance of 1% can be expected sometimes, definitely not a loss like I had though....My rig did not appreciate those 14.2 drivers lol


----------



## nightfox

Quote:


> Originally Posted by *Roboyto*
> 
> Variance of 1% can be expected sometimes, definitely not a loss like I had though....My rig did not appreciate those 14.2 drivers lol


lol. most probably.


----------



## kdawgmaster

Hey guys, here's a video I did of Thief on the 14.2 drivers. FPS is recorded with it, and CrossFire is enabled, but it appears only one card is working.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *kdawgmaster*
> 
> Hey Guys heres a video i did of Thief on the 14.2 drivers. FPS is recorded with it and crossfire is enabled but it appears only one card is working.


You record with Action right? How much of a performance hit is there?


----------



## Belkov

I tried to flash back my silent-mode BIOS. Everything was fine until it started to load Windows 7: after the loading screen there is no picture... If I load Win7 in safe mode it works, but in normal mode everything is black. Do you have an idea what to do? Oh yeah, I was using atiflash under DOS.


----------



## nightfox

Quote:


> Originally Posted by *kdawgmaster*
> 
> Hey Guys heres a video i did of Thief on the 14.2 drivers. FPS is recorded with it and crossfire is enabled but it appears only one card is working.


try disable ULPS


----------



## alancsalt

Quote:


> Originally Posted by *Belkov*
> 
> I tried to return my silent mode bios. Everything was fine, until it started to load windows7. After loading screen there is no picture... If i load win7 in safe mode, it works, but in normal mode everything is black. Do you have an idea what to do? Oh, yeah, i was using atiflash under dos.


At a rough guess, boot in safe mode, uninstall graphics drivers in device manager, maybe run a driver cleaner, and reboot.


----------



## b0x3d

Quote:


> Originally Posted by *Roboyto*
> 
> Pic please! Someone with a gigabyte card who has disassembled them may be able to identify the screw for you


Thanks for your reply mate, that's a good idea. I'm traveling at the moment and using my Mac, but will send a pic when I get back to my PC.


----------



## b0x3d

Does it make much difference using windows 8.1 pro over windows 7 ultimate? Apparently you get DX11.2 with win8 but do any games support it yet and is it worth the upgrade?

DirectX 11.2 brings a host of new features to improve performance in games and graphics apps: HLSL shader linking, an inbox HLSL compiler, GPU overlay support, DirectX tiled resources, and more.


----------



## nightfox

Quote:


> Originally Posted by *b0x3d*
> 
> Thanks for your reply mate, that's a good idea. I'm traveling at the moment and using my Mac, but will send a pic when I get back to my PC.


Quote:


> Originally Posted by *b0x3d*
> 
> Does it make much difference using windows 8.1 pro over windows 7 ultimate? Apparently you get DX11.2 with win8 but do any games support it yet and is it worth the upgrade?
> 
> DIRECTX 11.2 which brings a host of new features to improve performance in your games and graphics app. The DIRECTX 11.2 offers HLSL shader linking, Inbox HLSL compiler, GPU overlay support, DirectX tiled resources, and more. Bring new levels of visual realism to gaming on the PC and get top-notch performance.


FYI, there is a function here called "edit" where you can just edit your earlier post.

To answer your question: yes, a lot of people, and a lot of tests, find that Windows 8.1 does improve on Windows 7, but not by that much. What gamers really don't like is the Metro apps/overlay. I do use Windows 8.1 myself, for no particular reason; I just got used to it already


----------



## Belkov

Quote:


> Originally Posted by *alancsalt*
> 
> At a rough guess, boot in safe mode, uninstall graphics drivers in device manager, maybe run a driver cleaner, and reboot.


Thanks. I tried to give you a rep, but there is no such option for you...








I will try to do what you said.


----------



## Mercy4You

Thx guys for testing 14.2 Beta, I'll keep 13.12


----------



## Widde

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Just did a back to back to back test of BF4 with 13.12, 14.2 d3d11, and 14.2 mantle
> 
> 14.2 d3d11 is actually the best BY FAR lol
> 
> Mantle is awful still. My Fps is slightly higher but my lows are way lower. It's a stuttery mess where as the d3d11 version with 14.2 is incredibly smooth and my fps is very high and way more consistent. I went to a single spot in the center of Firestorm and hung out for a sec. On the catwalk right in the middle facing towards all the smoke and fire. With mantle on I get around 115fps but with d3d11 I get 168fps lol.
> 
> I'll peg the 200fps cap in a lot of places in that same map with mantle but it will go under 80fps and always feels like 30fps. The fog is definitely gone the game looks the same in mantle and d3d11. My highs aren't consistently as high in d3d11 but the lows never go under 130fps. Lamesauce Mantle!!!
> 
> 1440p, ultra with msaa and all turned up, fov 90 etc...
> 
> Ah well, still going back to 13.12


Have had a similar experience: DX11 is running butter smooth on 14.2 and Mantle runs like a car without gas. I'm getting 100+ fps with not many dips into the high 80s with DX11, while with Mantle it goes down to 80-ish fps and dips to the high 60s.
At least they have fixed something









And I can't for the life of me get Thief to run CrossFire; it just won't







Tried enabling CrossFire for stuff that doesn't have a profile, and ULPS is disabled.

Noticed another thing, 55C idle temps


----------



## CU4TLAN

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Just did a back to back to back test of BF4 with 13.12, 14.2 d3d11, and 14.2 mantle
> 
> 14.2 d3d11 is actually the best BY FAR lol
> 
> Mantle is awful still. My Fps is slightly higher but my lows are way lower. It's a stuttery mess where as the d3d11 version with 14.2 is incredibly smooth and my fps is very high and way more consistent. I went to a single spot in the center of Firestorm and hung out for a sec. On the catwalk right in the middle facing towards all the smoke and fire. With mantle on I get around 115fps but with d3d11 I get 168fps lol.
> 
> I'll peg the 200fps cap in a lot of places in that same map with mantle but it will go under 80fps and always feels like 30fps. The fog is definitely gone the game looks the same in mantle and d3d11. My highs aren't consistently as high in d3d11 but the lows never go under 130fps. Lamesauce Mantle!!!
> 
> 1440p, ultra with msaa and all turned up, fov 90 etc...
> 
> Ah well, still going back to 13.12


You've got to keep in mind that Mantle wasn't targeted at i7 and 290 owners, but at APUs and lower-end CPUs (it'll also help FX CPU bottlenecks) and GPUs. This is common knowledge.

So idk why you're complaining, its not like you need the boost Mantle has to offer when you have 3 290s.


----------



## Widde

Quote:


> Originally Posted by *CU4TLAN*
> 
> You gotta keep in mind that Mantle wasnt targeted at i7 and 290 owners, but the APUs or lower end CPUs (It'll also help FX CPU bottlenecks) and GPUs. And this is common knowledge.
> 
> So idk why you're complaining, its not like you need the boost Mantle has to offer when you have 3 290s.


It's not about what you "need"







Some just want as high fps as possible







I don't even need a 2nd 290 ^^ But I got one anyway because I wanted more performance


----------



## CU4TLAN

Quote:


> Originally Posted by *Widde*
> 
> It's not about what you "need"
> 
> 
> 
> 
> 
> 
> 
> Some just want as high fps as possible
> 
> 
> 
> 
> 
> 
> 
> I dont even need a 2nd 290 ^^ But I got one anyway because I wanted more performance


Should've considered an i7 before you got that 290.


----------



## Widde

Quote:


> Originally Posted by *CU4TLAN*
> 
> Should've considered an i7 before you got that 290.


BF4 isn't even pegging this at 100%, but it's getting close. Getting a socket 2011 platform soon


----------



## CU4TLAN

Quote:


> Originally Posted by *Widde*
> 
> bf4 isnt even pegging this at 100% but it's getting close, Getting a socket 2011 platform soon


When you do; get some proper RAM.


----------



## Widde

Quote:


> Originally Posted by *CU4TLAN*
> 
> When you do; get some proper RAM.


Getting 32GB of Corsair Vengeance Pro when I do ^^


----------



## CU4TLAN

Quote:


> Originally Posted by *Widde*
> 
> Huh? What do you mean by proper? ^_^ Have 16GB at 1866MHz


It's slow.


----------



## Widde

Quote:


> Originally Posted by *CU4TLAN*
> 
> It's slow.


Edited* Buying new RAM altogether when I'm switching


----------



## CU4TLAN

Quote:


> Originally Posted by *Widde*
> 
> Edited* Buying new RAM altogether when I'm switching


I suggest you stay away from Vengeance RAM altogether, but OK lol


----------



## Widde

Quote:


> Originally Posted by *CU4TLAN*
> 
> I suggest you stay away from Vengeance RAM altogether, but OK lol


Why?







Never had any problems with them, but okay


----------



## CU4TLAN

Quote:


> Originally Posted by *Widde*
> 
> Why?
> 
> 
> 
> 
> 
> 
> 
> Never had any problems with them, but okay


Well, just running stock or a slight overclock won't give you any issues with Kingston value RAM either; get my point?


----------



## Widde

Quote:


> Originally Posted by *CU4TLAN*
> 
> Well, just running stock or a slight overclock won't give you any issues with Kingston value RAM either; get my point?


I'm not putting that much weight on what RAM I'm using, not going for bench records exactly ^^ But ye, I get your point; they don't overclock well, I have tried. If I were to overclock my RAM I'd rather get some Dominator Platinums or TridentXs


----------



## CU4TLAN

Quote:


> Originally Posted by *Widde*
> 
> I'm not putting that much weight on what RAM I'm using, not going for bench records exactly ^^ But ye, I get your point; they don't overclock well, I have tried. If I were to overclock my RAM I'd rather get some Dominator Platinums or TridentXs


RAM helps more than people like to admit.


----------



## LTC

So has anyone tried the 14.2 drivers with a 290? How are they overclocking, and how stable are they? I went back to 13.12 because 14.1 was unstable in BF4 with a little overclocking; however, I do miss Mantle...


----------



## vortex240

Quote:


> Originally Posted by *Roboyto*
> 
> Is the TR bench downloadable, or must I install the game? I think I have a game coupon around here somewhere


That sounds perfectly fine - most likely there is a loading delay from your HDD when the bench starts. That outlier is what skews your average. If both you and the other guy reran the benches with vsync off, you would see that your results are nearly identical.
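The outlier effect described above is easy to demonstrate: one slow loading frame at the start of a run drags the reported average down while the median barely moves. A minimal sketch, with made-up frame times rather than real bench data:

```python
# Sketch of how a single loading hitch skews average FPS.
# Frame times below are illustrative, not real benchmark data.
from statistics import median

# 60 frames at ~8 ms (125 FPS) plus one 500 ms hitch while the level loads
frame_times_ms = [500.0] + [8.0] * 60

def avg_fps(times_ms):
    """Average FPS the way benches report it: total frames / total time."""
    return len(times_ms) / (sum(times_ms) / 1000.0)

def median_fps(times_ms):
    """FPS of the median frame time -- robust against a single outlier."""
    return 1000.0 / median(times_ms)

print(f"average FPS: {avg_fps(frame_times_ms):.1f}")   # ~62, dragged down by the hitch
print(f"median  FPS: {median_fps(frame_times_ms):.1f}") # 125.0, unaffected
```

One 500 ms stall in a 61-frame run cuts the reported average roughly in half, which is why rerunning (or looking at medians/percentiles) gives a fairer comparison.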


----------



## SimonKaz

Quote:


> Originally Posted by *vortex240*
> 
> That sounds perfectly fine - most likely there is a loading delay from your HDD when the bench starts. That outlier is what skews your average. If both you and the other guy reran the benches with vsync off, you would see that your results are nearly identical.


I think we're both on SSDs (at least I am), so I don't think that's the reason.


----------



## vortex240

Quote:


> Originally Posted by *SimonKaz*
> 
> I think we're both on SSDs (at least I am), so I don't think that's the reason.


Do the test like I suggested. Could be a ton of things on your rig causing a single dip that screws up your average.


----------



## SimonKaz

Quote:


> Originally Posted by *vortex240*
> 
> Do the test like I suggested. Could be a ton of things on your rig causing a single dip that screws up your average.


Oh yeah, definitely an option. I'm taking note of all the suggestions and will be rerunning my tests today. I'll start with getting 13.12 back in the system, and if that doesn't help, I'll try 14.2 to see if there's any improvement. Apart from that, I'll run the fans at 100% and open the case to allow for better ventilation.


----------



## Imprezzion

As my previous post got snowed under, I'll ask again.

I keep getting several issues with my 290X when cold booting with an OC set in ANY overclocking program.

*1.* With MSI AB and TriXX the overclock does not apply and resets every reboot, even though apply-on-boot and a saved profile are enabled.
The voltage and power limit, however, ARE applied.

Simply clicking the profile again and applying fixes it.

*2.* Every cold boot I need to reboot at least once to get performance up. Whenever I cold boot - with or without an overclock, that doesn't matter, and OC programs don't affect this either - performance is extremely low.

Valley will run at 20-ish FPS. Clocks show normal 3D clocks across the board. Usage is 100%. Temps and power usage are MUCH lower than expected, however.
This can ONLY be fixed with a full reboot, and after that it runs without any issues.

Happens on both the 14.1 and 14.2 drivers. Haven't tested 13.xx, and I won't either, as I need Mantle for BF4.

*3.* When I use AMD Overdrive to simply OC without editing voltage, the core is stuck at full 3D clocks in 2D.
Memory clocks down just fine; as soon as Overdrive is enabled, the core gets stuck at 3D clocks.

When I use a 3rd-party OC program and leave Overdrive disabled, it downclocks just fine, btw. Turning off Overdrive sees the core clock instantly dropping to ~300MHz. Turning it on, even at stock speeds, immediately jumps it to full speed.

Please, anyone, can you help me with any of these issues?

I love my card, and even though it is one hell of a sad overclocker I am not going to get rid of it, purely because of the insane performance it has in the only game I play ATM, which is BF4.
I mean, I run all Ultra with no AA and with HBAO enabled. Drivers set everything to max quality. 1080p with a 150% resolution scale, which means a 2880x1620 rendering resolution.
With Mantle enabled and at 1100MHz core and 1500MHz VRAM I'm averaging way over 70FPS, probably about 80-82 FPS. This is where a GTX 780 @ 1306MHz core and 1800MHz VRAM can't even average 50 on D3D11...
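For anyone wondering where the 2880x1620 figure comes from: BF4's resolution scale multiplies each axis, so 150% of 1920x1080 renders 2.25x the pixels of native 1080p. A quick check of the math:

```python
# Resolution-scale math: the percentage applies per axis,
# so the pixel count grows with the square of the scale.

def scaled_resolution(width, height, scale_pct):
    """Render resolution for a given resolution-scale percentage."""
    s = scale_pct / 100.0
    return int(width * s), int(height * s)

w, h = scaled_resolution(1920, 1080, 150)
print(w, h)                       # 2880 1620
print((w * h) / (1920 * 1080))    # 2.25 -- 2.25x the pixels of native 1080p
```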


----------



## smokedawg

Quote:


> Originally Posted by *Imprezzion*
> 
> *2.* Every cold boot i need to reboot at least once to get performance up. Whenever I cold boot, with or without overclock, that doesn't matter, oc programs don't affect this either, performance is extremely low after a cold boot.
> 
> Valley will run at 20 ish FPS. Clocks show normal 3d clocks across the board. Usage is 100%. Temps and power usage are MUCH lower then expected however.
> This can ONLY be fixed with a full reboot and after that it runs without any issues.
> 
> Happens on both 14.1 and 14.2 drivers. Haven't tested 13.xx and I won't either as I need Mantle for BF4.


The same happened to me using 14.1. Reverting to 13.xx fixed this. I noticed it playing Skyrim: performance was bad, so I decided to reboot, after which it got back to normal. Haven't checked 14.2 yet.


----------



## VSG

8-pack is live and breaking 4x GPU world records using Asus DCUII R9 290X cards: 




He just broke the Catzilla 4-way WR. I need to talk to him later about how he figured out how to enable multi-GPU scaling on Catzilla lol


----------



## rdr09

Quote:


> Originally Posted by *Imprezzion*
> 
> As my previous post got snowed under, I'll ask again.
> 
> I keep getting several issues with my 290X when cold booting with an OC set in ANY overclocking program.
> 
> *1.* With MSI AB and TriXX the overclock does not apply and resets every reboot, even though apply-on-boot and a saved profile are enabled.
> The voltage and power limit, however, ARE applied.
> 
> Simply clicking the profile again and applying fixes it.
> 
> *2.* Every cold boot I need to reboot at least once to get performance up. Whenever I cold boot - with or without an overclock, that doesn't matter, and OC programs don't affect this either - performance is extremely low.
> 
> Valley will run at 20-ish FPS. Clocks show normal 3D clocks across the board. Usage is 100%. Temps and power usage are MUCH lower than expected, however.
> This can ONLY be fixed with a full reboot, and after that it runs without any issues.
> 
> Happens on both the 14.1 and 14.2 drivers. Haven't tested 13.xx, and I won't either, as I need Mantle for BF4.
> 
> *3.* When I use AMD Overdrive to simply OC without editing voltage, the core is stuck at full 3D clocks in 2D.
> Memory clocks down just fine; as soon as Overdrive is enabled, the core gets stuck at 3D clocks.
> 
> When I use a 3rd-party OC program and leave Overdrive disabled, it downclocks just fine, btw. Turning off Overdrive sees the core clock instantly dropping to ~300MHz. Turning it on, even at stock speeds, immediately jumps it to full speed.
> 
> Please, anyone, can you help me with any of these issues?
> 
> I love my card, and even though it is one hell of a sad overclocker I am not going to get rid of it, purely because of the insane performance it has in the only game I play ATM, which is BF4.
> I mean, I run all Ultra with no AA and with HBAO enabled. Drivers set everything to max quality. 1080p with a 150% resolution scale, which means a 2880x1620 rendering resolution.
> With Mantle enabled and at 1100MHz core and 1500MHz VRAM I'm averaging way over 70FPS, probably about 80-82 FPS. This is where a GTX 780 @ 1306MHz core and 1800MHz VRAM can't even average 50 on D3D11...


I just tried it. OC'ed and it stayed, both on restart and cold boot - unless I am not understanding you. Here is what I used . . .



I don't mix AB and TriXX. I only use TriXX to OC, and only for benching. I tried it to help you. This is using 14.2, btw. 14.2 works as flawlessly as 13.11, just not with BF4/Mantle in my case.


----------



## ebduncan

Quote:


> I don't mix AB and TriXX. I only use TriXX to OC, and only for benching. I tried it to help you. This is using 14.2, btw. 14.2 works as flawlessly as 13.11, just not with BF4/Mantle in my case.


I highly doubt 14.2 is anywhere NEAR the 13.12 drivers. When I tried 14.1 I had so many issues I had to revert back to 13.12; I think it was the first time ever I actually had to uninstall a new beta driver and go back to the older driver because of issues.

Now I'm sure 14.2 is better than 14.1, but I would not compare it to the 13.12 drivers, which are rock solid in everything. Yes, they're missing Mantle, but Mantle isn't needed.


----------



## rdr09

Quote:


> Originally Posted by *ebduncan*
> 
> I highly doubt 14.2 is anywhere NEAR the 13.12 drivers. When I tried 14.1 I had so many issues I had to revert back to 13.12; I think it was the first time ever I actually had to uninstall a new beta driver and go back to the older driver because of issues.
> 
> Now I'm sure 14.2 is better than 14.1, but I would not compare it to the 13.12 drivers, which are rock solid in everything. Yes, they're missing Mantle, but Mantle isn't needed.


Same here: 14.1 had problems with BF4/Mantle. Did not try any other game. I tried 14.2 with BF4/DX11 and my other games - works fine.

Why don't you try it? Btw, I want Mantle, but I don't need it.

Edit: I skipped DDU this time around and just used 14.2 to uninstall 13.11 and installed it after - the way I always used to do it, except for 14.1.


----------



## ebduncan

I see no sense in wasting time switching drivers again when everything I play now works fine and I get good performance. I understand some folks have to switch every waking second of the day. I don't have time for that.


----------



## rdr09

Quote:


> Originally Posted by *ebduncan*
> 
> I see no sense in wasting time switching drivers again when everything I play now works fine and I get good performance. I understand some folks have to switch every waking second of the day. I don't have time for that.


I agree, if you don't want to help test the beta with Mantle. No one is forcing you.


----------



## Mercy4You

Quote:


> Originally Posted by *ebduncan*
> 
> I highly doubt 14.2 is anywhere NEAR the 13.12 drivers. When i tried 14.1 i had so many issues i had to revert back to 13.12, I think it was the first time in history i had to actually uninstall a new beta driver and go back to the older driver because of issues.
> 
> Now i'm sure 14.2 is better than 14.1, but I would not compare it to 13.12 drivers, which are rock solid in everything, yes its missing mantle, but Mantle isn't needed.


I can only confirm everything you said in this post.


----------



## kizwan

Quote:


> Originally Posted by *ebduncan*
> 
> Quote:
> 
> 
> 
> i don't mix ab and trixx. i only use trixx to oc and only for benching. i tried it to help you. this is using 14.2, btw. 14.2 works as flawless as 13.11 just not with BF4/mantle in my case.
> 
> 
> 
> I highly doubt 14.2 is anywhere NEAR the 13.12 drivers. When i tried 14.1 i had so many issues i had to revert back to 13.12, I think it was the first time in history i had to actually uninstall a new beta driver and go back to the older driver because of issues.
> 
> Now i'm sure 14.2 is better than 14.1, but I would not compare it to 13.12 drivers, which are rock solid in everything, yes its missing mantle, but Mantle isn't needed.

Of course you don't need Mantle; you only have one GPU. People running multi-GPU are going to need it. Multi-GPU uses a lot of CPU processing power. Mantle can help with that, and this will definitely boost performance.


----------



## Roboyto

Quote:


> Originally Posted by *Imprezzion*
> 
> As my previous post got snowed under, I'll ask again.
> 
> I keep getting several issues with my 290X when cold booting with an OC set in ANY overclocking program.
> 
> *1.* With MSI AB and TriXX the overclock does not apply and resets every reboot, even *though apply-on-boot and a saved profile are enabled.*
> The voltage and power limit, however, ARE applied.
> 
> Simply clicking the profile again and applying fixes it.
> 
> *2.* Every cold boot I need to reboot at least once to get performance up. Whenever I cold boot - with or without an overclock, that doesn't matter, and OC programs don't affect this either - performance is extremely low.
> 
> Valley will run at 20-ish FPS. Clocks show normal 3D clocks across the board. Usage is 100%. Temps and power usage are MUCH lower than expected, however.
> This can ONLY be fixed with a full reboot, and after that it runs without any issues.
> 
> *Happens on both the 14.1 and 14.2 drivers. Haven't tested 13.xx, and I won't either, as I need Mantle for BF4.*
> 
> *3.* When I use *AMD Overdrive* to simply OC without editing voltage, the core is stuck at full 3D clocks in 2D.
> Memory clocks down just fine; as soon as Overdrive is enabled, the core gets stuck at 3D clocks.
> 
> When I use a 3rd-party OC program and leave Overdrive disabled, it downclocks just fine, btw. Turning off Overdrive sees the core clock instantly dropping to ~300MHz. Turning it on, even at stock speeds, immediately jumps it to full speed.
> 
> Please, anyone, can you help me with any of these issues?
> 
> I love my card, and even though it is one hell of a sad overclocker I am not going to get rid of it, purely because of the insane performance it has in the only game I play ATM, which is BF4.
> I mean, I run all Ultra with no AA and with HBAO enabled. Drivers set everything to max quality. 1080p with a 150% resolution scale, which means a 2880x1620 rendering resolution.
> With Mantle enabled and at 1100MHz core and 1500MHz VRAM I'm averaging way over 70FPS, probably about 80-82 FPS. This is where a GTX 780 @ 1306MHz core and 1800MHz VRAM can't even average 50 on D3D11...


Choose AB or Trixx. Don't use both.

I use Trixx personally. Disable the apply-on-boot setting, but let Trixx load with Windows. If you need to overclock, it's not hard to enable a saved profile. Make sure you disable ULPS and force constant voltage; these will help with stability and keep frame rates consistent.

Get rid of those 14.X drivers; they have been nothing but problems. Make sure you scrub with DDU. http://www.guru3d.com/files_details/display_driver_uninstaller_download.html

You don't *NEED* Mantle to play BF4 with one of these cards.

Disable AMD Overdrive.
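For anyone curious what Trixx's "disable ULPS" checkbox actually does: it flips the `EnableUlps` registry value on the display adapter's class key. A rough sketch of doing it by hand - note the `0000` subkey index varies per system and adapter, so search for every subkey that already contains an `EnableUlps` value, and back up the registry first:

```
Windows Registry Editor Version 5.00

; Disable ULPS (Ultra Low Power State) for one display adapter.
; The 0000 index below is an example -- multi-GPU systems will have
; additional subkeys (0001, 0002, ...) that may also carry EnableUlps.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Letting Trixx (or AB) handle this is safer than hand-editing; the sketch is just to show there's no magic behind the checkbox.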


----------



## Roboyto

Quote:


> Originally Posted by *CU4TLAN*
> 
> RAM helps more than people like to admit.


Quote:


> Originally Posted by *Widde*
> 
> I'm not putting so much weight on what ram I'm using, not going for bench records exactly ^^ But ye I get your point, they dont overclock well have tried. If I was to overclock my ram I'd rather get some dominator plats or TridentXs


It helps quite a bit, actually - especially with modern games, and even more so with BF4.

http://www.xbitlabs.com/articles/memory/display/ivy-bridge-ddr3_2.html

Check out this build log, page 13 post #122. BF4 gets ENORMOUS gains from system memory speeds!

http://www.overclock.net/t/1442038/build-log-the-hawaiian-heat-wave/120


----------



## ebduncan

Quote:


> Originally Posted by *kizwan*
> 
> Of course you don't need Mantle, you only have one GPU. For people running multi-gpu, they going to need it. Multi-gpu use CPU processing power a lot. Mantle can help that & this will definitely boost performance.


We are only talking about benefits in one game right now, which is BF4. I understand there are performance gains, but to me there are not enough to even begin to deal with the issues the newer drivers bring.

I usually play at 1080p, and my avg fps is around 120 or higher. I don't need extra performance. Even for Crossfire users, unless you're running 4K or 3 monitors, you will not need the extra performance Mantle provides.

I'm an avid supporter of Mantle and all it brings, but simply put, it's not ready yet.


----------



## Widde

Quote:


> Originally Posted by *Roboyto*
> 
> It helps quite a bit actually. Especially with modern games, and even more so with BF4.
> 
> http://www.xbitlabs.com/articles/memory/display/ivy-bridge-ddr3_2.html
> 
> Check out this build log, page 13 post #122. BF4 gets ENORMOUS gains from system memory speeds!
> 
> http://www.overclock.net/t/1442038/build-log-the-hawaiian-heat-wave/120


I highly doubt I will see massive gains xD But this is an old build, and the price for higher-clocked RAM wasn't pretty ^^ Having them at 1866 9-10-9-24 atm. Will do a proper build when I get some money together, though


----------



## rdr09

Quote:


> Originally Posted by *ebduncan*
> 
> We are only talking about benefits in one game right now, which is BF4. I understand there are performance gains, but to me there are not enough to even begin to deal with the issues the newer drivers bring.
> 
> I usually play at 1080p, and my avg fps is around 120 or higher. I don't need extra performance. Even the crossfire users unless your running 4k or 3 monitors, you will not need the extra performance mantle provides.
> 
> I'm a avid supporter of Mantle and all it brings, but simply put its not ready yet.


The beta testers are us - the gamers. How can it be ready if we don't help test it? See, success or failure, someday I can say I helped test Mantle.

Like I said, enjoy 13.12 - AMD is not forcing you to try it.


----------



## WotanWipeout

Did anybody try mining with 14.2? 14.1 had some errors, and I didn't check 14.2.


----------



## Roboyto

Quote:


> Originally Posted by *Widde*
> 
> I highly doubt I will see massive gains xD But this is an old build and the price for higher clocked ram wasnt pretty ^^ Having them at 1866 9-10-9-24 atm. Will do a proper build when I get some money together though


You doubt, but did you read?

The information is right there, and that Xbitlabs article is from July 2012; things have only come to take more advantage of RAM speeds since then.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *CU4TLAN*
> 
> You gotta keep in mind that Mantle wasnt targeted at i7 and 290 owners, but the APUs or lower end CPUs (It'll also help FX CPU bottlenecks) and GPUs. And this is common knowledge.
> 
> So idk why you're complaining, its not like you need the boost Mantle has to offer when you have 3 290s.


LOL!!! This guy

No matter how many GPUs one has, there will be a CPU choke point in parts of maps in BF4. I can average 165fps @ my resolution with all the bells and whistles, but if I get lows down below my refresh rate, that is annoying. This is OCN, not averageclock.net - I want the most performance I can get out of my setup.

Plus, "common knowledge" would hopefully show you that Mantle is actually there to HELP with multi-GPU configs every bit as much as a low-end CPU combined with a good GPU. They even showed this in their own testing!









Not sure why you all of a sudden came into this thread acting the way you are, but have a nice day all the same.


----------



## Widde

Quote:


> Originally Posted by *Roboyto*
> 
> You doubt, but did you read?
> 
> The information is right there, and that Xbitlabs article is from July of 2012, things have only come to take advantage of RAM speeds more since then.


I don't really believe the average going from 75 to 120 purely because of RAM. Granted, I don't know the exact workings ^^ Anyways, I'm gonna leave it now, because I don't 100%, not even 75%, know what I'm talking about ^^


----------



## MrWhiteRX7

Quote:


> Originally Posted by *kizwan*
> 
> Of course you don't need Mantle, you only have one GPU. For people running multi-gpu, they going to need it. Multi-gpu use CPU processing power a lot. Mantle can help that & this will definitely boost performance.


THIS^^^


----------



## Roboyto

Quote:


> Originally Posted by *Widde*
> 
> I dont really believe the average going from 75 to 120 purely because of ram, Granted I dont know the exact works ^^ anyways I'm gonna leave it now because I dont 100% not even 75% know what I'm talking about ^^


I can only lead you to the water... I cannot force you to drink it.


----------



## Imprezzion

Quote:


> Originally Posted by *Roboyto*
> 
> Choose AB or Trixx. Don't use both.
> 
> I use Trixx personally. Disable the apply on boot setting, but let Trixx load with Windows. If you need to overclock it's not hard to enable a saved profile. Make sure you disable ULPS and force constant voltage, these will help with stability and to keep frame rates consistent.
> 
> Get rid of those 14.X drivers, they hav been nothing but problems. Make sure your scrub with DDU. http://www.guru3d.com/files_details/display_driver_uninstaller_download.html
> 
> You don't _*NEED*_ Mantle to play BF4 with one of these cards.
> 
> Disable AMD Overdrive.


Sorry for the confusion; I meant I tried with either MSI AB or TriXX, but both seem to fail.

And yeah, I kinda need Mantle for BF4. On the settings I play, DirectX just cannot deliver the performance to run smoothly in the roughest combat. Mantle can.
I am a GFX whore, and even though I like my FPS high (at least 60+ as MINIMUM FPS, not average) I also like my gfx maxed out.
This means a 150% resolution scale at least, which is a 2880x1620 rendering res.

When I run DX11 it will drop to the mid 40s in heavy fighting. Using Mantle it will never drop below 55, no matter how hard the fighting is.
Averages also go from 71 in DX11 to 84 on Mantle, and FPS is MUCH more stable.

Plus, 14.1 was horrible, I admit. Lots of crashes and problems.

But 14.2 has served me so well that ever since I installed it I haven't had a single crash in any game.
Not in BF4 @ Mantle, not in MW2/3 on DX, not in Bioshock Infinite, not in any random MMO...

I guess I'll have to live with clicking MSI AB's profile and clicking Apply again after every cold boot. Or just use sleep mode


----------



## kizwan

Quote:


> Originally Posted by *ebduncan*
> 
> We are only talking about benefits in one game right now, which is BF4. I understand there are performance gains, but to me there are not enough to even begin to deal with the issues the newer drivers bring.
> 
> I usually play at 1080p, and my avg fps is around 120 or higher. I don't need extra performance. Even the crossfire users unless your running 4k or 3 monitors, you will not need the extra performance mantle provides.
> 
> I'm a avid supporter of Mantle and all it brings, but simply put its not ready yet.


Crossfire will use/need more CPU processing power than a single GPU. With certain CPUs, it can strain the CPU to the point where performance is lost because of CPU bottlenecking. You think multi-GPU can provide extra performance, but forget to take into account whether the CPU can handle it or not. Two 290's are already enough to bring certain CPUs to their knees. This is where Mantle comes in. It can reduce CPU time & therefore improve performance.

Like I said before, you don't need Mantle because you only have one GPU. I'm not arguing whether you should use Mantle or not, just agreeing with you in your case.
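The Crossfire-bottleneck argument above can be sketched with a toy frame-time model: each frame costs some CPU time (draw-call submission etc.) and some GPU time, and the slower of the two sets the frame rate. All the millisecond figures here are assumed for illustration, not measured:

```python
# Toy CPU/GPU bottleneck model: frame rate is limited by
# whichever side takes longer per frame. Numbers are illustrative.

def fps(cpu_ms, gpu_ms):
    """Frames per second when the slower of CPU and GPU work gates each frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

gpu_single = 12.0           # one 290: assumed GPU time per frame
gpu_cf = gpu_single / 1.9   # assumed ~90% CrossFire scaling
cpu_dx = 10.0               # assumed DX11 CPU cost per frame
cpu_mantle = 6.0            # assumed lower CPU cost under Mantle

print(f"{fps(cpu_dx, gpu_single):.0f}")   # single card: GPU-bound, ~83 FPS
print(f"{fps(cpu_dx, gpu_cf):.0f}")       # CrossFire: now CPU-bound at 100 FPS
print(f"{fps(cpu_mantle, gpu_cf):.0f}")   # Mantle lifts the CPU ceiling, ~158 FPS
```

In this model, cutting CPU time only matters once the second GPU pushes GPU time below the CPU cost - which is exactly why a single-card setup sees little from Mantle while a CPU-limited CrossFire rig can see a lot.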


----------



## Roboyto

Quote:


> Originally Posted by *Imprezzion*
> 
> Sorry for the confusion; I meant I tried with either MSI AB or TriXX, but both seem to fail.
> 
> And yeah, I kinda need Mantle for BF4. On the settings I play, DirectX just cannot deliver the performance to run smoothly in the roughest combat. Mantle can.
> I am a GFX whore, and even though I like my FPS high (at least 60+ as MINIMUM FPS, not average) I also like my gfx maxed out.
> This means a 150% resolution scale at least, which is a 2880x1620 rendering res.
> 
> When I run DX11 it will drop to the mid 40s in heavy fighting. Using Mantle it will never drop below 55, no matter how hard the fighting is.
> Averages also go from 71 in DX11 to 84 on Mantle, and FPS is MUCH more stable.
> 
> Plus, 14.1 was horrible, I admit. Lots of crashes and problems.
> 
> But 14.2 has served me so well that ever since I installed it I haven't had a single crash in any game.
> Not in BF4 @ Mantle, not in MW2/3 on DX, not in Bioshock Infinite, not in any random MMO...
> 
> I guess I'll have to live with clicking MSI AB's profile and clicking Apply again after every cold boot. Or just use sleep mode


Sometimes compromises must be made, especially at that resolution with 1 card.

Mantle is still in its infancy, only time will help it grow.


----------



## b0x3d

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Mantle is actually there to HELP with multi-gpu configs every bit as much as a low-end cpu combined with a good gpu.


Mantle seemed to make a massive difference for me. I have an old mobo and CPU (my next upgrade) - an AM3 Phenom II 3.2 quad core - and I added 2x 290X in CF. I play Eyefinity at 5292x1050, and installing the 14.2 drivers and enabling Mantle made BF4 much smoother. Basically, without Mantle I had to turn video settings down to High; now I can have them on Ultra.

I realise this feedback is a bit anecdotal; when I get a minute I'll do some tests and put the results, my specs, etc. on here


----------



## MrWhiteRX7

Quote:


> Originally Posted by *b0x3d*
> 
> Mantle seemed to make a massive difference for me. I have an old mobo and CPU (my next upgrade) - AM3 Phenom II 3.2 Quad Core - and I added 2x 290x in CF - I play eyefinity 5292x1050 res and installing the 14.2 drivers and enabling mantle made BF4 much smoother. Basically without mantle I had to turn video setting down to high, now I can have them on ultra
> 
> Realise this feedback is a bit anecdotal, when I get a minute I'll do some tests and put the results and my specs etc on here


It was weird for me: if I stopped and just looked around, everything was fluid and smooth as it should be, hitting almost 200fps, but the second I started walking around it got choppy, even with the fps not dropping at all. Then sometimes it was fine lol. I'm going to play around with it more just because I'm a glutton for punishment









They claim it's good for 3 and 4 GPUs now


----------



## taem

So are you guys saying 290 xfire will be bottlenecked by my 4670K @ 4.7? Just in BF4, or generally?


----------



## CU4TLAN

Quote:


> Originally Posted by *taem*
> 
> So you guys saying 290 xfire will be bottlenecked by my 4670k @ 4.7? Just in BF4 or generally?


Yes.


----------



## kizwan

Quote:


> Originally Posted by *taem*
> 
> So you guys saying 290 xfire will be bottlenecked by my 4670k @ 4.7? Just in BF4 or generally?


I believe a 4670K can still handle two 290's in Crossfire. I don't know for sure. If the CPU is bottlenecking, it will affect all CPU-intensive games.


----------



## Widde

Quote:


> Originally Posted by *taem*
> 
> So you guys saying 290 xfire will be bottlenecked by my 4670k @ 4.7? Just in BF4 or generally?


I don't experience any bottlenecking with my 3570K at 4.6GHz; CPU usage doesn't go higher than 90% in BF4


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> So you guys saying 290 xfire will be bottlenecked by my 4670k @ 4.7? Just in BF4 or generally?


It would depend on the game and how optimized it is for more cores.

http://www.anandtech.com/show/7189/choosing-a-gaming-cpu-september-2013/10



Anandtech found that with a 4670K vs. a 4770K and (3) 7970s, the frame rates were very close. I think the 4670K should be plenty.


----------



## CU4TLAN

Quote:


> Originally Posted by *Roboyto*
> 
> It would depend on the game and how optimized it was for more cores.
> 
> http://www.anandtech.com/show/7189/choosing-a-gaming-cpu-september-2013/10
> 
> 
> 
> 
> Anandtech found that 4670k ~VS~ 4770k with (3) 7970s the frame rates were very close. I think the 4670k should be plenty.


None of these games are very CPU-intensive at all, and even IF some are, they're single-threaded, so i5 vs. i7 wouldn't matter much. Compare an i5 to a 3930K or something in BF4.


----------



## kpoeticg

So after all this stressing about RMA'ing my black-screen 290X to PowerColor, since I haven't heard back from them yet about my ticket, I decided to give Newegg a call just for the hell of it. They decided to honor the card and issued me another RMA









Much respect to Newegg on this one. Really didn't think they'd step up after giving me the 6-day return window from my first RMA.


----------



## pkrexer

So I installed the Fujipoly Ultra Extreme thermal pads on my XSPC Razor water block last night and got good results. Looks like it dropped my VRM1 temps by about 20C across the board. I was hitting 66-69C after a run of Valley @ 1250/1500, +150mV; now with the pads I'm seeing around 47C. Very happy!

Took me a lot longer to install, though, because a new rotary fitting I installed started leaking *sigh* Luckily I caught it right away, but it caused me to have to drain my loop all over again... what a pain. I will be doing further testing tonight


----------



## Roboyto

Quote:


> Originally Posted by *pkrexer*
> 
> So I installed the Fujipoly Ultra Extreme thermal pads on my XSPC Razor water block last night and got good results. Looks like it dropped my VRM 1 temps by about 20c across the board. I was hitting 66-69c after a run of Valley @ 1250 / 1500, +150mv, now with the pads I'm seeing around 47c. Very happy!
> 
> Took me lot longer to install though because my new rotary fitting I installed started leaking *sigh* luckily I caught it right away, but it caused me to have to drain my loop all over again... what a pain. I will be doing further testing tonight


Another successful upgrade!









For all the testing I've done since the upgrade, VRM1 is usually within ~5C of the core, depending on clocks/volts.

I needed to change my reservoir because mine was all cracked and jacked up; check my build log if you want to see a res on the brink of disaster. Installed the pump in the new res, tied everything up, and started filling. I had never had a leak at the res before, so I was being cocky and didn't lay down any towels or anything... BIG mistake lol







Water ran down all over my front fans, onto the GPU and in between the backplate, then continued rolling down along the GPU and spilling onto my PSU. Luckily, we don't leak test with the PSU giving power to everything! All ended well, but it was a lesson learned.

I see you have a Raystorm, ever consider letting your CPU run around naked?







http://www.overclock.net/t/1468701/xspc-raystorm-naked-ivy-haswell-mounting-no-additional-hardware

Nice build! Almost bought that same case, but decided on the 350D.


----------



## SimonKaz

Alrighty guys, had some time to reinstall 13.12 and test 14.2. Can you guys compare your results with mine?

Settings:
 

And results:

First, 14.2


13.12


13.12 again after warming the GPU a bit more


----------



## CU4TLAN

Quote:


> Originally Posted by *SimonKaz*
> 
> Alrighty guys, had some time to reinstall 13.12 and test 14.2. Can you guys compare your results with mine?
> 
> Settings:
> 
> 
> And results:
> 
> First, 14.2
> 
> 
> 13.12
> 
> 
> 13.12 again after warming the GPU a bit more


Within margin of error.


----------



## phallacy

Quote:


> Originally Posted by *Roboyto*
> 
> Choose AB or Trixx. Don't use both.
> 
> I use Trixx personally. Disable the apply on boot setting, but let Trixx load with Windows. If you need to overclock it's not hard to enable a saved profile. Make sure you disable ULPS and force constant voltage, these will help with stability and to keep frame rates consistent.
> 
> Get rid of those 14.X drivers, they have been nothing but problems. Make sure you scrub with DDU. http://www.guru3d.com/files_details/display_driver_uninstaller_download.html
> 
> You don't _*NEED*_ Mantle to play BF4 with one of these cards.
> 
> Disable AMD Overdrive.


Interesting, has forcing constant voltage helped with your OC stability? I always left it unchecked but this has piqued my interest as maybe something I have been overlooking this whole time


----------



## Roboyto

Quote:


> Originally Posted by *phallacy*
> 
> Interesting, has forcing constant voltage helped with your OC stability? I always left it unchecked but this has piqued my interest as maybe something I have been overlooking this whole time


Absolutely! I've always used Trixx since my HD4890 and have been using that function for as long as I can remember.


----------



## phallacy

Quote:


> Originally Posted by *Roboyto*
> 
> Absolutely! I've always used Trixx since my HD4890 and have been using that function for as long as I can remember.


Great, I will have to try it very soon. On the list for this weekend.

-Add waterblock to my 290x #3 and test OC
-Test 290x #4 at stock and OC limits
-Replace XSPC stock thermal pads with fujipoly ultra extreme
-Try out 14.2
and now -Try constant voltage for OC


----------



## Prozillah

My list for the evening - install a new silverstone g evo 1200w psu as replacement to my fsp aurum 1000w and hope and pray theres enough power there to run both my cpu and gpu overclocks properly!!


----------



## Paul17041993

with a lot of 290/X crossfire setups being CPU-bound, has anyone actually tested a setup with i5s, i7s and an FX-8350 to see the differences? I'm very curious if having 8 "dedicated" cores over hyperthreading really helps at all or if FX suffers the same bottleneck...

edit; oh not including the Unigine benches though, those are single-thread...


----------



## Csokis

Sapphire to launch Radeon R9 290X VAPOR-X with 8GB RAM


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Paul17041993*
> 
> with a lot of 290/X crossfire setups being CPU-bound, has anyone actually tested a setup with i5s, i7s and an FX-8350 to see the differences? I'm very curious if having 8 "dedicated" cores over hyperthreading really helps at all or if FX suffers the same bottleneck...
> 
> edit; oh not including the Unigine benches though, those are single-thread...


In BF3 and BF4 on my old 2600k triple-7970 rig, I swapped over to an 8350 for about a month. I lost fps everywhere, in every game, but my averages definitely went down quite a bit in the BF series. I got more fps with two 7970s and a 2600k @ 4.2GHz than with three 7970s and my 8350 @ 4.8GHz lol.


----------



## Roboyto

Quote:


> Originally Posted by *Paul17041993*
> 
> with a lot of 290/X crossfire setups being CPU-bound, has anyone actually tested a setup with i5s, i7s and an FX-8350 to see the differences? I'm very curious if having 8 "dedicated" cores over hyperthreading really helps at all or if FX suffers the same bottleneck...
> 
> edit; oh not including the Unigine benches though, those are single-thread...


The 8 'whole' cores claim is up for debate; technically they are 4 modules, each with 2 Bulldozer cores. They work in a similar fashion to hyper-threading by sharing cache.

Mantle is supposed to boost performance for AMD's Bulldozer cores, utilizing up to 16 cores if I remember correctly from the Mantle Hawaii press release.


----------



## Prozillah

Quote:


> Originally Posted by *Csokis*
> 
> Sapphire to launch Radeon R9 290X VAPOR-X with 8GB RAM


What on earth do you need 8gb vram for?


----------



## Roboyto

Quote:


> Originally Posted by *Prozillah*
> 
> What on earth do you need 8gb vram for?


4K UHD 5-monitor Eyefinity...is that possible?


----------



## pkrexer

Quote:


> Originally Posted by *Roboyto*
> 
> Another successful upgrade!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For all my testing I've done since the upgrade, VRM1 is usually within ~5C of the core depending on clocks/volts.
> 
> I needed to change my reservoir because my was all cracked and jackedup; check my build log if you want to see a res on the brink of disaster. Installed pump in new res, tied everything up, and started filling. I never had a leak at the res before so I was being cocky, and didn't lay down any towels or anything...BIG mistake lol
> 
> 
> 
> 
> 
> 
> 
> Water ran down all over my front fans, onto the GPU and in between the backplate, then continued rolling down along the GPU and spilling onto my PSU. Luckily, we don't leak test with the PSU giving power to everything! All ended well, but it was a lesson learned.
> 
> I see you have a Raystorm, ever consider letting your CPU run around naked?
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1468701/xspc-raystorm-naked-ivy-haswell-mounting-no-additional-hardware
> 
> Nice build! Almost bought that same case, but decided on the 350D.


I'm delidded already. I've heard the temp improvement isn't huge by running naked... not sure if I'd like to take the chance of cracking my core


----------



## chronicfx

Quote:


> Originally Posted by *Roboyto*
> 
> 4K UHD 5-monitor Eyefinity...is that possible?


On my budget... No


----------



## Jack Mac

I know the 290s are beasts, especially at higher resolutions, but I highly doubt we'll be seeing eyefinity 4K, even with CF 290s.


----------



## Roboyto

Quote:


> Originally Posted by *CU4TLAN*
> 
> None of these games are very CPU intensive at all, and even IF some are, they're single-threaded so i5 vs i7 wouldn't matter much. Compare an i5 to a 3930k or something in BF4.


Quote:


> Originally Posted by *CU4TLAN*
> 
> Within margin of error.


Your replies are curt and not very helpful, sir.

I found a review comparing all of these CPUs for Battlefield 4: 4770k, 4670k, i3 4340, FX-8350, FX-6300, and FX-4300. It's from back in November, so things could have been updated or optimized more by now, but it's the best I could find.

http://www.hardwarepal.com/battlefield-4-benchmark-mp-cpu-gpu-w7-vs-w8-1/

4670k didn't bottleneck for BF4 with a 7970 at 2560x1440, frames are near identical to 4770k.

*What they concluded was that BF4 is optimized for 4 cores.*



If you have 4 cores, BF4 will use all of them at 90-95%. The FX-6300 ran at an average of 66% load, i.e. the equivalent of 4 cores. The 4770k ran at 45% and the FX-8350 at 52%, i.e. 4 cores again.

Most surprising to me is the performance capabilities of the i3 and FX-4300!
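The core-count reading above can be sanity-checked with quick arithmetic — a rough sketch, assuming the average total load is spread evenly across hardware threads (the function name is mine, not from the review):

```python
# Average total-CPU load over N hardware threads implies roughly
# load% * N / 100 fully busy threads. Load figures are the ones
# quoted from the hardwarepal review above.

def busy_threads(avg_load_pct: float, hw_threads: int) -> float:
    """Estimate equivalent fully-loaded threads from average total load."""
    return avg_load_pct / 100 * hw_threads

print(busy_threads(66, 6))   # FX-6300 (6 cores): ~3.96, about 4 cores
print(busy_threads(52, 8))   # FX-8350 (8 cores): ~4.16, about 4 cores
print(busy_threads(45, 8))   # 4770k (4C/8T): ~3.6 busy threads
```

It lines up with the review's conclusion: whatever the thread count, roughly four threads end up doing the work.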


----------



## sugarhell

Averages mean nothing on their own. You might have a 90% average, but think about what happens when a tank rolls in or some grenades go off. Yeah, you got it. If you want to compare CPUs, you compare minimums.

By the way, that review is trash.


----------



## Roboyto

Quote:


> Originally Posted by *pkrexer*
> 
> I'm delidded already. I've heard the temp improvement isn't huge by running naked... not sure if I'd like to take the chance of cracking my core


If you do the math on the measurements, which I already did, you won't crack it.







How do these temps compare to yours?

4.5GHz 1.259V - 2400MHz RAM 1.65V IBT low-mid 70s, Graphics benching low-mid 50s, gaming is mid 40s

I have a NewEgg eggxpert review to do for a gigabyte z87 board...so my 4770k must come out as I don't have any other 1150 chips in my house. I'll be putting the lid back on under water after that happens to see what the differences are.


----------



## VSG

EK blocks for the Asus 290/290x DCU II


Spoiler: Warning: Spoiler!


----------



## taem

Quote:


> Originally Posted by *SimonKaz*
> 
> Alrighty guys, had some time to reinstall 13.12 and test 14.2. Can you guys compare your results with mine?


What clocks? Stock?


----------



## Paul17041993

Quote:


> Originally Posted by *Roboyto*
> 
> The 8 'whole' cores is up for debate, technically they are 4 modules each with 2 bulldozer cores. They work in a similar fashion to hyper-threading by sharing cache.
> 
> Mantle is supposed to boost performance for AMD's bulldozer cores, utilizing up to 16 cores, if I remember correctly, from the Mantle Hawaii press release.


Each pair of "cores" shares certain front-end resources and cache; the difference compared to hyper-threading, though, is that they are individual execution units. HT is the inverse: single units with split front ends.

So HT gives great single-thread performance with little multi-thread gain, while "bulldozer" gives great multi-thread scaling and reduces the load stacked on each core simply by having more of them.


----------



## pkrexer

Quote:


> Originally Posted by *Roboyto*
> 
> If you do the math for the measurements, which I already did, you won't crack it
> 
> 
> 
> 
> 
> 
> 
> How do these temps compare to yours?
> 4.5GHz 1.259V - 2400MHz RAM 1.65V IBT low-mid 70s, Graphics benching low-mid 50s, gaming is mid 40s
> 
> I have a NewEgg eggxpert review to do for a gigabyte z87 board...so my 4770k must come out as I don't have any other 1150 chips in my house. I'll be putting the lid back on under water after that happens to see what the differences are.


Delidded with CLU between the core & spreader, I get really good temps: 4.6GHz @ 1.4 Vcore, and I'm seeing 69c on the hottest core after a 10-pass run of IBT.


----------



## Roboyto

Quote:


> Originally Posted by *Paul17041993*
> 
> each "core" shares certain inputs and cache in pairs, difference compared to hyperthreading though, is they are individual units. HT is the inverse, single units with split intakes.
> 
> so HT gives great single-thread perf., little multi-thread gain, but "bulldozer" gives great multi-thread and reduces stack packing on each core just by having more of them.


That's why I said similar fashion to HT. I hope Mantle and other programs/programmers take advantage of the multi-thread possibilities, as these chips still lag behind the i7s.


----------



## MrWhiteRX7

I take back what I said about Mantle and 14.2... tested again and got it to not hit the vram cap by only turning down MSAA from 4x to 2x and it's amazing









1440p, 90 fov, Ultra errrthang, high post, hbao, msaa 2x (because of vram/mem leak in mantle)

Paracel Storm

Frame Time Avg    CPU Frame Avg    GPU Frame Avg
173.382 FPS       173.299 FPS      61.523 FPS

Max FPS           Max FPS (CPU)    Max FPS (GPU)
263.158 FPS       200.803 FPS      113.379 FPS

Min FPS           Min FPS (CPU)    Min FPS (GPU)
62.189 FPS        41.118 FPS       39.588 FPS

Time Spent:       FPS %:    FPS % (CPU):    FPS % (GPU):
Above 200 FPS     10.64 %   34.01 %         0 %
Above 144 FPS     89.96 %   91.11 %         0 %
Above 120 FPS     99.42 %   97.77 %         0 %
Above 100 FPS     99.75 %   99.2 %          0.27 %
Above 90 FPS      99.9 %    99.55 %         0.68 %
Above 60 FPS      100 %     99.88 %         70.2 %
Above 45 FPS      100 %     99.99 %         97.17 %
Above 30 FPS      100 %     100 %           100 %
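For anyone curious how a "time spent above X FPS" table like this is produced, here's a minimal sketch of the calculation from a frame-time log. The helper and the sample frame times are hypothetical, not from this run, and real overlays may weight by frame count rather than by time:

```python
# Each frame's duration (ms) is logged; a frame counts as "above X FPS"
# when it finished faster than that FPS target's frame-time budget.
# Weighting by duration gives the share of wall-clock time spent there.

def pct_time_above(frame_times_ms, fps_threshold):
    """Percent of total time spent in frames faster than the FPS threshold."""
    limit_ms = 1000.0 / fps_threshold          # frame-time budget for that FPS
    fast = sum(t for t in frame_times_ms if t < limit_ms)
    return 100.0 * fast / sum(frame_times_ms)

sample = [5.0, 6.0, 7.0, 20.0]                 # ms per frame (made-up data)
print(round(pct_time_above(sample, 60), 1))    # share of time under 16.67 ms
```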


----------



## Roboyto

Quote:


> Originally Posted by *pkrexer*
> 
> Delidded with clu between the core & spreader, I get really good temps. 4.6GHz @ 1.4 Vcore ... I'm seeing 69c on the hottest core after a 10 pass run of IBT


I'll have to try with fans on something aside from silent, 870 RPM, and see where my temps are at. Not using CLU so that will make a little difference...but your higher Vcore probably negates the difference in TIM.


----------



## Roy360

Hey, does anyone know how the warranty on an AMD internal sample works? I found a seller selling his R9 290 for $525, but he doesn't have a receipt because it's an internal sample. Apparently all I need is the card itself with the serial number sticker intact. Can anyone confirm this?


----------



## VSG

Are internal samples even meant to be sold at all? You may want to check on that. Who will be honoring the warranty? Send a tweet to Roy if you are on Twitter- he is usually good on responding to stuff like this quickly.


----------



## MrWhiteRX7

Yea that sounds sketch


----------



## tsm106

Quote:


> Originally Posted by *Roy360*
> 
> hey, does anyone know how the warranty on an AMD internal sample works? I found a seller selling his R9 290 for 525. But he doesn't have a receipt because it's an internal sample. Apparently all I need is the card itself with the serial number stick intact. Can anyone confirm this?


Don't buy it. They are usually gimped; they are not the same quality as retail and are only meant for development.


----------



## Hogesyx

Quote:


> Originally Posted by *Roy360*
> 
> hey, does anyone know how the warranty on an AMD internal sample works? I found a seller selling his R9 290 for 525. But he doesn't have a receipt because it's an internal sample. Apparently all I need is the card itself with the serial number stick intact. Can anyone confirm this?


IANAL, but from what I understand, ownership of a "sample" normally remains with the original manufacturer, and the recipient does not have the right to resell it. That is why most samples carry a big "NOT FOR RESALE" label. I would advise against buying test or prototype samples from any third party.


----------



## taem

So Amazon just notified me that my Sapphire 290 Tri-X will be delivered March 3. Price is $499. I can't decide whether to cancel this order. I have a Powercolor PCS+ 290 right now and I love it. This is imho the best cooler ever. I have never seen temps this good on an air cooled high end gpu. I know the Tri-X is highly rated but I strongly doubt it performs like this card. Check out an evening of Far Cry 3 at 1170 core 1500 mem:



Temps below 70c across the board at 1170/1500 overclock, ambient ~22. Tell me that's not fabulous for an air cooled 290. Will the Tri-X get me cooling like this? I have a hard time believing that. I've used Sapphires before. They do great at stock clocks, amazing if you undervolt, but the moment you overclock to any degree vrm1 hits high 80s.

So I'm debating whether to cancel this order and get another PCS+.

One advantage of the Tri-X is that it's 38mm, vs the 52mm of the PCS+. So more space between cards for airflow if I put the Tri-X in the upper slot. The PCS+ has gone up in price so it's $550 now. But in addition to the cooling it's got a metal shroud and backplate, vs the Tri-X plastic shroud and no backplate. The Tri-X is 1022g. I worry about sag.

Will I run into any problems if I use different brand cards in xfire? I know it's possible, but you guys all seem to use the same brand card in your xfire setups.

Gah dunno what to do. Have to decide pretty soon obviously, it's going to ship Friday I think. What would you do? Tri-X for $499, or cancel it and pay $50 extra for another PCS+?


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> So Amazon just notified me that my Sapphire 290 Tri-X will be delivered March 3. Price is $499. I can't decide whether to cancel this order. I have a Powercolor PCS+ 290 right now and I love it. This is imho the best cooler ever. I have never seen temps this good on an air cooled high end gpu. I know the Tri-X is highly rated but I strongly doubt it performs like this card. Check out an evening of Far Cry 3 at 1170 core 1500 mem:
> 
> 
> 
> Temps below 70c across the board at 1170/1500 overclock, ambient ~22. Tell me that's not fabulous for an air cooled 290. Will the Tri-X get me cooling like this? I have a hard time believing that. I've used Sapphires before. They do great at stock clocks, amazing if you undervolt, but the moment you overclock to any degree vrm1 hits high 80s.
> 
> So I'm debating whether to cancel this order and get another PCS+.
> 
> One advantage of the Tri-X is that it's 38mm, vs the 52mm of the PCS+. So more space between cards for airflow if I put the Tri-X in the upper slot. The PCS+ has gone up in price so it's $550 now. But in addition to the cooling it's got a metal shroud and backplate, vs the Tri-X plastic shroud and no backplate. The Tri-X is 1022g. I worry about sag.
> 
> Will I run into any problems if I use different brand cards in xfire? I know it's possible, but you guys all seem to use the same brand card in your xfire setups.
> 
> Gah dunno what to do. Have to decide pretty soon obviously, it's going to ship Friday I think. What would you do? Tri-X for $499, or cancel it and pay $50 extra for another PCS+?


Go with your gut feeling...


----------



## SimonKaz

Quote:


> Originally Posted by *taem*
> 
> What clocks? Stock?


Yes, it's all stock.


----------



## Connolly

So the new 14.2 drivers seem really nice for Battlefield 4, terrible for 3 though. Anyone else finding that? Just logged in to do some dogfights and it's close to unplayable.


----------



## Imprezzion

Quote:


> Originally Posted by *Connolly*
> 
> So the new 14.2 drivers seem really nice for Battlefield 4, terrible for 3 though. Anyone else finding that? Just logged in to do some dogfights and it's close to unplayable.


Both 3 and 4 run amazing on 14.2 here. No issues with 3 at all. I played a bit of Metro on 3 to compare it to Metro on 4 and had no issues. Also, I miss Grand Bazaar as it was my favorite map, so I played that as well, no issues.


----------



## rdr09

I wish all my games had Mantle as a choice so I could keep my i7 at stock with HT off. I might be able to go down to 4GHz from 4.5 and lower the vcore, which will really help come warmer weather.


----------



## kizwan

I just received my Fujipoly Extreme pad today. Can't do anything yet because I still need to order TIM.


----------



## NapalmV5

Feel like such a loser, always on the losing end.









Overpaid by $320, since I got 4x DCII OC @ $730.

They lowered the price just hours after I placed the order.

What are the chances of getting credited the $320?

http://www.newegg.com/Product/Product.aspx?Item=N82E16814121840


----------



## rdr09

Quote:


> Originally Posted by *NapalmV5*
> 
> feel like such a loser always on the losing end
> 
> 
> 
> 
> 
> 
> 
> 
> 
> overpaid by 320$ since i got 4x dcIIoc @ 730$
> 
> they lowered the price just hours after i placed order
> 
> what are the chances of getting credited the 320$ ?
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814121840


call them. if they decline - return them or cancel. provantage accepted my request when that happened to me. gl!


----------



## ReXtN

Quote:


> Originally Posted by *NapalmV5*
> 
> feel like such a loser always on the losing end
> 
> 
> 
> 
> 
> 
> 
> 
> 
> overpaid by 320$ since i got 4x dcIIoc @ 730$
> 
> they lowered the price just hours after i placed order
> 
> what are the chances of getting credited the 320$ ?
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814121840


Return the 4 cards, and then order 4 new ones


----------



## NapalmV5

Quote:


> Originally Posted by *rdr09*
> 
> call them. if they decline - return them or cancel. provantage accepted my request when that happened to me. gl!


thanks!
Quote:


> Originally Posted by *ReXtN*
> 
> Return the 4 cards, and then order 4 new ones


I'll call as soon as they open, but my concern is their sadistic policy: "Return Policy: VGA Replacement Only Return Policy"

I can only hope I'll get the right rep to talk to.


----------



## Roy360

Quote:


> Originally Posted by *rdr09*
> 
> call them. if they decline - return them or cancel. provantage accepted my request when that happened to me. gl!


Price match with your credit card. Most credit cards have a 3-month price protection policy; at least my World and major-bank MasterCards do.


----------



## kpoeticg

That can't be right. So you can buy a 290x at any point in time, and over the next 3 months you'll receive a refund for the difference of the lowest price you can find???

You could make a career out of that LOLLLL


----------



## Imprezzion

I just got a whole new issue with my card...

I cannot run PCI-E 3.0 speeds with it at all in slot #1. Slot #2 works fine on 3.0, but only at x8.

When left on Auto in the BIOS, the card only runs at x16 1.1 when testing in GPU-Z.
When I put it on Gen3, the PC boots, Windows starts, I hear the startup sound, but the screen stays on stand-by. Card does nothing.
When I put it on Gen2, it works just fine and GPU-Z also says x16 2.0 under load.
When I use slot #2 and force that to Gen3, it works just fine and GPU-Z confirms x8 3.0. The only downside is that this slot is just x8, so the top slot @ x16 2.0 is just as fast and cools better.

The heck is this issue? My CPU is a 3770K and board is a P8Z77-V Pro with the latest BIOS. Card runs a stock XFX R9 290X BIOS.

Is the top slot damaged somehow? Weird thing is, my old GTX 780s all ran just fine at x16 3.0 in the same slot...

There's only one thing left to try. Switching PCI-E Option ROM but I doubt it'll work since slot 2 works fine on 3.0.

EDIT: Option ROM had no effect. Running forced 2.0 now; performance was indeed lower with the Auto (1.1) setting, so the slot really was on Gen 1.1. Valley FPS has gone up quite a lot running 2.0.

Should I give up on this motherboard? Is it defective?


----------



## ebduncan

Quote:


> Originally Posted by *Imprezzion*
> 
> I just got a whole new issue with my card...
> 
> I cannot run PCI-E 3.0 speeds with it at all in slot #1. SLot #2 works fine on 3.0, but only x8.
> 
> When left on Auto in the BIOS it seems the card only runs at x16 1.1 when testing in GPU-Z.
> When I put it on Gen3 the PC boots, windows starts, i hear the startup sound but screen stays on stand-by. Card does nothing.
> When I put it on Gen2 it works just fine and GPU-Z also sais x16 2.0 under load.
> When I use slot #2 and force that to Gen 3 it works just fine and GPU-Z confirms x8 3.0. Only downsidfe is that this slot is just x8 so the top slot @ x16 2.0 is just as fast and cools better.
> 
> The heck is this issue? My CPU is a 3770K and board is a P8Z77-V Pro with the latest BIOS. Card runs a stock XFX R9 290X BIOS.
> 
> Is the top slot damaged somehow? Wierd thing is, my old GTX780's all ran just fine on x16 3.0 on the same slot.........
> 
> There's only one thing left to try. Switching PCI-E Option ROM but I doubt it'll work since slot 2 works fine on 3.0.
> 
> EDIT: Option ROM had no effect. Running forced 2.0 now. performance was indeed less with the Auto (1.1) settings so the slot really was on Gen 1.1. Valley FPS has gone up quite a lot running 2.0.
> 
> Should I give up on this motherboard? Is it defective?


Interesting. I remember there being a bug in GPU-Z that was fixed in one of the later updates. There was also a note in the driver release notes about improper PCI-E bus settings when using CrossFire (13.12 release notes).

For practical purposes, there is zero performance difference between PCI-E 2.0 and 3.0 for a single card, so as long as you're running at PCI-E 2.0 x16 you will not see any performance loss. I don't know if you have another computer around that you can put the card into to see if you get the same results, but that would rule out either the graphics card or the motherboard.
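The "x16 2.0 is about as fast as x8 3.0" point falls out of the per-lane link math; here's a rough sketch using the standard PCIe transfer rates and line encodings (figures from the PCIe spec, not from this thread):

```python
# PCIe 2.0 runs at 5 GT/s with 8b/10b encoding (20% overhead);
# PCIe 3.0 runs at 8 GT/s with 128b/130b encoding (~1.5% overhead).
# 1 GT/s carries 1 raw Gb/s per lane before encoding overhead.

def lane_gbps(gt_per_s, payload_bits, total_bits):
    """Usable GB/s per lane after line-code overhead."""
    return gt_per_s * payload_bits / total_bits / 8  # bits -> bytes

pcie2 = lane_gbps(5.0, 8, 10)      # PCIe 2.0: 0.5 GB/s per lane
pcie3 = lane_gbps(8.0, 128, 130)   # PCIe 3.0: ~0.985 GB/s per lane

print(round(16 * pcie2, 2))  # x16 2.0: 8.0 GB/s
print(round(8 * pcie3, 2))   # x8  3.0: ~7.88 GB/s
```

So an x16 2.0 link and an x8 3.0 link offer nearly identical bandwidth, both well beyond what a single 290/290X saturates in games.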


----------



## Imprezzion

Quote:


> Originally Posted by *ebduncan*
> 
> interesting. I remember there being a bug in gpu z that was fixed with the one of the later updates. There was also a note in the driver release notes about improper pci-e bus setting when using crossfire (13.12 release notes).
> 
> You should know there is ZERO performance difference between pci 2.0 and pci 3.0. So as long as your running at PCI-E 2.0 x16 you will not have any performance loss. I don't know if you have a another computer around that you can put the card into to see if you get the same results, but that would rule out the graphics card or the motherboard.


Plenty of other rigs here, but none with PCI-E 3.0 capability. All either AMD or Sandy based.


----------



## ebduncan

Quote:


> Originally Posted by *Imprezzion*
> 
> Plenty of other rigs here but none with PCI-E 3.0 capability. All either AMD or Sandy based


Ah OK, that would be the easiest way to figure out the root of the problem.

I noticed your card has a flashed BIOS; have you tried putting it back on the stock BIOS?


----------



## Imprezzion

Quote:


> Originally Posted by *ebduncan*
> 
> ahh ok. Would be the easiest way to figure out the root of the problem.
> 
> I noticed your card has a flashed bios, have you tried putting it back on the stock bios?


Yep. BIOS switch eh









But this BIOS is practically stock. The card's an XFX 290 Core Edition running a 290X Core Edition BIOS for the shader unlock.


----------



## Connolly

Quote:


> Originally Posted by *Imprezzion*
> 
> Both 3 and 4 run amazing on 14.2 here. No issues with 3 at all. I played a bit of Metro on 3 to compare it to Metro on 4 and had no issues. Also, i miss grand bazaar as it was my favorite map so olayed that as well, no issues.


Thanks for the info, I'll do some more testing later and see if I can get to the bottom of it


----------



## Roy360

Quote:


> Originally Posted by *kpoeticg*
> 
> That can't be right. So you can buy a 290x at any point in time, and over the next 3 months you'll receive a refund for the difference of the lowest price you can find???
> 
> You could make a career out of that LOLLLL


PRICE PROTECTION

"Should you find a lower regular price for a new item within 60 days from the date of purchase using your eligible MasterCard card, you may be reimbursed for the price difference."

http://www.mastercard.ca/card-benefits.html

Certain MasterCards will extend that to 3 months.

EDIT: they have a few other useful protections:
SATISFACTION GUARANTEE: If you become dissatisfied with a product you purchase using your eligible MasterCard card within 60 days of purchase, and the store will not accept a return, you may be eligible for a refund for the cost of the product up to $250.

EXTENDED WARRANTY: Doubles the original manufacturer's or store brand warranty for up to one year when you pay with your eligible MasterCard card.

Or maybe Americans don't have this protection, but I doubt it. Canada is usually the one that gets the short end of the stick.

Also, has anyone tried the ASUS vBIOS update?


----------



## ebduncan

Quote:


> Originally Posted by *Roy360*
> 
> PRICE PROTECTION
> 
> Should you find a lower regular price for a new item within 60 days from the date of purchase using your eligible MasterCard card, you may be reimbursed for the price difference"
> 
> http://www.mastercard.ca/card-benefits.html
> 
> certain mastercards will extend that to 3 months
> 
> EDIT: they have a few other useful protections:
> SATISFACTION GUARANTEE: If you become dissatisfied with a product you purchase using your eligible MasterCard card within 60 days of purchase, and the store will not accept a return, you may be eligible for a refund for the cost of the product up to $250.
> 
> EXTENDED WARRANTY: Doubles the original manufacturer's or store brand warranty for up to one year when you pay with your eligible MasterCard card.
> 
> Or maybe American's don't have this protection, but I doubt it. Canada is usually the one that gets the short end of the stick
> 
> Also has anyone tried the ASUS vbios update?


I don't use mastercard, but Visa, and its pretty much the same thing depending on who is providing your card.


----------



## b0x3d

Quote:


> Originally Posted by *b0x3d*
> 
> I bought 2x GB WF 290x from Scan. An upgrade from 2x 6970s. When I opened one of them I heard a rattling noise inside and when I turned the card over a screw fell out! I can't actually see anything different from the two cards. Apart from when running on idle, the fans wobble ever so slightly on one of them and I thought maybe the screw came from the fan housing - but I think this is normal?
> 
> The temps / fan speeds / noise are almost identical. It's not obvious where the screw came from. It's quite big, about 5mm long with a Phillips dome-shaped (as opposed to flat) head. Any ideas?
> 
> Could it just be a random screw that found its way in there during manufacturing or packaging? Or maybe I got a card someone returned and they accidentally left a screw in there?
> 
> Bit of a mystery!


Quote:


> Originally Posted by *Roboyto*
> 
> Pic please! Someone with a gigabyte card who has disassembled them may be able to identify the screw for you


Here's a pic of the screw - any ideas where it came from??


----------



## battleaxe

I've taken my 290 apart several times. That doesn't look familiar.

Second 290 in the mail. Should be here on Monday. I'll update at that time to 2 (290's)

Yay....


----------



## keikei

Hello,

I know there's new CrossFire tech with the new 290(X) that doesn't need the connection bridges. Is there anything I need to know other than enabling it in CCC?


----------



## taem

Going to be setting up 290 xfire over the weekend, first time running dual gpus ever for me. I have some questions.

So to install, since I have 13.12 installed already for my current 290, I can just remove my current card and put the new card in, right? And then after testing put the older one in?

Does the top card have to be the primary gpu or master or whatever its called? Does the primary gpu run hotter? I know the card in the upper slot will have higher temps, so should I set the lower card as master, if I have that option?

Do you have to have CCC to control xfire?

13.12 is fine for 290 xfire if I don't need mantle right? No additional improvement for xfire in 14.2 if I'm not running mantle?

Also I'm getting this odd glitch all of a sudden:



A ghosted outline of Trixx on my primary display. (Trixx is set to appear on my secondary display.) Just started happening yesterday. Any clue what this might be?


----------



## Roy360

Quote:


> Originally Posted by *b0x3d*


Here's a pic of the screw - any ideas where it came from??



If that's a reference card, check the back of the PCB. If the screws there are all present, it's probably from the shroud, which you can ignore, or less likely the PCI bracket.


----------



## phallacy

Quote:


> Originally Posted by *taem*
> 
> Going to be setting up 290 xfire over the weekend, first time running dual gpus ever for me. I have some questions.
> 
> So to install, since I have 13.12 installed already for my current 290, I can just remove my current card and put the new card in, right? And then after testing put the older one in?
> 
> Does the top card have to be the primary gpu or master or whatever its called? Does the primary gpu run hotter? I know the card in the upper slot will have higher temps, so should I set the lower card as master, if I have that option?
> 
> Do you have to have CCC to control xfire?
> 
> 13.12 is fine for 290 xfire if I don't need mantle right? No additional improvement for xfire in 14.2 if I'm not running mantle?
> 
> Also I'm getting this odd glitch all of a sudden:
> 
> 
> 
> A ghosted outline of Trixx on my primary display. (Trixx is set to appear on my secondary display.) Just started happening yesterday. Any clue what this might be?


Yes, you can remove your present card to test the new one, then put them both in when everything checks out. You will need CCC to enable crossfire (it should recognize both cards immediately and then ask you to update the crossfire settings); after that you don't need to use the software, just leave it installed on your system.

The card in the first slot will be the primary card. I have found that putting the better-performing card there gives a slight gain, since it's utilized more than the second; I haven't found any setting that lets you designate master/slave beyond that. Mantle is not required; you will see a big improvement on DX11 alone. Mantle is supposed to offer even more than DX11, but it's very hit or miss right now. 13.12 is very stable for a crossfire setup; I'm running it currently with trifire.

Regarding the Trixx ghosting, it's happened to me on a few occasions. I usually just go into display settings, set a different refresh rate, then revert, and it's gone. I don't really know what causes it, though.


----------



## keikei

Quote:


> Originally Posted by *phallacy*
> 
> Yes you can remove your present card to test your new card and then put them both in when everything checks out. For crossfire you will need CCC to enable crossfire (It should recognize both cards immediately and then will ask you to update the crossfire settings). You don't need CCC after this just don't use the software but leave it on your system. The card in the first slot will be the primary card. I have found that putting the card which can perform the best will give you a slight performance gain since it is utilized more than the second. I haven't found any setting where you can designate which is master/slave in that regard. Mantle is not required you will see a big improvement on DX11. Of course Mantle is supposed to offer even more than DX11 but its very hit or miss right now. 13.12 is very stable for a crossfire set up, running it currently with a trifire setup.
> 
> Regarding the trixx ghosting, it's happened to me on a few occasions, I usually just go into display settings put a different refresh rate then revert and it's gone. Don't really know what is causing it though.


Thanks for the response. May I ask what frame rates you're getting in BF4? I just ordered an i7, as my old i5 is struggling in BF4 at the moment.


----------



## taem

Is it advisable though to uninstall 13.12, seat both cards, and reinstall 13.12? Any benefit to that, or conversely, any downside to simply adding the second card to already installed drivers?
Quote:


> Originally Posted by *phallacy*
> 
> Yes you can remove your present card to test your new card and then put them both in when everything checks out. For crossfire you will need CCC to enable crossfire (It should recognize both cards immediately and then will ask you to update the crossfire settings). You don't need CCC after this just don't use the software but leave it on your system.


Probably a stupid question, but that means enabled, right? I like to disable it because it loads after Trixx and resets the clocks. To keep CCC enabled (but with its OC features off) I need to set Trixx on a delay in Task Scheduler, which I don't think is causing any problems, but my philosophy is to tinker as little as possible.
Quote:


> The card in the first slot will be the primary card. I have found that putting the card which can perform the best will give you a slight performance gain since it is utilized more than the second.


Hmm. I was planning on the Tri-X in the upper slot and the Powercolor PCS+ in the bottom slot, because the Tri-X is 14mm thinner than the PCS+ and will leave more space between the cards. But I wonder if the other way around would be better. How do you test for the better performer? I'm assuming benches will be the same at the same clocks, since they're the same GPU and both reference boards. So how do I know which is the better performer? In terms of overclocking they'll obviously be set to the same clocks.


----------



## b0x3d

Quote:


> Originally Posted by *Roy360*
> 
> If that's a reference card, check the back of the PCB. If all the screws are there, it's probably from the shroud, which you can ignore, or less likely the PCI bracket.


It's not reference - it's the GB WF version


----------



## BradleyW

Quote:


> Originally Posted by *taem*
> 
> Is it advisable though to uninstall 13.12, seat both cards, and reinstall 13.12? *Any benefit to that*, or conversely, any downside to simply adding the second card to already installed drivers?
> Probably a stupid question but that means enabled, right? I like to disable it because it loads after Trixx and resets the clocks. To enable CCC (but have OC features off) I need to set Trixx on a delay in Task Scheduler, which I don't think is causing any problems, but my philosophy is to tinker as little as possible.
> Hmm. I was planning on the Tri-X in the upper slot and the Powercolor PCS+ in the bottom slot because the Tri X is 14mm thinner than the PCS+ and will leave more spacing between cards. But I wonder if the other way around would be better. How do you test for better performer? I'm assuming benches will be the same at same clocks since they're the same gpu and both reference board. So how do I know what is the better performer? In terms of overclocking they obviously set to the same clocks.


Just add the second card. No need to reinstall drivers.


----------



## Roy360

Quote:


> Originally Posted by *b0x3d*
> 
> It's not reference - it's the GB WF version


then I have no clue


----------



## ArchieGriffs

Quote:


> Originally Posted by *taem*
> 
> So Amazon just notified me that my Sapphire 290 Tri-X will be delivered March 3. Price is $499. I can't decide whether to cancel this order. I have a Powercolor PCS+ 290 right now and I love it. This is imho the best cooler ever. I have never seen temps this good on an air cooled high end gpu. I know the Tri-X is highly rated but I strongly doubt it performs like this card. Check out an evening of Far Cry 3 at 1170 core 1500 mem:
> 
> 
> 
> Temps below 70c across the board at 1170/1500 overclock, ambient ~22. Tell me that's not fabulous for an air cooled 290. Will the Tri-X get me cooling like this? I have a hard time believing that. I've used Sapphires before. They do great at stock clocks, amazing if you undervolt, but the moment you overclock to any degree vrm1 hits high 80s.
> 
> So I'm debating whether to cancel this order and get another PCS+.
> 
> One advantage of the Tri-X is that it's 38mm, vs the 52mm of the PCS+. So more space between cards for airflow if I put the Tri-X in the upper slot. The PCS+ has gone up in price so it's $550 now. But in addition to the cooling it's got a metal shroud and backplate, vs the Tri-X plastic shroud and no backplate. The Tri-X is 1022g. I worry about sag.
> 
> Will I run into any problems if I use different brand cards in xfire? I know it's possible, but you guys all seem to use the same brand card in your xfire setups.
> 
> Gah dunno what to do. Have to decide pretty soon obviously, it's going to ship Friday I think. What would you do? Tri-X for $499, or cancel it and pay $50 extra for another PCS+?


Honestly, either one is a really good choice; you can't make a bad decision in this situation. My Tri-X has never hit above 75C, but I've also never seen it go above 40-45% fan speed except when I manually tweak speeds. I can't hear my GPU's fans at 40% at all, even with the case open. At 45% I can hear a slight sound; at 50% I can hear it, but it still isn't loud. The Tri-X is going to be the better overclocker because it has much more headroom to crank up the fans and has Hynix memory, but at stock speeds the PCS+ has absolutely incredible temps. I really like the idle temps of the PCS+ as well.

The most impressive thing about your PCS+ is 67C on VRM1 while overclocked; that is incredible. I've only overclocked a little, because I couldn't get GPU-Z to show my VRM temps. I changed my version of Catalyst, so maybe I can see them now. But even when I overclocked to 1150/1500, I couldn't get the core temp above 76C. I was manually adjusting the fan during that time, so it was higher than 45%, but it still maintained good temps.

If you like the clocks you have right now and don't ever plan on going higher, get the PCS+; its temps beat the Tri-X in every way. If you want better overclocking at slightly higher temps, stick with the Tri-X. If you don't want to deal with cancelling your order or paying an extra $50, then keep the Tri-X; it isn't a bad card at all and shines in different ways than the PCS+.

Ultimately it's your decision. Go with what you feel your system needs; I'm mostly just pointing out some of the things I see the PCS+ beating the Tri-X at and vice versa.


----------



## rdr09

Quote:


> Originally Posted by *BradleyW*
> 
> Just add the second card. No need to reinstall drivers.


^ This is what I did: shut down and installed the second card, booted up, went to CCC, enabled crossfire, and disabled ULPS (I used Trixx for this step).

Did it this way since you already know which card is going where.


----------



## MrWhiteRX7

With as finicky as CCC can be, I've always uninstalled the drivers, added the other card, then reinstalled and gone from there. Better safe than chasing driver issues later.


----------



## disintegratorx

Hey guys,

Does anyone know if there is a way to delay the startup of MSI Afterburner? I can't find it in the services. Could I do it with Task Scheduler or something? The reason is that as soon as my screen starts up it gets a jolt from MSI AB; I know AB is doing it, because without it I don't get that glitch jolt when first starting up to my desktop. Anyone with a good answer gets a rep, and it would be much appreciated.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *disintegratorx*
> 
> Hey guys,
> 
> Does anyone know if there is a way to delay the startup of MSI Afterburner? I can't find it in the services.. Could I do it with the task scheduler or something? Reason being is because as soon as my screen starts up it gets a jolt from MSI AB, and I know its doing that because without it, I don't get that glitch jolt when I'm first starting up to my desktop. Anyone with a good answer I will throw you a rep and it would be much appreciated.


Start-up delayer proggie

http://www.r2.com.au/page/products/dl/startup-delayer/


----------



## taem

Quote:


> Originally Posted by *disintegratorx*
> 
> Hey guys,
> 
> Does anyone know if there is a way to delay the startup of MSI Afterburner? I can't find it in the services.. Could I do it with the task scheduler or something? Reason being is because as soon as my screen starts up it gets a jolt from MSI AB, and I know its doing that because without it, I don't get that glitch jolt when I'm first starting up to my desktop. Anyone with a good answer I will throw you a rep and it would be much appreciated.


Can't answer about AB, but Trixx loads from Task Scheduler, so it's easy to set a delay. That's what I do, since without the delay on Trixx, CCC loads afterwards and overrides the clocks. I also find Trixx superior because of its separate fan curves for each profile. Maybe AB does that too, but I've never been able to find it; it always seems to be one universal fan curve. AB does have hysteresis, which is nice.
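For anyone wanting to do the delayed launch the same way, Task Scheduler's `schtasks` CLI can create the logon task directly. This is just a sketch that prints the command to run in an elevated Windows prompt; the install path, task name, and 30-second delay below are assumptions, so adjust them for your own setup:

```shell
# Hypothetical Trixx install path -- change to wherever your copy lives.
APP='"C:\Program Files\SAPPHIRE TriXX\TriXX.exe"'
# Build a schtasks command: run at logon, delayed 30 seconds (mmmm:ss format).
CMD="schtasks /Create /TN TrixxDelayed /TR $APP /SC ONLOGON /DELAY 0000:30"
# Paste the echoed line into an elevated Windows command prompt.
echo "$CMD"
```

The `/DELAY` switch only works with `/SC ONLOGON`, `ONSTART`, and `ONEVENT` schedules, which is exactly the "load Trixx after CCC" behavior described above.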
Quote:


> Originally Posted by *MrWhiteRX7*
> 
> With as finicky as CCC can be I've always uninstalled drivers, then added the other card, then re-installed and went about the rest. Better safe than possibly chasing driver issues


Now you're making me all paranoid. DDU is pretty easy to use, right? Maybe I should just do that. Though I pretty much always run into problems whenever I uninstall/reinstall video drivers, and I've never tried DDU before. But it sounds like it works well and glitch-free.


----------



## phallacy

Quote:


> Originally Posted by *taem*
> 
> Is it advisable though to uninstall 13.12, seat both cards, and reinstall 13.12? Any benefit to that, or conversely, any downside to simply adding the second card to already installed drivers?
> Probably a stupid question but that means enabled, right? I like to disable it because it loads after Trixx and resets the clocks. To enable CCC (but have OC features off) I need to set Trixx on a delay in Task Scheduler, which I don't think is causing any problems, but my philosophy is to tinker as little as possible.
> Hmm. I was planning on the Tri-X in the upper slot and the Powercolor PCS+ in the bottom slot because the Tri X is 14mm thinner than the PCS+ and will leave more spacing between cards. But I wonder if the other way around would be better. How do you test for better performer? I'm assuming benches will be the same at same clocks since they're the same gpu and both reference board. So how do I know what is the better performer? In terms of overclocking they obviously set to the same clocks.


Correct. If you are running stock, then all the hubbub about finding the best card is moot, because they will perform near identically. However, since this is OCN, find your best OC card by doing bench runs and game testing to see how each performs on its own (remember, aim for stability!). Use the best one (usually the highest average/min/max fps) as your first-slot card. I've noticed an fps gain of about 5-7 going this way.

Agree with others in that there is really no need to do a driver uninstall and reinstall. Just pop that baby in and fire it up.
Quote:


> Originally Posted by *keikei*
> 
> Thanks for the response. What frames are you getting on BF4 may i ask? I just ordered an i7 as my old i5 is struggling on bf4 at the moment.


No problem. I haven't played BF4 since getting the third card, but with 2 cards I was averaging 97-100 fps with the following graphics settings, with a low of 73 and a high of 148. And yes, I was on a 3570k but switched to the 4770k because I had Microcenter credit and the 4770k was $249 at the time. : ) Kinda sorta wish I had gone with the 4930k, but I think I will wait for Haswell-E.

1440p
135% res scale
Everything ultra
87 field of view (forget what it's called in game)
2x MSAA


----------



## disintegratorx

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Start-up delayer proggie
> 
> http://www.r2.com.au/page/products/dl/startup-delayer/


Don't know, dude. It asked me to update my already up-to-date .NET Framework files. I'll look into that one a bit more and get back. Besides, I'm pretty sure there's a way that's more native to the operating system; I usually go as light as I can with helper programs.


----------



## taem

Quote:


> Originally Posted by *ArchieGriffs*
> 
> Honestly, either one is a really good choice, you can't make a bad decision in this situation...


Point became moot because Amazon ambushed me with an early ship lol. So now to go for the PCS+ would mean a restock fee plus the price inflation, a $100 premium, not worth it. Sapphire is probably my favorite gpu brand anyway, though Powercolor has really won me over this round with the 270X Devil and the PCS+ 290. Yeah I don't think I can go wrong, I'm pretty sure these are the two best 290s available right now and at the lowish clocks I'll be running I think temps will be fine across both gpus.
Quote:


> Originally Posted by *phallacy*
> 
> Correct, if you are running stock then all the hubub about finding the best card is moot because they will perform near identically. However since this is OCN, find your best OC card by doing your bench runs and game testing to see how the single performance is (remember aim for stability!). Use the best (usually highest average/min/max fps in that regard) as your first slot card. I've noticed a fps gain of about 5-7 going this way.


Well, OC won't be an issue, since I have an 850W PSU to run 290x2 and a 4670k @ 4.7. The CPU draws ~150W, and a single 290 at stock draws around 200-220W, so I won't be able to overclock much at all. I'm targeting something around 1080/1400, which is what my Powercolor can do at 0 vddc offset. I'm hoping the Tri-X will hit that, fingers crossed; +80 on the core without a vddc offset isn't guaranteed by any means.
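A back-of-the-envelope budget using the figures in that post (they're forum estimates, not measurements, and the fans/drives/mobo figure is my own guess) shows why an 850W unit looks workable at near-stock clocks:

```shell
# Rough load estimate for a 4670k + 290 crossfire build.
CPU=150      # 4670k @ 4.7 under load (figure from the post above)
GPU=220      # one 290 at stock-ish clocks (upper end of 200-220W)
MISC=60      # assumed: fans, drives, motherboard, pump
TOTAL=$(( CPU + 2 * GPU + MISC ))
echo "estimated load: ${TOTAL}W of 850W"
```

That leaves roughly 200W of headroom before the PSU's rating, which is why heavy voltage-bumped overclocks on both cards would be risky on this unit.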


----------



## MrWhiteRX7

Quote:


> Originally Posted by *taem*
> 
> Now you're making me all paranoid. DDU is pretty easy to use right? Maybe I should just do that. Though, I pretty much always run into problems whenever I uninstall/reinstall video drivers. Never tried DDU before. But it sounds like it works well and glitch free.


I've been using DDU ever since I switched to win 8 and then 8.1. I've had zero issues with driver un-installs/ installs.









It's uber simple: unzip the proggie, reboot into safe mode, run it (make sure you check the option to delete the main AMD folder, etc.), let it reboot, install your new drivers, restart, and done.


----------



## disintegratorx

Quote:


> Originally Posted by *taem*
> 
> Can't answer about AB but Trixx loads from Task Scheduler so it's easy to set a delay, which is what I do since CCC loads after without the delay on Trixx and overrides the clocks. I also find Trixx superior because of separate fan curves for each profile. Maybe AB does that too but I've never been able to find it, always just the one universal fan curve. AB does have hysteresis which is nice.
> Now you're making me all paranoid. DDU is pretty easy to use right? Maybe I should just do that. Though, I pretty much always run into problems whenever I uninstall/reinstall video drivers. Never tried DDU before. But it sounds like it works well and glitch free.


Yes, you can set your fan curves in MSI AB, and thanks for your reply, I'll look into it further tomorrow and post whatever I find out about it. +1









Also, yeah, I use DDU. But what I just found out is that after you set a profile with any sort of adjustment, ALWAYS reboot your system and wait for all of your startup programs to load before running a graphics-intensive program, and it should* run as intended. My problem was that I was making adjustments either mid-game or starting the program before all of my startup programs had loaded. Now I've achieved +9 on the PowerPlay feature, with a 6mV increase, doing 1125/1450. Running Battlefield 4 now with supersampling, high-quality textures, and all the best settings, and it's like a whole different game. Just on the threshold of being exactly what I'm after, which is a 100% smooth-running game. These are very interesting times with these new Radeons; I'm quite satisfied with this newest series, and it's only gonna get better!


----------



## Prozillah

Quote:


> Originally Posted by *taem*
> 
> Point became moot because Amazon ambushed me with an early ship lol. So now to go for the PCS+ would mean a restock fee plus the price inflation, a $100 premium, not worth it. Sapphire is probably my favorite gpu brand anyway, though Powercolor has really won me over this round with the 270X Devil and the PCS+ 290. Yeah I don't think I can go wrong, I'm pretty sure these are the two best 290s available right now and at the lowish clocks I'll be running I think temps will be fine across both gpus.
> Well oc won't be an issue since I have an 850w psu to run 290x2 and a 4670k @ 4.7. Cpu draw 150w, a single 290 at stock draws around 200-220. So I won't be able to overclock much at all. Targeting something around 1080/1400 which is what my Powercolor can do at 0 vddc offset. I'm hoping the Tri-X will hit that, fingers crossed, +80 on the core without a vddc offset isn't guaranteed by any means.


Depending on what else you're running, that 850W PSU may not be enough. I had a 1000W which was only enough to clock either the two cards or my CPU; it couldn't do both. However, I sold that and picked up a brand new 1200W Strider Gold, and it's running full overclocks perfectly! The most I've seen it pull from the wall plug is 1527W...


----------



## phallacy

Quote:


> Originally Posted by *Prozillah*
> 
> Depending on what else you're running that 850 psu may not be enough. I had a 1000w which was only enough to clock either the two cards or my cpu - couldn't do both. However I sold that and picked a brand new 1200w strider gold and its running full overclocks perfectly! The most ive seen it pull from the wall plug is 1527w...


That's really damn high! 1500+ watts is something normally only a 4-way config on an LGA2011 platform would pull, and with substantial overclocking at that. What are your overclocks and the voltage(s) used?


----------



## taem

Quote:


> Originally Posted by *Prozillah*
> 
> Depending on what else you're running that 850 psu may not be enough. I had a 1000w which was only enough to clock either the two cards or my cpu - couldn't do both. However I sold that and picked a brand new 1200w strider gold and its running full overclocks perfectly! The most ive seen it pull from the wall plug is 1527w...


Whoa. 1200W for crossfire, even overclocked? That seems really high. I have a wall meter too, a Belkin Conserve Insight, and at 0 vddc offset and/or moderate clocks the system draw is less than 400W, and my 4670k is at 4.7 and pulls a ton of watts. Here is a readout from HWiNFO64 showing the power draw at +81 vddc, 1150/1500 clocks:


And the cpu power draw:


And that's about what my wall meter shows me, just a little shy; the rest goes to fans, drives, and the mobo. At higher clocks than I plan to run in xfire, the GPU draws ~220W. I do have a 1000W PSU in a cart ready to order, but I thought I'd try the 850W first. It seems like it should work.

Quote:


> Originally Posted by *disintegratorx*
> 
> Yes, you can set your fan curves in MSI AB, and thanks for your reply, I'll look into it further tomorrow and post whatever I find out about it.


I know you can set a custom fan curve in AB but can you save separate fan curves per profile? I couldn't find a way to do that. Seemed like AB was just saving one fan curve setting for all profiles.


----------



## Paul17041993

Quote:


> Originally Posted by *b0x3d*
> 
> Here's a pic of the screw - any ideas where it came from??


Looks to be a standard case screw, the kind you usually use on your slots and hard drives, but I doubt the card would use one on the cooler at all...


----------



## quakermaas

Quote:


> Originally Posted by *b0x3d*


Here's a pic of the screw - any ideas where it came from??



Spoiler: Warning: Spoiler!


???


----------



## Roboyto

Quote:


> Originally Posted by *quakermaas*
> 
> ???


Quote:


> Originally Posted by *b0x3d*


Here's a pic of the screw - any ideas where it came from??



I agree with quakermaas; that would be the most likely location for a screw of that size and head type. I tried to take a picture of mine from the underside, but the waterblock is in the way.


----------



## vortex240

Quote:


> Originally Posted by *Prozillah*
> 
> Depending on what else you're running that 850 psu may not be enough. I had a 1000w which was only enough to clock either the two cards or my cpu - couldn't do both. However I sold that and picked a brand new 1200w strider gold and its running full overclocks perfectly! The most ive seen it pull from the wall plug is 1527w...


Haha, how many things do you have plugged into that Kill A Watt?

To put things into perspective: a 4670K at 4.7 / 1.4V, five 3TB HDDs, 1 SSD, a Corsair H80, 5 fans, a Corsair TX750, and an R9 290X @ 1200/1625 +150mV... that draws ~650W at full load. Clearly you are either not using the Kill A Watt properly or there is much more plugged into it than just your PC. And I really hope you are not just making these numbers up, since they absolutely don't make any sense. Cheers.


----------



## nyrmitz

http://www.techpowerup.com/gpuz/4kchw/

Add me Please

Msi R9 290x OC edition Twin Frozr III


----------



## StonedAlex

I'm thinking about selling my 780 ti and buying a couple r9 290's or a 290x and adding another one in a few weeks, since I'd like to start mining. Which cards would run coolest in crossfire? I've never used AMD or multi card setups and I'm not really sure what to expect, is it very buggy? I've been reading lots of contradictory information. Are you guys having any problems with stuttering, etc? Has the black screen issue been fixed? Does AMD have an easy way to overclock monitors? I'm sure all these questions have been beat to death but I really don't want to sift through 1800 pages lol.


----------



## vortex240

Quote:


> Originally Posted by *StonedAlex*
> 
> I'm thinking about selling my 780 ti and buying a couple r9 290's or a 290x and adding another one in a few weeks, since I'd like to start mining. Which cards would run coolest in crossfire? I've never used AMD or multi card setups and I'm not really sure what to expect, is it very buggy? I've been reading lots of contradictory information. Are you guys having any problems with stuttering, etc? Has the black screen issue been fixed? Does AMD have an easy way to overclock monitors? I'm sure all these questions have been beat to death but I really don't want to sift through 1800 pages lol.


I upgraded from a GTX 770 that I sold for $330 to a 290X at $550 - in close to 2 months I already made the money to cover the upgrade, so go for it. You will need good spacing between the cards and great airflow in your case, if you go with open air coolers.


----------



## Roboyto

Quote:


> Originally Posted by *StonedAlex*
> 
> I'm thinking about selling my 780 ti and buying a couple r9 290's or a 290x and adding another one in a few weeks, since I'd like to start mining. Which cards would run coolest in crossfire? I've never used AMD or multi card setups and I'm not really sure what to expect, is it very buggy? I've been reading lots of contradictory information. Are you guys having any problems with stuttering, etc? Has the black screen issue been fixed? Does AMD have an easy way to overclock monitors? I'm sure all these questions have been beat to death but I really don't want to sift through 1800 pages lol.


You'll have to do some research if you want to make an educated decision on whether or not to start mining. To make your sifting a little quicker use the search function for the thread.

For LTC mining specific questions head here: http://www.overclock.net/t/1398250/official-tutorial-how-to-start-mining-litecoins

I can't answer all your questions but I can tell you what I know...

If you're not going to run under water, then it may be hard to keep them cool, especially if they'll be mining. The Powercolor PCS+ has a triple-slot cooler on it and seems to perform the best from what I've seen in this thread, though crossfiring them can turn into an issue because they end up so close together. The Sapphire Tri-X seems to be the next most popular card. Guys with crossfire will likely chime in to tell you their experiences; I know one member was having a hard time keeping ASUS DC2 290s cool in crossfire. I haven't seen much of anything about the HIS IceQ or XFX Double D cards.

People seem to be having good luck with multi-card setups, but they can have their issues on AMD or Nvidia alike; it's more work than a single GPU. The biggest issue right now seems to be the beta drivers, but 13.12 is very solid for everyone.

My card is pushed to the brink for a reference 290, 1295/1700, and it runs smooth as butter. I haven't had any stuttering issues personally, but this is with only 1 card. I can say I don't recall seeing anyone complain about stuttering since I've been active in this thread for the last few weeks.

I do believe the black screen issue was limited to a fairly small batch of cards right after release; I haven't seen many people talking about it lately.

Don't know about OC monitors.

Best of luck to you with your potential upcoming project


----------



## Prozillah

Quote:


> Originally Posted by *vortex240*
> 
> Hahah, how many things do you have plugged into that Killawatt?
> 
> To put things into perspective - 4670K at 4.7 1.4v, 5 3TB HD, 1 SSD, corsaid h80, 5 fans, corsair tx750, R9 290X @ 1200, 1625 +150mv ...now that draws ~650watts with full load. Cearly you are either not using the killwatt properly or there is much more plugged into it then just your pc. And I really hope you are not just making these numbers up. Since they absolutely don't make any sense. cheers


Currently plugged in:

2x 140mm fans
8x 120mm fans (some LED lighting)
1x 200mm fan
1x H110 water pump
2x Gigabyte 290 Windforces xfire 1150 / 1400 (50% PL + 50mv)
1x 4670k @ 4.5ghz 1.325v
2x 4GB G Skill Trident X 2400mhz
5x USB devices Razor LED mech Keyboard etc...
1x DVDRW drive
2x Storage HDD
1x SSD

So yeah, I am only reading the numbers on my wall meter, and the largest I've spotted playing BF4 in a heavy 64-player map battle was 1537W draw from the plug. Also, my tests previously indicated that with a 1000W PSU I was able to OC EITHER my GPUs or my CPU, not both. Both were at 100% in 2-hour sessions of BF4 under the same conditions, same drivers, etc. As soon as I would run both together, I would spontaneously restart within 5-10 minutes.

Since installing my new 1200W (peaking at 1300W under heavy loads...), I have replicated the same test - a 2-hour BF4 64-player map session with both OCs in place, running 100% stable. Same drivers, etc.

You tell me if you would draw a different conclusion?

EDIT: To add - I do have to run all my case fans at 100% to create strong negative pressure to pull all the hot air out of the case with the xfire setup (especially when overclocking...), so I am sure that draws more power than other users' setups...


----------



## chiknnwatrmln

If that were me I'd come to the conclusion that my power meter is broken...


----------



## Paul17041993

Quote:


> Originally Posted by *Prozillah*
> 
> So yea I am only reading numbers on my wall meter and the largest I've spotted playing BF4 on a 64 multi map heavy battle was 1537 draw from the plug.


Just your power cable to your rig, or your monitors and speakers as well?

Monitors and speakers can use as much as a couple hundred watts, more or less depending on size and backlight power.


----------



## StonedAlex

Quote:


> Originally Posted by *Roboyto*
> 
> You'll have to do some research if you want to make an educated decision on whether or not to start mining. To make your sifting a little quicker use the search function for the thread.
> 
> For LTC mining specific questions head here: http://www.overclock.net/t/1398250/official-tutorial-how-to-start-mining-litecoins
> 
> I can't answer all your questions but I can tell you what I know...
> 
> If you're not going to run under water, then it may be hard to keep them cool, especially if they'll be mining. The Powercolor PCS+ has a triple-slot cooler on it and seems to perform the best from what I've seen in this thread, though crossfiring them can turn into an issue because they end up so close together. The Sapphire Tri-X seems to be the next most popular card. Guys with crossfire will likely chime in to tell you their experiences; I know one member was having a hard time keeping ASUS DC2 290s cool in crossfire. I haven't seen much of anything about the HIS IceQ or XFX Double D cards.
> 
> People seem to be having good luck with the multi-card setups, but they can have their issues with AMD or Nvidia...it's more work than a single GPU. Biggest issue right now seems to be Beta Drivers, but the 13.12 are very solid for everyone.
> 
> My card is pushed to the brink for a reference 290, 1295/1700, and it runs smooth as butter. I haven't had any stuttering issues personally, but this is with only 1 card. I can say I don't recall seeing anyone complain about stuttering since I've been active in this thread for the last few weeks.
> 
> I do believe the black screen issue was a fairly small batch of cards right after release; Haven't seen many people talking about it.
> 
> Don't know about OC monitors.
> 
> Best of luck to you with your potential upcoming project


Thanks guys, I will be doing tons of research into mining before buying anything. I really just wanted to know what to expect from crossfire and heat output. Mining sounds fun but gaming is my primary concern. I have all the fan slots filled in my Phantom 410, but I'm not sure it will be enough to cool 2 of these.


----------



## Roy360

Quote:


> Originally Posted by *StonedAlex*
> 
> Thanks guys, I will be doing tons of research into mining before buying anything. I really just wanted to know what to expect from crossfire and heat output. Mining sounds fun but gaming is my primary concern. I have all the fan slots filled in my Phantom 410, but I'm not sure it will be enough to cool 2 of these.


lol I doubt a few fans will help.

My single reference R9 290 is able to bring my CPU temps up to 36 degrees from 28, and that's with 11 Gentle Typhoons and 4 radiators.

I don't know the exact temp it's running at since I switch to the iGPU while mining, but it's somewhere around 50 for the GPU and 70 for the VRM.


----------



## Roboyto

Quote:


> Originally Posted by *StonedAlex*
> 
> Thanks guys, I will be doing tons of research into mining before buying anything. I really just wanted to know what to expect from crossfire and heat output. Mining sounds fun but gaming is my primary concern. I have all the fan slots filled in my Phantom 410, but I'm not sure it will be enough to cool 2 of these.


You have options for aftermarket cooling to help keep things in check. Here is a post I put up a little over a week ago.



Spoiler: 290 cooling



Quote:


> Originally Posted by *Roboyto*
> 
> I have had great luck so far with my XFX R9 290 Black Edition. Great overclocker, stable, crushing benches and games. Drivers are stable thus far in my experience. Your biggest hurdle will be that reference blower if you plan to get the full potential out of your purchase!
> 
> Reference cooler does a pretty awful job of cooling the card *quietly*.
> 
> They are capable of cooling the card even under high overclocks, but the sound is quite outrageous. I pushed my XFX 290 BE, with the stock blower, to 1215/1650 with the fan cranked up to 75%. The card maxed at 88C and never throttled. It also kept the VRMs cool enough to sustain these clocks and bench/test very well; never broke 79C on VRM1.
> 
> I haven't seen any information for what changing TIM and/or thermal pads can do on a reference cooler...it may help, but the bottom line is the reference blower is inadequate; especially if you plan to OC.
> 
> Your best bet with these cards is upgrading the cooling solution and make sure you *keep the main VRMs cool*. Main VRMs are highlighted in yellow and secondary VRMs circled in orange.
> 
> 
> You have a couple options...
> 
> *Full cover waterblock* if you are set up for water cooling already. Best solution for keeping everything on the card cool, especially the main VRMs.
> 
> *Upgraded air cooler* The two most popular I've seen are the Arctic Accelero Xtreme III and the Gelid Icy Vision.
> 
> *Arctic Accelero Xtreme III*
> Arctic is currently sold out at Newegg because of its popularity and capability to tame these cards and overclock them. The kit includes heatsinks for RAM and VRMs.
> http://www.newegg.com/Product/Product.aspx?Item=N82E16835186068
> To give you an idea of how well the Arctic works here's a link:
> http://www.tomshardware.com/reviews/r9-290-accelero-xtreme-290,3671-4.html
> 
> *Gelid Icy Vision*
> Is in stock and a bit cheaper. Several reviews state it works well. The kit includes heatsinks for RAM and VRMs.
> http://www.newegg.com/Product/Product.aspx?Item=N82E16835426026
> One of our own has installation instructions and results from this cooler.
> http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x
> 
> *Attach an AIO CPU water cooler*. This will do a fantastic job of cooling the GPU, but you will need to attach heatsinks to the RAM and VRMs. You have a couple of options here as well.
> 
> 
> *Arctic Accelero Hybrid* This is the most expensive of the lot, if you can find one, as it seems they may be discontinued. They are listed as compatible with the 290(X). These have a built-in fan to assist in cooling the RAM and VRMs. The kit includes the heatsinks for RAM and VRMs.
> http://www.arctic.ac/us_en/products/cooling/vga/accelero-hybrid.html
> *NZXT Kraken G10* This is a bracket designed to attach Asetek-style AIO coolers to the card. It also has a fan holder on it to assist in cooling the RAM and VRMs. I believe these are still on backorder because of their high demand and stylish looks.
> 
> You must purchase heatsinks for RAM and VRMs
> https://www.nzxt.com/product/detail/138-kraken-g10-gpu-bracket.html
> Asetek AIO coolers are the round ones like this. You can also see a compatibility list on NZXT website.
> 
> *KeplerDynamics* AIO adapter bracket. This bracket is compatible with the Asetek coolers as well. The downside is there is no fan attachment; you will still need to attach heatsinks to the RAM and VRMs and then make sure you have sufficient airflow over the heatsinks/card.
> Orders from here can sometimes take a little while. A friend of mine ordered a few for his 7970 miner cards and it took several weeks for them to arrive. He is very happy with the results, though.
> 
> http://keplerdynamics.com/sigmacool/radeonmkii
> *Heatsinks for RAM and VRMs*
> You will require two sizes. Larger for RAM and smaller for VRMs.
> 
> *Enzotech copper sinks*
> My brother has these on his Xfire 7950's that have AIO coolers on them.
> Larger Sinks for RAM - http://www.newegg.com/Product/Product.aspx?Item=N82E16835708012
> Smaller Sinks for VRMs - http://www.newegg.com/Product/Product.aspx?Item=N82E16835708011
> 
> 
> *Akust RAM Sinks*
> Aluminum - http://www.newegg.com/Product/Product.aspx?Item=9SIA3TR18A6835
> Copper - http://www.newegg.com/Product/Product.aspx?Item=9SIA3TR18A6835
> You would need more than one package of the copper, as there are 16 RAM chips.
> 
> 
> *Ebay Heatsinks*
> Here is one example; there are many more to choose from if you search for RAM heatsinks. The prices are very low on eBay and I have seen people mention them working well.
> http://www.ebay.com/itm/8pcs-Aluminium-Heatsink-For-Motherboard-DDR-VGA-RAM-Memory-IC-Chipset-Cooler-W-/370655913100?pt=US_Memory_Chipset_Cooling&hash=item564cd0648c


----------



## Sgt Bilko

The fix for audio stutters in systems with two R9 290s in Crossfire mode with VSync enabled coming soon:

http://forums.amd.com/game/messageview.cfm?catid=475&cmpid=social19234614&threadid=172035


----------



## vortex240

Quote:


> Originally Posted by *Prozillah*
> 
> Currently plugged in:
> 
> 2x 140mm fans
> 8x 120mm fans (some LED lighting)
> 1x 200mm fan
> 1x H110 water pump
> 2x Gigabyte 290 Windforces xfire 1150 / 1400 (50% PL + 50mv)
> 1x 4670k @ 4.5ghz 1.325v
> 2x 4GB G Skill Trident X 2400mhz
> 5x USB devices Razor LED mech Keyboard etc...
> 1x DVDRW drive
> 2x Storage HDD
> 1x SSD
> 
> So yea I am only reading numbers on my wall meter and the largest I've spotted playing BF4 on a 64 multi map heavy battle was 1537 draw from the plug. Also, my tests previously indicated that with a 1000w PSU i was able to oc EITHER - my GPU's or my CPU. Not both. Both were 100% in 2 hour sessions of bf4 under the same condition, same drivers etc. As soon as I would run both together I would spontaneously restart within 5 - 10 minutes.
> 
> Since installed my new 1200w (peaking at 1300w under heavy loads...) I have replicated the same test - 2 hour bf4 multi 64 map session with both OC's in place running 100% stable. Same drivers etc.
> 
> You tell me if you would draw a different conclusion..?
> 
> EDIT: To add - I do have to run all my case fans at 100% to create the powerful negative vortex to suck all the hot air out of the case with the xfire set up (especially overclocking..) so I am sure that is drawing much more power than other users...


Your power meter is A) broken, or B) you are not reading it correctly.

The fact that you say you could not OC both your cards and the CPU on a 1000 W PSU, but can now on a 1200 W, just plain does NOT make sense. Granted, your 1000 W unit could have been defective, but a 1000 W PSU is capable of delivering 1000 W of DC output; a Kill A Watt will show a wall draw about 50-70 W higher than that at peak, due to the efficiency of a Gold PSU in this example.

Either way, let's see a picture of your power meter for proof that it's drawing these crazy amounts. In the meantime I'm calling BS on this story. It's simple math here, no magic.
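The "simple math" being invoked can be sketched like this (a rough illustration only; the 90% Gold efficiency figure is an assumption, and real efficiency curves vary with load and unit):

```python
# Rough wall-draw estimate: a PSU's rating is DC output, while a wall meter
# (e.g. a Kill A Watt) reads AC input, which is higher by the conversion loss.

def wall_draw(dc_load_w: float, efficiency: float = 0.90) -> float:
    """AC watts pulled from the outlet to deliver dc_load_w to the components."""
    return dc_load_w / efficiency

# Even a 1000 W PSU running flat out, at an assumed 90% efficiency:
peak = wall_draw(1000)
print(round(peak))  # 1111 -- well short of a 1537 W meter reading
```

So a meter reading of 1537 W would imply the old 1000 W unit was being pushed far past its rating, which is the inconsistency being called out.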


----------



## MapRef41N93W

So would picking up a second aftermarket 290x and running Xfire be a bad idea with my case setup? I'd be running both on the air coolers in a Haf-X with a side 200mm fan blowing on them and two top 200mm fans blowing air out. Would the top card run too hot to be usable in this setup?


----------



## Sgt Bilko

Quote:


> Originally Posted by *MapRef41N93W*
> 
> So would picking up a second aftermarket 290x and running Xfire be a bad idea with my case setup? I'd be running both on the air coolers in a Haf-X with a side 200mm fan blowing on them and two top 200mm fans blowing air out. Would the top card run too hot to be usable in this setup?


Nope, I have CF XFX DD 290s in my Storm Trooper; works fine.


----------



## Ouro

Do you guys know if this Gigabyte WF3 290 has the same PCB design as the R9 290 WF3 OC? I just ordered one.


----------



## taem

Well, on the wattage issue: I just ran FurMark to see what the peak draw is. I don't even like that app.

I only had on:
4670k @ 4.7 & 2.75v adaptive (but cpu was only drawing half of what it does during intense games or h264 encoding)
R9 290 at stock, 0 vddc
6 Noctua fans which draw low watts at 7v
2 Thermalright high powered fans at 7v
1 ssd
1 bluray drive

Drew 450 watts. If the CPU were also peaking, that would be ~525 W. That's scary high. Dunno if my 850 W can handle this.
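As a sanity check on that ~525 W figure: the wall reading includes the PSU's conversion loss, so the actual DC load on an 850 W unit is lower (a sketch; the ~90% efficiency value is an assumption, not a measured figure for any particular PSU):

```python
# Translate a wall-meter reading back into the DC load actually on the PSU.
wall_w = 525                      # estimated peak reading at the outlet
efficiency = 0.90                 # assumed; check the unit's 80 PLUS curve
dc_load_w = wall_w * efficiency   # watts the PSU actually delivers
headroom_w = 850 - dc_load_w      # rated capacity left on an 850 W unit
print(dc_load_w, headroom_w)
```

Under that assumption the PSU is only delivering around 470 W, leaving a few hundred watts of rated headroom.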

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The fix for audio stutters in systems with two R9 290s in Crossfire mode with VSync enabled coming soon:
> 
> http://forums.amd.com/game/messageview.cfm?catid=475&cmpid=social19234614&threadid=172035


Wut! I'm about to crossfire 290s and I like vsync. I didn't even know about this problem.


----------



## Sgt Bilko

Quote:


> Originally Posted by *taem*
> 
> Well on the wattage issue I just ran furmark just to see what the peak draw is, I don't even like that app.
> 
> I only had on:
> 4670k @ 4.7 & 2.75v adaptive (but cpu was only drawing half of what it does during intense games or h264 encoding)
> R9 290 at stock, 0 vddc
> 6 Noctua fans which draw low watts at 7v
> 2 Thermalright high powered fans at 7v
> 1 ssd
> 1 bluray drive
> 
> Drew 450 watts. If cpu were also peaking that would be ~525w. That's scary high. Dunno if my 850w can handle this.
> Wut! I'm about to crossfire 290s and I like vsync. I didn't even know about this problem.


The audio "crackles" when you have both Crossfire and VSync enabled; turn VSync off and it's solved, or just run one card.

The driver will be out within the month, I think; won't be long.


----------



## Durquavian

From AMD Facebook:
Quote:


> We have identified the root cause for the audio stutters in systems with 2 R9 290's in Crossfire mode with Vsync enabled. We are planning to include this fix in an upcoming WHQL driver expected to be released in April; however we are aiming to include this fix in an earlier Beta driver that is slated for March. Thank you for your patience, and special thanks to those who have reported the bug, and provided thorough feedback about this issue.


----------



## Prozillah

Quote:


> Originally Posted by *vortex240*
> 
> Your power meter is A) broken or B) you are not reading it correctly
> 
> The fact that you said that you could not OC your both cards and the CPU on a 1000watt PSU, but can now on a 1200watt just plain does NOT make sense. Granted your 1000watt could have been defective but a 1000watt PSU is capable of delivering 1000watt, a killawatt will show a power draw about 50-70 watt higher then that at peak due to the efficiency of a gold PSU in this example.
> 
> Either way, lets see a picture of your power meter for proof drawing these crazy amounts. For the meantime I'm calling BS on this story. It's simple math here, no magic.


I'll get a picture up on Saturday so you guys know what I'm looking at. Either way, I won't debate my integrity over a forum; this place is designed to assist and provide a knowledge base for our hobby. So call it BS, champ. I'm simply sharing my results & experiences; I don't know why anyone would feel the need to lie about how much power their system draws...


----------



## WotanWipeout

I am running a 2500k and two 290Xs.
All watercooled.
Power draw on my old Corsair 750m (Gold):
between 550 W underclocked and 900 W overclocked.
On my new Corsair 1200i: max 800 W.
You might get to 1000 W under high CPU load; 1500 W seems too high. Check Afterburner for the vcore on the GPUs, and downclock your GPUs as a test.

At 1500 W you should be able to run four 290s.
Regards, Wotan


----------



## keikei

I reverted back to the 13.12 drivers, and now BF4 is a lot more stable. 14.2 had way too much fps fluctuation. I was able to actually enjoy a round of Conquest without stutter. 'New' rig not complete yet, but running two 290's. Waiting on an i7.


----------



## Sgt Bilko

Quote:


> Originally Posted by *keikei*
> 
> I reverted back to the 13.12 drivers, and now BF4 is a lot more stable. 14.2 had way too much fps fluctuation. I was able to actually enjoy a round of Conquest without stutter. 'New' rig not complete yet, but running two 290's. Waiting on an i7.


I'm not having any stutter issues at all actually.

Seems it varies person to person.


----------



## nightfox

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm not having any stutter issues at all actually.
> 
> Seems it varies person to person.


Agreed. I don't experience lag on my trifire 290's when I use D3D11, but with Mantle it fluctuates between about 45 and 90 fps. It is more stable on D3D11.

But with Mantle, I observed that the CPU was utilized more than with D3D11; it's like 5-10% more usage.


----------



## Sgt Bilko

Quote:


> Originally Posted by *nightfox*
> 
> Agreed. I don't experience lag on my trifire 290's when I use D3D11, but with Mantle it fluctuates between about 45 and 90 fps. It is more stable on D3D11.
> 
> But with Mantle, I observed that the CPU was utilized more than with D3D11; it's like 5-10% more usage.


I don't experience lag or stutter with D3D or Mantle, I just get about 60 fps more with Mantle.


----------



## CU4TLAN

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I don't experience lag or stutter with D3D or Mantle, I just get about 60 fps more with Mantle.


Yea because you reduced the bottleneck with Mantle. Us 3930k users, not so much.


----------



## Sgt Bilko

Quote:


> Originally Posted by *CU4TLAN*
> 
> Yea because you reduced the bottleneck with Mantle. Us 3930k users, not so much.


Well......yeah, that was half the point of Mantle to begin with.


----------



## CU4TLAN

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well......yeah, that was half the point of Mantle to begin with.


Exactly, which is why you saw a great increase


----------



## Sgt Bilko

Quote:


> Originally Posted by *CU4TLAN*
> 
> Exactly, which is why you saw a great increase


You are only telling me stuff I already know.

I understand why I get a greater percentage increase than the faster Intel chips; I was just saying that the only difference with the 14.2 driver is more fps for me.


----------



## quakermaas

Quote:


> Originally Posted by *Prozillah*
> 
> I'll get a picture up on saturday so you guys know what I'm looking at - Either way I won't debate my integrity over a forum - this place is designed to help assist and provide a knowledge base for our hobby. So you call it BS champ, I'm simply sharing my results & experiences - I don't know why anyone would feel the need to lie about how much power their system draws...


Don't take it personally, nobody is attacking you; they are just saying the numbers seem a bit off, and I would agree that 1500 W from the wall is way too much for your system. It should be topping out at around 1000 W.

My system, a 3930k @ 4.5 and two [email protected], two water pumps, ten fans, and five HDDs, will hit just over 1000 W from the wall when benching. I am using an HX850; I cannot push the CPU and graphics cards at the same time, it's just too much. But when gaming with the cards at default and the CPU at 4.5 I draw around 650 W to 700 W at the wall, which is fine for an 850 W unit.

So if I wanted to run some benchmarks with my CPU and graphics cards at max overclock, I would probably draw around 1100 to 1200 watts from the wall.

So you see why others think that 1500 W is a bit off for your system?

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I don't experience lag or stutter with D3D or Mantle, I just get about 60 fps more with Mantle.


Yep, sort of what I'm seeing.


Quote:


> Originally Posted by *CU4TLAN*
> 
> Yea because you reduced the bottleneck with Mantle. Us 3930k users, not so much.


I am getting huge benefits using Mantle with my 3930k @ 4.5; easily around 25%.

15 minutes of recording each in a full "Firestorm" 64-player server, first-person spectator mode.

Would be very hard to go back to using DX11 now.

Mantle left/DX11 right


----------



## OmgitsSexyChase

Quote:


> Originally Posted by *StonedAlex*
> 
> Thanks guys, I will be doing tons of research into mining before buying anything. I really just wanted to know what to expect from crossfire and heat output. Mining sounds fun but gaming is my primary concern. I have all the fan slots filled in my Phantom 410, but I'm not sure it will be enough to cool 2 of these.


Yeah, these things get hot. If you're not watercooling, they might be a little much; mining produces a LOT of heat, like you-won't-need-a-heater-in-your-house hot.


----------



## kpoeticg

I know this is a fairly dumb question since they're always non-reference, but since I'm about to order my 2nd 290X: the DCII is non-reference, correct?

I'm going with all Kryographics blocks, and I'm trying to order my 2nd card either tonight or tomorrow.

Please don't flame me for asking a dumb question. Just double-checking since they're priced well right now.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kpoeticg*
> 
> I know this is a fairly dumb question since they're always non-reference, but since i'm about to order my 2nd 290x, the DCII is non-reference correct?
> I'm going with all Kryographics blocks, and i'm trying to order my 2nd card either tonight or tomorrow
> 
> Please don't flame me for asking a dumb question. Just dbl checking since they're priced good right now


The 290X DCU II is a non-ref card with a non-ref PCB. EK has blocks for them; not sure if Kryographics does, though.


----------



## Sgt Bilko

Quote:


> Originally Posted by *quakermaas*
> 
> Don't take it personally, nobody is attacking you; they are just saying the numbers seem a bit off, and I would agree that 1500 W from the wall is way too much for your system. It should be topping out at around 1000 W.
> 
> My system, a 3930k @ 4.5 and two [email protected], two water pumps, ten fans, and five HDDs, will hit just over 1000 W from the wall when benching. I am using an HX850; I cannot push the CPU and graphics cards at the same time, it's just too much. But when gaming with the cards at default and the CPU at 4.5 I draw around 650 W to 700 W at the wall, which is fine for an 850 W unit.
> 
> So if I wanted to run some benchmarks with my CPU and graphics cards at max overclock, I would probably draw around 1100 to 1200 watts from the wall.
> 
> So you see why others think that 1500 W is a bit off for your system?
> 
> I am getting huge benefits using Mantle with my 3930k @ 4.5; easily around 25%.
> 
> 15 minutes recording each in a full "Firestorm" 64 player server, first person spectator mode.
> Would be very hard to go back to using DX11 now.
> 
> Mantle left/DX11 right
> 
> 
> Spoiler: Warning: Spoiler!


That's some nice results there; good to see the hexacores are getting a good boost out of it as well.


----------



## vortex240

Quote:


> Originally Posted by *Prozillah*
> 
> I'll get a picture up on saturday so you guys know what I'm looking at - Either way I won't debate my integrity over a forum - this place is designed to help assist and provide a knowledge base for our hobby. So you call it BS champ, I'm simply sharing my results & experiences - I don't know why anyone would feel the need to lie about how much power their system draws...


Hey, I didn't mean it in any bad way at all. Being in IT and building PCs for over 15 years, I have never seen such a discrepancy between what the components are rated for and what they actually consume. Take a picture though, so we can figure out what kind of black hole for electricity you have accidentally created in your PC.


----------



## kpoeticg

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The 290X DCU II is a non-ref card with a non-ref PCB. EK has blocks for them; not sure if Kryographics does, though.


Thanx. Yeah, I was pretty sure they were. Never hurts to ask though. Maybe I'll just grab another PowerColor or a Sapphire.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kpoeticg*
> 
> Thanx. Yeah i was pretty sure they were. Never hurts to ask though. Maybe i'll just grab another Powercolor or a Sapphire


Someone in here has confirmed the XFX DD cards are ref PCBs as well; might keep that in mind if the PowerColor or Sapphire cards are hard to come by.


----------



## kpoeticg

The DDs were cheaper a few days ago. Now the PowerColor and Sapphire are, lol. Newegg's crazy like that. I'm about to place my order now =)


----------



## Ilsanto86

8Pack & HiVizMan (Team OcUK) going for 4x R9-290X World Record in Firestrike / Extreme. Sponsored by AMD & Asus


----------



## Shurtugal

Hey Guys, got my rig together but still waiting on my 290, sadly. Out of curiosity, what sort of overclocks are people getting from the 290's? Mainly the Gigabyte Windforce, but which card seems to overclock the best with the best cooler? Thanks Guys!


----------



## Jackalito

Just a new guy in town








http://www.techpowerup.com/gpuz/ngfrw/

Sapphire R9 290X Tri-X OC


----------



## Widde

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The fix for audio stutters in systems with two R9 290s in Crossfire mode with VSync enabled coming soon:
> 
> http://forums.amd.com/game/messageview.cfm?catid=475&cmpid=social19234614&threadid=172035


I hope it fixes the stutter I get when running something in fullscreen and alt-tabbing







The audio that is ^^


----------



## Denon240

Quote:


> Originally Posted by *Roy360*
> 
> Hey, does anyone know how the warranty on an AMD internal sample works? I found a seller selling his R9 290 for 525. But he doesn't have a receipt because it's an internal sample. Apparently all I need is the card itself with the serial number sticker intact. Can anyone confirm this?


I can confirm; I have a couple of 290Xs that my brother got me through work at AMD, and that's the process. Think about it: these are meant for employees, why would you need a receipt, and who would give you one to begin with, lol. Card #1 is 96.2 ASIC and card #2 is 94.9 ASIC; both do 1200/1600 on stock volts/cooler. With +100mV each will separately do 1325/1675, and at that point I'm pretty sure they are being held back by the temperatures. I'm just waiting for my water components to unleash them. Too bad AMD doesn't have their own waterblocks. $525 with the mining rush is a very good price. I'm trying to twist my brother's arm for a 3rd one LOL, he says I need to get better marks this semester haha.

Man... this thread is fracking long, I've been reading it for like 3 days now, yolo


----------



## sterob

Does anyone have a list of non-reference 290 (non-X) cards whose voltage can be changed (undervolted)? Some people on reddit said the ASUS can be changed, but some said no.


----------



## vortex240

.


----------



## taem

Hexus has a review of the Powercolor PCS+ 290X http://hexus.net/tech/reviews/graphics/66625-powercolor-radeon-r9-290x-pcs/?page=10
Quote:


> switching to performance mode limited the under-load temperature to a very impressive 68ºC - the lowest we've seen on any R9 290X.


Quote:


> Originally Posted by *Shurtugal*
> 
> Hey Guys, got my rig together but still waiting on my 290, sadly. Out of curiosity, what sort of overclocks are people getting from the 290's? Mainly the Gigabyte Windforce, but which card seems to overclock the best with the best cooler? Thanks Guys!


I have a PowerColor PCS+ 290. Stock settings are 1040 core, 1350 mem, +50 vddc adjust. My card's limit is 1230 core / 1700 memory, but that requires a palm-sweating +200mV vddc. Some of the points in between are:

Vddc +100mV: 1170 core, 1600 mem
Vddc +75: 1140 core, 1500 mem
Vddc +31: 1100 core, 1375 mem
Vddc 0: 1080 core, 1350 mem

ASIC score is 70.0%.

It's all lottery, though; how much you can overclock a card is never guaranteed, and just because one guy gets great clocks on a given brand doesn't mean you'll get the same.

I'm pretty sure the PowerColor PCS+ I have cools the best. It's not a quiet card though, and it's gigantic (which is in part why it cools the best: 52mm of thickness and 294mm of heatsink length). So, to the extent that keeping everything cool helps achieve better clocks, if you're willing to crank the fans you can keep the VRMs nice and cool on this card. If I go full blast on the case fans and the card fans, I can run 1200/1600 on air and keep VRM1 under 70C at 22C ambient, which is crazy imho. But keeping the case fans and card fans on full blast is also crazy, except not in a good way.
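Those stability points can be jotted down as a small lookup table when binning a card; purely illustrative bookkeeping (the numbers come from the post above for one PCS+ 290, and every card's curve differs thanks to the silicon lottery):

```python
# Voltage offset (mV) -> highest stable (core, mem) clocks in MHz,
# as reported above for a single PowerColor PCS+ 290.
stable_points = {
    0: (1080, 1350),
    31: (1100, 1375),
    75: (1140, 1500),
    100: (1170, 1600),
    200: (1230, 1700),
}

def max_clocks(mv_offset):
    """Best known-stable clocks at or below the given voltage offset."""
    usable = [clocks for mv, clocks in sorted(stable_points.items()) if mv <= mv_offset]
    return usable[-1] if usable else None

print(max_clocks(80))  # (1140, 1500): best point not exceeding +80 mV
```

Handy when deciding how much extra voltage (and heat) a given clock bump is actually worth.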


----------



## keikei

Quote:


> Originally Posted by *quakermaas*
> 
> I am getting huge benefits using Mantle with my 3930k @ 4.5; easily around 25%.
> 
> 15 minutes recording each in a full "Firestorm" 64 player server, first person spectator mode.
> Would be very hard to go back to using DX11 now.
> 
> Mantle left/DX11 right
> 
> 
> Spoiler: Warning: Spoiler!


Appreciate the benches.

The fps drops/stuttering probably explain the CPU bottleneck I have. The 2nd card doesn't even get worked. CPU usage is also very high.


----------



## Roboyto

Quote:


> Originally Posted by *Shurtugal*
> 
> Hey Guys, got my rig together but still waiting on my 290, sadly. Out of curiosity, what sort of overclocks are people getting from the 290's? Mainly the Gigabyte Windforce, but which card seems to overclock the best with the best cooler? Thanks Guys!


I've got a reference XFX 290 Black Edition, stock clocks 980/1250. If I ram a +200mV offset through it, the card is rock-solid stable hanging around 1275-1300 on the core, depending on the bench/game, and the RAM in the 1675-1700 range.


----------



## Roboyto

Quote:


> Originally Posted by *quakermaas*
> I am getting huge benefits using Mantle with my 3930k @ 4.5; easily around 25%.
> 
> 15 minutes recording each in a full "Firestorm" 64 player server, first person spectator mode.
> 
> Would be very hard to go back to using DX11 now.
> 
> Mantle left/DX11 right
> 
> 
> 
> Spoiler: Warning: Spoiler!


Good to see some positive things! This is with 14.2 I presume?


----------



## quakermaas

Quote:


> Originally Posted by *Roboyto*
> 
> Good to see some positive things! This is with 14.2 I presume?


Yes. 14.1 worked well when it ran right, which wasn't that often; it was a nightmare driver with both Mantle and DirectX.


----------



## pkrexer

Curious what everyone runs for their daily overclock? Even though I'm under water, I typically leave my voltage at default. My card can do 1110 / 1400 at default voltage and my Core / VRM1 runs 46c after an hour or so of BF4 . I could run it at 1200/1500 with +100mv, but it seems overkill and I don't want to sacrifice the longevity of my card.


----------



## Roboyto

Quote:


> Originally Posted by *pkrexer*
> 
> Curious what everyone runs for their daily overclock? Even though I'm under water, I typically leave my voltage at default. My card can do 1110 / 1400 at default voltage and my Core / VRM1 runs 46c after an hour or so of BF4 . I could run it at 1200/1500 with +100mv, but it seems overkill and I don't want to sacrifice the longevity of my card.


That's exactly the same clocks my card gets with +100mV. Best ratio of performance and temperatures I've found from the ridiculous number of benchmarks I've run in the last few weeks.

I've compared all my benchmark scores from 1200/1500 to my best clocks, which are near, or at, 1300/1700 with +200mV, depending on the benchmark.

The higher clocks give me 5% better scores. Not really worth the extra voltage, temperatures, and wear on the card over the long haul IMO.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *pkrexer*
> 
> Curious what everyone runs for their daily overclock? Even though I'm under water, I typically leave my voltage at default. My card can do 1110 / 1400 at default voltage and my Core / VRM1 runs 46c after an hour or so of BF4 . I could run it at 1200/1500 with +100mv, but it seems overkill and I don't want to sacrifice the longevity of my card.


I run 1205/1437 at +168mV daily. Temps don't go above 50C on the core and VRM if my rad fans speed up.

Hawaii takes voltage pretty well; there are several people on this forum who have run high-voltage OCs with no problem.

Also, if my card degrades in two years I don't really care, as I'm going to be upgrading then anyway... That's also when I'm going to get a new mobo, CPU, and RAM.


----------



## phallacy

Quote:


> Originally Posted by *pkrexer*
> 
> Curious what everyone runs for their daily overclock? Even though I'm under water, I typically leave my voltage at default. My card can do 1110 / 1400 at default voltage and my Core / VRM1 runs 46c after an hour or so of BF4 . I could run it at 1200/1500 with +100mv, but it seems overkill and I don't want to sacrifice the longevity of my card.


1175/1500 with +60mV on my Elpida-memory Sapphire 290X, and 1200/1600 with +75mV on my Hynix-based XFX 290X. Currently testing my Tri-X OC 290X to see its limits. Max GPU core temp is 50, and it only happens in Far Cry 3; even in Crysis 3 the cards max out at 47. Kinda weird, methinks.


----------



## By-Tor

Seeing that almost everyone here is running a 290/290X, and more than likely used a 7900 series card before that jump:

Was the jump worth it?

I'm running a pair of 7950's now and I'm thinking about going to just one 290. The 14.2 drivers work great for me, and Mantle runs fine in BF4 as long as I disable Crossfire and only use one card.

Thanks


----------



## Arizonian

Quote:


> Originally Posted by *Jackalito*
> 
> Just a new guy in town
> 
> http://www.techpowerup.com/gpuz/ngfrw/
> 
> Sapphire R9 290X Tri-X OC


Welcome to OCN and the 290X / 290 Club 'new guy in town'. Congrats - added


----------



## vortex240

Quote:


> Originally Posted by *By-Tor*
> 
> Seeing that almost everyone here is running a 290/290x and more than likely used a 7900 series card before that jump.
> 
> Was the jump worth it?
> 
> I'm running a pair of 7950's now and I'm thinking about going to just one 290. The 14.2 drivers work great for me and mantle runs fine in BF4 as long as I disable crossfire and only use one card.
> 
> Thanks


I had a reference 780 for two weeks before I got my AMD internal-sample 290X, and it's a big performance difference. I'm on the stock cooler with the I/O plate removed for better flow; I replaced the TIM and raised the fan 2mm with spacers to make it more efficient, and I run it at 1200/1650 with +25mV.


----------



## phallacy

Quote:


> Originally Posted by *By-Tor*
> 
> Seeing that almost everyone here is running a 290/290x and more than likely used a 7900 series card before that jump.
> 
> Was the jump worth it?
> 
> I'm running a pair of 7950's now and I'm thinking about going to just one 290. The 14.2 drivers work great for me and mantle runs fine in BF4 as long as I disable crossfire and only use one card.
> 
> Thanks


For me, coming from a reference 7870, it was a HUGE upgrade. And if you are thinking about Crossfire 290/X eventually, I can tell you the scaling is pretty awesome: 80-100%+ in most games, and it is a nice way to futureproof your rig.

Also, from the benchmarks and reviews I've seen, Crossfire 7970s are just about equal to a single 290X: a pretty massive generational leap. Even more so if you play at 1440p, 1600p, 4K or Eyefinity resolutions.


----------



## Roboyto

Quote:


> Originally Posted by *By-Tor*
> 
> Seeing that almost everyone here is running a 290/290x and more than likely used a 7900 series card before that jump.
> 
> Was the jump worth it?
> 
> I'm running a pair of 7950's now and I'm thinking about going to just one 290. The 14.2 drivers work great for me and mantle runs fine in BF4 as long as I disable crossfire and only use one card.
> 
> Thanks


A friend of mine is running a 7950/7970 in Xfire, and his 3DMark11 bench scores are only slightly higher than mine. As long as you can keep these cards cool, most of them OC very, very well.

You could likely sell your 7950s and come close to, if not, break even on a new card.


----------



## By-Tor

Quote:


> Originally Posted by *vortex240*
> 
> I had a reference 780 for 2 weeks before I got my AMD internal card 290X, and it's a big performance difference. I'm on the stock cooler with the I/O plate removed for better airflow, replaced the TIM, and raised the fan by 2mm with spacers to make it more efficient. I run it at 1200/1650 with +25mV.


Quote:


> Originally Posted by *phallacy*
> 
> For me, coming from a reference 7870, it was a HUGE upgrade. And if you are thinking about crossfire 290/X eventually, I can tell you the scaling is pretty awesome: 80-100%+ in most games, and it is a nice way to futureproof your rig.
> 
> Also, from benchmarks and reviews I've seen, a crossfire 7970 setup is just about equal to a single 290X, a pretty massive generational leap. Even more so if you play at 1440p, 1600p, 4K, or Eyefinity resolutions.


Thanks

The only issue I see is that the die itself is rotated 45 degrees, so my MCW82-7900 Swiftech water block won't work on that card.

Is the die lower than that little wall around it on the PCB?

If so I could just go back to my older MCW60 Swiftech block and add a shim.

Thanks again


----------



## The Mac

Quote:


> Originally Posted by *By-Tor*
> 
> Seeing that almost everyone here is running a 290/290x and more than likely used a 7900 series card before that jump.
> 
> Was the jump worth it?
> 
> I'm running a pair of 7950's now and I'm thinking about going to just one 290. The 14.2 drivers work great for me and mantle runs fine in BF4 as long as I disable crossfire and only use one card.
> 
> Thanks


Came from a single 7970 Dual-X non-GHz to a 290 Pro.

Heat issues aside, it was a hell of an upgrade.


----------



## phallacy

Quote:


> Originally Posted by *By-Tor*
> 
> Thanks
> 
> The only issue I see is that the die itself is turned back 45 degrees and my MCW82-7900 Swiftech water block won't work on that card.
> 
> Is the die lower than that little wall around it on the PCB?
> 
> If so I could just go back to my older MCW60 Swiftech block and add a shim.
> 
> Thanks again


I can't say in regards to your Swiftech block because I don't know how the spacing is. The main thing with cooling these cards is the VRM temperatures. Even good air cooling can keep the core temps reasonable, but it's the VRMs that will spike more. I'd recommend a block made for the 290X, but that's because I haven't seen anybody with that Swiftech 7900 block on a 290X.

FWIW, I was about to put the Kepler Dynamics AIO bracket on my cards before deciding to go custom loop, and there was no shim required to attach the cooler to the card.


----------



## binormalkilla

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The fix for audio stutters in systems with two R9 290s in Crossfire mode with VSync enabled coming soon:
> 
> http://forums.amd.com/game/messageview.cfm?catid=475&cmpid=social19234614&threadid=172035


It's about time. I've only been waiting on this since November.


----------



## keikei

Quote:


> Originally Posted by *phallacy*
> 
> Also, from benchmarks and reviews I've seen, a crossfire 7970 setup is just about equal to a single 290, a pretty massive generational leap. Even more so if you play at 1440p, 1600p, 4K, or Eyefinity resolutions.


*Concerning BF4, X vs non-X, there's a huge difference of 2 whole fps.


----------



## The Mac

Quote:


> Originally Posted by *keikei*
> 
> *Concerning BF4, X vs non-X, there's a huge difference of 2 whole fps.


mantle or dx?


----------



## keikei

Quote:


> Originally Posted by *The Mac*
> 
> mantle or dx?


The vid I saw was a side-by-side DX comparison, pre-Mantle. It's what made me decide to go with the non-X, along with other findings, of course. My primary upgrade reason was a BF4 fps increase.


----------



## taem

Does Raptr no longer auto-install? For a while now, every time I install Catalyst drivers, Raptr auto-installs. But I just installed 13.12 after rolling back, and Raptr did not install. Just a few weeks ago this same driver package auto-installed it. I'm wondering if something went wrong lol.


----------



## By-Tor

Quote:


> Originally Posted by *phallacy*
> 
> I can't say in regards to your Swiftech block because I don't know how the spacing is. The main thing with cooling these cards is the VRM temperatures. Even good air cooling can keep the core temps reasonable, but it's the VRMs that will spike more. I'd recommend a block made for the 290X, but that's because I haven't seen anybody with that Swiftech 7900 block on a 290X.
> 
> FWIW, I was about to put the Kepler Dynamics AIO bracket on my cards before deciding to go custom loop, and there was no shim required to attach the cooler to the card.


Here's how I'm cooling the 7950s' RAM and VRMs, with a fan blowing across both cards, and it works great.

I run them at 1200/1700


----------



## Roboyto

Quote:


> Originally Posted by *By-Tor*
> 
> Here's how I'm cooling the 7950s' RAM and VRMs, with a fan blowing across both cards, and it works great.
> 
> I run them at 1200/1700
> 
> 
> Spoiler: Warning: Spoiler!


The sinks would be necessary for the 290 as well, but I think these VRMs run quite a bit hotter than the 7900s' do, especially once you start adding any voltage to OC.


----------



## VSG

Quote:


> Caution! It appears that Gigabyte R9 290(X) Windforce 3X series graphics cards no longer fit our EK-FC R9-290X series water blocks due to slight changes to the circuit board. We will, however, release a compatible water block in the near future.
> 
> Please share among your fellow water cooling enthusiasts!


From EK's facebook page.


----------



## rickyman0319

I am wondering how to make the original fan quieter on the R9 290. Please help me.


----------



## Roboyto

Quote:


> Originally Posted by *rickyman0319*
> 
> I am wondering how to make the original fan quieter on the R9 290. Please help me.


Upgrade the factory thermal paste for the GPU. Upgrading the thermal pads for the main VRMs could help or hurt...it's hard to say. One reason the core runs so hot is that the blower first cools the VRMs, which run very hot. That hot air is then blown across the core...not the best design.

Some paste like Arctic MX-4, Gelid GC Extreme, or Noctua NT-H1 would probably lower core temperatures. If you'd like to go the route of best performance for TIM, it would be CLP or CLU, Coollaboratory Liquid Pro/Ultra. You must be cautious when applying these since they are liquid metal. The best application method seen on this thread is using masking tape around the GPU core to make sure it doesn't get anywhere else.

VRM Thermal Pads and temperatures: http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures#post_21821258

Your best bet is likely aftermarket cooling unfortunately, *but I'd be interested to know what good TIM on the core and a thermal pad upgrade for VRM1 could do on a reference cooler.*

Quote: Originally Posted by Roboyt0


> Spoiler: Aftermarket 290 Cooling
> 
> 
> 
> I have had great luck so far with my XFX R9 290 Black Edition. Great overclocker, stable, crushing benches and games. Drivers have been stable thus far in my experience. Your biggest hurdle will be that reference blower if you plan to get the full potential out of your purchase!
> 
> Reference cooler does a pretty awful job of cooling the card *quietly*.
> 
> They are capable of cooling the card even under high overclocks, but the sound is quite outrageous. I pushed my XFX 290 BE, with the stock blower, to 1215/1650 with the fan cranked up to 75%. The card maxed at 88C and never throttled. It also kept the VRMs cool enough to sustain these clocks and bench/test very well; never broke 79C on VRM1.
> 
> I haven't seen any information for what changing TIM and/or thermal pads can do on a reference cooler...it may help, but the bottom line is the reference blower is inadequate; especially if you plan to OC.
> 
> Your best bet with these cards is upgrading the cooling solution and make sure you *keep the main VRMs cool*. Main VRMs are highlighted in yellow and secondary VRMs circled in orange.
> 
> 
> You have a couple options...
> 
> *Full cover waterblock* if you are set up for water cooling already. Best solution for keeping everything on the card cool, especially the main VRMs.
> 
> *Upgraded air cooler* The two most popular I've seen are the Arctic Accelero Xtreme III and the Gelid Icy Vision.
> 
> *Arctic Accelero Xtreme III*
> Arctic is currently sold out at Newegg because of its popularity and capability to tame these cards and overclock them. The kit includes heatsinks for RAM and VRMs.
> http://www.newegg.com/Product/Product.aspx?Item=N82E16835186068
> To give you an idea of how well the Arctic works, here's a link:
> http://www.tomshardware.com/reviews/r9-290-accelero-xtreme-290,3671-4.html
> 
> *Gelid Icy Vision*
> It is in stock and a bit cheaper, with several reviews stating it works well. The kit includes heatsinks for RAM and VRMs.
> http://www.newegg.com/Product/Product.aspx?Item=N82E16835426026
> One of our own has installation instructions and results for this cooler:
> http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x
> 
> *Attach an AIO CPU water cooler*. This will do a fantastic job of cooling the GPU, but you will need to attach heatsinks to the RAM and VRMs. You have a couple options here as well.
> 
> 
> *Arctic Accelero Hybrid*, the most expensive of the lot, if you can find one, as it seems they may be discontinued. They are listed as compatible with the 290(X) and have a built-in fan to assist in cooling the RAM and VRMs. The kit includes the heatsinks for RAM and VRMs.
> http://www.arctic.ac/us_en/products/cooling/vga/accelero-hybrid.html
> *NZXT Kraken G10* This is a bracket designed to attach Asetek-style AIO coolers to the card. It also has a fan holder on it to assist in cooling the RAM and VRMs. I believe these are still on backorder because of their high demand and stylish looks.
> 
> 
> 
> 
> 
> 
> 
> 
> You must purchase heatsinks for RAM and VRMs
> https://www.nzxt.com/product/detail/138-kraken-g10-gpu-bracket.html
> Asetek AIO coolers are the round ones like this. You can also see a compatibility list on NZXT website.
> 
> *KeplerDynamics* AIO adapter bracket. This bracket is compatible with the Asetek coolers as well. The downside is that there is no fan attachment; you will still need to attach heatsinks to the RAM and VRMs and then make sure you have sufficient airflow over the heatsinks/card.
> Orders from here can sometimes take a little while. A friend of mine ordered a few for his 7970 miner cards and it took several weeks for them to arrive. He is very happy with the results though.
> 
> 
> 
> 
> 
> 
> 
> 
> http://keplerdynamics.com/sigmacool/radeonmkii
> *Heatisnks for RAM and VRMs*
> You will require two sizes. Larger for RAM and smaller for VRMs.
> 
> *Enzotech copper sinks*
> My brother has these on his Xfire 7950's that have AIO coolers on them.
> Larger Sinks for RAM - http://www.newegg.com/Product/Product.aspx?Item=N82E16835708012
> Smaller Sinks for VRMs - http://www.newegg.com/Product/Product.aspx?Item=N82E16835708011
> 
> 
> *Akust RAM Sinks*
> Aluminum - http://www.newegg.com/Product/Product.aspx?Item=9SIA3TR18A6835
> Copper - http://www.newegg.com/Product/Product.aspx?Item=9SIA3TR18A6835
> You would require more than one package of the copper as there are 16 RAM chips
> 
> 
> *Ebay Heatsinks*
> Here is one example, there are many more you can choose from if you search for RAM heatsink. The prices are very low on eBay and I have seen people mention them working well.
> http://www.ebay.com/itm/8pcs-Aluminium-Heatsink-For-Motherboard-DDR-VGA-RAM-Memory-IC-Chipset-Cooler-W-/370655913100?pt=US_Memory_Chipset_Cooling&hash=item564cd0648c


----------



## taem

Yay


----------



## taem

Quick closeup of vrm1 on the 290 Tri-X, compared to the Powercolor PCS+ 290:

Tri-X (shroud overhangs there and gets in the way):


PCS+:


At least no bizarre goop on the Tri-X chokes









I worried this card might be flimsy due to the plastic in the shroud, but it feels rock solid. If anything, there's less flex in this card than in the all-metal shroud/backplate Powercolor PCS+. The Tri-X shroud is half metal, as it turns out.


----------



## Shurtugal

Quote:


> Originally Posted by *taem*
> 
> Hexus has a review of the Powercolor PCS+ 290X http://hexus.net/tech/reviews/graphics/66625-powercolor-radeon-r9-290x-pcs/?page=10
> 
> I have a Powercolor PCS+ 290. Stock settings are 1040 core, 1350 mem, +50 vddc adjust. My card limit is 1230 core 1700 memory, but this requires a palm sweating +200mV vddc. Some of the points in between are:
> 
> Vddc +100mV: 1170 core, 1600 mem
> Vddc +75: 1140 core, 1500 mem
> Vddc +31: 1100 core, 1375 mem
> Vddc 0: 1080 core, 1350 mem
> 
> ASIC score is 70.0%
> 
> It's all lottery though, how much you can overclock a card, just because one guy gets great clocks on a given brand is no guarantee you'll get the same.
> 
> I'm pretty sure the Powercolor PCS+ I have cools the best. It's not a quiet card though, and it's gigantic (which is in part why it cools the best, it's got 52mm thickness and 294mm length of heatsink). So to the extent that keeping everything cool helps achieve better clocks, if you're willing to crank the fans, you can keep the vrms nice and cool on this card. If I go full blast on case fans and the card fans, I can run 1200/1600 on air and keep vrm1 under 70c at 22 ambient, which is crazy imho. But keeping the case fans and card fans on full blast is also crazy except not in a good way.


Quote:


> Originally Posted by *Roboyto*
> 
> I've got a reference XFX 290 Black Edition. Stock clocks 980/1250. If I ram +200mV offset through it, the card is rock solid stable hanging around 1275-1300 core, depending on bench/game, and the RAM in the 1675-1700 range.


Awesome, thanks!


----------



## Roy360

Quote:


> Originally Posted by *Denon240*
> 
> I can confirm; I have a couple 290Xs that my brother got me through work at AMD, and that's the process. Think about it: these are meant for employees, why would you need a receipt, and who would give you one to begin with, lol. Card #1 is 96.2 ASIC and card #2 is 94.9 ASIC; both do 1200/1600 on stock volts/cooler. With +100mv, each will separately do 1325/1675, and at that point I'm pretty sure they are being held back by the temperatures. I'm just waiting for my water components to unleash them. Too bad AMD doesn't have their own waterblocks. $525 with the mining rush is a very good price. I'm trying to twist my brother's arm for a 3rd one LOL, he says I need to get better marks this semester haha.
> 
> Man... this thread is fracking long, I've been reading it for like 3 days now, yolo


Thanks, I set the deal back up with the buyer after reading your last post. Can't wait to see how it overclocks under water.


----------



## taem

Ok, so I set up my crossfire.


Here are the Valley scores for both cards singly at 1080 core / 1350 mem. The Tri-X is a lot faster at the same clocks. Why? Does the primary GPU1 perform better?


Here is the Valley score in Crossfire:


Does that look about right? Not quite double but almost? Thing is gpu usage is crazy:


----------



## LazarusIV

Hey all, I have an issue...

I have turned off sleep and hybrid sleep on my computer and set the monitor to go into power-saving mode after 10 minutes. For some reason, when the computer sits for a while and the monitor has turned off like it's supposed to, moving the mouse will not wake up the computer every time. It seems to have frozen as if it had gone to sleep and had an issue, even though I have disabled all of that. I've gone in to check the sleep and hybrid sleep settings and they're both still disabled so I'm not sure what this is about. The only other thing I can think of is the driver, I'm on the 13.12 WHQL driver and I just re-installed Windows 7 64bit last Friday. Before my re-install it was doing the same on the same driver, albeit not as often. I'm at a loss here, I'm not sure what's going on!

Any help or suggestions are appreciated! Maybe there's a setting or some such I've forgotten!


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> Ok, so I set up my crossfire.
> 
> Here are the Valley scores for both cards singly at 1080 core / 1350 mem. The Tri-X is a lot faster at the same clocks. Why? Does the primary GPU1 perform better?


Have you tried running multiple times? There can be a little fluctuation in scores.

Do they both have same voltages applied for the clocks?

Have you tried with both GPUs in the top and bottom slot? I've seen people mention swapping positions before.


----------



## Forceman

Quote:


> Originally Posted by *LazarusIV*
> 
> Hey all, I have an issue...
> 
> I have turned off sleep and hybrid sleep on my computer and set the monitor to go into power-saving mode after 10 minutes. For some reason, when the computer sits for a while and the monitor has turned off like it's supposed to, moving the mouse will not wake up the computer every time. It seems to have frozen as if it had gone to sleep and had an issue, even though I have disabled all of that. I've gone in to check the sleep and hybrid sleep settings and they're both still disabled so I'm not sure what this is about. The only other thing I can think of is the driver, I'm on the 13.12 WHQL driver and I just re-installed Windows 7 64bit last Friday. Before my re-install it was doing the same on the same driver, albeit not as often. I'm at a loss here, I'm not sure what's going on!
> 
> Any help or suggestions are appreciated! Maybe there's a setting or some such I've forgotten!


Do you have ULPS disabled? I had similar problems (it sounds like) until I disabled ULPS in the Afterburner settings, after that all was fine.


----------



## vortex240

Quote:


> Originally Posted by *rickyman0319*
> 
> I am wondering how to make the original fan quieter on the R9 290. Please help me.


Remove the rear I/O plate; it makes a huge difference. It restricts the airflow and adds a lot of noise from turbulent air trying to escape through the small vents. 55% doesn't bother me at all now, and it sure did with the I/O plate on.

Just google for an article on how to do it. SemiAccurate did one on just that, but their results were BS if you ask me, since I did it and it made a big difference in noise.


----------



## taem

Preliminary 290 crossfire temp test using Metro LL bench looped 50x on max settings. Clocked at 1080 core, 1350 mem, 0 vddc offset.

Gpu1: Sapphire 290 Tri-X OC


Gpu2: Powercolor PCS+ 290


Sapphire 290 -- 72c core, 71c vrm1, 56c vrm2
Powercolor 290 -- 61c core, 61c vrm1, 48c vrm2

I can live with those temps







That's better than I expected from my silenced Define R4 case which doesn't have stellar airflow to begin with and I'm using Noctua fans set to max at 1000rpms. But boy does 290 crossfire pump out the heat. The area around my desk is noticeably warmer. That's crazy. My pc case is literally a space heater now. Cpu temps are wrecked, high 50s low 60s, 5-10c higher than before.

Gpu usage is still all over the place though, what is this about? Max gpu load on gpu1 was only 88%. Cpu usage was 91%.

Quote:


> Originally Posted by *Roboyto*
> 
> Have you tried running multiple times? There can be a little fluctuation in scores.


I'm not so much interested in maxing out scores (one 290 was enough for me frankly, this setup is waaaaay overkill lol), just in making sure things are as they should be. If other xfirers could share their Valley scores I would appreciate it. List your clocks obviously.
Quote:


> Do they both have same voltages applied for the clocks?


Here's an odd thing. I have Afterburner set to apply the same settings to both cards obviously. And vddc is set to 0. But the Powercolor, which comes with a +50 vddc in the bios, is still using the higher voltage. Sapphire gets 1.188v max, Powercolor gets 1.227 max. Shouldn't they be operating on the same voltage?
Quote:


> Have you tried with both GPUs in the top and bottom slot? I've seen people mention swapping positions before.


Won't fit that way, Sapphire is 305mm and collides with the lower hdd cage.


----------



## rickyman0319

Quote:


> Originally Posted by *Roboyto*
> 
> Upgrade the factory thermal paste for the GPU. Upgrading the thermal pads for the main VRMs could help or hurt...it's hard to say. One reason the core runs so hot is that the blower first cools the VRMs, which run very hot. That hot air is then blown across the core...not the best design.
> 
> Some paste like Arctic MX-4, Gelid GC Extreme, or Noctua NT-H1 would probably lower core temperatures. If you'd like to go the route of best performance for TIM, it would be CLP or CLU, Coollaboratory Liquid Pro/Ultra. You must be cautious when applying these since they are liquid metal. The best application method seen on this thread is using masking tape around the GPU core to make sure it doesn't get anywhere else.
> 
> VRM Thermal Pads and temperatures: http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures#post_21821258
> 
> Your best bet is likely aftermarket cooling unfortunately, *but I'd be interested to know what good TIM on the core and a thermal pad upgrade for VRM1 could do on a reference cooler.*


The best air cooler out there is out of stock. The cheaper one, they still have.


----------



## kizwan

Quote:


> Originally Posted by *taem*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Preliminary 290 crossfire temp test using Metro LL bench looped 50x on max settings. Clocked at 1080 core, 1350 mem, 0 vddc offset.
> 
> Gpu1: Sapphire 290 Tri-X OC
> 
> 
> Gpu2: Powercolor PCS+ 290
> 
> 
> Sapphire 290 -- 72c core, 71c vrm1, 56c vrm2
> Powercolor 290 -- 61c core, 61c vrm1, 48c vrm2
> 
> I can live with those temps
> 
> 
> 
> 
> 
> 
> 
> That's better than I expected from my silenced Define R4 case which doesn't have stellar airflow to begin with and I'm using Noctua fans set to max at 1000rpms. But boy does 290 crossfire pump out the heat. The area around my desk is noticeably warmer. That's crazy. My pc case is literally a space heater now. Cpu temps are wrecked, high 50s low 60s, 5-10c higher than before.
> 
> 
> 
> Gpu usage is still all over the place though, what is this about? Max gpu load on gpu1 was only 88%. Cpu usage was 91%.


GPU usage is normal like that with 290 Crossfire. If games don't stutter/lag, then you don't need to worry.
Quote:


> Originally Posted by *taem*
> 
> Here's an odd thing. I have Afterburner set to apply the same settings to both cards obviously. And vddc is set to 0. But the Powercolor, which comes with a +50 vddc in the bios, is still using the higher voltage. Sapphire gets 1.188v max, Powercolor gets 1.227 max. *Shouldn't they be operating on the same voltage?*


Mine doesn't. Makes sense, since both of my cards have different max VDDC. When applying an offset voltage, both cards will run at different voltages too.


----------



## Prozillah

14.2 works brilliantly on DX. Mantle didn't look as nice visually, personally. Constant 100+ fps on Xfire 290s, max settings, 115% supersampling. Happy.


----------



## taem

Anyone with a Sapphire 290 Tri-X:

How do you tell which is the legacy BIOS and which is the UEFI BIOS? Ever since I installed that card, my boot-up time has gone up by a good 20 seconds. So which switch position is which, and is there an alternative way to determine which BIOS you're on?

Crossfire is not going so great for me right now lol. I had crazy fluctuations in Afterburner; someone told me I need to use Trixx because AB does not handle 290s well. Well, I installed Trixx and it shows no info; everything is blank or 0.


----------



## Derpinheimer

Quote:


> Originally Posted by *taem*
> 
> Quick closeup of vrm1 on the 290 Tri-X, compared to the Powercolor PCS+ 290:
> 
> Tri-X (shroud overhangs there and gets in the way):
> 
> 
> PCS+:
> 
> 
> At least no bizarre goop on the Tri-X chokes
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I worried this card might be flimsy due to the plastic in the shroud, but it feels rock solid. If anything, there's less flex in this card than in the all-metal shroud/backplate Powercolor PCS+. The Tri-X shroud is half metal, as it turns out.


The VRM cooling looks much nicer on the PCS+? Any temps?


----------



## taem

Quote:


> Originally Posted by *Derpinheimer*
> 
> The VRM cooling looks much nicer on the PCS+? Any temps?


Yeah posted earlier here http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/18000_50#post_21872910


----------



## Paul17041993

Quote:


> Originally Posted by *taem*
> 
> Gpu usage is still all over the place though, what is this about? Max gpu load on gpu1 was only 88%. Cpu usage was 91%.


Bottleneck; the render thread and drivers don't have enough headroom to feed the GPUs fast enough. Once Mantle picks up, you shouldn't have to worry about that much, though.

Side note: I discovered that OpenGL rendering with excessive tessellation in wireframe mode can punch the card hard enough to make Windows lag and your mouse almost unusable... :I
(Though Heaven etc. have locked max values, so you can't accidentally do this on said benchmarks.)


----------



## taem

Ok, this is freaking maddening. I thought I'd give 14.2 a try. Well, now the fan rpm sensor in GPU-Z reads like this:


HWInfo reads it fine though.

Meanwhile, over in Trixx, I cannot get it to save a custom fan curve to both cards. I can set both to a fixed rpm. I can set both to auto. But if I want a custom curve, only one card can have it. I'm bouncing back and forth between tabs trying to get the thing to save; it will not.
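For what it's worth, a custom fan curve is just piecewise-linear interpolation between (temperature, fan %) points, whichever tool ends up applying it. A minimal sketch in Python, with made-up points that are not Trixx's or Afterburner's actual defaults:

```python
# Piecewise-linear fan curve: map GPU temperature (C) to fan duty (%).
# The points below are hypothetical examples, not any tool's real defaults.
CURVE = [(40, 20), (60, 40), (75, 60), (85, 85), (95, 100)]

def fan_duty(temp_c, curve=CURVE):
    """Interpolate fan % for a temperature, clamped at both ends of the curve."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    # Walk consecutive point pairs and interpolate inside the matching segment.
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(67.5))  # halfway between (60, 40) and (75, 60) -> 50.0
```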

The gpu usage thing makes this crossfire unusable atm. I'll get hundreds and hundreds of frames for a while, and then it will plummet to like 20 fps for a few seconds as gpu usage drops to 2%. It's the mother of all macro stutters.

Oh why did I even try this. I was so happy with a single gpu. What a waste for $500 because right now, I think I'm just going to put this Tri X in the closet.

What should I do, just format my drive and start all over with a GPT convert? Because that's probably going to be a lot faster than trying to chase a possible driver glitch.


----------



## kizwan

Quote:


> Originally Posted by *taem*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Ok, this is freaking maddening. I thought I'd give 14.2 a try. Well, now the fan rpm sensor in GPU-Z reads like this:
> 
> 
> HWInfo reads it fine though.
> 
> Meanwhile, over in Trixx, I cannot get it to save a custom fan curve to both cards. I can set both to a fixed rpm. I can set both to auto. But if I want a custom curve, only one card can have it. I'm bouncing back and forth between tabs trying to get the thing to save; it will not.
> 
> 
> 
> The gpu usage thing makes this crossfire unusable atm. I'll get hundreds and hundreds of frames for a while, and then it will plummet to like 20 fps for a few seconds as gpu usage drops to 2%. It's the mother of all macro stutters.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Oh why did I even try this. I was so happy with a single gpu. What a waste for $500 because right now, I think I'm just going to put this Tri X in the closet.
> 
> What should I do, just format my drive and start all over with a GPT convert? Because that's probably going to be a lot faster than trying to chase a possible driver glitch.


Like Paul said, your CPU is bottlenecking. Did you overclock your 4670k?


----------



## taem

Quote:


> Originally Posted by *kizwan*
> 
> Like Paul said, your CPU is bottlenecking. Did you overclock your 4670k?


Yeah it's a 4670k at 4.7. I guess I deserve some bottlenecking for pairing a $200 cpu with $1000 worth of gpu. But total cpu usage isn't anywhere near 100%, peaked for a second at 91% in Metro LL but other than that it's 50-70%. But I suppose bottlenecking can occur without total cpu usage hitting 100%.

I freaked for no reason lol crossfire is sort of working now







Trixx was a problem, that gave me huge janky stutters in addition to all sorts of glitches, including the first ever black screen I've gotten with a 290. With AB I don't get that. I don't understand why. The only problem that's really irritating me right now is that the fan rpm sensors on Gpu Z don't work. But then I've always had this totally irrational completely unfounded suspicion that Gpu Z causes all sorts of glitches so it's not a huge loss. HWInfo is superior anyway.

Anyway, 290 crossfire is overkill for me so I'm running 1000 core 1300 mem and leaving fans at 50%, still get more fps than my display can show at 60hz. Really wish you could undervolt 290s. Because it's silly to be getting 150fps in games with all the settings I want. Although -- I still can't max Tomb Raider or Metro LL. But I don't want some of those settings anyway, stuff like SSAO and DOF don't always add to the experience, sometimes they look awful.

In any event -- now that I've determined I do not need a new psu (absolute peak with a slight vddc adjust was 647w) or new fans or a new case (temps are good, everything low 70s or below), I guess I'll get a new cpu. I take it 4770k is not enough to avoid bottleneck? Do I need a 4930 or whatever it is in the range I never usually even look at?
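On the PSU point, a common rule of thumb is that a GPU's dynamic power scales roughly with clock times voltage squared (P ∝ f·V²), which is why even a small vddc bump moves wattage more than the clock increase alone would suggest. A back-of-envelope sketch; the frequencies and voltages are illustrative only, not measured 290 figures:

```python
# Rule-of-thumb dynamic power scaling: P ~ f * V^2.
# Frequencies/voltages below are illustrative, not measured 290 values.
def relative_power(f0_mhz, v0, f1_mhz, v1):
    """Power at (f1, v1) relative to a baseline of (f0, v0)."""
    return (f1_mhz / f0_mhz) * (v1 / v0) ** 2

# e.g. 1000 MHz @ 1.15 V overclocked to 1100 MHz @ 1.25 V:
# a 10% clock bump plus ~0.1 V lands near +30% power, not +10%.
print(relative_power(1000, 1.15, 1100, 1.25))
```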


----------



## Tobiman

Quote:


> Originally Posted by *taem*
> 
> Yeah it's a 4670k at 4.7. I guess I deserve some bottlenecking for pairing a $200 cpu with $1000 worth of gpu. But total cpu usage isn't anywhere near 100%, peaked for a second at 91% in Metro LL but other than that it's 50-70%. But I suppose bottlenecking can occur without total cpu usage hitting 100%.
> 
> I freaked for no reason lol crossfire is sort of working now
> 
> 
> 
> 
> 
> 
> 
> Trixx was a problem, that gave me huge janky stutters in addition to all sorts of glitches, including the first ever black screen I've gotten with a 290. With AB I don't get that. I don't understand why. The only problem that's really irritating me right now is that the fan rpm sensors on Gpu Z don't work. But then I've always had this totally irrational completely unfounded suspicion that Gpu Z causes all sorts of glitches so it's not a huge loss. HWInfo is superior anyway.
> 
> Anyway, 290 crossfire is overkill for me so I'm running 1000 core 1300 mem and leaving fans at 50%, still get more fps than my display can show at 60hz. Really wish you could undervolt 290s. Because it's silly to be getting 150fps in games with all the settings I want. Although -- I still can't max Tomb Raider or Metro LL. But I don't want some of those settings anyway, stuff like SSAO and DOF don't always add to the experience, sometimes they look awful.
> 
> In any event -- now that I've determined I do not need a new psu (absolute peak with a slight vddc adjust was 647w) or new fans or a new case (temps are good, everything low 70s or below), I guess I'll get a new cpu. I take it 4770k is not enough to avoid bottleneck? Do I need a 4930 or whatever it is in the range I never usually even look at?


You'll only gain a handful of fps even if you went with a 4930k. As for the usage spikes, it's AB that's finicky; it doesn't seem to work well with Hawaii atm. The GPU-Z rpm bug is common with the 14.x drivers. I had it on 14.1 also.


----------



## kizwan

@taem,

Glad to hear Crossfire is now working well for you. I had problems with Trixx when using the 14.1 beta driver, though. Bottlenecking happens when CPU usage is at 100% while GPU usage is lower, accompanied by a sudden FPS drop. My 3820 can handle two 290's even at stock clocks, so a 4770K should be able to as well. Looks like your 4670K can handle two 290's just fine. You don't need to upgrade yet.

In my test using BF4, going from 1080p at 100% resolution scale to 200% resolution scale lowered CPU usage by roughly 10% at least. I wonder whether going from 1080p to 2160p (or any resolution higher than 1080p) yields the same result.
Quote:


> Originally Posted by *Tobiman*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *taem*
> 
> Yeah it's a 4670k at 4.7. I guess I deserve some bottlenecking for pairing a $200 cpu with $1000 worth of gpu. But total cpu usage isn't anywhere near 100%, peaked for a second at 91% in Metro LL but other than that it's 50-70%. But I suppose bottlenecking can occur without total cpu usage hitting 100%.
> 
> I freaked for no reason lol crossfire is sort of working now
> 
> 
> 
> 
> 
> 
> 
> Trixx was a problem, that gave me huge janky stutters in addition to all sorts of glitches, including the first ever black screen I've gotten with a 290. With AB I don't get that. I don't understand why. The only problem that's really irritating me right now is that the fan rpm sensors on Gpu Z don't work. But then I've always had this totally irrational completely unfounded suspicion that Gpu Z causes all sorts of glitches so it's not a huge loss. HWInfo is superior anyway.
> 
> Anyway, 290 crossfire is overkill for me so I'm running 1000 core 1300 mem and leaving fans at 50%, still get more fps than my display can show at 60hz. Really wish you could undervolt 290s. Because it's silly to be getting 150fps in games with all the settings I want. Although -- I still can't max Tomb Raider or Metro LL. But I don't want some of those settings anyway, stuff like SSAO and DOF don't always add to the experience, sometimes they look awful.
> 
> In any event -- now that I've determined I do not need a new psu (absolute peak with a slight vddc adjust was 647w) or new fans or a new case (temps are good, everything low 70s or below), I guess I'll get a new cpu. I take it 4770k is not enough to avoid bottleneck? Do I need a 4930 or whatever it is in the range I never usually even look at?
> 
> 
> 
> 
> 
> 
> You'll only gain a handful of fps even if you went 4930k. As for the usage spike, It's ab that finicky. It doesn't seem to work well with hawaii atm. The GPU-Z rpm bug is common with 14.x drivers. I had it on 14.1 also.

You're missing the point here. We're not talking about fps gain but about CPU bottlenecking in Crossfire.


----------



## HOMECINEMA-PC

Just did a 4hr session on my wc 290's at stock clocks: 35C idle, 44C full load temps, avg 170fps, and so smooth









Quote:


> Originally Posted by *taem*
> 
> Preliminary 290 crossfire temp test using Metro LL bench looped 50x on max settings. Clocked at 1080 core, 1350 mem, 0 vddc offset.
> 
> Gpu1: Sapphire 290 Tri-X OC
> 
> 
> Gpu2: Powercolor PCS+ 290


My GPU-Z doesn't show vrm temps, what makes yours so special?


----------



## TommyGunn123

Anyone watch the overclocker's uk stream last night? Just curious they got the record, they were still going when I left


----------



## Paul17041993

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Just did a 4hr session on my wc 290's at stock clocks 35c idle 44c full load temps avg 170fps and so smooth
> 
> 
> 
> 
> 
> 
> 
> 
> My gpuz doesn't show vrm temps what makes yours so special ?


If it's the ASUS DCII, they usually don't have VRM sensors; otherwise, what card have you got?


----------



## kpoeticg

My VRM sensors are either readable or not, like every other boot.

I have an Aida64 widget on my desktop. When I boot my PC, it's about a 50/50 shot whether they'll be there or not.

But then again, that card's on its way back to Newegg right now, and I have the new one plus a second coming next week. So we'll see what the future brings


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Paul17041993*
> 
> if its the ASUS DCII, they usually don't have VRM sensors, otherwise what card you got?


One Gigabyte and one Sapphire, both waterblocked, flashed with the Asus 290X bios on switch position 1 and a modded Asus PT1T bios on position 2, on a RIVE 2011 motherboard


----------



## Outlaw02

Joining in the bandwagon late. Here's mine.

http://www.techpowerup.com/gpuz/eqdmf/

It's a Powercolor R9-290 PCS+


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 1 gigabyte and a sapphire waterblocked and flashed with asus 290x bios on switch 1 and a modded Asus PT1T bios on position 2 on both cards on a RIVE 2011 M/board


Home, see if your DPI is set too low.


----------



## Jackalito

Quote:


> Originally Posted by *Arizonian*
> 
> Welcome to OCN and the 290X / 290 Club 'new guy in town'. Congrats - added


Thank you very much! It's amazing how much I can learn just by reading you guys!
Amazing place to share experiences and ask questions


----------



## kpoeticg

Yeah OCN is a pretty great place, until one day you can't afford to go grocery shopping and are suddenly wondering what compelled you to spend more money on your computer than your car


----------



## hks215

I'm having a problem here. My card is an MSI 290 Gaming. What happened was I flashed it to the Tri-X bios to see if I could get lower temps,
but I didn't, so I flashed back to the stock bios. Now every time I start my computer the GPU core clock is at 977MHz instead of 300MHz; the memory clock is fine at 150, it's just the core clock. How do I fix it? I tried reinstalling CCC and it's still happening.
I need to enable graphics Overdrive to bring the clock back to normal, and even if I leave Overdrive enabled at startup it's still the same.
I use MSI AB so I can't leave it on anyway. Anybody know a fix?


----------



## Brian18741

Ok so for anyone who was following, my Asus R9 290s DCU II in crossfire were hitting ridiculous temperatures. I was hitting 94°C on the core on the top card when leaving the fans on auto!

So I've been swapping thermal paste to see if there are any improvements. Results below. Making definite progress: I tried some Arctic Cooling MX-4 last week and the results were fairly disappointing.

This weekend I have tried Gelid OC Extreme and so far this paste is giving the best results. Again though, this is at 100% fans on both cards and case fans so it's very loud.

Watercooling is still the only viable option. Oh well, may start saving!









Stock clocks and voltages. Fans 100%. Case fans 100%. Over 1 hour of gaming.


----------



## pkrexer

I still can't seem to correct the issue I'm having. Soon as my monitors go into sleep mode and I wake them up again, my voltage is being reset to default. This makes it difficult to overclock since I get screen corruption unless I have some extra voltage applied.

I use gpu tweak which doesn't have an option to disable ulps... so I've tried disabling ulps through the registry, but it doesn't seem to make a difference. The voltage is still being reset.

Anyone know of a fix?
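For reference, the registry route I tried amounts to setting an `EnableUlps` DWORD to 0 under each numbered display-adapter subkey. A hypothetical Python sketch of that tweak (the class-key GUID and value name are the commonly reported ones, not something I've verified against GPU Tweak's own behavior; edit the registry at your own risk):

```python
import sys

# Commonly reported location of the per-adapter driver keys (assumption).
ADAPTER_CLASS = (r"SYSTEM\CurrentControlSet\Control\Class"
                 r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

def adapter_subkeys(names):
    """Keep only the numbered per-adapter subkeys (0000, 0001, ...)."""
    return [n for n in names if n.isdigit() and len(n) == 4]

def disable_ulps():
    import winreg  # Windows-only module
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ADAPTER_CLASS) as cls:
        names = [winreg.EnumKey(cls, i)
                 for i in range(winreg.QueryInfoKey(cls)[0])]
    for name in adapter_subkeys(names):
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE,
                            ADAPTER_CLASS + "\\" + name, 0,
                            winreg.KEY_SET_VALUE) as key:
            winreg.SetValueEx(key, "EnableUlps", 0, winreg.REG_DWORD, 0)

if __name__ == "__main__" and sys.platform == "win32":
    disable_ulps()
```

A reboot (or at least a driver restart) is usually needed before the change takes effect.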


----------



## Roboyto

Quote:


> Originally Posted by *Brian18741*
> 
> Ok so for anyone who was following, my Asus R9 290s DCU II in crossfire were hitting ridiculous temperatures. I was hitting 94°C on the core on the top card when leaving the fans on auto!
> 
> So I've been swapping thermal paste to see if there's any improvements. Results below. Making definite progress, I tried some Atric Cooling MX-4 last week and the results were fairly disappointing.
> 
> This weekend I have tried Gelid OC Extreme and so far this paste is giving the best results. Again though, this is at 100% fans on both cards and case fans so it's very loud.
> 
> Watercooling is still the only viable option. Oh well, may start saving!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Stock clocks and voltages. Fans 100%. Case fans 100%. Over 1 hour of gaming.


Glad to see there is some improvement for your efforts. It's now quite obvious the DC2 cooler is inadequate in this case.


----------



## taem

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Just did a 4hr session on my wc 290's at stock clocks 35c idle 44c full load temps avg 170fps and so smooth
> 
> 
> 
> 
> 
> 
> 
> 
> My gpuz doesn't show vrm temps what makes yours so special ?


You don't need to see any temps, since you're already overkill on temps. Those monitors are for us poor air coolers.
Quote:


> Originally Posted by *Brian18741*
> 
> Fans 100%. Case fans 100%.


That must be unbearable lol. I'm finding my Powercolor and Sapphire cards' fans intolerable past 60%, and 60% isn't exactly friendly either. What's your case and case fan setup though?


----------



## Red1776

Try using TriXX or Afterburner. TriXX has a ULPS disable option and won't reset during sleep/hibernate.

https://www.sapphireselectclub.com/ssc/TriXX/


----------



## signex

I got myself an Asus R9 290 DCII OC, but it keeps underclocking?? And GPU usage keeps bouncing between 100% and 0%, and sometimes the 50's.


----------



## Arizonian

Quote:


> Originally Posted by *Outlaw02*
> 
> Joining in the bandwagon late. Here's mine.
> 
> http://www.techpowerup.com/gpuz/eqdmf/
> 
> It's a Powercolor R9-290 PCS+


Congrats - added


----------



## quakermaas

Quote:


> Originally Posted by *signex*
> 
> I got myself a Asus R9 290 DCII OC, but it keeps underclocking?? And GPU Usage keeps going to 100 and 0% and sometimes 50's.


That looks like a very old MSI AB, update to the latest 3.0 beta 18


----------



## signex

Quote:


> Originally Posted by *quakermaas*
> 
> That looks like a very old MSI AB, update to the latest 3.0 beta 18


Ok i'll try that one.


----------



## Forceman

Quote:


> Originally Posted by *hks215*
> 
> im having a problem here my card is a msi 290 gaming what happened was i flashed it to the tri x bios to see if i can get lower temps
> but didnt so i flashed back to the stock bios and now every time when i start my comp the gpu core clock is at 977mhz instead of 300mhz the memory clock is fine at 150 just the core clock how do i fix tried reinstalling ccc still happening
> i need to enable graphics overdrive to make the clock back to normal if i leave enable graphics overdrive on when start up its still the same
> i use msi ab so i cant leave it on anyway anybody know a fix


Which driver? There were some issues with the 14.x drivers not downclocking, so it may be that and not your card/BIOS.


----------



## taem

In-game temps with Sapphire 290 Tri-X/Powercolor PCS+ 290 xfire going well










These are both very good coolers I think. I'd still like to get the fans lower because fps is pointlessly high for my 60hz display so I can downclock these a fair amount. Anyone know of a modded bios that will let me undervolt?


----------



## Brian18741

Quote:


> Originally Posted by *taem*
> 
> In-game temps with Sapphire 290 Tri-X/Powercolor PCS+ 290 xfire going well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> These are both very good coolers I think. I'd still like to get the fans lower because fps is pointlessly high for my 60hz display so I can downclock these a fair amount. Anyone know of a modded bios that will let me undervolt?


How do you read that table, dude? Looks like you got 64C on the top and 57C on the bottom card?


----------



## taem

Fire Strike is very unimpressive in xfire. CPU usage hit 99%; I guess I'm throttling like Paul said?

Here is the single 290 I was running at 1200/1500


Xfire 290s at 1040/1350

Quote:


> Originally Posted by *Brian18741*
> 
> how do you read that table dude? Looka like yu got 64C on top and 57C on bottom card?


I edited the image upload to show which column is what. I guess you don't use HWInfo? You should give it a try. The author has a thread here where he answers your questions too, an incredibly useful tool.


----------



## ArchieGriffs

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Just did a 4hr session on my wc 290's at stock clocks 35c idle 44c full load temps avg 170fps and so smooth
> 
> 
> 
> 
> 
> 
> 
> 
> My gpuz doesn't show vrm temps what makes yours so special ?


I had this issue up until 2 days ago. The problem, for me at least, was that Windows scaled my text to be larger. It's kind of confusing to explain why that would affect GPU-Z, so I'll just tell you how I solved it: right click on the desktop, hit Screen Resolution, then there's a link that says "make text and other items smaller or larger". Windows sometimes defaults it to 125%; set it to 100% and it will show the rest of GPU-Z. If you aren't running Windows 7 or 8 I don't know how to fix it, but searching for your specific operating system and how to change text size should do the trick.


----------



## vortex240

Quote:


> Originally Posted by *Brian18741*
> 
> Ok so for anyone who was following, my Asus R9 290s DCU II in crossfire were hitting ridiculous temperatures. I was hitting 94°C on the core on the top card when leaving the fans on auto!
> 
> So I've been swapping thermal paste to see if there's any improvements. Results below. Making definite progress, I tried some Atric Cooling MX-4 last week and the results were fairly disappointing.
> 
> This weekend I have tried Gelid OC Extreme and so far this paste is giving the best results. Again though, this is at 100% fans on both cards and case fans so it's very loud.
> 
> Watercooling is still the only viable option. Oh well, may start saving!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Stock clocks and voltages. Fans 100%. Case fans 100%. Over 1 hour of gaming.



Quote:


> Originally Posted by *Roboyto*
> 
> Upgrade the factory thermal paste for the GPU. Upgrading the thermal pads for the main VRMs could help or hurt...its hard to say. One reason the core runs so hot is the blower first cools the VRMs which run very hot. That hot air then is blown across the core...not the best design
> 
> Some paste like Arctic MX-4, Gelid GC Extreme, or Noctua NT-H1 would probably lower core temperatures. If you'd like to go the route of best-performing TIM, it would be CLP or CLU, Coollaboratory Liquid Pro/Ultra. You must be cautious when applying these since they are liquid metal. The best application method seen on this thread is using masking tape around the GPU core to make sure it doesn't get anywhere else.
> 
> VRM Thermal Pads and temperatures: http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures#post_21821258
> 
> Your best bet is likely aftermarket cooling unfortunately, *but I'd be interested to know what would happen with good TIM on the core and a thermal pad upgrade for the VRM1 on a reference cooler could do.*


ASUS screwed up with the DC2 heatsink; Tom's Hardware covered that issue. Out of the 5 heatpipes it has, only 3 make contact, and of those 3, 1 only barely. Get a thin square/rectangular copper plate between the core and those heatpipes like all the other manufacturers did. A user on another forum did that with great results on the DC2.


----------



## vortex240

Quote:


> Originally Posted by *Prozillah*
> 
> 14.2 work brilliantly on dx. Mantle didn't look visually as nice personally. Constant 100+ fps on xfire 290's max settings super sampling 115%. Happy.


So, are we going to see a picture of that 1500watt draw or the equipment you are using for that reading?


----------



## Roboyto

Quote:


> Originally Posted by *vortex240*
> 
> ASUS screwed up with the DC2 heatisk, Tom's Hardware covered that isssue. Out of the 5 heatpipes it has, only 3 make contact. And out of those 3, 1 just barely. Get a thin square/rectangle copper plate between the core and those heatpipes like all the other manufaturers did. A user on another forum did that with great results on the DC2.


Would it be possible to use a copper shim to make contact with all of the heat pipes?


----------



## Coree

Hey guys, I encountered a weird thing today :O
I repasted my GPU TIM, as I had put on too much earlier. I put the cooler + vrm heatsinks + fans back.
Then I put it in the PCI-E slot, and it fit tightly as normal. But when I tried to power on my PC, it didn't turn on. (All power connectors were plugged in.) I just heard a click sound from the PSU (OVP protection?)
I panicked a bit, but later I loosened the GPU heatsink screws, put it back in the slot, and everything booted just fine.

Has anyone had this same weird issue before? Can too much mounting pressure cause this?


----------



## vortex240

Quote:


> Originally Posted by *Roboyto*
> 
> Would it be possible to use a copper shim to make contact with all of the heat pipes?


That's what I'm describing. Just have a look at the Tom's Hardware article where they compare all the R9 290 custom cards. It's a very simple fix.


----------



## axiumone

Here we are then. All 4 cards have arrived.


----------



## keikei

Hi,

finally got the rig updated. May I join da club?







Please excuse the mess that is my case.





Spoiler: Warning: Spoiler!







Models: SAPPHIRE TRI-X R9 290
Cooling: Stock


----------



## Roy360

Quote:


> Originally Posted by *axiumone*
> 
> Here we are then. All 4 cards have arrived.


where did you get those? I can't find a single retailer with those cards.
Quote:


> Originally Posted by *vortex240*
> 
> ASUS screwed up with the DC2 heatisk, Tom's Hardware covered that isssue. Out of the 5 heatpipes it has, only 3 make contact. And out of those 3, 1 just barely. Get a thin square/rectangle copper plate between the core and those heatpipes like all the other manufaturers did. A user on another forum did that with great results on the DC2.


Man, I just bought one of those today. No wonder NCIX price matched it for me at 539.99


----------



## anubis1127

Quote:


> Originally Posted by *axiumone*
> 
> Here we are then. All 4 cards have arrived.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


----------



## King4x4

Cards are under water.... EK blocks work wonders! VRMs never go over 50°C, and neither does the core!


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *taem*
> 
> Fire Strike very unimpressive in xfire. Cpu usage hit 99%, I guess I'm throttling like Paul said?
> 
> Here is the single 290 I was running at 1200/1500
> 
> 
> Xfire 290s at 1040/1350
> 
> I edited the image upload to show which column is what. I guess you don't use HWInfo? You should give it a try. The author has a thread here where he answers your questions too, an incredibly useful tool.


My best CF Firestrike



http://www.3dmark.com/fs/1784841

Single card Firestrike



http://www.3dmark.com/fs/1412352

You can turn off Tess in CCC for some extra pts. It won't be allowed in the HOF, unfortunately









Quote:


> Originally Posted by *rdr09*
> 
> Home, see if your dpi is set to low.
> Quote:
> 
> 
> 
> Originally Posted by *ArchieGriffs*
> 
> I had this issue up until 2 days ago. The problem for me at least was that windows scaled my text to be larger. It's kind of confusing to explain why that would affect gpu-z, so I'll just tell you how I solved it: right click when you're at the desktop, hit screen resolution, then there's a link that says "make text and other items smaller or larger". Windows sometimes default sets it at 125%, set it to 100% and it will show the rest of gpu-z. If you aren't running windows 7 or 8 i don't know how to fix it, but searching for your specific operating system on how to change text size should do the trick.

Thanks guys that's all it was LoooL


----------



## axiumone

Quote:


> Originally Posted by *Roy360*
> 
> where did you get those? I can't find a single retailer with those cards.


I was able to get them straight from visiontek.

http://www.visiontekproducts.com/index.php/component/virtuemart/graphics-cards/visiontek-cryovenom-liquidcooled-series-r9-290-detail?Itemid=0

Click on notify me and they'll email you when they have more stock.

Also, I just noticed that they raised the price. I bought them for 550 apiece; now they are 600


----------



## Coree

Quote:


> Originally Posted by *King4x4*
> 
> Cards are under water.... EK blocks are the works! Vrms never go over 50'C and core too!


Holy crap, Batman, those look nice. What fans are those? They look similar to the Gentle Typhoons...


----------



## axiumone

King, what did you use for thermal pads? Stock EK stuff or something else?


----------



## hks215

@Forceman
I'm using 14.2. It didn't happen before; it only happened after I flashed.


----------



## Forceman

Quote:


> Originally Posted by *hks215*
> 
> Reply forceman
> I'm using 14.2 it didn't happen before just after I flashed it happened


Hmm. Did you reinstall the drivers after the flash?


----------



## borderkill666

Hi Guys,

I want to buy a 290X from Gigabyte. Does this card also have the option to monitor vrm temps?
It's based on the reference design and uses the same voltage controller, but I read somewhere the card has no option to monitor the vrms.

I'd be glad if someone could tell me, or maybe someone has a GPU-Z screen of the sensor readings of the Gigabyte 290X WindForce.


----------



## Prozillah

Quote:


> Originally Posted by *vortex240*
> 
> So, are we going to see a picture of that 1500watt draw or the equipment you are using for that reading?


 one quick pic for ya


----------



## Prozillah

Just loaded up BF4, not even in a game battle haha. The meter is plugged into a four-plug power strip with the PC's power cord connected to that. I would have thought that having it plugged into a power strip wouldn't have any effect on the power draw??


----------



## King4x4

Quote:


> Originally Posted by *Coree*
> 
> Holy crap batman those look nice. What fans are those? Looks similar like the Gentle Typhoons..


Those are Evercool Titan Kukri fans.

Beasty little things with an 800-2200rpm range... they push a lot of air but are noisy. I am dialing them down to 1000rpm.

Can't beat the price... got them at $8 a pop!









Quote:


> Originally Posted by *axiumone*
> 
> King, what did you use for thermal pads? Stock EK stuff or something else?


Stock EK and no paste added.

Btw, tried the PT1 and PT3 bios... was getting constant crashes... then tried the Asus bios... same. Tried the Tri-X bios and it stuck with no crashes... gonna test a bit and see if they do any better than the stock ones on voltage settings.


----------



## pkrexer

So I updated to the 14.2 drivers and it seems to have fixed my problem with my voltage being reset after waking my monitors from sleep... but now it doesn't seem to be applying my power limit, getting throttling when I set anything beyond +50mv. I tried increasing it through both AB and CCC, doesn't seem to do anything.


----------



## Ilsanto86

Quote:


> Originally Posted by *borderkill666*
> 
> Hi Guys,
> 
> I want to buy a 290x from Gigabyte, does this card also have the option to monitor vrm temps?
> Its based on the reference design and uses the same voltage controller but I read somewhere the card has no option to monitor the vrms.
> 
> I'm glad if someone could tell it me, or maybe has a GPU-Z screen of the sensor readings of the Gigabyte 290X WindForce.


I have a Gigabyte R9 290X OC WindForce and it has the option to monitor vrm temps in GPU-Z or Aida64


----------



## borderkill666

Quote:


> Originally Posted by *Ilsanto86*
> 
> I have a Gigabyte R9 290XOC WindForce and it have the option to monitor vrm temps on Gpu-z or Aida64


Thank you for the answer, good to know it.


----------



## Paul17041993

Quote:


> Originally Posted by *Prozillah*
> 
> one quick pic for ya


1500W at 80% eff. is 1200W internal power; at 70% eff., 1050W. What PSU and what voltage do you use, 110 or 220V?
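In code, that efficiency arithmetic is just the following (the efficiency figures are illustrative, not measured for any particular PSU):

```python
def dc_power(wall_watts, efficiency):
    """DC power delivered to the components for a given wall-socket draw."""
    return wall_watts * efficiency

# The two cases above: a 1500W wall reading at 80% or 70% PSU efficiency.
print(dc_power(1500, 0.80))  # 1200.0
print(dc_power(1500, 0.70))  # 1050.0
```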


----------



## Prozillah

Quote:


> Originally Posted by *Paul17041993*
> 
> 1500W, @80% eff. thats 1200W internal power, @70% eff, 1050W, what PSU and what voltage do you use? 110 or 220V?


NZ uses 240V AC, so most appliances run at 230V @ 50Hz. My PSU is a Silverstone Strider Gold Evolution 1200W (1300W peak). Since getting this PSU I haven't had one random restart, which is excellent. The 1000W PSU I had prior just wasn't enough


----------



## Prozillah

Quick note on the 14.2 drivers: scratched them and went back to 13.12. Waaaay better consistency and avg fps overall. Mantle looked like **** and was CPU lag spiking every 10 seconds or so, and DX was just kinda choppy and glitchy. 13.12 runs beaut.


----------



## taem

Quote:


> Originally Posted by *Prozillah*
> 
> one quick pic for ya


That's crazy! I've got a 4670K @ 4.7/1.275v and two 290s @ 1080/1350, and the highest reading I get at the wall is 647W. That includes 8 fans, drives, etc.


----------



## Forceman

Quote:


> Originally Posted by *Prozillah*
> 
> Nz uses 240ac so most appliances run on 230v @ 50hz. My psu is silverstone strider gold evolution 1200w (1300w @ peak). Since getting my psu I havent had one random restart which is excellent. The 1000w psu i had prior just wasn't enough


Kinda makes you wonder if the 120/240 is messing up the meter. You are drawing double what Taem is seeing.


----------



## Prozillah

Possibly? But looking at taem's post with a very similar setup, mine is almost double??


----------



## rickyman0319

There is a switch on the board near the I/O plate. Which position is silent and which is uber?


----------



## VSG

Quote:


> Originally Posted by *Prozillah*
> 
> Possibly? Looking at taems post with a very similar set up mine is almost double? ?


I missed out on a lot of this, but could you please tell me what were your config settings and what you were running that led to that 1457w pull from the wall?


----------



## Iniura

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> My best CF Firestrike
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/1784841
> 
> Single card Firestrike
> 
> 
> 
> http://www.3dmark.com/fs/1412352
> 
> You can turn off Tess in CCC for some extra pts . It wont be allowed in the HOF unfortunately
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks guys that's all it was LoooL


Is it allowed to turn off Tess in CCC for Hwbot submissions?
Quote:


> Originally Posted by *TommyGunn123*
> 
> Anyone watch the overclocker's uk stream last night? Just curious they got the record, they were still going when I left


Yeah, I watched; they hit the 3DMark Vantage - Performance 4x GPU world record. You can see the score here: http://hwbot.org/benchmark/3dmark_vantage_-_performance/rankings?cores=4#start=0#interval=20
They got 6 WRs in 3 days: 720P Catzilla 3- and 4-card, 1440P Catzilla 3- and 4-card, 3DMark 11 2-card, and Vantage 4-card.


----------



## VSG

Quote:


> Originally Posted by *Iniura*
> 
> Is it allowed to turn off Tess in CCC for Hwbot submissions?


No, it is still a good score though. I wouldn't mind knowing what he got on graphics score with Tess on.


----------



## Iniura

Quote:


> Originally Posted by *geggeg*
> 
> No, it is still a good score though. I wouldn't mind knowing what he got on graphics score with Tess on.


I think the one he submitted was 12077, good for #15 on H2O.

Edit: oh sorry, misread; that's probably the one without Tess.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *geggeg*
> 
> No, it is still a good score though. I wouldn't mind knowing what he got on graphics score with Tess on.


Yes you can; depends what bench


----------



## Iniura

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yes you can depends what bench


Firestrike Extreme allowed with Tess on Home?


----------



## VSG

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yes you can depends what bench


I assumed we were talking about FS/FSE.


----------



## Paul17041993

Quote:


> Originally Posted by *Prozillah*
> 
> Nz uses 240ac so most appliances run on 230v @ 50hz. My psu is silverstone strider gold evolution 1200w (1300w @ peak). Since getting my psu I havent had one random restart which is excellent. The 1000w psu i had prior just wasn't enough


OK, well that sounds about right if you were peaking a 1kW PSU, but I wonder how your rig could be pulling quite so much; surely the cards aren't hitting their 375W (per card) TDP...?


----------



## vortex240

Quote:


> Originally Posted by *Prozillah*
> 
> Just loaded up bf4 not even in game battle haha. Meter is plugged into a four plug with the power cord pc connected to that. I would have thought that having I plugged into a four plug wouldn't have any effect on the power draw??


LOL... from the looks of it, it's trying to do power factor correction and clearly not accurately at all. Have you measured it against a source that you know consumes a given amount of power? I can tell you it's not accurate at all, or you are not using the correct setting. Basically it is showing you two times what it should.
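The relationship I mean: a meter on the wrong setting can report apparent power (VA) rather than real power (W), and the two differ by the load's power factor. A rough sketch with purely illustrative numbers:

```python
def real_power(apparent_va, power_factor):
    """Real power (W) from apparent power (VA) and the load's power factor."""
    return apparent_va * power_factor

# If the meter were really showing ~1460 VA on a load with a 0.5 power
# factor, the true draw would be only ~730 W, i.e. "two times what it should".
print(real_power(1460, 0.5))  # 730.0
```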


----------



## vortex240

Quote:


> Originally Posted by *Roy360*
> 
> where did you get those? I can't find a single retailer with those cards.
> man I just bought one of those today. No wonder NCIX price matched it for me to 539.99


Lol, how many cards are you buying? Weren't you getting an AMD card?


----------



## Forceman

Quote:


> Originally Posted by *rickyman0319*
> 
> there is a switch on the board near the I/o plate. which one is silent and uber?


On a 290X silent is toward the display connectors and uber is toward the power connectors. On a 290 they are the same.


----------



## taem

How safe is it to use atiwinflash to flash the bios of a custom 290? Since the cards are all dual bios it should be safe right? As long as I'm not using a crazy bios meant for WC or ln2 that will scorch my air cooled cards?

Is there a bios out there for 290 that allows for undervolting? Is there any way to see what the fan curve is like in a given bios? I'd like to just get my settings set in bios and not deal with AB or Trixx. Is there an easy way to edit my own custom bios?

Sorry for basic questions I've never done this before.


----------



## pkrexer

Just flashed my Asus 290X with the Powercolor 290X PCS+ bios. It might just be the placebo effect, but my card seems to have gained some stability at higher clocks. Plus it's nice to have the +50mV added by default.

Going to do some further testing.


----------



## Roy360

Quote:


> Originally Posted by *vortex240*
> 
> Lol, how many cards are you buying? Weren't you getting an AMD card?


I want to end up with quadfire








Got one XFX R9 290 when it first came out 409.99 from NCIX
Got a VisionTek R9 290 for 339.99$ from Dell
Got an ASUS R9 290 DU OC for 539.99 from NCIX <---Not sure if this one is a keeper
the last one is undecided.

I have an AX1200i, so hopefully that will be enough to power these cards, 100A on the 12V rail, plus max power draw is ~1504W

Decided not to get the internal sample. Tons of people on rfd said it wasn't worth the risk, the seller was a bit of an @sshole too, so I didn't bother


----------



## hks215

Quote:


> Originally Posted by *hks215*
> 
> I'm using 14.2. It didn't happen before; it only started after I flashed.


Quote:


> Originally Posted by *Forceman*
> 
> Hmm. Did you reinstall the drivers after the flash?


I fixed it: I did a system restore and it worked. I think my computer still thought the Tri-X BIOS was active. I flashed it, then flashed back to the stock BIOS, but the system still treated it as the Tri-X BIOS. I think that was the cause.


----------



## Red1776

You are going to need a bit more power than 1200W for quad 290X; you'll want some overhead/margin. I have a 4 x 7970 rig and am running 2.2kW (including an AX1200).

I am assembling a 4 x R9 290X rig now and am running a 2.3kW power config. At the least I would look at 1600W for quad 290Xs. If you do benching, you really don't want your PSU config running at or close to 100%.

Just my 2 cents


----------



## phallacy

Quote:


> Originally Posted by *Roy360*
> 
> want to end up with Quad fire
> 
> 
> 
> 
> 
> 
> 
> 
> Got one XFX R9 290 when it first came out 409.99 from NCIX
> Got a VisionTek R9 290 for 339.99$ from Dell
> Got an ASUS R9 290 DU OC for 539.99 from NCIX <---Not sure if this one is a keeper
> the last one is undecided.
> 
> I have an AX1200i, so hopefully that will be enough to power these cards, 100A on the 12V rail, plus max power draw is ~1504W
> 
> Decided not to get the internal sample. Tons of people on rfd said it wasn't worth the risk, the seller was a bit of an @sshole too, so I didn't bother


Quote:


> Originally Posted by *Red1776*
> 
> You are going to need a bit more power than 1200w for quad 290x. You will want a bit of overhead/margin. I have a 4 x 7970 and am running 2.2kW (including a AX 1200)
> I am assembling a 4 x R290X rig now and am running 2.3kW power config. at the least I would look at 1600w for quad 290X's. If you do benching, you really don't want your PSU config running at or close to 100%
> Just my 2 cents


I made a thread in the PSU section about this just yesterday. Take a look and weigh in if you're interested.

http://www.overclock.net/t/1470607/dual-psu-or-1600w-single-for-4-way-290x

I'm in the same boat. Running trifire currently with an EVGA SuperNOVA 1300W. I doubt it will cut it for 4-way. Only problem is my current case has no room for a 2nd PSU =/. Can't tell exactly what I'm drawing from the wall since I don't have any sort of multimeter (yet), but ever since connecting my third card I can hear the PSU fan, so it's gotta be close to its limit. Especially when benching.


----------



## King4x4

So got #15 on FSE Hall of Fame... hmm I like!









http://www.3dmark.com/fs/1799345

Gonna try to push for more tonight... but my cards clock like crap.... 1180/1500 is not fun









On stock bios I was getting 1200mhz/1500mhz but I was scoring 15k.... Switched to Tri-X bios and started getting [email protected]


----------



## taem

So I took out the 290 Tri-X and am going to return it. I should have listened to you guys who were trying to tell me 290 CrossFire would be bottlenecked by my 4670K. It's just not worth it to run CrossFire at this point: my PSU can't handle two overclocked cards and my CPU can't handle the full power anyway. Since I can OC a single card, CrossFire at stock clocks gets me a 20% boost at best.

But now I'm having issues with my single card. It was running fine with no downclocking; now the clock fluctuates quite a bit -- as much as 200MHz. It's not temps, it's not the PSU; it wasn't doing it before, now it is. What are the potential causes of core clock fluctuation and how can I fix it?
Quote:


> Originally Posted by *King4x4*
> 
> On stock bios I was getting 1200mhz/1500mhz but I was scoring 15k.... Switched to Tri-X bios and started getting [email protected]


At the same clocks my Tri-X 290 out-benches my PowerColor 290 by a decent margin.


----------



## bbond007

my new toy.

I do have a question: is there a way to lower the maximum temperature before throttling on these from 94C to something lower, like 85C?

http://www.techpowerup.com/gpuz/8hena/
http://www.techpowerup.com/gpuz/vc44b/
http://www.3dmark.com/fs/1800627 <- 15039
Valley Bench Mark <- 4401 @ 105.2 FPS

MSI Gaming R9 290X x 2 <- ASIC 76.5% & 76.0%
stock cooling with lots of fans.
CPU Intel Xeon 1230v3 with 16GB Patriot Sapphire Blue RAM







Cheers!


----------



## Forceman

Quote:


> Originally Posted by *bbond007*
> 
> my new toy.
> 
> I do have a question. is there a way to lower the maximum temp before throttling on these from 94c to something lower like 85c?
> 
> Cheers!


Change the Target GPU Temperature setting in the OverDrive portion of CCC. That'll try to make the card hold whatever temp you set (although I'm not sure if it does that with just fan speed, or if it will also throttle the card).


----------



## Paul17041993

Quote:


> Originally Posted by *Red1776*
> 
> You are going to need a bit more power than 1200w for quad 290x. You will want a bit of overhead/margin. I have a 4 x 7970 and am running 2.2kW (including a AX 1200)
> I am assembling a 4 x R290X rig now and am running 2.3kW power config. at the least I would look at 1600w for quad 290X's. If you do benching, you really don't want your PSU config running at or close to 100%
> Just my 2 cents


You can always undervolt and underclock the cards if you want to; they handle it quite well. Someone here did it with his 290 trifire with quite spectacular results, though his were on air, so the lower power draw meant no throttling.


----------



## vortex240

Quote:


> Originally Posted by *Roy360*
> 
> want to end up with Quad fire
> 
> 
> 
> 
> 
> 
> 
> 
> Got one XFX R9 290 when it first came out 409.99 from NCIX
> Got a VisionTek R9 290 for 339.99$ from Dell
> Got an ASUS R9 290 DU OC for 539.99 from NCIX <---Not sure if this one is a keeper
> the last one is undecided.
> 
> I have an AX1200i, so hopefully that will be enough to power these cards, 100A on the 12V rail, plus max power draw is ~1504W
> 
> Decided not to get the internal sample. Tons of people on rfd said it wasn't worth the risk, the seller was a bit of an @sshole too, so I didn't bother


I'm also running an internal sample; my company buys a lot of FireGL cards and we got 5 internal 290Xs as a gift last December. Awesome chip. Not sure why they told you there is a risk; RMA is easy with these, just call and give them the serial and they take care of the rest. Haha, last time I RMA'd a 7970 they even called me a week later to ask if everything was OK. Oh well, how did you pay so little for one from Dell?

BTW, that AX1200i most likely won't cut it for anything more than stock clocks. Here is some math:

4 x 250W (R9 290) = 1000W, leaving you with 200W for the rest of the system. There is nothing wrong with running the AX1200 near its peak, but peak numbers are not meant to be taken as the rated sustained draw. You will either A) kill the PSU or B) trip its over-current protection and shut down.
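That arithmetic can be sketched as a quick headroom check. The 250W figure is the nominal R9 290 board power and the 200W rest-of-system figure is an assumption, so treat the output as back-of-envelope only, not a worst-case draw:

```python
# Back-of-envelope PSU headroom check. TDP and system figures are
# nominal assumptions, not measured worst-case numbers.
GPU_TDP_W = 250      # rough reference R9 290 board power
PSU_RATED_W = 1200   # AX1200i continuous rating

def headroom(n_gpus, rest_of_system_w=200):
    """Watts left on the PSU's continuous rating at the assumed load."""
    load = n_gpus * GPU_TDP_W + rest_of_system_w
    return PSU_RATED_W - load

print(headroom(4))  # -> 0: quad-fire at stock leaves no margin at all
print(headroom(3))  # -> 250: trifire leaves about 250W of margin
```

Zero headroom at stock clocks is exactly why the thread keeps recommending 1600W+ for quad 290X: any overclock or transient spike goes straight past the rating.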


----------



## taem

Quote:


> Originally Posted by *Paul17041993*
> 
> you can always undervolt and underclock the cards if you wanted to, they handle it quite well and someone here did it with his 290 trifire with quite spectacular results, though his was on air so less power draw meant no throttling.


How do you undervolt a 290/x? Neither the Powercolor PCS+ nor the Tri-X 290 allow undervolting.


----------



## Paul17041993

Quote:


> Originally Posted by *taem*
> 
> How do you undervolt a 290/x? Neither the Powercolor PCS+ nor the Tri-X 290 allow undervolting.


Afterburner: set the voltage offset as low as -100mV. Trixx unfortunately doesn't have undervolt support, but I'm pretty sure you should be able to do it with any card in Afterburner, unless of course you don't have voltage control in there at all...


----------



## taem

Quote:


> Originally Posted by *Paul17041993*
> 
> afterburner, set voltage to as low as -100mV? trixx unfortunately doesn't have undervolt support but I'm pretty sure you should be able to do it with any card in afterburner, unless of course you don't have voltage control in there entirely...


Tried AB. I can adjust the voltage up, but it won't let me go below 0. When I try, it just bounces back to 0.


----------



## vortex240

Quote:


> Originally Posted by *taem*
> 
> How do you undervolt a 290/x? Neither the Powercolor PCS+ nor the Tri-X 290 allow undervolting.


Both cards use the reference PCB; I have tried both of their BIOSes on my reference AMD card. The PCS+ BIOS runs higher voltage (like a regular 290), so I didn't like it for the extra heat; the Tri-X runs the stock voltage of a reference 290X.

If you have a reference board, you can flash ANY, and I mean ANY, BIOS that came from a reference card. Hope this helps.


----------



## Jack Mac

Quote:


> Originally Posted by *taem*
> 
> Tried AB. I can adjust voltage up but it won't let me go below 0. When I try it just bounces back to 0.


That's an issue with the newer drivers; AB only allows undervolting on 13.12 and 13.11, it seems.


----------



## Roy360

Quote:


> Originally Posted by *vortex240*
> 
> I'm also running a internal sample, my company buys a lot of FireGL cards and we got 5 internal 290X as a gift last December. Awesome chip, not sure why they told you there is a risk, rma is easy with these, just call and give them serial, they take care of the rest. Haha, last time I rma'd a 7970 they even called me a week later to ask if everything was ok. Oh well, how did you pay so little for one from Dell?
> 
> BTW, that AX1200i most likely won't cut it for anything more than stock clocks. Here is some math:
> 
> 4 x 250W (R9 290) = 1000W, leaving you with 200W for the rest of the system. There is nothing wrong with running the AX1200 near its peak, but peak numbers are not meant to be taken as the rated sustained draw. You will either A) kill the PSU or B) trip its over-current protection and shut down.


Peak power on the AX1200 says 1504W: 204W for the system and 1200W on the 12V rail, but I'm probably pushing it. I guess I'll stop at 3 cards. Dang, if only I had the Phanteks Primo. I'd be running two PSUs; that would probably cost less than the AX1200.


----------



## pkrexer

Quote:


> Originally Posted by *taem*
> 
> So I took out the 290 Tri-X, gonna return it, I should have listened to you guys who were trying to tell me 290 xfire would be bottlenecked by my 4670k. It's just not worth it to run xfire at this point, my psu can't oc two cards and my cpu can't handle the full power anyway. Since I can oc a single card xfire at stock clocks is getting me a 20% boost at best.
> 
> But now I'm having issues with my single card. It was running fine with no down clocking, now clock fluctuates quite a bit -- as much as 200. It's not temps, it's not psu, it wasn't doing it before, now it is. What are the potential causes of core clock fluctuation and how can I fix it?
> At the same clocks my Tri-X 290 out benches my Powercolor 290 by a decent margin.


If you're using the 14.x beta drivers, I would try going back to the 13.12 drivers. When I was using the new betas, the power limit I set in either AB or CCC would not apply, and this caused my card to throttle. Rolled back to the WHQL drivers and all is good.


----------



## BradleyW

I'm glad AMD fixed the instant lock-ups seen on the 14.1 drivers. With those drivers I could not run a single game on my system without an instant crash as soon as the application began to load.


----------



## Hattifnatten

Quote:


> Originally Posted by *pkrexer*
> 
> If your using the 14 beta drivers, I would try going back to the 13.12 drivers. When I was using the new betas, *it would not apply the power limit that I set in either AB or CCC and this would cause my card to throttle*. Rolled back to the whql drivers, all is good.


That explains a lot. I was beginning to think I was already seeing degradation on my card due to mining.


----------



## ArchieGriffs

Quote:


> Originally Posted by *vortex240*
> 
> I'm also running a internal sample, my company buys a lot of FireGL cards and we got 5 internal 290X as a gift last December. Awesome chip, not sure why they told you there is a risk, rma is easy with these, just call and give them serial, they take care of the rest. Haha, last time I rma'd a 7970 they even called me a week later to ask if everything was ok. Oh well, how did you pay so little for one from Dell?
> 
> BTW, that AX1200i most likely won't cut it for anything more than stock clocks. Here is some math:
> 
> 4 x 250W (R9 290) = 1000W, leaving you with 200W for the rest of the system. There is nothing wrong with running the AX1200 near its peak, but peak numbers are not meant to be taken as the rated sustained draw. You will either A) kill the PSU or B) trip its over-current protection and shut down.


I concur; 4x would work on a 1200 assuming you don't overclock at all. I've seen 290s/290Xs with power draws of over 300W during the more extreme stress-test benches, but not really during normal gaming use at stock clocks.
Quote:


> Originally Posted by *BradleyW*
> 
> I'm glad AMD fixed the instant lock ups as seen on the 14.1 drivers. With said drivers, I could not run a single game on my system without instant crash as soon as the applications begins to load.


For me Crysis 3 wouldn't run, period; nothing would load when I hit the play button. I had success with Skyrim once, but the second time it wouldn't load (it's equally likely that it was due to mods rather than 14.2). BF4 was just about the only thing I could play that I wanted to at the time, so I ended up rolling back to 13.2.


----------



## sterob

Does anyone have experience with both the Asus 290 DC2 and the Sapphire 290 Tri-X? Between the two, which one runs cooler and quieter? Can the Asus be undervolted?


----------



## keikei

Quote:


> Originally Posted by *sterob*
> 
> does anyone have experience with both Asus 290 DC2 and Sapphire 290 tri-x? Between those 2 which one run cooler and quieter? Can Asus be undervolted?


I can vouch for the Tri-X; it's *extremely quiet*. I came from a reference 7970. When I leave the PC on overnight I don't hear it, and it's 5 feet away from my bed. When not gaming I leave both cards at 35% fan speed. Even cranked up to 60% for gaming it's not noticeable, only if you really focus on it (if you are playing with no sound). I will never go back to reference cards after experiencing the Tri-X.


----------



## ZealotKi11er

Quote:


> Originally Posted by *keikei*
> 
> I can vouch for the tri-x, its *extremely quiet*. I came from a reference 7970. When I leave the PC on overnight, I do not hear it and its 5 feet away from my bed. When not gaming i leave both card at 35% fan speed. Even cranked up to 60% for gaming, its not noticeable, only if you really focus on it (if you are playing with not sound). I will never go back to reference cards after experiencing the tri-x.


The reference HD 7970 is louder than the 290X, so that's not saying much.


----------



## Coree

Here are some clearer shots of my 4-slot beast.


----------



## taem

You know what's actually a pretty good overclocking app? CCC. I'm liking this. Obviously I can't achieve high clocks without the VDDC adjust, but going to what the stock voltage allows is enough for me. I was getting severe clock and usage fluctuation in both AB and Trixx; I don't get that with just CCC. At the same clocks, CCC is getting me better benches than either AB or Trixx, by a big margin -- running the Tomb Raider bench with AB gave me a minimum of 38fps; with CCC alone it became 58.
Quote:


> Originally Posted by *sterob*
> 
> does anyone have experience with both Asus 290 DC2 and Sapphire 290 tri-x? Between those 2 which one run cooler and quieter?


I have to disagree with keikei; I don't think the Tri-X is quiet. At 60%+ it's pretty noticeable. Temps are great though: I tried it as the upper card in air-cooled CrossFire in a silenced mid-tower. I expected temps in the high 70s, but I got gaming temps in the high 60s for core and VRM1 and high 50s for VRM2.


That's stellar IMHO; great cooling on this card. But again, only with the fans cranked a fair amount (71% here), and it's loudish that way. I would not get Asus for any of the R9 cards if you're air cooling; I don't think they did their cooling very well this round. You might get Asus if water cooling, since their custom PCB has better power delivery.

The only thing I don't like about the Tri-X is that there seems to be zero support for the PCB, and I hate sag.
Quote:


> Originally Posted by *Jack Mac*
> 
> That's an issue with the newer drivers, AB only allows undervolting on 13.12 and 13.11 it seems.


Quote:


> Originally Posted by *vortex240*
> 
> Both cards use reference PCB, I have tried both of their bioses on my reference amd card. The PCS+ bios runs higher volts(like a regular 290) - so I didn't like it for extra heat, Tri-X runs stock volts of a reference 290X.
> 
> If you have a reference board, you can flash ANY and I mean ANY bios that came from a reference card. Hope this helps.


It's precisely the stock +50mV on the PowerColor PCS+ I want to lose. I'd like the option of a very quiet profile with low clocks that runs at 40% fan, and then up the VDDC, clocks, and fans in a separate profile.

But even with 13.12 I cannot undervolt the PowerColor PCS+; it just keeps bouncing back to 0 (which is still a VDDC adjust of negative 50mV). The Tri-X would not undervolt on 14.2; dunno if it can undervolt on 13.12. Can anyone confirm?

So which BIOS should I flash to get undervolting? I need UEFI support. Actually, I suppose I could just flash the Tri-X BIOS and get a neutral VDDC default.


----------



## EliteReplay

tracking info


----------



## Roboyto

Quote:


> Originally Posted by *sterob*
> 
> does anyone have experience with both Asus 290 DC2 and Sapphire 290 tri-x? Between those 2 which one run cooler and quieter?


Vortex240 pointed this out the other day.

The ASUS DC2 has an issue where not all of the heatpipes make contact with the die. ASUS was lazy, atypical of them, and did not make a specific cooler for the 290; they reused the 780 cooler. Since the die on the 780 is so much larger, the heatpipes don't make proper contact with the Hawaii die. Unless ASUS has fixed, or is going to fix, the problem, you are likely better off with the Sapphire card.

http://www.tomshardware.com/reviews/radeon-r9-290-and-290x,3728-12.html


----------



## Arizonian

Quote:


> Originally Posted by *axiumone*
> 
> Here we are then. All 4 cards have arrived.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - welcome
















Quote:


> Originally Posted by *keikei*
> 
> Hi,
> 
> finally got the rig updated. May I join da club?
> 
> 
> 
> 
> 
> 
> 
> Please excuse the mess that is my case.
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Models: SAPPHIRE TRI-X R9 290
> Cooling: Stock


Congrats - added









Quote:


> Originally Posted by *King4x4*
> 
> Cards are under water.... EK blocks are the works! Vrms never go over 50'C and core too!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated









Quote:


> Originally Posted by *bbond007*
> 
> my new toy.
> 
> I do have a question. is there a way to lower the maximum temp before throttling on these from 94c to something lower like 85c?
> 
> http://www.techpowerup.com/gpuz/8hena/
> http://www.techpowerup.com/gpuz/vc44b/
> http://www.3dmark.com/fs/1800627 <- 15039
> Valley Bench Mark <- 4401 @ 105.2 FPS
> 
> MSI Gaming R9 290X x 2 <- ASIC 76.5% & 76.0%
> stock cooling with lots of fans.
> CPU Intel Xeon 1230v3 with 16GB Patriot Sapphire Blue RAM
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers!


Congrats - added









Just a reminder to anyone who hasn't yet or looking to join, please submit the following to be considered:

To be added on the member list please submit the following in your post

1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
2. Manufacturer & Brand - Please don't make me guess.
3. Cooling - Stock, Aftermarket or 3rd Party Water


----------



## IBIubbleTea

I have an ASUS R9 290. If I were to change the TIM, would I void my warranty?


----------



## Coree

Quote:


> Originally Posted by *IBIubbleTea*
> 
> I have a asus r9 290. If I were to change the Tim, will I void my warranty?


Technically yes, but just don't damage anything and don't mention the TIM change if RMAing.
IMO you will only get a 3-4C improvement if you swap it for MX-4.


----------



## Prozillah

Quote:


> Originally Posted by *taem*
> 
> So I took out the 290 Tri-X, gonna return it, I should have listened to you guys who were trying to tell me 290 xfire would be bottlenecked by my 4670k. It's just not worth it to run xfire at this point, my psu can't oc two cards and my cpu can't handle the full power anyway. Since I can oc a single card xfire at stock clocks is getting me a 20% boost at best.
> 
> But now I'm having issues with my single card. It was running fine with no down clocking, now clock fluctuates quite a bit -- as much as 200. It's not temps, it's not psu, it wasn't doing it before, now it is. What are the potential causes of core clock fluctuation and how can I fix it?
> At the same clocks my Tri-X 290 out benches my Powercolor 290 by a decent margin.


Can someone explain how a 4570K overclocked to 4.5GHz+ is bottlenecking 2x 290s? I know I'm seeing evidence of this when gaming in BF4, like load times and the load on the CPU, but why does CrossFire consume so much more of the CPU? I came from 2x 670s and it ran perfectly; why do these 290s seem to load the CPU so much more? And to add: what CPU out there would be considered the best for these cards?


----------



## phallacy

Quote:


> Originally Posted by *Prozillah*
> 
> Can someone explain how an overclocked 4570k to 4.5ghz + is bottlenecking 2x 290's? I know im seeing evidence of this when gaming in bf4 like load times and load amount on the cpu but why does it consume so much more of the cpu when in xfire? I came from 2x 670's and it ran perfectly - why is it that these 290s seem to be loading so much more on the cpu? And to add - what cpu out there would be considered the best for these cards?


I believe part of the reason is that CrossFire is now done over the PCIe bus rather than having a physical bridge connecting the cards. Also, the 290X in general is a more demanding GPU than the 670. I would say the 4930K is probably the best performance/price for the LGA2011 socket; they're made for multiple demanding GPUs.

Edit: to clarify, I don't believe a 4670K would bottleneck the cards by more than maybe 5% at most vs. LGA2011 for gaming. Benchmarks are much different, though. I'm using a 4770K for 3 cards currently and gaming performance is awesome. I know I'd gain some fps going to LGA2011, but I'm happy with the 4770K.


----------



## King4x4

No idea, Prozillah... I think it depends on your screen. If you are playing at 1080p you will most definitely be CPU-bottlenecked by a game that eats hexacores for a living!

In other news: fresh Windows 8.1 install, some tweaking, and voila!



http://www.3dmark.com/3dm/2579243


----------



## ZealotKi11er

I finally changed the EK thermal pad for VRM1. What an improvement.

Stock EK pad - 78C (mining)
Stock EK pad + thermal paste - 70C (mining)
Fujipoly Extreme - 52C (mining)

This was at stock voltage with the card @ 1000/1500.

In-game VRM1 temps were much lower.


----------



## taem

Quote:


> Originally Posted by *phallacy*
> 
> I would say the 4930k is probably the best performance/price for the lga2011 socket. They're made for multiple demanding GPUs.


So that's a $600 CPU that requires a new mobo. Add the second 290 and it's a $1500 price tag to go from a single 290 to CrossFire. I'd rather just play at lowered settings. The best CPU I can do is the 4770K, right, for socket 1150? Not sure that won't bottleneck 290 CrossFire either.


----------



## Paul17041993

Quote:


> Originally Posted by *Prozillah*
> 
> Can someone explain how an overclocked 4570k to 4.5ghz + is bottlenecking 2x 290's? I know im seeing evidence of this when gaming in bf4 like load times and load amount on the cpu but why does it consume so much more of the cpu when in xfire? I came from 2x 670's and it ran perfectly - why is it that these 290s seem to be loading so much more on the cpu? And to add - what cpu out there would be considered the best for these cards?


In a lot of games your framerate is proportional to the engine cycle rate, and your render thread and driver pre-processing also have to work harder to feed the cards.
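A toy model of why adding GPUs shifts the bottleneck toward the CPU: with alternate-frame rendering the GPUs effectively share the render load, but every frame still pays the full CPU (engine plus driver) cost. All numbers here are invented for illustration:

```python
# Toy CPU-vs-GPU bottleneck model for an alternate-frame-rendering setup.
# The millisecond costs are made-up illustrative values, not measurements.
def fps(cpu_ms_per_frame, gpu_ms_per_frame, n_gpus=1):
    # Each frame takes whichever is slower: the CPU work (paid in full
    # every frame) or the per-GPU share of the render work.
    frame_ms = max(cpu_ms_per_frame, gpu_ms_per_frame / n_gpus)
    return 1000.0 / frame_ms

print(fps(8.0, 12.0, 1))  # GPU-bound: 12 ms/frame, ~83 fps
print(fps(8.0, 12.0, 2))  # now CPU-bound: 8 ms/frame, 125 fps instead of ~166
```

The second GPU doesn't double the framerate because the frame time floor becomes the CPU cost, which is why CrossFire exposes CPU limits that a single card never hit.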


----------



## The Storm

Quote:


> Originally Posted by *taem*
> 
> So that's a $600 cpu that requires a new mobo. Add the second 290 and it's a $1500 price tag to go from single 290 for crossfire. I'd rather just play at lowered settings. Best cpu I can do is 4770k right, for socket 1150? Not sure that won't bottleneck 290 xfire either.


My 4770k handles my xfired 290X's just fine.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *phallacy*
> 
> I believe part of the reason is because the Crossfire bridge is now done over the PCIE bus rather than having the physical bridge connected to both or multiple cards. Also the 290x in general is a higher demanding gpu than the 670. I would say the 4930k is probably the best performance/price for the lga2011 socket. They're made for multiple demanding GPUs.


You're forgetting its older brother, the 3930K.


----------



## taem

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Your forgetting its older brother 3930k


Isn't 4770k faster at 4 cores or less?


----------



## vortex240

Quote:


> Originally Posted by *Coree*
> 
> Heres some clearer shots of my 4 slot beast.


Is that an Arctic Accelero S1? If it is, what kind of temps are you getting? I tried the S1 on my 290X and it got absolutely overpowered, even with a -45% power limit and an 800MHz core. Worst experiment ever! I used 2 very powerful Scythe fans.


----------



## Roboyto

Quote:


> Originally Posted by *IBIubbleTea*
> 
> I have a asus r9 290. If I were to change the Tim, will I void my warranty?


No you will not void your warranty. As long as there is no damage done to the card everything will be fine if you ever need to RMA it


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *taem*
> 
> Isn't 4770k faster at 4 cores or less?


Maybe, but it depends on the app I suppose.

The 4770K has a better IMC, allowing for higher DRAM speeds: 2400, 2666+.

I have a 3820 that does the 166 strap and 2666+. Sandy Bridge-E hexacores will do 2400 DRAM, but I haven't seen much more than that.









But for sheer benching/gaming horsepower, the hexacore rules.


----------



## vortex240

Quote:


> Originally Posted by *Prozillah*
> 
> Can someone explain how an overclocked 4570k to 4.5ghz + is bottlenecking 2x 290's? I know im seeing evidence of this when gaming in bf4 like load times and load amount on the cpu but why does it consume so much more of the cpu when in xfire? I came from 2x 670's and it ran perfectly - why is it that these 290s seem to be loading so much more on the cpu? And to add - what cpu out there would be considered the best for these cards?


Haha, I'm sorry, but a 4670K at 4.5 is not a bottleneck for a pair of 290s or 290Xs in CrossFire. Please take a look at an article Anandtech did very recently, in the last 2, maybe 3, months, where they go into detailed testing of CrossFire, SLI, and AMD and Intel chips. I'm too lazy to google this for you.

Long story short... at NO POINT will a 4670K be a bottleneck, even at 1600p. EVER!! It's on par with the 3960X and 4770K, or within 1-3%. There are a handful of synthetic benchmarks designed to take advantage of hyper-threading (3DMark11, etc.); these are NOT real-world scenarios. Also a couple of real-time strategy games, which Mantle is designed to address.

Load times come down to storage, platform, and graphics drivers. I'm on a Samsung 840 Evo with a 4670K and every single game (BF3, BF4, everything!) loads ultra fast. No delays at all; the SATA3 channel is saturated to the max. You can test this with Resource Monitor and look at disk I/O.


----------



## rdr09

Quote:


> Originally Posted by *King4x4*
> 
> No idea, Prozillah... I think it depends on your screen. If you are playing at 1080p you will most definitely be CPU-bottlenecked by a game that eats hexacores for a living!
> 
> In other news: fresh Windows 8.1 install, some tweaking, and voila!
> 
> 
> 
> http://www.3dmark.com/3dm/2579243


congrats!


----------



## axiumone

This is what 4 cards at 1200/1500 and a CPU at 4.8 pull from the wall during Firestrike Extreme. It actually hit 1704 watts, but I didn't catch that.


----------



## Roboyto

Quote:


> Originally Posted by *vortex240*
> 
> Haha, I'm sorry but a 4670K at 4.5 is not a bottleneck for a pair of 290 or 290x in crossfire. Please take a look at an article Anandtech did very recently, last 2 maybe 3 months where they go into detail testing of crossfire, sli, AMD and Intel chips. I'm too lazy to google this for you.
> 
> Short story long... AT NO POINT a 4670K will be a bottleneck, even at 1600p. EVER!! It's on par with 3960x, 4770k. Or within 1-3%. There are a handful of synthetic benchmarks that are designed to take advantage of hyper-threading(3dmark11, etc.) - these are NOT real world scenarios. Also, a couple real time strategy games, which Mantle is designed to address.
> 
> Load times come down to storage, platform, and graphics drivers. I'm on a Samsung 840 Evo with a 4670K and every single game (BF3, BF4, everything!) loads ultra fast. No delays at all; the SATA3 channel is saturated to the max. You can test this with Resource Monitor and look at disk I/O.


I believe this is the review you're speaking of. I have also been under the impression that the i5s wouldn't bottleneck.

http://anandtech.com/show/7189/choosing-a-gaming-cpu-september-2013

I posted this review not too long ago, but I don't believe the games tested utilize the CPU like BF4 does. It's hard to compare to BF4 since there isn't much out there that uses the CPU and RAM as heavily as BF4.


----------



## Paul17041993

Quote:


> Originally Posted by *vortex240*
> 
> Haha, I'm sorry but a 4670K at 4.5 is not a bottleneck for a pair of 290 or 290x in crossfire. Please take a look at an article Anandtech did very recently, last 2 maybe 3 months where they go into detail testing of crossfire, sli, AMD and Intel chips. I'm too lazy to google this for you.


For games like BF4, that i5 will be absolutely murdered. As for the 2x670 setup that didn't have a problem: you're simply not counting the fact that you're now running higher detail and higher framerates. At 1600p, however, the cards are rendering more pixels, which usually isn't CPU-bound, and the balance is restored.

Please give a link to this article, though.


----------



## taem

I don't even play BF4. I don't think any of my games are CPU intensive. And my screen is 1440p. It's entirely possible that my CrossFire problems were not CPU related, because it's not like I get full performance out of a single GPU either; core usage and clock still fluctuate madly. I was told AMD had solved its driver problems, but I just don't believe that's the case. This is so irritating. I can either run AB with unofficial overclocking and PowerPlay disabled, in which case my clocks stay locked even at idle, or I can just suffer the framerate drops as my GPU usage goes from 2% to 100% any time it feels like it. Even at 100% the clock will bounce from 900 to 1100.

Going to try a full reformat starting from Win 8.1, but I don't have high hopes.


----------



## Xyro TR1

Joining now that I finally got my rig up and running!









http://www.techpowerup.com/gpuz/ey632/


*MSI R9 290X GAMING 4G*
Stock cooling


----------



## Prozillah

Quote:


> Originally Posted by *vortex240*
> 
> Haha, I'm sorry but a 4670K at 4.5 is not a bottleneck for a pair of 290 or 290x in crossfire. Please take a look at an article Anandtech did very recently, last 2 maybe 3 months where they go into detail testing of crossfire, sli, AMD and Intel chips. I'm too lazy to google this for you.
> 
> Short story long... AT NO POINT a 4670K will be a bottleneck, even at 1600p. EVER!! It's on par with 3960x, 4770k. Or within 1-3%. There are a handful of synthetic benchmarks that are designed to take advantage of hyper-threading(3dmark11, etc.) - these are NOT real world scenarios. Also, a couple real time strategy games, which Mantle is designed to address.
> 
> Load times, come down to storage, platform, graphics drivers. I'm on a samsung 840 Evo with a 4670K and every single game(BF3, BF4, everything!) load ultra fast. No delays at all, sata3 channel is satured to the max - you can test this with resource manager and look at disk I/0.


Quote:


> Originally Posted by *Paul17041993*
> 
> for games like BF4, that i5 will be absolutely murdered, as for a 2x670 setup that didn't have a problem, you're simply not counting the fact you're now running higher detail and higher framerates, 1600p however, the cards are rendering more pixels, which isn't CPU-bound usually, and the balance is restored.
> 
> give a link to this article please though.


Conflicting opinions. I'm guessing that in time, with optimisation etc., BF4 will smooth out. I must say as well that the AMD driver issue/debate is a huge factor. Best performance so far is 13.12 crossfire on DX: semi-constant 100fps @ 1440p ultra. And I imagine Mantle will play a big part in this as it matures.


----------



## BradleyW

Quote:


> Originally Posted by *Paul17041993*
> 
> for games like BF4, that i5 will be absolutely murdered, as for a 2x670 setup that didn't have a problem, you're simply not counting the fact you're now running higher detail and higher framerates, *1600p however, the cards are rendering more pixels, which isn't CPU-bound usually, and the balance is restored.*
> 
> give a link to this article please though.


That's only true if the resolution is purely the only bottleneck, and not the 3D application causing the issue. 9 times out of 10 it is the application's fault, therefore upping the resolution does nothing to restore balance; it just potentially gives you more pixels for your buck, since the GPUs are hanging back a little.


----------



## Pesmerrga

Check out the great deal for an Open Box 290 on Newegg..

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%20600007787&IsNodeId=1&name=4GB&Order=PRICED&Pagesize=100


----------



## Arizonian

Quote:


> Originally Posted by *Xyro TR1*
> 
> Joining now that I finally got my rig up and running!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/ey632/
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *MSI R9 290X GAMING 4G*
> Stock cooling
> 
> 


Congrats bud - added


----------



## vortex240

Quote:


> Originally Posted by *Roboyto*
> 
> I believe this is the review you're speaking of. I have been under the impression that the i5s wouldn't bottleneck either.
> 
> http://anandtech.com/show/7189/choosing-a-gaming-cpu-september-2013
> 
> I posted this review not too long ago, but I don't believe the games tested utilize the CPU like BF4 does. It's hard to compare to BF4 since there isn't much out there that uses CPU and RAM as heavily as BF4.


Quote:


> Originally Posted by *Paul17041993*
> 
> for games like BF4, that i5 will be absolutely murdered, as for a 2x670 setup that didn't have a problem, you're simply not counting the fact you're now running higher detail and higher framerates, 1600p however, the cards are rendering more pixels, which isn't CPU-bound usually, and the balance is restored.
> 
> give a link to this article please though.


Murdered? Roboyto was kind enough to post the link. Anandtech are pretty much the best in the business, please have a read.

Please don't spread misinformation. Proof just below - no 4670K, but the 3570K is the 'equivalent' there.

http://www.techspot.com/review/734-battlefield-4-benchmarks/page6.html

Now, one might say, 'But it's not multiplayer!!' The chips in the review are at stock; a 4670K at 4.5 is hardly a bottleneck.


----------



## chronicfx

Quote:


> Originally Posted by *Paul17041993*
> 
> for games like BF4, that i5 will be absolutely murdered, as for a 2x670 setup that didn't have a problem, you're simply not counting the fact you're now running higher detail and higher framerates, 1600p however, the cards are rendering more pixels, which isn't CPU-bound usually, and the balance is restored.
> 
> give a link to this article please though.


That's not true. I haven't dipped under 90fps yet. Meh, I take it back somewhat, because I haven't played the multiplayer yet. But the campaign plays without any issue.


----------



## Paul17041993

Quote:


> Originally Posted by *Xyro TR1*
> 
> Joining now that I finally got my rig up and running!


I want your title...









Quote:


> Originally Posted by *vortex240*
> 
> Now, one might say 'But it's not multiplayer!!'


well, that *is* a MAJOR factor...

Quote:


> Originally Posted by *BradleyW*
> 
> That's only true if the resolution is purely the only bottleneck, and not the 3D application causing the issue. 9/10 it is the applications fault, therefore upping the resolution does nothing to restoring balance, it just potentially gives you more pixels for your buck since the GPU's are hanging back a little.


What I mean is that more pixels means you use the GPUs closer to their max; these cards are intended for 1440p and higher, and/or Eyefinity.


----------



## taem

Is this true?
Quote:


> Quote:
> Originally Posted by Stuka87
> Load up MSI Afterburner and watch the GPU % along with the clock speeds. I am betting you are not hitting 100% utilization, which means its going to declock on its own to conserve power/heat. The voltage should fluctuate as well.
> GPU usage doesn't work correctly with Hawaii. It'll continuously bounce from 0% to 100% no matter how much of a load you're putting on it. According to Unwinder it's something that AMD will have to fix within their drivers.
> 
> Quote:
> Originally Posted by Unwinder
> Guys, in different forums I've seen claims that AB reports wrong GPU usage on 290 cards because it is not supporting new AMD GPUs yet. That's incorrect statement, new versions of AB won't fix it, GPU usage is reported as it is calculated by AMD driver and it is not anyhow calculated internally in AB. So it is not a case of unknown GPU, it is as correct as AMD driver team implemented it. Currently GPU usage sensor implementation inside AMD driver is broken and it may return unreliable data if other sensors were polled immediately before GPU usage. So the amount of wrong data can be minimized by adding artificial delays between polling the sensors, it may be minimized by putting GPU load sensors to the first positions in the list, but the only true fix can be expected from AMD drivers update.
> Silverforce11's answer is most likely the correct one. If the load drops any the GPU clock will drop accordingly as Hawaii clocks down rather than waste cycles.


http://forums.anandtech.com/showthread.php?t=2357062

Because this is driving me nuts. Am I the only one getting this? Rest of you guys get steady usage and constant clocks in game?


----------



## Xyro TR1

Quote:


> Originally Posted by *Pesmerrga*
> 
> Check out the great deal for an Open Box 290 on Newegg..
> 
> http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709%20600007787&IsNodeId=1&name=4GB&Order=PRICED&Pagesize=100











Quote:


> Originally Posted by *Arizonian*
> 
> Congrats bud - added


Thanks!!







Considering I ordered this card mid-January, it's been a long time coming.








Quote:


> Originally Posted by *Paul17041993*
> 
> I want your title...


Haha! I'm quite a fan of it.








Quote:


> Originally Posted by *taem*
> 
> Is this true?
> http://forums.anandtech.com/showthread.php?t=2357062
> 
> Because this is driving me nuts. Am I the only one getting this? Rest of you guys get steady usage and constant clocks in game?


I get full GPU usage but my clocks are never constant. My GPU temperature also plateaus at 94C, and then the clocks start dropping. This scares me.


----------



## phallacy

Quote:


> Originally Posted by *taem*
> 
> Is this true?
> http://forums.anandtech.com/showthread.php?t=2357062
> 
> Because this is driving me nuts. Am I the only one getting this? Rest of you guys get steady usage and constant clocks in game?


Using AB to monitor, my clocks stay constant at whatever frequency I set, the load is variable and keeps fluctuating along with the voltage.


----------



## Paul17041993

Quote:


> Originally Posted by *taem*
> 
> Is this true?
> http://forums.anandtech.com/showthread.php?t=2357062
> 
> Because this is driving me nuts. Am I the only one getting this? Rest of you guys get steady usage and constant clocks in game?


The GPU usage readings are likely what you're seeing; it's been debated about here for a little while now. The framerate drops and the otherwise lacking performance, however, would be a combination of factors...

Might be good to try to log what's using your CPU when trying to game; there could be some background task that's eating a core or two occasionally. Not sure atm what tools you could use to log this, though...
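A rough sketch of one way to do that logging, assuming the third-party psutil Python package is available; the threshold and sample window are arbitrary illustrative values, not recommendations:

```python
import time
import psutil  # third-party: pip install psutil

def sample_cpu_hogs(threshold=25.0, window=1.0):
    """Return [(process_name, cpu_percent)] for processes busier than
    `threshold` percent of one core over a `window`-second sample."""
    procs = list(psutil.process_iter(['name']))
    for p in procs:
        try:
            p.cpu_percent(None)  # prime the counter; the first call reads 0.0
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            pass
    time.sleep(window)
    hogs = []
    for p in procs:
        try:
            usage = p.cpu_percent(None)  # % of one core since the priming call
            if usage >= threshold:
                hogs.append((p.info['name'], usage))
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            continue
    return sorted(hogs, key=lambda h: h[1], reverse=True)

# Leave something like this running in the background while gaming, then
# check the output for any background task eating a core or two:
# while True:
#     for name, pct in sample_cpu_hogs():
#         print(time.ctime(), name, f"{pct:.0f}%")
#     time.sleep(5)
```

Anything that repeatedly shows up near 100% while the game stutters would be a suspect.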


----------



## Widde

Finally bf4 is stable







And fps has gotten a lot better in DX11 on 14.2, at least for me ^^ Running on ultra, no AA, no post AA, 180% res scale at 1080p, I rarely ever see anything below 70ish, bouncing between 80-120 fps.


----------



## Tobiman

Quote:


> Originally Posted by *Xyro TR1*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks!!
> 
> 
> 
> 
> 
> 
> 
> Considering I ordered this card mid-January, it's been a long time coming.
> 
> 
> 
> 
> 
> 
> 
> 
> Haha! I'm quite a fan of it.
> 
> 
> 
> 
> 
> 
> 
> 
> I get full CPU usage but my clocks are never constant. My GPU temperature also plateaus at 94C and then the clocks start dropping. This scares me.


You might want to put the GPU in uber mode. You will get more fan speed and noise but much better temps. The MSI custom cooler doesn't do a good job of keeping these cards cool since it's pretty much a GTX 780 cooler slapped on the smaller die.


----------



## Xyro TR1

Quote:


> Originally Posted by *Tobiman*
> 
> You might want to put the GPU in uber mode. You will get more fan speed and noise but much better temps. The MSI custom cooler doesn't do a good job of keeping these cards cool since it's pretty much a GTX 780 cooler slapped on the smaller die.


Do you know if this has any effect on what Afterburner reports as 100% fan speed? Because I set it at 100% and just watched the temps climb (rapidly) to 94C.

edit: looks like not - identical results in Gaming or OC modes x.x


----------



## Tobiman

Quote:


> Originally Posted by *taem*
> 
> Is this true?
> http://forums.anandtech.com/showthread.php?t=2357062
> 
> Because this is driving me nuts. Am I the only one getting this? Rest of you guys get steady usage and constant clocks in game?


GPU usage and clocks will always fluctuate with load. For example, when I run COD BO2, I only get max clocks when I'm actually in the game doing/shooting stuff. As soon as I return to the lobby or menu, the card downclocks. In some other games, such as EVE Online, my GPU can downclock to around 900MHz with about 70% usage while everything remains silky smooth. It's all about how much power the application needs at a specific point, and that varies throughout a game.


----------



## Tobiman

Quote:


> Originally Posted by *Xyro TR1*
> 
> Do you know if this has any effect on what Afterburner reports is 100% fan speed? Because I set it at 100% and just watched the temps climb (rapidly) to 94C.


As long as you are making use of AB beta 18, it should be accurate. Now, there's a possibility that CCC Overdrive is interfering with AB, so I'd advise that you turn either one off, but preferably Overdrive, for smooth operation. You can also install GPU-Z 0.7.7, which works well with AB. You can use it to confirm the numbers AB is dishing out.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Xyro TR1*
> 
> Do you know if this has any effect on what Afterburner reports is 100% fan speed? Because I set it at 100% and just watched the temps climb (rapidly) to 94C.


That's rather worrying.

It shouldn't be hitting those sorts of temps at 100% fan speed.

Which position is the BIOS switch in?

Towards the back end of the card should be "uber" mode.


----------



## Xyro TR1

Quote:


> Originally Posted by *Tobiman*
> 
> As long as you are making use of AB beta 18, it should be accurate. Now, there's a possibility that CCC overdrive is interferring with AB so I'd advise that you turn either one off, but preferably overdrive, for smooth operation. You can also install GPU-Z 0.7.7 which works well with AB. You can use it to confirm the numbers AB is dishing out.


I've never even enabled OverDrive, numbers confirmed by GPU-Z.



Quote:


> Originally Posted by *Sgt Bilko*
> 
> That's rather worrying.
> 
> It shouldn't be hitting those sort of temps at 100% fan speed
> 
> Which position is the BIOS switch in?
> 
> Towards the back end of the card should "uber" mode


I'm having a hard time locating the switch... I'll get back to you on that. Also, should the switch be toward the power connectors or the I/O panel?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Xyro TR1*
> 
> I've never even enabled OverDrive, numbers confirmed by GPU-Z.
> 
> 
> 
> 
> 
> 
> 
> 
> I'm having a hard time locating the switch... I'll get back to you on that. Also, the switch should be toward the power connectors or the I/O panel?


Flick it toward the power connectors.

Should be located here towards the I/O ports.


----------



## Xyro TR1

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Flick it toward the power connectors.
> 
> Should be located here towards the I/O ports.


It was already towards the power connectors. I just switched it towards the I/O panel and saw literally no change in the card's behavior under full load.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Xyro TR1*
> 
> It was already towards the power connectors. I just switched it towards the I/O panel and saw literally no change in the card's behavior under full load.
> 
> 


Might have to send an e-mail to MSI then; those temps are way too high for that speed.


----------



## Tobiman

Quote:


> Originally Posted by *Xyro TR1*
> 
> I've never even enabled OverDrive, numbers confirmed by GPU-Z.
> 
> 
> I'm having a hard time locating the switch... I'll get back to you on that. Also, the switch should be toward the power connectors or the I/O panel?


I'd refrain from running Kombustor/FurMark to test cards, because very few applications will actually stress the card that hard. The cooling on the MSI is already not stellar, since the heatpipes don't fully cover the core (bummer), so I wouldn't advise basing performance on the absolute worst case, which won't happen when gaming.


----------



## Xyro TR1

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Might have to send an e-mail to MSI then,
> 
> those temps are way too high for that speed.


My first thought was that I should probably replace the TIM. It would _seriously_ suck if I had to send it in. I waited over a month for this stupid thing. x.x
Quote:


> Originally Posted by *Tobiman*
> 
> I'd refrain from running kombustor/furmark to test cards because very few applications will actually stress the card that hard. The cooling on the MSI is already not stellar since the heatpipes don't cover the core fully (bummer) so I won't advise basing performance on the worst absolute case which won't happen when gaming.


Fair enough! I mean, I did just play GTA4 for a couple hours with absolutely no problems in a warm room, so I'm not _terribly_ concerned. I'm just a bit paranoid over my new $650 purchase, ya know?


----------



## Tobiman

Quote:


> Originally Posted by *Xyro TR1*
> 
> My first thought was that I should probably replace the TIM. It would _seriously_ suck if I had to send it in. I waited over a month for this stupid thing. x.x
> Fair enough! I mean, I did just play GTA4 for a couple hours with absolutely no problems in a warm room, so I'm not _terribly_ concerned. I'm just a bit paranoid over my new $650 purchase, ya know?


I understand and you are not the only one with said problem.

http://forums.anandtech.com/showthread.php?t=2364119


----------



## Sgt Bilko

Quote:


> Originally Posted by *Tobiman*
> 
> I understand and you are not the only one with said problem.
> 
> http://forums.anandtech.com/showthread.php?t=2364119


Wow, here I was thinking that the Twin Frozr was enough for Hawaii; guess not.

Looks like they are performing worse than my XFX cards... that's a bummer.


----------



## Xyro TR1

Quote:


> Originally Posted by *Tobiman*
> 
> I understand and you are not the only one with said problem.
> 
> http://forums.anandtech.com/showthread.php?t=2364119


Quote:


> Originally Posted by *Sgt Bilko*
> 
> wow, here i was thinking that the Twin Frozr was enough for Hawaii, guess not.
> 
> Looks like they are performing worse than my XFX cards.......that's a bummer.


To be fair, this fits in exactly with my needs - I wanted a tiny, quiet, powerful system to run my very basic set of games without lagging. So I will probably just put some of my GeLID Extreme TIM on and hope for the best.









Thanks for the info and help, guys!!


----------



## Jack Mac

I think they're performing badly because of the heatpipe issue and because MSI chose to use quiet fans that don't even hit 3K RPM at 100% fan speed.


----------



## Tobiman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> wow, here i was thinking that the Twin Frozr was enough for Hawaii, guess not.
> 
> Looks like they are performing worse than my XFX cards.......that's a bummer.


Yeah, about 2 of the 5 heat pipes in the cooler do not touch the Hawaii core properly.


----------



## DeadlyDNA

Hello, anyone with tri-fire or quadfire using the 14.2 drivers? In every game I try I get this weird stutter/frame tearing like I've never seen before. In some games it's fixed if I enable v-sync; then it's smooth. In Metro Last Light, though, as an example, it doesn't help, and with v-sync the sound also stutters and makes noise.


----------



## phallacy

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Hello, anyone with tri-fire or quadfire using 14.2 drivers? In every game i try i get this wierd stutter/frame tearing like i've never seen before. Some games it fixes if i enable v-synch then it's smooth. Metro last light though as an example it doesn't help, also with vsynch sound stutters and makes noise also.


I'm using a tri-fire setup with 14.2, and the only thing I've noticed so far is frame drops down to like 40 for half a second before it goes back to normal. I believe the v-sync sound issue was addressed by an AMD rep here, who said it would be patched in the next WHQL driver in April.


----------



## nightfox

Quote:


> Originally Posted by *phallacy*
> 
> I'm using a tri fire setup with 14.2 and the only thing I've noticed so far is frame drops down to like 40 for half a second then goes back to normal. I believe the v sync sound issue was addressed by an AMD rep here and said it would be patched in the next WHQL driver in April.


found this the other day:

http://forums.amd.com/game/messageview.cfm?catid=475&threadid=172035


----------



## keikei

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Hello, anyone with tri-fire or quadfire using 14.2 drivers? In every game i try i get this wierd stutter/frame tearing like i've never seen before. Some games it fixes if i enable v-synch then it's smooth. Metro last light though as an example it doesn't help, also with vsynch sound stutters and makes noise also.


Did you turn 'frame pacing' on?


----------



## phallacy

Quote:


> Originally Posted by *nightfox*
> 
> found this the other day:
> 
> http://forums.amd.com/game/messageview.cfm?catid=475&threadid=172035


That's even better; probably at the same time as the Mantle patch for Thief.


----------



## DeadlyDNA

Quote:


> Originally Posted by *keikei*
> 
> Did you turn 'frame pacing' on?


I tried both on and off... however, I do recall a while back messing with Skyrim, and turning off frame pacing gave me the same type of issue I see now. It probably is frame pacing related....

edit: Anyone with quad 29xx setups tried 14.2 yet? I am going to try tri-fire and dual as well to see if it changes.


----------



## DeadlyDNA

It is in fact only happening in 4-way crossfire. When I set it to tri-fire it runs flawlessly. Can anyone else with 4-way crossfire give it a go as well?

It appears that with 4-way crossfire not only do I get bizarre frame tearing and stuttering, I also only have about half of my games even making it into the 3D mode of the game. Most of my games are just crashing before they make it into 3D mode.

Looks like I'll be going back to WHQL and dealing with the issues it has. It has been tempting to sell these cards and go back to Nvidia. Not that I want to, but it seems like so much is broken.


----------



## keikei

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I tried both on and off.. however i do recall a while back messing with skyrim and turning off frame pacing gave me the same type of issue i see now. It probably is frame pacing related....
> 
> edit: Anyone with quad 29xx setups tried 14.2 yet? I am going to try tri-fire and dual as well to see is it changes.


It might be alt+tab related also. I was playing Resident Evil 6 and I noticed the lag/stutter. I finally found the issue: alt+tab and RE6 don't play well together. It might be related to the cards as well. I have xfire 290's.


----------



## MrWhiteRX7

I'm using 3 x 290's and everything runs flawlessly on 14.2 so far.


----------



## Paul17041993

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Hello, anyone with tri-fire or quadfire using 14.2 drivers? In every game i try i get this wierd stutter/frame tearing like i've never seen before. Some games it fixes if i enable v-synch then it's smooth. Metro last light though as an example it doesn't help, also with vsynch sound stutters and makes noise also.


That could be the sound issue thing I heard about. What are you using for sound?

edit: actually, what board and CPU are you using for this? Wondering if your cards are being deprived of PCIe bandwidth...


----------



## NapalmV5

4x 290x asus dc2oc/1080p/ultimate


----------



## Sgt Bilko

Quote:


> Originally Posted by *Tobiman*
> 
> Yeah, about 2 of the 5 heat pipes in the cooler do not touch the hawaii core properly.


That's just silly. I remember MSI posting on FB and Twitter about all the hard work they were putting into the cooler for Hawaii; doesn't seem like they put that much effort in after all.


----------



## Tobiman

Quote:


> Originally Posted by *NapalmV5*
> 
> 4x 290x asus dc2oc/1080p/ultimate


Lmao. That's dope!


----------



## Sgt Bilko

Well, I decided to really push my XFX cards; these are my max runs:

Firestrike: http://www.3dmark.com/3dm/2581280

Firestrike Extreme: http://www.3dmark.com/3dm/2581467


----------



## Sgt Bilko

Quote:


> Originally Posted by *NapalmV5*
> 
> 4x 290x asus dc2oc/1080p/ultimate
> 
> 


Holy crap!!

Looks like you need a 240hz panel now huh?


----------



## Red1776

I am about to put four of these under water, but I have run all four with the Twin Frozr, and the Twin Frozr cooler is the best air cooler currently on the R9 290X.


----------



## Coree

Quote:


> Originally Posted by *vortex240*
> 
> Is that Arctic Accelero S1? If it is, what kind of temps are you getting? I tried the S1 on my 290X it got absolutely overpowered with ever - 45% power limit and 800mhz core. Worst experiment ever! I used 2 very powerful scythe fans.


Yeah, it's the S1 Plus. I'm getting 78C core, 77C vrm1, 66C vrm2 after a 3hr session of BF3 MP, zero throttle. And btw, I've OCed the GPU @ 1093/1325 during this. Also, the fans I'm using are TY-147's running at max RPM. They push 73CFM @ 21dBA of noise and are quiet. I suggest you buy the same fans; you will see a drop in temps.


----------



## NapalmV5

Quote:


> Originally Posted by *Tobiman*
> 
> Lmao. That's dope!


Quote:


> Originally Posted by *Sgt Bilko*
> 
> Holy crap!!
> 
> Looks like you need a 240hz panel now huh?










thanks guys yah 144hz panel going by the min fps but since avg is so good yah 240hz would do nicely









same clocks (1100/1500) @ mining 14K+ cpm @ win 8.1 64 - 18K cpm @ linux


----------



## Connolly

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Hello, anyone with tri-fire or quadfire using 14.2 drivers? In every game i try i get this wierd stutter/frame tearing like i've never seen before. Some games it fixes if i enable v-synch then it's smooth. Metro last light though as an example it doesn't help, also with vsynch sound stutters and makes noise also.


I think the 14.2 drivers have some issues. I'm running a single 290x and get a strange stutter when playing BF3. It's most noticeable when flying a jet and panning over background images, such as the trees surrounding Caspian Border. Switching back to 13.12 completely cures the issue. BF4 runs perfectly on the 14.2 drivers, I assume because it's Mantle optimised.


----------



## Paul17041993

Quote:


> Originally Posted by *NapalmV5*
> 
> 4x 290x asus dc2oc/1080p/ultimate


1440p or eyefinity before I blow chunks!


----------



## Gil80

Hi all,

I just got my Gigabyte R9 290 Windforce (BF4 edition).

If I take a look at Catalyst Control Center, I see that the memory clock setting is 1250 at the highest.
Shouldn't it be 5000?


----------



## ReXtN

Quote:


> Originally Posted by *Gil80*
> 
> Hi all,
> 
> I just got my Gigabyte R9 290 Windforce (BF4 edition).
> 
> If I take a loot at catalyst control center I see that the memory clock settings is 1250 highest.
> shouldn't it be 5000?


1250MHz * 4 = 5000MHz effective clock speed

That means the 1250MHz memory clock speed is right.


----------



## Paul17041993

Quote:


> Originally Posted by *Gil80*
> 
> If I take a loot at catalyst control center I see that the memory clock settings is 1250 highest.
> shouldn't it be 5000?


No, 1250 is the correct frequency you should run off; 5000 is after you multiply it by the GDDR5 bus links (x4). Nvidia's x2 multiplication you can just ignore; I have no idea what drunk technician was involved with that...

Technically, the "total bandwidth" doesn't have to equal the x4 multiple, but usually it does, as most graphics math is done with vector4s (128-bit).
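The arithmetic behind those numbers is easy to sanity-check. The 512-bit bus width below is Hawaii's, taken from the 290/290X spec sheet:

```python
# GDDR5 moves 4 bits per pin per clock, so the 1250MHz that CCC reads and
# the advertised 5000MHz "effective" speed describe the same memory.

def gddr5_effective_mhz(base_mhz):
    """Quad-pumped interface: 4 transfers per clock."""
    return base_mhz * 4

def peak_bandwidth_gbps(base_mhz, bus_width_bits):
    """Peak bandwidth in GB/s: transfers per second times bytes per transfer."""
    transfers_per_sec = base_mhz * 1e6 * 4
    return transfers_per_sec * (bus_width_bits / 8) / 1e9

print(gddr5_effective_mhz(1250))       # 5000
print(peak_bandwidth_gbps(1250, 512))  # 320.0 GB/s, matching AMD's quoted spec
```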


----------



## Spectre-

Hey guys, some weird stuff is going on with my brother's R9 290; hoping I could get some help.

So we turn on his PC and it works fine for anything, but as soon as any GPU-intensive task comes along, the screen goes black, though everything is still running in the background.

He's running the 290 on the G10 bracket like me, with an H55.

I have underclocked the clocks and given +50mV and 50% power in AB.

Still, I can't come up with anything.

Also tested out the PSU on my rig and everything works like a charm.


----------



## vortex240

Quote:


> Originally Posted by *Coree*
> 
> Yeah, it's the S1 Plus. I'm getting 78C core, 77C vrm1, 66C vrm2 after a 3hr session of BF3 MP, zero throttle. And btw, i've OCed the GPU @ 1093/1325 during this. Also, the fans i'm using are TY-147's running at max RPM. They push 73CFM @ 21dBA of noise and are quiet. I suggest you to buy the same fans, you will see a drop in temps


I'm not gonna lie, I have no idea how you are achieving this!! The S1 Plus is rated at 120 watts, and that's with the fan module. Obviously with 2 large fans like yours it will be higher.

I used the S1, which is pretty much the S1 Plus but with less compatibility (old mounting bracket), and even when I underclocked the card to 800 core and -45% power limit, it was still throttling like crazy, all the way down to 600 core, and running at mid-80C temps. This was with 2 120mm Scythe fans, each blowing 95CFM.

The thermal power output of an OC'd 290 will easily be double what the S1 Plus is rated for, so my results are in line with what the cooler is capable of, and I wasn't really surprised when I saw them.

What is this black magic you are doing??? I can show pics and video for proof of my setup.
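The "double the rating" claim can be sanity-checked with a rough rule of thumb: dynamic power scales about linearly with clock and with the square of voltage. The ~250W reference board power, 947MHz stock clock, and voltage figures below are illustrative assumptions, not measured values:

```python
def scaled_power(p0_watts, f0_mhz, f_mhz, v0, v):
    """Rough dynamic power estimate: P ~ P0 * (f/f0) * (V/V0)^2."""
    return p0_watts * (f_mhz / f0_mhz) * (v / v0) ** 2

# Assumed: ~250W reference board power at 947MHz, overclock to 1100MHz
# with a modest voltage bump from 1.20V to 1.25V.
oc_watts = scaled_power(250, 947, 1100, 1.20, 1.25)
print(round(oc_watts))           # roughly 315W
print(round(oc_watts / 120, 1))  # roughly 2.6x a 120W-rated cooler
```

Even with generous assumptions, an overclocked 290 lands well past double the S1's 120W rating, which fits the throttling described above.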


----------



## Sgt Bilko

Quote:


> Originally Posted by *Spectre-*
> 
> hey guys some wierd stuff going on my brothers R9 290 hoping i could get some help
> 
> so we turn on his pc and it works fine for anything but as soon any gpu intensive task comes along and it the screen goes black but everything is running in the background
> 
> hes running the 290 on the g10 bracket like me witha h55
> 
> i have underclocked the clocks and gave +50mv and 50% power in AB
> 
> still i cant come up with anything
> 
> also tested out the psu on my rig and everything works like a charm


Try a different driver?

Other than that, it's a black screen card; RMA is the usual option.


----------



## Coree

@vortex

Yes, show some pics. Are you sure that the GPU die and heatsink have good contact? What TIM are you using? Btw, what is your case and airflow? And do you have VRM heatsinks installed?


----------



## TommyGunn123

TTL's second channel, Rushkit, has a pretty NDA-complying video on the MSI 290X Lightning! It's pretty much more of the stuff we probably know already, but it's REAAALLL!

NDA IS LIFTED TOMORROW! Prepare for reviews a-plenty in your YouTube sub-box.


----------



## hotrod717

^^^^ I knew this would happen! Looks like I'm going to have the 2 top cards on the market at the same time.







Tried to wait it out for the Lightning, but I needed something new to play with. Wonder which one is going to win out and stay in my rig: 780 Ti Kingpin or 290X Lightning? Kingpin arrives tomorrow. Wondering when and where the first Lightnings will be available?


----------



## VSG

Also- at what price?


----------



## hotrod717

Quote:


> Originally Posted by *geggeg*
> 
> Also- at what price?


Rofl! Can't imagine it being less than $850 with the miners driving the bus. Safe to say it should have pretty much the same features as the 780 Lightning - http://game.msi.com/product/graphics-card/lightning-trifrozr-series. Can't wait to see Kingpin and Lightning head to head. Exciting stuff! Sad that Asus hasn't stepped up to the plate and offered a 290X Matrix.


----------



## phallacy

Anybody here with the Sapphire 290X Tri-X: which way do I flick the BIOS switch for UEFI and non-UEFI? Right now it's toward the power connectors, the same way the uber BIOS would be on a reference 290X.


----------



## Mercy4You

Quote:


> Originally Posted by *phallacy*
> 
> Anybody here with the sapphire 290x Tri x, which way do I flick the bios switch for UEFI and non UEFI ? Right now it's to the power connectors, the same way that the uber bios would be for reference 290x.


That's UEFI!


----------



## phallacy

Quote:


> Originally Posted by *Mercy4You*
> 
> That's UEFI!


Cool! thank you


----------



## bbond007

Quote:


> Originally Posted by *Sgt Bilko*
> 
> wow, here i was thinking that the Twin Frozr was enough for Hawaii, guess not.
> 
> Looks like they are performing worse than my XFX cards.......that's a bummer.


I originally got a Gigabyte Windforce r9 290x and I liked the design with 3 fans.

Anyway, that card died within 5 hours of use, so I sent it back and got an MSI Gaming R9 290X.

The Gigabyte RMA page was down, and the technical support lady was rude, so I decided to switch brands to MSI because I have two GTX 760s that I'm extremely impressed with.

I also have a gigabyte ga-990fxa-ud3 (rev4) that is a nice MB except the BIOS is buggy. For that reason I decided to switch.

I was fine with one card, but when I added the 2nd MSI r9 290x I had the same sort of issues with the top card throttling down due to cooling.

I cut a hole in the side of the case and put in a 200mm fan directly in front of the cards. That location originally had an 80mm fan.

Hot air was also being trapped in the case, so I cut two holes in the top and installed a 120mm fan in each.

The case takes 2 80mm fans in the front and 2 80mm fans in the back (for whatever that's worth), so I made sure those were all present and fresh.

The 200mm fan causes the primary GPU temp to drop dramatically; however, the bottom card actually increases 1 or 2 degrees for whatever reason.

Cheers!


----------



## vortex240

Quote:


> Originally Posted by *Coree*
> 
> @vortex
> 
> Yes, show some pics. Are you sure that the GPU die and heatsink have good contact? What TIM are you using? Btw what is your case and airflow? Btw do you have VRM heatsinks installed?


Haha, yes, I'm 100% sure - it really doesn't take a college degree to mount one and I have plenty of experience over the last 15 years. MX4 is the TIM and the results are the same on an open bench and in a closed case - which has tons of cooling anyway. VRM and RAM are cooled with the stock heatsink baseplate (with the cooler removed). A guide can be found here on OC.net for anyone interested. And the temps are very cool on VRM 1/2 - low 60C.

I'm really curious, since XBIT used an S1 with the turbo module on a Radeon 7770 @ 1.1GHz and they were reaching 60C temps. All this makes sense, given the cooling capacity of the S1 is 120 watts and the 7770 has a 100 watt TDP. Mind you, that is less than half the heat output of a 290.

http://www.xbitlabs.com/articles/coolers/display/arctic-accelero-s1-plus-turbo-module_4.html#sect1

In all honesty, I wasn't surprised with my results. The S1 compared to the Windforce cooler on my GTX 770 has less than half the cooling fin area and weighs half as much, and that same heatsink on a 290 is just about adequate for OC and mid-70C temps. Now, it's important to note that the Windforce is a 450 watt cooler, the S1 is a 120 watt (just check the Arctic Cooling site).

How you are achieving this is, like I said, 'black magic'. You basically were able to pretty much triple the cooling capacity of the S1.


----------



## Coree

Quote:


> Originally Posted by *vortex240*
> 
> Haha, yes, I'm 100% sure - it really doesn't take a college degree to mount one and I have plenty of experience over the last 15 years. MX4 is the TIM and the results are the same on open bench and with closed case - which has tons of cooling anyways. VRM and ram is cooled with the stock heatink baseplate(with the removed cooler). A guide can be found on here on OC.net for anyone interested. And the temps are very cool on vrm 1,2 - low 60C.
> 
> I'm really curious since XBIT used a S1 with the turbo module on a radeon 7770 @1.1Ghz and they were reaching 60C temps. All this makes sense, given the cooling capacity of the S1 is 120 watt and 7770 a 100watt TDP. Mind you that is less than half the heat output of a 290.
> 
> http://www.xbitlabs.com/articles/coolers/display/arctic-accelero-s1-plus-turbo-module_4.html#sect1
> 
> In all honesty, I wasn't surprised with my results. The S1 compared to a windforce cooler on my GTX 770 has less then half the cooling fin area and weights half as well and that same heatsink on a 290 is just about adequate for OC and mid 70C temps. Now, its important to note that the windforce is a 450watt cooler, the S1 is a 120watt(just check the Arctic Cooling site).
> 
> How you are achieving this is like I said, 'black magic'. You basically were able to pretty much triple the cooling capacity of the S1


Haha







I actually got my GPU up to 84C while playing Crysis 3. For some reason, during BF3 the GPU drained 201W max. Crysis 3 ups this to 251W. I have my temperature limit set @ 85C, and the clocks were 95% of the time at 1093 core. On rare occasions, the core clock dropped to 1047MHz but never any lower. I think that the high CFM fans compensate for the lower fin area and heatpipes. And btw, this heatsink is not suitable for mining if overclocked. I mine at 900/1250, 850kh/s, and the core caps out at 75C, VRM1 76C and VRM2 64C.
Can you show some pics of your cooler please?

I will do a 5hr run of Unigine Heaven tomorrow, let's see what temps will there be.

Here are my VRM heatsink setups:


----------



## vortex240

Quote:


> Originally Posted by *Coree*
> 
> Haha
> 
> 
> 
> 
> 
> 
> 
> I actually got my GPU up to 84C while playing Crysis 3. For some reason, during BF3 the GPU drained 201W max. Crysis 3 ups this to 251W. I have my temperature limit set @ 85C, and the clocks were 95% of the time at 1093 core. On rare occasions, the core clock dropped to 1047mhz but never any lower. I think that the high CFM fans compensates the lower fin area and heatpipes.. And btw, this heatsink is not suitable for mining if overclocked. I mine at 900/1250 850kh/s and core caps out at 75C, VRM1 76C and VRM2 64C.
> Can you show some pics of your cooler please?
> 
> I will do a 5hr run of Unigine Heaven tomorrow, let's see what temps will there be.
> 
> Here are my VRM heatsink setups:


Aww!!!! Well, from what you wrote here it seems that PowerTune is working with your 85C target and it's limiting the power big time - I would like to see you run a Firestrike (just the regular test) and post a score. That should shed some light here.

I'm in the office at the moment. I'll post some when I get back home - apart from the stock baseplate cooler and slightly different fans it's the same thing. The fans that I used have a higher CFM than yours. It seems you have a chip that doesn't leak much heat - what's your ASIC quality?


----------



## Coree

Quote:


> Originally Posted by *vortex240*
> 
> Aww!!!! Well, from what you wrote here it seems that powertune is working with your 85C target and it limiting the power big time - I would like to see you run a firestrike (just the regular test) and post a score. That should shed some light here.
> 
> I'm in the office at the moment. I'll post some when I get back home - apart from the stock baseplate cooler and slightly different fans its the same thing. The fans that I used have a higher CFM then yours. It seems you have a chip that doesn't leak much heat - whats your asic quality?


Settings are as follows: (CCC)
Power limit: +12%
GPU: 9,3%
Memory: 6%
Target GPU temperature: 85C

ASIC quality is 75,1%.

Heres my FS, Cloud Gate and Ice Storm results:
http://www.3dmark.com/3dm/2584464

But tomorrow, I will test out 5h of the Unigine benchmark and I'll put the target GPU temp to 95C.


----------



## The Mac

@coree: you have your vrms backwards.

vrm1 is core, vrm2 is memory.

Vrm2 doesnt need crazy cooling.


----------



## devilhead

hi, why does my core always stay at 1000 MHz at idle (desktop)? with drivers 14.2 Beta


----------



## The Mac

mine stick too.

open CCC, go to the performance tab, enable overdrive, click reset to default and save. You may have to do it a few times.

should fix it.


----------



## devilhead

Quote:


> Originally Posted by *The Mac*
> 
> mine stick too.
> 
> open CCC, go to the perforamce tab, enable overdrive, click reset to default and save. You may have to do it a few times.
> 
> should fix it.


thanks +rep


----------



## Coree

Quote:


> Originally Posted by *The Mac*
> 
> @coree: you have your vrms backwards.
> 
> vrm1 is core, vrm2 is memory.
> 
> Vrm2 doesnt need crazy cooling.


I keep confusing these.. :L


----------



## Paul17041993

Quote:


> Originally Posted by *The Mac*
> 
> Vrm2 doesnt need crazy cooling.


well, with 16 ICs, if you want high memory clocks you should be sure to put a large heatsink on it








but yea, they shouldn't be any hotter than the main VRMs.


----------



## Coree

So the VRM1 is the longer strip? VRM2 the 3 smaller ones located near the pci bracket?


----------



## The Mac

yes sir...


----------



## The Mac

Quote:


> Originally Posted by *Paul17041993*
> 
> well with 16ICs, if you wanted high memory you should be sure to put a large heatsink on it
> 
> 
> 
> 
> 
> 
> 
> 
> but yea, they shouldn't be any hotter then the main VRMs.


i have a Gelid Icy Vision on my GPU and just a couple silly aluminium sinks on my VRM2, and they rarely break 65C at 1300MHz on the memory.

VRM1 on the other hand....lol


----------



## Coree

Hmm, should I add screws on the VRM1 too? I think there isn't enough pressure, as you can see on the picture, the cooler just presses the heatsink into place.


----------



## The Mac

if you can,

i had to drill out my strip sink to make it fit screws.

aluminum is soft though, doesn't take much effort.


----------



## magicase

Can you add me to the list "again"











So far I have moved from XFX -> Asus -> Sapphire Tri-X CF -> Powercolor PCS+ CF


----------



## taem

Quote:


> Originally Posted by *magicase*
> 
> Can you add me to the list "again"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So far I have moved from XFX -> Asus -> Sapphire Tri-X CF -> Powercolor PCS+ CF


Don't stop now, just a few more brands and you'll have gone through every 290 on the market


----------



## devilhead

Quote:


> Originally Posted by *taem*
> 
> Don't stop now just a few more brands and you'll have gone through every 290 on the market


Hehe, yes, your target now is 2x Lightnings


----------



## DeadlyDNA

I did some testing on the 14.2 beta drivers; if you're running 4-way CrossFire you might have issues like me.

If I disable Eyefinity and run a single 1080p monitor in 4-way CF I have no issues with frame pacing.
If I run 4-way CF in Eyefinity I get horrible screen tearing/stuttering. The FPS still looks good in benchmarks but the image is barely discernible.

So while this may just be an issue for me and only me, I'm still waiting for some feedback from any other 4-way CF users, in Eyefinity if possible.

Thanks.

4-way CF + Eyefinity = tearing/stutter
3-way CF + Eyefinity = fine.
4-way CF + single screen = fine.


----------



## pkrexer

Any word on if sapphire is going to be making a "toxic" 290 edition?


----------



## Sgt Bilko

Quote:


> Originally Posted by *pkrexer*
> 
> Any word on if sapphire is going to be making a "toxic" 290 edition?


No official word yet but they will, at least for the 290X anyway.

They may do a 290 Toxic but I wouldn't get my hopes up.


----------



## Roboyto

Quote:


> Originally Posted by *pkrexer*
> 
> Any word on if sapphire is going to be making a "toxic" 290 edition?


If/when either the 290 or 290X are available you better get one quick. There are always fairly limited numbers of them and they go quick


----------



## battleaxe

I just added the second 290. Got an MSI 290. Pretty happy with it. Temps are 63c full load while mining.


----------



## magicase

Quote:


> Originally Posted by *taem*
> 
> Don't stop now just a few more brands and you'll have gone through every 290 on the market


Actually forgot to mention the Gigabyte windforce I had for 2 days









XFX -> Asus -> Gigabyte Windforce CF -> Sapphire Tri-X CF -> Powercolor PCS+ CF


----------



## pkrexer

Finally had some time to push the limits of my Asus 290X (flashed with the Powercolor 290X PCS+ BIOS). Haven't had much time to play with it since putting the waterblock on and upgrading the pads. It's no "golden" 290X by any means, but I'm pretty happy with the results.

*Max clock = 1280 / 1500*

*Benches:*

Firestrike: http://www.3dmark.com/fs/1809882

Valley:


----------



## Pesmerrga

Powercolor reference 290x +Battlefield in stock at Newegg for $569! Lowest price since release..

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131522

And an MSI gaming 290 +Battlefield for $469

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127774


----------



## Prozillah

Been doing a lot of reading regarding the 4670K vs 4770K for BF4 utilizing HT & multiple GPUs.

So currently I have the 4670K @ 4.5GHz and I've recently purchased a couple of 290's to run in crossfire, basically just for BF4. Since doing so I have noticed my CPU working a lot harder, and after reading a great deal of info it has become evident to me I might need to go to an i7 and take advantage of the HT.

My question is: how much is my current CPU bottlenecking the GPUs, and will going to a 4770K with HT provide any noticeable boost in performance in BF4?

Running ultra settings on a 1440p 120Hz monitor with a 2nd monitor for stats.


----------



## Roboyto

Quote:


> Originally Posted by *pkrexer*
> 
> Finally had some time to push the limits of my Asus 290x (Flashed with the Powercolor 290x PCS+ bios) Haven't had much time since putting the waterblock on it and also upgrading the pads to play with it. Its no "golden" 290x by any means, but I'm pretty happy with the results.
> 
> *Max clock = 1280 / 1500*
> 
> *Benches:*
> 
> Firestrike: http://www.3dmark.com/fs/1809882
> 
> Valley:


1280 on the core is pretty good all things considered. They run 1000 MHz out of the box right? 28% is a healthy jump. One of my 290s couldn't break 1080/1350, got to +125mV and it still wasn't budging...so I sold it.


----------



## Roboyto

Quote:


> Originally Posted by *Prozillah*
> 
> Been doing alot of reading regarding the 4670k vs 4770k for BF4 utilizing HT & multiple GPU's.
> 
> So currently I have the 4670k @ 4.5ghz and ive recently purchased myself a couple of 290's to run in crossfiree basically just for bf4. Since doing so I have noticed my cpu working alot harder and after reading a great deal of info it has become evident to me I might need to goto an i7 and take advantage of the HT.
> 
> My question is how much is my current cpu bottlenecking the gpus and will going to a 4770k with HT provide any type of noticeable boost in performance for my bf4?
> 
> Running ultra settings on 1440p 120hz monitor with a 2nd monitor for stats.


BF4 is optimized around 4 cores/threads just as BF3 was. It will push a 4 core/thread CPU to 90% or better. As you add more cores/threads the load is distributed across the additional threads. With 8 cores/threads, instead of 90% usage you would likely see something more like 40-50%.

The bottleneck is probably going to be hard to measure until you get the new CPU. Comparing benchmarks could be one way to tell, but the problem is that a benchmark can't exactly simulate a large multiplayer load.

What we do know is that a 4770K absolutely won't bottleneck with Xfire 290s. I think your best bet is to toss your CPU up on eBay or something, cover half-ish the cost of the i7, and then you can stop worrying about a potential issue and just enjoy your hardware.

You can use Afterburner's RTSS/OSD with HWinfo to display real-time stats on your monitor in game. I have mine setup to see CPU load per core, CPU temp, RAM used, GPU load, GPU/VRM1 temps.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Prozillah*
> 
> Been doing alot of reading regarding the 4670k vs 4770k for BF4 utilizing HT & multiple GPU's.
> 
> So currently I have the 4670k @ 4.5ghz and ive recently purchased myself a couple of 290's to run in crossfiree basically just for bf4. Since doing so I have noticed my cpu working alot harder and after reading a great deal of info it has become evident to me I might need to goto an i7 and take advantage of the HT.
> 
> My question is how much is my current cpu bottlenecking the gpus and will going to a 4770k with HT provide any type of noticeable boost in performance for my bf4?
> 
> Running ultra settings on 1440p 120hz monitor with a 2nd monitor for stats.


HT will provide absolutely nothing in BF4. Same story with BF3. I can see HT helping if BF4 has similar utilization for an 8-core CPU as it does for 4-core. I upgraded from a 3570K to a 3770K because I was being bottlenecked in BF3 with 2 x HD 7970. It basically did nothing. CPU usage went from 90% to 45-50%. GPU utilization stayed the same and so did the fps.


----------



## Roboyto

Quote:


> Originally Posted by *ZealotKi11er*
> 
> HT will provide absolutely nothing in BF4. Same story with BF3. I can see HT helping if BF4 has similar utilization for 8-Core CPU as its does for 4-Core. I upgraded from 3570K to 3770K because I was being bottlenecked in BF3 with 2 x HD 7970. It basically did nothing. CPU usage went from 90% to 45%-50%. GPU utilization stayed the same and so did the fps.


That's exactly what I read in a review. Even an FX-4300 or i3-4340 can push surprising frames in BF4, but the CPU is loaded just as much as the i5 since they are still 4 core/thread CPUs.


----------



## taem

Quote:


> Originally Posted by *pkrexer*
> 
> Finally had some time to push the limits of my Asus 290x (Flashed with the Powercolor 290x PCS+ bios) Haven't had much time since putting the waterblock on it and also upgrading the pads to play with it. Its no "golden" 290x by any means, but I'm pretty happy with the results.
> 
> *Max clock = 1280 / 1500*
> 
> *Benches:*
> 
> Firestrike: http://www.3dmark.com/fs/1809882
> 
> Valley:


1280 is bleepin fantastic imho. What's your ASIC? 1200/1500 is my ceiling at ASIC 70% and I'm totally happy with that.

Which brings me to this:


Quote:


> Originally Posted by *Prozillah*
> 
> Been doing alot of reading regarding the 4670k vs 4770k for BF4 utilizing HT & multiple GPU's.
> 
> So currently I have the 4670k @ 4.5ghz and ive recently purchased myself a couple of 290's to run in crossfiree basically just for bf4. Since doing so I have noticed my cpu working alot harder and after reading a great deal of info it has become evident to me I might need to goto an i7 and take advantage of the HT.
> 
> My question is how much is my current cpu bottlenecking the gpus and will going to a 4770k with HT provide any type of noticeable boost in performance for my bf4?
> 
> Running ultra settings on 1440p 120hz monitor with a 2nd monitor for stats.


I found 290 crossfire totally not worth it paired to my 4670k. Some folks here swear 4670k cannot bottleneck 290 xfire but gpu usage was constantly dropping, and it doesn't do that in single card, so I assume that's cpu bottleneck. It does give a boost, not saying otherwise, but not enough to justify (for me) the added power draw, heat, noise. But if you're a BF4 fanatic I suppose the boost is worth it even if the upside-downside ratio isn't 1 to 1.


----------



## brazilianloser

Yeah, 1280 is pretty darn good. I can't pass 1175 on mine. But it might be that I am hitting the max of my PSU and not necessarily the card's max.


----------



## pkrexer

Quote:


> Originally Posted by *taem*
> 
> 1280 is bleepin fantastic imho. What's your ASIC? 1200/1500 is my ceiling at ASIC 70% and I'm totally happy with that.


72.9%, nothing special

Weird thing about my GPU is that it does not like anything over +190mV even though my temps never passed 45C running 3DMark. As soon as I apply +200mV to the core, my video starts wigging out... The screen will randomly go black and come back on... starts shifting side to side, really weird.


----------



## kizwan

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I did some testing and on the 14.2 beta drivers, if your running 4way crossfire you might have issues like me.
> 
> If i disable eyefinity and run a 1080p single monitor in 4 wayCF i have no issues with frame pacing.
> If i run 4wayCF in Eyefinity i get horrible screen tearing/suttering. The FPS still looks good on benchmarks but the screen is barely discernible
> 
> So while this may just be an issue for me and only me, i'm still waiting for some feedback from any 4wayCF users as well, in eyefinity if possible.
> 
> Thanks.
> 
> 4wayCF + Eyfinity = frame/stutter
> 3wayCF + Eyefinity = fine.
> 4wayCF + single screen = fine.


Are you running 4-way 290's in your sig rig with the 3820 CPU? If yes, good to know the 3820 can handle 4-way just fine. What is the total screen resolution in Eyefinity? Can you do me a favour? Please take a screenshot of CPU usage while playing BF4. You can use "Open Hardware Monitor"; just run it in the background to record CPU usage.

For example.

Quote:


> Originally Posted by *taem*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Prozillah*
> 
> Been doing alot of reading regarding the 4670k vs 4770k for BF4 utilizing HT & multiple GPU's.
> 
> So currently I have the 4670k @ 4.5ghz and ive recently purchased myself a couple of 290's to run in crossfiree basically just for bf4. Since doing so I have noticed my cpu working alot harder and after reading a great deal of info it has become evident to me I might need to goto an i7 and take advantage of the HT.
> 
> My question is how much is my current cpu bottlenecking the gpus and will going to a 4770k with HT provide any type of noticeable boost in performance for my bf4?
> 
> Running ultra settings on 1440p 120hz monitor with a 2nd monitor for stats.
> 
> 
> 
> I found 290 crossfire totally not worth it paired to my 4670k. Some folks here swear 4670k cannot bottleneck 290 xfire but gpu usage was constantly dropping, and it doesn't do that in single card, so I assume that's cpu bottleneck. It does give a boost, not saying otherwise, but not enough to justify (for me) the added power draw, heat, noise. But if you're a BF4 fanatic I suppose the boost is worth it even if the upside-downside ratio isn't 1 to 1.
Click to expand...

290's Crossfire GPU usage will fluctuate & this is normal. However, if it's followed by sudden FPS drops, then most likely the CPU is bottlenecking. If it feels smooth in games, then the CPU is not bottlenecking.

Check out my CPU & GPU usage in BF4. At all settings the game feels smooth, no sudden FPS drop, except with HT off.

This is my GPU's usage @1080p 100%resolution scale in BF4 (High settings).


CPU usage @1080p 100% resolution scale in BF4 (Ultra settings).


CPU usage @1080p 100% resolution scale in BF4 (Ultra settings) + CPU *HT OFF*.
As you can see here, with HT off, CPU usage also increased.


GPU's usage @1080p 200% resolution scale in BF4 (High settings).


CPU usage @1080p 200% resolution scale in BF4 (Ultra settings).


@Prozillah, monitor your CPU usage using Open Hardware Monitor.
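If you want to go a step beyond eyeballing the graphs, the same check can be scripted. A minimal sketch: once you have per-core usage samples (e.g. parsed out of an Open Hardware Monitor CSV export - the exact column layout there is an assumption, so adapt the parsing to your own log), a pinned core while the others idle hints at a CPU limit.

```python
# Sketch: summarize per-core CPU usage samples to spot a bottleneck.
# Each sample is a list of per-core percentages, e.g. parsed from an
# Open Hardware Monitor CSV log (the parsing itself is left out here,
# since the column layout depends on your hardware).

def core_peaks(samples):
    """Return the peak usage seen for each core across all samples."""
    peaks = [0.0] * len(samples[0])
    for sample in samples:
        peaks = [max(p, u) for p, u in zip(peaks, sample)]
    return peaks

def likely_bottleneck(samples, threshold=90.0):
    """A core pinned near 100% while others idle hints at a CPU limit."""
    return any(p >= threshold for p in core_peaks(samples))

# Example: 4 cores, core 0 pegged -> looks CPU-bound (made-up numbers)
samples = [
    [97.0, 41.0, 38.0, 35.0],
    [99.0, 44.0, 40.0, 36.0],
    [95.0, 39.0, 37.0, 33.0],
]
print(core_peaks(samples))         # [99.0, 44.0, 40.0, 36.0]
print(likely_bottleneck(samples))  # True
```

This matches the rule of thumb above: fluctuating usage alone is normal; it's a sustained pegged core plus FPS drops that points at the CPU.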


----------



## Arizonian

Quote:


> Originally Posted by *magicase*
> 
> Can you add me to the list "again"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> So far I have moved from XFX -> Asus -> Sapphire Tri-X CF -> Powercolor PCS+ CF


Congrats - updated ...wow!









Quote:


> Originally Posted by *battleaxe*
> 
> I just added the second 290. Got an MSI 290. Pretty happy with it. Temps are 63c full load while mining.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## kpoeticg

Just a tiny update on Powercolor honoring RMAs. Even though Newegg's return window had expired, they still accepted the card back when I called them. But I had filed a ticket with Powercolor the day before, and Powercolor finally emailed me back a couple hours ago. It looks like they're at least willing to RMA the black-screen cards if it came to it. Thought this might be useful info to somebody.

Dear User,

1. Please try update MB BIOS to the newest version.

2. Please check your ATX power supply, the R9 290x needs a 750 watt or better power supply with one 6-pin PCIe and one 8-pin power connectors. Certified power supplies are strongly recommended; for a list of certified power supplies, please see http://support.amd.com/en-us/recommended/power-supplies

3. Please try install the newest beta graphic driver from AMD website: http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx Removing all old graphic driver before installing new driver is recommend.

4. If problem still exist after above checking step, please do a RMA service. If you need a RMA service, please contact your dealer for RMA services, they are supposed to take care of your RMA request; if your dealer is no longer in service or can't provide you services, please go to http://www.powercolor.com/us/support_rmaservice.asp then following steps to get RMA service. Thanks


----------



## Paul17041993

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I did some testing and on the 14.2 beta drivers, if your running 4way crossfire you might have issues like me.
> 
> If i disable eyefinity and run a 1080p single monitor in 4 wayCF i have no issues with frame pacing.
> If i run 4wayCF in Eyefinity i get horrible screen tearing/suttering. The FPS still looks good on benchmarks but the screen is barely discernible
> 
> So while this may just be an issue for me and only me, i'm still waiting for some feedback from any 4wayCF users as well, in eyefinity if possible.
> 
> Thanks.
> 
> 4wayCF + Eyfinity = frame/stutter
> 3wayCF + Eyefinity = fine.
> 4wayCF + single screen = fine.


sounds like a PCIe issue, could you put them under load and give the PCIe states for each card from GPU-Z? also go to CCC > hardware info and give the "Graphics Bus..." and "Max Bus..." if you want, should be the same as GPU-Z.

do you have the same stutter problems on 13.12 or earlier though?


----------



## Cybertox

Is there somebody with two R9 290Xs here? If so, what's the average fps in Far Cry 3 maxed at 1440p with AA off?


----------



## Coree

R9 290X w/ Accelero S1 Plus + 2x140mm fans Unigine heaven max. temps (5 hour 20min run) @ 1093/1250
Powerlimit +12%
Targeted GPU temp: 95C


After this, ran it for another 15min
Score:


----------



## Sgt Bilko

R9 290X Lightning has been released: http://www.msi.com/news/1674.html

Now to see when it pops up in stores









EDIT: Wow....Big claim MSI are making here: "world's fastest single GPU graphics card"


----------



## battleaxe

Quote:


> Originally Posted by *Sgt Bilko*
> 
> R9 290x Lighting has been released: http://www.msi.com/news/1674.html
> 
> Now to see when it pops up in stores
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: Wow....Bid claim MSI are making here: "world's fastest single GPU graphics card"


Nice... that looks awesome. I bet its gonna be pricey though.


----------



## kpoeticg

Wow, 2x8-pin + 1x6-pin. That's a lot of power


----------



## Coree

Quote:


> Originally Posted by *kpoeticg*
> 
> Wow, 2x8Pin + 1x6Pin. That's alot of power


Yeah. But only the 2x8-pin are necessary; no need to plug in the 6-pin if you don't need it.


----------



## Ukkooh

Quote:


> Originally Posted by *pkrexer*
> 
> Finally had some time to push the limits of my Asus 290x (Flashed with the Powercolor 290x PCS+ bios) Haven't had much time since putting the waterblock on it and also upgrading the pads to play with it. Its no "golden" 290x by any means, but I'm pretty happy with the results.
> 
> *Max clock = 1280 / 1500*
> 
> *Benches:*
> 
> Firestrike: http://www.3dmark.com/fs/1809882
> 
> Valley:


Why did you flash the powercolor bios? Were you able to oc higher after it?


----------



## SeanEboy

Quote:


> Originally Posted by *Cybertox*
> 
> Is there somebody with two R9 290Xs here? If so then whats the average fps in Far Cry 3 maxed at 1440P with aa off?


Have 1440p, and (2) 290x, but no Far Cry...

So, I just got my build up and running... What are the tests to run? Unigine Valley and 3DMark DX11? Are there parameters to save the runs I do on both of them? I want to do a comparison of each card I have (4), order them 1-4, then install the best first. Furthermore, I will be putting them all under water, so I want to have a spreadsheet of their performance before water (on air), and then after...
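The bookkeeping described above (bench each card on air, rank them, then compare against water later) is easy to script instead of hand-maintaining a spreadsheet. A hedged sketch; the card names and scores below are made-up placeholders, not real results.

```python
# Sketch: rank GPUs by benchmark score and compare air vs. water runs.
# Scores are hypothetical placeholders - substitute your own Valley or
# Firestrike results per card.

def rank_cards(scores):
    """Return card names ordered best-first by benchmark score."""
    return sorted(scores, key=scores.get, reverse=True)

air = {"card1": 9450, "card2": 9720, "card3": 9310, "card4": 9600}
order = rank_cards(air)
print(order)  # best card first -> install order

# Later, after the waterblocks go on, compare per-card gains:
water = {"card1": 9980, "card2": 10250, "card3": 9850, "card4": 10120}
for card in order:
    gain = water[card] - air[card]
    print(f"{card}: air {air[card]} -> water {water[card]} (+{gain})")
```

Running the same benchmark preset for every card (same clocks, same driver) is what makes the ranking meaningful.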


----------



## vortex240

Quote:


> Originally Posted by *ZealotKi11er*
> 
> HT will provide absolutely nothing in BF4. Same story with BF3. I can see HT helping if BF4 has similar utilization for 8-Core CPU as its does for 4-Core. I upgraded from 3570K to 3770K because I was being bottlenecked in BF3 with 2 x HD 7970. It basically did nothing. CPU usage went from 90% to 45%-50%. GPU utilization stayed the same and so did the fps.


Thank you! Finally someone who understands this. There are a ton of articles/threads on the net that have covered this over and over.

I said it before and I'll say it again: a 4670K at 4.5 is NOT a bottleneck. Then again it's not my money, people can go find out on their own.


----------



## TommyGunn123

It's up, and it's damn good!



http://www.overclock3d.net/reviews/gpu_displays/msi_r9_290x_lightning_review/1


----------



## vortex240

Quote:


> Originally Posted by *Coree*
> 
> R9 290X w/ Accelero S1 Plus + 2x140mm fans Unigine heaven max. temps (5 hour 20min run) @ 1093/1250
> Powerlimit +12%
> Targeted GPU temp: 95C
> 
> 
> After this, ran it for another 15min
> Score:


After looking at your results, there are only 2 scenarios I see happening here.

A) My S1 is defective (not sure how, but maybe the liquid escaped from the heatpipes over the years).

B) Your chip is one of a kind and doesn't leak much power.

Either way, I'm glad this is working so well for you. I was gonna post pics but I reverted to the stock cooler and really don't feel like throwing the S1 on again - since it looks just like your setup except I used the stock baseplate and different fans.


----------



## Ukkooh

Quote:


> Originally Posted by *TommyGunn123*
> 
> It's up, and its damn good!
> 
> 
> 
> http://www.overclock3d.net/reviews/gpu_displays/msi_r9_290x_lightning_review/1


I might actually get one if they are binned well enough for most of them to do 1300 core at daily usage voltages.


----------



## phallacy

Quote:


> Originally Posted by *Ukkooh*
> 
> I might actually get one if they are binned good enough for most of them to do 1300 core with daily usage voltages.


Agreed, the memory OC is great, but it doesn't help nearly as much as the core for performance. I'm curious to see what it would do underwater and overvolted a bit more. +60mV is really not that bad to get a 1150/1650.


----------



## goodenough88

Got my eye on possibly buying 2 R9 290's a little later this year. I'm just after some approximate results of temps for these cards in Xfire while being watercooled? What is the OC potential of these cards when being watercooled? Also, what PSU would be required to run 2 of these cards OC? And the PSU required if I ran 3 of them OC? Lastly, since I'm going to watercool the cards, is there any need for me to spend extra money on buying a card with a custom air cooler? Or can I buy a cheaper card without the custom cooler?

I could search for the answers, but I'm heading off to bed now, so I'm hoping some awesome OCN user will be able to answer my questions while I sleep


----------



## phallacy

Quote:


> Originally Posted by *goodenough88*
> 
> Got my eye on possibly buying 2 R9 290's a little later this year. I'm just after some approximate results of temps for these cards in Xfire while being watercooled? What is the OC potential of these cards when being watercooled? Also, what PSU would be required to run 2 of these cards OC? And the PSU required if I ran 3 of them OC? Lastly, since I'm going to watercool the cards, is there any need for me to spend extra money on buying a card with a custom air cooler? Or can I buy a cheaper card without the custom cooler?
> 
> I could search for the answers, but I'm heading off to bed now, so I'm hoping some awesome OCN user will be able to answer my questions while i sleep


I'll try to answer your questions. In order,

1. Crossfire temps should be around 50C core and 60ish VRM, depending on how much cooling surface area you have in your loop
2. No one knows the OC potential under water yet.
3. Reviews say 250W TDP at stock voltages, so double that and add 30-50% to be safe for heavy overclocking; you're looking at 650-700W for the cards. A 1000W PSU or above should be able to handle it.
4. Three cards would require 1200W or above. I'm using an EVGA SuperNOVA 1300W for my trifire setup.
5. Yes and no. The reason people buy custom over reference designs is that some of them (MSI, Sapphire, PowerColor) are higher-binned cards with components meant to sustain better performance than the reference design. It all depends on the price of the card vs. reference and how much overclocking you intend to do.
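A quick back-of-the-envelope sketch of the sizing rule in point 3 (assuming the 250W stock TDP figure from the post; the 300W rest-of-system budget and the `estimated_psu_watts` helper are my own placeholders, not anyone's official numbers):

```python
# Rough PSU sizing for overclocked R9 290/290X cards, per the rule of thumb
# above: start from the 250 W stock TDP, scale by card count, add 30-50%
# headroom for heavy overclocking, then add a budget for the rest of the rig.

STOCK_TDP_W = 250  # reference R9 290/290X board power

def estimated_psu_watts(num_cards, oc_headroom=0.4, rest_of_system_w=300):
    """Back-of-the-envelope PSU estimate in watts (illustrative only)."""
    gpu_draw = num_cards * STOCK_TDP_W * (1 + oc_headroom)
    return gpu_draw + rest_of_system_w

# Two overclocked cards: 2 * 250 * 1.4 + 300 = 1000 W -> a 1000 W unit fits
print(estimated_psu_watts(2))   # 1000.0
# Three cards: 3 * 250 * 1.4 + 300 = 1350 W -> a 1300 W+ unit is sensible
print(estimated_psu_watts(3))   # 1350.0
```

Obviously measure at the wall with a meter before trusting any estimate like this.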


----------



## pkrexer

Quote:


> Originally Posted by *Ukkooh*
> 
> Why did you flash the powercolor bios? Were you able to oc higher after it?


Primarily because it has +50mv set within the bios. It does seem to be a bit more stable at higher clocks, but that could be just my imagination.


----------



## Jack Mac

TechPowerUp has VBIOSes for plenty of cards. Link:
http://www.techpowerup.com/vgabios/152917/powercolor-r9290-4096-140108.html


----------



## Coree

Quote:


> Originally Posted by *vortex240*
> 
> After looking at your results, there are only 2 scenarios I see happening here.
> 
> A) my S1 is defective (not sure how but it is, maybe the liquid escaped from the heatpipes over the years.
> 
> B) Your chip is one of a kind and doesn't leak much power
> 
> Either way, I'm glad this is working so well for you. I was gonna post pics but I reverted to the stock cooler and really don't feel like throwing the S1 on again - since it looks just like your setup except I used the stock baseplate and different fans.


Hmm, I'm just thinking it could be the heatpipes too. I know that the new and old revisions both use 4x6mm heatpipes, but could there be some internal differences in the pipes? Idk about the leakage, it shouldn't make much of a difference really. By the way, what is your ASIC quality?


----------



## vortex240

Quote:


> Originally Posted by *Coree*
> 
> Hmm, i'm just thinking that it could be the heatpipes too, I know that the new and old revision use 4x6mm heatpipes, but could there be some internal differences in the pipes? Idk about the leakage, it should'nt do much of a difference really. By the way, what is your ASIC quality?


81.6% - it's an AMD internal sample, clocks very well. The S1 and S1 Turbo have identical dimensions; only the compatibility has been improved through the better mounting plate. Either way, I paid $30 for it 7 years ago, so maybe some of the liquid inside has evaporated/seeped out over the years.

Then again, I don't see Arctic Cooling recommending it to cool a 250-watt card like the R9 290. For example, Tom's Hardware used their Accelero Xtreme for 1150 core with 100% fans and got 70C temps. Mind you, the Accelero Xtreme is a 300-watt cooler.

http://www.tomshardware.com/reviews/r9-290-accelero-xtreme-290,3671-4.html

http://www.arctic.ac/worldwide_en/products/cooling/vga/accelero-xtreme-iii.html

http://www.arctic.ac/worldwide_en/products/cooling/vga/accelero-s1-plus.html

"Equip it with an add-on fan - S1 PLUS Turbo Module, the cooling capacity of the Accelero S1 PLUS can be upgraded to 120 Watts, which is adequate enough for some of the high-end VGA cards including AMD HD 7870 and NVIDIA GTX 670."

So in conclusion, I have no idea how you are able to cool ~275watts with a 120watt cooler. You got black magic going on there.


----------



## rdr09

Quote:


> Originally Posted by *vortex240*
> 
> Thank you! Finally someone that understands this. There are a ton of articles/threads on the net that covered this over and over.
> 
> I said it before and I'll say it again a 4670K at 4.5 is NOT a bottleneck. Then again it's not my money, people can go find out on their own.


my findings differ. here is my i7 HT off in BF3 MP 64 . . .



100% on all cores. those are a bit faster than a single 290. during huge explosions - I was able to tell that the game lagged. never occurred with HT on.

Good news: I play BF4 MP 64 with my i7 at stock and HT off but using mantle. I love it.


----------



## Coree

7 years, darn :O I've had mine for like 9 months now; it was earlier strapped on a 7870LE (200W TDP), same fan configuration.



Check my post on the 7870LE:
http://www.overclock.net/t/1373543/official-7870-tahiti-le-xt-owners-club/3360#post_20835678


----------



## rickyman0319

i am wondering if there is any aftermarket cooler that takes 2 slots only (1 slot + 1 GPU)?


----------



## The Mac

sure, waterblocks.

lol


----------



## phallacy

I recently bought a kill a watt so I could see my power draw and if the 290x used THAT much power.

So far I've only done two benchmarks but plan on doing more and updating. To make sure I was recording accurately, I moved everything on that outlet to an adjacent room's outlet and plugged my computer into the meter. Then reset and switched it to watts on a 120v outlet.

I ran Heaven 4.0 and 3dmark11 P after about 30 minutes of idling

The following gpu clocks were
1: 1200/1650 +96mV +50 power limit
2: 1200/1500 +103mV +50 power limit
3: 1225/1650 +110mV +50 power limit

Heaven settings were maxed at 1080p with 8xAA

Beginning by ship 996 watts



Rotating screen close to end 1007 watts



3dmark11 performance setting

Graphics scene 4 1001 watts


CPU only test 596 watts


Combined test 701 watts


----------



## Roboyto

Quote:


> Originally Posted by *rickyman0319*
> 
> i am wondering if there any aftermarker cooler that takes 2 slot only ( 1 slot + 1 gpu)?


The Gelid Icy Vision appears thinner than the Arctic with the supplied fans. Can anyone say exactly how thick it is?


----------



## Gunderman456

Has anyone had this problem? No matter what I set and try to apply/save in Afterburner, it keeps specific values - for me 1166 core and 1340 mem. I even installed the latest AMD drivers and reinstalled Afterburner, and it still shows the same values; it won't even show the default clock values for the R9 290 cards anymore.

Edit;

I installed Trixx and it also initially gave me 1166 core and 1340 mem; however, I was able to reset the default clocks with it and choose the clocks of my liking.

I went back to Afterburner and it was still defective, until I deleted the profile files in the folder; then I was able to reset to default values and overclock to whatever I wanted. Weird, since when I uninstalled Afterburner I was sure to delete the folder and its profiles.

Not sure if Trixx fixed the issue or deleting the Afterburner Profiles afterwards did the trick, but now it works.


----------



## LazarusIV

Quote:


> Originally Posted by *Forceman*
> 
> Do you have ULPS disabled? I had similar problems (it sounds like) until I disabled ULPS in the Afterburner settings, after that all was fine.


I disabled ULPS per your suggestion but it just did it again. It definitely doesn't happen nearly as frequently but apparently it still does happen. Maybe a mobo setting?


----------



## goodenough88

Quote:


> Originally Posted by *phallacy*
> 
> I'll try to answer your questions. In order,
> 
> 1. Crossfire temps should be around 50c core and 60ish vrm depending on how much cooling surface area you have in your loop
> 2. Noone knows OC potential when underwater yet.
> 3. Review says 250w tdp with stock voltages so double that and add 30-50% to be safe for heavy overclocking, you're looking at 650-700w for the cards. 1000w psu or above should be able to handle it.
> 4. 3 cards would require a 1200w or above. I'm using an evga supernova 1300w for my trifire setup
> 5. Yes and no. The reason people buy custom over reference designs is that some of them (MSI, Sapphire, Powercolor) are higher binned cards with components meant to sustain better performance than the reference design. All depends on price of the card vs reference and how much overclocking you intend to do.


Thanks very much for that, and sorry for the poor layout of my post haha

I am planning on buying the EVGA SuperNOVA G2 1000w, but might upgrade in case I do go for a trifire setup.

As for overclocking, I just want to give the cards a decent OC but able to keep them steady for gaming.


----------



## tsm106

Quote:


> Originally Posted by *phallacy*
> 
> I recently bought a kill a watt so I could see my power draw and if the 290x used THAT much power.
> 
> So far I've only done two benchmarks but plan on doing more and updating. To make sure I was recording accurately, I moved everything on that outlet to an adjacent room's outlet and plugged my computer into the meter. Then reset and switched it to watts on a 120v outlet.
> 
> I ran Heaven 4.0 and 3dmark11 P after about 30 minutes of idling
> 
> The following gpu clocks were
> 1: 1200/1650 +96mV +50 power limit
> 2: 1200/1500 +103mV +50 power limit
> 3: 1225/1650 +110mV +50 power limit
> 
> Heaven settings were maxed at 1080p with 8xAA
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Beginning by ship 996 watts
> 
> 
> 
> Rotating screen close to end 1007 watts
> 
> 
> 
> 3dmark11 performance setting
> 
> Graphics scene 4 1001 watts
> 
> 
> CPU only test 596 watts
> 
> 
> Combined test 701 watts


Nice.









Your numbers are with a relatively thrifty CPU, and GPU clocks that are decently high but not nosebleed high, so imagine what's left on the table. Take for instance a 5GHz hexacore Intel CPU and GPU clocks over 1300, which would require +200mV and then some in certain cases.


----------



## phallacy

Quote:


> Originally Posted by *tsm106*
> 
> Nice.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Your numbers are with a relatively thrifty cpu and gpu clocks that are decently high but not nosebleed high so imagine what's left on the table. Like for instance a 5ghz hexacore intel cpu and gpu clocks over 1300 which would require +200mv and then some in certain cases.


I'm curious as well. Main reason I did this was to see if I was going to be heading for a short with the above overclocks and voltages. Now that I know I have a decent amount of headroom, time to venture into +150/200mV territory


----------



## Red1776

Anyone care to hazard a guess at what my 4 x R290X (OC'd) + 5.2Ghz (OC'd) CPU + a quad D5 Pump system will pull whilst being stressed tested?

maybe I will award a CPU/APU to the person who gets the closest *if that is not against OCN policy.*

If not I will provide more detailed info about the OC settings and the stress program used.


----------



## sugarhell

2k+


----------



## Roboyto

Quote:


> Originally Posted by *phallacy*
> 
> I'm curious as well. Main reason I did this was to see if I was going to be heading for a short with the above overclocks and voltages. Now that I know I have a decent amount of headroom, time to venture into +150/200mV territory


It's all about silicon lottery now









I got my XFX BE up to 1255/1675 with +125mV. Pushing to +200mV only got me 40-60Mhz additional core and 0-25MHz RAM, depending on the bench.

Let's see those max clocks please


----------



## Red1776

Considering what my 4 x 7970 system will take stressed, I think you are close Sugar.


----------



## Roboyto

Quote:



> Originally Posted by *Red1776*
> 
> Anyone care to hazard a guess at what my 4 x R290X (OC'd) + 5.2Ghz (OC'd) CPU + a quad D5 Pump system will pull whilst being stressed tested?
> 
> maybe I will award a CPU/APU to the person who gets the closest *if that is not against OCN policy.*
> 
> If not I will provide more detailed info about the OC settings and the stress program used.


Quote:


> Originally Posted by *sugarhell*
> 
> 2k+


I agree with sugarhell...You're in the 400W range per GPU depending on OC, and probably 300W for the 5.2GHz CPU...plus everything else.

When are you measuring?


----------



## Red1776

I am putting the system together as we speak. If anyone is interested in winning a new APU, I will keep a list of exact 'guesses'. (I have a feeling that folks like you, Sugar, and TSM106 will probably get within 50W.)


----------



## phallacy

Quote:


> Originally Posted by *Roboyto*
> 
> It's all about silicon lottery now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I got my XFX BE up to 1255/1675 with +125mV. Pushing to +200mV only got me 40-60Mhz additional core and 0-25MHz RAM, depending on the bench.
> 
> Let's see those max clocks please


I'm hoping I struck the lottery thrice!

Quote:


> Originally Posted by *Red1776*
> 
> Anyone care to hazard a guess at what my _4 x R290X (OC'd) + 5.2Ghz (OC'd) CPU + a quad D5 Pump system_ will pull whilst being stressed tested?
> maybe I will award a CPU/APU to the person who gets the closest *if that is not against OCN policy.*
> If not I will provide more detailed info about the OC settings and the stress program used.


Going with the others at about 2000-2100 watts. You have 2.2kW in your sig so I think it will be close lol. Will you be using Furmark?


----------



## Roboyto

Quote:


> Originally Posted by *Red1776*
> 
> I am putting the system together as we speak. If anyone is interested in winning a new APU, I will keep a list of exact 'guesses'. (I have a feeling that folks like you, Sugar, and TSM106 will probably get within 50W.)


This was pushing my 290 to the brink, 1315/1700 +200mV



Really depends on your OC and what voltage you push through them. They get *real* thirsty when you're pushing the envelope.


----------



## Red1776

ARGGG, I hate Furmark, but well over 2kW will be fun.

I am going with 2.7kW for this build. I will give exact OC speeds etc. All four of my R9 290Xs do at least 1200MHz. Good thing I had my electrician put in a dedicated circuit last year.

Quote:


> Really depends on your OC and what voltage you push through them. They get real thirsty when you're pushing the envelope.


Yeah....this is where the five rads come in hehe


----------



## Roboyto

Quote:


> Originally Posted by *Red1776*
> 
> ARGGG, I hate Furmark, but well over 2kW will be fun.
> 
> I am going with 2.7kW for this build. I will give exact OC speeds etc . all four of my R290x's best 1200Mhz at least. Good thing I had my electrician put in a dedicated circuit last year.
> 
> Yeah....this is where the five rads come in hehe


I haven't even tried Furmark for my clocks; it could likely push mine to 400W.

Chernobyl? I presume that is the name of this quad 290X creation?


----------



## The Storm

Quote:


> Originally Posted by *Red1776*
> 
> Anyone care to hazard a guess at what my _4 x R290X (OC'd) + 5.2Ghz (OC'd) CPU + a quad D5 Pump system_ will pull whilst being stressed tested?
> maybe I will award a CPU/APU to the person who gets the closest *if that is not against OCN policy.*
> If not I will provide more detailed info about the OC settings and the stress program used.


I would guess 2150w at the wall


----------



## Red1776

hehehe, no actually Chernobyl was the name of a 4 x 5870 build a few years ago. I've built a new quadfire machine every year since 2007; all of the rigs in my sig are quadfire.

This one is going to be 'Holodeck VII'


----------



## VSG

You'd need to know what overclocks and overvolts you will be running, as well as the PSU efficiency, but I will guess 2100+ watts


----------



## taem

Quote:


> Originally Posted by *goodenough88*
> 
> I am planning on buying the EVGA SuperNOVA G2 1000w, but might upgrade in case I do go for a trifire setup..


Check out the Cooler Master V1000 if you get a 1000W PSU; it's as well reviewed as the EVGA, if not better, and only 170mm, in case you ever put it in a case with a bottom fan mount you want to use.


----------



## ZealotKi11er

Gonna finally try to see how far my card goes. What tools do you guys recommend?

Does PowerTune not work with 14.2?


----------



## sugarhell

400 watts per GPU
300-400 for the Vishera
200 watts for cooling, maybe more, I don't know your loop
I estimate around 2.5k Furmark, 2.2k-2.3k everything else


----------



## DeadlyDNA

Quote:


> Originally Posted by *Red1776*
> 
> hehehe, no actually Chernobyl was the name of a 4 x 5870 build a few years ago. I build a new quadfire machine every year since 2007. all of the rigs in my sig are quadfire.
> This one is going to be 'Holodeck VII'


I was going to guess about 1800w-1900w


----------



## Roboyto

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Gona finally try to see how far my card go. What tools do you guys Recommend?
> 
> Does PowerTune not work with 14.2?


Not sure about PowerTune or 14.2. Used 14.2 briefly; it was very unstable for me. Haven't used CCC for overclocking at all personally.

Run Trixx or Afterburner. Make sure you disable ULPS and force constant voltage. Trixx has ULPS right in the settings; in AB you may have to force it through the registry since I don't readily see it...?

Registry edit for ULPS: http://www.tomshardware.com/faq/id-1904869/disable-ulps-amd-crossfire-setups.html
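For reference, the Tom's Hardware link walks through it, but the gist of the ULPS registry edit looks like this (a sketch only - the four-digit subkey varies per machine, so apply it to every numbered instance under the display-adapter class key that already has an `EnableUlps` value, and back up the registry first):

```
Windows Registry Editor Version 5.00

; Disable ULPS (Ultra Low Power State) for Crossfire setups.
; {4d36e968-...} is the display-adapter device class; the 0000/0001/...
; instance number differs per system, so repeat this for each GPU
; instance that already contains an EnableUlps value.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Reboot after importing for the change to take effect.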


----------



## goodenough88

Quote:


> Originally Posted by *taem*
> 
> Check out the Cooler Master v1000 if you get a 1000w psu, as well reviewed if not better than the evga and only 170mm, in case you ever put it in a case that has a bottom fan mount you want to use.


Yeah I also have the V1000 on my list, but none of the 3 stores that I'll be buying my parts from have the V1000 in stock and don't have an ETA on when they will get more.

But I won't be buying the PSU for a few months anyway, so it might become available again.


----------



## SeanEboy

Hey guys.. I'm getting the 'Ouch! Something has gone wrong' error.. Anyone know how to fix it?

Unexpected error
System.Xml.XmlException: '.', hexadecimal value 0x00, is an invalid character. Line 115, position 23.
at System.Xml.XmlTextReaderImpl.Throw(Exception e)
at System.Xml.XmlTextReaderImpl.ParseCDataOrComment(XmlNodeType type, Int32& outStartPos, Int32& outEndPos)
at System.Xml.XmlTextReaderImpl.ParseCDataOrComment(XmlNodeType type)
at System.Xml.XmlTextReaderImpl.ParseElementContent()
at System.Xml.Linq.XContainer.ReadContentFrom(XmlReader r)
at System.Xml.Linq.XContainer.ReadContentFrom(XmlReader r, LoadOptions o)
at System.Xml.Linq.XElement.ReadElementFrom(XmlReader r, LoadOptions o)
at System.Xml.Linq.XElement.Load(XmlReader reader, LoadOptions options)
at System.Xml.Linq.XElement.Parse(String text, LoadOptions options)
at Gui.MainWindow.ExecuteBenchmarkRun(BenchmarkXmlBuilder builder, Settings uiCustomSettings)
at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised)
at System.Windows.UIElement.RaiseEventImpl(DependencyObject sender, RoutedEventArgs args)
at System.Windows.Controls.Button.OnClick()
at System.Windows.Controls.Primitives.ButtonBase.OnMouseLeftButtonUp(MouseButtonEventArgs e)
at System.Windows.RoutedEventArgs.InvokeHandler(Delegate handler, Object target)
at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised)
at System.Windows.UIElement.ReRaiseEventAs(DependencyObject sender, RoutedEventArgs args, RoutedEvent newEvent)
at System.Windows.RoutedEventArgs.InvokeHandler(Delegate handler, Object target)
at System.Windows.EventRoute.InvokeHandlersImpl(Object source, RoutedEventArgs args, Boolean reRaised)
at System.Windows.UIElement.RaiseEventImpl(DependencyObject sender, RoutedEventArgs args)
at System.Windows.UIElement.RaiseTrustedEvent(RoutedEventArgs args)
at System.Windows.Input.InputManager.ProcessStagingArea()
at System.Windows.Input.InputManager.ProcessInput(InputEventArgs input)
at System.Windows.Input.InputProviderSite.ReportInput(InputReport inputReport)
at System.Windows.Interop.HwndMouseInputProvider.ReportInput(IntPtr hwnd, InputMode mode, Int32 timestamp, RawMouseActions actions, Int32 x, Int32 y, Int32 wheel)
at System.Windows.Interop.HwndMouseInputProvider.FilterMessage(IntPtr hwnd, WindowMessage msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
at System.Windows.Interop.HwndSource.InputFilterMessage(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
at MS.Win32.HwndWrapper.WndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam, Boolean& handled)
at MS.Win32.HwndSubclass.DispatcherCallbackOperation(Object o)
at System.Windows.Threading.ExceptionWrapper.InternalRealCall(Delegate callback, Object args, Int32 numArgs)
at MS.Internal.Threading.ExceptionFilterHelper.TryCatchWhen(Object source, Delegate method, Object args, Int32 numArgs, Delegate catchHandler)
at System.Windows.Threading.Dispatcher.LegacyInvokeImpl(DispatcherPriority priority, TimeSpan timeout, Delegate method, Object args, Int32 numArgs)
at MS.Win32.HwndSubclass.SubclassWndProc(IntPtr hwnd, Int32 msg, IntPtr wParam, IntPtr lParam)
at MS.Win32.UnsafeNativeMethods.DispatchMessage(MSG& msg)
at System.Windows.Threading.Dispatcher.PushFrameImpl(DispatcherFrame frame)
at System.Windows.Application.RunDispatcher(Object ignore)
at System.Windows.Application.RunInternal(Window window)
at Gui.App.Main()
System.UnhandledExceptionEventArgs


----------



## Roy360

just to confirm, it's the ASUS Radeon R9 290 DirectCU II cooler that has the flaw where the heatpipes don't touch, right?

When I was looking at the Tom's Hardware article, their card was black, whereas mine isn't. I have yet to break the seal on mine, but I really don't want a non-reference card that performs as badly as a reference card

the card: http://products.ncix.com/detail/asus-radeon-r9-290-directcu-6f-93995-1064.htm


----------



## Mr357

Quote:


> Originally Posted by *Red1776*
> 
> Anyone care to hazard a guess at what my _4 x R290X (OC'd) + 5.2Ghz (OC'd) CPU + a quad D5 Pump system_ will pull whilst being stressed tested?
> maybe I will award a CPU/APU to the person who gets the closest *if that is not against OCN policy.*
> If not I will provide more detailed info about the OC settings and the stress program used.


Are we talking just the components you listed? How many volts on the 290X's? How many volts on the 8350?


----------



## Red1776

I will list the entire system, voltage, OC's , pumps, CPU, HDD's, SSD's etc, and then keep a list of your guys estimates here shortly.


----------



## Mr357

Quote:


> Originally Posted by *Roboyto*
> 
> This was pushing my 290 to the brink, 1315/1700 +200mV
> 
> 
> 
> Really depends on your OC and what voltage you push through them. They get _*real*_ thirsty when you're pushing the envelope.


Even with that many volts, 1315 core is killer for a 290 that's not on LN2. Lucky you


----------



## OnEMoReTrY

I have a Sapphire Tri-X R9 290 and my power connectors are molten. I just replaced the power cable of my AX860 with another one because the plastic tips were literally melted and charred. Is this normal?


----------



## Mr357

Quote:


> Originally Posted by *OnEMoReTrY*
> 
> I have a Sapphire Tri-X R9 290 and my power connectors are molten. I just replaced the power cable of my AX860 with another one because the plastic tips were literally melted and charred. Is this normal?


Certainly doesn't sound normal. Something needs to be RMA'd.


----------



## Roboyto

Quote:


> Originally Posted by *OnEMoReTrY*
> 
> I have a Sapphire Tri-X R9 290 and my power connectors are molten. I just replaced the power cable of my AX860 with another one because the plastic tips were literally melted and charred. Is this normal?


Sounds like you have a serious problem of some sort, that definitely shouldn't be happening


----------



## OnEMoReTrY

Quote:


> Originally Posted by *Roboyto*
> 
> Sounds like you have a serious problem of some sort, that definitely shouldn't be happening


The strange thing is my 7950's power connectors don't even feel warm while mining. I understand that the power connectors aren't supposed to be melting on the R9 290, but are they supposed to be fairly hot while mining? I don't even have the voltage that high, just +37.


----------



## rt123

Quote:


> Originally Posted by *Red1776*
> 
> Anyone care to hazard a guess at what my _4 x R290X (OC'd) + 5.2Ghz (OC'd) CPU + a quad D5 Pump system_ will pull whilst being stressed tested?
> maybe I will award a CPU/APU to the person who gets the closest *if that is not against OCN policy.*
> If not I will provide more detailed info about the OC settings and the stress program used.


I'll hazard a guess,
OCed 290X 400W each = 1600W
Vishera @ 5.2Ghz = 300W
Fans / Watercooling pumps = 100W
Total consumption = 2000W.

Also, I don't think you need a quad D5 pump; a Swiftech MCP35X2 should be plenty.
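Tallied up, that component-sum guess works out like this (the per-component wattages are just the estimates from the post, summed; nothing here is measured):

```python
# Sum the wall-draw guess for the quadfire rig, component by component.
# The per-component figures are the poster's estimates, not measurements.
estimate = {
    "4x OC'd 290X @ 400 W each": 4 * 400,  # 1600 W
    "Vishera @ 5.2 GHz":         300,
    "fans + watercooling pumps": 100,
}
total_w = sum(estimate.values())
print(total_w)  # 2000
```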


----------



## taem

Quote:


> Originally Posted by *rt123*
> 
> I'll hazard a guess,
> OCed 290X 400W each = 1600W
> Vishera @ 5.2Ghz = 300W
> Fans / Watercooling pumps = 100W
> Total consumption = 2000W.
> 
> Also I dont think you need a quad D5 pump, a Swiftech MCP35X2 should be plenty.


What does a 2000w gaming rig do to your power bill?


----------



## Roboyto

Quote:


> Originally Posted by *OnEMoReTrY*
> 
> The strange thing is my 7950's power connectors don't even feel warm while mining. I understand that the power connectors aren't supposed to be melting on the R9 290, but are they supposed to be fairly hot while mining? I don't even have the voltage that high, just +37.


Just checked mine after a few hours of FFXIV and they are definitely warm, but I don't think they should be melting. It isn't mining, but running 5760x1080 off one card gives the 290 a workout!


----------



## rt123

Quote:


> Originally Posted by *taem*
> 
> What does a 2000w gaming rig do to your power bill?


Just ask any serious miner out there, they could provide you with months of concrete data.


----------



## Paul17041993

Quote:


> Originally Posted by *SeanEboy*
> 
> "Ouch! Something has gone wrong"
> Unexpected error
> System.Xml.XmlException: '.', hexadecimal value 0x00, is an invalid character. Line 115, position 23.
> at System.Xml.XmlTextReaderImpl.Throw(Exception e)


I know XML all too well, and I can also see that someone's a silly programmer...

What's this bug from, though? Did you try flushing its files and reinstalling it? Whatever program it is.

One thing to remember for the power draw subject, though: you can drop the power consumption of a 290X well below 200W, almost as low as 100W, with a little undervolting and underclocking. The efficiency ramps up massively the more you go, almost at the same rate that overclocking makes it worse.

Being under water also improves power draw by as much as 40W at stock settings vs. the reference air blower (95C core).


----------



## Roboyto

Quote:


> Originally Posted by *Mr357*
> 
> Even with that many volts, 1315 core is killer for a 290 that's not on LN2. Lucky you


I did get a very good chip! Only ran those clocks a few times for benchmark scores.

Have settled at 1200/1500 with +75mV for gaming. Running strong and temps are excellent even when FFXIV is pushing the card 100% for several hours. Haven't had a crash or BSOD yet.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Red1776*
> 
> Anyone care to hazard a guess at what my _4 x R290X (OC'd) + 5.2Ghz (OC'd) CPU + a quad D5 Pump system_ will pull whilst being stressed tested?
> maybe I will award a CPU/APU to the person who gets the closest *if that is not against OCN policy.*
> If not I will provide more detailed info about the OC settings and the stress program used.


2168w

Random figure pulled out of the air


----------



## MrWhiteRX7

1989 watts


----------



## Paul17041993

Quote:


> Originally Posted by *Red1776*
> 
> Anyone care to hazard a guess at what my _4 x R290X (OC'd) + 5.2Ghz (OC'd) CPU + a quad D5 Pump system_ will pull whilst being stressed tested?
> maybe I will award a CPU/APU to the person who gets the closest *if that is not against OCN policy.*
> If not I will provide more detailed info about the OC settings and the stress program used.


would need the voltages and clocks for anything close to accurate, but for now;

1852W with OCCT/furmark.


----------



## King4x4

Well comparing to my own system:
OCed 290X 420W [email protected] on both voltage and aux voltage = 1680 watts
Vishera @ 5.2GHz = 400W
Fans (you said 5 rads, so I would hazard push/pull?) / watercooling pumps = 40 + (24x3) = 112 watts
Misc equipment in the case = 50 watts
Total consumption = 2242 watts


----------



## nightfox

2157 watts. pulled from wall. just my guess.


----------



## Widde

Damnit, this is the 2nd time I've forgotten to apply the fan profile after sleep >_< xD PC shutting down after about 10-20 min of BF4







Have the fan fixed at 35% when I sleep

btw 50C idling is that normal at stock?







ambient is around 19-21C when I have my custom profile active http://piclair.com/tv2th


----------



## JMCB

I want to update my stats. I went from 2x 290X to 4x 290X, all water-cooled (4x XSPC water blocks). Gotta do some benches (currently mining), but so far pretty happy with how everything has turned out (although VRM1 temps on cards 3 and 4 get VERY hot mining after 3+ hours, at almost 96C - need to find a way to get those cooler).


----------



## ZealotKi11er

Quote:


> Originally Posted by *JMCB*
> 
> I want to update my stats. I went from 2x 290x to 4x 290x all water-cooled ( 4x XSPC. Water blocks). Gotta do some benches(currently mining) but so far, pretty happy with how everything has turned out (althought VRM1 temps on cards 3 and 4 get VERY hot mining after 3+ hours at almost 96C - need to find a way to get those cooler).


You need to buy better thermal tape for VRM1s.


----------



## SeanEboy

Quote:


> Originally Posted by *JMCB*
> 
> I want to update my stats. I went from 2x 290x to 4x 290x all water-cooled ( 4x XSPC. Water blocks). Gotta do some benches(currently mining) but so far, pretty happy with how everything has turned out (althought VRM1 temps on cards 3 and 4 get VERY hot mining after 3+ hours at almost 96C - need to find a way to get those cooler).


See Roboyto's excellently detailed thread... You need Fujipoly Ultra Extreme thermal pads...
Quote:


> Originally Posted by *Paul17041993*
> 
> I know XML all too well and also see that someones a silly programmer...
> 
> whats this bug from though? and did you try flushing files and re-install of it? whatever program it is.


I guess I should've mentioned that bit of info... It's effin' 3DMark... I'm about to start 3D-marking up all the damn walls in this house! Tried the localappdata file delete, tried downloading from different mirrors.. Nada.


----------



## Ukkooh

Quote:


> Originally Posted by *taem*
> 
> What does a 2000w gaming rig do to your power bill?


It would do nothing for me as my power bill is fixed anyway.
Quote:


> Originally Posted by *JMCB*
> 
> I want to update my stats. I went from 2x 290x to 4x 290x all water-cooled ( 4x XSPC. Water blocks). Gotta do some benches(currently mining) but so far, pretty happy with how everything has turned out (althought VRM1 temps on cards 3 and 4 get VERY hot mining after 3+ hours at almost 96C - need to find a way to get those cooler).


If only two of the cards get hot while mining it is most likely just a bad mount. Or did you mine only on two of the cards?


----------



## phallacy

Hey guys, in CCC there are profiles you can set for specific games or applications. I haven't touched this area, however, and it's all set to "let application decide" where possible. Are there any games or profiles I should be using for better performance or benches?


----------



## Roboyto

Quote:


> Originally Posted by *Widde*
> 
> Damnit this is the 2nd time I forget to apply the fan profile after sleeping >_< xD Pc shutting down after about 10-20min of bf4
> 
> 
> 
> 
> 
> 
> 
> Have the fan fixed at 35% when I sleep
> 
> btw 50C idling is that normal at stock?
> 
> 
> 
> 
> 
> 
> 
> ambient is around 19-21C when I have my custom profile active http://piclair.com/tv2th


Looks like you have a reference card with the blower? If that's the case, 45-50C idle is about right, if I remember from when I was testing with the blower.


----------



## Roboyto

Quote:


> Originally Posted by *SeanEboy*
> 
> See Roboyto's excellently detailed thread... You need Fujipoly Ultra Extreme thermal pads...


Thanks

Quote:



> Originally Posted by *JMCB*
> 
> I want to update my stats. I went from 2x 290x to 4x 290x, all water-cooled (4x XSPC water blocks). Gotta do some benches (currently mining), but so far pretty happy with how everything has turned out (although VRM1 temps on cards 3 and 4 get VERY hot mining after 3+ hours, at almost 96C - need to find a way to get those cooler).
> 
> 
> Spoiler: Warning: Spoiler!

Here is the answer to your VRM1 temperatures:









http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures

What are the temps like on cards 1 & 2? Are they within a similar temperature range? Or is the heated up water that has passed through blocks 1 & 2 causing higher temps on 3 & 4?

Before I upgraded pads with my XSPC block the worst temperature I ever saw was 81C.


----------



## kizwan

Quote:


> Originally Posted by *phallacy*
> 
> I recently bought a kill a watt so I could see my power draw and if the 290x used THAT much power.
> 
> So far I've only done two benchmarks but plan on doing more and updating. To make sure I was recording accurately, I moved everything on that outlet to an adjacent room's outlet and plugged my computer into the meter. Then reset and switched it to watts on a 120v outlet.
> 
> I ran Heaven 4.0 and 3dmark11 P after about 30 minutes of idling
> 
> The following gpu clocks were
> 1: 1200/1650 +96mV +50 power limit
> 2: 1200/1500 +103mV +50 power limit
> 2: 1225/1650 + 110mV +50 power limit
> 
> Heaven settings were maxed at 1080p with 8xAA
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Beginning by ship 996 watts
> 
> 
> 
> Rotating screen close to end 1007 watts
> 
> 
> 
> 3dmark11 performance setting
> 
> Graphics scene 4 1001 watts
> 
> 
> CPU only test 596 watts
> 
> 
> Combined test 701 watts


Nice!








Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Gonna finally try to see how far my card will go. What tools do you guys recommend?
> 
> Does PowerTune not work with 14.2?
> 
> 
> 
> Not sure about PowerTune or 14.2. Used 14.2 briefly; it was very unstable for me. Haven't used CCC for overclocking at all personally.
> 
> 
> 
> Run Trixx or Afterburner. Make sure you disable ULPS and force constant voltage. Trixx has ULPS right in the settings, AB you may have to force through registry since I don't readily see it...?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Registry edit for ULPS: http://www.tomshardware.com/faq/id-1904869/disable-ulps-amd-crossfire-setups.html

With MSI AB 3 beta 18, you can disable ULPS on the settings page.
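For anyone stuck on an older version without that toggle, the registry route Roboyto linked boils down to setting `EnableUlps` to 0 under the display adapter's class key. A rough sketch of the .reg file (the numbered subkey - 0000, 0001, etc. - varies per system, so check which one holds your Radeon entries before importing):

```
Windows Registry Editor Version 5.00

; Example only - confirm the correct numbered subkey on your own system.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```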
Quote:


> Originally Posted by *OnEMoReTrY*
> 
> I have a Sapphire Tri-X R9 290 and my power connectors are molten. I just replaced the power cable of my AX860 with another one because the plastic tips were literally melted and charred. Is this normal?


Definitely not normal. Probably a bad cable. Was the melted cable the stock cable or a custom one? Hopefully the new one doesn't melt too.


----------



## Roboyto

Quote:


> Originally Posted by *kizwan*
> 
> With MSI AB 3 beta 18, you can disabled ULPS in settings page.


Good to know, I have 2.3.1 installed.


----------



## battleaxe

Quote:


> Originally Posted by *taem*
> 
> What does a 2000w gaming rig do to your power bill?


Costs about $100.00 a month to run it at $0.063 per kWh.
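For reference, the math on that works out as a quick sketch, assuming the rig actually pulls 2000W around the clock (real usage would be lower at idle):

```python
# Rough monthly electricity cost for a rig at constant load.
# Assumed inputs: 2000 W draw 24/7 at $0.063 per kWh (battleaxe's rate).
def monthly_cost(watts, rate_per_kwh, hours_per_day=24, days=30):
    kwh = watts / 1000 * hours_per_day * days  # energy used in a month
    return kwh * rate_per_kwh

print(round(monthly_cost(2000, 0.063), 2))  # 90.72 - in line with "about $100 a month"
```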


----------



## Arizonian

Quote:


> Originally Posted by *JMCB*
> 
> I want to update my stats. I went from 2x 290x to 4x 290x, all water-cooled (4x XSPC water blocks). Gotta do some benches (currently mining), but so far pretty happy with how everything has turned out (although VRM1 temps on cards 3 and 4 get VERY hot mining after 3+ hours, at almost 96C - need to find a way to get those cooler).
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated


----------



## OnEMoReTrY

Quote:


> Originally Posted by *kizwan*
> 
> Nice!
> 
> 
> 
> 
> 
> 
> 
> 
> With MSI AB 3 beta 18, you can disabled ULPS in settings page.
> Definitely not normal. Probably bad cable. The melted cable is stock cable or custom cable? Hopefully new one doesn't melt too.


It was a stock cable for the AX860. The new cable is feeling just as hot while the card is mining. Seems like it's probably the video card at this point.


----------



## Widde

Quote:


> Originally Posted by *Roboyto*
> 
> Looks like you have a reference card with blower? If that is the case 45-50C idle is about right if I remember from when I was testing with blower.


Yeah, it's a reference card ^^ But I seem to remember it running cooler at idle on 13.12 :S Gonna put them under water in the future and probably add a 3rd


----------



## Roboyto

Quote:


> Originally Posted by *Brian18741*
> 
> Ok, so for anyone who was following, my Asus R9 290 DCU IIs in crossfire were hitting ridiculous temperatures. I was hitting 94°C on the core of the top card when leaving the fans on auto!
> 
> So I've been swapping thermal paste to see if there are any improvements. Results below. Making definite progress; I tried some Arctic Cooling MX-4 last week and the results were fairly disappointing.
> 
> This weekend I have tried Gelid OC Extreme and so far this paste is giving the best results. Again though, this is at 100% fans on both cards and case fans so it's very loud.
> 
> Watercooling is still the only viable option. Oh well, may start saving!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Stock clocks and voltages. Fans 100%. Case fans 100%. Over 1 hour of gaming.


Quote:



> Originally Posted by *Roy360*
> 
> just to confirm, but it's the ASUS Radeon R9 290 DirectCU II cooler that has the flaw where the heatpipes don't touch right?
> 
> When I was looking at the Tom's Hardware article, their card was black, whereas mine isn't. I have yet to break the seal on mine, but I really don't want a non-reference card that performs as badly as a reference card
> 
> the card: http://products.ncix.com/detail/asus-radeon-r9-290-directcu-6f-93995-1064.htm


It was the DC2 that was having issues with the heatpipes. I think this could be fixed with a copper shim?







If you look at the cooler for the 7970 DC2 Top, you can see they added another piece of copper across the heatpipes to make sure the heat is distributed. Why they didn't do this here in the first place is a mystery though!

*7970*



*290...2 heatpipes not even being touched*











Edit: Also noticed the 7970 has 6 heatpipes and the 290 only 5...







Granted one of the pipes on the 290 is larger, but why would they do this?


----------



## Roboyto

Quote:


> Originally Posted by *OnEMoReTrY*
> 
> It was a stock cable for the AX860. The new cable is feeling just as hot while the card is mining. Seems like it's probably the video card at this point.


Maybe a little airflow across the top of the card and power connectors could help a little?

On second thought, I think you have a bad card. My cards have warm air from the radiator blowing on them and they still don't get that hot... you may have to RMA it.

Do you have another PSU to test with to make sure that isn't the problem?


----------



## OnEMoReTrY

Quote:


> Originally Posted by *Roboyto*
> 
> Maybe a little airflow across the top of the card and power connectors could help a little?
> 
> On second thought, I'm feeling you have a bad card. My card has warm air from radiator blowing on it and they still don't get that hot...may have to RMA it.
> 
> Do you have another PSU to test with to make sure that isn't the problem?


I actually have a 140mm fan standing up right next to it, but it doesn't seem to be helping. I've got another PSU that I can test with, so I'll give that a shot. I'm not sure if the plastic melted/tore on its own, or when I pulled the cable out right after it had been running that hot. Either way it probably indicates a problem, because I can't keep a finger on it for more than 1-2 seconds before getting burned.

I really hope I won't have to RMA this =/ it's such a nice overclocker and has no coil whine. If it melts another cable/burns out it's going back I guess.


----------



## Roboyto

Quote:


> Originally Posted by *OnEMoReTrY*
> 
> I actually have a 140mm fan standing up right next to it, doesn't seem to be helping. I've got another PSU that I can test with, so I'll give that a shot. I'm not sure if the plastic melted/tore on its own, or when I pulled the cable out right after it had been running that hot. Probably either way indicates a problem because I can't keep a finger on it for more than 1-2 seconds before getting burned.


Swap the PSU temporarily to test, and if nothing changes I would get that card out of there before it fries something, or everything, else!


----------



## Mr357

Quote:


> Originally Posted by *JMCB*
> 
> I want to update my stats. I went from 2x 290x to 4x 290x, all water-cooled (4x XSPC water blocks). Gotta do some benches (currently mining), but so far pretty happy with how everything has turned out (although VRM1 temps on cards 3 and 4 get VERY hot mining after 3+ hours, at almost 96C - need to find a way to get those cooler).
> 
> 
> Spoiler: Warning: Spoiler!


Like others have said, upgrade the thermal pads on your cards, but even besides that you may need to add rad space. How are your CPU and GPU core temperatures?


----------



## Roboyto

Quote:


> Originally Posted by *JMCB*
> 
> I want to update my stats. I went from 2x 290x to 4x 290x, all water-cooled (4x XSPC water blocks). Gotta do some benches (currently mining), but so far pretty happy with how everything has turned out (although VRM1 temps on cards 3 and 4 get VERY hot mining after 3+ hours, at almost 96C - need to find a way to get those cooler).
> 
> 
> Spoiler: Warning: Spoiler!


Quote:



> Originally Posted by *Mr357*
> 
> Like others have said, upgrade the thermal pads on your cards, but even besides that you may need to add rad space. How are your CPU and GPU core temperatures?


Good call.

As long as his rig build is accurate, he is using an RX480 and an RS360. If you're going to be mining, I would think 2x120 of rad space per GPU is necessary for a constant 100% load, especially when overclocked. The RX helps a little since it's double thick, but you have your CPU in the loop as well, so you may very well need more radiator real estate.

I know from recent experience that the RX V2 and RS series radiators are discontinued. If you're going to stick with XSPC, they just came out with the RX V3 and EX radiators, both of which take design principles from the AX. I just replaced my RS/RX radiators with EX.


----------



## VSG

Quote:


> Originally Posted by *Roboyto*
> 
> It was the DC2 that was having issues with the heatpipes. I think this may be able to be fixed by using a copper shim?
> 
> 
> 
> 
> 
> 
> 
> If you look at the cooler for the 7970 DC2 Top you can see they have another piece of copper across the heatpipes to make sure the heat is distributed. Why they didn't do this in the first place is a mystery though!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *7970*
> 
> 
> 
> *290...2 heatpipes not even being touched*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Also noticed the 7970 has 6 heatpipes and the 290 only 5...
> 
> 
> 
> 
> 
> 
> 
> Granted one of the pipes on the 290 is larger, but why would they do this?


Does this look familiar:



That's the DCUII 780 Ti cooler, which came out before the DCUII 290x.

Or how about this:



That's the DCUII 780. There has been very little change that I can see on this cooler, and ASUS thought it would be great for all GPUs


----------



## Roboyto

Quote:


> Originally Posted by *geggeg*
> 
> Does this look familiar:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> That's the DCUII 780 Ti cooler, which came out before the DCUII 290x.
> 
> Or how about this:
> 
> 
> 
> 
> 
> That's the DCUII 780. There has been very little change that I can see on this cooler, and ASUS thought it would be great for all GPUs


One of the review sites, and others in this thread, mentioned this fact as well. Hawaii is so much smaller than GK110... lots of transistors in a smaller package means more concentrated heat... ASUS made a horrible decision here.

I bet with a shim the temps would improve...I'd like to see someone try it out.

Other big question...is ASUS doing anything to fix the problem?


----------



## Widde

Hmmm, my idle temps got lowered to 43-45C when I moved my pc from behind my screens to the floor


----------



## Hl86

I just went from 670 to 290x and i get lower fps in WoW. Its only WoW though.
Is there a fix?


----------



## Roboyto

Quote:


> Originally Posted by *Hl86*
> 
> I just went from 670 to 290x and i get lower fps in WoW. Its only WoW though.
> Is there a fix?


Did you make sure you properly removed the Nvidia drivers first? If you didn't, I would suggest using Display Driver Uninstaller to make sure everything is gone for a clean installation.

http://www.guru3d.com/files_details/display_driver_uninstaller_download.html

I haven't played WoW in a long time, but wouldn't a 670 have been overkill already? What frames did you have before and what kind of frames are you seeing with the 290X?


----------



## Ultracarpet

Is there a BIOS and/or utility that will give me more voltage control? I'm maxed out on AB at +100mV and topped out at 1150 core / 1500 memory


----------



## taem

Quote:


> Originally Posted by *Ultracarpet*
> 
> Is there a bios and or utility that will give me more voltage control... I'm maxed out on AB with +100mv and am maxed out at 1150 core/ 1500 memory


Trixx goes to +200mV vddc adjust. But Afterburner has aux voltage adjustment which can sometimes offer stability for higher clocks at a given vddc adjust. If you are air cooling and can't handle more than +100mV that can be better than the higher limit on Trixx. Afterburner also has memory voltage, though I can't get it to work on any of the 290s I've tried.

Quote:


> Originally Posted by *Widde*
> 
> Hmmm, my idle temps got lowered to 43-45C when I moved my pc from behind my screens to the floor


What do you have that idles at 43-45 in *Sweden*, where I'm thinking it's probably cold?


----------



## KGBinUSA

Quote:


> Originally Posted by *Hl86*
> 
> I just went from 670 to 290x and i get lower fps in WoW. Its only WoW though.
> Is there a fix?


WoW is more CPU-intensive; upgrading the GPU shouldn't have changed anything at all.

To me, it sounds like you might have hit the limit of your power supply. What power supply do you have? What CPU, and did you overclock it?


----------



## Paul17041993

Quote:


> Originally Posted by *SeanEboy*
> 
> I guess I should've mentioned that but of info... It's effin' 3DMark... I'm about to start 3D-marking up all the damn walls in this house! Tried the localappdata file delete, tried downloading from different mirrors.. Nada.


You'll have to talk to Futuremark support about that, I think. Either way, an exception shouldn't be thrown from another caught exception... that's a double fail...

I've only ever seen something like that once, which was Minecraft crashing on the crash screen: an exception thrown from something bad in the crash log from the actual crash...









Quote:


> Originally Posted by *geggeg*


Soldering on a copper plate a millimeter or so thick should gain ~10% cooling performance, more or less; it would have to be a good solder job though.


----------



## Widde

Quote:


> Originally Posted by *taem*
> 
> Trixx goes to +200mV vddc adjust. But Afterburner has aux voltage adjustment which can sometimes offer stability for higher clocks at a given vddc adjust. If you are air cooling and can't handle more than +100mV that can be better than the higher limit on Trixx. Afterburner also has memory voltage, though I can't get it to work on any of the 290s I've tried.
> What do you have that idles at 43-45 in *Sweden* where I'm thinking it's probably cold.


290s in Crossfire crammed in an Antec Three Hundred









And this is the warmest winter ever that I can remember :S


----------



## BradleyW

So much fail in this thread.
http://steamcommunity.com/app/239160/discussions/1/558749190557032294/


----------



## Widde

Quote:


> Originally Posted by *BradleyW*
> 
> So much fail in this thread.
> http://steamcommunity.com/app/239160/discussions/1/558749190557032294/


This made my night







"Gets popcorn"


----------



## BradleyW

Quote:


> Originally Posted by *Widde*
> 
> This made my night
> 
> 
> 
> 
> 
> 
> 
> "Gets popcorn"


The level of stupidity is crazy in that thread.


----------



## Roboyto

Quote:


> Originally Posted by *BradleyW*
> 
> So much fail in this thread.
> http://steamcommunity.com/app/239160/discussions/1/558749190557032294/


In light of the super awesome information in this thread...

I say we all remove our 290s, smash them with a hammer, and then burn them as a sacrificial offering to the 'Green God'


----------



## BradleyW

Quote:


> Originally Posted by *Roboyto*
> 
> In light of the super awesome information in this thread...
> 
> I say we all remove our 290s, smash them with a hammer, and then burn them as a sacrificial offering to the 'Green God'


Don't tell them that! They will do it!


----------



## Spectre-

Quote:


> Originally Posted by *BradleyW*
> 
> So much fail in this thread.
> http://steamcommunity.com/app/239160/discussions/1/558749190557032294/


You guys should read CoD threads; they kill your brain cells.

Sometimes it's just a bunch of 12-year-olds telling everyone to get i7s and 1000W PSUs to max out CoD.


----------



## Roboyto

Quote:


> Originally Posted by *Spectre-*
> 
> You guys should read CoD threads; they kill your brain cells.
> 
> Sometimes it's just a bunch of 12-year-olds telling everyone to get i7s and 1000W PSUs to max out CoD.


Everyone knows more watts = more power = more fps = more frags. These are the rules you should live your life by


----------



## kizwan

Quote:


> Originally Posted by *JMCB*
> 
> I want to update my stats. I went from 2x 290x to 4x 290x, all water-cooled (4x XSPC water blocks). Gotta do some benches (currently mining), but so far pretty happy with how everything has turned out (although VRM1 temps on cards 3 and 4 get VERY hot mining after 3+ hours, at almost 96C - need to find a way to get those cooler).
> 
> 
> Spoiler: Warning: Spoiler!


Can you monitor water temp? I'm interested to know your Delta T (= water temp - ambient temp) when the GPUs are under load, like when mining and/or playing BF4. VRM1 temps are high because the stock thermal pads can't cool them well; I recommend changing them to Fujipoly thermal pads. If your Delta T is high, like higher than 10C, adding more radiator space might help lower the water temp and therefore improve cooling capability.
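kizwan's rule of thumb can be written out as a quick sketch (the 10C threshold is his suggestion for this loop, not a hard spec, and the example temperatures are made up):

```python
# Delta T = water temp - ambient temp, per kizwan's definition above.
def delta_t(water_c, ambient_c):
    return water_c - ambient_c

def needs_more_rad(water_c, ambient_c, threshold_c=10):
    # A loop running hotter than ~10C over ambient suggests adding radiator space.
    return delta_t(water_c, ambient_c) > threshold_c

print(needs_more_rad(33, 21))  # 12C over ambient -> True
print(needs_more_rad(28, 21))  # 7C over ambient -> False
```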


----------



## BradleyW

Worst comments in threads I've seen are:

1) XBOX make all XBOX games.
2) i7 with HT @ 3.8GHz = 3.8 x 12 threads = 45GHz
3) The more RAM you have, the more RAM your game can use
4) There is no difference between 60Hz and 120Hz
5) I don't need to overclock my CPU. Games only use 40% of it.
6) Why does Windows 8 make my CPU single core

And many more.


----------



## VSG

Quote:


> Originally Posted by *BradleyW*
> 
> Worst comments in threads I've seen are:
> 6) Why does Windows 8 make my CPU single core
> 
> And many more.


lol where did you see this beauty?


----------



## kskwerl

Anyone here have a MSI 290x Gaming and uses it to mine? If so, what kind of temps do you get and hashes?


----------



## BradleyW

Quote:


> Originally Posted by *geggeg*
> 
> lol where did you see this beauty?


I think it was either on the Guru3d forums or the Overclockers UK forum.


----------



## The Mac

2 of those guys are actually on here...

lol


----------



## Spectre-

Quote:


> Originally Posted by *BradleyW*
> 
> Worst comments in threads I've seen are:
> 
> 1) XBOX make all XBOX games.
> 2) i7 with HT @ 3.8GHz = 3.8 x 12 threads = 45GHz
> 3) The more RAM you have, the more RAM your game can use
> 4) There is no difference between 60Hz and 120Hz
> 5) I don't need to overclock my CPU. Games only use 40% of it.
> 6) Why does Windows 8 make my CPU single core
> 
> And many more.


wait a second that #2 means i run my 3930k @ 4.4

so 12 X 4.4 = 52.8ghz

that means 5 fps rite guys?

i also have 16gb ram ill get 32gb ram to get like 150 fps


----------



## Hl86

Quote:


> Originally Posted by *KGBinUSA*
> 
> WoW is more CPU intensive, upgrading the GPU should not have done anything at all.
> 
> To me, it sounds like you might have hit the limit on your power supply. What power supply you got? What CPU, did you overclock it?


I'm getting 100% GPU usage at full core speed, which seems kinda weird.


----------



## Hl86

My CPU is a 2500K OC'd to 4.8GHz.


----------



## Roy360

Quote:


> Originally Posted by *Roboyto*
> 
> It was the DC2 that was having issues with the heatpipes. I think this may be able to be fixed by using a copper shim?
> 
> 
> 
> 
> 
> 
> 
> If you look at the cooler for the 7970 DC2 Top you can see they have another piece of copper across the heatpipes to make sure the heat is distributed. Why they didn't do this in the first place is a mystery though!
> 
> Edit: Also noticed the 7970 has 6 heatpipes and the 290 only 5...
> 
> 
> 
> 
> 
> 
> 
> Granted one of the pipes on the 290 is larger, but why would they do this?


Damn, so now I have to choose: stick with the R9 290 for $610 after tax and buy a shim, or buy a watercooled VisionTek R9 290 for $600 + tax + shipping...
Will the EK full-cover blocks fit the ASUS R9 290? Did ASUS make any other "improvements" to the reference design?


----------



## sugarhell

Quote:


> Originally Posted by *Spectre-*
> 
> wait a second that #2 means i run my 3930k @ 4.4
> 
> so 12 X 4.4 = 52.8ghz
> 
> that means 5 fps rite guys?
> 
> i also have 16gb ram ill get 32gb ram to get like 150 fps


fx 6300=3930
because
6=6


----------



## BradleyW

Has anyone got Thief and CFX? If so, how do I fix the following error?


Spoiler: Warning: Spoiler!



22:21:39:431 (2736) > THIEF (64-bit)
22:21:39:433 (2736) > ==============
22:21:39:433 (2736) >
22:21:39:433 (2736) >
22:21:39:434 (2736) > Memory statistics:
22:21:39:434 (2736) > Total RAM = 15.9 GB (16303 MB)
22:21:39:434 (2736) > Avail RAM = 14.1 GB (14396 MB)
22:21:39:435 (2736) > Total virtual memory = 131072.0 GB (134217727 MB)
22:21:39:435 (2736) > Avail virtual memory = 131071.9 GB (134217572 MB)
22:21:39:435 (2736) > OS = Windows 8 (build 9200)
22:21:39:436 (2736) > [Render] NxApp PreInit
22:21:39:437 (2736) > [Render] Enumerating adapters:
22:21:39:438 (2736) > [Render] Adapter 0:
22:21:39:438 (2736) > Description : AMD Radeon R9 200 Series
22:21:39:438 (2736) > Vendor : AMD (1002)
22:21:39:438 (2736) > Device : 67b0
22:21:39:438 (2736) > Sub system : b001002
22:21:39:439 (2736) > Revision : 0
22:21:39:439 (2736) > Dedicated Video Memory : 8109 MB
22:21:39:439 (2736) > Dedicated System Memory : 0000 MB
22:21:39:439 (2736) > Shared System Memory : 7936 MB
22:21:39:439 (2736) >
22:21:39:439 (2736) > [Render] Loading ADL...
*22:21:39:455 (2736) > [Render] CROSSFIRE GPU count query failed, no multi-gpu optimizations.
22:21:39:456 (2736) > [Render] Done loading ADL and AGS
22:21:39:456 (2736) > [Render] Adapter 1:
22:21:39:456 (2736) > Description : Microsoft Basic Render Driver
22:21:39:456 (2736) > Vendor : Unknown (1414)
22:21:39:456 (2736) > Device : 8c
22:21:39:456 (2736) > Sub system : 0
22:21:39:456 (2736) > Revision : 0
22:21:39:456 (2736) > Dedicated Video Memory : 0000 MB
22:21:39:456 (2736) > Dedicated System Memory : 0000 MB
22:21:39:456 (2736) > Shared System Memory : 0256 MB*
22:21:39:457 (2736) >
22:21:39:457 (2736) > [Render] Enumerating monitors:
22:21:39:457 (2736) > [Render] Monitor 0: handle = 0000000000010001, aspect = 1.78
22:21:39:457 (2736) >
22:21:39:491 (2736) > [Render] AMD HD3D Extension created.
22:21:39:491 (2736) > [Render] AMD HD3D Extension Version: 8.1

22:21:39:491 (2736) > [Render] AMD HD3D failed to initialize. (Most likely cause: there is no 3D support)
22:21:39:491 (2736) > [Render] AMD HD3D: Released Quad Buffer Stereo Extension.
22:21:39:491 (2736) > [Render] Enumerating Display Modes:
22:21:39:497 (2736) > [Render] Querying ADL for Eyefinity modes
22:21:39:498 (2736) > [Render] ADL_Display_DisplayMapConfig_Get() failed
22:21:39:498 (2736) > [Render] Monitor 'Display 1 (AMD Radeon R9 200 Series(1))' (0000000000010001):
22:21:39:498 (2736) > [Render] - Mode 0 : 640 x 480 @ 67 Hz, stereo = 0
22:21:39:498 (2736) > [Render] - Mode 1 : 640 x 480 @ 75 Hz, stereo = 0
22:21:39:498 (2736) > [Render] - Mode 2 : 640 x 480 @ 100 Hz, stereo = 0
22:21:39:498 (2736) > [Render] - Mode 3 : 640 x 480 @ 100 Hz, stereo = 0
22:21:39:498 (2736) > [Render] - Mode 4 : 640 x 480 @ 110 Hz, stereo = 0
22:21:39:498 (2736) > [Render] - Mode 5 : 640 x 480 @ 110 Hz, stereo = 0
22:21:39:499 (2736) > [Render] - Mode 6 : 640 x 480 @ 120 Hz, stereo = 0
22:21:39:499 (2736) > [Render] - Mode 7 : 640 x 480 @ 120 Hz, stereo = 0
22:21:39:499 (2736) > [Render] - Mode 8 : 640 x 480 @ 144 Hz, stereo = 0
22:21:39:499 (2736) > [Render] - Mode 9 : 640 x 480 @ 144 Hz, stereo = 0
22:21:39:499 (2736) > [Render] - Mode 10 : 640 x 480 @ 60 Hz, stereo = 0
22:21:39:499 (2736) > [Render] - Mode 11 : 720 x 480 @ 56 Hz, stereo = 0
22:21:39:499 (2736) > [Render] - Mode 12 : 720 x 480 @ 56 Hz, stereo = 0
22:21:39:499 (2736) > [Render] - Mode 13 : 720 x 480 @ 60 Hz, stereo = 0
22:21:39:499 (2736) > [Render] - Mode 14 : 720 x 480 @ 72 Hz, stereo = 0
22:21:39:499 (2736) > [Render] - Mode 15 : 720 x 480 @ 72 Hz, stereo = 0
22:21:39:500 (2736) > [Render] - Mode 16 : 720 x 480 @ 75 Hz, stereo = 0
22:21:39:500 (2736) > [Render] - Mode 17 : 720 x 480 @ 75 Hz, stereo = 0
22:21:39:500 (2736) > [Render] - Mode 18 : 720 x 480 @ 100 Hz, stereo = 0
22:21:39:500 (2736) > [Render] - Mode 19 : 720 x 480 @ 100 Hz, stereo = 0
22:21:39:500 (2736) > [Render] - Mode 20 : 720 x 480 @ 110 Hz, stereo = 0
22:21:39:500 (2736) > [Render] - Mode 21 : 720 x 480 @ 110 Hz, stereo = 0
22:21:39:500 (2736) > [Render] - Mode 22 : 720 x 480 @ 120 Hz, stereo = 0
22:21:39:500 (2736) > [Render] - Mode 23 : 720 x 480 @ 120 Hz, stereo = 0
22:21:39:500 (2736) > [Render] - Mode 24 : 720 x 480 @ 144 Hz, stereo = 0
22:21:39:501 (2736) > [Render] - Mode 25 : 720 x 480 @ 144 Hz, stereo = 0
22:21:39:501 (2736) > [Render] - Mode 26 : 720 x 480 @ 60 Hz, stereo = 0
22:21:39:501 (2736) > [Render] - Mode 27 : 720 x 576 @ 50 Hz, stereo = 0
22:21:39:501 (2736) > [Render] - Mode 28 : 720 x 576 @ 56 Hz, stereo = 0
22:21:39:501 (2736) > [Render] - Mode 29 : 720 x 576 @ 56 Hz, stereo = 0
22:21:39:501 (2736) > [Render] - Mode 30 : 720 x 576 @ 60 Hz, stereo = 0
22:21:39:501 (2736) > [Render] - Mode 31 : 720 x 576 @ 60 Hz, stereo = 0
22:21:39:501 (2736) > [Render] - Mode 32 : 720 x 576 @ 72 Hz, stereo = 0
22:21:39:501 (2736) > [Render] - Mode 33 : 720 x 576 @ 72 Hz, stereo = 0
22:21:39:502 (2736) > [Render] - Mode 34 : 720 x 576 @ 75 Hz, stereo = 0
22:21:39:502 (2736) > [Render] - Mode 35 : 720 x 576 @ 75 Hz, stereo = 0
22:21:39:502 (2736) > [Render] - Mode 36 : 720 x 576 @ 100 Hz, stereo = 0
22:21:39:502 (2736) > [Render] - Mode 37 : 720 x 576 @ 100 Hz, stereo = 0
22:21:39:502 (2736) > [Render] - Mode 38 : 720 x 576 @ 110 Hz, stereo = 0
22:21:39:502 (2736) > [Render] - Mode 39 : 720 x 576 @ 110 Hz, stereo = 0
22:21:39:502 (2736) > [Render] - Mode 40 : 720 x 576 @ 120 Hz, stereo = 0
22:21:39:502 (2736) > [Render] - Mode 41 : 720 x 576 @ 120 Hz, stereo = 0
22:21:39:502 (2736) > [Render] - Mode 42 : 720 x 576 @ 144 Hz, stereo = 0
22:21:39:503 (2736) > [Render] - Mode 43 : 720 x 576 @ 144 Hz, stereo = 0
22:21:39:503 (2736) > [Render] - Mode 44 : 800 x 600 @ 56 Hz, stereo = 0
22:21:39:503 (2736) > [Render] - Mode 45 : 800 x 600 @ 60 Hz, stereo = 0
22:21:39:503 (2736) > [Render] - Mode 46 : 800 x 600 @ 72 Hz, stereo = 0
22:21:39:503 (2736) > [Render] - Mode 47 : 800 x 600 @ 75 Hz, stereo = 0
22:21:39:503 (2736) > [Render] - Mode 48 : 800 x 600 @ 100 Hz, stereo = 0
22:21:39:503 (2736) > [Render] - Mode 49 : 800 x 600 @ 100 Hz, stereo = 0
22:21:39:503 (2736) > [Render] - Mode 50 : 800 x 600 @ 110 Hz, stereo = 0
22:21:39:503 (2736) > [Render] - Mode 51 : 800 x 600 @ 110 Hz, stereo = 0
22:21:39:504 (2736) > [Render] - Mode 52 : 800 x 600 @ 120 Hz, stereo = 0
22:21:39:504 (2736) > [Render] - Mode 53 : 800 x 600 @ 120 Hz, stereo = 0
22:21:39:504 (2736) > [Render] - Mode 54 : 800 x 600 @ 144 Hz, stereo = 0
22:21:39:504 (2736) > [Render] - Mode 55 : 800 x 600 @ 144 Hz, stereo = 0
22:21:39:504 (2736) > [Render] - Mode 56 : 1024 x 768 @ 60 Hz, stereo = 0
22:21:39:504 (2736) > [Render] - Mode 57 : 1024 x 768 @ 70 Hz, stereo = 0
22:21:39:504 (2736) > [Render] - Mode 58 : 1024 x 768 @ 75 Hz, stereo = 0
22:21:39:504 (2736) > [Render] - Mode 59 : 1024 x 768 @ 100 Hz, stereo = 0
22:21:39:504 (2736) > [Render] - Mode 60 : 1024 x 768 @ 100 Hz, stereo = 0
22:21:39:505 (2736) > [Render] - Mode 61 : 1024 x 768 @ 110 Hz, stereo = 0
22:21:39:505 (2736) > [Render] - Mode 62 : 1024 x 768 @ 110 Hz, stereo = 0
22:21:39:505 (2736) > [Render] - Mode 63 : 1024 x 768 @ 120 Hz, stereo = 0
22:21:39:505 (2736) > [Render] - Mode 64 : 1024 x 768 @ 120 Hz, stereo = 0
22:21:39:505 (2736) > [Render] - Mode 65 : 1024 x 768 @ 144 Hz, stereo = 0
22:21:39:505 (2736) > [Render] - Mode 66 : 1024 x 768 @ 144 Hz, stereo = 0
22:21:39:505 (2736) > [Render] - Mode 67 : 1152 x 864 @ 60 Hz, stereo = 0
22:21:39:505 (2736) > [Render] - Mode 68 : 1152 x 864 @ 60 Hz, stereo = 0
22:21:39:506 (2736) > [Render] - Mode 69 : 1152 x 864 @ 75 Hz, stereo = 0
22:21:39:506 (2736) > [Render] - Mode 70 : 1152 x 864 @ 100 Hz, stereo = 0
22:21:39:506 (2736) > [Render] - Mode 71 : 1152 x 864 @ 100 Hz, stereo = 0
22:21:39:507 (2736) > [Render] - Mode 72 : 1152 x 864 @ 110 Hz, stereo = 0
22:21:39:507 (2736) > [Render] - Mode 73 : 1152 x 864 @ 110 Hz, stereo = 0
22:21:39:507 (2736) > [Render] - Mode 74 : 1152 x 864 @ 120 Hz, stereo = 0
22:21:39:507 (2736) > [Render] - Mode 75 : 1152 x 864 @ 120 Hz, stereo = 0
22:21:39:507 (2736) > [Render] - Mode 76 : 1152 x 864 @ 144 Hz, stereo = 0
22:21:39:508 (2736) > [Render] - Mode 77 : 1152 x 864 @ 144 Hz, stereo = 0
22:21:39:508 (2736) > [Render] - Mode 78 : 1280 x 720 @ 50 Hz, stereo = 0
22:21:39:508 (2736) > [Render] - Mode 79 : 1280 x 720 @ 100 Hz, stereo = 0
22:21:39:508 (2736) > [Render] - Mode 80 : 1280 x 720 @ 100 Hz, stereo = 0
22:21:39:508 (2736) > [Render] - Mode 81 : 1280 x 720 @ 110 Hz, stereo = 0
22:21:39:508 (2736) > [Render] - Mode 82 : 1280 x 720 @ 110 Hz, stereo = 0
22:21:39:508 (2736) > [Render] - Mode 83 : 1280 x 720 @ 120 Hz, stereo = 0
22:21:39:508 (2736) > [Render] - Mode 84 : 1280 x 720 @ 120 Hz, stereo = 0
22:21:39:508 (2736) > [Render] - Mode 85 : 1280 x 720 @ 144 Hz, stereo = 0
22:21:39:508 (2736) > [Render] - Mode 86 : 1280 x 720 @ 144 Hz, stereo = 0
22:21:39:509 (2736) > [Render] - Mode 87 : 1280 x 720 @ 60 Hz, stereo = 0
22:21:39:509 (2736) > [Render] - Mode 88 : 1280 x 768 @ 60 Hz, stereo = 0
22:21:39:509 (2736) > [Render] - Mode 89 : 1280 x 768 @ 60 Hz, stereo = 0
22:21:39:509 (2736) > [Render] - Mode 90 : 1280 x 768 @ 100 Hz, stereo = 0
22:21:39:509 (2736) > [Render] - Mode 91 : 1280 x 768 @ 100 Hz, stereo = 0
22:21:39:509 (2736) > [Render] - Mode 92 : 1280 x 768 @ 110 Hz, stereo = 0
22:21:39:509 (2736) > [Render] - Mode 93 : 1280 x 768 @ 110 Hz, stereo = 0
22:21:39:509 (2736) > [Render] - Mode 94 : 1280 x 768 @ 120 Hz, stereo = 0
22:21:39:509 (2736) > [Render] - Mode 95 : 1280 x 768 @ 120 Hz, stereo = 0
22:21:39:510 (2736) > [Render] - Mode 96 : 1280 x 768 @ 144 Hz, stereo = 0
22:21:39:510 (2736) > [Render] - Mode 97 : 1280 x 768 @ 144 Hz, stereo = 0
22:21:39:510 (2736) > [Render] - Mode 98 : 1280 x 800 @ 60 Hz, stereo = 0
22:21:39:510 (2736) > [Render] - Mode 99 : 1280 x 800 @ 100 Hz, stereo = 0
22:21:39:510 (2736) > [Render] - Mode 100 : 1280 x 800 @ 100 Hz, stereo = 0
22:21:39:510 (2736) > [Render] - Mode 101 : 1280 x 800 @ 110 Hz, stereo = 0
22:21:39:510 (2736) > [Render] - Mode 102 : 1280 x 800 @ 110 Hz, stereo = 0
22:21:39:510 (2736) > [Render] - Mode 103 : 1280 x 800 @ 120 Hz, stereo = 0
22:21:39:510 (2736) > [Render] - Mode 104 : 1280 x 800 @ 120 Hz, stereo = 0
22:21:39:511 (2736) > [Render] - Mode 105 : 1280 x 800 @ 144 Hz, stereo = 0
22:21:39:511 (2736) > [Render] - Mode 106 : 1280 x 800 @ 144 Hz, stereo = 0
22:21:39:511 (2736) > [Render] - Mode 107 : 1280 x 960 @ 60 Hz, stereo = 0
22:21:39:511 (2736) > [Render] - Mode 108 : 1280 x 960 @ 100 Hz, stereo = 0
22:21:39:511 (2736) > [Render] - Mode 109 : 1280 x 960 @ 100 Hz, stereo = 0
22:21:39:511 (2736) > [Render] - Mode 110 : 1280 x 960 @ 110 Hz, stereo = 0
22:21:39:511 (2736) > [Render] - Mode 111 : 1280 x 960 @ 110 Hz, stereo = 0
22:21:39:511 (2736) > [Render] - Mode 112 : 1280 x 960 @ 120 Hz, stereo = 0
22:21:39:511 (2736) > [Render] - Mode 113 : 1280 x 960 @ 120 Hz, stereo = 0
22:21:39:512 (2736) > [Render] - Mode 114 : 1280 x 960 @ 144 Hz, stereo = 0
22:21:39:512 (2736) > [Render] - Mode 115 : 1280 x 960 @ 144 Hz, stereo = 0
22:21:39:512 (2736) > [Render] - Mode 116 : 1280 x 1024 @ 60 Hz, stereo = 0
22:21:39:512 (2736) > [Render] - Mode 117 : 1280 x 1024 @ 75 Hz, stereo = 0
22:21:39:512 (2736) > [Render] - Mode 118 : 1280 x 1024 @ 100 Hz, stereo = 0
22:21:39:512 (2736) > [Render] - Mode 119 : 1280 x 1024 @ 100 Hz, stereo = 0
22:21:39:512 (2736) > [Render] - Mode 120 : 1280 x 1024 @ 110 Hz, stereo = 0
22:21:39:512 (2736) > [Render] - Mode 121 : 1280 x 1024 @ 110 Hz, stereo = 0
22:21:39:512 (2736) > [Render] - Mode 122 : 1280 x 1024 @ 120 Hz, stereo = 0
22:21:39:512 (2736) > [Render] - Mode 123 : 1280 x 1024 @ 120 Hz, stereo = 0
22:21:39:512 (2736) > [Render] - Mode 124 : 1280 x 1024 @ 144 Hz, stereo = 0
22:21:39:512 (2736) > [Render] - Mode 125 : 1280 x 1024 @ 144 Hz, stereo = 0
22:21:39:512 (2736) > [Render] - Mode 126 : 1360 x 768 @ 60 Hz, stereo = 0
22:21:39:512 (2736) > [Render] - Mode 127 : 1360 x 768 @ 60 Hz, stereo = 0
22:21:39:513 (2736) > [Render] - Mode 128 : 1360 x 768 @ 100 Hz, stereo = 0
22:21:39:513 (2736) > [Render] - Mode 129 : 1360 x 768 @ 100 Hz, stereo = 0
22:21:39:513 (2736) > [Render] - Mode 130 : 1360 x 768 @ 110 Hz, stereo = 0
22:21:39:513 (2736) > [Render] - Mode 131 : 1360 x 768 @ 110 Hz, stereo = 0
22:21:39:513 (2736) > [Render] - Mode 132 : 1360 x 768 @ 120 Hz, stereo = 0
22:21:39:513 (2736) > [Render] - Mode 133 : 1360 x 768 @ 120 Hz, stereo = 0
22:21:39:513 (2736) > [Render] - Mode 134 : 1360 x 768 @ 144 Hz, stereo = 0
22:21:39:513 (2736) > [Render] - Mode 135 : 1360 x 768 @ 144 Hz, stereo = 0
22:21:39:514 (2736) > [Render] - Mode 136 : 1360 x 1024 @ 60 Hz, stereo = 0
22:21:39:514 (2736) > [Render] - Mode 137 : 1360 x 1024 @ 60 Hz, stereo = 0
22:21:39:514 (2736) > [Render] - Mode 138 : 1360 x 1024 @ 100 Hz, stereo = 0
22:21:39:514 (2736) > [Render] - Mode 139 : 1360 x 1024 @ 100 Hz, stereo = 0
22:21:39:514 (2736) > [Render] - Mode 140 : 1360 x 1024 @ 110 Hz, stereo = 0
22:21:39:514 (2736) > [Render] - Mode 141 : 1360 x 1024 @ 110 Hz, stereo = 0
22:21:39:514 (2736) > [Render] - Mode 142 : 1360 x 1024 @ 120 Hz, stereo = 0
22:21:39:514 (2736) > [Render] - Mode 143 : 1360 x 1024 @ 120 Hz, stereo = 0
22:21:39:515 (2736) > [Render] - Mode 144 : 1360 x 1024 @ 144 Hz, stereo = 0
22:21:39:515 (2736) > [Render] - Mode 145 : 1360 x 1024 @ 144 Hz, stereo = 0
22:21:39:515 (2736) > [Render] - Mode 146 : 1366 x 768 @ 60 Hz, stereo = 0
22:21:39:515 (2736) > [Render] - Mode 147 : 1366 x 768 @ 60 Hz, stereo = 0
22:21:39:515 (2736) > [Render] - Mode 148 : 1366 x 768 @ 100 Hz, stereo = 0
22:21:39:515 (2736) > [Render] - Mode 149 : 1366 x 768 @ 100 Hz, stereo = 0
22:21:39:515 (2736) > [Render] - Mode 150 : 1366 x 768 @ 110 Hz, stereo = 0
22:21:39:515 (2736) > [Render] - Mode 151 : 1366 x 768 @ 110 Hz, stereo = 0
22:21:39:515 (2736) > [Render] - Mode 152 : 1366 x 768 @ 120 Hz, stereo = 0
22:21:39:515 (2736) > [Render] - Mode 153 : 1366 x 768 @ 120 Hz, stereo = 0
22:21:39:516 (2736) > [Render] - Mode 154 : 1366 x 768 @ 144 Hz, stereo = 0
22:21:39:516 (2736) > [Render] - Mode 155 : 1366 x 768 @ 144 Hz, stereo = 0
22:21:39:516 (2736) > [Render] - Mode 156 : 1440 x 900 @ 60 Hz, stereo = 0
22:21:39:516 (2736) > [Render] - Mode 157 : 1440 x 900 @ 100 Hz, stereo = 0
22:21:39:516 (2736) > [Render] - Mode 158 : 1440 x 900 @ 100 Hz, stereo = 0
22:21:39:516 (2736) > [Render] - Mode 159 : 1440 x 900 @ 110 Hz, stereo = 0
22:21:39:516 (2736) > [Render] - Mode 160 : 1440 x 900 @ 110 Hz, stereo = 0
22:21:39:517 (2736) > [Render] - Mode 161 : 1440 x 900 @ 120 Hz, stereo = 0
22:21:39:517 (2736) > [Render] - Mode 162 : 1440 x 900 @ 120 Hz, stereo = 0
22:21:39:517 (2736) > [Render] - Mode 163 : 1440 x 900 @ 144 Hz, stereo = 0
22:21:39:517 (2736) > [Render] - Mode 164 : 1440 x 900 @ 144 Hz, stereo = 0
22:21:39:517 (2736) > [Render] - Mode 165 : 1600 x 900 @ 60 Hz, stereo = 0
22:21:39:517 (2736) > [Render] - Mode 166 : 1600 x 900 @ 60 Hz, stereo = 0
22:21:39:517 (2736) > [Render] - Mode 167 : 1600 x 900 @ 100 Hz, stereo = 0
22:21:39:517 (2736) > [Render] - Mode 168 : 1600 x 900 @ 100 Hz, stereo = 0
22:21:39:517 (2736) > [Render] - Mode 169 : 1600 x 900 @ 110 Hz, stereo = 0
22:21:39:517 (2736) > [Render] - Mode 170 : 1600 x 900 @ 110 Hz, stereo = 0
22:21:39:518 (2736) > [Render] - Mode 171 : 1600 x 900 @ 120 Hz, stereo = 0
22:21:39:518 (2736) > [Render] - Mode 172 : 1600 x 900 @ 120 Hz, stereo = 0
22:21:39:518 (2736) > [Render] - Mode 173 : 1600 x 900 @ 144 Hz, stereo = 0
22:21:39:518 (2736) > [Render] - Mode 174 : 1600 x 900 @ 144 Hz, stereo = 0
22:21:39:518 (2736) > [Render] - Mode 175 : 1680 x 1050 @ 60 Hz, stereo = 0
22:21:39:518 (2736) > [Render] - Mode 176 : 1680 x 1050 @ 100 Hz, stereo = 0
22:21:39:518 (2736) > [Render] - Mode 177 : 1680 x 1050 @ 100 Hz, stereo = 0
22:21:39:518 (2736) > [Render] - Mode 178 : 1680 x 1050 @ 110 Hz, stereo = 0
22:21:39:518 (2736) > [Render] - Mode 179 : 1680 x 1050 @ 110 Hz, stereo = 0
22:21:39:519 (2736) > [Render] - Mode 180 : 1680 x 1050 @ 120 Hz, stereo = 0
22:21:39:519 (2736) > [Render] - Mode 181 : 1680 x 1050 @ 120 Hz, stereo = 0
22:21:39:520 (2736) > [Render] - Mode 182 : 1680 x 1050 @ 144 Hz, stereo = 0
22:21:39:520 (2736) > [Render] - Mode 183 : 1680 x 1050 @ 144 Hz, stereo = 0
22:21:39:520 (2736) > [Render] - Mode 184 : 1920 x 1080 @ 50 Hz, stereo = 0
22:21:39:520 (2736) > [Render] - Mode 185 : 1920 x 1080 @ 100 Hz, stereo = 0
22:21:39:520 (2736) > [Render] - Mode 186 : 1920 x 1080 @ 110 Hz, stereo = 0
22:21:39:520 (2736) > [Render] - Mode 187 : 1920 x 1080 @ 120 Hz, stereo = 0
22:21:39:520 (2736) > [Render] - Mode 188 : 1920 x 1080 @ 144 Hz, stereo = 0
22:21:39:520 (2736) > [Render] - Mode 189 : 1920 x 1080 @ 60 Hz, stereo = 0
22:21:39:520 (2736) >
22:23:45:622 (2736) > [audio] TrueAudio unavailable
22:23:45:622 (2736) > [audio] Convolution Reverb Option = SOFTWARE
22:23:45:660 (4328) > [Render] AMD HD3D Extension created.
22:23:45:660 (4328) > [Render] AMD HD3D Extension Version: 8.1

22:23:45:660 (4328) > [Render] AMD HD3D failed to initialize. (Most likely cause: there is no 3D support)
22:23:45:660 (4328) > [Render] AMD HD3D: Released Quad Buffer Stereo Extension.
22:23:45:862 (2736) > [Render] Initializing NxRenderer
22:23:45:863 (2736) > [Render] NxApp successfully created
22:23:47:733 (2736) > [Render] Applying display settings
22:23:47:744 (4328) > [Settings] New display settings are:
22:23:47:744 (4328) > [Settings] Using Mantle:0
22:23:47:744 (4328) > [Settings] Window mode:
22:23:47:744 (4328) > [Settings] Fullscreen:1
22:23:47:744 (4328) > [Settings] Excl.Fullscreen:1
22:23:47:744 (4328) > [Settings] Monitor:0
22:23:47:744 (4328) > [Settings] Monitor Rect: x:0 y:0 w:1920 h:1080
22:23:47:744 (4328) > [Settings] Workspace Rect: x:0 y:0 w:1920 h:1040
22:23:47:744 (4328) > [Settings] Aspect Ratio:1.779762 (Pixel Aspect Ratio:0.998885)
22:23:47:744 (4328) > [Settings] Fullscreen display mode:
22:23:47:745 (4328) > [Settings] Display mode: [email protected]
22:23:47:745 (4328) > [Settings] VSync:0, TripleBuffer:0
22:23:47:745 (4328) > [Settings] Stereoscopic3D:0
22:23:47:745 (4328) > [Settings] Window settings:
22:23:47:745 (4328) > [Settings] Window rect: x:240 y:135 w:1440 h:810
22:23:47:745 (4328) > [Settings] User window: w:1440 h:810
22:23:47:745 (4328) > [Settings] Window maximized:0
22:23:47:745 (4328) > [Settings] Graphics Options:
22:23:47:745 (4328) > [Settings] TextureQuality:4294967294, ShadowQuality:3, TextureFiltering:4
22:23:47:746 (4328) > [Settings] SSR:1, DOFQuality:1, POM:1, SSAA:1, FXAA:1, CHS:1, Tess:1
22:23:47:746 (4328) > [Settings] FOVCorrection:10.000000
22:23:47:746 (4328) > [Settings] StereoSeparation:0.400000
22:23:47:746 (4328) > [Settings] StereoPlaneDepth:0.100000
22:23:47:746 (4328) > [Settings] End of settings.
22:23:47:746 (2736) > [Render] Finished applying display settings
22:24:44:514 (2736) > [audio] Convolution Reverb Option = SOFTWARE



Using 14.2 Drivers and CFX enabled.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *BradleyW*
> 
> Has anyone got Thief and CFX? If so, how to I fix the following error?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 22:21:39:431 (2736) > THIEF (64-bit)
> 22:21:39:433 (2736) > ==============
> 22:21:39:433 (2736) >
> 22:21:39:433 (2736) >
> 22:21:39:434 (2736) > Memory statistics:
> 22:21:39:434 (2736) > Total RAM = 15.9 GB (16303 MB)
> 22:21:39:434 (2736) > Avail RAM = 14.1 GB (14396 MB)
> 22:21:39:435 (2736) > Total virtual memory = 131072.0 GB (134217727 MB)
> 22:21:39:435 (2736) > Avail virtual memory = 131071.9 GB (134217572 MB)
> 22:21:39:435 (2736) > OS = Windows 8 (build 9200)
> 22:21:39:436 (2736) > [Render] NxApp PreInit
> 22:21:39:437 (2736) > [Render] Enumerating adapters:
> 22:21:39:438 (2736) > [Render] Adapter 0:
> 22:21:39:438 (2736) > Description : AMD Radeon R9 200 Series
> 22:21:39:438 (2736) > Vendor : AMD (1002)
> 22:21:39:438 (2736) > Device : 67b0
> 22:21:39:438 (2736) > Sub system : b001002
> 22:21:39:439 (2736) > Revision : 0
> 22:21:39:439 (2736) > Dedicated Video Memory : 8109 MB
> 22:21:39:439 (2736) > Dedicated System Memory : 0000 MB
> 22:21:39:439 (2736) > Shared System Memory : 7936 MB
> 22:21:39:439 (2736) >
> 22:21:39:439 (2736) > [Render] Loading ADL...
> *22:21:39:455 (2736) > [Render] CROSSFIRE GPU count query failed, no multi-gpu optimizations.
> 22:21:39:456 (2736) > [Render] Done loading ADL and AGS
> 22:21:39:456 (2736) > [Render] Adapter 1:
> 22:21:39:456 (2736) > Description : Microsoft Basic Render Driver
> 22:21:39:456 (2736) > Vendor : Unknown (1414)
> 22:21:39:456 (2736) > Device : 8c
> 22:21:39:456 (2736) > Sub system : 0
> 22:21:39:456 (2736) > Revision : 0
> 22:21:39:456 (2736) > Dedicated Video Memory : 0000 MB
> 22:21:39:456 (2736) > Dedicated System Memory : 0000 MB
> 22:21:39:456 (2736) > Shared System Memory : 0256 MB*
> 22:21:39:457 (2736) >
> 22:21:39:457 (2736) > [Render] Enumerating monitors:
> 22:21:39:457 (2736) > [Render] Monitor 0: handle = 0000000000010001, aspect = 1.78
> 22:21:39:457 (2736) >
> 22:21:39:491 (2736) > [Render] AMD HD3D Extension created.
> 22:21:39:491 (2736) > [Render] AMD HD3D Extension Version: 8.1
> 
> 22:21:39:491 (2736) > [Render] AMD HD3D failed to initialize. (Most likely cause: there is no 3D support)
> 22:21:39:491 (2736) > [Render] AMD HD3D: Released Quad Buffer Stereo Extension.
> 22:21:39:491 (2736) > [Render] Enumerating Display Modes:
> 22:21:39:497 (2736) > [Render] Querying ADL for Eyefinity modes
> 22:21:39:498 (2736) > [Render] ADL_Display_DisplayMapConfig_Get() failed
> 22:21:39:498 (2736) > [Render] Monitor 'Display 1 (AMD Radeon R9 200 Series(1))' (0000000000010001):
> 22:21:39:498 (2736) > [Render] - Mode 0 : 640 x 480 @ 67 Hz, stereo = 0
> 22:21:39:498 (2736) > [Render] - Mode 1 : 640 x 480 @ 75 Hz, stereo = 0
> 22:21:39:498 (2736) > [Render] - Mode 2 : 640 x 480 @ 100 Hz, stereo = 0
> 22:21:39:498 (2736) > [Render] - Mode 3 : 640 x 480 @ 100 Hz, stereo = 0
> 22:21:39:498 (2736) > [Render] - Mode 4 : 640 x 480 @ 110 Hz, stereo = 0
> 22:21:39:498 (2736) > [Render] - Mode 5 : 640 x 480 @ 110 Hz, stereo = 0
> 22:21:39:499 (2736) > [Render] - Mode 6 : 640 x 480 @ 120 Hz, stereo = 0
> 22:21:39:499 (2736) > [Render] - Mode 7 : 640 x 480 @ 120 Hz, stereo = 0
> 22:21:39:499 (2736) > [Render] - Mode 8 : 640 x 480 @ 144 Hz, stereo = 0
> 22:21:39:499 (2736) > [Render] - Mode 9 : 640 x 480 @ 144 Hz, stereo = 0
> 22:21:39:499 (2736) > [Render] - Mode 10 : 640 x 480 @ 60 Hz, stereo = 0
> 22:21:39:499 (2736) > [Render] - Mode 11 : 720 x 480 @ 56 Hz, stereo = 0
> 22:21:39:499 (2736) > [Render] - Mode 12 : 720 x 480 @ 56 Hz, stereo = 0
> 22:21:39:499 (2736) > [Render] - Mode 13 : 720 x 480 @ 60 Hz, stereo = 0
> 22:21:39:499 (2736) > [Render] - Mode 14 : 720 x 480 @ 72 Hz, stereo = 0
> 22:21:39:499 (2736) > [Render] - Mode 15 : 720 x 480 @ 72 Hz, stereo = 0
> 22:21:39:500 (2736) > [Render] - Mode 16 : 720 x 480 @ 75 Hz, stereo = 0
> 22:21:39:500 (2736) > [Render] - Mode 17 : 720 x 480 @ 75 Hz, stereo = 0
> 22:21:39:500 (2736) > [Render] - Mode 18 : 720 x 480 @ 100 Hz, stereo = 0
> 22:21:39:500 (2736) > [Render] - Mode 19 : 720 x 480 @ 100 Hz, stereo = 0
> 22:21:39:500 (2736) > [Render] - Mode 20 : 720 x 480 @ 110 Hz, stereo = 0
> 22:21:39:500 (2736) > [Render] - Mode 21 : 720 x 480 @ 110 Hz, stereo = 0
> 22:21:39:500 (2736) > [Render] - Mode 22 : 720 x 480 @ 120 Hz, stereo = 0
> 22:21:39:500 (2736) > [Render] - Mode 23 : 720 x 480 @ 120 Hz, stereo = 0
> 22:21:39:500 (2736) > [Render] - Mode 24 : 720 x 480 @ 144 Hz, stereo = 0
> 22:21:39:501 (2736) > [Render] - Mode 25 : 720 x 480 @ 144 Hz, stereo = 0
> 22:21:39:501 (2736) > [Render] - Mode 26 : 720 x 480 @ 60 Hz, stereo = 0
> 22:21:39:501 (2736) > [Render] - Mode 27 : 720 x 576 @ 50 Hz, stereo = 0
> 22:21:39:501 (2736) > [Render] - Mode 28 : 720 x 576 @ 56 Hz, stereo = 0
> 22:21:39:501 (2736) > [Render] - Mode 29 : 720 x 576 @ 56 Hz, stereo = 0
> 22:21:39:501 (2736) > [Render] - Mode 30 : 720 x 576 @ 60 Hz, stereo = 0
> 22:21:39:501 (2736) > [Render] - Mode 31 : 720 x 576 @ 60 Hz, stereo = 0
> 22:21:39:501 (2736) > [Render] - Mode 32 : 720 x 576 @ 72 Hz, stereo = 0
> 22:21:39:501 (2736) > [Render] - Mode 33 : 720 x 576 @ 72 Hz, stereo = 0
> 22:21:39:502 (2736) > [Render] - Mode 34 : 720 x 576 @ 75 Hz, stereo = 0
> 22:21:39:502 (2736) > [Render] - Mode 35 : 720 x 576 @ 75 Hz, stereo = 0
> 22:21:39:502 (2736) > [Render] - Mode 36 : 720 x 576 @ 100 Hz, stereo = 0
> 22:21:39:502 (2736) > [Render] - Mode 37 : 720 x 576 @ 100 Hz, stereo = 0
> 22:21:39:502 (2736) > [Render] - Mode 38 : 720 x 576 @ 110 Hz, stereo = 0
> 22:21:39:502 (2736) > [Render] - Mode 39 : 720 x 576 @ 110 Hz, stereo = 0
> 22:21:39:502 (2736) > [Render] - Mode 40 : 720 x 576 @ 120 Hz, stereo = 0
> 22:21:39:502 (2736) > [Render] - Mode 41 : 720 x 576 @ 120 Hz, stereo = 0
> 22:21:39:502 (2736) > [Render] - Mode 42 : 720 x 576 @ 144 Hz, stereo = 0
> 22:21:39:503 (2736) > [Render] - Mode 43 : 720 x 576 @ 144 Hz, stereo = 0
> 22:21:39:503 (2736) > [Render] - Mode 44 : 800 x 600 @ 56 Hz, stereo = 0
> 22:21:39:503 (2736) > [Render] - Mode 45 : 800 x 600 @ 60 Hz, stereo = 0
> 22:21:39:503 (2736) > [Render] - Mode 46 : 800 x 600 @ 72 Hz, stereo = 0
> 22:21:39:503 (2736) > [Render] - Mode 47 : 800 x 600 @ 75 Hz, stereo = 0
> 22:21:39:503 (2736) > [Render] - Mode 48 : 800 x 600 @ 100 Hz, stereo = 0
> 22:21:39:503 (2736) > [Render] - Mode 49 : 800 x 600 @ 100 Hz, stereo = 0
> 22:21:39:503 (2736) > [Render] - Mode 50 : 800 x 600 @ 110 Hz, stereo = 0
> 22:21:39:503 (2736) > [Render] - Mode 51 : 800 x 600 @ 110 Hz, stereo = 0
> 22:21:39:504 (2736) > [Render] - Mode 52 : 800 x 600 @ 120 Hz, stereo = 0
> 22:21:39:504 (2736) > [Render] - Mode 53 : 800 x 600 @ 120 Hz, stereo = 0
> 22:21:39:504 (2736) > [Render] - Mode 54 : 800 x 600 @ 144 Hz, stereo = 0
> 22:21:39:504 (2736) > [Render] - Mode 55 : 800 x 600 @ 144 Hz, stereo = 0
> 22:21:39:504 (2736) > [Render] - Mode 56 : 1024 x 768 @ 60 Hz, stereo = 0
> 22:21:39:504 (2736) > [Render] - Mode 57 : 1024 x 768 @ 70 Hz, stereo = 0
> 22:21:39:504 (2736) > [Render] - Mode 58 : 1024 x 768 @ 75 Hz, stereo = 0
> 22:21:39:504 (2736) > [Render] - Mode 59 : 1024 x 768 @ 100 Hz, stereo = 0
> 22:21:39:504 (2736) > [Render] - Mode 60 : 1024 x 768 @ 100 Hz, stereo = 0
> 22:21:39:505 (2736) > [Render] - Mode 61 : 1024 x 768 @ 110 Hz, stereo = 0
> 22:21:39:505 (2736) > [Render] - Mode 62 : 1024 x 768 @ 110 Hz, stereo = 0
> 22:21:39:505 (2736) > [Render] - Mode 63 : 1024 x 768 @ 120 Hz, stereo = 0
> 22:21:39:505 (2736) > [Render] - Mode 64 : 1024 x 768 @ 120 Hz, stereo = 0
> 22:21:39:505 (2736) > [Render] - Mode 65 : 1024 x 768 @ 144 Hz, stereo = 0
> 22:21:39:505 (2736) > [Render] - Mode 66 : 1024 x 768 @ 144 Hz, stereo = 0
> 22:21:39:505 (2736) > [Render] - Mode 67 : 1152 x 864 @ 60 Hz, stereo = 0
> 22:21:39:505 (2736) > [Render] - Mode 68 : 1152 x 864 @ 60 Hz, stereo = 0
> 22:21:39:506 (2736) > [Render] - Mode 69 : 1152 x 864 @ 75 Hz, stereo = 0
> 22:21:39:506 (2736) > [Render] - Mode 70 : 1152 x 864 @ 100 Hz, stereo = 0
> 22:21:39:506 (2736) > [Render] - Mode 71 : 1152 x 864 @ 100 Hz, stereo = 0
> *(remaining log lines snipped: identical to the log output posted above)*
> 
> 
> 
> Using 14.2 Drivers and CFX enabled.


Set up your own profile for now; try either AFR or 1x1.


----------



## Roboyto

Quote:


> Originally Posted by *Roy360*
> 
> damn, so now I have to choose. Stick with the R9 290 for 610$ after tax, and buy a shim, or buy a watercooled VisionTek R9 290 for 600+tax+shipping....
> will the EK full blocks fit the ASUS R9 290? Did ASUS make any other "improvements" to the reference design?


If it's the reference-PCB ASUS card it will use the reference block; the DirectCU II (DC2) version needs EK's dedicated DC2 block. EK announced the DC2 full-cover block back in December, but it isn't on sale yet.

Double check here for block compatibility: http://www.coolingconfigurator.com/

*Brian18741* contacted EK a week or so ago and got some information on when the blocks will be released; I don't remember exactly what he said, but it's supposed to be soon. The block will fit both the 290 and 290X DC2 cards. Send him a PM and I'm sure he can give you more details.

Most importantly, they upgraded the power regulation on the card: more phases, 6+2+2 on the DC2 versus 5+2 on the reference design, meaning 6 phases for the GPU core and 2+2 for the memory, using ASUS's DIGI+ VRM and Super Alloy Power components. That doesn't guarantee better overclocking, but it makes it more likely: the card gets cleaner power, runs more efficiently, draws less, operates cooler, and will likely last longer. Any of the professional reviews can give you more detail. Considering the price they are *normally* offered at, ignoring cryptocurrency markups, the upgrades you get over the reference design are solid.
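The per-phase load difference is easy to ballpark. A minimal sketch, assuming a 250 W core draw at 1.2 V (illustrative numbers I picked for the example, not measured figures for these cards):

```python
# Rough per-phase current comparison for the core VRM.
# 250 W and 1.2 V are assumed, illustrative values.
GPU_POWER_W = 250.0   # assumed GPU core power draw
VCORE = 1.2           # assumed core voltage

def amps_per_phase(power_w, volts, phases):
    """Total current split evenly across the VRM phases."""
    return power_w / volts / phases

ref = amps_per_phase(GPU_POWER_W, VCORE, 5)   # reference board: 5 core phases
dc2 = amps_per_phase(GPU_POWER_W, VCORE, 6)   # DC2 board: 6 core phases

print(f"reference: {ref:.1f} A/phase, DC2: {dc2:.1f} A/phase")
```

Each phase on the 6-phase layout carries roughly 17% less current, which is where the cooler, more efficient operation comes from.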

EK is usually the only company that makes blocks for these cards, which is somewhat surprising IMO given their popularity and reputation. I had an HD7870 DC2 under an EK-FC block and that card was a solid overclocker, roughly 1250/1600. With the overclock it was outperforming my buddy's MSI TF 670.


----------



## sugarhell

DIGI+ VRM is low-end rebranded stuff. They rebrand low-end IR parts and then say "look at our superior custom card."


----------



## BradleyW

Quote:


> Originally Posted by *MrWhiteRX7*


Still got the same issue.

Have you got this game?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *BradleyW*
> 
> Still got the same issue.
> 
> Have you got this game?


I do, but honestly I just let it run on one GPU with no issues. I was only suggesting it as something to try, since forcing CrossFire usually helps in most games. You do have the box checked in CCC under the CrossFire settings that says something like "enable CrossFire for applications that have no associated profile", right?


----------



## BradleyW

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I do, but honestly I just let it run on one GPU with no issues. I was only suggesting it as something to try, since forcing CrossFire usually helps in most games. You do have the box checked in CCC under the CrossFire settings that says something like "enable CrossFire for applications that have no associated profile", right?


Yes I do, but Thief's CFX profile is hard-coded anyway.
What drivers are you on?
Can you enable CFX and pass me your log for comparison?
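When comparing logs, a quick filter for the CrossFire-related lines saves wading through the display-mode list. A rough sketch (the sample text is copied from the log above; the real log file's name and location depend on your install):

```python
import re

# Match the render lines that report CrossFire / multi-GPU state.
PATTERN = re.compile(r"crossfire|multi-gpu", re.IGNORECASE)

def crossfire_lines(log_text):
    """Return only the log lines that mention CrossFire or multi-GPU."""
    return [ln.strip() for ln in log_text.splitlines() if PATTERN.search(ln)]

sample = (
    "22:21:39:439 (2736) > [Render] Loading ADL...\n"
    "22:21:39:455 (2736) > [Render] CROSSFIRE GPU count query failed, "
    "no multi-gpu optimizations.\n"
    "22:21:39:456 (2736) > [Render] Done loading ADL and AGS"
)

for line in crossfire_lines(sample):
    print(line)
```

Running each person's log through this and comparing the output makes it obvious whether the game saw the second GPU.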


----------



## MrWhiteRX7

I'm on 14.2.

Game runs really well (minus a couple random glitches) @ 1440p


----------



## Roy360

Quote:


> Originally Posted by *Roboyto*
> 
> The reference ASUS GPU will use the reference block, otherwise the DC2 must use the EK DC2 block. I know EK announced the DC2 full cover block all the way back in December, but they aren't out for sale yet.
> 
> Double check here for block compatibility: http://www.coolingconfigurator.com/
> 
> *Brian18741* contacted EK maybe a week or so ago and got some information on when they will be released; I don't remember exactly what he said but it is supposed to be soon. The block will fit both the 290 and 290X DC2 cards. Send him a PM and I'm sure he can give you more details.
> 
> Most importantly, they upgraded the power regulation on the card: more phases, 6+2+2 on the DC2 compared to 5+2 on the reference cards; 6 phases for the GPU and 2+2 for the RAM, using their renowned DIGI+VRM and Super Alloy Power. This doesn't guarantee better overclocking, but it sure makes it more likely. The card will get cleaner power, run more efficiently, consume less power, run cooler, and likely last longer. Any of the professional reviews can give you more information. Considering the price they are _*normally*_ offered at, not counting cryptocurrency markups, the upgrades you get over the reference design are solid.


The EK block was supposed to come out in February, so I'll wait before returning it. But until it comes out, I'm going to mine the crap out of the card, hoping it fails within the first 30 days. No overclocks or cramped cases, but I won't be applying any shims yet.

Will adding a shim void my warranty?

P.S. The heatpipes explain the core temp, but what about the VRMs? Why are they so hot? (I'm looking at the pic posted a few pages back)


----------



## LazarusIV

Looks like I had a dud from HIS... Card shat out today. It would randomly not wake up when the monitor went to sleep, even though I had all the power saving stuff turned off, and it would require a hard reboot. Today, when we got home from getting our taxes done, it had done the same thing. So my wife turns it off, and when she hits the power button, nothing shows up on the screen. I got in there, poked about a bit, and tried a different PCIE slot, but the card was a no-go. I swapped in an old GTX 280 I had lying around and it started right up, no problem! Put the 290 back in and same issues: nothing showed up on the monitor. So now I have to RMA through HIS, which uses a different company to handle their RMAs. Now I wish I had bought an ASUS!

Oh well... Hopefully the RMA won't take too long. Even though the website says 2-4 weeks after they receive it!

#Fmylife
#Firstworldproblems


----------



## Jack Mac

Quote:


> Originally Posted by *LazarusIV*
> 
> Looks like I had a dud from HIS... Card shat out today. It would randomly not wake up when the monitor went to sleep even though I had all the power saving stuff turned off. It would require a hard reboot then today when we got home from getting our taxes done it had done the same thing. So my wife turns it off and when she hits the power button, nothing shows up on the screen. So I get in there and poke about a bit and I tried a different PCIE slot but the card was a no-go. I swapped in an old GTX 280 I had lying around and it started right up, no problem! Put the 290 back in and same issues, nothing showed up on the monitor. So now I have to RMA through HIS which uses a different company to handle their RMAs. Now I wish I had bought an Asus!
> 
> Oh well... Hopefully the RMA won't take too long. Even though the website says 2-4 weeks after they receive it!
> 
> #Fmylife
> #Firstworldproblems


At least you have a GTX 280 to keep you happy while you wait; I had to run integrated for over a week when I sold my 290.


----------



## Hl86

So my Tri-X 290X fluctuates when I hit 1200 on the core. Using TRIXX: +50% power limit, +150 VDDC; core hits 70°C, VRMs 77-80°C in-game.
What can I do to make it stable?


----------



## LazarusIV

Quote:


> Originally Posted by *Jack Mac*
> 
> At least you have a GTX 280 to keep you happy while you wait, I had to run integrated for over a week when I sold my 290.


Yeah... could be worse I suppose.

So should I uninstall the AMD drivers and put in nVidia drivers for my GTX 280 or can I just throw it in there and run it? I imagine I have to install those nVidia drivers, huh?


----------



## Jack Mac

Quote:


> Originally Posted by *LazarusIV*
> 
> Yeah... could be worse I suppose.
> 
> So should I uninstall the AMD drivers and put in nVidia drivers for my GTX 280 or can I just throw it in there and run it? I imagine I have to install those nVidia drivers, huh?


Yeah, wipe the AMD drivers and install the NVIDIA ones. I couldn't game on integrated until I installed drivers, so I'd imagine it'd be the same on a 280.


----------



## BradleyW

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I'm on 14.2.
> 
> Game runs really well (minus a couple random glitches) @ 1440p


Can you enable cfx and pass me your log for comparison?
Thanks.


----------



## Roboyto

Quote:


> Originally Posted by *Roy360*
> 
> The EK block was supposed to come out in February, so I'll wait before returning it. But until it comes out, I'm going to mine the crap out of the card, hoping it fails within the first 30 days. No overclocks or cramped cases, but I won't be applying any shims yet.
> 
> Will adding a shim void my warranty?
> 
> P.S The heatpipes explain the core temp, but what about the vrm? Why are they so hot? (I'm looking at the pic posted a few pages behind)


I think those VisionTek cards only carry a 1-year warranty? I would say hold out for the DC2 block; it will undoubtedly bring the temps within reason, and you get to take advantage of the ASUS 3-year warranty.

I sold my 7870 to a friend, of a friend, of a friend, lol. He managed to blow it up somehow... it was running "stock clocks" when this happened, which I find hard to believe. I RMA'd it for the guy, and it was a smooth transaction, especially since I called in the RMA. It only took like 9-10 days total, including transit time there and back; the RMA center is in Indiana.

Taking off the stock heatsink doesn't void your warranty with ASUS cards. As long as you don't do any damage to the card, you'll be fine. Just make sure you use non-conductive TIM (in case you get some where you shouldn't) and that the shim isn't touching anything that could cause a short.

VRMs get very hot on these cards especially with OC and additional voltage. Even under water I was initially seeing high 70s on VRM1 with a healthy overclock. I got the temperatures down by upgrading thermal pads. I have a write up on that in my sig if you want to check it out.

The one thing the stock blower does halfway decently is cool VRM1, since all the cool air hits it first... this consequently sends very hot air over the core, which is likely a good portion of the reason the reference cooler does such a poor job on the core.


----------



## vortex240

Quote:


> Originally Posted by *Paul17041993*
> 
> you'll have to talk to futuremark support about that I think, either way, an exception shouldn't be thrown on another caught exception... that's a double fail...
> 
> Ive only ever seen something like that once, which was minecraft crashing on the crash screen, an exception thrown from something bad in the crashlog from the actual crash...
> 
> 
> 
> 
> 
> 
> 
> 
> soldering a mm copper plate should gain ~10% cooling performance, more or less, would have to be a good solder job though.


Why solder? A simple copper shim, 0.1mm thick with TIM on it, will do the job. I would say 10% is a minimum; I'd expect about 20% to be honest.


----------



## Roy360

Quote:


> Originally Posted by *Roboyto*
> 
> I think those VisionTek cards only carry a 1-year warranty? I would say hold out for the DC2 block, it will undoubtedly bring the temps within reason, and take advantage of the ASUS 3-year warranty.
> 
> I sold my 7870 to a friend, of a friend, of a friend, lol. He managed to blow it up somehow...it was running "stock clocks" when this happened; which I find hard to believe. I RMA'd it for the guy, and it was a smooth transaction, especially since I called in the RMA. Only took a like 9-10 days total including transit time there and back; RMA center is in Indiana.
> 
> Taking off stock heatsink doesn't void your warranty with ASUS cards. As long as you don't do any damage to the card, you would be fine. Just have to make sure you use non-conductive TIM(in case you get some where you shouldn't) and that the shim isn't touching anything that could cause a short.
> 
> VRMs get very hot on these cards especially with OC and additional voltage. Even under water I was initially seeing high 70s on VRM1 with a healthy overclock. I got the temperatures down by upgrading thermal pads. I have a write up on that in my sig if you want to check it out.
> .


Thanks, I'll look around some hardware stores to see if I can find a thin piece of copper to use as a shim.

You found a 12% decrease in temps from using better pads. I have some leftover EK pads from my two R9 290 full cover blocks; they keep my VRMs under 60 degrees when OC'ed to 1250 @ +25mV, but that's with TIM on both sides of the pad.

Would EK pads be an improvement over ASUS'? I'd buy the poly ones, but shipping is like $20 from FrozenCPU.


----------



## Paul17041993

Quote:


> Originally Posted by *vortex240*
> 
> Why solder, A simple copper shim, 0,1mm thick with TIM on, will do the job, I would say 10% is a minimum, I would expected about 20% to be honest.


TIM would be adequate; solder would be a little better (similar to how the TIM-based Intels need de-lidding but the solder-based AMDs don't).

0.1mm might be a bit thin though? That's about fin thickness and wouldn't be thick enough to transfer the heat effectively. 0.5mm should be enough; 1mm would give the best results, but you would likely need longer screws...


----------



## Roboyto

Quote:


> Originally Posted by *Roy360*
> 
> Thanks, I'll look around some hardware stores to see if I can find a thin piece of copper to use as a shim.
> 
> You found a 12% decrease in temps from using better pads. hmm I have some leftover EK pads from my two R9 290 full cover blocks, they keep my VRM under 60 degrees when oc'ed to 1250 @ +25mV but that's with tim on both sides of the pad.
> 
> Would EK pads be an improvement over ASUS'? I'd buy the poly ones, but shipping is like 20$ from FrozenCPU


No problem. If you have no luck at the hardware store, eBay is full of them







http://www.ebay.com/bhp/gpu-copper

Just have to get one that is thin enough for the cooler to still mount properly; I'm not sure how many extra threads the screws that mount the cooler have left. Maybe in the 1-1.5mm range, so it will make good contact and not bend?

*23% for VRM1*, which is the important one for the GPU core; 12% for VRM2, which is less important and feeds the memory.

It's hard to say if the EK pads would be an improvement over the ASUS ones. I know several people have upgraded from the EK pads and gotten temperature drops similar to what I saw with my XSPC. The EK pads have roughly 1/3 of the thermal conductance of the Fujipoly Ultra Extreme (someone posted this previously, can't remember who), so EK suggests using TIM to assist with heat transfer.

The only way to know if the EK pads will work better is to swap them out.
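The "roughly 1/3 of the thermal conductance" figure above maps directly onto the temperature rise across a pad via plain conduction (Fourier's law). A rough sketch with illustrative numbers (the 30 W load, pad size, and 17 W/mK conductivity are assumptions for demonstration, not measured values for either pad):

```python
def pad_delta_t(power_w, k_w_per_mk, area_mm2, thickness_mm):
    """Temperature drop across a thermal pad: dT = P * t / (k * A)."""
    area_m2 = area_mm2 * 1e-6
    thickness_m = thickness_mm * 1e-3
    return power_w * thickness_m / (k_w_per_mk * area_m2)

# Illustrative: 30 W through a 1 mm pad covering a 15 x 50 mm VRM strip.
high_k = pad_delta_t(30, 17.0, 15 * 50, 1.0)      # Fujipoly-class pad
low_k = pad_delta_t(30, 17.0 / 3, 15 * 50, 1.0)   # pad with 1/3 the conductance
print(round(high_k, 1), round(low_k, 1))  # 2.4 7.1
```

Tripling the conductance cuts the drop across the pad to a third; in practice, contact resistance and pad compression shift the absolute numbers, which is why swapping the pads is the only real test.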


----------



## KGBinUSA

Quote:


> Originally Posted by *Paul17041993*
> 
> TIM would be adequate, solder would be a little better (similar to how the TIM-base intel's need de-lidding but the solder-base AMD's don't),
> 
> .1mm might be a bit thin though? that's about fin thickness, wouldn't exactly be thick enough to transfer the heat effectively, .5mm though should be enough, 1mm would give the best results but you would likely need longer screws...


Longer screws because of 1 mm? I find that hard to believe.


----------



## vortex240

Quote:


> Originally Posted by *Paul17041993*
> 
> TIM would be adequate, solder would be a little better (similar to how the TIM-base intel's need de-lidding but the solder-base AMD's don't),
> 
> .1mm might be a bit thin though? that's about fin thickness, wouldn't exactly be thick enough to transfer the heat effectively, .5mm though should be enough, 1mm would give the best results but you would likely need longer screws...


Whoops, I meant 1mm. Soldering a surface area that large would make it near impossible to maintain square flatness in relation to the core. The DC2 cooler itself is enough to take care of the 780 Ti, so it's perfectly adequate for Hawaii. If I had a chance to pick up a DC2 for a lot less than other custom cards, I would.

If I had to choose a custom cooler, it would be the PCS+ followed by the Tri-X, although the new Vapor-X is just around the corner. In all honesty, as long as it's quieter than the reference cooler and cools better, that's all that is needed.

In the end it's still a chip lottery. A friend has a 290X that will do 1135 max on the core with +100mV; more voltage does not help. He switched to a full waterblock, and even though the temps dropped to 50°C, he is still stuck at 1135.


----------



## Paul17041993

Quote:


> Originally Posted by *KGBinUSA*
> 
> Longer screws because of 1 mm? I find that hard to believe.


It would depend on how much room the screws have. My 7970 DCII, for example, would have no screw space for such a mod; the screws sit all the way down on the PCB. My reference 290X, however, has about 1.5mm of space still left on the screws.


----------



## MapRef41N93W

Just bought a DCUII 290x to pair with my Tri-X for Xfire. Hoping I'm not going to have heat issues as my Tri-X already runs at like 50c idle and 80c gaming.


----------



## Roboyto

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Just bought a DCUII 290x to pair with my Tri-X for Xfire. Hoping I'm not going to have heat issues as my Tri-X already runs at like 50c idle and 80c gaming.


I don't want to be the bearer of bad news, but XFire on air hasn't been the greatest for folks in this thread. Make sure you have plenty of fresh air feeding those cards; whichever runs cooler should be on top.

Hopefully your DC2 doesn't have temperature issues like the others we have been discussing.


----------



## MapRef41N93W

Quote:


> Originally Posted by *Roboyto*
> 
> I don't want to be the bearer of bad news, but Xfire on air hasn't been the greatest for folks in this thread. Make sure you have plenty of fresh air feeding those cards; Whichever runs cooler should be on top.
> 
> Hopefully your DC2 doesn't have temperature issues like the others we have been discussing.


I have a HAF-X with a 200mm side fan feeding air to the GPUs and two top 200mms pulling hot air out. I could try to setup the GPU bracket with an 80mm fan as well, but I doubt it will fit with such large cards.


----------



## Roy360

Quote:


> Originally Posted by *Roboyto*
> 
> No problem. If you have no luck at the hardware store, eBay is full of them
> 
> 
> 
> 
> 
> 
> 
> http://www.ebay.com/bhp/gpu-copper
> 
> Just have to get one that is thin enough for the cooler to still mount properly; Not sure how many extra threads the screws have for mounting the cooler have left. Maybe in the 1-1.5mm range so it will make good contact and not bend?
> 
> *23% for VRM1* this is the important one for GPU core, 12% for VRM2 which is less important and for the memory.
> 
> It's hard to say if the EK pads would see an improvement over the ASUS ones. I know several people have upgraded the EK pads to get similar temperature drops to what I saw with my XSPC. The EK pads have roughly 1/3 of the thermal conductance of the Fuji Ultra Extreme,(someone posted this previously can't remember who) so EK suggests using the TIM to assist with heat transfer.
> 
> The only way to know if the EK pads will work better is to swap them out.


Quote:


> Originally Posted by *Roy360*
> 
> Thanks, I'll look around some hardware stores to see if I can find a thin piece of copper to use as a shim.
> 
> You found a 12% decrease in temps from using better pads. I have some leftover EK pads from my two R9 290 full cover blocks, they keep my VRM under 60 degrees when oc'ed to 1250 @ +25mV but that's with tim on both sides of the pad.
> 
> Would EK pads be an improvement over ASUS'? I'd buy the poly ones, but shipping is like 20$ from FrozenCPU


Turns out I was wrong about shipping; it's actually like $7. I just hope UPS doesn't try charging a broker fee on a sub-$30 item.

Okay, so going off your guide, I need 1mm thick pads.

Found these two:
http://www.frozencpu.com/products/17500/thr-182/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_10_-_Thermal_Conductivity_170_WmK.html
http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html

So $6 more for double the amount? (Is that the only difference?) I have a reference card besides the ASUS, so maybe I should get the 60x50 and do both cards? Or will the 100x15 be enough for VRM1, and maybe VRM2 if I have enough left over? I remember your guide saying you had enough for 3 cards.

For the shim, I just need a thin copper sheet (1mm?) and to drill 4 holes to match up with the stock screws?
The die size is 428mm^2, so a 25x25x1 is good? Or should I get a 30x30x1? I don't have the cooler in front of me, or else I would measure it myself.


----------



## Roboyto

Quote:


> Originally Posted by *Hl86*
> 
> So my tri-x 290x fluctuates when i hit 1200 on core. Using trixx.50% power limit, 150+ VDDC, core hits 70c vrm 77-80c ingame.
> What can i do to make it stable?


Are you forcing constant voltage? Do you have ULPS disabled? The 14.X drivers have had lots of stability issues; 13.X is usually better.

1200/1500 is a solid overclock for one of these cards.

*More voltage or lower clocks.*

Best methodology for overclocking is usually one component at a time.

Start pushing the GPU, for example; when you lose stability, drop the clocks back down to the last stable clocks you found, and then work on memory. When you lose stability there, drop the clocks back down to stable settings and add more voltage.

Rinse and repeat until you find the best combination of clocks, voltages, and temperatures. It's a lengthy process, so be patient!

Core clock definitely yields better performance gains, so sacrificing some RAM speed for overall stability isn't necessarily a bad thing.

I have spent probably 100 hours benching, testing, and overclocking my 290. I have come to find that 1200/1500 yields the best combination of performance, voltage, and temperatures. I am able to push my card as high as 1300/1700 with +200mV; however, at 1200/1500 I get 96% of the performance found at those maximum clocks and voltages.
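The one-component-at-a-time process described above is essentially a greedy search, which can be sketched in a few lines. `is_stable` here is a hypothetical stand-in for whatever stability test you actually run (a Valley loop, a gaming session, etc.); the step sizes and limits are arbitrary:

```python
# Sketch of the "one component at a time" overclock search described above.
CORE_STEP = 10   # MHz per iteration
MEM_STEP = 25

def find_stable_clocks(is_stable, core=1000, mem=1250,
                       core_limit=1300, mem_limit=1700):
    """Raise core until unstable, back off, then raise memory the same way."""
    # Push the core first: it yields the larger performance gain.
    while core + CORE_STEP <= core_limit and is_stable(core + CORE_STEP, mem):
        core += CORE_STEP
    # Then push memory at the highest stable core clock.
    while mem + MEM_STEP <= mem_limit and is_stable(core, mem + MEM_STEP):
        mem += MEM_STEP
    return core, mem

# Simulated card that happens to be stable up to 1200/1500.
stable = lambda c, m: c <= 1200 and m <= 1500
print(find_stable_clocks(stable))  # (1200, 1500)
```

The "add more voltage and repeat" step wraps this whole search in an outer loop; in real life each `is_stable` check is hours of testing, which is why the process takes so long.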


----------



## Paul17041993

Quote:


> Originally Posted by *vortex240*
> 
> Whoops, I meant 1mm, soldering a surface area that large would be near impossible to maintain square flatness in relation to the core. The DC2 cooler itself, is enough to take care of the 780ti, so it's perfectly adequate for hawaii. If I had a chance to pick up a DC2 for a lot less then other custom cards, I would.


Other heatpipe coolers use solder to join the pipes to the contact plate, so it would be relatively simple to solder a plate onto these, as it's already pressed flat. You would just need a blowtorch on hand to keep it warm as you get the solder across the surface and the plate sitting flat; it would probably be easiest to spread the solder all over the plate and then press the plate to the heatsink, as the heatsink will naturally want to cool itself.

It's tricky and hazardous, and I wouldn't expect many here to do it, but anyone proficient in basic metalwork is free to try. Of course, you would want to take all the plastic parts off the heatsink and have a solder vacuum ready for spillage.


----------



## Roboyto

Quote:


> Originally Posted by *Roy360*
> 
> Turns out I was wrong about shipping, it's actually like 7$. I just hope UPS doesn't try charging a broker fee on a <30$ item.
> 
> Okay, so going off your guide, *I need 1mm thick pads.*
> 
> Found these two:
> http://www.frozencpu.com/products/17500/thr-182/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_10_-_Thermal_Conductivity_170_WmK.html
> *http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html*
> 
> So 6$ more for double the amount? (Is that the only difference) I have a reference card besides the ASUS, so maybe I should get the 60x50 and do both cards? Or will the 100x15 be enough for VRM1 and maybe VRM2 if I have enough? I remember your guide saying you had enough for 3 cards.
> 
> For the shim, I just need a thin copper sheet, (1mm?) and drill 4 fours to match up with the stock screws?
> *The die size is 428mm^2 so a 25x25x1 is good?* Or should I get a 30x30x1 I don't have the cooler infront of me, or else I would measure it myself.


The 100x15mm is enough to do all the VRMs for 2 cards. Get the Fujipoly Ultra Extreme for the VRMs.

*I would suggest leaving the RAM pads as they are. I spent the extra money to upgrade the RAM pads and I got nothing out of it. The RAM doesn't get anywhere near the temperature of the VRMs.*

*If you really want to change the RAM pads as well, you will need (2) 60x50mm pads for (1) card. But I suggest saving yourself the time and money and just using the pads the blocks come with.*

The square root of 428 is about 20.7, but the die is slightly taller than wide. I would imagine a 25x25 would help, but the real question is whether 25x25 is large enough to make sufficient contact with those outer heatpipes. Hard to say without removing the DC2 heatsink.
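For anyone double-checking the shim arithmetic, a quick sketch (the 428 mm² die area is from the discussion above; the 25 mm shim width is the size being considered, not a confirmed fit for the DC2's contact plate):

```python
import math

# If the 428 mm^2 Hawaii die were a perfect square, each side would be:
die_area_mm2 = 428
side = math.sqrt(die_area_mm2)
print(round(side, 2))  # 20.69

# A 25x25 mm shim overhangs a ~20.7 mm square die by about 2 mm per edge.
# Whether that overhang reaches the cooler's outer heatpipes depends on
# the contact-plate layout, which can't be checked without pulling the heatsink.
overhang = (25 - side) / 2
print(round(overhang, 2))  # 2.16
```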


----------



## vortex240

Quote:


> Originally Posted by *Paul17041993*
> 
> other heatpipe coolers use solder to join the pipes to the contact plate, it would be relatively simple to solder a plate onto these as its already pressed flat, you would just need a blowtorch on-hand to keep it warm as you get the solder across the surface and the plate sitting flat, would probably be easiest to spread the solder all over the plate and then press the plate to the heatsink as the heatsink would naturally want to cool itself.
> 
> its tricky and hazerdous, wouldn't expect many here to do it but anyone proficient in basic metalwork would be free to try, of course you would want to take all the plastic parts off the heatsink and have a solder vacuum ready for spillage.


I can guarantee you that you will not achieve the square flatness required to perfectly transfer heat to the die. Then again, if you have access to a machine shop that should not be too difficult.


----------



## Roy360

Quote:


> Originally Posted by *Roboyto*
> 
> The 100x15mm for is enough to do all VRMs for 2 cards. Get the Fujipoly Ultra Extreme for VRMs.
> 
> *I would suggest leaving the RAM pads as they are. I spent the extra money to upgrade the RAM pads and I got nothing out of it. The RAM doesn't get anywhere near the temperature of the VRMs.*
> *If you really want to change the RAM pads as well, you will need (2) 60x50mm pads for (1) card. But I suggest saving yourself the time and money and just using the pads the blocks come with.*
> 
> Square root of 428 is 20.68, but the die is slightly taller than wide. I would imagine a 25x25 would help, but the real question is 25x25 large enough to make sufficient contact to those outer heatpipes? Hard to say without removing the DC2 heatsink.


Would a too-large shim have any adverse effects? I can't think of any, since it's in contact with a massive copper block. In some earlier posts someone mentioned that shims <1mm are too thin to be effective, but one that's too thick might not screw in.


----------



## Roboyto

Quote:


> Originally Posted by *Roy360*
> 
> Would getting a too large shim have any adverse effects? I can't think of any since it's in contact with a massive copper block. In some early posts someone mentioned that shims <1mm are too thin to be effective, but too thick might not screw it..


<1mm probably too thin to be sturdy. There is always the possibility of longer screws if the shim causes issues. I have seen numerous different types of screws for holding on heatsinks over the years.

The shims are pretty cheap, so why not order a 1mm and 1.5mm? I think 2mm would really be pushing your luck.


----------



## Iniura

EK-FC R9-290X - Acetal + Backplate installed


----------



## Roboyto

Quote:


> Originally Posted by *Iniura*
> 
> EK-FC R9-290X - Acetal + Backplate installed
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats! Welcome to the club


----------



## vortex240

Quote:


> Originally Posted by *Roy360*
> 
> Would getting a too large shim have any adverse effects? I can't think of any since it's in contact with a massive copper block. In some early posts someone mentioned that shims <1mm are too thin to be effective, but too thick might not screw it..


I meant to say 1mm, another member pointed out my mistake. There is no downside to a copper shim.


----------



## SeanEboy

Question.. I have (4) 290x, and (1) 290 that can unlock.. Right now, I'm running (1) 290x, and the unlockable one. I'm a newb to benching, so I don't really know what I'm doing. I've done some looking around and cannot find the answers I need..
My questions:
*1)* I am running a Rampage Black Edition, and I read in the manual that there are switches (1-4) on the board (next to the reset button) that let me disable the PCI-E slots individually to help troubleshoot which card is acting up, without removing them. However, I'm wondering if I could also use this feature to benchmark each card individually? Is that the way to go about it? Or is there another way? I want to bench the unlockable card, then flash it to 290X and see the difference.

*2)* I also want to bench my other (3) 290x, to find out their individual performance, and compare them. How do I go about this? Is there a quicker/easier way of doing this, than constantly disabling the PCI-E slot, restarting, and switching the card the monitor is plugged into?

*3)* Should I just use firestrike extreme to compare them? Or Unigine Valley? Or both?

*4)* What parameters should I record?


----------



## Paul17041993

Quote:


> Originally Posted by *vortex240*
> 
> I can guarantee you that you will not achieve the square flatness required to perfectly transfer heat to the die. Then again, if you have access to a machine shop that should not be too difficult.


That's what the 1mm copper plate is for? The solder makes contact between the plate and the heatpipes; the plate should already be perfectly flat where it touches the die.


----------



## Roboyto

Quote:


> Originally Posted by *SeanEboy*
> 
> Question.. I have (4) 290x, and (1) 290 that can unlock.. Right now, I'm running (1) 290x, and the unlockable one. I'm a newb to benching, so I don't really know what I'm doing. I've done some looking around and cannot find the answers I need..
> My questions:
> *1)* I am running a Rampage Black Edition, and I read in the manual that there are (1-4) swtiches on the board (next to reset button), that enable me to disable the PCI-E slots individually to help troubleshoot which card is acting up, without removing them. However, I'm wondering if I could also use this feature to benchmark each card individually? Is that the way to go about it? Or is there another way? I want to bench the unlockable card, then flash to 290x and see the difference.
> 
> *2)* I also want to bench my other (3) 290x, to find out their individual performance, and compare them. How do I go about this? Is there a quicker/easier way of doing this, than constantly disabling the PCI-E slot, restarting, and switching the card the monitor is plugged into?
> 
> *3)* Should I just use firestrike extreme to compare them? Or Unigine Valley? Or both?
> 
> *4)* What parameters should I record?


1) That sounds like an awesome feature. If it disables the slot, then yes, I would imagine this would work. If it doesn't, I believe you could disconnect the PCI-E power cables, as these are needed for the card to turn on and display video.

2) It's been a long time since I ran multiple cards, so I'll give you an educated guess. If disabling the PCI-E slots requires a reboot, then that is probably the route you're going to have to take. Boots don't take that long these days anyhow with SSDs.









3) Use whatever benchmarks you would like. Anything 3DMark or Unigine will do, just depends on how much time you want to spend doing it.

4) Depends on how detailed you want to be. If you're just looking for performance results, then the benchmark scores will do just fine. For Unigine you could also record the min/max/avg frames per second if you wanted.

In my build log I have a spreadsheet if you want to scope that out. I recorded GPU/VRM temperatures for most of my benchmark runs as well.
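If you do end up benching each card one at a time, a tiny script keeps the runs comparable. This is just a generic CSV-logging sketch; the field names and the sample values are made up for illustration, not tied to any benchmark's output format:

```python
import csv
import os

# One row per benchmark run: card, benchmark name, score, FPS stats, temps.
FIELDS = ["card", "benchmark", "score", "fps_min", "fps_avg", "fps_max",
          "gpu_temp_c", "vrm1_temp_c"]

def log_run(path, row):
    """Append one benchmark run to a CSV file, writing the header on first use."""
    new_file = not os.path.exists(path)
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(row)

# Hypothetical example run (numbers are placeholders, not real results):
log_run("bench_log.csv", {
    "card": "290X #1", "benchmark": "Fire Strike Extreme", "score": 4900,
    "fps_min": 18.2, "fps_avg": 23.5, "fps_max": 31.0,
    "gpu_temp_c": 72, "vrm1_temp_c": 78,
})
```

After a few runs per card, sorting the CSV by card and benchmark makes it easy to spot the weak chip in the set.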


----------



## vortex240

Quote:


> Originally Posted by *Paul17041993*
> 
> that's what the 1mm copper plate is for? the solder makes contact between the plate and the heatpipes, the plate should already be perfectly flat to touch the die.


To explain better, here is a simple ASCII diagram:

heatpipe-> | | | <-core

The first | is the heatpipes, then you have the shim, then the core. If the shim is soldered at even a slight angle, it will not be 100% parallel to the core. The mounting screws will end up putting unequal pressure on one side/corner of the core, most likely cracking it. Hope this helps, since I don't feel like drawing.









When you use just TIM, it has give and will distribute evenly under the clamping force.

Here is what you will end up with:

http://sound.westhost.com/heatsink3.gif


----------



## bbond007

I have done what I can with cooling my GPUs with a large fan; my top GPU's performance is greatly improved, and I don't see any throttling anymore while gaming at my 1030 stock clock.

Anyway, depending on what I'm mining, my top card can still reach 90°C+, and I would like to limit it because I feel that is too high.

Right now I just set GPU0 to 950MHz, which does OK, but sometimes it then goes into the low 80s, so it's hard to get it just right.

This CCC has the settings I want, "Target GPU Temperature", from here -> http://www.guru3d.com/articles_pages/radeon_r9_290_review_benchmarks,31.html


Spoiler: Warning: Spoiler!







but this is what my CCC settings look like:


Spoiler: Warning: Spoiler!







Anyone know why mine is different? I have 14.2 drivers. Is there any way to expose those additional sliders?

Thanks!


----------



## Paul17041993

Quote:


> Originally Posted by *vortex240*
> 
> To explain better here is a simple ascii diagram
> 
> heapipe-> | | | <-core
> 
> The first | is the heatpipes, then you have the shim, then the core. If the shim is soldered at even a slight angle - therefore it will not be 100% parallel to the core. The mounting screws will end up putting unequal pressure on one side/corner of the core and most likely cracking it. Hope this helps, since I don't feel like drawing.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> When you use just TIM, it has give and will distribute due to clamping force evenly.
> 
> Here is what will end up:
> 
> http://sound.westhost.com/heatsink3.gif


Oh, that's what you meant. But yeah, you don't have that issue with solder either, provided it's still hot; it's more fluid than any TIM you can imagine. Ever seen mercury?
Of course, if you let the solder cool before pressing it to the heatsink, expect it to sit badly, or not stick at all...

Oh, I actually just remembered: there was this old "TIM" for CPUs which was essentially a low-melting solder. You sat it between the CPU and heatsink, let it heat up, and it would liquify and spread across. Damned if I can remember what it was called, though I don't think it would be suitable for this, as these GPUs already run very hot; you would need a solder with a melting point of about 150-200°C...


----------



## Forceman

Quote:


> Originally Posted by *bbond007*
> 
> I have done what I can with cooling my GPUs with a large fan, and my top GPU performance is greatly improved, and I don't see any throttling while gaming with my 1030 stock clock anymore.
> 
> Anyway, depending on what I'm mining my top card can reach 90c+ still, and i would like to limit it because I feel that is too high.
> 
> Now I just set the GPU0 for 950mhz which does OK but sometimes then it goes into the low 80s so its hard to get it just right..
> 
> This CCC has the settings I want "Target GPU Temprature". from here -> http://www.guru3d.com/articles_pages/radeon_r9_290_review_benchmarks,31.html
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> but this is what my CCC settings look like:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Anyone know why mine is different? I have 14.2 drivers. Is there any way to expose those additional sliders?
> 
> Thanks!


You probably just need to enable Overdrive - it isn't checked in your 14.2 screenshot. Otherwise, there was a bug with the 14.1s where CCC sometimes didn't install correctly; try re-installing the drivers, or manually installing CCC, if the Overdrive thing doesn't work.


----------



## SeanEboy

Quote:


> Originally Posted by *Roboyto*
> 
> 1) That sounds like an awesome feature. If it disables the slot then yes, I would imagine this would work. If it doesn't, I believe you could disconnect the PCI-E power cables, as these are needed for the card to turn on and display video.
> 
> 2) It's been a long time since I ran multiple cards, so I'll give you an educated guess. If disabling the PCI-E slots requires a reboot, then that is probably the route you're going to have to take. Boots don't take that long these days anyhow with SSDs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 3) Use whatever benchmarks you would like. Anything 3DMark or Unigine will do, just depends on how much time you want to spend doing it.
> 
> 4) Depends on how detailed you want to be. If you're just looking for performance results, then the benchmark scores will do just fine. For Unigine you could also record the min/max/avg frames per second if you wanted.
> 
> In my build log I have a spreadsheet if you want to scope that out. I recorded GPU/VRM temperatures for most of my benchmark runs as well.


Thanks, as usual Roboyto!

Question, am I about to cook my motherboard?! I'm running only on motherboard tray (out of PC) at the moment.
See highlight:


----------



## bbond007

Quote:


> Originally Posted by *Forceman*
> 
> You probably just need to enable Overdrive - it isn't checked in your 14.2 screenshot. Otherwise there was a bug with the 14.1s that didn't correctly install CCC sometimes, maybe try re-installing the drivers or manually installing CCC if the Overdrive thing doesn't work.


Enabling that does not expose any more options unfortunately:


Spoiler: Warning: Spoiler!







I did use DDU to clean up any leftover NVIDIA stuff. I also re-installed the 14.2 Catalyst package.

Here are the versions I have now:


Spoiler: Warning: Spoiler!







thanks!


----------



## Roboyto

Quote:


> Originally Posted by *SeanEboy*
> 
> Thanks, as usual Roboyto!
> 
> Question, am I about to cook my motherboard?! I'm running only on motherboard tray (out of PC) at the moment.
> See highlight:
> 
> 
> Spoiler: Warning: Spoiler!


I don't think those sensors are giving a correct reading. I just checked mine and I have maximums up to 112C and 120C. I don't even know what they're for, TBH... my list doesn't have Temp 3, only 2, 4, and 5. I know nothing in my computer is running that hot, so I would assume there is nothing to worry about.

Since you have the board on a table, poke around, literally, and see if anything is that hot. If something was close to boiling-water temperatures you would probably find it pretty quickly.


----------



## SeanEboy

Quote:


> Originally Posted by *Roboyto*
> 
> I don't think those sensors are giving a correct reading. Just checked mine and I have maximums up to 112C and 120C. Don't even know what they are for TBH...my list doesn't have Temp 3..only 2, 4, and 5. I know nothing in my computer is running that hot, so I would assume that there is nothing to worry about.
> 
> Since you have the board on a table, poke around, literally, and see if anything is that hot. If something was close to boiling-water temperatures you would probably find it pretty quickly.


I was too scared to get electrocuted... Haha, yeah it's fine.. Thanks.


----------



## Arizonian

Quote:


> Originally Posted by *Iniura*
> 
> EK-FC R9-290X - Acetal + Backplate installed
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added.

If you can, go back to that post and just edit in a GPU-Z link with your OCN name, or a screenshot of the GPU-Z validation tab open with your OCN name, as proof. But I believe you, so you've been added.


----------



## HardwareDecoder

the nickel/acetal is sexy :-D


----------



## Forceman

Quote:


> Originally Posted by *bbond007*
> 
> Enabling that does not expose any more options unfortunately:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I did use DDU to clean up any nvidia stuff maybe leftover. I also re-installed the 14.2 Catalyst stuff.
> 
> here is the versions i have now:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> thanks!


All the info at the top of the Overdrive page (in CCC) is blank (0 MHz clock speed, 0 MHz memory speed, 0C temp, etc.), so I'd say it is probably a bad install. Some people with the 14.1s had success installing 13.12 first and then installing 14.1 over the top, which kept CCC from getting lost; I guess you could try that.


----------



## bbond007

Quote:


> Originally Posted by *Forceman*
> 
> All the info at the top of the overdrive page (in CCC) is blank (0 MHz clock speed, 0 MHz memory speed, temp 0C, etc) so I'd say it is probably a bad install. Some people, with the 14.1s, had success installing 13.12 and then installing 14.1 over top, which kept it from losing CCC, I guess you could try that.


I think that is because I had the second video card selected, which is inactive.

I re-installed the 13.2 drivers and I still don't have the options.


Spoiler: Warning: Spoiler!







Thanks for your help.

AMD sucks. I hope these cards catch on fire.


----------



## Sgt Bilko

Quote:


> Originally Posted by *bbond007*
> 
> I think that is because I selected the second video card which is inactive.
> 
> I re-installed 13.2 drivers and i still don't have the options.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Thanks for your help.
> 
> AMD sucks. I hope these cards catch on fire.


Your CCC looks the same as mine. From what I've heard, the custom cards don't have a Temp Target setting in CCC.

Only reference cards have the Temp Target setting; others might need to confirm this for me though.


----------



## The Storm

Quote:


> Originally Posted by *bbond007*
> 
> AMD sucks. I hope these cards catch on fire.


Comments like this are hilarious....


----------



## kizwan

Quote:


> Originally Posted by *Forceman*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *bbond007*
> 
> Enabling that does not expose any more options unfortunately:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I did use DDU to clean up any nvidia stuff maybe leftover. I also re-installed the 14.2 Catalyst stuff.
> 
> here is the versions i have now:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> thanks!
> 
> 
> 
> 
> 
> 
> 
> All the info at the top of the overdrive page (in CCC) is blank (0 MHz clock speed, 0 MHz memory speed, temp 0C, etc) so I'd say it is probably a bad install. Some people, with the 14.1s, had success installing 13.12 and then installing 14.1 over top, which kept it from losing CCC, I guess you could try that.
Click to expand...

I'm pretty sure it reads 0 MHz because ULPS is enabled. The second card was selected in CCC; if it were the primary GPU, you would see the monitor name next to the "R9 290". Regarding the missing temp target slider, I can only check tonight (I'm also using the 14.2 driver).


----------



## Paul17041993

Quote:


> Originally Posted by *bbond007*
> 
> I think that is because I selected the second video card which is inactive.
> 
> I re-installed 13.2 drivers and i still don't have the options.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Thanks for your help.
> 
> AMD sucks. I hope these cards catch on fire.


Quote:


> Originally Posted by *Sgt Bilko*
> 
> Only Reference have the Temp target setting, other might need to confirm this for me though.


Likely the BIOS has the thermal control disabled and hard-set to 95C, as those cards would use a custom fan profile anyway.


----------



## Shurtugal

Hey guys, what sort of temps are people reaching during gaming with the Gigabyte Windforce cooler? I'm asking because I'm reaching 90+ degrees, and that seems a little hot...

Edit: I just took the side of the case off. I'm using a Silverstone Raven RVZ01, so the graphics card is using one of these:

So the graphics card is in a sort of separate compartment from the rest of the rig. The side I took off was the CPU side, and I also took off the two slim intake fans which were blowing directly onto the GPU. With it like this, I got a maximum of 78 degrees, which seems a bit better. About to put the side of the case back on, but leave the fans out to see if this makes a difference.


----------



## Abyssic

I'd say your temps are pretty normal considering the case... what did you expect when you put an open-airflow-designed 290(X?) in a case like this?

IMHO, the 290 and 290X are not a viable option for very small cases, because you would need a good blower-design card to get all the heat out of the case, which the reference 290/X isn't.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Paul17041993*
> 
> likely the BIOS has the thermal control disabled and hard-set to 95C, as they would use a custom fan profile anyway.


Yeah, that's kinda what I was getting at; it would be in the BIOS rather than just in CCC.


----------



## Paul17041993

Quote:


> Originally Posted by *Shurtugal*
> 
> Hey guys, what sort of temps are people reaching during gaming with the Gigabyte Windforce cooler? Just because I'm reaching 90+ degrees, and that seems a little hot...
> 
> Edit: I just took the side of the case off, using a Silverstone Raven RVZ01, so the graphics card is using one of these:
> 
> So the graphics card is sort of in a seperate compartment to the rest of the rig. The side I took off was the CPU side, and I also took off the two slim intake fans which were blowing directly onto the GPU. with it like this, I got a maximum of 78 degrees, which seems a bit better. About to put the side of the case back on, but leave the fans out to see if this makes a difference.


If you can keep it below 90C then that's great; these cards are built for this heat, but will throttle down at 95C. If you wanted to OC you would definitely need better cooling/airflow, but with the case being so small I wouldn't see you doing much of that (both heat and PSU concerns).

If you wanted it to run cooler and quieter you could always pull the clocks back and undervolt a little; this brings the power draw right down and makes the card very power efficient, at the cost of a little performance depending on how far you go.
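The reason undervolting helps so much is the usual dynamic-power rule of thumb: switching power scales roughly with frequency times voltage squared. A minimal sketch of the arithmetic; the wattage, clock, and voltage figures below are illustrative assumptions, not measurements from any particular 290:

```python
# Rough illustration of why undervolting cuts GPU power draw so effectively.
# Dynamic (switching) power scales roughly as P ~ f * V^2.
# The numbers (250 W board power, 1.18 V stock core voltage) are assumptions
# for illustration only, not measured values for any specific card.

def scaled_power(p_stock, f_stock, v_stock, f_new, v_new):
    """Estimate new dynamic power from the f * V^2 scaling rule."""
    return p_stock * (f_new / f_stock) * (v_new / v_stock) ** 2

stock_power = 250.0   # W, assumed board power
stock_clock = 947.0   # MHz, reference 290 boost clock
stock_volt = 1.18     # V, assumed stock VID

# Drop the clock ~5% and undervolt by 50 mV:
new_power = scaled_power(stock_power, stock_clock, stock_volt, 900.0, 1.13)
print(f"Estimated power: {new_power:.0f} W "
      f"({100 * (1 - new_power / stock_power):.0f}% less)")
```

Because voltage enters squared, even a modest undervolt saves noticeably more than the clock reduction alone would suggest.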


----------



## Shurtugal

Quote:


> Originally Posted by *Shurtugal*
> 
> Hey guys, what sort of temps are people reaching during gaming with the Gigabyte Windforce cooler? Just because I'm reaching 90+ degrees, and that seems a little hot...
> 
> Edit: I just took the side of the case off, using a Silverstone Raven RVZ01, so the graphics card is using one of these:
> 
> So the graphics card is sort of in a seperate compartment to the rest of the rig. The side I took off was the CPU side, and I also took off the two slim intake fans which were blowing directly onto the GPU. with it like this, I got a maximum of 78 degrees, which seems a bit better. About to put the side of the case back on, but leave the fans out to see if this makes a difference.


Quote:


> Originally Posted by *Paul17041993*
> 
> if you can keep it below 90C then that's great, these cards are built for this heat but will throttle down on 95C, if you wanted to OC you would definitely need better cooling/airflow, but the case being so small I wouldn't see you doing that much. (both heat and PSU concerns)
> 
> if you wanted it to run cooler and quieter you could always push the clocks back and undervolt a little, this brings the power draw right down and makes the card very power efficient, at the cost of a little loss in performance depending on how far you go.


Awesome, thanks guys! For now anyhow, I've removed the two intake fans blowing onto the GPU, which brought it down to a max of about 82 degrees, which I'm happy with. The CPU was the problem at a max of 87 degrees, but just sticking one of the 120mm slim fans blowing onto it brought it down to a maximum of 68, which is much better. How would I go about undervolting? Would it bring the temps and power usage down very much? I have a bit of experience in overclocking CPUs and GPUs, thanks!








+ rep!


----------



## Paul17041993

Quote:


> Originally Posted by *Shurtugal*
> 
> Awesome, thanks guys! For now anyhow, I've removed the two intake fans into the GPU, which brought it down to a max of about 82 degrees, which I'm happy with. The CPU was the problem at a max of 87 degrees, but just sticking one of the 120mm slim fans blowing onto that brought it down to a maximum of 68, which is much better. How would I go about undervolting it? Would this bring the temps and power usage down very much? I have a bit of experience in overclocking CPU and GPU, thanks!
> 
> 
> 
> 
> 
> 
> 
> 
> + rep!


You can use MSI Afterburner to try undervolting; it allows -100mV to +100mV voltage control on the core. Some people here have found theirs doesn't allow the full -100mV, though, so there's a chance it might not be possible without a different BIOS. I haven't seen much about the Windforce cards, so I'm not too sure...


----------



## vieuxchnock

What PSU will be able to support a CrossFire of 290s OC'd to 1175/1500 plus my i7 4770K OC'd to 4500, my water cooling, and everything else in my build? Many people tell me I will need at least 1000W to support all of that, but the longest PSU I can fit in my Prodigy is an 850W. (See my sig for details.)


----------



## bhardy1185

Coming here in hopes of some help with my "new to me" Sapphire R9 290. A friend bought the 290 on eBay; he said he didn't really need it because he was just trying to lowball the seller, and he actually won. I ended up trading my 7970 plus 2x 5870s for the 290. I installed it last night and, instead of testing at stock, started to OC it. Initial OC was 1100 core, 1350 mem. Ran Unigine and my temps shot up to 95 degrees C. Idle was around 70 degrees. I did some fan tweaking and got it to come down to 91 degrees at around 55% fan speed. This seems extremely hot to me. So I dropped the OC and just ran stock. Same thing: temps jumped up to 94 degrees and stayed there. My first thought is to remove the stock cooler, replace the TIM, and reapply the cooler. Anyone have any other suggestions? I honestly don't want to spend money on aftermarket cooling, but will if it is absolutely necessary. Thank you in advance.


----------



## Abyssic

Quote:


> Originally Posted by *Shurtugal*
> 
> Awesome, thanks guys! For now anyhow, I've removed the two intake fans into the GPU, which brought it down to a max of about 82 degrees, which I'm happy with. The CPU was the problem at a max of 87 degrees, but just sticking one of the 120mm slim fans blowing onto that brought it down to a maximum of 68, which is much better. How would I go about undervolting it? Would this bring the temps and power usage down very much? I have a bit of experience in overclocking CPU and GPU, thanks!
> 
> 
> 
> 
> 
> 
> 
> 
> + rep!


Undervolting usually does a great deal for power consumption, and therefore heat output. I reduced my former HD 7950 by almost 20°C by undervolting it around 70mV (it still ran the stock clocks; good GPU sample).


----------



## goodenough88

I saw some old benchmarks for a 7680x1440 resolution 2-way R9 290X. (http://www.tomshardware.com/reviews/radeon-r9-290x-hawaii-review,3650-27.html)

Does anyone know if adding a 3rd R9 290X to a 7680x1440 setup will make a massive difference? Or is there very little gain to be had from adding a 3rd R9 290X?


----------



## SeanEboy

Quote:


> Originally Posted by *bhardy1185*
> 
> Coming here in hopes of some help with my "new to me" Sapphire R9 290. A friend bought the 290 from ebay and said he didn't really need because he was just trying to lowball the seller and actually won. I ended up trading my 7970 plus 2x 5870s for the 290. I installed it last night and instead of testing stock, started to OC it. Initial OC was 1100 core, 1350 mem. Ran Unique and my temps shot up to 95 degrees C. Idle was around 70 degrees. I did some fan tweaking and got it to come down to 91 degrees at around 55% fan speed. This seems extremely hot to me. So I dropped the OC and just ran stock. Same thing. Temps jumped up to 94 degrees and stayed there. My first thought is to remove the stock cooler and replace TIM and reapply the cooler. Anyone have any other suggestions? I honestly don't want to spend money on aftermarket cooling but will if it is absolutely necessary. Thank you in advance.


The stock coolers on these things are horrible. TIM might help, but not nearly as much as an Arctic Extreme III or something like that. At 100% fan, they sound like a 777 taking off, and it is not uncommon for them to run at such high temps.


----------



## The Storm

Quote:


> Originally Posted by *vieuxchnock*
> 
> What PSU will be able to support a XFire of 290 OC 1175/1500 and my i7 4770k OC 4500, my WC and everything else in my build? Many people tell me I will need at least a 1000W to support all of that but in my Prodigy the longer PSU I can fit is a 850W.( See my sig for details)


I have something similar in my build: 2 R9 290Xs, 4 rads (so lots of fans), a 4770K running at 4.6, and of course all the other stuff, and I power it all with my Corsair AX1200 just fine.
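As a sanity check on the 850W question, a back-of-envelope budget. All the wattages below are rough assumptions for illustration (overclocked 290s can pull well past their stock board power, and individual builds differ):

```python
# Back-of-envelope PSU budget for a 290 CrossFire + overclocked 4770K build.
# Every wattage here is a rough assumption for illustration, not a
# measured draw from any specific system.

components = {
    "R9 290 #1 (OC'd)": 300,
    "R9 290 #2 (OC'd)": 300,
    "i7-4770K @ 4.5 GHz": 130,
    "motherboard/RAM/SSD": 60,
    "pump + fans": 40,
}

total = sum(components.values())
psu = 850
headroom = psu - total

print(f"Estimated load: {total} W")
print(f"Headroom on a {psu} W unit: {headroom} W "
      f"({100 * total / psu:.0f}% load)")
```

By this rough tally, a quality 850W unit would be running close to its limit with both cards heavily overclocked, which is why 1000W+ is the usual recommendation for this setup.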


----------



## bhardy1185

Quote:


> Originally Posted by *SeanEboy*
> 
> The stock coolers on these things are horrible. TIM might help, but not nearly as much as an Arctic Extreme III or something like that. At 100% fan, they sound like a 777 taking off, and it is not uncommon for them to run at such high temps.


Dang, I was hoping that someone would have a miracle fix instead of buying an aftermarket cooler







Thanks for the info. I know at 75% it was almost unbearable, even with headphones on. The same buddy that bought the card is going to take off the cooler and apply new TIM. I will test that out for a day or two, and if the temps don't drop much, then I guess I will go for the Arctic Extreme.

What temps are acceptable for long term gaming? I have been reading up on these cards this morning and I am seeing that 90s are somewhat common.


----------



## The Storm

Quote:


> Originally Posted by *bhardy1185*
> 
> Dang, I was hoping that someone would have a miracle fix instead of buying an aftermarket cooler
> 
> 
> 
> 
> 
> 
> 
> Thanks for the info. I know at 75% it was almost unbearable even with headphones on. The same buddy that bought the card is going to take off the cooler and apply the TIM. I will test that out for a day or two and if the temps don't drop much, then I guess I will go for the Arctic Extreme.
> 
> What temps are acceptable for long term gaming? I have been reading up on these cards this morning and I am seeing that 90s are somewhat common.


According to AMD they are designed to run at 95C all day long. That's why the fan profile doesn't ramp up that fast and instead tries to maintain a target temp of 95. With that said, I personally am not comfortable with those temps, and I went with water cooling.
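The behavior described above (holding the card at a temperature target by nudging the fan, rather than ramping along a fixed temperature-to-speed curve) is essentially a closed-loop controller. A minimal proportional-control sketch, illustrative only and not AMD's actual firmware logic:

```python
# Minimal sketch of target-temperature fan control: the controller pushes
# the fan up or down only as far as needed to hold the target (95 C here),
# which is why the fan stays quiet until the card is actually at the limit.
# Illustrative proportional controller, not AMD's real firmware.

def step_fan(fan_pct, temp_c, target_c=95.0, gain=2.0,
             min_pct=20.0, max_pct=100.0):
    """Return the next fan duty cycle given the current temperature."""
    error = temp_c - target_c           # positive when too hot
    new_fan = fan_pct + gain * error    # adjust proportionally to the error
    return max(min_pct, min(max_pct, new_fan))

fan = 40.0
for temp in (80, 90, 94, 96, 97, 95):   # warm-up, overshoot, then settle
    fan = step_fan(fan, temp)
    print(f"temp={temp:3d}C -> fan={fan:.0f}%")
```

Note how the fan sits at its floor while the card is below target and only climbs once the temperature crosses 95, matching the "slow ramp" behavior owners see on the reference cooler.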


----------



## bhardy1185

Quote:


> Originally Posted by *The Storm*
> 
> According to AMD they are designed to run at 95c all day long. That's why the fan profile doesn't ramp up that fast and tries to maintain a target temp of 95. Now with that said I personally am not comfortable with those temps and I went with water cooling.


Thanks Storm. I don't like it running at 95 either but I guess I will have to deal with it being around that temp for a while until I can get a different cooler.


----------



## Abyssic

Quote:


> Originally Posted by *bhardy1185*
> 
> Dang, I was hoping that someone would have a miracle fix instead of buying an aftermarket cooler
> 
> 
> 
> 
> 
> 
> 
> Thanks for the info. I know at 75% it was almost unbearable even with headphones on. The same buddy that bought the card is going to take off the cooler and apply the TIM. I will test that out for a day or two and if the temps don't drop much, then I guess I will go for the Arctic Extreme.
> 
> What temps are acceptable for long term gaming? I have been reading up on these cards this morning and I am seeing that 90s are somewhat common.


90°C is more or less the operating condition AMD aims for with these cards, which makes no sense at all (other than cheaping out on the reference cooler), but whatever. New TIM and pads will give you some improvement, but you won't get silence out of this card. However, you could undervolt it a bit and see where it goes.


----------



## Talon720

Hey, I had a question. I've had the 290X since it came out, and I just recently picked up a second one for CrossFire. My research says fullscreen mode is the only way to use CrossFire, which sucks; I prefer windowed. Anyway, in BF4 I used windowed mode, and when I checked HWiNFO64 and/or Afterburner, I had both cards chugging along together. From everything I've read, CrossFire won't work in windowed mode of any sort. I am using the 14.2 drivers, maybe that's it? Anyone else see this?


----------



## Coree

Quote:


> Originally Posted by *SeanEboy*
> 
> The stock coolers on these things are horrible. TIM might help, but not nearly as much as an Arctic Extreme III or something like that. At 100% fan, they sound like a 777 taking off, and it is not uncommon for them to run at such high temps.


More like a 707 or 747 Classic taking off







Dem old school turbines.
The cost-effective solution is an S1 Plus + 2x140mm fans, or the Extreme III if you want to spend more.


----------



## Abyssic

Quote:


> Originally Posted by *Talon720*
> 
> Hey i had a question ive had the 290x since it came out.. Just recently picked a second one up for crossfire. Ive done research saying fullscreen mode is the only way to use crossfire, which sucks I prefer windowed. Anyways in bf4 i used windowed mode, and when i checked hwinfo64 and/or afterburner I had both cards chuggling along together. From everything ive read crossfire wont work in windowed mode of any sort. I am using 14.2 drivers maybe thats it? Anyone else see this?


Interesting, but I would first consider a reading error, because I also only knew about CrossFire working in fullscreen (I had CF in the past).


----------



## BradleyW

Quote:


> Originally Posted by *Talon720*
> 
> Hey i had a question ive had the 290x since it came out.. Just recently picked a second one up for crossfire. Ive done research saying fullscreen mode is the only way to use crossfire, which sucks I prefer windowed. Anyways in bf4 i used windowed mode, and when i checked hwinfo64 and/or afterburner I had both cards chuggling along together. From everything ive read crossfire wont work in windowed mode of any sort. I am using 14.2 drivers maybe thats it? Anyone else see this?


Usage cannot be read properly on these cards. Because CFX is enabled, the tool assumed usage on card 2 despite you using windowed mode.
Just to confirm: CFX only works in fullscreen mode.


----------



## sugarhell

With Mantle you can use CrossFire in windowed mode.


----------



## disintegratorx

Did you guys see the OpenCL 1.2 beta driver from AMD? It can be found here: http://developer.amd.com/resources/heterogeneous-computing/opencl-zone/opencl-1-2-beta-driver/

And Intel's CPU only OpenCL 1.2 runtime here: http://software.intel.com/en-us/vcsource/tools/opencl-sdk#pid-20228-1441

I installed AMD's driver right over their latest beta driver with Mantle, and I can still run Battlefield 4 with Mantle enabled; along with the OpenCL extensions, I think it's running everything better... I'm not positive, but I think it's an interesting topic, and any good feedback on this would be appreciated too. But anyway, there's that. lol







They say the OpenCL features will officially arrive in future drivers later this year, so right now this driver isn't necessary for anyone other than developers. But if you decide to try it and have some info about it, I'd like to learn more from those who know about it and are interested.


----------



## webhito

Howdy folks! I plan on buying 2 waterblocks for my 290x crossfire setup, I do have a question though. I plan on buying the ek 290x reference blocks with a 3 slot terminal block, do I need any sort of plugs to get this working or is it pretty much plug and play?


----------



## phallacy

Quote:


> Originally Posted by *webhito*
> 
> Howdy folks! I plan on buying 2 waterblocks for my 290x crossfire setup, I do have a question though. I plan on buying the ek 290x reference blocks with a 3 slot terminal block, do I need any sort of plugs to get this working or is it pretty much plug and play?


Plugs as in those that cover the G1/4" ports? I believe all waterblock makers give you enough of those to cover the unused ports in pretty much any setup. Other than that: block your cards, put the terminal block on the side by the ports, screw it together tightly with the O-rings, then snap both cards into the correct PCIe slots and hook up your tubing.


----------



## webhito

Quote:


> Originally Posted by *phallacy*
> 
> Plugs as in those that cover the g1/4" ports? I believe all waterblock makers will give you enough of those to cover the unused ones in pretty much any setup. Other than that block your cards, put the terminal block on the side by the ports screw it in together tightly with O rings and then snap both cards into the correct pcie slots and hook up your tubing.


Awesome, just want to make sure as I don't have a store nearby so I have to order from frozencpu, takes them well over a month to get stuff shipped to Mexico.

Much appreciated!


----------



## phallacy

Quote:


> Originally Posted by *webhito*
> 
> Awesome, just want to make sure as I don't have a store nearby so I have to order from frozencpu, takes them well over a month to get stuff shipped to Mexico.
> 
> Much appreciated!


Ouch, that's quite a long time for shipping. If you are unsure, just contact EK support and they will let you know. Just make sure to buy enough fittings for your planned loop.


----------



## Talon720

Quote:


> Originally Posted by *Abyssic*
> 
> interesting. but i would firstly consider a reading error because i also only knew about crossfire working in fullscreen (had CF in the past)


Quote:


> Originally Posted by *BradleyW*
> 
> Usage cannot be read on these cards properly. Because CFX is enabled, the tool assumed usage on card 2 despite using windows mode.
> Just to confirm: CFX only works with full screen mode.


Quote:


> Originally Posted by *sugarhell*
> 
> With mantle you can use crossfire wndows mode


Well, I used Mantle also... and my fps was super high compared to just one card. The 2nd card's heat numbers also reflected its usage. Can anyone else confirm Mantle and windowed mode? I didn't see anything in the patch notes. It seems dumb that they wouldn't enable it, when I read SLI does work in windowed mode. It doesn't seem right to force fullscreen and vsync; the more choice the better, since vsync gives reduced tearing but extra input lag. Windowed mode seems to solve all those issues. Plus, when I do try to run in fullscreen with X-fire I get about 5 fps, no joke; it's unplayable.


----------



## SeanEboy

Quote:


> Originally Posted by *webhito*
> 
> Howdy folks! I plan on buying 2 waterblocks for my 290x crossfire setup, I do have a question though. I plan on buying the ek 290x reference blocks with a 3 slot terminal block, do I need any sort of plugs to get this working or is it pretty much plug and play?


Email tech support: [email protected] , he was very helpful to me with my questions, and I think considering the amount of time it takes you to receive items, it's well worth dropping a quick email. Also, I think you'll need this: http://www.frozencpu.com/products/19690/ex-blc-1448/EK_Terminal_Block_-_Blank_Parallel_EK-FC_Terminal_BLANK_Parallel.html
Quote:


> Originally Posted by *phallacy*
> 
> Plugs as in those that cover the g1/4" ports? I believe all waterblock makers will give you enough of those to cover the unused ones in pretty much any setup. Other than that block your cards, put the terminal block on the side by the ports screw it in together tightly with O rings and then snap both cards into the correct pcie slots and hook up your tubing.


Quote:


> Originally Posted by *webhito*
> 
> Howdy folks! I plan on buying 2 waterblocks for my 290x crossfire setup, I do have a question though. I plan on buying the ek 290x reference blocks with a 3 slot terminal block, do I need any sort of plugs to get this working or is it pretty much plug and play?


I think he means for plugging up the unused terminal port.. I think you might need a certain amount of these:
http://www.frozencpu.com/products/19690/ex-blc-1448/EK_Terminal_Block_-_Blank_Parallel_EK-FC_Terminal_BLANK_Parallel.html


----------



## kizwan

Quote:


> Originally Posted by *webhito*
> 
> Howdy folks! I plan on buying 2 waterblocks for my 290x crossfire setup, I do have a question though. I plan on buying the ek 290x reference blocks with a 3 slot terminal block, do I need any sort of plugs to get this working or is it pretty much plug and play?


For two cards, these are all you need for the GPUs:

If you get the "Original CSQ" block, then you need an EK-FC Bridge (Parallel+BLANK or Serial+BLANK), not a Terminal. The "EK-FC Terminal BLANK Parallel" in the picture is for parallel. If you get "triple serial" instead, you'll need the "EK-FC Terminal BLANK Serial". The backplate is optional, but get one if you can afford it, because it provides passive cooling.


----------



## SeanEboy

Quote:


> Originally Posted by *kizwan*
> 
> For two, these are all you need for the GPU's:-
> 
> 
> If you get the "Original CSQ" block, then you need EK-FC Bridge (Parallel+BLANK or Serial+BLANK), not Terminal. The "EK-FC Terminal BLANK Parallel" in the picture is for parallel. If you get "triple serial" instead, you'll need "EK-FC Terminal BLANK Serial". The backplate is optional but if you can afford, get one because it provide passive cooling.


Well, I get the feeling he wants to leave open the option of upgrading to tri's down the line, without having to replace the terminal...


----------



## kizwan

Quote:


> Originally Posted by *SeanEboy*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> For two, these are all you need for the GPU's:-
> 
> 
> If you get the "Original CSQ" block, then you need an EK-FC Bridge (Parallel+BLANK or Serial+BLANK), not a Terminal. The "EK-FC Terminal BLANK Parallel" in the picture is for parallel. If you get "triple serial" instead, you'll need the "EK-FC Terminal BLANK Serial". The backplate is optional, but get one if you can afford it because it provides passive cooling.
> 
> 
> 
> 
> 
> 
> 
> Well, I get the feeling he wants to leave open the option of upgrading to tri's down the line, without having to replace the terminal...

Look carefully. I linked to the TRIPLE serial Terminal/Bridge. When you upgrade to tri-fire, there's no need to replace the Terminal/Bridge.


----------



## taem

Ok, something odd just happened. I have a Powercolor PCS+ 290.
Was gaming, everything fine, and then the system made a loud screeching sound and the screen went blue with a WHEA_UNCORRECTABLE_ERROR.

Now the gpu is idling at 33c core, 26c vrm1, 27c vrm2.

At current ambient, gpu up til now idles at 28-29c core, 18-19c vrms.

Checked the card and nothing seems wrong physically. But even leaving it off for a while, temps are not going down to where they used to be.

Any insight? Did something I can't see past the heatsink explode lol?


----------



## SeanEboy

Quote:


> Originally Posted by *kizwan*
> 
> Look carefully. I linked to the TRIPLE serial Terminal/Bridge. When you upgrade to tri-fire, there's no need to replace the Terminal/Bridge.


Yeah, I couldn't read it.. Awesome, nice work.


----------



## pkrexer

Quote:


> Originally Posted by *taem*
> 
> Ok something odd just happened. I have a Powercolor PCS+ 290
> Was gaming, everything fine, and then the system made a loud screeching sound and the screen went blue with a WHEA_UNCORRECTABLE_ERROR.
> 
> Now the gpu is idling at 33c core, 26c vrm1, 27c vrm2.
> 
> At current ambient, gpu up til now idles at 28-29c core, 18-19c vrms.
> 
> Checked the card and nothing seems wrong physically. But even leaving it off for a while, temps are not going down to where they used to be.
> 
> Any insight? Did something I can't see past the heatsink explode lol?


WHEA_UNCORRECTABLE_ERROR is usually something I see when my CPU overclock isn't stable. Are you overclocking your i5?

However, I'm not really sure why your GPU temps would have increased. If you look at GPU-Z, do all your voltage readings look normal? It's possible things are just still heated up from gaming and need a good rest.


----------



## sterob

I am having a very weird problem with my Sapphire Tri-X 290.

Basically, I use MSI Afterburner to adjust core/mem. I noticed I am stuck at 830 kH/s no matter what core/mem I use. Then I checked with GPU-Z and noticed something strange. On the "Graphics Card" tab, the GPU clock and memory show exactly the core/mem I set in Afterburner (for instance 1050/1500). However, on the "Sensors" tab the GPU core clock tops out at 932. Even when I push the core clock in Afterburner to 980, 1000, 1020 or 1050, the core clock never goes above 932, and in some cases (1000) it drops to 900. Using --gpu-engine in cgminer does nothing.


----------



## disintegratorx

That's frikin Windows 8.1, dude. You'll now have to use your disk to boot back into your system; it's in the 'Repair' options. Very inconvenient, I know. And when you get back into your OS, make sure to check your boot settings in msconfig, because Win 8.1 puts anything it feels like in its boot options.

Other than that, no, probably not. Just check that the fans on the GPU are all working.


----------



## Abyssic

Quote:


> Originally Posted by *sterob*
> 
> I am having a very weird problem with my Sapphire Tri-X 290.
> 
> Basically, I use MSI Afterburner to adjust core/mem. I noticed I am stuck at 830 kH/s no matter what core/mem I use. Then I checked with GPU-Z and noticed something strange. On the "Graphics Card" tab, the GPU clock and memory show exactly the core/mem I set in Afterburner (for instance 1050/1500). However, on the "Sensors" tab the GPU core clock tops out at 932. Even when I push the core clock in Afterburner to 980, 1000, 1020 or 1050, the core clock never goes above 932, and in some cases (1000) it drops to 900. Using --gpu-engine in cgminer does nothing.


power slider: max.


----------



## Paul17041993

Quote:


> Originally Posted by *taem*
> 
> Ok something odd just happened. I have a Powercolor PCS+ 290
> Was gaming, everything fine, and then the system made a *loud screeching sound* and the screen went blue with a WHEA_UNCORRECTABLE_ERROR.
> 
> Now the gpu is idling at 33c core, 26c vrm1, 27c vrm2.
> 
> At current ambient, gpu up til now idles at 28-29c core, 18-19c vrms.
> 
> Checked the card and nothing seems wrong physically. But even leaving it off for a while, temps are not going down to where they used to be.
> 
> Any insight? Did something I can't see past the heatsink explode lol?


might want to check your PSU voltages. Did it sound like a choke or more mechanical? Could be your pump acting up...

oh wait, you're air cooled. Those are pretty good temps for air, but check your fan RPMs?


----------



## webhito

Quote:


> Originally Posted by *kizwan*
> 
> For two, these are all you need for the GPU's:-
> 
> 
> If you get the "Original CSQ" block, then you need an EK-FC Bridge (Parallel+BLANK or Serial+BLANK), not a Terminal. The "EK-FC Terminal BLANK Parallel" in the picture is for parallel. If you get "triple serial" instead, you'll need the "EK-FC Terminal BLANK Serial". The backplate is optional, but get one if you can afford it because it provides passive cooling.


I got the backplates also; however, the terminal I purchased is a different one. I don't plan on going beyond 2 cards, so it should suffice. http://www.frozencpu.com/products/23384/ex-blc-1685/EK_Terminal_Block_-_Dual_Serial_3-Slot_-_Black_Acetal_EK-FC_Terminal_DUAL_Serial_3-Slot.html

Reason I went 3-slot is because I have a Rampage IV Black Edition board.


----------



## sterob

Quote:


> Originally Posted by *Abyssic*
> 
> power slider: max.


I already maxed both core voltage and power limit, but I am still stuck at 932 MHz.


----------



## Abyssic

Quote:


> Originally Posted by *sterob*
> 
> I already maxed both core voltage and power limit, but I am still stuck at 932 MHz.


curious. maybe a reading error.


----------



## kizwan

Quote:


> Originally Posted by *webhito*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> For two, these are all you need for the GPU's:-
> 
> 
> If you get the "Original CSQ" block, then you need an EK-FC Bridge (Parallel+BLANK or Serial+BLANK), not a Terminal. The "EK-FC Terminal BLANK Parallel" in the picture is for parallel. If you get "triple serial" instead, you'll need the "EK-FC Terminal BLANK Serial". The backplate is optional, but get one if you can afford it because it provides passive cooling.
> 
> 
> 
> 
> 
> 
> 
> I got the backplates also, however the slot I purchased is a different one, I do not plan on going more than 2 cards so it should suffice. http://www.frozencpu.com/products/23384/ex-blc-1685/EK_Terminal_Block_-_Dual_Serial_3-Slot_-_Black_Acetal_EK-FC_Terminal_DUAL_Serial_3-Slot.html
> 
> Reason I went 3 slot is because I have a rampage iv black edition board.

That terminal won't work for you. You need Triple Serial Terminal.

The Dual Serial 3-Slot is only suitable if your PCIe slots are configured like this:-


Carefully compare the above PCIe slots configuration with RIV Black Edition:-


They don't match.

The EK-FC Terminal Triple Serial will match with your RIV Black Edition:-


----------



## taem

Quote:


> Originally Posted by *Paul17041993*
> 
> might want to check your PSU voltages, did it sound like a choke or more mechanical? could be your pump acting up...
> 
> oh wait you're air cooled, those are pretty good temps for air, but check your fan RPMs?


Still scratching my head. At idle this GPU is pretty much at ambient for the VRMs and a degree or two higher for the core, and has been ever since I got it. Now it's idling at 3-4 degrees above ambient. The difference is so small it could be nothing, but because it happened right after that screeching noise I'm all paranoid. Maybe it's just hot air built up in my case since I was gaming a few hours? Hope so. PSU voltages are the same as they were, I think; all those numbers hover around a bit so it's hard to say.

I'm upset my CPU crashed.







Been running rock solid at 4.7, 1.275V since October; this is the first WHEA error since I finalized my settings. I'm not going to spend days again tweaking those settings and running tests, so I guess I have to downclock to 4.6.


----------



## rv8000

Curse my impulse buying







Powercolor r9 290 PCS+ on the way







. Looks to be very promising overall.


----------



## pkrexer

Quote:


> Originally Posted by *rv8000*
> 
> Curse my impulse buying
> 
> 
> 
> 
> 
> 
> 
> Powercolor r9 290 PCS+ on the way
> 
> 
> 
> 
> 
> 
> 
> . Looks to be very promising overall.


You chose wisely


----------



## webhito

Quote:


> Originally Posted by *kizwan*
> 
> That terminal won't work for you. You need Triple Serial Terminal.
> 
> The Dual Serial 3-Slot only suitable if PCIe slots are configured like this:-
> 
> 
> Carefully compare the above PCIe slots configuration with RIV Black Edition:-
> 
> 
> They don't match.
> 
> The EK-FC Terminal Triple Serial will match with your RIV Black Edition:-


Seems my post was deleted.

Anyways, FrozenCPU had no problem swapping it out; took them less than 5 mins. One question, however: the first one you linked me was parallel and the second was serial. Is there any difference worth mentioning, or none at all?


----------



## Ouro

Are there any waterblocks compatible with the MSI Twin Frozr 290? With the release of the 290X Lightning I'm hoping there might or will be...


----------



## ZealotKi11er

Does Power Target work with the 14.2 V1.3 drivers? It's not working for me. I tried CCC and GPU Tweak.


----------



## Talon720

So now G-Sync is coming out for Nvidia, and it only works in fullscreen mode, while SLI supposedly also works in windowed mode. It kinda makes CrossFire seem behind. I really like the 290X, but it's kinda a bummer... I guess we have FreeSync, if and when it ever makes it out.


----------



## taem

Quote:


> Originally Posted by *Talon720*
> 
> So now with gsync coming out for nvidia and it only working in fullscreen mode. Sli supposedly also works in windowed mode. It kinda makes x-fire seem behind. I really like the 290x, but its kinda a bummer... I guess we have freesync if and when it ever makes it out


No doubt G-Sync sounds great, but I wonder about the practical impact. I want a constant 60 fps; I choose my settings accordingly and turn v-sync on. Would G-Sync improve on this? I don't know all the technical details, but I assume not.

I guess it's great if you want to go over 60 fps or avoid refresh-rate stutter below 60. That would help a lot in games where establishing a minimum 60 fps is pretty near impossible, like Metro LL.

I think I prefer the higher memory bus and more VRAM of the AMD offerings over G-Sync, for now. Plus I don't want to bleed cash for a G-Sync display when my Korean IPS is so excellent, especially when I run 2 of them.


----------



## bbond007

Quote:


> Originally Posted by *Talon720*
> 
> So now with gsync coming out for nvidia and it only working in fullscreen mode. Sli supposedly also works in windowed mode. It kinda makes x-fire seem behind. I really like the 290x, but its kinda a bummer... I guess we have freesync if and when it ever makes it out


Yes, SLI does indeed also work windowed.

Not only that, but I have noticed that if you switch out of heaven benchmark to the desktop and back to heaven again, crossfire is disabled and not restored and frame-rate is cut in half.

I'm also having an issue where sound pops and cracks if vsync is enabled with crossfire.


----------



## Widde

What does unofficial overclocking mode do? And if one were to enable it, which of the 2 options is better?


----------



## dropxo

I'm trying to overclock my 2x Gigabyte 290s. They're cooled with EK waterblocks; temps aren't going above 50C for the cores and VRMs.
Current clocks I'm playing with are 1230/1600, using the extra +200mV .bat file, which puts a stable 1.314V on load, or 1.41 before droop I think.
However, above 1200 my clocks throttle back to ~800-900, or one clocks at 1230 and then they alternate.
Any ideas how to eliminate the downclocking? Just trying to do some Valley benches for now.
CCC Overdrive is disabled, it shows my power limit is still at +50%, and the clocks change after the AB 3.0.0 Beta 18 clocks are applied.

Edit: having issues with downclocking at most voltages/clocks.


----------



## ZealotKi11er

Quote:


> Originally Posted by *dropxo*
> 
> I'm trying to overclock my 2x gigabyte 290's they're cooled using EK wb, temps aren't going above 50deg for cores and vrms.
> Current clocks im playing with are 1230 /1600, using the extra +200mV .bat file, which puts a stable 1.314 V on load or 1.41 before droop i think.
> However above 1200 my clocks are throttling back to ~800-900 or one clocks at 1230 then they alternate.
> Any ideas how to eliminate the down clocking, just trying to do some valley benches for now.
> CCC overdrive is disabled, it shows my power limit is still at +50% and the clocks change after AB 3.0.0 Beta 18 clocks are applied.
> 
> Edit: having issues with downclocking at most voltages / clocks.


Same here. My card downclocks even with 50% Power Target. I am using 14.2. I have been asking for 2 days and nobody has an answer.


----------



## The Mac

Power target is broken in 14.2

If you go back to 13.12, it works fine.


----------



## ZealotKi11er

Quote:


> Originally Posted by *The Mac*
> 
> Power target is broken in 14.2
> 
> If you go back to 13.12, it works fine.


Thanks. I figured as much.


----------



## primal92

Hello everyone,

Just like to share my experience with my R9 290. I bought one second hand (3 months old): an MSI 290 with the stock reference cooler, which was pretty loud and useless at cooling compared to the DCUII cooler on my 7970. It overclocked okay, with a core clock of 1050 and 1450 on the memory (Hynix). However, one thing that drove me up the wall was the coil whine!







First time I've ever had to deal with it; all my previous cards since the Radeon X1300 Pro days never had such a thing.

Luckily the seller was a friend and helped me RMA it, and for a small fee I upgraded to the DCUII version of the card, having had ASUS cards since my GTX 480s and never any issues with them. This card clocks much better as well, with a core clock of 1100 and a memory clock of 1475 (even though it has the supposedly inferior Elpida chips), all at stock volts. With a bit of voltage tweaking I'm hoping to hit 1200, but even at 1100 none of my games go below 60 fps on max settings. I actually appreciate how these cards need some seriously good case airflow to keep them cool even with aftermarket coolers; I had to install a fan on the case floor to blow directly on the card to keep it from going back to the usual 90-95C the reference cooler was doing. Now it's in the 70-80C range when OC'd, with no throttling









As much as the ASUS DCUII cards get a lot of crap for falling behind other vendors when it comes to getting a 'golden chip', or for needing their own custom waterblock, they do make solid, silent, good-looking cards.

Hopefully this will help anyone who is thinking of getting an Asus 290.









Sorry if this post may seem pointless to anyone.


----------



## Paul17041993

Quote:


> Originally Posted by *Talon720*
> 
> So now with gsync coming out for nvidia and it only working in fullscreen mode. Sli supposedly also works in windowed mode. It kinda makes x-fire seem behind. I really like the 290x, but its kinda a bummer... I guess we have freesync if and when it ever makes it out


variable framerate is actually a very bad thing; it causes a loss in accuracy and makes physics behave very kooky and otherwise horrible...

there's a reason why AMD never bothered with the variable refresh rate tech they had on laptops a while back: it's just not as good as it sounds.
Quote:


> Originally Posted by *bbond007*
> 
> Yes, SLI does indeed also work windowed.
> 
> Not only that, but I have noticed that if you switch out of heaven benchmark to the desktop and back to heaven again, crossfire is disabled and not restored and frame-rate is cut in half.
> 
> I'm also having an issue where sound pops and cracks if vsync is enabled with crossfire.


the sound bug is half known and AMD is getting it fixed sometime soon; the minimize bug with crossfire is probably in the same boat. With Mantle and all, the 14.x beta drivers are a clustercuss of everything and don't work very well in a lot of games.


----------



## Widde

Quote:


> Originally Posted by *Paul17041993*
> 
> variable framerate is actually a very bad thing, it causes loss in accuracy and makes physics behave very kooky and otherwise horrible...
> 
> there's a reason why AMD never bothered with the variable refresh rate tech they had on laptops some while back, its just not as good as it sounds.
> the sound bug is half known and AMD are getting it fixed sometime soon, the minimize bug on crossfire is probably in the same boat, just that with mantle and all the 14.x beta drivers are a clustercuss of everything and don't work very good in a lot of games.


Is it the sound stutter that comes from alt-tabbing out of a fullscreen game?







At least that's what I have right now; I had it with 13.12 as well. And my microphone seems to get stuttery on VoIP programs too when I tab


----------



## phallacy

Quote:


> Originally Posted by *The Mac*
> 
> Power target is broken in 14.2
> 
> If you go back to 13.12, it works fine.


Quote:


> Originally Posted by *ZealotKi11er*
> 
> Thanks. I figured as much.


Is there a post about this? I've been on 14.2 since the weekend and I haven't noticed any downclocking..? Using trixx


----------



## pkrexer

Quote:


> Originally Posted by *phallacy*
> 
> Is there a post about this? I've been on 14.2 since the weekend and I haven't noticed any downclocking..? Using trixx


Are you increasing your core voltage? It happens to me if I go over +70mv on my core.


----------



## phallacy

Quote:


> Originally Posted by *pkrexer*
> 
> Are you increasing your core voltage? It happens to me if I go over +70mv on my core.


Yeah, they range from 75-86 for the cards.


----------



## Forceman

Quote:


> Originally Posted by *Paul17041993*
> 
> variable framerate is actually a very bad thing, it causes loss in accuracy and makes physics behave very kooky and otherwise horrible...


You do realize that any time you have Vsync off you have variable frame rates, right? Just because the monitor isn't showing it doesn't mean the GPU isn't rendering it.


----------



## Roboyto

Quote:


> Originally Posted by *primal92*
> 
> As much as the ASUS DCUII cards get a lot of crap for falling behind other vendors when it comes to getting a 'golden chip', or for needing their own custom waterblock, they do make solid, silent, good-looking cards.
> 
> Hopefully this will help anyone who is thinking of getting an Asus 290.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sorry if this post may seem pointless to anyone.


When did you purchase your card? Could be possible that ASUS has fixed the issue with the cooler since they first were released.


----------



## Paul17041993

Quote:


> Originally Posted by *Forceman*
> 
> You do realize that any time you have Vsync off you have variable frame rates, right? Just because the monitor isn't showing it doesn't mean the GPU isn't rendering it.


obviously, but anything above your framerate either is or isn't a concern depending on the engine; physics can behave very oddly if the deltatime goes very high or very low, and most engines are built around vsync or a locked rate close to it.

excessively variable cycle rates can also cause thread-sync misses, causing all sorts of lag and stutter issues.
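The deltatime point above is why engines decouple physics from rendering. A minimal sketch (hypothetical, not from any particular engine) of the fixed-timestep accumulator pattern that keeps physics deterministic even when the render frame rate varies:

```python
# Fixed-timestep accumulator pattern (illustrative sketch, not from any
# specific engine): render frames take variable time, but physics always
# advances in constant FIXED_DT steps, so simulation accuracy does not
# depend on the frame rate.

FIXED_DT = 1.0 / 60.0  # physics tick length: 60 Hz

def physics_ticks(frame_times):
    """Count physics steps run over a sequence of frame durations."""
    accumulator = 0.0
    ticks = 0
    for frame_dt in frame_times:
        accumulator += frame_dt
        # Drain the accumulated wall time in fixed-size steps.
        while accumulator >= FIXED_DT:
            ticks += 1              # one deterministic physics update
            accumulator -= FIXED_DT
    return ticks

# Steady 60 fps for two seconds vs. one second at 30 fps followed by one
# second at 120 fps: same total time, same number of physics ticks.
steady = [1.0 / 60.0] * 120
jittery = [1.0 / 30.0] * 30 + [1.0 / 120.0] * 120
```

With a naive variable timestep instead, each physics update would integrate over a different dt, which is where the accuracy loss described above comes from.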


----------



## taem

Quote:


> Originally Posted by *Paul17041993*
> 
> variable framerate is actually a very bad thing, it causes loss in accuracy and makes physics behave very kooky and otherwise horrible...


I knew there was a reason I prefer v-sync at 60 fps settings over what G-Sync offers, like I was saying earlier. If it's not too action-heavy you can go for 30 fps v-sync too.
Quote:


> the minimize bug on crossfire is probably in the same boat, just that with mantle and all the 14.x beta drivers are a clustercuss of everything and don't work very good in a lot of games.


Is that really a bug, though? AMD or Nvidia, if I alt-tab out of a game, there's a fair chance I won't be able to get back into that game; I'll have to force quit and restart. This is why I choose to run Raptr, for the in-game control panel, browser, email, etc. I also have my pad or phone on hand. What I don't do is alt-tab out of a game, ever. Has nothing to do with AMD drivers for me.


----------



## Durquavian

Quote:


> Originally Posted by *Paul17041993*
> 
> obviously, but anything above your framerate either is or isn't a concern, depending on the engine, physics can behave very odd if the deltatime goes very high or very low, most engines are based for vsync or a locked rate close to it.
> 
> excessively variable cycle rates can also cause thread sync misses, causing all sorts of lag and stutter issues.


This is interesting and likely why G-Sync was done on a per-game basis. But be careful, this topic causes a lot of passionate responses.


----------



## Paul17041993

Quote:


> Originally Posted by *taem*
> 
> Amd or nvidia, if I alt tab out of a game, there's a fair chance I won't be able to get back in that game, I'll have to force quit and restart.


oh yeah, that's a half-common bug with some games/engines; Skyrim for example can lose proper mouse control if you alt-tab/minimise. But the crossfire thing is something like the drivers thinking the program was closed, so the profile doesn't get re-loaded because it's already open when you try to play again, or at least something similar to that.


----------



## primal92

Quote:


> Originally Posted by *Roboyto*
> 
> When did you purchase your card? Could be possible that ASUS has fixed the issue with the cooler since they first were released.


I got it just 2 days ago. With my card the red accents are actually anodised metal, unlike the earlier cards that came with the two sets of stickers. However, I haven't taken the cooler off to check the contact it's making with the GPU die; since there is a warranty sticker on one of the retention screws I won't be checking any time soon.


----------



## Roboyto

Quote:


> Originally Posted by *primal92*
> 
> I got it just 2 days ago, with my card the red accents are actually anodised metal unlike the earlier cards that came with the two sets of stickers. However haven't taken the cooler off to check the contact its making with the GPU die, since there is a warranty sticker on one of the retention screws I won't be checking any time soon.


Removing the cooler won't void your warranty, just an FYI.


----------



## MrWhiteRX7

BF4 with latest patch running mantle on 14.2 beta driver.

1440p maxed the hell out (2xMSAA though because of the mem leak mantle issue)

Everything in green is above 165fps

Everything in yellow is between 140fps and 165fps

The two red dots are between 135fps and 140fps lol

Shanghai map, 64p server but only around 55 people in there on average.



Love this program @RagingCain !!!
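For anyone curious how the colour coding above maps to numbers, here is a tiny hypothetical re-creation of the bucketing, with thresholds taken from the post (the real analyzer's internals may differ):

```python
# Hypothetical re-creation of the frame-rate colour buckets described in
# the post above; the actual BF4 Frame Time Analyzer may implement this
# differently.

def colour_bucket(fps):
    """Map an instantaneous frame rate to the post's colour scale."""
    if fps > 165:
        return "green"   # above 165 fps
    if fps >= 140:
        return "yellow"  # 140-165 fps
    return "red"         # below 140 fps (worst observed: 135-140)

def frame_time_to_fps(ms):
    """A frame *time* in milliseconds converts to fps as 1000 / ms."""
    return 1000.0 / ms
```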


----------



## The Mac

Quote:


> Originally Posted by *phallacy*
> 
> Is there a post about this? I've been on 14.2 since the weekend and I haven't noticed any downclocking..? Using trixx


you are probably within your power envelope; it's only going to show up if you exceed the stock TDP.

posters on a bunch of other sites have confirmed the issue.


----------



## kizwan

Quote:


> Originally Posted by *webhito*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> That terminal won't work for you. You need Triple Serial Terminal.
> 
> The Dual Serial 3-Slot only suitable if PCIe slots are configured like this:-
> 
> 
> Carefully compare the above PCIe slots configuration with RIV Black Edition:-
> 
> 
> They don't match.
> 
> The EK-FC Terminal Triple Serial will match with your RIV Black Edition:-
> 
> 
> 
> 
> 
> 
> 
> 
> Seems my post was deleted
> 
> Anyways, frozencpu had no problem swapping it out, took them less than 5 mins. One question however, first one you linked me was a parallel one, the second was a serial, is there any difference worth mentioning or none at all?

The difference between serial & parallel is temperature. The standard/typical config is serial, and in serial the second GPU usually runs hotter than the first because the coolant enters the first water block before the second. In parallel, the coolant is distributed evenly across both water blocks, so the difference between the blocks is only a couple of degrees.

BTW, don't forget to get EK-FC Terminal BLANK Serial too if you're getting the Terminal Triple Serial.

Also, you'll probably want this for cooling VRM1. The stock EK thermal pad can't cool VRM1 properly when overclocked & overvolted. The stock EK pad is fine if you're always running your GPUs at stock clocks, though.
http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html
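To put rough numbers on the serial-vs-parallel point, here is a back-of-the-envelope coolant energy balance. The per-GPU wattage and pump flow figures are assumed for illustration only, not measurements:

```python
# Back-of-the-envelope coolant energy balance for two blocks plumbed in
# serial vs. parallel. The 300 W per-GPU heat load and 0.02 kg/s pump
# flow are assumed illustrative figures, not measured values.

WATER_HEAT_CAP = 4186.0  # specific heat of water, J/(kg*K)

def coolant_rise(watts, flow_kg_s):
    """Temperature rise of coolant after absorbing `watts` of heat."""
    return watts / (flow_kg_s * WATER_HEAT_CAP)

GPU_HEAT = 300.0   # W dumped into the loop per card (assumed)
TOTAL_FLOW = 0.02  # kg/s total pump flow, roughly 1.2 L/min (assumed)

# Serial: all flow passes GPU1 first, so GPU2's inlet water is already
# warmer by the full rise picked up in GPU1's block.
serial_gpu2_inlet_penalty = coolant_rise(GPU_HEAT, TOTAL_FLOW)

# Parallel: each branch carries half the flow but only its own GPU's
# heat; both blocks see the same loop-temperature inlet water.
parallel_inlet_penalty = 0.0

print(f"serial GPU2 inlet penalty: {serial_gpu2_inlet_penalty:.1f} C")
print(f"parallel inlet penalty:    {parallel_inlet_penalty:.1f} C")
```

This also hints at Red1776's caveat below: splitting the flow halves each parallel branch's flow rate, which doubles each block's own outlet rise, so parallel only pays off when the pump can keep total flow high.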


----------



## Forceman

Quote:


> Originally Posted by *kizwan*
> 
> Also you'll need this for cooling VRM1. The stock EK thermal pad can't cool VRM1 properly when overclocked.
> http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html


The stock pads are adequate to cool the VRMs as long as you don't demand very low temps. My VRM1 only gets to 65C even with 1200/+150mV.


----------



## Tobiman

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> BF4 with latest patch running mantle on 14.2 beta driver.
> 
> 1440p maxed the hell out (2xMSAA though because of the mem leak mantle issue)
> 
> Everything in green is above 165fps
> 
> Everything in yellow is between 140fps and 165fps
> 
> The two red dots are between 135fps and 140fps lol
> 
> Shanghai map, 64p server but only around 55 people in there on average.
> 
> 
> 
> Love this program @RagingCain !!!


What program is that?


----------



## Forceman

Quote:


> Originally Posted by *Tobiman*
> 
> What program is that?


The BF4 Frame Time Analyzer that RagingCain made.

http://www.overclock.net/t/1469627/bf4fta-battlefield-4-frame-time-analyzer-version-4-1-released-minor-version


----------



## Red1776

also keep in mind that while parallel cooling is superior, it takes a lot more flow/pressure to make sure the waterway is constantly full with no 'dry spots'; the more GPUs, the more critical this becomes. I used a triple D5 pump setup for my 4 x 7970 machine and am using 3-4 pumps on the 4 x R9 290X build I'm currently putting together.


----------



## kizwan

Quote:


> Originally Posted by *Forceman*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Also you'll need this for cooling the VRM1. The stock EK thermal pad unable to cool VRM1 properly when overclocked.
> http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html
> 
> 
> 
> 
> 
> 
> 
> The stock pads are adequate to cool the VRMs as long as you don't demand very low temps. My VRM1 only gets to 65C even with 1200/+150mV.

True, but it's just silly to have VRM1 running in the 60s Celsius when water cooling. The VRMs on my previous AMD GPU ran cool in the 40s under water with an EK water block. With a low ambient (indoor) temp the max may be in the 60s, but if ambient is higher than 30C, VRM1 can go up to the 80s when overclocked & overvolted, even under water. The 80s Celsius is well within spec and safe, but I prefer running them at lower temps if possible.
Quote:


> Originally Posted by *Red1776*
> 
> *also keep in mind that while parallel cooling is superior*, it does take a lot more flow/pressure to make sure that the waterway is constantly full with no 'dry spots' the more GPU's the more critical this becomes. I used a triple D5 pump setup for my 4 x7970 machine and am using 3-4 on the 4 x R2920x build am currently building.


I'm sure many people will disagree with that, but I'll leave it to the experts.

I'm late for an appointment. Buh-bye!


----------



## Forceman

Quote:


> Originally Posted by *kizwan*
> 
> True but it just silly to have VRM1 running @60s Celsius when water cooling. VRMs on my previous AMD GPU running cool in 40s when water cooling with EK water block. If low ambient (indoor) temp, the max temp maybe @60s Celsius but if ambient is higher than 30C, when overclocked & overvolt, VRM1 can go up to 80s Celsius even when water cooling. 80s Celsius is well below & safe according to specification but I prefer running them at lower temp if possible & not to forget kinda silly to have VRMs go over 60C when water cooling.


I guess it just depends on what you consider silly. I think it is silly to spend $20 just to take 15C off the already very safe 60C VRM temps. Other people would gladly pay twice that - it just depends where your priorities are and what your current situation is. But either way, you don't *need* to replace the stock pads, it's optional.


----------



## Red1776

If you are running two GPUs you will not see a big difference in temps, but running 3-4 GPUs there is a sizeable difference.

In my 7970 quad, there is an 11C difference between serial and parallel.


----------



## Sgt Bilko

I think I found my max overclock for now.

The PSU tripped with my 8350 at 5.168GHz @ 1.69V and my 290s clocked at 1250/1550 with +175mV in TriXX. I didn't have my voltage meter plugged in at the time (wish I did), but that means it went well over 1200W.


----------



## kizwan

Quote:


> Originally Posted by *Forceman*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> True but it just silly to have VRM1 running @60s Celsius when water cooling. VRMs on my previous AMD GPU running cool in 40s when water cooling with EK water block. If low ambient (indoor) temp, the max temp maybe @60s Celsius but if ambient is higher than 30C, when overclocked & overvolt, VRM1 can go up to 80s Celsius even when water cooling. 80s Celsius is well below & safe according to specification but I prefer running them at lower temp if possible & not to forget kinda silly to have VRMs go over 60C when water cooling.
> 
> 
> 
> 
> 
> 
> 
> I guess it just depends on what you consider silly. I think it is silly to spend $20 just to take 15C off the already very safe 60C VRM temps. Other people would gladly pay twice that - it just depends where your priorities are and what your current situation is. But either way, you don't *need* to replace the stock pads, it's optional.

Yeah I agree with you 60s or 80s Celsius pretty much safe but I'm not talking what safe. I'm talking what you *should* get when water cooling. If you already spends a lot of money for water cooling, it's kinda silly if VRMs still running at that high temp. Might as well stick with stock reference cooler because it does did a great job in cooling the VRMs.

Yeah, my bad for saying "need". "You might want" is more appropriate. Seriously, VRMs above the 60s Celsius with water cooling is just silly, unless you're water cooling just for aesthetics, which is fine with me.
Quote:


> Originally Posted by *Red1776*
> 
> If you are running two GPU's you will not see a big difference in temps, but running 3-4 GPU's there is a sizeable difference in temps.
> 
> 
> In my 7970 quad, there is an 11c difference between serial vs parallel


For quad, it's better to run "hybrid" (parallel-serial-parallel), though.


----------



## Paul17041993

Quote:


> Originally Posted by *kizwan*
> 
> For quad, better run "hybrid" (parallel-serial-parallel) though.


Parallel >> series or hybrid/mix on 3 or 4 290Xs. You need a LOT of water flowing through them, for which only parallel is really adequate; otherwise one end of your cards will end up hot because the water doesn't have enough (or any) thermal capacity left.

If your CPU, RAM and/or NB blocks aren't too restrictive, though, you can run the rest of the system all in series, like @Red1776 did for his 7970 quadfire (CPU + GPUs only), BUT you would need a radiator between these and the cards; the cards need cool water.


----------



## Red1776

Right, full parallel yields the best results, but as I said, it takes a lot of flow to do. I am running full parallel with my new quad R9 290X rig with 4 pumps and 5 rads.


----------



## kizwan

Quote:


> Originally Posted by *Paul17041993*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> For quad, better run "hybrid" (parallel-serial-parallel) though.
> 
> 
> 
> 
> 
> 
> 
> Parallel >> series or hybrid/mix on 3 or 4 290Xs. You need a LOT of water flowing through them, for which only parallel is really adequate; otherwise one end of your cards will end up hot because the water doesn't have enough (or any) thermal capacity left.
> 
> If your CPU, RAM and/or NB blocks aren't too restrictive, though, you can run the rest of the system all in series, like @Red1776 did for his 7970 quadfire (CPU + GPUs only), BUT you would need a radiator between these and the cards; the cards need cool water.
Click to expand...

I'm not an expert on this subject. I only read about it in the OCN water cooling club thread. Read it here & here. It seems "hybrid" is the best practice for quad GPU.


----------



## Red1776

Well, it comes with the caveat that there are many variables to take into account. I have tried all three methods, after extensive conversations with B-Negative and jow at FCPU (who has been in contact with Skinnee Labs), and parallel yielded the best results for me. I came to the conclusion that the point of the hybrid option is to maintain pressure, to avoid the air that gets into the hard-to-fill, wide surface area found in all full-cover blocks. I employ a tremendous amount of flow/pressure, with multiple pumps running at the correct speed.

I would not try to spend time talking anyone else into it, but it yields the best results for me.


----------



## kizwan

Quote:


> Originally Posted by *Red1776*
> 
> Well, it comes with the caveat that there are many variables to take into account. I have tried all three methods, after extensive conversations with B-Negative and jow at FCPU (who has been in contact with Skinnee Labs), and parallel yielded the best results for me. I came to the conclusion that the point of the hybrid option is to maintain pressure, to avoid the air that gets into the hard-to-fill, wide surface area found in all full-cover blocks. I employ a tremendous amount of flow/pressure, with multiple pumps running at the correct speed.
> *I would not try to spend time talking anyone else into it*, but it yields the best results for me.


Haha. Sorry if I seemed to be arguing with you about your parallel config with quad GPUs or challenging your knowledge in any way. That's not my intention because, like I said, I'm not an expert on this subject. I only mentioned the information I read at the OCN water cooling club. I don't just believe everything I read in forums, but I always take an interest in any information posted by reputable members like BNEG & Darlene.

If I go quad, I'll definitely go "hybrid".


----------



## Red1776

Hehe, not at all. That's what the forums are for: an exchange of ideas. Good luck with the project  Let me know how it turns out for you; there aren't many of us in the quad club


----------



## kizwan

Quote:


> Originally Posted by *Red1776*
> 
> hehe, not at all. That's what the forums are for, an exchange of ideas. Good luck with the project
> 
> 
> 
> 
> 
> 
> 
> let me know how it turns out for you, not many of us in the quad club


Not any time soon, unfortunately, but I'm definitely going to try quad.







Currently going to "upgrade" my motherboard to either the Rampage IV Extreme, the Rampage IV Black Edition, or the ASRock X79 Extreme 11. Something is wrong with my current motherboard, though; annoyingly, it's an intermittent problem.


----------



## Red1776

What exactly is happening with your MB?


----------



## Widde

Gonna tear my Antec Three Hundred a new one soon







Gonna add an intake below the PSU, either 1 or 3 extra 120mm fan holes in the side for intake, a third intake in the front DVD drive bay, and dust filters







Running without the side panel atm to let the cards "breathe"







Will shop for a better case when I upgrade ^^


----------



## battleaxe

Quote:


> Originally Posted by *Red1776*
> 
> hehe, not at all. That's what the forums are for, an exchange of ideas. Good luck with the project
> 
> 
> 
> 
> 
> 
> 
> let me know how it turns out for you, not many of us in the quad club


Other than the ridiculous amount of power available and the subsequently high FPS, what's the benefit of a quad setup? Do they run smoother for some reason or something? Or is absolute power the main benefit?

Edit: the amount something like this must weigh... man... it's gotta be over 100 lbs. My PC weighs about 60 lbs, so I can't imagine what something like this would tip the scales at.

I take my big computer to LAN parties sometimes just 'cause I like having it there by my side instead of my portable rig. "Hey guys, can you help me get my quadfire rig in the door? I need a couple of strong men. It's only 140 lbs, after all."


----------



## stilllogicz

Quote:


> Originally Posted by *Red1776*
> 
> right, full parallel yields the best results, but as I said it takes a lot of flow to do. I am running full parallel with my new quad R290x rig with 4 pumps and 5 rads.


4 pumps as in 4 pumps in serial? Ever thought about doing a dual loop?


----------



## kizwan

Quote:


> Originally Posted by *Red1776*
> 
> What exactly is happening with your MB?


My motherboard refuses to turn on. It has already happened twice now: out of nowhere it refused to turn on, but the next day it turned on just fine. When it works, it works just fine. I thought about RMA'ing it, but they'd probably tell me my motherboard is working just fine. Getting another motherboard is much easier, I think.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> Not any time soon unfortunately but definitely going to try quad.
> 
> 
> 
> 
> 
> 
> 
> Currently going to "upgrade" my current motherboard to either Rampage IV Extreme or Rampage IV Black Edition or Asrock x79 Extreme 11. *Something wrong with my current motherboard though, annoyingly it's intermittent problem.
> 
> 
> 
> 
> 
> 
> 
> *


NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO
















The R4F will do it nicely, with less outlay and tonnes of overvolting capacity


----------



## phallacy

Quote:


> Originally Posted by *The Mac*
> 
> You are probably within your power envelope; it's only going to show up if you exceed the stock TDP.
> 
> A bunch of other sites have had posters confirm the issue.


Thank you for the info. I was actually able to recreate the throttling, so it is evident even for me. It took about 90 minutes though; I just haven't played games for more than an hour in the past few weeks, so that's why it never occurred. I left Far Cry 3 running on my computer while I caught up on some work, and sure enough, about an hour and a half later my clocks were coming down and started bouncing between 900 and the max clocks I set. Very weird. I guess it is back to 13.12. And I was really beginning to like 14.2 too = /


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Not any time soon unfortunately but definitely going to try quad.
> 
> 
> 
> 
> 
> 
> 
> Currently going to "upgrade" my current motherboard to either Rampage IV Extreme or Rampage IV Black Edition or Asrock x79 Extreme 11. *Something wrong with my current motherboard though, annoyingly it's intermittent problem.
> 
> 
> 
> 
> 
> 
> 
> *
> 
> 
> 
> 
> 
> 
> NOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOOO
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> R4F will do it nicely with less outlay with tonnes of overvolting capacity
Click to expand...

You didn't tell Murphy where I live, did you?


----------



## webhito

Quote:


> Originally Posted by *kizwan*
> 
> The difference between Serial & Parallel is temp. The standard/typical config is Serial, and in Serial the second GPU usually runs hotter than the first because the water/coolant enters the first water block and then the second. In Parallel, the water/coolant is distributed evenly across both water blocks; this way the difference between the blocks is only a couple of degrees.
> 
> BTW, don't forget to get EK-FC Terminal BLANK Serial too if you're getting the Terminal Triple Serial.
> 
> Also, you'll probably want this for cooling VRM1. The stock EK thermal pad can't cool VRM1 properly when overclocked & overvolted; it is fine if you're always running your GPU's at stock clocks, though.
> http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html


Much appreciated, will do!


----------



## HOMECINEMA-PC

@kizwan

Settle down you ....
Pfffffffttttt. It's Murphy's way of saying 'Get a ROG board and volt up' LooooooL


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> @kizwan
> 
> Settle down you ....
> Pfffffffttttt Its murphys way of saying ' Get ROG board and Voltup ' LooooooL


Haha. I will once they're in stock again.


----------



## Red1776

Well, I run at high resolution (Eyefinity, 3+1 extended) and like everything maxed out. But it's more of my own OCD thing. I have built a quadfire machine every year since 2007.



My 4 x 7970 rig weighs in at 116 lbs, and I imagine the 4 x R9 290X rig I am building now will be about the same.


----------



## The Mac

Quote:


> Originally Posted by *phallacy*
> 
> Thank you for the info. I was actually able to recreate the throttling so it is evident even for me. It took about 90 minutes though, I just haven't played games for more than an hour in the past few weeks. So that's why it was never occurring. I just left Far Cry 3 on my computer while I caught up on some work and sure enough about an hour and a half later, my clocks were coming down and started going between 900 and the max clocks I set. Very weird, I guess it is back to 13.12. And I was really beginning to like 14.2 too = /


Well, if you don't play long enough for it to be an issue, why bother going back to 13.12?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Red1776*
> 
> Well I run at high resolution (eyefinity) 3+1 extended and like everything maxed out. *But its more of my own OCD thing. I have built a quadfire machine every year since 2007.*
> 
> 
> 
> My 4 x 7970 rig weighs in at 116 lbs , and I imagine the 4 x R290x I am building now will be about the same.


Ahh yes OCD is pretty much OCN


----------



## chronicfx

Quote:


> Originally Posted by *Red1776*
> 
> Well I run at high resolution (eyefinity) 3+1 extended and like everything maxed out. But its more of my own OCD thing. I have built a quadfire machine every year since 2007.
> 
> 
> 
> My 4 x 7970 rig weighs in at 116 lbs , and I imagine the 4 x R290x I am building now will be about the same.


Nice! I am finally playing The Witcher 2, which I previously didn't play because I couldn't run it with ubersampling. That is also my OCD of needing everything at max settings or it isn't worth it. Now it plays beautifully with 3 x 290X and I am really enjoying it. Great game so far! Just killed the kayran and will join the elves to find the kingslayer. Excited for The Witcher 3 too.


----------



## The Mac

You're only about halfway through

Make sure you play nice with the troll while you are in Flotsam; that armor you get is key.


----------



## Red1776

Hehehe, ah yes, ubersampling. You are on my brainwave. I have the same "if it's not turned all the way up, it's not worth it" thing. I am kind of a graphics whore, as it were.


----------



## ArchieGriffs

Quote:


> Originally Posted by *Paul17041993*
> 
> Oh yeah, that's a semi-common bug with some games/engines. Skyrim, for example, can lose proper mouse control if you alt-tab/minimise, but the Crossfire thing is something like the drivers thinking the program was closed, and of course the profile doesn't get re-loaded, as it's already open when you try to play again - or at least something similar to that.


If you hit Escape to go into the menu, or Tab to go into your inventory/magic/skills, before tabbing out, you don't get the stupid second mouse cursor floating around when you tab back in. Kind of a stupid fix when it shouldn't be happening in the first place, but once you get into the habit of it you never have to worry about it happening again.


----------



## chronicfx

Quote:


> Originally Posted by *The Mac*
> 
> Your only about halfway through
> 
> Make sure you play nice with the troll while you are in flotsam, that armor you get is key.


I killed his woman's killers and told him. Should I have gotten armor?


----------



## The Mac

Quote:


> Originally Posted by *chronicfx*
> 
> I killed his woman's killers and told him. Should I have gotten armor?


You have to go play dice for the stuffed troll head (his mate's head) with Sendler (he's in the house with the fish hanging outside) in the elf town and give it to him. You will get the hunter's armor diagram. It's sick armor for this point in the game.


----------



## chronicfx

Quote:


> Originally Posted by *The Mac*
> 
> You have to go play dice for the stuffed troll head (his mates head) with Sendler (hes in the house with the fish hanging ouside) in the elf town and give it to him. You will get the hunters armor diagram. Its sick armor for this point in the game.


Thanks!!


----------



## phallacy

Quote:


> Originally Posted by *The Mac*
> 
> Well, if you dont play long enough to be an issue, why bother going back to 13.12?


Well, I do plan on playing for more than that this weekend = ]. I'm replaying Deus Ex, but this time on PC, and after how much I played it on PS3, I see myself going well beyond the 90 minutes I spent on Far Cry 3.


----------



## The Mac

Deus Ex is pretty sweet on PC. Nixxes' implementation of FXAA is really effective.


----------



## kizwan

Quote:


> Originally Posted by *The Mac*
> 
> Quote:
> 
> 
> 
> Originally Posted by *phallacy*
> 
> Thank you for the info. I was actually able to recreate the throttling so it is evident even for me. It took about 90 minutes though, I just haven't played games for more than an hour in the past few weeks. So that's why it was never occurring. I just left Far Cry 3 on my computer while I caught up on some work and sure enough about an hour and a half later, my clocks were coming down and started going between 900 and the max clocks I set. Very weird, I guess it is back to 13.12. And I was really beginning to like 14.2 too = /
> 
> 
> 
> Well, if you dont play long enough to be an issue, why bother going back to 13.12?
Click to expand...

Is the clock fluctuating only in certain games, or in all of them? I just played BF4 for 1 hour 40 minutes without throttling, though. I'm on the 14.2 driver. I'm experiencing very bad lag, but that is because of my bad connection.


----------



## phallacy

Quote:


> Originally Posted by *kizwan*
> 
> Is the clock fluctuating only in certain games or all? I just played BF4 for 1 hour 40 minutes without throttling though. I'm with 14.2 driver. I'm experiencing very bad lag but that is because my bad connection.


Could just be select games, but I can test when I'm home. Like I said above, it happened after 90 min of Far Cry 3. I'll try it with Crysis, Deus Ex, and Thief.


----------



## jerrolds

Question about on-screen recording/streaming: AMD does not have its own version of ShadowPlay, correct? What's the best way to record in-game action without affecting performance too much?

Fraps at half resolution or something (720p 30fps)? Or is a dedicated capture card the best way to go about this?


----------



## sugarhell

Quote:


> Originally Posted by *jerrolds*
> 
> Question about onscreen recording/streaming - AMD does not have its own version of ShadowPlay correct? Whats the best way to record ingame action without affecting performance too much?
> 
> Fraps at half resolution or something (720p 30fps)? or a dedicated card the best way to go about this?


MSI AB with Quick Sync, or OBS with Quick Sync


----------



## magicdave26

Quote:


> Originally Posted by *jerrolds*
> 
> Question about onscreen recording/streaming - AMD does not have its own version of ShadowPlay correct? Whats the best way to record ingame action without affecting performance too much?
> 
> Fraps at half resolution or something (720p 30fps)? or a dedicated card the best way to go about this?


Raptr is AMD's version of GeForce Experience/ShadowPlay, with streaming to Twitch:
http://raptr.com/amd

But MSI Afterburner is likely to be way better for recording gameplay


----------



## The Mac

Radeon Pro has some good recording tools as well, and has minimal performance impact.


----------



## jerrolds

Quote:


> Originally Posted by *sugarhell*
> 
> Msi AB with quicksync. Or obs with quicksync


Ah, cool, thanks - but Quick Sync needs the integrated GPU to be enabled (didn't even know my i7 2600K had an integrated GPU lol), and I know GPU Tweak really, really doesn't like 2 different cards in the system.

I'll have to see how it goes, or lower my overclock and use AB exclusively.


----------



## The Mac

As long as your mobo supports integrated graphics, Quick Sync should already be enabled.

P67, for example, does not.


----------



## battleaxe

Quote:


> Originally Posted by *Red1776*
> 
> Well I run at high resolution (eyefinity) 3+1 extended and like everything maxed out. But its more of my own OCD thing. I have built a quadfire machine every year since 2007.
> 
> 
> 
> My 4 x 7970 rig weighs in at 116 lbs , and I imagine the 4 x R290x I am building now will be about the same.


LOL.... that's what I thought. Figured it would be over 100 lbs.


----------



## MapRef41N93W

So far, so good on my new Crossfire 290X setup. You guys were saying the Asus has heat problems, but mine actually runs really cool, whereas my Tri-X is the nuker, running at 50+°C at idle. The benches I have been running show them both in the high 60s to mid 70s so far.

Here's my Firestrike score: http://www.3dmark.com/3dm/2613896

Edit: after further testing, I'm actually running at about 80°C on my cards under load. Got a BSOD during a Tomb Raider benchmark and am wondering whether it's driver-related or not. Is 14.2 stable for Crossfire?


----------



## The Storm

Quote:


> Originally Posted by *MapRef41N93W*
> 
> So far so good on my new Crossfire 290x setup. You guys were saying the Asus has heat problems but mine actually runs really cool where as my Tri-X is the nuker running at 50+c in idle. The benches I have been running show them both running in the high 60s-mid 70s so far.
> 
> Heres my firestrike score http://www.3dmark.com/3dm/2613896
> 
> Edit: after furthing testing I'm actually running at about 80c on my cards during load. Got BSOD during a benchmark of Tomb Raider and wondering if it's driver related or not. Is 14.2 stable for xfire?


Not sure, but I am Crossfired with 290X's, still running good ole 13.12, and haven't had any problems at all.


----------



## taem

Quote:


> Originally Posted by *jerrolds*
> 
> Question about onscreen recording/streaming - AMD does not have its own version of ShadowPlay correct? Whats the best way to record ingame action without affecting performance too much?
> 
> Fraps at half resolution or something (720p 30fps)? or a dedicated card the best way to go about this?


In addition to Afterburner or Raptr, give OBS a look: http://obsproject.com. Stream or record video, free. Haven't played with it too much, but it looks promising.


----------



## battleaxe

Quote:


> Originally Posted by *The Storm*
> 
> Not sure but I am Xfired with 290x's and still running good ole 13.12 and not any problems at all.


Lots of us are still running 13.12. I'm terrified to try the BETAs.


----------



## The Storm

Quote:


> Originally Posted by *battleaxe*
> 
> Lots of us are still running 13.12. I'm terrified to try the BETAs.


I totally agree; I don't even want to try the betas.


----------



## The Mac

Quote:


> Originally Posted by *battleaxe*
> 
> Lots of us are still running 13.12. I'm terrified to try the BETAs.


Why is it terrifying?

If it doesn't work for you, switch back...


----------



## MapRef41N93W

I switched back to 13.12 and am having much better stability now. No BSOD and less stutter. Cards are even running a bit cooler.


----------



## neurotix

I won't use the betas either, mainly because they lower mining performance. 960 khash/s with 13.12. Only 900 with 14.2.
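For reference, the regression described here works out to roughly a 6% hit:

```python
# Quick check of the reported mining regression going from
# Catalyst 13.12 to the 14.2 beta (scrypt kilohashes per second).
old, new = 960.0, 900.0
drop_pct = (old - new) / old * 100
print(f"{drop_pct:.2f}% slower")  # 6.25% slower
```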


----------



## MrWhiteRX7

Yeah, for mining the betas never seem to do as well.


----------



## battleaxe

Quote:


> Originally Posted by *neurotix*
> 
> I won't use the betas either, mainly because they lower mining performance. 960 khash/s with 13.12. Only 900 with 14.2.


Ditto. That's why I won't either. Plus, I've heard about all the crashes and whatnot, so why go through the headache in the first place?


----------



## kizwan

*3820 @4.75GHz, BF4 @1080p 200% resolution scale, Ultra settings (no AA), Mantle with 14.2 beta driver, 2 x 290 Crossfire*

*CPU usage vs. Core/Memory Clocks*


*GPU usage vs. Core/Memory Clocks*


No crash, no BSOD, no headache with 14.2 beta driver.


----------



## magicase

My adventure with the R9 290 so far









XFX stock cooler -> Asus stock cooler -> 2 x Gigabyte Windforce -> 2 x Sapphire Tri-X OC -> 2 x Powercolor PCS+ -> 2 x Sapphire Tri-X OC


----------



## battleaxe

Quote:


> Originally Posted by *magicase*
> 
> My adventure with the R9 290 so far
> 
> 
> 
> 
> 
> 
> 
> 
> 
> XFX stock cooler -> Asus stock cooler -> 2 x Gigabyte Windforce -> 2 x Sapphire Tri-X OC -> 2 x Powercolor PCS+ -> 2 x Sappire Tri-X OC


What on earth are you doing? Trying to see if you can own every brand?


----------



## jerrolds

Why would you go through all that trouble? If it was a noise/heat issue, you should've just gone with water cooling.

Could've at least tried a 780 Ti or a Black or something in there


----------



## Durquavian

Quote:


> Originally Posted by *battleaxe*
> 
> What on earth are you doing? Trying to see if you can own every brand?


Maybe he's looking for one that unlocks to a 290X


----------



## Elmy

Quote:


> Originally Posted by *neurotix*
> 
> I won't use the betas either, mainly because they lower mining performance. 960 khash/s with 13.12. Only 900 with 14.2.


What settings do you use?


----------



## bhardy1185

What is the best way to revert back to previous drivers?


----------



## MapRef41N93W

Quote:


> Originally Posted by *bhardy1185*
> 
> What is the best way to revert back to previous drivers?


Uninstall the current drivers, then run Driver Sweeper to clean them out. Then you can install the 13.12 Catalyst.


----------



## bhardy1185

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Uninstall current drivers, then driver sweeper them clean. Then you can install the 13.12 catalyst.


Thanks. I will give it a go now.


----------



## Paul17041993

Quote:


> Originally Posted by *kizwan*
> 
> My motherboard refuses to turn on. It has already happened twice now: out of nowhere it refused to turn on, but the next day it turned on just fine. When it works, it works just fine. I thought about RMA'ing it, but they'd probably tell me my motherboard is working just fine. Getting another motherboard is much easier, I think.


Have you tried with a different PSU? I say this because back in 2012 my first Seasonic Platinum (same as in the sig) had a regulator/protection fault and wouldn't start warm without toying with the power switch on the back a few times. RMA'ed it and all was good again.


----------



## JordanTr

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Uninstall current drivers, then driver sweeper them clean. Then you can install the 13.12 catalyst.


Is it necessary to uninstall the drivers before using DDU in safe mode? I never bothered to uninstall them before using DDU; it deletes the driver and cleans the registry as well.


----------



## hotrod717

Well, looks like I may be the first to own one and say it - the MSI Lightning 290X is in stock now! - http://www.newegg.com/Product/Product.aspx?Item=N82E16814127787


----------



## ZealotKi11er

Quote:


> Originally Posted by *hotrod717*
> 
> Well, Looks like I may be the first to own and say - MSI Lightning 290X is in stock now! - http://www.newegg.com/Product/Product.aspx?Item=N82E16814127787


Wow, $750. It does not even OC any better than the stock card. That's $200 over stock MSRP.


----------



## VSG

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Wow $750. It does not even OC any better then stock card. Thats $200 of Stock MSRP.


What are you basing that statement on, though? The reviews out so far have been pretty bad as far as overclocking procedures go. They barely touched the card's potential, I bet.


----------



## MapRef41N93W

Quote:


> Originally Posted by *JordanTr*
> 
> Is it necessery to uninstall drivers before using ddu in safe mode? I never bothered to uninstall them before using ddu. It deletes driver and cleans registry as well.


Don't know; I just find it better to do a quick uninstall and then a sweep to avoid any issues.


----------



## ZealotKi11er

Quote:


> Originally Posted by *geggeg*
> 
> What are you basing that statement on though, the reviews out so far have been pretty bad as far as overclocking procedures go. They barely touched the card's potentials I bet.


The stock card's OC potential on air/water is not that great. AMD reference cards have no problems delivering power and such; it's all down to the core. The Lightning will not be a better overclocker, so there's no point in dropping more money on a custom-PCB GPU if it doesn't OC more. You don't even need to test the Lightning to know its OC potential. With the HD 7970 and 6970 it was the same story.


----------



## hotrod717

Quote:


> Originally Posted by *geggeg*
> 
> What are you basing that statement on though, the reviews out so far have been pretty bad as far as overclocking procedures go. They barely touched the card's potentials I bet.


Absolutely.
Take the 7970 Matrix: all kinds of complaints, but somehow I managed 1385/1875 on water. Any card's potential is in the hands of the owner. I'm so tired of the arguments over reference vs. custom-PCB cards. There is a reason the top clocks on graphics cards with custom cooling come on custom-PCB cards. Same goes for Nvidia. The higher-quality components and power phases definitely reduce the probability of coil whine, and it's unlocked out of the box. So much easier.


----------



## ZealotKi11er

Quote:


> Originally Posted by *hotrod717*
> 
> Absolutely.
> Take the 7970 Matrix - All kinds of complaints, but somehow I managed 1385/ 1875 on water. Any cards potential is in the hands of the owner. I'm so tired of the arguements over reference vs. custom pcb cards. There is a reason the top clocks on graphics cards with custom cooling, come on custom pcb cards. Same goes for Nvidia. The higher quality components and power phases definately reduce the probability of coil whine and its unlocked out of the box. So much easier.


That's because when you go for extreme overclocks, power delivery matters.


----------



## MapRef41N93W

So I tried out some gaming on the new 290X Crossfire setup with Shadow Warrior, and I keep getting BSODs. The cards don't even seem to be running that hot, as the fan speed doesn't seem to go over 40% at all. This is on the 13.12 drivers. Any suggestions?


----------



## hotrod717

Quote:


> Originally Posted by *MapRef41N93W*
> 
> So I tried out some gaming with the new 290x crossfire setup with Shadow Warrior and I keep getting BSOD. The cards don't even seem to be running that hot as the fan speed seems to not be going over 40% at all. This is on 13.12 drivers. Any suggestions?


Have you set a custom fan profile or monitored temps? What temps under load?


----------



## MapRef41N93W

Quote:


> Originally Posted by *hotrod717*
> 
> Have you set a custom fan profile or monitored temps? What temps under load?


Yes, I have a custom fan profile. Load temps are in the mid to high 70s.

Edit: Just tested in Arkham Origins and I am hitting 85°C on my Tri-X and the mid 70s on the Asus. No BSOD there (though I didn't exactly hit any heavy areas of the game).


----------



## magicase

Quote:


> Originally Posted by *battleaxe*
> 
> What on earth are you doing? Trying to see if you can own every brand?


Trying different coolers


----------



## taem

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Yes I have a custom fan profile. Load temps are in the mid to high 70s.
> 
> Edit: Just tested in Arkham Origins and I am hitting 85c on my Tri-X and mid 70s on Asus. No BSOD there (though I didn't exactly hit any heavy areas of the game).


Something's wrong there. I have a 290 Tri-X as the top card in Crossfire: 71°C core and VRM1 in a Metro LL bench-loop heat test, game temps in the 60s, with the fan curve set at 1% per 1 degree. I have a side fan though; it makes all the difference for open-air coolers in Crossfire.
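The "1% fan per 1 degree" curve mentioned here can be expressed as a trivial mapping. This is only a sketch; the 30% floor and 100% ceiling are my assumption, since Afterburner-style curves are normally clamped at both ends:

```python
# Sketch of a "1% fan duty per 1 °C" curve, as set in an Afterburner-style
# fan profile. The 30-100% clamp bounds are assumed, not from the post.

def fan_speed(temp_c, offset=0):
    """Map GPU temperature to fan duty: 1% per °C, clamped to 30-100%."""
    return max(30, min(100, temp_c + offset))

for t in (40, 60, 71, 85):
    print(f"{t} °C -> {fan_speed(t)}% fan")
```

The optional `offset` parameter (hypothetical) just shifts the whole curve up for a more aggressive profile.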


----------



## MapRef41N93W

Quote:


> Originally Posted by *taem*
> 
> Something's wrong there. I have 290 Tri-X as top card in crossfire, 71c core and vrm1 in Metro LL bench loop heat test, game temps in the 60s, with fan curve set at 1% per 1 degree. I have a side fan though, makes all the difference for open air coolers in crossfire.


Yeah, I have a 200mm side fan blowing on the cards. The Tri-X is the one hitting 80+°C for me, while the Asus seems to be doing its job nicely.


----------



## Paul17041993

Quote:


> Originally Posted by *MapRef41N93W*
> 
> So I tried out some gaming with the new 290x crossfire setup with Shadow Warrior and I keep getting BSOD. The cards don't even seem to be running that hot as the fan speed seems to not be going over 40% at all. This is on 13.12 drivers. Any suggestions?


It helps if you actually say which BSOD you're getting. It could be as simple as dust in the PCIe slots, even. Test each card individually and see if one of them has artifacts or blackouts.


----------



## kizwan

Quote:


> Originally Posted by *Paul17041993*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> My motherboard refused to turn on. It already happening twice now. Out of nowhere refused to turn on but the next day it turn on just fine. When it work, it work just fine. I thought about RMA'ing it but they probably tell me my motherboad is working just fine. Getting another motherboard is much easier I think.
> 
> 
> 
> 
> 
> 
> 
> have you tried with a different PSU? I say this cause back in 2012 my first seasonic plat. (same as the sig) had a regulator/protection fault and wouldn't start warm without toying with the power switch on the back a few times, RMA-ed it and all was good again.

Not yet, but that is my plan. I need to wait for it to happen again before I can test with a different PSU.
Quote:


> Originally Posted by *MapRef41N93W*
> 
> So I tried out some gaming with the new 290x crossfire setup with Shadow Warrior and I keep getting BSOD. The cards don't even seem to be running that hot as the fan speed seems to not be going over 40% at all. This is on 13.12 drivers. Any suggestions?


Quote:


> Originally Posted by *MapRef41N93W*
> 
> Quote:
> 
> 
> 
> Originally Posted by *hotrod717*
> 
> Have you set a custom fan profile or monitored temps? What temps under load?
> 
> 
> 
> Yes I have a custom fan profile. Load temps are in the mid to high 70s.
> 
> Edit: Just tested in Arkham Origins and I am hitting 85c on my Tri-X and mid 70s on Asus. No BSOD there (though I didn't exactly hit any heavy areas of the game).

It shouldn't BSOD even if it's running hot at stock clocks. I recommend testing your GPUs one by one; one of them is probably faulty.


----------



## Aussiejuggalo

So fellers... thinking about jumping to the red team and grabbing a 290 instead of a 780; pricing is outrageous on Nvidia the last couple of series and I'm fed up lol

Couple of questions though: how are the drivers? I know AMD haven't always had the greatest, most stable drivers. Are they still that way, or have they been tamed?

How does a 290 go under water? Does it keep temps in check, or is it still hard to keep them cool?

Thanks


----------



## ZealotKi11er

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> So fellers... thinking about jumping to the red team and grabbing a 290 instead of a 780, pricing is outrages on Nvidia the last couple of series and I'm fed up lol
> 
> Couple of questions tho, hows the drivers, I know AMD havent always had the greatest most stable drivers, are they still that way or have they been tamed?
> 
> How does a 290 go underwater? does it keep temps in check or is it still hard to keep them cool?
> 
> Thanks


Under water temps are fine. My 290X hits 43C in BF4.


----------



## nightfox

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> So fellers... thinking about jumping to the red team and grabbing a 290 instead of a 780, pricing is outrages on Nvidia the last couple of series and I'm fed up lol
> 
> Couple of questions tho, hows the drivers, I know AMD havent always had the greatest most stable drivers, are they still that way or have they been tamed?
> 
> How does a 290 go underwater? does it keep temps in check or is it still hard to keep them cool?
> 
> Thanks


I can only comment on the drivers: they're way better now than before. With the new XDMA engine on the 290/290X and bridgeless CrossFire, scaling is better and microstutter is reduced significantly, to almost none. Don't expect too much on new games though; AMD is still slow updating drivers for new games, especially Nvidia-optimized ones, although they're faster now than before.


----------



## Mr357

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> So fellers... thinking about jumping to the red team and grabbing a 290 instead of a 780, pricing is outrages on Nvidia the last couple of series and I'm fed up lol
> 
> Couple of questions tho, hows the drivers, I know AMD havent always had the greatest most stable drivers, are they still that way or have they been tamed?
> 
> How does a 290 go underwater? does it keep temps in check or is it still hard to keep them cool?
> 
> Thanks


The drivers are fine if you use one of the "good ones."

Under water a 290X is like any other card: low ambient and load temperatures, though of course the heat output is still high. Unfortunately even a good 290X can only do about 1200-1250MHz core 24/7.


----------



## Xyro TR1

Got a return extension on my MSI GAMING 290X from Amazon since they're out of stock for a while. They will be replacing my card due to the cooling issue. But now I don't have to send it back 'till the new ones come back into stock!


----------



## Aussiejuggalo

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Under water temps are fine. My 290X hits 43C in BF4.


Well, that's a hell of a lot better than the 90C with the stock cooler.

Quote:


> Originally Posted by *nightfox*
> 
> I can only comment on the drivers: they're way better now than before. With the new XDMA engine on the 290/290X and bridgeless CrossFire, scaling is better and microstutter is reduced significantly, to almost none. Don't expect too much on new games though; AMD is still slow updating drivers for new games, especially Nvidia-optimized ones, although they're faster now than before.


I was actually wondering how that bridgeless CrossFire would go; seemed like a big risk if it didn't work. Nvidia are kinda useless for new game drivers too, still waiting on Thief ones... probably will end up being another month or so.
Quote:


> Originally Posted by *Mr357*
> 
> The drivers are fine if you use one of the "good ones."
> 
> Under water a 290X is like any other card- low ambient and load temperatures, but of course the heat output is still high. Unfortunately a good 290X can only do about 1200-1250MHz core 24/7.


Cool, well I have around 600mm of rad with maybe soon to be 10 AP-15s, so I should be good on keeping it cool... I hope. I don't normally overclock my graphics cards, so that's not really a problem for me.

Another driver-related question: I remember back in the day they used to crash, freeze and even sometimes BSOD... do they still do that?


----------



## taem

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> So fellers... thinking about jumping to the red team and grabbing a 290 instead of a 780
> 
> How does a 290 go underwater? does it keep temps in check or is it still hard to keep them cool?


You don't have to go under water; the better coolers do just fine on air. Ambient ~25C.



Powercolor PCS+ and Sapphire Tri-X are the two best 290 coolers imho.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *taem*
> 
> Don't have to go underwater, the better coolers do just fine on air. Ambient ~25
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Powercolor PCS+ and Sapphire Tri-X are the two best 290 coolers imho.


Ah ok, well I already have a watercooling setup, so I'd just need a block, but good to know I don't need to go to water straight away.


----------



## kizwan

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> So fellers... thinking about jumping to the red team and grabbing a 290 instead of a 780, pricing is outrages on Nvidia the last couple of series and I'm fed up lol
> 
> Couple of questions tho, hows the drivers, I know AMD havent always had the greatest most stable drivers, are they still that way or have they been tamed?
> 
> How does a 290 go underwater? does it keep temps in check or is it still hard to keep them cool?
> 
> Thanks


The latest WHQL (13.12) & latest beta driver (14.2) work well for me. No BSODs or crashes so far. I'm playing BF4 using Mantle with the 14.2 beta driver, running the card at 1000/1300 at stock voltage using the 290 Tri-X BIOS. Can't say whether it's better than Nvidia because I only have AMD cards.

Mine can go up to 60-61C on the core under water, but that's because I'm playing in 32C ambient (indoors). I have 360 + 240mm radiators cooling the CPU & two 290s; delta temp is 11C when playing BF4. All the water blocks cool these cards well, with only a couple of degrees between them. The best block would be the Aquacomputer Kryographics block with backplate, because it cools VRM1 better than the others. VRM1 temps with other blocks tend to run a little higher, but I should add that even when VRM1 runs warm, it stays within safe operating temperature.

I have EK blocks. I already have Fujipoly Extreme thermal pads for VRM1 but haven't applied them yet; I'm waiting on thermal paste & an SR-1 120 radiator first.


----------



## nightfox

@aussie,

My trifire 290s don't go above 85C, although the fan noise is irritating. lol

I had the same worry about bridgeless CrossFire not working, but it works. AMD cards have looser PCIe lane requirements than Nvidia's. Nvidia cards require at least PCIe 2.0 x16 or PCIe 3.0 x8 to enable SLI, am I correct? AMD cards require less, or have no minimum at all.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *kizwan*
> 
> The latest WHQL (13.12) & latest beta driver (14.2) works well for me. No BSOD or crash so far. I'm playing BF4 using Mantle with 14.2 beta driver. Running card at 1000/1300 at stock voltage using 290 Tri-X BIOS. Can't say whether it's better than Nvidia because I only have AMD cards.
> 
> Mine on the core can go up to 60 - 61C underwater but this is because I'm playing in 32C ambient (indoor). I have 360 + 240mm radiators for cooling CPU & two 290's. Delta temp is 11C when playing BF4. All water blocks works well in cooling these cards with only a couple of degrees between them. The best block would be Aquacomputer Kryographics water block with backplate because it cooling the VRM1 better than the other blocks. VRM1 temp with other blocks tend to run a little bit higher but I'm obligated to add that even though VRM1 running at high temp, they're within safe operating temperature.


That's reassuring about the drivers, which was my main concern.

That's still not too bad considering these 290/290Xs throw out a hell of a lot of heat; your temps are about what mine would be too.








Quote:


> Originally Posted by *nightfox*
> 
> @aussie,
> 
> my trifire 290's doesnt go above 85C. although noise fan is irritating. lol
> 
> I have same dilemna about the bridgeless crossfire not working. But it works. AMD card has less requirements for PCI lane. unlike NVIDIA. Nvidia cards require atleast PCIE2 x 16? or PCIE3 x 8 to enable SLI. am I correct? AMD cards requires less. Or they dont require at all.


Cool, yeah fans on stock coolers/some aftermarket stuff seem to sound like jets.

Seems like the bridgeless CrossFire works better than the old cable between the cards, which is awesome. I don't actually remember what Nvidia cards require, haven't gone SLI in a few years, too much stutter :\


----------



## MapRef41N93W

I think I figured out what was causing the BSODs for me. I was trying to run both cards at 1350 memory (upping the Tri-X from 1300), but my Tri-X has serious issues with memory OCing (even with Hynix), so I downclocked to 1320 and have had no gaming issues since.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> The latest WHQL (13.12) & latest beta driver (14.2) works well for me. No BSOD or crash so far. I'm playing BF4 using Mantle with 14.2 beta driver. Running card at 1000/1300 at stock voltage using 290 Tri-X BIOS. Can't say whether it's better than Nvidia because I only have AMD cards.
> 
> Mine on the core can go up to 60 - 61C underwater but this is because I'm playing in 32C ambient (indoor). I have 360 + 240mm radiators for cooling CPU & two 290's. Delta temp is 11C when playing BF4. All water blocks works well in cooling these cards with only a couple of degrees between them. The best block would be Aquacomputer Kryographics water block with backplate because it cooling the VRM1 better than the other blocks. VRM1 temp with other blocks tend to run a little bit higher but I'm obligated to add that even though VRM1 running at high temp, they're within safe operating temperature.
> 
> I have EK blocks. I already have Fujipoly Extreme thermal pad for VRM1 but didn't apply it yet. I'm waiting thermal paste & SR-1 120 radiator before I can apply them.


Gonna have to try that tri-x bios at some stage








Quote:


> Originally Posted by *Aussiejuggalo*
> 
> That's reassuring about the drivers, which was my main concern.
> 
> That's still not too bad considering these 290/290Xs throw out a hell of a lot of heat; your temps are about what mine would be too.
> 
> Cool, yeah fans on stock coolers/some aftermarket stuff seem to sound like jets.
> 
> Seems like the bridgeless CrossFire works better than the old cable between the cards, which is awesome. I don't actually remember what Nvidia cards require, haven't gone SLI in a few years, too much stutter :\


*QUEENSLANDER*


----------



## Widde

Is Mantle working now since the latest BF4 patch? Last week I was dropping fps in Mantle compared to DX11.


----------



## Red1776

Just a heads up about the MSI 4GB R9 290X Gaming model: I have 4 of them and they are all very good OCers. I am expecting more when I get them under water.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> *QUEENSLANDER*


Another Aussie

Quote:


> Originally Posted by *Red1776*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> just a heads up about the MSI 4GB R290x Gaming model.
> I have 4 of then and they all are very good OC'ers. I am expecting more when I get them under water.
> Just a heads up
> 
> 
> Spoiler: Warning: Spoiler!


What are you getting out of them now? They look like damn nice cards.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Widde*
> 
> Is mantle working now since the latest bf patch? Last week I dropped in fps in mantle compared to dx11


I did these before the BF4 patch:
Quote:


> Originally Posted by *Sgt Bilko*
> 
> So I did as you said, here are the results:
> 
> Here is Mantle:
> 
> 
> 
> And here is D3D 11:
> 
> 
> 
> For Me Mantle is better and D3D has larger spikes.


My net is capped now, so I can't download the patch for a few days.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I did these before the BF4 patch,
> My net is capped now so i can't download the patch for a few days


Bummer.

I'm on a Telstra broadband/wifi dongle atm, just moved house.

Gonna go Dodo with unlimited usage in the next week or so, 'cause Telstra has no ports left and can't tell you the truth.


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Bummer.
> 
> I'm on a Telstra broadband/wifi dongle atm, just moved house.
> 
> Gonna go Dodo with unlimited usage in the next week or so, 'cause Telstra has no ports left and can't tell you the truth.


That's a nice lookin' place, congrats.

Well, nothing can really save my net; I'm on satellite so ping is always 700ms+, it's just the "speed" I'm missing.


----------



## kizwan

Quote:


> Originally Posted by *Widde*
> 
> Is mantle working now since the latest bf patch? Last week I dropped in fps in mantle compared to dx11


Played Siege of Shanghai last night, 64 players, with the 14.2 beta driver. Frame times are higher than on the 14.1 beta driver, and FPS doesn't seem as smooth compared to 14.1.

Frame Time


FPS




Spoiler: Mantle with 14.1 beta driver for comparison



Quote:


> Originally Posted by *kizwan*
> 
> *BF4, Flood Zone 64 player, 1080p Ultra settings (AA & AA POST OFF), 200% resolution scale, 2 x 290's Crossfire*
> 
> *Frame Time - DirectX*
> 
> 
> *Frame Time - Mantle*
> 
> 
> *FPS - DirectX*
> MIN: 30.68425898
> MAX : 113.6363636
> AVERAGE: 78.75183923
> 
> 
> *FPS - Mantle*
> MIN: 38.925652
> MAX : 137.9310345
> AVERAGE: 91.42697436
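For reference, MIN/MAX/AVERAGE FPS figures like those above can be derived from a FRAPS-style frametimes log, where each entry is the milliseconds spent on one frame. A minimal sketch with made-up sample values (not real benchmark data):

```python
# Convert a list of per-frame times (ms) into min/max/average FPS,
# the same summary statistics shown in the quoted benchmark post.

def fps_stats(frametimes_ms):
    """Return (min_fps, max_fps, avg_fps) for a frametimes log in ms."""
    fps = [1000.0 / ft for ft in frametimes_ms]
    return min(fps), max(fps), sum(fps) / len(fps)

# Illustrative sample: 10ms, 12.5ms, 8ms and 20ms frames.
lo, hi, avg = fps_stats([10.0, 12.5, 8.0, 20.0])
print(lo, hi, avg)  # 50.0 125.0 88.75
```

Note the average here is the mean of per-frame FPS; averaging the frametimes first and inverting gives a slightly different (time-weighted) figure.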


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Sgt Bilko*
> 
> well, nothing can really save my net, I'm on Satelitte so ping is always 700ms+ it's just the "speed" i'm missing.


God, I thought I had it bad on my net. I'm meant to be getting full ADSL2+, but our exchange is overloaded.


----------



## Widde

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That's a nice lookin place, congrats
> 
> 
> 
> 
> 
> 
> 
> 
> 
> well, nothing can really save my net, I'm on Satelitte so ping is always 700ms+ it's just the "speed" i'm missing.


One of the only good things about Sweden: unlimited download on a 100/100 line, soon 1000/100.


----------



## combine1237

Hello, just wondering if 1125MHz core on stock voltage and +20 power is decent for my 290X DD Black? Also, don't get me started on my throttle-happy ISP.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> God I thought I had it bad on my net
> 
> 
> 
> I'm ment to be getting full ADSL2+ but our exchange to overloaded


lol, I'd kill for ADSL1 atm, just so I can actually do some gaming.
Quote:


> Originally Posted by *Widde*
> 
> One of the only good things about Sweden: unlimited download on a 100/100 line, soon 1000/100.


Don't worry, I'm going to move to Denmark ASAP; looking forward to the cooler weather as well.


----------



## lurker2501

Manufacturer Sapphire, stock cooling.


----------



## King4x4

Just cracked 19k on Firestrike extreme








http://www.3dmark.com/fs/1831296


----------



## Sgt Bilko

Quote:


> Originally Posted by *King4x4*
> 
> Just cracked 19k on Firestrike extreme
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/1831296


Nice!

I've managed just over 9k with CF 290's but 19k......

What kind of power draw were you getting with them at 1230?


----------



## King4x4

Don't have a power meter, but with two Seasonic X-Series 1250W units I had no power issues.


----------



## Sgt Bilko

Quote:


> Originally Posted by *King4x4*
> 
> Don't have a power meter
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But with two Seasonic X-Series 1250watt I was getting no power issues.


Ah well, curiosity more than anything.

Impressive score nevertheless, well done


----------



## BradleyW

Quote:


> Originally Posted by *kizwan*
> 
> Played Siege of Shanghai last night, 64 players. With 14.2 beta driver. Frame Time are higher than 14.1 beta driver but FPS seems not so smooth compared to 14.1.
> 
> Frame Time
> 
> 
> FPS


Does this prove XDMA frame pacing is a lie? Or is pacing done via software for XDMA?


----------



## kizwan

Quote:


> Originally Posted by *BradleyW*
> 
> Does this prove XDMA frame pacing is a lie? Or is pacing done via software for XDMA?


FPS is much smoother with the 14.1 beta driver, so I can't blame frame pacing, IMO.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> FPS much smoother with 14.1 beta driver. So can't blame Frame Pacing IMO.


Actually, the fps and frame times have been excellent for me with the 14.2 driver. I posted this a little earlier, but I'll do it again.


----------



## sterob

Quote:


> Originally Posted by *sterob*
> 
> I am having a very weird problem with my Sapphire Tri-X 290.
> 
> Basically, I use MSI Afterburner to adjust core/mem. I notice I am stuck at 830kH/s no matter what core/mem I use. Then I checked with GPU-Z and noticed something strange: on the "Graphics Card" tab, GPU clock and memory show exactly the core/mem I set in Afterburner (for instance 1050/1500), but on the "Sensors" tab the GPU core clock never goes above 932. Even when I push the core clock in Afterburner to 980, 1000, 1020 or 1050, the core clock will not go above 932, and in some cases (1000) it drops to 900. Using --gpu-engine in cgminer does nothing.


I found the problem. Apparently Afterburner was buggy: delete the "Profiles" folder inside the Afterburner folder and everything works again.

P.S.: Actually this time I couldn't adjust one of my 280X's voltage (I already had 2x 280X running before buying this 290); everything else was fine. So I tried applying the old .cfg file and settings files of the 280X while keeping the new 290 settings file. GPU temp did drop, indicating I was able to undervolt it properly.
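For anyone who wants to script that reset, here is a minimal sketch. The install path is an assumption (the default varies by system), and it backs the folder up before deleting so nothing is lost; Afterburner regenerates a clean Profiles folder on next launch.

```python
import shutil
from pathlib import Path

# Assumed default install location; adjust for your system.
PROFILES = Path(r"C:\Program Files (x86)\MSI Afterburner\Profiles")

def reset_profiles(profiles_dir=PROFILES):
    """Back up Afterburner's Profiles folder, then remove it so the app
    regenerates clean settings on next launch. Returns True if removed."""
    profiles_dir = Path(profiles_dir)
    if not profiles_dir.exists():
        return False
    backup = profiles_dir.with_name(profiles_dir.name + ".bak")
    if backup.exists():
        shutil.rmtree(backup)              # drop any stale backup
    shutil.copytree(profiles_dir, backup)  # keep a copy, just in case
    shutil.rmtree(profiles_dir)            # Afterburner rebuilds this
    return True
```

Close Afterburner before running it, or the app may rewrite the folder from memory on exit.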


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> FPS much smoother with 14.1 beta driver. So can't blame Frame Pacing IMO.
> 
> 
> 
> Actually the fps and frametimes have been excellent for me with the 14.2 driver, i posted this a little earlier but i'll do it again.

Mine was with the latest BF4 patch. Was yours with the latest patch? I don't know if the patch makes any difference though.

Wait a sec... my frame times are pretty good too, just higher than yours somehow.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> Mine with latest BF4 patch. Yours with latest patch? I don't know if the patch make any difference though.
> 
> Wait a sec...my frame time pretty good too, higher than yours though somehow.
> 
> 
> Spoiler: Warning: Spoiler!


I'm downloading the latest patch atm, 7 more hours to go.

I've been comparing results with a few people from the Mantle Discussion Thread, and it seems my experience with Mantle is better than most.

Not sure why, but my fps, frame times and everything else in general have been great on the 14.2 drivers.

14.1 was absolute crap for me: crashes, bad fps, etc.

I'll post some results whenever I get the latest patch downloaded, but it won't be anytime soon.


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm downloading the latest patch atm, 7 more hours to go
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've been comparing results with a few people from the Mantle Discussion Thread and it seems that my experience with mantle i better than most others.
> 
> not sure why but my fps, frame times and everything else is general has been great using the 14.2 drivers.
> 
> 14.1 was absolute crap for me, crashes, bad fps etc.
> 
> I'll post some results whenever i get the latest patch downloaded but it won't be anytime soon.


Mine only crashed with 14.1 if I overclocked; at stock, it ran flawlessly. With 14.2, no problems so far.

My frame time was with 200% resolution scale. At what resolution & resolution scale are you running BF4?


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> Mine only crashed with 14.1 if I overclock. At stock, it ran flawlessly. With 14.2, free from any problem so far.
> 
> My frame time was with 200% resolution scale. At what resolution & resolution scale you're running BF4?


I'm running at 1080p (100% res scale for testing and 150% when gaming) and will keep doing so until a 1440p 120Hz+ monitor comes out that doesn't cost more than half a year's wages.


----------



## Hl86

Artifacting: does that mean end of the line, or too little VDDC?


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Mine only crashed with 14.1 if I overclock. At stock, it ran flawlessly. With 14.2, free from any problem so far.
> 
> My frame time was with 200% resolution scale. At what resolution & resolution scale you're running BF4?
> 
> 
> 
> 
> 
> 
> 
> I'm running at 1080p (100% res scale for testing and 150% when gaming) and will keep doing so until a 1440p 120hz+ Monitor comes out that doesn't cost more than half a years wages

If BF4 is lagging because of a poor internet connection, does that affect FPS? I'm asking because it was lagging pretty badly when I played last night. My ping went from two digits to the 100s, then the 200s, then back to the 100s within seconds.

I want a 120Hz monitor too, but yeah, they're too expensive.

Quote:


> Originally Posted by *Hl86*
> 
> Artifacting does that mean end of the line or too Little VDDC?


Too little VDDC. It's end of the line if it's still artifacting even after you've pushed a lot of extra voltage.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> If BF4 is lagging because poor internet connection, does this affect FPS? I'm asking because when playing last night, it was lags pretty badly. My ping like from two digits to 100s and to 200s, then go back to 100s in seconds..


Oh, I don't game on my home connection; I go to a friend's place for that.

As for ping affecting fps, I really don't know... I assume it does to some extent, but to what extent I don't know.


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Oh i don't game on my home connection, I go to a friends place for that.
> 
> As for ping affecting fps, i really don't know.....i assume it does to some extent but what extent that is i don't know.


I don't have much choice here. The only alternative is applying for a 1Mbps connection. That should solve the problem.

You should see me trying to aim when the game lags this badly. I don't know whether I can record a video though.


----------



## bbond007

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Your CCC looks the same as mine, From what i've heard the custom cards don't have a Temp Target setting in CCC.
> 
> Only Reference have the Temp target setting, other might need to confirm this for me though.


Thanks. After much reading I understand a little more, but unfortunately I just have more questions.

I'm really starting to think I would have been better off with reference cards than these MSI R9 290X Gaming cards, if only for that reason.

Anyway, to combat the temps on the top card I used a Dremel to install a 200mm fan over it. No more throttling under heavy gaming loads such as 3DMark Firestrike.

Still, for mining I have been setting my GPU clocks lower to keep the temps in the 80-88C range, because I feel 94C is just too hot.

I already had to return a Gigabyte Windforce R9 290X that died after about 5 hours of ownership, and I'm not entirely sure I didn't kill the board by overheating it while mining, though I was not overclocking. I just don't want these boards to meet a similar fate.

I have successfully booted into MS-DOS and dumped my flash ROM.

Would it be possible to load a reference card's ROM on these MSI boards to enable the Temp Target setting in CCC?

The next frustrating thing about CCC is that overclock settings are not available for the second card unless you enable CrossFireX or plug another monitor into it. Any way around that?

Thanks.


----------



## ZealotKi11er

I am going to play BF4 in DX and see if it's better.


----------



## Forceman

Quote:


> Originally Posted by *combine1237*
> 
> Hello, just wondering if 1125mhz core on stock voltage and plus 20 power is decent? for my 290x dd black. Also do not get my started on my throttle happy isp


Yes, 1125 on stock voltage is very good. Make sure it isn't throttling, though, with only +20% power.


----------



## combine1237

Thanks Forceman. It doesn't seem to throttle, but 1125 is its limit on stock, and the memory does not like to go above 1320 on stock, probably due to being Elpida. My VRM is a little high at 80 for one card in FurMark. For some reason MSI Kombustor sends VRM1 to 120 degrees according to GPU-Z while VRM2 stays at 59-60. Is that too large a gap, or is it just me?


----------



## Arizonian

Quote:


> Originally Posted by *lurker2501*
> 
> Manufacturer Sapphire, stock cooling.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats, added.

Quote:


> Originally Posted by *King4x4*
> 
> Just cracked 19k on Firestrike extreme
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/1831296


----------



## MapRef41N93W

Quote:


> Originally Posted by *kizwan*
> 
> If BF4 is lagging because poor internet connection, does this affect FPS? I'm asking because when playing last night, it was lags pretty badly. My ping like from two digits to 100s and to 200s, then go back to 100s in seconds.
> 
> I want 120Hz monitor too but yeah, they're too expensive.
> 
> 
> 
> 
> 
> 
> 
> 
> Too little VDDC. It's end of the line if it's still artifacting even when you already push a lot of volt.


Are they now? http://www.ebay.com/itm/330932578190?ssPageName=STRK:MEWAX:IT&_trksid=p3984.m1423.l2649


----------



## chronicfx

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Are they now? http://www.ebay.com/itm/330932578190?ssPageName=STRK:MEWAX:IT&_trksid=p3984.m1423.l2649


$279? Can that be any good? Not because it is a Korean PLS (I own a Catleap myself and paid almost $500 for it 2 or 3 years ago)... it just seems too cheap for an LED PLS panel. What's the catch?


----------



## Forceman

Quote:


> Originally Posted by *combine1237*
> 
> Thanks Forceman, it doesn't seem to throttle, but 1125 is its limit on stock, and the memory does not like to go above 1320 on stock, probably due to being elipda. My vrm is a little high at 80 for one in furmark. For some reason msi Kombuster causes vrm1 to go to 120 degrees according to gpu-z while 2 stays at 59-60. Is that a little to large of a gap, or is it just me?


VRM1 is always hotter (since it handles the core voltage), but 120C is very hot, probably too hot. FurMark isn't really a good test to use; see what it does in Heaven or Valley instead. I would ramp up the fan speed if the VRMs get that hot, and try to keep them around 80C.
Quote:


> Originally Posted by *chronicfx*
> 
> $279? Can that be any good? Not because it is korean pls because I own a catleap myself and paid almost $500 for it 2 or 3 years ago... It just seems too cheap for an led pls panel. Whats the catch?


Most of the user reports are positive over in the owners thread; I'm very close to pulling the trigger on one today myself.


----------



## MapRef41N93W

Quote:


> Originally Posted by *chronicfx*
> 
> $279? Can that be any good? Not because it is korean pls because I own a catleap myself and paid almost $500 for it 2 or 3 years ago... It just seems too cheap for an led pls panel. Whats the catch?


There is no catch. It's the best monitor I have ever owned and beats out other 600-800 dollar monitors I have seen. The only "catch", I suppose, is that it's a bit annoying having to recreate a custom resolution every time you change AMD drivers, but that never takes more than a minute or two.

Also, Samsung PLS panels suffer from minor backlight bleed, but it's hardly noticeable and easily fixed by putting a piece of electrical tape on the frame (which is pressing against the screen too hard, causing the bleed).


----------



## combine1237

Heaven and Valley keep it below 70; it's only Kombustor that does that to it, not even mining will. That is at max fan speed. Everything else stays below 80.


----------



## Forceman

Quote:


> Originally Posted by *combine1237*
> 
> Heaven and valley let it stay below 70 its only Kombuster that does that too it not even mining will do that to it. That is at max fan speed, I just don't know it's only Kombuster that does that to it. Everything else is below 80.


FurMark and Kombustor are completely unrealistic loads (and probably unsafe too). If it only goes that high when running them, don't worry about it. Just don't run them.


----------



## combine1237

Thanks. My main concern is just being safe during everyday use.


----------



## Forceman

Ughh. That "black screen after the monitor goes to sleep" bug is back with a vengeance in 14.2. Even with ULPS disabled (which seemed to be the cause in 13.12), it has still done it three times in the past two days. Plus two "driver power state failure" blue screens at idle. I may have to roll back and lose Mantle.


----------



## King4x4

Try the tri-x bios.


----------



## Sinisa Glusica

Guys, what is the max safe voltage on a 290X? My Asus R9 290X is watercooled, so temp is not a problem; I just want to know how far I can go with volts. Is 1.4V (1.367 after vdroop) too high for this chip? At that voltage I can run the core at 1180MHz, temps about 49C. Just tell me if the vcore is too high... Thanks!


----------



## Forceman

Quote:


> Originally Posted by *Sinisa Glusica*
> 
> Guys, what is the max safe voltage on a 290X? My Asus R9 290X is watercooled, so temperature is not a problem; I just want to know how far I can go with volts. Is 1.4V (1.367V after Vdroop) too high for this chip? At this voltage I can run the core at 1180MHz with temps at about 49C. Just tell me if that vcore is too high. Thanks!


I think the generally accepted number is 1.35V (steady-state, after Vdroop), but I don't know that there is any science behind that.


----------



## ArchieGriffs

Quote:


> Originally Posted by *chronicfx*
> 
> $279? Can that be any good? Not because it is korean pls because I own a catleap myself and paid almost $500 for it 2 or 3 years ago... It just seems too cheap for an led pls panel. Whats the catch?


Backlight bleed and dead pixels. Backlight bleed is only ever annoying when the screen is black and the room isn't well lit, and even then it isn't terrible. I play Skyrim in the dark, and even in areas that are nearly pitch black I have few issues with the backlight bleed on my Korean PLS monitor. Backlight bleed can be fixed, but you have to take a butter knife or a similar thin metal object, press it against the screen, and then fill the gap with tape.

As far as dead pixels go, it's anywhere from 0-5 dead pixels, each about the width of a hair; the one dead pixel I have took me two months to find, even after running a dead-pixel test.

The issues are very minimal and the screen is absolutely fantastic, not just because it's 1440p but because it's high quality as well. Korean monitors are cheaper because the panels don't pass the quality standard that would let them sell at full price, and since dead pixels can't really be fixed, it's much cheaper and easier for them to sell at a reduced price.

Warranty is kind of sketchy; I've asked Amazon, where I purchased my QNIX 2710, and I haven't heard back from them. Assuming you get an overclockable monitor, you might be able to kill it in a couple of years, but otherwise it's going to last as long as a regular monitor. Not having that warranty really scares me, though. Depending on the site you might be able to pay extra for a warranty, but I'm no expert on it.

If you can live with the warranty issues, the backlight, and a few dead pixels there's no drawback, at least for the QNIX, I have no knowledge on the monitor that was linked.


----------



## MapRef41N93W

Quote:


> Originally Posted by *ArchieGriffs*
> 
> Backlight bleed and dead pixels. Backlight bleed is only ever annoying when the screen is black and the room isn't well lit, and even then it's isn't terrible. I play Skyrim and some areas are nearly pitch black and I play in the dark and I have little issues with the backlight bleed on my Korean PLS monitor. Backlight bleed can be fixed, but you have to take a butter knife or a similar metal thin object and press it against the screen and then fill the gap with tape.
> 
> As far as dead pixels go it's anywhere from 0-5 dead pixels, each pixel about the size of the width of your hair, the 1 dead pixel I have took me 2 months to find even after doing a dead pixel test.
> 
> The issues are very minimal and the screen is absolutely fantastic, not just because it's 1440p, but because it's high quality as well. Korean monitors are cheaper because they don't pass a certain standard which lets them sell at a full price, and you can't really fix dead pixels, so it's much cheaper and easier for them to sell it at a reduced price.
> 
> Warranty is kind of sketchy, I've asked amazon where I purchased my QNIX 2710 and I haven't heard back from them. Assuming you get an overclockable monitor, you might be able to kill it in a couple years, but otherwise it's going to last the same amount of time as a regular monitor, but not having that warranty really scares me. Depending on the site you might be able to pay extra for warranty, but I'm no expert on it.
> 
> If you can live with the warranty issues, the backlight, and a few dead pixels there's no drawback, at least for the QNIX, I have no knowledge on the monitor that was linked.


X-Star is the exact same monitor as Qnix.....Same panel, same PCB, same parts, glossier bezel with a different logo on it.


----------



## chronicfx

Thanks for the explanation, guys. I've owned a Catleap Q270 for several years and am very happy with it. I have even been looking into getting a Yamakasi Spartan 1600p but was not sure whether to spend up for LED or stay with LCD at a more affordable price. But that $270 price tag looked really enticing for 1440p. If it is a decent enough panel, I think people should definitely give it a try.


----------



## MapRef41N93W

Quote:


> Originally Posted by *chronicfx*
> 
> Thanks for the explanation guys. I do own a catleap q270 for several years and am very happy with it. I have even been looking into getting a yamakasi spartan 1600p but was not sure whether to spend up for LED or stay with LCD at a more affordable price. But that $270 price tag looked really enticing for 1440p. If it is a decent enough panel I think people should definitely give it a try.


Don't buy a 1600p Korean monitor. They are all wide gamut and have no sRGB mode meaning colors are really bad for applications that use sRGB (as in games and videos). They also have a nasty AG coating on them.

By the way, LED and LCD are the same exact thing. LCD is the display type (Liquid Crystal) and LED is just the light used in the display.

The X-Star/QNIX are far better than just "decent". They are an $800 panel reduced to that price because they are the leftover A- panels.


----------



## The Storm

Quote:


> Originally Posted by *chronicfx*
> 
> Thanks for the explanation guys. I do own a catleap q270 for several years and am very happy with it. I have even been looking into getting a yamakasi spartan 1600p but was not sure whether to spend up for LED or stay with LCD at a more affordable price. But that $270 price tag looked really enticing for 1440p. If it is a decent enough panel I think people should definitely give it a try.


I have had my X-Star 1440p, which I purchased from dreamseller, for 6 months now. I did *NOT* buy the pixel-perfect version and it is pixel perfect anyway. I paid $279 for it, it's been overclocked to 120Hz from day one, and there hasn't been a single issue. BTW, I purchased the protection plan from SquareTrade just in case.


----------



## ArchieGriffs

Quote:


> Originally Posted by *MapRef41N93W*
> 
> X-Star is the exact same monitor as Qnix.....Same panel, same PCB, same parts, glossier bezel with a different logo on it.


I figured, a lot of them use the exact same panels, and the X-star looks pretty much identical.


----------



## Forceman

Quote:


> Originally Posted by *chronicfx*
> 
> But that $270 price tag looked really enticing for 1440p. If it is a decent enough panel I think people should definitely give it a try.


I think people are. That seller has sold 18 since I started watching the page about 6 hours ago.


----------



## The Storm

Quote:


> Originally Posted by *Forceman*
> 
> I think people are. That seller has sold 18 since I started watching the page about 6 hours ago.


I would love to have two more to go with the one I've got.


----------



## SeanEboy

There's a thread for that...

www.overclock.net/t/1384767/official-the-korean-pls-monitor-club-qnix-x-star/14940#post_21917196


----------



## taem

Quote:


> Originally Posted by *Forceman*
> 
> VRM1 is always hotter (since it's the core voltage) but 120C is very hot, probably too hot. Furmark isn't really a good test to use, see what it does in Heaven or Valley instead, but I would think about ramping up the fan speed if the VRMs get that hot. I'd try to keep them around 80C.
> Most of the user reports are positive over in the owners thread - I'm very close to pulling the trigger on one today myself.


The best real-world heat test I've found is the Metro LL benchmark maxed out on loop. It runs hotter than any game or other bench, but unlike Furmark it isn't beyond the realm of real-world loads. You have to own Metro LL, obviously, but it's a great game and worth having. Steam sells it cheap from time to time.

I have a QNIX 1440p; it's awesome. Get the one with the matte screen (looks much better IMHO) and just the single dual-link DVI input (easiest to OC to 96Hz). Accessorieswholesale is where I got mine, an official eBay vendor for QNIX I believe: free shipping, brand new, zero dead pixels, just a bit of bleed in one corner. Don't even hesitate; the Korean 1440p monitors are the best gaming display deal to be had.
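Since the refresh ceiling on these panels is really a pixel-clock budget over dual-link DVI, a quick sanity check can be sketched like this. The blanking figures below are illustrative reduced-blanking-style values (an assumption), not the exact custom timings QNIX/X-Star overclockers actually use:

```python
# Rough sketch: refresh ceilings on 1440p Korean panels come down to pixel
# clock over dual-link DVI, not "Hz" as such. h_blank/v_blank defaults are
# illustrative reduced-blanking-style values, not the panel's real timings.

def pixel_clock_mhz(h_active, v_active, refresh_hz, h_blank=160, v_blank=41):
    """Total pixels per frame (active + blanking) times refresh, in MHz."""
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

for hz in (60, 96, 120):
    print(f"{hz:>3} Hz -> {pixel_clock_mhz(2560, 1440, hz):6.1f} MHz")
```

Whatever the real timings are, the check is the same: tightening the blanking lowers the result, and the result has to stay under what the link and the panel's board will actually accept.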


----------



## Aussiejuggalo

Right, gonna be joining the red team I think. The green team is getting BS expensive now; even though it may be better in some games, I can't see the point of spending $100-$300 extra for at most 20 more frames.

Just need help picking a card: XFX Radeon R9 290 4GB Double Dissipation, Sapphire Radeon R9 290 Tri-X 4GB, MSI Radeon R9 290 Gaming 4GB, or Sapphire Radeon R9 290 Tri-X OC 4GB.

I'm still not 100% sure if I'll stick it under water yet; gonna wait and see if I'm happy with the performance.

Then again, anything will be better than my broken 670.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Right, gonna be joining the red team I think, green team is getting BS expensive now, even tho it may be better in some games I cant see the point of spending $100 - $300 extra for at most 20 more frames
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just need help picking a card, XFX Radeon R9 290 4GB Double Dissipation, Sapphire Radeon R9 290 Tri-X 4GB, MSI Radeon R9 290 Gaming 4GB or Sapphire Radeon R9 290 Tri-X OC 4GB
> 
> I'm still not 100% sure if I'll stick it underwater yet gonna wait and see if I'm happy with performance
> 
> 
> 
> 
> 
> 
> 
> Then again anything will be better then my broken 670


I see you are ordering from PCCG, well then, lets go through the list.

The XFX DD cards are great looking and they cool the core well, but the VRM cooling leaves a lot to be desired. Also, stickers over the screws mean taking the cooler off to put it under water is a PITA.

The Tri-X cards are very good, better than most for core and VRM cooling; the colour scheme may throw you off though.

The MSI Gaming cards have had mixed results: some have had a very good experience with the cooler, others not so much.

I didn't see you list the PowerColor PCS+? It's also a very good card, equal to the Tri-X in cooling; the only downside from what I see is that it's a 2.5-slot card as opposed to the rest, which are 2-slot.

I can speak from experience with the XFX cards as well: even if you don't plan on overclocking them, I've seen them hit 100C on the VRMs at stock with some high ambient temps.


----------



## Red1776

I can speak about the MSI R290X 4GB gaming.

I own four of them and they have all been great OCers; they run in the low 70s while gaming and VRM temps have been great.

They are going under water, but I have had a great experience with them so far with the stock coolers.

That's my 2 cents  good luck.


----------



## ghostly44

Still running the i5 2500K, but got a PowerColor PCS+ R9 290, stock cooling.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I see you are ordering from PCCG, well then, lets go through the list.
> 
> The XFX DD cards are great looking and they cool the core well but vrm cooling leaves alot to be desired. also, stickers on the screws means taking the cooler to put it under water is a PITA.
> 
> The Tri-X cards are very good, better than most for core and vrm cooling, colour scheme may throw you off though.
> 
> The MSI gaming cards have had mixed results, some have had a very good experience with the cooler, others not so much.
> 
> I didn't see you list the Powercolor PCS+? it's also a very good card, equal to the Tri-X in cooling the only downside from what i see is it's a 2.5 slot card opposed to the rest that are 2 slot.
> 
> i can speak from experience with the XFX cards as well, even if you don't plan on overclocking them i've seen them hit 100c on the vrm's at stock with some high ambient temps.


Yeah, PCCG are really the only shop I buy from anymore.

Well, that's crap about the XFX card and VRM cooling, and 100C is a little too hot for me.

Tri-X: the colour scheme does throw me off a little, but if they perform well I can always cover most of the yellow with carbon fibre vinyl.

I heard some MSI cards can overclock well but others can't... they look like really nice coolers though, and MSI seem to have stepped up their game.

As for PowerColor, PCCG don't have it listed under the 290; the only PowerColor is the 290X, which I won't buy, too expensive.
Quote:


> Originally Posted by *Red1776*
> 
> I can speak about the MSI R290X 4GB gaming.
> I own 4 of them and they all have been great OC'ers , and run in the low 70's while gaming and VRM temps have been great.
> They are going under water, but I have had a great experience with them so far with the stock coolers.
> That's my 2 cents
> 
> 
> 
> 
> 
> 
> 
> good luck.
> 
> 
> Spoiler: Warning: Spoiler!


Ah, cool. How high have you been able to clock them on the stock coolers?

There is also the Gigabyte Radeon R9 290 OC 4GB BF4 bundle, but it's the BF4 bundle, and from my experience Gigabyte cards don't seem to OC well.


----------



## Red1776

I have gotten them to 1190-1210 so far; I'm going to get them under water this weekend and push for more. The dual 10cm fans are quiet as well. Like you, I have not had much luck with Gigabyte cards.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Yeah PCCG are really the only shop I buy from anymore
> 
> Well thats crap about XFX card and vrm cooling adn 100c is a little to hot for me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Tri-X the colour scheme does throw me off a little but if they perform well I can always cover most of the yellow with carbon fiber vinyl
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I heard MSI cards can overclock well but others cant... they look like really nice coolers tho and MSI seem to have stepped up there game
> 
> As for Powercolor, PCCG dont have it listed under the 290, the only powercolor is the 290x which I wont buy, to expensive
> Ah cool
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How high have you been able to clock them on the stock coolers?
> 
> There is also the Gigabyte Radeon R9 290 OC 4GB BF4 Bundle but its the BF4 bundle
> 
> 
> 
> 
> 
> 
> 
> and from my experience Gigabyte cards dont seem to OC well


Strange, the Powercolor cards were in stock a couple of days ago.

Not sure about the Giga cards, hopefully someone that has one might chime in.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Red1776*
> 
> I have gotten them to 1190-1210 so far, going to get them under water this weekend and push more. The dual 10cm fans are quiet as well. Like you, I have not had much luck with Gigabyte cards.


That's pretty good on a stock cooler, and people say AMD can't overclock well.

Yeah, Gigabyte cards are kinda crappy compared to other companies.

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Strange, the Powercolor cards were in stock a couple of days ago.
> 
> Not sure about the Giga cards, hopefully someone that has one might chime in.


Yeah... bit weird, but PCCG had MSI Lightning 290Xs in stock for a day and now they're not even on the website.


----------



## kizwan

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I see you are ordering from PCCG, well then, lets go through the list.
> 
> The XFX DD cards are great looking and they cool the core well but vrm cooling leaves alot to be desired. also, stickers on the screws means taking the cooler to put it under water is a PITA.
> 
> The Tri-X cards are very good, better than most for core and vrm cooling, colour scheme may throw you off though.
> 
> The MSI gaming cards have had mixed results, some have had a very good experience with the cooler, others not so much.
> 
> I didn't see you list the Powercolor PCS+? it's also a very good card, equal to the Tri-X in cooling the only downside from what i see is it's a 2.5 slot card opposed to the rest that are 2 slot.
> 
> i can speak from experience with the XFX cards as well, even if you don't plan on overclocking them i've seen them hit 100c on the vrm's at stock with some high ambient temps.
> 
> 
> 
> 
> 
> 
> 
> Yeah PCCG are really the only shop I buy from anymore
> 
> Well thats crap about XFX card and vrm cooling adn 100c is a little to hot for me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Tri-X the colour scheme does throw me off a little but if they perform well I can always cover most of the yellow with carbon fiber vinyl
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I heard MSI cards can overclock well but others cant... they look like really nice coolers tho and MSI seem to have stepped up there game
> 
> As for Powercolor, PCCG dont have it listed under the 290, the only powercolor is the 290x which I wont buy, to expensive

If I were you, it would be between the Tri-X and the PowerColor PCS+. If I'm not mistaken, Tri-X cards come with Hynix memory. Not that Elpida memory is weak; one of my cards has Elpida memory and I can overclock it to at least 1600MHz.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Thats pretty good on a stock cooler
> 
> 
> 
> 
> 
> 
> 
> and people say AMD cant overclock well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah Gigabyte cards are kinda crappy compared to other companys
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah... bit weird, but PCCG had MIS Lightning 290xs in stock for a day now there not even on the website


I won't be getting another Giga card, I know that much.

The Lightning sold out in one day, and they probably won't be getting new stock in soon, so it was easier to remove it from the site; the same thing happened with the Tri-X cards as well.

Guessing that's what happened with the PCS+ cards too.

As for AMD cards not overclocking well, I've hit 1250/1550 on my cards; waiting till I get them under water to push for 1300.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *kizwan*
> 
> If I were you, it would be between the Tri-X and the PowerColor PCS+. If I'm not mistaken, Tri-X cards come with Hynix memory. Not that Elpida memory is weak; one of my cards has Elpida memory and I can overclock it to at least 1600MHz.


So if I got a Tri-X, should I get the stock version or the already-overclocked one?
Quote:


> Originally Posted by *Sgt Bilko*
> 
> I won't be getting another Giga card i know that much.
> 
> The Lightning sold out in one day and they probably won't be getting any new stock in soon so easier to remove it from the site, same thing happened with the Tri-X Cards as well.
> 
> Guessing thats what's happened with the PCS+ cards as well.
> 
> As for AMD cards not overclocking well, I've hit 1250/1550 on my cards, waiting till i get them under water to push for 1300


Yeah, Giga cards are OK at stock but... meh lol

Ah, that explains it lol

That's a nice clock over stock; I remember OC3D did a review of the 290X Lightning and got 1600 or something out of it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> So if I got a Tri-X should I get the stock version or the already OC one


Shouldn't really matter, but I'd go for the OC version just to have a higher stock clock if anything.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Shouldn't really matter but i'd go for the OC version just to have a higher stock clock if anything.


Ah cool, OC version it is then.

Just looked at a few reviews of the OC one... once they overclocked it more, it beat out a 780 Ti.


----------



## Pandora's Box

Quote:


> Originally Posted by *Red1776*
> 
> I have gotten them to 1190-1210 so far, going to get them under water this weekend and push more. The dual 10cm fans are quiet as well. Like you, I have not had much luck with Gigabyte cards.


How's the noise, and how are the temps with four 290s in one system? Especially since that heat isn't being pushed out of the case.

Got any pics?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Ah cool, OC version it is then
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just looked a few reviews of the OC one... once they overclocked it more it beat out a 780Ti


It would beat out a 780 Ti at stock, yeah; once the Ti gets overclocked, then it becomes interesting.

There is also the $300 AUD price difference as well.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It would beat out a 780Ti at stock yeah, once the Ti gets overclocked then it becomes interesting
> 
> 
> 
> 
> 
> 
> 
> 
> 
> There is also the $300 AUD price difference as well


That would be very interesting.

Yeah, that too; one of the biggest reasons I'm moving away from Nvidia is the BS price markup here.


----------



## kizwan

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> It would beat out a 780Ti at stock yeah, once the Ti gets overclocked then it becomes interesting
> 
> 
> There is also the $300 AUD price difference as well
> 
> that would be very interesting
> 
> *Yeah that to, one of the biggest reasons why I'm moving away from Nvidia is the BS price mark up here*

Yeah, they're too expensive. This is why I stick with the red team. If they were priced like AMD's, I would have gone green. Sorry, Earth!


----------



## Aussiejuggalo

Quote:


> Originally Posted by *kizwan*
> 
> Yeah, they're too expensive. This is why I stick with the red team. *If they were priced like AMD's, I would have gone green.* Sorry, Earth!


Pretty much this.

You gotta hand it to Nvidia though; they did so well marketing the Titan that when the 780s came out they seemed cheap in comparison.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Pretty much this
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You gotta hand it to Nvidia tho they did so well with marketing the Titan that when the 780's came out they seemed so cheap in comparison


The 780s had a redonkulous price when they came out. They came down after the 290/290X launched though, so the 290 is the 780's competitor and the 290X is the competitor for the 780 Ti.


----------



## mojobear

Hey guys - quick question.

I've been OCing my Trifire 290s, and I seem to get the best results with a lower power limit, like +35% rather than maxed out at +50%. Could this be because the cards are throttling and I'm not actually getting the clocks I set in TriXX? When I use GPU-Z I don't see any throttling of the core frequency on any of my three cards; they are also under water.

If GPU-Z does not show frequency fluctuations, does that mean the GPU is not throttling? Or is there some micro-throttling that these programs cannot detect?

Thanks!
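One way to answer that question is to export a GPU-Z sensor log and scan it for clock dips instead of eyeballing the graph, since brief dips are easy to miss on screen. A minimal sketch, assuming a CSV-style log with a "GPU Clock [MHz]" column (the exact header varies by GPU-Z version, so check your own log):

```python
# Hypothetical helper: scan a GPU-Z sensor log (exported as CSV) for core
# clock samples below the clock you set, to catch throttling the on-screen
# graph makes easy to miss. The "GPU Clock [MHz]" column name is an
# assumption; check the header of your own log.
import csv

def throttle_events(path, target_mhz, tolerance_mhz=10):
    """Return (row_number, clock_mhz) pairs where the core ran below target."""
    dips = []
    with open(path, newline="") as f:
        for row_num, row in enumerate(csv.DictReader(f), start=2):
            clock = float(row["GPU Clock [MHz]"])
            # Skip idle (2D) samples so desktop time isn't flagged as a dip.
            if 500 < clock < target_mhz - tolerance_mhz:
                dips.append((row_num, clock))
    return dips

# dips = throttle_events("gpuz_sensor_log.csv", target_mhz=1100)
# print(len(dips), "throttled samples")
```

If the list comes back empty over a long Heaven/Valley run, sustained throttling is unlikely; sub-sample "micro" dips between polls would still be invisible to any polling logger.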


----------



## Coree

The upcoming Asus Matrix R9 290X's heatsink doesn't look that promising...

As you can see, only 3 of the 5 heatpipes will make contact with the core itself; it was the same problem with the DC II 290s.


----------



## MapRef41N93W

So I finally got my cards to work with mining, but wow, are they running hot. My Tri-X was running at 75C before, when it was the only card mining; now both cards are running at about 90C and downclocking at times.


----------



## Forceman

So apparently Afterburner still has the bug where it doesn't apply the voltage offset after waking from sleep (or after the monitor sleeps)? I just did some testing at 1100/+100 and was surprised to see my VDDC was at 1.15V. I changed AB to +99 and it popped right up to 1.25V. I thought that had been fixed in 13.12.

Surprisingly it seemed to run just fine at 1.15V. Now I need to go re-evaluate my overclock.

I'm thinking of using BF4 in first-person spectator mode to test with, anyone else tried that?


----------



## aaroc

Quote:


> Originally Posted by *goodenough88*
> 
> I saw some old benchmarks for a 7680x1440 resolution 2-way R9 290X. (http://www.tomshardware.com/reviews/radeon-r9-290x-hawaii-review,3650-27.html)
> 
> Does anyone know if adding a 3rd R9 290X to a 7680x1440 setup will make a massive difference? Or is there very little gain to be had from adding a 3rd R9 290X?


Depends on the game. It scaled very well for Dirt 3, Dirt Showdown, BioShock Infinite, Sleeping Dogs, all settings on ultra, etc., but it doesn't add anything to games like F1 2013/F1 2012. Initially I bought two R9 290s for 7680x1440, but I had to lower the settings to get an acceptable frame rate; for me that's 30fps at all times, and better if it's more than 60. F1 2013/2012 run at 40-50fps with all settings on ultra at 7680x1440 on one R9 290.
I'm waiting for WC parts, as they run hot with no space between the three cards.


----------



## Aonex

Does anyone have any experience or read anything about the Logisys VC6006 VGA Cooler as a replacement for the stock blower cooler?

http://www.newegg.com/Product/Product.aspx?Item=N82E16835999052


----------



## Abyssic

My 290X Tri-X is about to drive me crazy.

One of the fans is making noises with each rotation, as if it were grinding against something. This only occurs at nighttime, due to the lower temps I guess. It happens from minimal fan speed up to around 60%, then it simply stops.
Do you think I can claim my warranty for something like this?


----------



## Paul17041993

Quote:


> Originally Posted by *Coree*
> 
> The upcoming Asus Matrix R9 290X's heatsink doesn't look that promising...
> 
> As you can see, 3 of 5 heatpipes will make contact with the core itself.. the same problem was with the DC II 290's.


Gutless 2-slot; well, looks like the Lightning will have next to no competition...


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> FPS much smoother with 14.1 beta driver. So can't blame Frame Pacing IMO.
> 
> 
> 
> Actually the fps and frametimes have been excellent for me with the 14.2 driver, i posted this a little earlier but i'll do it again.

I found out what was causing Mantle to fail to perform properly: I forgot to disable CPU power management (C3/C6/C7). Apparently those states affect BF4 performance, just like core parking. Frame time and FPS are much better this time with Mantle: fewer frame-time spikes and less FPS drop compared to DirectX.

*BF4, Zavod 311 64 player map (half player), 1080p Ultra settings (AA & AA POST OFF), 200% resolution scale, 2 x 290's Crossfire*

*Frame Time - DirectX*


*Frame Time - Mantle*


*FPS - DirectX*
MIN :25.52974215
MAX :154.5595054
AVERAGE :79.01253821


*FPS - Mantle*
MIN :23.7586125
MAX :190.4761905
AVERAGE :90.6788262
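For anyone wanting to reproduce this comparison from their own frametime dump (e.g. BF4's PerfOverlay log), the MIN/MAX/AVERAGE figures are just arithmetic on the per-frame times. A minimal sketch with invented sample values:

```python
# Minimal sketch of how MIN/MAX/AVERAGE FPS figures like the ones above fall
# out of a frametime log (per-frame times in milliseconds). The sample
# values here are made up for illustration, not taken from the BF4 log.

def fps_stats(frame_times_ms):
    """Min/max/average FPS from per-frame times in ms.

    Min FPS comes from the slowest frame, max from the fastest, and the
    average weights by time: total frames over total elapsed time.
    """
    if not frame_times_ms:
        raise ValueError("no frame time samples")
    per_frame_fps = [1000.0 / ft for ft in frame_times_ms]
    avg = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
    return min(per_frame_fps), max(per_frame_fps), avg

lo, hi, avg = fps_stats([11.0, 5.25, 39.2, 12.5, 9.8])
print(f"MIN {lo:.2f}  MAX {hi:.2f}  AVERAGE {avg:.2f}")
```

Note the average is time-weighted, which is why it can differ from naively averaging instantaneous FPS samples.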


----------



## axiumone

I'm starting to like these cards.

18,447








http://www.3dmark.com/fs/1840179


----------



## TommyGunn123

Welp looks like Asus is fighting back, but will the pipes touch? I guess we'll find out next time on the Asus facebook page!


----------



## Paul17041993

Quote:


> Originally Posted by *TommyGunn123*
> 
> Welp looks like Asus is fighting back, *but will the pipes touch*? I guess we'll find out next time on the Asus facebook page!


Not without a plate; the pic of the cooler was already linked here.

And that PCB looks awful; what's with all the blank space...? And where's the memory and memory-VRM cooling...? It only has 2x 8-pin power too... somehow I don't think this will end up any better than the 79x0 if that picture is actually accurate...


----------



## TommyGunn123

Quote:


> Originally Posted by *Paul17041993*
> 
> not without a plate, the pic of the cooler was already linked here.
> 
> and that PCB looks awful, whats with all the blank space...? and wheres the memory and mem. VRM cooling...? it only has 2*8pin power too... somehow I don't think this will end up any better then the 79x0 if that picture is actually accurate...


Yeah, I just hope for their sake (and ours, really) that they still have work to do on the heatsink arrangement for the 290X, because their reputation has taken a hit this gen :/


----------



## Paul17041993

Quote:


> Originally Posted by *Paul17041993*
> 
> it only has 2*8pin power too...


Oh wait, I just noticed the extra Molex connector... why would they use that, and why put it down there...? That's very odd...


----------



## TommyGunn123

Quote:


> Originally Posted by *Paul17041993*
> 
> oh wait I just noticed the extra molex connector... why would they use that and why put it down there...? that's very odd...


'Cos molex is obviously super power... or something


----------



## Aussiejuggalo

Quote:


> Originally Posted by *TommyGunn123*


Anyone else notice the height of the PCB on this card?

Edit: also, what kind of performance should I be looking at with a Tri-X OC over my current 670? And what kind of overclock can I expect to achieve without much effort?


----------



## Tonza

What is the safe voltage for these at 24/7 usage? I'm getting a Sapphire 290 Tri-X today, and a second later this month as well. The VRM seems to be the same as on the reference 7970, which was very good. At least the Afterburner max of +100mV should be fine?


----------



## Coree

OC depends on the silicon lottery. That small VRM strip will not fare well with high volts...


----------



## TommyGunn123

Quote:


> Originally Posted by *Tonza*
> 
> What is the safe voltage for these @ 24/7 usage? Getting Sapphire 290 Tri-X today, and later this month second aswell. VRM seems to be same as on 7970 reference which was very good. Atleast the afterburner max +100mv should be fine?


Most would probably say +100mV is safe for the long term; much more would probably affect lifespan. It also depends on temps: you want to keep the VRMs around 80C or lower for efficiency and longevity.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> I found out what was causing Mantle to fail to perform properly: I forgot to disable CPU power management (C3/C6/C7). Apparently those states affect BF4 performance, just like core parking. Frame time and FPS are much better this time with Mantle: fewer frame-time spikes and less FPS drop compared to DirectX.
> 
> *BF4, Zavod 311 64 player map (half player), 1080p Ultra settings (AA & AA POST OFF), 200% resolution scale, 2 x 290's Crossfire*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *Frame Time - DirectX*
> 
> 
> *Frame Time - Mantle*
> 
> 
> *FPS - DirectX*
> MIN :25.52974215
> MAX :154.5595054
> AVERAGE :79.01253821
> 
> 
> *FPS - Mantle*
> MIN :23.7586125
> MAX :190.4761905
> AVERAGE :90.6788262


That looks a bit more like it!

Glad to see it's working for you, you should post that over here: http://www.overclock.net/t/1429303/amd-mantle-discussion-thread/0_40

Might help a few other Intel users get Mantle running a bit smoother.


----------



## Sgt Bilko

Quote:


> Originally Posted by *TommyGunn123*
> 
> Welp looks like Asus is fighting back, but will the pipes touch? I guess we'll find out next time on the Asus facebook page!


I've seen that... looks bad IMO; the heatpipes will barely touch, meaning this will be an H2O/LN2-only card.

And all those extra power phases just make for a blank piece of PCB. The Lightning is looking like it will have no competition for Team AMD.

EDIT: here's some proof for anyone interested:

R9 290x Direct CU II Cooler contact with Die



R9 290x Matrix Heatsink:



Same Heatsink and same contact points.


----------



## Coree

Seems that the heatsink is the same as the DC II, but with a different color lol


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That's a nice lookin place, congrats
> 
> 
> 
> 
> 
> 
> 
> 
> 
> well, nothing can really save my net, I'm on satellite so ping is always 700 ms+; it's just the "speed" I'm missing.


Thanks maaaate, gonna be here for a very long time cause I've finally separated my bedroom from my bench room (one room mancave/bench, the other bed only)








Yesterday I was pulling 1.2 MB per sec on mobile broadband







Couldn't believe it, faster than cable at my old joint!


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Thanks maaaate gonna be here for a very long time cause ive finally separated my bedroom from my bench room ( one room mancave / bench other bed only )
> 
> 
> 
> 
> 
> 
> 
> 
> Yesterday I was pullin 1.2mb per sec on mobile broadband
> 
> 
> 
> 
> 
> 
> 
> Coudnt believe it faster than cable at my old joint !


Mancave FTW!!

My Wife wants a mancave.....wants it filled with a Home Cinema surrounded by Marvel and Mass Effect gear


----------



## MrWhiteRX7

I put all three of my 290s on the Tri-X BIOS. So far so good; gonna do some testing after work.

What are people getting with stock clocks triple 290/X in heaven 4.0?

I'm slowly going back through the thread looking for results lol


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Mancave FTW!!
> 
> My Wife wants a mancave.....wants it filled with a Home Cinema surrounded by Marvel and Mass Effect gear


Sounds like she wants a 'sexytime' grotto LoooooooL











Never had sooooo much room to bench and game









Just got paid time for pre-mixers









I be back ..........


----------



## Paul17041993

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I seen that.....looks bad imo, heatpipes will barely touch meaning this will be a H20/LN2 only card.
> 
> And all them extra power phases just make for a blank piece of PCB, *Lightning is looking like it will have no competition* for Team AMD
> 
> EDIT: here some proof for anyone interested:
> 
> R9 290x Direct CU II Cooler contact with Die
> 
> 
> 
> R9 290x Matrix Heatsink:
> 
> 
> 
> Same Heatsink and same contact points.


Exactly what I said, and damn, I knew the heatsink looked too identical; it IS the SAME cooler, just with some stupid black (ceramic) paint...


----------



## Sgt Bilko

Quote:


> Originally Posted by *Paul17041993*
> 
> exactly what I said, and damn I knew the heatsink looked too identical, it IS the SAME cooler, just with some stupid black (ceramic) paint...


So you did; I didn't see your post before.

And yes, sadly, it seems to be the same cooler. I've really lost faith in Asus now; they cheaped out. It's the same cooler as on the 780 Ti Matrix: GK110 has a bigger die, so that's what they designed for, and they never bothered to change it for the Hawaii cards.


----------



## Coree

Yeah.. the 3-slot coolers on the HD 79XX were very good though; now they've just cheaped out on the high-end coolers.


----------



## Mercy4You

Sent my ASUS R9 290X back to ASUS for RMA. It's been 8 weeks now, and still nothing....

What's going on with ASUS????


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> So you did, didn't see your post before.
> 
> And yes sadly, same cooler it seems, I've really lost faith in Asus now, they cheaped out. It's the same cooler for the 780 Ti Matrix. GK110 has a bigger die so that's what they designed for and never bothered to change it for the Hawaii cards.
> Quote:
> 
> 
> 
> Originally Posted by *Coree*
> 
> Yeah.. The 3 slot coolers on the HD79XX were very good though, now they've just cheaped on the high end coolers.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mercy4You*
> 
> Sent my ASUS R9 290X back to ASUS for RMA. It's 8 weeks now, and still nothing....
> 
> What's going on with ASUS????
> 

They have slipped big time with red things

8 weeks for an RMA and not a peep; that's disgusting


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> They have slipped big time with red things
> 
> 8 weeks for rma and not a peep that's disgusting


They have yeah, sad really.

8 Weeks!?!?!


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> They have yeah, sad really.
> 
> 8 Weeks!?!?!


At least here in OZ we deal with the seller and not the maker ......... which has a different set of pros and cons


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> At least here in OZ we deal with the seller and not the maker ......... which has a different set of pros and cons


True, PCCG took a while (4-5 weeks) to process my Sapphire 290X, but they still did even though they knew I removed the cooler


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> True, PCCG took a while (4-5 weeks) to process my Sapphire 290x but they still did even though they knew i removed the cooler


I try to get all of my hardware locally to avoid dramas









And you're lucky they did, cause MSI is the only one that allows you to do that without voiding the warranty








Mind you, I don't use MSI cause I couldn't get any at the time











Check out my CF 290 temps, A/C on full blast straight onto the case ..... no airbending.........


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> They have yeah, sad really.
> 
> 8 Weeks!?!?!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> At least here in OZ we deal with the seller and not the maker ......... which has a different set of pros and cons

In Malaysia, we either RMA through the seller or bring it ourselves to Asus service center.

nice temps!







My comp also right next to A/C.


----------



## SimonKaz

Is 13.12 still the best driver?


----------



## BradleyW

Quote:


> Originally Posted by *SimonKaz*
> 
> Is 13.12 still the best driver?


14.2 Beta seems great for many people!


----------



## SimonKaz

I guess I'll give it a shot then. Was it great for gaming or overclocking?


----------



## bhardy1185

Been having fits with my card. After doing several tests, my temps were not where I wanted them. Temps would shoot up to 95 degrees pretty quickly and would stay there unless I put the fans to 55% or higher, which was pretty loud. So I got a buddy to take the stock cooler off and reapply TIM. He put some AS5 on the chip and a little on the cooler and put the cooler back on. Took the card back to the house and fired it up. After about 3 seconds of any benchmark software/video game, the computer screens shut off and go black and the card's fan kicks up to 100%. The last reading I saw on HWMonitor was 86 degrees. Is this the card overheating? Anybody have similar problems? I was originally on 14.2 but reverted back to the "good" drivers. I did a DDU sweep and clean install. This happens with Afterburner installed and not installed. So everything I am seeing is pointing to having to take the cooler back off and reseat it? Does this sound about right?


----------



## rdr09

Quote:


> Originally Posted by *bhardy1185*
> 
> Been having fits with my card. After doing several tests, my temps were not where I wanted them. Temps would shoot up to 95 degrees pretty quickly and would stay there unless I put fans to 55% or higher which was pretty loud. So I got a buddy to take the stock cooler off and reapply TIM. He put some AS5 on the chip and a little on the cooler and put the cooler back on. Took the card back to the house and fired it up. After about 3 seconds of any benchmark software/video game, the computer screens shut off and go black and the cards fan kicks up to 100%. The last reading I saw on HWmonitor was 86 degrees. Is this the card overheating? Anybody have similar problems? I was originally on 14.2 but reverted back to the "good" drivers. I did a DDU sweep and clean install. This happens with Afterburner installed and not installed. So everything I am seeing is pointing to having to take the cooler back off and reseating? Does this sound about right?


I think AS5 is capacitive. It can create shorts and kill components. I'd advise reapplying some other paste like Gelid Extreme. Clean the core and its surroundings. GL!


----------



## BradleyW

AS5 is no longer the standard; it hasn't been for the past 2-3 years at least. There are far better thermal pastes out there.


----------



## JordanTr

Good day, guys. Last night I decided to flash my stock Sapphire R9 290 (now cooled by a Gelid Icy Vision rad + 2x be quiet! Shadow Wings 92 mm fans). I flashed it to the PowerColor PCS+ BIOS, so my clocks changed from 947/1250 to 1040/1350 with +50 mV added (by the BIOS itself). To my surprise, max temps went from 84 to 80 degrees and I don't get black screens anymore. Before, on the Sapphire BIOS, I would get them while playing Crysis 2 if I passed 1300 on the memory. After the flash, no more; I overvolted the stock BIOS to no avail. It looks like the PCS+ BIOS is somehow more stable when overclocked.


----------



## phallacy

I'm really considering flashing one of my ref 290X BIOSes, because the date says 10/14/13, version 15.039. It seems to be one of the very first ones, which explains its poor performance relative to my other cards. Is there a tutorial anywhere for doing it in a multi-card setup?


----------



## MrWhiteRX7

You flash multi-card the same as single. Do each one at a time...

atiflash.exe -f -p 0 x.rom

then

atiflash.exe -f -p 1 x.rom

and so on
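The pattern above (same ROM, incrementing adapter index) can be sketched as a small helper. This is a hypothetical illustration only: it just builds and prints the command lines in the style MrWhiteRX7 describes, since atiflash itself has to be run from the actual flashing environment.

```python
# Hypothetical helper: build one "atiflash -f -p <adapter> <rom>" command
# per card, following the per-adapter pattern described above. It does
# not run anything; it only constructs the command lists.
def atiflash_commands(rom_path, num_adapters):
    """Return an atiflash command (as an argv list) for each adapter index."""
    return [
        ["atiflash.exe", "-f", "-p", str(adapter), rom_path]
        for adapter in range(num_adapters)
    ]

# Example: three cards (adapters 0, 1, 2), all flashed with the same ROM.
for cmd in atiflash_commands("x.rom", 3):
    print(" ".join(cmd))
# atiflash.exe -f -p 0 x.rom
# atiflash.exe -f -p 1 x.rom
# atiflash.exe -f -p 2 x.rom
```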


----------



## axiumone

Quote:


> Originally Posted by *phallacy*
> 
> I'm really considering flashing one of my ref 290x bios because the date says 10/14/13 version 15.039. It seems to be one of the very first ones which explains it's poor performance relative to my other cards. Is there a tutorial anywhere for doing it in a multi card setup ?


I flashed my visiontek cards that came with 15.039 to the latest powercolor 290 pcs+ bios and on water, it hasn't made any difference at all for me. My limits and stability haven't changed at all. It may make a difference on air.


----------



## JordanTr

Quote:


> Originally Posted by *phallacy*
> 
> I'm really considering flashing one of my ref 290x bios because the date says 10/14/13 version 15.039. It seems to be one of the very first ones which explains it's poor performance relative to my other cards. Is there a tutorial anywhere for doing it in a multi card setup ?


There is a tutorial in the "unlock R9 290 -> R9 290X" thread. Just download the BIOS you want from TechPowerUp and rename it to a filename of up to 7 characters. Everything is explained step by step and all the files are given.


----------



## TheRoot

http://videocardz.com/49911/sapphire-shows-radeon-r9-290x-toxic-8gb-memory
The Toxic is coming


----------



## MrWhiteRX7

I ended up going back to the factory BIOS on all 3 cards due to a couple of weird crashes on the Tri-X BIOS that I'd never seen before. Might try the PCS+ tonight. I like having a BIOS that sets the card over 1000 MHz by default. I need to just edit my own BIOS lol


----------



## Germanian

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I ended up going back to my factory bios on all 3 cards due to a couple weird crashes on the Tri-x bios that I've never seen before. Might try the PCS+ tonight. I like having a bios that sets the card over 1000mhz default. I need to just edit my own bios lol


Be careful: if you set 1000 MHz as the default in your BIOS, you might need to increase your core voltage slightly as well, maybe like 10-20 mV. Otherwise your computer might start crashing every time you see the Windows logo.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Germanian*
> 
> be careful if you set the 1000mhz as default in your bios you might need to increase your core voltage slightly as well. Maybe like 10-20mv. Otherwise your computer might start crashing every time you see the windows logo.


I've tested them all up to 1100 MHz at stock volts, but if I ever clock them up in games it's usually just 1000-1050 max. With 3 of these there's no point in OCing at all really lol.


----------



## kizwan

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I ended up going back to my factory bios on all 3 cards due to a couple weird crashes on the Tri-x bios that I've never seen before. Might try the PCS+ tonight. I like having a bios that sets the card over 1000mhz default. I need to just edit my own bios lol


What kind of weird crashes?


----------



## taem

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I ended up going back to my factory bios on all 3 cards due to a couple weird crashes on the Tri-x bios that I've never seen before. Might try the PCS+ tonight. I like having a bios that sets the card over 1000mhz default. I need to just edit my own bios lol


Great thing about the PCS+ BIOS is the default +50 mV vddc adjust. It allows a bit of room for overclocking using CCC instead of Afterburner or Trixx, which are both buggy IMHO. CCC is rock solid for me; Afterburner and especially Trixx give me little glitches, like loss of signal to the display at boot, etc.


----------



## Paul17041993

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> They have slipped big time with red things
> 
> 8 weeks for rma and not a peep that's disgusting


Quote:


> Originally Posted by *Sgt Bilko*
> 
> They have yeah, sad really.
> 
> 8 Weeks!?!?!


Hate to mention it, but my 7970 DCII TOP had 4 months of total RMA time combined, for only 2 RMAs...


----------



## MrWhiteRX7

Quote:


> Originally Posted by *kizwan*
> 
> What kind of weird crashes?


The monitor splits up into almost 2 sections of color: the left side like a checkered red and black, and the right side a white static look. Never seen anything like it.


----------



## Coree

Speaking of the 7970 DC II, the core and copper pipes have nice contact, thanks to the plate:

I bet that the 7970 cooler would fare better than the re-painted Matrix R9 290X heatsink (R9 290X DC II)


----------



## Roy360

Is the power reported by gpuz accurate?

According to it, my R9 290 is only using 170 W at full load,

multiplying (VDDC current * VDDC).
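The multiplication being described is just P = V × I on the core rail. A minimal sketch, with the figures below chosen only to illustrate how a ~170 W reading could arise (note this covers only the VDDC rail, not memory or aux power, so wall draw will be higher):

```python
# Sketch of the GPU-Z-style core-power estimate: core voltage (VDDC)
# multiplied by the reported VDDC current. Covers the core rail only.
def core_power_watts(vddc_volts, vddc_current_amps):
    return vddc_volts * vddc_current_amps

# Illustrative numbers: 1.20 V at ~141.7 A works out to roughly 170 W.
print(round(core_power_watts(1.20, 141.7), 1))  # ~170 W
```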


----------



## phallacy

http://www.amazon.com/gp/product/B00IEZGWI2

Saw this in another thread. I'm really tempted to get either this or the Asus one that's supposed to be released in the same time frame. This Samsung definitely looks sleeker, and I think I can sacrifice 1440p @ 110 Hz for 4K @ 60 Hz. They did market the 290X as a 4K card, after all.









What do you guys think?


----------



## taem

Quote:


> Originally Posted by *Roy360*
> 
> Is the power reported by gpuz accurate?
> 
> according to it, my R9 290 is only using 170W on full load.
> 
> multiplying (VDDC current*VDDC)


That's about what I get with vddc at 0 adjust, 175w or so, on my Powercolor. The Sapphire Tri-X at the same vddc and clocks draws way more and tops 200w by a bit. And gpu z readout on wattage accords with hwinfo and my wall meter so I would assume it's accurate. Doesn't gpu z multiply vddc current and vddc for you already, in the Power In/Out fields?
Quote:


> Originally Posted by *phallacy*
> 
> http://www.amazon.com/gp/product/B00IEZGWI2
> 
> Saw this in another thread. I'm really tempted to get either this or the Asus one that's supposed to be released at the same time frame. This samsung definitely looks sleeker and I think I can sacrifice 1440p @ 110z for 4k @ 60hz. They did market the 290x as a 4k card after all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What do you guys think?


What kind of settings/framerates can a 290X achieve at 4K? Because I find 290 crossfire can't max out 1440p @ 60; I have to lower settings.


----------



## phallacy

Quote:


> Originally Posted by *taem*
> 
> That's about what I get with vddc at 0 adjust, 175w or so, on my Powercolor. The Sapphire Tri-X at the same vddc and clocks draws way more and tops 200w by a bit. And gpu z readout on wattage accords with hwinfo and my wall meter so I would assume it's accurate. Doesn't gpu z multiply vddc current and vddc for you already, in the Power In/Out fields?
> What kind of settings/framrates can a 290x achieve at 4k? Because I find 290 crossfire can't max out a 1440p @ 60, I have to lower settings.


Which games are you talking about? In Crysis 3 with trifire I average around 70 fps at full ultra with 2x SMAA mGPU. In Far Cry 3 I get 100ish avg fps with fully maxed settings. Other than these 2 games, which are really demanding, all my other games are super fluid at 1440p, averaging 150+ fps.

To test, I played BF4 with 150% res scale (4K downscaled back to 1440p) and I was averaging 100 fps there too, although it was using Mantle (the post with my results and screenshots is in the Mantle thread)
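The "150% res scale = 4K" claim checks out arithmetically: each axis of 2560×1440 is scaled by 1.5 before downsampling back to native. A quick sketch:

```python
# Why 150% resolution scale at 1440p is a 4K render: both axes are
# multiplied by the scale factor before downsampling to native res.
def scaled_resolution(width, height, scale):
    return (round(width * scale), round(height * scale))

print(scaled_resolution(2560, 1440, 1.5))  # (3840, 2160), i.e. 4K UHD
```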


----------



## taem

Quote:


> Originally Posted by *phallacy*
> 
> Which games are you talking about ? Crysis 3 with trifire I average around 70fps with full ultra and 2x smaa mgpu. Far Cry 3 I get 100ish avg fps with full max settings. Other than these 2 games which are really demanding, all my other games are super fluid in 1440p with 150+ fps averaging
> 
> To test, I played BF4 with 150% res scale (4k downscaled back to 1440p) and I was averaging 100 fps there too. Although it was using Mantle (the post with my results and screenshots are in the Mantle thread)


I never go by average, I use v sync and choose settings that give me a constant 60. And there are few current gen demanding games where I can max everything and get 60. Bioshock Infinite is the only one I can think of. Metro LL, FC3, Tomb Raider, Max Payne 3, Hitman Absolution, Sleeping Dogs, can't max any of these, and that's just to name a few. I can get all the settings I want so I'm not whingeing about it. But max FC3? Frame buffer at 5 cripples my framerate.

Do you really mean max everything, literally everything? Because maybe trifire makes all the difference, but you can't max everything at 1440p with 290X crossfire. Here's just one example.


----------



## phallacy

Quote:


> Originally Posted by *taem*
> 
> I never go by average, I use v sync and choose settings that give me a constant 60. And there are few current gen demanding games where I can max everything and get 60. Bioshock Infinite is the only one I can think of. Metro LL, FC3, Tomb Raider, Max Payne 3, Hitman Absolution, Sleeping Dogs, can't max any of these, and that's just to name a few. I can get all the settings I want so I'm not whingeing about it. But max FC3? Frame buffer at 5 cripples my framerate.
> 
> Do you really mean max everything, literally everything? Because maybe tri fire makes all the difference, but you can't max a 1440p with 290x crossfire. Here's just one example.


That example is showing Crysis 3 at 1600p with 4x MSAA. I'll admit this is the one game where even quad setups with uber hexacores/Xeons will give trouble; the engine and rendering are just that intensive. I have to turn down the AA to 2x SMAA mGPU to get good trifire performance.

But for all other AAA games, yes when I say max everything I mean the highest available setting for that option. Max Payne 3 MSAA is very broken though. I don't use it in that game. I use FXAA there and that is regularly sitting at 150-160fps. You can call that a compromise I guess but if you check out multiple forum posts many people consider FXAA superior to MSAA for Max Payne 3.

I use 8x MSAA in Far Cry 3 and still I can manage 90-100 fps. Can't comment on the others as I have uninstalled them since reformatting.


----------



## Paul17041993

Quote:


> Originally Posted by *Coree*
> 
> Speaking of the 7970DC II, the core and copperpipes have nice contact, thanks to the plate:
> 
> *I bet that the 7970 cooler would fare better than the a.k.a re-painted Matrix R9 290X heatsink* (R9 290X DC II)


It would, easily: greater surface area, no outtake restriction, the fans are at least somewhat decent (apart from a rattle in my case), and the heatpipes are actually oriented correctly and would give picture-perfect contact, even better than what they give on the 7970.

You would want to modify the fan shroud though so it works correctly; the fan towards the display outputs, for example, dumps most of its cold air out the back instead of pushing it through the heatsink. No idea if the mounting holes would be close enough, though. I'd probably try it myself, but I prefer a blower fan due to how my case behaves (no side vents or fans).


----------



## Andrix85

Hey guys...

I have an XFX R9 290 unlocked to an R9 290X, with Elpida RAM that can only do 1400 MHz... Is that its limit, or should I do something to go higher? I use the ASUS BIOS to OC, and GPU Tweak...

Thanks


----------



## Mr357

Does anyone know what the world record core clock is for the 290/X?

The best I've seen is 1500MHz by Smoke.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mr357*
> 
> Does anyone know what the world record core clock is for the 290/X?
> 
> The best I've seen is 1500MHz by Smoke.


Thought Smoke hit 1600?

Either way, 8Pack hit 1600 recently on a quad-fire setup


----------



## Mr357

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Thought smoke hit 1600?
> 
> Either way. 8 pack hit 1600 recently on a quad fire setup


Is that so?









Reference or Lightning's/Matrix Platinum's?


----------



## VSG

Asus DCUII


----------



## Maracus

Well, found the limit of my card: *1260*/*1625* at +200 mV; anything higher and the vdroop craps out. Could maybe squeeze a little more from the memory, but 1625 MHz is already pushing it for Elpida.

http://www.3dmark.com/3dm11/8095432


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mr357*
> 
> Is that so?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Reference or Lightning's/Matrix Platinum's?


Quote:


> Originally Posted by *geggeg*
> 
> Asus DCUII


^This. I think they were 2 MSI Gaming R9 290Xs and 2 Asus DCU IIs, but I'm not 100% sure


----------



## Forceman

Anyone else getting "driver power state failure" crashes with 14.2? Happens when the system is idle - used to happen with 13.12 until I turned off ULPS, but now even with ULPS turned off I still get them with 14.2. Kind of annoying.

Edit: That's with a clean install of 14.2. Wondering if it would help to install 13.12 again and then just install 14.2 over top.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Forceman*
> 
> Anyone else getting "driver power state failure" crashes with 14.2? Happens when the system is idle - used to happen with 13.12 until I turned off ULPS, but now even with ULPS turned off I still get them with 14.2. Kind of annoying.
> 
> Edit: That's with a clean install of 14.2. Wondering if it would help to install 13.12 again and then just install 14.2 over top.


I've installed 14.2 over 13.12 and it's running great for me in all games and windows


----------



## Prozillah

Can anyone shed any light on the Windows message that states performance is going slow etc., choose to keep current settings, blah blah? It pops up every time I'm gaming, be it BF4, Ghosts, etc. It doesn't affect anything, as my gameplay is beautiful, but I'm guessing it shouldn't pop up?


----------



## Aussiejuggalo

Normally with those Windows messages about performance, it just turns off Aero in Win 7, then when you exit a game or whatever it turns Aero back on

Could be a different message I'm thinking of, though


----------



## Forceman

Quote:


> Originally Posted by *Prozillah*
> 
> Can anyone shed any light on the windows msg that states performance is going slow etc - choose to keep current settings blah blah? It pops up everytime im gaming be it bf4, ghosts etc. It doesn't effect anything as my game play is beautiful but I am guessing it shouldn't pop up?


Windows 7? I used to get that a lot for some reason, if you right-click the game executable you can change a couple of settings in the compatibility tab and it normally goes away for that game. There are two options - "disable desktop composition" and something else right near that, but I can't recall the name of it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Prozillah*
> 
> Can anyone shed any light on the windows msg that states performance is going slow etc - choose to keep current settings blah blah? It pops up everytime im gaming be it bf4, ghosts etc. It doesn't effect anything as my game play is beautiful but I am guessing it shouldn't pop up?


Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Normally the Windows messages about performance normally it just turns off aero in Win 7 then when you exit a game or whatever it turns aero back on
> 
> Could be a different message I'm thinking of tho


^This

I get it as well sometimes, easiest thing you can do is change your theme to Win 7 classic or turn off aero.

Just Windows deciding that it needs to have a "moment"


----------



## taem

I'm getting really low scores on benches for 290 crossfire. At 1140 core, 1500 mem, I get

3dmark11 crossfire


3dmark11 single gpu


fire strike crossfire


fire strike single gpu


valley crossfire


valley single gpu


GPU-Z is reporting clocks and usage where they should be during benching. So why are my 3DMark scores so low? I'm only seeing a 35%-45% gain over single GPU at the same clocks. Is that normal? In a game I could understand a 40% gain, but for benching shouldn't I be getting at least an 80% gain?

Any of you guys have any bench results for 290 crossfire at roughly these clocks, 1140 core 1500 mem that you'd care to share? It's the gpu score that I'm interested in obviously.

Edit to add: this review http://www.tweaktown.com/reviews/5991/sapphire-radeon-r9-290-4gb-in-crossfire-video-card-review/index4.html reports a 3DMark11 score of 22008, a full 1/3 higher than my score. Granted, that is a 3960 CPU @ 4.7 vs my 4670K @ 4.6, but still.
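The scaling complaint above is simple arithmetic: percent gain of the crossfire graphics score over the single-GPU score at the same clocks. A sketch, with made-up scores chosen to illustrate the ~40% figure being described:

```python
# Multi-GPU scaling gain as a percentage over the single-GPU score.
# The scores below are hypothetical, for illustration only.
def scaling_gain_percent(single_score, crossfire_score):
    return (crossfire_score / single_score - 1.0) * 100.0

single = 12000      # hypothetical single-290 graphics score
crossfire = 16800   # hypothetical crossfire graphics score at same clocks
print(f"{scaling_gain_percent(single, crossfire):.0f}% gain")  # 40% gain
```

Ideal scaling would be close to 100% (double the single-GPU graphics score); anything in the 35-45% range suggests a CPU or driver bottleneck rather than the cards themselves.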


----------



## Paul17041993

Quote:


> Originally Posted by *Forceman*
> 
> Anyone else getting "driver power state failure" crashes with 14.2? Happens when the system is idle - used to happen with 13.12 until I turned off ULPS, but now even with ULPS turned off I still get them with 14.2. Kind of annoying.
> 
> Edit: That's with a clean install of 14.2. Wondering if it would help to install 13.12 again and then just install 14.2 over top.


I only ever remember intel network controllers having that bug...

does it happen with the monitor being off? and/or coming back on? or randomly when idling? or when resuming from sleep?


----------



## Sgt Bilko

That combined score of yours is killer.

http://www.3dmark.com/fs/1805629

that's with mine at 1250/1550

Seems that the Intel chips affect the graphics score after all.


----------



## Forceman

Quote:


> Originally Posted by *Paul17041993*
> 
> I only ever remember intel network controllers having that bug...
> 
> does it happen with the monitor being off? and/or coming back on? or randomly when idling? or when resuming from sleep?


When trying to wake the monitor after it goes to sleep. Sometimes it'll just hang with the monitor on but a black screen (though you can still use Windows, to reboot with keyboard commands, say), and sometimes it'll just reboot when you try to wake it and give that driver error. Never happened with 13.12, but now I've grown accustomed to Mantle and don't want to roll back.


----------



## Paul17041993

Quote:


> Originally Posted by *Forceman*
> 
> When trying to wake the monitor after it goes to sleep. Sometimes it'll just hang with the monitor on but a black screen (but you can still use Windows - to reboot with keyboard commands, say) and sometimes it'll just reboot when you try to wake it, and give that driver error. Never happened with 13.12,but now I've grown accustomed to Mantle and don't want to roll-back.


Hm, seems like a ULPS problem; best bet might just be to disable monitor sleep via Windows' power profiles.


----------



## Forceman

Quote:


> Originally Posted by *Paul17041993*
> 
> hm seems like a ULPS problem, best bet might just be to disable the monitor sleep via window's power profiles.


That's my guess also, although I did disable ULPS in AB (and checked in the registry). I might give the monitor idea a try though, see if it still happens.


----------



## Tonza

Got my Sapphire R9 290 Tri-X yesterday. Tried overclocking briefly, and it's stable at 1100 core / 1500 mem with just a +13 mV voltage increase (gonna find the absolute maximum overclock today). The cooler is great; it's even quieter than the DCII cooler on my GTX 780. Temps are also great @ 50% fan speed, around 72°C max core. Very happy with my purchase; getting another one later this month for crossfire (now that frame pacing is available, it was a horrible mess with my 7950 crossfire last year). The 290 overclocked seems to be faster than my 780 in several games (Tomb Raider at least, and Far Cry 3 is much smoother).


----------



## Paul17041993

Quote:


> Originally Posted by *Tonza*
> 
> Got yesterday Sapphire R9-290 Tri-X, tried briefly overclocking and its stable at 1100 core / 1500 mem with just +13mv voltage increase (gonna try today whats the absolutely maximum overclock). The cooler is great, its even more silent than DCII cooler in GTX 780. Temps are also great @ 50% fan speed around 72c max core. Very happy with my purchase, getting another one later this month for crossfire (since framepacing is now available, was horrible mess with my 7950 crossfire last year). 290 overclocked seems to be in several games faster than my 780 (Tomb Raider atleast and Far Cry 3 are much smoother).


Yeah, it's the same design the 290X Tri-X uses, so you get a massive amount of thermal headroom for silence and/or large overclocks. You should be able to hit 1200/1600 quite easily; just keep in mind your memory OC also relies on the core voltage on these cards.

I'm going to guess you'll settle on 1250/1650 with +80 mV on the core...

Edit: oh, and be sure to test in games to ensure it's stable; a lot of burn-in and benchmark tools don't really expose instability. And don't use FurMark in this case, as it draws way too much power compared to normal real-world loads.


----------



## kizwan

Quote:


> Originally Posted by *taem*
> 
> I'm getting really low scores on benches for 290 crossfire. At 1140 core, 1500 mem, I get
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 3dmark11 crossfire
> 
> 
> 3dmark11 single gpu
> 
> 
> fire strike crossfire
> 
> 
> fire strike single gpu
> 
> 
> valley crossfire
> 
> 
> valley single gpu
> 
> 
> 
> 
> Gpu z is reporting clocks and usage where they should be during benching. So why are my 3dmark scores so low? I'm only seeing 35%-45% gain over single gpu at same clocks. Is that normal? In a game I could understand a 40% gain. But for benching shouldn't I be getting 80% at least in gain?
> 
> Any of you guys have any bench results for 290 crossfire at roughly these clocks, 1140 core 1500 mem that you'd care to share? It's the gpu score that I'm interested in obviously.
> 
> edit to add, this review http://www.tweaktown.com/reviews/5991/sapphire-radeon-r9-290-4gb-in-crossfire-video-card-review/index4.html reports 3dmark11 score of 22008, a full 1/3 higher than my score. Granted that is a 3960 cpu @ 4.7 vs my 4670k @ 4.6 but still.


You can't compare a 4670K with a hexacore in 3DMark11; the CPU does affect the 3DMark11 score. It's called a synthetic benchmark for a reason. Don't worry about it. Your Valley scores are also in line with everyone else's.

Check out mine. Honestly, I thought my scores were lower than they should be too.
http://www.3dmark.com/3dm11/8007990
http://www.3dmark.com/3dm11/8004406


----------



## Tonza

Quote:


> Originally Posted by *Paul17041993*
> 
> yea, its the same design the 290X tri-X uses, so you get a massive amount of thermal room for silence and/or large overclocks, you should be able to hit 1200/1600 quite easily, just keep in mind your memory OC also relies on the core voltage on these cards.
> 
> I'm going to take a guess you'll settle on 1250/1650 with +80mV on the core...
> 
> edit; oh and be sure to test on games to ensure its stable, a lot of burn-in and benchmark tools don't exactly expose instability, and don't use furmark in this case as it uses way too much power compared to normal real-world loads.


Far Cry 3, at least, was the ultimate torture test for an overclocked GK110 (Titan and 780); it stressed the card more than any other game. If Far Cry 3 was stable, everything else was.







I usually loop Valley to see if the card is stable, then I play some games as well.


----------



## Widde

Quote:


> Originally Posted by *taem*
> 
> I'm getting really low scores on benches for 290 crossfire. At 1140 core, 1500 mem, I get
> 
> 3dmark11 crossfire
> 
> 
> 3dmark11 single gpu
> 
> 
> fire strike crossfire
> 
> 
> fire strike single gpu
> 
> 
> valley crossfire
> 
> 
> valley single gpu
> 
> 
> Gpu z is reporting clocks and usage where they should be during benching. So why are my 3dmark scores so low? I'm only seeing 35%-45% gain over single gpu at same clocks. Is that normal? In a game I could understand a 40% gain. But for benching shouldn't I be getting 80% at least in gain?
> 
> Any of you guys have any bench results for 290 crossfire at roughly these clocks, 1140 core 1500 mem that you'd care to share? It's the gpu score that I'm interested in obviously.
> 
> edit to add, this review http://www.tweaktown.com/reviews/5991/sapphire-radeon-r9-290-4gb-in-crossfire-video-card-review/index4.html reports 3dmark11 score of 22008, a full 1/3 higher than my score. Granted that is a 3960 cpu @ 4.7 vs my 4670k @ 4.6 but still.


You're getting like 5k more score than me







Getting like a 19-20k-ish graphics score in Fire Strike at 1050/1350, and the CPU is a 3570K at 4.5.

Gonna see if a mate wants to trade his 3770K with me


----------



## bhardy1185

Quote:


> Originally Posted by *rdr09*
> 
> i think as5 is capacitive. it can create shorts and kill components. i advise to reapply some other paste like Gelid extreme. clean the core and its surrounding. gl!


More bad news. Took the cooler off last night to see how the paste was looking. There was a relatively good-sized "hole" in the center; it looked to have pulled away from either the chip or the block. So I removed the paste and applied a different paste. Got everything back together and put the card back in the machine. Heard a nice-sounding POP as soon as I turned on the computer. Watched the card to see if it was going to burst into flames... Immediately opened HWMonitor to see if I could tell what happened. Everything looked fine for a little while. Temps started out at 41 degrees idle. And then they started climbing. I let it get up to 85 degrees and then tried to turn the fans up... NOTHING. Temps still climbed to 99 degrees before I shut it down. Opened the card up this morning and the paste looked good,







but over where the cooler connects to the board, there was a nice starburst on the board where one of the small silver things (don't know the technical term) was, right next to the connector. So I'm taking it that I just blew the fan up...

Couple of questions. I've been told that some paste may have gotten on the board and caused a short. With that probably being the case, is there any way to save the card? I can take a picture of it this afternoon if that would help. If the shorted area is just the fan, will this card still work if a waterblock is put on it?


----------



## phallacy

So guys, I pulled the trigger and bought the Samsung U28D590D on eBay. It comes out to the same price as the Amazon preorder with tax, but I can get this one in 2-3 days. I'll hopefully have it this weekend and can do some gaming and see how 4K 60Hz stacks up!


----------



## Raephen

Quote:


> Originally Posted by *bhardy1185*
> 
> More bad news. Took the cooler off last night to look to see how the paste was looking. There was a relatively good size "hole" in the center. Looked to have either pulled away from the chip/block. So I removed the paste and reapplied a different paste. Got everything back together and put the card back in the machine. Heard a nice sounding POP as soon as I turned on the computer. Watched the card to see if it was going to burst into flames... Immediately opened HWmonitor to see if I could tell what happened. Everything looked fine for a little while. Temps started out at 41 degrees idle. And then they starting climbing. I let it get up to 85 degrees and then tried to turn the fans up... NOTHING. Temps still climbed to 99 degrees before I shut it down. Opened the card up this morning and paste looked good,
> 
> 
> 
> 
> 
> 
> 
> but, over where the cooler connects to the board, there was a nice star burst on the board where one of the small silver things (don't know technical term) were right next to the connector. So I am taking it as I just blew the fan up...
> 
> Couple questions. I have been told that somehow some paste may have gotten on the board and caused a short. With that probably being the case, is there any way to save the card? I can take a picture of it this afternoon if that would help. If the area that is shorted out is just the fan, will this card still work if a waterblock is put on it?


I honestly wouldn't know, but might I suggest trying an aftermarket cooler first, if you want to see whether the card still works okay, before deciding to take the plunge into water?

Something like the Gelid Icy Vision or Arctic's Accelero Twin Turbo II would do the trick, either hooked up to a fan controller or fed directly from the PSU via Molex connectors.

If you feel daring and have two 12/14cm fans lying around, you could even try to ghetto-rig them onto an Arctic Accelero S1; someone in this thread did that not too long ago, and if I recall correctly, it worked out well enough.

Just a word of advice: be sure to keep VRM1 cool; sub-80°C seems to be the consensus for proper and efficient operation.

Good luck!


----------



## taem

Quote:


> Originally Posted by *Sgt Bilko*
> 
> that's with mine at 1250/1550


Quote:


> Originally Posted by *kizwan*
> 
> Don't worry about it. Your Valleys score also are in line with everyone else.
> 
> Check out mine. Honestly, I thought my score lower than it should be too.


Quote:


> Originally Posted by *Widde*
> 
> getting like 19-20k ish graphics score in firestrike on 1050/1350 and cpu is a 3570k at 4.5


Thanks guys, I guess my scores are where they should be. Crossfire doesn't scale as well as I'd hoped. I'm on 13.12; maybe 14.2 will help a bit. But since I only have an 850W PSU and air cooling, I can't OC in Crossfire as much as I could a single GPU. I can get close to these 3DMark scores with a single, heavily OC'd 290, so I might have gotten equal performance from a good-clocking single 290X instead of 290 Crossfire. To get real benefit out of this I'd need a better PSU and more beef on those clocks. Temps will keep me from going to 1200+/1600+ like I could on a single GPU, and that requires a VDDC adjustment of over 100, which I'm not sure is so great long term. But I doubt the card would fry before I feel the itch to upgrade anyway.
Quote:


> Originally Posted by *Tonza*
> 
> Far Cry 3 atleast was ultimate torture test for overclocked GK110 (Titan and 780), stressed more the card than any other game. If Far Cry 3 was stable, everything else was.
> 
> 
> 
> 
> 
> 
> 
> I usually loop valley to see if card is stable, then i play some games aswell.


I've found the same thing: I can pass every bench and other games, and then FC3 will crash on me. I keep it installed just to check clock stability these days. Great game too; I wish there was more I could do in it, but I've cleared the map.


----------



## Forceman

Quote:


> Originally Posted by *bhardy1185*
> 
> More bad news. Took the cooler off last night to look to see how the paste was looking. There was a relatively good size "hole" in the center. Looked to have either pulled away from the chip/block. So I removed the paste and reapplied a different paste. Got everything back together and put the card back in the machine. Heard a nice sounding POP as soon as I turned on the computer. Watched the card to see if it was going to burst into flames... Immediately opened HWmonitor to see if I could tell what happened. Everything looked fine for a little while. Temps started out at 41 degrees idle. And then they starting climbing. I let it get up to 85 degrees and then tried to turn the fans up... NOTHING. Temps still climbed to 99 degrees before I shut it down. Opened the card up this morning and paste looked good,
> 
> 
> 
> 
> 
> 
> 
> but, over where the cooler connects to the board, there was a nice star burst on the board where one of the small silver things (don't know technical term) were right next to the connector. So I am taking it as I just blew the fan up...
> 
> Couple questions. I have been told that somehow some paste may have gotten on the board and caused a short. With that probably being the case, is there any way to save the card? I can take a picture of it this afternoon if that would help. If the area that is shorted out is just the fan, will this card still work if a waterblock is put on it?


Was the fan running while you did this testing? Did you check whether it was spinning while the card was on? If you had to, you could probably get by with taking the shroud off the cooler and just zip-tying a couple of fans to it for a quick test. That would be better than buying a cooler just to test it. I wouldn't run it like that long-term, but it should certainly do the job for the 15 minutes or so while you test the card out.

But I don't like the sound of something going pop on the card. I'd try to RMA it instead of fooling with it, if you still can. And if something on the card actually blew (the starburst thing you mention), I would definitely not bother fooling with it.


----------



## Prozillah

Definitely RMA it


----------



## bhardy1185

Quote:


> Originally Posted by *Raephen*
> 
> I honestly wouldn't know, but might I suggest - if you want to see if the card still works okay - trying an after market cooler before deciding to take the plunge in water?
> 
> A thing like the Gelid Icy Vision or Arctic's Accelero Twin Turbo II would do the trick, either hooked up to a fan controller or a direct feed from the psu via molex connectors.
> 
> If you feel daring and have two 12 / 14 cm fans lying around, you could even try to ghetto rig them onto an Artic Accelro S1 - there is someone in this thread who did that not to long ago, and if I recall correctly, it worked out good enough.
> 
> Just a word of advice: be sure to keep VRM1 cool - sub 80 C seems to be the concensus for proper and effecient function.
> 
> Good luck!


Thank you for the response. Trust me, I'm not jumping into water cooling anytime soon. I used to think it was something I wanted to do, but I just can't afford it and don't want to put the time into it. (That sounds really negative; it's not meant that way.)

The real reason I asked whether it would still work for water cooling was in case I decided to list it on eBay. I could list it and just say it's a "water cooling only" card. But I guess that's a little premature without proper testing.
Quote:


> Originally Posted by *Forceman*
> 
> Was the fan running while you did this testing? Did you check to see while it was on? If you had to, you could probably get by with taking the shroud off the cooler and just wire-tie a couple of fans to the cooler so you could run it for a quick test. That would be better than buying a cooler just to test it. I wouldn't run it like that long-term, but it should certainly do the job for 15 minutes or so while you tested the card out.
> 
> But I don't like the sound of something going pop on the card - I'd try to RMA it instead of fooling with it, if you still can. And if something on the card actually blew (the star-burst thing you mention) I would definitely not bother fooling with it.


I know the fan worked before I took the cooler off the second time because, after my friend initially took the cooler off and put AS5 on it, I ran tests, and when it crashed the fan would shoot to 100%. After the pop and a test of about 5 minutes, the card got to that point of blanking out the screens, and the fan should have kicked to 100%, but it didn't. So I shut it off, let it cool for a little while, and turned the computer back on to see how the temps acted. It didn't take long for it to get hot just idling with HWMonitor up.

I'm leaning more towards just RMA'ing it. The only problem is my friend bought it off eBay. Will they still honor an RMA with me not being the "original" owner? BTW, it's a Sapphire R9 290.


----------



## taem

Need some help with a borked flash.

I tried flashing my 290 Tri-X to the Powercolor BIOS. I moved the switch to position #1 (#2 is default) and used atiwinflash.

Flash went fine, rebooted. Would not boot.

Went into safe mode and used DDU to uninstall the drivers. It rebooted fine then, with the 290s showing as basic display adapters. Installed the drivers; it wouldn't boot again.

I'm back on switch position #2 and it's all working fine, but what did I do wrong, and how can I fix it? Is it not possible to flash a Sapphire 290 Tri-X to the PowerColor PCS+ BIOS?

Also, if I want to get my Sapphire card's switch #1 BIOS back to default, which BIOS do I use? I forgot to save it.







I looked on the GPU-Z database and they have 3 BIOSes uploaded; they're identical as far as I can tell.

Edit to add: the latest GPU-Z won't let me save the PowerColor PCS+ BIOS! It gives me "BIOS reading not supported on this device." Ugh.


----------



## Paul17041993

Quote:


> Originally Posted by *bhardy1185*
> 
> More bad news. Took the cooler off last night to look to see how the paste was looking. There was a relatively good size "hole" in the center. Looked to have either pulled away from the chip/block. So I removed the paste and reapplied a different paste. Got everything back together and put the card back in the machine. Heard a nice sounding POP as soon as I turned on the computer. Watched the card to see if it was going to burst into flames... Immediately opened HWmonitor to see if I could tell what happened. Everything looked fine for a little while. Temps started out at 41 degrees idle. And then they starting climbing. I let it get up to 85 degrees and then tried to turn the fans up... NOTHING. Temps still climbed to 99 degrees before I shut it down. Opened the card up this morning and paste looked good,
> 
> 
> 
> 
> 
> 
> 
> but, over where the cooler connects to the board, there was a nice star burst on the board where one of the small silver things (don't know technical term) were right next to the connector. So I am taking it as I just blew the fan up...
> 
> Couple questions. I have been told that somehow some paste may have gotten on the board and caused a short. With that probably being the case, is there any way to save the card? I can take a picture of it this afternoon if that would help. If the area that is shorted out is just the fan, will this card still work if a waterblock is put on it?


Sounds like either the fan or its controller kacked it; I highly doubt the paste job had anything to do with it unless you somehow got some in the fan or on the PCB...

It's quite possible for a defect to only appear after a number of power cycles; my second Seasonic, which replaced the first (just bad coincidence really), blew its fuse after being turned on at one point. It didn't seem to have anything else wrong with it, so it must have just been a weak fuse or capacitor.


----------



## Tobiman

If your card is still being detected, then you have nothing to worry about. Just reflash back to the stock BIOS and your card should work. You could probably get the BIOS from someone on here, but I'm pretty sure it will be available on TechPowerUp. I'd read between the lines to figure out which BIOS is for the Tri-X.


----------



## taem

Something is buggy here. GPU-Z would not let me read the BIOS on the PowerColor. After a reboot it does save the BIOS to file, but when I try to flash it, atiwinflash says it can't read the ROM. And now atiwinflash is also saying it can't read the ROM I successfully flashed to the Tri-X that won't boot. But it did earlier.

How are you guys flashing? Is atiwinflash in Windows not a good idea? Has anyone successfully flashed a Tri-X 290 with the PowerColor PCS+ BIOS?
Quote:


> Originally Posted by *Tobiman*
> 
> If you card is still being detected then you have nothing to worry about. Just reflash back to stock bios and your card should work.You could probably get the bios from someone on here but I'm pretty sure it will be available on techpowerup. I'd read in between the lines to figure out which bios is for the Tri-X.


The issue is that the Tri-X has 2 different BIOSes, and the uploaded ones are all identical: the BIOS from switch position #2, which is selected by default. The BIOS for switch #1 doesn't seem to be on there.


----------



## Tobiman

Quote:


> Originally Posted by *taem*
> 
> Something is buggy here. Gpu z would not let me read the bios on the Powercolor. After a reboot it does save the bios to file. But when I try to flash it, atiwinflash says it can't read the rom. And now atiwinflash is also saying it can't read the rom I successfully flashed to the Tri-X that won't boot. But it did earlier.
> 
> How are you guys flashing? Is atiwinflash in windows not a good idea? Has anyone successfully flashed a Tri-X 290 with the Powercolor PCS+ bios?
> The issue is the Tri-X has 2 different bios and the uploaded ones are all identical, the bios from switch position #2, which comes selected by default. The bios for switch #1 doesn't seem to be on there.


Well, seems the only option you have is to ask for it from someone on here.


----------



## Andrix85

This is my XFX R9 290 unlocked to a 290X with the ASUS BIOS and 1.4V... At higher frequencies the GPU throttles, and the RAM is crappy... I also get frequent black screens... Why? The BIOS is PT1...


----------



## kizwan

Quote:


> Originally Posted by *taem*
> 
> Something is buggy here. Gpu z would not let me read the bios on the Powercolor. After a reboot it does save the bios to file. But when I try to flash it, atiwinflash says it can't read the rom. And now atiwinflash is also saying it can't read the rom I successfully flashed to the Tri-X that won't boot. But it did earlier.
> 
> How are you guys flashing? Is atiwinflash in windows not a good idea? Has anyone successfully flashed a Tri-X 290 with the Powercolor PCS+ bios?


I use atiflash in DOS mode (from a bootable drive). You can use atiwinflash, but when flashing in Windows a lot of things can go wrong.

Follow the instructions in The 290/290X Unlock thread to "unbrick" your card. Regarding the BIOS, TechPowerUp has two versions for 290 Tri-X cards. Do you remember which version the original Tri-X BIOS was? If you posted a screenshot of GPU-Z here, you can check the screenshot to find the exact version.
http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread

In the "Fixing the Black Screen until Windows is Booted Error (not showing POST)" spoiler: apparently you can boot with the good BIOS (switch #2), then change the switch to position #1 (bad BIOS) and flash the BIOS.
Quote:


> Originally Posted by *Andrix85*
> 
> 
> 
> This is my XFX R9 290 @ 290X with ASUS bios and 1.4V...Higher frequency go gpu throttling and the ram are crappy...I have also frequently black screen...Why ? the bios is PT1...


Can you post it as a picture, not as an attachment? Thank you.

Even with the power limit at max, is it still throttling?


----------



## Jack Mac

I'm starting to get buyer's remorse. My nice-clocking 290 was sold to some miner on Craigslist, and the poor thing has probably been mining away since I sold it. Hopefully AMD's next-gen cards won't be bought out and will have better noise levels; I wouldn't mind switching back to the red team.


----------



## VSG

Isn't your 780 overclocking well?


----------



## Jack Mac

Quote:


> Originally Posted by *geggeg*
> 
> Isn't your 780 overclocking well?


Haven't played with it too much, and I preferred the simplicity of clocking on AMD cards. So far I'm at 1150MHz on the core and 3500 on the memory with +13mV, which will probably be my 24/7 settings. This is on a skynet BIOS with boost disabled, and I can always do an AB mod for more voltage if I want.


----------



## VSG

But it's pretty much the same thing on the 780: overvolt, overclock, test. The only thing I changed going from 290X cards to 780 Ti cards was switching from Afterburner to Precision X (and having to use the Classy overvolting tool, but that's another thing entirely). The BIOS support on this forum, and generally everywhere, is far better for Nvidia than for AMD, which is unfortunate but a matter of fact.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *taem*
> 
> Need some help with a borked flash.
> 
> I tried flashing my 290 Tri-X to the Powercolor BIOS. I moved the switch to position #1 (#2 is default) and used atiwinflash.
> 
> Flash went fine, rebooted. Would not boot.
> 
> Went into safe mode and used DDU to uninstall drivers. Rebooted then, with 290s showing as basic displays. Installed drivers. Won't boot again.
> 
> I'm back on switch position #2 and it's all working fine but what did I do wrong, how can I fix it? Is it not possible to flash a Sapphire 290 Tri-X to the Powercolor PCS+ bios?
> 
> Also, if I want to get my Sapphire card's switch #1 bios back to default, which bios do I use? I forgot to save it
> 
> 
> 
> 
> 
> 
> 
> I looked on the gpu z database and they have 3 bios uploaded, they are identical as far as I can tell.
> 
> Edit to add, latest gpu z won't let me save powercolor pcs+ bios! gives me "bios reading not supported on this device." Ugh


Always flash in DOS. I don't trust atiwinflash.


----------



## taem

By DOS, do you guys mean a command prompt from a USB boot drive? Or literally a DOS disk?

Also, now GPU1 won't clock down. It just stays at the max clock and idles at a really high temp, like 45C. GPU2 does downclock at idle. I've uninstalled TriXX and am just using CCC. What gives? How do I fix this?

Not a fan of my first foray into AMD, btw. All my Nvidia cards were install-and-forget, but with AMD so far all my time is spent trying to get things to work right. Maybe it's my Win 8.1 install. Or maybe it's ASUS. Dunno. But this is a pain.

Edit: rebooted, and now GPU1 downclocks at idle, but GPU2 is stuck at max clock. Lol. Is 14.2 the issue?

Second edit: rebooted again, and now both GPUs won't downclock. Wonder what happens on the next reboot...


----------



## Andrix85

Quote:


> Originally Posted by *kizwan*
> 
> I use atiflash in dos mode (bootable drive). You can use atiwinflash but when flashing in windows, a lot of things can go wrong.
> 
> Follow the instruction at The 290/290x Unlock thread to "unbrick" your card. Regarding the BIOS, techpowerup have two version for 290 Tri-X cards. Did you remember what version of the original Tri-X BIOS? If you posted screenshot of the GPU-Z here, you can check the screenshot to know the exact version.
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread
> 
> In the "Fixing the Black Screen until Windows is Booted Error (not showing POST)" spoiler, apparently you can boot with good BIOS (switch 2), then change the switch to position #1 (bad BIOS) & flash BIOS.
> Can you post it as picture, not as attachment? Thank you.
> 
> Even with power limit to max, it still throttling?


Sorry for picture..









I set the power target to 150% in GPU Tweak, but if I run benchmarks with the core higher than 1200MHz or the memory higher than 1420MHz, the results drop... Those are the max frequencies for good results... I'm on the stock cooler; I only changed the thermal paste to Arctic MX-4...


----------



## Paul17041993

Quote:


> Originally Posted by *taem*
> 
> Something is buggy here. Gpu z would not let me read the bios on the Powercolor. After a reboot it does save the bios to file. But when I try to flash it, atiwinflash says it can't read the rom. And now atiwinflash is also saying it can't read the rom I successfully flashed to the Tri-X that won't boot. But it did earlier.
> 
> How are you guys flashing? Is atiwinflash in windows not a good idea? Has anyone successfully flashed a Tri-X 290 with the Powercolor PCS+ bios?
> The issue is the Tri-X has 2 different bios and the uploaded ones are all identical, the bios from switch position #2, which comes selected by default. The bios for switch #1 doesn't seem to be on there.


Maybe have it as the only card in the rig, for simplicity: switch it to the working BIOS and boot into Windows, then switch it to the borked BIOS (don't reboot!) and proceed with re-flashing, using either a BIOS for it or just a reference 290 BIOS.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *taem*
> 
> By DOS do you guys mean command prompt from a usb boot drive? Or literally a dos disk?
> 
> Also, now gpu1 won't clock down. Just stays at the max clock and idles really high temp, like 45c. Gpu2 does down clock in idle. I've uninstalled Trixx, just using CCC. what gives? How do I fix this?
> 
> Not a fan of my first foray into amd btw, all my nvidia cards were install and forget. But with amd so far all my time is spent trying to get things to work right. Maybe it's my win8.1 install. Or maybe it's asus. Dunno. But this is a pain.
> 
> edit, rebooted and now gpu1 downclocks in idle. But gpu2 is stuck at max clock. Lol. Is it 14.2 that's the issue?
> 
> second edit, rebooted again and now both gpus won't downlock. Wonder what happens on the next reboot...


Yep, booting with a USB drive.

PLUS, you should always clean and re-install the drivers after a BIOS switch. I've had glitchy things happen on my older cards when I used to flash BIOSes and just keep going without a fresh driver install.

I'm on 8.1, and all three of my 290s have been straight-up plug in, install drivers, and go. I've had glitchy BIOSes too, though, and that could have been what you installed, but after trying another download of the same BIOS (maybe the first one was corrupt?) everything was great. I went back to the stock BIOSes on my 290s; I actually lost a couple of FPS with the Tri-X one.

Either way, if you're set on flashing, your best bet is to always do it on boot-up with a USB flash drive using the HP boot setup.

Follow this and you'll be good









http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/0_20


----------



## Forceman

Quote:


> Originally Posted by *taem*
> 
> second edit, rebooted again and now both gpus won't downlock. Wonder what happens on the next reboot...


Only one possibility left, of course: they'll both downclock. But then you can never reboot again, or the cycle will start anew.


----------



## taem

Any Gigabyte Windforce Crossfire setups in here? I've been chatting with a guy who's limited on card length and just ordered one to use as GPU1, along with a PowerColor PCS+ as GPU2, and I was curious. There are different BIOS versions for this card, right? Which one is best?


----------



## Arizonian

Quote:


> Originally Posted by *Andrix85*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> This is my XFX R9 290 @ 290X with ASUS bios and 1.4V...Higher frequency go gpu throttling and the ram are crappy...I have also frequently black screen...Why ? the bios is PT1...


Congrats - Added









Sorry to read about your black screen issue. Check out the *290/290X Black Screen Poll*; though I'm pretty sure most of the posters in that thread are members here, you may try your luck asking there as well.

Also, it's a good idea to fill out your system specs; that helps other members help you.

*"How to put your Rig in your Sig"*


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> In Malaysia, we either RMA through the seller or bring it ourselves to Asus service center.
> 
> nice temps!
> 
> 
> 
> 
> 
> 
> 
> My comp also right next to A/C.


Much better without flexible duct everywhere, but I'm gonna keep it. It's $50 AU for a 6-meter bag.
Quote:


> Originally Posted by *MrWhiteRX7*
> 
> You flash multi card same as single. Do each one at a time...
> 
> atiflash.exe -f -p 0 x.rom
> 
> then
> 
> atiflash.exe -f -p 1 x.rom
> 
> and so on


Thanks for that info man
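For anyone skimming past the quote: the quoted sequence is just the same `atiflash.exe -f -p <adapter> <rom>` command repeated once per card, with only the adapter index changing. A small sketch of that pattern (the `x.rom` filename and adapter count are placeholders from the quote; the real flashing happens from a DOS boot drive, not Python):

```python
def flash_commands(rom_path, adapter_count):
    """Build the per-adapter flash commands, one card at a time,
    mirroring the quoted `atiflash.exe -f -p <n> <rom>` sequence."""
    return [f"atiflash.exe -f -p {n} {rom_path}" for n in range(adapter_count)]

# For a two-card rig with a ROM called x.rom:
for cmd in flash_commands("x.rom", 2):
    print(cmd)
```

`-p` selects the adapter (0, 1, 2, ...) and `-f` forces the flash; run one command, let it finish, then run the next.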


----------



## MrWhiteRX7

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Much better without flexible duct everwhere , but im gonna keep it . Its $50AU a 6 meter bag
> Thanks for that info man


What BIOS are you going to try out?


----------



## ledzepp3

Probably a slim chance, but is anyone here running a 1000W power supply with a 3930K and tri-fire 290X's? Any problems with it? I already have two 290X's soon to be underwater (rig isn't completed yet), paired with a 3930K. Just wanted to see the possibility of using my power supply with three 290X's.

-Zepp


----------



## Xyro TR1

Are these numbers around what they should be?

Unigine Heaven 4.0 with Extreme setting. Stock MSI R9 290X GAMING


----------



## sam66er




----------



## MrWhiteRX7

Quote:


> Originally Posted by *Xyro TR1*
> 
> Are these numbers around what they should be?
> 
> Unigine Heaven 4.0 with Extreme setting. Stock MSI R9 290X GAMING


run it in 1080p


----------



## Paul17041993

OK, just ran mine, stock and 65% fan (ref. blower). @Xyro, yours is slightly better than my result, so I would think that's correct.


----------



## Red1776

Hi guys, a while ago I mentioned a project I was working on with the execs at AMD. Well, the project has finally come together, and if you would like to follow the build log for both machines prior to the article's release, you can see them here. The high-end enthusiast machine is a quadfire 4 x MSI 4GB R9 290X.

http://www.overclock.net/t/1473361/amd-high-performance-project-by-red1776


----------



## TommyGunn123

Looks like Scorptec have the 290X Lightning, at $919 dollary-doos, holy crap :/

http://www.scorptec.com.au/product/Graphics_Cards/AMD/54108-R9_290X_LIGHTNING


----------



## Aussiejuggalo

Quote:


> Originally Posted by *TommyGunn123*
> 
> Looks like Scorptec have the 290x Lightning, and $919 dollary-doos, holy crap :/
> 
> http://www.scorptec.com.au/product/Graphics_Cards/AMD/54108-R9_290X_LIGHTNING










That's a bit expensive... lol

Scorptec seem to be kinda expensive though


----------



## Sgt Bilko

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> 
> 
> 
> 
> 
> 
> 
> thats a bit expensive... lol
> 
> Scroptec seem to be kinda expensive tho


Usually only $10-$40 more than PCCG in most cases. They also get overlooked by some, so you can sometimes get the hard-to-find stuff.


----------



## bbond007

Quote:


> Originally Posted by *Paul17041993*
> 
> ok just ran mine, stock and 65% fan (ref. blower), @Xyro you'res is slightly better then my result so I would think that's correct.
> 
> 
> Spoiler: Warning: Spoiler!


Your score looks OK to me







I ran it with the same settings. You're a little faster.


Spoiler: Warning: Spoiler!


----------



## maynard14

Hahaha, tried mine, using the stock XFX 290X BIOS on an unlocked 290. 1000 core and 1250 memory.


----------



## OmgitsSexyChase

Quote:


> Originally Posted by *ledzepp3*
> 
> Probably a slim chance, but is anyone here running a 1000W power supply with a 3930K and tri-fire 290X's? Any problems with it? I already have two 290X's soon to be underwater (rig isn't completed yet), paired with a 3930K. Just wanted to see the possibility of using my power supply with three 290X's.
> 
> -Zepp


Doesn't seem like a great idea. I have a 3770K OC'd @ 4.5, which I imagine pulls about the same as a 3930K, and Crossfire reference 290Xs. At stock voltage I pull around 650-700W while gaming in Witcher 2 at max settings; overclocked, high 700s to 800W. So I dunno, especially if you're going to mine on them.
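The back-of-the-envelope math behind that answer can be sketched like this. All the wattage figures below are rough assumptions for illustration (ballpark numbers in line with the wall-draw figures posted above), not measured specs:

```python
def psu_headroom_watts(psu_watts, cpu_watts, gpu_watts_each, gpu_count, rest_watts=75):
    """Rough system power budget; returns spare watts (negative = over budget)."""
    total_draw = cpu_watts + gpu_watts_each * gpu_count + rest_watts
    return psu_watts - total_draw

# Assumed ballpark figures: ~150W for an overclocked hex-core CPU,
# ~300W per heavily overclocked 290X, ~75W for drives/fans/board.
print(psu_headroom_watts(1000, 150, 300, 3))
```

With those assumptions, three overclocked 290Xs put a 1000W unit roughly 125W over budget, while two leave comfortable headroom, which matches the "doesn't seem like a great idea" verdict for tri-fire.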


----------



## Paul17041993

Quote:


> Originally Posted by *bbond007*
> 
> Your score looks OK to me
> 
> 
> 
> 
> 
> 
> 
> I ran it with the same settings as you. You're a little faster.
> 
> 
> Spoiler: Warning: Spoiler!


Quote:


> Originally Posted by *maynard14*
> 
> hahaha tried mine, using the stock XFX 290X BIOS on my unlocked 290. 1000 core and 1250 memory


@bbond007 I would expect yours to be quite a bit higher for 2x 290s, though I believe it's bottlenecking on the render thread, which is normal really for such a lowish res...

though mine had quite a bit of stutter, as you may notice from the minimum FPS; I have a crapload of background tasks and a Windows uptime of 14 days, along with a "hybrid" 13.12/14.1 driver install... (I'm just lazy like that)


----------



## Seid Dark

Does any AMD brand approve installing a custom cooler or waterblock while keeping the warranty intact? EVGA does, but it's Nvidia only. Asking for a friend who wants to buy a 290 and watercool it.


----------



## Xyro TR1

Quote:


> Originally Posted by *Paul17041993*
> 
> though mine had quite a bit of stutter, as you may notice from the minimum FPS; I have a crapload of background tasks and a Windows uptime of 14 days, along with a "hybrid" 13.12/14.1 driver install... (I'm just lazy like that)


Oh jeez, I didn't even think to turn off background tasks and the like. It's been many years since I was into benching for numbers.


----------



## rdr09

Quote:


> Originally Posted by *Seid Dark*
> 
> Does any AMD brand approve installing a custom cooler or waterblock while keeping the warranty intact? EVGA does, but it's Nvidia only. Asking for a friend who wants to buy a 290 and watercool it.


msi i know does not mind, so long as you put everything back to its original config if ever the gpu needs to be sent back. sapphire goes by "don't ask, don't tell" policy afaik.

edit: mine is msi with a block.


----------



## Xyro TR1

Quote:


> Originally Posted by *rdr09*
> 
> msi i know does not mind, so long as you put everything back to its original config if ever the gpu needs to be sent back. sapphire goes by "don't ask, don't tell" policy afaik.


My MSI has a tamper sticker on the heatsink.


----------



## rdr09

Quote:


> Originally Posted by *Xyro TR1*
> 
> My MSI has a tamper sticker on the heatsink.


mine did, too. it's gone.


----------



## bbond007

Quote:


> Originally Posted by *Paul17041993*
> 
> @bbond007 I would expect yours to be quite a bit higher for 2x 290s, though I believe it's bottlenecking on the render thread, which is normal really for such a lowish res...


I do have 2 MSI Gamers, but I ran mine to match the specs of the poster who was wondering if his bench was up to speed.

That run was not in crossfire; I'll do a crossfire run and report back. I have run it in crossfire before and my system is pretty good.

I was initially having some horrible performance problems, which turned out to just be Windows set to the power-saver plan in Control Panel









Had some people tell me that 1230v3 is no good for gaming, what a joke









Plus it was running windowed, so crossfire would not have helped anyway (unlike my MSI GTX 760 Gamer pair).

Also, I flashed them with the ASUS BIOS, I have the temp locked to stay under 87°C, and I'm running the 1000MHz/1250MHz setting with -50mV. I know this is responsible for a slight decline in 3DMark.


----------



## Tonza

Hmm, I'm occasionally getting a black screen + blue screen when switching YouTube to fullscreen, any fix for this? Card is a Sapphire 290 Tri-X.


----------



## BradleyW

Quote:


> Originally Posted by *Tonza*
> 
> Hmm, I'm occasionally getting a black screen + blue screen when switching YouTube to fullscreen, any fix for this? Card is a Sapphire 290 Tri-X.


Disable hardware acceleration in browser.


----------



## Tonza

Quote:


> Originally Posted by *BradleyW*
> 
> Disable hardware acceleration in browser.


Meh hardly a fix, videos look like crap after that.


----------



## lawson67

Hi, I wonder if anyone can help me. I have just removed my GTX 660 SLI cards, uninstalled the Nvidia drivers, and installed my new PowerColor Radeon R9 290 PCS+ OC 4096MB GDDR5 PCI-Express graphics card in the bottom slot, with my Gigabyte Radeon R9 290 OC WindForce 4096MB GDDR5 PCI-Express card (GV-R929OC-4GD-GA) in the top slot.

However, when I try to crossfire them in CCC, the fans on the PowerColor in the bottom slot stop spinning and GPU-Z shows 0 for the PowerColor's core clock. Do I need a driver cleaner program to remove any leftover Nvidia drivers, or is there something else I'm missing when installing two Radeon R9 290 cards and setting them up in crossfire?


----------



## Seid Dark

Quote:


> Originally Posted by *lawson67*
> 
> Hi, I wonder if anyone can help me. I have just removed my GTX 660 SLI cards, uninstalled the Nvidia drivers, and installed my new PowerColor Radeon R9 290 PCS+ OC 4096MB GDDR5 PCI-Express graphics card in the bottom slot, with my Gigabyte Radeon R9 290 OC WindForce 4096MB GDDR5 PCI-Express card (GV-R929OC-4GD-GA) in the top slot.
> 
> However, when I try to crossfire them in CCC, the fans on the PowerColor in the bottom slot stop spinning and GPU-Z shows 0 for the PowerColor's core clock. Do I need a driver cleaner program to remove any leftover Nvidia drivers, or is there something else I'm missing when installing two Radeon R9 290 cards and setting them up in crossfire?


Use this in safe mode: http://www.guru3d.com/files_details/display_driver_uninstaller_download.html. I had many problems when I switched from 670 to 7950 without properly cleaning drivers.


----------



## lawson67

Quote:


> Originally Posted by *Seid Dark*
> 
> Use this in safe mode: http://www.guru3d.com/files_details/display_driver_uninstaller_download.html. I had many problems when I switched from 670 to 7950 without properly cleaning drivers.


I think I may have sorted it? It only works when I run the Heaven benchmark or the cards are otherwise put under load; if the second card's power isn't needed it stays idle, but as soon as I start Heaven it kicks in, which is a great feature if that's how they are supposed to work. The second card was definitely working, as it was getting hot, but as soon as Heaven stops it goes idle again, no fans spinning or anything. Can someone tell me if this is correct behaviour for crossfire? Many thanks in advance for your advice and help guys


----------



## MrWhiteRX7

Quote:


> Originally Posted by *lawson67*
> 
> I think I may have sorted it? It only works when I run the Heaven benchmark or the cards are otherwise put under load; if the second card's power isn't needed it stays idle, but as soon as I start Heaven it kicks in, which is a great feature if that's how they are supposed to work. The second card was definitely working, as it was getting hot, but as soon as Heaven stops it goes idle again, no fans spinning or anything. Can someone tell me if this is correct behaviour for crossfire? Many thanks in advance for your advice and help guys


You need to disable ULPS.


----------



## kizwan

Quote:


> Originally Posted by *lawson67*
> 
> I think I may have sorted it? It only works when I run the Heaven benchmark or the cards are otherwise put under load; if the second card's power isn't needed it stays idle, but as soon as I start Heaven it kicks in, which is a great feature if that's how they are supposed to work. The second card was definitely working, as it was getting hot, but as soon as Heaven stops it goes idle again, no fans spinning or anything. Can someone tell me if this is correct behaviour for crossfire? Many thanks in advance for your advice and help guys


Yes, the secondary GPU in Crossfire will enter an ultra-low power state (ULPS) when idle or not in use. The fan also stops spinning. This is how the cards work by default in Crossfire. You can disable ULPS if you want using MSI AB 3 beta or Trixx. When overclocking, you want to disable ULPS because it can prevent the secondary GPU from overvolting.
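For reference, the registry route some members mention is toggling the `EnableUlps` value under the display-adapter class keys. A sketch of what gets written (the `0000` adapter subkey is an assumption and varies per system, so search for the value yourself and back up the registry before merging anything):

```
Windows Registry Editor Version 5.00

; Hypothetical example -- the adapter subkey (0000, 0001, ...) differs
; per machine. Search HKLM\SYSTEM\CurrentControlSet\Control\Class for
; "EnableUlps" and set every match to 0, then reboot.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

MSI AB and Trixx just flip this same value for you, which is why they're the safer option.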


----------



## lawson67

Quote:


> Originally Posted by *kizwan*
> 
> Yes, the secondary GPU in Crossfire will enter an ultra-low power state (ULPS) when idle or not in use. The fan also stops spinning. This is how the cards work by default in Crossfire. You can disable ULPS if you want using MSI AB 3 beta or Trixx. When overclocking, you want to disable ULPS because it can prevent the secondary GPU from overvolting.


Well that's just great, thanks. It's working fine and I am getting 150 fps in Heaven benchmark 4.0; I don't know if that's good but it looks nice lol..

Also, one other thing guys: I want to check the Gigabyte WindForce BIOS and update it if it needs it, as I have seen some bad stories about the WindForce cards overheating and needing BIOS updates. Can I update the BIOS on the Gigabyte with both cards plugged in, or do I need to disconnect the PowerColor card before I flash, so as not to confuse the flash program and flash the PowerColor card with the Gigabyte BIOS by mistake? Thanks again for your help guys









PS: does anyone know if the PowerColor has an uber switch? The WindForce seems to have one (and I need to find out which way is uber and which is not), but I can't see one on the PowerColor card.


----------



## Jack Mac

Quote:


> Originally Posted by *Seid Dark*
> 
> Does any AMD brand approve installing a custom cooler or waterblock while keeping the warranty intact? EVGA does, but it's Nvidia only. Asking for a friend who wants to buy a 290 and watercool it.


MSI and XFX (US only AFAIK) allow it, and I think PowerColor does as well.
Quote:


> Originally Posted by *Xyro TR1*
> 
> My MSI has a tamper sticker on the heatsink.


So do reference MSI 290s, they're there but AFAIK they don't care as long as it's RMA'd in stock configuration.


----------



## BradleyW

Quote:


> Originally Posted by *Tonza*
> 
> Meh hardly a fix, videos look like crap after that.


Nothing else you can do.


----------



## Roboyto

Quote:


> Originally Posted by *Seid Dark*
> 
> Does any AMD brand approve installing a custom cooler or waterblock while keeping the warranty intact? EVGA does, but it's Nvidia only. Asking for a friend who wants to buy a 290 and watercool it.


XFX, Sapphire and ASUS I know for certain. If you must have a concrete answer, you can always e-mail or chat with the respective company about their policy. I believe the only company I received a definitive no from was HIS, back when I was considering purchasing a 7950/7970.

I have RMA'd an ASUS card that had a waterblock on it without any issues. If you need to RMA the card for whatever reason, put the stock cooler back on and all should be fine. The only problem you could run into would be if the card is physically damaged in some manner.


----------



## Prozillah

Quote:


> Originally Posted by *taem*
> 
> Any gigabyte windforce crossfire set ups in here? Been chatting with a guy who is length limited who just ordered one to use as gpu1, along with powercolor pcs+ as gpu2 and I was curious. There are different bios versions for this card right? Which one is best?


I'm running twin Giga 290 OC WindForce cards in crossfire and they are brilliant. Overclocking on stock voltage with +50% power limit, running 1100 core / 1300 mem. I believe I'm on the most recent BIOS


----------



## Roboyto

Quote:


> Originally Posted by *ledzepp3*
> 
> Probably a slim chance, but is anyone here running a 1000W power supply with a 3930K and tri-fire 290X's? Any problems with it? I already have two 290X's soon to be underwater (rig isn't completed yet), paired with a 3930K. Just wanted to see the possibility of using my power supply with three 290X's.
> 
> -Zepp


You'll need to be in the 1200W range, from everything I have seen in this thread. Three 290Xs alone, overclocked and mining, will probably pull 800W or more.
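A rough back-of-the-envelope check (a sketch only; the per-component wattages are ballpark assumptions based on figures quoted in this thread, not measurements):

```python
# Rough PSU sizing estimate for tri-fire 290X (all numbers are assumptions).
CARD_W = 250        # one 290X under sustained load, overclocked (W)
CPU_W = 150         # overclocked hexacore like a 3930K under load (W)
REST_W = 100        # motherboard, drives, fans, pump (W)
HEADROOM = 0.80     # don't plan to run a PSU much past ~80% continuously

def min_psu_watts(cards):
    """Suggested PSU rating so the estimated draw stays under HEADROOM."""
    draw = cards * CARD_W + CPU_W + REST_W
    return draw / HEADROOM

total_draw = 3 * CARD_W + CPU_W + REST_W
print(total_draw)              # estimated system draw: 1000
print(round(min_psu_watts(3)))  # suggested PSU size: 1250
```

With those assumed numbers, three loaded 290Xs plus an overclocked hexacore lands around 1000W at the wall, which is why the usual advice here is a quality 1200W+ unit; a 1000W PSU would be running at its limit constantly.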


----------



## lawson67

Hey guys, it appears I cannot turn off ULPS in Windows 8. I searched the registry for EnableUlps and set the only 2 keys it found to 0, yet when I boot up I cannot see the fans turn, and GPU-Z sees no activity on the second card just sitting on the desktop. I also tried some patchers ("Disable_ULPS.exe") and all they say is:

ERROR: The system was unable to find the specified registry key or value.
Error: ULPS registry key not found.
Press any key to continue . . .

Any help please... thanks guys


----------



## BradleyW

Quote:


> Originally Posted by *lawson67*
> 
> Hey guys, it appears I cannot turn off ULPS in Windows 8. I searched the registry for EnableUlps and set the only 2 keys it found to 0, yet when I boot up I cannot see the fans turn, and GPU-Z sees no activity on the second card just sitting on the desktop. I also tried some patchers ("Disable_ULPS.exe") and all they say is:
> 
> ERROR: The system was unable to find the specified registry key or value.
> Error: ULPS registry key not found.
> Press any key to continue . . .
> 
> Any help please... thanks guys


Do it via MSI Afterburner.


----------



## lawson67

Quote:


> Originally Posted by *BradleyW*
> 
> Do it via MSI Afterburner.


Thanks mate


----------



## Roboyto

Quote:


> Originally Posted by *Seid Dark*
> 
> Does any AMD brand approve installing a custom cooler or waterblock while keeping the warranty intact? EVGA does, but it's Nvidia only. Asking for a friend who wants to buy a 290 and watercool it.


Just remembered something else. My XFX had warranty-void stickers on the screws; however, warranty terms/policy depend on the market/region where the card is sold. For example, an XFX DD card that carries a lifetime warranty here in the US will only have a 2-year warranty in Europe.

I know without a doubt that XFX won't void the warranty for removing the heatsink.


----------



## taem

This will probably supplant the PowerColor PCS+ as the king of cooling for the 290/X



Club3D RoyalAce 290/X. 308mm long, 54mm thick. Looks identical to the PCS+ except with that much more heatsink.


----------



## BradleyW

Quote:


> Originally Posted by *lawson67*
> 
> Thanks mate


Let me know if it works for you.


----------



## lawson67

Hey guys, can anyone help me? I have no "disable ULPS" check box in Afterburner. Do I need to do something to make it appear?


----------



## BradleyW

It looks like your second card is not being detected properly. Try reinstalling the cards if reinstalling the drivers does nothing.

Edit: Wait, have you tweaked ULPS in the registry? You might have messed something up. Reinstalling the drivers might help.


----------



## lawson67

Quote:


> Originally Posted by *BradleyW*
> 
> It looks like your second card is not being detected properly. Try reinstalling the cards if reinstalling the drivers does nothing.
> 
> Edit: Wait, have you tweaked ULPS in registry? You might have messed something up. reinstalling the drivers might help.


I put the registry back to normal. Which drivers do you recommend I use? I am on 13.12.

Also, I can flick between cards in the drop-down box in Afterburner, so it can see my cards.

Do you want me to mess with the registry again?


----------



## BradleyW

Quote:


> Originally Posted by *lawson67*
> 
> I put the registry back to normal. Which drivers do you recommend I use? I am on 13.12.
> 
> Also, I can flick between cards in the drop-down box in Afterburner, so it can see my cards


OK, install 14.2 and ULPS should be back online.


----------



## lawson67

Quote:


> Originally Posted by *BradleyW*
> 
> OK, install 14.2 and ULPS should be back online.


Ok thanks ill do that now









Btw these are 2 new cards. I uninstalled my Nvidia drivers and installed 13.12, so I have never seen the disable-ULPS check box in Afterburner... but I'll install 14.2 now


----------



## BradleyW

Quote:


> Originally Posted by *lawson67*
> 
> Ok thanks ill do that now


No problem. Hope it works out.


----------



## nightfox

Quote:


> Originally Posted by *lawson67*
> 
> Ok thanks ill do that now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Btw these are 2 new cards. I uninstalled my Nvidia drivers and installed 13.12, so I have never seen the disable-ULPS check box in Afterburner... but I'll install 14.2 now


Uninstall the Nvidia, AMD and Intel drivers using DDU.

DDU

Then install 14.2.


----------



## taem

Quote:


> Originally Posted by *BradleyW*
> 
> It looks like your second card is not being detected properly. Try reinstalling the cards if reinstalling the drivers does nothing.
> 
> Edit: Wait, have you tweaked ULPS in registry? You might have messed something up. reinstalling the drivers might help.


Does setting EnableUlps to 0 in regedit mess things up? Because I'm the guy who told him to do that, or to use Afterburner or Trixx. I only do it via regedit if I'm using CCC alone, and it's never caused a problem for me that I know of.


----------



## Iniura

Regarding the Asus Matrix R9 290X I guess we will find out soon enough about the cooling, 8 Pack posted the follwing:
Quote:


> ASUS have asked me to review the upcoming, very innovative Matrix range of graphics cards.
> 
> These are true high-end offerings in both team red (AMD R9 290X Matrix Plat) and team green (Nvidia 780 Ti Matrix Plat) guises. They are firmly aimed at the enthusiast, with a no-compromise design and several unique features.
> 
> Examples of these features are a new GPU Tweak to accompany the card, with even more voltages to tune and stabilize your overclock for both benching and gaming. This software has higher limits than have been seen previously, and with a small mod on the card it becomes fully unlocked. A redesigned cooler that is even better than one of our favourites, the DC2. A BIOS reload button which restores the card to the stock BIOS in case of pushing it too hard or a bad flash, and, for LN2, a memory defroster on the AMD card which helps with the memory cold-bug issue down to -70C. Most 290Xs cold-bug at -20 to -40 or even higher without this feature. So at last, no more black-screen AMD benching!!!
> 
> Put this little lot together with serious power delivery and ASUS could better my current favourite 780 Ti, the DC2.
> 
> I will be testing the cards on their stock coolers first over the next couple of days here at OCUK, and publishing the results of this testing here on the forum. I will be focusing, as you would expect, on raw overclocking and benchmarking results.
> 
> I will then be streaming live world record attempts with these cards on LN2 Tuesday next week from around 11 am on our twitch channel @
> 
> 
> 
> 
> so don't forget to tune in and see some serious hardware ragged for records!!!


Do you think he is talking about a new GPU Tweak version, or is he referring to the one that's already been out for a while?
He also talks about a redesigned cooler that is even better than the DC2's; maybe that is just marketing talk, or maybe he knows some things we don't.


----------



## GrinderCAN

I'm already in the club with an Elpida-equipped PowerColor PCS+ R9 290. But I'd like to have a 290 or 290X with Hynix RAM to use with my EK waterblock. I don't have the heart to scrap the PCS+ cooler because it's great, and I can use it in another build I'm selling.

I've found a Sapphire R9 290 Tri-X locally for ~$540 US + tax, and also a reference Asus 290X with Hynix RAM for ~$595 US + tax.

Which do you think is better binned? Better for water? Better power delivery or other components? I'd like to have the 290X, but it's a small increase, and if I ever had to revert it to stock it would be limiting... AFAIK it's 3 years of warranty on the Asus vs. 2 on the Sapphire. As if a warranty is likely to apply


----------



## lawson67

Hey guys, I have installed the 14.2 beta driver and still no ULPS check box in Afterburner. I am running Windows 8.1 Pro; any help would be greatly appreciated


----------



## taem

Quote:


> Originally Posted by *lawson67*
> 
> Hey guys, I have installed the 14.2 beta driver and still no ULPS check box in Afterburner. I am running Windows 8.1 Pro; any help would be greatly appreciated


Try Sapphire Trixx and see if you get the checkbox in settings. Also try uninstalling AB and reinstalling; maybe the regedit messed up the install.


----------



## Arizonian

Quote:


> Originally Posted by *sam66er*
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Pesmerrga

TigerDirect is selling Asus R9 290X DirectCU II's for $599.. dumping them because of the heatpipe-not-touching thing, you think? Not bad at all if you are gonna watercool/Red Mod.

http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=8883738&CatId=7387


----------



## bbond007

Quote:


> Originally Posted by *Pesmerrga*
> 
> TigerDirect is selling Asus R9 290X DirectCU II's for $599.. dumping them because of the heatpipe-not-touching thing, you think? Not bad at all if you are gonna watercool/Red Mod.
> 
> http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=8883738&CatId=7387


only 947 MHz?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *lawson67*
> 
> Hey guys, I have installed the 14.2 beta driver and still no ULPS check box in Afterburner. I am running Windows 8.1 Pro; any help would be greatly appreciated


You need to reinstall afterburner if you were using it with your nvidia cards.


----------



## Roboyto

Quote:


> Originally Posted by *bbond007*
> 
> only 947 MHz?


Tiger is notorious for incorrectly labeling, listing, or representing product information; I notice things like this constantly since I get their mailer daily. Worst tech website and physical store these days... at least around Chicagoland. Microcenter is vastly superior.


----------



## Forceman

Quote:


> Originally Posted by *lawson67*
> 
> Hey guys, I have installed the 14.2 beta driver and still no ULPS check box in Afterburner. I am running Windows 8.1 Pro; any help would be greatly appreciated


Maybe try checking the "unlock voltage control" box? Not sure whether the options that normally appear at the bottom of the settings page need that or not.


----------



## lawson67

Quote:


> Originally Posted by *Forceman*
> 
> Maybe try checking the "unlock voltage control" box? Not sure whether the options that normally appear at the bottom of the settings page need that or not.


Nope, it does not matter what I do, I cannot get that option to come up. Also, I never had Afterburner installed with Nvidia; it's a fresh install today along with my new AMD cards. I am lost as to what to do next.


----------



## Roboyto

Quote:


> Originally Posted by *lawson67*
> 
> Nope, it does not matter what I do, I cannot get that option to come up. Also, I never had Afterburner installed with Nvidia; it's a fresh install today along with my new AMD cards. I am lost as to what to do next.


Are you using the latest beta version? If not that option may not be available.

You can always try sapphire Trixx as well.


----------



## lawson67

Quote:


> Originally Posted by *taem*
> 
> Try Sapphire Trixx and see if you get the checkbox in settings. Also try uninstalling AB and reinstalling; maybe the regedit messed up the install.


What is Sapphire Trixx and where do I get it?

Edit: found it


----------



## Heinz68

Quote:


> Originally Posted by *lawson67*
> 
> Hey guys i have installed beta 14.2 driver and still no ULPS check box in afterburner?...i am running windows 8.1 pro any help would be more than grateful


Are you using the latest MSI Afterburner 3.0.0 Beta 18?

EDIT
Anyway, I noticed you found Sapphire Trixx, which will also do it. Trixx has fewer setting options than Afterburner (whose extras are mainly in RivaTuner), but I found Trixx much more stable at maximum overclocks.


----------



## Roy360

Is anyone else having trouble overclocking their ASUS R9 DCII OC card?

GPU Tweak sees it as the non-OC version, voltage and PowerTune are locked, and pretty much any overclock gives me a black screen.


----------



## MapRef41N93W

Quote:


> Originally Posted by *Roy360*
> 
> Is anyone else having trouble overclocking their ASUS R9 DCII OC card?
> 
> GPU Tweak sees it as the non-OC version, voltage and PowerTune are locked, and pretty much any overclock gives me a black screen.


Mine overclocks with AB just fine. Voltage unlocked and all. Just runs really hot under load (though not in idle) which is why I'm slapping an H90 on it.


----------



## lawson67

Quote:


> Originally Posted by *taem*
> 
> Try Sapphire Trixx and see if you get the checkbox in settings. Also try uninstalling AB and reinstalling; maybe the regedit messed up the install.


Well done Taem you fixed it!...Sapphire Trixx installed and it worked just fine...+Rep


----------



## Paul17041993

Quote:


> Originally Posted by *Roy360*
> 
> Is anyone else having trouble overclocking their ASUS R9 DCII OC card?
> 
> GPU Tweak sees it as the non-OC version, voltage and PowerTune are locked, and pretty much any overclock gives me a black screen.


This is normal for ASUS cards half the time; you could try a different BIOS or GPU Tweak version, or just RMA it for false advertising if you want.


----------



## taem

Quote:


> Originally Posted by *lawson67*
> 
> Well done Taem you fixed it!...Sapphire Trixx installed and it worked just fine...+Rep


Trixx is my favored app for overclocking because it allows me to save a different fan curve for each profile. With Afterburner, maybe I'm doing something wrong, but I only get the one custom fan curve that applies to all profiles. With Catalyst Control Center you can't do a custom curve at all; you can only use the default curve in the BIOS or set a fixed RPM.

Here's a pro tip, what happens with Trixx for me and some others is that it loads before Catalyst Control Center, and even if you set it to restore your overclocks on startup, CCC loads after and resets the cards to factory clocks. What you can do is go into Task Scheduler, find Trixx, right click for Properties, go into the Triggers tab, select Edit, and set it to load on a 30 second delay.

Also, keep in mind that the Powercolor PCS+ 290 has a +50 vddc adjust by default, and whatever you set the Trixx slider to overrides that. Hitting reset will restore the factory clocks but will not restore the factory vddc adjust. If you set that to 0 your card may not be able to run the default 1040 core and 1350 memory clocks. (Highly unlikely though; my card can go 1080 core 1400 mem with 0 vddc, and at the default +50 vddc it can go to 1140 core and 1500 memory. But the fact that Powercolor set the default to +50 vddc suggests some of these cards won't be able to run the default clocks at a lower vddc.)


----------



## kizwan

Quote:


> Originally Posted by *Tonza*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BradleyW*
> 
> Disable hardware acceleration in browser.
> 
> 
> 
> Meh hardly a fix, videos look like crap after that.
Click to expand...

You shouldn't need to do this, though. I didn't, and I don't have any problems playing YouTube, including switching from windowed to full screen.

What version of the AMD driver do you have installed? Try uninstalling the driver using DDU (see first post) and re-installing it.
Quote:


> Originally Posted by *lawson67*
> 
> Hey guys can anyone help me i have no check box disable ULPS in afterburner?...do i need to do something to make it appear?


Make sure you're using MSI Afterburner *3 beta 18* or Trixx.

*[EDIT]* Oopps! Too late.


----------



## Widde

Anyone have any idea how to get Trixx to apply a custom fan profile on both cards? :| For me it only does it on one card at a time, and when you choose the other card the 1st card goes back to the stock fan profile


----------



## rdr09

By Elmy . . .


----------



## Xyro TR1

My god!

http://promotions.newegg.com/NEemail/Mar-0-2014/StPatricks-Day_Lucky-Sale-13/index-landing.html?nm_mc=EMC-IGNEFL031314&cm_mmc=EMC-IGNEFL031314-_-EMC-031314-Index-_-E0-_-PromoWord&et_cid=6002&et_rid=148153

ASUS 290X DCU2 for _cheap!_


----------



## Akehage

My friend just got a build with an Asus R9 290 and a 4670K; I built it for him. All stress tests were going great, but he gets a red screen all the time playing BF4. Latest beta driver installed. What's wrong?


----------



## nightfox

Quote:


> Originally Posted by *Akehage*
> 
> My friend just got a build with an Asus R9 290 and a 4670K; I built it for him. All stress tests were going great, but he gets a red screen all the time playing BF4. Latest beta driver installed. What's wrong?


Did he overclock? If so, try lowering the clocks or putting everything back to stock: CPU, GPU, memory.


----------



## Akehage

I only did it on the CPU, so I'm going to try that first. The GPU is not overclocked.


----------



## k3rast4se

any bad feedback for the XFX R9 290 DD?


----------



## Akehage

One more question. Can the red screens occur due to a hardware error? Also, I have one 8-pin cable from the PSU that gives me 2x 6+2-pin, so only one cord from the PSU branches to both the 6-pin and 8-pin on the GPU. Just want to make sure that's OK. 600W Corsair.

Sent from my Nallebjörn


----------



## Sgt Bilko

Quote:


> Originally Posted by *k3rast4se*
> 
> any bad feedback for the XFX R9 290 DD?


If you aren't planning on overclocking them then they are fine.

VRM cooling isn't the best, but at stock settings it's fine, and they look pretty damn good as well


----------



## k3rast4se

What about noise? I'm also contemplating a Zotac GTX 780 3-fan OC and the PowerColor R9 290 PCS+


----------



## Sgt Bilko

Quote:


> Originally Posted by *k3rast4se*
> 
> What about noise? I'm also contemplating a Zotac GTX 780 3-fan OC and the PowerColor R9 290 PCS+


If you are after a quiet card with good cooling then grab the PowerColor PCS+; it's currently the best custom-cooled 290 out there along with the Sapphire Tri-X. The only downside to them is that they are 2.5-slot cards.

The XFX cards are quiet, I can't hear them over my case fans (Corsair SP 120's) but i use headphones anyway so i don't mind that much.


----------



## Heinz68

Quote:


> Originally Posted by *Widde*
> 
> Anyone have any idea how to get Trixx to apply a custom fan profile on both cards? :| For me it only does it on one card at a time, and when you choose the other card the 1st card goes back to the stock fan profile


I just choose the same custom setting, which is already a TriXX preset, and in Settings I check "Synchronize cards in Multi-GPU config" and "Save fan settings in profile", then apply and save all.
It works fine for me even when not in a crossfire setup, which I seldom use due to mining.


----------



## Heinz68

Quote:


> Originally Posted by *Xyro TR1*
> 
> My god!
> 
> http://promotions.newegg.com/NEemail/Mar-0-2014/StPatricks-Day_Lucky-Sale-13/index-landing.html?nm_mc=EMC-IGNEFL031314&cm_mmc=EMC-IGNEFL031314-_-EMC-031314-Index-_-E0-_-PromoWord&et_cid=6002&et_rid=148153
> 
> ASUS 290X DCU2 for _cheap!_


Nice deal compared to recent price gouging but it's actually MSRP

Looks like the prices are slowly coming down


----------



## Akehage

Problem now solved. Lowered the CPU multi to x45, repaired the game, and uninstalled AviSynth; now running Ultra at 100-140 fps with 98-100% load on CPU and GPU.

Sent from my Nallebjörn


----------



## SeanEboy

Anyone have any information on what I can expect from quadfire 290Xs and Korean 1440p monitors in Eyefinity? I have two monitors and am debating a third; however, if the performance blows, I won't bother. I really only got the second 1440p for general usage while gaming on the other.


----------



## MapRef41N93W

Quote:


> Originally Posted by *SeanEboy*
> 
> Anyone have any information on what I can expect from quadfire 290X and Korean 1440p monitors in Eyefinity? I have two monitors and am debating a third; however, if the performance blows, I won't bother. I really only got a second 1440p for general usage while gaming on the other.


Here are some 7680x1440 benchmarks on quad Titans as a reference: http://www.overclock.net/t/1415441/7680x1440-benchmarks-plus-2-3-4-way-sli-gk110-scaling/0_30. The 290X gets better scaling from bridgeless CrossFire, so you should do a bit better than that.


----------



## SeanEboy

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Here are some 7680x1440 benchmarks on quad Titans as a reference: http://www.overclock.net/t/1415441/7680x1440-benchmarks-plus-2-3-4-way-sli-gk110-scaling/0_30. The 290X gets better scaling from bridgeless CrossFire, so you should do a bit better than that.


Hey, thanks a lot for that good find! I've been hunting, but I was missing some keywords... I appreciate it very much. Still unsure what direction to go, and kind of afraid that the 1440ps are coming to an end, as I think they're being phased out for something else...


----------



## MapRef41N93W

Quote:


> Originally Posted by *SeanEboy*
> 
> Hey, thanks a lot for that good find! I've been hunting, but I was missing some keywords... I appreciate it very much. Still unsure what direction to go, and kind of afraid that the 1440ps are coming to an end, as I think they're being phased out for something else...


No problem, I had the same questions you did a few months ago. I ended up just going single 1440p, because 120 FPS was just too good to give up for a bit of peripheral vision. 1440p is definitely not being "phased out", as 4K is still very much in the early-adoption phase at this point. Hell, 1080p still hasn't been phased out yet. Not to mention the ASUS 1440p 120Hz panel with G-Sync coming out in a few months.


----------



## SeanEboy

Quote:


> Originally Posted by *MapRef41N93W*
> 
> No problem, I had the same questions you did a few months ago. I ended up just going single 1440p, because 120 FPS was just too good to give up for a bit of peripheral vision. 1440p is definitely not being "phased out", as 4K is still very much in the early-adoption phase at this point. Hell, 1080p still hasn't been phased out yet. Not to mention the ASUS 1440p 120Hz panel with G-Sync coming out in a few months.


Yeah, that is basically the boat I'm in... I would probably run Eyefinity in portrait though...

I don't mean 1440p being phased out, but this brand/run/whatever. I tried to place an order for (50) of them, and they said they don't have enough, and that they will be 'disappeared' soon; translate that to proper English and it doesn't sound good.


----------



## Widde

Quote:


> Originally Posted by *Heinz68*
> 
> I just choose the same custom setting, which is already a TriXX preset, and in 'Settings' I tick "Synchronize cards in Multi-GPU config" and "Save fan settings in profile", then apply and save all.
> It works fine for me even when not in a Crossfire setup, which I seldom use due to mining.


It didn't work, the 2nd GPU still runs its fan at stock







Afterburner will have to do ^^ Nothing against it, but TriXX offers more voltage offset, right? I know you can do it with some commands in AB as well, but it's more convenient with TriXX ^^


----------



## lawson67

Hey guys, I ordered a Corsair RM1000 1000W Gold PSU today; I hope that's enough power. I think my Gigabyte WindForce R9 290 and my PowerColor R9 290 are too much of a strain on my Seasonic Gold 750W PSU. In Heaven benchmark, after about 10-15 minutes the frame rates drop to half what they were and it starts to really judder. I don't believe it is throttling, as the highest temp hit is 76C, and my Seasonic's fan seems to be running flat out; with my 2x GTX 660 I never saw that fan come on! Have any of you guys had a PSU that was NOT coping with the power demands and experienced a gradual slowdown in frame rates accompanied by a slow, juddery screen? Does this sound like a PSU-not-coping problem as opposed to a GPU problem? Thanks for any help.









Also, do you guys know of a good program that will show me, measure on the fly, record, or even show me via an OSD what my GPUs' loads and frequencies are hitting while benchmarking? It would be great to have an accurate OSD in the benchmark software showing what is going on. Either way, I don't think my 750W Seasonic is coping.


----------



## The Mac

aida64


----------



## phallacy

Quote:


> Originally Posted by *lawson67*
> 
> Hey guys, I ordered a Corsair RM1000 1000W Gold PSU today; I hope that's enough power. I think my Gigabyte WindForce R9 290 and my PowerColor R9 290 are too much of a strain on my Seasonic Gold 750W PSU. In Heaven benchmark, after about 10-15 minutes the frame rates drop to half what they were and it starts to really judder. I don't believe it is throttling, as the highest temp hit is 76C, and my Seasonic's fan seems to be running flat out; with my 2x GTX 660 I never saw that fan come on! Have any of you guys had a PSU that was NOT coping with the power demands and experienced a gradual slowdown in frame rates accompanied by a slow, juddery screen? Does this sound like a PSU-not-coping problem as opposed to a GPU problem? Thanks for any help.
> 
> Also, do you guys know of a good program that will show me, measure on the fly, record, or even show me via an OSD what my GPUs' loads and frequencies are hitting while benchmarking? It would be great to have an accurate OSD in the benchmark software showing what is going on. Either way, I don't think my 750W Seasonic is coping.


Yeah, the 750W may be the reason the throttling and reduced FPS are occurring. 1000W should be fine for Crossfire even with a sizable OC; I use a 1300W for 3-way and it holds up quite well. The Afterburner OSD is a good one, but if you are using another OC utility such as TriXX or GPU Tweak, I believe RadeonPro has an OSD function. Since you are in the UK, I would also check out Super Flower PSUs, as they pretty much make them for all the high-end PSU companies. I want to get their 1200W modular Leadex Platinum when I add a second PSU to my rig. They look so cool : )


----------



## lawson67

Quote:


> Originally Posted by *phallacy*
> 
> Yeah, the 750W may be the reason the throttling and reduced FPS are occurring. 1000W should be fine for Crossfire even with a sizable OC; I use a 1300W for 3-way and it holds up quite well. The Afterburner OSD is a good one, but if you are using another OC utility such as TriXX or GPU Tweak, I believe RadeonPro has an OSD function. Since you are in the UK, I would also check out Super Flower PSUs, as they pretty much make them for all the high-end PSU companies. I want to get their 1200W modular Leadex Platinum when I add a second PSU to my rig. They look so cool : )


Oh, so you think it is my PSU then? I was worried whether a PSU could throttle a card, or whether maybe I had a duff GPU. Anyhow, glad that you think it's down to the PSU, as I do. As for looking for a Super Flower PSU, it's a bit late, as my new Corsair RM 1000W will be at my house before 12pm tomorrow.

Also, has anyone here got a Corsair RM 1000W PSU and is happy with it? I noticed it came with Corsair Link; does anyone know if it will be seen in my existing Corsair H80i software, or will I need to connect its lead to another USB 2.0 header for it to be seen?


----------



## BradleyW

Quote:


> Originally Posted by *lawson67*
> 
> Oh, so you think it is my PSU then? I was worried whether a PSU could throttle a card, or whether maybe I had a duff GPU. Anyhow, glad that you think it's down to the PSU, as I do. As for looking for a Super Flower PSU, it's a bit late, as my new Corsair RM 1000W will be at my house before 12pm tomorrow.
> 
> Also, has anyone here got a Corsair RM 1000W PSU and is happy with it? I noticed it came with Corsair Link; does anyone know if it will be seen in my existing Corsair H80i software, or will I need to connect its lead to another USB 2.0 header for it to be seen?


Corsair RM series use very cheap components apparently. I'd steer clear.


----------



## lawson67

Quote:


> Originally Posted by *BradleyW*
> 
> Corsair RM series use very cheap components apparently. I'd steer clear.


A bit late for that, it will be at my house in the morning!


----------



## BradleyW

Quote:


> Originally Posted by *lawson67*
> 
> A bit late for that, it will be at my house in the morning!


Sorry, I did not realise.

Here, this is where I got my info from at least. Hopefully you get no issues with your product!









http://www.overclock.net/t/1455892/why-you-should-not-buy-a-corsair-rm-psu


----------



## lawson67

Quote:


> Originally Posted by *BradleyW*
> 
> Sorry, I did not realise.
> 
> Here, this is where I got my info from at least. Hopefully you get no issues with your product!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1455892/why-you-should-not-buy-a-corsair-rm-psu


Oh, I don't think I'll read that, it would be too depressing lol... Does anyone know if I will be able to use the existing Seasonic cables that are already wired into my case, by simply installing the Corsair PSU and plugging the Seasonic leads into it? Or do I have to rewire the whole case and use the leads that come with the Corsair PSU?


----------



## BradleyW

Quote:


> Originally Posted by *lawson67*
> 
> Oh, I don't think I'll read that, it would be too depressing lol... Does anyone know if I will be able to use the existing Seasonic cables that are already wired into my case, by simply installing the Corsair PSU and plugging the Seasonic leads into it? Or do I have to rewire the whole case and use the leads that come with the Corsair PSU?


I'd recommend using the cables that came with the PSU. Modular cable pinouts aren't standardised between brands, so reusing the Seasonic leads on the Corsair could damage your components. So why are you upgrading the PSU anyway?


----------



## Paul17041993

Quote:


> Originally Posted by *lawson67*
> 
> Oh, I don't think I'll read that, it would be too depressing lol... Does anyone know if I will be able to use the existing Seasonic cables that are already wired into my case, by simply installing the Corsair PSU and plugging the Seasonic leads into it? Or do I have to rewire the whole case and use the leads that come with the Corsair PSU?


In a nutshell, if you get coil whine and/or stability issues, that PSU should be the first to go.
And you should have at least a 14-day return policy, I would think, if you want to cancel the order after you have received it; check who you got it from.


----------



## lawson67

Quote:


> Originally Posted by *BradleyW*
> 
> I'd recommend using the cables that came with the PSU. So why are you upgrading the PSU anyway?


Because my 750W Seasonic can't handle two R9 290 cards. Also, I am not that bothered if the Corsair is crap; it has a 5-year warranty


----------



## BradleyW

Quote:


> Originally Posted by *Paul17041993*
> 
> In a nutshell, if you get coil whine and/or stability issues, that PSU should be the first to go.
> And you should have at least a 14-day return policy, I would think, if you want to cancel the order after you have received it; check who you got it from.


Quote:


> Originally Posted by *lawson67*
> 
> Because my 750W Seasonic can't handle two R9 290 cards. Also, I am not that bothered if the Corsair is crap; it has a 5-year warranty


Like Paul said above, you should have up to 14 days. Give the PSU a test drive and see if you get any stability or coil whine issues. Hopefully not! Let us know how you get on.


----------



## Durquavian

Quote:


> Originally Posted by *lawson67*
> 
> Oh, I don't think I'll read that, it would be too depressing lol... Does anyone know if I will be able to use the existing Seasonic cables that are already wired into my case, by simply installing the Corsair PSU and plugging the Seasonic leads into it? Or do I have to rewire the whole case and use the leads that come with the Corsair PSU?


Don't worry too much. I just saw that the CX600 isn't recommended here, but after reading the review it is quite decent and not quite the doom and gloom some make it out to be. Makes me wonder what Shilka thinks of my long history of use with Corsair's GS series. The review on my GS800 looked outstanding.


----------



## lawson67

Quote:


> Originally Posted by *Durquavian*
> 
> Don't worry too much. I just saw that the CX600 isn't recommended here, but after reading the review it is quite decent and not quite the doom and gloom some make it out to be. Makes me wonder what Shilka thinks of my long history of use with Corsair's GS series. The review on my GS800 looked outstanding.


I don't believe the guy doing the review mentioned the RM 1000W or who its OEM maker was. He did say they were more expensive than the Seasonics etc., but I think £131.16 for a 1000W PSU with a 5-year warranty is fantastic value for money!

http://www.dabs.com/products/corsair-1000w-rm1000-rm-series-power-supply-95GY.html?q=corsair%20rm1000&src=16


----------



## BradleyW

Quote:


> Originally Posted by *lawson67*
> 
> I don't believe the guy doing the review mentioned the RM 1000W or who its OEM maker was. He did say they were more expensive than the Seasonics etc., but I think £131.16 for a 1000W PSU with a 5-year warranty is fantastic value for money!
> 
> http://www.dabs.com/products/corsair-1000w-rm1000-rm-series-power-supply-95GY.html?q=corsair%20rm1000&src=16


True! My CM 1000w was £260 and my current PSU (860) was about £170.


----------



## CoinMinerPower

R9 290+R9 290X CrossFireX

http://www.techpowerup.com/gpuz/9ax87/
Reference card
Peltier cooled

http://www.techpowerup.com/gpuz/gq6up/
Reference card
Peltier cooled


----------



## k3rast4se

jonnyGURU is THE guy for PSU reviews


----------



## Durquavian

Quote:


> Originally Posted by *lawson67*
> 
> I don't believe the guy doing the review mentioned the RM 1000W or who its OEM maker was. He did say they were more expensive than the Seasonics etc., but I think £131.16 for a 1000W PSU with a 5-year warranty is fantastic value for money!
> 
> http://www.dabs.com/products/corsair-1000w-rm1000-rm-series-power-supply-95GY.html?q=corsair%20rm1000&src=16


From what I read, the lower-wattage versions were the problem; the 850 and up were far better quality.


----------



## lawson67

Quote:


> Originally Posted by *Durquavian*
> 
> From what I read, the lower-wattage versions were the problem; the 850 and up were far better quality.


Well, I do hope so... However, this fantastic YouTube video surely has got to quash some of the doubts surrounding Corsair's RM line of PSUs, and in this case more specifically the lower-wattage ones!


----------



## Tonza

Meh, I really want to paint the yellow shroud red on my 290 Tri-X, but I guess the warranty is gone instantly (even though removing the shroud does not require removing the heatsink completely). Btw, does Sapphire allow waterblock installation without losing the warranty? At least it seems that there aren't any stickers on any of the screws.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Tonza*
> 
> Meh, I really want to paint the yellow shroud red on my 290 Tri-X, but I guess the warranty is gone instantly (even though removing the shroud does not require removing the heatsink completely). Btw, does Sapphire allow waterblock installation without losing the warranty? At least it seems that there aren't any stickers on any of the screws.


Pretty sure it's a don't ask don't tell policy with Sapphire









And yeah, the yellow throws me off on the Tri-X and the gold on the Toxic looks bad imo. The blue on the upcoming Vapor-X is nice though


----------



## Aussiejuggalo

If you don't wanna void the warranty on uglyish cards like the Tri-X, cover it with something like carbon fibre vinyl


----------



## Sgt Bilko

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> If you don't wanna void the warranty on uglyish cards like the Tri-X, cover it with something like carbon fibre vinyl


Good point, never thought of that


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Good point, never thought of that


I'm a bit obsessed with carbon fibre vinyl







I plan on getting white and covering my fridge door... coz I can


----------



## Red1776

ROFL


----------



## Sgt Bilko

So....this has been making the rounds lately,



Dual Hawaii anyone?

Source: http://www.pcper.com/news/Graphics-Cards/AMD-Teasing-Dual-GPU-Graphics-Card-Punks-Me-Same-Time

Also rumors of a hybrid Air/Water Ares III in the works as well









http://www.forbes.com/sites/jasonevangelho/2014/03/08/amds-dual-290x-graphics-card-is-just-a-rumor-but-heres-why-it-definitely-exists/


----------



## Maracus

Quote:


> Originally Posted by *Sgt Bilko*
> 
> So....this has been making the rounds lately,
> 
> 
> 
> Dual Hawaii anyone?
> 
> Source: http://www.pcper.com/news/Graphics-Cards/AMD-Teasing-Dual-GPU-Graphics-Card-Punks-Me-Same-Time
> 
> Also rumors of a hybrid Air/Water Ares III in the works as well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.forbes.com/sites/jasonevangelho/2014/03/08/amds-dual-290x-graphics-card-is-just-a-rumor-but-heres-why-it-definitely-exists/


Surely if it were true it could only effectively be cooled by water.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Maracus*
> 
> Surely if it were true it could only effectively be cooled by water.


Air cooling could do it but overclocking it on the stock cooling would probably be out of the question.

The Ares III will prove interesting though.


----------



## fubmabs

Will my RM750 handle another 290 along with my 4770K?


----------



## Forceman

Quote:


> Originally Posted by *fubmabs*
> 
> Will my RM750 handle another 290 along with my 4770K?


Not overclocked, but maybe at stock. I don't know that I would try it though - safer just to get something larger. 850W should be enough.
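Rough headroom math backs this up. A minimal sketch of the estimate, where every wattage figure is a ballpark assumption (not a measured value) and the 80% loading target is a common rule of thumb, not a spec:

```python
# Rough PSU headroom estimate for a 4770K plus one or two R9 290s.
# All wattages below are ballpark assumptions; measure at the wall to be sure.
CPU_W = 130          # overclocked 4770K under load (assumed)
GPU_W = 300          # one R9 290 under load with a mild OC (assumed)
REST_W = 80          # board, RAM, drives, fans (assumed)
HEADROOM = 0.80      # rule of thumb: load the PSU to ~80% of its rating

def min_psu_watts(n_gpus: int) -> int:
    """Return a suggested minimum PSU rating in watts."""
    draw = CPU_W + n_gpus * GPU_W + REST_W
    return round(draw / HEADROOM)

if __name__ == "__main__":
    for n in (1, 2):
        print(f"{n} GPU(s): ~{min_psu_watts(n)} W recommended")
```

With these assumed numbers, two cards land just over 1000 W recommended, which matches the advice in the thread that a 750 W unit is marginal for overclocked Crossfire while 850-1000 W is comfortable.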


----------



## Paul17041993

Considering I can make my 290X use only 80W in Folding@home, an air-cooled dual 290/X is easily possible, but watercooling would allow full spec regardless.


----------



## fubmabs

Quote:


> Originally Posted by *Forceman*
> 
> Not overclocked, but maybe at stock. I don't know that I would try it though - safer just to get something larger. 850W should be enough.


Well, I have a new HX850 fresh from RMA, but it doesn't match my black and yellow build


----------



## Sgt Bilko

Quote:


> Originally Posted by *fubmabs*
> 
> Well, I have a new HX850 fresh from RMA, but it doesn't match my black and yellow build


Black Vinyl and Yellow Pinstripes, easy to apply and easy to take off


----------



## fubmabs

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Black Vinyl and Yellow Pinstripes, easy to apply and easy to take off


That's an idea, I just feel like everyone overdoes the vinyl thing.
Is there still a guy here who customizes the PSU decal for you?


----------



## Sgt Bilko

Quote:


> Originally Posted by *fubmabs*
> 
> That's an idea, I just feel like everyone overdoes the vinyl thing.
> Is there still a guy here who customizes the PSU decal for you?


If you like it then that's all that really matters









try here: http://www.overclock.net/f/260/overclock-net-artisans


----------



## Roy360

Quote:


> Originally Posted by *Tonza*
> 
> Meh, I really want to paint the yellow shroud red on my 290 Tri-X, but I guess the warranty is gone instantly (even though removing the shroud does not require removing the heatsink completely). Btw, does Sapphire allow waterblock installation without losing the warranty? At least it seems that there aren't any stickers on any of the screws.


You shouldn't even consider Sapphire's warranty as a warranty. It's more like a repair service.

In Canada, to RMA we pay shipping both ways (to China) and have to pay their middleman an admin fee (varies by province).


----------



## Paul17041993

Quote:


> Originally Posted by *Roy360*
> 
> You shouldn't even consider Sapphire's warranty as a warranty. It's more like a repair service.
> 
> In Canada, to RMA we pay shipping both ways (to China) and have to pay their middleman an admin fee (varies by province).


Which is why the store/distributor handles the warranty under policy. The only thing you should pay is shipping to them (for the first two at least), and a service fee if it was a false RMA (rarely the case).


----------



## chronicfx

Question for 290X trifire (or above) owners. I am currently running an i5 3570K with 290X trifire; rig is in sig (updated, I think). Would upgrading to a 3770K give me any advantage in gaming? I am aware that the advantages are small for single and dual GPU, but with future games arriving and better CPU management on the horizon, will swapping to a 3770K be worthwhile? I would go to 2011 or the new Haswell-E 5xxx processors, but it is just too much money to justify. So for $249 at Microcenter, will I see a difference in gaming? I know my 3DMark physics score goes up if I do that, but does it translate into games?


----------



## phallacy

Quote:


> Originally Posted by *chronicfx*
> 
> Question for 290X trifire (or above) owners. I am currently running an i5 3570K with 290X trifire; rig is in sig (updated, I think). Would upgrading to a 3770K give me any advantage in gaming? I am aware that the advantages are small for single and dual GPU, but with future games arriving and better CPU management on the horizon, will swapping to a 3770K be worthwhile? I would go to 2011 or the new Haswell-E 5xxx processors, but it is just too much money to justify. So for $249 at Microcenter, will I see a difference in gaming? I know my 3DMark physics score goes up if I do that, but does it translate into games?


Can you tell us your CPU usage during your most-played games? Is it reaching its limits before the cards do? I was in the same boat a few weeks back; I moved from a 3570K to a 4770K and I've noticed about a 10 FPS difference on the highest settings. This is anecdotal, but from what it feels like, it is indeed better. I'm also waiting to see what Haswell-E brings.


----------



## chronicfx

What program can I use to log CPU usage in graph form over, let's say, an hour-long period?


----------



## phallacy

Quote:


> Originally Posted by *chronicfx*
> 
> What program can I use to log CPU usage in graph form over, let's say, an hour-long period?


I think AIDA64 will do that, can't say 100% though.


----------



## kizwan

Quote:


> Originally Posted by *chronicfx*
> 
> What program can I use to log CPU usage in graph form over, let's say, an hour-long period?


Open Hardware Monitor. It automatically logs CPU usage; you just need to run it in the background.

3570K @ 5GHz for 24/7? You didn't experience sudden FPS drops when playing games like BF3/BF4?


----------



## The Mac

Quote:


> Originally Posted by *phallacy*
> 
> I think AIDA64 will do that, can't say 100% though.


yup.


----------



## JordanTr

Hello guys. Today I had some time, so I cut my stock cooler's plate and used it for the VRMs. I also changed my thermal paste from Gelid's (was on the Gelid Icy Vision) to Noctua NT-H1. VRM1 temps under load dropped from 98 to 80 and VRM2 from 74 to 54; core max temp also dropped from 78 to 69 (didn't expect that much difference from changing thermal compound). I guess my VRMs could be a bit cooler, but I'm using Be Quiet! Shadow Wings 92mm 1600RPM fans to run ultimately quiet; that's the best quiet/cool ratio.


----------



## chronicfx

Quote:


> Originally Posted by *kizwan*
> 
> Open Hardware Monitor. It automatically logs CPU usage; you just need to run it in the background.
> 
> 3570K @ 5GHz for 24/7? You didn't experience sudden FPS drops when playing games like BF3/BF4?


I'm not into multiplayer; campaigns have not been a problem. I play single-player games mostly. I am thinking down the line to Watch Dogs, Star Citizen, The Witcher 3 etc., and to when Mantle and the new DirectX are out, possibly with better utilization. I have tried Hardware Monitor in the past, but how do you get it to show longer than just the last few minutes? I would like to see a whole gaming session, not just where I died and decided to quit for the night.


----------



## kizwan

Quote:


> Originally Posted by *chronicfx*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Open Hardware Monitor. It automatically logs CPU usage; you just need to run it in the background.
> 
> 3570K @ 5GHz for 24/7? You didn't experience sudden FPS drops when playing games like BF3/BF4?
> 
> 
> 
> 
> 
> 
> 
I'm not into multiplayer; campaigns have not been a problem. I play single-player games mostly. I am thinking down the line to Watch Dogs, Star Citizen, The Witcher 3 etc., and to when Mantle and the new DirectX are out, possibly with better utilization. I have tried Hardware Monitor in the past, but how do you get it to show longer than just the last few minutes? I would like to see a whole gaming session, not just where I died and decided to quit for the night.

The Hardware Monitor you're referring to might be different software. The one I'm referring to is *Open Hardware Monitor*. I like to use it because I don't need to set anything; it automatically logs & keeps the data for the last 24 hours.
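Once you have a log from a session, summarizing it is straightforward. A minimal sketch in Python, assuming a simplified two-column `time,load` CSV layout — Open Hardware Monitor's real log has one column per sensor, so the filename and column index here are illustrative assumptions you'd adapt to your own file:

```python
# Sketch: summarize a CPU-load log over a gaming session.
# Assumes a simplified "time,load" CSV; real sensor logs have one column
# per sensor, so adjust the column index for your own file.
import csv

def summarize(path: str) -> dict:
    loads = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            try:
                loads.append(float(row[1]))   # second column = CPU load %
            except (IndexError, ValueError):
                continue                      # skip header / malformed rows
    return {"samples": len(loads),
            "avg": sum(loads) / len(loads),
            "peak": max(loads)}

if __name__ == "__main__":
    # build a tiny fake log for demonstration
    with open("cpu_log.csv", "w") as f:
        f.write("time,cpu_total\n10:00:00,42.5\n10:00:01,97.0\n10:00:02,88.5\n")
    print(summarize("cpu_log.csv"))  # -> {'samples': 3, 'avg': 76.0, 'peak': 97.0}
```

If the average sits near 100% while the GPUs are underloaded, that points to a CPU bottleneck rather than the cards or the PSU.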


----------



## chronicfx

Quote:


> Originally Posted by *kizwan*
> 
> The Hardware Monitor you're referring to might be different software. The one I'm referring to is *Open Hardware Monitor*. I like to use it because I don't need to set anything; it automatically logs & keeps the data for the last 24 hours.


Yes, that software looks perfect; I was referring to Task Manager, excuse me for that. Is it free to download, or does it at least have a trial period? Can you link it if you have some time today?


----------



## kizwan

Quote:


> Originally Posted by *chronicfx*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> The Hardware Monitor you're referring to might be different software. The one I'm referring to is *Open Hardware Monitor*. I like to use it because I don't need to set anything; it automatically logs & keeps the data for the last 24 hours.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes, that software looks perfect; I was referring to Task Manager, excuse me for that. Is it free to download, or does it at least have a trial period? Can you link it if you have some time today?

It's free, no trial. They accept donations though. Here is the link:
http://openhardwaremonitor.org/


----------



## MapRef41N93W

So I installed a G10 with an H90 on my Tri-X, but my temps are almost the same??? How is that even possible?

Actually it seems like it's even running hotter. Temps start at 49C and then slowly rise by 1C every minute or so.


----------



## phallacy

Quote:


> Originally Posted by *MapRef41N93W*
> 
> So I installed a G10 with an H90 on my Tri-X, but my temps are almost the same??? How is that even possible?


Try reseating it. What TIM are you using? Sapphire Tri X is one of the best aftermarket coolers too so that could be it.


----------



## MapRef41N93W

Quote:


> Originally Posted by *phallacy*
> 
> Try reseating it. What TIM are you using? Sapphire Tri X is one of the best aftermarket coolers too so that could be it.


My Tri-X was awful; it idled in the high 40s to low 50s. The TIM was the stuff that came pre-applied on the H90. I really don't want to reseat the damn thing... such a pain in the ass.


----------



## GhostRyder

I've currently got 3 PowerColor R9 290X BF4 Editions that I attached EK waterblocks to.

Mine run stable at 1125 on the core; I can't get any higher, but temps are extremely low in my system (below 55C under full stress on all 3)


----------



## phallacy

Quote:


> Originally Posted by *GhostRyder*
> 
> I've currently got 3 PowerColor R9 290X BF4 Editions that I attached EK waterblocks to.
> 
> Mine run stable at 1125 on the core; I can't get any higher, but temps are extremely low in my system (below 55C under full stress on all 3)


That's pretty much exactly where I'm topping out temp-wise under full extended load with my trifire setup. Are you at 1125 core with stock memory clocks, or what are you doing on those? Also, what is your mV offset? I'm assuming the 50% power limit is in there as well?


----------



## GhostRyder

Quote:


> Originally Posted by *phallacy*
> 
> That's pretty much exactly where I'm topping out temp-wise under full extended load with my trifire setup. Are you at 1125 core with stock memory clocks, or what are you doing on those? Also, what is your mV offset? I'm assuming the 50% power limit is in there as well?


My memory clocks are 1350 with my 1125 core clocks. Yes, that's with MSI Afterburner set to the max allowed voltage offset.

I have tried going further, but with even a 1MHz difference, while I can still complete some benchmarks, I get random screen tearing. I have done a burn-in test on all three cards with these settings and they all seem happy after a 3-hour session. Temps never exceed 55C that I've seen, even including stress from the processor.

Though to be fair, I did this overclock rather quickly, as work has kept me bogged down lately








So I may be able to go further with a little bit of playing with the machine.


----------



## phallacy

Quote:


> Originally Posted by *GhostRyder*
> 
> My memory clocks are 1350 with my 1125 core clocks. Yes, that's with MSI Afterburner set to the max allowed voltage offset.
> 
> I have tried going further, but with even a 1MHz difference, while I can still complete some benchmarks, I get random screen tearing. I have done a burn-in test on all three cards with these settings and they all seem happy after a 3-hour session. Temps never exceed 55C that I've seen, even including stress from the processor.
> 
> Though to be fair, I did this overclock rather quickly, as work has kept me bogged down lately
> 
> 
> 
> 
> 
> 
> 
> 
> So I may be able to go further with a little bit of playing with the machine.


You should try TriXX as well to see if you can eke out more performance if you desire. I know for my clocks to be stable at 1200, while running all games smoothly with no artifacting, tearing etc., I need between +110mV and +134mV depending on the card. I can bench 2/3 of the cards at 1250/1650 and the third at 1225/1625 if I bump the offset up to about +145-155mV on each card, but it's still not stable for most games; I usually get the "driver has stopped responding" error and the game shuts down. Another thing: bring the memory back to stock and work on the core first. I haven't noticed nearly the same performance increase from memory as from core.
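The workflow described here (step the clock up, stress-test, back off at the first failure, then tune memory separately) is essentially a linear search for the highest passing clock. A tiny illustrative sketch, where `run_benchmark` is a stand-in for an actual stress run (Heaven, a game session, etc.), not a real API:

```python
# Sketch of the manual OC workflow: step the core clock up in fixed
# increments, run a stability test at each step, and keep the highest
# clock that passes. `run_benchmark` is a placeholder for a real stress
# run; here it is just a function returning True/False for stability.
def find_stable_clock(run_benchmark, start=1000, limit=1300, step=25):
    best = None
    clock = start
    while clock <= limit:
        if run_benchmark(clock):
            best = clock        # passed: remember it and try higher
            clock += step
        else:
            break               # failed: the previous passing clock stands
    return best

if __name__ == "__main__":
    # pretend this particular card starts artifacting above 1200 MHz
    fake_bench = lambda mhz: mhz <= 1200
    print(find_stable_clock(fake_bench))  # -> 1200
```

In practice each "test" is a long session per the posts above, which is why core and memory are tuned one at a time rather than together.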


----------



## MapRef41N93W

I re-seated my G10 and applied MX-4 and am still seeing these ridiculous temps: they start at 47C and then just keep climbing by 1 degree every half a minute or so. I noticed the G10 bracket gets pulled away by the PCI-E cable when you plug it in, and am wondering if this is causing the issue. Overall it's a pretty janky design and I am thinking of just going back to the Tri-X cooler and returning the G10s and H90s.


----------



## lawson67

Quote:


> Originally Posted by *kizwan*
> 
> Hardware Monitor you're referring to might be different software. The one I'm referring to is *Open Hardware Monitor*. I like to use it because I don't need to set anything, it automatically log & keep the data for the last 24 hours.


Hi, I've just downloaded and installed Open Hardware Monitor. How do I get it to graph like you have? I have some problems that I hope someone can help me with, and if this software can monitor and show me what's *been* going on it will be really helpful. I bought 2x R9 290 (non-X) cards: a PowerColor R9 290 PCS+ for the second slot, and a Gigabyte WindForce R9 290 for the top slot. Also, today I received and installed my new Corsair RM 1000W.

I bought it because with my Seasonic 750W G Series Gold PSU, after about 10 minutes of looping Heaven Benchmark 4.0, the frame rates dropped and the screen juddered along. I guessed my PSU wasn't up to the job, so today I bought and installed the Corsair RM 1000W, and the same thing happens again, but this time after about 40-50 minutes: frame rates drop to half what they were and the screen starts to judder. If I stop the benchmark and let everything cool down, I can restart Heaven and it all seems fine again until the FPS eventually drop to half and it starts juddering again. I'm finding it hard to believe this problem is still down to the PSU, and I know there are people here who are going to say "it's your Corsair RM 1000W, it's a piece of crap", etc., which is not going to be helpful. Even if it is, please just pretend it's not my PSU and think what else it could be!

Also, on Tuesday I'll have a plug-in power monitor arriving with an LCD screen; you plug your computer into it and it tells you how many watts it's pulling, which will be helpful. Right now I need some good hardware monitoring so I can better understand what's throttling the cards, and I need to know if the fans on the cards are going to 100% under load. I know none of this right now, which is why I'm keen to use Open Hardware Monitor in graph mode, so any help with that would be great. Also, from the screenshot below I don't believe the max temps on the cards are getting high enough to throttle them... or are they?

Also, I've read (as in my own case) that it's normal to see the cards' core clocks fluctuating quite rapidly even when benchmarking in Heaven 4.0; I've seen one card drop to 0% load and suddenly shoot back up to 100% load. I've been told it's just PowerTune doing its work; can someone confirm this please? Also, in the screenshot below, which I took after shutting down Heaven 4.0 due to the halved frame rates and juddering screen, I noticed that the PowerColor card's fan has not been going flat out, which I set it to do at 70C in Trixx using the "synchronize both cards" setting. Maybe the Trixx software doesn't like the PowerColor card?

Also, I've been told not so much to check the GPU temps but rather the VRM temps, as they can throttle the card. I've also noticed my WindForce has the F3L BIOS installed and there's a new one out called F3; do you guys consider it a *must* to load this BIOS update? I can't find an update to the latest BIOS on the PowerColor website, if there even is one. Also, I've heard you guys are all using one type of BIOS for these cards; which one is it, and where do I find the tools and instructions to flash my cards with it? I'm also thinking maybe that BIOS would unlock the fans on my card to run at 100%, which the PowerColor doesn't seem to do.

In the picture below are two instances of GPU-Z; the one with the lowest max fan speed is the PowerColor card, and I don't believe that card's max temp of 77C should be enough to throttle it. All the temps and core clocks in GPU-Z are set to show the max reached, which I'm hoping will give you more experienced guys a better idea of how to help me. It might be something simple like the VRMs overheating, but any help with my throttling cards and juddering screen after 30 minutes or so of the Heaven 4.0 benchmark would be appreciated, as clearly something isn't right; it should be able to run Heaven all night long if I wanted it to. Any help setting up the graph in Open Hardware Monitor would also be much appreciated, as that will let me see what's been happening in the background while Heaven 4.0 runs, which could hopefully lead to the root of this problem. I know I have a lot of questions, but I feel it could be something simple, like needing the same BIOS you guys are on. I'm very new to these graphics cards; I've always used Nvidia up till now, so any help would be very much appreciated, especially turning the graph on in Open Hardware Monitor.








Thanks Guys


----------



## Forceman

What are your VRM temps? They're usually shown in the bottom half of the GPU-Z sensors tab (VRM1 is the one to watch). You can also check them in HWInfo (which can also log values for charting like you wanted).

You've increased the power limit in Trixx? And checked that it actually took effect in CCC? It sounds like over time your cards might be getting heat-soaked, pushing the VRM temps (or something else) to the point where they start throttling.
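For anyone who wants to scan such a log instead of eyeballing it, here's a minimal sketch that pulls the peak value out of a CSV sensor export (HWInfo can log to CSV; the exact column header here is an assumption, so pass whatever your log actually uses):

```python
import csv

def max_sensor_value(log_path, column="GPU VRM1 [C]"):
    """Return the highest reading seen in one column of a CSV sensor log.

    The default column name is a placeholder; match it to your log's header.
    """
    readings = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            value = (row.get(column) or "").strip()
            try:
                readings.append(float(value))
            except ValueError:
                continue  # skip blank cells and non-numeric footer rows
    return max(readings) if readings else None
```

If the peak on VRM1 comes back near or above ~90C during a Heaven loop, that's the first thing to investigate.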


----------



## kizwan

Quote:


> Originally Posted by *lawson67*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Hardware Monitor you're referring to might be different software. The one I'm referring to is *Open Hardware Monitor*. I like to use it because I don't need to set anything, it automatically log & keep the data for the last 24 hours.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hi i have just downloaded and installed Open Hardware Monitor..how do i get it to graph like you have?...i have some problems that i hope someone can help me with and this software if it can monitor and show me what's *been* going on will be really helpful!...i bought 2x R9 290 non X cards one being a power color r9 290 pcs+...and for the second slot.... and a Gigabyte wind-force r9 290 for the top slot...Also today i have received and now installed my new Corsair RM 1000w ...
> 
> I bought this because in Heaven Benchmark 4.0 with my seasonic 750 watt G series Gold PSU after about 10 mins looping Heaven Benchmark 4.0 the frame rates drop and the screen was juddering along..i guessed that my PSU was not up to the job?..so today i have bought and installed the Corsair RM 1000w Psu and again the same thing happens but this time after about 40-50mins?...Frames rate drop to half what the where and screen starts to judder?....now if stop the benchmark and let everything cool down i can restart heaven it all seems fine again until it eventually the FPS drop to half and it starts juddering again?...and i am finding it hard to believe that this problem is now down to the PSU?..and i know that there are people here that are going to say..."it your Corsair RM 1000w PSW its a piece of crap etc which is not going to be helpful...and even if it is please just pretend its not my PSU and think what else it could be!".
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Also on Tuesday i will have arriving a plug in power monitor with an LCD screen that you plug your computer and it tells you how many watts its pulling which will be helpfull..right now i need some good hardware monitoring so i can better understand what's throttling the cards...also i need to know if the fans on the cards are going to 100% under load...i know none of this right now which is why i am keen to use Open Hardware Monitor in graph mode!.. so any help with that would be fine!...Also from the Screen shot below i don't believe the max temps on the cards are hitting high enough to be throttling the cards?...or are they?...
> 
> Also i have read as in my own case it is normal to see the cards core clocks fluctuating something rapidly even when benchmarking in heaven 4.0 i have seen one card drop to 0% load and suddenly shoot back up to 100% load...i have been told its is just PowerTune doing its work...can someone confirm this please?...Also in the Screen shot below that i took after shuting down heaven 4.0 due to the halved frame rates and juddering screen i noticed that the power color card fan has not been going flat out!...which i have set for it to do at 70c in Trixx software using synchronize both cards setting?...Maybe be the Trixx software does not like the power color card?
> 
> Also i have been told not so much as to check the GPU temps but more the VRM Temps as they can throttle the card?...i have also noticed the my windforce has the F3L bios installed and there is a new one out called F3 bios do you guys consider it a *must* to load this bios update?...Also i can not find on the power color website an update to the latest bios?...if there even is one?....also i have heard you guys are all using one type of bios for these cards?..which one is it and where do i find the tools and instructions to flash my cards with the bios you guys are using?...i am also thinking maybe this bios you guys are using will unlock the fans on my card to use 100% as in the case of the power color which does not seem to?
> 
> In the picture below are 2 instances of GPZ-U the one with the lowest max fan sped is the power color card!..and i don't believe the max temp on that card hitting 77c should be enough to throttle it?...all the temps and core clocks in GPZ-U i have set to show the max hit!... which i am hoping will give you more experienced guys more of an idea how to help me?...it might just be something simple like the VRM over heating...but any help with my throttling cards and juddering screen after 30 mins or so of heaven 4.0 benchmark will help as clearly something is not right and it should be able to run heaven all night long if i wanted it too!...Also any help setting up the graph on Open Hardware Monitor would also be much appreciated as this will enable me to see what's been happening in the background while heaven 4.0 is run which could hopefully lead the the root of this problem!....i know i have a lot of questions but i feel this could be something simple like i need to use the the same bios you guys are on etc?...and i am very new to these graphic cards...i have always upon till now used Nvidia...so any help guys would very much appreciated especially turning the graph on on hardware monitor
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks Guys
Click to expand...

Go to "View" >> "Show Plot". Then select whatever you want to display/plot.

AFAIK, when the PSU is insufficient to drive the cards, the system will just shut down because the cards try to draw more watts than the PSU can deliver. FPS drops or throttling aren't caused by the PSU. The confusion probably comes from the Power Limit: the Power Limit can make the card throttle because it tells the card to keep power draw/consumption at or below the preset value. A PSU can't do this because it has no way to "communicate" with the cards.

Regarding the core clock fluctuating, it depends on the benchmark or software you're using. With some benchmarks it's normal for the core clock to fluctuate while changing from one test to another. This graph shows my GPU's core & memory clocks during a complete Heaven benchmark run. Mine doesn't fluctuate at all.
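One way to turn a clock log like that into a single number is to compute what fraction of under-load samples sit below the clock you set. This is just an illustrative sketch (the 300 MHz idle cutoff and 3% tolerance are assumptions I picked, not anything PowerTune-specific):

```python
def throttle_fraction(core_clocks_mhz, target_mhz, tolerance=0.03, idle_cutoff=300):
    """Fraction of load samples running below (1 - tolerance) * target.

    Near 0 means steady clocks; a large value suggests the card is being
    pulled below its set clock (power limit, temperature, or both).
    """
    floor = target_mhz * (1 - tolerance)
    load = [c for c in core_clocks_mhz if c > idle_cutoff]  # drop idle/2D states
    if not load:
        return 0.0
    return sum(1 for c in load if c < floor) / len(load)
```

A steady run scores ~0; a run that spends half its time dipping to ~700 MHz against a 1000 MHz target scores ~0.5.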


----------



## bbond007

Quote:


> Originally Posted by *GhostRyder*
> 
> Ive currently got 3 PowerColor R9 290X BF4 Editions that I attached EK waterblocks to.
> 
> Mine run Stable at 1125 on the Core, can't get any higher but temps are extremely low on them in my system (Below 55 Under Full Stress on all 3)


Very nice. I like those temps









Do you know if that block is compatible with the MSI Gaming cards with the Twin Frozr cooler? I'm a little discouraged about the temps...

thanks! cheers!

Guess not: "Bigger than reference capacitors prevent the installation of EK-FC R9-290X series water blocks."


----------



## lawson67

Quote:


> Originally Posted by *Forceman*
> 
> What are your VRM Temps? They are usually shown in the bottom half of the GPU-Z sensors tab (VRM1 is the one to watch). You can also check them in HWInfo (which can also log values for charting like you wanted).
> 
> You've increased the power limit in Trixx? And checked that it actually took in CCC? Sounds like over time your cards might be getting heat-soaked and pushing the VRM Temps (or something else) to the point that it starts throttling.


Ah, no, I haven't increased the power limit in Trixx! I don't know what it does, lol, but what should I set it to? And where would I find it confirmed in CCC? Sorry if I'm being frustrating; I know nothing about AMD cards but am willing to learn. As for my VRM temps, I don't know yet, because on the last run I did (the one in the screenshot) I didn't look at them, since I didn't know I needed to. Right now I want to get Open Hardware Monitor set up, then hopefully before my next Heaven 4.0 run I'll have everything ready and it will cure my problem. Hopefully someone will jump in and tell me what to set my power limit to; I'll give Heaven another shot, check the VRM temps in GPU-Z afterwards, and report them here. So what do I set my power limit to?
Quote:


> Originally Posted by *kizwan*
> 
> Go to "View" >> "Show Plot". Then select whatever you want to display/plot.
> 
> AFAIK when PSU insufficient to drive the cards, it will just shutdown because the cards trying to draw an amount of watts that PSU unable to deliver. FPS drop or throttling doesn't caused by PSU. Probably the confusion came from the Power Limit. Power Limit can make the card throttling because it telling the card to keep power draw/consumption at or below the pre-set setting. PSU can't do this because it doesn't have a way to "communicate" with the cards.
> 
> Regarding core clock fluctuating, it depends on the benchmark or software you're using. With some benchmarks, it's normal for core clock to fluctuating while changing from one test to another. This graph show my GPU's core & memory clocks while running Heaven benchmark. This is complete benchmark run. Mine doesn't fluctuate at all.


Thank you for your helpful comments regarding the core clock fluctuating. I'll set up Open Hardware Monitor as instructed, and hopefully I won't get any core clock fluctuation in the next Heaven 4.0 run. Thanks also for letting me know it's not my PSU. Now all I need to know is what to set my power limit to and where to make sure it's set in CCC. Really appreciate your help, guys.


----------



## rdr09

Quote:


> Originally Posted by *lawson67*
> 
> Ahh no i have not increased the power limit in Trixx!..i don't know what it does lol..but what should i set it too?...and where would i find it confirmed in CCC...Sorry i maybe frustrating i no nothing about AMD cards but am willing to learn...As for what my temps are on MY VRM i don't know yet course the last run i did with the screen shot i took i did not look at them course i did not know i needed too!..right now i want to get hardware monitor set up then hopefully before my next run at heaven 4.0 i will have them set and hopefully it will cure my problem?...hopefully someone will jump in and tell me what to set my power limits too and ill give Heaven another shot and check the VRM temps in GPU-Z after and report them here!....so what do i set my power limit too?
> Thank you for your helpful comments as regard core clock fluctuating and i will set up Open Hardware Monitor as instructed and hopefully i wont get any core clock fluctuating in heaven 4.0 next run...and thank for letting me know its not my PSU ...now all i need to know is what to set my power level too and where to make sure it set in CCC?.....really appreciate your help guys


max the power limit.


----------



## MapRef41N93W

Sapphire must have cheaped out on the TIM or something. I decided I wasn't happy with the G10 and went back to the Tri-X cooler with freshly applied MX-4, and lo and behold my temps have dropped considerably: from idling in the 40s-50s down to the low 30s.


----------



## lawson67

Quote:


> Originally Posted by *rdr09*
> 
> max the power limit.


In Trixx, is the power limit what's called the VDDC offset? There's no mention of the words "power limit" in Trixx. Also, where do I find this confirmed in CCC? Thanks, guys.


----------



## Red1776

Yep, the BF4 and the MSI Gamer are both reference-design PCBs (I checked, because I have 4 R9 290X 4GB Twin Frozr Gamers) that are getting blocked as well.


----------



## lawson67

Quote:


> Originally Posted by *lawson67*
> 
> In Trixx is the power limit called VDDC offset as there is know mention in trixx of the word power limit?....Also where do i find this confirmed in CCC?.... thanks guys


Edit: I found the power limit and set it to 50%, the highest it will go!... and set the VDDC to 25%...?


----------



## Forceman

Quote:


> Originally Posted by *lawson67*
> 
> Edit i found power limit it and set it to 50% highest it will go!... and set the VDDC to 25%....?


Did you mean 25 mV for VDDC? You can check the power limit on the Performance tab in CCC; you should be able to see in the little chart that the circle has moved over to the right.


----------



## lawson67

Quote:


> Originally Posted by *Forceman*
> 
> Did you mean 25mV for VDDC? You can check the power limit on the Performance tab in CCC, you should be able to see in the little chart thing that the circle has moved over to the right.


Hi Forceman, I set the power limit as in the screenshots. However, this time after only 10 minutes in Heaven the FPS dropped in half and I got a juddering screen. Also, it looks like my GPU load is not staying at 100% in the Open Hardware Monitor graph; it's fluctuating up and down. I don't know what to do. Maybe both cards should be using the same BIOS, or the BIOS that you guys use, which I don't know where to get, how to install, or what tools to use; I need a link with instructions. Any help please, this is very frustrating. These cards cost me so much money and they seem useless. I'm sure someone can help me sort it. Thanks, guys.

Can you guys check my settings and results in the pictures below and help me fix this problem?






Also, here are the max voltages the cards have drawn; most of the readings in GPU-Z are set to show the max.


----------



## MapRef41N93W

I feel you Lawson. I sure as hell won't be buying AMD again after my experience with these 290xs. I have spent more time in BSOD, Screen lock, and re-installing drivers than I have playing games since I got them. I never once had to re-install an NVIDIA driver and rarely if ever saw BSOD, screen lock, etc.


----------



## Forceman

Have you tried running the cards one at a time to see if the problem still occurs? That might tell you if one of the cards is bad, or if it is a Crossfire issue.

You shouldn't have any problems running the different cards (or different BIOSes) together, unless there is something else wrong. Crossfire shouldn't care that they are different.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Forceman*
> 
> Have you tried running the cards one at a time to see if the problem still occurs? That might tell you if one of the cards is bad, or if it is a Crossfire issue.
> 
> You shouldn't have any problems running the different cards (or different BIOSes) together, unless there is something else wrong. Crossfire shouldn't care that they are different.


^This

Also, what resolution are you running Heaven at?


----------



## Sgt Bilko

Quote:


> Originally Posted by *MapRef41N93W*
> 
> I feel you Lawson. I sure as hell won't be buying AMD again after my experience with these 290xs. I have spent more time in BSOD, Screen lock, and re-installing drivers than I have playing games since I got them. I never once had to re-install an NVIDIA driver and rarely if ever saw BSOD, screen lock, etc.


There are people on both sides with good and bad experiences. Personally, AMD/ATI cards have never been an issue for me, but Nvidia ones have.

Just luck of the draw, really.


----------



## lawson67

Quote:


> Originally Posted by *Forceman*
> 
> Have you tried running the cards one at a time to see if the problem still occurs? That might tell you if one of the cards is bad, or if it is a Crossfire issue.
> 
> You shouldn't have any problems running the different cards (or different BIOSes) together, unless there is something else wrong. Crossfire shouldn't care that they are different.


I shall try testing each card on its own now. On the main card, did you see that one of the VRMs hit 90C? Do you think this could be the reason for the halved frame rates and juddering?

Also, when I test each card alone, I don't think I need to physically remove the other card? Just unplug the power cables from the card I'm not using, correct?

Also, I am running Heaven at 2560x1440.

EDIT: Also, it looks in the hardware monitor as if the loads are going up and down. However, after stopping Heaven when the frame rate drops in half and I get the juddering, I checked the average GPU core clock in GPU-Z and the result is over 1100 on both cards, so are they really jumping up and down?

Also, I am not even overclocking these cards; I'm not bothered about that. They will be fast enough at stock for me.

Could the VRM hitting 90C be the reason this is happening? At what temps do these cards throttle?


----------



## Forceman

90C is a little hot for the VRMs, but they're supposedly rated to 120C, so I doubt they'd be causing the throttling you're seeing at 90C. Is the problem happening right away now, or is it still only occurring after you've run the cards for a while?

Have you tried running anything besides Heaven to test them, maybe Valley or 3DMark? I'd hate for you to go through all this trouble only to find out it's a Heaven issue and not a card problem.

For comparison, here's my card running Heaven (using HWInfo logging)


----------



## SeanEboy

So, I'm a newb to overclocking. Is there a 290X overclocking guide? I'm going to watercool my quadfire setup, but I figure (much like Lawson, sup bro?!) that I should tune each card individually, right?

My process:
Bump up 25 MHz, run, and if there's no artifacting, bump another 25 MHz... Once artifacting occurs, bump the voltage up 10 mV and repeat until there's no more artifacting. Is there any sort of ratio in terms of MHz to mV increases?

I'm benching with 3Dmark, and Unigine Valley...
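That step-up procedure can be written down as a loop. This is a sketch of the logic only; `set_clock_and_voltage` and `passes_stress_test` are hypothetical stand-ins for what you'd actually do by hand in Trixx/Afterburner plus a Valley or 3DMark run:

```python
def find_stable_clock(base_mhz, max_offset_mv=100, clock_step=25, volt_step=10,
                      set_clock_and_voltage=None, passes_stress_test=None):
    """Walk the core clock up in 25 MHz steps; on artifacts, add 10 mV and retry.

    Stops once the voltage-offset budget is exhausted and returns the
    highest (clock, offset) combination that passed the stress test.
    """
    clock, offset_mv = base_mhz, 0
    best = (base_mhz, 0)
    while offset_mv <= max_offset_mv:
        set_clock_and_voltage(clock + clock_step, offset_mv)
        if passes_stress_test():
            clock += clock_step          # stable: keep climbing
            best = (clock, offset_mv)
        else:
            offset_mv += volt_step       # artifacts: bump voltage, retry this step
    return best
```

As for a ratio: there doesn't seem to be a fixed MHz-per-mV rule; it varies card to card (note the 110-134 mV spread reported earlier in the thread for the same 1200 MHz), which is exactly why the loop re-tests after every bump rather than extrapolating.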


----------



## Sgt Bilko

Quote:


> Originally Posted by *SeanEboy*
> 
> So, I'm a newb to overclocking.. Is there a 290x overclocking guide? I'm going to watercool my quadfire setup, however I figure (much like Lawson, sup bro?!) that I should run each individually, right?
> 
> My process:
> Bump up 25MHz, run, if no artificating, bump another 25MHz... Once artifacting occurs, bump up the voltage, 10mV, repeat until no more artifacting. Is there any sort of ratio in terms of MHz to mV increases?
> 
> I'm benching with 3Dmark, and Unigine Valley...


Not sure why you'd need to overclock quadfire unless you're going for benching records, but individual clocks are a good way to go about it.

The process you listed is the way I'd do it. Bear in mind you want a lot of power supplying these cards if you're going to overvolt them all at once (a dual-PSU rig, probably 2000W or so at least).


----------



## lawson67

Quote:


> Originally Posted by *Forceman*
> 
> 90C is a little hot for the VRMs, but they are supposedly rated to 120C so I doubt they would be causing throttling like you are seeing at 90C. The problem is happening right away now, or is it still only occurring after you've run the cards for a while?
> 
> Have you tried running anything besides Heaven to test them, maybe Valley or 3DMark? I'd hate for you to be going through all this trouble only to find out it is a Heaven issue and not a card problem.
> 
> For comparison, here's my card running Heaven (using HWInfo logging)


I have fixed it!! I uninstalled Trixx, didn't turn on GPU-Z, just used the hardware monitor, and set the power limit to 50 on both cards in CCC, then benchmarked for an hour and it was fine! The top card hit 90C, but I've been reading that's fine for these cards, and I didn't get any throttling! I am now very pleased.







....I have super-duper fast graphics on my lovely 27" Qnix, overclocked to 120Hz at 2560x1440







....Oh and hey Hi Sean









No throttling!


----------



## Forceman

Quote:


> Originally Posted by *lawson67*
> 
> I have fixed it!!...i un-installed Trixx and did not turn on gpu-z and just used the hardware monitor and set power limit to 50 on both cards benchmarked it for 1 hour and it was fine!...the top card hit 90c but i have been reading that's fine for these cards?...i did not get any throttling!...i am now very pleased
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No throttling!


Glad you figured it out. I wonder if it may have been GPU-Z; there were some problems with GPU-Z causing issues when the cards first came out. Did you have the newest version?


----------



## Sgt Bilko

Quote:


> Originally Posted by *lawson67*
> 
> I have fixed it!!...i un-installed Trixx and did not turn on gpu-z and just used the hardware monitor and set power limit to 50 on both cards in CCC and benchmarked it for 1 hour and it was fine!...the top card hit 90c but i have been reading that's fine for these cards?...i did not get any throttling!...i am now very pleased
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No throttling!
> 
> 
> Spoiler: Warning: Spoiler!


Quote:


> Originally Posted by *Forceman*
> 
> Glad you figured it out. Wonder if it may have been GPU-Z, there were some issues when the cards first came out with GPU-Z causing issues. Did you have the newest version?


I think running too many programs that read the hardware causes these cards to act weird; if I try to run HWiNFO64, AB, and GPU-Z together, my clocks get all messed up along with my power limit.

That's with AB active, mind you; when it's minimized to the task bar it's fine.


----------



## lawson67

Quote:


> Originally Posted by *Forceman*
> 
> Glad you figured it out. Wonder if it may have been GPU-Z, there were some issues when the cards first came out with GPU-Z causing issues. Did you have the newest version?


Yes, it was GPU-Z, and yes, I made sure I had downloaded the latest version! It even locked up the computer on a few occasions. That's something for other members who may have this problem to remember in the future. But thanks for all the help, guys.


----------



## lawson67

Quote:


> Originally Posted by *MapRef41N93W*
> 
> I feel you Lawson. I sure as hell won't be buying AMD again after my experience with these 290xs. I have spent more time in BSOD, Screen lock, and re-installing drivers than I have playing games since I got them. I never once had to re-install an NVIDIA driver and rarely if ever saw BSOD, screen lock, etc.


I have never ever had a screen lock on this PC... GPU-Z gave me a screen lock and forced me to hard reset! Try not using it and see how you go.


----------



## MrWhiteRX7

Glad you got it sorted Lawson, these are bad ass cards... enjoy them now!


----------



## SeanEboy

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Not sure why you'd need to overclock quad-fire unless you are going for Benching records but individual clocks are a good way to go about it.
> 
> The process you listed is the way i'd do it, bear in mind you want alot of power supplying these cards if you are going to overvolt them all at once (Dual PSU rig, probably 2000w or so at least.)


I don't know, I just figured that's what you do: buy something fast and try to make it faster..? ;c) OK, perhaps I'll just start installing the waterblocks already... After all, I've been dodging landmines all over my floor for too long, waiting for parts to come in, so I might as well put them to use!


----------



## lawson67

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Glad you got it sorted Lawson, these are bad ass cards... enjoy them now!


Thanks mate! Also, I'm glad I upgraded my PSU from the 750W Seasonic to a Corsair RM1000. I bought one of those plugs that measures the wattage being drawn from the wall; running Heaven 4.0 it was mostly drawing around 680-700W and peaked at 756.9W! I couldn't be happier with these cards and my system setup now. Also, thanks for all the help, guys; this is my first time with AMD cards and I didn't even know about the power limit function, so thanks for pointing that out.
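As a sanity check on those wall readings: a wall meter shows AC draw, and the DC load the PSU actually delivers is lower by its efficiency. A rough sketch (the ~90% figure is an assumption for a Gold-class unit at this load, not a measured value):

```python
def psu_headroom(wall_watts, efficiency=0.90, psu_rating_w=1000):
    """Estimate the DC load behind a wall-meter reading and the headroom left."""
    dc_load = wall_watts * efficiency
    return dc_load, psu_rating_w - dc_load

# The 756.9 W wall peak above works out to roughly 681 W of DC load,
# leaving about 319 W of headroom on a 1000 W unit.
peak_dc, spare = psu_headroom(756.9)
```

By the same arithmetic, the old 750 W unit would have been sitting at roughly 90% load at that peak, which fits the decision to move up a size.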


----------



## Imprezzion

I got my new Sapphire Tri-X 290X in and I totally love it. This thing is capable of some seriously low temperatures!

Only, I notice it does exactly the same as all my other 290/290X cards... Whenever I push the voltage past +50mV in MSI AB / TriXX with the power limit at +50, it still throttles like a madman.
When I set +100mV, +50 power, 1200MHz core / 1500MHz VRAM, it runs at ~1030 core most of the time.

When I set +50mV and 1150MHz core, however, it runs at about ~1110MHz.

So how do you even OC these things without flashing the ASUS "PT1" BIOS? I'd rather not, since although it does eliminate throttling (I tested it already), it also makes the card run at ~45C idle because it won't downclock.


----------



## lawson67

Quote:


> Originally Posted by *Imprezzion*
> 
> I got my new Sapphire Tri-X 290X in and i totally love it. That thing is capable of some seriously low temperatures!
> 
> Only, I notice it does the exact same as all my other 290/290X cards... Whenever I put the voltage past +50mV in MSI AB / TriXX with power limit at +50 it still throttles like a mad man.
> When I set +100mV, +50 Power, 1200Mhz Core / 1500Mhz VRAM it like, runs at ~1030 core most of the time..
> 
> When I set +50mV however and 1150Mhz core it runs at about ~1110Mhz..
> 
> So, how do you even OC these things without flashing the ASUS ''PT1'' BIOS. I'd rather not as this does eliminate throttling (tested it already) but also makes it run at ~45c idle cause it won't downclock.


I am more than happy with my cards running at their factory-overclocked stock speeds! They're more than fast enough for me.


----------



## EliteReplay

Quote:


> Originally Posted by *lawson67*
> 
> Thanks mate!...Also i am glad i upgraded my PSU from my 750w Seasonic to a Corsair RM1000w...i bought one those plugs that measure wattage being drawn from the wall!..ruining heaven 4.0 it was ruining mostly around around 680 -700 and peaked at 756.9 watts!....i could not be more happier with these cards and and my system set up now...Also thanks for all the guys this is my first time with AMD cards and did not even know about the power limit function! so thank for point that out guys


That's the worst PSU you could have bought...


----------



## Imprezzion

Agree with above..

By the way, I'm getting black screen issues with this thing even though it has Hynix VRAM?
I didn't buy it new; I bought it secondhand from a miner (it's 3 months old and I got the receipt / warranty docs, so I still have 21 months of warranty), and it black-screens all over the place.

1400MHz memory is enough to instantly black-screen when Windows boots. I'm gonna see what BF4 on Mantle @ 1080p with 150% resolution scale, all Ultra, can do, and whether it black-screens or not.


----------



## lawson67

Quote:


> Originally Posted by *EliteReplay*
> 
> that the worst psu you could ever bought...


For £133 and a 5-year warranty I'm more than happy with it. Also, I believe the higher-end Corsair RM PSUs are made by Seasonic, as it came with Seasonic cables bearing the exact same part code as the cables on the 750W Seasonic PSU I replaced. I think you'll find some guy on here has put down the Corsair RM range, but I ask you to watch this YouTube video and see how you feel about the Corsair RM range afterwards; I think the guy who put down the range should watch it too! Also, have you personally owned an RM 1000W PSU, or are you basing your opinion on hearsay?


----------



## pkrexer

Quote:


> Originally Posted by *Imprezzion*
> 
> I got my new Sapphire Tri-X 290X in and i totally love it. That thing is capable of some seriously low temperatures!
> 
> Only, I notice it does the exact same as all my other 290/290X cards... Whenever I put the voltage past +50mV in MSI AB / TriXX with power limit at +50 it still throttles like a mad man.
> When I set +100mV, +50 Power, 1200Mhz Core / 1500Mhz VRAM it like, runs at ~1030 core most of the time..
> 
> When I set +50mV however and 1150Mhz core it runs at about ~1110Mhz..
> 
> So, how do you even OC these things without flashing the ASUS ''PT1'' BIOS. I'd rather not as this does eliminate throttling (tested it already) but also makes it run at ~45c idle cause it won't downclock.


Are you using the 14.x beta drivers? From my own experience, increasing the power limit doesn't work in the beta drivers. Use DDU to remove them and switch to the 13.12 drivers; you shouldn't throttle anymore.


----------



## Imprezzion

That would explain it, but that would also ruin Mantle for me in BF4.

I'd rather use stock clocks and Mantle than OC without Mantle in BF4









It doesn't seem to black screen anymore as long as I keep VRAM under 1400Mhz so at least that's fixed.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Imprezzion*
> 
> That would explain it, but that would also ruin Mantle for me in BF4.
> 
> I'd rather use stock clocks and Mantle than OC without Mantle in BF4
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It doesn't seem to black screen anymore as long as I keep VRAM under 1400Mhz so at least that's fixed.


Use the 14.2 drivers, everything that was bad in the 14.1 is gone, everything is butter smooth for me now









http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx

'Cept Titanfall, but even on 13.12 I'm still getting DX BSODs


----------



## chiknnwatrmln

Quote:


> Originally Posted by *lawson67*
> 
> For £133 and a 5 year warranty I am more than happy with it...also I believe the higher-end Corsair RM PSUs are made by Seasonic, as it came with Seasonic cables which have the exact same part code as the cables on my replaced 750W Seasonic PSU!...I think you will find that some guy on here has put down the Corsair RM PSU range, however I ask you to watch this YouTube video and see how you feel afterwards about the Corsair RM range of PSUs!...I think the guy that put down the Corsair RM range should also watch this!.. Also, have you personally owned an RM 1000W PSU?.. or are you basing your opinion on hearsay?


Does the warranty cover the rest of your components and back up your data for when that PSU craps out and destroys your entire PC?


----------



## Durquavian

Quote:


> Originally Posted by *EliteReplay*
> 
> that's the worst PSU you could have bought...


It's the low-wattage RM units that were not reliable. The review and shilka claimed the 800+ models were not subject to the same problems.


----------



## kizwan

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *lawson67*
> 
> For £133 and a 5 year warranty I am more than happy with it...also I believe the higher-end Corsair RM PSUs are made by Seasonic, as it came with Seasonic cables which have the exact same part code as the cables on my replaced 750W Seasonic PSU!...I think you will find that some guy on here has put down the Corsair RM PSU range, however I ask you to watch this YouTube video and see how you feel afterwards about the Corsair RM range of PSUs!...I think the guy that put down the Corsair RM range should also watch this!.. Also, have you personally owned an RM 1000W PSU?.. or are you basing your opinion on hearsay?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Does the warranty cover the rest of your components and back up your data for when that PSU craps out and destroys your entire PC?

I wouldn't be surprised if Corsair paid for the damaged components if that ever happened.


----------



## sugarhell

Why Corsair? There are way better PSUs. First choice would be an EVGA G2.


----------



## Sgt Bilko

Quote:


> Originally Posted by *sugarhell*
> 
> Why Corsair? There are way better PSUs. First choice would be an EVGA G2.


Probably going to be my next choice actually, should be enough to feed my power hungry AMD setup


----------



## MrWhiteRX7

My 1300 EVGA G2 supports my tri-fire setup very well


----------



## shilka

lawson67, one thing you got wrong is you claim I opened up a Corsair RM. That is wrong, and I clearly said I have never done so.

Techpowerup are the ones that opened it up, not me.

You are free to feel and think however you like, but at least make sure you get your facts straight when you do so.

Anyway, I don't really care; you already bought it. Whether you get problems or not, that's your own responsibility


----------



## bbond007

For my FX 8320 machine I wanted to switch away from the OCZ 680 multi-rail PSU because it was noisy and really probably a disaster waiting to happen. I was using an assortment of Molex -> PCIe connectors.

Someone posted in the "Online Deals" section about a Rosewill Capstone 750 with a lot of discounts and rebates, so I went ahead and ordered that.

I did not anticipate that the Rosewill would be a lot longer than the OCZ, and the modular connectors interfered with the closed-loop radiator I had hacked into the top of the case.

I ended up buying a Corsair CX 750 because it was physically shorter than the Rosewill and also on sale.

Anyway, I still had to mod the PSU to recess the modular connectors, and I really should have just bought another non-modular PSU at this point.

While doing this, I disassembled both the Rosewill and Corsair PSUs to figure out which would be simpler to relocate the connectors on.

Anyway, I can attest to the fact that the build quality of the Corsair CX is less than the build quality of the Rosewill. The Rosewill is really a Super Flower, which is good, but still not Seasonic.

I'm not saying Corsair is bad, I'm just saying in my opinion their 750 looks more like a 650 to me. It is pretty quiet compared to the OCZ and it's newer...

I'm using the Rosewill with the 2 290Xs now.


----------



## MrWhiteRX7

@HOMECINEMA-PC

Did you ever try a different bios? Any success?

I went back to my stock bios on all cards. I actually lost a couple fps with the TRI-X bios and it was clocked slightly higher by default lol. Weird!


----------



## Sgt Bilko

So has anyone ordered a 290x Lightning yet?

They are back in stock at PCCG in Aus for a whopping $889









still cheaper than a 780 Ti though lol


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Sgt Bilko*
> 
> So has anyone ordered a 290x Lightning yet?
> 
> They are back in stock at PCCG in Aus for a whopping $889
> 
> 
> 
> 
> 
> 
> 
> 
> 
> still cheaper than a 780 Ti though lol


I believe there's a separate thread with a couple people who have them already.

I'm gonna wait for more LN2 and some underwater results before I come to a conclusion on the Lightning. So far they don't seem to OC any more than regular 290/Xs, but again, those cards are meant to be run cold.


----------



## Sgt Bilko

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I believe there's a separate thread with a couple people who have them already.
> 
> I'm gonna wait for more LN2 and some underwater results before I come to a conclusion for the Lightning, so far they don't seem to OC any more than regular 290/x's but again those cards are meant to be cold.


Lightning has Mem Voltage no?


----------



## chiknnwatrmln

I think so, not 100% sure though.


----------



## Sgt Bilko

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I think so, not 100% sure though.


Yeah, it does









renders the 290x Matrix obsolete in that respect alone.


----------



## Imprezzion

Well, I was getting a bit mad about the 14.x power limit not working so I flashed my Tri-X with the PT1.rom ASUS BIOS and smacked +100mV on it.

Now it won't throttle at all but it won't downclock idle either









At least the increase in clockspeed that's artifact-free is quite substantial with the extra volts.

On stock volts and power limit I could get to just about 1100Mhz without arti's. 1120Mhz artifacted bad in Valley if it didn't throttle..

Now, on +100mV I am running 1170Mhz core and no arti's. 1180Mhz gives a few arti's but ok. A 70Mhz increase ain't nothing. That's about 6% more clockspeed.
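For what it's worth, the ~6% figure checks out; it's just arithmetic on the clocks above:

```python
# Percentage gain from the artifact-free stock clock to the +100mV clock.
def pct_gain(old_mhz, new_mhz):
    return (new_mhz - old_mhz) / old_mhz * 100

print(round(pct_gain(1100, 1170), 1))  # -> 6.4
```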

If I were to grab GPU Tweak so I can go to +200mV I can easily hit 1200Mhz but temps wouldn't be worth it then.

At a custom 75% fanspeed she's sitting at 67c core, 72c VRM1 and 48c VRM2 after about 30 minutes of Valley. These temps are very low and at 100% they drop another ~10c but it gets loud and isn't worth the few extra Mhz..

I assume my card is perfectly safe at these temps for daily usage? +100mV (which is about 1.21v load), no power limit, no idle downclocking so it runs 1.305v at idle.


----------



## chiknnwatrmln

+100mV is fine as long as your temps are low enough which they are. I run +168mV daily, but my card is under water.


----------



## Ironsmack

Hey guys,

I have my XFX 290 under water using an XSPC waterblock and I'm able to OC to 1125/1300 stable.

But I noticed that it underclocks and I have no idea why. I see my VRM1 hover at 77 C in BF4 (which is pretty bad considering it's watercooled). I'm buying a thicker thermal pad to address it.

In the meantime, I managed to plop a fan in there to help cool it down. I saw a drop to around 68 C.

Is that the reason why my card underclocks?


----------



## Imprezzion

Drivers. I found out first hand.

Use the ASUS PT1 ROM to eliminate downclocking under water, at the cost of more power draw at idle.

The 14.xx drivers don't apply the power limit, it seems, so only the 13.xx drivers or a custom BIOS will run without downclocking.
Quote:


> Originally Posted by *chiknnwatrmln*
> 
> +100mV is fine as long as your temps are low enough which they are. I run +168mV daily, but my card is under water.


Thanks for the info btw. The only downside to this is the rather over-the-top idle temps. It's idling at about 48-51c now at 40% fanspeed.

Not that that's high.. I mean, my good old ASUS HD4890 Voltage Tweak edition idled at 65c and loaded around 90c overclocked, but ok.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Imprezzion*
> 
> Drivers. I found out first hand.
> 
> Use the ASUS PT1 ROM to eliminate downclocking under water, at the cost of more power draw at idle.
> 
> The 14.xx drivers don't apply the power limit, it seems, so only the 13.xx drivers or a custom BIOS will run without downclocking.
> Thanks for the info btw. The only downside to this is the rather over-the-top idle temps. It's idling at about 48-51c now at 40% fanspeed.
> 
> Not that that's high.. I mean, my good old ASUS HD4890 Voltage Tweak edition idled at 65c and loaded around 90c overclocked, but ok.


You can do what I do: make a profile for gaming that has the +100mV, then make another profile with default (or even undervolted) voltage for browsing or whatever, and switch between the two. The only thing is, with my card at least, if I switch profiles while the memory is clocked up I get a blackscreen. I get around this by checking my memory speed in GPU-Z before switching profiles.
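The rule of thumb above can be written down as a check; everything here is illustrative only (there's no public API for this - in practice you read the clock off GPU-Z by eye before clicking the profile button):

```python
# Hawaii cards drop to ~150MHz memory in 2D mode; switching voltage
# profiles is only safe once the memory is back at that idle clock.
IDLE_MEM_MHZ = 150

def safe_to_switch(current_mem_mhz, slack=10):
    """True when the memory clock is at (or near) its 2D idle value."""
    return current_mem_mhz <= IDLE_MEM_MHZ + slack

print(safe_to_switch(150))   # True  - idle, OK to change profile
print(safe_to_switch(1500))  # False - 3D clocks, switching may blackscreen
```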

Also, to above, I'm not a cleaver. This is a cleaver.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Ironsmack*
> 
> Hey guys,
> 
> I have my XFX 290 under water using an XSPC waterblock and I'm able to OC to 1125/1300 stable.
> 
> But I noticed that it underclocks and I have no idea why. I see my VRM1 hover at 77 C in BF4 (which is pretty bad considering it's watercooled). I'm buying a thicker thermal pad to address it.
> 
> In the meantime, I managed to plop a fan in there to help cool it down. I saw a drop to around 68 C.
> 
> Is that the reason why my card underclocks?


Check the powerlimit in CCC and AB or whichever program you are using, these cards don't thermal throttle until the core hits 95c.
Other than that GPU-Z has been known to cause some issues if you are running that.
Quote:


> Originally Posted by *Imprezzion*
> 
> Drivers. I found out first hand.
> 
> Use the ASUS PT1 ROM to eliminate downclocking under water, at the cost of more power draw at idle.
> 
> The 14.xx drivers don't apply the power limit, it seems, so only the 13.xx drivers or a custom BIOS will run without downclocking.
> Thanks for the info btw. The only downside to this is the rather over-the-top idle temps. It's idling at about 48-51c now at 40% fanspeed.
> 
> Not that that's high.. I mean, my good old ASUS HD4890 Voltage Tweak edition idled at 65c and loaded around 90c overclocked, but ok.


Have you tried the 14.2 beta? I have Power Limit options available to me. I just installed it directly over the 13.12 driver.


----------



## Imprezzion

Yes I have been running 14.2 all the time.

Power limits must be failing, cause even at +40mV on a stock BIOS with +50% power limits (in CCC and MSI AB) it throttles like mad above 1100Mhz. Now, with a power-limit-free BIOS I can run whatever I want.

And this happens with both my 290Xs. Not that I run both of them at once, but one's an unlocked reference XFX 290 and this is a ''real'' Sapphire Tri-X 290X.


----------



## Themisseble

need help
Which R9 290 has the best custom cooling? The store where I buy doesn't have the Sapphire Tri-X, so I have to choose between these:
PowerColor Radeon R9 290 4GB PCS+
ASUS Radeon R9 290 4GB DirectCU II OC
XFX Radeon R9 290 4GB Black Double Dissipation Edition
Club 3D Radeon R9 290 4GB royalKing
Gigabyte Radeon R9 290 4GB WindForce 3X OC
MSI Radeon R9 290 4GB Twin Frozr Gaming


----------



## Sgt Bilko

Quote:


> Originally Posted by *Imprezzion*
> 
> Yes I have been running 14.2 all the time.
> 
> Power limits must be failing, cause even at +40mV on a stock BIOS with +50% power limits (in CCC and MSI AB) it throttles like mad above 1100Mhz. Now, with a power-limit-free BIOS I can run whatever I want.
> 
> And this happens with both my 290Xs. Not that I run both of them at once, but one's an unlocked reference XFX 290 and this is a ''real'' Sapphire Tri-X 290X.


AB used to bug out with CCC's powerlimit sometimes; they were set individually, I take it?
Other than that I'm really not sure, it's a weird problem.......

Quote:


> Originally Posted by *Themisseble*
> 
> need help
> Which R9 290 has the best custom cooling? The store where I buy doesn't have the Sapphire Tri-X, so I have to choose between these:
> PowerColor Radeon R9 290 4GB PCS+
> ASUS Radeon R9 290 4GB DirectCU II OC
> XFX Radeon R9 290 4GB Black Double Dissipation Edition
> Club 3D Radeon R9 290 4GB royalKing
> Gigabyte Radeon R9 290 4GB WindForce 3X OC
> MSI Radeon R9 290 4GB Twin Frozr Gaming


PCS+ out of that list, haven't seen what the Club3D card can do yet.


----------



## Ironsmack

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Check the powerlimit in CCC and AB or whichever program you are using, these cards don't thermal throttle until the core hits 95c.
> Other than that GPU-Z has been known to cause some issues if you are running that.




Let me know if you guys can see this.

I have Open Hardware Monitor and GPU-Z (0.7.7), and both are reporting the 290 is throttling down.

Even with stock settings - after 1 hr+ of gameplay - it throttles down as well.

Am I missing something else?


----------



## Imprezzion

I have the exact same as Ironsmack.

Both are set, individually. Doesn't work at all. TriXX doesn't work either, GPU-Tweak doesn't work either.

So yeah.. ;P


----------



## Sgt Bilko

Quote:


> Originally Posted by *Ironsmack*
> 
> 
> 
> Let me know if you guys can see this.
> 
> I have Open HWMonitor and GPU-Z (0.7.7) and both are reporting the 290 is throttling down.
> 
> Even with stock settings - after 1 hr + of game play - it throttles down as well.
> 
> Am i missing something else?


That paints a pretty clear picture, have you tried disabling overdrive in CCC and just using AB or vice versa?

I've only ever had one situation where my cards would throttle but after a system reboot it was fixed so i don't know the exact cause.


----------



## Forceman

Quote:


> Originally Posted by *Ironsmack*
> 
> 
> 
> Let me know if you guys can see this.
> 
> I have Open HWMonitor and GPU-Z (0.7.7) and both are reporting the 290 is throttling down.
> 
> Even with stock settings - after 1 hr + of game play - it throttles down as well.
> 
> Am i missing something else?


Have you tried changing the power limit with CCC and not AB? I've had problems with AB not changing the power limit even though it looks like it does, but changing it directly in CCC has worked even with the 14.2 drivers.

And is there any reason you have the maximum fan speed set to 20% in CCC?


----------



## tpi2007

Guys, be civil and stay on-topic. If you want to discuss PSUs, there is a section for that.

Thank you.


----------



## Ironsmack

Quote:


> Originally Posted by *Forceman*
> 
> Have you tried changing the power limit with CCC and not AB? I've had problems with AB not changing the power limit even though it looks like it does, but changing it directly in CCC has worked even with the 14.2 drivers.
> 
> And is there any reason you have the maximum fan speed set to 20% in CCC?


I did change it in CCC. I rebooted and I'll watch to see if it downclocks again.

The card is under water, so I set the fan to 20. Or maybe I should change it to 100....


----------



## Forceman

Quote:


> Originally Posted by *Ironsmack*
> 
> I did change it in CCC. I rebooted and I'll watch to see if it downclocks again.
> 
> The card is under water, so I set the fan to 20. Or maybe I should change it to 100....


Oh, yeah, if it's under water the fan setting shouldn't matter. It only downclocks after long use?


----------



## Imprezzion

Mine downclocks no matter what if I don't use the ASUS BIOS.

But, I got much weirder issues.

With NO overclocking software installed at all and just clean drivers with DDU wipe before installing the card will not downclock to idle.

Memory will drop to 150Mhz but the core stays at 1040Mhz. In the first few seconds after a Windows boot it goes to 300Mhz nicely, but it clocks to max core clocks after roughly 20 seconds of being booted..

I don't get it..

EDIT: I get it.

Figured out what the issue with downclocking was.

Whenever you change memory frequency even ONE MHz in either CCC or an OC program, the core hangs at max clocks.

Does memory frequency even make a difference at 2880x1620?


----------



## Paul17041993

Quote:


> Originally Posted by *Imprezzion*
> 
> Does memory frequency even make a difference at 2880x1620?


at that res? yea it would, that's a lot of pixels.


----------



## Imprezzion

Yeah well, I run that in BF4 and ArmA III (150% res scale at 1920x1080) so it might just have to idle a little higher then..


----------



## taem

90c VRM1 should not cause throttling. I don't like it that high, and I'm pretty sure it affects OC stability, but plenty of folks run their GPUs with 90c VRM1. For XFX cards that's pretty much a standard VRM1 operating temp, a lot of Sapphire 7970s ran like that, and the ASUS 290 DCU2 in a lot of reviews gets 88-90c VRM1.

But Lawson, is that with the default fan profile? If you're running CCC I guess it is. For air-cooled Crossfire you really should run a more aggressive custom fan curve. Try increasing the RPM on your side fan also.
Quote:


> Originally Posted by *Forceman*
> 
> Glad you figured it out. Wonder if it may have been GPU-Z, there were some issues when the cards first came out with GPU-Z causing issues. Did you have the newest version?


I've said earlier in this thread that I'm positive gpu z is causing glitches. Maybe if you're running something else that scans hardware sensors?


----------



## lawson67

Quote:


> Originally Posted by *taem*
> 
> 90c vrm1 should not cause throttling. I don't like it that high and I'm pretty sure it affects oc stability but plenty of folks run their Gpus with 90c vrm1. For xfx cards that's pretty much a standard vrm1 operating temp, a lot of sapphire 7970s ran like that, and the Asus 290 dcu2 in a lot of reviews gets 88-90c vrm1.
> 
> But Lawson is that with default fan profile? If you're running ccc I guess it is. For air cooled crossfire you really should run a more aggressive custom fan curve. Try increasing rpm on your side fan also.
> I've said earlier in this thread that I'm positive gpu z is causing glitches. Maybe if you're running something else that scans hardware sensors?


Yes Taem, it was indeed GPU-Z. When I stopped running that to monitor, everything ran fine!.. no throttling, nothing, just perfect...very pleased with my cards indeed...GPU-Z also caused screen locks

What software are you using to change fan profile?


----------



## taem

Quote:


> Originally Posted by *lawson67*
> 
> What software are you using to change fan profile?


Sapphire Trixx, I find it less buggy than Afterburner, though Trixx isn't glitch-free either. There's a separate tab for fan curves; you can click anywhere on the graph to add more points you can adjust (and right-click the points to remove them). I would have the fans set to operate at 60-80% at load temps for air-cooled Crossfire, depending on how much noise you can tolerate. You can always go back to auto so there's no risk.
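A fan curve like the one Trixx draws is just piecewise-linear interpolation between the points you drop on the graph. A small sketch with made-up points (not anyone's actual profile):

```python
# (temp C, fan %) points, like the draggable points on Trixx's graph.
CURVE = [(30, 30), (60, 60), (75, 80), (85, 100)]

def fan_speed(temp_c, curve=CURVE):
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            # linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]  # past the last point, pin at its value

print(fan_speed(67.5))  # -> 70.0 (halfway between 60C/60% and 75C/80%)
```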


----------



## kskwerl

Can anyone tell me what this grease-like substance is on my MSI 290x Gaming? They all seem to have it to some degree, but this one is the worst.

http://imgur.com/HcJTn4M,Vgtsb1V,E8JleiW,piPqkzS,EDSzVyH,BRHqLPk,2zCnJRs#2


----------



## TommyGunn123

Quote:


> Originally Posted by *kskwerl*
> 
> Can anyone tell me what this grease-like substance is on my MSI 290x Gaming? They all seem to have it to some degree, but this one is the worst.
> 
> http://imgur.com/HcJTn4M,Vgtsb1V,E8JleiW,piPqkzS,EDSzVyH,BRHqLPk,2zCnJRs#2


Was it there when you bought it? If so it's probs just left over from some part of manufacturing or something.


----------



## Matt-Matt

Sooooo

My idle with two monitors attached @ 1075/1400 is like 61c

Wow, so keen for my waterblock to get here. It's so loud just idling too!


----------



## kskwerl

Quote:


> Originally Posted by *TommyGunn123*
> 
> Was it there when you bought it? If so it's probs just left over from some part of manufacturing or something.


I'm like 99% sure it wasn't there; I hope it's nothing to be worried about. It feels like some kind of grease, and it's def not acid because I rubbed mad amounts of it on my hands hours ago LOL


----------



## TommyGunn123

Quote:


> Originally Posted by *Matt-Matt*
> 
> Sooooo
> 
> My idle with two monitors attached @ 1075/1400 is like 61c
> 
> Wow, so keen for my waterblock to get here. It's so loud just idling too!


yeah well they idle at max speed with 2 monitors connected, so if you can, just unplug one of them when you're just watching YouTube or whatever, and replug it when you play a game









EDIT: Maybe they don't; I've seen a couple of responses saying the opposite, so I admit I'm wrong if I'm wrong
Quote:


> Originally Posted by *kskwerl*
> 
> I'm like 99% sure it wasn't there; I hope it's nothing to be worried about. It feels like some kind of grease, and it's def not acid because I rubbed mad amounts of it on my hands hours ago LOL


well it kinda looks like it was rubbed on, but I guess it could be from when you touched it as you mentioned. I've heard of fans leaking bearing grease, but that kinda doesn't work since it's on top of the card, unless you're CrossFiring them. Maybe leave a bit of cloth or something there and see if there are drips, to see if it's just a random thing or possibly a big thing haha


----------



## Matt-Matt

Quote:


> Originally Posted by *TommyGunn123*
> 
> yeah well they idle at max speed with 2 monitors connected, so if you can, just unplug one of them when you're just watching YouTube or whatever, and replug it when you play a game


I didn't know that! I knew that previous gen cards (68xx's that I owned once) their RAM clock increased but not the core speed.

I use two screens for productivity.. I can't live without them







I guess I'll have to live until I get this block haha.


----------



## taem

Quote:


> Originally Posted by *TommyGunn123*
> 
> yeah well they idle at max speed with 2 monitors connected, so if you can, just unplug one of them when you're just watching YouTube or whatever, and replug it when you play a game


Whoa... I did not know that. Is this true of all 290s? I was debating whether to turn iGPU multi-monitor off and connect my secondary display to the card, but I guess I won't.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *taem*
> 
> Whoa... I did not know that. This is true of all 290? I was debating whether to turn igpu multi monitor off and connect my secondary display to the card but I guess I won't.


I don't think so. I have three monitors and my card downclocks with all 3 connected with Eyefinity both on and off. My clocks drop to 300/150MHz.


----------



## Matt-Matt

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I don't think so. I have three monitors and my card downclocks with all 3 connected with Eyefinity both on and off. My clocks drop to 300/150MHz.


Well that's weird; Mine are at 950/150 right now using two screens..

I don't even know why? Stock is 947MHz, overclocked settings are at 1075/1400..


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Matt-Matt*
> 
> Well that's weird; Mine are at 950/150 right now using two screens..
> 
> I don't even know why? Stock is 947MHz, overclocked settings are at 1075/1400..


What drivers are you using? Sounds like your card is in 3D mode, but downclocking itself because there's no actual 3D load...

My old 7950 used to kick into 3D mode with more than one monitor. I never figured out what caused it but when I upgraded to my 290 that stopped.

What monitor, res, and connections are you using?


----------



## taem

Aw yeah! Leader in 2 Trials of the Rakyat in Far Cry 3. This is what becomes possible with the 60 constant fps you get with a 290.




My name is written in stone baby. WRITTEN IN STONE


----------



## MapRef41N93W

Quote:


> Originally Posted by *kskwerl*
> 
> Can anyone tell me what this grease-like substance is on my MSI 290x Gaming? They all seem to have it to some degree, but this one is the worst.
> 
> http://imgur.com/HcJTn4M,Vgtsb1V,E8JleiW,piPqkzS,EDSzVyH,BRHqLPk,2zCnJRs#2


This is a known issue with that card. Numerous reports of grease found on the 290x variant.


----------



## kskwerl

Quote:


> Originally Posted by *MapRef41N93W*
> 
> This is a known issue with that card. Numerous reports of grease found on the 290x variant.


Anything serious? Should I be worried?


----------



## kizwan

Quote:


> Originally Posted by *kskwerl*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MapRef41N93W*
> 
> This is a known issue with that card. Numerous reports of grease found on the 290x variant.
> 
> 
> 
> Anything serious? Should I be worried?

My best bet is the grease is probably from a thermal pad. It could be excess oil from the fans too. I figure you're running Crossfire.


----------



## vedaire

Hi guys, I'm wondering if one of you could help me.
I need a component part number from a Sapphire R9 290X Tri-X video card;
specifically I need the part number off of the 4.7uH inductor at position L501, just slightly
below the 8-pin power connector. Any help would be appreciated.


----------



## p33k

I guess I can join this club... I thought I was getting some reference cards, but when I showed up at my shop here in Korea he had these... not exactly what I wanted, but after a week without a computer, and since EK is going to release blocks for them, I will be OK with it


----------



## Matt-Matt

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> What drivers are you using? Sounds like your card is in 3D mode, but downclocking itself because there's no actual 3D load...
> 
> My old 7950 used to kick into 3D mode with more than one monitor. I never figured out what caused it but when I upgraded to my 290 that stopped.
> 
> What monitor, res, and connections are you using?


I'm on the 14.2's; I was trying to set up 3D modes in Afterburner, so I'll see what happens if I close it off fully.

I'm just using the two DVI ports also, I assume that's fine?


----------



## Jetskyer

hey guys,
I've used the method in the first post of this thread to add more volts to my R9 290.
However, no matter what I do (+200, +100, etc), neither Afterburner nor HWiNFO sees a change in voltage, while the slider in Afterburner itself does work just fine and results in a change of voltage.

Am I missing something here?

Thanks!









yup, I was missing something, got it working now


----------



## Mercy4You

Does anybody know the difference between Catalyst 13.12 and 13.12 C and where to find a release note?

update: it's a Sapphire driver released in February, about 40MB larger than 13.12. No mention of what it's about...


----------



## Imprezzion

For the people who have issues with core clocks hanging on full speed while memory clocks down on idle, it's related to memory overclocks. When you OC your memory the card will no longer downclock the core in idle.

This happens with the 14.2 drivers.

Reminds me.. Can I make a request from you guys?

As there is no working BIOS editor, at least not that I know of, for the R9 29xx cards can anyone hex edit a BIOS around here?
I would like a BIOS with BIOS level memory clocks to see if the card downclocks properly with 1500Mhz mem clocks in the BIOS.

Also, if someone could make a hybrid BIOS of the ASUS ''PT1.ROM'' BIOS and the regular one meaning a BIOS with no power limits nor throttling like the PT1 has, but WITH idle clocks / idle voltages which PT1 lacks.

This would be so epic for all of us 290x owners. What I am going to try is to see if the PowerColor PCS+ BIOS works on my Tri-X as that BIOS has 1350Mhz mem clocks. If that works, and it downclocks properly, it confirms my theory of the drivers + software mem OC being at fault.


----------



## Matt-Matt

Quote:


> Originally Posted by *Imprezzion*
> 
> For the people who have issues with core clocks hanging on full speed while memory clocks down on idle, it's related to memory overclocks. When you OC your memory the card will no longer downclock the core in idle.
> 
> This happens with the 14.2 drivers.
> 
> Reminds me.. Can I make a request from you guys?
> 
> As there is no working BIOS editor, at least not that I know of, for the R9 29xx cards can anyone hex edit a BIOS around here?
> I would like a BIOS with BIOS level memory clocks to see if the card downclocks properly with 1500Mhz mem clocks in the BIOS.
> 
> Also, if someone could make a hybrid BIOS of the ASUS ''PT1.ROM'' BIOS and the regular one meaning a BIOS with no power limits nor throttling like the PT1 has, but WITH idle clocks / idle voltages which PT1 lacks.
> 
> This would be so epic for all of us 290x owners. What I am going to try is to see if the PowerColor PCS+ BIOS works on my Tri-X as that BIOS has 1350Mhz mem clocks. If that works, and it downclocks properly, it confirms my theory of the drivers + software mem OC being at fault.


My memory is at stock now.. I thought it was overclocked but I'm at 1250 ATM..









So annoying, is there a BIOS I can use to fix this or something? I may even roll back to 14.1 as bad as it was..


----------



## Belkov

Or you can always use the official driver. I have no problems with it.


----------



## Matt-Matt

Quote:


> Originally Posted by *Belkov*
> 
> Or you can always use the official driver. I have no problems with it.


But that's 2013 drivers..









I think that is my only option as of right now though. Sigh; AMD why u no update drivers often?


----------



## Imprezzion

What you should do is disable all OC software from starting up with Windows.
Disable CCC Overdrive and set it to its defaults.
Reboot.
Now it should downclock again like normal.
Turn on CCC or your preferred OC software.
Apply your overclock BUT LEAVE RAM AT STOCK.
Now, enjoy your core clocks and voltage + fan profiles with downclocking


----------



## Matt-Matt

Quote:


> Originally Posted by *Imprezzion*
> 
> What you should do is disable all OC software from starting up with Windows.
> Disable CCC Overdrive and set it to its defaults.
> Reboot.
> Now it should downclock again like normal.
> Turn on CCC or your preferred OC software.
> Apply your overclock BUT LEAVE RAM AT STOCK.
> Now, enjoy your core clocks and voltage + fan profiles with downclocking


So we can't overclock RAM yet? That's kind of lame









I'll fix this tomorrow, got a team meeting tomorrow early


----------



## Imprezzion

Well, you can OC RAM just fine if you're OK with higher idle temps and power consumption...

I'm going to install 14.1 Bv1.6 now to see if downclocking works with that driver. I never had any issues with it in BF4 on Mantle.. Performance between 14.1 Bv1.6 and 14.2 Bv1.3 is exactly the same..

MAJOR EDIT:

14.1 Beta V1.6 DOES work with memory OC and idle clocks!
Now testing to see if I can somehow fix power limits as well..

So, the memory clock and idle clocks issue is purely a 14.2-series driver problem!

Anyone who's seeing cards not returning to idle clocks, try the older 14.1 beta V1.6 drivers!

EDIT2: Power limit still isn't working, and for clocks to apply at boot you have to enable Overdrive and set the core and memory clocks + power limit manually in CCC as well as in your OC program. Voltage and fan speed curves are applied by your OC program, but core and memory speed aren't, so manually adjust those in CCC as well (for me it was +50% power, +6.8% clocks, 1500MHz VRAM) and it works like a charm.

FPS in Valley is WAY higher with 14.1 as well compared to 14.2 for what it's worth.. It's like, in the first scene, a difference from 58 to 66 FPS.
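For reference, here's roughly what those CCC numbers work out to — a quick sketch, assuming the Sapphire Tri-X OC's 1040MHz stock core (an assumption; check GPU-Z for your own card's base clock):

```python
# Sanity-check the "+6.8% clocks, 1500MHz VRAM" CCC settings described above.
BASE_CORE_MHZ = 1040          # assumed Sapphire Tri-X OC stock core clock
core_offset = 0.068           # the "+6.8% clocks" slider in CCC
vram_mhz = 1500               # memory clock set in CCC

core_mhz = BASE_CORE_MHZ * (1 + core_offset)
effective_vram_mhz = vram_mhz * 4   # GDDR5 transfers 4x per clock

print(round(core_mhz))        # 1111 (MHz core)
print(effective_vram_mhz)     # 6000 (MHz effective memory rate)
```

So a +6.8% offset on a 1040MHz card lands at about 1111MHz core, with the 1500MHz memory setting being 6000MHz effective.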


----------



## Matt-Matt

Quote:


> Originally Posted by *Imprezzion*
> 
> Well, you can OC RAM just fine if you're OK with higher idle temps and power consumption...
> 
> I'm going to install 14.1 Bv1.6 now to see if downclocking works with that driver. I never had any issues with it in BF4 on Mantle.. Performance between 14.1 Bv1.6 and 14.2 Bv1.3 is exactly the same..
> 
> MAJOR EDIT:
> 
> 14.1 Beta V1.6 DOES work with memory OC and idle clocks!
> Now testing to see if I can somehow fix power limits as well..
> 
> So, the memory clock and idle clocks issue is purely a 14.2-series driver problem!
> 
> Anyone who's seeing cards not returning to idle clocks, try the older 14.1 beta V1.6 drivers!
> 
> EDIT2: Power limit still isn't working, and for clocks to apply at boot you have to enable Overdrive and set the core and memory clocks + power limit manually in CCC as well as in your OC program. Voltage and fan speed curves are applied by your OC program, but core and memory speed aren't, so manually adjust those in CCC as well (for me it was +50% power, +6.8% clocks, 1500MHz VRAM) and it works like a charm.
> 
> FPS in Valley is WAY higher with 14.1 as well compared to 14.2 for what it's worth.. It's like, in the first scene, a difference from 58 to 66 FPS.


Okay, so I'll defs be rolling back to the 14.1's... First time I've had to roll back a driver since having Crossfire 6850s.. Not really happy; I mean, the 290/290X's are still new cards, but still..

EDIT: Can you please upload a link with the 14.1 v1.6's that you used? I can't seem to find them after a quick google search.


----------



## rdr09

Quote:


> Originally Posted by *Imprezzion*
> 
> For the people who have issues with core clocks hanging on full speed while memory clocks down on idle, it's related to memory overclocks. When you OC your memory the card will no longer downclock the core in idle.
> 
> This happens with the 14.2 drivers.
> 
> Reminds me.. Can I make a request from you guys?
> 
> As there is no working BIOS editor, at least not that I know of, for the R9 29xx cards can anyone hex edit a BIOS around here?
> I would like a BIOS with BIOS level memory clocks to see if the card downclocks properly with 1500Mhz mem clocks in the BIOS.
> 
> Also, if someone could make a hybrid BIOS of the ASUS ''PT1.ROM'' BIOS and the regular one meaning a BIOS with no power limits nor throttling like the PT1 has, but WITH idle clocks / idle voltages which PT1 lacks.
> 
> This would be so epic for all of us 290x owners. What I am going to try is to see if the PowerColor PCS+ BIOS works on my Tri-X as that BIOS has 1350Mhz mem clocks. If that works, and it downclocks properly, it confirms my theory of the drivers + software mem OC being at fault.


i have 14.2 installed on one of my HDDs and even when my 290 is oc'ed to 1200/1500 it downclocks to normal 150/300 when idle, unless i misread your post. i only have one monitor installed.


----------



## JordanTr

Quote:


> Originally Posted by *Mercy4You*
> 
> Does anybody know the difference between Catalyst 13.12 and 13.12 C and where to find a release note?
> 
> update: it's a Sapphire driver released in February, about 40Mb larger than 13.12. No mention of what it's about...


I asked about it back in February; no one knew anything about them and told me don't worry, just use the drivers from the AMD site.


----------



## Imprezzion

www.guru3d.com/files_details/amd_catalyst_14_1_beta_1_6_(13_350_1005_january_31).html

Well, that's weird as many of us don't have downclocking..

What OC software did you use?


----------



## rdr09

Quote:


> Originally Posted by *Imprezzion*
> 
> www.guru3d.com/files_details/amd_catalyst_14_1_beta_1_6_(13_350_1005_january_31).html
> 
> Well, that's weird as many of us don't have downclocking..
> 
> What OC software did you use?


that could be the reason. i use Trixx. i did the render test to put the card under load and after the test it downclocked.


----------



## Matt-Matt

Quote:


> Originally Posted by *rdr09*
> 
> i have 14.2 installed in one of my HDD and even when my 290 is oc'ed to 1200/1500 it downclocks to normal 150/300 when idle. unless i misread your post. i only have one monitor installed.


Yeah, pretty sure the issue is when you have two monitors; I added another one and have noticed the higher idle clocks/temps then. WHILE being on 14.2


----------



## rdr09

Quote:


> Originally Posted by *Matt-Matt*
> 
> Yeah, pretty sure the issue is when you have two monitors; I added another one and have noticed the higher idle clocks/temps then. WHILE being on 14.2


that is normal with more than one monitor. just the gpu working harder. it happens with other series cards like the Tahitis.


----------



## sugarhell

The gpu is not working harder. When you have 2 monitors using different outputs memory locks on 3d clocks. You can avoid it by disabling powerplay


----------



## rdr09

Quote:


> Originally Posted by *sugarhell*
> 
> The gpu is not working harder. When you have 2 monitors using different outputs memory locks on 3d clocks. You can avoid it by disabling powerplay


i may have worded it wrong but it does that when the gpu has more than one monitor installed.

any way to disable powerplay without using AB?


----------



## Matt-Matt

Quote:


> Originally Posted by *rdr09*
> 
> that is normal with more than one monitor. just the gpu working harder. it happens with other series cards like the Tahitis.


It does, but it shouldn't be running that high.. 950MHz is way too high for a Hawaii to idle at, and the memory is really low. I remember it being the other way around on my previous cards from AMD, with the clocks being lower, i.e. 300/500 on a 6870 down from 900/1200 or something.


----------



## sugarhell

Quote:


> Originally Posted by *rdr09*
> 
> i may have worded it wrong but it does that when the gpu has more than one monitor installed.
> 
> any way to disable powerplay without using AB?


Nah i dont think so.


----------



## rdr09

Quote:


> Originally Posted by *Matt-Matt*
> 
> It does, but it shouldn't be running that high.. 950MHz is way too high for a Hawaii to idle at, and the memory is really low. I remember it being the other way around on my previous cards from AMD, with the clocks being lower, i.e. 300/500 on a 6870 down from 900/1200 or something.


anything wrong with using your igpu? i'm assuming from your sig that you have one.

Quote:


> Originally Posted by *sugarhell*
> 
> Nah i dont think so.


i see.


----------



## Imprezzion

Aaaand my testing brought another revelation.

14.2 DOES downclock properly.. The downclocking issues are BIOS related combined with this driver.

I flashed the Powercolor PCS+ BIOS on my Tri-X OC as it has the highest stock clocks of any BIOS but decided to OC the VRAM anyway just to test performance difference.
Noticed all of a sudden it downclocked just fine when using MSI AB..

When you clock the mem in CCC it still hangs, but disabling CCC's Overdrive and using MSI AB to clock works just fine. This did not work with the ASUS and Sapphire BIOSes though..

Power limit is, of course, still not working, but k.
The PCS+ BIOS works just fine on my Tri-X, no issues at all. That means it should work on reference cards as well.


----------



## sugarhell

The PCS+ and Tri-X are reference PCBs with a different cooler. They just use different caps etc. That's why the Tri-X BIOS and PCS+ BIOS work on ref cards.


----------



## Imprezzion

Oh, I thought the PCS+ had a non-ref PCB.. Oh well. I know the Tri-X has a beefed up ref. design, but k.

Well, I do have to say that I'm staggered by the performance difference in DX11 between 14.1 and 14.2.
14.1 performs like 10-20% better in most games. Take Valley for example.. First scene: 52FPS with 14.2, 66FPS with 14.1...
ArmA III? On my custom settings with 14.2: 34FPS. 14.1: 42FPS...


----------



## chiknnwatrmln

I'm sticking with 13.12 until these drivers are fixed. I don't play any Mantle games, in fact almost every game I play is DX9.


----------



## Widde

Are there any downsides to not letting the card downclock besides the electric bill? Thinking of flashing the ASUS PT1 BIOS to see if that helps yield a better OC. One would think stability might improve when it doesn't downclock? My GPU OC experience is limited to an NVIDIA GTS 250 and an HD6870 so it's not great.

Have flashed GPU BIOSes before though


----------



## Arizonian

Quote:


> Originally Posted by *p33k*
> 
> I guess I can join this club... thought I was getting some reference cards but when I showed up at my shop here in Korea he had these... not exactly what i wanted but after a week without having a computer and that ek is going to release blocks for them i will be ok with it


Welcome aboard - added


----------



## Imprezzion

Quote:


> Originally Posted by *Widde*
> 
> Is there any downsides to not letting the card downclock besides the electrical bill? Thinking of flashing the asus pt1 bios and see if that helps yield a better oc. One would think stability might improve when it doesnt downclock? My gpu oc experience is limited to a nvidia gts 250 and a HD6870 so it's not great.
> 
> Have flashed gpu bioses before though


Noise.. 'Cause when upping voltages past +100mV it needs a LOT of fan speed to idle at reasonable temps.

At +100mV, 1.305v idle, my Tri-X runs at about 48c idle on 55% fanspeed.


----------



## taem

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I'm sticking with 13.12 until these drivers are fixed. I don't play any Mantle games, in fact almost every game I play is DX9.


What games do you play?! Like Diablo2? Even the first Batman Arkham game supports dx11 I believe.


----------



## Roboyto

Quote:


> Originally Posted by *Widde*
> 
> Is there any downsides to not letting the card downclock besides the electrical bill? Thinking of flashing the asus pt1 bios and see if that helps yield a better oc. One would think stability might improve when it doesnt downclock? My gpu oc experience is limited to a nvidia gts 250 and a HD6870 so it's not great.
> 
> Have flashed gpu bioses before though


Unnecessary wear and tear on the GPU... Not downclocking shouldn't have much effect on the card's ability to OC... either you got a good card or you didn't, ya know?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *taem*
> 
> What games do you play?! Like Diablo2? Even the first Batman Arkham game supports dx11 I believe.


Mainly Skyrim, Fallout, and Half Life/Black Mesa.


----------



## kizwan

Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Widde*
> 
> Is there any downsides to not letting the card downclock besides the electrical bill? Thinking of flashing the asus pt1 bios and see if that helps yield a better oc. One would think stability might improve when it doesnt downclock? My gpu oc experience is limited to a nvidia gts 250 and a HD6870 so it's not great.
> 
> Have flashed gpu bioses before though
> 
> 
> 
> Unnecessary wear and tear on the GPU... Not downclocking shouldn't have that much effect on the cards ability to OC... either you got a good card or you didn't, ya know?

It's more for stability than ability. Downclocking may introduce instability when going for a high overclock. IMO, the PT1 BIOS is better suited to benching than 24/7 usage. Fortunately our cards have a BIOS switch: keep the first BIOS for 24/7 usage & flash the second BIOS for overclocking/benching.

@Widde, if you didn't overvolt your card too much, it should be OK to run your card at a fixed clock 24/7.
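For anyone going the dual-BIOS route, the tool usually mentioned for Hawaii cards is ATIFlash, with `-s` to save and `-p` to program a ROM. As a sketch only, here's a dry run that just prints the steps in order rather than executing anything — the adapter index and ROM filenames are placeholders, and actually flashing is at your own risk:

```python
# Dry-run sketch of the usual ATIFlash sequence: back up first, then flash.
adapter = 0  # GPU index, normally found with "atiflash -i" (placeholder value)

steps = [
    f"atiflash -s {adapter} backup_bios.rom",  # 1. always save the current BIOS first
    f"atiflash -p {adapter} new_bios.rom",     # 2. program the new ROM, then reboot
]
for step in steps:
    print(step)
```

With the BIOS switch, you'd flip to the second position before flashing so the untouched first BIOS stays as a recovery fallback.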


----------



## Paul17041993

Quote:


> Originally Posted by *Widde*
> 
> Is there any downsides to not letting the card downclock besides the electrical bill? Thinking of flashing the asus pt1 bios and see if that helps yield a better oc. One would think stability might improve when it doesnt downclock? My gpu oc experience is limited to a nvidia gts 250 and a HD6870 so it's not great.
> 
> Have flashed gpu bioses before though


the only reason these have downclocking is to stop them overheating, or at least not overburden the TDP. So if you can keep the card cool and allow a higher TDP (+50%), downclocking should stop by itself, unless you're hitting the hard-set card TDP, in which case you need a [dangerous] BIOS to overcome the limit.


----------



## Imprezzion

Quote:


> Originally Posted by *Paul17041993*
> 
> the only reason why these have downclocking is to stop them overheating, or at least not overburden the TDP, so if you can keep the card cool and allow higher TDP (+50%) then downclocking should stop itself by default, unless you're hitting the hard-set card TDP which you then need a [dangerous] BIOS to overcome this limit.


Kind of right but not really.

14.xx drivers prevent the power limit of +50% from working in any overclocking software. Even CCC. The +50% limit is the hard limit. But you can't get anywhere near it..


----------



## Forceman

Quote:


> Originally Posted by *Paul17041993*
> 
> the only reason why these have downclocking is to stop them overheating, or at least not overburden the TDP, so if you can keep the card cool and allow higher TDP (+50%) then downclocking should stop itself by default, unless you're hitting the hard-set card TDP which you then need a [dangerous] BIOS to overcome this limit.


He's talking about downclocking at idle, not throttle downclocking.
Quote:


> Originally Posted by *Imprezzion*
> 
> Kind of right but not really.
> 
> 14.xx drivers prevent the power limit of +50% from working in any overclocking software. Even CCC. The +50% limit is the hard limit. But you can't get anywhere near it..


The increased power limit works for me on 14.2, although I had to set it in CCC to get it to work. At 0% I get downclocking but with +50% I don't, so I know it is working.


----------



## Widde

Quote:


> Originally Posted by *Paul17041993*
> 
> the only reason why these have downclocking is to stop them overheating, or at least not overburden the TDP, so if you can keep the card cool and allow higher TDP (+50%) then downclocking should stop itself by default, unless you're hitting the hard-set card TDP which you then need a [dangerous] BIOS to overcome this limit.


It's the idle downclocking I was talking about ^^ Sorry if I was unclear







Just so sick of getting bad chips overall







Cpu,gpus etc







But idle temp atm is about 45-50C and noise doesnt bother me since I have a closed headset while at the pc ^_^ But sticking with the stock bios for now, Cant wait untill summer and get some watercooling going


----------



## Paul17041993

Quote:


> Originally Posted by *Forceman*
> 
> He's talking about downclocking at idle, not throttle downclocking.


Quote:


> Originally Posted by *Widde*
> 
> It's the idle downclocking I was talking about ^^ Sorry if I was unclear
> 
> 
> 
> 
> 
> 
> 
> Just so sick of getting bad chips overall
> 
> 
> 
> 
> 
> 
> 
> Cpu,gpus etc
> 
> 
> 
> 
> 
> 
> 
> But idle temp atm is about 45-50C and noise doesnt bother me since I have a closed headset while at the pc ^_^ But sticking with the stock bios for now, Cant wait untill summer and get some watercooling going


ah, yea that's PowerPlay you're looking for, it's what allows the card to drop to the BIOS-defined 2D clocks, and it can be disabled via a 3rd party tool, not sure what there is apart from Afterburner though...

keeping in mind what I said before, you don't want your card to overheat, it should shut off if it does or if it overruns the TDP though.


----------



## k3rast4se

According to a few reviews, the MSI Gaming R9 290 and Sapphire R9 290 Tri-X are really on par. User feedback looks good for both cards. Is there anything I should know about either of them? The MSI Gaming is consistently cheaper than the Sapphire even though MSI's MSRP is 469 vs 449 for Sapphire.


----------



## Red1776

Quote:


> Originally Posted by *k3rast4se*
> 
> According to a few reviews, the MSI Gaming R9 290 and Sapphire R9 290 Tri-X are really on par. User feedback looks good for both cards. Is there anything I should know about either of them? The MSI Gaming is consistently cheaper than the Sapphire even though MSI's MSRP is 469 vs 449 for Sapphire.


 This is a good review to have a gander at http://www.guru3d.com/articles_pages/msi_radeon_r9_290x_gaming_oc_review,1.html

Not sure where the prices you gave are coming from though

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127773

(never mind, you are talking the 290's , not the 290X's)


----------



## k3rast4se

Extremely interesting review. It does seem like THE card to buy. I was originally planning to buy a PowerColor/Sapphire R9 290, but MSI build quality looks higher, and it has Hynix RAM.


----------



## Red1776

Quote:


> Originally Posted by *k3rast4se*
> 
> Extremely interesting review. It does seem like THE card to buy. I was originally planning to buy a PowerColor/Sapphire R9 290, but MSI build quality looks higher, and it has Hynix RAM.


seems so. all four of mine seem to be great OC'ers and run in the 70's under load with the twin Frozr. mine are going under water which should add some OC headroom.


----------



## k3rast4se

Prices at newegg.com are currently really good ($469.99) but NCIX Canada (no sales tax and free shipping) is still $549.99, which seems way too high. Since this card has been out for only 2 months, is almost as powerful as a GTX 780 Ti, and NVIDIA isn't launching a new card anytime soon, I believe $499.99 would be a good price??


----------



## taem

Quote:


> Originally Posted by *k3rast4se*
> 
> According to a few reviews, the MSI Gaming R9 290 and Sapphire R9 290 Tri-X are really on par. User feedback looks good for both cards. Is there anything I should know about either of them? The MSI Gaming is consistently cheaper than the Sapphire even though MSI's MSRP is 469 vs 449 for Sapphire.


MSI Gaming is fine for single gpu gaming without much overclocking. Once you start raising clocks and vddc, temps get out of hand. And based on a friend's experience they can't be crossfired: this card will heat-choke as the upper card. Problem is the fans max out at a really low rpm. Really quiet card at stock clocks though, albeit with temps I consider high for stock clocks.

The Sapphire Tri-X is the better card, it's second only to the Powercolor PCS+ for cooling. At stock clocks it will run 15-20c cooler than the MSI Gaming. Plenty of thermal overhead for overclocking. I have it as upper card in crossfire and it runs high 60s to low 70s in most gaming scenarios -- which for air cooled crossfire in a mid tower is godlike IMHO. Fans are loud though, once you crack 55% you won't be able to ignore them.

At Amazon the MSI Gaming can be had for $470, while the Tri-X is $499 and usually out of stock. The places that do carry the Tri-X charge $550ish.


----------



## MrWhiteRX7

@Red1776 is running four of them on air at the moment, I believe he's having success.


----------



## Red1776

while my MSI twin Frozr Gaming R290X's are going underwater as soon as the blocks get here, I have exceptional airflow through my case. I ran 4 x 6970 quadfire on air for quite a while not exceeding 85-88C.

Quote:


> ~~The Sapphire Tri-X is the better card, it's second only to the Powercolor PCS+ for cooling. At stock clocks it will *run 15-20c cooler than the MSI Gaming*. Plenty of thermal overhead for overclocking. I have it as upper card in crossfire and it runs high 60s to low 70s in most gaming scenarios


neither the PCS+ nor the Tri-X runs 15-20c below the MSI Gaming card, or even close to that. The PCS+ runs @ 68c under load, and the MSI Gaming series runs at 73c. Secondly, there is more to OC headroom than a card running a few degrees cooler.

http://www.guru3d.com/articles_categories/videocards.html

Guru put them all thru the paces and the conclusion on performance, temps, OC are remarkably close. In fact the load dBA from the MSI is a touch quieter.


----------



## taem

Quote:


> Originally Posted by *Red1776*
> 
> neither the PCS+ nor the Tri-X runs 15-20c below the MSI Gaming card, or even close to that. The PCS+ runs @ 68c under load, and the MSI Gaming series runs at 73c. Secondly, there is more to OC headroom than a card running a few degrees cooler.
> 
> http://www.guru3d.com/articles_categories/videocards.html
> 
> Guru put them all thru the paces and the conclusion on performance, temps, OC are remarkably close. In fact the load dBA from the MSI is a touch quieter.


I have a PCS+ and Tri-X, a buddy had a pair of MSI Gaming (he returned them due to temps), we have the same case setup, Define R4s. The PCS+ won't hit 68 at load at stock. It won't come close. You won't hit 68c even with a beefy overclock. Here is Far Cry 3 all night, ambient 22, with a pretty good overclock (1170 on the core, up from a factory overclock of 1040) and high fan rpm:


If I run close to stock with a 1090 oc, I can turn fans to basically silent (40%) and get temps like this:


It's not just me, go check out temps in the Powercolor thread, nobody is hitting 68c on the core without a big overclock and vddc adjust. Meanwhile my buddy's MSI Gaming in single gpu would game at low 70s core, high 70s/low 80s vrm1. The Tri-X isn't as good as the PCS+. But it's pretty darned good too. Here is my crossfire temp picture after a 50 loop of Metro LL bench maxed, ~22-23 ambient:



Gaming temps, Far Cry 3 for 3+ hours, ~22-23 ambient:



I'm surprised you're able to run quad MSI Gaming xfire on air. What are your temps like?

Anyway my buddy's MSI Gaming in crossfire always hit 90c on the top card at game load in minutes and throttled. He's not the only one with that experience. I'm not ragging on the card, like I said it's fine as a single card gpu and it's quiet. And it's the smallest 290 by a big margin which is nice. But I think your experience is aberrant. Over at OCUK there was a big thread about MSI Gaming and high temps which is what made me wait for the PCS+ and Tri-X, unlike my impatient friend.

Sorry for the long post, the hwinfo screens take up a lot of room, but I stand by what I said: the Powercolor PCS+ runs 15-20c cooler than the MSI Gaming, and the Tri-X is not too far behind. This PCS+ cooler is quite possibly the best gpu cooler ever made. I certainly have never owned a card that gets temps like this.


----------



## bbond007

Quote:


> Originally Posted by *taem*
> 
> Anyway my buddy's MSI Gaming in crossfire always hit 90c on the top card at game load in minutes and throttled. He's not the only one with that experience. I'm not ragging on the card, like I said it's fine as a single card gpu and it's quiet. And it's the smallest 290 by a big margin which is nice. But I think your experience is aberrant. Over at OCUK there was a big thread about MSI Gaming and high temps which is what made me wait for the PCS+ and Tri-X, unlike my impatient friend.


that is also my experience with the MSI Gamers. I am pretty disappointed with the MSI Gamers in crossfire. I have MSI GTX 760 Twin Frozrs and they are great.

The top card just seems like it's not able to cool itself at all.

I put a 200mm fan blowing directly on the cards and 2 120mm fans in the top of the case to draw heat out and equalize the airflow.

If i open my case while the cards are under load the temp of the top card instantly shoots up.

My theory is that the Twin Frozrs just do not have enough static pressure to cool in a limited space like crossfire. They're probably just fans with a different blade profile.

I would recommend some other card for less headache. You can get MSI Gamers to work in crossfire without throttling, but it's going to take some engineering.

On the other hand, my first card was a Gigabyte Windforce and it died after several hours of use.


----------



## Red1776

Quote:


> Originally Posted by *taem*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> neither the PCS+ nor the Tri-X runs 15-20c below the MSI Gaming card, or even close to that. The PCS+ runs @ 68c under load, and the MSI Gaming series runs at 73c. Secondly, there is more to OC headroom than a card running a few degrees cooler.
> 
> http://www.guru3d.com/articles_categories/videocards.html
> 
> Guru put them all thru the paces and the conclusion on performance, temps, OC are remarkably close. In fact the load dBA from the MSI is a touch quieter.
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I have a PCS+ and Tri-X, a buddy had a pair of MSI Gaming (he returned them due to temps), we have the same case setup, Define R4s. The PCS+ won't hit 68 at load at stock. It won't come close. You won't hit 68c even with a beefy overclock. Here is Far Cry 3 all night, ambient 22, with a pretty good overclock (1170 on the core, up from a factory overclock of 1040) and high fan rpm:
> 
> 
> If I run close to stock with a 1090 oc, I can turn fans to basically silent (40%) and get temps like this:
> 
> 
> It's not just me, go check out temps in the Powercolor thread, nobody is hitting 68c on the core without a big overclock and vddc adjust. Meanwhile my buddy's MSI Gaming in single gpu would game at low 70s core, high 70s/low 80s vrm1. The Tri-X isn't as good as the PCS+. But it's pretty darned good too. Here is my crossfire temp picture after a 50 loop of Metro LL bench maxed, ~22-23 ambient:
> 
> 
> 
> Gaming temps, Far Cry 3 for 3+ hours, ~22-23 ambient:
> 
> 
> 
> 
> 
> I'm surprised you're able to run quad MSI Gaming xfire on air. What are your temps like?
> 
> Anyway my buddy's MSI Gaming in crossfire always hit 90c on the top card at game load in minutes and throttled. He's not the only one with that experience. I'm not ragging on the card, like I said it's fine as a single card gpu and it's quiet. And it's the smallest 290 by a big margin which is nice. But I think your experience is aberrant. Over at OCUK there was a big thread about MSI Gaming and high temps which is what made me wait for the PCS+ and Tri-X, unlike my impatient friend.
> 
> Sorry for the long thread, the hwinfo screens take up a lot of room, but I stand by what I said: Powercolor PCS+ runs 15-20c cooler than the MSI Gaming, and the Tri-X is not too far behind. This PCS+ cooler, is quite possibly the best gpu cooler ever made. I certainly have never owned a card that gets temps like this.

It just dawned on me that we are comparing two different cards. You are running the 290 while I am running the 290X versions. Individually my core temps are coming in at around 65c during gaming, but my VRMs are nowhere close to what you claim. If your buddy is getting VRM temps on his MSI Gaming of 80-90, I would have him RMA it, or have a serious look at his case airflow situation, because my VRM temps look very close to yours.

Good Luck.

I am trying mine out until my water blocks get here, I have no interest in quadfiring 4 x 290x on air.

anyway, my mistake, I was comparing a card with 300+ more shaders and a lower TDP.


----------



## armartins

About the discussion regarding downclocking with 14.2 Beta 3: I'm still rocking my ref 7970 daily @1200MHz core, @1850MHz mem (Hynix FTW). If I set only my 1920x1200 @60Hz Dell U2412M as active it will downclock to 300MHz core, 150MHz mem; my Overlord at 2560x1440 @114Hz alone won't let the memory downclock from 1850MHz and the core downclocks only to 500MHz, and with both active it's the same 500/1850... funny thing is that the only resolution @114Hz that downclocks properly is 1024x768. But in general I don't mind, I use the monitor full time @114Hz since it makes reading forums in Chrome (with the Chrome Wheel addon) crystal clear while scrolling, and I also don't mind it not downclocking the memory since it's rock stable and temps are low...


----------



## Imprezzion

Ok, so I got pissed the crap off about 14.xx's issues with overclocking: power limits not working.. black screens when mem OCing, not downclocking when mem OCing..

So I installed 13.12, ditched Mantle for BF4 for now, installed TriXX, smacked my load fan speed to 100%, rammed the voltage bar and power limit as far as they go and boom.

Instead of running 1100MHz core and 1400MHz VRAM with the occasional throttling I'm now running +200mV, 1200MHz core, 1625MHz VRAM, no throttling at all at +50%. Temps are very high, but at 100% fan speed even +200mV (1.30v load) is tamable.

She's running at a flat 70c core and 78c VRM1 under BF4. Is that still within reason?









EDIT:

Ok, after a few rounds I can say it's reasonably stable, but temps did keep creeping up slightly to finally stabilize at 76c core and 90c VRM1. Now, I don't really like 90c VRM1 at 1.30v so I'll have to dial down a bit


----------



## taem

Quote:


> Originally Posted by *Red1776*
> 
> It just dawned on me that we are comparing two different cards. you are running the 290 while I am running the 290X versions. individually my core temps are coming in at around 65c during gaming, but my VRM's are nowhere close to what you claim. If you buddy is getting VRM temps on his MSI gaming of 80-90, I would have him RMA it, or have a serious look at his caseflow situation because my VRM temps are looking very close to yours.
> Good Luck.
> I am trying mine out until my water blocks get here, I have no interest in quadfiring 4 x 290x on air.
> anyway my mistake, I was comparing card with 300+ more shaders and less TDP.


So you haven't tried to crossfire these on air? Because I was saying from the start, the card is fine in single-gpu scenarios. You're getting better temps than we were, but the temps we were getting in single card mode were more than acceptable: low 70s core, high 70s/low 80s vrm1. Those are fine numbers. The 90c throttling scenario we were getting is in crossfire.

But if you're getting 70c under load like you were saying earlier, that's at least 15c higher than what I get at 1080 core, which is around the MSI Gaming's performance mode of 1070 or whatever it was. Which is what I was saying: the Powercolor is 15-20c cooler than the MSI Gaming. Keep in mind, the MSI Gaming is 267mm x 110mm x 38mm with dual fans that max at 2200rpm or so. The PCS+ is 294mm x 110mm x 52mm with triple fans that max at 5000rpm. The cooling differential is not surprising at all. The price you pay for it is a massive 3-slot card with very loud fans.
Quote:


> Originally Posted by *Imprezzion*
> 
> Smacked my load fan speed to 100%, rammed the voltage bar and power limit as far as they go and boom.
> 
> In stead of running 1100Mhz core and 1400Mhz VRAM with the occasional throttling i'm now running +200mV, 1200Mhz core, 1625Mhz VRAM, no throttling at all at +50%. Temps are very high but at 100% fanspeed even +200mV (1.30v load) is tamable.
> 
> She's running at a flat 70c core and 78c VRM1 under BF4. Is that still within reason?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT:
> 
> Ok, after a few rounds I can say it's reasonably stable, but temps did keep creeping up slightly to finally stabilize at 76c core and 90c VRM1. Now, I don't really like 90c VRM1 at 1.30v so I'll have to dial it down a bit


90c vrm1 at +200mV on air? That's godlike.


----------



## Red1776

On air, CF 4 x 290X gets warm. I did not have any throttling problems, however. The single card temp comes down to ambient, case airflow, and the game or stress test being run. Getting a comparison of any accuracy is not possible because of the variables between your setup and mine, including comparing an X version vs a non-X version of the 290.

Next week mine will be under water and maxing out at 39C, so it's a moot point for me.


----------



## Imprezzion

Well, yeah, but that's with the fan on 100% and the intake fans of my case, including side intake, all maxed out.









The Tri-X does a really good job at cooling VRMs, I gotta say.

I'll tell ya even better. At +150mV, which is what I run now, I can run 1200Mhz as well but it artifacts now and then with random white checkerboards, so I turned clocks down to 1180Mhz and that ran fine for 3 rounds of BF4 with no artifacts. Temps were of "godlike" proportions at 100% fan: 68c core and only 71c VRM1.

I'll post some screenshots later. Lost my GPU-Z graph with temps due to a random reboot. Had a tad too high VRAM clocks as I was testing how high my Hynix would go.







1625Mhz (6500Mhz effective) appears to be fully stable.
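For anyone wondering where the "effective" number comes from: GDDR5 is quad-pumped, so the effective data rate is four times the memory clock that tools like GPU-Z and Afterburner report. A quick sketch of the arithmetic (the helper name is mine, just for illustration):

```python
# GDDR5 moves four bits per pin per clock, so the "effective" rate is
# 4x the memory clock shown in GPU-Z/Afterburner. Hypothetical helper.
def effective_gddr5_mhz(reported_mhz):
    return reported_mhz * 4

print(effective_gddr5_mhz(1625))  # 1625MHz reported -> 6500MHz effective
print(effective_gddr5_mhz(1250))  # stock 290/290X memory -> 5000MHz effective
```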


----------



## taem

Quote:


> Originally Posted by *Imprezzion*
> 
> Well, yeah, but that's with the fan on 100% and the intake fans of my case, including side intake, all maxed out
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The Tri-X does a really good job at cooling VRMs, I gotta say.
> 
> I'll tell ya even better. At +150mV, which is what I run now, I can run 1200Mhz as well but it artifacts now and then with random white checkerboards, so I turned clocks down to 1180Mhz and that ran fine for 3 rounds of BF4 with no artifacts. Temps were of "godlike" proportions at 100% fan: 68c core and only 71c VRM1.
> 
> I'll post some screenshots later. Lost my GPU-Z graph with temps due to a random reboot. Had a tad too high VRAM clocks as I was testing how high my Hynix would go.
> 
> 
> 
> 
> 
> 
> 
> 1625Mhz (6500Mhz effective) appears to be fully stable.


Yeah, I'm amazed at the Tri-X VRM temps because my experience with Sapphire has been iffy VRM cooling. I didn't think it'd be possible to get better than high 70s core, low 80s VRM1 in air-cooled crossfire for GPU1, but the Tri-X is staying sub-70c. I'm amazed and delighted, especially since I run a quiet mid tower with undervolted Noctuas.

Tri-X fans are loud though. I don't see how you can live with 100% on those fans.


----------



## Imprezzion

Well, as my rad fans are six 2000RPM Noiseblocker PWM fans which ramp to 100% above 45c, I'm used to some noise.









Plus, they only ramp up to 100% when gaming, which is when I use my headset anyway, and the Siberia V2 is quite good at blocking outside noise.
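That kind of temperature-triggered ramp is easy to picture as a simple curve. A minimal sketch of a linear fan curve; only the "100% above 45c" point comes from the post, the floor speed and ramp start are made up for illustration:

```python
# Linear fan curve: hold a base duty cycle, scale linearly to 100%
# once coolant passes ramp_end. Only the 45c -> 100% point is from
# the post above; floor and ramp_start are invented numbers.
def fan_duty_percent(temp_c, floor=40.0, ramp_start=30.0, ramp_end=45.0):
    if temp_c <= ramp_start:
        return floor
    if temp_c >= ramp_end:
        return 100.0
    frac = (temp_c - ramp_start) / (ramp_end - ramp_start)
    return floor + frac * (100.0 - floor)

print(fan_duty_percent(25))  # idle -> 40.0 (the made-up floor)
print(fan_duty_percent(46))  # gaming load -> 100.0, fans flat out
```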


----------



## Prozillah

Alright I finally took some pics...can I pweeze joinz da club now??













2x Gigabyte Windforce OC's

Still testing overclocks at the moment - thinking I'll get pretty close to 1150 core with stock and minimal volt adjustment.


----------



## Tonza

So is the power limit bug caused by the latest AMD beta drivers or the latest MSI Afterburner beta? Might give those WHQL drivers a shot then if they fix the issue (I'm not playing BF4 anyway, so I don't use Mantle).


----------



## Connolly

Are any of you guys playing Titanfall with this GPU? If so, has anyone found a way to get smooth graphics with no screen tearing and still hit 120 FPS? The game seems a bit of a mess: the FPS is locked to 60 if you turn V-Sync off, but with it on it seems to have weird "stuttering" graphics. I've tried setting V-Sync on in game and then using CCC to force it off, but it doesn't work very well.

I'm currently running the 13.2 drivers.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Connolly*
> 
> Are any of you guys playing Titanfall with this GPU? If so, has anyone found a way to get smooth graphics with no screen tearing and still hit 120 FPS? The game seems a bit of a mess: the FPS is locked to 60 if you turn V-Sync off, but with it on it seems to have weird "stuttering" graphics. I've tried setting V-Sync on in game and then using CCC to force it off, but it doesn't work very well.
> 
> I'm currently running the 13.2 drivers.


Eh, Titanfall's vsync is bugged atm. I run MSAA x8 with vsync off and am not getting any tearing.

As for the stuttering, try switching from windowed to fullscreen a few times; that's an issue I've been having.


----------



## Connolly

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Eh, Titanfall's vsync is bugged atm. I run MSAA x8 with vsync off and am not getting any tearing.
> 
> As for the stuttering, try switching from windowed to fullscreen a few times; that's an issue I've been having.


Yeah, running it with V-Sync off does help, but, and this defies logic, it then locks to 60 FPS. No idea why having it off would do that, but there it is.

I'll give the windowed/full screen switching a go to see if it smooths out the stuttering. It's strange, there is no drop in FPS when the choppy/stuttering graphics occur, it just seems like something is going wrong whilst rendering certain parts of the map/objects.


----------



## Letmefly

Hey peeps, I have been rocking the Sapphire R9 290 Tri-X for a while now and have decided to up the memory from a measly 1380 to 1650mhz. Everything runs fine, but once in a while when testing with various games I encounter small block-type graphical artifacts (memory related). These artifacts appear after 30 minutes to an hour of gameplay, at which point I realize my clocks aren't stable and back off; I'm currently on 1540mhz.

Will aux pll voltage help a great deal? what is considered a safe value?

VRM 1 maxed temps 75-80c

cheers


----------



## Sgt Bilko

Quote:


> Originally Posted by *Connolly*
> 
> Yeah, running it with V-Sync off does help but, and this defies logic, it then locks to 60 FPS. No idea why having it off would do that, but there it is.
> 
> I'll give the windowed/full screen switching a go to see if it smooths out the stuttering. It's strange, there is no drop in FPS when the choppy/stuttering graphics occur, it just seems like something is going wrong whilst rendering certain parts of the map/objects.


It's certainly strange. They are aware of it and it will be fixed at some point.

Hope it works out for you anyway


----------



## Matt-Matt

Quote:


> Originally Posted by *Letmefly*
> 
> Hey peeps, I have been rocking the Sapphire R9 290 Tri-X for a while now and have decided to up the memory from a measly 1380 to 1650mhz. Everything runs fine, but once in a while when testing with various games I encounter small block-type graphical artifacts (memory related). These artifacts appear after 30 minutes to an hour of gameplay, at which point I realize my clocks aren't stable and back off; I'm currently on 1540mhz.
> 
> Will aux pll voltage help a great deal? what is considered a safe value?
> 
> VRM 1 maxed temps 75-80c
> 
> cheers


Upping the PLL voltage a titch may help a little. You can only try; I'd increase it ever so slightly and see if it helps, as your overclock seems borderline unstable (usually when it takes an hour to artifact like that, it means you're almost stable). That's what I've found, at least.


----------



## phallacy

Guys I ran into a bit of a problem, I was wondering if you could help.

Yesterday I decided I wanted to roll back to 13.12 to see how that affects benchmarking scores, so I did the usual DDU uninstall and reinstall. However, when installing 13.12, it froze at the part where the screen flickers to detect the cards. I thought some weird hiccup must have happened, so I ran DDU again just to get rid of anything the previous failed install might have left behind. Tried it again and the same thing: it froze at the exact same part. So I thought, ok, I'll try this later. Tried putting back 14.2 and bam, the exact same thing kept happening. I can't get either driver installed now.

Had to use System Restore to get back to 14.2 and that seemed to work ok. All the cards in GPU-Z are recognized the same and all show 13.350 as the driver. But now Catalyst Control Center won't open at all. It says installed, but when I click / run as admin, the circle spins for a few seconds like it's trying, then no CCC window pops up. As far as I can tell graphics performance hasn't changed and my cards can still clock as normal. I just can't seem to open Catalyst or roll back to any other driver. Any ideas? I really don't want to reinstall Windows, that would be a pain.

I thought maybe I will disable 2/3 cards and just leave the first card in the slot then try the install of the 13.12 drivers, maybe that would work? Then just enable the other 2 cards when the driver has installed fine and just update the Crossfire settings.


----------



## Imprezzion

Quote:


> Originally Posted by *Tonza*
> 
> So is the power limit bug caused by the latest AMD beta drivers or the latest MSI Afterburner beta? Might give those WHQL drivers a shot then if they fix the issue (I'm not playing BF4 anyway, so I don't use Mantle).


Driver bug. 13.12 performs way better in DX11 than the 14.xx family anyway.

I do have an issue where my case heats up a lot inside due to the side exhaust of the Tri-X, meaning that even though temps start at 68c core and 71c VRM, they rise and stabilize at 76c core and 88c VRM after gaming for a while.

When I get back home I'll see if leaving my side panel off helps. If yes, then my windowed panel has to go and I'll have to use my regular fanned panel again lol.

I also ordered a ColdZero backplate for my card btw







I love those things so much!


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Connolly*
> 
> Are any of you guys playing Titanfall with this GPU? If so, has anyone found a way to get smooth graphics with no screen tearing and still hit 120 FPS? The game seems a bit of a mess: the FPS is locked to 60 if you turn V-Sync off, but with it on it seems to have weird "stuttering" graphics. I've tried setting V-Sync on in game and then using CCC to force it off, but it doesn't work very well.
> 
> I'm currently running the 13.2 drivers.


Most likely it's either the game or your CPU (which is really because of the game). Log your CPU core usages while playing and make sure core 1 isn't hitting 100%. If it is, especially on Windows 7, add this to the Origin Titanfall properties launch command... -high

This has helped some people.

Mine plays smooth maxed out, fov 90, msaa 8x, insane textures, etc.. no fps drops whatsoever I love it







My CPU core usage is pretty well spread out, whereas my buddy that had stuttering was getting maxed out on core 1. When he set the priority of the game to high it leveled out, and no more issues for him.
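One way to check the "core 1 pegged" theory is to have your monitoring tool write a hardware log while you play, then scan it afterwards. A minimal sketch; the CSV layout and column names here are made up for illustration (real Afterburner/HWiNFO logs differ, but the idea is the same):

```python
import csv
import io

# Hypothetical log excerpt: timestamp plus per-core usage in percent.
# Real hardware-monitor logs use different column names.
log = io.StringIO(
    "time,core1,core2,core3,core4\n"
    "0,99,41,38,35\n"
    "1,100,45,40,33\n"
    "2,98,43,37,36\n"
)

rows = list(csv.DictReader(log))
core1_max = max(float(r["core1"]) for r in rows)
print("core 1 peak: %.0f%%" % core1_max)  # sustained ~100% = CPU-bound on one thread
```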


----------



## kizwan

Additional stuff came in. Trying to *tame* water temp with additional 120mm rad. Still need to order Pastel because I don't have any Pastel left.

Fujipoly Extreme 11 W/mK (150 x 100 x 1 mm) for the VRM1 cooling
(3 x) Custom 3-pin to 3 x 3-pin fans cables


----------



## MrWhiteRX7




----------



## kizwan

Quote:


> Originally Posted by *MrWhiteRX7*


----------



## taem

Quote:


> Originally Posted by *Prozillah*
> 
> 2x Gigabyte Windforce OC's
> 
> Still testing overclocks at the moment - thinking I'll get pretty close to 1150 core with stock and minimal volt adjustment.


What are the default settings like for the 290 Windforce? I know clocks are 1040/1250, but is default vddc 0? Because the Powercolor PCS+ is 1040 core also and it comes with a stock +50mV vddc adjust. How is the fan curve? What kind of temps do you get for the top card? As much as I like this Tri-X, its 305mm length prevents me from using the upper HDD cage, so I really need to replace it with a card that's 295mm or less, and I'm thinking the Giga is my best bet. Sapphire's design decisions irritate me; this Tri-X is a lot longer than it needs to be because the shroud extends way past the heatsink. They claim it directs the air, but that last bit of shroud is just sitting over blank PCB, and you don't need directed airflow there.
Quote:


> Originally Posted by *Imprezzion*
> 
> Well, as my rad fans are six 2000RPM Noiseblocker PWM fans which ramp to 100% above 45c, I'm used to some noise.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Plus, they only ramp up to 100% when gaming which is when I use my headset anyway and the Siberia V2 is quite good at covering outside noise.


It's ironic that you're used to fan noise because of your Noiseblocker fans.







Though I'm guessing for 2000rpm they are quiet. I've always wanted Noiseblocker fans, they ooze quality, but SO EXPENSIVE. They make my Noctuas seem budget. But man, you really like to rev up those fans lol. I've got my NF-P14s running at 700-900rpm and I keep my cards at 1080/1350 so I can keep the fans low. I don't use headphones though.


----------



## Connolly

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Most likely it's either the game or your CPU (which is really because of the game). Log your CPU core usages while playing and make sure core 1 isn't hitting 100%. If it is, especially on Windows 7, add this to the Origin Titanfall properties launch command... -high
> 
> This has helped some people.
> 
> Mine plays smooth maxed out, fov 90, msaa 8x, insane textures, etc.. no fps drops whatsoever I love it
> 
> 
> 
> 
> 
> 
> 
> My CPU core usage is pretty well spread out, whereas my buddy that had stuttering was getting maxed out on core 1. When he set the priority of the game to high it leveled out, and no more issues for him.


Thanks for the info, I'll test this out later. Was your friend running the game with v-sync on, on a 120hz monitor? My game is smooth if I turn v-sync off in game but force it to always on using Radeon pro.


----------



## Imprezzion

Well, six of 'em at 2K RPM on a rad isn't quiet at all, even though for 2K RPM they are indeed quite quiet.

I think my card isn't all too happy with my settings TBH.

Even at 100% fanspeed temps are very high: ~75c core and ~90c VRMs when playing BF4 for extended periods. But then again, I am running +175mV and 1180Mhz core.


----------



## k3rast4se

I can buy the MSI R9 290 Gaming today for 528 CAD + tax, $563 everything included. Considering a US MSRP of $469.99 and that the card came out only two months ago, should I buy at this price, or will the price drop even more in a few days?


----------



## taem

Quote:


> Originally Posted by *Imprezzion*
> 
> Well, six of 'em at 2K RPM on a rad isn't quiet at all, even though for 2K RPM they are indeed quite quiet.
> 
> I think my card isn't all too happy with my settings TBH.
> 
> Even at 100% fanspeed temps are very high: ~75c core and ~90c VRMs when playing BF4 for extended periods. But then again, I am running +175mV and 1180Mhz core.


That's a really high vddc adjust for the clock, but IMHO the temps are good for that voltage on air. You're still a nutter to run those Tri-X fans at 100% though







That's like your mom vacuuming your room while you game.


----------



## Imprezzion

Yeah well, this card ain't a real good overclocker. Only does about 1090-1100 at stock voltage and truly eats voltage past that.

I found out it runs 1150Mhz stable without artifacts at +100mV. Those temps are a lot lower: 68c core and 77c VRM after 1+ hour of BF4. That seems more sensible to me tbh.


----------



## rdr09

Quote:


> Originally Posted by *Red1776*
> 
> seems so. all four of mine seem to be great OC'ers and run in the 70's under load with the twin Frozr. mine are going under water which should add some OC headroom.


Hey Red, are you going to use ek full waterblock? might want to check this thread out . . .

http://www.overclock.net/t/1474896/what-water-block-will-fit-my-msi-r9-290-twin-frozr/10#post_21962764


----------



## taem

Quote:


> Originally Posted by *Imprezzion*
> 
> Yeah well, this card ain't a real good overclocker. Only does about 1090-1100 at stock voltage and truly eats voltage past that.
> 
> I found out it runs 1150Mhz stable without artifacts at +100mV. Those temps are a lot lower: 68c core and 77c VRM after 1+ hour of BF4. That seems more sensible to me tbh.


Well, since reference is 947 and factory is 1000, +90-100 on the core at stock voltage isn't bad at all. But yeah, needing +100mV for 1150 isn't stellar. I take it you're using Trixx since you took voltage so high; if you're going to stay under +100 vddc you might try Afterburner, where increasing aux voltage can sometimes let you squeeze out a few more clocks.
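Putting rough numbers on that: the gains people in this thread are chasing are all relative to the 947MHz reference clock. A quick sketch of the percentages (the helper is mine, just for illustration):

```python
# Percent gain over the 947MHz reference 290 clock for a few of the
# core speeds mentioned in this thread.
REFERENCE_MHZ = 947

def pct_over_reference(core_mhz):
    return (core_mhz - REFERENCE_MHZ) / REFERENCE_MHZ * 100.0

for clock in (1000, 1090, 1150, 1180):
    print("%d MHz is +%.1f%% over reference" % (clock, pct_over_reference(clock)))
```

So even a "mediocre" 1150MHz card is already more than 20% over reference.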


----------



## Imprezzion

Yeah, I was using TriXX but I'm stepping back to MSI AB, as running past +100mV is not done on air. Temps are ok, but it's not even summer yet and my room gets REALLY hot in the summer.

Can't wait for my ColdZero backplate to arrive btw







I'll make sure to take plenty of pics when it does


----------



## Mercy4You

Quote:


> Originally Posted by *taem*
> 
> Well, since reference is 947 and factory is 1000, +90-100 on the core at stock voltage isn't bad at all. But yeah, needing +100mV for 1150 isn't stellar. I take it you're using Trixx since you took voltage so high; if you're going to stay under +100 vddc you might try Afterburner, where increasing aux voltage can sometimes let you squeeze out a few more clocks.


I run mine at 1150Mhz using Trixx with +75mV. How can Afterburner help with Aux voltage? What settings should I try then?


----------



## Imprezzion

Hmm, nice. It does 25mV better than mine lol. Mine needs +100mV to be artifact free in Far Cry 3







(Which is, imo, the heaviest test possible for a GPU OC).


----------



## Mercy4You

Quote:


> Originally Posted by *Imprezzion*
> 
> Hmm, nice. It does 25mV better than mine lol. Mine needs +100mV to be artifact free in Far Cry 3
> 
> 
> 
> 
> 
> 
> 
> (Which is, imo, the heaviest test possible for a GPU OC).










For 1175Mhz though, it's artifacting even with +125mV...


----------



## Imprezzion

Max I can do with TriXX is 1180 @ +200mV artifact free, but my god the heat... 90+c VRMs at 100% fanspeed.

I found out that 1150/1625 @ +100mV runs great at 65% fanspeed. Max temps in GPU-Z after a few BF4 rounds and half an hour of FC3 were 68c core, 80c VRM1 and 51c VRM2.

Oh well. Maybe a waterblock in the future? All I'd need is a res, block and a 60mm-thick 240 rad. I can use my H320's pump and 360 rad as a base.


----------



## Mercy4You

Can you run your memory that high on lower VDDC than +100mV?


----------



## Durvelle27

Ok guys, question: what's the performance difference between the R9 290 and the R9 290X at the same clocks?


----------



## Mercy4You

Quote:


> Originally Posted by *Durvelle27*
> 
> Ok guys, question: what's the performance difference between the R9 290 and the R9 290X at the same clocks?


Rather close clockspeed comparison between the 2:

http://www.tomshardware.com/reviews/radeon-r9-290-and-290x,3728-2.html


----------



## Jack Mac

TBH even 100mV for 24/7 seems a bit high since the Tri-X PCB is very similar to reference, 200mV is pushing it and you're probably risking your card. My reference 290 did 1150 at +25mV, 1175 at +69mV and 1200 at +100mV.


----------



## Germanian

Quote:


> Originally Posted by *Durvelle27*
> 
> Ok guys, question: what's the performance difference between the R9 290 and the R9 290X at the same clocks?


About 3-10% depending on the game; probably closer to 8% max.


----------



## sugarhell

Quote:


> Originally Posted by *Durvelle27*
> 
> Ok guys, question: what's the performance difference between the R9 290 and the R9 290X at the same clocks?


Do you watercool your PSU?


----------



## Durvelle27

Quote:


> Originally Posted by *Mercy4You*
> 
> Rather close clockspeed comparison between the 2:
> 
> http://www.tomshardware.com/reviews/radeon-r9-290-and-290x,3728-2.html


Thx, I'll take a look
Quote:


> Originally Posted by *Germanian*
> 
> about 3-10% depending on game, maybe lower than 10% more like 8% max.


I mostly play Titanfall, NFS Rivals, BF4, Crysis 3, etc.
Quote:


> Originally Posted by *sugarhell*
> 
> Do you watercool your PSU?


Where did you find that lol


----------



## sugarhell

Quote:


> Originally Posted by *Durvelle27*
> 
> Thx, I'll take a look
> I mostly play Titanfall, NFS Rivals, BF4, Crysis 3, etc.
> Where did you find that lol


I googled watercooled PSUs and found your rig


----------



## Durvelle27

Quote:


> Originally Posted by *sugarhell*
> 
> I googled watercooled PSUs and found your rig


But no, the PSU is not watercooled, as it's not worth it


----------



## taem

Quote:


> Originally Posted by *Mercy4You*
> 
> I run mine at 1150Mhz using Trixx with +75mV. How can Afterburner help with Aux voltage? What settings should I try then?


Afterburner has an aux vddc adjust slider; no other OC app I know of does. What this actually affects, I have no clue lol. But I have found I can get better clocks at the same core vddc in Afterburner by raising it than I could in Trixx, which doesn't have the slider. Not always though, and only at certain clock break points. It seems to help more at higher core clock/vddc than lower. Just play with it and see if it helps. For example, if you max at 1150 @ +75mV in Trixx, try 1160 @ +75 vddc with a +25 or +50 aux vddc in Afterburner.

Maybe one of the tech guys here like Paul could chime in. Because I don't know if this poses any risk.


----------



## MapRef41N93W

Changing the thermal paste on my Tri-X has made such a difference. 30 mins of Valley benchmarking and it's running at a cool 65c at only 50% fan speed, down from the mid 80s. My DCUII on the other hand is running 83-85c (which seems consistent with what the reviews said about it) at 70% fan speed. Thankfully the DCUII cooler is much quieter than the Tri-X, and even at 70% fan speed with my new 900D setup it's really not that loud.

I'd really like to apply some MX-4 to the DCUII, but the jerks at Asus put a sticker on top of one of the backplate screws, forcing you to remove it (and probably tear it) to get the backplate off, which I assume voids the warranty. Not really worth it IMO.


----------



## sugarhell

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Changing the thermal paste on my Tri-X has made such a difference. 30 mins of Valley benchmarking and it's running at a cool 65c at only 50% fan speed. My DCUII on the other hand is running 83-85c (which seems to be consistent with what the reviews stated about it) at 70% fan speed. Thankfully the DCUII cooler is much quieter than the Tri-X and even at 70% fan speed with my new 900d setup it's really not that loud.
> 
> I'd really like to apply some MX-4 to the DCUII, but the jerks at Asus put a target sticker on top of one of the backplate screws forcing you to remove it (and probably tear it) to get the backplate off which I assume voids the warranty. Not really worth it IMO.


You have a 900d for air cooling? lol


----------



## MapRef41N93W

Quote:


> Originally Posted by *sugarhell*
> 
> You have a 900d for air cooling? lol


No, I bought a 900D to use with the G10s, as my HAF-X had no more slots for 120/140mm fans, but they didn't work out so I went back to air. I figure I'll attempt a custom water loop again in the near future.


----------



## Paul17041993

Quote:


> Originally Posted by *sugarhell*
> 
> I googled watercooled PSUs and found your rig


Quote:


> Originally Posted by *Durvelle27*
> 
> But no, the PSU is not watercooled, as it's not worth it


Why would a PSU even need water? It wouldn't even be simple to set up... the only fluid-cooled PSU I've seen was something from a decade back, because PSUs then were stupidly inefficient...


----------



## kpoeticg

Doesn't Koolance make a PSU Block? Or used to at least?

Putting a PSU under water is like putting HDDs under water, or the PCH, or RAM. Just something to add to the loop.


----------



## TommyGunn123

Welp, looks like our estimations were correct: they fugged the pipes and the Asus 290X Matrix is pretty much just a bulkier DCUII. gg Asus



http://overclock3d.net/reviews/gpu_displays/asus_r9_290x_matrix_platinum_review/1


----------



## TommyGunn123

Quote:


> Originally Posted by *kpoeticg*
> 
> Doesn't Koolance make a PSU Block? Or used to at least?
> 
> Putting a PSU underwater is like putting HDD's underwater, or PCH, or RAM. Just something to add to the loop


or just have your psu in a tank of mineral oil


----------



## Aussiejuggalo

Asus really hasn't been doing all that great with AMD cards this gen. I see a lot of people even in this thread complaining about core or VRM temps, low overclocks, etc., so personally I'm not surprised about the Matrix.

Gotta love OC3D's reviews. Tom's more or less right to the point and won't kiss ass; he says exactly what he thinks.


----------



## taem

Quote:


> Originally Posted by *TommyGunn123*
> 
> Welp, looks like our estimations were correct: they fugged the pipes and the Asus 290X Matrix is pretty much just a bulkier DCUII. gg Asus


At least they have a partial excuse in that they had a nice 780 cooler lying around and decided to get double value out of it. What's XFX's excuse for their Hitler-at-Stalingrad-like refusal to budge on putting sinks on the VRMs? They don't even make Nvidia cards anymore. After all the flak they got over melting 7970s, they still refuse to put sinks on VRMs. Real sinks I mean, not a piece of plastic. That's like a 25-cent part. I just don't get it, especially when they obviously invested some time and money in aesthetics; those XFX DDs are the best-looking GPUs ever made IMHO. If they had VRM sinks I'd grab a pair of them in a heartbeat.


----------



## bbond007

Quote:


> Originally Posted by *Imprezzion*
> 
> When I get back home i'll see if leaving my sidepanel off helps. If yes, then my windowed panel has to go and i'll have to use my regular fanned panel again lol.


I found a window panel which is the same style as my old case.

I used a dremel to cut a hole 200mm in the window panel for my fan. I used a transparent fan, and people have told me it looks pretty good.


----------



## TommyGunn123

Quote:


> Originally Posted by *taem*
> 
> At least they have a partial excuse in that they had a nice 780 cooler lying around and decided to get double value out of it. What's XFX's excuse for their Hitler-at-Stalingrad-like refusal to budge on putting sinks on the VRMs? They don't even make Nvidia cards anymore. After all the flak they got over melting 7970s, they still refuse to put sinks on VRMs. Real sinks I mean, not a piece of plastic. That's like a 25-cent part. I just don't get it, especially when they obviously invested some time and money in aesthetics; those XFX DDs are the best-looking GPUs ever made IMHO. If they had VRM sinks I'd grab a pair of them in a heartbeat.


Ehh, I guess, but TBH Asus is a giant in GPUs and that comes with a responsibility. I dunno, I'm just disappointed, because the Matrix looks pretty awesome IMO, and I had some faith they'd smash out a top-tier 290X :/

On other news, have a look at the Arctic Extreme 4 on these two 780 Tis; they're massive and look sweet!







Img source: comments section on http://www.techpowerup.com/198939/arctic-announces-four-new-universal-graphics-card-coolers.html


----------



## MapRef41N93W

Welp, so much for a clean install fixing my card's issues. It appeared for a while that it had; then in the middle of gaming my screen turned into this crazy connecting-pipe-looking pattern that was cream, blue, and red colored, and then the screen shut off and wouldn't turn back on. How stupid of me to think I could actually go more than 6 hours without a driver failure.


----------



## Imprezzion

AMD 14.3 Beta v1.0 drivers are out. For us overclockers, still utterly useless. Throttles even worse than 14.2, and power limit is still not working.
Also loads of artifacting when overclocking memory... my VRAM barely makes it to 1400mhz without flickering and black screens on 14.3, while it runs stable at 1625Mhz on 13.13


----------



## Sgt Bilko

Quote:


> Originally Posted by *Imprezzion*
> 
> AMD 14.3 Beta v1.0 drivers are out. For us overclockers, still utterly useless. Throttles even worse than 14.2, and power limit is still not working.
> Also loads of artifacting when overclocking memory... my VRAM barely makes it to 1400mhz without flickering and black screens on 14.3, while it runs stable at 1625Mhz on 13.13


Feature Highlights of The AMD Catalyst™ 14.3 Beta V1.0 Driver for Windows®
Thief:
AMD Mantle and AMD True Audio support
Improves stuttering observed in CrossFire mode
Call of Duty: Ghosts: QUAD CrossFire profile update - improves level load times
*Audio issues observed when using CrossFire configurations (and V-sync enabled) have been resolved*
BattleField 4: V-sync issues observed on CrossFire configurations (with Mantle enabled) have been resolved

That's what I was interested in









Downloading it now, will report back once I've messed about with a few things.


----------



## Forceman

Interesting that the date in the filename is 12 March.


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> > Originally Posted by *Imprezzion*
> > 
> > AMD 14.3 Beta v1.0 drivers out. For us overclockers still utterly useless. Throttles even worse than 14.2. Power limit is still not working.
> > Also loads of artifacting when overclocking memory... My VRAM barely makes it to 1400mhz without flickering and black screens on 14.3 while it's running stable on 1625Mhz on 13.13
> 
> Feature Highlights of The AMD Catalyst™ 14.3 Beta V1.0 Driver for Windows®
> Thief:
> AMD Mantle and AMD True Audio support
> *Improves stuttering* observed in CrossFire mode
> Call of Duty: Ghosts: QUAD CrossFire profile update - improves level load times
> *Audio issues observed when using CrossFire configurations (and V-sync enabled) have been resolved*
> BattleField 4: V-sync issues observed on CrossFire configurations (with Mantle enabled) have been resolved
> 
> That's what I was interested in
> 
> Downloading it now, will report back once I've messed about with a few things.

Improves stuttering?? I don't want stuttering; I want it reduced or eliminated.


----------



## Arizonian

Updated driver info in the OP history list.

*AMD Catalyst™ 14.3 Beta V1.0 Driver for Windows® Operating System*


Spoiler: Driver Info!



*Feature Highlights of The AMD Catalyst™ 14.3 Beta V1.0 Driver for Windows®*
◾Thief:
◾AMD Mantle and AMD True Audio support
◾Improves stuttering observed in CrossFire mode
◾Call of Duty: Ghosts: QUAD CrossFire profile update - improves level load times
◾Audio issues observed when using CrossFire configurations (and V-sync enabled) have been resolved
◾BattleField 4: V-sync issues observed on CrossFire configurations (with Mantle enabled) have been resolved

*Known Issues*
◾Intermittent driver stability issues when installing/un-installing on Desktop Kaveri platforms that support AMD Enduro technology under Windows 8.1. Please disable Enduro support to resolve the issue
◾Secondary GPUs do not enter low power state on CrossFire configurations; this issue will be addressed in the next AMD Catalyst beta release
◾Thief (DirectX): Lighting flickers on CrossFire configurations only after CrossFire has been enabled then disabled; this issue will be addressed in the next AMD Catalyst beta release
◾Battlefield 4 (DirectX): Quad CrossFire configurations with Eyefinity Display configurations suffer slowdowns and stability issues
◾Titanfall: Flickering occurs under AMD CrossFire configurations



Good luck everyone.

If anyone has Thief, I'm interested in hearing your take on TrueAudio when enabled, if you have good headphones or surround sound.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Arizonian*
> 
> Updated driver info to OP history list
> 
> *AMD Catalyst™ 14.3 Beta V1.0 Driver for Windows® Operating System*
> 
> 
> Spoiler: Driver Info!
> 
> 
> 
> *Feature Highlights of The AMD Catalyst™ 14.3 Beta V1.0 Driver for Windows®*
> ◾Thief:
> ◾AMD Mantle and AMD True Audio support
> ◾Improves stuttering observed in CrossFire mode
> ◾Call of Duty: Ghosts: QUAD CrossFire profile update - improves level load times
> ◾Audio issues observed when using CrossFire configurations (and V-sync enabled) have been resolved
> ◾BattleField 4: V-sync issues observed on CrossFire configurations (with Mantle enabled) have been resolved
> 
> *Known Issues*
> ◾Intermittent driver stability issues when installing/un-installing on Desktop Kaveri platforms that support AMD Enduro technology under Windows 8.1. Please disable Enduro support to resolve the issue
> ◾Secondary GPUs do not enter low power state on CrossFire configurations; this issue will be addressed in the next AMD Catalyst beta release
> ◾Thief (DirectX): Lighting flickers on CrossFire configurations only after CrossFire has been enabled then disabled; this issue will be addressed in the next AMD Catalyst beta release
> ◾Battlefield 4 (DirectX): Quad CrossFire configurations with Eyefinity Display configurations suffer slowdowns and stability issues
> ◾Titanfall: Flickering occurs under AMD CrossFire configurations
> 
> 
> 
> Good luck everyone.
> 
> If anyone has Thief I'm interested in finding out your takes on True Audio when enabled if you have good headphones or surround sound.


Well looks like we won't know till tomorrow










Spoiler: Warning: Spoiler!



Eidos is about to roll out a new patch for Thief and AMD says the patch will add support for Mantle and TrueAudio.
There is no word on how much performance will be gained, but users with compatible AMD graphics bottlenecked by a lacklustre CPU stand to gain the most. Mantle is all about eliminating CPU overhead, so it delivers the best results on CPU-bound configurations.

TrueAudio will bring a few new audio effects, including convolution reverb, which should make echoes a bit more realistic. TrueAudio can deliver a lot of location-based audio improvements, adding a bit of realism to what is already a rather realistic looking game.

However, only users with the latest Hawaii-based cards will be able to enjoy Mantle and TrueAudio improvements. Only the R9 290X, R9 290 and the Bonaire-based R7 260X will do the trick. AMD is still working to get more developers on the Mantle bandwagon. Eidos and EA DICE are the biggest names so far, but other big players like Crytek, Valve and Ubisoft are still not on board.



The patch should roll out on March 18.

http://www.fudzilla.com/home/item/34222-thief-patch-adds-mantle-and-trueaudio-support


----------



## Heinz68

I just received email from VisionTek:

_"We currently have a limited quantity of CryoVenom cards available for order right now at"_ http://bit.ly/19Vevxn!


----------



## Imprezzion

Well, at least I found out what AUX VDDC helps with..

VRAM OC gets a lot more stable with more AUX VDDC, even though it's not directly VRAM voltage. The voltage is too low for that.

I can run 1625MHz at +100mV AUX without issues, but it starts to flicker and artifact when I lower AUX under +50mV. At +25mV it flickers on my desktop and gives random colored stripes on my screen, but at +100mV it's just perfectly fine.

Might just be random, but I hope it helps you guys clock VRAM.

I was lucky with my card since it has Hynix R0C VRAM, so it clocks a tad better than Elpida anyway.
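
For a sense of what those memory clocks actually buy, GDDR5 transfers four bits per pin per clock, so the bandwidth difference between 1400MHz and 1625MHz is easy to put in numbers. This is just a back-of-the-envelope sketch of mine (the function name is made up, and it assumes the reference 290/290X's 512-bit bus):

```python
# Rough GDDR5 bandwidth estimate for a 512-bit Hawaii card (assumed reference bus width).
def gddr5_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int = 512) -> float:
    """GDDR5 is quad-pumped: effective data rate = 4 x memory clock."""
    effective_rate_gbps = mem_clock_mhz * 4 / 1000   # Gbps per pin
    return effective_rate_gbps * bus_width_bits / 8  # GB/s across the whole bus

print(round(gddr5_bandwidth_gbs(1250)))  # stock 290/290X memory clock: 320 GB/s
print(round(gddr5_bandwidth_gbs(1625)))  # the 1625MHz OC above: 416 GB/s
```

So that 1625MHz run is roughly a 30% bandwidth bump over stock, which is why people chase these memory clocks at all.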


----------



## Matt-Matt

Quote:


> Originally Posted by *Imprezzion*
> 
> Well, at least I found out what AUX VDDC helps with..
> 
> VRAM OC gets a lot more stable with more AUX VDDC, even though it's not directly VRAM voltage. The voltage is too low for that.
> 
> I can run 1625MHz at +100mV AUX without issues, but it starts to flicker and artifact when I lower AUX under +50mV. At +25mV it flickers on my desktop and gives random colored stripes on my screen, but at +100mV it's just perfectly fine.
> 
> Might just be random, but I hope it helps you guys clock VRAM.
> 
> I was lucky with my card since it has Hynix R0C VRAM, so it clocks a tad better than Elpida anyway.


Nice, good to know that it does have some sort of correlation with it. Means I'll hopefully be able to push more than 1400MHz!
I've got Elpida on mine unfortunately; Bad Luck Matt, always getting the worst-clocking vRAM


----------



## Sgt Bilko

Quote:


> Originally Posted by *Matt-Matt*
> 
> Nice, good to know that it does have some sort of correlation with it. Means I'll hopefully be able to push more than 1400MHz!
> I've got Elpida on mine unfortunately; Bad Luck Matt, always getting the worst-clocking vRAM


Wow, that's bad. I can run 1500/1550MHz depending on the program.

Mine's Elpida too, btw


----------



## kizwan

Quote:


> Originally Posted by *Matt-Matt*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Imprezzion*
> 
> Well, at least I found out what AUX VDDC helps with..
> 
> VRAM OC gets a lot more stable with more AUX VDDC, even though it's not directly VRAM voltage. The voltage is too low for that.
> 
> I can run 1625MHz at +100mV AUX without issues, but it starts to flicker and artifact when I lower AUX under +50mV. At +25mV it flickers on my desktop and gives random colored stripes on my screen, but at +100mV it's just perfectly fine.
> 
> Might just be random, but I hope it helps you guys clock VRAM.
> 
> I was lucky with my card since it has Hynix R0C VRAM, so it clocks a tad better than Elpida anyway.
> 
> 
> 
> 
> 
> 
> 
> Nice, good to know that it does have some sort of correlation with it. Means I'll hopefully be able to push more than 1400MHz!
> I've got Elpida on mine unfortunately; Bad Luck Matt, always getting the worst-clocking vRAM
Click to expand...

On 290's, Elpida is actually not bad at overclocking. I've benched mine at 1600MHz, no artifacts, no problems. Also a couple of 1620 - 1625 MHz runs too. My cards have Hynix & Elpida memory respectively, btw.


----------



## Matt-Matt

Quote:


> Originally Posted by *kizwan*
> 
> On 290's, Elpida is actually not bad at overclocking. I've benched mine at 1600MHz, no artifacts, no problems. Also a couple of 1620 - 1625 MHz runs too. My cards have Hynix & Elpida memory respectively, btw.


Both my 7950's had Elpida and the one that I only had for a week had Hynix ($150 card.. Sold for ~$330).

I know that without any voltage increase I can do around 1400, maybe 1450, but I remember that 1500+ corrupts the Windows desktop after a good 30 minutes or so.


----------



## Brian18741

What drivers are people using in crossfire? I'm still on 13.12, should I be on the new beta 14.2?

Started playing Sleeping Dogs last night and am getting shocking FPS, about 25 - 30fps max, dipping down to single digits at points! This is at 1440p, but with R9 290 crossfire?! I even tried the low preset in the graphics settings and the problem persists.

Also I'm noticing only 1 CPU core (3570k) is being utilised according to Afterburner. Core 1 is at 100%, while 2, 3 and 4 max out at about 5% usage. Is this a bugged reading or what's going on?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Brian18741*
> 
> What drivers are people using in crossfire? I'm still on 13.12, should I be on the new beta 14.2?
> 
> Started playing Sleeping Dogs last night and am getting shocking FPS, about 25 - 30fps max, dipping down to single digits at points! This is at 1440p, but with R9 290 crossfire?! I even tried the low preset in the graphics settings and the problem persists.
> 
> Also I'm noticing only 1 CPU core (3570k) is being utilised according to Afterburner. Core 1 is at 100%, while 2, 3 and 4 max out at about 5% usage. Is this a bugged reading or what's going on?


I was using 14.2 for everything and nothing changed in non-Mantle games from 13.12, but I'm on 14.3 now and I'm getting terribad tearing in Titanfall.

Might need to tweak the V-sync settings now that the audio crackle has been fixed (the CrossFire + V-sync issue with 14.2).

Haven't tried anything else on 14.3, but give it a go and see what happens; worst case is a DDU uninstall and a roll back to 13.12









As for Sleeping Dogs, isn't it a very CPU-intensive game?

What clock is your 3570k running at?


----------



## TommyGunn123

Quote:


> Originally Posted by *Brian18741*
> 
> What drivers are people using in crossfire? I'm still on 13.12, should I be on the new beta 14.2?
> 
> Started playing Sleeping Dogs last night and am getting shocking FPS, about 25 - 30fps max, dipping down to single digits at points! This is at 1440p, but with R9 290 crossfire?! I even tried the low preset in the graphics settings and the problem persists.
> 
> Also I'm noticing only 1 CPU core (3570k) is being utilised according to Afterburner. Core 1 is at 100%, while 2, 3 and 4 max out at about 5% usage. Is this a bugged reading or what's going on?


Does it only do it in Sleeping Dogs? If not, try updating/reinstalling the driver


----------



## Brian18741

Will try tonight when I get home, as you say, worst case scenario I just roll back to 13.12.

Re the 3570k, it's currently at stock clocks as my H100 died last week and I'm using the stock cooler while waiting on the RMA. It usually runs at 4.5GHz though. But the problem seems to be that it's only using 1 core, literally 100% usage on core 1 and max 4 or 5% usage on cores 2, 3 and 4. I'll stick up a screenshot later on as well.


----------



## rdr09

Quote:


> Originally Posted by *Brian18741*
> 
> Will try tonight when I get home, as you say, worst case scenario I just roll back to 13.12.
> 
> Re the 3570k, it's currently at stock clocks as my H100 died last week and I'm using the stock cooler while waiting on the RMA. It usually runs at 4.5GHz though. But the problem seems to be that it's only using 1 core, literally 100% usage on core 1 and max 4 or 5% usage on cores 2, 3 and 4. I'll stick up a screenshot later on as well.


You can't leave an i5 at stock with 2 290s, no matter what game.









edit: might as well play with just one GPU, and it might still not be enough.


----------



## Matt-Matt

Quote:


> Originally Posted by *rdr09*
> 
> You can't leave an i5 at stock with 2 290s, no matter what game.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit: might as well play with just one GPU, and it might still not be enough.


Yeah, my i5 at stock is bottlenecking my single mildly overclocked 290.. I've gotta overclock it again once I get off the stock cooler


----------



## rdr09

Quote:


> Originally Posted by *Matt-Matt*
> 
> Yeah, my i5 at stock is bottlenecking my single mildly overclocked 290.. I've gotta overclock it again once I get off the stock cooler


I remember my i7 struggled at stock with a 7970.


----------



## Csokis

AMD Clean Uninstall Utility

Quote:


> The AMD Clean Uninstall Utility will attempt to remove previously installed AMD Catalyst™ driver and AMD files and folders from the system. It is recommended to run the utility prior to the installation of a new AMD Catalyst driver. This may help prevent errors relating to missing/corrupt files associated with the AMD Catalyst software that may occur during/after the driver installation process.
> 
> *Supported Operating Systems*
> - Windows 8.1 32-bit and 64-bit
> - Windows 8 32-bit and 64-bit
> - Windows 7 32-bit and 64-bit


----------



## Matt-Matt

Quote:


> Originally Posted by *rdr09*
> 
> I remember my i7 struggled at stock with a 7970.


What i7 though? It doesn't make a huge difference, but yeah, it vastly depends on the game.


----------



## battleaxe

Will two 290's in crossfire choke a 3570k clocked at 4.7GHz? After this mining craze is over, which it seems we're about at, I plan to put the 290's in one of my gaming rigs. My 3570k seems the best one as it has PCIe 3.0 where my 2700k does not.

Suggestions? Will the 3570k do the dual 290's justice?
Quote:


> Originally Posted by *rdr09*
> 
> can't leave an i5 at stock with 2 290s no matter what game.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit: might as well play with just one gpu and it might still be not enough.


I just read this. Jeez. So what should I get for this board? It's an ASUS Z77 Pro. Seems a nice board. I guess I could go 3770k? Would that be enough? Or do I have to start over and go 6-core?


----------



## rdr09

Quote:


> Originally Posted by *Matt-Matt*
> 
> What i7 though? It doesn't make a huge difference, but yeah, it vastly depends on the game.


Sandy Bridge. Even with a slower 7950 at stock . . . I have to OC my CPU to 4.5GHz. Never tried lower.

The only game I play with the i7 at stock and the 290 is BF4, with HT off but with Mantle.


----------



## Brian18741

I don't have the problem in other games though. Is it just a case of Sleeping Dogs being badly optimised? Will try one card later as well and see what happens.


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> Will two 290's in crossfire choke a 3570k clocked at 4.7GHz? After this mining craze is over, which it seems we're about at, I plan to put the 290's in one of my gaming rigs. My 3570k seems the best one as it has PCIe 3.0 where my 2700k does not.
> 
> Suggestions? Will the 3570k do the dual 290's justice?
> I just read this. Jeez. So what should I get for this board? It's an ASUS Z77 Pro. Seems a nice board. I guess I could go 3770k? Would that be enough? Or do I have to start over and go 6-core?


Like Matt said, it depends on the game. I highly recommend an i7 for 2 290s. For 3 . . . an i7 Haswell would be the minimum, preferably X79, and with the CPU OC'ed. Again, this is just an opinion.


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> can't leave an i5 at stock with 2 290s no matter what game.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit: might as well play with just one gpu and it might still be not enough.


Quote:


> Originally Posted by *rdr09*
> 
> Like Matt said, it depends on the game. I highly recommend an i7 for 2 290s. For 3 . . . an i7 Haswell would be the minimum, preferably X79, and with the CPU OC'ed. Again, this is just an opinion.


My 2700k will go to 5.0Ghz. Should I use that instead for the 290's?


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> My 2700k will go to 5.0Ghz. Should I use that instead for the 290's?


i would.


----------



## chronicfx

Quote:


> Originally Posted by *rdr09*
> 
> Like Matt said, it depends on the game. I highly recommend an i7 for 2 290s. For 3 . . . an i7 Haswell would be the minimum, preferably X79, and with the CPU OC'ed. Again, this is just an opinion.


Are there any real-world benchmarks to back this up? I keep hearing i7, as I run 3 290Xs on a 3570k but I have not seen 100% on all my cores yet. The only scenario could be 64-man multiplayer, due to the physics of all those people needing rendering at once??

I would just love to see a FRAPS run or something just once.. I know switching CPUs is a damn pain in the arse, but it is so hard to find i5 vs i7 comparisons, with overclocks, that hold a 3- or 4-GPU modern configuration constant. I am not even sure why they do all the CPU comparisons with only one GPU, just to show no difference.


----------



## rdr09

Quote:


> Originally Posted by *chronicfx*
> 
> Are there any real-world benchmarks to back this up? I keep hearing i7, as I run 3 290Xs on a 3570k but I have not seen 100% on all my cores yet. The only scenario could be 64-man multiplayer, due to the physics of all those people needing rendering at once??
> 
> I would just love to see a FRAPS run or something just once.. I know switching CPUs is a damn pain in the arse, but it is so hard to find i5 vs i7 comparisons, with overclocks, that hold a 3- or 4-GPU modern configuration constant. I am not even sure why they do all the CPU comparisons with only one GPU, just to show no difference.


Like Matt said, it depends on the game. In the case of Battleaxe, he has a choice and I recommend the i7. For games that don't need HT, turn it off to keep the system cooler. For games that do, simply turn it on.

I found this out playing BF3 MP . . .

HT off



HT on



That was with crossfired 7900 series cards.


----------



## Brian18741

Quote:


> Originally Posted by *chronicfx*
> 
> Are there any real-world benchmarks to back this up? I keep hearing i7, as I run 3 290Xs on a 3570k but I have not seen 100% on all my cores yet. The only scenario could be 64-man multiplayer, due to the physics of all those people needing rendering at once??


If you leave Afterburner open while you're gaming, does it show each of your GPUs at 100%, or is the usage up and down all over the place?


----------



## Widde

Quote:


> Originally Posted by *battleaxe*
> 
> Will two 290's in crossfire choke a 3570k clocked at 4.7ghz? After this mining craze is over, which seems we're about there I plan to put the 290's on one of my gaming rigs. My 3570k seems the best one as it has PCI-e 3.0 where my 2700k does not.
> 
> Suggestions? Will the 3570k do the dual 290's justice?
> I just read this. Jeez. So what should I get for this board? Its a Z77 Pro ASUS. Seems a nice board. I guess I could go 3770k? Would that be enough? Or do I have to start over and go 6 core?


I have not experienced any bottlenecking with my 3570k at 4.6GHz; my 290s are at 100% in BF4 (@1080p, Ultra with 200% res scale, no AA) and the CPU is hovering at 90-ish%. Haven't tested Titanfall yet though.


----------



## battleaxe

Quote:


> Originally Posted by *Widde*
> 
> I have not experienced any bottlenecking with my 3570k at 4.6GHz; my 290s are at 100% in BF4 (@1080p, Ultra with 200% res scale, no AA) and the CPU is hovering at 90-ish%. Haven't tested Titanfall yet though.


Good to know. Thanks. I might just leave them on the 3570k then. That'd be a lot easier. And I'm lazy, after all.


----------



## Durvelle27

Is it worth getting a 290X over the 290?


----------



## k3rast4se

http://www.techspot.com/review/734-battlefield-4-benchmarks/page6.html


----------



## rdr09

Quote:


> Originally Posted by *Widde*
> 
> I have not experienced any bottlenecking with my 3570k at 4.6GHz; my 290s are at 100% in BF4 (@1080p, Ultra with 200% res scale, no AA) and the CPU is hovering at 90-ish%. Haven't tested Titanfall yet though.


Single player or MP? I do play with my i7 at 4.5 with HT off (slower than your Ivy) with one 290 without issues in DX11.


----------



## rdr09

Quote:


> Originally Posted by *k3rast4se*
> 
> http://www.techspot.com/review/734-battlefield-4-benchmarks/page6.html


Any MP benchmarks?


----------



## BradleyW

Has anyone received their copy of the Mantle patch for Thief yet?


----------



## taem

Quote:


> Originally Posted by *Brian18741*
> 
> What drivers are people using in crossfire? I'm still on 13.12, should I be on the new beta 14.2?
> 
> Started playing Sleeping Dogs last night and am getting shocking FPS, about 25 - 30fps max, dipping down to single digits at points! This is at 1440p, but with R9 290 crossfire?! I even tried the low preset in the graphics settings and the problem persists.
> 
> Also I'm noticing only 1 CPU core (3570k) is being utilised according to Afterburner. Core 1 is at 100%, while 2, 3 and 4 max out at about 5% usage. Is this a bugged reading or what's going on?


Something is definitely wrong; even on a single 290 and [email protected] I can run Sleeping Dogs at 1440p with high-res textures and most settings maxed (not all though) and get 60 fps.


----------



## BradleyW

Just for those waiting for Mantle on Thief: CrossFire has been hard-disabled and is awaiting a game patch.


----------



## Paul17041993

Quote:


> Originally Posted by *k3rast4se*
> 
> http://www.techspot.com/review/734-battlefield-4-benchmarks/page6.html


It peaks at 4 threads in singleplayer; guess that's why performance issues in multiplayer are so common...

And I guess I was mistaken about the ceramic paint on the Matrix heatsink; it's just regular paint, which makes it a good 10% hotter than the clean DCII.


----------



## Mercy4You

Quote:


> Originally Posted by *Durvelle27*
> 
> Is it worth getting a 290X over the 290?


Absolutely, but I need the fastest gear. I mostly play Death Match on a 120Hz monitor, so I want 120 FPS


----------



## Durvelle27

Quote:


> Originally Posted by *Mercy4You*
> 
> Absolutely, but I need the fastest gear. I mostly play Death Match on a 120Hz monitor, so I want 120 FPS


Well, I have a 4K monitor and need more power to drive it.


----------



## Mercy4You

Quote:


> Originally Posted by *Durvelle27*
> 
> Well, I have a 4K monitor and need more power to drive it.


This may be your answer, I guess:

http://www.pcmag.com/article2/0,2817,2453389,00.asp

If I went for a 4K monitor, I would rather consider 2 x R9 290 in CF. But it all depends on how many FPS you need in the games you play...
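
To put the 4K jump in numbers, here's a quick pixel-count comparison. This is my own back-of-the-envelope sketch, not something from the linked article:

```python
# Pixel load of common resolutions relative to 1080p -- a rough proxy for
# how much extra GPU grunt a 4K panel demands per frame.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = 1920 * 1080

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels:,} px ({pixels / base:.2f}x 1080p)")
```

4K pushes exactly 4x the pixels of 1080p every frame, which is the rough intuition behind pairing it with two cards in CF.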


----------



## Durvelle27

Quote:


> Originally Posted by *Mercy4You*
> 
> This may be your answer, I guess:
> 
> http://www.pcmag.com/article2/0,2817,2453389,00.asp
> 
> If I went for a 4K monitor, I would rather consider 2 x R9 290 in CF. But it all depends on how many FPS you need in the games you play...


The plan was either 2x 290, 2x 290X, or 2x 780 Ti eventually


----------



## Mercy4You

Today, after 10 weeks of RMA, ASUS sent back the exact same blackscreening R9 290X with the comment 'we could not find a problem regarding this GPU'.

Well, ASUS, that was the last product I buy from your company!


----------



## rdr09

Quote:


> Originally Posted by *Mercy4You*
> 
> Today, after 10 weeks of RMA, ASUS sent back the exact same blackscreening R9 290X with the comment 'we could not find a problem regarding this GPU'.
> 
> Well, ASUS, that was the last product I buy from your company!


Test it. They may have quietly fixed it. Sell it if it is fixed. Just a suggestion.


----------



## vortex240

Quote:


> Originally Posted by *Mercy4You*
> 
> Today, after 10 weeks of RMA, ASUS sent back the exact same blackscreening R9 290X with the comment 'we could not find a problem regarding this GPU'.
> 
> Well, ASUS, that was the last product I buy from your company!


If they tested it at stock frequencies and it passed, then they have no obligation to replace it. It might simply be a GPU that doesn't want to OC.


----------



## Mercy4You

Quote:


> Originally Posted by *rdr09*
> 
> Test it. They may have quietly fixed it. Sell it if it is fixed. Just a suggestion.


you may be right








Quote:


> Originally Posted by *vortex240*
> 
> If they tested it at stock frequencies and it passed, then they have no obligation to replace it. It might simply be a GPU that doesn't want to OC.


It blackscreened at stock!


----------



## vortex240

Quote:


> Originally Posted by *Mercy4You*
> 
> It blackscreened at stock!


Have you tried the card in a different mobo/psu?

ASUS has great RMA, they don't mess around - there is a very high chance here that the REAL issue is on your end.


----------



## VSG

Quote:


> Originally Posted by *vortex240*
> 
> ASUS has great RMA, they don't mess around - there is a very high chance here that the REAL issue is on your end.


Asus has terrible RMA service relative to other companies, based on what I have read on several forums.


----------



## MapRef41N93W

I'm crashing to vertical lines all the time now. 15 minutes of the Heaven benchmark with the cards running at 80C/65C, and it crashes to vertical lines. 20 mins of Dota with the cards at 70C/55C, crash to vertical lines. I'm really starting to think this Asus card is just a POS, and I'm regretting buying it. Tried running at stock, same problem. A clean Windows install didn't fix it; 13.12, 14.1, 14.2, no fix.


----------



## Mercy4You

Quote:


> Originally Posted by *vortex240*
> 
> Have you tried the card in a different mobo/psu?
> 
> there is a very high chance here that the REAL issue is on your end.


No, but I've been running another R9 290X in the same rig for 8 weeks now without a single black screen or crash or whatever...


----------



## vortex240

Quote:


> Originally Posted by *Mercy4You*
> 
> No, but I've been running another R9 290X in the same rig for 8 weeks now without a single black screen or crash or whatever...

HAHAHA, great troubleshooting dude... No offence, but how can you be taken seriously?!? It could be a myriad of issues...... for example, a twisted PCIe power connector that is now delivering the power properly, bad seating in the PCIe slot, etc. etc. etc.

And ASUS has great RMA! Not a single friend that I personally know has had any issues with them in the last 10 years - that speaks volumes. Mind you, this is in Canada. I guess some people are angry they don't get an 'I love you' card with their RMA.


----------



## Mercy4You

Quote:


> Originally Posted by *vortex240*
> 
> HAHAHA, great troubleshooting dude... No offence, but how can you be taken seriously?!? It could be a myriad of issues...... for example, a twisted PCIe power connector that is now delivering the power properly, bad seating in the PCIe slot, etc. etc. etc.
> 
> And ASUS has great RMA! Not a single friend that I personally know has had any issues with them in the last 10 years - that speaks volumes. Mind you, this is in Canada. I guess some people are angry they don't get an 'I love you' card with their RMA.


No offence taken! Where have you been the last few months, I wonder; the blackscreening was a known issue with many cards. Sure, I tested all sorts of things before RMA'ing it...

Well, if Asus was so great the past 10 years, they must be great now, I guess. What education did you get, dude, no offence?!


----------



## MrWhiteRX7

It's sad that a lot of people on here have the mentality that "well mine works fine so you're clearly just an idiot". I've been fortunate to be fairly trouble free, but I would be stupid to assume there aren't real problems going on. The black screen issue is VERY WELL known and documented.

Asus does NOT have a great RMA track record lol, just search this site for yourself and you'll see LOL


----------



## givmedew

Quote:


> Originally Posted by *Mercy4You*
> 
> Quote:
> 
> 
> 
> Originally Posted by *vortex240*
> 
> HAHAHA, great troubleshooting dude... No offence, but how can you be taken seriously?!? It could be a myriad of issues...... for example, a twisted PCIe power connector that is now delivering the power properly, bad seating in the PCIe slot, etc. etc. etc.
> 
> And ASUS has great RMA! Not a single friend that I personally know has had any issues with them in the last 10 years - that speaks volumes. Mind you, this is in Canada. I guess some people are angry they don't get an 'I love you' card with their RMA.
> 
> 
> 
> No offence taken! Where have you been the last few months, I wonder; the blackscreening was a known issue with many cards. Sure, I tested all sorts of things before RMA'ing it...
> 
> Well, if Asus was so great the past 10 years, they must be great now, I guess. What education did you get, dude, no offence?!
Click to expand...

ASUS is well known for having a poor RMA record... it is like the number 1 issue with the company. The second major issue is that all of their mid-range and low-end boards are rip-offs... only their highest-end boards are competitive values.


----------



## vortex240

Quote:


> Originally Posted by *Mercy4You*
> 
> No offence taken! Where have you been the last few months, I wonder; the blackscreening was a known issue with many cards. Sure, I tested all sorts of things before RMA'ing it...
> 
> Well, if Asus was so great the past 10 years, they must be great now, I guess. What education did you get, dude, no offence?!


I'm well aware of the blackscreen issue, and of the fact that a lot of people got these with a crappy PSU, and while overclocking for the most part, due to the immense power draw of these cards. Then they took advantage of the RMA process to get a better-clocking card.

I'm NOT trying to discredit you here, but I asked if you checked the 2 most important things that cause this, and you did not.

You did NOT try a different PSU or motherboard - the 2 MAIN AND MOST probable causes. Please tell me what you tested? I'm curious now.

As a matter of fact, I RMA'd a Socket 1366 Sabertooth just 2 months ago after the 24-pin connector melted due to SLI, and was given a new Z87 Sabertooth as a replacement, which I sold. The RMA took just over 2 weeks - that is excellent service!!

BTW, I have no idea what you are even getting at by mentioning my education?!


----------



## VSG

Quote:


> Originally Posted by *vortex240*
> 
> I'm well aware of the blackscreen issue, and of the fact that a lot of people got these with a crappy PSU, and while overclocking for the most part, due to the immense power draw of these cards. Then they took advantage of the RMA process to get a better-clocking card.


lol do you seriously believe people have been doing that?









I personally had an ASUS 290X which blackscreened at stock, powered by an ASUS ROG M6F and a Corsair 860i, as well as a 1200i later, before I did an RMA with Newegg and got my money back.

So please tell me exactly which was crappy here - the 2 different flagship PSUs from Corsair, or Asus's very own ROG M6F motherboard, if the card was perfectly fine as you suggest?


----------



## kpoeticg

I had a black-screening PowerColor 290X running on a 1300W PSU. With no overclocking or tweaking I'd get random blackscreens at the desktop, in the browser, literally anywhere.

Some of the cards with Elpida RAM are defective. This is a known fact. It has nothing to do with people playing the silicon lottery!!!!


----------



## stilllogicz

What is the estimated average OC a watercooled reference 290x can achieve?


----------



## kizwan

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> It's sad that a lot of people on here have the mentality that "well mine works fine so you're clearly just an idiot". I've been fortunate to be fairly trouble free, but I would be stupid to assume there aren't real problems going on. The black screen issue is VERY WELL known and documented.
> 
> Asus does NOT have a great RMA track record lol, just search this site for yourself and you'll see LOL


^^This.








Quote:


> Originally Posted by *vortex240*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mercy4You*
> 
> No offence taken! Where have you been the last few months, I wonder; the blackscreening was a known issue with many cards. Sure, I tested all sorts of things before RMA'ing it...
> 
> Well, if Asus was so great the past 10 years, they must be great now, I guess. What education did you get, dude, no offence?!
> 
> 
> 
> I'm well aware of the blackscreen issue, and of the fact that a lot of people got these with a crappy PSU, and while overclocking for the most part, due to the immense power draw of these cards. Then they took advantage of the RMA process to get a better-clocking card.
> 
> I'm NOT trying to discredit you here, but I asked if you checked the 2 most important things that cause this, and you did not.
> 
> *You did NOT try a different PSU or motherboard - the 2 MAIN AND MOST probable causes. Please tell me what you tested? I'm curious now.*
> 
> As a matter of fact, I RMA'd a Socket 1366 Sabertooth just 2 months ago after the 24-pin connector melted due to SLI, and was given a new Z87 Sabertooth as a replacement, which I sold. The RMA took just over 2 weeks - that is excellent service!!
> 
> BTW, I have no idea what you are even getting at by mentioning my education?!
Click to expand...

He did try another 290X on the same motherboard with the same PSU & it works just fine. So those two are definitely not probable causes in this case.


----------



## vortex240

Quote:


> Originally Posted by *geggeg*
> 
> lol do you seriously believe people have been doing that?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I personally had an ASUS 290x which blackscreened at stock powered by an ASUS ROG M6F and a Corsair 860i as well as an 1200i later before I did an RMA with Newegg and got my money back.
> 
> So please tell me exactly which was crappy here- the 2 different flagship PSUs from Corsair or Asus's very own ROG M6F motherboard if the card was perfectly fine as you suggest?


LOL - I personally know a guy who RMA'd 3 cards like this. Does that answer your question?


----------



## cennis

I used the wi command for Afterburner and managed to get to 1.41V on the desktop,

however when I start 3DMark 11 it goes down to ~1.26V.
Isn't this vdroop a little large?

Can't even bench 1200/1500.

On screaming air with temps <85C, heat shouldn't be the problem.
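For what it's worth, the droop described above is easy to put a number on. A throwaway sketch using only the figures quoted in the post (1.41V idle, ~1.26V under load), nothing card- or tool-specific:

```python
# Back-of-the-envelope vdroop check using the figures quoted above:
# ~1.41 V set on the desktop, ~1.26 V observed under 3DMark 11 load.
idle_v = 1.41   # voltage reported at idle/desktop
load_v = 1.26   # voltage reported under load

droop_v = idle_v - load_v                 # absolute droop in volts
droop_pct = 100.0 * droop_v / idle_v      # droop relative to the idle setting
print(f"vdroop: {droop_v:.2f} V ({droop_pct:.1f}%)")  # prints "vdroop: 0.15 V (10.6%)"
```

Losing roughly a tenth of the set voltage under load does support the suspicion that this droop is on the large side.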


----------



## kizwan

Quote:


> Originally Posted by *stilllogicz*
> 
> What is the estimated average OC a watercooled reference 290x can achieve?


I'd guesstimate between 1150 - 1200 MHz for the core & 1500 - 1600 MHz for the memory.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *stilllogicz*
> 
> What is the estimated average OC a watercooled reference 290x can achieve?


I would say about 1200 to 1250MHz core. A 290/X will gain some OC headroom if watercooled; I was able to gain 45MHz on the core, up to 1245MHz.


----------



## cennis

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I would say about 1200 to 1250MHz core. A 290/x will gain some OC if watercooled, I was able to gain 45MHz core, up to 1245MHz.


May I ask, what are your voltage settings?
Quote:


> Originally Posted by *kizwan*
> 
> I'd guesstimate between 1150 - 1200 MHz for the core & 1500 - 1600 MHz for the memory.


That range can be achieved on air cooling (loud) with some voltage increases and passable temps


----------



## kpoeticg

Just because you know ONE person who's taking advantage of return windows to get better clockers doesn't mean that the HUNDREDS of people who have received defective cards are idiots or liars.

People use that same trick to bin all CPUs, GPUs and RAM. By your logic, black-screening 4930Ks, and 3770s, and 780s, and 670s, and Trident-X, and Corsair Dominator would be just as common as black-screening 290Xs. This is a computer enthusiast forum. When you read instance after instance of something happening, there's a 99% chance it's real.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *cennis*
> 
> May I ask, what are your voltage settings?
> That range can be achieved on air cooling (loud) with some voltage increases and passable temps


For 1245MHz I need +200mV. Adding more voltage does not help my card and decreases stability. RAM can do 1625MHz; I haven't tested higher, but it might be capable of more.
For everyday use I run 1205/1437.5MHz with +168mV. Elpida RAM.
Once my loop's water gets fully warmed up, my core and VRM1 max out at 52C; my PC is also right above a heater though.


----------



## MapRef41N93W

Hmm, it seems like my Asus may not have been binned correctly. I downclocked the cards from 1350 mem to 1300 and they ran stable for 40 mins in Heaven without the vertical lines of death. Isn't that grounds for an RMA, as the card is supposed to be binned for 1350 mem?


----------



## taem

Quote:


> Originally Posted by *Mercy4You*
> 
> Today, after 10 weeks of RMA, ASUS sent back my exact same blackscreening R 290X with the comment 'we could not find a problem regarding this GPU'
> 
> Well, ASUS this was the last product I bought from your company!


The problem with the black screen is that it's intermittent - the worst kind of problem to have. They're not going to sit there and test your card for hours on end; they're going to stick it in a test bench, boot, run a benchmark or two, fire up a game, and if during that half hour it doesn't black screen, you're going to get that card sent back to you.


----------



## kizwan

Quote:


> Originally Posted by *cennis*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I'd guesstimate between 1150 - 1200 MHz for the core & 1500 - 1600 MHz for the memory.
> 
> 
> 
> That range can be achieved on air cooling (loud) with some voltage increases and passable temps
Click to expand...

Depends on the ambient temp, of course. It's not possible at my ambient temp (30+ C) with air cooling without overheating VRM1. I believe that range is the average overclock most cards can achieve with water cooling, with acceptable temps even at high ambient temp.


----------



## Forceman

Quote:


> Originally Posted by *cennis*
> 
> That range can be achieved on air cooling (loud) with some voltage increases and passable temps


Yeah, for whatever reason watercooling doesn't seem to help these cards as much. Inherent chip limits maybe; they don't seem to scale as high as Tahiti chips did.


----------



## kpoeticg

Quote:


> Originally Posted by *taem*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mercy4You*
> 
> Today, after 10 weeks of RMA, ASUS sent back my exact same blackscreening R 290X with the comment 'we could not find a problem regarding this GPU'
> 
> Well, ASUS this was the last product I bought from your company!
> 
> 
> 
> Problem with black screen is it's an intermittent problem, that's the worst kind of problem to have. They're not going to sit there and test your card for hours on end, they're going to stick it in a test bench, boot, run a benchmark or two, fire up a game, and if during that half hour it doesn't black screen, you're going to get that card sent back to you.
Click to expand...

Yeah, that's what I was afraid of with my card. I wasn't able to test & troubleshoot the black screens until my Newegg 30-day return window was over. I didn't have a lot of confidence that a manufacturer RMA would find & fix the problem. Black screens are too random; I've had em 2hrs into AC Black Flag, and 2 minutes into staring at my desktop. I've seen other people talk about the same thing happening with manufacturer RMAs. Luckily, to my surprise, NE honored the replacement even though the 30 days had passed. Much credit to them for taking care of that. I thought I was stuck with it!!!


----------



## pkrexer

Maybe my stock heatsink was just installed badly, because it ran super hot and would artifact easily. Prior to installing my water block, 1160mhz was the max I could get with +100mv (with the fan sounding like jet engine). Now I can get 1200mhz with the same voltage and the GPU never goes above 45c.


----------



## stilllogicz

Quote:


> Originally Posted by *kizwan*
> 
> I guestimate between 1150 - 1200 MHz for the core & 1500 - 1600 MHz for the memory.


Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I would say about 1200 to 1250MHz core. A 290/x will gain some OC if watercooled, I was able to gain 45MHz core, up to 1245MHz.


Thanks guys. From what I've read there isn't a much greater difference with the 290X Lightning. OFC there isn't a waterblock available for it yet. With this considered, would it be worth it to get 2 Lightnings over 2 reference cards for watercooling?


----------



## kpoeticg

Quote:


> Originally Posted by *stilllogicz*
> 
> Thanks guys. From what I've read there isn't a much greater difference with the 290x lightning. OFC there isn't a waterblock available for it yet. This considered, would it be worth it to get 2 lightnings over 2 reference cards for watercooling?


I've loved the AC Kryographics 290X block since it was announced. I'm also not a particular fan of the metal plate on the EK 290X blocks. That was my number one decider in going with reference.

If you like the EK blocks for your particular GPU, then you're usually safe getting a popular non-reference card like the DCII, Matrix, or Lightning. They'll release a block for it eventually, 99% of the time


----------



## Tobiman

I've only had two different 290s, but what I've noticed is that the black screen issue comes in two levels. You either black screen randomly at stock (increasing vcore might resolve this), or black screen after running a substantial overclock on memory, even after increasing vcore. The average memory overclock is around 1500MHz for Elpida and 1650 for Hynix.

On another note, I wonder why we don't get Samsung memory in AMD cards? Apart from cost, I can't think of any other reason.


----------



## bond32

Lightnings will probably get you over 1200. They generally have top-end components and power delivery, as I'm sure you know. But considering the slight extra bump you COULD have over reference, I don't think it's warranted. Especially considering you'll have to get the Lightning water blocks.


----------



## cennis

It's quite disappointing how unlocked 780/Ti/Titans can get 1300MHz+ with unlocked voltage and water


----------



## kpoeticg

Yeah, the random stock black screens are the most frustrating thing to deal with. Increasing the power definitely helps stabilize the card (at least it did for me), but it didn't get rid of the problem by any means. I still had 2-3 black screens a day at a minimum.

I wish we could get some 290Xs with Samsung RAM!!!!


----------



## Ironsmack

So an update to my problem...

I was still experiencing throttling even though my VRM temps were acceptable (around 77C in BF4).

So what I did was, instead of having multiple programs running to watch my GPU, I turned them off one by one.

And I found GPU-Z (0.7.7) to be the cause of my throttling.

I started with 13.2, stock GPU frequency, and GPU-Z, OpenHWMonitor and AB running.

It throttles down within 20+ mins of gameplay.

Repeated the test with GPU-Z off, but kept AB and OHWMonitor.

No throttling.

Repeated the test with GPU-Z and AB off, but kept OHWMonitor... no throttling.

Repeated another test with OHWMonitor off but GPU-Z and AB on... throttling happens.

Repeated another test with OHWMonitor and AB off but GPU-Z left on... for some reason it still throttles down.

Btw, I rebooted in between tests.

So I just decided to turn off GPU-Z for the meantime and rely on OHWMonitor to watch the GPU, although I can't see my VRM temps.
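The elimination runs above boil down to one rule: a tool is the culprit if throttling happens exactly when that tool is running. As a toy sketch of that reasoning (the run data below just mirrors the tests described in the post; none of this talks to any real monitoring software):

```python
# Toy process-of-elimination over the test runs described above.
# Each run records which monitors were running and whether throttling occurred.
runs = [
    ({"GPU-Z", "OHWMonitor", "AB"}, True),   # all three on -> throttled
    ({"AB", "OHWMonitor"}, False),           # GPU-Z off -> fine
    ({"OHWMonitor"}, False),                 # GPU-Z and AB off -> fine
    ({"GPU-Z", "AB"}, True),                 # OHWMonitor off -> throttled
    ({"GPU-Z"}, True),                       # GPU-Z alone -> still throttled
]

# A tool is a plausible culprit if throttling happened exactly when it ran.
tools = set().union(*(running for running, _ in runs))
culprits = [t for t in tools
            if all((t in running) == throttled for running, throttled in runs)]
print(culprits)  # → ['GPU-Z']
```

Only GPU-Z matches every observation, which is the same conclusion the one-by-one testing reached, and rebooting between runs (as the poster did) is what keeps each observation independent.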


----------



## stilllogicz

Quote:


> Originally Posted by *bond32*
> 
> Lightnings will probably get you over 1200. They generally have top end components and power delivery as you know im sure. But when considering that slight extra bump you COULD have over reference, I don't think it is warranted. Especially considering you will have to have the lightning water blocks.


Yea considering the nature of Hawaii architecture it just seems like the 290x lightnings will shine under LN2 instead of water. Considering the objective, reference might just be the way to go. Guilty confession, I actually wanna use the EK csq block so I can use the plexi bridge to see the path of the coolant


----------



## taem

Quote:


> Originally Posted by *Ironsmack*
> 
> So an update to my problem...
> 
> I was still experiencing throttling down even though my VRM's were acceptable (around 77 C on BF4).
> 
> So what I did was, instead of having multiple programs running to watch my GPU - I turn them off one by one.
> 
> And I found that GPU-Z (0.7.7) be the cause of my throttling down..


I said it earlier in this thread: I'm positive GPU-Z messes things up, maybe when it's run alongside another monitor. Whatever the case may be, the only issue I've had with my cards is that sometimes the display loses signal at boot. This *only* happens when GPU-Z is set to auto start. No GPU-Z, no issues, ever. I just don't use it anymore; I use HWiNFO64.


----------



## Paul17041993

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Hmm it seems like my Asus may not have been binned correctly. I downclocked the cards from 1350 mem to 1300 and they ran stable for 40 mins without vertical lines of death in Heaven. *Isn't that grounds for an RMA as the card is supposed to be binned for 1350 mem?*


yes, RMA, welcome to ASUS cards.


----------



## Mercy4You

Quote:


> Originally Posted by *taem*
> 
> Problem with black screen is it's an intermittent problem, that's the worst kind of problem to have. They're not going to sit there and test your card for hours on end, they're going to stick it in a test bench, boot, run a benchmark or two, fire up a game, and if during that half hour it doesn't black screen, you're going to get that card sent back to you.
Quote:


> Originally Posted by *kpoeticg*
> 
> Yeah, that's what i was afraid of with my card. I wasn't able to test & troubleshoot the black screens it til my NE 30day return window was over. I didn't have alot of confidence that Manufacturer RMA would find & fix the problem. Black screens are too random. I've had em 2hr's into AC Black Flag, and 2 minutes into staring at my Desktop. I've seen other people talk about the same thing happening with manufacturer rma's. Luckily, to my suprise, NE honored the replacement even though the 30days had passed. Much credit to them for taking care of that. I thought i was stuck with it!!!


Very true guys. I counted 31 black screens in 2 months, while some weeks had none... My card could sometimes run games for hours without a problem, and at other times black screen within 15-60 minutes. It passed Furmark at full load for 60 minutes, then later crashed at the desktop 10 minutes after startup. I could never replicate it.

I've read here (http://www.overclock.net/t/1441349/290-290x-black-screen-poll) that other people also got their cards returned by ASUS, so I was prepared for this to happen. XFX was also notorious for that, while other brands simply replaced the cards, like they all should with a clearly faulty product.


----------



## kpoeticg

Yeah, didn't XFX state outright that they will not accept black screen RMAs?

Also, I read either in the black screen poll thread or on the AMD forums (can't remember) about someone who RMA'd an MSI card for black screens and had the same card with the same serial number sent back to them, with Hynix RAM replacing the original Elpida


----------



## Brian18741

Quote:


> Originally Posted by *taem*
> 
> Something definitely wrong, even on a single 290 and [email protected] I can run Sleeping Dogs at 1440p with high res textures and most settings maxed (not all though) and get 60 fps.


So I've run just one card; the problem persists.

Clean install of 14.3, problem persists, but no problems with the drivers so far!

Sleeping Dogs is only using 1 CPU core, so only 25% of my processing power. Check out the SS below. You can see both GPUs' usage going up and down all over the place, and only CPU core 1 maxed out, the rest hardly even registering!


----------



## vortex240

Quote:


> Originally Posted by *kpoeticg*
> 
> Yeah didn't XFX state outright that they will not accept black screen rma's?
> 
> Also i read either in the black screen poll thread or on the amd forums (can't remember) some1 that rma'd an msi card for black screens and had the same card with the same serial number sent back to them with Hynix Ram replacing the original Elpida


Can you link the source please? I would like to have a read. What a lot of people forget is that you can't believe everything you read on the internet. RAM chips are NEVER replaced on a card; such cards are deemed dead if diagnosed with a bad RAM chip. Once a RAM chip is soldered on, it is impossible to replace. If the issue is with a resistor, etc., repairs can be made.

A lot of these stories I simply take with a grain of salt.


----------



## kpoeticg

I read it like a month ago when i was going crazy scouring the internet because of my own black screen 290x.

I don't know if it's fact or fiction. All i said was i read it either in the black screen poll thread or somewhere on the amd forums and i can't remember which. I wouldn't have said that if i remembered where i read it =). The guy who posted it seemed pretty convinced it was the same card.

But i've also never heard of such an item that is impossible to replace once it's soldered on. So....


----------



## Scorpion49

Quote:


> Originally Posted by *vortex240*
> 
> Can you link the source please, I would like to have a read. What a lot of people forget, is that you can't believe everything you read on the internet. RAM chips NEVER are replaced on a card, such cards are deemed dead if diagnosed with a bad RAM chip. Once a RAM chip is sodered on, it is impossible to replace. If the issues is with a resistor, etc. - repairs can be made.
> 
> A lot of these stories I simply take with a grain of salt.


BGA chips can most certainly be replaced; it's actually a rather quick process. Would an RMA company do it? I don't know, probably not. Seems like it would be easier to just blame it on the end user somehow and not repair it at all.


----------



## devilhead

Quote:


> Originally Posted by *kpoeticg*
> 
> Yeah, the random stock black screens are the most frustrating thing to deal with. Increasing the power definitely helps stabilize the card (at least it did for me), but didn't get rid of the problem by any means. Still had 2-3 black screens a day at a minimum.
> 
> I wish we could get some 290x's with Samsung Ram!!!!


Buy an MSI Lightning and you will get Samsung memory







like I've seen, many people who bought the MSI Lightning got Samsung memory


----------



## kpoeticg

On AMD cards?

I'm pretty positive there isn't any 290x with Sammy memory including the Lightning


----------



## Aussiejuggalo

Well I bought a Sapphire 290 Tri-X OC







just gotta wait for it to be delivered

Hope I don't have the black screen BS everyone's been going on about...


----------



## stilllogicz

Quote:


> Originally Posted by *kpoeticg*
> 
> On AMD cards?
> 
> I'm pretty positive there isn't any 290x with Sammy memory including the Lightning


People in the 290x lightning thread have samsung memory


----------



## devilhead

Quote:


> Originally Posted by *kpoeticg*
> 
> On AMD cards?
> 
> I'm pretty positive there isn't any 290x with Sammy memory including the Lightning


The MSI Lightning comes with either Samsung or Hynix memory








http://www.overclock3d.net/reviews/gpu_displays/msi_r9_290x_lightning_review/4 - you can see the memory type in GPU-Z,
and of course http://www.overclock.net/t/1472341/msi-r9-290x-lightning-thread - all the guys there have Samsung


----------



## kpoeticg

OMG, +rep for pointing that out to me

That's the one thing that could make me wish I'd gone non-reference. Damnit LOL. I'm already committed to reference - got 2 out of my 3 cards, AC blocks, & backplates.

Hynix & Sammy IMO definitely make the Lightning worth it!!!


----------



## MapRef41N93W

Quote:


> Originally Posted by *kpoeticg*
> 
> OMG +rep for pointing that out to me
> 
> That's the one possible thing that could make me wish i went non-reference. Damnit LOL. Im already commited to reference. Got 2 out of my 3 cards, AC Blocks, & Backplates.
> 
> Hynix & Sammy IMO definitely makes the Lightning worth it!!!


Tri-X is reference with Hynix (some Elpida but more Hynix from what I can tell)


----------



## MrWhiteRX7

Quote:


> Originally Posted by *stilllogicz*
> 
> What is the estimated average OC a watercooled reference 290x can achieve?


Honestly you can pretty much achieve your max overclock on air; it's just loud as hell and not as efficient with power. Watercooling brings more efficiency and quietness. Now if we're talking multi-GPU, that's another story, but a lot of people with only a single 290/X were only getting maybe a few MHz more with water right out of the gate. (That's not EVERYONE, but the majority.)

I'm not trying to steer you away from the H2O treatment; I think it was the best thing I did for mine


----------



## kpoeticg

All the cards til now have been a lottery between Hynix and Elpida. It's definitely news that MSI went smart with the Lightnings and made it a Hynix/Samsung lottery. That's huge.

It might be harder to appreciate if you haven't had a black screen PLAGUED card. A lot of people get Elpida cards that work just great. But having one of the defective ones can drive you nuts!!!


----------



## rdr09

Quote:


> Originally Posted by *kpoeticg*
> 
> All the cards til now have been a lottery for Hynix or Elpida. That's definitely news that MSI went smart with the Lightnings and made it a Hynix/Samsung lottery. That's huge
> 
> It might be something that's harder to appreciate if you haven't had a Black Screen PLAGUED Card. Alot of people get Elpida cards that work just great. *But having one of the defective ones can drive you nuts!!!*


i agree, especially for those in other countries where rma is a pita.


----------



## battleaxe

I have a 290 Hynix card and it won't hit 1500MHz on the RAM. My 290 with Elpida does. Go figure.

On that note, it seems the MSI Gaming 290 does not like to OC even with extra voltage (not as well as the reference anyway).

And the stock voltage on the MSI Gaming card is much lower than on my MSI 290 that is a reference card. Have there been any reports on this phenomenon? Or are my cards just 'special'? LOL


----------



## kpoeticg

Yeah, there was a good 2-3 weeks I thought I was stuck with it. That's why I was going nuts.

It didn't even occur to me that Newegg would still replace it after the 30-day window had passed. A few days after I filed my ticket with Powercolor, when I still hadn't heard back, I figured it couldn't hurt to give Newegg a call. I couldn't believe how understanding they were about it when they were technically off the hook.


----------



## stilllogicz

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Honestly you can pretty much achieve max overclock on air, it's just loud as hell and not as efficient with power. Watercooling brings more efficiency and quietness. Now if we are talking multi- gpu then that's another story but a lot of people right out the gate with only a single 290/x were only maybe getting a few mhz more with water. (that's not EVERYONE, but the majority).
> 
> I'm not trying to steer you away from going under the h2o treatment, I think it was the best thing I did for mine


Going crossfire for sure if I bite the bullet. Kinda torn cuz I'll be replacing two 780 Classifieds; not liking the 3GB of VRAM, but I do like the overclocking of the Classified. Geez, these frickin decisions.


----------



## MrWhiteRX7

The average overclock on these cards seems to be around 1150/ 1450... a good bit will do 1200+ and occasionally some won't even hit 1150.

With water and xfire I'm sure with some tweaking you'll at least get 1150mhz on both









All three of mine will do 1125 without even touching voltage.


----------



## stilllogicz

A part of me just thinks this is a lesson in futility. Titanfall, albeit unoptimized, maxes out my VRAM on the Classys. If I were to put them under water I'm sure I could approach 1400MHz. The 290X under water would offer similar or slightly less performance but an extra gig of VRAM; I like to game on ultra at 1440p, 100 fps.

Not sure if it's worth the hassle of getting a pair of 290Xs and going through the trouble of selling my Classifieds. At least the prices have dropped substantially for the 290X. Hence why I'm torn. In truth this rebuild just has to last at least till the beginning of next year, when all the new toys coming this year will have had time to mature properly.


----------



## taem

At 1440p, with the framerates you want, I don't think memory and memory bandwidth are the only limits. In any event I would never replace 780 Classys with Hawaiis. I might, and actually did, buy 290s over 780s. But I would never side-grade what you have to 290Xs. Just wait for GCN2 to upgrade.

Tho you could probably sell the Classys for not much less than what you'd need for the side-grade, so I guess it doesn't really matter.


----------



## MapRef41N93W

Quote:


> Originally Posted by *stilllogicz*
> 
> Going crossfire for sure if I bite the bullet. Kinda torn cuz I'll be replacing two 780 classifieds, not liking the 3gb of vram but I do like the overclocking of the classified. Geez, these frickin decisions.


Have fun buddy. Enjoy constant crashes, vertical lines of death, screen flickering, and spending hours upon hours uninstalling and reinstalling drivers.


----------



## axiumone

So, pretty much the same thing as Nvidia, right? Stop it with this nonsense already. Having had 3-way and 4-way setups from both sides, I can tell you that both sides have let their driver quality slide a lot lately.


----------



## Aussiejuggalo

You own AMD cards and you're crying about them... kinda stupid.

Also I can tell you Nvidia drivers lately suck; I'm having constant crashing just from watching a full-screen 1080p YouTube video...


----------



## Prozillah

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Have fun buddy. Enjoy constant crashes, vertical lines of death, screen flickering, and spending hours upon hours uninstalling and reinstalling drivers.


Mate that's not true - if you're running Xfire, stay away from the bloody betas and you'll be absolutely fine. If you are unfortunate enough to have the black screen issue or anything else, that sucks. However I must admit I did change my mobo and CPU from a 4670K to a 4770K, which I can honestly say made a BIG difference for me. So if you are going to run high end cards (Xfire specifically), make sure the rest of your system is high end too.


----------



## Prozillah

I've got my twin Giga Windforces on air (soon to be water...) on 13.12 and they run everything absolutely perfectly, no issues whatsoever.


----------



## taem

Quote:


> Originally Posted by *Prozillah*
> 
> Mate that's not true - if you're running Xfire, stay away from the bloody betas and you'll be absolutely fine. If you are unfortunate enough to have the black screen issue or anything else, that sucks. However I must admit I did change my mobo and CPU from a 4670K to a 4770K, which I can honestly say made a BIG difference for me. So if you are going to run high end cards (Xfire specifically), make sure the rest of your system is high end too.


That's good to know. My 4670K is definitely choking my 290 Xfire, but I can't justify upgrading to an $800 CPU just for gaming. A 4770K I can do though; it'd be a good excuse to finally build my Node 304 HTPC with the replaced 4670K.

How are temps on your Giga Xfire btw?


----------



## MapRef41N93W

Quote:


> Originally Posted by *Prozillah*
> 
> Mate that's not true - if you're running Xfire, stay away from the bloody betas and you'll be absolutely fine. If you are unfortunate enough to have the black screen issue or anything else, that sucks. However I must admit I did change my mobo and CPU from a 4670K to a 4770K, which I can honestly say made a BIG difference for me. So if you are going to run high end cards (Xfire specifically), make sure the rest of your system is high end too.


Umm I don't use the beta drivers. If you clicked my sig you would see that I already run a high end system.

I'm not complaining about AMD cards, running a single 290x is perfect for me. I'm complaining about the cesspool that is their x-fire drivers which are a nightmare compared to what I used to have with Nvidia SLI 760s.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Umm I don't use the beta drivers. If you clicked my sig you would see that I already run a high end system.
> 
> I'm not complaining about AMD cards, running a single 290x is perfect for me. I'm complaining about the cesspool that is their x-fire drivers which are a nightmare compared to what I used to have with Nvidia SLI 760s.


Most major problems with SLI/Crossfire come from mixing card brands/OCs... I had so many problems with SLI 460 OC & SOC; changed to 2 SOC cards and most problems were gone.

Also, unstable overclocks can cause massive problems with multiple cards.


----------



## MapRef41N93W

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Most major problems with SLI/Crossfire is mixing card brands/OCs... had so many problems with SLI 460 OC & SOC, changed to 2 SOC cards and most problems were gone
> 
> Also unstable overclocks can cause massive problems with multiple cards


I have to underclock my cards to stay stable at all in X-fire. Just running stock causes vertical lines of death after 20 mins or so of gaming/benching.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *MapRef41N93W*
> 
> I have to underclock my cards to stay stable at all in X-fire. Just running stock causes vertical lines of death after 20 mins or so of gaming/benching.


When I had 4 x 7970s I ran into a few issues; went down to 3, then 2, and never had a problem with them after that. Currently with 3 x 290s I've been problem-free, even on the latest betas, but I choose to stick with 13.12 now since I don't play BF4 anymore.


----------



## Sgt Bilko

Quote:


> Originally Posted by *MapRef41N93W*
> 
> I have to underclock my cards to stay stable at all in X-fire. Just running stock causes vertical lines of death after 20 mins or so of gaming/benching.


Wait....so you are saying that in Xfire these cards don't work properly with any driver?


----------



## kizwan

Quote:


> Originally Posted by *MapRef41N93W*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Prozillah*
> 
> Mate that's not true - if you're running Xfire, stay away from the bloody betas and you'll be absolutely fine. If you are unfortunate enough to have the black screen issue or anything else, that sucks. However I must admit I did change my mobo and CPU from a 4670K to a 4770K, which I can honestly say made a BIG difference for me. So if you are going to run high end cards (Xfire specifically), make sure the rest of your system is high end too.
> 
> 
> 
> 
> 
> 
> 
> Umm I don't use the beta drivers. If you clicked my sig you would see that I already run a high end system.
> 
> I'm not complaining about AMD cards, running a single 290x is perfect for me. I'm complaining about the cesspool that is their x-fire drivers which are a nightmare compared to what I used to have with Nvidia SLI 760s.
Click to expand...

No problem here with Crossfire, on either the 13.12 WHQL or 14.2 beta drivers. I watched YouTube, played games, left it idle for a couple of hours with monitor power saving enabled; NO problem whatsoever. Mine also doesn't have the black screen issue.

I'm not 100% sure, but the problem is probably because of mismatched cards. I noticed you're crossfiring a reference & a custom card. I'm guessing the BIOSes on the two cards don't agree with each other.

*[EDIT]* I just noticed you have 290X Tri-X Crossfire & 290X DCII Crossfire. So, you're running quad-fire?


----------



## Prozillah

Quote:


> Originally Posted by *MapRef41N93W*
> 
> I have to underclock my cards to stay stable at all in X-fire. Just running stock causes vertical lines of death after 20 mins or so of gaming/benching.


Definitely something up with that setup if you're having to underclock to remain stable. Have you tried each card individually? Hell, even tried swapping the slots around? I would be tempted to get my hands on another mobo to try and isolate that as a potential issue.


----------



## Prozillah

Quote:


> Originally Posted by *taem*
> 
> That's good to know, my 4670k is definitely choking my 290 xfire, but I can't justify upgrading to an $800 cpu just for gaming. But a 4770k I can do, be a good excuse to finally build my Node 304 htpc with the replaced 4670k.
> 
> How are temps on your giga xfire btw?


Sorry Taem I did read your original message just got lost trying to follow this thread when it moves at like 15 pages a day!

Temps are sweet! I'm on air xfired however I have explained the importance of a really solid negative pressure vortex for your case fans so if you cant handle the noise I wouldnt recommend xfiring anything 290 cards... BUT with my 11 case fans running on full I can run a healthy OC without temps breaking 80c in an environment consisting of 25 - 31 degrees (New Zealand summer right now..)

I will be putting them under water at some stage, but truthfully right now I'm not even worried, as I generally game with the Corsair Vengeance 1500 headset, which has amazing noise blocking, in a private gaming room. In a lounge or somewhere other people congregate I would probably need to go H2O...

I will do some gaming tonight and post up a few screens of my HWiNFO & PlayClaw stats.


----------



## Imprezzion

I just bought a second Tri-X 290X for binning purposes.. It's great when miners decide to sell their cards.. I got it for just €399 shipped, and it's less than 2 months old with full warranty and all









We shall see if she's a better clocker than my current one.. I'd love to run CF but there's no way in hell my PSU would handle that..


----------



## Mercy4You

So guys, I started the *BAD ASUS RMA SERVICE* thread and had one of the blackscreening first-batch cards, and now we are discussing Crossfire. I am interested in trying this with the R9 290X, and it should be under water; I think everybody here agrees about that









As this is very new to me, I have a few questions:

1: can someone recommend a good case that has WC pre-installed or easy complete WC install set plus case?

2: which easy to install GPU waterblocks?

3: which PSU for 2 x R9 290X? (Yes I will OC CPU and GPU)

4: which Mobo? (not ASUS, I will never buy their products again as a result of bad service







)

Thx, all suggestions are welcome, but remember I'm very much a noob


----------



## TommyGunn123

Quote:


> Originally Posted by *Mercy4You*
> 
> So guys, I started the *BAD ASUS RMA SERVICE* thread and the blackscreening first batch of cards and now we are discussing X-fire. I am interested to try this with R9 290X and it should be under water, I think everybody here agrees about that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As this is very new to me, I have a few questions:
> 
> 1: can someone recommend a good case that has WC pre-installed or easy complete WC install set plus case?
> 
> 2: which easy to install GPU waterblocks?
> 
> 3: which PSU for 2 x R9 290X? (Yes I will OC CPU and GPU)
> 
> 4: which Mobo? (not ASUS, I will never buy their products again as result of bad service
> 
> 
> 
> 
> 
> 
> 
> )
> 
> Thx, all suggestions are welcome, but remember I'm very noob


1. Depends on size, if you want a big case i'd say the Phanteks Enthoo Primo, mid tower size the old n gold Corsair 750d is great OR the Corsair 540 air looks awesome and tonnes of options

2. Most waterblocks are the same to install really, just paste the core, thermal tape the memory/vrms and slap dat on

3. You'll probs want 850-1000w if you're gonna oc errythin and have a bit of headroom for efficiency

4. If you dont want Asus (which atm i dont blame you), the MSI Gaming ones are pretty damn good, and at 200 AUD my MSI Z87-G45 is a bargain and looks awesome


----------



## Matt-Matt

Quote:


> Originally Posted by *Mercy4You*
> 
> So guys, I started the *BAD ASUS RMA SERVICE* thread and the blackscreening first batch of cards and now we are discussing X-fire. I am interested to try this with R9 290X and it should be under water, I think everybody here agrees about that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As this is very new to me, I have a few questions:
> 
> 1: can someone recommend a good case that has WC pre-installed or easy complete WC install set plus case?
> 
> 2: which easy to install GPU waterblocks?
> 
> 3: which PSU for 2 x R9 290X? (Yes I will OC CPU and GPU)
> 
> 4: which Mobo? (not ASUS, I will never buy their products again as result of bad service
> 
> 
> 
> 
> 
> 
> 
> )
> 
> Thx, all suggestions are welcome, but remember I'm very noob


Quote:


> Originally Posted by *TommyGunn123*
> 
> 1. Depends on size, if you want a big case i'd say the Phanteks Enthoo Primo, mid tower size the old n gold Corsair 750d is great OR the Corsair 540 air looks awesome and tonnes of options
> 
> 2. Most waterblocks are the same to install really, just paste the core, thermal tape the memory/vrms and slap dat on
> 
> 3. You'll probs want 850-1000w if you're gonna oc errythin and had a bit of headroom for efficiency
> 
> 4. If you dont want Asus ( which atm i dont blame you), the MSI gaming ones are pretty damn good, and at 200 aus, my MSi z87-g45 is a bargain and looks awesome


The answers TommyGunn has provided are pretty good; for specifics though I'd suggest:

1. This entirely depends on you; if you want to go watercooling, though, you'll want a case with space for some big, thick radiators, i.e. big cases that take a 360mm (3x 120mm). You'll also want some good space for the res too.

2. Yeah, most waterblocks are fairly similar, unless you go with the AquaComputer ones that have the active backplate, and even those aren't much harder I'd imagine (I have one in the post atm).

3. I'd suggest an 850W Silverstone Strider (Silver/Gold rated) or a similar Seasonic MK2II (1000W would be better, however, for the future and, in some cases, for efficiency/noise)

4. I'd suggest a Gigabyte board; what price are you looking at, though? I'd say go for a Z87X-UD3H, but at the same time the Sniper/Z87X-OC aren't much more and have many more features.

To add: nowadays, taking the stock cooler off and putting a waterblock on voids the warranty on AMD cards. I know eVGA doesn't care on the nVidia side, and a few others don't either, but at least in Australia nobody sells an AMD card whose warranty survives removing the cooler. (Warranty terms change based on where you are/where you got it.)


----------



## velocityx

Quote:


> Originally Posted by *Matt-Matt*
> 
> The Answers Tommygun have provided are pretty good, for specifics though I'd suggest;
> 
> 1. This entirely depends on you, if you want to go watercooling though you'll want a case with space for some thick radiators and big ones. I.E Big cases that allow a 360mm (3x 120mm). You'll also want some good space for the res too.
> 
> 2. Yeah, most waterblocks are fairly similar; Unless you go with the AquaComputer ones that have the active backplate, but it's not much harder I'd imagine (I have one in the post atm).
> 
> 3. I'd suggest a 850W Silverstone strider (Silver/Gold rated), or a similar Seasonic MK2II (1000w would be better however for the future and just efficiency/noise - in some cases)
> 
> 4. I'd suggest a Gigabyte board, what is the price you're looking at though? I'd say go for a Z87X-UD3H, but at the same time for only a bit more a Sniper/Z87x-OC aren't much more and have many more features.
> 
> To add nowadays by taking the stock cooler off and putting a waterblock on you'll be voiding the warranty on AMD cards. I know that eVGA doesn't care on the nVidia side and a few others. But at least in Australia nobody does a AMD card that doesn't void the warranty. (Warranty terms change based on where you are/where you got it).


1- Corsair 750D, good price and excellent choice for a WC setup.

2- I would go with EK

3- Get a 1000 watt unit if you want 2x 290X. I have 2x 290s and an overclocked 8320, and simply jumping +100mV on each GPU in AB shuts down my machine within 5 minutes of BF4 multiplayer because the PSU protection kicks in, and that's on an XFX unit based on a Seasonic 850W Gold platform. A 290X is about 50W more power-hungry than a 290, so that's 100W for the pair; and considering AMD CPUs are more power-hungry than Intel, that 100W makes the totals about equal.

4- Boards: well, I've run ASUS for years with no issues, so I can't recommend a non-ASUS one.
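The power math above can be sanity-checked with a quick back-of-the-envelope budget. This is only a sketch: the wattage figures are ballpark estimates pulled from this discussion, not measured values.

```python
# Rough PSU budget for 2x R9 290X plus an overclocked FX chip.
# All wattages below are ballpark estimates, not measurements.
def psu_budget(psu_watts, loads):
    """Return (total draw, remaining headroom) for a parts list."""
    total = sum(loads.values())
    return total, psu_watts - total

loads = {
    "2x R9 290X, overclocked (~300 W each)": 600,
    "FX-8320, overclocked": 200,
    "board / RAM / drives / fans": 100,
}

print(psu_budget(850, loads))   # (900, -50): an 850 W unit is over budget
print(psu_budget(1000, loads))  # (900, 100): a 1000 W unit leaves some margin
```

A negative headroom number lines up with the OCP shutdowns described above, which is why the 1000 W recommendation keeps coming up for a 290X pair.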


----------



## Matt-Matt

Quote:


> Originally Posted by *velocityx*
> 
> 1- Corsair 750D, good price and excellent choice for a WC setup.
> 
> 2- I would go with EK
> 
> 3- Get a 1000 watt if you want 2x290x. I have 2x290's and an overclocked 8320 and simply jumping +100mv on each GPU in AB shuts down my machine within 5 minutes of bf4 multiplayer because PSU protection systems kick in. and that's on a seasonic 850w gold based XFX unit. 290x is about 50w each more power hungry than a 290 so it's 100watt. considering amd is more power hungry than intel, that 100w makes it about equal.
> 
> 4 board, well, I run asus for years and no issues so can't recommend one.


Really? They're _that_ power hungry? Okay then, I'd suggest 1000W in that case.

EK are all good, but I prefer the looks of other blocks, and the performance of all blocks is very similar; EK just isn't the absolute best (within 2-3°C) haha. Whatever works for you, I guess. EK are usually easier to get.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Matt-Matt*
> 
> Really? They're _that_ power hungry? But yeah okay then I'd suggest 1000W in that case.
> 
> EK are all good, but I prefer the looks of other blocks and the performance of all blocks is very similar just that EK isn't the best (within 2-3c) haha. Whatever works for you I guess though. EK are easier to get usually.


My 8350 at 5.0GHz plus my 290s overclocked comes to around 950W drawn at the wall.

I'll find the post where I recorded it all.


----------



## King4x4

Guys, have any of you tried The Stilt's mining BIOS for general gaming?


----------



## Imprezzion

Well, as it has ROPs disabled, it'll probably severely underperform in games..


----------



## Matt-Matt

Quote:


> Originally Posted by *Sgt Bilko*
> 
> My 8350 at 5.0Ghz plus my 290's overclocked equals around 950w being drawn from the socket.
> 
> I'll find the post where i recorded it all.


Oh well, an 8350 at 5GHz will be pulling ~220W, whereas an i5 or i7 at similar clocks would pull like 150W or so? Not sure, but I know it'd be substantially less. I do get your point though.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Matt-Matt*
> 
> Oh well an 8350 at 5GHz will be pulling ~220W whereas an i5 or i7 at similar clocks would pull like 150w or so? Not sure, but i know it'd be substantially less. I do get your point though.


Yeah, the FX 8-cores are like hungry mynocks









heaps of fun to play around with though.


----------



## Mercy4You

Quote:


> Originally Posted by *TommyGunn123*
> 
> 1. Depends on size, if you want a big case i'd say the Phanteks Enthoo Primo, mid tower size the old n gold Corsair 750d is great OR the Corsair 540 air looks awesome and tonnes of options
> 
> 2. Most waterblocks are the same to install really, just paste the core, thermal tape the memory/vrms and slap dat on
> 
> 3. You'll probs want 850-1000w if you're gonna oc errythin and had a bit of headroom for efficiency
> 
> 4. If you dont want Asus ( which atm i dont blame you), the MSI gaming ones are pretty damn good, and at 200 aus, my MSi z87-g45 is a bargain and looks awesome


Quote:


> Originally Posted by *Matt-Matt*
> 
> The Answers Tommygun have provided are pretty good, for specifics though I'd suggest;
> 
> 1. This entirely depends on you, if you want to go watercooling though you'll want a case with space for some thick radiators and big ones. I.E Big cases that allow a 360mm (3x 120mm). You'll also want some good space for the res too.
> 
> 2. Yeah, most waterblocks are fairly similar; Unless you go with the AquaComputer ones that have the active backplate, but it's not much harder I'd imagine (I have one in the post atm).
> 
> 3. I'd suggest a 850W Silverstone strider (Silver/Gold rated), or a similar Seasonic MK2II (1000w would be better however for the future and just efficiency/noise - in some cases)
> 
> 4. I'd suggest a Gigabyte board, what is the price you're looking at though? I'd say go for a Z87X-UD3H, but at the same time for only a bit more a Sniper/Z87x-OC aren't much more and have many more features.
> 
> To add nowadays by taking the stock cooler off and putting a waterblock on you'll be voiding the warranty on AMD cards. I know that eVGA doesn't care on the nVidia side and a few others. But at least in Australia nobody does a AMD card that doesn't void the warranty. (Warranty terms change based on where you are/where you got it).


Quote:


> Originally Posted by *velocityx*
> 
> 1- Corsair 750D, good price and excellent choice for a WC setup.
> 
> 2- I would go with EK
> 
> 3- Get a 1000 watt if you want 2x290x. I have 2x290's and an overclocked 8320 and simply jumping +100mv on each GPU in AB shuts down my machine within 5 minutes of bf4 multiplayer because PSU protection systems kick in. and that's on a seasonic 850w gold based XFX unit. 290x is about 50w each more power hungry than a 290 so it's 100watt. considering amd is more power hungry than intel, that 100w makes it about equal.
> 
> 4 board, well, I run asus for years and no issues so can't recommend one.


THX guys!


----------



## Mattriz

GPU-Z: http://www.techpowerup.com/gpuz/rnnh/
SAPPHIRE R9 290X 4GB GDDR5
stock cooling


----------



## Mercy4You

Quote:


> Originally Posted by *Mattriz*
> 
> GPU-Z: http://www.techpowerup.com/gpuz/rnnh/
> SAPPHIRE R9 290X 4GB GDDR5
> stock cooling


This is what our card can do







Just did Valley Extreme 2934 with 1175/1500/+125mV (Fans at 58%)

Unigine Valley Benchmark 1.0
FPS: 70.1
Score: 2934
Min FPS: 32.1
Max FPS: 135.9

System
Platform: Windows 8 (build 9200) 64bit
CPU model: Intel(R) Core(TM) i7-4930K CPU @ 3.40GHz (3498MHz) x6
GPU model: AMD Radeon R9 200 Series 13.251.0.0 (4095MB) x1

Settings
Render: Direct3D11
Mode: 1920x1080 8xAA fullscreen
Preset: Extreme HD

Powered by UNIGINE Engine
Unigine Corp. © 2005-2013


----------



## Mattriz

Quote:


> Originally Posted by *Mercy4You*
> 
> This is what our card can do
> 
> 
> 
> 
> 
> 
> 
> Just did Valley Extreme 2934 with 1175/1500/+125mV (Fans at 58%)
> 
> Unigine Valley Benchmark 1.0
> FPS:
> 70.1
> Score:
> 2934
> Min FPS:
> 32.1
> Max FPS:
> 135.9
> System
> Platform:
> Windows 8 (build 9200) 64bit
> CPU model:
> Intel(R) Core(TM) i7-4930K CPU @ 3.40GHz (3498MHz) x6
> GPU model:
> AMD Radeon R9 200 Series 13.251.0.0 (4095MB) x1
> Settings
> Render:
> Direct3D11
> Mode:
> 1920x1080 8xAA fullscreen
> Preset
> Extreme HD
> Powered by UNIGINE Engine
> Unigine Corp. © 2005-2013


How come I don't get any better results after overclocking?
I stress tested using OCCT with no errors and ran Heaven Benchmark 4.0 without seeing any artifacts.

Oveclocked:


Stock:


my settings:

(temp and gpu usage is high because im running OCCT)


----------



## hammelgammler

I want to ask you something.
I bought a Sapphire R9 290X reference and put the Accelero Hybrid on it.
My highest overclock for the card is 1200MHz core and 1420 memory. The temps are 62°C GPU, 72°C VRM1 and 59°C VRM2. My voltage is set to +100mV, which is approximately 1.18V under load.
The two Noctua F12s are very quiet, about as loud as a be quiet! Silent Wings at 100%.
Are these settings okay for that voltage, or did I get a bad overclocker?


----------



## chiknnwatrmln

1200 core on +100mV is pretty good.


----------



## Talon720

Quote:


> Originally Posted by *hammelgammler*
> 
> I want to ask you something.
> I bought a Sapphire R9 290X Reference and put the Accelero Hybrid on it.
> My highest overclocking for the card is 1200MHz Core and 1420 Memory. The temps are 62°C GPU, 72°C VRM1 and 59°C VRM2. My voltage is set to +100mV, whats approximate around 1,18V under load.
> The two Noctua F12 are very silent, as noisy as a beQuiet Silent Wing @ 100%.
> Are these settings okay for that voltage or is it a bad overclocker i got there?


Are you serious, joking, or just rubbing it in? No overclock is guaranteed, but I'd say that's a pretty good card; there are a lot of people who can't even get 1100 no matter what.


----------



## MapRef41N93W

Quote:


> Originally Posted by *Prozillah*
> 
> Definitely something up with that set up if you're having to underclock to remain stable - have you tried each card individually? Hell even tried swapping the slots around? I would be tempted to get my hands on another mobo to try an isolate that as a possible potential issue.


I've already tried each card and swapped slots. Both cards run fine individually. Nothing is wrong with my mobo; it ran SLI 760s perfectly fine for multiple months, and it has the necessary 3x PCI-E 3.0 slots as well.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Wait....so you are saying that in Xfire these cards don't work properly with any driver?


Correct.


----------



## chiknnwatrmln

With adequate cooling just about every Hawaii chip can hit at least 1100MHz. Most fall in the 1150-1250 range for max OC; by that I mean with at least +100mV.


----------



## hammelgammler

I was just asking because 1420 is the max I can achieve with the Elpida RAM, which seemed pretty low to me.
From what I've seen, many people get at least 1500 on the memory.
But okay, if it's a good result then I'm happy with my card.


----------



## Mercy4You

Quote:


> Originally Posted by *Mattriz*
> 
> How come i dont get any better result after overclocking?
> i stress tested using OCCT with no errors and ran Heaven Benchmark 4.0 without seeing any artifacts


I notice you're running the CCC 14.2 beta; this may be your problem. I still run 13.12 because I had trouble benchmarking with 14.1 and rolled back to 13.12, which is a rock-stable driver!


----------



## phallacy

Quote:


> Originally Posted by *Mercy4You*
> 
> I notice you run CCC 14.2 beta, this may be your problem. I still run 13.12 because I had trouble benchmarking with 14.1 and rolled back 13.12 which is a rock stable driver!


I've actually had the opposite happen to me. My best benchmark scores are with 14.2. With 13.12, especially in Valley, there's about a 20 FPS difference, though adding a Valley-specific application profile in CCC seems to even it out. With 14.2 nothing is really needed; I fire up any benchmark program and on the whole it scales better than 13.12.


----------



## wako7654

Just curious,

What is the black screen issue other users are having?

Because my PCS+ 290 has driver crashes ("driver stopped responding") in older titles (WoW/BioShock) on DX11. Is it the same issue they're seeing?


----------



## Iniura

Quote:


> Originally Posted by *hammelgammler*
> 
> I was just asking because 1420 is the max i can achieve with the Elpida RAM, which seemed pretty low to me.
> As i could see many people got at least 1500 one the memory.
> Well but okay, if it's a good result then im happy with my card.


You could use Afterburner 3.0.0 beta 18 and set the AUX voltage to +13mV; that will probably help you get a much higher memory OC.


----------



## Mattriz

Quote:


> Originally Posted by *Mercy4You*
> 
> I notice you run CCC 14.2 beta, this may be your problem. I still run 13.12 because I had trouble benchmarking with 14.1 and rolled back 13.12 which is a rock stable driver!


Thanks, that got me this:


I might be able to get higher clocks; back to the drawing board!


----------



## Mercy4You




----------



## hammelgammler

Quote:


> Originally Posted by *Iniura*
> 
> You could use Afterburner 3.0.0 beta 18 and set the AUX voltage to +13, that will probably help to get much higher memory OC.


I will try that. Is +13mV the "highest" you should go, or could I set it even higher?


----------



## Iniura

Quote:


> Originally Posted by *hammelgammler*
> 
> I will try that. Is +13mV the "highest" you should do or could i set it to even more?


You could set it even higher; I've seen people use +25mV and even +100, but for me it was enough to hit a 1625 memory OC with Elpida.

I don't know how safe it is to set it to +100 though. I'd try +25 max for the moment and do some more research before setting it higher.


----------



## axiumone

Quote:


> Originally Posted by *MapRef41N93W*
> 
> I've already tried each card and swapped slots. Both cards run fine individually. Nothing is wrong with my mobo, it ran SLI 760s perfectly fine for multiple months. It's has the necessary 3x PCI-E 3.0 slots as well.
> Correct.


Have you tried flashing both cards with the same bios and then running in crossfire?


----------



## taem

Quote:


> Originally Posted by *Mercy4You*
> 
> 1: can someone recommend a good case that has WC pre-installed or easy complete WC install set plus case?


If you're willing to invest in a premium case, check out the Enthoo Primo. $230
Quote:


> 3.which PSU for 2 x R9 290X? (Yes I will OC CPU and GPU)


IMHO the Cooler Master V1000 is the best dual gpu psu available right now. A steal at $200.
Quote:


> 4: which Mobo? (not ASUS, I will never buy their products again as result of bad service
> 
> 
> 
> 
> 
> 
> 
> )


The Gigabyte Z87X-UD7 TH looks awesome. All sorts of tricks, like being able to turn off PCIe slots so you can disable a card without physically removing it. I'll bet this thing is going to be expensive though; Thunderbolt alone adds such a high premium.


----------



## stilllogicz

I second the Enthoo Primo; it's perfectly fine for entry-level up to high-end watercooling setups. Extreme loops like dual loops can work as well, but it's more limited there. The case is amazingly well laid out for watercooling. Check out some in-depth reviews on YouTube.


----------



## Paul17041993

at this point
Quote:


> Originally Posted by *MapRef41N93W*
> 
> I have to underclock my cards to stay stable at all in X-fire. Just running stock causes vertical lines of death after 20 mins or so of gaming/benching.


test those two DCIIs individually in games; if they crash, they are defective and the sole cause of your problem.


----------



## MapRef41N93W

Quote:


> Originally Posted by *Paul17041993*
> 
> at this point
> test those two DCIIs individually in games, if they crash they are defective and your sole cause of problem.


I only have one DCII; my other card is a Tri-X. I have already tested them individually, and both cards ran just fine for 40 mins in Heaven and 1-2 hours of gaming.

I may try BIOS flashing like the other poster suggested and see if that works at stock. If not, I may just sell this Asus and buy another Tri-X for a perfectly matched setup and see if that fixes it.


----------



## hammelgammler

Does anyone know exactly what the AUX voltage does?
I have to set it high to get 1500MHz stable; I'm already at +25mV. Right now it works; with +19mV I got a black screen in Valley.
Is +25mV okay as a 24/7 setting?


----------



## airisom2

Hey guys. I'm going to have a pair of PCS+ 290Xs by around Saturday. Is there anything I should be warned about when it comes to Crossfire and whatnot? I want to make a really in-depth review of the cards for OCN, with BIOS flashes, a lot of benchmarks, overclocking, temps, and some other stuff, and I just want to know if there's anything I should know before I start getting into it. I don't mine, so I don't really need information regarding stuff like that. These babies are strictly for gaming








Quote:


> Originally Posted by *hammelgammler*
> 
> Does anyone exactly know what the aux voltage does?
> I have to set it "high" to get 1500MHz stable, im already at +25mV, right know it works, with +19mV i got a black screen at Valley.
> Is +25mV good as a 24/7 setting?


Aux/PLL voltage determines how much voltage goes through PCIe.


----------



## BradleyW

Quote:


> Originally Posted by *airisom2*
> 
> Hey guys. I'm going to have a pair of PCS+ 290Xs by around Saturday. Is there anything I should be warned about when it comes to crossfire and whatnot? I'm wanting to make a really in-depth review for OCN on the cards with bios flashes, a lot of benchmarks, overclocking, temps, and some other stuff, and I just want to know if there's anything I should know about before I start getting into it. I don't mine, so I don't really need information regarding stuff like that. These babies are strictly for gaming
> 
> 
> 
> 
> 
> 
> 
> 
> Aux/PLL voltage determines how much voltage goes through PCIe.


Yes, the temps. Make sure you feed them with plenty of cool air. Also, update your motherboard BIOS (some motherboards require this for the new XDMA tech to work well).


----------



## airisom2

Temps? Covered









Spoiler: Warning: Spoiler!



bgears b-blaster 120mm x2 (103cfm each)


Yate Loon D14SH-12 140mm x2 (140cfm each)




Off to flash my BIOS now. Thanks.

Edit: Done. Is that all, guys? I remember reading some posts a while ago about fluctuating GPU utilization on Crossfire setups and black screens. Are those still problems?


----------



## BradleyW

Quote:


> Originally Posted by *airisom2*
> 
> Temps? Covered
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> bgears b-blaster 120mm x2 (103cfm each)
> 
> 
> Yate Loon D14SH-12 140mm x2 (140cfm each)
> 
> 
> 
> 
> Off to flashing my bios now. Thanks.
> 
> Edit: Done. Is that all, guys? I remember reading some posts on *fluctuating gpu utilization on crossfire setups and black screens* a while ago . Are those still problems?


The GPU utilization issues are not real. It's just software guessing the usage: this new architecture's utilization can't be analysed yet, so software uses algorithms to show an approximation. As for black screens, we think it was a first-batch problem that has long been fixed.

Hope this helps.


----------



## airisom2

Quote:


> Originally Posted by *BradleyW*
> 
> The GPU utilization issues are not real. It is just software guessing the usage since this new architectures utilization can't be analysed yet. So software is using algorithms to show an approximation. As for black screens, we think it was a first batch problem that's been long fixed.
> 
> Hope this helps.


Yup, that clears a lot of stuff I thought I would have to deal with. Thanks.


----------



## chiknnwatrmln

About to DL BF4 and install the 14.3 drivers... Anything I should know about the newest beta drivers? I remember hearing something about a decrease in DX performance; is that true?


----------



## BradleyW

Quote:


> Originally Posted by *airisom2*
> 
> Yup, that clears a lot of stuff I thought I would have to deal with. Thanks.


One more thing: disable ULPS. Don't do this via the registry; do it via MSI Afterburner.


----------



## airisom2

Alright


----------



## taem

Requesting info: what variance are you getting on your 12V rail? We're trying to diagnose underperforming cards over in the PowerColor PCS+ thread. I'm getting a range of 11.564 to 11.813V during gaming, which seems like a big spread to me.
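For context, the ATX spec allows roughly ±5% on the 12 V rail (11.40 to 12.60 V), so a quick sanity check on readings like those is easy to sketch. Sensor accuracy aside, this is just the arithmetic, not a verdict on the cards:

```python
# Check a reported 12 V rail min/max against the ATX +/-5% tolerance window.
def rail_check(v_min, v_max, nominal=12.0, tol=0.05):
    """Return (spread as % of nominal, whether both readings are in spec)."""
    spread_pct = round((v_max - v_min) / nominal * 100, 1)
    in_spec = nominal * (1 - tol) <= v_min and v_max <= nominal * (1 + tol)
    return spread_pct, in_spec

# The readings reported above while gaming:
print(rail_check(11.564, 11.813))  # (2.1, True): ~2% swing, inside 11.40-12.60 V
```

So that spread is within spec on paper, though both readings sit noticeably below nominal; a multimeter on a spare connector would be a more trustworthy check than software sensors.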


----------



## chiknnwatrmln

If you're going off GPU-Z readings from your GPU, those are notoriously inaccurate.

Side note: anybody know a fix for BF4 crashing? I've been trying to play the game for an hour now, and after 2 minutes or less I always get a crash. It's not my OC, and I've tried running in 32-bit and in both DX and Mantle...

Also Mantle provides zero performance increase for me.


----------



## unquestioned

Hi hi, I am new to the Crossfire experience. I haven't fitted it yet as I'm on night shift at the moment.

I'm currently running a 9370 CPU; my current GPU is a reference Sapphire 290 that I bought on the day of release. The stock cooler is gone though, and it's cooled with an H75 on a G10 bracket. It clocks stable at 1100MHz and 1325 on the memory. Temps don't go above 60°C









I'm expecting a decent boost from the Tri-X that will sit beside it. I chose the Tri-X because it's close to my current card, with an AMD reference board, and has good cooling, so I don't need to set up another H75; space is pretty tight now.


----------



## The Mac

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> If you're going off of GPUz readings from your GPU, those are notoriously inaccurate.
> 
> Side note, anybody know a fix for BF4 crashing? I've been trying to play the game for an hour now, after 2 minutes or less I always get a crash. It's not my OC and I've tried running in 32bit and both DX and Mantle...
> 
> Also Mantle provides zero performance increase for me.


Try deleting the cache folder; that will force it to rebuild the shaders.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *The Mac*
> 
> try deleting the cache folder, it will force it to rebuild the shaders.


Where is the cache folder? I looked in my BF4 directory but I don't see it.

This is a new install of BF4 btw.


----------



## The Mac

It's in your My Documents folder.
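For anyone else hunting for it, here's a minimal sketch of the delete. The "Battlefield 4/cache" folder name under Documents is an assumption from memory, so verify the path on your own machine before deleting anything:

```python
import shutil
from pathlib import Path

# Sketch: wipe the BF4 shader cache so the game rebuilds it on next launch.
# The "Battlefield 4/cache" location under Documents is an assumption;
# check the actual folder name on your install first.
def clear_bf4_cache(documents: Path) -> bool:
    cache = documents / "Battlefield 4" / "cache"
    if cache.is_dir():
        shutil.rmtree(cache)  # removed; shaders rebuild on next launch
        return True
    return False              # nothing to delete

# Example on Windows: clear_bf4_cache(Path.home() / "Documents")
```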


----------



## Forceman

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> If you're going off of GPUz readings from your GPU, those are notoriously inaccurate.
> 
> Side note, anybody know a fix for BF4 crashing? I've been trying to play the game for an hour now, after 2 minutes or less I always get a crash. It's not my OC and I've tried running in 32bit and both DX and Mantle...
> 
> Also Mantle provides zero performance increase for me.


What kind of crash exactly? Seems like there are about 4 common crashes, all with different causes.


----------



## chiknnwatrmln

OK thanks, I brought it up in the BF4 thread as to not derail this one.


----------



## Arizonian

Quote:


> Originally Posted by *Mattriz*
> 
> GPU-Z: http://www.techpowerup.com/gpuz/rnnh/
> SAPPHIRE R9 290X 4GB GDDR5
> stock cooling


Congrats - added


----------



## rickyman0319

In GPU-Z on the R9 290, it says VRM 1 is at 100°C. Which part on the 290 is VRM 1?


----------



## Forceman

Quote:


> Originally Posted by *rickyman0319*
> 
> on the gpuz on r9 290 , it says (vrm 1) 100C. which parts on 290 is vrm 1 on it?


The line of VRMs near the power connectors. They provide core power.


----------



## rickyman0319

How do you guys cool it off with the heatsink that's included with the Gelid? I have tried a lot of things and it's still not working; it's always 100°C when I'm mining.


----------



## givmedew

Quote:


> Originally Posted by *rickyman0319*
> 
> how do u guys cool it off when u put the heatsink that include with Geild ? I have tried a lot of things. it still not working. it always 100C when I do mining.


Which heatsink that comes with Gelid parts are you talking about?

If you have a stock 290 cooler, then read up on how to use the original cooler with a universal GPU block, then put some heatsinks on the flat plate that's left over and point a high-powered fan at it.

Not sure why, but 100°C is higher than any of the air-cooled 290s I've worked with have ever hit while mining.

My primary unit is water-cooled... if you're into water-cooling I would recommend doing that... you can get a universal block for $20-30


----------



## rickyman0319

I am not using watercooling. I am just using the Gelid Icy 2 on the R9 290. I am talking about the Gelid VRM heatsink (one piece).


----------



## TommyGunn123

Quote:


> Originally Posted by *rickyman0319*
> 
> I am not using watercooling. I am just using geild icy 2 for r9 290. I am talking about geild vrm heatsink ( 1 piece).


Assuming you've mounted it properly and the fans are turned up high, it may need better thermal tape or better airflow in the case. Those coolers aren't fantastic for VRMs, but they should do better than that.


----------



## Talon720

Anyone running Crossfire with HWiNFO64 see the 2nd GPU reporting higher wattage? It seems weird that it would use 10 more watts than GPU 1. GPU 2's VRM1 temp is also around 10 degrees higher. If I turn down the power limit, voltage, and clocks (using the PT1 BIOS), then the temps and wattages are closer. Is the 10W difference enough to explain the +10 degree VRM1 difference? I'm also trying to rule out that my 2nd GPU's VRM1 thermal pad isn't making sufficient contact. I'm using EK water blocks. Under load nothing overheats, and the core temps are both even; it's just the 10 degree gap between the two VRM1 temps, and the wattage under load being higher on GPU 2.


----------



## Imprezzion

Guys, when you're overclocking an R9-series card and your game freezes (just a hard lock, no artifacts or anything), is that core or VRAM?
I'm testing my second Tri-X 290X for max OC to see if it's better than my other one, but whereas my other one artifacts well before hard locking, this one doesn't.


----------



## axiumone

Hard lock is most likely core. Memory will artifact heavily before locking.


----------



## Imprezzion

Yeah true. This card does stay a *lot* cooler than my other one though.

Much higher ambient (about 6c higher), yet still over 10c cooler on VRM1 and core than my other one. Both are repasted with Prolimatech PK-1, as Sapphire doesn't use warranty stickers on the screws.









The other one maxes out at 1180MHz core and 1625MHz VRAM at +200mV. Normal day-to-day clocks for it are maxed at 1140/1625 on +100mV, getting 74c core and 80c VRM1 at 65% fan speed.

This one, the newer one, is now testing at 1200MHz core and the same 1625MHz VRAM on +150mV core. Still, temps maxed out after 3 rounds of BF4 (same as the other card) at 70c core and 77c VRM1 with 65% fan speed.

It seems like I have a fair chance at 1200MHz 24/7







Even if voltage is a tad high.


----------



## GhostRyder

Quote:


> Originally Posted by *Imprezzion*
> 
> Yeah true. This card does stay a *lot* cooler then my other one though.
> 
> Much higher ambient (about 6c higher), yet still over 10c cooler on VRM1 and core then my other one. Both are repasted with Prolimatech PK-1 as Sapphire doesn't use warranty stickers on the screws
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The other one maxes out at 1180Mhz core and 1625Mhz VRAM at +200mV. Normal day to day clocks for it are maxed at 1140/1625 on +100mV. Getting 74c core and 80c VRM1 on 65% fanspeed.
> 
> This one, the newer one, is now testing at 1200Mhz core and the same 1625Mhz VRAM on +150mV core. Still, temps maxed out after 3 rounds of BF4 (same as the other card) on 70c core and 77c VRM1 with 65% fanspeed.
> 
> It seems like I got a fair chance on 1200Mhz 24/7
> 
> 
> 
> 
> 
> 
> 
> Even if voltage is a tad high.


That's quite impressive, getting 1200 + 1625, if you get it fully stable. I'm jealous of your Tri-X card lol.

I have not tried my cards individually, mostly because I wanted to test what all three could do together more than anything. I still have not tried much higher clocks recently, but I have reduced the voltage on mine and upped the VRAM clocks.


----------



## Niberius

+100mv (1.328vc actual load) gets me rock solid stable at a 1200MHz core clock on my 290X, and memory is 1500MHz since it's Elpida RAM. The Accelero 3 cooler keeps me at 64c after hours of heavy gaming with BF4 or Crysis 3 maxed out, with the hottest VRM being VRM 1, which tops out at 79-80c.


----------



## hammelgammler

Quote:


> Originally Posted by *Niberius*
> 
> +100mv (1.328vc actual load) gets me rock solid stable 1200mhz core clock on my 290X and memory is 1500mhz since its elpida ram. Accelero 3 cooler keeps me at 64c after hours of heavy gaming with BF4 or Crysis 3 maxed out and the hottest VRMs being VRM 1 tops out at 79-80c.


Is 1.328V the average voltage under load, or the max (peak) voltage which you get when selecting max in GPU-Z for the voltage?
I'm asking because I wonder why my voltage is so low under load.
With +112mV my average voltage in GPU-Z is ~1.193.
You have very good cards when it comes to VRAM OC; I would love to have 1500MHz with my Elpida RAM, but more than 1430 is pretty unstable (black screen after a while).


----------



## Roboyto

Quote:


> Originally Posted by *axiumone*
> 
> Hard lock is most likely core. Memory will artifact heavily before locking.


Spot on.

Artifacting can happen from not enough volts when you're trying to push the core; go too far and you hard lock. I did have some black screens from pushing memory too far: once my card crossed 1700 on RAM I would get heavy artifacts and black screens.

Check my build log if you'd like; I have a Google Docs spreadsheet with all my overclocking attempts and notes on what issues I had with every increase in volts and clocks.


----------



## Imprezzion

Weird thing is, with my other Tri-X, the older one, the card shows heavy artifacting with white checkerboard patterns when the core MHz is too high or the volts too low. The newer one just... locks up. No artifacts or anything. Oh well. It initially seems like +150mV is enough for 1200MHz core.

I'll check what the average load voltage is, but I'm guessing ~1.24v.


----------



## The Mac

Better check your vrm1 temp at that speed.


----------



## hammelgammler

Is there any way to set more than 100mV GPU + AUX voltage?
Sapphire TriXX has the option to select more than 100mV, but no AUX setting.


----------



## Imprezzion

Quote:


> Originally Posted by *The Mac*
> 
> Better check your vrm1 temp at that speed.


I do







I know the dangers, as it cost me a reference HD7970 back in the day. Ran at 1.38v load for benches. VRM1 would hit up to 110-115c under the Accelero 7970, and after a last suicide run to get a ''record'' in 3DM11 at 1380MHz core, it blew one of the main VRMs mid-run in GT2.









But, to get to the point, I haven't had any VRM go over 80c for longer periods of time at all. On +150mV the new card (65-70% fan speed) runs at about 77c VRM1 and 51c VRM2.

I'm going to go back to testing the beast in BF4 again, as that's proven to be a great stress test when running Mantle on Ultra / no AA / 150% res scale at 1080p, a.k.a. 2880x1620.

EDIT: OK, to answer the VDDC question, and VRM temps as well: my card at +150mV, which I'm running now, sits at 1.266-1.273v load. This results in core temps of 71c max, and VRM temps of 81c VRM1 and 53c VRM2 max. At 70% fan speed btw, ambient 24c.


----------



## bbond007

I have two MSI R9 290X Gaming Twin Frozr cards. I do have them flashed with the Asus BIOS for better temp control.

Anyway, I just got this DisplayPort to VGA adapter because I want to use a KVM with my system, which supports VGA only.

The DisplayPort to VGA adapter worked with my bottom card but not my top (primary) card. I swapped the cards and it still only worked with that one same card... which is now the primary topmost card.

I have no other DisplayPort capable adapters or devices.









The error I'm getting is "Displayport link Failure".

Any ideas? Does one of my cards have a defective DisplayPort?

thanks...


----------



## Mercy4You

Well guys, I told you about my black-screening ASUS R9 290X which was sent back to me by ASUS after 10 weeks of RMA. My vendor was kind enough to take on the next battle with ASUS regarding this ridiculous act of theirs, and provided me today with a new and shiny Sapphire R9 290X Tri-X as a replacement























1175/1500 with + 50 mV


----------



## MapRef41N93W

Quote:


> Originally Posted by *Mercy4You*
> 
> Well guys, I told you about my blackscreening ASUS R9 290X which was sent back to me by ASUS after 10 weeks of RMA. My vendor was so kind to take on the next battle with ASUS regarding this ridiculous act of them and provided me today with a new and shiny Sapphire R9 290X Tri-X as replacement
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1175/1500 with + 50 mV


Congrats man. How are your temps? You should slap some good thermal paste on it and if your card was anything like mine you'll see big temp drops. No warranty sticker on the Tri-X cooler either.


----------



## Imprezzion

Quote:


> Originally Posted by *Mercy4You*
> 
> Well guys, I told you about my blackscreening ASUS R9 290X which was sent back to me by ASUS after 10 weeks of RMA. My vendor was so kind to take on the next battle with ASUS regarding this ridiculous act of them and provided me today with a new and shiny Sapphire R9 290X Tri-X as replacement
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1175/1500 with + 50 mV


Aaah bah, why do you guys on OCN always get the epic ones, while I have 2 Tri-X's which won't do 1150 on anything less than +100mV


----------



## rdr09

Quote:


> Originally Posted by *Mercy4You*
> 
> Well guys, I told you about my blackscreening ASUS R9 290X which was sent back to me by ASUS after 10 weeks of RMA. My vendor was so kind to take on the next battle with ASUS regarding this ridiculous act of them and provided me today with a new and shiny Sapphire R9 290X Tri-X as replacement
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1175/1500 with + 50 mV


happy for you Mercy.


----------



## BradleyW

So there has been talk about the Sound Blaster Z causing issues with CFX:

For those who are interested, I conducted an experiment to see if the Sound Blaster Z causes elevated CPU usage and reduced GPU usage whilst gaming.
Here is my study:


Spoiler: Warning: Spoiler!



After receiving news about the potential issues with the Creative Sound Blaster Z series sound cards causing elevated CPU usage and reduced multi-GPU scaling abilities, I decided to carry out a series of tests to determine if these issues are present on my own configuration of hardware. Here are the results.
For system specifications, see sig rig below.

All 3D applications were run at 1080p @ 144Hz.
Hyper-Threading was disabled to give a better indication of raw CPU time.
Sound Blaster Z (Latest official Creative drivers for Win 8.1 x64) :
RealTek onboard HD (Latest official RealTek drivers for Win 8.1 x64) :

Testing:

Metro Last Light, All settings max except SSAA (OFF), Physx (OFF).

Location: Artyom's room (Chapter 1), looking into the room from the doorway.

(Please note, Metro LL does not scale well in many areas of this game by nature)

*Sound Blaster Z*
CFX = 182 FPS
No CFX = 120 FPS
CPU Usage = 70 on each core (CFX)

*RealTek onboard HD*
CFX = 181 FPS
No CFX = 117 FPS
CPU Usage = 70 on each core (CFX)

Tomb Raider, Ultimate Preset, FXAA, ALL additional Settings enabled/max out.

(TEST 1) Location: helipad, overlooking the shanty town (Highly CPU dependent area. Looking to see if the Sound Blaster Z could cause increased CPU time which may further bottleneck the GPUs)

*Sound Blaster Z*
CFX = 72 FPS
No CFX = 58 FPS
CPU Usage = 95 on each core (CFX)

*RealTek onboard HD*
CFX = 71 FPS
No CFX = 58 FPS
CPU Usage = 95 on each core (CFX)

(TEST 2) Location: Built-in benchmark (This is GPU dependent. This will test if crossfire scaling is disrupted by the Sound Blaster Z).

(Please note, the max value can be ignored since it can fluctuate between 170fps to around 220 fps during repeated run throughs. We should look at the min and avg since these are consistently the same every run through).

*Sound Blaster Z*
CFX = Min 112, Max 176, Avg 148 FPS
No CFX = Min 60, Max 100, Avg 78 FPS
CPU Usage = 40 on each core (CFX)

*RealTek onboard HD*
CFX = Min 112, Max 186, Avg 148 FPS
No CFX = Min 60, Max 98, Avg 78 FPS
CPU Usage = 40 on each core (CFX)

Battlefield 4 (DirectX 11.1 API), Ultra Preset, Post AA OFF, MSAA x4, Resolution Scaling 100%, Motion Blur 50%, FOV 90.

(Please note, "gametime.maxvariablefps 0" was used to remove the 200fps limit. Mantle was not used because we need to put the CPU under this kind of strain to see if the Sound Blaster Z can elevate CPU usage beyond an already CPU dependent area).

Location: Test Range

*Sound Blaster Z*
CFX = 213 FPS
No CFX = 108
CPU Usage = 100 (Cores 1 to 4), 98 (Cores 5 and 6) (CFX)

*RealTek onboard HD*
CFX = 213 FPS
No CFX = 108
CPU Usage = 100 (Cores 1 to 4), 98 (Cores 5 and 6) (CFX)

Far Cry 3 (DirectX 9.0 API), Ultra Preset, FOV 90, Water (LOW), GPU frame buffer (1).

Please note, FC3 has suffered from crossfire scaling issues since the HD 7000 series. These issues are more apparent in DX11 mode; DX9 gives the best scaling performance. Performance can be increased by forcing the Bioshock.exe profile, or by reducing the Post FX setting to medium or low. During these tests, I decided to leave the default crossfire profile in place to reduce the chance of instability which could affect the validity of the results. Water is set to low to negate a common CPU overhead issue which causes strange fps results in certain areas of the game, even well into the mainland terrain. Frame buffer is left at 1, since this is the default.

(TEST 1) Location: Doctor Earnhardt's house (looking towards the sealed front door from inside the home. This is a CPU intensive area, not CPU bound. If the Sound Blaster Z did cause elevated CPU time, it would show in this area).

*Sound Blaster Z*
CFX = 138 FPS
No CFX = 117 FPS
CPU Usage = 70 (Cores 1 to 3), 100 (Cores 4 and 5), 90 (Core 6) (CFX)

*RealTek onboard HD*
CFX = 139 FPS
No CFX = 117 FPS
CPU Usage = 70 (Cores 1 to 3), 100 (Cores 4 and 5), 90 (Core 6) (CFX)

(TEST 2) Location 2: Doctor Earnhardt's house (Overlooking the sea during sunset. This is GPU bound)

*Sound Blaster Z*
CFX = 55 FPS
No CFX = 32 FPS
CPU Usage = 35 on each core (CFX)

*RealTek onboard HD*
CFX = 55 FPS
No CFX = 32 FPS
CPU Usage = 35 on each core (CFX)

The next test is to see if a modified EQ and SBX effects caused elevated CPU usage:

*Sound Blaster Z*
Tomb Raider - EQ - SBX Enabled:


Final image just confirms the CPU usage in FC3 (First Test Location).


To conclude, there seems to be zero effect on the system's performance during gaming whilst using the SBZ sound card. From looking at previous cases, users seemed to be on Windows 7 and using last-generation GPUs. I suspect the drivers might have been fixed, or the card simply behaves without error in Windows 8.1.

Thank you for reading.
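For anyone wanting to sanity-check the spread in the numbers above, the CFX results can be condensed into per-test percentage deltas with a few lines of Python. This is only an illustrative sketch: the dictionary layout and names are mine, and the FPS figures are copied from the post.

```python
# Condense the SBZ vs onboard-audio CrossFire FPS results into deltas.
# FPS values are taken from the post above; structure is illustrative.
results = {
    "Metro LL":            {"sbz": 182, "realtek": 181},
    "Tomb Raider helipad": {"sbz": 72,  "realtek": 71},
    "Tomb Raider bench":   {"sbz": 148, "realtek": 148},  # avg FPS
    "BF4 Test Range":      {"sbz": 213, "realtek": 213},
    "FC3 house":           {"sbz": 138, "realtek": 139},
    "FC3 sunset":          {"sbz": 55,  "realtek": 55},
}

def fps_delta_pct(sbz, realtek):
    """Percent difference of the SBZ result relative to onboard audio."""
    return 100.0 * (sbz - realtek) / realtek

deltas = {name: fps_delta_pct(r["sbz"], r["realtek"]) for name, r in results.items()}
worst = max(abs(d) for d in deltas.values())
print(f"largest deviation: {worst:.2f}%")
```

The largest deviation works out to well under 2%, which is within ordinary run-to-run variance, matching the study's conclusion.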


----------



## Mercy4You

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Congrats man. How are your temps? You should slap some good thermal paste on it and if your card was anything like mine you'll see big temp drops. No warranty sticker on the Tri-X cooler either.


Thx! Temps: VRM1 80-85. Which paste did you use, and what did it do to your temps?
Quote:


> Originally Posted by *Imprezzion*
> 
> Aaah bah why do you guys on OCN always get the epic ones and I have 2 Tri-X's which don't do 1150 on any less then +100mV


LOL, there are always guys with better GPUs and CPUs








Quote:


> Originally Posted by *rdr09*
> 
> happy for you Mercy.


Thx m8, it's a happy day indeed!


----------



## Imprezzion

Oh well, I got the better of the 2 Tri-X's to do 1200MHz stable. Took +150mV in TriXX, but it's stable







RAM is also quite good, hitting 1625MHz easily. Ran a few rounds of BF4 at 1700MHz and it ran fine as well, but I did get some small random FPS dips, so I thought it might be hitting ECC.


----------



## MapRef41N93W

Quote:


> Originally Posted by *Mercy4You*
> 
> Thx! Temps VRM1 80-85 Which paste did you use and what did it with your temps?


MX-4 brought temps down 10c in idle and load.


----------



## Mercy4You

Quote:


> Originally Posted by *Imprezzion*
> 
> Oh well I got the better of the 2 Tri-X's to do 1200Mhz stable. Took +150mV in TriXX but it's stable
> 
> 
> 
> 
> 
> 
> 
> RAM is also quite good hitting 1625Mhz easily. Ran a few rounds of BF4 on 1700Mhz and it ran fine as well but I did get some small random FPS dips so I thought it might be hitting ECC.


Are you going to X-Fire them? Very interested to hear what those 2 will do in BF4, like smoothness and microstutter and frametimes...


----------



## Mercy4You

Quote:


> Originally Posted by *MapRef41N93W*
> 
> MX-4 brought temps down 10c in idle and load.










just placed an order


----------



## Imprezzion

Quote:


> Originally Posted by *Mercy4You*
> 
> Are you going to X-Fire them? Very interested to hear what those 2 will do in BF4, like smoothness and microstutter and frametimes...


Wasn't planning on it. Got this one very cheap, so it was more or less a quick flip for a profit and a binning effort, but I can try..

Only issue is my PSU won't ever handle 2 of them overclocked, so IF it handles 2 at stock, I can't bench BF4 overclocked.


----------



## MapRef41N93W

Believe it or not, but my issues with crossfire have apparently been caused by Fraps this entire time... I noticed this because all of a sudden games were just opening to black screens, and closing Fraps fixed this. I uninstalled Fraps and have not had a single black screen or vertical-lines-of-death crash for over a day now, and games load up properly. I also now get an error message saying "Fraps is corrupted please uninstall" even after I have already uninstalled it and the system doesn't see any Fraps files anywhere.

I've got my cards running 1115/1500 stable at +50mV, though I do notice that I actually get higher scores in benchmarks with 1400 mem than 1500. I assume it's probably the Elpida Asus card causing that. Definitely going to drop these cards under water though, cause at +50mV my temps go up about 6-7c and are around 79-81/74-76 after about 30 mins of Heaven/Valley.


----------



## Sgt Bilko

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Believe it or not, but my issues with crossfire have apparently been caused by Fraps this entire time... I noticed this because all of a sudden games were just opening to black screens and closing fraps fixed this. I uninstalled Fraps and have not had a single black screen or vertical lines of death crash for over a day now and games load up properly. I also now get an error message saying "Fraps is corrupted please uninstall" even after I have already uninstalled it and the system doesn't see any more Fraps files anywhere.
> 
> I've got my cards running 1115/1500 stable at +50 mv though I do notice that I actually get higher scores in benchmarks with 1400 mem over 1500. I assume it's probably the Elpida Asus card causing that to happen. Definitely going to drop these cards under water though, cause at +50 mv my temps go up about 6-7c and are around 79-81/74-76 after about 30 mins of Heaven/Valley.


So not crappy AMD drivers?


----------



## Imprezzion

Hmm. DirectX crash in Battlefield 4. Is that core or memory?


----------



## rickyman0319

Quote:


> Originally Posted by *givmedew*
> 
> What heatsink includes geilid that you are talking about?
> 
> If you have a stock 290 cooler then read how to use the original cooler with a universal GPU block and then put some heatsinks on the flat plate that is left over and point a high powered fan at it.
> 
> Not sure why but 100C seems higher than any of the air cooler 290s that I have worked with ever hit while mining.
> 
> For my primary unit it is water-cooled... since you are into water-cooling I would recommend doing that... you can get a universal block for $20-30


Where do you buy a universal block for $20-30?

I also want to know about this: "If you have a stock 290 cooler then read how to use the original cooler with a universal GPU block and then put some heatsinks on the flat plate that is left over and point a high powered fan at it."

Where do I get the info?


----------



## Imprezzion

Quote:


> Originally Posted by *Imprezzion*
> 
> Hmm. DirectX crash in Battlefield 4. Are they core or memory?


I just remembered something.. how on earth did I get a DirectX crash when I'm running Mantle... unless it's not really running Mantle.

Scratch that. I'm on the 13.12 drivers due to throttling on 14.xx.. so it's DirectX still.


----------



## bond32

I wish I had gone the universal GPU block route myself. Jumped the gun and got the Koolance, which works fantastic, but long term the universal is better.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Imprezzion*
> 
> I just remembered something.. how on earth did I get a DirectX crash when i'm running Mantle... Unless it's not really running Mantle.
> 
> Scratch that. 13.12 drivers due to throttling on 14.xx.. DirectX still.


I've had something similar before. A DDU uninstall and 14.2 plus Mantle fixed it for me


----------



## rdr09

Quote:


> Originally Posted by *Imprezzion*
> 
> I just remembered something.. how on earth did I get a DirectX crash when i'm running Mantle... Unless it's not really running Mantle.
> 
> Scratch that. 13.12 drivers due to throttling on 14.xx.. DirectX still.


it's either throttling, or the 14s are not good for benching . . .

13.11

http://www.3dmark.com/3dm/1895751

14.3

http://www.3dmark.com/3dm/2694522?


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> it's either throttling or 14s are not good for benching . . .
> 
> 13.11
> 
> http://www.3dmark.com/3dm/1895751
> 
> 14.3
> 
> http://www.3dmark.com/3dm/2694522?


Weird. 13.12 was good for benching, but 14.2 was better in my experience.

I haven't tried 14.3 yet, but I think I'll wait for the next beta or a WHQL driver, which should be around the corner actually.


----------



## Aussiejuggalo

So getting my 290 today









just wanted to ask what's the best way to uninstall Nvidia drivers: the full manual screwing around in the registry, or using Display Driver Uninstaller? Normally I just use the Windows uninstaller, but seeing as I haven't gone AMD in many many years I don't wanna have stupid driver issues


----------



## Jack Mac

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> So getting my 290 today
> 
> 
> 
> 
> 
> 
> 
> 
> 
> just wanted to a ask whats the best way to uninstall Nvidia drivers, the full manual screwing around in the registry or using Display Driver Uninstaller? normally I just use the Windows uninstaller but seeing as I havent gone AMD in many many years I dont wanna have stupid driver issues


I've been using this to uninstall drivers, zero issues with AMD/Nvidia.
http://www.overclock.net/t/1150443/how-to-remove-your-nvidia-gpu-drivers/0_40


----------



## cennis

Quote:


> Originally Posted by *Niberius*
> 
> +100mv (1.328vc actual load) gets me rock solid stable 1200mhz core clock on my 290X and memory is 1500mhz since its elpida ram. Accelero 3 cooler keeps me at 64c after hours of heavy gaming with BF4 or Crysis 3 maxed out and the hottest VRMs being VRM 1 tops out at 79-80c.


How does +100 give you 1.32v? Can you try running 3DMark 11 and see if you encounter a drop to 1.27v?


----------



## Mr357

Quote:


> Originally Posted by *cennis*
> 
> How does +100 give you 1.32v? Can you try running 3dmark 11 and see if u encounter a drop to 1.27v?


+100mV for me is 1.35V.


----------



## cennis

Quote:


> Originally Posted by *Mr357*
> 
> +100mV for me is 1.35V.


I just tried a non-vdroop driver from the 290 -> 290X unlock thread;

that gives me 1.35V constant voltage in 3DMark 11, however the system just shuts off after 10 seconds, possibly reaching 95c too fast on the air cooler.

However, with Sapphire's BIOS setting, +100mV shows 1.35v, but when I run 3DMark 11 it drops down to ~1.20.
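The pattern cennis describes is the usual offset-plus-vdroop arithmetic: the driver sets stock VDDC plus the software offset, and under load the rail sags by some droop unless the BIOS/driver holds it constant ("non-vdroop" behaviour). A tiny sketch, where the stock voltage and droop figures are illustrative assumptions only, not values from any particular card or BIOS:

```python
# Sketch of how a voltage offset and vdroop combine; the base VDDC and
# droop values below are illustrative assumptions, not from a real BIOS.
STOCK_VDDC = 1.250        # assumed stock core voltage (volts)

def load_voltage(offset_mv, droop_mv):
    """Idle set-point is stock + offset; under load it sags by the droop."""
    set_point = STOCK_VDDC + offset_mv / 1000.0
    return set_point - droop_mv / 1000.0

# +100mV with no droop (constant-voltage, "non-vdroop" driver behaviour):
print(round(load_voltage(100, 0), 3))    # 1.35
# +100mV with 150mV of droop under load:
print(round(load_voltage(100, 150), 3))  # 1.2
```

This is why two cards can both be "+100mV" yet read 1.35v on one and ~1.20v on the other under load: the offset is the same, but the droop is not.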


----------



## rickyman0319

Why is a universal GPU block better?

Which GPU block did you buy?


----------



## rickyman0319

Quote:


> Originally Posted by *bond32*
> 
> I wish I had gone the universal GPU block route myself. Jumped the gun and got the Koolance which works fantastic, but long term the universal is better.


Which GPU block did you buy? Why is it better?


----------



## DrClaw

Can someone here please tell me what your stock core voltage is for the stock AMD 290? I want to undervolt my PowerColor PCS 290; it has a 1040 core clock and 1350 memory.
This way I can reduce heat.

I'm not sure if the core voltage is the same.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Jack Mac*
> 
> I've been using this to uninstall drivers, zero issues with AMD/Nvidia.
> http://www.overclock.net/t/1150443/how-to-remove-your-nvidia-gpu-drivers/0_40


Ah cool









Question: can you mine on a 290 while you're doing general stuff like watching movies and crap? Just curious


----------



## Paul17041993

Quote:


> Originally Posted by *rickyman0319*
> 
> why universal gpu block is better?
> 
> which gpu block did u buy?


Universal blocks are not generally "better"; they just can be used on multiple cards, very similar to CPU blocks. However, they only cover the core and nothing else; VRMs, memory and any extra controllers need heatsinks or their own blocks to be kept cool. Full-cover blocks are very common for this reason, as they cool the entire card with water and need nothing else. They also only take 1 slot of space where a universal may take 2, at the cost that they are only usable on that one card.


----------



## TommyGunn123

Kinda sorta off topic, but the Dell UltraSharp UP2414Q popped up on PCCG today and it's only $1449 aussie (only, lol)

http://www.pccasegear.com/index.php?main_page=product_info&cPath=416&products_id=27171&zenid=c831d2ac128d8287be695cc42838b8dc

I want someone to get one and do some benchmarks


----------



## Jack Mac

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Ah cool
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Question, can you mine a 290 while your doing general stuff like watching movies and crap? just curious


I'm no expert on mining, but you should be able to; just turn down the intensity.


----------



## Niberius

Quote:


> Originally Posted by *cennis*
> 
> How does +100 give you 1.32v? Can you try running 3dmark 11 and see if u encounter a drop to 1.27v?


Nope, that's what it shows consistently, no dropping.


----------



## Arizonian

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> So getting my 290 today
> 
> 
> 
> 
> 
> 
> 
> 
> 
> just wanted to a ask whats the best way to uninstall Nvidia drivers, the full manual screwing around in the registry or using Display Driver Uninstaller? normally I just use the Windows uninstaller but seeing as I havent gone AMD in many many years I dont wanna have stupid driver issues


The OP has all the useful links. A bit too much info sometimes, I think.









This works too









*
Nvidia & AMD Drivers Un-install Utility*


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Arizonian*
> 
> OP has the info of useful links. I think sometimes a bit too much.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This works too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *
> Nvidia & AMD Drivers Un-install Utility*


Yeah, I noticed that after I posted









Can I join ya club







, just stuck my 290 Tri-X OC in



GPU-Z Validation

Can't wait to see how she performs compared to my 670


----------



## Arizonian

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Thats I noticed that after I posted
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can I join ya club
> 
> 
> 
> 
> 
> 
> 
> , just stuck my 290 Tri-X OC in
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> GPU-Z Validation
> Cant wait to see how she performs compared to my 670


Congrats - added


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added


Thanks









Question: the Tri-X is a reference PCB, isn't it? Think I need to get her wet before trying to clock









Edit: lol, had my first black screen when I went to open GPU-Z, had to hard reset


----------



## Imprezzion

GPU-Z causes quite a few crashes with R9's apparently.. My first card can run just fine with GPU-Z open, but my second card will hard lock my PC with GPU-Z open within 10 minutes of gaming... Benching goes fine with GPU-Z open, however..

Tad weird. But OK.

I just ran 3DMark 11 and got quite the score lol. These are 24/7 clocks as in my sig for my CPU, RAM and card. Also, no driver tweaks, no adjusted tessellation. Result is validated.

http://www.3dmark.com/3dm11/8141281 P15941 with a Graphics score well in excess of 18000

Temps:


----------



## Aussiejuggalo

Good to know it's not just me having GPU-Z crashes









Damn, your VRM temps are lower than mine were after a 15 min Furmark burn-in test; my VRM 1 hit 91°


----------



## Mercy4You

Quote:


> Originally Posted by *Imprezzion*
> 
> I just ran 3dmark 11 and got quite the score lol. These are 24/7 clocks as in my sig for my CPU, RAM and card. Also, no driver tweaks, no adjusted tesselation. Result is validated.
> 
> http://www.3dmark.com/3dm11/8141281 P15941 with a Graphics score of well in excess of 18000


Nice!!! Close to mine at 1175/1500 :

Score: P15803 with AMD Radeon R9 290X (1x) and Intel Core i7-4930K

Graphics Score: 17013
Physics Score: 14198
Combined Score: 11588

Nice to see the difference in CPU


----------



## Raephen

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Good to know its not just me having GPU-Z crashes
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Damn your VRM temps are lower then mine were after *a 15 min Furmark burn in test*, my VRM 1 hit 91°


And there is the why: Furmark.

As far as I've seen, it's good for just a few things, like drawing absolutely unrealistic max power through your card. And for heating up your room, should you want to use the PC as such









In Furmark, one of my watercooled VRMs hit 80+ C. VRM2, to be precise, and the main culprit was poorly contacting stock thermal pads due to machining variance in the block.

I fixed that, but I haven't run Furmark since; completely unrealistic program.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Raephen*
> 
> And there is the why: Furmark.
> 
> As far as I've seen good for just a few things, like drawing absolutely unrealistic max power through your card. And for heating up your room, would you want to use the pc as such
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In Fumrark, one of my watercooled VRMs hit 80+ C. VRM2, to be precise, and the main culprit of that was bad contacting stock thermal pads due to machining variance in the block.
> 
> I fixed that, but I haven't run Furmark since then - completely unrealistic program.


I don't overclock my GPUs (mainly coz I've only ever had Nvidia ones), but when I stress my CPU I use Furmark as if it's playing a game while I run LinX to stress the CPU. Kinda stupid way of testing really lol

I'm too lazy/poor for most other real GPU stress tests


----------



## TommyGunn123

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> I dont overclock my GPUs (mainly coz I've only ever had Nvidia ones) but when I stress my CPU I use Furmark as if its playing a game while I run LinX to stress the CPU. Kinda stupid way of testing really lol
> 
> I'm to lazy/poor for most other real GPU stress tests


Just run Unigine Valley/Heaven for a test lol


----------



## rdr09

Quote:


> Originally Posted by *DrClaw*
> 
> can someone here please tell me what your stock core voltage is for the stock amd 290? i want to undervolt my powercolor pcs 290, it has 1040 core clock and 1350 memory
> this way i can reduce heat
> 
> im not sure if the core voltage is the same


0.984v for a reference 290 here.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *TommyGunn123*
> 
> Just use unigine valley/heaven running for a test lol


Yeah, I think I need to. I was gonna use 3DMark.... then I saw you need to pay for it


----------



## Imprezzion

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Yeah I think I need to, I was gonna use 3DMark.... then I seen you need to pay for it


I bought it on G2Play.net for like €8.

But still, my VRMs in-game hit high 70's / low 80's on +150mV, and as it's on Tri-X air cooling that's actually quite impressive.
3DMark's tests just don't take long enough to really heat them up. Same for the core: it only hit 64c in 3DMark, but in real gaming it sits at a nice flat line of 71c.


----------



## TommyGunn123

Team Australia extreme overclockers got their hands on an Asus 290X Matrix, and that mem speed is craycray for Elpida lol



cmon straya









Img source: https://www.facebook.com/photo.php?fbid=276866099155258&set=pcb.276866192488582&type=1


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Imprezzion*
> 
> I bought it on G2Play.net for like, €8.
> 
> But still, my VRM's in-game hit high 70's / low 80's on +150mV but as it's on Tri-X air cooling it's actually quite impressive.
> 3DMark's tests just don't take long enough to really heat them up. Same for the core. It only hit 64c in 3DMark but in real gaming it's at a nice flat line of 71c.


I don't even have 30 cents









I ran Unigine Valley for a few mins; my VRMs hit 80 pretty much straight away lol. I don't really have much airflow in my case though, seeing as it's full of rads








Quote:


> Originally Posted by *TommyGunn123*
> 
> Team Australia Extreme Overclockers got their hands on an Asus 290x Matrix, and that mem speed is craycray for elpida lol
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> cmon straya
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Img source: https://www.facebook.com/photo.php?fbid=276866099155258&set=pcb.276866192488582&type=1










Straya FTW!

Anyone else with a Tri-X OC notice their fans start to get noisy after around 50%?


----------



## Niberius

Quote:


> Originally Posted by *cennis*
> 
> I just tried a non vdroop driver from 290->290x unlock thread,
> 
> that gives me 1.35V constant voltage in 3dmark11, however system just shuts off after 10 seconds. possibly reaching 95c too fast on air cooler.
> 
> however on sapphire's bios setting +100mv is 1.35v but when I run 3dmark11 it goes down to 1.20~


Quote:


> Originally Posted by *Aussiejuggalo*
> 
> I dont even have 30cents
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I ran Unigine Valley for a few mins, my VRM's hit 80 pretty much straight away lol, I dont really have much air flow in my case tho seeing its full of rads
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Straya FTW!
> 
> Anyone else with a Tri-X OC notice their fans start to get noisy after around 50%?


That's why I love the Accelero 3: 100 percent fan speed and I still can't hear it over my case fans. Not that the Tri-X cooler is bad, because it certainly isn't, but when it comes to noise you can't beat the Accelero.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Niberius*
> 
> That's why I love the Accelero 3: 100 percent fan speed and I still can't hear it over my case fans. Not that the Tri-X cooler is bad, because it certainly isn't, but when it comes to noise you can't beat the Accelero.


Or watercooling with 5 - 10 AP-15 fans running push pull on 5v


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Weird. 13.12 was good benching but 14.2 was better in my experience.
> 
> I haven't tried 14.3 yet but I think I'll wait for the next beta or a WHQL driver, which should be around the corner actually.


i just ran it again using 13.11 and same thing . . .

http://www.3dmark.com/3dm/2713716?

physics went down a bit. must be background programs running.


----------



## Imprezzion

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Anyone else with a Tri-X OC notice there fans start to get noisy after around 50%?


Yeah, for me they get really audible past 60%. I run 65% forced fan speed, which is not silent but it ain't loud either.

Also, my ColdZero backplate is in and installed









How do you guys like it on my Tri-X?


----------



## hammelgammler

Does anyone know what's a safe voltage for 24/7 with Aux?
I need at least +50mV on Aux to get 1500MHz stable on the memory, but I don't know if that would be okay for 24/7...


----------



## JordanTr

Hello guys, I'm thinking of getting watercooling for my graphics card only, since my CPU gets decent temps with my Corsair H90. I'm a total noob at this. Would it be any good if I bought, say, an EK kit with a 240mm rad for the CPU plus a waterblock for my R9 290, and connected the card instead of the CPU? It's about £160 for that kit + £85 for the GPU waterblock. Is everything included? I mean pump, fittings etc.?


----------



## rdr09

Quote:


> Originally Posted by *JordanTr*
> 
> Hello guys, I'm thinking of getting watercooling for my graphics card only, since my CPU gets decent temps with my Corsair H90. I'm a total noob at this. Would it be any good if I bought, say, an EK kit with a 240mm rad for the CPU plus a waterblock for my R9 290, and connected the card instead of the CPU? It's about £160 for that kit + £85 for the GPU waterblock. Is everything included? I mean pump, fittings etc.?


yes, should be enough. i only have 120 rads each for my cpu and gpu. 2 fans for the cpu rad and just one for the gpu. my gpu temps are relatively low for just that rad. never saw anything above 55C on the core and vrms. i recommend the ek waterblock 'cause the vrms are cooler on mine than the core. i used CLU for the core and Fujipoly on the vrms - 2 sizes - 1mm and 0.5mm. Go all out.

Use Gelid extreme for the core to be safe.

edit: not sure if you find this in your area . . .

http://www.frozencpu.com/products/16878/thr-164/Fujipoly_Extreme_System_Builder_Thermal_Pad_-_14_Sheet_-_150_x_100_x_05_-_Thermal_Conductivity_110_WmK.html

http://www.frozencpu.com/products/16880/thr-165/Fujipoly_Extreme_System_Builder_Thermal_Pad_-_14_Sheet_-_150_x_100_x_10_-_Thermal_Conductivity_110_WmK.html

more than enough for 3 gpus.


----------



## JordanTr

I got some Noctua NT-H1. Found this one http://www.watercoolinguk.co.uk/p/XSPC-RayStorm-750-AX240-V4-Pump-WaterCooling-Kit_39372.html - will everything be fine? I will put Noctua fans on that rad.


----------



## rdr09

Quote:


> Originally Posted by *JordanTr*
> 
> I got some noctua nt-h1. Found this one http://www.watercoolinguk.co.uk/p/XSPC-RayStorm-750-AX240-V4-Pump-WaterCooling-Kit_39372.html everything gonna be fine? I will put noctua fans on that rad.


that is perfect Jordan. i use Cougar fans. i only hear the pump in my rig.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> that is perfect Jordan. i use Cougar fans. i only hear the pump in my rig.


Silence is overrated, turn the music up to 11


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Silence is overrated, turn the music up to 11


i know. my rig sits about 2 feet away from my left ear.


----------



## hammelgammler

That's weird.
With +100mV I'm able to get 1000 / 1525 stable, also 1200 / 1440, but 1200 / 1500 doesn't work at all; 1440 is the max I get with 1200.
Any suggestions why that happens?


----------



## rdr09

Quote:


> Originally Posted by *hammelgammler*
> 
> Thats weird.
> With +100mV im able to get 1000 / 1525 stable, also 1200 / 1440, but 1200 / 1500 doesn't work at all, 1440 is max i get with 1200.
> Any suggestions why that happens?


what oc'ing tool are you using? if not Trixx - try it. at 1200 core . . . you might need a little higher than +100. try +125, +130, and so on, but make sure to raise the Power Limit to +50%. also, use 13.12 or 13.11 unless you are playing around with mantle.

watch all your temps and don't let them exceed 80C.


----------



## hammelgammler

Quote:


> Originally Posted by *rdr09*
> 
> what oc'ing tool are you using? if not Trixx - try it. at 1200 core . . . you might need a little higher than +100. try +125, +130, and so on but make sure to raise the Power Li (Power limit) to 50%. also, use 13.12 or 13.11 unless you are playing around with mantle.
> 
> watch all your temps and don't let them exceed 80C.


Yeah, I'm using Trixx, but +100mV is enough to get 1200 stable. 1200 / 1440 also works completely stable, as well as 1000 / 1525, but 1200 / 1450 is unstable and I get a black screen after a while.
Temps are absolutely fine.
And yeah, I'm using 13.12 with Windows 7 64-bit.
My card is a Sapphire R9 290X reference @ Accelero Hybrid.


----------



## rdr09

Quote:


> Originally Posted by *hammelgammler*
> 
> Yeah im using Trixx, but +100mV in enough to get 1200 stable. 1200 / 1440 also works completly stable, as well as 1000 / 1525, but 1200 / 1450 is unstable and I get a black screen after a while.
> Temps are absolutely fine.


i know the feeling but 1440 or even 1400 is decent unless you are aiming for mining performance or benching. memory at stock is sufficient for games. luck of the draw.

you did raise the PL, right?


----------



## hammelgammler

Quote:


> Originally Posted by *rdr09*
> 
> i know the feeling but 1440 even 1400 is decent unless you are aiming for mining perfromance or benching. memory at stock is sufficient for games. luck of the draw.
> 
> you did raise the PL, right?


Yes, I raised it to +50%, the max.
As for 1080p, you are right, but I'm gaming at 2560x1440, and it seems like memory is doing a "good job" there.
The R9 290X Lightning @ 1150 / 1625 is 1 FPS slower at 1440p than my settings of 1200 / 1440.


----------



## JordanTr

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Silence is overrated, turn the music up to 11


I'd rather give up 4-5 degrees on my CPU/GPU and have a quiet system. Is there anyone here who has a Fractal Design XL R2? Where do you fit the 240mm radiator?


----------



## Mr357

Quote:


> Originally Posted by *cennis*
> 
> I just tried a non vdroop driver from 290->290x unlock thread,
> 
> that gives me 1.35V constant voltage in 3dmark11, however system just shuts off after 10 seconds. possibly reaching 95c too fast on air cooler.
> 
> however on sapphire's bios setting +100mv is 1.35v but when I run 3dmark11 it goes down to 1.20~


Unfortunately you're always going to see a lot of droop like that unless you use the PT3 BIOS, but I don't recommend that unless your card is under water, and even then it's a bad idea for extended use.


----------



## chiknnwatrmln

Somebody needs to make a BIOS with customizable LLC and downclocking for every day usage... My card droops a whole lot.


----------



## Imprezzion

I actually asked that in a new topic and all, but got no useful replies, and I'm not capable of it myself..

My card is acting weird btw. This morning I played like 3 hours of BF4 straight with no issues at all at 1200/1500MHz +150mV. Then this afternoon it suddenly froze mid-game like it used to do with too little voltage. And now I can't seem to be able to pass a round of BF4 without it hard-locking.

But.. if it's that unstable.. how did I just play 3 hours straight this morning without a hitch... I dunno..


----------



## cennis

That's what I thought should be the case, but that user is saying 1.35V with a +100mV increase. That's high, with or without vdroop.

Quote:


> Originally Posted by *Mr357*
> 
> Unfortunately you're always going to see a lot of droop like that unless you use the PT3 BIOS, but I don't recommend that unless your card is under water, and even then it's a bad idea for extended use.


----------



## DrClaw

what do you guys mean by droop? drops in the 12v?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *DrClaw*
> 
> what do you guys mean by droop? drops in the 12v?


Droop in VDDC.
http://en.wikipedia.org/wiki/Voltage_droop
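To put a rough number on it: droop is basically the load current times the effective load-line resistance, so the sag scales with how hard the card is working. A quick back-of-envelope (the 0.7mΩ load line and the currents here are made-up illustration values, not measurements from a 290):

```python
# Back-of-envelope voltage droop: V_load = V_set - I_load * R_loadline.
# All numbers below are illustrative, not measured values for a 290.
def drooped_voltage(v_set, i_load_amps, r_loadline_ohms):
    return v_set - i_load_amps * r_loadline_ohms

V_SET = 1.30   # volts requested (e.g. roughly +100mV over stock)
R_LL = 0.0007  # 0.7 milliohm effective load line (assumed)

for amps in (20, 80, 150):  # idle-ish, gaming, stress-test load
    print(f"{amps:3d} A -> {drooped_voltage(V_SET, amps, R_LL):.3f} V")
```

Which is why a card that reads 1.35V at light load can show something much closer to 1.20V once a bench actually loads it up.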


----------



## DrClaw

Quote:


> Originally Posted by *rdr09*
> 
> 0.984v for a reference 290 here.


thanks man + rep









I will use Asus GPU Tweak to change the core voltage and bring the core clock to 947 and the mem clock to stock 290 speeds.
Wish me luck!

I'm hoping to literally turn the PowerColor PCS 290 back into a stock 290 without changing the heatsink.
I should be able to reduce heat output by a lot. I notice when gaming that heat accumulates in the case over time; I wish the non-ref cards were more of an exhaust-type design with just better heatsinks.

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Droop in VDDC.
> http://en.wikipedia.org/wiki/Voltage_droop


oh why thankyou








so he was talking about droop in the core voltage (VDDC)


----------



## keikei

Quote:


> Originally Posted by *Imprezzion*
> 
> Also, my ColdZero backpalte is in and installed
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How do you guys like it on my Tri-X?
> 
> 
> Spoiler: Warning: Spoiler!


Sexy! Where would one get one or two of those?


----------



## boldenc

I have a question: I'm currently testing a Tri-X 290X, and once I change the memory clock, the GPU clock gets stuck at 3D clocks at idle and only the memory clock goes down.
Is there a way to fix that?


----------



## rdr09

Quote:


> Originally Posted by *hammelgammler*
> 
> Yeah im using Trixx, but +100mV in enough to get 1200 stable. 1200 / 1440 also works completly stable, as well as 1000 / 1525, but 1200 / 1450 is unstable and I get a black screen after a while.
> Temps are absolutely fine.
> And yeah im using 13.12 with Windows 7 64 bit.
> My card is a Sapphire R9 290X Reference @ Accelero Hybrid.


Quote:


> Originally Posted by *hammelgammler*
> 
> Yes i raised it to +50%, max.
> As for 1080p, you are right, but im gaming at 2560x1440, and it seems like memory is doing a "good job" there.
> The R9 290X Lightning @ 1150 / 1625 is 1 Fps slower at 1440p then my settings with 1200 / 1440.


yeah, i read that memory oc helps in higher rez.

Quote:


> Originally Posted by *DrClaw*
> 
> thanks man + rep
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *i will use asus gpu tweak to change the core voltage and bring core clock to 947 and mem clock to stock 290
> wish me luck*
> im hoping to literally transform the powercolor pcs 290 back into a stock 290, without changing the heatsink
> i should be able to reduce heat output by alot, i notice when gaming over time, the amount of heat accumulates in the case, i wish the non-ref cards were more of an exhaust type card with just better heatsinks
> oh why thankyou
> 
> 
> 
> 
> 
> 
> 
> 
> so he was talking about droop in the core voltage (VDDC)


it should work. i noticed 13.12 raised my vcore a bit or it could be just my imagination. so, i use 13.11 and 14.3 if i want to play with mantle.


----------



## stilllogicz

On these waterblocks:

http://www.ekwb.com/shop/blocks/vga-blocks/ati-radeon-full-cover-blocks/radeon-rx-200-series/ek-fc-r9-290x-original-csq-nickel.html

Can that chrome\silver\nickel metal part be removed and not used? or possibly painted?


----------



## DrClaw

Quote:


> Originally Posted by *rdr09*
> 
> it should work. i noticed 13.12 raised my vcore a bit or it could be just my imagination. so, i use 13.11 and 14.3 if i want to play with mantle.


hmm im gonna try those drivers later and run some unigine valley benchmarks, ty


----------



## The Mac

Quote:


> Originally Posted by *boldenc*
> 
> I have a question, currently testing a TRI-X 290x and once I change the memory clock, the GPU Clock will stuck to 3d clocks at idle and only the memory clock will go down.
> Is there a way to fix that?


are you using 14.X? then, no

13.12 works fine however.

Mine actually sticks on boot with no OC applied.


----------



## boldenc

Quote:


> Originally Posted by *The Mac*
> 
> are you using 14.X? then, no
> 
> 13.12 works fine however.
> 
> Mine actually sticks on boot with no OC applied.


13.12 fixed it, also the memory can run now at 1500 with 13.12.
So if someone's memory is not OC'ing well, it could be because of 14.3.


----------



## Imprezzion

Quote:


> Originally Posted by *keikei*
> 
> Sexy! Where would one get one or two of those?


www.coldzero.eu


----------



## taem

Quote:


> Originally Posted by *Imprezzion*
> 
> I just ran 3dmark 11 and got quite the score lol. These are 24/7 clocks as in my sig for my CPU, RAM and card. Also, no driver tweaks, no adjusted tesselation. Result is validated.
> 
> http://www.3dmark.com/3dm11/8141281 P15941 with a Graphics score of well in excess of 18000


Dang your gpu score is so much higher than mine at same clocks. I get 17330 http://www.3dmark.com/3dm11/7991243. But I'm not surprised, my Tri-X clock for clock benches much faster than the Powercolor PCS+. Variance is supposed to be 1-2% but the difference is much bigger than that. The Powercolor PCS+ has godlike cooling but it appears to be an underperformer. But I'm fine with it, I'll trade a few fps for the absurd cooling. A few folks are getting much much bigger underperformance though. I'm still in expected range for a 290.


----------



## Imprezzion

Well, yours is a 290 and mine is a 290X so.. not that weird







:thumb:


----------



## Thorteris

Should I overclock my card on the reference cooler, or should I wait for my Kraken X60 to come in? I have a Kraken G10.


----------



## Roy360

I ordered my reference VisionTek R9 290 from Dell back on Dec.26.

It was delayed twice, and today I finally received it... and they gave me a non-reference card.









Someone tell me that Visiontek is the Zotac of AMD. Someone please tell me that my EK waterblock will fit on this card.


----------



## GhostRyder

Quote:


> Originally Posted by *Roy360*
> 
> I ordered my reference VisionTek R9 290 from Dell back on Dec.26.
> 
> It was delayed twice, and today I finally receive it. and they give me a non reference card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Someone tell me that Visiontek is the Zotac of AMD. Someone please tell me that my EK waterblock will fit on this card.


It is still a reference PCB so your waterblock should fit on it quite nicely.


----------



## taem

Quote:


> Originally Posted by *Imprezzion*
> 
> Well, yours is a 290 and mine is a 290X so.. not that wierd
> 
> 
> 
> 
> 
> 
> 
> :thumb:


Then it turns out my Powercolor 290's performance is actually quite impressive










Your physics score is godly to me. 3770/4770 performance so much better than 4670, I was stupid not to spend the extra $100 here. Let that be a lesson to the folks spec'ing new systems, don't cheap out and get the 4670k if you're going to get a card as powerful as Hawaii.


----------



## Roy360

Quote:


> Originally Posted by *GhostRyder*
> 
> It is still a reference PCB so your waterblock should fit on it quite nicely.


thanks. Looks like I got worried for no reason. Plus I got a free backplate now









now I just wish I had a spare PCIe x16 slot to test it out with before watercooling the card


----------



## MapRef41N93W

Quote:


> Originally Posted by *taem*
> 
> Then it turns out my Powercolor 290's performance is actually quite impressive
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Your physics score is godly to me. 3770/4770 performance so much better than 4670, I was stupid not to spend the extra $100 here. Let that be a lesson to the folks spec'ing new systems, don't cheap out and get the 4670k if you're going to get a card as powerful as Hawaii.


What is your physics score in Firestrike?


----------



## Roy360

Should I keep the aluminum plates? Or use thermal pads like everyone else? I'm leaning towards the latter.


----------



## bbond007

My first Gigabyte Windforce died within a few hours; after sending that back, I bought 2x MSI R9 290X Twin Frozr...

Found out yesterday one had a defective DisplayPort. HDMI and DVI seem fine.

Thankfully it was the one I bought locally at Tiger Direct exactly 30 days ago so I was able to exchange it.

Went from 76% asic to 74.4%







Still, I'm so glad I don't have to RMA again.

ATI VIDEO BOARDS NEVER AGAIN... My luck is not that bad. I promise....


----------



## Tobiman

Quote:


> Originally Posted by *taem*
> 
> Then it turns out my Powercolor 290's performance is actually quite impressive
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Your physics score is godly to me. 3770/4770 performance so much better than 4670, I was stupid not to spend the extra $100 here. Let that be a lesson to the folks spec'ing new systems, don't cheap out and get the 4670k if you're going to get a card as powerful as Hawaii.


That's only noticed in synthetic benchmarks though. A 4770k and 4670k would trade blows in most games today.

On another note, a new powercolor 290 SKU on newegg for cheap. Can't wait to get my hands on the bios.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814131569


----------



## taem

Quote:


> Originally Posted by *MapRef41N93W*
> 
> What is your physics score in Firestrike?


About 8400 at 4.6ghz.
http://www.3dmark.com/fs/1740273
http://www.3dmark.com/fs/1740436
http://www.3dmark.com/fs/1740286
Quote:


> Originally Posted by *Tobiman*
> 
> On another note, a new powercolor 290 SKU on newegg for cheap. Can't wait to get my hands on the bios.
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814131569


Was wondering if they were going to do a 290 TurboDuo. Different heatpipe orientation - this would work in a Silverstone 90-degree rotated mobo case. I've become a big fan of Powercolor due to the 270X Devil and PCS+ 290/X; going forward I'll probably look to these guys first for GPUs. One thing they do right is backplates, a cheap part that a lot of users really like, no excuse not to put one on a $300+ card imho.

And yeah I'd like to get the bios too, and flash the 975/1250/0 vddc adjust settings on the backup switch of my PCS+.


----------



## Brian18741

EK's waterblock for the Asus R9 290 DCU II is out! And no horrible CSQ design! Horay!

http://www.ekwb.com/shop/blocks/vga-blocks/ati-radeon-full-cover-blocks/radeon-rx-200-series/ek-fc-r9-290x-dcii-nickel.html

http://www.ekwb.com/shop/ek-fc-r9-290x-dcii-acetal-nickel.html

edit: actually there is a horrible CSQ design one









http://www.ekwb.com/shop/blocks/vga-blocks/ati-radeon-full-cover-blocks/radeon-rx-200-series/ek-fc-r9-290x-dcii-nickel-original-csq.html


----------



## Tobiman

Taking a look back at my 3D mark 11 scores in the past 6 months.

http://www.3dmark.com/3dm11/8144294

http://www.3dmark.com/3dm11/7728796

http://www.3dmark.com/3dm11/7620731

http://www.3dmark.com/3dm11/7416757

Really need to stop changing cards so often.


----------



## Tobiman

Quote:


> Originally Posted by *taem*
> 
> About 8400 at 4.6ghz.
> http://www.3dmark.com/fs/1740273
> http://www.3dmark.com/fs/1740436
> http://www.3dmark.com/fs/1740286
> Was wondering if they were going to do a 290 turboduo. DIfferent heatpipe orientation, this would work in a Silverstone 90 degree rotated mobo case. I've become a big fan of Powercolor due to the 270X Devil and PCS+ 290/X, going forward I'll probably look to these guys first for gpus. One thing they do right is backplates, a cheap part that a lot of users really like, no excuse not to put one on a $300+ card imho.
> 
> And yeah I'd like to get the bios too, and flash the 975/1250/0 vddc adjust settings on the backup switch of my PCS+.


I already did that with the reference cooler bios. Seems stable thus far and overclocks as well as stock PCS+ bios.


----------



## taem

Quote:


> Originally Posted by *Tobiman*
> 
> I already did that with the reference cooler bios. Seems stable thus far and overclocks as well as stock PCS+ bios.


Which prog did you use to flash? I tried Atiwinflash with my Tri-X, was a disaster. I'm gonna try Atiflash via command prompt off a usb recovery drive next.


----------



## Tobiman

Quote:


> Originally Posted by *taem*
> 
> Which prog did you use to flash? I tried Atiwinflash with my Tri-X, was a disaster. I'm gonna try Atiflash via command prompt off a usb recovery drive next.


I use atiwinflash via DOS on my usb.
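For anyone searching this thread later, the usual sequence from the DOS prompt looks something like this - writing it from memory, so the adapter index and file names are just examples, and always dump your stock BIOS before programming anything:

```
rem list adapters so you know which index your card is
atiflash -i
rem save the current BIOS from adapter 0 as a backup
atiflash -s 0 stock.rom
rem program the new BIOS to adapter 0, then reboot
atiflash -p 0 new.rom
```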


----------



## Ebefren

Here i go ! Changed my glorious Sapphire HD7950 OC 950mhz edition today for ..... THE BEAST !!

http://www.techpowerup.com/gpuz/4wa/

http://www.3dmark.com/3dm/2717900

... Sapphire Tri-X r9 290X OC












...and for anyone asking or searching.... this long card FITS in a Cooler Master 690 II perfectly WITHOUT removing the HDD BAY... coool













... it fits with a few millimeters to spare ...









Anyway, I'm sorry for the low quality of the photos, my fault (next upgrade). Cya around guys.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Imprezzion*
> 
> Yeah, for me they get real audible past 60%. I run 65% forced fanspeed which is not silent but it aint loud either.


Glad I'm not the only one









Think I'm gonna stick mine underwater, just gotta find a block


----------



## MapRef41N93W

Quote:


> Originally Posted by *taem*
> 
> About 8400 at 4.6ghz.
> http://www.3dmark.com/fs/1740273
> http://www.3dmark.com/fs/1740436
> http://www.3dmark.com/fs/1740286


Yeah big difference. My 4770k hits 12500 http://www.3dmark.com/fs/1894931


----------



## bbond007

Quote:


> Originally Posted by *taem*
> 
> Which prog did you use to flash? I tried Atiwinflash with my Tri-X, was a disaster. I'm gonna try Atiflash via command prompt off a usb recovery drive next.


I haven't had a bit of trouble with the DOS command









DOS RULEZ!


----------



## IBIubbleTea

Just wondering, but why are some people using BIOSes from different companies? Do they overclock better and/or perform better?


----------



## DrClaw

Can someone help me please? I want to change the GPU core voltage on the PCS 290.

I tried using Asus GPU Tweak but it doesn't show the GPU core voltage, though I've seen GPU Tweak show the core voltage in decimal form
on a Sapphire Tri-X card, and you could change it on the fly.
The reason is I just want to bring the PCS 290 down to a stock 290.

I know the stock 290 VDDC is 0.984V - someone told me the core voltage, which I'm grateful for - while the PCS 290 runs around 1.031V.

MSI Afterburner lets you adjust 100mV down or up, but you can't see the voltage as a decimal, so I'm not sure how to undervolt the card accordingly.


----------



## Roy360

Is Unigine Heaven Benchmark 4.0 one of those programs that sees no performance gains from Crossfire? Cuz my score with 2 R9 290s is 1244.

I have a feeling the video cards might not be seated in their slots; could that be the reason?


----------



## airisom2

Quote:


> Originally Posted by *DrClaw*
> 
> can someone help me please i want to change the gpu core voltage on the pcs 290
> 
> i tried using asus gpu tweak but it doesnt show the gpu core voltage,but ive seen asus gpu tweak show core voltage in decimal form
> on a sapphire trixx card and you could change it on the fly.
> reason is i just want to bring the pcs 290 to a stock 290
> 
> i know the stock 290 VDDC is 0.984v , someone had told me the core voltage which im grateful for, now the pcs 290 runs around 1.031 core volt
> 
> msi afterburner lets you adjust 100mv down or up but you cant see the voltage as decimal, so im not sure how to undervolt the card accordingly.


GPUTweak only works for Asus bioses. Use hwinfo's sensor status to monitor how many volts are going to your card (VDDC). VDDC varies on a per-card basis. One can be .984, and another could be 1.1v. It just depends.


----------



## bbond007

Quote:


> Originally Posted by *IBIubbleTea*
> 
> Just wondering but why are some people using different bios from different companies? Does it overclock better and/or perform better?


In my case I simply wanted stupid twin frozrs to have the target temp setting in CCC so I could LOWER it.

I have done everything I can with air cooling and now I just want the cards to mine cooler even if it means slower....

The default behavior is to heat up to 94°C and you have no control.


----------



## DrClaw

Quote:


> Originally Posted by *airisom2*
> 
> GPUTweak only works for Asus bioses. Use hwinfo's sensor status to monitor how many volts are going to your card (VDDC). VDDC varies on a per-card basis. One can be .984, and another could be 1.1v. It just depends.


thanks man, btw which one is the core voltage? Is it VOut/VID? I can't find VDDC


----------



## airisom2

Quote:


> Originally Posted by *DrClaw*
> 
> Quote:
> 
> 
> 
> Originally Posted by *airisom2*
> 
> GPUTweak only works for Asus bioses. Use hwinfo's sensor status to monitor how many volts are going to your card (VDDC). VDDC varies on a per-card basis. One can be .984, and another could be 1.1v. It just depends.
> 
> 
> 
> thanks man btw which one is the core voltage? is it vout/vid? i cant find VDDC

vout/vid. My little 5770 shows VDDC in hwinfo


----------



## Arizonian

Quote:


> Originally Posted by *Ebefren*
> 
> Here i go ! Changed my glorious Sapphire HD7950 OC 950mhz edition today for ..... THE BEAST !!
> 
> http://www.techpowerup.com/gpuz/4wa/
> 
> http://www.3dmark.com/3dm/2717900
> 
> ... Sapphire Tri-X r9 290X OC
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> ...and for who asking or searching.... this long card FIT in a Cooler Master 690 II perfectly WITHOUT remove the HDD BAY... coool
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ... fit for a bunch of millimeters ...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway im sorry for the low quality of the photos, my fault. (next upgrade). Cya around guys.


Congrats - added to the roster









Looks great. Post back with your over clocking results


----------



## kizwan

Quote:


> Originally Posted by *Roy360*
> 
> Is Unigine Heaven Benchmark 4.0, one of those programs that see no performance gains from crossfire? Cuz my score with 2 r9 290 is 1244
> 
> I have a feeling video cards might not be sitting their slots, could that be the reason?


Did you forget to enable Crossfire?


----------



## Roy360

Quote:


> Originally Posted by *kizwan*
> 
> Did you forget to enable Crossfire?


Nope, I have it enabled, including that check box under it.

Might be something wrong with my installation though; the BIOS sees both cards and is running them at 8x, but Windows occasionally ignores the 2nd card.

Looks like a fresh install is in order.

EDIT: well, this is interesting... looks like my VisionTek R9 290 is actually an R9 290X, or at least it has more shaders than my XFX card.

VisionTek 

My XFX card: 

Actually, can someone tell me why the second card only has a bus width of 32 bits?


----------



## kizwan

Quote:


> Originally Posted by *Roy360*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Did you forget to enable Crossfire?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> nope have it enabled, including that check box under it.
> 
> Might be something wrong with my installation though, Bios sees both cards and it running them at 8x but Windows occasionaly ignores the 2nd card.
> 
> Looks like a fresh install is in order.
> 
> EDIT: well this is interesting... looks like my VisionTek R9 290 is actually a R9 290X, or at-least it has more shaders than my XFX card.
> 
> VisionTek
> 
> My XFX card:
> 
> actaully, can someone tell me why the second card only has a Bus width of 32bits.

@Roy360, GPU-Z reported wrong info because ULPS is enabled. Your 290 card is still 290.


----------



## Roy360

Quote:


> Originally Posted by *kizwan*
> 
> @Roy360, GPU-Z reported wrong info because ULPS is enabled. Your 290 card is still 290.


thanks. I found out about ULPS just as you posted this. With Crossfire enabled the shaders match, although the VisionTek's pixel fillrate is still higher than the XFX's

but I'm not sure if my cards are working properly; this is my score with the 14.1 beta drivers and Crossfire enabled.



I lost my spreadsheet of my R9 290 overclocks and their relative Valley scores, but I'm pretty sure I got a similar score with one card.


----------



## MapRef41N93W

Quote:


> Originally Posted by *Roy360*
> 
> thanks. I found out about ULPS just as you posted this. WIth Crossfire enabled the shaders match, although the vistontek's pixel fillrate is still higher than the XFX
> 
> but I'm not sure if my cards are working properly. this my score with the 14.1 beta drivers and crossfire enabled.
> 
> 
> 
> I lost my spreadsheet, of my R9 290 overclocks and their relative Valley scores, but I"m pretty sure I got a similar score with one card.


Dude you're in windowed mode... CF doesn't work in windowed.


----------



## Sgt Bilko

GPU-Z does that when the card is in power saving mode, disable ULPS in afterburner and they will show up correctly
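If you'd rather not run Afterburner just for that, ULPS is just a registry value - `EnableUlps` under the display-adapter class keys. An example for one adapter instance, written from memory, so back up the registry and check the value exists before changing it (the `0000`/`0001` instance numbers vary per system):

```
rem set EnableUlps to 0 for the 0000 instance; repeat for 0001 etc. (run as admin, reboot after)
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000" /v EnableUlps /t REG_DWORD /d 0 /f
```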


----------



## kizwan

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roy360*
> 
> thanks. I found out about ULPS just as you posted this. WIth Crossfire enabled the shaders match, although the vistontek's pixel fillrate is still higher than the XFX
> 
> but I'm not sure if my cards are working properly. this my score with the 14.1 beta drivers and crossfire enabled.
> 
> 
> 
> I lost my spreadsheet, of my R9 290 overclocks and their relative Valley scores, but I"m pretty sure I got a similar score with one card.
> 
> 
> 
> Dude you're in windowed mode... CF doesn't work in windowed.

Polar bear facepalm.


----------



## axiumone

Snip - never mind. Someone already answered.


----------



## Roy360

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Dude you're in windowed mode... CF doesn't work in windowed.


thanks, first time crossfiring cards for me.

Only bought the second card, because I couldn't pass up the deal.


----------



## MapRef41N93W

Quote:


> Originally Posted by *Roy360*
> 
> thanks, first time crossfiring cards for me.
> 
> Only bought the second card, because I couldn't pass up the deal.


You should really not use the beta drivers unless you are actually planning on using Mantle. In which case you should use 14.3 which actually makes Mantle work in BF4. Beta drivers cause all sorts of issues with CF.


----------



## 107Spartan

Quote:


> Originally Posted by *MapRef41N93W*
> 
> You should really not use the beta drivers unless you're actually planning on using Mantle, in which case you should use 14.3, which actually makes Mantle work in BF4. Beta drivers cause all sorts of issues with CF.


I had the first beta driver that had Mantle and it killed my hashrates for mining, and I didn't notice a difference in BF4 with it either, so I went back to 13.12... what changed with the 14.3 beta driver?


----------



## 107Spartan

Everyone should put these cards under water. I just got both of my 290Xs swimming with the fishes and they do not go over 45 degrees Celsius while playing BF4 with everything on max at 1440p! Granted, my watercooling loop is rather extensive, but even 55 degrees Celsius for how hot these cards run is amazing if you ask me.


----------



## DeadlyDNA

Beta 14.3 loaded here, mainly for the supposed vsync fix. Well, it's not really fixed. My audio doesn't stutter now with vsync on. However, I'm still getting bad video stutter with vsync on, and even occasionally a frame rendered that is way out of place. It's almost like they are limiting the FPS instead of synchronizing with the monitor refresh rate.

Really sad, as I have been hoping for working vsync since I bought my cards. Anyone else getting the same issue?


----------



## cephelix

hey fellas, just got my R9 290 a few days ago and am really excited about this card.
Got the MSI R9 290 Gaming BF4 version.
Cooling is stock for now but will put them under water in a few months.

Since this is a non-reference card, full-cover blocks won't fit; even the EK cooling configurator shows only universal water blocks as compatible.
I've read a few threads stating that just attaching heatsinks to the VRMs and VRAM may not be sufficient, and was wondering if this is actually true or whether those individuals just got a bad card to begin with.

If it is so, what would my options be for watercooling this card?

Anyways, enough of me talking, below are the GPU-Z pics.


----------



## Roy360

I really hope I'm having driver issues, because it took me forever to add my VisionTek R9 290 to my water loop.

Here's what happened:

1) I left the PC, and when I came back, the monitor said unsupported resolution and I had to restart the PC
2) Windows 8 logo appears, but black screen after... graphics card issue
3) Tried using the iGPU, same story
4) Switched back to PCIe and let Windows do its restore thing, and it uninstalled the display adapters
5) Once in the OS, I completely uninstalled 14.4 and rebooted
6) Tried installing 13.6 (I think); the PC always freezes after the first flash/blink (whatever you call it)
7) Went back to Intel drivers (had to do a rollback). Can no longer extend displays. The HDMI port mimics the DisplayPort, and there doesn't seem to be a way to extend

and now I'm stuck... I really hope this has something to do with ULPS and it's not the VisionTek that is defective, or else I would have to put it back together and ship it back









Once I figure out how to move my DOGEcoin wallet to my laptop, I will probably install Windows 8.1 and try again


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Roy360*
> 
> I really hope I'm having driver issues, because it took me forever to add my VisionTek R9 290 to my water loop.
> 
> Here's what happened:
> 
> 1) I left the PC, and when I came back, the monitor said unsupported resolution and I had to restart the PC
> 2) Windows 8 logo appears, but black screen after... graphics card issue
> 3) Tried using the iGPU, same story
> 4) Switched back to PCIe and let Windows do its restore thing, and it uninstalled the display adapters
> 5) Once in the OS, I completely uninstalled 14.4 and rebooted
> 6) Tried installing 13.6 (I think); the PC always freezes after the first flash/blink (whatever you call it)
> 7) Went back to Intel drivers (had to do a rollback). Can no longer extend displays. The HDMI port mimics the DisplayPort, and there doesn't seem to be a way to extend
> 
> and now I'm stuck... I really hope this has something to do with ULPS and it's not the VisionTek that is defective, or else I would have to put it back together and ship it back


Get the 13.12 WHQL driver.

And use DDU in Safe Mode first to completely clean the PC of all existing display drivers.


----------



## taem

Quote:


> Originally Posted by *107Spartan*
> 
> Everyone should put these cards under water. I just got both of my 290Xs swimming with the fishes and they do not go over 45 degrees Celsius while playing BF4 with everything on max at 1440p! Granted, my watercooling loop is rather extensive, but even 55 degrees Celsius for how hot these cards run is amazing if you ask me.


My Sapphire 290 Tri-X OC in the upper slot, PowerColor PCS+ 290 in the lower slot, air-cooled Crossfire temps in Far Cry 3:


Ambient 22-23. Air is fine if you have good case airflow and pick the cards with the better Hawaii coolers.


----------



## Arizonian

Quote:


> Originally Posted by *cephelix*
> 
> hey fellas, just got my R9 290 a few days ago and am really excited about this card.
> Got the MSI R9 290 Gaming BF4 version.
> Cooling is stock for now but will put them under water in a few months.
> 
> Since this is a non-reference card, full-cover blocks won't fit; even the EK cooling configurator shows only universal water blocks as compatible.
> I've read a few threads stating that just attaching heatsinks to the VRMs and VRAM may not be sufficient, and was wondering if this is actually true or whether those individuals just got a bad card to begin with.
> 
> If it is so, what would my options be for watercooling this card?
> 
> Anyways, enough of me talking, below are the GPU-Z pics.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Have you tried *coolingconfigurator.com* ?

*"How to put your Rig in your Sig"*


----------



## cephelix

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Have you tried *coolingconfigurator.com* ?
> 
> *"How to put your Rig in your Sig"*


Thanks for adding me to the club.
I have tried coolingconfigurator.com actually. It only showed me universal blocks.

Added my rig to my sig... Some parts, especially the mobo and processor, are dated I must admit, and I've been told they might bottleneck my GPU a bit.


----------



## Brian18741

Disable ULPS and they will match.


----------



## Imprezzion

Quote:


> Originally Posted by *cephelix*
> 
> Thanx for adding me to the club..
> I have tried coolingconfigurator.com actually. Only showed me universal blocks.
> 
> Added my rig to my sig...Some parts,especially the mobo n processor are dated i must admit and have been told that it might bottleneck my gpu a bit.


Well, with such a powerful board and cooler you should have no problem getting 4.2GHz+ outta that CPU.


----------



## cephelix

Quote:


> Originally Posted by *Imprezzion*
> 
> Well, with such a powerful board and cooler you should have no problem getting 4.2Ghz+ outta that CPU.


Really? I was thinking of a modest 3.8 actually. My current OC a friend did for me. I've been reading up on OCing, but since I was happy with the 3.6, I left it alone.
Now you've got me itching to get a higher OC...
Good to know... I'll give it a go, along with OCing my 290, and see how far I can get it


----------



## Niberius

Quote:


> Originally Posted by *cephelix*
> 
> Really? I was thinking of a modest 3.8 actually. My current OC a friend did it for me. been reading up on OCing but since I was happy with the 3.6, i left it alone.
> Now you got me itching to get a higher OC...
> Gd to know...I'll give it a go, along with oc-ing my 290 and see how far i can get it


I had that same CPU a while back (i5 760) and had no trouble attaining a rock-stable 4.4GHz overclock with it. P95 stable on all tests for more than 12hrs each, and I used a Thermalright Ultra 120 to keep the temps in check.

It's still a good CPU, but if you are gonna pair it up with a 290 I would clock it as much as possible so it can keep better pace with the GPU.


----------



## cephelix

Great OC... hopefully my chip is stable. Currently ambient temps are around 27 degrees Celsius, and my chip is running at 39 on the hottest core. The 290 is at 44 now according to AB


----------



## Niberius

Quote:


> Originally Posted by *cephelix*
> 
> great OC....hopefully my chip is stable.....currently ambient temps are around 27 deg celcius, and my chip is running at 39 on the hottest core. the 290 is at 44 now according to AB


What are your CPU temps under full load? Like with P95 small FFTs? Also, what is your GPU at under full load?


----------



## cephelix

Last time I tested with a Noctua cooler, I got high 70s. I haven't tested with the new cooler yet. Just got my case, fans, graphics card and cooler a few days ago and haven't had much time for testing


----------



## Niberius

Quote:


> Originally Posted by *cephelix*
> 
> last when i tested with a noctua cooler, i got high 70s.Have not tested mine with the new cooler. Just got my case, fans gfx n cooler a few days ago and havent had much time for testing


Make sure you don't use the auto setting for vcore; it's probably pushing too much voltage.


----------



## cephelix

noted...
thanks. I now know who to look for if i have any questions....
lol


----------



## DrClaw

Anybody recording gaming with Dxtory? Ever since I switched from my old 8800 GT to the PowerColor PCS+ 290, when I try to play the video it has all this rainbow-colored glitchiness over the screen. You can still watch the video, but it's all messed up.

I put it in Sony Vegas and I can see the video fine though.
Then I tried recording with Fraps and the video wasn't bugged anymore. Does Dxtory have issues with AMD cards?


----------



## Aussiejuggalo

Quote:


> Originally Posted by *DrClaw*
> 
> Anybody recording gaming with Dxtory? Ever since I switched from my old 8800 GT to the PowerColor PCS+ 290, when I try to play the video it has all this rainbow-colored glitchiness over the screen. You can still watch the video, but it's all messed up.
> 
> I put it in Sony Vegas and I can see the video fine though.
> Then I tried recording with Fraps and the video wasn't bugged anymore. Does Dxtory have issues with AMD cards?


I record with Dxtory but haven't tried it since putting my 290 in. I'll try it now in Thief and see how it goes









I'm using 13.12 drivers tho


----------



## Niberius

Quote:


> Originally Posted by *cephelix*
> 
> noted...
> thanks. I now know who to look for if i have any questions....
> lol


Been building gaming PCs for over a decade and have been overclocking stuff just as long; PM me anytime you need some help.


----------



## cephelix

Thanks man... I only started 3 years ago... and haven't fiddled around much


----------



## Imprezzion

You also have the benefit of an extra multiplier, as it's an i5 760 compared to the 750.

But, ehm, at what VRM temp should I start to worry? I'm dialing down fan speed now for my OC (1175/1500 @ +100mV) and the core stays cool as can be with the Tri-X even at like 55% fan speed, but the VRMs are starting to hit high 80s on such ''low'' fan speeds.


----------



## Jack Mac

I'd personally only feel comfortable if my VRMs were at or below 80C; mid-high 80s isn't good, but it is still safe. I'd try and find a way to drop the temperatures.


----------



## Roboyto

Quote:


> Originally Posted by *Niberius*
> 
> Make sure you don't use the auto setting for vcore; it's probably pushing too much voltage.


Quote:


> Originally Posted by *cephelix*
> 
> thanx man....i only started 3 yrs ago.....and havent fiddled around much


This is very true. Even with the new Z87 motherboards, they push far too much voltage with auto-tuning or preset overclocks. For example, I just tested a Gigabyte Z87X-UD5H board, and the preset 4.4GHz overclock for an i7 4770K set the voltage to 1.372V.

That is WAY too much for that clock speed on a Haswell chip. After thorough testing I found it 100% stable with 1.265V for 4.5GHz, a healthy voltage drop of 0.107V.
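The saving quoted above is just the difference between the two vcore readings; a quick sanity check of the arithmetic:

```python
# Sanity-check the vcore numbers from the post above.
preset_vcore = 1.372  # V, the board's auto 4.4GHz preset for the 4770K
tuned_vcore = 1.265   # V, manually tuned and tested stable at 4.5GHz

drop = round(preset_vcore - tuned_vcore, 3)
print(f"Manual tuning shaved {drop} V off the preset")  # 0.107 V
```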


----------



## Paul17041993

Quote:


> Originally Posted by *DrClaw*
> 
> Anybody recording gaming with Dxtory? Ever since I switched from my old 8800 GT to the PowerColor PCS+ 290, when I try to play the video it has all this rainbow-colored glitchiness over the screen. You can still watch the video, but it's all messed up.
> 
> *I put it in Sony Vegas and I can see the video fine though*


Sounds like a bugged codec? Have you tried messing with the AMD video playback settings and/or using a different playback program? I use the K-Lite codec pack and play videos in its Media Player Classic. I haven't yet tried out Dxtory though; last I saw it, it was still in very early alpha...


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Paul17041993*
> 
> Sounds like a bugged codec? Have you tried messing with the AMD video playback settings and/or using a different playback program? I use the K-Lite codec pack and play videos in its Media Player Classic. *I haven't yet tried out Dxtory though; last I saw it, it was still in very early alpha...*


When was the last time you saw it? lol, I've been using it for a little over a year and it runs pretty well; it doesn't destroy FPS like Fraps and has so many settings









*@DrClaw* like Paul said, it's probably a bugged codec or something. I tested Dxtory last night in Thief and didn't have the problem you described with video playback.

Question: is it normal, when you have a monitor in both DVI connections, for the startup display stuff (BIOS, Windows splash screen, etc.) to be shown on both?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> When was the last time you seen it? lol, I've been using it for a little over a year and it runs pretty well, doesnt destroy fps like fraps and has so many settings
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *@DrClaw* like Paul said its probably a bugged codec or something, I tested dxtory lastnight on Thief and didnt have the problem you described with video playback
> 
> Question, is it normal when you have a monitor in both DVI connections for the start up display stuff, bios, Windows splash screen etc to be shown on both?


I have both DVIs plugged in (one's a DVI/HDMI converter); the splash screen only pops up on my main, but if you have the monitors mirrored then it will show on both


----------



## trihy

Anyone tried a BIOS like this? http://www.techpowerup.com/vgabios/152285/asus-r9290x-4096-131206-1.html

I wonder if Elpida can boot with 1350MHz as stock... ASUS has Elpida on many 290Xs; maybe this BIOS is using higher VRAM voltage?


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I have both DVI plugged in (ones a DVI/HDMI converter) the splash screen only pops up on my main, but if you have the monitors mirrored then it will show on both










That's weird, my monitors aren't mirrored and it does it. It did it before I had drivers installed too, when I was in Windows; that was really weird


----------



## Sgt Bilko

Quote:


> Originally Posted by *trihy*
> 
> Anyone tried bios like this? http://www.techpowerup.com/vgabios/152285/asus-r9290x-4096-131206-1.html
> 
> I wonder if elpida can boot with 1350mhz as stock... asus has elpida on many 290x, maybe this bios is using higher vram voltage?


My Elpida hits 1500-1550MHz depending on what I'm running.


----------



## King4x4

I hit 1500-1550 with aux voltage at 100


----------



## trihy

Well... nice to hear that. I cannot pass 1300 without screen corruption, so I wonder how ASUS ships a 1300MHz BIOS... If you check the TechPowerUp BIOS database, only 2 or 3 BIOSes go above 1200MHz...

Even OC editions come with a conservative 1200MHz...

I'll give aux voltage a try; I wasn't sure if it's related to RAM voltage...


----------



## MapRef41N93W

My ASUS Elpida card is technically "stable" at 1500 mem, but I get higher scores at 1350, so I guess that means it's just erroring a bunch at that OC.
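GDDR5 has on-die error detection with retries, which is the usual explanation for a "stable" higher memory clock scoring lower: retried transfers eat into the bandwidth the overclock was supposed to add. For a sense of scale, a quick sketch of theoretical peak Hawaii memory bandwidth at the clocks being discussed (512-bit bus, GDDR5 quad data rate; real throughput drops below this once retries kick in):

```python
# Theoretical peak memory bandwidth for a Hawaii card (R9 290/290X):
# 512-bit bus, GDDR5 moves 4 bits per pin per memory clock (quad data rate).
BUS_WIDTH_BITS = 512
GDDR5_DATA_RATE = 4  # transfers per memory clock

def bandwidth_gbps(mem_clock_mhz: float) -> float:
    """Peak bandwidth in GB/s for a given memory clock in MHz."""
    bits_per_second = mem_clock_mhz * 1e6 * GDDR5_DATA_RATE * BUS_WIDTH_BITS
    return bits_per_second / 8 / 1e9  # bits -> bytes -> GB

for clk in (1250, 1350, 1500):  # stock 290X mem clock, plus two OCs from the thread
    print(f"{clk} MHz -> {bandwidth_gbps(clk):.0f} GB/s")
```

Stock 1250MHz works out to the 320 GB/s AMD quotes for the 290X, so the OCs above are single-digit percentage bandwidth gains at best — easily wiped out by EDC retries.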


----------



## trihy

Huh... I set aux voltage to +75 and now I can run 3DMark 11 at 1500MHz... that's crazy... it was unstable even at 1301...


----------



## DrClaw

How do you guys back up your GPU's BIOS?
I went to the PowerColor website here: http://www.powercolor.com/Global/products_features.asp?id=523

There's no BIOS to download; I want to change the voltage and things back to a stock 290


----------



## Sgt Bilko

Quote:


> Originally Posted by *King4x4*
> 
> I hit 1500-1550 with aux voltage at 100


I should give AUX voltage a go sometime; mine will run 1500 straight from stock... curious to see how high I can get it


----------



## trihy

Well... lowered aux to +50... still stable at 1500MHz... don't know what to think...

Before using aux, it showed garbage all over the screen at 1300... and a black screen at 1350...

Edit: Now it's really weird... lowered aux to 0... still works at 1500... did a reboot in case Afterburner wasn't lowering the aux voltage...

After the reboot... can go to 1500MHz with aux at 0... even 1600... stable. Something weird is happening here... can't make the mem crash...


----------



## Sgt Bilko

Quote:


> Originally Posted by *trihy*
> 
> Well... lowered aux to +50... still stable at 1500MHz... don't know what to think...
> 
> Before using aux, it showed garbage all over the screen at 1300... and a black screen at 1350...
> 
> Edit: Now it's really weird... lowered aux to 0... still works at 1500... did a reboot in case Afterburner wasn't lowering the aux voltage...
> 
> After the reboot... can go to 1500MHz with aux at 0... even 1600... stable. Something weird is happening here... can't make the mem crash...


Most likely a driver error that was fixed when you rebooted


----------



## cephelix

Heya,

Just ran AC4 for about 15-20mins with almost everything on high settings and saw in Afterburner that my temps were in the low 70s (deg celsius)

Wondering if this is normal...
Card: MSI R9 290 Gaming, only factory OCed, Twin Frozr IV
Resolution: 1920x1200 if it helps any


----------



## Iniura

Quote:


> Originally Posted by *DrClaw*
> 
> How do you guys back up your GPU's BIOS?
> I went to the PowerColor website here: http://www.powercolor.com/Global/products_features.asp?id=523
> 
> There's no BIOS to download; I want to change the voltage and things back to a stock 290


In GPU-Z, click the black chip with the green arrow pointing to the right; it's just under the logo of your GPU.


----------



## Sgt Bilko

Quote:


> Originally Posted by *cephelix*
> 
> Heya,
> 
> Just ran AC4 for about 15-20mins with almost everything on high settings and saw in Afterburner that my temps were in the low 70s (deg celsius)
> 
> Wondering if this is normal...
> Card: MSI R9 290 Gaming, only factory OCed, Twin Frozr IV
> Resolution: 1920x1200 if it helps any


Yep, those are OK temps; running as intended









These cards run hot... but fast. I still haven't found the maximum clocks for my 290s yet, because heat is the issue, not volts


----------



## trihy

It's not a driver problem, guys...

After a reboot I can get 1500MHz stable with aux at 0. I couldn't even get 1300 before.

Now I'm running 3DMark 11 at 1625MHz (Afterburner's limit), fully stable with aux at 0...

Elpida woke up? What kind of sorcery is this?


----------



## cephelix

Quote:


> Originally Posted by *Sgt Bilko*
> 
> yep, that's ok temps, running as intended
> 
> 
> 
> 
> 
> 
> 
> 
> 
> These cards run hot.....but fast, I still haven't found the maximum clocks for my 290's yet because heat is the issue not volts


Thanks for that...
Will keep an eye on the VRM temps.....
After reading posts on the 290, some were unhappy with the temps of the VRMs when using universal waterblocks.
This makes me reconsider actually putting my card under water vs just keeping it on stock cooling


----------



## Sgt Bilko

Quote:


> Originally Posted by *trihy*
> 
> It's not a driver problem guys...
> 
> After reboot can get 1500mhz stable with aux at 0. Can't even get 1300 before.
> 
> Now I'm running 3dmak11 with 1625mhz (afterburner limit) fully stable with aux at 0...


Aux is the extra voltage supplied through the PCIe slot; these cards have no dedicated memory voltage controller (only the 290X Lightning does).

The most likely cause was the driver bugging out (without you noticing) and causing unstable memory clocks; then when you rebooted, everything was back to normal.


----------



## Sgt Bilko

Quote:


> Originally Posted by *cephelix*
> 
> Thanks for that...
> Will keep an eye on the VRM temps.....
> After reading posts on the 290, some were unhappy with the temps of the VRMs when using universal waterblocks.
> This makes me reconsider actually putting my card under water vs just keeping it on stock cooling


Universal blocks don't cool the VRMs, so you would need to put heatsinks on them yourself or buy a full-cover block (the best option)

VRMs are fine at under 100C; as a preference I'd like them under 90C as much as possible. Mine actually hit 128C today when benching (took my eyes off HWiNFO for a few seconds







)


----------



## trihy

But the RAM on this card couldn't pass 1300MHz since I bought it three months ago.

And now it can do 1625 under the same conditions...

Maybe AMD blocks mem OC in some way...


----------



## Sgt Bilko

Quote:


> Originally Posted by *trihy*
> 
> But the RAM on this card couldn't pass 1300MHz since I bought it three months ago.
> 
> And now it can do 1625 under the same conditions...
> 
> Maybe AMD blocks mem OC in some way...


Yeah, that's kinda weird.

No, AMD does not block memory OC in any way.


----------



## trihy

Yeah... I don't get it. My Elpida mem couldn't do 1300 and now does 1625.

The only thing I can think of... it's Afterburner applying the +75 aux forever despite resetting it to 0... a software bug? Anyone using aux voltage: can you check if resetting it to 0 makes the mem OC unstable again?

Afterburner b18 and Catalyst 14.3 beta here.


----------



## JordanTr

Are you sure it's stable? I can run 3DMark and Heaven benchmarks for ages on my Elpida at 1500MHz, but if I try to play Crysis 2 MP, anything above 1300MHz gives me a black screen after 10-15 minutes and I have to reset my PC. Check real-world gaming instead.


----------



## cephelix

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Universal blocks don't cool the vrms so you would need to put heatsinks on them yourself or buy a full cover block (best option)
> 
> vrms are fine at under 100c, as a preference i'd like them under 90c as much as possible, Mine actually hit 128c today when benching (took my eyes off HWiNFO for a few seconds
> 
> 
> 
> 
> 
> 
> 
> )


Yup, planning to attach heatsinks to the VRMs. Checked coolingconfigurator.com; it only shows me universal blocks


----------



## trihy

I'm running the Sleeping Dogs bench with no problem...

But the point is... before, setting 1300 gave an instant black screen after hitting apply in AB, and now it's OK even at 1625...


----------



## Forceman

Quote:


> Originally Posted by *trihy*
> 
> Yeah... I don't get it. My Elpida mem couldn't do 1300 and now does 1625.
> 
> The only thing I can think of... it's Afterburner applying the +75 aux forever despite resetting it to 0... a software bug? Anyone using aux voltage: can you check if resetting it to 0 makes the mem OC unstable again?
> 
> Afterburner b18 and Catalyst 14.3 beta here.


What does GPU-Z show for the VDDCI voltage? Default is 1.0V I believe, so if it's not that then the overvoltage didn't reset.


----------



## trihy

It shows 1000, so I guess it's stock... (I think I need to check this under load...)

But hey, new info...

I flashed another BIOS to the card (a reference one too) and it went back to normal: 1300... instant garbage / black screen.

After setting aux voltage to +75 again, 1625MHz stable... reverted aux to 0, still 1625 stable; rebooted, still 1625 stable. So AB changes something that won't be reverted. Maybe a registry change that stays forever because of an AB bug... not sure.

Also not sure what would happen if I went back to the previous BIOS... but once you set aux... mem stability stays forever.


----------



## Stay Puft

I see the MSI Gaming 290 is getting cheaper and cheaper on Amazon. Reviews say it absolutely sucks for overclocking. Anyone have it? Results? Could it do 1300 core under the right conditions?


----------



## velocityx

Quote:


> Originally Posted by *trihy*
> 
> It shows 1000, so I guess it's stock... (I think I need to check this under load...)
> 
> But hey, new info...
> 
> I flashed another BIOS to the card (a reference one too) and it went back to normal: 1300... instant garbage / black screen.
> 
> After setting aux voltage to +75 again, 1625MHz stable... reverted aux to 0, still 1625 stable; rebooted, still 1625 stable. So AB changes something that won't be reverted. Maybe a registry change that stays forever because of an AB bug... not sure.
> 
> Also not sure what would happen if I went back to the previous BIOS... but once you set aux... mem stability stays forever.


Prolly a noob question: aux voltage? I only have core voltage in my AB?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Stay Puft*
> 
> I see the MSI gaming 290 is getting cheaper and cheaper on amazon. Reviews say it absolutely sucks for overclocking. Anyone have it? Results? Could it do 1300 Core under the right conditions?


1300 on air is stretching it but very possible under water. My XFX DD cards can do 1250/1550 on air but not for long... VRM cooling on these is not so great; core cooling is fine though. As I said below, talk to Red about it.
Quote:


> Originally Posted by *cephelix*
> 
> Yup, planning to attach heatsinks to the VRMs. Checked coolingconfigurator.com, only shows me universal blocks


Talk to @Red1776 he has 4 MSI gaming R9 290x's that are going under water with fullcover blocks.

They are ref design i think but you might want to check with red on that one.


----------



## trihy

@velocityx: on the right side of the core voltage slider there is a button that opens aux voltage.

If someone would like to report my experience to the MSI AB team... I think it could be helpful.


----------



## Aussiejuggalo

Something really strange with my card: when it's been under load and then starts cooling down, it makes a kind of weird creaking sound, like when metal expands and contracts
















Besides that and the fan noise this card is awesome. Ran the Sleeping Dogs bench and got an average of 100FPS with everything maxed and AA on normal


----------



## kizwan

Quote:


> Originally Posted by *Roy360*
> 
> I really hope I'm having driver issues, because it took me forever to add my VisionTek R9 290 to my water loop.
> 
> Here's what happened:
> 
> 1) I left the PC, and when I came back, the monitor said unsupported resolution and I had to restart the PC
> 2) Windows 8 logo appears, but black screen after... graphics card issue
> 3) Tried using the iGPU, same story
> 4) Switched back to PCIe and let Windows do its restore thing, and it uninstalled the display adapters
> 5) Once in the OS, I completely uninstalled 14.4 and rebooted
> 6) Tried installing 13.6 (I think); the PC always freezes after the first flash/blink (whatever you call it)
> 7) Went back to Intel drivers (had to do a rollback). Can no longer extend displays. The HDMI port mimics the DisplayPort, and there doesn't seem to be a way to extend
> 
> and now I'm stuck... I really hope this has something to do with ULPS and it's not the VisionTek that is defective, or else I would have to put it back together and ship it back
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Once I figure out how to move my DOGEcoin wallet to my laptop, I will probably install Windows 8.1 and try again


Try re-seating your 290s. I know you're watercooling, but you should be able to re-seat your 290s without draining the loop.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *cephelix*
> 
> Thanks for that...
> Will keep an eye on the VRM temps.....
> After reading posts on the 290, some were unhappy with the temps of the VRMs when using universal waterblocks.
> This makes me reconsider actually putting my card under water vs just keeping it on stock cooling
> 
> 
> 
> 
> 
> 
> 
> Universal blocks don't cool the vrms so you would need to put heatsinks on them yourself or buy a full cover block (best option)
> 
> vrms are fine at under 100c, as a preference i'd like them under 90c as much as possible, Mine actually hit *128c* today when benching (took my eyes off HWiNFO for a few seconds
> 
> 
> 
> 
> 
> 
> 
> )
Click to expand...









What benching software did you run? Voltage?
Quote:


> Originally Posted by *trihy*
> 
> But the ram on this card cannot pass 1300mhz since I'd buyed it three months ago.
> 
> And now can make 1625 under the same conditions...
> 
> Maybe amd blocks mem oc in some way...


I'm not surprised by this weird phenomenon. If I put the Elpida card in the second PCIe slot, it feels like it under-performs in Crossfire. When I put it in the first PCIe slot, it runs well in Crossfire without any issue.


----------



## taem

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 1300 on air is stretching it but very possible under water, my XFX DD cards can do 1250/1550 on air but not for long......vrm cooling for these is not so great, core cooling is fine though. As i said below, talk to Red about it.
> Talk to @Red1776 he has 4 MSI gaming R9 290x's that are going under water with fullcover blocks.
> 
> They are ref design i think but you might want to check with red on that one.


1300 is really high for Hawaii. I don't know that temps are the issue; that's just a massive OC that most chips, I'm guessing, can't do. My card maxes out at 1220. I can keep the temps in the 70s on the core and 80s on VRM1, but the card just can't go higher. Maybe I could squeeze out another 10-20MHz with VRM1 at sub-70C, but I don't think I'm temp-limited here.

I would be very surprised if more than a handful of folks here could hit 1230-1250 no matter the cooling solution.


----------



## Forceman

Quote:


> Originally Posted by *taem*
> 
> 1300 is really high for Hawaii. I don't know that temps are the issue, that's just a massive oc most chips I'm guessing can't do. My card maxes at 1220, I can keep the temps in the 70s core 80s vrm1, but the card just can't go higher. Maybe I could squeeze out another 10-20 MHz with vrm1 at sub 70c. But I don't think I'm temp limited here.
> 
> I would be very surprised if more than a handful of folks here could hit 1230-1250 no matter the cooling solution.


Yeah, these chips just don't overclock that far unless you are willing to push high volts through them. I'm not sure anyone has done 1300 on a stock BIOS - most of those clocks are on PT1/PT3 I think.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> 
> 
> 
> 
> 
> 
> 
> What benching software did you run? Voltage?


Catzilla 1440p Crossfire, trying to improve on this: http://hwbot.org/submission/2518655_

As for volts, I wasn't watching actually; +150 in Trixx though
Quote:


> Originally Posted by *taem*
> 
> 1300 is really high for Hawaii. I don't know that temps are the issue, that's just a massive oc most chips I'm guessing can't do. My card maxes at 1220, I can keep the temps in the 70s core 80s vrm1, but the card just can't go higher. Maybe I could squeeze out another 10-20 MHz with vrm1 at sub 70c. But I don't think I'm temp limited here.
> 
> I would be very surprised if more than a handful of folks here could hit 1230-1250 no matter the cooling solution.


rdr09 has hit 1300. I'm pretty confident I could too if I had better cooling, and considering I have "crappy" XFX cards, I'd imagine a lot of others could as well.

290s should clock higher than 290Xs due to fewer shaders, but it's still up in the air atm, I guess.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Forceman*
> 
> Yeah, these chips just don't overclock that far unless you are willing to push high volts through them. I'm not sure anyone has done 1300 on a stock BIOS - most of those clocks are on PT1/PT3 I think.


Well, I'll have these under water at some stage (probably 6 months away if I'm honest). I can hit 1250 on both cards; I haven't tried any higher than that due to temps, so I still have a bit more room left to play with.


----------



## Ebefren

I tried a little OC today on my Sapphire Tri-X. The base clock is 1040; I pushed it to 1100 MHz at default voltage with +20 on the power limit using Trixx. I ran some benchmarks (Valley, 3DMark) and it was super stable at 75C max temp (fan on automatic!). BUT when I try to raise even the VRAM (from 1300 to 1400) I get numerous video artifacts (on the Windows desktop) and a black screen at the end; the PC had to be rebooted manually.

Anyone have an explanation for that?! It's really strange. With my HD 7950 I can push the VRAM to 1500 with no problem. OK, the GPU is quite different, but a black screen? Never seen that.


----------



## Forceman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well i'll have these under water at some stage (probably 6 months away if i'm honest), i can hit 1250 on both cards, i haven't tried any higher than that due to temps, i still have a bit more room left to play with.


From watching the 3DMark threads, it doesn't appear that going on water helps these cards as much as it did 7970s. They seem to hit a wall that isn't temperature related, and there aren't many day-to-day 1300 cards around. Of the 20 or so 290/290X cards in those threads, only two submissions are over 1300. I guess it just depends what kind of voltage you are comfortable with, though. But I wouldn't say 1300 is likely, even if it is possible.
Quote:


> Originally Posted by *Ebefren*
> 
> I tried a little OC today on my Sapphire Tri-X. The base clock is 1040; I pushed it to 1100 MHz at default voltage with +20 on the power limit using Trixx. I ran some benchmarks (Valley, 3dMark) and it was super stable at 75C max temp (fan on automatic!). BUT when I try to raise even the VRAM (from 1300 to 1400) I get numerous video artifacts (on the Windows desktop) and a black screen at the end; the PC had to be rebooted manually.
> 
> Anyone have an explanation for that?! It's really strange. With my HD 7950 I can push the VRAM to 1500 with no problem. OK, the GPU is quite different, but a black screen? Never seen that.


Short answer is that they just don't overclock the memory all that well. You can try adding more core voltage, which seems to help, or put in some AUX voltage (which helps some people), but don't expect Tahiti-level memory overclocks with the new Hawaii memory controller. On the plus side, the 512-bit bus means you get crazy bandwidth even without super high clocks, so it all kind of washes out in the end.
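The bandwidth point is easy to sanity-check with napkin math. A minimal sketch (assuming GDDR5's 4x-per-pin effective data rate; the 1250 MHz Hawaii stock clock and a 1500 MHz Tahiti OC are just illustrative figures from this thread, and `gddr5_bandwidth_gbs` is a made-up helper name):

```python
# Rough peak-memory-bandwidth comparison: Hawaii's wide bus vs. an overclocked Tahiti.
# GDDR5 transfers 4 bits per pin per command-clock cycle, so
# bandwidth (GB/s) = bus_width_bits / 8 * clock_MHz * 4 / 1000.

def gddr5_bandwidth_gbs(bus_width_bits: int, clock_mhz: int) -> float:
    """Peak GDDR5 bandwidth in GB/s for a given bus width and command clock."""
    return bus_width_bits / 8 * clock_mhz * 4 / 1000

# Hawaii (290/290X): 512-bit bus at the stock 1250 MHz memory clock.
hawaii_stock = gddr5_bandwidth_gbs(512, 1250)   # 320.0 GB/s
# Tahiti (7970) overclocked to 1500 MHz on its 384-bit bus.
tahiti_oc = gddr5_bandwidth_gbs(384, 1500)      # 288.0 GB/s

print(f"Hawaii @ stock 1250 MHz: {hawaii_stock:.0f} GB/s")
print(f"Tahiti @ OC 1500 MHz:    {tahiti_oc:.0f} GB/s")
```

Even at stock memory clocks the 512-bit bus already out-delivers a heavily overclocked 384-bit card, which is why modest Hawaii memory OCs "wash out in the end."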


----------



## Tobiman

I can get my core clock to 1250mhz for benches and probably a bit more but vrm temps will be nuts.


----------



## Tobiman

Quote:


> Originally Posted by *Ebefren*
> 
> I tried a little OC today on my Sapphire Tri-X. The base clock is 1040; I pushed it to 1100 MHz at default voltage with +20 on the power limit using Trixx. I ran some benchmarks (Valley, 3dMark) and it was super stable at 75C max temp (fan on automatic!). BUT when I try to raise even the VRAM (from 1300 to 1400) I get numerous video artifacts (on the Windows desktop) and a black screen at the end; the PC had to be rebooted manually.
> 
> Anyone have an explanation for that?! It's really strange. With my HD 7950 I can push the VRAM to 1500 with no problem. OK, the GPU is quite different, but a black screen? Never seen that.


14.x drivers are unstable atm since they are still in the beta stage. 13.12 will fix that issue, so I'd suggest you switch to it. Uninstall with DDU first, then install the 13.12 drivers.


----------



## taem

Quote:


> Originally Posted by *Forceman*
> 
> From watching the 3DMark threads, it doesn't appear that going on water helps these cards as much as it did 7970s. They seem to hit a wall that isn't temperature related, and there aren't many day-to-day 1300 cards around. There are only two submissions that are over 1300, of the 20 or so 290/290X cards in those threads. I guess it just depends what kind of voltage you are comfortable with though. But I wouldn't say 1300 is likely, even if it is possible.
> Short answer is that they just don't overclock the memory all that well. You can try adding more core voltage, which seems to help, or put in some AUX voltage(which helps some people), but don't expect Tahiti-level memory overclocks with the new Hawaii memory controller. On the plus side, the 512-bit bus means you get crazy bandwidth even without super high clocks, so it all kind of washes out in the end.


On 3DMark there are only a few dozen 290Xs that hit 1300 core, out of something like 15,000 single-GPU results on any driver.


----------



## Ebefren

Quote:


> Originally Posted by *Forceman*
> 
> Short answer is that they just don't overclock the memory all that well. You can try adding more core voltage, which seems to help, or put in some AUX voltage(which helps some people), but don't expect Tahiti-level memory overclocks with the new Hawaii memory controller. On the plus side, the 512-bit bus means you get crazy bandwidth even without super high clocks, so it all kind of washes out in the end.


Thanks, I figured it was something like that (I read some reviews).

Quote:


> Originally Posted by *Tobiman*
> 
> 14.X drivers are unstable atm since they are still in beta stage. 13.12 will fix that issue so i'd suggest you switch to it. Uninstall with DDU first then install 13.12 drivers.


Yes, I use 14.3... but I need it because of Mantle... I think. (Maybe that was with the HD 7950; now I don't know.)


----------



## mojobear

Quote:


> Originally Posted by *trihy*
> 
> @velocityx. On the right side of the core voltage there is a button that opens aux voltage.
> 
> If someone would like to report my experience to msi ab team.. I think it could be helpful.


Hey Trihy..... I had the same experience as you. My memory would not OC over 1400, but after increasing AUX voltage I could get 1500+. After switching to Trixx to OC my cards instead of Afterburner, the memory potential reset itself and I was back to my old 1400. Interesting stuff though!


----------



## Tobiman

Quote:


> Originally Posted by *Ebefren*
> 
> Thanks, I figured it was something like that (I read some reviews).
> 
> Yes, I use 14.3... but I need it because of Mantle... I think. (Maybe that was with the HD 7950; now I don't know.)


I had the same problem with 14.2 and it totally disappeared after I reverted back to 13.12.


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cephelix*
> 
> Yup, planning to attach heatsinks to the VRMs. Checked coolingconfigurator.com, only shows me universal blocks
> 
> 
> 
> Talk to @Red1776 he has 4 MSI gaming R9 290x's that are going under water with fullcover blocks.
> 
> They are ref design i think but you might want to check with red on that one.
Click to expand...

Looks like the EK water block is not compatible with the MSI Gaming 290/290X cards.

http://www.overclock.net/t/1474896/what-water-block-will-fit-my-msi-r9-290-twin-frozr/10#post_21962732


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> Looks like the EK water block is not compatible with the MSI Gaming 290/290X cards.
> 
> http://www.overclock.net/t/1474896/what-water-block-will-fit-my-msi-r9-290-twin-frozr/10#post_21962732


Quote:


> Originally Posted by *derickwm*
> 
> My boss informed me this morning that a revised block that will fit the MSI Twin Frozr 290(x) will be out in a month or so.


In the grand scheme of things, 4 weeks or so isn't that long to wait.....

Hope there is a way of telling the revised blocks apart though


----------



## airisom2

Well guys, I got my PCS+ 290Xs in, and I'm pretty upset. I've been fooling around with them all day, and I can't get either one of them to perform better than the 290 I had. I remember on my 290, I got 64.7fps on Valley clocked at 1150/1450. The PCS+ 290X gets 63.9fps at the same clocks. Crossfire works as it should, essentially doubling the performance in the stuff that supports it, but the performance per card is really off, giving me lower than expected crossfire results.

I've reinstalled Windows 8.1, tried 13.11 Beta 9.5, 13.12 WHQL (both of these perform the same), and 14.3 Beta 1.0 (performs about 3fps lower), using DDU to uninstall each one. I've also disabled ULPS, and that didn't seem to do anything.

My Firestrike score is also lower than what I'm seeing online. I matched the clockspeeds of the Tri-X 290X (PCPer) to my PCS+, and it scores about 5% worse than it (Tri-X: 9934 Combined, 11260 Graphics Score. PCS+: 9349 Combined, 10636 Graphics Score). They even perform worse than G3D's PCS+ 290X, which already had performance problems (Firestrike: 9489 vs. 9784).

Also, comparing taem's Valley results to mine, I'm getting lower scores than he is (58.8 vs 62.9 fps). Keep in mind he has a PCS+ 290 non-X, and I lowered the core clock (1040MHz vs. 1050MHz) so that it matched his.

Does anyone know what's going on here?

Edit: And if this matters, both have Elpida.

Edit 2: +50% Power Limit for all of this.
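For anyone following along, the gap airisom2 is describing can be put in percentage terms. A quick sketch (scores taken from the post above; `deficit_pct` is just a hypothetical helper name):

```python
# Quantify how far a card falls short of a reference result at matched clocks.
def deficit_pct(reference: float, measured: float) -> float:
    """Percentage shortfall of `measured` relative to `reference`."""
    return (reference - measured) / reference * 100

# Numbers from the post above:
# Firestrike graphics score, Tri-X 290X reference vs. this PCS+ 290X.
print(f"Firestrike graphics: {deficit_pct(11260, 10636):.1f}% down")
# Valley fps, taem's PCS+ 290 vs. this PCS+ 290X at matched core clock.
print(f"Valley:              {deficit_pct(62.9, 58.8):.1f}% down")
```

Both comparisons land in the 5-7% range, which is well outside normal run-to-run variance and consistent with a genuinely underperforming card rather than benchmark noise.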


----------



## IBIubbleTea

Hey guys, I'm planning on overclocking my R9 290 once I receive my water cooling components, and I was wondering what a good max voltage is for 24/7 use without killing it too quickly.


----------



## cephelix

Quote:


> Originally Posted by *kizwan*
> 
> Looks like the EK water block is not compatible with the MSI Gaming 290/290X cards.
> 
> http://www.overclock.net/t/1474896/what-water-block-will-fit-my-msi-r9-290-twin-frozr/10#post_21962732


Thanks... if what they say is true and a revised block is coming out, I'd probably get one of them


----------



## taem

Quote:


> Originally Posted by *airisom2*
> 
> Well guys, I got my PCS+ 290Xs in, and I'm pretty upset. I've been fooling around with them all day, and I can't get either one of them to perform better than the 290 I had. I remember on my 290, I got 64.7fps on Valley clocked at 1150/1450. The PCS+ 290X gets 63.9fps at the same clocks. Crossfire works as it should, essentially doubling the performance in the stuff that supports it, but the performance per card is really off, giving me lower than expected crossfire results.
> 
> I've reinstalled Windows 8.1, tried 13.11 Beta 9.5, 13.12 WHQL (both of these perform the same), and 14.3 Beta 1.0 (performs about 3fps lower), using DDU to uninstall each one. I've also disabled ULPS, and that didn't seem to do anything.
> 
> My Firestrike score is also lower than what I'm seeing online. I matched the clockspeeds of the Tri-X 290X (PCPer) to my PCS+, and it scores about 5% worse than it (Tri-X: 9934 Combined, 11260 Graphics Score. PCS+: 9349 Combined, 10636 Graphics Score). They even perform worse than G3D's PCS+ 290X, which already had performance problems (Firestrike: 9489 vs. 9784).
> 
> Also, comparing taem's Valley results to mine, I'm getting lower scores than he is (58.8 vs 62.9 fps). Keep in mind he has a PCS+ 290 non-X, and I lowered the core clock (1040MHz vs. 1050MHz) so that it matched his.
> 
> Does anyone know what's going on here?
> 
> Edit: And if this matters, both have Elpida.
> 
> Edit 2: +50% Power Limit for all of this.


Check out the Powercolor PCS+ thread; several folks are having issues with underperforming cards, though up to this point all the cards with this issue had Hynix memory. RMA might be your only resolution. http://www.overclock.net/t/1462592/powercolor-pcs-r9-290/300_100#post_21988657

My Valley results are odd; they are too high. I should not be getting ~63fps at 1040/1350, so as I've said before, my Valley results should be disregarded as aberrant. My 3DMark scores are in the expected range though. I'm still trying to figure out what's going on.

Edit: OK, I figured out one thing. I had CCC Texture Filtering set to Performance, and Surface Format Optimization was off. I set everything in CCC back to default, ran Valley/ExtremeHD, and my score went down.

Doesn't change the fact that your card is underperforming though.


----------



## rv8000

Quote:


> Originally Posted by *airisom2*
> 
> Well guys, I got my PCS+ 290Xs in, and I'm pretty upset. I've been fooling around with them all day, and I can't get either one of them to perform better than the 290 I had. I remember on my 290, I got 64.7fps on Valley clocked at 1150/1450. The PCS+ 290X gets 63.9fps at the same clocks. Crossfire works as it should, essentially doubling the performance in the stuff that supports it, but the performance per card is really off, giving me lower than expected crossfire results.
> 
> I've reinstalled Windows 8.1, tried 13.11 Beta 9.5, 13.12 WHQL (both of these perform the same), and 14.3 Beta 1.0 (performs about 3fps lower), using DDU to uninstall each one. I've also disabled ULPS, and that didn't seem to do anything.
> 
> My Firestrike score is also lower than what I'm seeing online. I matched the clockspeeds of the Tri-X 290X (PCPer) to my PCS+, and it scores about 5% worse than it (Tri-X: 9934 Combined, 11260 Graphics Score. PCS+: 9349 Combined, 10636 Graphics Score). They even perform worse than G3D's PCS+ 290X, which already had performance problems (Firestrike: 9489 vs. 9784).
> 
> Also, comparing taem's Valley results to mine, I'm getting lower scores than he is (58.8 vs 62.9 fps). Keep in mind he has a PCS+ 290 non-X, and I lowered the core clock (1040MHz vs. 1050MHz) so that it matched his.
> 
> Does anyone know what's going on here?
> 
> Edit: And if this matters, both have Elpida.
> 
> Edit 2: +50% Power Limit for all of this.


This issue is probably becoming more and more widespread, and it's a shame that the large majority of owners probably won't recognize they may have gotten a bad card from a possible bad batch, but something is up with the PCS+ line of cards. Until I personally receive a working replacement and some form of explanation, I'm going to recommend people don't buy the PCS+ line of 290s.


----------



## taem

Quote:


> Originally Posted by *airisom2*
> 
> Well guys, I got my PCS+ 290Xs in, and I'm pretty upset. I've been fooling around with them all day, and I can't get either one of them to perform better than the 290 I had. I remember on my 290, I got 64.7fps on Valley clocked at 1150/1450. The PCS+ 290X gets 63.9fps at the same clocks. Crossfire works as it should, essentially doubling the performance in the stuff that supports it, but the performance per card is really off, giving me lower than expected crossfire results.


Btw, how are your temps in Crossfire? The cards are 52mm thick and leave so little space for airflow.


----------



## kizwan

Quote:


> Originally Posted by *taem*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *airisom2*
> 
> Well guys, I got my PCS+ 290Xs in, and I'm pretty upset. I've been fooling around with them all day, and I can't get either one of them to perform better than the 290 I had. I remember on my 290, I got 64.7fps on Valley clocked at 1150/1450. The PCS+ 290X gets 63.9fps at the same clocks. Crossfire works as it should, essentially doubling the performance in the stuff that supports it, but the performance per card is really off, giving me lower than expected crossfire results.
> 
> I've reinstalled Windows 8.1, tried 13.11 Beta 9.5, 13.12 WHQL (both of these perform the same), and 14.3 Beta 1.0 (performs about 3fps lower), using DDU to uninstall each one. I've also disabled ULPS, and that didn't seem to do anything.
> 
> My Firestrike score is also lower than what I'm seeing online. I matched the clockspeeds of the Tri-X 290X (PCPer) to my PCS+, and it scores about 5% worse than it (Tri-X: 9934 Combined, 11260 Graphics Score. PCS+: 9349 Combined, 10636 Graphics Score). They even perform worse than G3D's PCS+ 290X, which already had performance problems (Firestrike: 9489 vs. 9784).
> 
> Also, comparing taem's Valley results to mine, I'm getting lower scores than he is (58.8 vs 62.9 fps). Keep in mind he has a PCS+ 290 non-X, and I lowered the core clock (1040MHz vs. 1050MHz) so that it matched his.
> 
> Does anyone know what's going on here?
> 
> Edit: And if this matters, both have Elpida.
> 
> Edit 2: +50% Power Limit for all of this.
> 
> Check out the Powercolor PCS+ thread, several folks having issues with underperforming cards, though up to this point all the cards having this issue were Hynix. RMA might be your only resolution. http://www.overclock.net/t/1462592/powercolor-pcs-r9-290/300_100#post_21988657
> 
> My Valley results are odd, they are too high. I should not be getting ~63fps at 1040/1350. So I've said before my Valley results should be disregarded as aberrant. My 3dmark scores are in expected range though. I'm still trying to figure out what's going on.
> 
> edit, ok I figured out one thing, I had CCC Texture Filtering set to Performance, and Surface Format Optimization was off. I set everything in CCC to default and ran Valley/ExtremeHD and my score went down:
> 
> Doesn't change the fact that your card is underperforming though.
Click to expand...

I was going to ask whether you have tessellation off or not.


----------



## Aussiejuggalo

When you guys run 3DMark 11 & Firestrike, are you doing it at 1080p or 720p?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> When you guys do 3DMark 11 & Fire strike are you doing it on 1080 or 720?


3DMark 11 performance is 720p and Extreme is 1080p

Firestrike is 1080p and Firestrike Extreme is 1440p.

Firestrike and 3DMark 11 Performance would be good choices though


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 3DMark 11 performance is 720p and Extreme is 1080p
> 
> Firestrike is 1080p and Firestrike Extreme is 1440p.
> 
> Firestrike and 3DMark 11 Performance would be good choices though


Ah ok

I ran 3DMark 11 on Extreme before and got, well... not even 5000. Not even gonna attempt to run Firestrike.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Ah ok
> 
> I ran 3DMark 11 on Extreme before and got, well... not even 5000. Not even gonna attempt to run Firestrike.


Well, you have a fairly comparable rig to mine; are you looking for a baseline?

EDIT: here ya go: http://www.3dmark.com/3dm11/7835261


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well you have a fairly comparable rig to me, you looking for a baseline?


Nah, just stuffing around and curious what you guys run. I haven't even bothered to overclock my CPU again.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Nah, just stuffing around; I haven't even bothered to overclock my CPU again.


I posted the only result of mine with around a 5k score. The CPU was at 4.4GHz and the 290 was at stock clocks, 980/1250.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I posted the only result of mine with around a 5k score. The CPU was at 4.4GHz and the 290 was at stock clocks, 980/1250.


lol nice. I just ran a Valley bench.

Not too bad I suppose for everything being at stock apart from my RAM. Those fans get loud after 50% tho.

My scores might not be that great coz I run 3 screens and also have Rainmeter running.


----------



## Germanian

Anyone know average R9 290 reference overclock speeds? My 3x R9 290s all hit 1160 on the core with the VRAM left at the stock 1250, using +100 core voltage and +50 PLL voltage in tri-fire.

Best part is there is still room for OC, but I just wanted to see if they can at least do that speed. I had to ramp the reference fan up to 93% just to be safe for a quick run. No way I would be able to tolerate that noise 24/7 though; it's bad enough around 60%.


----------



## Imprezzion

1160 @ +100mV for a 290 is average at best, indeed. 290s usually get slightly higher clocks than 290Xs, and my 290X runs 1175MHz on +100mV with 1500MHz VRAM. I can bench/game as high as 1650MHz on the VRAM, but it will randomly freeze to a single-colored screen after an hour or so.

I'll run Valley on my 290X as well. I'm curious to see if she can beat my best GTX 780 @ 1400MHz.


----------



## Xylene

Anyone running 3x 1080p on a single 290? If so, does a single card cut it? I don't care about AA.


----------



## airisom2

Well guys, I just submitted an RMA with Powercolor. Hopefully, this all works out.


----------



## Cool Mike

*Powercolor 290x PCS+*
Wow! This is the highest-overclocking Elpida memory I have ever seen. I know of no other 290 or 290X (with Elpida) that can hit 1600 (x4 effective) on stock memory voltage! Running two in Crossfire, both cores at 1150 and memory at 1600, using Sapphire Trixx to get the core voltage to +125mV.


----------



## Xylene

Quote:


> Originally Posted by *Germanian*
> 
> Anyone know average R9 290 reference overclock speeds? My 3x R9 290s all hit 1160 on the core with the VRAM left at the stock 1250, using +100 core voltage and +50 PLL voltage in tri-fire.
> 
> Best part is there is still room for OC, but I just wanted to see if they can at least do that speed. I had to ramp the reference fan up to 93% just to be safe for a quick run. No way I would be able to tolerate that noise 24/7 though; it's bad enough around 60%.


Lulz. I run mine at 100% fan. Between hardcore blasting in my ears and guns with headphones on, I can't hear a damn thing.


----------



## Cool Mike

airisom

I am running two PCS+ cards also; I have had them less than a week. Temps are great. You are right, my Firestrike scores are just slightly higher than the PCS+ 290 card I had. The Elpida memory on the 290X is excellent; I am hitting 1600 gaming and benching, and 1150 on both cores, using Trixx with +125mV on the core. I plan on keeping mine. Got them for $579 plus BF4. I preordered a 4K monitor so I will definitely need two 290Xs.


----------



## Imprezzion

My Tri-X 290X pulled 67.4 FPS on the Valley Extreme preset bench. Is that any good? Clocks at 1175/1500, +100mV.


----------



## airisom2

Quote:


> Originally Posted by *Cool Mike*
> 
> airisom
> 
> I am running two PCS+ cards also; I have had them less than a week. Temps are great. You are right, my Firestrike scores are just slightly higher than the PCS+ 290 card I had. The Elpida memory on the 290X is excellent; I am hitting 1600 gaming and benching, and 1150 on both cores, using Trixx with +125mV on the core. I plan on keeping mine. Got them for $579 plus BF4. I preordered a 4K monitor so I will definitely need two 290Xs.


Well, my Elpida can't go above ~1450 on the first card and ~1420 on the second... both are 1150 core stable, though. I also got mine for $579. Too bad mine are lemons. I would go 4K, but 120Hz has got me; I will never go back to 60Hz. I might get a 120Hz 1440p monitor in the future (not one of those Korean monitors, because 120Hz isn't guaranteed on them), but not now. These cards will be used until 120Hz 4K comes out.


----------



## Jack Mac

Quote:


> Originally Posted by *Imprezzion*
> 
> My Tri-X 290X pulled 67.4 FPS on the Valley Extreme preset bench. Is that any good? Clocks at 1175/1500, +100mV.


Sounds about right if your CPU is at stock.

When my i5 was at stock and my 290 was at 1200/1250, I got 66.1FPS


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Imprezzion*
> 
> My Tri-X 290X pulled 67.4 FPS on the Valley Extreme preset bench. Is that any good? Clocks at 1175/1500, +100mV.


You got 10.7 more frames than I did on my Tri-X at stock lol (too hot for me to try overclocking yet)


----------



## Imprezzion

Quote:


> Originally Posted by *Jack Mac*
> 
> Sounds about right if your CPU is at stock.
> 
> When my i5 was at stock and my 290 was at 1200/1250, I got 66.1FPS


CPU was a 3770K @ 5GHz under water, RAM @ 2179MHz 9-10-10-21-120-1T.


----------



## Imprezzion

Quote:


> Originally Posted by *Imprezzion*
> 
> CPU was a 3770K @ 5GHz under water, RAM @ 2179MHz 9-10-10-21-120-1T.


Quote:


> Originally Posted by *Aussiejuggalo*
> 
> You got 10.7 more frames than I did on my Tri-X at stock lol (too hot for me to try overclocking yet)


Well, even with the low ambients, +100mV and a +50% power target put a hammer down on VRM temps even on something like a Tri-X. I can keep my core plenty cool (barely touching 70C), but the VRMs are hammered, with temps close to 80C at 18C ambient.

I might just have to RMA this card, btw. Even though it clocks great, one of the fans has either an imbalance or a worn bearing, because above 55% fan speed it makes one hell of a rattling noise, and when I stop the fan it's gone.


----------



## Jack Mac

Quote:


> Originally Posted by *Imprezzion*
> 
> CPU was a 3770K @ 5GHz under water, RAM @ 2179MHz 9-10-10-21-120-1T.


Then you should be getting more FPS. I managed to eke out 72.1 FPS from my 290 at 1210/1625 with my i5 at 4.4GHz. Use the tweaks in the Valley thread; that helped me gain 2 FPS.


----------



## Imprezzion

Quote:


> Originally Posted by *Jack Mac*
> 
> Then you should be getting more FPS. I managed to eke out 72.1 FPS from my 290 at 1210/1625 with my i5 at 4.4GHz. Use the tweaks in the Valley thread; that helped me gain 2 FPS.


This was untweaked, with drivers set to max quality: adaptive AA, max-quality texture filtering, and so forth.


----------



## cephelix

Haven't done any benchmarks, only played AC4, but with every option on its highest setting I only get 30fps. Is this normal? Playing at 1920x1200.


----------



## uaedroid

My R9 290X; still trying to optimize the settings..


----------



## MapRef41N93W

Quote:


> Originally Posted by *cephelix*
> 
> Haven't done any benchmarks, only played AC4, but with every option on its highest setting I only get 30fps. Is this normal? Playing at 1920x1200.


Every option at its highest? Does that include 16x AA?

By the way, does anyone know why the Afterburner beta is expiring on the 28th? I'm using the newest beta right off MSI's site and it keeps saying this now. Where is the new beta if this one expires?


----------



## taem

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 3DMark 11 performance is 720p and Extreme is 1080p
> 
> Firestrike is 1080p and Firestrike Extreme is 1440p.
> 
> Firestrike and 3DMark 11 Performance would be good choices though


Fire Strike and 3DMark 11 Performance, the Valley ExtremeHD preset, and the Heaven Extreme preset -- these should be the basis of comparison, since they are free. Just remember to run 3DMark 11 stretched rather than centered. For Fire Strike and 3DMark 11, it's the GPU sub-score that's relevant. For the purposes of this thread, that is.
Quote:


> Originally Posted by *cephelix*
> 
> Haven't done any benchmarks, only played AC4, but with every option on its highest setting I only get 30fps. Is this normal? Playing at 1920x1200.


I don't have that title, but it sounds like double-frame vsync is on.


----------



## Thorteris

How many R9 290Xs reach 1300MHz+? I was wondering if anybody thinks I will reach it with a Kraken G10 + X60, or will my temps still be too high?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *cephelix*
> 
> Haven't done any benchmarks, only played AC4, but with every option on its highest setting I only get 30fps. Is this normal? Playing at 1920x1200.


Turn off vsync


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Thorteris*
> 
> How many R9 290Xs reach 1300MHz+? I was wondering if anybody thinks I will reach it with a Kraken G10 + X60, or will my temps still be too high?


Almost none. 290x over 1300MHz is very rare.

I am 99.9% sure you will not reach those clocks. 1200 is possible.


----------



## Arizonian

Quote:


> Originally Posted by *uaedroid*
> 
> My R9-290X, still trying to optimized the settings..
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Thorteris

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Almost none. 290x over 1300MHz is very rare.
> 
> I am 99.9% sure you will not reach those clocks. 1200 is possible.


Well, thanks. I guess they don't reach high clocks. I will aim for 1200 then.


----------



## Imprezzion

Mine can do 1240MHz at the full +200mV with the no-LLC BIOS, but VRM temps are impossible then, hitting 105-110C at 100% fan speed within minutes lol.


----------



## taem

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Almost none. 290x over 1300MHz is very rare.
> 
> I am 99.9% sure you will not reach those clocks. 1200 is possible.


Like I said earlier, you can go to 3DMark and check out the 290/X results; only a handful out of 15-17,000 results hit 1300. It's not a temp issue.

Imho you should be satisfied with 1200 and very happy with 1250.


----------



## hammelgammler

Does anyone know if it could be my PSU that isn't capable of handling my system?
If I set my R9 290X to over +145mV, I get a crash and reboot when benching Metro 2033.
With, say, +125mV I don't get that crash at the same clocks.
I have a 2500K @ 4.5GHz (1.256V), 10 fans (one 80mm, two 120mm, seven 140mm), an Accelero Hybrid, an external DAC + amp, and some other minor stuff.
My PSU is a be quiet! 680W Gold.
Could it be my PSU?


----------



## taem

Quote:


> Originally Posted by *hammelgammler*
> 
> Does anyone know if it could be my PSU that isn't capable of handling my system?
> If I set my R9 290X to over +145mV, I get a crash and reboot when benching Metro 2033.
> With, say, +125mV I don't get that crash at the same clocks.
> I have a 2500K @ 4.5GHz (1.256V), 10 fans (one 80mm, two 120mm, seven 140mm), an Accelero Hybrid, an external DAC + amp, and some other minor stuff.
> My PSU is a be quiet! 680W Gold.
> Could it be my PSU?


Is it the Straight Power E9? I think this could be the problem:

I'm no expert on PSUs, so maybe others could weigh in, but I generally look for single-rail operation on the 12V line in my PSUs if I'm going to run a power-hungry GPU, with a minimum of 33-40A on the 12V rail. The biggest rail you have is 22A, which at 12V is 264W, and your card might be spiking higher on occasion, causing the crash.
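The rail arithmetic above generalizes to a one-liner worth keeping around. A minimal sketch (the 22A/12V rail figure comes from the post above; the 300W transient is a hypothetical spike value for illustration, not a measured number, and `rail_capacity_watts` is a made-up helper name):

```python
# Per-rail power budget check for a multi-rail PSU.
# A 12V rail rated at some amperage can deliver at most V * I watts;
# if a GPU's transient draw exceeds that, overcurrent protection can trip
# and the system hard-reboots.

def rail_capacity_watts(volts: float, amps: float) -> float:
    """Maximum continuous power one rail can supply."""
    return volts * amps

rail_limit = rail_capacity_watts(12.0, 22.0)  # 264 W, per the 22A rail above
gpu_spike = 300.0  # hypothetical transient draw of a heavily overvolted 290X

print(f"Rail limit: {rail_limit:.0f} W")
if gpu_spike > rail_limit:
    print("Transient exceeds the rail rating -> possible OCP shutdown")
```

This is why a higher voltage offset can crash at the same clocks: the average draw may fit, but the transients on a single 22A rail no longer do.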


----------



## chiknnwatrmln

Quote:


> Originally Posted by *taem*
> 
> Like I said earlier, you can go to 3dmark and check out the 290/x results, only a handful out of 15-17,000 results hit 1300. It's not a temp issue.
> 
> Imho you should be satisfied with 1200 and very happy with 1250.


My point exactly, over 1300MHz core is very rare.

My card does 1245MHz core under water, and that's slightly above average. Most chips will do 1150-1225MHz core if temp is not an issue and they are adequately overvolted.

That's also the reason I'm not super impressed by the Lightning 290Xs: the build quality is nice and the cooler performs well, but the power delivery on reference boards is more than adequate for almost all uses. Lightning 290Xs should be used strictly for cold, imo.


----------



## neurotix

Quote:


> Originally Posted by *Xylene*
> 
> Anyone running 3x 1080p on a single 290? If so, does a single cut it? I don't care about AA.


I ran a 1080p Eyefinity array on one R9 290 Tri-X from January until I upgraded just now. For most of that time I ran it at 1200/1500MHz.

For 99% of DirectX11 games it was more than enough if you use FXAA. Just Cause 2, Sleeping Dogs, Torchlight 2, XCOM, Skyrim w/ texture mods, Battlefield 3 and quite a few others all ran great. With vsync off in a lot of the games I'd get over 70 fps in most areas. In some areas in Skyrim for example it would dip down to 45 but never anything less than that. For most games it was perfectly playable and possibly even overkill (Getting 190 fps in Just Cause 2 in a jet...).

However, there were a small number of games that wouldn't run over 30 fps with everything maxed out, mostly Crysis 3 and Far Cry 3. To get higher fps in those I had to turn most post-processing options to medium. Battlefield 4 was a smooth 60 for the most part at the "Ultra" preset with FXAA, but the fps would dip to 30 in intense action (I only played the campaign; I'm not a big multiplayer guy). I haven't tried Thief yet.

I decided to get another 290 just to be safe and future proof my system for upcoming games.

But yeah, Eyefinity is definitely doable with a single 290, especially if you overclock it. You might need to lower post processing options in some games to get more fps though. If you're one of those people who wants 8x AA on everything (Why?) it probably won't be enough.


----------



## Paul17041993

GCN is capable of up to 1400 on water and 1800 (or more) on nitro/helium; the problem lies more in keeping the voltage stable and not melting your power cables and/or mobo.


----------



## Imprezzion

Or the VRMs, for that matter.. you'd better LN2 the VRMs as well lawl!


----------



## Forceman

Quote:


> Originally Posted by *Paul17041993*
> 
> GCN is capable of up to 1400 on water and 1800 (or more) on nitro/helium; the problem lies more in keeping the voltage stable and not melting your power cables and/or mobo.


Tahiti was, but Hawaii doesn't seem to be as overclock friendly. Are there any 1400 on water cards?


----------



## DrClaw

Quote:


> Originally Posted by *Paul17041993*
> 
> Sounds like a bugged codec? Have you tried messing with the AMD video playback settings and/or using a different playback program? I use the K-Lite codec pack and play videos in its Media Player Classic. Haven't yet tried out Dxtory though; last I saw it, it was still in very early alpha...


I turned off some video settings in AMD Catalyst. It works well enough now, but the quality isn't so great anymore; it's probably something I turned off.


----------



## taem

Quote:


> Originally Posted by *Paul17041993*
> 
> GCN is capable of up to 1400 on water and 1800 (or more) on nitro/helium; the problem lies more in keeping the voltage stable and not melting your power cables and/or mobo.


Where are you getting this? When wizerty set a single gpu record a little while back, he did it with 1470 core on ln2. The quad world record breaker on ln2 ran 1400 or so I think. Smoke also hit like 1400 on ln2 for the dual 290x record. 1800 sounds ridiculous to me. Isn't the record for a titan 1700? Maybe all these records have been broken, dunno. But all these cards were supplied by amd or board partners. A user cannot expect to hit 1400 on water, ln2, uru, or unicorn dust with a card they order off newegg.


----------



## Imprezzion

Quote:


> Originally Posted by *taem*
> 
> Where are you getting this? When wizerty set a single gpu record a little while back, he did it with 1470 core on ln2. The quad world record breaker on ln2 ran 1400 or so I think. Smoke also hit like 1400 on ln2 for the dual 290x record. 1800 sounds ridiculous to me. Isn't the record for a titan 1700? Maybe all these records have been broken, dunno. But all these cards were supplied by amd or board partners. A user cannot expect to hit 1400 on water, ln2, uru, or unicorn dust with a card they order off newegg.


When I see LN2 world records set at 1400-1470MHz, I figure a card that hits 1150-1200 on air ain't even bad at all


----------



## Paul17041993

Quote:


> Originally Posted by *Forceman*
> 
> Tahiti was, but Hawaii doesn't seem to be as overclock friendly. Are there any 1400 on water cards?


Quote:


> Originally Posted by *taem*
> 
> Where are you getting this? When wizerty set a single gpu record a little while back, he did it with 1470 core on ln2. The quad world record breaker on ln2 ran 1400 or so I think. Smoke also hit like 1400 on ln2 for the dual 290x record. 1800 sounds ridiculous to me. Isn't the record for a titan 1700? Maybe all these records have been broken, dunno. But all these cards were supplied by amd or board partners. A user cannot expect to hit 1400 on water, ln2, uru, or unicorn dust with a card they order off newegg.


I would expect a Lightning to be the one to get that high. As I said, voltage is an issue; the sheer size of these chips kills the regulation. It might be a while before we see those clocks, but it's still fairly possible, especially with a fair number of 290s getting to 1300 without insane modding or the sort.


----------



## Thorteris

I see Hawaii doesn't overclock as well as GK110. Anybody know why?


----------



## Cool Mike

Hello, does anyone have the BIOS for the *PowerColor LCS R9 290X (waterblock version) Uber*, please?
TechPowerUp doesn't have it in their database.


----------



## Forceman

Quote:


> Originally Posted by *Thorteris*
> 
> I see Hawaii doesn't overclock as well as GK110. Anybody know why?


Architectural differences, plus Hawaii has higher transistor density which is probably a factor.


----------



## cephelix

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Every option at it's highest? Does that include 16x AA?


Ok, now I have another question. In game there are quite a few AA options; which one is regarded as the highest setting in this case? Noob question, I know.. heh

Quote:


> Originally Posted by *taem*
> 
> Fire Strike and 3DMark11 Performance, Valley ExtremeHD preset, Heaven Extreme preset -- these should be the basis of comparison since they are free. Just remember to run 3DMark11 in stretched rather than centered. For Fire Strike and 3DMark11, it's the gpu sub score that's relevant. For purposes of this thread that is.
> I don't have that title but sounds like double frame v sync is on.


Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Turn off vsync


I did turn on vsync in game; didn't know that it'd drop frame rates that much.
Will mess around with it again tomorrow (no free time today) and get back to you guys with the results


----------



## MrWhiteRX7

Quote:


> Originally Posted by *cephelix*
> 
> Ok, now I have another question. In game there are quite a few AA options; which one is regarded as the highest setting in this case? Noob question, I know.. heh
> 
> I did turn on vsync in game; didn't know that it'd drop frame rates that much.
> Will mess around with it again tomorrow (no free time today) and get back to you guys with the results


Yep, the way it's set up, if you can't maintain a solid 60 fps it will drop you immediately down to 30 fps and lock you there.
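For anyone curious why it snaps to exactly 30: that's classic double-buffered vsync. A frame that misses a 60Hz refresh waits for the next one, so the effective rate falls to the nearest divisor of the refresh rate. A simplified model (real drivers and triple buffering behave differently):

```python
# Simplified model of double-buffered vsync: the effective framerate is
# refresh_hz / n for the smallest integer n the renderer can keep up with.
def vsync_fps(raw_fps: float, refresh_hz: int = 60) -> int:
    """Effective fps under double-buffered vsync (assumes raw_fps > 0)."""
    n = 1
    while refresh_hz / n > raw_fps:
        n += 1  # missed the interval; wait for the next refresh
    return refresh_hz // n

print(vsync_fps(75))  # renders faster than refresh -> locked to 60
print(vsync_fps(55))  # just misses 60 -> snaps down to 30
print(vsync_fps(25))  # -> 20
```

So a game hovering in the 50s looks like a "30 fps lock" the moment vsync is on, which matches the behavior described above.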


----------



## cephelix

Noted, thanks for that.
Really itching to do some benches with my card and see where they stand on air before I OC / put it under water.
Another question: I had 13.11 installed before I saw the 13.12 drivers. So for a problem-free update, should I use the software/methods mentioned on the first page to remove 13.11 before installing 13.12? Or could I just install over 13.11?


----------



## MapRef41N93W

Quote:


> Originally Posted by *cephelix*
> 
> Ok, now I have another question. In game there are quite a few AA options; which one is regarded as the highest setting in this case? Noob question, I know.. heh
> 
> I did turn on vsync in game; didn't know that it'd drop frame rates that much.
> Will mess around with it again tomorrow (no free time today) and get back to you guys with the results


SSAA is the highest-quality form of AA; it can turn undemanding games into system crushers at 8-16x. Generally you just want to use MSAA, or FXAA if you don't have the system resources and don't mind noticeably blurring the image a bit.
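A rough way to see why SSAA scales so badly: it shades every sample, while MSAA and FXAA shade roughly once per pixel. A toy cost model (illustrative assumptions only; it ignores bandwidth, resolve cost, and FXAA's post-process pass):

```python
# Toy relative shading-cost model for AA modes at 1080p. Assumptions:
# SSAA shades every sample; MSAA/FXAA/no-AA shade once per pixel.
WIDTH, HEIGHT = 1920, 1080

def shaded_samples(mode: str, samples: int = 1) -> int:
    pixels = WIDTH * HEIGHT
    if mode == "ssaa":                    # supersampling: full shading per sample
        return pixels * samples
    if mode in ("msaa", "fxaa", "none"):  # shade once per pixel
        return pixels
    raise ValueError(mode)

base = shaded_samples("none")
for mode, s in [("msaa", 8), ("ssaa", 4), ("ssaa", 8)]:
    print(mode, s, shaded_samples(mode, s) / base)
```

Under this model 8x SSAA is an 8x shading bill, which is why it crushes even light games, while MSAA's extra cost is mostly edge coverage and memory rather than shading.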

Quote:


> Originally Posted by *cephelix*
> 
> Noted, thanks for that.
> Really itching to do some benches with my card and see where they stand on air before I OC / put it under water.
> Another question: I had 13.11 installed before I saw the 13.12 drivers. So for a problem-free update, should I use the software/methods mentioned on the first page to remove 13.11 before installing 13.12? Or could I just install over 13.11?


No, you most definitely need to completely uninstall the previous driver first. I recommend using DDU rather than AMD's tool; I once used AMD's tool and it left me with a totally corrupted installation afterwards. DDU will wipe everything out, giving you a clean install.


----------



## taem

Quote:


> Originally Posted by *Paul17041993*
> 
> I would expect a lightning to be the one to get that high, as I said, voltage is an issue, the shear size of these chips kills the regulation, might be a while before we see these clocks but its still fairly possible. especially with a fair amount of 290s getting to 1300 without insane modding or the sort.


Yeah you might be right, right now we're dealing mostly with reference power delivery. Iirc the quad fire 290x that broke the quad titan benching record used a pair of asus dcu2 and a pair of msi gaming, the matrix and lightning weren't out and of the available cards those had the best components. I think the dcu2 has a slightly beefed up power delivery but the msi gaming afaik is 5+1+1. So when the benchers start working with the 24+ phase cards they should hit higher clocks. Which is sick if you think about it since I believe the 290x already holds most of the records.

But I still don't know about 1300 core being a realistic target for users who get custom boards with reference card phases which will be the vast majority. Otoh I don't sub to the benching threads where the high clockers hang out so maybe a lot of folks are already hitting that.


----------



## Thorteris

Quote:


> Originally Posted by *taem*
> 
> Yeah you might be right, right now we're dealing mostly with reference power delivery. Iirc the quad fire 290x that broke the quad titan benching record used a pair of asus dcu2 and a pair of msi gaming, the matrix and lightning weren't out and of the available cards those had the best components. I think the dcu2 has a slightly beefed up power delivery but the msi gaming afaik is 5+1+1. So when the benchers start working with the 24+ phase cards they should hit higher clocks. Which is sick if you think about it since I believe the 290x already holds most of the records.
> 
> But I still don't know about 1300 core being a realistic target for users who get custom boards with reference card phases which will be the vast majority. Otoh I don't sub to the benching threads where the high clockers hang out so maybe a lot of folks are already hitting that.


Yeah, I think I will have trouble reaching 1200. I got one of the press edition cards that were sent out to reviewers, the first ones... unless I got a cherry-picked one. I haven't tried overclocking yet.


----------



## cephelix

Quote:


> Originally Posted by *MapRef41N93W*
> 
> SSAA is the highest-quality form of AA; it can turn undemanding games into system crushers at 8-16x. Generally you just want to use MSAA, or FXAA if you don't have the system resources and don't mind noticeably blurring the image a bit.
> No, you most definitely need to completely uninstall the previous driver first. I recommend using DDU rather than AMD's tool; I once used AMD's tool and it left me with a totally corrupted installation afterwards. DDU will wipe everything out, giving you a clean install.


Ok, will have to spare some time over the weekend to do it. Thanks for the info; really learning a lot from you guys and the forum in general.
Will see if I can unlock mine to a 290X as well, but knowing my luck, I doubt it


----------



## Cool Mike

airsom2,
I own two 290X PCS+ cards also.
I hope you still have yours. Could you send me the Uber BIOS (BIOS date 1/28/14, I believe)? I would really appreciate it. Perhaps you could upload it to the TechPowerUp BIOS archive?

Best regards


----------



## DrClaw

should be here somewhere
http://www.techpowerup.com/gpudb/2397/radeon-r9-290.html


----------



## Cool Mike

It's the 290X version; I can't find it in the TechPowerUp database.


----------



## Tobiman

I'm beginning to have a problem with Valley 1.0. I can't seem to get the normal fps I used to, which was around 60-70 on average. Now I get 20-30. Anyone had this problem before?


----------



## PachAz

I just got my Sapphire Tri-X R9 290 and have started to OC it a bit. So far I have managed 1100 on the core and 1470 on the memory, with a +56 VDDC offset and +50 power limit in the Trixx software. I have run a couple of Heaven and Valley benchmarks as well as the 3DMark 11 Extreme and Fire Strike Extreme tests, with no black screens or artifacts. I don't know if this is stable or not, since I don't play many games except WoT these days.

With 1490 on the memory I got a black screen/freeze in 3DMark, so I put it back to 1470. At 1100 on the core I needed to bump the VDDC up to +56 in Trixx to get rid of artifacts in the Valley Extreme HD benchmark.

Edit: Just played a couple of sessions in WoT and got artifacts, so I stepped the core clock down to 1090. I don't wanna go past +56 VDDC because the GPU gets too hot and the fans spin too loud. I think I'll play around 1090/1470 for now.


----------



## Aussiejuggalo

So I thought I'd screw around on DayZ standalone and see how it goes



That's artifacting, isn't it? Or is it just because the game's not really well optimized for these cards and because I'm running 13.12 drivers?









btw dont mind the grey screen I'm really sick and dont wanna kill myself


----------



## Forceman

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> So I thought I'd screw around on DayZ standalone and see how it goes
> 
> 
> 
> That's artifacting, isn't it? Or is it just because the game's not really well optimized for these cards and because I'm running 13.12 drivers?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> btw dont mind the grey screen I'm really sick and dont wanna kill myself


The artifacting (that is caused by overclocking) I've seen has normally been white squares, not white dots. Does it happen all the time? Could just be a game/driver bug.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Forceman*
> 
> The artifacting (that is caused by overclocking) I've seen has normally been white squares, not white dots. Does it happen all the time? Could just be a game/driver bug.


The card's only got its factory OC







and this was the first time I tried DayZ; it's the only game I've played since getting my card that's had a problem









Edit: what do you guys use for overclocking? Might try lowering my clocks a little.

Edit 2: just had a 0xA000001 BSOD when I opened a YouTube vid


----------



## Imprezzion

The white dots are an issue with ATOC (alpha-to-coverage) used in the Arma engine, especially when combined with any form of AA.

I finally got my card actually stable. On my previous clocks it would sometimes run for hours, and sometimes it would crash in 5 minutes.. That was the VRAM: it ran at 1625MHz @ +100 AUX, which was too much. Dialed it down to 1500MHz @ 0 AUX and it's stable.

Also, I had the core at 1200MHz +150mV, but the VRMs got epicly hot in Far Cry 3 for example. FC3 is still imo the best OC tester, as it artifacts very fast with all the trees and also reaches very, very high temperatures. So, it was stable, but 92C on VRM1 @ 80% fan speed... No thanks.

Dialed that down to 1175MHz +100mV and it's stable as a rock now, with VRM temps in the high 70s.

These cards are a pain to clock because they can *seem* stable but still crash after a long time lol.. Especially the VRAM.

Btw, does the VRAM use different timings or voltages compared to the HD7xxx series? I have Hynix on my card, and I can remember HD7970 cards with the same "R0C" Hynix chips doing like 1700-1800MHz..


----------



## Aussiejuggalo

The Arma engine sucks lol. Anyway, I died, respawned, and it seems to be all good now







now I just hope it doesn't BSOD with craptube again


----------



## Forceman

Quote:


> Originally Posted by *Imprezzion*
> 
> Btw, does the VRAM use different timings or voltages compared to he HD7xxx series? I have Hynix on my card and I can remember HD7970 cards with the same "R0C" Hynix chips doing like, 1700-1800Mhz..


Completely different memory controller - supposed to be simpler than Tahiti - so that's probably the limiting factor on memory overclocks right now.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Imprezzion*
> 
> Mine can do 1240MHz at the full +200mV with the no-LLC BIOS, but VRM temps are impossible then, hitting 105-110C at 100% fan speed within minutes lol.


Both of my cards will do 1250/1500 at +160mV if I can keep the VRM temps below 70C; past that I need more voltage. But as you said... VRM temps are just brutal.


----------



## rdr09

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> The card's only got its factory OC
> 
> 
> 
> 
> 
> 
> 
> and this was the first time I tried DayZ; it's the only game I've played since getting my card that's had a problem
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: what do you guys use for overclocking? Might try lowering my clocks a little.
> 
> Edit 2: just had a 0xA000001 *BSOD when I opened a YouTube* vid


could be hardware acceleration. disable it.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *rdr09*
> 
> could be hardware acceleration. disable it.


Yeah, I figured it might be that; disabled it and it seems to be good









Guess YouTube doesn't like 290s?


----------



## Dan848

Quote:


> Originally Posted by *dansi*
> 
> This is strange.
> 
> i can reach 1100/1400 with +70mv benched stable without artifacts without clock throttling.
> But i got higher 3dmark11 scores at 1050/1400.
> 
> I know recent Radeons have memory ECC; when your VRAM is OCed past the point of stability it will not crash, but your performance takes a hit.
> 
> This is first time i seen it happen with CORE oc.


Too high an overclock on the GPU or memory will lower benchmark scores because of instability with either. Sometimes better cooling helps; however, I am not going to push things that far, too lazy.

My sweet spot is 1119 on the GPU and 1427 on the VRAM. I add no extra voltage to the VRAM and a 12% increase to GPU voltage. I have not played around to see how low I can go with the GPU voltage; I simply made a guess and ran with it.


----------



## trihy

Quote:


> Originally Posted by *mojobear*
> 
> Hey Trihy.....I had the same experience as you, My memory would not OC over 1400 but after increasing aux voltage I could get 1500+. By using trixxx to OC my cards instead of afterburner, the memory potential reset itself and I was back to my old 1400. Interesting stuff though!


Nice to know. I think MSI AB is getting an update this week; let's see if this issue is listed in the changelog...


----------



## Jack Mac

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Yeah, I figured it might be that; disabled it and it seems to be good
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Guess YouTube doesn't like 290s?


Just use chrome to watch YT vids on the 290, I had issues with mine when I used Firefox and YT. I believe it's an issue with the flash player.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Jack Mac*
> 
> Just use chrome to watch YT vids on the 290, I had issues with mine when I used Firefox and YT. I believe it's an issue with the flash player.


Tempting; FF has been buggy and crap since the last update lol

Dying in DayZ also seems to have fixed the artifacting I was getting







kinda weird


----------



## rdr09

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Yeah, I figured it might be that; disabled it and it seems to be good
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Guess YouTube doesn't like 290s?


Afaik it affects both red and green. The weird thing is I can leave it enabled on both my sig rigs and don't get BSODs, on the HD7770 and the 290.


----------



## boldenc

Hello guys, my Sapphire R9 290X Tri-X makes a buzzing rattle only when the fan is running at 44-47%; a light touch on the plastic shroud makes the rattle disappear. Is there something I can do to fix that?


----------



## VulgarDisplay

Quote:


> Originally Posted by *boldenc*
> 
> Hello guys, my Sapphire R9 290X Tri-X makes a buzzing rattle only when the fan is running at 44-47%; a light touch on the plastic shroud makes the rattle disappear. Is there something I can do to fix that?


I think I am experiencing the same issue, but I thought it was coil whine. Not something related to the shroud.


----------



## MapRef41N93W

So apparently anyone using the Afterburner beta is royally screwed as of this Friday. It expires, and the creator has no new version coming out anytime soon (apparently due to the crisis in Ukraine). So anyone who is voltage-unlocking their card is going to have to look at another program like Trixx or GPU Tweak. Way to go, MSI.


----------



## Niberius

Quote:


> Originally Posted by *MapRef41N93W*
> 
> So apparently anyone using the Afterburner beta is royally screwed as of this Friday. It expires, and the creator has no new version coming out anytime soon (apparently due to the crisis in Ukraine). So anyone who is voltage-unlocking their card is going to have to look at another program like Trixx or GPU Tweak. Way to go, MSI.


You are seriously griping about this when the person who programs this utility is in the middle of a crisis? Just use Trixx for the time being; it's not the end of the world if you have to wait a bit.


----------



## MapRef41N93W

Quote:


> Originally Posted by *Niberius*
> 
> You are seriously griping about this when the person who programs this utility is in the middle of a crisis? Just use Trixx for the time being; it's not the end of the world if you have to wait a bit.


You're telling me MSI couldn't hire a temp programmer to work on afterburner while their main person is dealing with the crisis? Yeah sorry not buying that. There are lots of people who rely on Afterburner and they dropped the ball.

Also Trixx fan profiling doesn't work for my second card, so no I will not use it.


----------



## Niberius

Quote:


> Originally Posted by *MapRef41N93W*
> 
> You're telling me MSI couldn't hire a temp programmer to work on afterburner while their main person is dealing with the crisis? Yeah sorry not buying that. There are lots of people who rely on Afterburner and they dropped the ball.
> 
> Also Trixx fan profiling doesn't work for my second card, so no I will not use it.


I tell you what: when they start charging you to use the utility, then you can complain. It's a program provided free of charge, so you really have no basis for complaining about a delay.

This is Unwinder's project, and MSI isn't going to spend more money hiring someone else to program something that you as the user are getting for free.


----------



## VulgarDisplay

So I'm working on overclocking my r9 290 tri-x OC at the moment. It's stable at +75mv, +50% power limit, @1170mhz(still going up), mem still stock while I find max core.

I'm noticing that it's downclocking quite frequently to around 1070mhz while running heaven benchmark on a loop to scan for artifacts. What could be causing this behavior?

My temps are 70C on the core, and 72C on VRM 1 and 50C (is this normal?) on VRM2. Am I just hitting the power limit for the card and throttling? If so then is there much point in pushing it further?


----------



## MapRef41N93W

Quote:


> Originally Posted by *Niberius*
> 
> I tell you what: when they start charging you to use the utility, then you can complain. It's a program provided free of charge, so you really have no basis for complaining about a delay.
> 
> This is Unwinder's project, and MSI isn't going to spend more money hiring someone else to program something that you as the user are getting for free.


It's not a free utility. People spend hundreds of dollars on MSI's products and expect to get software that won't expire. Personally I would have no problem paying for the utility as it is much better than all the other software from the major GPU makers out there, but that isn't exactly an option now is it?


----------



## Niberius

Quote:


> Originally Posted by *MapRef41N93W*
> 
> It's not a free utility.


It is a free utility until they start charging you to use it, and it's a free utility that works perfectly fine with my non-MSI brand video card.


----------



## taem

Quote:


> Originally Posted by *Niberius*
> 
> You are seriously griping about this when the person who programs this utility is in the middle of a crisis? Just use Trixx for the time being; it's not the end of the world if you have to wait a bit.


I didn't know that guy was in some sort of crisis. Hope everything works out. I appreciate the work he put into Afterburner all these years, it's so much more full featured than any other oc app out there. And frankly I thought we were lucky to have access to AB on non-MSI cards all these years.


----------



## centvalny

Matrix 290X review

http://www.xtremesystems.org/forums/showthread.php?288712-REVIEW-ASUS-ROG-Matrix-R9-290X-Platinum-Performance-Test&p=5228327#post5228327

Info by coolice @ KPC

http://kingpincooling.com/forum/showpost.php?p=27211&postcount=1

Short to enable LN2 Mode:
- enables VGA Hotwire
- BIOS switch for the LN2 BIOS
- extended overvoltage range via GPU Tweak




To use the memory defroster:
- short to enable the memory defroster
- 4-pin Molex power
It keeps the memory above 0°C and switches off automatically above 10°C.
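The defroster's on/off behavior as described is a simple thermal hysteresis loop. A hypothetical sketch (the 0°C/10°C thresholds come from the description above; the control logic itself is an assumption for illustration):

```python
# Hypothetical sketch of the memory defroster's hysteresis: heat below 0 C,
# switch off above 10 C, and hold the previous state in between.
ON_BELOW_C = 0.0
OFF_ABOVE_C = 10.0

def defroster_step(temp_c: float, heating: bool) -> bool:
    """Return the new heater state given the current memory temperature."""
    if temp_c <= ON_BELOW_C:
        return True        # too cold: heat the memory
    if temp_c >= OFF_ABOVE_C:
        return False       # warm enough: switch off
    return heating         # between thresholds: keep the previous state

state = False
for t in [15, 5, -2, 5, 12]:
    state = defroster_step(t, state)
    print(t, state)
```

The 10°C gap between the thresholds is what stops the heater from rapidly toggling when the memory sits right at the freezing point.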


----------



## Roboyto

Quote:


> Originally Posted by *centvalny*
> 
> Matrix 290X review
> 
> http://www.xtremesystems.org/forums/showthread.php?288712-REVIEW-ASUS-ROG-Matrix-R9-290X-Platinum-Performance-Test&p=5228327#post5228327
> 
> Info by coolice @ KPC
> 
> http://kingpincooling.com/forum/showpost.php?p=27211&postcount=1
> 
> Short to enable LN2 Mode:
> - enables VGA Hotwire
> - BIOS switch for the LN2 BIOS
> - extended overvoltage range via GPU Tweak
> 
> 
> To use the memory defroster:
> - short to enable the memory defroster
> - 4-pin Molex power
> It keeps the memory above 0°C and switches off automatically above 10°C.

I'm curious how well it would perform under LN2... but its performance in the review goes to show that the silicon lottery matters more than extra VRMs; the other enhancements don't mean much unless you get a good chip. My reference XFX outpaces it on core and RAM, albeit I have Hynix... but why did they use Elpida on their elite card? Plus they STILL haven't fixed the cooler! I can't believe they would let something like that slide on the Matrix. Between what I have seen in this thread regarding the 290(X) cards and recent experiences with ASUS tech support, I am no longer an ASUS supporter.


----------



## centvalny

Quote:


> Originally Posted by *Roboyto*
> 
> I'm curious how well it would perform under LN2... but its performance in the review goes to show that the silicon lottery matters more than extra VRMs; the other enhancements don't mean much unless you get a good chip. My reference XFX outpaces it on core and RAM, albeit I have Hynix... but why did they use Elpida on their elite card? Plus they STILL haven't fixed the cooler! I can't believe they would let something like that slide on the Matrix. Between what I have seen in this thread regarding the 290(X) cards and recent experiences with ASUS tech support, I am no longer an ASUS supporter.


More V, I guess. Elpida seems good with stock V here: http://hwbot.org/submission/2519260_poparamiro_3dmark11___performance_radeon_r9_290x_19675_marks


----------



## Roboyto

Quote:


> Originally Posted by *centvalny*
> 
> More V, I guess. Elpida seems good with stock V here: http://hwbot.org/submission/2519260_poparamiro_3dmark11___performance_radeon_r9_290x_19675_marks


Yes, there is the occasional card with Elpida memory that can clock well... but it's no secret that Hynix has a much better probability. The point being, that is their cream-of-the-crop card, it will command a premium price tag, and they didn't use premium-quality RAM. And they're still using the cooler designed for GK110. It's unacceptable, especially when the problem has been noted in nearly every review of the DC2 cards.


----------



## centvalny

Quote:


> Originally Posted by *Roboyto*
> 
> Yes, there is the occasional card with Elpida memory that can clock well... but it's no secret that Hynix has a much better probability. The point being, that is their cream-of-the-crop card, it will command a premium price tag, and they didn't use premium-quality RAM. And they're still using the cooler designed for GK110. It's unacceptable, especially when the problem has been noted in nearly every review of the DC2 cards.


Hynix can probably do higher with more relaxed RAM timings


----------



## Mr357

Quote:


> Originally Posted by *Roboyto*
> 
> Yes, there is the occasional card with Elpida memory that can clock well


----------



## Imprezzion

Quote:


> Originally Posted by *MapRef41N93W*
> 
> So apparently anyone using the Afterburner beta is royally screwed as of this Friday. It expires, and the creator has no new version coming out anytime soon (apparently due to the crisis in Ukraine). So anyone who is voltage-unlocking their card is going to have to look at another program like Trixx or GPU Tweak. Way to go, MSI.


Well, I get that he's in a crisis and all, but why let it expire in the first place if they won't replace it?!
That's the part I don't understand.

Quote:


> Originally Posted by *VulgarDisplay*
> 
> So I'm working on overclocking my r9 290 tri-x OC at the moment. It's stable at +75mv, +50% power limit, @1170mhz(still going up), mem still stock while I find max core.
> 
> I'm noticing that it's downclocking quite frequently to around 1070mhz while running heaven benchmark on a loop to scan for artifacts. What could be causing this behavior?
> 
> My temps are 70C on the core, and 72C on VRM 1 and 50C (is this normal?) on VRM2. Am I just hitting the power limit for the card and throttling? If so then is there much point in pushing it further?


Are you using 14.xx drivers? The power target is broken on 14.xx, so anything above +40mV will throttle hard. Use 13.12.


----------



## rdr09

Quote:


> Originally Posted by *MapRef41N93W*
> 
> So apparently anyone using the Afterburner beta is royally screwed as of this Friday. It expires, and the creator has no new version coming out anytime soon (apparently due to the crisis in Ukraine). So anyone who is voltage-unlocking their card is going to have to look at another program like Trixx or GPU Tweak. Way to go, MSI.


post#7 might help.

http://www.overclock.net/t/1475693/msi-afterburner-3-0-0-beta-18-expires-3-28-14-no-new-beta-available-can-i-make-this-work-after-3-28


----------



## Paul17041993

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Tempting, FF has been buggy and crap since the last update lol
> 
> Died in DayZ and it seemed to fixed the artifacting I was getting to
> 
> 
> 
> 
> 
> 
> 
> kinda weird


I use FF all the time, and it's pretty stable until you get to about five days of uptime with 200+ tabs open. I have my plugins set to ask-to-activate and I kill the plugin container every now and then.

And yeah, I think you might have just had a particle render bug, and dying triggered a refresh...

Quote:


> Originally Posted by *MapRef41N93W*
> 
> It's not a free utility. People spend hundreds of dollars on MSI's products and expect to get software that won't expire. Personally I would have no problem paying for the utility as it is much better than all the other software from the major GPU makers out there, but that isn't exactly an option now is it?


You do realize Afterburner isn't vendor-locked, right? And so what if it doesn't get updated for another few months; if I can use a two-month-old version, it must be pretty stable, no?
It's a tool; don't be so attached to things, learn to let go.


----------



## DrClaw

Quote:


> Originally Posted by *Paul17041993*
> 
> I use FF all the time, and it's pretty stable until you get to about five days of uptime with 200+ tabs open. I have my plugins set to ask-to-activate and I kill the plugin container every now and then.
> 
> And yeah, I think you might have just had a particle render bug, and dying triggered a refresh...
> You do realize Afterburner isn't vendor-locked, right? And so what if it doesn't get updated for another few months; if I can use a two-month-old version, it must be pretty stable, no?
> It's a tool; don't be so attached to things, learn to let go.


I wonder why other manufacturers that even use the same damn chip don't cover overclocking or don't have a proper overclocking tool, and then the companies that do don't recommend using their tool on different cards.
They could at least tell you they won't support you in any way, but that it's expected to work without any problems.
I've used MSI Afterburner on my PowerColor 290, and PowerColor doesn't cover overclocking. It's just stupid, even when companies give you a bloody BIOS switch.


----------



## taem

Quote:


> Originally Posted by *Roboyto*
> 
> I'm curious how well it would perform under LN2...but the performance of it in the review goes to show that the silicon lottery is more important than extra VRMs and other enhancements don't mean much unless you get a good chip. My reference XFX outpaces it on core and RAM, albeit I have Hynix...but why did they use Elpida on their elite card? Plus they STILL haven't fixed the cooler! I can't believe they would let something like that slide on the Matrix card. Between what I have see in this thread regarding the 290(X) cards and recent experiences with ASUS tech support, I am no longer an ASUS supporter.


Why does the cooler matter? You're not supposed to run this on air. And 1500/2000 on an ootb oc with stock cooler is unfrigginbelievably good, if you're beating that on a reference board you are so lucky you shouldn't even be complaining about anything amd related, you should be in a corner hugging yourself, giggling.


----------



## Raephen

Quote:


> Originally Posted by *DrClaw*
> 
> its just stupid even when companies give you a bloody bios switch


Correct me if I'm wrong, but the switch is just for Quiet and Uber mode... It's not intended as a back-up if your flash goes wrong. Sure, it's what almost every enthusiast who flashed his card appreciated, but not its intended purpose.

The devil is in the detail, every now and then


----------



## DrClaw

Quote:


> Originally Posted by *Raephen*
> 
> Correct me if I'm wrong, but the switch is just for Quiet and Uber mode... It's not intended as a back-up if your flash goes wrong. Sure, it's what almost every enthusiast who flashed his card appreciated, but not its intended purpose.
> 
> The devil is in the detail, every now and then


are you kidding me..


----------



## Roboyto

Quote:


> Originally Posted by *centvalny*
> 
> Hynix probably can do higher with more relaxed RAM timings


I'm pretty sure all the 290(X) cards with Hynix memory are using the stock-rated 5 GHz chips; mine run around 6.7-6.8 GHz stable in every bench or game I've thrown at them. I know nothing about RAM timings between Elpida and Hynix, but the information could be out there somewhere. H5GQ2H24AFR is the Hynix model number, which you can Google to find Hynix's specs on the RAM, but all of the specs listed are a range. It would be difficult to ascertain the exact timings; anyone know? I never thought to compare cards clock for clock using the different memory brands. It poses the problem of needing all other hardware to be identical for a legit comparison, though.

One could argue that the RAM speed isn't as important on these cards with the enormous 512-bit bus, but it really depends on what benchmark/game you're running; at least from what I've found from testing my 290. I ran 3DMark 11 Performance/Extreme presets, Unigine Heaven/Valley, and FFXIV benchmark with stock 1500 and OC up to 1700. The newer the application, the more it gained from RAM clocks; with FFXIV benchmark gaining the most at nearly 10%. On average for the 5 benchmarks it was ~5%. You can find a spreadsheet with exact figures in my build log.

http://www.overclock.net/t/1456279/honey-i-shrunk-the-ultra-tower-beast-my-journey-to-creating-a-more-compact-pc-with-an-r9-290
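For context, the raw bandwidth behind that 512-bit-bus argument is easy to sanity-check. A minimal sketch, assuming standard GDDR5, where the quoted "effective" clock is the per-pin data rate:

```python
def gddr5_bandwidth_gbs(bus_width_bits: int, effective_mhz: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes * effective data rate."""
    return (bus_width_bits / 8) * effective_mhz * 1e6 / 1e9

# Hawaii (R9 290/290X): 512-bit bus at the stock 5000 MHz effective rate
print(gddr5_bandwidth_gbs(512, 5000))  # 320.0 GB/s
```

Which is why even the stock 5 GHz chips leave Hawaii with more raw bandwidth than a 384-bit card at 7 GHz (336 GB/s); memory overclocks matter less here than on narrower buses.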

All of this aside, it doesn't change the fact ASUS made poor decisions for that card. It's made to go under LN2, and on air/water that RAM, most likely, isn't going to perform very well. They could have gone with Samsung as an alternative to Hynix or Elpida. Heck, they could have even showboated a little and tossed in the 7GHz Samsung modules the 770/780 come equipped with, which usually clock to 8GHz.


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> Why does the cooler matter? You're not supposed to run this on air. And 1500/2000 on an ootb oc with stock cooler is unfrigginbelievably good, if you're beating that on a reference board you are so lucky you shouldn't even be complaining about anything amd related, you should be in a corner hugging yourself, giggling.


1500/2000? What card are you looking at? The Matrix in that review went to 1150/1562.

I'm not complaining, and it wasn't directed at AMD. I'm pointing out where they obviously didn't care. Regardless of whether the card is made for LN2, the cooler on it shouldn't have been borrowed from the Nvidia design, especially when this has been well documented since the DC2 was released.


----------



## Raephen

Quote:


> Originally Posted by *DrClaw*
> 
> are you kidding me..


Short answer: yes, I was.

Love the .gif in your reply!


----------



## Paul17041993

Quote:


> Originally Posted by *DrClaw*
> 
> I wonder why other manufacturers that even use the same damn chip don't cover overclocking or don't have a proper overclocking tool, and then the companies that do don't recommend using their tool on different cards.
> They could at least tell you they won't support you in any way, but that it's expected to work without any problems.
> I've used MSI Afterburner on my PowerColor 290, and PowerColor doesn't cover overclocking. It's just stupid, even when companies give you a bloody BIOS switch.


Most of the time everyone just uses the same or similar controllers as the reference design, simply for simplicity, reliability and driver support; a common exception being that ASUS cards tend to use completely different hardware and BIOS for... I don't really know why...

Quote:


> Originally Posted by *Raephen*
> 
> Correct me if I'm wrong, but the switch is just for Quiet and Uber mode... It's not intended as a back-up if your flash goes wrong. Sure, it's what almost every enthusiast who flashed his card appreciated, but not its intended purpose.
> 
> The devil is in the detail, every now and then


The dual-BIOS design generally originates from the older HD 5K cards. Sapphire (and I think one other company) made their Vapor-X and Toxic cards with such aggressive cooling and potential that they also gave them a switchable BIOS to allow two different stock settings: one being basic stock, and the other having higher clocks and voltage for greater performance at the cost of more heat, noise and power draw (PSU issues were very common).

There's a side effect to having two BIOSes: you can also allow the extra BIOS to be flashed to something more custom while still having a reserve to counter the classic bad flash. Fixing a bad flash used to involve using a secondary display adapter or card to get past POST first; now you don't have to. AMD, being so OC-friendly, then decided to make this an official feature of the 6970, and eventually of the 79x0 with its GCN architecture and 2-4 times the shader threads of its predecessors and the competition. Eventually a third advantage came up: all the first 7970s could be upgraded to a GHz Edition without problems, and if your card wasn't compatible you could just flash it back or use the other BIOS.

So in general it's now just standard for AMD's mid-to-high-range cards to have two BIOSes, and their vendors can do whatever they please with this feature. Of course, warranty can never cover overclocking; it's simply not their fault if something goes horribly wrong because you made the hardware go harder than it was intended to. Most BIOS-related issues are still covered by warranty provided they weren't OC-related; i.e., if you went to test an updated BIOS and somehow screwed up both ROMs, they will usually be happy to flash it back for you, but if you used a special BIOS with uncapped clocks or board TDP and the card suffers damage, they can't really cover that.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *rdr09*
> 
> afaik, it affects both red and green. weird thing is i can leave mine enabled on both my sigs and don't get bsods. HD7770 and 290.


Mine doesn't do it all the time; it seems to be every so often it'll BSOD








Quote:


> Originally Posted by *Paul17041993*
> 
> I use FF all the time, and it's pretty stable until you get to about a *5-day uptime with 200+ tabs open*. I have my plugins set to ask-to-activate and I kill the plugin container every now and then.
> 
> and yeah, I think you might have just had a particle render bug that dying triggered a refresh of...


That's a lot of porn









My FF only seems to crash with 3 1080p videos buffering at once, or when I look at really high-res pictures like build logs on here

Yeah, it was weird in DayZ; then again, it's running on a pretty crappy game engine in my opinion


----------



## AlphaC

Powercolor R9 290 TurboDuo http://www.newegg.com/Product/Product.aspx?Item=N82E16814131569

new SKU?


----------



## DeadlyDNA

Anyone know how to get the CrossFire profiles listed under the 14.3 beta drivers? The AMD Predefined list in the CrossFire settings is empty for me. Not having a lot of luck finding anything about this.


----------



## BradleyW

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Anyone know how to get the CrossFire profiles listed under the 14.3 beta drivers? The AMD Predefined list in the CrossFire settings is empty for me. Not having a lot of luck finding anything about this.


Have you selected a .exe to assign a profile to?


----------



## taem

Quote:


> Originally Posted by *Roboyto*
> 
> 1500/2000? What card are you looking at, the Matrix in that review went to 1150/1562?
> 
> I'm not complaining, and it wasn't directed at AMD. I'm pointing out where they obviously didn't care. Regardless of if the card is made for LN2, the cooler on it shouldn't have been borrowed from the Nvidia design, especially when this has been well documented since the DC2 was released.


Huh, they had a screen of GPU Tweak with 1500 core / 8000 mem settings. If that's not for the card being reviewed, why is it there? They couldn't use the Snipping Tool to show the settings in actual use? 1150/1562 isn't very impressive, lol. Nice binning there, ASUS. You do get the sense ASUS just did not care about Hawaii as much as they cared about the 780.

Edit: Wait, I take that back. 1150/1562 at stock volts? Is that reference voltage, with no VDDC adjust? Because that's not bad at all. You expect more from a Matrix or Lightning maybe, but that's probably more than the vast majority could get on run-of-the-mill cards.


----------



## Paul17041993

Quote:


> Originally Posted by *taem*
> 
> Huh, they had a screen of GPU Tweak with 1500 core / 8000 mem settings. If that's not for the card being reviewed, why is it there? They couldn't use the Snipping Tool to show the settings in actual use? 1150/1562 isn't very impressive, lol. Nice binning there, ASUS. You do get the sense ASUS just did not care about Hawaii as much as they cared about the 780.
> 
> Edit: Wait, I take that back. 1150/1562 at stock volts? Is that reference voltage, with no VDDC adjust? Because that's not bad at all. You expect more from a Matrix or Lightning maybe, but that's probably more than the vast majority could get on run-of-the-mill cards.


The 7K ASUS cards generally have ASIC quality of 50-60, which is good for water and subzero/nitro/helium. The cards themselves, however, were a massive letdown: a stupid amount of power phases that didn't do anything, and the majority of users I've seen with them complained about being either unable to do even the slightest of overclocks or unable to run stock clocks in general...

Their 290(X)s: the DCII is just like reference with a different cooler, and the Matrix... I can't take it seriously... there must be some reason for that blank PCB area, surely...


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> Huh, they had a screen of GPU Tweak with 1500 core / 8000 mem settings. If that's not for the card being reviewed, why is it there? They couldn't use the Snipping Tool to show the settings in actual use? 1150/1562 isn't very impressive, lol. Nice binning there, ASUS. You do get the sense ASUS just did not care about Hawaii as much as they cared about the 780.


That was showing the maximum capabilities of GPU Tweak. Scroll down to the section where it says overclocking.

I quote from that xtremesystems.org page:

"Loaded with massive overclock features on the graphics card, *we expected that the Matrix R9 290X Platinum should be at least on par or beating the R9 290X DirectCU II OC. But to our dismay, we're disappointed in the overclock ability of the Matrix*. Perhaps it is just our review sample that doesn't hit the silicon lottery.

With a GPU voltage of 1.3 V, we managed to overclock the *GPU clock speed to 1150 MHz* from 1050 MHz. We were able to obtain a 1225 MHz overclock on the GPU with the R9 290X DirectCU II OC.

However, memory overclocking is better than the R9 290X DirectCU II OC. *The video memory went to 6250 MHz from 5400 MHz* with 1.63v (up from 1.5v). Well, that is pretty "normal" for Elpida based memory chips which we have discussed earlier. You can't go more than that with Elpida chips."


----------



## taem

Wow those are terrible results lol. Both my $499.99 cards beat that and I don't consider either a particularly good overclocker. I just assumed that was an ootb oc without voltage tweaking since this is a Matrix. Maybe it's just a bad chip.


----------



## DeadlyDNA

Quote:


> Originally Posted by *BradleyW*
> 
> Have you selected a .exe to assign a profile to?


Yes. For instance, I go to CCC -> Gaming -> 3D Application Settings -> click "Add" -> select tesv.exe -> click on the tesv.exe tab -> scroll down to AMD CrossfireX Mode -> select AMD Predefined Profile -> then the search bar and profile window are blank.

Was just wondering if anyone else has the same issue on 14.3beta drivers.


----------



## Paul17041993

Quote:


> Originally Posted by *Roboyto*
> 
> ...voltage of 1.3 V ...1150 MHz. from 1050 MHz...
> ...6250 MHz from 5400 MHz with 1.63v (up from 1.5v). ...Elpida...


I would hope that's the worst case scenario for it...

1630mV for 1562.5MHz memory...? elpida...? on their "flagship"...? I don't... why would they... what...?
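The two memory numbers floating around here describe the same clock counted two ways: GDDR5 is quad-pumped, so the "effective" rate quoted in reviews is 4x the real command clock that monitoring tools report. A quick sketch of the conversion:

```python
def effective_to_real_mhz(effective_mhz: float) -> float:
    """GDDR5 transfers 4 bits per pin per command clock, so real = effective / 4."""
    return effective_mhz / 4

print(effective_to_real_mhz(6250))  # 1562.5 MHz, the clock quoted above
print(effective_to_real_mhz(5400))  # 1350.0 MHz, the Matrix's stock memory clock
```

So the review's "6250 MHz" and the "1562.5 MHz memory" figure are the same overclock, not two different results.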


----------



## taem

Yeah... I'm not even an elpida hater. IMHO for users who just game there's little difference between elpida and Hynix, at the 1500-1550 or so most folks will stick to. The Hynix advantage seems to exist more at the upper end, for benchers and enthusiast overclockers. But since this card is meant for the extreme, makes no sense to use elpida. I'm really surprised, as much hate as Asus gets they're really one of the better outfits who usually build solid products. Even on these 290s where they seem to be dropping the ball they're using better components than anyone else afaik.


----------



## Roboyto

Quote:


> Originally Posted by *Paul17041993*
> 
> I would hope that's the worst case scenario for it...
> 
> 1630mV for 1562.5MHz memory...? elpida...? on their "flagship"...? I don't... why would they... what...?


Exactly. I don't see any other reviews around presently, so hopefully some other sites will have more promising results. It's not looking good though if this one has Elpida.
Quote:


> Originally Posted by *taem*
> 
> Yeah... I'm not even an elpida hater. But since this card is meant for the extreme, makes no sense to use elpida


I'm not a hater of Elpida either; it's just much less common for their memory to perform under extreme conditions. The card has a memory defroster, for cripes' sakes! They couldn't at least use worthwhile RAM in there to be defrosted?

Anyone know if there is going to be a Gigabyte SOC 290X?

The Lightning fared better than this on air cooling, didn't it?


----------



## Aussiejuggalo

Are the black screen lock ups the cards fault or is it the drivers?


----------



## Forceman

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Are the black screen lock ups the cards fault or is it the drivers?


No one knows for sure, but if you have Elpida then it is probably the card. They seemed to have more problems at launch.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Forceman*
> 
> No one knows for sure, but if you have Elpida then it is probably the card. They seemed to have more problems at launch.


Ah ok, well I got Hynix







but I am running 13.12 drivers so I'm just gonna put it down to the drivers


----------



## Tobiman

The 290X Lightnings have Samsung RAM, which is even better than Hynix for timings and overclocks. For me though, I can't complain: the Elpida RAM on my card overclocks by 20% easily, from 1250 to 1500, and my card can bench at 1250 core as long as temps are kept in check. I doubt this feat will be feasible with most of the Matrix cards though. Why, you ask? See... I had the Matrix 7970 GHz Edition and it was a sham. It could barely hold stock core and memory clocks without artifacting in Explorer. I had to increase core and memory voltage to stabilize the card at the supposedly factory clocks.
The funny thing though was that there were a select few who had perfectly working cards, but I doubt it was even half of those who had the card.
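That 20% figure is just the clock delta over the stock clock; for anyone wanting to check their own headroom the same way, a trivial sketch:

```python
def oc_percent(stock_mhz: float, oc_mhz: float) -> float:
    """Overclock headroom as a percentage of the stock clock."""
    return (oc_mhz - stock_mhz) * 100.0 / stock_mhz

print(oc_percent(1250, 1500))  # 20.0, the Elpida memory OC described above
```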


----------



## Aussiejuggalo

If I could have forked out the $900 for a Lightning I would have; those things are beasts. I haven't clocked my card yet; it easily hits 90 on stock clocks... well, stock overclocks







so I'm gonna wait till I get it underwater

Probably gonna get flamed for this but... a lot of ASUS cards seem like gimmicks; so many complaints about the DirectCU cards, and the reviews I've seen on the Matrix ones are a joke


----------



## cephelix

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> If I could have forked out the $900 for a Lightning I would have; those things are beasts. I haven't clocked my card yet; it easily hits 90 on stock clocks... well, stock overclocks
> 
> 
> 
> 
> 
> 
> 
> so I'm gonna wait till I get it underwater
> 
> Probably gonna get flamed for this but... a lot of ASUS cards seem like gimmicks; so many complaints about the DirectCU cards, and the reviews I've seen on the Matrix ones are a joke


Really? Then their marketing really fooled me. I would've thought ASUS cards were top of the line.


----------



## Paul17041993

I think I got mine to 1650 memory, reference with Hynix; craploads better than my 7970DCIIT that could barely do 1500 stable...


----------



## taem

Quote:


> Originally Posted by *cephelix*
> 
> Really? Then their marketing really fooled me. I would've thought ASUS cards were top of the line.


Imho they do use the best components, MSI is good too.


----------



## cephelix

Quote:


> Originally Posted by *taem*
> 
> Imho they do use the best components, MSI is good too.


Was originally going to go with the Gigabyte Windforce. The reviews for them seem favourable as well, but in the end I went with a familiar brand. We'll see how mine OCs in the next few weeks


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Are the black screen lock ups the cards fault or is it the drivers?


Good question, man. I personally blame crappy drivers; [email protected] is all she wrote LoooooooL


----------



## Aussiejuggalo

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Good question man . I personally blame crappy drivers [email protected] is all she wrote LoooooooL


Hahahah well AMD has always been a bit um... dodgy with drivers


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Hahahah well AMD has always been a bit um... dodgy with drivers


These are my first red things, so I concur on that. NV drivers are a bit better.
I still love my 290's; awesome cards to bench and game, man


----------



## Aussiejuggalo

Nvidia drivers can be really bad though too; the last 3 "stable" drivers from Nvidia before I changed to AMD were unstable as hell with my 670: non-stop driver crashes, black screens, glitches. One of the reasons I grabbed a 290


----------



## Imprezzion

Quote:


> Originally Posted by *taem*
> 
> Imho they do use the best components, MSI is good too.


ASUS has to learn to stop using those puny little VRM heatsinks and Elpida VRAM. That's what kills its OC potential.
I mean, a stock 290X DC2 hits up to 90°C on VRM1. And that's without any overvolting.


----------



## Forceman

Apologies up front for asking instead of searching through the dozens of pages to find this answer, but which card/BIOS is it that has the higher core voltage by default? The Powercolor PCS+?

I'm sick and tired of my card losing my additional voltage every time the monitor goes to sleep, and all I need is +50. Any hope of a BIOS editor for these cards like there is for Kepler cards?


----------



## Imprezzion

Quote:


> Originally Posted by *Forceman*
> 
> Apologies up front for asking instead of searching through the dozens of pages to find this answer, but which card/BIOS is it that has the higher core voltage by default? The Powercolor PCS+?
> 
> I'm sick and tired of my card losing my additional voltage every time the monitor goes to sleep, and all I need is +50. Any hope of a BIOS editor for these cards like there is for Kepler cards?


If we had a working BIOS editor we could adjust power limits in the BIOS, and then we wouldn't need the 14.xx drivers to be fixed


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Nvidia drivers can be really bad tho to, the last 3 "stable" drivers from Nvidia before I changed to AMD were unstable as hell with my 670, non stop driver crashes, black screens, glitches, one of the reasons I grabbed a 290


I grabbed 'em 'cause I wanted to bench 'em on HWBOT. Got some W/Rs too








Then I realised how good they really were, with the PT1T BIOS and waterblocked, and many hours of really good quality gaming sessions








But I grew a dislike for black screens, disappearing video drivers, technicolor spazzies and inadequate PSUs


----------



## Aussiejuggalo

Yeah, the black screens and driver issues are getting annoying, but I'll deal with it seeing as I don't have the $1000 for a damn Ti









I wanna stick mine underwater just so I don't have the fan noise







and also keep it cooler... maybe


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Yeah the black screens and driver issues are getting annoying but I'll deal with it seeing I dont have the $1000 for a damn Ti
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I wanna stick mine underwater just so I dont have the fan noise
> 
> 
> 
> 
> 
> 
> 
> and also keep it cooler... maybe


Once you block 'em: 50% drop in full-load temps and..... silence
Another reason I went red; and I'm up there pulling the same or better benchmark scores as the 780, 780 Ti, Titan and 290X








Gotta go bed time


----------



## Aussiejuggalo

lol, silence... maybe not for me; gonna have 10 AP-15s in there







temps could still be a problem for me though; my room is normally 30, almost 40, most days









I went red seeing as the only logical upgrade on the green side would have been a Ti... and $900 for a card that is at most, what, 10-15% faster than this one? Doesn't really seem worth it imo


----------



## rdr09

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> lol silence... maybe not for me, gonna have 10 AP-15s in there
> 
> 
> 
> 
> 
> 
> 
> temps could still be a problem for me tho, my room is normally 30 almost 40 most days
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I went red seeing the only logical upgrade on the green side would of been a Ti... and $900 for a card that is at most what 10 - 15% faster then this one? doesnt really seem worth it imo


benching weather is almost there in your neck of the woods.


----------



## taem

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> lol silence... maybe not for me, gonna have 10 AP-15s in there
> 
> 
> 
> 
> 
> 
> 
> temps could still be a problem for me tho, my room is normally 30 almost 40 most days
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I went red seeing the only logical upgrade on the green side would of been a Ti... and $900 for a card that is at most what 10 - 15% faster then this one? doesnt really seem worth it imo


That's $200+ in fans that are unbearably loud at full rpm. Can't beat the cooling no doubt. Still...

MSI GTX 780ti is $660 at Newegg. http://www.newegg.com/Product/Product.aspx?Item=N82E16814127770&nm_mc=EMC-IGNEFL032514&cm_mmc=EMC-IGNEFL032514-_-EMC-032514-Index-_-DesktopGraphicsVideoCards-_-14127770-L05C


----------



## DrClaw

Quote:


> Originally Posted by *taem*
> 
> That's $200+ in fans that are unbearably loud at full rpm. Can't beat the cooling no doubt. Still...
> 
> MSI GTX 780ti is $660 at Newegg. http://www.newegg.com/Product/Product.aspx?Item=N82E16814127770&nm_mc=EMC-IGNEFL032514&cm_mmc=EMC-IGNEFL032514-_-EMC-032514-Index-_-DesktopGraphicsVideoCards-_-14127770-L05C


Wow, that's a great price for a 780 Ti, goddamn


----------



## The Mac

They are all coming down...

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&N=100007709&IsNodeId=1&Description=r9%20290&name=Desktop%20Graphics%20Cards&Order=PRICE&Pagesize=20&isdeptsrh=1


----------



## FrankoNL

Hello Guys,

I need your help. I have just received my Gigabyte R9 290 reference card, but I am having problems with my GPU load in Battlefield 4 as well as in the Heaven benchmark.

My Heaven benchmark (standard preset, 1080p @ 1050/1250) gives a score of only 1290, and I see my GPU load fluctuating between 70 and 100 percent. Same goes for Battlefield 4.

When I check my CPU usage, it is only at 20 percent for Heaven and 65 percent for Battlefield 4.

Do you have any suggestions?


----------



## chiknnwatrmln

Increase power limit. What driver are you on?


----------



## kizwan

Quote:


> Originally Posted by *FrankoNL*
> 
> Hello Guys,
> 
> I need your help. I have just received my r9 HD290 gigabyte reference card. but i am having problems with my GPU load in battlefield 4 as well as in heaven benchmark.
> 
> My heaven benchmark ( standard preset 1080p @ 1050/1250 ) gives a score of only 1290 and i see my GPU load fluctuating between 70 and 100 procent. Same goes for battlefield 4.
> 
> When i check my cpu usage it is only at 20 percent for heaven and 65 percent for battlefield 4.
> 
> Do you have any suggestions?


Looks alright to me. Fluctuating GPU usage is normal; the GPU usage monitoring is not accurate and doesn't represent the actual performance of the card. Check out my CPU/GPU usage below.

Heaven benchmark (single card, Crossfire disabled) *with Tessellation off* @1000/1300 MHz (290 Tri-X BIOS)
_I got higher score because Tessellation off & slightly higher (stock) clocks_


GPU usage in Heaven


CPU usage in BF4 with Mantle (14.2)


----------



## LesPaulLover

Hey, I just picked up an MSI Gaming R9 290... which is the standard BIOS setting, position 1 or 2?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> benching weather is almost there in your neck of the woods.


No, not yet; rain and 90% humidity. At least I can do this all year round

Quote:


> Originally Posted by *LesPaulLover*
> 
> Hey I just picked up an msi gaming r9 290....which is the standard bios setting? Position 1 or 2?


Positions 1 and 2 are the same BIOS, but no. 2 has a higher fan speed profile of 50% and no. 1 has 20%


----------



## boot318

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> No, not yet; rain and 90% humidity. At least I can do this all year round
> 
> Positions 1 and 2 are the same BIOS, but no. 2 has a higher fan speed profile of 50% and no. 1 has 20%


You have to add 400 watts to your build now.


----------



## FrankoNL

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Increase power limit. What driver are you on?


I am on the newest beta drivers.









Quote:


> Originally Posted by *kizwan*
> 
> Look alright to me. Fluctuating GPU usage is normal. The GPU usage monitoring is not accurate & doesn't represent the actual performance of the card. Check out my CPU/GPU usage below.
> 
> Heaven benchmark (single card, Crossfire disabled) *with Tessellation off* @1000/1300 MHz (290 Tri-X BIOS)
> _I got higher score because Tessellation off & slightly higher (stock) clocks_
> 
> 
> GPU usage in Heaven
> 
> 
> CPU usage in BF4 with Mantle (14.2)


ahh ok thanks for the info!


----------



## The Mac

Power limit doesn't work on the new betas.


----------



## Frontside

Hey, what are the safe temps for the VRMs? After a few hours in BF4, GPU-Z shows VRM1 ~111°C max and VRM2 ~68°C max. MSI R9 290X 4G Gaming

btw can i join the Club?


http://www.techpowerup.com/gpuz/5mmd6/


----------



## cephelix

@Frontside Correct me if I'm wrong, but I think they're safe till 120°C.
Just checked mine and my VRM2 is currently hotter than VRM1... around 68°C, and this is under no stress. Is this normal??


----------



## VulgarDisplay

I thought 110°C was considered the max safe temp on the VRMs. I'm loving the Tri-X cooler on my 290: VRM1 is seeing 72°C max and VRM2 50°C.

I'm on the latest betas for Mantle in BF4, so I don't know if that's what's keeping my VRMs so cool. The card isn't drawing any extra power because power limit is broken, I assume?

Even with no extra power limit it runs stable up to 1150 right now with minimal throttling.


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> No not yet rain and 90% humidity at least I can do this all year round
> 
> Pos 1+ 2 are same bios no 2 has a higher fan speed profile of 50% and no1 has 20%


I've seen that. That is in your new crib, right? Congrats.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *taem*
> 
> That's $200+ in fans that are unbearably loud at full rpm. Can't beat the cooling no doubt. Still...
> 
> MSI GTX 780ti is $660 at Newegg. http://www.newegg.com/Product/Product.aspx?Item=N82E16814127770&nm_mc=EMC-IGNEFL032514&cm_mmc=EMC-IGNEFL032514-_-EMC-032514-Index-_-DesktopGraphicsVideoCards-_-14127770-L05C


$200 on fans








and I got 5 in there at the moment running 12V; they're not too loud. I can turn them down to 5V and I only gain 5 degrees, and they're practically dead silent then









That's a damn good price for Americans; Australians, on the other hand... we get ripped off at every turn


----------



## boldenc

Is there a way to get more than +100mV with Afterburner on the Sapphire Tri-X 290?
I tried the Afterburner guide to enable more than +100mV, but it didn't work.


----------



## Germanian

Sold 1 of my R9 290's; went down from 3 to 2.

2 is more than enough for me at least for now. Let's see if Witcher 3 changes that


----------



## Forceman

Quote:


> Originally Posted by *boldenc*
> 
> is there a way to get more than 100mV with afterburner on the Sapphire Tri-X 290 ?
> I tried the afterburner guide to enable more than 100mV but it didn't work.


You could just use Trixx, unless you need something special from Afterburner (which expires Friday anyway, unless they patch it). It allows +200mV.


----------



## Ebefren

News: using Trixx I finally made my first little stable overclock:

GPU: 1060, VRAM: 1350 (without +VDDC or +Power Limit; vanilla!)

Fully stable, 73°C max temp with 40% fans (not a sound).



Now i can push further for sure









ps. (ok the settings are the same of the HIS model...but i think the temps are much better)


----------



## taem

Quote:


> Originally Posted by *HOMECINEMA-PC*


Dang dude you don't worry about nudging that narrow tall case with your elbow and knocking it off the desk?

Oh and do you use the fan mount on the side which I'm assuming aims a fan at the bottom of the cpu socket? I've always wondered if those make a difference.


----------



## Mattriz

I have a stock 290X and am having problems with the FPS jumping up and down. For example, look at 1:15 in the video: the FPS goes between 50 and 100 in just a few seconds. Does anyone know why this is happening? Someone said it might be the CPU holding me back, but according to this, "If yellow is below green on the graph, then your CPU is holding back your GFX card," which it's not in the second video. Playing on High with 4xAA and stock GPU settings.





(sound is weird on this one, because i cut out a skype conversation)

Another video with the performance logger on:





EDIT:
After a closer look it seems it's the CPU that's holding me back; when playing BF4 with Mantle I have no problems and the FPS is even.
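That yellow-below-green rule of thumb can also be checked numerically if your overlay tool can log per-frame CPU and GPU times. A minimal sketch; the helper name and the sample timings below are made up for illustration:

```python
# Rough CPU-bottleneck check from per-frame timings (milliseconds).
# A frame is CPU-bound when the CPU took longer than the GPU to produce it.
def cpu_bound_fraction(cpu_ms, gpu_ms):
    """Return the fraction of frames where CPU time exceeded GPU time."""
    bound = sum(1 for c, g in zip(cpu_ms, gpu_ms) if c > g)
    return bound / len(cpu_ms)

# Hypothetical samples from a performance log:
cpu_times = [18.0, 21.5, 19.2, 10.1, 22.7]
gpu_times = [12.3, 11.9, 12.5, 12.1, 12.0]

frac = cpu_bound_fraction(cpu_times, gpu_times)
print(f"CPU-bound on {frac:.0%} of frames")  # → CPU-bound on 80% of frames
```

If a large fraction of frames come out CPU-bound at stock GPU clocks, that points at the CPU (or the driver) rather than the card.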


----------



## Arizonian

Quote:


> Originally Posted by *Frontside*
> 
> Hey, what are the safe temps for VRM? After a few hours in BF4, GPU-Z shows VRM1 ~111°C max, VRM2 ~68°C max. MSI R9 290X 4G Gaming
> 
> btw can i join the Club?
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - I've added you to roster









However, if you could go back to that post and edit in a GPU-Z screenshot with your OCN name showing, or a GPU-Z validation link with your OCN name, that would be great.









http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/19670#post_22008142


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *boot318*
> 
> You have to add 400watts to your build now.


Plus a 3930K @ 5.2 at over 250 watts under full load, and 2 heavily OC'd 290s @ [email protected]@1.5v .......... LooooL









Quote:


> Originally Posted by *The Mac*
> 
> power limit doesnt work on the new betas.


On what BIOS? Stock, or the PT1T Asus 290X BIOS with droop?









Quote:


> Originally Posted by *rdr09*
> 
> I've seen that. that is in your new crib, right? congrats.


Yeah man finally off my bed


Quote:


> Originally Posted by *taem*
> 
> Dang dude you don't worry about nudging that narrow tall case with your elbow and knocking it off the desk?
> 
> Oh and do you use the fan mount on the side which I'm assuming aims a fan at the bottom of the cpu socket? I've always wondered if those make a difference.


No mate, that rig weighs 30lbs and there's heaps of room.


That 120mm fan is drawing air out straight over the top of the CPU backplate.
Those 290s look damn fine.


----------



## Imprezzion

Power limit in 14.xx doesn't work with stock BIOSes. PT1 and PT3 work since they have no limit. But for a stock-based BIOS it means no overvolting past +25mV; otherwise it'll just throttle harder.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Imprezzion*
> 
> Power limit in 14.xx doesn't work with stock BIOS's. PT1 and PT3 work as they have no limit. But, for a stock based BIOS it means no overvolting past +25mV. Otherwise it'll just throttle harder.


Gotchya thanks mate


----------



## Forceman

Quote:


> Originally Posted by *Imprezzion*
> 
> Power limit in 14.xx doesn't work with stock BIOS's. PT1 and PT3 work as they have no limit. But, for a stock based BIOS it means no overvolting past +25mV. Otherwise it'll just throttle harder.


Mine didn't throttle at +75 or +50mV. So it kind of depends on the card. I'm on water though.


----------



## boldenc

How can I make RivaTuner Statistics Server work with Trixx?


----------



## phallacy

So I had a weird problem occur yesterday. I was playing Arkham Origins when suddenly I got a BSOD with SYSTEM_THREAD_EXCEPTION_NOT_HANDLED as the error code. I restarted, and all of a sudden CCC wouldn't load, and Device Manager was telling me "Windows has stopped this device because it has communicated that it's not working properly" (code 43) for all 3 of my cards. All my restore points were somehow deleted too, and my SSD boot drive keeps getting loaded with files I can't see; the available space gets lower every time I look at it, for no apparent reason (but that's another problem). Anyway, I had to disable all my cards except one, uninstall the 14.2 betas, and then install 14.3. The cards seem to be OK now, but I've never had a BSOD from overclocking that didn't mention an atikmdag exception or whatever it usually says for driver problems.


----------



## Frontside

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - I've added you to roster
> 
> 
> 
> 
> 
> 
> 
> 
> 
> However if you can go back to that post and edit in a GPU-Z screen shot of OCN name showing or GPU-Z validation link with OCN name would be great.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/19670#post_22008142


Done.
So what about the VRM1 temperature? Is it OK? 110°C is a bit high for my liking.


----------



## Imprezzion

It is indeed very high and cannot be considered safe anymore. Are you somehow running extremely low fanspeeds?


----------



## Frontside

Quote:


> Originally Posted by *Imprezzion*
> 
> It is indeed very high and cannot be considered safe anymore. Are you somehow running extremely low fanspeeds?


Fans are set on auto, 2100rpm (~60%) under heavy load.


----------



## duplissi

Can I join?

OCforum.PNG 62k .PNG file


WP_20140326_001.jpg 1529k .jpg file
 (Forgive the potato quality)

Card is an Asus DirectCU II R9 290X OC. I have not modified it (yet); a custom loop with two rads is in the planning stages at the moment.


----------



## spqmax

Hi, I'm considering upgrading from my 780 Ti to a 3 x 290X configuration with 3 monitors in mind, watercooling the whole thing. I'm running a Rampage IV Black and a mildly overclocked 4930K. I don't see overclocking/volting the cards being necessary right now (and I don't mine), so I'll be running all three at stock. Since I had been considering SLI when building the rig, I went with an AX1200i. Going by the Newegg video, a tri-fire config should draw 950W at full load. Should I upgrade the PSU to maybe an EVGA 1300W?

When it comes to watercooling, I'm thinking 2x D5s, EK blocks, and 2 x 480mm rads (45mm and 80mm) in push/pull to cool the cards + CPU. But that totals up to 16 fans + 3 case fans = 19 fans, which adds to the power draw, as do the 2 pumps.

Or should I run the cards + CPU (+ mobo) off the 1200W and add a second/slave PSU for the pumps, fans, my 1x SSD and 1x SATA drive, to have a bit of headroom?
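The PSU question above can be sanity-checked by summing rough per-component draws against the PSU rating. A back-of-the-envelope sketch; every wattage figure here is a ballpark assumption, not a measurement, so treat the result as a guide only:

```python
# Ballpark PSU budget for the tri-fire build described above.
# All wattages are rough assumptions; measure at the wall to be sure.
components = {
    "R9 290X x3 (stock)":  3 * 250,  # ~250W per card under gaming load
    "4930K (mild OC)":     200,
    "Motherboard/RAM/SSD": 75,
    "D5 pumps x2":         2 * 23,
    "Fans x19":            19 * 3,   # ~3W each at 12V
}

total = sum(components.values())
psu_watts = 1200
headroom = 1 - total / psu_watts

print(f"Estimated load: {total}W; headroom on a {psu_watts}W PSU: {headroom:.0%}")
```

With these assumptions the 1200W unit is workable but tight at full load; overvolting all three cards would eat the remaining margin quickly.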


----------



## mojobear

Quote:


> Originally Posted by *spqmax*
> 
> Hi, I'm considering upgrading from my 780ti to a 3 x 290x configuration with 3 monitors in mind, watercooling the whole thing. I'm running a rampage iv black and a mildly overclocked 4930k. I don't see overclocking/volting the cards beeing necessary right now (+I don't mine), so I'll be running all three at stock. Since I had been considering SLI when building the rig, i went with an ax1200i. After having watched the newegg video, a trifire config should draw 950w at full load. Should I upgrade the PSU to maybe an EVGA 1300W?
> 
> When it comes to watercooling, I'm thinking 2x D5s, EK Blocks and 2 x 480mm rads (45mm and 80mm) in push pull to cool the cards + cpu. But that totals up to 16 fans + 3 case fans = 19fans. That adds to the power draw, as do the 2 pumps.
> 
> Or should I run the cards + cpu (+ mobo) off the 1200w and add a 2nd/slave one for pumps,fans, my 1x ssd and 1x sata, to have a bit of head room?


Hey buddy,

I have a 4770K and 3 x 290s, and they pull 850-900W from the wall during gaming with a mild OC on the 4770K to 4.75GHz and the 290s at stock. I run an AX1200 and it's holding up fine (knock on wood) after over 3 months now. I'm also running 2 D5 pumps and 11 fans.









Your setup will pull more power, but I can't imagine your CPU OC pulling an extra 200W over my 4770K, so you should be fine.

Watercooling can really increase the efficiency of the 290s, drawing a lot less power. Even with +87mV and a 1175/1375 OC on all 3 cards I pull around 1000W max, which the AX1200 handles beautifully (again, knock on wood).


----------



## MrWhiteRX7

Quote:


> Originally Posted by *spqmax*
> 
> Hi, I'm considering upgrading from my 780ti to a 3 x 290x configuration with 3 monitors in mind, watercooling the whole thing. I'm running a rampage iv black and a mildly overclocked 4930k. I don't see overclocking/volting the cards beeing necessary right now (+I don't mine), so I'll be running all three at stock. Since I had been considering SLI when building the rig, i went with an ax1200i. After having watched the newegg video, a trifire config should draw 950w at full load. Should I upgrade the PSU to maybe an EVGA 1300W?
> 
> When it comes to watercooling, I'm thinking 2x D5s, EK Blocks and 2 x 480mm rads (45mm and 80mm) in push pull to cool the cards + cpu. But that totals up to 16 fans + 3 case fans = 19fans. That adds to the power draw, as do the 2 pumps.
> 
> Or should I run the cards + cpu (+ mobo) off the 1200w and add a 2nd/slave one for pumps,fans, my 1x ssd and 1x sata, to have a bit of head room?


I'm running basically the rig you describe (4820K though), with 21 fans in total and D5 pumps, all on a single 1300 G2. Works great, and as you're thinking, you won't need to overclock the GPUs for anything right now. I also have a rig with a 780 Ti clocked over 1200MHz and I never game on it anymore. I've had no issues with tri-fire on my 290 rig. It plows through all games.

Enjoy!!!


----------



## stilllogicz

Quote:


> Originally Posted by *stilllogicz*
> 
> On these waterblocks:
> 
> http://www.ekwb.com/shop/blocks/vga-blocks/ati-radeon-full-cover-blocks/radeon-rx-200-series/ek-fc-r9-290x-original-csq-nickel.html
> 
> Can that chrome\silver\nickel metal part be removed and not used? or possibly painted?


Does anyone know?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *stilllogicz*
> 
> Does anyone know?


Paint it


----------



## stilllogicz

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Paint it


Excellent.


----------



## Imprezzion

Quote:


> Originally Posted by *Frontside*
> 
> Fans set on auto 2100rpm(~60%) under heavy load,


That is insane. That card is definitely defective unless you're pushing more than +200mV through the card...

VRM temps on stock volts and 60% fan speed should not exceed 70°C on the MSI Gaming.


----------



## taem

Quote:


> Originally Posted by *boldenc*
> 
> How I can make rivatuner statistics to work with trixx ?


I don't think it does. Just use HWiNFO; its OSD options + RTSS are superior to Afterburner + RTSS. You run the sensors you want from HWiNFO, and the framerate counter off RTSS.



http://www.hwinfo.com/download.php
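Both GPU-Z and HWiNFO can also log sensors to CSV, which makes it easy to pull out worst-case VRM temperatures after a gaming session. A sketch, assuming a column literally named "VRM Temperature #1 [°C]" — the exact header varies by tool and version, so check your own log first:

```python
import csv
import io

def max_column(csv_text, column):
    """Return the maximum numeric value seen in the named CSV column."""
    reader = csv.DictReader(io.StringIO(csv_text))
    values = []
    for row in reader:
        try:
            values.append(float(row[column].strip()))
        except (ValueError, AttributeError, KeyError):
            continue  # skip blank or malformed cells
    return max(values)

# Tiny made-up log excerpt in a GPU-Z-style CSV layout:
log = """Date,GPU Temperature [°C],VRM Temperature #1 [°C]
2014-03-27 20:01,71,98
2014-03-27 20:02,74,111
2014-03-27 20:03,73,105
"""

print(max_column(log, "VRM Temperature #1 [°C]"))  # → 111.0
```

For a real log, read the file with `open(path, newline="")` and pass its contents in; the rest is unchanged.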


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> I don't think it does. Just use hwinfo, the osd options from that + rtss is superior to Afterburner + rtss. You run the sensors you want from hwinfo, and then the framerate counter off rtss.
> 
> 
> 
> http://www.hwinfo.com/download.php


I concur, HWiNFO works very well. I have it set up to display FPS, CPU load per core, CPU temp, RAM load, GPU load, GPU temp, and VRM1 temp. It's nice to be able to move the stats around where you want them on the screen and change the color of the information; I have found yellow to be the best color. It takes a little finagling, but once you dial it in, it is awesome. Very handy when you're running Eyefinity.


----------



## Frontside

Quote:


> Originally Posted by *Imprezzion*
> 
> That is insane. That card is definitely defective unless your pushing more then +200mV through the card...
> 
> VRM temps on stock volts and 60% fanspeed should not exceed 70c on the MSI Gaming.


Is there anything that can be done to solve this problem? RMAing the card is not an option. I bought it in Germany but live in Russia; sending it there and back, if I'm lucky enough, will take more than a month and a half, thanks to the Russian post.


----------



## Roboyto

Quote:


> Originally Posted by *Mattriz*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I have a stock 290x and are having problems with the FPS jumping up and down. For example look at 1.15 in the video. The FPS goes between 50 and 100 in just a few seconds. Does anyone know why this is happening, someone said it might be the CPU holding me back, but according to this it says that "If yellow is below green on the graph, then your CPU is holding back your GFX card." Which its not in the second video. Playing on High with 4xAA and stock GPU settings.
> 
> 
> 
> 
> 
> (sound is weird on this one, because i cut out a skype conversation)
> 
> Another video with the performance logger on:
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT:
> After a closer look it looks like its the CPU thats holding me back, when playing BF4 with mantle i have no problems and the FPS is even.


I highly doubt a hyper-threaded i7 930 is bottlenecking your system. Drivers beyond 13.12 have been problematic for quite some time, showing all sorts of problems with or without Mantle. Remove all drivers, install 13.12 w/o Mantle, and see where you end up.


----------



## Roboyto

Quote:


> Originally Posted by *Frontside*
> 
> Is there anything what can be done to solve this problem? RMAing the cars is not an option. I bought it in Germany, but live in Russia. Sending it there and back again if i get lucky enough, will take more that a month and a half, thanks to Russian post.


Remove the heatsink/cooler and see if good contact is being made on VRM1. VRM1 highlighted in yellow.


----------



## cephelix

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Roboyto*
> 
> Remove the heatsink/cooler and see if good contact is being made on VRM1. VRM1 highlighted in yellow.






On a side note, thanks for the picture.
So I know it's advisable to use a full-cover waterblock, but since mine is a 290 Gaming from MSI, there isn't really one out there for it yet.
Hence a hypothetical question: if I use a universal waterblock and have the VRMs cooled via heatsinks with a fan blowing directly on them, would that be sufficient, especially for VRM1 cooling?
Also, would the heatsinks stay on if I use Fujipoly thermal pads? Or should I use thermal adhesive (hoping to avoid that) instead?


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> 
> On a side note, thanks for the picture.
> So i know it's advisable to use a full cover water block but since mine a a 290 gaming from MSI, there isn't really one out there for it, yet.
> Hence, a hypothetical qn, that if i use a universal waterblock and have the VRMs cooled via heatsinks with a fan blowing directly on it, would that be sufficient, especially for VRM1 cooling?
> Also, would the heatsinks stay on if I use Fujipoly thermal pads? or should i use thermal adhesive(hoping to avoid) instead?


Yeah, that would work to cool them. It wouldn't be much different than using the Arctic Accelero or the NZXT Kraken G10 with an AIO cooler. Someone else would have more accurate information regarding attaching heatsinks, but I'm pretty sure some, if not all, will require some kind of thermal adhesive to keep them held on securely. I've personally only used the Arctic Cooling thermal epoxy for attaching heatsinks with the Accelero coolers.

EDIT: Just looked up Arctic Thermal Adhesive, and they blatantly state it is permanent.

http://www.arcticsilver.com/pdf/appinstruct/asta/ins_asepxy.pdf

If you can get away with pads on the VRM1, Fujipoly Ultra Extreme is definitely the way to go.

http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures

I've seen some people cut off the VRM1 heatsink/cooling from their coolers and use it with an AIO water cooler on the core. That could be an option for you.


----------



## cephelix

Was hoping for something strong, but still non-permanent.
Read your thread on the FUE, and was amazed at the temp differences.

I do have the Arctic Accelero Hybrid though. Might give it a read as to mounting it on my 290. The end result wouldn't be as pretty as a custom loop, but if the temps aren't going to be much different, then why not?


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> Was hoping for something strong,but still non permanent.
> Read your thread on the FUE,and was amazed at the temp differences.
> 
> I do have the arctic accelero hybrid though.might give a read as to mounting it on my 290.the end result wouldnt be as pretty as a custom loop but if temps arent gonna be much of a difference,then why not


Core temps will likely be very similar to a full loop, depending on how extravagant a loop it is anyhow. The AIO has the advantage of not having to cope with VRM1. The primary reason the reference coolers were so terrible was that they were first cooling the volcanic VRM1, and the hot air from there was then going over the core. VRM1 temps are the biggest difference between full-cover blocks and other cooling means. Just make sure you have healthy airflow over VRM1, especially if you plan on OCing, and you should be OK.


----------



## cephelix

Thanks for that. Will give it a go when I have the time/resources and get back to you with any questions I might have...


----------



## Paul17041993

Quote:


> Originally Posted by *phallacy*
> 
> So I had a weird problem occur yesterday. Was playing Arkham Origins when suddenly I got a BSOD saying system exception thread not handled as the error code. I restart and all of a sudden CCC won't load and Device Manager was telling me "windows has stopped this device because it has communicated that it's not working properly" code 43 for all 3 of my cards. All my restore points were deleted somehow too and my ssd boot drive keeps getting loaded with files I can't see because the available space keeps getting lower and lower everytime I look at it for no apparent reason. ( but that's another problem). Anyways, I had to disable all my cards except 1 uninstall 14.2 betas and then install 14.3. Cards seem to be ok now but I've never had a BSOD from overclocking that didn't say atikmdiag exception or whatever it says usually for driver problems.


Run a memtest overnight, then check the status of your SSD and its performance; it's quite possible you have a bad SATA cable on it or another drive.

The above can cause all sorts of problems, but if you pass that, I would think you'd want to format the SSD and do a fresh Windows install...

Edit: oh, and if you're overclocking the CPU, be sure to run things like Prime95 for a bit every now and then to make sure it's still stable; sometimes you need to add a tad more voltage.


----------



## phallacy

Quote:


> Originally Posted by *Paul17041993*
> 
> run a memtest overnight, then check the status of your SSD, check its performance, and its quite possible you could have a bad SATA cable on it or another drive.
> 
> those above can cause all sorts of problems, but if you pass that I would think you would want to format the SSD and do a fresh windows install...
> 
> edit; oh and if you're overclocking CPU, be sure to runs things like prime95 for a bit every now and then to be sure its still stable, sometimes you need to add a tad more voltage.


Thanks for the speedy reply, Paul. I ran the initial batch of stability testing when I first did all the OCs, and everything was stable with the settings I refined: AIDA64 memtest, Prime95, LinX, and Cinebench, plus Valley, Heaven, games, and 3DMark for the GPUs. I'm using the XMP profile for my memory and never noticed a problem. I think you may be on to something with the stability of the SSD, but the SSD is also very new, about 3 months old; I will check it over the weekend.


----------



## Roboyto

Quote:


> Originally Posted by *phallacy*
> 
> Thanks for the speedy reply Paul. I had run the initial batch of stability testing when I first did all the OCs and they were all stable with the settings I refined. Includes Aida64 memtest, prime95 linx and cinebench. Valley, Heaven, games and 3dmark for the gpus. I'm using the xmp profile for my memory and never noticed a problem. I think you may be on to something with the stability of the SSD. But even the ssd is also very new about 3 months old so I will check that over the weekend.


MemTest doesn't always tell all from my experiences. On 2 separate occasions I have had MemTest run for 12+ hours showing no errors and the RAM was actually faulty. This happened with G.Skill Ares and Corsair Vengeance DDR3 RAM. If you have extra RAM somewhere I would consider swapping it in for testing purposes. The problem with synthetic benchmarks is they don't simulate exactly how things happen in real-world use scenarios.









Also a quick removal and re-seating of the DIMMs can fix strange quirks like this; one tiny piece of dust can throw it all off.


----------



## cephelix

Quote:


> Originally Posted by *Roboyto*
> 
> MemTest doesn't always tell all from my experiences. On 2 separate occasions I have had MemTest run for 12+ hours showing no errors and the RAM was actually faulty. This happened with G.Skill Ares and Corsair Vengeance DDR3 RAM. If you have extra RAM somewhere I would consider swapping it in for testing purposes. The problem with synthetic benchmarks is they don't simulate exactly how things happen in real-world use scenarios.


Hmm, so is there a bench that is more representative of real-world conditions?


----------



## phallacy

Quote:


> Originally Posted by *Roboyto*
> 
> MemTest doesn't always tell all from my experiences. On 2 separate occasions I have had MemTest run for 12+ hours showing no errors and the RAM was actually faulty. This happened with G.Skill Ares and Corsair Vengeance DDR3 RAM. If you have extra RAM somewhere I would consider swapping it in for testing purposes. The problem with synthetic benchmarks is they don't simulate exactly how things happen in real-world use scenarios.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also a quick removal and re-seating of the DIMMs can fix strange quirks like this; one tiny piece of dust can throw it all off.


Thanks! I'll definitely reseat the DIMMs. I remember when putting the Dominator Platinums in that I couldn't tell if they were secure, due to the heat spreader being so tall and the GPU covering the bottom side of the tabs. They all "clicked" in, so I assumed they were OK. What kind of problems were you noticing that led you to conclude the RAM was faulty after your memtesting? I do have a kit of Vengeance 1600MHz, but I really don't want to find out the Dominator Platinums aren't working properly.


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> Hmm, so is there a bench that is more representative of real-world conditions?


Hard to say TBH, this is just my experience. They all do a pretty good job, but if you consider all the different things you could be doing simultaneously it is hard to replicate.

Strange performance issues and unexplainable BSODs always turn me to: re-seating components, unplugging/plugging cables, switching RAM slots, testing DIMMS individually, or swapping in known good RAM.

It's always the stupidest little thing that can make a difference. Just had an issue when I was reinstalling my delidded/naked 4770k; thought my Z87 Gryphon died. Took 2 reseats of the CPU for everything to function properly.

Quote:


> Originally Posted by *phallacy*
> 
> Thanks ! I'll definitely reseat the DIMMs I remember when putting the dominator platinums in that I couldn't tell if they were secure due to the heat spreader being so tall and the gpu covering the bottom side of the tabs. They all "clicked" in so I assumed they were ok. What kind of problems were you noticing that lead you to conclude that the RAM was faulty after your memtesting? I do have a kit of vengeance 1600mhz but I really don't want to find out the dominator platinums I have aren't working properly


I have Dominator Platinums as well... it's fairly unlikely they're bad, considering the rigorous testing they're put through... but it's not impossible! I was getting strange BSODs and lackluster performance overall, especially when booting. On a positive note, Corsair offers advanced RMA. It does require a CC/debit card number, but they ship you out a new set with a return label. Just drop your faulty memory in the box and send it back. Couldn't be any easier than that.


----------



## cephelix

Noted. Same here too. I've Prime95'd my way through an OC on small FFTs and it'll pass, but sometimes when watching videos I'll BSOD.
No idea why. For now I'm tempted to wipe my SSD boot drive clean and retry my OC.


----------



## Roboyto

Quote:



> Originally Posted by *cephelix*
> 
> Noted. Same here too. I've prime95ed my way through an OC on small ftt and it'll pass, but sometime when watching videos i'll BSOD.
> No idea why. For now tempted to wipe my ssd boot drive clean and re try my OC.


Never hurts to reset to stock settings to see if the issue persists. If it does then there is another factor involved.


----------



## cephelix

On stock I encounter no problems, so I'm guessing it is my OC, but the VTT and CPU ratio are leaving me stumped.


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> On stock I encounter no problems so I'm guessing it is my OC. but the VTT and CPU ratio is leaving me stumped.


It's usually an uphill battle in this hobby







Just have to tinker some more... maybe reset to stock and start over. I find IBT to be a better stress test than Prime; I believe I read somewhere that Linpack is what Intel uses to test chips... can anyone comment on that? Just keep in mind IBT will bring temps up a bit higher than Prime, ~20°C from my experience.


----------



## cephelix

I just noticed that too this evening when I was trying to find my max BCLK.
The best I got was 200MHz BCLK at 1.13V VTT; it passed IBT for 5 minutes (initial test), and the max temp was 59°C at a 26°C ambient. Not sure if I should increase VTT more at this point. Trying for a 4.0GHz OC.


----------



## trelokomio58

Hi guys!
I have a 290X reference card (Gigabyte) and I want to flash it back to the stock BIOS (both switches: the quiet-mode stock BIOS and the uber-mode stock BIOS)...

I tried some different BIOSes on my card, but I didn't keep a backup of the card's original BIOS, and now I want to go back to stock and don't know where to find the correct BIOS.

If someone can help me, or if someone with a reference 290X can upload the stock BIOSes from their card (both quiet and uber), I would appreciate it!

Thanks in advance.


----------



## rdr09

Quote:


> Originally Posted by *trelokomio58*
> 
> Hi guys!
> I have a 290X reference card (Gigabyte) and I want to flash it back to the stock BIOS (both switches: the quiet-mode stock BIOS and the uber-mode stock BIOS)...
> 
> I tried some different BIOSes on my card, but I didn't keep a backup of the card's original BIOS, and now I want to go back to stock and don't know where to find the correct BIOS.
> 
> If someone can help me, or if someone with a reference 290X can upload the stock BIOSes from their card (both quiet and uber), I would appreciate it!
> 
> Thanks in advance.


make sure the model number matches yours . . .

http://www.gigabyte.com/support-downloads/download-center.aspx?kw=GV-R929XD5-4GD-B+


----------



## trelokomio58

Quote:


> Originally Posted by *rdr09*
> 
> make sure the model number matches yours . . .
> 
> http://www.gigabyte.com/support-downloads/download-center.aspx?kw=GV-R929XD5-4GD-B+


Thanks mate!!!








Yes that's exactly the model of the card!









I downloaded the file, and there are two BIOSes inside!
1) R929XD5-4GD-B-M.rom

2) R929XD5-4GD-B-P.rom
Which one is the quiet BIOS and which is the uber BIOS?


----------



## Matt-Matt

I'm back..

So I ordered an Aqua Computer 290X waterblock and the aktiv backplate about a month ago, and the ETA was about a week, so I decided to wait. It's now been a month and the date just got pushed back another 30 days, so I've cancelled the order with them.

I'm just wondering what the best choice for block would be?


EK Fullcover Acetal Nickel Link
Koolance VID-AR290X Link
Aquacomputer Copper (nowhere does nickel) Link
Any other suggestions available in Australia? Price isn't much of an issue, as I should be getting $250 back from the other block.
I'd love to get a block to match my Heatkiller CPU block... but that doesn't look like it's gonna happen either.


----------



## esqueue

Hmm, went from 13.12 to 14.3 and I've gotten a few blue screens already. Anyone else having this issue?


----------



## gameover1320

Hi Guys,

Looks like I am part of the club. I purchased an R9 290X MSI Gaming 4G. The card is awesome. Before I start overclocking, I thought I would post my stock Heaven results.


----------



## Frontside

Quote:


> Originally Posted by *gameover1320*
> 
> Hi Guys,
> 
> looks like I am part of the club. I purchased a r9 290x MSI Gaming 4g. the card is awesome. before I start overclocking I thought I would post stock heaven results
> 
> .


Congrats. Could you check your VRM temperatures, please? Mine get really hot.


----------



## Paul17041993

Quote:


> Originally Posted by *phallacy*
> 
> Thanks for the speedy reply Paul. I had run the initial batch of stability testing when I first did all the OCs and they were all stable with the settings I refined. Includes Aida64 memtest, prime95 linx and cinebench. Valley, Heaven, games and 3dmark for the gpus. I'm using the xmp profile for my memory and never noticed a problem. I think you may be on to something with the stability of the SSD. But even the ssd is also very new about 3 months old so I will check that over the weekend.


I might have actually just gotten the same BSOD you did; was it 0x7e? I was [email protected] at the time.

I went to reinstall drivers, only to find CCC and the actual drivers seemed to be messed up and couldn't be uninstalled. I decided to give the AMD clean uninstall utility a shot; however, this also screwed me over in that it killed the RAID drivers. Fortunately I could still get back into safe mode and patch them back in. I have now reinstalled 13.12 again from scratch (I *was* on 13.12 before, but with the way it was messed up I'm not sure if it somehow installed 14.1 back over it...).


----------



## esqueue

Quote:


> Originally Posted by *Paul17041993*
> 
> I might have actually just gotten the same BSOD you did, was it 0x7e? I was [email protected] at the time.
> 
> went to reinstall drivers to find CCC and the actual drivers seemed to be messed up and couldn't be uninstalled, decided to give the AMD clean uninstall utility a shot, however this also screwed me over in that it killed the RAID drivers, fortunately I could still get back into safemode and patch them back in, I now reinstalled 13.12 again from scratch (I *was* on 13.12 before, but with the way it was messed up I'm not sure if it somehow installed 14.1 back over it...)


I think you may have quoted the wrong person. I am unsure of the exact error code, as I don't know where it is listed in Windows 8. I uninstalled 13.12 with their uninstall utility as they recommended and installed 14.3. I've since gotten around 3 BSODs while watching videos. Oddly enough, I completed Castlevania: Lords of Shadow 2 and didn't get one BSOD. If it keeps up, I may go to 14.2, 14.1, or back to 13.12.


----------



## Paul17041993

Quote:


> Originally Posted by *esqueue*
> 
> I think that you may have quoted the wrong person.


The original was this:


Spoiler: quote



Quote:


> Originally Posted by *phallacy*
> 
> So I had a weird problem occur yesterday. Was playing Arkham Origins when suddenly I got a BSOD saying system exception thread not handled as the error code. I restart and all of a sudden CCC won't load and Device Manager was telling me "windows has stopped this device because it has communicated that it's not working properly" code 43 for all 3 of my cards. All my restore points were deleted somehow too and my ssd boot drive keeps getting loaded with files I can't see because the available space keeps getting lower and lower everytime I look at it for no apparent reason. ( but that's another problem). Anyways, I had to disable all my cards except 1 uninstall 14.2 betas and then install 14.3. Cards seem to be ok now but I've never had a BSOD from overclocking that didn't say atikmdiag exception or whatever it says usually for driver problems.






He got a SYSTEM_THREAD_EXCEPTION_NOT_HANDLED; I got the same type of error on mine with [email protected] on 13.12 drivers.
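For anyone trying to match their crash to the stop codes discussed above: Windows bugcheck codes have standard symbolic names in Microsoft's bugcheck reference (0x7E is SYSTEM_THREAD_EXCEPTION_NOT_HANDLED; Windows sometimes reports it with a 0x1000... variant prefix). The lookup below is only an illustrative sketch with a hypothetical `describe` helper, not a tool from this thread:

```python
# A few common Windows bugcheck (BSOD stop) codes relevant to GPU/driver crashes.
# Names follow Microsoft's bugcheck code reference; this dict is illustrative,
# not exhaustive.
BUGCHECKS = {
    0x0000000A: "IRQL_NOT_LESS_OR_EQUAL",
    0x0000003B: "SYSTEM_SERVICE_EXCEPTION",
    0x00000050: "PAGE_FAULT_IN_NONPAGED_AREA",
    0x0000007E: "SYSTEM_THREAD_EXCEPTION_NOT_HANDLED",
    0x00000116: "VIDEO_TDR_FAILURE",  # display driver timeout, common on GPU faults
}

def describe(code: int) -> str:
    """Return the symbolic name for a stop code, masking off the 0x1000... variant bit."""
    return BUGCHECKS.get(code & 0x00FFFFFF, "UNKNOWN_BUGCHECK")

print(describe(0x7E))        # the code discussed above
print(describe(0x1000007E))  # the variant form Windows sometimes reports
```

Both calls resolve to the same name, which is why 0x7E and 0x1000007E reports in this thread describe the same crash.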


----------



## htapocysp

Got my Asus R9 290X today and it rocks, especially for the $480 I paid for it. Open box FTW!

http://www.3dmark.com/fs/1925607

Kizwan is my hero.


----------



## kizwan

Quote:


> Originally Posted by *htapocysp*
> 
> Got my Asus R9 290X today and it rocks, especially for the $480 I paid for it. Open box FTW!
> 
> http://www.3dmark.com/fs/1925607
> 
> If anyone got a free BF4 code with their R9 290X and wants to give it to me, please message me


This is your lucky day. I have extra BF4 code here. I'll PM the code to you.


----------



## Kalistoval

Anybody have an Asus R9 290X DirectCU II stock *heat sink* they wanna sell?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I'm running basically the rig you describe (4820k though) with 21 fans in total and d5 pumps all on a single 1300g2. Works great and as you're thinking, you won't need to overclock the gpu's for anything right now. I also have a rig with a 780ti clocked over 1200mhz and I never ever game on it anymore. I've had no issues with tri-fire on my 290 rig. It plows through all games.
> 
> Enjoy!!!


You sure have maaaate .......... beasty


----------



## Matt-Matt

I have had black screens on 13.12 upon opening Afterburner and the card entering a certain state (when I open VLC for a few minutes).


----------



## VulgarDisplay

I'm truly impressed with my Sapphire R9 290 Tri-X OC. Playing BF4 in Mantle at 1100MHz core (14.3 beta, so it throttles over that number) and the core is only seeing 60°C. This cooler is just amazing.

It's the first non-reference card I have ever purchased. I usually get a reference card at launch.


----------



## rdr09

Quote:


> Originally Posted by *trelokomio58*
> 
> Thanks mate!!!
> 
> Yes, that's exactly the model of the card!
> 
> I downloaded the file, and it has two BIOSes inside:
> 1) R929XD5-4GD-B-M.rom
> 2) R929XD5-4GD-B-P.rom
> Which one is the quiet BIOS and which is the uber BIOS?


You can try both, one at a time. Not sure.

Quote:


> Originally Posted by *esqueue*
> 
> Hmm, went from 13.12 to 14.3 and I've gotten a few blue screens already. Anyone else having this issue?


I've been switching between 14.3 and 13.11 (2 HDDs) and have not experienced any of that.

Reinstall 14.3. Use the *driver itself* to uninstall first. Just use Express. I never use DDU or driver sweeper crap.

edit: But 14.3 comes with stutter in some games like BI. C3 and BF3 are smooth.


----------



## Roboyto

Quote:


> Originally Posted by *esqueue*
> 
> Hmm, went from 13.12 to 14.3 and I've gotten a few blue screens already. Anyone else having this issue?


All drivers beyond 13.12 have been showing various problems. Roll back and everything should be fine.


----------



## King4x4

Don't use DDU for 14.3... Had a corrupted windows install and had to reinstall everything!


----------



## trelokomio58

Quote:


> Originally Posted by *rdr09*
> 
> You can try both, one at a time. Not sure.


Yes, I tried both of them: the "M" is the quiet BIOS and the "P" is the uber BIOS!

Thanks mate!


----------



## Niberius

Quote:


> Originally Posted by *King4x4*
> 
> Don't use DDU for 14.3... Had a corrupted windows install and had to reinstall everything!


I have used it multiple times with no issues on 14.3.


----------



## Niberius

I wouldn't think anyone should be getting blue screens from the 14.3 drivers; to me that points to an instability issue somewhere in your system that needs to be addressed. 14.3 and the previous Mantle beta drivers have been problematic with stutter across a wide range of DX games, which makes absolutely no sense, because I don't understand how ATI has done something to screw up the DX function of the driver when Mantle itself has nothing to do with DX anyway.

I'm glad to see others posting about the stuttering issues in the games that I have mentioned as well; the more people that come out about it, the faster we will hopefully get a fix.


----------



## Niberius

Heck, I have even used multiple driver cleaners and have no corruption: ATI uninstall first, then Driver Cleaner Pro, then Driver Sweeper and then DDU. Lastly I used CCleaner to address any registry issues, and everything continues to go smoothly except for the stutter, which was present even on a clean install of Windows.


----------



## Durvelle27

Ok guys, just picked up a reference R9 290X. Hopefully it's a good clocker


----------



## phallacy

Quote:


> Originally Posted by *King4x4*
> 
> Don't use DDU for 14.3... Had a corrupted windows install and had to reinstall everything!


Installing / reinstalling any new 14.xx beta with a crossfire / multi-GPU setup seems to freeze every time for me. It always happens during the installation stage where the screen flickers and the Windows sound comes up identifying new hardware. During the last flicker the whole computer freezes and I have to use DDU or the AMD utility to get rid of the partial install and redo it. My only workaround was disabling all but one card, installing the desired driver, and then switching the rest of the cards back on once the driver was installed successfully. That's how I've been doing it as of late and it's just a PITA

Quote:


> Originally Posted by *Paul17041993*
> 
> I might have actually just gotten the same BSOD you did; was it 0x7e? I was [email protected] at the time.
> 
> I went to reinstall drivers only to find CCC and the actual drivers seemed to be messed up and couldn't be uninstalled, so I decided to give the AMD clean uninstall utility a shot. However, this also screwed me over in that it killed the RAID drivers; fortunately I could still get back into safe mode and patch them back in. I've now reinstalled 13.12 from scratch (I *was* on 13.12 before, but with the way it was messed up I'm not sure if it somehow installed 14.1 back over it...)


I didn't save the error code, I just remember the System Thread Exception not handled part, but it sounds like the problem I was facing. I do not have any RAID setup on my disks so I was not affected in that way. So did this happen to you on 13.12, or were you on 14.1 and switched back to 13.12? I looked up the code online and many of the articles say it's usually because of a driver error and Windows shuts down preemptively to protect itself.


----------



## Mercy4You

Quote:


> Originally Posted by *King4x4*
> 
> Don't use DDU for 14.3... Had a corrupted windows install and had to reinstall everything!


Quote:


> Originally Posted by *Niberius*
> 
> I have used it multiple times with no issues on 14.3.


It has nothing to do with 14.3; it just happens once in a while. It happened to me also 4 months ago. When you read the warnings of DDU, it states that this can occur. Despite the risks, it's the only program that really sweeps your display drivers.


----------



## Niberius

Quote:


> Originally Posted by *Mercy4You*
> 
> It has nothing to do with 14.3; it just happens once in a while. It happened to me also 4 months ago. When you read the warnings of DDU, it states that this can occur. Despite the risks, it's the only program that really sweeps your display drivers.


Right, but so far so good for me; no issues with using DDU on my rig yet and I have used it a lot. Guess there is always gonna be a risk when using such software, but I have always had good luck with programs like these.


----------



## bond32

Well... I'm dumb. Finally figured out how to add more voltage using Afterburner... I was trying to use "w4" when it should have been "w6". This means I should be able to game stable at 1200 core now.


----------



## drserk

I have two 290X reference design cards; one is a Sapphire, the other an Asus. I want to overclock both. Can I change the Sapphire's BIOS to the Asus one? Will the Asus BIOS be any problem on the Sapphire card?
And which BIOS do I have to change, uber or quiet mode?


----------



## phallacy

Quote:


> Originally Posted by *drserk*
> 
> I have two 290X reference design cards; one is a Sapphire, the other an Asus. I want to overclock both. Can I change the Sapphire's BIOS to the Asus one? Will the Asus BIOS be any problem on the Sapphire card?
> And which BIOS do I have to change, uber or quiet mode?


You could; a few people have been having problems switching BIOS on the cards, but most seem to do it OK as long as you follow the tutorial in the 290 to 290X unlock thread.

I don't see a big gain from doing that either.

Use Afterburner or Trixx to overclock your cards.


----------



## spqmax

To proud Asus 290X reference owners: can anyone confirm if there is a warranty sticker covering any screws on the back of the PCB? I'm about to order mine and wouldn't want to void the warranty right away by installing waterblocks. I've read somewhere that there is one covering a screw, but that was for the DirectCU II edition, not the reference. Thanks a bunch!


----------



## jamaican voodoo

Yes, there's a sticker on one of the screws... I didn't care too much, so I removed it and installed my waterblock


----------



## Dan848

There might be something wrong with the latest Flash Player; it would not install properly, and I cannot download it from their website now. Not sure what is going on.

I have Windows 7 64-bit and an R9 290. I can watch YouTube and videos from several other sites; however, probably half of them I cannot see. I too think it has something to do with the latest version of Flash Player; I did not have any problems with the older version. Maybe the new version does not like the Win 7 64-bit and R9 290 combination.


----------



## Mtom

Are there any AMD guys on the forum?

I want to ask/report this to AMD

Powerplay on - 11151 physics points


Powerplay off - 12529 physics points
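To put a number on the gap being reported, here is a quick back-of-the-envelope calculation using only the two scores from the post above (the arithmetic is mine, not Mtom's):

```python
# 3DMark physics scores from the post above.
powerplay_on = 11151
powerplay_off = 12529

# Points lost with PowerPlay enabled, and that loss as a share of the
# PowerPlay-off score.
delta = powerplay_off - powerplay_on
pct = delta / powerplay_off * 100

print(f"PowerPlay on loses {delta} points ({pct:.1f}%)")
```

That works out to roughly an 11% physics-score penalty with PowerPlay enabled, which is why it reads as worth reporting to AMD.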


----------



## Dan848

Quote:


> Originally Posted by *Mtom*
> 
> Are there any AMD guys on the forum?
> 
> I want to ask/report this to AMD
> 
> Powerplay on - 11151 physics points
> 
> 
> Powerplay off - 12529 physics points


Did you include those pictures when you became a member of the AMD R9 290X / 290 Owners Club? These are probably too new.

Did you try contacting Chip or one of the other staff?

http://www.overclock.net/f/68/graphics-cards


----------



## PachAz

I don't get it. I run 1060/1470 with +56mV and 50% power in TriXX and I have done like 10 rounds of 3DMark Firestrike Extreme with no crash or artifacts, but once I play WoT, sometimes I get artifacts now and then. Sometimes I can play for hours with no issues. I don't wanna go more than +56 on the VDDC offset because I want to keep my temps and fan speed down. I have tried with 1050 on the core and I will see if I get any issues. I don't really know if my clock is unstable or if it's the drivers/games messing things up.


----------



## Imprezzion

It's your memory. Not core. 1060Mhz core is still +0mV territory.


----------



## PachAz

Quote:


> Originally Posted by *Imprezzion*
> 
> It's your memory. Not core. 1060Mhz core is still +0mV territory.


Are you sure? Because I tuned the core first, and the artifacts I got while the memory was still at stock disappeared when I upped the VDDC in TriXX. I tuned the memory after that, and I got artifacts before I even touched the memory, only when I was messing around with the core.


----------



## Niberius

Quote:


> Originally Posted by *PachAz*
> 
> Are you sure? Because I tuned the core first, and the artifacts I got while the memory was still at stock disappeared when I upped the VDDC in TriXX. I tuned the memory after that, and I got artifacts before I even touched the memory, only when I was messing around with the core.


.

Sounds right, I can do 1080mhz core stable without even messing with voltage.


----------



## kizwan

Two words: silicon lottery. It doesn't mean all chips can do 1050++ without touching VDDC.


----------



## stilllogicz

I'll be joining team red soon. Got 3 290's on their way to me


----------



## cephelix

Congrats!! A big purchase indeed


----------



## PachAz

I don't know what sounds right, but it didn't sound right, especially not when artifacts occurred and disappeared once I increased the VDDC. But yeah, of course you need to increase VDDC once you start clocking the memory after you've found a stable core clock. Or decrease the core clock once you increase the memory, unless you want to make the voltage even higher.


----------



## boldenc

I have noticed that memory clocking is linked to VDDC; somehow, if you increase the VDDC it will help you achieve a higher stable core and memory clock compared to stock VDDC.

Also, I found looping Firestrike Extreme is the best way to detect artifacts, especially test #2, and under load the core voltage has a huge vdroop in this test compared to BF3/BF4.


----------



## Paul17041993

Quote:


> Originally Posted by *phallacy*
> 
> Installing / reinstalling any new 14.xx beta with a crossfire / multi-GPU setup seems to freeze every time for me. It always happens during the installation stage where the screen flickers and the Windows sound comes up identifying new hardware. During the last flicker the whole computer freezes and I have to use DDU or the AMD utility to get rid of the partial install and redo it. My only workaround was disabling all but one card, installing the desired driver, and then switching the rest of the cards back on once the driver was installed successfully. That's how I've been doing it as of late and it's just a PITA
> I didn't save the error code, I just remember the System Thread Exception not handled part, but it sounds like the problem I was facing. I do not have any RAID setup on my disks so I was not affected in that way. So did this happen to you on 13.12, or were you on 14.1 and switched back to 13.12? I looked up the code online and many of the articles say it's usually because of a driver error and Windows shuts down preemptively to protect itself.


I guess I possibly had one of the 14.x betas installed; seeing as it wouldn't uninstall correctly for a clean install, it was pretty borked. I haven't had any problems folding now, so a clean 13.12 seems perfectly stable...


----------



## airisom2

Alright guys. I just want to confirm this one more time before I RMA my PCS+ 290Xs. Could someone please run Valley 1080p extreme hd on their 290x at 1050/1350? Thanks.

This is my score on one 290x (1050/1350) no tweaks:


----------



## boldenc

Quote:


> Originally Posted by *airisom2*
> 
> Alright guys. I just want to confirm this one more time before I RMA my PCS+ 290Xs. Could someone please run Valley 1080p extreme hd on their 290x at 1050/1350? Thanks.
> 
> This is my score on one 290x (1050/1350) no tweaks:


this is mine


Trix 290x @ 1050/1350 no tweaks


----------



## airisom2

Well, that settles it. Thanks.


----------



## spqmax

Guys, what am I to expect from a crossfire/trifire setup? Are the microstuttering + sound issues really that bad?


----------



## taem

Quote:


> Originally Posted by *airisom2*
> 
> Well, that settles it. Thanks.


Unfortunate you got an underperforming card. Hope PowerColor takes care of you.

But did you ever try the crossfire? IIRC you bought two of these, no? Wondering what temps you got.


----------



## Niberius

Quote:


> Originally Posted by *kizwan*
> 
> Two words: silicon lottery. It doesn't mean all chips can do 1050++ without touching VDDC.


That's right.


----------



## gameover1320

Quote:


> Originally Posted by *Frontside*
> 
> Congrats. Could you check VRM temperatures, please. Mine gets really hot


my vrm1 max temp is 84


----------



## airisom2

Quote:


> Originally Posted by *taem*
> 
> Unfortunate you got an underperforming card. Hope Powercolor takes care of you.
> 
> But did you ever try the crossfire? Iirc you bought two of these no? Wondering what temps you got.


Well, I never paid attention to them since I was too busy and aggravated trying to figure out the performance drops I was getting, and because I knew they were good, but whenever I get my replacements, I'll record them. For now, here are some pics of the cards installed.







I had to stagger the fans due to the pcie connectors interfering.


----------



## Roboyto

Quote:


> Originally Posted by *spqmax*
> 
> To proud Asus 290X reference owners: can anyone confirm if there is a warranty sticker covering any screws on the back of the PCB? I'm about to order mine and wouldn't want to void the warranty right away by installing waterblocks. I've read somewhere that there is one covering a screw, but that was for the DirectCU II edition, not the reference. Thanks a bunch!


ASUS does not void the warranty for removing the stock heatsink/fan, as long as there is no damage to the card. If you must RMA, the card must be returned to stock configuration. I did this with my HD 7870, which was watercooled, and had no issues whatsoever. Also, calling in an RMA makes things MUCH faster than waiting for the online form to go through.

Quote:


> Originally Posted by *airisom2*
> 
> Alright guys. I just want to confirm this one more time before I RMA my PCS+ 290Xs. Could someone please run Valley 1080p extreme hd on their 290x at 1050/1350? Thanks.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> This is my score on one 290x (1050/1350) no tweaks:


That score does seem low. My reference XFX Black Edition 290, all stock voltages/clocks of 980/1250, scored 2408


----------



## airisom2

Yeah, they're low. The 290 I had back in November got 64.7fps at 1150/1450. At the same clocks, my 290x gets 64.0fps. It should be closer to 68-69fps.
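As a sanity check on how far off that result is, the shortfall works out to roughly 6-7%. The figures come from the post above; taking the midpoint of the quoted 68-69fps range as the expected value is my assumption:

```python
# Valley fps figures from the post above.
measured = 64.0                       # 290X at 1150/1450
expected_low, expected_high = 68.0, 69.0
expected = (expected_low + expected_high) / 2  # midpoint, 68.5fps

# How far below the expected range the card lands, in percent.
shortfall_pct = (expected - measured) / expected * 100
print(f"~{shortfall_pct:.1f}% below expected")
```

A consistent single-digit deficit across runs like this is what makes an RMA case, rather than normal run-to-run variance.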


----------



## Durvelle27

Quote:


> Originally Posted by *Roboyto*
> 
> ASUS does not void warranty for removing stock heatsink/fan, as long as there is no damage to the card. If you must RMA the card must be returned to stock configuration. I did this with my HD7870 which was watercooled and had no issues whatsoever. Also, calling in an RMA makes things MUCH faster than waiting for the online form to go through.
> 
> That score does seem low. My reference XFX Black Edition 290, all stock voltages/clocks of 980/1250, scored 2408


What about PowerColor and HIS


----------



## kpoeticg

Can't speak for HIS, but PowerColor doesn't have any tamper seals on their 290Xs.

You can break the card down all the way and put it back together with no proof. One of the reasons I chose them. Reference is reference; it's just easier to not have to worry about tamper seals


----------



## Roboyto

Quote:


> Originally Posted by *Durvelle27*
> 
> What about PowerColor and HIS


PowerColor I am uncertain about; I think you would be OK... there are plenty of people in this forum with PowerColor cards who would be able to confirm/deny.

It doesn't look like HIS has any Warranty Void stickers on the screws so you may be OK. However, I do know back when I was in the market for a 7950/7970 I contacted HIS. They said they did not want me to remove the stock HSF and it would void warranty; this was over a year ago at this point.

This is what HIS has listed on their site:

*"**What is covered by the Warranty?*

The product warranties that maintain and repair faulty HIS graphic cards within its warranty period.

*What is NOT covered by the Warranty?*

This Limited Warranty will not apply if a claim is made arising from any unacceptable use or care of the Product, including (without limitation) physically damaged, graphics card is burnt out, misuse, abuse, negligence, acts of God, unauthorized modification or repair, unauthorized commercial use and any operation of the Product outside HIS recommended parameters. This Limited Warranty is also invalid if any serial number or date stamp on the Product has been altered, obliterated or removed. Cosmetic damage and normal wear and tear are not covered.

This Limited Warranty is only valid in the country where originally purchased. If products were shipped internationally by an authorized reseller, country of original purchase is the shipping point of the reseller. This Limited Warranty applies only to the *original purchaser and is non-transferable.*

HIS is not responsible for any loss, damage or shipping charges incurred to our RMA Center. Additionally, HIS reserves the right to inspect and verify the defects of the product(s) that are returned. Replacement or refund of purchased product is not included in the warranty.

*Cooler / Fan Replacement*

*We regret to inform you that we do not suggest users to replace or install the cooler/fan themselves, as it would damage the ASIC / graphic card and void the warranty.* We could not supply the individual cooler/fan.*"*


----------



## Roboyto

Also for anyone else who may be wondering, XFX cards *DO* have tamper seals on the screws. *HOWEVER, they do not apply to the North American market.*



I found this on NewEgg in one of the 290(X) reviews.


----------



## kpoeticg

Yeah, but XFX also publicly stated that black screen cards are user error and they won't accept RMAs for them, so I will never buy an XFX product EVER


----------



## taem

Quote:


> Originally Posted by *airisom2*
> 
> Well, I never paid attention to them since I was too busy and aggravated trying to figure out the performance drops I was getting, and because I knew they were good, but whenever I get my replacements, I'll record them. For now, here are some pics of the cards installed.
> 
> 
> 
> 
> 
> 
> 
> I had to stagger the fans due to the pcie connectors interfering.


Is that a magnetic mount? I use a similar setup with a pci bracket mount from moddiy that takes triple 80/92mm fans, but obviously that requires a vertical expansion slot that few cases have. Really does make a huge difference for dual Gpus.


----------



## Durvelle27

Quote:


> Originally Posted by *kpoeticg*
> 
> Can't speak for HIS, but Powercolor doesn't have any Tamper Seal's on their 290x's


Seems it's an AMD-branded reference 290X


----------



## Roboyto

Quote:


> Originally Posted by *kpoeticg*
> 
> Yeah but XFX also publicly stated that Black Screen Cards are user error and they won't accept RMA's for them = i will never buy an xfx product EVER


Interesting...I'd like to see it in writing for their warranty terms.

Anyway, if my card was going to black screen at any point between stock and +200mV offset clocking up to 1300/1700 it probably would have already. It's solid as a rock running 1200/1500 with +87mV offset.


----------



## kpoeticg

Yeah, if it hasn't happened yet, you're good. For me it was pretty instant.

I don't think it's stated on their warranty terms. Doesn't really need to be. Random black screens are hard to reproduce, so it's a bit of a loophole. Any company can test a horribly defective card without finding a single issue with it.

Quote:


> Originally Posted by *Durvelle27*
> 
> Seems it's an AMD-branded reference 290X


Yeah all reference cards are the same PCB, but AMD doesn't actually sell the cards. Companies like Sapphire, Powercolor, HIS, etc do. Different companies have different policies about tampering with the stock cooler. Some have tamper-seals on the cooler, some don't, some have tamper-seals but still let you remove it...

Quote:


> Originally Posted by *taem*
> 
> Is that a magnetic mount? I use a similar setup with a pci bracket mount from moddiy that takes triple 80/92mm fans, but obviously that requires a vertical expansion slot that few cases have. Really does make a huge difference for dual Gpus.


They look like the Akust Magnetic Fan Bridges. Funny, I bought some of those for my XB too. Different purpose tho..



Spoiler: Warning: Spoiler!











Didn't end up using it tho..


----------



## airisom2

Quote:


> Originally Posted by *taem*
> 
> Is that a magnetic mount? I use a similar setup with a pci bracket mount from moddiy that takes triple 80/92mm fans, but obviously that requires a vertical expansion slot that few cases have. Really does make a huge difference for dual Gpus.


Yup. I took off the magnets since they weren't needed. The washers keep the screws from coming out of the holes they're screwed into.

http://www.frozencpu.com/products/16145/slf-11/Akust_Adjustable_Magnetic_Fan_Bridge_Mounting_Kit_BK00-0107-AKS.html

Edit: Oh, you were the guy that was talking about that fan bracket back in the PCS+ 290 thread. I thought I remembered reading something about a triple fan pci bracket


----------



## taem

Quote:


> Originally Posted by *airisom2*
> 
> Edit: Oh, you were the guy that was talking about that fan bracket back in the PCS+ 290 thread. I thought I remembered reading something about a triple fan pci bracket


Yeah. The mount I have won't let me use the modu vent silencer on the side fan mesh, so I thought I might try the akust if it will fit the prolimatech ultra slim 140mm x 15 mm.


----------



## Durvelle27

Quote:


> Originally Posted by *kpoeticg*
> 
> Yeah, if it hasn't happened yet, you're good. For me it was pretty instant.
> 
> I don't think it's stated on their warranty terms. Doesn't really need to be. Random black screens are hard to reproduce, so it's a bit of a loophole. Any company can test a horribly defective card without finding a single issue with it.
> 
> Yeah all reference cards are the same PCB, but AMD doesn't actually sell the cards. Companies like Sapphire, Powercolor, HIS, etc do. Different companies have different policies about tampering with the stock cooler. Some have tamper-seals on the cooler, some don't, some have tamper-seals but still let you remove it...
> 
> They look like the Akust Magnetic Fan Bridges. Funny, I bought some of those for my XB too. Different purpose tho..
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Didn't end up using it tho..


What I mean is, it seems to be unbranded


----------



## HardwareDecoder

Anyone else bummed out that the 290/x isn't gonna support DX12?


----------



## Aussiejuggalo

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Anyone else bummed out that the 290/x isn't gonna support DX12?


I thought Mantle was meant for getting away from DX?


----------



## Durvelle27

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Anyone else bummed out that the 290/x isn't gonna support DX12?


AMD has said the HD 7000 and R7 and up series will support DX12.

Nvidia Fermi, Kepler, and Maxwell will have support for DX12 also


----------



## HardwareDecoder

Quote:


> Originally Posted by *Durvelle27*
> 
> AMD has said the HD 7000 and R7 and up series will support DX12.
> 
> Nvidia Fermi, Kepler, and Maxwell will have support for DX12 also


Really? I could be wrong, and I hope I am, but I saw something on here saying that DX12 will require new hardware?
Quote:


> Originally Posted by *Aussiejuggalo*
> 
> I thought Mantle was meant for getting away from DX?


Well yes, that is the plan, and it does seem support is growing for Mantle, but only time will tell how much, I guess.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Well yes, that is the plan, and it does seem support is growing for Mantle, but only time will tell how much, I guess.


Well, Mantle came into the game pretty late; it's like some new GPU brand trying to come in and take over from Nvidia... pretty damn hard


----------



## Paul17041993

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Anyone else bummed out that the 290/x isn't gonna support DX12?


Uh, no, DX12 runs on all GCN-architecture GPUs and all modern Nvidia GPUs; it's only driver-dependent, like OpenGL and Mantle are, with no actual architecture dependencies as long as the card at least supports DX11.

The only architectures that may not be supported would be super-low-end and virtually ancient hardware from years back, which doesn't support DX11 anyway.


----------



## Durvelle27

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Really? I could be wrong, and I hope I am, but I saw something on here saying that DX12 will require new hardware?
> well yes, that is the plan and it does seem support is growing for mantle but only time will tell how much support I guess.


Nope


----------



## VulgarDisplay

DX12 will not require new hardware for anything post-GCN and Fermi, for ONLY the driver-overhead-reducing aspects.

Any GPU specific features introduced with the new API will require a new GPU to be purchased.


----------



## wntrsnowg

Just got my reference XFX R9 290 set up via red mod. I have some aluminum heatsinks on the VRMs and memory as well. It runs the core at 55°C and the VRM at 68°C under load while benchmarking Heaven. I'm running 1200 core / 1500 mem at +80mV and +20% power. Are these settings good for these cards? I don't know what you guys typically see


----------



## kizwan

Quote:


> Originally Posted by *boldenc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *airisom2*
> 
> Alright guys. I just want to confirm this one more time before I RMA my PCS+ 290Xs. Could someone please run Valley 1080p extreme hd on their 290x at 1050/1350? Thanks.
> 
> This is my score on one 290x (1050/1350) no tweaks:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> this is mine
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Trix 290x @ 1050/1350 no tweaks

CPU does slightly affect Valley score. Going from my 3820 @4.75GHz down to 3.8GHz, I lose a couple hundred points of Valley score. I honestly don't think your card underperforms just from one benchmark score. I'll need to check out your previous post to understand the problem you're experiencing.

@airisom2, this is mine, a 290 though, @1050/1350, no tweaks (CCC default) & 3820 @4.75GHz.


This at stock clock, 1000/1300 (290 TRI-X BIOS).


----------



## Connolly

What are you guys doing to unlock voltage control now the MSI Afterburner beta has expired? There doesn't seem to be a beta 19 on the official site yet, but there are some floating around on 3rd party sites. When I use the "check for available product beta version" in the application itself, it says that beta 19 is available but then the re-direct takes me to a broken page.

Or is there a way to get the non beta version 2.3.1 to allow voltage control with these cards now?


----------



## Imprezzion

Just download the Beta 19 from Guru3D.

I use it, it works perfectly.


----------



## Matt-Matt

Quote:


> Originally Posted by *kizwan*
> 
> Two words: silicon lottery. It doesn't mean all chips can do 1050++ without touching VDDC.


Yeah, for 1075 I need +13mV. Hopefully this doesn't mean my chip is terrible


----------



## Kalistoval

This might be a long shot, but does anyone have a shroud for sale for an Asus DirectCU II R9 290X? I was cleaning my card and my kids got a hold of the cooler, cuz I obviously wasn't watching them, and damaged it beyond repair. It was funny, but now I'm not laughing cuz I can't find one, haha, owned by toddlers.


----------



## kpoeticg

EK makes a very nice 'Shroud' for it


----------



## Kalistoval

Yea, but I don't have a custom loop


----------



## kpoeticg

Neither did I, until I did.

Sorry, it's easy to tell people on OCN how they should spend thousands LOL


----------



## Mercy4You

Quote:


> Originally Posted by *spqmax*
> 
> To proud Asus 290X reference owners: can anyone confirm whether there is a warranty sticker covering any screws on the back of the PCB? I'm about to order mine and wouldn't want to void the warranty right away by installing waterblocks. I've read somewhere that there is one covering a screw, but that was for the DirectCU II edition, not the reference. Thanks a bunch!


Hate to be a party pooper, but forget about ASUS warranty these days; it sucks!!!


----------



## Tobiman

Wow, it's unbelievable the prices you can pick these cards up for on eBay. I just saw a few reference-cooler 290Xs selling for $400.
Quote:


> Originally Posted by *Kalistoval*
> 
> This might be a long shot, but does anyone have a shroud for sale for an Asus DirectCU II R9 290X? I was cleaning my card and my kids got hold of the cooler (I obviously wasn't watching them) and damaged it beyond repair. It was funny, but now I'm not laughing because I can't find one, haha. Owned by toddlers.


There's an MSI non-reference cooler for sale on ebay for $85.


----------



## Maracus

Anyone on water tried +250mV benching? Just wondering, because I topped out at *1260* and was thinking maybe I could hit *1300*. GPU VRMs stayed in the mid-to-high 50s.


----------



## airisom2

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *boldenc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *airisom2*
> 
> Alright guys. I just want to confirm this one more time before I RMA my PCS+ 290Xs. Could someone please run Valley 1080p extreme hd on their 290x at 1050/1350? Thanks.
> 
> This is my score on one 290x (1050/1350) no tweaks:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> this is mine
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Trix 290x @ 1050/1350 no tweaks
> 
> 
> The CPU does slightly affect the Valley score. Going from a 3820 @4.75GHz to 3.8GHz, I lose a couple hundred points of Valley score. I honestly don't think your card underperforms based on just one benchmark score. I'll need to check out your previous post to understand the problem you're experiencing.
> 
> @airisom2, this is mine, 290 though, @1050/1350, no tweaks (CCC default) & 3820 @4.75GHz.
> 
> 
> This at stock clock, 1000/1300 (290 TRI-X BIOS).

Oh, it (they) underperforms in everything: Firestrike, Valley, Heaven, you name it. Basically, in any GPU-taxing benchmark or game it performs around 290 levels, but slightly worse. The difference between a 4-core and a 6-core (Sandy Bridge and up) is basically negligible in GPU benchmarks, and I've seen a few comparisons between the two in the past. The scores are basically within <1% of each other, given that both are overclocked (Valley/Heaven).


----------



## The Mac

Quote:


> Originally Posted by *HardwareDecoder*
> 
> Anyone else bummed out that the 290/x isn't gonna support DX12?


Nope; by the time DX12 is available for games, current hardware will be outdated and it will be time for an upgrade anyway.


----------



## Witwicky

Greetings. Can you tell me what an average safe voltage is for a properly cooled R9 290? Is keeping +175mV 24/7 a good idea? For my card this means 1.313V set but 1.212V under full load (OCCT) and 1.25-1.3V under average load (damn offset). Is it safe? Thanks for all replies.


----------



## Niberius

Quote:


> Originally Posted by *Witwicky*
> 
> Greetings. Can you tell me what an average safe voltage is for a properly cooled R9 290? Is keeping +175mV 24/7 a good idea? For my card this means 1.313V set but 1.212V under full load (OCCT) and 1.25-1.3V under average load (damn offset). Is it safe? Thanks for all replies.


That is some serious vdroop! You should be fine though, as long as temps are in check. By comparison, on my R9 290X with just +100mV, my actual core voltage under a strenuous load is 1.328V, which gets me 1200MHz on the core.


----------



## Witwicky

Yes. In OCCT the vdroop is very high, because that tool stresses the card to the maximum (with Shader Complexity 7, a 200fps limit, and error checking enabled). But in demanding games like Metro LL, the voltage jumps between 1.25-1.3V. I still have doubts about whether the offset is too much, because sometimes the voltage even spikes to 1.38V :S
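The offset-plus-droop arithmetic being discussed can be sketched as a toy calculation. The stock VID and the droop values below are assumptions inferred from the figures quoted in this exchange (1.313V set, 1.212V in OCCT, ~1.275V in games), not specs for any particular card:

```python
# Toy model of how a software voltage offset and load-line droop combine
# into the voltage the core actually sees. All numbers are illustrative,
# back-calculated from the figures quoted in this thread.

def effective_voltage(base_vid, offset_mv, droop_mv):
    """Set voltage = base VID + software offset; load voltage = set - droop."""
    set_v = base_vid + offset_mv / 1000.0
    return set_v - droop_mv / 1000.0

# Assumed ~1.138V stock VID on an R9 290 with a +175mV offset:
set_point = 1.138 + 0.175                         # ~1.313V requested
occt_load = effective_voltage(1.138, 175, 101)    # heavy load, big droop
game_load = effective_voltage(1.138, 175, 38)     # lighter load, less droop

print(f"set {set_point:.3f}V, OCCT {occt_load:.3f}V, gaming {game_load:.3f}V")
```

The point of the sketch: the same offset produces different effective voltages depending on how hard the load is, which is why an OCCT reading can sit well below what games show.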


----------



## chalkypink

Quote:


> Originally Posted by *Imprezzion*
> 
> Just download the Beta 19 from Guru3D.
> 
> I use it, it works perfectly.


Really? Mine isn't working anymore. The /wi6,30,8d,20 parameter that I used in beta 18 just stopped doing anything. Anyone else having this problem?


----------



## EliteReplay

Anyone with a 290X Matrix? I wanna see some benchmarks from users, not websites.


----------



## phallacy

Quote:


> Originally Posted by *EliteReplay*
> 
> any one with a 290X Matrix? i wanna see some benchmarks
> 
> 
> 
> 
> 
> 
> 
> from users not websites


After seeing the reviews and a general consensus that the combination of Elpida VRAM and the recycled DCU II cooler doesn't offer much more than the reference design or a custom reference design, I'm guessing very few people here will opt for that card given the choice between MSI's Lightning and Asus' Matrix. I wouldn't expect to see that many on here.


----------



## PachAz

I think all R9 290 and R9 290X cards with Elpida memory should be at least 40% cheaper, because according to many people they suck. I feel sorry for those paying big bucks for these cards with the highly unstable memory. A price cut is a must on these cards: I think an okay price for a reference 290 would be around 270 dollars, and around 230 for the Elpida version. For the 290X, 300 for the reference and 250 for the Elpida memory. The R9 cards are beasts, yes, but the fact that they get so freaking hot even with a non-reference cooler and can achieve questionable OC results makes them weaker than the 7970 was for its generation. I think the 7970 gave better value for money than the 290/290X. A 290 is what, like 10-15% faster than a 7970 GHz? At this point, if you are going to watercool your card, pick the cheapest reference one with Hynix memory, and if you are running a non-reference cooler, take the Tri-X, because those are the cheapest ones with Hynix memory if I'm not wrong. Sorry, but neither the 290 nor the 290X is worth 400 dollars; 300 max IMO.


----------



## Niberius

Quote:


> Originally Posted by *PachAz*
> 
> I think all R9 290 and R9 290X cards with Elpida memory should be at least 40% cheaper, because according to many people they suck. I feel sorry for those paying big bucks for these cards with the highly unstable memory. A price cut is a must on these cards: I think an okay price for a reference 290 would be around 270 dollars, and around 230 for the Elpida version. For the 290X, 300 for the reference and 250 for the Elpida memory. The R9 cards are beasts, yes, but the fact that they get so freaking hot even with a non-reference cooler and can achieve questionable OC results makes them weaker than the 7970 was for its generation. I think the 7970 gave better value for money than the 290/290X. A 290 is what, like 10-15% faster than a 7970 GHz? At this point, if you are going to watercool your card, pick the cheapest reference one with Hynix memory, and if you are running a non-reference cooler, take the Tri-X, because those are the cheapest ones with Hynix memory if I'm not wrong. Sorry, but neither the 290 nor the 290X is worth 400 dollars; 300 max IMO.


First thing, my previous card was an HD 7970 running at a healthy 1250MHz core with a big memory overclock to boot. I loved the card; it held its own for a long time and is still more than enough for most games on a single 1080p screen. Now I own an R9 290X: I'm one of the lucky ones who actually purchased an R9 290 at release for $399.99, only to find out it could be flashed to a full-fledged $550.00 R9 290X.









Yes, the card is very hot with stock cooling, but I never leave my video cards on stock cooling anyway, so I purchased an Accelero 3 cooler and now enjoy a 1200MHz core overclock and 1500MHz RAM while never exceeding 64C load temps, in spite of the overclock and the overvolt to 1.328V. As far as performance is concerned you are wrong: my R9 290X even at stock clocks walks all over my old overclocked HD 7970, and at 1200MHz core it's not even close. I have gained as much as 20-30fps in some games, which makes quite a difference if you were previously struggling to maintain 30fps in certain gaming scenarios.

Oh yeah, forgot to mention my card has Elpida RAM, and no, it doesn't suck. I think some folks just got bad batches of it or something, because mine has never given me any issues and clocks pretty decently. IMO the card is worth every bit of the $399.99 I paid for it, especially since it was an R9 290X in disguise which just needed to be unlocked.


----------



## GhostRyder

Quote:


> Originally Posted by *PachAz*
> 
> I think all R9 290 and R9 290X cards with Elpida memory should be at least 40% cheaper, because according to many people they suck. I feel sorry for those paying big bucks for these cards with the highly unstable memory. A price cut is a must on these cards: I think an okay price for a reference 290 would be around 270 dollars, and around 230 for the Elpida version. For the 290X, 300 for the reference and 250 for the Elpida memory. The R9 cards are beasts, yes, but the fact that they get so freaking hot even with a non-reference cooler and can achieve questionable OC results makes them weaker than the 7970 was for its generation. I think the 7970 gave better value for money than the 290/290X. A 290 is what, like 10-15% faster than a 7970 GHz? At this point, if you are going to watercool your card, pick the cheapest reference one with Hynix memory, and if you are running a non-reference cooler, take the Tri-X, because those are the cheapest ones with Hynix memory if I'm not wrong. Sorry, but neither the 290 nor the 290X is worth 400 dollars; 300 max IMO.


How do you figure that? If we do a video card comparison, the R9 290 is about as fast as the GTX 780 priced at $500, and the 290X is better than the 780 and the Titan, which are priced at $500 and $1k respectively. Why would AMD charge any less for cards that perform as well as they do against the competition? They are already better in price to performance; dropping prices to levels that low would just lose AMD money. They can't give video cards away...

Take any card you'd spend only 300 dollars on, compare its benchmark scores from both companies to the 290 or 290X, and see where they stand, even putting past years' top cards up against 300 dollar cards.
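The price-to-performance argument above can be made concrete with a quick ratio. The prices are the MSRPs mentioned in this thread; the relative performance numbers are illustrative placeholders, not benchmark results:

```python
# Price-to-performance: relative performance per dollar spent.
# Prices are the MSRPs quoted in the thread; the performance index
# is a hypothetical placeholder (780 and 290 treated as roughly equal,
# per the post above), not measured data.

cards = {
    # name: (price_usd, relative_performance_index)
    "R9 290":  (399, 100),
    "GTX 780": (499, 100),
    "R9 290X": (549, 106),
}

for name, (price, perf) in cards.items():
    print(f"{name}: {perf / price * 100:.1f} perf per $100")
```

Under these assumed numbers the 290 comes out well ahead per dollar, which is the shape of the argument GhostRyder is making, whatever the exact benchmark figures.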


----------



## PachAz

Well, max 400 for a 290X, as I said and as you agree. But that's it. The cards are good, but not god-like, and certainly not worth premium prices of 600-700 dollars like those overpriced Lightnings, ROGs, etc. I think we live in times where hardware prices don't really reflect actual value, and you can relate to that since you are running an FX-9590, if you know what I mean...


----------



## Jack Mac

Quote:


> Originally Posted by *PachAz*
> 
> Well, max 400 for a 290X, as I said and as you agree. But that's it. The cards are good, but not god-like, and certainly not worth premium prices of 600-700 dollars like those overpriced Lightnings, ROGs, etc. I think we live in times where hardware prices don't really reflect actual value, and you can relate to that since you are running an FX-9590, if you know what I mean...


Lol, you're delusional. $400 for the 290 and $550 for the 290X is perfectly reasonable. Yes, the stock cooling is mediocre at best, but the cards are beasts and AMD deserves to make a good amount of money on them; they can't just give them away, you know. And Elpida is fine, it was just the first batch of cards with Elpida that had issues AFAIK. The Sapphire reference 290 I had was solid at 1450MHz memory, and it had Elpida. 24/7 OC of 1175/1450 at +69mV.


----------



## taem

Quote:


> Originally Posted by *PachAz*
> 
> I think all R9 290 and R9 290X cards with Elpida memory should be at least 40% cheaper, because according to many people they suck. I feel sorry for those paying big bucks for these cards with the highly unstable memory. A price cut is a must on these cards: I think an okay price for a reference 290 would be around 270 dollars, and around 230 for the Elpida version. For the 290X, 300 for the reference and 250 for the Elpida memory. The R9 cards are beasts, yes, but the fact that they get so freaking hot even with a non-reference cooler and can achieve questionable OC results makes them weaker than the 7970 was for its generation. I think the 7970 gave better value for money than the 290/290X. A 290 is what, like 10-15% faster than a 7970 GHz? At this point, if you are going to watercool your card, pick the cheapest reference one with Hynix memory, and if you are running a non-reference cooler, take the Tri-X, because those are the cheapest ones with Hynix memory if I'm not wrong. Sorry, but neither the 290 nor the 290X is worth 400 dollars; 300 max IMO.


The Elpida vs Hynix thing is so overstated. For users who game on air, Elpida is just fine; it all depends on the silicon lottery. My 7970 with Hynix could hardly overclock its memory at all, while my current Elpida 290 hits 1600 on the memory. This is only an issue for extreme overclockers IMHO.

And the 290 is a great card; I consider it a good purchase for $500. That extra horsepower makes all the difference for maxing out key settings while maintaining a constant 60fps at 1440p, which the 7970 cannot do. It's not just about raw horsepower, but whether the extra muscle breaks a threshold.
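taem's "threshold" point can be sketched as a toy frame-time check: on a 60Hz display, a card only feels consistently smooth if its minimum FPS stays at or above the refresh rate. The FPS figures below are illustrative, not measurements from either card:

```python
# A card "breaks a threshold" on a 60Hz display when its minimum FPS
# stays at or above the refresh rate. FPS numbers are illustrative.

def holds_refresh(min_fps, refresh_hz=60):
    """True if the card never dips below one frame per refresh interval."""
    return min_fps >= refresh_hz

def frame_time_ms(fps):
    """Time budget per frame at a given FPS."""
    return 1000.0 / fps

# Hypothetical minimum FPS at 1440p, maxed settings:
for card, min_fps in [("7970 (OC)", 52), ("290 (OC)", 66)]:
    print(f"{card}: {frame_time_ms(min_fps):.1f} ms/frame, "
          f"holds 60fps: {holds_refresh(min_fps)}")
```

With these assumed numbers the slower card misses the 16.7ms-per-frame budget and the faster one clears it, which is why a modest raw-performance gap can translate into a large perceived difference.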


----------



## PachAz

It's not about the cost, it's about what I think it's worth; that's different.


----------



## rdr09

Quote:


> Originally Posted by *PachAz*
> 
> Its not about the cost, its about what I think it is worth, different.


You must have a bad clocker?

edit: my 290 is about 20 - 30% faster than my 7970.


----------



## VulgarDisplay

Quote:


> Originally Posted by *PachAz*
> 
> I think all R9 290 and R9 290X cards with Elpida memory should be at least 40% cheaper, because according to many people they suck. I feel sorry for those paying big bucks for these cards with the highly unstable memory. A price cut is a must on these cards: I think an okay price for a reference 290 would be around 270 dollars, and around 230 for the Elpida version. For the 290X, 300 for the reference and 250 for the Elpida memory. The R9 cards are beasts, yes, but the fact that they get so freaking hot even with a non-reference cooler and can achieve questionable OC results makes them weaker than the 7970 was for its generation. I think the 7970 gave better value for money than the 290/290X. A 290 is what, like 10-15% faster than a 7970 GHz? At this point, if you are going to watercool your card, pick the cheapest reference one with Hynix memory, and if you are running a non-reference cooler, take the Tri-X, because those are the cheapest ones with Hynix memory if I'm not wrong. Sorry, but neither the 290 nor the 290X is worth 400 dollars; 300 max IMO.


200MHz over stock is now questionable OCing on a GPU...

Isn't the base clock around 800MHz on these chips? I'd say even the chips that don't overclock well have far more headroom than what we had a few generations ago.


----------



## Matt-Matt

Quote:


> Originally Posted by *Niberius*
> 
> First thing, my previous card was an HD 7970 running at a healthy 1250MHz core with a big memory overclock to boot. I loved the card; it held its own for a long time and is still more than enough for most games on a single 1080p screen. Now I own an R9 290X: I'm one of the lucky ones who actually purchased an R9 290 at release for $399.99, only to find out it could be flashed to a full-fledged $550.00 R9 290X.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes, the card is very hot with stock cooling, but I never leave my video cards on stock cooling anyway, so I purchased an Accelero 3 cooler and now enjoy a 1200MHz core overclock and 1500MHz RAM while never exceeding 64C load temps, in spite of the overclock and the overvolt to 1.328V. As far as performance is concerned you are wrong: my R9 290X even at stock clocks walks all over my old overclocked HD 7970, and at 1200MHz core it's not even close. I have gained as much as 20-30fps in some games, which makes quite a difference if you were previously struggling to maintain 30fps in certain gaming scenarios.
> 
> Oh yeah, forgot to mention my card has Elpida RAM, and no, it doesn't suck. I think some folks just got bad batches of it or something, because mine has never given me any issues and clocks pretty decently. IMO the card is worth every bit of the $399.99 I paid for it, especially since it was an R9 290X in disguise which just needed to be unlocked.


Same here, my Elpida RAM is great. Doesn't overclock that madly but I can do 1400 without voltage adjustments








Quote:


> Originally Posted by *VulgarDisplay*
> 
> 200mhz over stock is now questionable ocing on a GPU...
> 
> Isn't the base clock around 800mhz on these chips? I'd say even the chips that don't overclock well are far better than what we had a few generations ago in terms of headroom.


Not really? It was back in the days of the 6xxx cards, but with the newer 7xxx, R9, and R7 series it isn't.

Also, the R9 290 comes stock at 947MHz and the R9 290X at 1000MHz; we are talking about the boost clock though.
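Taking the stock boost clocks above as the baseline, overclocking headroom works out as a simple percentage. The achieved clocks below are example figures from earlier in the thread, not a guarantee for any given chip:

```python
# OC headroom as a percentage over the stock boost clock.
# Stock clocks (947/1000MHz) are from this thread; the achieved
# clocks are example figures seen in the thread, not guarantees.

def headroom_pct(stock_mhz, achieved_mhz):
    """Percentage gain of the achieved clock over the stock clock."""
    return 100.0 * (achieved_mhz - stock_mhz) / stock_mhz

print(f"290  @1175MHz: {headroom_pct(947, 1175):.1f}% over stock")
print(f"290X @1260MHz: {headroom_pct(1000, 1260):.1f}% over stock")
```

So even a "questionable" overclocker sitting 200MHz over stock is in the 20%+ range, which supports the point that this generation has more headroom than a few generations back.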


----------



## IBIubbleTea

What is up with TriXX? I have a custom fan profile, and when I restart my computer it goes back to 20% even though it is set to my fan profile. Never had this problem with Afterburner.


----------



## taem

Quote:


> Originally Posted by *IBIubbleTea*
> 
> What is up with TriXX? I have a custom fan profile and when i restart my computer it goes back to 20% even though it is set to my fan profile. Never had this problem with AfterBurner..


Did you check the box in settings to save fan tweak to profile?

But in general Trixx keeps reverting to auto fan, it's a huge pain in crossfire. I usually have to save, load, save, repeat, multiple times before it will stick.


----------



## Tobiman

Well, I think even the 780 Ti should go for around $300 max, but that's not happening, and there's a reason the 770 exists.


----------



## Noviets

Hey guys, just wondering about the PowerColor R9 290X card; it's $120 cheaper than the others. Does it not overclock well?
I want to put a block on it, so aftermarket cooling isn't something I need (I know the stock blowers are horrible).

I'll likely get it from PCCaseGear or Computer Alliance.

I'm running 1080p. I was going to get a second 7970 for CFX, but if I get a 290X the performance is better, right? I could get a second one in the future; will the 8350 hold me back with the second card, or would I still benefit from it?


----------



## Matt-Matt

Quote:


> Originally Posted by *Noviets*
> 
> Hey guys, just wondering about the R9 290X Powercolour card, its $120 cheaper than the others, does it not overclock well?
> I want to put a block on it so aftermarket cooling isn't something I need (as I know that the stock blowers are horrible)
> 
> I'll likely get it from PCCaseGear or Computer Alliance.
> 
> I'm running 1080P, I was going to get a second 7970 for CFX, but if I get a 290X, the performance is better right? I could get a second in the future, will the 8350 hold me back from the second or would I still benefit from it?


To start with, that is a cheap 290X! As it's a reference card, there is basically no difference apart from the stickers and the branding on the box; the only thing that changes is the warranty.

If you want to waterblock it, I think it's the way to go!








The majority of non-reference cards have non-reference PCBs, which means you won't be able to get a block for them (or will at least struggle to), assuming you're talking about a full-cover one.

I'd say don't get a second 7970/280X for Crossfire, as a 290X works out cheaper (once you sell the 7970) and you can add another one later more easily (adding a third 7970 is hard because of the power, heat, and space). You will lose a reasonable amount of performance going to one 290X instead of two 7970s, but the Crossfire issues aren't present, and you'll have less power and heat to deal with (once you waterblock it). It's also a cheaper solution to watercool, as you only need one block.

I came from Crossfire 7950s myself; I'd modded them to run 280W and overclocked them to 1150/1250 or so, so a single 290 is a bit of a downgrade, but I don't really notice it much as everything I play still runs fine and is free of the Crossfire/stuttering issues now.

That being said, I've had more problems with 290 drivers so far, BUT they will get better, whereas 7970 Crossfire won't.


----------



## Noviets

Quote:


> Originally Posted by *Matt-Matt*
> 
> To start with that is a cheap 290X! As it's reference there is basically no difference apart from the stickers on it/branding on the box. The only thing that changes is the warranty however.
> 
> If you want to waterblock it, i think it's the way to go!
> 
> 
> 
> 
> 
> 
> 
> 
> As the majority of non-ref cards have non-ref PCB's which means you won't (or at least will struggle) to get a block for it, assuming you're talking about a fullcover one that is.
> 
> I'd say to not get a second 7970/280x for Crossfire, as a 290x works out cheaper (once you sell the 7970) and you can add another one later easier (Adding three 7970's is hard as the power, heat and space). You will lose a reasonable amount of performance in going to a 290x instead of two 7970's but the crossfire issues aren't present, you'll have less power and heat to deal with (Once you waterblock it). It's also a cheaper solution to watercool as you only need one block.
> 
> I came from Crossfire 7950's myself, I'd had them modded to run 280W and overclocked to 1150/1250 or so and a 290 is a bit of a downgrade but I don't really notice it all that much as everything I play still runs fine and lacks crossfire/stuttering issues now.
> 
> That being said I've had more problems with 290 drivers so far, BUT they will get better, whereas 7970 Crossfire won't.


$549 is cheap for a 290x? I might grab me one later down the road, as I want to get a waterblock and fittings for it at the same time, I'm enjoying my silent system at the moment, and putting a turbine in my case isn't on my to-do list lol

I won't be selling the 7970. I have a row of systems in the office: as I upgrade my primary, my secondary gets the parts, which go into the tertiary, etc.

What clock speed do you think the 8350 will need to use the 290x at full potential? And perhaps adding a second one?

I'm mostly looking forward to the TrueAudio and Mantle advantages of the card, as I'm really into the audio side of things, and I know this processor is going to be holding me back quite a bit.

Do we have any word on a new FX line or top-end chip from AMD this year? What happened to Steamroller?


----------



## Niberius

Quote:


> Originally Posted by *Noviets*
> 
> $549 is cheap for a 290x? I might grab me one later down the road, as I want to get a waterblock and fittings for it at the same time, I'm enjoying my silent system at the moment, and putting a turbine in my case isn't on my to-do list lol
> 
> I wont be selling the 7970, I have a row of systems that I have in the office, as I upgrade my primary, my secondary gets the parts, which go into the tertiary etc.
> 
> What clock speed do you think the 8350 will need to use the 290x at full potential? And perhaps adding a second one?
> 
> I'm mostly looking forward to the TrueAudio and Mantle advantages of the card, as I'm really into the audio side of things, and I know this processor is going to be holding me back quite a bit.
> 
> Do we have any word on a new FX line or top-end chip from AMD this year? What happened to Steamroller?


$549.99 is where the 290X is supposed to be; that was the launch price of the card. It then skyrocketed for a while because of the mining craze, so now suddenly people think they are getting a steal of a deal when the card goes back to its original MSRP. Kind of like how they raise fuel prices to over $4.00 a gallon, then drop them back to $3.50 a gallon and watch people go nuts thinking they are getting a deal, lol.

Anyway, you will need a very hefty overclock on the 8350 to feed the 290X properly, and with 2 in Crossfire you will be severely CPU limited. My buddy has an 8350 clocked to 5GHz paired with a 290X, and we have compared a lot of stuff to my rig with a 2700K clocked to 4.8GHz, which also houses a 290X; there are a few scenarios in games where my rig gives a decent amount more FPS.

In GPU-limited scenarios they are pretty close, with mine still having the edge, but when you get into a CPU-limited situation the difference is pretty large.

None of this is to say that the 8350 is a bad chip; it's not, and you will still have a great gaming experience if you can get a decent overclock with it.


----------



## Jack Mac

Quote:


> Originally Posted by *Noviets*
> 
> $549 is cheap for a 290x? I might grab me one later down the road, as I want to get a waterblock and fittings for it at the same time, I'm enjoying my silent system at the moment, and putting a turbine in my case isn't on my to-do list lol
> 
> I wont be selling the 7970, I have a row of systems that I have in the office, as I upgrade my primary, my secondary gets the parts, which go into the tertiary etc.
> 
> What clock speed do you think the 8350 will need to use the 290x at full potential? And perhaps adding a second one?
> 
> I'm mostly looking forward to the TrueAudio and Mantle advantages of the card, as I'm really into the audio side of things, and I know this processor is going to be holding me back quite a bit.
> 
> Do we have any word on a new FX line or top-end chip from AMD this year? What happened to Steamroller?


You should clock the 8350 to at least 4.6GHz to properly feed a 290/X. My i5 is somewhat comparable to an 8350, and I needed at least 4.4GHz in most games to get the most out of my 290 (when I had it) and my 780. And AFAIK, there's been no word of new FX chips.


----------



## Matt-Matt

Quote:


> Originally Posted by *Noviets*
> 
> $549 is cheap for a 290x? I might grab me one later down the road, as I want to get a waterblock and fittings for it at the same time, I'm enjoying my silent system at the moment, and putting a turbine in my case isn't on my to-do list lol
> 
> I wont be selling the 7970, I have a row of systems that I have in the office, as I upgrade my primary, my secondary gets the parts, which go into the tertiary etc.
> 
> What clock speed do you think the 8350 will need to use the 290x at full potential? And perhaps adding a second one?
> 
> I'm mostly looking forward to the TrueAudio and Mantle advantages of the card, as I'm really into the audio side of things, and I know this processor is going to be holding me back quite a bit.
> 
> Do we have any word on a new FX line or top-end chip from AMD this year? What happened to Steamroller?


Ah yep, fair enough.

I'm not sure about AMD's roadmap for new top end chips.

I'd also say that you want a pretty decent overclock to attain the max FPS you can get from it, but it should still be an improvement over a 7970.

To add, Mantle runs on 7xxx cards, and the 7970 counts as one. The audio side of things isn't released yet, so if that's what's pushing you over, wait and see how it pans out.

Quote:


> Originally Posted by *Niberius*
> 
> $549.99 is where the 290X is supposed to be, that was the launch price of the card and then it sky rocketed for a while because of the mining craze so now suddenly people think they are getting a steal of a deal when the card goes back to its original msrp. Kind of like how they raise fuel prices to over $4.00 a gallon and then drop it back to $3.50 a gallon and watch people go nuts as they think they are getting a deal, lol.
> 
> Anyway, you will need a very hefty overclock on the 8350 to feed the 290X properly and for 2 in crossfire you will be severely CPU limited. My buddy has a 8350 clocked to 5ghz paired up with a 290X and we have compared a lot of stuff to my rig with a 2700K clocked to 4.8ghz which also houses a 290X and there are a few scenarios in games where my rig is giving a decent amount more FPS.
> 
> In gpu limited scenarios they are pretty close with mine still having the edge but when you get into a cpu limited situation the difference is pretty large.
> 
> None of this is to say that the 8350 is a bad chip, its not and you will still have a great gaming experience if you can get a decent overclock with it.


Yeah, but in Australia cards usually cost a lot more. I understand $550 is meant to be the MSRP in the States, but getting the card for that price here is decent, especially when the rest of the cards are easily $650+ if not more. I've seen 290Xs sell on eBay for $550 used, so I still think it's a really good deal for us Australians, despite not being below MSRP.

If you're keen, Noviets, you could go for a used one off eBay. I found quite a few 290s on there for like $450 and under the other day.








Quote:


> Originally Posted by *Jack Mac*
> 
> You should clock the 8350 to at least 4.6GHz to properly feed a 290/X. Seeing as my i5 is somewhat comparable to an 8350 and I need at least 4.4GHz in most games to get the most out of my 290 (when I had it) and 780. And AFAIK, there's been no word of new FX chips.


I'm at stock on the 3570K at the moment too. I'm keen to see what I get when I'm watercooled again (literally counting down the days) and can overclock the 290 a lot on water. I can say that at stock I'm being held back quite a bit. Unfortunately, 4.6GHz is the max clock my chip can do.


----------



## Niberius

Quote:


> Originally Posted by *Matt-Matt*
> 
> Ah yep, fair enough.
> 
> I'm not sure about AMD's roadmap for new top end chips.
> 
> I'd also say that you want a pretty decent overclock to attain the max FPS you can get from it, but it should still be an improvement over a 7970.
> 
> To add mantle runs on 7xxx cards and a 7970 counts as that. The audio side of things isn't released yet so if that's what's pushing you over just wait to see how that pans out.
> Yeah, but in Australia cards usually cost a lot more. I understand $550 is meant to be the MSRP of it in the states but getting the card for that price here is decent; Especially when the rest of the cards are around $650+ easily if not more. I've seen 290x's sell on eBay for $550 used so I do think it's a really good deal for us Australians still despite it still not being below MSRP.
> 
> If you're keen Noviets; You could go for a used one off ebay? I found quite a few 290's on there for like $450 and under the other day.
> 
> 
> 
> 
> 
> 
> 
> 
> I'm stock at the moment on the 3570k too, I'm keen to see what I get when I'm watercooled again (literally counting down the days) and when I can overclock the 290 a lot. (on water). I can say that at stock i'm being held back quite a bit. 4.6 is the max clock my chip can do unfortunately however.


Oh, my bad, I didn't realize you were not in the States.


----------



## Roboyto

Quote:


> Originally Posted by *Noviets*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> $549 is cheap for a 290x? I might grab me one later down the road, as I want to get a waterblock and fittings for it at the same time, I'm enjoying my silent system at the moment, and putting a turbine in my case isn't on my to-do list lol
> 
> I wont be selling the 7970, I have a row of systems that I have in the office, as I upgrade my primary, my secondary gets the parts, which go into the tertiary etc.
> 
> What clock speed do you think the 8350 will need to use the 290x at full potential? And perhaps adding a second one?
> 
> I'm mostly looking forward to the TrueAudio and Mantle advantages on the card, as I really like the audio side of things, and I know this processor is going to be holding me back quite a bit.
> 
> 
> 
> Do we have any word on a new FX line, or a top end chip from AMD this year? What happened to Steamroller?


http://www.macrumors.com/2012/08/01/key-apple-chip-designer-jim-keller-returns-to-amd/

This could mean good things for AMD. Could also be why there hasn't been much information regarding a new chip anytime recently.

I'm putting my money on something completely new from AMD. FX chips were a fairly big let down all things considered, especially considering heat/power consumption.


----------



## Roboyto

Quote:


> Originally Posted by *Matt-Matt*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Ah yep, fair enough.
> 
> I'm not sure about AMD's roadmap for new top end chips.
> 
> I'd also say that you want a pretty decent overclock to attain the max FPS you can get from it, but it should still be an improvement over a 7970.
> 
> To add mantle runs on 7xxx cards and a 7970 counts as that. The audio side of things isn't released yet so if that's what's pushing you over just wait to see how that pans out.
> Yeah, but in Australia cards usually cost a lot more. I understand $550 is meant to be the MSRP of it in the states but getting the card for that price here is decent; Especially when the rest of the cards are around $650+ easily if not more. I've seen 290x's sell on eBay for $550 used so I do think it's a really good deal for us Australians still despite it still not being below MSRP.
> 
> 
> 
> If you're keen Noviets; You could go for a used one off ebay? I found quite a few 290's on there for like $450 and under the other day.
> 
> 
> 
> 
> 
> 
> 
> 
> I'm stock at the moment on the 3570k too, I'm keen to see what I get when I'm watercooled again (literally counting down the days) and *when I can overclock the 290 a lot*. (on water). I can say that at stock i'm being held back quite a bit. 4.6 is the max clock my chip can do unfortunately however.


The reference blower can still show you what your card is capable of, granted you have to force the fan to 'leaf blower' mode...but it will cool the chip/VRMs so the card doesn't throttle. I was able to push the core on my XFX Black Edition to 1200+ with the stock blower fan set at 75%. After going under water I was only able to squeeze roughly another 40-80 MHz, depending on benchmark, out of the core; memory clocks were not affected. The biggest improvement is undoubtedly noise, and after that temperatures. Either you will get a good clocking chip, or you won't...that's the silicon lottery.

Check out my build log if you want, there is a spreadsheet with all of my overclocking for benchmarking listing clocks, voltages, scores, etc.


----------



## TheGoat Eater

I have the MSI 290X Lightning and damn, the card is amazingly built, and the packaging was the most insanely well-done job I have ever seen. I am still tweaking it, and I love it.


----------



## Imprezzion

Ok, I ordered some Fujipoly Ultra Extreme from FrozenCPU (shipping was $8 and takes 15-30 days, but ok) to tame my VRM temps.

Using a thin spread of regular non-conductive TIM (PK-1 in this case) on both sides of the thermal pad improved VRM temps by 5-8°C, but it's still not quite good.


----------



## Roboyto

Quote:


> Originally Posted by *Imprezzion*
> 
> Ok, I ordered some Fujipoly Ultra Extreme from FrozenCPU (shipping was $8 and takes 15-30 days but k) to tame my VRM temps.
> 
> Using a thin spread of regular non-conductive TIM (PK-1 in this case) on both sides of the thermal pad improved VRM temps by 5-8c but still not qutie good..


Adding the TIM would have hurt you if you were using the Ultra Extreme. The reason EK suggests using TIM with their included pads is that their thermal conductance isn't that high.

Make sure you used the correct thickness pads and then I would retry without the TIM. Several people have reported significant temperature drops in my thread for the upgraded pads.

What block are you using BTW? Aquacomputer VRM temps were already very good, so you may not have seen the decrease that XSPC and EK users have seen.


----------



## cephelix

Hi guys,
Just finished with my overclocks, but I have one problem with 3DMark: I seem to be getting lower scores after overclocking.

OC: Core 1060, Memory 1500
Voltage in Afterburner is locked
Used DDU to remove old drivers before installing 13.12 WHQL.

Got an increase in Valley on Ultra (Vsync checked, 8x AA) from 34.0 to 36.1.

Temps are all in check: core is in the high 60s Celsius, VRM1 is 59°C, VRM2 60°C.

Any idea why?


----------



## Imprezzion

Quote:


> Originally Posted by *Roboyto*
> 
> Adding the TIM have hurt you if you were using the Ultra Extreme. The reason why EK suggests using TIM with their included pads is because the thermal conductance isn't that high.
> 
> Make sure you used the correct thickness pads and then I would retry without the TIM. Several people have reported significant temperature drops in my thread for the upgraded pads.


No, I wasn't using them yet because I don't have them yet. It can take up to a month for them to arrive. USA to Holland is pretty far.









I am using the stock Tri-X thermal pads, which are just as bad as the EK ones (hitting 87°C on VRM1 @ +100mV with 65% fan speed - now after TIM it hits 76°C max).


----------



## Roboyto

Quote:


> Originally Posted by *Imprezzion*
> 
> No I wasn't using them yet casue I don't have them yet. Can take up to a month for them to arrive. USA - Holland is pretty far
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am using the stock Tri-X thermal pads which are just as bad as the EK ones (hitting 87c VRM1 @ +100mV with 65% fanspeed - now after TIM it hits 76c max).


Ahhh, you just added TIM. I understand now







lol

Well, that is a decent temperature drop essentially for free.


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> Hi Guys,
> just finished with my overclocks but I have one problem with 3dmark.
> I seem to be getting lower scores in 3dmark after overclocking.
> 
> OC: Core 1060, Memory 1500
> Voltage in afterburner is locked
> Used DDU to remove old drivers before installing 13.12 WHQL.
> 
> Got an increase in Valley on Ultra, Vsync checked and 8x AA from 34.0 to 36.1
> 
> Temps are all in check, Core is high 60 degrees celcius, VRM1 is 59 deg, VRM2 60 deg
> 
> Any idea why?


I usually don't run Vsync personally, but have you been playing around with enabling/disabling tessellation, perchance?

If the score has dropped then I would make sure the card is running full clocks throughout the bench. Maybe try a little extra voltage.

Your RAM clock is high, which is nice, but pushing it too far sometimes caused issues for me. Maybe dial it back a little, say 25MHz, and see what happens.


----------



## Imprezzion

Quote:


> Originally Posted by *Roboyto*
> 
> Ahhh, you just added TIM. I understand now
> 
> 
> 
> 
> 
> 
> 
> lol
> 
> Well, that is a decent temperature drop essentially for free.


True... I feel like the issue here was that I had the cooler off several times already because I was mounting my backplate, repasting it and so on. This made some rips in the stock TIM pad, which isn't all that thick to begin with.

Temps are decent now, but as the core is only at ~66°C load I'd like to push to about +160mV so I can stabilize 1200MHz. The only thing keeping me from it is VRM temps.


----------



## cephelix

Quote:


> Originally Posted by *Roboyto*
> 
> I usually don't run Vsync personally, but have you been playing around with enabling/disabling tesselation perchance?
> 
> If the score has dropped then I would make sure the card is running full clocks throughout the bench. Maybe try a little extra voltage.
> 
> Your RAM clock is high, which is nice, but pushing it too far for me sometimes caused issues. Maybe dial it back a little, maybe 25MHz and see what happens.


Can't change voltage in Afterburner... does Trixx allow for voltage changes?
Dialed memory back to 1400; now I can't even run 3DMark without artifacts.


----------



## JordanTr

As long as your VRMs don't go above 100°C you should be fine with your OC.


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> Can't change voltage in afterburner...is there does trixxx allow for voltage change?
> dialed memory back to 1400, now i can't even run 3d mark without artifacts.


Try Trixx... I don't think any of the R9 290(X) cards are voltage locked.

In the options for Trixx make sure you enable 'Force Constant Voltage' and disable ULPS. I had issues with AB with my card right out of the gate, and have been using Trixx with great success since. I got a beast card, and Trixx has been able to push it to 1300/1700.


----------



## cephelix

Quote:


> Originally Posted by *Roboyto*
> 
> Try trixx...I don't think any of the R9 290(X) cards are voltage locked.
> 
> In the options for Trixx make sure you enable 'Force Constant Voltage' and disable ULPS. I had issues with AB with my card right out of the gate, have been using Trixx with great success since. I got beast card, and Trixx has been able to push it to 1300/1700


Very nice OC... will try tomorrow then. Too late/early to do it now, depending on how you look at it. It's 3am here...
Another question though before I go: how large of an increase to voltage do I apply at each step?


----------



## Roboyto

Quote:



> Originally Posted by *cephelix*
> 
> Very nice OC...will try tmrw then.too late/early to do it now depending on how you look at it.It's 3am here.....
> Another qn though before i go, how large of an increase to voltage do i apply at each step?


Thank You







I did spend a substantial amount of time benching, but the fact is undeniable that I got a winner in the silicon lottery.

If you look in my build log, I have a spreadsheet showing each voltage increase with what clocks I was running, and I usually noted what the issue was. I personally only make small jumps in voltage, 8-10mV, each time the card starts to act up. I then push as far as I can with that voltage and the process starts all over. Memory clock is nice, but for these cards core is definitely king; I would be more concerned with your core clock first. In that spreadsheet there is a tab showing the performance gains from running stock RAM clocks to my VERY high 1700MHz OC, and depending on what you're doing the gains are not that substantial.

I like to make the small jumps in voltage because one of the combinations of voltages/clocks you find will end up being the 'sweet spot' between performance, voltages, temperatures and clock speeds.

I finally settled with 1200/1500 +87mV offset as it gives me 90-95% of the performance I was getting from ~1300/1700 +200mV. The card was fully functional at, or very close to, 1300 core but the additional heat and abuse to the card were not worth the small performance gains it was giving me.

*EDIT* One thing I just remembered with Trixx voltage settings. It doesn't always apply exactly what you move the slider to. For example you will move the slider to say +70mV, but when you apply it may move up or down a couple mV, say to 68 or 73. I don't know why it does this, but I have heard others mention it. Just bump a couple extra mV if you need to get the voltage increase desired.
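The stepwise method described above can be sketched as a simple loop. This is purely a toy model: `is_stable()` is a hypothetical stand-in for a real stability test (a Valley or 3DMark run while watching for artifacts or crashes), and all the numbers are made up for illustration.

```python
def is_stable(core_mhz, offset_mv):
    # Hypothetical stand-in for a real benchmark run. This toy model
    # pretends the card gains ~1MHz of headroom per extra mV of offset
    # over a 1000MHz baseline -- real cards obviously don't behave this neatly.
    return core_mhz <= 1000 + offset_mv

def find_max_clock(start_core=947, max_offset_mv=100, step_mv=10, clock_step=10):
    # Push the core until it fails, then bump voltage ~10mV and retry:
    # the small-jump search loop described above.
    core, offset = start_core, 0
    best = (core, offset)
    while offset <= max_offset_mv:
        while is_stable(core + clock_step, offset):
            core += clock_step
        best = (core, offset)  # highest clock that passed at this voltage
        offset += step_mv
    return best

print(find_max_clock())  # toy model peaks at (1097, 100)
```

In practice you stop well short of the maximum, since the last 100MHz usually costs a disproportionate amount of voltage and heat.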


----------



## Imprezzion

The controller works in steps. Those steps aren't 1mV but, for example, 7mV, so it will adjust to those steps.
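In other words, the requested offset gets snapped to the controller's step grid. A rough illustration, using the 7mV figure above as the step size (the real step size, base voltage, and rounding behavior depend on the specific voltage controller):

```python
def snap_to_step(requested_mv, step_mv=7):
    # Round a requested voltage offset to the nearest multiple of the
    # controller's step size. With a 7mV step, asking for +72mV would
    # actually apply +70mV, while +74mV would land on +77mV.
    return round(requested_mv / step_mv) * step_mv

print(snap_to_step(72), snap_to_step(74))  # 70 77
```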


----------



## Roboyto

Quote:


> Originally Posted by *Imprezzion*
> 
> The controller works in steps. Those steps aren't 1mV but for example 7mV. So it will adjust to those steps


Ahh, I see. I never noticed that on other cards I have overclocked. Some even values took, but not all of them. Good to know


----------



## devilhead

Quote:


> Originally Posted by *Roboyto*
> 
> Try trixx...I don't think any of the R9 290(X) cards are voltage locked.
> 
> In the options for Trixx make sure you enable 'Force Constant Voltage' and disable ULPS. I had issues with AB with my card right out of the gate, have been using Trixx with great success since. I got beast card, and Trixx has been able to push it to 1300/1700


I have 2x 290X; one can do 1300/1730 with +200, and the other 1330/1690 with the same +200 on Trixx.







And do you have a 290 or a 290X?


----------



## Roboyto

Quote:


> Originally Posted by *devilhead*
> 
> I have 2x 290X, one can do 1300/1730 with +200, and other 1330/1690 with same +200 on trixx
> 
> 
> 
> 
> 
> 
> 
> and do you have 290 or 290X?


Reference XFX 290 Black Edition


----------



## chalkypink

I really loved ab but since the new version came out I started having some issues (mainly with setting voltage over +100mv which I need to get a decent OC on my card) so I uninstalled it and switched to trixx (did the whole ddu uninstall of catalyst and everything too) since trixx has native support for up to +200mv. I started with the settings that I had tested for an hour with OCCT error checking and found to be stable in the last build of ab which was 1170mhz/+200mv/1500mhz ddr (the core voltage would usually hover around 1.3). However, now I'm having this new problem...throttling. This wasn't an issue with ab beta 18 and catalyst beta 14.3 (don't remember the build but it was the latest one released in mid march). Whenever I try to run furmark or OCCT gpu, which I think uses an iteration of furmark, within 2 seconds of starting the test my core clock would drop to 700-800mhz and my voltage would drop to around 1.1-1.2 and fluctuate between there. Even with stock clocks and voltage there is a slight drop to around 970mhz/1.02v. It's NOT thermal throttling, unless that kicks in at 60c lol.

I'm able to get full clock speeds in game benchmarks, but I really like using OCCT to test for stability since it's just a bit more concrete than eyeballing for artifacts in game (and I can "set it and forget it" if it makes it past the first 5 minutes error free). I've read a few accounts of this throttling happening to 290(x) users with furmark (OCCT doesn't seem as popular so I couldn't find anything about that) but I've yet to find a solution. I've tried rolling back my drivers, I've switched back and forth between trixx and ab, nothing I seem to do changes. I was able to run OCCT by limiting the framerate but I'm concerned with whether that may skew the results.

Anyone know how to circumvent this power throttling?? I'm sort of at my wit's end here. I guess worst case I will have to find a different way to stability test, but I love OCCT!


----------



## Roboyto

Quote:


> Originally Posted by *chalkypink*
> 
> I really loved ab but since the new version came out I started having some issues (mainly with setting voltage over +100mv which I need to get a decent OC on my card) so I uninstalled it and switched to trixx (did the whole ddu uninstall of catalyst and everything too) since trixx has native support for up to +200mv. I started with the settings that I had tested for an hour with OCCT error checking and found to be stable in the last build of ab which was 1170mhz/+200mv/1500mhz ddr (the core voltage would usually hover around 1.3). However, now I'm having this new problem...throttling. This wasn't an issue with ab beta 18 and catalyst beta 14.3 (don't remember the build but it was the latest one released in mid march). Whenever I try to run furmark or OCCT gpu, which I think uses an iteration of furmark, within 2 seconds of starting the test my core clock would drop to 700-800mhz and my voltage would drop to around 1.1-1.2 and fluctuate between there. Even with stock clocks and voltage there is a slight drop to around 970mhz/1.02v. It's NOT thermal throttling, unless that kicks in at 60c lol.
> 
> I'm able to get full clock speeds in game benchmarks, but I really like using OCCT to test for stability since it's just a bit more concrete than eyeballing for artifacts in game (and I can "set it and forget it" if it makes it past the first 5 minutes error free). I've read a few accounts of this throttling happening to 290(x) users with furmark (OCCT doesn't seem as popular so I couldn't find anything about that) but I've yet to find a solution. I've tried rolling back my drivers, I've switched back and forth between trixx and ab, nothing I seem to do changes. I was able to run OCCT by limiting the framerate but I'm concerned with whether that may skew the results.
> 
> Anyone know how to circumvent this power throttling?? I'm sort of at my wits end here. I guess worst case I will have to find a different way to stability test but I love OCCT!


The card could be throttling so it doesn't blow up under Furmark-style loads. Furmark, and likely OCCT, is a poor test IMHO: it gives an unrealistic power draw and load to the card. If you want to test for stability, use 3DMark benches or Unigine, or both. I have grown fond of the FFXIV benchmark, as the game is quite brutal on GPUs. If you can get stability in that benchmark, I'd bet you are stable for just about anything else.

AB and Trixx apply voltage differently, if I'm not mistaken; can someone confirm or deny this? You would have to use GPU-Z or HWiNFO to see what your load voltages are with AB and compare them to what you see when running Trixx.

I have not tried any of the 14.X drivers since 14.1 came out and did not work well with anything I was running. I have since scrubbed and gone back to the rock solid 13.12 with no issues whatsoever.

My XFX 290 BE, with stock BIOS, peaks VDDC at 1.422V when applying +200mV offset in Trixx.


----------



## Durvelle27

New card arrived today


----------



## DrClaw

Nice clock speeds - didn't even change the voltage, holy cow. Did you run some benchmarks?


----------



## Durvelle27

Quote:


> Originally Posted by *DrClaw*
> 
> nice clock speeds, didnt even change the voltage holy cow, did you run some benchmarks?


Are you referring to me? Only 3DMark 11.


----------



## igrease

How is the MSI GAMING version of the 290? I plan on getting it since it is the cheapest non-reference version. I heard it doesn't OC as well though. Also about how much of a performance gain would I be expecting over my current 560 Ti?


----------



## rdr09

Quote:


> Originally Posted by *igrease*
> 
> How is the MSI GAMING version of the 290? I plan on getting it since it is the cheapest non-reference version. I heard it doesn't OC as well though. Also about how much of a performance gain would I be expecting over my current 560 Ti?


Prolly 3 of those 560 Tis.


----------



## Matt-Matt

Quote:


> Originally Posted by *Niberius*
> 
> Oh my bad, I didnt realize you were not in the states.


Haha, don't sweat it! Australia gets royally screwed over for pricing on most things.
Quote:


> Originally Posted by *Roboyto*
> 
> The reference blower can still show you what your card is capable of, granted you have to force the fan to 'leaf blower' mode...but it will cool the chip/VRMs so the card doesn't throttle. I was able to push the core on my XFX Black Edition to 1200+ with the stock blower fan set at 75%. After going under water I was only able to squeeze roughly another 40-80 MHz, depending on benchmark, out of the core; memory clocks were not affected. The biggest improvement is undoubtedly noise, and after that temperatures. Either you will get a good clocking chip, or you won't...that's the silicon lottery.
> 
> Check out my build log if you want, there is a spreadsheet with all of my overclocking for benchmarking listing clocks, voltages, scores, etc.


So you're saying I should just go for it now? Haha

Too bad the 14.2s make my idle stuff up (runs really hot) and the 13.12s make me black screen with Afterburner open. Not keen on trying the 14.3s yet.. lol

I'll see how it goes later I guess, I never thought of that though haha. Here's to hoping for a golden card!


----------



## Frontside

Quote:


> Originally Posted by *igrease*
> 
> How is the MSI GAMING version of the 290? I plan on getting it since it is the cheapest non-reference version. I heard it doesn't OC as well though. Also about how much of a performance gain would I be expecting over my current 560 Ti?


Just make sure to check if it has thermal pads on the VRMs; mine doesn't and heats up like crazy. MSI forgot to put some on.


----------



## cephelix

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> Thank You
> 
> 
> 
> 
> 
> 
> 
> I did spend a substantial amount of time benching, but the fact is undeniable that I got a winner in the silicon lottery.
> 
> If you look in my build log I have a spread sheet showing each voltage increase with what clocks I was running, and I usually noted what the issue was. I personally only make small jumps in voltage, 8-10 mV each time the card starts to act up. I then push as far as I can with that voltage and the process starts all over. Memory clock is nice, but for these cards core is definitely king; I would be more concerned with your core clock first. In that spreadsheet I mentioned there is a tab showing the performance gains from running stock RAM clocks to my VERY high 1700MHz OC, and depending on what you're doing the gains are not that substantial.
> 
> I like to make the small jumps in voltage because one of the combinations of voltages/clocks you find will end up being the 'sweet spot' between performance, voltages, temperatures and clock speeds.
> 
> I finally settled with 1200/1500 +87mV offset as it gives me 90-95% of the performance I was getting from ~1300/1700 +200mV. The card was fully functional at, or very close to 1300 on core, but the additional heat and abuse to the card were not worth the small performance gains it was giving me.
> 
> *EDIT* One thing I just remembered with Trixx voltage settings. It doesn't always apply exactly what you move the slider to. For example you will move the slider to say +70mV, but when you apply it may move up or down a couple mV, say to 68 or 73. I don't know why it does this, but I have heard others mention it. Just bump a couple extra mV if you need to get the voltage increase desired.






Thanks for that @Roboyto. This is my first time OC-ing anything seriously, and definitely the first time I'm playing with voltages, even though I've had my current setup for 3 years (except for the case and cooler). Hence I'm trying to take it nice and easy and being quite cautious. Trying to take notes on everything so I know if something goes wrong, and what my max stable OC would be so I can apply that. Will go through your build log and take a look at the spreadsheet.
Quote:


> Originally Posted by *Durvelle27*
> 
> Are you referring to me ? Only 3DMrk 11


Very nice OC there. Seems like you got a winner.
Quote:


> Originally Posted by *igrease*
> 
> How is the MSI GAMING version of the 290? I plan on getting it since it is the cheapest non-reference version. I heard it doesn't OC as well though. Also about how much of a performance gain would I be expecting over my current 560 Ti?


All I can say for now is that it's built solid and looks pretty... Currently in the process of OC-ing, so I can't comment on that.


----------



## Durvelle27

Quote:


> Originally Posted by *cephelix*
> 
> 
> Thanks for that @Roboyto. This is my first time OC-ing anything seriously and definitely the first time I'm playing with voltages even though I've had my current setup for 3 yrs except for the case and cooler. Hence i'm trying to take it nice and easy and quite cautious.. Trying to take notes on everything so I know if something goes wrong and what my max stable OC would be and apply that. Will go through you build log and take a look a the spreadsheet.
> Very nice OC there. Seems like you got a winner.
> All I can say for now is that it's built solid and looks pretty...Currently in the process of OC-in so I can't comment on that


Ehhhh, I don't think it's that good. The furthest I can get it to go is 1150/1350 at stock.


----------



## Roboyto

Quote:


> Originally Posted by *Matt-Matt*
> 
> Haha, don't sweat it! Australia gets royally screwed over for pricing on most things.
> So you're saying i should just go for it now? Haha
> 
> *Toobad the 14.2's make my idle stuff up (runs really hot) and 13.12's make me black screen with afterburner open. Not keen on trying 14.3's yet.. lol*
> 
> I'll see how it goes later I guess, I never thought of that though haha. Here to hoping for a golden card!


Make sure you scrub your drivers clean with Display Driver Uninstaller if you haven't been. I would rely on 13.12 to find your overclocks, anything 14.X is too finicky.


----------



## Roboyto

Quote:


> Originally Posted by *Durvelle27*
> 
> ehhhh i don't think its that good. Furthest i can get it to go is 1150/1350 at stock


If you're hitting 1150 on core with no changes in voltage, you have a VERY good card.


----------



## cephelix

Quote:


> Originally Posted by *Durvelle27*
> 
> ehhhh i don't think its that good. Furthest i can get it to go is 1150/1350 at stock


1150 at stock voltage is impressive. Best I could get mine at is 1070 before everything starts getting screwy
Quote:


> Originally Posted by *Roboyto*
> 
> If you're hitting 1150 on core with no changes in voltage, you have a VERY good card.


I second that!

So for the Trixx settings, I've checked "Force Constant Voltage" and "Disable ULPS" - that should be right, right?

And in the Overclocking tab, the VDDC offset increases by 1 every time I click on the "+". Should my increments then be in 1s, or 7-10 instead?


----------



## Roboyto

Quote:


> Originally Posted by *Frontside*
> 
> Just make sure to check if it has thermal pads on VRMs, mine doesn't and heats like a crazy. MSI forgot to put some


I would get some thermal pads in there ASAP; you are definitely risking burning the card up. If you can wait to order some, Fujipoly Ultra Extreme are the way to go. They gave me an over 20% drop in VRM1 temperature compared to the stock XSPC waterblock pads.

http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures

If you don't want to take it apart to do so, I would RMA immediately.

Quote:


> Originally Posted by *cephelix*
> 
> Thanks for that @Roboyto. This is my first time OC-ing anything seriously and definitely the first time I'm playing with voltages even though I've had my current setup for 3 yrs except for the case and cooler. Hence i'm trying to take it nice and easy and quite cautious.. Trying to take notes on everything so I know if something goes wrong and what my max stable OC would be and apply that. Will go through you build log and take a look a the spreadsheet.


You're welcome. Taking notes is a great thing; it helps you find trends in performance. If you're going to use a spreadsheet, save often in case you lock up or BSOD.









Quote:


> Originally Posted by *Matt-Matt*
> 
> So you're saying i should just go for it now? Haha


Absolutely, if you can tolerate the noise, run some benchmarks. Once you listen to it blasting hot air around, you kind of get used to the noise and forget about it... especially if you get a good overclocker and the scores keep improving.


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> 1150 at stock voltage is impressive. Best I could get mine at is 1070 before everything starts getting screwy
> I second that!


If it's really hitting 1150 with no extra voltage, that card needs to be under water ASAP! It may be a magical card.







1400MHz core?


----------



## Durvelle27

Quote:


> Originally Posted by *cephelix*
> 
> 1150 at stock voltage is impressive. Best I could get mine at is 1070 before everything starts getting screwy
> I second that!
> 
> So for Trixx settings, I've checked "Force constant voltage" and "Disable ULPS" that should be right right?
> 
> and in the Overclocking tab, the VDDC offset increases by 1 every time i click on the "+". should my increments then be in 1's or 7-10 instead?


I'm still testing it, though, to make sure it's stable in all the games I play. So far it's holding up.


----------



## Roboyto

Quote:


> Originally Posted by *igrease*
> 
> How is the MSI GAMING version of the 290? I plan on getting it since it is the cheapest non-reference version. I heard it doesn't OC as well though. Also about how much of a performance gain would I be expecting over my current 560 Ti?


Quote:


> Originally Posted by *rdr09*
> 
> prolly 3 of those 560Tis.


What he said


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> 1150 at stock voltage is impressive. Best I could get mine at is 1070 before everything starts getting screwy
> I second that!
> 
> *So for Trixx settings, I've checked "Force constant voltage" and "Disable ULPS" that should be right right?*
> 
> and in the Overclocking tab, the VDDC offset increases by 1 every time i click on the "+". should my increments then be in 1's or 7-10 instead?


Yes, that is correct.

Each time you click the voltage slider it will increase 1mV. When you start to have instability, artifacting, BSODs, etc., you will want to jump the voltage up around 10mV. You can certainly apply more if you'd like, though.

Not every card is guaranteed to be a winner. You may encounter a point where additional voltage gets you no more gains. I purchased 2 cards initially in a Newegg combo, and the first was underwhelming; it peaked at 1080/1350. It was 100% rock solid stable at those settings, but it wouldn't budge past those clocks.


----------



## cephelix

Quote:


> Originally Posted by *Roboyto*
> 
> You're welcome. Notes is a great thing, it helps you find trends for performance. If you're going to use a spreadsheet, save often in case you lockup or BSOD

I use the ol' pen and paper method... just in case of BSODs.

And 1400 on core? That's just crazy! But yeah, if it's a magical card, it's gotta go under water, and soon!

Noted about the voltage increase... will have to restart my testing.
Weird though: in Valley I can run my clocks at 1070 no problem, but in 3DMark I find I have to keep lowering them just to run.

@Durvelle27 That sounds good... now at 1070 I can't even run 3DMark...


----------



## Durvelle27

Quote:


> Originally Posted by *cephelix*
> 
> I use the ol' pen and paper method... just in case of BSODs.
> 
> And 1400 on core? That's just crazy! But yeah, if it's a magical card, it's gotta go under water, and soon!
> 
> Noted about the voltage increase... will have to restart my testing.
> Weird though: in Valley I can run my clocks at 1070 no problem, but in 3DMark I find I have to keep lowering them just to run.
> 
> @Durvelle27 That sounds good... now at 1070 I can't even run 3DMark...


I plan to put it under water eventually, but for now stock is good.


----------



## cephelix

@Durvelle27 Plan to put mine under water eventually too... but most likely it'll be a universal block and not a full cover.
But even at stock your clocks are impressive, just sayin'...









@Roboyto Just went through your Excel sheet - how do you increase your power limit in Trixx? I don't see an option for that.


----------



## Paul17041993

Quote:


> Originally Posted by *Durvelle27*
> 
> ehhhh i don't think its that good. Furthest i can get it to go is 1150/1350 at stock


Stock voltage? What's the ASIC score? I'm very curious what it actually says...


----------



## cephelix

Quote:


> Originally Posted by *Paul17041993*
> 
> stock voltage? whats the ASIC score? I'm very curious what it actually says...


Noob question, what's ASIC?


----------



## IBIubbleTea

How do I disable GPU Hardware acceleration? Getting blue screens because of it. :/ I'm using Windows 8.1 and latest version of chrome.


----------



## rdr09

Quote:


> Originally Posted by *IBIubbleTea*
> 
> How do I disable GPU Hardware acceleration? Getting blue screens because of it. :/ I'm using Windows 8.1 and latest version of chrome.


When watching a video, like on YouTube? Right-click the video > Settings > disable HA.


----------



## Durvelle27

Quote:


> Originally Posted by *cephelix*
> 
> @Durvelle27 plan to put mine underwater eventually too...but most likely it'll be a universal block and not a full cover.
> but even on stock your clocks are impressive, just sayin...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @Roboyto Just went through your excel, how do you increase your power limit in Trixx? I dont see an option for that


I'll be going full cover. Hate the look of universals
Quote:


> Originally Posted by *Paul17041993*
> 
> stock voltage? whats the ASIC score? I'm very curious what it actually says...


----------



## cephelix

Quote:


> Originally Posted by *Durvelle27*
> 
> I'll be going full cover. Hate the look of universals


It isn't ideal, I know, but the current full-cover blocks don't fit my MSI 290 Gaming


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> @Durvelle27 plan to put mine underwater eventually too...but most likely it'll be a universal block and not a full cover.
> but even on stock your clocks are impressive, just sayin...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @Roboyto Just went through your excel, how do you increase your power limit in Trixx? I dont see an option for that


There is a scroll bar, just have to move it down a little. I would crank that baby to 50% and leave it there


----------



## IBIubbleTea

Quote:


> Originally Posted by *rdr09*
> 
> when watching a video like You Tube? rightclick the video>settings>disable HA.


When I right-click the video I don't see Settings. All I see is... Copy video URL, Copy video URL at current time, Copy embed code, Report playback issue, Copy debug info, Stats for nerds, About HTML5 player.


----------



## Durvelle27

Quote:


> Originally Posted by *cephelix*
> 
> It isn't ideal I know, but the current fullcovers dont fit my msi 290 gaming


Reference board FTW lol


----------



## cephelix

Ah, thanks, didn't see the scrollbar.
It's now set to 50%.
I've read up on it but still don't understand the difference between the power limit and voltage.


----------



## Matt-Matt

Quote:


> Originally Posted by *Roboyto*
> 
> Make sure you scrub your drivers clean with Display Driver Uninstaller if you haven't been. I would rely on 13.12 to find your overclocks, anything 14.X is too finicky.


Yeah, the 14.x's are a _female dog.._ I'll have a look later, but opening Afterburner makes my second monitor crash after like 5 minutes.. Then the system locks up.

I've also had a really bad experience with driver sweepers back in the days when I owned 68xx cards. (I had three)
Quote:


> Originally Posted by *Roboyto*
> 
> I would get some thermal pads in there ASAP, you are definitely risking burning the card up. If you can wait to order some, Fujipoly Ultra Extreme are the way to go. They gave me over a 20% drop in temperature for VRM1 compared to the XSPC stock pads for the waterblock.
> 
> http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures
> 
> If you don't want to take it apart to do so, I would RMA immediately.
> You're welcome. Notes is a great thing, it helps you find trends for performance. If you're going to use a spreadsheet, save often in case you lockup or BSOD
> 
> 
> 
> 
> 
> 
> 
> 
> Absolutely if you can tolerate the noise to run some benchmarks. Once you listen to it blasting noise and hot air around, you get kind of used to the noise and forget about it...especially if you get a good overclocker and the scores keep improving


Haha, I'll have a look later!
I'm going to try a crossfire setup with reference cards this week too! My housemate thought his ASUS 7970 DCII with an EK block died.. So he impulse bought a R9 290 reference off eBay for $430 shipped! (that's a steal really).

It's a shame though, as we don't think the 7970 is dead anymore and it was golden (able to do 1300/1700 at 1.27v). We were going to try and push 1.3 to see if we could break 1300 haha.


----------



## cephelix

Quote:


> Originally Posted by *Durvelle27*
> 
> Reference board FTW lol


Yeah, my comp guy doesn't carry stock boards...lol....


----------



## Roboyto

Quote:


> Originally Posted by *Matt-Matt*
> 
> Yeah the 14.x's are a female dog.. I'll have a look later, but opening afterburner makes my second monitor crash after like 5 minutes.. Then the system locks up.
> Haha, I'll have a look later!
> I'm going to try a crossfire setup with reference cards this week too! My housemate thought his ASUS 7970 DCII with an EK block died.. So he impulse bought a R9 290 reference off eBay for $430 shipped! (that's a steal really).
> 
> It's a shame though, as we don't think the 7970 is dead anymore and it was golden (able to do 1300/1700 at 1.27v). We were going to try and push 1.3 to see if we could break 1300 haha.


Couldn't hurt to try running Trixx instead of AB.

That is a beast clock for a 7970.

Quote:


> Originally Posted by *cephelix*
> 
> Ah, thanks,didnt see the scrollbar.
> Now already set to 50%.
> Read up on it but still dont understand the difference between the power limit and voltage


Raising the power limit increases the amperage the card is allowed to draw, which allows more wattage before it throttles.
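To illustrate the distinction in rough terms (a sketch only; the 208 W board-power figure and the voltage/current numbers below are assumed for illustration, not specs for any particular card):

```python
# Illustrative sketch of why "power limit" and "voltage" are different knobs.
# NOMINAL_BOARD_POWER_W and the example voltage/current are assumptions.

NOMINAL_BOARD_POWER_W = 208.0  # assumed stock power cap for a 290-class card

def power_cap(limit_percent: float) -> float:
    """Raising the power limit scales the wattage the card may draw
    before it throttles clocks; it does not change voltage."""
    return NOMINAL_BOARD_POWER_W * (1 + limit_percent / 100)

def power_draw(voltage_v: float, current_a: float) -> float:
    """Electrical power is P = V * I; more voltage (or the extra current
    pulled at higher clocks) means more watts against the cap."""
    return voltage_v * current_a

cap = power_cap(50)           # +50% power limit -> 312 W cap
draw = power_draw(1.25, 180)  # e.g. 1.25 V at 180 A under load -> 225 W
print(f"cap={cap:.0f} W, draw={draw:.0f} W, throttling={draw > cap}")
```

In other words, the voltage knob changes how many watts the card wants; the power limit changes how many it is allowed.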

Quote:


> Originally Posted by *cephelix*
> 
> Yeah, my comp guy doesnt carry stock boards...lol....


I would send an e-mail to MSI, and some of the waterblock manufacturers to see if any of them plan on making a block for it...couldn't hurt. From my experience, most of the water cooling companies respond within a day or 2 at most.


----------



## Durvelle27

Quote:


> Originally Posted by *Matt-Matt*
> 
> Yeah the 14.x's are a _female dog.._ I'll have a look later, but opening afterburner makes my second monitor crash after like 5 minutes.. Then the system locks up.
> Haha, I'll have a look later!
> I'm going to try a crossfire setup with reference cards this week too! My housemate thought his ASUS 7970 DCII with an EK block died.. So he impulse bought a R9 290 reference off eBay for $430 shipped! (that's a steal really).
> 
> It's a shame though, as we don't think the 7970 is dead anymore and it was golden (able to do 1300/1700 at 1.27v). We were going to try and push 1.3 to see if we could break 1300 haha.


Not bad at all

My old XFX HD 7970 (reference) was only able to do 1290/1850 @1.35v
Quote:


> Originally Posted by *cephelix*
> 
> Yeah, my comp guy doesnt carry stock boards...lol....


Nahhhh that sucks


----------



## Iniura

Quote:


> Originally Posted by *igrease*
> 
> How is the MSI GAMING version of the 290? I plan on getting it since it is the cheapest non-reference version. I heard it doesn't OC as well though. Also about how much of a performance gain would I be expecting over my current 560 Ti?


*3DMark 11 Performance on Air*

MSI GTX 560 Ti Twin Frozr II - OC Core 950/ Memory 1145
Graphics Score: 5120



XFX R9 290 Black OC Edition - OC Core 1200/ Memory 1450
Graphics Score: 16467



*3DMark Firestrike Extreme on Air*

MSI GTX 560 Ti Twin Frozr II - OC Core 950/ Memory 1145
Graphics Score: 804



XFX R9 290 Black OC Edition - OC Core 1200/ Memory 1450
Graphics Score: 5862



*3DMark Firestrike on Air*

XFX R9 290 Black OC Edition - OC Core 1180/ Memory 1450
Graphics Score: 12242


----------



## cephelix

Quote:


> Originally Posted by *Durvelle27*
> 
> Not bad at all
> 
> My old XFX HD 7970 (Reference) only was able to do 1290/1850 @1.35v
> Nahhhh that sucks


That's how it is here, quite difficult to purchase reference cards. Most would go for ones with aftermarket coolers


----------



## rdr09

@Iniura, I just ran my 290 at 1200/1500 and got a graphics score of 17300 in 3DMark 11. Yours is about 1000 points short.

here . . .

http://www.3dmark.com/3dm11/8174572


----------



## bbond007

Quote:


> Originally Posted by *cephelix*
> 
> It isn't ideal I know, but the current fullcovers dont fit my msi 290 gaming


http://www.coolingconfigurator.com/step1_complist?gpu_gpus=1286
http://www.coolingconfigurator.com/step1_complist?gpu_gpus=1285

"Coming Soon"


----------



## cephelix

Quote:


> Originally Posted by *bbond007*
> 
> http://www.coolingconfigurator.com/step1_complist?gpu_gpus=1286
> 
> "Coming Soon"


Woohoo!!....my card has hope then...


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> Woohoo!!....my card has hope then...


Good news.


----------



## cephelix

Good news indeed...
Well, putting off OC'ing my card till next week; I'm tired after spending 4 days OC'ing my CPU and GPU...
Still wondering why my 3DMark score is lower OC'd than at stock, though it may be my CPU; the CPU scores get lower as I OC my 290.
Does that make sense?


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> good news indeed...
> well, putting off OC-ing my card till next week. tired spending 4 days of OC-ing my cpu and gpu...
> Still wondering why my 3dmark score is lower at oc than stock though but it may be my cpu, the scores for cpu get lower as i oc my 290.
> does that make sense?


Your i5 could be showing its age with the beastly GPU. Not really certain why your CPU score would drop, that is odd. Your 750W PSU should be more than enough to power your system. Is the CPU holding clocks during the benches?


----------



## cephelix

Haven't checked. How do I check that my CPU is holding its clocks?
It is stable though....


----------



## Matt-Matt

Quote:


> Originally Posted by *Durvelle27*
> 
> Not bad at all
> 
> My old XFX HD 7970 (Reference) only was able to do 1290/1850 @1.35v
> Nahhhh that sucks


Haha, your card is decent too! Of all my 7950's (I had 3), the top one was around 1200MHz.
Quote:


> Originally Posted by *Roboyto*
> 
> Couldn't hurt to try running Trixx instead of AB.
> 
> That is beast clock for a 7970.


Haha, shame it wasn't mine.. Too bad he only uses it for Skyrim too..

Trixx.. Haha maybe, I'm a MSI Afterburner fanboy however


----------



## Roboyto

Quote:


> Originally Posted by *Matt-Matt*
> 
> Trixx.. Haha maybe, I'm a MSI Afterburner fanboy however


I'm a fan of whatever works :-D

Plus you're not the only person I've heard of having issues with AB.

Can't hurt to give it a shot. All you've got to lose is your instability issues


----------



## IBIubbleTea

Anyone know how to disable GPU hardware acceleration on YouTube? When I right-click a video I don't see Settings. All I see is Copy video URL, Copy video URL at current time, Copy embed code, Report playback issue, Copy debug info, Stats for nerds, About HTML5 player.


----------



## Paul17041993

Quote:


> Originally Posted by *IBIubbleTea*
> 
> Anyone know how to disable gpu hardware acceleration on YouTube? When I right click a video I don't see settings. All I see is Copy Video URL, Copy Vidfeo URl at current time, Copy embed code, Report play back issue, copy debug info, stats for nerds, about HTML5 player.


That'd be HTML5; you're not using Flash for that video. Keep trying different videos until "Settings" appears; it should also say "About Adobe Flash Player" instead of HTML5.


----------



## IBIubbleTea

Quote:


> Originally Posted by *Paul17041993*
> 
> that'd be HTML5, you're not using flash for that video, keep trying different videos until "settings" appears, it should also say "about adobe flash player" instead of html5.


Now I don't know anymore. Depending on the YouTube video, it will give me different options... dafuq?

Edit: I read the post too quickly.. lol


----------



## Forceman

Quote:


> Originally Posted by *cephelix*
> 
> haven't checked, how do i check tht my cpu is holding it's clocks?
> It is stable though....


If you run Afterburner, you can detach the monitoring window, then check it after you finish a game to see if the clock speed was stable (the detached window is bigger). Or you can use something like HWiNFO or GPU-Z to log your clock speeds while you play, then check the logs.
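A minimal sketch of the log-checking idea, assuming a simplified two-column CSV; real HWiNFO or GPU-Z logs use different headers and many more columns, so the `core_mhz` column name here is a placeholder:

```python
import csv
import io

# Hypothetical two-column log ("time,core_mhz"); adjust the column name
# to whatever your monitoring tool actually writes.
SAMPLE_LOG = """time,core_mhz
0,1200
1,1200
2,1180
3,1200
"""

def clock_held(log_text: str, target_mhz: int, tolerance_mhz: int = 10) -> bool:
    """Return True if the logged core clock never dipped more than
    tolerance_mhz below the target (i.e. no throttling in the log)."""
    clocks = [int(row["core_mhz"]) for row in csv.DictReader(io.StringIO(log_text))]
    worst = min(clocks)
    print(f"min={worst} MHz, max={max(clocks)} MHz")
    return target_mhz - worst <= tolerance_mhz

print(clock_held(SAMPLE_LOG, 1200))  # the 1180 sample is a 20 MHz dip -> False
```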


----------



## cephelix

@Forceman thanx for that...will try


----------



## Paul17041993

Quote:


> Originally Posted by *IBIubbleTea*
> 
> Now I don't know anymore. Depending on the YouTube video, it will give me different options... dafuq?
> 
> Edit: I read the post to quickly.. lol


blame google


----------



## JordanTr

I've been wondering which full-cover waterblock gives the best VRM temps out of the box? For the core they all seem to be similar; it varies by 2-5 degrees. I'm in the UK and don't want to pay more for Fujipoly shipping than the pads themselves cost, as a matter of principle, and it's much better to get everything prepared. Another question: does a backplate help the VRMs as well? Is the difference noticeable?


----------



## Hattifnatten

Aquacomputer has an active backplate, which helped quite a bit IIRC.


----------



## Matt-Matt

Quote:


> Originally Posted by *Hattifnatten*
> 
> Aquacomputer has an active backplate, which helped quite a bit IIRCC.


The regular backplate was within 2C of the "Aktiv", but everywhere I've looked it's OOS.. So I cancelled my order and I'm going with the Koolance one instead (it works out to be $120 or so cheaper for me too)


----------



## Iniura

Quote:


> Originally Posted by *rdr09*
> 
> @Iniura, i just ran my 290 at 1200/1500 and got a graphics score of 17300 in 3DMark11. yours is about 1000 points short.
> 
> here . . .
> 
> http://www.3dmark.com/3dm11/8174572


Hey ty, hmm that's strange, you've got a slightly higher memory OC and used Win 7, I am on 8 but that wouldn't make such a difference, what is the brand of your R9 290? And what bios are you running? Yours isn't unlocked to a R9 290X right? I want to get to the bottom of this. Here's mine http://www.3dmark.com/3dm11/7721810 maybe you see something odd? Also I am not 100% sure I was running 1200 core, it's what 3dmark reports tho. I've been waiting for over a month on my reservoir to be delivered so unfortunately I don't have access to my pc right now.







I want to test!!

Edit: I guess it is because you're on Windows 7; 3DMark performance seems to favor Windows 7, all the world records are set on Win 7.


----------



## rdr09

Quote:


> Originally Posted by *Iniura*
> 
> Hey ty, hmm that's strange, you've got a slightly higher memory OC and used Win 7, I am on 8 but that wouldn't make such a difference, what is the brand of your R9 290? And what bios are you running? Yours isn't unlocked to a R9 290X right? I want to get to the bottom of this. Here's mine http://www.3dmark.com/3dm11/7721810 maybe you see something odd? Also I am not 100% sure I was running 1200 core, it's what 3dmark reports tho. I've been waiting for over a month on my reservoir to be delivered so unfortunately I don't have access to my pc right now.
> 
> 
> 
> 
> 
> 
> 
> I want to test!!
> 
> Edit: I guess it is because your on windows 7, 3dmark performance seems to favor Windows 7, all the world records are set on win 7.


What driver did you use in that run?

I use 13.11. Did you raise your Power Limit all the way to 50%?


----------



## JordanTr

So Aquacomputer + backplate is the best choice if I don't want to buy thermal pads separately?


----------



## Iniura

Quote:


> Originally Posted by *rdr09*
> 
> what driver did you use in that run?
> 
> i use 13.11. did you raise your Power Limit all the way to 50%?


13.12, yes power limit was +50, GPU wasn't throttling.


----------



## rdr09

Quote:


> Originally Posted by *Iniura*
> 
> 13.12, yes power limit was +50, GPU wasn't throttling.


Do you have some power-saving features turned on in your motherboard? Is Windows performance set to High?

here is 1200/1450 . . .

http://www.3dmark.com/3dm11/8175989

I turn off MSE whenever I run a bench.

Edit: Mine is a reference 290, unlockable, using the stock BIOS with Elpida VRAM. Not sure what brand you have, but there are findings on here that some PowerColor 290s with Hynix VRAM are underperforming.


----------



## Durvelle27

Quote:


> Originally Posted by *cephelix*
> 
> That's how it is here, quite difficult to purchase reference cards. Most would go for ones with aftermarket coolers


Idk maybe I'm just a reference junky lol.

7970 was reference
R9 290
GTX 780
R9 290X

Etc lol

Quote:


> Originally Posted by *Matt-Matt*
> 
> Haha, your card is decent too! All of my 7950's (I had 3) the top one was around 1200MHz.
> Haha shame it wasn't mine.. Toobad he only uses it for skyrim too..
> 
> Trixx.. Haha maybe, I'm a MSI Afterburner fanboy however


So 10MHz more lol


----------



## Iniura

Quote:


> Originally Posted by *rdr09*
> 
> you have some power saving features turned on in your motherboard? is windows performance set to high?
> 
> here is 1200/1450 . . .
> 
> http://www.3dmark.com/3dm11/8175989
> 
> i turn off mse whenever i run a bench.


I was running the CPU without an OC, and stock settings on the mobo; I am not sure which settings were set.
What power-saving setting could be the issue? I think Windows performance was set to High, but again I am not 100% sure; I don't own the mobo anymore and have also reinstalled the OS since then on a new mobo. It also sucks that I don't have access to my PC at the moment. I hope to test some of the things you mentioned soon though.

What is MSE? I'm going to look at your 3DMark verification with the same settings soon, as I am on mobile now.


----------



## Durvelle27

Quote:


> Originally Posted by *Iniura*
> 
> I was running the CPU without OC, and stock settings of the mobo, I am not sure of the settings which where set on the mobo.
> What powersaving setting could be the issue? I think windows performance was set to high yes, but again I am not 100% sure, I don't own the mobo anymore and also reinstalled OS since then on a new mobo, also It sucks that I dont have access to my pc at the moment, I hope I test some of the things you mentioned soon tho.
> 
> What is mse? I'm going to look your 3dmark verification with the same settings soon as i am on mobile now.


Here's my Sapphire R9 290 @1215/1450

http://www.3dmark.com/3dm11/7626071


----------



## cephelix

that is an awesome core clock!


----------



## Roboyto

Quote:


> Originally Posted by *JordanTr*
> 
> So aquacomputer + backplate is the best choice if dont want to buy thermal pads separately?


Yes, but you can get the same results with another manufacturer's block and Fujipoly Ultra Extreme. When I upgraded to the FUE (see my sig), it brought the VRM1 temperature down to the same temperature as my core. Since I have tweaked my OC, reducing the offset voltage to +87 for 1200/1500, VRM1 is now usually 1-2C under the core temperature.

I contacted XSPC regarding their backplate, and they said to use some thermal pad on the backside of VRM1 to make contact with the backplate to assist in cooling. I have not yet tried this to see the temperature results, but plan to do so in the near future.


----------



## Durvelle27

Quote:


> Originally Posted by *cephelix*
> 
> that is an awesome core clock!


the 290 also wasn't on stock cooling


----------



## rdr09

Quote:


> Originally Posted by *Roboyto*
> 
> Yes, but you can get same results with another manufacturers block and Fujipoly Ultra Extreme. When I upgraded to the FUE, see my sig, it put the VRM1 temperature to the same temperature as my core. Since I have tweaked my OC reducing offset voltage to +87 for 1200/1500, VRM1 is now usually 1-2C under core temperature.
> 
> I contacted XSPC regarding their backplate, and they said to use some thermal pad on the backside of VRM1 to make contact with the backplate to assist in cooling. I have not yet tried this to see the temperature results, but plan to do so in the near future.


Rob, are those your 24/7 clocks, and what games do you use to stress test? I never OC for daily use.
Quote:


> Originally Posted by *Durvelle27*
> 
> the 290 also wasn't on stock cooling


Indeed, 1215 is very decent for a 290, Durvelle. I see you have 1180 clocks for your 290X.


----------



## Durvelle27

Quote:


> Originally Posted by *rdr09*
> 
> Rob, that's your 7/24 clocks and what games do you use to stress test? i never oc for daily use.
> indeed 1215 is very decent for a 290, Durville.


Yea but i was pushing about 1.32v


----------



## rdr09

Quote:


> Originally Posted by *Durvelle27*
> 
> Yea but i was pushing about 1.32v


For the 290X or the 290? 1.32v is safe for daily or continual use; it is not like it is loaded 24/7. If temps are reasonable at load, then it is safe imo.

I did get 1.32v at +100 offset; +200 went to 1.41v on mine when benched.


----------



## Tobiman

Just did a quick run of 3DMark at 1250/1500. http://www.3dmark.com/3dm/2777867


----------



## rdr09

Quote:


> Originally Posted by *Tobiman*
> 
> Just did a quick run of 3DMark at 1250/1500. http://www.3dmark.com/3dm/2777867


Nice. Why do you only have 3 sticks of RAM?


----------



## Durvelle27

Quote:


> Originally Posted by *rdr09*
> 
> for the 290X or 290? 1.32v is safe for daily of continual use. it is not like it is loaded 7/24. if temps are reasonably set at load, then it is safe imo.
> 
> i did get 1.32 at +100 offset. +200 went to 1.41v on mine when benched.


Sapphire R9 290 with Accelero Xtreme III @1215/1450 1.32v
AMD R9 290X Stock @1150/1350 Stock


----------



## JordanTr

Quote:


> Originally Posted by *Roboyto*
> 
> Yes, but you can get same results with another manufacturers block and Fujipoly Ultra Extreme. When I upgraded to the FUE, see my sig, it put the VRM1 temperature to the same temperature as my core. Since I have tweaked my OC reducing offset voltage to +87 for 1200/1500, VRM1 is now usually 1-2C under core temperature.
> 
> I contacted XSPC regarding their backplate, and they said to use some thermal pad on the backside of VRM1 to make contact with the backplate to assist in cooling. I have not yet tried this to see the temperature results, but plan to do so in the near future.


I said before that I'm not going to use FUE pads at all; I want the best result from the block as it comes from the shop. So probably I will go with the kryographics waterblock + their passive backplate. I will order some Phobya thermal pads (that's the best I can find in the UK), but they're 7W/mK. Is it worth putting them under the backplate at all?


----------



## Durvelle27

For some reason, after 1100/1350 3DMark 11 stops reporting the correct clocks

but heres 290X at 1150/1450

http://www.3dmark.com/3dm11/8174315


----------



## Tobiman

Quote:


> Originally Posted by *rdr09*
> 
> nice. whay you only have 3 sticks of ram?


4th stick is dead. I started RMA but haven't sent it yet.


----------



## rdr09

Quote:


> Originally Posted by *Durvelle27*
> 
> Some reason after 1100/1350 3DMark 11 stops reporting correct clocks
> 
> but heres 290X at 1150/1450
> 
> http://www.3dmark.com/3dm11/8174315


you hit the lottery twice.

Quote:


> Originally Posted by *Tobiman*
> 
> 4th stick is dead. I started RMA but haven't sent it yet.


looks like it does not affect the performance much. here is mine at same clocks . . .

http://www.3dmark.com/3dm/2778034


----------



## Tobiman

Yeah, I don't think running one less ram stick affects performance at all. Even more important, my new 620W power supply has proven, for the past few days, that it can power my whole system, even overclocked, without any problems.


----------



## Durvelle27

Quote:


> Originally Posted by *rdr09*
> 
> you hit the lottery twice.
> looks like it does not affect the performance much. here is mine at same clocks . . .
> 
> http://www.3dmark.com/3dm/2778034


Idk, seems like it's a trend

HD 6950 1000/1250 @1.25V
HD 7870 1300/1500 @1.3V
HD 7970 1290/1850 @1.35V
R9 290 1215/1450 @1.35V
GTX 780 1330/1850 @1.32V
R9 290X 1150/1500 @Stock (Haven't tried for max yet)


----------



## Roboyto

Quote:


> Originally Posted by *rdr09*
> 
> Rob, that's your 7/24 clocks and what games do you use to stress test? i never oc for daily use.


The 3 that I have played most extensively have been Final Fantasy XIV, Crysis 3, and Borderlands 2. All are stable at 1200/1500 +87mV offset for 5760*1080 resolution.

I run the OC to game since I'm running Eyefinity with one card. When I'm done playing I return the card to stock clocks.

I just picked up Tomb Raider in a Steam sale for $11.99; that is going to be my next test


----------



## Roboyto

Quote:


> Originally Posted by *JordanTr*
> 
> I told before that im not going to use FUE pads at all. I want the best result from the block as it comes from shop. So probably i will go with kryographics waterblock+ their passive backplate, i will order some phobya termal pads ( thats the best i can find in uk ), but its 7w/mk. Is it worth to put them under backplate at all?


For there to be heat transfer, you have to use pads under the backplate. Metal-to-metal contact would most likely cause something to short out.

I would get the active one personally, the top of the card looks incomplete without it...even if it's only a few C performance difference.


----------



## rv8000

Does anyone know what thickness thermal pads I'd need if I went and replaced the VRM1 and VRM2 pads on a Tri-X card?


----------



## Raephen

Quote:


> Originally Posted by *JordanTr*
> 
> I told before that im not going to use FUE pads at all. I want the best result from the block as it comes from shop. So probably i will go with kryographics waterblock+ their passive backplate, i will order some phobya termal pads ( thats the best i can find in uk ), but its 7w/mk. Is it worth to put them under backplate at all?


I use the Phobya pads with my Kryographics block, and they work better than the stock ones, so yeah: they'll do fine.

I haven't got a backplate, though, so couldn't comment on that.


----------



## krillz0

Sapphire R9 290
Core: 1222MHz
RAM: 1500MHz
http://www.3dmark.com/3dm/2778891

Is this score low? The funny thing is that I have run the test like 6 times now and it lowers the score after every try... it started at 10456, went down to 10302, and is now stable @ 10368...
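For what it's worth, the run-to-run spread can be sanity-checked in a few lines (using the scores quoted above as sample input):

```python
from statistics import mean

# Scores from the post, used purely as sample input.
scores = [10456, 10302, 10368]

avg = mean(scores)
spread = max(scores) - min(scores)
print(f"mean={avg:.0f}, spread={spread} ({spread / avg:.1%} of mean)")
# A spread of a percent or two between runs is normal benchmark noise;
# a steady downward trend usually points at heat soak or background tasks.
```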


----------



## chronicfx

Asked this in Motherboards a couple hours back and have had no traffic.. maybe this is a good place too. Let me know if this makes sense lol

I currently have the rig in my sig: 3570k, 2400MHz RAM and 3x R9 290X on a Gigabyte Z77X-UD5H board running PCIe 3.0 (x8, x4, x4). I am thinking this is a bandwidth-limited situation and have looked into going 2011, but $800+ (or even more than $400, for that matter) on a platform upgrade is a huge step. So I was thinking of upgrading the UD5H to an ASUS P8Z77 WS or a Maximus V Extreme for the extra PLX lanes. I have a Sound Blaster Zx card which would need one of the x16 slots (bottom one, I presume) on these boards, since they don't have the exposed x1 slot layout that the UD5H has. So the main question is: will I gain framerate and smoothness at 1440p 60Hz by upping my available PCIe lanes and staying on Z77? I run my CPU at 5GHz, sometimes 4.8 depending on my mood.


----------



## JordanTr

Quote:


> Originally Posted by *Roboyto*
> 
> For their to be heat transfer you have to use pads under the backplate. Metal-to-metal contact would cause something to short out most likely.
> 
> I would get the active one personally, the top of the card looks incomplete without it...even if it's only a few C performance difference.


To be honest, it will be my first waterloop, so i dont even know how to connect together that active backplate and block. Will i need extra barbs or fittings?


----------



## Roboyto

Quote:


> Originally Posted by *JordanTr*
> 
> To be honest, it will be my first waterloop, so i dont even know how to connect together that active backplate and block. Will i need extra barbs or fittings?


Nope, they give you a different adapter for the waterblock. It allows for water to reach that heatpipe to pull the heat away.



Quote:


> Originally Posted by *rv8000*
> 
> Does anyone know what thickness thermal pad would be used if I went and replaced the vrm1 and 2 pads on a Tri-x card?


Not sure for an air cooler. You can buy pads as thick as 1.5mm, so I would say to play it safe and order the 1mm pads; if they are not thick enough you can always lay 2 pieces on top of each other.

You could pull the cooler off and measure how thick the stock ones are, or shoot Sapphire an e-mail and ask them.

Quote:


> Originally Posted by *chronicfx*
> 
> asked this in motherboards a couple hours back and have had no traffic.. maybe this is a good place too. let me know if this makes sense lol
> 
> I currently have my rig in my sig. 3570k, 2400mhz ram and 3x R9 290x on a Z77x-UD5H gigabyte board running 3.0 (x8,x4,x4). I am thinking this a bandwidth limited situation and have looked into going 2011 but $800+(or even more than $400 for that matter) on a platform upgrade is a huge step. So I was thinking of upgrading the UD5H to an ASUS P8Z77 WS or a Maximus V Extreme for the extra PLX lanes. I have a sound blaster Zx card which would need one of the x16 slots (bottom one I presume) on these boards since they don't have the exposed x1 slot layout that the UD5H has.. So the main questions is will I gain framerate and smoothness at 1440p 60hz by upping my available pcie lanes and staying on z77? I run my cpu at 5ghz, sometimes 4.8 depending on my mood.




If I remember correctly from someone else asking a similar question, one of these cards isn't bottlenecked by an 8X slot...but a 4X could be questionable. Running 3 cards, I think you should first invest in an i7 over anything; 4 threads is likely not enough for 3 of these very powerful GPUs.

I would think that 2 cards wouldn't have much, if any, trouble pushing 1440p @ 60Hz. My single 290 running at 1200/1500 runs Crysis 3 very smoothly 5760*1080, all settings maxed except AA which I bumped down to 4XAA I believe. Granted it's certainly not 60fps, but I don't believe it ever falls below 35.


----------



## Paul17041993

The problem with lanes is more along the lines of latency. Anything below the full x16 lanes on both PCIe 2.0 and 3.0 can give a little less performance, but usually you can go as low as x4 PCIe 2.0 and only see a ~10% performance drop, allowing quadfire to still be practical provided the primary card has at least x8 PCIe 2.0 to cope with the buffer copies.
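For a rough sense of the raw link numbers (theoretical spec-sheet bandwidth after encoding overhead; this says nothing about the latency side of the argument):

```python
# Approximate usable bandwidth per PCIe lane, in GB/s, after line encoding:
# Gen 2 runs 5 GT/s with 8b/10b (-> ~0.5 GB/s/lane),
# Gen 3 runs 8 GT/s with 128b/130b (-> ~0.985 GB/s/lane).
PER_LANE_GBPS = {"2.0": 0.5, "3.0": 0.985}

def link_bandwidth(gen: str, lanes: int) -> float:
    """Theoretical one-direction bandwidth of a PCIe link, in GB/s."""
    return PER_LANE_GBPS[gen] * lanes

print(link_bandwidth("2.0", 4))   # x4 Gen2 -> 2.0 GB/s
print(link_bandwidth("3.0", 16))  # x16 Gen3 -> ~15.8 GB/s
```

So a Gen3 x4 slot still moves roughly as much data as a Gen2 x8, which is why the drop-off in games is usually small.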


----------



## chronicfx

Quote:


> Originally Posted by *Paul17041993*
> 
> the problem with lanes is more in the lines of latency, anything below the full x16 lanes on both PCIe 2.0 and 3.0 can give a little less performance, but usually you can go as low as only x4 PCIe2.0 and only get a ~10% perf. drop, allowing quadfire to still be practical provided the primary card has at least x8 PCIe2.0 to cope with the buffer copies.


My friend wants to sell me his Z77 Maximus V Extreme for $150.. I have to decide before he puts it on eBay. I will be sticking with the 3570k.

It has a PLX chip. So I just thought I would ask if it would improve things a lot? Or should I just leave it be and not tear down all my watercooling and have to overclock again and all that... Worth it?


----------



## Roboyto

Quote:


> Originally Posted by *chronicfx*
> 
> My friend wants to sell me his z77 maximus V extreme for $150.. I have to decide before he puts it in ebay. I will be sticking to the 3570k.


You were willing to spend $1500+ on GPUs, but you have a $250 CPU. That is a nice board, but I would probably be searching for a used 3770k as well.


----------



## lawson67

I have managed to bring the top card of my 2x R9 290 setup down from 91-93c, after about 20-30 mins of Heaven Benchmark 4.0 in full screen at ultra settings at 2560x1440 on my Qnix monitor, to a max of 76c after 2 hours of Heaven. The top card is a Gigabyte Windforce in the top slot, the other a PowerColor in the bottom slot. I did this by simply re-pasting my Windforce GPU with Antec Formula 7 nano diamond compound and exhausting my H80i fans instead of intaking air from outside through the radiator, which Corsair say they recommend!

I was worried about doing this, as I have read from other people that their CPU temps shot up doing this with Xfire or SLI setups, since the heat of course goes out through the radiator! However, my CPU temps only increased by 3-4c, to a max of 69c on core 1 after 2 hours of benchmarking! I have shaved 17c off the Windforce, from a max of 93c after 20-30 mins of Heaven down to a max of 76c after *2 hours* of Heaven, which seems incredible!... but it's true!

I also installed 2x Akasa AK174CB-4BLB fans last week, which spin at 1700rpm and move 59.05cu.ft/min of air each, to blow between the cards in an attempt to bring the temps down. However, these made little difference until now, as I believe they were fighting with the intake of the H80i fans! I also have 3 more of these fans, 2 in the roof as exhaust and one more in the front as an intake, plus 2x 80mm side fans. Anyhow, I am very pleased with the results, and all of this was made so much easier by monitoring GPU temps, VRM temps and fan speeds on the fly in Heaven Benchmark 4.0, using HWiNFO to pass the sensor data over to RTSS and show it in an OSD.


----------



## Iniura

Quote:


> Originally Posted by *krillz0*
> 
> Sapphire r9 290
> core: 1222mhz
> ram: 1500mhz
> http://www.3dmark.com/3dm/2778891
> 
> Is this score low? The funny thing is that I have run the test about six times now and it lowers the score after every try... it started at 10456, went down to 10302, and is now stable @ 10368...


It is in the neighborhood of my card.

XFX R9 290 Black OC Edition
Core 1180
Memory 1450

Firestrike: Graphics Score 12242 http://www.3dmark.com/3dm/2778891


----------



## chronicfx

What gains are there going from a 3570K to a 3770K? Show me; I don't play BF4 multiplayer.


----------



## kizwan

Quote:


> Originally Posted by *Raephen*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JordanTr*
> 
> I said before that I'm not going to use FUE pads at all. I want the best result from the block as it comes from the shop. So I will probably go with the Kryographics waterblock plus their passive backplate, and I will order some Phobya thermal pads (that's the best I can find in the UK), but they are 7 W/mK. Is it worth putting them under the backplate at all?
> 
> 
> 
> I use the Phobya pads with my Kryographics block, and they work better than the stock ones, so yeah: they'll do fine.
> 
> I haven't got a backplate, though, so couldn't comment on that.

@JordanTr, the AC Kryographics + backplate delivers the best out-of-the-box VRM1 cooling of the four waterblocks tested. The difference between the active & normal backplate is small, though. The backplate plays a major role in VRM1 cooling with the Kryographics block.

http://www.xtremesystems.org/forums/showthread.php?288109-Stren-s-R9-290-290x-Water-Block-Testing


You can get the Phobya thermal pads if you want; they work well for @Raephen. The stock thermal pads (which come with the Kryographics block) should be sufficient though.


----------



## sugarhell

Please, that graph is lame. Do you actually believe that an EK block has a 44C delta on the VRMs? Even with EK's stock pads they outperform any other block on VRM cooling.


----------



## kizwan

Quote:


> Originally Posted by *sugarhell*
> 
> Please, that graph is lame. Do you actually believe that an EK block has a 44C delta on the VRMs? Even with EK's stock pads they outperform any other block on VRM cooling.


I have an EK waterblock; depending on how much I overclock & overvolt, I can see the delta temp on VRM1 anywhere from the low to high 30s Celsius, and that's with just 3DMark 11. So I do believe it can go that high. Do you have any result or review that shows EK waterblocks with the stock thermal pads outperforming other 290/290X waterblocks? The review I linked to, the only one I know of, so far only has results for four 290/290X waterblocks.


----------



## Roboyto

Quote:


> Originally Posted by *chronicfx*
> 
> What gains are there going from a 3570K to a 3770K? Show me; I don't play BF4 multiplayer.


I would have trouble finding that information because no one is running a mid-level CPU with 3 enthusiast-level GPUs. You could sell your 3570K and buy a 3770K for what you would spend on that motherboard. It has been contested in this thread, more than once, whether a 4670K can keep up with (2) 290 cards, let alone 3. If you don't want to cough up the cash, sell one of your 290Xs and only use 2. Three of these cards is overkill for 1440p gaming at 60Hz; if you were running 3D it would probably be a different story.


----------



## Roboyto

Quote:


> Originally Posted by *kizwan*
> 
> I have an EK waterblock; depending on how much I overclock & overvolt, I can see the delta temp on VRM1 anywhere from the low to high 30s Celsius, and that's with just 3DMark 11. So I do believe it can go that high. Do you have any result or review that shows EK waterblocks with the stock thermal pads outperforming other 290/290X waterblocks? The review I linked to, the only one I know of, so far only has results for four 290/290X waterblocks.


I agree. The highest VRM1 temp I recorded was 78C running +200mV at 1315/1675 clocks in the FFXIV benchmark. That was before I upgraded the pads on my XSPC Razor; after I switched, that temperature came down to 58C at the same clocks and voltage.


----------



## JordanTr

Quote:


> Originally Posted by *kizwan*
> 
> @JordanTr, the AC Kryographics + backplate delivers the best out-of-the-box VRM1 cooling of the four waterblocks tested. The difference between the active & normal backplate is small, though. The backplate plays a major role in VRM1 cooling with the Kryographics block.
> 
> http://www.xtremesystems.org/forums/showthread.php?288109-Stren-s-R9-290-290x-Water-Block-Testing
> 
> 
> You can get the Phobya thermal pads if you want; they work well for @Raephen. The stock thermal pads (which come with the Kryographics block) should be sufficient though.


Does Kryographics include thermal pads with their backplate, or is it just a plain piece of metal? And what thickness should I get for the block itself and for the backplate? Is 1mm alright for everything, or should I get 0.5mm as well as 1mm?


----------



## King4x4

On my quadfire setup the highest VRM temp was 51C, with the liquid temp around 34C, at +180mV and the GPUs grinding away at 1240MHz.


----------



## Raephen

Quote:


> Originally Posted by *JordanTr*
> 
> Does Kryographics include thermal pads with their backplate, or is it just a plain piece of metal? And what thickness should I get for the block itself and for the backplate? Is 1mm alright for everything, or should I get 0.5mm as well as 1mm?


For the block, the stock pads are 0.5mm, I think, but they are stiff pads. The Phobya pads are much more pliable for a good fit over the components, so I went with 1mm.

I don't know which thickness they use on the backplate. Have they listed it on their website?


----------



## JordanTr

Another stupid question: I've been wondering, if I put a 0.5mm Phobya thermal pad on the GPU itself instead of Noctua NT-H1, maybe I could get better results?







However, with any waterblock the core temp is not a problem, only the VRMs.


----------



## Roboyto

Quote:


> Originally Posted by *JordanTr*
> 
> Another stupid question: I've been wondering, if I put a 0.5mm Phobya thermal pad on the GPU itself instead of Noctua NT-H1, maybe I could get better results?
> 
> 
> 
> 
> 
> 
> 
> However, with any waterblock the core temp is not a problem, only the VRMs.


No waterblock manufacturer suggests pads for the core. The blocks are machined for direct contact to the die, a pad would likely cause the card to warp when tightening everything. I would just put some paste on there, use the included TIM even.

Quote:


> Originally Posted by *JordanTr*
> 
> Does Kryographics include thermal pads with their backplate, or is it just a plain piece of metal? And what thickness should I get for the block itself and for the backplate? Is 1mm alright for everything, or should I get 0.5mm as well as 1mm?


Aquacomputer suggests 0.5mm for RAM and VRMs; I would stick with their suggestion. They are also the only company who requires TIM on the RAM chips. They must have machined the block to make direct contact to RAM as well; using pads on the RAM for their block could cause warping like I mentioned above.

In my thermal pad upgrade write up, I have the installation instructions and thermal pad thicknesses for all the 290(X) blocks listed.

Quote:


> Originally Posted by *JordanTr*
> 
> Does Kryographics include thermal pads with their backplate, or is it just a plain piece of metal? And what thickness should I get for the block itself and for the backplate? Is 1mm alright for everything, or should I get 0.5mm as well as 1mm?


http://shop.aquacomputer.de/product_info.php?language=en&products_id=3150

They say it comes with 'mounting material', which is likely a literal translation from German; I would say it comes with them.


----------



## Paul17041993

Spoiler: quote



Quote:


> Originally Posted by *lawson67*
> 
> I have managed to bring the top card of my two R9 290s down from 91-93C (after about 20-30 minutes of Heaven Benchmark 4.0, full screen, ultra settings, at 2560x1440 on my Qnix monitor) to a max of 76C after 2 hours of Heaven Benchmark 4.0. The top card is a Gigabyte Windforce in the top slot; the other is a PowerColor in the bottom slot. I did this simply by re-pasting the Windforce GPU with Antec Formula 7 nano diamond compound and setting my H80i fans to exhaust, instead of intaking air from outside through the radiator as Corsair says they recommend!
> 
> I was worried about doing this, as I have read from other people that their CPU temps shot up doing this with Crossfire or SLI setups, since the heat of course then goes out through the radiator! However, my CPU temps only increased by 3-4C, to a max of 69C on core 1, after 2 hours of benchmarking. Shaving 17C off the Windforce, from a max of 93C after 20-30 minutes of Heaven down to a max of 76C after *2 hours*, seems incredible... but it's true!
> 
> Last week I also installed 2x Akasa AK174CB-4BLB fans, which spin at 1700 rpm and move 59.05 cu.ft/min of air each, to blow between the cards in an attempt to bring the temps down. However, these made little difference until now, as I believe they were fighting with the intake of the H80i fans. I also have 3 more of these fans (2 exhausting in the roof and one more intaking at the front) plus 2x 80mm side fans. Anyhow, I am very pleased with the results, and all of this was made so much easier by monitoring GPU temps, VRM temps and fan speeds on the fly in Heaven Benchmark 4.0, using HWiNFO to pass the sensor data over to RTSS for an OSD.






Yeah, the back and top of your case should ONLY be exhaust; if they're intake you completely screw up your airflow, and you can even make the CPU cooling work significantly worse depending on your case environment.

I honestly have no idea why Corsair recommends using it as an intake, unless they are referring to mounting it on a front or side panel (or the bottom, if you have one that isn't flow-limited).


----------



## vieuxchnock

Et voilà: my single-slot R9 290. And still working!!!

http://www.servimg.com/image_preview.php?i=510&u=17159996

http://www.servimg.com/image_preview.php?i=511&u=17159996

http://www.servimg.com/image_preview.php?i=512&u=17159996


----------



## JordanTr

Quote:


> Originally Posted by *Roboyto*
> 
> No waterblock manufacturer suggests pads for the core. The blocks are machined for direct contact to the die, a pad would likely cause the card to warp when tightening everything. I would just put some paste on there, use the included TIM even.
> 
> Aquacomputer suggests 0.5mm for RAM and VRMs; I would stick with their suggestion. In my thermal pad upgrade write up, I have the installation instructions and thermal pad thicknesses for all the 290(X) blocks listed.
> http://shop.aquacomputer.de/product_info.php?language=en&products_id=3150
> 
> They say it comes with 'mounting material', which is likely a literal translation from German; I would say it comes with them.


I knew it was a stupid question; it's just convenient to cut a pad and slap it on the GPU, with no need to spread thermal compound. Anyway, should I use the Noctua NT-H1 I've got at the moment? (It does wonders on air compared to the paste supplied with the Gelid Icy Vision cooler.)


----------



## Roboyto

Quote:


> Originally Posted by *JordanTr*
> 
> I knew it was a stupid question; it's just convenient to cut a pad and slap it on the GPU, with no need to spread thermal compound. Anyway, should I use the Noctua NT-H1 I've got at the moment? (It does wonders on air compared to the paste supplied with the Gelid Icy Vision cooler.)


It's pretty good paste, you certainly can. The cores don't suffer from heat issues once they're under a block so the TIM won't make or break your overclock.

One of the most popular 'traditional' pastes, especially for GPU blocks, is Gelid GC-Extreme; even EK recommends it in their instructions.

My paste of choice these days is Xigmatek PTI-G4512. Not very well known, but it performs and stays tacky even under heavy/long use. Longest I have had it on a block without removing was 18 months and it was still good.


----------



## JordanTr

Really nice results. My Corsair H90 hasn't been repasted for 8 months and temperatures are still the same as the first day. However, once I build my full water loop the Corsair will go into retirement.


----------



## nightfox

Quote:


> Originally Posted by *vieuxchnock*
> 
> Et voilà: my single-slot R9 290. And still working!!!
> 
> http://www.servimg.com/image_preview.php?i=510&u=17159996
> 
> http://www.servimg.com/image_preview.php?i=511&u=17159996
> 
> http://www.servimg.com/image_preview.php?i=512&u=17159996


How did you remove the 2nd DVI?


----------



## vieuxchnock

Cut it.


----------



## Roboyto

Quote:


> Originally Posted by *nightfox*
> 
> How did you remove the 2nd DVI?


Desoldered from the board maybe?


----------



## Paul17041993

Quote:


> Originally Posted by *nightfox*
> 
> How did you remove the 2nd DVI?


Quote:


> Originally Posted by *vieuxchnock*
> 
> Cut it.


Yeah, if you look carefully you can see it looks like it's been hit with a saw of some sort.

A bit of a hacky way to do it, but provided the signal lines don't touch each other it should be fine regardless.


----------



## vieuxchnock

I cut it with a pair of cutters.


----------



## caenlen

Just bought my first R9 290, $323 with free shipping off eBay, Sapphire stock cooler. Not going to unlock it or anything, but I'm considering buying the Arctic Hybrid to install on it: http://www.newegg.com/Product/Product.aspx?Item=N82E16835186095

Worth it? The seller had 100% feedback and has sold several cards, so it should be legit, and Sapphire has a good RMA service, so as long as I have the s/n sticker on the back I should be good.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *caenlen*
> 
> Just bought my first R9 290, $323 with free shipping off eBay, Sapphire stock cooler. Not going to unlock it or anything, but I'm considering buying the Arctic Hybrid to install on it: http://www.newegg.com/Product/Product.aspx?Item=N82E16835186095
> 
> Worth it? The seller had 100% feedback and has sold several cards, so it should be legit, and Sapphire has a good RMA service, so as long as I have the s/n sticker on the back I should be good.


Good deal. Just be aware that if Sapphire finds out you removed the stock cooler they will not honor the warranty. I would test the card out (find the max OC, bench and game the heck out of it) for a month or so to make sure the card is good before you remove the stock cooler. That's what I did with my reference card before I put on my Gelid, then eventually a water block.


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Good deal. Just be aware that if Sapphire finds out you removed the stock cooler they will not honor the warranty. I would test the card out (find the max OC, bench and game the heck out of it) for a month or so to make sure the card is good before you remove the stock cooler. That's what I did with my reference card before I put on my Gelid, then eventually a water block.


Indeed.
Unscrewing everything perfectly and all of that doesn't matter when the thermal paste inside is not the same and the thermal pads are broken or misplaced.
This is why I'm starting to appreciate the companies that allow removal of the cooler if there is no damage done to the card.


----------



## Forceman

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Good deal. Just be aware that if Sapphire finds out you removed the stock cooler they will not honor the warranty. I would test the card out (find the max OC, bench and game the heck out of it) for a month or so to make sure the card is good before you remove the stock cooler. That's what I did with my reference card before I put on my Gelid, then eventually a water block.


Most companies don't honor warranties on secondhand purchases anyway, so it may not matter. Does Sapphire warranty secondhand?


----------



## vortex240

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> Indeed.
> Unscrewing everything perfectly and all of that doesn't matter when the thermal paste inside is not the same and the thermal pads are broken or misplaced.
> This is why I'm starting to appreciate the companies that allow removal of the cooler if there is no damage done to the card.


I have RMA'd 2 Sapphire cards, both with replaced TIM, and both took just over a week to get a replacement. Mind you, this is in CANADA.


----------



## rv8000

Quote:


> Originally Posted by *rv8000*
> 
> Does anyone know what thickness thermal pads I would need if I went and replaced the VRM1 and VRM2 pads on a Tri-X card?


Anyone?


----------



## Niberius

Quote:


> Originally Posted by *caenlen*
> 
> Just bought my first R9 290, $323 with free shipping off eBay, Sapphire stock cooler. Not going to unlock it or anything, but I'm considering buying the Arctic Hybrid to install on it: http://www.newegg.com/Product/Product.aspx?Item=N82E16835186095
> 
> Worth it? The seller had 100% feedback and has sold several cards, so it should be legit, and Sapphire has a good RMA service, so as long as I have the s/n sticker on the back I should be good.


Great deal on the card, but I would avoid the Arctic Hybrid; lots of returns on those coolers, mainly because they do not perform as well as the cheaper but larger Arctic Accelero 3 and 4. I have the Accelero 3 myself, and another member here had the Hybrid; at the same voltage and overclock I was getting as much as 10C cooler on the core. The only issue, as I mentioned, is room; I have a huge tower so it's no big deal for me.


----------



## wntrsnowg

How are you guys getting above a 1235 core clock in Afterburner? No matter what I try, even the "unofficial overclocking method," it still doesn't change anything for me. I also tried TriXX 4.8.2 and it didn't have any core voltage adjustment. I am using an XFX 290, reference.


----------



## Niberius

Quote:


> Originally Posted by *wntrsnowg*
> 
> How are you guys getting above a 1235 core clock in Afterburner? No matter what I try, even the "unofficial overclocking method," it still doesn't change anything for me. I also tried TriXX 4.8.2 and it didn't have any core voltage adjustment. I am using an XFX 290, reference.


Are you using the latest beta for afterburner? It lets me go as high as 1300mhz core on the slider.


----------



## wntrsnowg

Yeah, I'm using beta 19 straight from the website. I just reinstalled too; no dice.


----------



## Matt-Matt

Quote:


> Originally Posted by *Niberius*
> 
> Are you using the latest beta for afterburner? It lets me go as high as 1300mhz core on the slider.


Trixx lets me go to 1400 on the core and 1800 on the vRAM without changing anything.


----------



## wntrsnowg

I tried TriXX. It gave me no option for altering Vcore, only some PowerTune percentage thing.

So it turns out that in TriXX they call Vcore "VDDC". I altered VDDC in TriXX, loaded up Afterburner, and the Vcore setting there had changed and was identical. I kept making changes and they are linked. TriXX it is.


----------



## vortex240

Quote:


> Originally Posted by *wntrsnowg*
> 
> I tried TriXX. It gave me no option for altering Vcore, only some PowerTune percentage thing.
> 
> So it turns out that in TriXX they call Vcore "VDDC". I altered VDDC in TriXX, loaded up Afterburner, and the Vcore setting there had changed and was identical. I kept making changes and they are linked. TriXX it is.


Just flash another BIOS onto it. It's very easy; just make sure to back up the one that's on the card now.


----------



## Niberius

Quote:


> Originally Posted by *Matt-Matt*
> 
> Trixx lets me go to 1400 on the core and 1800 on the vRAM without changing anything.


It does for me as well, but since my card won't do a 1400 core and 1800 RAM it doesn't really matter to me.


----------



## Matt-Matt

Quote:


> Originally Posted by *Niberius*
> 
> It does for me as well, but since my card won't do a 1400 core and 1800 RAM it doesn't really matter to me.


Same; max for me so far is 1250. Kind of devo about it, haha. Wanted 1300.









Mind you, that's only at +120mV, so I guess I may be able to do 1300 yet.


----------



## Niberius

Quote:


> Originally Posted by *Matt-Matt*
> 
> Same; max for me so far is 1250. Kind of devo about it, haha. Wanted 1300.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Mind you, that's only at +120mV, so I guess I may be able to do 1300 yet.


Yeah, I hear you man; it sounds like your card is about like mine. I can happily do a 1200MHz core with +100mV, which comes out to around 1.328V core under load. I did push for a 1250MHz core but didn't really notice anything in the way of a better gaming experience, so I backed it back down to 1200MHz, since I can run considerably lower voltage at that clock speed and get better temps as well.

I guess 1200 to 1300 would be good for benching, but for gaming it probably won't make much of a difference.


----------



## Durvelle27

Don't see my name on the list.


----------



## Matt-Matt

Quote:


> Originally Posted by *Niberius*
> 
> Yeah, I hear you man; it sounds like your card is about like mine. I can happily do a 1200MHz core with +100mV, which comes out to around 1.328V core under load. I did push for a 1250MHz core but didn't really notice anything in the way of a better gaming experience, so I backed it back down to 1200MHz, since I can run considerably lower voltage at that clock speed and get better temps as well.
> 
> I guess 1200 to 1300 would be good for benching, but for gaming it probably won't make much of a difference.


Yeah, it's not too bad though. Not a dud like ALL three of my 7950s, or like my two 6850s. I actually got a decent card.

What kind of cooling are you running?


----------



## Niberius

Quote:


> Originally Posted by *Matt-Matt*
> 
> Yeah, it's not too bad though. Not a dud like ALL three of my 7950s, or like my two 6850s. I actually got a decent card.
> 
> What kind of cooling are you running?


Arctic Accelero 3. With the overclock and overvolt I'm seeing 64C load temps on the core, and the hottest VRM is VRM1, which gets to 79C. That's not bad at all considering the clock speed and voltage I'm running. Also keep in mind my tower is a Rosewill Thor, which has some of the best airflow you can get in a full-size tower but is extremely quiet, because the fans are large, low-rpm fans.


----------



## Matt-Matt

Quote:


> Originally Posted by *Niberius*
> 
> Arctic Accelero 3. With the overclock and overvolt I'm seeing 64C load temps on the core, and the hottest VRM is VRM1, which gets to 79C. That's not bad at all considering the clock speed and voltage I'm running. Also keep in mind my tower is a Rosewill Thor, which has some of the best airflow you can get in a full-size tower but is extremely quiet, because the fans are large, low-rpm fans.


Nice. Yeah, I've got a HAF 932 and I've replaced all my fans with 120mm ones. I do have the stock 230mm ones, but one is starting to go, and 4x 120mm covers more area anyway.

That is a really nice temp though! I've got a block coming soon; the rest of my watercooling stuff arrived today. Just waiting for a client's PC build to go through this week, hopefully, and I'll grab the block as payment.







Well kind of but yeah.


----------



## Niberius

Quote:


> Originally Posted by *Matt-Matt*
> 
> Nice. Yeah, I've got a HAF 932 and I've replaced all my fans with 120mm ones. I do have the stock 230mm ones, but one is starting to go, and 4x 120mm covers more area anyway.
> 
> That is a really nice temp though! I've got a block coming soon; the rest of my watercooling stuff arrived today. Just waiting for a client's PC build to go through this week, hopefully, and I'll grab the block as payment.
> 
> 
> 
> 
> 
> 
> 
> Well kind of but yeah.


I had the HAF 932 at one point and it's a great case. I sold that build to a friend a few years ago when I tried to give up gaming, but I eventually built another rig and went with the Thor. I also replaced the one big fan on the side door with 4 120mm fans; they are good ones that don't make much noise while moving a good bit of air. That alone dropped my VRM1 temps 10 degrees compared to just the one big fan that came already installed in the case.

I might go water one day, but for now I'm content with the old-school way of air cooling with big coolers, which is the reason I have a Noctua NH-D14 on my 2700K CPU. lol.


----------



## cephelix

Quote:


> Originally Posted by *Niberius*
> 
> I had the HAF 932 at one point and it's a great case. I sold that build to a friend a few years ago when I tried to give up gaming, but I eventually built another rig and went with the Thor. I also replaced the one big fan on the side door with 4 120mm fans; they are good ones that don't make much noise while moving a good bit of air. That alone dropped my VRM1 temps 10 degrees compared to just the one big fan that came already installed in the case.
> 
> I might go water one day, but for now I'm content with the old-school way of air cooling with big coolers, which is the reason I have a Noctua NH-D14 on my 2700K CPU. lol.


water water water water!!


----------



## Niberius

Quote:


> Originally Posted by *cephelix*
> 
> water water water water!!


Nah, not for me, bro. I like the low maintenance of big air coolers, and I often find myself hitting the upper limits of Vcore before temps become the main problem. Water is nice though, just not for me.









Now keep in mind my CPU is a 2700K, which doesn't have temp issues like the newer chips, so I'm sure water would be the best route for cooling Ivy Bridge and newer Intel CPUs, unless of course you don't mind delidding.


----------



## cephelix

I agree with you too... though I usually run into temperature limits first. Hence the possible move to water...


----------



## kizwan

Quote:


> Originally Posted by *cephelix*
> 
> I agree with you too... though I usually run into temperature limits first. Hence the possible move to water...


Yeah, in our weather, temperature is the main problem unfortunately.


----------



## cephelix

Quote:


> Originally Posted by *kizwan*
> 
> Yeah, in our weather, temperature is the main problem unfortunately.


LN2 for you then sir?


----------



## rdr09

Quote:


> Originally Posted by *Niberius*
> 
> Nah, not for me, bro. I like the low maintenance of big air coolers, and I often find myself hitting the upper limits of Vcore before temps become the main problem. Water is nice though, just not for me.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now keep in mind my CPU is a 2700K, which doesn't have temp issues like the newer chips, so I'm sure water would be the best route for cooling Ivy Bridge and newer Intel CPUs, unless of course you don't mind delidding.


Water is high maintenance during installation; after that it is just like air cooling. I watercooled my 290 last November and have not touched my loop since. I'll wait a few more months before renewing the water. I ran out of distilled, so I may have to get a gallon for a dollar.


----------



## kizwan

Quote:


> Originally Posted by *cephelix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Yeah, in our weather, temperature is the main problem unfortunately.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> LN2 for you then sir?

Maybe, or a water chiller.


----------



## MaCk-AtTaCk

Hey guys, I just picked up a reference 290 for cheap. I'm coming from a single reference 670. Will the reference blower on the 290 be a lot louder than the 670's? Also, I should see a nice jump in frames, I suppose. I mainly upgraded because I have a 1440p monitor and was hitting the VRAM limit in some of my games.


----------



## Durvelle27

Quote:


> Originally Posted by *MaCk-AtTaCk*
> 
> Hey guys, I just picked up a reference 290 for cheap. I'm coming from a reference 670. Will the reference blower on the 290 be a lot louder than the 670's? Also, I should see a nice jump in frames, I suppose. I mainly upgraded because I have a 1440p monitor and was hitting the VRAM limit in some of my games.


Yes, it will surely be louder, depending on what % the fan is running at, and yes, you'll see a nice jump in performance from a 670, over 20% at stock.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *MaCk-AtTaCk*
> 
> Hey guys, I just picked up a reference 290 for cheap. I'm coming from a single reference 670. Will the reference blower on the 290 be a lot louder than the 670's? Also, I should see a nice jump in frames, I suppose. I mainly upgraded because I have a 1440p monitor and was hitting the VRAM limit in some of my games.


The reference 290 fan is louder than the reference 680 fan. I had a 670 FTW, which used a reference 680's PCB and fan. Clocked at 1241/7500, my 670 would get a mid-11k graphics score in 3DMark11; my 290 at 1245/6500 gets just over 18k, so there is a massive performance increase.
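Back-of-envelope, the jump quoted above works out as follows. This is illustrative arithmetic only; the scores are the rough figures from the post, and the helper function is hypothetical:

```python
def percent_gain(old_score: float, new_score: float) -> float:
    """Percentage improvement of new_score over old_score."""
    return (new_score - old_score) / old_score * 100.0

# Mid-11k graphics (~11500) on the 670 vs just over 18k on the 290:
print(f"{percent_gain(11500, 18000):.0f}% faster")
```

A gain of well over 50% in the graphics score, which matches the "massive performance increase" described.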


----------



## Niberius

Quote:


> Originally Posted by *rdr09*
> 
> water is high maintenance during installation. after that it is just like air cooling. i watercooled my 290 last November and have not touched my loop since. i'll wait a few more months before renewing the water. i ran out of distilled, so may have to get a gallon for a dollar.


That is true; it's not a big deal maintenance-wise, I guess you could say, but for the time being I'm good with air. I might eventually go the water route one day though.


----------



## MaCk-AtTaCk

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> The reference 290 fan is louder than the reference 680 fan. I had a 670 FTW, which used a reference 680's PCB and fan. Clocked at 1241/7500, my 670 would get a mid-11k graphics score in 3DMark11; my 290 at 1245/6500 gets just over 18k, so there is a massive performance increase.


OK, thanks. I had the stock 670 blower, which I read was a little noisier than the 680's. I plan on setting the temp target at 88C and the max fan at 50%. I know it will downclock, but I don't really care that much; I'm sure it will still run circles around my old 670, and for $120 on top of what I sold my 670 for, I think it's a good deal. Oh, and I got a copy of BF4 too.







I'm also waiting to see if I can go water.


----------



## taem

The PowerColor PCS+ 290 is $459.99 at Newegg right now. Some Hynix versions of this card are reported to underperform, and it's a huge card, a three-slotter. Those two caveats aside, it's a great price for the coolest-running 290 out there, with factory clocks of 1040/1350.

http://www.newegg.com/Product/Product.aspx?Item=14-131-549&nm_mc=EMC-GD033114&cm_mmc=EMC-GD033114-_-index-_-Item-_-14-131-549


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Niberius*
> 
> Yeah I hear you man, sounds like your card is about like mine. I can happily do 1200mhz core with +100mv which comes out to around 1.328vc under load. I did push for 1250mhz core but didnt really notice anything in the way of a better gaming experience so I backed it back down to 1200mhz core since I can run considerably lower voltage at that clock speed and get better temps as well.
> 
> I guess 1200 to 1300 would be good for benching but for gaming it probably wont make much of a difference.


Im flat out getting [email protected]@1.485v on mine


----------



## phallacy

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Im flat out getting [email protected]@1.485v on mine


That's insane, but how are you able to push to 1.485 on the card? Even with +200mV on trixx I think before the vdroop max I saw was 1.36 or so. Are you changing the hexadecimal in AB?


----------



## kizwan

Quote:


> Originally Posted by *phallacy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HOMECINEMA-PC*
> 
> Im flat out getting [email protected]@1.485v on mine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's insane, but how are you able to push to 1.485 on the card? Even with +200mV on trixx I think before the vdroop max I saw was 1.36 or so. Are you changing the hexadecimal in AB?
Click to expand...

He is using the ASUS BIOS. Using GPU Tweak allows higher voltage.


----------



## phallacy

Quote:


> Originally Posted by *kizwan*
> 
> He is using ASUS BIOS. Using GPU Tweak allow higher voltage.


Gotcha, thanks. I would use GPU Tweak, but it never reads my cards properly in my setup, and power limit and voltage are always greyed out for the non-primary card.


----------



## hotrod717

Quote:


> Originally Posted by *phallacy*
> 
> Gotcha thanks. I would use GPU Tweak but it never reads my cards properly in my setup and power limit and voltage is always greyed out for the non primary card.


AB is also unlockable to allow for higher voltage than the +100, +200, +300, etc. steps.


----------



## Frontside

Hi, I need some help. I've noticed that the memory clock drops to idle while gaming. It has happened a few times over a couple of months in BF4 (the only game I play). Is it a software or hardware problem? Catalyst 14.xx betas.


----------



## INCREDIBLEHULK

What better place than here








I have some 290/290X cards up for sale; if any of you are interested in saving a bit off the price of a new card, check out my listing.
Quote:


> Originally Posted by *rdr09*
> 
> water is high maintenance during installation. after that it is just like air cooling. i watercooled my 290 last November and have not touched my loop since. i'll wait a few more months before renewing the water. i ran out of distilled, so may have to get a gallon for a dollar.


Indeed


----------



## jfry94

Hi guys, I bought a Gigabyte R9 290 a few weeks back to last about 6 months till I do a full rebuild. Does anybody know if it can be CrossFired with a 290X or even the R9 295X2?


----------



## Paul17041993

Quote:


> Originally Posted by *jfry94*
> 
> hi guys, i brought a gigabyte r9 290 a few weeks back to last about 6 months till i do a full rebuild. Does anybody know if it can be crossfired with a 290x or even the r9 295x2?


290 and 290X, yes, they can CrossFire. I would assume the 295X2 can too, as it's either a 290X core or a slightly different core still based on Hawaii.


----------



## Frontside

Quote:


> Originally Posted by *jfry94*
> 
> hi guys, i brought a gigabyte r9 290 a few weeks back to last about 6 months till i do a full rebuild. Does anybody know if it can be crossfired with a 290x or even the r9 295x2?


Yes, you can.


----------



## Matt-Matt

Quote:


> Originally Posted by *cephelix*
> 
> water water water water!!


I second this!


----------



## Roboyto

Quote:


> Originally Posted by *Niberius*
> 
> Yeah I hear you man, sounds like your card is about like mine. I can happily do 1200mhz core with +100mv which comes out to around 1.328vc under load. I did push for 1250mhz core but didnt really notice anything in the way of a better gaming experience so I backed it back down to 1200mhz core since I can run considerably lower voltage at that clock speed and get better temps as well.
> 
> I guess 1200 to 1300 would be good for benching but for gaming it probably wont make much of a difference.


Depends on what resolution you are trying to run. I'm pushing 5760x1080 on a single 290 with my gaming clocks of 1200/1500 and a +87 offset. I get smooth, playable frames with all settings maxed except AA and maybe shadows.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *phallacy*
> 
> That's insane, but how are you able to push to 1.485 on the card? Even with +200mV on trixx I think before the vdroop max I saw was 1.36 or so. Are you changing the hexadecimal in AB?
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> He is using ASUS BIOS. Using GPU Tweak allow higher voltage.
Click to expand...

ASUS 290X PT1T BIOS, which allows about 1.54v in GPU Tweak.
This particular BIOS reports your card to the PC as an ASUS 290X, whatever it really is (MSI, Gigabyte, Sapphire), even though it clearly isn't one.









Quote:


> Originally Posted by *Matt-Matt*
> 
> I second this!


Thirded


----------



## Kalistoval

I just reinstalled everything on my rig and installed AMD CCC 14.2 beta 1.3 along with all the motherboard drivers. I'm having a problem where CPU-Z reports a link width of x1, though sometimes it reports x16. This is on a fresh install, the card is the only thing on my PCI-E lanes, and it's in an x16 slot. The card is an ASUS R9 290X DCUII OC with a Kraken G10 and X40. I'm running an FX-8320 at a modest 4GHz, all CnQ/power-saving features off, on a Kraken X60. Does anyone have an idea what might be going on? I never had this happen with my old 6870 in the same slot, which is why I reinstalled Windows 7 x64.
Asrock 990FX Killer


----------



## ReXtN

Quote:


> Originally Posted by *Kalistoval*
> 
> I just reinstalled everything on my rig and installed amd ccc 14.2.beta1.3 all other mobo drivers too. Im having this problem where cpuz is reporting a link width of x1 some times it will report x16 this is on a freash install and is the only think on my pci-e lanes also its on a x16 slot this is a Asus R9 290x DCUIIOC with a kraken g10 and x40 im running a fx 8320 atm running a modest 4.ghz all CnQ power saving features off on a kraken x60 does anyone have an idea on what might be going on I never had this happend with my old 6870 in the same slot thus why I reinstalled windows 7 x64.
> Asrock 990fx killer


Hi!

Download GPU-Z and check if it says x1, x4, x8 or x16. The lane usage in GPU-Z is dynamic, so if your card is idling it may be at just x1, but if you put the card under load it jumps straight up to x16.









What I do to test this is download GPU-Z, click on the little "?" and run the "Render Test" (as shown in the picture below). When the "Render Test" is running, your PCIe lane should jump up to x16 if your GPU is in an x16 PCI-E slot on your motherboard.









I hope this helped you out a bit


----------



## Kalistoval

Quote:


> Originally Posted by *ReXtN*
> 
> Hi!
> 
> Download GPU-Z and check if it says x1, x4, x8 or x16. The lane usage in GPU-z is dynamic, so if your card is idling, it may be at just x1, but if you put the card under load, it jumps straight up to x16
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What i do to test this is to download GPU-Z and click on the little "?" and run the "Render Test".(as shown in the picture below). When the "Render Test" is running, your PCIe-lane should jump up to x16 if you GPU is in a x16 pci-e lane on your motherboard
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I hope this helped you out a bit


OK, tried it, you are right.











What kind of Cinebench R15 score would this card normally get?


----------



## ReXtN

Quote:


> Originally Posted by *Kalistoval*
> 
> Okk tried it you are right
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What kinda cinebench score would this card normally score? R15


Great!









I don't know what Cinebench score the 290X would normally achieve.
I use Unigine Valley or 3DMark Fire Strike Extreme for benchmarking and clock-speed testing.


----------



## ReXtN

I ran Cinebench just now in the OpenGL test.
The Valley score is almost two months old, so it uses the 14.1 beta driver, which was a bit buggy.
This is my 3DMark Fire Strike Extreme score: http://www.3dmark.com/fs/1943349

All of these tests/benchmarks have been run at 1200MHz on the core and 6GHz on the memory.


----------



## kizwan

*VRM1: Stock EK thermal pad vs. Fujipoly Extreme thermal pad*

_Note:-
_

Additional 120mm radiator was added to the loop for Fujipoly Extreme tests. The additional 120mm rad will only drop water temp a couple degrees Celsius max.
With the stock EK thermal pad, I didn't monitor water temp, so just calculate the difference between VRM1 max temp & ambient temp to see the difference. Not the best approach, but it's all I have.

*3DMark11 - 1175/1400 +200mV*

*Stock EK Thermal Pad*
Ambient: 32.5C
VRM1: 66C
Delta (VRM1 - Ambient): 33.5C


*Fujipoly Extreme Thermal Pad*
Ambient: 27.6C
Water temp: 30C
VRM1: 44C
Delta (VRM1 - Ambient): 16.4C


*Firestrike Extreme - 1200/1400 +193mV*

*Stock EK Thermal Pad*
Ambient: 30C
VRM1: 56C
Delta (VRM1 - Ambient): 26C


*Fujipoly Extreme Thermal Pad*
Ambient: 27.2C
Water temp: 32C
VRM1: 46C
Delta (VRM1 - Ambient): 18.8C


*Unigine Valley Benchmark 1.0 - 1150/1600 +100mV*

*Stock EK Thermal Pad*
Ambient: 28.5C
VRM1: 65C
Delta (VRM1 - Ambient): 36.5C


*Fujipoly Extreme Thermal Pad*
Ambient: 27.3C
Water temp: 32C
VRM1: 47C
Delta (VRM1 - Ambient): 19.7C
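Since the note above asks readers to work out the VRM1-to-ambient deltas themselves, here is a minimal Python sketch that recomputes them from the measurements in this post (the run names and dictionary layout are just for illustration):

```python
# VRM1 max temp and ambient temp (degrees C), copied from the
# measurements above: (VRM1, ambient) per benchmark run.
runs = {
    "3DMark11, stock EK pad":   (66.0, 32.5),
    "3DMark11, Fujipoly":       (44.0, 27.6),
    "Firestrike, stock EK pad": (56.0, 30.0),
    "Firestrike, Fujipoly":     (46.0, 27.2),
    "Valley, stock EK pad":     (65.0, 28.5),
    "Valley, Fujipoly":         (47.0, 27.3),
}

for name, (vrm1, ambient) in runs.items():
    # Delta = VRM1 max temp minus ambient temp, as described in the note.
    print(f"{name}: delta = {vrm1 - ambient:.1f}C")
```

Across the three benchmarks, the Fujipoly deltas come out roughly 7-17C lower than the corresponding stock-pad deltas, which matches the comparison drawn in the post.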


----------



## neurotix

Quote:


> Originally Posted by *kizwan*
> 
> *VRM1: Stock EK thermal pad vs. Fujipoly Extreme thermal pad*
> 
> _Note:-
> _
> 
> Additional 120mm radiator was added to the loop for Fujipoly Extreme tests. The additional 120mm rad will only drop water temp a couple degrees Celsius max.
> With stock EK thermal pad, I didn't monitor water temp. So just calculate the difference between VRM1 max temp & ambient temp to see the difference. Not best approach but it's all I have.
> 
> *3DMark11 - 1175/1400 +200mV*
> 
> *Stock EK Thermal Pad*
> Ambient: 32.5C
> VRM1: 66C
> Delta (VRM1 - Ambient): 33.5C
> 
> 
> *Fujipoly Extreme Thermal Pad*
> Ambient: 27.6C
> Water temp: 30C
> VRM1: 44C
> Delta (VRM1 - Ambient): 16.4C
> 
> 
> *Firestrike Extreme - 1200/1400 +193mV*
> 
> *Stock EK Thermal Pad*
> Ambient: 30C
> VRM1: 56C
> Delta (VRM1 - Ambient): 26C
> 
> 
> *Fujipoly Extreme Thermal Pad*
> Ambient: 27.2C
> Water temp: 32C
> VRM1: 46C
> Delta (VRM1 - Ambient): 18.8C
> 
> 
> *Unigine Valley Benchmark 1.0 - 1150/1600 +100mV*
> 
> *Stock EK Thermal Pad*
> Ambient: 28.5C
> VRM1: 65C
> Delta (VRM1 - Ambient): 36.5C
> 
> 
> *Fujipoly Extreme Thermal Pad*
> Ambient: 27.3C
> Water temp: 32C
> VRM1: 47C
> Delta (VRM1 - Ambient): 19.7C


+Rep for this, good post.

Anyone know if I could use those Fujipoly Extreme pads for the VRM section of my Tri-X coolers? My top card's VRM runs very hot while mining.


----------



## kizwan

Quote:


> Originally Posted by *neurotix*
> 
> +Rep for this, good post.
> 
> Anyone know if I could use those Fujipoly extreme pads for the VRM section of my Tri-X coolers? My top cards VRM runs very hot mining.


Thanks.

I'm pretty sure you can. Just need to find the correct thickness. I think you cannot go wrong with 0.5mm or 1mm.


----------



## Imprezzion

Quote:


> Originally Posted by *neurotix*
> 
> +Rep for this, good post.
> 
> Anyone know if I could use those Fujipoly extreme pads for the VRM section of my Tri-X coolers? My top cards VRM runs very hot mining.


I ordered a strip of 100x15x1.0mm for my Tri-X and we shall see when it arrives








Might take 2 weeks or so, as it has to come from the USA (FrozenCPU) all the way to Holland, but OK.

My VRMs run in the mid 80s as well when gaming at 1165MHz @ +100mV.


----------



## HOMECINEMA-PC

@kizwan is a very knowledgeable benchmarker, modder, overclocker and gamer... kudos to you, good sir.


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> @kizwan is a very knowledgeable benchmarker , modder , overclocker and gamer........ kudos to you good sir


----------



## ReXtN

I'm using the EK full-cover block on my 290X with the stock EK thermal pads, and I am getting VRM temps of max 39C on VRM1 and 46C on VRM2. Core temp is not over 41C.








The temps are with my GPU overclocked to 1205/1500MHz, +100mV on core and +19mV on AUX.








This is of course under water, using a 360x80mm rad to cool the 290X and an i7 4770K @ 4.6GHz.

As I'm getting such low temps, I don't see the point of getting the Fujipoly Extreme pads myself. I recommend checking your VRM temps before ordering Fujipoly Extreme thermal pads for your GPU, as the Fujipoly pads are quite expensive ($19 for a 100x15mm pad, and you probably need 3-4 of those pads per GPU).

I'm not saying the Fujipoly thermal pads are bad, not at all. I think Fujipoly is a great product for people who need lower VRM temps, but check your temps before buying.


----------



## neurotix

Quote:


> Originally Posted by *Imprezzion*
> 
> I ordered a strip of 100x15x1.0mm for my Tri-X and we shall see when it arrives
> 
> 
> 
> 
> 
> 
> 
> 
> Might take 2 weeks orso as it has to come from USA (FrozenCPU) all the way to Holland but k.
> 
> My VRM's run mid 80's as well when gaming on 1165Mhz @ +100mV


I run my cards at 1000/1500MHz, +25 offset, for mining. My top card reaches 95C on VRM1 sometimes. I have to open the windows even this time of year in Wisconsin to cool the room off. This thing is a space heater; the other day the ambient temp went up by ten degrees Fahrenheit in half an hour (70F to 80F). At least I know my case is dumping the hot air well, from the back and through the top grill, where I have a 230mm fan above my radiator.


----------



## kizwan

Quote:


> Originally Posted by *ReXtN*
> 
> Im using the EK Full Cover block on my 290X with the stock EK thermal pads, and I am getting VRM temps of max 39C on VRM1 and 46C on VRM2. Coretemp is not over 41C
> 
> 
> 
> 
> 
> 
> 
> 
> The temps are with my GPU overclocked to 1205/1500MHz +100mV on core and +19mV on AUX
> 
> 
> 
> 
> 
> 
> 
> 
> This is ofc under Water, using a 360x80mm rad to cool the 290X and I7 4770K @4.6GHz.
> 
> As Im getting such low temps I don't see the point of getting myself the Fujipoly Extreme pads. I recommend checking out your VRM temps before ordering Fujipoly Extreme thermalpads for your GPU as the Fujipoly pads are quite expensive.. (19$ for 100x15mm pad, and you probably need 3-4 of those pads on your GPU)
> 
> Im not stating that the Fujipoly Thermalpads are bad. Not at all. I think the Fujipoly is a great product for the people who needs lover VRM temps, but check you temps before buying


What is your water temp or at least ambient (indoor) temp?


----------



## Imprezzion

Quote:


> Originally Posted by *ReXtN*
> 
> Im using the EK Full Cover block on my 290X with the stock EK thermal pads, and I am getting VRM temps of max 39C on VRM1 and 46C on VRM2. Coretemp is not over 41C
> 
> 
> 
> 
> 
> 
> 
> 
> The temps are with my GPU overclocked to 1205/1500MHz +100mV on core and +19mV on AUX
> 
> 
> 
> 
> 
> 
> 
> 
> This is ofc under Water, using a 360x80mm rad to cool the 290X and I7 4770K @4.6GHz.
> 
> As Im getting such low temps I don't see the point of getting myself the Fujipoly Extreme pads. I recommend checking out your VRM temps before ordering Fujipoly Extreme thermalpads for your GPU as the Fujipoly pads are quite expensive.. (19$ for 100x15mm pad, and you probably need 3-4 of those pads on your GPU)
> 
> Im not stating that the Fujipoly Thermalpads are bad. Not at all. I think the Fujipoly is a great product for the people who needs lover VRM temps, but check you temps before buying


3 or 4 strips of 100x15?? How so, lol.

One strip is enough for 3 mounts on the primary VRMs, lol...
The secondary VRMs don't get hot anyway, so the stock pad would do fine there.


----------



## ReXtN

Quote:


> Originally Posted by *kizwan*
> 
> What is your water temp or at least ambient (indoor) temp?


I have no idea what my water temp is; I don't have a temp sensor yet.
My ambient temp is close to 24-25C.


----------



## ReXtN

Quote:


> Originally Posted by *Imprezzion*
> 
> 3 or 4 strips of 100x15?? How so lol.
> 
> One strip is enough for 3 mounts of the primary VRM's lol...
> Secondary don't get hot anyway so stock pad would do fine on there..


Sorry for not being clear enough.
I meant for the entire GPU; that's for the memory ICs as well.







I don't know if you actually need the Fujipoly on those too, but if I was going to buy Fujipoly thermal pads, I'd go overkill and put them on the memory ICs as well.


----------



## Imprezzion

Quote:


> Originally Posted by *ReXtN*
> 
> Sorry for not being clear enough..
> I ment for The entire GPU. Thats for the Memory IC's as well
> 
> 
> 
> 
> 
> 
> 
> Idk if you actually need the Fujipoly on those to, but if was going to buy Fujipoly Thermalpads, im going to go overkill and put it on the Memory IC's as well


Nah man, VRAM doesn't get hot, especially since we can't directly overvolt it.

VRM1 only would be enough.


----------



## kizwan

Quote:


> Originally Posted by *ReXtN*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> What is your water temp or at least ambient (indoor) temp?
> 
> 
> 
> 
> 
> 
> 
> I have no idea what my watertemp is, i don't have a temp sensor yet..
> My ambient temp is close to 24-25C
Click to expand...

Your ambient is pretty low. What is the actual voltage with +100mV? Strange that your VRM2 is hotter than your VRM1. What software did you use to monitor the temps?


----------



## ReXtN

For the VRM temps I'm actually using GPU-Z. Normally I use Open Hardware Monitor, though.








Do you know of a better software to monitor VRM temps?

BTW, the VRM temps I mentioned were wrong; VRM1 is higher than VRM2... My bad.










The core volt is 1412mV, up from the stock 1250mV, BTW.









My ambient when I'm not OC'ing is around 29-30C, but I like to keep it a bit cooler when OC'ing ^^


----------



## kizwan

Quote:


> Originally Posted by *ReXtN*
> 
> For the VRM temps im using GPU-Z actually. Normally I use OpenHardwareMonitor though
> 
> 
> 
> 
> 
> 
> 
> 
> Do you know of a better software to monitor VRM temps?
> 
> btw, the VRM temps i mentioned was wrong.. the VRM1 is higher than the VRM2... My bad
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> The Corevolt is 1412mV from the stock 1250mV btw
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My ambient when im not OC'ing is around 29-30C , but i like to keep it a bit cooler when OC'ing ^^


For VRM, I use either GPU-Z or HWiNFO. Too bad Open Hardware Monitor doesn't support VRM temp reporting.


----------



## ReXtN

Quote:


> Originally Posted by *kizwan*
> 
> For VRM, I use either GPU-Z or HWiNFO. Too bad Open Hardware Monitor doesn't support VRM temp reporting.


okey








I have used HWiNFO too, but I liked GPU-Z the most. Open Hardware Monitor should have VRM temps for sure! That would make it possible for me to use only Open Hardware Monitor for temp monitoring.


----------



## ReXtN

Do you guys think the Fujipoly Extreme is worth it when the highest VRM temp I have seen is 46C?
How much lower could the VRM temp be?


----------



## chronicfx

Quote:


> Originally Posted by *Roboyto*
> 
> Your i5 could be showing its age with the beastly GPU. Not really certain why your CPU score would drop, that is odd. Your 750W PSU should be more than enough to power your system. Is the CPU holding clocks during the benches?


Found this article you may find interesting. It compares the i5 to i7 in multi gpu setups.

http://www.anandtech.com/show/7189/choosing-a-gaming-cpu-september-2013/10


----------



## rdr09

Quote:


> Originally Posted by *ReXtN*
> 
> Do you guys think that the Fujipoly Extreme is worth it when my highest VRM temp i have seen is 46C?
> How much lower could the VRM temp be?


Your temps are good; I'd suggest keeping what you have. It's not worth the risk of damaging something, and there are other factors in play here, like the efficiency and size of the rad, the power of the pump, the efficiency of the fans, etc. Probably next time you need to drain your loop.


----------



## ReXtN

Quote:


> Originally Posted by *rdr09*
> 
> your temps are good. i'd suggest keeping what you have. not worth the risk of damaging something. there are other factors in play here like the efficiency and size of the rad, power of the pump, efficiency of fans, etc. prolly next time you need to drain your loop.


That was what I was thinking as well.








I'm actually draining my loop today, though.









I got my GTX 780 Ti today. I'm not sure whether the 780 Ti should replace the 290X in the main rig or the 780 in my Prodigy build.
My plan for now is just to test it and see how its performance compares to the 290X.









In Norway we have a 45-day return policy (or whatever it translates to). That means that if you are not satisfied with the product, you can just send it back and get your money back (within the 45 days). So if the 780 Ti's performance advantage isn't noticeable, I can just return it and get my money back.








I don't know what the prices for the 290X and the 780 Ti are in the US, but in Norway the 290X costs from 4200 NOK = $700 and the 780 Ti from 5500 NOK = $900. So the 780 Ti is approximately $200 more expensive than the 290X, and I don't know if it's worth the extra $200. That is why I'm going to test it for a month or so.
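For what it's worth, the price gap above can be checked in a couple of lines of Python. The exchange rate used is the one implied by the post's own "4200 NOK = $700" figure (about 6 NOK per USD at the time), not a live rate:

```python
# Prices in NOK from the post; the exchange rate is derived from the
# post's own "4200 NOK = $700" figure, not from a live feed.
nok_per_usd = 4200 / 700          # = 6.0 NOK per USD
price_290x_nok = 4200
price_780ti_nok = 5500

# The 780 Ti's premium over the 290X, converted to USD.
premium_usd = (price_780ti_nok - price_290x_nok) / nok_per_usd
print(f"780 Ti premium: ~${premium_usd:.0f}")   # roughly $217
```

So "approximately $200 more expensive" is consistent with the quoted NOK prices.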


----------



## rdr09

Quote:


> Originally Posted by *ReXtN*
> 
> That was what i was thinking as well
> 
> 
> 
> 
> 
> 
> 
> 
> Im draining my loop today actually though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I got my GTX 780ti today, not sure if the 780ti should replace the 290x in the main rig, or the 780 which is in my Prodigy build..
> My plans for now is just to test it and see how the performance is compared to the 290x really
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In Norway we have a 45-days return policy(or whatever it translates to..). That means that if you are not satisfied with the product you can just send the product back, and get your money back.(within the 45 days) So if the performance of the 780ti isn't noticable, i can just return it and get my money back
> 
> 
> 
> 
> 
> 
> 
> 
> Idk what the prices are for the 290X and the 780ti are in US, but in Norway the 290X from costs 4200nok = 700$ and the 780ti costs from 5500nok = 900$. So the 780ti is approximately 200$ more expensive than the 290X. Idk if the 780ti is worth the extra 200$.. That is why Im going to test it for a month or so


these are what i used . . .

http://www.frozencpu.com/products/16878/thr-164/Fujipoly_Extreme_System_Builder_Thermal_Pad_-_14_Sheet_-_150_x_100_x_05_-_Thermal_Conductivity_110_WmK.html

http://www.frozencpu.com/products/16880/thr-165/Fujipoly_Extreme_System_Builder_Thermal_Pad_-_14_Sheet_-_150_x_100_x_10_-_Thermal_Conductivity_110_WmK.html

Good for at least 3 applications, including the VRAM. Not sure of the shelf life. If you find it locally, then go for it.


----------



## ReXtN

Quote:


> Originally Posted by *rdr09*
> 
> these are what i used . . .
> 
> http://www.frozencpu.com/products/16878/thr-164/Fujipoly_Extreme_System_Builder_Thermal_Pad_-_14_Sheet_-_150_x_100_x_05_-_Thermal_Conductivity_110_WmK.html
> 
> http://www.frozencpu.com/products/16880/thr-165/Fujipoly_Extreme_System_Builder_Thermal_Pad_-_14_Sheet_-_150_x_100_x_10_-_Thermal_Conductivity_110_WmK.html
> 
> good for at least 3 applications including the vrams. not sure of the shelf life. if you find it locally, then go.


Thanks for the links, M8








Maybe I should order some to have lying around.








If I end up keeping the 780 Ti, it is going to be watercooled, and then I can just use the Fujipoly right away when I'm mounting the block.
If they can be stored without exposure to air, the thermal pads should hold up quite nicely, I would reckon.


----------



## rdr09

Quote:


> Originally Posted by *ReXtN*
> 
> Thanks for the links, M8
> 
> 
> 
> 
> 
> 
> 
> 
> Maybe i should order some to have laying around
> 
> 
> 
> 
> 
> 
> 
> 
> If i end up keeping the 780ti, it is going to be watercooled, and then i can just use the Fujipoly right away when im mounting the block.
> If it can be stored without access to air, the thermalpads should hold up quite nice i would reckon


Verify what thickness is needed for the Ti; the ones I linked are for the 290s.


----------



## ReXtN

Quote:


> Originally Posted by *rdr09*
> 
> verify what thickness is needed for the Ti's. those i linked are for the 290s.


Yeah, i know








I have a watercooled GTX 780, and from what I can see, the VRM and memory ICs are at the exact same height and position on the 780 as on the 780 Ti.
A quick Google search will also most likely tell me what thickness of thermal pads I need for the 780 Ti.









The 780 with the Aquacomputer Kryographics waterblock used 1mm thickness only


----------



## rdr09

Quote:


> Originally Posted by *ReXtN*
> 
> Yeah, i know
> 
> 
> 
> 
> 
> 
> 
> 
> I have a watercooled GTX 780, and from what i can see the VRM and Memory IC's are the exact height and position on the 780 as on the 780ti.
> A quick google search will also most likely tell me what thickness of thermalpads i need for the 780ti
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The 780 with the Aquacomputer Kryographics waterblock used 1mm thickness only


You got a good-clocking 290X. Share your findings please, thanks.


----------



## kizwan

Quote:


> Originally Posted by *ReXtN*
> 
> Do you guys think that the Fujipoly Extreme is worth it when my highest VRM temp i have seen is 46C?
> How much lower could the VRM temp be?


Your VRM1 is low enough. You don't need to change anything; I wouldn't need to either if I had such a low ambient temp.

VRM1 with the stock EK thermal pad is very sensitive to ambient temp. In the Valley benchmark runs at almost identical ambient temps, VRM1 ran a lot cooler with the Fujipoly Extreme. I already have more than enough radiator space (600mm) and a proper pump, so I doubt I'd get a similar result by (only) adding more radiator space.

Just for the sake of argument, I actually do think that with Fujipoly Extreme your VRM1 would run at a much lower temp. As you can see, your VRM1 is 46C while your core is 41C; VRM1 tends to run a few degrees higher than the core with the stock EK thermal pad. That's pretty much the same pattern I got when using the stock EK thermal pad.


----------



## lostsurfer

Would you guys pick up a reference 290 or 290X if you could get a used one for $350-$400? I know the temps are hot, but can't you put an aftermarket (non-liquid) cooler on it? Or would you just spend the extra money on a new one?


----------



## The Mac

You can grab a Gelid Icy Vision Rev 2 for 35 bucks; it works well.

If you want to spend a little more money, the Accelero is a bit better.


----------



## friend'scatdied

Quote:


> Originally Posted by *The Mac*
> 
> you can grab a gelid icy vision rev2 for 35 bucks, it works well.


I really like this cooler but haven't been able to find it in stock for that amount. I've been able to find some for $50-60 but they seem to be the Nvidia-compatible version. The AMD-only version has an extruded diamond shape on the heatsink specifically for contact with the GPU die.

The Accelero 3 looks good, but it's so much larger; it makes the 290/X over 12.2" long and quite wide, so fitment will be a problem in many cases.

I'm hoping I can score an AMD-version Icy Vision Rev 2 soon. With prices as low as they are it's never been a better time to buy red (as long as you get a card with a transferable warranty or have the original invoice).

(Please disregard the strikethroughs on the Gelid.)


----------



## Roboyto

Quote:


> Originally Posted by *chronicfx*
> 
> Found this article you may find interesting. It compares the i5 to i7 in multi gpu setups.
> 
> http://www.anandtech.com/show/7189/choosing-a-gaming-cpu-september-2013/10


I have posted that same link in here before and it was brought up on how many threads those games are utilizing. A 7970 is not a 290X, especially if you are overclocking them. I suggested a CPU upgrade because it is difficult to find another person running an i5 with (3) cards of this caliber. Most who want to ensure they won't have performance issues with several enthusiast cards are likely running a 2011 socket CPU.

Try sending http://www.overclock.net/u/368261/taem a PM and ask him about his experiences. A few weeks ago he was asking the same questions; he has 2 cards and a 4670k.

The only other suggestions I could give you are covering the basics...re-seat your cards, check all cable/power connections, driver/BIOS update, BIOS settings for PCI-E, and disabling ULPS. Don't recall if you said what drivers you're running, but if you're on anything 14.X, I would scrub with DDU and see if 13.12 gives you better results.


----------



## Niberius

Quote:


> Originally Posted by *friend'scatdied*
> 
> I really like this cooler but haven't been able to find it in stock for that amount. I've been able to find some for $50-60 but they seem to be the Nvidia-compatible version. The AMD-only version has an extruded diamond shape on the heatsink specifically for contact with the GPU die.
> 
> The Accelero 3 looks good but it's so much larger. It makes the 290/X over 12.2" and quite wide so fitment will be a problem for many cases.
> 
> I'm hoping I can score an AMD-version Icy Vision Rev 2 soon. With prices as low as they are it's never been a better time to buy red (as long as you get a card with a transferable warranty or have the original invoice).


Unless room is an issue, go with the Accelero 3 or 4. I have played around with both the Gelid and the Accelero, and while the Gelid is decent for the money, it doesn't hold a candle to the Accelero once you start upping voltage and clocks a good bit. I'm running a 290X at 1.328Vc with a 1200MHz core clock, loading at 64C with a VRM1 temp of 79C. Having good case airflow obviously helps too, which is why I have a full-tower Thor case.


----------



## chronicfx

Quote:


> Originally Posted by *Roboyto*
> 
> I have posted that same link in here before and it was brought up on how many threads those games are utilizing. A 7970 is not a 290X, especially if you are overclocking them. I suggested a CPU upgrade because it is difficult to find another person running an i5 with (3) cards of this caliber. Most who want to ensure they won't have performance issues with several enthusiast cards are likely running a 2011 socket CPU.
> 
> Try sending http://www.overclock.net/u/368261/taem a PM and ask him about his experiences. A few weeks ago he was asking the same questions; he has 2 cards and a 4670k.
> 
> The only other suggestions I could give you are covering the basics...re-seat your cards, check all cable/power connections, driver/BIOS update, BIOS settings for PCI-E, and disabling ULPS. Don't recall if you said what drivers you're running, but if you're on anything 14.X, I would scrub with DDU and see if 13.12 gives you better results.


Thanks! You must be confusing me with someone else. I am not having performance issues. I simply wanted to know if buying a Z77 Maximus V Extreme with a PLX chip would alleviate bandwidth issues for my 3 cards. I would have to sell a card to go LGA 2011. I would gladly put my three-card i5 setup against an i7 dual-card setup; I would smoke them every time. My question had nothing to do with a CPU, ever. I am running 13.12 and have never even tried the 14.x drivers; I hear they are a nightmare.


----------



## Niberius

Quote:


> Originally Posted by *chronicfx*
> 
> Thanks! You must be confusing me with someone. I am not having performance issues. I simply wanted to know if buying a z77 maximus v extreme with a plx chip would alleviate bandwidth issues for my 3 cards. I would have to sell a card to go lga2011.. I would gladly put my three card i5 setup against an i7 dual card setup. I would smoke them everytime. My question had nothing to do with a cpu, ever. I am running 13.12 and have never even tried the 14.x drivers, i hear they are a nightmare.


The newer drivers have screwed up a few games running DirectX; for Mantle they are great if you are on 14.3.


----------



## b0x3d

Can I be added to the club please? Loving these cards on my RIVBE


----------



## Niberius

Quote:


> Originally Posted by *b0x3d*
> 
> Can I be added to the club please? Loving these cards on my RIVBE


Nice, congrats!


----------



## Roy360

Anyone else own the ASUS R9 290 DU OC card?

The PowerTune and voltage are locked on my card. I tried my machine as well as NCIX's machines, and two different versions of GPU Tweak.

P.S. Does anyone know how to disable GPUs in a multi-card config? My spare rig is gone, so I will have to test this card in my main rig, which already has 2 water-cooled cards in it.


----------



## friend'scatdied

Quote:


> Originally Posted by *Niberius*
> 
> Unless room is an issue go with the Accelero 3 or 4. I have played around with both the gelid and accelero and while the gelid is decent for the money it doesnt hold a candle to the accelero once you start upping voltage and clocks a good bit. Im running a 290X at 1.328vc with 1200mhz core clock and loading at 64c, vrm 1 temp is 79c. Having good case airflow also obviously helps which is why I have a full tower Thor Case.


Definitely a huge issue for me as I'm in a mini-ITX case. The Acceleros jut out way past the PCB and make the card way too tall and long.

Does the Gelid increase the height of the card (i.e. jut out above the power connectors)? I'm also limited there since my case has the motherboard mounted flat.


----------



## kizwan

Quote:


> Originally Posted by *chronicfx*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roboyto*
> 
> I have posted that same link in here before and it was brought up on how many threads those games are utilizing. A 7970 is not a 290X, especially if you are overclocking them. I suggested a CPU upgrade because it is difficult to find another person running an i5 with (3) cards of this caliber. Most who want to ensure they won't have performance issues with several enthusiast cards are likely running a 2011 socket CPU.
> 
> Try sending http://www.overclock.net/u/368261/taem a PM and ask him about his experiences. A few weeks ago he was asking the same questions; he has 2 cards and a 4670k.
> 
> The only other suggestions I could give you are covering the basics...re-seat your cards, check all cable/power connections, driver/BIOS update, BIOS settings for PCI-E, and disabling ULPS. Don't recall if you said what drivers you're running, but if you're on anything 14.X, I would scrub with DDU and see if 13.12 gives you better results.
> 
> 
> 
> 
> 
> 
> 
> Thanks! You must be confusing me with someone. I am not having performance issues. I simply wanted to know *if buying a z77 maximus v extreme with a plx chip would alleviate bandwidth issues* for my 3 cards. I would have to sell a card to go lga2011.. I would gladly put my three card i5 setup against an i7 dual card setup. I would smoke them everytime. My question had nothing to do with a cpu, ever. I am running 13.12 and have never even tried the 14.x drivers, i hear they are a nightmare.
Click to expand...

Yeah, I read a couple of people complaining about slowdowns *with certain games* when running 290's at PCIe 2.0 x8, whose bandwidth is equivalent to PCIe 3.0 x4. If this is true, going with the Maximus V Extreme is going to solve your bandwidth issue. I believe when running 3 x 290's on your current motherboard it will be PCIe 3.0 x8-x4-x4; with the Maximus V Extreme, your cards will run at PCIe 3.0 x8-x16-x8.

The PLX chip introduces latency, though. Do look this up before buying the motherboard.
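For reference, the "2.0 x8 is roughly 3.0 x4" equivalence can be checked from the per-lane signalling rates and encoding overheads (PCIe 2.0: 5 GT/s with 8b/10b encoding; PCIe 3.0: 8 GT/s with 128b/130b). A quick sketch of the arithmetic:

```python
def pcie_bandwidth_gbs(gen: int, lanes: int) -> float:
    """Usable bandwidth in GB/s for a PCIe link.

    Per-lane raw rate (GT/s) and encoding efficiency by generation:
    gen 2 uses 8b/10b encoding, gen 3 uses 128b/130b.
    """
    specs = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}
    rate_gt, efficiency = specs[gen]
    return rate_gt * efficiency / 8 * lanes  # bits -> bytes

print(pcie_bandwidth_gbs(2, 8))            # 4.0 GB/s for PCIe 2.0 x8
print(round(pcie_bandwidth_gbs(3, 4), 2))  # 3.94 GB/s for PCIe 3.0 x4
```

So 2.0 x8 and 3.0 x4 land within a couple of percent of each other, which is why the two are often treated as interchangeable.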


----------



## Niberius

Quote:


> Originally Posted by *friend'scatdied*
> 
> Definitely a huge issue for me as I'm in a mini-ITX case. The Acceleros jut out way past the PCB and make the card way too tall and long.
> 
> Does the Gelid increase the height of the card (i.e. jut out above the power connectors)? I'm also limited there since my case has the motherboard mounted flat.


You should probably consider a case with more room before messing with after market air coolers as it sounds pretty cramped for you as it is.


----------



## The Mac

Quote:


> Originally Posted by *friend'scatdied*
> 
> Definitely a huge issue for me as I'm in a mini-ITX case. The Acceleros jut out way past the PCB and make the card way too tall and long.
> 
> Does the Gelid increase the height of the card (i.e. jut out above the power connectors)? I'm also limited there since my case has the motherboard mounted flat.


I will have a look when I get home, but I don't think so.

Also, there are 2 versions; you want the rev 2, not the one you mentioned above. The star-shaped one is for 79xx only and will not fit on a 290 due to recessing.

Here is a link with an install guide and lots of people who are using it.

http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x/40


----------



## Roboyto

Quote:


> Originally Posted by *friend'scatdied*
> 
> Definitely a huge issue for me as I'm in a mini-ITX case. The Acceleros jut out way past the PCB and make the card way too tall and long.
> 
> Does the Gelid increase the height of the card (i.e. jut out above the power connectors)? I'm also limited there since my case has the motherboard mounted flat.


What Mini-ITX case? Have you considered the Kraken G10 with an AIO? You would have to put sinks on the VRMs and memory like the Gelid/Accelero.

Check this thread for Gelid information: http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x


----------



## friend'scatdied

Quote:


> Originally Posted by *The Mac*
> 
> I will havea look when I get home, but I don't think so.
> 
> Also,there are 2 versions. you want the rev2, not the one you mentioned above. The star shaped one is for 79xx only and will not fit on a 290 due to resessing.
> 
> Here is a link with an install guide and lots of people who are using it.
> 
> http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x/40


Thanks, that is useful information. The Rev2-02 is advertised as AMD-only, so I was confused. I suppose the flat-based Rev2-01 is the one to get (compatible with Nvidia cards too). I edited my incorrect post so no one else would get confused by it.
Quote:


> Originally Posted by *Roboyto*
> 
> What Mini-ITX case? Have you considered the Kraken G10 with an AIO? You would have to put sinks on the VRMs and memory like the Gelid/Accelero.


Thanks. SG08; I already have a H80i with insufficient clearance/room for another AIO.


----------



## Roboyto

Quote:


> Originally Posted by *friend'scatdied*
> 
> Thanks, that is useful information. The Rev2-02 is advertised as AMD-only so I was confused. I suppose the flat-based Rev2-01 is the one to get. (compatible with Nvidia cards too). Edited my incorrect post so no one else would get confused by it.
> Thanks. SG08; I already have a H80i with insufficient clearance/room for another AIO.


Just ordered the Corsair 250D to upgrade the HTPC for my wife. This is the start of weaning her off of the PS3









The 250D will take a 240mm rad up top. My plan is to find out if it will take dual 120mm AIOs or not, so I can use both my Antec 620s.

I'm unfamiliar with the SG08. Is it possible to fit a decent air cooler for the CPU and use the AIO for GPU instead?


----------



## ebduncan

Quote:


> Originally Posted by *Roboyto*
> 
> Just ordered the Corsair 250D to upgrade HTPC for my wife. This is the start of weening her off of the PS3
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The 250D will take a 240mm rad up top. My plan is to find out if it will take dual 120mm AIOs or not, so I can use both my Antec 620s.
> 
> I'm unfamiliar with the SG08. Is it possible to fit a decent air cooler for the CPU and use the AIO for GPU instead?


You can, but the radiator would have to be mounted in the front or bottom of the case, since any decent CPU cooler is more than likely going to prevent you from using the rear exhaust spot or top fan spots. You can fit a fan, but not a fan + radiator, in most cases.


----------



## Roboyto

Quote:


> Originally Posted by *ebduncan*
> 
> you can, but the radiator would have to be mounted in the front or bottom of the case. Since any decent cpu cooler is more than likely going to prevent you from using the rear exhaust spot, or top fan spots. You can fit a fan, but not fan+ radiator in most cases.


Good to know I can fit 2 of them in there...what about a 240mm AIO and a 120mm AIO?


----------



## directorJay

http://www.techpowerup.com/gpuz/34p7m/

SAPPHIRE R9 290X 4GB GDDR5 BATTLEFIELD 4 EDITION


Spoiler: Warning: Spoiler!









COOLING: NZXT Kraken G10 with Kraken x60 Radiator in push/pull



Hi guys, this is mine. Do I have an OK card? The max I can go without artifacts is a 1200MHz core clock and 1500MHz memory clock; the only power tweak is core voltage +100mV. Also, I don't know why the power limit slider is at 0 every time I open Afterburner. Is that normal, or does it always reset when I close it and open it again?

Edit: I forgot to tell the max temperatures when I ran the valley benchmark








GPU: 57C VRM1: 82C VRM2: 66C


----------



## friend'scatdied

Quote:


> Originally Posted by *Roboyto*
> 
> I'm unfamiliar with the SG08. Is it possible to fit a decent air cooler for the CPU and use the AIO for GPU instead?


Had a Noctua NH-C12P in there before and it definitely wouldn't work.

What might actually be possible is two AIOs with slim 120mm rads (e.g. H60) and a 38mm thick 120mm fan (e.g. Panaflo), but that'd be a huge compromise in noise and temps.


----------



## Iniura

Guys, I am seeing some strange behaviour on my R9 290. When I was on air, I installed the 13.12 drivers, used MSI AB Beta 18, and could run an OC of 1180/1625 in Battlefield 4 without problems.
But because the AB beta expired I had to switch to Trixx, and now when I game under water at 1180/1450 my whole screen gets distorted, like some kind of blur; nothing is sharp anymore. What could this be?

Now with Trixx I can't use the AUX voltage like in AB, and that is what made my memory OC stable at the time. Could it be that my memory OC is too high now without the AUX voltage?

Is there a new version of AB out which works without problems and won't expire?


----------



## phallacy

Quote:


> Originally Posted by *Iniura*
> 
> Guys I am seeing some strange behaviour on my R9 290, when I was on air I installed 13.12 drivers, used MSI AB Beta 18 and could run an OC of 1180/1625 on Battlefield 4 without problems.
> But because AB beta expired I had to switch to Trixx, and now when I game under water on 1180/1450 my whole screen gets distorted like some kind of blurr nothing is sharp anymore. What could this be?
> 
> Now with Trixx I can't use the AUX voltage like in AB and that made my memory OC stable at the time, could it be that my memory OC is to high now without the AUX voltage?
> 
> Is there a new version out of AB which works without problems and won't expire?


Sounds like the AUX voltage is what was keeping the memory stable. I know what you mean by the fuzzy screen; it usually happened to me with unstable overclocks. There is a new AB out: Beta 19.

You can get it here: http://www.guru3d.com/files_details/msi_afterburner_beta_download.html


----------



## killerfurball

So figured I would join this.

Here is my GPU-Z link

Manufacturer: HIS

Series: R9 290

Cooling: Stock


----------



## Jflisk

Add me to the club. Thanks
XFX R9 290x
Water cooled
GPUZ
http://www.techpowerup.com/gpuz/gygag/


----------



## ReXtN

Quote:


> Originally Posted by *Roy360*
> 
> anyone else own the ASUS R9 290 DU OC card?
> 
> The powertune and voltage are locked on my card. Tried my machine aswell as NCIX's machines and two different verisons of GPU Tweak
> 
> p.s does anyone know how to disable GPUs in a multi card config? My spare rig is gone, so I will have to test this card in my main rig, which already has 2 water cooled cards in it.


Check if you have some kind of DIP switch on your motherboard that controls the PCI-e lanes.
My Gigabyte Z87 OC has some DIP switches down in the right corner (see the picture below) which control the PCI-e lanes.








The easiest way is probably just to check your MB manual to see if you have a DIP switch for the PCI-e lanes.

If you have no such thing, go into your MB BIOS and check if you have somewhere to configure the PCI-e lanes. I know I can set which PCI-e lanes are enabled/disabled in my BIOS.


----------



## Arizonian

Quote:


> Originally Posted by *b0x3d*
> 
> Can I be added to the club please? Loving these cards on my RIVBE
> 
> 
> Spoiler: Warning: Spoiler!


You sure may, Congrats - added









Quote:


> Originally Posted by *directorJay*
> 
> http://www.techpowerup.com/gpuz/34p7m/
> 
> SAPPHIRE R9 290X 4GB GDDR5 BATTLEFIELD 4 EDITION
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> COOLING: NZXT Kraken G10 with Kraken x60 Radiator in push/pull
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Hi guys this is mine, do I have an ok card? Max I can go without artifacts is 1200Mhz core clock and 1500Mhz memory clock, only power tweak is core voltage +100mV and idk why everytime I open afterburner the power limit slider is at 0, is that normal guys or does it always reset when I close it and open it again?
> 
> Edit: I forgot to tell the max temperatures when I ran the valley benchmark
> 
> 
> 
> 
> 
> 
> 
> 
> GPU: 57C VRM1: 82C VRM2: 66C


Congrats - added









Quote:


> Originally Posted by *killerfurball*
> 
> So figured I would join this.
> 
> Here is my GPU-Z link
> 
> Manufacturer: HIS
> 
> Series: R9 290
> 
> Cooling: Stock


Glad you did, Congrats - added
















Quote:


> Originally Posted by *Jflisk*
> 
> Add me to the club. Thanks
> XFX R9 290x
> Water cooled
> GPUZ
> http://www.techpowerup.com/gpuz/gygag/


Congrats - added to roster


----------



## Nevk

Here is my GPU-Z link

ASUS R9 290 DC2OC.

please add me. Thanks









http://www.techpowerup.com/gpuz/688gg/


----------



## Jflisk

Does anyone have problems with some temp programs reading their R9 290X temps, VRM or GPU?


----------



## chronicfx

You usually have to make sure they are under load to show the VRM temps in GPU-Z.


----------



## Jack Mac

Can you switch me from stock Sapphire to MSI Gaming on the roster please Arizonian? I found someone who wants to trade my 780 for his MSI Gaming Edition 290.


----------



## Durvelle27

Quote:


> Originally Posted by *Arizonian*
> 
> You sure may, Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Glad you did, Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added to roster


I still wasn't added to the list

http://www.techpowerup.com/gpuz/f8r/


----------



## centvalny

http://imgur.com/sOSVvdY





http://imgur.com/ejW1zWq


----------



## Roboyto

Quote:


> Originally Posted by *Roy360*  p.s does anyone know how to disable GPUs in a multi card config? My spare rig is gone, so I will have to test this card in my main rig, which already has 2 water cooled cards in it.


I believe I have just pulled the PCIe power cables, and then the GPU(s) shouldn't be recognized. Your BIOS likely has some sort of setting to enable individual slots as well.


----------



## Arizonian

Quote:


> Originally Posted by *Durvelle27*
> 
> I still wan't added to the list
> 
> http://www.techpowerup.com/gpuz/f8r/


Sorry for missing you.









If I didn't say it before, congrats and you're added









Quote:


> Originally Posted by *centvalny*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://imgur.com/sOSVvdY
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://imgur.com/ejW1zWq


Welcome aboard with your Matrix! Congrats - added









If I may ask, please just add a GPU-Z validation link to your original post for proof, like Durvelle27 did on his.

Let us know how that puppy overclocks.


----------



## kzinti1

For the club: http://www.techpowerup.com/gpuz/52ueh/
MSI R9 290X Lightning x 2
Stock cooling
(Why doesn't GPU-Z show X-Fire?
This does: http://www.3dmark.com/fs/1961004)


----------



## Jflisk

This is a check-in: what are your VRM temps under water? Thanks.

Awesome, my other R9 290X is coming back from RMA.

Also taking donations for a water block.


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *Roy360*
> 
> anyone else own the ASUS R9 290 DU OC card?
> 
> The powertune and voltage are locked on my card. Tried my machine aswell as NCIX's machines and two different verisons of GPU Tweak
> 
> p.s does anyone know how to disable GPUs in a multi card config? My spare rig is gone, so I will have to test this card in my main rig, which already has 2 water cooled cards in it.


You should be able to bypass the lock with MSI's software.

It worked fine on my Asus.


----------



## Arizonian

Quote:


> Originally Posted by *kzinti1*
> 
> For the club: http://www.techpowerup.com/gpuz/52ueh/
> MSI R9 290X Lightning x 2
> Stock cooling
> (Why doesn't GPU-Z show X-Fire?
> This does: http://www.3dmark.com/fs/1961004)


Wow, Matrix and now Lightning today. Congrats - added









For those members who might want to follow the Lightning a little more closely - I've added the *MSI R9 290X Lightning Thread* link on the OP of OCN / AMD Related info, and you're welcome to subscribe there too.

Also updated the third post on OP *Reviews section* with a Matrix and Lightning review I found.


----------



## gogolXmogol

Guys, does anyone know what could cause the following issue:
My new Sapphire Tri-X 290X just stopped dropping clocks in idle mode (Windows only) as well as in 3D; the clocks remain 1040/1300MHz no matter what (per GPU-Z, the GPU usage is 0%). The temps are high because of that, and power consumption is at full while in Windows. A Catalyst driver reinstall does not help.
The card is not overclocked, and the Sapphire Trixx software has been removed.
Please advise!


----------



## nightfox

Quote:


> Originally Posted by *gogolXmogol*
> 
> Guys, does anyone know what could cause the following issue:
> My new sapphire tri-x 290x just stopped dropping clocks in idle mode (win only) as well as in 3d, the clocks remain 1040/1300 Mhz no matter what (as per GPU-Z the CPU usage is 0%). The temps is high because of that as well as power consumption is at full mode while on windows. Catalyst driver reinstall does not help.
> Card is not overclocked, sapphire trixx software removed..
> Please advice!


Have you disabled ULPS? If so, enable it; that's the only thing I can think of. Even if you removed Trixx, check the registry to see if ULPS is enabled.

This is a script to enable and disable ULPS in the registry.

http://www.ulpsconfigurationutility.com/

http://www.overclock.net/t/1088266/ulps-gui-config-utility-enable-disable


----------



## gogolXmogol

*nightfox* Thank you, I will try that when I am home. That does not seem to be an official AMD PowerTune method, but I will give it a shot. I hope I won't have to spend my weekend reinstalling Win8 and all the software because of this issue.


----------



## nightfox

Quote:


> Originally Posted by *gogolXmogol*
> 
> *nightfox* Thank you I will tray that when I will be at home, that does not seems to be an official AMD powertune method but will give it a shot. I hope I won't have to spend my weekend reinstalling win8 and all the software because of this issue.


You're welcome. It is not official, but Trixx just modifies the Windows registry when you disable ULPS with it. If you uninstalled Trixx after you disabled ULPS, the ULPS setting will not be reverted.

That program or script has the same function. Or you can do it manually, if you are comfortable playing around in your registry.

Just use the Find function in regedit and search for EnableUlps, then change the DWORD value to 1. Take note that there is more than one instance, so after you change each one, click Find Next.

And then restart Windows for it to take effect.

for more info you can read here.

http://www.tomshardware.co.uk/faq/id-1904869/disable-ulps-amd-crossfire-setups.html
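For anyone who prefers to skip the third-party utilities, the same change can be captured in a `.reg` file. A minimal sketch - the `0000` instance under the display-adapter class key is an example only (the instance numbers vary per system and driver install; regedit's Find on EnableUlps will show you the real ones on your machine):

```reg
Windows Registry Editor Version 5.00

; Example only: the class-key instance (0000, 0001, ...) differs per system.
; Set to 1 to enable ULPS, 0 to disable it. Reboot afterwards.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000001
```

Double-clicking the file merges it into the registry; a restart is still needed for it to take effect, as noted above.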


----------



## gogolXmogol

I don't remember disabling anything in Trixx. What is this option called? I will install Trixx again and check my settings with it.


----------



## nightfox

OK, it is called "Disable ULPS".


----------



## swiftypoison

Hey guys!
I just got a refurb Asus R9 290 DirectCU II. What are the regular temperatures for this card? I am getting around 85-89 in game and around 92 when I am using [email protected]. Does that sound right?? Also, does anyone know if the fans on this card are tilted outwards on one side? Mine is like that, so I am not sure if that is normal.


----------



## cephelix

Hmm, I can't say much about the fans, but it would be useful if we could know what kind of setup you have...


----------



## gogolXmogol

Thanks for the advice, but everything seems to be in place...
I have no idea what's going on with my card.


----------



## swiftypoison

Quote:


> Originally Posted by *cephelix*
> 
> hmm, can't say much about the fans but it'll be useful if we could know what kind of setup you have...


4770K CPU in a R4 case that is well ventilated.

Most of the reviews I have seen have temps around 70-79.

Does anyone have any experience with this card?


----------



## rdr09

Quote:


> Originally Posted by *gogolXmogol*
> 
> Guys, does anyone know what could cause the following issue:
> My new sapphire tri-x 290x just stopped dropping clocks in idle mode (win only) as well as in 3d, the clocks remain 1040/1300 Mhz no matter what (as per GPU-Z the CPU usage is 0%). The temps is high because of that as well as power consumption is at full mode while on windows. Catalyst driver reinstall does not help.
> Card is not overclocked, sapphire trixx software removed..
> Please advice!


how many monitors are you using with that card?


----------



## gogolXmogol

just one monitor. i also tried different dvi ports with same result. i have another card in a 3rd pci-e slot, but it is disabled by an Asus PCI-e lane switch (rampage iv board)


----------



## Tobiman

Quote:


> Originally Posted by *gogolXmogol*
> 
> Thanks for advices, but everything seems to be in place though...
> I have no idea whats going on with my card.


That looks like a 14.XX driver bug. Restart the system to fix it or install 13.12 WHQL for all round stability.


----------



## Tobiman

Quote:


> Originally Posted by *swiftypoison*
> 
> Hey guys!
> I just got a refurb Asus R9 290 DirectII. What are the regular temperatures for this card? I am getting around 85-89 in game and around 92 when i am using [email protected] that sound right?? Also, does anyone know if the fans on this card are tilted outwards on one side? Mine is like that so I am not sure if that is normal.


Check the fan speed. It's probably running around 20-40%-ish. You'll want to set a better fan curve; I'd recommend using MSI Afterburner or Sapphire Trixx. A fan curve with a slope of 1 should be good enough.
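A slope of 1 just means roughly one percent of fan speed per degree Celsius. As an illustration of the idea (the floor and ceiling values below are made-up placeholders, not Afterburner defaults):

```python
def fan_percent(temp_c: float, floor: int = 30, ceiling: int = 100) -> float:
    """Slope-1 fan curve: ~1% fan speed per degree C, clamped to a
    minimum (so the fans never stop) and a maximum of 100%."""
    return max(floor, min(ceiling, temp_c))

print(fan_percent(85))  # 85 -> well above the ~40% a lazy stock curve sits at
print(fan_percent(25))  # clamped to the 30% floor
```

At the 85-92°C the card above is hitting, a curve like this would already be pushing the fans much harder than the stock profile.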


----------



## hotrod717

Quote:


> Originally Posted by *Arizonian*
> 
> Wow, Matrix and now Lightning today. Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For those members who might want to follow the Lightning a littler more closely - I've added the *MSI R9 290X Lightning Thread* link on the OP of OCN / AMD Related info and your welcome to subscribe there too.
> 
> Also updated the third post on OP *Reviews section* with a Matrix and Lightning review I found.


Thanks for the link!







If you own a Lightning, please stop by and post your OC results and info. The Lightning dropped in price to $699 @ Newegg for anyone interested - less than the Tri-X now.


----------



## NixZiZ

http://www.techpowerup.com/gpuz/bdpcm/

ASUS reference card

Stock cooler

Got it before the mining rush


----------



## gogolXmogol

Quote:


> Originally Posted by *Tobiman*
> 
> That looks like a 14.XX driver bug. Restart the system to fix it or install 13.12 WHQL for all round stability.


I've tried 13.12 as well. Same result. And on my second 290x tri-x as well. Going to reinstall windows as soon as my backups are created.


----------



## Tobiman

Quote:


> Originally Posted by *gogolXmogol*
> 
> I've tried 13.12 as well. Same result. And on my second 290x tri-x as well. Going to reinstall windows as soon as my backups are created.


I don't think you have to, TBH. Why not use DDU to uninstall the drivers instead of deleting them manually?


----------



## Arizonian

Quote:


> Originally Posted by *NixZiZ*
> 
> http://www.techpowerup.com/gpuz/bdpcm/
> 
> ASUS reference card
> 
> Stock cooler
> 
> Got it before the mining rush


Congrats - added


----------



## neurotix

Arizonian, can you update me to two cards please?


----------



## igrease

Quote:


> Originally Posted by *Tobiman*
> 
> I don't think you have to tbh. Why not use DDU to uninstall drivers instead of manual deleting?


I had some issues with DDU when uninstalling Nvidia's drivers. No matter how many times I ran it, it never deleted everything in the folder. I kept having issues with BF4 until I manually deleted the folder, ran DDU and reinstalled the drivers.


----------



## Sgt Bilko

Quote:


> Originally Posted by *neurotix*
> 
> Arizonian, can you update me to two cards please?
> 
> 
> Spoiler: Warning: Spoiler!


You nailed the name Big Red









looks pretty good there, nice job


----------



## Niberius

Quote:


> Originally Posted by *igrease*
> 
> I had some issues with DDU when uninstalling Nvidia's drivers. No matter how many times I ran it, it never deleted everything in the folder. I kept having issues with BF4 until I manually deleted the folder, ran DDU and reinstalled the drivers.


Working perfectly for me with ATI, might have just been a fluke with nvidia.


----------



## combine1237

Hello, anyone have experience with amd vce? Especially pertaining to afterburner.


----------



## neurotix

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You nailed the name Big Red
> 
> 
> 
> 
> 
> 
> 
> 
> 
> looks pretty good there, nice job


Thanks Sgt Bilko.

Big Red has always been Big Red, ever since I built this machine in 2010. Check out my build log sometime if you haven't already. If you have a build log, link me to it and I'll take a look.


----------



## Sgt Bilko

Quote:


> Originally Posted by *neurotix*
> 
> Thanks Sgt Bilko.
> 
> Big Red has always been Big Red, ever since I built this machine in 2010. Check out my build log sometime if you haven't already. If you have a build log, link me to it and I'll take a look.


My rig is way too much of a mess to justify a build log; maybe in the future.

I was just looking over some of Syr's builds from when I first joined OCN - the man truly was a master


----------



## kizwan

Quote:


> Originally Posted by *neurotix*
> 
> Arizonian, can you update me to two cards please?
> 
> 
> Spoiler: Warning: Spoiler!


I like the last pic.


----------



## neurotix

I really like that one too.

And, Vesuvius. xD


----------



## Zaid

I just got my 290X Lightning - great initial overclock. Pretty sure I can get more out of it.

http://www.techpowerup.com/gpuz/wskk/


----------



## Aussiejuggalo

Was playing Cabela's Big Game Hunter Pro Hunts and noticed some weird usage readings Afterburner was throwing out; the CPU wasn't even at 50% lol




















Is it just a screwy thing afterburner does?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Was playing Cabela's Big Game Hunter Pro Hunts and noticed some weird usage readings afterburner was throwing out, CPU wasnt even at 50% lol
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is it just a screwy thing afterburner does?


Afterburner doesn't read GPU usage correctly IIRC; if the game is running smoothly on the screen, then that's enough.


----------



## Forceman

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Was playing Cabela's Big Game Hunter Pro Hunts and noticed some weird usage readings afterburner was throwing out, CPU wasnt even at 50% lol
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is it just a screwy thing afterburner does?


Upgrade to Afterburner 3.0.0 Beta 19.

http://download.msi.com/uti_exe/vga/MSIAfterburnerSetup300Beta19.zip


----------



## Sgt Bilko

Sapphire R9 290 Vapor-X



http://www.sapphiretech.com/presentation/media/media_index.aspx?psn=0004&articleID=5683&lid=1

Much better looking than the Tri-X imo.


----------



## Belkov

...and much more expensive.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Belkov*
> 
> ...and much more expensive.


If it's priced higher than the DCU II I'll be surprised.

Probably another $20-30 over the Tri-X which is fine imo.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Afterburner doesn't read GPU usage correctly iirc, if the game is running smooth on the screen then it's enough


Well, that's a PITA







lol
Quote:


> Originally Posted by *Forceman*
> 
> Upgrade to Afterburner 3.0.0 Beta 19.
> 
> http://download.msi.com/uti_exe/vga/MSIAfterburnerSetup300Beta19.zip


Thanks I'll see if that fixes it


----------



## Belkov

Quote:


> Originally Posted by *Sgt Bilko*
> 
> If it's priced higher than the DCU II I'll be surprised.
> 
> Probably another $20-30 over the Tri-X which is fine imo.


Actually, I heard something about $50-70 more. If it is only $30, it will be a great choice. We'll see...


----------



## Sgt Bilko

Quote:


> Originally Posted by *Belkov*
> 
> Actually i heard something about 50-70$ more. If it is only 30$ it will be a great choice. Will see...


$70-80 can't be justified by the unknown (but probably only slightly better) cooling.

The DCU II is currently the most expensive 290 here at $570, the Tri-X is $540, and the XFX DD cards are the cheapest at $500.

Looks like they knocked the price of the DCU II cards down a little here; they were around $600 at one point (a bit over, IIRC).


----------



## Arizonian

Quote:


> Originally Posted by *neurotix*
> 
> Arizonian, can you update me to two cards please?
> 
> 
> Spoiler: Warning: Spoiler!


updated









Quote:


> Originally Posted by *Zaid*
> 
> I just got my 290x lightning, great initial overclock. pretty sure i can get more out of it.
> 
> http://www.techpowerup.com/gpuz/wskk/


Congrats









Love to add you to the list.

To be added on the member list please submit the following in your post
1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
2. Manufacturer & Brand - Please don't make me guess.
3. Cooling - Stock, Aftermarket or 3rd Party Water


----------



## Zaid

1. http://www.techpowerup.com/gpuz/b5rgq/
2.MSI Lightning
3.stock


----------



## gogolXmogol

Quote:


> Originally Posted by *Tobiman*
> 
> I don't think you have to tbh. Why not use DDU to uninstall drivers instead of manual deleting?


I reinstalled Windows, and it looks like I have found what is causing the card to stop downclocking at idle.
It's the desktop setting for the monitor refresh rate. I have a BenQ XL24TE monitor, and when I set the refresh rate to 144Hz in Windows it stops downclocking; 120Hz and below are just fine. This is very weird and frustrating. I've submitted a bug report on the AMD drivers page, and I'd appreciate it if you guys did the same to help AMD resolve the issue.
Once again, thanks for trying to help me







I'm sticking with 120Hz in desktop for now.


----------



## rdr09

Quote:


> Originally Posted by *gogolXmogol*
> 
> Reinstalled windows and looks like I have found the issue what is causing the card to stop downclocking in Idle.
> It's the desktop setting for the monitor refresh rate, I do have a benq xl24te monitor and when I set the refresh rate to 144Hz in windows it stops downclocking. However 120 Hz and down are just fine. This is very weird and frustrating. I've submitted bug report on AMD drivers page, I'd appreciate if you guys do the same to help ATI resolve the issue.
> Once again thanks for trying to help me
> 
> 
> 
> 
> 
> 
> 
> I'm sticking with 120Hz in desktop for now.


That's been an issue for a while now and has not been addressed. AMD :doh:


----------



## Mercy4You

144 Hz is nonsense anyway...


----------



## rdr09

Quote:


> Originally Posted by *Mercy4You*
> 
> 144 Herz is nonsense anyway...


but it causes frustration for some who own them and opted for Hawaii. I think it is a hardware issue and no software can fix it; it's like plugging more monitors into the card.


----------



## gogolXmogol

Quote:


> Originally Posted by *rdr09*
> 
> that's been an issue for awhile now and has not been addressed. AMD:doh:


Hope they will fix it with the new beta release. I don't want my GPU to eat 250 watts while idle


----------



## sugarhell

Quote:


> Originally Posted by *gogolXmogol*
> 
> Hope they will fix it with the new beta release. I do not want my gpu to eat 250watts while in idle


It's a hardware issue for AMD and Nvidia. And no, without a load your GPU can't eat 250 watts at idle; you just have 3D clocks all the time. You can fix it if you know how to do it
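A rough sketch of the point that stuck 3D clocks at idle don't by themselves mean full-load power draw: CMOS dynamic power scales roughly with switching activity × V² × f, and an idle desktop has almost no switching activity. All numbers below are illustrative assumptions, not measured Hawaii figures.

```python
# Illustration only: why "stuck at 3D clocks" at idle != full TDP.
# CMOS dynamic power scales roughly as P ~ activity * C * V^2 * f.
# Voltages/clocks below are made-up placeholder values.

def dynamic_power(activity, voltage, clock_mhz):
    """Relative dynamic power: activity * V^2 * f (arbitrary units)."""
    return activity * voltage**2 * clock_mhz

full_load = dynamic_power(activity=1.0, voltage=1.15, clock_mhz=1000)
idle_3d = dynamic_power(activity=0.05, voltage=1.15, clock_mhz=1000)  # desktop idle, clocks stuck high
idle_2d = dynamic_power(activity=0.05, voltage=0.90, clock_mhz=300)   # proper 2D downclock

print(f"idle at 3D clocks vs full load: {idle_3d / full_load:.0%}")
print(f"idle at 2D clocks vs full load: {idle_2d / full_load:.1%}")
```

So even with 3D clocks stuck, near-zero switching activity keeps the draw far below the card's 250 W board power; the downclock bug mostly costs extra heat and a few tens of watts, not the full TDP.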


----------



## BradleyW

Quote:


> Originally Posted by *Mercy4You*
> 
> 144 Herz is nonsense anyway...


With 144Hz, I've never needed Vsync again. Gameplay is perfect in any game without any input lag. It's a great feeling. The difference is night and day, even when the fps is low.


----------



## rdr09

Quote:


> Originally Posted by *sugarhell*
> 
> Its a hardware issue for amd and nvidia. And no without usage your gpu cant eat 250 watt on idle. You just have 3d clocks all the time.You can fix it if you know how to do it


Please share it and maybe we can ask Arizonian to include it in the OP. Thanks.


----------



## sugarhell

Quote:


> Originally Posted by *rdr09*
> 
> please share it and maybe we can request the Arizonian to include it to the op. thanks.


Disable powerplay


----------



## rdr09

Quote:


> Originally Posted by *sugarhell*
> 
> Disable powerplay


can it be done without using AB?


----------



## Mercy4You

Quote:


> Originally Posted by *BradleyW*
> 
> With 144Hz, I've never needed Vsync again. Gameplay is perfect in any game without any input lag. It's a great feeling. The different is night and day, even when the fps is low.


I meant 144 Hz over 120 Hz









120 Hz is great, I agree! But 144 Hz is really a marketing thing, because a human being cannot see the difference between the two


----------



## BradleyW

Quote:


> Originally Posted by *Mercy4You*
> 
> I meant 144 Hertz over 120 Hertz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 120 Hertz is great, I agree! But 144 Hertz is really a marketing thing, because a human being can not see the difference between those 2


Sorry, didn't realise.


----------



## Mercy4You

Quote:


> Originally Posted by *BradleyW*
> 
> Sorry, did not realise. .


NP







BTW, when you use 120 Hz, try capping your frames in games at 122. That way your GPU works less hard, which saves energy = less heat = higher OC
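The frame-cap tip boils down to inserting idle time into the render loop. A minimal, hypothetical sketch of what a limiter does (in practice you'd use a tool like RadeonPro or RTSS rather than writing your own):

```python
import time

def run_capped(render_frame, fps_cap=122, frames=500):
    """Render loop with a simple frame-time cap: if a frame finishes
    early, sleep off the remainder so the GPU idles instead of rendering
    extra frames a 120 Hz panel can't display anyway."""
    target = 1.0 / fps_cap
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < target:
            time.sleep(target - elapsed)  # GPU/CPU idle time = less heat

# e.g. run_capped(lambda: None, fps_cap=122, frames=10)
```

Every millisecond the loop spends sleeping is a millisecond the GPU isn't burning power on frames that would be discarded.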


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> can it be done without using AB?


I think Trixx has the option as well, but CCC doesn't


----------



## BradleyW

I have both AB and Trixx. Can't see the option for it, unless my eyes are going funny.


----------



## Sgt Bilko

Quote:


> Originally Posted by *BradleyW*
> 
> I have both AB and Trixx. Can't see the option for it, unless my eyes for going funny.




I was wrong about Trixx though; can't see it.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 
> 
> I was wrong about Trixx though. can't see it


Can't see it either, my eyes are red and sore


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Cant see it either my eyes are red and sore
> 
> 
> 
> 
> 
> 
> 
> 
> 


Here i was thinking that was a normal look for you












----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Here i was thinking that was a normal look for you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 


It is when im not driving LoooL


----------



## chronicfx

May grab a 4930K at Microcenter later, but they only carry a few motherboards for X79. I want to run three 290Xs and a SoundBlaster Zx (x1 slot). It seems that only the Asus P9X79 WS and the Asus Sabertooth X79 have the slot room, although the Sabertooth's specs don't say three GPUs can run on it. Can someone confirm this? I will also be getting a replacement plan so I don't have to deal with Asus RMA. What should I get? And will the Asus Sabertooth work in TriFire? It has the same slot layout as my current Gigabyte Z77-UD5H, which would be ideal, as I can use that second x1 slot for my sound card instead of the bottom x16 on the P9X79 WS. Anyone?


----------



## Elmy

Quote:


> Originally Posted by *chronicfx*
> 
> May grab a 4930k at microcenter later but they only carry a few motherboards for x79. I want to run three 290x and a soundblaster zx (x1 slot). Seems that only the asus p9x79 ws and the x79 asus sabretooth have the slot room. Although the specs on the sabretooth don't specify three gpu's can run on it. Can someone confirm this? I will also be getting a replacement plan to not have to deal with asus rma. What should I get? And will the asus sabretooth work in trifire? It has the same slot layout as my current gigabyte z77-ud5h. Which would be ideal as i can use that second x1 slot for my sound card instead of the bottom x16 on the p9x79 ws. Anyone?


http://www.newegg.com/Product/Product.aspx?Item=N82E16813131971

This is probably the best board for you.


----------



## Arizonian

Quote:


> Originally Posted by *sugarhell*
> 
> Its a hardware issue for amd and nvidia. And no without usage your gpu cant eat 250 watt on idle. You just have 3d clocks all the time.You can fix it if you know how to do it


Quote:


> Originally Posted by *rdr09*
> 
> please share it and maybe we can request the Arizonian to include it to the op. thanks.


Quote:


> Originally Posted by *sugarhell*
> 
> Disable powerplay


PM a write up I'd be glad to add it to the OP for reference.


----------



## gogolXmogol

Quote:


> Originally Posted by *Mercy4You*
> 
> I meant 144 Hertz over 120 Hertz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 120 Hertz is great, I agree! But 144 Hertz is really a marketing thing, because a human being can not see the difference between those 2


The whole point is that the card downclocks just fine at 120 Hz and below! A 24 Hz difference makes the memory clock jump from 150 MHz to 1300 MHz

What are the drawbacks of disabling AMD powerplay?


----------



## arrow0309

Quote:


> Originally Posted by *chronicfx*
> 
> May grab a 4930k at microcenter later but they only carry a few motherboards for x79. I want to run three 290x and a soundblaster zx (x1 slot). Seems that only the asus p9x79 ws and the x79 asus sabretooth have the slot room. Although the specs on the sabretooth don't specify three gpu's can run on it. Can someone confirm this? I will also be getting a replacement plan to not have to deal with asus rma. What should I get? And will the asus sabretooth work in trifire? It has the same slot layout as my current gigabyte z77-ud5h. Which would be ideal as i can use that second x1 slot for my sound card instead of the bottom x16 on the p9x79 ws. Anyone?


I can't recommend anything but the R4E; you can install your SoundBlaster in the second PCIe slot and still have an x16/x8/x8 CF with the first, third, and fourth PCIe (x16) slots









http://www.newegg.com/Product/Product.aspx?Item=N82E16813131802R


----------



## sugarhell

Quote:


> Originally Posted by *gogolXmogol*
> 
> The whole point is but the card downclocks just fine under 120 Hz! 22Hz difference makes it speed up the clock from 150MHz on mem to 1300MHz
> 
> What are the drawbacks of disabling AMD powerplay?


The 144 Hz issue is an old one. All GPUs do it; it's a problem with the DVI/DP clocks or something.

None, you just control the 2D/3D clocks yourself with MSI AB profiles.

Disable PowerPlay. The default profile becomes your 2D clocks. Then you create another profile with somewhat higher clocks; that's your 3D clocks. (Someone needs to test it, I had my 290s only for mining, but it works this way on the 7970.)

PS: By the way, I don't know why you guys bench with PowerPlay on. Performance with PowerPlay off is higher, and you don't get stupid bugs like "my GPU doesn't downclock", flickering, etc. But PowerPlay off can sometimes cause a bluescreen if you use hardware acceleration
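One commonly offered (but unconfirmed) explanation for the 144 Hz bug is that memory reclocks are scheduled inside the vertical blanking interval to avoid visible flicker, and that window shrinks as the refresh rate rises. A back-of-envelope sketch with illustrative 1080p reduced-blanking-style timings (not the driver's actual numbers):

```python
# Hypothesis sketch: how much vertical blanking time is available per
# frame at different refresh rates. Line counts are illustrative
# (roughly CVT reduced-blanking 1080p), not the exact driver timings.

def vblank_window_us(refresh_hz, active_lines=1080, total_lines=1111):
    """Approximate vertical blanking duration in microseconds."""
    frame_time_s = 1.0 / refresh_hz
    return frame_time_s * (total_lines - active_lines) / total_lines * 1e6

for hz in (60, 120, 144):
    print(f"{hz:>3} Hz: ~{vblank_window_us(hz):.0f} us of vblank per frame")
```

Under these assumptions, 144 Hz leaves roughly 17% less blanking time per frame than 120 Hz; if the memory reclock can't complete inside that window, keeping the memory at 3D clocks is the driver's safe fallback.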


----------



## chronicfx

Quote:


> Originally Posted by *chronicfx*
> 
> May grab a 4930k at microcenter later but they only carry a few motherboards for x79. I want to run three 290x and a soundblaster zx (x1 slot). Seems that only the asus p9x79 ws and the x79 asus sabretooth have the slot room. Although the specs on the sabretooth don't specify three gpu's can run on it. Can someone confirm this? I will also be getting a replacement plan to not have to deal with asus rma. What should I get? And will the asus sabretooth work in trifire? It has the same slot layout as my current gigabyte z77-ud5h. Which would be ideal as i can use that second x1 slot for my sound card instead of the bottom x16 on the p9x79 ws. Anyone?


So annoying... I ask a vs b and always get an answer of c... Anyone want to answer this question?


----------



## Sgt Bilko

Quote:


> Originally Posted by *chronicfx*
> 
> So annoying... I ask a vs b and always get an answer of c... Anyone want to answer this question?


They both support Quad-Fire and Quad SLI from what the product page is telling me:

http://www.asus.com/Motherboards/SABERTOOTH_X79/specifications/

2 x PCIe 3.0/2.0 x16 (dual x16)
1 x PCIe 3.0/2.0 x16 (x8 mode)

https://www.asus.com/Motherboards/P9X79/specifications/

2 x PCIe 3.0/2.0 x16 (dual x16)
1 x PCIe 3.0/2.0 x16 (x8 mode)


----------



## chronicfx

Quote:


> Originally Posted by *Sgt Bilko*
> 
> They both support Quad-Fire and Quad SLI from what the product page is telling me:
> 
> http://www.asus.com/Motherboards/SABERTOOTH_X79/specifications/
> 
> 2 x PCIe 3.0/2.0 x16 (dual x16)
> 1 x PCIe 3.0/2.0 x16 (x8 mode)
> 
> https://www.asus.com/Motherboards/P9X79/specifications/
> 
> 2 x PCIe 3.0/2.0 x16 (dual x16)
> 1 x PCIe 3.0/2.0 x16 (x8 mode)


Quad here means two dual-GPU cards like the 7990 or 690. And I meant the WS version of the P9X79


----------



## Durvelle27

Ok guys, I have a weird issue. For some reason my card will only run at PCIe 2.0 x16 instead of 3.0 x16

Motherboard is a ASUS Z87-A w/ i7-4770


----------



## Tobiman

Quote:


> Originally Posted by *Durvelle27*
> 
> Ok guys I have a werid issue. For some reason my card will only run at 2.0 x16 instead of 3.0 x16
> 
> Motherboard is a ASUS Z87-A w/ i7-4770


Sounds like a mobo issue, but performance-wise you aren't losing much. Maybe 1-3 fps out of 100, so nothing to really worry about.
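Quick numbers behind the "maybe 1-3 fps" estimate: Gen2 runs 5 GT/s with 8b/10b encoding, Gen3 runs 8 GT/s with 128b/130b, so usable x16 bandwidth roughly doubles, but a single card rarely saturates either link:

```python
# Back-of-envelope PCIe bandwidth. 1 GT/s carries ~1 Gbit/s per lane
# before encoding overhead; Gen2 uses 8b/10b, Gen3 uses 128b/130b.

def pcie_x16_gbps(gt_per_s, encoding_eff):
    """Usable payload bandwidth of an x16 link in GB/s."""
    lanes = 16
    return gt_per_s * encoding_eff * lanes / 8  # bits -> bytes

gen2 = pcie_x16_gbps(5.0, 8 / 10)     # ~8 GB/s
gen3 = pcie_x16_gbps(8.0, 128 / 130)  # ~15.75 GB/s
print(f"Gen2 x16: {gen2:.1f} GB/s, Gen3 x16: {gen3:.2f} GB/s")
```

A single GPU mostly streams textures and command buffers well under 8 GB/s in games, which is why the measured difference between the two link speeds is so small.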


----------



## Imprezzion

Fujipoly Extreme is in. 100x15x1.0mm is plenty for 3 mounts for the primary VRMs.

On my Tri-X VRM temps dropped ~10c depending on fanspeed.

Further testing in progress now.


----------



## chronicfx

Quote:


> Originally Posted by *chronicfx*
> 
> May grab a 4930k at microcenter later but they only carry a few motherboards for x79. I want to run three 290x and a soundblaster zx (x1 slot). Seems that only the asus p9x79 ws and the x79 asus sabretooth have the slot room. Although the specs on the sabretooth don't specify three gpu's can run on it. Can someone confirm this? I will also be getting a replacement plan to not have to deal with asus rma. What should I get? And will the asus sabretooth work in trifire? It has the same slot layout as my current gigabyte z77-ud5h. Which would be ideal as i can use that second x1 slot for my sound card instead of the bottom x16 on the p9x79 ws. Anyone?


Ok, let me rephrase. These are the only boards in stock and the only ones that qualify for the $50 cash back at the Microcenter in Paterson, NJ. What would be WRONG with running three 290Xs on the above boards?


----------



## Roboyto

Quote:


> Originally Posted by *Durvelle27*
> 
> Ok guys I have a werid issue. For some reason my card will only run at 2.0 x16 instead of 3.0 x16
> 
> Motherboard is a ASUS Z87-A w/ i7-4770


Check PCI-E settings in BIOS, either force Gen3, or make sure it is on AUTO


----------



## Roboyto

Quote:


> Originally Posted by *chronicfx*
> 
> So annoying... I ask a vs b and always get an answer of c... Anyone want to answer this question?


http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/50312-asus-sabertooth-x79-motherboard-review-2.html

I downloaded the manual for the board and it only has instructions for 2 cards, but Hardwarecanucks states 3-way SLI

http://dlcdnet.asus.com/pub/ASUS/mb/LGA2011/SABERTOOTH-X79/E8040_SABERTOOTH_X79.pdf


----------



## chronicfx

Quote:


> Originally Posted by *Roboyto*
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/50312-asus-sabertooth-x79-motherboard-review-2.html
> 
> I downloaded the manual for the board and it only has instructions for 2 cards, but Hardwarecanucks states 3-way SLI
> 
> http://dlcdnet.asus.com/pub/ASUS/mb/LGA2011/SABERTOOTH-X79/E8040_SABERTOOTH_X79.pdf


Yeah, I saw a thread too where a guy ran tri-SLI 680s with success. I have not seen TriFire though. I just did not want to drop $900, pull apart my watercooling loop, and find out I need to return it.


----------



## Roboyto

Quote:


> Originally Posted by *chronicfx*
> 
> Yeah i saw a thread too where a guy used tri sli 680's with success. I have not seen trifirr though. I just did not want to dump $900 pull apart my watercooling loop and find out i need to return it.


Call/e-mail ASUS tech support. You can hopefully get an answer from them







I laugh because of my recent horrible experience with their tech support. According to the 'tech' I was dealing with, the PCIe slot *isn't* on the motherboard


----------



## Durvelle27

Quote:


> Originally Posted by *Roboyto*
> 
> Check PCI-E settings in BIOS, either force Gen3, or make sure it is on AUTO


I believe it is on Auto


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *chronicfx*
> 
> So annoying... I ask a vs b and always get an answer of c... Anyone want to answer this question?


It should run tri; I did with 760s on that weak Sabertooth. Highest I could get the CPU to clock was 48x100. Lame


----------



## chronicfx

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> It should run tri I did with 760's on that weak Sabertooth . Highest I could get the cpu to clock was 48x100 . Lame


And what did you get with that chip in another board?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *chronicfx*
> 
> And what did you get with that chip in another board?


3930k 5.2 @2428 . That's my benchmark speed on Asus R4F / RIVE


----------



## kizwan

Quote:


> Originally Posted by *chronicfx*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roboyto*
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/50312-asus-sabertooth-x79-motherboard-review-2.html
> 
> I downloaded the manual for the board and it only has instructions for 2 cards, but Hardwarecanucks states 3-way SLI
> 
> http://dlcdnet.asus.com/pub/ASUS/mb/LGA2011/SABERTOOTH-X79/E8040_SABERTOOTH_X79.pdf
> 
> 
> 
> 
> 
> 
> 
> Yeah i saw a thread too where a guy used tri sli 680's with success. I have not seen trifirr though. I just did not want to dump $900 pull apart my watercooling loop and find out i need to return it.

The WS natively supports dual, tri & quad GPU.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> The WS natively support dual, tri & quad GPU.


How are ya ?
I wonder if WS has the stuff to clock as well


----------



## Matt-Matt

So does anyone here have a Koolance 290x block? I just ordered one after looking at strens results.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Matt-Matt*
> 
> So does anyone here have a Koolance 290x block? I just ordered one after looking at strens results.


I went full cover XSPC for my 290's


----------



## Matt-Matt

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I went full cover XSPC for my 290's


Nice, how are they?

I went with XSPC for some 7970 blocks at one stage and I didn't end up using them but they didn't look too great. So I went with the Koolance this time, EK cost a bit more and according to strens results they are about the same.

I really did want some of the AquaComputer ones but nowhere here stocks them!


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Matt-Matt*
> 
> Nice, how are they?
> 
> I went with XSPC for some 7970 blocks at one stage and I didn't end up using them but they didn't look too great. So I went with the Koolance this time, EK cost a bit more and according to strens results they are about the same.
> 
> I really did want some of the AquaComputer ones but nowhere here stocks them!


Dropped all of my temps 50% + !
They do the job nicely for my first go at water blocking vid cards


----------



## Matt-Matt

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Dropped all of my temps 50% + !
> They do the job nicely for my first go at water blocking vid cards


Nice! I'm hoping for under 50c overall with one card on an RX360 and the 3570K in the loop. Been using the stock cooler for some time now, so yeah, it'll be nice.


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> The WS natively support dual, tri & quad GPU.
> 
> 
> 
> How are ya ?
> I wonder if WS has the stuff to clock as well











Should be able to overclock well. Depends on the CPU, of course. He doesn't have much choice anyway since only the Sabertooth & WS are available. Looks like the Sabertooth can run tri-GPU, but the WS natively supports dual, tri & quad GPUs. Less secondary voltage control (e.g. PLL voltage) than ROG boards, but it should do well.
Quote:


> Originally Posted by *Matt-Matt*
> 
> So does anyone here have a Koolance 290x block? I just ordered one after looking at strens results.


You can't go wrong with any of the blocks. If VRM1 is still running hot, reaching the 70s to 80s Celsius when overclocked, you may need Fujipoly Extreme or Ultra Extreme for VRM1. They're expensive though.


----------



## boldenc

I noticed random square-like artifacts appearing while browsing with Firefox. Disabling hardware acceleration fixed it, but I wonder if AMD can solve it in future drivers?
I did a quick search and it looks like a common issue with GPUs based on the GCN architecture, and it has been there a long time without a fix.
https://bugzilla.mozilla.org/show_bug.cgi?id=837489


----------



## Imprezzion

Well, my Fujipoly Extreme came in from FrozenCPU yesterday and I went out to test it on my stock Tri-X cooled 290x.

VRM1 temps dropped by well over 10c even on air. VRM1 is now more or less equal to core temps and I can run much lower fanspeeds while keeping VRM1 cool as well.

Also, if I wanted to, I can get away with +200mV on air now.
Ran +200mV, 1230Mhz core, 1500Mhz VRAM and max power limit yesterday.
On 100% fanspeed temps in testing (BF4, FC3, Valley, FireStrike, 3DM11) did not exceed 85c for VRM1 and core was around 75c. Before VRM1 would just fly towards high 90s to low 100s..

Also, the 24/7 clocks I ran were 1165MHz core, 1500MHz VRAM at +100mV. This took ~75-85% fanspeed to keep VRM1 under 80c during extended periods of load. Core barely breached the 60c mark then.
Now I've got fanspeed set to 65%, which is a lot quieter. VRM1 maxed at exactly 80c after running all tests for at least 30 minutes. Core stayed around 72c. Much better temp-noise balance.

I'm going to try to get something like 1200-1240MHz core stable, as I couldn't care less about load noise when gaming so... can't resist. I know 1250 isn't going to work as it artifacts like hell in BF4 and FC3 foliage even at +200mV.

P.S. The 100x15x1.0mm strip of Fujipoly is enough for 3 mounts on the VRM1 section. Just cut it into 3 strips 5mm wide. A sharp boxcutter / Stanley-style knife should do fine.


----------



## rdr09

Quote:


> Originally Posted by *boldenc*
> 
> I noticed random squares like artifacts appears during browsing with Firefox, disabling hardware acceleration fixed it but I wonder if AMD can solve it in future drivers?
> I did a quick search and it looks it is a common issue with GPU's based on GCN architectures and it has been there from a long time without a fix.
> https://bugzilla.mozilla.org/show_bug.cgi?id=837489


see highlighted portion of post # 2 . . .

http://www.overclock.net/t/1265543/the-amd-how-to-thread


----------



## Imprezzion

Update: 1200Mhz core, +150mV, 1500Mhz VRAM, 85% fanspeed. With Prolimatech PK-1 on the core and Fujipoly Extreme on the VRM1.

BF4 50 player SQDM @ 1080p Ultra w/ 4x Adaptive AA forced in CCC and HBAO enabled. Ran for 30 minutes.
- Core temp max: 74c.
- VRM 1 temp max: 76c.
- VRM2 temp max: 48c.

Far Cry 3 all settings maxed with 4x Adaptive AA forced in CCC also ran for 30 minutes.
- Core temp max: 72c.
- VRM1 temp max: 73c.
- VRM2 temp max: 51c.

Unigine Valley Extreme preset, 8x Adaptive AA forced, windowed mode. Also ran 30 minutes.
- Core temp max: 76c.
- VRM1 temp max: 80c.
- VRM2 temp max: 53c.

And as an added bonus:
BF4 64 player Conquest large (Naval Strike) @ Ultra w/ 2x Adaptive AA, HBAO and at 150% res scale so 2880x1620 resolution.
- Core temp max: 75c.
- VRM1 temp max: 79c.
- VRM2 temp max: 53c.

If these clocks prove stable at 1200MHz then I couldn't be happier, because a 1200MHz R9 290X on 3rd-party air is not all that common







(Unless it's a Lightning but k)


----------



## chronicfx

Quote:


> Originally Posted by *kizwan*
> 
> The WS natively support dual, tri & quad GPU.


I think this is what I will end up getting: the P9X79 WS. If I can get to 4.7GHz with the 4930K I will be a happy man... at least until I sleep and have visions of a 5GHz chip... I think it needs that BIOS flash update for Ivy out of the box though; I will have to look into that and figure out how to overclock these, as they seem different from overclocking my 3570K.


----------



## Dasboogieman

Sapphire AMD 290 Tri-X OC, latest stable AMD drivers.

Sup guys, just bought this beauty for the low low price of $530 AUD with free shipping. Decided against the 290X (which was $730) and instead picked up an additional ASUS Direct CU II 7870 for the spare comp with the $200 saved.

I upgraded from an SLI GTX 570 system and it's really nice to have a quiet PC for once. This thing's cooling capacity is phenomenal.
Very impressed with the engineering so far, however, software had a few hiccups.

Recently discovered that gameplay is really choppy with Skyrim, DOTA2, Far Cry 3 and Shogun 2 without adaptive Vsync (set via RadeonPro since CCC seems to be ineffective).

Been playing around with the OC, stock seems to like 1100mhz core/ 1400mhz Mem though I bumped +35mv with Afterburner for good measure.
How far has everyone been getting with the Memory Overclocks? My card has Hynix according to MemoryInfo.

Also, I don't have much experience with GDDR5 overclocking since the GTX 570 was more limited by its anemic memory controller than by the actual memory ICs. Is there a reliable way to verify a GDDR5 memory clock? I used to use CUDAmemtest, but it was only reliable with GDDR3, since the error detection on GDDR5 masks obvious errors.

Cheers


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Dasboogieman*
> 
> 
> 
> Sapphire AMD 290 Tri-X OC, latest stable AMD drivers.
> 
> Sup guys, just bought this beauty for the low low price of $530 AUD with free shipping. Decided against the 290X (which was $730) and instead picked up an additional ASUS Direct CU II 7870 for the spare comp with the $200 saved.
> 
> I upgraded from an SLI GTX 570 system and its really nice to have a quiet PC for once. This thing's cooling capacity is phenomenal.
> Very impressed with the engineering so far, however, software had a few hiccups.
> 
> Recently discovered that gameplay is really choppy with Skyrim, DOTA2, Far Cry 3 and Shogun 2 without adaptive Vsync (set via RadeonPro since CCC seems to be ineffective).
> 
> Been playing around with the OC, stock seems to like 1100mhz core/ 1400mhz Mem though I bumped +35mv with Afterburner for good measure.
> How far has everyone been getting with the Memory Overclocks? My card has Hynix according to MemoryInfo.
> 
> Also, I don't have much experience with GDDR5 overclocking since the GTX 570 was more limited by the anemic Memory controller than the actual memory ICs. Is there a reliable way to verify a GDDR5 memory clock? I used to use CUDAmemtest but it was only reliable with GDDR3 since the ECC on the GDDR5 prevents obvious errors.
> 
> Cheers


You should get a higher mem clock out of that one with Hynix, say 1600+ (6400 effective), compared to my 290 with Elpida mem at about 1450-1500 MHz (6000 effective).
Have fun dude


----------



## cyenz

Hi guys, I've been looking at the Hybrid II 120 to replace my 290X's stock cooler, but I'm a little worried since the VRM and memory cooler is placed on the backside of the card... does it really cool anything? Or will it pop a VRM during intense gaming?

Haven't seen a lot of feedback since it's a relatively new cooler.

Any 290X owners that have one? How are the VRM temps under load?


----------



## Matt-Matt

Quote:


> Originally Posted by *kizwan*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Should be able to overclock well. Depends on the CPU of course. He doesn't have much choice anyway since only Saberteeth & WS available. Look like Saberteeth can run tri-GPU but WS natively support dual, tri & quad GPU's. Less secondary voltage control, e.g. PLL voltage, than ROG boards but should do well.
> You couldn't go wrong with any blocks. If VRM1 still running hot, like reaching 70s to 80s Celsius when overclock, you may need Fujipoly Extreme or Ultra Extreme for VRM1. They're expensive though.


Yeah, I don't think you can get the Fujipoly here (Australia) easily or really cheaply at all. I do have a Phobya 1mm pad though. My housemate also has a few that I could borrow if I need more.

I'll see how it goes I guess, it's going to be better then air either way.


----------



## DerekParker

1.http://www.techpowerup.com/gpuz/vwfpp/

2.Msi Gaming Series Twin Frozr 290x x2

3.Stock twin frozr cooling until someone produces a block that will fit over the larger capacitors on this card.


----------



## axiumone

Quote:


> Originally Posted by *DerekParker*
> 
> 1.http://www.techpowerup.com/gpuz/vwfpp/
> 
> 2.Msi Gaming Series Twin Frozr 290x x2
> 
> 3.Stock twin frozr cooling until someone produces a block that will fit over the larger capacitors on this card.


EK has a block coming soon that will fit the larger capacitors on these cards.


----------



## Roboyto

Quote:


> Originally Posted by *DerekParker*
> 
> 1.http://www.techpowerup.com/gpuz/vwfpp/
> 
> 2.Msi Gaming Series Twin Frozr 290x x2
> 
> 3.Stock twin frozr cooling until someone produces a block that will fit over the larger capacitors on this card.


EK lists a block as 'Coming Soon' on their Cooling Configurator website


----------



## axiumone

Is there a bios editor capable of editing r9 290/x roms?


----------



## Roboyto

Quote:


> Originally Posted by *Dasboogieman*
> 
> 
> 
> Sapphire AMD 290 Tri-X OC, latest stable AMD drivers.
> 
> Sup guys, just bought this beauty for the low low price of $530 AUD with free shipping. Decided against the 290X (which was $730) and instead picked up an additional ASUS Direct CU II 7870 for the spare comp with the $200 saved.
> 
> I upgraded from an SLI GTX 570 system and its really nice to have a quiet PC for once. This thing's cooling capacity is phenomenal.
> Very impressed with the engineering so far, however, software had a few hiccups.
> 
> *Recently discovered that gameplay is really choppy with Skyrim, DOTA2, Far Cry 3 and Shogun 2 without adaptive Vsync (set via RadeonPro since CCC seems to be ineffective).*
> 
> Been playing around with the OC, stock seems to like 1100mhz core/ 1400mhz Mem though I bumped +35mv with Afterburner for good measure.
> *How far has everyone been getting with the Memory Overclocks? My card has Hynix according to MemoryInfo.*
> 
> Also, I don't have much experience with GDDR5 overclocking since the GTX 570 was more limited by the anemic Memory controller than the actual memory ICs. *Is there a reliable way to verify a GDDR5 memory clock?* I used to use CUDAmemtest but it was only reliable with GDDR3 since the ECC on the GDDR5 prevents obvious errors.
> 
> Cheers


*Choppy Gameplay:*


Make sure all of your old drivers are gone by using Display Driver Uninstaller.
What driver version are you running? My suggestion would be the 13.12 as 14.X beta drivers have been problematic for some time now. I have read that 14.3 is better than the last couple iterations, especially if you fancy using Mantle.

*Memory Overclock:*


- This really varies quite a bit from card to card.
- Hynix usually outperforms Elpida, but there have been a few people with outstanding Elpida cards.
- Personally, my XFX 290 Black Edition can run the memory up to 1700; it has Hynix.
- Some people have better luck with RAM overclocks using MSI Afterburner since it has AUX voltage adjustment, which is for the RAM; I use Sapphire Trixx.
- Memory OC comes in a definite second behind core clock.
- To verify your OC, run some benchmarks. Anything 3DMark or Unigine is a very popular choice for stability testing.
- When pushing the RAM too high I usually experienced black screens and lockups.
- Pushing the core too far will more than likely result in artifacting and screen tearing.



Check out my Shrunken Beast build log and you can see all my overclocking I did with my card in a spreadsheet. On average my large RAM overclock was only good for about 5% benchmark performance boost



The most important thing to consider with these cards is the temperature of VRM1; it will inhibit your OC before core temp does. The consensus in this thread seems to be two things: the VRMs are rated for up to ~120C, and most feel the comfortable operating temperature is ~80C. Whenever you order the block for your card, I would highly suggest upgrading the VRM thermal pads to Fujipoly Ultra Extreme; they make a considerable difference.

http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures
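For context on why a big RAM overclock was only worth ~5% in benchmarks: Hawaii's 512-bit bus already provides enormous bandwidth at stock, so shader throughput, not memory, is usually the bottleneck. A quick bandwidth calculation (assuming the 1250 MHz reference memory clock):

```python
# Rough memory-bandwidth numbers for the R9 290/290X's 512-bit bus.
# GDDR5 is quad-pumped: effective data rate = 4 x memory clock.

def gddr5_bandwidth_gbs(mem_clock_mhz, bus_width_bits=512):
    """Theoretical peak bandwidth in GB/s."""
    return mem_clock_mhz * 4 * bus_width_bits / 8 / 1000

stock = gddr5_bandwidth_gbs(1250)   # reference memory clock
oc = gddr5_bandwidth_gbs(1700)      # the overclock mentioned above
print(f"stock: {stock:.0f} GB/s, 1700 MHz OC: {oc:.0f} GB/s "
      f"(+{(oc / stock - 1):.0%})")
```

A +36% bandwidth increase yielding only ~5% more benchmark score is a strong hint the card is rarely bandwidth-bound, which matches the advice to prioritize core clock.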


----------



## Imprezzion

Ok, my core speed is 100% stable at 1200MHz on air now.
+150mV does the trick for 1200MHz. Been playing random games and benches all afternoon (about 4-5 hours) and max temps were 76c core, 83c VRM1 and 53c VRM2. Ambient 20c, fanspeed ~80%.

Just a shame my Hynix VRAM will randomly give single-colored screens or driver freezes above ~1525MHz.


----------



## Roboyto

Quote:


> Originally Posted by *Imprezzion*
> 
> Ok, my core speed is 100% stable on 1200Mhz on air now.
> +150mV does the trick for 1200Mhz. Been playing random games and benches all afternoon (about 4-5 hours) and max temps where 76c core, 83c VRM1 and 53c VRM2. Ambient 20c, fanspeed ~80%.
> 
> Just a shame my Hynix VRAM will randomly give single colored screens or driver freezes above ~1525Mhz.


Nice clocks







RAM speed isn't that important, so don't be sad









See my post preceding yours, the RAM clocks hold little weight compared to core on these cards. My card is stable for nearly everything at, or very close to, 1300/1700 +200mV, but the voltage and temperature increases weren't worth it; My everyday usage clocks are 1200/1500 +87mV. I have compared 1200/1500 to my max stable clocks and I only lose ~5% performance. I'll take it since I'm not abusing the card with +200mV.


----------



## Red1776

Quote:


> Originally Posted by *axiumone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DerekParker*
> 
> 1.http://www.techpowerup.com/gpuz/vwfpp/
> 
> 2.Msi Gaming Series Twin Frozr 290x x2
> 
> 3.Stock twin frozr cooling until someone produces a block that will fit over the larger capacitors on this card.
> 
> 
> 
> EK has a block coming soon that will fit the larger capacitors on these cards.
Click to expand...

 I have EK blocks that fit the 1040 OC MSI R9 290X 4GB Gaming Twin Frozr cards


----------



## Imprezzion

Quote:


> Originally Posted by *Roboyto*
> 
> Nice clocks
> 
> 
> 
> 
> 
> 
> 
> RAM speed isn't that important, so don't be sad
> 
> 
> 
> 
> 
> 
> 
> 
> 
> See my post preceding yours; the RAM clocks hold little weight compared to core on these cards. My card is stable for nearly everything at, or very close to, 1300/1700 +200mV, but the voltage and temperature increases weren't worth it; my everyday clocks are 1200/1500 +87mV. I have compared 1200/1500 to my max stable clocks and I only lose ~5% performance. I'll take it, since I'm not abusing the card with +200mV.


Now those are nice clocks! Almost 50% less offset voltage for 1200MHz.

I can't even get close to 1300MHz on +200mV. Benches start to artifact around 1240-1250MHz.

Then again, yours is a 290 and not a 290X; they probably clock slightly better due to having fewer shaders.


----------



## Roboyto

Quote:


> Originally Posted by *Imprezzion*
> 
> Now those are nice clocks! Almost 50% less offset voltage for 1200MHz.
> 
> I can't even get close to 1300MHz on +200mV. Benches start to artifact around 1240-1250MHz.
> 
> Then again, yours is a 290 and not a 290X; they probably clock slightly better due to having fewer shaders.


Watercooling definitely reduced the voltage I needed for core speeds, as well as giving me a little more headroom for overclocks. If you go full-cover water you may see better results.


----------



## Arizonian

Quote:


> Originally Posted by *Dasboogieman*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Sapphire AMD 290 Tri-X OC, latest stable AMD drivers.
> 
> Sup guys, just bought this beauty for the low low price of $530 AUD with free shipping. Decided against the 290X (which was $730) and instead picked up an additional ASUS Direct CU II 7870 for the spare comp with the $200 saved.
> 
> I upgraded from an SLI GTX 570 system and it's really nice to have a quiet PC for once. This thing's cooling capacity is phenomenal.
> Very impressed with the engineering so far; however, the software has had a few hiccups.
> 
> Recently discovered that gameplay is really choppy in Skyrim, DOTA 2, Far Cry 3 and Shogun 2 without adaptive Vsync (set via RadeonPro, since CCC seems to be ineffective).
> 
> Been playing around with the OC; stock voltage seems to like 1100MHz core / 1400MHz mem, though I bumped +35mV with Afterburner for good measure.
> How far has everyone gotten with memory overclocks? My card has Hynix according to MemoryInfo.
> 
> Also, I don't have much experience with GDDR5 overclocking, since the GTX 570 was more limited by its anemic memory controller than by the actual memory ICs. Is there a reliable way to verify a GDDR5 memory clock? I used to use CUDA memtest, but it was only reliable with GDDR3, since the error correction on GDDR5 masks obvious errors.
> 
> Cheers


Congrats - added









Quote:


> Originally Posted by *DerekParker*
> 
> 1.http://www.techpowerup.com/gpuz/vwfpp/
> 
> 2.Msi Gaming Series Twin Frozr 290x x2
> 
> 3.Stock twin frozr cooling until someone produces a block that will fit over the larger capacitors on this card.


Congrats - added


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *axiumone*
> 
> Is there a bios editor capable of editing r9 290/x roms?


I'd just flash the Asus PT1T BIOS; it's unlocked and got me this









http://www.3dmark.com/3dm11/7725420


----------



## NixZiZ

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added


Sorry, I never mentioned the card type! It's a 290, not a 290x.


----------



## Xylene

Is this safe? R9 290, reference cooler, fan at 87%, 1200MHz +137mV. See temps: VRM temp is at MAX, GPU temp maxed out at 73C but is cut off in the screenshot. This was a quick-and-dirty overclock that survived a round of BF3 at all ultra with full AA; I could use less voltage at that clock, or get more clock at that voltage.


----------



## rdr09

Quote:


> Originally Posted by *Xylene*
> 
> Is this safe? R9 290, reference cooler, fan at 87%, 1200MHz +137mV. See temps: VRM temp is at MAX, GPU temp maxed out at 73C but is cut off in the screenshot. This was a quick-and-dirty overclock that survived a round of BF3 at all ultra with full AA; I could use less voltage at that clock, or get more clock at that voltage.


i think that is safe if those temps are maintained. we have very similar cards when i had mine on air (for a week). you should be able to oc higher with better cooling.

you got the power limit maxed? if not - do so. might be able to up the mem.


----------



## axiumone

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I'd just flash the Asus PT1T BIOS; it's unlocked and got me this
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/7725420


That's great. I'm number 9 in the hall of fame.









http://www.3dmark.com/hall-of-fame-2/fire+strike+3dmark+score+extreme+preset/version+1.1

I already have a different manufacturer's BIOS flashed, but I'd like to do some of my own tweaks. It doesn't seem like any of the old AMD BIOS editors work.


----------



## Dasboogieman

After some more testing on the Tri-X OC with 3DMark 11 Performance:

1150MHz core / 1500MHz VRAM, +50mV offset, +50% power, fan speed auto (20% with ramp to 80%):
Gradual throttling in all tests to 937MHz; score is 15200; average load temp is 78 degrees.

1150MHz core / 1500MHz VRAM, +50mV offset, +50% power, fan speed 80%:
Full clock stability in 3DMark 11; final score is 16380; average load temp is 72 degrees.

1200MHz core / 1500MHz VRAM, +100mV offset, +50% power, fan speed 80%:
Gradual throttling on the 2nd test to 1000MHz when amp draw is >140A; score 14999; average load temp is 78 degrees.

This is really weird; I thought Sapphire made this card's performance envelope independent of fan speed. Anyone have ideas?

After more reading, it seems the default BIOS on the 290 is the Silent profile. Is there a chance anyone has an Uber BIOS equivalent for this card? I have a hunch it would remove the throttling issue. Also, how are the watercoolers getting around this issue, since there's technically 0% fan speed?

Ack, overclocking is so complicated now, and I thought Fermi was a pain in the ass with the Power Limiter.
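A rough way to see what the throttling cost: if we assume the score scales roughly linearly with sustained core clock (a simplification — 3DMark 11's overall score also folds in CPU-bound physics tests), the throttled run's score implies an average effective clock well below the 1150MHz setpoint:

```python
# Estimate the sustained clock implied by the throttled score, assuming the
# score scales ~linearly with average core clock (a simplification: the
# 3DMark 11 overall score also includes CPU-bound physics tests).
stable_score, stable_clock = 16380, 1150   # 80% fan run, no throttling
throttled_score = 15200                    # auto-fan run, throttling to 937MHz

implied_clock = stable_clock * throttled_score / stable_score
print(round(implied_clock))  # average effective MHz over the throttled run
```

So the auto-fan run behaves like a card holding roughly 1067MHz on average — the throttling eats most of the overclock even though the set clock never changed.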


----------



## Tobiman

Quote:


> Originally Posted by *Dasboogieman*
> 
> After some more testing on the Tri-X OC with 3DMark 11 Performance:
> 
> 1150MHz core / 1500MHz VRAM, +50mV offset, +50% power, fan speed auto (20% with ramp to 80%):
> Gradual throttling in all tests to 937MHz; score is 15200; average load temp is 78 degrees.
> 
> 1150MHz core / 1500MHz VRAM, +50mV offset, +50% power, fan speed 80%:
> Full clock stability in 3DMark 11; final score is 16380; average load temp is 72 degrees.
> 
> 1200MHz core / 1500MHz VRAM, +100mV offset, +50% power, fan speed 80%:
> Gradual throttling on the 2nd test to 1000MHz when amp draw is >140A; score 14999; average load temp is 78 degrees.
> 
> This is really weird; I thought Sapphire made this card's performance envelope independent of fan speed. Anyone have ideas?
> 
> After more reading, it seems the default BIOS on the 290 is the Silent profile. Is there a chance anyone has an Uber BIOS equivalent for this card? I have a hunch it would remove the throttling issue. Also, how are the watercoolers getting around this issue, since there's technically 0% fan speed?
> 
> Ack, overclocking is so complicated now, and I thought Fermi was a pain in the ass with the Power Limiter.


It seems like you actually need more juice. Up the voltage a bit for the 1200MHz run.


----------



## Dasboogieman

Upping the voltage seems to accelerate the throttling for me. It seems the tradeoff at the moment is high voltage and high clock speed for a short duration, or lower voltage and prolonged clock stability.


----------



## arrow0309

Add me too, please









*Validation*

Sapphire R9 290 Tri-X OC on stock cooling










http://www.overclock.net/t/294838/the-cooler-master-690-club/19240#post_22050031

@Dasboogieman
Use the 13.12 drivers (no throttling) and run Unigine Heaven 4.0 in "Custom" at 1920x1080 (full screen) with settings on Extreme.
Show us the benchmark score at 1200/1500.


----------



## Imprezzion

Exactly. You're probably using the 14.xx drivers. Those have broken power limits.
13.12 works fine.


----------



## Roboyto

Quote:


> Originally Posted by *Dasboogieman*
> 
> After some more testing on the Tri-X OC with 3DMark 11 Performance:
> 
> 1150MHz core / 1500MHz VRAM, +50mV offset, +50% power, fan speed auto (20% with ramp to 80%):
> Gradual throttling in all tests to 937MHz; score is 15200; average load temp is 78 degrees.
> 
> 1150MHz core / 1500MHz VRAM, +50mV offset, +50% power, fan speed 80%:
> Full clock stability in 3DMark 11; final score is 16380; average load temp is 72 degrees.
> 
> 1200MHz core / 1500MHz VRAM, +100mV offset, +50% power, fan speed 80%:
> Gradual throttling on the 2nd test to 1000MHz when amp draw is >140A; score 14999; average load temp is 78 degrees.
> 
> This is really weird; I thought Sapphire made this card's performance envelope independent of fan speed. Anyone have ideas?
> 
> After more reading, it seems the default BIOS on the 290 is the Silent profile. Is there a chance anyone has an Uber BIOS equivalent for this card? I have a hunch it would remove the throttling issue. Also, how are the watercoolers getting around this issue, since there's technically 0% fan speed?
> 
> Ack, overclocking is so complicated now, and I thought Fermi was a pain in the ass with the Power Limiter.


What program are you using to OC?

Ditch 14.X drivers if you're using them, scrub with DDU so there aren't any driver conflicts and try 13.12.

Whether you use Sapphire Trixx or MSI AB, make sure you disable ULPS and force constant voltage to help with stability. Keep the power limit at +50%.

Your score probably dropped from the 1150 to the 1200 run because of the additional voltage required. For 1200MHz core on air, you may need a little more than +100mV, depending on how good a card you have. It couldn't hurt to try reducing your VRAM clocks; they really are second fiddle to the core on these cards with the ginormous 512-bit bus.

What temperatures are you getting for the core and *VRM1* during benchmarks? The card shouldn't throttle until it surpasses 95C on the core. Issues seem to arise once VRM1 temps are in the 90C+ range.

If you're running a custom fan profile the BIOS switch setting shouldn't make a difference.

Water cools everything enough to make the card more efficient than on air, reducing voltage/power requirements. My voltage requirements dropped when I went to water, and it gave me room to push the card a little further. I run out of overclocking headroom before I run into any temperature limits; my card goes up to 1300/1700 with +200mV.
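For context on why memory clocks play second fiddle on these cards: GDDR5 transfers 4 bits per pin per memory clock, so on Hawaii's 512-bit bus even stock clocks supply enormous bandwidth. A quick sketch of the arithmetic:

```python
# GDDR5 effective data rate is 4x the reported memory clock, and the bus is
# 512 bits (64 bytes) wide on the 290/290X.
def bandwidth_gbs(mem_clock_mhz, bus_bits=512):
    """Peak memory bandwidth in GB/s for a GDDR5 card."""
    return mem_clock_mhz * 1e6 * 4 * (bus_bits / 8) / 1e9

print(bandwidth_gbs(1250))  # stock 290/290X: 320.0 GB/s
print(bandwidth_gbs(1500))  # common OC: 384.0 GB/s
```

A 1250 to 1500 memory overclock is a 20% bandwidth bump on top of an already huge 320 GB/s, and the core simply can't consume most of it — which matches the small benchmark gains people see from memory overclocks in this thread.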


----------



## Dasboogieman

MSI Afterburner without CCC support. VRM temperatures top out at 85 degrees. Disabled ULPS, but the throttling still persists. It's a fresh 13.12 install, since I got rid of the NVIDIA drivers.

So far, 1150MHz core / 1400MHz RAM +25mV with 55% fan speed seems to be the most consistent.
It seems to me this is an issue with tripping the board power limit; the higher temperatures from air cooling vs water cooling may make the difference.

Ordered a backplate, so hopefully the VRMs are happier. Might actually go watercooling if this throttling shows up in games later on.


----------



## Roboyto

Quote:


> Originally Posted by *Dasboogieman*
> 
> MSI Afterburner without CCC support. VRM temperatures top out at 85 degrees. Disabled ULPS, but the throttling still persists. It's a fresh 13.12 install, since I got rid of the NVIDIA drivers.
> 
> So far, 1150MHz core / 1400MHz RAM +25mV with 55% fan speed seems to be the most consistent.
> It seems to me this is an issue with tripping the board power limit; the higher temperatures from air cooling vs water cooling may make the difference.
> 
> Ordered a backplate, so hopefully the VRMs are happier. Might actually go watercooling if this throttling shows up in games later on.


That is a large OC with only +25mV; my card needed +75mV to reach that high on the core when I was running the reference cooler. I did get my card up to 1215/1675 +125mV with the reference cooler, but I do have an above-average performer.

I don't think you've hit the power limit yet at that low a voltage; see pic below.



And that is with my little 650W Rosewill Capstone PSU









Maybe give Trixx a shot; I had issues with Afterburner right out of the gate.

I'm going to be adding thermal pads under my backplate in the near future to see if they make a difference or not. For the sake of efficiency, it couldn't hurt to upgrade the thermal pads on your air cooler; they dropped VRM1 temps with my waterblock by over 20%. Fujipoly Ultra Extreme: http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html

http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures


----------



## Dasboogieman

1.4V? Holy crap!!!

How are you guys getting a consistent clock speed?
Anything above 1.23V sets off the throttling for me, and this is just 3DMark 11. Performance actually goes down for me.
Defective card perhaps? My ASIC quality is 78.9%.

EDIT: Sapphire Trixx seems to have done the trick; no more throttling even with +100mV and the fan on 25%.


----------



## VulgarDisplay

Does anyone else get artifacts in Google Chrome with any memory overclock at all? I get these strange lines across my page while scrolling if I overclock the memory on my GPU.

R9 290 with 14.3.


----------



## Roboyto

Quote:


> Originally Posted by *VulgarDisplay*
> 
> Does anyone else get artifacts in google chrome with any memory overclock at all? I get these strange lines across my page while scrolling if I overclock the memory on my GPU.
> 
> R9 290 with 14.3.


I'm running Firefox and I've only noticed it a handful of times in the last 10 weeks or so. I'm on 13.12.

I usually return the card to stock clocks when I'm not gaming, so it may not be an OC issue. I'll have to pay attention next time it happens and check whether the card is OC'd or not.


----------



## DrClaw

Quote:


> Originally Posted by *centvalny*
> 
> 
> 
> http://imgur.com/sOSVvdY
> 
> 
> 
> 
> 
> http://imgur.com/ejW1zWq


Wow buddy, that's a beautiful looking 290X, so nice


----------



## The Mac

I got them at stock on all the 14.x series.


----------



## Roboyto

Quote:


> Originally Posted by *Dasboogieman*
> 
> 1.4V? Holy crap!!!
> 
> How are you guys getting a consistent clock speed?
> Anything above 1.23V sets off the throttling for me, and this is just 3DMark 11. Performance actually goes down for me.
> Defective card perhaps? My ASIC quality is 78.9%.
> 
> EDIT: Sapphire Trixx seems to have done the trick; no more throttling even with +100mV and the fan on 25%.


Yes, 1.4V 

Glad your problem has been solved...Now you should try and push it a little further... ^_^


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *VulgarDisplay*
> 
> Does anyone else get artifacts in google chrome with any memory overclock at all? I get these strange lines across my page while scrolling if I overclock the memory on my GPU.
> 
> R9 290 with 14.3.


It does it in Explorer too


----------



## chalkypink

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> It does it in explorer too


I had these in my browsers as well, but they seem not to happen anymore since rolling back to 13.12, even at 1500MHz.


----------



## Jflisk

Quote:


> Originally Posted by *VulgarDisplay*
> 
> Does anyone else get artifacts in google chrome with any memory overclock at all? I get these strange lines across my page while scrolling if I overclock the memory on my GPU.
> 
> R9 290 with 14.3.


Vulgar, it looks like there is something wrong with your GPU core. You might want to start an RMA.


----------



## VulgarDisplay

Quote:


> Originally Posted by *Jflisk*
> 
> Vulgar, it looks like there is something wrong with your GPU core. You might want to start an RMA.


How did we go from a memory overclock causing artifacts, which other people say they get as well, to me needing to RMA my GPU because the core is bad? Does not compute.


----------



## Jflisk

Quote:


> Originally Posted by *VulgarDisplay*
> 
> How did we go from a memory overclock causing artifacts that other people say they get also to me needing to RMA my GPU because the core is bad? Does not compute.


Okay, you have artifacts, correct?

Usually artifacting is from a bad GPU core.

I am running at 1060/1350, trying Google Chrome now.

No strange lines in Google Chrome. I have shut off ULPS though.

Try shutting off ULPS, then try it.


----------



## hotrod717

Received my replacement Lightning today -


----------



## Paul17041993

Quote:


> Originally Posted by *VulgarDisplay*
> 
> Does anyone else get artifacts in google chrome with any memory overclock at all? I get these strange lines across my page while scrolling if I overclock the memory on my GPU.
> 
> R9 290 with 14.3.


Quote:


> Originally Posted by *Roboyto*
> 
> I'm running Firefox and I've noticed it only a handful of times in the last 10 weeks or so. I'm on 13.12
> 
> I usually return card to stock clocks when I'm not gaming so it may not be an OC issue. I'll have to pay attention next time it happens and check if card is OC'D or not


Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> It does it in explorer too


Check your idle and in-browser memory clocks; it's likely a stability issue triggered by the browsers (Flash, JavaScript and/or HTML5 hardware acceleration). The memory on these cards is very dependent on the actual core voltage; if it's running max memory clocks while the core is throttled down (lack of load), that's a very bad thing...
Quote:


> Originally Posted by *chalkypink*
> 
> I had these in my browsers as well, but they seem not to happen anymore since rolling back to 13.12, even at 1500MHz.


Your clock control is working perfectly fine, then.








Quote:


> Originally Posted by *Jflisk*
> 
> Vulgar, it looks like there is something wrong with your GPU core. You might want to start an RMA.


Problems occurring while using hardware outside design specifications, and/or use of unauthorized 3rd-party tools or modifications, =/= grounds for RMA.


----------



## Roboyto

Quote:


> Originally Posted by *Paul17041993*
> 
> Check your idle and in-browser memory clocks; it's likely a stability issue triggered by the browsers (Flash, JavaScript and/or HTML5 hardware acceleration). The memory on these cards is very dependent on the actual core voltage; if it's running max memory clocks while the core is throttled down (lack of load), that's a very bad thing...


I will have to pay better attention when it happens again. Every time I have seen the artifacts, it is only a small amount of black squares/rectangles, and scrolling down the page and back up removes them.

Looking at clocks right now and they are 305/150.


----------



## Dasboogieman

OK, the overclocks are looking good so far.

1150MHz core / 1500MHz memory is stable with +75mV in a 1-hour EVGA OC Scanner run.

I would want to push further, but I'm hitting 95-degree VRM temperatures at this voltage, so it seems the Tri-X doesn't cool this particular component very well. I might see if the padded backplate will allow 1200MHz / 1500MHz on +125mV.

Gonna go for a 3-hour EVGA OC Scanner run to fully verify.


----------



## Roboyto

Quote:


> Originally Posted by *Dasboogieman*
> 
> OK, the overclocks are looking good so far.
> 
> 1150MHz core / 1500MHz memory is stable with +75mV in a 1-hour EVGA OC Scanner run.
> 
> I would want to push further, but I'm hitting 95-degree VRM temperatures at this voltage, so it seems the Tri-X doesn't cool this particular component very well. I might see if the padded backplate will allow 1200MHz / 1500MHz on +125mV.
> 
> Gonna go for a 3-hour EVGA OC Scanner run to fully verify.


Very nice, glad you're moving forward after ditching AB.

Not familiar with EVGA OC Scanner; what kind of load is it putting on the card? Something similar to Furmark? If so, that's not really a great judge of the stress games will put it through. Even if a game pushes the GPU to 100%, Furmark-esque tests are going to give worse temps. Also, if it's heavy in tessellation, that will stress your card even more, since AMD doesn't compute it nearly as well as Nvidia.

Run some 3DMark or Unigine benches. I have found the FFXIV benchmark to be a better test than anything as I get the lowest stable clocks in it.

Either way, may want to consider that thermal pad upgrade :-D


----------



## Roy360

Finally decided to watercool my ASUS R9 290 DCU II. Doesn't look like I can return it, so I guess I have to skip the TF version off amazon.ca for $450.

Does anyone know if the EK backplates are worth it or not? I know Koolance's backplate is active, but I'm not sure if the Koolance block will line up with my other two EKs.

Usage: For now at least, I'm going to be mining with these cards 24/7. The aim is to hit $750, so I can pay off the $150 + $600 I paid on top of MSRP to get this beast, as well as the cost of the 3 waterblocks. =)

P.S. My internal temps are around the 60-degree range with two cards running, and the AX1200 is absolutely burning.
Currently have:
2x120 (stacked) exhaust
4x120 intake
2x120 recycling case air
1x120 empty fan slot

Maybe I should turn one of the radiators into an exhaust?


----------



## hotrod717

Quote:


> Originally Posted by *Roboyto*
> 
> Very nice, glad you're moving forward after ditching AB.
> 
> Not familiar with EVGA OC Scanner; what kind of load is it putting on the card? Something similar to Furmark? If so, that's not really a great judge of the stress games will put it through. Even if a game pushes the GPU to 100%, Furmark-esque tests are going to give worse temps. Also, if it's heavy in tessellation, that will stress your card even more, since AMD doesn't compute it nearly as well as Nvidia.
> 
> Run some 3DMark or Unigine benches. I have found the FFXIV benchmark to be a better test than anything as I get the lowest stable clocks in it.
> 
> Either way, may want to consider that thermal pad upgrade :-D


That's because EVGA OC Scanner is predominantly used for Nvidia cards, as is Precision X. 10 minutes on OC Scanner should tell you what you need to know. AB is the "go to" tool for AMD. Not sure what the "ditched" reference is to, but it is easily the most tweakable of all OC software. If you know what to do...


----------



## Roboyto

Quote:


> Originally Posted by *hotrod717*
> 
> That's because EVGA OC Scanner is predominantly used for Nvidia cards, as is Precision X. 10 minutes on OC Scanner should tell you what you need to know. AB is the "go to" tool for AMD. Not sure what the "ditched" reference is to, but it is easily the most tweakable of all OC software. If you know what to do...


He is not the first person who has had issues overclocking these 290(X) cards with AB. I suggest Trixx as an alternative because I also had issues with AB getting my card past 1030 on the core. Once I switched to Trixx, I found my card to be quite the 'monster', clocking as high as 1315/1700. Getting my card to those clocks, I must know something.

I don't have any personal vendetta with AB, I just use what works...and Trixx has always been better for me since my 4890.


----------



## The Mac

If I didn't need RTSS, I would probably use Trixx.

I find the AB+RTSS resource footprint to be a bit high.


----------



## Roboyto

Quote:


> Originally Posted by *Roy360*
> 
> Finally decided to watercool my ASUS R9 290 DCU II. Doesn't look like I can return it, so I guess I have to skip the TF version off amazon.ca for $450.
> 
> Does anyone know if the EK backplates are worth it or not? I know Koolance's backplate is active, but I'm not sure if the Koolance block will line up with my other two EKs.
> 
> Usage: For now at least, I'm going to be mining with these cards 24/7. The aim is to hit $750, so I can pay off the $150 + $600 I paid on top of MSRP to get this beast, as well as the cost of the 3 waterblocks. =)
> 
> P.S. My internal temps are around the 60-degree range with two cards running, and the AX1200 is absolutely burning.
> Currently have:
> 2x120 (stacked) exhaust
> 4x120 intake
> 2x120 recycling case air
> 1x120 empty fan slot
> 
> Maybe I should turn one of the radiators into an exhaust?


I emailed XSPC about putting thermal pads on the back of their backplate, and they said it would assist in cooling the VRMs. I would imagine the same is true for the EK backplate. I wouldn't suggest TIM on the backside of the board, though; too much of a mess.

Haven't gotten around to it yet, but I will be posting results once I do.

I had my front radiator running as an intake, and it was nuking the inside of my case. I turned the fans around, so now both radiators are exhausting. It didn't affect core/VRM or CPU temperatures worth noting. However, putting my hand on the card, cables, RAM, etc. inside the case, I can tell a significant difference. I didn't monitor temps before and after, so I have no hard data... but by feel I know it made quite a difference.


----------



## Jflisk

Quote:


> Originally Posted by *Roy360*
> 
> Finally decided to watercool my ASUS R9 290 DCU II. Doesn't look like I can return it, so I guess I have to skip the TF version off amazon.ca for $450.
> 
> Does anyone know if the EK backplates are worth it or not? I know Koolance's backplate is active, but I'm not sure if the Koolance block will line up with my other two EKs.
> 
> Usage: For now at least, I'm going to be mining with these cards 24/7. The aim is to hit $750, so I can pay off the $150 + $600 I paid on top of MSRP to get this beast, as well as the cost of the 3 waterblocks. =)
> 
> P.S. My internal temps are around the 60-degree range with two cards running, and the AX1200 is absolutely burning.
> Currently have:
> 2x120 (stacked) exhaust
> 4x120 intake
> 2x120 recycling case air
> 1x120 empty fan slot
> 
> Maybe I should turn one of the radiators into an exhaust?


I have an R9 290X under water with the backplate, running 45C core and 53C VRM, with 2x240 and 1x120 thick rads, all running push. One more R9 290X coming; I will post temps when it's up and running. EK waterblock and backplate. I think the burn-in on the thermal pads is 2 days; lost 2C in two days. This is with mining scrypt 24/7.


----------



## Roboyto

Quote:


> Originally Posted by *The Mac*
> 
> If I didn't need RTSS, I would probably use Trixx.
> 
> I find the AB+RTSS resource footprint to be a bit high.


I use RTSS with HWiNFO for display in games and run Trixx for OC. Quite a nice feature once you get it all tweaked to your liking.


----------



## The Mac

I already run AIDA64 for most of my sensors, but I'd need to use Fraps for FPS, so it's either Trixx+Fraps or RTSS+AB.


----------



## hotrod717

Quote:


> Originally Posted by *Roboyto*
> 
> He is not the first person who has had issues overclocking these 290(X) cards with AB. I suggest Trixx as an alternative because I also had issues with AB getting my card past 1030 on the core. Once I switched to Trixx, I found my card to be quite the 'monster', clocking as high as 1315/1700. Getting my card to those clocks, I must know something.
> 
> I don't have any personal vendetta with AB, I just use what works...and Trixx has always been better for me since my 4890.


To each their own. Always use what works. But AB isn't limited to +200mV and can do what you want it to. I have seen other people post that Trixx has worked better for them as well. I haven't seen it in the few 290Xs that I've owned, though. I can only think it may have something to do with the different voltage controllers that different manufacturers use.


----------



## Roboyto

Quote:


> Originally Posted by *hotrod717*
> 
> To each their own. Always use what works. But AB isn't limited to +200mV and can do what you want it to. I have seen other people post that Trixx has worked better for them as well. I haven't seen it in the few 290Xs that I've owned, though. I can only think it may have something to do with the different voltage controllers that different manufacturers use.


Sounds reasonable enough; the different controllers never really crossed my mind. +200mV gets my card up over 1.4V. Not sure where it is after droop, but it's sucking down some juice, as seen here:



I might be able to push my card a little further with that ASUS BIOS and GPU Tweak, but I don't want to tempt fate with my lucky draw in the Silicon Lottery.

How's that Lightning? It looks more promising than the Matrix


----------



## Dasboogieman

I've been mostly an NVIDIA guy before this, but the 290 was just too cheap and good to pass up.
MSI Afterburner has always been my go-to program; quite frankly, if it weren't for the weird things it was doing to my Sapphire 290, I would go back to it. The interface is fantastic and the reliability is unrivaled.
However, when overclocked (a relatively modest 1150MHz +25mV), MSI Afterburner was forcing the 290's clock speeds to full power, then it would slowly throttle to 1000MHz or lower during 3DMark 11 and sometimes during games. I believe it's due to the power limit knob not working on this GPU for some reason.

TBH, neither program is ideal at the moment.

On Sapphire Trixx, I get a consistent 1150MHz with a more aggressive +75mV under 100% load regardless of application, but I miss the 100%-clock-speed-at-all-times behavior that AB has. Trixx has been working great except for Skyrim, which is still a little problematic. Since Skyrim doesn't need to load the GPU heavily to crank out 60FPS, the core adapts its clock speed to hit that target. AB gets around this by forcing 100% speed at all times (with the relatively minor throttling to 1000MHz, but no lower), but Trixx swings between 600MHz-900MHz (seemingly related to GPU load), so I still get 60FPS but it's not very smooth due to the constant transitioning.

Yeah, I liked EVGA OC Scanner since it can detect artifacts that the human eye can miss. It's basically the modern version of the granddaddy, ATITool. However, it produces an OpenGL load, so the temperatures are definitely exaggerated.

Gonna try Ghost Recon: Future Soldier to test this overclock; this bastard of a game actually crashes on my GTX 570's factory overclocks (I had to either pump voltage or go to 100% reference speeds), so I consider it the ultimate OC reliability test.


----------



## The Mac

Power control doesn't work on 14.x; it works fine on 13.12, however.

It's not AB's fault, it's the drivers.


----------



## bigbangSG

Anyone here using an MSI R9 290 with a waterblock?
Wanted to know which waterblock it can use.


----------



## Jflisk

Quote:


> Originally Posted by *bigbangSG*
> 
> Anyone here using an MSI R9 290 with a waterblock?
> Wanted to know which waterblock it can use.


Find the EK waterblock and backplate for your card.

Start here: find your card, put it under water, and enjoy.

http://www.coolingconfigurator.com/

I have an EK block and backplate for the R9 290X; about $150.00 shipped.


----------



## Arizonian

Quote:


> Originally Posted by *NixZiZ*
> 
> Sorry, I never mentioned the card type! It's a 290, not a 290x.


No worries - corrected.









Quote:


> Originally Posted by *arrow0309*
> 
> Add me too, please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Validation*
> 
> Sapphire R9 290 Tri-X OC on stock cooling
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/294838/the-cooler-master-690-club/19240#post_22050031
> 
> @Dasboogieman
> Use the 13.12 drivers (no throttling) and run the Unigine Heaven 4.0 in "Custom" 1920x1080 (full screen) and settings set on Extreme
> Show us the score of the benchmark at 1200/1500


Congrats - added









Quote:


> Originally Posted by *hotrod717*
> 
> Received my replacement Lightning today -
> 
> 
> Spoiler: Warning: Spoiler!


Congrats on the Lightning - updated









Good job on the *MSI R9 290X Lightning Thread* .


----------



## DerekParker

Quote:


> Originally Posted by *Roboyto*
> 
> EK lists a block as 'Coming Soon' on their Cooling Configurator website


I know, I can't wait man


----------



## Roboyto

Quote:


> Originally Posted by *DerekParker*
> 
> I know I cant wait man


I would keep a sharp eye out, the first batch of them may sell out quickly


----------



## Dasboogieman

Quote:


> Originally Posted by *arrow0309*
> 
> Add me too, please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Validation*
> 
> Sapphire R9 290 Tri-X OC on stock cooling
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/294838/the-cooler-master-690-club/19240#post_22050031
> 
> @Dasboogieman
> Use the 13.12 drivers (no throttling) and run the Unigine Heaven 4.0 in "Custom" 1920x1080 (full screen) and settings set on Extreme
> Show us the score of the benchmark at 1200/1500


1200mhz/1500mhz run with +100mv 13.12 drivers. Can't say how reliable this is for regular use.


----------



## rdr09

Quote:


> Originally Posted by *Dasboogieman*
> 
> 1200mhz/1500mhz run with +100mv 13.12 drivers. Can't say how reliable this is for regular use.


exact same as mine (compare graphics scores) . . .

http://www.3dmark.com/3dm11/8201733

here is what you'll get at 1300 +200 offset . . .

http://www.3dmark.com/3dm11/7716320


----------



## Roboyto

Quote:


> Originally Posted by *Dasboogieman*
> 
> 1200mhz/1500mhz run with +100mv 13.12 drivers. Can't say how reliable this is for regular use.


Those are my everyday usage settings, except I am able to use 87mV; The water undoubtedly helps.

I would say you have an above average performer to get those clocks on air. You need a waterblock and soon!

There's mine: 1200/1500 +100mV with Tessellation forced OFF in CCC. Usually good for an 8-12% graphics score boost.

http://www.3dmark.com/3dm11/8205841


----------



## cephelix

Quote:


> Originally Posted by *Roboyto*
> 
> I would keep a sharp eye out, the first batch of them may sell out quickly


Yeah, hoping to get the EK WB for mine as well... hoping I could reuse the backplate that comes with the MSI R9 290 Gaming though.


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> Yeah,hoping to get the EK WB for mine as well...hoping I could reuse the backplate that comes with the msi r9 290 gaming though.


Good chance you could...worst case a little ingenuity may be required


----------



## cephelix

Quote:


> Originally Posted by *Roboyto*
> 
> Good chance you could...worst case a little ingenuity may be required


But I don't want ingenuity! I want a straightforward application right out of the box!!..lol


----------



## Arizonian

Quote:


> Originally Posted by *DerekParker*
> 
> I know I cant wait man


I'm curious what temps you're seeing in crossfire on the top card with the MSI Gaming cards on air? Full load / idle?


----------



## hotrod717

Quote:


> Originally Posted by *Roboyto*
> 
> Sounds reasonable enough
> 
> 
> 
> 
> 
> 
> 
> , the different controllers never really crossed my mind
> 
> 
> 
> 
> 
> 
> 
> . 200mV gets my card up over 1.4V Not sure where it is after droop but it's sucking down some juice as seen here:
> 
> 
> 
> I might be able to push my card a little further with that ASUS BIOS and GPU Tweak, but I don't want to tempt fate with my lucky draw in the Silicon Lottery.
> 
> How's that Lightning? It looks more promising than the Matrix


Yeah, after Vdroop I would guesstimate about 1.36v, depending on your ASIC %. Exactly the reason a lot of people, including myself, use a DMM to measure actual voltage. Not just for GPU, but CPU as well. So far, so good on the second one. Don't really like to push too far on air, but still need to see what it can do. Quick and dirty 1150 on stock voltage so far. Didn't really have much time with it yet. People have been getting some really nice OC's on air however, and the heatsink/fans are doing a great job of keeping this card cool.

Quote:


> Originally Posted by *Dasboogieman*
> 
> I've been mostly an NVIDIA guy before this but the 290 was just too cheap and good to pass up.
> MSI afterburner has always been my go to program since forever, quite frankly if it weren't for the weird things it was doing to my Sapphire AMD 290, I would go back to it. The interface is fantastic and the reliability is unrivaled.
> However, when overclocked (relatively modest 1150mhz +25mv) MSI afterburner was forcing the 290 clockspeeds to full power then it would slowly throttle to 1000mhz or lower during 3dmark 11 and sometimes during games. I believe it's due to the power limit knob not working on this GPU for some reason.
> 
> TBH neither programs are ideal at the moment.
> 
> On the Sapphire TRIXX, I get a consistent 1150mhz with more aggressive +75mv under 100% load regardless of application but I miss the 100% clockspeed at all times that AB has. TRIXX has been working great except for Skyrim which is still a little problematic. Since it doesn't need to load the GPU heavily to crank 60FPS the core adapts its clockspeed to achieve this target, AB gets around this by forcing 100% speed at all times (with the relatively minor throttling to 1000mhz but no lower) but Trixx swings between 600mhz-900mhz (seemingly related to the GPU loading) so I still get 60 FPS but its not very smooth due to the constant transitioning.
> 
> yea I liked EVGA OC scanner since it can detect artifacts that the human eye can miss. Its basically the modern version of the granddaddy, ATITOOL. However, it produces OpenGL loading so the temperatures are definitely exaggerated.
> 
> Gonna try Ghost Recon Future Soldier to test this overclock, this bastard of a game actually crashes on my GTX 570's factory overclocks (had to either pump voltage or go to 100% reference speeds) so I consider it the ultimate OC reliability test.


Quote:


> Originally Posted by *The Mac*
> 
> power control doesnt work on 14.x, it works fine on 13.12 however.
> 
> It not ABs fault, its the drivers.


^This. 13.12 seems to be the only driver to use for benching or getting the most out of 3rd party software.

Quote:


> Originally Posted by *Arizonian*
> 
> No worries - corrected.
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats on the Lightning - updated
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Good job on the *MSI R9 290X Lightning Thread* .


Thanks! Hopefully we'll get an influx of info once the waterblock hits retailers and people get them under water. Had some people post nice LN2 runs though.


----------



## INCREDIBLEHULK

Question that I'd like some opinions on.

If I wanted to run a crossfire or trifire setup, what do you think would give me the greatest performance

(mind you I own all of the hardware already)

3 x 290 same cards
2 x 290 same cards
1x 290x + 1x 290 mixed cards

I'm curious if the 290X+290 will cause issues from the cards' capabilities being a bit off. Would this be a case where identical 290's would outperform a mismatched 290+290X of different clocks?
I've read a lot that tri-fire doesn't scale as well; however, it will still be an improvement over crossfire regardless of the situation, correct?

I don't know if I'm just going to buy some waterblocks and use a multiple card setup or do a single card setup. I'm sort of itching to buy an MSI 290X Lightning, regardless of all these 780 Ti's performing amazingly on benchmarks/games.


----------



## spongeyturtle

So this is a little project I had over the weekend- using a closed loop watercooler (an Antec Kuhler 620 in this case) and using it to cool my r9 290...

took the whole day but I am pleased with the results-

idle: [email protected] 35C [email protected] [email protected]
Full Load: [email protected] [email protected] [email protected]











Thanks for looking


----------



## Matt-Matt

Quote:


> Originally Posted by *spongeyturtle*
> 
> So this is a little project I had over the weekend- using a closed loop watercooler (an Antec Kuhler 620 in this case) and using it to cool my r9 290...
> 
> took the whole day but I am pleased with the results-
> 
> idle: [email protected] 35C [email protected] [email protected]
> Full Load: [email protected] [email protected] [email protected]
> 
> Thanks for looking


Looking really good! Awesome temps too for such a cheap mod!

Did you leave any metal off the actual heatsink in there, or just left it how it was in the pictures? I think adding some better cooling to the VRM's would improve it a bit.


----------



## spongeyturtle

Quote:


> Originally Posted by *Matt-Matt*
> 
> Looking really good! Awesome temps too for such a cheap mod!
> 
> Did you leave any metal off the actual heatsink in there, or just left it how it was in the pictures? I think adding some better cooling to the VRM's would improve it a bit.


Thanks,

i just used an old pentium 1 heatsink i had lying around and cut it in half and then stuck it on using themal tape. i have attached this to vrm2 and left vrm1 with the stock heatsink (the one that is under the stock fan)

seems to do the job quite well but got some coppers on the way and gonna get a backplate too


----------



## Roy360

How do you overclock crossfired cards? I currently have two R9 290s, going for 3 soon. But MSI Afterburner keeps setting my Core and Memory clocks down to zero when I hit apply profile.

The overclock that I"m trying to apply is:
1175/1350/+69mV/+30

That's the highest I could get on my single XFX card. Not sure about the 2nd and 3rd card (but I figure I could use this as my baseline).

GPU-Z just says it detects two cards running in crossfire at 947/1250.

P.S. How much would x4 gen 3.0 hurt my performance? Unfortunately, my motherboard only does 8/8/4 gen 3.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Roy360*
> 
> how do you overclock crossfired cards? I currently have two R9 290s, going for 3 soon. But MSI Afterburner keepts setting my Core and Memory clocks down to zero when I hit apply profile.
> 
> The overclock that I"m trying to apply is:
> 1175/1350/+69mV/+30
> 
> that's the highest I could get on my single XFX card. Not sure about the 2nd and 3rd card. (but I figure I could use this as my base line)
> 
> GPUZ just says it detects two cards running in crossfire at 947/1250.
> 
> *p.s how much would x4 gen 3.0 hurt my performance? Unfortunately, my motherboard only does 8/8/4 gen 3.*


You quite certain that it only does that? There would have to be an x16 slot somewhere, it being a ROG board and all that stuff........


----------



## arrow0309

Got a nice Heaven 4.0 score with my 290 Tri-X, way better than my old 7970:

http://s4.postimg.org/rh2ii907f/heaven_2014_04_08_08_57_03_09.jpg
http://s24.postimg.org/vkga45s0j/Heaven_4_0.jpg

However, I had to push the VDDC up to 156 (1.325v) and the PL to 50. As I can see you all manage to do a similar OC with lower VDDC, am I supposed to worry?
My asic quality is 81.5%








The temps seem fine, also no throttling with the catalyst 13.12









I wonder if the memory speed will also depend on the VDDC; maybe I should try with a lower memory clock to see if I could maintain the GPU OC with less voltage.


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> You quite certain that it only does that ? there would have to be a x16 slot somewhere being a ROG board and all that stuff........


3 x PCIe 3.0/2.0 x16 (x16 or dual x8 or x8/x4/x4)

http://www.asus.com/ROG_ROG/MAXIMUS_V_FORMULA/specifications/

I was expecting x16/x16/x8 or something actually.

Quote:


> Originally Posted by *Roy360*
> 
> how do you overclock crossfired cards? I currently have two R9 290s, going for 3 soon. But MSI Afterburner keepts setting my Core and Memory clocks down to zero when I hit apply profile.
> 
> p.s how much would x4 gen 3.0 hurt my performance? Unfortunately, my motherboard only does 8/8/4 gen 3.


In Afterburner you have an option to synchronise settings for similar graphics processors, check that....if that doesn't work then you might need to re-install AB or get a newer version.

But x4 shouldn't really hurt the bandwidth that much on PCIe 3.0; it's about the same as an x8 PCIe 2.0 slot.
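The rough equivalence of a PCIe 3.0 x4 slot and a PCIe 2.0 x8 slot can be sanity-checked with a quick calculation from the published per-lane rates (gen 2 uses 8b/10b encoding, gen 3 uses 128b/130b); this is just a back-of-the-envelope sketch:

```python
# Usable one-direction bandwidth per lane, in GB/s, after encoding overhead:
# PCIe 2.0: 5 GT/s with 8b/10b encoding    -> 0.5 GB/s per lane
# PCIe 3.0: 8 GT/s with 128b/130b encoding -> ~0.985 GB/s per lane
PER_LANE_GBPS = {
    "2.0": 5.0 * (8 / 10) / 8,
    "3.0": 8.0 * (128 / 130) / 8,
}

def slot_bandwidth(gen: str, lanes: int) -> float:
    """Approximate usable bandwidth of a PCIe slot in GB/s (one direction)."""
    return PER_LANE_GBPS[gen] * lanes

if __name__ == "__main__":
    print(f"PCIe 3.0 x4: {slot_bandwidth('3.0', 4):.2f} GB/s")
    print(f"PCIe 2.0 x8: {slot_bandwidth('2.0', 8):.2f} GB/s")
```

Both work out to roughly 4 GB/s, which is why the x4 gen 3 slot isn't the handicap it looks like on paper.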


----------



## Sgt Bilko

Quote:


> Originally Posted by *arrow0309*
> 
> Got a nice Heaven 4.0 score with my 290 Tri-X, way better than my old 7970:
> 
> http://s4.postimg.org/rh2ii907f/heaven_2014_04_08_08_57_03_09.jpg
> http://s24.postimg.org/vkga45s0j/Heaven_4_0.jpg
> 
> However I had to push the vddc up to 156 (1.325v) and the pl to 50, as I can see you all manage to do similar oc with lower vddc, am I supposed to worry?
> My asic quality is 81.5%
> 
> 
> 
> 
> 
> 
> 
> 
> The temps seem fine, also no throttling with the catalyst 13.12
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I wonder if the memory speed will also pend on the vddc, maybe I should try with a lower memory to see if I could maintain the gpu oc with less voltage


Upping the core voltage will help with Memory oc a bit. I'd drop it down to 1400 mem and try again, my 290's will do 1200 on the core at +110mV


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 3 x PCIe 3.0/2.0 x16 (x16 or dual x8 or x8/x4/x4)
> 
> http://www.asus.com/ROG_ROG/MAXIMUS_V_FORMULA/specifications/
> 
> i was expecting x16/x16/x8 or something actually. But x4 shouldn't really hurt the Bandwitdth that much on PCIe 3.0, same as a x8 PCIe 2.0 slot


That's why I went 2011 / X79 .... Lots of bandwidth


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> That's why I went 2011 / X79 .... Lots of bandwith


Beat that Catzilla score yet?









just messing with ya


----------



## Roy360

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> You quite certain that it only does that ? there would have to be a x16 slot somewhere being a ROG board and all that stuff........


As far as I know, only the V and VI Extreme can do 8/8/8/8
Quote:


> Originally Posted by *Sgt Bilko*
> 
> 3 x PCIe 3.0/2.0 x16 (x16 or dual x8 or x8/x4/x4)
> 
> http://www.asus.com/ROG_ROG/MAXIMUS_V_FORMULA/specifications/
> 
> i was expecting x16/x16/x8 or something actually.
> In Afterburner you have an option to sychronise settings for similar graphics processors, check that....if that doesn't work then you might need to re-install AB or get a newer version.
> 
> but x4 shouldn't really hurt the Bandwitdth that much on PCIe 3.0, same as a x8 PCIe 2.0 slot


Damn, that really sucks. (What is the PLX chip on this motherboard even doing?)

I'm using beta 19, but I'll check the settings to see if there is anything to change.

P.S. Did anyone figure out what the aux voltage is actually for? I know some people said they got fewer black screens with a higher aux voltage.


----------



## DolanTheDuck

Has anyone ever had the problem where Trixx custom fan control doesn't react to higher temperatures? Mine stays at 50% fan speed (not fixed mode) at any temperature, and when I close the game or stress test, Trixx suddenly works at the fan curve I have set.


----------



## Roboyto

Quote:


> Originally Posted by *hotrod717*
> 
> Yeah after Vdroop I would guesstimate about 1.36v depending on your asic% Exactly the reason a lot of people , including myself, use a DMM to measure actual voltage. Not just for GPU, but CPU as well. So, far, so good on the second one. Don't really like too push too far on air, but still need to see what it can do. Quick and dirty 1150 on stock voltage so far. Didn't really have much time with it yet. People have been getting some really nice oc's on air however, and the heatsink /fans are doing a great job of keeping this card cool.


I got mine to 1215/1600, I think, with 125mV with the stock blower. Temps were fair, but the noise was.. yeah lol

Yeah, the Lightning really blows the pants off the Matrix when comparing the provided coolers. ASUS never fixed the issue from the DC2, where they're using the GK104 heat pipe design. Sure, most people who buy a Matrix are going to water cool it, if not use LN2, but it's the premise of buying a flagship item that has a design flaw.

1150 is pretty good for stock settings; what kind of voltage do they come with out of the box?


----------



## Roboyto

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Upping the core voltage will help with Memory oc a bit. I'd drop it down to 1400 mem and try again, my 290's will do 1200 on the core at +110mV


What he said. Mem clock is second fiddle on these cards big time, especially in 3DMark and Unigine.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Beat that Catzilla score yet?
> 
> 
> 
> 
> 
> 
> 
> 
> just messing with ya


No I haven't but ............







Bunch of fives ?


----------



## lawson67

Hey guys, can someone tell me where to find the power limit option in MSI Afterburner beta 19? I have 2x R9 290 cards.


----------



## cyenz

Just click the arrow on the right of the core voltage option


----------



## lawson67

Quote:


> Originally Posted by *cyenz*
> 
> Just click the arrow on the right of the core voltage option


That's what I have done in that picture, yet I see no power limit option?

Edit: I can see the power limit option now!..... I had to change skins


----------



## xseanx

Hi guys, I just got a good open-box deal on an ASUS R9 290X-DC2OC. After an hour playing Metro 2033 the highest VRM 1 temperature reading is 107C; is that acceptable? And the GPU temp is around 86-89C, isn't that too high for a non-reference card? Another question: how do you check if the card is throttling or not? Thanks.


----------



## Paul17041993

Quote:


> Originally Posted by *xseanx*
> 
> Hi guys, I just got a good deal open box ASUS R9 290X-DC2OC. After an hour playing Metro 2033 the VRM temperature 1 highest reading is 107c, is that acceptable ? and GPU temp around 86-89c, isn't that too high for a non-reference card? another question, how do u check if the card is throttling or not ? thanks.


Yea, that's relatively normal for the non-ref ASUS cards. It shouldn't really be throttling, but you can check in the CCC OverDrive settings: if the load is 100% but the core clock is varying below the max, then it's likely throttling, though at the same time it will also vary slightly depending on how heavy the game/load is.

Having a fan blowing towards the backplate could, I guess, bring the VRM temps down slightly, but otherwise I don't think there's much you can do about those temps without replacing the cooler completely...

Edit: oh, and the max acceptable VRM temp is usually ~120C for the ASUS non-ref I'm pretty sure; reference/others is usually ~110C I think (ceramic vs soft chokes etc.).
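The check described above (near-100% load but core clock sitting below the set maximum) can be automated against a sensor log. A minimal sketch, assuming a GPU-Z style CSV export; the column names used here are assumptions and will need adjusting to match whatever your monitoring tool actually writes:

```python
import csv

def find_throttle_events(log_path, max_clock=1000.0, load_thresh=95.0,
                         clock_col="GPU Core Clock [MHz]",
                         load_col="GPU Load [%]"):
    """Scan a sensor CSV and return rows where the card is under
    near-full load but running below its set core clock -- the
    throttling signature described above.

    Column names vary between tools/versions; adjust to match your log.
    """
    events = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                clock = float(row[clock_col])
                load = float(row[load_col])
            except (KeyError, ValueError):
                continue  # skip malformed or header-less rows
            if load >= load_thresh and clock < max_clock:
                events.append((row.get("Date", "?"), clock, load))
    return events
```

If the returned list is empty over a long gaming session, the card held its clocks; a cluster of entries points at power- or thermal-limit throttling.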


----------



## friend'scatdied

My experience with Hawaii so far has been that you have to design the airflow of your entire system around the heat the GPU puts out.

Normally with a mITX form factor I budget for an additional 10°C on the CPU/GPU temps over a half-decent mid-tower. This was the case with my aftermarket 670 and reference 780 Ti -- they simply ran hotter and louder than in reviews.

With Hawaii the difference is much greater than that, and there should be an emphasis on exhausting all of that hot air outside of your chassis lest it be recycled. My MSI Gaming 290X is hitting the high 80's as well, but it seems low positive-pressure airflow is not very sustainable with this kind of card.

Not sure how the 500D is in this regard.


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> Question that I'd like some opinions on.
> 
> If I wanted to run a crossfire or trifire setup, what do you think would give me the greatest performance
> 
> (mind you I own all of the hardware already)
> 
> 3 x 290 same cards
> 2 x 290 same cards
> 1x 290x + 1x 290 mixed cards
> 
> I'm curious if the 290x+290 will cause issues from the cards capabilities being a bit off. Would this be a case where identical 290's would outperform a mixmatched 290+290x of different clocks?
> I read much that tri-fire doesn't scale as well, however, it will still be an improvement over crossfire regardless of the situation it is put correct?
> 
> I don't know if I'm just going to buy some waterblocks and use a multiple card setup or do a single card setup. I'm sort of itchin to buy a msi 290x lightning, regardless all these 780ti's performing amazing on benchmarks/games


Can anyone speak from experience?







I can only trust some reviewers benchmarks so much.

Trying to re-quote my post so it doesn't get buried 6 feet under









http://www.techpowerup.com/forums/threads/crossfire-vs-sli-780ti-780-290x-290-x-dual-triple-quad.195818/

This looks interesting; makes me wonder if it's the exact same 7950/7970 deal. Will the 290 just be a clear winner vs the 290X on the performance/cost ratio?
I know the 7950 had a lot of headroom to overclock, but I don't know if the 290 is the same way with the 290X. I remember some 7950's running at almost 0.050 lower VDDC than a 7970, but these 290/290X's don't seem that way.
I can only search so much


----------



## Red1776

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> Quote:
> 
> 
> 
> Originally Posted by *INCREDIBLEHULK*
> 
> Question that I'd like some opinions on.
> 
> If I wanted to run a crossfire or trifire setup, what do you think would give me the greatest performance
> 
> (mind you I own all of the hardware already)
> 
> 3 x 290 same cards
> 2 x 290 same cards
> 1x 290x + 1x 290 mixed cards
> 
> I'm curious if the 290x+290 will cause issues from the cards capabilities being a bit off. Would this be a case where identical 290's would outperform a mixmatched 290+290x of different clocks?
> I read much that tri-fire doesn't scale as well, however, it will still be an improvement over crossfire regardless of the situation it is put correct?
> 
> I don't know if I'm just going to buy some waterblocks and use a multiple card setup or do a single card setup. I'm sort of itchin to buy a msi 290x lightning, regardless all these 780ti's performing amazing on benchmarks/games
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can anyone speak from experience?
> 
> 
> 
> 
> 
> 
> 
> I can only trust some reviewers benchmarks so much.
> 
> Trying to re-quote my post so it doesn't get buried 6 feet under
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/forums/threads/crossfire-vs-sli-780ti-780-290x-290-x-dual-triple-quad.195818/
> 
> This looks interesting, makes me wonder if its the exact same 7950/7970 deal. Will the 290 just be a clear winner vs the 290x from the performance/cost ratio?
> I know the 7950 had a lot of headroom to overclock but I don't know if the 290 is the same with the 290x. I remember some 7950's at almost vddc -0.050 than a 7970 but these 290/290x don't seem that way
> I can only search so much
Click to expand...

Well, first off, I am a reviewer, so...sorry about that 

However, you will get a slight bump going with the 2x 290 + 1x 290X, probably in the area of 3-4%;

at least that is what I observed when I was trying out mixing the 290(X) series.

Hope that helps, and let me know what results you get; I would be interested to see if you get close to the same results I did.


----------



## hotrod717

Quote:


> Originally Posted by *Roboyto*
> 
> I got mine to 1215/1600,i think, with 125mV with stock blower. Temps were fair, but the noise was.. yeah lol
> 
> Yeah, the lightning really blows the pants off the Matrix when comparing provided coolers. ASUS never fixed the issue from the DC2, where they're using the GK104 heat pipe design. Sure most people who buy a matrix are going to water cool it, if not use LN2, but it's the premise of buying a Flagship item that has a design flaw.
> 
> 1150 is pretty good for stock settings, what kind of voltage to they come with out of the box?


I'm on 12hr days right now and won't be able to get any real time with it until this weekend. Another quick and dirty run tonight at 1200MHz w/ +125mV. Ran a full loop of Valley artifact-free and only topped out at 61°. I've literally spent less than 30 min. OC'ing this card. I'll hook up the DMM first thing tomorrow and let you know what stock voltage is. Can't imagine it being much above 1.2-1.25v. As for the Matrix, it definitely holds a special place in my heart, as I had a 7970 Matrix that did 1385/1885 w/ 1.36v actual. Got really lucky with that one and have yet to own one that holds a candle to it. I have to say, I am a bit disappointed with the 290X version. Does it have hotwire connectors? Not that it worked well on the 7970 Matrix, but I haven't seen any info or pics that it does.


----------



## DeadlyDNA

So I went a little crazy and wanted to see how hard I could push my 290 quad setup. I have finished setting up a 4K Eyefinity to play with. I cannot post any benchmark screenies because they are too large, lol. Is it against TOS to link screenshots?


----------



## DerekParker

Quote:


> Originally Posted by *Arizonian*
> 
> I'm curious what temps your seeing in crossfire on the top card with the MSI Gaming cards on air? Full load / idle?


I have really only been playing BF4 so far since I put this thing together, and it often gets up into the 80's. I know they can run higher than that, but I still don't like it. Idle temps are usually at 41 Celsius most of the time; occasionally it goes up a few degrees. That's with all settings on ultra, but I am only running a single 1080p 60Hz monitor, so they don't work very hard.


----------



## Paul17041993

Quote:


> Originally Posted by *DeadlyDNA*
> 
> So i went a little crazy and wanted to see how hard i could push my 290 quad setup. I have finished setting up a 4k eyefinity to play with. I cannot post and benchmark screenies cause they are to large. lol. is it against TOS to link screenshots?


No, screenies aren't against the TOS that I know of, provided it's not anything inappropriate or adult. For super-high-res screenies it's probably best to upload to some 3rd party site like Photobucket or Imgur and just link the picture in a spoiler; the forum itself generally handles the max resolution, and if it's inside a spoiler it won't be loaded until the spoiler is opened (usually).


----------



## Roboyto

Quote:


> Originally Posted by *hotrod717*
> 
> I'm on 12hr days right now and won't be able to get any real time with it until this weekend. Another quick and dirty tonight at 1200mhz w/ +125mv. Ran a full loop of Valley artifact free and only topped out at 61*. I've literally spent less than 30 min. oc'ing this card. I 'll hook up the DMM first thing tomorrow and let you know what stock voltage is. Can't imagine it being much above 1.2-1.25v. As for the Matrix, it definitely holds a special place in my heart, as I had a 7970 Matrix that did 1385/1885 w/ 1.36v actual. Got really lucky with that one and have yet to own one that holds a candle to it. I have to say, I am a bit disappointed with the 290X version. Does it have hotwire connectors? Not that it worked well on the 7970 Matrix, but I haven't seen any info or pics that it does.


I believe it has the hotwire capability..?


----------



## Arizonian

Quote:


> Originally Posted by *DerekParker*
> 
> I have really only been playing bf4 so far since I put this thing together, and it often gets up into the 80's, I know they can run higher than that, but I still don't like it. Idle temps are usually at 41 Celsius most of the time, occasionally it goes up a few degrees. That's with all settings on ultra but I am only running a single 1080p 60hz monitor so they don't work very hard.


Not bad for the top card I think. I was curious how the MSI gaming combo did in crossfire temps. Thanks for replying to my question.


----------



## DeadlyDNA

Wow, what a pain to do this. I will get more varied runs, and I can run scaling tests if people are curious. These are mostly in landscape, as I will be making some custom stands for these monitors. Enjoy. 4K Eyefinity

Heaven Benchmark 3.0 - same as below but ambient occlusion off. 4.0 doesn't allow you to tweak settings, so I believe I am possibly getting capped on VRAM with ambient occlusion on. (Not sure yet, just a guess.)


Spoiler: Warning: Spoiler!







Heaven Benchmark 4.0 - 4xR9 290 Crossfire - NO AA - Medium Quality - Normal Tess


Spoiler: Warning: Spoiler!



http://i.imgur.com/R8cs4ec.jpg



Metro2033 - 11520x2160 benchmarks. not sure if i am getting vram capped as i haven't checked yet


Spoiler: Warning: Spoiler!








HL2 landscape


Spoiler: Warning: Spoiler!









Crysis 2 portrait


Spoiler: Warning: Spoiler!









Sadly, JPEGs and compression hurt the quality. It murdered the screenshots with artifacts, but oh well, still okay I guess.


----------



## Sgt Bilko

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Wow, what a pain to do this. I will get more variable runs and i can run scaling tests if people are just curious. These are mostly in landscape as i will be making some custom stands for these monitors. Enjoy. 4k Eyefinity
> 
> Heaven Benchmark 3.0 - same as below but Ambient occlusion off. 4.0 doesn't allow you to tweak settings so i believe i am getting capped on vram possible with ambient occlusion on.(not sure yet just a guess)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Heaven Benchmark 4.0 - 4xR9 290 Crossfire - NO AA - Medium Quality - Normal Tess
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://i.imgur.com/R8cs4ec.jpg
> 
> 
> 
> Metro2033 - 11520x2160 benchmarks. not sure if i am getting vram capped as i haven't checked yet
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> HL2 landscape
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Crysis 2 potrait
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sadly Jpegs and compression hurt the quality. It murdered the screenshots with artifacts but oh well still okay i guess.


That is awesome!

Nice to see these cards can handle extreme high res









I'm only just going to 1440p now but i'll enjoy it nonetheless


----------



## Norse

Am pondering getting two second-hand 290/290X's. Any danger of them acting odd due to having come from someone that was crypto farming for a month or two?


----------



## Paul17041993

The 295X2 is officially announced, and I've come to realize I may be retiring from being a hardware enthusiast; I don't seem excited anymore about these things...



http://www.anandtech.com/show/7930/the-amd-radeon-r9-295x2-review

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Wow, what a pain to do this. I will get more variable runs and i can run scaling tests if people are just curious. These are mostly in landscape as i will be making some custom stands for these monitors. Enjoy. 4k Eyefinity
> 
> Heaven Benchmark 3.0 - same as below but Ambient occlusion off. 4.0 doesn't allow you to tweak settings so i believe i am getting capped on vram possible with ambient occlusion on.(not sure yet just a guess)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Heaven Benchmark 4.0 - 4xR9 290 Crossfire - NO AA - Medium Quality - Normal Tess
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://i.imgur.com/R8cs4ec.jpg
> 
> 
> 
> Metro2033 - 11520x2160 benchmarks. not sure if i am getting vram capped as i haven't checked yet
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> HL2 landscape
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Crysis 2 portrait
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sadly, JPEG compression hurt the quality. It murdered the screenshots with artifacts, but oh well, still okay I guess.


Wait, how did you get 3x4K Eyefinity? HDMI + 2 DP, all running at 30Hz? That res is ridiculous regardless; I'm actually surprised they perform so well with that many pixels...
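For a sense of scale, the pixel counts behind that question are easy to check. A quick sketch (the 11520x2160 resolution and the 30Hz figure come from the posts above; the single 1080p panel is just a reference point):

```python
# How many pixels is 3x4K Eyefinity pushing, compared with one 1080p panel?
eyefinity = 11520 * 2160      # three 3840x2160 panels side by side
single_1080p = 1920 * 1080

print(eyefinity)                    # 24883200 pixels (~24.9 MP)
print(eyefinity // single_1080p)    # 12 -> twelve 1080p panels' worth
print(eyefinity * 30)               # 746496000 pixels/s at 30 Hz
```

So the cards are filling roughly twelve 1080p screens' worth of pixels per frame, which makes the benchmark numbers above all the more impressive.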


----------



## Sgt Bilko

Taken from Team AU's FB page:

Quote:


> We tested another ASUS R9 290X Matrix on an EK water block today; this card seems even better than the previous card.
> Full pass of 3DMark11 GT with 1400MHz core and 1800MHz memory. The Elpida memory gets a bad rap, but the cards we have tried have done 1750MHz or above.
> Will test this on LN2 in the coming days. We are also planning to do some giveaways to celebrate hitting our 1000th like (soon, we hope).


Looks like Elpida works just fine on the Matrix cards.


----------



## vortex240

Quote:


> Originally Posted by *Paul17041993*
> 
> The 295X2 is officially announced, and I've come to realize I may be retiring from being a hardware enthusiast; I don't seem excited about these things anymore...
> 
> 
> 
> http://www.anandtech.com/show/7930/the-amd-radeon-r9-295x2-review
> Wait, how did you get 3x4K Eyefinity? HDMI + 2 DP, all running at 30Hz? That res is ridiculous regardless; I'm actually surprised they perform so well with that many pixels...


Are you disappointed? Have you seen the TechPowerUp review, which includes OC results? The card has new Hynix chips and did 1120 core / 1725 mem! Now YouTube the 295X2 and see how quiet it is on full load: 50 dB.

Aside from the price tag, which was to be expected, it's a great piece of engineering, especially considering it has no frame pacing issues at all. Granted, I wish they had used a thicker rad.

It's a halo product, but look at how many boutique brands have already announced systems with it. It will sell out, no doubt about it. The whole point of this is exclusivity.


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *Norse*
> 
> Am pondering getting two second hand 290/290x's any danger of them acting odd due to them having come from someone that was crypto farming for a month or two?


You will be fine if the source you are buying from is reputable and honest. If you know what temps, voltages, and clock speeds were run, you can make a secure purchase (and with warranty included, who cares).

You never see people asking if a CPU/GPU has been used for Folding@home, yet now mining is mainstream, and most of the time the mainstream is misinformed.
Sure, some people will tell you that because the card is under constant load it's going to die soon. The fact is they don't know what level of degradation is caused unless they have performed tests over long periods of time with many units.

What's worse: constant temperature, clock speed, and voltage, or a card that goes from 30C to 80C 20-50 times a day, gets overclocked, overvolted, and is basically power cycled all day long?

IMHO, if I was going to buy hardware, let's say a CPU, I would rather buy the CPU run at 1.4 Vcore 24/7 for 6 months than the LN2 CPU run at 1.8 Vcore for a bunch of benchmarking runs. That's just me; sure, the temps don't get high on LN2, but that doesn't change the amount of power actually running through the chip. As for GPUs, I'd rather buy a 100-day mined GPU over a 100-day +200mV overclocked GPU used for benching and gaming.
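The voltage point above can be made concrete with the usual dynamic-power approximation P ≈ C·f·V². This is only a rough sketch, not a degradation model; the 4.0 GHz clock and the function name are hypothetical placeholders, and only the 1.4 V vs 1.8 V figures come from the post:

```python
# Dynamic CMOS power scales roughly with frequency times voltage squared.
# Capacitance is fixed for a given chip, so it cancels out of the ratio.
def relative_dynamic_power(freq_ghz: float, vcore: float) -> float:
    """Relative (unitless) dynamic power at a given clock and core voltage."""
    return freq_ghz * vcore ** 2

daily_24_7 = relative_dynamic_power(4.0, 1.4)  # long-term 1.4 V operation
ln2_bench = relative_dynamic_power(4.0, 1.8)   # short LN2 runs at 1.8 V

# At equal clocks, 1.8 V pushes roughly 65% more dynamic power
# through the chip than 1.4 V does: (1.8 / 1.4)^2 ~ 1.65.
print(round(ln2_bench / daily_24_7, 2))  # 1.65
```

Actual degradation also depends on temperature and current density, which this ratio ignores, but it illustrates why the voltage figure dominates the comparison.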
Quote:


> Originally Posted by *vortex240*
> 
> Are you disappointed? Have you seen the TechPowerUp review, which includes OC results? The card has new Hynix chips and did 1120 core / 1725 mem! Now YouTube the 295X2 and see how quiet it is on full load: 50 dB.
> 
> Aside from the price tag, which was to be expected, it's a great piece of engineering, especially considering it has no frame pacing issues at all. Granted, I wish they had used a thicker rad.
> 
> It's a halo product, but look at how many boutique brands have already announced systems with it. It will sell out, no doubt about it. The whole point of this is exclusivity.


Still looks like 2 290x on one board to me


----------



## Norse

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> You will be fine if the source you are buying from is reputable and honest. If you know what temps, voltages, and clock speeds were run, you can make a secure purchase (and with warranty included, who cares).
> 
> You never see people asking if a CPU/GPU has been used for Folding@home, yet now mining is mainstream, and most of the time the mainstream is misinformed.
> Sure, some people will tell you that because the card is under constant load it's going to die soon. The fact is they don't know what level of degradation is caused unless they have performed tests over long periods of time with many units.
> 
> What's worse: constant temperature, clock speed, and voltage, or a card that goes from 30C to 80C 20-50 times a day, gets overclocked, overvolted, and is basically power cycled all day long?
> 
> IMHO, if I was going to buy hardware, let's say a CPU, I would rather buy the CPU run at 1.4 Vcore 24/7 for 6 months than the LN2 CPU run at 1.8 Vcore for a bunch of benchmarking runs. That's just me; sure, the temps don't get high on LN2, but that doesn't change the amount of power actually running through the chip. As for GPUs, I'd rather buy a 100-day mined GPU over a 100-day +200mV overclocked GPU used for benching and gaming.
> Still looks like 2 290x on one board to me


Gotcha.

It's just that the used cards on eBay are around £100 cheaper than new, and since I would be buying two cards it adds up.

I have one question though: I think with two cards in my system the cards would be almost touching, due to the location of the x16 slots. How much clearance between them do I need to ensure they don't get too toasty?


----------



## rdr09

Quote:


> Originally Posted by *Norse*
> 
> Gotcha.
> 
> It's just that the used cards on eBay are around £100 cheaper than new, and since I would be buying two cards it adds up.
> 
> I have one question though: I think with two cards in my system the cards would be almost touching, due to the location of the x16 slots. How much clearance between them do I need to ensure they don't get too toasty?


If they are reference, they will be toasty even with a good space between them; in your case, they will be toastier. Reference cards are meant for watercooling.

If non-reference, then having fans at the back end and/or the side might help. On the side door, I think it should be exhausting; not sure.

@Deadly. +rep.


----------



## vortex240

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> Still looks like 2 290x on one board to me


Well, that's because that is what it is, just like the 7990, 6990, 5970, etc., and Nvidia does the exact same thing.

Is this news to you? I'm not sure what you were expecting here; lots of chrome and shiny things?


----------



## Norse

Quote:


> Originally Posted by *rdr09*
> 
> If they are reference, they will be toasty even with a good space between them; in your case, they will be toastier. Reference cards are meant for watercooling.
> 
> If non-reference, then having fans at the back end and/or the side might help. On the side door, I think it should be exhausting; not sure.
> 
> @Deadly. +rep.


I am currently eyeing 2 MSI R9 290 reference cards and 2 Sapphire R9 290Xs, which look reference too. Would it be beneficial to look for non-reference even with a high-airflow case, or is it easy to swap the reference heatsinks for different ones? (I don't have much experience with AMD/ATI graphics.)


----------



## rdr09

Quote:


> Originally Posted by *Norse*
> 
> I am currently eyeing 2 MSI R9 290 reference cards and 2 Sapphire R9 290Xs, which look reference too. Would it be beneficial to look for non-reference even with a high-airflow case, or is it easy to swap the reference heatsinks for different ones? (I don't have much experience with AMD/ATI graphics.)


Installing an aftermarket cooler on a reference card might take out another slot, except for a waterblock. I recommend the MSI for reference; that's what I have.

Some non-reference cards already take more than 2 slots, so take note.

Just browse through and you'll see what others did . . .

http://www.overclock.net/f/72/ati-cooling

Edit: another thing . . . take note of warranty. PowerColor's warranty, I believe, is not transferable.


----------



## Norse

Quote:


> Originally Posted by *rdr09*
> 
> Installing an aftermarket cooler on a reference card might take out another slot, except for a waterblock. I recommend the MSI for reference; that's what I have.
> 
> Some non-reference cards already take more than 2 slots, so take note.
> 
> Just browse through and you'll see what others did . . .
> 
> http://www.overclock.net/f/72/ati-cooling
> 
> Edit: another thing . . . take note of warranty. PowerColor's warranty, I believe, is not transferable.


The issue is my mobo: the first slot for graphics is only two slots above the one below it (the blue slots), and the top white PCI-E is used by my graphics, so things are tight.

What temps do you get on your MSI reference?


----------



## rdr09

Quote:


> Originally Posted by *Norse*
> 
> The issue is my mobo: the first slot for graphics is only two slots above the one below it (the blue slots), and the top white PCI-E is used by my graphics, so things are tight.
> 
> What temps do you get on your MSI reference?


I don't think it will be that bad; I've seen worse. If you don't mind the noise of 70% fan speed, then you should be fine.

My core and VRMs do not see 60C; it is watercooled.

Edit: you will not OC anyway . . . and if you do . . . you won't get too far, I think, with the 850W.

Yeah, an aftermarket cooler will work, but I've yet to see good heatsinks for the VRMs.


----------



## Norse

Quote:


> Originally Posted by *rdr09*
> 
> I don't think it will be that bad; I've seen worse. If you don't mind the noise of 70% fan speed, then you should be fine.
> 
> My core and VRMs do not see 60C; it is watercooled.
> 
> Edit: you will not OC anyway . . . and if you do . . . you won't get too far, I think, with the 850W.


I will be swapping the 850 for one of my 1050s.

I don't mind noise; when I'm gaming I like to hear my games nice and loud.


----------



## rdr09

Quote:


> Originally Posted by *Norse*
> 
> I will be swapping the 850 for one of my 1050s.
> 
> I don't mind noise; when I'm gaming I like to hear my games nice and loud.


You are good to go! It is not really that loud.

Edit: BTW, whatever CPU you have, you'd better OC it.


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *vortex240*
> 
> Well, that's because that is what it is.. Just like 7990, 6990, 5970, etc. etc. and just like nvidia does the exact same thing.
> 
> Is this news to you? I'm not sure what you were expecting here, lots of chrome and shinny things?


I think you interpreted my post wrong; I'm not confused about what this card is or does.
What I was trying to get across is that everything you mentioned in your post is a given.

Since you seemed really excited, I made the sarcastic remark about the obvious, which is that this is 2 cores on one PCB.
Quote:


> Originally Posted by *vortex240*
> 
> Are you disappointed? Have you seen the TechPowerUp review, which includes OC results? The card has new Hynix chips and did 1120 core / 1725 mem! Now YouTube the 295X2 and see how quiet it is on full load: 50 dB.
> 
> Aside from the price tag, which was to be expected, it's a great piece of engineering, especially considering it has no frame pacing issues at all. Granted, I wish they had used a thicker rad.
> 
> It's a halo product, but look at how many boutique brands have already announced systems with it. It will sell out, no doubt about it. The whole point of this is exclusivity.


----------



## Norse

Quote:


> Originally Posted by *rdr09*
> 
> You are good to go! It is not really that loud.
> 
> Edit: BTW, whatever CPU you have, you'd better OC it.


Can't OC on the ASUS KGPE-D16 unless I have an engineering sample CPU. I currently have 2 16-core Opterons though: 2.4GHz full load, 3GHz half load due to the turbo.

It's my sig rig.


----------



## vortex240

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> I think you interpreted my post wrong; I'm not confused about what this card is or does.
> What I was trying to get across is that everything you mentioned in your post is a given.
> 
> Since you seemed really excited, I made the sarcastic remark about the obvious, which is that this is 2 cores on one PCB.


Yes, I am excited; I think it's a kick-ass product. I would throw this in a mini-ITX build any second if money weren't an object. And I'm also excited that an actual GPU manufacturer used water for cooling, with great results; a 50 dB noise level under load is way better than any other dual-GPU card shipped by Nvidia or AMD.

This is a first step, but hopefully more partners will follow with AIO water-cooled versions in the future.

It's easy for people to be sarcastic and expect miracles, so my question to you: what would get you excited, knowing very well that current GPUs are still on the 28nm process?


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *vortex240*
> 
> Yes, I am excited; I think it's a kick-ass product. I would throw this in a mini-ITX build any second if money weren't an object. And I'm also excited that an actual GPU manufacturer used water for cooling, with great results; a 50 dB noise level under load is way better than any other dual-GPU card shipped by Nvidia or AMD.
> 
> This is a first step, but hopefully more partners will follow with AIO water-cooled versions in the future.
> 
> It's easy for people to be sarcastic and expect miracles, so my question to you: what would get you excited, knowing very well that current GPUs are still on the 28nm process?


The difference is I'm not expecting miracles. Unfortunately it is more difficult for some to read sarcasm than to write it.

I am glad you are excited about this product.

AIO water-cooled cards become a huge liability for manufacturers, because if they leak or there are issues, there becomes a fine line between warranty and no warranty:
"There was a pin-sized hole in your tubing that caused the leak; it was caused by the user; your RMA is denied."

People pay for what they want or what they feel is good.

Price per performance is a different story. Look at the Titan Z for example; it is a joke.


----------



## vortex240

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> The difference is I'm not expecting miracles. Unfortunately it is more difficult for some to read sarcasm than to write it.
> 
> I am glad you are excited about this product.
> 
> AIO water-cooled cards become a huge liability for manufacturers, because if they leak or there are issues, there becomes a fine line between warranty and no warranty:
> "There was a pin-sized hole in your tubing that caused the leak; it was caused by the user; your RMA is denied."
> 
> People pay for what they want or what they feel is good.
> 
> Price per performance is a different story. Look at the Titan Z for example; it is a joke.


I'm sorry to be this frank, but have you seen what has been happening with CPU cooling for the last 3+ years? In case you have not noticed, it's easily 75% AIO watercoolers. Have you heard or seen any horror stories of those failing and burning out systems? I certainly have not, anywhere on the net; hence the reason they are so popular. Also, here is a fun fact of the day for you: distilled water is non-conductive, which is what all these coolers use.

Just so you know, I picked up on your sarcasm in the first place. I just think this is a case of 'if you don't have anything nice to say, don't say anything'; yet people like to beat down anything that might not fit their current liking. I guarantee you that if you had a mini-ITX build and wanted CrossFire for a 4K monitor (money no object) and had no space for it, then this would be a golden ticket.


----------



## Kokin

I'm definitely surprised that AMD could launch such a beast of a card on a single PCB when the single-GPU 290/290X had lots of bumps in terms of cooling. Even though the non-reference coolers have mostly fixed that, VRM temps still get pretty hot.

Due to having a 1440p monitor @ 120Hz, I really wish I could get my hands on the performance of the 295X2, since my "measly" 7950 is approximately 35-40% of its performance. Even with a 1300/1700 OC on my 7950, I can almost reach stock GTX 780/R9 290 speeds (keyword is almost). But since I have an ITX system, I am limited to just one PCI-E slot, so CF outside of a single-PCB solution is out of the question. Cooling would be no problem though; I already use 2x 240mm rads (one 58mm, one 35mm thick) + 1x 120mm rad (35mm thick).

I have seen a few instances of AIO coolers failing, such as the pump dying (most common) or the ever-so-rare leak. Also, AIO coolers *DO NOT* just use distilled water. They often use aluminum radiators, so they include some sort of mix to prevent corrosion. Any leak from an AIO will result in dead components unless it is caught right away.


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *vortex240*
> 
> I'm sorry to be this frank, but have you seen what has been happening with CPU cooling for the last 3+ years? In case you have not noticed, it's easily 75% AIO watercoolers. Have you heard or seen any horror stories of those failing and burning out systems? I certainly have not, anywhere on the net; hence the reason they are so popular. Also, here is a fun fact of the day for you: distilled water is non-conductive, which is what all these coolers use.
> 
> Just so you know, I picked up on your sarcasm in the first place. I just think this is a case of '*if you don't have anything nice to say, don't say anything*'; yet people like to beat down anything that might not fit their current liking. I guarantee you that if you had a mini-ITX build and wanted CrossFire for a 4K monitor (money no object) and had no space for it, then this would be a golden ticket.


You've never heard of an AIO pump dying, leaking, or failing? Anywhere on the net? Go click the water cooling subforum; there are plenty of stories there. Have you ever heard of Corsair?

There are tons of AIOs that die or have issues.

Also, I had no malicious intent with my post. However, if you picked up on the sarcasm and still continued a conversation just to say '*if you don't have anything nice to say, don't say anything*', then that's foolish.

Everyone has an opinion. I did nothing more than state the fact that it's 2 290Xs on one board. I'm not bashing anything. Whether they put a fancy shroud on it or however you seek to justify this card, so be it; this is why I gave the Titan Z example. (It's not worth the money, and people who buy it are entitled to; however, you can't present your own justifications to other people as if they were fact.)

By the way, Kokin already mentioned it, but what you said about AIOs is not fact, so please do the research and double-check what you post; some other user might read your post and actually think it was fact.


----------



## rdr09

Quote:


> Originally Posted by *Kokin*
> 
> I'm definitely surprised that AMD could launch such a beast of a card on a single PCB when the single-GPU 290/290X had lots of bumps in terms of cooling. Even though the non-reference coolers have mostly fixed that, VRM temps still get pretty hot.
> 
> Due to having a 1440p monitor @ 120Hz, I really wish I could get my hands on the performance of the 295X2, since my "measly" 7950 is approximately 35-40% of its performance. Even with a 1300/1700 OC on my 7950, I can almost reach stock GTX 780/R9 290 speeds (keyword is almost). But since I have an ITX system, I am limited to just one PCI-E slot, so CF outside of a single-PCB solution is out of the question. Cooling would be no problem though; I already use 2x 240mm rads (one 58mm, one 35mm thick) + 1x 120mm rad (35mm thick).
> 
> I have seen a few instances of AIO coolers failing, such as the pump dying (most common) or the ever-so-rare leak. Also, AIO coolers *DO NOT* just use distilled water. They often use aluminum radiators, so they include some sort of mix to prevent corrosion. Any leak from an AIO will result in dead components unless it is caught right away.


If you can get by with a 7950, then you certainly can with a 290.

7900 CrossFire:

http://www.3dmark.com/3dm11/6233890

290 @ 1100 using 14.3

http://www.3dmark.com/3dm11/8211253


----------



## vortex240

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> You've never heard of an AIO pump dying, leaking, or failing? Anywhere on the net? Go click the water cooling subforum; there are plenty of stories there. Have you ever heard of Corsair?
> 
> There are tons of AIOs that die or have issues.
> 
> Also, I had no malicious intent with my post. However, if you picked up on the sarcasm and still continued a conversation just to say '*if you don't have anything nice to say, don't say anything*', then that's foolish.
> 
> Everyone has an opinion. I did nothing more than state the fact that it's 2 290Xs on one board. I'm not bashing anything. Whether they put a fancy shroud on it or however you seek to justify this card, so be it; this is why I gave the Titan Z example. (It's not worth the money, and people who buy it are entitled to; however, you can't present your own justifications to other people as if they were fact.)
> 
> By the way, Kokin already mentioned it, but what you said about AIOs is not fact, so please do the research and double-check what you post; some other user might read your post and actually think it was fact.


AIO coolers use distilled water with additives (propylene glycol) that prevent corrosion. That IS A FACT; feel free to read reputable sources on Google for that, and so can Kokin for that matter. So yes, please do YOUR research; I'm not here to educate anyone.

90% of the stories are from teenagers that did something to cause a leak, twisted the lines, etc. The units are sealed very well and DON'T just leak. That is a reason why there are millions of them out there. And, just like with any product, failures happen.

I continued on the sarcasm to see if you had anything of value to add to the conversation, instead of the usual 'oh, it's been done before, who cares'. I didn't expect much, and that's exactly what I got. So in conclusion you are right about everything; pat yourself on the back.


----------



## Caos

Is this OC safe?

core voltage: +50mV
power limit: +50
core clock: 1085 MHz
memory clock: 1300 MHz

max core temp: 80°C
VRM1 max: 75°C

MSI R9 290 Gaming


----------



## Red1776

Quote:


> Originally Posted by *Caos*
> 
> Is this OC safe?
> 
> core voltage: +50mV
> power limit: +50
> core clock: 1085 MHz
> memory clock: 1300 MHz
> 
> max core temp: 80°C
> VRM1 max: 75°C
> 
> MSI R9 290 Gaming


"Safe OC" is kind of a nebulous term, but having said that, at those temps, yes, it's safe and a rather mild OC.

If you can get more airflow on the VRMs, do so, but they are made to run at those temps.

Hope that helped.

I own 4 x MSI R9 290X Gaming GPUs, so I am familiar with them.


----------



## Caos

Quote:


> Originally Posted by *Red1776*
> 
> "Safe OC" is kind of a nebulous term, but having said that, at those temps, yes, it's safe and a rather mild OC.
> If you can get more airflow on the VRMs, do so, but they are made to run at those temps.
> Hope that helped.
> 
> I own 4 x MSI R9 290X Gaming GPUs, so I am familiar with them.


Thanks for the reply. What I meant was: is this OC sure not to kill the card?


----------



## Mercy4You

Quote:


> Originally Posted by *Caos*
> 
> What I meant was: is this OC sure not to kill the card?


You'll only know when it happens.

I ran all my AMD cards @ 1100-1175 and that was never a problem...


----------



## Red1776

Quote:


> Originally Posted by *Caos*
> 
> Quote:
> 
> Originally Posted by *Red1776*
> 
> "Safe OC" is kind of a nebulous term, but having said that, at those temps, yes, it's safe and a rather mild OC.
> If you can get more airflow on the VRMs, do so, but they are made to run at those temps.
> Hope that helped.
> 
> I own 4 x MSI R9 290X Gaming GPUs, so I am familiar with them.
> 
> Thanks for the reply. What I meant was: is this OC sure not to kill the card?

Yes; unless you got a defective card, that is a safe and mild overclock. The only point I wanted to make was that occasionally you can get a card with a weak component. But that overclock is close to the OC the MSI Lightning ships with, so with that voltage and those temps you will be just fine.


----------



## fubmabs

Help me out guys!
http://www.overclock.net/t/1480734/whats-your-opinion-on-crossfire#post_22086128


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *vortex240*
> 
> AIO coolers use distilled water with additives(Propylene glycol) that prevent corrosion. That IS A FACT, feel free to read google reputable sources for that and so can Kokin for that matter. So yes, please do YOUR research,
> 
> 
> 
> 
> 
> 
> 
> I'm not here to educate anyone.
> 
> *90% of the stories are from teenagers that did something to cause a leak, twisted the lines, etc etc. The units are sealed very well and DON'T just leak. That is a reason why there are millions of them out there. And just like with any product failures happen.
> *
> 
> I continued on the sarcasm to see if you had anything of value to add to the conversation, instead of the usual 'oh its been done before, who cares' I didn't expect much and that's exactly what I got. So in conclusion you are right about everything including, pat yourself on the back


I guess there was a huge reason Corsair had a recall?
You should probably not condescend to people with high caliber and reputation on this forum who have had first-hand experience with coolers failing, and call them teenagers. It's really rude.

You should fill out your profile and sig rig; you have been registered on this forum for a long time, and it would be nice to see you around and actually provide something to this community.

Do not generalize all AIOs and claim it's a fact when you lack the actual information. Referring others to a Google search is probably how you became misinformed in the first place.

If there is anything else I can guide you with, please send me a PM. Your conversations have not been constructive to this thread and are for the sole purpose of insulting me, as you've stated. Please reconsider using the word fact when you are posting your opinions; you are going to confuse other users.


----------



## vortex240

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> I guess there was a huge reason Corsair had a recall?
> You should probably not condescend to people with high caliber and reputation on this forum who have had first-hand experience with coolers failing, and call them teenagers. It's really rude.
> 
> You should fill out your profile and sig rig; you have been registered on this forum for a long time, and it would be nice to see you around and actually provide something to this community.
> 
> Do not generalize all AIOs and claim it's a fact when you lack the actual information. Referring others to a Google search is probably how you became misinformed in the first place.
> 
> If there is anything else I can guide you with, please send me a PM. Your conversations have not been constructive to this thread and are for the sole purpose of insulting me, as you've stated. Please reconsider using the word fact when you are posting your opinions; you are going to confuse other users.


Dude, what are you smoking? Distilled water with additives is what is used; that is a FACT. Why you and other members are saying otherwise just baffles me. Please, would you kindly go ahead and show PROOF that it's NOT what is used, and I will then value your opinion. My sig is my sig; I have multiple systems and work with countless servers. I'm not here to brag.

Also, I'm not misinformed at all, and please leave your guidance for others that are happy to accept it. Also, my original comment wasn't even in response to you; then you added your sarcasm, and now you claim I'm not putting anything constructive in here.

Anyway, I'm done here. I learned a long time ago not to argue with trolls.


----------



## Caos

Quote:


> Originally Posted by *Red1776*
> 
> yes, unless you got a defective card that is a safe and mild overclock. The only point I wanted to make was occasionally you can get a card with a weak component. But that overclock is close to the OC they ship the MSI lightning with so with that voltage, and those temps you will be just fine.


Thanks very much.


----------



## Paul17041993

Quote:


> Originally Posted by *vortex240*
> 
> Are you disappointed? Have you seen the TechPowerUp review, which includes OC results? The card has new Hynix chips and did 1120 core / 1725 mem! Now YouTube the 295X2 and see how quiet it is on full load: 50 dB.
> 
> Aside from the price tag, which was to be expected, it's a great piece of engineering, especially considering it has no frame pacing issues at all. Granted, I wish they had used a thicker rad.
> 
> It's a halo product, but look at how many boutique brands have already announced systems with it. It will sell out, no doubt about it. The whole point of this is exclusivity.


nono definitely not disappointed, its a very good single card, just not excited anymore for whatever reason, I guess I just expected it to turn out this good regardless









this talk about closed-loop coolers though, yea its just standard distilled and biocide, maby a little lube so the shaft doesn't grind too much (they are cheap and basic pumps regardless), and its completely possible to get a bad batch and have it asplode on you, just a very very low chance, you're probably more likely to have a custom loop leak on you then one of those.

just recently I drained my thermaltake water2.0 pro, it had a very strong biocide mix that you could smell from outside the room, but this generally means it will last years before anything can possibly grow, by then the pump would have likely failed and/or it would have been replaced anyway, I re-filled mine with silver water and distilled water (silver water is distilled been electrolyzed with silver rods, but is very heavy by itself), for the two years Ive had it its worked quite well, reason I drained it is due to it having a low amount of fluid then it should have, possibly due to some evaporation out the tubes over the years and heat it has had to deal with.

hoses wise, these rubber ones are virtually impossible to get off, they're glued and even if you overheat the loop the hoses just expand to compensate for the increase in size, only real risk of it exploding is if you boiled it, which your motherboard would shut off if you tried, in my case to do the drain involved removing 18 screws off the copper block, there's also a plastic riser block that has another 18 screws I think it was, lets just say they have a very excessive amount of screws really on this part and the pump itself is a sealed shaft barrel, when mounted this puts further reinforcement onto the gaskets so there's literally no chance of a leak occurring here, so you can only really make them leak if you pull the hoses really hard or puncture the radiator.

Those plastic hoses, however, I'm not too sure about; they possibly break a lot more easily than the rubber ones.


----------



## hotrod717

Quote:


> Originally Posted by *Roboyto*
> 
> I got mine to 1215/1600,i think, with 125mV with stock blower. Temps were fair, but the noise was.. yeah lol
> 
> Yeah, the lightning really blows the pants off the Matrix when comparing provided coolers. ASUS never fixed the issue from the DC2, where they're using the GK104 heat pipe design. Sure most people who buy a matrix are going to water cool it, if not use LN2, but it's the premise of buying a Flagship item that has a design flaw.
> 
> 1150 is pretty good for stock settings, what kind of voltage do they come with out of the box?


With an ASIC quality of 75.8%, I'm drawing 1.22V under 3D load measured with a DMM. At 1220 and +135mV, it's topping out at 1.29V. I'll continue to push a little further on air since temps seem pretty good, but I'll most likely stop at 1250 until I get it under water. This is all on the stock BIOS; I haven't had time to play with the secondary "LN2" BIOS yet.
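To put rough numbers on that: 1.22V stock load plus a +135mV offset would naively land around 1.355V, so a 1.29V reading under load implies roughly 65mV lost to vdroop/VRM regulation. A quick sketch of that arithmetic (the function name is made up for illustration; this is just the back-of-envelope math, not how the VRM actually regulates):

```python
def implied_droop(stock_load_v, offset_mv, observed_load_v):
    """How far the measured load voltage falls short of stock + software offset."""
    return stock_load_v + offset_mv / 1000 - observed_load_v

# Numbers from the post above: 1.22V stock load, +135mV offset, 1.29V observed.
shortfall = implied_droop(1.22, 135, 1.29)
print(f"about {shortfall * 1000:.0f} mV short of the naive estimate")
```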


----------



## Roboyto

Quote:


> Originally Posted by *hotrod717*
> 
> With an asic of 75.8%, I'm drawing 1.22v under 3d load with DMM. At 1220 and +135mv, it's topping out at 1.29v. I'll continue to push a little further on air since temps seem pretty good, but will most likely stop at 1250 until I get it under water. This is all on stock bios. I haven't had time to play with the secondary "LN2" bios yet.


Very nice!

I've never probed a card with a DMM; where do I measure from? I want to check mine.


----------



## hotrod717

Quote:


> Originally Posted by *Roboyto*
> 
> Very nice!
> 
> Never probed a card with DMM, where do I measure from? I want to check mine


Good question. I'm not sure about the reference cards; HWBot or the Kingpin Cooling forum should have details. I believe it involves manually probing specific points on the PCB. The nice thing about the Lightning, DCII, and Matrix is that they have dedicated points for taking measurements.


----------



## bond32

Anyone have tips on the lines/picture distortion issues I have seen? I think I have literally the worst 290x ever made. Running +200 mv in AB I can finally get 1200+, however after about 10 minutes the entire picture gets distorted into lines. Temps are all fine and in check. This is a sapphire 290x with elpida memory on PT1 bios. For sure, my card is the absolute worst clocker for memory as anything over 1350, regardless of bios/drivers/setup/voltage results in a black screen. Makes me sick... And all this on water.


----------



## Matt-Matt

So I was up till 3AM last night..




40°C max on the GPU @ stock, 70°C for VRM1. Fixing that will have to wait till the next time I drain the loop.

Also: what drivers are you guys using? 13.12 doesn't let me overvolt.


----------



## Forceman

13.12 should let you overvolt, which program are you using to overclock? Make sure you have the newest beta if you are using Afterburner.


----------



## Matt-Matt

Quote:


> Originally Posted by *Forceman*
> 
> 13.12 should let you overvolt, which program are you using to overclock? Make sure you have the newest beta if you are using Afterburner.


I did... I'm using Trixx ATM, and now I'm getting throttling.









Right, so 13.12 fixed it. New problem is the VRM1 temp.

Going to put a 1mm pad on it this weekend (it has the 0.7mm one). I'll get a Fujipoly one eventually, but not for a while, I guess. Which set of VRMs is VRM1, though?


----------



## Forceman

VRM1 is the line of them near the power connectors. They provide core power.


----------



## Kokin

Quote:


> Originally Posted by *rdr09*
> 
> if you are able to get by a 7950, then you should with a 290.
> 
> 7900 crossed
> 
> http://www.3dmark.com/3dm11/6233890
> 
> 290 @ 1100 using 14.3
> 
> http://www.3dmark.com/3dm11/8211253


Hmm, that CF score looks pretty low considering my single 7950 can already score 11K. I'm seeing a lot of 7950/7970 CF scores in the 16-18K range (about an 80-90% increase over the stock 9K score). I'm most likely going to hold off until 20nm releases unless I start seeing the R9 290 come down to $250. I limit myself to spending $300 or less on a GPU since I mostly play LoL, but for the times when I play games like Crysis 3, BF3, or Tomb Raider, my single 7950 is hurting with a 1440p 120Hz monitor.


Quote:


> Originally Posted by *vortex240*
> 
> AIO coolers use distilled water with additives(Propylene glycol) that prevent corrosion. That IS A FACT, feel free to read google reputable sources for that and *so can Kokin for that matter.* So yes, please do YOUR research,
> 
> 
> 
> 
> 
> 
> 
> I'm not here to educate anyone.


Whoa whoa whoa, why did you have to include me? Did you not read my previous post? In case you missed it, *you just repeated what I said*:
Quote:


> Originally Posted by *Kokin*
> 
> Also, AIO coolers *DO NOT* just use distilled water. They often use aluminum radiators, so they have some sort of mix to prevent corrosion. Any leak from an AIO will result in dead components, unless it was prevented right away.


Your "facts" make me question if you personally water cool and if you even know what you're talking about since you're making contradictory statements. I used a Corsair H50 for well over a year before going into custom watercooling (~3 years now) and I do not consider myself an expert, but I have done my research for many years before jumping into anything water-related.

Look at this previous post from you:
Quote:


> Originally Posted by *vortex240*
> 
> Also here is a fun fact of the day for you - distilled water is non-conductive. Which is what all these cooler use.


Guess what? When you mix that distilled water with additives, it becomes conductive! Did you also know that after several heat cycles, pure distilled water will become conductive because of how it interacts with the different materials in your loop? It won't kill your components if a leak happens and you *clean it up right away,* but if a *short happens* and it's strong enough, it will kill your components.
Quote:


> Originally Posted by *vortex240*
> 
> 90% of the stories are from teenagers that did something to cause a leak, twisted the lines, etc etc. The units are sealed very well and DON'T just leak. That is a reason why there are millions of them out there. And just like with any product failures happen.


Yes, because all cases of bad units are automatically "teenagers" who had no idea what they were doing. I've seen several leaks from Corsair units posted on OCN in the last few years, and these were from *adults* who did their proper research and carefully installed the units, not just some newbie who had started building computers. Yes, 99% of the units out there are good, but there will always be cases where a unit leaks or has its pump die (again, the most common failure).

If you still think I'm some amateur, feel free to see the hard work (and money) I've put into my rig. I've even gone as far as painting my fans and radiators because I find watercooling such a fun hobby.


----------



## INCREDIBLEHULK

Nice rig Kokin















I would say we should ignore the troll, though. I've already learned that he will just degrade the quality of the thread by baiting conversations into insults, as he did to others.


----------



## 66racer

I think we need to get back on track and leave AIO unit talk for the time being in the cooling thread, thanks guys.


----------



## Mega Man

Looking forward to joining you guys! Just found a great deal, and I don't even need a waterblock! Extremely happy: 1 of 4 is now mine. I'm waiting till mid-summer to buy more, since I'm going out of the country and saving money for that trip and purchases overseas (buying a few Leadex 1000/1250s from China to import back!).

This is the card that's on the way: http://www.newegg.com/Product/Product.aspx?Item=N82E16814131543R
Seems I walked into the middle of a fight, so how's the weather today?


----------



## Matt-Matt

Quote:


> Originally Posted by *Forceman*
> 
> VRM1 is the line of them near the power connectors. They provide core power.


That's what I thought, cheers!









+Reppity Rep Rep


----------



## VulgarDisplay

Quote:


> Originally Posted by *bond32*
> 
> Anyone have tips on the lines/picture distortion issues I have seen? I think I have literally the worst 290x ever made. Running +200 mv in AB I can finally get 1200+, however after about 10 minutes the entire picture gets distorted into lines. Temps are all fine and in check. This is a sapphire 290x with elpida memory on PT1 bios. For sure, my card is the absolute worst clocker for memory as anything over 1350, regardless of bios/drivers/setup/voltage results in a black screen. Makes me sick... And all this on water.


I think this may be a driver issue. I have this problem when I OC my ram. I also have black screen crashes when my computer sleeps at the moment. Pretty frustrating.


----------



## Paul17041993

Quote:


> Originally Posted by *Mega Man*
> 
> look forward to joining you guys ! just found a great deal and i dont even need a waterblock ! extremely happy 1/4 is now mine waiting till mid summer to buy more as i am going outta the country and saving money for that and purchases overseas ! ( buying a few leadex 1000/1250 from china to import back ! )
> 
> this is my card that is on the way
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814195115
> 
> seems i walked in the middle of a fight, so hows the weather today


you're late boi, and apparently you linked a firepro...

needs quadfires!1!


----------



## King4x4

These cards can sure take a beating when OCing them... mine OC so much that I lose the DP signal and the DVI signal starts to go haywire


----------



## arrow0309

Hi guys!

Reading

https://www.reddit.com/r/1wgh2f/psa_probably_290x_black_screen_fix_use_two_pcie/
about the black screens (I still haven't had one, though), I started to investigate how my 12V rail voltages behave, especially under load on the PCIe rails (I'm using two separate cables, one 8-pin and one 6-pin) of my Cooler Master 850W Silent Pro M.
I found an ugly 11.63-11.50V (GPU-Z) at idle and low load, and worse, down to 11.25V (GPU-Z) or 11.31V (HWiNFO64) real 12V voltage after a Unigine Heaven 4.0 bench at 1180/1500 @ +156mV, +50% PL.









http://www.xtremeshack.com/photos/20140410139712056164242.jpg

So first of all I checked and swapped the positions of the two PCIe cable connectors plugged into my PSU,
but I got the very same results:

http://www.xtremeshack.com/photos/20140410139712058149796.jpg
http://www.xtremeshack.com/photos/20140410139712060841855.jpg

In the end I even tried powering the 8-pin with a 2x4-pin Molex-to-8-pin adapter, using two independent Molex lines from my PSU. The result was worse, however: only 11.28V minimum on the 12V rail, and lots of cable mess (at least I tried).









http://www.xtremeshack.com/photos/20140410139712062110427.jpg

My PSU is about 4 years old, kept clean with the fan filter. I've even replaced the fan with a better one (a couple of years ago already), making it dead silent.








But should I be worried? Does this affect my 290's performance or stability at a mid OC?
Or do I just have to start considering a nice PSU replacement?
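For reference, ATX 2.3 allows ±5% on the 12V rail, i.e. an 11.40-12.60V window. A small sketch that checks the readings above against that window (the nominal value and tolerance are from the spec; the helper names are just illustrative):

```python
# ATX 2.3 allows each rail to deviate +/-5% from nominal.
NOMINAL_12V = 12.0
TOLERANCE = 0.05

def rail_limits(nominal=NOMINAL_12V, tol=TOLERANCE):
    """(min, max) voltage allowed for the rail."""
    return nominal * (1 - tol), nominal * (1 + tol)

def in_spec(reading):
    lo, hi = rail_limits()
    return lo <= reading <= hi

lo, hi = rail_limits()
print(f"12V rail window: {lo:.2f}V - {hi:.2f}V")

# GPU-Z / HWiNFO64 readings from the post above:
for v in (11.63, 11.50, 11.31, 11.25):
    print(f"{v:.2f}V -> {'OK' if in_spec(v) else 'out of spec'}")
```

The idle readings squeak by, but both load readings fall below the 11.40V floor, which is why they look worrying.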


----------



## devilhead

Hi, what do you guys think: if I put a card with a water block on it into the PC without connecting the card to a loop, will that be fine? I only want to flash the BIOS on the card and then sell it, and I don't want to connect the card to the loop; it would be a lot of work.







So what do you think: it won't overheat, right? In my opinion it won't; it could only overheat if I actually played a game on the card.


----------



## rdr09

Quote:


> Originally Posted by *Kokin*
> 
> Hmm that CF score looks pretty low considering my single 7950 can already score 11K. I'm seeing a lot of the 7950/7970CF scores range in the 16~18K (which is about a 80~90% increase of the stock 9K score). I'm most likely going to hold off until 20nm releases unless I start seeing the R9 290 come down to $250. I only limit myself to spending $300 or less for a GPU since I mostly play LoL, but for the times where I play games like Crysis 3, BF3, Tomb Raider, my single 7950 is hurting with a 1440p 120Hz monitor.
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Whoa whoa whoa, why did you have to include me? Did you not read my previous post? In case you missed it, *you just repeated what I said*:
> Your "facts" make me question if you personally water cool and if you even know what you're talking about since you're making contradictory statements. I used a Corsair H50 for well over a year before going into custom watercooling (~3 years now) and I do not consider myself an expert, but I have done my research for many years before jumping into anything water-related.
> 
> Look at this previous post from you:
> Guess what? When you mix that distilled water with additives, it becomes conductive! Did you also know that after several heat cycles, pure distilled water will become conductive because of how it interacts with the different materials in your loop? It won't kill your components if a leak happens and you *clean it up right away,* but if a *short happens* and it's strong enough, it will kill your components.
> Yes because assuming all cases of bad units are automatically "teenagers" who have no idea what they are doing. I've seen several leaks from Corsair units posted in OCN in the last few years. These are from *adults* who did their proper research and carefully installed the units, not just some newbie who started building computers. Yes, 99% of the units out there are good, but there will always be cases where a unit will leak or have it's pump die (again most common case of failure).
> 
> If you still think I'm some amateur, feel free to see the hard work (and money) I've put into my rig. I've even gone as far as painting my fans and radiators because I find watercooling such a fun hobby.


Yeah, with that OC your card is within about 10% of a stock 290. Here, compare it at around the same clocks . . .

http://www.3dmark.com/3dm11/7684264

It goes back up to around 30%.


----------



## rdr09

Quote:


> Originally Posted by *devilhead*
> 
> hi, what do you think guys, if i will put card with water block on it to pc, without connecting card to loop, i only want to flash bios to those card, and sell it. Because i don't want to connect card to loop, it will be a lot work
> 
> 
> 
> 
> 
> 
> 
> so how do you think it will not overheat? In my opinion it will not, it can overheat only, if i will play game with card


don't take a chance.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *devilhead*
> 
> hi, what do you think guys, if i will put card with water block on it to pc, without connecting card to loop, i only want to flash bios to those card, and sell it. Because i don't want to connect card to loop, it will be a lot work
> 
> 
> 
> 
> 
> 
> 
> so how do you think it will not overheat? In my opinion it will not, it can overheat only, if i will play game with card


If you bork it, you can't sell it, can you?

Quote:


> Originally Posted by *King4x4*
> 
> These cards can sure take a beating when OCing them... mine OC so much that I lose the DP signal and the DVI signal starts to go haywire


LooooooooL That's exactly what mine do over DP @ 1440p, and over DVI and HDMI too. They complete benches with the signal dropping in and out.









Quote:


> Originally Posted by *66racer*
> 
> I think we need to get back on track and leave AIO unit talk for the time being in the cooling thread, thanks guys.


Agreed on that


----------



## Mega Man

Quote:


> Originally Posted by *Paul17041993*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> look forward to joining you guys ! just found a great deal and i dont even need a waterblock ! extremely happy 1/4 is now mine waiting till mid summer to buy more as i am going outta the country and saving money for that and purchases overseas ! ( buying a few leadex 1000/1250 from china to import back ! )
> 
> this is my card that is on the way
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814195115
> 
> seems i walked in the middle of a fight, so hows the weather today
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> you're late boi, and apparently you linked a firepro...
> 
> needs quadfires!1!

My bad, fixed. And no, not late, just lazy; I didn't wanna work the OT to pay off $2.7-5k until summer hit. This way I don't have a choice, because it is summer. I was also waiting for the Komodos to be released, which I still might buy.

It was supposed to be http://www.newegg.com/Product/Product.aspx?Item=N82E16814131543R


----------



## Mega Man

Meh, he just has to show his e-peen on AIOs. Anyone who gets that and doesn't put a full block on it makes me lol (non-corporate buyers, anyway; although at that price they should be going FirePro).


----------



## phallacy

Quote:


> Originally Posted by *arrow0309*
> 
> Hi guys!
> 
> Reading
> 
> __
> https://www.reddit.com/r/1wgh2f/psa_probably_290x_black_screen_fix_use_two_pcie/
> about the black screens (I still haven't got one though) I started to "investigate" and control how my 12v rail voltages are, especially under load on the pcie rail (I'm using two separate cables, one 8pin and one 6pin) of my Cooler Master 850W Silent Pro M
> And I found an "ugly" 11.63-11.50v (gpuz) in idle and low load and a "worse", down to 11.25v (gpuz) or 11.31v (hwinfo64) 12v real voltage after an Unigine Heaven 4.0 bench at 1180/1500 @ +156mv, +50PL
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.xtremeshack.com/photos/20140410139712056164242.jpg
> 
> So I started to first of all control and change the position of the two pcie cables connectors plugged on my psu
> But I got the very same results:
> 
> http://www.xtremeshack.com/photos/20140410139712058149796.jpg
> http://www.xtremeshack.com/photos/20140410139712060841855.jpg
> 
> In the end I even "tried" to power the 8pin with the 2x4pin molex to 8pin adapter, using two independent molex lines from my psu, the result was worse however, only 11.28v min on the 12v rail and lots of cable mess (at least I tried)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.xtremeshack.com/photos/20140410139712062110427.jpg
> 
> My psu is about 4 years old, kept with the fan filter, I've even replaced the fan with a better one (a couple of years ago already, becoming dead silent)
> 
> 
> 
> 
> 
> 
> 
> 
> But should I get worried somehow, does this affect any of my 290's performance in mid oc or its stability?
> Or I only have to start considering a nice psu replacement?


Hey, I looked up your PSU on jonnyGuru, and the end of this review, http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story5&reid=166 , mentions voltage regulation as one of the negatives of that model. (I linked you to the 1000W one, but since it's the same model name at a different wattage, mostly the same internals would be used, with similar traits.) Those voltages do seem out of the normal range for the 12V rail, though; with my EVGA it's about 11.88-12.06V depending on when I look at the reading. An 850W PSU should be able to handle a single 290X just fine, but four years old is already a really long time in upgrade terms. Maybe it is time to consider getting a new one? My 2 cents.


----------



## Matt-Matt

Would love to join the club! 1200 is all for now though!



What would you guys say is a "Golden" 290? I'm doing 1200MHz at +56mv in Trixx, the RAM is terrible however.

Linky to validation


----------



## Sgt Bilko

Quote:


> Originally Posted by *Matt-Matt*
> 
> Would love to join the club! 1200 is all for now though!
> 
> 
> 
> What would you guys say is a "Golden" 290? I'm doing 1200MHz at +56mv in Trixx, the RAM is terrible however.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Linky to validation


Ummm, I'd say that's pretty good; I need +80mV for one card and +100mV for the other.


----------



## Shortie

Hello, everyone.
I need your help.
I got my MSI R9 290X (factory overclocked: 1030MHz core, 1250MHz memory) a month ago. Last night I noticed that my card throttles, even though it runs pretty cool and overall performance is great.
I was using the latest Catalyst beta, 14.3, but some throttling is still present with 13.12 WHQL too.
Here is an MSI AB screenshot taken during a Metro 2033 run:



It does not throttle as much as on the 14.3 beta, but still...


----------



## phallacy

Quote:


> Originally Posted by *Matt-Matt*
> 
> Would love to join the club! 1200 is all for now though!
> 
> 
> 
> What would you guys say is a "Golden" 290? I'm doing 1200MHz at +56mv in Trixx, the RAM is terrible however.
> 
> Linky to validation


Nice overclock on the core! At 1200, it's safe to say you have a decent 290X. If you can maintain 1200 on the core and reach 1500+ on the RAM (1600+ if it's Hynix) with under +100mV, it's a pretty good card. I get 1200/1550 on my Elpida 290X with +87mV, and 1200/1650 on my Hynix with +94mV I think, and of course +50% power limit.

What I'd consider a golden 290X is one that will do 1200/1600 with under +50mV at +50% power limit, and 1350+/1700+ with about +120-150mV for 24/7 use in a loop. You could always do suicide runs to see where you top out, even if it's only stable enough for a benchmark run or two.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Shortie*
> 
> Hello, everyone.
> I need you help.
> I got my MSI R9 290X (Factory overclocked 1030MHz core 1250MHz memory) , a month ago. Last night I have noticed that my card throttles, even if it running pretty cool and overall performance is great.
> I was using the latest catalyst beta 14.3. But some throttling is still present with 13.12 WHQL too.
> Here is a MSI AB screenshot taken during Metro 2033 run
> 
> 
> Spoiler: Warning: Spoiler!


http://www.overclock.net/t/1258253/how-to-put-your-rig-in-your-sig/0_40

that will help for starters


----------



## Sgt Bilko

Quote:


> Originally Posted by *phallacy*
> 
> Nice overclock on the core! 1200 imo is safe to say you have a decent 290x. If you can maintain 1200 on the core and reach 1500+ on ram 1600+ if it's hynix by upping the mV to under 100 then it's a pretty good card. 1200/1550 on my elpida 290x with +87mV and 1200/1650 on my hynix with +94 mV I think and of course 50% power limit.
> 
> What I would consider a golden 290x is something that will do 1200/1600 with under 50mV and 50 power limit and 1350+/1700+ with about 120-150mV for 24/7 use in a loop. You could always do suicide runs to see what you're topping out at even if it's only stable enough for running a benchmark or two.


It's a 290. In general they seem to clock higher; I'm assuming it's because they don't have the extra shaders to power.


----------



## arrow0309

Quote:


> Originally Posted by *phallacy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *arrow0309*
> 
> Hi guys!
> 
> Reading
> 
> __
> https://www.reddit.com/r/1wgh2f/psa_probably_290x_black_screen_fix_use_two_pcie/
> about the black screens (I still haven't got one though) I started to "investigate" and control how my 12v rail voltages are, especially under load on the pcie rail (I'm using two separate cables, one 8pin and one 6pin) of my Cooler Master 850W Silent Pro M
> And I found an "ugly" 11.63-11.50v (gpuz) in idle and low load and a "worse", down to 11.25v (gpuz) or 11.31v (hwinfo64) 12v real voltage after an Unigine Heaven 4.0 bench at 1180/1500 @ +156mv, +50PL
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.xtremeshack.com/photos/20140410139712056164242.jpg
> 
> So I started to first of all control and change the position of the two pcie cables connectors plugged on my psu
> But I got the very same results:
> 
> http://www.xtremeshack.com/photos/20140410139712058149796.jpg
> http://www.xtremeshack.com/photos/20140410139712060841855.jpg
> 
> In the end I even "tried" to power the 8pin with the 2x4pin molex to 8pin adapter, using two independent molex lines from my psu, the result was worse however, only 11.28v min on the 12v rail and lots of cable mess (at least I tried)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.xtremeshack.com/photos/20140410139712062110427.jpg
> 
> My psu is about 4 years old, kept with the fan filter, I've even replaced the fan with a better one (a couple of years ago already, becoming dead silent)
> 
> 
> 
> 
> 
> 
> 
> 
> But should I get worried somehow, does this affect any of my 290's performance in mid oc or its stability?
> Or I only have to start considering a nice psu replacement?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hey, I looked up your PSU on jonnyguru and if you go to the end of this review here: http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story5&reid=166 it mentions voltage regulation as one of the negatives of that model ( I linked you to the 1000w one but I believe if it's the same model name but different wattages, mostly the same internals would be used and have similar traits). Those voltages do seem out of the normal range for the 12v rail though. With my EVGA it's about 11.88-12.06 depending on when I look at the reading. A 850w psu should be able to handle a single 290x just fine but because it is 4 years old that's already a really long time in upgrade time. Maybe it is time to consider getting a new one? My 2 cents

+Rep
Thanks for the answer. Yeah, I've read that review and others about the low +12V issue, and it does seem to be low voltage indeed.
An overclocked 290 is a power-hungry video card, and the minimum for the 12V rail within the ATX 2.3 spec (±5%) is 11.40V; mine is showing down to 11.30V, so I guess I do have to think about getting a new one.
Hoping it lasts one more month; then I may get the Corsair RM850 that I've found at a nice price.

Cheers!


----------



## rdr09

Quote:


> Originally Posted by *Matt-Matt*
> 
> Would love to join the club! 1200 is all for now though!
> 
> 
> 
> What would you guys say is a "Golden" 290? I'm doing 1200MHz at +56mv in Trixx, the RAM is terrible however.
> 
> Linky to validation


Matt, certainly above average if it only needs 56.


----------



## Jflisk

Well, my second R9 290X showed up yesterday. Just waiting for the water blocks and bridge to show up now; I have to admit, EK makes one nice water block.


----------



## velocityx

Quote:


> Originally Posted by *arrow0309*
> 
> Hi guys!
> 
> Reading
> 
> __
> https://www.reddit.com/r/1wgh2f/psa_probably_290x_black_screen_fix_use_two_pcie/
> about the black screens (I still haven't got one though) I started to "investigate" and control how my 12v rail voltages are, especially under load on the pcie rail (I'm using two separate cables, one 8pin and one 6pin) of my Cooler Master 850W Silent Pro M
> And I found an "ugly" 11.63-11.50v (gpuz) in idle and low load and a "worse", down to 11.25v (gpuz) or 11.31v (hwinfo64) 12v real voltage after an Unigine Heaven 4.0 bench at 1180/1500 @ +156mv, +50PL
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.xtremeshack.com/photos/20140410139712056164242.jpg
> 
> So I started to first of all control and change the position of the two pcie cables connectors plugged on my psu
> But I got the very same results:
> 
> http://www.xtremeshack.com/photos/20140410139712058149796.jpg
> http://www.xtremeshack.com/photos/20140410139712060841855.jpg
> 
> In the end I even "tried" to power the 8pin with the 2x4pin molex to 8pin adapter, using two independent molex lines from my psu, the result was worse however, only 11.28v min on the 12v rail and lots of cable mess (at least I tried)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.xtremeshack.com/photos/20140410139712062110427.jpg
> 
> My psu is about 4 years old, kept with the fan filter, I've even replaced the fan with a better one (a couple of years ago already, becoming dead silent)
> 
> 
> 
> 
> 
> 
> 
> 
> But should I get worried somehow, does this affect any of my 290's performance in mid oc or its stability?
> Or I only have to start considering a nice psu replacement?


I read that too.

It made me really angry, because I had an XFX 650W PSU and a single 290 and it black-screened; then I got another 290 and a new gold 850W XFX PSU and it still managed to black-screen or act weird. So either this has nothing to do with it or I dunno, but I can't imagine it happening on two quality Seasonic-built units.


----------



## Forceman

Quote:


> Originally Posted by *arrow0309*
> 
> +Rep
> Thanks for the answer, yeah I've red that review and other +12v low issue related and it seems that's a low voltage indeed
> The 290 in oc is a power hungry video card and the min for the 12v rail is of 11.40v within the atx 2.3 specs (+/- 5%), mine's showing down to 11.30 so I guess I do have to think of getting a new one.
> Hoping is gonna last for one more month and I'll may get the Corsair RM850 that I've found at a nice price
> 
> Cheers!


Do you have a multimeter? I'd check the voltage on the 12V rail with that, if you have one, before I got a new PSU. I don't think the GPU-Z readings are all that accurate.

Edit: Just checked mine. GPU-Z shows 11.88 at idle and 11.75 under load, while the multimeter showed 11.94 at idle and 11.90 at load. So the voltage drop GPU-Z is reading is either nonexistent, or it is occurring on the card, not on the input. That was checking at the PCIe power plug (I checked the 6 pin and the 8 pin and they were both the same).
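One way to compare the two sources is percentage droop from idle to load rather than absolute readings; the software sensor and the DMM then disagree noticeably. A tiny sketch of that comparison (the numbers are the ones from this post; the helper name is made up):

```python
def droop_pct(idle_v, load_v):
    """Percentage drop from idle voltage to load voltage."""
    return (idle_v - load_v) / idle_v * 100

# (idle, load) readings from the post above:
readings = {"GPU-Z": (11.88, 11.75), "DMM at the PCIe plug": (11.94, 11.90)}
for source, (idle, load) in readings.items():
    print(f"{source}: {droop_pct(idle, load):.2f}% droop under load")
```

GPU-Z reports roughly three times the relative droop the meter sees, consistent with the drop happening on the card (or in the sensor) rather than at the plug.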


----------



## BiG StroOnZ

Anyone with a 290 or 290X who came from a 600-series NVIDIA card care to share their experience with me? I'm planning on upgrading my GPU real soon and am torn between a non-reference 290 and a 780.

Currently looking at used Sapphire Tri-X 290's on eBay and loving the prices I'm seeing.


----------



## Jack Mac

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> Anyone with an 290 or 290X coming from an 6 series nVidia care to share their experience with me? Planning on upgrading GPU real soon and am torn between a non-reference 290 or a 780.
> 
> Currently looking at used Sapphire Tri-X 290's on eBay and loving the prices I'm seeing.


Go for it, I switched to a 290 from a GTX 670 and loved the performance increase. Easily worth it, I really miss my 290.


----------



## friend'scatdied

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> Anyone with an 290 or 290X coming from an 6 series nVidia care to share their experience with me? Planning on upgrading GPU real soon and am torn between a non-reference 290 or a 780.
> 
> Currently looking at used Sapphire Tri-X 290's on eBay and loving the prices I'm seeing.


780.

Less heat to worry about, and better thermal/acoustic performance with the same cooler. Plus, EVGA.

I went from a DirectCU II GTX 670 to an EVGA reference GTX 780 Ti to two different 290Xs (reference and Gaming), and I would stick with the 780 if I found the right deal.

Frame this opinion in the context of my circumstances, however: I have an SFF/mITX case, so thermal performance is very important to me. If you have excellent airflow (particularly exhaust), the two become roughly comparable.


----------



## arrow0309

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *arrow0309*
> 
> +Rep
> Thanks for the answer, yeah I've red that review and other +12v low issue related and it seems that's a low voltage indeed
> The 290 in oc is a power hungry video card and the min for the 12v rail is of 11.40v within the atx 2.3 specs (+/- 5%), mine's showing down to 11.30 so I guess I do have to think of getting a new one.
> Hoping is gonna last for one more month and I'll may get the Corsair RM850 that I've found at a nice price
> 
> Cheers!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do you have a multimeter? I'd check the voltage on the 12V rail with that, if you have one, before I got a new PSU. I don't think the GPU-Z readings are all that accurate.
> 
> Edit: Just checked mine. GPU-Z shows 11.88 at idle and 11.75 under load, while the multimeter showed 11.94 at idle and 11.90 at load. So the voltage drop GPU-Z is reading is either nonexistent, or it is occurring on the card, not on the input. That was checking at the PCIe power plug (I checked the 6 pin and the 8 pin and they were both the same).
Click to expand...

Thanks, I'll borrow a multimeter soon and also run the OCCT PSU test tomorrow.
I'll let you know the results.
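For anyone following along with their own readings, the tolerance math being discussed is simple enough to script. This is just a sketch of the ATX 2.3 arithmetic the posts above are using (12V nominal, +/-5%); the sample readings are the ones quoted in this exchange, not measurements of your hardware.

```python
# Check +12V rail readings against the ATX 2.3 tolerance window (12V +/- 5%).
NOMINAL = 12.0
TOLERANCE = 0.05  # +/-5%, per the ATX 2.3 spec cited above

low_limit = NOMINAL * (1 - TOLERANCE)   # 11.40V
high_limit = NOMINAL * (1 + TOLERANCE)  # 12.60V

def in_spec(volts: float) -> bool:
    """Return True if a measured +12V reading falls inside the ATX window."""
    return low_limit <= volts <= high_limit

# Readings quoted in the thread: GPU-Z under load vs. multimeter at the plug.
for label, volts in [("GPU-Z load", 11.30), ("multimeter load", 11.90)]:
    status = "OK" if in_spec(volts) else "OUT OF SPEC"
    print(f"{label}: {volts:.2f}V -> {status}")
```

As Forceman's comparison suggests, software sensor readings can sit well below what a multimeter shows at the plug, so an out-of-spec GPU-Z number alone isn't proof the PSU is failing.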


----------



## BiG StroOnZ

Quote:


> Originally Posted by *Jack Mac*
> 
> Go for it, I switched to a 290 from a GTX 670 and loved the performance increase. Easily worth it, I really miss my 290.


How were installation and drivers? Any issues? Was it as simple as uninstalling the old drivers, removing the card, installing the 290, installing the AMD drivers, and playing?
Quote:


> Originally Posted by *friend'scatdied*
> 
> 780.
> 
> Less heat to worry about, better thermal/acoustic performance when using the same cooler. Plus, eVGA.
> 
> I went from a DirectCU II GTX 670 to a eVGA reference GTX 780 Ti to two different 290Xs (reference and Gaming), and would stick to the 780 if I found the right deal.
> 
> Frame this opinion in the context of circumstances however, as I have a SFF/mITX case so thermal performance is very important to me. If you have excellent airflow (particularly exhaust), they become roughly comparable.


If you have adequate cooling and go for a non-reference 290, would your opinion change? My 670 stays under 60C at 1320MHz/7300MHz in Heaven or 3DMark11 with my cooling setup. I don't think cooling the card will become an issue. I'm mostly concerned with drivers being a possible problem after hearing the horror stories (not sure if that's just rumor or partially true).

I really am leaning towards the 290 because you can get them off of eBay for around $360-400 easily. That and it has 4GB of VRAM, so I won't have to touch the card for a while; if anything I'd get a second and go Crossfire. Only thing holding me back from the 780 as a confirmed buy is the 3GB of VRAM, or spending an extra $150 for eVGA's 6GB 780.


----------



## friend'scatdied

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> If you have adequate cooling and go for a non-reference 290, would your opinion change? My 670 stays under 60C at 1320MHz/7300MHz in Heaven or 3DMark11 with my cooling setup. I don't think cooling the card will become an issue. I'm mostly concerned with drivers being a possible problem after hearing the horror stories (not sure if that's just rumor or partially true).
> 
> I really am leaning towards the 290 because you can get them off of eBay for around $360-400 easily. That and it has 4GB of VRAM, so I won't have to touch the card for a while; if anything I'd get a second and go Crossfire. Only thing holding me back from the 780 as a confirmed buy is the 3GB of VRAM.


I have an aftermarket 290X right now and the heat situation hasn't improved dramatically. It just dumps a great deal of hot air relative to the 780 series, and could increase the temperatures of your other components. With your temperatures on that level of overclock with the 670, you should be more than fine.

I haven't had any driver issues with the 290X. There was one crash during BF4 (with Mantle enabled) but that was easily resolved with a quick Google (that told me to disable the iGPU). Temperatures aside, it's a very pleasant experience.

ASUS/Gigabyte/MSI cards off eBay are a very good buy right now.


----------



## Jack Mac

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> How were installation and drivers? Any issues? Was it as simple as uninstalling the old drivers, removing the card, installing the 290, installing the AMD drivers, and playing?
> If you have adequate cooling and go for a non-reference 290, would your opinion change? My 670 stays under 60C at 1320MHz/7300MHz in Heaven or 3DMark11 with my cooling setup. I don't think cooling the card will become an issue. I'm mostly concerned with drivers being a possible problem after hearing the horror stories (not sure if that's just rumor or partially true).
> 
> I really am leaning towards the 290 because you can get them off of eBay for around $360-400 easily. That and it has 4GB of VRAM, so I won't have to touch the card for a while; if anything I'd get a second and go Crossfire. Only thing holding me back from the 780 as a confirmed buy is the 3GB of VRAM, or spending an extra $150 for eVGA's 6GB 780.


Yeah, installing the drivers was really simple; it's basically just plug and play. I recommend using BradleyW's method for uninstalling drivers; he has two threads here on OCN for AMD and Nvidia, and it's a simple, painless process. Drivers and stability should be a non-issue if you stick to 13.12 WHQL; the newer 14.x builds have a couple of minor issues, such as the power limit not working correctly. Besides that, the only other issues I had with a 290 when I owned one were with Firefox's Flash player and the AMD reference cooler, which was a tad too loud IMO.


----------



## heroxoot

Looks like I'll be joining you guys soon. MSI is replacing my 7970 lightning with a 290 Gaming series. Does anyone have experience with the MSI 290 and can give me some insight on OCing them? They originally gave me a 280x as a replacement but it was defective so they upgraded me for my troubles. The 280x seemed voltage locked so I hope the 290 will be easy to OC on.


----------



## Durquavian

Quote:


> Originally Posted by *heroxoot*
> 
> Looks like I'll be joining you guys soon. MSI is replacing my 7970 lightning with a 290 Gaming series. Does anyone have experience with the MSI 290 and can give me some insight on OCing them? They originally gave me a 280x as a replacement but it was defective so they upgraded me for my troubles. The 280x seemed voltage locked so I hope the 290 will be easy to OC on.


Congratulations!


----------



## sugarhell

Quote:


> Originally Posted by *friend'scatdied*
> 
> I have an aftermarket 290X right now and the heat situation hasn't improved dramatically. It just dumps a great deal of hot air relative to the 780 series, and could increase the temperatures of your other components. With your temperatures on that level of overclock with the 670, you should be more than fine.
> 
> I haven't had any driver issues with the 290X. There was one crash during BF4 (with Mantle enabled) but that was easily resolved with a quick Google (that told me to disable the iGPU). Temperatures aside, it's a very pleasant experience.
> 
> ASUS/Gigabyte/MSI cards off eBay are a very good buy right now.


You have a problem with the temps even with a custom card? Buy a better case with better airflow. Most custom cards run below 75C. My PCS 290 runs at 65C without a case.

Or just get a loop; it's the best investment you can make in a high-end system.


----------



## Kokin

Quote:


> Originally Posted by *devilhead*
> 
> Hi, what do you think, guys: if I put a card with a water block on it into the PC without connecting the card to the loop, will it be OK? I only want to flash the BIOS on the card and then sell it; I don't want to connect the card to the loop, as it would be a lot of work.
> 
> 
> 
> 
> 
> 
> 
> So what do you think, will it overheat? In my opinion it won't; it can only overheat if I play a game with the card.


You can do it, but it has to be quick. You have maybe a 10-minute window before the card shuts down due to thermal limits. There have been times when I shut down my pump and fans to test this, and with just regular browsing (no games), my system shut down after 15-20 minutes of usage.

Make sure there is water inside the block (add plugs) so that the heat has more than just air to transfer to after heating up the metal.

Also, another good thing is that there are two BIOSes, so if you fudge one up, you can always boot from the other and reflash the broken one.
Quote:


> Originally Posted by *heroxoot*
> 
> Looks like I'll be joining you guys soon. MSI is replacing my 7970 lightning with a 290 Gaming series. Does anyone have experience with the MSI 290 and can give me some insight on OCing them? They originally gave me a 280x as a replacement but it was defective so they upgraded me for my troubles. The 280x seemed voltage locked so I hope the 290 will be easy to OC on.


Wow that is quite the upgrade, congrats! The 290 should OC just the same since you have voltage control. Some 280x/7900 series GPUs had locked voltage due to OEMs using different VRMs that couldn't be controlled via software (mostly Asus GPUs).

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> I really am leaning towards the 290 because you can get them off of eBay for around $360-400 easily. That and it has 4GB of VRAM so I won't have to touch the card for a while, if anything get a second and go Crossfire. Only thing holding me back from the 780 as being a confirmed buy is the 3GB of VRAM or spending an extra $150 for eVga's 6GB 780.


If you look at the 290 (non-X), many are in the low $300s but use the reference cooler so you may have to spend extra for a beefier aftermarket cooler.


----------



## Red1776

Quote:


> Originally Posted by *heroxoot*
> 
> Looks like I'll be joining you guys soon. MSI is replacing my 7970 lightning with a 290 Gaming series. Does anyone have experience with the MSI 290 and can give me some insight on OCing them? They originally gave me a 280x as a replacement but it was defective so they upgraded me for my troubles. The 280x seemed voltage locked so I hope the 290 will be easy to OC on.


I have 4 x MSI 4GB R9 290X Gaming cards and so far have got all of them to 1150-1165 core with a +75mV bump. I am putting blocks on them next week; if you want to follow along, I will share the results with you.


----------



## BiG StroOnZ

Quote:


> Originally Posted by *friend'scatdied*
> 
> I have an aftermarket 290X right now and the heat situation hasn't improved dramatically. It just dumps a great deal of hot air relative to the 780 series, and could increase the temperatures of your other components. With your temperatures on that level of overclock with the 670, you should be more than fine.
> 
> I haven't had any driver issues with the 290X. There was one crash during BF4 (with Mantle enabled) but that was easily resolved with a quick Google (that told me to disable the iGPU). Temperatures aside, it's a very pleasant experience.
> 
> ASUS/Gigabyte/MSI cards off eBay are a very good buy right now.


Thanks for the advice. I'm leaning towards a Sapphire Tri-X 290 for $380; how does that sound?
Quote:


> Originally Posted by *Jack Mac*
> 
> Yeah, installing the drivers was really simple; it's basically just plug and play. I recommend using BradleyW's method for uninstalling drivers; he has two threads here on OCN for AMD and Nvidia, and it's a simple, painless process. Drivers and stability should be a non-issue if you stick to 13.12 WHQL; the newer 14.x builds have a couple of minor issues, such as the power limit not working correctly. Besides that, the only other issues I had with a 290 when I owned one were with Firefox's Flash player and the AMD reference cooler, which was a tad too loud IMO.


Any reason why you went to a 780 though?


----------



## friend'scatdied

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> Thanks for the advice, I'm leaning towards a Sapphire Tri-X 290 for $380 how's that sound?


It's a great card, but try to get a copy of the original invoice/receipt from the seller so you can use it for warranty purposes if you ever need to.


----------



## heroxoot

Quote:


> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Looks like I'll be joining you guys soon. MSI is replacing my 7970 lightning with a 290 Gaming series. Does anyone have experience with the MSI 290 and can give me some insight on OCing them? They originally gave me a 280x as a replacement but it was defective so they upgraded me for my troubles. The 280x seemed voltage locked so I hope the 290 will be easy to OC on.
> 
> 
> 
> I have 4 x MSI 4GB R290X gaming and so far have got all of them to 1150-1165 core with .75mv bump. I am putting blocks on them next week if you want to follow along I will share results with you.
Click to expand...

I don't think I'll be doing that, not until the warranty has run out. I should have another year left from my Lightning's warranty.

But heh, he said it was new, but he wanted to test it anyway to make sure. The 280X replacement was also new, and the fans were bad. Glad to hear good things about the Gaming-series 290, though. I'm not even sure if I will OC it yet, but the 290 at stock should not be too far off the 7970 I had @ 1225/1700, right? This was implied to me in another thread, so I hope for the best. Considering I only run a single 27" monitor at 1080p for gaming, I can only imagine the 290 will push 120fps in games like BF4 on Ultra.


----------



## Red1776

Quote:


> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Looks like I'll be joining you guys soon. MSI is replacing my 7970 lightning with a 290 Gaming series. Does anyone have experience with the MSI 290 and can give me some insight on OCing them? They originally gave me a 280x as a replacement but it was defective so they upgraded me for my troubles. The 280x seemed voltage locked so I hope the 290 will be easy to OC on.
> 
> 
> 
> I have 4 x MSI 4GB R290X gaming and so far have got all of them to 1150-1165 core with .75mv bump. I am putting blocks on them next week if you want to follow along I will share results with you.
> 
> Click to expand...
> 
> I don't think I'll be doing that, not till the warranty is blown out. I should have another year from my lightning warranty.
> 
> But heh, he said it was new but he wanted to test it anyway to make sure. The 280x replacement was also new and the fans were bad. Glad to hear good things about the G series 290 though. I'm not even sure if I will OC it yet, but the difference in hardware between the 7970 I had @ 1225/1700 should not be too far off what the 290 does on stock right? This was implied to me in another thread, so I hope for the best. Considering I only run a single 27" monitor doing 1080p for gaming, I can only imagine the 290 will push 120fps in games like BF4 on Ultra.
Click to expand...

Well, the 280X is a rebrand of the 7970, so if you run the 290 at, say, 1060-1100 (an easy and mild OC), you will see a nice bump.


----------



## heroxoot

Quote:


> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Looks like I'll be joining you guys soon. MSI is replacing my 7970 lightning with a 290 Gaming series. Does anyone have experience with the MSI 290 and can give me some insight on OCing them? They originally gave me a 280x as a replacement but it was defective so they upgraded me for my troubles. The 280x seemed voltage locked so I hope the 290 will be easy to OC on.
> 
> 
> 
> I have 4 x MSI 4GB R290X gaming and so far have got all of them to 1150-1165 core with .75mv bump. I am putting blocks on them next week if you want to follow along I will share results with you.
> 
> Click to expand...
> 
> I don't think I'll be doing that, not till the warranty is blown out. I should have another year from my lightning warranty.
> 
> But heh, he said it was new but he wanted to test it anyway to make sure. The 280x replacement was also new and the fans were bad. Glad to hear good things about the G series 290 though. I'm not even sure if I will OC it yet, but the difference in hardware between the 7970 I had @ 1225/1700 should not be too far off what the 290 does on stock right? This was implied to me in another thread, so I hope for the best. Considering I only run a single 27" monitor doing 1080p for gaming, I can only imagine the 290 will push 120fps in games like BF4 on Ultra.
> 
> Click to expand...
> 
> well the 280x is a rebrand of the 7970 so if you run the 290 at say 1060-1100 (an easy and mild OC ) you will see a nice bump.
Click to expand...

That's great to hear. I know the 280X is just a rebranding, but I am unsure of the real performance differences. I'm sure I can leave it stock and see the difference after some research. Just waiting now for the guy to either send me tracking or some kind of confirmation from MSI. I am hoping they will get it out by tomorrow.


----------



## Red1776

Cool, let me know what kind of results you get when you get to play around with it, if you would.


----------



## battleaxe

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> Anyone with an 290 or 290X coming from an 6 series nVidia care to share their experience with me? Planning on upgrading GPU real soon and am torn between a non-reference 290 or a 780.
> 
> Currently looking at used Sapphire Tri-X 290's on eBay and loving the prices I'm seeing.


I have 670s in SLI and also 290s in Crossfire. All are great cards IMO, and the 290 is obviously more powerful. I do really like my 290s, and there's no way I'm selling them unless I absolutely had to for finances or something. I tend to keep most of my cards. I still have a 5770 lying around here. Still using it, actually.


----------



## King4x4

Video of the funky OC


----------



## Jack Mac

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> Thanks for the advice, I'm leaning towards a Sapphire Tri-X 290 for $380 how's that sound?
> Any reason why you went to a 780 though?


$380 for a Tri-X sounds like an awesome deal. And I went for the 780 because my family was complaining about the noise the reference cooler made.


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *Red1776*
> 
> well the 280x is a rebrand of the 7970 so if you run the 290 at say 1060-1100 (an easy and mild OC ) you will see a nice bump.


The new 280X is not a rebrand; the old one is.
The new 280X runs the Tahiti XTL core, which is rated to pull ~250W; that changes the whole game in terms of performance per watt when comparing the 280X and the 7970.









Problem is, not every manufacturer is pushing the newer chips (for obvious reasons).

There are a few articles online about it, though; I think MSI and Gigabyte have been for a few months now.


----------



## Red1776

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> well the 280x is a rebrand of the 7970 so if you run the 290 at say 1060-1100 (an easy and mild OC ) you will see a nice bump.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The new 280x is not a rebrand, the old is
> New 280x runs Tahiti XTL core gpu which is rated to pull ~250w, it changes the whole game in terms of performance per watt when comparing 280x and 7970
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Problem is, not every single manufacturer is pushing the newer chips ( for obvious reasons )
> 
> There are a few articles online about it though, I think MSI and Gigabyte have been for a few months now
Click to expand...

OK, cool. Last I read, they were not out of 7970 (280X) stock. Makes sense, though, that some would not want to advertise that.


----------



## Dasboogieman

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> Thanks for the advice, I'm leaning towards a Sapphire Tri-X 290 for $380 how's that sound?
> Any reason why you went to a 780 though?


I was sorely tempted by the 780 Ti, since I've used NVIDIA for ages and know all the NVIDIA Inspector driver flags, performance quirks, BIOS modding and power characteristics back to front.

However, the combination of 4GB (I like my texture mods in Skyrim) and the Sapphire Tri-X engineering was too good to pass up for the 290.

Before buying the 290, check out this post; it's unusually objective for an Overclockers.UK forum post, but insightful.
http://forums.overclockers.co.uk/showthread.php?t=18575360

I can tell you from personal experience with the Sapphire Tri-X that 100% fan speed is really, really loud, but thankfully with no high-pitched whine; it's more of a deeper-sounding hairdryer. The cooling capacity is truly godly, though, so you probably won't need more than 50%. Judging from the performance of the cooler, core cooling is far superior to the reference AMD solution, but the VRM cooling is a tad weaker. It won't be uncommon for your core to be in the low 70s but your VRMs at 90C+ when heavily overvolted.

I can definitely recommend the Tri-X version, especially for $380; I got mine for $530, which I thought was a damn good price already.
Plus one of the best perks of all? Hynix memory! Seriously, I got mine to 1500MHz with zero effort (since apparently it is rated for 6000MHz at stock).
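For reference, the 1500MHz and 6000MHz figures above are the same clock expressed two ways: GDDR5 transfers four bits per clock per pin, so the "effective" rate is 4x the memory clock most tools report. A quick sketch of that arithmetic (the bandwidth line assumes the 290/290X's 512-bit bus):

```python
# GDDR5 is quad-pumped: effective data rate = 4 x the reported memory clock.
def effective_rate_mhz(mem_clock_mhz: float) -> float:
    return mem_clock_mhz * 4

def bandwidth_gbps(mem_clock_mhz: float, bus_width_bits: int = 512) -> float:
    """Theoretical memory bandwidth in GB/s for a given clock and bus width."""
    return effective_rate_mhz(mem_clock_mhz) * 1e6 * bus_width_bits / 8 / 1e9

print(effective_rate_mhz(1500))      # 6000.0, the Hynix rating quoted above
print(round(bandwidth_gbps(1250)))  # 320 GB/s at the 290's stock 1250MHz
```

So a 1500MHz overclock on a 290 works out to 6.0 Gbps per pin, or 384 GB/s across the 512-bit bus.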


----------



## VSG

HIS has FINALLY released their IceQ X2 card: http://www.techpowerup.com/199745/his-announces-higher-overclocked-r9-290x-iceq-x-turbo-graphics-card.html

I was hoping they took this long in order to get the design back to normal, but no.


----------



## Matt-Matt

Quote:


> Originally Posted by *rdr09*
> 
> Matt, certainly above average if it only needs 56.


Quote:


> Originally Posted by *phallacy*
> 
> Nice overclock on the core! At 1200 it's safe to say, IMO, that you have a decent 290X. If you can maintain 1200 on the core and reach 1500+ on the RAM (1600+ if it's Hynix) by upping the voltage by under 100mV, then it's a pretty good card. 1200/1550 on my Elpida 290X with +87mV, and 1200/1650 on my Hynix with +94mV I think, and of course the 50% power limit.
> 
> What I would consider a golden 290X is something that will do 1200/1600 with under +50mV and the 50% power limit, and 1350+/1700+ with about +120-150mV for 24/7 use in a loop. You could always do suicide runs to see what you're topping out at, even if it's only stable enough for running a benchmark or two.


Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ummm, i'd say that's pretty good, i need +80mV for one card and +100mV for the other


Nice, it may use less. I don't know; I just know that it can do 1200 stable so far. I'll test it more with some gaming tonight. Hehe.

I'll try the vRAM out; any tips for Elpida?

I'll try to max it out over the Easter break one day, along with my CPU once again (around 4.4GHz).


----------



## Durquavian

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> The new 280x is not a rebrand, the old is
> New 280x runs Tahiti XTL core gpu which is rated to pull ~250w, it changes the whole game in terms of performance per watt when comparing 280x and 7970
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Problem is, not every single manufacturer is pushing the newer chips ( for obvious reasons )
> 
> There are a few articles online about it though, I think MSI and Gigabyte have been for a few months now


Yeah everyone was throwing a fit that they didn't get the new chips. They thought it was a secret to help sell off the old chips first.


----------



## Sgt Bilko

Quote:


> Originally Posted by *geggeg*
> 
> HIS has FINALLY released their IceQ X2 card: http://www.techpowerup.com/199745/his-announces-higher-overclocked-r9-290x-iceq-x-turbo-graphics-card.html
> 
> I was hopeful they took this long to get the design back to normal but no


Very disappointed in HIS here. That card is 100% ugly: gold cooler with blue PCB and a backplate that looks like it shouldn't fit. HIS always had decent-looking cards... what happened?

Quote:


> Originally Posted by *Matt-Matt*
> 
> Nice, it may use less. I don't know, I just know that it can do 1200 stable so far. I'll test it more with some gaming tonight. Hehe.
> 
> I'll try the vRAM out, any tips for Elpida?
> 
> I'll try to max it out over the easter break one day, along with my CPU once again. (Around 4.4GHz)


Elpida usually maxes out around 1500MHz (what it's rated for), but examples have hit 1625MHz before.

I don't know what the max clock is on mine due to temps, but I can run 1250/1500 on both cards at +155mV/+175mV; hoping the extra will get me over 1300 when the temps start dropping here.


----------



## Mega Man

Quote:


> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Looks like I'll be joining you guys soon. MSI is replacing my 7970 lightning with a 290 Gaming series. Does anyone have experience with the MSI 290 and can give me some insight on OCing them? They originally gave me a 280x as a replacement but it was defective so they upgraded me for my troubles. The 280x seemed voltage locked so I hope the 290 will be easy to OC on.
> 
> 
> 
> I have 4 x MSI 4GB R290X gaming and so far have got all of them to 1150-1165 core with .75mv bump. I am putting blocks on them next week if you want to follow along I will share results with you.
> 
> Click to expand...
> 
> I don't think I'll be doing that, not till the warranty is blown out. I should have another year from my lightning warranty.
> 
> But heh, he said it was new but he wanted to test it anyway to make sure. The 280x replacement was also new and the fans were bad. Glad to hear good things about the G series 290 though. I'm not even sure if I will OC it yet, but the difference in hardware between the 7970 I had @ 1225/1700 should not be too far off what the 290 does on stock right? This was implied to me in another thread, so I hope for the best. Considering I only run a single 27" monitor doing 1080p for gaming, I can only imagine the 290 will push 120fps in games like BF4 on Ultra.
Click to expand...

Why? MSI honors the warranty after a block is installed, as long as you don't cause damage to the GPU.


----------



## Matt-Matt

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Very disappointed in HIS here. That card is 100% ugly: gold cooler with blue PCB and a backplate that looks like it shouldn't fit. HIS always had decent-looking cards... what happened?
> 
> Elpida usually maxes out around 1500MHz (what it's rated for), but examples have hit 1625MHz before.
> 
> I don't know what the max clock is on mine due to temps, but I can run 1250/1500 on both cards at +155mV/+175mV; hoping the extra will get me over 1300 when the temps start dropping here.


Okay awesome, I'll see what I can do hey?


----------



## heroxoot

Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Looks like I'll be joining you guys soon. MSI is replacing my 7970 lightning with a 290 Gaming series. Does anyone have experience with the MSI 290 and can give me some insight on OCing them? They originally gave me a 280x as a replacement but it was defective so they upgraded me for my troubles. The 280x seemed voltage locked so I hope the 290 will be easy to OC on.
> 
> 
> 
> I have 4 x MSI 4GB R290X gaming and so far have got all of them to 1150-1165 core with .75mv bump. I am putting blocks on them next week if you want to follow along I will share results with you.
> 
> Click to expand...
> 
> I don't think I'll be doing that, not till the warranty is blown out. I should have another year from my lightning warranty.
> 
> But heh, he said it was new but he wanted to test it anyway to make sure. The 280x replacement was also new and the fans were bad. Glad to hear good things about the G series 290 though. I'm not even sure if I will OC it yet, but the difference in hardware between the 7970 I had @ 1225/1700 should not be too far off what the 290 does on stock right? This was implied to me in another thread, so I hope for the best. Considering I only run a single 27" monitor doing 1080p for gaming, I can only imagine the 290 will push 120fps in games like BF4 on Ultra.
> 
> Click to expand...
> 
> why ? msi honors warranty after blocked as long as you dont cause damage to gpu
Click to expand...

MSI now puts a sticker on one of the stock cooler screws saying the warranty is void if it's removed.


----------



## Mega Man

Quote:


> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Looks like I'll be joining you guys soon. MSI is replacing my 7970 lightning with a 290 Gaming series. Does anyone have experience with the MSI 290 and can give me some insight on OCing them? They originally gave me a 280x as a replacement but it was defective so they upgraded me for my troubles. The 280x seemed voltage locked so I hope the 290 will be easy to OC on.
> 
> 
> 
> I have 4 x MSI 4GB R290X gaming and so far have got all of them to 1150-1165 core with .75mv bump. I am putting blocks on them next week if you want to follow along I will share results with you.
> 
> Click to expand...
> 
> I don't think I'll be doing that, not till the warranty is blown out. I should have another year from my lightning warranty.
> 
> But heh, he said it was new but he wanted to test it anyway to make sure. The 280x replacement was also new and the fans were bad. Glad to hear good things about the G series 290 though. I'm not even sure if I will OC it yet, but the difference in hardware between the 7970 I had @ 1225/1700 should not be too far off what the 290 does on stock right? This was implied to me in another thread, so I hope for the best. Considering I only run a single 27" monitor doing 1080p for gaming, I can only imagine the 290 will push 120fps in games like BF4 on Ultra.
> 
> Click to expand...
> 
> why ? msi honors warranty after blocked as long as you dont cause damage to gpu
> 
> Click to expand...
> 
> MSI puts a sticker on it now saying void if removed on a screw for the stock cooler.
Click to expand...

They have for quite a while, and people still don't have issues. You can take out the screw with pliers and electrical tape without marking it, if it worries you that much; I just use alcohol to get all the residue off and claim it never had one.


----------



## heroxoot

I have no idea; my Lightning showed no such thing. I'm sure I could just peel the sticker off and put it back on if I had to. But I have no plans to do heavy overclocking, nor do I have the funds right now for a block.


----------



## Matt-Matt

Quote:


> Originally Posted by *heroxoot*
> 
> MSI puts a sticker on it now saying void if removed on a screw for the stock cooler.


Yeah, all the companies do now as far as I know.
Quote:


> Originally Posted by *Mega Man*
> 
> they have for quite a while and people still dont have issues you can take out the screw with pliers and elec tape without marking it if it worries you so much, i just use alcohol to get all res off and claim it never had it


This is a good idea! Might do that to my XFX card if I ever need to... heh.


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *Mega Man*
> 
> they have for quite a while and people still dont have issues you can take out the screw with pliers and elec tape without marking it if it worries you so much, i just use alcohol to get all res off and claim it never had it


Yeah, it seems MSI has posted this publicly too, correct? I remember seeing posts of reps from people's support tickets mentioning that they honor the warranty if the card is not damaged by removal of the HSF.

I guess that sticker is there to scare some people away? I know for other companies it's an instant denial of warranty :/ which really sucks.

Then again, I've read a bunch of stories of people removing the HSF on many different manufacturers' cards and never having an issue with warranty, so who knows.


----------



## 66racer

My ASUS Nvidia card had a sticker on it too. Well, screw warranty, I volt-modded it anyway, lol. How unlocked are the ASUS 290s? I felt compelled to volt-mod the 770 since NVIDIA basically locks the voltage to 1.212V. Is the stock AMD/ASUS 290 voltage more than enough, or is there a lot of usable voltage left on the table? I was tempted by an ASUS R9 290 today on my lunch break at the local Fry's, but chickened out.


----------



## DeadlyDNA

I am going to be posting benchmarks of 4K Eyefinity and Crossfire scaling on R9 290 cards, from single to quad. These cards are really beasts, and Crossfire scaling is looking pretty good.

http://www.overclock.net/t/1481154/4k-eyefinity-crossfire-scaling-from-1-2-3-4-gpus-benchmarks#post_22094068

Hopefully I posted in the right section; I hope it provides good information while representing our beloved R9 cards


----------



## arrow0309

Quote:


> Originally Posted by *arrow0309*
> 
> Hi guys!
> 
> Reading
> 
> __
> https://www.reddit.com/r/1wgh2f/psa_probably_290x_black_screen_fix_use_two_pcie/
> about the black screens (I still haven't got one though) I started to "investigate" and control how my 12v rail voltages are, especially under load on the pcie rail (I'm using two separate cables, one 8pin and one 6pin) of my Cooler Master 850W Silent Pro M
> And I found an "ugly" 11.63-11.50v (gpuz) in idle and low load and a "worse", down to 11.25v (gpuz) or 11.31v (hwinfo64) 12v real voltage after an Unigine Heaven 4.0 bench at 1180/1500 @ +156mv, +50PL
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.xtremeshack.com/photos/20140410139712056164242.jpg
> 
> So I started to first of all control and change the position of the two pcie cables connectors plugged on my psu
> But I got the very same results:
> 
> http://www.xtremeshack.com/photos/20140410139712058149796.jpg
> http://www.xtremeshack.com/photos/20140410139712060841855.jpg
> 
> In the end I even "tried" to power the 8pin with the 2x4pin molex to 8pin adapter, using two independent molex lines from my psu, the result was worse however, only 11.28v min on the 12v rail and lots of cable mess (at least I tried)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.xtremeshack.com/photos/20140410139712062110427.jpg
> 
> My psu is about 4 years old, kept with the fan filter, I've even replaced the fan with a better one (a couple of years ago already, becoming dead silent)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But should I get worried somehow, does this affect any of my 290's performance in mid oc or its stability?
> Or I only have to start considering a nice psu replacement?


Quote:


> Originally Posted by *Forceman*
> 
> Do you have a multimeter? I'd check the voltage on the 12V rail with that, if you have one, before I got a new PSU. I don't think the GPU-Z readings are all that accurate.
> 
> Edit: Just checked mine. GPU-Z shows 11.88 at idle and 11.75 under load, while the multimeter showed 11.94 at idle and 11.90 at load. So the voltage drop GPU-Z is reading is either nonexistent, or it is occurring on the card, not on the input. That was checking at the PCIe power plug (I checked the 6 pin and the 8 pin and they were both the same).


Just wondering about my PSU's 12v rail, I found that both GPU-Z and HWiNFO64 (lower section, [GPU #0]) are in fact showing info from the card's internal IR 3567 PWM controller (VRM) only, and not the real +12v rail.
I also found two (or three) other HWiNFO64 sections with +12v rail sensors: one comes from the Nuvoton NCT6776F (motherboard) and another from the mobo's VRM controller, the CHiL CHL8328.

And look what those sensors are showing:

http://s30.postimg.org/eeijpj1sh/Cattura.jpg

Absolutely normal values, so I ran a quick (10-minute) LinX 0.6.5 CPU stress test and noticed the +12v rail voltages only went down a little, still remaining in range.









http://s30.postimg.org/qk7qzxgi7/Clipboard01.jpg

Now, my 3770K @4.6GHz is not such a high-power (TDP) CPU, so I found a pic of an older (yet still recent) test with a 2600K @4.7GHz (~127W TDP), and there too the motherboard's +12v rail is absolutely fine:

http://s4.postimg.org/c6keksawb/4_7ghz_offset_Xmp_linx.jpg

Furthermore, I found one more piece of info (pic) with my Sapphire 7970 @1175/1550 right after a Valley bench (I took that pic to show the lower GPU temps after I changed the TIM to CLU), and here the 7970's internal CHiL controller shows a minimum of 11.813v on the GPU's dedicated +12v:

http://www.xtremeshack.com/photos/20140131139118931191897.jpg

I agree, the OC'd 7970 is showing ~200W (max) while my R9 290 Tri-X @1180/1500 pulls up to 335W, but maybe my PSU is still coping.
And I don't really know whether to worry or not, since I still haven't had any black screens.
I'll test with the multimeter soon, ideally with my 290 overclocked, check the real instantaneous +12v at both my motherboard reading points and my PCIe plugs, and make my final decision.
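For reference, the ATX spec allows +/-5% on the +12V rail, so 11.40 V is the formal floor. A minimal sketch of that check, using the software-sensor readings quoted above (with the usual caveat that software sensors drift, so the multimeter is the ground truth):

```python
# Quick sketch: check +12V readings against the ATX spec window.
# The ATX specification allows +/-5% on the +12V rail (11.40 V .. 12.60 V).
# Sample values are the software-sensor readings quoted in the post.

ATX_12V_NOMINAL = 12.0
ATX_TOLERANCE = 0.05  # +/-5% per the ATX spec

def rail_ok(volts: float) -> bool:
    """True if a +12V reading falls inside the ATX +/-5% window."""
    lo = ATX_12V_NOMINAL * (1 - ATX_TOLERANCE)  # 11.40 V
    hi = ATX_12V_NOMINAL * (1 + ATX_TOLERANCE)  # 12.60 V
    return lo <= volts <= hi

readings = {
    "GPU-Z idle": 11.63,
    "GPU-Z load": 11.25,
    "HWiNFO64 load": 11.31,
    "7970 CHiL minimum": 11.813,
}

for label, v in readings.items():
    verdict = "OK" if rail_ok(v) else "below the 11.40 V ATX floor"
    print(f"{label}: {v:.2f} V -> {verdict}")
```

By that yardstick the 11.25-11.31 V load readings would be out of spec if they were real, which is exactly why checking at the plug with a multimeter matters.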


----------



## Arizonian

Quote:


> Originally Posted by *Matt-Matt*
> 
> Would love to join the club! 1200 is all for now though!
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> What would you guys say is a "Golden" 290? I'm doing 1200MHz at +56mv in Trixx, the RAM is terrible however.
> 
> Linky to validation


Nice overclock on Core already









Congrats - welcome aboard.


----------



## Mega Man

Quote:


> Originally Posted by *Matt-Matt*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> MSI puts a sticker on it now saying void if removed on a screw for the stock cooler.
> 
> 
> 
> Yeah, all the companies do now as far as I know.
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> they have for quite a while and people still dont have issues you can take out the screw with pliers and elec tape without marking it if it worries you so much, i just use alcohol to get all res off and claim it never had it
> 
> 
> This is a good idea! Might do that to my XFX card if I ever need to.. Heh

Source ( click radeon )
Quote:


> ** XFX has carefully selected the optimal thermal or fansink component for your graphics card model. We do not encourage the removal of components due to damage that may result in the process. XFX understands that some enthusiasts may choose to replace the original component with their own cooling solution. To support the gaming community, we recommend that you contact XFX prior to any modifications so that we can update your profile and product registration to avoid potential issues with warranty support. In addition, XFX support will be able to walk through the installation with you or provide feedback and pointers on available options for your specific product. You may even consider shipping your components to XFX and allow the technicians at XFX to perform the modification for you (shipping charges to XFX apply).


Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> they have for quite a while and people still dont have issues you can take out the screw with pliers and elec tape without marking it if it worries you so much, i just use alcohol to get all res off and claim it never had it
> 
> 
> 
> Yeah, seems MSI has publicly posted too correct? I remember seeing posts of reps from peoples support tickets mentioning if the card is not damaged by removal of the HSF that they honor warranty.
> 
> I guess that sticker is to scare some people away? I know for other companies it's an instant denial of warranty :/ which really sucks
> 
> Then again I read a bunch of stories of people removing HSF on many different manufacturers and never having issue with warranty, who knows

yea but i am too lazy to find it


----------



## HOMECINEMA-PC

Oooh Happy days will be here this easter hols








Just scored an unlocked 290 / 290x with an aftermarket air cooler for $420AU. Gonna bench tri-fire and single 290x


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Oooh Happy days will be here this easter hols
> 
> 
> 
> 
> 
> 
> 
> 
> Just scored a unlocked 290 / 290x with aftermarket air cooler for $420AU . Gonna bench tri - fire and single 290x
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


I still have an AX III sitting here doing nothing after my 290x got RMA'd


----------



## Imprezzion

Just in case someone missed it: there are leaked 14.4 drivers out (295X2 press-release drivers), and they have *fixed power limits*. So everyone can clock and use Mantle now!

The thread on them here on OCN seems very positive; everyone is reporting performance increases and working power limits. They also appear very stable and good for CF.


----------



## Widde

For all you watercoolers, EK seems to have released a 290x lightning









http://www.ekwb.com/news/477/51/EK-releases-MSI-R9-290X-Lightning-Full-Cover-water-block/


----------



## Sgt Bilko

Quote:


> Originally Posted by *Imprezzion*
> 
> Just in case someone missed it, there are leaked 14.4 drivers out (295X2 press release drivers) and these have *fixed power limits*. So, everyone can clock and use Mantle now!
> 
> The thread on them here on OCN seems very positive and everyone is reporting performance increases and working powerlimits. Also they appear very stable and good for CF.


Link please?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Link please?


Yep I want that


----------



## boldenc

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Link please?




Build Info:
Catalyst: 14.4 / 295X Release Driver
OpenGL 4.3.12872
OpenCL 1.2 AMD-APP (1445.5)
HSA 0.8.0.0.241
Mantle 9.1.10.0009

Download:
Uploaded
Mega

credit to Cyris @ guru3d


----------



## Imprezzion

Yeah sorry, I was on my phone. Am on my PC now but someone beat me to the link








Thanks.

I'm installing them myself now to see how good they really are









EDIT: Your MEGA link doesn't work. Asks for decryption key.

This works: https://mega.co.nz/#!Yx82QCKL!_pGacc52C6dK2IToGXGj1ggAsoRsprhCOowsAfZGcxQ

EDIT: Holy hell!
Mantle with working power limits and a +100mV OC puts one heck of a load on the card lol. Easily 5-6°C hotter on the core and 10°C hotter on the VRMs than the same settings under DX11 in BF4.

And VRAM usage is at 3200MB with 125% res scale and Ultra settings (2x Adaptive AA forced in CCC)


----------



## NBAasDOGG

Hello people,

I have a question about MSI Afterburner Beta 19. When I unlock the voltage, Afterburner only allows me to add +100mv on the card, which gives me 1.256v, but I need 1.3v for my overclock. Does anyone know how to extend the voltage slider in Beta 19? BTW, this only happens on my R9 280X; it works perfectly with the 290X.

Please help!


----------



## Mega Man

i would be very wary about leaked drivers; last time it made you mine for other people


----------



## rdr09

668 place lol


----------



## Matt-Matt

So I'm pretty sure I can't get voltage control working in Afterburner 2.3.1 with the 13.12 AMD drivers.

Trixx works, so I'm using that for now, but I'd rather use Afterburner to be honest. Anyone got any ideas? I've reinstalled Afterburner twice now..


----------



## rdr09

Quote:


> Originally Posted by *Matt-Matt*
> 
> So i'm pretty sure I can't get voltage control in Afterburner working with 2.3.1 and 13.12 AMD drivers.
> 
> Trixx works so I'm using that for now, but I'd rather afterburner to be honest. Anyone got any ideas? I've reinstalled afterburner twice now..


that AB version is old. Try the newest, Beta 19 I think. Make sure you turn off voltage monitoring; it gave me problems in the past, so I uninstalled AB for good.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> 668 place lol


Not bad









Here's mine at 1250/1500:



Just shy of 20k.....


----------



## Norse

Anyone know how well the ASUS R9 290's overclock? http://www.asus.com/Graphics_Cards/R92904GD5/ one not the OC one


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Not bad
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here's mine at 1250/1500:
> 
> 
> 
> Just shy of 20k.....


testing to see how 13.11 compares to 14.4 later. I did run 1250 and I guess cpu matters in this bench . . .



I think it is a good instability test app.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Norse*
> 
> Anyone know how well the ASUS R9 290's overclock? http://www.asus.com/Graphics_Cards/R92904GD5/ one not the OC one


Reference cards will probably get anywhere between 1200-1250 max on the core, and the memory usually depends on the type: Elpida I've seen go up to 1625 and Hynix just above 1700MHz max.

Luck of the draw though; I've seen worse and I've seen better


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> testing to see how 13.11 compares to 14.4 later. I did run 1250 and I guess cpu matters in this bench . . .
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I think it is a good instability test app.


at 720p it does yeah, at 1440p......not so much









http://hwbot.org/submission/2518655_sgt_bilko_catzilla___1440p_2x_radeon_r9_290_14220_marks

It's a quirky bench, i don't mind the music and the scenes are just so silly it's funny, reminds me of the Truck in 3DMark01 actually


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> at 720p it does yeah, at 1440p......not so much
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://hwbot.org/submission/2518655_sgt_bilko_catzilla___1440p_2x_radeon_r9_290_14220_marks
> 
> It's a quirky bench, i don't mind the music and the scenes are just so silly it's funny, reminds me of the Truck in 3DMark01 actually


I see. tried 1300 but my score went down. maybe more voltage higher than +180 offset.


----------



## Norse

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Reference cards will probably get anywhere between 1200-1250 core Max and the memory usually depends on type, Elpida i've seen go up to 1625 and Hynix just above 1700Mhz Max.
> 
> Luck of the draw though, i've seen worse and i've seen better


Ah awesome, I was only intending to do about 10% or so for an extra increase; I'll have to see how the temps go


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> I see. tried 1300 but my score went down. maybe more voltage higher than +180 offset.


i only need +140mV for 1250/1500; haven't really tried higher than that due to temps. The VRMs get quite toasty in this bench; Firestrike etc. have more of a break between scenes, so a little more time to cool.

Quote:


> Originally Posted by *Norse*
> 
> Ah awesome i was only intending to do about 10% or so for an extra increase, i will have to see due to the temps


A 10% increase would almost be guaranteed; you should be able to run 1100/1300 or so with a minimal amount of extra voltage.

The ref cooler is actually quite good, it's just loud over 70% fan speed. You might be better off looking at a custom card if you prefer silence?


----------



## Norse

Quote:


> Originally Posted by *Sgt Bilko*
> 
> i only need +140mV for 1250/1500, haven't really tried higher than that due to temps. vrm's get quite toast in this bench, Firestrike etc...more break in between scenes so little more time to cool.
> 10% increase would almost be guaranteed, should be able to run 1100/1300 or so with a minimal amount of extra voltage.
> 
> the Ref cooler is actually quite good, it's just loud over 70% fan speed, might be better off looking at a custom card if you prefer silence?


Got a sound-dampened case (sig rig, Fractal Design XL R2), plus whilst gaming it will be on speakers, so I won't notice background noise from by my feet under the desk


----------



## bond32

Quote:


> Originally Posted by *Sgt Bilko*
> 
> i only need +140mV for 1250/1500, haven't really tried higher than that due to temps. vrm's get quite toast in this bench, Firestrike etc...more break in between scenes so little more time to cool.
> 10% increase would almost be guaranteed, should be able to run 1100/1300 or so with a minimal amount of extra voltage.
> 
> the Ref cooler is actually quite good, it's just loud over 70% fan speed, might be better off looking at a custom card if you prefer silence?


I love how mine takes +200 for 1210/1320 and mine is full block....


----------



## Sgt Bilko

Quote:


> Originally Posted by *bond32*
> 
> I love how mine takes +200 for 1210/1320 and mine is full block....


But you have a 290x right?

Mine couldn't get over 1180/1480 no matter what voltage i threw at it before i had to send it off for an RMA.
Decided i was better off with a couple of 290's after that and they turned out to be pretty good clockers as well








Quote:


> Originally Posted by *Norse*
> 
> Got a sound dampened case (Sig rig, fractal design XL R2) plus whilst gaming it will be on speakers so wont notice background noise by by feet under the desk


Mine sits about a metre away from my right ear, good thing i use headphones









You should be pretty right then, hope you get a good card


----------



## Talon720

So I have 2 crossfired 290x's (Hynix + Elpida, 1200/1550 max), watercooled, with an HX1050. I've been wondering lately if my PSU is enough for a heavily overclocked system... HWiNFO shows 350-370 watts per card at 1.35-1.37v, and if I try to go higher my system shuts off. I should be logging HWiNFO, I suppose; I only assume peak wattage hits close to 400 watts per card. I've been trying to push over 1200, and with the way the AB voltmod command works in steps I can't really fine-tune the voltage. Another reason I ask about the PSU is that I've dabbled with the thought of a 3rd card... but my board would run 8x/4x/4x, and I've read people have had issues with that. Thought I might prove it to myself though; I want to move up to the ASUS ROG 1600p 120Hz or whatever might come out after.
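A rough back-of-the-envelope on the headroom question. The card wattage comes from the HWiNFO figures above; the CPU and rest-of-system numbers are assumptions for illustration, not measurements, and the 90% derating is just a common rule of thumb:

```python
# Hedged sketch: sustained-load budget for an HX1050 (1050 W) feeding
# two heavily overvolted 290X cards. Card wattage is from the post's
# HWiNFO figures; CPU and "rest" numbers are assumed for illustration.

PSU_WATTS = 1050
PSU_DERATE = 0.90  # rule of thumb: keep sustained draw under ~90% of rating

def headroom(card_watts: float, n_cards: int,
             cpu_watts: float, rest_watts: float) -> float:
    """Watts left under the derated budget; negative means over budget."""
    total = card_watts * n_cards + cpu_watts + rest_watts
    return PSU_WATTS * PSU_DERATE - total

# Two cards at the suspected ~400 W peak, an overclocked CPU at ~200 W,
# and ~75 W for board/drives/pump/fans (assumed figures):
print(headroom(400, 2, 200, 75))  # two cards, peak
print(headroom(370, 2, 200, 75))  # two cards, at the logged 350-370 W
print(headroom(400, 3, 200, 75))  # hypothetical third card
```

Under those assumptions, even two cards at peak overshoot the derated budget, which lines up with the shutdowns when pushing voltage higher; a third card would make it far worse.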


----------



## Norse

Quote:


> Originally Posted by *Sgt Bilko*
> 
> But you have a 290x right?
> 
> Mine couldn't get over 1180/1480 no matter what voltage i threw at it before i had to send it off for an RMA.
> Decided i was better off with a couple of 290's after that and they turned out to be pretty good clockers as well
> 
> 
> 
> 
> 
> 
> 
> 
> Mine sits about a metre away from my right ear, good thing i use headphones
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You should be pretty right then, hope you get a good card


Gonna be luck of the draw; I offered an eBay seller £400 for two of the cards he has, trying to get myself a steal







especially seeing how I can probably sell my 680 for around £200


----------



## VSG

You guys should really not use Catzilla as an indicative benchmark, it is extremely random and unreliable in my opinion. My single card run was higher than 2-way run with the 780 Ti cards.


----------



## bond32

Quote:


> Originally Posted by *Sgt Bilko*
> 
> But you have a 290x right?


Yeah sapphire 290x. I need to send it in just don't want to be without a card. It black-screens often too, anything over 1340 on memory will black screen.


----------



## Sgt Bilko

Quote:


> Originally Posted by *geggeg*
> 
> You guys should really not use Catzilla as an indicative benchmark, it is extremely random and unreliable in my opinion. My single card run was higher than 2-way run with the 780 Ti cards.


I know it's CF vs SLI here but i had to run it then exit back to the menu and start again to get it to use both cards.

Quote:


> Originally Posted by *bond32*
> 
> Yeah sapphire 290x. I need to send it in just don't want to be without a card. It black-screens often too, anything over 1340 on memory will black screen.


I always have a back-up or two laying around if something goes pear-shaped.

Mine started giving me artifacts at stock settings; not sure of the exact cause, but it happened to a couple of others as well. Day 1 buy........not gonna do that again


----------



## VSG

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I know it's CF vs SLI here but i had to run it then exit back to the menu and start again to get it to use both cards.


I have tried all tricks given out by the Catzilla guys but nothing. SLI works at times but doesn't more often than not. When I had dual 290x cards, CFX was completely broken so I don't know if it's gotten better now.


----------



## Sgt Bilko

Quote:


> Originally Posted by *geggeg*
> 
> I have tried all tricks given out by the Catzilla guys but nothing. SLI works at times but doesn't more often than not. When I had dual 290x cards, CFX was completely broken so I don't know if it's gotten better now.


i had to rename the .exe file and change the CF profile to AFR-friendly, in addition to stop/starting it.

Not ideal for multi-card setups, but for single cards it's fine; my scores have been pretty consistent.


----------



## devilhead

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Not bad
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here's mine at 1250/1500:
> 
> 
> 
> Just shy of 20k.....


Tried that Catzilla too







with my daily CPU overclock, the GPU at 1250 on the core, and 1000 background programs


----------



## Arizonian

Quote:


> Originally Posted by *ghostly44*
> 
> Still running the i2500k but got a Power-color PCS+ R9 290 Stock cooling
> 
> 
> Spoiler: Warning: Spoiler!


Sorry I missed this entry; it didn't have the proper proof. Please go back and add a GPU-Z validation link with your OCN name as proof. In the meantime, you've been entered where you would have slotted in, at #309.









http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/18670#post_21918072

Congrats - added









Quick note for anyone else who may have not been added.

Please remember I'm seeking proof in a certain format to be added to roster. If you've been missed please provide the following per OP request.
Quote:


> To be added on the member list please submit the following in your post
> 
> 1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
> 2. Manufacturer & Brand - Please don't make me guess.
> 3. Cooling - Stock, Aftermarket or 3rd Party Water


----------



## bond32

This is what happens to my lovely 290x when I overclock: 

This was at 1175/1315 +100mv. It doesn't matter which BIOS/driver combination.


----------



## Sgt Bilko

Quote:


> Originally Posted by *devilhead*
> 
> tryed and i those catzilla
> 
> 
> 
> 
> 
> 
> 
> with daily cpu overclock and gpu 1250 on core + 1000 background programs


I'm guessing 290x with.....4930K?


----------



## Jflisk

Quote:


> Originally Posted by *bond32*
> 
> This is what happens to my lovely 290x when I overclock:
> 
> This was at 1175/1315 +100mv. Does not matter the bios/drivers/combination.


Anything over 1125 and you will get the results above. I was reading about overclocking on these things; 1125/1450 is as high as they could get. So I really don't understand where 1250/1500 is coming from. From what I have seen, 1200/1500 or higher only works when mining, and even then they say it won't happen for gaming or daily use. Under water I set 1060/1350 with no problems.


----------



## Jflisk

Quote:


> Originally Posted by *Jflisk*
> 
> Anything over 1125 and you will get the results above. I was reading the over clocking on these things. 1125/ 1450 is as high as they could get. So I really dont understand where 1250/1500 is coming from. From what i have seen 1200/1500 or higher only works when mining and even then they say not for gaming or daily use will not happen.Start at 1060/1350 work your way up. Under water i set 1060/1350 with no problems.


Double post


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jflisk*
> 
> Anything over 1125 and you will get the results above. I was reading the over clocking on these things. 1125/ 1450 is as high as they could get. So I really dont understand where 1250/1500 is coming from. From what i have seen 1200/1500 or higher only works when mining and even then they say not for gaming or daily use will not happen. Under water i set 1060/1350 with no problems.


You can clock quite high on these cards for benching, and 1200/1500 for mining would have to be under water.

I can game with mine at 1150/1500 perfectly fine; if I kept them cool enough I could game at 1200/1500.

Not sure where you got the 1125 max from though.


----------



## Jflisk

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You can clock quite high on these cards for benching, and 1200/1500 for mining would have to be underwater.
> 
> I can game with mine at 1150/1500 perfectly fine, if i kept them cool enough i could game at 1200/1500.
> 
> not sure where you got 1125 max from though.


Looking for the site now. Right about here

http://www.hardocp.com/article/2014/02/24/xfx_r9_290x_double_dissipation_overclocking_review/3


----------



## bond32

Quote:


> Originally Posted by *Jflisk*
> 
> Anything over 1125 and you will get the results above. I was reading the over clocking on these things. 1125/ 1450 is as high as they could get. So I really dont understand where 1250/1500 is coming from. From what i have seen 1200/1500 or higher only works when mining and even then they say not for gaming or daily use will not happen. Under water i set 1060/1350 with no problems.


What resolution and refresh rate are you at? I feel like that may have something to do with it, as I have to apply the patch to be able to run over 60Hz on my 1440p monitor. But I also think I tried both with and without and it still fails.


----------



## Jflisk

Quote:


> Originally Posted by *bond32*
> 
> What resolution are you at and refresh rate? I feel like that may have something to do with it as I have to apply the patch to be able to run over 60 hz on my 1440p monitor. But I also think I tried both with and without and still fails.


1920x1080 @ 60Hz

Catalyst 14.3 v1 B drivers. I can try overclocking the cards to 1150/1500 on Sunday; I will have my second water block installed by then. But where I sit now, I am at 1060/1350 with no problems.

Not putting you down, Bilko, or saying it can't be done; it's just that each card is its own case.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jflisk*
> 
> Looking for the site now. Right about here
> 
> http://www.hardocp.com/article/2014/02/24/xfx_r9_290x_double_dissipation_overclocking_review/3


Yeah, i take review-site overclocking as a baseline tbh.

There are a few variables to consider as well: case airflow, type of cooling, space between cards (for CF), ambient temps, quality of the card's cooling, etc.


----------



## rdr09

Quote:


> Originally Posted by *devilhead*
> 
> tryed and i those catzilla
> 
> 
> 
> 
> 
> 
> 
> with daily cpu overclock and gpu 1250 on core + 1000 background programs


a 290 will need around 1320 to match that.


----------



## friend'scatdied

Quote:


> Originally Posted by *rdr09*
> 
> a 290 will need around 1320 to match that.


Is it 6%? I thought the per-clock difference was 3%.


----------



## rdr09

Quote:


> Originally Posted by *friend'scatdied*
> 
> Is it 6%? I thought the per-clock difference was 3%.


just an estimate. 3% is kinda low. prolly around 60MHz oc on the Pro will match the X.
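The figures being traded here reduce to simple arithmetic. Assuming performance scales linearly with core clock and the 290 trails the 290X by a fixed per-clock fraction (the 3% and 6% figures above, or the raw shader-count gap of 2560 vs 2816 units), the matching clock works out as:

```python
# Sketch of the estimate being debated: the core clock a 290 needs to match
# a 290X at a given clock, for an assumed fractional per-clock deficit.
# Linear clock scaling is itself an assumption; real benches scale sublinearly.

def clock_to_match(x_clock: float, deficit: float) -> float:
    """290 clock needed to equal a 290X at x_clock, given the deficit."""
    return x_clock / (1.0 - deficit)

x = 1250.0  # the 290X clock from the run above
for label, d in [
    ("3% per clock", 0.03),
    ("6% per clock", 0.06),
    ("shader count (2560 vs 2816)", 1 - 2560 / 2816),
]:
    print(f"{label}: ~{clock_to_match(x, d):.0f} MHz")
```

The 6% assumption lands close to the ~1320 MHz estimate above, while scaling by raw shader count alone would push it to 1375 MHz, which is why the 3% figure looks low here.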


----------



## mojobear

Hey guys, got a quick question....so I somehow acquired a PowerColor 290 as a gift...which brings me to 4 x 290s....

What are your thoughts on sticking 4 290s together for 5760 Eyefinity? The downsides are the extra cost of just one more 290....waterblock, new 1500W PSU, potentially an extra external radiator...I've been trying to find an excuse to get a MO-RA3....I have a 420 and a 360 for now.

The other option, which is what I have for now, is putting the 290 in a mini-ITX gaming PC with a 4670K...it's a nice little rig but is rarely ever used...actually only used when my main rig is mining.

From what I've been reading, 3 x 290s is the sweet spot in terms of driver issues, esp for quad + Eyefinity...

Thoughts anyone? Thanks!


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> just an estimate. 3% is kinda low. prolly around 60MHz oc on the Pro will match the X.


Here's some estimation to go on





I really can't give you a better comparison than that


----------



## devilhead

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm guessing 290x with.....4930K?


no, it is 3930k







and 290X 1250/1700


----------



## maarten12100

Hello fellow R9 290X/290 owners. As some of you may know, I returned my pair of R9 290's not too long after I got them. Well, now I've gotten my hands on an R9 290 with no cooler on the cheap. All I need is a non-reference cooler for it; just air cooling, nothing too fancy. I tried to slap on my Arctic cooler that was on my GTX 275, but it wouldn't fit. Suggestions are very welcome.

I tried the search option but all I got were random posts.


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Here's some estimation to go on
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I really can't give you a better comparison than that


maybe at higher clocks the 290X pulls away . . .

http://www.3dmark.com/3dm/2004180

http://www.3dmark.com/fs/1050519

Quote:


> Originally Posted by *maarten12100*
> 
> Hello fellow R9 290x/290 owners as some of you may now I returned my pair or R9 290's not too long after I got them. Well now I have gotten my hands on a R9 290 with no cooler on the cheap. All I need is a non reference cooler for it I just need air cooling nothing too fancy. I've tried to slap on my Artic cooler which was on my gtx275 but it wouldn't fit. Suggestions are very welcome.
> 
> I tried to use the search option but all I got were random posts.


i recommend an aio plus some copper heatsinks/thermal pads. you may have to use zip ties to keep the sinks in place . . .

http://www.overclock.net/t/1479969/cm-seidon-120xl-r9-290

also a dedicated fan or 2. one at the rear of the card and another at the side door panel.

OR this . . .

http://www.overclock.net/t/1459728/guide-and-review-accelero-xtreme-iii-on-r9-290


----------



## maarten12100

Quote:


> Originally Posted by *rdr09*
> 
> i recommend an aio plus some copper heatsinks/thermal pads. you may have to use zip ties to keep the sinks in place . . .
> 
> http://www.overclock.net/t/1479969/cm-seidon-120xl-r9-290
> 
> also a dedicated fan or 2. one at the rear of the card and another at the side door panel.
> 
> OR this . . .
> 
> http://www.overclock.net/t/1459728/guide-and-review-accelero-xtreme-iii-on-r9-290


Thanks for the fast response, but since this is essentially a "broken" card I would like to run some tests first before investing big. The problem is I'm unable to mount any of the coolers I've tried (570 ref, 5870 ref, and the Nvidia Accelero 3-fan thingy).
For now I just need the cheapest cooler I can find; later I can go with something good. I'd prefer air, since I don't have my PC in a case atm and have nowhere to mount a rad except a cardboard box.


----------



## Paul17041993

Quote:


> Originally Posted by *Jflisk*
> 
> Anything over 1125 and you will get the results above. I was reading the over clocking on these things. 1125/ 1450 is as high as they could get. So I really dont understand where 1250/1500 is coming from. From what i have seen 1200/1500 or higher only works when mining and even then they say not for gaming or daily use will not happen. Under water i set 1060/1350 with no problems.


Punched my 290X to 1200/1750 before it overheated due to the ref cooler not being enough; it's currently pending an Accelero IV or a waterblock in the future.


----------



## jamaican voodoo

Quote:


> Originally Posted by *mojobear*
> 
> Hey guys, got a quick question....so I somehow acquired a powercolor 290 as a gift...which brings me to 4 x 290s....
> 
> What are your thoughts on sticking 4 290s together for 5760 eyefinity...the downsides are the extra cost of just one more 290....waterblock, new PSU for 1500W, potentially an extra external radiatior...I've been trying to find an excuse to get a MORA3....I have a 420 and a 360 for now.
> 
> Other option which is what I have for now is putting the 290 is a mini-itx gaming pc...with a 4670K...its a nice little rig but is rarely every used...actually only used when my main rig is mining.
> 
> From what Ive been reading 3 x 290s is a sweet spot in terms of driver issues, esp for quad + eyefinity...
> 
> Thoughts anyone? Thanks!


I can attest to this. I have 3 R9 290's in tri-fire clocked at 1100MHz/1330 mem and can tell you I have no issues at all; all my games play smoothly. Playing BF4 with Mantle enabled is pure joy, I tell ya. That's using the 14.3 beta drivers. All my cards are under water as well, so they don't throttle under load.


----------



## Tobiman

Quote:


> Originally Posted by *geggeg*
> 
> I have tried all tricks given out by the Catzilla guys but nothing. SLI works at times but doesn't more often than not. When I had dual 290x cards, CFX was completely broken so I don't know if it's gotten better now.


Thank god I'm not the only one who noticed this. My R9 290 clocked at 1250/1550 was smacked by a mere 760. I was like woot?


----------



## Norse

Wooooooooo! Won a Sapphire 290X for £250 on eBay; missed the other one by £5. Damn you, sniper!

Now on the hunt for a second


----------



## Raephen

Quote:


> Originally Posted by *maarten12100*
> 
> Thanks for the fast response but since this is essentially a "broken"card I would like to first run some tests before investing big. The problem is I'm unable to mount any of my coolers tried (570ref, 5870ref and the Nvidia Accelero 3 fan thingy)
> For now I just need the cheapest cooler I can find then later I can go with something good I would prefer air since I don't have my pc in a case atm and have nowhere to mount the rad except a cardboard box


Maarten, this Gelid Icy Vision Rev 2 VGA cooler might be what you are looking for atm. It fits and has worked out great for a few forum members.

Another option is the Arctic Accelero S1 Rev 2 with two good 120mm fans strapped on it.

Good luck!

Edit: The Gelid Icy Vision Rev 2 was hard to find on nl.hardware.info, but since that cooler works, I'd imagine the Arctic Accelero Twin Turbo II would do so too. It's a pain to mount, but well worth it.


----------



## Roboyto

Quote:


> Originally Posted by *maarten12100*
> 
> Thanks for the fast response but since this is essentially a "broken"card I would like to first run some tests before investing big. The problem is I'm unable to mount any of my coolers tried (570ref, 5870ref and the Nvidia Accelero 3 fan thingy)
> For now I just need the cheapest cooler I can find then later I can go with something good I would prefer air since I don't have my pc in a case atm and have nowhere to mount the rad except a cardboard box


You wouldn't happen to have an AIO water cooler lying around would you? With a little patience and some zip ties you can probably strap one of those on for testing purposes.

http://www.overclock.net/t/1203528/official-nvidia-gpu-mod-club-aka-the-mod

There are fancier ways to get this done now, but if you have an AIO and some zip ties it could work temporarily.


----------



## Dasboogieman

Quote:


> Originally Posted by *bond32*
> 
> This is what happens to my lovely 290x when I overclock:
> 
> This was at 1175/1315 +100mv. Does not matter the bios/drivers/combination.


Out of curiosity, what's your ASIC quality (via GPU-Z)? I've heard there's a huge variance in chip quality on the 290s, not so much the 290X. That particular screen looks like it might be caused by your VRAM; are you on Hynix or Elpida?

For reference:
Mine is 78.9% ASIC quality.
I can get 1150 fully stable on +100mV, 1200 on +187mV (not just benchmark stable; verified with 200 Crysis runs).
My memory is Hynix, so I just run it at its rated speed of 1500MHz. I've heard the worst of the Elpida chips can't go beyond 1300MHz.

That being said, I only use 1100MHz +38mV with 1500MHz VRAM for daily use, since it is the best balance of heat, noise and performance.


----------



## bond32

Quote:


> Originally Posted by *Dasboogieman*
> 
> Out of curiosity, whats your ASIC quality (via GPUz)? I've heard theres a huge variance in chip quality on the 290s, not so much the 290x. That particular screen looks like it might be caused by your VRAM, are you on Hynix or Elpida?
> 
> As for reference.
> Mine is 78.9% asic quality
> I can get 1150 fully stable on +100mv, 1200 on +187mv. (not just benchmark stable, verified with 200 Crysis run hrs).
> My memory is Hynix so I just run at its rated speed of 1500mhz. I've heard the worst of the Elpida chips cant go beyond 1300mhz.
> 
> That being said, I only use 1100mhz +38mv 1500mhz VRAM for daily use since it is the best balance of heat, noise and performance.


ASIC is 72.2%, Elpida memory.


----------



## DeadlyDNA

Quote:


> Originally Posted by *mojobear*
> 
> Hey guys, got a quick question....so I somehow acquired a PowerColor 290 as a gift...which brings me to 4 x 290s....
> 
> What are your thoughts on sticking 4 290s together for 5760 Eyefinity...the downsides are the extra cost of just one more 290....waterblock, new PSU for 1500W, potentially an extra external radiator...I've been trying to find an excuse to get a MORA3....I have a 420 and a 360 for now.
> 
> The other option, which is what I have for now, is putting the 290 in a mini-ITX gaming PC...with a 4670K...it's a nice little rig but is rarely ever used...actually only used when my main rig is mining.
> 
> From what I've been reading, 3 x 290s is the sweet spot in terms of driver issues, esp for quad + Eyefinity...
> 
> Thoughts anyone? Thanks!


Running quad here; I water-blocked all mine and have a 1600W PSU to go with them. I can't really overclock very much because I get worried about my PSU and breaker lol. I can only suggest going 4 GPUs if you're looking for absolute max fps. It's really hard to recommend it over 3, because 3 do well. I am currently doing scaling tests in 4K Eyefinity in another thread, 11520x2160 right now. If you would like, I can do some scaling tests at 5760 from 3 to 4 GPUs. I don't have a lot of the brand-new game releases though. And yes, some games really have problems with 4-way CF over 3-way CF; one that comes to mind is Skyrim.

If you're going to have to spend a chunk of money to get that 4th card in, your benefit per dollar will go down.
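For context on why a 4th GPU can make sense at extreme resolutions, here is a quick pixel-count comparison of the two Eyefinity layouts mentioned above. This is plain arithmetic on the resolutions from the post, nothing card-specific:

```python
# Rough per-frame workload comparison for the Eyefinity layouts above.

def pixels(width: int, height: int) -> int:
    """Total pixels the GPUs have to render per frame."""
    return width * height

eyefinity_1080p = pixels(5760, 1080)   # three 1920x1080 panels side by side
eyefinity_4k    = pixels(11520, 2160)  # the 4K Eyefinity setup from the post

print(eyefinity_1080p)                 # 6,220,800 pixels
print(eyefinity_4k)                    # 24,883,200 pixels
print(eyefinity_4k / eyefinity_1080p)  # exactly 4x the per-frame workload
```

With four times the pixels per frame, the jump from triple-1080p to triple-4K roughly matches the jump from 3 to 4 GPUs, ignoring CrossFire scaling losses.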


----------



## Dasboogieman

Quote:


> Originally Posted by *bond32*
> 
> ASIC is 72.2%, Elipida memory.


Yea, high-leakage chip. I'm pretty sure everyone who's been getting 1200+ MHz with minimal voltage is either 80+ ASIC quality or only partially stable. You might need max voltage for 1200MHz (shouldn't be a problem if you're watercooled).


----------



## bond32

Quote:


> Originally Posted by *Dasboogieman*
> 
> yea, bad luck I guess, you might need max voltage for 1200mhz (shouldn't be a problem if you watercooled).


Max voltage gets me about 1185. I played a few times with +200 but got the exact same result as what I pictured, and this was with no memory adjustment.


----------



## Jhors2

What are people doing to maximize power limit and voltage these days? BIOS flash, or the Afterburner command hacks?


----------



## Jack Mac

Quote:


> Originally Posted by *Jhors2*
> 
> What are people doing to maximize power limit and voltage these days? Bios flash or doing the afterburner command hacks?


Stock BIOS with AB or Trixx works well enough, or you can use the ASUS PT BIOS for benching.


----------



## Roy360

I'm not trying to promote NCIX or anything, but they are having a crazy sale on GPUs right now:

CLUB3D Radeon R9 290X royalKing 1030MHz 4GB 5GHz GDDR5 2xDVI HDMI DisplayPort PCI-E Video Card $599.99 - 30% = $420
ASUS GeForce GTX 760 X2 ROG Mars 1072MHz 4GB 6.0GHz GDDR5 3xDVI HDMI MiniDP PCI-E Video Card $599.99 - 30% = $420
ASUS Radeon R9 290X DirectCU II OC 1050MHz 4GB 5.4GHz GDDR5 2xDVI HDMI DisplayPort PCI-E Video Card $649.99 - 30% = $455
Zotac GeForce GTX 780 Ti 928MHz 3GB 6.0GHz GDDR5 2xDVI HDMI DisplayPort PCI-E Video Card $679.99 - 30% = $475 - $15 MIR = $460 (5 left)
*ASUS Radeon R9 290 DirectCU II OC 1000MHz 4GB 5.0GHz GDDR5 2xDVI HDMI DisplayPort PCI-E Video Card $479.99 - 30% = $336*

If it weren't for my measly credit card limits I could have bought all of these, but instead I had to make multiple orders (which might all get canceled)

Scored:
1x CLUB3D R9 290
1x ASUS R9 290X
1x ASUS R9 290

and unfortunately, I accidentally ordered a 2nd ASUS R9 290 at $370









Damn, I bought my ASUS R9 290 for $539 a month ago, and now I could have gotten two for the price of one.

I just hope I can either a) sell one card to make up for my loss, or b) MasterCard will price-protect it.
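Just to sanity-check the discount math in the list above (the post rounds to whole dollars; the exact figures come out a few cents lower), a tiny sketch:

```python
# Sanity check of the 30%-off prices quoted in the NCIX sale list.

def sale_price(msrp: float, discount: float) -> float:
    """Price after a fractional discount, rounded to the cent."""
    return round(msrp * (1.0 - discount), 2)

for msrp in (599.99, 649.99, 679.99, 479.99):
    print(msrp, "->", sale_price(msrp, 0.30))
# 599.99 -> 419.99 (quoted as $420)
# 649.99 -> 454.99 (quoted as $455)
# 679.99 -> 475.99 (quoted as $475, before the $15 MIR)
# 479.99 -> 335.99 (quoted as $336)
```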


----------



## DeadlyDNA

Quote:


> Originally Posted by *Roy360*
> 
> I"m not trying to promote NCIX or anything, but they are having a crazy sale on GPUs right now
> 
> CLUB3D Radeon R9 290X Royalking 1030MHZ 4GB 5GHZ GDDR5 2xDVI HDMI DisplayPort PCI-E Video Card $599.99 - 30% = $420
> ASUS GeForce GTX 760 X 2 ROG Mars 1072MHZ 4GB 6.0GHZ GDDR5 3XDVI HDMI Minidp PCI-E Video Card $599.99 - 30% = $420'
> ASUS Radeon R9 290X DirectCU II OC 1050MHZ 4GB 5.4GHZ GDDR5 2XDVI HDMI DisplayPort PCI-E Video Card $649.99 - 30% = $455
> Zotac GeForce GTX 780 Ti 928MHZ 3GB 6.0GHZ GDDR5 2xDVI HDMI DisplayPort PCI-E Video Card $679.99 - 30% = $475 - $15 MIR = $460 5 left
> *ASUS Radeon R9 290 DirectCU II OC 1000MHZ 4GB 5.0GHZ GDDR5 2XDVI HDMI DisplayPort PCE-E Video Card $479.99 - 30% = $336*
> 
> If it weren't for my measly credit card limits I could have bought all of these, but instead I had to make multiple orders (which might all get canceled)
> 
> Scored:
> 1x CLUB3D R9 290
> 1x ASUS R9 290X
> 1x ASUS R9 290
> 
> and unfortunately, I accidently order a 2nd ASUS R9 290 at 370$
> 
> 
> 
> 
> 
> 
> 
> 
> 
> damn, I bought my ASUS R9 290 for 539 a month ago, and now I could have gotten two for the price of one.


So prices have come back down now that the mining craze is over?


----------



## Roy360

Quote:


> Originally Posted by *DeadlyDNA*
> 
> So prices have come back down now mining craze is over?


Down? These are currently under MSRP.


----------



## Forceman

Quote:


> Originally Posted by *Roy360*
> 
> I"m not trying to promote NCIX or anything, but they are having a crazy sale on GPUs right now
> 
> CLUB3D Radeon R9 290X Royalking 1030MHZ 4GB 5GHZ GDDR5 2xDVI HDMI DisplayPort PCI-E Video Card $599.99 - 30% = $420
> ASUS GeForce GTX 760 X 2 ROG Mars 1072MHZ 4GB 6.0GHZ GDDR5 3XDVI HDMI Minidp PCI-E Video Card $599.99 - 30% = $420'
> ASUS Radeon R9 290X DirectCU II OC 1050MHZ 4GB 5.4GHZ GDDR5 2XDVI HDMI DisplayPort PCI-E Video Card $649.99 - 30% = $455
> Zotac GeForce GTX 780 Ti 928MHZ 3GB 6.0GHZ GDDR5 2xDVI HDMI DisplayPort PCI-E Video Card $679.99 - 30% = $475 - $15 MIR = $460 5 left
> *ASUS Radeon R9 290 DirectCU II OC 1000MHZ 4GB 5.0GHZ GDDR5 2XDVI HDMI DisplayPort PCE-E Video Card $479.99 - 30% = $336*
> 
> If it weren't for my measly credit card limits I could have bought all of these, but instead I had to make multiple orders (which might all get canceled)
> 
> Scored:
> 1x CLUB3D R9 290
> 1x ASUS R9 290X
> 1x ASUS R9 290
> 
> and unfortunately, I accidently order a 2nd ASUS R9 290 at 370$
> 
> 
> 
> 
> 
> 
> 
> 
> 
> damn, I bought my ASUS R9 290 for 539 a month ago, and now I could have gotten two for the price of one.
> 
> I just hope I can either a) sell one card to make up for my loss, or b) mastercard will price protect it.


Is there a code or something for the 30% discount?


----------



## Mega Man

Quote:


> Originally Posted by *Roy360*
> 
> I"m not trying to promote NCIX or anything, but they are having a crazy sale on GPUs right now
> 
> CLUB3D Radeon R9 290X Royalking 1030MHZ 4GB 5GHZ GDDR5 2xDVI HDMI DisplayPort PCI-E Video Card $599.99 - 30% = $420
> ASUS GeForce GTX 760 X 2 ROG Mars 1072MHZ 4GB 6.0GHZ GDDR5 3XDVI HDMI Minidp PCI-E Video Card $599.99 - 30% = $420'
> ASUS Radeon R9 290X DirectCU II OC 1050MHZ 4GB 5.4GHZ GDDR5 2XDVI HDMI DisplayPort PCI-E Video Card $649.99 - 30% = $455
> Zotac GeForce GTX 780 Ti 928MHZ 3GB 6.0GHZ GDDR5 2xDVI HDMI DisplayPort PCI-E Video Card $679.99 - 30% = $475 - $15 MIR = $460 5 left
> *ASUS Radeon R9 290 DirectCU II OC 1000MHZ 4GB 5.0GHZ GDDR5 2XDVI HDMI DisplayPort PCE-E Video Card $479.99 - 30% = $336*
> 
> If it weren't for my measly credit card limits I could have bought all of these, but instead I had to make multiple orders (which might all get canceled)
> 
> Scored:
> 1x CLUB3D R9 290
> 1x ASUS R9 290X
> 1x ASUS R9 290
> 
> and unfortunately, I accidently order a 2nd ASUS R9 290 at 370$
> 
> 
> 
> 
> 
> 
> 
> 
> 
> damn, I bought my ASUS R9 290 for 539 a month ago, and now I could have gotten two for the price of one.
> 
> I just hope I can either a) sell one card to make up for my loss, or b) mastercard will price protect it.


Quote:


> Originally Posted by *Roy360*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DeadlyDNA*
> 
> So prices have come back down now mining craze is over?
> 
> 
> 
> down? these are currently under msrp.

where do you see this 30% off ??


----------



## HOMECINEMA-PC

Single 290 @ 1440p

http://www.catzilla.com/showresult?lp=240971
Beat ya Bilko


----------



## Germanian

That's a nice 290, HOMECINEMA.
The voltage is high, but nonetheless 1220 on a 290 is pretty good.

I can only reach 1180 at +100mV on both my R9 290s before artifacting starts.


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Single 290 @ 1440p
> 
> http://www.catzilla.com/showresult?lp=240971
> Beat ya Bilko


Grats dude









I might see what happens tonight with some chilly air


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Germanian*
> 
> that's a nice 290 HOMECINEMA.
> Voltage is high, but nonetheless 1220 on a 290 is pretty good.
> 
> I can only reach 1180 at 100mv on both r9 290's until artifacting starts


This one did [email protected]
http://www.3dmark.com/fs/1940725

and this is my best single MK 11 Pscore [email protected]
http://www.3dmark.com/3dm11/7725420
http://hwbot.org/submission/2471716_homecinema_pc_3dmark11___performance_radeon_r9_290_19306_marks

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Grats dude
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I might see what happens tonight with some chilly air


Thanks mate. I'm having DP/HDMI loss of monitor signal from using this PT1T BIOS, and the 14.4 driver doesn't bench as well as 13.12.


----------



## cennis

The Radeon R9 290X royalKing is a non-reference design: http://www.techpowerup.com/gpudb/b2866/club-3d-r9-290x-royalking.html

The NCIX listing shows images of the reference cooler instead.

Does this make it, at $420, a better deal than the $335 ASUS DirectCU 290?

In terms of overclockability: 2816 cores and maybe a better cooler? (The DirectCU seems to be bad on 290s.)


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Thanks mate . Im having dp / hdmi loss of monitor sig from using this PT1T bios and the 14.4 driver dont bench as good as 13.12


I'm having some issues as well actually; 14.4 is just not that great for benching. Losing my second monitor (DVI) when benching as well sometimes......just weird.

Might roll back to 13.12 on Friday or something and see if I can take that Silver back.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm having some issues as well actually, 14.4 just not that great for benching, Losing my second monitor (DVI) when benching as well sometimes......just weird.
> 
> Might roll back to 13.12 on friday or something and see if i can take that Silver back


Mantle on 14.4 looks real good in BF4 @ 1150/1300, but alas AB ain't giving me an OSD, so I can't see frames and clocks..........


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Mantle 14.4 looks real good on BF4 @ 1150 / 1300 , but alas AB aint giving me OSD so I can see frames and clocks..........


BF4 has the fps counter built in, and I use a second monitor for temps, clocks and general other crap ;p


----------



## Matt-Matt

Quote:


> Originally Posted by *Sgt Bilko*
> 
> BF4 has the fps counter built in and i use a second monitor for temps, clocks and general other crap ;p


Same, but with BF3 (not a fan of BF4 myself).

Also, I've had a few black screens when using VLC media player. I'm not sure if it's my card or VLC causing it. The PC eventually hard-locks and has to be rebooted.


----------



## arrow0309

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HOMECINEMA-PC*
> 
> Thanks mate . Im having dp / hdmi loss of monitor sig from using this PT1T bios and the 14.4 driver dont bench as good as 13.12
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm having some issues as well actually, 14.4 just not that great for benching, Losing my second monitor (DVI) when benching as well sometimes......just weird.
> 
> Might roll back to 13.12 on friday or something and see if i can take that Silver back

Actually, the current 14.4 has given me a "slightly" better Heaven 4.0 score than 13.12 did: 1534 instead of 1529.


----------



## muhd86

Also, I have 3 30-inch Dell 3007WFP LCDs; can I use Eyefinity?

The displays only have DVI connectors.

Is there a way to use Eyefinity with the current setup without using any expensive adapters?


----------



## arrow0309

Back to my 850W PSU's "low" +12V rail as reported in software by GPU-Z and HWiNFO64 (the 290's VRM section): I decided to test this morning with an *analog* multimeter (the only one I have).
The multimeter reads "somewhat" high; tested on different DC adapters and power supplies, it shows something like 0.75-1V more than it should.

So I checked an unused 12V PCIe cable with the PC idle and it showed ~12.8V, same on a 4-pin Molex.
Then I loaded 1180/1500 @ +156mV and 50 PL, maxed all the case fans, and ran another Heaven 4.0 bench, recording this at the end (just like before):



However, during the benchmark my analog multimeter "went down", indicating only ~12.5V (from the idle 12.8V).
I know the meter always reads a bit high, but I don't like seeing such a big +12V drop under heavy load. What do ya guys think?



Should I still try with a good digital multimeter (at least once), play it safe, or seriously consider a PSU replacement?
The GPU-Z and HWiNFO +12V minimums at the default 290 clocks do remain a bit higher, however.
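A rough way to reason about those readings. This is only a sketch: it assumes the ATX spec's ±5% tolerance window on the +12V rail (11.4-12.6V) and uses the ~0.75-1V meter over-read estimated in the post:

```python
# Back-of-the-envelope check on the +12V readings discussed above.
# Assumes the ATX +/-5% tolerance on the 12V rail (11.4-12.6V) and the
# ~0.75-1V analog-meter over-read the poster estimated.

ATX_12V_MIN, ATX_12V_MAX = 11.4, 12.6   # +/-5% of 12.0V

def droop_pct(idle_v: float, load_v: float) -> float:
    """Percent sag from idle to load."""
    return 100.0 * (idle_v - load_v) / idle_v

meter_offset = 0.875                     # midpoint of the 0.75-1V over-read
idle_read, load_read = 12.8, 12.5        # analog meter readings from the post

print(round(droop_pct(idle_read, load_read), 2))  # ~2.34% sag as measured
true_load = load_read - meter_offset              # ~11.6V if the offset holds
print(ATX_12V_MIN <= true_load <= ATX_12V_MAX)    # in spec, but near the floor
```

If the meter really does read ~0.9V high, the rail would be sagging toward the bottom of the tolerance window under load, which is consistent with re-checking on a digital meter before blaming the PSU.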


----------



## rdr09

Quote:


> Originally Posted by *arrow0309*
> 
> Back on my 850W PSU "low" +12v rail software shown by the gpuz and hwinfo64 (290's vrm section) so I decided to test this morning with an *analog* multimeter (I only have this one)
> The multimeter is "somehow" indicating me a bit more than the real voltage, tested on different dc adaptors and ps and it shows me something like 0.75-1v more than what it should do.
> 
> So I checked on an empty 12v pcie cable on my idle pc and showed me ~12.8v, same on a 4pin molex.
> Therefore I loaded the 1180/1500 @+156 mv and 50 pl, maxed all the case fans and ran another Heaven 4.0 bench, recording this at the end (just like before happened):
> 
> 
> 
> However, during the benchmark, my analog multimeter "went down" only indicating me ~12.5v (from the idle 12.8v)
> I know the mm was always showing me a bit more than the real voltage but I'm not really seeing such a great +12v decrease under heavy load, what do ya guys think?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Do I still have to try with a good, digital mm (at least once), should I stay safe or really consider a PSU replacement?
> The both gpuz and hwinfo +12v min with the default 290 clocks remain however a bit higher


Do you really need a 156 offset? Can you lower it a bit? Try 100. Your 12V readings are a bit low, around 0.13-0.15V lower, but it does not seem to affect card performance, 'cause your H4 score is normal.


----------



## Sgt Bilko

Quote:


> Originally Posted by *arrow0309*
> 
> Actually the current 14.4 has given me a "slightly" better Heaven 4.0 score than the 13.12 had, 1534 instead of 1529


That's pretty much margin of error. For me, I can't hit the same clocks that I can under 13.12; I'm struggling to bench 1250/1500, when on 13.12 I could go as high as 1280 on the core.


----------



## maarten12100

A slight update on whether the broken R9 290 could be fixed or not: today I mounted a cooler and installed the GPU. The setup sparked; luckily for me, I always test in a garbage AM3 board I have. Either way the PCIe slot survived, and I localized the spark: it is next to one of the mounting holes.

To be more exact, the previous owner removed an amount of PCB substrate to mount a cooler. Normally this is not much of a problem (hell, I've used a Dremel on cards on various occasions myself), but in this case it is, because a voltage regulator has its traces very close to that spot. Since the mounting holes on graphics cards are connected to ground, that gives a short circuit.

The best solution I can think of right now is sealing off the exposed traces as well as I can with epoxy, but I'm open to suggestions from those familiar with extreme modding.


----------



## arrow0309

Quote:


> Originally Posted by *rdr09*
> 
> you really need 156 offset? can you lower a bit. try 100. your 12V rdgs are a bit lower. around .13 - .15 v lower but does not seem to affect card performance 'cause your H4 score is normal.


It gives me artifacts at anything lower than 156 (for those clocks); I tried again with 125, 137 and 145, and the last one would start artifacting in the middle of the benchmark.
Could it be the "low" +12V and therefore a serious vdroop?
The card has an 81.5% ASIC and Hynix memory.

Quote:


> Originally Posted by *Sgt Bilko*
> 
> thats pretty much margin of error, for me i can't hit the same clocks that i can under 13.12, struggling to bench 1250/1500 when on 13.12 i could go as high as 1280 on the core.


The 13.12 always gave me a similar score of 1529-1530 at that OC (1180/1500/156), plus some black-stripe artifacts; now it's a bit better, with absolutely no artifacts.








And btw, I wouldn't OC that far on reference cooling, not even with excellent case airflow.


----------



## rdr09

Quote:


> Originally Posted by *arrow0309*
> 
> It gives me the artifacts at any lower than 156 (for that clocks), tried again with 125, 137 and 145, the last one would start the artifacts in the middle of the benchmark
> Could be the "low" +12v and therefore a serious vdrop?
> The card is having an 81.5% asic and Hynix memory


Not sure if ASIC matters. 79% here, and it only needs less than a 50 offset for the same OC, prolly lower. I never really bother finding the "right" amount for runs like these; I know I need +200 for 1300 and 0 for 1175. I prolly added a 30 offset for the 1180 run.

If you don't feel safe with the PSU, then replace it.


----------



## BiG StroOnZ

Anyone want to tell me if there is anything wrong with the PowerColor PCS+ R9 290 and 290X? I'm finding crazy deals on them for great prices. Are they not good cards?


----------



## heroxoot

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> Anyone want to tell me if there is anything wrong with the PowerColor PCS+ R9 290 and 290X? I'm finding crazy deals on them for great prices. Are they not good cards?


Not a thing. Seems prices on AMD cards are taking a dive for whatever reason. Probably no one is buying them at the prices they were.


----------



## Sgt Bilko

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> Anyone want to tell me if there is anything wrong with the PowerColor PCS+ R9 290 and 290X? I'm finding crazy deals on them for great prices. Are they not good cards?


Nothing wrong with them at all; they are quite good cards.

The problem is the name: when people see PowerColor they think "it's a piece of crap."

Not everyone, mind you; most people in this thread will tell you they are quite good.


----------



## rdr09

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> Anyone want to tell me if there is anything wrong with the PowerColor PCS+ R9 290 and 290X? I'm finding crazy deals on them for great prices. Are they not good cards?


Some cards are not performing the way they should, like a 290X performing like a 290 Pro. Also, check the warranty; afaik, PowerColor's warranty is not transferable. If that is still the case, I would not buy a used one.

Edit: I know of one case that was solved with a BIOS update.


----------



## Mega Man

Quote:


> Originally Posted by *muhd86*
> 
> also i have 3 -30 inch dell 3007 wfp lcd , can i use eye fintty ..
> 
> the displays only have dvi connectors
> 
> is there a way to use eye finity with the current set up with using any expensive adaptors


You will need a DVI adapter or two (the expensive kind, ~$100 each) and probably an MST hub.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BiG StroOnZ*
> 
> Anyone want to tell me if there is anything wrong with the PowerColor PCS+ R9 290 and 290X? I'm finding crazy deals on them for great prices. Are they not good cards?
> 
> 
> 
> Nothing wrong with them at all, they are quite good cards.
> 
> problem is the name, when people see Powercolor they think "It's a piece of crap"
> 
> Not everyone mind you, most people in this thread will tell you they are quite good.

Warranty through them sucks; iirc you have to send cards to Asia to be warrantied.

and ( in my opinion ) this
Source
Quote:


> • PowerColor products sent in for RMA MUST be free of any improper use, including but not limited to physical damage from dropping, improper installation, or modification of any kind (this includes installing aftermarket cooling solutions). The warranty WILL BE VOID if the product has been damaged or altered.


My fav is the "scratched gold finger" clause; have you ever seen gold fingers without scratches?

I will add that I have heard good things about them, so yeah, all of the above is just personal. I just ordered a PowerColor card and I'll let you know what I think.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mega Man*
> 
> warranty through them sucks, iirc you have to send them to asia to be warrantied
> 
> and ( in my opinion ) this
> Source


Pretty much no different for most people; not many companies allow you to take the cooler off without voiding the warranty.

As for warranty, well, that depends on whether or not the place of purchase handles it for you; my 290X didn't even get to Sapphire before they approved the claim.


----------



## BiG StroOnZ

Quote:


> Originally Posted by *heroxoot*
> 
> Not a thing. Seems prices on AMD cards are taking a dive for whatever reason. Probably no one is buying them at the prices they were.


I noticed the price drop on the cards, but these are from online vendors, which is why I'm concerned. Basically eBay prices from an online vendor.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Nothing wrong with them at all, they are quite good cards.
> 
> problem is the name, when people see Powercolor they think "It's a piece of crap"
> 
> Not everyone mind you, most people in this thread will tell you they are quite good.


I was under the impression that PowerColor was a pretty solid company; didn't they produce the AX7990 Devil? That looked pretty well developed.
Quote:


> Originally Posted by *rdr09*
> 
> some cards are not performing the way they should. like a 290X performing like a 290 Pro. also, check the warranty. afaik, powercolor warranty is not transferable. if it is still, i would not buy a used one.
> 
> edit: i know one case that was solved with a bios update.


They are open-box cards, $399 for the R9 290 and $449 for the R9 290X, from an online vendor, so not necessarily used. Which is why I'm wondering if it's a bad move to jump on one of the two at that price.
Quote:


> Originally Posted by *Mega Man*
> 
> warranty through them sucks, iirc you have to send them to asia to be warrantied
> 
> and ( in my opinion ) this
> Source


Do they have a high RMA rate? Is it something that would sway you away from purchasing one of their products?


----------



## Mega Man

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> warranty through them sucks, iirc you have to send them to asia to be warrantied
> 
> and ( in my opinion ) this
> Source
> 
> 
> 
> Pretty much no different for most people, Not many companies allow you to take the cooler off and not void the warranty.
> 
> As for warranty, well that depends on whether or not the place of purchase does that for you, my 290x didn't get to Sapphire before they approved the claim.

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Not a thing. Seems prices on AMD cards are taking a dive for whatever reason. Probably no one is buying them at the prices they were.
> 
> 
> 
> Noticed the price drop in the cards, but these are from online vendors which is why I'm concerned. Basically eBay prices from an online vendor.
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Nothing wrong with them at all, they are quite good cards.
> 
> problem is the name, when people see Powercolor they think "It's a piece of crap"
> 
> Not everyone mind you, most people in this thread will tell you they are quite good.
> 
> 
> I was under the impression that Powercolor was a pretty solid company, didn't they produce the AX7990 Devil? That looked pretty well developed.
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> some cards are not performing the way they should. like a 290X performing like a 290 Pro. also, check the warranty. afaik, powercolor warranty is not transferable. if it is still, i would not buy a used one.
> 
> edit: i know one case that was solved with a bios update.
> 
> 
> They are Open Box cards. $399 for R9 290, $449 for R9 290X from an online vendor so not necessarily used though. Which is why I'm wondering if it's a bad move to jump on one of the two for the price.
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> warranty through them sucks, iirc you have to send them to asia to be warrantied
> 
> and ( in my opinion ) this
> Source
> 
> 
> Do they have a high RMA rate? Is it something that would sway you away from purchasing one of their products?




to both~ ninjaed !
Quote:


> Originally Posted by *Mega Man*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *muhd86*
> 
> also i have 3 -30 inch dell 3007 wfp lcd , can i use eye fintty ..
> 
> the displays only have dvi connectors
> 
> is there a way to use eye finity with the current set up with using any expensive adaptors
> 
> 
> 
> you will need a dvi adapter or 2 ( the expensive kind ~ 100 each ) and probably a mst hub
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BiG StroOnZ*
> 
> Anyone want to tell me if there is anything wrong with the PowerColor PCS+ R9 290 and 290X? I'm finding crazy deals on them for great prices. Are they not good cards?
> 
> 
> Nothing wrong with them at all, they are quite good cards.
> 
> problem is the name, when people see Powercolor they think "It's a piece of crap"
> 
> Not everyone mind you, most people in this thread will tell you they are quite good.
> 
> 
> warranty through them sucks, iirc you have to send them to asia to be warrantied
> 
> and ( in my opinion ) this
> Source
> Quote:
> 
> 
> 
> • PowerColor products sent in for RMA MUST be free of any improper use, including but not limited to physical damage from dropping, improper installation, or modification of any kind (this includes installing aftermarket cooling solutions). The warranty WILL BE VOID if the product has been damaged or altered.
> 
> 
> 
> 
> 
> *
> my fav is the "scratched gold finger" have you ever seen a gold finger without scratching ?
> 
> i will add i have heard good things about them, so yea all of the above is just personal, just ordered a powercolor card and ill let you know what i think*

It really is just personal preference. MSI and XFX both allow the cooler to be taken off without RMA issues as long as you don't damage the card, and I prefer to support companies like that. Although I have been lucky with PC stuff, I have never had to send anything to a non-US address. I can't find the article where I saw their warranty steps with the address.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mega Man*
> 
> to both~ ninjaed !
> It really is just personal preference. MSI and XFX both allow the cooler to be taken off without RMA issues as long as you don't damage the card, and I prefer to support companies like that. Although I've been lucky with PC stuff, I've never had to send anything to a non-US address. I can't find the article where I saw their warranty steps with the address.


XFX is only North America iirc?

Sapphire kinda has a "don't ask, don't tell" type of policy; others I'm unsure of.


----------



## Mega Man

ahha ! well i live in the us so good enough for meh !


----------



## Sgt Bilko

I don't get why it's an NA-only thing though, same as the Lifetime Warranty (unless that changed).


----------



## Roy360

Quote:


> Originally Posted by *Mega Man*
> 
> where do you see this 30% off ??


Quote:


> Originally Posted by *Forceman*
> 
> Is there a code or something for the 30% discount?


my bad, I completely forgot to include the code. CLRSALEAPR11

The Royal X is still on sale for 419, with 22 left in stock.

here's the actual card


----------



## cennis

I think that is the RoyalAce. The RoyalKing is more similar to the two-fan ASUS design.


----------



## hornedfrog86

Quote:


> Originally Posted by *Roy360*
> 
> my bad, I completely forgot to include the code. CLRSALEAPR11
> 
> The Royal X is still on sale for 419, with 22 left in stock.
> 
> here's the actual card


Where?


----------



## lawson67

Quote:


> Originally Posted by *BiG StroOnZ*
> 
> Anyone want to tell me if there is anything wrong with the PowerColor PCS+ R9 290 and 290X? I'm finding crazy deals on them for great prices. Are they not good cards?


I have a PowerColor R9 290 and I highly recommend it; I have had no trouble at all with it! I also believe it has the best non-reference cooler. The cooler is very large, covers 3 slots, and has some lovely heatsinks on the VRM chips. It comes factory overclocked at 1040MHz with the memory running at 1350MHz, and after a 3-hour Heaven 4.0 benchmark run the temps barely rose above 70C.

Edit: And that max temp of 70C is running in CrossFire with a Gigabyte Windforce R9 290 above it, which I would also highly recommend. However, the Windforce is a fair bit louder than the PowerColor once she gets up and running after a few hours of gaming!


----------



## lawson67

Quote:


> Originally Posted by *Roy360*
> 
> my bad, I completely forgot to include the code. CLRSALEAPR11
> 
> The Royal X is still on sale for 419, with 22 left in stock.
> 
> here's the actual card


This card looks to me like it has the PowerColor PCS+ heatsink on it?


----------



## disintegratorx

Here's a link to the Catalyst 14.4 drivers, if anybody wants to try them.
http://www.ngohq.com/news/24735-amd-catalyst-14-4-beta.html I think they're good. I haven't read any notes on them though...


----------



## Mega Man

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I don't get why it's a NA only thing though, same as the Lifetime Warranty (unless that changed)


That I don't know.


----------



## bond32

The prices of the cards have dropped for 2 main reasons: 1) the difficulty of most crypto coins has gone up drastically, so much that it has shot profits down; and 2) a lot of people can't keep them cool due to the increasing ambient temperatures. Unless you have them in a separate building, that's going to be a huge load on the AC.


----------



## Roy360

Quote:


> Originally Posted by *lawson67*
> 
> This card looks to me like it has the powercolor pcs+ heatsink on it?


Quote:


> Originally Posted by *hornedfrog86*
> 
> Where?


http://products.ncix.com/detail/club3d-radeon-r9-290x-royalking-6d-94194-1374.htm

and I uploaded the wrong pic. NCIX has the RoyalKing not the ACE


----------



## hornedfrog86

Thanks!


----------



## mojobear

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Running quad here; I water-blocked all of mine and have a 1600W PSU to go with them. I can't really overclock very much because I get worried about my PSU and breaker lol. I can only suggest, if you're looking for absolute max FPS, go 4 GPUs. It's really hard to recommend it over 3, because 3 do well. I am currently doing scaling tests, but in 4K Eyefinity in another thread, 11520x2160 right now. If you would like I can do some scaling tests at 5760 from 3 to 4 GPUs. I don't have a lot of the brand new game releases though. And yes, some games really have problems with 4-way CF over 3-way CF. One that comes to mind is SKYRIM.
> 
> If you're going to have to spend a chunk of money to get that 4th card in, your benefit per dollar will go down.


Hey thanks DeadlyDNA...would love to see some scaling on 3 vs 4 cards @ 5760. If I do put the 4th 290 in with my other 3 290s I think I would just use an add2psu and hook up a 600W to feed the extra GPU. That way I could save on money and wait for the 1500W next gen EVGA or the AX1500i from corsair







I would still need to buy another radiator though, and hook up some GT 1450rpm fans to those...so we are talking about 400 dollars extra just to do quad 290s.

I was looking at your 4K Eyefinity thread...impressive stuff! Can't wait until you load up a pic of the setup, and make sure to put something in for size reference haha.


----------



## cennis

http://www.tomshardware.com/reviews/...up,3655-5.html
Going by the comparison in this 280X review, the RoyalKing needs to spin at 2500 RPM where the ASUS DirectCU only needs 1500 RPM for the same temps.
This may or may not apply to the 290X, but the RoyalKing is a louder cooler as it has faster fans.

Anyone have more info on this card?
The ASUS DirectCU seems to be a failed implementation on the 290/290X: high core and VRM temps.


----------



## Roy360

Quote:


> Originally Posted by *cennis*
> 
> http://www.tomshardware.com/reviews/...up,3655-5.html
> Going by the comparison in this 280X review, the RoyalKing needs to spin at 2500 RPM where the ASUS DirectCU only needs 1500 RPM for the same temps.
> This may or may not apply to the 290X, but the RoyalKing is a louder cooler as it has faster fans.
> 
> Anyone have more info on this card?
> The ASUS DirectCU seems to be a failed implementation on the 290/290X: high core and VRM temps.


Yup, my core and VRM temps are at 90 degrees with my ASUS R9 DU card. I plan on adding shims to the cores, but I'm not sure how I will improve the VRM temps. There's only so much a quality thermal pad can do.

plus now I need to consider which cards I want to watercool. I only want to watercool 3 cards, and I already have two reference R9 290 blocks

My R9 200 cards

XFX R9 290 reference <--Highest Hashrate
VisionTek R9 290 Reference with non reference cooler <--In the R9 280X range
2x ASUS DU R9 290 <---Strictly gaming card, horrible hashrate and horrible overclocker
CLUB 3D RoyalKing R9 290X
ASUS DU R9 290X

I've never been in this situation before







Do I cool the high-end cards, or should I cool the cards in desperate need of cooling (i.e. the ASUS cards)?


----------



## Norse

Was just looking into possibly WC'ing the 290Xs I am getting and have a query about this kind of block: http://www.watercoolinguk.co.uk/p/Aqua-Computer-kryographics-Hawaii-for-R9-Radeon-290X-edition-acrylic-glass-nickel-plated-version_42990.html What would I do, since at stock both cards would be almost touching? They do NOT have a gap between them. I have no experience with WC'ing, so what would the best thing be to do, as I doubt I will have enough of a gap between them to run the loop directly between both GPUs before going back to the radiator.


----------



## sugarhell

The new CIV game has Mantle support. Finally an RTS Mantle game!


----------



## Jflisk

Quote:


> Originally Posted by *Norse*
> 
> Was just looking into possibly WC'ing the 290x's i am getting and have a query .......... http://www.watercoolinguk.co.uk/p/Aqua-Computer-kryographics-Hawaii-for-R9-Radeon-290X-edition-acrylic-glass-nickel-plated-version_42990.html that kind of block what would i do as by stock both graphics would be almost touching? they do NOT have a gap between them? i have no experience with WC'ing so what owuld best thing be to do as i doubt i will have enough gap between them to have the loop directly between both GPU before going back to the radiator


Your best bet would be looking at EK water blocks and FC bridges. The bridge with a block would probably be preferred in your case.

http://www.ekwb.com/shop/blocks/vga-blocks/ati-radeon-full-cover-blocks/radeon-rx-200-series.html

or here

http://www.frozencpu.com/cat/l1/g57/EK_Products.html?id=S2jkY9qh


----------



## Norse

The bridge would have almost no clearance though? The cards would be pretty much against each other; since I don't have a two-bay gap between them it will be 0.


----------



## Jflisk

Quote:


> Originally Posted by *Norse*
> 
> The bridge would have almost no clearance though? as the cards would be against each other pretty much as i dont have a 2 bay gap between then it iwll be 0


Check this one

http://www.frozencpu.com/products/16407/ex-blc-1161/EK_FC_Bridge_Dual_Parallel_CSQ_-_SLI_Connection_-_Acetal_EK-FC_Bridge_DUAL_Parallel_CSQ.html?tl=g57c593s1895


----------



## Jflisk

Quote:


> Originally Posted by *Jflisk*
> 
> Check this one
> 
> http://www.frozencpu.com/products/16407/ex-blc-1161/EK_FC_Bridge_Dual_Parallel_CSQ_-_SLI_Connection_-_Acetal_EK-FC_Bridge_DUAL_Parallel_CSQ.html?tl=g57c593s1895


With water it doesn't matter if they are next to each other, as long as you can hit the PCIe slot and connect the loop. R9 290Xs are dual-slot cards; there is no real heat to speak of.


----------



## brucevilanch

Add me to this list!











XFX R9 290 OC Black Edition. Reference cooler for now, but will water block it later.


----------



## kizwan

Quote:


> Originally Posted by *Norse*
> 
> Was just looking into possibly WC'ing the 290x's i am getting and have a query .......... http://www.watercoolinguk.co.uk/p/Aqua-Computer-kryographics-Hawaii-for-R9-Radeon-290X-edition-acrylic-glass-nickel-plated-version_42990.html that kind of block what would i do as *by stock both graphics would be almost touching*? they do NOT have a gap between them? i have no experience with WC'ing so what owuld best thing be to do as *i doubt i will have enough gap between them to have the loop directly between both GPU before going back to the radiator*


With stock coolers they will be touching, but it's a different story with water blocks. You'll have one *PCIe slot* gap between them.

Why don't you connect both GPUs in series (or parallel)? No need to go through a radiator before the second GPU. The heat is not that bad if you have enough radiator.


----------



## Roboyto

Quote:


> Originally Posted by *bond32*
> 
> This is what happens to my lovely 290x when I overclock:
> 
> This was at 1175/1315 +100mv. Does not matter the bios/drivers/combination.


Is the bench/game hard locking when this happens? I never got vertical bars like that with my card.

If I pushed the memory too far I would get a pure black screen and a hard lock or a crash; not always a BSOD though.

Pushing the core too far I would get artifacting and flickering, which was typically solved by more voltage. Even when I ran out of voltage to add, a benchmark would complete no matter how severe the artifacting got... I let it go one time through a whole bench, expecting it to crash because of how awful the artifacting/flickering was, but it finished. My score was lower than previous runs, but it held through.









What are your temperatures on the VRMs and the core at those clocks?









What drivers are you running? Did you scrub the drivers before re-installing each time?

I would reduce the memory clock back to stock and see if that 1175 core is stable with +100 mV; 100mV seems a little high for only 1175 unless you got an underwhelming card.

What utility are you using to OC? Whichever you use, make sure ULPS is disabled and you are forcing constant voltage. A few people have had issues with AB, myself included; Trixx has solved some strange issues for a few people, so you may want to consider trying it.
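For anyone who'd rather flip ULPS off by hand than through an OC utility, the well-known `EnableUlps` value lives in the registry under the display-adapter class key. A minimal `.reg` sketch follows; the `0000` instance number is an assumption (secondary cards often sit at `0001`, `0002`, etc.), so check which numbered subkeys hold your AMD adapters, repeat for each, and back up the registry first:

```
Windows Registry Editor Version 5.00

; Disable ULPS for the AMD adapter at instance 0000.
; The instance number is an assumption; repeat for each AMD adapter subkey.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Reboot after merging. Most OC utilities (Trixx, Afterburner) expose the same switch in their settings, which is the safer route.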


----------



## Norse

Quote:


> Originally Posted by *kizwan*
> 
> With stock cooler, they will be touching but it's different story with water block. You'll have one *PCIe slot* gap between them.
> 
> Why don't you connect both GPU in series (or parallel)? No need to go through a radiator before second GPU. The heat is not that bad if you have enough radiator.


Will have to see what the temps and noise levels are like once I get them







Don't really want to have to spend another £400


----------



## Roboyto

Quote:


> Originally Posted by *brucevilanch*
> 
> Add me to this list!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> XFX r9 290 OC Black edition. Reference cooler for now, but will water block it in a later.


I have the same card and mine is a wonderful performer, if you have any questions feel free to ask. I pushed the card hard with stock blower, and then even further once I went


----------



## Roboyto

Quote:


> Originally Posted by *Norse*
> 
> Will have to see what the temps and noise levels are like once i get them
> 
> 
> 
> 
> 
> 
> 
> dont really want to have to spend another £400


Once you start battling the heat two of these volcanoes dissipate when CrossFiring, you may reconsider that £400 pretty quickly.


----------



## brucevilanch

Yes, I can't wait till I get my loop set up to see just how far I can push this thing. I can get 1125/1350 at stock voltage, which was way more than I was expecting. I set up a pretty aggressive fan curve (loud as ****) and my core temps are usually in the low 70s.

I do have a question for you, though. Anything over 1150 on the core and the card throttles back severely. Core and VRM temps look fine, but I just haven't been able to identify the problem. I've tried it in Valley, 3dmark11, and 3dmark Firestrike. The same thing happens in all of them.

Any ideas?


----------



## Norse

Quote:


> Originally Posted by *Roboyto*
> 
> Once you have to start battling the heat 2 of these volcanoes dissipate when Xfiring you may reconsider that 400 pretty quickly


I'll get the credit card and aircon ready then


----------



## Roboyto

Quote:


> Originally Posted by *brucevilanch*
> 
> Yes, I can't wait till I get my loop set up to see just how far I can push this thing. I can get 1125/1350 at stock voltage, so that was way more than what I was expecting. I set up a pretty aggressive fan curve(loud as ****) and my core temps are usually in the low 70's.
> 
> I do have a question for you, though. Anything over 1150 on the core and the card throttles back severely. Core and VRM temps look fine, but I just haven't been able to identify the problem. I've tried it in Valley, 3dmark11, and 3dmark Firestrike. The same thing happens in all of them.
> 
> Any ideas?


Have you given it any extra voltage at 1150? My card needed +75mV to get to 1180/1450.

What are the core/VRM1 temps looking like? At 1180/1450 +75mV fan forced to 75% I got: Core 69C VRM1/VRM2 50C/57C

The card should not throttle from core temperature unless you break 95C. The VRMs are rated for ~120C, but the consensus in this thread seems to be that exceeding 80C isn't good for the longevity of the card.


----------



## Roboyto

Quote:


> Originally Posted by *Norse*
> 
> I'll get the credit card and aircon ready then










Good old plastic money! Probably going to want 240mm of radiator per card, plus some headroom for the CPU...dual 360s would probably keep things under control.

Check out: http://www.overclock.net/t/1442038/build-log-the-hawaiian-heat-wave

Gunderman456 is one thorough SOB! There is lots of useful information in his build log for Xfire 290(X), I would highly suggest checking it out.
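The sizing advice above can be sanity-checked with a quick back-of-envelope calculation. The rule of thumb assumed here (roughly 100W of heat per 120mm of radiator at moderate fan speeds) and the wattage figures are rough assumptions, not measurements, so treat this as a sketch:

```python
import math

# Back-of-envelope radiator sizing. Assumed rule of thumb: roughly
# 100 W of dissipated heat per 120 mm of radiator at moderate fan speeds.

def radiator_mm_needed(heat_watts, watts_per_120mm=100):
    """Radiator length in mm, rounded up to whole 120 mm segments."""
    segments = math.ceil(heat_watts / watts_per_120mm)
    return segments * 120

# Two overclocked 290s at ~300 W each plus ~150 W of CPU headroom:
total_heat = 2 * 300 + 150                 # 750 W
print(radiator_mm_needed(total_heat))      # 960 mm, e.g. dual 360s plus a 240
```

By this estimate, dual 360s (720mm) alone land a little short for two hot cards plus a CPU, which matches the "plus some headroom" caveat above.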


----------



## brucevilanch

Oh yeah, I've tried it all the way up to +100mV just to see if that was the problem, even though typically it would just crash without the proper voltage. I might try something other than Afterburner and see what that does.


----------



## Norse

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> 
> 
> 
> 
> 
> Good old plastic money! Probably going to want a 240mm of radiator per card, plus some headroom for CPU...dual 360s would probably keep things under control.
> 
> Check out: http://www.overclock.net/t/1442038/build-log-the-hawaiian-heat-wave
> 
> Gunderman456 is one thorough SOB! There is lots of useful information in his build log for Xfire 290(X), I would highly suggest checking it out.


Won't be WC'ing my CPUs as they run cool enough as it is, plus the air coolers were £55 each.......I don't wanna throw away £110 as they only fit C32 and G34 sockets.


----------



## Roboyto

Quote:


> Originally Posted by *Norse*
> 
> Won't be WC'ing my CPUs as they run cool enough as it is, plus the air coolers were £55 each.......I don't wanna throw away £110 as they only fit C32 and G34 sockets.


Oooh, dual Opterons...Sweeeeeeeet


----------



## Norse

Quote:


> Originally Posted by *Roboyto*
> 
> Oooh, dual Opterons...Sweeeeeeeet


VERY sweet: 32 cores, 64GB RAM







but it cost quite a bit so


----------



## Roboyto

Quote:


> Originally Posted by *brucevilanch*
> 
> Oh yeah, I've tried it all the way up to +100mv just to see if that was the problem, even though typically it would just crash without the proper voltage. I might try something other than afterburner and see what that does.


What are the core/VRM1 temps looking like? At 1180/1450 +75mV fan forced to 75% I got: Core 69C VRM1/VRM2 50C/57C

Also, what drivers are you running? The 14.x betas have been problematic; 13.12 is my suggestion for issue-free stability.

Are your rig signatures up to date? I see 79XX cards in them. What PSU are you using to feed this monster GPU?

Also, it is best to use the Quote button when responding directly to someone, so they know you were speaking to them. This thread can move very fast at times and your post could get buried quickly.


----------



## brucevilanch

Core around 74c and VRMs in the 60s. Nothing problematic. I am on 14.3 though. I'll try out 13.12 and see if that helps. Thanks.


----------



## Roboyto

Quote:


> Originally Posted by *brucevilanch*
> 
> Core around 74c and VRMs in the 60s. Nothing problematic. I am on 14.3 though. I'll try out 13.12 and see if that helps. Thanks.


Make sure you scrub those drivers first. Display Driver Uninstaller FTW

http://www.guru3d.com/files_details/display_driver_uninstaller_download.html


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Norse*
> 
> I'll get the credit card and *aircon* ready then


LoooooL, I've already done that


----------



## trihy

Is it me, or is it just impossible to watch a 3D movie with the 14.3 drivers?

I get an instant blue screen as it switches to 24Hz mode in Arc media theatre or PowerDVD...

On 13.12 it's not perfect, but it works.


----------



## FtW 420

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> LoooooL ive already done that


Preparing for the summer!


----------



## centvalny

Testing 290X matrix



http://imgur.com/4Y92WC2


----------



## Norse

Quote:


> Originally Posted by *FtW 420*
> 
> Preparing for the summer!


is that for quad 290x's? one for each?


----------



## brucevilanch

Quote:


> Originally Posted by *Roboyto*
> 
> Make sure you scrub those drivers first. Display Driver Uninstaller FTW
> 
> http://www.guru3d.com/files_details/display_driver_uninstaller_download.html


I've been using atiman uninstaller for the last year. Would you recommend that over Atiman?


----------



## Roboyto

Quote:


> Originally Posted by *brucevilanch*
> 
> I've been using atiman uninstaller for the last year. Would you recommend that over Atiman?


I've never heard of ATiman, so I can't compare. Up until I got my card and started following this thread I was using an outdated version of Driver Sweeper


----------



## brucevilanch

Right on. I'll give it a try the next time I switch drivers. I have already gone back to the 13.12s and noticed that it did help; also, the power limit slider is working like it should.


----------



## Arizonian

Quote:


> Originally Posted by *brucevilanch*
> 
> Add me to this list!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> XFX r9 290 OC Black edition. Reference cooler for now, but will water block it in a later.


Congrats - added


----------



## brucevilanch

Thank you, sir.


----------



## Paul17041993

Why do I hear so much "high RPM makes it louder" FUD...? That would imply the stock heatsinks for my 8150 and A.II 640 are louder than my radiator fans, which they aren't...

Fan RPM relates to the pressure the fan can push out; noise levels depend on the impeller and bearing used, with size then acting as a multiplier.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *FtW 420*
> 
> Preparing for the summer!


Those units have a very large fancoil on em by the looks of it








My house has a 15.5kW fully ducted system that can turn zoning on/off as well. So, full blast in zone 3 AND AC/PC.








Quote:


> Originally Posted by *centvalny*
> 
> Testing 290X matrix
> 
> 
> 
> http://imgur.com/4Y92WC2


Did you replace the pch 'sink to get that beasty to fit ?

Quote:


> Originally Posted by *Norse*
> 
> is that for quad 290x's? one for each?


LooooooL


----------



## VSG

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Did you replace the pch 'sink to get that beasty to fit ?


That actually looks like a prototype z97 or x99 board he has, likely can't talk about it till NDA lifts.


----------



## grunion

It's a R4BE


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *geggeg*
> 
> That actually looks like a prototype z97 or x99 board he has, likely can't talk about it till NDA lifts.
> Quote:
> 
> 
> 
> Originally Posted by *grunion*
> 
> It's a R4BE
Click to expand...

I wonder what made him think that LooooL


----------



## VSG

D'oh! I saw those heatsinks on the leaked z97 motherboard pics and figured the guy had been given an Asus board to test along with the Matrix card. I know a lot of LN2 guys have gotten these already.

Example image of upcoming MSI board with test heatsinks on:


----------



## centvalny

Quote:


> Originally Posted by *grunion*
> 
> It's a R4BE


It's actually a RIVBE ES rev. 1.01, the first of 20 testing boards for the RIVBE official launch day @ IDF '13.


----------



## swiftypoison

That Matrix looks amazing. I returned my ASUS DirectCU II R9 290 just to wait for that. Do you know when they will be available?


----------



## TheBenson

Bought a R9 290 DCUII, good news is it can make it through firestrike at 1230 on the core, bad news is the scores are no better than at 1150.


----------



## Paul17041993

Quote:


> Originally Posted by *TheBenson*
> 
> Bought a R9 290 DCUII, good news is it can make it through firestrike at 1230 on the core, bad news is the scores are no better than at 1150.


Throttling, most likely. Check your temps; you might also be hitting the TDP limit (usually requires a BIOS mod to overcome).


----------



## UZ7

Currently interested in sidegrading to an R9 290 lol; within the year I went 7950 -> 770 -> 780, and kinda wanna jump back to AMD.









Looking at:
R9 290 Gaming
R9 290 Tri-X

Warranties aside, how do the coolers perform? I'm kinda anal about noise, so I want something that runs silent as well; if anyone has experience with either of these, please share your input.


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *FtW 420*
> 
> Preparing for the summer!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Those units have a very large fancoil on em by the looks of it
> 
> 
> 
> 
> 
> 
> 
> 
> My house has a 15.5kw fully ducted system can turn on / off zoning as well . So full blast zone 3 AND AC/PC
Click to expand...

That is a lot of horsepower. Whenever I see this...


----------



## VinhDiezel

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> yeah, loud to say the least
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I agree with you. I hear so many bad things about reference coolers, but aside from the noise it kept my card pretty cool in any situation. Of course that's completely ignoring the sound levels. 40% fan on the reference is like my DCUII at full blast, 100% speed. 100% fan on the reference is like 1000000% on the DCUII.


How are your temps on the reference cooler at different fan speeds (40%, 80%, 100%)? Thinking of picking up a pair of 290s off the bay for cheap.


----------



## rdr09

Quote:


> Originally Posted by *TheBenson*
> 
> Bought a R9 290 DCUII, good news is it can make it through firestrike at 1230 on the core, bad news is the scores are no better than at 1150.


what driver?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> That is a lot of horsepower. Whenever I see this...


Here's some moar horsepower...........

Added a 2nd 360 / 65mm Phobya I had from the Antec 1200, cut and shut. QDCs, some rough plumbing...........


----------



## heroxoot

Wow man I think you need a bigger case. Or maybe you need a second one.


----------



## arrow0309

Quote:


> Originally Posted by *UZ7*
> 
> Currently interested on side bending to R9 290 lol, so within the year I went 7950 -> 770 -> 780, kinda wanna jump back to AMD.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looking at:
> R9 290 Gaming
> R9 290 Tri-X
> 
> With warranties aside, how does the cooler perform? Kinda anal with noise so I want something that runs silent as well, so if anyone has had experience with any of these please share your input


Get the 290 Tri-X OC








With its 3 x 90mm fans it is dead silent (idle 20%, load 36-37%) on the default (BIOS) fan curve, and you can still improve its cooling by using "custom" fan profiles in Trixx like these (some personal):









*OC Gaming*



*Or Extreme*
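A custom profile like the ones mentioned above boils down to a few (temperature, fan %) points, with the utility interpolating between them; a quick sketch of the idea, assuming simple linear interpolation (the curve points below are made up, not the actual OC Gaming or Extreme profiles):

```python
# Sketch of a fan curve: sorted (temperature C, fan %) points with
# linear interpolation between neighbouring points, clamped at the ends.

def fan_percent(temp_c, curve):
    """Interpolate fan speed (%) from a sorted (temp, pct) curve."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the curve: minimum fan speed
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # above the curve: maximum fan speed
    for (t0, p0), (t1, p1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)

aggressive = [(30, 20), (50, 35), (70, 60), (85, 100)]
print(fan_percent(60, aggressive))  # midway between 35% and 60% -> 47.5
```

Steeper segments between 70C and 85C are what make a profile "aggressive": small temperature rises produce large fan-speed jumps, at the cost of noise.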


----------



## UZ7

Quote:


> Originally Posted by *arrow0309*
> 
> Get the 290 Tri-X OC
> 
> 
> 
> 
> 
> 
> 
> 
> With its 3 x 90mm fans it is dead silent (idle 20%, load 36-37%) on the default (BIOS) fan curve, and you can still improve its cooling by using "custom" fan profiles in Trixx like these (some personal):
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *OC Gaming*
> 
> 
> 
> *Or Extreme*


Niice! thanks for the info.









Another question: is there any card sag with this cooler? I see most custom coolers have backplates or an inner base plate to prevent sagging.


----------



## DerekParker

@Arizonian Sure, I agree it's not so bad, but I would like to do some things to bring it down a bit. The cards themselves seem to be very nicely built, with a good solid factory backplate, and the sink and fans seem to keep up really well considering how hot the reference cards get. I will definitely always reply to folks; I do however work a lot, so not all of my responses are very timely.


----------



## heroxoot

Quote:


> Originally Posted by *UZ7*
> 
> Another question, is there any card sag with this cooler? I see most custom coolers have back plates or inner base plate to prevent sagging.


If there is, you can just make something to prop it up. That's one reason why I love the new high-end cards: everyone is putting backplates on them now. For me, I have the HAF XB from Cooler Master; the card sits with the side the heat dumps from standing up, so there is no way for it to sag. It pleases me, because seeing pictures of sagging GPUs makes me mad. How anyone could just let a card bend like that and not try to help it is beyond me.

Hopefully MSI ships my 290 tomorrow so I can know the feel of a sturdy gpu again.


----------



## arrow0309

Quote:


> Originally Posted by *UZ7*
> 
> Niice! thanks for the info.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Another question, is there any card sag with this cooler? I see most custom coolers have back plates or inner base plate to prevent sagging.


Exactly!
The inner plate is fixed with a lot of screws to the backside of the card's PCB; moreover, the 5-heatpipe cooler and the inner plate are assembled together, so you can't remove the main cooler without the (VRM & memory cooling) baseplate.









http://images.anandtech.com/doci/7601/S290HSF_Back.jpg
http://www.anandtech.com/show/7601/sapphire-radeon-r9-290-review-our-first-custom-cooled-290
http://www.kitguru.net/wp-content/uploads/2013/12/ACC_4866_DxO1.jpg

This ensures nice rigidity and, of course, no sagging for such a huge and long video card.
Look at mine:










Spoiler: Warning: Spoiler!


----------



## Roy360

Does anyone know if the https://www.dazmode.com/store/product/back-plate-for-kryographics-hawaii-r9-290x290-active-xcs/ will fit on the EK blocks, or if at least the ports match up? Then I could buy two Kryographics blocks and add them with backplates.

I want to add two more cards to my loop, but I don't want the regular EK block anymore.


----------



## brucevilanch

I don't know if they'll line up or not. I thought about doing the same thing, though. That active back plate looks incredible and seems to perform really well.



The rest can be found in this thread. http://www.xtremesystems.org/forums/showthread.php?288109-Stren-s-R9-290-290x-Water-Block-Testing


----------



## boldenc

Has anyone tried a modded BIOS on their Sapphire Tri-X?
I don't like the card's vdroop; during Firestrike Extreme the GPU voltage drops to as low as 1.09V in some areas, and that's already at +100mV.


----------



## Roy360

Quote:


> Originally Posted by *brucevilanch*
> 
> I don't know if they'll line up or not. I thought about doing the same thing, though. That active back plate looks incredible and seems to perform really well.
> 
> 
> 
> The rest can be found in this thread. http://www.xtremesystems.org/forums/showthread.php?288109-Stren-s-R9-290-290x-Water-Block-Testing


I contacted Daz, and he's "almost positive that port will not line up". Looks like my next two R9 290s will suffer under EK's poor block







. Even that passive AquaBlock is incredible.

I was even thinking of buying the EK backplate, but it doesn't seem to be doing much.


----------



## brucevilanch

Yeah, the first block I was looking at is the EK nickel. That block looks really nice, but I think I'm going to go with the XSPC Razer. I love the clean design and the LED, plus it will match the CPU block I'm getting. I might still go with the Aquacomputer block just because of the ridiculous performance it offers, especially with their backplates.

I believe the active back plate is around $75 by itself, but man it looks sexy.


----------



## kizwan

Quote:


> Originally Posted by *Roy360*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *brucevilanch*
> 
> I don't know if they'll line up or not. I thought about doing the same thing, though. That active back plate looks incredible and seems to perform really well.
> 
> 
> 
> The rest can be found in this thread. http://www.xtremesystems.org/forums/showthread.php?288109-Stren-s-R9-290-290x-Water-Block-Testing
> 
> 
> 
> 
> 
> 
> 
> I contacted Daz, and he's "almost positive that port will not line up". Looks like my next two R9 290s will suffer under EK's poor block
> 
> 
> 
> 
> 
> 
> 
> . Even that passive AquaBlock is incredible.
> 
> I was even thinking of buying the EK backplate, but it doesn't seem to be doing much.
Click to expand...

You can either sell the EK block and get the Aqua water block + backplate, or replace the stock EK thermal pads with Fujipoly Extreme or Ultra Extreme thermal pads. I'm happy with my EK water block (+ Fujipoly Extreme thermal pad on VRM1).


----------



## Jflisk

Okay, so I got my R9 290Xs under water; how far do we think I can push these things? I have been running at 1060/1350 with no problems. One of my cards is Hynix memory and the other is Elpida. Thanks.


----------



## brucevilanch

You should have no problem pushing it up to 1100-1125 with stock voltage. My 290 is running at 1125 with stock voltage and +50mV on the aux power to put the RAM up to 1500. My card has Hynix RAM.


----------



## Roboyto

Quote:


> Originally Posted by *Roy360*
> 
> does anyone know if the https://www.dazmode.com/store/product/back-plate-for-kryographics-hawaii-r9-290x290-active-xcs/ will fit on the EK blocks? Or if at least the ports match up? So I can buy two Kryographics blocks and add them with backplates
> 
> I want to add two more cards to my loop, but *I don't want the regular EK block anymore*


Quote:


> Originally Posted by *brucevilanch*
> 
> Yeah, the first block I was looking at is the EK nickel. That block looks really nice, but I think I'm going to go with the *XSPC Razor*. I love the clean design and the LED, plus it will match the CPU block I'm getting. I might still go with the *Aquacomputer block just because of the ridiculous performance* it offers, especially with their backplates.
> 
> I believe the active back plate is around $75 by itself, but man it looks sexy.


Quote:


> Originally Posted by *kizwan*
> 
> You can either sell the EK block & get the Aqua water block + backplate, or replace the stock EK thermal pad with a *Fujipoly Extreme or Ultra Extreme thermal pad. I'm happy with my EK water block (+ Fujipoly Extreme thermal pad on VRM1).*


*What Kizwan said, see my thread here with 20%+ temp drops on VRM1. A few people with EK blocks have commented on similar temp drops in the thread.*

http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures

*You will need to buy (1) of these to cover up to (3) cards' VRM1 section:*

http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html?tl=g8c487s1797
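For rough intuition on why the higher-conductivity pad helps, a back-of-the-envelope conduction calculation. This is a sketch only: the 100 x 15 x 1.0 mm size and 17 W/mK rating come from the listing above, while the ~5 W/mK stock-pad figure and the 30 W VRM load are assumptions for illustration:

```python
# Conductive thermal resistance of a pad: R = thickness / (conductivity * area)

def pad_resistance(thickness_m, conductivity_w_mk, area_m2):
    """Thermal resistance in K/W for simple 1-D conduction through the pad."""
    return thickness_m / (conductivity_w_mk * area_m2)

area = 0.100 * 0.015   # 100 mm x 15 mm contact patch, in m^2
thickness = 0.001      # 1.0 mm, in metres

r_stock = pad_resistance(thickness, 5.0, area)   # assumed generic stock pad
r_fuji = pad_resistance(thickness, 17.0, area)   # Fujipoly Ultra Extreme rating

for label, r in (("stock ~5 W/mK", r_stock), ("Fujipoly 17 W/mK", r_fuji)):
    # Temperature rise across the pad if the VRM section dumps ~30 W through it
    print(f"{label}: {r:.3f} K/W -> {30 * r:.1f} K rise at 30 W")
```

At these assumed numbers the higher-conductivity pad cuts the temperature rise across the pad to roughly a third, which lines up with the VRM1 drops reported in the linked thread.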

*I have the XSPC and the performance is fine for core, it helps to upgrade thermal pads if you really want to push voltage/clocks. *

*The active backplate for Aquacomputer actually does little more than the passive one; it does however complete the look.*

*I've been meaning to put thermal pads between my backplate and VRMs, just haven't gotten around to doing it yet. I will have that info posted sooner or later.*


----------



## Roboyto

Quote:


> Originally Posted by *Jflisk*
> 
> Okay, so I got my R9 290Xs under water. How far do we think I can push these things? I have been running at 1060 / 1350 with no problems. One of my cards is Hynix mem and the other is Elpida mem. Thanks


*Really depends on how good the cards you got are, and if you have enough radiator space to cool them off when OC'd and over-volted. *

*Seems like most don't have much trouble hitting high 1100's to low 1200's with around +100mV for the core. *

*RAM is a different story and varies more. RAM clock is not nearly as important with these cards' enormous 512-bit bus. Concentrate on core clock first and then worry about memory second.*
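The 512-bit point is easy to put in numbers; a quick sketch (the 1250 MHz stock memory clock is the reference-card spec, and GDDR5 transfers 4 bits per pin per clock):

```python
def gddr5_bandwidth_gbs(mem_clock_mhz, bus_width_bits):
    """Peak GDDR5 bandwidth in GB/s: clock x 4 (effective rate) x bus width / 8 bits per byte."""
    return mem_clock_mhz * 4 * bus_width_bits / 8 / 1000

# Hawaii's 512-bit bus at the stock 1250 MHz already moves a lot of data...
print(gddr5_bandwidth_gbs(1250, 512))  # 320.0 GB/s at stock
# ...so a 100 MHz memory OC buys comparatively little:
print(gddr5_bandwidth_gbs(1350, 512))  # 345.6 GB/s
```

An 8% memory overclock only adds ~26 GB/s on top of an already huge 320 GB/s, which is why core clock tends to pay off first.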


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> That is a lot of horsepower. Whenever I see this...
> 
> 
> 
> 
> 
> 
> Here's some moar horsepower...........
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Added a 2nd 360 / 65mm Phobya I had from an Antec 1200, cut and shut. QDCs, some rough plumbing...........
Click to expand...

Added SR-1 120. With it & Shin-Etsu G751 TIM, GPU's core temp down by ~10 to ~12 Celsius.


----------



## brucevilanch

Quote:


> Originally Posted by *Roboyto*
> 
> *I have the XSPC and the performance is fine for core, just need to upgrade thermal pads. *
> 
> *The active backplate for Aquacomputer actually does little more than the passive one; it does however complete the look.*


Are you using the thermal pads that you linked with the XSPC?

Yeah, I'm still torn about which block I want. I plan on adding a second 290 down the road (after I upgrade from this FX-8350), so keeping the block + backplate at a reasonable price point is a concern since I'll be buying two of each.


----------



## Roboyto

Quote:


> Originally Posted by *brucevilanch*
> 
> Are you using the thermal pads that you linked with the XSPC?
> 
> Yeah, I'm still torn about which block I want. I plan on adding a second 290 down the road (after I upgrade from this FX-8350), so keeping the block + backplate at a reasonable price point is a concern since I'll be buying two of each.


*Yes, I have the XSPC Razor block. The Fujipoly Ultra Extreme made a large difference in VRM temperatures. I would suggest them for whichever block you end up with; my thread lists what thickness pad every block requires as they can differ.*


----------



## brucevilanch

Excellent. Thank you very much.


----------



## Roboyto

Quote:


> Originally Posted by *brucevilanch*
> 
> Excellent. Thank you very much.


*No problem. Feel free to ask if you have other questions.*

*The XSPC blocks are quite stunning when lit up*











*Even with this shoddy picture they still look pretty sweet*


----------



## Roboyto

Quote:


> Originally Posted by *kizwan*
> 
> Added SR-1 120. With it & Shin-Etsu G751 TIM, GPU's core temp down by ~10 to ~12 Celsius.
> 
> 
> 
> Spoiler: Warning: Spoiler!


*Hmm, I was contemplating stuffing a 120mm radiator in the same spot of my case...*

*Very nice*


----------



## brucevilanch

Yeah, that really does look sweet. I'll be running the Raystorm block as well. I love that you can buy different color LEDs and switch them out whenever you get tired of them.


----------



## aaroc

I bought Arctic MX-4 for my CPU and GPU. Is it better than the TIM that EK sends with their water blocks? I bought the Fujipoly Extreme that Roboyto recommends in his how-to.


----------



## cephelix

@Roboyto, would the VRM heights differ between manufacturers? The reason I'm asking is because I'm planning to order the Ultra Extreme and Extreme versions of the Fujipoly thermal pads in anticipation of the release of the EK full cover for my MSI 290 Gaming, and was wondering what thickness pads I should be getting.


----------



## VulgarDisplay

Is 1120 Heaven-stable with no added voltage good for an R9 290? It's where I have gotten to without throttling on 14.3 beta. Just curious if this card is going to be a good overclocker once they fix the power limit.


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *kizwan*
> 
> You can either sell the EK block & get the Aqua water block + backplate, or replace the stock EK thermal pad with a Fujipoly Extreme or Ultra Extreme thermal pad. I'm happy with my EK water block (+ Fujipoly Extreme thermal pad on VRM1).


The problem is the people who ordered backplates from Aquatuning 3 months ago and still haven't gotten them.


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> @Roboyto, would the VRM heights differ between manufacturers? The reason I'm asking is because I'm planning to order the Ultra Extreme and Extreme versions of the Fujipoly thermal pads in anticipation of the release of the EK full cover for my MSI 290 Gaming, and was wondering what thickness pads I should be getting.


*That is hard to say at this point since I don't have any information on the EK block for the MSI Gaming card. I will take an educated guess and say that EK would probably still be using 1mm thickness pads for the VRMs and 0.5mm pads for the RAM. However, we won't know until the block is listed on their site; the installation instructions there will contain the pad thickness information.*

*If you are contemplating purchasing new pads for the RAM, it really isn't worth the $. I did it for the sake of knowing if it would increase stability/OC for RAM, which it did not. RAM doesn't get hot enough to warrant the extra expense.*

*I will shoot them an e-mail and see if they can enlighten us now*









Quote:


> Originally Posted by *aaroc*
> 
> I bought Arctic MX-4 for my CPU and GPU. Is it better than the TIM that EK sends with their water blocks? I bought the Fujipoly Extreme that Roboyto recommends in his how-to.


*MX-4 is a great paste and actually one that is recommended in the EK installation instructions. It is hard to find something that is as easy to use in every respect, and has a claimed useful life/durability of 8 years. *

*Many people are fond of Gelid GC-Extreme specifically for GPU use.*

*I have found Xigmatek PTI-G4512 to be very effective as well.*


----------



## Forceman

Quote:


> Originally Posted by *VulgarDisplay*
> 
> Is 1120 Heaven-stable with no added voltage good for an R9 290? It's where I have gotten to without throttling on 14.3 beta. Just curious if this card is going to be a good overclocker once they fix the power limit.


Have you tried the 14.4s? Reports are that they fix the power limit.

But 1120 is pretty good at stock voltage. Is it game stable at that voltage? I can pass Heaven a lot higher than I can run BF4.


----------



## cephelix

@Roboyto
Ok, thanks for the info, will contact EK about pad thickness then. Playing around with my 290, ASIC about 80+%, but it takes me +62mV on TriXX to reach 1100MHz. Is this normal??


----------



## Dasboogieman

Out of sheer annoyance at the variable clock speed, and curiosity, I flashed my 290 with the ASUS 290X PT1T vBIOS. I had a small glimmer of hope that it would unlock into a 290X, but it was not meant to be. On the other hand, this BIOS works exactly as advertised: it completely removes PowerPlay at the BIOS level, so I have a beautiful, constant 1000MHz (or whatever Afterburner sets) at all times. I do wish there was an idle state, but I'm not complaining too much since this solves all the issues I've been having; no more having to mess with RadeonPro profiles.

Winter is also approaching here so the extra heat is welcome.


Quote:


> Originally Posted by *cephelix*
> 
> @Roboyto
> Ok, thanks for the info, will contact EK about pad thickness then. Playing around with my 290, ASIC about 80+%, but it takes me +62mV on TriXX to reach 1100MHz. Is this normal??


It sounds about right, I needed about the same voltage to be fully stable at 1100mhz and mine is a 78.9% chip.


----------



## cephelix

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Dasboogieman*
> 
> Out of sheer annoyance at the variable clockspeed and curiosity, I flashed my 290 with the ASUS 290x PT1T vBIOS. I had a small glimmer of hope that it would unlock in to a 290x but it was not meant to be. On the other hand, this BIOS works exactly as advertised, it completely removes PowerPlay at the BIOS level so I have a beautiful, constant, 1000mhz (or whatever afterburner sets it) at all times. I do wish there was an idle state but I'm not complaining too much since this solves all the issues I've been having, no more having to mess with RadeonPro profiles.
> 
> Winter is also approaching here so the extra heat is welcome.
> 
> 
> It sounds about right, I needed about the same voltage to be fully stable at 1100mhz and mine is a 78.9% chip.














Thanks for that @Dasboogieman ...Will play around more with my card tonight after work. Sad thing is at 1100, my card is already hitting 80 deg C at the core. But my vrm1 is nice and comfy at 72 and vrm 2 hasn't budged from 54 deg C at all.


----------



## kizwan

Quote:


> Originally Posted by *aaroc*
> 
> I bought Arctic MX-4 for my CPU and GPU. Is it better than the TIM that EK sends with their water blocks? I bought the Fujipoly Extreme that Roboyto recommends in his how-to.


To be honest I don't have good experience with MX-4.
Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> You can either sell the EK block & get the Aqua water block + backplate, or replace the stock EK thermal pad with a Fujipoly Extreme or Ultra Extreme thermal pad. I'm happy with my EK water block (+ Fujipoly Extreme thermal pad on VRM1).
> 
> 
> 
> The problem is the people who ordered backplates from Aquatuning 3 months ago and still haven't gotten them.
Click to expand...

Ouch!







Performance of *[*EK water block + backplate + Fujipoly Extreme*]* should be == *[*Aqua + backplate*]*.









I also replaced the stock EK thermal pad for the EK backplate with Fujipoly Extreme too. However, I didn't compare temps with & without the backplate, so I can't tell whether it really makes a difference or not.


----------



## aaroc

Quote:


> Originally Posted by *kizwan*
> 
> To be honest I don't have good experience with MX-4.
> Ouch!
> 
> 
> 
> 
> 
> 
> 
> Performance of *[*EK water block + backplate + Fujipoly Extreme*]* should be == *[*Aqua + backplate*]*.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I also replaced the stock EK thermal pad for the EK backplate with Fujipoly Extreme too. However, I didn't compare temps with & without the backplate, so I can't tell whether it really makes a difference or not.


For two months it was very difficult to get the Aqua Computer WB and backplate with heatsink for the R9 290. My initial plan was to buy that, but I went for the EK WB + backplate because they were available, and per Roboyto's how-to the VRM temp difference (Aquacomputer vs EK backplate) was smaller.


----------



## Roy360

Quote:


> Originally Posted by *kizwan*
> 
> You can either sell the EK block & get Aqua water block + backplate, or replaced the stock EK thermal pad with Fujipoly Extreme or Ultra Exteme thermal pad. I'm happy with my EK water block (+ Fujipoly Extreme thermal pad on VRM1).


https://www.dazmode.com/store/category/thermal_solutions/

which of those should I get to cool 4 cards?

and once I get the pads, should I bother with the EK backplate? I'm guessing the stock EK backplate performed poorly due to the stock thermal pads. Maybe I'll just buy backplates and use the Fuji pads?

Secondly, anyone know any tricks for using the stock backplate that comes with our cards? 5/6 of my cards have backplates, and right now they are wasting away in their boxes.

I'm guessing I want these pads: https://www.dazmode.com/store/product/sarcon-thermal-pad-builder-value-size-150-x-100-x-1-11wmk/

since they have the highest W/mK and they are only 1mm.


----------



## kizwan

Quote:


> Originally Posted by *Roy360*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> You can either sell the EK block & get the Aqua water block + backplate, or replace the stock EK thermal pad with a Fujipoly Extreme or Ultra Extreme thermal pad. I'm happy with my EK water block (+ Fujipoly Extreme thermal pad on VRM1).
> 
> 
> 
> 
> 
> 
> 
> https://www.dazmode.com/store/category/thermal_solutions/
> 
> which of those should I get to cool 4 cards?
> 
> and once I get the pads, should I bother with EK backplate? I'm guessing the stock EK backplate performed poorly, due to the stock thermal pads. Maybe I'll just buy backplates and use the Fuji pads?
> 
> Secondly, anyone know any tricks to using the stock backplate that comes with our cards? 5/6 of my cards have backplates. And right now they are wasting away, in their boxes.
> 
> I'm guessing I want these pads: https://www.dazmode.com/store/product/sarcon-thermal-pad-builder-value-size-150-x-100-x-1-11wmk/
> 
> since they have the highest w/mk and they are only 1mm
Click to expand...

I also replaced thermal pad on the EK backplate to Fujipoly Extreme.

Sarcon == Fujipoly. Same product.


----------



## Mega Man

Quote:


> Originally Posted by *cephelix*
> 
> @Roboyto, would the VRM heights differ between manufacturers? The reason I'm asking is because I'm planning to order the Ultra Extreme and Extreme versions of the Fujipoly thermal pads in anticipation of the release of the EK full cover for my MSI 290 Gaming, and was wondering what thickness pads I should be getting.


Whatever EK recommends.


----------



## Frontside

Hi, folks. o/

*Can anyone confirm that the odd core clock drops are a common issue for the 290X?*

In my case it looks like this (approximate numbers):
Flat line at 1030 MHz (factory OC) -> split-second drop to ~1000 MHz -> flat line -> split-second drop to ~850 -> fluctuating 1030-1027-1010 -> drop to ~830 -> flat line, etc.; depends on the game ofc.
*Lots of dips on 14.xxx and fewer on 13.12 WHQL, but still some present.*

Thanks
Best regards
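Dips like the pattern described above are easy to pick out of a logged clock trace programmatically; a minimal sketch, assuming a plain list of MHz samples rather than Afterburner's actual log file format:

```python
def find_dips(samples_mhz, target_mhz, tolerance_mhz=10):
    """Return (index, clock) pairs where the core clock falls below target - tolerance."""
    floor = target_mhz - tolerance_mhz
    return [(i, mhz) for i, mhz in enumerate(samples_mhz) if mhz < floor]

# Illustrative trace resembling the pattern described: flat line with brief drops.
trace = [1030, 1030, 1000, 1030, 1030, 850, 1030, 1027, 1010, 830, 1030]
print(find_dips(trace, 1030))  # [(2, 1000), (5, 850), (8, 1010), (9, 830)]
```

Counting dips this way over a whole benchmark run makes it easier to compare driver versions than eyeballing the graph.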


----------



## Forceman

Did you increase the power limit (which may be broken in 14.x)? You may be power throttling. How are your temps?


----------



## Dasboogieman

I don't know if this has been mentioned, but does anyone know of or have a copy of the specially modified ASUS GPU Tweak Hawaii edition?
I can't seem to find it anywhere, which leads me to believe there's some kind of purge going on.


----------



## Frontside

Quote:


> Originally Posted by *Forceman*
> 
> Did you increase the power limit (which may be broken in 14.x)? You may be power throttling. How are your temps?


I tried but it did not help much. Core temps are fine, 83°C in the worst case (forgot to connect an exhaust fan).
I have a problem with VRM1 temperature; it goes beyond 110°C.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Frontside*
> 
> I tried but it did not help much. Core temps are fine, 83°C in the worst case (forgot to connect an exhaust fan).
> I have a problem with VRM1 temperature; it goes beyond 110°C.


That's a little high for the VRMs.

Under 100°C is best but under 90°C is ideal. Is this overclocked or at stock?


----------



## Frontside

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That's a little high for the VRMs.
> 
> Under 100°C is best but under 90°C is ideal. Is this overclocked or at stock?


Factory OC 1030 MHz.
My card, thanks to an MSI factory worker, has no thermal pads on VRM1.
I was going to water cool it anyway, so removing the warranty void sticker is not a problem.
Just want to be sure everything else is OK before I pull the trigger.

MSI Afterburner graph showing core clock during Unigine Valley run 13.12 WHQL
https://dl.dropboxusercontent.com/u/68180433/clock%20drops.JPG


----------



## Sgt Bilko

Quote:


> Originally Posted by *Frontside*
> 
> Factory OC 1030 MHz.
> My card, thanks to an MSI factory worker, has no thermal pads on VRM1.
> I was going to water cool it anyway, so removing the warranty void sticker is not a problem.
> Just want to be sure everything else is OK before I pull the trigger.
> 
> MSI Afterburner graph showing core clock during Unigine Valley run 13.12 WHQL
> https://dl.dropboxusercontent.com/u/68180433/clock%20drops.JPG


My VRMs don't even get that high unless I'm running Heaven in 30°C ambients with Crossfire









How's your case airflow?


----------



## Frontside

Quote:


> Originally Posted by *Sgt Bilko*
> 
> My vrms don't even get that high unless i'm running Heaven in 30c ambients with Crossfire
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hows your case airflow?


2 GT AP-18 ~1200 rpm intake,
2 GT AP-13 ~1100 rpm exhaust,
and 1 Thermalright 140mm 900 rpm side intake.
Bet your cards have some kind of thermal compound between the heatsink and MOSFETs; mine does not.
Just a small gap instead.


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *Frontside*
> 
> Factory OC 1030 MHz.
> My card, thanks to an MSI factory worker, has no thermal pads on VRM1.
> I was going to water cool it anyway, so removing the warranty void sticker is not a problem.
> Just want to be sure everything else is OK before I pull the trigger.
> 
> MSI Afterburner graph showing core clock during Unigine Valley run 13.12 WHQL
> https://dl.dropboxusercontent.com/u/68180433/clock%20drops.JPG


They put the sticker in a place where you must remove it?

I hear all these good things about MSI and how they allow aftermarket coolers if no damage is done to the card, but how does that work when there's no sticker?

I'm confused!


----------



## Hattifnatten

Did some runs on Firestrike Extreme last night with TriXX. Can't believe how high my memory can go; I benched it at 1800MHz, but the score dropped just a tad. Could be random variance, but when I tried 1775, I got a screen lockup. Went to bed after that. Still, impressive results if you ask me, despite the insane vdroop (1.35 down to 1.23-1.26) on reference cards.

I can only wonder how high this chip would go if I could feed it some more voltage


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Dasboogieman*
> 
> I don't know if this has been mentioned but does anyone know or has a copy of the specially modified ASUS GPUTweak Hawaii edition?
> I can't seem to find it anywhere which leads me to believe theres some kind of purge going on.


Flash the PT1T BIOS and that should give you unlocked voltage in GPU Tweak.


----------



## Frontside

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> They put the sticker in a place where you must remove it?
> 
> I hear all these good things about MSI and how they allow aftermarket coolers if no damage is done to the card, but how does that work when there's no sticker?
> 
> 
> 
> 
> 
> 
> 
> im confused!


There were no stickers on my old MSI R7970 Lightning, but the R9 290X Gaming has them. Contacted MSI support and two guys answered that I will lose the warranty if I remove them.


----------



## velocityx

Anyone get this while loading a map in BF4?





Sometimes I will also get this, but the cards or game somehow recover and bring me to the loading screen, and I can play fine.


----------



## rdr09

Quote:


> Originally Posted by *velocityx*
> 
> anyone get this while loading a map in bf4?
> 
> 
> 
> 
> 
> sometimes I will also get this but the cards or game somehow recovers and brings me to the loading screen and I can play fine.


I am currently using both 13.11 and 14.3 (2 HDDs) without issues. Are you using stock BIOS on both your GPUs?

Edit: actually, there is one annoying issue in BF4... logging on. It takes forever, and at times I just give up and go to BF3.


----------



## velocityx

Quote:


> Originally Posted by *rdr09*
> 
> i am currently using both 13.11 and 14.3 (2 HDDs) without issues. are you using stock bios on both your gpus?


Yes, it happens on stock BIOS as well as the stock BIOS taken from a Sapphire R9 290X Tri-X.

I loaded the map just now; for a split second I saw the flickering, but then the load screen showed up and all is fine and dandy.

Too bad that sometimes I will get this black flickering mess.


----------



## rdr09

Quote:


> Originally Posted by *velocityx*
> 
> yes it happens on stock bios as well as stock bios taken from a sapphire r9 290x tri-x.
> 
> I load the map just now, for a split second I saw the flickering but then the load screen showed up and all is fine and dandy.
> 
> too bad that sometimes I will get this black flickering mess.


that's scary. have you tried to repair BF4?


----------



## velocityx

Quote:


> Originally Posted by *rdr09*
> 
> that's scary. have you tried to repair BF4?


BF4 is 3 days old on my rig, fresh Origin install; I was formatting my C drive and putting on a fresh Windows install as well. 14.3 for one day, then I got the 14.4. I mean, I bought my GPUs 1 month apart, so I doubt it's some faulty cards; I think either BF4 is buggy, or the drivers, or I've got no idea.




it looks like that, from beginning to end (action starts at 0:30)


----------



## Sgt Bilko

Quote:


> Originally Posted by *velocityx*
> 
> bf4 is 3 days old on my rig, fresh origin install, I was formating my c drive and putting a fresh windows install as well. 14.3 for one day then I got the 14.4. I mean, I bought my gpu's 1 month apart, so I doubt this is some faulty cards, I think its either bf4 is buggy or the drivers or I got no idea.
> 
> 
> 
> 
> it looks like that, from beginning to end (action starts at 0:30)


Try 14.3 again; I uninstalled 14.4, it was causing some issues on my rig including flashing black screens.


----------



## Roboyto

Quote:


> Originally Posted by *Mega Man*
> 
> what ever ek recommends


Quote:


> Originally Posted by *cephelix*
> 
> @Roboyto
> Ok,thanx for the info.will contact ek about pad thickness then.playing ard with my 290,asic about 80+%..but it takes me +62mv on trixx to reach 1100mhz.is this normal??


*Got a response from EK regarding thermal pads for their upcoming blocks. It will be the same as reference cards, with 1mm for the VRMs and 0.5mm for the RAM.*

http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html

Just wondering if the new blocks you have coming out for *290/290X ASUS DC2, ASUS Matrix, MSI Gaming OC, and MSI Lightning* will use same pad thickness as the reference blocks?

The reference blocks call for 1mm on the VRMs, and 0.5mm on the RAM.

Thank You,

Bob Melchert















*EK Support* replied:

Dear customer,

Thank you for contacting EK Support

They will use thickness 1mm on the VRMs and 0.5mm on the RAM.

If you need any further information, please do not hesitate to contact me.

Lep pozdrav, Kind regards, MFG!


----------



## Roy360

Quote:


> Originally Posted by *kizwan*
> 
> I also replaced thermal pad on the EK backplate to Fujipoly Extreme.
> 
> Sarcon == Fujipoly. Same product.


how much of a difference did the Fuji pads on the backplate make? VRMs are at 60 degrees with using stock ingredients, but I plan on adding 1-2 more cards.


----------



## btemtd

Hi Guys,

I am wanting to do a quadfire setup. I have the cards already, 290X reference, and I have enough power: 1200W + 650W (Add2PSU adapter).

All I need is the motherboard and the CPU, which will be the 4930K or 4960X, which should remove any bottlenecks.

I was thinking about the Rampage IV Extreme, but then I thought maybe the cards will be too close because of what I am about to consider, and here it is:

I was thinking of getting a 7-PCIe-slot motherboard and having the cards separated by 1 slot each, if you can picture it. Now on top of this separation I will remove the standard cooler and replace it with the *GELID Icy Vision Universal VGA Cooler Rev 2* for each card.

What do you think about this, guys? As well as all this I will have a good case which will exhaust hot air and draw fresh air in.

This should quiet the cards down as well as cooling them. I know it's possible with the ref cards to have them sitting next to each other, as they are designed to pull air in from the outer edge and blow it out the front, but they do get pretty loud from the sucking.

What do you think of my setup above, with the 1x PCIe gap between each card and changing the standard coolers to the GELID Icy Vision Universal VGA Cooler Rev 2? I do not want to create a massive water-cooled setup; I want a simple but effective setup that does not take that much space.


----------



## Norse

Just wondering, but has anyone got experience with http://www.watercoolinguk.co.uk/p/Watercool-HEATKILLER-reg;-GPU-backplate-R9-290X_44412.html that backplate? The picture leads me to believe it is active, i.e. needs a loop, but I cannot find any information about it via Google.

I just want it as it is quite pleasing to the eye compared to the standard black ones that everyone else does.


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *Norse*
> 
> Just wondering but has anyone got experience with http://www.watercoolinguk.co.uk/p/Watercool-HEATKILLER-reg;-GPU-backplate-R9-290X_44412.html that Backplate? the Picture leads be to believe it is active ie needs a loop but i cannot find any information about it via google.
> 
> I just want it as it is quite pleasing to the eye compared to the standard black ones that everyone else does


The Watercool backplate is simply aesthetics; I had one on my old card.


----------



## Norse

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> the watercool backplate is simply aesthetics, i had one on my old card


Is it easy to mount that exact backplate without having a block? I'm not going to go water cooling yet; I just want things to look nicer in my case for epeenery value.









Edit: card is a Sapphire 290X reference


----------



## cephelix

Ok guys, finally got around to properly overclocking my card, and I need references to see whether my OC is normal or crappy.
Ambient Temps are around 26 deg C.
Card: MSI R9 290 Gaming 4G
WHQL: 13.12
Core Clock: 1090MHz
Memory Clock: 1600MHz
VDDC Offset: +68
Power Limit: +50
Overclocking Software: Sapphire TriXX (with Force Constant Voltage and Disable ULPS checked), Overdrive disabled
Benchmark used: 3DMark
Stock Firestrike Score: 8373
OC Firestrike Score: 9139
OC Temps: Core-83, VRM1-69, VRM2-55
ASIC Quality: 83.9%
Memory Type: Hynix

Below is the link for GPU-Z Validation
GPU-Z Validation

For now, core temps are preventing me from adding more voltage. From the looks of it, this card would benefit from going under water (correct me if I'm wrong though).
Also, if temperatures allow for it, there shouldn't be any harm in maxing out the VDDC slider in TriXX, right?
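Those results scale about as you'd expect; a quick check (assuming the MSI Gaming 290's 1007 MHz factory clock as the stock baseline, with the Firestrike scores from the post above):

```python
def pct_gain(before, after):
    """Percentage improvement from a before/after pair."""
    return (after - before) / before * 100

# Assumed 1007 MHz factory clock for the MSI Gaming 290; scores from the post.
clock_gain = pct_gain(1007, 1090)
score_gain = pct_gain(8373, 9139)
print(f"core clock +{clock_gain:.1f}%, Firestrike +{score_gain:.1f}%")
```

The score gain slightly outruns the core clock gain, which is consistent with the memory overclock contributing a little on top.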


----------



## cephelix

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Roboyto*
> 
> *Got a response from EK regarding thermal pads for their upcoming blocks. It will be the same as reference cards, with 1mm for the VRMs and 0.5mm for the RAM.*
> 
> http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html
> 
> _Just wondering if the new blocks you have coming out for *290/290X ASUS DC2, ASUS Matrix, MSI Gaming OC, and MSI Lightning* will use same pad thickness as the reference blocks? _
> 
> _The reference blocks call for 1mm on the VRMs, and 0.5mm on the RAM.
> 
> Thank You,
> 
> Bob Melchert_
> 
> 
> *EK Support* replied:
> 
> 
> _Dear customer,_
> 
> _Thank you for contacting EK Support_
> 
> _They will use thickness 1mm on the VRMs and 0.5mm on the RAM._
> 
> _If you need any further information, please do not hesitate to contact me._
> 
> _Lep pozdrav, Kind regards, MFG!_






Beat me to it. Thanx @Roboyto. Was thinking of purchasing a few strips of the Ultra Extreme and 1 of the Extreme since i figured it would not affect shipping costs. Right?


----------



## Roy360

Quote:


> Originally Posted by *btemtd*
> 
> Hi Guys,
> 
> I am wanting to do a Quadfire Setup - I have the cards already 290x Reference I have enough Power 1200w + 650w (Add2Psu_Adapter)
> 
> All I need is the motherboard and the CPU, which will be the 4930K or 4960X, which should remove any bottlenecks.
> 
> I was thinking about the Rampage Extreme 4 But then I thought Maybe the Cards will be 2 close Because of what I am about to consider, and here it is:
> 
> I was thinking Of getting a 7 PCIE Motherboard And having the cards Seperated by 1 slot Each If you can picture it? Now On top of this separation I will remove the Standard cooler and in replace it with the *GELID Icy Vision Universal VGA Cooler Rev 2* For each Card
> 
> What do you think about this guys? Aswell as All this I will have a good Case which will suck HOt air out and suck air in,
> 
> This should quiet the cards down as well as cooling them. I know it's possible with the ref cards to have them sitting next to each other, as they are designed to pull air in from the outer edge and blow it out the front, but they do get pretty loud from the sucking.
> 
> What do you think of my setup above? with the 1 x PCIE gap between each card and change the standard coolers to GELID Icy Vision Universal VGA Cooler Rev 2 ?? I do not want to Create a massive water cooled setup , I want a simple but effective setup that does not take that much space..


Isn't it bad to mix power supplies? Even if you share a common ground, wouldn't you have different voltages coming from the mobo and the PCIe connectors? I've seen power supplies dip as low as 11.3V, whereas others go as high as 12.7V.

You might want to look into getting a powered 16x riser for your last card and cutting the three 12V+ wires. That way you have the same power supply powering the three 12V pins as well as the PCIe connectors.

I'm assuming you are powering one card with the 650W and 3 + system on the 1200W.

You could probably get away with all 4 on the 1200W, with 440W left over to power the system,
but then each card's riser would have to be "cut".
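For a rough sanity check on the split, the power budget can be sketched in Python. The per-card and system figures here are illustrative assumptions (roughly 250 W for an overclocked reference 290X, 150 W for the rest of the system), not measurements:

```python
# Rough quadfire power-budget estimate.
# Assumed figures (illustrative, not from any datasheet):
#   ~250 W per overclocked reference 290X, ~150 W for CPU/board/drives.

def psu_budget(psu_watts, n_cards, card_w=250, system_w=150):
    """Return (estimated load, remaining headroom) for one PSU."""
    load = n_cards * card_w + system_w
    return load, psu_watts - load

print(psu_budget(1200, 4))  # (1150, 50)  -> fits, but with almost no headroom
print(psu_budget(1200, 3))  # (900, 300)  -> comfortable, with the 650W on card 4
```

With these numbers, all four cards plus the system on the 1200W unit is technically within budget but leaves almost nothing for transient spikes, which is why splitting the load across the two units is the safer layout.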


----------



## Dodda

I currently have a Sapphire 290 Tri-X and am looking to grab a 2nd for Crossfire. In Australia prices have come down a lot recently. My options are:

(1) Grab a 2nd-hand reference card (ex-miner) and slap on a Kraken G10 & H55 as well as some extra VRM heatsinks. $430 USD for the card + G10 + H55.

(2) Grab a retail Gigabyte WindForce 3X (non-OC), currently on sale for $437 USD.

(3) Grab another Sapphire Tri-X for $500.

My priorities in order are performance --> noise --> temps.

Any input would be amazing.


----------



## arrow0309

Quote:


> Originally Posted by *Dodda*
> 
> I currently have a sapphire 290 tri-x and am looking to grab a 2nd in crossfire. In Australia prices have currently come down a lot. My options are
> 
> (1) Grab a 2nd hand reference card (ex miner) and slap on a kraken g10 & h55 as well as some extra vrm heat sinks. $430 USD
> 
> (2) Grab a retail Gigabyte windforcex3 (non oc) currently on sale for $437 USD
> 
> (3) Grab another Sapphire Tri-X for $500
> 
> My priorities in order are performance --> noise --> temps
> 
> Any input would be amazing.


Hmm, second option?
The first one is good as well


----------



## kayan

May I join the club please? This is my picture from launch when I got my XFX R9 290x BF4 edition:



No comments about my paper, it's my wife's and that's what was handy.....

Anyway, I'm running my XFX 290X on a custom H2O loop with an XSPC Razor R9 290 waterblock:



Temps went from throttling at 95C to benching (Unigine and 3DMark) at 57C, which, interestingly enough, is the same temp as my CPU under load. While gaming the temps are around 10C lower.

I finally got to overclock it today: using MSI Afterburner, I can take the core clock to 1110 and the memory clock to 1375 without any voltage mods. @1400 on memory I get artifacts on my desktop.


----------



## Roy360

Quote:


> Originally Posted by *Dodda*
> 
> I currently have a sapphire 290 tri-x and am looking to grab a 2nd in crossfire. In Australia prices have currently come down a lot. My options are
> 
> (1) Grab a 2nd hand reference card (ex miner) and slap on a kraken g10 & h55 as well as some extra vrm heat sinks. $430 USD
> 
> (2) Grab a retail Gigabyte windforcex3 (non oc) currently on sale for $437 USD
> 
> (3) Grab another Sapphire Tri-X for $500
> 
> My priorities in order are performance --> noise --> temps
> 
> Any input would be amazing.


Maybe you should wait. Canada is getting a lot of love from AMD cards right now. The 290X OC version of that Gigabyte card goes for 410USD here.

The regular R9 290s are floating around the 330USD range before tax.

For EK blocks:
VRMs need 1mm
RAM needs 0.5mm
Backplate needs 1.0mm

Do you think I can get away with using 1.0mm for everything? Or maybe I'll just leave the RAM with their cheap pads?


----------



## Jflisk

Quote:


> Originally Posted by *kayan*
> 
> May I join the club please? This is my picture from launch when I got my XFX R9 290x BF4 edition:
> 
> 
> 
> No comments about my paper , it's my wife's and that's what was handy.....
> 
> Anyway, I'm running my XFX 290x on custom h2o loop, XSPC Razer 290 waterblock:
> 
> 
> 
> Temps went from throttling at 95C to benching (Unigine and 3dMark) at 57C, which interestingly enough, that's the same temp as my CPU under load. While gaming the temps are around 10C lower.
> 
> I finally got to overclock it today, and using MSI Afterburner, I can take the Core Clock to 1110 and the Memory Clock to 1375 without any voltage mods. @ 1400 on Memory I get artifacts on my desktop.


Find HWiNFO and download it; you might also be able to see VRM1 temps in GPU-Z (some cards do, some don't). Check the VRM temps and make sure they're under 65C. Also, how many rads and what type? The 290Xs usually run under 50C on the core, VRM1 (a more important temp than GPU on these) under 55C, and VRM2 under 50C, but it depends on the rads. Looks good though. Did you get the backplate? That helps lower the temps too.


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> 
> Beat me to it. Thanx @Roboyto. Was thinking of purchasing a few strips of the Ultra Extreme and 1 of the Extreme since i figured it would not affect shipping costs. Right?


Yeah, it won't affect shipping costs at all. That one strip of Ultra Extreme 15mm x 100mm can easily cover VRMs for (2) cards; possibly 3 if you cut very carefully.


----------



## Iniura

I bought an XFX R9 290 Black OC edition. With the stock BIOS and 13.12 drivers on air I could run an OC of 1180/1625 stable for playing games with MSI AB (the Elpida memory went that high because of +13 aux voltage).

Benching Firestrike I could go a little higher and did some 1200/1625 runs; that was the max I could do.

I recently installed an EK waterblock.
I also flashed a UEFI-capable R9 290 BIOS because I wanted to use ultra fast boot on my system. I forgot which BIOS it is, I'll have to look that up; still using the 13.12 drivers.

Now I can bench up to 1300 core clock and 1500 memory on Firestrike with +200mV in Trixx.

However, I'm having huge difficulties finding an OC for gaming again.

For some reason, with MSI AB and +13 aux voltage my memory can't go higher than 1250 (stock) anymore before crashing in game (red screen and blue screen hard locks).

Also, with +200 I can't go higher than 1150 core clock anymore; well, maybe 1160 or 1170, but at 1180 I get hard locks and black screens.

And to top it all off, I'd been gaming for a week on 1100/1250 with no problems whatsoever, and then yesterday my GPU began throttling during a gaming session, just out of nowhere, and I could see huge frame drops and lag.

I am going to flash the ASUS PT1 BIOS and see if I'm still having issues.

In the meantime, is there anyone who could give some pointers on what might be the problem? Or what could be different? I guess it could be the BIOS switch that caused it, or maybe when I reinstalled my OS and MSI AB I forgot some settings, or something changed between AB beta 18 and beta 19.

What I would like is a 1200/1500 stable OC for gaming, but I can't use Trixx for that because I can't raise the aux voltage, which I need for a higher memory OC.
And with AB I ran into the problem that I can't go higher than +100mV without doing some mods. Ugh.

Can I use GPU Tweak with the ASUS PT1 BIOS? Is there an option similar to aux voltage to help memory OC?


----------



## Roboyto

Quote:


> Originally Posted by *Roy360*
> 
> how much of a difference did the Fuji pads on the backplate make? VRMs are at 60 degrees with using stock ingredients, but I plan on adding 1-2 more cards.


I believe Kizwan said he couldn't measure temp drops because he did the pads on the card and the backplate at the same time. I will be adding the pads to my backplate soon, I can let you know what happens with mine. Using XSPC block/backplate.


----------



## Roboyto

Quote:


> Originally Posted by *Roy360*
> 
> Maybe you should wait. Canada is getting a lot of love from AMD cards right now. The 290X OC version of that Gigabyte card goes for 410USD here.
> 
> The regular R9 290s are floating around the 330USD range before tax.
> 
> For EK blocks:
> VRMs need 1mm
> RAM needs 0.5mm
> Backplate needs 1.0mm
> 
> Do you think I can get away with using 1.0mm for everything? Or maybe I'll just leave the RAM with their cheap pads?


Leave the RAM pads. I upgraded them and didn't get any change in performance.

I wouldn't suggest using thicker pads for the RAM. The block is machined for those specific thicknesses. Putting thicker pads on the RAM will affect how the rest of the card sits and inhibit performance in more critical areas, i.e. the core and VRMs.

Check out my thermal pad thread if you'd like


----------



## Roboyto

Quote:


> Originally Posted by *btemtd*
> 
> Hi Guys,
> 
> I am wanting to do a Quadfire setup - I already have the cards (290X reference) and I have enough power: 1200W + 650W (Add2PSU adapter).
> 
> All I need is the motherboard and the CPU, which will be the 4930K or 4960X, which should remove any bottlenecks.
> 
> I was thinking about the Rampage IV Extreme, but then I thought maybe the cards will be too close, because of what I am about to consider, and here it is:
> 
> I was thinking of getting a 7-slot PCIe motherboard and having the cards separated by one slot each, if you can picture it. On top of this separation I will remove the standard cooler and replace it with the *GELID Icy Vision Universal VGA Cooler Rev 2* for each card.
> 
> What do you think about this, guys? As well as all this, I will have a good case which will exhaust the hot air and pull fresh air in.
> 
> This should quiet the cards down as well as cool them. I know it's possible with the reference cards to have them sitting next to each other, as they are designed to pull air in from the outer edge and blow it out the front, but they do get pretty loud.
> 
> What do you think of my setup above, with the 1x PCIe gap between each card and changing the standard coolers to the GELID Icy Vision Universal VGA Cooler Rev 2? I do not want to create a massive water-cooled setup; I want a simple but effective setup that does not take much space.


People are running into temperature issues in Crossfire even with cards that have manufacturers' upgraded coolers. I don't think any amount of fans would keep the temperature of 4 of these things at bay on air coolers.

If you are going to go with an air cooler, the new Arctic Accelero Xtreme IV would be the only way to do it IMHO. It has an enormous backplate/heatsink that cools the RAM/VRMs. Its biggest advantage is that you don't have to glue any heatsinks onto the RAM/VRMs, so the card can be returned to stock very easily if you need to RMA.

http://www.newegg.com/Product/Product.aspx?Item=N82E16835186097

If you have the patience to install 4 of them, then they would be worth it. I have used the Accelero Twin Turbo II on several occasions, and the Accelero 7970 twice. They don't disappoint.


----------



## UZ7

Well this just happened









In less than a year I went 7950 -> 770 -> 780, and now I'm going back to AMD lol


----------



## Roboyto

Quote:


> Originally Posted by *Iniura*
> 
> I bought an XFX R9 290 Black OC edition. With the stock BIOS and 13.12 drivers on air I could run an OC of 1180/1625 stable for playing games with MSI AB (the Elpida memory went that high because of +13 aux voltage).
> 
> Benching Firestrike I could go a little higher and did some 1200/1625 runs; that was the max I could do.
> 
> I recently installed an EK waterblock.
> I also flashed a UEFI-capable R9 290 BIOS because I wanted to use ultra fast boot on my system. I forgot which BIOS it is, I'll have to look that up; still using the 13.12 drivers.
> 
> Now I can bench up to 1300 core clock and 1500 memory on Firestrike with +200mV in Trixx.
> 
> However, I'm having huge difficulties finding an OC for gaming again.
> 
> For some reason, with MSI AB and +13 aux voltage my memory can't go higher than 1250 (stock) anymore before crashing in game (red screen and blue screen hard locks).
> 
> Also, with +200 I can't go higher than 1150 core clock anymore; well, maybe 1160 or 1170, but at 1180 I get hard locks and black screens.
> 
> And to top it all off, I'd been gaming for a week on 1100/1250 with no problems whatsoever, and then yesterday my GPU began throttling during a gaming session, just out of nowhere, and I could see huge frame drops and lag.
> 
> I am going to flash the ASUS PT1 BIOS and see if I'm still having issues.
> 
> In the meantime, is there anyone who could give some pointers on what might be the problem? Or what could be different? I guess it could be the BIOS switch that caused it, or maybe when I reinstalled my OS and MSI AB I forgot some settings, or something changed between AB beta 18 and beta 19.
> 
> What I would like is a 1200/1500 stable OC for gaming, but I can't use Trixx for that because I can't raise the aux voltage, which I need for a higher memory OC.
> And with AB I ran into the problem that I can't go higher than +100mV without doing some mods. Ugh.
> 
> Can I use GPU Tweak with the ASUS PT1 BIOS? Is there an option similar to aux voltage to help memory OC?


If this all happened after changing the BIOS, I would personally think that's the problem. Try flashing back to the stock BIOS and see if performance returns.

Memory OC on these cards is not that important. I have tested with the stock RAM clock and a very high overclock, up to 1700. Check my Shrunken Beast build log for the spreadsheet listing performance gains from RAM speed. On average across 3DMark, Unigine, and the FFXIV benchmark it was only a 5% performance jump.

If you flash the ASUS BIOS, then you should be able to use GPU Tweak. However, I don't know if it has aux voltage.

You're sure you didn't damage anything when installing the block, correct?

Have you tried re-seating the card in the slot? Check all your other connections in the case while you're at it. This can fix all kinds of strange, unexplainable issues. I know my card with the XSPC backplate makes it a tight fit to get underneath the RAM clips.

Scrub the drivers and try reinstalling them.

Maybe try a fresh Windows install again.


----------



## Paul17041993

Quote:


> Originally Posted by *velocityx*
> 
> anyone get this while loading a map in bf4?
> 
> 
> 
> 
> 
> sometimes I will also get this but the cards or game somehow recovers and brings me to the loading screen and I can play fine.


Quote:


> Originally Posted by *velocityx*
> 
> bf4 is 3 days old on my rig, fresh origin install, I was formating my c drive and putting a fresh windows install as well. 14.3 for one day then I got the 14.4. I mean, I bought my gpu's 1 month apart, so I doubt this is some faulty cards, I think its either bf4 is buggy or the drivers or I got no idea.
> 
> 
> 
> 
> it looks like that, from beggining to end (action starts at 0:30)


Done the classic hardware debugging of re-seating the cards, swapping their places, and testing each one individually?

Looks to me like one of them has either a shot display controller, a bad memory IC, or a bad choke/VRM causing a lack of power to the board. Also check your power rails to be sure your PSU is supplying the voltage it should be.


----------



## cephelix

Quote:


> Originally Posted by *UZ7*
> 
> 
> 
> Well this just happened
> 
> 
> 
> 
> 
> 
> 
> 
> 
> With less than a year I went 7950 -> 770 -> 780, now going back to AMD lol


Congrats!!..looking forward to your experience with them


----------



## Mega Man

ill have to get caught up in a min ! wanted to share with everyone.
since i cant locate my megaman mouse pad megaman himself + rush + met will have to do !



Spoiler: Warning: Spoiler!



for open box, it looks unopened to me ...






i have to admit, the ek without the crop circles actually looks good !





sad i cant add it to my loop till i get back from my trip


----------



## FuriousPop

Quote:


> Originally Posted by *Dodda*
> 
> I currently have a sapphire 290 tri-x and am looking to grab a 2nd in crossfire. In Australia prices have currently come down a lot. My options are
> 
> (1) Grab a 2nd hand reference card (ex miner) and slap on a kraken g10 & h55 as well as some extra vrm heat sinks. $430 USD for the card + g10 + h55.
> 
> (2) Grab a retail Gigabyte windforcex3 (non oc) currently on sale for $437 USD
> 
> (3) Grab another sapphire trii-x for $500
> 
> My priorities in order are performance --> noise --> temps
> 
> Any input would be amazing.


I personally went for option 3.

My two are still running on air - haven't had a chance to research a water system... definitely the next thing I want to look at doing.

Running 3x30" - with 2 of those cards in XFire. Installing BF4 tonight so I will be able to advise; however, on BF3 ultra at 7680x1600 it will struggle to maintain 40 FPS without any tweaks to the settings.

But seriously - these cards will not disappoint. Temps on BF3 didn't go above 78C (1st card); the 2nd card normally sits below that.
Also I paid 589 for mine in Aus - but I have had mine for a little while now...


----------



## SSJVegeta

I can get my MSI R9 290 Gaming card to 1100/1400 at stock voltage. ASIC is 82.0%, but I can't clock the memory any higher. Which BIOS is it that improves overclocks? Is it the ASUS PT1T BIOS?


----------



## Mega Man

Quote:


> Originally Posted by *Frontside*
> 
> Quote:
> 
> 
> 
> Originally Posted by *INCREDIBLEHULK*
> 
> they put the sticker in a place that you must remove it?
> 
> i hear all these good things about msi and how they allow aftermarket coolers if no damage is done to the card, but how does that work when there's no sticker?
> 
> 
> 
> 
> 
> 
> 
> im confused!
> 
> 
> 
> There were no stickers on my old MSI R7970 Lightning, but the R9 290X Gaming has them. I contacted MSI support and two guys answered that I will lose the warranty if I remove them

meh they have not given anyone i know an issue with it yet


----------



## Roboyto

Quote:


> Originally Posted by *SSJVegeta*
> 
> I can get my MSI R9290 Gaming card to 1100/1400 at stock voltages. ASIC is 82.0% but I can't clock the memory any higher. Which bios is it that improves overclocks? Is it the Asus PT1T bios?


That BIOS would likely help your core more than the memory, but it doesn't downclock. Why not try adding voltage to see where you get before switching BIOS? If you're at 1100 with no extra voltage, odds are you have a pretty good-performing card. My 290, which hits 1300/1700 at +200mV, *definitely can't* hit 1100 core at stock voltages.

Do you have Hynix or Elpida memory? Hynix typically clocks better, but there are a few with some very good performing Elpida cards.

MSI Afterburner has the Aux Voltage option for RAM usually, that could help you attain better RAM clocks.

Just an FYI, memory clock is nowhere near as important on these cards as core. That enormous 512-bit bus helps make up for lower clocks.

Testing I did with my card



Find max clocks for your core first and then worry about the RAM. Just be sure to keep an eye on VRM1 temperatures; ~80C is where most people let up on the voltage/clocks.
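To put some numbers on that 512-bit bus: GDDR5 moves four bits per pin per listed clock, so peak bandwidth is (bus width / 8) bytes times 4x the memory clock. A quick sketch of the standard arithmetic:

```python
# Peak GDDR5 bandwidth: (bus width in bits / 8) bytes per transfer,
# at an effective data rate of 4x the listed memory clock.

def bandwidth_gbs(bus_bits, mem_clock_mhz):
    """Peak memory bandwidth in GB/s for a GDDR5 card."""
    return bus_bits / 8 * mem_clock_mhz * 4 / 1000

print(bandwidth_gbs(512, 1250))  # 320.0 -> 290(X) at the stock 1250 MHz
print(bandwidth_gbs(512, 1500))  # 384.0 -> after a 20% memory overclock
```

So a 20% memory overclock adds 20% bandwidth on paper, yet in the testing above it bought only around 5% in actual benchmarks; the wide bus already keeps the core fed at stock memory clocks.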


----------



## Roboyto

Quote:


> Originally Posted by *Mega Man*
> 
> ill have to get caught up in a min ! wanted to share with everyone.
> since i cant locate my megaman mouse pad megaman himself + rush + met will have to do !
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> for open box, it looks unopened to me ...
> 
> 
> 
> 
> 
> 
> i have to admit the ek without the crop circles they acctually look good !
> 
> 
> 
> 
> 
> sad i cant add it to my loop till i get back from my trip


Sweet figurines







I loved Mega Man growing up...Now when I try to relive those joyous childhood memories I just







and wonder how I had the patience to play any of the NES renditions









I just bought a PowerColor 270X Devil Edition off of NewEgg and it was "Open Box". They put a new circular seal on one side of the box because everything else was unscathed inside, bonus round! 20% off and free shipping for a brand new item.

I'm extremely pleased with every aspect of the 270X BTW, PowerColor will be a strong contender for my next GPU purchase.


----------



## Faster_is_better

For the purpose of watercooling, is it better to get a reference card or one of the aftermarket versions? I'm asking in terms of the quality of the card, not waterblock availability, which generally favors reference. Typically there seem to be "uber" cards released, like MSI Lightnings, that supposedly clock better or are just better made, but is it worth paying more for one of these, or should I just get a reference card + WB? I'm mainly thinking of the ASUS DCII version vs reference, but if there are better non-reference designs at a similar price then I might watch for those.

I saw a few manufacturers are even selling cards with waterblocks attached, but I'm not sure if they are reference-design cards or custom PCB + waterblock.


----------



## Roy360

Not the best place to ask, but what kinds of cases are you guys using?
Quote:


> Originally Posted by *Faster_is_better*
> 
> For the purpose of watercooling, is it better to get a reference card or one of the aftermarket versions? I'm asking in terms of the quality of the card, not waterblock availability, which generally favors reference. Typically there seem to be "uber" cards released, like MSI Lightnings, that supposedly clock better or are just better made, but is it worth paying more for one of these, or should I just get a reference card + WB? I'm mainly thinking of the ASUS DCII version vs reference, but if there are better non-reference designs at a similar price then I might watch for those.
> 
> I saw a few different manufacturers are even selling card with waterblocks attached, but I'm not sure if they are reference designed cards or custom pcb + water blocks.


Here's what I'm doing:



The ASUS is an improvement over the reference in every area, besides the Elpida memory anyway.


----------



## Roboyto

Quote:


> Originally Posted by *Faster_is_better*
> 
> For the purpose of watercooling, is it better to get a reference card or one of the aftermarket versions? I'm asking in terms of the quality of the card, not waterblock availability, which generally favors reference. Typically there seem to be "uber" cards released, like MSI Lightnings, that supposedly clock better or are just better made, but is it worth paying more for one of these, or should I just get a reference card + WB? I'm mainly thinking of the ASUS DCII version vs reference, but if there are better non-reference designs at a similar price then I might watch for those.
> 
> I saw a few different manufacturers are even selling card with waterblocks attached, but I'm not sure if they are reference designed cards or custom pcb + water blocks.


Fairly certain the VisionTek CryoVenom and PowerColor LCS are reference boards with EK waterblocks already attached; I believe with custom backplates in both instances?

The non-reference boards usually upgrade several aspects, probably most importantly power delivery, increasing your chances of better performance. Lightning/Matrix are typically known for using higher-binned chips to give you better odds of a good performer, amongst their vast list of other enhancements.

RAM clocks aren't the most important thing with the 290(X), but you should be aware that the DC2/Matrix likely have Elpida RAM; see my post a few spots up with the spreadsheet image. Elpida *can* perform well, *but* is usually less likely to overclock as well as Hynix. A friend of mine just picked up a VisionTek 280X with Elpida 6Gbps RAM, and it clocks up to 7.1Gbps, on air, without any memory voltage adjustments.

EK has waterblocks out, or on the way, for the ASUS DC2, ASUS Matrix, MSI Gaming OC, and MSI Lightning, so your choices for watercooling are fairly vast.

Lastly, can't forget the good old silicon lottery... some chips are excellent performers even on a reference board, like my XFX 290 BE.











If you go with an EK block, I would highly suggest using Fujipoly Ultra Extreme thermal pads to cool the VRMs. The included pads are underwhelming, especially when pushing voltage/clocks to the brink. I saw over 20% temperature drop with my XSPC Razor, and several people with EK blocks have reported similar results.

http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures#post_21821258
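For a feel of why pad conductivity matters that much, steady-state conduction through the pad is dT = (P/A) * t / k. The figures below are illustrative assumptions (about 30 W of VRM heat through a 15 x 50 mm contact patch and a 1 mm pad), not measurements of any specific card:

```python
# Temperature difference across a thermal pad (1-D steady-state
# conduction): dT = (P / A) * t / k.
# Assumed, illustrative numbers: ~30 W of VRM heat, 15 mm x 50 mm
# contact area, 1 mm pad thickness.

def pad_delta_t(power_w, area_m2, thickness_m, k_w_mk):
    """Temperature drop (K) across the pad at steady state."""
    return power_w / area_m2 * thickness_m / k_w_mk

area = 0.015 * 0.050                     # 7.5e-4 m^2 contact patch
print(pad_delta_t(30, area, 0.001, 5))   # 8.0 K  (generic ~5 W/mK pad)
print(pad_delta_t(30, area, 0.001, 17))  # ~2.35 K (Fujipoly-class 17 W/mK)
```

Shaving several degrees off the pad drop alone is consistent with the VRM improvements reported here, though real-world gains also depend on mounting pressure and contact quality.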


----------



## Frontside

Hi OCN, back to my question about core clock drops without vsync and VRM temperature. I really need your help. I'm about to start the RMA process, which may take at least two or three months if I get lucky (since I bought my card overseas), which means I would not be able to use my PC until the replacement arrives.
Is this typical for all R9 290/290X or not?
Could anyone run the Metro 2033, Heaven, and Valley benchmarks and take MSI AB screens showing core clock and GPU load?
Thanks


----------



## Paul17041993

Quote:


> Originally Posted by *Mega Man*
> 
> ill have to get caught up in a min ! wanted to share with everyone.
> since i cant locate my megaman mouse pad megaman himself + rush + met will have to do !
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> for open box, it looks unopened to me ...
> 
> 
> 
> 
> 
> 
> i have to admit the ek without the crop circles they acctually look good !
> 
> 
> 
> 
> 
> sad i cant add it to my loop till i get back from my trip


cheater, getting a pre-blocked card









( joking







)


----------



## Roboyto

Quote:


> Originally Posted by *Frontside*
> 
> Hi, OCN back to my question about core clock drops without vsync and vrm temperature. I really need your help. I'm about to start RMA process, which cmay take at least two or three month if i get lucky (since ibought my card overseeas). Wich means i would not be able to use my PC until replacement arrives.
> Is it typical for all R9 290/290X or not?
> could anyone run metro 2033 , Heaven and Valley benchmarks and take an msi ab screens showing core clock and gpu load
> thanks


From MSI website regarding warranty:


The MSI product *MUST* be free of any *physical damage* due to improper installation or modification of *ANY* kind (this includes installing aftermarket parts) or the warranty *WILL* be *VOID*.
The product MUST be returned to MSI in the original factory configuration and condition. All aftermarket modifications must be reversed prior to sending in the product for repair or replacement.

If you are missing thermal pads, then I would just put some thermal pads on there yourself. You will NOT void your warranty as long as you don't damage the card. I wouldn't wait 3 months for an RMA.

Your core clock is likely reducing to cool off the VRMs because they're *WAY, WAY TOO HOT*









http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html

Buy some of those and you're good to go!


----------



## bond32

So after discovering my loop had some algae buildup or something, I disassembled the entire loop. Took some pictures of my Koolance block in case anyone wants to see.

Before cleaning: 

After cleaning: 

Reservoir:


----------



## Roboyto

Quote:


> Originally Posted by *bond32*
> 
> So after discovering my loop had some algae buildup or something, I disassembled the entire loop. Took some pictures of my Koolance block in case anyone wants to see.
> 
> Before cleaning:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> After cleaning:
> 
> Reservoir:


Been down a similar road before; mine was caused by cheap barbs that flaked and contaminated the water.



Get a biocide of some sort and possibly a silver kill-coil to prevent it from happening again


----------



## cephelix

@bond32
Maybe it's just the picture, but the uncleaned block looks terrible. Is it really algae? What did you use to clean the block?


----------



## Mega Man

Quote:


> Originally Posted by *Paul17041993*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> ill have to get caught up in a min ! wanted to share with everyone.
> since i cant locate my megaman mouse pad megaman himself + rush + met will have to do !
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> for open box, it looks unopened to me ...
> 
> 
> 
> 
> 
> 
> i have to admit the ek without the crop circles they acctually look good !
> 
> 
> 
> 
> 
> sad i cant add it to my loop till i get back from my trip
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> cheater, getting a pre-blocked card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ( joking
> 
> 
> 
> 
> 
> 
> 
> )

The price was cheaper than buying them separately


----------



## Paul17041993

Quote:


> Originally Posted by *bond32*
> 
> So after discovering my loop had some algae buildup or something, I disassembled the entire loop. Took some pictures of my Koolance block in case anyone wants to see.
> 
> Before cleaning:
> 
> After cleaning:
> 
> Reservoir:


Looks a bit more like copper oxide than algae (copper oxide comes in both green and black). Make sure you bleed the loop out completely and be sure your radiator doesn't have air pockets; you don't want to be aerating your water, as this is generally the result.

Distilled + a silver bullet, kill coil, or pre-made silver water is generally the most recommended for watercooling, as it prevents lifeforms while not causing things like tube flaking. However, you still have to be sure you're not leaving air pockets in the loop; keep your reservoir as full as possible.

Quote:


> Originally Posted by *Roboyto*
> 
> Been down a similar road before, mine was caused due to cheap barbs that flaked and contaminated the water.
> 
> 
> 
> 
> Get a biocide of some sort and possibly a silver kill-coil to prevent it from happening again


That's a very severe case of oxidization there; looks like the block itself was defective and contained bubbles that obviously burst...


----------



## Roboyto

Quote:


> Originally Posted by *Paul17041993*
> 
> that's a very severe case of oxidization there, looks like the block itself was a defect and contained bubbles that obviously exploded...




Look closely and you can see where the plating on these cheap barbs flaked off and clogged the block. 90% of the black gunk you see was actually solid and very hard. I noticed the temps on my Phenom X6 were up between 6-10C depending on the core, which led me to dismantle the loop and check it out. The radiators were fine and so was my HD4890 block; this was from a few years back.

Once I removed all that crap, the block functioned normally again. MCCsolutions actually just bought it from me; it was an Apogee XT.


----------



## yawa

Y'know, for the first time in my life I'm content with my build. This is the first flagship anything I've ever owned. Fully waterblocked and 24/7 clocked at 1225MHz (of course it can do higher), this is really the first time I've been satisfied with a build.

I mean, yes, I will eventually cap off this endgame build the way anyone shelling out for a flagship card should: buy the best monitor available (preferably 4K) and a 2nd card when prices start dropping. But beyond the pipe dream of eventually wanting a six-core Excavator, I'm finally happy with my computer.

What a weird feeling.


----------



## Arizonian

Quote:


> Originally Posted by *kayan*
> 
> May I join the club please? This is my picture from launch when I got my XFX R9 290x BF4 edition:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> No comments about my paper , it's my wife's and that's what was handy.....
> 
> Anyway, I'm running my XFX 290x on custom h2o loop, XSPC Razer 290 waterblock:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Temps went from throttling at 95C to benching (Unigine and 3dMark) at 57C, which interestingly enough, that's the same temp as my CPU under load. While gaming the temps are around 10C lower.
> 
> I finally got to overclock it today, and using MSI Afterburner, I can take the Core Clock to 1110 and the Memory Clock to 1375 without any voltage mods. @ 1400 on Memory I get artifacts on my desktop.


You sure may - congrats - added

Quote:


> Originally Posted by *UZ7*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Well this just happened
> 
> 
> 
> 
> 
> 
> 
> 
> 
> With less than a year I went 7950 -> 770 -> 780, now going back to AMD lol


Sweet.

Post a submission when you receive it.


----------



## bond32

I have biocide, however I haven't had my new case and radiators but a few weeks now. I would have seen it quicker, but my pump decoupling foam is yellow, so I thought the discoloration in the reservoir was just the foam. I'll use the kill coil too after it's all back together.

I'm running about 10% vinegar in it tonight. Will drain and repeat tomorrow. I sent my Sapphire 290X in today for RMA so won't have my PC running for a while.

I thought you guys would like to see the Koolance block naked. I was surprised to see how much gunk and mess was in mine. I used a very soft bristle toothbrush and some 99% alcohol to clean it. It appears whatever it was started to take some of the nickel coating off.

Also, forgot this. It looks extremely terrible in the first picture because they grease the o-rings at the factory. That grease is still there.


----------



## Paul17041993

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> Look closely and you can see where the plating on these cheap barbs flaked off and clogged the block. 90% of the black gunk you see was actually solid and very hard. I noticed the temps on my Phenom X6 were up between 6-10C depending on core, which lead me to dismantle the loop and check it out. The radiators were fine and so was my HD4890 block; this was from a few years back.
> 
> Once removing all that crap, the block functioned normally again. MCCsolutions just bought it from me actually, it was an Apogee XT.


Ah, yeah, copper oxide is usually very hard; it's a protective type of corrosion (a good thing in a lot of cases, but an issue for waterblocks, since it blocks flow and reduces thermal transfer to the water). Those barbs look to be aluminium, that's horrible...


----------



## Roboyto

Quote:


> Originally Posted by *Paul17041993*
> 
> Ah, yeah, copper oxide is usually very hard; it's a protective type of corrosion (a good thing in a lot of cases, but an issue for waterblocks, since it blocks flow and reduces thermal transfer to the water). Those barbs look to be aluminium, that's horrible...


It was my initial plunge into water cooling, and the cost of everything at first is of course a little high. I did my homework and knew about silver's inherent anti-everything properties, so I wanted silver barbs. I bought what I assumed (and made an (_!_) of myself) were silver-coated barbs. They were, in fact, not silver coated, just called Silver Diamond. I should have known better given their low price, as I had researched everything else regarding water cooling to death...but I bought them anyway.

It took quite some time for that to happen so when it did I had some extra coin to splurge on proper Bitspower compression fittings.


----------



## Roboyto

Quote:


> "yawa" url="/t/1436497/official-amd-r9-290x-290-owners-club/20640#post_22113425"]Y'know for the first time in my life I'm content with my build. This is the first Flagship anything I've ever owned. Being fully water blocked and 24/7 clocked at 1225 MHz (of course it can do higher) this is really the first time I've been satisfied with a build.
> 
> I mean, yes, I will eventually cap off this endgame build the way anyone shelling out for a flagship card should: buy the best monitor available (preferably 4K) and a second card when prices start dropping. But beyond the pipe dream of eventually wanting a six-core Excavator, I'm finally happy with my computer.
> 
> What a weird feeling.


I know the feeling, man. I had been on a seemingly never-ending upgrade cycle since I left the AMD camp for CPUs a few years ago.



Spoiler: Warning: What I thought was going to be a quick reply turned into WAY too much text



I thought I was rocking and rolling with my 4.2GHz Phenom X6 and a 4890 overclocked to the brink...Then this little company called Square Enix released the benchmark for FFXIV just before they opened beta testing the first time around back in Fall of 2010.

My system fell flat on its face. It was half due to their horrendously unoptimized graphics engine, as people with greater hardware than I were also scoring poorly.

First I upgraded to a 7870, which at the time was the best 1080p card for the money at around $330. I put it under water, overclocked it quite a bit, and was still unsatisfied.

Then I came across a smoking deal on an ASUS i5 laptop. This was just after the HD4000 IGP hit the mobile market. I immediately doubled the RAM and threw an SSD in it. The performance was inspiring, and then I benched the SSD against my desktop...the laptop blew its side panels off.









Shortly thereafter the FX-8350 released with underwhelming performance...and I threw in the proverbial towel. I sadly waved goodbye to AMD as my CPU manufacturer for the first time since a P3, maybe? Went to my local Microcenter and grabbed an ASRock Z77 Extreme6 and a 3570K. Graphics performance jumped up 30% from the CPU/board swap alone. I was hoping for a 5GHz Ivy, but was only able to make 4.7-ish.

A few months later the PCIe slot died and I had to RMA. I took this as an opportunity to upgrade my desktop as well as my HTPC. Went to Microcenter again and grabbed the Mini-ITX version of my board and a 3770K, hoping again for a 5GHz Ivy.

I got close; the chip would boot at 5GHz, but would also be hitting 95C by the time it got to the desktop. I knew of delidding but was too scared to hammer or slice my way to lower temps. So back to the GPU scene I went.

I had been watching an open-box GTX 670 Superclock 4GB for a few months at my local Microcenter. When the price finally hit $329, I jumped on it. Brought it home and tossed it in my HTPC for a few weeks to burn it in and play some games; BL2 is glorious with PhysX. I was satisfied with the card at that point. Then I came across an open-box 670 4GB DC2 on Newegg for $329. Ordered it without thinking. Also ordered waterblocks for both cards and some other goodies from FrozenCPU. The DC2 arrived a couple of days after the blocks; I opened it up, and it was DOA.







While I was waiting for the card I got excited and put the other 670 under water, only to find out A) it had horrendous coil whine, and B) it would NOT overclock at all! Now I had a DOA and a POS...EXTRA salty!







Fortunately FrozenCPU took back the DC2 block and backplate for credit, minus restocking, which I immediately put toward the enormous Phantom 630. I was still planning on SLI/Xfire but wanted to wait for a decent deal.

The deal never came, and then the 670 died on me. The only positive was that I had bought the walk-in warranty, so I had $350 to put towards something bigger and better. The 780 came out, was king of the hill, and very tempting since I had half the money in credit waiting to be spent. Rumors of Hawaii were swirling and I was quite salty with Nvidia cards, even though I had gambled on open box; before those 2 cards I had never been burned. I decided to wait for Hawaii to release and deal with my DC2 7770 for a little while longer...back to CPUs...

Now I had grown the huevos to delid my 3770K. At first I made a very poor attempt with a cheap-o exacto knife and got nowhere; the blade was much too thick. Took my CPU to work the next day and on lunch decided to give it the old hammer-and-wood-block treatment...didn't work out so well, as I took a little chunk out of the edge of the PCB. Infuriated at my stupidity, I walked into our parts department, grabbed a new 'standard' razor blade and sliced the IHS off in about 45 seconds. Time to 'upgrade' again, as I thought the 3770K was dead...it wasn't; I'm typing away right now on the HTPC it powers, as it rips Wolf of Wall Street @ 95% CPU usage.









Back to Microcenter for a 4770K and a Maximus VI Hero. Tested the CPU for about 2 hours sitting in the retail box on my dining table, then lopped the top off for delid without hesitation this time. Installed it in my Phantom 630 to find I have a fair 4770K making 4.4GHz at 1.340V.







I was content, until one day I looked up at my case and realized how ridiculous this enormous case, which I had intended for SLI, looked with practically nothing in it.

I had been watching Compact Splash come together and was really wanting to make a powerful mATX or Mini-ITX build. Jokingly I put my 4770K and MoBo up on eBay, and it sold in less than 3 hours!










And that would take this ridiculously long post, more or less, to the start of my mATX build log: http://www.overclock.net/t/1456279/honey-i-shrunk-the-ultra-tower-beast-my-journey-to-creating-a-more-compact-pc-with-an-r9-290

I am still ecstatic over the performance of my little mATX box. The 290(X) are glorious cards and, quite honestly, I think AMD had water on the brain when they cooked up the idea for them. My single 290 @ 1200/1500 has enough power behind it to get respectable/playable frames in 1080p Eyefinity.


----------



## kizwan

Quote:


> Originally Posted by *Roy360*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *btemtd*
> 
> Hi Guys,
> 
> I am wanting to do a Quadfire setup - I have the cards already (290X reference) and I have enough power: 1200W + 650W (Add2Psu_Adapter)
> 
> All I need is the motherboard and the CPU, which will be the 4930K or 4960X, which should remove any bottlenecks.
> 
> I was thinking about the Rampage IV Extreme, but then I thought maybe the cards will be too close, because of what I am about to consider, and here it is:
> 
> I was thinking of getting a 7-slot-PCIe motherboard and having the cards separated by 1 slot each, if you can picture it? Now on top of this separation I will remove the standard cooler and replace it with the *GELID Icy Vision Universal VGA Cooler Rev 2* for each card
> 
> What do you think about this, guys? As well as all this I will have a good case which will suck hot air out and suck air in,
> 
> This should quiet the cards down as well as cooling them. I know it's possible with the ref cards to have them sitting next to each other, as they are designed to suck the air in from the outer edge and blow it out the front.. But they do get pretty loud from the sucking.
> 
> What do you think of my setup above, with the 1x PCIe gap between each card, and changing the standard coolers to the GELID Icy Vision Universal VGA Cooler Rev 2? I do not want to create a massive water-cooled setup; I want a simple but effective setup that does not take up much space..
> 
> 
> 
> 
> 
> 
> 
> *Isn't it bad to mix power supplies?* Even if you do share a common ground, wouldn't you have different voltages coming from the mobo and the PCIe connectors? I've seen power supplies dip as low as 11.3V whereas others go as high as 12.7V
> 
> You might want to look into getting a powered 16x riser for your last card and cutting the 3 12V+ wires. That way you have the same power supply powering the 3 12V pins as well as the PCIe connectors.
> 
> I'm assuming you are powering one card with the 650W and 3 + system on the 1200W.
> 
> You could probably get away with all 4 on the 1200W and 440W to power the system,
> but then each card would have to be "cut"

Nope, no problem using two PSUs. Regarding the voltage tolerance, the +12V should be within +/-5% (+11.400 to +12.600 VDC). Also, software readings are not accurate.

I'm guessing 1200W for the quad & 650W for the system.
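That +/-5% window is easy to sanity-check against a multimeter (or software) reading. A quick sketch in plain Python, where the 11.3V and 12.7V figures are just the example readings Roy360 mentioned:

```python
# Check measured rail voltages against the ATX +/-5% tolerance window.
def rail_limits(nominal, tol=0.05):
    """Return the (min, max) acceptable voltage for a rail."""
    return nominal * (1 - tol), nominal * (1 + tol)

def in_spec(reading, nominal, tol=0.05):
    """True if a measured reading falls inside the tolerance window."""
    lo, hi = rail_limits(nominal, tol)
    return lo <= reading <= hi

lo, hi = rail_limits(12.0)
print(f"+12V window: {lo:.1f}V to {hi:.1f}V")  # 11.4V to 12.6V
print(in_spec(11.3, 12.0))   # a dip to 11.3V is below the floor -> False
print(in_spec(12.7, 12.0))   # 12.7V is above the ceiling -> False
```

So both of the extremes quoted above would technically be out of spec, though as noted, software readings aren't accurate enough to trust for that call on their own.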
Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roy360*
> 
> how much of a difference did the Fuji pads on the backplate make? VRMs are at 60 degrees using the stock ingredients, but I plan on adding 1-2 more cards.
> 
> 
> 
> I believe Kizwan said he couldn't measure temp drops because he did the pads on the card and the backplate at the same time. I will be adding the pads to my backplate soon, I can let you know what happens with mine. Using XSPC block/backplate.

Roboyto is correct. I can't measure it because I did the block & backplate at the same time.

@Roy360, in addition to Roboyto's XSPC vs. Fujipoly *Ultra Extreme* thermal pads thermal performance comparison *[Link]*, I also recorded VRM1 temp for before & after applying the Fujipoly *Extreme* on EK waterblock + backplate, in case you're interested. *[Link]*
Quote:


> Originally Posted by *bond32*
> 
> I have biocide, however I haven't had my new case and radiators but a few weeks now. I would have seen it quicker, but my pump decoupling foam is yellow, so I thought the discoloration in the reservoir was just the foam. I'll use the kill coil too after it's all back together.
> 
> I'm running about 10% vinegar in it tonight. Will drain and repeat tomorrow. I sent my Sapphire 290X in today for RMA so won't have my PC running for a while.
> 
> I thought you guys would like to see the Koolance block naked. I was surprised to see how much gunk and mess was in mine. I used a very soft bristle toothbrush and some 99% alcohol to clean it. It appears whatever it was started to take some of the nickel coating off.
> 
> Also, forgot this. It looks extremely terrible in the first picture because they grease the o-rings at the factory. That grease is still there.


Don't forget, for good measure, to flush with baking soda (Sodium Bicarbonate) to neutralize the acid.


----------



## DeadlyDNA

Anyone here using the 14.3 beta drivers with Eyefinity and CrossFire? I am having an issue with hot-plug detection on monitors with the 14.3 beta. I can confirm it was working fine on 13.12 WHQL, except for v-sync as we know. When I enter a 3D application my monitors flicker and remove/reconnect, causing the game to run only on a single card. I'm curious if anyone else has experienced this.

I cannot continue my 4K Eyefinity on the 14.3 beta drivers if I cannot figure out a fix.

Issues with 14.3:

#1: Monitors will not work with manual detection. (I have the pin disabled for hot-plug detection.) On a fresh load of Windows 7 or 8 the monitors work fine; load the 14.3 beta and the monitors lose signal and are never detected unless I hook up with the hot-plug detection pin enabled.

#2: Because of the constant detection and dropping of monitors when changing resolutions (entering a 3D application), Eyefinity only runs on a single card; CrossFire not working, like it went to full-screen windowed...

Anyone?


----------



## Mega Man

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kayan*
> 
> May I join the club please? This is my picture from launch when I got my XFX R9 290x BF4 edition:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> No comments about my paper, it's my wife's and that's what was handy.....
> 
> Anyway, I'm running my XFX 290X on a custom H2O loop with an XSPC Razor 290 waterblock:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> Temps went from throttling at 95C to benching (Unigine and 3DMark) at 57C which, interestingly enough, is the same temp as my CPU under load. While gaming the temps are around 10C lower.
> 
> I finally got to overclock it today; using MSI Afterburner I can take the core clock to 1110 and the memory clock to 1375 without any voltage mods. @ 1400 on memory I get artifacts on my desktop.
> 
> 
> 
> You sure may - congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *UZ7*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Well this just happened
> 
> 
> 
> 
> 
> 
> 
> 
> 
> With less than a year I went 7950 -> 770 -> 780, now going back to AMD lol
> 
> 
> Sweet.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Post a submission when you receive it.

Quote:


> Originally Posted by *Mega Man*
> 
> ill have to get caught up in a min ! wanted to share with everyone.
> since i cant locate my megaman mouse pad megaman himself + rush + met will have to do !
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> for open box, it looks unopened to me ...
> 
> 
> 
> 
> 
> 
> i have to admit the ek without the crop circles they actually look good !
> 
> 
> 
> 
> 
> sad i cant add it to my loop till i get back from my trip


gonna add meh D:


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Mega Man*
> 
> ill have to get caught up in a min ! wanted to share with everyone.
> since i cant locate my megaman mouse pad megaman himself + rush + met will have to do !
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> for open box, it looks unopened to me ...
> 
> 
> 
> 
> 
> 
> i have to admit the ek without the crop circles they actually look good !
> 
> 
> 
> 
> 
> sad i cant add it to my loop till i get back from my trip


I am jelly man pre installed waterblock !

Just got an email from the dude that I won his unlocked XFX 290 with tri fan aftermarket cooler from on the ebay.........

" Mate by chance I'm driving to Brisbane for Easter. I will be getting there Friday mid afternoon. Your place is just off the highway by the looks of it, so I could drop it at your door if you like? If not I will post it this afternoon. "

I said " Hell yes bring it to me man I need to bench this weekend . Im no 3 in AUS enthusiast and no 80 worldwide on HWBOT need TRI fire for points . If you could legendary status is YOURS "

I cant WAIT


----------



## HOMECINEMA-PC

@jjjc_93
I haven't seen you here before .... howsit goin ?


----------



## Imprezzion

Quote:


> Originally Posted by *Roboyto*
> 
> Yeah, it won't affect shipping costs at all. That one strip of Ultra Extreme 15mm x 100mm can easily cover VRMs for (2) cards; possibly 3 if you cut very carefully.


Correct. Cut it into 3 strips of 100x5mm with something like a simple box cutter. Those small Stanley snap-off knives work really well for it.

I did it with my strip, and 5mm is wide enough for the VRM1 strip. Also, 100mm is a tad too long; it needs about 20mm cut off the length as well, which you can use for VRM2 if ya want.

Temps on my stock Tri-X cooler dropped about 10C compared to Sapphire's stock thermal crud, which is quite low quality if ya ask me...
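For anyone else cutting a strip down, the arithmetic above sketches out like this (plain Python; the 20mm trim is Imprezzion's rough "tad too long" estimate, not a measured value):

```python
# Pad-cutting math from the post: one Fujipoly 100 x 15 mm strip, cut
# lengthwise into 5 mm-wide strips for the VRM1 row.
PAD_LEN_MM = 100    # strip length as sold
PAD_W_MM = 15       # strip width as sold
STRIP_W_MM = 5      # width needed to cover the VRM1 row
TRIM_MM = 20        # rough excess length per strip, reusable for VRM2

strips_per_pad = PAD_W_MM // STRIP_W_MM   # 15 / 5 -> 3 strips per pad
vrm1_len_mm = PAD_LEN_MM - TRIM_MM        # ~80 mm actually used per strip

print(strips_per_pad, vrm1_len_mm)
```

Which lines up with Roboyto's earlier point that one strip can cover VRM1 on up to three cards if you cut carefully.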


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I am jelly man pre installed waterblock !
> 
> Just got a email from the dude that I won his unlocked XFX 290 with tri fan aftermarket cooler on the ebay.........
> 
> " Mate by chance I'm driving to Brisbane for Easter. I will be getting there Friday mid afternoon. Your place is just off the highway by the looks of it, so I could drop it at your door if you like? If not I will post it this afternoon. "
> 
> I said " Hell yes bring it to me man I need to bench this weekend . Im no 3 in AUS enthusiast and no 80 worldwide on HWBOT need TRI fire for points . If you could legendary status is YOURS "
> 
> I cant WAIT
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Got a block on the way for it?

Looking forward to the numbers you pump out with that rig


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Got a block on the way for it?
> 
> Looking forward to the numbers you pump out with that rig


I haven't ordered one yet. PC Case Gear won't have any (XSPC) till after the 17th, so I will tri first, 2 w/blocked and one on air with some A/C, then single 290X, and I might even pull off a block and whack it on to see if it will go harder. Also this one is def gonna be a LN2 card or sumthin like that. Bench 290, flick switch, bench 290X. 2 for the price of one......... can't lose man


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I haven't ordered one yet. PC Case Gear won't have any (XSPC) till after the 17th, so I will tri first, 2 w/blocked and one on air with some A/C, then single 290X, and I might even pull off a block and whack it on to see if it will go harder. Also this one is def gonna be a LN2 card or sumthin like that. Bench 290, flick switch, bench 290X. 2 for the price of one......... can't lose man


Good point that. You should have gotten all the ref XFX cards that PCCG had; a friend got one that unlocked, then the first time setting up his loop he fried the card









The second card he ordered also unlocked; XFX had a good unlock rate in Aus....

We have Newegg available to us, now we just need FrozenCPU


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Good point that. You should have gotten all the ref XFX cards that PCCG had; a friend got one that unlocked, then the first time setting up his loop he fried the card
> 
> The second card he ordered also unlocked; XFX had a good unlock rate in Aus....
> 
> We have Newegg available to us, now we just need FrozenCPU


1. How did old mate cook his XFX?

2. I only get new cards local, no XFX for sale here at the time

3. I am ignorant...... when did Newegg get here to the land of plenty?

4. I've gotta go and see my bro..... I'll be back shortly


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 1. How did old mate cook his XFX?
> 
> 2. I only get new cards local, no XFX for sale here at the time
> 
> 3. I am ignorant...... when did Newegg get here to the land of plenty?
> 
> 4. I've gotta go and see my bro..... I'll be back shortly



Drop of water fell onto the back of the card, just near the PCIe connector

Understandable, no local stores where I am though

Newegg started up an AUS/UK store on the 8th of April; I found out yesterday


----------



## Paul17041993

Quote:


> Originally Posted by *Roboyto*
> 
> It was my initial plunge into water cooling and the cost of everything at first is a little high of course. I did my homework and knew about silver's inherent anti-everything properties, so I wanted silver barbs. I bought what I assumed, and made an (_!_) of myself, were silver coated barbs. They were in fact, not silver coated, just called Silver Diamond. I should have known better by their low price, as I researched everything else regarding water cooling as much as I could to death...but I bought them anyway.
> 
> It took quite some time for that to happen so when it did I had some extra coin to splurge on proper Bitspower compression fittings.


You learn from your mistakes, and now you know the horror that is aluminium and water


----------



## Norse

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 
> Drop of water fell onto the back of the card, just near the PCIe connector
> 
> Understandable, no local stores where I am though
> 
> Newegg started up an AUS/UK store on the 8th of April; I found out yesterday


Selection to the UK seems to be a little sucky; I checked a random mix of things and most said unavailable. Also, I don't see any mention of possibly being stung by import taxes, as it's shipped from outside the EU


----------



## Sgt Bilko

Quote:


> Originally Posted by *Norse*
> 
> Selection to the UK seems to be a little sucky; I checked a random mix of things and most said unavailable. Also, I don't see any mention of possibly being stung by import taxes, as it's shipped from outside the EU


I was wondering about that. If you get stuck with the import fees then what's the point?

I'm happy though, it's still cheaper than our largest e-tailer with shipping included


----------



## Frontside

Quote:


> Originally Posted by *Roboyto*
> 
> From MSI website regarding warranty:
> 
> The MSI product *MUST* be free of any _*physical damage*_ due to improper installation or modification of *ANY* kind (this includes installing aftermarket parts) or the warranty *WILL* be *VOID*.
> The product MUST be returned to MSI in the original factory configuration and condition. All aftermarket modifications must be reversed prior to sending in the product for repair or replacement.
> If you are missing thermal pads, then just put some thermal pads on there yourself. You will NOT void your warranty as long as you don't damage the card. I wouldn't wait 3 months for an RMA.
> 
> Your core clock is likely reducing to cool off the VRMs because they're *WAY, WAY TOO HOT*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html
> 
> Buy some of those and you're good to go!


Thanks.
Already ordered some pads on eBay.
The MSI support guy said that I will lose the warranty by removing that sticker.
I was going to put my card under water sooner or later, so removing it is not a big deal.
I've owned this card for about two and a half months. I noticed the VRMs were running this hot 10 days ago.
Is it safe to have run it this long with temps this high?


----------



## Matt-Matt

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added


So I forgot to add that
Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I am jelly man pre installed waterblock !
> 
> Just got a email from the dude that I won his unlocked XFX 290 with tri fan aftermarket cooler on the ebay.........
> 
> " Mate by chance I'm driving to Brisbane for Easter. I will be getting there Friday mid afternoon. Your place is just off the highway by the looks of it, so I could drop it at your door if you like? If not I will post it this afternoon. "
> 
> I said " Hell yes bring it to me man I need to bench this weekend . Im no 3 in AUS enthusiast and no 80 worldwide on HWBOT need TRI fire for points . If you could legendary status is YOURS "
> 
> I cant WAIT


You're one lucky guy! I've only once bought something locally (Tasmania), and I was going to meet them, but I ended up having uni that day! I can't wait to move to the mainland haha.


----------



## Norse

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I was wondering about that. If you get stuck with the import fees then what's the point?
> 
> I'm happy though, it's still cheaper than our largest e-tailer with shipping included


Not sure about Aus, but it's VERY likely a package delivered by UPS, DHL or Citylink will get hit by customs of 20%-ish, then the courier's admin charge (£10-20)

Although they are a lot cheaper than the EU for certain products, i.e. the AMD G34 63xx range of CPUs, even if I include 20% customs. Though it would be interesting to see how the warranty works, as it was technically purchased from a supplier in the US.....so would I then have to send it to a US returns/
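The landed-cost math works out roughly like this; a quick sketch with placeholder numbers (the 20% rate and the £15 admin charge are illustrative mid-range figures from the post, not official rates):

```python
# Rough landed cost for an import: customs/VAT on the item value plus the
# courier's flat admin charge. All rates here are placeholders from the post.
def landed_cost(item_price, customs_rate=0.20, admin_fee=15.0):
    """Return the total cost after customs and the courier admin charge."""
    return item_price * (1 + customs_rate) + admin_fee

# e.g. a GBP 231 card:
print(round(landed_cost(231.0), 2))   # roughly GBP 292 delivered
```

So the import hit can easily eat most of the saving unless the sticker price is well under the local one.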


----------



## Tokkan

Soon to become the proud owner of a reference R9 290.
Reference because it comes from a miner; bought it for 280 pounds. Let's see how my Phenom handles it


----------



## Norse

Quote:


> Originally Posted by *Tokkan*
> 
> Soon to become the proud owner of a reference R9 290.
> Reference because it comes from a miner; bought it for 280 pounds. Let's see how my Phenom handles it


Look around, there are loads for £230

Also my Sapphire R9 290X came in today (probably from a miner)

Will install it tonight after doing benchmarks on my GTX 680 so I can compare performance


----------



## Sgt Bilko

Quote:


> Originally Posted by *Norse*
> 
> Not sure about Aus, but it's VERY likely a package delivered by UPS, DHL or Citylink will get hit by customs of 20%-ish, then the courier's admin charge (£10-20)
> 
> Although they are a lot cheaper than the EU for certain products, i.e. the AMD G34 63xx range of CPUs, even if I include 20% customs. Though it would be interesting to see how the warranty works, as it was technically purchased from a supplier in the US.....so would I then have to send it to a US returns/


I know it's a fee of 25% of the product's value in Denmark, plus a flat fee of 70 kroner I think


----------



## Tokkan

Quote:


> Originally Posted by *Norse*
> 
> Look around, there are loads for £230
> 
> Also my Sapphire R9 290X came in today (probably from a miner)
> 
> Will install it tonight after doing benchmarks on my GTX 680 so I can compare performance


Actually I'm just lazy with conversions, and I'm not in the UK and mine's coming from the UK, so I'm paying DHL shipping also.
I'm shelling out 280 euros; Google says it's 231 pounds, and that includes shipping lol.


----------



## Norse

Quote:


> Originally Posted by *Tokkan*
> 
> Actually I'm just lazy with conversions, and I'm not in the UK and mine's coming from the UK, so I'm paying DHL shipping also.
> I'm shelling out 280 euros; Google says it's 231 pounds, and that includes shipping lol.


Ah right £231 is good then







Where are you located?


----------



## Tokkan

Quote:


> Originally Posted by *Norse*
> 
> Ah right £231 is good then
> 
> 
> 
> 
> 
> 
> 
> Where are you located?


Portugal, should still arrive this week I hope :fingerscrossed:


----------



## velocityx

Quote:


> Originally Posted by *Paul17041993*
> 
> done the classic hardware debugging of re-seat the cards, swap their places, test each one individually?
> 
> looks to me like one of them either has a shot display controller or a bad memory IC, or bad choke/VRM causing lack of power to the board, also check your power rails to be sure your PSU is supplying the voltage it should be.


Sure, first I had just one card and a 650 watt PSU. It ran fine, black-screened from time to time like almost all of them did. Then I got a second one, swapped the cards, and it behaved exactly the same. Later I bought a new 850W Gold PSU and made CF. Point is, Crysis 3, Diablo, Tomb Raider and all the other games I have are running OK; it's just BF4 and Mantle that are a bugfest, because I can play for hours straight but sometimes the video issue happens and then it's reboot after reboot.

It happened less with 14.3, but 14.4 has a working power limit so it doesn't throttle, so I'm keeping them


----------



## rdr09

Quote:


> Originally Posted by *velocityx*
> 
> Sure, first I had just one card and a 650 watt PSU. It ran fine, *black-screened from time to time like almost all of them did*. Then I got a second one, swapped the cards, and it behaved exactly the same. Later I bought a new 850W Gold PSU and made CF. Point is, Crysis 3, Diablo, Tomb Raider and all the other games I have are running OK; it's just BF4 and Mantle that are a bugfest, because I can play for hours straight but sometimes the video issue happens and then it's reboot after reboot.
> 
> It happened less with 14.3, but 14.4 has a working power limit so it doesn't throttle, so I'm keeping them


How many 290(X)s have you had that black-screened? All of them did from time to time?


----------



## Roboyto

Quote:


> Originally Posted by *Frontside*
> 
> Thanks.
> Already ordered some pads on eBay.
> The MSI support guy said that I will lose the warranty by removing that sticker.
> I was going to put my card under water sooner or later, so removing it is not a big deal.
> I've owned this card for about two and a half months. I noticed the VRMs were running this hot 10 days ago.
> Is it safe to have run it this long with temps this high?


Very odd that MSI support said it would void the warranty









The VRMs are rated to run at temperatures in that range, ~120C from everything I have seen in this thread, but it's probably not the best thing for the card. If you're not having problems until the temps get that high, then the card is probably still OK; you won't know for sure until you put the pads on and see how it runs.


----------



## Paul17041993

Quote:


> Originally Posted by *velocityx*
> 
> Sure, first I had just one card and a 650 watt PSU. It ran fine, *black-screened from time to time like almost all of them did*. Then I got a second one, swapped the cards, and it behaved exactly the same. Later I bought a new 850W Gold PSU and made CF. Point is, Crysis 3, Diablo, Tomb Raider and all the other games I have are running OK; it's just BF4 and Mantle that are a bugfest, because I can play for hours straight but sometimes the video issue happens and then it's reboot after reboot.
> 
> It happened less with 14.3, but 14.4 has a working power limit so it doesn't throttle, so I'm keeping them


You have a problem there; have you tried 13.12? You shouldn't get black screens at all unless you're using the 14.x drivers, which _will_ have bugs galore, so if you don't want constant crashes etc. you need to be on the official drivers for one thing.

Otherwise I'm not too sure what's happening there...


----------



## Imprezzion

I do notice when using Mantle that there are random FPS drops into the 10s/20s for a few seconds. This only happens after a few rounds, weirdly enough. Didn't have it with 14.1-14.3, but 14.4 does it...

It's not full VRAM, as VRAM usage is only at about ~3.1GB according to MSI AB. Anyone else got any ideas?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Imprezzion*
> 
> I do notice when using Mantle that there are random FPS drops into the 10s/20s for a few seconds. This only happens after a few rounds, weirdly enough. Didn't have it with 14.1-14.3, but 14.4 does it...
> 
> It's not full VRAM, as VRAM usage is only at about ~3.1GB according to MSI AB. Anyone else got any ideas?


A couple of people have noticed this as well, but with the earlier drivers.

Was discussed in the Mantle discussion thread: http://www.overclock.net/t/1429303/amd-mantle-discussion-thread/0_40


----------



## velocityx

Quote:


> Originally Posted by *Paul17041993*
> 
> You have a problem there; have you tried 13.12? You shouldn't get black screens at all unless you're using the 14.x drivers, which _will_ have bugs galore, so if you don't want constant crashes etc. you need to be on the official drivers for one thing.
> 
> Otherwise I'm not too sure what's happening there...


The majority of my gaming is BF4, and Mantle is just such a bomb that I will stay on 14.1/2/3/4 until WHQL or some new drivers, because playing without Mantle is just meh...
Quote:


> Originally Posted by *rdr09*
> 
> How many 290(X)s have you had that black screen? All of them did from time to time?


Both of them black screen from time to time. But I'm starting to think it's just Battlefield's fault more than anything, because other games run fine.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Matt-Matt*
> 
> So I forgot to add that
> *You're one lucky guy!* I've only ever once bought something that's local (Tasmania) and I was going to meet them but I ended up having uni that day! I can't wait to move to the mainland haha.


LooooL








I still haven't heard back from the XFX card owner yet, so not lucky just yet, fingers crossed


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> LooooL
> 
> 
> 
> 
> 
> 
> 
> 
> I still haven't heard back from xfx card owner yet , so not lucky just yet fingers crossed


it will get there


----------



## rdr09

Quote:


> Originally Posted by *velocityx*
> 
> majority of my gaming is bf4, and mantle is just such a bomb I will stay on 14.1/2/3/4 until whql or some new drivers because playing without mantle is just meh...
> both of them do Black screen from time to time. But i'm starting to think it's just battlefield's fault more than anything because other games run fine.


I only have a single 290 and am currently using both 13.11 and 14.3 without issues. I never had a chance to play BF4 with the previous 14s. I found out I have to move the settings folder under the BF4 files to another location (mine is on my desktop) in order for BF4 not to crash while loading with Mantle. No black screens in any driver, though. I have 2 HDDs imaged, one with 13.11 and the other with 14.3 (for Mantle). 13.11 is for other gamers in the house playing C2 and BI.


----------



## Talon720

Lately I've been getting the itch to buy a Lightning. I originally wanted it, but the excitement won out. I have 2 290X ref cards, both pretty decent (Hynix + Elpida). I wanna try a 3rd card, and if it doesn't work, sell the weakest. I also wanna see for myself how 8x/4x/4x is gonna work out. How is CrossFire with a Lightning and a ref card gonna work? Are there any issues with overclocking cards not synced? Should I even bother with the Lightning... The EK block is $150.00, and there's no guarantee of better performance over reference. Maybe just a better chance, what do you all think? I'd be kinda pissed to get a Lightning and have my ref beat it


----------



## Brian18741

Quote:


> Originally Posted by *Faster_is_better*
> 
> For the purpose of watercooling, is it better to get a reference card or one of the aftermarket versions? I'm asking in terms of the quality of the card, not waterblock availability, which generally benefits reference more. Typically there seem to be "uber" cards released like MSI Lightnings that supposedly will clock better, or are just better made, but is it worth it to pay more for one of these or just get a reference card + wb? I'm mainly thinking of the ASUS DCII version vs reference, but if there are some better non-reference designs of similar price then I might watch for those.
> 
> I saw a few different manufacturers are even selling card with waterblocks attached, but I'm not sure if they are reference designed cards or custom pcb + water blocks.


It's worth noting the Asus DCII cards have a little sticker over one of the screws holding the stock cooler on so you will definitely void your warranty by putting on a waterblock. I have two of them and emailed Asus who confirmed warranty is void if the stock cooler is removed.


----------



## Talon720

Quote:


> Originally Posted by *Brian18741*
> 
> It's worth noting the Asus DCII cards have a little sticker over one of the screws holding the stock cooler on so you will definitely void your warranty by putting on a waterblock. I have two of them and emailed Asus who confirmed warranty is void if the stock cooler is removed.


Well even if removing the stickers voided the warranty, it's easy to get them off with a razor blade. I did it just in case, and the sticker wasn't damaged at all.


----------



## rdr09

Quote:


> Originally Posted by *Talon720*
> 
> Lately I've been getting the itch to buy a Lightning. I originally wanted it, but the excitement won out. I have 2 290X ref cards, both pretty decent (Hynix + Elpida). I wanna try a 3rd card, and if it doesn't work, sell the weakest. I also wanna see for myself how 8x/4x/4x is gonna work out. How is CrossFire with a Lightning and a ref card gonna work? Are there any issues with overclocking cards not synced? Should I even bother with the Lightning... The EK block is $150.00, and there's no guarantee of better performance over reference. Maybe just a better chance, what do you all think? I'd be kinda pissed to get a Lightning and have my ref beat it


You are right, the silicon lottery applies to all cards. If you don't plan on using LN2, then you might as well stick with the ref cards you have and add the same type. What are the highest clocks on your cards? Care to show some benchmarks?


----------



## UZ7

http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-280x-vapor-x-oc-and-r9-290-vapor-x-oc-review/

Kit guru did a review on the R9 290 and 280X Vapor-X.

Meh I couldve ordered this yesterday but right when I saw it in stock for an hour or so my Tri-X already shipped. Ah well

Anyway, supposedly the new cooler is about ~5C better than the Tri-X, and the fans have an auto mode where one fan stays on during idle, then when it gets to 60C the other 2 fans kick in, or you can set it to manual. Other FYI: custom PCB/chokes, 2x 8-pin, backplate and vapor chamber cooler.


----------



## Widde

Anyone else having trouble getting AB to give +200mV since the update? Mine doesn't do anything anymore


----------



## Faster_is_better

Quote:


> Originally Posted by *Roboyto*
> 
> Fairly certain the VisionTek CryoVenom and PowerColor LCS are reference boards with EK waterblocks already attached; I believe custom backplates in both instances?
> 
> The non-reference boards usually upgrade several aspects and probably most importantly power delivery, increasing your chances for better performance. Lightning/Matrix are typically known for using higher binned chips to give you better odds of a good performer; amongst their vast list of other enhancements.
> 
> RAM clocks aren't the most important thing with the 290(X) but you should be aware of DC2/Matrix likely having Elpida RAM; see my post a few spots up with spreadsheet image. Elpida *can* perform well, _*but*_ are usually less likely to overclock as well as Hynix. Friend of mine just picked up a VisionTek 280X with Elpida 6Gbps RAM, and it clocks up to 7.1, on air, without any memory voltage adjustments.
> 
> EK has waterblocks out, or on the way for ASUS DC2, ASUS Matrix, MSI Gaming OC, and MSI Lightning. So your choices are fairly vast for watercooling.
> 
> Lastly, can't forget the good old silicon lottery...some chips are excellent performers, even on a reference board, like my XFX 290 BE
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you go with an EK block, I would highly suggest using Fujipoly Ultra Extreme thermal pads to cool the VRMs. The included pads are underwhelming, especially when pushing voltage/clocks to the brink. I saw over 20% temperature drop with my XSPC Razor, and several people with EK blocks have reported similar results.
> 
> http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures#post_21821258


This is a great post response +1







I primarily like the ASUS DC2 cards because they have the reworked power phases and such, plus their stock coolers are typically pretty good. Is it random chance whether any of the cards come with Elpida vs Hynix memory, or is that actually decided per model/per vendor?
Quote:


> Originally Posted by *Brian18741*
> 
> It's worth noting the Asus DCII cards have a little sticker over one of the screws holding the stock cooler on so you will definitely void your warranty by putting on a waterblock. I have two of them and emailed Asus who confirmed warranty is void if the stock cooler is removed.


Good to know, Thanks


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *Talon720*
> 
> Latley ive been getting the itch to buy a lightning. I orginally wanted it, but the excitment won out. I have 2 290x ref both pretty decent (hynix+elpida). I wanna try a 3rd card, and if it dosnt work sell the weakest. I also wanna see for myself how 8x4x4x is gonna work out. How is crossfire with a lightning and ref card gonna work? Is there any issues with overclocking cards not synced? Should i even bother with the lightning... Ek block is $150.00, and theres no gauereentee of better performance over reference. Maybe just a better chance what do you all think? Id be kinda pissed to get a lightning and have my ref beat it


Don't. I had the same itch, and as amazing as the card seems, from the people who own it and have actually run LN2 or extreme benching it seems it's just very limited. It doesn't seem to overclock as we expected it would


----------



## sugarhell

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> Don't. I had same itch and as amazing as the card seems, from the people who own it and actually have ran LN2 or extreme benching it seems its just very limited. It doesn't seem to overclock as we expect it would


http://www.3dmark.com/fs/1881409

1350/1750 ref 290x water

Is that not enough overclocking?


----------



## igrease

Okay guys, so I just got my MSI R9 290 and I have a quick question. Do I need to actually change the modes before I start to overclock? I really CBA to unhook everything from the PC to change the thing.


----------



## Forceman

Quote:


> Originally Posted by *igrease*
> 
> Okay guys, so I just got my MSI r9 290 and I have a quick question. Do I need to actually change the modes before I start to overclock? I really CBA to unhook everything from pc to change the thing.


What modes are you talking about? If you mean the BIOS switch then no, on a 290 they are both the same.


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *sugarhell*
> 
> http://www.3dmark.com/fs/1881409
> 
> 1350/1750 ref 290x water
> 
> Its not enough overclocking?


If you read who my reply was written to, it was in regards to an MSI 290X Lightning.
Quote:


> Originally Posted by *Talon720*
> 
> *Lately I've been getting the itch to buy a Lightning.* I originally wanted it, but the excitement won out. I have 2 290X ref cards, both pretty decent (Hynix + Elpida). I wanna try a 3rd card, and if it doesn't work, sell the weakest. I also wanna see for myself how 8x/4x/4x is gonna work out. *How is CrossFire with a Lightning and a ref card gonna work?* Are there any issues with overclocking cards not synced? *Should I even bother with the Lightning*... The EK block is $150.00, and there's no guarantee of better performance over reference. Maybe just a better chance, what do you all think? *I'd be kinda pissed to get a Lightning and have my ref beat it*


So yeah, I'd be pretty pissed too if an MSI 290X Lightning walled at 1400-1500MHz core on LN2 while people are running reference cards on water at 1350MHz


----------



## sugarhell

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> If you read who my reply was written to, it was in regards to a msi 290x lightning.


Did you check the Lightning thread? We have people with 1280 on air. 8 Pack said that he can easily do 1400 on water. For LN2 we don't know until someone experienced tries to overclock this card. Smoke did 1500 on a ref 290X


----------



## Roboyto

Quote:


> Originally Posted by *Faster_is_better*
> This is a great post response +1
> 
> 
> 
> 
> 
> 
> 
> I'm primarily liking the ASUS DC2 cards because they have the reworked power phases, and such + their stock coolers are typically pretty good. Is it random chance whether any of the cards come with elpida vs hynix memory or is that actually decided per model/per vendor?
> Good to know, Thanks


The 290(X) DC2/Matrix cards actually have a major flaw in the air coolers. ASUS was lazy and reused the heatsink from their GK104 cards, and its heatpipe layout doesn't line up with the Hawaii die: only about 2.5 of the 5 heatpipes make contact with the die. It is likely easily fixable with a thin copper shim to help distribute the heat to all the pipes, like those that were necessary for the 79XX cards in some cases.

If you want a good cooler, Sapphire, MSI, and that ridiculous 3-slot Powercolor seem to be leading the pack this time around.

I can also confirm that ASUS will honor their warranty once HSF is removed. I recently RMAd my DC2 7870 and had no issues with the claim after it being under water. Just be certain there is no damage to the card and everything is back to stock configuration.

It's hard to say with the RAM. I know with reference boards it is usually luck of the draw depending on who can supply the RAM at that time. All the reviews I've read on the Matrix have shown Elpida memory; may have to contact the manufacturers to get a definite answer.

With XFX's Black Edition, I believe you are paying for the better memory, judging from all the people I've seen who have those cards; both of mine had Hynix, but one wasn't a phenomenal performer.


----------



## Mega Man

Wow plus one for Asus


----------



## Jflisk

Quote:


> Originally Posted by *Roboyto*
> 
> The 290(X) DC2/Matrix cards actually have a major flaw in the air coolers. ASUS was lazy and reused the heatsink from their GK104 cards, and its heatpipe layout doesn't line up with the Hawaii die: only about 2.5 of the 5 heatpipes make contact with the die. It is likely easily fixable with a thin copper shim to help distribute the heat to all the pipes, like those that were necessary for the 79XX cards in some cases.
> 
> If you want a good cooler, Sapphire, MSI, and that ridiculous 3-slot Powercolor seem to be leading the pack this time around.
> 
> I can also confirm that ASUS will honor their warranty once HSF is removed. I recently RMAd my DC2 7870 and had no issues with the claim after it being under water. Just be certain there is no damage to the card and everything is back to stock configuration.
> 
> It's hard to say with the RAM. I know with reference boards it is usually luck of the draw depending on who can supply the RAM at that time. All the reviews I've read on the Matrix have shown Elpida memory; may have to contact the manufacturers to get a definite answer.
> 
> XFX with their black edition, I believe you are paying for the better memory from all the people I've seen who have those cards; both of mine had Hynix but one wasn't a phenomenal performer.


My XFX 290Xs are a mix of Hynix and Elpida. Just a heads up


----------



## sugarhell

Quote:


> Originally Posted by *Mega Man*
> 
> Wow plus one for Asus


-99 +1=-98


----------



## Roboyto

Quote:


> Originally Posted by *Jflisk*
> 
> My XFX 290Xs are a mix of Hynix and Elpida. Just a heads up


Are they both black edition?


----------



## Jflisk

Quote:


> Originally Posted by *Roboyto*
> 
> Are they both black edition?


As black edition as they get.


----------



## Arizonian

Quote:


> Originally Posted by *UZ7*
> 
> http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-280x-vapor-x-oc-and-r9-290-vapor-x-oc-review/
> 
> Kit guru did a review on the R9 290 and 280X Vapor-X.
> 
> Meh I couldve ordered this yesterday but right when I saw it in stock for an hour or so my Tri-X already shipped. Ah well
> 
> Anyway supposedly the new cooler is about 5C~ better than the Tri-X and the fan has auto mode where it one fan stays on during idle then when it gets to 60C the other 2 fans kick in or you can set it to manual. Other FYI, custom pcb/chokes, 2x 8pin, backplate and vapor chamber cooler.


Vapor-X temps on the new design are impressive.

I see Newegg getting it ready in the lineup.









http://www.newegg.com/Product/Product.aspx?Item=N82E16814202103


----------



## INCREDIBLEHULK

Quote:


> Originally Posted by *sugarhell*
> 
> Did you check the lightning thread? We have people with 1280 on air. 8 pack said that he can do easily 1400 on water. Ln2 we dont know until someone experience try to overclock this card.Smoke did 1500 on a ref 290x


I believe that's my point.
Smoke did 1500 on a ref 290X... So what's the point in paying the premium for the Lightning if nobody has done much more than those numbers? Seems a reference makes more sense in that situation, unless you want to pay extra for a sweet cooler you have to take off


----------



## Roboyto

Quote:


> Originally Posted by *Jflisk*
> 
> As black edition as they get.


Cool, glad you confirmed. I was just making a guess, since I'm not sure what else they could be charging you another ~$30 for, ya know? My reference cards had a whopping 3% OC out of the box.

One was a mediocre performer, 10% OC on RAM and core. The one I kept is beast mode, 1300/1700.


----------



## sugarhell

290 vaporx pcb. The vrm area is interesting


----------



## Roboyto

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> I believe thats my point.
> Smoke did 1500 on a ref 290x....... So whats the point in the premium for the lightning if nobody has done much more than those numbers? Seems if a reference makes more sense in that situation, unless you want to pay extra for a sweet cooler you have to take off


Hoping MSI has picked chips that are better performers. Otherwise you rely on the lottery?


----------



## heroxoot

Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *INCREDIBLEHULK*
> 
> I believe thats my point.
> Smoke did 1500 on a ref 290x....... So whats the point in the premium for the lightning if nobody has done much more than those numbers? Seems if a reference makes more sense in that situation, unless you want to pay extra for a sweet cooler you have to take off
> 
> Hoping MSI has picked chips that are better performers. Otherwise you rely on the lottery?

The Lightnings are normally better binned. My 7970 Lightning ran lower than normal voltage.

On another note, MSI said that I should get an email with tracking between today and tomorrow for my 290 G series replacement. It seems like a fair replacement for my 7970 Lightning, as they don't make a 280X Lightning. And the performance difference clock for clock should be substantial.


----------



## Roboyto

Quote:


> Originally Posted by *heroxoot*
> 
> The lightnings are normally better binned. My 7970 lightning had lower than normal lighting voltage.
> 
> On another note. MSI said that I should get an email with tracking between today and tomorrow for my 290 G series replacement. It seems like a fair replacement for my 7970 lightning as they don't make a 280x lightning. And the performance differences clock for clock should be substantial.


That's definitely a fair replacement :-D Bonus round for you!


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> 
> 290 vaporx pcb. The vrm area is interesting


I wonder how effective that heat sink will prove to be?


----------



## Norse

Got my R9 290x in computer







Despite people saying they run hot and noisy as hell, at 60% fan its noise level is fine and it's quieter than the Xbox 360 behind me!

No pics of the install as I was doing it quickly before dinner, will sort out nice pics later









CPUZ http://valid.x86.fr/6zw70k

But until then, I have been OCing it (Mishka is the new name I'm starting to use)

http://www.techpowerup.com/gpuz/5fkfz/ 13% and seemingly stable

http://www.techpowerup.com/gpuz/33pr2/ 15% Core, 4% memory also seemingly stable

1200MHz core locked up the PC as soon as I hit apply


----------



## igrease

Okay, so there has to be something I am doing wrong, because my frames in Battlefield 4 are not what they should be. As you may have seen in my older posts, I just upgraded from my 560 Ti. I could play Battlefield 4 on Low settings at around 70~80 FPS. Now I've put in my R9 290 and have the 14.3 drivers installed (I have also tried 14.4). On all Low settings with DX11 I get 70~80 FPS; on Ultra I get 50~64 FPS. On Mantle I get slightly less FPS. This is about how my previous 7950 handled it, and my 560 Ti outperformed an overclocked 7950 in most games. These were all on full 64v64 player servers.

What the hell is going on? I refuse to believe my CPU is holding me back. It is an i5 2500K @ 4GHz. I know it is not the best OC, but I am just afraid to push it further on this motherboard because of what happened last time.

Also, this card seems to run pretty damn hot. When I was running the Heaven bench on Extreme everything at 1600 x 900, the temps were 78C on the core, 77C VRM1 and 68C VRM2, with the fan at 85%. The card has not yet been overclocked and is at its default 977/1250. This is the MSI R9 290 GAMING edition.


----------



## Germanian

It's your CPU. Even my 4770K at its stock 3.8GHz is holding back my R9 290, and once I enable CrossFire I see an even bigger CPU bottleneck. This is in DX11 mode, not Mantle.

Hopefully 14.4 WHQL Mantle will help fix the CPU bottleneck. The best way to get rid of the bottleneck is downsampling: set the extra resolution higher than the 100% bar so you max out GPU usage.

Btw, that RAM of yours has really high timings for 1600 speed, are you sure you set that correctly?
At 1600, almost all RAM modules can do at least 9-9-9 timings


----------



## igrease

Quote:


> Originally Posted by *Germanian*
> 
> it's your CPU. Even my 4770k at 3.8Ghz stock is holding back my R9 290 and once i go crossfire mode enabled I see even bigger CPU bottleneck. This is on DX11 mode not mantle.
> 
> Hopefully 14.4 WHQL mantle will help fix bottleneck with CPU. Best way to get rid of bottleneck is by using downsampling by changing extra resolution higher than the 100% bar so you max out graphic use.
> 
> Btw that ram of yours has really high timings for 1600 speed are you sure you set that correct?
> 1600 almost all ram modules can do at least 9-9-9 timings


I just fixed the RAM issue; apparently I needed to have it set to 1600 and not Auto. It is back down to 9-9-9-24. But anyway, about the CPU: so basically you're saying that it doesn't matter what GPU I have, I will never be able to get over 80 FPS in multiplayer because of my CPU?


----------



## Faster_is_better

Quote:


> Originally Posted by *Germanian*
> 
> it's your CPU. Even my 4770k at 3.8Ghz stock is holding back my R9 290 and once i go crossfire mode enabled I see even bigger CPU bottleneck. This is on DX11 mode not mantle.
> 
> Hopefully 14.4 WHQL mantle will help fix bottleneck with CPU. Best way to get rid of bottleneck is by using downsampling by changing extra resolution higher than the 100% bar so you max out graphic use.
> 
> Btw that ram of yours has really high timings for 1600 speed are you sure you set that correct?
> 1600 almost all ram modules can do at least 9-9-9 timings


That seems ridiculous, overclocked i5 2500Ks are still quite fast, aren't they?

@igrease how is your CPU load while playing? You might want to try some other games and see if you have improvements. The 7950 should have been much better than even the 560 Ti. Hmm, although it does kind of look like a CPU bottleneck: with either GPU, putting the game on Low settings forces the most load onto the CPU, which might explain why your FPS numbers are similar even with the GPU change. You could try higher visual settings, maybe you will gain FPS (sounds crazy).


----------



## sugarhell

Quote:


> Originally Posted by *BradleyW*
> 
> I wonder how effective that heat sink will prove to be?


It's good. In the review they said that VRM temps are around 60-65C


----------



## igrease

Quote:


> Originally Posted by *Faster_is_better*
> 
> That seems ridiculous, overclocked i5 2500Ks are still quite fast, aren't they?
> 
> @igrease how is your CPU load while playing? You might want to try some other games and see if you have improvements. The 7950 should have been much better than even the 560 Ti. Hmm, although it does kind of look like a CPU bottleneck: with either GPU, putting the game on Low settings forces the most load onto the CPU, which might explain why your FPS numbers are similar even with the GPU change. You could try higher visual settings, maybe you will gain FPS (sounds crazy).


When I play Battlefield 4 my CPU is only at 66%. GPU usage is at 99%.


----------



## Talon720

Quote:


> Originally Posted by *Widde*
> 
> Anyone else having trouble with getting AB to give +200mV since the update? Mine doesnt do anything anymore


Check the first page, I've done this and it works.

Make a txt document with Notepad, then write:

Code:


CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /wi4,30,8d,10

If you have CrossFire you can use this command line to change the voltage per GPU:

Code:


MSIAfterburner.exe /sg0 /wi6,30,8d,10 /sg1 /wi6,30,8d,10

Then save it as a .bat file. Every time you run the bat file and then start MSI Afterburner, it will start with +100mV.

For the other offsets, change the last value:
For 50mV: 8
For 100mV: 10
For 125mV: 14
For 150mV: 18
For 175mV: 1C
For 200mV: 20
I think that's all right, I'm not home yet.
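For what it's worth, those hex values line up exactly with 6.25mV steps (50/6.25 = 8, 200/6.25 = 32 = 0x20), so you can work out values that aren't in the table. A quick sketch in Python; the 6.25mV step size is my own read of the table above, not anything official from MSI:

```python
# Hypothetical helper: assumes the last argument of /wiN,30,8d,XX is the
# voltage offset in 6.25 mV steps, written in hex. That assumption comes
# from the table above (50 mV -> 8, 100 mV -> 10, ..., 200 mV -> 20).
def offset_hex(mv):
    steps = mv / 6.25
    if steps != int(steps):
        raise ValueError("offset must be a multiple of 6.25 mV")
    return format(int(steps), "X")  # uppercase hex, no 0x prefix

for mv in (50, 100, 125, 150, 175, 200):
    print(f"For {mv}mV: {offset_hex(mv)}")
```

E.g. +75mV would come out as C, if the pattern holds.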


----------



## rdr09

Quote:


> Originally Posted by *igrease*
> 
> I just fixed the ram issue. Apparently I needed to have it set to 1600 and not auto. It is back down to 9-9-9-24. But anyways about the CPU. So basically saying is that it doesn't matter what GPU I have I will never be able to get over 80 fps in Multiplayer because of my CPU?


Multiplayer and not using mantle - you definitely need to oc your cpu. i recommend at least 4.5GHz.


----------



## wes1099

Does anyone know if an EK full cover water block will fit on an MSI R9 290 Gaming? If not, what water block(s) can I use?


----------



## cephelix

Quote:


> Originally Posted by *wes1099*
> 
> Does anyone know if an EK full cover water block will fit on an MSI R9 290 Gaming? If not, what water block(s) can I use?


Derrick has said that EK will be releasing a waterblock specifically for the MSI R9 290 Gaming, and hopefully it'll be available for sale on the EK webshop by Wednesday. The ETA for FrozenCPU or PPCs is still kinda sketchy.

The stock 290 block won't fit on the MSI Gaming due to the larger capacitors used


----------



## igrease

Quote:


> Originally Posted by *rdr09*
> 
> Multiplayer and not using mantle - you definitely need to oc your cpu. i recommend at least 4.5GHz.


With or without Mantle I get the same FPS. Shouldn't Mantle be helping me out a little bit?


----------



## rdr09

Quote:


> Originally Posted by *igrease*
> 
> With or without Mantle I get the same FPS. Shouldn't Mantle be helping me out a little bit?


I don't monitor my FPS ever since I started using my 290. What I do know is that I can't leave my i7 at stock with HT off when using DX11; only in Mantle can I do that. Same with BF3 MP, I have to OC my CPU.


----------



## Talon720

Quote:


> Originally Posted by *wes1099*
> 
> Does anyone know if an EK full cover water block will fit on an MSI R9 290 Gaming? If not, what water block(s) can I use?


Quote:


> Originally Posted by *cephelix*
> 
> Derrick has said that EK will be releasing a waterblock specifically for the MSI R9 290 Gaming and hopefully that it'll be available for sale on the EK webshop by Wednesday. ETA on when Frozencpu or PPCs is still kinda sketchy.
> 
> The stock 290 block wont fit on the MSI Gaming due to the larger capacitors used


On Red's AMD build thread (he's on here too) he was saying the reference EK blocks do fit on the MSI Gaming cards, and that the information about them not fitting was old. Some other people were saying this same thing to him and that was his response.


----------



## Kokin

Quote:


> Originally Posted by *igrease*
> 
> With or without Mantle I get the same FPS. Shouldn't Mantle be helping me out a little bit?


Something seems wrong. With my 7950 @ 1440p and low clocks (950/1400), I get 120~130 FPS in BF3 using low/med settings. Granted, my 3570K is at 4.7GHz, but even at stock there wasn't a big difference.

I know BF4 is a little more demanding than BF3, but you shouldn't be getting performance that low using a 290 with a 1080p 120Hz monitor.


----------



## cennis

Does anyone have any insight regarding performance running in pcie 3.0 8x vs pcie 3.0 16x at 4k or higher resolution?
for example 8x 8x 8x trifire vs 16x 8x 16x trifire.

I know the common consensus is it doesn't matter but I haven't seen testing for 4k or higher resolutions.


----------



## Paul17041993

Quote:


> Originally Posted by *cennis*
> 
> Does anyone have any insight regarding performance running in pcie 3.0 8x vs pcie 3.0 16x at 4k or higher resolution?
> for example 8x 8x 8x trifire vs 16x 8x 16x trifire.
> 
> I know the common consensus is it doesn't matter but I haven't seen testing for 4k or higher resolutions.


More lanes == lower latency, and you can get a couple frames' difference, but for the most part there shouldn't be a bottleneck even on PCIe 2.0 x8 on the main card; 2.0 x4, however, would be pushing it, so if you have PCIe 3.0 you shouldn't have to worry about anything.

4K @ 60FPS == 2GB/s, and PCIe 2.0 x4 == 2GB/s, so you can see where it becomes a problem if one card has only 2.0 x4: said card would be capped at ~60 frames, which for quadfire would be a total of ~240FPS (assuming the primary had 2.0 x16).
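That 2GB/s figure checks out if you assume each 4K frame crosses the bus uncompressed as 32-bit color at 60Hz (roughly the worst case for bridgeless CrossFire); a quick back-of-envelope sketch, using the usual ~500MB/s-per-lane figure for PCIe 2.0 after 8b/10b encoding:

```python
# Sanity check of the bandwidth math above. Assumes each 4K frame is sent
# uncompressed as 32-bit (4-byte) color, 60 times a second.
width, height, bytes_per_pixel, fps = 3840, 2160, 4, 60
frame_traffic = width * height * bytes_per_pixel * fps  # bytes per second
print(round(frame_traffic / 1e9, 2))  # ~1.99 GB/s

# PCIe 2.0 moves ~500 MB/s per lane each way after 8b/10b encoding overhead,
# so a 2.0 x4 link tops out right around the same number.
pcie2_x4_bandwidth = 4 * 500e6
print(pcie2_x4_bandwidth / 1e9)  # 2.0 GB/s
```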


----------



## cephelix

Quote:


> Originally Posted by *Talon720*
> 
> on Reds amd build thread hes on here too he was saying the reference ek blocks do fit on the msi gaming cards. that it was old information that they didnt fit. Some other people were saying this same thing to him and thats was his response.


Really? I haven't read anything about that. Could you please direct me to the info?


----------



## Talon720

Here's my latest result
Quote:


> Originally Posted by *rdr09*
> 
> you are right. silicon lottery applies to all cards. if you don't plan on using LN2, then you might as well stick with the ref you have and add the same type. what are your highest clocks on your cards? care to show some benchmarks?


Here was my best result so far. I'm not sure if it's average, above average, or below average.


----------



## wes1099

Quote:


> Originally Posted by *cephelix*
> 
> Derrick has said that EK will be releasing a waterblock specifically for the MSI R9 290 Gaming and hopefully that it'll be available for sale on the EK webshop by Wednesday. ETA on when Frozencpu or PPCs is still kinda sketchy.
> 
> The stock 290 block wont fit on the MSI Gaming due to the larger capacitors used


Thanks for the help. I saw something elsewhere saying the capacitors on the MSI card were larger but somewhere else I saw someone say they had a full cover reference water block on it, so I just wanted to clarify.

Edit: I see that Talon720 said something about reference water blocks actually do fit the MSI cards, so I will 'stay tuned' for more information.


----------



## cephelix

Quote:


> Originally Posted by *wes1099*
> 
> Thanks for the help. I saw something elsewhere saying the capacitors on the MSI card were larger but somewhere else I saw someone say they had a full cover reference water block on it, so I just wanted to clarify.


now i'm confused as well. all i know is the EK rep said that they're releasing the msi r9 290 gaming compatible full cover block tomorrow


----------



## wes1099

Quote:


> Originally Posted by *Talon720*
> 
> On Red's AMD build thread (he's on here too) he was saying the reference EK blocks do fit on the MSI Gaming cards, and that it was old information that they didn't fit. Some other people were saying the same thing to him, and that was his response.


As cephelix said, could we please have more information about that?


----------



## cennis

Quote:


> Originally Posted by *Paul17041993*
> 
> More lanes mean lower latency and you can get a couple of frames' difference, but for the most part there shouldn't be a bottleneck even on PCIe 2.0 x8 for the main card. 2.0 x4, however, would be pushing it; so if you have PCIe 3.0 you shouldn't have to worry about anything.
> 
> 4K @ 60FPS == 2GB/s and PCIe 2.0 x4 == 2GB/s, so you can see where it becomes a problem if one card has only 2.0 x4: said card would be capped at ~60 frames, which for quadfire would be a total of ~240FPS (assuming the primary had 2.0 x16).


Thanks for the reply. I'm not sure the bandwidth calculation can be done like that. If you see the link below, even four GTX 680s show a significant PCIe bandwidth bottleneck between x8 and x16 PCIe 2.0. Doesn't this mean a similar situation may arise? http://forums.anandtech.com/showthread.php?t=2238947

Also, another user seems to be affected on surround setups, which is less bandwidth than a single 4K:
http://hardforum.com/showpost.php?p=1038589760&postcount=247


----------



## Paul17041993

Quote:


> Originally Posted by *cennis*
> 
> thanks for the reply. Im not sure if the bandwidth calculation can be done as so.
> If you see the below link, even (4) gtx 680s show a signifcant pcie bandwidth bottleneck between x8 and x16 pcie 2.0. Doesnt this mean a similar situation may arise ?http://forums.anandtech.com/showthread.php?t=2238947
> 
> also another user. seems to be affecting surround setups, which is less than a single 4k
> http://hardforum.com/showpost.php?p=1038589760&postcount=247


NVIDIA hardware is excluded here and uses a different system entirely. They use a mixed system where displays are intended to be attached to multiple cards, not just the primary, which means buffers are copied between each card individually for every frame rendered, along with larger amounts of control data; so for such setups they require an even distribution of PCIe lanes to each and every card.

To add to that, hardware PhysX will make it even worse; there's a reason why NVIDIA discontinued support for quad-SLI.


----------



## igrease

Quote:


> Originally Posted by *Kokin*
> 
> Something seems wrong. With my 7950 @ 1440p and low clocks (950/1400), I get 120~130 FPS for BF3 using low/med settings. Granted my 3570K is at 4.7GHz, but even at stock, there wasn't a big difference.
> 
> I know BF4 is a little more demanding than BF3, but you shouldn't be getting that low of a performance using a 290 with a 1080 120Hz monitor.


So far there are two maps where I actually get 120fps on Low. Golmud Railway and Rogue Transmission. Every other map is around 60 ~ 80. Though in Blade and Soul I practically doubled my fps. League of Legends still likes to hover between 70 fps and 110. Not sure why. Would overclocking my CPU to 4.5Ghz really make such a dramatic difference?


----------



## Kokin

Quote:


> Originally Posted by *igrease*
> 
> So far there are two maps where I actually get 120fps on Low. Golmud Railway and Rogue Transmission. Every other map is around 60 ~ 80. Though in Blade and Soul I practically doubled my fps. League of Legends still likes to hover between 70 fps and 110. Not sure why. Would overclocking my CPU to 4.5Ghz really make such a dramatic difference?


It couldn't hurt to, but in most tests I've seen, OCing the CPU only helps increase minimum FPS, not so much the overall FPS, unless your CPU was really bottlenecking.

I'm getting a constant 120FPS for League of Legends, otherwise it swings around 200~300 if I'm not using Vsync. Also, I recommend using full screen over windowed or windowed/borderless, as that has brought down FPS for me in the past. I find it strange that you only get 70~110, when my resolution is much larger than yours and my GPU is also a lot weaker.


----------



## Roboyto

Quote:


> Originally Posted by *igrease*
> 
> Okay, so there has to be something I am doing wrong. So my frames in Battlefield 4 are not what they should be. As you may have seen my older posts, I just upgrade from my 560 Ti. I could play Battlefield 4 on Low settings at around 70 ~ 80 FPS. So now I put in my R9 290 and have the 14.3 drivers installed (I have tried 14.4). On all low settings with dx11, I get 70 ~ 80 FPS, on ultra I get 50 ~ 64 FPS. On Mantle, I get slightly less FPS. This is kind of how my previous 7950 handled. My 560 Ti outperformed an overclocked 7950 in most games. These were all on full 64v64 player servers. What the hell is going on? I refuse to believe my CPU is holding me back. It is a i5 2500k @ 4ghz. I know it is not the best OC but I am just afraid to push it further on this Motherboard because of what happened last time. Also, this card seems to run pretty damn hot. When I was running the Heaven Bench on Extreme everything 1600 x 900 the temps were 78c on the core, 77c vm1 and 68c vm2. The fan was at 85%. The card has not yet been overclocked and it at it's default 977/1250. This is the MSI r9 290 GAMING edition.


Quote:


> Originally Posted by *rdr09*
> 
> Multiplayer and not using mantle - you definitely need to oc your cpu. i recommend at least 4.5GHz.


^ What he said, and with a 2500k 4.5GHz should be a cake walk.

RAM speed plays a very important role in BF4, if you are only running 1600 you can benefit greatly from higher RAM clocks. 2133 is about the sweet spot for price/performance ratio.

See Gunderman456 Hawaiian Heat Wave post #122:

http://www.overclock.net/t/1442038/build-log-the-hawaiian-heat-wave/120


----------



## Talon720

Quote:


> Originally Posted by *cephelix*
> 
> now i'm confused as well. all i know is the EK rep said that they're releasing the msi r9 290 gaming compatible full cover block tomorrow


Quote:


> Originally Posted by *cephelix*
> 
> Really? I haven't read anything about that. Could you please direct me to the info?



Quote:


> Originally Posted by *wes1099*
> 
> Thanks for the help. I saw something elsewhere saying the capacitors on the MSI card were larger but somewhere else I saw someone say they had a full cover reference water block on it, so I just wanted to clarify.
> 
> Edit: I see that Talon720 said something about reference water blocks actually do fit the MSI cards, so I will 'stay tuned' for more information.



Ok http://www.overclock.net/t/1473361/amd-high-performance-project-by-red1776/100

image.jpg 25k .jpg file


image.jpg 39k .jpg file

If you go to the thread you'll see mega man and red1776 talking about it


----------



## Talon720

Also, I picked up a third card, a reference Sapphire 290X, for $400, despite trying to talk myself out of it (which didn't work). I lost out on a $375 Hynix one and I'm still bummed about that. I don't know what kind of memory this one has, though. Supposedly it's never been used, so that's something!


----------



## cennis

Quote:


> Originally Posted by *Paul17041993*
> 
> nvidia hardware is excluded and uses a different system entirely, they use a mixed system where displays are intended to be attached to multiple cards and not just the primary, which means buffers are copied between each card individually for every frame rendered, also including larger amounts of control data, so for said setups they require an even distribution of PCIe lanes to each and every card.
> 
> to add to that, hardware physx will make it even worse, there's a reason why nvidia discontinued support for quad-SLI.




Thanks haven't seen this anywhere on the web before!


----------



## King4x4

Anybody tried crossfiring the new 295x2 with another 290x? or maybe 2?


----------



## cephelix

Quote:


> Originally Posted by *Talon720*
> 
> Ok http://www.overclock.net/t/1473361/amd-high-performance-project-by-red1776/100
> 
> image.jpg 25k .jpg file
> 
> 
> image.jpg 39k .jpg file
> 
> If you go to the thread you'll see mega man and red1776 talking about it


Oh, that is interesting indeed. Will have to take a look at my card and work from there


----------



## kizwan

Quote:


> Originally Posted by *heroxoot*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roboyto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *INCREDIBLEHULK*
> 
> I believe thats my point.
> Smoke did 1500 on a ref 290x....... So whats the point in the premium for the lightning if nobody has done much more than those numbers? Seems if a reference makes more sense in that situation, unless you want to pay extra for a sweet cooler you have to take off
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hoping MSI has picked chips that are better performers. Otherwise you rely on the lottery?
> 
> 
> 
> 
> 
> The lightnings are normally better binned. My 7970 lightning had lower than normal lighting voltage.
> 
> On another note. MSI said that I should get an email with tracking between today and tomorrow for my 290 G series replacement. It seems like a fair replacement for my 7970 lightning as they don't make a 280x lightning. And the performance differences clock for clock should be substantial.

Are you sure it's not April Fools?








Quote:


> Originally Posted by *BradleyW*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 290 vaporx pcb. The vrm area is interesting
> 
> 
> 
> 
> 
> 
> 
> I wonder how effective that heat sink will prove to be?

I'm wondering the same thing. I have my doubts, though.
Quote:


> Originally Posted by *igrease*
> 
> Okay, so there has to be something I am doing wrong. So my frames in Battlefield 4 are not what they should be. As you may have seen my older posts, I just upgrade from my 560 Ti. I could play Battlefield 4 on Low settings at around 70 ~ 80 FPS. So now I put in my R9 290 and have the 14.3 drivers installed (I have tried 14.4). On all low settings with dx11, I get 70 ~ 80 FPS, on ultra I get 50 ~ 64 FPS. On Mantle, I get slightly less FPS. This is kind of how my previous 7950 handled. My 560 Ti outperformed an overclocked 7950 in most games. These were all on full 64v64 player servers. What the hell is going on? I refuse to believe my CPU is holding me back. It is a i5 2500k @ 4ghz. I know it is not the best OC but I am just afraid to push it further on this Motherboard because of what happened last time. Also, this card seems to run pretty damn hot. When I was running the Heaven Bench on Extreme everything 1600 x 900 the temps were 78c on the core, 77c vm1 and 68c vm2. The fan was at 85%. The card has not yet been overclocked and it at it's default 977/1250. This is the MSI r9 290 GAMING edition.
> 
> 
> Spoiler: Warning: Spoiler!


I doubt it's a CPU bottleneck with one 290. What screen resolution are you playing BF4 at? (Never mind, I saw you have 1080p @ 120Hz.) Download Open Hardware Monitor and run it in the background while playing BF4. Then go to View >> Show Plot and select the CPU cores (Load). There you can see whether the CPU is the bottleneck or not.

Did you use DDU to completely uninstall the previous drivers?


----------



## Red1776

Quote:


> Originally Posted by *cephelix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Talon720*
> 
> Ok http://www.overclock.net/t/1473361/amd-high-performance-project-by-red1776/100
> 
> image.jpg 25k .jpg file
> 
> 
> image.jpg 39k .jpg file
> 
> If you go to the thread you'll see mega man and red1776 talking about it
> 
> 
> 
> Oh, that is interesting indeed. Will have to take a look at my card and work from there

My board, with picture/serial/model etc., is listed as a 'reference' PCB (or as following reference), so I am starting to wonder myself; while the layout is identical, I am reading some yea and some nay. I was told my blocks cleared customs today, but now I am starting to wonder when the larger caps were added, and whether they were missed or some of the MSI 290X Gaming series were not listed correctly.


----------



## cephelix

Hmmm..is there any way to check without having to take the card apart?


----------



## kizwan

Quote:


> Originally Posted by *Red1776*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *cephelix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Talon720*
> 
> Ok http://www.overclock.net/t/1473361/amd-high-performance-project-by-red1776/100
> 
> image.jpg 25k .jpg file
> 
> 
> image.jpg 39k .jpg file
> 
> 
> If you go to the thread you'll see mega man and red1776 talking about it
> 
> 
> 
> Oh, that is interesting indeed. Will have to take a look at my card and work from there
> 
> 
> 
> 
> 
> My board with picture/serial/model etc is listed as a 'reference' PCB (or follows reference) I am starting to wonder myself. while the layout is identical, I am reading some yea and some nay. My blocks cleared customs today I was told but now I am starting to wonder when the larger caps were added and if they were missed or some of the MSI 290X Gamer series were not listed correctly.

EK also said the Gaming card is a reference card but with bigger caps, if I'm not mistaken the ones near VRM1. I'll have to check this again.


----------



## rdr09

Quote:


> Originally Posted by *Talon720*
> 
> heres my latest result
> Here was my best result so far. Im not sure if its average above average or below average.


yah, lightning will easily beat those numbers. even on air.

my 290 will, too.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *King4x4*
> 
> Anybody tried crossfiring the new 295x2 with another 290x? or maybe 2?


No, and definitely never will. Waaaaaaay too much paper









BTW Has anyone here cracked 19k on 3D Mk 11 using 290 ???











http://www.3dmark.com/3dm11/7725420


----------



## sugarhell

Wow 1540 ref 290x ln2

http://hwbot.org/submission/2530799_moonman_unigine_heaven___xtreme_preset_radeon_r9_290x_5504.9_dx11_marks


----------



## Norse

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> No and def never will . Waaaaaaay to much paper


Paper? More like power usage and heat... it'd be roughly like having a 1kW fan heater


----------



## rdr09

Quote:


> Originally Posted by *sugarhell*
> 
> Wow 1540 ref 290x ln2
> 
> http://hwbot.org/submission/2530799_moonman_unigine_heaven___xtreme_preset_radeon_r9_290x_5504.9_dx11_marks


single channel. lol

wow, 770W.


----------



## Dasboogieman

Quote:


> Originally Posted by *igrease*
> 
> Okay, so there has to be something I am doing wrong. So my frames in Battlefield 4 are not what they should be. As you may have seen my older posts, I just upgrade from my 560 Ti. I could play Battlefield 4 on Low settings at around 70 ~ 80 FPS. So now I put in my R9 290 and have the 14.3 drivers installed (I have tried 14.4). On all low settings with dx11, I get 70 ~ 80 FPS, on ultra I get 50 ~ 64 FPS. On Mantle, I get slightly less FPS. This is kind of how my previous 7950 handled. My 560 Ti outperformed an overclocked 7950 in most games. These were all on full 64v64 player servers. What the hell is going on? I refuse to believe my CPU is holding me back. It is a i5 2500k @ 4ghz. I know it is not the best OC but I am just afraid to push it further on this Motherboard because of what happened last time. Also, this card seems to run pretty damn hot. When I was running the Heaven Bench on Extreme everything 1600 x 900 the temps were 78c on the core, 77c vm1 and 68c vm2. The fan was at 85%. The card has not yet been overclocked and it at it's default 977/1250. This is the MSI r9 290 GAMING edition.


Actually, do you know if your card is actually running at 977MHz when playing the various maps on Low settings?
The reason I ask is that the PowerPlay algorithm on the 290 tends to downclock the core at a certain performance level (especially with Vsync). Basically, the card would rather downclock to 667MHz or so and provide 60-ish FPS than stay at 977MHz and provide 80+.
You can test this by using RadeonPro to create a custom profile for BF4 that forces the GPU to use max clocks when the game is running.

I also second the potential CPU bottleneck; that said, I would hold off on that judgement until you verify the GPU is actually running at full power.


----------



## Norse

Silly question but any harm in mixing two makes of cards? ie Sapphire and Asus 290X reference?


----------



## igrease

Quote:


> Originally Posted by *kizwan*
> 
> Are you sure it's not April Fools?
> 
> 
> 
> 
> 
> 
> 
> 
> I'm wondering the same thing too. I doubt though.
> I doubt it's CPU bottleneck with one 290. Screen resolution when playing BF4? (Never mind I saw you have [email protected]) Download Open Hardware Monitor, run it in the background while playing BF4. Then go to View >> Show Plot. Select CPU cores (Load). There you can see whether CPU is bottleneck or not.
> 
> Did you use DDU to completely uninstall previous drivers?


I did use DDU to completely remove my Nvidia drivers and the previous 14.4 drivers I had. I am currently using 14.3. Not sure if I did anything wrong, but nothing is appearing in the plot window.

Here is what is going on in Battlefield 4. The only way to get my GPU usage to 99% is to ramp the Resolution Scale all the way up to 200%, which obviously tanks my FPS hard. On standard Ultra settings with post-process antialiasing off it hovers around 67%; on all Low settings it hovers around 55%. Yet it still says 977MHz on the core. And if you don't know already, Mantle actually decreases my performance and I get lots of stuttering and hiccups. Should I try one of the 13.x stable drivers?


Spoiler: Warning: Spoiler!


----------



## rdr09

Quote:


> Originally Posted by *igrease*
> 
> I did use DDU to completely remove my Nvidia drivers and the previous 14.4 drivers I had. I am currently using 14.3. Not sure if I did anything wrong but nothing is appearing the PLOT window.
> 
> Here is what is going on while in Battlefield 4. The only way to make my GPU usage 99% is when I ramp the Resolution Scale all the way up to 200%. Obviously that tanks my FPS hard. On standard Ultra settings with Antialiasing Post OFF, It hovers around 67%. On all low settings is hovers around 55%. Yet it still says 977 on the core. And if you don't know already, Mantle actually decreases my performance and I get lots of stuttering and hiccups. Should I try one of the 13.xxx Stable Drivers?
> 
> 
> Spoiler: Warning: Spoiler!


Look at your CPU usage across all cores: the max value when the system is at load.


----------



## Woundingchaney

I have never had an issue with mixing brand name reference cards.


----------



## Norse

Quote:


> Originally Posted by *Woundingchaney*
> 
> I have never had an issue with mixing brand name reference cards.


perfect thanks


----------



## kizwan

Quote:


> Originally Posted by *igrease*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Are you sure it's not April Fools?
> 
> 
> 
> 
> 
> 
> 
> 
> I'm wondering the same thing too. I doubt though.
> I doubt it's CPU bottleneck with one 290. Screen resolution when playing BF4? (Never mind I saw you have [email protected]) Download Open Hardware Monitor, run it in the background while playing BF4. Then go to View >> Show Plot. Select CPU cores (Load). There you can see whether CPU is bottleneck or not.
> 
> Did you use DDU to completely uninstall previous drivers?
> 
> 
> 
> 
> 
> 
> 
> I did use DDU to completely remove my Nvidia drivers and the previous 14.4 drivers I had. I am currently using 14.3. Not sure if I did anything wrong but nothing is appearing the PLOT window.
> 
> Here is what is going on while in Battlefield 4. The only way to make my GPU usage 99% is when I ramp the Resolution Scale all the way up to 200%. Obviously that tanks my FPS hard. On standard Ultra settings with Antialiasing Post OFF, It hovers around 67%. On all low settings is hovers around 55%. Yet it still says 977 on the core. And if you don't know already, Mantle actually decreases my performance and I get lots of stuttering and hiccups. Should I try one of the 13.xxx Stable Drivers?
> 
> 
> Spoiler: Warning: Spoiler!

You didn't select/tick the "CPU Core #X" under "Load" section. You can see in the main window, there is a little box next to the sensor name.

Example,


----------



## igrease

Quote:


> Originally Posted by *rdr09*
> 
> look at your cpu usage across all cores? the max value when the system is at load.


Okay, well, when playing BF4, OHM shows the cores hovering around 70~80%. I just switched to the latest 13.x drivers and nothing has changed. When I play on Low it hovers around 55% GPU usage; when I crank the resolution scale up it goes to around 94%, but then the FPS drop again.


----------



## Jflisk

Quote:


> Originally Posted by *Norse*
> 
> Silly question but any harm in mixing two makes of cards? ie Sapphire and Asus 290X reference?


Nope, but as far as clock settings go it will run at the lowest-clocked card's speeds.


----------



## kizwan

Quote:


> Originally Posted by *igrease*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> look at your cpu usage across all cores? the max value when the system is at load.
> 
> 
> 
> 
> 
> 
> 
> Okay well when playing BF4 on OWM the cores hover around 70% ~ 80%. Just switched to the latest 13.x Drivers. Nothing has changed. When I play on Low it hovers around 55% GPU usage, when I crank the *resolution scale up it goes up to around 94% but then the fps drop again*.

That's basically what you should get. When playing at 200% resolution scale on a 1080p monitor, the GPU is actually rendering the game at 2160p (4K) resolution. This of course will lower FPS, but it also lowers CPU load, so it's a good solution if you're experiencing a CPU bottleneck with a 1080p monitor. Did you use the same resolution scale with the NVIDIA GPU?
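The resolution-scale arithmetic above is easy to check (a sketch: the scale applies to each axis, so the pixel count, and hence GPU load, grows with the square of the scale):

```python
def scaled_resolution(width, height, scale_percent):
    """BF4-style resolution scale renders each axis at scale/100 of native."""
    f = scale_percent / 100
    return int(width * f), int(height * f)

w, h = scaled_resolution(1920, 1080, 200)
print(w, h)                        # 3840 2160, i.e. 4K
print((w * h) / (1920 * 1080))     # 4.0x the pixels to shade per frame
```

Four times the pixels per frame is why GPU usage finally pins at 99% there while the CPU load drops off.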


----------



## Jflisk

Quote:


> Originally Posted by *igrease*
> 
> I did use DDU to completely remove my Nvidia drivers and the previous 14.4 drivers I had. I am currently using 14.3. Not sure if I did anything wrong but nothing is appearing the PLOT window.
> 
> Here is what is going on while in Battlefield 4. The only way to make my GPU usage 99% is when I ramp the Resolution Scale all the way up to 200%. Obviously that tanks my FPS hard. On standard Ultra settings with Antialiasing Post OFF, It hovers around 67%. On all low settings is hovers around 55%. Yet it still says 977 on the core. And if you don't know already, Mantle actually decreases my performance and I get lots of stuttering and hiccups. Should I try one of the 13.xxx Stable Drivers?
> 
> 
> Spoiler: Warning: Spoiler!


Stupid question: have you shut off ULPS on the card? Look up turning off ultra-low power state; there are programs that will do it for you, or you can search the registry and make the changes yourself.

As far as Mantle goes, try the 14.3 V1 beta.

http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
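For reference, the ULPS registry change being described boils down to one value. A minimal sketch (assumptions: the class GUID shown is the standard Windows display-adapter class, but the `0000`/`0001` instance subkeys vary per system and per card, so search the registry for `EnableUlps` and change every hit that belongs to your Radeon cards rather than trusting this exact path):

```
Windows Registry Editor Version 5.00

; Disable ULPS (ultra-low power state) on one display-adapter instance.
; The 0000 subkey here is an example; your cards may sit under 0001, 0002, etc.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Reboot after applying so the driver picks up the change.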


----------



## Brian18741

Quote:


> Originally Posted by *Roboyto*
> 
> I can also confirm that ASUS will honor their warranty once HSF is removed. I recently RMAd my DC2 7870 and had no issues with the claim after it being under water. Just be certain there is no damage to the card and everything is back to stock configuration.


Interesting. See the response I got below when I emailed them.
Quote:


> Dear Brian,
> 
> Thank you for contacting Asus Customer Support.
> 
> Please be informed that we have certain exclusions from the warranty service. The warranty will not apply if the product has been tampered, repaired and/or modified by non-authorized personnel.
> 
> If you replace a part of the product this will indeed void the warranty of the product.
> 
> We are sorry for the inconvenience caused.
> 
> Please do not hesitate to contact us again for any further questions.
> 
> Kind Regards,
> 
> Kiona A
> ASUS UK
> 
> Original Message
> 
> From : brian
> Sent : 14-3-2014 16:46:08
> To : "[email protected]"
> Subject : Satisfaction-IRL(EN) : Waterblocks and Warranty
> 
> [CASEID=WTM20140315004607987]
> 
> Apply date : 2014/03/14 16:46:07(UTC Time)
> 
> Subject : Waterblocks and Warranty
> Topic : 3. Others
> Description :
> Hi Guys,
> 
> If I were to replace the Direct CU II cooler on an R9 290 with a watercooling block
> would that void my warranty?
> 
> In the event of an RMA I would obviously replace the stock cooler. If there was NO
> damage as a result of the cooler swap, would warranty hold up?
> 
> Regards,
> 
> Brian.


----------



## Norse

Quote:


> Originally Posted by *Jflisk*
> 
> Nope but it will go by the lowest clocked card. As far as the clock settings.


Both cards would be reference so same clock speeds


----------



## igrease

Quote:


> Originally Posted by *kizwan*
> 
> You didn't select/tick the "CPU Core #X" under "Load" section. You can see in the main window, there is a little box next to the sensor name.
> 
> Example,


Wow, I didn't even notice the little check boxes. I had it on my second monitor and haven't fixed the color settings yet, so it blended in.

So I guess I am being bottlenecked.


----------



## Roboyto

Quote:


> Originally Posted by *Brian18741*
> 
> Interesting. See the response I got below when I emailed them.


UK policies must be different than USA. I have had a couple DC2 cards and none of them had tamper stickers on the HSF screws.

I know XFX does this as well. Both my 290's had the tamper stickers on them. However, for North American market, warranty is not voided by removing the HSF as long as no damage is done to the card and it is returned to stock configuration if you need to RMA.


----------



## rdr09

Quote:


> Originally Posted by *igrease*
> 
> Okay well when playing BF4 on OWM the cores hover around 70% ~ 80%. Just switched to the latest 13.x Drivers. Nothing has changed. When I play on Low it hovers around 55% GPU usage, when I crank the resolution scale up it goes up to around 94% but then the fps drop again.


I was referring to that screenie you included. Was that HWiNFO64? One core showing 100% load and the rest all above 90%. Not good.

Edit: but the thing is . . . Mantle should help in 14.3. It may not show in the graphs, but at least in actual gameplay.


----------



## igrease

Quote:


> Originally Posted by *rdr09*
> 
> i was referring to that screenie you included. was that Hwinfo64? one core showing 100% load and the rest were all above 90%. not good.
> 
> edit: but that thing is . . . mantle should help in 14.3. it may not show in the graphs but at least in actual gameplay.


Except it isn't. Mantle just causes hiccups and stuttering with worse performance. I don't understand.


----------



## rdr09

Quote:


> Originally Posted by *igrease*
> 
> Except it isn't. Mantle just cause hiccups and stuttering with worse performance. I don't understand.


In some systems it still does; in others, like mine, it works. Not sure what's up, but at the very least DX11 should not give you any of these issues if your CPU is just OC'd a bit more.

One thing, though: I never use third-party cleaners like DDU and Driver Sweeper. But then I never went from green to red either.


----------



## Jflisk

Quote:


> Originally Posted by *igrease*
> 
> Except it isn't. Mantle just cause hiccups and stuttering with worse performance. I don't understand.


Okay, is that with the newest beta driver from AMD, the 14.3 V1 I gave the link to? I had stuttering with the initial driver; the one mentioned here seems to fix all of that. I can't see any of your links. Have you checked your VRM temps? If the VRMs or core are too hot for any reason you will get wild results with these cards. Never mind, just saw the info:

temps were 78C on the core, 77C VRM1 and 68C VRM2 with the fan at 85%. That looks close to right under air.


----------



## Brian18741

Quote:


> Originally Posted by *Roboyto*
> 
> UK policies must be different than USA. I have had a couple DC2 cards and none of them had tamper stickers on the HSF screws.
> 
> I know XFX does this as well. Both my 290's had the tamper stickers on them. However, for North American market, warranty is not voided by removing the HSF as long as no damage is done to the card and it is returned to stock configuration if you need to RMA.


I've heard of this difference between US and EU policies before; how do they justify it?!


----------



## Roboyto

Quote:


> Originally Posted by *Brian18741*
> 
> I've heard of this difference in US and EU policies before, how do they justify it?!


Hell if I know! It doesn't make any sense really...all the waterblock manufacturers are in Europe...so it would seem that you folks across the pond are quite fond of watercooling your PC equipment


----------



## bond32

Anyone have the bone stock AMD manufacturer reference? When I get funds, I want one of those.


----------



## Talon720

Quote:


> Originally Posted by *rdr09*
> 
> yah, lightning will easily beat those numbers. even on air.
> 
> my 290 will, too.


Hmm, well, unless your CPU score is way higher than mine. Are you using an Ivy-E? So far I can only get my 4770K to 4.4GHz; I need to delid, that heat's killing me. I wonder why I'm having trouble breaking 10k, or getting beaten by a 290. I still picked up another reference 290X; maybe if I can find a cheaper Lightning or a deal on an EK block... $850 is a lot, lol. You did run Fire Strike Extreme and not Performance, right? Because Performance gives higher scores. Also, what driver are you using? I think I used 14.3 with PT1 for that run. Does 13.12 give a significant performance advantage in benchmarks? Can you or anyone else post some 3DMark Fire Strike Extreme scores?


----------



## Faster_is_better

Quote:


> Originally Posted by *Brian18741*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roboyto*
> 
> I can also confirm that ASUS will honor their warranty once HSF is removed. I recently RMAd my DC2 7870 and had no issues with the claim after it being under water. Just be certain there is no damage to the card and everything is back to stock configuration.
> 
> 
> 
> Interesting. See the below response I got when I emailed them.
> Quote:
> 
> 
> 
> Dear Brian,
> 
> Thank you for contacting Asus Customer Support.
> 
> Please be informed that we have certain exclusions from the warranty service. The warranty will not apply if the product has been tampered, repaired and/or modified by non-authorized personnel.
> 
> If you replace a part of the product this will indeed void the warranty of the product.
> 
> We are sorry for the inconvenience caused.
> 
> Please do not hesitate to contact us again for any further questions.
> 
> Kind Regards,
> 
> Kiona A
> ASUS UK
> 
> Original Message
> 
> From : brian
> Sent : 14-3-2014 16:46:08
> To : "[email protected]"
> Subject : Satisfaction-IRL(EN) : Waterblocks and Warranty
> 
> [CASEID=WTM20140315004607987]
> 
> Apply date : 2014/03/14 16:46:07(UTC Time)
> 
> Subject : Waterblocks and Warranty
> Topic : 3. Others
> Description :
> Hi Guys,
> 
> If I were to replace the Direct CU II cooler on an R9 290 with a watercooling block
> would that void my warranty?
> 
> In the event of an RMA I would obviously replace the stock cooler. If there was NO
> damage as a result of the cooler swap, would warranty hold up?
> 
> Regards,
> 
> Brian.
> 

Another thing to consider: support will always cover themselves. Unless the manufacturer specifically states somewhere that they allow you to modify your card, general support inquiries are probably going to get that response back. If you are careful with your card when removing the cooler, and you can somehow save those tamper stickers, then if at some point you have to RMA, as long as you can return it to stock configuration they probably aren't going to know unless you tell them.

Speaking of the international warranty differences, I thought EU usually got screwed on these (as far as limitations), but come to think of it EU does have some better consumer protection laws for some things.


----------



## Talon720

Quote:


> Originally Posted by *bond32*
> 
> Anyone have the bone stock AMD manufacturer reference? When I get funds, I want one of those.


Amazon and eBay have both new and used.


----------



## Durvelle27

Quote:


> Originally Posted by *bond32*
> 
> Anyone have the bone stock AMD manufacturer reference? When I get funds, I want one of those.


I do


----------



## kayan

Update on overclocking my GPU:

I've gotten the card stable in MSI Afterburner with the following settings:

Core Voltage = Stock,
Power Limit = +50 (max),
Core Clock = 1110
Memory Clock = 1340



That's as far as I can get it without adding voltage. I can pass runs of Heaven, Valley, and 3DMark Firestrike (plus Extreme).

Firestrike: 9193
Extreme: 5092

GPU temps while playing BF4 and Titanfall hover around 50-57C, and around 55-60C in benchmarks.

VRM Temps according to GPU-Z are:
Benching:
VRM 1 - 59C
VRM 2 - 37C


----------



## Brian18741

Quote:


> Originally Posted by *Faster_is_better*
> 
> Another thing to consider, Support will always cover themselves. Unless the manufacturer specifically has it stated somewhere that they allow you to modify your card, then general support inquiries are probably going to get that response back. If you are careful with your card when removing the cooler, and if you can somehow save those tamper stickers, then if at some point you had to RMA, as long as you can replace it back to stock configuration, they probably aren't going to know unless you tell them.
> 
> Speaking of the international warranty differences, I thought EU usually got screwed on these (as far as limitations), but come to think of it EU does have some better consumer protection laws for some things.


Could well be the case, but I have already broken the tamper-proof stickers on both coolers.


----------



## Roboyto

Quote:


> Originally Posted by *Talon720*
> 
> Hmm, well unless your CPU score is way higher than mine, are you using an Ivy-E? So far I can only get my 4770K to 4.4GHz; I need to delid, that heat's killing me. I wonder why I'm having trouble breaking 10k or getting beaten by a 290. I still picked up another reference 290X; maybe I can find a cheaper Lightning or a deal on an EK block, $850.00 is a lot lol. You did run Firestrike Extreme and not Performance, right? Because Performance gives higher scores. Also, what driver are you using? I think I used 14.3 w/PT1 for that run. Does 13.12 give a significant performance advantage in benchmarks? Can you or anyone else post some 3DMark Firestrike Extreme scores?


*XFX R9 290 BE 1200/1500 +87mV*

*4770K 4.5GHZ 1.259V*

*16GB Dominator Platinum 2400MHz 1.65V*

*Tessellation ON*



*Tessellation OFF*


----------



## armartins

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> I believe that's my point.
> Smoke did 1500 on a ref 290X... So what's the point in the premium for the Lightning if nobody has done much more than those numbers? Seems a reference makes more sense in that situation, unless you want to pay extra for a sweet cooler you have to take off.


Two words => Coil Whine.


----------



## Mega Man

Quote:


> Originally Posted by *Brian18741*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roboyto*
> 
> UK policies must be different than USA. I have had a couple DC2 cards and none of them had tamper stickers on the HSF screws.
> 
> I know XFX does this as well. Both my 290's had the tamper stickers on them. However, for North American market, warranty is not voided by removing the HSF as long as no damage is done to the card and it is returned to stock configuration if you need to RMA.
> 
> 
> 
> I've heard of this difference in US and EU policies before, how do they justify it?!

Although I could be wrong, I'm pretty sure that by US law they have to PROVE the installation caused the damage to the product.


----------



## rdr09

Quote:


> Originally Posted by *Talon720*
> 
> Hmm, well unless your CPU score is way higher than mine, are you using an Ivy-E? So far I can only get my 4770K to 4.4GHz; I need to delid, that heat's killing me. I wonder why I'm having trouble breaking 10k or getting beaten by a 290. I still picked up another reference 290X; maybe I can find a cheaper Lightning or a deal on an EK block, $850.00 is a lot lol. You did run Firestrike Extreme and not Performance, right? Because Performance gives higher scores. Also, what driver are you using? I think I used 14.3 w/PT1 for that run. Does 13.12 give a significant performance advantage in benchmarks? Can you or anyone else post some 3DMark Firestrike Extreme scores?


at 1250 a 290X will beat this . . .

http://www.3dmark.com/3dm/2098310

that is the reason why i recommend the 290X if the user likes to bench. the chance increases with a gpu like a lightning. it applies to red and green.

edit: actually, i did not see your results. i just assumed they are low. use 13.12 for benching. the 14.x drivers will make your gpu throttle. can't recommend the leaked one either. 13.11 driver here.


----------



## Euda

I was bored and got into illustrating my Valley 1.0 results with my R9 290X @ +200mV. Temps were permanently <70°C [158°F], VRM1/VRM2 max 90°C/69°C.
Used Photoshop CC - I hope it helps someone


----------



## Roboyto

Quote:


> Originally Posted by *INCREDIBLEHULK*
> 
> I believe that's my point.
> Smoke did 1500 on a ref 290X... So what's the point in the premium for the Lightning if nobody has done much more than those numbers? Seems a reference makes more sense in that situation, unless you want to pay extra for a sweet cooler you have to take off.


Quote:


> Originally Posted by *armartins*
> 
> Two words => Coil Whine.


^ What he said ^

My reference card lets out an obvious cry from the abuse of +200mV/1.4V+ to the core. All the enhancements to the card make high clocks more likely and improve durability under duress.

1500 on a reference board must have been THE GOLDEN 290X.


----------






## DrClaw

Quote:


> Originally Posted by *Roboyto*
> 
> ^ What he said ^
> 
> My reference card lets out an obvious cry from the abuse of +200mV/1.4V+ to the core. All the enhancements to the card make it more probable for high clocks, as well as enhancing durability under duress.
> 
> 1500 on a reference board must have been THE GOLDEN 290X.


Don't worry, it's just a new song.


----------



## FuriousPop

Finally got a chance to try BF4 last night.

13.12 drivers with my 2x Sapphire Tri-X R9 290s in Crossfire... and to my surprise, BSOD...

Déjà vu all over again; I remember spending two months on my 7970s in XFire trying to fix this for BF3... you mean to tell me these issues are still going on in BF4?









ULPS disabled... everything at stock and still the same result.

What kills me is that in single-player mode it didn't flinch... no issues at all (and no other game has shown this).

Play an MP map for 5 mins (only 10 people on it) and boom, gone... BSOD.

Heat? They were at 81C (air). Surely it can't be the 3770K bottlenecking; plus that would show in other games...

I was running at 7560x1600 on high, smooth as anything, but perhaps it's not as stable in DirectX. Do I need to run it in Mantle?

Any help much appreciated.


----------



## devilhead

Quote:


> Originally Posted by *FuriousPop*
> 
> finally got a chance to try BF4 last night.
> 
> 13.12 drivers with my 2x Sapphire Tri-X R9 290's in crossfire..... and what to my surprise BSOD...
> 
> De ja vu all over again, i remember spending 2 months on my 7970's in xfire trying to fix it for BF3.... you mean to tell me these issues still going on in BF4
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ulps disabled.. everything on stock and still same result..
> 
> What kills me is in Single player mode it didn't flinch...no issues at all.. (as well as no other game has experienced this)
> 
> Play a MP map for 5 mins (only 10 ppl on it) and boom gone.....BSOD
> 
> Heat? 81 degree cel they were at (air). surely cannot be CPU bottlenecking 3770k.. plus would show in other games...
> 
> was running at 7560x1600 on high - smooth as, but perhaps its not as stable in DirectX, do i need to run it in mantle?
> 
> any help, much appreciated,


what kind of BSOD you are getting?


----------



## FuriousPop

Quote:


> Originally Posted by *devilhead*
> 
> what kind of BSOD you are getting?


Can't recall 100%; saw it just before I went to bed. Think it was 0xD1 (DRIVER_IRQL_NOT_LESS_OR_EQUAL, blah blah).

But i'll double check this tonight and repost it up here...


----------



## bond32

Quote:


> Originally Posted by *FuriousPop*
> 
> cannot recall 100%, saw it just before i went to bed. think it was a 0x0D1, (IRQ_NOT LESS THAN blah blah)
> 
> But i'll double check this tonight and repost it up here...


Maybe a RAM error? Did you set the CPU/RAM to stock? BF4 is a huge stability test in itself.


----------



## FuriousPop

Quote:


> Originally Posted by *bond32*
> 
> Maybe a ram error? Did you set the cpu/ram to stock? Bf4 is a huge stability test in itself.


Yeah, everything is at stock (default settings) at the moment... just to be safe.

Don't think it's the memory, since nothing else I have run has replicated the issue.


----------



## rdr09

Quote:


> Originally Posted by *FuriousPop*
> 
> yeah, everything is at stock (default settings) at the moment... just to be safe..
> 
> dont think it will be mem, since everything else i have run hasn't replicated the issue.


just to be safe i think you should run memtest. if not the ram, then it could be the card driver. not just the gpu but other cards installed, like a network card or a sound card. are you using a wireless card?

my last suggestion is to start a thread here . . .

http://www.overclock.net/f/17986/crash-analysis-and-debugging


----------



## Paul17041993

Quote:


> Originally Posted by *Roboyto*
> 
> ^ What he said ^
> 
> My reference card lets out an obvious cry from the abuse of +200mV/1.4V+ to the core. All the enhancements to the card make it more probable for high clocks, as well as enhancing durability under duress.
> 
> 1500 on a reference board must have been THE GOLDEN 290X.


mine doesn't emit whine at all with +200mV. Generally it's a chance thing, but the majority of cards with soft chokes don't get coil whine; ceramic/iron chokes just eliminate that chance entirely.

1500 core on a ref. PCB doesn't sound very surprising either, TBH. The colder you keep it, the less power it pulls (at the same clocks) until you hit the superconductor critical point, which is when having lots of power phases comes into play.


----------



## kizwan

Neither of my cards has coil whine at +200mV either. How many people have coil whine with their cards?


----------



## rv8000

My reference Sapphire 290 and Tri-X are whine-free; there is some moderate whine on my PCS+ at +200mV, but nothing like some of the whine on my HD 7xxx series cards.


----------



## rdr09

no coil whine here either. reference 290.

my previous 7950 and 7970 did not whine either, but my first Gigabyte 7950 did.


----------



## Widde

Getting some coil whine or some sort of feedback in my headphones when I'm getting very high fps, and also in stuff like Firestrike while pushing my max OC I can hear some coil whine from the rig itself.


----------



## DeadlyDNA

Sorry if this has been asked a bunch: does MSI Afterburner not read GPU memory usage correctly on the R9 series? I am trying to use it to monitor VRAM usage in 4K Eyefinity, but it only shows GPU2MEM in the list, and on the monitor in Metro 2033 it's saying 7.2GB used... That doesn't make any sense to me.


----------



## VulgarDisplay

I get a buzzing sound on my Tri-X whenever the fan hits 45%. Anything above or below and it's silent. Strangest thing...


----------



## pkrexer

My Sapphire 290 had some major coil whine, and it also had the black-screen issue. Replaced it with a reference ASUS 290X that has zero coil whine.


----------



## FuriousPop

Quote:


> Originally Posted by *pkrexer*
> 
> My Sapphire 290 had some major coil whine, it also had the black screen issue. Replaced it with a reference Asus 290x and has 0 coil whine.


Mine had that as well; however, once I changed driver versions it magically went away...


----------



## DeadlyDNA

Is coil whine more like a hissing sound when the cards are stressed? Because mine do that, and I only really noticed it when I went watercooling.


----------



## FuriousPop

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Is coil whine more like a hissing sound when cards are stressed? Because mine do that and i only really noticed it when i went watercooling


I remember the card in my first slot doing it when the fans were at approx. 45% or above.

But the second I put my finger on the corner of the card the sound disappeared, so I thought it was caused by the fans themselves.

If it's watercooled and still happening, I wonder if they are touching anything as well?


----------



## DeadlyDNA

Quote:


> Originally Posted by *FuriousPop*
> 
> I remember when my first slot card was doing it when the fan's were at approx 45% or above..
> 
> But the second i put my finger on the corner of the cards the sound disappeared. thought it was caused by the fans themselves.
> 
> If water cooled and still happening, wonder it they are touching anything as well?


People have said it comes from the chokes on the card? (I think.)
Mine sounds a lot like an HDD accessing data on a platter, but very lightly. I mean, when I have a system stutter, for instance, the noise stops completely for that brief time and resumes when the FPS is back up. Or better yet, it reminds me of an old CD/DVD player I used to have, which made this weird chatter when it was seeking the disc.

I know the description probably doesn't make much sense, but that's what it sounds like to me.


----------



## Talon720

Quote:


> Originally Posted by *Roboyto*
> 
> *XFX R9 290 BE 1200/1500 +87mV*
> *4770K 4.5GHZ 1.259V*
> *16GB Dominator Platinum 2400MHz 1.65V*
> 
> *Tessellation ON*
> 
> 
> 
> *Tessellation OFF*


Quote:


> Originally Posted by *rdr09*
> 
> at 1250 a 290X will beat this . . .
> 
> http://www.3dmark.com/3dm/2098310
> 
> that is the reason why i recommend the 290X if the user likes to bench. the chance increases with a gpu like a lightning. it applies to red and green.
> 
> edit: actually, i did not see your results. i just assumed they are low. use 13.12 for benching. 14 drivers will make your gpu throttle. can't recommend the leaked one either. 13.11 driver here.


Yeah, I looked back and the link I put in never came up... but yeah, based on your two scores I think I'm doing fine. Oh, I thought the PT1 BIOS doesn't throttle, at least when I looked at HWiNFO64. I'll re-bench it and see if it gets any better; I had a score of about 9600, rounded off because I can't remember exactly. Maybe this new card I have coming will be golden (crosses fingers). Definitely would still take a Lightning or a few if the price is right.


----------



## cennis

Got me an ASUS 290 DCUII.
I know ASIC is almost useless for predicting overclockability.
Mine is 83.2%, and I noticed something when overclocking.

I set 1400mV in GPU Tweak but I get 1.17V running 3DMark11, and there's artifacting at 1200 core clock. I want to get
Temps are not high, and VRM temps are monitored too.


----------



## Mercy4You

Quote:


> Originally Posted by *VulgarDisplay*
> 
> I get a buzzing sound on my tri x whenever the fan hits 45%. Anything above or below and its silent. Strangest thing...


My Tri-X has that too. It's the plastic plate vibrating at a certain speed of one or more of the three fans; by its sheer weight it's bending a little.
I resolved it by tying the end of the fan plate to my case with a shoelace. So much for my high-end system.


----------



## VulgarDisplay

Quote:


> Originally Posted by *Mercy4You*
> 
> My Tri-X has that too. It's plastic plate vibration caused at a certain speed of 1 or more of the 3 fans. By the sheer weight it's bending a little.
> I resolved it by connecting the end of the fanplate to my case with a shoolace, so much for my high end system


It's a hoopty now. lol


----------



## VulgarDisplay

What VDDC offset are you guys willing to run for a 24/7 overclock? I'm trying to keep my VRMs under 90C on my Tri-X. It stays around 80C at a +67 offset, and my fan profile isn't really that aggressive. Is +100 OK for an air-cooled card as long as the core stays cool and the VRMs are under 90C?


----------



## Euda

Quote:


> Originally Posted by *VulgarDisplay*
> 
> What VDDC offset are you guys willing to go with for a 24/7 overclock? I'm trying to keep my VRM's under 90 on my tri-x. It stays around 80C at +67 offset, and my fan profile isn't really that aggressive. Is +100 ok for an air cooled card as long as core stays cool and VRM's are under 90c?


+100 mV should be fine on air; max +150 on aftermarket solutions, I'd say.


----------



## cennis

Quote:


> Originally Posted by *VulgarDisplay*
> 
> What VDDC offset are you guys willing to go with for a 24/7 overclock? I'm trying to keep my VRM's under 90 on my tri-x. It stays around 80C at +67 offset, and my fan profile isn't really that aggressive. Is +100 ok for an air cooled card as long as core stays cool and VRM's are under 90c?


Quote:


> Originally Posted by *Euda*
> 
> +100 mV should be fine @ air, max. +150 on aftermarket solutions I'd say


Wondering what your actual voltage is under 3DMark11 stress?


----------



## Mercy4You

Quote:


> Originally Posted by *VulgarDisplay*
> 
> It's a hoopty now. lol


----------



## Schussnik

Hi,

Was just wondering if anyone has tried out the new Sapphire R9 290 Vapor-X yet, and if so, what's the feedback?

Being interested in silent/discreet running when it comes to my PC, I'm wondering if this one could be the "R9 290 of my dreams".

Cheers


----------



## Raephen

Quote:


> Originally Posted by *Mercy4You*
> 
> My Tri-X has that too. It's plastic plate vibration caused at a certain speed of 1 or more of the 3 fans. By the sheer weight it's bending a little.
> I resolved it by connecting the end of the fanplate to my case with a shoolace, so much for my high end system


MacGyver FTW!


----------



## VulgarDisplay

Quote:


> Originally Posted by *cennis*
> 
> wondering what is your actual voltage under 3dmark11 stress?


It hovers around 1.2V with lots of delightful vdroop all over the place. I imagine 1.25V is still pretty safe for a 28nm chip, just like it was for Tahiti, right?
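Since the thread keeps trading offset numbers, here's a minimal sketch of what an Afterburner-style VDDC offset works out to. The stock voltage and droop values are illustrative assumptions, not measurements from any particular card:

```python
# Hedged sketch: rough VDDC offset arithmetic as discussed above.
# stock_vddc_v and vdroop_v are assumptions, not measured values.
stock_vddc_v = 1.15   # typical Hawaii stock load voltage (assumption)
offset_mv = 100       # e.g. the +100 mV slider setting
vdroop_v = 0.05       # load-line droop under benchmark stress (assumption)

target_v = stock_vddc_v + offset_mv / 1000.0   # what the slider requests
loaded_v = target_v - vdroop_v                 # roughly what a sensor reads under load

print(f"target {target_v:.2f} V, ~{loaded_v:.2f} V under load")
```

With those assumed numbers the card would target 1.25 V and read around 1.20 V under load, which roughly matches the "hovers around 1.2 V" observation.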


----------



## Iniura

Quote:


> Originally Posted by *Roboyto*
> 
> If this all happened after changing BIOS I would think that is the problem personally. Try flashing back to stock BIOS and see if performance returns.
> 
> Memory OC on these cards is not that important. I have tested with stock RAM clock and very high overclock; up to 1700. Check my Shrunken Beast build log for the spreadsheet listing performance gains from RAM speed. On average for 3dmark, Unigine, and FFXIV benchmark it was only a 5% performance jump.
> 
> If you flash the ASUS BIOS, then you should be able to use GPU Tweak. However, I don't know if it has Aug Voltage.
> 
> You definitely didn't damage anything when installing the block correct?
> 
> Have you tried re-seating the card in the slot? Check all your other connections in the case while you're at it. This can fix all kinds of strange unexplainable issues. I know my card with XSPC backplate makes it a tight fit to get underneath the RAM clips.
> 
> Scrub drivers and try reinstalling them.
> 
> Maybe try fresh Windows install again.


I reinstalled my drivers, reseated the card, and flashed the ASUS PT1 BIOS for now. It seems a bit more stable; I can at least run 1150/1250 now with MSI AB at only +100mV.
Quote:


> Originally Posted by *Talon720*
> 
> Hmm, well unless your CPU score is way higher than mine, are you using an Ivy-E? So far I can only get my 4770K to 4.4GHz; I need to delid, that heat's killing me. I wonder why I'm having trouble breaking 10k or getting beaten by a 290. I still picked up another reference 290X; maybe I can find a cheaper Lightning or a deal on an EK block, $850.00 is a lot lol. You did run Firestrike Extreme and not Performance, right? Because Performance gives higher scores. Also, what driver are you using? I think I used 14.3 w/PT1 for that run. Does 13.12 give a significant performance advantage in benchmarks? Can you or anyone else post some 3DMark Firestrike Extreme scores?


Here's the highest score I could get with my 290 for Firestrike Extreme.

http://www.3dmark.com/3dm/2802744


----------



## Mercy4You

Quote:


> Originally Posted by *Raephen*
> 
> MaGyver FTW!










Raephen, I think you're an old gamer, like me


----------



## Euda

Quote:


> Originally Posted by *cennis*
> 
> wondering what is your actual voltage under 3dmark11 stress?


@ Valley 'Extreme HD' preset with +200 mV it's around ~1.26-1.28 V, with some spikes to >1.3 V; the max spike is 1.388 V. Temps at 67°C maximum.
Not sure for 3DMark, will test that soon.

@Stock it is 1.15 V average - 290X XFX Ref.-Board @ Accelero Hybrid | Tri-X BIOS | ASIC 78.9% | Catalyst 14.4 Beta

Additionally, I hear some mild coil whine at 800+ FPS (a high-pitched whiny noise) and at +200mV (very quiet, sounds like HDD access).



----------



## Raephen

Quote:


> Originally Posted by *Mercy4You*
> 
> 
> 
> 
> 
> 
> 
> 
> Raephen, I think you're an old gamer, like me


/sigh

I still think back to the fun I had with my 16-bit Sega Mega Drive.


----------



## arrow0309

Quote:


> Originally Posted by *Euda*
> 
> I was bored and got into illustrating my Valley 1.0-Results with my R9 290X @ +200mV - Temps were permanently <70°C [158° F], VRM1/VRM2 max. 90° C/69° C.
> Used Photoshop CC - I hope it helps someone


+Rep!

Nice work. Mine too needs a lot of VDDC offset for 1200; not a golden chip, I'd say.









http://s24.postimg.org/j01ja39ur/Clipboard01.jpg

With the "proper" ventilation you can even "survive" on stock air cooling also


----------



## Euda

Quote:


> Originally Posted by *arrow0309*
> 
> +Rep!
> 
> Nice work,


Thanks, nice that someone takes advantage of the info.









Quote:


> mine too needs a lot of vddc offset for the 1200, not a golden chip I'd say
> 
> 
> 
> 
> 
> 
> 
> 
> http://s24.postimg.org/j01ja39ur/Clipboard01.jpg
> With the "proper" ventilation you can even "survive" on stock air cooling also


Conversely, your RAM runs fine at 1500 MHz; mine starts artifacting on the desktop instantly after I set 1500 MHz. In-game, it also leads to a black screen.









BTW: I just sat down after a new bench session and created a new graphic. Contrary to the assumption of a big FPS gain from combining VRAM OC with heavy SSAA/downsampling, it becomes clear that outside of benchmarking, VRAM OC is almost useless on the Hawaii GPU. On top of that, you pay ~82% more power draw for an FPS gain of ~15% with an OC cripple like mine.









Nice setup; looks as if it should have excellent airflow!
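Euda's 82%-more-power-for-15%-more-fps point can be framed as perf per watt; a quick sketch using just the two percentages from the post (the absolute baseline numbers cancel out of the ratio):

```python
# Perf-per-watt tradeoff from the figures quoted above:
# ~15% more fps for ~82% more power draw with the heavy VRAM + core OC.
fps_gain = 0.15
power_gain = 0.82

efficiency_vs_stock = (1 + fps_gain) / (1 + power_gain)
print(f"fps per watt falls to {efficiency_vs_stock:.2f}x of stock")  # ~0.63x
```

In other words, that overclock keeps only about 63% of the card's stock efficiency, which is why it pays off for benchmark runs but not for 24/7 gaming.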


----------



## Widde

Seems like the Lightnings pack something extra compared to the regular 290Xs ^^ http://www.3dmark.com/hall-of-fame-2/

Apparently that was on water as well.


----------



## igrease

I would just like to say thanks to all who attempted to help me with my GPU issue. All my problems are gone now. I simply switched to Windows 8.1 and re-unparked my cores. Battlefield 4 now runs at 100+ FPS constantly on Ultra with no AA. Mantle works perfectly fine with no more crazy stuttering. I didn't really believe Windows 8 made such a difference, but wow...

Also could I get added to the club?


----------



## rdr09

Quote:


> Originally Posted by *igrease*
> 
> I would just like to say thanks to all who attempted to help me with my GPU issue. All my problems are gone now. I simply switched to Windows 8.1 and re-unparked my cores. Battlefield 4 now runs at 100+ Fps constantly on Ultra with no AA. Mantle works perfectly fine with no more crazy stuttering. I didn't really believe windows 8 made such a difference but wow...
> 
> Also could I get added to the club?
> 
> 
> Spoiler: Warning: Spoiler!


are you serious? could you do more testing in 64-player MP, and are you using 14.4 or an older 14.x driver?

i can get Win8 at a discount but i just don't want to reinstall everything to the drive.

anyways, i'm glad it is working for you, though i have no issue using Win7. i'm just curious 'cause it could be the reinstall of the driver and/or the game itself that is making the difference.


----------



## Frontside

Quote:


> Originally Posted by *Widde*
> 
> Seems like the Lightnings pack something extra compared to the regular 290xs ^^ http://www.3dmark.com/hall-of-fame-2/
> 
> Apparently that was on water aswell


Yeah, it packs *8 Pack*


----------



## igrease

Quote:


> Originally Posted by *rdr09*
> 
> are you serious? could you do more testing in MP 64 and are you using 14.4 or older 14 driver?
> 
> i can get Win8 at a discount but i just don't want to reinstall everything to the drive.
> 
> anyways, i'm glad it is working for you, though, i have no issue using Win7. i'm just curious 'cause it could be reinstalling the driver and/or the game itself that are making the difference.


It may be Windows 8.1 doing the trick or it may be that I reformatted and just flat out reinstalled Windows. I am using 14.4 Beta Drivers. I would love to do more testing but I won't be able to until tomorrow.


----------



## rdr09

Quote:


> Originally Posted by *igrease*
> 
> It may be Windows 8.1 doing the trick or it may be that I reformatted and just flat out reinstalled Windows. I am using 14.4 Beta Drivers. I would love to do more testing but I won't be able to until tomorrow.


no problem. please include details on your settings, including cpu, ram and such.


----------



## Jflisk

Quote:


> Originally Posted by *rdr09*
> 
> no problem, i. please include details on your settings including cpu, ram and such.


I am on 8.1 and have not had any problems with Mantle or 14.3 drivers. My specs are in my signature.


----------



## Roy360

The ASUS R9290-DC2OC-4GD5 sucks.

These cards may be an improved version, but they suck at both overclocking and mining. Can't even game on them unless you go the extra mile to cool them (VRM and core temps rocket to 90C if you don't push the fans past 70%).

I have 2 of the initial reference cards, and they hit 1175/1350 @ +30mV and +30% PowerTune.

My VisionTek reference card that I bought in March hits the same, but with +38mV and +30% PowerTune.

My two ASUS cards can't even hit 1175/1300 @ +50mV and +50% PowerTune.

What the heck, ASUS?

Either ASUS stopped caring about quality control during the mining rush, or these cards were not intended for overclocking.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Roy360*
> 
> The ASUS R9290-DC2OC-4GD5 suck.
> 
> These cards may be a improved version, but they suck at both overclocking and for mining. Can't even game with them unless you go the extra mile to cool them. (VRM and Core, rocket to 90 if you don't push the fans past 70%)
> 
> I have 2 of the initial reference cards, and they hit 1175/1350 @ +30mV and +30% powertune
> 
> My VisionTek reference card that I bought in March, hits the same, but with +38mV and +30% powertune.
> 
> My two ASUS cards can't even hit 1175/1300 @ +50mV and +50% powertune
> 
> what the heck ASUS?
> 
> Either qc at ASUS stopped caring about quality control during the mining rush, or these cards were not intended for overclocking.


When they released it I posted some pics in here of the cooler's contact (or lack thereof) with the die; not all of the heatpipes touch.

They just used the GTX 780 cooler on a Hawaii card, same as the Matrix...

ASUS is on my never-buy list for GPUs now; very disappointed in them over it.


----------



## Faster_is_better

Quote:


> Originally Posted by *igrease*
> 
> It may be Windows 8.1 doing the trick or it may be that I reformatted and just flat out reinstalled Windows. I am using 14.4 Beta Drivers. I would love to do more testing but I won't be able to until tomorrow.


I was kind of expecting it to be some sort of driver/software issue. Considering your hardware the numbers just were not matching up. Good that you got it resolved.








Quote:


> Originally Posted by *Sgt Bilko*
> 
> When they released it i posted some pics in here of the cooler's contact (or lack thereof) with the die, not all of the heatpipes touch.
> 
> they just used the GTX 780 cooler on a Hawaii card, same as the Matrix....
> 
> Asus is on my never buy list for GPU's now, very disappointed in them over it


Does that DC2 cooler problem affect the 280X version? I'm running four of them for mining and they stay well cooled considering the ambients, although at least two of them have developed a nasty fan rattle at high fan RPM if they stay at high RPM for too long. I think the bearings may dry out or something, but once they come back down in RPM they are quiet again...


----------



## Sgt Bilko

Quote:


> Originally Posted by *Faster_is_better*
> 
> I was kind of expecting it to be some sort of driver/software issue. Considering your hardware the numbers just were not matching up. Good that you got it resolved.
> 
> 
> 
> 
> 
> 
> 
> 
> Does that DC2 cooler problem affect the 280X version? I'm running 4 of them for mining and they stay well cooled considering the ambients, although at least 2 of them have developed a nasty fan rattle at high fan RPM if they stay at high RPM for too long. I think the bearings may dry out or something, but once they come back down in RPM they are quiet again...


The die on the 79xx/280X is a diamond shape rather than a square; the 280X is fine from what I know. Check out the 280X owners club, though, for a user who has them.


----------



## Forceman

Quote:


> Originally Posted by *Euda*
> 
> Thanks, nice that someone takes advantage of the info.
> 
> 
> 
> 
> 
> 
> 
> 
> Conversely, your RAM runs fine @ 1500 MHz - mine is artifacting on the desktop instantly after I set those 1500 MHz. Ingame, it also leads to a blackscreen.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Btw.: I just sat down after a new bench-session and created a new graphic. Contrary to the assumption of a big FPS-profit when using VRAM-OC and heavy-SSAA/Downsampling combined, it gets clear that besides benchmarking, VRAM-OC is almost useless on the Hawaii-GPU. In addition, you pay 82% power draw for an fps-plus of ~15% when using an OC-cripple as mine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nice setup, looks as it should have an excellent airflow!


Any chance you could check it at 1200/1250 and 1200/1400 to see if memory overclocking helps once you've overclocked the core? I tested it when I first got my card, but I can't find the results any longer. I don't think it helped there either, though.


----------



## UZ7

Breaking the card in lol.. Yeah my case is messy I know









http://www.techpowerup.com/gpuz/vuuaz/
Catalyst 14.4 (14.100.0.0)
1125/1500 (Hynix)
+50mV
+20%
Max Temp: 74C
Min Temp: 38C
Fan: Auto

Edit:
3DMark11 http://www.3dmark.com/3dm11/8239615
3DMark13 FS: http://www.3dmark.com/fs/2023682
3DMark13 FSE: http://www.3dmark.com/3dm/2897897


----------



## MapRef41N93W

So I just got my replacement Tri-X from RMA today. Strangely, though, after some benching in Heaven, Valley, and Firestrike I am seeing quite a lot of microstutter and, mostly in Heaven, actual stuttering (as in the frame freezing for an entire second). It's definitely not thermal throttling, as the cards are running at 80C/67C max during benches. I also got BSOD error a0000001 when running Firestrike, which doesn't seem to be a very common error code. This is running both cards at the Tri-X stock OC (1040/1300). I wasn't seeing any of this stuttering before with my previous Tri-X, and I thought frame pacing was supposed to have fixed this.

Any ideas on what's causing this? I'm on the WHQL drivers.

Edit: Shortly after I posted this my screen went black with a beep and all I could see was my taskbar and a very slow moving cursor. Had to manual hard shutdown.


----------



## Paul17041993

Quote:


> Originally Posted by *Widde*
> 
> Getting some coil whine or some sort of feedback in my headphones when I'm getting very high fps, and also in stuff like Firestrike; while trying to push max OC I can hear some coil whine from the rig itself


Quote:


> Originally Posted by *VulgarDisplay*
> 
> I get a buzzing sound on my tri x whenever the fan hits 45%. Anything above or below and its silent. Strangest thing...


That's generally what onboard audio does, or what happens if you have a cheap soundcard right next to the card.
Quote:


> Originally Posted by *Faster_is_better*
> 
> Does that DC2 cooler problem affect the 280X version? I'm running 4 of them for mining and they stay well cooled considering the ambients, although at least 2 of them have developed a nasty fan rattle at high fan RPM if they stay at high RPM for too long. I think the bearings may dry out or something, but once they come back down in RPM they are quiet again...


What ASUS don't tell you is how cheap their DCII cooler actually is; it's no more expensive than the reference blower.

And yeah, it generally affects the 280X too. The 79x0s (and the 280X DCII V2 and Matrix rebadges) are just as bad; the lower fan on my 7970 DCIIT is shot and only good under 40%, and ASUS's RMA is just the card sitting on a shelf for a month before they ship it back to you.


----------



## MapRef41N93W

So I've done a bunch of testing and I can't seem to find anything wrong with my fresh-out-of-RMA Tri-X. In fact, it's actually a heck of a lot better than my old card was. It runs cold as ice (maxing out at 70c in Heaven/Valley with 60ish% fan speed at +75mv, or 74c with the auto fan), and is completely stable at 1125/1500 +75mv. The only issue is that when I try to run it with my Asus, I just get massive stuttering and some weird occasional crashes and blackscreens. I don't get any stuttering with the Asus card running solo, but as I have stated on here numerous times before, I have to underclock the memory on that card to stay stable.

Sending in my Asus for RMA seems like it will accomplish nothing, as numerous users have reported on here that Asus just takes forever and then eventually sends the card back to you, and you're out $25-$30 for shipping. I'm at a complete loss as to what to do here now, as it seems like I am just totally stuck with a $600 paperweight.


----------



## DeadlyDNA

I am scared to try the 14.4 unofficial beta driver for 4K Eyefinity testing. 14.3 is a total mess, and did they fix much in 14.4, I wonder...


----------



## kzinti1

Quote:


> Originally Posted by *MapRef41N93W*
> 
> So I've done a bunch of testing and I can't seem to find anything wrong with my fresh out of RMA Tri-X. In fact, it's actually a heck of a lot better than my old card was. It runs cold as ice (maxing out at 70c in Heaven/Valley with 60ish% fan speed at +75mv, or 74c with the auto fan), and is completely stable at 1125/1500 +75mv. The only issue is when I try to run it with my Asus I just get massive stuttering and some weird occasional crashes and blackscreens. I don't get any stuttering with the Asus card running solo, but as I have stated on here numerous times before I have to underclock the memory on that card to stay stable.
> 
> Sending in my Asus for RMA seems like it will accomplish nothing as numerous users have reported on here Asus just takes forever and then eventually sends the card back to you and you're out $25-$30 for shipping. I'm at a complete loss of what to do here now as it seems like I am just totally stuck with a $600 paperweight.


Do you mean the Sapphire Tri-X OC?
I bought one and had to RMA it back to the Egg.
I couldn't even get my computer to Post with it installed.
They paid shipping but stuck me, somehow, with a restock fee.
I guess because they had nothing I trusted then to replace it.
My MSI Lightnings weren't even for sale back then.
I used to trust Sapphire, but not any more.
I found my last Sapphire, still in its box, just last week.
An X800 PRO LE, still with a huge chunk of copper water block from Danger Den mounted on it.
I had problems with that one too, but I found out it was driver related.
Still, it drove me to Nvidia for years afterwards.


----------



## heroxoot

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I am scared to try 14.4 unofficial beta driver for 4k eyefinity testing. 14.3 is a total mess and did they fix much in 14.4 i wonder....


There is one way to find out, you know. Worst case, you go back to your current driver.

As for me, I just got tracking on my 290 Gaming Edition. It will be here Tuesday this coming week. Can't wait to join the club.


----------



## Paul17041993

Quote:


> Originally Posted by *MapRef41N93W*
> 
> Sending in my Asus for RMA seems like it will accomplish nothing as numerous users have reported on here Asus just takes forever and then eventually sends the card back to you and you're out $25-$30 for shipping. I'm at a complete loss of what to do here now as it seems like I am just totally stuck with a $600 paperweight.


Your best bet is to give it a shot; I just can't expect much to come out of it. I gave up on my 7970 a few months back and just gave it away to my sister for her rig. It at least stays stable with a mild underclock, and she doesn't care about it not having the performance it should.
(It's about the same perf as a 7950 despite still-higher clocks.)


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Widde*
> 
> Seems like the Lightnings pack something extra compared to the regular 290xs ^^ http://www.3dmark.com/hall-of-fame-2/
> 
> Apparently that was on water aswell


HOF is biased towards Nvidia. If they allowed tess off, nearly all of my 3DMark subs would be in the top 10/20.


----------



## Roy360

Well, I just wasted $200 today. Why the heck did EK make their DCII block a different size?








It is taller than the reference blocks, so I can't complete my loop.

What the heck am I supposed to do now?


----------



## hotrod717

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> HOF is biased towards NVidia . If they allowed tess off nearly all of my 3D mk subs would be in the top 10 / 20


They also remove older scores and update the list quite frequently. Not really a HOF in my book if they get rid of older higher scores just to put new names on it. Kind of a joke.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Paul17041993*
> 
> your best bet is to give it a shot, I just cant expect much to come out of it, I gave up on my 7970 a few months back and just gave it away to my sister for her rig, it at least stays stable with a mild underclock and she doesn't care about it not having the performance it should.
> (its about the same perf as a 7950 despite still higher clocks)


Sounds to me like you're a good brother









Quote:


> Originally Posted by *Roy360*
> 
> well, I just wasted 200$ today. Why the heck did EK make their DCII block a different size
> 
> 
> 
> 
> 
> 
> 
> 
> It is taller than the reference blocks, so I can't complete my loop.
> 
> What the heck am I supposed to do now.
> 


Bad luck man

Quote:


> Originally Posted by *hotrod717*
> 
> They also remove older scores and update the list quite frequently. Not really a HOF in my book if they get rid of older higher scores just to put new names on it. Kind of a joke.


Pretty much. Just another form of control.


----------



## Mega Man

Quote:


> Originally Posted by *Roy360*
> 
> well, I just wasted 200$ today. Why the heck did EK make their DCII block a different size
> 
> 
> 
> 
> 
> 
> 
> 
> It is taller than the reference blocks, so I can't complete my loop.
> 
> What the heck am I supposed to do now.


huh ?


----------



## Paul17041993

Quote:


> Originally Posted by *Roy360*
> 
> well, I just wasted 200$ today. Why the heck did EK make their DCII block a different size
> 
> 
> 
> 
> 
> 
> 
> 
> It is taller than the reference blocks, so I can't complete my loop.
> 
> What the heck am I supposed to do now.


Yeah, that's an unfortunate consequence of using mixed blocks a lot of the time. Your best bet is to connect the two with hose + 90-degree fittings, or multi-angle fittings.


----------



## Roy360

Quote:


> Originally Posted by *Paul17041993*
> 
> yea that's an unfortunate consequence of using mixed blocks a lot of the time, your best bet is to connect the two with hosing+90deg fittings or multi-angle fittings.


Unfortunately, I can't see any other option. I'll have to move the ASUS card into the top slot, since my motherboard has a 3-slot gap there, and then use a boatload of 90-degree fittings. (I can't see tubing flexing that much, unless I use like a meter of it to connect the two.)

I also sent EK an email. Hopefully they anticipated this and have designed some special adapter so I can use my EK-FC R9-290X with my EK-FC R9-290X DCII blocks...

The only upside to this is that I didn't buy two DCII blocks like I originally planned.

EDIT: Maybe I could buy an EK-FC Terminal and use some G1/4 spacers to connect them?
EDIT: Never mind, didn't realize the terminal needs to make a direct connection.


----------



## DeadlyDNA

Quote:


> Originally Posted by *heroxoot*
> 
> There is one way to find out you know. Worst case you go back to your current driver.
> 
> As for me, I just got tracking on my 290 Gaming Edition. Will be here tuesday this coming week. Cannot wait to join the club.


The 14.4 leaked drivers are a bust for me; they have the exact same issues as 14.3.

Eyefinity issues, Crossfire issues, Vsync broken. Back to 13.12 WHQL. I may try 14.2 out this weekend; 14.1 was the worst driver ever.


----------



## cennis

Does anyone know if the Club3D RoyalKing 290/290X comes with Hynix or Elpida memory?
It seems to be the same as the PowerColor TurboDuo card, which doesn't have any reviews either.


----------



## bbond007

I bought a NEW Gigabyte Windforce 290x and it died the first day. Windows would not even boot.

I then sent the card back for refund and ended up buying a USED Gigabyte r9 290x Windforce from eBay.

My logic was that I'd be better off with one that had been tested thoroughly, and the used one did work fine for several weeks while I was using it for mining.

I never overclocked it and it always ran open-air and never really got much over 80c.

Anyway, one of my miners started crashing so I ran 3Dmark and noticed artifacts and video glitches everywhere.

Heaven benchmark does the exact same thing.

I rebuilt the system with a fresh Windows 8.1 and Catalyst 13.12, and the video board is still misbehaving like it's overclocked when it's not.

I saved the BIOS when I first got the card, and although I never did change the BIOS out, I went ahead and flashed it back. No effect.

I'm not sure what else to do... RMA? Has this board failed too? Any ideas? I'm so discouraged. These boards don't seem to be reliable at all, and I'm not mistreating them.

thanks!


----------



## Roboyto

Quote:


> Originally Posted by *bbond007*
> 
> I bought a NEW Gigabyte Windforce 290x and it died the first day. Windows would not even boot.
> 
> I then sent the card back for refund and ended up buying a used Gigabyte r9 290x Windforce from eBay.
> 
> My logic is that I'd be better off with one that has been tested thoroughly and the used one worked fine for several weeks and I was using it for mining.
> 
> I never overclocked it and it always ran open-air and never really got much over 80c.
> 
> Anyway, one of my miners started crashing so I ran 3Dmark and noticed artifacts and video glitches everywhere.
> 
> Heaven benchmark does the exact same thing.
> 
> I rebuilt the system with a fresh windows 8.1, & Catalyst 13.12 and the video board is still misbehaving like its overclocked when its not.
> 
> I saved the BIOS when i first got the card and although I never did change the BIOS out, I went ahead and flashed it back. No effect.
> 
> I'm not sure what else to do...RMA? Has this board failed too? Any ideas? I'm so discouraged. These boards don't seem to be reliable at all and I'm not mistreating them.
> 
> thanks!


Mining sort of is mistreating the card: running at 100% load for hours and/or days non-stop. That is an unrealistic load for the card. Weeks of mining can be equal to years of gaming.

What did the previous owner use it for? It could have been abused by the first person, and you just finished it off.


----------



## bbond007

Quote:


> Originally Posted by *Roboyto*
> 
> Mining sort of is mistreating the card...running 100% load for hours and/or days non-stop. That is unrealistic load for the card. Weeks of mining can be equal to years of gaming.
> 
> What did the previous owner use it for? It could have been abused by the first person and you just finished it off.


Perhaps he hooked a monitor to it? It was only a few weeks old, as he gave me the receipt.

Unless he loaded another BIOS on it, the card never went over 84C. Gigabyte throttles 10C cooler than the reference BIOS.

I would think the components should be rated for x number of hours of operation, and it shouldn't matter whether those hours are consecutive or not...


----------



## Arizonian

Quote:


> Originally Posted by *UZ7*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Breaking the card in lol.. Yeah my case is messy I know
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/vuuaz/
> Catalyst 14.4 (14.100.0.0)
> 1125/1500 (Hynix)
> +50mV
> +20%
> Max Temp: 74C
> Min Temp: 38C
> Fan: Auto
> 
> Edit:
> 3DMark11 http://www.3dmark.com/3dm11/8239615
> 3DMark13 FS: http://www.3dmark.com/fs/2023682
> 3DMark13 FSE: http://www.3dmark.com/3dm/2897897


Welcome - added









Quote:


> Originally Posted by *igrease*
> 
> I would just like to say thanks to all who attempted to help me with my GPU issue. All my problems are gone now. I simply switched to Windows 8.1 and re-unparked my cores. Battlefield 4 now runs at 100+ Fps constantly on Ultra with no AA. Mantle works perfectly fine with no more crazy stuttering. I didn't really believe windows 8 made such a difference but wow...
> 
> Also could I get added to the club?
> 
> 
> Spoiler: Warning: Spoiler!


I'll be glad to add you; however, I need a GPU-Z validation link with your OCN name showing, like UZ7 did above as an example, or see the OP for other valid proof.







Thanks.


----------



## Paul17041993

Quote:


> Originally Posted by *bbond007*
> 
> perhaps he hooked a monitor to it? it was only a few weeks old as he gave me the receipt.
> 
> Unless he loaded another BIOS on it, the card never went over 84c. *Gigabyte throttles 10c cooler than the reference BIOS.*
> 
> I would think the components should be rated for x number of hours of operation and it should not matter if those hours are consecutive or not...


There might be a reason behind that...

Your best bet is either an RMA, or seeing if you can underclock it until it's stable again.


----------



## arrow0309

Quote:


> Originally Posted by *UZ7*
> 
> 
> 
> 
> 
> Breaking the card in lol.. Yeah my case is messy I know
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/vuuaz/
> Catalyst 14.4 (14.100.0.0)
> 1125/1500 (Hynix)
> +50mV
> +20%
> Max Temp: 74C
> Min Temp: 38C
> Fan: Auto
> 
> Edit:
> 3DMark11 http://www.3dmark.com/3dm11/8239615
> 3DMark13 FS: http://www.3dmark.com/fs/2023682
> 3DMark13 FSE: http://www.3dmark.com/3dm/2897897


Nice, the bench results are aligned too








Here's mine (Heaven 4.0 w. 13.12) at 1150/1450:

http://s18.postimg.org/68pm4qsgn/Heaven_4_0.jpg

And another one with the 14.4 at 1180/1500:

http://s8.postimg.org/tytirsumb/Clipboard_oc2_14_4_beta.jpg

Do you guys know if the 290 Tri-X will unlock to a 290X (and if not, why not)?


----------



## Tokkan

Okay, so for the past few days I've been wondering how my 1090T will behave with the R9 290. I haven't seen anyone talking about it, and I'm in doubt whether my ~4.175GHz 1090T will be able to feed it to the point where it won't be a massive bottleneck. Google gave me only a few people asking the same question but no relevant data.
The resolution will be 1080p, and when it gets here I will do some testing on OC speeds on my Phenom: maybe some graphs going from stock speed up to 4.2GHz, plus memory speeds of 1333MHz CL7, 1600MHz CL9 and 2000MHz CL11, plus northbridge OC. I'll also disable two cores to simulate a Phenom II quad for my testing.
I also have the option to buy a year-old 3770K + MSI Z77 MPower for 250 euros; wondering if I should or not.

Anyway, I hope my findings don't disappoint me and my Phenom shows itself still capable of playing in the big league; if it isn't, I'll make a thread in either the AMD GPU section or the AMD CPU section with this info.

Anything I should take into consideration?
What benchmarks should I use for the tests I plan on doing? And what software should I use to monitor my CPU usage + GPU usage?
I'll need a log of everything to have the thread built up nicely, so any help I can get is welcome.
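For the logging side, tools like HWiNFO64 or GPU-Z can write their sensor readings out to a CSV file, and a short script can then summarize each run for the thread. A minimal sketch of that idea (the column names below are placeholders I made up; match them to the headers your log actually uses):

```python
import csv

def summarize(path, cpu_col="Total CPU Usage [%]", gpu_col="GPU Usage [%]"):
    """Read a sensor CSV log and return (avg, max) for the CPU and GPU usage columns.

    The column names are assumptions; adjust them to whatever headers
    your HWiNFO64/GPU-Z log actually contains.
    """
    cpu, gpu = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                cpu.append(float(row[cpu_col]))
                gpu.append(float(row[gpu_col]))
            except (KeyError, ValueError):
                continue  # skip malformed or trailing rows
    avg = lambda xs: sum(xs) / len(xs) if xs else 0.0
    return (avg(cpu), max(cpu, default=0.0)), (avg(gpu), max(gpu, default=0.0))
```

Running it once per benchmark/clock combination would give comparable average and peak usage numbers to graph.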


----------



## JordanTr

Hello guys, today I checked my GPU OC potential before deciding to buy a waterblock, and I noticed a strange thing: when I set my card to 1200/1500, in the Heaven benchmark my core fluctuates between 800 and 1200. However, if I set it to stock clocks it looks more stable, 850-947. Is it possible that my OCZ 750W ZT PSU can't handle a 2500K (4.2GHz) + a single (OCed) R9 290? Or is it related to the 14.3 beta drivers? When I ran the Thief benchmark, though, the core stayed at 1200 the whole time, except for one 0.5-1s drop to 1050.

P.S. Arizonian - update me to aftermarket. I did it 2 weeks after buying my stock GPU, but had no time to put my photos anywhere. Here are the photos with the Gelid Icy Vision hooked up, but it was not silent enough for me, so I decided to change the fans to Be Quiet Shadow Wings 92mm (2nd pic):


----------



## Paul17041993

Quote:


> Originally Posted by *Tokkan*
> 
> Okay, so for the past few days I've been wondering how my 1090T will behave with the R9 290. I haven't seen anyone talking about it, and I'm in doubt whether my ~4.175GHz 1090T will be able to feed it to the point where it won't be a massive bottleneck. Google gave me only a few people asking the same question but no relevant data.
> The resolution will be 1080p, and when it gets here I will do some testing on OC speeds on my Phenom: maybe some graphs going from stock speed up to 4.2GHz, plus memory speeds of 1333MHz CL7, 1600MHz CL9 and 2000MHz CL11, plus northbridge OC. I'll also disable two cores to simulate a Phenom II quad for my testing.
> I also have the option to buy a year-old 3770K + MSI Z77 MPower for 250 euros; wondering if I should or not.
> 
> Anyway, I hope my findings don't disappoint me and my Phenom shows itself still capable of playing in the big league; if it isn't, I'll make a thread in either the AMD GPU section or the AMD CPU section with this info.
> 
> Anything I should take into consideration?
> What benchmarks should I use for the tests I plan on doing? And what software should I use to monitor my CPU usage + GPU usage?
> I'll need a log of everything to have the thread built up nicely, so any help I can get is welcome.


Yeah, your 1090T should be perfectly fine for the most part; the only concern may be a couple of games like BF4 that will push it to the absolute max, but even then you'll likely still get better results than a 4-core i5 or i7.

Benchmark-wise, you can use a lot of things, though you'll likely get slightly fewer points than the Intel equivalents due to the older tech in use; definitely don't confuse that with the in-game experience.
Quote:


> Originally Posted by *JordanTr*
> 
> Hello guys, today I checked my GPU OC potential before deciding to buy a waterblock, and I noticed a strange thing: when I set my card to 1200/1500, in the Heaven benchmark my core fluctuates between 800 and 1200. However, if I set it to stock clocks it looks more stable, 850-947. Is it possible that my OCZ 750W ZT PSU can't handle a 2500K (4.2GHz) + a single (OCed) R9 290? Or is it related to the 14.3 beta drivers? When I ran the Thief benchmark, though, the core stayed at 1200 the whole time, except for one 0.5-1s drop to 1050.
> 
> P.S. Arizonian - update me to aftermarket. I did it 2 weeks after buying my stock GPU, but had no time to put my photos anywhere. Here are the photos with the Gelid Icy Vision hooked up, but it was not silent enough for me, so I decided to change the fans to Be Quiet Shadow Wings 92mm (2nd pic):
> 
> http://postimg.org/image/kpr2vzh1n/
> 
> http://postimg.org/image/n58b1oor3/


There's always a little throttling on these cards, and the 14.x betas might be particularly nasty in that regard; I've heard TriXX somewhat disables the clock variance, though.


----------



## UZ7

Quote:


> Originally Posted by *arrow0309*
> 
> Nice, the bench results are aligned too
> 
> 
> 
> 
> 
> 
> 
> 
> Here's mine (Heaven 4.0 w. 13.12) at 1150/1450:
> 
> http://s18.postimg.org/68pm4qsgn/Heaven_4_0.jpg
> 
> And another one with the 14.4 at 1180/1500:
> 
> http://s8.postimg.org/tytirsumb/Clipboard_oc2_14_4_beta.jpg
> 
> Do you guys know if (or why not) the 290 Tri-X would unlock to the 290X?


You would have to check the 290 -> 290X thread and use their info tool to determine if there is a chance it can unlock. Nowadays it's a bit rare for a brand-new card to unlock: the majority of the ones that did unlock were made when the cards were in high demand and, to make ends meet, manufacturers would use the same boards and just flash the BIOS. Now that they've caught up, they either laser-lock them or just use 290 boards. The best chances of unlocking are with cards bought as old mining gear, since those date from the time the cards were in high demand and the owners never bothered to check them (some do).


----------



## JordanTr

Quote:


> Originally Posted by *Paul17041993*
> 
> yea, your 1090T should be perfectly fine for the most part, only concern may be a couple games like BF4 that will push it to the absolute max, but even then, you'll likely still get better results than a 4-core i5 or i7.
> 
> benchmarks wise, you can use a lot of things, though you'll likely get a little less points then intel equivalents due to older tech in use etc, but definitely don't confuse this for game experience.
> there's always a little throttle on these cards, the 14.x betas might be particularly nasty in that regard, heard trixx somewhat disables the clock variance though.


Shouldn't it only throttle when it reaches 94 degrees? My OCed card doesn't go over 90, and VRM1 hits 96 max.


----------



## Paul17041993

Quote:


> Originally Posted by *JordanTr*
> 
> Shouldnt it throttle then it reaches 94 degrees? My oced card doesnt overcome 90 and vrm1 96 max.


The drivers do some estimation based on core and VRM temps, power draw, and active fan headroom-to-max, so it's quite possible what you're getting is a result of over-aggression on the driver's behalf.


----------



## Trys0meM0re

Quote:


> Originally Posted by *JordanTr*
> 
> Hello guys, today I checked my GPU OC potential before deciding to buy a waterblock, and I noticed a strange thing: when I set my card to 1200/1500, in the Heaven benchmark my core fluctuates between 800 and 1200. However, if I set it to stock clocks it looks more stable, 850-947. Is it possible that my OCZ 750W ZT PSU can't handle a 2500K (4.2GHz) + a single (OCed) R9 290? Or is it related to the 14.3 beta drivers? When I ran the Thief benchmark, though, the core stayed at 1200 the whole time, except for one 0.5-1s drop to 1050.
> 
> P.S. Arizonian - update me to aftermarket. I did it 2 weeks after buying my stock GPU, but had no time to put my photos anywhere. Here are the photos with the Gelid Icy Vision hooked up, but it was not silent enough for me, so I decided to change the fans to Be Quiet Shadow Wings 92mm (2nd pic):
> 


The power limit is borked on 14.XX except for 14.4, so try other drivers to make it work better.

Quote:


> Originally Posted by *Paul17041993*
> 
> yea, your 1090T should be perfectly fine for the most part, only concern may be a couple games like BF4 that will push it to the absolute max, but even then, you'll likely still get better results than a 4-core i5 or i7.


I have a few systems at home and one has a 1090T @ 4.1GHz, and I can say from experience that it WILL bottleneck a 290. (Sorry, didn't quote properly.)


----------



## Roboyto

Quote:


> Originally Posted by *bbond007*
> 
> perhaps he hooked a monitor to it? it was only a few weeks old as he gave me the receipt.
> 
> Unless he loaded another BIOS on it, the card never went over 84c. Gigabyte throttles 10c cooler than the reference BIOS.
> 
> I would think the components should be rated for x number of hours of operation and it should not matter if those hours are consecutive or not...


Just because my car can go 140mph doesn't mean I'm going to jump in it and drive that fast, redlining the engine until I run out of gas... refill the tank and do it again.

Some cards have no problem doing this, that is true, but it's still not good for them. Just because they can, doesn't mean you should.

You said open air? That can be a bad thing, especially if the VRMs aren't staying cool enough from other airflow. If the core was throttling at 84 like Paul said, how hot was VRM1 during all this?

You said no overclock, but from the few things I have read, people have been relying on undervolting and underclocking to keep temperatures at bay...


----------



## rdr09

If only 290s could clock as easily . . .

http://www.3dmark.com/3dm11/7684243

http://cdn.overclock.net/4/4d/4de873ba_ep5QzJ.png

http://www.3dmark.com/3dm/1895751

http://cdn.overclock.net/3/36/36f6a78e_x005XZ.png

http://www.3dmark.com/3dm/1895777

http://cdn.overclock.net/5/5d/5dd53db4_KqAXkd.png

just for comparison.


----------



## Tokkan

Quote:


> I have a few systems at home and one has a 1090T @ 4.1GHz, and I can say from experience that it WILL bottleneck a 290.


My doubt wasn't whether it will, but how severe it will be. My Phenom is running at 4175MHz with a 3.2GHz NB.
In certain scenarios it will happen with every CPU, but how hard would it be? I would like to document it for future Phenom II users.
If my Phenom II bottlenecks my 290 only in Skyrim, and in most other applications it pushes the 290's usage above 90%, I would be pretty well satisfied, because that would be money saved over the long run.
But if the 290's usage in games like BF4 is 50-60%, I would be very disappointed, because I wouldn't be able to take full advantage of the GPU and would have to buy a new motherboard/CPU, something I want to put off until later iterations of Intel/AMD CPUs.

I plan on seeing how hard the bottleneck is.
UPS is taking long, but after the weekend I should have the card in my rig and I'll start with the testing/benching.
If it plays at the settings the card can handle with smoothness, only losing about 10% performance, I'll be happy, because regardless of losing 10% it'll still be a 200% increase in GPU performance over my old 6850 Crossfire, plus increased smoothness from being a single card and not being VRAM-capped. Also, power consumption will be lower compared to the Crossfire. All in all, it should be win-win.
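One illustrative way to put a number on "how hard" the bottleneck is (my own metric, not anything official) is the fraction of logged samples where GPU usage sits below the ~90% threshold mentioned above:

```python
def bottleneck_fraction(gpu_usage_samples, threshold=90.0):
    """Fraction of logged samples where GPU usage fell below `threshold` percent.

    A high fraction during a GPU-heavy scene suggests the CPU (or engine)
    is holding the card back; a low one means the GPU itself is the limiter.
    """
    samples = list(gpu_usage_samples)
    if not samples:
        return 0.0
    return sum(1 for u in samples if u < threshold) / len(samples)
```

Fed with a per-second usage log from a monitoring tool, this gives one comparable number per game/clock combination, though sampling granularity and scene choice will obviously affect it.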


----------



## rdr09

Quote:


> Originally Posted by *Tokkan*
> 
> My doubt wasn't if it will, but how severe it will be. My Phenom is running at 4175Mhz with a 3.2Ghz NB.
> Given certain scenarios it will happen with every CPU, but how hard would it be and would like to document it for further Phenom II users.
> If my Phenom II is bottlenecking my 290 only in Skyrim and most other applications it will push the 290 usage to above 90% I would be pretty well satisfied cause that would be money I would save for the long run.
> But if the 290 usage in games like BF4 is 50%~60% I would be very disappointed because I wouldn't be able to take full advantage of the GPU and would have to buy a new motherboard/cpu, something that I want to avoid for later iterations of Intel/AMD CPU's.
> 
> I plan on seeing how hard it is bottlenecking.
> UPS is taking long but after the weekend I should have the card in my rig and I'll start with the testing/benching.
> If it plays at the settings the card can handle with smoothness and only losing about 10% performance I'll be happy because regardless of losing 10% it'll still be a 200% increase in GPU performance over my old 6850 crossfire, plus increased smoothness from being single card and not being VRAM capped. Also power consumption will be lower when compared to the crossfire. All in all should be win win.


I've used my Thuban to run a 7970 and a 7950 - not crossfired. It handled the two quite well, with no lag or apparent bottlenecking. The 290 is about 35% faster than a 7950 (my estimate). The Thuban at 4GHz pushed both of these cards to their max in BF4 and BF3 64-player MP, maxed out at 1080p.

I used HWiNFO64 to check CPU usage, and it went pretty high . . .





I wish I could test using the 290, but it is watercooled. Too much work.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Tokkan*
> 
> Okay, so the past few days I've been wondering how my 1090T will behave with the R9 290. I haven't seen anyone talking about it, and I'm in doubt whether my ~4.175 GHz 1090T will be able to feed it to the point where it won't be a massive bottleneck. Google gave me only a few people asking the same question, but no relevant data.
> The resolution will be 1080p, and when it gets here I will do some testing at OC speeds on my Phenom. Maybe some graphs going from stock speed up to 4.2 GHz, plus memory speeds of 1333 MHz CL7, 1600 MHz CL9 and 2000 MHz CL11, plus a Northbridge OC. I'll also disable two cores to simulate a Phenom II quad for my testing.
> I also have the chance to buy a year-old 3770K + MSI Z77 Mpower for 250 euros; wondering if I should or not.
> 
> Anyway, I hope my findings don't disappoint me and my Phenom shows itself still capable of playing in the big league; if it isn't, I'll make a thread in either the AMD GPU section or the AMD CPU section with this info.
> 
> Anything I should take into consideration?
> What benchmarks should I use for the test I plan on doing? And what software should I use to monitor my CPU usage + GPU usage?
> I'll need a log of everything to have the thread built up nicely so any help I can get is welcome.


Last I recall reading (and it's been a while), a single GPU was barely, if at all, held back by a CPU. I think your gaming resolution and settings would matter more. When I benchmarked my R9 290s on a single 1080p screen in many benchmarks my performance didn't really increase at all. Now that I think about it, my GTX 680s also behaved similarly on a single screen.

give it a whirl I would like to see how it works for you.


----------



## arrow0309

Quote:


> Originally Posted by *UZ7*
> 
> You would have to check the 290 -> 290X thread and use their info tool to determine if there is a chance it can unlock. Nowadays it's a bit rare for a brand-new card to unlock, since the majority of the ones that did unlock were from when the cards were in high demand; to make ends meet they would use the same boards and just flash the BIOS. Now that they've caught up, they either laser-lock them or just use the 290 boards. The best chances of unlocking are for people who buy old mining gear, since those cards are from the time demand was high and they never bothered to check them (some do).


Thanks, downloaded the utility and my GPU is obviously core-locked.
Never mind, at least I've solved one more mystery.


----------



## rdr09

Quote:


> Originally Posted by *JordanTr*
> 
> Shouldn't it throttle when it reaches 94 degrees? My OCed card doesn't go over 90, and VRM1 is 96 max.


it's 14.3 that's causing the throttling when you apply voltage at higher clocks. try 13.12 or 13.11 for benching.


----------



## bbond007

Quote:


> Originally Posted by *Paul17041993*
> 
> might be a reason behind that...
> 
> your best bet is either RMA or see if you can underclock till its stable again.


Yeah, it's really strange. I did downclock it to 1000 MHz and the artifacting is indeed reduced.

+ REP for the suggestion









I was also having some other ideas on my way to work that I can try when I get home.

Perhaps my power supply has degraded. It is an old 680 W OCZ. I have never had a problem with it, but somewhere something has changed.
Quote:


> Originally Posted by *Roboyto*
> 
> Just because my car can go 140mph, doesn't mean I'm going to jump in it and drive that fast redlining the engine, until I run out of gas..refill the tank and do it again.
> 
> Some cards have no problem doing this, that is true, but it's still not good for them. Just because they can, doesn't mean you should.


The card temps were absolutely great, and the card ran with core and VRMs in the 70°C range because I wasn't even mining a power-hungry algorithm. The whole computer was drawing under 300 watts.

I suspect the Gigabyte Windforce in particular is not a reliable product regardless of application.

just as long as we are using a car analogy:

Cars are engineered to withstand a range of operating circumstances from the owner's perspective. For example, a car might have an elderly owner who drives it infrequently; on the other hand, it could be owned by a delivery business. Ford, for example, tested the durability of their EcoBoost system by running the same engine continuously (and hard) for 150k miles. They actually transplant the engine between vehicles to complete different segments of the test.

This reminds me of the time that I bought a 660 TI from EVGA and they refused to honor the $20 rebate because of some BS (their document scanning was terrible quality and rejected my receipt twice). I said in the EVGA forum that I was unhappy and would not consider them in the future. People chimed in and said stuff to the effect that you should never consider a rebate when making a purchase decision - because you frequently don't get them. Those people were entirely missing the point.


----------



## Tokkan

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Last I recall reading (and it's been a while), a single GPU was barely, if at all, held back by a CPU. I think your gaming resolution and settings would matter more. When I benchmarked my R9 290s on a single 1080p screen in many benchmarks my performance didn't really increase at all. Now that I think about it, my GTX 680s also behaved similarly on a single screen.
> 
> give it a whirl I would like to see how it works for you.


After I get everything set up and start doing the benchmarks I'll post a link here.


----------



## JordanTr

Quote:


> Originally Posted by *rdr09*
> 
> it's 14.3 that's causing the throttling when you apply voltage at higher clocks. try 13.12 or 13.11 for benching.


But in game it didn't throttle. Btw, I just installed 14.4 (wiped the old ones with DDU, of course).


----------



## DeadlyDNA

Quote:


> Originally Posted by *rdr09*
> 
> it's 14.3 that's causing the throttling when you apply voltage at higher clocks. try 13.12 or 13.11 for benching.


YES, I experienced this while benching 4K Eyefinity. I also found I may be having a similar issue after rolling back to 13.12.
I haven't had much time to sit down and figure out my issue. I know it's not temps, but I got worse performance once I overclocked all 4 cards. It sucks because I'm trying to post benchmarks that reflect actual performance issues at 4K Eyefinity.


----------



## Faster_is_better

Quote:


> Originally Posted by *Paul17041993*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Faster_is_better*
> 
> Does that DC2 cooler problem affect the 280X version? I'm running 4 of them for mining and they stay cooled well considering the ambients. Although at least 2 of them have developed a nasty fan rattle at high fan RPM if they stay at high RPM for too long. I think the bearings may dry out or something, but once they come back down in RPM they are quiet again...
> 
> 
> 
> what ASUS don't tell you is how cheap their DCII cooler actually is; it's no more expensive than the reference blower.
> 
> and yeah, it generally affects the 280X too; the 79x0s (and 280X DCIIV2 and Matrix rebadges) are just as bad. the lower fan on my 7970 DCIIT is shot and only good below 40%. ASUS's RMA is just the card sitting on a shelf for a month before they ship it back to you.
Click to expand...

I guess I will have to do some more research if I buy into the 290 series at some point. I was under the impression the ASUS DC2 coolers were still as great as they had been in the past, but if ASUS is skimping on their coolers and RMA I may have to look to some other vendors.
Quote:


> Originally Posted by *MapRef41N93W*
> 
> So I've done a bunch of testing and I can't seem to find anything wrong with my fresh out of RMA Tri-X. In fact, it's actually a heck of a lot better than my old card was. It runs cold as ice (maxing out at 70c in Heaven/Valley with 60ish% fan speed at +75mv, or 74c with the auto fan), and is completely stable at 1125/1500 +75mv. The only issue is when I try to run it with my Asus I just get massive stuttering and some weird occasional crashes and blackscreens. I don't get any stuttering with the Asus card running solo, but as I have stated on here numerous times before I have to underclock the memory on that card to stay stable.
> 
> Sending in my Asus for RMA seems like it will accomplish nothing as numerous users have reported on here Asus just takes forever and then eventually sends the card back to you and you're out $25-$30 for shipping. I'm at a complete loss of what to do here now as it seems like I am just totally stuck with a $600 paperweight.


That's really sad to hear; DC2 coolers were always top notch in the previous series. Has ASUS RMA just gone downhill recently, or what? I plan to run my 280X until the fans completely lock up or are just bad enough to warrant an RMA, unless they get sold off first.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Faster_is_better*
> 
> I guess I will have to do some more research if I buy into the 290 series at some point. I was under the impression the ASUS DC2 coolers were still as great as they had been in the past, but if ASUS is skimping on their coolers and RMA I may have to look to some other vendors.
> That's really sad to hear; DC2 coolers were always top notch in the previous series. Has ASUS RMA just gone downhill recently, or what? I plan to run my 280X until the fans completely lock up or are just bad enough to warrant an RMA, unless they get sold off first.


DCU II contact with the Die:



290x Matrix heatsink (Look familiar?) :


----------



## Norse

Woooohoooo, just won a second Sapphire 290X for £295 off eBay, brand new in box! The seller had 2 others that went for £270. The R9 cards are really bombing on eBay at the moment!


----------



## igrease

Quote:


> Originally Posted by *rdr09*
> 
> no problem. please include details on your settings including cpu, ram and such.


*This is just a rough diagnosis.*

*Computer Spec-*
GPU ~ MSI r9 290 @ 1047/1250
CPU ~ i5 2500k @ 4.0ghz
RAM ~ G. Skill 16gb 1600

*Fresh Windows 7-*
AMD 14.4 Beta Drivers
Unparked CPU Cores

*Battlefield 4 - MANTLE*
Resolution 1920 x 1080
Field of View 90
Resolution Scale 100%
Custom Ultra
Antialiasing Deferred - OFF
Antialiasing Post - OFF

*While playing Battlefield 4-*
GPU usage is not steady.
Average frame rate is lower: 85~105 FPS
Minimum frame rate is lower: 55~60 FPS
Maximum frame rate is lower: up to 154 FPS
Frame rate not nearly as steady.
Frame rate drops substantially when certain things happen.

*Fresh Windows 8.1-*
AMD 14.4 Beta Drivers
Unparked CPU Cores

*Battlefield 4 - MANTLE*
Resolution 1920 x 1080
Field of View 90
Resolution Scale 100%
Custom Ultra
Antialiasing Deferred - OFF
Antialiasing Post - OFF

*While playing Battlefield 4-*
GPU usage is steady, no fluctuations.
Average frame rate is higher: 100~120+ FPS
Minimum frame rate is higher: 70~80 FPS
Maximum frame rate is higher: up to 200 FPS
Frame rate incredibly stable.
No frame rate drops.

Overall just better gaming performance in Battlefield 4 with Windows 8.1.
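For what it's worth, putting rough numbers on those ranges (midpoints only, taken straight from the figures above):

```python
def uplift(old, new):
    """Percent improvement going from old to new."""
    return (new - old) / old * 100

# Midpoints of the ranges reported above (Win7 vs Win8.1, Mantle, 1080p).
avg_w7, avg_w81 = (85 + 105) / 2, (100 + 120) / 2   # 95 vs 110 FPS
min_w7, min_w81 = (55 + 60) / 2, (70 + 80) / 2      # 57.5 vs 75 FPS

print(f"avg FPS uplift: {uplift(avg_w7, avg_w81):.1f}%")  # ~15.8%
print(f"min FPS uplift: {uplift(min_w7, min_w81):.1f}%")  # ~30.4%
```

So the minimum frame rate, the one you actually feel in a firefight, gained roughly twice as much as the average did under 8.1.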


----------



## AddictedGamer93

Nvm, I can't read.


----------



## cennis

I have an ASUS DCUII 290 with an AIO and fans blowing at the VRM. 83.2% ASIC, Elpida memory.

I can finish 3DMark11 with some artefacting at 1220/1300 with a 25 (around +250 mV?) offset via the MSI Afterburner command line, which is only around 1.244 V under load. I think the high ASIC is the reason the voltage is low compared to my past 290s.

However, when I increase to a 2C (around +280 mV) offset in MSI Afterburner, the voltage is 1.25~1.28 V under load at the same clocks, but my computer shuts off 2~3 seconds into 3DMark11. I feel like there's overvoltage protection or some limitation on the ASUS board, which is supposed to have improved stability for its VRM.

My temperatures are <80°C for the VRM and <70°C for the core.

anyone have similar experience?
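Side note on those Afterburner command-line values (20, 25, 2C, 30): they are hex VID-offset codes. Assuming each step is 6.25 mV, which is typical of the voltage controllers on 290-series cards and matches the rough figures quoted in this thread (an assumption, so verify against your own voltage readings), they convert like this:

```python
MV_PER_STEP = 6.25  # assumed VID step size; typical for 290-series VRM controllers

def offset_mv(hex_code: str) -> float:
    """Convert an Afterburner hex VID offset code (e.g. '2C') to millivolts."""
    return int(hex_code, 16) * MV_PER_STEP

for code in ("20", "25", "2C", "30", "40"):
    print(f"offset {code} -> +{offset_mv(code):.2f} mV")
# 20 -> +200.00, 25 -> +231.25, 2C -> +275.00, 30 -> +300.00, 40 -> +400.00
```

That lines up with the "+200 mV" at code 20 and "+400 mV" at code 40 people report here, which is why jumping from 25 to 2C is a bigger step than it looks.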


----------



## Roy360

I got my ASUS connected with the use of two 90° fittings

















now I need to figure out why Windows isn't detecting the third card. BIOS sees all 3, running at 8x/4x/4x. Was on 13.11, currently downloading 13.12 from Guru3D. (Why is it taking over 2 hrs?)

EDIT: hmm, I'm stumped. The first time I added a card to my loop, the drivers wouldn't install. It would always crash* during installation. Vers. 14.1 and 13.12 and 13.23

However, DDU and 13.11 fixed the problem.

Now, I added the third card. Windows sees the two other cards, but not the ASUS. I try opening CCC and it says there are no options CCC can edit.
So I used DDU in safe mode and uninstalled everything. Tried installing 14.1, 13.12 and 13.11; crashes* every time, but at least I have 3 Microsoft Basic Display Adapters this time.

*crash - by this I mean, the mouse freezes and background music starts looping the last 2 seconds.


----------



## Roboyto

Quote:


> Originally Posted by *Faster_is_better*
> 
> I guess I will have to do some more research if I buy into the 290 series at some point. I was under the impression the ASUS DC2 coolers were still as great as they had been in the past, but if ASUS is skimping on their coolers and RMA I may have to look to some other vendors.
> That's really sad to hear; DC2 coolers were always top notch in the previous series. Has ASUS RMA just gone downhill recently, or what? I plan to run my 280X until the fans completely lock up or are just bad enough to warrant an RMA, unless they get sold off first.


My recent ATTEMPT at an ASUS RMA for a motherboard aggravated me to the point of selling the board on eBay as 'parts only', purchasing a Gigabyte to replace it, and, literally, selling (or planning to sell) nearly everything ASUS that I own. That included: my laptop, netbook, 2 external DVD drives, router, GT640 GPU, and a Bluetooth dongle. Still contemplating what I'm going to do with my Z87 Gryphon. I have a 2-year warranty at Microcenter, so I'm thinking of waiting for that warranty to expire and then I'll upgrade to whatever new CPU/mobo is out at that time.

For their tech support to tell me that the PCIe slot isn't on/part of the motherboard is simply sad.

My advice...steer well clear of them. Especially with their inexcusably lazy use of an Nvidia cooler on AMD cards.


----------



## rdr09

Quote:


> Originally Posted by *DeadlyDNA*
> 
> YES I experienced this while benching 4k eyefinity. I also found I may be having a similar issue after rolling back to 13.12
> I havent had much time to sit down and figure out my issue. I know its not temps but I got worse performance once I overclocked all 4 cards now. It sucks because im trying to post benchmarks that reflect actual performance issues at 4k eyefinity.


13.12 shouldn't be giving that issue. you only have one psu feeding all gpus?

Quote:


> Originally Posted by *igrease*
> 
> *This is just a rough diagnosis.*
> 
> *Computer Spec-*
> GPU ~ MSI r9 290 @ 1047/1250
> CPU ~ i5 2500k @ 4.0ghz
> RAM ~ G. Skill 16gb 1600
> 
> *Fresh Windows 7-*
> AMD 14.4 Beta Drivers
> Unparked CPU Cores
> 
> *Battlefield 4 - MANTLE*
> Resolution 1920 x 1080
> Field of View 90
> Resolution Scale 100%
> Custom Ultra
> Antialiasing Deferred - OFF
> Antialiasing Post - OFF
> 
> *While playing Battlefield 4-*
> GPU usage is not steady.
> Average frame rate is less. 85 ~ 105 FPS
> Minimum frame rate is less. 55~60 FPS
> Maximum frame rate is less. Up to 154 FPS
> Frame rate not nearly as steady.
> Frame rate drops substantially when certain things happen.
> 
> *Fresh Windows 8.1-*
> AMD 14.4 Beta Drivers
> Unparked CPU Cores
> 
> *Battlefield 4 - MANTLE*
> Resolution 1920 x 1080
> Field of View 90
> Resolution Scale 100%
> Custom Ultra
> Antialiasing Deferred - OFF
> Antialiasing Post - OFF
> 
> *While playing Battlefield 4-*
> GPU usage is steady, no fluctuations.
> Average frame rate is higher. 100 ~ 120+ FPS
> Minimum frame rate is higher. 70 ~ 80 FPS
> Maximum frame rate is higher. Up to 200 FPS
> Frame rate incredibly stable.
> No frame rate drops.
> 
> Overall just better gaming performance in Battlefield 4 with Windows 8.1.


nice. thank you so much. 14.3 works for mantle here, so i'll just wait for whql.


----------



## DeadlyDNA

Quote:


> Originally Posted by *rdr09*
> 
> 13.12 shouldn't be giving that issue. you only have one psu feeding all gpus?
> nice. thank you so much. 14.3 works for mantle here, so i'll just wait for whql.


Yep, only 1 PSU, but I wasn't having the issue before. I just need to do more testing to see if I am not hallucinating, because I've run so many benchmarks lately it's getting to the point I need a sign-off sheet before starting each test lol. It's possible I was actually on 14.3 and thought I was on 13.12 at some point.


----------



## hotrod717

EK 290X Lightning waterblock-




Can't wait to see what this puppy will do for VRM and core temps. The stock heatsink fan is great already.


----------



## Faster_is_better

Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Faster_is_better*
> 
> I guess I will have to do some more research if I buy into the 290 series at some point. I was under the impression the ASUS DC2 coolers were still as great as they had been in the past, but if ASUS is skimping on their coolers and RMA I may have to look to some other vendors.
> That's really sad to hear; DC2 coolers were always top notch in the previous series. Has ASUS RMA just gone downhill recently, or what? I plan to run my 280X until the fans completely lock up or are just bad enough to warrant an RMA, unless they get sold off first.
> 
> 
> 
> My recent ATTEMPT with ASUS for RMA of a motherboard aggravated me to the point of selling the board on eBay as a 'parts only', purchasing a gigabyte to replace it, and, literally, selling(or planning to sell) nearly everything ASUS that I own. That included: my laptop, netbook, 2 external DVD drives, router, GT640 GPU, and a Bluetooth dongle. Still contemplating what I'm going to do with my Z87 Gryphon..I have 2 year warranty at Microcenter, so I'm thinking of waiting for that warranty to expire and then I'll upgrade to whatever new CPU/MoBo is at that time.
> 
> For their tech support to tell me that the PCIe slot isn't on/part of the motherboard is simply sad.
> 
> My advice...steer well clear of them. Especially with their inexcusably lazy use of an Nvidia cooler on AMD cards.
Click to expand...

That's too bad. I haven't had to RMA too many items, and all have gone smoothly. I also haven't had to RMA anything for a while though, and degradation of service is entirely possible. Then you get horror stories from all manufacturers, so it's a bit of a crapshoot altogether.

That was quite lazy of them on the cooler. The whole rear end of the cooler looks to be mostly useless since that pipe doesn't even touch the die.


----------



## Sgt Bilko

Quote:


> Originally Posted by *hotrod717*
> 
> EK 290X Lightning waterblock-
> 
> 
> 
> 
> Can't wait to see what this puppy will do with vrm and core temps. Stock heatsink fan is great already.










Shiny


----------



## Paul17041993

Quote:


> Originally Posted by *cennis*
> 
> I have an ASUS DCUII 290 with an AIO and fans blowing at the VRM. 83.2% ASIC, Elpida memory.
> 
> I can finish 3DMark11 with some artefacting at 1220/1300 with a 25 (around +250 mV?) offset via the MSI Afterburner command line, which is only around 1.244 V under load. I think the high ASIC is the reason the voltage is low compared to my past 290s.
> 
> However, when I increase to a 2C (around +280 mV) offset in MSI Afterburner, the voltage is 1.25~1.28 V under load at the same clocks, but my computer shuts off 2~3 seconds into 3DMark11. I feel like there's overvoltage protection or some limitation on the ASUS board, which is supposed to have improved stability for its VRM.
> 
> My temperatures are <80°C for the VRM and <70°C for the core.
> 
> anyone have similar experience?


blackscreen or complete power outage? sounds like you may be running out of PSU headroom?


----------



## cennis

Quote:


> Originally Posted by *Paul17041993*
> 
> blackscreen or complete power outage? sounds like you may be out of PSU room?


I only have one 290 and one 4770K running on a 1000 W XFX Platinum that I don't think is faulty.
I believe it's a power-off, but I will check again when I'm home.

Black screen is usually instability/lack of voltage, right? My issue happens when I put more than 20 (+200 mV) in the MSI AB command line, i.e. 25, 2C, or 30. I only put in that much voltage because this card seems to run at much lower voltage than my old 290, where +200 mV was like 1.266~1.28 V (62% ASIC) while this one is 1.19~1.211 V.


----------



## Mercy4You

Quote:


> Originally Posted by *Faster_is_better*
> 
> I guess I will have to do some more research if I buy into the 290 series at some point. I was under the impression the ASUS DC2 coolers were still as great as they had been in the past, but if ASUS is skimping on their coolers and RMA I may have to look to some other vendors.
> That's really sad to hear; DC2 coolers were always top notch in the previous series. Has ASUS RMA just gone downhill recently, or what? I plan to run my 280X until the fans completely lock up or are just bad enough to warrant an RMA, unless they get sold off first.


*"ASUS's RMA is just the card sitting on a shelf for a month and they ship it back to you."*

I have to deny that!

It's 2 months


----------



## Paul17041993

Quote:


> Originally Posted by *Mercy4You*
> 
> *"ASUS's RMA is just the card sitting on a shelf for a month and they ship it back to you."*
> 
> I have to deny that!
> 
> It's 2 months


that's very true for my first RMA attempt...


----------



## Mercy4You

Quote:


> Originally Posted by *Paul17041993*
> 
> that's very true for my first RMA attempt...


Well, it's a bloody shame; ASUS must lose many customers this way. I wonder if they really care...


----------



## Roboyto

Quote:


> Originally Posted by *Faster_is_better*
> 
> That's too bad. I haven't had to RMA too many items, and all have gone smoothly. I also haven't had to RMA anything for a while though, and degradation of service is entirely possible. Then you get horror stories from all manufacturers, so it's a bit of a crapshoot altogether.
> 
> That was quite lazy of them on the cooler. The whole rear end of the cooler looks to be mostly useless since that pipe doesn't even touch the die.


It is too bad; I have owned a seemingly uncountable number of ASUS products over the years. But it does seem they are on the downswing: mispackaged items, DOAs, and sudden unexpected failures of devices over the last 2 years... I am not in the ASUS camp anymore.

I have said this a couple of times in this thread, but I don't know if anyone has given it a shot... I think a copper shim could fix, or enhance, the cooling ability of the DC2 coolers by making contact with all the pipes. /shrug
Quote:


> Originally Posted by *Tokkan*
> 
> My doubt wasn't whether it will, but how severe it will be. My Phenom is running at 4175 MHz with a 3.2 GHz NB.
> Given certain scenarios it will happen with every CPU, but how hard would it be? I would like to document it for future Phenom II users.
> If my Phenom II bottlenecks my 290 only in Skyrim, and in most other applications it pushes 290 usage above 90%, I'd be pretty well satisfied, because that would be money saved in the long run.
> But if 290 usage in games like BF4 is 50-60% I would be very disappointed, because I wouldn't be able to take full advantage of the GPU and would have to buy a new motherboard/CPU, something I want to put off until later iterations of Intel/AMD CPUs.
> 
> I plan on seeing how hard the bottleneck is.
> UPS is taking long, but after the weekend I should have the card in my rig and I'll start with the testing/benching.
> If it plays at the settings the card can handle smoothly, only losing about 10% performance, I'll be happy; even with that 10% loss it'll still be a 200% increase in GPU performance over my old 6850 Crossfire, plus increased smoothness from being a single card and not being VRAM capped. Power consumption will also be lower compared to the Crossfire. All in all it should be a win-win.


I had an X4 960T, which unlocked to an X6 1605T, that I was running at 4.1 GHz with a similar HT speed (can't remember exactly), with my HD 7870 on an ASRock 890FX board. I switched to a 3570K and an ASRock Z77 Extreme6 board, and graphics performance jumped ~25% in games and benchmarks; all other hardware/Windows/GPU drivers were identical. Does anyone know if there is a large difference in graphics performance between 890FX and 900-series AMD boards? That could have been the cause, but I don't know, since I switched to Intel when the 8350 turned out to be a flop.

If you can get a great deal on that 3770k and Z77 board I would say do it. The 3770k is still a solid chip and will continue to be for probably the next ~3 years, or possibly more.
Quote:


> Originally Posted by *bbond007*
> 
> *The card temps were absolutely great and the card operated with core and VRMs in 70c range*
> 
> *Perhaps my power-supply has degraded. It is an old 680watt OCZ*.
> 
> Cars are engineered to withstand a *range of operating circumstances* from the owner's perspective. For example, a car might have an elderly owner who drives it infrequently; on the other hand, it could be owned by a delivery business. Ford, for example, tested the durability of their EcoBoost system by running the same engine continuously (and hard) for 150k miles. They actually transplant the engine between vehicles to complete different segments of the test.
> 
> This reminds me of the time that I bought a 660 TI from EVGA and they refused to honor the $20 rebate because of some BS (their document scanning was terrible quality and rejected my receipt twice). I said in the EVGA forum that I was unhappy and would not consider them in the future. People chimed in and said stuff to the effect that you should never consider a rebate when making a purchase decision - *because you frequently don't get them*. Those people were entirely missing the point.


70C is certainly better than the 84C you had mentioned previously.

Depends on how old for the PSU. My 650W Rosewill Capstone easily handles my 4770k @ 4.5GHz, 16GB RAM 2400MHz, 290 up to 1300/1700, water pump and 4 drives.

They are designed to withstand a range of operating circumstances. However, heavy use from a delivery driver is not the same as running that engine at redline for hours and hours; that would be the equivalent of mining. This is one of the risks of mining: you push the card to 100% for hours, days, or weeks at a time, and if the card wasn't 100% proper to begin with, it will fail earlier than normal. Also, *a lot* could have happened in the few weeks the other guy owned the card. If it was a secondary GPU, it probably did not have a monitor plugged into it.









You could just chalk this up to poor luck with chips... it happens. I recently went through this with a friend of mine. He ordered a PowerColor 7870 and it worked, but exhibited some strange issues, especially during gaming and video playback; after testing on 2 different mobo/CPU combos with fresh Windows installs, it was returned to Newegg. He ordered a VisionTek 280X in its place; it came DOA. RMA'd the DOA, and now he is very happy with his 280X, with a lifetime warranty, for $309.

I have had great luck with rebates; I bought my Wii U with rebate cards from PC parts. Submit them ASAP and triple-check you have everything required for them to be processed without issue.
Quote:


> Originally Posted by *cennis*
> 
> I only have 1 290 and 1 4770k running on a 1000W xfx platinum that I don't think is faulty.
> I believe its a power off but I will check again when I'm home.
> 
> Black screen is usually instability/lack of voltage right? My issue happens when I put more than 20(+200mv) in msi ab command line ie. 25, 2c, or 30. I only put that much voltage because this card seem to be running a much lower voltage than my old 290 where +200mw is like 1.266~1.28v (62% asic) while this one is 1.19~1.211v


My experience, with both of my 290s, has been that a black screen more than likely happened when pushing the memory too far. Core speed too high or lack of voltage always meant: artifacts, tearing, or screen flickering.

If you are getting to 1200 core with only ~1.24 V, then you are doing pretty well. My XFX 290 BE has 80.4% ASIC, and for full bench/gaming stability at 1200/1500 I need 1.313 V (+87 mV in Trixx). +200 mV & +50% power limit put my 290 over 1.4 V.

Have you increased the Power Limit as well?

Have you tried a different OC utility? GPU Tweak is usually pretty good. Trixx has always given me the best results.
Quote:


> Originally Posted by *Mercy4You*
> 
> *"ASUS's RMA is just the card sitting on a shelf for a month and they ship it back to you."*
> 
> I have to deny that!
> 
> It's 2 months


Quote:


> Originally Posted by *Paul17041993*
> 
> that's very true for my first RMA attempt...


Quote:


> Originally Posted by *Mercy4You*
> 
> Well, it's a bloody shame, ASUS must loose many customers this way. I wonder if they really care...


Have you guys tried calling in the RMA to ASUS? I submitted the RMA for my 7870 online and waited 2 weeks without so much as a peep from ASUS regarding what the hell was going on. I called in the RMA one night and had RMA approval by noon the next day. When I called, I asked if they had any information regarding my online RMA request, and they did NOT... what a joke. I called in the RMA on 1/21/14, shipped it the next day, and received a replacement card on 2/11/14.








I'm one of those people that they have lost


----------



## cennis

Quote:


> Originally Posted by *Roboyto*
> 
> My experience, with both of my 290s, has been that a black screen only happened when pushing the memory too far. Core speed too high or lack of voltage always meant: artifacts, tearing, or screen flickering.
> 
> If you are getting to 1200 core with only ~1.24V, then you are doing pretty good. My XFX 290 BE has 80.4% and for full bench/gaming stability at 1200/1500 I need 1.313V; +87mV in Trixx. +200mV & +50% Power Limit put my 290 over 1.4V
> 
> Have you tried a different OC utility? GPU Tweak is usually pretty good. Trixx has always given me the best results.


It is a good overclock, but more voltage shouldn't make it shut off unless this board has limitations that aren't on the reference design. It's a DCUII card.
I haven't seen 1.3 V other than with a 40 (+400 mV) offset through AB, but once 3DMark runs it black-screens/powers off in 2~3 seconds.

My memory is Elpida, and from your post overclocking the VRAM does not seem to affect performance much, so I have given up on it for now.


----------



## Roboyto

Quote:


> Originally Posted by *cennis*
> 
> It is a good overclock, but more voltage shouldn't make it shut off unless this board has limitations that aren't on the reference design. It's a DCUII card.
> I haven't seen 1.3 V other than with a 40 (+400 mV) offset through AB, but once 3DMark runs it black-screens/powers off in 2~3 seconds.
> 
> My memory is Elpida, and from your post overclocking the VRAM does not seem to affect performance much, so I have given up on it for now.


Leave the RAM at stock clocks and try another utility. Got nothing to lose except for black screen problems







You are correct, extra voltage shouldn't do that...quite strange

The VRAM OC can help, but not that much. I still need to test whether it has more impact with Eyefinity, as those tests were done at 1080p.


----------



## heroxoot

Has anyone had the BF4 RAM leak glitch on Windows 8.1 with any 290/X card? I get my 290 soon, but on the 14.3 beta I get the RAM leaks again, and I'm wondering whether the issue relates directly to the GPU I use or to the driver itself. It seems like the majority of people with the issue are on GPUs with less than 3 GB of VRAM.


----------



## Roy360

Does anyone know where I can get a replacement warranty sticker for my ASUS R9 290?

I need to RMA my card, but I broke the sticker on one screw.

I have a second working card that I can take the screw out of, but it's too tight to unscrew without a screwdriver.


----------



## DeadlyDNA

Quote:


> Originally Posted by *heroxoot*
> 
> Has anyone had the BF4 RAM leak glitch on Windows 8.1 with any 290/290X card? I get my 290 soon, but on the 14.3 beta I get the RAM leaks again, and I'm wondering whether the issue relates directly to the GPU I use or to the driver itself. It seems like the majority of people with the issue are on GPUs with less than 3GB of VRAM.


I hope not; I just finally got BF4, for benching if possible.


----------



## heroxoot

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Has anyone had the BF4 RAM leak glitch on Windows 8.1 with any 290/290X card? I get my 290 soon, but on the 14.3 beta I get the RAM leaks again, and I'm wondering whether the issue relates directly to the GPU I use or to the driver itself. It seems like the majority of people with the issue are on GPUs with less than 3GB of VRAM.
> 
> 
> 
> I hope not; I just finally got BF4, for benching if possible.

Well, it didn't happen on my 7970, but it's happening on this 6850. I'm curious whether this problem happens due to a lack of VRAM. My friend had the issue on an R9 270, so he switched back to Windows 7. I actually like 8.1, so I hope this is just a weird issue with low-VRAM cards. I know playing the game in 32-bit solves the issue too.


----------



## Arizonian

Quote:


> Originally Posted by *JordanTr*
> 
> Hello guys, today i checked my GPU oc potencial before deciding to buy waterblock and i noticed strange thing, then i set my card to 1200/1500, through heaven benchmark my core fluctuates 800-1200. However if i set to stock clock it looks more stable, 850-947. Is it possible that my OCZ 750W ZT PSU cant handle 2500k ( 4.2ghz) + single r9 290 (oced)? Or its related to 14.3 beta drivers? However i tried to run thief benchmark, and core stayed at 1200 all the time except for 0.5-1s dropped to 1050.
> 
> P.S. Arizonian - update me to aftermarket. I done it 2 weeks after buying my stock gpu, but had no time to put my photos somewhere. Here is the photos with hooked gelid icy vision, but it was not silent enough for me, so i decided to change fans to Be Quiet Shadow Wings 92mm (2nd pic):
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated









Quote:


> Originally Posted by *hotrod717*
> 
> EK 290X Lightning waterblock-
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Can't wait to see what this puppy will do with vrm and core temps. Stock heatsink fan is great already.


Congrats - updated


----------



## Mega Man

Quote:


> Originally Posted by *bbond007*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Paul17041993*
> 
> might be a reason behind that...
> 
> your best bet is either RMA or to see if you can underclock till it's stable again.
> 
> 
> 
> Yeah, it's really strange. I did downclock it to 1000MHz and the artifacting is indeed reduced.
> 
> + REP for the suggestion
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was also having some other ideas on my way to work that I can try when I get home.
> 
> Perhaps my power supply has degraded. It is an old 680W OCZ. I have never had a problem with it, but somewhere something has changed.
> Quote:
> 
> 
> 
> Originally Posted by *Roboyto*
> 
> Just because my car can go 140mph doesn't mean I'm going to jump in it and drive that fast, redlining the engine until I run out of gas, refill the tank, and do it again.
> 
> Some cards have no problem doing this, that is true, but it's still not good for them. Just because they can doesn't mean you should.
> 
> 
> 
> The card temps were absolutely great, and the card operated with core and VRMs in the 70°C range because I was not even mining a power-hungry algorithm. The whole computer was drawing under 300W of power.
> 
> I suspect the Gigabyte Windforce in particular is not a reliable product regardless of application.
> 
> just as long as we are using a car analogy:
> 
> Cars are engineered to withstand a range of operating circumstances from the owner's perspective. For example, a car might have an elderly owner who drives it infrequently; on the other hand, it could be owned by a delivery business. Ford, for example, tested the durability of their EcoBoost engine by running the same engine continuously (hard) for 150K miles. They actually transplanted the engine between vehicles to complete different segments of the test.
> 
> This reminds me of the time I bought a 660 Ti from EVGA and they refused to honor the $20 rebate because of some BS (their document scanning was terrible quality and rejected my receipt twice). I said on the EVGA forum that I was unhappy and would not consider them in the future. People chimed in and said things to the effect that you should never consider a rebate when making a purchase decision, because you frequently don't get them. Those people were entirely missing the point.

Gigabyte has one of the more sought-after cards that I have seen.
Quote:


> Originally Posted by *igrease*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> No problem. Please include details on your settings, including CPU, RAM and such.
> 
> 
> 
> *This is just a rough diagnosis.*
> 
> *Computer Spec-*
> GPU ~ MSI r9 290 @ 1047/1250
> CPU ~ i5 2500k @ 4.0ghz
> RAM ~ G. Skill 16gb 1600
> 
> *Fresh Windows 7-*
> AMD 14.4 Beta Drivers
> Unparked CPU Cores
> 
> *Battlefield 4 - MANTLE*
> Resolution 1920 x 1080
> Field of View 90
> Resolution Scale 100%
> Custom Ultra
> Antialiasing Deferred - OFF
> Antialiasing Post - OFF
> 
> *While playing Battlefield 4-*
> GPU usage is not steady.
> Average frame rate is lower: 85 ~ 105 FPS
> Minimum frame rate is lower: 55 ~ 60 FPS
> Maximum frame rate is lower: up to 154 FPS
> Frame rate not nearly as steady.
> Frame rate drops substantially when certain things happen.
> 
> *Fresh Windows 8.1-*
> AMD 14.4 Beta Drivers
> Unparked CPU Cores
> 
> *Battlefield 4 - MANTLE*
> Resolution 1920 x 1080
> Field of View 90
> Resolution Scale 100%
> Custom Ultra
> Antialiasing Deferred - OFF
> Antialiasing Post - OFF
> 
> *While playing Battlefield 4-*
> GPU usage is steady, no fluctuations.
> Average frame rate is higher: 100 ~ 120+ FPS
> Minimum frame rate is higher: 70 ~ 80 FPS
> Maximum frame rate is higher: up to 200 FPS
> Frame rate incredibly stable.
> No frame rate drops.
> 
> Overall just better gaming performance in Battlefield 4 with Windows 8.1.

Maybe I'm missing something, but BF4 recommends Windows 8 over 7, last I knew, for this reason.


----------



## Gooberman

Well, I just bought a 290X for $365 on eBay :O


----------



## cephelix

Quote:


> Originally Posted by *Gooberman*
> 
> Well, I just bought a 290X for $365 on eBay :O


Card prices seem to be falling......


----------



## chronicfx

Quote:


> Originally Posted by *cephelix*
> 
> Card prices seem to be falling......


Scrypt mining became unprofitable recently, so there's a flood of miners cashing out their hardware. Prices should go back up some if you wait a month; right now there are just too many cards on eBay to sell anything at a reasonable price. It's a great time to snag that second 290X if you are thinking about it.


----------



## Gooberman

I'm happy, lol. I'm selling my 7950 to my friend, so it isn't that expensive of an upgrade.


----------



## DeadlyDNA

Dang, if it was worth it I would get some 290Xs. This makes me a sad panda; I'm just glad I got my 290s at launch for the retail MSRP of $399.


----------



## cephelix

Quote:


> Originally Posted by *chronicfx*
> 
> Scrypt mining became unprofitable recently, so there's a flood of miners cashing out their hardware. Prices should go back up some if you wait a month; right now there are just too many cards on eBay to sell anything at a reasonable price. It's a great time to snag that second 290X if you are thinking about it.


A 290X is wasted on me right now, especially when I'm running at 1920x1200.
I only have 1 x 290.
Adding a second one would mean I'd need a PSU upgrade as well; I currently have a 3-year-old Seasonic X750.
And I think my CPU is bottlenecking my card: an i5 760 @ 4GHz.
I'm planning a Haswell K refresh upgrade when it comes out, but I'm unsure whether I should wait for Broadwell instead.
Thoughts?


----------



## bbond007

Quote:


> Originally Posted by *Mega Man*
> 
> Gigabyte has one of the more sought-after cards that I have seen.


I don't doubt it, as the GV-R929XOC-4GD is a beautiful-looking card.










The heatsink and the card in general seem really well designed, and performance is outstanding, but there are a number of reviewers on Amazon and Newegg who have had similar hardware failures, which suggests some kind of issue.

Anyway... I played with it a little more and it seems to always artifact, regardless of clock settings.

I removed the Nvidia drivers and installed it in my computer with a GA-990FXA-UD3 and FX-8320, and it did the artifacting on that machine too. Artifacting across the surround setup.









I'll try to RMA it. I read somewhere that someone RMA'd one and they sent the same board back with Hynix RAM when it originally had Elpida.

After the issues with these two cards and the generally flaky behavior of my GA-990FXA-UD3 motherboard, I think I'm done with Gigabyte...


----------



## cennis

Quote:


> Originally Posted by *Roboyto*
> 
> Leave the RAM at stock clocks and try another utility. You've got nothing to lose except the black screen problems.
> 
> 
> 
> 
> 
> 
> 
> You are correct, extra voltage shouldn't do that...quite strange.
> 
> The VRAM can help, but not that much. I still need to test whether it has more impact with Eyefinity, as those tests were done at 1080p.


Quote:


> Originally Posted by *Paul17041993*
> 
> Black screen or complete power outage? Sounds like you may be out of PSU headroom?


Upon further testing, it looks like a black screen rather than a full power-off, as my CPU fan, which is connected to the mobo, is still running, as are my mouse and keyboard lights.
It happens when I increase the voltage to 1.266~1.3V, which is a 300mV offset; the TriXX tool at a 200mV offset only gives me 1.2~1.211V...
This is at 1200/1260 (stock), which is stable at 200mV.
1220 or 1200 with a 300mV offset black screens 20 seconds into Valley,
2 seconds into 3DMark.
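For anyone trying to reconcile offset numbers like these with what a monitoring tool reports: the load voltage is roughly the stock VID plus the software offset, minus load-line droop. A minimal sketch of that arithmetic, using a hypothetical 1.0V stock load voltage and droop figures chosen purely for illustration (not measured values for any card):

```python
# Sanity-check the offset-to-voltage math described above.
# stock_v (1.0V) and the droop values are illustrative assumptions,
# not measurements from a real 290.

def reported_voltage(stock_v, offset_mv, droop_mv):
    """Voltage under load: stock VID + software offset - droop."""
    return stock_v + (offset_mv - droop_mv) / 1000.0

# A +200mV offset with negligible droop lands near 1.2V;
# a +300mV offset with ~34mV of droop lands near 1.266V,
# roughly matching the ranges reported in the post above.
print(reported_voltage(1.0, 200, 0))
print(reported_voltage(1.0, 300, 34))
```

Droop itself varies with load and board VRM design, which is one reason the same offset can read as a range (1.266~1.3V) rather than a single value.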


----------



## DeadlyDNA

Quote:


> Originally Posted by *cennis*
> 
> Upon further testing, it looks like a black screen rather than a full power-off, as my CPU fan, which is connected to the mobo, is still running, as are my mouse and keyboard lights.
> It happens when I increase the voltage to 1.266~1.3V, which is a 300mV offset; the TriXX tool at a 200mV offset only gives me 1.2~1.211V...
> This is at 1200/1260 (stock), which is stable at 200mV.
> 1220 or 1200 with a 300mV offset black screens 20 seconds into Valley,
> 2 seconds into 3DMark.


I have been pushing my system hard lately and I was having black screen crashes. I went back and revisited 2 possibilities in my case: my CPU OC was unstable, and my GPU OC was having issues with power limit/voltages. I downclocked my CPU to 4GHz for stability so I can do further testing. I also started making sure my CCC Overdrive settings were correct, along with MSI AB. I am on the 13.12 WHQL drivers though, so I don't know what the deal is with the betas yet. I heard they have power issues that are broken?

I am now able to OC my cards better and they have been rock solid for benching. I will up my CPU OC when I am done benching and see if it was also a factor.

Also, when I was testing the 14.3 beta drivers I kept having issues with a random monitor losing signal and going black screen; it was weird.


----------



## cennis

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I have been pushing my system hard lately and I was having black screen crashes. I went back and revisited 2 possibilities in my case: my CPU OC was unstable, and my GPU OC was having issues with power limit/voltages. I downclocked my CPU to 4GHz for stability so I can do further testing. I also started making sure my CCC Overdrive settings were correct, along with MSI AB. I am on the 13.12 WHQL drivers though, so I don't know what the deal is with the betas yet. I heard they have power issues that are broken?
> 
> I am now able to OC my cards better and they have been rock solid for benching. I will up my CPU OC when I am done benching and see if it was also a factor.
> 
> Also, when I was testing the 14.3 beta drivers I kept having issues with a random monitor losing signal and going black screen; it was weird.


I am using 13.12, and my CPU is at stock right now; RAM is at 1333 or 1600 too.
I did not touch CCC Overdrive, and I'm using the AB setting that does not restart the driver while applying clocks.


----------



## Mega Man

I don't know why people get mad when betas are actually betas


----------



## DeadlyDNA

Quote:


> Originally Posted by *Mega Man*
> 
> I don't know why people get mad when betas are actually betas


If you mean me, I'm not really mad at betas so much as at still having to sit on the 13.12 WHQL drivers from like Dec 18th, 2013. Five months is a long time to deal with drivers that work except for broken vsync. Sadly, 14.3 was actually one of the better beta driver releases so far.


----------



## Norse

Quote:


> Originally Posted by *Gooberman*
> 
> Well, I just bought a 290X for $365 on eBay :O


I have gotten two 290Xs at cheap prices: one was £260 ($436), the other brand new in box for £295 ($495). It's crazy.







but good for cheapo cards atm









I saw 2 290's go for only £190


----------



## hotrod717

Quote:


> Originally Posted by *Norse*
> 
> I have gotten two 290Xs at cheap prices: one was £260 ($436), the other brand new in box for £295 ($495). It's crazy.
> 
> 
> 
> 
> 
> 
> 
> but good for cheapo cards atm
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I saw 2 290's go for only £190


It's great that prices are dropping, and I would hope most people are wary enough to pick up *NEW* cards and not someone else's mining trash. Miners drove the prices up for months and are now trying to unload their abused, overworked cards. There are deals out there to be sure, but I certainly wouldn't want one that has been mined with. I picked up a used 7970 before this mining craze really kicked in. I thought it strange that it had an underclocked, underperforming BIOS. Once flashed to a standard BIOS, the card was incredibly unstable and overheated easily. I put the old BIOS it came with back on and sold it back to a miner.


----------



## Norse

Quote:


> Originally Posted by *hotrod717*
> 
> It's great that prices are dropping, and I would hope most people are wary enough to pick up *NEW* cards and not someone else's mining trash. Miners drove the prices up for months and are now trying to unload their abused, overworked cards. There are deals out there to be sure, but I certainly wouldn't want one that has been mined with. I picked up a used 7970 before this mining craze really kicked in. I thought it strange that it had an underclocked, underperforming BIOS. Once flashed to a standard BIOS, the card was incredibly unstable and overheated easily. I put the old BIOS it came with back on and sold it back to a miner.


I think my first 290X was an ex-miner card, because I asked the seller and he said he'll have a few more in a couple of weeks. But so far the card has been stable, other than my OCing and a single BSOD due to drivers.


----------



## Mercy4You

Are those the Blackscreen Edition?


----------



## Norse

Quote:


> Originally Posted by *Mercy4You*
> 
> Are those the Blackscreen Edition?


no black screens yet


----------



## Mercy4You

Quote:


> Originally Posted by *Norse*
> 
> no black screens yet


Hopefully it stays that way









Are there still people suffering from black screens?


----------



## DeadlyDNA

Quote:


> Originally Posted by *Mercy4You*
> 
> Are those the Blackscreen Edition?


Funny, just flip it to Uber mode like this


----------



## Norse

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Funny, just flip it to Uber mode like this


Hehe, I might have to once I get both 290Xs; I'll record it as they get to 100%.









Most people showing off the real noise of the cards do it with an open case or with the back of the case facing them, both of which are unrealistic in real-world terms.

The case is normally either on top of the desk or below it.


----------



## rdr09

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I have been pushing my system hard lately and I was having black screen crashes. I went back and revisited 2 possibilities in my case: my CPU OC was unstable, and my GPU OC was having issues with power limit/voltages. I downclocked my CPU to 4GHz for stability so I can do further testing. I also started making sure my CCC Overdrive settings were correct, along with MSI AB. I am on the 13.12 WHQL drivers though, so I don't know what the deal is with the betas yet. I heard they have power issues that are broken?
> 
> I am now able to OC my cards better and they have been rock solid for benching. I will up my CPU OC when I am done benching and see if it was also a factor.
> 
> Also, when I was testing the 14.3 beta drivers I kept having issues with a random monitor losing signal and going black screen; it was weird.


So, you are OC'ing the GPUs with that PSU? I don't know about that.

Edit: also, I don't think that CPU at 4GHz will handle the GPUs. I highly doubt it. Ask Bradley.


----------



## Norse

Woooooooo, triple-screen goodness... but for some reason Fraps crashes whatever game is running.

22" 24" 22"


----------



## DeadlyDNA

Quote:


> Originally Posted by *rdr09*
> 
> So, you are OC'ing the GPUs with that PSU? I don't know about that.
> 
> Edit: also, I don't think that CPU at 4GHz will handle the GPUs. I highly doubt it. Ask Bradley.


Just a guess, but the CPU is alright while I'm running extremely high resolutions. If there is a CPU component to my scores, it will be a lower min, lower max, and probably lower avg FPS. It won't be the same bottleneck as if I was running 1080p, where the GPUs are waiting on the CPU so much. That said, I'll OC the hell out of the CPU soon enough. Right now my goal is to have good stability while testing 4K Eyefinity.


----------



## igrease

Do I actually need to run a benchmark to test card stability? I have just been playing a match or two of BF4, and in-game I don't see any artifacts, so I assume it is stable.


----------



## heroxoot

Quote:


> Originally Posted by *igrease*
> 
> Do I actually need to run a benchmark to test card stability? I have just been playing a match or two of BF4, and in-game I don't see any artifacts, so I assume it is stable.


I like to loop Heaven a few times myself, but using things like FurMark is just silly. The problem is that even without seeing any artifacts in Heaven, I still saw them in BF4. So my suggestion is to play games for an hour or two non-stop and watch your temps and for artifacts. You can have a stable OC in 9 games and the 10th will have issues. Benchmark stressing mostly just shows the highest heat.


----------



## battleaxe

Quote:


> Originally Posted by *igrease*
> 
> Do I actually need to run a benchmark to test card stability? I have just been playing a match or two of BF4, and in-game I don't see any artifacts, so I assume it is stable.


If it's stable in BF4, you can assume it's stable. BF4 is a harder test than Valley or most other benches.


----------



## Roboyto

Quote:


> Originally Posted by *Mega Man*
> 
> I don't know why people get mad when betas are actually betas


Yup

Quote:


> Originally Posted by *Roy360*
> 
> Does anyone know where I can get a replacement warranty sticker for my ASUS R9 290?
> 
> I need to RMA my card, but I broke the sticker on one screw.
> 
> I have a second working card that I can take the screw out of, but it's too tight to unscrew without a screwdriver.


You can try removing one from your other card by heating up the sticker (with a hair dryer, maybe) to make the glue tacky, and then using a razor blade to get it off.


----------



## the9quad

Quote:


> Originally Posted by *igrease*
> 
> Do I actually need to run a benchmark to test card stability? I have just been playing a match or two of BF4, and in-game I don't see any artifacts, so I assume it is stable.


That's what I do; it's been months now with no crashes. If BF4 runs it, I'm safe.


----------



## Tobiman

The 14.4 beta has been very stable for me so far. Those who want to give Mantle a try should.


----------



## heroxoot

Quote:


> Originally Posted by *Tobiman*
> 
> The 14.4 beta has been very stable for me so far. Those who want to give Mantle a try should.


Good to hear. I will be sure to try it when my 290 gets here. Is there a real Mantle improvement? My 7970 was getting 100-120 with Mantle in BF4 before I sacrificed it to MSI.


----------



## Tobiman

Quote:


> Originally Posted by *heroxoot*
> 
> Good to hear. I will be sure to try it when my 290 gets here. Is there a real Mantle improvement? My 7970 was getting 100-120 with Mantle in BF4 before I sacrificed it to MSI.


I mostly play multiplayer on 64-man servers and I get 90-100 FPS with D3D11. Mantle bumps it up to a 100-110 FPS average.


----------



## Euda

Quote:


> Originally Posted by *Tobiman*
> 
> The 14.4 beta has been very stable for me so far. Those who want to give Mantle a try should.


Agreed. The previous 14.x versions brought some power-limit-related issues for me. Besides, my card was stuck at 3D clock states after booting into Win8.1. 14.4 has run rock stable, without any issues or bugs, since its release/leak date.


----------



## the9quad

The only issue I have with 14.1 is that occasionally it won't wake from sleep; the monitor just stays dark. It happens about every 5th time. I'm OK with it though; it takes about 10 seconds to reboot.


----------



## passinos

Hey Team AMD,
I had 7970s in CF and just sold them on the bay. Help me decide on a replacement.
Main goal is quiet BF4/Titanfall maxed out at 1440p.

1) 2x CryoVenom 290s
2) 2x VisionTek 290X reference with Koolance blocks (no backplates)

Price is within $50 total difference.
Both are about 2 months old (probably ex-miners); it seems like there is a flood of people getting out of mining.

Thanks All.


----------



## Paul17041993

Quote:


> Originally Posted by *cennis*
> 
> Upon further testing, it looks like a black screen rather than a full power-off, as my CPU fan, which is connected to the mobo, is still running, as are my mouse and keyboard lights.
> It happens when I increase the voltage to 1.266~1.3V, which is a 300mV offset; the TriXX tool at a 200mV offset only gives me 1.2~1.211V...
> This is at 1200/1260 (stock), which is stable at 200mV.
> 1220 or 1200 with a 300mV offset black screens 20 seconds into Valley,
> 2 seconds into 3DMark.


If that's on the reference blower, the black screen is perfectly normal; it doesn't cool the memory enough when the core is pumping out so much heat.

If it's a custom/non-ref cooler, it's probably the same thing happening. What are your temps across the board?
Quote:


> Originally Posted by *passinos*
> 
> Hey Team AMD,
> I got 7970's CF and just sold them on the bay. Help me decide replacement.
> Make goal is quiet BF4/TitanFall Max at 1440p
> 
> 1) 2x Cryovenom 290's
> 2) 2x Visiontek 290x Ref with koolance blocks (no backplates)
> 
> Price is within $50 total difference.
> both are about 2 months (probably ex-miners) seems like there is a flood of people getting out of mining.
> 
> Thanks All.


multi 290 == good for air crossfire; you can undervolt and underclock if they throttle too much and get better perf.
multi 290X == best on water due to sheer heat production; they don't throttle under water and the cards likely last longer, so go for these preferably, for both noise and performance.

Side note: my 290X doesn't seem to have an uber BIOS; both positions seem to be 40% fan (silent)...


----------



## igrease

Quote:


> Originally Posted by *heroxoot*
> 
> Good to hear. I will be sure to try it when my 290 gets here. Is there a real Mantle improvement? My 7970 was getting 100-120 with Mantle in BF4 before I sacrificed it to MSI.


For me, Mantle adds about 30-40 FPS.


----------



## Roy360

Did anyone find an oil-like substance on their GPU core? I installed a waterblock on my 4th R9 290, and there was some strange oil on the core and heatsink (from the fans?).

P.S. How do I set the voltage to be the same across all my cards?

When I had 3 cards, I could safely undervolt to -80mV,
but now with 4 cards, sometimes -70 would work and other times it would crash at -50.

I tested the new card outside the main rig, and it operated for 3 days at -70mV.

But according to GPU-Z, all the cards are running at their own voltages, i.e. 1.086, 1.063, 1.133.
The core clock seems to be fluctuating on my VisionTek card too, between 944.2 and 945.6 (I've set them all to 947 with MSI Afterburner).

I want to see how low I can bring the power drain down. Right now, each card is drawing 220W.
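As a back-of-the-envelope check on what undervolting buys at fixed clocks: dynamic power scales roughly with the square of voltage. A small sketch using the figures from the post above (treating the whole 220W as voltage-dependent dynamic power is a simplification, so the result is an optimistic upper bound):

```python
# Rough estimate of per-card power savings from an undervolt.
# Assumes all 220W is dynamic power scaling with V^2 (a simplification);
# the 1.133V figure is one of the per-card voltages reported above.

def undervolted_power(power_w, stock_v, offset_mv):
    """Scale power by (V_new / V_old)^2 for a given mV offset."""
    new_v = stock_v + offset_mv / 1000.0
    return power_w * (new_v / stock_v) ** 2

# A 220W card at 1.133V with a -70mV offset:
new_power = undervolted_power(220, 1.133, -70)
print(f"{new_power:.1f} W after undervolt, {220 - new_power:.1f} W saved")
```

In practice the savings per card will be smaller, since leakage and memory power don't scale the same way, but it shows why even a -70mV offset is worth chasing across four cards.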


----------



## darkelixa

I have oil on the back of my R9 290, just next to the backplate screw. The card is ruined, and it's not even water cooled. Time to buy an R9 280X.


----------



## passinos

Quote:


> Originally Posted by *Paul17041993*
> 
> If that's on the reference blower, the black screen is perfectly normal; it doesn't cool the memory enough when the core is pumping out so much heat.
> 
> If it's a custom/non-ref cooler, it's probably the same thing happening. What are your temps across the board?
> multi 290 == good for air crossfire; you can undervolt and underclock if they throttle too much and get better perf.
> multi 290X == best on water due to sheer heat production; they don't throttle under water and the cards likely last longer, so go for these preferably, for both noise and performance.
> 
> Side note: my 290X doesn't seem to have an uber BIOS; both positions seem to be 40% fan (silent)...


Actually, they are both water cooled. The CryoVenom 290s have factory EK blocks; the 290Xs have user-installed Koolance blocks on reference cards.


----------



## Jflisk

Quote:


> Originally Posted by *passinos*
> 
> Hey Team AMD,
> I had 7970s in CF and just sold them on the bay. Help me decide on a replacement.
> Main goal is quiet BF4/Titanfall maxed out at 1440p.
> 
> 1) 2x CryoVenom 290s
> 2) 2x VisionTek 290X reference with Koolance blocks (no backplates)
> 
> Price is within $50 total difference.
> Both are about 2 months old (probably ex-miners); it seems like there is a flood of people getting out of mining.
> 
> Thanks All.


Don't forget the backplates; they help even when water cooled.


----------



## Jflisk

Quote:


> Originally Posted by *Roy360*
> 
> Did anyone find an oil-like substance on their GPU core? I installed a waterblock on my 4th R9 290, and there was some strange oil on the core and heatsink (from the fans?).
> 
> P.S. How do I set the voltage to be the same across all my cards?
> 
> When I had 3 cards, I could safely undervolt to -80mV,
> but now with 4 cards, sometimes -70 would work and other times it would crash at -50.
> 
> I tested the new card outside the main rig, and it operated for 3 days at -70mV.
> 
> But according to GPU-Z, all the cards are running at their own voltages, i.e. 1.086, 1.063, 1.133.
> The core clock seems to be fluctuating on my VisionTek card too, between 944.2 and 945.6 (I've set them all to 947 with MSI Afterburner).
> 
> I want to see how low I can bring the power drain down. Right now, each card is drawing 220W.


The oil is either from one of the heatsink tubes leaking, or it is from whatever TIM they used on it. They might have hit the top of the tube.


----------



## passinos

Quote:


> Originally Posted by *Jflisk*
> 
> Don't forget the backplates; they help even when water cooled.


Good point. So the delta is $100 if I factor in new backplates for the 290Xs.

Do you think the binned CryoVenom 290s will be close to the reference 290Xs w/ Koolance?


----------



## cennis

Quote:


> Originally Posted by *Paul17041993*
> 
> If that's on the reference blower, the black screen is perfectly normal; it doesn't cool the memory enough when the core is pumping out so much heat.
> 
> If it's a custom/non-ref cooler, it's probably the same thing happening. What are your temps across the board?


It is an ASUS DirectCU II 290.

My core is zip-tied to an H100 and core temps stay below 65°C.
The VRM heatsink has the ASUS DirectCU II high-speed fan blowing at it at 80%, and it stays below 75°C.

Memory speed is stock, so I don't see why it would overheat, seeing that GDDR5 does not usually require active cooling.
Regardless, 3DMark11 shouldn't heat up the PCB within 3 seconds to the point that it black screens, right...? I'm pretty sure my entire board is <55°C within those 3 seconds.

And 1.25~1.266V is not even that much heat, considering my old reference 290 runs 1.3V without black screens as long as the reference cooler is screaming.

Also, when it black screens, the GPU's fan profile goes back to normal instead of the 80% fan speed I set. Not sure if the black screen others are encountering resets the fan profile (or OC).


----------



## rdr09

Quote:


> Originally Posted by *passinos*
> 
> Good point. So the delta is $100 if I factor in new backplates for the 290Xs.
> 
> Do you think the binned CryoVenom 290s will be close to the reference 290Xs w/ Koolance?


Check their warranties. I know PowerColor's are not transferable.


----------



## battleaxe

Quote:


> Originally Posted by *cennis*
> 
> It is an ASUS DirectCU II 290.
> 
> My core is zip-tied to an H100 and core temps stay below 65°C.
> The VRM heatsink has the ASUS DirectCU II high-speed fan blowing at it at 80%, and it stays below 75°C.
> 
> Memory speed is stock, so I don't see why it would overheat, seeing that GDDR5 does not usually require active cooling.
> Regardless, 3DMark11 shouldn't heat up the PCB within 3 seconds to the point that it black screens, right...? I'm pretty sure my entire board is <55°C within those 3 seconds.
> 
> And 1.25~1.266V is not even that much heat, considering my old reference 290 runs 1.3V without black screens as long as the reference cooler is screaming.
> 
> Also, when it black screens, the GPU's fan profile goes back to normal instead of the 80% fan speed I set. Not sure if the black screen others are encountering resets the fan profile (or OC).


Probably the memory is unstable for some reason. You might have some bad memory chips. If you add a bit of core voltage, things should stabilize; if they do, your memory is bad. The only time mine black screens is when the memory is pushed too high.


----------



## cennis

Quote:


> Originally Posted by *battleaxe*
> 
> Probably the memory is unstable for some reason. You might have some bad memory chips. If you add a bit of core voltage, things should stabilize; if they do, your memory is bad. The only time mine black screens is when the memory is pushed too high.


My posts probably got lost in the thread, but my problem occurs when I put more than 1.211V under load:
1200/1250 stable at 1.211V
1200/1250 black screen at 1.25V
1220/1250 black screen at 1.25V


----------



## passinos

Quote:


> Originally Posted by *rdr09*
> 
> check their warranties. i know powercolor's are not transferable.


They are VisionTeks, for both the 290 and the 290X. 1 year unless registered within 30 days of purchase, then lifetime, but only for the original owner.

Screw it, going for the 290Xs. A little more chancy, but the price is good.


----------



## rdr09

Quote:


> Originally Posted by *passinos*
> 
> They are VisionTeks, for both the 290 and the 290X. 1 year unless registered within 30 days of purchase, then lifetime, but only for the original owner.
> 
> Screw it, going for the 290Xs. A little more chancy, but the price is good.


good choice.


----------



## rdr09

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Just a guess, but the CPU is alright while I'm running extremely high resolutions. If there is a CPU component to my scores, it will be a lower min, lower max, and probably lower avg FPS. It won't be the same bottleneck as if I was running 1080p, where the GPUs are waiting on the CPU so much. That said, I'll OC the hell out of the CPU soon enough. Right now my goal is to have good stability while testing 4K Eyefinity.


It depends on the games you'll bench. Single-player may not matter much, but multiplayer . . . the CPU needs to be fast enough to feed all four cards.


----------



## Jflisk

Quote:


> Originally Posted by *passinos*
> 
> Good point. So the delta is $100 if I factor in new backplates for the 290Xs.
> 
> Do you think the binned 290 CryoVenoms will be close to the ref 290X w/ Koolance?


Wow, mine from EK were like $30.00 a pop.


----------



## passinos

Quote:


> Originally Posted by *rdr09*
> 
> good choice.


Thanks, makes me feel better about the decision!

Done and checked out: 2x 290X VisionTeks, reference design, with Koolance blocks, $925.
BF4 should be nice at [email protected]


----------



## passinos

Quote:


> Originally Posted by *Jflisk*
> 
> Wow, mine from EK were like $30.00 a pop.


I meant that the price difference between the 290s and the 290Xs, once I add in new backplates, will be $50 more. The total difference was $125.


----------



## Jflisk

Quote:


> Originally Posted by *passinos*
> 
> I meant that the price difference between the 290s and the 290Xs, once I add in new backplates, will be $50 more. The total difference was $125.


That makes sense now. They will definitely pull heat from the die and VRMs.


----------



## Roy360

I want my ASUS R9 290 to run at 947/1250 straight from the BIOS, but I get the feeling that flashing a reference BIOS will just mess up the card.

By the way, how do you tell if a CrossFire setup is stable? I don't want any unnecessary power going to waste, so I'm going to flash a new BIOS with a step up from the absolute lowest voltage it can run at.

I used VBE 7.0.0.7b to mod my older AMD cards, but it doesn't seem to work with the R9 290s.


----------



## Tobiman

Quote:


> Originally Posted by *rdr09*
> 
> if only 290s can clock as easy . . .
> 
> http://www.3dmark.com/3dm11/7684243
> 
> http://cdn.overclock.net/4/4d/4de873ba_ep5QzJ.png
> 
> http://www.3dmark.com/3dm/1895751
> 
> http://cdn.overclock.net/3/36/36f6a78e_x005XZ.png
> 
> http://www.3dmark.com/3dm/1895777
> 
> http://cdn.overclock.net/5/5d/5dd53db4_KqAXkd.png
> 
> just for comparison.


yeah...http://www.3dmark.com/3dm11/8144294
My card doesn't like going over 1250MHz, though, which is a bummer, but I can game at those clocks with my G10.


----------



## Tobiman

Quote:


> Originally Posted by *Roy360*
> 
> I want my ASUS R9 290 to run at 947/1250 straight from the BIOS, but I get the feeling that flashing a reference BIOS will just mess up the card.
> 
> By the way, how do you tell if a CrossFire setup is stable? I don't want any unnecessary power going to waste, so I'm going to flash a new BIOS with a step up from the absolute lowest voltage it can run at.
> 
> I used VBE 7.0.0.7b to mod my older AMD cards, but it doesn't seem to work with the R9 290s.


You can always flash it back to stock if the reference BIOS is unstable. For flashing, I've had good success using atiflash (DOS version) on my PCS+ 290 and many other cards.


----------



## chronicfx

Anyone know the power draw of three R9 290Xs at stock settings? I have been gaming off and on, and my computer has randomly rebooted a couple of times; not often, maybe 3 times in a month, all while gaming. I have an EVGA 1300W G2; do you think I am going over capacity in some instances? Is this bad for the GPUs? I am thinking of putting my FSP Booster X5 on one of my GPUs now, but I hate that damn lighted X on the front, and it is not modular, so I would have to tuck the two extra PCIe cables somewhere. I would rather not install it if you think it is something else...


----------



## igrease

Okay, so I managed to get my 290 up to 1080 and it played fine for about 10+ matches of BF4. I then upped it to 1100 and noticed artifacting, so I closed the game and dropped it back down to 1080. Now some things in the distance don't fully render until I get close. I don't remember if that happened before. Did I bork muh card?


----------



## cennis

Quote:


> Originally Posted by *Roy360*
> 
> I want to my ASUS R9 290 to run at 947/1250 straight from the bios. But I get the feeling that flashing a reference bios will just mess up the card.
> 
> by the way, how do you tell if a crossfire setup is stable? I don't want any unnecessary power going to waste, so I"m going to flash a new bios with a step from the absolute lowest voltage it can run at.
> 
> I used VBE7.0.0.7b to mod my older amd cards, but it doesn't seem to work with the R9 290s


I tried flashing the PT1 BIOS from another thread, but that BIOS is bricked for now. I don't think the ASUS custom PCB takes a stock BIOS. The PowerColor PCS+ is a reference PCB, so it flashes anything.


----------



## Tobiman

Quote:


> Originally Posted by *chronicfx*
> 
> Anyone know the power draw of three R9 290Xs at stock settings? I have been gaming off and on, and my computer has randomly rebooted a couple of times; not often, maybe 3 times in a month, all while gaming. I have an EVGA 1300W G2; do you think I am going over capacity in some instances? Is this bad for the GPUs? I am thinking of putting my FSP Booster X5 on one of my GPUs now, but I hate that damn lighted X on the front, and it is not modular, so I would have to tuck the two extra PCIe cables somewhere. I would rather not install it if you think it is something else...


Max power draw should be around 1000W. Maybe it's more of a problem with the rail not supplying enough current?


----------



## Jflisk

Quote:


> Originally Posted by *chronicfx*
> 
> Anyone know the power draw of three R9 290Xs at stock settings? I have been gaming off and on, and my computer has randomly rebooted a couple of times; not often, maybe 3 times in a month, all while gaming. I have an EVGA 1300W G2; do you think I am going over capacity in some instances? Is this bad for the GPUs? I am thinking of putting my FSP Booster X5 on one of my GPUs now, but I hate that damn lighted X on the front, and it is not modular, so I would have to tuck the two extra PCIe cables somewhere. I would rather not install it if you think it is something else...


Some power supplies have multiple rails; you might want to check that with yours.

You also might want to check this out as far as power usage goes:

http://www.realhardtechx.com/index_archivos/Page362.htm


----------



## chronicfx

Thanks for the quick answer. 1300W is the recommended wattage in that chart. I believe the G2 has it all on a single rail. Yeah, the box says 1299.6 watts on the 12V rail.
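For what it's worth, the rough arithmetic behind estimates like the ~1000W figure above can be sketched as a quick headroom check. The per-card and rest-of-system wattages below are assumed round numbers for illustration, not measured values:

```python
# Rough PSU headroom sketch. The default wattages are assumed round
# numbers (roughly 300W per stock 290X, ~250W for CPU/board/drives),
# not measured figures; adjust them for your own system.
def psu_headroom(psu_watts, gpu_count, watts_per_gpu=300, rest_of_system=250):
    """Return estimated spare capacity in watts (negative means overloaded)."""
    total_draw = gpu_count * watts_per_gpu + rest_of_system
    return psu_watts - total_draw

# Three 290Xs on a 1300W unit leave little margin once overclocking
# transients or an aging PSU are factored in.
print(psu_headroom(1300, 3))  # 150
```

Even a positive number this small suggests transient spikes could trip the PSU's protection, which would look exactly like a random reboot under load.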


----------



## hotrod717

Finally got this installed. A little running around and some painstaking work to get the factory backplate on.


----------



## Jflisk

Quote:


> Originally Posted by *chronicfx*
> 
> Thanks for the quick answer. 1300W is the recommended wattage in that chart. I believe the G2 has it all on a single rail. Yeah, the box says 1299.6 watts on the 12V rail.


Check for dump files; they're not always created, and if one wasn't created, chances are it's power.
C:\Windows\Minidump\Minidump.dmp (they might be stamped with the date and time of the crash).

Find BlueScreenView on the internet and open the dump file; it might help you track down the problem.

Dump files are usually created at the point of crash.
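For anyone who'd rather not hunt through that folder by hand, a small sketch along these lines lists any dumps newest first before you open them in a tool like BlueScreenView (the path is the standard Windows minidump folder mentioned above):

```python
# Sketch: list Windows minidump files newest-first. If nothing is found
# after a crash, that points toward power rather than a driver BSOD.
import datetime
import glob
import os

DUMP_DIR = r"C:\Windows\Minidump"  # standard Windows minidump folder

def list_minidumps(dump_dir=DUMP_DIR):
    """Return (path, modified-time) pairs for every .dmp file, newest first."""
    dumps = glob.glob(os.path.join(dump_dir, "*.dmp"))
    dumps.sort(key=os.path.getmtime, reverse=True)
    return [(p, datetime.datetime.fromtimestamp(os.path.getmtime(p)))
            for p in dumps]

if __name__ == "__main__":
    found = list_minidumps()
    if not found:
        print("No minidumps found; the crashes may be power-related.")
    for path, when in found:
        print(when, path)
```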


----------



## cephelix

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *hotrod717*
> 
> Finally got this installed. A little running around and some painstaking work to get the factory backplate on.






Very smexy.....








Still waiting on my MSI R9 290 Gaming waterblock to become available. I keep checking the cooling configurator and it says "coming soon".


----------



## Roy360

Quote:


> Originally Posted by *hotrod717*
> 
> Finally got this installed. A little running around and some painstaking work to get the factory backplate on.


How did you do it? Did you install it before or after the waterblock? I'd love to save $30 and use my own backplate.


----------



## hotrod717

Quote:


> Originally Posted by *Roy360*
> 
> How did you do it? Did you install it before or after the waterblock? I'd love to save $30 and use my own backplate.


I'm going to post a short guide on how I did it in the "Lightning Thread". Some light modding was needed, but nothing too hard. Temps are great so far.

*Instructions are up for installing the factory backplate with an EK waterblock on the 290X Lightning.


----------



## Matt-Matt

Quote:


> Originally Posted by *chronicfx*
> 
> Anyone know the power draw of three R9 290Xs at stock settings? I have been gaming off and on, and my computer has randomly rebooted a couple of times; not often, maybe 3 times in a month, all while gaming. I have an EVGA 1300W G2; do you think I am going over capacity in some instances? Is this bad for the GPUs? I am thinking of putting my FSP Booster X5 on one of my GPUs now, but I hate that damn lighted X on the front, and it is not modular, so I would have to tuck the two extra PCIe cables somewhere. I would rather not install it if you think it is something else...


Try just two of the cards instead of three. Depending on your other overclocks and hardware and the age of the PSU, you could be overdrawing it, yes. See how two goes, and if it still does it, I'd say it's not the PSU (or maybe try one as well?). That's the only way you'll really know, haha.

My old PSU did it with two 7950s; that was 750W, but it had degraded quite a bit.


----------



## Arizonian

Quote:


> Originally Posted by *Gooberman*
> 
> Well i just bought a 290x for $365 on ebay :O


Sweet. Submit a club entry when you get your card. (Just look at OP) and let us know how it runs.

Quote:


> Originally Posted by *hotrod717*
> 
> Finally got this installed. A little running around and some painstaking work to get the factory backplate on.
> 
> 
> 
> Spoiler: Warning: Spoiler!


Looking forward to your results in the Lightning club. An update here with a cross-link would be nice.


----------



## Paul17041993

Quote:


> Originally Posted by *Roy360*
> 
> Did anyone find an oil-like substance on their GPU core? I installed a waterblock on my 4th R9 290, and there was some strange oil on the core and heatsink. (From the fans?)
> 
> P.S. How do I set the voltage to be the same across all my cards?
> 
> When I had 3 cards, I could safely undervolt to -80mV,
> but now with 4 cards, sometimes -70 works and other times it crashes at -50.
> 
> I tested the new card outside the main rig, and it operated for 3 days at -70mV.
> 
> But according to GPU-Z, all the cards are running their own voltages, i.e. 1.086, 1.063, 1.133.
> The core clock seems to be fluctuating on my VisionTek card too, between 944.2 and 945.6 (I've set them all to 947 with MSI Afterburner).
> 
> I want to see how low I can bring down the power drain. Right now, each card is taking 220W.


Quote:


> Originally Posted by *darkelixa*
> 
> I have oil on the back of my R9 290 just next to the backplate screw. The card is ruined and it's not even water cooled. Time to buy an R9 280X.


That's most likely from the thermal paste, if not spill from the fans; it's not even conductive, so it's nothing to worry about.

Quote:


> Originally Posted by *cennis*
> 
> It is a DirectCU II ASUS 290.
> 
> My core is zip-tied to an H100 and core temps stay below 65C.
> The VRM heatsink has the ASUS DirectCU II high-speed fan blowing at it at 80%, and it stays below 75C.
> 
> Memory speed is stock, so I don't see why it would overheat, seeing that GDDR5 does not usually require active cooling.
> Regardless, 3DMark11 shouldn't heat up the PCB in 3 seconds so that it black screens, right...? I'm pretty sure my entire board is <55C within those 3 seconds.
> 
> And 1.25~1.266 volts is not even that much heat, considering my old reference 290 runs 1.3 volts without black screening as long as the reference cooler is screaming.
> 
> Also, when it black screens, the GPU's fan profile goes back to normal instead of the 80% fan speed I set. Not sure if the black screens others are encountering reset the fan profile (or OC).


Black screens are an issue on the ASUS cards regardless of whether you're overclocking; they have really crappy memory. You might be able to get it stable with a different BIOS, however.

Also check the PCB temp; it wouldn't surprise me if the back of it is scalding hot and overheating the memory.


----------



## cennis

Quote:


> Originally Posted by *Paul17041993*
> 
> That's most likely from the thermal paste, if not spill from the fans; it's not even conductive, so it's nothing to worry about.
> Black screens are an issue on the ASUS cards regardless of whether you're overclocking; they have really crappy memory. You might be able to get it stable with a different BIOS, however.
> 
> Also check the PCB temp; it wouldn't surprise me if the back of it is scalding hot and overheating the memory.


I tried flashing another BIOS (PT1) and it bricked that BIOS. I don't think the reference PCB BIOS can work on this custom one.
And temps aren't the problem; I touched the PCB and RAM chips and they're cool. Besides, it's only seconds of run time. I feel like it's my mobo..


----------



## Paul17041993

Quote:


> Originally Posted by *cennis*
> 
> I tried flashing another BIOS (PT1) and it bricked that BIOS. I don't think the reference PCB BIOS can work on this custom one.
> And temps aren't the problem; I touched the PCB and RAM chips and they're cool. Besides, it's only seconds of run time. I feel like it's my mobo..


If it were the mobo, everything would hardlock before the black screen; if anything, you wouldn't black screen at all. If you black screen or get a stripy/messy screen and the mobo still responds to the keyboard (e.g., Num Lock still toggles) at least for a second afterwards, then it's the card failing.

I think you might just be stuck at those clocks; the DCII isn't exactly great for overclocking in the first place, at least not without a full-cover waterblock or subzero...

Have you tried underclocking the memory, though, to see if it still black screens?


----------



## cennis

Quote:


> Originally Posted by *Paul17041993*
> 
> If it were the mobo, everything would hardlock before the black screen; if anything, you wouldn't black screen at all. If you black screen or get a stripy/messy screen and the mobo still responds to the keyboard (e.g., Num Lock still toggles) at least for a second afterwards, then it's the card failing.
> 
> I think you might just be stuck at those clocks; the DCII isn't exactly great for overclocking in the first place, at least not without a full-cover waterblock or subzero...
> 
> Have you tried underclocking the memory, though, to see if it still black screens?


It black screens after 10 seconds instead of 2 seconds at 1100MHz.

What puzzles me, though, is why a higher voltage (1.25V+) at 1200/1250 black screens the card when the same clocks are stable at a lower voltage, 1.211V.


----------



## Paul17041993

Quote:


> Originally Posted by *cennis*
> 
> It black screens after 10 seconds instead of 2 seconds at 1100MHz.
> 
> What puzzles me, though, is why a higher voltage (1.25V+) at 1200/1250 black screens the card when the same clocks are stable at a lower voltage, 1.211V.


Vdroop from the higher power draw screwing with memory voltage regulation, or you're hitting the PCB's TDP limit.


----------



## Mercy4You

Quote:


> Originally Posted by *cennis*
> 
> It black screens after 10 seconds instead of 2 seconds at 1100MHz.
> 
> What puzzles me, though, is why a higher voltage (1.25V+) at 1200/1250 black screens the card when the same clocks are stable at a lower voltage, 1.211V.


Sorry if I didn't read all your posts, but does it also black screen at stock?

My black screens caused total lockups and were not mobo-related, because a new GPU (Hynix) solved it completely...


----------



## cennis

Quote:


> Originally Posted by *Mercy4You*
> 
> Sorry if I didn't read all your posts, but does it also black screen at stock?
> 
> My black screens caused total lockups and were not mobo-related, because a new GPU (Hynix) solved it completely...


I don't think it has a problem at stock, but I haven't gamed much. None during benches, though.


----------



## centvalny

Matrix h20 test



http://imgur.com/XM2SlZ8


----------



## cennis

Quote:


> Originally Posted by *centvalny*
> 
> Matrix h20 test
> 
> 
> 
> http://imgur.com/XM2SlZ8


... Elpida at 1677MHz..


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *centvalny*
> 
> Matrix h20 test
> 
> 
> 
> http://imgur.com/XM2SlZ8











http://www.3dmark.com/3dm11/7725420


----------



## VSG

That's weird: your graphics and physics scores are lower, but the combined score is over 1k higher. Did either of you change clocks during the run?


----------



## rdr09

Quote:


> Originally Posted by *igrease*
> 
> Okay, so I managed to get my 290 up to 1080 and it played fine for about 10+ matches of BF4. I then upped it to 1100 and noticed artifacting, so I closed the game and dropped it back down to 1080. Now some things in the distance don't fully render until I get close. I don't remember if that happened before. Did I bork muh card?


DX11 or Mantle? It could be the GPU waiting on the CPU.


----------



## Mercy4You

Guys, maybe you can help me out here. Last year I did a lot of searching on 60Hz versus 120Hz monitors before I bought one. Yesterday I found an article which stated that it's humanly impossible to see more than 90 FPS.

Is that true, and why then are 120Hz monitors on the market, only for 3D? I am testing 100Hz now because I see quite a heat difference between gaming @100 versus @120 FPS... And with summer coming, I'd like to lose, say, 10C on my VRM1...

Almost all the articles I found compare 60Hz to 120Hz, and indeed that is a big difference. In fact, my accuracy in gaming went up a lot since I started gaming at 120Hz...

But what's the story at 100Hz/100FPS?


----------



## Mega Man

Different people can see faster than others. They used to say no one could see more than 30 FPS, then 60, etc. Do what you like and what looks best.


----------



## Mercy4You

Quote:


> Originally Posted by *Mega Man*
> 
> Different people can see faster than others. They used to say no one could see more than 30 FPS, then 60, etc. Do what you like and what looks best.


I understand that, but... no one can lift 2,000 lbs.

So where is the limit in hertz? Is it indeed 90?

BTW, your response time was Mega fast


----------



## Mega Man

Not much to do ATM.

Also, note that people have single-handedly lifted large boulders, cars, and helicopters off of other people, and it has been documented.

My previous point was that the max FPS "people can see" seems to change. Worry less about what people tell you you can see and more about what you can actually see.


----------



## Jflisk

Quote:


> Originally Posted by *Mercy4You*
> 
> I understand that, but... no one can lift 2,000 lbs.
> 
> So where is the limit in hertz? Is it indeed 90?
> 
> BTW, your response time was Mega fast


Okay, 60 FPS at 60Hz is the usual answer. For games, 120Hz is better; it just means the screen is going to move and refresh quicker.

This seems to be a good article describing Hz vs. FPS:

http://www.pcmag.com/article2/0,2817,2379206,00.asp


----------



## Jflisk

Quote:


> Originally Posted by *Mega Man*
> 
> Not much to do ATM.
> 
> Also, note that people have single-handedly lifted large boulders, cars, and helicopters off of other people, and it has been documented.
> 
> My previous point was that the max FPS "people can see" seems to change. Worry less about what people tell you you can see and more about what you can actually see.


Mega, you home yet or still out and about?


----------



## Dasboogieman

Quote:


> Originally Posted by *Mercy4You*
> 
> Guys, maybe you can help me out here. Last year I did a lot of searching on 60Hz versus 120Hz monitors before I bought one. Yesterday I found an article which stated that it's humanly impossible to see more than 90 FPS.
> 
> Is that true, and why then are 120Hz monitors on the market, only for 3D? I am testing 100Hz now because I see quite a heat difference between gaming @100 versus @120 FPS... And with summer coming, I'd like to lose, say, 10C on my VRM1...
> 
> Almost all the articles I found compare 60Hz to 120Hz, and indeed that is a big difference. In fact, my accuracy in gaming went up a lot since I started gaming at 120Hz...
> 
> But what's the story at 100Hz/100FPS?


To be honest, I think 120Hz smoothness is partly placebo. I can understand that the difference between 30Hz and 60Hz is perceptible, but the craze strikes me as very similar to the audiophile craze:
sound can actually be perceived as better if the listener is convinced there is some inherent superiority. That being said, I have seen some truly astounding audiophile setups which sound amazing, but how much of that quality is derived from real engineering improvements and how much from clever marketing?

I have also encountered blinded tests where most participants could not perceive the difference in motion between 120Hz and 60Hz. Yet there are many gamers in the wild who swear by their 120Hz setups.

Basically, 120Hz exists for 3D, but there was a secondary marketing angle in claiming "superior" motion. Still, like audiophile gear, the technology is there for hardcore motion aficionados.

Personally, I find image quality and color to be of greater importance than absolute motion fluidity. Continuous viewing of incorrect colors is extremely jarring to me, to the point of inducing eyestrain.


----------



## hotrod717

Quote:


> Originally Posted by *Arizonian*
> 
> Sweet. Submit a club entry when you get your card. (Just look at OP) and let us know how it runs.
> Looking forward to your results in the Lightning club. Just update us here with a cross link would be nice.


Here you go. First hour with the waterblock.
Definitely room for improvement.









http://www.overclock.net/t/1472341/official-msi-r9-290x-lightning-thread/450#post_22142643


----------



## Mega Man

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> Not much to do. Atm.
> 
> Also to note people solo gave lifted large boulders and cars and helicopters off of people and it has been documented.
> 
> My previous point was it seems to change. What the max fps "people can see" worry less about what people tell you you can see and more about what you can see
> 
> 
> 
> Mega you home yet or still out and about.

Not till the 30th. Why?


----------



## Sgt Bilko

Well, XFX gave me the go-ahead to take the cooler off my card and change the paste, so it will be interesting to see what some MX-4 under there does to the temps.

Might even see what I can do about VRM cooling as well.


----------



## Arizonian

Quick reminder - if someone uses profanity, don't bother wasting your time replying, as it gets removed along with anyone's reply that quoted it.


----------



## brucevilanch

Did not know that. Thanks for the heads-up.


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *centvalny*
> 
> Matrix h20 test
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://imgur.com/XM2SlZ8
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/7725420

Did the hexacore boost the score? I can get P19298 with *two* 290s in CrossFire @1200/1625 & [email protected]


----------



## chiknnwatrmln

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/7725420


Your graphics score looks a bit low for those clocks. I would think 100MHz would make more than a 200-point difference in graphics score, even if my memory is a bit higher.
290 @ 1245/6500 w/ Elpida:
http://www.3dmark.com/3dm11/7941592


----------



## Tokkan

If this trend of cheap R9 290s flooding the market keeps going, I might snag another one and do a new build, passing my current one to my gf.

They're going for as low as 200 pounds on eBay. If they keep flooding in, I can see the prices getting even lower, at least on the used market.


----------



## xaho

I am hoping somebody out there can help me with the following problem:

When playing Metro Last Light, the core clock doesn't seem to want to go any higher than about 500MHz. It shouldn't be a temperature issue (watercooled -> 46 degrees Celsius).
I also tried setting the power limit to +50% (at first in CCC, then in MSI AB), and even tried setting the fan speed manually to 100% just for the sake of it.

Sadly it doesn't seem to want to go to the proper minimum of 9xx MHz so I can raise my FPS above the current 17 FPS -_-'

Any tips, ideas or help would be much appreciated. I searched this thread and saw someone else with the same issue, but didn't find a solution to it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *xaho*
> 
> I am hoping somebody out there can help me with the following problem:
> 
> When playing Metro Last Light, the core clock doesn't seem to want to go any higher than about 500MHz. It shouldn't be a temperature issue (watercooled -> 46 degrees Celsius).
> I also tried setting the power limit to +50% (at first in CCC, then in MSI AB), and even tried setting the fan speed manually to 100% just for the sake of it.
> 
> Sadly it doesn't seem to want to go to the proper minimum of 9xx MHz so I can raise my FPS above the current 17 FPS -_-'
> 
> Any tips, ideas or help would be much appreciated. I searched this thread and saw someone else with the same issue, but didn't find a solution to it.


You have a couple of options to try:

do a wipe with DDU, then try the 13.12 drivers;

enable CCC and crank the power limit there;

or try TriXX and see if that works.


----------



## motokill36

Quote:


> Originally Posted by *Tokkan*
> 
> If this trend of cheap R9 290s flooding the market keeps going, I might snag another one and do a new build, passing my current one to my gf.
> 
> They're going for as low as 200 pounds on eBay. If they keep flooding in, I can see the prices getting even lower, at least on the used market.


Just got one off eBay for £170, so well happy.


----------



## heroxoot

Quote:


> Originally Posted by *xaho*
> 
> I am hoping somebody out there can help me with the following problem:
> 
> When playing Metro Last Light, the core clock doesn't seem to want to go any higher than about 500MHz. It shouldn't be a temperature issue (watercooled -> 46 degrees Celsius).
> I also tried setting the power limit to +50% (at first in CCC, then in MSI AB), and even tried setting the fan speed manually to 100% just for the sake of it.
> 
> Sadly it doesn't seem to want to go to the proper minimum of 9xx MHz so I can raise my FPS above the current 17 FPS -_-'
> 
> Any tips, ideas or help would be much appreciated. I searched this thread and saw someone else with the same issue, but didn't find a solution to it.


I had a similar issue with BF4 when it first came out. Nothing would trigger my card to clock up, and I had no idea why. I would just force the clocks to full with a profile in MSI AB, and the problem eventually went away later.


----------



## Sgt Bilko

Quote:


> Originally Posted by *heroxoot*
> 
> I had a similar issue with BF4 when it first came out. Nothing would trigger my card to clock up, and I had no idea why. I would just force the clocks to full with a profile in MSI AB, and the problem eventually went away later.


Ahh, disabling ULPS, that could work too.


----------



## xaho

Quote:


> Originally Posted by *heroxoot*
> 
> I had a similar issue with BF4 when it first came out. Nothing would trigger my card to clock up and I had no idea why. I would just force the clocks to full with its profile in MSIAB and the problem eventually went away later.


How does one "force" clocks in MSI AB? I just set the core/memory clock a bit higher than stock with the power limit on +50% and pressed Apply; will that do? Or did you assign it to a profile and make it stick with some sort of setting?


----------



## Sgt Bilko

Quote:


> Originally Posted by *xaho*
> 
> How does one "force" clocks in MSI AB? I just set the core/memory clock a bit higher than stock with the power limit on +50% and pressed Apply; will that do? Or did you assign it to a profile and make it stick with some sort of setting?


Try disabling ULPS in the Afterburner settings.


----------



## xaho

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ahh, Disable ULPS, that could work too


Disable ULPS in the registry, I suppose?
Any info on which one? The first one? Or all of them?

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video\{905177D4-8A74-476A-9433-FF69B6A6ED11}\0000
I also have 0001, 0002, 0003, 0004 and 0005, which all have the EnableUlps key...

edit: I'll just do this then: http://forums.guru3d.com/showthread.php?t=327291 (change them all!)
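As an aside, the "change them all" approach from that Guru3D link can be sketched in Python. This is a hypothetical, untested illustration (Windows-only, needs the standard-library winreg module and an elevated prompt), not the guide's own script; the class-key GUID is the one from the registry path above:

```python
import re

# AMD display-adapter class key quoted in the post above (assumed correct
# for this machine; the GUID can differ between installs).
AMD_CLASS_KEY = (r"SYSTEM\CurrentControlSet\Control\Video"
                 r"\{905177D4-8A74-476A-9433-FF69B6A6ED11}")

def adapter_subkeys(names):
    """Keep only the numbered adapter subkeys (0000, 0001, ...)."""
    return [n for n in names if re.fullmatch(r"\d{4}", n)]

def disable_ulps():
    """Set EnableUlps=0 under every numbered subkey. Run elevated, then reboot."""
    import winreg  # Windows-only module
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, AMD_CLASS_KEY) as root:
        names = [winreg.EnumKey(root, i)
                 for i in range(winreg.QueryInfoKey(root)[0])]
    for name in adapter_subkeys(names):
        sub = AMD_CLASS_KEY + "\\" + name
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, sub, 0,
                            winreg.KEY_SET_VALUE) as key:
            winreg.SetValueEx(key, "EnableUlps", 0, winreg.REG_DWORD, 0)
```

Editing every numbered subkey matches the "change them all" advice; a reboot is still needed before the setting takes effect.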


----------



## Sgt Bilko

Quote:


> Originally Posted by *xaho*
> 
> Disable ULPS in the registry, I suppose?
> Any info on which one? The first one? Or all of them?
> 
> HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video\{905177D4-8A74-476A-9433-FF69B6A6ED11}\0000
> I also have 0001, 0002, 0003, 0004 and 0005, which all have the EnableUlps key...


----------



## xaho

Quote:


> Originally Posted by *Sgt Bilko*


Well, that's awkward: the AMD compatibility properties section doesn't exist in the General tab of my MSI Afterburner properties. But I just edited the registry, and now after a reboot it constantly runs at 947MHz. Setting the core clock in MSI AB to 999MHz only shows 960MHz in the logging window, and in Sapphire TriXX it's the same issue: set 1000MHz and MSI AB shows only 960MHz. (Correction: it does peak at 1000MHz while gaming, see image below.)

For now my problems are solved, though; time to enjoy some Metro LL. Thank you very much! ^^
So now the clock stays stable, but the GPU usage in MSI AB just keeps going from 0 to 100 and back again. :') No Metro just yet for me!


----------



## cennis

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Your graphics score looks a bit low for those clocks. I would think 100MHz would make more than a 200-point difference in graphics score, even if my memory is a bit higher.
> 290 @ 1245/6500 w/ Elpida:
> http://www.3dmark.com/3dm11/7941592


Can you do a run at 1200/6000 and show me your graphics score for 3DMark11 P?
I am getting a 17200 graphics score with tess OFF; I don't think 45MHz/125MHz more will bring me 3500 points up.
Also, with 1220/5000 I get 17200.
Something wrong with my Elpida?

On the 14.4 drivers; 13.12 was similar, never broke 18000.
I notice that sometimes my utilization drops to ~95%, but my clocks never throttle.


----------



## Forceman

Quote:


> Originally Posted by *xaho*
> 
> Well, that's awkward: the AMD compatibility properties section doesn't exist in the General tab of my MSI Afterburner properties. But I just edited the registry, and now after a reboot it constantly runs at 947MHz. Setting the core clock in MSI AB to 999MHz only shows 960MHz in the logging window, and in Sapphire TriXX it's the same issue: set 1000MHz and MSI AB shows only 960MHz. (Correction: it does peak at 1000MHz while gaming, see image below.)
> 
> For now my problems are solved, though; time to enjoy some Metro LL. Thank you very much! ^^
> So now the clock stays stable, but the GPU usage in MSI AB just keeps going from 0 to 100 and back again. :') No Metro just yet for me!


Try updating to the newest beta release of Afterburner; the old version you are using doesn't properly support Hawaii cards.


----------



## rdr09

Quote:


> Originally Posted by *cennis*
> 
> Can you do a run at 1200/6000 and show me your graphics score for 3DMark11 P?
> I am getting a 17200 graphics score with tess OFF; I don't think 45MHz/125MHz more will bring me 3500 points up.
> Also, with 1220/5000 I get 17200.
> Something wrong with my Elpida?
> 
> On the 14.4 drivers; 13.12 was similar, never broke 18000.
> I notice that sometimes my utilization drops to ~95%, but my clocks never throttle.


17200 is with tess on. To break 18000 with the 290 you're gonna need around 1250. Here is 1260 . . .

http://www.3dmark.com/3dm11/8142501

13.12 should work, but not any of the 14.x drivers except the leaked version, or so I read.


----------



## cennis

Quote:


> Originally Posted by *rdr09*
> 
> 17200 is with tess on. to break 18000 with the 290 you're gonna need around 1250. here is 1260 . . .
> 
> http://www.3dmark.com/3dm11/8142501
> 
> 13.12 should work but not any of the 14 drivers except the leaked version or so i read.


Yea, I'm using the leaked 14.4. I thought I had tess disabled but I will check


----------



## rdr09

Quote:


> Originally Posted by *cennis*
> 
> Yea, I'm using the leaked 14.4. I thought I had tess disabled but I will check


thanks for testing 14.4. i am switching between 13.11 and 14.3 (mantle).

http://www.3dmark.com/3dm11/8190009


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *geggeg*
> 
> That's weird, your graphics and physics scores are lower but the combined score is over 1k higher. Either of you changed clocks during the run?
> Quote:
> 
> 
> 
> Originally Posted by *chiknnwatrmln*
> 
> Your graphics score looks a bit low for those clocks. I would think 100MHz would make more than a 200 point difference in graphics score, even if my memory is a bit higher.
> 290 @ 1245/6500 w/ Elpida:
> http://www.3dmark.com/3dm11/7941592

On this run it blackscreened in and out through the whole bench; maybe that's why it's a bit low, BUT it's a single R9 290 world record on HWBOT








http://hwbot.org/submission/2471716_homecinema_pc_3dmark11___performance_radeon_r9_290_19306_marks








Quote:


> Originally Posted by *rdr09*
> 
> DX11 or Mantle? it could be the gpu waiting on the cpu.


Imagine if hexcore could bottleneck mantle ......... NOT








Quote:


> Originally Posted by *kizwan*
> 
> Did hexa boost the score? I can get P19298 with *two* 290 crossfire @1200/1625 & [email protected]


Combination of a random convergence of technologies and Hexcore Physics and my MAD SKILLZ








Quote:


> Originally Posted by *motokill36*
> 
> Just got one off eBay for £170 so well happy
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Tokkan*
> 
> If this trend of cheap r9 290 flooding the market keeps going I might snag another one and make a new build to pass my current one to my gf
> 
> 
> 
> 
> 
> 
> 
> 
> They're going as low as 200 pounds on eBay. If they keep flooding, I can see the prices getting even lower on the used market at least.

I picked up an XFX unlocked 290 / 290X for $420AU = 233.35 POUNDS


----------



## diggiddi

What's to look out for when purchasing cards that have been used for mining off EBay?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *diggiddi*
> 
> What's to look out for when purchasing cards that have been used for mining off EBay?


Good question, umm, warped or slightly melted plastic maybe? I chat to the owner a bit to find out what site they are on and what they do


----------



## igrease

Alright, so I have had my fill of Battlefield 4 on Ultra settings, with fps varying between 100 and 120+. Now I want 120 FPS consistently. Of course I tried lowering the settings, but my frames actually dropped with them. On Low settings my GPU doesn't go above 60% usage. Is there a way to fix this? Happens on all drivers.


----------



## diggiddi

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Good question, umm, warped or slightly melted plastic maybe? I chat to the owner a bit to find out what site they are on and *what they do*


You mean their job if they have one?


----------



## rdr09

Quote:


> Originally Posted by *igrease*
> 
> Alright, so I have had my fill of Battlefield 4 on Ultra settings, with fps varying between 100 and 120+. Now I want 120 FPS consistently. Of course I tried lowering the settings, but my frames actually dropped with them. On Low settings my GPU doesn't go above 60% usage. Is there a way to fix this? Happens on all drivers.


raise your resolution scale. try 140%. DX11 first. that and/or oc the cpu higher.
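A quick sketch of why raising the resolution scale shifts the load back onto the GPU: the slider multiplies each axis, so the pixel count (and the shading work per frame) grows with the square of the percentage. This assumes per-axis scaling at a 1080p output, which is how BF4's slider behaves:

```python
def scaled_pixels(width: int, height: int, scale_pct: float) -> int:
    """Pixels rendered per frame when each axis is scaled by scale_pct percent."""
    s = scale_pct / 100.0
    return round(width * s) * round(height * s)

native = scaled_pixels(1920, 1080, 100)  # 2,073,600 pixels
at_140 = scaled_pixels(1920, 1080, 140)  # 2688 x 1512 = 4,064,256 pixels
print(at_140 / native)  # 1.96 -> almost double the shading work per frame
```

So a 140% scale nearly doubles the per-frame pixel work, which is usually enough to make a CPU-bound setup GPU-bound again.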


----------



## Dasboogieman

Quote:


> Originally Posted by *xaho*
> 
> I am hoping anybody out here can help me with the following problem:
> 
> 
> When playing Metro Last Light the core clock doesn't seem to want to go any higher than about 500MHz
> 
> 
> 
> 
> 
> 
> 
> , it shouldn't be a temperature issue (watercooled -> 46 degrees celsius),
> I also tried setting the power limit to +50% (at first in CCC, then in MSI AB), even tried setting fan speed manual to 100% just for the sake of it.
> 
> Sadly it doesn't seem to want to go to the proper minimum of 9xx MHz so I can raise my FPS above the current 17 FPS -_-'
> 
> Any tips, ideas or help would be much appreciated. I have searched this thread, in which I saw someone else have the same issue, but didn't find a solution to it.


Ah I had this problem earlier.
MSI Afterburner in unofficial mode doesn't work; all it does is set a maximum clock speed, but the clocks will throttle down when the core is loaded (granted, the game/benchmark will still run at a steady 900ish MHz).
TriXX doesn't work because it only sets the maximum clock; it doesn't force 3D-only clocks.
Disabling ULPS didn't work for me. I don't think it solves your problem, though it doesn't really hurt to try.

There are only 2 foolproof ways I found around this though each has its compromises.

1. Custom RadeonPro profiles. You can force maximum 3D clocks on an individual game basis when you create profiles. The main advantage of this is that you still retain full idle and video clocks while still getting full boost. The biggest drawback to this method is that you can't force 3D clocks AND do voltage-assisted overclocking at the same time: since RadeonPro uses the OverDrive engine (which doesn't support overvoltage), any overclock you perform after the game is launched will override RadeonPro's settings.

2. Flash the ASUS 290X PT1T BIOS. This works like a charm; it's actually a 290X BIOS, but I was able to flash it onto my Sapphire Tri-X 290 with no issues. The beauty of this BIOS is that it sets 1000MHz as the minimum state of the GPU at all times (i.e. PowerPlay has been disabled). You can then use a specially modded ASUS GPU Tweak (which was borderline impossible to locate) to do your overclocking and overvoltage. The card typically boosts to your overclocked speeds 95% of the time and will do so for most games. Why the PT1T version? This one has VDROOP, but it also apparently has some funky coding to prevent pre-boot blackscreens if your motherboard is picky about mismatched device IDs.
The biggest disadvantage to this method is that your GPU runs at 1000MHz all the time; no 2D or video clocks are present. I'm running this configuration on air at the moment with no issues, so I can't imagine you having problems since your card is watercooled.
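For anyone doing the flash from the bootable USB, a sketch of the usual DOS ATIFLASH sequence. The adapter index and filenames here are placeholders, and you should double-check the switches against the unlock thread before flashing anything; always save the stock ROM first so you can recover:

```
rem List adapters and confirm the index of the card you want to flash
atiflash -i
rem Back up the existing BIOS from adapter 0 before anything else
atiflash -s 0 stock.rom
rem Program the PT1T ROM; -f forces it past the 290/290X device-ID mismatch
atiflash -p 0 pt1t.rom -f
```

If the flash goes bad, the card's second BIOS switch (on reference boards) is the easiest way back to a bootable state.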


----------



## cennis

Quote:


> Originally Posted by *Dasboogieman*
> 
> You can then use a specially modded ASUS GPU TWEAK (which was borderline impossible to locate) to do your overclocking and overvoltage.


Do you mean the one in this post? http://www.overclockers.com/forums/showthread.php?t=739529
or is there another one?
Does this version allow voltage past 1412mV?


----------



## Forceman

Quote:


> Originally Posted by *igrease*
> 
> Alright, so I have had my fill of Battlefield 4 on Ultra settings, with fps varying between 100 and 120+. Now I want 120 FPS consistently. Of course I tried lowering the settings, but my frames actually dropped with them. On Low settings my GPU doesn't go above 60% usage. Is there a way to fix this? Happens on all drivers.


If the frame rates don't go up when you lower settings then you are CPU limited and the GPU is waiting around for the CPU to send it frames to render. Are you using Mantle?


----------



## igrease

Quote:


> Originally Posted by *Forceman*
> 
> If the frame rates don't go up when you lower settings then you are CPU limited and the GPU is waiting around for the CPU to send it frames to render. Are you using Mantle?


Yes I am using Mantle


----------



## chronicfx

Anyone know if the vsync+crossfire audio crackling issue has been fixed in 14.4? I am playing Far Cry 3 with vsync off, but the cutscenes still use it anyway, and the skipping and crackling make me drop my headphones around my neck until the scene ends. I heard 14.4 fixes this, but I hear it also brings a host of other problems vs. 13.12. Anyone running 14.4 and happy?

By the way, two cards seem to be working well. For those who addressed my PSU issue last night.


----------



## Dasboogieman

Quote:


> Originally Posted by *cennis*
> 
> Do you mean the one in this post? http://www.overclockers.com/forums/showthread.php?t=739529
> or is there another one?
> Does this version allow voltage past 1412mV?


Yup, that's the ASUS GPU Tweak version. Be really, really, really, really careful, I can't stress this enough: the interface is so cluttered I almost accidentally dialed in 2000mV once since I briefly confused it with the memory clock tab.

The PT1T can be found here:
http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread

under the Other 290x ROM spoilers


----------



## DeadlyDNA

Quote:


> Originally Posted by *chronicfx*
> 
> Anyone know if the vsync+crossfire audio crackling issue has been fixed in 14.4? I am playing Far Cry 3 with vsync off, but the cutscenes still use it anyway, and the skipping and crackling make me drop my headphones around my neck until the scene ends. I heard 14.4 fixes this, but I hear it also brings a host of other problems vs. 13.12. Anyone running 14.4 and happy?
> 
> By the way, two cards seem to be working well. For those who addressed my PSU issue last night.


is 14.4 officially released as beta drivers and not leaked?


----------



## cennis

Quote:


> Originally Posted by *Dasboogieman*
> 
> Yup thats the ASUS GPU Tweak version, be really really really really careful, I can't stress this enough, the interface is so cluttered I almost accidentally dialed 2000mv once since I briefly confused it with the memory clock tab.
> 
> The PT1T can be found here:
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread
> 
> under the Other 290x ROM spoilers


I flashed the PT1T bios onto my asus dcuii card WORKS


----------



## Dasboogieman

Quote:


> Originally Posted by *cennis*
> 
> I flashed the PT1T bios onto my asus dcuii card before, which has a custom PCB; it doesn't seem to work. Gonna try out the tool later anyway. Thanks


Did you use the DOS USB ATIFLASH or the ATIwinflash? I had issues with the ATIwinflash when I did mine but the bootable USB worked like a charm.


----------



## cennis

Quote:


> Originally Posted by *Dasboogieman*
> 
> Did you use the DOS USB ATIFLASH or the ATIwinflash? I had issues with the ATIwinflash when I did mine but the bootable USB worked like a charm.


I used a bootable USB and it has worked on my reference cards before; I don't think I did anything different this time, but it doesn't have any video output on that BIOS now.


----------



## HardwareDecoder

Is it likely that if I flashed the PT1 BIOS to my 290X xfire setup I would be able to overclock past 1150 on the core w/ +100mV?

I guess what I'm asking is: am I voltage limited, or did my cards just hit a wall?


----------



## Forceman

Quote:


> Originally Posted by *DeadlyDNA*
> 
> is 14.4 officially released as beta drivers and not leaked?


Just leaked.
Quote:


> Originally Posted by *HardwareDecoder*
> 
> Is it likely that if I flashed the PT1 BIOS to my 290X xfire setup I would be able to overclock past 1150 on the core w/ +100mV?
> 
> I guess what I'm asking is: am I voltage limited, or did my cards just hit a wall?


You can use up to +200mV with Trixx, I'd try that before I flashed the PT1. I can't remember if the PT1 gets rid of Vdroop or not, but that's the only real way it would help you with the same voltage, and using Trixx would let you simulate that by using the higher offset.


----------



## HardwareDecoder

Quote:


> Originally Posted by *Forceman*
> 
> Just leaked.
> You can use up to +200mV with Trixx, I'd try that before I flashed the PT1. I can't remember if the PT1 gets rid of Vdroop or not, but that's the only real way it would help you with the same voltage, and using Trixx would let you simulate that by using the higher offset.


Thanks for the info, didn't know Trixx could do +200mV.

Unfortunately, I tried 1200 core with +200mV and it gave me a ton of lines on my monitor.

It was the same kind of thing that happens when I try to overclock my QNIX monitor while overclocking my 290X xfire setup.

I really think there is just something "wrong" with my QNIX.

I have to choose between 1000MHz core / 110Hz on the monitor, or 1150MHz core / 60Hz stock.

Otherwise I get lines and distortion. Not sure what the cause is; maybe I just lost the monitor lottery. Gonna be replacing this thing as soon as a 4K monitor with the specs I want comes out.


----------



## Roboyto

Quote:


> Originally Posted by *cennis*
> 
> Can you do a run at 1200/6000 and show me your graphics score for 3dm11 P?
> I an getting 17200 graphics score with tess OFF, I dont think 45mhz/125mhz more will bring me 3500 points up
> also with 1220/5000 I get 17200.
> Something wrong with my elpida?
> 
> on 14.4 drivers, 13.12 was similar never broke 18000.
> I notice that sometimes my utilization drops to 95%~ but my clocks never throttle


There is nothing wrong with your Elpida, it's average. Elpida typically has a smaller chance of overclocking well compared to Hynix. I have seen both scenarios of fantastic Elpida, and crap Hynix...just luck of the draw. My first 290 had Hynix and could only go to 1350.

RAM clocks do practically nothing for 3DMark11, especially on performance settings. I don't even gain 2% when pushing RAM up to 1700.



My best run with Tess ON is 16312. http://www.3dmark.com/3dm11/8251505

Best run with Tess OFF is 17940. If I had a 6-core I would be tickling 20k I'm pretty sure. http://www.3dmark.com/3dm11/8251499

Here is 1200/6000 for reference as you requested: http://www.3dmark.com/3dm11/8251540 I am running 4770k @ 4.5GHz with 16GB RAM @ 2400MHz


----------



## cennis

Quote:


> Originally Posted by *Roboyto*
> 
> There is nothing wrong with your Elpida, it's average. Elpida typically has a smaller chance of overclocking well compared to Hynix. I have seen both scenarios of fantastic Elpida, and crap Hynix...just luck of the draw. My first 290 had Hynix and could only go to 1350.
> 
> RAM clocks do practically nothing for 3DMark11, especially on performance settings. I don't even gain 2% when pushing RAM up to 1700.
> 
> 
> 
> My best run with Tess ON is 16312. http://www.3dmark.com/3dm11/8251505
> 
> Best run with Tess OFF is 17940. If I had a 6-core I would be tickling 20k I'm pretty sure. http://www.3dmark.com/3dm11/8251499
> 
> Here is 1200/6000 for reference as you requested: http://www.3dmark.com/3dm11/8251540 I am running 4770k @ 4.5GHz with 16GB RAM @ 2400MHz


That's some ridiculous graphics scores..
What voltage was it under load?

*Also, I have made some progress on my DCUII 290 black screening when I raise voltage past +250mV at a previously stable clock. I flashed the PT1T BIOS and that problem has gone away; I put 1.375V under load into the card without a black screen.
The VRM reaches 116C, probably not that good, but at least we know the PCB has no limitations; it's the BIOS that has problems. Not sure if another manufacturer's BIOS will work as well as PT1T. Don't like how the GPU doesn't downclock*


----------



## cennis

Quote:


> Originally Posted by *rdr09*
> 
> thanks for testing 14.4. i am switching between 13.11 and 14.3 (mantle).
> 
> http://www.3dmark.com/3dm11/8190009


got some 19600 with tess off at 1200/1500

Thanks


----------



## Roboyto

Quote:


> Originally Posted by *cennis*
> 
> That's some ridiculous graphics scores..
> What voltage was it under load?
> 
> *Also, I have made some progress on my DCUII 290 black screening when I raise voltage past +250mV at a previously stable clock. I flashed the PT1T BIOS and that problem has gone away; I put 1.375V under load into the card without a black screen.
> The VRM reaches 116C, probably not that good, but at least we know the PCB has no limitations; it's the BIOS that has problems. Not sure if another manufacturer's BIOS will work as well as PT1T. Don't like how the GPU doesn't downclock*


Glad to see that you have solved the black screen issue. 116C on VRM







Holy cow, that's definitely not good for long term use! I would work on getting that temp under control ASAP.

I put Fujipoly Extreme on my Powercolor Devil R9 270X in place of the thermal tape, and it dropped VRM1 temps by 7C (10%) on that card. The Ultra Extreme would probably do even better in your case.

http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition/10

Voltage under load



The lack of downclocking is why I haven't tried the PT1T BIOS personally; I know I could put it on the 2nd BIOS switch. My card performs well enough in stock XFX configuration...I don't want to tempt fate and blow it up while trying to push it further, since 1300/1700 is pretty good already. Killing it wouldn't really bother me that much, but getting a dud overclocker as a replacement definitely would.


----------



## centvalny

Heaven h20



http://imgur.com/3sAPD68



Matrix's LN2 bios max settings on GPU Tweak and XCB board



http://imgur.com/2W7DBVu





http://imgur.com/kTPZd4c


----------



## Mercy4You

Quote:


> Originally Posted by *Mega Man*
> 
> My previous point was it seems to change. What the max fps "people can see" worry less about what people tell you you can see and more about what you can see


Quote:


> Originally Posted by *Jflisk*
> 
> Okay, 60 fps or 60 Hz is the best answer. For games 120 Hz is better; it just means the screen is going to move and refresh quicker.
> 
> this seems to be a good article describing Hz and fps
> 
> http://www.pcmag.com/article2/0,2817,2379206,00.asp


Quote:


> Originally Posted by *Dasboogieman*
> 
> To be honest, I think 120hz smoothness is partly placebo. I can understand the difference between 30hz and 60hz is perceptible but the craze strikes me as very similar to the Audiophile craze.
> Sound can actually be perceived as better if the user is convinced there is some inherent superiority. That being said, I have seen some truly astounding Audiophile setups which sound amazing but how much of that quality is derived from real engineering improvements or simply clever marketing?
> 
> I have also encountered blinded tests where most participants cannot perceive the difference in motion between 120hz and 60hz. Yet there are many gamers in the wild that swear by their 120hz setups.
> 
> Basically, 120hz exists for 3D but there was a secondary marketing angle in having a "superior" motion. However, like Audiophiles, the technology is there for hardcore image motion aficionados.


Thx guys for sharing some new insights









Went back from 120 to 100 Hertz. I cap my frames @ 90 FPS now; much less tearing and still very smooth gameplay...

Less GPU heat, and CPU load on core 4 down from 100% to 80%


----------



## rdr09

Quote:


> Originally Posted by *Roboyto*
> 
> There is nothing wrong with your Elpida, it's average. Elpida typically has a smaller chance of overclocking well compared to Hynix. I have seen both scenarios of fantastic Elpida, and crap Hynix...just luck of the draw. My first 290 had Hynix and could only go to 1350.
> 
> RAM clocks do practically nothing for 3DMark11, especially on performance settings. I don't even gain 2% when pushing RAM up to 1700.
> 
> 
> 
> My best run with Tess ON is 16312. http://www.3dmark.com/3dm11/8251505
> 
> Best run with Tess OFF is 17940. If I had a 6-core I would be tickling 20k I'm pretty sure. http://www.3dmark.com/3dm11/8251499
> 
> Here is 1200/6000 for reference as you requested: http://www.3dmark.com/3dm11/8251540 I am running 4770k @ 4.5GHz with 16GB RAM @ 2400MHz


i think your ram speeds make the difference. this is with 1600 . . .

http://www.3dmark.com/3dm11/7748882


----------



## Toqi

i need the Matrix bios for the R9 290 PCS+ , can someone upload a link? thank you


----------



## Jflisk

Quote:


> Originally Posted by *Toqi*
> 
> i need the Matrix bios for the R9 290 PCS+ , can someone upload a link? thank you


Have you tried the video BIOS collection?

http://www.techpowerup.com/vgabios/


----------



## Gooberman

So excited, getting the card today







benchmarkssssssssssssss


----------



## Toqi

Quote:


> Originally Posted by *Toqi*
> 
> i need the Matrix bios for the R9 290 PCS+ , can someone upload a link? thank you


Quote:


> Originally Posted by *Jflisk*
> 
> Have you tried the video BIOS collection?
> 
> http://www.techpowerup.com/vgabios/


i need the "Asus Matrix" bios for better overclock capacity. i tried the msi lightning bios and the flash failed, i don't know why.
i now need the Asus R9 290x Matrix bios to overclock with an EK universal water block ^^ , can someone upload the Asus R9 290x Matrix bios pls ^^


----------



## xaho

Quote:


> Originally Posted by *Dasboogieman*
> 
> Ah I had this problem earlier.
> MSI afterburner in Unofficial mode doesn't work, all it does is set a maximum clockspeed but the clocks will throttle down when the core is loaded (granted the game/benchmark will still run at a steady 900ish mhz)
> TriXX doesn't work because it only sets the maximum clock, it doesn't force 3D only clocks.
> ULPS Disable didn't work for me. I don't think it solves your problem, though it doesn't really hurt to try.
> 
> There are only 2 foolproof ways I found around this though each has its compromises.
> 
> 1. Custom Radeon Pro profiles. You can use force maximum 3D clocks on an individual game basis when you create profiles. The main advantage of this is that you still retain full idle and video clocks while still getting full boost. The biggest drawback to this method is you can't force 3d clocks AND do voltage assisted overclocking at the same time, since Radeon Pro uses the Overdrive engine (which doesn't support overvoltage), any overclock you perform after the game is launched will override Radeon Pro's settings.
> 
> 2. Flash the ASUS 290X PT1T BIOS. This works like a charm, its actually a 290X bios but I was able to flash it on to my Sapphire Tri X 290 card with no issues. The beauty of this BIOS is it sets 1000mhz as the minimum state of the GPU at all times (i.e. PowerPlay has been disabled). You can then use a specially modded ASUS GPU TWEAK (which was borderline impossible to locate) to do your overclocking and overvoltage. The card typically boosts to your overclocked speeds 95% of the time and will do it for most games. Why the PT1T version? this one has VDROOP but it also apparently has some funky coding to prevent pre-boot blackscreens if your motherboard is picky about mismatched device IDs.
> The biggest disadvantage to this method is that your GPU runs at 1000mhz all the time, no 2d or video clocks are present. I'm running this configuration on air at the moment with no issues so I can't imagine you having problems since your card is watercooled.


Tried it, didn't work








Weird thing is, during benchmarking it blasts at 100% without any problem.
While playing I even tried setting things like "lock frame rate to monitor's refresh rate" in RadeonPro; it didn't do any good. The clocks did go up and stay up with RadeonPro though, so that's something...










Red part is playing, blue part is benchmarking...
Benchmark results:
Options: Resolution: 1920 x 1080; DirectX: DirectX 11; Quality: Very High; Texture filtering: AF 16X; Advanced PhysX: Disabled; Tesselation: High; Motion Blur: Normal; SSAA: ON;
Total Frames: 6988, Total Time: 171.1223 sec
Average Framerate: 40.87
Max. Framerate: 65.32 (Frame: 5)
Min. Framerate: 5.04 (Frame: 790)

Might just have to consider trying to flash it to a 290x


----------



## VSG

Quote:


> Originally Posted by *Toqi*
> 
> i need the "Asus Matrix" bios for better overclock capacity. i tried the msi lightning bios and the flash failed, i don't know why.
> i now need the Asus R9 290x Matrix bios to overclock with an EK universal water block ^^ , can someone upload the Asus R9 290x Matrix bios pls ^^


You can't just flash a BIOS meant for another card on your own; it will likely just brick it.


----------



## Roboyto

Quote:


> Originally Posted by *rdr09*
> 
> i think your ram speeds make the difference. this is with 1600 . . .
> 
> http://www.3dmark.com/3dm11/7748882


Force tessellation off in CCC and your score will come up by around 8-10%.

Plus, my physics score is 500 higher.


----------



## Sgt Bilko

R9 295x2 just became available to the US: http://www.newegg.com/Product/Product.aspx?Item=N82E16814202108


----------



## Roy360

Quote:


> Originally Posted by *Sgt Bilko*
> 
> R9 295x2 just became available to the US: http://www.newegg.com/Product/Product.aspx?Item=N82E16814202108


I could buy 4 ASUS R9 290Xs for that price. Hopefully it will come down to the $800 range


----------



## Sgt Bilko

Quote:


> Originally Posted by *Roy360*
> 
> I could buy 4 ASUS R9 290Xs for that price. Hopefully it will come down to the $800 range


True, but these would only require 4 slots on your board instead of the 8 needed by 4 x 290x's









plus you can run them both at x16 instead of x16/x8/x8/x4 in some cases.


----------



## heroxoot

Quote:


> Originally Posted by *geggeg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Toqi*
> 
> i need the "Asus Matrix" bios for better overclock capacity. i tried the msi lightning bios and the flash failed, i don't know why.
> i now need the Asus R9 290x Matrix bios to overclock with an EK universal water block ^^ , can someone upload the Asus R9 290x Matrix bios pls ^^
> 
> 
> 
> You can't just flash a BIOS meant for another card on your own; it will likely just brick it.

This just isn't true. People flash their cards with other BIOSes all the time. Only cards with a custom PCB cannot use another BIOS, like the Lightning cards.


----------



## Jflisk

R9 295X2, what are they smoking? I'll probably get one at a later date when the price comes down some and an EK water block is available.


----------



## rdr09

Quote:


> Originally Posted by *Roboyto*
> 
> Force tessellation off in CCC, and your score will come up around 8-10%
> 
> Plus my physics score is 500 higher.


i did . . .

http://www.3dmark.com/3dm11/7748506

true. your cpu is faster.


----------



## Roboyto

Quote:


> Originally Posted by *rdr09*
> 
> i did . . .
> 
> http://www.3dmark.com/3dm11/7748506
> 
> true. your cpu is faster.


Nice Score :-D

80 MHz on RAM would make a little difference, but the physics score has more effect in this case.

I have been able to get my 4770k to 4.7, I wonder if I could break 18000?


----------



## Roy360

Quote:


> Originally Posted by *Sgt Bilko*
> 
> True, but these would only require 4 slots on your board instead of the 8 needed by 4 x 290x's
> 
> 
> 
> 
> 
> 
> 
> 
> 
> plus you can run them both at x16 instead of x16/x8/x8/x4 in some cases.


in that case, I'd buy 4 ASUS R9 290s, a Maximus Extreme board, and a USB riser for my sound card for that price

335.99*4 + 300 + 30 = 1673.96
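The back-of-envelope total checks out; a throwaway snippet for anyone re-running the numbers (prices as quoted above, the card price assumed to be per unit):

```python
cards = 4 * 335.99  # four R9 290s at the quoted price each
board = 300.00      # Maximus Extreme board
riser = 30.00       # USB riser for the sound card
total = cards + board + riser
print(round(total, 2))  # 1673.96
```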


----------



## rdr09

Quote:


> Originally Posted by *Roboyto*
> 
> Nice Score :-D
> 
> 80 MHz on RAM would make a little difference, but the physics score has more effect in this case.
> 
> I have been able to get my 4770k to 4.7, I wonder if I could break 18000?


you might. if not . . . get a X79. lol


----------



## Sgt Bilko

Quote:


> Originally Posted by *Roy360*
> 
> in that case, I'd buy 4 ASUS R9 290s, a Maximus Extreme board, and a USB riser for my sound card for that price
> 
> 335.99*4 + 300 + 30 = 1673.96
> 
> *for some reason I thought the R9 295X2 was 4 R9 290s on a single PCB.*


Now that would be a feat indeed









It's about convenience, the better factory cooler, and the fact that you have the fastest single card around (yes, I know it's 2 GPUs).

I don't think the price is that bad really, I mean the 7990 launched at $999, correct?

So: new gen, a factory CLC, and the full 290X cores plus a small OC on them?

Not that bad really...


----------



## Roboyto

Quote:


> Originally Posted by *rdr09*
> 
> you might. if not . . . get a X79. lol


I have contemplated it...but there is no real gain for my purposes. Once my MicroCenter warranty expires for my Z87 Gryphon I will consider it...or if the Gryphon croaks









Quote:


> Originally Posted by *Sgt Bilko*
> 
> Now that would be a feat indeed
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's about convenience, better factory cooler and the fact you have the fastest single card around (yes i know it's 2 gpu's)
> 
> I don't think the price is that bad really, i mean the 7990 launched at $999 correct?
> 
> So new gen, factory CLC and the full 290x cores plus a small oc on them?
> 
> Not that bad really...


I concur, $1500 isn't all that bad considering what you're getting. Dual 290X and essentially dual AIO coolers using one radiator.

If you figure:


(2) 290X @ $550
(2) AIO @ $60 each
Sweet packaging
Makes Titan Z look ridiculous

Roughly $1220, plus the stupendous packaging it comes in...If I had $1500 lying around I'd buy one and run 5x1 portrait EyeFinity









If I were to buy one of these I would be SURE to have a set of handcuffs when I went and picked it up...Only time I would ever be able to carry a sweet item in a briefcase handcuffed to my wrist


----------



## Sgt Bilko

Quote:


> Originally Posted by *Roboyto*
> 
> I concur, $1500 isn't all that bad considering what you're getting. Dual 290X and essentially dual AIO coolers using one radiator.
> 
> If you figure:
> 
> (2) 290X @ $550
> (2) AIO @ $60 each
> Sweet packaging
> Makes Titan Z look ridiculous
> Roughly $1220, plus the stupendous packaging it comes in...If I had $1500 lying around I'd buy one and run 5x1 portrait EyeFinity


I'm trying not to compare it to the stupidity that is the Titan Z but when Nvidia themselves market it as a "gaming" card







then what do you do?


----------



## Roboyto

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm trying not to compare it to the stupidity that is the Titan Z but when Nvidia themselves market it as a "gaming" card
> 
> 
> 
> 
> 
> 
> 
> then what do you do?


Laugh at the guy who bought a Titan Z, when he could have had (2) briefcases strapped to his wrists







(see pics added above)


----------



## Jhors2

I recently watercooled my trifire 290Xs and have some really weird screen blanking issues over DisplayPort if I go beyond +75mV. VRM and core temps always stay below 50C. Probably just crappy cards, as I have some coil whine on one of them; ASIC quality is in the mid to high 70% range across the three. I'd appreciate any input, though


----------



## Roboyto

Quote:


> Originally Posted by *Jhors2*
> 
> I recently watercooled my trifire 290Xs and have some really weird screen blanking issues on displayport if I start going beyond +75mv. VRM temps and core temps always stay below 50c. Probably just crappy cards as I have some coil whine on one of them, ASIC quality is in the mid to high 70% range between the three. Although I'd appreciate any input


Beta drivers have been problematic for quite some time; 13.12 is the way to go for benching/testing purposes. Play around with the betas once you know everything else is A-OK.

Scrub old drivers with Display Driver Uninstaller when changing.

Make sure you have ULPS disabled in whatever OC utility you fancy. If you're using CCC, I would suggest disabling OverDrive immediately and using MSI AB, Trixx, HIS iTurbo...anything but CCC, pretty much. CCC will automatically save the OC profile and load it with Windows, so if you push too far you can end up in a very annoying crash/boot loop. CCC has also been having issues with the Power Limit, but I believe most have said that 14.4 fixed this?
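For reference, if your OC utility of choice doesn't expose a ULPS toggle, the usual manual route is a registry edit. A sketch of the common .reg fragment (the `0000` subkey is the first display adapter; CrossFire setups also get `0001`, `0002`, and so on, so set the value in each, and back up the registry before importing anything):

```
Windows Registry Editor Version 5.00

; Disable ULPS on the first display adapter (repeat for 0001, 0002, ... with CrossFire)
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

A reboot is needed for the change to take effect, and driver reinstalls can silently set it back to 1.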

Raise Power Limit with your voltage. With water cooling I just ram the slider over to 50% from the get go since I know things won't be overheating.

Force constant voltage should be used as well in any of the aforementioned OC utilities.

Overclock core first, RAM speed is not super important on these cards. Whenever I pushed my 290s too far on RAM, it always caused black screens and lockups. Pushing core too far, or not having enough voltage, always meant artifacts, tearing, or screen flickering.

Don't have any experience with Display Port, so that may be something to investigate as well. Do you have all DP monitors, or are you using a DP splitter/converter/adapter of some sort?

Can't forget the basics either...Try reseating the cards, and check all other connections while you're at it.

My card has coil whine as well, but it doesn't stop it from hitting 1300/1700 with +200mV









Let us know what happens


----------



## heroxoot

Quote:


> Originally Posted by *Jhors2*
> 
> I recently water-cooled my trifire 290Xs and have some really weird screen-blanking issues on DisplayPort if I start going beyond +75mv. VRM temps and core temps always stay below 50c. Probably just crappy cards, as I have some coil whine on one of them; ASIC quality is in the mid-to-high 70% range between the three. I'd appreciate any input, though


Assuming there is no adapter between the DisplayPort and your monitor, I'd suspect a bad port.


----------



## Tokkan

My Sapphire R9 290 arrived, and it fits in my Lexa S. So that's solved








Some quick tests running it at stock with my Phenom show a _*massive*_ difference compared to my old 6850 crossfire. Needless to say, I'm pretty satisfied.
CPU usage in BF4 with DX11 is averaging at 60% per core, none was pegged at 100% and GPU had no issues with going to 100%.
Completely maxed out everything, and then decided to increase the scaling option to 200% still was butter smooth.
VRM temps topped at 65 degrees during bf4 while core temp topped at 95 degrees.
Elpida memory on mine.
GPU-Z link



Edit: I'm trying to search online to compare scores; seems like I'm a tad on the low side.


----------



## Jhors2

Unfortunately there is an MST adapter sitting between the 2 monitors and the single DisplayPort. It very rarely blanks, but for some reason when I go over +75mv it blanks progressively more, until the displays completely lose signal. Kind of strange that core voltage affects that port. I'd swap cards, but it's a huge pain with the watercooling loop


----------



## Sgt Bilko

14.4 Driver has been released (301MB)

http://www2.ati.com/drivers/amd-catalyst-14.4-rc-v1.0-windows-apr17.exe


Spoiler: Improvements and Warnings



Feature Highlights of The AMD Catalyst™ 14.4 Release Candidate Driver for Windows
Support for the AMD Radeon™ R9 295X
CrossFire™ fixes and enhancements:
Crysis 3 - frame pacing improvements
Far Cry 3 - 3 and 4 GPU performance improvements at high quality and high resolution settings
Anno 2070 - Improved CrossFire scaling up to 34%
Titanfall - Resolved in game flickering with CrossFire enabled
Metro Last Light - Improved Crossfire scaling up to 10%
Eyefinity 3x1 (with three 4K panels) no longer cuts off portions of the application
Stuttering has been improved in certain applications when selecting mid-Eyefinity resolutions with V-sync Enabled
Full support for OpenGL 4.4
OpenGL 4.4 supports the following extensions:
ARB_buffer_storage
ARB_enhanced_layouts
ARB_query_buffer_object
ARB_clear_texture
ARB_texture_mirror_clamp_to_edge
ARB_texture_stencil8
ARB_vertex_type_10f_11f_11f_rev
ARB_multi_bind
ARB_bindless_texture
ARB_sparse_texture
ARB_seamless_cubemap_per_texture
ARB_indirect_parameters
ARB_compute_variable_group_size
ARB_shader_draw_parameters
ARB_shader_group_vote
Mantle beta driver improvements:
BattleField 4: Performance slowdown is no longer seen when performing a task switch/Alt-tab
BattleField 4: Fuzzy images when playing in rotated SLS resolution with an A10 Kaveri system
Known Issues
System will TDR or BSOD when encoding with Power Director 11
Driver installation might result in a black screen when installing on a Dual AMD Radeon R9 295X configuration under Windows 8.1 on specific platforms (see below). The issue can be overcome by rebooting the PC; upon reboot the display driver will be installed. The remaining Catalyst components can then be installed.
ASUS Crosshair V Formula-Z (990FX)
ASUS Maximus VI Extreme (Z87)
ASUS Rampage IV Extreme (X79)



EDIT: 2k posts, Wooo!!


----------



## Roboyto

GPU-Z 0.7.8 allows you to enlarge the window vertically so you don't have to take 2 screenshots to show VRM temps







But only while in the sensors tab

My post on the TechPowerUp forums must have inspired someone to make the change


----------



## Sgt Bilko

It's a great addition to be sure


----------



## Gooberman

Sapphire R9 290x Stock cooling


Just installing drivers now


----------



## DeadlyDNA

I have submitted bug reports in CCC; to be honest, I am surprised to see 4K Eyefinity listed. I just assumed things like quad CrossFire
and three 4K screens would not be addressed!!

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 14.4 Driver has been released (301MB)
> 
> http://www2.ati.com/drivers/amd-catalyst-14.4-rc-v1.0-windows-apr17.exe
> 
> 
> Spoiler: Improvements and Warnings
> 
> 
> 
> Feature Highlights of The AMD Catalyst™ 14.4 Release Candidate Driver for Windows
> Support for the AMD Radeon™ R9 295X
> CrossFire™ fixes and enhancements:
> Crysis 3 - frame pacing improvements
> Far Cry 3 - 3 and 4 GPU performance improvements at high quality and high resolution settings
> Anno 2070 - Improved CrossFire scaling up to 34%
> Titanfall - Resolved in game flickering with CrossFire enabled
> Metro Last Light - Improved Crossfire scaling up to 10%
> Eyefinity 3x1 (with three 4K panels) no longer cuts off portions of the application
> Stuttering has been improved in certain applications when selecting mid-Eyefinity resolutions with V-sync Enabled
> Full support for OpenGL 4.4
> OpenGL 4.4 supports the following extensions:
> ARB_buffer_storage
> ARB_enhanced_layouts
> ARB_query_buffer_object
> ARB_clear_texture
> ARB_texture_mirror_clamp_to_edge
> ARB_texture_stencil8
> ARB_vertex_type_10f_11f_11f_rev
> ARB_multi_bind
> ARB_bindless_texture
> ARB_sparse_texture
> ARB_seamless_cubemap_per_texture
> ARB_indirect_parameters
> ARB_compute_variable_group_size
> ARB_shader_draw_parameters
> ARB_shader_group_vote
> Mantle beta driver improvements:
> BattleField 4: Performance slowdown is no longer seen when performing a task switch/Alt-tab
> BattleField 4: Fuzzy images when playing in rotated SLS resolution with an A10 Kaveri system
> Known Issues
> System will TDR or BSOD when encoding with Power Director 11
> Driver installation might result in a black screen when installing on a Dual AMD Radeon R9 295X configuration under Windows 8.1 on specific platforms (see below). The issue can be overcome by rebooting the PC; upon reboot the display driver will be installed. The remaining Catalyst components can then be installed.
> ASUS Crosshair V Formula-Z (990FX)
> ASUS Maximus VI Extreme (Z87)
> ASUS Rampage IV Extreme (X79)
> 
> 
> 
> EDIT: 2k posts, Wooo!!


----------



## rdr09

Quote:


> Originally Posted by *Tokkan*
> 
> My Sapphire R9 290 arrived, and it fits in my Lexa S. So that's solved
> 
> 
> 
> 
> 
> 
> 
> 
> Some quick tests running it at stock with my Phenom shows a _*massive*_ difference when compared to my old 6850 crossfire. Needless to say that I'm pretty satisfied.
> CPU usage in BF4 with DX11 is averaging at 60% per core, none was pegged at 100% and GPU had no issues with going to 100%.
> Completely maxed out everything, and then decided to increase the scaling option to 200% still was butter smooth.
> VRM temps topped at 65 degrees during bf4 while core temp topped at 95 degrees.
> Elpida memory on mine.
> GPU-Z link
> 
> 
> 
> Edit: I'm trying to search online to compare scores, seems like I'm a tad on the low.












at 1080p, you should be able to max out C3.


----------



## kayan

So, I have a quick question...how are you guys getting /seeing your ASIC scores?

What mV should I start off with for overclocking my 290x? In MSI AB I can hit 1110/1325 with no voltage increase, but the power slider is at +50 and ULPS is off. I'm on water cooling and temps are fine on core and VRAM 1 & 2.

Any help would be appreciated.


----------



## Roboyto

Quote:


> Originally Posted by *kayan*
> 
> So, I have a quick question...how are you guys getting /seeing your ASIC scores?
> 
> What V mV should I start off for over clocking my 290x? In MSI AB I can hit 1110/1325 with no voltage increase, but power slider is at +50, ULPS is off. I'm on water cooling and temps are fine on core and *VRAM* 1 &2.
> 
> Any help would be appreciated.


Right-Click on the upper left hand corner of the GPU-Z window, and choose "Read ASIC Quality".

VRM, not VRAM. Probably just a typo, but they are very different things









Personally I add mV in small increments, around 10mV at a time, and see how far it can go. This way you can find the 'sweet spot' blend of volts, clocks and temps. When you get artifacting, tears, flickers, etc., you know it's time for more voltage.

Only increase core clock at first, it is far more important than RAM speeds on these cards. This will also reduce confusion of wondering if the core clock, or RAM clock, is causing your instability.

Too high of a core clock, or not enough voltage, always gave me artifacting, tearing, or screen flickering. Pushing RAM too far was nearly always black screens or lockups.

Keep the power limit at +50

Force Constant Voltage in Trixx/AfterBurner

VRM1 is the temp to watch most closely, once you're in the ballpark of 80-90C is when you need to be concerned. If temps get too high on the VRM1 then thermal pad upgrades can help drastically; Check link in my sig for info.
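The stepping approach above boils down to one simple rule; here's a toy sketch of it (all numbers purely illustrative, not tied to any particular card):

```python
# Toy sketch of the incremental OC method described above:
# if the last bench run was stable, push the core clock further;
# if it artifacted/tore/flickered, hold the clock and add ~10mV.
def next_step(stable, core_mhz, mv_offset, core_step=25, mv_step=10):
    """Return the (core MHz, +mV offset) to try for the next run."""
    if stable:
        return core_mhz + core_step, mv_offset
    return core_mhz, mv_offset + mv_step

# e.g. starting from 1100MHz at +0mV:
step = next_step(True, 1100, 0)   # stable run -> raise core to 1125
step = next_step(False, *step)    # artifacts at 1125 -> stay, add 10mV
```

Stop stepping once VRM1 temps creep toward that 80-90C ballpark, whatever the clocks say.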

Hopefully you have a champ OCer like mine











Spoiler: Warning: Sweet 3DMark Score Inside :D















I have same/similar card depending on if you have BE or not. To get to 1100 without voltage is a good sign so far.

Best of luck to you


----------



## FtW 420

Quote:


> Originally Posted by *Roboyto*
> 
> Right-Click on the upper left hand corner of the GPU-Z window, and choose "Read ASIC Quality".
> 
> VRM, not VRAM. Probably just a typo, but they are very different things
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Personally I add mV in small increments, around 10mV each time and see how far you can go. This way you can find the 'sweet spot' blend of volts, clocks and temps. When you get artifacting, tears, flickers ETC you know it's time for more voltage.
> 
> Only increase core clock at first, it is far more important than RAM speeds on these cards. This will also reduce confusion of wondering if the core clock, or RAM clock, is causing your instability.
> 
> Too high of a core clock, or not enough voltage, always gave me artifacting, tearing, or screen flickering. Pushing RAM too far was nearly always black screens or lockups.
> 
> Keep the power limit at +50
> 
> Force Constant Voltage in Trixx/AfterBurner
> 
> VRM1 is the temp to watch most closely, once you're in the ballpark of 80-90C is when you need to be concerned. If temps get too high on the VRM1 then thermal pad upgrades can help drastically; Check link in my sig for info.
> 
> Hopefully you have a champ OCer like mine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Sweet 3DMark Score Inside :D
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have same/similar card depending on if you have BE or not. To get to 1100 without voltage is a good sign so far.
> 
> Best of luck to you


21578 graphics at 1315/1700? The 290 is pretty close to the 290x, at 1275/1678 I'm at 21598 graphics score...


----------



## Roy360

Quote:


> Originally Posted by *Roboyto*
> 
> Right-Click on the upper left hand corner of the GPU-Z window, and choose "Read ASIC Quality".


I've been wondering this for the longest time. So apparently my ASUS card, the one with the highest tendency to crash, has the best ASIC.

Overall, looks like I have bad luck
79.5
79.8
76.5

awww. if only I didn't kill the VRMs (dropped a tiny heatsink right on the backside of the VRMs) on my old card. I'm sure that card had a killer ASIC value.


----------



## Paul17041993

I wonder how many people realize the efficiency gain the 295X2 has over standard 2*290X on air...


----------



## Jflisk

Quote:


> Originally Posted by *Paul17041993*
> 
> I wonder how many people realize the efficiency gain the 295X2 has over standard 2*290X on air...


Please explain I see 2 x 290X and 2 x 8 pin adapters.


----------



## cennis

Quote:


> Originally Posted by *Roy360*
> 
> I've been wondering this for the longest time. So apparently my ASUS card, the one that has the highest tendency to crash has the best ASIC.
> 
> Overall, looks like I have **** luck
> 79.5
> 79.8
> 76.5
> 
> awww. if only I didn't kill the VRMs (dropped a tiny heatsink right on the backside of the VRMs) on my old card. I'm sure that card had a killer ASIC value.


I think 75+ is above average and close to 80 is great?
I've seen a lot of 290s in the 60s


----------



## maxforces

290 tri-x GPU 1275 Mem 1440

http://www.3dmark.com/3dm/2924270


----------



## cephelix

Quote:


> Originally Posted by *cennis*
> 
> I think 75+ is above average and close to 80 is great?
> I've seen a lot of 290s in the 60s


My asic quality is 83.9 but i could only do 1040MHz core before needing to increase voltage....


----------



## maxforces

I have asic 78.7 1100Mhz without adding voltage


----------



## rdr09

Quote:


> Originally Posted by *maxforces*
> 
> 290 tri-x GPU 1275 Mem 1440
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/2924270


it could be the 14.4 driver. your graphics score should be over 13K.

my bad. your score is right on.


----------



## armartins

"Profanity free"









Spoiler: Warning: Spoiler!



Sgt Bilko, let's agree to disagree... $500 for 2 universal blocks and a single-rad AIO cooler; I know there's R&D to be covered, plus the fact that it's not a mainstream product and won't sell thousands and thousands of that specific model. But on the other hand, "full 290X cores" - well, as far as I'm aware the 7990 also packed two 7970 cores, right? We should replace "oh, it's next gen, so a higher MSRP is OK" with "it's next gen, so it will replace the older card at the same MSRP while the older one gets cheaper"... Nvidia was the mastermind behind the idea that $700 is fine for an enthusiast-grade GPU (after making it look cheap next to the monster-performing $1000 Titan, which was the most impressive GPU launch in a long, long time considering the competition back then), after we saw the GK104 680 basically trade blows with the 7970 at a higher price tag than it should have had... well, that puts out my rant... the fact is, Nvidia's propaganda is now convenient to AMD for putting the Titan Z in perspective... $1500 may sound reasonable to some people, and I totally understand it



Just adding to your later response, Sgt Bilko... the fact that really grinds my gears is that until the 6xxx series (and the same was true of the Nvidia counterpart, the GTX 590), when we bought a dual card it was actually at a lower MSRP... and both the 5970 (I had one at launch, Dec 2009) and the 6990 were able to match crossfired 5870/6970 performance... in most scenarios...


----------



## Arizonian

Quote:


> Originally Posted by *Tokkan*
> 
> My Sapphire R9 290 arrived, and it fits in my Lexa S. So that's solved
> 
> 
> 
> 
> 
> 
> 
> 
> Some quick tests running it at stock with my Phenom shows a _*massive*_ difference when compared to my old 6850 crossfire. Needless to say that I'm pretty satisfied.
> CPU usage in BF4 with DX11 is averaging at 60% per core, none was pegged at 100% and GPU had no issues with going to 100%.
> Completely maxed out everything, and then decided to increase the scaling option to 200% still was butter smooth.
> VRM temps topped at 65 degrees during bf4 while core temp topped at 95 degrees.
> Elpida memory on mine.
> GPU-Z link
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Edit: I'm trying to search online to compare scores, seems like I'm a tad on the low.


Congrats - added









Quote:


> Originally Posted by *Gooberman*
> 
> Sapphire R9 290x Stock cooling
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Just installing drivers now


Congrats - added


----------



## Sgt Bilko

Quote:


> Originally Posted by *armartins*
> 
> Just adding to your later response, Sgt Bilko... the fact that really grinds my gears is that until the 6xxx series (and the same was true of the Nvidia counterpart, the GTX 590), when we bought a dual card it was actually at a lower MSRP... and both the 5970 (I had one at launch, Dec 2009) and the 6990 were able to match crossfired 5870/6970 performance... in most scenarios...


I get where you're coming from, but in all honesty it makes more sense to me that a dual-GPU card would be more expensive than 2 of its single-die counterparts.

Convenience, efficiency, size, and, as before, the fact that it's the single fastest card out there.


----------



## Roy360

Quote:


> Originally Posted by *Paul17041993*
> 
> I wonder how many people realize the efficiency gain the 295X2 has over standard 2*290X on air...


http://hothardware.com/Reviews/AMD-Radeon-R9-295X2-Review-Hawaii-x-2/

500W vs 250*2, so none.

I'll admit it saves 2 PCIe slots, and PCIe bandwidth, but that's about it. According to the papers, anyway. Maybe it's like the GTX 750 Ti and uses significantly less power than the spec sheet.


----------



## FtW 420

Quote:


> Originally Posted by *Roy360*
> 
> http://hothardware.com/Reviews/AMD-Radeon-R9-295X2-Review-Hawaii-x-2/
> 
> 500W vs 250*2, so none.
> 
> I'll admit it saves 2 PCIe slots, and PCIe bandwidth, but that's about it. According the papers anyways. Maybe it's like the GTX 750ti and uses significantly less power than the spec sheet.


If it is like the 7990, it may also outscore 2 x 290x at the same clocks. 2 x 7970 needed to be clocked higher to score the same as a single 7990, not sure if it was the internal PLX or what, but the 7990s scored high for the clocks compared to 7970 xfire.


----------



## TripleTurbo

Anybody here able to tell me what water blocks fit the MSI R9 290X Gaming 4G? Can't seem to find any that do - custom power circuitry or some such, relative to the reference board.


----------



## cephelix

Quote:


> Originally Posted by *maxforces*
> 
> I have asic 78.7 1100Mhz without adding voltage


That's what I don't understand. Some say ASIC value is directly related to OC ability; some say it's an unreliable, arbitrary value.
Whatever it is, I'll probably need to put mine under water and change my board and CPU to see better scores


----------



## Germanian

DO NOT DOWNLOAD those new 14.4 drivers if you are using crossfire.

As soon as I flip on crossfire mode, the 2nd GPU goes to 100% full load while GPU1 stays around 6%. Idk if it's PEBKAC error or what, but I didn't have these issues on 13.12


----------



## Sgt Bilko

Quote:


> Originally Posted by *Germanian*
> 
> DO NOT DOWNLOAD those new 14.4 drivers if you are using crossfire.
> 
> As soon as I flip on crossfire mode, the 2nd GPU goes to 100% full load while GPU1 stays around 6%. Idk if it's PEBKAC error or what, but I didn't have these issues on 13.12


That's weird, they were working fine for me


----------



## DeadlyDNA

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That's weird, they were working fine for me


I want to try these, but I am worried I will waste time with "hotplug detect" and crossfire-vs-eyefinity not working, like the leaked drivers had.


----------



## TripleTurbo

Quote:


> Originally Posted by *cephelix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *maxforces*
> 
> I have asic 78.7 1100Mhz without adding voltage
> 
> 
> 
> That's what I don't understand. Some say ASIC value is directly related to OC ability; some say it's an unreliable, arbitrary value.
> Whatever it is, I'll probably need to put mine under water and change my board and CPU to see better scores
Click to expand...

Dave Bauman (spelling?) remarked on beyond3d 9 or so months ago that gpu-z's ASIC rating for AMD GPUs was flat-out incorrect, and to pay no respect to the value it spit forth. That was GCN 1.0, I believe, but I doubt much has changed. I'll try and dig up the remark.


----------



## Forceman

Quote:


> Originally Posted by *TripleTurbo*
> 
> Dave Bauman (spelling?) remarked on beyond3d 9 or so months ago that gpu-z's ASIC rating for AMD GPUs was flat-out incorrect, and to pay no respect to the value it spit forth. That was GCN 1.0, I believe, but I doubt much has changed. I'll try and dig up the remark.


The comment was about Hawaii specifically.
Quote:


> ASIC "quality" is a misnomer propobated by GPU-z reading a register and not really knowing what it results in. It is even more meaningless with the binning mechanism of Hawaii.


http://forum.beyond3d.com/showpost.php?p=1808073&postcount=2130


----------



## kayan

Quote:


> Originally Posted by *Jflisk*
> 
> Find HWiNFO and download it. You might also be able to see VRM1 temps in GPU-Z; some cards do, some don't. Check the VRM temps and make sure they're under 65C. Also, how many and what type of rads? The 290Xs usually run under 50C core and VRM1 (a more important temp than GPU on these) under 55C, VRM2 under 50C, but it depends on the rads. Looks good though. Did you get the backplate? That helps lower the temps also


I had a 280 30mm Alphacool rad and a 240 40mm XSPC rad, and everything was all messed up. I am swapping out my case, but I only had fans on the 280 (and I stupidly put them on with the airflow going the wrong direction), and because of how my case was shaped, I had no fans on the 240.







Anyway, temps were good, but I assume that when I get my new case and get fans on everything pointing the correct direction then temps should be even better!

New case is due Thurs, I'll update after leak testing.


----------



## kayan

Quote:


> Originally Posted by *Roboyto*
> 
> Right-Click on the upper left hand corner of the GPU-Z window, and choose "Read ASIC Quality".
> 
> VRM, not VRAM. Probably just a typo, but they are very different things
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Personally I add mV in small increments, around 10mV each time and see how far you can go. This way you can find the 'sweet spot' blend of volts, clocks and temps. When you get artifacting, tears, flickers ETC you know it's time for more voltage.
> 
> Only increase core clock at first, it is far more important than RAM speeds on these cards. This will also reduce confusion of wondering if the core clock, or RAM clock, is causing your instability.
> 
> Too high of a core clock, or not enough voltage, always gave me artifacting, tearing, or screen flickering. Pushing RAM too far was nearly always black screens or lockups.
> 
> Keep the power limit at +50
> 
> Force Constant Voltage in Trixx/AfterBurner
> 
> VRM1 is the temp to watch most closely, once you're in the ballpark of 80-90C is when you need to be concerned. If temps get too high on the VRM1 then thermal pad upgrades can help drastically; Check link in my sig for info.
> 
> Hopefully you have a champ OCer like mine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Sweet 3DMark Score Inside :D
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have same/similar card depending on if you have BE or not. To get to 1100 without voltage is a good sign so far.
> 
> Best of luck to you


Thanks for the info! The VRM was an auto-correct typo on my phone that I didn't catch







My VRM1 temp didn't go above 65, and if you check my previous post about my dumb-as-crap rad setup, that's actually pretty impressive!

I have an XFX Reference 290x, not a BE :/. Thinking about adding in a 2nd card too, since they are so cheap (relatively) on eBay right now.

Edit: sorry for the double post, I wasn't sure how to multi-quote from different pages...


----------



## HOMECINEMA-PC

[email protected]@2428 TRI 290 [email protected] *P30704 13.11 B 9.5* Tess off









http://www.3dmark.com/3dm11/8255134
My highest Pscore ever


----------



## Devotii

Been back a few pages and not seen this asked. Is the higher price of the 290X worth it over the 290? I am deciding between these cards, Sapphire TriX series.


----------



## Paul17041993

Quote:


> Originally Posted by *Jflisk*
> 
> Please explain I see 2 x 290X and 2 x 8 pin adapters.


Shared PCB; temps are generally lower than air cooling due to water on the cores, and this might even have the cherry-picking that the 7990 had, which means the cores themselves would get an extra little efficiency boost.

Overall a 295X2 would be 10-20% more efficient than 2x290X on air, while being fairly low in noise and only requiring 2+1 slots (i.e. an extra optional slot to give the fan breathing room).

With 2x290X on full-cover waterblocks, however, there wouldn't be much of a difference, but in that case a 295X2 would only take 2 (or even 1) slots on either its stock cooler or a full-cover block, and many people would find that useful in compact rigs. Overall I'd find its price perfectly justified, especially compared to its rivals...
Quote:


> Originally Posted by *Roy360*
> 
> http://hothardware.com/Reviews/AMD-Radeon-R9-295X2-Review-Hawaii-x-2/
> 
> 500W vs 250*2, so none.
> 
> I'll admit it saves 2 PCIe slots, and PCIe bandwidth, but that's about it. According the papers anyways. Maybe it's like the GTX 750ti and uses significantly less power than the spec sheet.


That's the max TDP, which isn't really related to efficiency, and you can't compare it to a single 290X either, due to different CPU load etc.

If you look at the benchmarks you linked, the card itself is only pulling ~400W, might not even be that, plus it's outperforming the 290X crossfire by quite a lot...
Quote:


> Originally Posted by *Devotii*
> 
> Been back a few pages and not seen this asked. Is the higher price of the 290X worth it over the 290? I am deciding between these cards, Sapphire TriX series.


Yes and no; it depends on whether you have a large-res screen and what sort of cooling you want. Most go for either the large custom-cooler cards or waterblocks.

If you want to go a step further, multiple 290s are a good choice, albeit with a small risk of crossfire issues.


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> [email protected]@2428 TRI 290 [email protected] *P30704 13.11 B 9.5* Tess off
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8255134
> My highest Pscore ever


Very nice there









Should be going pretty well in the higher-res benches as well, now that you have a 3rd, yeah?


----------



## Devotii

Quote:


> Originally Posted by *Paul17041993*
> 
> yes and no, it depends on whether you have a large res screen and what sort of cooling you want, most go for either the large custom cooler cards or waterblocks.


As mentioned it would be the Sapphire TriX series, 290 or 290X. I don't have a large screen resolution but might soon have a 3 monitor setup. It would be air cooled, I only want a single card setup.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Devotii*
> 
> As mentioned it would be the Sapphire TriX series, 290 or 290X. I don't have a large screen resolution but might soon have a 3 monitor setup. It would be air cooled, I only want a single card setup.


Could always go for the Vapor-X when it releases if that's more pleasing to your eyes; I find the Tri-X a bit ugly, imo.

Oh wait, you said single card - ok, you want a 290x then. The more power the better (so long as you don't mind spending a little more)









PCS+, Tri-X, MSI Gaming and the Gigabyte Windforce cards are the best ones out afaik.


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> [email protected]@2428 TRI 290 [email protected] *P30704 13.11 B 9.5* Tess off
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8255134
> My highest Pscore ever


congrat!


----------



## Devotii

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Could always go for the Vapor-X when it releases if that's more pleasing to your eyes, i find the Tri-X to be a bit ugly imo.
> 
> Oh wait, you said single card, ok you want a 290x then. More power the better (so long as you don't mind spending a little more)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PCS+, Tri-X, MSI Gaming and the Gigabyte Windforce cards are the best ones out afaik.


Do you think the extra money for the 290*X* is worth it over the 290? Price to performance, etc.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Devotii*
> 
> Do you think the extra money for the 290*X* is worth it over the 290? Price to performance, etc.


Price/performance-wise the 290 is king, no argument..... just look at HomeCinema-PC's 3DMark 11 results.

I had a single 290x but decided that CF 290s was a much better option.

Personally I think you should get 2 cards for Eyefinity and call it a day; you won't need an upgrade for quite a long time then


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Very nice there
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Should be going pretty good in the higher res benches as well now you have a 3rd yeah?


So far I've got a 19k score on Catzilla 1440p. The 13.11 beta 5 driver seems to work well with trifire

http://www.catzilla.com/showresult?lp=246953


----------



## Devotii

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Price/Performance wise 290 is king, no argument.....just look at HomeCinema-PC's 3DMark 11 results.
> 
> I had a single 290x but decided that CF 290's was a much better option.
> 
> personally i think you should get 2 cards for eye-finity and call it a day, won't need an upgrade for quite a long time then


I can't personally justify more than around £400 on a graphics card. It is single card only, either 290 or 290X [or 780 classy].


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> So far I gots a 19k score on catzilla 1440p . On 13.11 beta 5 driver seems to work well with trifire
> 
> http://www.catzilla.com/showresult?lp=246953


Looking good though, you've set the bar fairly high for me there









Tried out 14.4 yet?

i was getting some decent results with it a little earlier, never did any extensive testing though.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Devotii*
> 
> I can't personally justify more than around £400 on a graphics card. It is single card only, either 290 or 290X [or 780 classy].


Then a 290 will serve you well imo, cheaper than its big brother and almost as fast


----------



## rdr09

Quote:


> Originally Posted by *Devotii*
> 
> I can't personally justify more than around £400 on a graphics card. It is single card only, either 290 or 290X [or 780 classy].


go 290. it is about 35% faster than the 7950.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Looking good though, you've set the bar fairly high for me there
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Tried out 14.4 yet?
> 
> i was getting some decent results with it a little earlier, never did any extensive testing though.


Yes, the leaked one, and it didn't bench as well as this old driver does (a bit broken for cross/TRIfire). I wanna crack 24k on F/S and 14k on extreme








Ordered these bits today..........

Should improve things a bit with a w/b. Then I can bench the 290X when I'm done with TRI


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yes the leaked one and didn't bench as well ( a bit broken for cross / TRIfire ) as this old driver does . I wanna crack 24k on F/S and 14k on extreme
> 
> 
> 
> 
> 
> 
> 
> 
> Ordered these bits today..........
> 
> Should improve things a bit with a w/b . Then I can bench 290X when im done with TRI


I'm using the 14.4 AMD release (not sure if there is a difference) and it's working well for me, better frame rate in BF4 and a few other games.

Looking forward to seeing how you get on with them all under water


----------



## Sgt Bilko

For those that were waiting on EK blocks for the revision Gigabyte and MSI cards here you go: http://www.ekwb.com/shop/blocks/vga-blocks/ati-radeon-full-cover-blocks/radeon-rx-200-series.html

Article:
http://www.techpowerup.com/200108/ek-launches-new-revision-of-ek-fc-r9-290x-water-block.html


----------



## wes1099

Quote:


> Originally Posted by *Sgt Bilko*
> 
> For those that were waiting on EK blocks for the revision Gigabyte and MSI cards here you go: http://www.ekwb.com/shop/blocks/vga-blocks/ati-radeon-full-cover-blocks/radeon-rx-200-series.html
> 
> Article:
> http://www.techpowerup.com/200108/ek-launches-new-revision-of-ek-fc-r9-290x-water-block.html


Awesome! I wonder if they will come out with a plexi+copper version that is not nickel plated...


----------



## Sgt Bilko

Quote:


> Originally Posted by *wes1099*
> 
> Awesome! I wonder if they will come out with plexi+copper version that is not nickel plated...


You mean this one?: http://www.ekwb.com/shop/blocks/vga-blocks/ati-radeon-full-cover-blocks/radeon-rx-200-series/ek-fc-r9-290x-rev-2-0.html


----------



## cephelix

Finally!! Been waiting to get one for my system


----------



## Devotii

Do you think the R9 290 Vapor-X runs cooler and quieter than the other aftermarket coolers for R9 290?


----------



## kayan

Quote:


> Originally Posted by *Devotii*
> 
> Do you think the R9 290 Vapor-X runs cooler and quieter than the other aftermarket coolers for R9 290?


I can't speak from experience with the 290 Vapor-X, but for aftermarket coolers I go for the Vapor-X every time. I've owned many Vapor-X cards, across many generations, and they are phenomenal.


----------



## BroJin

My 290 Vapor-X should be here by Thursday. I'll let you guys know how it goes.


----------



## UZ7

Quote:


> Originally Posted by *Devotii*
> 
> Do you think the R9 290 Vapor-X runs cooler and quieter than the other aftermarket coolers for R9 290?


http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-280x-vapor-x-oc-and-r9-290-vapor-x-oc-review/28/

Here is a review by KitGuru.

Based on the review and the card's design (custom PCB, more power phases, better chokes, and a better cooler), it should run cooler and overclock higher. If the fans are exactly the same as the Tri-X's, then acoustics would be on par, but since the benches show roughly 5°C better temps it probably won't rev up as high. Plus there is the hybrid fan switch, which keeps the card running on one fan until it hits 60°C or so. Judging from my Tri-X (not the Vapor-X), the cooler is quiet at idle, stays at about 20% until temps climb, and keeps temps lower than reference. At load it feels like an average cooler, because the R9 290(X) series definitely runs on the hot side. In my opinion the MSI Gaming and Asus DCII run quieter, but those coolers were designed for the 700 series. I don't know if they revised them for the 200 series; I heard they reused the same design, so they don't cool everything as effectively, but the MSI Gaming is super quiet: two bigger fans vs three smaller fans.

So ranking by sound I would say MSI Gaming > DCII > Tri-X, and don't get me wrong, for average users all of these are silent. I'm just picky about sound and consider anything louder than my Noctuas to be loud







but I've heard the 780 DCII and the 770 Gaming as well as the 780 ACX/Classy and the 290 Tri-X, and out of all of them the Gaming is the quietest. Then again, if they didn't tweak the cooler for this series it would be kinda useless.

For cooling performance I would lean toward the Vapor-X, as the custom design helps a lot. If you're in the market I would check out the Vapor-X. I personally made the mistake of jumping in too early and getting the Tri-X, but I don't really regret it, though I do wish it were quieter.


----------



## Mega Man

Quote:


> Originally Posted by *Devotii*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Could always go for the Vapor-X when it releases if that's more pleasing to your eyes, i find the Tri-X to be a bit ugly imo.
> 
> Oh wait, you said single card, ok you want a 290x then. More power the better (so long as you don't mind spending a little more)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PCS+, Tri-X, MSI Gaming and the Gigabyte Windforce cards are the best ones out afaik.
> 
> 
> 
> Do you think the extra money for the 290*X* worth it over the 290? Price to performance etc.
Click to expand...

No. If you have to ask, then no. The 290X is only for e-peen and is not much of an improvement over the 290
Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Looking good though, you've set the bar fairly high for me there
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Tried out 14.4 yet?
> 
> i was getting some decent results with it a little earlier, never did any extensive testing though.
> 
> 
> 
> Yes the leaked one and didn't bench as well ( a bit broken for cross / TRIfire ) as this old driver does . I wanna crack 24k on F/S and 14k on extreme
> 
> 
> 
> 
> 
> 
> 
> 
> Ordered these bits today..........
> 
> Should improve things a bit with a w/b . Then I can bench 290X when im done with TRI
Click to expand...

congrats my friend on your high score !
Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HOMECINEMA-PC*
> 
> [email protected]@2428 TRI 290 [email protected] *P30704 13.11 B 9.5* Tess off
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8255134
> My highest Pscore ever
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> congrat!
Click to expand...


----------



## Roy360

watercooling question here. I have two reference EK blocks and one ASUS DC block.

The ASUS is on top, the XFX reference is sandwiched in the middle, and the VisionTek reference is at the bottom.

The XFX has been flashed to a lower power limit (170W), the VisionTek takes 180W, and the ASUS is at 200W (can't find any good BIOS for this card).

Yet the VisionTek has the coolest temps (50°C) while the other two float close to 60°C. I remember switching between Noctua NT-H1 and EK thermal paste while setting up the cards; is the difference I'm seeing due to the Noctua?

All the cards are running at 947/1250

P.S. Has anyone found a BIOS for the ASUS R9 290 DC that will underclock it to 947? I would like to switch to the iGPU while mining, and back to the GPU when gaming.


----------



## wes1099

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You mean this one?: http://www.ekwb.com/shop/blocks/vga-blocks/ati-radeon-full-cover-blocks/radeon-rx-200-series/ek-fc-r9-290x-rev-2-0.html


Oh yeah, that one! For some reason it did not show up in the CoolingConfigurator tool when I searched for the MSI R9 290 Gaming


----------



## Devotii

Quote:


> Originally Posted by *UZ7*
> 
> http://www.kitguru.net/components/graphic-cards/zardon/sapphire-r9-280x-vapor-x-oc-and-r9-290-vapor-x-oc-review/28/
> 
> Here is a review by kit guru.
> 
> Based on the review and design of the card, being that it has a custom pcb, more phases, better chokes, and better cooler then it would essentially run cooler and oc higher. If the fans are exactly the same as the Tri-X then they would be on par in terms of acoustics are concerned but since in the benches it shows about roughly 5C~ better it probably wont rev up as high, plus there is the hybrid switch, which makes the card run on 1 fan until it hits 60C or something. Judging from my perspective from the Tri-X (not vapor) I feel the cooler is low noise during idle, stays at about 20% till higher temps and keeps temps lower than reference. At load it feels like an average cooler because the R9 290(X) series definitely run on the hot side. In my opinion msi gaming and asus dc2 run more quieter but those are the coolers designed for the 700 series. Now I don't know if they revised it with the 200 series but I heard they used the same design and thus not cooling as effectively and it doesnt cool everything but in my opinion I think the msi gaming is super quiet, 2 bigger fans vs 3 smaller fans.
> 
> So if I were to rank based on sound I would say msi gaming > dc2 > tri-x, and dont get me wrong as for average users this is silent. I'm just anal with sound and I consider anything above my noctuas to be loud
> 
> 
> 
> 
> 
> 
> 
> but I've heard the 780dc2 and the 770gaming as well as 780acx/classy and the 290 tri-x and out of all of them I feel the gaming to be the most quietest, but again if they didnt tweak up the cooler then it would be kinda useless.
> 
> Cooling performance I would lean toward the vapor-x as the custom design helps a lot. If you're in the market I would check out the vapor-x. I personally made the mistake of jumping too early and getting the tri-x but I don't really regret it, though I do prefer it to be quieter.


Thanks for info! Last question: Is the Corsair TX650w PSU compatible with the Vapor X R9 290?


----------



## UZ7

Quote:


> Originally Posted by *Devotii*
> 
> Thanks for info! Last question: Is the Corsair TX650w PSU compatible with the Vapor X R9 290?


Yeah, it should be more than enough unless you plan on getting another one later on, but for a single-GPU setup you should be good







just double check that you have 2x 8pin as the vapor-x needs it.


----------



## Devotii

Quote:


> Originally Posted by *UZ7*
> 
> Yeah it should be more than enough unless you plan on getting another one later on but for single gpu setup you should be good
> 
> 
> 
> 
> 
> 
> 
> just double check that you have 2x 8pin as the vapor-x needs it.


It appears to have 6+2 connectors with the extra two pins hanging off; those are still okay, right? Might actually order this card!


----------



## UZ7

Quote:


> Originally Posted by *Devotii*
> 
> It appears to have the 6 +2 that hang off, they are still okay right? Might actually order this card!


Yeah, most PSU connectors come in that 6+2 = 8 configuration in case the card only takes 6


----------



## Devotii

Thanks for helping, looks like I am getting the Vapor X R9 290!!


----------



## mojobear

Quote:


> Originally Posted by *Devotii*
> 
> Thanks for info! Last question: Is the Corsair TX650w PSU compatible with the Vapor X R9 290?


I have a Corsair CX600M in my mini-ITX system with an R9 290 and it runs fine!


----------



## Roboyto

Quote:


> Originally Posted by *mojobear*
> 
> I have a corsair CX600M in my min itx system with a r9 290 and it runs fine!


I concur; a good 650W PSU is enough for a 290(X) even with a large overclock. I'm using a Rosewill Capstone 650M and it's enough for my 290 up to 1300/1700, a 4770K @ 4.5GHz, 16GB RAM @ 2400MHz, 3 SSDs, an HDD, and a water pump.

If you're using an AMD FX CPU, specifically an overclocked 8-core, you may need a larger PSU, as they are pretty power hungry. They will draw nearly as much as a heavily overclocked 290, depending on how far they are pushed.
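As a rough sanity check on PSU sizing, the tally can be sketched like this. Every wattage figure below is an assumed ballpark for illustration, not a measured number:

```python
# Back-of-envelope PSU headroom estimate for a single-290 build.
# All part wattages are rough assumptions, not measurements.

def psu_headroom(psu_watts, parts):
    """Return (total estimated draw, remaining headroom) in watts."""
    total = sum(parts.values())
    return total, psu_watts - total

parts = {
    "R9 290 (overclocked)": 300,   # assumed worst-case draw
    "i7-4770K @ 4.5GHz":    120,   # assumed
    "motherboard + RAM":     50,
    "drives + fans + pump":  40,
}

total, headroom = psu_headroom(650, parts)
print(f"estimated draw: {total} W, headroom on a 650 W PSU: {headroom} W")
```

Swap in a ~200W figure for a heavily overclocked FX 8-core and the headroom on 650W gets thin, which matches the advice above.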


----------



## invincible20xx

Anybody here running CrossFire R9 290Xs on a 3770K?! Will there be a bottleneck if I run them off a 4.5GHz 3770K system? I probably won't overclock the GPUs and will just keep them stock


----------



## BradleyW

Quote:


> Originally Posted by *invincible20xx*
> 
> anybody here running crossfire r9 290X on a 3770k ?! will there be a bottleneck if i run them off a 4.5 ghz 3770k system , i probably won't overclock gpus and just keep them stock


Depends on the application and the amount of CPU overhead. For the most part, yes, there will be some bottlenecking. Even my 3930K bottlenecks these cards, but that's mostly because the applications are only coded to utilize two cores and part of a third.


----------



## invincible20xx

Quote:


> Originally Posted by *BradleyW*
> 
> Depends on the application and the amount of CPU overhead. For the most part, yes there will be some bottlenecking. Even my 3930K bottlenecks these cards, but it's more due to the applications only being coded to utilize 2 cores and a part of a third core.


Will going Haswell help if I match clocks with my 3770K?! What are the odds I get a 4770K that can do over 4.5GHz?


----------



## Widde

What's the word on 14.4? My experience with it is utter (insert profanity). http://piclair.com/mlt5b is my CPU usage in BF4 with Mantle, and GPU usage is http://piclair.com/8us80.

With Mantle on Ultra at 1080p with 180% resolution scale I used to get 90-135 fps and rarely dipped below the 80s; now I'm rarely over 60 fps :S I scrubbed with DDU before installing 14.4, coming from 13.12.


----------



## heroxoot

Quote:


> Originally Posted by *Widde*
> 
> What's the word on 14.4? My experience with it is utter (insert profanity). http://piclair.com/mlt5b my cpu usage in bf4 with mantle and gpu http://piclair.com/8us80.
> 
> With mantle on ultra 1080p 180% res scale I dropped from 90-135 fps, rarely saw fps under 80s now I've dropped to rarely being over 60 fps :S I scrubbed with DDU before installing 14.4 coming from 13.12.


That looks like CPU throttling to me. An i3 compared to an i5 won't bench exactly the same, considering BF4 on Ultra claims to need an i5/i7. Hyperthreaded cores do not compare to real cores.


----------



## Widde

Quote:


> Originally Posted by *heroxoot*
> 
> That looks like throttle on your CPU to me. Using an i3 compared to an i5 I doubt it benches exactly the same. Considering on ultra BF4 claims to need an i5/7? Hyperthread cores do not compare to real cores.


What do you mean?







I have an i5


----------



## heroxoot

Quote:


> Originally Posted by *Widde*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> That looks like throttle on your CPU to me. Using an i3 compared to an i5 I doubt it benches exactly the same. Considering on ultra BF4 claims to need an i5/7? Hyperthread cores do not compare to real cores.
> 
> 
> 
> What do you mean?
> 
> 
> 
> 
> 
> 
> 
> I have an i5
Click to expand...

Oh god, I'm an idiot. Sorry, dealing with a hangover, I misread your rig. I can't say why then.

On another note, these idiots at MSI messed up AGAIN! They sent me the wrong graphics card, can you believe them? What am I supposed to do with this?



Jokes aside, I am now in the club.


----------



## sugarhell

Anime subtitles 100%


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> congrat!
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I'm using the 14.4 AMD release (not sure if there is a difference) and it's working well for me, better frame rate in BF4 and a few other games.
> 
> Looking forward to see how you get on with them all under water
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> no if you have to ask, then no. 290x is only for epeen and is not much improvement over 290
> congrats my friend on your high score !
> Click to expand...
Click to expand...

Thanks guys









Quote:


> Originally Posted by *Widde*
> 
> What's the word on 14.4? My experience with it is utter (insert profanity). http://piclair.com/mlt5b my cpu usage in bf4 with mantle and gpu http://piclair.com/8us80.
> 
> With mantle on ultra 1080p 180% res scale I dropped from 90-135 fps, rarely saw fps under 80s now I've dropped to rarely being over 60 fps :S I scrubbed with DDU before installing 14.4 coming from 13.12.


Drop res scale to 125% ?


----------



## Paul17041993

Quote:


> Originally Posted by *Roy360*
> 
> watercooling question here. I have two reference EK blocks and one ASUS DC block.
> 
> ASUS is ontop, XFX reference is sandwiched, and VistonTek reference is at the bottom.
> 
> XFX has been flashed to a lower voltage (170W), Visiontek takes (180W ), and ASUS at (200) (can't find any good bios for this card)
> 
> Yet, the VisionTek has the coolest temps (50) while the other two are floating close to 60 degrees. I remember switching between Noctua N1H1 and EK thermal paste while setting up the cards, is the difference I'm seeing due to the Noctua?
> 
> All the cards are running at 947/1250
> 
> p.s Anyone found a bios for the ASUS R9 290 DC that will underclock it to 947? I would like to switch to the iGPU while mining, and back to the GPU when gaming,


Best bet for a BIOS underclock on the DCII would be to edit the BIOS yourself; however, I have no idea what tools there are for this on the 290/Xs...

Temps-wise, the reason the middle card would have better temps than the lowest card is that your loop is plumbed in series; that's normal. The DCII having higher temps than the middle card isn't quite normal, however, though it could simply be due to the different block and/or a little less pad pressure/contact compared to the reference blocks.
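For what it's worth, the series-order effect can be bounded with the standard Q = ṁ·c_p·ΔT relation. The flow rate and per-card powers below are assumed values, just to show the coolant rise per card is small:

```python
# Coolant temperature rise across one card in a series loop:
#   dT = P / (m_dot * c_p)
# Flow rate and card power draws are assumptions for illustration.

def coolant_rise(power_w, flow_lpm, cp=4186.0, density_kg_per_l=1.0):
    """Temperature rise (degC) of water absorbing power_w at flow_lpm L/min."""
    m_dot = flow_lpm / 60.0 * density_kg_per_l   # mass flow in kg/s
    return power_w / (m_dot * cp)

# Even a ~250 W card at a modest 1.5 L/min only warms the coolant by
# ~2.4 degC, so series order alone can't explain a ~10 degC gap; block
# mounting or paste is the more likely culprit.
for card, p in [("ASUS DCII", 200), ("XFX ref", 170), ("VisionTek ref", 180)]:
    print(card, round(coolant_rise(p, 1.5), 2), "degC rise")
```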


----------



## BradleyW

Quote:


> Originally Posted by *invincible20xx*
> 
> will going haswell help if i match clocks with my 3770k ?! what are the odds i hit a 4770k that can do over 4.5 ghz/s ?


It will help slightly, but like I said, even my 3930K bottlenecks my GPUs due to a combination of lazy programmers and no access to low-level APIs other than the specialized Mantle.


----------



## heroxoot

Quote:


> Originally Posted by *sugarhell*
> 
> Anime subtitles 100%


Watching JoJo so I can watch the new season.

So I've run into one slight issue: I cannot control the fan curve on this card. Has anyone had this issue? Everything about this 290X seems higher quality, Hynix RAM and all. It doesn't seem to stay any cooler with the fans clocked down for 2D either.

Also, the default voltage is set to +25mV in MSI AB if you reset it. Not sure why; maybe it has to do with the different BIOS modes.

It seems nothing can read the fan speed %, but the fans seem to be spinning fine at 1030RPM. I just cannot adjust them to get the temps a bit lower.


Spoiler: Warning: Spoiler!


----------



## Roboyto

Quote:


> Originally Posted by *FtW 420*
> 
> 21578 graphics at 1315/1700? The 290 is pretty close to the 290x, at 1275/1678 I'm at 21598 graphics score...
> 
> 
> 
> Spoiler: Warning: Spoiler!


I know they have 10% more cores, but it doesn't seem like they exhibit 10% more performance in most cases.

Still a nice score you have there









Quote:


> Originally Posted by *Roy360*
> 
> I've been wondering this for the longest time. So apparently my ASUS card, the one with the highest tendency to crash, has the best ASIC.
> 
> Overall, looks like I have bad luck:
> 79.5
> 79.8
> 76.5
> 
> Awww, if only I didn't kill the VRMs (dropped a tiny heatsink right onto the backside of the VRMs) on my old card. I'm sure that card had a killer ASIC value.


My 290 is 80.4%, and several people have gotten higher clocks without any additional voltage. My card started needing extra volts shortly after exceeding 1000MHz core... but my card does very well under water, which, with the high ASIC quality, shouldn't be the case, right? I don't really know much about ASIC quality; could someone elaborate?

Quote:


> Originally Posted by *TripleTurbo*
> 
> Anybody here able to tell me what water blocks fit the MSI 4G 290x so called gaming edition? Can't seem to find any that do, custom power circuitry or some such relative to the reference board.


EK has one coming out soon, check CoolingConfigurator.com

Quote:


> Originally Posted by *Devotii*
> 
> As mentioned it would be the Sapphire TriX series, 290 or 290X. I don't have a large screen resolution but might soon have a 3 monitor setup. It would be air cooled, I only want a single card setup.


A 290 is probably the better bargain, as a mild OC gets you the same or better performance than a 290X. If you're running 5760x1080 Eyefinity, one 290 can do it, but don't expect 60fps in every game. I am running 5760x1080 on one 290 with a healthy overclock of 1200/1500 and it is very capable. The only problem is that it taxes a single 290(X) to the max most of the time, so your cooler would have to be very good, or use a Kraken as I mention below. In FFXIV I never go below 40fps with every setting but AA maxed out. Borderlands 2 stays capped at 72fps the entire time, even with 4 people playing and fighting The Warrior.

That PowerColor PCS+ with 3-slot cooler is a beast if you have the room for it. Great temps and low noise.

Other option is a NZXT Kraken G10 and an AIO cooler. As long as you can fit a 120mm AIO radiator somewhere this option is hard to beat since it will be silent, keep the core much lower than any air cooler, and the Kraken G10 has a fan that sits directly over the VRMs for most 290s AFAIK.

Quote:


> Originally Posted by *Devotii*
> 
> Thanks for info! Last question: Is the Corsair TX650w PSU compatible with the Vapor X R9 290?


Should be; a Rosewill Capstone 650M runs my 290 and 4770K without issue. If you have an FX 8-core with a high OC you probably want a little more power, because they need as much as a 290 with a fair OC on them

Quote:


> Originally Posted by *invincible20xx*
> 
> will going haswell help if i match clocks with my 3770k ?! what are the odds i hit a 4770k that can do over 4.5 ghz/s ?


Minimal improvement going to a 4770K. You're pretty much guaranteed to hit 4.4GHz, but I have had a 4770K that wouldn't make 4.5GHz. Most chips will get to 4.5GHz in the ballpark of 1.25V

My 4770k makes 4.5GHz at 1.259V

If you're going to upgrade CPUs my suggestion would be to skip 4770k and wait for next gen, or go with 6-core.


----------



## heroxoot

http://www.techpowerup.com/gpuz/gbq5e/

MSI 290X 4G Gaming series stock cooling.

Feels so good.


----------



## prasopes

So here is my 4-way CrossFire









*4× Sapphire R9 290X 4GB GDDR5 BATTLEFIELD 4 EDITION*
GPU-Z: link
Core: 1075 MHz 1100 MHz
Memory: 1290 MHz 1300 MHz
Stock cooling


----------



## the9quad

Quote:


> Originally Posted by *prasopes*
> 
> So here is mine 4way crossfire


Looks really good. How are the temps/noise? Mine in TriFire gets very loud if I keep them cool; I can't imagine 4-way. Also, just 1000 watts, or do you have a second PSU? Congrats though, I'm not knocking your system, just curious; your setup looks great.


----------



## jamaican voodoo

beast mode bro!! they are begging for 4k


----------



## DeadlyDNA

Quote:


> Originally Posted by *the9quad*
> 
> Looks really good. How's the temps/noise? Mine in tri fire gets very loud if I keep them cool, I can't imagine 4 way. Also just 1000 watts? Or do you have a second psu? Congrats though, I'm not knocking your system just curious, your setup looks great.


I would ask the same. I gave up trying to run 4 on air and went water cooling; now I'm spoiled because I have almost no noise. On air, mine kept crashing due to massive heat buildup between the cards.
Congrats on a rad setup :thumb:


----------



## invincible20xx

Count me in, I just swapped my 7970s out for a couple of R9 290s.

I haven't tried to unlock them yet, but here are the results from Hawaii Info:

Compatible adapters detected: 2
Adapter #1 PCI ID: 1002:67B1 - 1002:0B00
Memory config: 0x500036A9 Hynix
RA1: F8010005 RA2: 00000000
RB1: F8010005 RB2: 00000000
RC1: F8400005 RC2: 00000000
RD1: F8010005 RD2: 00000000
Adapter #2 PCI ID: 1002:67B1 - 1002:0B00
Memory config: 0x500036A9 Hynix
RA1: F8010005 RA2: 00000000
RB1: F8010005 RB2: 00000000
RC1: F8010005 RC2: 00000000
RD1: F8010005 RD2: 00000000

These wouldn't unlock, eh?!

Also, EXACTLY how much performance is lost by not being able to unlock my GPUs to 290Xs?


----------



## prasopes

First of all, thanks :3 I'm saving up for a 4K monitor. Anyway, here are thermal pictures after full load for about 18 hours. Degrees are in Celsius.



Main power!










Although the GPUs are very close together, the average temp is around 70°C.
While I'm playing Battlefield 4 the noise is not too loud, but when I'm mining, with fans at 100%, it is 76 dB at a distance of 1 meter from the case.
I like when cables are hidden, and I made it minimalistic. This is the latest photo; I switched PSUs.
The 1300W runs 3 GPUs, nothing more; the 1000W is for the motherboard, SSD, and the remaining GPU.
I also took the HDD cages out of the case for better airflow.



Yeah, I would like to switch this into a watercooled monster. I probably would... but you know, money







Now I'm saving up, so I'll see what I buy...


----------



## JordanTr

Quote:


> Originally Posted by *invincible20xx*
> 
> 
> 
> 
> count me in , i just swapped my 7970's out for a couple of R9 290's
> 
> didn't try to unlock them yet but here is the results from hawaii info :
> 
> Compatible adapters detected: 2
> Adapter #1 PCI ID: 1002:67B1 - 1002:0B00
> Memory config: 0x500036A9 Hynix
> RA1: F8010005 RA2: 00000000
> RB1: F8010005 RB2: 00000000
> RC1: F8400005 RC2: 00000000
> RD1: F8010005 RD2: 00000000
> Adapter #2 PCI ID: 1002:67B1 - 1002:0B00
> Memory config: 0x500036A9 Hynix
> RA1: F8010005 RA2: 00000000
> RB1: F8010005 RB2: 00000000
> RC1: F8010005 RC2: 00000000
> RD1: F8010005 RD2: 00000000
> 
> these wouldn't unlock eh ?!
> 
> also how much performance EXACTLY is lost by not being able to unlock my GPUs to 290X's ?


It's roughly 5%.
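That ballpark follows from the published shader counts (2560 on the 290 vs 2816 on the 290X) plus sub-linear game scaling. A quick sketch; the scaling efficiencies are rough assumptions, not measured data:

```python
# Why an unlocked 290 only gains ~5% or so: the 290X has just 10% more
# shaders, and games rarely scale linearly with shader count.

shaders_290, shaders_290x = 2560, 2816
unit_gain = shaders_290x / shaders_290 - 1
print(f"extra shaders: {unit_gain:.1%}")          # 10.0%

# Assumed scaling efficiencies (50-70%) put the real-world gain in the
# ~5-7% range people report from unlocked cards at matched clocks.
for eff in (0.5, 0.7):
    print(f"scaling {eff:.0%}: ~{unit_gain * eff:.1%} faster")
```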


----------



## heroxoot

OK, so I think my card is throttling? During BF4 my core clock kept fluctuating between 990MHz and 1030MHz. Is this normal? 1030MHz is the stock clock of this card. I'm using the 14.3 beta driver; maybe that's the problem?


----------



## the9quad

Quote:


> Originally Posted by *heroxoot*
> 
> Ok so I think my card is throttling? During BF4 my core clock kept fluctuating between 990mhz and 1030mhz. Is this normal? 1030 is the stock clock of this card. I'm using the 14.3 beta driver. Maybe thats the problem?


Are you using a custom fan profile in something like Afterburner? If not, you should; these cards like to run up to the temperature limit and then throttle.
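For anyone curious what a custom fan profile actually does: an Afterburner-style curve is just a piecewise-linear map from GPU temperature to fan duty cycle. A minimal sketch, with purely illustrative breakpoints (not a recommendation):

```python
# An Afterburner-style fan curve: linear interpolation between
# (temperature, fan %) breakpoints, clamped at both ends.
# The breakpoints here are made up for illustration.

CURVE = [(40, 20), (60, 40), (75, 65), (85, 100)]  # (degC, fan %)

def fan_percent(temp_c, curve=CURVE):
    """Interpolated fan duty cycle (%) for a given GPU temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(70))  # lands between the 60C and 75C breakpoints
```

The steeper the segment near the throttle point (94-95C on Hawaii reference boards), the earlier the fans ramp and the less the core clock dips.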


----------



## Tobiman

Quote:


> Originally Posted by *heroxoot*
> 
> Ok so I think my card is throttling? During BF4 my core clock kept fluctuating between 990mhz and 1030mhz. Is this normal? 1030 is the stock clock of this card. I'm using the 14.3 beta driver. Maybe thats the problem?


What are the temps though? You might want to set up a decent fan curve if temps are in the 90s.


----------



## Widde

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Thanks guys
> 
> 
> 
> 
> 
> 
> 
> 
> Drop res scale to 125% ?


I was able to do 200% no problem before =S So something must have changed for the worse in the driver =< Running DX11 now without problems though







Have they implemented CrossFire support in Thief yet? I tried it when it came out and it only used one of my cards; I haven't been keeping up with that game since.


----------



## heroxoot

Quote:


> Originally Posted by *Tobiman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Ok so I think my card is throttling? During BF4 my core clock kept fluctuating between 990mhz and 1030mhz. Is this normal? 1030 is the stock clock of this card. I'm using the 14.3 beta driver. Maybe thats the problem?
> 
> 
> 
> What are the temps though? Might want to setup a decent fan curve if temps are in the 90s.
Click to expand...

Quote:


> Originally Posted by *the9quad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Ok so I think my card is throttling? During BF4 my core clock kept fluctuating between 990mhz and 1030mhz. Is this normal? 1030 is the stock clock of this card. I'm using the 14.3 beta driver. Maybe thats the problem?
> 
> 
> 
> are you using a custom fan profile in something like afterburner? If not then you should. You will eventually throttle as these cards like to get up to the limit and throttle.
Click to expand...

I cannot set the fan; MSI AB won't let me. Nothing can read the fan %, but the GPU temp was under 70°C playing BF4 on Ultra. I was only getting like 90fps at best on Mantle though, which is a bit lacking compared to what my 7970 got.


----------



## Norse

Quote:


> Originally Posted by *prasopes*
> 
> So here is mine 4way crossfire
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *4× Sapphire R9 290X 4GB GDDR5 BATTLEFIELD 4 EDITION*
> GPU-Z: link
> Core: 1075 MHz 1100 MHz
> Memory: 1290 MHz 1300 MHz
> Stock cooling


What voltage do you have on that OC? I have the BF4 Edition 290X as well


----------



## invincible20xx

I'm noticing a more "smooth" effect in Batman: Arkham City using these R9 290s over my 7970s. Is that real or placebo, lol?! Same frame rates, capped @ 60fps


----------



## lawson67

Hey guys, I am currently overclocking via CCC without adding voltage, and I have Afterburner installed along with RTSS purely for the OSD. However, I want to push my cards higher and start adding voltage, which means I will have to use Afterburner!

So to move from CCC and overclock using Afterburner, do I just uncheck Enable Overdrive in CCC, hit Apply, and then use Afterburner? Or do I reset all the Overdrive values to 0 in CCC and then use Afterburner, leaving the Enable Overdrive box still checked in CCC? Thanks in advance


----------



## Germanian

Quote:


> Originally Posted by *lawson67*
> 
> Hey guys i am currently overclocking without adding voltage via CCC and i have Afterburner installed along with RTSS purely for the OSD...However i want to try push my cards higher and start to add voltage which means i will have to use Afterburner!
> 
> So to move from CCC and overclock using Afterburner do i just uncheck enable overdrive in CCC and then apply and then use Afterburner?.. Or do i just reset all the values to "0" overdrive in CCC and then use Afterburner leaving the box still checked in CCC that says enable overdrive...Thanks in advance


Leave overdrive in CCC unchecked; your overclock will be done via Afterburner. So leave ENABLE GRAPHICS OVERDRIVE in CCC UNCHECKED.

Set your Afterburner settings like in these pictures. If you want, you can switch unofficial overclocking mode to WITH POWERPLAY SUPPORT, which is what I use. That way you can use the power slider from 0-50% and it will stop throttling your speeds. Without PowerPlay your cards will run 24/7 at the clocks you set; they will not clock down during idle or movie watching.
With PowerPlay support I see it idle at 300MHz core and 150MHz VRAM, I think.

BTW, the OSD can cause issues with overclocks. I turned it off, but you can give it a try. If you see any instability or other weird issues, turn off the OSD first and try again.


----------



## lawson67

Quote:


> Originally Posted by *Germanian*
> 
> Leave CCC unchecked your overclock will be done via Afterburner. So leave ENABLE GRAPHICS OVERDRIVE in CCC UNCHECKED.
> 
> set your settings like in these pictures. If you want you can switch unofficial overclocking mode to WITH POWERPLAY SUPPORT that way you can use the power slider from 0-50%. Without powerplay you are cards will run 24/7 at the clocks you set your overclock to without it going lower during idle or movie watching. With powerplay support i see it idle at 300core and 150 VRAM i think.
> 
> BTW OSD can cause issues with overclocks btw. I turned it off, but you can give it a try. If you see any problems with unstability or any weird issues turn off OSD first and try again.


Thank you for your explanation of PowerPlay and your answer! However, in CCC, before I start to overclock with Afterburner: do I uncheck Enable Overdrive in CCC and then hit Apply? Or do I reset my overclock values in CCC *but leave the box checked* that says Enable Overdrive?


----------



## heroxoot

Nothing seems to be able to manually control my fan speeds. I just switched from 14.3 to 14.4 beta and nothing changed. Not even able to set a manual speed in CCC.


----------



## Germanian

Quote:


> Originally Posted by *lawson67*
> 
> Thank you for your explanation of PowerPlay and your answer! However, in CCC, before I start to overclock with Afterburner: do I uncheck Enable Overdrive in CCC and then hit Apply? Or do I reset my overclock values in CCC *but leave the box checked* that says Enable Overdrive?


1. Close Afterburner. Make sure you exit the program, not just minimize it.

2. Go into CCC, *remove the checkmark* on ENABLE GRAPHICS OVERDRIVE, and press APPLY (setting it to stock settings). It does not matter if you had set some OC values; as soon as you uncheck the box they will be ignored. You can reset them if you like for peace of mind.

3. Make sure you pressed Apply first. Now close CCC.

4. Open Afterburner.

5. Overclock your heart out.


----------



## Germanian

Quote:


> Originally Posted by *heroxoot*
> 
> Nothing seems to be able to manually control my fan speeds. I just switched from 14.3 to 14.4 beta and nothing changed. Not even able to set a manual speed in CCC.


That's normal; CCC is bugged for fan control. You have to use external software like Afterburner, TriXX, or ASUS's software to change fan speeds.
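For what it's worth, the custom fan curve those external tools expose is just a piecewise-linear map from GPU temperature to fan duty. A minimal sketch of that interpolation (the curve points below are made-up examples, not Afterburner defaults):

```python
# Piecewise-linear fan curve: (temperature in C, fan duty in %) points,
# the same shape you drag around in Afterburner's curve editor.
# Example points only.
CURVE = [(40, 30), (60, 45), (75, 70), (85, 100)]

def fan_duty(temp_c):
    """Interpolate the fan duty (%) for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]          # below the first point: minimum duty
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]         # above the last point: maximum duty
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:      # linear interpolation between points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(50))  # halfway between 30% and 45% -> 37.5
```

The tool samples the GPU temperature every second or so and writes the interpolated duty to the fan controller; CCC being bugged just means nothing is doing that sampling loop.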


----------



## lawson67

Quote:


> Originally Posted by *Germanian*
> 
> 1. close Afterburner make sure you exit the program not minimize.
> 
> 2. go into CCC *remove the checkmark* on ENABLE OFFICIAL overclocking press APPLY (setting it to stock settings). It does not matter if you had set some OC values as soon as you press UNCHECK they will be ignored. You can reset them if you like for peace of mind.
> 
> 3. Make sure you pressed apply before. Now close CCC
> 
> 4. Open afterburner
> 
> 5. Overclock your heart out


Thank you very much, sir, for your kind help! +REP coming your way!


----------



## Paul17041993

Quote:


> Originally Posted by *invincible20xx*
> 
> i'm noticing a more "smooth" effect in batman arkham city using those r9 290's over my 7970's is that real or placebo lol ?! same frame rates so capped @ 60fps


Ironically, on that topic: when I came off my 7970 and onto my 290X, I noticed Minecraft runs dead smooth at 25-40 FPS despite the framerate being so low, which is a bit odd. I can only imagine it's a combination of the frame pacing and the advanced clock control Hawaii has, in that the drivers try to regulate the framerate a bit more than normal...


----------



## heroxoot

Quote:


> Originally Posted by *Germanian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Nothing seems to be able to manually control my fan speeds. I just switched from 14.3 to 14.4 beta and nothing changed. Not even able to set a manual speed in CCC.
> 
> 
> 
> 
> 
> that's normal CCC is bugged for fan control. You have to use external software like afterburner, Trixx, or ASUS software to change fan speeds.

You don't seem to get it, though. Look at MSI AB: there is no slider for the fan, and there is no fan tab in the settings. I cannot set the fan curve! ALSO, I CAN'T OC THE CARD! I clocked it up to 1040, which is the card's own boost clock, but it refuses to let me do it; when I clock it up it only goes to 1030 in games. I tried everything I can think of, and I'm curious if this is a BIOS bug?


----------



## Germanian

Quote:


> Originally Posted by *heroxoot*
> 
> You don't seem to get it tho. Look at MSI AB. There is no slider for the fan, and there is no fan tab in the settings. I cannot set the fan curve! ALSO I CAN'T OC THE CARD!. I clocked it up to 1040 which is the cards own boosting mode but it refuses to let me do it. I tried everything I can think of and I'm curious if this is a bios bug? When I clock it up it only goes to 1030 in games.


Yeah, you are right. I am lost on that one. Your card is not reference; I missed that.
Best bet is to ask other users of that custom MSI card. I know they use different components on the board than reference boards do.
Try messaging some people with the exact same GPU model as yours for advice on settings.


----------



## Germanian

I am rolling back to 13.12 after these horrible new 14.4 drivers. Crossfire won't work; I tried everything I could think of. The last good crossfire driver was 13.12 for me.

Single GPU worked amazingly though, and it seems to give an FPS bonus in some games compared to 13.12. Well, I will use 13.12 until 14.5 or better comes out.


----------



## heroxoot

Could this be my issue? http://www.tomshardware.com/news/msi-r9-290x-lightning-overheat,26557.html

I'm about to roll back to 13.12 and see if I can control my fan. That could be the issue!


----------



## luckyboy

Hi. I have an R9 290 Tri-X OC and the memory clock is always 1300MHz, even at idle, when my monitor (XL2411T) runs at its native 144Hz. I'm trying 13.2/14.3 beta and the result is the same. Any fix for that? Thank you, and sorry for my English.


----------



## Hl86

Is it better to have the blower card on top for crossfire?


----------



## UZ7

Quote:


> Originally Posted by *luckyboy*
> 
> Hi. I have r9 290 tri-x oc and memory clock always is 1250MHz, even in idle, when my monitor xl2411t run to native 144hz. I'm trying 13.2/14.3 beta and result is the same. Any fix for that? Thank you and sorry for my english.


Does this happen after a reboot? Also, during idle are you doing anything (browsing the web, watching YouTube vids, etc.)? If that's the case, hardware acceleration might be on.

I'm on 14.4RC1 and it idles at low state.


----------



## diggiddi

Blower on the bottom card; you want to get the heat out of the case before it rises and heats up the top card.


----------



## heroxoot

Issue solved! It's the glitchy Mantle drivers, 14.3 and newer. MSI has some kind of new fan control going on with their cards and it causes the issue. The 13.12 driver lets me set my fan curve.


----------



## Paul17041993

Quote:


> Originally Posted by *heroxoot*
> 
> You don't seem to get it tho. Look at MSI AB. There is no slider for the fan, and there is no fan tab in the settings. I cannot set the fan curve! ALSO I CAN'T OC THE CARD!. I clocked it up to 1040 which is the cards own boosting mode but it refuses to let me do it. I tried everything I can think of and I'm curious if this is a bios bug? When I clock it up it only goes to 1030 in games.


@Red1776 BIOS flash needed?

edit;
Quote:


> Originally Posted by *heroxoot*
> 
> Issue Solved! Its the Glitchy Mantle driver 14.3 and newer. MSI has some kind of new fan thing going on their cards and it causes the issue. 13.12 driver lets me set my fan curve.


yea 14.x atm isn't good for overclocking, various issues still exist...


----------



## Red1776

Quote:


> Originally Posted by *Paul17041993*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> You don't seem to get it tho. Look at MSI AB. There is no slider for the fan, and there is no fan tab in the settings. I cannot set the fan curve! ALSO I CAN'T OC THE CARD!. I clocked it up to 1040 which is the cards own boosting mode but it refuses to let me do it. I tried everything I can think of and I'm curious if this is a bios bug? When I clock it up it only goes to 1030 in games.
> 
> 
> 
> @Red1776 BIOS flash needed?
> 
> edit;
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Issue Solved! Its the Glitchy Mantle driver 14.3 and newer. MSI has some kind of new fan thing going on their cards and it causes the issue. 13.12 driver lets me set my fan curve.
> 
> 
> yea 14.x atm isn't good for overclocking, various issues still exist...

It's acting like it, but on that card I can't figure out why.

Would you list the exact model numbers, please (all of them)?


----------



## luckyboy

Quote:


> Originally Posted by *UZ7*
> 
> Does this happen after reboot? Also during idle are you doing anything (browsing web, watching youtube vids etc...), if thats the case hardware acceleration might be on.
> 
> I'm on 14.4RC1 and it idles at low state.


Every time I change the refresh rate from 120 to 144Hz, the memory clock is 1300MHz (screenshot below). I close all open programs, like players, browser, etc., but the result is the same.


----------



## heroxoot

Quote:


> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Paul17041993*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> You don't seem to get it tho. Look at MSI AB. There is no slider for the fan, and there is no fan tab in the settings. I cannot set the fan curve! ALSO I CAN'T OC THE CARD!. I clocked it up to 1040 which is the cards own boosting mode but it refuses to let me do it. I tried everything I can think of and I'm curious if this is a bios bug? When I clock it up it only goes to 1030 in games.
> 
> 
> 
> @@Red1776 BIOS flash needed?
> 
> edit;
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Issue Solved! Its the Glitchy Mantle driver 14.3 and newer. MSI has some kind of new fan thing going on their cards and it causes the issue. 13.12 driver lets me set my fan curve.
> 
> 
> yea 14.x atm isn't good for overclocking, various issues still exist...
> 
> 
> It's acting like such, but on that card I can't figure why.
> Would you list the exact model numbers please (all of them)

I have an MSI R9 290X 4G Gaming series card.


----------



## Red1776

Quote:


> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Paul17041993*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> You don't seem to get it tho. Look at MSI AB. There is no slider for the fan, and there is no fan tab in the settings. I cannot set the fan curve! ALSO I CAN'T OC THE CARD!. I clocked it up to 1040 which is the cards own boosting mode but it refuses to let me do it. I tried everything I can think of and I'm curious if this is a bios bug? When I clock it up it only goes to 1030 in games.
> 
> 
> 
> @Red1776 BIOS flash needed?
> 
> edit;
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Issue Solved! Its the Glitchy Mantle driver 14.3 and newer. MSI has some kind of new fan thing going on their cards and it causes the issue. 13.12 driver lets me set my fan curve.
> 
> 
> yea 14.x atm isn't good for overclocking, various issues still exist...
> 
> 
> It's acting like such, but on that card I can't figure why.
> Would you list the exact model numbers please (all of them)
> 
> 
> I have a MSI R9 290X 4G gaming series card.

Hey hex,

I mean the exact model number at the top of the sticker with the serial number.

Is the card not responding to OC'ing in both BIOSes?


----------



## heroxoot

Quote:


> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Paul17041993*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> You don't seem to get it tho. Look at MSI AB. There is no slider for the fan, and there is no fan tab in the settings. I cannot set the fan curve! ALSO I CAN'T OC THE CARD!. I clocked it up to 1040 which is the cards own boosting mode but it refuses to let me do it. I tried everything I can think of and I'm curious if this is a bios bug? When I clock it up it only goes to 1030 in games.
> 
> 
> 
> @@Red1776 BIOS flash needed?
> 
> edit;
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Issue Solved! Its the Glitchy Mantle driver 14.3 and newer. MSI has some kind of new fan thing going on their cards and it causes the issue. 13.12 driver lets me set my fan curve.
> 
> 
> yea 14.x atm isn't good for overclocking, various issues still exist...
> 
> 
> It's acting like such, but on that card I can't figure why.
> 
> Would you list the exact model numbers please (all of them)
> 
> 
> I have a MSI R9 290X 4G gaming series card.
> 
> 
> Hey hex
> I mean the exact model number at the top of the sticker with the serial number.
> is the card not responding in both BIOS's to OC'ing ?.

Oh, it's responding perfectly now! Switching to the 13.12 driver did the trick indeed. Now it's staying pegged at 1060/1250 clocks in Heaven too. Do you still want it though?


----------



## Red1776

Yeah, I want to look up your Gaming model against my Twin Frozr 4GB Gaming 290X's and see if they have the same BIOS, memory, etc.

Re-cranking them up tomorrow.


----------



## heroxoot

If this is it, mine is 912-V308-014. The BIOS is 015.043.000.004. Memory is Hynix.


----------



## ledzepp3

Anybody here been running their 290/290X's on EK blocks with the reinforcing brackets? There's a lack of images of them in use, just stock pictures from EK's website and places like FCPU.

-Zepp


----------



## Forceman

Quote:


> Originally Posted by *luckyboy*
> 
> Every time, when i change refresh rate from 120 to 144hz, memory clock is 1300MHz (screenshot below). I close all open programs, like players, browser etc, but result is the same.


Common problem: once the blanking interval gets too short (at least I think that's the technical reason), the card stops downclocking the memory at idle.
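To put rough numbers on that blanking-interval idea: memory reclocking is typically done inside the vertical blanking window to avoid visible flicker, and that window shrinks as refresh rate rises. A sketch only; the line counts below are generic 1920x1080 values (1125 total lines, 1080 active), not the XL2411T's actual modeline:

```python
# Illustrative only: if the memory reclock must finish inside vblank,
# a shorter vblank window at higher refresh can stop the driver from
# downclocking at idle. Assumed generic 1080p timing, not real monitor data.
ACTIVE_LINES = 1080
TOTAL_LINES = 1125                                   # active + blanking

def vblank_us(refresh_hz):
    """Duration of the vertical blanking window, in microseconds."""
    line_time_us = 1e6 / refresh_hz / TOTAL_LINES    # time per scanline
    return (TOTAL_LINES - ACTIVE_LINES) * line_time_us

for hz in (60, 120, 144):
    print(f"{hz} Hz: vblank window ~ {vblank_us(hz):.0f} us")
```

Under these assumed timings the window is roughly 278 µs at 144Hz versus about 333 µs at 120Hz, which fits the observation that the card keeps the memory clock pinned once you switch to 144Hz.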


----------



## VulgarDisplay

Has anyone put the Fujipoly pads on a Tri-X card? What has been their experience? My VRMs don't really seem to get very hot, so I'm wondering if it is worth the hassle.

I assume you have to remove the heatsink to do it. My VRM 1 usually stays about 5°C over my core temp and I have never seen it above 75°C at a +100 VDDC offset.


----------



## luckyboy

Forceman, if I understood correctly, this is normal behaviour, right? I'm a little worried... can this wear out the video card?


----------



## cephelix

Quote:


> Originally Posted by *ledzepp3*
> 
> Anybody here been running their 290/290X's on EK blocks with the reinforcing brackets? There's a lack of images of them in use, just stock pictures from EK's website and places like FCPU.
> 
> -Zepp


Just saw the brackets and I'm wondering where they attach to. And are they really necessary?


----------



## Arizonian

Quote:


> Originally Posted by *heroxoot*
> 
> http://www.techpowerup.com/gpuz/gbq5e/
> 
> MSI 290X 4G Gaming series stock cooling.
> 
> Feels so good.


Congrats - added









Quote:


> Originally Posted by *prasopes*
> 
> So here is mine 4way crossfire
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *4× Sapphire R9 290X 4GB GDDR5 BATTLEFIELD 4 EDITION*
> GPU-Z: link
> Core: 1075 MHz 1100 MHz
> Memory: 1290 MHz 1300 MHz
> Stock cooling
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added x 4









Quote:


> Originally Posted by *invincible20xx*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> count me in , i just swapped my 7970's out for a couple of R9 290's
> 
> didn't try to unlock them yet but here is the results from hawaii info :
> 
> Compatible adapters detected: 2
> Adapter #1 PCI ID: 1002:67B1 - 1002:0B00
> Memory config: 0x500036A9 Hynix
> RA1: F8010005 RA2: 00000000
> RB1: F8010005 RB2: 00000000
> RC1: F8400005 RC2: 00000000
> RD1: F8010005 RD2: 00000000
> Adapter #2 PCI ID: 1002:67B1 - 1002:0B00
> Memory config: 0x500036A9 Hynix
> RA1: F8010005 RA2: 00000000
> RB1: F8010005 RB2: 00000000
> RC1: F8010005 RC2: 00000000
> RD1: F8010005 RD2: 00000000
> 
> these wouldn't unlock eh ?!
> 
> also how much performance EXACTLY is lost by not being able to unlock my GPUs to 290X's ?


Please submit a GPU-Z validation link with your OCN name on it as proof to be added.


----------



## Forceman

Quote:


> Originally Posted by *luckyboy*
> 
> Forceman, if i'm understood correctly, this is normal behaviour, right? I'm little worried...can this wear videocard?


It's normal, or at least there's not really anything you can do to stop it. It shouldn't really affect the card in any way, though.


----------



## FlighterPilot

Hey everyone. I have a 290, and will soon have a 290x. Is it possible to flash the 290x to a 290 so they can be crossfired?


----------



## Mega Man

They can be certified *without* flashing

@Ledzepp3 I have been asking the same. I think I will buy it to test it


----------



## ledzepp3

Quote:


> Originally Posted by *cephelix*
> 
> just saw the brackets and wondering where they attach to?and is it really necessary?


I believe they attach to the PCI bracket on the back of the card to stiffen it. If your cards are going to have more weight added to them and they're in a case not oriented like a Silverstone FT02, then I'd say grab them. It's $15 apiece for lessened or no PCB sag, which, if it gets bad enough, can crack a PCB. It happened to my 5770.








Quote:


> Originally Posted by *Mega Man*
> 
> They can be certified *without* flashing
> 
> @Ledzepp3 I have been asking the same. I think I will buy it to test it


Think I may have beaten you to it







two will be at my door tomorrow and will be installed. Will report back









-Zepp


----------



## diggiddi

Quote:


> Originally Posted by *FlighterPilot*
> 
> Hey everyone. I have a 290, and will soon have a 290x. Is it possible to flash the 290x to a 290 so they can be crossfired?


If you meant flashing the 290 to a 290X, then yes, some models can be flashed (mainly XFX and PowerColor). But to run crossfire they don't have to run at the same speeds.


----------



## FlighterPilot

Quote:


> Originally Posted by *Mega Man*
> 
> They can be certified *without* flashing
> 
> @Ledzepp3 I have been asking the same. I think I will buy it to test it


Do you know if it causes any issues that wouldn't be a problem if they were identical cards?


----------



## cephelix

Quote:


> Originally Posted by *ledzepp3*
> 
> I believe they attach to the PCI bracket on the back of the card to stiffen it. I mean if your cards are going to have more weight added to them and they're in a case not oriented like a Silverstone FT02 case, then I'd say grab them. I mean it's $15 a piece for lessened or no PCB sag which can if it gets bad enough will crack a PCB. It happened to my 5770
> 
> 
> 
> 
> 
> 
> 
> 
> 
> -Zepp


Ahh, thanks for that. Will give them a go when the time comes... suddenly short on cash thanks to the impulse 500GB SSD I just purchased.


----------



## FlighterPilot

Quote:


> Originally Posted by *diggiddi*
> 
> If you meant flash the 290 to a 290x then yes some models can be flashed (mainly XFX and Powercolor), but to run crossfire they don't have to run at the same speeds


Nope, I mean flashing a 290X to a 290.









My 290 is not unlockable unfortunately.


----------



## Mega Man

Quote:


> Originally Posted by *ledzepp3*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cephelix*
> 
> just saw the brackets and wondering where they attach to?and is it really necessary?
> 
> 
> 
> I believe they attach to the PCI bracket on the back of the card to stiffen it. I mean if your cards are going to have more weight added to them and they're in a case not oriented like a Silverstone FT02 case, then I'd say grab them. I mean it's $15 a piece for lessened or no PCB sag which can if it gets bad enough will crack a PCB. It happened to my 5770
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> They can be certified *without* flashing
> 
> @Ledzepp3 I have been asking the same. I think I will buy it to test it
> 
> 
> Think I may have beaten you it
> 
> 
> 
> 
> 
> 
> 
> two will be at my door tomorrow and will be installed. Will report back
> 
> 
> 
> 
> 
> 
> 
> 
> 
> -Zepp

$15? They are only $7 each.

Please post some pics and send me links. Even if I could pick them up today, I won't be stateside till the 30th; in China atm visiting family.
Quote:


> Originally Posted by *FlighterPilot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> They can be certified *without* flashing
> 
> @Ledzepp3 I have been asking the same. I think I will buy it to test it
> 
> 
> 
> Do you know if it causes any issues that wouldn't be a problem if they were identical cards?

no it won't


----------



## diggiddi

Offtopic how is the Chinese food in china?


----------



## ledzepp3

Quote:


> Originally Posted by *diggiddi*
> 
> Offtopic how is the Chinese food in china?


Gonna take a stab and say... Chinese?

Anyways, the brackets will hopefully give the cards the support they need.

-Zepp


----------



## Mega Man

Quote:


> Originally Posted by *ledzepp3*
> 
> Quote:
> 
> 
> 
> Originally Posted by *diggiddi*
> 
> Offtopic how is the Chinese food in china?
> 
> 
> 
> Gonna take a stab and say... Chinese?
> 
> Anyways but the brackets will hopefully give the cards the support they need.
> 
> -Zepp

You can get similar food in most Chinese restaurants. However, it is not the same, nor do they have this much selection. But most people will never get it, as they don't put it on their "American menu", only on their "Chinese menu".


----------



## Mercy4You

Anyone enthusiastic about 14.4? Or are we forever stuck on 13.12?


----------



## shwarz

Sign me up please









http://www.techpowerup.com/gpuz/d8nsd/

XFX R9 290

stock cooling for now


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mercy4You*
> 
> Anyone enthusiastic about 14.4? Or are we for ever stuck on 13.12


It's pretty decent for me so far. It's the same as 13.12 in most games and better in BF4 for me; that is, of course, when BF4 doesn't crash on me









(was crashing before 14.4 btw)


----------



## Germanian

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's pretty decent for me so far, it's the same as 13.12 in most games and better in BF4 for me, that is of course when BF4 doesn't crash on me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (was crashing before 14.4 btw)


I never had crashes on 13.12 with crossfire. 14.4 is absolutely abysmal for me: Crossfire does not work properly, and loads are unbalanced between the cards; I don't know why.

For people who are still on 13.12, keep it that way. Here's hoping that 14.5 is better.


----------



## Mercy4You

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's pretty decent for me so far, it's the same as 13.12 in most games and better in BF4 for me, that is of course when BF4 doesn't crash on me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (was crashing before 14.4 btw)


Are you on Mantle?

BTW, what crashes do you get in BF4? The DirectX error, like me?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mercy4You*
> 
> Are you on Mantle?
> 
> BTW, what crashes do you get on BF4, the Direct X error, like me?


Mantle, DX....doesn't matter.

BEX64, DX shutdowns, it's just weird with no pattern to it.


----------



## Mercy4You

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Mantle, DX....doesn't matter.
> 
> BEX64, DX shutdowns, it's just weird with no pattern to it.


Hmm, I only get the DX "Get Device Removed" bla bla error...

Why is 14.4 better for you than 13.12?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mercy4You*
> 
> Hmm, I only get DX Get Device removed bla bla error...
> 
> Why is 14.4 better for you, than 13.12?


It's roughly 10 FPS faster than 13.12 for me. And with DX I'm getting these horrible frame drops every few minutes that annoy me.


----------



## centvalny

FS test



http://imgur.com/esRUUWm


----------



## Mercy4You

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's roughly 10 fps faster than 13.12 for me, And with DX i'm getting these horrible frame drops every few mins that annoys me.


I also noticed the very weird frame drops in BF4... I now force constant FPS at 90 and that solved it for me.


----------



## Paul17041993

Quote:


> Originally Posted by *Mega Man*
> 
> They can be certified *without* flashing


because certifimacation is key for... wait nevermind...
Quote:


> Originally Posted by *Mercy4You*
> 
> Anyone enthusiastic about 14.4? Or are we for ever stuck on 13.12


I like things to mature before they go out in the "open", so yea I'm patiently excited...


----------



## Mercy4You

Quote:


> Originally Posted by *Paul17041993*
> 
> I like things to mature before they go out in the "open", so yea I'm patiently excited...


----------



## JordanTr

I'm experiencing a strange issue while keeping FPS capped to 60 without vsync: I get weird screen tearing. However, if I set it to 55, the screen tearing is gone. I'm capping FPS through RivaTuner Statistics Server. Is there any other way to cap FPS and use AB? Because RadeonPro + AB don't want to be friends.

I could use vsync all the time, but games like Lineage 2 don't have a vsync option, and some games like Dead Space lock to 30 FPS, which becomes unplayable for me.
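On the capping side: an external limiter like RivaTuner essentially measures what each frame cost and sleeps off the rest of the frame budget. A minimal sketch, where `render_frame` is a stand-in for the game's work (real limiters use busy-waits for tighter pacing):

```python
import time

def run_capped(render_frame, fps_cap, n_frames):
    """Render n_frames, sleeping so the rate never exceeds fps_cap."""
    budget = 1.0 / fps_cap                 # seconds allotted per frame
    for _ in range(n_frames):
        start = time.perf_counter()
        render_frame()                     # the frame's actual work
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)   # wait out the rest of the budget

t0 = time.perf_counter()
run_capped(lambda: None, fps_cap=60, n_frames=12)
print(f"12 frames took {time.perf_counter() - t0:.2f} s")  # ~0.2 s at 60 FPS
```

One common explanation for the 60-vs-55 behaviour: a cap at exactly the refresh rate tends to hold the tear line nearly stationary on screen, while capping slightly below lets it drift past quickly, so the tearing is far less visible. That is speculation consistent with what you describe, not something the limiter documents.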


----------



## Mega Man

Quote:


> Originally Posted by *Paul17041993*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> They can be certified *without* flashing
> 
> 
> 
> because certifimacation is key for... wait nevermind...
> Quote:
> 
> 
> 
> Originally Posted by *Mercy4You*
> 
> Anyone enthusiastic about 14.4? Or are we for ever stuck on 13.12
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I like things to mature before they go out in the "open", so yea I'm patiently excited...

My bad. Was supposed to say crossfired. But my phone likes to correct my poor typing


----------



## kizwan

My 3DMark scores; the graphics score seems low in Crossfire.

Single 290 1200/1600 - tessellation off (13.12)
http://www.3dmark.com/3dm11/8253789

290 Crossfire 1000/1300 - tessellation off (13.11)
http://www.3dmark.com/3dm11/8258566

290 Crossfire 1200/1500 - tessellation off (13.11)
http://www.3dmark.com/3dm11/8258649

As you can see, the graphics score doesn't look right. Below are the old 3DMark 11 scores at 947/1250.

13.11
http://www.3dmark.com/3dm11/7656003

13.12
http://www.3dmark.com/3dm11/7688525


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> My 3dmark score, the graphics score seems low in Crossfire.
> 
> Single 290 1200/1600 - tessellation off (13.12)
> http://www.3dmark.com/3dm11/8253789
> 
> 290 Crossfire 1000/1300 - tessellation off (13.11)
> http://www.3dmark.com/3dm11/8258566
> 
> 290 Crossfire 1200/1500 - tessellation off (13.11)
> http://www.3dmark.com/3dm11/8258649
> 
> If you can see the graphics score doesn't look right. Below is the old 3mark11 scores at 947/1250.
> 
> 13.11
> http://www.3dmark.com/3dm11/7656003
> 
> 13.12
> http://www.3dmark.com/3dm11/7688525


In Crossfire, your tessellation was on. I see what you're saying. Try putting everything at default in CCC; normally, if tessellation is off, Futuremark will detect it.


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> My 3dmark score, the graphics score seems low in Crossfire.
> 
> Single 290 1200/1600 - tessellation off (13.12)
> http://www.3dmark.com/3dm11/8253789
> 
> 290 Crossfire 1000/1300 - tessellation off (13.11)
> http://www.3dmark.com/3dm11/8258566
> 
> 290 Crossfire 1200/1500 - tessellation off (13.11)
> http://www.3dmark.com/3dm11/8258649
> 
> If you can see the graphics score doesn't look right. Below is the old 3mark11 scores at 947/1250.
> 
> 13.11
> http://www.3dmark.com/3dm11/7656003
> 
> 13.12
> http://www.3dmark.com/3dm11/7688525
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> in crossfire, your tess were on. i see what you're saying. try putting everything at default in CCC. normally if tess is off futuremark will detect it.

CCC reset to default.

1200/1500
http://www.3dmark.com/3dm11/8258718

Comparison between *tweaked* (including tessellation off) & default CCC settings.
http://www.3dmark.com/compare/3dm11/8258649/3dm11/8258718


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> CCC reset to default.
> 
> 1200/1500
> http://www.3dmark.com/3dm11/8258718
> 
> Comparison between *tweaked* (including tessellation off) & default CCC settings.
> http://www.3dmark.com/compare/3dm11/8258649/3dm11/8258718


Looks OK; a 6-8% difference.
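For anyone eyeballing the same comparison, that percentage is just the relative difference between the two graphics scores. A quick sketch with hypothetical numbers (the real scores are behind the 3dmark.com links above):

```python
def pct_gain(tweaked, default):
    """Percent gain of the tweaked-CCC score over the default-CCC score."""
    return 100.0 * (tweaked - default) / default

# Hypothetical graphics scores, only to illustrate the calculation.
print(round(pct_gain(tweaked=28500, default=26800), 1))  # -> 6.3
```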


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *centvalny*
> 
> FS test
> 
> 
> 
> http://imgur.com/esRUUWm


That 2900 on da ram man


----------



## darkelixa

My new Gigabyte R9 290 arrived today and let me tell you, it's a little beauty!

Installing the card was a different story. It would not line up with the back of the case, and then it got stuck: it had latched at the back and would not let go, so it took ages to get it to unlock. Eventually I managed to get the card down into the PCIe slot, but the bracket did not line up with where the screws go, so it took a little force on the backplate to get the holes to line up before I could screw that baby down. Is it meant to be that hard to install a card? The last card lined up perfectly.


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> CCC reset to default.
> 
> 1200/1500
> http://www.3dmark.com/3dm11/8258718
> 
> Comparison between *tweaked* (including tessellation off) & default CCC settings.
> http://www.3dmark.com/compare/3dm11/8258649/3dm11/8258718
> 
> 
> 
> looks ok. 6 -8% difference.

Nothing wrong with my score? Shouldn't I get >30k for the graphics score?


----------



## heroxoot

Quote:


> Originally Posted by *darkelixa*
> 
> My new gigabyte r9 290 arrived today and let me tell you, it a little beauty!
> 
> Installing the card was a different story as it would not line up with the back of the case and then it got stuck as it had latched the back part and would not let go, so that took ages for me to get it to unlock and then eventually i managed to get the card down and into the pcie slot. the back part then did not line up where the screws went down so that took a little force on the backplate screw bit to get the holes to line up and then screwed that baby boy down, is it meant to be that hard to install a card, the last card lined up perfectly


That's kind of odd, actually. When I put my GPU in, it has wiggle room till I put the thumbscrew in to hold it straight.


----------



## darkelixa

I had the same issue with my old GTX 770 2GB WindForce; seems to be an issue I have with that brand of card and cooler.


----------



## darkelixa

Is it something that has to be remounted? I don't really want to take the GPU out again and have it get stuck again.


----------



## cephelix

Had the same issue with 2 different cards and 2 different cases...


----------



## darkelixa

Had the same issue?


----------



## Sgt Bilko

It's the same for most cards; that's why you have the thumbscrew. My XFX ones are near rock solid after sliding them into the slot, but I always put the thumbscrew in.


----------



## heroxoot

So how normal is it that VRM temp 2 never moves except during games, and even then not by much? VRM temp 1 is 38°C idle, but temp 2 is 57°C and it never seems to budge. The card is relatively cool, idling at 46°C on 30% fan speed with idle clocks of 515/625; that's the lowest the MSI AB sliders will go for my 2D profile.


----------



## Mercy4You

*I hope ASUS is reading this* (sorry guys if it's a bit off-thread)

After 5 years of buying ASUS products, I did not buy an ASUS Xonar Essence STX today, but a Creative Sound Blaster X-Fi Titanium.

Why? Because in those 5 years I never had to return one of their products, until I bought their R9 290X and had to wait 10 weeks to get the very same faulty card back.


----------



## Sgt Bilko

Quote:


> Originally Posted by *heroxoot*
> 
> So how normal is it that VRM Temp 2 never moves except during games and not by much? Vrm temp one is 38c idle but temp 2 is 57c and it never seems to budge. Card is relatively cool idling 46c on 30% fan speed with idle clocks 515/625. Its the lowest MSI AB sliders will go for my 2D profile.


VRM 2 is for the VRAM, IIRC. It's usually fine and its temps never really move; VRM 1 is for the core, and that's the one that heats up the most.


----------



## heroxoot

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> So how normal is it that VRM temp 2 never moves except during games, and even then not by much? VRM temp 1 is 38°C idle but temp 2 is 57°C, and it never seems to budge. The card idles relatively cool at 46°C on 30% fan speed with idle clocks of 515/625; that's the lowest the MSI AB sliders will go for my 2D profile.
> 
> 
> 
> VRM 2 is for the VRAM, IIRC. It's usually fine and its temps never really move; VRM 1 is for the core, and that's the one that heats up the most.

Good to know. My 7970 only ever showed one VRM temp, if another even existed. But yeah, my RAM is downclocked for 2D, so I guess it's no concern.


----------



## Sgt Bilko

Quote:


> Originally Posted by *heroxoot*
> 
> Good to know. My 7970 only ever showed one VRM temp, if another even existed. But yeah, my RAM is downclocked for 2D, so I guess it's no concern.


Those are good temps. With these cards, anything over 90°C on the VRMs is getting near the danger point, but I think they're rated to 120°C or so.

I've had the VRMs on one of my cards hit 115°C when benching with no harm done... wasn't watching the temps.


----------



## cephelix

Quote:


> Originally Posted by *darkelixa*
> 
> Had the same issue?


Yup, same issue. The holes didn't align; I had to flex the back of the case inward and force the thumbscrews in.

Quote:


> Originally Posted by *heroxoot*
> 
> So how normal is it that VRM temp 2 never moves except during games, and even then not by much? VRM temp 1 is 38°C idle but temp 2 is 57°C, and it never seems to budge. The card idles relatively cool at 46°C on 30% fan speed with idle clocks of 515/625; that's the lowest the MSI AB sliders will go for my 2D profile.


As mentioned, VRM 2 temps don't change much; in what little overclocking I've done, it's moved about 2°C. VRM 1 on the other hand... well, you guys have the same cards, I don't need to repeat what we all know.


----------



## wes1099

Is there any way to cool off the VRMs without water cooling?


----------



## Sgt Bilko

Quote:


> Originally Posted by *wes1099*
> 
> Is there any way to cool off the VRMs without water cooling?


More airflow, usually. It also depends on the brand of card; if it's reference, change the thermal pads for something better.


----------



## cephelix

Quote:


> Originally Posted by *wes1099*
> 
> Is there any way to cool off the VRMs without water cooling?


So far, it's changing the stock thermal pads to the Fujipoly Ultra Extreme ones and adding forged copper heatsinks like the Enzotech ones. Have a search around the forum; the Ultra Extreme thermal pad thread has Roboyto as the OP.

Of course, with the heatsinks you then have to find a way to secure them.
And don't forget more airflow over the heatsinks as well.


----------



## wes1099

Quote:


> Originally Posted by *cephelix*
> 
> So far, it's changing the stock thermal pads to the Fujipoly Ultra Extreme ones and adding forged copper heatsinks like the Enzotech ones. Have a search around the forum; the Ultra Extreme thermal pad thread has Roboyto as the OP.
> 
> Of course, with the heatsinks you then have to find a way to secure them.
> And don't forget more airflow over the heatsinks as well.


How would one add forged copper heatsinks on an air cooled card? Do you mean copper shims?


----------



## cephelix

Quote:


> Originally Posted by *wes1099*
> 
> How would one add forged copper heatsinks on an air cooled card? Do you mean copper shims?


Oh right, my bad. I was thinking of the red mod or the NZXT solutions. Of course, if you use the stock cooler there's no way you could use the copper heatsinks; in that case it would just be changing the thermal pads.


----------



## lawson67

Quote:


> Originally Posted by *luckyboy*
> 
> Every time I change the refresh rate from 120 to 144Hz, the memory clock sits at 1300MHz (screenshot below). I close all open programs, like players, browsers etc., but the result is the same.


Because you're running 144Hz, your pixel clock is elevated to accommodate the extra bandwidth. However, your vertical blanking values must be too low, which doesn't leave enough time to change the memory clock without corrupting the screen, so your GPU memory cannot downclock on the desktop. It's the same with the Korean monitors at 120Hz: no AMD card user can downclock their memory on the desktop much above 110Hz with those monitors, and it sounds like the same thing for you with this monitor above 120Hz. You may not be able to set the right values to correct this, but you could contact the monitor manufacturer.

This does mean that at 144Hz your GPU is drawing as much wattage sat idle on the desktop as it does gaming flat out. In my case, with my Qnix running at 120Hz, that's 4GB of GDDR5 running flat out on the desktop at 1350MHz, drawing an extra 60 watts and adding 10°C to my card's idle temp.

None of this will hurt or damage your card in any way, or even make it run hotter when gaming, because the memory can't go any faster than it already runs at idle if it cannot downclock. Your electricity bill will go up, though, and if you care about that I would advise using CCC to make presets for your custom resolutions, as I have done in my picture below. That lets you flick over to 60Hz, 96Hz, or in your case 120Hz on the desktop at the touch of a keystroke so your GPU memory can downclock; once you want to game again, just hit your 144Hz preset and off you go!


----------



## BradleyW

Just to add, my screen is 144Hz and I am able to downclock the VRAM, using dual-link DVI and CCC 14.4 RC.


----------



## lawson67

Quote:


> Originally Posted by *BradleyW*
> 
> Just to add, my screen is 144hz and I am able to downclock the VRAM. Using Dual-Link DVI and CCC 14.4 RC


I wonder if you both use the same monitor? With the Qnix / X-Star monitors this cannot be done: they have a 450MHz pixel clock limit, and with an AMD card at 120Hz the vertical blanking timings are too low for the card to downclock its memory. If you raise the vertical blanking timings with an AMD card, you end up too far above the 450MHz limit to have a strong enough signal to hit 120Hz. I don't know which monitor he has or its pixel clock limit, but it does sound like his vertical blanking timings are too low for his memory to downclock.

Nvidia card users on the Qnix and X-Star monitors, on the other hand, can use lower vertical blanking timings than any AMD card can, and so can downclock their memory at 120Hz. As I said, I don't know what monitor he has; maybe he should contact the manufacturer. I'm only speaking from a Korean-monitor standpoint, and pointing out that his problem sounds very similar to the low-vertical-blanking issue we have using AMD cards with the Qnix and X-Star monitors. I just use CCC presets and flick over to 96Hz on the desktop, then back to 120Hz when I want to game.
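To put rough numbers on that 450MHz limit, here is a small sketch. The timing totals are standard CVT reduced-blanking values for 2560x1440 (htotal 2720, vtotal 1481) and are only illustrative; the actual timings people set in CRU for a Qnix/X-Star will differ a bit.

```python
# Rough illustration of why 2560x1440 @ 120Hz pushes past a ~450MHz
# pixel clock on the Qnix/X-Star panels. Timings are standard CVT
# reduced-blanking totals and are an approximation only; CRU lets you
# tighten them further.

def pixel_clock_hz(h_total, v_total, refresh_hz):
    """Pixel clock = total pixels per frame (active + blanking) * frames/sec."""
    return h_total * v_total * refresh_hz

H_TOTAL, V_TOTAL = 2720, 1481   # 2560x1440 plus CVT-RB blanking
LIMIT_HZ = 450_000_000          # the ~450MHz panel limit mentioned above

for refresh in (60, 96, 120):
    clk = pixel_clock_hz(H_TOTAL, V_TOTAL, refresh)
    status = "over" if clk > LIMIT_HZ else "under"
    print(f"{refresh:3d} Hz -> {clk / 1e6:6.1f} MHz ({status} the 450 MHz limit)")
```

With these assumed timings, 60Hz and 96Hz land well under the limit while 120Hz lands above it, which lines up with why flicking a CCC preset down to 96Hz lets the memory downclock on the desktop.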


----------



## invincible20xx

Guys, anybody have an idea how to undervolt an R9 290 without undervolting the 2D mode as well? I'm trying to get better temps on my reference R9 290s, as the top GPU goes over 90°C easily.


----------



## Roy360

I've noticed Catalyst 13.12 has supersampling under AA method. Is this an improvement over the original supersampling, or is AMD's DownsamplingGUI still the way to go?


----------



## BradleyW

Hello,
I have two issues I've discovered lately:
1) Every game picks up the amount of dedicated VRAM @ 3072MB.
2) Every game picks up the second card as a Microsoft Basic Render Device.

I've tested this on all versions of CCC and Windows.

Ideas?
Thank you.


----------



## HAVOKNW

I have two of the PowerColor R9 290X Water Cooled cards. I had my mod in the PowerColor suite at CES this year. At the time, I was only running one card. We had BF4 on ultra settings running 3 x 27" monitors in Eyefinity and the game was smooth. Vendors were baffled at how smooth it was running on one card. This thing is a beast!


----------



## luckyboy

Lawson, if I change the refresh rate from 144Hz to 120Hz via CCC, after 10-20 seconds it auto-reverts to 144Hz! I can change from 120 to 144 via CCC without problems, but not from 144 to 120Hz; for that I'm using the Windows display settings etc. Perhaps I will use 144Hz only while gaming and switch between 144/120 every day. This is so uncomfortable.


----------



## kizwan

Quote:


> Originally Posted by *kizwan*
> 
> My 3dmark score, the graphics score seems low in Crossfire.
> 
> Single 290 1200/1600 - tessellation off (13.12)
> http://www.3dmark.com/3dm11/8253789
> 
> 290 Crossfire 1000/1300 - tessellation off (13.11)
> http://www.3dmark.com/3dm11/8258566
> 
> 290 Crossfire 1200/1500 - tessellation off (13.11)
> http://www.3dmark.com/3dm11/8258649
> 
> If you can see the graphics score doesn't look right. Below is the old 3mark11 scores at 947/1250.
> 
> 13.11
> http://www.3dmark.com/3dm11/7656003
> 
> 13.12
> http://www.3dmark.com/3dm11/7688525


Thanks to madman (@HOMECINEMA-PC), I fixed it. It turned out that "Enable GPU Scaling" was not enabled in CCC; 3DMark 11 was running centred on the 1080p screen, losing FPS that way. I thought a 3DMark 11 update was causing this *issue*.









http://www.3dmark.com/3dm11/8258906


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> Thanks to madman (@HOMECINEMA-PC), I fixed it. It turned out that "Enable GPU Scaling" was not enabled in CCC; 3DMark 11 was running centred on the 1080p screen, losing FPS that way. I thought a 3DMark 11 update was causing this *issue*.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8258906


wth? what scaling! tell me that's tess off.


----------



## heroxoot

I have given up hope on the MSI stock cooling. My 290X fans make a little rattling noise even at 40%. I guess it's not so bad because I can only hear it while playing a game, but it's annoying how the quality of the Twin Frozr has fallen. Also, I am testing overclocks and I'm unsure if it was BF4, but 1080/1225 at +25mV seemed fine, no artifacts; the game was just hardly putting any load on my GPU. Is this throttling, or is BF4 being poorly optimized? I was only getting 30fps where I was getting 70-80fps on a 64-player server at 1060/1225. I'm new to GPUs that throttle, so I don't know the differences yet.

BF4 just gave me the answer. Joined another server and I'm averaging 60% load, getting 70-ish FPS. It's funny: on the kill cam it goes up to 90% load and I get 100+ FPS.


----------



## Roy360

Finally bit the bullet after seeing my VRMs hover at 80 degrees.

Just bought this: http://www.frozencpu.com/products/17500/thr-182/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_10_-_Thermal_Conductivity_170_WmK.html for my EK block. It's a shame I won't be able to use the leftover pad for anything else.

Used the coupon "ocn" for a 5.1% discount.

I guess this is better than buying 3 EK backplates for my cards.


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Thanks to madman (@HOMECINEMA-PC), I fixed it. It turned out that "Enable GPU Scaling" was not enabled in CCC; 3DMark 11 was running centred on the 1080p screen, losing FPS that way. I thought a 3DMark 11 update was causing this *issue*.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8258906
> 
> 
> 
> 
> 
> 
> 
> 
> 
> wth? what scaling! tell me that's tess off.

Yup, tessellation OFF. For hwbot submission.

http://hwbot.org/submission/2536520_kizwan_3dmark11___performance_2x_radeon_r9_290_22263_marks


----------



## rdr09

Quote:


> Originally Posted by *Roy360*
> 
> finally bit the bullet, after seeing my VRMs hover at 80 degrees.
> 
> Just bought this: http://www.frozencpu.com/products/17500/thr-182/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_10_-_Thermal_Conductivity_170_WmK.html for my EK block. Its a shame I won't be able to use the extra or anything else.
> 
> used the coupon ocn, for a 5.1% discount.
> 
> I guess this is better than buying 3 EK backplates for my cards.


yah, baby!

Edit: if you're gonna drain the system anyway, you might as well use this for the mem . . .

http://www.frozencpu.com/products/16878/thr-164/Fujipoly_Extreme_System_Builder_Thermal_Pad_-_14_Sheet_-_150_x_100_x_05_-_Thermal_Conductivity_110_WmK.html

and maybe a better paste for the core, like Gelid Extreme, if what you have is ordinary.

Quote:


> Originally Posted by *kizwan*
> 
> Yup, tessellation OFF. For hwbot submission.
> 
> http://hwbot.org/submission/2536520_kizwan_3dmark11___performance_2x_radeon_r9_290_22263_marks


Nice. I was going to ask you before: how many pumps do you have in your rig, 3? I can't really tell how the loop goes just by looking at the pic.


----------



## Jflisk

Quote:


> Originally Posted by *Roy360*
> 
> finally bit the bullet, after seeing my VRMs hover at 80 degrees.
> 
> Just bought this: http://www.frozencpu.com/products/17500/thr-182/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_60_x_50_x_10_-_Thermal_Conductivity_170_WmK.html for my EK block. Its a shame I won't be able to use the extra or anything else.
> 
> used the coupon ocn, for a 5.1% discount.
> 
> I guess this is better than buying 3 EK backplates for my cards.


Roy, the EK backplates help a lot. I was going to do the blocks without them, bit the bullet, and I'm glad I did, because temps are way down on the VRMs. VRM 1 is murder on these cards; you can cook bacon on them. Everything is better with bacon, even video cards.


----------



## kizwan

Quote:


> Originally Posted by *ledzepp3*
> 
> Anybody here been running their 290/290X's on EK blocks with the reinforcing brackets? There's a lack of images of them in use, just stock pictures from EK's website and places like FCPU.
> 
> -Zepp


Quote:


> Originally Posted by *cephelix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ledzepp3*
> 
> Anybody here been running their 290/290X's on EK blocks with the reinforcing brackets? There's a lack of images of them in use, just stock pictures from EK's website and places like FCPU.
> 
> -Zepp
> 
> 
> 
> just saw the brackets and wondering where they attach to?and is it really necessary?

If I'm not mistaken, the reinforcing bracket is useful if you have one card or don't use the EK bridge/terminal with multiple GPUs.
Quote:


> Originally Posted by *rdr09*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Yup, tessellation OFF. For hwbot submission.
> 
> http://hwbot.org/submission/2536520_kizwan_3dmark11___performance_2x_radeon_r9_290_22263_marks
> 
> 
> 
> 
> 
> 
> 
> nice. i was going to ask you before how many pumps do you have in your rig. 3? i can't really see how the loop goes just by looking at the pic.

Thanks. I have one D5 pump in there. The flow:
res/pump >> CPU >> top radiator >> GPUs (with parallel EK terminal) >> bottom radiator >> front radiator >> res/pump

The bottom 120mm radiator really lowers the water temp a lot. Without it I could feel the reservoir and front radiator heat up; now it just feels warm when gaming and benching.


----------



## lawson67

Quote:


> Originally Posted by *luckyboy*
> 
> Lawson, if I change refresh rate from 144hz to 120hz via CCC, after 10-20 seconds...is auto revert to 120hz! I can change from 120 to 144 via CCC without problem, but not from 144 to 120hz. For that, i'm using windows display settings etc. Perhaps i will using 144hz, only while gaming and switch between 144/120 every day. This is so uncomfortable


Yes, this is basically what I do with my Qnix monitor, since I sold my 2x Asus GTX 660s in SLI (which would downclock at 120Hz on the desktop) and bought my 2x R9 290s (which cannot). I have no trouble at all using CCC presets, assigned to keystrokes, to flick from 120Hz once I've finished gaming to 60Hz, 96Hz, or even 110Hz; all of these let my R9 290s downclock on the desktop.

The Korean monitors we use are overclocked, so there is no support for them at higher refresh rates outside our enthusiast community, and some of them can't hit 120Hz at all; it's a silicon-lottery gamble. You, however, bought your monitor as a 144Hz monitor, so you can put your problem to BenQ and ask their advice.

We use ToastyX's CRU program (linked below) to create EDID overrides for our custom resolutions, where we can change our timings. We use CRU because AMD Catalyst Control Center does not support creating custom resolutions the way the Nvidia Control Panel does. CRU would let you create new resolutions and timings, but you'd be taking a step into the unknown messing with timings, and I would not do that without knowing the monitor's pixel clock limit. If I were you, I'd get onto BenQ support and ask why your GPU memory does not downclock at 144Hz; they may well have a workaround or a fix for you. I hope this helps.









CRU
http://www.monitortests.com/forum/Thread-Custom-Resolution-Utility-CRU


----------



## BradleyW

Hey all,
My previous post got missed. I have two issues:
1) Every game picks up the amount of dedicated VRAM @ 3072MB.
2) Every game picks up the second card as a Microsoft Basic Render Device.
Any help would be great, because this is bugging me...








Thank you.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> Thanks to madman (@HOMECINEMA-PC), I fixed it. It turned out that "Enable GPU Scaling" was not enabled in CCC; 3DMark 11 was running centred on the 1080p screen, losing FPS that way. I thought a 3DMark 11 update was causing this *issue*.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8258906
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> wth? what scaling! tell me that's tess off.

Not just a pretty face............


Quote:


> Originally Posted by *BradleyW*
> 
> Hey all,
> My previous post got missed.
> I have an issue:
> 1) Every game picks up the amount of dedicated VRAM @ 3072MB.
> 2) Every game picks up the second card as a Microsoft Basic Render Device.
> Any help would be great, because this issue is bugging me..
> 
> 
> 
> 
> 
> 
> 
> 
> Thank you.


Not sure about that issue, mate. My gear reads VRAM correctly in AB; COD Ghosts draws nearly 8 gigs in CF.


----------



## Jflisk

Quote:


> Originally Posted by *BradleyW*
> 
> Hey all,
> My previous post got missed.
> I have an issue:
> 1) Every game picks up the amount of dedicated VRAM @ 3072MB.
> 2) Every game picks up the second card as a Microsoft Basic Render Device.
> Any help would be great, because this issue is bugging me..
> 
> 
> 
> 
> 
> 
> 
> 
> Thank you.


Okay, have you checked to be sure both cards are installed under hardware in Windows? If one is and one is not, install the AMD driver of your choice on the card that, by the looks of things, shows as a Microsoft Basic Render Device. You can also try running DDU (Display Driver Uninstaller) and reinstalling your drivers; it sounds like the AMD driver is not installed on one of the cards. If you cannot install the drivers and you constantly see MBRD in Windows hardware, or the screen keeps turning black and staying black, do an RMA on the card. Thanks

DDU - safe for win 8 and 8.1
http://www.guru3d.com/files_details/display_driver_uninstaller_download.html


----------



## darkelixa

Having a quick look behind my GPU, the standoff looks like it has made some sort of marks around the screw hole where the standoff goes into the case. Is this normal?



http://imgur.com/Gh6JQqC


----------



## Jflisk

Quote:


> Originally Posted by *darkelixa*
> 
> Having a quick look behind my gpu the standoff looks like it has made some sort of marks around the screw hole where the standoff goes into the case, is this normal?
> 
> 
> 
> http://imgur.com/Gh6JQqC


Those are some pretty cables. It looks like it's stripped in there, but I'd need a better picture to be sure. Are you talking about the standoff on the right side of the picture? The cables are creating shine; it could also be paint that was in the screw hole and got pulled through by the screw.


----------



## darkelixa

Yes, that second standoff on the right is the one I am talking about. I can take the screw in and out no worries; I was just wondering if it has damaged the board. Here is another image with a little less glare.


----------



## chronicfx

I put 14.4 on last night and it was pretty good; I played all night with no problems.


----------



## diggiddi

Quote:


> Originally Posted by *HAVOKNW*
> 
> I have two of the PowerColor R9 290X Water Cooled cards. I had my mod in the PowerColor suite at CES this year. At the time, I was only running one card. We had BF4 on ultra settings running 3 x 27" monitors in Eyefinity and the game was smooth. Vendors were baffled at how smooth it was running on one card. This thing is a beast!


What mod?


----------



## invincible20xx

Quote:


> Originally Posted by *chronicfx*
> 
> I put 14.4 on last night and that was pretty good I played all night with no problems


3 x 290X on a 3570K? How does that work for you, no CPU bottleneck? I was wondering if my 3770K was bottlenecking my 2 R9 290s at stock @ 1080p.


----------



## heroxoot

Quote:


> Originally Posted by *invincible20xx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *chronicfx*
> 
> I put 14.4 on last night and that was pretty good I played all night with no problems
> 
> 
> 
> 3 x 290X on a 3570K? How does that work for you, no CPU bottleneck? I was wondering if my 3770K was bottlenecking my 2 R9 290s at stock @ 1080p.

I used to wonder about this too, but here's what you need to know: the CPU is only a bottleneck if it isn't enough. I have an 8150. Is it the greatest CPU on the market? Nope, but it's enough processing power for gaming; I get 60+ FPS in all my games. If the CPU is more than enough, it's never a bottleneck.


----------



## invincible20xx

Quote:


> Originally Posted by *heroxoot*
> 
> I used to wonder about this too, but here's what you need to know: the CPU is only a bottleneck if it isn't enough. I have an 8150. Is it the greatest CPU on the market? Nope, but it's enough processing power for gaming; I get 60+ FPS in all my games. If the CPU is more than enough, it's never a bottleneck.


You're on 1080p?

I'm not sure if I should graduate from 1080p yet, lol. I need to make sure I always have a system that will never ever drop below 60 FPS at absolute max settings; I don't want to get into the more-pixels circle jerk just yet, heh.


----------



## heroxoot

Just benchmarked my 290X and I'm a little concerned. It's at 1100/1250 with +50% power limit and +31mV, and during the benchmark Heaven lagged for a second and then the frame rate immediately picked back up. Can this be solved by another driver, or is it maybe a power thing? The short hang occurred in two places. I'm going to increase the voltage to +35mV and see if it helps, but so far in benching this seems stable. 77°C with 94% fan at the peak; seems a little hot for my blood, but I'm told the 290/X run a bit hotter than last gen.


----------



## invincible20xx

Quote:


> Originally Posted by *heroxoot*
> 
> Just benchmarked my 290X and I'm a little concerned. It's at 1100/1250 with +50% power limit and +31mV, and during the benchmark Heaven lagged for a second and then the frame rate immediately picked back up. Can this be solved by another driver, or is it maybe a power thing? The short hang occurred in two places. I'm going to increase the voltage to +35mV and see if it helps, but so far in benching this seems stable. 77°C with 94% fan at the peak; seems a little hot for my blood, but I'm told the 290/X run a bit hotter than last gen.


AMD says 95°C is the norm for an R9 290/290X. Mine get up to 90-92°C at stock clocks @ 1.05V on auto fan, which maxes out at 42-45%.


----------



## heroxoot

Quote:


> Originally Posted by *invincible20xx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Just benchmarked my 290X and I'm a little concerned. It's at 1100/1250 with +50% power limit and +31mV, and during the benchmark Heaven lagged for a second and then the frame rate immediately picked back up. Can this be solved by another driver, or is it maybe a power thing? The short hang occurred in two places. I'm going to increase the voltage to +35mV and see if it helps, but so far in benching this seems stable. 77°C with 94% fan at the peak; seems a little hot for my blood, but I'm told the 290/X run a bit hotter than last gen.
> 
> 
> 
> AMD says 95°C is the norm for an R9 290/290X. Mine get up to 90-92°C at stock clocks @ 1.05V on auto fan, which maxes out at 42-45%.

Is it OK to OC, though, without a higher fan curve? You're on stock 290s, but I'm on an OC'd 290X; I can only imagine it would easily break 80°C on such low fan speeds.


----------



## invincible20xx

Quote:


> Originally Posted by *heroxoot*
> 
> Is it OK to OC, though, without a higher fan curve? You're on stock 290s, but I'm on an OC'd 290X; I can only imagine it would easily break 80°C on such low fan speeds.


I don't understand; it's fine if you let your GPU hit 80°C+. That's actually good for an overclocked R9 290X!


----------



## heroxoot

Quote:


> Originally Posted by *invincible20xx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Is it OK to OC, though, without a higher fan curve? You're on stock 290s, but I'm on an OC'd 290X; I can only imagine it would easily break 80°C on such low fan speeds.
> 
> 
> 
> I don't understand; it's fine if you let your GPU hit 80°C+. That's actually good for an overclocked R9 290X!

That isn't the issue here. The issue is that if it's almost breaking 80°C at 95% fan speed, it will easily break 90°C on the stock curve. This is why I refuse to use the 14.4 driver: it gives me no fan control.
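For anyone unsure what a custom fan curve actually does under the hood: it's just a piecewise-linear map from GPU temperature to fan duty cycle, interpolated between the breakpoints you drag into place. A hypothetical sketch (the breakpoints below are made up for illustration, not AMD's or MSI's actual stock curve):

```python
# Hypothetical fan curve: (temperature C, fan duty %) breakpoints,
# linearly interpolated, like the custom curves set in MSI Afterburner.
# These breakpoint values are invented for illustration only.
CURVE = [(40, 30), (60, 45), (75, 70), (85, 95)]

def fan_percent(temp_c):
    """Return the fan duty (%) for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:        # below the first point: clamp low
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:       # above the last point: clamp high
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            # Linear interpolation between the two surrounding breakpoints
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(70))  # falls between the 60C and 75C breakpoints
```

The stock curve being complained about above is the same mechanism with much more conservative breakpoints, which is why an overclocked card can sail past 90°C before the fan ever ramps up.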


----------



## ledzepp3

Alright so reporting back on those FC R9-290X Reinforcement Brackets which I posted about earlier.

The installation was remarkably easy. If you already have the block (and backplate, if you've got one) installed, there's _no_ need to rip the block off and waste TIM or thermal pads. Three bolts are required: two allen-head bolts that attach at the PCI bracket, and one at the very end of the card right by the PCIe 6- and 8-pin connectors.




There is very little of the beautiful acrylic (on my blocks at least) covered up by these, and no nickel covered whatsoever. There is clearance from the brackets to the actual block (~11mm) and from the bracket to the inlet/outlet ports, more than enough to fit rotary or angled adapters. There's ample space for any aftermarket/upgraded PCIe power connectors to make their way into the sockets on the card, so no worries if anyone has sleeved cables with larger connectors.





Overall, these brackets are phenomenal. The stiffness and rigidity of the cards is noticeably higher, plus I think they look even classier now that the _hideous_ half-pink caps are somewhat covered by the brackets when viewed head-on (as if they were installed on a board in a case). Here are some other shots of the finish, which is flat black but will catch some light when it shines directly on the mildly textured surface. The brackets span the whole length of the PCB, for those concerned about overhanging components or accessories.



Mountain Dew can for scale










Cheers all and hope this helps!








-Zepp


----------



## Roy360

Quote:


> Originally Posted by *rdr09*
> 
> yah, baby!
> 
> edit: you gonna drain the system, then you might as well use this for the mem . . .
> 
> http://www.frozencpu.com/products/16878/thr-164/Fujipoly_Extreme_System_Builder_Thermal_Pad_-_14_Sheet_-_150_x_100_x_05_-_Thermal_Conductivity_110_WmK.html
> 
> and maybe a better paste for the core like gelid extreme if what you have is ordinary.
> nice. i was going to ask you before how many pumps do you have in your rig. 3? i can't really see how the loop goes just by looking at the pic.


Order already shipped. (damn these guys are fast).

Is the EK paste bad? They gave me Gelid paste for my CPU block, so I figured this would be something similar.

I thought about getting the 11 W/mK pads, but ASUS didn't even bother cooling the memory on their DirectCU cooler, so I figured EK's pads should be more than enough. Plus Rob said that if we were cheap we could probably skimp on the memory pads and get away with it too.









Video card for the past 2 years: Intel HD 3000
New video card: R9 290 TF, with a fourth in the box

No need to drain my loop







I never filled it up to the top, so most of the water just stays in the radiators. However this is gonna be a pain when doing yearly maintenance.


----------



## BradleyW

Quote:


> Originally Posted by *Jflisk*
> 
> Okay, have you checked to be sure both cards are installed under hardware in Windows? If one is and one is not, install the AMD driver of your choice on the card that, by the looks of things, shows as a Microsoft Basic Render Device. You can also try running DDU (Display Driver Uninstaller) and reinstalling your drivers; it sounds like the AMD driver is not installed on one of the cards. If you cannot install the drivers and you constantly see MBRD in Windows hardware, or the screen keeps turning black and staying black, do an RMA on the card. Thanks
> 
> DDU - safe for win 8 and 8.1
> http://www.guru3d.com/files_details/display_driver_uninstaller_download.html


Thanks for the help.
Both cards are installed in Device Manager under the latest CCC driver files, so no issues there. My main concern is the VRAM. All games report both cards only hold 3GB of the stuff.


----------



## Jflisk

Quote:


> Originally Posted by *BradleyW*
> 
> Thanks for the help.
> Both cards are installed in Device Manager under the latest CCC driver files, so no issues there. My main concern is the VRAM. All games report both cards only holding 3GB of the stuff.


Find a copy of GPU-Z and run it to see what it says the size of your VRAM is; you should see a field called Memory Size.


----------



## Thorteris

Hello everybody, should I use Heaven or Valley to benchmark my overclocks? And should I even bother to OC my reference R9 290X, or wait until I get the money to buy a Kraken X60 for my Kraken G10?


----------



## BradleyW

Quote:


> Originally Posted by *Jflisk*
> 
> Find a copy of gpu-z and run it see what it says the size of your v ram is you should see a tab called memory size.


GPU-Z reports 4096MB, but that does not help in this case: GPU-Z pulls those stats from a pre-defined database. Even if the VRAM were wrecked, it would still report 4096.


----------



## darkelixa

Fantastic. So the marks on my mainboard where the standoff is are nothing to worry about?


----------



## chronicfx

Quote:


> Originally Posted by *invincible20xx*
> 
> 3 x 290x on a 3570k ? how does that work for you ? no gpu bottleneck ? i was wondering if my 3770k was bottlenecking my 2 r9 290's at stock @ 1080p


My CPU cores stay in the low 90s for usage in some more demanding titles like Crysis 3, but I don't hit 100%. Anyway, I designed it that way: I would rather have my GPU end be too powerful for my CPU than leave anything on the table going the other way around.







Games run smooth and above 60 so thats all I care about.


----------



## BradleyW

Quote:


> Originally Posted by *chronicfx*
> 
> My cpu cores stay in the low 90's for usage in some more demanding titles like crysis 3. But I don't hit 100%. Anyway I designed it that way. *i would rather have my gpu end be too powerful for my cpu* than leave anything on the table going the other way around
> 
> 
> 
> 
> 
> 
> 
> Games run smooth and above 60 so thats all I care about.


Just thought I'd highlight that.


----------



## invincible20xx

Quote:


> Originally Posted by *chronicfx*
> 
> My cpu cores stay in the low 90's for usage in some more demanding titles like crysis 3. But I don't hit 100%. Anyway I designed it that way. i would rather have my gpu end be too powerful for my cpu than leave anything on the table going the other way around
> 
> 
> 
> 
> 
> 
> 
> Games run smooth and above 60 so thats all I care about.


Well, that's what I did: I upgraded from 7970s just to be able to hold 60 fps at all times, maxed out with filters at the absolute maximum on my 1080p display, without ever dropping below 60 lol. A lot of people said it would be a waste for 1080p, but some games just wouldn't keep a 60 fps minimum when I used 8x AA or even 4x AA, so I found myself dropping AA in games like Tomb Raider; at 8x I was dropping into the 45-ish area with the two 7970s I had, and I didn't like that.







Far Cry 3 was another example, I guess, and other games I can't remember. Also, having had 7970s for quite a while, there is something more "smooth" about these R9s in displaying games. For example, in Arkham City, which I tried yesterday just to see the difference in a game I knew the 7970s could keep at 60 fps: same fps when I popped in the R9s, but the game feels more fluid, smoother in a way I can't quite describe lol. Could this be due to the better hardware frame-pacing technique used in the new R9 series GPUs, or the new Crossfire technique? I'm sure as hell it's not placebo!


----------



## Dasboogieman

Quote:


> Originally Posted by *invincible20xx*
> 
> 3 x 290x on a 3570k ? how does that work for you ? no gpu bottleneck ? i was wondering if my 3770k was bottlenecking my 2 r9 290's at stock @ 1080p


I find that CPU bottlenecks are almost a non-factor with any decent Sandy Bridge or later chip. The reason is that games that hog the CPU to the point of going under 60 FPS do so because they are either badly optimized or single-core coded. For example, Sins of a Solar Empire is a really old game that can bring any modern CPU to its knees (so much so that your average FPS is around 20 even on SLI Titans), and the only reason is that the AI and physics engine were coded for single cores.
In other words, these kinds of games won't improve in FPS if you buy an expensive CPU (until they are better optimized or Intel figures out a way to improve single-core performance by 100%).

On the other hand, games that are GPU-dependent are plentiful and in the majority. So having trifire 290Xs on tap is far more important than worrying about losing a few frames when you are already capable of pushing 120+ FPS.

Unless of course you run 120hz monitors, then this is a completely different beast.
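The argument above comes down to a simple model: each frame costs roughly max(CPU time, GPU time), so once the GPU side is fast enough, a single-core-bound game stops scaling no matter how many 290Xs you add. A rough illustration of that idea (a simplified lockstep model that ignores AFR pipelining; all the millisecond figures are made up for the example):

```python
def fps(cpu_ms, gpu_ms):
    """Frame rate when CPU and GPU work on frames in lockstep:
    the slower of the two paces every frame."""
    return 1000.0 / max(cpu_ms, gpu_ms)

# A single-core-bound game: 50 ms of serial CPU work per frame.
print(fps(cpu_ms=50.0, gpu_ms=25.0))   # 20.0 FPS on one GPU
print(fps(cpu_ms=50.0, gpu_ms=12.5))   # still 20.0 FPS after doubling GPU power

# A GPU-bound game: extra GPU power keeps paying off until the CPU wall.
print(fps(cpu_ms=8.0, gpu_ms=16.0))    # 62.5 FPS
print(fps(cpu_ms=8.0, gpu_ms=8.0))     # 125.0 FPS
```

This is why the Sins of a Solar Empire case stays at ~20 FPS on any GPU setup: the CPU term dominates the max, and only better optimization (or faster single-core performance) moves it.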


----------



## BradleyW

It's not the CPUs themselves that are the bottleneck; it's that the programs we run don't use our CPUs efficiently. This causes slowdowns on the CPU side, resulting in a bottleneck as the GPUs wait their turn.


----------



## invincible20xx

Quote:


> Originally Posted by *chronicfx*
> 
> My cpu cores stay in the low 90's for usage in some more demanding titles like crysis 3. But I don't hit 100%. Anyway I designed it that way. i would rather have my gpu end be too powerful for my cpu than leave anything on the table going the other way around
> 
> 
> 
> 
> 
> 
> 
> Games run smooth and above 60 so thats all I care about.


But how are you hooking 3 x 290X up on a Z77? What mobo do you have?! The last PCIe slot on my Sabertooth, for example, is only good for a sound card, USB card, or Wi-Fi/Bluetooth card, because it's a 4x PCIe 2.0 slot; it was not really made to be used with a GPU of the rank of an R9 series, or even the 7900 series...


----------



## cephelix

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *ledzepp3*
> 
> Alright so reporting back on those FC R9-290X Reinforcement Brackets which I posted about earlier.
> 
> The installation was remarkably easy. If you already have the block (and backplate if you've got one) installed, there's _no_ need to rip the block off and waste TIM or thermal pads. There are three bolts required (two allen heads which attach at the PCI bracket), and one at the very end of the card right by the PCI 6 and 8 pin connections.
> 
> 
> 
> 
> There is very little of the beautiful acrylic (on my blocks at least) covered up by these, and no nickel covered whatsoever. There is clearance from the brackets to the actual block (~11mm), and clearance from the bracket to the inlet/outlet ports. More than enough to be able to fit rotary or angled adapters. Ample space is had for any aftermarket/upgraded PCI power connectors to make their way into the sockets on the card, so no worries if anyone has sleeved cables with larger connectors.
> 
> 
> 
> 
> 
> Overall, these adapters are phenomenal. The stiffness and rigidity of the cards is noticeably higher- plus I think they look even classier now that the _hideous_ half-pink caps are somewhat covered by the brackets when viewed from a head-on angle (as if they were installed on a board in a case). Here are some other shots I have of the perfect finishes which are flat black, but will catch some light if shown directly on them due to the mildly textured look. The brackets themselves span the whole length of the PCB for those who are concerned about overhanging components or accessories.
> 
> 
> 
> Mountain Dew can for scale
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers all and hope this helps!
> 
> 
> 
> 
> 
> 
> 
> 
> -Zepp






Thanx for that Zepp!.....


----------



## BradleyW

Quote:


> Originally Posted by *invincible20xx*
> 
> but how are you hooking 3 x 290x on a z77 ? what mobo do you have ?! the last pci-e on my sabertooth for example is only good for a sound card / usb card / wifi bluetooth card only because it's a 4x pci-e 2.0 slot was not really made to be utilized with a gpu the rank of an r9 series or even the 7900 series ...


???
Did you mean to quote someone else?


----------



## invincible20xx

Quote:


> Originally Posted by *BradleyW*
> 
> ???
> Did you mean to quote someone else?


oh sorry i wanted to quote somebody else

now edited


----------



## lawson67

Hi guys i have noticed that gigabyte have released a new bios( F4 ) update for the windforce R9 290!....Has anyone tried it and does anyone think it would benefit me to update to F4 from the F3L version currently installed?

http://www.gigabyte.com/products/product-page.aspx?pid=4884#bios


----------



## heroxoot

Quote:


> Originally Posted by *lawson67*
> 
> Hi guys i have noticed that gigabyte have released a new bios( F4 ) update for the windforce R9 290!....Has anyone tried it and does anyone think it would benefit me to update to F4 from the F3L version currently installed?
> 
> http://www.gigabyte.com/products/product-page.aspx?pid=4884#bios


Considering it says it improves stability, probably.


----------



## lawson67

Quote:


> Originally Posted by *heroxoot*
> 
> Considering it says it improves stability, probably.


Yeah, but it also says that if you are having no problems, don't flash it, as it could brick your card, etc. I seem to have no problems at all: I am overclocked at 1100MHz through CCC, and my memory is clocked at 1350MHz, the same as my PowerColor card, with no added voltage, just +20 on the power limit setting. Also, I was rather hoping someone who had tried it would reply!


----------



## chronicfx

Quote:


> Originally Posted by *invincible20xx*
> 
> but how are you hooking 3 x 290x on a z77 ? what mobo do you have ?! the last pci-e on my sabertooth for example is only good for a sound card / usb card / wifi bluetooth card only because it's a 4x pci-e 2.0 slot was not really made to be utilized with a gpu the rank of an r9 series or even the 7900 series ...


The Gigabyte Z77X-UD5H has 3.0 x8/x4/x4, and I put my sound card in the third x1 PCIe slot, between the first and second card.



I am actually running two power supplies, since I recently took a mining rig offline for a bit. I just installed my FSP Booster X5 450W PSU for the bottom card. The PSU fits into the 5.25" bay; you can see it has the "X" on it in the picture below.



Please don't shred me on wire management; I am well aware it is not great, and I have great respect for those of you who do it well. I wonder when you are coming over to teach me


----------



## invincible20xx

Quote:


> Originally Posted by *chronicfx*
> 
> My cpu cores stay in the low 90's for usage in some more demanding titles like crysis 3. But I don't hit 100%. Anyway I designed it that way. i would rather have my gpu end be too powerful for my cpu than leave anything on the table going the other way around
> 
> 
> 
> 
> 
> 
> 
> Games run smooth and above 60 so thats all I care about.


Quote:


> Originally Posted by *chronicfx*
> 
> The gigabyte z77x-ud5h has 3.0 x8/x4/x4 and i put my sound card in the third x1 pcie between the first and second card.
> 
> 
> 
> I actually am running two power supplies since i recently took a mining rig offline for a bit. I just installed my fsp booster x5 450w psu for the bottom card. The psu fits into the 5.25 slot you can see it has the "X" on it in the picture below.
> 
> 
> 
> Please don't shred me on wire management I am well aware it is not great and I have great respect for those of you who do it well and wonder when you are coming over to teach me


lol, I never ever tell anybody what to do; whatever floats their boat lol. Good system nonetheless


----------



## Paul17041993

Quote:


> Originally Posted by *chronicfx*
> 
> Please don't shred me on wire management I am well aware it is not great and I have great respect for those of you who do it well and wonder when you are coming over to teach me


That's better than mine really; I've only used twist ties in a couple of areas of my rig, just to get the side/back panel to sit properly and to stop SATA cables drifting...


----------



## BroJin

Add me to the club









Sapphire 290 Vapor X


Spoiler: Warning: Spoiler!









Please no comments on the dust in my system. I am a smoker







Will spend time this weekend to clean









This card is pretty damn heavy and about a foot long. Will do some minor overclocks and some benches

First bench all stock at 1440p
Let me know if I'm doing anything wrong










Spoiler: Warning: Spoiler!





----------



## heroxoot

My OC is making my 290X worse! I guess it's throttling for some reason. Can more voltage help? It doesn't seem to have helped at all.


----------



## Roboyto

Ooooohhhhhhhhhh yeahhhhhhh!



The wife is really going to be stylin' now...totally unnecessary but I was going to order a 280X for $309 and checked Open Box first...glad I did

Got a Kraken G10 on the way too







New Batman is going to look nice on my TV

Can't wait to stuff this thing into the 250D


----------



## DeadlyDNA

This thread goes so fast, even with my overclocked PC I cannot keep up. I know I am missing a lot of good information on here. If only I could be like "Number Five" and speed through it while understanding it


----------



## UZ7

Quote:


> Originally Posted by *BroJin*
> 
> Add me to the club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sapphire 290 Vapor X
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please no comments on the dust in my system. I am a smoker
> 
> 
> 
> 
> 
> 
> 
> Will spend time this weekend to clean
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This card is pretty damn heavy and about a foot long. Will do some minor overclocks and some benches
> 
> First bench all stock at 1440p
> Let me know if I'm doing anything wrong
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 


You must overclock and show temps!









lol


----------



## Roboyto

Quote:


> Originally Posted by *DeadlyDNA*
> 
> This thread goes so fast, even with my overclocked pc i cannot keep up. I know i am missing alot of good information on here. If i could just be like "Number Five" and speed through it while understanding it


I concur, it is quite ridiculous sometimes. I'll log on after work to see what happened through the day and I'm 5 pages behind...like







that's fast


----------



## Satchmo0016

Hey man, I just got my 290 Vapor-X in today as well (scored right at 1300 at 1080p ultra/tess extreme), but I am noticing some pretty bad coil whine and buzzing when it's under a moderate load. Are you getting any sounds out of yours?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Roboyto*
> 
> I concur, it is quite ridiculous sometimes. I'll log on after work to see what happened through the day and I'm 5 pages behind...like
> 
> 
> 
> 
> 
> 
> 
> that's fast


Eh, I've got like 40 posts to a page set, so it makes it a bit easier to keep track of it all.

On another note, I thought I'd run the Star Swarm bench to see if the 14.4 drivers improved it.

I was pleasantly surprised:


Spoiler: DX Crossfire



== Hardware Configuration =================================
GPU: AMD Radeon R9 200 Series
CPU: AuthenticAMD
AMD FX(tm)-8350 Eight-Core Processor
Physical Cores: 4
Logical Cores: 8
Physical Memory: 17138200576
Allocatable Memory: 8796092891136
===========================================================

== Configuration ==========================================
API: DirectX
Scenario: ScenarioRTS.csv
User Input: Disabled
Resolution: 1920x1080
Fullscreen: True
GameCore Update: 16.6 ms
Bloom Quality: High
PointLight Quality: High
ToneCurve Quality: High
Glare Overdraw: 16
Shading Samples: 64
Shade Quality: Mid
Deferred Contexts: Disabled
Temporal AA Duration: 16
Temporal AA Time Slice: 2
Detailed Frame Info: Off
===========================================================

== Results ================================================
Test Duration: 360 Seconds
Total Frames: 1718

*Average FPS: 4.77*
Average Unit Count: 3400
Maximum Unit Count: 5655
Average Batches/MS: 441.80
Maximum Batches/MS: 574.13
Average Batch Count: 91129
Maximum Batch Count: 167027





Spoiler: Mantle Single Card



== Hardware Configuration =================================
GPU: AMD Radeon R9 200 Series
CPU: AuthenticAMD
AMD FX(tm)-8350 Eight-Core Processor
Physical Cores: 4
Logical Cores: 8
Physical Memory: 17138200576
Allocatable Memory: 8796092891136
===========================================================

== Configuration ==========================================
API: Mantle
Scenario: ScenarioRTS.csv
User Input: Disabled
Resolution: 1920x1080
Fullscreen: True
GameCore Update: 16.6 ms
Bloom Quality: High
PointLight Quality: High
ToneCurve Quality: High
Glare Overdraw: 16
Shading Samples: 64
Shade Quality: Mid
Deferred Contexts: Disabled
Temporal AA Duration: 16
Temporal AA Time Slice: 2
Detailed Frame Info: Off
===========================================================

== Results ================================================
Test Duration: 360 Seconds
Total Frames: 10978

*Average FPS: 30.49*
Average Unit Count: 4037
Maximum Unit Count: 5295
Average Batches/MS: 2504.42
Maximum Batches/MS: 3511.80
Average Batch Count: 83350
Maximum Batch Count: 177837



I wish Crossfire would work with Mantle; under Mantle I was GPU-bound the entire time, while only hitting 30% usage at best with DX.
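The averages in those logs are just total frames over test duration, which makes the Mantle gain easy to sanity-check against the reported numbers:

```python
def avg_fps(total_frames, duration_s):
    """Average FPS the Star Swarm log reports: frames / seconds, 2 dp."""
    return round(total_frames / duration_s, 2)

dx_fps = avg_fps(1718, 360)       # 4.77, matching the DX Crossfire log
mantle_fps = avg_fps(10978, 360)  # 30.49, matching the Mantle single-card log
print(dx_fps, mantle_fps, round(mantle_fps / dx_fps, 1))  # 4.77 30.49 6.4
```

So Mantle on a single card came out roughly 6.4x faster than DX Crossfire in this draw-call-heavy scenario.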


----------



## darkelixa

Now that I have my new Gigabyte R9 290, would it be wiser to use the old reference Sapphire R9 290 to upgrade the missus's 7870, or just be greedy and use it for a Crossfire machine on my main?


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Now that I have my new gigabyte r9 290, would it be more wise to use the old referance sapphire r9 290 and upgrade the misses 7870 or just be greedy and use it as a crossfire machine on my main


Still running that 8350 at 4.0?


----------



## heroxoot

So I just cannot understand what to do. My 290X performs worse with an Overclock. How can I solve this?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *darkelixa*
> 
> Now that I have my new gigabyte r9 290, would it be more wise to use the old referance sapphire r9 290 and upgrade the misses 7870 or just be greedy and use it as a crossfire machine on my main


Go greedy and run the 290s in CF, and give her the 7870...... she'll get over it LooooL


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *lawson67*
> 
> Yeah but it also says if you are having no problems don't flash it as it could brick your card etc!...I seem to have no problems at all and i am overclocked at 1100mhz though CCC and my memory is clocked at 1350mhz the same as my powercolor card with no added voltage just +20 on power limit settings...Also I was rather hoping someone would reply who had tried it!


The only BIOS I'd be looking at flashing is the unlocked Asus PT1T BIOS....... and if it ain't borked, don't fix it


----------



## darkelixa

I should really change my signature; I gave up on my AMD and changed over to Intel. Now running an i7 4770K at stock speeds, and it's a LOT faster than my AMD 8350


----------



## heroxoot

Quote:


> Originally Posted by *lawson67*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Considering it says it improves stability, probably.
> 
> 
> 
> Yeah but it also says if you are having no problems don't flash it as it could brick your card etc!...I seem to have no problems at all and i am overclocked at 1100mhz though CCC and my memory is clocked at 1350mhz the same as my powercolor card with no added voltage just +20 on power limit settings...Also I was rather hoping someone would reply who had tried it!

Not a problem! Just put it on the second BIOS, boot back into DOS, flip the switch, and flash it again. Also, if you have another GPU, you can tell ATIFLASH which PCIe slot to look at for flashing. You're 100% safe.


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> I should really change my signature, I gave up on my amd and changed over to intel, now running an i7 4770k at stock speeds and its ALOT faster than my amd 8350


Use the Rigbuilder then add it to your sig









and of course the 4770k is faster, it's nearly double the price here









So yes, keep the 290 and then when you upgrade donate it to the missus.


----------



## darkelixa

FFXIV stuttered big time for me with my Gigabyte GTX 770 2GB. On my AMD R9 290 there are no issues really, except for the main start video; that always had issues, but every other cutscene worked fine. The game has quite a bit of stuttering even in-game. Try out Elder Scrolls Online; that game runs perfectly, 100+ fps on my R9 290


----------



## Roboyto

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I had a short stint with FFXIV on Nvidia 680gtx, and R9 290s. I recall flickering on my nvidia setup. it was something to do with lighting for me. I did not see it on my AMD cards and they ran it was smoother for me. I also had problems with brightness going up and down with nvidia cards in that game. it was strange. I am about to do some MMO benching with 4k eyefinity assuming it works i will give a go on FFXIV soon, maybe we can check it out


It does seem to be smoother on AMD than Nvidia. I was previously running it on a GTX 670, and it worked that card to the bone and ultimately blew it up, despite being under a full-cover block at stock clocks/voltages; it did have a very squeaky coil whine and wouldn't overclock *at all*.

It is very odd, since the card will go 6+ hours without flinching, except for cutscenes; it doesn't matter whether it's overclocked or not. The whole time the 290 is running between 95-100% pushing 5760x1080. All settings are maxed except for AA, which I have turned down to 4x (I think), and it never dips under 40 fps. Thoroughly pleased with my 290









I was browsing your 4K thread earlier, very nice work! Look forward to seeing what you come up with.

Quote:


> Originally Posted by *igrease*
> 
> Are you on the official 14.4 beta drivers?


Nope, 13.12 on Windows 7

Haven't tried a beta since 14.2. Just been keeping an eye on this thread for the issues people are running into.

Quote:


> Originally Posted by *darkelixa*
> 
> FFXIV stuttered for me with my gigabyte 770gtx 2gb big time on my amd r9 290 no issues really except for the main start video, that always had issues, every other cut scene worked fine. The game has quite a bit of stuttering even in game, try out elder scrolls online, that game runs perfectly 100+ fps on my r9 290


I don't have any stuttering anywhere, the game is glorious...just weird flicker/tears in cut scenes

Played ESO in beta and I'm too much of a FF fan, and have quite a few friends who are also playing, to jump (air)ship


----------



## darkelixa

I don't use beta drivers; I've never had luck with them, and I don't like being a beta guinea pig because they can't release a stable 14.x version. Still using 13.12


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> I don't use beta drivers, never had luck with them and i dont like to be a beta pig becuase they cant release a stable 14 version. Still using 13.12


How do you know they aren't stable if you've never used them?


----------



## Arizonian

Quote:


> Originally Posted by *shwarz*
> 
> Sign me up please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/d8nsd/
> 
> XFX R9 290
> 
> stock cooling for now


Congrats - added









Quote:


> Originally Posted by *BroJin*
> 
> Add me to the club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sapphire 290 Vapor X
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please no comments on the dust in my system. I am a smoker
> 
> 
> 
> 
> 
> 
> 
> Will spend time this weekend to clean
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This card is pretty damn heavy and about a foot long. Will do some minor overclocks and some benches
> 
> First bench all stock at 1440p
> Let me know if I'm doing anything wrong
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> [IMG


Congrats - added







First Vapor-X. Give us the lowdown on temps.


----------



## darkelixa

All the complaints on forums lol


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> All the complaints on forums lol


Well, I can tell you that 14.4 is working fine for me. The only problem I have with it is that I can't use Trixx, because the clocks won't stay consistent; but it doesn't bother me, because I use AB on start-up anyway.

Also, complaints are louder than compliments. If something is working fine, most people just leave it at that; if not, they take to the net, either looking for answers or shouting out complaints.


----------



## Forceman

Quote:


> Originally Posted by *heroxoot*
> 
> So I just cannot understand what to do. My 290X performs worse with an Overclock. How can I solve this?


Probably power throttling. Which drivers are you using? The 14.x releases (except maybe 14.4) have a broken power limit that can't be raised, which will cause the card to throttle. There is also an issue where the power limit is sometimes not set correctly by Afterburner; check whether CCC shows the increased power limit as well. Open the Afterburner monitoring graph while you test and see if the core clock speed is dropping under heavy load.
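Eyeballing the Afterburner graph works, but the same check can be run over an exported clock log. A small sketch of the idea (the sample numbers are invented and `throttle_ratio` is just a helper name here, not any Afterburner API):

```python
def throttle_ratio(clock_samples_mhz, target_mhz, tolerance=0.98):
    """Fraction of samples where the core clock fell below tolerance * target.
    A steady card sits at its set clock; power throttling shows up as
    repeated dips well under it during load."""
    floor = target_mhz * tolerance
    below = sum(1 for c in clock_samples_mhz if c < floor)
    return below / len(clock_samples_mhz)

steady = [1000, 1000, 999, 1000]          # holding the set 1000 MHz
throttling = [1000, 840, 870, 1000, 820]  # dipping under load

print(throttle_ratio(steady, 1000))      # 0.0
print(throttle_ratio(throttling, 1000))  # 0.6
```

Anything much above zero on a sustained benchmark run is the throttling Forceman describes; the fix is then the power limit, not more core voltage.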


----------



## Sgt Bilko

Quote:


> Originally Posted by *Forceman*
> 
> Probably power throttling. Which drivers are you using? The 14.x (except maybe the 14,4s) have a broken power limit and it can't be raised, which will cause the card to throttle. There is also an issue sometimes where the power limit is not correctly set by Afterburner - check and see if CCC shows the increased power limit as well. Open the Afterburner monitoring graph while you test and see if the core clock speed is dropping under heavy load.


Power limit is fine for me in 14.4 with AB, but it doesn't work in Trixx unless I mess around with CCC first.

I also installed 14.4 over 13.12, if that's any help.


----------



## igrease

Okay so I finally got my card stable hopefully but I am worried about the temps now.

*Core* - 1100
*Memory* -1250
*CoreV* - +44
*GPU Temp* - 82c max
*VRM 1* - 85c max
*VRM 2* - 69c max
*VDDCI* - 1.211 max
*Fan* - 90%

This is the MSI r9 290.


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *darkelixa*
> 
> All the complaints on forums lol
> 
> 
> 
> Well i can tell you that 14.4 is working fine for me, only problem i have with it is i can't use Trixx because the clocks won't stay consistent but it doesn't bother me because i use AB on start-up anyways.
> 
> Also, Complaints are louder than Compliments, If something is working fine then most people just leave it at that, if not they take to the net either looking for answers or shouting out complaints.

The only thing I want to complain about is that the 14.2 beta driver works flawlessly for me.








Quote:


> Originally Posted by *Sgt Bilko*
> 
> Power limit is fine for me in 14.4 with AB but doesn't work in Trixx unless i mess around with CCC first.
> 
> I've also installed 14.4 over 13.12 if thats any help.


The power limit seems to be working well for me on the 14.2 beta driver. Even overclocked with +200mV, no throttling at all.







I remember seeing very minor throttling when running only one GPU but no throttling at all in Crossfire. I'll check this again later.


----------



## cephelix

Quote:


> Originally Posted by *igrease*
> 
> Okay so I finally got my card stable hopefully but I am worried about the temps now.
> 
> *Core* - 1100
> *Memory* -1250
> *CoreV* - +44
> *GPU Temp* - 82c max
> *VRM 1* - 85c max
> *VRM 2* - 69c max
> *VDDCI* - 1.211 max
> *Fan* - 90%
> 
> This is the MSI r9 290.


Temps seem fine to me, especially the VRMs, which have a max operating temperature of 120°C, I think... the core, though, I'd personally run a bit cooler, but that's just me.
1100 at +44 vcore, eh? That is good. If you could lower your temps, I'm sure you could push more


----------



## igrease

Quote:


> Originally Posted by *cephelix*
> 
> temps seem fine for me, especially the VRMs that has a max operating temperature of 120 deg C i think...the core though, personally i'd run it a bit cooler but that's just me.
> 1100 at +44 vcore eh?that is good. if you could lower your temps, i'm sure you could push more


The only way I could probably reduce the temps any further would be to just blast the fans at 100%. The nice thing about these fans is that I don't hear them at all, even at 90%. If I reduce the voltage my GPU usage goes crazy, and if I reduce it even further I get artifacts.


----------



## cephelix

Quote:


> Originally Posted by *igrease*
> 
> The only way I could probably reduce the temps any further would be to just blast the fans at 100%. The nice thing is about these fans is that I don't hear them at all even at 90%. If I reduce the voltage, my gpu usage goes crazy and if I reduce it even farther I get artifacts.


Well, of course, by reducing voltage you'd need to reduce your clock speeds as well. If you're fine with running at a slightly reduced clock speed along with lower voltage, as long as temps are kept in check, then why not?

Alternatively, it's as you mentioned: run the fans on full blast with an aggressive fan curve.


----------



## heroxoot

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> So I just cannot understand what to do. My 290X performs worse with an Overclock. How can I solve this?
> 
> 
> 
> Probably power throttling. Which drivers are you using? The 14.x (except maybe the 14,4s) have a broken power limit and it can't be raised, which will cause the card to throttle. There is also an issue sometimes where the power limit is not correctly set by Afterburner - check and see if CCC shows the increased power limit as well. Open the Afterburner monitoring graph while you test and see if the core clock speed is dropping under heavy load.

I was on 13.12, and I got better FPS when I put it back to stock. Now I'm using 14.4 and I'm on stock clocks, but I'm going to attempt 14.2 and see where it goes.


----------



## darkelixa

Would it be worth upgrading the driver if i am getting good performance now?


----------



## Sgt Bilko

Quote:


> Originally Posted by *darkelixa*
> 
> Would it be worth upgrading the driver if i am getting good performance now?


You don't have to. I'm just willing to take a small amount of time to install it and then run a few things to see how it works for me; it only takes a few minutes to run DDU and re-install a different driver.


----------



## cephelix

Quote:


> Originally Posted by *darkelixa*
> 
> Would it be worth upgrading the driver if i am getting good performance now?


As what @Sgt Bilko said. Personal preference really


----------



## VulgarDisplay

Quote:


> Originally Posted by *heroxoot*
> 
> So I just cannot understand what to do. My 290X performs worse with an Overclock. How can I solve this?


What kind of overclock are we talking about? If you go too far on your RAM OC it can start correcting errors, which will kill performance. You won't always see an artifact or other obvious problem. Just slowly increase it and run a benchmark every time; as long as the score keeps going up, you're fine. As soon as your score drops, it's time to roll back.
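That procedure is essentially a hill-climb: step the memory clock up, benchmark, and stop one step back from the first score drop (where error correction starts eating the gains). A sketch of it, with a made-up `mock_bench` standing in for a real benchmark run (all the MHz figures are illustrative, not recommendations):

```python
def find_best_mem_clock(bench, start_mhz=1250, step_mhz=25, max_mhz=1600):
    """Raise the memory clock in small steps; when the benchmark score
    stops improving, roll back to the last good clock and return it."""
    best_clock = start_mhz
    best_score = bench(start_mhz)
    clock = start_mhz + step_mhz
    while clock <= max_mhz:
        score = bench(clock)
        if score <= best_score:   # score fell: error correction is kicking in
            break                 # keep the previous (last improving) clock
        best_clock, best_score = clock, score
        clock += step_mhz
    return best_clock

# Mock benchmark: score rises with clock until 1450 MHz, then drops off.
def mock_bench(mhz):
    return mhz if mhz <= 1450 else 2 * 1450 - mhz

print(find_best_mem_clock(mock_bench))  # 1450
```

In practice each `bench` call is a full benchmark run, so a 25 MHz step is a reasonable trade-off between resolution and time spent testing.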


----------



## HOMECINEMA-PC

Stripped the air cooler off; gotta clean it up with some alcohol, then fit the block












Found a De lorean


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Stripped the air cooler off gotta clean it up with some alcohol then fit block
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Oh no...... they used the thermal junk that came with the cooler. Horrible stuff to clean off, isn't it? It's just like roofing sealant lol
Quote:


> Found a De lorean


Dat ass.......


----------



## Forceman

Quote:


> Originally Posted by *kizwan*
> 
> Power limit seems working well for me in 14.2 beta driver. Even overclocked with +200mV, no throttling at all.
> 
> 
> 
> 
> 
> 
> 
> I remember seeing very minor throttling when running only one GPU but no throttling at all in Crossfire. I'll check this again later.


I think water cooling greatly improves the efficiency of the cards and reduces the total power draw. Even at 100% power limit, I didn't throttle once I put the card under water.
Quote:


> Originally Posted by *heroxoot*
> 
> I was on 13.12 and I got better FPS when I put it on stock. Now I'm using 14.4 and I'm on stock clocks but I'm going to attempt 14.2 and see where it goes.


Check the overdrive page in CCC and make sure it is showing the increased power limit. I ran into that a few times where it changed in AB but CCC didn't show it, and I got throttling. Otherwise it may be the VRAM issue VulgarDisplay brought up.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Oh no......they used the Thermal junk that came with the cooler, horrible stuff to clean off isn't it? It's just like roofing sealant lol
> Dat ass.......




Yep it's a bit like silicone, most of it stayed on da h'sinks








Plus I'm lucky to get my bits today. Missed the delivery by 40 mins, went up to Auspost with the card and they said no package till Monday.......... Then as I was leaving dejected I saw and bailed up the delivery driver, and lucky me, I gots it















Unlocked XFX 290

Block










Time to bench


----------



## invincible20xx

Quote:


> Originally Posted by *heroxoot*
> 
> My OC is making my 290X worse! I guess its throttling for some reason. Can more voltage help? It doesn't seem to have helped at all.


maybe your vrms are getting way too hot ?!


----------



## invincible20xx

Quote:


> Originally Posted by *Roboyto*
> 
> I concur, it is quite ridiculous sometimes. I'll log on after work to see what happened through the day and I'm 5 pages behind...like
> 
> 
> 
> 
> 
> 
> 
> that's fast


lol same, I slept 7 hours and woke up 5 pages behind lol


----------



## invincible20xx

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> Yep it's a bit like silicone, most of it stayed on da h'sinks
> 
> 
> 
> 
> 
> 
> 
> 
> Plus I'm lucky to get my bits today. Missed the delivery by 40 mins, went up to Auspost with the card and they said no package till Monday.......... Then as I was leaving dejected I saw and bailed up the delivery driver, and lucky me, I gots it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Unlocked XFX 290
> 
> Block
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Time to bench




don't forget to add thermal pads to that area or you might short something and burn the card!


----------



## Norse

Just wondering: what would happen if I put a factory OC'ed card and a reference card at stock into crossfire?


----------



## invincible20xx

Quote:


> Originally Posted by *Norse*
> 
> Just wondering what would happen if i put a factory OC'ed card and a Ref card at stock into crossfire?


I think both will work @ the lower-clocked card's frequency


----------



## Norse

Quote:


> Originally Posted by *invincible20xx*
> 
> I think both will work @ the lower-clocked card's frequency


Ah Gotcha


----------



## Tmfs

Purchased a reference XFX 290 off ebay that ended up being a miner special black screen edition. Big surprise, huh? Well, thankfully the guys over at XFX set up an RMA with zero hassle, in which I received a new sealed Double D 290 in return.










VRM1 gets a little toasty and it has Elpida mem but I honestly can't even begin to complain given the circumstances.

I know this has been said before but prospective buyers beware of the flood of miner stock being offloaded currently. I was *extremely* lucky that XFX customer service was amazing and took care of me.

1100/1400 stock volts


----------



## Devotii

Quote:


> Originally Posted by *Tmfs*
> 
> Purchased a reference XFX 290 off ebay that ended up being a miner special black screen edition. Big surprise, huh? Well, thankfully the guys over at XFX set up an RMA with zero hassle, in which I received a new sealed Double D 290 in return.


I agree, it's annoying though as I want to sell my non-mined 7950







You were lucky for RMA!


----------



## invincible20xx

Quote:


> Originally Posted by *Tmfs*
> 
> Purchased a reference XFX 290 off ebay that ended up being a miner special black screen edition. Big surprise, huh? Well, thankfully the guys over at XFX set up an RMA with zero hassle, in which I received a new sealed Double D 290 in return.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> VRM1 gets a little toasty and it has Elpida mem but I honestly can't even begin to complain given the circumstances.
> 
> I know this has been said before but prospective buyers beware of the flood of miner stock being offloaded currently. I was *extremely* lucky that XFX customer service was amazing and took care of me.
> 
> 1100/1400 stock volts


What is a miner black screen? Maybe all it needed was a bios flash back to original or something lol


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *invincible20xx*
> 
> 
> 
> don't forget to add thermal pads to that area or you might short something and burn the card!


I did, it's not my first w/blocked 290 matey


----------



## invincible20xx

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I did, it's not my first w/blocked 290 matey


haha just wanted to make sure your day is not ruined


----------



## Tmfs

Quote:


> Originally Posted by *invincible20xx*
> 
> What is a miner black screen? Maybe all it needed was a bios flash back to original or something lol


I was just making a joke that it was a miner-owned card that also had the black screen issue. I flashed it back to its stock bios along with a few others, with the same results. The card 100% had the black screen bug, which the miner probably never even noticed if he flashed it with the mining bios and slapped it in his farm.


----------



## Norse

Quote:


> Originally Posted by *Tmfs*
> 
> I was just making a joke that it was a miner-owned card that also had the black screen issue. I flashed it back to its stock bios along with a few others, with the same results. The card 100% had the black screen bug, which the miner probably never even noticed if he flashed it with the mining bios and slapped it in his farm.


What's the difference between the miner bios and normal?


----------



## Mega Man

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Stripped the air cooler off gotta clean it up with some alcohol then fit block
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Found a De lorean


If that is your car I want you to know I hate you. I love that car!
Quote:


> Originally Posted by *invincible20xx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Norse*
> 
> Just wondering what would happen if i put a factory OC'ed card and a Ref card at stock into crossfire?
> 
> 
> 
> I think both will work @ the lower-clocked card's frequency

Last I knew, crossfire now lets both cards (all of them) operate at different frequencies.
Quote:


> Originally Posted by *Norse*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Tmfs*
> 
> I was just making a joke that it was a miner-owned card that also had the black screen issue. I flashed it back to its stock bios along with a few others, with the same results. The card 100% had the black screen bug, which the miner probably never even noticed if he flashed it with the mining bios and slapped it in his farm.
> 
> 
> 
> What's the difference between the miner bios and normal?

They normally undervolt the card at the bios level to save energy/heat

In other news I have a second 290X otw and will have a third and fourth on my doorstep (probably, almost ready to pull the trigger) before I get back to the US. Cannot wait for quad-fire.

The only thing is I have the black blocks from EK and the other 2 are the clear (acetal vs acrylic), but I think it would look good if cards 2 and 3 glowed. Idk, I may just buy it and re-block if I don't like it.

Ironically all before I even install my first 290X. Even though I am excited to do it, I don't want to; I am having a blast in China with my family.


----------



## Tmfs

Quote:


> Originally Posted by *Norse*
> 
> What's the difference between the miner bios and normal?


RAM timings and some shaders disabled, I believe. Not a miner so not 100% sure, but that's what I recall reading at some point.


----------



## Norse

Any way to tell if it has the miner bios? One of the cards I got off ebay might have been a miner card, but I'm not sure; everything seems to run fine and GPU-Z picks up 2816 shaders.


----------



## SupahSpankeh

Hi.

Am experiencing some odd behaviour. It seems I cannot wake from sleep perhaps half the time if I allow the display to turn off. I'm using the 14.2 AMD drivers - any ideas?

Was thinking about moving back to the 13.9 drivers (I've got an R9 290, not 290X) but I was reluctant to do that for fear it would cost me performance. No stability issues in game, only a problem when the display shuts off.

Ta,


----------



## Mega Man

I have that on both my amd and Intel. I just shut off the monitor by hand


----------



## invincible20xx

Quote:


> Originally Posted by *SupahSpankeh*
> 
> Hi.
> 
> Am experiencing some odd behaviour. It seems I cannot wake from sleep perhaps half the time if I allow the display to turn off. I'm using the 14.2 AMD drivers - any ideas?
> 
> Was thinking about moving back to the 13.9 drivers (I've got an R9 290, not 290X) but I was reluctant to do that for fear it would cost me performance. No stability issues in game, only a problem when the display shuts off.
> 
> Ta,


Same problem with both the 7900 series and the R9 series; they need to fix this in the WHQL driver.


----------



## Paul17041993

Quote:


> Originally Posted by *Norse*
> 
> Just wondering what would happen if i put a factory OC'ed card and a Ref card at stock into crossfire?


You'll get a couple more frames compared to crossfire with both at ref. clocks, assuming they're not throttling at all.
Quote:


> Originally Posted by *invincible20xx*
> 
> I think both will work @ the lower-clocked card's frequency


not in crossfire they don't.

Quote:


> Originally Posted by *SupahSpankeh*
> 
> Hi.
> 
> Am experiencing some odd behaviour. It seems I cannot wake from sleep perhaps half the time if I allow the display to turn off. I'm using the 14.2 AMD drivers - any ideas?
> 
> Was thinking about moving back to the 13.9 drivers (I've got an R9 290, not 290X) but I was reluctant to do that for fear it would cost me performance. No stability issues in game, only a problem when the display shuts off.
> 
> Ta,


You're on Windows 7, that's why you're getting that; try disabling ULPS and stop the monitor from going to sleep.
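For reference, ULPS is controlled by the `EnableUlps` value under the display-adapter class key, and it's usually toggled with a .reg fragment along these lines. The numbered subkey (`0000` here) varies per system and per adapter, so check which subkeys actually belong to your Radeon(s) in regedit, and back up the key before importing:

```
Windows Registry Editor Version 5.00

; Disable ULPS (Ultra Low Power State) for the adapter under subkey 0000.
; Repeat for each numbered subkey that corresponds to a Radeon GPU.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Tools like MSI Afterburner expose the same toggle in their settings, which is the safer route if you'd rather not touch the registry by hand.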


----------



## Norse

Quote:


> Originally Posted by *Paul17041993*
> 
> You'll get a couple more frames compared to crossfire with both at ref. clocks, assuming they're not throttling at all.
> not in crossfire they don't.
> You're on Windows 7, that's why you're getting that; try disabling ULPS and stop the monitor from going to sleep.


Ah

Can I enable/disable crossfire without restarting, just in case a game doesn't support it? If so I'm assuming the "better" card should be in the first PCI-E slot so it is used as master.


----------



## Paul17041993

Quote:


> Originally Posted by *Norse*
> 
> Ah
> 
> Can i enable/disable crossfire without restarting just incase a game doesnt support it? if so im assuming the "better" card should be in the first PCi-E slot so it is used as Master


I think crossfire doesn't need reboots to enable/disable; I mean, updating drivers doesn't even need a reboot most of the time...

And yeah, usually you put the best card in the top (primary) slot, with the exception of some mobos where the primary slot is on the bottom (yeah, some mobos are like that for whatever reason). In most cases where crossfire doesn't work, the best card is usually chosen to do the job.

I don't have any real first-hand experience with crossfire though, so I'm not 100% sure about the application profiles etc.


----------



## BroJin

Quote:


> Originally Posted by *UZ7*
> 
> You must overclock and show temps!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> lol


at work








Quote:


> Originally Posted by *Satchmo0016*
> 
> Hey man, I just got my 290 Vapor-X in today as well (scored right at 1300 at 1080p ultra/tess extreme) but I am noticing some pretty bad coil whine and buzzing when its under a moderate load. Are you getting any sounds out of yours?










No coil whine on mine


----------



## BradleyW

Quote:


> Originally Posted by *Satchmo0016*
> 
> Hey man, I just got my 290 Vapor-X in today as well (scored right at 1300 at 1080p ultra/tess extreme) but I am noticing some pretty bad coil whine and buzzing when its under a moderate load. Are you getting any sounds out of yours?


Coil whine and buzzing usually die down after a few months. My GPUs were extremely loud; the noise pulsated through the whole room. Now I can hardly hear them.


----------



## Dragonsyph

Quote:


> Originally Posted by *Paul17041993*
> 
> I think crossfire doesn't need reboots to enable/disable, I mean updating drivers doesn't even need a reboot most of the time...
> 
> and yea, usually you put the best card in the top (primary) slot, with the exception of some mobos that the primary slot is on the bottom (yea some mobos are like that for whatever reason), in most cases that crossfire doesn't work, the best card is usually chosen to do the job.
> 
> I don't have any real first-hand experience with crossfire though so I'm not 100% sure about the application profiles etc.


Set CCC to turn it off automatically when a program doesn't support it.


----------



## SupahSpankeh

Quote:


> Originally Posted by *Mega Man*
> 
> I have that on both my amd and Intel. I just shut off the monitor by hand


I've been doing this too, until very recently, when it started reaching the black-screen state even with the monitor turned off and sleep mode disabled. Any ideas there?
Quote:


> Originally Posted by *invincible20xx*
> 
> Same problem with both the 7900 series and the R9 series; they need to fix this in the WHQL driver.


Yup. /signed
Quote:


> Originally Posted by *Paul17041993*
> 
> You're on Windows 7, that's why you're getting that; try disabling ULPS and stop the monitor from going to sleep.


Mea culpa - it's W8.1; I haven't updated my sig rig OS in a while.

I think I've already tried disabling ULPS - has this helped anyone else?


----------



## invincible20xx

Quote:


> Originally Posted by *SupahSpankeh*
> 
> I've been doing this too, until very recently when it seems to reach the black screen state even when it's been turned off and the sleep mode has been disabled. Any ideas there?
> Yup. /signed
> Mea culpa - it's W8.1, haven't update sig rig OS in a while.
> 
> I think I've already tried disabling ULPS - has this helped anyone else?


Disabling ULPS doesn't help; I tried it on a multitude of GPUs including the R9 290 series and 7900 series, and sometimes the screen just won't wake up if I let it sleep. It's a bug in the beta drivers, I suppose, and it should be fixed in the WHQL driver ASAP!


----------



## BroJin

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> First Vapor X. Give us the low down on temps.


Hehehe good to be first at something right







Thank you!


----------



## Satchmo0016

Quote:


> Originally Posted by *BradleyW*
> 
> Coil whine and buzzing usually die down after a few months. My GPUs were extremely loud; the noise pulsated through the whole room. Now I can hardly hear them.


Yeah, hopefully. The only other weird thing I noticed is that the BIOS is kinda laggy now; is that normal? I'm on an old X58 board that doesn't support UEFI and have the 290 in legacy mode; however, switching modes didn't appear to make any difference at all.


----------



## invincible20xx

Overclocking the RAM over 1400 crashes my desktop as soon as I click apply?! I have 2 GPUs with Hynix memory; I thought 1500 was okay, no?!


----------



## Sgt Bilko

Quote:


> Originally Posted by *invincible20xx*
> 
> Overclocking the RAM over 1400 crashes my desktop as soon as I click apply?! I have 2 GPUs with Hynix memory; I thought 1500 was okay, no?!


Are you adding any voltage and are you increasing the core clock along with it?


----------



## invincible20xx

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Are you adding any voltage and are you increasing the core clock along with it?


Not doing anything other than putting 1500 RAM in Afterburner and clicking apply; then horizontal line artifacts or black screen the second I click apply...


----------



## Sgt Bilko

Quote:


> Originally Posted by *invincible20xx*
> 
> Not doing anything other than putting 1500 RAM in Afterburner and clicking apply; then horizontal line artifacts or black screen the second I click apply...


Try increasing the core clock by 50-100MHz and adding +50mV in AB as well.


----------



## Purostaff

I think there's something wrong with my card... I just finished OCing it and everything, but this voltage fluctuation worries me. (Everything runs great so far.)



Tried stock setting and OC setting
Disabled ULPS
Uncheck/Check Force constant voltage on Afterburner

Any inputs?


----------



## royalkilla408

Also getting that bug where my monitor won't turn back on after it shuts off at idle. Glad to hear I'm not the only one; it needs to be fixed ASAP. First time going with AMD since the 90s. What's the site to report bugs to AMD? Thanks.


----------



## Talon720

What kinda power is needed for tri-fire? Of course it depends on your equipment and whether you wanna OC. I have a Corsair HX1050, and ever since my second 290X (and watercooling) I get random shutdowns when I'm turning my fans up on my fan controller or loading a program that's more intensive. It could be a coincidence, or my [email protected] 1.334v OC is unstable and causing a hard shutdown. I've seen many different answers as far as PSU wattage goes, mostly from people who don't own these cards or know how much juice they can suck. I don't wanna get a new PSU (it's fairly new and on its second unit due to RMA) even though this PSU is constantly turning on @ idle and is loud. I saw a good deal on an EVGA G2 1300W, but first, what do you all think?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Purostaff*
> 
> I think there's something wrong with my card.. I just finished OC it and everything but this voltage fluctuation worries me. (Everything runs great so far)
> 
> 
> 
> Tried stock setting and OC setting
> Disabled ULPS
> Uncheck/Check Force constant voltage on Afterburner
> 
> Any inputs?


That's what it does ..... looks fine to me









Been doin a couple of 290X benchies and I discovered that my 290 scores are just too good to beat


----------



## Sgt Bilko

Quote:


> Originally Posted by *Talon720*
> 
> What kinda power is needed for tri-fire? Of course it depends on your equipment and whether you wanna OC. I have a Corsair HX1050, and ever since my second 290X (and watercooling) I get random shutdowns when I'm turning my fans up on my fan controller or loading a program that's more intensive. It could be a coincidence, or my [email protected] 1.334v OC is unstable and causing a hard shutdown. I've seen many different answers as far as PSU wattage goes, mostly from people who don't own these cards or know how much juice they can suck. I don't wanna get a new PSU (it's fairly new and on its second unit due to RMA) even though this PSU is constantly turning on @ idle and is loud. I saw a good deal on an EVGA G2 1300W, but first, what do you all think?


You want 1200W at least imo; I don't think a 1050W is gonna cut it for the amount of power you will draw.
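A rough back-of-the-envelope budget supports that. The per-component wattages below are ballpark assumptions, not measurements: ~300W per overvolted 290X (stock board power is 250W, and heavy overclocks can exceed 300W), plus rough figures for an overclocked CPU and the rest of the system:

```python
def psu_headroom(cards=3, watts_per_card=300, cpu=150, rest=100, margin=0.2):
    """Estimated system load plus a ~20% safety margin, so the PSU
    isn't running pegged at its limit (where ripple gets worse and
    OCP trips become likely)."""
    load = cards * watts_per_card + cpu + rest
    return load, round(load * (1 + margin))

load, recommended = psu_headroom()
print(load, recommended)  # 1150 1380
```

So a tri-fire OC rig plausibly sits around 1150W of load, which is why an HX1050 can hard-shutdown under transient spikes while a 1200-1300W unit has room to spare.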


----------



## Talon720

Quote:


> Originally Posted by *invincible20xx*
> 
> Not doing anything other than putting 1500 RAM in Afterburner and clicking apply; then horizontal line artifacts or black screen the second I click apply...


I've read about people having good luck adding aux voltage in AB to stabilize mem OCs. Could be your memory just doesn't wanna get pushed that hard. I'm not sure if you're on air or not, but water has made this card run amazing; I can add voltage without ever worrying about the card getting too hot.


----------



## Purostaff

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> That's what it does ..... looks fine to me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Been doin a couple of 290X benchies and I discovered that my 290 scores are just too good to beat


I figured it out.

It will fluctuate like that when there's a 3D application (i.e. a game) running in windowed mode. It's a straight line if the game is full screen.

false alarm:thumb:

EDIT: Nvm... I played around with it some more and the windowed vs full screen thing has no effect... The line is just straight and pretty now. lol


----------



## Tokkan

Quote:


> Originally Posted by *Paul17041993*
> 
> I think crossfire doesn't need reboots to enable/disable, I mean updating drivers doesn't even need a reboot most of the time...
> 
> and yea, usually you put the best card in the top (primary) slot, with the exception of some mobos that the primary slot is on the bottom (yea some mobos are like that for whatever reason), in most cases that crossfire doesn't work, the best card is usually chosen to do the job.
> 
> I don't have any real first-hand experience with crossfire though so I'm not 100% sure about the application profiles etc.


You don't have first-hand experience, huh?
Well, I'm reading through the pages, and with the previous crossfire setup I owned I could have my main card overclocked at 1GHz while the slave card was at 775MHz; what would crossfire do? Nothing at all.
The main card would wait for the slave card to finish processing; this translates to one card being pegged at 99% usage while the other sat at 87% usage in applications that supported crossfire.
If you know of a game that does not have crossfire support, you can create a custom profile just for that game disabling crossfire when it's being executed (you only need to do this in case the driver tries to use both cards and starts artifacting). You never really need to manually turn off crossfire for anything; think of it like Nvidia Optimus creating an app profile for the low-power GPU or the high-performance GPU.

On a side note...
Yesterday I did a test run on my card at some OCs and managed to get 1100MHz core and 1350MHz mem. Didn't touch power limit or volts; benchmarked Heaven with it without artifacting or crashes. Gotta see if I can push further and what the gains are.


----------



## Dragonsyph

Quote:


> Originally Posted by *Tokkan*
> 
> You don't have first-hand experience, huh?
> Well, I'm reading through the pages, and with the previous crossfire setup I owned I could have my main card overclocked at 1GHz while the slave card was at 775MHz; what would crossfire do? Nothing at all.
> The main card would wait for the slave card to finish processing; this translates to one card being pegged at 99% usage while the other sat at 87% usage in applications that supported crossfire.
> If you know of a game that does not have crossfire support, you can create a custom profile just for that game disabling crossfire when it's being executed (you only need to do this in case the driver tries to use both cards and starts artifacting). You never really need to manually turn off crossfire for anything; think of it like Nvidia Optimus creating an app profile for the low-power GPU or the high-performance GPU.
> 
> On a side note...
> Yesterday I did a test run on my card at some OCs and managed to get 1100MHz core and 1350MHz mem. Didn't touch power limit or volts; benchmarked Heaven with it without artifacting or crashes. Gotta see if I can push further and what the gains are.


Most of the time you can just pick some random game profile and use it on games that don't support crossfire or SLI.

Had to do this for DayZ Standalone


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You want 1200w at least imo, i don't think a 1050w is gonna cut it for the amount of power you will draw


Agreed


----------



## BradleyW

Quote:


> Originally Posted by *Dragonsyph*
> 
> Most of the time you can just pick some random game profile and use it on games that don't support crossfire or SLI.
> 
> Had to do this for DayZ Standalone


It's best to pick a profile of a game that uses the same engine as the game you're trying to force CFX onto.


----------



## ledzepp3

For anyone who wants to know more specific measurements, I've got a review up of the bracket









Thanks for the rep guys, didn't anticipate that actually









-Zepp


----------



## kizwan

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Power limit seems working well for me in 14.2 beta driver. Even overclocked with +200mV, no throttling at all.
> 
> 
> 
> 
> 
> 
> 
> I remember seeing very minor throttling when running only one GPU but no throttling at all in Crossfire. I'll check this again later.
> 
> 
> 
> I think water cooling greatly improves the efficiency of the cards and reduces the total power draw. Even at 100% I didn't throttle once I put the card under water.
Click to expand...

I was wrong. First GPU actually throttling in 14.2 beta.


I can confirm, no throttling in 14.4 beta driver.


----------



## chronicfx

Quote:


> Originally Posted by *Talon720*
> 
> What kinda power is needed for tri-fire? Of course it depends on your equipment and whether you wanna OC. I have a Corsair HX1050, and ever since my second 290X (and watercooling) I get random shutdowns when I'm turning my fans up on my fan controller or loading a program that's more intensive. It could be a coincidence, or my [email protected] 1.334v OC is unstable and causing a hard shutdown. I've seen many different answers as far as PSU wattage goes, mostly from people who don't own these cards or know how much juice they can suck. I don't wanna get a new PSU (it's fairly new and on its second unit due to RMA) even though this PSU is constantly turning on @ idle and is loud. I saw a good deal on an EVGA G2 1300W, but first, what do you all think?


My G2 has random restarts sometimes. I am not overclocking anything except my CPU, and I have had to install a supplementary 450W PSU for the bottom card. The restarts are rare but occur suddenly when I pop into my game; right at the point where the load hits the cards, the computer will suddenly restart. It happened three times last month, gaming every night. It is possible I just have an issue with my PSU or something, but with a 7-year warranty I will wait a bit and see if it continues while only powering two cards.


----------



## Mega Man

Quote:


> Originally Posted by *chronicfx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Talon720*
> 
> What kinda power is needed for tri-fire? Of course it depends on your equipment and whether you wanna OC. I have a Corsair HX1050, and ever since my second 290X (and watercooling) I get random shutdowns when I'm turning my fans up on my fan controller or loading a program that's more intensive. It could be a coincidence, or my [email protected] 1.334v OC is unstable and causing a hard shutdown. I've seen many different answers as far as PSU wattage goes, mostly from people who don't own these cards or know how much juice they can suck. I don't wanna get a new PSU (it's fairly new and on its second unit due to RMA) even though this PSU is constantly turning on @ idle and is loud. I saw a good deal on an EVGA G2 1300W, but first, what do you all think?
> 
> 
> 
> My G2 has random restarts sometimes. I am not overclocking anything except my CPU, and I have had to install a supplementary 450W PSU for the bottom card. The restarts are rare but occur suddenly when I pop into my game; right at the point where the load hits the cards, the computer will suddenly restart. It happened three times last month, gaming every night. It is possible I just have an issue with my PSU or something, but with a 7-year warranty I will wait a bit and see if it continues while only powering two cards.

Without the exact model number of your PSU, it sounds like you are tripping OCP. Are you sure you are keeping your rails separate (assuming you have a multi-rail PSU)? But it looks like you are using the G2, which is single rail... weird, maybe it thinks there is a power surge.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *ledzepp3*
> 
> For anyone who wants to know more specific measurements, I've got a review up of the bracket
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for the rep guys, didn't anticipate that actually
> 
> 
> 
> 
> 
> 
> 
> 
> 
> -Zepp


Think I can finagle that onto my 290 without removing it? I don't feel like draining my loop. I have a backplate btw.


----------



## SupahSpankeh

Seriously, does _anyone_ have any more suggestions for blackscreen if the display sleeps or the PC hibernates?

It's driving me nuts. More concerning than that, it just blackscreened while browsing - it's not OC'd, and has no issues under load in any games. I'm currently using the amd_catalyst_14.2_beta1.3 drivers. Issue occurs on the 14.4 RC drivers too, and amd_catalyst_13.11_betav9.2.

Can anyone help?


----------



## Dragonsyph

Quote:


> Originally Posted by *SupahSpankeh*
> 
> Seriously, does _anyone_ have any more suggestions for blackscreen if the display sleeps or the PC hibernates?
> 
> It's driving me nuts. More concerning than that, it just blackscreened while browsing - it's not OC'd, and has no issues under load in any games. I'm currently using the amd_catalyst_14.2_beta1.3 drivers. Issue occurs on the 14.4 RC drivers too, and amd_catalyst_13.11_betav9.2.
> 
> Can anyone help?


Just have your PC on the Performance power mode and don't turn on sleep. Maybe just dim or shut off the screen.

I had this problem a while back on my streaming PC; I just don't let my computer sleep. Can't remember what driver it was, sorry.


----------



## B NEGATIVE

One for the list.

Sapphire 290 with AquaC block.










Prismatic paint flips from black...



To rainbow.



Just add light.









The missing bolt was in the paint rack, it's back in place now.


----------



## kizwan

Quote:


> Originally Posted by *SupahSpankeh*
> 
> Seriously, does _anyone_ have any more suggestions for blackscreen if the display sleeps or the PC hibernates?
> 
> It's driving me nuts. More concerning than that, it just blackscreened while browsing - it's not OC'd, and has no issues under load in any games. I'm currently using the amd_catalyst_14.2_beta1.3 drivers. Issue occurs on the 14.4 RC drivers too, and amd_catalyst_13.11_betav9.2.
> 
> Can anyone help?


Can't help you because I don't have this issue. I always have my monitor auto turn off after 10 minutes idle & I don't have problem waking up the monitor. I've used 13.11 beta, 13.12 WHQL & 14.2 beta. Currently using 14.4 beta.


----------



## heroxoot

So after being told some things, I now know that the drivers are bugged on Hawaii cards for power limit? Welp, I tried a small OC anyway.

The card likes to start at +25mV in MSI AB so I'm assuming it's for a reason. This is +44mV, +50% power limit (if it's even doing anything), 1100/1250, on the 14.2 beta 1.3 driver.










The minimum is about 1.5fps higher, the max is about the same, and the average is almost 3fps higher with this OC. Not sure how much benefit I will see in a game.


----------



## Talon720

Quote:


> Originally Posted by *chronicfx*
> 
> My G2 has random restarts sometimes. I am not overclocking anything except my CPU, and I have had to install a supplementary 450W PSU for the bottom card. The restarts are rare but occur suddenly when I pop into my game; right at the point where the load hits the cards, the computer will suddenly restart. It happened three times last month, gaming every night. It is possible I just have an issue with my PSU or something, but with a 7-year warranty I will wait a bit and see if it continues while only powering two cards.


So, same problem with more power... I might back off my CPU overclock and see if it's just a rare crash occurrence. I know I had an FX-8350 that was stable in P95 for 12 hours but started crashing when BF4 came around, which was odd. Although the FX crashed differently; I've had my Haswell crash by the computer just shutting off.


----------



## Mega Man

Quote:


> Originally Posted by *Talon720*
> 
> Quote:
> 
> 
> 
> Originally Posted by *chronicfx*
> 
> My G2 has random restarts sometimes. I am not overclocking anything except my CPU, and I have had to install a supplementary 450W PSU for the bottom card. The restarts are rare but occur suddenly when I pop into my game; right at the point where the load hits the cards, the computer will suddenly restart. It happened three times last month, gaming every night. It is possible I just have an issue with my PSU or something, but with a 7-year warranty I will wait a bit and see if it continues while only powering two cards.
> 
> 
> 
> So same problem with more power.. Im might back off on my cpu overclock and see if it might be a rare crash occurrence. I know i had an fx8350 that was stable in p95 for 12hr but then when bf4 came around it started crashing, which was odd. Although the fx crashed differently ive had my haswell crash by the computer just shutting off.
Click to expand...

did you run it at 90% memory usage ? ( prime/ibt-avx )


----------



## B NEGATIVE

Quote:


> Originally Posted by *Talon720*
> 
> Quote:
> 
> 
> 
> Originally Posted by *chronicfx*
> 
> My g2 has random restarts sometimes. I am not overclocking except my cpu. I have had to install a supplementary 450w psu for the bottom card. The restarts are rare but occur suddenly when i pop into my game, much like at the point where the load is activated on the cards the computer will suddenly restart. Happened three times last month gaming everynight. It is possible I just have an issue with my psu or something but with a 7 year warranty i will wait a bit to see if it continues while only powering two cards.
> 
> 
> 
> So same problem with more power.. Im might back off on my cpu overclock and see if it might be a rare crash occurrence. I know i had an fx8350 that was stable in p95 for 12hr but then when bf4 came around it started crashing, which was odd. Although the fx crashed differently ive had my haswell crash by the computer just shutting off.
Click to expand...

Such is the wonder of overclocking: a 12 hr Prime run means it's stable for 12 hrs of Prime, that's all.

I had an Ivy i7 that wouldn't ever shut down normally; it always had to be powered off at the plug.


----------



## mojobear

Hey all... thought people might be interested in a review of the Raijintek Morpheus. Looks like it might take up 3-4 slots with 25mm fans installed. :O

http://www.tomshardware.de/raijintek-morpheus-vga-cooler-hawaii-review,testberichte-241525-7.html

On a side note, I remember there was a discussion on how much temps can change the power draw on the 290s. Take a look at the page above: about 11.5% less power draw just from cooling the card down! Really cool stuff. That's why I think my watercooled 3-way CrossFire 290 system only pulls about 850W from the wall at stock, and 1000W with a mild OC of 1175/1375 at +88mV.









Gooooooooo watercooling!

Oh ya...and this goes to show that watercooling actually cools your room better than air







Increased efficiency baby!


----------



## Mega Man

Holy poop, that is so not worth it. At that point, just buy a small loop...


----------



## chronicfx

Quote:


> Originally Posted by *B NEGATIVE*
> 
> Such is the wonder of overclocking,a 12 hr prime run means its stable for 12 hrs of prime,thats all.
> 
> I had a i7 Ivy that wouldnt ever shut down normally,always had to be powered off from the plug.


Did you quote me as having a questionable overclock? Because I have never had a WHEA error. I am rock solid.
With Mega it is most likely the OCP triggering.


----------



## Terrere

GPU-z Validation of my Gigabyte Windforce r9 290.
Please add me to the list


----------



## Dragonsyph

Quote:


> Originally Posted by *chronicfx*
> 
> Did you quote me as having a questionable overclock? Because I have not had a whea error ever. I am rock solid. I am
> With mega it is most likely the ocp triggering.


FYI, people don't use Prime any more. It won't crash your system if you're unstable; the new CPUs just throttle down instead of crashing.


----------



## ledzepp3

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Think I can finangle that onto my 290 without removing it? I don't feel like draining my loop. I have a backplate btw.


Dude, without a doubt! The screws holding the block onto the card don't have to be removed to install the bracket.

There are only three screws and a single washer you'd need for installation with a backplate. The support is more than worth the $8 price.










Spoiler: Warning: Spoiler!








-Zepp


----------



## chiknnwatrmln

Quote:


> Originally Posted by *ledzepp3*
> 
> Dude without a doubt! The screws which hold the block onto the card itself aren't removed in order to install the bracket.
> 
> There's the only three screws and a single washer you'd need for installation with a backplate. The support is more than worth the $8 price
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> -Zepp


Sweet, it looks like I should be able to put it on no problem.

My bracket just shipped from FCPU so it should be here Friday or Monday.


----------



## heroxoot

So what is the best driver to use on the 290X right now? 14.4 and 14.3 seem terrible. I'm told 14.2 has a power limit bug where increasing it does nothing. 13.12 has no Mantle but seems OK. 14.2 gives me the best performance with an OC and the power slider maxed, so maybe it's not broken for me?

Any opinions would be great.


----------



## Mega Man

Whatever works best for you.

No one here can tell you what works best with your hardware.


----------



## anubis1127

Anybody try Metro 2033 on these cards?


----------



## invincible20xx

Quote:


> Originally Posted by *heroxoot*
> 
> So what is the best driver right now to use on the 290X? 14.4 and 14.3 seem terrible. I'm told 14.2 has a power limit bug where increasing it does nothing. 13.12 has no mantle but it seems ok. 14.2 gives me the best performance with an OC having the power slider maxed, so maybe its not broken for me?
> 
> Any opinions would be great.


Well, maybe you will have to wait until they release a stable 14.4.


----------



## Roboyto

Quote:


> Originally Posted by *B NEGATIVE*
> 
> Such is the wonder of overclocking,a 12 hr prime run means its stable for 12 hrs of prime,thats all.
> 
> I had a i7 Ivy that wouldnt ever shut down normally,always had to be powered off from the plug.


Yup... my 4770K will run IBT at 4.7GHz at max load on 16GB of RAM for ten runs. It won't game at those clocks, though. This is also the first chip I've ever had that isn't stable in Prime95 at anything other than stock; quite strange.
Quote:


> Originally Posted by *invincible20xx*
> 
> overclocking ram over 1400 crashes my desktop as soon as clicking apply ?! i have 2 gpus with hynix memory thought 1500 was okay , no ?!


Hynix is no guarantee. My first 290 only went to 1080/1350 no matter what voltage I gave it.

Push the core first and leave the RAM at stock clocks; there's much more performance to be had from the core than from the RAM.


----------



## invincible20xx

Quote:


> Originally Posted by *Roboyto*
> 
> Yup..my 4770k will run IBT at 4.7GHz for max load on 16GB of RAM for ten runs. Won't game at those clocks though. This is also the first chip I've ever had that isn't stable in Prime95 at anything other than stock; quite strange.
> Hynix is no guarantee. My first 290 only went to 1080/1350 no matter what voltage I gave it.
> 
> Push core first and leave RAM stock clocks, much more performance to be had from it compared to the RAM.


What are the temps on your R9 290? And what fan speed do you top out at?


----------



## Mega Man

Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *B NEGATIVE*
> 
> Such is the wonder of overclocking,a 12 hr prime run means its stable for 12 hrs of prime,thats all.
> 
> I had a i7 Ivy that wouldnt ever shut down normally,always had to be powered off from the plug.
> 
> 
> 
> Yup..my 4770k will run IBT at 4.7GHz for max load on 16GB of RAM for ten runs. Won't game at those clocks though. This is also the first chip I've ever had that isn't stable in Prime95 at anything other than stock; quite strange.
> Quote:
> 
> 
> 
> Originally Posted by *invincible20xx*
> 
> overclocking ram over 1400 crashes my desktop as soon as clicking apply ?! i have 2 gpus with hynix memory thought 1500 was okay , no ?!
> 
> Click to expand...
> 
> Hynix is no guarantee. My first 290 only went to 1080/1350 no matter what voltage I gave it.
> 
> Push core first and leave RAM stock clocks, much more performance to be had from it compared to the RAM.
Click to expand...

Are you sure you are running IBT-AVX and not normal IBT? IBT-AVX is in my sig, IIRC.


----------



## heroxoot

Well, thanks I guess? 14.2 seems to be the best with or without my OC, but I was told the power limit does not actually do anything. Is this 100% of all cards? Or maybe only reference designs? Mine is a custom design, I think.


----------



## The Mac

AFAIK it affects all cards.


----------



## Harry604

What are you guys getting in BF4 with 2x 290Xs at 2560x1440?

I've searched, but I'm looking for info with the latest drivers and Mantle.

I can get two 290Xs for $600.

Would I get by with an AX750 Gold PSU?


----------



## The Mac

I wouldn't. 350W TDP x 2 = 700W; that only leaves 50W for the rest of your stuff.

Power supplies are most efficient at 80% load, so a 1000W unit would be preferable (leaving a 100W budget for the rest of your system).
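The sizing rule of thumb above can be sketched quickly; the 350W per-card figure, 100W system budget, and 80% target load are this post's estimates, not official specs:

```python
# Rough PSU sizing per the rule of thumb above (all figures are estimates).
def recommended_psu_watts(card_watts, num_cards, rest_of_system_watts, target_load=0.8):
    """Size the PSU so the estimated draw sits at `target_load` of its rating."""
    total_draw = card_watts * num_cards + rest_of_system_watts
    return total_draw / target_load

total = 350 * 2 + 100                       # two cards plus ~100 W for the rest
print(total)                                # 800
print(recommended_psu_watts(350, 2, 100))   # 1000.0
```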


----------



## B NEGATIVE

Quote:


> Originally Posted by *Dragonsyph*
> 
> Quote:
> 
> 
> 
> Originally Posted by *chronicfx*
> 
> Did you quote me as having a questionable overclock? Because I have not had a whea error ever. I am rock solid. I am
> With mega it is most likely the ocp triggering.
> 
> 
> 
> FYI people dont use prime any more. It wont crash your system if you unstable, the new CPUs just downthrottle instead of crash.
Click to expand...

Not true; Prime is still good for more than just CPU testing.
Quote:


> Originally Posted by *The Mac*
> 
> i wouldnt 350 tdp x 2 = 700. that only leaves 50 for the rest of your stuff.
> 
> powersupplies are most efficient at 80% load so a 1000watt would be preferable (giving a 100watt tdp for the rest of your system)


TDP has nothing to do with power draw; it's a value provided for cooling requirements.

You know... Thermal Design Power.


----------



## heroxoot

Quote:


> Originally Posted by *The Mac*
> 
> AFAIK it effects all cards.


Then why is my performance worse on a driver that is supposedly fixed? Somehow I don't think it's all cards. And if it is, then my OC doesn't hit the limit yet.


----------



## The Mac

maybe, are you adding volts?


----------



## heroxoot

Quote:


> Originally Posted by *The Mac*
> 
> maybe, are you adding volts?


+44mV. It might be too much; I have no idea. The GPU likes to set itself to +25 in MSI AB.


----------



## Mercy4You

Quote:


> Originally Posted by *B NEGATIVE*
> 
> Such is the wonder of overclocking,a 12 hr prime run means its stable for 12 hrs of prime,thats all.


12 hours stable in Prime only means it *was* stable in Prime for 12 hours...


----------



## Widde

I'm getting jelly of all the people that get good oc'ing cards







Anyone wanna switch?







(Joking)

These furnaces need +100mV for 1080/1350 to be stable







Ref cards that aint under water yet ^_^

Fan curve ^^ http://piclair.com/vytrz


----------



## Talon720

Quote:


> Originally Posted by *Mega Man*
> 
> did you run it at 90% memory usage ? ( prime/ibt-avx )


Quote:


> Originally Posted by *B NEGATIVE*
> 
> Such is the wonder of overclocking,a 12 hr prime run means its stable for 12 hrs of prime,thats all.
> 
> I had a i7 Ivy that wouldnt ever shut down normally,always had to be powered off from the plug.


I ran Blend at the time, but B NEGATIVE kind of summed up what I was trying to say. Also, in Eurogamer's review of the R9 295X2 compared to crossfired 290s, they claimed the crossfired 290s got up to 850 watts. That's crazy; if it's right, that means crossfired 290Xs are hitting 875-900W. I guess that could explain mine and chronicfx's shutdowns that appear PSU related. It still makes me question my HX1050's ability to push what I have at its peaks, let alone a third card.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *heroxoot*
> 
> So what is the best driver right now to use on the 290X? 14.4 and 14.3 seem terrible. I'm told 14.2 has a power limit bug where increasing it does nothing. 13.12 has no mantle but it seems ok. 14.2 gives me the best performance with an OC having the power slider maxed, so maybe its not broken for me?
> 
> Any opinions would be great.


I personally stick with 13.12. I play almost exclusively DX9 games that don't support Mantle, and I saw literally no improvement in BF4 from DX11 to Mantle.

I don't like how I couldn't set the power limit on 14.whatever, so I don't use it. To game on 3 monitors on a single card I need a hefty OC, and for that I need the power limit.

As for Metro 2033, I get about 60 FPS in most places with AA off at 4800x900... an unusual res, but it's just a bit more pixels than 1440p (17% more, 4.3 vs 3.7MP), so if you game at 1440p expect 60 FPS just about everywhere with no AA.


----------



## The Mac

Quote:


> Originally Posted by *B NEGATIVE*
> 
> Not true,prime is still good for more than just CPU testing
> TDP is nothing to do with power draw,its a value provided for cooling requirements.
> 
> You know...Thermal Design Power


And where exactly do you think the thermal transfer comes from? The ether?

My analysis and recommendation stand. TDP is a close approximation of power draw.


----------



## chiknnwatrmln

In regards to power usage, I don't really believe that two 290Xs will draw 900W...

My entire rig with my 290 at full tilt (we're talking more than +200mV here, so some pretty serious OC'ing for water) draws around 560W max at the wall. This includes my 3770K at 1.4V, a water pump, 10 fans, etc. With the 92% efficiency of my PSU, that means it is outputting ~515W of DC power.

Now, GPU-Z says my card is pulling around 350W max, give or take a few watts due to software inaccuracies. If this is anywhere near right, everything else in my rig is using 165 watts. This makes sense, because an OC'ed 3770K is around 100W TDP (77W stock); if I'm at 75% usage that's around 75W, leaving 90 watts for my fans, HDD, SSDs, pump, lights, RAM, mobo, etc.

So pretty much the most my ref 290 draws when OC'ed to the max is 350W, and keep in mind that's only for spikes, so most of the time it's under 330W. This is under water, but I'd expect a similarly OC'ed 290X to draw a max of 370-ish watts.

So, going off these estimates and simple math, two 290Xs OC'ed for suicide benchmark runs will draw roughly 750 watts maximum.

Others know more about the power draw of these cards; they might be able to chime in.
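The wall-to-DC arithmetic in the post above, as a quick back-of-envelope sketch; the 560W wall reading, 92% efficiency, and 350W GPU-Z figure are the poster's own numbers:

```python
# Back-of-envelope breakdown of the measured figures above (poster's numbers).
wall_draw = 560            # W measured at the wall
efficiency = 0.92          # claimed PSU efficiency
gpu_draw = 350             # W reported by GPU-Z for the 290

dc_output = wall_draw * efficiency    # power actually delivered to the system
rest_of_rig = dc_output - gpu_draw    # CPU, pump, fans, drives, etc.

print(round(dc_output))    # ~515 W
print(round(rest_of_rig))  # ~165 W
```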


----------



## Paul17041993

Quote:


> Originally Posted by *SupahSpankeh*
> 
> I've been doing this too, until very recently when it seems to reach the black screen state even when it's been turned off and the sleep mode has been disabled. Any ideas there?
> Yup. /signed
> Mea culpa - it's W8.1, haven't update sig rig OS in a while.
> 
> I think I've already tried disabling ULPS - has this helped anyone else?


OK, looks like a stability issue of some sort. Have you flushed out and re-installed the drivers? Otherwise I think you'll have to RMA the card...

Quote:


> Originally Posted by *Tokkan*
> 
> You don't have 1st hand experience uh?
> Well I'm reading through the pages and with the previous crossfire I owned, I could have my main card overclocked at 1Ghz while the slave card would be at 775Mhz, what would crossfire do? Nothing at all.
> Main card would wait for the slave card to finish processing, this translates to one card being pegged at 99% usage while the other was at 87% usage with applications that supported crossfire.
> If you know of a game that does not have crossfire support you can create a custom profile just for that game disabling crossfire when its being executed (you only require to do this incase the driver tries to use both cards and starts artifacting). You never really need to manually turn off crossfire for anything, think of it as Nvidia Optimus, creating an app profile for low power GPU or high performance GPU.
> 
> On a side note...
> Yesterday did a test run on my card at some OC's, managed to get 1100Mhz Core and 1350Mhz Mem. Didn't touch power limit or volts, benchmarked heaven with it without artifacting or crashes, gotta see if I can push further and what are the gains.


Neither card waits for the other; the driver just feeds them the next frame whenever they finish the current one. Frame pacing, however, can likely cause the effect you describe as it tries to fight the odd frame times.


----------



## The Mac

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> In regards to power usage I don't really believe that 2 290x's will draw 900w...
> 
> My entire rig with my 290 at full tilt (we're talking more than +200mV here, so some pretty serious OC'ing for water) draws around 560w max at the wall. This is include my 3770k on 1.4v, a water pump, 10 fans, etc. Now with the 92% efficiency of my PSU that means my PSU is outputting ~515w DC current.
> 
> Now GPU-z says that my card is pulling around 350w max, give or take a few watts due to software inaccuracies. But if this is anywhere near right, that means everything else in my rig is using 165 watts. This makes sense because an OC'ed 3770k is around 100w TDP (77w stock), if I'm at 75% usage that around 75w, leaving 90 watts for my fans, HDD, SSD's, pump, lights, RAM, mobo, etc.
> 
> So pretty much the most my ref 290 draws when OC'ed to the max is 350w, keep in mind that's only for spikes so most of the time it's under 330w. Keep in mind this is under water, but I'd expect a similarly OC'ed 290x to draw max of 370ish watts.
> 
> So, going off these estimations and simple math, 2 290x's OC'ed for suicide benchmark runs will draw roughly 750 watts maximum.
> 
> Others know more about the power draw of these cards, they might be able to chime in.


My math is similar; I figure full system draw to be around 800 watts, which is why I recommended a 1000W PSU: 80% load, better efficiency, and some headroom for upgrades, etc.

The 750W PSU the OP wanted to use would be right at its limit at 100% load.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *The Mac*
> 
> My math is similar, i figure full system draw to be around 800watts. which is why i recomended a 1000w PS. 80% load, better efficiency with some headroom for ugrades etc....
> 
> the 750ps the OP wanted to use would be right at its limit at 100% load.


Of course, if you want a power supply to last you don't want it to be at 100% all the time.

I currently have an 860 watt power supply. I might go CF in the future, but if I do I'm sticking to stock voltages. These cards really only start to draw a lot of power once they see more than stock voltage.


----------



## JordanTr

Quote:


> Originally Posted by *B NEGATIVE*
> 
> One for the list.
> 
> Sapphire 290 with AquaC block.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Prismatic paint flips from black...
> 
> 
> 
> To rainbow.
> 
> 
> 
> Just add light.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The missing bolt was in the paint rack,its back in place now.


Which block is that? It covers the full PCB. Is it a Kryographics block, or your own mod to one? If not, can I get a link to where to buy it? I'm planning to go Kryographics, but this one looks awesome.


----------



## BroJin

Any tips????
2nd bench
i5-3570K at 4.5GHz
Driver 14.4



Loving these temps btw


----------



## Tokkan

Quote:


> Originally Posted by *Paul17041993*
> 
> neither card waits for each other, the driver just feeds them the next frame whenever they finish the current one. framepacing however can likely cause the effect you describe as it tries to fight the odd frame times.


I'm talking about cards that I've had since 2011, and they always had this behavior... back when no one even talked about frame pacing, nor was it included in the drivers...


----------



## kizwan

Quote:


> Originally Posted by *The Mac*
> 
> i wouldnt 350 tdp x 2 = 700. that only leaves 50 for the rest of your stuff.
> 
> powersupplies are most efficient at 80% load so a 1000watt would be preferable (giving a 100watt tdp for the rest of your system)


If the PSU is rated to deliver 1050W, for example, it will be able to deliver a continuous 1050W without any problem, regardless of efficiency. The only difference between a low- and a high-efficiency PSU (both rated for 1050W, say) is that the former will draw more power from the wall than the latter to deliver the same maximum output power. There is one exception, though: if the PSU is made by a dishonest manufacturer using crappy components, it may be unable to maintain its max rated wattage continuously. In some cases, the PSU blows and may take some of the hardware with it.
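That point can be shown with a quick sketch: the rating is the DC side, and efficiency only changes what gets pulled from the wall (the 85%/92% figures here are illustrative, not measurements of any particular unit):

```python
# Wall draw for the same delivered DC output at two efficiency ratings (illustrative).
def wall_draw(dc_output_watts, efficiency):
    """A PSU delivering `dc_output_watts` pulls dc_output / efficiency from the wall."""
    return dc_output_watts / efficiency

print(round(wall_draw(1050, 0.85)))  # ~1235 W from the wall
print(round(wall_draw(1050, 0.92)))  # ~1141 W for the same 1050 W delivered
```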
Quote:


> Originally Posted by *The Mac*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *B NEGATIVE*
> 
> Quote:
> 
> 
> 
> Originally Posted by *The Mac*
> 
> i wouldnt 350 tdp x 2 = 700. that only leaves 50 for the rest of your stuff.
> 
> powersupplies are most efficient at 80% load so a 1000watt would be preferable (giving a 100watt tdp for the rest of your system)
> 
> 
> 
> TDP is nothing to do with power draw,its a value provided for cooling requirements.
> 
> You know...Thermal Design Power
> 
> Click to expand...
> 
> 
> 
> 
> 
> And where exactly do you think thermal transfer comes from? The ether?
> 
> My analysis, and recomendation stand. Tdp is a close approximation of power draw.
Click to expand...

I agree with BNEG. At the least, that is an incorrect use of the term "TDP", because you're talking about power consumption, not thermal output.


----------



## Jflisk

Quote:


> Originally Posted by *Talon720*
> 
> I had ran blend at the time, but b negative kinda summed up what i was trying to say. Also on Eurogamer review of the r9 295x2 compared to a crossfired 290 they claimed that the x-fired 290 got up to 850watts. Thats crazy if right that means a x-fire 290x is hitting 875-900w. I guess that could explain mine and chronicfx's shutdowns that appear psu related. Which, it's still making me question my hx1050 ability to push what I have at its peaks let alone a 3rd card.


I'll help you out a little here. I had an FX-9590 and 2x 7990s running off a 1350W split-rail power supply with no problems, with the rails split 500W and 750W (GPUs). I don't think the R9 295X hits more than 700 to 750 watts max, and that's two GPUs. Of course, my power supply was probably underrated, closer to 1500W than 1350W, or so says all the reading I did on it; there's also the fact that my lights flickered when my 7990s were at full tilt. For the lights to flicker, the power supply had to be trying to pull around 1500W at the outlet; I believe a 120V breaker pops at around 1550 watts of draw.


----------



## The Mac

Quote:


> Originally Posted by *kizwan*
> 
> If the PSU is rated to deliver 1050W for example, it will be able to deliver continuous 1050W without any problem, regardless efficiency. The only difference between low & high efficiency PSU (for example both rated for 1050W) is the former will draw more power from the wall than the latter for delivering the same max output power. There is one exception though, if the PSU is made by dishonest manufacturer who use crappy components, then it may unable to maintain continuous max rated wattage. In some/most (I don't know which is suitable because I don't have statistic figure) cases, the PSU blows & may take some of the hardware with it.
> I agree with BNEG. At least that is incorrect uses of "TDP" term because you're talking about power consumption, not thermal output.


Again, my recommendation stands.

My repeated testing over many years has shown that rated TDP is a close approximation of actual power usage. The fact that they are not the same is irrelevant, as they are close enough.

800 watts total power, maybe a bit more depending on his OC of the CPU and GPU.

A 1000 watt PSU will give him 20% headroom for expansion and additional overclocking while maintaining optimal PFC.


----------



## the9quad

I have no idea how much power I use, but I'm sure it's close to my PSU's limit. I never have issues, though, even when I run all 3 cards OC'd.


----------



## VladimirT

Hi. Can anyone help with this? http://www.overclock.net/t/1472341/official-msi-r9-290x-lightning-thread/460 (post 465 to the end).


----------



## chiknnwatrmln

First thing I'd do is verify your OC is sticking. Make sure your GPU stays clocked up, in GPU-Z your core clock should be a straight line right at 1200MHz under 100% load.


----------



## VladimirT

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> First thing I'd do is verify your OC is sticking. Make sure your GPU stays clocked up, in GPU-Z your core clock should be a straight line right at 1200MHz under 100% load.


The clock is at 1200 all the time. Temps are normal. Load is at 100% all the time. At stock, the Lightning has worse performance compared to the reference card.

Metro LL

1000-1250 (both)
Ref - 71-72
MSI - 64-65


----------



## B NEGATIVE

Quote:


> Originally Posted by *The Mac*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> If the PSU is rated to deliver 1050W for example, it will be able to deliver continuous 1050W without any problem, regardless efficiency. The only difference between low & high efficiency PSU (for example both rated for 1050W) is the former will draw more power from the wall than the latter for delivering the same max output power. There is one exception though, if the PSU is made by dishonest manufacturer who use crappy components, then it may unable to maintain continuous max rated wattage. In some/most (I don't know which is suitable because I don't have statistic figure) cases, the PSU blows & may take some of the hardware with it.
> I agree with BNEG. At least that is incorrect uses of "TDP" term because you're talking about power consumption, not thermal output.
> 
> 
> 
> Again, my recomendation stands.
> 
> My repeated testing over many years has shown rated tdp is a close approximation of actual power usage. The fact that they are not the same is irelevent, as they are close enough.
> 
> 800 watts total power. Maybe a bit more depending on his oc of the cpu and gpu.
> 
> A 1000 watts ps will give him 20% headroom for expansion and additional overclockngwhile maintaining optimal pfc.
Click to expand...

Applying that logic is incorrect.
Example: my X5650s... TDP 95W... actual draw 190W.
Each manufacturer has its own idea of what TDP means: NVIDIA uses a whole-card value, Intel uses a throttled value, etc.
Until TDP is used the same way across the board, it is worthless as a metric.


----------



## HOMECINEMA-PC

Okay then, Tri-Fire w/ blocks










And a re-plumbed 360 Alphacool UT45, ghetto modded to help out with da temps









Gonna re-bench TRI-Fire tonight ......... Spend no more


----------



## Paul17041993

Quote:


> Originally Posted by *The Mac*
> 
> Again, my recomendation stands.
> 
> My repeated testing over many years has shown rated tdp is a close approximation of actual power usage. The fact that they are not the same is irelevent, as they are close enough.
> 
> 800 watts total power. Maybe a bit more depending on his oc of the cpu and gpu.
> 
> A 1000 watts ps will give him 20% headroom for expansion and additional overclockngwhile maintaining optimal pfc.


Quote:


> Originally Posted by *B NEGATIVE*
> 
> Applying that logic is incorrect.
> Example: My x5650s....TDP 95w.....actual draw 190w.
> Each manufacturer has its own idea of what TDP means,NVIDIA use a whole card value,Intel use a throttled value etc.
> Until TDP is used the same across the board then it is worthless as a metric.


TDP is a thermal design value; in the real world it doesn't mean much, and in most cases on AMD hardware the TDP overshoots the peak power draw by 25%, even more in the case of my FX-8150 (~200W TDP, actual max draw under 80W).

The average max power draw on a 290X with the reference blower is under 200W; it's only when overclocking that this increases sharply, though watercooling also decreases the draw just as sharply up to a point (due to the sheer die size and gate count).

In a lot of cases you have to watch the actual power draw, especially on Intel and NVIDIA, where the TDP may be rated at 17W but the chip actually draws 35W under peak load. Similarly, a 290X can potentially pull 300W in FurMark, which is why that tool is discouraged on this level of hardware.



Spoiler: quote



Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Okay then Tri fire w/blocked
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And re-plumbed 360 Alphacool UT45 Ghetto modd to help out with da temps
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Gonna re-bench TRI-Fire tonight ......... Spend no more






why not just mount it as an outboard? lol


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Paul17041993*
> 
> TDP is a thermal design value, in the real world it doesn't mean anything, and in most cases on AMD hardware the TDP overshoots the peak power draw by 25%, even more in the case of my FX-8150 (~200W TDP, actual max draw is <80W)
> 
> average max powerdraw on a 290X with ref blower is <200W, its only when overclocking that this increases sharply, but watercooling also decreases the draw just as sharply to a certain point (both due to shear die size and gate count).
> 
> in a lot of companies you have to watch for actual power draw, especially on intel and nvidia where the TDP may be rated for 17W but actually draw 35W under peak load. in a similar case a 290X can potentially pull 300W on furmark, which is why said tool is discouraged on this level of hardware.
> 
> 
> why not just mount it as an outboard? lol


1. Don't have a boat ......









2. Need it there so the A/C can blast it cool


----------



## Talon720

Last night while playing BF4 I was trying my OC to see if it was stable. My computer crashed hard, fell back to the basic resolution, and said there were no AMD drivers. Never had that happen before. I reinstalled, and the first time through I got a BSOD, something 07... can't remember. Then my second card wouldn't go over 300MHz. I still have to try running DDU and reinstalling. I've just never had my stuff crash that hard; craziness, I tell ya.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Talon720*
> 
> Last night while palying bf4 i was trying my oc to see if it was stable. My computer crashed hard went back to the big resolution, and said there were no drivers for amd. Never had that happen. I reinstalled and during the first time bsod something 07... Cant remember, well then my second card wont go over 300mhz. Still gotta try and run ddu and reinstall. Just never had my ish crash that hard crazyness I tell ya.


My debug code for the VGA driver is 62 on my RIVE when it hard crashes; then it's time for an unplug and a reboot.


----------



## The Mac

Quote:


> Originally Posted by *B NEGATIVE*
> 
> Applying that logic is incorrect.
> Example: My x5650s....TDP 95w.....actual draw 190w.
> Each manufacturer has its own idea of what TDP means,NVIDIA use a whole card value,Intel use a throttled value etc.
> Until TDP is used the same across the board then it is worthless as a metric.


For the love of Pete, people, it's an estimate.

It's close enough.


----------



## passinos

Hey gang, just got my 2x 290x CF with Koolance Blocks. Upgrade from 7970 CF.

These cards weigh like 3-4 lbs! I'm sure it's the heavy blocks, but man, they are heavy.

Should probably get some back plates to help stiffen them up.

Now to see if my 860 platinum PSU is going to melt when I OC these bad boys.


----------



## Roy360

I've been thinking about what rdr09 said about using a better TIM.

Is CLU Ultra Safe for GPUs? The heatsync is copper so it shouldn't erase anything should it? (I already has some, whcih is why I ask)

otherwise I guess Arctic Silver 5 or Prolimatech would be the way to go?

EDIT: I think I just answered my own question: http://www.hardwaresecrets.com/article/Thermal-Compound-Roundup-January-2012/1468/5

I don't know why people say they have seen a 20 degree drop when using CLU, but according to that review, almost all aftermarket TIMs are within 2-3 degrees of each other. I guess I'll stick with whatever is lying around. (expired N1-H1)


----------



## Sgt Bilko

Quote:


> Originally Posted by *Roy360*
> 
> I've been thinking about what rdr09 said about using a better TIM.
> 
> Is CLU safe for GPUs? The heatsink is copper, so it shouldn't corrode anything, should it? (I already have some, which is why I ask)
> 
> otherwise I guess Arctic Silver 5 or Prolimatech would be the way to go?


AS5 is conductive as well.

They are safe to use so long as you only make contact with the die and nothing else, otherwise you run the risk of shorting the card.

Using masking tape around the die itself and then applying a thin layer would be the safest way, I'd think.


----------



## Jflisk

Quote:


> Originally Posted by *Roy360*
> 
> I've been thinking about what rdr09 said about using a better TIM.
> 
> Is CLU safe for GPUs? The heatsink is copper, so it shouldn't corrode anything, should it? (I already have some, which is why I ask)
> 
> otherwise I guess Arctic Silver 5 or Prolimatech would be the way to go?
> 
> EDIT: I think I just answered my own question: http://www.hardwaresecrets.com/article/Thermal-Compound-Roundup-January-2012/1468/5
> 
> I don't know why people say they have seen a 20 degree drop when using CLU, but according to that review, almost all aftermarket TIMs are within 2-3 degrees of each other. I guess I'll stick with whatever is lying around. (N1-H1)


I have heard of people using CLU on GPUs. You just need to follow the directions to a T if you own it already. You know, tape everything off around the die. Arctic Silver 5 is never suggested for use around a GPU. I forget if it's conductive or capacitive; either way it is not good to use around the GPU. N1-H1 will work, as will any TIM that is neither conductive nor capacitive.


----------



## The Mac

Gelid GC-Extreme is good, and non-conductive as well


----------



## B NEGATIVE

Quote:


> Originally Posted by *The Mac*
> 
> Gelid GC-Extreme is good, and non-conductive as well


Agreed, it's my preferred TIM; MX2 is my second.


----------



## invincible20xx

Guys, does anybody know if changing the TIM on the reference design cards to something like MX4 will improve thermals?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *invincible20xx*
> 
> Guys, does anybody know if changing the TIM on the reference design cards to something like MX4 will improve thermals?


Probably. I've heard they often put either too much or too little TIM on.

Check your warranty though, some companies (cough Sapphire cough) will void your warranty if they find out you did anything to the card except plug it in.

You will probably see a few degrees drop, nothing major though.


----------



## The Mac

a degree or two, nothing earth shattering.

It depends on how poorly the stock tim was applied at the factory.


----------



## Jflisk

Put them under water and shave 40C off the top.


----------



## Sgt Bilko

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Probably. I've heard they often put either too much or too little TIM on.
> 
> Check your warranty though, some companies (cough Sapphire cough) will void your warranty if they find out you did anything to the card except plug it in.
> 
> You will probably see a few degrees drop, nothing major though.


Sapphire refunded me, and I replaced the cooler with an AX III.

They have more of a "don't ask, don't tell" policy, IMO.


----------



## heroxoot

Quote:


> Originally Posted by *invincible20xx*
> 
> Guys, does anybody know if changing the TIM on the reference design cards to something like MX4 will improve thermals?


In my experience only non-ref cards, or cards with non-ref coolers, have good paste. I like Noctua thermal paste myself. It dropped temps compared to AS5 by 2 or 3C.


----------



## rdr09

It seems there are no throttling issues with 14.4 WHQL.

290 @ 1260/1500

http://www.3dmark.com/3dm11/8266945

I'll test BF4 with Mantle tonight or this weekend.

Used Trixx to OC.


----------



## BradleyW

What's the difference between WHQL and the RC versions of 14.4?


----------



## sugarhell

RCs are the drivers in the process of WHQL certification


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> RC are the drivers on the process for WHQL certification


So I assume there's no difference between RC and WHQL in this instance?


----------



## The Mac

Someone did a binary compare; the WHQL build is a little newer, but there are OGL extensions missing.


----------



## sugarhell

Quote:


> Originally Posted by *The Mac*
> 
> Someone did a binary compare; the WHQL build is a little newer, but there are OGL extensions missing.


Only CCC is newer


----------



## The Mac

Dunno, I'm just reporting; I didn't do the compare. The missing OGL extension is curious, however.


----------



## BradleyW

So the differences more or less concern OpenGL 4.4. Thanks to both of you for the help. +1.


----------



## The Mac

Notice ARB_compute_variable_group_size is supposedly missing from the WHQL.


----------



## BroJin

Quote:


> Originally Posted by *BradleyW*
> 
> What's the difference between WHQL and the RC versions of 14.4?


No patch notes on the WHQL version, so who knows; it's just certified.


----------



## The Mac

CrossFire fixes and enhancements:
Crysis 3 - frame pacing improvements
Far Cry 3 - 3- and 4-GPU performance improvements at high quality settings and high resolutions
Anno 2070 - Improved CrossFire scaling up to 34%
Titanfall - Resolved in-game flickering with CrossFire enabled
Metro Last Light - Improved CrossFire scaling up to 10%
Eyefinity 3x1 (with three 4K panels) no longer cuts off portions of the application
Stuttering has been improved in certain applications when selecting mid-Eyefinity resolutions with V-sync enabled

Mantle beta driver improvements:
Battlefield 4: Performance slowdown is no longer seen when performing a task switch/Alt-tab
Battlefield 4: Fuzzy images are no longer seen when playing in a rotated SLS resolution on an A10 Kaveri system


----------



## Lord Xeb

Anyone having issues with their cards having a black screen after being idle for a while?


----------



## Paul17041993

Quote:


> Originally Posted by *BradleyW*
> 
> What's the difference between WHQL and the RC versions of 14.4?


WHQL just means they have been officially signed and are ready for official use. RC is just a release candidate, a pre-release or preview that people can try out to hopefully catch any late bugs that snuck out of the beta, so they can be patched quickly before the official release is signed off.

Oh, and FYI, the entire 14.x series has been missing a fair few OpenGL extensions; they've been overhauling a lot of it, which is generally one reason why I haven't been using them, as they broke Minecraft the last version I checked, and I need stable drivers to get my current assessment (a multi-window OpenGL-based GUI system) done.

Oh, and I guess I'll report the extension list back here afterwards if people want? It only needs a few more lines of code in my program.


----------



## chiknnwatrmln

Just upgraded to 14.4, seems to work fine. No downclocking or anything else. Haven't tried any Mantle games yet though, only Skyrim.


----------



## centvalny

FS update on H2O before going cold



http://imgur.com/Spo3OVF


----------



## Goride

I recently got a BF4 edition Powercolor 290x reference card.

The performance is not nearly as good as I was expecting. I was wondering if maybe my expectations were too high.

At stock/default settings with 13.12 drivers, I was not able to max out the settings of Metro: Last Light on a 1080p monitor, and stay above 60fps. In fact it was closer to 29fps. I had GPU-Z logging, and it did throttle down and the core was staying between 850-950mhz.

I used MSI AB to manually turn up the fan and power limit %, which stopped the throttling and kept it at the default 1030mhz. This increased the frames a bit, but they basically stayed the same.

I OC'd it to 1175mhz, with the fan running at 65% (really loud). I got some more frames, but I still could only barely stay above 30fps.

I know Metro: Last Light is a pretty demanding game, but I figured @ 1080p, I would be able to stay above 60fps.

I also tried Guild Wars 2. I ran that at max settings, and could not stay above 60fps either. I turned super sampling off, and that helped some, but I still could not maintain at least 60fps. My friend, who has a 780 (same 1080p), was running around in the same areas as me, and he was getting around 20-35 more fps than I was at any given time, with the same graphics settings. During this time I was running at 1150core/1250vram, with no throttling, and he was running the 780 at stock/default settings.
(I did see this link: http://techreport.com/review/25509/amd-radeon-r9-290x-graphics-card-reviewed/11 which suggests the 780 just runs better in gw2 than the 290x, though)

I was able to OC the 290x to 1300core/1250vram, with the fan at 70% and the core voltage slider at +80. GPU-Z said the max core voltage was 1.286v. It seemed to run Metro: Last Light stable at this. Core temp was around 80c (I forget vram temps). I then tried to push it to 1340/1250 and got artifacting. I pushed the voltage all the way to the right, the artifacting went away, but it crashed in like 10 seconds. GPU-Z didn't show the voltage going past 1.286v. When I tried to put it back to 1300/1250 it just crashes. In fact, I cannot even keep 1200/1250 stable. The best I can do now is 1175/1250. Maybe running 1300/1250 for about 5-10 minutes without issue was just luck. But now I cannot run 1300/1250 more than 5-10 seconds without crashing.

Does any of this seem "normal" or at least "par for the course" with these 290x cards?

I had planned on investing in some aftermarket cooling. But the 290x + aftermarket cooling puts me in the 780 price range. And even though the 290x is supposed to out perform the 780, it seems like at least with my card, this is not the case.

Should I be able to run Metro:Last Light on max above 60fps on @ 1080p? Can a 780? What about the differences in GW2 between the 290x and 780? Even if GW2 was better optimized for Nvidia, I would more or less expect the cards to perform about the same FPS in that case - not the 780 beating my 290x by 25-35fps on same settings.

(I know this was long. If you made it this far, thanks for reading.)


----------



## igrease

Quote:


> Originally Posted by *Goride*
> 
> I recently got a BF4 edition Powercolor 290x reference card.
> 
> The performance is not nearly as good as I was expecting. I was wondering if maybe my expectations were too high.
> 
> At stock/default settings with 13.12 drivers, I was not able to max out the settings of Metro: Last Light on a 1080p monitor, and stay above 60fps. In fact it was closer to 29fps. I had GPU-Z logging, and it did throttle down and the core was staying between 850-950mhz.
> 
> I used MSI AB to manually turn up the fan and power limit %, which stopped the throttling and stayed at the default 1030mhz. This increased the frames a bit, but basically stayed the same.
> 
> I OC'd it to 1175mhz, with the fan running at 65% (really loud). I got some more frames, but I still could only barely stay above 30fps.
> 
> I know Metro: Last Light is a pretty demanding game, but I figured @ 1080p, I would be able to stay above 60fps.
> 
> I also tried Guild Wars 2. I ran that at max settings, and could not stay above 60fps either. I turned super sampling off, and that helped some, but I still could not maintain at least 60fps. My friend, who has a 780 (same 1080p), was running around in the same areas as me, and he was getting around 20-35 more fps than I was at any given time, with the same graphics settings. During this time I was running at 1150core/1250vram, with no throttling, and he was running the 780 at stock/default settings.
> (I did see this link: http://techreport.com/review/25509/amd-radeon-r9-290x-graphics-card-reviewed/11 which suggests the 780 just runs better in gw2 than the 290x, though)
> 
> I was able to OC the 290x to 1300core/1250vram, with the fan at 70% and the core voltage slider at +80. GPU-Z said the max core voltage was 1.286v. It seemed to run Metro: Last Light stable at this. Core temp was around 80c (I forget vram temps). I then tried to push it to 1340/1250 and got artifacting. I pushed the voltage all the way to the right, the artifacting went away, but it crashed in like 10 seconds. GPU-Z didn't show the voltage going past 1.286v. When I tried to put it back to 1300/1250 it just crashes. In fact, I cannot even keep 1200/1250 stable. The best I can do now is 1175/1250. Maybe running 1300/1250 for about 5-10 minutes without issue was just luck. But now I cannot run 1300/1250 more than 5-10 seconds without crashing.
> 
> Does any of this seem "normal" or at least "par for the course" with these 290x cards?
> 
> I had planned on investing in some aftermarket cooling. But the 290x + aftermarket cooling puts me in the 780 price range. And even though the 290x is supposed to out perform the 780, it seems like at least with my card, this is not the case.
> 
> Should I be able to run Metro:Last Light on max above 60fps on @ 1080p? Can a 780? What about the differences in GW2 between the 290x and 780? Even if GW2 was better optimized for Nvidia, I would more or less expect the cards to perform about the same FPS in that case - not the 780 beating my 290x by 25-35fps on same settings.
> 
> (I know this was long. If you made it this far, thanks for reading.)


I wish I could OC my 290 to at least 1250 on air. At just 1100/1250 it sits at almost 80c with fans at 90%.


----------



## mnicassio89

Quote:


> Originally Posted by *Goride*
> 
> I recently got a BF4 edition Powercolor 290x reference card.
> 
> The performance is not nearly as good as I was expecting. I was wondering if maybe my expectations were too high.
> 
> At stock/default settings with 13.12 drivers, I was not able to max out the settings of Metro: Last Light on a 1080p monitor, and stay above 60fps. In fact it was closer to 29fps. I had GPU-Z logging, and it did throttle down and the core was staying between 850-950mhz.
> 
> I used MSI AB to manually turn up the fan and power limit %, which stopped the throttling and stayed at the default 1030mhz. This increased the frames a bit, but basically stayed the same.
> 
> I OC'd it to 1175mhz, with the fan running at 65% (really loud). I got some more frames, but I still could only barely stay above 30fps.
> 
> I know Metro: Last Light is a pretty demanding game, but I figured @ 1080p, I would be able to stay above 60fps.
> 
> I also tried Guild Wars 2. I ran that at max settings, and could not stay above 60fps either. I turned super sampling off, and that helped some, but I still could not maintain at least 60fps. My friend, who has a 780 (same 1080p), was running around in the same areas as me, and he was getting around 20-35 more fps than I was at any given time, with the same graphics settings. During this time I was running at 1150core/1250vram, with no throttling, and he was running the 780 at stock/default settings.
> (I did see this link: http://techreport.com/review/25509/amd-radeon-r9-290x-graphics-card-reviewed/11 which suggests the 780 just runs better in gw2 than the 290x, though)
> 
> I was able to OC the 290x to 1300core/1250vram, with the fan at 70% and the core voltage slider at +80. GPU-Z said the max core voltage was 1.286v. It seemed to run Metro: Last Light stable at this. Core temp was around 80c (I forget vram temps). I then tried to push it to 1340/1250 and got artifacting. I pushed the voltage all the way to the right, the artifacting went away, but it crashed in like 10 seconds. GPU-Z didn't show the voltage going past 1.286v. When I tried to put it back to 1300/1250 it just crashes. In fact, I cannot even keep 1200/1250 stable. The best I can do now is 1175/1250. Maybe running 1300/1250 for about 5-10 minutes without issue was just luck. But now I cannot run 1300/1250 more than 5-10 seconds without crashing.
> 
> Does any of this seem "normal" or at least "par for the course" with these 290x cards?
> 
> I had planned on investing in some aftermarket cooling. But the 290x + aftermarket cooling puts me in the 780 price range. And even though the 290x is supposed to out perform the 780, it seems like at least with my card, this is not the case.
> 
> Should I be able to run Metro:Last Light on max above 60fps on @ 1080p? Can a 780? What about the differences in GW2 between the 290x and 780? Even if GW2 was better optimized for Nvidia, I would more or less expect the cards to perform about the same FPS in that case - not the 780 beating my 290x by 25-35fps on same settings.
> 
> (I know this was long. If you made it this far, thanks for reading.)


What drivers are you using?


----------



## Goride

Quote:


> Originally Posted by *igrease*
> I wish I could OC my 290 to at least 1250 on air. At just 1100/1250 it sits at almost 80c with fans at 90%.


1250mhz for the vram was just the default setting. I did not mess with it.

And the vram 1 temps are about 85c, sometimes hitting 92c, with the fans around 60% (which are still really loud, and I have a sound dampened Fractal R3 case)

Quote:


> Originally Posted by *mnicassio89*
> 
> What drivers are you using?


13.12 - the latest non-beta drivers


----------



## Sgt Bilko

Quote:


> Originally Posted by *Goride*
> 
> I recently got a BF4 edition Powercolor 290x reference card.
> 
> The performance is not nearly as good as I was expecting. I was wondering if maybe my expectations were too high.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> At stock/default settings with 13.12 drivers, I was not able to max out the settings of Metro: Last Light on a 1080p monitor, and stay above 60fps. In fact it was closer to 29fps. I had GPU-Z logging, and it did throttle down and the core was staying between 850-950mhz.
> 
> I used MSI AB to manually turn up the fan and power limit %, which stopped the throttling and stayed at the default 1030mhz. This increased the frames a bit, but basically stayed the same.
> 
> I OC'd it to 1175mhz, with the fan running at 65% (really loud). I got some more frames, but I still could only barely stay above 30fps.
> 
> I know Metro: Last Light is a pretty demanding game, but I figured @ 1080p, I would be able to stay above 60fps.
> 
> I also tried Guild Wars 2. I ran that at max settings, and could not stay above 60fps either. I turned super sampling off, and that helped some, but I still could not maintain at least 60fps. My friend, who has a 780 (same 1080p), was running around in the same areas as me, and he was getting around 20-35 more fps than I was at any given time, with the same graphics settings. During this time I was running at 1150core/1250vram, with no throttling, and he was running the 780 at stock/default settings.
> (I did see this link: http://techreport.com/review/25509/amd-radeon-r9-290x-graphics-card-reviewed/11 which suggests the 780 just runs better in gw2 than the 290x, though)
> 
> I was able to OC the 290x to 1300core/1250vram, with the fan at 70% and the core voltage slider at +80. GPU-Z said the max core voltage was 1.286v. It seemed to run Metro: Last Light stable at this. Core temp was around 80c (I forget vram temps). I then tried to push it to 1340/1250 and got artifacting. I pushed the voltage all the way to the right, the artifacting went away, but it crashed in like 10 seconds. GPU-Z didn't show the voltage going past 1.286v. When I tried to put it back to 1300/1250 it just crashes. In fact, I cannot even keep 1200/1250 stable. The best I can do now is 1175/1250. Maybe running 1300/1250 for about 5-10 minutes without issue was just luck. But now I cannot run 1300/1250 more than 5-10 seconds without crashing.
> 
> Does any of this seem "normal" or at least "par for the course" with these 290x cards?
> 
> I had planned on investing in some aftermarket cooling. But the 290x + aftermarket cooling puts me in the 780 price range. And even though the 290x is supposed to out perform the 780, it seems like at least with my card, this is not the case.
> 
> Should I be able to run Metro:Last Light on max above 60fps on @ 1080p? Can a 780? What about the differences in GW2 between the 290x and 780? Even if GW2 was better optimized for Nvidia, I would more or less expect the cards to perform about the same FPS in that case - not the 780 beating my 290x by 25-35fps on same settings.
> 
> (I know this was long. If you made it this far, thanks for reading.)


What's the rest of your rig? CPU, Mobo etc.
Quote:


> Originally Posted by *mnicassio89*
> 
> What drivers are you using?


3rd line: 13.12


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Goride*
> 
> I recently got a BF4 edition Powercolor 290x reference card.
> 
> The performance is not nearly as good as I was expecting. I was wondering if maybe my expectations were too high.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> At stock/default settings with 13.12 drivers, I was not able to max out the settings of Metro: Last Light on a 1080p monitor, and stay above 60fps. In fact it was closer to 29fps. I had GPU-Z logging, and it did throttle down and the core was staying between 850-950mhz.
> 
> I used MSI AB to manually turn up the fan and power limit %, which stopped the throttling and stayed at the default 1030mhz. This increased the frames a bit, but basically stayed the same.
> 
> I OC'd it to 1175mhz, with the fan running at 65% (really loud). I got some more frames, but I still could only barely stay above 30fps.
> 
> I know Metro: Last Light is a pretty demanding game, but I figured @ 1080p, I would be able to stay above 60fps.
> 
> I also tried Guild Wars 2. I ran that at max settings, and could not stay above 60fps either. I turned super sampling off, and that helped some, but I still could not maintain at least 60fps. My friend, who has a 780 (same 1080p), was running around in the same areas as me, and he was getting around 20-35 more fps than I was at any given time, with the same graphics settings. During this time I was running at 1150core/1250vram, with no throttling, and he was running the 780 at stock/default settings.
> (I did see this link: http://techreport.com/review/25509/amd-radeon-r9-290x-graphics-card-reviewed/11 which suggests the 780 just runs better in gw2 than the 290x, though)
> 
> I was able to OC the 290x to 1300core/1250vram, with the fan at 70% and the core voltage slider at +80. GPU-Z said the max core voltage was 1.286v. It seemed to run Metro: Last Light stable at this. Core temp was around 80c (I forget vram temps). I then tried to push it to 1340/1250 and got artifacting. I pushed the voltage all the way to the right, the artifacting went away, but it crashed in like 10 seconds. GPU-Z didn't show the voltage going past 1.286v. When I tried to put it back to 1300/1250 it just crashes. In fact, I cannot even keep 1200/1250 stable. The best I can do now is 1175/1250. Maybe running 1300/1250 for about 5-10 minutes without issue was just luck. But now I cannot run 1300/1250 more than 5-10 seconds without crashing.
> 
> Does any of this seem "normal" or at least "par for the course" with these 290x cards?
> 
> I had planned on investing in some aftermarket cooling. But the 290x + aftermarket cooling puts me in the 780 price range. And even though the 290x is supposed to out perform the 780, it seems like at least with my card, this is not the case.
> 
> Should I be able to run Metro:Last Light on max above 60fps on @ 1080p? Can a 780? What about the differences in GW2 between the 290x and 780? Even if GW2 was better optimized for Nvidia, I would more or less expect the cards to perform about the same FPS in that case - not the 780 beating my 290x by 25-35fps on same settings.
> 
> (I know this was long. If you made it this far, thanks for reading.)


First, if you have Metro maxed you're not running 1080p, you're running 4K. The game uses SSAA so you're rendering many more pixels than you're seeing. 30 FPS for Metro on 4K for a single card is about what you should expect.
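The pixel math behind that 4K comparison is easy to check. Here is a quick sketch (standard 1080p and UHD figures, nothing game-specific; the helper name is mine, not from any tool mentioned in the thread):

```python
# Rough pixel-throughput comparison: 1080p with 4x SSAA vs native 4K.
# With supersampling, the GPU renders far more pixels than the display shows.

def pixels(width, height, ssaa=1.0):
    """Pixels rendered per frame; ssaa is the supersampling factor."""
    return int(width * height * ssaa)

native_1080p = pixels(1920, 1080)          # 2,073,600 pixels
ssaa4x_1080p = pixels(1920, 1080, ssaa=4)  # 8,294,400 pixels
native_4k    = pixels(3840, 2160)          # 8,294,400 pixels

# 4x SSAA at 1080p renders exactly as many pixels as native UHD 4K.
assert ssaa4x_1080p == native_4k
```

Which is why "maxed at 1080p" with SSAA enabled lands in the same performance ballpark as a native 4K run.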

There is no way you could run 1300+MHz core on air with only +80mv, whatever tests you did at these clocks must have not sufficiently loaded the card or it would crash right away. 1175MHz core on air is a little above average, not great but not bad either.

As for the fluctuating clocks, add +50 power limit.


----------



## mnicassio89

Sorry my mistake, one of those days.


----------



## cennis

Does anyone here know where I can get the Matrix 290x BIOS? It has VRM frequency adjustment in GPU Tweak.
I want to try it on my DCUII card to try to fix the VRM temps.


----------



## Sgt Bilko

Quote:


> Originally Posted by *cennis*
> 
> Does anyone here know where I can get the Matrix 290x bios? It has vrm frequency adjustment in gputweak.
> I want to try it on DCUII card to try to fix the VRM temps.


Different PCB, more power phases; I don't think that would fix your problem.

You'd be better off adding some thermal pads to whatever VRM cooling there is and seeing if you can improve your airflow.


----------



## Goride

Quote:


> Originally Posted by *Sgt Bilko*
> 
> What's the rest of your rig? CPU, Mobo etc.


CPU: 3570k @ stock 3.4ghz
RAM: 2x4gb(8gb) of Samsung 30nm @ ddr3 1600 with 11-11-11-28 timings
SSD: 240gb Mushkin
Mobo: Asrock z77 extreme4
PSU: Seasonic 650w (I would not dare try to crossfire, but should be fine with 1)

I can OC the CPU and ram, but I have them at stock or tame settings just to try and rule out their settings as the culprit for any stability issues.

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> First, if you have Metro maxed you're not running 1080p, you're running 4K. The game uses SSAA so you're rendering many more pixels than you're seeing. 30 FPS for Metro on 4K for a single card is about what you should expect.


I guess I do not quite understand this part. Is having SSAA at x4 while at 1920x1080 resolution analogous to supersampling textures (where it renders at a higher resolution and then scales it down)?
Quote:


> There is no way you could run 1300+MHz core on air with only +80mv, whatever tests you did at these clocks must have not sufficiently loaded the card or it would crash right away. 1175MHz core on air is a little above average, not great but not bad either.


+80 was making it report 1.286v in GPU-Z. I have seen people say they were getting 1300 @ 1.3v (but yes water cooled). Even at +100, with the fan at 70%, the core temps do not pass 90c, and usually are around 85c (the vram 1 temps were hitting up to 95c though).
Quote:


> As for the fluctuating clocks, add +50 power limit.


Yeah, I turned this up too. But when I was experiencing it, as described above, it was throttling down because it was hitting 94c. When I manually turned the fan speed up, keeping it away from 95c, it stopped throttling itself.
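The temperature-target behavior described here (clocks backing off as the card nears its 94-95c target, then recovering once the fan keeps it cooler) can be sketched as a toy control loop. The constants are illustrative values from this thread, not AMD's actual PowerTune algorithm:

```python
# Toy model of PowerTune-style temperature throttling. Illustrative
# only: the real firmware uses a far more sophisticated controller.

TEMP_TARGET = 94   # throttle point seen in the post (degrees C)
MIN_CLOCK = 300    # idle clock floor (MHz)
MAX_CLOCK = 1030   # reference 290X boost clock (MHz)
STEP = 50          # MHz shed or recovered per control tick

def next_clock(temp_c, clock_mhz):
    """Shed clocks at or above the temp target, recover below it."""
    if temp_c >= TEMP_TARGET:
        return max(MIN_CLOCK, clock_mhz - STEP)
    return min(MAX_CLOCK, clock_mhz + STEP)

# At 94-95c the card sheds clocks every tick (the 850-950mhz seen in
# the GPU-Z logs); keep temps below target with more fan and the clock
# climbs back to 1030mhz.
```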


----------



## cennis

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Different PCB, more power phases, I don't think that would fix your problem.
> 
> You'd be better off adding some thermal pads to whatever vrm cooling there is and seeing if you can improve your airflow


I will do that too, but it can't hurt to try.


----------



## bond32

Run Firestrike at stock clocks, then run it again at your 1300/+80mv. I would bet money your graphics score will actually drop. Why? Unstable.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Goride*
> 
> CPU: 3570k @ stock 3.4ghz
> RAM: 2x4gb(8gb) of Samsung 30nm @ ddr3 1600 with 11-11-11-28 timings
> SSD: 240gb Mushkin
> Mobo: Asrock z77 extreme4
> PSU: Seasonic 650w (I would not dare try to crossfire, but should be fine with 1)
> 
> I can OC the CPU and ram, but I have them at stock or tame settings just to try and rule out their settings as the culprit for any stability issues.
> I guess I do not quite understand this part. Is having SSAA at x4 while at 1920x1080 resolution, analogous to supersampling textures (where it renders at a higher resolution then scales it down)?
> +80 was making it report 1.286v in GPU-Z. I have seen people say they were getting 1300 @ 1.3v (but yes water cooled). Even at +100, with the fan at 70%, the core temps do not pass 90c, and usually are around 85c (the vram 1 temps were hitting up to 95c though).
> Yeah, I turned this up too. But when I was experiencing it, as described above, the reason it was throttling down was because it was hitting 94c. When I manually turned the fan speed up, keeping it away from 95c, it stopped throttling itself.


SSAA = SuperSampling AA, which means you're rendering at 4x your display res.

You said the max GPU-Z reported was 1.286v. This is not load voltage, this is the max. People can run 1300 on 1.3v load. Look up and read about Vdroop; you will understand it more.
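For what it's worth, Vdroop can be illustrated with a simple load-line model. All of the numbers below are hypothetical, picked only to show the shape of the effect, not measured from any 290X:

```python
# Hypothetical Vdroop sketch: resistance in the power delivery path
# pulls the voltage at the GPU below the set point as current rises.
# The load-line resistance and currents here are made-up values.

def load_voltage(v_set, i_load_amps, loadline_mohm):
    """Voltage actually seen by the die at a given load current."""
    return v_set - i_load_amps * (loadline_mohm / 1000.0)

V_SET = 1.30  # software-requested voltage

idle = load_voltage(V_SET, 10, 0.3)    # light load: barely droops
heavy = load_voltage(V_SET, 200, 0.3)  # full load: visible droop

# A sensor polled near idle reports close to the set point (the "max"
# a monitor shows), while the sustained load voltage sits lower.
assert idle > heavy
```

That gap between the set point and the sustained load voltage is why two cards "at 1.3v" can behave very differently.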


----------



## Goride

Quote:


> Originally Posted by *bond32*
> 
> Run Firestrike at stock clocks, then run it again at your 1300/+80mv. I would bet money your graphics score will actually drop. Why? Unstable.


If you are referring to me, I cannot even get the 290x to run at 1300 for more than 5-10 seconds now before it crashes.

I did have it at 1300 for about 5-10 minutes with everything set as high as possible in Metro: Last Light. It seemed to be running well, so I thought, I would try 1340, which immediately crashed. Ever since then, I have not been able to run anything higher than 1175 without eventually having artifacting or crashing.

So it was probably just some sort of anomaly, and I guess a lucky streak let me run Metro at 1300 for 5-10 minutes. Not sure how else to explain it. But with that said, I was able to run it at 1200/1250 and 1250/1250 for a while too. But now I cannot do that either.

I can get it to run stable at 1175, but the fan is way higher than I would like. I can turn the voltage down some at 1150, and keep the fan speed around 40%, which is still loud, but at least somewhat reasonable.


----------



## cennis

Quote:


> Originally Posted by *Goride*
> 
> If you are referring to me, I cannot even get the 290x to run at 1300 for more than 5-10 seconds now before it crashes.
> 
> I did have it at 1300 for about 5-10 minutes with everything set as high as possible in Metro: Last Light. It seemed to be running well, so I thought, I would try 1340, which immediately crashed. Ever since then, I have not been able to run anything higher than 1175 without eventually having artifacting or crashing.
> 
> So it was probably just some sort of anomaly, and I guess a lucky streak let me run Metro at 1300 for 5-10 minutes. Not sure how else to explain it. But with that said, I was able to run it at 1200/1250 and 1250/1250 for a while too. But now I cannot do that either.
> 
> I can get it to run stable at 1175, but the fan is way higher than I would like. I can turn the voltage down some at 1150, and keep the fan speed around 40%, which is still loud, but at least somewhat reasonable.


I think the stock cooler at stock speeds will hit 95c at 40%, which was the original problem everyone had?


----------



## chiknnwatrmln

Quiet, overclocked, low temps: if you have the ref cooler, pick two.

As for the guy trying to flash a Lightning BIOS on a DCUII, I don't think it'll work too well.


----------



## Goride

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> SSAA = SuperSampling AA, which means you're rendering at 4x your display res.


Ahh ok, thank you, that makes a lot more sense. I will set it to x1 and check it later. I am guessing that will make a big difference. (I am still in the process of learning and messing around with some of this stuff; it has been a while since I have had a powerful GPU and could even think about playing with these settings.)
Quote:


> You said the max GPU-Z reported was 1.286v. This is not load voltage, this is max. People can run 1300 on 1.3v load. Look up and read about Vdroop, you will understand it more.


I had just unboxed the card and was playing around with it. I was focused on looking at max voltages and max temps at this point, because I wanted to see how far I could clock it up, but at the same time never letting it go over 1.3v and staying under 90c at all times (IE: I was trying to avoid frying it or overheating it right off the bat, before I even figured out what I was doing).

I do not fully understand how vdroop works, but I think I understand the general concepts. What I do not get, though, is that if I have the offset set to +100 in MSI AB, and GPU usage is at or near 100% pretty much at all times, the most I have seen GPU-Z report was 1.316v (max), but usually it is much lower than that. Why is it not averaging near 1.3v when at 100% usage (according to MSI AB)? I would not do this on air, but can you set it higher than +100 somehow, or else how are the WCers doing it?

(Also, for what it is worth, the reference cooler actually seems to cool decently (at least the core, the vram1 was still pretty high), it is just ungodly loud when you turn it up to 70% or so)


----------



## Goride

Quote:


> Originally Posted by *cennis*
> 
> I think stock cooler at stock speeds will hit 95c at 40% which was the original problem everyone had?


40-45% keeps the core temp around 85-92c at all times. But 40-45% fan speed is still beyond tolerable loudness IMO. Anywhere between 20-35% seems to be ok (at least in my case). But that is not enough to keep it below 95c. (also 40-45% cannot keep the vram1 temp below 95c).

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Quiet, overclocked, low temps. If you have the ref cooler pick two.


Yes, I know.

I was just trying to find out if some of the things I was running into were "normal" for the 290x before I spend money on aftermarket cooling. For instance, I got less-than-expected results running Metro: Last Light, and in Guild Wars 2 a stock 780 was getting significantly better FPS than I could, both at stock and when OC'd. So things just seemed weird.

In Metro, I expected to stay above 60fps on max settings at 1080p, but I was getting around 29fps. However, it appears I was supersampling AA at 4K when I did not mean to, which explains why I was only getting around 29fps.

In GW2, the 780 at stock speeds was getting around 30-40 more FPS than my 290x at stock/default settings (which were throttling around 850-950mhz), while using the exact same graphics settings. The 780 was still getting around 25-30 more FPS at any given moment when I had my core stable at 1030mhz and even 1150mhz in GW2. But, apparently, GW2 just runs better on a 780 than a 290x. I just did not think a stock 780 would have 25fps more than a 290x running at 1150/1250, and even more when at stock.

So out of the two games I tested, I messed up the settings on Metro, and GW2 is just not a good example. Which explains why I was getting results I was not expecting.


----------



## Goride

Anyway, thanks for the replies everyone.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Goride*
> 
> Ahh ok, thank you, that makes a lot more sense. I will set it to x1 and check it later. I am guessing that will make a big difference. (I am still in the process of learning more and messing around with some of this stuff (it has been awhile since I have had a powerful GPU, and could even think about playing with some of these settings)
> I had just unboxed the card and was playing around with it. I was focused on looking at max voltages and max temps at this point, because I wanted to see how far I could clock it up, but at the same time never letting it go over 1.3v and staying under 90c at all times (IE: I was trying to avoid frying it or overheating it right off the bat, before I even figured out what I was doing).
> 
> I do not fully understand how vdroop works, but I think I understand the general concepts of it. What I do not get though, is if I have the offset set to +100 in MSI AB, and the GPU usage is at or near 100% pretty much at all times, the most I have seen GPU-Z report was 1.316v (max), but usually it is still much lower than that. Why is it not averaging near 1.3v, when at 100% usage (according to MSI AB). I would not do this on air, but can you set it higher than +100 somehow, or else how are the WCers doing it?
> 
> (Also, for what it is worth, the reference cooler actually seems to cool decently (at least the core, the vram1 was still pretty high), it is just ungodly loud when you turn it up to 70% or so)


This should help clarify...

As for more than +100, there are config commands that let you go higher in MSI AB. I hate AB, so I prefer Trixx, which by default allows +200mV. I run +168mV daily; this takes me to just around 1.3v load.


Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Highest voltage means nothing..
> 
> What happens is when the card kicks into 3d mode it clocks up and voltage is increased to max. In your case 1.422v, around the same for me.
> 
> Due to vdroop, as GPU usage (and current) increases the voltage decreases. Each GPU will droop a different amount, depending on chip quality.
> 
> To figure out actual voltage, run a bench like Heaven that will give you constant 100% usage and look at your VDDC in GPUz as you're running the bench. The VDDC will jump a bit (usually like .1v in either direction) but will stay around the same number... Mine goes from 1.29v to 1.31v usually so I call it 1.3v after droop.
> 
> If you're not familiar with vdroop, if I remember right it's a safety feature implemented by resistors. Because of changes in voltage as the GPU kicks into 3d mode, if those resistors weren't there then a massive yet quick change in voltage would be present, and over time could damage the GPU. The same principle applies to CPU's, which is why CPU's also have droop.
> 
> LLC (loadline correction) basically uses algorithms to feed more voltage when the CPU/GPU is under load... I'm hesitant to use PT3 BIOS because as far as I know there's no way to control LLC. With my motherboard, on the other hand, I can adjust LLC so that at 100% clock speed but no load I have the same exact voltage as 100% clock speed and full load, 1.39v.
> 
> This link has graphs that will help if I didn't explain it clearly. http://en.wikipedia.org/wiki/Voltage_droop






Also, try not to double/triple post. Edit your posts instead.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Goride*
> 
> 1250mhz for the vram was just the default setting. I did not mess with it.
> 
> And the vram 1 temps are about 85c, sometimes hitting 92c, with the fans around 60% (which are still really loud, and I have a sound dampened Fractal R3 case)
> 13.12 - the lastest non-beta drivers
> Quote:
> 
> 
> 
> Originally Posted by *mnicassio89*
> 
> Sorry my mistake, one of those days.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *cennis*
> 
> Does anyone here know where I can get the Matrix 290x bios? It has vrm frequency adjustment in gputweak.
> I want to try it on DCUII card to try to fix the VRM temps.
> 

Hey guys, you should fill out Rigbuilder in your profile at the top left of the screen, so we can see what you've got


----------



## BroJin

Is this a good score?

i5 3570K @ 4.5GHz


----------



## alancsalt

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Goride*
> 
> 1250mhz for the vram was just the default setting. I did not mess with it.
> 
> And the vram 1 temps are about 85c, sometimes hitting 92c, with the fans around 60% (which are still really loud, and I have a sound dampened Fractal R3 case)
> 13.12 - the lastest non-beta drivers
> Quote:
> 
> 
> 
> Originally Posted by *mnicassio89*
> 
> Sorry my mistake, one of those days.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *cennis*
> 
> Does anyone here know where I can get the Matrix 290x bios? It has vrm frequency adjustment in gputweak.
> I want to try it on DCUII card to try to fix the VRM temps.
> 
> 
> Hey guys you should fill out rig builder in your Profile at top left of screen . So we can see what you've got

If you do, don't forget to add the rigbuilder list to your sig after.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *alancsalt*
> 
> If you do, don't forget to add the rigbuilder list to your sig after.


Good to know that you're there to back me up ..........


----------



## UZ7

Quote:


> Originally Posted by *BroJin*
> 
> 
> This a good score?
> 
> i3570k 4.5


I like your background


----------



## Forceman

Quote:


> Originally Posted by *Lord Xeb*
> 
> Anyone having issues with their cards having a black screen after being idle for a while?


Yes, it's happening to quite a few people. No real fix for it at this point.
Quote:


> Originally Posted by *Goride*
> 
> I do not fully understand how vdroop works, but I think I understand the general concepts of it. What I do not get though, is if I have the offset set to +100 in MSI AB, and the GPU usage is at or near 100% pretty much at all times, the most I have seen GPU-Z report was 1.316v (max), but usually it is still much lower than that. Why is it not averaging near 1.3v, when at 100% usage (according to MSI AB). I would not do this on air, but can you set it higher than +100 somehow, or else how are the WCers doing it?
> 
> (Also, for what it is worth, the reference cooler actually seems to cool decently (at least the core, the vram1 was still pretty high), it is just ungodly loud when you turn it up to 70% or so)


Generally speaking, the higher the load (and hence power draw) the lower the voltage will be - that's what Vdroop is. So under heavy load the voltage will drop, and then when the load eases up the voltage will temporarily spike. So the 1.31V you are seeing is when the card is transitioning load states, while the average voltage is lower.
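The effect described above can be sketched with a simple loadline model (illustrative only; the 0.5 mOhm loadline resistance and the current figures below are assumptions for the sake of the sketch, not measured values for Hawaii's VRM):

```python
# Toy loadline (vdroop) model: delivered voltage sags linearly with load
# current, so a card set to ~1.4 V sits near that value at light load but
# reads ~1.3 V under a steady 100% load, with brief spikes during the
# load transitions that a "max voltage" logger will catch.
def delivered_voltage(v_set, load_amps, r_loadline_ohms=0.0005):
    """Voltage at the GPU for a given load current (V = Vset - I * R)."""
    return v_set - load_amps * r_loadline_ohms

v_set = 1.40                                # requested voltage (illustrative)
light_load = delivered_voltage(v_set, 20)   # transition / near-idle: 1.39 V
full_load = delivered_voltage(v_set, 200)   # sustained bench load: 1.30 V

print(round(light_load, 2), round(full_load, 2))
```

A tool that logs the maximum therefore reports the transition spikes, not the sustained load voltage, which is why a GPU-Z max reading sits above the typical VDDC seen during a bench.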


----------



## Roy360

Quote:


> Originally Posted by *Forceman*
> 
> Yes, it's happening to quite a few people. No real fix for it at this point.
> Generally speaking, the higher the load (and hence power draw) the lower the voltage will be - that's what Vdroop is. So under heavy load the voltage will drop, and then when the load eases up the voltage will temporarily spike. So the 1.31V you are seeing is when the card is transitioning load states, while the average voltage is lower.


I get this on my ASUS card whenever I try doing something while CGminer is starting.

But NCIX won't take it back, because they can't mimic the problem.


----------



## DeadlyDNA

I prefer to remove the cooler and treat it like my coffee,
Quote:


> Originally Posted by *Roy360*
> 
> I get this on my ASUS card whenever try doing something while CGminer is starting.
> 
> But NCIX won't take it back, because they can't mimic the problem.


Is that the flannel panel edition? Sorry, bad joke, but that sucks. Does it ever come back on its own, or does it just crash or stay like that until reboot?


----------



## Lord Xeb

Card is crashing RMA time.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Roy360*
> 
> I get this on my ASUS card whenever try doing something while CGminer is starting.
> 
> But NCIX won't take it back, because they can't mimic the problem.


Lower your core and memory overclock, or flash a different vBIOS


----------



## Roy360

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Lower overclock and mem or flash different vbios


The card is actually being underclocked: 947/1250, down from 1000/1260, at stock voltage. The card pulls around 183W according to GPU-Z, whereas my reference cards pull around 170W.

Where can I get a different BIOS for this card? I am already using the new update from ASUS.
Quote:


> Originally Posted by *DeadlyDNA*
> 
> I prefer to remove the cooler and treat it like my coffee,
> is that the flannel panel edition? Sorry bad joke but that sucks. It never comes around just crashes or stays like that until reboot?


It stays like that until I hit reset. If I don't hit reset, the screen will flash black, change to a new pattern, and repeat.

What's odd is, if I start CGminer and give it a few seconds before using the PC, there is no problem. However, if I try doing something while CGminer is still starting, then I get this. (Which is why I'm having a hard time demonstrating it to NCIX; I don't know if they consider this a defect.)

But this is the only ASUS card that does this, and it's my only ASUS card on the new vBIOS. This has been reproduced on an MSI G45 board, an ASUS B85 and an ASUS Formula V.


----------



## BroJin

Quote:


> Originally Posted by *UZ7*
> 
> I like your background


hahahahaha thanks











Enjoy


----------



## Matt-Matt

Quote:


> Originally Posted by *Roy360*
> 
> I've been thinking about what rdr09 said about using a better TIM.
> 
> Is CLU Ultra safe for GPUs? The heatsink is copper, so it shouldn't erode anything, should it? (I already have some, which is why I ask)
> 
> Otherwise I guess Arctic Silver 5 or Prolimatech would be the way to go?
> 
> EDIT: I think I just answered my own question: http://www.hardwaresecrets.com/article/Thermal-Compound-Roundup-January-2012/1468/5
> 
> I don't know why people say they have seen a 20 degree drop when using CLU, but according to that review almost all aftermarket TIMs are within 2-3 degrees of each other. I guess I'll stick with whatever is lying around. (expired N1-H1)


I used CLU on one 7950; I only had enough for one card, so it went on the top card since that one naturally runs hotter. Just make sure you only get it on the core and nothing else and you'll be fine! From what I remember I only got about a 4C drop, but again, it was the top card, so that wasn't too bad.


----------



## kizwan

Quote:


> Originally Posted by *BroJin*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> This a good score?
> 
> i3570k 4.5


Yes, that is a good score, in the range of what you should get with one card at those clocks.


----------



## centvalny

Cold test



http://imgur.com/OUsKRLT


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Roy360*
> 
> card is being underclocked actually.
> 
> 947/1250 from 1000/1260. Stock voltage. Card pulls in around 183W according to GPUZ, whereas my reference cards pull around 170
> 
> where can I get a different bios for this card? I am already using the new update from ASUS
> stays like that until I hit reset. If I don't hit reset, the screen will flash black and change to a new pattern and repeat.
> 
> What's odd is, if I start CGminer, and give it a few seconds, and then start using the PC, there is no problem. However, if I try doing something like CGminer is still working, then I get this. (Which is why I'm having a hard time demonstrating this to NCIX, I don't know if they consider this a defect.)
> 
> But this is only only ASUS card that does this, this my only ASUS card on the new vbios. This has been repeated on a MSI G45 board, ASUS B85 and ASUS Formula V


The PT1T unlocked-voltage BIOS should do the trick with GPU Tweak


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *centvalny*
> 
> Update FS with h2o before go cold
> 
> 
> 
> http://imgur.com/Spo3OVF


Quote:


> Originally Posted by *centvalny*
> 
> Cold test
> 
> 
> 
> http://imgur.com/OUsKRLT


AWESOME scores mate


----------



## darkelixa

I got a Gigabyte R9 290 OC Windforce last week with a new Corsair C750M PSU, and when I play Elder Scrolls Online for about an hour the PC just randomly shuts down and then powers back up. This would be a PSU issue, would it not, as my old FSP PSU does not have that problem? For now I have pulled the Corsair PSU out and am testing with the FSP one, and so far no more random shutdowns, but only time will tell I guess. I also flashed the Gigabyte GPU, as the new BIOS description said it improved stability.


----------



## passinos

Quote:


> Originally Posted by *passinos*
> 
> Hey gang, just got my 2x 290x CF with Koolance Blocks. Upgrade from 7970 CF.
> 
> These cards weigh like 3-4 lbs! I am sure its the heavy blocks, but man they are heavy.
> 
> Should probably get some back plates to help stiffen them up.
> 
> Now to see if my 860 platinum PSU is going to melt when I OC these bad boys.


Ordered 3/8 barb and Crossfire water bridge adapter. Got to wait another week.

Anyone think the backplate will help support this 4lb card?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *passinos*
> 
> Ordered 3/8 barb and Crossfire water bridge adapter. Got to wait another week.
> 
> Anyone think the backplate will help support this 4lb card?


I would definitely fit a backplate; those blocks aren't light


----------



## francesco1

Hello everyone I am new to this forum!!!!
I live in Italy and I'm 16 so I do not have much experience , I have a r9 msi 290x bios with custom tri- x with memories hynix , I flashed the bios because I have an arctic accelero extreme III and saw that my card was suffering coil whine and frequencies unpacked I flashed but does not seem to have solved nothing because the coil whine is still there, though now the default frequencies are 1040/1300 . The problem is that with battlefield 4 (64bit ) and in general in all the games ; with vsync goes a bit jerky and the average fps are below 60 . use the drivers that are the famous 13:12 and controlling the frequencies of AB and turns the fans are crazy , like the use of gpu.i have formatted and uninstalled in cleanly drivers but nothing, ( if i do not use the vsync the picture does not look good ) please help me on this computer because I put all my savings , many have told me to replace the video card T.T, THANKS TO ALL FOR THE HELP!!


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *francesco1*
> 
> Hello everyone I am new to this forum!!!!
> I live in Italy and I'm 16 so I do not have much experience , I have a r9 msi 290x bios with custom tri- x with memories hynix , I flashed the bios because I have an arctic accelero extreme III and saw that my card was suffering coil whine and frequencies unpacked I flashed but does not seem to have solved nothing because the coil whine is still there, though now the default frequencies are 1040/1300 . The problem is that with battlefield 4 (64bit ) and in general in all the games ; with vsync goes a bit jerky and the average fps are below 60 . use the drivers that are the famous 13:12 and controlling the frequencies of AB and turns the fans are crazy , like the use of gpu.ho formatted and disistallato in cleanly drivers but nothing, ( if you do not use the vsync the picture does not look good ) please help me on this computer because I put all my savings , many have told me to replace the video card T.T, THANKS TO ALL FOR THE HELP!!


Have you heard of the Asus 290X PT1T / PT1 BIOS? Flash that and run it with GPU Tweak for fully unlocked voltage. You know how to flash, so give it a shot








And the coil whine will dissipate after you run the card in a bit too








You can also use AB 19 to monitor your OSD and run GPU Tweak to clock


----------



## Roy360

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I prefer to remove the cooler and treat it like my coffee,
> is that the flannel panel edition? Sorry bad joke but that sucks. It never comes around just crashes or stays like that until reboot?


Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> PT1T unlocked / volts bios should do the trick with GPU Tweek


I'm a little confused, but you're basically telling me to increase voltage? Or is it a BIOS fault? I can currently adjust two types of voltage in MSI Afterburner: AUX and Core.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Roy360*
> 
> I'm a little confused, but your basically telling me to increase voltage? Or is it a bios fault? I can currently adjust two types of voltages in MSI afterburner. AUX and Core.


What I'm suggesting is, if all else fails after upping vcore, flash the vBIOS after trying the 13.11 driver. That seemed to work okay for me when running TRI; now I'm running the new 14.4 WHQL driver, which runs real well too.
Also make sure your general settings are correct in AB......
And why would you underclock your cards? Run them at stock settings. That could help you too


----------



## francesco1

Thank you so much for replying. The problem is that I don't want to mess anything up, and there isn't a guide on how to flash the Asus PT1T/PT1. Could you also give me a link to the BIOS in question? And how do I overvolt, and by how much should I increase? Unfortunately I do not know much.


----------



## darkelixa

Is it normal for Gigabyte to have all the xxx- after the BIOS revision?


----------



## francesco1

I read that the custom PT1T/PT1/PT3 ROMs are very dangerous. I cannot understand why I should flash again; rather, what do these BIOSes you recommended actually do?
Are the 13.12 drivers fine? If there is a guide or a video on how to flash, I would be grateful!


----------



## Mercy4You

14.4 WHQL just came out, this is the one I'm gonna install now


----------



## heroxoot

I'm on 14.4 and my dx11 performance is the same as 13.12 and 14.2 with my OC. Still have to try Mantle.


----------



## lawson67

The new stable 14.4 drivers that came out today are working really nice and even seem smoother in Heaven Benchmark 4.0 than the 13.12 drivers on my 2X R9 290 crossfire set up!


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *francesco1*
> 
> thank you so much for letting me respond. the problem is that I would not do crap there is not a guide on how to flash it with asus pt1t/pt1 and if you give me the link also in the bios issue??, and how do I overvolt and how much increase? unfortunately I do not know much.
> Quote:
> 
> 
> 
> Originally Posted by *francesco1*
> 
> I read that the custom rom pt1t/pt1/pt3 are very dangerous. I can not understand why should I flash it again, or rather, what do these bios that thou hast recommended?
> 13:12 drivers are fine? if there is a guide or a video on how to flash it I would be grateful!

Yes there is a guide, BUT I would learn how to overclock first, before flashing BIOSes and the like, until you have more confidence / experience
The first page of this club thread should help you, and read up on the other thread on how to flash / unlock your card

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/0_20
http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/0_20

Most importantly is to have a fun experience , don't let it do your head in


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *lawson67*
> 
> The new stable 14.4 drivers that came out today are working really nice and even seem smoother in Heaven Benchmark 4.0 than the 13.12 drivers on my 2X R9 290 crossfire set up!


I haven't run Heaven 4.0 in over 18 months. This 14.4 WHQL is a good reason to bench it, I reckon








It's nearly 2am here in Brisvegas, I'm off. Catch youse ronski


----------



## Mercy4You

Quote:


> Originally Posted by *heroxoot*
> 
> I'm on 14.4 and my dx11 performance is the same as 13.12 and 14.2 with my OC. Still have to try Mantle.


I'd be happy if we lose the DirectX error loads of people now get in BF4...


----------



## francesco1

Thanks for your help, but I'm Italian and my English is too bad; I need a simple guide!!
For example, if you can:
1- ...
2- ...
3- ...
4- ...
5- ...
etc.

My big question is: can I flash my Tri-X BIOS to PT1T or PT1?? Which BIOS do I need to use, PT1 or PT1T??


----------



## disintegratorx

Has AMD launched something recently like an SoC? Anyone have any news on that? I know they've just launched a different kind of APU for the FM1 socket with their Steamroller architecture, with ARM processors I believe. I think that would be nice as long as their integrated technologies work well.


----------



## MojoW

Quote:


> Originally Posted by *francesco1*
> 
> thank for your help but i'm italian my english is to bad i need a simple guide!!
> for ex:
> if you can.
> 1-...
> 2- ...
> 3-...
> 4-....
> 5-....
> ecc.
> 
> my big question is, can i flash my bios tri-x to pt1t or pt1?? what bios i need to use pt1 or pt1t??


Yes you can, but that BIOS doesn't have power savings; it runs at 3D clocks and volts 24/7.
So you'd better watch your temps, and your card will degrade faster, because it's made for LN2 overclockers.
You will also need a specially made Asus OC utility to make full use of it; then you will have core voltage available up to 2.0v.
It's not made for air cooling, so I would use a normal BIOS if I were you, unless you're absolutely sure of what you're doing.

Why do you need a different bios??


----------



## francesco1

So what do you advise me to do, since with the Arctic Accelero III I have unstable frequencies and fps? Do you advise me to flash another BIOS, the Uber mode one?


----------



## MojoW

Quote:


> Originally Posted by *francesco1*
> 
> so what do you advise me to do since I have an arctic accelero iii fps with frequencies and unstable? you advise me to flash another bios, the uber mode?


What are your temps while gaming or benching?
What kind of instabilities?
What driver are you using?
Did you install the cooler yourself?
Did you have a different graphics card installed before this one?


----------



## francesco1

I don't know, but my problem is that when I start a game like BF4 or The Witcher 2 or another... the coil whine starts, the frequencies bounce between 800 and 1040 (the default Tri-X clock), and the fps are unstable and jerky.


----------



## francesco1

The drivers I have are Catalyst 13.12, temperatures are around 55 to 60C under full load in BF4, and I installed the cooler myself, but I did not damage anything; I took precise care.


----------



## MojoW

Quote:


> Originally Posted by *francesco1*
> 
> i don't now but my problem is when i start game like bf4 or the witcher 2 or another...the coil whine start the frequencies ranging from 800 to 1040 (default clock tri-x) and unstable fps and jerky.


Sounds like driver problems or overheating to me.
You need to check things so we can help you.
If you use MSI Afterburner, check the temps with it; if you don't have that program, download GPU-Z and keep it running while you play BF4, and in the sensor tab you can check all the temps (core, vrm1 and vrm2).
Give us the max temps you see.


----------



## MojoW

Quote:


> Originally Posted by *francesco1*
> 
> the driver whic i have are catalyst 13.12, temperatures are around 55 to 60 in full with BF4, the cooler I installed it myself, but I do not damaged nothing I put a precise care.


And the vrm1 and 2 temps?


----------



## MojoW

So it was already throttling and jerky before you installed the cooler?


----------



## francesco1

I can't tell you right now because I formatted everything, but I was using GPU-Z and the maximum temperatures were 60-65C on VRM1 and 57-58C on VRM2; in Heaven, 69-73C VRM1 and 64-68C VRM2. On the core temperatures I would not want to talk nonsense, but they were about 30-40C!


----------



## francesco1

Partly. I'm sure the coil whine started after installing the new cooler; the instability I don't know about. But it is incredible: I'm 16, I put all my savings into assembling this PC, and I have to play BF4 jerky with vsync, with the video card making that ******* sound!


----------



## MojoW

Quote:


> Originally Posted by *francesco1*
> 
> partly, I'm sure the coil whine is started after installing the new cooling, while the instability do not know. But it is incredible I have 16 years with all my savings, I assembled a PC and having to play with vsync BF4 jerky and the sound of ******* video card!


The cooler is quieter, so the coil whine is more obvious; that's what I think. (Most people start hearing coil whine with silent or less noisy cooling.)
Try running DDU to uninstall your drivers.
Then reinstall 13.12 or 14.4, as those have working power limits.


----------



## chiknnwatrmln

I didn't even know my card made coil whine until I replaced the cooler. All cards whine to some extent; you just have to deal with it.

As for the downclocking, have you tried increasing the power limit?


----------



## MojoW

But the throttling happens at stock, so it's strange that you would need that on stock clocks.
And the jerky gameplay leads me to think there's more wrong than just the PL.


----------



## francesco1

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I didn't even know my card made coil whine until I replaced the cooler. All cards wine to some extent, you just have to deal with it.
> 
> As for the downclocking, have you tried increasing the power limit?


no why?


----------



## francesco1

Quote:


> Originally Posted by *MojoW*
> 
> But the throttling happens @ stock so strange that you would need that on stock clocks.
> And the jerky gameplay leads me to think there's more wrong then just the PL.


My frequencies are 1040/1300, the normal Tri-X clocks (Uber mode switch).
And what if I re-flash the Tri-X BIOS over the Tri-X???


----------



## MojoW

Quote:


> Originally Posted by *francesco1*
> 
> my frequencies is 1040/1300 the normal of tri-x (uber mode switch).
> and if i reflash the bios tri-x to tri-x???


You can try to flash it, but the chance that the flash went wrong is low.
You can always flip the switch and test the original BIOS, so you know for sure whether it's a problem with the Tri-X BIOS.

Did you reinstall the drivers after you flashed your card?
You always need to clean out the driver before you flash your card, and reinstall the driver afterwards.


----------



## Rozayz

Add to owners list please. http://i.imgur.com/fSA15wE.png - ASUS, reference/stock cooled at the moment. Getting waterblock soon in prep for my finalized loop.

Also, need a bit of help diagnosing a shut down issue I'm having with my GPU







posted at incorrect location - here's the thread.

Cheers.


----------



## francesco1

Quote:


> Originally Posted by *MojoW*
> 
> You can try to flash it but the chance it went wrong is low.
> You can always flip the switch and test the original bios, so you know for sure it's a problem with the tri-x bios.
> 
> Did you reinstall the drivers after you flashed your card?
> You always need to clean out the driver before you flash your card ad reinstall driver afterwards.


But doesn't the quiet mode cut the frequency??


----------



## MojoW

Quote:


> Originally Posted by *francesco1*
> 
> but the quiet mode cut the frequency no ??


No, not at stock; it's just a difference in fan speed, and you have good cooling, so it's no problem.
But if you change the BIOS switch, reinstall the drivers or you will have problems.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *francesco1*
> 
> no why?


Because power limit causes downclocking. Set it to +50 via MSI AB or some other overclocking tool.


----------



## francesco1

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Because power limit causes downclocking. Set it to +50 via MSI AB or some other overclocking tool.


OK, so you're telling me that with the Tri-X BIOS even the default frequencies need pwm?


----------



## francesco1

Quote:


> Originally Posted by *MojoW*
> 
> No not @ stock and it's just a difference in fan speed and you have good cooling so it's no problem.
> But if u change the bios switch reinstall the drivers or you will have problems.


No problem; at the moment there isn't a driver on the PC. But when I re-switch to Uber mode, I need to uninstall and reinstall the driver, right??


----------



## Jflisk

Quote:


> Originally Posted by *Roy360*
> 
> I get this on my ASUS card whenever try doing something while CGminer is starting.
> 
> But NCIX won't take it back, because they can't mimic the problem.


What drivers are you using? Do not use 14.4 or you will get the results you listed.


----------



## francesco1

Quote:


> Originally Posted by *Jflisk*
> 
> What drivers are you using? Do not use 14.4 or you will get the results you listed.


I must install a new driver; do you advise 13.12 or 14.4 on an MSI R9 290X?


----------



## Norse

Got a bit of weirdness running the 14.4 drivers: if I am away from the PC for a while with all 3 monitors off, anything on the right one gets shifted over to the left one, anything on the left stays where it was, and the same with the central monitor.

Left monitor = DVI, right = DisplayPort-to-VGA cable, central = HDMI


----------



## chiknnwatrmln

Quote:


> Originally Posted by *francesco1*
> 
> OK, so you're telling me that with the Tri-X BIOS even the default frequencies need PWM?


What? PWM is pulse width modulation which controls fan speed....

You need to set the card's power limit as high as it will go to prevent throttling. That might be what's causing your card to throttle, no way to tell until you try it.


----------



## Jflisk

Quote:


> Originally Posted by *francesco1*
> 
> I must install a new driver; do you advise 13.12 or 14.4 on an MSI R9 290X?


I tried the 14.4 drivers and got a crash with cgminer just like you showed us. Try anything below the 14.4 drivers: 14.3 v1 beta or lower. Thanks


----------



## chiknnwatrmln

Just got and installed my EK Reinforcement Bracket...

Not a huge help as my card still bends a bit, but it looks nice. Not bad for $8. Thanks to whoever suggested it.

I need to clean my case window, there's a thin layer of dust on it.


----------



## straKK

Well I tried to add more volts to MSI Afterburner via the command line args method in the 1st or 2nd sticky post but whenever I run the bat file, a command prompt window briefly pops up and then nothing happens. Anyone know how to make it work?


----------



## Jflisk

Quote:


> Originally Posted by *straKK*
> 
> Well I tried to add more volts to MSI Afterburner via the command line args method in the 1st or 2nd sticky post but whenever I run the bat file, a command prompt window briefly pops up and then nothing happens. Anyone know how to make it work?


I might have missed something what are you trying to get to work?


----------



## straKK

I'm trying to offset the core voltage.
Quote:


> Just use /wi4,30,8d,10 for 100 mV. Each step of the offset is 6.25 mV, and the value is hexadecimal: 10 hex = 16 decimal, so 16 * 6.25 = 100 mV. For 50 mV you need 8. For 200 mV you need 20 (20 hex = 32 decimal, so 32 * 6.25 = 200 mV).
> 
> The easy way to make changes:
> 
> Create a txt on the desktop. Write
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi4,30,8d,10
> 
> and then save it as a .bat file. Every time you start this bat file, MSI Afterburner will start with +100 mV.
> 
> For 50 mV: 8
> For 100 mV: 10
> For 125 mV: 14
> For 150 mV: 18
> For 175 mV: 1C
> For 200 mV: 20
> 
> I wouldn't go over this point because:
> 1) You are close to leaving the sweet spot of the reference PCB's VRM efficiency
> 2) These commands add 200 mV on top of the 100 mV offset through the AB GUI. That means 300 mV
> 
> By default the /wi command applies to the current GPU only. So if you have 2 or more GPUs you must use the /sg command, which means the command line looks something like this:
> ex: MsiAfterburner.exe /sg0 /wi6,30,8d,10 /sg1 /wi6,30,8d,10


This is what I'm trying to get to work; it's toward the bottom of the first post of the thread. I've tried making a .bat file and putting the args in a shortcut, but when I click the bat file or the modified shortcut, MSI Afterburner doesn't start: a cmd window flashes on screen and the core voltage remains the same as before (checked with GPU-Z).
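For reference, the conversion behind the hex values in those quoted instructions (each step of the offset register is 6.25 mV, written in hexadecimal) can be sketched in a few lines. This is only an illustration of the arithmetic, not anything Afterburner ships with:

```python
def offset_arg(mv):
    # Each step of the /wi voltage offset is 6.25 mV, and the
    # argument is written in hexadecimal: 100 mV -> 16 steps -> "10".
    steps = int(mv / 6.25)
    return format(steps, "x")

# offset_arg(50) -> "8", offset_arg(100) -> "10", offset_arg(200) -> "20"
```

These match the table in the quote (125 mV -> 14, 175 mV -> 1c, and so on).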


----------



## rdr09

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Just got and installed my EK Reinforcement Bracket...
> 
> Not a huge help as my card still bends a bit, but it looks nice. Not bad for $8. Thanks to whoever suggested it.
> 
> I need to clean my case window, there's a thin layer of dust on it.


You ran your 290 at 1245 24/7?

nice rig, btw.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *rdr09*
> 
> you ran your 290 1245 24/7?
> 
> nice rig, btw.


Thanks, 1245 core is for benching; everyday is 1200 core and 1437 mem. I need +168 mV for that, around 1.3 V actual.

This card's been holding up to high voltages and clocks very nicely; it's been running without a hitch since October.


----------



## rdr09

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Thanks, 1245 core is for benching, everyday is 1200 core and 1437 mem. I need +168mv for that, around 1.3v actual.
> 
> This card's been holding up to high voltages and clocks very nicely, it's been running without a hitch since October.


1200 including BF4? I play at stock, so I don't know what my card is capable of.


----------



## bond32

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> What? PWM is pulse width modulation which controls fan speed....
> 
> You need to set the card's power limit as high as it will go to prevent throttling. That might be what's causing your card to throttle, no way to tell until you try it.


Voltage control on the 290 is pwm controlled...


----------



## chiknnwatrmln

Quote:


> Originally Posted by *rdr09*
> 
> 1200 including BF4? i play at stock so i don't know what my card is capable of.


Yep, I don't play BF4 too much any more, but when I was playing it every day it was just fine. Pretty good framerate, too. IIRC I was almost always above 60 FPS, albeit I turned a few settings down to high and the only AA I was running was 135% SS.
Skyrim really gives this card a run for its money; it's literally always at 100% usage in game and my framerate sometimes goes down into the low 30s.


----------



## rdr09

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Yep, I don't play BF4 too much any more, but when I was playing it every day it was just fine. Pretty good framerate, too. IIRC I was almost always above 60 FPS, albeit I turned a few settings down to high and the only AA I was running was 135% SS.
> Skyrim really gives this card a run for its money; it's literally always at 100% usage in game and my framerate sometimes goes down into the low 30s.


wow, thanks for the info.


----------



## cennis

Quote:


> Originally Posted by *straKK*
> 
> I'm trying to offset the core voltage.
> This is what I'm trying to get to work, it's towards the bottom of the first post of the thread. I've tried it via making a .bat file and via putting the args in a shortcut, but when I click the bat file or the modified shortcut, MSI Afterburner doesn't start, a cmd window flashes on screen, and the core voltage remains the same as before (checked using GPU-Z).


/wi6 is what you need instead of /wi4.
Click it AFTER applying settings in MSI AB, or it will be overwritten when you apply them in MSI AB.
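Combining this correction with the per-GPU /sg selector from the earlier instructions, the full argument string could be assembled as below. This is a hypothetical helper for illustration only; the /wi6,30,8d register and /sg selector formats come from the thread, everything else is an assumption:

```python
def afterburner_args(mv_offset, gpus=1):
    # Hypothetical helper: builds the Afterburner argument string
    # described in the thread. Each offset step is 6.25 mV, hex-encoded.
    steps = format(int(mv_offset / 6.25), "x")
    wi = "/wi6,30,8d," + steps
    if gpus == 1:
        return wi
    # With two or more GPUs, prefix the /wi command with a /sg
    # selector for each GPU in turn.
    return " ".join("/sg%d %s" % (i, wi) for i in range(gpus))

# afterburner_args(100)    -> "/wi6,30,8d,10"
# afterburner_args(100, 2) -> "/sg0 /wi6,30,8d,10 /sg1 /wi6,30,8d,10"
```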


----------



## Roy360

Quote:


> Originally Posted by *Jflisk*
> 
> What drivers are you using do not use the 14.4 you will get the results listed.


So far I have tried 13.11 WHQL and 13.12.

I've set the ASUS to +30 and the AUX to +12, but I'm still using my underclock. Power usage seems to jump between 140 and 200 W.


----------



## Jflisk

Quote:


> Originally Posted by *Roy360*
> 
> So far I have tried 13.11 WHQL and 13.12.
> 
> I've set the ASUS to +30 and the AUX to +12, but I'm still using my underclock. Power usage seems to jump between 140 and 200 W.


Can you show me your config for cgminer? If you have a 290 or 290X there are two different scripts: one for high clocks, 1100/1500, and one for 850/1250, so it depends on your card; it took me 2 days to figure out what was right for mine. Mine runs 850/1250 and the other 860/1350, getting 800+ KH/s per card. TC 32768, intensity 20, powertune 20, g 1 (never g 2; the cards don't like it). They say you can hit 900 KH/s; I have yet to see it myself. Mileage may vary, good luck.
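As a sketch, those settings map onto cgminer's scrypt flags roughly as follows. The pool URL and worker credentials are placeholders, and the 850/1250 clocks are the pair mentioned above; treat it as a starting point, not a tuned config:

```shell
# Hypothetical cgminer invocation matching the settings above:
# TC 32768, intensity 20, powertune 20, one GPU thread (-g 1),
# and 850/1250 core/memory clocks. Pool details are placeholders.
cgminer --scrypt \
  -o stratum+tcp://pool.example.com:3333 -u worker -p password \
  --thread-concurrency 32768 \
  --intensity 20 \
  --gpu-powertune 20 \
  --gpu-engine 850 --gpu-memclock 1250 \
  -g 1
```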


----------



## straKK

Quote:


> /wi6 is what you need instead of wi/4
> click it AFTER applying settings in msi AB or it will be overwritten when you apply msi AB.


Hey, thanks, that seems to have worked, although now for some reason my sliders look like the picture below, with 1200 core and 1550 mem all the way to the left. Also, do I need to enable Graphics OverDrive and change the power limit there as well, or do I just leave that alone?


----------



## cennis

Quote:


> Originally Posted by *straKK*
> 
> Hey thanks, that seems to have worked, although now for some reason my sliders look like the picture below, with 1200 core and 1550 mem all the way to the left. Also, do I need to enable Graphics OverDrive and change the power limit there as well? Or do I just leave that alone?


Click reset, then restart the computer with stock settings.
Open MSI AB (see if your sliders are reset),
then apply clocks and power limit,
then click the .bat.

Do not check OverDrive.


----------



## straKK

Also is there a way to raise the temperature threshold at which the card starts to throttle? I'm using a PowerColor PCS+ 290x which is throttling at 78 degrees Celsius. The target temperature slider seems to be missing from my CCC.


----------



## cennis

Quote:


> Originally Posted by *straKK*
> 
> Also is there a way to raise the temperature threshold at which the card starts to throttle? I'm using a PowerColor PCS+ 290x which is throttling at 78 degrees Celsius. The target temperature slider seems to be missing from my CCC.


I've never heard of that, but if you are just testing, you can probably blast the fans higher.
Long term, probably flash a different BIOS; most throttle at 95.


----------



## straKK

Quote:


> Long term probably flash a different bios, most throttle at 95


Well, at least I think it's throttling. Every time the card gets to 78, I notice the framerate in OCCT starts dropping and rising sharply (i.e. 223, 105, 178, 223, 94, 223), and looking at MSI Afterburner, the temperature peaks at 78 and goes back down a little, corresponding with the framerate, and keeps doing that until I finally shut the test off. The peaks and troughs of the framerate, temperature, and core clock are all very well correlated, so I think it's throttling. I did try to blast the fan higher, but to no avail, as it's already running at 87% during the test, and 100% didn't do much of anything for me other than make more noise.

If you know of another BIOS compatible with this card with a 95-degree thermal limit, I'm open to suggestions. I am kind of leery of bricking my card and voiding my warranty, though.


----------



## Arizonian

Quote:


> Originally Posted by *Tmfs*
> 
> Purchased a reference XFX 290 off ebay that ended up being a miner special black screen edition. Big surprise huh? Well thankfully the guys over at XFX setup an RMA with zero hassle in which I received a new sealed Double D 290 in return.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> VRM1 gets a little toasty and it has Elpida mem but I honestly can't even begin to complain given the circumstances.
> 
> I know this has been said before but prospective buyers beware of the flood of miner stock being offloaded currently. I was *extremely* lucky that XFX customer service was amazing and took care of me.
> 
> 1100/1400 stock volts
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *royalkilla408*
> 
> Also getting that bug that won't let my monitor turn back on after idle when they turn off. Glad to hear I'm not the only one. Needs to be fixed ASAP. First time going with AMD since the 90s. What's the site to report bugs to AMD? Thanks.


*
AMD Issue Reporting Form for AMD Catalyst™ 14.4 for Desktop Radeon™ Products*

Quote:


> Originally Posted by *B NEGATIVE*
> 
> One for the list.
> 
> Sapphire 290 with AquaC block.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Prismatic paint flips from black...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> To rainbow.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Just add light.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The missing bolt was in the paint rack,its back in place now.


Congrats - added
















Sweet









Quote:


> Originally Posted by *Terrere*
> 
> GPU-z Validation of my Gigabyte Windforce r9 290.
> Please add me to the list


Congrats - added









Quote:


> Originally Posted by *Lord Xeb*
> 
> Card is crashing RMA time.


Sorry to hear that. Let us know how the RMA process goes please.

Quote:


> Originally Posted by *Rozayz*
> 
> Add to owners list please. http://i.imgur.com/fSA15wE.png - ASUS, reference/stock cooled at the moment. Getting waterblock soon in prep for my finalized loop.
> 
> Also, need a bit of help diagnosing a shut down issue I'm having with my GPU
> 
> 
> 
> 
> 
> 
> 
> posted at incorrect location - here's the thread.
> 
> Cheers.


Congrats - added









Hope you get that figured out.

*Side note: /thread cleaned.*

A reminder of the no-profanity rule: if you quote someone who used profanity, expect your reply to be deleted, along with all the hard work of the info you provided for that member. It's best not to reply to posts containing profanity and to give moderation time to remove them. Keep in mind that all subsequent replies were removed as well, to keep the discussion from becoming fragmented and not making sense.

Thank you everyone. Carry on.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Thanks, 1245 core is for benching, everyday is 1200 core and 1437 mem. I need +168mv for that, around 1.3v actual.
> 
> This card's been holding up to high voltages and clocks very nicely, it's been running without a hitch since October.


Sounds like you have a good one
Quote:


> Originally Posted by *rdr09*
> 
> 1200 including BF4? i play at stock so i don't know what my card is capable of.


I play mine at stock clocks as well; only overclock to benchmark, blah blah blah.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Sounds like you have a good one
> I play mine at stock clocks as well . Only overclock to benchmark bla , bla , bla


Can't complain.

When I go CF 290s I'm probably gonna run them at like 1100 core, as I won't need the extra power. This is still a giant "if" though; a new 290 and a waterblock are expensive, but a car and college are quite a bit more...


----------



## Roy360

Followed another member's advice and flashed my ASUS DC card with the reference XFX BIOS. The card works properly now. I actually see improvements when I overclock.

Now to flash Slit's custom BIOS over, and I'll be able to switch between Gaming and Mining mode at the flip of 3 buttons.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Can't complain.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> When I go CF 290's I'm probably gonna run them at like 1100 core as I won't need the extra power. This is still a giant if though, a new 290 and a waterblock are expensive but a car and college is quite a bit more...


The money I've spent on my tri 290s, waterblocked, IS a car.
Quote:


> Originally Posted by *Roy360*
> 
> Followed another member's advice and flashed my ASUS DC card with the reference XFX BIOS. The card works properly now. I actually see improvements when I overclock.
> 
> Now to flash Slit's custom BIOS over, and I'll be able to switch between Gaming and Mining mode at the flip of 3 buttons.


I'm glad someone could help you.


----------



## cennis

Quote:


> Originally Posted by *Roy360*
> 
> Followed another member's advice and flashed my ASUS DC card with the reference XFX BIOS. The card works properly now. I actually see improvements when I overclock.
> 
> Now to flash Slit's custom BIOS over, and I'll be able to switch between Gaming and Mining mode at the flip of 3 buttons.


Great. Not sure if you even saw my post, as it was deleted soon after.
What are your voltage offset and OC? dinujan:thumb:


----------



## heroxoot

I've been trying my hardest to understand this 290X, and it seems like it underperforms my old 7970. At 1100/1250 I score 1199 points in Heaven with everything on extreme at 1080p. In BF4 I get 80-90 FPS, with dips down to 75, but 80-90 is mostly where the framerate sits. I keep hearing my CPU is the bottleneck, but why wasn't my 7970 limited this hard? I also watched my power usage in GPU-Z, and the highest I see is 220 W, while it shows 180-190 W most commonly.

So is my GPU malfunctioning, or is having it on this motherboard or with this CPU dragging it down? Money is not an option, so no CPU upgrades are happening any time soon.


----------



## Dasboogieman

Quote:


> Originally Posted by *heroxoot*
> 
> I've been trying my hardest understanding this 290X and it all seems like it under performs my old 7970. 1100/1250 and I score 1199 points on Heaven with extreme everything 1080p. In BF4 I get 80 - 90fps, dips down to 75, but 80 - 90 is the area it is mostly in for frame rate. I keep hearing my CPU is the bottleneck but why was my 7970 not being limited this hard? I also I watched my power usage on GPUZ and the highest power usage I see is 220w, while its showing 180 - 190w most commonly.
> 
> So is my GPU malfunctioning or is it just so great having it on my motherboard or with this CPU is dragging it down? Money is not an option so no CPU upgrades are happening any time soon.


What is your GPU clock speed during the Heaven test?
The 290X/290 series is notorious for downclocking the core to save power when 60 FPS can be achieved easily (i.e. it likes using a lower clock speed with higher GPU utilization).

Have you tried using RadeonPro? Create a custom profile for Unigine Heaven and click the OverDrive tab to force maximum clock speed at all times.
If the problem is inconsistent clock speed, then this will fix your issue. If you wish to do voltage-assisted overclocking AND force maximum clocks, you will need BIOS mods.


----------



## heroxoot

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> I've been trying my hardest understanding this 290X and it all seems like it under performs my old 7970. 1100/1250 and I score 1199 points on Heaven with extreme everything 1080p. In BF4 I get 80 - 90fps, dips down to 75, but 80 - 90 is the area it is mostly in for frame rate. I keep hearing my CPU is the bottleneck but why was my 7970 not being limited this hard? I also I watched my power usage on GPUZ and the highest power usage I see is 220w, while its showing 180 - 190w most commonly.
> 
> So is my GPU malfunctioning or is it just so great having it on my motherboard or with this CPU is dragging it down? Money is not an option so no CPU upgrades are happening any time soon.
> 
> 
> 
> What is your GPU clockspeed during the heaven test?
> The 290X/290 series seems to be notorious in downclocking the core to save power when 60FPS can be achieved easily (i.e. it likes using a lower clockspeed with a higher GPU utilization).
> 
> Have you tried using RadeonPro? create a custom profile for Unigine Heaven and clicking on the OverDrive tab to force Maximum clockspeed at all times.
> If the problem is inconsistent clockspeed then this will fix your issue. If you wish to do voltage assisted overclocking AND forcing maximum clocks, you will need BIOS mods.

It never drops from the 1100/1250 clocks. The power usage is always 150 W or more, never lower, with 220 W the highest I've noticed. GPU utilization drops to 70% in one or two areas, but it's only for a second and it goes right back to 90%+.

I'm really confused by this GPU. I do see a performance difference between stock and my OC, about 5 FPS actually. So it is overclocking, but the way the power readings behave has me very confused.


----------



## Roy360

Quote:


> Originally Posted by *cennis*
> 
> great. not sure if you saw my post even as it was deleted soon after.
> What are your voltage offset and OC? dinujan:thumb:


I'm using 975/1250 at 1.109-1.113 V (GPU-Z), and apparently drawing 7 A (a side effect of using a reference BIOS?).
I don't want high clocks while mining, so for now I just want to find the lowest stable voltage. Once school is over (Monday), I will try to find the best ratio between OC and voltage.

Hopefully my fujis will come in soon.


----------



## heroxoot

Quote:


> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> I've been trying my hardest understanding this 290X and it all seems like it under performs my old 7970. 1100/1250 and I score 1199 points on Heaven with extreme everything 1080p. In BF4 I get 80 - 90fps, dips down to 75, but 80 - 90 is the area it is mostly in for frame rate. I keep hearing my CPU is the bottleneck but why was my 7970 not being limited this hard? I also I watched my power usage on GPUZ and the highest power usage I see is 220w, while its showing 180 - 190w most commonly.
> 
> So is my GPU malfunctioning or is it just so great having it on my motherboard or with this CPU is dragging it down? Money is not an option so no CPU upgrades are happening any time soon.
> 
> 
> 
> What is your GPU clockspeed during the heaven test?
> The 290X/290 series seems to be notorious in downclocking the core to save power when 60FPS can be achieved easily (i.e. it likes using a lower clockspeed with a higher GPU utilization).
> 
> Have you tried using RadeonPro? create a custom profile for Unigine Heaven and clicking on the OverDrive tab to force Maximum clockspeed at all times.
> If the problem is inconsistent clockspeed then this will fix your issue. If you wish to do voltage assisted overclocking AND forcing maximum clocks, you will need BIOS mods.
> 
> 
> It never drops from 1100/1250 clocks. The power usage is always 150w or more never lower, while seeing 220w as the highest I noticed. GPU utilization drops to 70% in 1 or 2 areas but its for a second and it goes right back to 90%+.
> 
> I'm really confused by this GPU. I do see differences of performance between stock and my OC. About 5fps actually. So it is OCing but the way power is working I'm very confused.

I was wrong; I started seeing artifacts when the voltage is backed down to +0 mV, so it is doing something. And I remember my FPS being just below 40 in Heaven at 1080p extreme on my 7970, so this GPU is doing better in DX11. But why is my FPS so low in BF4? Mantle pushed 100+ FPS on my 7970, I swear. Did they goof something up somewhere down the line? I know setting framepacingmethod to 1 helps, but I don't see much of a difference, so I think that is purely for CrossFire.


----------



## Aussiejuggalo

So are the 14.4 drivers stable now? A bit confused because both the stable and beta drivers are 14.4.


----------



## cennis

Quote:


> Originally Posted by *cennis*
> 
> great. not sure if you saw my post even as it was deleted soon after.


Quote:


> Originally Posted by *Roy360*
> 
> I'm using 975/1250 with 1.109-1.113V (GPUZ), and apparently drawing 7A (a sideaffect from using a reference bios?)
> I don't want to have high clocks while mining, so for now I just want to find the lowest stable voltage. Once school is over (Monday), I will try finding the best ratio between OC and voltage.
> 
> Hopefully my fujis will come in soon.


Did your royalking 290 come yet? Messaged you on RFD, lol. Also let me know; I'll probably need some fuji.


----------



## the9quad

Quote:


> Originally Posted by *heroxoot*
> 
> I was wrong and I started seeing artifacts when the voltage is backed down to +0mV, so it is doing something. And I remember my FPS being just below 40fps on Heaven doing 1080p extreme on my 7970, so this GPU is doing better on DX11. But why is my FPS so low on BF4? Mantle pushed 100+ fps on my 7970 I swear. Did they goof on something somewhere down the line? I know the framepacingmethod thing being set to 1 helps but I don't see much of a difference so I think that is purely crossfire.


Well, they did patch it. Have you run your 7970 since the last BF4 update? If you still have it, try comparing them now.


----------



## heroxoot

Quote:


> Originally Posted by *the9quad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> I was wrong and I started seeing artifacts when the voltage is backed down to +0mV, so it is doing something. And I remember my FPS being just below 40fps on Heaven doing 1080p extreme on my 7970, so this GPU is doing better on DX11. But why is my FPS so low on BF4? Mantle pushed 100+ fps on my 7970 I swear. Did they goof on something somewhere down the line? I know the framepacingmethod thing being set to 1 helps but I don't see much of a difference so I think that is purely crossfire.
> 
> 
> 
> Well they did patch it, have you ran your 7970 since the last BF4 update? If you still have it, then try and compare them now.

I wish I could. This 290X was given to me in exchange for it. But the last time I used my 7970 was pre Naval Strike, on the 14.1 beta.

As far as I can tell my DX11 performance is much better than my 7970's, because I remember that with its OC, on the same benchmark (Heaven, Extreme, 1080p), it got about 39 FPS, almost 40. So the 290X getting 47 seems fine thus far. But my BF4 performance seems lacking in Mantle. Mantle is still better than DX11 in BF4, however.


----------



## burksdb

Hey, if you guys have time, can I possibly get some input on an issue I'm having?

http://www.overclock.net/t/1485284/290x-issues-multiple-drivers-and-a-fresh-install/0_100#post_22177328


----------



## HOMECINEMA-PC

Stuff I've been benching

Valley
*1080p*
HOMECINEMA-PC [email protected]@2428 TRI 290 [email protected] *155.5fps 6505*


*1440p*
HOMECINEMA-PC [email protected]@2428 TRI 290 [email protected] *115.9fps 4847*


FS
HOMECINEMA-PC [email protected]@2428 [email protected] *24545*

http://www.3dmark.com/fs/2068285

3D MK 11 Performance
HOMECINEMA-PC [email protected]@2428 TRI 290 WB [email protected] *31005*









http://www.3dmark.com/3dm11/8268154
Cracked 31k ..........









Catzilla *576p*
HOMECINEMA-PC [email protected]@2428 [email protected]

http://www.catzilla.com/showresult?lp=251423|*|Result
http://hwbot.org/submission/2538430_ NO1 W/R

Catzilla *720p*
HOMECINEMA-PC [email protected]@2428 [email protected]

http://www.catzilla.com/showresult?lp=251411|*|Result

Catzilla *1440p*
HOMECINEMA-PC [email protected]@2428 [email protected]

http://www.catzilla.com/showresult?lp=251414








My best TRI'S and scores


----------



## Mercy4You

Quote:


> Originally Posted by *heroxoot*
> 
> I was wrong and I started seeing artifacts when the voltage is backed down to +0mV, so it is doing something. And I remember my FPS being just below 40fps on Heaven doing 1080p extreme on my 7970, so this GPU is doing better on DX11. But why is my FPS so low on BF4? Mantle pushed 100+ fps on my 7970 I swear. Did they goof on something somewhere down the line? I know the framepacingmethod thing being set to 1 helps but I don't see much of a difference so I think that is purely crossfire.


Well, I can tell you this. I have an i7 4930K and I play BF4 a lot. I use a manual voltage for the CPU and change clocks between desktop, gaming, and YouTube. My lowest clock is 3.4 GHz; that's the stock clock with Turbo disabled.

For gaming I use 4.3 GHz, and that brings the load on the cores to 90% max.
Sometimes I start gaming at 3.4 GHz and forget to change the clock, but the GPU clocks are at 1100/1400.

My FPS in BF4 then drops from the normal 120+ to around 70-80 FPS...


----------



## Mercy4You

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> So are the 14.4 drivers stable now? bit confused coz the stable and beta drivers are 14.4


Yep, they are stable.

For 14.4 WHQL they stripped 20 MB from the 14.4 Beta...


----------



## King4x4

14.4 is really bad with Eyefinity and quadfire.

Good thing that crossfire is good with it, though.

Only trifire and quadfire left, then.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Mercy4You*
> 
> Yep, they are stable
> 
> 
> 
> 
> 
> 
> 
> For 14.4 WHQL they stripped 20MB from 14.4 Beta...


Ah cool, just wasn't sure.


----------



## directorJay

Hello everyone, I own a Sapphire 290X. Does anyone know if this card supports 10-bit color out of the box?

I plan to buy an ASUS PA249Q or PA279Q monitor, or, if anyone has a better-value 10-bit option, another monitor for professional use (Adobe Premiere Pro, After Effects, DaVinci Resolve).

Anyway, I tried searching around and I can't seem to find a definitive answer; some people even say you have to change some things in the registry, which left me a little confused.


----------



## Rozayz

So I upgraded (or downgraded, some might say) from a GTX 780 Ti to an R9 290X in preparation for two 4K monitors 4 days ago.

I purchased a reference ASUS card and am experiencing shutdown issues. Basically, the card gets hot and shuts down; however, these cards are designed to run at ~95 °C or thereabouts.

My question is whether anyone knows if the VRM on my MSI Z77 MPOWER could be affecting this and causing a thermal shutdown.

I have forced the fan speed to ~60% using MSI Afterburner and have found that leaving the fan speed set to 'Auto' causes a system shutdown within about 5 minutes of playing any video game. (The waterblock could not come sooner at this stage ;_; finalizing the WC loop and getting that bad boy installed ASAP!)

Rig is as follows:

Processor: Intel Core i7 3770K ‒ 5GHz
Video Card: ASUS Radeon R9 290X 4GB
Memory/RAM: 2x8GB G.Skill Trident X ‒ 2400 MHz
Motherboard: MSI Z77 MPOWER
Power Supply: Seasonic X-1050 ‒ 80+ GOLD
Case: Corsair Obsidian 900D
CPU Cooler: Corsair H100i w/ aftermarket SP120's
Storage: Western Digital Caviar Black ‒ 1TB ‒ x2
SSD: Intel 520 Series ‒ 180GB ‒ x2
Display Samsung U28D590D 28inch 4K UHD ‒ x2
Keyboard: White Keycool 84 Cherry MX Red
Mouse: Corsair Vengeance M65
Headset: Sennheiser HD598
Microphone: Audio Technica AT2035

Bullet points added from incorrectly located thread:

> http://i.imgur.com/c8Lejpt.png - GPU Temp @ 47 °C ~

> Nvidia drivers are completely removed from my PC. I actually reformatted/repaired my OS partition for an unrelated reason but yeah, Nvidia stuff is 100% all gone, onboard VGA is disabled.

> Running the R9 290X on two separate rails of my Seasonic X-1050.

> Seriously struggling to think of why I'm still getting shutdowns, other than the fact that my motherboard's VRMs are potentially worn out; although the board is "specced to overclock" as mentioned in many reviews and by MSI themselves (they actually do 24-hour burn-ins on all their MPOWER series at 4.6 GHz).

> I had zero issues with the 780 Ti and used it for well over 3 months clocked quite high.

---

> Just tried playing BF4, as I joined the game and it "tabbed" into the game after the "Loading Level" message had initialized etc, PC shut down during the transition to Fullscreen mode. Same thing happened during Unigine Heaven benchmark when changing from 1600x900 Windowed to 1920x1080 Fullscreen. Thinking about it, the issue has only existed when alt-tabbing to and from Fullscreen games/apps.

> These are [obviously] during idle, but I can 150% confirm that the shutdown only occurs when tabbing to/from fullscreen games/apps. http://i.imgur.com/eHGszNv.png - are those voltages about right? Anyone able to spot anything out of the norm? oO

> The R9 290X reached a max of 69 °C after 11 minutes of Unigine Heaven, and the system shut down instantly when I alt-tabbed from fullscreen to screen-capture HWMonitor to post here, hah.

> I have completely reset and updated the BIOS, reseated the CPU, reapplied thermal paste, and tried my old 780 Ti and 660 Ti (both worked flawlessly, even OC'd)...

1. Plugged the computer on its own into an isolated circuit; it still shut down.
2. Someone mentioned something, somewhere, about Nvidia, Intel, AMD and Realtek audio drivers potentially conflicting.. how should I go about ensuring this isn't the problem?
3. Could an extremely large gas heater on the wall opposite my power point be the cause of the shutdown? I figured since the wall gets quite warm, this could cause a shutdown.

Any help is appreciated.


----------



## velocityx

Quote:


> Originally Posted by *Rozayz*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> So I upgraded (or downgraded, some might say) my GTX 780 Ti with an R9 290X in preparation for x2 4k res monitors 4 days ago.
> 
> I purchased a reference ASUS card and am experiencing shut down issues. Basically the card gets hot and shuts down, however they're designed to run at ~ 95c or thereabouts.
> 
> My question was whether anyone knows if the VRM on my MSI Z77 MPOWER could be affecting this and causing a thermal shutdown?
> 
> I have forced fan speed % to ~ 60% using MSI Afterburner and have found that leaving the fan speed % set to 'Auto' causes a system shutdown in about 5 minutes or less of playing any video game. (waterblock could not come sooner at this stage ;_; finalizing WC loop and getting that badboy installed asap!) ~
> 
> Rig is as follows:
> 
> Processor: Intel Core i7 3770K ‒ 5GHz
> Video Card: ASUS Radeon R9 290X 4GB
> Memory/RAM: 2x8GB G.Skill Trident X ‒ 2400 MHz
> Motherboard: MSI Z77 MPOWER
> Power Supply: Seasonic X-1050 ‒ 80+ GOLD
> Case: Corsair Obsidian 900D
> CPU Cooler: Corsair H100i w/ aftermarket SP120's
> Storage: Western Digital Caviar Black ‒ 1TB ‒ x2
> SSD: Intel 520 Series ‒ 180GB ‒ x2
> Display Samsung U28D590D 28inch 4K UHD ‒ x2
> Keyboard: White Keycool 84 Cherry MX Red
> Mouse: Corsair Vengeance M65
> Headset: Sennheiser HD598
> Microphone: Audio Technica AT2035
> 
> Bullet points added from incorrectly located thread:
> 
> > http://i.imgur.com/c8Lejpt.png - GPU Temp @ 47 °C ~
> 
> > Nvidia drivers are completely removed from my PC. I actually reformatted/repaired my OS partition for an unrelated reason but yeah, Nvidia stuff is 100% all gone, onboard VGA is disabled.
> 
> > Running the R9 290X on two separate rails of my Seasonic X-1050.
> 
> > Seriously struggling to think of why I'm still getting shut downs other than the fact that my motherboard's VRM's are potentially worn out; although the board is "specced to overclock" as mentioned in many reviews/by MSI themselves. (they actually do 24 hour burn in's on all their MPOWER series at 4.6 GHz) ~
> 
> > I had zero issues with the 780 Ti and used it for well over 3 months clocked quite high.
> 
> ---
> 
> > Just tried playing BF4, as I joined the game and it "tabbed" into the game after the "Loading Level" message had initialized etc, PC shut down during the transition to Fullscreen mode. Same thing happened during Unigine Heaven benchmark when changing from 1600x900 Windowed to 1920x1080 Fullscreen. Thinking about it, the issue has only existed when alt-tabbing to and from Fullscreen games/apps.
> 
> > These are [obviously] during idle, but I can 150% confirm that the shutdown only occurs when tabbing to/from Fullscreen games/apps.http://i.imgur.com/eHGszNv.png - are those Voltages about right? Anyone able to spot anything out of the norm? oO
> 
> > R9 290X reached a max of 69 °C after 11 minutes of Unigine Heaven and system shutdown instantly when I alt tabbed from Fullscreen to screen capture HWMonitor to post here, hah.
> 
> > I have completely reset, updated BIOS. Reseated CPU, reapplied thermal paste, tried old 780 Ti, 660 Ti (both worked flawlessly, even OC'd)...
> 
> 1. plugged computer singularly into isolated circuit, still shut down.
> 2. someone mentioned something, somewhere, about Nvidia, Intel, AMD and Realtek potentially conflicting audio drivers.. how should I go about ensuring this isn't the problem?
> 3. could an extremely large gas heater on the wall opposite my power point be the cause of the shut down? I figured since the wall gets quite warm, this could cause a shut down.
> 
> Any help is appreciated.


a) either the PSU is faulty,
or
b) a cable is faulty, or there's a short somewhere.

Radeon eats a bit more power than Nvidia, but that should still be a non-issue here; I power two 290s off of an 850.

Can you take your case to a different room/different power outlet to check? Or maybe plug in a different PSU to check?

And if you don't feel like checking, you could also set up an additional OS install and check whether the Nvidia card still works OK on that. If it does, then I would consider the 290X a faulty unit.

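Since PSU capacity keeps coming up: as a rough sanity check (all figures below are assumed nominal draws, not measurements of this rig), the system described should sit far below the X-1050's rating even under full load. A minimal sketch:

```python
# Rough power-budget sketch for the rig above. All wattages are assumed
# nominal figures (reference R9 290X board power, an estimate for a heavily
# overclocked 3770K, generic overhead), not measured values.
loads_w = {
    "R9 290X (reference, stock)": 290,
    "i7 3770K @ 5 GHz (est.)": 150,
    "motherboard/RAM": 50,
    "drives, fans, pump, LEDs": 60,
}

total = sum(loads_w.values())   # estimated system draw in watts
headroom = 1050 - total         # margin against the X-1050's rating

print(f"estimated draw: {total} W, headroom on an X-1050: {headroom} W")
```

Even with generous estimates the total lands around half the PSU's rating, which is why a healthy X-1050 tripping its protection points at a short or a fault rather than simple overload.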

----------



## NirHahs

Why does my MSI R9 290 Gaming stutter in Dota 2? I've tried all the drivers. Will the next 14.xx official non-beta driver solve this issue?


----------



## Rozayz

Quote:


> Originally Posted by *velocityx*
> 
> a) either PSU is faulty
> or
> b) cable is faulty, or there's a short somewhere.
> 
> Radeon eats a bit more power than Nvidia but still this should be a non issue here, I power two 290's off of an 850.
> 
> Can you take your case to a different room/different power outlet? to check? or maybe plug a different PSU to check?
> 
> and if you dont feel like checkin, you could also set up additional OS install and check if nvidia works ok on that still. if it does than I would consider the 290x a faulty unit.


Here's another weird issue, most likely related... I have one fan, one, that continually speeds up and slows down. It's connected to the same fan distribution block (powered by PCI-E 6 pin connector) and just dims on and off (it's an LED fan).

I'm going to be taking the GPU into work tomorrow to have it tested by our warranty team... if it comes back okay, I'll take my entire rig in the following day.


----------



## bond32

Your PSU might be to blame. If the card is running at those temps, it's not the issue. Did you set everything else to stock?


----------



## Rozayz

Quote:


> Originally Posted by *bond32*
> 
> Your psu might be to blame. If the card is running at those temps it's not the issue. Did you set everything else to stock?


Everything is stock. I have tried running the CPU at 5.0 GHz, 4.4 GHz and stock (3.5 GHz with 3.9 GHz turbo). The issue has existed in all scenarios.


----------



## rdr09

Quote:


> Originally Posted by *Rozayz*
> 
> Here's another weird issue, most likely related... I have one fan, one, that continually speeds up and slows down. It's connected to the same fan distribution block (powered by PCI-E 6 pin connector) and just dims on and off (it's an LED fan).
> 
> I'm going to be taking the GPU into work tomorrow to have it tested by our warranty team... if it comes back okay, I'll take my entire rig in the following day.


Either the PSU or a temp issue. What are the VRM temps? You can use the render test in GPU-Z to check. Open three of them like so . . .



Click on the "?" to open the Render Test and hit Start. Watch the temps closely. You may also need to lower your DPI to see both VRM temps using the slider on the right.

Edit: even if you resolve this issue with a new PSU, I would not recommend keeping the stock cooler. Reference is meant for watercooling. Look at my temps . . . that is just one 120 rad and a fan. I used to idle at 45C with a stock cooler.

Also, use DDU in Safe Mode to get rid of old drivers if you have not done so.

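If you'd rather not watch the sensor windows live, GPU-Z can also log its sensors to a file (the log checkbox on the Sensors tab), and that log is plain CSV. Here's a hedged Python sketch for pulling the maximum of every temperature column out of such a log; the exact column names vary by card and GPU-Z version, so nothing is hard-coded:

```python
import csv

# Hedged sketch: scan a GPU-Z sensor log (CSV) and report the maximum value
# seen in every column whose header mentions "Temp" (GPU temp, VRM temps...).
# Column names differ between cards and GPU-Z versions, so we match on the
# header text instead of hard-coding names.
def max_temps(path):
    maxima = {}
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            for col, val in row.items():
                if col and "Temp" in col:
                    try:
                        t = float(val)
                    except (TypeError, ValueError):
                        continue  # skip blank/non-numeric cells
                    maxima[col] = max(maxima.get(col, t), t)
    return maxima
```

Run it against the log after a long BF4 or Heaven session and you get the peak GPU and VRM temps without having to screenshot anything mid-game.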

----------



## velocityx

Quote:


> Originally Posted by *Rozayz*
> 
> Here's another weird issue, most likely related... I have one fan, one, that continually speeds up and slows down. It's connected to the same fan distribution block (powered by PCI-E 6 pin connector) and just dims on and off (it's an LED fan).
> 
> I'm going to be taking the GPU into work tomorrow to have it tested by our warranty team... if it comes back okay, I'll take my entire rig in the following day.


Can you see anything in Reliability Monitor: are there any critical errors, logs or error codes? Usually when it shuts down like that it's the PSU protection kicking in, so I would also check the cables, that none of them is fried, or stuck somewhere, or got pushed back into the plastic mold or something.

I also had these shutdowns, but that was me pushing the GPU/CPU OC to the PSU's limit in BF4.


----------






## velocityx

Quote:


> Originally Posted by *Rozayz*
> 
> Everything is stock. I have tried running the CPU at 5.0 GHz, 4.4 GHz and stock (3.5 GHz with 3.9 GHz turbo). The issue has existed in all scenarios.


maybe a silly question

Did you try unplugging that faulty-looking fan? Maybe it's causing a short and the PSU shuts down under load.


----------



## Rozayz

Quote:


> Originally Posted by *velocityx*
> 
> can you see in the reliability monitor, are there any critical errors or logs or error codes? however usually when it shuts down it's PSU protector so I would also check cables if non of them is fried or stuck somewhere or got pushed back in to the plastic mold or something.
> 
> I also had these shutdowns but that was me pushing gpu/cpu oc to the psu limit in bf4.


As in these? http://i.imgur.com/AGjC5vt.png

No idea what you mean by reliability monitor :\
Quote:


> Originally Posted by *rdr09*
> 
> either the the psu or temp issue. what are the vrm temps? you can use the render test in GPUZ to check. open three of them like so . . .
> 
> 
> 
> click on the "?" to open Render test and Start. Watch the temps closely. you may also need to lower your dpi to see both vrm temps using the slider on the right.
> 
> edit: even if you resolve this issue with a new psu i would not recommend keeping the stock cooler. reference is meant for watercooling. look at my temps . . . that is just one 120 rad and a fan. i use to idle at 45C with a stock cooler.
> 
> also, use DDU in safe mode to get rid of old drivers if you have not done so.


Got the Render going, how do I open the 3 things or w/e you meant by that? oO



--

These VRM temps normal?


----------



## velocityx

Quote:


> Originally Posted by *Rozayz*
> 
> As in these? http://i.imgur.com/AGjC5vt.png
> 
> No idea what you mean by reliability monitor :\


Yeah, that one. It shows a Kernel-Power error 41, which means there was no clean shutdown: the power was cut, so it's the PSU protecting itself. Did you check that fan?

As for the Kernel-PnP entry:

http://social.technet.microsoft.com/wiki/contents/articles/3164.event-id-219-windows-kernel-pnp.aspx

Yeah, those VRM temps are OK; in serious long gaming sessions you could see them hitting 80-90 C.

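For reference, those Kernel-Power 41 entries can be pulled straight from the System event log with `wevtutil`. A hedged Python sketch (Windows-only to actually run; the helper just builds the standard XPath query, so only call the second function on the affected machine):

```python
import subprocess

# Hedged sketch: build a wevtutil command that lists recent Kernel-Power
# event ID 41 entries (unexpected power loss) from the Windows System log.
def kernel_power_41_query(count=5):
    # XPath filter: events from the Kernel-Power provider with EventID 41.
    return [
        "wevtutil", "qe", "System",
        "/q:*[System[Provider[@Name='Microsoft-Windows-Kernel-Power']"
        " and (EventID=41)]]",
        f"/c:{count}",   # at most `count` events
        "/rd:true",      # newest first
        "/f:text",       # human-readable output
    ]

def recent_shutdown_events(count=5):
    # Windows-only: returns the matching events as plain text.
    return subprocess.run(kernel_power_41_query(count),
                          capture_output=True, text=True).stdout
```

Each hit corresponds to one hard power cut, so the timestamps line up nicely with "it died when I alt-tabbed" reports.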

----------



## Rozayz

Quote:


> Originally Posted by *velocityx*
> 
> yea that one, it shows a kernel-power error 41 which means there was no clean shutdown meaning the power was shut so it's psu protecting itself. did you check that fan?
> 
> as for the kernel-pnp
> 
> http://social.technet.microsoft.com/wiki/contents/articles/3164.event-id-219-windows-kernel-pnp.aspx
> 
> yea those vrm temps are ok, in serious long gaming you could see these hitting 80-90 C.


I did. I also came across this little thing.



It's a BitFenix Alchemy LED strip connector which I suspect was shorting on the floor of my 900D. It also sits next to my HDD cage/power cables for fans etc. I have left the fan plugged in for the time being but will unplug it if the issue persists.

HOPEFULLY that was the problem.


----------



## rdr09

Quote:


> Originally Posted by *Rozayz*
> 
> As in these? http://i.imgur.com/AGjC5vt.png
> 
> No idea what you mean by reliability monitor :\
> Got the Render going, how do I open the 3 things or w/e you meant by that? oO
> 
> 
> 
> --
> 
> These VRM temps normal?


Your temps look really good, like Velocity said. Let it run for at least 15 mins; this test seems to get similar temps to when I play BF4, but you've got to let it run a bit longer. Set the temp reading to show max readings.


----------



## rdr09

Quote:


> Originally Posted by *Rozayz*
> 
> I did. I also came across this little thing.
> 
> 
> 
> It's a BitFeenix Alchemy LED strip connector which I suspect was shorting on the floor of my 900D. It also sits next to my HDD cage/power cables for fans etc. I have left the fan plugged in for the time being but will unplug it if the issue persists.
> 
> HOPEFULLY that was the problem.


i suggest using electrical tape to cover exposed metal.


----------



## Rozayz

Quote:


> Originally Posted by *rdr09*
> 
> your temps look really good like Velocity said. Let it run for at least 15 mins. this test seems to get similar temps as when i play BF4 but you got to let it run a bit longer. set the temp reading to show max readings.


Max temp the GPU has ever reached was like 75 °C at 100% load after a couple hours of BF4 whilst botting D3 Reaper of Souls in the background. VRMs have only ever reached around 65-70 °C. The system itself sits at a max of 50 °C at 4.4 GHz and reaches about 70 °C at 5.0 GHz when ambient temps allow. :3


----------



## rdr09

Quote:


> Originally Posted by *Rozayz*
> 
> Max temps the GPU has ever reached was like 75 °c at 100% load after a couple hours of BF4 whilst botting D3 Reaper of Souls in the background. VRMs have only ever reached around 65-70 °c. System itself sits at a max of 50 °c at 4.4 GHz and reaches about 70 °c at 5.0 GHz when ambient temps allow. :3


Those GPU temps are with the fan set to 60%, and it doesn't shut down then?


----------



## Rozayz

Quote:


> Originally Posted by *rdr09*
> 
> i suggest using electrical tape to cover exposed metal.


Did one better.



Realized I don't actually need it connected at all because I removed the LED strip due to the adhesive being awful. Changing to 5050 LEDs from http://coldzero.eu/ soon anyway ~


----------



## heroxoot

Quote:


> Originally Posted by *Mercy4You*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> I was wrong and I started seeing artifacts when the voltage is backed down to +0mV, so it is doing something. And I remember my FPS being just below 40fps on Heaven doing 1080p extreme on my 7970, so this GPU is doing better on DX11. But why is my FPS so low on BF4? Mantle pushed 100+ fps on my 7970 I swear. Did they goof on something somewhere down the line? I know the framepacingmethod thing being set to 1 helps but I don't see much of a difference so I think that is purely crossfire.
> 
> 
> 
> Well, I can tell you this. I have an I74930K and I play BF4 a lot. I use manual V for CPU and change clocks between desktop /gaming/ youtube. My lowest clock is @3.4Ghz, that's stock clock with Turbo disabled.
> 
> For gaming I use 4.3Ghz and that will bring the load on cores @ 90% max.
> Sometimes I start gaming @3.4Ghz and forget to change clock, but GPU clocks are @ 1100/1400.
> 
> My FPS on BF4 then drop from 120+ that are normal to around 70-80 FPS...

Not the case for me, as my 8150 is always rolling at 4.4 GHz. I just feel like something changed and I have no way to test it. But again, my DX11 performance seems better in Heaven, so it can't be the GPU or the OC. From what I hear, 1100/1250 is easy to do on the +25mV that my card defaults to and doesn't even need +50% power limit, though mine is set for it. I did see artifacts on +0mV compared to +25mV, so it's working that much. And performance is better with the OC.

Just wish it was better than what it seems like.


----------



## Roy360

My computer shut off randomly today. Looks like my AX1200 overheated powering 3 R9 290s....... Time to flip the intake radiator fans into exhausts, which means now I have to remove 3 radiators so I have room to unscrew the fans...... Why couldn't Corsair put a proper fan in it? The PSU is burning hot, but I can barely feel the fan blowing air.


----------



## Rozayz

Quote:


> Originally Posted by *Rozayz*
> 
> I did. I also came across this little thing.
> 
> 
> 
> It's a BitFeenix Alchemy LED strip connector which I suspect was shorting on the floor of my 900D. It also sits next to my HDD cage/power cables for fans etc. I have left the fan plugged in for the time being but will unplug it if the issue persists.
> 
> HOPEFULLY that was the problem.


Still problematic. Sigh. Taking the card into work tomorrow and getting the RA dept to bench it.


----------



## Falkentyne

Quote:


> Originally Posted by *straKK*
> 
> I'm trying to offset the core voltage.
> This is what I'm trying to get to work, it's towards the bottom of the first post of the thread. I've tried it via making a .bat file and via putting the args in a shortcut, but when I click the bat file or the modified shortcut, MSI Afterburner doesn't start, a cmd window flashes on screen, and the core voltage remains the same as before (checked using GPU-Z).


It was already explained that wi4 is only for the expired beta 17. IC addressing was changed in beta 18+, so instead of /wi4, you need to use /wi6.


----------



## Forceman

Quote:


> Originally Posted by *directorJay*
> 
> Hello everyone, I own a Sapphire 290x, does anyone know if this card supports 10-bit color out-of-the-box?
> 
> I plan to buy an Asus PA249Q or PA279Q monitor or if anyone has a better value option 10-bit monitor for professional use (Adobe Premiere Pro, After Effects, Davinci Resolve)
> 
> Anyway, I tried searching around and I can't seem to find a definite answer, some ppl even say you have to change some things in the registry, made me a little confused.


Not out of the box, no. It may be capable, and you may be able to tweak it to work, but the desktop drivers don't allow it as released.


----------



## lawson67

Hey guys, I have an R9 290 Crossfire setup with the new 14.4 drivers, and I have decided to stop overclocking via CCC, add voltage and see if I can get a not-too-hot overclock. I think I have found the sweet spot at 1130 MHz on the two cores, and with a +6 offset I have also managed to get the memory up to 1450 MHz. I have turned off/unchecked Overdrive in CCC and am now doing everything via AB. But I notice that when I reboot, PowerTune is set back to 0%, and if I set it to 50% PowerTune on GPU1, it takes GPU2 back to 0% and vice versa. Also, if I reboot, PowerTune has reset back to 0% but all the other OC settings are still there. Any help, guys?


----------



## velocityx

Quote:


> Originally Posted by *lawson67*
> 
> Hey guys i have an r9 290 crossfire setup with the new 14.4 drivers and i have decided to stop over clocking via CCC and add voltage and see if i can get a not to hot overclock!...i think i have found the sweet spot at 1130mhz on the 2 cores and i have also managed to with a +6 off set to get the memory up to 1450mhz!....Also i have turned off/ unchecked overdrive in CCC and am now doing everything via AB!... But i notice that when i reboot powertune is set back to 0%?....and if i set it to 50% powertune on GPU1 it takes GPU2 back to 0% on powertune and visa versa?....Also if i reboot powertune has reset to back to 0% but all the other OC settings are still there?....Any help guys?


First thing, I would probably DDU the driver, reinstall clean, never touch the OC in CCC, and start with AB, just to confirm whose fault it is: a left-over CCC setting sticking, or AB doing something wrong.


----------



## noles1983

So I set up Crossfire 290s, and now I have this terrible audio distortion during games. Anybody else have this? Removed previous drivers, installed the new 14.4. Sound is only messed up during games.... quite annoying.


----------



## Widde

Quote:


> Originally Posted by *noles1983*
> 
> so i set up crossfire 290's, now i have this terrible audio distortion during games. Anybody else have this? removed previous drivers, installed the new 14.4. SOunds only is messed up during games....quite annoying.


I get that too, mostly only with vsync enabled; if I disable vsync it goes away. Doesn't matter what driver, it's still present for me. I can also get stutter when I alt-tab a fullscreen game like CS:GO, and my voice goes robot-mode on Skype, but as soon as I tab back in it goes back to normal.


----------



## chronicfx

He is using 14.4 and the vsync issue is fixed. I play with vsync on now with 14.4 and no distortion. It is something else.


----------



## chronicfx

If you are using a sound card: I have trouble using my first PCIe x1 slot above the first GPU because it gets static for some reason. The third x1 slot under the first GPU does not exhibit this static, so sometimes card positioning can play a role. Also, I wish you had your sig filled out so I am not shooting in the dark, but if you are using a sound card, the AMD audio that is installed with the drivers should be disabled because it conflicts. Above all, I would try using DDU (Display Driver Uninstaller), then install 14.4, then uninstall/reinstall your audio drivers.


----------



## straKK

My Powercolor PCS+ 290x is throttling at 78C. The target temperature slider is also missing from my CCC, and was missing from the previous driver I had as well (have 14.4, prev. had 14.3 beta). Anyone know how I can raise the target temperature? I googled this issue but haven't really seen anyone with the same issue with this temperature and this card.


----------



## Widde

Quote:


> Originally Posted by *chronicfx*
> 
> He is using 14.4 and the vsync issue is fixed. I play with vsync on now with 14.4 and no distortion. It is something else.


Is it fixed in the whql? I'm still on 14.4 betas and have the issue







Onboard sound card, so nothing in a PCIe x1 slot.


----------



## chronicfx

Quote:


> Originally Posted by *Widde*
> 
> Is it fixed in the whql? I'm still on 14.4 betas and have the issue
> 
> 
> 
> 
> 
> 
> 
> onboard soundcard so nothing on a pci-e 1x


Whql is 13.12. Vsync was fixed in 14.3 after the whql. 14.4 should be fine. It is for me at least.


----------



## Widde

Quote:


> Originally Posted by *chronicfx*
> 
> Whql is 13.12. Vsync was fixed in 14.3 after the whql. 14.4 should be fine. It is for me at least.


There's a 14.4 WHQL out now as well; gonna give that a try after some scrubbing with DDU.


----------



## chronicfx

Nice! I wonder if it is any different than the beta


----------



## noles1983

Quote:


> Originally Posted by *chronicfx*
> 
> If you are using a soundcard I have trouble using my first pciex1 slot above the first gpu because it gets static for some reason. The third x1 slot under the first gpu does not exhibit this static. Sometimes card positioning can play a role. Also i wish you had your sig filled out so i am not shooting in the dark but if you are using a soundcard the amd audio that is installed with the drivers should be disabled because it conflicts. Above all i would try using ddu (display driver uninstaller) then install 14.4 then uninstall reinstall your audio drivers.


I am using a soundcard, uninstalling the amd audio driver appears to have worked. thanks!


----------



## chronicfx

Quote:


> Originally Posted by *noles1983*
> 
> I am using a soundcard, uninstalling the amd audio driver appears to have worked. thanks!


Cool. Glad to hear it!


----------



## lawson67

Quote:


> Originally Posted by *velocityx*
> 
> first thing is I would prolly DDU the driver. reinstall clean, never touch OC in CCC and start with AB. just to confirm whose fault that is, ccc left over setting sticking or AB doing something wrong.


Yep did what you said " re-installed drivers" and Started off using just AB and working great now!... thanks mate


----------



## Aussiejuggalo

Does the 14.4 WHQL have mantle without the Gaming Evolved App or do you have to install it for mantle?


----------



## igrease

Why does my MSI 290 Gaming run so hot? In BF4 on stock clocks it is at 80c with 90% fan speed. I have tried multiple fan setups inside my case to see if airflow was a problem; none of them helped, and one of them increased the temp. I could understand it being 80c+ if it was overclocked, but jesus. Is my card not working correctly?


----------



## heroxoot

Quote:


> Originally Posted by *igrease*
> 
> Why does my MSI 290 Gaming run so hot? BF4 on stock clocks it is at 80c with 90% fan speed. I have tried multiple fan setups inside my case to see if airflow was a problem. None of them helped and one of them increased the temp. I could understand it being 80c + if it was overclocked but jesus. Is my card not working correctly?


It just does. With the stock fan curve at 1100/1300, mine still gets to 80c, so I wouldn't worry too much about it. BTW, what's your frame rate with Mantle? Lately I'm getting 80-90 fps on average with some dips to the mid-70s, like 75 fps at the lowest. Pretty sure buying an 8350 for my rig would help tons, but no money.


----------



## igrease

Quote:


> Originally Posted by *heroxoot*
> 
> It just does. With stock fan speed curve on 1100/1300 mine gets to 80c still. So I wouldnt worry too much about it. BTW whats your frame rate with mantle? Lately I'm getting 80 - 90fps on average with some dips ti mid 70fps, like 75 at the lowest. Pretty sure buying an 8350 for my rig would help tons but no money.


So if my core temp is around 86c and VRM 1 around 80+ that is acceptable? I currently get anywhere between 80 ~ 120 fps on Ultra with no AA.


----------



## heroxoot

Quote:


> Originally Posted by *igrease*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> It just does. With stock fan speed curve on 1100/1300 mine gets to 80c still. So I wouldnt worry too much about it. BTW whats your frame rate with mantle? Lately I'm getting 80 - 90fps on average with some dips ti mid 70fps, like 75 at the lowest. Pretty sure buying an 8350 for my rig would help tons but no money.
> 
> 
> 
> So if my core temp is around 86c and VRM 1 around 80+ that is acceptable? I currently get anywhere between 80 ~ 120 fps on Ultra with no AA.

OK, I'm getting 80-90 fps with Ultra 4x AA, and my VRM only gets to 70c. You might need some kind of ventilation. My case is a HAF XB with a 200 mm fan right above the GPU sucking all the heat out. My VRM pegs around 70c and never higher for the most part.


----------



## igrease

Quote:


> Originally Posted by *heroxoot*
> 
> Ok I'm getting 80 - 90fps with Ultra 4x AA. My VRM only gts to 70c. Might need some kind of ventilation. My case is a HAF XB with a 200MM fan right above the GPU sucking all the heat out. My VRM pegs around 70c and never higher for the most part.


I literally have a side fan sucking the hot air out of where the gpu's fans spit out the hot air.


----------



## luckyboy

GPU usage drops in Far Cry 3 and Sleeping Dogs when I set the games from high to medium. CPU usage is 30-50%, GPU usage fluctuates between 40-70%. When the games are on high settings, GPU usage is 95-100%. This card is driving me crazy. Any suggestions on how to fix that? Thanks.


----------



## Paul17041993

Quote:


> Originally Posted by *directorJay*
> 
> Hello everyone, I own a Sapphire 290x, does anyone know if this card supports 10-bit color out-of-the-box?
> 
> I plan to buy an Asus PA249Q or PA279Q monitor or if anyone has a better value option 10-bit monitor for professional use (Adobe Premiere Pro, After Effects, Davinci Resolve)
> 
> Anyway, I tried searching around and I can't seem to find a definite answer, some ppl even say you have to change some things in the registry, made me a little confused.


The DP standard means you can go up to 16bpp if bandwidth allows; if you look at 10-bit displays you'll notice they require you to use DP in the first place, which is why.

Currently the 290/X is DP 1.2 compliant. DP 1.3 will be finalized at the end of this quarter, which brings double the bandwidth and allows up to 4K@120Hz or 8K; however, there's no word on whether the 290/X can be upgraded to this standard or not.

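For a rough sense of the numbers (approximate figures: DP 1.2/HBR2 carries about 17.28 Gbit/s of video payload after 8b/10b coding, DP 1.3/HBR3 about 25.92 Gbit/s; the blanking factors below are assumptions, not spec values):

```python
# Back-of-the-envelope DisplayPort bandwidth check.
# Effective 4-lane payload rates after 8b/10b coding (approximate):
DP12_GBPS = 17.28   # HBR2: 4 x 5.4 Gbit/s x 0.8
DP13_GBPS = 25.92   # HBR3: 4 x 8.1 Gbit/s x 0.8

def stream_gbps(w, h, hz, bpc, blanking=1.12):
    """Approximate stream rate: active pixels x 3 channels x bits-per-channel,
    padded ~12% for CVT-RB blanking (a rough assumption)."""
    return w * h * hz * 3 * bpc * blanking / 1e9

# 4K60 at 10bpc (~16.7 Gbit/s) squeaks under the DP 1.2 limit,
# while 4K120 needs DP 1.3, and even then only fits at 8bpc with
# reduced blanking (~1.04 factor, CVT-RB2-style).
uhd60_10bit = stream_gbps(3840, 2160, 60, 10)
uhd120_8bit = stream_gbps(3840, 2160, 120, 8, blanking=1.04)

print(f"4K60 10bpc ~{uhd60_10bit:.1f} Gbit/s vs DP1.2 {DP12_GBPS}")
print(f"4K120 8bpc ~{uhd120_8bit:.1f} Gbit/s vs DP1.3 {DP13_GBPS}")
```

So a 10-bit 4K60 workflow fits within what the 290/X's DP 1.2 port can physically carry; the driver support question is separate.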

----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Does the 14.4 WHQL have mantle without the Gaming Evolved App or do you have to install it for mantle?


That's just another start-up programme that annoys me, so I don't install it, and you don't need it for Mantle. That's what the driver's for.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> That's just another start up programme that annoys me , so I don't install it and you don't need it for mantle . That's what the drivers for


Ah ok cool, I just wasnt sure where AMD stuck Mantle


----------



## Paul17041993

The only reason the "Gaming Evolved" app is now part of the drivers is to deny fanboys the ability to diss AMD for not having something similar to the turd that is GeForce Experience.

Or something like that. I don't like either; both are as useless as each other, except for those who have no idea how to play the games they own.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Paul17041993*
> 
> only reason why the "gaming evolved" app is now part of the drivers is to remove fanboys the ability to diss them for not having something similar to the turd that is geforce experience.
> 
> or something like that, I don't like either and both are just as useless as each other except for those who have no idea how to play the games they own.


Yeah, my "recommended" settings for CS:GO was 1280 x 720


----------



## Aussiejuggalo

Geforce Experience said with my old 670 my settings for BF3 were 800x600 everything on low


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah, my "recommended" settings for CS:GO was 1280 x 720
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Aussiejuggalo*
> 
> Geforce Experience said with my old 670 my settings for BF3 were 800x600 everything on low

Yeah so much for recommended settings Pfffffftttt








Quote:


> Originally Posted by *Paul17041993*
> 
> only reason why the "gaming evolved" app is now part of the drivers is to remove fanboys the ability to diss them for not having something similar to the turd that is geforce experience.
> 
> or something like that, I don't like either and both are just as useless as each other except for those who have no idea how to play the games they own.


LooooooooooL


----------



## arrow0309

Quote:


> Originally Posted by *Falkentyne*
> 
> It was already explained that wi4 is only for the expired beta 17. IC addressing was changed in beta 18+, so instead of /wi4, you need to use /wi6.


Thanks!
I also managed to create 4 bat launchers with the 4 vddc offsets of 125, 150, 175 and 200v
However I didn't get quite well, when I'm supposed to launch the bat file, right before or after I apply to the desired oc clocks?
And I also noticed an auxiliary voltage right under the (locked) memory voltage
Is this one also supposed to improve something?


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yeah so much for recommended settings Pfffffftttt
> 
> 
> 
> 
> 
> 
> 
> 
> LooooooooooL


btw, my Qnix turned up today......it's awesome, already got 96hz running sweet and it's purdy


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> btw, my Qnix turned up today......it's awesome, already got 96hz running sweet and it's purdy


Already? That was fast! Good quality, eh? How do you overclock these things (monitors) anyways?


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Already ? that was well fast ? Good qual eh ? How do you overclock these things ( monitors ) anyways


Check out the Qnix thread, first post will give you everything you need to know really, and as always with OCN, people are very helpful as well


----------



## rdr09

Quote:


> Originally Posted by *igrease*
> 
> I literally have a side fan sucking the hot air out of where the gpu's fans spit out the hot air.


Your rear fan is sucking air out, too; the fans might be opposing each other.


----------



## Paul17041993

Quote:


> Originally Posted by *Paul17041993*
> 
> Quote:
> 
> 
> 
> Originally Posted by *directorJay*
> 
> Hello everyone, I own a Sapphire 290x, does anyone know if this card supports 10-bit color out-of-the-box?
> 
> I plan to buy an Asus PA249Q or PA279Q monitor or if anyone has a better value option 10-bit monitor for professional use (Adobe Premiere Pro, After Effects, Davinci Resolve)
> 
> Anyway, I tried searching around and I can't seem to find a definite answer, some ppl even say you have to change some things in the registry, made me a little confused.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> DP standard means you can go up to 16bpp if bandwidth allows, if you look at 10bit displays you'll notice they require you to use DP in the first place, this is why.
> 
> Currently the 290/X is DP 1.2 compliant; DP 1.3 will be finalized at the end of this quarter, which brings double bandwidth and allows up to 4K@120Hz or 8K, however there's no word on whether the 290/X can be upgraded to this standard or not.

oh whoops, actually meant 16 bpc, which is 48 bpp, which I doubt will ever see use really unless monitors start using super dynamic contrast tech that can utilize full HDR
(without backlight adjustment, that is; most monitors today adjust the backlight intensity to obtain their high dynamic contrast)
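To put rough numbers on the bpc/bpp discussion: uncompressed bandwidth is just pixels × refresh × bits per pixel, and DP 1.2's four HBR2 lanes carry about 17.28 Gbit/s of payload after 8b/10b coding. A back-of-the-envelope sketch (blanking overhead is ignored, so real requirements run a bit higher):

```python
def raw_bandwidth_gbps(width, height, refresh_hz, bpc):
    """Uncompressed video bandwidth in Gbit/s, ignoring blanking intervals."""
    bits_per_pixel = bpc * 3  # three color channels per pixel
    return width * height * refresh_hz * bits_per_pixel / 1e9

DP12_PAYLOAD_GBPS = 17.28  # DP 1.2: 4 lanes of HBR2, after 8b/10b coding

# 4K60 at 10 bpc (30 bpp) fits within DP 1.2's payload rate...
print(round(raw_bandwidth_gbps(3840, 2160, 60, 10), 1))  # 14.9
# ...while 16 bpc (48 bpp) at the same mode exceeds it
print(round(raw_bandwidth_gbps(3840, 2160, 60, 16), 1))  # 23.9
```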


----------



## Mercy4You

Who wants to share his opinion about what's the best R9 290X on air?

Thought the Lightning might be *'The One'*, until I read this Techpowerup review... (http://www.techpowerup.com/reviews/MSI/R9_290X_Lightning/)

I vote for mine, the Tri-X







(fast and silent)


----------



## rdr09

Quote:


> Originally Posted by *Mercy4You*
> 
> Who wants to share his opinion about what's the best R9 290X on air?
> 
> Thought the Lightning might be *'The One'*, until I read this Techpowerup review... (http://www.techpowerup.com/reviews/MSI/R9_290X_Lightning/)
> 
> I vote for mine, the Tri-X
> 
> 
> 
> 
> 
> 
> 
> (fast and silent)


did they test the card with the overclock using 14.2?


----------



## igrease

Quote:


> Originally Posted by *rdr09*
> 
> your rear fan sucking out, too. the fans might be opposing each other.


Yes I know that my rear fan is exhausting. Like I have said, I have tried many different fan configurations and nothing seems to help the gpu.


----------



## ds84

I just got 2 MSI R9 290 Gamings, but from different batches. My 3DMark score is P7607. Is this normal? No OC.

http://www.3dmark.com/3dm/2965143?

Also, my new 290 is missing some caps near to the gpu fan controller, as compared to the 1st batch which has it.

Lastly, what else can i do to the settings? Will be WC it at a later time.


----------



## rdr09

Quote:


> Originally Posted by *igrease*
> 
> Yes I know that my rear fan is exhausting. Like I have said, I have tried many different fan configurations and nothing seem to help the gpu.


no fans will help if the cooler itself is the problem. i mean if the paste used is junk or applied the wrong way. anyway, try to contact msi. i would inspect the card by disassembling it. it is cooling no better than a reference and that temp is just for the core. i would imagine the vrms are higher. if you don't plan on watercooling . . . i would return it. seriously.

Quote:


> Originally Posted by *ds84*
> 
> I just gotten 2 MSI r9 290 Gaming, but from different batches. My 3Dmark score is P7607. Is this normal? no OC.
> 
> http://www.3dmark.com/3dm/2965143?
> 
> Also, my new 290 is missing some caps near to the gpu fan controller, as compared to the 1st batch which has it.
> 
> Lastly, what else can i do to the settings? Will be WC it at a later time.


not sure what happened there. single 290 bone stock with an i7 HT off at 4GHz.

http://www.3dmark.com/3dm/2964777?

what psu do you have? pls fill in Rigbuilder, found in the upper right corner of the page. btw, you can't possibly leave the i5 stock with 2. i would not even do it with 1.


----------



## ds84

Quote:


> Originally Posted by *rdr09*
> 
> no fans will help if the cooler itself is the problem. i mean if the paste used is junk or applied the wrong way. anyway to contact msi. i would inspect the card by disassembling it. it is cooling no better than a reference and that temp is just for the core. i would imagine the vrms are higher. if you don't plan on watercooling . . . i would return it. seriously.
> not sure what happened there. single 290 bone stock with an i7 HT off at 4GHz.
> 
> http://www.3dmark.com/3dm/2964777?
> 
> what psu do you have? pls fill Rigbuilder found in the upper right corner of page. btw, you can't possibly leave the i5 stock with 2. i would not even do it with 1.


Using Seasonic X850... Thought i filled up the Rigbuilder months ago.... As for CPU OC, im having trouble trying to OC it.. Guess i got the short end of the stick...


----------



## rdr09

Quote:


> Originally Posted by *ds84*
> 
> Using seasonic x850... Tot i filled up the rigbuilder months ago.... As for cpu OC, im having trouble trying to OC it.. Guess i gt the other end of the stick...


psu is good. but the score is not normal. you may have to reinstall 14.4. you enabled crossfire in CCC? you disabled ULPS?

the cpu is fast, it just needs to be oc'd a bit. 4.3GHz at least.
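On the ULPS point, for anyone following along: ULPS puts the secondary card into an ultra-low-power state at idle and is a common culprit in crossfire oddities. It's usually disabled either via the "Disable ULPS" checkbox in MSI Afterburner's settings or via the registry. The sketch below assumes the standard display-adapter class key; the numbered subkeys (0000, 0001, ...) vary per system, so query first, set every EnableUlps hit to 0, then reboot.

```shell
rem Find every EnableUlps value under the display adapter class key
reg query "HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}" /s /f EnableUlps

rem Set one of the hits to 0 to disable ULPS (repeat for each subkey found, then reboot)
reg add "HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000" /v EnableUlps /t REG_DWORD /d 0 /f
```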


----------



## greenscobie86

Just got my reference MSI R9 290. Loving it so far. Got the temps to around 80 at full load with no overclock by using a custom fan profile, on the latest stable drivers. But thinking of either going the Kraken G10 water route or maybe a Gelid Icy Rev 2.... Thoughts?


----------



## ds84

Quote:


> Originally Posted by *rdr09*
> 
> psu is good. but the score is not normal. you may have to reinstall 14.4. you enabled crossfire in CCC? you disabled ULPS?
> 
> the cpu is fast just need to be oc a bit. 4.3GHz at least.


Enabled Xfire in CCC..Checking for ULPS now..

As for OC, even at 4.2GHz at 1.25V with the H100i, im getting BSODs... Trying to find out more abt my mobo settings.

Also, how do i enable my rig specs to be listed at the bottom?


----------



## JordanTr

Quote:


> Originally Posted by *greenscobie86*
> 
> Just got my Reference MSI R9 290. Loving it so far. Got the temps around 80 with no overclock at full load, by using a custom profile. using latest stable drivers. Loving the card so far. But thinking of either going the Kraken G10 water route or maybe a Gelid Icy Rev 2.... Thoughts?


The Gelid Icy Vision is ok, but if you like to leave your pc on at night it can be disturbing, cause the fans are not pwm and are always spinning at 2000rpm. Thats why i replaced the fans with be quiet! Shadow Wings, but eventually it will go under water


----------



## ds84

I just tried to OC it. 1077mhz / 1300mhz. I got P6896 in 3DMark..... Can it be my CPU bottlenecking?


----------



## rdr09

Quote:


> Originally Posted by *greenscobie86*
> 
> Just got my Reference MSI R9 290. Loving it so far. Got the temps around 80 with no overclock at full load, by using a custom profile. using latest stable drivers. Loving the card so far. But thinking of either going the Kraken G10 water route or maybe a Gelid Icy Rev 2.... Thoughts?


green, that's what i have. i would leave the stock cooler on mine but my rig is watercooled. it isn't that bad really but i did hear it go past 55% during gaming. temps seem fine and mine did not suffer from throttling. i only had it a week before installing the waterblock.

Quote:


> Originally Posted by *ds84*
> 
> I just tried to OC it. 1077mhz / 1300mhz. I got P6896 in 3DMark..... Cant be my CPU is bottlenecking?


i'll keep them at stock 'cause of the psu and cpu. try one 290 and see if you get better results.

disable crossfire in CCC, shutdown the system, unplug power from the second gpu (no need to uninstall it), power up and test. use GPUZ to check the temps.


----------



## greenscobie86

Thanks for the input, I might just buy the Kraken G10 with a used Corsair H50 and some heatsinks then...

Also will a 140mm side fan blowing on the card in reference configuration be an improvement for temps? I've got it living in a Phantom 410 right now, which allegedly has decent airflow management.


----------



## rdr09

Quote:


> Originally Posted by *greenscobie86*
> 
> Thanks for the input, I might just buy the Kraken G10 with a used Corsair H50 and some heatsinks then...
> 
> Also will a 140mm side fan blowing on the card in reference configuration be an improvement for temps? I've got it living in a Phantom 410 right now, which allegedly has decent airflow management.


it should help.


----------



## Matt-Matt

Quote:


> Originally Posted by *Mercy4You*
> 
> Who wants to share his opinion about what's the best R9 290X on air?
> 
> Thought the Lightning might be *'The One'*, until I read this Techpowerup review... (http://www.techpowerup.com/reviews/MSI/R9_290X_Lightning/)
> 
> I vote for mine, the Tri-X
> 
> 
> 
> 
> 
> 
> 
> (fast and silent)


It's not the Gigabyte Windforce.. Used it in a build and it was DOA (sort of, artifacting at stock at least).. Ran a quick stress test and it hit 85c straight away. Not a great cooler...


----------



## ds84

Quote:


> Originally Posted by *rdr09*
> 
> green, that's what i have. i would leave the stock cooler on mine but my rig is watercooled. it isn't that bad really but i did hear it past 55% during gaming. temp seem fine and mine did not suffer from throttling. i only had it a week before installing the a waterblock.
> i'll keep them at stock 'cause of the psu and cpu. try one 290 and see if you get better results.
> 
> disable CCC, shutdown system, unlpug power from the second gpu (no need to uninstall it), power up and test. use GPUZ to check the temps.


How would my psu affect it? Also, i placed my Blaster Z at the top pcie instead of at the lowest pcie x16 slot. It was running at x8 x4 according to gpu-z. now its at x8 x8.

Also, in 3DMark, it says that my graphics driver isnt supported. maybe i will try 14.4 tomorrow and see how it goes.


----------



## rdr09

Quote:


> Originally Posted by *ds84*
> 
> How would my psu affect it? Also, i placed my blaster z at top pcie instead of at the lowest pcie x16 slot. It was running at x8 x4 according to gpu-z. now its at x8 x8.
> 
> Also, in 3Dmark, it says that my graphics driver isnt support. maybe i will try the 14.4 tomorrow and see how it goes.


what driver are they on? yah, 850 may be enough for 2 but i would advise not oc'ing them. the psu might give out and take other components with it.

edit: like i said, try one gpu. i would try them *individually*. see if any of them is bad. no need to take them out. just unplug the one not being used, even if one is running at X8. make sure it is running at X16 or X8 when running single. plug the monitor into the right gpu.


----------



## Mega Man

Quote:


> Originally Posted by *ds84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> green, that's what i have. i would leave the stock cooler on mine but my rig is watercooled. it isn't that bad really but i did hear it past 55% during gaming. temp seem fine and mine did not suffer from throttling. i only had it a week before installing the a waterblock.
> i'll keep them at stock 'cause of the psu and cpu. try one 290 and see if you get better results.
> 
> disable CCC, shutdown system, unlpug power from the second gpu (no need to uninstall it), power up and test. use GPUZ to check the temps.
> 
> 
> 
> How would my psu affect it? Also, i placed my blaster z at top pcie instead of at the lowest pcie x16 slot. It was running at x8 x4 according to gpu-z. now its at x8 x8.
> 
> Also, in 3Dmark, it says that my graphics driver isnt support. maybe i will try the 14.4 tomorrow and see how it goes.

Hate to be the one to tell you this (I can not see your rig, I am on mobile).

Your psu affects everything in your rig. Dirty power (excessive ripple, bad voltage regulation).


----------



## Mercy4You

Quote:


> Originally Posted by *rdr09*
> 
> they tested the card with overclock using 14.2?


Not good I guess, never tried 14.2









I was surprised by the noise it makes, including coil whine audible above the fan noise; heard others complaining about that too...

Quote:


> Originally Posted by *Matt-Matt*
> 
> It's not the Gigabyte windforce.. Used it in a build and it was DOA (sort of, artifacting at stock at least).. Ran a quick stress test and 85c straight away. Not a great cooler...


Not the Windforce; haven't heard much about buying that one either...

So, other suggestions?


----------



## igrease

Quote:


> Originally Posted by *rdr09*
> 
> no fans will help if the cooler itself is the problem. i mean if the paste used is junk or applied the wrong way. anyway to contact msi. i would inspect the card by disassembling it. it is cooling no better than a reference and that temp is just for the core. i would imagine the vrms are higher. if you don't plan on watercooling . . . i would return it. seriously.


Just called MSI support and told them my issue and they basically said that it was running within the normal operating temp range. The guy told me to download and run Furmark for an hour and see if I get any artifacting or crashes. Hehe. But I do actually plan to water cool this card within the next 3 months. But in the meantime, for a non-reference card to run so hot at stock, especially VRM1, should I just RMA it back to Newegg and get another one?

Also can I use the thermal paste I use for my CPU on the GPU?
Will taking the heatsink off my card void the warranty?


----------



## Rozayz

Quote:


> Originally Posted by *Rozayz*
> 
> Still problematic. Sigh. Taking card into work tmrw and getting RA dept to bench it.


GPU was benched at work today. Ran Bioshock for 6 hours without an issue as well as a few Furmark tests/Intel Burn Test. The issue still persists after taking the card home and swapping sleeved PSU cables to the OEM Seasonic X-1050 ones. At this stage I suspect a faulty PSU or Motherboard, as Memtest ran without issues and I have reformatted my OS drive twice.

S i g h.


----------



## rdr09

Quote:


> Originally Posted by *igrease*
> 
> Just called MSI support and told them my issue and they basically said that it was running within the temp range of normal operation. The guy told me to download and run Furmark for an hour and see if I get any artifacting or crashes. Hehe. But I do actually plan to water cool this card within the next 3 months. But in the mean time, for a non-reference card to run so hot at stocks, especially the VRM1, should I just RMA it back to Newegg and get another one?
> 
> Also can I use the thermal paste I use for my CPU on the GPU?
> Will taking the heatsink off my card void the warranty?


if newegg does you that favor without hassle, i'll come back and buy stuff from them. get the Vapor-X.


----------



## igrease

Quote:


> Originally Posted by *rdr09*
> 
> if newegg do you that favor without hassel, i'll come back buy stuff from them. get the VaporX.


Actually I was going to get that one but it was about an inch too long to fit in my case.


----------



## Mega Man

Quote:


> Originally Posted by *igrease*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> no fans will help if the cooler itself is the problem. i mean if the paste used is junk or applied the wrong way. anyway to contact msi. i would inspect the card by disassembling it. it is cooling no better than a reference and that temp is just for the core. i would imagine the vrms are higher. if you don't plan on watercooling . . . i would return it. seriously.
> 
> 
> 
> Just called MSI support and told them my issue and they basically said that it was running within the temp range of normal operation. The guy told me to download and run Furmark for an hour and see if I get any artifacting or crashes. Hehe. But I do actually plan to water cool this card within the next 3 months. But in the mean time, for a non-reference card to run so hot at stocks, especially the VRM1, should I just RMA it back to Newegg and get another one?
> 
> Also can I use the thermal paste I use for my CPU on the GPU?
> Will taking the heatsink off my card void the warranty?

Mixed stories as with any manufacture. As always do it at your own risk
Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *igrease*
> 
> Just called MSI support and told them my issue and they basically said that it was running within the temp range of normal operation. The guy told me to download and run Furmark for an hour and see if I get any artifacting or crashes. Hehe. But I do actually plan to water cool this card within the next 3 months. But in the mean time, for a non-reference card to run so hot at stocks, especially the VRM1, should I just RMA it back to Newegg and get another one?
> 
> Also can I use the thermal paste I use for my CPU on the GPU?
> Will taking the heatsink off my card void the warranty?
> 
> 
> 
> if newegg do you that favor without hassel, i'll come back buy stuff from them. get the VaporX.

I hate how people always blame equipment that they don't understand. 99% of rmas are user error, and then people whine about the price of tech.

My fav is non-US areas complaining about pc pricing. They made a ridiculous law about not only being able to return an item no questions asked for 2 weeks (specifically the uk in this example) but also warranties being handled by the seller. Then they complain that prices are too high. When will people learn? They are not going to eat that cost. They will pass it back to the consumers. Nothing is ever free. Nor do I approve of the "just return it" attitude of my generation, for the same reason


----------



## Rozayz

Quote:


> Originally Posted by *Rozayz*
> 
> GPU was benched at work today. Ran Bioshock for 6 hours without an issue as well as a few Furmark tests/Intel Burn Test. The issue still persists after taking the card home and swapping sleeved PSU cables to the OEM Seasonic X-1050 ones. At this stage I suspect a faulty PSU or Motherboard, as Memtest ran without issues and I have reformatted my OS drive twice.
> 
> S i g h.


With the above information, as well as this:

> PC shuts down with R9 290X in 3/3 PCI-E slots.
> PC does not shut down when I use onboard GPU (3770k)...

.... what is more likely? Issue with Motherboard, PSU or GPU? oO So damn confused at this stage. :l


----------



## rdr09

Quote:


> Originally Posted by *igrease*
> 
> Actually I was going to get that one but it was about an inch too long to fit in my case.


you got to make it fit. mod it. remove a hdd cage or something.

Quote:


> Originally Posted by *Mega Man*
> 
> Mixed stories as with any manufacture. As always do it at your own risk
> I hate how people always blame equipment that they don't understand. 99% off rmas are user error.and then people whine about the price of tech.
> 
> My fav is non us areas about pc pricing. They made a rediculas law about not only being able to return an item no questions asked for 2 weeks (specifically uk in this example) and warranties handled by seller. Then complain that prices are too high. When will people learn. They are not going to eat that cost. They will pass it back to the consumers. Nothing is ever free. Nor do I approve of the "just return it" attitude of my generation for the same reason


that msi gaming is known for bad fans with oil leaking out of them. it might just be a matter of time before it happens. i would highly suggest returning it to newegg or msi. don't get me wrong . . . i love msi . . . i own a msi 290 reference.


----------



## Mega Man

Yet I have never had an issue with them. Hmmmm


----------



## velocityx

Quote:


> Originally Posted by *Rozayz*
> 
> With the above information, as well as this:
> 
> > PC shuts down with R9 290X in 3/3 PCI-E slots.
> > PC does not shut down when I use onboard GPU (3770k)...
> 
> .... what is more likely? Issue with Motherboard, PSU or GPU? oO So damn confused at this stage. :l


are you able to bring a spare PSU from work to check whether it works alright?


----------



## Rozayz

May just take rig into work tbh. I can do it myself but it's easier to get RA to do it while I work Assembly/ Sales.


----------



## bond32

Finally got my Sapphire 290x back from RMA! They actually had a quick turnaround. I shipped it last week and they received it thursday, sent out my new one monday. Problem was they shipped it Fedex ground and I had to sign for it. I couldn't even have it held at a facility...

In any case, looks like my black screen issue is gone. Also got Hynix memory!

Edit: I think they may have revised the reference PCB... I see small differences between my old one and new one. My old one was purchased on release day. If they did revise it I just wasn't aware of it.


----------



## hornedfrog86

Quote:


> Originally Posted by *bond32*
> 
> Finally got my Sapphire 290x back from RMA! They actually had a quick turnaround. I shipped it last week and they received it thursday, sent out my new one monday. Problem was they shipped it Fedex ground and I had to sign for it. I couldn't even have it held at a facility...
> 
> In any case, looks like my black screen issue is gone. Also got Hynix memory!


Great does it have the latest BIOS?


----------



## bond32

Quote:


> Originally Posted by *hornedfrog86*
> 
> Great does it have the latest BIOS?


Shows this one: http://www.techpowerup.com/vgabios/152146/sapphire-r9290x-4096-131202.html

Def later than my last one...


----------



## hornedfrog86

Thanks


----------



## bond32

I am really curious to see if there was a power delivery problem with the card I had. Seems a lot of people had identical issues, then the rma took care of it. Don't see near as many reports nowadays.


----------



## Arizonian

Quote:


> Originally Posted by *bond32*
> 
> Finally got my Sapphire 290x back from RMA! They actually had a quick turnaround. I shipped it last week and they received it thursday, sent out my new one monday. Problem was they shipped it Fedex ground and I had to sign for it. I couldn't even have it held at a facility...
> 
> In any case, looks like my black screen issue is gone. Also got Hynix memory!
> 
> Edit: I think they may have revised the reference PCB... I see small differences between my old one and new one. My old one was purchased on release day. If they did revise it I just wasn't aware of it.


Glad to hear this.


----------



## kizwan

Quote:


> Originally Posted by *Rozayz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Rozayz*
> 
> GPU was benched at work today. Ran Bioshock for 6 hours without an issue as well as a few Furmark tests/Intel Burn Test. The issue still persists after taking the card home and swapping sleeved PSU cables to the OEM Seasonic X-1050 ones. At this stage I suspect a faulty PSU or Motherboard, as Memtest ran without issues and I have reformatted my OS drive twice.
> 
> S i g h.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> With the above information, as well as this:
> 
> > PC shuts down with R9 290X in 3/3 PCI-E slots.
> > PC does not shut down when I use onboard GPU (3770k)...
> 
> .... what is more likely? Issue with Motherboard, PSU or GPU? oO So damn confused at this stage. :l

Since GPU was benched without any problem at work, you can rule out GPU. Did you test with CPU at stock clock or at lower OC?

I think most likely a PSU problem. Maybe a malfunctioning OCP.

BTW, what do you mean by "Running the R9 290X on two separate rails of my Seasonic X-1050"? X-1050 is single rail PSU.
Quote:


> Originally Posted by *bond32*
> 
> Finally got my Sapphire 290x back from RMA! They actually had a quick turnaround. I shipped it last week and they received it thursday, sent out my new one monday. Problem was they shipped it Fedex ground and I had to sign for it. I couldn't even have it held at a facility...
> 
> In any case, looks like my black screen issue is gone. Also got Hynix memory!
> 
> Edit: *I think they may have revised the reference PCB... I see small differences between my old one and new one.* My old one was purchased on release day. If they did revise it I just wasn't aware of it.


Picture?


----------



## TheComputerNub

Hello. Im not an owner of an AMD R9 290 but im thinking of buying one, and would like some info on it (is it loud, fast, etc.). If anyone could help it would be appreciated.


----------



## bond32

Quote:


> Originally Posted by *kizwan*
> 
> Since GPU was benched without any problem at work, you can rule out GPU. Did you test with CPU at stock clock or at lower OC?
> 
> I think most likely PSU problem. Maybe malfunction OCP.
> 
> BTW, what do you mean by "Running the R9 290X on two separate rails of my Seasonic X-1050"? X-1050 is single rail PSU.
> Picture?


Yeah I really should have taken pictures. Too late now until I take the block off again.

But seriously, this card smokes my old one. Firestrike on pt1 bios, +100mv, 1200/1600 : http://www.3dmark.com/3dm/2967082?

Where my old card poopoo'ed out at anything over 1185 core, 1330 memory.


----------



## BroJin

Quote:


> Originally Posted by *TheComputerNub*
> 
> Hello. Im not an owner of a amd r9 290 but im thinking of buying one, and would like some info on it (is it loud, fast ect.). If anyone could help it would be appreciated.


How much are you willing to spend?
You can buy good aftermarket 290's from $400.00 - $450.00 that are FAST and not LOUD


----------



## TheComputerNub

well for the whole computer i plan on spending 800-1000 maybe a little more.


----------



## TheComputerNub

And thanks for the info on the r9 290


----------



## bond32

Oh wow: http://www.3dmark.com/3dm/2967259?



Pulled 350 watts! That's a lot.

Not far from HOMECINEMA-PC at 11629. This makes me happy lol. Went from one of the worst 290x samples to a pretty darn good one.

Edit again: This was all with the latest 14.1 drivers. I see a few stating that 13.x are best for 4770k, is this correct? Going to try anyway.


----------



## igrease

OK so I was cleaning the old thermal paste off the GPU and somehow managed to get it in between the components. I cant seem to get it out. I do have an extremely small flat head but idk how delicate these things are


----------



## chiknnwatrmln

Doesn't matter. Gray TIMs are almost always nonconductive; I've gotten some TX2 and Gelid Extreme stuck in those things and nothing has happened.

If you're OCD about it like me, try using a qtip and press it firmly onto the area.


----------



## King PWNinater

I want to put a Kraken G10 on my future R9 290, but I have a dilemma... At first, I wanted the XFX one, because I thought it had a lifetime warranty and I thought it didn't have a warranty void sticker. After some research, I found out that it has two. The only other 290 without a warranty sticker is the Gigabyte one. So should I worry so much about the warranty, or not worry about it and get a really cheap 290?


----------



## Durquavian

Saw a number of users post communications that stated warranties weren't void if the cooler was removed, just that all parts need to be back to original if an RMA is warranted. Could swear XFX was one that stated that.


----------



## Brian18741

I can confirm Asus DCII warranty is VOID if you remove the stock cooler for any reason in Europe. May be different in the US.


----------



## Brian18741

Quote:


> Originally Posted by *igrease*
> 
> OK so I was cleaning the old thermal paste off GPU and some how managed to get it in between the things. I cant seem to get it out. I do have a extremely small flat heat but idk how delicate these things are
> 
> 
> 


I just used a cotton bud


----------



## igrease

Alright so I applied the new paste and here are the results.... Still not great at all. It can't be my case's air flow because my CPU temps are lower than ever. Could my card just run hot for whatever reason? Maybe a faulty cooler/fans?

*NEW*


*OLD*


----------



## rdr09

Quote:


> Originally Posted by *igrease*
> 
> Alright so I applied the new paste and here are the results.... Still not great at all. It can't be my case's air flow because my CPU temps are lower than ever. Could my card just run hot what whatever apparent reason? Maybe faulty cooler/fans?
> 
> *NEW*
> 
> 
> *OLD*
> 
> 


well, it improved, or it could be that the ambient changed. imo, the 2 fan design should have stopped with the Tahitis. it is just not enough for hawaii's tdp.


----------



## cennis

Quote:


> Originally Posted by *bond32*
> 
> Oh wow: http://www.3dmark.com/3dm/2967259?
> 
> 
> 
> Pulled 350 watts! That's a lot.
> 
> Not far from HOMECINEMA-PC at 11629. This makes me happy lol. Went from one of the worst 290x samples to a pretty darn good one.
> 
> Edit again: This was all with the latest 14.1 drivers. I see a few stating that 13.x are best for 4770k, is this correct? Going to try anyway.


Are these temps under water?


----------



## Paul17041993

Quote:


> Originally Posted by *igrease*
> 
> OK so I was cleaning the old thermal paste off GPU and some how managed to get it in between the things. I cant seem to get it out. I do have a extremely small flat heat but idk how delicate these things are


you can just use a plastic or wooden toothpick really, solder is metal and the components are ceramic (or similar) coated so they are not easy to break.






Quote:


> Originally Posted by *igrease*
> 
> Alright so I applied the new paste and here are the results.... Still not great at all. It can't be my case's air flow because my CPU temps are lower than ever. Could my card just run hot what whatever apparent reason? Maybe faulty cooler/fans?
> 
> *NEW*
> 
> 
> *OLD*






cooling is adequate; if you're looking to overclock and/or want extreme silence you'll need a 3rd party cooler or water, e.g. Arctic Cooling's Accelero III or IV.


----------



## igrease

Quote:


> Originally Posted by *rdr09*
> 
> 
> well, it improved or it could be that the ambient changed. imo, the 2 fan design should have stopped with the Tahitis. it is just not enough for hawaii's tpd.


Quote:


> Originally Posted by *Paul17041993*
> 
> you can just use a plastic or wooden toothpick really, solder is metal and the components are ceramic (or similar) coated so they are not easy to break.
> 
> 
> cooling is adequate, if you're looking to overclock and/ore extreme silence you'll need a 3rd party cooler or water, eg; arcticcooling's accelero III or IV.


Here is what happens if I take the side panel off and have my fan blowing on medium into the case.


----------



## Mega Man

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Doesn't matter. Gray TIM's are almost always nonconductive, I've gotten some TX2 and Gelid Extreme stuck in those things and nothings has happened.
> 
> If you're OCD about it like me, try using a qtip and press it firmly onto the area.


You can always use a tooth brush as well


----------



## rdr09

Quote:


> Originally Posted by *igrease*
> 
> Here is what happens if I take the side panel off and have my fan blowing on medium into the case.


looks a lot better. guess the side fan is not helping much? if you're ok with playing at stock, or prolly a little oc, it will work.

my side panel is off, too. that side is against the wall, so it is never seen.


----------



## igrease

Quote:


> Originally Posted by *rdr09*
> 
> looks a lot better. guess the side fan is not helping much? if it is ok with you playing at stock or prolly a little oc will work.
> 
> my side panel is off, too. that side is against the wall, so it is never seen.


Well it did help a little but not much. When I had it as an intake it increased the temp by 7c. I guess I'll just have to keep the side panel off until I save up for my water cooling.


----------



## ds84

Quote:


> Originally Posted by *Mega Man*
> 
> Hate to be the one to tell you this (I cannot see your rig, I am on mobile).
> 
> Your PSU affects everything in your rig. Dirty power (excessive ripple, bad voltage regulation)


so,my x850 isnt good?


----------



## BradleyW

Quote:


> Originally Posted by *ds84*
> 
> so,my x850 isnt good?


If that's a Seasonic X-850 Gold unit, it's absolutely fine. That thing can run OC'd 290Xs and an overclocked 3930K.


----------



## Paul17041993

Quote:


> Originally Posted by *igrease*
> 
> Here is what happens if I take the side panel off and have my fan blowing on medium into the case.


case needs higher pressure if you want those temps; get more or stronger intake fans in the bottom, front and/or side, and make sure all back and top fans are exhaust-only.


----------



## Mega Man

I did not say that. You asked about how your PSU could affect your GPU. I answered that. As I said I am on mobile, in China (I live in the US); I cannot see your rig as they don't show up on mobile. If you mean the X850 from Seasonic, then it is a great unit assuming it is not overloaded.


----------



## BradleyW

Quote:


> Originally Posted by *Mega Man*
> 
> I did not say that. You asked about how your PSU could affect your GPU. I answered that. As I said I am on mobile, in China (I live in the US); I cannot see your rig as they don't show up on mobile. If you mean the X850 from Seasonic, then it is a great unit assuming it is not overloaded.


I was just making a fair and valid comment.


----------



## bond32

Quote:


> Originally Posted by *cennis*
> 
> Are these temps under water?


Yes, temps are under water. Have 360, 240, and 120 radiators for a 4770K and 290X. Max VRM temp is 61C. After I disassembled and reassembled my Koolance block, temperatures were much better.


----------



## BradleyW

Quote:


> Originally Posted by *bond32*
> 
> Yes, temps are under water. Have 360, 240, and 120 radiators for a 4770K and 290X. Max VRM temp is 61C. After I disassembled and reassembled my Koolance block, temperatures were much better.


I guess I am lucky to get 50C on the VRMs on my WC setup, given that I have my fans at 600 RPM and the ambient is 22C in my heated room, plus the added heat from the 3930K and an additional 290X.


----------



## bond32

Quote:


> Originally Posted by *BradleyW*
> 
> I guess I am lucky to get 50c on the VRM's on my WC setup, given that I have my fans on 600rpm and the ambient is 22c in my heated room, plus the added heat from 3930K and additional 290X.


My vrm temps will hover around 50 on stock bios. 61 was with pt1 bios, benching, and about an hour of bf4 at +200mv.


----------



## BradleyW

Quote:


> Originally Posted by *bond32*
> 
> My vrm temps will hover around 50 on stock bios. 61 was with pt1 bios, benching, and about an hour of bf4 at +200mv.


That explains it!


----------



## Mega Man

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> I did not say that. You asked about how your PSU could affect your GPU. I answered that. As I said I am on mobile, in China (I live in the US); I cannot see your rig as they don't show up on mobile. If you mean the X850 from Seasonic, then it is a great unit assuming it is not overloaded.
> 
> 
> 
> I was just making a fair and valid comment.
Click to expand...

oh not at all! I did not mean that to be directed at you in any way. He seemed to think I was saying his PSU was bad, which I didn't.


----------



## igrease

Quote:


> Originally Posted by *Paul17041993*
> 
> case needs higher pressure if you want those temps, get more or stronger intake fans in bottom, front and/or side, make sure all back and top fans are exhaust-only.


Well I plan on upgrading my case to a Corsair 450D. It comes with two 120mm AF series fans in the front. Would that be good enough? If not, what kind of fans should I be looking at?


----------



## Widde

Quote:


> Originally Posted by *igrease*
> 
> Well it did help a little but not much. When I had it as an intake it increased the temp by 7C. I guess I'll just have to keep the side panel off until I save up for my water cooling.


I'm running without a side panel as well to let the beasts "breathe"







Got an Antec Three Hundred so it's a little tight in there







2 ref design cards as well









And got fans in every available slot ^^


----------



## NEK4TE

Hello guys,

sorry to jump in just like that.

Today, I purchased a Gigabyte R9 290 Windforce card.

Could somebody please tell me what is the max temp for this card?

Also, i see that this card has performance fan mode and regular fan mode.

My current setup looks like this:










Is this performance mode? And should it be like that?

Big thanks for your support!


----------



## Widde

Quote:


> Originally Posted by *NEK4TE*
> 
> Hello guys,
> 
> sorry to jump in just like that.
> 
> Today, i purchased Gigabyte r9 290 Windforce card.
> 
> Could somebody please tell me what is the max temp for this card?
> 
> Also, i see that this card has performance fan mode and regular fan mode.
> 
> My current setup looks like this:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> is this performance mode? and it should be like that?
> 
> Big thanks for your support!


If it's a regular 290 and not a 290X it is just different BIOSes; it doesn't impact the fan profile


----------



## NEK4TE

How come their website says it matters how the switch is turned?

If you check here:
http://www.gigabyte.com/products/product-page.aspx?pid=4884#ov

It says "Smart Fan Control" - my switch is turned to the left, meaning I am on that performance mode now?

Also, i am concerned about temps.
idle temp is around 37-38C and while playing games I saw around 70C.
I am coming from an NVIDIA 780 Ti, and temps were lower, so I'm not sure if these temps are fine?

Big thanks!


----------



## Widde

Quote:


> Originally Posted by *NEK4TE*
> 
> How come their website says it matters how switch is turned?
> 
> If you check here:
> http://www.gigabyte.com/products/product-page.aspx?pid=4884#ov
> 
> It says " Smart Fan Control " - my switch is turned to the left, meaning i am on that performance mode now?
> 
> Also, i am concerned about temps.
> idle temp is around 37-38C and while playing games, i saw around 70C.
> I am coming from NVIDIA 780ti , and temps were lower, so, not sure if mentioned temps are fine?
> 
> Big thanks!


Seems like Gigabyte have made some changes ^^ Your temps should be fine; I'm up to 80 degrees while playing Battlefield 4 with a pretty "loud" fan profile. Idle temps are around 40-45.

Your performance mode seems to be the switch to the back of the case "close to the bracket"


----------



## NEK4TE

To be honest with you, I don't know.
This card was purchased today, and I am getting used to it.
I was able to sell the 780 Ti for good $ and for my needs this is more than enough, really.

The switch is turned to the back of the case (left side once the card is mounted) - so, this is correct?

I had a very long day today, very tired, and their images/explanation are confusing me a bit.

Big thanks!


----------



## NEK4TE

Maybe better picture:


----------



## Dhalgren65

Yep.
In your photo it looks like the switch is in the middle!
Closer to the rear ramps the fan up quicker.
I would still use custom profile in MSI AB if possible...
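A custom profile in Afterburner is essentially a piecewise-linear map from GPU temperature to fan duty cycle. As a rough sketch of how that works (the curve points below are made-up examples, not recommended values):

```python
# Sketch of how a custom fan curve maps GPU temperature to fan speed.
# The curve points are hypothetical examples, not tuned recommendations.
CURVE = [(40, 30), (60, 50), (75, 80), (85, 100)]  # (temp C, fan %)

def fan_speed(temp_c):
    """Linearly interpolate fan % between curve points, clamped at the ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
```

At 50C this example curve lands halfway between the 40C and 60C points, i.e. 40% fan; Afterburner just does this interpolation for you between the points you drag on the graph.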


----------



## NEK4TE

For some reason i cant push it more to the left








It does look a little bit (while checking the image) like it is in the middle, but that's what I get by pushing it to the left side.

I am gonna keep it as it is, and make AB profiles like this maybe:



Big thanks all!


----------



## FuriousPop

Hey all, wanted to ask (couldn't find anywhere that would tell me): would my Enermax Plat 1000W PSU run 3x R9 290 Tri-X OC?

Currently running 2x in crossfire - would also be looking at watercooling system to be implemented also once i get the 3rd.

So would the 3 run with or without the watercooling system on my current PSU?


----------



## Paul17041993

Quote:


> Originally Posted by *FuriousPop*
> 
> Hey all, wanted to ask (Couldn't find anywhere that would tell me), would my Enermax Plat 1000W PSU, run 3x R9 290 Tri-x OC...???
> 
> Currently running 2x in crossfire - would also be looking at watercooling system to be implemented also once i get the 3rd.
> 
> So would the 3 run with or without the watercooling system on my current PSU?


if on water your power draw should drop significantly, so you'll likely have enough room on a 1kW for 3; however, this would mean no CPU or GPU overclocks and only one or two HDDs.
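As a back-of-envelope check (the wattage figures here are rough assumptions, not measurements: roughly 250W per reference 290 under load, plus CPU and platform overhead):

```python
# Rough PSU headroom estimate for multi-GPU R9 290 builds.
# All figures below are assumptions for illustration only.
GPU_W = 250          # approx. load draw per reference R9 290
CPU_W = 130          # non-overclocked CPU under load
PLATFORM_W = 70      # board, RAM, drives, pump/fans

def total_draw(num_gpus):
    """Estimated sustained system draw in watts."""
    return num_gpus * GPU_W + CPU_W + PLATFORM_W

def headroom(psu_watts, num_gpus):
    """Remaining margin on the PSU's continuous rating."""
    return psu_watts - total_draw(num_gpus)
```

With these assumptions, three cards land around 950W against a 1000W unit, which is why the "no overclocks" caveat matters: pushing extra voltage into three cards would erase the remaining margin.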


----------



## cennis

Quote:


> Originally Posted by *Paul17041993*
> 
> if on water your power draw should drop significantly, so you'll likely have enough room on a 1KW for 3, however this would mean no CPU or GPU overclocks and only one or two HDDs.


Does reducing just the core temp reduce the power draw, or do I need to reduce the VRM temps too (i.e. 80C VRM temps), in the case of an AIO mounted on the core and air cooling the VRMs?

Did some testing with Reference Asus 290 bios on the DCUII card.
Using the same stage in Battlefield (with a lot of dust and effects, idling) I tested different overclocks and voltage/VRM temps with the same fan speed.

Firstly, on either BIOS, VRM temp increases steeply with voltage in a similar fashion.

However I noticed:
ASUS DCUII BIOS: 1150MHz (+75mV offset), 1.15V under load, VRM1 77C
ASUS REF BIOS: 1150MHz (+75mV offset), 1.21V under load, VRM1 77C

This is the minimum voltage I need for 1150MHz on both BIOSes before artifacts appear.
The same pattern can be observed when using the minimum voltage for 1175MHz and 1200MHz.
I.e. 1200MHz needs 1.21V on the ASUS DCUII BIOS but 1.28V on the REF BIOS, and the VRM will be close to 90C in both cases.
However, if I put 1.28V on the DCUII BIOS, the VRM will far exceed 100C.

It seems like on the ASUS DCUII BIOS I need a significantly lower voltage to achieve the same clock, BUT the VRM heats up just as much.
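To first order, a chip's dynamic power at a fixed clock scales with the square of core voltage (a standard CMOS approximation, not an exact GPU model), so the voltage gap between the two BIOSes implies a real power gap:

```python
# First-order estimate: dynamic power ~ V^2 at a fixed clock.
# This is a CMOS approximation, not an exact model of Hawaii's power draw.
def power_ratio(v_a, v_b):
    """Relative core power at voltage v_a versus v_b, same clock."""
    return (v_a / v_b) ** 2

# 1.21V on the REF BIOS vs 1.15V on the DCUII BIOS, both at 1150MHz:
extra = power_ratio(1.21, 1.15) - 1.0  # roughly 10% more power on the REF BIOS
```

By this estimate the REF BIOS should be pushing roughly 10% more power through the VRMs at the same clock, so equal VRM readings on both BIOSes is surprising; differences in VRM switching setup or sensor reporting between the two BIOSes could account for it.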


----------



## Roy360

How is Corsair's warranty in Canada?

Yesterday my AX1200 shut off while it was mining. I think the power supply overheated. I originally thought it was due to my high internal temps, so I turned off a video card and left the case door open. Within 10 minutes the PSU was burning hot again, but its fan was still blowing at its low speed. I'm thinking of just leaving it alone, letting it die by itself and RMAing later, but if Corsair is a hassle, I might try to fix the problem now.

But how does one go about cooling a power supply? It already has its dedicated intake and a thin fan filter to prevent dust build-up. Tape a universal GPU block to the outside housing of the PSU?

What kind of PSUs are you guys running for a triple R9 290 setup?


----------



## Roboyto

Quote:


> Originally Posted by *Roy360*
> 
> How is Corsair's warranty in Canada?
> 
> Yesterday my AX1200 shut off while it was mining. I think the power supply overheated. I originally thought it was due to my high internal temps, so I turned off a video card left the case door open. Within 10 minutes the PSU is burning hot again, but its fan is still blowing at its low speed. I'm thinking of just leaving it alone, and letting it die by itself and RMA later, but if Corsair is a hassle, I might try to fix the problem now.
> 
> But how does one go about cooling a power supply? It already has it's dedicated intake and thin fan filter to prevent dust build up. Tape a universal GPU block to the outside housing of the PSU?
> 
> What kind of PSUs are you guys running for a triple R9 290 setup?


The fan should ramp up once temperatures rise. You may have an issue with it not adjusting fan speed accordingly. With only two 290(X) running at high load, that PSU shouldn't be even close to overheating.

You would be surprised how much a filter can restrict airflow; try removing it and see what happens.


----------



## Mega Man

Quote:


> Originally Posted by *FuriousPop*
> 
> Hey all, wanted to ask (Couldn't find anywhere that would tell me), would my Enermax Plat 1000W PSU, run 3x R9 290 Tri-x OC...???
> 
> Currently running 2x in crossfire - would also be looking at watercooling system to be implemented also once i get the 3rd.
> 
> So would the 3 run with or without the watercooling system on my current PSU?


It would be close but doable with no OCs.
Quote:


> Originally Posted by *Roy360*
> 
> How is Corsair's warranty in Canada?
> 
> Yesterday my AX1200 shut off while it was mining. I think the power supply overheated. I originally thought it was due to my high internal temps, so I turned off a video card left the case door open. Within 10 minutes the PSU is burning hot again, but its fan is still blowing at its low speed. I'm thinking of just leaving it alone, and letting it die by itself and RMA later, but if Corsair is a hassle, I might try to fix the problem now.
> 
> But how does one go about cooling a power supply? It already has it's dedicated intake and thin fan filter to prevent dust build up. Tape a universal GPU block to the outside housing of the PSU?
> 
> What kind of PSUs are you guys running for a triple R9 290 setup?


PSUs are not something you want to let die; they can take out everything in your system. With that said, most PSU fans are designed to be low speed. That is one way they keep the dB down. As suggested, try removing the fan filter.

Again, due to the low speed and dB they tend not to be high static pressure.


----------



## ds84

Quote:


> Originally Posted by *Mega Man*
> 
> oh not at all! I did not mean that to be directed at you in any way. He seemed to think I was saying his PSU was bad, which I didn't.


Wasn't blaming you though... thought I missed out on something about the PSU affecting the GPU. I will try out 14.4 WHQL tonight and see how it goes....


----------



## WotanWipeout

I run 2 x 290X and 1 x 290 on water. Power draw is between 750W and 1200W, depending on the power target. My PSU is a Corsair 1200i.

Hope that helped


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Roy360*
> 
> How is Corsair's warranty in Canada?
> 
> Yesterday my AX1200 shut off while it was mining. I think the power supply overheated. I originally thought it was due to my high internal temps, so I turned off a video card left the case door open. Within 10 minutes the PSU is burning hot again, but its fan is still blowing at its low speed. I'm thinking of just leaving it alone, and letting it die by itself and RMA later, but if Corsair is a hassle, I might try to fix the problem now.
> 
> But how does one go about cooling a power supply? It already has it's dedicated intake and thin fan filter to prevent dust build up. Tape a universal GPU block to the outside housing of the PSU?
> 
> *What kind of PSUs are you guys running for a triple R9 290 setup?*


My solution is this ........... you will need a 1200W - 1300W peak PSU to run tri just to be future-proofed, especially if you run 2011.


When it comes to PSUs don't be tight about it ........ you need good clean stable power


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> My solution is this ........... You will need a 1200w - 1300 peak PSU to run TRI just to be future proofed . Especially if you run 2011
> 
> 
> When it comes to PSU's don't be tight about it ........ you need good clean stable power


You have good taste sir, very good taste indeed


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You have good taste sir, very good taste indeed


Yep that's how I roll man








Fully modular, well priced, going for about $270AU at Umart in Brisbane. That one's my spare, brand new, and I have one in my rig now


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yep that's how I roll man
> 
> 
> 
> 
> 
> 
> 
> 
> Fully modular, well priced, going for about $270AU at Umart in Brisbane. That one's my spare, brand new, and I have one in my rig now


I think mine was $330 brand new about 2 years ago.

Great PSU, I really should have a back-up....just in case.


----------



## rdr09

Quote:


> Originally Posted by *bond32*
> 
> My vrm temps will hover around 50 on stock bios. 61 was with pt1 bios, benching, and about an hour of bf4 at +200mv.


any update on benches using the 13 drivers? please.

my experience is physics is up, graphics score is down, and overall is down with 14.


----------



## ds84

Something interesting I found in my GPU-Z..

For the 1st 290, it has ROP/TMU of 64/176 and my newer 290 has 64/160.. isn't 64/176 for the 290X???

Shaders are also different: 2816 for the 1st GPU and 2560 for the 2nd GPU...

Also, I got a BSOD while turning off Xfire..

Problem signature:
Problem Event Name: BlueScreen
OS Version: 6.1.7601.2.1.0.768.3
Locale ID: 1033

Additional information about the problem:
BCCode: 1000007e
BCP1: FFFFFFFFC0000005
BCP2: 0000000000000000
BCP3: FFFFF880061FC748
BCP4: FFFFF880061FBFA0
OS Version: 6_1_7601
Service Pack: 1_0
Product: 768_1

Files that help describe the problem:
C:\Windows\Minidump\042914-9094-01.dmp
C:\Users\XXX\AppData\Local\Temp\WER-15662-0.sysdata.xml


DMP files are as such:

https://www.dropbox.com/s/lfylk1jmjf95yth/042914-9094-01.dmp

https://www.dropbox.com/s/2hou4g56wjj40g9/042914-9765-01.dmp


----------



## Nissejacke

Heya!

I haven't seen it mentioned but I guess everyone knows by now that there are new WHQL drivers out.
14.4 WHQL
http://support.amd.com/en-us/download/desktop?os=Windows%208.1%20-%2064

/Jakob


----------



## Matt-Matt

Quote:


> Originally Posted by *Paul17041993*
> 
> you can just use a plastic or wooden toothpick really, solder is metal and the components are ceramic (or similar) coated so they are not easy to break.
> 
> 
> cooling is adequate; if you're looking to overclock and/or want extreme silence you'll need a 3rd party cooler or water, e.g. Arctic Cooling's Accelero III or IV.


A credit/debit card works fine too! Just don't pry at the components.
Also a used toothbrush is good too; don't put it on the core though, and let it dry out for a bit in case ANY water/moisture is left in it. Probably wash the toothbrush with metho/distilled water before you use it.
Quote:


> Originally Posted by *igrease*
> 
> Alright so I applied the new paste and here are the results.... Still not great at all. It can't be my case's airflow because my CPU temps are lower than ever. Could my card just run hot for whatever apparent reason? Maybe a faulty cooler/fans?
> 
> *NEW*
> 
> 
> *OLD*


It depends on the cooler.. What cooler is it? I see that your fan speed is around 80% in the new and 90% in the old. So it has made quite a bit of difference actually.

VRMs are really alright at 83C so that's not a big issue. The cooler is better but the main issue now is your core.


----------



## ds84

I used the DDU to detect and uninstall all drivers, including AMD and Nvidia... Ran 3DMark and got this.

http://www.3dmark.com/3dm/2971516?

Is this normal for 290 Xfire on stock clocks?


----------



## igrease

Quote:


> Originally Posted by *Matt-Matt*
> 
> A credit/debit card works fine too! Just don't pry at the components.
> Also a used toothbrush is good too; don't put it on the core though, and let it dry out for a bit in case ANY water/moisture is left in it. Probably wash the toothbrush with metho/distilled water before you use it.
> It depends on the cooler.. What cooler is it? I see that your fan speed is around 80% in the new and 90% in the old. So it has made quite a bit of difference actually.
> 
> VRM's are really alright at 83c so that's not a big issue. The cooler is better but the main issue now is your core.


Well I sort of fixed the issue with the cooling. I simply removed the side panel and have a medium sized fan blowing directly onto the GPU. Now it maxes out at 80C on the core and 75C on VRM1 @ 1100/1250.


----------



## Mega Man

Quote:


> Originally Posted by *igrease*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Matt-Matt*
> 
> A credit/debit card works fine too! Just don't pry at the components.
> Also a used toothbrush is good too, don't put it on the core though and let it dry out for a bit in case ANY water/moisture is in that toothbrush though. Probably wash the toothbrush itself before you use it with metho/distilled water;
> It depends on the cooler.. What cooler is it? I see that your fan speed is around 80% in the new and 90% in the old. So it has made quite a bit of difference actually.
> 
> VRM's are really alright at 83c so that's not a big issue. The cooler is better but the main issue now is your core.
> 
> 
> 
> Well I sort of fixed the issue with the cooling. I simply removed the side panel and have a medium sized fan blowing directly onto the GPU. Now it maxes out at 80c on the core and 75c on VRM1 @ 1100/1250.
Click to expand...

2 different sets of 2 words:

poor airflow

static pressure (fans)


----------



## Kokin

Hey guys, I bought a used reference XFX R9 290 from Craigslist yesterday. What's the typical offset for 1200/1250?

This is what I'm getting at 1200/1250 with +150mV and +50 power limit. ASIC is at 72%, which seems good since I plan to get a waterblock.

GPU-Z max readings during benchmarks:


Heaven 4.0/Valley1.0:


----------



## chiknnwatrmln

I'd say that's pretty good; my card requires +168mV for 1200 under water, and my card is above average.

If you end up putting a block on, I'd estimate your max OC to be around 1250-1260MHz.


----------



## BroJin

Quote:


> Originally Posted by *Kokin*
> 
> Hey guys, I bought a used reference XFX R9 290 from Craigslist yesterday. What's the typical offset for 1200/1250?
> 
> This is what I'm getting with 1200/1250 with +150mv and +50 power limit. ASIC is at 72%, which seems good since I plan to get a waterblock.
> 
> GPU-Z max readings during benchmarks:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> Heaven 4.0/Valley1.0:


How are your VRM temps so low???? Are they even working?


----------



## Kokin

It looks pretty ghetto right now, but this is what it looks like to have a 7950 in the loop but using a 290. That's 600mm of rad space just for my CPU right now.











Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I'd say that's pretty good, my card requires +168mv for 1200 under water and my card is above average.
> 
> If you end up puttinga block on, I'd estimate your max OC to be around 1250-1260MHz.


Thanks for your input. It gives me high hopes for my card, though I'm happy it already reaches 1200MHz core. It's a big improvement coming from a 7950 that could do 1300/1700.
Quote:


> Originally Posted by *BroJin*
> 
> How are your VRM temps so low???? Are they even working?


I'm not sure, but the fan is blasting at 100% (super vacuum). The original owner bought the card in Feb 2014, so it's possible the reference cooler could have been improved for the VRMs, but I doubt it.


----------



## gerardfraser

The XFX reference cooler is great. Hell, it beats my Sapphire Tri-X in cooling at crazy volts on air in crossfire. Love the reference XFX card.

High voltage, max fan speed was 70% on the XFX and 67% on the Tri-X
1200/1600


Regular gaming BF4
1075/1354


----------



## BroJin

Quote:


> Originally Posted by *gerardfraser*
> 
> The XFX reference cooler is great. Hell, it beats my Sapphire Tri-X in cooling at crazy volts on air in crossfire. Love the reference XFX card.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> High voltage ,Max fan speed was 70% on XFX and 67% on Tri-x
> 1200/1600
> 
> 
> Regular gaming BF4
> 1075/1354


Wow, I'm speechless. VRM temps make it look like you're running on water.


----------



## chronicfx

Quote:


> Originally Posted by *gerardfraser*
> 
> The XFX reference cooler is great. Hell, it beats my Sapphire Tri-X in cooling at crazy volts on air in crossfire. Love the reference XFX card.
> 
> High voltage ,Max fan speed was 70% on XFX and 67% on Tri-x
> 1200/1600
> 
> 
> Regular gaming BF4
> 1075/1354


I get the same temps with my trifire setup gaming. The reference coolers are pretty sweet for VRMs. Mine are never out of the low 50s on all three cards.


----------



## Roy360

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> My solution is this ........... You will need a 1200w - 1300 peak PSU to run TRI just to be future proofed . Especially if you run 2011
> 
> 
> When it comes to PSU's don't be tight about it ........ you need good clean stable power


The AX1200 is a 1200W PSU with peak power at 1504W. I kinda thought it would be enough.


----------



## gerardfraser

Quote:


> Wow, I'm speechless. VRM temps make it look like you're running on water.


Yes, I think they run well on air.
It's too bad there is so much negativity surrounding the reference coolers when they are just fine. JMO

Quote:


> I get the same temps with my trifire setup gaming. The reference coolers are pretty sweet for vrm. Mine are never out of the low 50's on all three cards.


The reference coolers are very good. I may just buy another reference card and sell the Tri-X card.
I have not run trifire since the 5970/5850 and that was a great setup. I thought about trifire on the R9 290 cards but I'd rather run crossfire plus an Nvidia card for PhysX in older games, like Batman/Mirror's Edge etc.


----------



## igrease

Quote:


> Originally Posted by *gerardfraser*
> 
> The XFX reference cooler is great. Hell, it beats my Sapphire Tri-X in cooling at crazy volts on air in crossfire. Love the reference XFX card.
> 
> High voltage ,Max fan speed was 70% on XFX and 67% on Tri-x
> 1200/1600
> 
> 
> Regular gaming BF4
> 1075/1354


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *ds84*
> 
> Something interesting I found in my GPU-Z..
> 
> For the 1st 290, it has ROP/TMU of 64/176 and my newer 290 has 64/160.. isn't 64/176 for the 290X???
> 
> Shaders are also different: 2816 for the 1st GPU and 2560 for the 2nd GPU...
> 
> Also, I got a BSOD while turning off Xfire..
> 
> Problem signature:
> Problem Event Name: BlueScreen
> OS Version: 6.1.7601.2.1.0.768.3
> Locale ID: 1033
> 
> Additional information about the problem:
> BCCode: 1000007e
> BCP1: FFFFFFFFC0000005
> BCP2: 0000000000000000
> BCP3: FFFFF880061FC748
> BCP4: FFFFF880061FBFA0
> OS Version: 6_1_7601
> Service Pack: 1_0
> Product: 768_1
> 
> Files that help describe the problem:
> C:\Windows\Minidump\042914-9094-01.dmp
> C:\Users\XXX\AppData\Local\Temp\WER-15662-0.sysdata.xml
> 
> 
> DMP files are as such:
> 
> https://www.dropbox.com/s/lfylk1jmjf95yth/042914-9094-01.dmp
> 
> https://www.dropbox.com/s/2hou4g56wjj40g9/042914-9765-01.dmp


1st card is a 290x or unlocked 290


----------



## Roboyto

Quote:


> Originally Posted by *BroJin*
> 
> [/SPOILER]
> 
> How is your VRM temps so low???? Are they even working?


The stock blower cools VRM1 first; this is why core temperature sucks on the reference cooler. VRM2 barely does any work so it doesn't get that hot; it's only at that temp because the hot air from the core is blowing across it. I got great temps on the VRMs with the reference blower as well. I was able to push 1200 core on the stock cooler with 75% fan. Overall I got about 85-90% of my max clocks on the reference cooler; water did help me push further.


----------



## Kokin

Has there been any test comparing the different waterblocks? I'm most likely going to get EK, but I read that Koolance and XSPC are better in terms of temps?


----------



## lawson67

Hey guys, I need some help... AB beta 19 does not apply my overclock in CCC so I have stopped using it! And I cannot get Trixx to load my overclock at system startup even with "restore clocks at Windows startup" checked. I have a crossfire R9 290 setup and I have reinstalled all drivers; I am using the new 14.4 drivers and had the same experience with the 13.12 drivers, and I am running Windows 8.1. So does anyone know how to make AB move the power limit etc. in CCC, or make Trixx load your overclock at Windows boot? Many thanks


----------



## heroxoot

I had a similar issue; removing MSI AB, DDU removing the driver, reinstalling the driver, and reinstalling MSI AB fixed it for me. Make sure you delete your profiles too.


----------



## lawson67

Quote:


> Originally Posted by *heroxoot*
> 
> I had similar issue and removing MSI AB, DDU removing the driver, reinstalling the driver, and reinstalling MSI AB fixed it for me. Make sure you delete your profiles too.


Thanks mate i will give that a go


----------



## boot318

Quote:


> Originally Posted by *igrease*


I lol'd looking at that gif.


----------



## Forceman

Quote:


> Originally Posted by *ds84*
> 
> Something interested i found in my GPU-z..
> 
> For the 1st 290, it has ROP/TMU of 64/176 and my newer 290 has 64/160.. isnt 64/176 for 290x???
> 
> Shaders also diff, 2816 for 1st gpu and 2560 for 2nd gpu...


GPU-z sometimes messes up reading cards in Crossfire, download the HawaiiInfo tool from this thread and find out for sure.

http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread
Quote:


> Originally Posted by *Kokin*
> 
> Has there been any test comparing the different waterblocks? I'm most likely going to get EK, but I read that Koolance and XSPC are better in terms of temps?


Yes, there's a chart posted somewhere in this thread (I saw it a few days ago, but can't find it on the phone). It compares core and VRM temps.

Edit: couldn't find the post here, but this is the source thread for the testing - the charts are in the OP.

http://www.xtremesystems.org/forums/showthread.php?288109-Stren-s-R9-290-290x-Water-Block-Testing


----------



## bond32

If I recall correctly, all the GPU blocks performed very close, with almost negligible differences. Which bums me out as I really would have preferred the XSPC personally. Ended up getting the Koolance as it was the only thing available at the time. Works great though!


----------



## Jflisk

Quote:


> Originally Posted by *Kokin*
> 
> Has there been any test comparing the different waterblocks? I'm most likely going to get EK, but I read that Koolance and XSPC are better in terms of temps?


EK is top of the line. Just a heads up, I have owned XSPC; not happening. EK is low flow and keeps excellent temps. Remember to get the backplate for anything you get; it costs more money at 30.00 a piece but is worth it as far as temps go.


----------



## Kokin

Quote:


> Originally Posted by *Forceman*
> 
> Yes, there's a chart posted somewhere in this thread (I saw it a few days ago, but can't find it on the phone). Compares core and VRM Temps.
> 
> Edit: couldn't find the post here, but this is the source thread for the testing - the charts are in the OP.
> 
> http://www.xtremesystems.org/forums/showthread.php?288109-Stren-s-R9-290-290x-Water-Block-Testing


Awesome thanks for the link. +rep
Quote:


> Originally Posted by *bond32*
> 
> If I recall correctly, all the gpu blocks performed very close. Almost negligible. Which bums me out as I really would have preferred the xspc personally. Ended out getting the Koolance as it was the only thing available at the time. Works great though!


I also like the XSPC block, but I prefer a see-through top since I'm using Ice White Pastel coolant and it will be mounted vertically. The Aquacomputer block looks very nice, but it doesn't cover the whole PCB and their nickel/plexi has been sold out for a while (I don't like the black plexi version). So that leaves me with the EK CSQ block since it seems to cover what I want (nickel, plexi, fully covers PCB). I would probably see how it looks as the stock frosted look, otherwise I'll try to follow lowfat's polishing guide to get that clear look.

Quote:


> Originally Posted by *Jflisk*
> 
> EK is top of the line. Just a heads up, I have owned XSPC; not happening again. EK is low flow restriction and keeps excellent temps. Remember to get the backplate for whatever you buy; it costs about $30.00 more apiece, but it's worth it as far as temps go.


Stren's testing seems to say otherwise, but again it's only a 1-2C difference between all the blocks. I'll probably go with the EK CSQ and possibly polish it out.


----------



## Roboyto

Quote:


> Originally Posted by *Kokin*
> 
> Has there been any test comparing the different waterblocks? I'm most likely going to get EK, but I read that Koolance and XSPC are better in terms of temps?


Nearly all the blocks will give you similar temperatures for cooling the core. When it comes to VRM temperatures, specifically VRM1, however, there is a difference, as the thermal pads that most of the manufacturers include are fairly poor quality.

The block/backplate combination with the best VRM1 temperature out of the box is Aquacomputer. They have a passive and active backplate. The passive does nearly as good of a job as the active, but is rather ugly/incomplete looking compared to the nice nickel plated cover w/ heatpipe.






You can achieve same/similar temperatures to AquaComputer with other blocks by using higher performing thermal pads. 
Fujipoly Ultra Extreme specifically: http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html 
If you plan on using these thermal pads, please check my thread link below so you can order the appropriate thickness for your block; this link is for 1mm.

I am using the XSPC block and it works well. It worked even better once I upgraded thermal pads for the VRMs.

See my thread here: http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures

My temperature drops were quite drastic, and I still haven't added thermal pads to my backplate. I'm sure they will come down further once I get around to that...just not feeling like draining my loop to do it.

Other users with EK blocks have reported similar temperature drops to what I experienced. I am unsure of Koolance performance as I haven't seen many people with those blocks; most popular seem to be EK, XSPC, and AquaComputer.
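To see why pad conductivity matters so much, here is a rough one-dimensional conduction sketch. All figures below (pad area, VRM power, conductivity values) are illustrative assumptions, not measured numbers from any of these blocks:

```python
# Rough 1-D conduction estimate for a VRM thermal pad: R = t / (k * A).
# All numbers below are illustrative assumptions, not measured values.

def pad_resistance(thickness_m, k_w_per_mk, area_m2):
    """Thermal resistance (K/W) of a flat pad, ignoring contact resistance."""
    return thickness_m / (k_w_per_mk * area_m2)

AREA = 15e-3 * 100e-3   # assumed ~15 mm x 100 mm VRM strip
POWER = 30.0            # assumed watts dumped into VRM1

stock = pad_resistance(1e-3, 3.0, AREA)    # generic ~3 W/mK pad, 1 mm thick
fuji = pad_resistance(1e-3, 17.0, AREA)    # Fujipoly Ultra Extreme, 17 W/mK

# Temperature rise across the pad alone: dT = P * R
print(f"stock pad: +{POWER * stock:.1f} C across the pad")
print(f"fujipoly:  +{POWER * fuji:.1f} C across the pad")
```

Even with these made-up numbers, the higher-conductivity pad cuts the temperature drop across the pad itself by several degrees, which is consistent with the drops people report after swapping pads.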


----------



## ds84

Quote:


> Originally Posted by *Forceman*
> 
> GPU-z sometimes messes up reading cards in Crossfire, download the HawaiiInfo tool from this thread and find out for sure.
> 
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread
> Yes, there's a chart posted somewhere in this thread (I saw it a few days ago, but can't find it on the phone). Compares core and VRM Temps.
> 
> Edit: couldn't find the post here, but this is the source thread for the testing - the charts are in the OP.
> 
> http://www.xtremesystems.org/forums/showthread.php?288109-Stren-s-R9-290-290x-Water-Block-Testing


Thanks for the link, will try it once I reach home.


----------



## Matt-Matt

Quote:


> Originally Posted by *Roboyto*
> 
> Nearly all the blocks will give you similar temperatures for cooling the core. When it comes to VRM temperatures, specifically VRM1, however there is a difference as the thermal pads that most of the manufacturers include are fairly poor quality.
> 
> The block/backplate combination with the best VRM1 temperature out of the box is Aquacomputer. They have a passive and active backplate. The passive does nearly as good of a job as the active, but is rather ugly/incomplete looking compared to the nice nickel plated cover w/ heatpipe.
> 
> 
> 
> 
> 
> 
> 
> You can achieve same/similar temperatures to AquaComputer with other blocks by using higher performing thermal pads.
> Fujipoly Ultra Extreme specifically: _http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html _
> If you plan on using these thermal pads, please check my thread link below so you can order the appropriate thickness for your block; this link is for 1mm.
> I am using the XSPC block and it works well. It worked even better once I upgraded thermal pads for the VRMs.
> 
> _See my thread here: http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures_
> 
> My temperature drops were quite drastic, and I still haven't added thermal pads to my backplate. I'm sure they will come down further once I get around to that...just not feeling like draining my loop to do it.
> 
> Other users with EK blocks have reported similar temperature drops to what I experienced. I am unsure of Koolance performance as I haven't seen many people with those blocks; most popular seem to be EK, XSPC, and AquaComputer.


I will try it on my koolance block one day.... lol


----------



## Roboyto

Quote:


> Originally Posted by *Matt-Matt*
> 
> I will try it on my koolance block one day.... lol


I tried it out since my VRM1 temps were so high at a high OC; at 1300/1700 with +200mV they were reaching the mid-to-high 70s.

The temp drops are worth the effort and $20 for the pads.


----------



## Kokin

So expensive though! I will probably give it a try in the future, but just getting a block, backplate, and the CSQ bridge is already killing my wallet.


----------



## heroxoot

I just noticed that on Windows 8, if I increase my voltage it does not do anything, but on Windows 7 increasing the voltage slider past +25mV works 100%. I'm not sure why that is. Anyone else notice anything like this on the 290X? This is on the 13.12 driver on both 7 and 8, so power limit +50% should work on both.

I was wrong; once the benchmark starts it goes right to 1.18V even if it's set for +40, which should put it at 1.2V or close.
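For reference, the offset arithmetic the post is doing looks like this. The 1.163 V stock VDDC is only an assumed example; the real stock voltage varies per card and per BIOS:

```python
# Expected load voltage = stock VDDC + software offset.
# STOCK_VDDC is an assumed example; actual per-card values differ.
STOCK_VDDC = 1.163      # volts (assumption, varies by card/BIOS)
offset_mv = 40          # slider setting in millivolts

expected = STOCK_VDDC + offset_mv / 1000.0
print(f"expected under load: {expected:.3f} V")  # near the ~1.2 V the post expects
```

If the card reports 1.18 V instead, the offset is being clamped or dropped somewhere, not mis-added.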


----------



## Matt-Matt

Quote:


> Originally Posted by *Roboyto*
> 
> I tried it out since my VRM1 temps were so high at high OC. 1300/1700 +200mV they were reaching mid-high 70s
> 
> The temp drops are worth the effort and $20 for the pads.


Mine hit 70°C stock on VRM1. I do need better fans in my loop, to be fair, but the thermal pads provided are crap.


----------



## Forceman

Quote:


> Originally Posted by *heroxoot*
> 
> I just noticed that on Windows 8, if I increase my voltage it does not do anything, but on Windows 7 increasing the voltage slider past +25mV works 100%. I'm not sure why that is. Anyone else notice anything like this on the 290X? This is on the 13.12 driver on both 7 and 8, so power limit +50% should work on both.
> 
> I was wrong; once the benchmark starts it goes right to 1.18V even if it's set for +40, which should put it at 1.2V or close.


I had a problem where the voltage increase would reset (or just not be applied) sometimes after the monitor went to sleep/the computer was woken up from sleep. Maybe same thing happening to you?


----------



## Roboyto

Quote:


> Originally Posted by *Matt-Matt*
> 
> Mine hit 70°C stock on VRM1. I do need better fans in my loop, to be fair, but the thermal pads provided are crap.


Koolance provides 2 different thickness pads. Maybe you used the thinner one and it's not making good contact? They give 0.5mm and 0.7mm thicknesses.


----------



## kizwan

Quote:


> Originally Posted by *ds84*
> 
> I used the DDU to detect and uninstall all drivers, including AMD and Nvidia... Ran 3DMark and got this.
> 
> http://www.3dmark.com/3dm/2971516?
> 
> Is this normal for 290 Xfire on stock clocks?


Graphics score looks normal.
Quote:


> Originally Posted by *Kokin*
> 
> So expensive though! I will probably give it a try in the future, but just getting a block, backplate, and the CSQ bridge is already killing my wallet.


Why the CSQ bridge if you're only going to run one 290? Assuming you're going to run it in the same rig you posted earlier.

Anyway, if you buy directly from the EKWB or Aquacomputer online shop, the Aqua Kryographics + passive backplate is 6.5 euro cheaper (excluding shipping) than the EK-FC R9-290X (Original CSQ) + backplate, and the Aqua Kryographics + *active* backplate is 8.5 euro more (excluding shipping) than the EK-FC R9-290X (Original CSQ) + backplate. If the Kryographics + passive/active backplate already provides excellent VRM cooling, don't you think it's a good idea to get them instead of the EK-FC?

Well, I chose the EK-FC waterblock over the Kryographics because I like the EK-FC waterblock design very much.







I also use Fujipoly Extreme for VRMs.


----------



## Matt-Matt

Quote:


> Originally Posted by *Roboyto*
> 
> Koolance provides 2 different thickness pads. Maybe you used the thinner one and it's not making good contact? They give 0.5mm and 0.7mm thicknesses.


I thought I had at first too, but I put the thicker pad (0.7mm) on to get a 10-20°C drop, which means I HAD the wrong pad on, or the block wasn't mounted properly. I've gotta pull it apart one day, but for now it's OK at stock. The main thing is my CPU is now overclocked again, haha.


----------



## heroxoot

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> I just noticed on windows 8 if I increase my power it does not do anything, but on windows 7 increasing the power slider past +25mV works 100%. I'm not sure why that is. Anyone else notice anything like this on the 290X? This is on 13.12 driver on both 7 and 8 so powerlimit +50% should work on both.
> 
> I was wrong, once the benchmark starts it goes right to 1.18v even if its set for +40 which should put it at 1.2v or close.
> 
> 
> 
> I had a problem where the voltage increase would reset (or just not be applied) sometimes after the monitor went to sleep/the computer was woken up from sleep. Maybe same thing happening to you?
Click to expand...

Nah my PC never goes to sleep.

So best I got on my rig was 49fps on the nose. I didn't save the benchmark. This is still low it seems. A 290 with an i5 getting 55fps makes me feel like my 290X is bad somehow or my CPU is just that big of a bottleneck for it.

On another note, I just realized both windows 7 and 8 were running balanced instead of high performance during these fresh install tests (Not the initial install that had issues) so my benchmark could potentially get better? Not sure. Maybe the fps drops will stop at least.


----------



## Enfeeble

Hey guys, it has been four days since I received my 290 Tri-X. For the first three days I ran a few benchmarks on my 1440p monitor. Everything was going great: low temps, low VRM temps. So last night I decided to play Crysis 3, and while in the graphics settings (in the main menu) I decided to turn off vsync. My card started to coil whine like crazy because of the 3000+ fps it was pushing. I immediately turned vsync back on and played the game with it for around 30 minutes. Then I went on to Skyrim and Metro: Last Light. Skyrim was fine, but at anything over 61 fps my card would whine... and in Metro, my card would whine like crazy even when hitting 15-20 fps. Now today, ALL day, the fans on my card have been rattling at around 35%+ fan speed, and it gets ridiculously loud above 70%. Also, even at 20% fan speed my card seems to be buzzing constantly. The buzzing isn't that loud, but it is noticeable since my PC is right next to me. Should I get a replacement?

The rattling sound is _*exactly*_ like this





I did the same thing: pushed the edge of the card up just like him and tugged the power cables upwards, and it would stop.

Also, I heard the coil whine could be a GPU, PSU, or mobo issue.
Currently using an Asus Maximus Hero IV, EVGA SuperNova 750G.

My 290 Tri-X was fine until the Crysis moment. I can't really put the blame on that. Also, prior to having this card I had the Asus 280 DCUII Top, and it ran flawlessly.

edit - I physically had my head right next to my GPU and my PSU, and I'm pretty sure the faint buzzing noise is coming from my PSU.

Just ran Unigine Valley: 68°C with 37% fan speed, but I hear the ticking sound again, which I had thought was my PSU, at desktop. Is this normal? Is my GPU making ticking sounds because it's under load?


----------



## Kokin

Quote:


> Originally Posted by *kizwan*
> 
> Why the CSQ bridge if you're only going to run one 290? Assuming you're going to run it in the same rig you posted earlier.
> 
> Anyway, if you buy directly from the EKWB or Aquacomputer online shop, the Aqua Kryographics + passive backplate is 6.5 euro cheaper (excluding shipping) than the EK-FC R9-290X (Original CSQ) + backplate, and the Aqua Kryographics + *active* backplate is 8.5 euro more (excluding shipping) than the EK-FC R9-290X (Original CSQ) + backplate. If the Kryographics + passive/active backplate already provides excellent VRM cooling, don't you think it's a good idea to get them instead of the EK-FC?
> 
> Well, I chose EK-FC waterblock over Kryographics because I like EK-FC waterblock design very much.
> 
> 
> 
> 
> 
> 
> 
> I also use Fujipoly Extreme for VRMs.


Shipping from Europe to the US would cost a lot more, so that's a big no-no for me. I do like the AC block + backplate, but it's not the look I'm going for.

I want to do something like this with the EK CSQ block and it requires the single bridge. If I were to keep it with just the link (no bridge), I would have to buy more 90 degree fittings, which would cost more than just the single bridge. Plus, having the bridge would let me keep my loop routing exactly the same as it is with the EK-FC7950.



Quote:


> Originally Posted by *Enfeeble*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Hey guys, it has been four days since I received my 290 Tri-X. For the first three days I ran a few benchmarks on my 1440p monitor. Everything was going great: low temps, low VRM temps. So last night I decided to play Crysis 3, and while in the graphics settings (in the main menu) I decided to turn off vsync. My card started to coil whine like crazy because of the 3000+ fps it was pushing. I immediately turned vsync back on and played the game with it for around 30 minutes. Then I went on to Skyrim and Metro: Last Light. Skyrim was fine, but at anything over 61 fps my card would whine... and in Metro, my card would whine like crazy even when hitting 15-20 fps. Now today, ALL day, the fans on my card have been rattling at around 35%+ fan speed, and it gets ridiculously loud above 70%. Also, even at 20% fan speed my card seems to be buzzing constantly. The buzzing isn't that loud, but it is noticeable since my PC is right next to me. Should I get a replacement?
> 
> 
> The rattling sound is _*exactly*_ like this
> 
> 
> 
> 
> 
> I did the same thing, push the edge of the card up just like him and tug the power cables upwards and it would stop.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Also, I heard the coil whine could be a GPU, PSU, or mobo issue.
> Currently using an Asus Maximus Hero IV, EVGA SuperNova 750G.
> 
> My 290 tri-x was fine until the crysis moment. I can't really put the blame on that. Also prior to having this card i had the Asus 280 DCUII top, and it ran flawlessly.
> 
> edit- I physically had my head right next to my gpu and my psu and im pretty sure the faint buzzing noise is coming from my psu.
> 
> Just ran Unigine Valley: 68°C with 37% fan speed, but I hear the ticking sound again, which I had thought was my PSU, at desktop. Is this normal? Is my GPU making ticking sounds because it's under load?


Seems like the weight of the cooler is causing the fans to make some sort of vibration. I would try to connect the PCI-E cables from the top of the card so that the card is pulled up slightly. Maybe zip tie part of the cooler so that it doesn't make those vibrations.


----------



## Forceman

Quote:


> Originally Posted by *heroxoot*
> 
> Nah my PC never goes to sleep.
> 
> So best I got on my rig was 49fps on the nose. I didn't save the benchmark. This is still low it seems. A 290 with an i5 getting 55fps makes me feel like my 290X is bad somehow or my CPU is just that big of a bottleneck for it.
> 
> On another note, I just realized both windows 7 and 8 were running balanced instead of high performance during these fresh install tests (Not the initial install that had issues) so my benchmark could potentially get better? Not sure. Maybe the fps drops will stop at least.


Have you tried anything other than Heaven/Valley? Maybe 3DMark (compare the graphics score) or Metro or some other game benchmark? There's a thread where people posted a bunch of benchmark tests a while back - you could compare your scores there. Maybe Heaven/Valley are really CPU dependent - we already know they are sensitive to memory clocks.

If you really want to test just the GPU try using Compubench or ShaderToyMark, that'll take the CPU out completely.


----------



## makasouleater69

Quote:


> Originally Posted by *Enfeeble*
> 
> Hey guys, it has been four days since I received my 290 Tri-X. For the first three days I ran a few benchmarks on my 1440p monitor. Everything was going great: low temps, low VRM temps. So last night I decided to play Crysis 3, and while in the graphics settings (in the main menu) I decided to turn off vsync. My card started to coil whine like crazy because of the 3000+ fps it was pushing. I immediately turned vsync back on and played the game with it for around 30 minutes. Then I went on to Skyrim and Metro: Last Light. Skyrim was fine, but at anything over 61 fps my card would whine... and in Metro, my card would whine like crazy even when hitting 15-20 fps. Now today, ALL day, the fans on my card have been rattling at around 35%+ fan speed, and it gets ridiculously loud above 70%. Also, even at 20% fan speed my card seems to be buzzing constantly. The buzzing isn't that loud, but it is noticeable since my PC is right next to me. Should I get a replacement?
> 
> The rattling sound is _*exactly*_ like this
> 
> 
> 
> 
> 
> I did the same thing, push the edge of the card up just like him and tug the power cables upwards and it would stop.
> 
> Also, I heard the coil whine could be a GPU, PSU, or mobo issue.
> Currently using an Asus Maximus Hero IV, EVGA SuperNova 750G.
> 
> My 290 tri-x was fine until the crysis moment. I can't really put the blame on that. Also prior to having this card i had the Asus 280 DCUII top, and it ran flawlessly.
> 
> edit- I physically had my head right next to my gpu and my psu and im pretty sure the faint buzzing noise is coming from my psu.
> 
> Just ran Unigine Valley: 68°C with 37% fan speed, but I hear the ticking sound again, which I had thought was my PSU, at desktop. Is this normal? Is my GPU making ticking sounds because it's under load?


I can't tell you anything about fans. I can say this: mine is water cooled and it makes no noise when it is not working. I hear the whine clearly when I start using it. In fact, I can tell my fps depending on what pitch the whine is, lol. The whine for me goes higher pitched as the frames go up and down. I found this out with EVE Online when I turned the settings to interval immediate. When I zoom out I get like 650 fps, and when I zoom in, around 450. Doing this makes it whine really loud and high pitched at 650, and slightly lower pitched and less loud at 450. It sounds like an engine changing RPMs, lol.
I put another post here, and the person told me it was the electricity running through the card and not to worry. At lower fps it does kind of sound like a ticking noise too, and I don't think that is your fan. This card makes some funky noises, although I am not sure whether all cards make this noise; this is the first one I have had that is water cooled.
I couldn't hear it, though, when I had the stock cooler on it. The stock cooler on mine was so loud I couldn't really hear anything but the fan. I wouldn't really say it's from under-power either, because it makes these noises for me overclocked or underclocked. The power it requires according to GPU-Z is only like 160 watts at 1.227 volts. Alright, it doesn't sound like a whine at 50-80 fps; it sounds like a ticking noise. The lower the fps, the more of a ticking noise; the higher, the more high pitched it is.


----------



## neurotix

I have the exact same problem, with both of my Tri-X.

Annoying loud grinding noise with the fan past certain rpms and under load.

Pushing up on the card makes it stop.

Considering an RMA, these coolers are defective.


----------



## makasouleater69

Here is the GPU-Z link: http://www.techpowerup.com/gpuz/8hb8s/ . Link to the GPU-Z screenshot: http://gpuz.techpowerup.com/14/04/30/g4x.png . I am not sure what an OCN name is. http://gpuz.techpowerup.com/14/04/30/723.png is a picture of the temps. It is water cooled with an EK block and backplate. It is a Sapphire reference 290X with the 290X Tri-X BIOS.


----------



## VSG

FYI, Sapphire has cancelled its plans for the 8GB Vapor-X and Toxic versions of the 290X.


----------



## darkelixa

Oh really? I was going to buy a new R9 290 Sapphire Toxic for my new rig, but if that's the case then eh.


----------



## phantomowl

Just get the MSI Lightning or the Gigabyte


----------



## Paul17041993

Quote:


> Originally Posted by *neurotix*
> 
> I have the exact same problem, with both of my Tri-X.
> 
> Annoying loud grinding noise with the fan past certain rpms and under load.
> 
> Pushing up on the card makes it stop.
> 
> Considering an RMA, these coolers are defective.


Pushing up as in making it straight, or applying pressure to the chassis?

Sleeve bearing fans always make noise, FYI, especially when facing up or down and running at 100% non-stop.


----------



## Devotii

Quote:


> Originally Posted by *geggeg*
> 
> FYI, Sapphire has cancelled its plans for the 8GB Vapor-X and Toxic versions of the 290X.


Quote:


> Originally Posted by *darkelixa*
> 
> Oh really, i was going to buy a new r9 290 sapphire toxic for my new rig, but if thats the case then eh


There is an R9 290 and an R9 290X Vapor-X with 4GB though


----------



## VSG

Yeah, I should have been clearer there. The regular 4GB versions are not affected.


----------



## Norse

Quote:


> Originally Posted by *Roboyto*
> 
> Nearly all the blocks will give you similar temperatures for cooling the core. When it comes to VRM temperatures, specifically VRM1, however there is a difference as the thermal pads that most of the manufacturers include are fairly poor quality.
> 
> The block/backplate combination with the best VRM1 temperature out of the box is Aquacomputer. They have a passive and active backplate. The passive does nearly as good of a job as the active, but is rather ugly/incomplete looking compared to the nice nickel plated cover w/ heatpipe.
> 
> 
> 
> 
> 
> 
> 
> You can achieve same/similar temperatures to AquaComputer with other blocks by using higher performing thermal pads.
> Fujipoly Ultra Extreme specifically: _http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html _
> If you plan on using these thermal pads, please check my thread link below so you can order the appropriate thickness for your block; this link is for 1mm.
> I am using the XSPC block and it works well. It worked even better once I upgraded thermal pads for the VRMs.
> 
> _See my thread here: http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures_
> 
> My temperature drops were quite drastic, and I still haven't added thermal pads to my backplate. I'm sure they will come down further once I get around to that...just not feeling like draining my loop to do it.
> 
> Other users with EK blocks have reported similar temperature drops to what I experienced. I am unsure of Koolance performance as I haven't seen many people with those blocks; most popular seem to be EK, XSPC, and AquaComputer.


Is it easy to mount a backplate without also going WC as well (reference card)? I am pondering getting http://www.watercoolinguk.co.uk/p/Watercool-HEATKILLER-reg;-GPU-backplate-R9-290X_44412.html to help cool my card a little, as well as make it more pleasing to the eye. Watercooling UK wasn't really helpful as to whether I can use it without a block too. It's a passive backplate; it's just the screws that worry me.


----------



## Rozayz

Quote:


> Originally Posted by *Rozayz*
> 
> May just take rig into work tbh. I can do it myself but it's easier to get RA to do it while I work Assembly/ Sales.


In case anyone cared, the issue was my SSD. Replaced, everything working perfectly fine now.


----------



## Mercy4You

Quote:


> Originally Posted by *neurotix*
> 
> I have the exact same problem, with both of my Tri-X.
> 
> Annoying loud grinding noise with the fan past certain rpms and under load.
> 
> Pushing up on the card makes it stop.
> 
> Considering an RMA, these coolers are defective.


Mine does that too, I lifted the end of the card up to my case with a shoelace









You're not going to RMA your cards for something that easily solvable, I hope?


----------



## motokill36

Hi all,
Just fitting a Prolimatech MK-26 to my R9 290.
I have been trying to find the max amperage the card's fan header can output for the fan connection.
Has anyone fitted 2 fans on this cooler using the card's fan output?

Thanks
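Once the header's rating is known, checking two fans against it is just a current budget. The 1.0 A header limit and the per-fan draws below are placeholder numbers only; the real figures come from the card's specs and the fan labels:

```python
# Simple current-budget check for running two fans off one fan header.
# HEADER_LIMIT_A and the per-fan draws are assumed example figures only.
HEADER_LIMIT_A = 1.0                     # assumed header rating in amps
fans = {"fan_1": 0.25, "fan_2": 0.25}    # rated amps per fan (from fan label)

total = sum(fans.values())
# Fans can briefly draw more than rated at spin-up, so keep some headroom.
ok = total <= HEADER_LIMIT_A * 0.8
print(f"total draw {total:.2f} A, within budget: {ok}")
```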


----------



## bond32

Love this replacement card:


----------



## Roboyto

Quote:


> Originally Posted by *Norse*
> 
> Is it easy to mount a backplate without also going WC as well (reference card)? I am pondering getting http://www.watercoolinguk.co.uk/p/Watercool-HEATKILLER-reg;-GPU-backplate-R9-290X_44412.html to help cool my card a little, as well as make it more pleasing to the eye. Watercooling UK wasn't really helpful as to whether I can use it without a block too. It's a passive backplate; it's just the screws that worry me.


I personally don't have any experience with it, but a friend of mine attached EK backplates to 7970s while cooling them with AIO water coolers. You may have to get a little crafty with it, but it's possible; a trip to the hardware store may be necessary.


----------



## DeadlyDNA

I can't believe how strong these cards are, and I only have 290s. I managed to crack 34 megapixels in-game. While it's too far for realistic gaming just yet,
these video cards can keep up: BF4 @ 34MP can still hold a playable FPS. Not for hardcore gamers obviously, but it's a peek into the future
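As a scale check, one resolution combination that lands near 34 MP (the exact monitor layout wasn't stated, so this is just an example):

```python
# Megapixels for a tiled or downsampled render resolution.
def megapixels(width, height, panels=1):
    return width * height * panels / 1e6

# e.g. four 4K panels: 3840 * 2160 * 4 = 33,177,600 pixels
print(f"{megapixels(3840, 2160, 4):.1f} MP")  # ~33.2 MP, near the 34 MP mentioned
```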


----------



## rdr09

Quote:


> Originally Posted by *bond32*
> 
> Love this replacement card:


Very nice! You'll make some Lightning owners envious. The 13.x drivers should allow you to cross 19K in graphics score.


----------



## rdr09

Quote:


> Originally Posted by *motokill36*
> 
> Hi All
> Just fitting Prolima MK-26 To R9 290 .
> I have been Trying to find Max fan amp Header output for card fan connection .
> Has anyone fittied 2 fans on this cooler using card fan output .
> 
> Thanks


Use the motherboard's header; you might mess up the one for the GPU.


----------



## TheComputerNub

I <3 this card


----------



## Devotii

Vapor-X and Tri-X cards are on sale at overclockers.co.uk this week; tiny discount though


----------



## bond32

Quote:


> Originally Posted by *rdr09*
> 
> Very nice! You'll make some Lightning owners envious. The 13.x drivers should allow you to cross 19K in graphics score.


Going to try those drivers later, thanks. Never actually thought I would be anywhere close to the top single card scores, but it looks like I got a good sample now!


----------



## rdr09

Quote:


> Originally Posted by *bond32*
> 
> Going to try those drivers later, thanks. Never actually thought I would be anywhere close to the top single card scores, but it looks like I got a good sample now!


time is running out . . .

http://www.overclock.net/t/1476601/3d-fanboy-overclocking-competition-2014-500-in-prizing/710







Go RED Team!


----------



## Devotii

Sapphire Radeon R9 290X Vapor-X OC 8192MB GDDR5 PCI-Express Graphics Card **OcUK WorldWide Exclusive**
http://www.overclockers.co.uk/showproduct.php?prodid=GX-353-SP

Pre-order
ETA: 05/05/14
£599.99


----------



## chronicfx

Quote:


> Originally Posted by *rdr09*
> 
> time is running out . . .
> 
> http://www.overclock.net/t/1476601/3d-fanboy-overclocking-competition-2014-500-in-prizing/710
> 
> 
> 
> 
> 
> 
> 
> Go RED Team!


I get about P21K stock in 3DMark11. Would I get booed for posting a stock score? Would it help?


----------



## rdr09

Quote:


> Originally Posted by *chronicfx*
> 
> I get about P21K stock in 3DMark11. Would I get booed for posting a stock score? Would it help?


Not 3DMark11 but 3DMark (Fire Strike). It is a friendly competition. Turn off tessellation in CCC. Anything to help Team Red.


----------



## heroxoot

I'm starting to find there is just a performance loss for this card on Windows 8.1. I get 2fps more on regular Windows 8 and Windows 7 in comparison. This blows, because now I have to pick performance over features, as I use parts of Windows 8 all the time.

Using heaven and valley.


----------



## chronicfx

Quote:


> Originally Posted by *rdr09*
> 
> Not 3DMark11 but 3DMark (Fire Strike). It is a friendly competition. Turn off tessellation in CCC. Anything to help Team Red.


Never bought that one... I am cheap and only run freebies


----------



## Mega Man

Quote:


> Originally Posted by *Kokin*
> 
> So expensive though! I will probably give it a try in the future, but just getting a block, backplate, and the CSQ bridge is already killing my wallet.


Ha ha ha. Try going quad fire..... (Water cooled)
Quote:


> Originally Posted by *Enfeeble*
> 
> Hey guys, it has been four days since i received my 290 tri-x. For the first three days i ran few benchmarks on my 1440p monitor. Everything was going great, low temps/low vrm temps. So last night I decided to play Crysis 3 and while I am in the graphic settings(in the main menu) i decided to turn off vsync. My card started to whine coil like crazy because of the +3000 fps i was receiving. I immediately turned it off and played the game with vysnc for around 30 minutes. Then i went on to Skyrim and Metro Last Night. Skyrim was fine but anything over 61 fps my card would whine... but in Metro, my card would whine like crazy when im hitting 15-20 fps. Now today, ALL day the fans on my card have been rattling around +35% fan speed and it gets ridiculously loud above 70%. Also even on 20% fan speed my card seems to be buzzing constantly. The buzzing isn't that loud but it is noticeable since my pc is right next to me. Should i get a replacement?
> 
> The rattling sound is _*exactly*_ like this
> 
> 
> 
> 
> 
> I did the same thing, push the edge of the card up just like him and tug the power cables upwards and it would stop.
> 
> Also i heard the whine coil could be a GPU,PSU and a mobo issue.
> Currently using Asus Maximus hero IV, EVGA SuperNova 750G.
> 
> My 290 tri-x was fine until the crysis moment. I can't really put the blame on that. Also prior to having this card i had the Asus 280 DCUII top, and it ran flawlessly.
> 
> edit- I physically had my head right next to my gpu and my psu and im pretty sure the faint buzzing noise is coming from my psu.
> 
> Just ran Unigene Valley, 68 temp with 37% fan speed but i hear ticking sound again which i thought was my PSU at desktop. Is this normal? Is my gpu making ticking sounds because its underload?


FYI, coil whine will happen with different PSUs, mobos, etc. Even similar model combos will differ (same equipment). And you cannot pinpoint it by ear alone; due to its high frequency, it is very easy to "hear" it from somewhere else.
Quote:


> Originally Posted by *heroxoot*
> 
> I'm starting to find there is just a performance loss for this card on windows 8.1. I get 2fps more on regular windows 8 and windows 7 in comparison. This blows because now I have to pick performance over function as I use parts of windows 8 all the time.
> 
> Using heaven and valley.


..... Both Heaven and Valley are physics-heavy, and Windows 8 has a lower physics score.

Stop trying to compare benchmark scores across platforms, especially using benches that heavily favor the side of the platform that you don't have (Intel). Try... I don't know, playing a game and seeing if you notice a difference from your old card, instead of basing your experience off of a synthetic benchmark.

I'll make it even more simple for you. If all you want to do is chase benchmark scores, stop: buy a 3930K/4930K (at minimum) and at least a RIVF. Then try again. Otherwise I can always show you a better bench.

Please forgive all errors; autocorrect on mobile is killing me


----------



## heroxoot

Quote:


> Originally Posted by *Mega Man*
> 
> Stop trying to compare benchmark scores across platforms, especially using benches that heavily favor the side of the platform you don't have (Intel). Try... idk, playing a game and seeing if you notice a difference from your old card, instead of basing your experience on a synthetic benchmark.
> 
> I'll make it even more simple for you. If all you want to do is chase benchmark scores: stop, buy a 3930K/4930K (at minimum) and at least a RIVF, then try again. Otherwise I can always show you a better bench.
> 
> Please forgive all errors; autocorrect on mobile is killing me.


You act like you read anything I said, but you did not. I have said countless times I was getting 100+ fps with Mantle in BF4 on my 7970, and hardly 90 on this 290X. I feel like I was ripped off by MSI right now, because this thing is junk. And I will never, I mean NEVER, buy an overpriced Intel CPU. The performance is not worth the price of both CPU and motherboard.

My rig was outbenched by an FX-6200 @ 4.1GHz and a 290X at stock; that just ain't right. I get better benches on Windows 7 and Windows 8, but Windows 8.1 gives me fewer frames in everything, including benchmarks. Unless that person messed with settings to make it happen, I was left 3-5 fps behind what my average should be in Heaven.

So do I go to Windows 7 for the little bit more performance, or do I stick with Windows 8.1, which has features I actually use? I really do not want to dual boot at this point.


----------



## Mega Man

And I will say again: if all you want is max scores, go to Intel.

Different versions, different drivers, etc. will affect scoring. You talk about how everything outperforms your setup, yet you have only mentioned BF4 as your baseline. The 290X drivers are still maturing, while 7970s have been out for a while.

Let alone that 290s shine more at higher res. All you are looking at is a number rather than actual performance.

I have heard Win 8 is better for BF4.

However, you will score less in any physics bench on Win 8 vs Win 7.

No one else can make this choice for you, yet you want them to. Let alone, it sounds to me like you don't even understand your 290 yet, let alone what makes it work best. How long did you tinker with your 7970 to get "100+"?

I stand by what I said: your GPU seems to be working fine. Your understanding of how to use it is what I think your problem is, AND the fact that the drivers are still maturing.


----------



## heroxoot

Quote:


> Originally Posted by *Mega Man*
> 
> And I will say again: if all you want is max scores, go to Intel.
> 
> Different versions, different drivers, etc. will affect scoring. You talk about how everything outperforms your setup, yet you have only mentioned BF4 as your baseline. The 290X drivers are still maturing, while 7970s have been out for a while.
> 
> Let alone that 290s shine more at higher res. All you are looking at is a number rather than actual performance.
> 
> I have heard Win 8 is better for BF4.
> 
> However, you will score less in any physics bench on Win 8 vs Win 7.
> 
> No one else can make this choice for you, yet you want them to. Let alone, it sounds to me like you don't even understand your 290 yet, let alone what makes it work best. How long did you tinker with your 7970 to get "100+"?
> 
> I stand by what I said: your GPU seems to be working fine. Your understanding of how to use it is what I think your problem is, AND the fact that the drivers are still maturing.


Yeah, but a 290X and a 6200 @ 4.1GHz on Windows 7 outbenched my setup on Windows 8.1. And when I switched to Windows 7, same thing, but not as bad. So I'm comparing apples to apples; why is mine worse? Are you saying it's just luck? Because everyone tells me his is average and mine is well below. That is my issue. He got 52 fps in Heaven at stock; I get 47-ish at stock on Windows 7, 49 at 1100/1250. I just want to understand why a rig so similar is doing better than mine even when using the same OS. That is my issue; that is what I do not understand.

My 7970 was running 1225/1700 clocks long before the 14.1 driver came out. I just updated, turned on Mantle, BAM, 100+!

As far as OS: 7 =/= 8, but 8.1 has a loss. Now, I understand 8.1 is not a service pack, it's another version of the OS, and I realize this because, first off, 8.1 has its own CD keys. So while there is an 8 driver, it's technically not an 8.1 driver. And 13.12 is the earliest working 290X driver, correct? So yeah, maybe the performance is not as solid for everyone. I'm on 14.4 now, and I actually did get a boost on a fresh 8.1 install. Not a lot, but it got me 47.7 fps in Heaven as opposed to 47.1, so it is an increase. Half a frame in a benchmark can be 10 fps in a game.

My 290X does, however, drop its load during the benchmark a lot. Like, a real lot. The problem doesn't exist on Windows 7/8, just 8.1.

But if you think my performance is not bad, I can only hope the loss of performance in BF4 was due to the latest patches. I quit playing the game for 3 months, and my 7970 is lost forever to MSI, as this 290X is my replacement. So I can only hope DICE screwed up Mantle performance.


----------



## DeadlyDNA

Quote:


> Originally Posted by *heroxoot*
> 
> Yeah, but a 290X and a 6200 @ 4.1GHz on Windows 7 outbenched my setup on Windows 8.1. And when I switched to Windows 7, same thing, but not as bad. So I'm comparing apples to apples; why is mine worse? Are you saying it's just luck? Because everyone tells me his is average and mine is well below. That is my issue. He got 52 fps in Heaven at stock; I get 47-ish at stock on Windows 7, 49 at 1100/1250. I just want to understand why a rig so similar is doing better than mine even when using the same OS. That is my issue; that is what I do not understand.
> 
> My 7970 was running 1225/1700 clocks long before the 14.1 driver came out. I just updated, turned on Mantle, BAM, 100+!
> 
> As far as OS: 7 =/= 8, but 8.1 has a loss. Now, I understand 8.1 is not a service pack, it's another version of the OS, and I realize this because, first off, 8.1 has its own CD keys. So while there is an 8 driver, it's technically not an 8.1 driver. And 13.12 is the earliest working 290X driver, correct? So yeah, maybe the performance is not as solid for everyone. I'm on 14.4 now, and I actually did get a boost on a fresh 8.1 install. Not a lot, but it got me 47.7 fps in Heaven as opposed to 47.1, so it is an increase. Half a frame in a benchmark can be 10 fps in a game.
> 
> My 290X does, however, drop its load during the benchmark a lot. Like, a real lot. The problem doesn't exist on Windows 7/8, just 8.1.
> 
> But if you think my performance is not bad, I can only hope the loss of performance in BF4 was due to the latest patches. I quit playing the game for 3 months, and my 7970 is lost forever to MSI, as this 290X is my replacement. So I can only hope DICE screwed up Mantle performance.


I know you have been through the loops with your setup; however, BF4 itself has had some quirks for me. When I change in-game settings I notice a drop in GPU performance immediately. I think it could be issues with the game also. Have you tried deleting the game completely, removing all the folders it was in, and reloading it? I have done this twice in my benching; it's almost like when I turn a setting down and then back up, it doesn't seem to go back properly. Then I notice my GPU usage tanks. Forgive me if you have been through this already.


----------



## rdr09

Quote:


> Originally Posted by *chronicfx*
> 
> Never bought that one.. I am cheap and only run freebies


oh noes, you don't need to buy it just for benching. Heaven and Catzilla have free versions; the download link is in the OP of that thread. I find Heaven the most demanding of those.


----------



## sergec19

My benchmarks!

CPU @ 4.51GHz,
HT is disabled! Will try again soon with HT on.
The R9 290 is overclocked, but not maxed... didn't try more voltage.
It has an Arctic Accelero IV cooler!

Max temps, R9 290 @ 1100/1400:
Core 66° - VRM1 77° - VRM2 60°

Max temp, X5660 4.51GHz HT off @ 1.40V:
hottest core 64°

3DMark11: P13377
Firestrike: 9809
Valley 1.0 Extreme HD: 2643
Heaven 4.0 Extreme: 1773


----------



## rdr09

Quote:


> Originally Posted by *sergec19*
> 
> My benchmarks!
> 
> CPU @ 4.51GHz,
> HT is disabled! Will try again soon with HT on.
> The R9 290 is overclocked, but not maxed... didn't try more voltage.
> It has an Arctic Accelero IV cooler!
> 
> Max temps, R9 290 @ 1100/1400:
> Core 66° - VRM1 77° - VRM2 60°
> 
> Max temp, X5660 4.51GHz HT off @ 1.40V:
> hottest core 64°
> 
> 3DMark11: P13377
> Firestrike: 9809
> Valley 1.0 Extreme HD: 2643
> Heaven 4.0 Extreme: 1773
> 
> 
> Spoiler: Warning: Spoiler!


nice. can you run those again and submit them here before today's deadline . . .

http://www.overclock.net/t/1476601/3d-fanboy-overclocking-competition-2014-500-in-prizing

make sure you turn off tessellation in CCC . . .



here is a sample


----------



## heroxoot

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Yeah, but a 290X and a 6200 @ 4.1GHz on Windows 7 outbenched my setup on Windows 8.1. And when I switched to Windows 7, same thing, but not as bad. So I'm comparing apples to apples; why is mine worse? Are you saying it's just luck? Because everyone tells me his is average and mine is well below. That is my issue. He got 52 fps in Heaven at stock; I get 47-ish at stock on Windows 7, 49 at 1100/1250. I just want to understand why a rig so similar is doing better than mine even when using the same OS. That is my issue; that is what I do not understand.
> 
> My 7970 was running 1225/1700 clocks long before the 14.1 driver came out. I just updated, turned on Mantle, BAM, 100+!
> 
> As far as OS: 7 =/= 8, but 8.1 has a loss. Now, I understand 8.1 is not a service pack, it's another version of the OS, and I realize this because, first off, 8.1 has its own CD keys. So while there is an 8 driver, it's technically not an 8.1 driver. And 13.12 is the earliest working 290X driver, correct? So yeah, maybe the performance is not as solid for everyone. I'm on 14.4 now, and I actually did get a boost on a fresh 8.1 install. Not a lot, but it got me 47.7 fps in Heaven as opposed to 47.1, so it is an increase. Half a frame in a benchmark can be 10 fps in a game.
> 
> My 290X does, however, drop its load during the benchmark a lot. Like, a real lot. The problem doesn't exist on Windows 7/8, just 8.1.
> 
> But if you think my performance is not bad, I can only hope the loss of performance in BF4 was due to the latest patches. I quit playing the game for 3 months, and my 7970 is lost forever to MSI, as this 290X is my replacement. So I can only hope DICE screwed up Mantle performance.
> 
> 
> 
> I know you have been through the loops with your setup; however, BF4 itself has had some quirks for me. When I change in-game settings I notice a drop in GPU performance immediately. I think it could be issues with the game also. Have you tried deleting the game completely, removing all the folders it was in, and reloading it? I have done this twice in my benching; it's almost like when I turn a setting down and then back up, it doesn't seem to go back properly. Then I notice my GPU usage tanks. Forgive me if you have been through this already.
Click to expand...

That issue is actually old for BF3/4. You have to set your stuff and then relaunch the game for it to take effect properly. But I have deleted the Mantle caches and such; now that it's a clean format I need to try it again. This install of Windows is now 11 hours old, and I slept through most of those 11 hours. It just concerns me when a PC of similar spec outperforms mine badly, and it had never occurred before this GPU.

Also, am I the only one with an MSI card that cannot control fan speed on 14.4? This is really annoying too. My card could stay cooler if the fan would run 100% around 80°C, but instead I cannot control the fan curve past the 13.12 drivers.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> time is running out . . .
> 
> http://www.overclock.net/t/1476601/3d-fanboy-overclocking-competition-2014-500-in-prizing/710
> 
> 
> 
> 
> 
> 
> 
> Go RED Team!


Quote:


> Originally Posted by *chronicfx*
> 
> I get about P21K stock in 3DMark11. Would I get booed for posting a stock score? Would it help?


Anybody that can post scores for the 'red team', please do so. The greenies are starting to pull ahead.


----------



## DeadlyDNA

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Anybody that can post scores for the 'red team', please do so. The greenies are starting to pull ahead.


Win 8 eliminates 2 of these benches for me, and the only one I can run on a 4-core CPU at 720p or 1080p is a total waste of my GPUs. Maybe I am looking at it wrong.


----------



## Goride

Does anyone have any idea why all of my bench scores are much lower than other 290/290x owners?

My full system is listed in my account, but here is a quick lowdown on what I am running:
CPU: 3570k @ 4.4ghz
RAM: DDR3 1866 9-9-9-28
GPU: 290x
Resolution: 1920x1080

For these benches I was running at 1030MHz core and 1250MHz memory. I had the power limit at +50%, and I manually turned the fans up to 60% to avoid heat throttling. I also had GPU-Z logging and checked the log to verify that no throttling took place; temps never hit 95°C.
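Since GPU-Z was logging anyway, a tiny script can confirm the no-throttling claim instead of eyeballing the log. This is only a sketch: it assumes the log was saved as CSV and guesses the clock column from its header, so adjust both to match your actual file.

```python
import csv

def find_throttle_drops(log_path, target_mhz, tolerance=15):
    """Return (row_index, MHz) samples where the GPU core clock fell
    more than `tolerance` MHz below `target_mhz`.
    Note: idle samples (desktop clocks) get flagged too, so trim the
    log to the benchmark run first."""
    drops = []
    with open(log_path, newline="") as f:
        reader = csv.DictReader(f)
        # pick the first column whose header mentions the core clock
        clock_col = next(c for c in reader.fieldnames if "Core Clock" in c)
        for i, row in enumerate(reader):
            try:
                mhz = float(row[clock_col])
            except (TypeError, ValueError):
                continue
            if mhz < target_mhz - tolerance:
                drops.append((i, mhz))
    return drops
```

If the list comes back empty for the benchmark window, the card held its clocks; a run of sub-target samples lines up with thermal throttling.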

I turned off tessellation in CCC.

In 3DMark, Fire Strike 1.1, this was my score (default settings):


This score seems awfully low compared to Guru3d's review/bench:








http://www.guru3d.com/articles_pages/radeon_r9_290_review_benchmarks,27.html

Even the stock 290 blew me out of the water.

In Heaven this was my score:


The following were my settings:
API: DX11
Quality: Ultra
Tess: Extreme
3D: disabled
MultiMon: disabled
AA: x8
Full screen: checked
Res: 1920x1080

When I overclocked the 290x to 1200mhz core/ 1250mhz memory I raised my Heaven score to 1420.

When I overclocked it to 1200mhz core / 1500mhz memory I got 1471.
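One quick sanity check on numbers like these is to compare the relative score gain against the relative clock gain. A rough sketch, using the 1420 and 1471 Heaven figures above (the helper itself is illustrative, not any benchmark's formula):

```python
def scaling_efficiency(score_before, score_after, clock_before, clock_after):
    """Ratio of relative score gain to relative clock gain.
    ~1.0 means the benchmark scales with that clock; near 0 means the
    bottleneck is elsewhere (CPU, the other clock domain, throttling)."""
    score_gain = (score_after - score_before) / score_before
    clock_gain = (clock_after - clock_before) / clock_before
    return score_gain / clock_gain

# memory 1250 -> 1500 MHz moved Heaven from 1420 to 1471:
mem_eff = scaling_efficiency(1420, 1471, 1250, 1500)
```

Here the 20% memory overclock bought only about a 3.6% score gain (efficiency around 0.18), which is why a low absolute score usually points at the platform or at throttling rather than at the clocks.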

My Heaven scores seem low compared to averages I have seen from others around here and other places on the internet.

Any idea why I am getting such low scores?

I have run Prime95 for over 2 hours without an error. I realize that just means it runs Prime95 well, but it would also indicate my CPU and RAM are seemingly stable. Just for kicks I set my CPU to stock speeds and my RAM to DDR3-1600 11-11-11-28, just to see if maybe it was a stability issue. My scores dropped by a small amount, but about what you would expect.

I have not seen any artifacting in these benches, so the GPU seems stable (though I did see artifacting when I pushed the core to 1200MHz without increasing the voltage enough).

EDIT:

Forgot to list my drivers.

I ran the Heaven benchmarks on both 13.12 and 14.14 and got similar results on both (just a few points' difference).

In 3DMark, I only tried 14.14.

Oh, and I used DDU in safe mode to wipe the drivers in between moving from 13.12 to 14.14.


----------



## sergec19

Quote:


> Originally Posted by *Goride*
> 
> Does anyone have any idea why all of my bench scores are much lower than other 290/290x owners?
> 
> My full system is listed in my account, but here is a quick lowdown on what I am running:
> CPU: 3570k @ 4.4ghz
> RAM: DDR3 1866 9-9-9-28
> GPU: 290x
> Resolution: 1920x1080
> 
> For these benches I was running at 1030MHz core and 1250MHz memory. I had the power limit at +50%, and I manually turned the fans up to 60% to avoid heat throttling. I also had GPU-Z logging and checked the log to verify that no throttling took place; temps never hit 95°C.
> 
> I turned off tessellation in CCC.
> 
> In 3DMark, Fire Strike 1.1, this was my score (default settings):
> 
> 
> This score seems awfully low compared to Guru3d's review/bench:
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.guru3d.com/articles_pages/radeon_r9_290_review_benchmarks,27.html
> 
> Even the stock 290 blew me out of the water.
> 
> In Heaven this was my score:
> 
> 
> The following were my settings:
> API: DX11
> Quality: Ultra
> Tess: Extreme
> 3D: disabled
> MultiMon: disabled
> AA: x8
> Full screen: checked
> Res: 1920x1080
> 
> When I overclocked the 290x to 1200mhz core/ 1250mhz memory I raised my Heaven score to 1420.
> 
> When I overclocked it to 1200mhz core / 1500mhz memory I got 1471.
> 
> My Heaven scores seem low compared to averages I have seen from others around here and other places on the internet.
> 
> Any idea why I am getting such low scores?
> 
> I have run Prime95 for over 2 hours without an error. I realize that just means it runs Prime95 well, but it would also indicate my CPU and RAM are seemingly stable. Just for kicks I set my CPU to stock speeds and my RAM to DDR3-1600 11-11-11-28, just to see if maybe it was a stability issue. My scores dropped by a small amount, but about what you would expect.
> 
> I have not seen any artifacting in these benches, so the GPU seems stable (though I did see artifacting when I pushed the core to 1200MHz without increasing the voltage enough).
> 
> EDIT:
> 
> Forgot to list my drivers.
> 
> I ran the Heaven benchmarks on both 13.12 and 14.14 and got similar results on both (just a few points' difference).
> 
> In 3DMark, I only tried 14.14.
> 
> Oh, and I used DDU in safe mode to wipe the drivers in between moving from 13.12 to 14.14.


I've had the exact same thing with my Sapphire R9 290 with the stock cooler.

First I got a Firestrike score of 6600+... temps very high, 95 degrees. So I bought an Arctic Accelero IV cooler: at 30% fan speed I got a score of 8700+, max temp 87 degrees.
Then I upped the fan speed to 75%, overclocked to 1200/1400, and got a score of 9800+ with a max temp of 66 degrees.
It's all about the temps... the card is throttling at high temps.


----------



## ABADY

From the reviews, is it true that the XFX R9 290 runs @ 93° under load?! Cuz I'll get one of the non-reference R9 290s soon.


----------



## Matt-Matt

Quote:


> Originally Posted by *ABADY*
> 
> From the reviews, is it true that the XFX R9 290 runs @ 93° under load?! Cuz I'll get one of the non-reference R9 290s soon.


If you don't turn the fan up, yes it does, because they try to keep it quiet. I had mine overclocked to 1075/1400 with a custom fan profile on the stock cooler, at like 83°C max.


----------



## rdr09

Quote:


> Originally Posted by *Goride*
> 
> Does anyone have any idea why all of my bench scores are much lower than other 290/290x owners?
> 
> My full system is listed in my account, but here is a quick lowdown on what I am running:
> CPU: 3570k @ 4.4ghz
> RAM: DDR3 1866 9-9-9-28
> GPU: 290x
> Resolution: 1920x1080
> 
> For these benches I was running at 1030MHz core and 1250MHz memory. I had the power limit at +50%, and I manually turned the fans up to 60% to avoid heat throttling. I also had GPU-Z logging and checked the log to verify that no throttling took place; temps never hit 95°C.
> 
> I turned off tessellation in CCC.
> 
> In 3DMark, Fire Strike 1.1, this was my score (default settings):
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> This score seems awfully low compared to Guru3d's review/bench:
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.guru3d.com/articles_pages/radeon_r9_290_review_benchmarks,27.html
> 
> Even the stock 290 blew me out of the water.
> 
> In Heaven this was my score:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> The following were my settings:
> API: DX11
> Quality: Ultra
> Tess: Extreme
> 3D: disabled
> MultiMon: disabled
> AA: x8
> Full screen: checked
> Res: 1920x1080
> 
> When I overclocked the 290x to 1200mhz core/ 1250mhz memory I raised my Heaven score to 1420.
> 
> When I overclocked it to 1200mhz core / 1500mhz memory I got 1471.
> 
> My Heaven scores seem low compared to averages I have seen from others around here and other places on the internet.
> 
> Any idea why I am getting such low scores?
> 
> I have run Prime95 for over 2 hours without an error. I realize that just means it runs Prime95 well, but it would also indicate my CPU and RAM are seemingly stable. Just for kicks I set my CPU to stock speeds and my RAM to DDR3-1600 11-11-11-28, just to see if maybe it was a stability issue. My scores dropped by a small amount, but about what you would expect.
> 
> I have not seen any artifacting in these benches, so the GPU seems stable (though I did see artifacting when I pushed the core to 1200MHz without increasing the voltage enough).
> 
> EDIT:
> 
> Forgot to list my drivers.
> 
> I ran the Heaven benchmarks on both 13.12 and 14.14 and got similar results on both (just a few points' difference).
> 
> In 3DMark, I only tried 14.14.
> 
> Oh, and I used DDU in safe mode to wipe the drivers in between moving from 13.12 to 14.14.


check out the cpu guru3d used - an sb-e oc'ed to 4.6GHz. you can't compare overall scores in 3dmark, just graphics.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> check out the cpu guru3d used - an sb-e oc'ed to 4.6GHz. you can't compare overall scores in 3dmark, just graphics.


Agreed, even though graphics scores vary from platform to platform.


----------



## bbond007

Quote:


> Originally Posted by *Goride*
> 
> Any idea why I am getting such low scores?


My benchmarks were lower before I did this:


Spoiler: Warning: Spoiler!







seems obvious, but I have helped others who did not know about that or simply forgot.

cheers!


----------



## Kokin

Quote:


> Originally Posted by *ABADY*
> 
> From the reviews, is it true that the XFX R9 290 runs @ 93° under load?! Cuz I'll get one of the non-reference R9 290s soon.


The reference cooler isn't that great, but it's not as bad as the reviews made it out to be. It can run at 93°C, but your fan would have to be set near-silent at that temp. I think 40~50% is tolerable for most people, and 60~70% if you want to overclock.

If I do 80~100% fan speed and overclock to 1275/1250 (artifacts, but it can still bench), I normally cap at 79~81°C. This is super loud though, and I would only recommend it for benching purposes and when no one else is in the house.
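The custom fan profiles people keep mentioning (Afterburner/Trixx curves) boil down to a piecewise-linear map from temperature to fan duty. A toy sketch of that idea, with made-up curve points rather than anyone's actual profile:

```python
# (temperature °C, fan %) points; between points the value is interpolated
CURVE = [(40, 20), (60, 40), (75, 55), (85, 80), (94, 100)]

def fan_percent(temp_c, curve=CURVE):
    """Piecewise-linear fan duty lookup, clamped at both ends."""
    if temp_c <= curve[0][0]:
        return float(curve[0][1])
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return float(curve[-1][1])
```

The tradeoff argued in this thread is just where you put the knee: a "quiet" reference-style profile keeps the right-hand points low and lets the card sit at 93-95°C, while the flat 55-60% speeds people recommend amount to a much steeper curve.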


----------



## Durquavian

Quote:


> Originally Posted by *Kokin*
> 
> The reference cooler isn't that great, but it's not as bad as the reviews made it out to be. It can run at 93°C, but your fan would have to be set near-silent at that temp. I think 40~50% is tolerable for most people, and 60~70% if you want to overclock.
> 
> If I do 80~100% fan speed and overclock to 1275/1250 (artifacts, but it can still bench), I normally cap at 79~81°C. This is super loud though, and I would only recommend it for benching purposes and when no one else is in the house.


I do the same with my Delta Mega Fast fan. Hook her up to 12V and see if deafness sets in.


----------



## ABADY

Quote:


> Originally Posted by *Matt-Matt*
> 
> If you don't turn the fan up, yes it does, because they try to keep it quiet. I had mine overclocked to 1075/1400 with a custom fan profile on the stock cooler, at like 83°C max.


Is it hotter than the MSI and Sapphire, or the same? What fan speed are you running now, and is it quiet?

Cuz I really can't choose what to get from these three (Tri-X - TF - DD).


----------



## sergec19

Quote:


> Originally Posted by *ABADY*
> 
> Is it hotter than the MSI and Sapphire, or the same? What fan speed are you running now, and is it quiet?
> 
> Cuz I really can't choose what to get from these three (Tri-X - TF - DD).


Yes it is... mine was running at 95° after 3 min of BF4, and the fan makes a jet noise at 60% or higher.
Now I have an Arctic Accelero IV and it runs at a max of 66° after some heavy benchmarking and gaming; the fan is at a max of 85% speed and I can't hear it!


----------



## Red1776

Quote:


> Originally Posted by *ABADY*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Matt-Matt*
> 
> If you don't turn the fan up, yes it does, because they try to keep it quiet. I had mine overclocked to 1075/1400 with a custom fan profile on the stock cooler, at like 83°C max.
> 
> 
> 
> Is it hotter than the MSI and Sapphire, or the same? What fan speed are you running now, and is it quiet?
> 
> Cuz I really can't choose what to get from these three (Tri-X - TF - DD).
Click to expand...

I ran 4x MSI R9 290X Twin Frozr while waiting for waterblocks, and they ran at 68°C while gaming.


----------



## dasparx

How is this score for 2 R9 290s @ 1050/1250, running at 8x PCIe 2.0 with a 2600K @ 4.8?
Running with a custom fan profile; fans go up to ~70%, temps are 79°C/77°C max.


----------



## ABADY

Quote:


> Originally Posted by *sergec19*
> 
> Yes it is... mine was running at 95° after 3 min of BF4, and the fan makes a jet noise at 60% or higher.
> Now I have an Arctic Accelero IV and it runs at a max of 66° after some heavy benchmarking and gaming; the fan is at a max of 85% speed and I can't hear it!


Wow! What's the point of their non-reference cooler then! Looks like I'll get the MSI.
Quote:


> Originally Posted by *Red1776*
> 
> I ran 4x MSI R9 290X Twin Frozr while waiting for waterblocks, and they ran at 68°C while gaming.


That's great. At what speed?

ــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــ

For overclocking, what's best, MSI or Sapphire? And is it possible to hit more than 1120/1450? Cuz I saw a video of someone hitting over 1600 on memory! Idk if it depends on luck or if it's fake!


----------



## heroxoot

Here is a question: if you put a BIOS for a 290X on a 290, what happens? Does it show the shader count of a 290X? I have wondered what happens with programs like GPU-Z when they read the card's hardware. I ask because I tried other 290X BIOSes on my card, but they all seem to crash. It makes me wonder.


----------



## MojoW

Quote:


> Originally Posted by *heroxoot*
> 
> Here is a question: if you put a BIOS for a 290X on a 290, what happens? Does it show the shader count of a 290X? I have wondered what happens with programs like GPU-Z when they read the card's hardware. I ask because I tried other 290X BIOSes on my card, but they all seem to crash. It makes me wonder.


It's very rare, but there is a little program to check if your card is unlockable.
If it is, you will have the shaders unlocked; on a normal 290 only the clock speeds change.
Some get a black screen with a 290X BIOS; I'm one of them too.
A fellow OCN member (forgot who) had a Gigabyte mobo, and a mobo BIOS update fixed the black screen for him.


----------



## heroxoot

Quote:


> Originally Posted by *MojoW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Here is a question: if you put a BIOS for a 290X on a 290, what happens? Does it show the shader count of a 290X? I have wondered what happens with programs like GPU-Z when they read the card's hardware. I ask because I tried other 290X BIOSes on my card, but they all seem to crash. It makes me wonder.
> 
> 
> 
> It's very rare, but there is a little program to check if your card is unlockable.
> If it is, you will have the shaders unlocked; on a normal 290 only the clock speeds change.
> Some get a black screen with a 290X BIOS; I'm one of them too.
> A fellow OCN member (forgot who) had a Gigabyte mobo, and a mobo BIOS update fixed the black screen for him.
Click to expand...

OK, so my card showing 64/176 ROPs/TMUs and 2816 shaders pretty much proves it is a 290X. That, and I don't have the issues unlocked cards do. I just question MSI's intelligence lately.

Also, comparing my card to a heavily OC'd 7950 with an i5 2500K, my rig obliterated it in Heaven, so I guess my performance difference compared to other 290X rigs is just luck. My card still easily outperforms the 7970 I had, too.


----------



## MojoW

Quote:


> Originally Posted by *heroxoot*
> 
> OK, so my card showing 64/176 ROPs/TMUs and 2816 shaders pretty much proves it is a 290X. That, and I don't have the issues unlocked cards do. I just question MSI's intelligence lately.
> 
> Also, comparing my card to a heavily OC'd 7950 with an i5 2500K, my rig obliterated it in Heaven, so I guess my performance difference compared to other 290X rigs is just luck. My card still easily outperforms the 7970 I had, too.


Very nice.
Does it out perform your original 290? If so, then it's all good!


----------



## heroxoot

Quote:


> Originally Posted by *MojoW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> OK, so my card showing 64/176 ROPs/TMUs and 2816 shaders pretty much proves it is a 290X. That, and I don't have the issues unlocked cards do. I just question MSI's intelligence lately.
> 
> Also, comparing my card to a heavily OC'd 7950 with an i5 2500K, my rig obliterated it in Heaven, so I guess my performance difference compared to other 290X rigs is just luck. My card still easily outperforms the 7970 I had, too.
> 
> 
> 
> Very nice.
> Does it out perform your original 290? If so, then it's all good!
Click to expand...

Oh no, no, you misunderstand. I had a 7970 Lightning before this, and as it stands, people with 290 cards at the same speeds are outbenching me. I have spent the last week trying to figure out the issue. It just crossed my mind that maybe my card wasn't a 290X, but I read that if it is an unlocked 290, the ROPs/TMUs do not show as 64/176; they show as something else, like 64/160. I don't think MSI is that dumb, but they could have been.


----------



## rdr09

Quote:


> Originally Posted by *heroxoot*
> 
> Oh no, no, you misunderstand. I had a 7970 Lightning before this, and as it stands, *people with 290 cards at the same speeds are outbenching me*. I have spent the last week trying to figure out the issue. It just crossed my mind that maybe my card wasn't a 290X, but I read that if it is an unlocked 290, the ROPs/TMUs do not show as 64/176; they show as something else, like 64/160. I don't think MSI is that dumb, but they could have been.


no, i saw your stock heaven score and it is higher than my stock 290's. i have an i7 SB at 4.5GHz. don't use unigine benchmarks as the basis, like me and a few others have been saying. i asked if you could run firestrike but you said it is too long. so . . . there it is. hard to compare.

if you'll compare your score with others, i suggest you at least make sure that the systems are using the same os and driver.


----------



## 033Y5

thought i would share with you guys that my xfx r9 290 core edition unlocks the extra shaders and core clock speed.
hopefully the link works:
http://www.techpowerup.com/gpuz/7k3ha/


----------



## heroxoot

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Oh no, no, you misunderstand. I had a 7970 Lightning before this, and as it stands, *people with 290 cards at the same speeds are outbenching me*. I have spent the last week trying to figure out the issue. It just crossed my mind that maybe my card wasn't a 290X, but I read that if it is an unlocked 290, the ROPs/TMUs do not show as 64/176; they show as something else, like 64/160. I don't think MSI is that dumb, but they could have been.
> 
> 
> 
> no, i saw your stock heaven score and it is higher than my stock 290's. i have an i7 SB at 4.5GHz. don't use unigine benchmarks as the basis, like me and a few others have been saying. i asked if you could run firestrike but you said it is too long. so . . . there it is. hard to compare.
> 
> if you'll compare your score with others, i suggest you at least make sure that the systems are using the same os and driver.
Click to expand...

Not too long; it wouldn't let me run it solo. If I run the whole 3DMark demo, is the Firestrike score included? That was what I was getting at.

So my score is fine for what my rig is? That actually makes me feel better. I'll run the 3DMark demo after I'm done with a few things.


----------



## rdr09

Quote:


> Originally Posted by *heroxoot*
> 
> Not too long, it wouldnt lt me run it solo. If I run all of 3D mark demo is firestrike score included? That was what I was getting at.
> 
> So my score is fine for what my rig is? That actually makes me feel better. I'll run 3D mark Demo after I'm done with a few things.


The demo includes Firestrike. I also suggest not leaving your fan on auto, just to eliminate the possibility of throttling. Set it at 60% or so just for benching; I recommend the highest tolerable fan speed. When I had the stock cooler on my reference card I set mine at 55% for gaming.

I must say that demo is long. Find a book to read.


----------



## heroxoot

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Not too long, it wouldnt lt me run it solo. If I run all of 3D mark demo is firestrike score included? That was what I was getting at.
> 
> So my score is fine for what my rig is? That actually makes me feel better. I'll run 3D mark Demo after I'm done with a few things.
> 
> 
> 
> the demo includes fs. i also suggest to not let leave your fan in auto just to eliminate the possibility of throttling. set at 60% or something just for benching. i recommend the highest but tolerable fan speed. when i had the stock cooler on my reference i set mine at 55% gaming.
> 
> i must say that demo is long. find a book to read.
Click to expand...

I keep mentioning that manual fan control isn't an option for me at all. I cannot control fan speed with ANYTHING! CCC won't show me a percentage, it shows me RPM values, and it does not let me peg it at a specific speed; when I hit Apply it resets to default. MSI AB shows no fan slider or curve tab. Trixx won't alter fan speeds either.

This seems to be an MSI problem past driver 13.12 and I have no idea what they did exactly. I heard they adopted a new API, but why that stops me from controlling the fan is beyond me.


----------



## rdr09

Quote:


> Originally Posted by *heroxoot*
> 
> I keep mentioning manual fan isn't an option for me at all. I cannot control fan speed with ANYTHING! CCC won't show me a % it shows me RPM amounts and it does not let me peg it at a specific speed. When I hit apply it resets to default. MSI AB shows no slider for fan nor the curve tab. Trixx won't alter fan speeds either.
> 
> This seems to be an MSI problem past driver 13.12 and I have no idea what they did exactly. I heard they adopted a new API, but why that stops me from controlling fan is beyond me.


That is enough reason to return the card, at least if you will depend on the stock cooler for cooling. No manual control should be reason enough to tell MSI: hey, this is not normal. It ain't. It's kind of hard to follow your complaints across several threads, sorry.


----------



## kizwan

Quote:


> Originally Posted by *heroxoot*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Oh no no you miss understand. I had a 7970 lightning before this, and as it stands *people with 290 cards of the same speeds are out benching me*. I have spent the last week trying to figure out the issue. It just crossed my mind maybe my card wasn't a 290X but I read if it was an unlocked 290 the ROPS do not show as 60/176 they show as something else like 60/160 or something. I don't think MSI is that dumb, but they could have been.
> 
> 
> 
> no, i saw your stock heaven score and it is higher than my stock 290. i have an i7 SB at 4.5GHz. don't use unigine benchmarks as the basis like me and a few others have been saying. i asked if you can run Firestrike but you said it is too long. so . . . there it is. hard to compare.
> 
> if you'll compare your score with others, i suggest you at least make sure that the systems are using the same os and driver.
> 
> Click to expand...
> 
> 
> 
> 
> Not too long, it wouldnt lt me run it solo. If I run all of 3D mark demo is firestrike score included? That was what I was getting at.
> 
> So my score is fine for what my rig is? That actually makes me feel better. I'll run 3D mark Demo after I'm done with a few things.
Click to expand...

Yeah, run 3DMark 11 (Performance) and/or Firestrike. Please note whether you run it with Tessellation ON or OFF, then compare the graphics score. If you want to compare with mine, I have runs at 1000/1300 and 1200/1600. Most of my scores are with Tessellation OFF, though.


----------



## Red1776

Quote:


> Originally Posted by *ABADY*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sergec19*
> 
> yes it is.. mine was running 95° after 3min bf4 and fan makes a Jet noise at 60% or higher..
> now i have an Arctic Accelero IV and its running max 66° after some heavy benchmarking and gaming, fan is max 85% speed and i cant hear it!
> 
> 
> 
> wow! whats the point of their non-reference cooler then ! looks like ill get msi .
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> I RAN 4x MSI R290X Twin Frozr while waiting for waterbloks and they ran at 68c while gaming.
> 
> Click to expand...
> 
> thats great . @ what speed ?
> 
> ــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــــ
> 
> for overclocking wahts the best msi or sapphire ?and is it possible to hit more than 1120/1450 ? cuz i saw a video of som1 hitting over 1600 on memory ! idk if it depend on luck or its fake !
Click to expand...

That was running @ 1150 core.


----------



## heroxoot

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> I keep mentioning manual fan isn't an option for me at all. I cannot control fan speed with ANYTHING! CCC won't show me a % it shows me RPM amounts and it does not let me peg it at a specific speed. When I hit apply it resets to default. MSI AB shows no slider for fan nor the curve tab. Trixx won't alter fan speeds either.
> 
> This seems to be an MSI problem past driver 13.12 and I have no idea what they did exactly. I heard they adopted a new API, but why that stops me from controlling fan is beyond me.
> 
> 
> 
> that is enuf reason to return the card. I mean if you will depend on the stock cooler for cooling. no manual control should be good enough to tell msi . . . hey, this is not normal. it ain't. kinda hard to follow your complaints from several threads. sorry.
Click to expand...

I've already told them about it and they said there is nothing wrong with the card. They told me the card is running fine as long as the fan speed increases when it gets hotter. I have only ever seen the card hit 86C in a benchmark, and it doesn't last long; it usually drops below 85C after a few seconds. This problem is definitely either driver related or BIOS related. As I said, fan speed control works on the 13.12 driver; it is only on the Mantle drivers that I cannot control it.

As for Firestrike: after the first test, Ice Storm, it played another demo and the program locked up. I tried to close it but I couldn't see the Task Manager as it was stuck behind it. I had to hard reboot and I do not want to do that again, so I will just have to pass on it. The card definitely does better than the 7970/280X I had, so I cannot complain in the end. The card is a 74% ASIC, however, which seems low, but luck of the draw, right?


----------



## ABADY

Quote:


> Originally Posted by *Red1776*
> 
> That was running @ 1150 core.


I meant the fan speed, and is it quiet? So your card is running @ 1150/? ?


----------



## Red1776

Quote:


> Originally Posted by *ABADY*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> That was running @ 1150 core.
> 
> 
> 
> i meant the fan speed and is it quiet ?. so ur card running @ 1150/? ?
Click to expand...

That is OC'd by 110MHz, but yes.

The fans are at 60% +/- and are tolerable compared with most brands.


----------



## ABADY

Quote:


> Originally Posted by *Red1776*
> 
> That is OC by 110MHz, but yes.
> The fans are at 60% +/- and are tolerable as far as most brands.


Wow, this is really messed up, I don't know what to get!

Sapphire is my last choice.


----------



## sergec19

Quote:


> Originally Posted by *Red1776*
> 
> That is OC by 110MHz, but yes.
> The fans are at 60% +/- and are tolerable as far as most brands.


I have a Sapphire reference card with an Arctic Accelero IV running @ 1200/1450 on stock voltages, so I'm happy. Max temp 66°C.


----------



## greenscobie86

^
Damn, I'm torn between the Gelid Icy Rev 2 and an Accelero... This last post has made me even more anxious lol.


----------



## heroxoot

Quote:


> Originally Posted by *sergec19*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> That is OC by 110MHz, but yes.
> The fans are at 60% +/- and are tolerable as far as most brands.
> 
> 
> 
> I have a sapphire reference with arctic accelero IV running @ 1200/1450 stock voltages so im happy. Max temp 66
Click to expand...

This sounds crazy. Stock voltage? Are the 290/x cards really so limited by stock clocks that they can OC this hard on stock voltages? My card automatically does +25mV on the core so I have just been OCing on that and it has also been stable thus far for what I have attempted.


----------



## kayan

Ok, so I overclocked my card. I've tried it with Trixx and MSI AB.

I can overclock my XFX 290x to 1110 Core on stock voltages, but anything past that artifacts.

I can run 1200/1250 @ 125mV in Trixx and AB. I'm on a full loop, so temps aren't a problem. Core is at 57C and VRM 1 & 2 are 59 & 50 respectively. I haven't overclocked my memory yet, as I was advised not to. In 3DMark I get an overall score of 9381 (I have an AMD CPU), and my highest graphics score thus far is 12324.

Is my card sucky?


----------



## rdr09

Quote:


> Originally Posted by *kayan*
> 
> Ok, so I overclocked my card. I've tried it with Trixx and MSI AB.
> 
> I can overclock my XFX 290x to 1110 Core on stock voltages, but anything past that artifacts.
> 
> I can run 1200/1250 @ 125mV in Trixx and AB. I'm on a full loop, so temps aren't a problem. Core is at 57C and VRM 1 & 2 are 59 & 50 respectively. I haven't overclocked my memory yet, as I was advised not to. In 3DMark I get an overall score of 9381 (I have an AMD CPU), and my highest graphics score thus far is 12324.
> 
> Is my card sucky?


Actually, your OC at that much voltage is not bad. What driver are you using? I just compared the 13.11 and 14.4 WHQLs and found that I have to add voltage for my OCs on the latter; 1100 now needs a +40 offset. On 13.11, I can OC up to 1175 without voltage tweaks, just by maxing out the power limit.

If you are using 14.4, then I bet you'll need less voltage on 13.11 or 13.12.


----------



## kayan

I'm actually using 13.12 whql.


----------



## rdr09

Quote:


> Originally Posted by *kayan*
> 
> I'm actually using 13.12 whql.


Still not bad. For suicide runs . . . I've used +200 in Trixx to reach 1300 with a 290; that is roughly equivalent to a 290X at 1260. You might find Trixx better for OC'ing single 290s, which is what I use. Your memory should do 1500; it helps bring the graphics score a bit higher.


----------



## sergec19

Quote:


> Originally Posted by *heroxoot*
> 
> This sounds crazy. Stock voltage? Are the 290/x cards really so limited by stock clocks that they can OC this hard on stock voltages? My card automatically does +25mV on the core so I have just been OCing on that and it has also been stable thus far for what I have attempted.


Sorry, that's a typo!

At stock voltage (1.2V) I have 1100/1400. On 1.3V I tried 1200/1450, successful through a few hours of gaming and benchmarking. I haven't fine-tuned it to the max speed or lowest vcore yet, but if I try 1235 on 1.3V I get artifacts, same as anything higher than 1100 on stock voltage.


----------



## Roy360

Just got my fuji pads, so I re-did my entire loop, and some strange things are happening after booting up the system. I'm guessing there's still some water in my PC, because everytime I "hit" the large Enter or backspace on my keyboard, the PC registers backslash instead. and according to GPUz, my XFX R9 290 (middle card in tri-fire setup) has over 4TB of memory. I uninstalled the broken R9 290, and \ and used Windows to automatical\ly install the drivers, and now I get System exception BSOD everytime I boot. DD\U and insst\\alling 14.4 as I write. Not quite sure why my system gives me so much trou\\ble with tri-fire R9 290s.
\
On the brig\tside, VRM1 is at \30 d\\e\\grees now:_)

EDIT: Somehow re-installing drivers fixed everything, including the keyboard....

Once again, the ASUS fails me: 10 degrees higher than reference.








The ASUS climbed to 67 shortly after uploading this pic.

In general the Fuji pads dropped the temps by 20 degrees.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *kayan*
> 
> Ok, so I overclocked my card. I've tried it with Trixx and MSI AB.
> 
> I can overclock my XFX 290x to 1110 Core on stock voltages, but anything past that artifacts.
> 
> I can run 1200/1250 @ 125mV in Trixx and AB. I'm on a full loop, so temps aren't a problem. Core is at 57C and VRM 1 & 2 are 59 & 50 respectively. I haven't overclocked my memory yet, as I was advised not to. In 3DMark I get an overall score of 9381 (I have an AMD CPU), and my highest graphics score thus far is 12324.
> 
> Is my card sucky?


No. I run 1200/1437 with +168 daily. Your card is good, I'd say above average.


----------



## PillarOfAutumn

How many watts do stock crossfired 290s require? Does anyone have a kill-a-watt to measure? How about 290s overclocked to 1100/1400 +100mv?

If I were to use crossfired 290s at stock vs ones OCed/OVed to max, about how much performance would I be losing?


----------



## phantomowl

Quote:


> Originally Posted by *Paul17041993*
> 
> Pushing up as in making it straight, or applying pressure to the chassis?
> 
> Sleeve-bearing fans always make noise, FYI, especially when facing up or down and running at 100% non-stop.


mine is really noisy too, but really tolerable.


----------



## Roy360

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> How many watts do stock crossfired 290s require? Does anyone have a kill-a-watt to measure? How about 290s overclocked to 1100/1400 +100mv?
> 
> If I were to use crossfired 290s at stock vs ones OCed/OVed to max, about how much performance would I be losing?


With a Corsair AX1200, 3 R9 290s at stock clocks, and an overclocked 3570K @ 5GHz, it drains 900W on the outlet side at full load. So ~800W?


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Roy360*
> 
> With a Corsair AX1200, 3 R9 290, at stock clocks and overclocked 3570k @ 5ghz drains 900W on the outlet side on full load. So ~800?


So with 2 290s, would you say you come close to 700watts?


----------



## Roy360

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> So with 2 290s, would you say you come close to 700watts?


It won't be 100% accurate, but I'll apply load on only 2 cards and check for you. But that sounds right.

EDIT:
No cards: ~200W
1 card: ~450W
2 cards: ~643W
ULPS off.
The AX1200 is an 80 Plus Gold PSU, so efficiency will range from about 85-95%. So yeah, a quality (Seasonic, XFX, Corsair) 750W PSU should be able to power 2 cards.
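To put those wall readings in DC terms, here is a minimal sketch, assuming a flat ~90% efficiency (a mid-range figure for an 80 Plus Gold unit; the real curve varies with load):

```python
# Sketch only: convert Kill-A-Watt (AC wall) readings to approximate DC load.
# The 0.90 efficiency is an assumption, not a measured value for the AX1200.
def dc_load(wall_watts, efficiency=0.90):
    """AC wall draw multiplied by efficiency approximates the DC-side load."""
    return wall_watts * efficiency

# Roy360's three readings from above
for label, wall in [("no GPU load", 200), ("1 card", 450), ("2 cards", 643)]:
    print(f"{label}: ~{dc_load(wall):.0f} W DC")
```

The two-card case works out to roughly 580W DC, which is why a quality 750W unit has headroom for a stock 290 pair.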


----------



## PillarOfAutumn

+rep, and may you have many more +reps!

Thanks for doing that for me, man! I have a Seasonic X750 running a watercooled 290. Right now I'm putting together 2 monitors to get a 1440p PLP setup with a 27" and two 15.4-inch screens. It would be pretty nice to run games at 4360x1440 ultrawide resolution.


----------



## Forceman

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> How many watts do stock crossfired 290s require? Does anyone have a kill-a-watt to measure? How about 290s overclocked to 1100/1400 +100mv?
> 
> If I were to use crossfired 290s at stock vs ones OCed/OVed to max, about how much performance would I be losing?


My system with a single 290X at +50mV, 1100/1350, draws about 400-425W at the wall with a 4770K, so on par with what Roy360 found.

And you can figure on about 10% performance gain from overclocked vs stock.
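That ~10% figure lines up with simple clock scaling; a back-of-envelope sketch, assuming a 1000MHz reference boost clock and best-case linear scaling with core clock (real games usually gain a bit less):

```python
# Sketch only: best-case uplift if performance scaled linearly with core clock.
# The 1000 MHz "stock" figure is the reference 290X boost clock, assumed here.
stock_mhz = 1000
oc_mhz = 1100
uplift = (oc_mhz / stock_mhz - 1) * 100
print(f"Best-case uplift from the core clock alone: ~{uplift:.0f}%")
```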


----------



## bond32

I've been really surprised at how much these cards pull when oc'ed. I wrote a bunch of posts a while back saying how I felt a 1000 watt psu could handle 3 290's but barely. I can admit I was way wrong... Mine pulls 350 watts according to gpuz max which is higher than I thought. My evga g2 1000 watt will do for 2 290x's but not 3 (whenever I can afford a second card).


----------



## phallacy

Quote:


> Originally Posted by *bond32*
> 
> I've been really surprised at how much these cards pull when oc'ed. I wrote a bunch of posts a while back saying how I felt a 1000 watt psu could handle 3 290's but barely. I can admit I was way wrong... Mine pulls 350 watts according to gpuz max which is higher than I thought. My evga g2 1000 watt will do for 2 290x's but not 3 (whenever I can afford a second card).


I was pulling around 1100W with a 4.4GHz 4770K OC and 3 290Xs at 1200/1600, with I believe +120-150mV each, in Crysis 3 after about 30 minutes. It hovered around 1030-1080W in 3DMark 11, Firestrike, and the Unigine benches. I added a 4th card recently, so now I am using 2 PSUs: an EVGA 1000W G2 and the 1300W G2.


----------



## Roboyto

Quote:


> Originally Posted by *bond32*
> 
> I've been really surprised at how much these cards pull when oc'ed. I wrote a bunch of posts a while back saying how I felt a 1000 watt psu could handle 3 290's but barely. I can admit I was way wrong... Mine pulls 350 watts according to gpuz max which is higher than I thought. My evga g2 1000 watt will do for 2 290x's but not 3 (whenever I can afford a second card).


*Yup, when pushing these cards to the brink they suck down some juice!*



*Run them at stock clocks and it's not so bad*. *This was with reference blower auto settings. If you're water cooling, consumption comes down; adding AIO to this card brought it down 21W, ~12%, at same settings. Don't have screenie yet, just finished that project late last night and the wife is using that computer presently*.


----------



## Roy360

Quote:


> Originally Posted by *Forceman*
> 
> My system with a single 290X at +50mV, 1100/1350, draws about 400-425W at the wall with a 4770K, so on par with what Roy360 found.
> 
> And you can figure on about 10% performance gain from overclocked vs stock.


Overclocking multi-card setups is a huge pain, especially when you use different PCBs.









Finding a sweet spot that all the cards will work on is such a pain. My XFX will do: 1150/1500 at stock, but my Visiontek will only go up to 1100/1500, despite being an "Overclocked" card with a higher than regular voltage. Finally, there's my beloved ASUS that won't accept any memory overclock above 1300 and runs 10 degrees hotter than the reference cards, despite being first in the loop.

But I will agree with you on the performance increase. In general I saw 4-5 fps increase when I tested and overclocked the cards separately.

Next next gen I'm waiting for the dual gpu card to come out first.

p.s I'd like to join the club plz.

ASUS R9290DC2OC4GD5
XFX R9-290A-ENFC
VisionTek R9 290

On all EK Full Cover Water blocks.


----------



## Mega Man

Quote:


> Originally Posted by *Roy360*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> My system with a single 290X at +50mV, 1100/1350, draws about 400-425W at the wall with a 4770K, so on par with what Roy360 found.
> 
> And you can figure on about 10% performance gain from overclocked vs stock.
> 
> 
> 
> overclocking mult-card setups are a huge pain, especially when you use different PCBs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Finding a sweet spot that all the cards will work on is such a pain. My XFX will do: 1150/1500 at stock, but my Visiontek will only go up to 1100/1500, despite being an "Overclocked" card with a higher than regular voltage. Finally, there's my beloved ASUS that won't accept any memory overclock above 1300 and runs 10 degrees hotter than the reference cards, despite being first in the loop.
> 
> But I will agree with you on the performance increase. In general I saw 4-5 fps increase when I tested and overclocked the cards separately.
> 
> Next next gen I'm waiting for the dual gpu card to come out first.
Click to expand...

Hate to tell you, but it's been out: the 295X2.


----------



## Roy360

Quote:


> Originally Posted by *Mega Man*
> 
> hate to tell you but it has been out 295x


I meant the generation after the next generation, i.e. the R9 495X or whatever they call it.

No way I'm upgrading from 3 R9 290s to 1 R9 295X2, especially given that my four R9 290s cost less than one of those cards.


----------



## Paul17041993

Quote:


> Originally Posted by *Roy360*
> 
> I meant the generation after the next generation. ie R9 495X or whatever they call the next one.
> 
> No way I'm upgrading from 3 R9 290s to 1 R9 295X, especially given that my four R9 290s cost less than one of those cards.


Could always do a 295X2 + 2x 290 if you wanted.


----------



## Arizonian

Quote:


> Originally Posted by *Roy360*
> 
> overclocking mult-card setups are a huge pain, especially when you use different PCBs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Finding a sweet spot that all the cards will work on is such a pain. My XFX will do: 1150/1500 at stock, but my Visiontek will only go up to 1100/1500, despite being an "Overclocked" card with a higher than regular voltage. Finally, there's my beloved ASUS that won't accept any memory overclock above 1300 and runs 10 degrees hotter than the reference cards, despite being first in the loop.
> 
> But I will agree with you on the performance increase. In general I saw 4-5 fps increase when I tested and overclocked the cards separately.
> 
> Next next gen I'm waiting for the dual gpu card to come out first.
> 
> p.s I'd like to join the club plz.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> ASUS R9290DC2OC4GD5
> XFX R9-290A-ENFC
> VisionTek R9 290
> 
> On all EK Full Cover Water blocks.


Congrats - added


----------



## lawson67

Has anyone crossfired the PowerColor PCS+ R9 290? ...And it being a triple-slot card, do they overheat from being so close to each other?


----------



## Sgt Bilko

Quote:


> Originally Posted by *heroxoot*
> 
> Not too long, it wouldnt lt me run it solo. If I run all of 3D mark demo is firestrike score included? That was what I was getting at.
> 
> So my score is fine for what my rig is? That actually makes me feel better. I'll run 3D mark Demo after I'm done with a few things.


http://www.3dmark.com/fs/1091649

I ran that when I had an 8150 and a 290X. Your scores are fine.


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Not too long, it wouldnt lt me run it solo. If I run all of 3D mark demo is firestrike score included? That was what I was getting at.
> 
> So my score is fine for what my rig is? That actually makes me feel better. I'll run 3D mark Demo after I'm done with a few things.
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/1091649
> 
> I ran that when i had an 8150 and a 290x. Your scores are fine.
Click to expand...

That with Tessellation ON or OFF?


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> That with Tessellation ON or OFF?


Tess on, just got the 290x so it was a quick and dirty stock volt bench.


----------



## HOMECINEMA-PC

http://www.3dmark.com/fs/2068285


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *lawson67*
> 
> Has anyone crossfired the Powercolor PCS+ R9 290? ...And being a triple slot card do they over heat due to the fact of being so close to each other?


If it does, put a fan on the cards.


----------



## magicase

Does anyone know if PCI-E 3.0 running at x8/x8/x4 will have bandwidth issues with a 290 Tri CF or is x16/x16/x8 needed?


----------



## BradleyW

Quote:


> Originally Posted by *magicase*
> 
> Does anyone know if PCI-E 3.0 running at x8/x8/x4 will have bandwidth issues with a 290 Tri CF or is x16/x16/x8 needed?


You should be alright: the 3rd card will be useless thanks to drivers, and applications have high CPU overhead, which means the CPU will be the bottleneck anyway before the slots have a chance to be an issue in terms of bandwidth.


----------



## phallacy

Quote:


> Originally Posted by *magicase*
> 
> Does anyone know if PCI-E 3.0 running at x8/x8/x4 will have bandwidth issues with a 290 Tri CF or is x16/x16/x8 needed?


Yes, some small bottlenecking will occur depending on your CPU OC. x16/x16/x8 is only achievable on the X79 platform IIRC, so with an LGA 2011 CPU there should be some difference. I used trifire with a PLX 1150 mobo in an x8/x16/x8 config and it worked great. Unless you plan on only running synthetic benches, the difference in games would be like 5-10%, assuming all else is kept equal.
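For context on those lane counts, here is a rough sketch of effective link bandwidth using the usual post-encoding figures (~0.5 GB/s per lane for PCIe 2.0 with 8b/10b, ~0.985 GB/s per lane for 3.0 with 128b/130b); these numbers are from the PCIe specs, not from the posts above:

```python
# Sketch only: effective one-direction bandwidth per link configuration.
# Figures are approximate, after encoding overhead.
PER_LANE_GBS = {"2.0": 0.5, "3.0": 0.985}

def link_bw(gen, lanes):
    """Approximate usable bandwidth in GB/s for a PCIe link."""
    return PER_LANE_GBS[gen] * lanes

print(f"PCIe 3.0 x8 : {link_bw('3.0', 8):.2f} GB/s")
print(f"PCIe 3.0 x4 : {link_bw('3.0', 4):.2f} GB/s")   # the slot that may pinch in tri-fire
print(f"PCIe 2.0 x16: {link_bw('2.0', 16):.2f} GB/s")
```

Note a 3.0 x8 slot has nearly the bandwidth of a 2.0 x16, which is part of why 3.0 x8/x8/x4 holds up better than the raw lane counts suggest.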


----------



## DeadlyDNA

Quote:


> Originally Posted by *magicase*
> 
> Does anyone know if PCI-E 3.0 running at x8/x8/x4 will have bandwidth issues with a 290 Tri CF or is x16/x16/x8 needed?


I'm using quad 290s on X79. I've run some tests on PCIe 3.0 vs 2.0 and right now the results are shocking me. Keep in mind I'm testing high resolutions. I haven't posted the results yet because they are almost identical. I am going to dig deeper when I have more time.

I will say that if my testing is without flaw, you will not see a difference. However, I have yet to do the same testing at 1080p. I almost feel like 2.0 vs 3.0 is a myth at this point, at least with the R9 2xx series, except maybe the 295X2?


----------



## ahnafakeef

Hello everyone! I need some advice on which PSU to get for the following system that will have a Tri-X 290 in it.

1. CPU: Core i5 4570
2. RAM: Corsair Vengeance 8GB DDR3 1600MHz
3. Monitor: Dell S2240L
4. GPU: R9 290 Tri-X Edition
5. Motherboard: Z87-G43
6. HDD: 1TB Western Digital Caviar Blue

The PSU needs to be from this list: http://globalbrand.com.bd/computer-accessories/desktop-accessories%20/power-supply?sort=p.price&order=DESC&limit=100

Would I need to get a bigger PSU if I want to OC the GPU? If I don't OC, which PSU will get the job done?

Thank you very much. I really appreciate your help.


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/2068285












For some unknown reason, I no longer get the "tessellation load modified" notice with 3DMark. I already checked: with Tessellation OFF I still get the corresponding score; I just don't get the tessellation-modified message anymore.


----------



## sinnedone

So I haven't been keeping up with these newer cards and have some questions about overclocking them.

Is the built in overdrive preferred now over other apps like afterburner?

What exactly can you check to see if it's voltage locked? Can flashing another BIOS unlock voltage if it is locked? (Sapphire 290)

Any tips/tricks or possibly overclock specific threads I can check out?

Thanks all.


----------



## sergec19

Over 10,000 in Firestrike!!

I think I have a very good score with one R9 290!

And over 14,000 in 3DMark 11.


----------



## the9quad

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> http://www.3dmark.com/fs/2068285


Nice! Can you run FS Extreme? I'd like to see how I compare; I get 14060 on Extreme.
http://www.3dmark.com/fs/1299035


----------



## Roboyto

Quote:


> Originally Posted by *ahnafakeef*
> 
> Hello everyone! I need some advice on which PSU to get for the following system that will have a Tri-X 290 in it.
> 
> 1. CPU: Core i5 4570
> 2. RAM: Corsair Vengeance 8GB DDR3 1600MHz
> 3. Monitor: Dell S2240L
> 4. GPU: R9 290 Tri-X Edition
> 5. Motherboard: Z87-G43
> 6. HDD: 1TB Western Digital Caviar Blue
> 
> The PSU needs to be from this list: http://globalbrand.com.bd/computer-accessories/desktop-accessories%20/power-supply?sort=p.price&order=DESC&limit=100
> 
> Would I need to get a bigger PSU if I want to OC the GPU? If I don't OC, which PSU will get the job done?
> 
> Thank you very much. I really appreciate your help.


I am on my phone and did not check your list, but a good 650W would be just fine for your application. My Rosewill Capstone 650W has been flawless: it has no problems powering my 4770K @ 4.5GHz and my 290 up to 1300/1700, with 3 SSDs, 1 HDD, 6 fans, and a water pump.

If you're not planning on overclocking either component, you can likely even get away with as little as a good 450W. My media center, which I just put a reference PowerColor 290 OC in, isn't having any issues on a Rosewill Capstone 450W with a 3770K; CPU and GPU are both at stock clocks with an AIO water cooler on each.

My wife has had zero issues playing Tomb Raider at 1080p max settings for several hours at a time. I still need to do some more testing to see how reliable it's going to be, but with the water cooler on the 290 running at stock 975/1250 it's only pulling 160W.

I have my Corsair RM650 on standby if I run into problems... but I only need to drive 1080p on my TV, so I won't be needing to OC the 290, which can literally double its power draw.
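As a rough cross-check of the 450-650W advice for that i5-4570 + Tri-X 290 build, a hypothetical sizing sketch; every wattage below is a ballpark estimate of mine, not a measurement:

```python
# Sketch only: all component wattages are rough assumed estimates.
parts = {
    "i5 4570 (stock)": 85,          # near its 84 W TDP under load
    "R9 290 Tri-X (stock)": 250,
    "board, RAM, HDD, fans": 75,
}
load = sum(parts.values())
recommended = load * 1.3            # ~30% headroom for transients / a future OC
print(f"Estimated load: {load} W; suggested PSU: ~{recommended:.0f} W")
```

Under these assumptions a good 550-650W unit covers the build with margin, which matches the advice above.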


----------



## Caos

Hi, I have an R9 290 Vapor-X with the GPU OC'd to 1100 and memory at 1400 (stock), and the power limit at +30%. Would I have to set it to +50%? Playing BF4, the maximum temperature is 60°C, with VRM1 at 56°C and VRM2 at 53°C.


----------



## Kokin

Quote:


> Originally Posted by *sinnedone*
> 
> So I haven't been keeping up with these newer cards and have some questions about overclocking them.
> 
> Is the built in overdrive preferred now over other apps like afterburner?
> 
> What exactly can you check to see if its voltage locked? Can flashing another bios unlock voltage if locked? (Saphire 290)
> 
> Any tips/tricks or possibly overclock specific threads I can check out?
> 
> Thanks all.


Overdrive is more for "automatic" OCing as it will boost according to temps and whatever you set it to. If you prefer more control to your OC, I would suggest using Sapphire Trixx or MSI Afterburner to get the exact clocks and fan profiles to your liking.

Most of the R9 290s should be voltage unlocked, but I would wait for other folks to chime in about voltage locked cards. If the voltage regulator used is a standard one, a BIOS change should unlock voltage, but if the regulator is different, it most likely will stay voltage locked.

This was the main thread I used to learn about the 290 cards including how to OC them.

GL!


----------



## D3v1L0

Hello, greetings to all. I'm having problems configuring 3 monitors in Eyefinity; can someone help?

I have 2 PowerColor R9 290X cards. Any single output (except HDMI) can run at 144Hz, but when I connect 2 or 3 screens they will only run at 60Hz, not 144Hz. Does anyone know why?

I'm cabled with DVI-D and DisplayPort 1.2.

With my previous Sapphire R9 290X crossfire I ran 3 screens in portrait (DVI-D, DVI-D, DP) at 3240 x 1980 @ 144Hz.

With the PowerColor R9 290X crossfire now, the same DVI-D, DVI-D, DP portrait setup at 3240 x 1980 does not work at 144Hz, only 60Hz.

A single monitor at 144Hz works perfectly.

Help please!


----------



## Tobiman

Quote:


> Originally Posted by *heroxoot*
> 
> So I just cannot understand what to do. My 290X performs worse with an Overclock. How can I solve this?


Quote:


> Originally Posted by *D3v1L0*
> 
> Hello, greetings to all. I'm having problems configuring three monitors in Eyefinity. Can anyone help?
> 
> I have two PowerColor R9 290X cards. A single monitor will run at 144Hz on any output (except HDMI), but when I connect two or three screens I can only get 60Hz, not 144Hz. Does anyone know why?
> 
> I'm using DVI-D cables and DisplayPort 1.2.
> 
> With my previous Sapphire R9 290X crossfire I ran three portrait monitors (DVI-D, DVI-D, DP) at 3240 x 1920 @ 144Hz.
> 
> Now with the PowerColor R9 290X crossfire the same setup does not work at 144Hz, only 60Hz.
> 
> A single monitor at 144Hz works perfectly.
> 
> Help please.


This seems to be a limit that AMD placed on their Radeon line of cards. It's only possible with Nvidia Surround.


----------



## D3v1L0

Quote:


> Originally Posted by *Tobiman*
> 
> This seems to be a limit that AMD placed on their Radeon line of cards. It's only possible with Nvidia Surround.


Hi, thanks for replying. My previous crossfire was the same setup but Sapphire branded, and the new one is PowerColor. Could the brand have something to do with it? Can this limitation be removed?

When I try to run Eyefinity at 144Hz the PC crashes.

My motherboard is an MSI Z87 XPower.


----------



## sinnedone

Quote:


> Originally Posted by *Kokin*
> 
> Overdrive is more for "automatic" OCing as it will boost according to temps and whatever you set it to. If you prefer more control to your OC, I would suggest using Sapphire Trixx or MSI Afterburner to get the exact clocks and fan profiles to your liking.
> 
> Most of the R9 290s should be voltage unlocked, but I would wait for other folks to chime in about voltage locked cards. If the voltage regulator used is a standard one, a BIOS change should unlock voltage, but if the regulator is different, it most likely will stay voltage locked.
> 
> This was the main thread I used to learn about the 290 cards including how to OC them.
> 
> GL!


Thank you. I'm researching right now to see which "reference" version 290 I should get.


----------



## Kokin

Quote:


> Originally Posted by *D3v1L0*
> 
> Hi, thanks for replying. My previous crossfire was the same setup but Sapphire branded, and the new one is PowerColor. Could the brand have something to do with it? Can this limitation be removed?
> 
> When I try to run Eyefinity at 144Hz the PC crashes.
> 
> My motherboard is an MSI Z87 XPower.


Are you using the same drivers? I read that a few people had problems getting Eyefinity to work after updating to the recent 14.4 drivers, but still worked perfectly after reverting back to 13.12.
Quote:


> Originally Posted by *sinnedone*
> 
> Thank you. I'm researching right now to see which "reference" version 290 I should get.


All of the reference ones are the same. There are a few non-reference coolers (2 or 3 fans) that keep the reference PCB, so you know you have unlocked voltage and can mount a waterblock later if you choose. An example is the Sapphire Tri-X.

Please see this list for ones that use the reference PCB: http://www.coolingconfigurator.com/waterblock/3831109868584

There's another list that includes a few non-reference coolers that still use reference PCBs: http://www.coolingconfigurator.com/waterblock/3831109869017


----------



## D3v1L0

Quote:


> Originally Posted by *Kokin*
> 
> Are you using the same drivers? I read that a few people had problems getting Eyefinity to work after updating to the recent 14.4 drivers, but still worked perfectly after reverting back to 13.12.


No, with the previous crossfire I had 13.12, and now I installed the new 14.10. I'll have to try the previous driver and see how it goes.

But if anyone has other suggestions, I'm all ears.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For some unknown reason, I no longer get the tessellation load modified message with 3DMark. I already checked: when Tess is OFF, I did get the correct score. I just don't get the tessellation modified message anymore.


You beauty








Gonna try and get some subs in da HOF


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *the9quad*
> 
> Nice can you run FS extreme? I'd like to see how I compare. I get 14060 on extreme.
> http://www.3dmark.com/fs/1299035


I've only done 1 run of TRI FSE and you're ahead

http://www.3dmark.com/fs/2038353


----------



## the9quad

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I've only done 1 run of TRI FSE and you're ahead
> 
> http://www.3dmark.com/fs/2038353


Thanks, I am sure you'll smoke me when you try. Pretty jealous of your setup. I just cannot afford watercooling (if I want to stay married lol)


----------



## Talon720

Well damn, my 3rd 290X might be a dud. The eBay ad said it was new, but when I took the cooler off, the little X bracket wasn't there. I figured it was a Sapphire thing. Also, the memory thermal pads were black but the VRM thermal pads were grey, hmm. I suppose my 3rd slot could be bad, as I haven't put anything else in there, but I have a gut feeling it's this card since I got it from eBay, which would be my luck trying to save a few dollars. It's not detected in BIOS or Windows. I thought about trying to RMA it to Sapphire and just not telling them, but I feel like they are going to see that it's different; if I could tell it looked different, they can tell. One bad thing about water cooling: taking parts in and out for testing and replacing.


----------



## Roboyto

Quote:


> Originally Posted by *sinnedone*
> 
> So I haven't been keeping up with these newer cards and have some questions about overclocking them.
> 
> Is the built in overdrive preferred now over other apps like afterburner?
> 
> What exactly can you check to see if its voltage locked? Can flashing another bios unlock voltage if locked? (Saphire 290)
> 
> Any tips/tricks or possibly overclock specific threads I can check out?
> 
> Thanks all.


As Kokin said, keep away from CCC. It will likely cause more problems than give you results.

Afterburner/Trixx/GPU Tweak are the most common OC utility choices, I would stick to them.

I don't believe any 290(X) is voltage locked, but I am not 100% certain; I haven't seen anyone mention it. It wouldn't really make sense since these are the enthusiast class cards.

This is the thread you want to pay attention to for 290(X) information. There can be a lot to wade through, but if you take your time there is plenty of information.

Feel free to ask away in the thread, or send a PM, but don't be discouraged if you don't get a response as this thread moves extremely quickly at times and your post can get buried; Re-post if need be

*The R9 290(X) "Need to Know"*

*Anyone please feel free to PM/quote/@Roboyto to correct/add information.*








*These cards make a lot of heat*










*Have great airflow with a great heatsink, or plan on watercooling them.*
They will run normally up to 94C and *throttle at 95C*, reducing clocks/voltages.




*Why do they run so hot you might be wondering?*
AMD has crammed 6.2 billion transistors into a 438 mm² die
Compared to the GTX 780, which has 7.1 billion transistors in a 561 mm² die
Nvidia has ~15% more transistors but uses ~28% more die area to do so
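Those density figures can be sanity-checked with quick arithmetic (die specs as quoted above; a back-of-the-envelope sketch, not vendor data):

```python
# Back-of-the-envelope check of the Hawaii vs GK110 die-density comparison.
hawaii_transistors = 6.2e9   # R9 290(X) "Hawaii"
hawaii_area_mm2 = 438.0

gk110_transistors = 7.1e9    # GTX 780 "GK110"
gk110_area_mm2 = 561.0

more_transistors = gk110_transistors / hawaii_transistors - 1  # ~14.5% more transistors
more_area = gk110_area_mm2 / hawaii_area_mm2 - 1               # ~28% more die area

density_hawaii = hawaii_transistors / hawaii_area_mm2 / 1e6    # ~14.2M transistors/mm^2
density_gk110 = gk110_transistors / gk110_area_mm2 / 1e6       # ~12.7M transistors/mm^2

print(f"GK110: {more_transistors:.1%} more transistors in {more_area:.1%} more area")
print(f"Density: Hawaii {density_hawaii:.1f}M/mm^2 vs GK110 {density_gk110:.1f}M/mm^2")
```

A denser die concentrates the same heat in less area, which is part of why Hawaii needs serious cooling.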




*Keep an eye on VRM1 temps, <90C is where you want to be for VRM1*
VRM1 controls core voltage and is the *long vertical strip* closest to the PCIe power connectors.
Cooling VRM1 can drastically impact the efficiency of the card
And possibly 'unlock' some additional overclocking potential



*VRM Locations*




*VRM2 controls RAM* voltage and is nearest the video output connections. 
You won't likely need to be as concerned with keeping them cool since they don't run nearly as hot.
From my experience with a Kraken G10 on a 290, all you need is a good heatsink(s) on VRM2 to keep it cool; no airflow is really even necessary.


Kraken & Gelid Heatsink Cooling - http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition/20#post_22214255





*I've heard that the 290(X) cards have blackscreen issues, what's up with that?*


There *was* a problem with *the first* batch of 290(X) cards that were sold
Most of those cards were RMA'd, and AMD addressed a driver issue with an update
Any driver 13.12 and up includes the fix


I had experienced black screen issues, but only when using up to 14.4 drivers on Win7 64-bit
I used 13.12 on Win7 until I switched to Win8.1

*14.9 has given me no issues with Win7 or Win8.1*
Some users seem to be having better luck with 14.4 and Win8/8.1
I was very pleased with Win8.1 and 14.6 Beta

*Unstable RAM overclock will cause blackscreen/lockup* - read further for more info


*Having issues with video playback in your browser?*
Disabling hardware acceleration will likely solve this problem



Spoiler: Right-Click - Settings - Hardware Acceleration

















*I heard that 290s are unlockable to 290X with a BIOS flash?!?*


True statement. XFX & PowerColor had the highest probability.
Unfortunately that ship has sailed








Very early reference 290s had a fair chance
I wouldn't expect any custom cooled card to unlock, as only 5 have been marked as successful via the thread below.

If you purchased a used reference card from a miner, see this thread for information on how to check if your card is unlockable:
Courtesy of @airisom2 - http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread
How to flash your BIOS information also within this ^ thread


*What should you benchmark with? *


Remember a benchmark is a benchmark...Stability in any of the following is a good test, *but does not guarantee* gaming/daily use stability.
3DMark
Unigine/Heaven
Catzilla
Tomb Raider
Bioshock
FFXIV (my fav hands down "It's Stable" test)
etc

*FurMark/Kombustor = poor choice*








Gives unrealistic power draw and causes excessive heat
Poor representation of what happens when playing a game
Don't know many games with swaying furry objects









If you're going to run it, use the 15min benchmark











*What should you Overclock with?*


Stay away from Catalyst Control Center aka OverDrive
Afterburner
Trixx
GPU Tweak
Requires ASUS card or ASUS BIOS

If you have poor results with one utility, uninstall and try another.


*Your benchmark score went down after OCing? *
The card is *throttling* due to temperature.
"May indicate instability. Reduce the overclock and retest" @BradleyW
Or, add a little voltage/power limit *IF temperatures will allow*

Make sure you *Force Constant Voltage* in your OC utility
Make sure you *Disable ULPS* via registry or OC Utility; especially with multiple GPUs!



*Core clock is king! Overclock core 1st, worry about memory 2nd. The 512-bit bus makes up for the 5GHz default memory speeds.*
If you experience *artifacting/tearing/flashing* then odds are you:
pushed core clock too high
need to add more voltage (if temperatures will allow)
should increase power limit (if temperatures will allow)
If you can't add more voltage/power, bring the core clock down a little
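The "512-bit bus makes up for it" point is just bandwidth arithmetic; a quick sketch (GTX 780 figures included purely for comparison, using nominal spec numbers):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective memory transfer rate).
def bandwidth_gbs(bus_width_bits, effective_clock_ghz):
    """Peak memory bandwidth in GB/s."""
    return bus_width_bits / 8 * effective_clock_ghz

stock = bandwidth_gbs(512, 5.0)   # 290(X) at the stock 5GHz effective clock -> 320 GB/s
ram_oc = bandwidth_gbs(512, 6.0)  # with a strong RAM OC -> 384 GB/s
gtx780 = bandwidth_gbs(384, 6.0)  # GTX 780 (384-bit, ~6GHz effective) -> 288 GB/s

print(f"Stock: {stock:.0f} GB/s | RAM OC: {ram_oc:.0f} GB/s | GTX 780: {gtx780:.0f} GB/s")
```

Even at stock memory clocks the 290(X) already out-bandwidths a GTX 780, which is why chasing core clock pays off more than chasing RAM speed.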



*If you experience black screen, BSOD, or other lockups then odds are you have pushed the RAM clock too high.*
*RAM speed gives a fraction of a performance boost compared to core*
MSI Afterburner AUX voltage can sometimes assist with RAM clocks.
Make sure you *Force Constant Voltage* in your OC utility
Make sure you *Disable ULPS* via registry or OC Utility; especially with multiple GPUs!
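For reference, the "via registry" route means setting `EnableUlps` to `0` under each AMD GPU's display-class key (most OC utilities do this for you with a checkbox). A sketch of what the .reg entry looks like; the numbered subkeys (`0000`, `0001`, ...) vary per system, so find the ones that actually contain `EnableUlps` before touching anything:

```
Windows Registry Editor Version 5.00

; Display adapters class key -- repeat for each numbered GPU subkey
; (0000, 0001, ...) that already contains an EnableUlps value.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Reboot afterwards, and re-check after driver installs, since a fresh driver can set the value back to 1.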



*RAM Speed Comparisons I found @ 1080P*




*Above average cards can get to ~1100 core clock without needing additional voltage. *
This is of course dependent upon:
silicon lottery/specific GPU
PSU
cooling
voltage regulation
and the varying card/BIOS default voltage settings




*What is maximum core clock for a 290(X)?*
This is dependent on all of the same factors listed directly above
However, if you have a capable card *AND* cooling
~1300 core clock is excellent
Higher speeds achievable with a remarkably outstanding card and/or dry ice/LN2




*Elpida or Hynix RAM, which is better?*


*Most* of the time you have a better *chance* to get a good OC out of Hynix memory.
This isn't always the case, as there are several users who have *reported excellent Elpida RAM speeds*.

But remember, *RAM speeds are 2nd priority* compared to core, so I wouldn't fret over this all that much.
*Outstanding* RAM OC for 290(X) is ~6.5GHz-7GHz


*Don't be worried about ASIC quality.*


ASIC seems to be very inaccurate for these cards.
Test the card to see how well it does.

*Want stable drivers?*


http://support.amd.com/en-us/download
Catalyst 14.9 (previously I recommended 13.11/13.12) has given me no problems on Win7 or Win8.1
I experienced blackscreen/lockup with 14.4 & Win7
Other users on Win 8/8.1 seem to be having better luck with 14.4

14.6 Beta has been great for me on 8.1

*Make sure you remove ANY old drivers properly before installing new ones:*
*Display Driver Uninstaller* - http://www.guru3d.com/files_get/display_driver_uninstaller_download,9.html
*Courtesy of @BradleyW* http://www.overclock.net/t/988215/how-to-properly-uninstall-ati-amd-software-drivers-for-graphics-cards
*Courtesy of @BradleyW* http://www.overclock.net/t/1150443/how-to-remove-your-nvidia-gpu-drivers#post_15432476
Sometimes a fresh OS installation is the only way to guarantee *ALL* traces of old drivers are gone



*Omega Drivers...Is the hype worth it for my 290(X)?!?*


I hate to be the bearer of bad news, but from my experience there isn't much to get excited about for performance gains with these cards...see link below
http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/33460_20#post_23300417

The drivers haven't given me any headaches, so that is a bonus.
However, I am only running single GPUs in both my machines.


*How big of a PSU do you need? *


*This is dependent upon:*
 CPU
If you're using an AMD FX 8-core chip with a high overclock, then be mindful of their power hungry tendencies; they can draw ~300W alone

# of GPUs
what clocks/voltages you will run them at
How cool you can keep them
lower temps = more efficiency
Ambient temperature affects this as well



Reference R9 290 @ stock settings is in the ~180W range
290(X) with a very high overclock/voltage can be in ~350W range
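Putting the rough figures above together, a hedged sizing sketch (the `other_watts` and headroom numbers are my own ballpark assumptions, not measurements; when in doubt, size up):

```python
# Back-of-the-envelope PSU sizing from the rough draw figures quoted above.
def recommended_psu_watts(gpu_watts, gpu_count, cpu_watts,
                          other_watts=75, headroom=1.25):
    """Sum estimated component draw, then add headroom so the PSU isn't
    run flat out (quality units are most efficient around 50-80% load)."""
    load = gpu_watts * gpu_count + cpu_watts + other_watts
    return load * headroom

# Heavily OC'd single 290(X) (~350W) + OC'd FX 8-core (~300W):
single_oc = recommended_psu_watts(gpu_watts=350, gpu_count=1, cpu_watts=300)

# Crossfire pair near stock (~180W each) + a typical ~100W CPU:
xfire_stock = recommended_psu_watts(gpu_watts=180, gpu_count=2, cpu_watts=100)

print(f"Single OC'd card + FX: ~{single_oc:.0f}W")
print(f"Near-stock Crossfire: ~{xfire_stock:.0f}W")
```

Treat the output as a floor, not a target; OC both cards in Crossfire and you land in the 750-1000W+ territory mentioned below.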


*With high OC/voltage (1) card can pull 350W at full load. *
Max voltage in AB or Trixx will put you around 1.4V core; unsure for GPU Tweak. 
Additional voltage can be had with an extreme BIOS such as the ASUS PT1/PT1T

It's "safe" if you can keep core *and* VRM1 cool.
I say "safe" because OC and overvolting is done at *your own risk*!



*Latts O' Watts! 1315/1700 +200mV +50% Power Limit*




*Cold hard facts for overclocked 290X power usage courtesy of* @SonDa5
http://www.overclock.net/t/1441118/290x-psu-power-output-tests#post_21156921



*Need a good PSU? 3 guides courtesy of* @shilka
http://www.overclock.net/t/1482157/700-750-watts-comparison-thread#post_22109815
http://www.overclock.net/t/1438987/1000-1050-watts-comparison-thread#post_21108368
http://www.overclock.net/t/1483789/1200-1350-watts-comparison-thread#post_22141960



*How small of a PSU can I get away with?*
My HTPC runs flawlessly with the following:
450W Rosewill Capstone
Reference 290 - 1075/1375 +37mV
4770k - 4.3GHz - 1.252V & 1.800 VRIN
8GB 2133MHz @ 1.5V
1 SSD, 2 HDD & BD-RW
2 AIO pumps & 3 fans presently
290 @ 1075/1375 +37mV using ~220W
~160W @ stock with AIO

After you factor in AIO water pumps, fans, SSD, HDD and ODD...this system is still easily under the 450W mark.


This is obviously a *mild OC on the GPU/CPU*, but still shows that a relatively low wattage, high quality, PSU can push "big" hardware if you're smart about it.
http://www.overclock.net/t/1478544/the-build-formerly-known-as-the-devil-inside-a-gaming-htpc-for-my-wife-4770k-corsair-250d-powercolor-r9-290



*My main desktop flawlessly runs:*
650W Rosewill Capstone
(1) 290 up to 1300/1700 +200mV @ 5760*1080 benching
1200/1500 +87mV @ 5760*1080 everyday gaming clocks

4770k - 4.5GHz @ 1.259V
16GB RAM 2400MHz @ 1.65V
3 SSDs & 1 HDD
6 fans & 1 water pump
*The Shrunken Beast -* http://www.overclock.net/t/1456279/honey-i-shrunk-the-ultra-tower-beast-my-journey-to-creating-a-more-compact-pc-with-an-r9-290


*Which waterblock/backplate is best?*


Pretty much all waterblocks will cool the GPU core the same
The core isn't the *problem child* of the 290(X) though, it's *VRM1 *



*Best overall waterblock/backplate performance when using exactly what is given to you in the box goes to Aquacomputer.*
Whether you choose the passive/active backplate, you will get the best VRM1 temps here. 




Spoiler: Active and Passive Backplates





Card looks a lot better with the active attachment; but it is not necessary as you get 90% of the performance without it.






*Other manufacturers have slacked in the VRM1 area, especially when you start pushing these cards...but there is an easy fix!*
Replacing the thermal pads with Fujipoly Ultra Extreme will slash VRM temps.
*See my thread here:* http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures
Make sure you double check pad thickness for your specific block in my ^ thread
the 15x100mm strip of Fujipoly Ultra Extreme is easily enough to do 2 cards; 3 if you cut thin enough.




*I want to add a full cover waterblock, do I have a reference design card?*
Check EK's site: www.coolingconfigurator.com
A Review of your card may tell you as well: http://www.techpowerup.com/reviewdb/Graphics-Cards/AMD/R9-290/





Spoiler: Reference Design















*What non-reference cards are there full cover blocks for?*
ASUS DC2 & Matrix - EK
MSI Gaming & Lightning - EK
Gigabyte made a change to their PCB - EK has updated their reference blocks to Rev 2.0 
https://www.facebook.com/EKWaterBlocks/photos/a.204208322966540.61821.182927101761329/633154450071923/?type=1



Spoiler: Gigabyte PCB Change



















*Which card/brand has the best air cooler?*


For the *290X the MSI Lightning* undoubtedly takes the cake
For a *290* this probably goes to the *PowerColor PCS+* and its massive *3-slot* *cooler*. If you have room for it then you may not need to look much further.
If you want a *2-slot cooler* then you have plenty of good choices
MSI, Sapphire, Gigabyte, VisionTek and XFX all have capable coolers on their cards
However, nearly all of these add to the already ridiculous length of the 290(X) PCB at 278mm/~11 in.
It appears that the MSI Gaming and VisionTek are the only two coolers that *don't extend past the PCB*



*You didn't mention the ASUS DC2/Matrix?*
That is because it has the worst performance from outright laziness on ASUS' part
They took the cooler from the GK110-based GTX 780 and slapped it on the 290(X)
The 290(X) die is much smaller than GK110, so only 3/5 heatpipes make contact; severely inhibiting performance

I wouldn't bother unless you plan to use a waterblock or more extreme cooling
I do think the heatpipe contact could *possibly* be remedied with a thin copper shim to help distribute heat to the other 2 pipes












*So you're thinking about Xfiring a pair of these bad boys**?*


As stated previously, you must have a sufficient power supply. 
Depending on your OC goals and CPU draw you will likely be in the 750-1000 watt range.

Cooling is going to be just as, if not *more*, *important* for an enjoyable 290(X) Xfire experience
If you're sticking with air cooling, then you *MUST HAVE a ridiculous amount of airflow* to keep the VRMs cool. 
All the 3rd party coolers sling the hot air all around your case so it must be evacuated properly
You will want to fill every fan location possible to assist in cooling these beasts
Typically front, bottom, and side panel locations are best to run as intake

You may still be unhappy with temps/OCs especially when you start applying additional voltage


*Xfire with Full Cover water blocks?*
Good Choice
You'll be hard pressed to find more thoroughly documented information than what is in the build log below.
Courtesy of @Gunderman456: http://www.overclock.net/t/1442038/build-log-the-hawaiian-heat-wave


*My card runs too hot, is too noisy, and/or custom water cooling isn't possible for me...what are my modding options?*


*Gelid Icy Vision 2* - http://www.newegg.com/Product/Product.aspx?Item=N82E16835426026
Courtesy of @tony_choi - http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x



*Arctic Cooling Accelero Extreme IV* - http://www.newegg.com/Product/Product.aspx?Item=N82E16835186097
Xtreme III results courtesy of TomsHardware - http://www.tomshardware.com/reviews/r9-290-accelero-xtreme-290,3671.html



*Attach an AIO water cooler*
*Arctic Accelero Hybrid II* - http://www.newegg.com/Product/Product.aspx?Item=N82E16835186095
*NZXT Kraken G10* - http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=kraken+g10&N=-1&isNodeId=1
*G10/290 in my 250D* - http://www.overclock.net/t/1478544/the-build-formerly-known-as-the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-290/20#post_22214255




*Need Heatsinks for VRMs or RAM?*
*Gelid VRM Heatsink Kit* - http://www.feppaspot.com/servlet/the-897/GELID-Solutions-Icy-Vision/Detail
Definitely the best solution for reference cards as no modification is needed
You can see results from this kit in my 250D thread ^ just above ^

*You have many options for RAM sinks*
Here is the cheap/easy solution I chose from 3M
*3M 8810* - http://www.performance-pcs.com/catalog/index.php?main_page=product_info&products_id=39462
Thermal tape is pre-applied. Easy to install *and* easy to remove if need be





*My PowerColor 290 w/ Gelid VRM Kit + 3M 8810 Ram Sinks*



*What thermal paste, or TIM, should I use for my card?*


This boils down to a bit of personal preference as there are so many to choose from, but the most popular TIMs are popular because they work
*Gelid GC-Extreme*
*Arctic MX-4*
Both of these deliver optimum results immediately (non-curing), are non-conductive, and have a long service life.


*Only 2 products...you didn't list my favorite thermal paste?*
There are countless different products that deliver similar/better results; Xigmatek PTI-G4512 is my personal fav
Gelid GC-Extreme and Arctic MX-4 simply work, are safe, cheap and easily attainable



*I've heard about CoolLaboratory Liquid Pro/Ultra (CLP/CLU), does it really work that much better?*
It does deliver superior results compared to traditional paste, but there are a couple things to be mindful of
It's metal and conducts electricity, be very careful when applying/removing this stuff!
Put some masking tape down around the GPU die; better safe than $300 sorry

It can have a shorter service life than a traditional paste
This depends on how rough you are on your GPU(s), so keep an eye on those temperatures

It can be difficult to apply and even more difficult to remove
I have used CLU and can attest that it is easy to apply and remove
Pro is harder to work with from what I have read

Is it OK to use with a bare copper heatsink/waterblock?
I have read of more than one instance where a heatsink or block "fused" to a lapped/polished IHS when cooling a CPU
It is probably best to use it with a nickel-plated heatsink/waterblock so this doesn't happen
http://www.overclock.net/t/1377337/how-should-i-apply-coollaboratory-liquid-pro-to-my-cpu
http://www.overclock.net/t/1364267/question-about-coollaboratory-liquid-ultra-thermel-paste/10 





*Anyone please feel free to PM/quote/@Roboyto to correct/add information.*


----------



## Sgt Bilko

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> As Kokin said, keep away from CCC. It will likely cause more problems than give you results.
> 
> Afterburner/Trixx/GPU Tweak are the most common OC utility choices, I would stick to them.
> 
> I don't believe any 290(X) is voltage locked, but I am not 100% certain; I haven't seen anyone mention it. It wouldn't really make sense since these are the enthusiast class cards.
> 
> This is the thread you want to pay attention to for 290(X) information. There can be a lot to wade through, but if you take your time there is plenty of information.
> 
> Feel free to ask away in the thread, or send a PM, but don't be discouraged if you don't get a response as this thread moves extremely quickly at times and your post can get buried; Re-post if need be
> 
> *The R9 290 "Need to Know"*
> 
> *Anyone please feel free to chime in and correct me if I am wrong in my following statements*
> 
> _*These cards make a lot of heat. Have great airflow, a great heatsink, or plan on watercooling them.*_
> Card will continue to run normally up to 94C, _*they throttle at 95C*_ reducing clocks/voltages.
> 
> 
> *Keep an eye on VRM1 temps, <90C is where you want to be for VRM1.*
> VRM1 controls core voltage and is the *long vertical strip* closest to the PCIe power connectors.
> 
> 
> 
> 
> Spoiler: VRM locations
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *VRM2 controls RAM* voltage and is nearest the video output connections.
> You won't likely need to be concerned with keeping them cool since they don't run as hot.
> 
> 
> *What should you benchmark with? *
> 3DMark, Unigine, Catzilla, etc.
> 
> 
> *Your benchmark score went down after OCing? *
> The card is probably _*throttling*_ due to temperature.
> Make sure you *Force Constant Voltage* in your OC utility
> Make sure you *Disable ULPS* via registry or OC Utility; especially with multiple GPUs!
> 
> 
> *FurMark is a poor choice*
> it gives unrealistic power draw.
> It is also a poor representation of what happens when playing a game
> don't know many that involve continuous repetition of a swaying furry doughnut.
> 
> 
> 
> *With high OC/voltage (1) card can pull 350W at full load. *
> Max voltage in AB or Trixx will put you around 1.4V core; unsure for GPU Tweak.
> It's "safe" if you can keep core and VRM1 cool.
> 
> 
> 
> 
> Spoiler: Latts O' Watts!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Core clock is king. * *Overclock core 1st, worry about memory 2nd. The 512-bit bus makes up for the 5GHz default memory speeds.*
> If you experience artifacting/tearing/flashing then odds are you have either A) pushed core clock too high, or B) need to add more voltage. If you can't add more voltage, bring the core clock down a little.
> If you experience black screen, BSOD, or other lockups then odds are you have pushed the RAM clock too high.
> MSI Afterburner AUX voltage can sometime assist in RAM clocks.
> Make sure you *Force Constant Voltage* in your OC utility
> Make sure you *Disable ULPS* via registry or OC Utility; especially with multiple GPUs!
> 
> 
> 
> 
> 
> Spoiler: RAM Clock Comparisons I Found
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Elpida or Hynix, which is better?*
> _*Most*_ of the time you have a better _*chance*_ to get a good OC out of Hynix memory.
> This isn't always the case, as there are several users who have _*reported excellent Elpida RAM speeds*_.
> But, remember, _*RAM speeds are 2nd fiddle*_ to core so I wouldn't fret over this all that much.
> 
> 
> *Don't be so worried about ASIC quality. *
> Test the card to see how well it does.
> 
> 
> *Above average cards can get to ~1100 core clock without needing additional voltage. *
> This is of course dependent upon:
> silicon lottery
> specific GPU
> cooling
> voltage regulation
> and the varying card/BIOS default voltage settings.
> 
> 
> 
> *Want stable drivers?*
> 13.12 was the bees knees up until 4/25/14 when 14.4 WHQL released.
> 14.4 seems OK so far, but if you have issues *13.12 is ROCK SOLID.*
> *Make sure you remove ANY old drivers properly before installing new ones:* http://www.guru3d.com/files_get/display_driver_uninstaller_download,9.html
> 
> 
> *How big of a PSU do you need? *
> 
> This is dependent upon:
> CPU
> # of GPUs
> what clocks/voltages you will run them at.
> 
> Reference R9 290 @ stock settings is in the ~180W range
> 290(X) with high overclock/voltage can be in ~350W range
> If you're using an AMD FX 8-core chip with a high overclock, then be mindful of their power hungry tendencies; especially when OCd
> 
> *From my personal experience my HTPC flawlessly runs:*
> 290 & 3770k at stock settings with a 450W Rosewill Capstone
> A 3770k at stock is easily under 100W
> The 290 is only drawing ~160W with an AIO on it
> After you factor in AIO water pumps, fans, SSD, HDD and ODD...this system is still easily under the 450W mark.
> 
> 
> *My main desktop flawlessly runs:*
> 650W Rosewill Capstone
> (1) 290 up to 1300/1700 @ 5760*1080 benching
> 1200/1500 @ 5760*1080 everyday gaming clocks
> 
> 4770k @ 4.5GHz
> 16GB RAM 2400MHz
> 3 SSDs & 1 HDD
> 6 fans & 1 water pump
> 
> 
> *Which waterblock/backplate is best?*
> 
> Pretty much all waterblocks will cool the GPU core the same
> The core isn't the problem child of the 290(X) though, *VRM1* is.
> 
> 
> *Best overall waterblock performance when using exactly what is given to you in the box goes to Aquacomputer.*
> Whether you choose the passive/active backplate, you will get the best VRM1 temps here.
> 
> 
> 
> 
> Spoiler: Active and Passive Backplates
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *The other manufacturers (EK, XSPC, Koolance) seem to have slacked in the VRM1 area, especially when you start pushing these cards...but there is an easy fix!*
> Replacing the thermal pads with Fujipoly Ultra Extreme will slash VRM temps.
> _*See my thread here:*_ http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures
> Make sure you check pad thickness for your specific block
> the $20 15x100mm strip is enough to do 2-3 cards
> 
> 
> 
> 
> 
> *Anyone please feel free to chime in and correct me if I am wrong in my preceding statements*


I'm gonna +Rep you for that post.

Well put-together and well thought out


----------



## Mega Man

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Quote:
> 
> 
> 
> Originally Posted by *magicase*
> 
> Does anyone know if PCI-E 3.0 running at x8/x8/x4 will have bandwidth issues with a 290 Tri CF or is x16/x16/x8 needed?
> 
> 
> 
> I'm using quad 290s on X79. I've run some tests on PCIe 3.0 vs 2.0, and right now the results are shocking me. Keep in mind I'm testing high resolutions. I haven't posted the results yet because they are almost identical. I am going to dig deeper when I have more time.
> 
> I will say that if my testing is without flaw, you will not see a difference. However, I have yet to do the same testing at 1080p. I almost feel like 2.0 vs 3.0 is a myth at this point with the R9 2xx series, except maybe the 295X2?
Click to expand...

Not really; it is known that it makes little to no difference, unless you're saying otherwise.
Quote:


> Originally Posted by *Tobiman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> So I just cannot understand what to do. My 290X performs worse with an Overclock. How can I solve this?
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *D3v1L0*
> 
> Hello, greetings to all. I'm having problems configuring three monitors in Eyefinity. Can anyone help?
> 
> I have two PowerColor R9 290X cards. A single monitor will run at 144Hz on any output (except HDMI), but when I connect two or three screens I can only get 60Hz, not 144Hz. Does anyone know why?
> 
> I'm using DVI-D cables and DisplayPort 1.2.
> 
> With my previous Sapphire R9 290X crossfire I ran three portrait monitors (DVI-D, DVI-D, DP) at 3240 x 1920 @ 144Hz.
> 
> Now with the PowerColor R9 290X crossfire the same setup does not work at 144Hz, only 60Hz.
> 
> A single monitor at 144Hz works perfectly.
> 
> Help please.
> 
> Click to expand...
> 
> This seems to be a limit that AMD placed on their Radeon line of cards. It's only possible with Nvidia Surround.
Click to expand...

Nope, it is either a Windows issue or a driver one; I have the same problem on my 7970s in Win 7. Win 8 works no problem though...
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roboyto*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> As Kokin said, keep away from CCC. It will likely cause more problems than give you results.
> 
> Afterburner/Trixx/GPU Tweak are the most common OC utility choices, I would stick to them.
> 
> I don't believe any 290(X) is voltage locked, but I am not 100% certain; I haven't seen anyone mention it. It wouldn't really make sense since these are the enthusiast class cards.
> 
> This is the thread you want to pay attention to for 290(X) information. There can be a lot to wade through, but if you take your time there is plenty of information.
> 
> Feel free to ask away in the thread, or send a PM, but don't be discouraged if you don't get a response as this thread moves extremely quickly at times and your post can get buried; Re-post if need be
> 
> *The R9 290 "Need to Know"*
> 
> *Anyone please feel free to chime in and correct me if I am wrong in my following statements*
> 
> _*These cards make a lot of heat. Have great airflow, a great heatsink, or plan on watercooling them.*_
> Card will continue to run normally up to 94C, _*they throttle at 95C*_ reducing clocks/voltages.
> 
> 
> *Keep an eye on VRM1 temps, <90C is where you want to be for VRM1.*
> VRM1 controls core voltage and is the *long vertical strip* closest to the PCIe power connectors.
> 
> 
> 
> 
> Spoiler: VRM locations
> 
> *VRM2 controls RAM* voltage and is nearest the video output connections.
> You won't likely need to be concerned with keeping them cool since they don't run as hot.
> 
> 
> *What should you benchmark with? *
> 3DMark, Unigine, Catzilla, etc.
> 
> 
> *Your benchmark score went down after OCing? *
> The card is probably _*throttling*_ due to temperature.
> Make sure you *Force Constant Voltage* in your OC utility
> Make sure you *Disable ULPS* via registry or OC Utility; especially with multiple GPUs!
> 
> 
> *FurMark is a poor choice*
> It gives unrealistic power draw.
> It is also a poor representation of what happens when playing a game;
> I don't know many games that involve continuous repetition of a swaying furry doughnut.
> 
> 
> 
> *With high OC/voltage (1) card can pull 350W at full load. *
> Max voltage in AB or Trixx will put you around 1.4V core; unsure for GPU Tweak.
> It's "safe" if you can keep core and VRM1 cool.
> 
> 
> 
> 
> Spoiler: Latts O' Watts!
> 
> *Core clock is king. * *Overclock core 1st, worry about memory 2nd. The 512-bit bus makes up for the 5GHz default memory speeds.*
> If you experience artifacting/tearing/flashing then odds are you have either A) pushed core clock too high, or B) need to add more voltage. If you can't add more voltage, bring the core clock down a little.
> If you experience black screen, BSOD, or other lockups then odds are you have pushed the RAM clock too high.
> MSI Afterburner AUX voltage can sometimes assist in RAM clocks.
> Make sure you *Force Constant Voltage* in your OC utility
> Make sure you *Disable ULPS* via registry or OC Utility; especially with multiple GPUs!
> 
> 
> 
> 
> 
> Spoiler: RAM Clock Comparisons I Found
> 
> *Elpida or Hynix, which is better?*
> _*Most*_ of the time you have a better _*chance*_ to get a good OC out of Hynix memory.
> This isn't always the case, as there are several users who have _*reported excellent Elpida RAM speeds*_.
> But, remember, _*RAM speeds are 2nd fiddle*_ to core so I wouldn't fret over this all that much.
> 
> 
> *Don't be so worried about ASIC quality. *
> Test the card to see how well it does.
> 
> 
> *Above average cards can get to ~1100 core clock without needing additional voltage. *
> This is of course dependent upon:
> silicon lottery
> specific GPU
> cooling
> voltage regulation
> and the varying card/BIOS default voltage settings.
> 
> 
> 
> *Want stable drivers?*
> 13.12 was the bees knees up until 4/25/14 when 14.4 WHQL released.
> 14.4 seems OK so far, but if you have issues *13.12 is ROCK SOLID.*
> *Make sure you remove ANY old drivers properly before installing new ones:* http://www.guru3d.com/files_get/display_driver_uninstaller_download,9.html
> 
> 
> *How big of a PSU do you need? *
> 
> This is dependent upon:
> CPU
> # of GPUs
> what clocks/voltages you will run them at.
> 
> Reference R9 290 @ stock settings is in the ~180W range
> 290(X) with high overclock/voltage can be in ~350W range
> If you're using an AMD FX 8-core chip with a high overclock, then be mindful of their power hungry tendencies; especially when OCd
> 
> *From my personal experience my HTPC flawlessly runs:*
> 290 & 3770k at stock settings with a 450W Rosewill Capstone
> A 3770k at stock is easily under 100W
> The 290 is only drawing ~160W with an AIO on it
> After you factor in AIO water pumps, fans, SSD, HDD and ODD...this system is still easily under the 450W mark.
> 
> 
> *My main desktop flawlessly runs:*
> 650W Rosewill Capstone
> (1) 290 up to 1300/1700 @ 5760*1080 benching
> 1200/1500 @ 5760*1080 everyday gaming clocks
> 
> 4770k @ 4.5GHz
> 16GB RAM 2400MHz
> 3 SSDs & 1 HDD
> 6 fans & 1 water pump
> 
> 
> *Which waterblock/backplate is best?*
> 
> Pretty much all waterblocks will cool the GPU core the same
> The core isn't the problem child of the 290(X) though, *VRM1* is.
> 
> 
> *Best overall waterblock performance when using exactly what is given to you in the box goes to Aquacomputer.*
> Whether you choose the passive/active backplate, you will get the best VRM1 temps here.
> 
> 
> 
> 
> Spoiler: Active and Passive Backplates
> 
> *The other manufacturers (EK, XSPC, Koolance) seem to have slacked in the VRM1 area, especially when you start pushing these cards...but there is an easy fix!*
> Replacing the thermal pads with Fujipoly Ultra Extreme will slash VRM temps.
> _*See my thread here:*_ http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures
> Make sure you check pad thickness for your specific block
> the $20 15x100mm strip is enough to do 2-3 cards
> 
> 
> 
> 
> 
> *Anyone please feel free to chime in and correct me if I am wrong in my preceding statements*
> 
> 
> 
> I'm gonna +Rep you for that post.
> 
> Well put-together and well thought out
Click to expand...

agreed


----------



## Roboyto

Quote:
Originally Posted by *Sgt Bilko* 


> I'm gonna +Rep you for that post.
> 
> Well put-together and well thought out


Thanks!

I think that covers most of the frequent repeat questions for this thread.
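One quick addition of my own (a back-of-the-envelope sketch, not from the guide itself): the "How big of a PSU do you need?" section is just summing rough component draws and leaving headroom. The wattages and the 85% comfort margin below are my assumptions; swap in your own parts:

```python
# Back-of-the-envelope PSU check using the rough wattages from the guide.
# All numbers are approximations, not measurements.

def psu_headroom(psu_watts, component_watts, margin=0.85):
    """Return (total_draw, ok): ok means the summed draw stays within
    `margin` (85% here) of the PSU rating, leaving some headroom."""
    total = sum(component_watts.values())
    return total, total <= psu_watts * margin

# The HTPC example from the guide: 290 @ stock (~160W) + stock 3770k (<100W)
htpc = {"R9 290 stock w/ AIO": 160, "3770k stock": 100, "pumps/fans/drives": 60}
print(psu_headroom(450, htpc))      # -> (320, True)

# One heavily overvolted 290 (~350W) plus a 4.5GHz 4770k on a 650W unit
desktop = {"R9 290 1300/1700 high volts": 350, "4770k @ 4.5GHz": 130, "drives/fans/pump": 70}
print(psu_headroom(650, desktop))   # -> (550, True)
```

Obviously a sketch; transient spikes and PSU aging are exactly why the guide's generous sizing advice still stands.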


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Roboyto*
> 
> As Kokin said, keep away from CCC. It will likely cause more problems than give you results.
> 
> Afterburner/Trixx/GPU Tweak are the most common OC utility choices, I would stick to them.
> 
> I don't believe any 290(X) is voltage locked, but I am not 100% certain; I haven't seen anyone mention it. It wouldn't really make sense since these are the enthusiast class cards.
> 
> This is the thread you want to pay attention to for 290(X) information. There can be a lot to wade through, but if you take your time there is plenty of information.
> 
> Feel free to ask away in the thread, or send a PM, but don't be discouraged if you don't get a response as this thread moves extremely quickly at times and your post can get buried; Re-post if need be
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *The R9 290(X) "Need to Know"*
> 
> *Anyone please feel free to chime in and correct me if I am wrong in my following statements*
> 
> _*These cards make a lot of heat. Have great airflow, a great heatsink, or plan on watercooling them.*_
> Card will continue to run normally up to 94C, _*they throttle at 95C*_ reducing clocks/voltages.
> 
> 
> *Keep an eye on VRM1 temps, <90C is where you want to be for VRM1.*
> VRM1 controls core voltage and is the *long vertical strip* closest to the PCIe power connectors.
> 
> *VRM Locations*
> 
> 
> 
> *VRM2 controls RAM* voltage and is nearest the video output connections.
> You won't likely need to be concerned with keeping them cool since they don't run as hot.
> 
> 
> *What should you Overclock with?*
> Stay away from Catalyst Control Center aka OverDrive
> Afterburner
> Trixx
> GPU Tweak
> 
> 
> 
> *What should you benchmark with? *
> 3DMark
> Unigine
> Catzilla
> Tomb Raider
> Bioshock
> FFXIV (my favorite hands down "It's Stable" test)
> etc
> 
> 
> *FurMark is a poor choice*
> It gives unrealistic power draw.
> It is also a poor representation of what happens when playing a game;
> I don't know many games that involve continuous repetition of a swaying furry doughnut.
> 
> 
> 
> *Your benchmark score went down after OCing? *
> The card is probably _*throttling*_ due to temperature.
> Make sure you *Force Constant Voltage* in your OC utility
> Make sure you *Disable ULPS* via registry or OC Utility; especially with multiple GPUs!
> 
> 
> 
> *With high OC/voltage (1) card can pull 350W at full load. *
> Max voltage in AB or Trixx will put you around 1.4V core; unsure for GPU Tweak.
> It's "safe" if you can keep core and VRM1 cool.
> 
> *Latts O' Watts!*
> 
> 
> 
> 
> *Core clock is king. * *Overclock core 1st, worry about memory 2nd. The 512-bit bus makes up for the 5GHz default memory speeds.*
> If you experience artifacting/tearing/flashing then odds are you have either A) pushed core clock too high, or B) need to add more voltage. If you can't add more voltage, bring the core clock down a little.
> If you experience black screen, BSOD, or other lockups then odds are you have pushed the RAM clock too high.
> MSI Afterburner AUX voltage can sometimes assist in RAM clocks.
> Make sure you *Force Constant Voltage* in your OC utility
> Make sure you *Disable ULPS* via registry or OC Utility; especially with multiple GPUs!
> 
> 
> *RAM Speed Comparisons I found*
> 
> 
> 
> 
> *Elpida or Hynix, which is better?*
> _*Most*_ of the time you have a better _*chance*_ to get a good OC out of Hynix memory.
> This isn't always the case, as there are several users who have _*reported excellent Elpida RAM speeds*_.
> But, remember, _*RAM speeds are 2nd fiddle*_ to core so I wouldn't fret over this all that much.
> 
> 
> *Don't be so worried about ASIC quality. *
> Test the card to see how well it does.
> 
> 
> *Above average cards can get to ~1100 core clock without needing additional voltage. *
> This is of course dependent upon:
> silicon lottery
> specific GPU
> cooling
> voltage regulation
> and the varying card/BIOS default voltage settings.
> 
> 
> 
> *Want stable drivers?*
> 13.12 was the bees knees up until 4/25/14 when 14.4 WHQL released.
> 14.4 seems OK so far, but if you have issues *13.12 is ROCK SOLID.*
> *Make sure you remove ANY old drivers properly before installing new ones:* http://www.guru3d.com/files_get/display_driver_uninstaller_download,9.html
> 
> 
> *How big of a PSU do you need? *
> 
> This is dependent upon:
> CPU
> # of GPUs
> what clocks/voltages you will run them at.
> 
> Reference R9 290 @ stock settings is in the ~180W range
> 290(X) with high overclock/voltage can be in ~350W range
> If you're using an AMD FX 8-core chip with a high overclock, then be mindful of their power hungry tendencies; especially when OCd
> 
> *From my personal experience my HTPC flawlessly runs:*
> 450W Rosewill Capstone
> 290 @ stock
> 3770k @ stock
> 1 SSD, 2 HDD and 1 ODD
> 2 AIO pumps & 3 fans presently
> A 3770k at stock is easily under 100W
> The 290 is only drawing ~160W with an AIO on it
> After you factor in AIO water pumps, fans, SSD, HDD and ODD...this system is still easily under the 450W mark.
> 
> 
> *My main desktop flawlessly runs:*
> 650W Rosewill Capstone
> (1) 290 up to 1300/1700 @ 5760*1080 benching
> 1200/1500 @ 5760*1080 everyday gaming clocks
> 
> 4770k @ 4.5GHz
> 16GB RAM 2400MHz
> 3 SSDs & 1 HDD
> 6 fans & 1 water pump
> 
> 
> *Which waterblock/backplate is best?*
> 
> Pretty much all waterblocks will cool the GPU core the same
> The core isn't the problem child of the 290(X) though, *VRM1* is.
> 
> 
> *Best overall waterblock performance when using exactly what is given to you in the box goes to Aquacomputer.*
> Whether you choose the passive/active backplate, you will get the best VRM1 temps here.
> 
> 
> 
> 
> Spoiler: Active and Passive Backplates
> 
> *The other manufacturers (EK, XSPC, Koolance) seem to have slacked in the VRM1 area, especially when you start pushing these cards...but there is an easy fix!*
> Replacing the thermal pads with Fujipoly Ultra Extreme will slash VRM temps.
> _*See my thread here:*_ http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures
> Make sure you check pad thickness for your specific block
> the $20 15x100mm strip is enough to do 2-3 cards
> 
> 
> 
> 
> 
> 
> *Anyone please feel free to chime in and correct me if I am wrong in my preceding statements*


Yerp Rep + Good dummies guide
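Since "Disable ULPS via registry" comes up in the guide so often: Afterburner/Trixx have a checkbox for it, but if you'd rather do it by hand it's one DWORD per GPU. This is a sketch of the commonly reported tweak, not official AMD documentation. The {4d36e968-...} GUID is the standard display-adapter class key, but the 0000/0001 subkey numbers are assumptions; check which 00nn subkeys match your 290(X) in regedit before importing:

```reg
Windows Registry Editor Version 5.00

; Disables ULPS (Ultra Low Power State) on the first two display adapters.
; ASSUMPTION: your cards sit at the 0000/0001 subkeys - verify the
; DriverDesc under each 00nn subkey before importing. Reboot afterwards.

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0001]
"EnableUlps"=dword:00000000
```

OC utilities that offer a ULPS toggle flip the same value, so use one method or the other, not both.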


----------



## DeadlyDNA

Quote:


> Originally Posted by *Mega Man*


the tests I ran showed no difference between 3.0 x16/x8/x8/x8 vs 2.0 x16/x8/x8/x8. I just wanted to run more and find a way to be 100% positive, because I was testing at 11520x2160 for the PCIe lanes.

I verified the mobo settings with the GPU-Z fullscreen test - the only thing changed between the 2 runs was the PCIe setting.

edit: I also used GPU-heavy tests, to try and eliminate any CPU bottleneck


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *the9quad*
> 
> Thanks, I am sure you'll smoke me when you try. Pretty jealous of your setup. I just cannot afford watercooling (if I want to stay married lol)


Yeah thanks man

I don't have a wife so that's why I can do what I can do


----------



## Samurai Batgirl

So I'm having trouble with what I think is my R9 290. (I don't even remember if I've updated my sig rig to say I have one, but whatever.)

Anyway, I've been crashing and blue screening. I'm not sure why. The four most recent drivers, including betas, haven't fixed the blue screen and I don't know if they're what's caused them.

Everything in my sig rig is up to date (again, except maybe the 290). I reformatted before I installed this card, and I always do clean installs of drivers.

If needed, I can upload the most recent blue screen dumps.

Help would very much be appreciated.


----------



## kizwan

Quote:


> Originally Posted by *Samurai Batgirl*
> 
> So I'm having trouble with what I think is my R9 290. (I don't even remember if I've updated my sig rig to say I have one, but whatever.)
> 
> Anyway, I've been crashing and blue screening. I'm not sure why. The four most recent drivers, including betas, haven't fixed the blue screen and I don't know if they're what's caused them.
> 
> Everything in my sig rig is up to date (again, except maybe the 290). I reformatted before I installed this card, and I always do clean installs of drivers.
> 
> If needed, I can upload the most recent blue screen dumps.
> 
> Help would very much be appreciated.


Need more info:-

Crashing & BSOD while doing...?
Overclock...?
Temps for both core & VRM1 when it occurs?
BSOD bug code(s)?
CPU overclock? Stable?
Always crashing & BSOD since day one or just recently?


----------



## eternal7trance

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yeah thanks man
> 
> 
> 
> 
> 
> 
> 
> I don't have a wife so that's why I can do what I can do


Hey I have a wife and I can do what I wanna do.


----------



## rdr09

Quote:


> Originally Posted by *eternal7trance*
> 
> Hey I have a wife and I can do what I wanna do.


what? laundry, dishes . . .?


----------



## Sgt Bilko

Quote:


> Originally Posted by *eternal7trance*
> 
> Hey I have a wife and I can do what I wanna do.


Pretty sure your avatar spells it all out


----------



## Samurai Batgirl

Quote:


> Originally Posted by *kizwan*
> 
> Need more info:-
> 
> Crashing & BSOD while doing...?
> Overclock...?
> Temps for both core & VRM1 when it occurs?
> BSOD bug code(s)?
> CPU overclock? Stable?
> Always crashing & BSOD since day one or just recently?


1. When my monitor sleeps and sometimes when watching stuff that uses Flash.
2. No. I know I'm on OCN, but I'm not nerdy or confident enough to overclock...
3. High 40's to mid 50's. Depends on how much I have going on... and me needing to clean out my tower...
4. Uh... I'm pretty sure it's 0x0000003
5. Again, no OC.
6. Since the beginning, though it feels like it's become more frequent as of late.


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Pretty sure your avatar spells it all out


lol. he forgot to take out the trash.
Quote:


> Originally Posted by *Samurai Batgirl*
> 
> 1. When my monitor sleeps and sometimes when watching stuff that uses Flash.
> 2. No. I know I'm on OCN, but I'm not nerdy or confident enough to overclock...
> 
> 
> 
> 
> 
> 
> 
> 
> 3. High 40's to mid 50's. Depends on how much I have going on...and me needing to clean out my tower...
> 
> 
> 
> 
> 
> 
> 
> 
> 4. Uh... I'm pretty sure it's 0x0000003
> 5. Again, no OC.
> 6. Since the beginning, though it feels like it's become more frequent as of late.


You might have Hardware Acceleration enabled in Flash, e.g. on YouTube videos. Right-click the video > Settings > disable Hardware Acceleration.

if you are using Firefox . . . give it up.


----------



## Caos

Hi, I have an R9 290 Vapor-X with the GPU overclocked to 1100 and memory at 1400 (stock), and the power limit at 30%. Should I set it to 50%? Playing BF4, my maximum temperatures are 60°C core, 56°C VRM1 and 53°C VRM2.


----------



## Samurai Batgirl

Quote:


> Originally Posted by *rdr09*
> 
> you might have Hardware Acceleration enabled in Flash like You Tube videos. Rightclick the video>settings>disable Hardware Acceleration.
> 
> if you are using Firefox . . . give it up.


I actually use HTML5 on YouTube.

I did deactivate hardware acceleration, though. Let's see if it helps.

I can't give up Firefox. Last time I used Chrome it crashed so much I literally couldn't use it. It's made me not ever want to go back. And Waterfox is the tits! D:


----------



## sinnedone

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> As Kokin said, keep away from CCC. It will likely cause more problems than give you results.
> 
> Afterburner/Trixx/GPU Tweak are the most common OC utility choices, I would stick to them.
> 
> I don't believe any 290(X) is voltage locked, but I am not 100% certain; I haven't seen anyone mention it. It wouldn't really make sense since these are the enthusiast class cards.
> 
> This is the thread you want to pay attention to for 290(X) information. There can be a lot to wade through, but if you take your time there is plenty of information.
> 
> Feel free to ask away in the thread, or send a PM, but don't be discouraged if you don't get a response as this thread moves extremely quickly at times and your post can get buried; Re-post if need be
> 
> *The R9 290(X) "Need to Know"*
> 
> *Anyone please feel free to chime in and correct me if I am wrong in my following statements*
> 
> _*These cards make a lot of heat. Have great airflow, a great heatsink, or plan on watercooling them.*_
> Card will continue to run normally up to 94C, _*they throttle at 95C*_ reducing clocks/voltages.
> 
> 
> *Keep an eye on VRM1 temps, <90C is where you want to be for VRM1.*
> VRM1 controls core voltage and is the *long vertical strip* closest to the PCIe power connectors.
> 
> *VRM Locations*
> 
> 
> 
> *VRM2 controls RAM* voltage and is nearest the video output connections.
> You won't likely need to be concerned with keeping them cool since they don't run as hot.
> 
> 
> *What should you Overclock with?*
> Stay away from Catalyst Control Center aka OverDrive
> Afterburner
> Trixx
> GPU Tweak
> 
> 
> 
> *What should you benchmark with? *
> 3DMark
> Unigine
> Catzilla
> Tomb Raider
> Bioshock
> FFXIV (my favorite hands down "It's Stable" test)
> etc
> 
> 
> *FurMark is a poor choice*
> It gives unrealistic power draw.
> It is also a poor representation of what happens when playing a game;
> I don't know many games that involve continuous repetition of a swaying furry doughnut.
> 
> 
> 
> *Your benchmark score went down after OCing? *
> The card is probably _*throttling*_ due to temperature.
> Make sure you *Force Constant Voltage* in your OC utility
> Make sure you *Disable ULPS* via registry or OC Utility; especially with multiple GPUs!
> 
> 
> 
> *With high OC/voltage (1) card can pull 350W at full load. *
> Max voltage in AB or Trixx will put you around 1.4V core; unsure for GPU Tweak.
> It's "safe" if you can keep core and VRM1 cool.
> 
> *Latts O' Watts!*
> 
> 
> 
> 
> *Core clock is king. * *Overclock core 1st, worry about memory 2nd. The 512-bit bus makes up for the 5GHz default memory speeds.*
> If you experience artifacting/tearing/flashing then odds are you have either A) pushed core clock too high, or B) need to add more voltage. If you can't add more voltage, bring the core clock down a little.
> If you experience black screen, BSOD, or other lockups then odds are you have pushed the RAM clock too high.
> MSI Afterburner AUX voltage can sometimes assist in RAM clocks.
> Make sure you *Force Constant Voltage* in your OC utility
> Make sure you *Disable ULPS* via registry or OC Utility; especially with multiple GPUs!
> 
> 
> *RAM Speed Comparisons I found*
> 
> 
> 
> 
> *Elpida or Hynix, which is better?*
> _*Most*_ of the time you have a better _*chance*_ to get a good OC out of Hynix memory.
> This isn't always the case, as there are several users who have _*reported excellent Elpida RAM speeds*_.
> But, remember, _*RAM speeds are 2nd fiddle*_ to core so I wouldn't fret over this all that much.
> 
> 
> *Don't be so worried about ASIC quality. *
> Test the card to see how well it does.
> 
> 
> *Above average cards can get to ~1100 core clock without needing additional voltage. *
> This is of course dependent upon:
> silicon lottery
> specific GPU
> cooling
> voltage regulation
> and the varying card/BIOS default voltage settings.
> 
> 
> 
> *Want stable drivers?*
> 13.12 was the bees knees up until 4/25/14 when 14.4 WHQL released.
> 14.4 seems OK so far, but if you have issues *13.12 is ROCK SOLID.*
> *Make sure you remove ANY old drivers properly before installing new ones:* http://www.guru3d.com/files_get/display_driver_uninstaller_download,9.html
> 
> 
> *How big of a PSU do you need? *
> 
> This is dependent upon:
> CPU
> # of GPUs
> what clocks/voltages you will run them at.
> 
> Reference R9 290 @ stock settings is in the ~180W range
> 290(X) with high overclock/voltage can be in ~350W range
> If you're using an AMD FX 8-core chip with a high overclock, then be mindful of their power hungry tendencies; especially when OCd
> 
> *From my personal experience my HTPC flawlessly runs:*
> 450W Rosewill Capstone
> 290 @ stock
> 3770k @ stock
> 1 SSD, 2 HDD and 1 ODD
> 2 AIO pumps & 3 fans presently
> A 3770k at stock is easily under 100W
> The 290 is only drawing ~160W with an AIO on it
> After you factor in AIO water pumps, fans, SSD, HDD and ODD...this system is still easily under the 450W mark.
> 
> 
> *My main desktop flawlessly runs:*
> 650W Rosewill Capstone
> (1) 290 up to 1300/1700 @ 5760*1080 benching
> 1200/1500 @ 5760*1080 everyday gaming clocks
> 
> 4770k @ 4.5GHz
> 16GB RAM 2400MHz
> 3 SSDs & 1 HDD
> 6 fans & 1 water pump
> 
> 
> *Which waterblock/backplate is best?*
> 
> Pretty much all waterblocks will cool the GPU core the same
> The core isn't the problem child of the 290(X) though, *VRM1* is.
> 
> 
> *Best overall waterblock performance when using exactly what is given to you in the box goes to Aquacomputer.*
> Whether you choose the passive/active backplate, you will get the best VRM1 temps here.
> 
> 
> 
> 
> Spoiler: Active and Passive Backplates
> 
> *The other manufacturers (EK, XSPC, Koolance) seem to have slacked in the VRM1 area, especially when you start pushing these cards...but there is an easy fix!*
> Replacing the thermal pads with Fujipoly Ultra Extreme will slash VRM temps.
> _*See my thread here:*_ http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures
> Make sure you check pad thickness for your specific block
> the $20 15x100mm strip is enough to do 2-3 cards
> 
> 
> 
> *Anyone please feel free to chime in and correct me if I am wrong in my preceding statements*


Thank you for the very informative post.

I say add that to the first post.


----------



## diggiddi

Quote:


> Originally Posted by *Samurai Batgirl*
> 
> 1. When my monitor sleeps and sometimes when watching stuff that uses Flash.
> 2. No. I know I'm on OCN, but I'm not nerdy or confident enough to overclock...
> 
> 
> 
> 
> 
> 
> 
> 
> 3. High 40's to mid 50's. Depends on how much I have going on...and me needing to clean out my tower...
> 
> 
> 
> 
> 
> 
> 
> 
> 4. Uh... I'm pretty sure it's 0x0000003
> 5. Again, no OC.
> 6. Since the beginning, though it feels like it's become more frequent as of late.


Download WhoCrashed and run it.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yeah thanks man
> 
> 
> 
> 
> 
> 
> 
> I don't have a wife so that's why I can do what I can do
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *eternal7trance*
> 
> Hey I have a wife and I can do what I wanna do.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> what? laundry, dishes . . .?
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Pretty sure your avatar spells it all out
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
Click to expand...

*LOOOOOOOOOOOOOOOL*


----------



## Samurai Batgirl

Quote:


> Originally Posted by *diggiddi*
> 
> Download whocrashed and run it


It pointed me to drivers.

Bug check codes were 0x3B and 0x1000007E. Do I need to give the parameters for these as well?

Gonna roll back my drivers to like 13.9, if I can find the thing.
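For reference (not from WhoCrashed itself, just the standard Microsoft names for those codes): both of these mean an exception fired in kernel/driver code, which is consistent with WhoCrashed pointing at drivers. A tiny lookup sketch:

```python
# Standard Microsoft bug check names for codes that commonly show up with
# display driver crashes. 0x1000007E is the parameterized variant of 0x7E.
BUGCHECKS = {
    0x3B: "SYSTEM_SERVICE_EXCEPTION",
    0x7E: "SYSTEM_THREAD_EXCEPTION_NOT_HANDLED",
    0x1000007E: "SYSTEM_THREAD_EXCEPTION_NOT_HANDLED (with parameters)",
    0x116: "VIDEO_TDR_FAILURE",
}

def describe(code):
    """Map a bug check code to its name, or flag it as unknown."""
    return BUGCHECKS.get(code, f"unknown bug check 0x{code:X}")

for code in (0x3B, 0x1000007E):
    print(f"0x{code:X}: {describe(code)}")
```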


----------



## Mega Man

Quote:


> Originally Posted by *eternal7trance*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HOMECINEMA-PC*
> 
> Yeah thanks man
> 
> 
> 
> 
> 
> 
> 
> I don't have a wife so that's why I can do what I can do
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hey I have a wife and I can do what I wanna do.
Click to expand...

The key is to let them have a more expensive hobby. My wife's: purses and shoes. Mine: PCs, console gaming (I collect all systems, currently at 32+, none duplicated), guns, and snowboarding (she joins me on the last two, and ironically I spend more than her, so I win ~ but I just make sure I can pay all my bills).


----------



## diggiddi

Quote:


> Originally Posted by *Samurai Batgirl*
> 
> It pointed me to drivers.
> 
> Bug check codes were 0x3B and 0x1000007E. Do I need to give the parameters for these as well?
> 
> Gonna roll back my drivers to like 13.9, if I can find the thing.


What drivers are you running right now? Use DDU to uninstall the old drivers in safe mode, then try 14.4 WHQL or roll back to the last stable driver.


----------



## Aussiejuggalo

Question, does having a 290 in a PCIe v2 slot instead of v3 make any performance difference whatsoever?

I can't remember

Edit: also, how well do you guys think Watch Dogs is gonna run on AMD cards with all the Nvidia tech in it?


----------



## Samurai Batgirl

I was already using 14.4. :/

Apparently this whole thing has to do with atikmdag.sys.

Side note: pretty sure I'm never using another AMD card after this. It's infuriating that I have to go fix a driver that should be fine when it comes out so I can use the $400 card.


----------



## DeadlyDNA

Yep, the R9 series is freaking insane, guys. I am now testing 49-megapixel gaming on my R9 290's. While there aren't any benchmarks yet, I'm doing a demo thread on this. Check it out!

Keep in mind it's a demo; that said, you will be impressed when I start posting numbers on what works.
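Not part of the demo thread, just the arithmetic behind the pixel counts being thrown around in this thread (megapixels is simply width x height / 1e6; the resolutions are ones mentioned here):

```python
# Pixel-count arithmetic for the resolutions mentioned in this thread.
def megapixels(width, height, monitors=1):
    """Total pixels across identical monitors, in millions."""
    return width * height * monitors / 1e6

setups = {
    "single 1080p": (1920, 1080, 1),
    "5760x1080 Eyefinity": (5760, 1080, 1),
    "single 4K UHD": (3840, 2160, 1),
    "two 4K UHD side by side": (3840, 2160, 2),
    "11520x2160 (triple 4K)": (11520, 2160, 1),
}
for name, (w, h, n) in setups.items():
    print(f"{name}: {megapixels(w, h, n):.1f} MP")
```

Even triple 4K works out to ~25 MP per frame, so anything near 49 MP is roughly double that again.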


----------



## Forceman

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Question, does having a 290 in a PCIe v2 instead of v3 make any performance difference what so ever?
> 
> I cant remember


No, no appreciable difference for a single card.
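To put a rough number on why: here's the theoretical link bandwidth from the spec values (per-lane rates after encoding overhead, 8b/10b for 2.0 and 128b/130b for 3.0). Paper bandwidth only, not a benchmark:

```python
# Theoretical one-direction PCIe link bandwidth in GB/s.
# Gen 2: 5 GT/s per lane, 8b/10b encoding    -> 0.5 GB/s per lane
# Gen 3: 8 GT/s per lane, 128b/130b encoding -> ~0.985 GB/s per lane
def pcie_bandwidth_gbs(gen, lanes):
    per_lane = {2: 5.0 * (8 / 10) / 8, 3: 8.0 * (128 / 130) / 8}
    return per_lane[gen] * lanes

for gen in (2, 3):
    print(f"PCIe {gen}.0 x16: {pcie_bandwidth_gbs(gen, 16):.2f} GB/s")
```

So 3.0 x16 has roughly double the paper bandwidth (~15.75 vs 8 GB/s), but a single 290 doesn't come close to saturating 2.0 x16 in games, hence no measurable difference.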


----------



## kizwan

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Yep, R9 series is freaking insane guys. I am now testing 49 Megapixel gaming on my R9 290's. While there isn't any benchmarks yet im doing a Demo thread on this. Check it out!
> 
> Keep in mind it's a demo, that said you will be impressed when i start posting numbers on what works.


With 3820?! Nice!

I'll check it out tomorrow morning.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Forceman*
> 
> No, no appreciable difference for a single card.


Ah cool, thanks

I keep forgetting what's marketing BS and what's actually real


----------



## HOMECINEMA-PC

After 1 hr of Ghosts at ultra settings at 1440p. Stock clocks, though CPU at [email protected] 37c full load temps and 27c to 32c vrm temps. Ambient 23c, no a/c either


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> After 1 hr of ghosts ultra settings at 1440p . Stock clocks though CPU at [email protected] 37c full load temps and 27c to 32c vrm temps . Ambient 23c No a/c either
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


What's the temp up there?

I'm at 15c where I'm sitting atm and loving it


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Whats the temp up there?
> 
> I'm at 15c where i'm sitting atm and loving it


12c outside, 19c in. Strong storms rolled thru yesterday, hailing the end of summer. So today, first taste of winter with a 20c top and very windy..... perfect for benching.


----------



## Aussiejuggalo

You guys are lucky, I'm still sitting on 51° idle and it's only 21c inside... with fans going!


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 12c outside 19c in . Strong storms rolled thru yesterday hailing the end of summer . So today first taste of winter with 20c top and very windy ..... Perfect for benching .


We had our first proper taste of Winter yesterday, was 7c when i got to work.
Quote:


> Originally Posted by *Aussiejuggalo*
> 
> You guys are lucky, I'm still sitting on 51° idle
> 
> 
> 
> 
> 
> 
> 
> and its only 21c inside... with fans going!


I'm at 51c under load from my top card with fan speed on auto


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm at 51c under load from my top card with fan speed on auto


You mother...

I really don't have any airflow through my case seeing as it's all just rads

need to get this thing underwater


----------



## HOMECINEMA-PC

CPU is idle between 29c and 33c . Cards are idling at 28c


----------



## kpoeticg

Yeah, I'm in a similar situation. My reference cards heat up ridiculously quickly in my test/open-air setup. I've had my AC blocks for them for a while, just gotta add 'em and plan out the loop. Luckily Newegg took my random-black-screen Elpida back and sent me a better Elpida. Still hardly takes anything at all to cause black screens in CFX with my Hynix card, though.

Can't wait to see how these guys perform with proper cooling.


----------



## cplifj

Mm yes, those temps are high. Are you sure the GPU load stays at 0 when there are no pixels moving on screen?

I had the fabulous luck of finding, yet again, something taxing my GPU in the background while staying hidden itself; it kept my temps high too.

Every now and then some unscrupulous little criminal **** succeeds in hacking my PC for their profit, it seems. It has become an internet of hacking trolls.

Well, a quick cleaning does the trick; one becomes trained in it after a while.

I get 41°c idle in a 24°c room on a reference 290x from Asus.


----------



## Rozayz

Quote:


> Originally Posted by *Rozayz*
> 
> So I upgraded (or downgraded, some might say) my GTX 780 Ti with an R9 290X in preparation for x2 4k res monitors 4 days ago.
> 
> I purchased a reference ASUS card and am experiencing shut down issues. Basically the card gets hot and shuts down, however they're designed to run at ~ 95c or thereabouts.
> 
> My question was whether anyone knows if the VRM on my MSI Z77 MPOWER could be affecting this and causing a thermal shutdown?
> 
> I have forced fan speed % to ~ 60% using MSI Afterburner and have found that leaving the fan speed % set to 'Auto' causes a system shutdown in about 5 minutes or less of playing any video game. (waterblock could not come sooner at this stage ;_; finalizing WC loop and getting that badboy installed asap!) ~
> 
> Rig is as follows:
> 
> Processor: Intel Core i7 3770K ‒ 5GHz
> Video Card: ASUS Radeon R9 290X 4GB
> Memory/RAM: 2x8GB G.Skill Trident X ‒ 2400 MHz
> Motherboard: MSI Z77 MPOWER
> Power Supply: Seasonic X-1050 ‒ 80+ GOLD
> Case: Corsair Obsidian 900D
> CPU Cooler: Corsair H100i w/ aftermarket SP120's
> Storage: Western Digital Caviar Black ‒ 1TB ‒ x2
> SSD: Intel 520 Series ‒ 180GB ‒ x2
> Display Samsung U28D590D 28inch 4K UHD ‒ x2
> Keyboard: White Keycool 84 Cherry MX Red
> Mouse: Corsair Vengeance M65
> Headset: Sennheiser HD598
> Microphone: Audio Technica AT2035
> 
> Bullet points added from incorrectly located thread:
> 
> > http://i.imgur.com/c8Lejpt.png - GPU Temp @ 47 °C ~
> 
> > Nvidia drivers are completely removed from my PC. I actually reformatted/repaired my OS partition for an unrelated reason but yeah, Nvidia stuff is 100% all gone, onboard VGA is disabled.
> 
> > Running the R9 290X on two separate rails of my Seasonic X-1050.
> 
> > Seriously struggling to think of why I'm still getting shut downs other than the fact that my motherboard's VRM's are potentially /worn out; although the board is "specced to overclock" as mentioned in many reviews/by MSI themselves. (they actually do 24 hour burn in's on all their MPOWER series at 4.6 GHz) ~
> 
> > I had zero issues with the 780 Ti and used it for well over 3 months clocked quite high.
> 
> ---
> 
> > Just tried playing BF4, as I joined the game and it "tabbed" into the game after the "Loading Level" message had initialized etc, PC shut down during the transition to Fullscreen mode. Same thing happened during Unigine Heaven benchmark when changing from 1600x900 Windowed to 1920x1080 Fullscreen. Thinking about it, the issue has only existed when alt-tabbing to and from Fullscreen games/apps.
> 
> > These are [obviously] during idle, but I can 150% confirm that the shutdown only occurs when tabbing to/from Fullscreen games/apps. http://i.imgur.com/eHGszNv.png - are those Voltages about right? Anyone able to spot anything out of the norm? oO
> 
> > R9 290X reached a max of 69 °C after 11 minutes of Unigine Heaven and system shutdown instantly when I alt tabbed from Fullscreen to screen capture HWMonitor to post here, hah.
> 
> > I have completely reset, updated BIOS. Reseated CPU, reapplied thermal paste, tried old 780 Ti, 660 Ti (both worked flawlessly, even OC'd)...
> 
> 1. plugged computer singularly into isolated circuit, still shut down.
> 2. someone mentioned something, somewhere, about Nvidia, Intel, AMD and Realtek potentially conflicting audio drivers.. how should I go about ensuring this isn't the problem?
> 3. could an extremely large gas heater on the wall opposite my power point be the cause of the shut down? I figured since the wall gets quite warm, this could cause a shut down.
> 
> Any help is appreciated.






Continuing from this...

Would prolonged use of an intermittent power switch (both front panel and m/b power buttons) cause issues? I have been shorting my PSU using the plug for the last ~ 3-5 weeks. (terrible, I know. I'm buying a new m/b as soon as Haswell refresh is stocked where I work!)
Bit of a continuation from this...

So I replaced the SSD, reformatted one of my mechanical drives, and the problem was solved.

Fast forward to today: the problem is back. The minute I load into any fullscreen game (Furmark, Prime95, IBT and Memtest all work 100% fine), my PC shuts down.

I have come to the conclusion that the issue lies with my motherboard, as the PSU seems to be fine.

I will note here that both my PC's front panel power button and the m/b power button have been intermittent or not working for quite some time now; as such, I have been powering on my computer by shorting the PSU, wriggling the power cable and powering the system on. (******ed, I know - but I haven't had the money to replace my m/b/CPU... this is my only option due to my Z77 being obsolete now.)

Also, I don't think I'm hitting any OCP limits on any of the rails of my X-1050 (yes, it is multi-rail, not single rail ;_; Google it if you did not know this!)... the only other thing I have read about which I haven't been able to test yet is my power cord / surge protector.

ANYWAY. Just for clarification/confirmation.. is it possible that powering on my PC this way could have caused this problem? Is it likely, as I suspect, that I have buggered my m/b, or is it likely my PSU?

:c


----------



## Mega Man

Quote:


> Originally Posted by *Rozayz*
> 
> ANYWAY. Just for clarification/confirmation.. is it possible that powering on my PC this way could have caused this problem? Is it likely, as I suspect, that I have buggered my m/b, or is it likely my PSU?

Why not take the surge protector out of the equation? If that is the cause: the more they trip, the weaker they get, and the more easily they'll trip again.


----------



## Rozayz

Quote:


> Originally Posted by *Mega Man*
> 
> Why not take the surge protector out of the equation? If that is the cause: the more they trip, the weaker they get, and the more easily they'll trip again.


Tried this right after I posted. Still shutting down. :\


----------



## NEK4TE

Hey guys, sorry to bother you!

By any chance, could any expert please check my thread about overclocking (for the first time ever) my Gigabyte R9 290 Windforce card?

Big thanks!

Thread is @ http://www.overclock.net/t/1487194/gigabyte-radeon-r9-290-4gb-gddr5-gv-r929oc-4gd-first-time-overclocking


----------



## kpoeticg

Quote:



> Originally Posted by *Rozayz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> Why not take the surge protector out of the equation? If that is the cause: the more they trip, the weaker they get, and the more easily they'll trip again.
> 
> 
> 
> Tried this right after I posted. Still shutting down. :\

Is your PC shutting down, or are you getting black screens, or blue screens?

If it's black screens, it probably has nothing to do with your motherboard or PSU. If it's shutting down, and both your mobo's power button and power header have stopped working, I'd start my troubleshooting there: either by RMA, or by looking for damage near the header or any of your ATX contacts, or a loose wire on your ATX or EPS connector.

Edit: I feel your pain, man. Seems I'm doing nothing but troubleshooting and RMA'ing lately. It's real disappointing when you try a recommended solution, it works just long enough to convince you, then the problem comes back.

Edit x2: If you have friends or family with compatible hardware, it'll save you many hours of troubleshooting. I bought (and returned) a new PSU and an R5 230 (or R3 250?) from Best Buy, and a new (used) motherboard from the marketplace, between troubleshooting problems with my first 290X and my mobo in the past couple months.


----------



## Rozayz

Quote:


> Originally Posted by *kpoeticg*
> 
> Is your PC shutting down, or are you getting black screens, or blue screens?
> 
> If it's black screens, it probably has nothing to do with your motherboard or PSU. If it's shutting down, and both your mobo's power button and power header have stopped working, I'd start my troubleshooting there: either by RMA, or by looking for damage near the header or any of your ATX contacts, or a loose wire on your ATX or EPS connector.
> 
> Edit: I feel your pain, man. Seems I'm doing nothing but troubleshooting and RMA'ing lately. It's real disappointing when you try a recommended solution, it works just long enough to convince you, then the problem comes back.


Yep, yep and yep. 100% shuts down.

I work in I.T. sales/tech/assembly/troubleshooting too... and this has me completely stumped, as powering the PC on the way I have been _shouldn't_ cause issues, but obviously there _is_ still an issue. I contacted MSI today as a last resort to see if I can RA the board... unlikely though, since I acquired the product from a reseller and the product is obsolete (i.e. MSI can't give me 'store credit' or make a special Z77 MPOWER specifically for me lol...) ~

The fact that the system boots and runs perfectly fine under stress tests says it's likely an m/b issue or a PSU OCP/power-in-general issue. There are almost certainly no loose connections anywhere in the system.


----------



## heroxoot

Quote:


> Originally Posted by *Rozayz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kpoeticg*
> 
> Is your PC shutting down, or are you getting black screens, or blue screens?
> 
> If it's black screens, it probably has nothing to do with your motherboard or PSU. If it's shutting down, and both your mobo's power button and power header have stopped working, I'd start my troubleshooting there: either by RMA, or by looking for damage near the header or any of your ATX contacts, or a loose wire on your ATX or EPS connector.
> 
> Edit: I feel your pain, man. Seems I'm doing nothing but troubleshooting and RMA'ing lately. It's real disappointing when you try a recommended solution, it works just long enough to convince you, then the problem comes back.
> 
> 
> 
> Yep, yep and yep. 100% shuts down.
> 
> I work in I.T. sales/tech/assembly/troubleshooting too... and this has me completely stumped, as powering the PC on the way I have been _shouldn't_ cause issues, but obviously there _is_ still an issue. I contacted MSI today as a last resort to see if I can RA the board... unlikely though, since I acquired the product from a reseller and the product is obsolete (i.e. MSI can't give me 'store credit' or make a special Z77 MPOWER specifically for me lol...) ~
> 
> The fact that the system boots and runs perfectly fine under stress tests says it's likely an m/b issue or a PSU OCP/power-in-general issue. There are almost certainly no loose connections anywhere in the system.

MSI upgrades you when your warranty is valid but your product is out of production. They gave me the 990FXA-GD80 for my 890FXA-GD70, and an R9 290X when they couldn't replace my 7970 Lightning just recently. You just have to talk to them about the issue and get someone who knows how to ask the RMA department for a replacement of equal quality and options. My 890 board had 5 PCIe slots, for example, so they gave me one that could do the same or similar. Everything else on the board was a bonus, like the USB 3.0 header.


----------



## Rozayz

Quote:


> Originally Posted by *heroxoot*
> 
> MSI upgrades you when your warranty is valid but your product is out of production. They gave me the 990FXA-GD80 for my 890FXA-GD70, and an R9 290X when they couldn't replace my 7970 Lightning just recently. You just have to talk to them about the issue and get someone who knows how to ask the RMA department for a replacement of equal quality and options. My 890 board had 5 PCIe slots, for example, so they gave me one that could do the same or similar. Everything else on the board was a bonus, like the USB 3.0 header.


Will look into this. Cheers mate.


----------



## kpoeticg

If you bought it BNIB from a legit reseller, your MSI warranty should be intact. It's not like you're actually RMA'ing the mobo "to MSI"; it goes to a repair center that they farm out to. If it's a mobo component that's burned out, they should be able to locate and fix it. If they don't have a compatible component, then they indeed owe you an equal motherboard.

^^Obviously that's assuming the mobo's the issue (which it sounds like it is). A new PSU is the easiest thing to try. If you work in IT you should have access to something like a 750 W unit you can test with your system. If not, there's gotta be an Aussie version of Micro Center or Best Buy you can buy a PSU from with a 15-30 day return policy.


----------



## heroxoot

Quote:


> Originally Posted by *kpoeticg*
> 
> If you bought it BNIB from a legit reseller, your MSI warranty should be intact. It's not like you're actually RMA'ing the mobo "to MSI"; it goes to a repair center that they farm out to. If it's a mobo component that's burned out, they should be able to locate and fix it. If they don't have a compatible component, then they indeed owe you an equal motherboard.
> 
> ^^Obviously that's assuming the mobo's the issue (which it sounds like it is). A new PSU is the easiest thing to try. If you work in IT you should have access to something like a 750 W unit you can test with your system. If not, there's gotta be an Aussie version of Micro Center or Best Buy you can buy a PSU from with a 15-30 day return policy.


Yea, MSI has an off-site RMA dept that judges it, and I think other companies might use it too. I swear when I sent RAM to G.Skill it went to the same general area.


----------



## kpoeticg

I don't know of any hardware company that doesn't have RMAs shipped to 3rd-party repair centers. It's pretty standard, and probably why manufacturer RMAs take so freaking long.


----------



## heroxoot

Sapphire might not, because I've heard of Sapphire being able to send RMA parts like GPU coolers out to people. Most other companies don't do this, and I think it's mainly because they have no access to the parts. Maybe I'm wrong, but it's fishy.


----------



## kpoeticg

It's not so much fishy as that people just don't realize there's like a 95% chance the product you bought wasn't "manufactured" by who you think. Most "manufacturers" are really the people that design, or add an idea to, a product.

Sapphire may be an in-house company since they don't have a million products on the market. But companies like MSI and ASUS that make motherboards, GPUs, laptops, phones, tablets, etc. don't actually make the product; they have OEMs they use.

Same with PSUs. Corsair, Cooler Master, and EVGA don't "make" PSUs; Seasonic, Super Flower, and other cheaper companies OEM them with a new label.

Once businesses grow to a certain point, it becomes more affordable and logical to have factories assemble your products inside the country you're selling to, or wherever there's cheaper labor.


----------



## Rozayz

Quote:


> Originally Posted by *kpoeticg*
> 
> I don't know of any hardware company that doesn't have RMAs shipped to 3rd-party repair centers. It's pretty standard, and probably why manufacturer RMAs take so freaking long.


This is why I contacted them as a last resort. I don't have time to wait 30+ days for an RMA; I know first hand that it will take that long lol. We have customers that complain about this type of thing all the time, despite our having an on-site warranty and returns department (we rarely have to send RAs back to manufacturers, but there's always a handful that make it to that stage).

I've pretty much made up my mind about grabbing a Z97 board and purchasing a 4770K for the interim.. whatever MSI are able to do with the board is just added $$ for when it's back with me, I suppose.

As for the PSU/testing... not much point. I'm fairly sure the PSU works fine, and the OCP on X-1050s is VERY rarely reached due to the multi-rail design and disgustingly high amps. There's no way I'm hitting OCP running no OC and a reference, stock R9 290X; it's just not possible (under normal circumstances!). If push comes to shove, I'll take my rig into work and have it benched during the day, but we've been extremely busy at work lately so it's probably a job for ~2 weeks from now (sucks!).


----------



## heroxoot

For motherboards it took roughly 15 days before they shipped back to me, so about 25 days total (5-day ground > 15-day RMA > 5-day return ground). As for the GPU, well, that was almost 3 months for me because the original replacement was also defective, but it resulted in this 290X. MSI RMA is very good in my experience, but all RMAs take too long too often.


----------



## kpoeticg

It's never not worth it to test a component before a manufacturer RMA. If you work in IT you should know that. It could be something like an I/O shield shorting against the chassis. With the system shutting down like it is, the PSU is definitely a possible suspect. It's worth trying a new PSU if you have a store near you with a good return policy.

I mean, if you were willing to try a new SSD and format your HDD, then you obviously know enough that with PCs it can be ANYTHING causing the problem.

Just because you're not hitting OCP on your PSU doesn't mean something hasn't blown inside it. Quality PSUs have A LOT of protection mechanisms inside that can blow before your components do.

Also, this might be obvious, but with a multi-rail PSU, the "1050" isn't as important when troubleshooting as how many amps each rail can handle.
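To illustrate the per-rail point: on a multi-rail unit, what you can actually draw is limited by the amps on each 12 V rail, not the total wattage on the box. A minimal sketch; the 40 A rail figure below is illustrative, not the X-1050's actual rail spec, so check the PSU label for the real numbers.

```python
# Spare capacity on one 12 V rail for a given DC load on that rail.
# rail_amps is an assumption taken from the PSU's label, not a real spec.

def rail_headroom(load_watts, rail_amps, volts=12.0):
    """Return spare amps on one rail after subtracting the load's draw."""
    return rail_amps - load_watts / volts

# e.g. a ~300 W GPU split evenly across two hypothetical 40 A rails:
per_rail = rail_headroom(150, 40)  # 150 W is 12.5 A, leaving 27.5 A spare
```

If a rail's headroom approaches zero under a transient spike, OCP can trip even though the total system draw is far below the unit's rating.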


----------



## Rozayz

Quote:


> Originally Posted by *kpoeticg*
> 
> It's not so much fishy as that people just don't realize there's like a 95% chance the product you bought wasn't "manufactured" by who you think. Most "manufacturers" are really the people that design, or add an idea to, a product.
> 
> Sapphire may be an in-house company since they don't have a million products on the market. But companies like MSI and ASUS that make motherboards, GPUs, laptops, phones, tablets, etc. don't actually make the product; they have OEMs they use.
> 
> Same with PSUs. Corsair, Cooler Master, and EVGA don't "make" PSUs; Seasonic, Super Flower, and other cheaper companies OEM them with a new label.
> 
> Once businesses grow to a certain point, it becomes more affordable and logical to have factories assemble your products inside the country you're selling to, or wherever there's cheaper labor.


MSI are an OEM company lol. I've actually been in a conference call with their marketing manager at their HQ in Taipei, where he showed us around the building as part of an induction we had with MSI Australia. They manufacture for HP, Medion, Lenovo/IBM, eMachines, Packard Bell, Gateway, Advent, PC World and Acer off the top of my head.. but yeah, what you said is otherwise correct. Why buy Seasonic when you can buy Corsair for $50 less, right? (knowing that Corsair = rebranded Seasonic)..

The term OEM (Original Equipment Manufacturer) is used so freely these days. "Try using the OEM cables that came with your Seasonic power supply." Seasonic is not the OEM for those cables; they just sleeve, crimp and package them. Ha!


----------



## Rozayz

Quote:


> Originally Posted by *kpoeticg*
> 
> It's never not worth it to test a component before a manufacturer RMA. If you work in IT you should know that. It could be something like an I/O shield shorting against the chassis. With the system shutting down like it is, the PSU is definitely a possible suspect. It's worth trying a new PSU if you have a store near you with a good return policy.
> 
> I mean, if you were willing to try a new SSD and format your HDD, then you obviously know enough that with PCs it can be ANYTHING causing the problem.
> 
> Just because you're not hitting OCP on your PSU doesn't mean something hasn't blown inside it. Quality PSUs have A LOT of protection mechanisms inside that can blow before your components do.
> 
> Also, this might be obvious, but with a multi-rail PSU, the "1050" isn't as important when troubleshooting as how many amps each rail can handle.


SSD had it's plastic tab snapped and part of the Sata data pin had loosened. I needed new ones anyway. I do know what you mean, but with an intermittent power button both on my front panel and m/b; I'm gonna get that out of the way regardless.

If the issue occurs only during fullscreen gaming, it's very unlikely to be shorting on the case. The fact that a certain power draw (seemingly, at this stage) is required before the issue is reproduced means it's probably an issue with [as you say] the PSU, or my m/b's PCI-E or power phases. That is just my opinion and what I have come to at this stage - the reason I posted here is to double check in case someone thought it might be something else.

I should clarify. The system runs perfectly fine 24/7, just not when gaming. GPU has been benched and like I said, all stress tests are 100% fine with impressive results.. it's just as soon as I start gaming, PC takes a dump on me.


----------



## kpoeticg

Oh, well I guess I was wrong (and surprised) about MSI. I had no idea they were an OEM

But it's more like "Why buy a Corsair for 100 bux MORE when it's a Seasonic PSU"

Quote:


> Originally Posted by *Rozayz*
> 
> SSD had it's plastic tab snapped and part of the Sata data pin had loosened. I needed new ones anyway. I do know what you mean, but with an intermittent power button both on my front panel and m/b; I'm gonna get that out of the way regardless.


It's most likely a mobo issue. I'm just saying that it's good practice to use any local resources you have to MAKE SURE it's the mobo before shipping it out with no return date.

PSUs, even if it's 90% likely it's the mobo, are usually available to try locally. I would think mailing it back to MSI would be more of a PITA than testing a new PSU, that's all.

But MSI DOES have a great rma dept (probly cuz they're in-house apparently), so maybe it's worth the month to get a new mobo...


----------



## Rozayz

All noted. The only other thing that may be causing it is awful circuits in my house. The entire breaker tripped the other night when I was benching my rig and plugged a friend's rig in after assembly to test things.

I have tried isolating the problem with my PC on different circuits throughout the house though and the issue still persists. I'm going to try to get a hold of a VOM and measure the draw of my system to ensure it complies with the power gear I have... otherwise, like I said, Z97 babyyyyyyyyyy. :3


----------



## kizwan

Quote:


> Originally Posted by *Rozayz*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kpoeticg*
> 
> I don't know of any hardware company that doesn't have rma's shipped to 3rd party repair centers. It's pretty standard, and probly why manufacturer rma's take so freakin long.
> 
> 
> 
> 
> 
> 
> 
> This is why I have contacted them as a last resort. I don't have time to wait 30+ days for an RMA. I know first hand that it will take this long lol. We have customers that complain about this type of thing all the time despite having an on-site warranty and returns department (we rarely have to send RAs back to manufacturers, but there's always a handful that make it to that stage).
> 
> I'm pretty mind made up about grabbing a Z97 board and purchasing a 4770k for the interim.. whatever MSI are able to do with the board is just added $$ for when it's back with me I suppose.
> 
> As for the PSU/testing... not much point, I am fairly sure PSU works fine, and the OCP on X-1050's is VERY rarely reached due to multi-railing and disgustingly high amps. There's no way I'm hitting OCP running no OC and a reference + stock R9 290X, it's just not possible (under normal circumstances!). If push comes to shove, I'll take my rig into work and have it benched during the day, but we've been extremely busy at work lately so it's probably a job for ~ 2 weeks from now (sucks!).
Click to expand...

Unless the OCP is malfunctioning...
Quote:


> Originally Posted by *Rozayz*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kpoeticg*
> 
> It's not so much fishy, as much as people just don't realize that there's like a 95% chance that the product you bought wasn't "Manufactured" by who you think. Most manufacturers are really the people that design, or add an idea to a product.
> 
> Sapphire may be an in-house company since they don't have a million products on the market. But companies like MSI & Asus that make motherboards, gpu's, laptops, phones, tablets, etc... don't actually make the product. They have OEM's they use.
> 
> Same with PSU's. Corsair , Coolermaster, and EVGA don't "Make" PSU's. Seasonic, Superflower, and other cheaper companies OEM them with a new label.
> 
> Once businesses grow to a certain point, it becomes more affordable and logical to have factories assemble your products inside the country that your selling to, or where there's cheaper labor.
> 
> 
> 
> 
> 
> 
> 
> MSI are an OEM company lol. I've actually been in a conference call with their marketing manager at their HQ in Taipei where he showed us around the building as part of an induction we had with MSI Australia - they manufacture for HP, Medion, Lenovo/IBM, eMachines, Packard-Bell, Gateway, Advent, PC World and Acer off the top of my head.. but yeah, what you said is otherwise correct. Why buy Seasonic when you can buy Corsair for $50 less, right? (knowing that Corsair = re-branded Seasonic)..
> 
> The term OEM (Original Equipment Manufacture) is used so freely these days. "Try using the OEM cables that came with your Seasonic power supply." - Seasonic is not the OEM for those cables; they just sleeve, crimp, package. ha!
Click to expand...

Using your logic, then Seasonic is not the OEM for their PSUs either. I'm pretty sure at least the capacitors used in their PSUs are made by a different (OEM) company.


----------



## heroxoot

Very few companies make their products 100% on their own. Samsung is one of those few for a lot of products. Their SSDs are all in-house, right down to their amazing tri-core controller.


----------



## Arizonian

Quote:


> Originally Posted by *Roboyto*
> 
> As Kokin said, keep away from CCC. It will likely cause more problems than give you results.
> 
> Afterburner/Trixx/GPU Tweak are the most common OC utility choices, I would stick to them.
> 
> I don't believe any 290(X) is voltage locked, but I am not 100% certain; I haven't seen anyone mention it. It wouldn't really make sense since these are the enthusiast class cards.
> 
> This is the thread you want to pay attention to for 290(X) information. There can be a lot to wade through, but if you take your time there is plenty of information.
> 
> Feel free to ask away in the thread, or send a PM, but don't be discouraged if you don't get a response as this thread moves extremely quickly at times and your post can get buried; Re-post if need be
> 
> *The R9 290(X) "Need to Know"*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *Anyone please feel free to chime in and correct me if I am wrong in my following statements*
> 
> _*These cards make a lot of heat. Have great airflow, a great heatsink, or plan on watercooling them.*_
> Card will continue to run normally up to 94C, _*they throttle at 95C*_ reducing clocks/voltages.
> 
> 
> *Keep an eye on VRM1 temps, <90C is where you want to be for VRM1.*
> VRM1 controls core voltage and is the *long vertical strip* closest to the PCIe power connectors.
> 
> *VRM Locations*
> 
> 
> 
> *VRM2 controls RAM* voltage and is nearest the video output connections.
> You won't likely need to be concerned with keeping them cool since they don't run as hot.
> 
> 
> *What should you Overclock with?*
> Stay away from Catalyst Control Center aka OverDrive
> Afterburner
> Trixx
> GPU Tweak
> 
> 
> 
> *What should you benchmark with? *
> 3DMark
> Unigine
> Catzilla
> Tomb Raider
> Bioshock
> FFXIV (my favorite hands down "It's Stable" test)
> etc
> 
> 
> *FurMark is a poor choice*
> it gives unrealistic power draw.
> It is also a poor representation of what happens when playing a game
> don't know many that involve continuous repetition of a swaying furry doughnut.
> 
> 
> 
> *Your benchmark score went down after OCing? *
> The card is probably _*throttling*_ due to temperature.
> Make sure you *Force Constant Voltage* in your OC utility
> Make sure you *Disable ULPS* via registry or OC Utility; especially with multiple GPUs!
> 
> 
> 
> *With high OC/voltage (1) card can pull 350W at full load. *
> Max voltage in AB or Trixx will put you around 1.4V core; unsure for GPU Tweak.
> It's "safe" if you can keep core and VRM1 cool.
> 
> *Latts O' Watts!*
> 
> 
> 
> 
> *Core clock is king. * *Overclock core 1st, worry about memory 2nd. The 512-bit bus makes up for the 5GHz default memory speeds.*
> If you experience artifacting/tearing/flashing then odds are you have either A) pushed the core clock too high, or B) need to add more voltage. If you can't add more voltage, bring the core clock down a little.
> If you experience black screen, BSOD, or other lockups then odds are you have pushed the RAM clock too high.
> MSI Afterburner's AUX voltage can sometimes assist with RAM clocks.
> Make sure you *Force Constant Voltage* in your OC utility
> Make sure you *Disable ULPS* via registry or OC Utility; especially with multiple GPUs!
> 
> 
> *RAM Speed Comparisons I found*
> 
> 
> 
> 
> *Elpida or Hynix, which is better?*
> _*Most*_ of the time you have a better _*chance*_ to get a good OC out of Hynix memory.
> This isn't always the case, as there are several users who have _*reported excellent Elpida RAM speeds*_.
> But, remember, _*RAM speeds are 2nd fiddle*_ to core so I wouldn't fret over this all that much.
> 
> 
> *Don't be so worried about ASIC quality. *
> Test the card to see how well it does.
> 
> 
> *Above average cards can get to ~1100 core clock without needing additional voltage. *
> This is of course dependent upon:
> silicon lottery
> specific GPU
> cooling
> voltage regulation
> and the varying card/BIOS default voltage settings.
> 
> 
> 
> *Want stable drivers?*
> 13.12 was the bees knees up until 4/25/14 when 14.4 WHQL released.
> 14.4 seems OK so far, but if you have issues *13.12 is ROCK SOLID.*
> *Make sure you remove ANY old drivers properly before installing new ones:* http://www.guru3d.com/files_get/display_driver_uninstaller_download,9.html
> 
> 
> *How big of a PSU do you need? *
> 
> This is dependent upon:
> CPU
> # of GPUs
> what clocks/voltages you will run them at.
> 
> Reference R9 290 @ stock settings is in the ~180W range
> 290(X) with high overclock/voltage can be in ~350W range
> If you're using an AMD FX 8-core chip with a high overclock, then be mindful of their power hungry tendencies; especially when OCd
> 
> *From my personal experience my HTPC flawlessly runs:*
> 450W Rosewill Capstone
> 290 @ stock
> 3770k @ stock
> 1 SSD, 2 HDD and 1 ODD
> 2 AIO pumps & 3 fans presently
> A 3770k at stock is easily under 100W
> The 290 is only drawing ~160W with an AIO on it
> After you factor in AIO water pumps, fans, SSD, HDD and ODD...this system is still easily under the 450W mark.
> 
> 
> *My main desktop flawlessly runs:*
> 650W Rosewill Capstone
> (1) 290 up to 1300/1700 @ 5760*1080 benching
> 1200/1500 @ 5760*1080 everyday gaming clocks
> 
> 4770k @ 4.5GHz
> 16GB RAM 2400MHz
> 3 SSDs & 1 HDD
> 6 fans & 1 water pump
> 
> 
> *Which waterblock/backplate is best?*
> 
> Pretty much all waterblocks will cool the GPU core the same
> The core isn't the problem child of the 290(X) though, *VRM1* is.
> 
> 
> *Best overall waterblock performance when using exactly what is given to you in the box goes to Aquacomputer.*
> Whether you choose the passive/active backplate, you will get the best VRM1 temps here.
> 
> 
> 
> 
> Spoiler: Active and Passive Backplates
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *The other manufacturers (EK, XSPC, Koolance) seem to have slacked in the VRM1 area, especially when you start pushing these cards...but there is an easy fix!*
> Replacing the thermal pads with Fujipoly Ultra Extreme will slash VRM temps.
> _*See my thread here:*_ http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures
> Make sure you check pad thickness for your specific block
> the $20 15x100mm strip is enough to do 2-3 cards
> 
> 
> 
> *Anyone please feel free to chime in and correct me if I am wrong in my preceding statements*


Very nice write up. Added to OP - *USEFUL SOFTWARE & INFO SECTION*
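The PSU sizing advice in that guide boils down to simple arithmetic: add up the component draws and check you still have headroom on the rated wattage. Here's a minimal sketch using the guide's ballpark figures (the drive/fan/pump wattages are rough assumptions, not measurements):

```python
# Rough PSU headroom check using the guide's ballpark figures.
# All wattages are estimates, not measurements.

def psu_headroom(psu_watts, draws, margin=0.20):
    """Return (total_draw, ok) where ok means the PSU still has
    `margin` (20% by default) of its rating left as headroom."""
    total = sum(draws.values())
    return total, total <= psu_watts * (1 - margin)

# Guide's numbers: a 290 with an AIO draws ~160W, a stock 3770k is
# easily under 100W; drive/fan/pump figures are rough guesses.
htpc = {"290 @ stock (AIO)": 160, "3770k @ stock": 100,
        "drives": 20, "pumps+fans": 30}

total, ok = psu_headroom(450, htpc)
print(total, ok)  # 310 True -> the 450W Capstone HTPC example holds up
```

The same check shows why a heavily overvolted 290(X) at ~350W changes the picture so quickly: one card can eat most of a small unit's budget on its own.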


----------



## kizwan

Quote:


> Originally Posted by *kpoeticg*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Oh, well guess i was wrong (and suprised) about MSI. I had no idea they were an OEM
> 
> But it's more like "Why buy a Corsair for 100 bux MORE when it's a Seasonic PSU"
> Quote:
> 
> 
> 
> Originally Posted by *Rozayz*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's most likely a mobo issue. I'm just saying that it's good practice to use any local resources you have to MAKE SURE it's the mobo before shipping it out with no return date.
> 
> PSU's, even it's 90% likely it's the mobo, are usually available to try locally for people. I would think mailing it back to MSI would be more of a PITA than testing a new PSU, that's all.
> 
> But MSI DOES have a great rma dept (probly cuz they're in-house apparently), so maybe it's worth the month to get a new mobo...
Click to expand...

IMHO, it's more likely PSU is faulty/malfunction.


----------



## HOMECINEMA-PC

Winter is here ....

Case ambient 16.5C, room ambient 15C. VRM temps 16C - 18C, vcore 24C, GPU core 22C - 24C ...... Chilly


----------



## Mega Man

Quote:


> Originally Posted by *Rozayz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kpoeticg*
> 
> I don't know of any hardware company that doesn't have rma's shipped to 3rd party repair centers. It's pretty standard, and probly why manufacturer rma's take so freakin long.
> 
> 
> 
> This is why I have contacted them as a last resort. I don't have time to wait 30+ days for an RMA. I know first hand that it will take this long lol. We have customers that complain about this type of thing all the time despite having an on-site warranty and return's department (we rarely have to send RA's back to manufacturers, but there's always a handful that make it to that stage).
> 
> I'm pretty mind made up about grabbing a Z97 board and purchasing a 4770k for the interim.. whatever MSI are able to do with the board is just added $$ for when it's back with me I suppose.
> 
> As for the PSU/testing... not much point, I am fairly sure PSU works fine, and the OCP on X-1050's is VERY rarely reached due to multi-railing and disgustingly high amps. There's no way I'm hitting OCP running no OC and a reference + stock R9 290X, it's just not possible (under normal circumstances!). If push comes to shove, I'll take my rig into work and have it benched during the day, but we've been extremely busy at work lately so it's probably a job for ~ 2 weeks from now (sucks!).
Click to expand...

[ save yourself the upgrade itch and just get a RIVBE +4930k now ]


----------



## BradleyW

Just to add to the 290X guide.

Your benchmark score went down after OCing?
"May indicate instability. Reduce the overclock and retest".


----------



## chiknnwatrmln

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Winter is here ....
> Case ambient 16.5c - Ambient 15c , 16c - 18c VRM temps - vcore 24c - gpu core 22c - 24c ...... Chilly


Exact opposite here, after a few hours of gaming my room temps will go from high 60's to mid 80's (or high teens to 30ish for the rest of the world)... I have a feeling my room would break 100f (38c) during the summer if it wasn't for AC.
Core temp idles around 35c now, but when I had my rig on the windowsill during winter it would idle in the single digits..


----------



## JordanTr

Ok guys so its time for me to go watercooling. I picked some parts from overclockers.co.uk



This is my first watercooling build and I'm a total noob. Gonna cool my Sapphire R9 290 + 2500K (4.5 - 4.8GHz). Did I choose the extra barbs and clips correctly? Is everything gonna be fine? Actually I'd prefer the Alphacool res+pump (which I can put in my DVD drive bay), but I like the EX280 radiator (got a Fractal Design XL R2 case, so I'm going to mount it on top and replace my Corsair H90). Plus I get full instructions if I buy the full kit, and buying parts separately is gonna cost me even more. I also took 2 litres of coolant. Maybe 1 litre is enough?







Thanks for your answers.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Exact opposite here, after a few hours of gaming my room temps will go from high 60's to mid 80's (or high teens to 30ish for the rest of the world)... I have a feeling my room would break 100f (38c) during the summer if it wasn't for AC.
> Core temp idles around 35c now, but when I had my rig on the windowsill during winter it would idle in the single digits..


First taste of winter here after a long, hot and sweaty summer and autumn. In summer my room can hit 36c in the afternoon western sun. Yes, thank gawd for my portable a/c unit, and above me the fully ducted a/c, gawd bless it


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kpoeticg*
> 
> It's not so much fishy, as much as people just don't realize that there's like a 95% chance that the product you bought wasn't "Manufactured" by who you think. Most manufacturers are really the people that design, or add an idea to a product.
> 
> Sapphire may be an in-house company since they don't have a million products on the market. But companies like MSI & Asus that make motherboards, gpu's, laptops, phones, tablets, etc... don't actually make the product. They have OEM's they use.
> 
> Same with PSU's. Corsair , Coolermaster, and EVGA don't "Make" PSU's. Seasonic, Superflower, and other cheaper companies OEM them with a new label.
> 
> *Once businesses grow to a certain point, it becomes more affordable and logical to have factories assemble your products inside the country that you're selling to, or where there's cheaper labor. *


Yerp, that's what's happening here. After 2017 Oztralia's local car manufacturing heads offshore after 70 years


----------



## the9quad

Played a bunch of BF4 today, had something weird happen. I started the game and it was like the cards never kicked in to realize it was a 3d game, so they just kind of sputtered out a very low framerate and never really did their thing. Left the game, tried it again, same thing. Finally rebooted and it's working again. Not sure if it's the cards or the game; I am leaning toward the game since BF4 isn't the most stable beast around.

Anyone else ever have this happen? Like they still thought I was doing something in 2d or something.


----------



## BradleyW

Quote:


> Originally Posted by *the9quad*
> 
> played a bunch of BF4 today, had something weird happen. I started the game and it was like the cards never kicked in to realize it was a 3d game, so they just kind of sputtered out a very low framerate and never really did their thing. Left the game, tried it again, same thing. Finally rebooted and it's working again. not sure if it's the cards or the game, I am leaning toward the game since BF4 isn't the most stable beat around.
> 
> Anyone else ever have this happen? Like they still thought I was doing something in 2d or something.


I had this a few times and Alt + Tab fixed it most of the time.


----------



## Tokkan

Quote:


> Originally Posted by *the9quad*
> 
> played a bunch of BF4 today, had something weird happen. I started the game and it was like the cards never kicked in to realize it was a 3d game, so they just kind of sputtered out a very low framerate and never really did their thing. Left the game, tried it again, same thing. Finally rebooted and it's working again. not sure if it's the cards or the game, I am leaning toward the game since BF4 isn't the most stable beat around.
> 
> Anyone else ever have this happen? Like they still thought I was doing something in 2d or something.


I had it with BF3 on my old crossfire system; crossfire wouldn't work and the main card would not go to full power state. It kept doing what you described and never went away, so I reinstalled Windows and that fixed it. Never had it again.
At the time I noticed the frames were locked to 30, but a really choppy 30 frames; no matter what settings I changed it stayed like that.


----------



## rdr09

Quote:


> Originally Posted by *JordanTr*
> 
> Ok guys so its time for me to go watercooling. I picked some parts from overclocker.co.uk
> 
> 
> 
> This is my first watercooling and im total noob. Gonna cool my Sapphire R9 290 + 2500K (4.5 - 4.8Ghz). Did i choose extra barbs and clips correctly? Everything gonna be fine? Actually i prefer alphacool res+pump ( which i can put into my dvd rom place), but i like ex280 radiator ( got Fractal Design XL R2 case, so im going to mount it on top and replace my corsair h90) + i get all instructions then i buy full kit and buying parts separately gonna cost me even more. I also took 2 litres of coolant. Maybe 1 litre is enough?
> 
> 
> 
> 
> 
> 
> 
> Thanks for your answers.


add a drain line. you're gonna need these . . .

http://www.frozencpu.com/products/2275/ex-tub-14/12_ID_UV_Reactive_Leakproof_T_Fitting.html?tl=g30c499s745

or this
http://www.frozencpu.com/products/10382/ex-tub-622/Bitspower_G14_Matte_Black_T_Adapter_BP-MBTMB.html?tl=g30c499s745

and

http://www.frozencpu.com/products/14873/ex-tub-1067/Enzotech_G14_Barb_Stop_Fitting_w_Cap_-_12_ID_-_Nickel_Plate_Metallic_Silver_NPH-ID12-G14.html?tl=g30c101

here is an alternative clamp . . .

http://www.frozencpu.com/products/5390/ex-tub-135/12_OD_Reusable_Clamp_-_Black.html

those barbs are the right ones. i recommend distilled water instead of the coolant.
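On the "is 1 litre enough?" question, loop volume is easy to estimate: tubing volume (pi * r^2 * length) plus whatever the rads, res and blocks hold. A rough sketch, where the ~600ml component figure is a guessed assumption rather than a spec-sheet number:

```python
import math

# Rough loop-volume estimate to sanity-check "is 1 litre of coolant enough?".
# Component volumes are rough assumptions, not spec-sheet numbers.

def loop_volume_ml(tube_id_mm, tube_len_m, component_ml):
    """Tubing volume (pi*r^2*L) plus radiators/res/blocks, in millilitres."""
    r_cm = tube_id_mm / 20                          # mm diameter -> cm radius
    tube_ml = math.pi * r_cm**2 * tube_len_m * 100  # length converted to cm
    return tube_ml + component_ml

# ~2m of 1/2" (12.7mm) ID tubing plus a guessed ~600ml for the
# 280 + 140 rads, reservoir and blocks combined
total = loop_volume_ml(12.7, 2.0, 600)
print(round(total))  # 853 -> 1 litre covers the fill; 2 leaves spare for top-ups
```

So for a loop like JordanTr's, 1 litre should fill it, though spare fluid never hurts for bleeding and top-ups.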


----------



## Mr357

Quote:


> Originally Posted by *JordanTr*
> 
> Ok guys so its time for me to go watercooling. I picked some parts from overclocker.co.uk
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> This is my first watercooling and im total noob. Gonna cool my Sapphire R9 290 + 2500K (4.5 - 4.8Ghz). Did i choose extra barbs and clips correctly? Everything gonna be fine? Actually i prefer alphacool res+pump ( which i can put into my dvd rom place), but i like ex280 radiator ( got Fractal Design XL R2 case, so im going to mount it on top and replace my corsair h90) + i get all instructions then i buy full kit and buying parts separately gonna cost me even more. I also took 2 litres of coolant. Maybe 1 litre is enough?
> 
> 
> 
> 
> 
> 
> 
> Thanks for your answers.


The more appropriate place for this would be the OCN Watercooling thread, or a thread of your own.

Most importantly, you need to make sure your tubing, barbs, and clips are the correct sizes. The diameter of the barbs should match the inner diameter of the tubing, and the outer diameter of the tubing should match the size of the clips. Although it costs quite a bit more, I would get a kit with a D5 over the 750. It's simply a much stronger, quieter, and typically more reliable pump.
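That sizing rule (barb OD matches tubing ID, clip matches tubing OD) can be sketched as a quick check; the sizes below are hypothetical examples, not JordanTr's actual parts:

```python
# Quick compatibility check for barbs / tubing / clips, per the advice above:
# barb OD should match tubing ID, and clip size should match tubing OD.
# Sizes in millimetres; the example figures are hypothetical.

def fittings_match(barb_od, tube_id, tube_od, clip_size, tol=0.5):
    """True if the barb fits the tubing bore and the clip fits over the tubing."""
    return abs(barb_od - tube_id) <= tol and abs(clip_size - tube_od) <= tol

# e.g. 1/2" ID, 3/4" OD tubing (12.7mm / 19.05mm) with 1/2" barbs, 3/4" clips
print(fittings_match(12.7, 12.7, 19.05, 19.05))  # True
print(fittings_match(12.7, 9.5, 12.7, 12.7))     # False: 1/2" barb, 3/8" ID tubing
```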


----------



## JordanTr

Quote:


> Originally Posted by *Mr357*
> 
> The more appropriate place for this would be the OCN Watercooling thread, or a thread of your own.
> 
> Most importantly, you need to make sure your tubing, barbs, and clips are of the correct size. The diameter of the bards should match the inner diameter of the tubing, and the outer diameter of the tubing should match the size of the clips. Although it costs quite a bit more, I would get a kit with a D5 over the 750. It's simply a much stronger, quieter, and typically more reliable pump.


It's expensive here in the UK. I don't want to spend more than £350-£360 ($550+) on my first watercooling loop. I will change the pump later if I see the need. Can you find me a drain line from overclockers.co.uk that fits my watercooling kit? What about those clips and barbs; are they gonna fit the GPU waterblock and the extra 140mm rad? And a last question: what direction should my water go? Is this alright: res/pump -> 280mm rad -> CPU -> 140mm rad -> GPU -> res/pump?


----------



## Jflisk

Quote:


> Originally Posted by *JordanTr*
> 
> Its expensive here in UK. I don't want to spend more than £350-£360 ($550+) for my first watercooling. I will change my pump later if I will see the need. Can you find me that drain line from overclockers.co.uk which fits my watercooling kit. what about those clips and barbs are they gonna fit gpu waterblock and extra 140mm rad? And last question. What direction should my water go? Is it alright: res/pump-> 280mm rad-> CPU-> 140mm rad-> GPU->res/pump ?


Try something like this if it's your first build. The kit should be available in the UK and comes with everything you need. You might want to go with coloured tubing if it's an option. This is a good base to start with; then you can add a GPU water block. Price 269.00 US

http://www.frozencpu.com/products/16551/ex-wat-217/XSPC_Raystorm_EX280_Extreme_Universal_CPU_Water_Cooling_Kit_w_D5_Variant_Pump_Included_and_Free_Dead-Water.html?tl=g59c683s2180

Tubing and fittings are
7/16" x 5/8"

Ek GPU block 112.99 US
http://www.frozencpu.com/products/21664/ex-blc-1565/EK_Radeon_R9-290X_VGA_Liquid_Cooling_Block_-_Acetal_EK-FC_R9-290X_-_Acetal.html

Back plate not needed but helps with cooling 30.00
http://www.frozencpu.com/products/21680/ex-blc-1569/EK_R9-290X_VGA_Liquid_Cooling_RAM_Backplate_-_Black_EK-FC_R9-290X_Backplate_-_Black.html


----------



## Roy360

Quote:


> Originally Posted by *JordanTr*
> 
> Ok guys so its time for me to go watercooling. I picked some parts from overclockers.co.uk
> 
> 
> 
> This is my first watercooling and im total noob. Gonna cool my Sapphire R9 290 + 2500K (4.5 - 4.8Ghz). Did i choose extra barbs and clips correctly? Everything gonna be fine? Actually i prefer alphacool res+pump ( which i can put into my dvd rom place), but i like ex280 radiator ( got Fractal Design XL R2 case, so im going to mount it on top and replace my corsair h90) + i get all instructions then i buy full kit and buying parts separately gonna cost me even more. I also took 2 litres of coolant. Maybe 1 litre is enough?
> 
> 
> 
> 
> 
> 
> 
> Thanks for your answers.


Use regular water as coolant: regular water + biocide. I have the same case as you, and it sucks to watercool in. At most you can fit a 280 on top, a 240 in the front, and a 120mm in the back. Don't bother with a drain port though; I have quick disconnects in mine, and barely any water gets through the GPU blocks. I just keep a bucket underneath now and pop the tubing out.


----------



## chiknnwatrmln

I really hope by regular water you mean distilled water, running tap in a loop is not a good idea...


----------



## ds84

Wanna ask: I'm using MSI 290 Gaming in Xfire and I tried to OC to +38mV, +100MHz over stock clocks. Is this acceptable on air?

Coz I did get some BSODs, which may be related to CCC.


----------



## the9quad

Quote:


> Originally Posted by *ds84*
> 
> Wanna ask, i using MSI 290 Gaming Xfire and i try to OC till +38mv, +100mhz over stock clocks. Is this acceptable on air?
> 
> Coz i did get some BSOD, which may be related to CCC.


Might be much for a 750 watt PSU? I don't know man, I have 1200 watts and I think that is short for 3 cards. Is 750 enough for two?


----------



## Sgt Bilko

Quote:


> Originally Posted by *the9quad*
> 
> Might be much for a 750 watt PSU? I don't know man, I have 1200 watts, and I think that is short for 3 cards is 750 enough for two?


He has an 850W; it's enough.
Quote:


> Originally Posted by *ds84*
> 
> Wanna ask, i using MSI 290 Gaming Xfire and i try to OC till +38mv, +100mhz over stock clocks. Is this acceptable on air?
> 
> Coz i did get some BSOD, which may be related to CCC.


Is anything else in your rig overclocked?

If not then try a full driver wipe then re-install.

If so, is it stable?


----------



## ds84

Quote:


> Originally Posted by *Sgt Bilko*
> 
> he has an 850w, It's enough.
> Is anything else in your rig overclocked?
> 
> If not then try a full driver wipe then re-install.
> 
> If so, is it stable?


Have not tried to OC my CPU yet....

I just reinstalled my OS to my new Samsung EVO SSD, so everything is fresh, so to speak... No BSOD at 1077/1300, +38mV. But when I tried 1107/1300, +50mV and ran 3DMark, I got around 11000 points, compared to 12000 at 1077. Shortly after, I got a BSOD.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ds84*
> 
> Have not tried to OC my cpu yet....
> 
> i just reinstalled my OS to my new samsung evo ssd. So everything is fresh, so to speak... No BSOD on 1077,1300,+38mv. But when i tried 1107, 1300, +50mv and ran 3Dmark, i gt around 11000 points, as compared to 12000 on 1077. Shortly after, i got BSOD.


Try adding a bit more voltage: try +60mV, then go up by 10mV at a time.

It's possible that your cards just aren't stable at those clocks.
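That "start at +60mV, step by 10mV" approach is really just a bounded linear search. A sketch, where `is_stable` is a stand-in for an actual benchmark pass (e.g. a 3DMark run), not a real measurement:

```python
# Sketch of the "start at +60mV, step up by 10mV" approach above.
# `is_stable` is a stand-in for a real stability test (benchmark/game run);
# the dummy lambda below exists only so the logic can be followed.

def find_offset(is_stable, start_mv=60, step_mv=10, ceiling_mv=100):
    """Return the first voltage offset at which `is_stable` passes,
    or None if nothing up to `ceiling_mv` works."""
    mv = start_mv
    while mv <= ceiling_mv:
        if is_stable(mv):
            return mv
        mv += step_mv
    return None  # out of safe voltage: back the clocks off instead

# Dummy: pretend the card needs at least +80mV at the chosen clocks
print(find_offset(lambda mv: mv >= 80))  # 80
```

The ceiling matters: if you hit it without stability, the answer is lower clocks, not more volts.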


----------



## the9quad

Quote:


> Originally Posted by *Sgt Bilko*
> 
> he has an 850w, It's enough.
> Is anything else in your rig overclocked?
> 
> If not then try a full driver wipe then re-install.
> 
> If so, is it stable?


Ha! I looked at his case!


----------



## ds84

Quote:


> Originally Posted by *the9quad*
> 
> Ha! I looked at his case!


Meaning???
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Try adding a bit more voltage try +60mV then go up by ten mV.
> 
> It's possible that your cards just aren't stable at those clocks.


At +40mV it is already producing quite a lot of heat during 3DMark. Not sure if +60mV would help. But I will try that once I'm on WC. Is +100mV safe on WC?


----------



## heroxoot

Quote:


> Originally Posted by *ds84*
> 
> Wanna ask, i using MSI 290 Gaming Xfire and i try to OC till +38mv, +100mhz over stock clocks. Is this acceptable on air?
> 
> Coz i did get some BSOD, which may be related to CCC.


Is it actually volting up? I have the 290X Gaming and when I push the voltage above +25mV it shows no difference in the voltage when I run benchmarks and games. It's always 1.81v when the game runs.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ds84*
> 
> Meaning???
> At +40mv, it is already producing quite a lot of heat during 3Dmark. Not sure if +60mv would help. But i will try that once im on WC. is +100mv safe on WC?


He meant you have a 750D case and at first glance he thought you had a 750w PSU









+100mV is safe on Water cooling yes.

You might have to wait until you get them under water to test them fully, if temps are an issue for you atm.

My 290's will hit 1050/1300 on stock voltage, and for 1100/1350 I need +30mV in Afterburner. Some cards perform better, others a little worse; luck of the draw really.
Quote:


> Originally Posted by *the9quad*
> 
> Ha! I looked at his case!


Thought that might have been the......"case" Ha!


----------



## Sgt Bilko

Quote:


> Originally Posted by *heroxoot*
> 
> Is it actually volting up? I have the 290X Gaming and when I push the voltage above +25mV it shows no differences on the voltage when I run benchmarks and games. It's always 1.81v when the game runs.


That's a good point actually; overvolting won't do jack unless you have the power limit raised.

Thanks for bringing that up


----------



## heroxoot

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Is it actually volting up? I have the 290X Gaming and when I push the voltage above +25mV it shows no differences on the voltage when I run benchmarks and games. It's always 1.81v when the game runs.
> 
> 
> 
> Thats a good point actually, overvolting won't do jack unless you have the powerlimit raised.
> 
> Thanks for bringing that up
Click to expand...

My powerlimit is raised tho. Not sure why it won't work for me, not that I have pushed my 290X in OC yet.


----------



## Dasboogieman

Just some thoughts

I have mounted a Corsair H90 on to a Sapphire 290 Tri X using the NZXT G10. My case is the CM HAF 932 and my CPU cooler is the Noctua NH D14

1. MEASURE YOUR 120/140mm slot before getting an AIO. I have a HAF 932 and the H90 doesn't actually fit in the 140mm back slot (with hoses oriented towards the ground) with a Noctua NH D14; I couldn't fit it sideways either, since the pump blocks the side panel from closing. In the end, I lashed the radiator into the 5.25 inch bays. In hindsight, I would've purchased a 120mm radiator AIO knowing this.

2. AIO hose length: more is better. I would advise against the H90 for future buyers; it is cost-optimized for CPU cooling, so the hose is 4 inches shorter than the Kraken X40's despite identical specs (both are identical Asetek units). This extra 4 inches would've allowed me to mount the rad at the rear with the hoses oriented towards the ceiling (thus not needing to be zip-tied) and still use the NH D14.

3. Mounting: the bracket has to be mounted to the GPU with the AIO. I suggest clearing space in the case prior to doing this, since it will ease the final insertion of the GPU + AIO assembly later. Only 2-3 turns of the screwdriver are necessary; the fit is quite snug, and any more and you risk cracking the GPU die or warping the PCB.

Results
The Sapphire Tri-X cooler was a brilliant design; the fans were placed in such a way that the air from the 3rd fan blows directly onto the VRM 1 area, which is direct-contact cooled by the whole heatsink frame.
Thus, at 50% fan speed (barely audible) the core is 72 degrees (ambient 22), VRM 1 is around 75 degrees, and VRM 2 is 45 degrees in EVGA OC Scanner (a variant of Furmark).

Kraken G10 + H90 with no additional VRM cooling
The overall noise is equivalent to the Tri X at 40% fan speed
The core never peaks above 45 degrees, but VRM 1 hovers around 82 degrees and VRM 2 reaches 60 degrees in EVGA OC Scanner.

Thus, I strongly recommend aftermarket VRM heatsinks if you want to overclock or run high-stress apps. While the 92mm fan does a respectable job, there's actually a dead air zone over the VRM 1 assembly, directly under the 92mm fan's motor. If any of you have seen the Puget Systems review, I believe this manifests as the giant red spot on the thermal camera.
I also recommend either thermal adhesive (Arctic Alumina is the safest, as I have read one report of someone blowing a board fuse using excessive Arctic Silver adhesive) or a screw-type heatsink assembly.
DON'T use Sekisui 5760 for the VRM 1 area. This tape has maximum adhesion in the 50-60 degree range; at 70-80 the bond starts to degrade as the glue becomes less viscous, and debonding starts to happen at 90-100 degrees over repeated cycles.

Useful dimensions and features:
On a reference AMD board, the distance from the first screw hole centre to the second centre on the VRM 1 area is 85mm
The height clearance from the RAM chip to the NZXT bracket is ~5mm
The height clearance from the VRM 1 zone to the NZXT bracket is ~6mm
The width of the VRM area from the Choke to the nearest capacitor is 13mm
The whole assembly takes 2 slots.
The AMD VRMs are metallic and despite the size difference, are the exact same height.

I have just ordered the GELID Enhancement kit for 290X from these guys http://www.feppaspot.com/servlet/the-897/GELID-Solutions-Icy-Vision/Detail so I don't have to DIY my own VRM sink.


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Winter is here ....
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Case ambient 16.5c - Ambient 15c , 16c - 18c VRM temps - vcore 24c - gpu core 22c - 24c ...... Chilly


AC is here... LMAO










Case ambient 31.3C, Ambient 28.5C (indoor) & 37.4C (outdoor), 31C - 32C VRM1, CPU core 37C, GPU core 37C - 38C ...... Hot!


----------



## Rozayz

Quote:


> Originally Posted by *Mega Man*
> 
> [ save yourself the upgrade itch and just get a RIVBE +4930k now ]


What is this itch you speak of? I can get both tomorrow (today is Sunday) if I wanted oO


----------



## kpoeticg

Lol my RIVE BE's in RMA right now









Probly why i had so much to say on the topic. I spent endless hours trying everything i could, it ended up being the board tho.


----------



## ds84

Quote:


> Originally Posted by *Sgt Bilko*
> 
> He meant you have a 750D case and at first glance he thought you had a 750w PSU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> +100mV is safe on Water cooling yes.
> 
> You might have to wait until you get them under water to test them fully then, if temps are an issue for you atm.
> 
> My 290's will hit 1050/1300 on stock voltage and for 1100/1350 I need +30mV in Afterburner, so some cards perform better, others a little worse; luck of the draw really.
> Thought that might have been the......"case" Ha!


Currently using 1077/1250 mhz, +38mv. seems to be stable for me at least.


----------



## kizwan

Quote:


> Originally Posted by *kpoeticg*
> 
> Lol my RIVE BE's in RMA right now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Probly why i had so much to say on the topic. I spent endless hours trying everything i could, it ended up being the board tho.


What was the problem?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> AC is here... LMAO
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Case ambient 31.3C, Ambient 28.5C (indoor) & 37.4C (outdoor), 31C - 32C VRM1, CPU core 37C, GPU core 37C - 38C ...... Hot!


A/C is ALWAYS here


----------



## Roboyto

Quote:


> Originally Posted by *Dasboogieman*
> 
> Just some thoughts
> 
> *I have mounted a Corsair H90 on to a Sapphire 290 Tri X using the NZXT G10.* My case is the CM HAF 932 and my CPU cooler is the Noctua NH D14
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 1. MEASURE YOUR 120/140mm slot before getting an AIO. I have a HAF 932 and the H90 doesn't actually fit into the 140mm back slot (with hoses oriented towards the ground) with a Noctua NH D14, and I couldn't fit it sideways since the pump blocks the side panel from closing. In the end, I lashed the radiator into the 5.25-inch bays. In hindsight, I would've purchased a 120mm-radiator AIO knowing this.
> 
> 2. AIO hose length: more is better. I would advise against the H90 for future buyers; it is cost-optimized for CPU cooling, so the hose is 4 inches shorter than the Kraken X40's despite identical specs (both are identical Asetek units). This extra 4 inches would've allowed me to mount the rad at the rear with the hoses oriented towards the ceiling (thus not needing to be zip-tied) and still use the NH D14.
> 
> 3. Mounting: the bracket has to be mounted to the GPU with the AIO. I suggest clearing space in the case prior to doing this, since it will ease the final insertion of the GPU + AIO assembly later. Only 2-3 turns of the screwdriver are necessary; the fit is quite snug, and any more and you risk cracking the GPU die or warping the PCB.
> 
> Results
> The Sapphire Tri-X cooler was a brilliant design; the fans were placed in such a way that the air from the 3rd fan blows directly onto the VRM 1 area, which is direct-contact cooled by the whole heatsink frame.
> Thus, at 50% fan speed (barely audible) the core is 72 degrees (ambient 22), VRM 1 is around 75 degrees, and VRM 2 is 45 degrees in EVGA OC Scanner (a variant of Furmark).
> 
> Kraken G10 + H90 with no additional VRM cooling
> The overall noise is equivalent to the Tri X at 40% fan speed
> The core never peaks above 45 degrees, but VRM 1 hovers around 82 degrees and VRM 2 reaches 60 degrees in EVGA OC Scanner.
> 
> Thus, I strongly recommend aftermarket VRM heatsinks if you want to overclock or run high-stress apps. While the 92mm fan does a respectable job, there's actually a dead air zone over the VRM 1 assembly, directly under the 92mm fan's motor. If any of you have seen the Puget Systems review, I believe this manifests as the giant red spot on the thermal camera.
> I also recommend either thermal adhesive (Arctic Alumina is the safest, as I have read one report of someone blowing a board fuse using excessive Arctic Silver adhesive) or a screw-type heatsink assembly.
> DON'T use Sekisui 5760 for the VRM 1 area. This tape has maximum adhesion in the 50-60 degree range; at 70-80 the bond starts to degrade as the glue becomes less viscous, and debonding starts to happen at 90-100 degrees over repeated cycles.
> 
> Useful dimensions and features:
> On a reference AMD board, the distance from the first screw hole centre to the second centre on the VRM 1 area is 85mm
> The height clearance from the RAM chip to the NZXT bracket is ~5mm
> The height clearance from the VRM 1 zone to the NZXT bracket is ~6mm
> The width of the VRM area from the Choke to the nearest capacitor is 13mm
> The whole assembly takes 2 slots.
> The AMD VRMs are metallic and despite the size difference, are the exact same height.
> 
> 
> 
> I *have just ordered the GELID Enhancement kit for 290X from these guys http://www.feppaspot.com/servlet/the-897/GELID-Solutions-Icy-Vision/Detail so I don't have to DIY my own VRM sink.*


*I'm in the midst of a post in one of my build logs for this heatsink kit with a Kraken G10. I'll share here first though







*

*NZXT Kraken G10 + Gelid R9 290(X) VRM Cooling Enhancement Kit*

I paid the additional $24 Courier Express shipping from Hong Kong. Ordered on Saturday Night 4/26 and it arrived Tuesday afternoon 4/29!

FeppaSpot's customer service was great. They let me know when the item was packaged/shipped.

They ask for a phone number so DHL can text you information to: check tracking info, digitally sign for your package, specify delivery instructions, or even change delivery address.

Superb service between FeppaSpot and DHL!



*Long VRM1 Sink & Thermal Pad - VRM1 Mounting Studs/Nuts - Plastic Shims - VRM2 Sink & Thermal Tape*



*3M 8810 RAM Sinks* - http://www.performance-pcs.com/catalog/index.php?main_page=product_info&products_id=39462

Thermal tape on these isn't the greatest. I accidentally knocked sinks off a couple times when handling the card. It's a pro/con they come off so easily.

May want to consider a similar sink and some different thermal tape.



*VRM1 sink is perfectly tailored to these cards. Nice job Gelid!*



*Tiny thumb nuts from the back side.*



*Junpus Thermal Tape* instead of supplied super thin tape - http://www.performance-pcs.com/catalog/index.php?main_page=product_info&products_id=37756

Sold by the foot



*All Sinks Installed* *- It looks purdy*











*This low profile sink comes in the kit*, but was not pictured above because I didn't think I would need it.

If you want a heatsink on the bottom most RAM chip then you will need this low profile sink!



*NZXT + Gelid Results*

*Reference Cooler on PowerColor R9 290 with auto fan settings*



*Kraken G10 + Antec 620 + SilenX Effizio + Gelid R9 290 Heatsink Upgrade Kit*



*Before*

Core - 94C

VRM1 - 65C

VRM2 - 83C

*After*

Core - 53C

VRM1 - 56C

VRM2 - 61C

Temps are outstanding, power consumption is down ~12% to 160W at 975/1250, and the computer is absolutely silent using the SilenX Effizio.

Even with 0 airflow across VRM2, its temp is down 22C due to no longer getting all the hot air from core and VRM1 blown across it.
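The before/after numbers above can be sanity-checked with a little arithmetic. A quick sketch (the 160W and ~12% figures are from this post; the implied "before" wattage is derived, not measured):

```python
# Back-of-the-envelope check on the reported numbers: power is down
# ~12% to 160 W, so the implied draw before the G10 + Gelid swap is
# 160 / (1 - 0.12).
after_w = 160.0
reduction = 0.12
before_w = after_w / (1 - reduction)
print(round(before_w))  # 182

# Temperature drops from the before/after readings in the post:
before = {"core": 94, "vrm1": 65, "vrm2": 83}
after = {"core": 53, "vrm1": 56, "vrm2": 61}
drops = {k: before[k] - after[k] for k in before}
print(drops)  # {'core': 41, 'vrm1': 9, 'vrm2': 22}
```

The lower draw at identical clocks is consistent with a cooler die leaking less current.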

The 92mm NZXT fan is pulling heat from VRM1 because of the card's placement in my 250D. Yes, an R9 290 with a Kraken G10 and Antec 620*** will**** fit inside a Corsair 250D.

****I had to remove a screw from the pump housing so the hoses/fittings would rotate further. If I hadn't done this, the side panel would not have fit as the hoses/fittings protruded too far out****

***I don't know if all of the Asetek based coolers have this little screw that can be removed***

Take care when turning the fitting on the pump further than was previously allowed, as I accidentally pulled it completely out! It dripped a little fluid, but did *NOT* break anything



*It takes a little finesse and patience, but it all fits in there with the radiator in the front of the case.*

Please excuse my poor quality cell phone photos & current lack of cable management







*A few things I still need to test/tinker around with:*


Temps when overclocking the card
Did I gain any additional overclocking headroom?
Reference cooler best clocks were 1160/1550 +200mV Trixx

Power consumption at max clocks/voltages
Swapping thermal tape on VRM2 for some Fujipoly Ultra or Ultra Extreme.
The Fuji Pad should hold sinks with the card standing upright as it worked great for a VRM heatsink upgrade on my Devil 270X


----------



## Rozayz

Quote:


> Originally Posted by *kpoeticg*
> 
> Lol my RIVE BE's in RMA right now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Probly why i had so much to say on the topic. I spent endless hours trying everything i could, it ended up being the board tho.


Tested my Z77 MPOWER with a different PSU today. Front panel power button and m/b power button both work. Was not able to jump my PSU with a paperclip either - gonna buy an AX1200i (can get them cheap at work) and go from there! M/B seems fine at this stage. [email protected][email protected] PC hardware ;_;


----------



## igrease

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> A/C is ALWAYS here


How much extra a month does a portable AC cost you a month?


----------



## cennis

would like to see overclocked vrm temps on the g10


----------



## Matt-Matt

Forgot to update this a while back... Now watercooled!









Can OP please update me in the spreadsheet?



Cable management is bad because by the time I got everything running properly I was in rush to get it together and I kind of need a new case


----------



## Arizonian

Quote:


> Originally Posted by *Matt-Matt*
> 
> Forgot to update this a while back... Now watercooled!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can OP please update me in the spreadsheet?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Cable management is bad because by the time I got everything running properly I was in rush to get it together and I kind of need a new case


Congrats - updated.









I've been looking at the NZXT Phantom 630 Windowed for my next build, whenever that might be.


----------



## Matt-Matt

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - updated.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've been looking at the NZXT Phantom 630 Windowed for my next build, whenever that might be.


Ah, a fan of the NZXT cases?








I like the look of them, but they feel a bit plasticy I've found. Mind you it's a mates PC and not mine so I haven't really worked with the case much.

I'm a Corsair fan myself now, haha.









Thanks for the update btw!


----------



## zGunBLADEz

My 290 @ 1250/1430
http://www.3dmark.com/3dm11/8297921
My first overclock attempt, on the fly. From the looks of it I have some spare room on the core.

I think on a cold day I can get this baby to 1300 XD
RAM is about there







need to play a lil bit more


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *zGunBLADEz*
> 
> My 290 @ 1250/1430
> http://www.3dmark.com/3dm11/8297921
> My first overclock attempt, on the fly. From the looks of it I have some spare room on the core.
> 
> I think on a cold day I can get this baby to 1300 XD
> RAM is about there
> 
> 
> 
> 
> 
> 
> 
> need to play a lil bit more


Nice effort


----------



## the9quad

Ever since I built this PC, 3DMark 11 is the only one of the benches this thing will not run. It crashes every time while loading the final test.


----------



## Roy360

Anyone know if the Club 3D RoyalKing R9 290X is a reference PCB or not? Want to see if it's compatible with my reference block before I break open the seal.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kpoeticg*
> 
> Lol my RIVE BE's in RMA right now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Probly why i had so much to say on the topic. I spent endless hours trying everything i could, it ended up being the board tho.


Bios issues ? 00 post ?


----------



## Dhalgren65

http://www.techpowerup.com/gpuz/9y559/
http://www.techpowerup.com/gpuz/hpupb/

Did that work?
Proud new owner-for Mistress in Sig
2x MSI R9 290 Gaming 4G w/EK fullcovers Crossfire!
Photos soon!


----------



## Mega Man

Quote:


> Originally Posted by *Roy360*
> 
> Anyone know if the Club 3D RoyalKing R9 290X is a reference PCB or not? Want to see if it's compatible with my reference block before I break open the seal.


By the pcie fingers do you see a and and and symbol


----------



## Arizonian

Quote:


> Originally Posted by *Dhalgren65*
> 
> http://www.techpowerup.com/gpuz/9y559/
> http://www.techpowerup.com/gpuz/hpupb/
> 
> Did that work?
> Proud new owner-for Mistress in Sig
> 2x MSI R9 290 Gaming 4G w/EK fullcovers Crossfire!
> Photos soon!


Congrats - added


----------



## Matt-Matt

Quote:


> Originally Posted by *the9quad*
> 
> Ever since I built this PC, 3DMark 11 is the only one of the benches this thing will not run. It crashes every time while loading the final test.


It could just be 3DMark not liking the drivers or the OC or something? It's strange that it's only 3DMark and only the final scene. I personally wouldn't worry about it!


----------



## phantomowl

Hi 290 users, have you updated your drivers to 14.4?


----------



## Mega Man

it is official last 2 GPUs on the way

quadfire 290xs baby !


----------



## Paul17041993

Quote:


> Originally Posted by *Mega Man*
> 
> By the pcie fingers do you see a and and and symbol


and and and what symbol?










(fyi, yea he means AMD symbol)


----------



## Mega Man

yea... auto correct on mobile


----------



## aidhanc

Quote:


> Originally Posted by *phantomowl*
> 
> Hi 290 users, have you updated your drivers to 14.4?


Updated to 14.4 from 13.12 and have been running smoothly without issues.
I would recommend a clean install to avoid possible conflicts though.


----------



## heroxoot

Quote:


> Originally Posted by *phantomowl*
> 
> Hi 290 users, have you updated your drivers to 14.4?


They work fine, but my 290X can't seem to have its fan speeds changed. It's only my card from what I can tell; I have not met any other MSI 290X owners so I cannot confirm. Worked fine on 13.12.

Other than that 14.4 is great.


----------



## Kokin

14.4 has no problems with both my 7950 and R9 290. It also works with either card when patched to OC my 1440p monitor to 120Hz.
Quote:


> Originally Posted by *Rozayz*
> 
> Tested my Z77 MPOWER with a different PSU today. Front panel power button and m/b power button both work. Was not able to jump my PSU with a paperclip either - gonna buy an AX1200i (can get them cheap at work) and go from there! M/B seems fine at this stage. [email protected][email protected] PC hardware ;_;


Just curious, but why go with such large capacity PSUs? I already consider my 750W PSU to be overkill for my ITX system. If you're using 2x GPUs then nevermind my post.


----------



## Rozayz

Quote:


> Originally Posted by *Kokin*
> 
> 14.4 has no problems with both my 7950 and R9 290. It also works with either card when patched to OC my 1440p monitor to 120Hz.
> Just curious, but why go with such large capacity PSUs? I already consider my 750W PSU to be overkill for my ITX system. If you're using 2x GPUs then nevermind my post.


I've actually reconsidered this and opted for the Strider 750W. It's plentiful for what I need and if I ever require an upgrade, I'll be putting the second PSU slot in my 900D to use. :3


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Kokin*
> 
> 14.4 has no problems with both my 7950 and R9 290. It also works with either card when patched to OC my 1440p monitor to 120Hz.
> Just curious, but why go with such large capacity PSUs? I already consider my 750W PSU to be overkill for my ITX system. If you're using 2x GPUs then nevermind my post.


Tried to o/c the monitor once; the patch made Win 7 say no driver (14.4 WHQL)??

Quote:


> Originally Posted by *Rozayz*
> 
> I've actually reconsidered this and opted for the Strider 750W. It's plentiful for what I need and if I ever require an upgrade, I'll be putting the second PSU slot in my 900D to use. :3


Silverstone PSUs are really good, I run the 1200-watter. I need it for when I'm benching 3 290's heavily volted up and Folding@Home


----------



## Rozayz

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Tried to o/c the monitor once; the patch made Win 7 say no driver (14.4 WHQL)??
> Silverstone PSUs are really good, I run the 1200-watter. I need it for when I'm benching 3 290's heavily volted up and Folding@Home


Even then, that wouldn't use 1200W.

See my sig for my rig ~ 3770k is at 5.0 GHz a lot of the time, but otherwise 4.6-7 GHz.


----------



## phantomowl

Quote:


> Originally Posted by *heroxoot*
> 
> They work fine, but my 290X can't seem to have its fan speeds changed. It's only my card from what I can tell; I have not met any other MSI 290X owners so I cannot confirm. Worked fine on 13.12.
> 
> Other than that 14.4 is great.


My friend got an MSI Lightning 290X; he said that he needs to update the GPU BIOS to make 14.4 work.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Rozayz*
> 
> Even then, that wouldn't use 1200W.
> 
> See my sig for my rig ~ 3770k is at 5.0 GHz a lot of the time, but otherwise 4.6-7 GHz.


I've had 3 cards at full load pull 360w each and the 3930k at said clock pull 250w when benching. Plus all the accessories I have as well. Comes close to it under load. Better to have it than not. That's just how I roll man
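Taking the bench figures above at face value, a quick headroom check (a sketch: the per-part wattages are the poster's own numbers, the helper function is hypothetical):

```python
# Hypothetical headroom check using the bench figures quoted above.
def psu_headroom(psu_watts, loads):
    """Rated capacity minus the summed loads; negative means over budget."""
    return psu_watts - sum(loads)

# Three heavily-volted 290s at ~360 W each plus a 3930K at ~250 W:
print(psu_headroom(1200, [360, 360, 360, 250]))  # -130
```

Taken literally that is 130 W over the 1200 W rating, but if those were wall-side (AC) readings, the DC load the PSU actually delivers is lower by its efficiency (roughly 0.9 for a good unit), which lands almost exactly on 1200 W; hence "comes close to it under load".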


----------



## Tokkan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I've had 3 cards at full load pull 360w each and the 3930k at said clock pull 250w when benching. Plus all the accessories I have as well. Comes close to it under load. Better to have it than not. That's just how I roll man


There's a saying in my language that, translated, goes a little like the following:
A big -insert your preferred container, such as a box- fits both the small items and the big items, while a small box fits only small items.
So... a big PSU powers both low-power systems and power-hungry systems, while a small PSU can only power the low-power ones.


----------



## Kokin

14.4 works with the OC patch, just make sure you are using the latest version of the patcher and of CRU.

It's definitely safer to have extra room than not.

I don't think my whole system pulls more than 350~400W total despite my 3570K being clocked to 4.7GHz (1.264V) and my R9 290 going past 1200+mhz with +200mv. My eight GT fans only consume about 7 watts (total) running at about 7V.


----------



## Tokkan

Quote:


> Originally Posted by *Kokin*
> 
> 14.4 works with the OC patch, just make sure you are using the latest version of the patcher and of CRU.
> 
> It's definitely safer to have extra room than not.
> 
> I don't think my whole system pulls more than 350~400W total despite my 3570K being clocked to 4.7GHz (1.264V) and my R9 290 going past 1200+mhz with +200mv. My eight GT fans only consume about 7 watts (total) running at about 7V.


What is this OC patch you speak of? I'm unaware of such thing.
I pushed my R9 290 to 1100/1350 on stock volts / +50% power limit with 14.4 and it isn't throttling nor crashing lol. Am I doing something wrong?


----------



## Kokin

Quote:


> Originally Posted by *Tokkan*
> 
> What is this OC patch you speak of? I'm unaware of such thing.
> I pushed my R9 290 to 1100/1350 on stock volts / +50% power limit with 14.4 and it isn't throttling nor crashing lol. Am I doing something wrong?


Oh I'm talking about the OC patch needed for the Qnix/X-star Korean monitors. It lets us OC the monitor's refresh rate from 60Hz to 96Hz~120Hz+


----------



## Tokkan

Quote:


> Originally Posted by *Kokin*
> 
> Oh I'm talking about the OC patch needed for the Qnix/X-star Korean monitors. It lets us OC the monitor's refresh rate from 60Hz to 96Hz~120Hz+


Oh that makes sense now, I remember reading about that.
Actually, I'm looking at them right now on eBay lol. My country is kinda bad for these types of products; one of them costs as much as my 290 cost me.


----------



## Kokin

Quote:


> Originally Posted by *Tokkan*
> 
> Oh that makes sense now, I remember reading about that.
> Actually, I'm looking at them right now on eBay lol. My country is kinda bad for these types of products; one of them costs as much as my 290 cost me.


Ah that's a shame. It is a very nice monitor even if you don't OC it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Kokin*
> 
> 14.4 works with the OC patch, just make sure you are using the latest version of the patcher and of CRU.


Same here, 14.4 and 96Hz (haven't tried higher)

has anyone seen the cooler design for the (rumoured) Powercolor Devil13 Dual Card?

Looks insane













Thanks to NavDigitalStorm:


----------



## Kokin

The fan design looks pretty darn cool. But if it ain't a waterblock, I'm not interested.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Kokin*
> 
> The fan design looks pretty darn cool. But if it ain't a waterblock, I'm not interested.


i get that,

curious to see how it goes though


----------



## Kokin

Quote:


> Originally Posted by *Sgt Bilko*
> 
> i get that,
> 
> curious to see how it goes though


I'm definitely interested too, at least for the card itself. It would be amazing if it performed just as well as the 295x2, as that card had almost perfect scaling in most scenarios.

I think I'm just too much of a watercooling fanboy to appreciate heatsinks anymore. I definitely don't like the reference cooler on my R9 290!


----------



## ZealotKi11er

Quote:


> Originally Posted by *Kokin*
> 
> 14.4 works with the OC patch, just make sure you are using the latest version of the patcher and of CRU.
> 
> It's definitely safer to have extra room than not.
> 
> I don't think my whole system pulls more than 350~400W total despite my 3570K being clocked to 4.7GHz (1.264V) and my R9 290 going past 1200+mhz with +200mv. My eight GT fans only consume about 7 watts (total) running at about 7V.


Playing BF4 i get ~ 500W with 290X @ 1200/1500 +100mV. You are probably pulling 550-600W.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Kokin*
> 
> I'm definitely interested too, at least for the card itself. It would be amazing if it performed just as well as the 295x2, as that card had almost perfect scaling in most scenarios.
> 
> I think I'm just too much of a watercooling fanboy to appreciate heatsinks anymore. I definitely don't like the reference cooler on my R9 290!


I wasn't a fan of the ref 290X I had either; it ran either too hot or too loud for me









I'm hoping this will do well actually, the PCS+ 290/x is a good card so maybe they can tame the beast (or Devil







)


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Kokin*
> 
> The fan design looks pretty darn cool. But if it ain't a waterblock, I'm not interested.


^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ THIS


----------



## bond32

Keep in mind that, although loud, the reference cooler does keep card temperatures usable while staying cheap. Yeah it's loud, but it works.


----------



## bbond007

Quote:


> Originally Posted by *Sgt Bilko*
> 
> has anyone seen the cooler design for the (rumoured) Powercolor Devil13 Dual Card?


That is an impressive amount of PCIe power connectors.

I thought my new Lightning had a lot of connectors. I thought Powercolor was just being silly until i realized it was a dual card....

Still, why would one pick this over the 295x2....


----------



## Sgt Bilko

Quote:


> Originally Posted by *bbond007*
> 
> That is an impressive amount of PCIe power connectors.
> 
> I thought my new Lightning had a lot of connectors. I thought Powercolor was just being silly until i realized it was a dual card....
> 
> Still, why would one pick this over the 295x2....


Overclocking ability (possibly), no space for the Rad or maybe the looks?

Personally i like the look of both versions but this one just seems Mental


----------



## Kokin

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Playing BF4 i get ~ 500W with 290X @ 1200/1500 +100mV. You are probably pulling 550-600W.


Highly doubt it's more than 275W at +200mv. A 290X pulls a lot more than a 290 from what I've seen.

With 1200/1250 +150mv, it was pulling about 250W:


Quote:


> Originally Posted by *bond32*
> 
> Keep in mind that, although loud, the reference cooler does keep card temperatures usable while staying cheap. Yeah it's loud, but it works.


It's certainly true, but Hawaii runs so much hotter than Tahiti. The stock cooler on my 7950 was puny, but it kept the card cool and relatively quiet even with overclocking.

Here's what it looked like:


----------



## ZealotKi11er

Quote:


> Originally Posted by *Kokin*
> 
> Highly doubt it's more than 275W at +200mv. A 290X pulls a lot more than a 290 from what I've seen.
> 
> With 1200/1250 +150mv, it was pulling about 250W:
> 
> It's certainly true, but Hawaii runs so much hotter than Tahiti. The stock cooler on my 7950 was puny, but it kept the card cool and relatively quiet even with overclocking.
> 
> Here's what it looked like:


GPU-Z is not accurate. I am actually using a meter at the wall.
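A wall meter and GPU-Z measure different things, so a gap is expected: the meter reads AC input for the whole system including PSU conversion losses, while GPU-Z reports only (some of) the card's own rails. A rough conversion, assuming ~90% PSU efficiency (an assumed figure, not from the thread):

```python
# Wall (AC) power vs. DC load: the PSU's conversion losses mean the
# components draw less than the meter shows. The 0.90 efficiency is
# an assumption for a quality unit at mid-load.
def dc_load(wall_watts, efficiency=0.90):
    return wall_watts * efficiency

# ~500 W at the wall while playing BF4, as reported above:
print(round(dc_load(500)))  # 450
```

Split across CPU, board, and fans, the card's share of that 450 W can still be well above what GPU-Z shows, since on some cards its sensor covers only part of the board's power.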


----------



## Mr357

Thought you guys would like to know that even though this thread was created exactly 5 months after the Official GTX 780 club, it has about 3000 more posts!









GTX 780 Club created on 5/23/13 - 19079 posts

290/290X Club created on 10/23/13 - 22037 posts


----------



## ZealotKi11er

Quote:


> Originally Posted by *Mr357*
> 
> Thought you guys would like to know that even though this thread was created exactly 5 months after the Official GTX 780 club, it has about 3000 more posts!


We have fewer problems than the GTX 780, that's why.


----------



## eternal7trance

Probably from all the bitcoin mining. I just got a 290 for really cheap coming in soon


----------



## kayan

Quote:


> Originally Posted by *eternal7trance*
> 
> Probably from all the bitcoin mining. I just got a 290 for really cheap coming in soon


I got a 290x at launch for 550. I just got a 2nd one on eBay for 340. Never registered and it seems to run fine so far. I dunno about overclocking though, as I installed it in my wife's way too small PC







. I want to get another if I can find one under 350, so I can XFire. But then I need more monitors or a higher resolution one. ;-)


----------



## heroxoot

Quote:


> Originally Posted by *kayan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *eternal7trance*
> 
> Probably from all the bitcoin mining. I just got a 290 for really cheap coming in soon
> 
> 
> 
> I got a 290x at launch for 550. I just got a 2nd one on eBay for 340. Never registered and it seems to run fine so far. I dunno about overclocking though, as I installed it in my wife's way too small PC
> 
> 
> 
> 
> 
> 
> 
> 
> . I want to get another if I can find one under 350, so I can XFire. But then I need more monitors or a higher resolution one. ;-)

I've never registered a damn GPU and I have owned 5+ from AMD. I feel like they should do the Nintendo thing and give me a T-Shirt for registering like 10 GPU or something. I'd wear an AMD shirt.


----------



## Tokkan

Quote:


> Originally Posted by *heroxoot*
> 
> I've never registered a damn GPU and I have owned 5+ from AMD. I feel like they should do the Nintendo thing and give me a T-Shirt for registering like 10 GPU or something. I'd wear an AMD shirt.


How do you even register it lmao? Through the Sapphire website that comes with the paper inside the box? I never really cared about that thing. Has nothing good for me lol.
If I register it do I get like a warranty that starts from the time it was registered?








Cause if not my invoice will suffice, do not need spam on my mail box.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Tokkan*
> 
> How do you even register it lmao? Through the Sapphire website that comes with the paper inside the box? I never really cared about that thing. Has nothing good for me lol.
> If I register it do I get like a warranty that starts from the time it was registered?
> 
> 
> 
> 
> 
> 
> 
> 
> Cause if not my invoice will suffice, do not need spam on my mail box.


That's why you get ASUS, Gigabyte, or MSI. You don't need to register or keep the receipt; just the S/N of the card and you have a 3-year warranty.


----------



## eternal7trance

Quote:


> Originally Posted by *ZealotKi11er*
> 
> That's why you get ASUS, Gigabyte, or MSI. You don't need to register or keep a receipt; just the S/N of the card and you have a 3-year warranty.


I always try to get one of those brands; the 290 I just ordered was from Gigabyte.


----------



## heroxoot

Quote:


> Originally Posted by *Tokkan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> I've never registered a damn GPU and I have owned 5+ from AMD. I feel like they should do the Nintendo thing and give me a T-Shirt for registering like 10 GPU or something. I'd wear an AMD shirt.
> 
> 
> 
> How do you even register it lmao? Through the Sapphire website that comes with the paper inside the box? I never really cared about that thing. Has nothing good for me lol.
> If I register it do I get like a warranty that starts from the time it was registered?
> 
> 
> 
> 
> 
> 
> 
> 
> Cause if not my invoice will suffice, do not need spam on my mail box.

Every time I install a new driver it throws up the AMD website asking me to register.


----------



## chronicfx

Been getting static audio popping and crackling while running vsync on the 14.4 WHQL driver. It is not the absolute deal breaker it was on 13.12, but it is definitely there when vsync is on and not there when vsync is off. Am I missing a setting somewhere? I tried disabling everything except my "speakers" in Device Manager, and even went as far as pulling my microphone input cable, since microphone interference had been suggested if you Google around. Playing with vsync off for now.


----------



## kayan

Hey all, so I (my wife, really) am having a problem similar to BatGirl's from a few days back. I remember reading that someone suggested downloading WhoCrashed and running it.

About a month ago my wife started having problems with video flickering in and out whenever she starts any streaming video program, whether it's a W8 app or streaming in a browser (The CW, Amazon Prime, Netflix, etc.). The screen would go completely black off and on. It would only occur when starting some streaming service, and it would continue to go black and then come back on until she reboots her PC. When the screen turns black it stays like that for about 5-6 seconds, then is fine for a random amount of time until it goes black again.

Today she got her first BSOD caused by this. This is the crash dump according to WhoCrashed:

Crash Dump Analysis

Crash dump directory: C:\WINDOWS\Minidump

Crash dumps are enabled on your computer.

On Mon 5/5/2014 5:59:37 PM GMT your computer crashed
crash dump file: C:\WINDOWS\Minidump\050514-5828-01.dmp
This was probably caused by the following module: ntoskrnl.exe (nt+0x153FA0)
Bugcheck code: 0xA0000001 (0x5, 0x0, 0x0, 0x0)
Error: CUSTOM_ERROR
file path: C:\WINDOWS\system32\ntoskrnl.exe
product: Microsoft® Windows® Operating System
company: Microsoft Corporation
description: NT Kernel & System
The crash took place in the Windows kernel. Possibly this problem is caused by another driver that cannot be identified at this time.

Her PC is as follows:
Gigabyte 990FX UD3
AMD FX 8320 (stock)
XFX 5830 (more on this later)
8GB of Corsair 1600 ram
OCZ 850w Gold PSU

So, I originally thought that it was her video card, or a bad connection cord from her PC to monitor. I bought her a 290X to replace her ancient GPU, and we installed it last night, properly disposing of the old drivers first. We installed the card, downloaded the 14.4 drivers, and the BSOD happened today. It has happened with two different monitors (hers and mine) using a DVI-D connection. I'm at a loss. Any suggestions?


----------



## VSG

I have a sinking feeling that it's Windows 8 at fault, honestly, and nothing to do with the GPU.


----------



## Tokkan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> That's why you get ASUS, Gigabyte, or MSI. You don't need to register or keep a receipt; just the S/N of the card and you have a 3-year warranty.


Well, the thing is they are kinda forced to give me 2 years since I live in Europe; I'm just the end customer.

And even though my card came from eBay, the seller promptly sent me a copy of the invoice with the card. Anyone can RMA it thanks to the invoice.


----------



## eternal7trance

Quote:


> Originally Posted by *kayan*
> 
> Hey all, so I'm (my wife really) is having a problem similar to BatGirl's from a few days back. I remember reading that someone suggested downloading whocrashed and running it.
> 
> About 1 month ago my wife started having problems with video flickering in and out every so often whenever she starts any streaming video program, whether it's a W8 App, or a browser streaming (The CW, Amazon Prime, Netflix, etc...). The screen would go completely black off and on. It would only occur when starting some streaming service, and it would continue to go black and then back on until she reboots her PC. When the screen turns black it stays like that for about 5-6 seconds and then fine for a random amount of time, until it goes black again.
> 
> Today she got her first BSOD caused by this. This is the crashdump according to whoscrashed:
> 
> Crash Dump Analysis
> 
> Crash dump directory: C:\WINDOWS\Minidump
> 
> Crash dumps are enabled on your computer.
> 
> On Mon 5/5/2014 5:59:37 PM GMT your computer crashed
> crash dump file: C:\WINDOWS\Minidump\050514-5828-01.dmp
> This was probably caused by the following module: ntoskrnl.exe (nt+0x153FA0)
> Bugcheck code: 0xA0000001 (0x5, 0x0, 0x0, 0x0)
> Error: CUSTOM_ERROR
> file path: C:\WINDOWS\system32\ntoskrnl.exe
> product: Microsoft® Windows® Operating System
> company: Microsoft Corporation
> description: NT Kernel & System
> The crash took place in the Windows kernel. Possibly this problem is caused by another driver that cannot be identified at this time.
> 
> Her PC is as follows:
> Gigabyte 990FX UD3
> AMD FX 8320 (stock)
> XFX 5830 (more on this later)
> 8GB of Corsair 1600 ram
> OCZ 850w Gold PSU
> 
> So, I originally thought that it was her video card, or a bad connection cord from her PC to Monitor. I bought her a 290x to replace her ancient GPU, we installed it last night. Properly disposing of the old drivers. Installed the card, downloaded the 14.4 drivers, and the BSOD happened today. It has happened with 2 different monitors (hers and mine) using a DVI-D connection. I'm at a loss. Any suggestions?


I would also try a different power supply just to be sure, since you probably have another you can use.


----------



## kayan

Quote:


> Originally Posted by *geggeg*
> 
> I have a stinking feeling that is Windows 8 at fault honestly and nothing to do with the GPU.


That could be; it was fine before she installed W7 in a VM for her work-from-home stuff. We tried her monitor, my monitor, and her newer, bigger monitor: it blinks on mine and on the new one, but not at all on her old one. Maybe an issue with the new monitor?

Quote:


> Originally Posted by *eternal7trance*
> 
> I would also try a different power supply just to be sure since you probably have another you can use


It's possible, but highly unlikely. Everything has run on that PSU for about 2 years with no issues. I've had two or three systems built around it, and she's had two. Everything is rock solid on the power delivery front as well.

I just tried plugging her new monitor in via DP (instead of DVI-D) and her PC wouldn't even show anything on it. The cord is fine, since it works on my PC with my monitor. I even tried rebooting everything and I still couldn't get a signal via DP, but lo and behold, I plugged the DVI-D back in and bam, video signal no problem.

I am so confused.


----------



## EPiiKK

Anyone got any experience with used mining cards? I need to upgrade my gaming rig, and I could get ex-mining GPUs real cheap, like sub-200€. Could past mining cause some stability issues etc.?
Also, there are pretty much only reference versions on sale. Would a 250W Arctic Cooling cooler be good on a 290? I have one on my GTX 560, overclocked to max, and my temps don't go above 43 with the fans as low as they go. Or should I look into watercooling or AIOs?


----------



## JordanTr

Quote:


> Originally Posted by *EPiiKK*
> 
> Anyone got any experience with used mining cards? I need to upgrade my gaming rig, and I could get ex-mining GPUs real cheap, like sub-200€. Could past mining cause some stability issues etc.?
> Also, there are pretty much only reference versions on sale. Would a 250W Arctic Cooling cooler be good on a 290? I have one on my GTX 560, overclocked to max, and my temps don't go above 43 with the fans as low as they go. Or should I look into watercooling or AIOs?


If they have been held at high temperatures, yes. Also, flash the regular BIOS of the card, because they have probably been using a modified BIOS.


----------



## Roboyto

Anyone experience black screen video loss on 14.4? Screen goes blank/black and only a reboot fixes it?

It happened twice and I have since switched back to 13.12 and all seems to be well, but it's only been a day and a half. Just wondering if I need to be concerned with my "new" open box 290 from NewEgg, or if others have experienced this.


----------



## Terrere

Quote:


> Originally Posted by *EPiiKK*
> 
> Anyone got any experience with used mining cards? I need to upgrade my gaming rig, and I could get ex-mining GPUs real cheap, like sub-200€. Could past mining cause some stability issues etc.?
> Also, there are pretty much only reference versions on sale. Would a 250W Arctic Cooling cooler be good on a 290? I have one on my GTX 560, overclocked to max, and my temps don't go above 43 with the fans as low as they go. Or should I look into watercooling or AIOs?


I bought my r9 290 as a used mining card. I've had no issues, but the person I got it from only mined for about 2 months with the cards undervolted and underclocked. If the cards were pushed to their limits with temps staying high, it could affect the stability.


----------



## Roboyto

Quote:


> Originally Posted by *EPiiKK*
> 
> Anyone got any experience with used mining cards? I need to upgrade my gaming rig, and I could get ex-mining GPUs real cheap, like sub-200€. Could past mining cause some stability issues etc.?
> Also, there are pretty much only reference versions on sale. Would a 250W Arctic Cooling cooler be good on a 290? I have one on my GTX 560, overclocked to max, and my temps don't go above 43 with the fans as low as they go. Or should I look into watercooling or AIOs?


A used mining card is going to be a gamble any way you look at it. If the card was abused, there's a possibility it could have issues.

A 250W cooler would probably do OK, but you wouldn't be able to run any earth-shattering OCs on it. The more important question is whether the cooler is even compatible with the 290.

A GTX 560 is nothing like the volcanic 290(X); don't expect those temperatures, with that cooler, on one of these beasts.

Go with an AIO instead: it's a way better option and you're not epoxying heatsinks to everything.

NZXT Kraken G10 since they're readily available: http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=kraken+g10&N=-1&isNodeId=1

Order one of these heatsinks: http://www.feppaspot.com/servlet/the-897/GELID-Solutions-Icy-Vision/Detail

Zalman LQ310 $65 - $20 MIR: http://www.newegg.com/Product/Product.aspx?Item=N82E16835118134

And get results like I experienced: http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition/20#post_22214255

Quote:


> Originally Posted by *kayan*
> 
> Hey all, so I'm (my wife really) is having a problem similar to BatGirl's from a few days back. I remember reading that someone suggested downloading whocrashed and running it.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> About 1 month ago my wife started having problems with video flickering in and out every so often whenever she starts any streaming video program, whether it's a W8 App, or a browser streaming (The CW, Amazon Prime, Netflix, etc...). The screen would go completely black off and on. It would only occur when starting some streaming service, and it would continue to go black and then back on until she reboots her PC. When the screen turns black it stays like that for about 5-6 seconds and then fine for a random amount of time, until it goes black again.
> 
> Today she got her first BSOD caused by this. This is the crashdump according to whoscrashed:
> 
> Crash Dump Analysis
> 
> Crash dump directory: C:\WINDOWS\Minidump
> 
> Crash dumps are enabled on your computer.
> 
> 
> 
> On Mon 5/5/2014 5:59:37 PM GMT your computer crashed
> crash dump file: C:\WINDOWS\Minidump\050514-5828-01.dmp
> This was probably caused by the following module: *ntoskrnl.exe* (nt+0x153FA0)
> Bugcheck code: 0xA0000001 (0x5, 0x0, 0x0, 0x0)
> Error: CUSTOM_ERROR
> file path: C:\WINDOWS\system32\ntoskrnl.exe
> product: Microsoft® Windows® Operating System
> company: Microsoft Corporation
> description: NT Kernel & System
> The crash took place in the Windows kernel. Possibly this problem is caused by another driver that cannot be identified at this time.
> 
> Her PC is as follows:
> Gigabyte 990FX UD3
> AMD FX 8320 (stock)
> XFX 5830 (more on this later)
> 8GB of Corsair 1600 ram
> OCZ 850w Gold PSU
> 
> So, I originally thought that it was her video card, or a bad connection cord from her PC to Monitor. I bought her a 290x to replace her ancient GPU, we installed it last night. Properly disposing of the old drivers. Installed the card, downloaded the 14.4 drivers, and the BSOD happened today. It has happened with 2 different monitors (hers and mine) using a DVI-D connection. I'm at a loss. Any suggestions?


Try swapping in RAM from another PC if you have it. I've had similar BSODs with that ntoskrnl.exe, and RAM was the culprit on two occasions, I believe.

Ah, just remembered something! I just repaired a laptop with Win8 on it that was having video flickering issues; it would happen as soon as you left the "Metro" tile interface by going to the desktop. I couldn't remedy the problem, so I used the "Refresh My PC" option of Win8 and all was well. Glorious feature to reinstall the OS without losing/affecting anything else. Ah, and don't forget the free 8.1 upgrade if you haven't taken advantage of it.


----------



## the9quad

Quote:


> Originally Posted by *Roboyto*
> 
> Anyone experience black screen video loss on 14.4? Screen goes blank/black and only a reboot fixes it?
> 
> It happened twice and I have since switched back to 13.12 and all seems to be well, but it's only been a day and a half. Just wondering if I need to be concerned with my "new" open box 290 from NewEgg, or if others have experienced this.


It's purely an issue with the 14.4 and the press-leaked drivers. No worries, man.


----------



## rdr09

Quote:


> Originally Posted by *Roboyto*
> 
> Anyone experience black screen video loss on 14.4? Screen goes blank/black and only a reboot fixes it?
> 
> It happened twice and I have since switched back to 13.12 and all seems to be well, but it's only been a day and a half. Just wondering if I need to be concerned with my "new" open box 290 from NewEgg, or if others have experienced this.


I think you should be concerned. I've been using 14.4 since it came out without issues, though I did notice that it needs more VDDC for the same OC compared to 13.11.

Edit: may I add that I am using Win7 and HDMI. The first and last time 14.4 was installed, I had to unplug and replug the HDMI cable after rebooting the system to get the monitor recognized; since then it has been flawless. Also, I use hibernate and Windows is set to the balanced power option.


----------



## Roboyto

Quote:


> Originally Posted by *the9quad*
> 
> its purely an issue with the 14.4 and the press leaked drivers. No worries man.


Phew.

I was not looking forward to having to reverse my AIO upgrade and return yet another card.

Just upgraded a friend's PC and we got two bunk cards back to back... all is well now with his shiny new VisionTek 280X. I'd never personally handled/owned a VisionTek until he got this card, and I must say I am impressed with the cooling capabilities. It looks like a cheapo, but the heatpipes are enormous and it does a swell job of quietly cooling the 280X, even at a fairly high OC of 1195/1700.

Quote:


> Originally Posted by *rdr09*
> 
> i think you should be concerned. Been using 14.4 since it came out without issues. did notice that it needs more VDDC for the same oc compared to 13.11.


Hmmm...will have to switch over to the HTPC for a while and see if it happens today. I did a fresh Windows install and opted for 14.4 since they were official release.

Still have almost 2 weeks to return the card if it is the cause...will play it by ear and see what happens.

Thank you both for your input.


----------



## Paul17041993

Quote:


> Originally Posted by *Kokin*
> 
> Just curious, but why go with such large capacity PSUs? I already consider my 750W PSU to be overkill for my ITX system. If you're using 2x GPUs then nevermind my post.


Headroom, regulation, and efficiency.

You'd be surprised how much power they waste and how unstable they can become when pulled up to the max, even platinum-rated ones.


----------



## ZealotKi11er

Quote:


> Originally Posted by *EPiiKK*
> 
> Anyone got any experience with used mining cards? I need to upgrade my gaming rig, and I could get ex-mining GPUs real cheap, like sub-200€. Could past mining cause some stability issues etc.?
> Also, there are pretty much only reference versions on sale. Would a 250W Arctic Cooling cooler be good on a 290? I have one on my GTX 560, overclocked to max, and my temps don't go above 43 with the fans as low as they go. Or should I look into watercooling or AIOs?


I don't think mining did anything to 290/290X cards. They really did not run for that long, 3-4 months tops. Because the card was temp limited, even if it hit 95C it was well within spec. If a card had been running for 1-2 years, that's a different thing.


----------



## Goride

Quote:


> Originally Posted by *Dasboogieman*
> 
> Just some thoughts
> 
> I have mounted a Corsair H90 on to a Sapphire 290 Tri X using the NZXT G10. My case is the CM HAF 932 and my CPU cooler is the Noctua NH D14
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 1. MEASURE YOUR 120/140mm slot before getting an AIO. I have a HAF923 and the H90 doesn't actually fit in to the 140mm back slot (with hoses oriented towards the ground) with a Noctua NH D14, couldn't fit it sideways since the pump blocks the side panel from closing. In the end, I lashed the radiator in to the 5.25 inch bays. In hindsight, I would've purchased a 120mm radiator AIO knowing this.
> 
> 2. AIO Hose length, more is better. I would advise against the H90 for future buyers, this is cost optimized for the CPU cooling thus the hose is 4 inches shorter than the Kraken X40 despite identical specs (both are identical Asetek units) This extra 4 inches would've allowed me to mount the rad at the rear with the hoses oriented towards the ceiling (thus not needing to be zip tied) and still use the NH D14.
> 
> 3. Mounting: the bracket has to be mounted to the GPU with the AIO, I suggest clearing space in the case prior to doing this since this will ease the final insertion of the GPU + AIO assembly later. Only 2-3 turns of the screw driver is neccessary, the fit is quite snug, any more and you risk cracking the GPU die or warping the PCB.
> 
> Results
> The Sapphire Tri X cooler was a brilliant design, the fans were placed in such a way that the air from the 3rd fan blows directly on to the VRM 1 area which was direct contact cooled with the whole heatsink frame.
> Thus, at 50% fan speed (barely audible) the core is 72 degrees (ambient 22) and VRM1 is around (75 degrees), VRM 2 is 45 degrees in EVGA OC scanner (a variant of Furmark)
> 
> Kraken G10 + H90 with no additional VRM cooling
> The overall noise is equivalent to the Tri X at 40% fan speed
> The core never peaks above 45 degrees but VRM 1 hovers around 82 degrees, VRM 2 reaches 60 degrees in EVGA OC scanner
> 
> Thus, I strongly recommend aftermarket VRM heatsinks if you want to overclock or using high stress apps, while the 92mm fan does a respectable job, there's actually an air dead zone in the VRM 1 assembly which lies directly under the 92mm fan's motor. If any of you has seen the Puget systems review, I believe this manifests as the giant red spot on the thermal camera.
> I also recommend either thermal adhesive (Arctic Alumina is the safest, as I have read 1 report of someone blowing a board fuse using excessive Arctic Silver adhesive) or a screw-type heatsink assembly.
> DON'T use Sekisui 5760 for the VRM1 area. This tape has maximum adhesion around the 50-60 degree temperature range, 70-80 starts to see a degradation in the bond as the glue becomes less viscous. debonding starts to happen at 90-100 degrees over repeated cycles.
> 
> Useful dimensions and features:
> On a reference AMD board, the distance from the first screw hole centre to the second centre on the VRM 1 area is 85mm
> The height clearance from the RAM chip to the NZXT bracket is ~5mm
> The height clearance from the VRM 1 zone to the NXZT bracket is ~6mm
> The width of the VRM area from the Choke to the nearest capacitor is 13mm
> The whole assembly takes 2 slots.
> The AMD VRMs are metallic and despite the size difference, are the exact same height.
> 
> I have just ordered the GELID Enhancement kit for 290X from these guys http://www.feppaspot.com/servlet/the-897/GELID-Solutions-Icy-Vision/Detail so I don't have to DIY my own VRM sink.


Quote:


> Originally Posted by *Roboyto*
> 
> *I'm in the midst of a post in one of my build logs for this heatsink kit with a Kraken G10. I'll share here first though
> 
> 
> 
> 
> 
> 
> 
> *
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *NZXT Kraken G10 + Gelid R9 290(X) VRM Cooling Enhancement Kit*
> 
> I paid the additional $24 Courier Express shipping from Hong Kong. Ordered on Saturday Night 4/26 and it arrived Tuesday afternoon 4/29!
> FeppaSpot's customer service was great. They let me know when the item was packaged/shipped.
> They ask for a phone number so DHL can text you information to: check tracking info, digitally sign for your package, specify delivery instructions, or even change delivery address.
> Superb service between FeppaSpot and DHL!
> 
> 
> 
> *Long VRM1 Sink & Thermal Pad - VRM1 Mounting Studs/Nuts - Plastic Shims - VRM2 Sink & Thermal Tape*
> 
> 
> 
> *3M 8810 RAM Sinks* - http://www.performance-pcs.com/catalog/index.php?main_page=product_info&products_id=39462
> Thermal tape on these isn't the greatest. I accidentally knocked sinks off a couple times when handling the card. It's a pro/con they come off so easily.
> May want to consider a similar sink and some different thermal tape.
> 
> 
> 
> *VRM1 sink is perfectly tailored to these cards. Nice job Gelid!*
> 
> 
> 
> *Tiny thumb nuts from the back side.*
> 
> 
> 
> *Junpus Thermal Tape* instead of supplied super thin tape - http://www.performance-pcs.com/catalog/index.php?main_page=product_info&products_id=37756
> Sold by the foot
> 
> 
> 
> *All Sinks Installed* *- It looks purdy*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *This low profile sink comes in the kit*, but was not pictured above because I didn't think I would need it.
> If you want a heatsink on the bottom most RAM chip then you will need this low profile sink!
> 
> 
> 
> *NZXT + Gelid Results*
> 
> *Reference Cooler on PowerColor R9 290 with auto fan settings*
> 
> 
> 
> *Kraken G10 + Antec 620 + SilenX Effizio + Gelid R9 290 Heatsink Upgrade Kit*
> 
> 
> 
> *Before*
> 
> Core - 94C
> VRM1 - 65C
> VRM2 - 83C
> 
> *After*
> Core - 53C
> VRM1 - 56C
> VRM2 - 61C
> 
> Temps are outstanding, power consumption is down ~12% to 160W at 975/1250, and the computer is absolutely silent using the SilenX Effizio.
> Even with 0 airflow across VRM2, its temp is down 22C due to no longer getting all the hot air from core and VRM1 blown across it.
> 
> The 92mm NZXT fan is pulling heat from VRM1 because of the cards placement in my 250D. Yes, a R9 290 with a Kraken G10 and Antec 620*** will**** fit inside a Corsair 250D.
> ****I had to remove a screw from the pump housing so the hoses/fittings would rotate further. If I hadn't done this, the side panel would not have fit as the hoses/fittings protruded too far out****
> ***I don't know if all of the Asetek based coolers have this little screw that can be removed***
> Take care when turning the fitting on the pump further than was previously allowed, as I accidentally pulled it completely out! It dripped a little fluid, but did _*NOT*_ break anything
> 
> 
> 
> *It takes a little finesse and patience, but it all fits in there with the radiator in the front of the case.*
> Please excuse my poor quality cell phone photos & current lack of cable management
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *A few things I still need to test/tinker around with:*
> 
> Temps when overclocking the card
> Did I gain any additional overclocking headroom?
> Reference cooler best clocks were 1160/1550 +200mV Trixx
> 
> Power consumption at max clocks/voltages
> Swapping thermal tape on VRM2 for some Fujipoly Ultra or Ultra Extreme.
> The Fuji Pad should hold sinks with the card standing upright as it worked great for a VRM heatsink upgrade on my Devil 270X


I wish I saw Roboyto and Dasboogieman's post earlier.

I just ordered 15 of these heatsinks:
http://www.frozencpu.com/products/17727/vid-191/Micro_Thermal_Heatsink_-_65mm_x_65mm_x_12mm_for_Motherboard_MOS.html

They are 6.5mm x 6.5mm x 12mm. I was going to put 10-12 of them in a row over the VRM1 area, and 3 of them over the VRM2 area. I could not find much information on sizes, but it seemed people were saying that the clearance was almost 14mm, so I thought 12mm would be good. But according to these posts, that is much too tall.

I also ordered these sinks for the memory:
http://www.newegg.com/Product/Product.aspx?Item=9SIA3TR18A6835

They are 15mm x 15mm x 8mm, so it appears they are going to be too tall as well.

I wanted to get the Gelid 290x enhancement kit, but it was (and still is) out of stock everywhere.


----------



## cennis

h90 ghetto mod
dcuii fan blowing on stock dcuii vrm heatsink 80% (loud)


----------



## Roboyto

Quote:
Originally Posted by *Goride* 



> I wish I saw Roboyto and Dasboogieman's post earlier.
> 
> I just ordered 15 of these heatsinks: http://www.frozencpu.com/products/17727/vid-191/Micro_Thermal_Heatsink_-_65mm_x_65mm_x_12mm_for_Motherboard_MOS.html
> They are 6.5mm x 6.5mm x 12mm. I was going to but 10-12 of them in a row over the vrm1 area, and put 3 of them over the vrm2 area. I could not find much information on sizes, but it seemed people were saying that the clearance was almost 14mm, so I thought 12mm would be good.
> 
> I also ordered these sinks for the memory: http://www.newegg.com/Product/Product.aspx?Item=9SIA3TR18A6835
> They are 15mm x 15mm x 8mm. So It appears that they are going to be too high as well.
> 
> I wanted to get the Gelid 290x enhancement kit, but it was (and still is) out of stock everywhere.


They must have just gone on backorder at FeppaSpot, but you may want to contact FrozenCPU about them. When I found out how well the sink kit worked, I e-mailed them and said they should start carrying it. I never got a response back, but you never know.

The RAM sinks I ordered are 13 x 13 x 7mm, so I don't think 1mm is going to kill you there. The only issue you'll have is the bottom-most RAM chip: you need a low-profile sink there or it will hit the AIO.

I also just measured the distance between the fan on the Kraken and the PCB; according to my mic it is 12.73mm. I would say if you get them and they are a little too tall, shave/cut them down. I did that when I put new sinks/thermal pads on my Devil 270X. All I had lying around were leftover sinks from some Arctic Accelero coolers, and I made them fit.


----------



## Caos

Should the power limit be set to 50% for 1130/1400?


----------



## Dasboogieman

Quote:


> Originally Posted by *Goride*


Those numbers were quick and dirty with a ruler at a distance. I would only be worried about the RAM height; there's a bit more headroom on the VRM side.


----------



## lawson67

Hi guys, I am running two R9 290 cards, a PowerColor PCS+ and a Gigabyte Windforce R9 290, in CrossFire, and they have been running great together for a few months. However, for whatever reason the Gigabyte card stopped putting out a signal to the monitor. It was the top card, above the PowerColor card, so as to leave a nice gap for cooling, seeing as the PowerColor is a triple-slot card. Once it stopped putting out a signal to the monitor I tried it as the bottom card and still no joy; it was indeed dead, even though the fans still turned on it.

I RMA'd the old one and bought a new Gigabyte card. However, I believe this card is faulty as well. Run alone in the top slot looping Heaven 4.0, it keeps crashing the drivers and black screening, and sometimes it crashes the drivers completely, leaving a stuck Heaven audio loop in the background and forcing me to hard reset, which is hardly ideal. Any kind of overclock or even a slight voltage boost crashes this card, and if I try to crossfire it with the PowerColor card below it I get the same result. BTW, the PowerColor card runs for hours in the top slot alone looping Heaven, no problem. However, if I run the PowerColor as the top card and place the Gigabyte below it, the Gigabyte seems fine and happy, and it even overclocks?

There is clearly a fault with this card, guys. I have heard about some R9 290 cards black screening but do not know how this is solved. Why would it run fine in crossfire as the bottom card but crash the drivers ("amdkmdap stopped responding and has successfully recovered") as the top card? Why will it not run alone without doing the same? And why is it fine as the bottom card but *not* the top card in a CrossFire configuration? Any thoughts or help would be well appreciated.


----------



## Kokin

Quote:


> Originally Posted by *EPiiKK*
> 
> Anyone got any experience with used mining cards? I need to upgrade my gaming rig, and I could get ex-mining GPUs real cheap, like sub-200€. Could past mining cause some stability issues etc.?
> Also, there are pretty much only reference versions on sale. Would a 250W Arctic Cooling cooler be good on a 290? I have one on my GTX 560, overclocked to max, and my temps don't go above 43 with the fans as low as they go. Or should I look into watercooling or AIOs?


I got my R9 290 for sub-$200 and it had been used for mining for 2 months, though it was undervolted. The person I bought it from was still mining with it when I came to pick it up, so I got to test it out before paying him anything. But like others have said, most of the R9 290s up for sale are 2~4 months old and should still have plenty of life left in them.
Quote:


> Originally Posted by *lawson67*
> 
> Hi guys, I am running 2 R9 290 cards, a Powercolor PCS+ and a Gigabyte Windforce R9 290, in Crossfire and they have been running great together for a few months!....However, for whatever reason the Gigabyte card gave up putting out a signal to the monitor!...It was the top card, above the Powercolor card, so as to have a nice gap for cooling seeing as the Powercolor is a triple-slot card...So once it gave up putting out a signal to the monitor I tried it as the bottom card and still no joy; it was indeed dead, even though the fans still turned on it!
> 
> I have RMA'd the old one and bought a new Gigabyte card!...However, I believe this card is faulty also!...If run alone in the top slot looping Heaven 4.0 it keeps crashing the drivers, black screens, and sometimes crashes the drivers completely, leaving a stuck looping Heaven audio loop in the background forcing me to hard reset...which is hardly ideal!....Also any kind of overclock or even a slight voltage boost crashes this card!...Also if I try to Crossfire it with the Powercolor card below it I get the same result....BTW the Powercolor card runs for hours in the top slot alone looping Heaven no problem...However if I run the Powercolor as the top card and place the Gigabyte below the Powercolor it seems fine and happy...and it even overclocks?....
> 
> There is clearly a fault with this card guys, and I have heard about some R9 290 cards black screening but do not know how this is solved?...Also why would it run fine in Crossfire as the bottom card and not the top card without crashing the drivers (amdkmdap stopped responding and has successfully recovered)?..And why will it not run alone without doing the same..And why is it fine as the bottom card but *not* the top card in a Crossfire configuration?....Any thoughts or help guys would be well appreciated


I've seen the black screening issue associated with Windows 8/8.1, so that may be one cause. I do find it weird that your Powercolor card works as intended, but not the Gigabyte card.

Have you tried both DVI ports on your Gigabyte card and is your Qnix overclocked? If so, make sure your AMD patcher and CRU are updated to the latest versions. I use the DVI port (next to the HDMI port) for my R9 290 and I have no problems going to 120Hz.


----------



## Forceman

Quote:


> Originally Posted by *Roboyto*
> 
> Phew
> 
> 
> 
> 
> 
> 
> 
> I was not looking forward to having to reverse my AIO upgrade and return yet another card.
> 
> Just upgraded a friends PC and we got 2 bunk cards back to back...all is well now with his shiny new VisionTek 280X. Never personally handled/owned a VisionTek until he got this card, and I must say I am impressed with the cooling capabilities. It looks like a cheapo, but the heatpipes are enormous and it does a swell job of quietly cooling the 280X; even at a fairly high OC of 1195/1700.
> 
> Hmmm...will have to switch over to the HTPC for a while and see if it happens today. I did a fresh Windows install and opted for 14.4 since they were official release.
> 
> Still have almost 2 weeks to return the card if it is the cause...will play it by ear and see what happens.
> 
> Thank you both for your input.


Mine does the same thing - fine on 13.12 and borked on 14.x. It's a pretty common problem.


----------



## lawson67

Quote:


> Originally Posted by *Kokin*
> 
> I got my R9 290 for sub-$200 and it was used for mining for 2 months, though it was undervolted. The person I bought it from was still mining with it when I came to pick it up, so I got to test it out before paying him anything. But like others have said, most of the R9 290s up for sale are 2~4 months old and should still have plenty of life left in them.
> I've seen the black screening issue associated with Windows 8/8.1, so that may be one cause. I do find it weird that your Powercolor card works as intended, but not the Gigabyte card.
> 
> Have you tried both DVI ports on your Gigabyte card and is your Qnix overclocked? If so, make sure your AMD patcher and CRU are updated to the latest versions. I use the DVI port (next to the HDMI port) for my R9 290 and I have no problems going to 120Hz.


Yes, I have tried both DVI-D ports on the Gigabyte, with the Qnix not overclocked and the drivers not patched...and Afterburner not installed, with a fresh clean driver install; in fact every single combination you could ever think of trying, however still no go?...It's very frustrating indeed and I can only believe the card has a fault, as running alone it sometimes crashes the drivers whereas the Powercolor never does...so it's not Windows, as I had 2 running fine for months...it's also not the ToastyX patch....The strange thing is that it will run only as the slave card in a Crossfire situation, and it will also overclock in that situation; any other situation it doesn't wanna know!....It's very frustrating


----------



## Jflisk

Quote:


> Originally Posted by *kayan*
> 
> Hey all, so I (my wife really) am having a problem similar to BatGirl's from a few days back. I remember reading that someone suggested downloading WhoCrashed and running it.
> 
> About 1 month ago my wife started having problems with video flickering in and out every so often whenever she starts any streaming video program, whether it's a W8 App, or a browser streaming (The CW, Amazon Prime, Netflix, etc...). The screen would go completely black off and on. It would only occur when starting some streaming service, and it would continue to go black and then back on until she reboots her PC. When the screen turns black it stays like that for about 5-6 seconds and then fine for a random amount of time, until it goes black again.
> 
> Today she got her first BSOD caused by this. This is the crashdump according to whoscrashed:
> 
> Crash Dump Analysis
> 
> Crash dump directory: C:\WINDOWS\Minidump
> 
> Crash dumps are enabled on your computer.
> 
> On Mon 5/5/2014 5:59:37 PM GMT your computer crashed
> crash dump file: C:\WINDOWS\Minidump\050514-5828-01.dmp
> This was probably caused by the following module: ntoskrnl.exe (nt+0x153FA0)
> Bugcheck code: 0xA0000001 (0x5, 0x0, 0x0, 0x0)
> Error: CUSTOM_ERROR
> file path: C:\WINDOWS\system32\ntoskrnl.exe
> product: Microsoft® Windows® Operating System
> company: Microsoft Corporation
> description: NT Kernel & System
> The crash took place in the Windows kernel. Possibly this problem is caused by another driver that cannot be identified at this time.
> 
> Her PC is as follows:
> Gigabyte 990FX UD3
> AMD FX 8320 (stock)
> XFX 5830 (more on this later)
> 8GB of Corsair 1600 ram
> OCZ 850w Gold PSU
> 
> So, I originally thought that it was her video card, or a bad connection cord from her PC to monitor. I bought her a 290X to replace her ancient GPU, and we installed it last night, properly disposing of the old drivers first. Installed the card, downloaded the 14.4 drivers, and the BSOD happened today. It has happened with 2 different monitors (hers and mine) using a DVI-D connection. I'm at a loss. Any suggestions?


Find BlueScreenView and download it, open the minidump, and see where the problem might lie.

BlueScreenView link:
http://www.nirsoft.net/utils/blue_screen_view.html
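
As a side note for anyone triaging several of these reports at once: the interesting fields can be scraped out with a short script. This is only a rough sketch assuming the plain-text layout WhoCrashed printed above; the `report` string below is just the quoted example.

```python
import re

# Sample text in the layout quoted above (one WhoCrashed-style report).
report = """On Mon 5/5/2014 5:59:37 PM GMT your computer crashed
crash dump file: C:\\WINDOWS\\Minidump\\050514-5828-01.dmp
This was probably caused by the following module: ntoskrnl.exe (nt+0x153FA0)
Bugcheck code: 0xA0000001 (0x5, 0x0, 0x0, 0x0)
Error: CUSTOM_ERROR"""

def parse_report(text):
    """Pull the suspect module and bugcheck code out of one report."""
    module = re.search(r"following module: (\S+)", text)
    bugcheck = re.search(r"Bugcheck code: (0x[0-9A-Fa-f]+)", text)
    return {
        "module": module.group(1) if module else None,
        "bugcheck": bugcheck.group(1) if bugcheck else None,
    }

info = parse_report(report)
print(info)  # {'module': 'ntoskrnl.exe', 'bugcheck': '0xA0000001'}
```

For what it's worth, bugcheck 0xA0000001 is reportedly a custom AMD code, so the finger points at the display driver (atikmdag.sys) rather than ntoskrnl itself, even though the kernel module is what appears in the dump.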


----------



## eternal7trance

Does anyone know which BIOS to install for performance mode on a gigabyte 290? The website is really vague but it does have some BIOS updates for my card.

http://www.gigabyte.com/products/product-page.aspx?pid=4884#bios


----------



## lawson67

Quote:


> Originally Posted by *Forceman*
> 
> Mine does the same thing - fine on 13.12 and borked on 14.x. It's a pretty common problem.


Before the other Gigabyte blew up it was just fine on 14.x drivers and overclocked great up to 120Hz, so I just can not believe it's that either!...It's just a duff card and also needs to be returned; I believe it may have a problem with the VRM chokes and is not getting clean voltage!...

Also if it was the 14.x drivers the Powercolor would crash being the top card with the Gigabyte below...But no, it's fine in this configuration, with the Powercolor card carrying its sorry ass lol


----------



## lawson67

Quote:


> Originally Posted by *eternal7trance*
> 
> Does anyone know which BIOS to install for performance mode on a gigabyte 290? The website is really vague but it does have some BIOS updates for my card.
> 
> http://www.gigabyte.com/products/product-page.aspx?pid=4884#bios


The top F4 BIOS worked just fine on my once-working Gigabyte Windforce card before it blew up!....HMM, did I just write that!


----------



## eternal7trance

Quote:


> Originally Posted by *lawson67*
> 
> The top F4 bios worked just fine on my once working Gigabyte windforce card before it blow up!....HMM did i just write that!


That doesn't make me very confident lol


----------



## lawson67

Quote:


> Originally Posted by *eternal7trance*
> 
> That doesn't make me very confident lol


Yes, after I wrote that I looked back at what I had just written and thought!....Something does not sound right with that statement









BTW: The performance mode BIOS on the Gigabyte is the F4L one


----------



## kizwan

Quote:


> Originally Posted by *the9quad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roboyto*
> 
> Anyone experience black screen video loss on 14.4? Screen goes blank/black and only a reboot fixes it?
> 
> It happened twice and I have since switched back to 13.12 and all seems to be well, but it's only been a day and a half. Just wondering if I need to be concerned with my "new" open box 290 from NewEgg, or if others have experienced this.
> 
> 
> 
> its purely an issue with the 14.4 and the press leaked drivers. No worries man.
Click to expand...

Really? No problem here with the 14.4 beta & WHQL drivers. Most of the time the Windows power plan is set to Balanced & sometimes to High Performance. The monitor is also set to turn off after 10/15 minutes. No problem at all either under load or idle (single & Crossfire mode).

Is this when idle or what?

*[EDIT]* With the 14.4 beta & WHQL drivers, whenever the monitor is turned off/on, whether it auto turns off when idle or is turned off manually, I hear device disconnect & re-connect notifications. I don't recall it behaving like this with the 14.2 beta & earlier drivers though.

Also, a couple of times last night, the monitor didn't turn off when idle like it's supposed to. The screen was black but the backlight was still on & I could see the mouse pointer on the screen. However when I moved the mouse, the monitor woke up, so to speak, without any problem.


----------



## Forceman

Quote:


> Originally Posted by *lawson67*
> 
> Before the other Gigabyte blew up it was just fine on 14.x drivers and overclocked great up to 120Hz, so I just can not believe it's that either!...It's just a duff card and also needs to be returned; I believe it may have a problem with the VRM chokes and is not getting clean voltage!...
> 
> Also if it was the 14.x drivers the Powercolor would crash being the top card with the Gigabyte below...But no, it's fine in this configuration, with the Powercolor card carrying its sorry ass lol


I was referring to Roboyto's idle crash problem, which isn't unusual - your card sounds borked.
Quote:


> Originally Posted by *kizwan*
> 
> Really? No problem here with the 14.4 beta & WHQL drivers. Most of the time the Windows power plan is set to Balanced & sometimes to High Performance. The monitor is also set to turn off after 10/15 minutes. No problem at all either under load or idle (single & Crossfire mode).
> 
> Is this when idle or what?
> 
> *[EDIT]* With the 14.4 beta & WHQL drivers, whenever the monitor is turned off/on, whether it auto turns off when idle or is turned off manually, I hear device disconnect & re-connect notifications. I don't recall it behaving like this with the 14.2 beta & earlier drivers though.
> 
> Also, a couple of times last night, the monitor didn't turn off when idle like it's supposed to. The screen was black but the backlight was still on & I could see the mouse pointer on the screen. However when I moved the mouse, the monitor woke up, so to speak, without any problem.


It's a problem at idle, and for me at least, it only happens when it's been idle for a long time - it doesn't happen if I try to wake it up after 30 minutes or something, but after 2 hours it's locked about 30% of the time. My wife's card (a 7770) does the black screen with the mouse pointer thing - I haven't been able to figure that one out either. Just seems like AMD has some issues with idle on their cards - maybe it has something to do with ULPS (even though I have it disabled) or PowerPlay.


----------



## Samurai Batgirl

For anyone who cares: I fixed my crashing problem.

It had to do with atikmdag.sys and HTML5 on Waterfox not getting along...or something. Anyway, I uninstalled the HTML5 add-on and everything's fine.


----------



## Dire Squirrel

So, I have decided to go with a 290 for my new GPU. Seems to have a very good price/performance ratio and it more than covers my needs (my current 7870XT actually covers my needs).

The currently best priced 290 around these parts, is this: VTX3D r9 290 x-edition v2
From what I hear, VTX3D 290's are also consistently good, so it seems to be the way to go.

I am seriously considering going full water cooling at the same time, which obviously makes card selection a bit more critical. It needs to fit a block after all.
This is the block I would like to use: EK-FC R9-290X - Acetal
EK's "Cooling Configurator" gave its standard "visual" answer (big surprise).

The problem is that I simply can't find any source (good or bad) that can verify whether it will work or not. Videocardz.com mentioned that the PCB is "slightly different than reference". That was about the 290X version of the same model, but as I understand it, the 290 and 290X are the same PCB. Please correct me if I'm wrong here.
There is no word anywhere on WHAT the difference is, but given the EK visual "confirmation" (for whatever that's worth) it may well be within reference parameters.

I seem to recall that the numbering on the PCB follows industry standard, but I have no clue how to read it. This is about as good a shot as can be found (that I can verify as being the right card):



Does that mean anything to anyone?

If anyone knows, this is the place. Lay some knowledge on me








Thanks in advance.


----------



## Zamoldac

Anybody here knows a Bios editor that will work on these cards?


----------



## Forceman

Quote:


> Originally Posted by *Dire Squirrel*
> 
> So, I have decided to go with a 290 for my new GPU. Seems to have a very good price/performance ratio and it more than covers my needs (my current 7870XT actually covers my needs).
> 
> The currently best priced 290 around these parts, is this: VTX3D r9 290 x-edition v2
> From what I hear, VTX3D 290's are also consistently good, so it seems to be the way to go.
> 
> I am seriously considering going full water cooling at the same time, which obviously makes card selection a bit more critical. It needs to fit a block after all.
> This is the block I would like to use: EK-FC R9-290X - Acetal
> EK's "Cooling Configurator" gave its standard "visual" answer (big surprise).
> 
> The problem is that I simply can't find any source (good or bad) that can verify whether it will work or not. Videocardz.com mentioned that the PCB is "slightly different than reference". That was about the 290X version of the same model, but as I understand it, the 290 and 290X are the same PCB. Please correct me if I'm wrong here.
> There is no word anywhere on WHAT the difference is, but given the EK visual "confirmation" (for whatever that's worth) it may well be within reference parameters.
> 
> I seem to recall that the numbering on the PCB follows industry standard, but I have no clue how to read it. This is about as good a shot as can be found (that I can verify as being the right card):
> 
> 
> 
> Does that mean anything to anyone?
> 
> If anyone knows, this is the place. Lay some knowledge on me
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks in advance.


How much more expensive is a reference card? Unless it's a lot, I'd just get a reference card that you know will fit the waterblock, rather than taking a chance on the VTX3D card having a taller capacitor or something that blocks it. Otherwise you run the risk of having to return either the card or block.


----------



## 1rad3

Does anyone know if there are any blocks to fit the Sapphire R9 290X Vapor-X OC 8192MB? Did they just change the ram density or is the pcb completely custom?


----------



## Matt-Matt

Quote:


> Originally Posted by *1rad3*
> 
> Does anyone know if there are any blocks to fit the Sapphire R9 290X Vapor-X OC 8192MB? Did they just change the ram density or is the pcb completely custom?


It has rear RAM chips as far as I know, so you'd need some form of heatsink or backplate for it to start with.

Pretty sure it's ref as the reference PCB has spots on the back for the extra chips.


----------



## Sgt Bilko

Didn't they say the 8GB versions aren't going to happen?


----------



## 1rad3

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Didn't they say the 8GB versions aren't going to happen?


http://www.overclockers.co.uk/showproduct.php?prodid=GX-353-SP


----------



## Sgt Bilko

Quote:


> Originally Posted by *1rad3*
> 
> http://www.overclockers.co.uk/showproduct.php?prodid=GX-353-SP


Well then, looks like OcUK snagged the only ones.

As for waterblocks I really couldn't say. If it is a custom PCB (likely) and there is only a limited number of them, then no one will make blocks.

The Vapor-X cooler performs well though, if that's any consolation


----------



## rdr09

Quote:


> Originally Posted by *1rad3*
> 
> Does anyone know if there are any blocks to fit the Sapphire R9 290X Vapor-X OC 8192MB? Did they just change the ram density or is the pcb completely custom?


The 290X Vapor-X might have a similar PCB to the 290; the VRM1 is completely redesigned from the reference . . .


----------



## 1rad3

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well then, looks like OcUK snagged the only ones.
> 
> As for waterblocks i really couldn't say, If it is a custom PCB (likely) and there is only a limited number of them then no-one will make blocks.
> 
> The Vapor-X cooler performs well though if thats any consolation


Quote:


> Originally Posted by *rdr09*
> 
> the 290X VaporX might have a similar pcb to the 290. the vrm1 is completely redesigned from the reference . . .


A shop in Germany sells them as well, but I think they are partnered with OcUK. I checked with the configurator, nothing yet! But 8GB! Modded games delight







But then again, Daylight uses up north of 3GB @ 1080p







if that is indicative of UE4 we're gonna be having problems, haha! Let's hope that's just bad optimization. Then again, there are rumors of the GTX 880 having 8GB as well, and from what I understand, that's not even the flagship GPU of the series. Hard time to pick a GPU!


----------



## Dire Squirrel

Quote:


> Originally Posted by *1rad3*
> 
> A shop in Germany sells them as well, but I think they are partnered with OcUK. I checked with the configurator, nothing yet! But 8GB! Modded games delight
> 
> 
> 
> 
> 
> 
> 
> But then again, Daylight uses up north of 3GB @ 1080p
> 
> 
> 
> 
> 
> 
> 
> if that is indicative of UE4 we're gonna be having problems, haha! Let's hope that's just bad optimization. Then again, there are rumors of the GTX 880 having 8GB as well, and from what I understand, that's not even the flagship GPU of the series. Hard time to pick a GPU!


Caseking.de has them: http://www.caseking.de/shop/catalog/Graphics-Card/AMD-Graphics-Card/RADEON-R9-290-Series/Sapphire-Radeon-R9-290X-Vapor-X-Tri-X-OC-8192-MB-DDR5-DP-HDMI::27272.html
Or at least they're on their way.


----------



## 1rad3

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Caseking.de has them: http://www.caseking.de/shop/catalog/Graphics-Card/AMD-Graphics-Card/RADEON-R9-290-Series/Sapphire-Radeon-R9-290X-Vapor-X-Tri-X-OC-8192-MB-DDR5-DP-HDMI::27272.html
> Or at least they're on their way.


Yeap! But at an extreme price premium imho. The k|ngp|n costs 765 Euros and the 290X Lightning 530. That's an extra 200 for memory. :/


----------



## Dire Squirrel

Quote:


> Originally Posted by *Forceman*
> 
> How much more expensive is a reference card? Unless it's a lot, I'd just get a reference card that you know will fit the waterblock, rather than taking a chance on the VTX3D card having a taller capacitor or something that blocks it. Otherwise you run the risk of having to return either the card or block.


At this point, full reference designs actually tend to be a fair bit more expensive.
But I just stumbled upon a Club3D 290 for only about $485.
As far as I can see, it should be a reference PCB. But I would be happy if someone could verify that for me.


----------



## Matt-Matt

Quote:


> Originally Posted by *Dire Squirrel*
> 
> At this point, full reference designs actually tend to be a fair bit more expensive.
> But I just stumbled upon a Club3D 290 for only about $485.
> As far as I can see, it should be a reference PCB. But I would be happy if someone could verify that for me.


If it has the AMD logo above the PCI-E slot it's reference.


----------



## Dire Squirrel

Quote:


> Originally Posted by *Matt-Matt*
> 
> If it has the AMD logo above the PCI-E slot it's reference.


I had the store check, and it does not. It has "LF R29F".

It is model no. CGAX-R9298, which from what I can tell is reference. It has the reference cooler. I do not know what the "LF R29F" means.


----------



## bond32

Check out the EK WB cooling configurator too. It should show whether the PCB is a reference design.


----------



## heroxoot

I'm starting to want to water cool because of this fan control problem. I hate seeing my GPU hit 85°C on its small OC. Since my explanations are bad I took a pic for another forum. This is what I see.



Seems this is a problem with the MSI Gaming cards, period.


----------



## Miz3r

Hey guys, I need some assistance here.

I currently have an Asus GTX Titan and I am looking at getting a 290X.

This one in particular:

http://www.club-3d.com/index.php/products/reader.en/product/radeon-r9-290x-royalking.html

Is it worth the upgrade considering I run a Qnix 27 at 120Hz at 2560x1440?

Cheers


----------



## 1rad3

Quote:


> Originally Posted by *Miz3r*
> 
> Hey guys, I need some assistance here.
> 
> I currently have an Asus GTX Titan and I am looking at getting a 290X.
> 
> This one in particular:
> 
> http://www.club-3d.com/index.php/products/reader.en/product/radeon-r9-290x-royalking.html
> 
> Is it worth the upgrade considering I run a Qnix 27 at 120Hz at 2560x1440?
> 
> Cheers


what are you looking to do with the card? gaming? what titles and what kind of frame rates are you comfortable with?


----------



## DeadlyDNA

Quote:


> Originally Posted by *Miz3r*
> 
> Hey guys, I need some assistance here.
> 
> I currently have an Asus GTX Titan and I am looking at getting a 290X.
> 
> This one in particular:
> 
> http://www.club-3d.com/index.php/products/reader.en/product/radeon-r9-290x-royalking.html
> 
> Is it worth the upgrade considering I run a Qnix 27 at 120Hz at 2560x1440?
> 
> Cheers


Is that an upgrade? You're losing 2GB of VRAM, and depending on the game/bench they will trade blows.


----------



## Miz3r

Purely for Battlefield 4 and FPS games etc.

I am used to getting around 70+ FPS on my current config with my Titan.

So really now it's deciding between the two, and which will give me the best performance and fps.


----------



## Dire Squirrel

Quote:


> Originally Posted by *bond32*
> 
> Check out the EK WB cooling configurator too. It should show if the pcb is a reference design.


EK's WBCC is notoriously vague.

But I emailed Club3D, and they confirmed that it is an AMD reference PCB. So it is going straight in the shopping cart. At that price it would be crazy not to get it.


----------



## sugarhell

Quote:


> Originally Posted by *heroxoot*
> 
> I'm starting to want to water cool because of this fan control problem. I hate seeing my GPU hit 85°C on its small OC. Since my explanations are bad I took a pic for another forum. This is what I see.
> 
> 
> 
> Seems this is a problem with the MSI Gaming cards, period.


You need a new BIOS. Go to the MSI forums and ask for a new one with fan control.


----------



## 1rad3

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Is that an upgrade? You're losing 2GB of VRAM, and depending on the game/bench they will trade blows.


+1; unless you can sell the Titan @ a good price, you will probably be losing money. And
Quote:


> Originally Posted by *Miz3r*
> 
> Purely for Battlefield 4 and FPS games etc.
> 
> I am used to getting around 70+ FPS on my current config with my Titan.
> 
> So really now it's deciding between the two, and which will give me the best performance and fps.


The 290X could be faster in some titles - especially AMD-optimized ones - but the difference is marginal at best. I would keep the money and wait for something more enticing.


----------



## rdr09

Quote:


> Originally Posted by *Miz3r*
> 
> Hey guys, I need some assistance here.
> 
> I currently have an Asus GTX Titan and I am looking at getting a 290X.
> 
> This one in particular:
> 
> http://www.club-3d.com/index.php/products/reader.en/product/radeon-r9-290x-royalking.html
> 
> Is it worth the upgrade considering I run a Qnix 27 at 120Hz at 2560x1440?
> 
> Cheers


Is your CPU OC'd? If not, then try that first. 4.5GHz.


----------



## Miz3r

Well it's a straight trade with a friend - his 290X for my Titan - so it's no money loss at all for me,

But just trying to justify whether it is worth the trade or not.

rdr09, I am planning on OCing the CPU this weekend


----------



## rdr09

Quote:


> Originally Posted by *Miz3r*
> 
> Well it's a straight trade with a friend - his 290X for my Titan - so it's no money loss at all for me,
> 
> But just trying to justify whether it is worth the trade or not.
> 
> rdr09, I am planning on OCing the CPU this weekend


For BF4, especially MP, you need to OC the CPU when paired with a strong card like the Titan.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Zamoldac*
> 
> Anybody here knows a Bios editor that will work on these cards?


Why don't you just flash one of the 3 BIOSes from Asus? Standard 290X, PT1T or PT1. The last 2 are unlocked.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well then, looks like OcUK snagged the only ones.
> 
> As for waterblocks i really couldn't say, If it is a custom PCB (likely) and there is only a limited number of them then no-one will make blocks.
> 
> The Vapor-X cooler performs well though if thats any consolation


Holy crap dude, in the UK they want nearly $900 AUD for it.


----------



## eternal7trance

I don't know what Gigabyte uses for their paste, but it was so thin, and my Windforce 290 was running really hot. I redid the paste and it went from 88°C load to 78°C load.


----------



## heroxoot

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> I'm starting to want to water cool with this fan control problem. I hate seeing my GPU get 85c on its small OC. Since my explanations are bad I took a pic for another forum. This is what I see.
> 
> 
> 
> Seems this is a problem with the MSI Gaming cards period.
> 
> 
> 
> You need a new bios. Go to msi forums and ask a new one with fan control
Click to expand...

I even gave them the S/N for my card and I have had no replies past a mod asking for the S/N. It is an unhelpful place and very slow for anything.


----------



## Zamoldac

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Why don't you just flash one of the 3 BIOSes from Asus? Standard 290X, PT1T or PT1. The last 2 are unlocked.


I also want to save my final settings to the card


----------



## bluedevil

Sweet club. In! Gonna do the Red mod soon, but different. Watch out, it's not like any other red mod.


----------



## Arizonian

Quote:


> Originally Posted by *bluedevil*
> 
> Sweet club. In! Gonna do the Red mod soon, but different. Watch out, it's not like any other red mod.


Thank you.

You piqued our interest, post back.


----------



## Crossed

New here, but I've been stalking this thread for a bit. I'm about to pull the trigger on a Vapor-X R9 290 - could you give the length, width and height? Mini-ITX builds can be a pain!


----------



## 1rad3

Quote:


> Originally Posted by *Crossed*
> 
> New here, but I've been stalking this thread for a bit. I'm about to pull the trigger on a Vapor-X R9 290 - could you give the length, width and height? Mini-ITX builds can be a pain!


305 (L) x 114 (W) x 47 (H) mm is what I got off http://www.sapphiretech.com/presentation/product/product_index.aspx?cid=1&gid=3&sgid=1227&pid=2167&psn=000101&lid=1
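
Since mini-ITX clearances keep coming up, those dimensions are easy to sanity-check in a few lines. The case numbers below are made up for illustration; substitute the clearances from your own case's spec sheet.

```python
# Card dimensions from Sapphire's spec page, in mm (L x W x H).
CARD = {"length": 305, "width": 114, "height": 47}

# Hypothetical case clearances in mm -- replace with your case's spec sheet values.
CASE = {"length": 320, "width": 140, "height": 60}

def card_fits(card, case):
    """True only if every card dimension is within the case clearance."""
    return all(card[k] <= case[k] for k in card)

print(card_fits(CARD, CASE))  # True
```

Note the 47 mm height in particular: at roughly 20 mm per expansion slot, that's a 2.5-slot cooler, which can rule out some mini-ITX cases on slot count alone even when the length clears.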


----------



## 250Gimp

Hi All

I'm new to this forum so thought I would start here since I just got a used xfx R9 290, reference spec.

I didn't like the noise of the fan, so I just hooked up an NZXT G10 with a cheap Corsair H55 to cool.

WAY quieter!!!

Here is a screen shot from GPU-Z, and a pic of the card in my case.

Now to start thinking about a mild OC!!





Edit: here is my GPU-Z link.

http://www.techpowerup.com/gpuz/nfvc5/

Cheers


----------



## 1rad3

Quote:


> Originally Posted by *250Gimp*
> 
> Hi All
> 
> I'm new to this forum so thought I would start here since I just got a used xfx R9 290, reference spec.
> 
> I didn't like the noise of the fan, so I just hooked up an NZXT G10 with a cheap Corsair H55 to cool.
> 
> WAY quieter!!!
> 
> Here is a screen shot from GPU-Z, and a pic of the card in my case.
> 
> Now to start thinking about a mild OC!!
> 
> 
> 
> 
> 
> Cheers


Nice! Keep an eye on the VRM temps though; that could be an issue with the G10


----------



## Crossed

Thanks 1rad3 - I've just been a bit paranoid about GPU size after I almost bought the PCS+ based on Newegg's (and PowerColor's) wrong specs on their websites. If anybody has the card (Vapor-X) in person, could you confirm that these dimensions are right?
Thanks


----------



## 250Gimp

Quote:


> Originally Posted by *1rad3*
> 
> Nice! Keep an eye on the VRM temps though; that could be an issue with the G10


Thanks 1rad3!!

I have noticed the VRM temp is up a bit, but I don't plan on a major overclock.

I am going to run it as is for now, and may order some heatsinks to boost its cooling later.


----------



## phantomowl

I already tried Dark Souls 2 with GeDoSaTo and SweetFX.

I average 39fps.

I've got an R9 290 and an i5 4670K and am running the game at 1080p, downsampled from 3840x2160.

Is this normal?
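
For context on why the frame rate drops so much: downsampling from 2160p means the GPU renders four times the pixels of native 1080p before scaling down. A quick back-of-the-envelope check (the 150fps baseline below is an illustrative number, not a benchmark):

```python
def pixel_ratio(render, native):
    """How many times more pixels the render resolution has vs. native."""
    rw, rh = render
    nw, nh = native
    return (rw * rh) / (nw * nh)

ratio = pixel_ratio((3840, 2160), (1920, 1080))
print(ratio)  # 4.0

# Very rough expectation: a fill-rate-bound framerate scales ~1/ratio,
# so a card doing ~150fps at native 1080p might manage ~37fps downsampling.
print(150 / ratio)  # 37.5
```

So with the GPU effectively rendering at 4K, an average around 39fps is in the plausible range rather than a sign something is broken.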


----------



## Mega Man

Quote:


> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> I'm starting to want to water cool with this fan control problem. I hate seeing my GPU get 85c on its small OC. Since my explanations are bad I took a pic for another forum. This is what I see.
> 
> 
> 
> Seems this is a problem with the MSI Gaming cards period.
> 
> 
> 
> You need a new bios. Go to msi forums and ask a new one with fan control
> 
> Click to expand...
> 
> I even gave them my S/N for my card and I have no replies past a Mod asking for S/N. It is an unhelpful place and very slow for anything.
Click to expand...

Really? Took them around 24 hours to respond to my email. Try emailing them.
Quote:


> Originally Posted by *bluedevil*
> 
> Sweet club. In! Gonna do the Red mod soon, but different. Watch out, it's not like any other red mod.


I have to say I find it ironic that you are doing a "red" mod and you are a "blue" devil


----------



## heroxoot

Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> I'm starting to want to water cool with this fan control problem. I hate seeing my GPU get 85c on its small OC. Since my explanations are bad I took a pic for another forum. This is what I see.
> 
> 
> 
> Seems this is a problem with the MSI Gaming cards period.
> 
> 
> 
> You need a new bios. Go to msi forums and ask a new one with fan control
> 
> Click to expand...
> 
> I even gave them my S/N for my card and I have no replies past a Mod asking for S/N. It is an unhelpful place and very slow for anything.
> 
> Click to expand...
> 
> really? took them around 24 hours to respond to my email. try emailing them
> Quote:
> 
> 
> 
> Originally Posted by *bluedevil*
> 
> Sweet club. In! Gonna do the Red mod soon, but different. Watch out, it's not like any other red mod.
> 
> Click to expand...
> 
> i have to say i find it ironic you are doing a "red" mod and you are a "blue"devil
Click to expand...

I actually called MSI straight up and the guy told me there is no newer BIOS currently, and he has no idea why neither MSI AB nor the MSI Gaming App can control my fan.


----------



## Mega Man

Quote:


> Originally Posted by *heroxoot*
> 
> I actually called MSI straight up, and the guy told me there is no newer BIOS currently, and he has no idea why neither MSI Afterburner nor the MSI Gaming App can control my fan.

So just ask for a new copy, it's worth a shot.


----------



## heroxoot

Quote:


> Originally Posted by *Mega Man*
> 
> So just ask for a new copy, it's worth a shot.

I doubt that would even work. I tried older BIOSes for my specific 290X from TechPowerUp's VGA collection, but they all made my GPU crash at one time or another, so I quit using them. I'll have to bug MSI again tomorrow.


----------



## sugarhell

At least for the Lightning it worked.

PS: Megaman, you owe a dollar to someone


----------



## Roboyto

Quote:


> Originally Posted by *Dire Squirrel*
> 
> At this point, full reference designs actually tend to be a fair bit more expensive.
> But I just stumbled upon a Club3D 290 for only about $485.
> As far as I can see, it should be a reference PCB, but I would be happy if someone could verify that for me.


Send EK or VTX an e-mail. EK is usually very quick with their responses, in my experience.









Quote:


> Originally Posted by *250Gimp*
> 
> Hi All
> 
> I'm new to this forum so thought I would start here since I just got a used xfx R9 290, reference spec.
> 
> I didn't like the noise of the fan, so I just hooked up an NZXT G10 with a cheap Corsair H55 to cool.
> 
> WAY quieter!!!
> 
> Here is a screen shot from GPU-Z, and a pic of the card in my case.
> 
> Now to start thinking about a mild OC!!
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers


Very nice, just did my 290 with a G10 and Gelid VRM kit.

Quote:


> Originally Posted by *1rad3*
> 
> nice! keep an eye on the vrm temps though, that could be an issue with the g10


The Gelid VRM upgrade kit with the G10 keeps that VRM1 respectably cool, even with a mild OC of 1075/1375 +37mV on my PowerColor reference.

*Gelid R9 290(X) VRM Kit*



*Gelid Kit + 3M 8810 RAM Sinks*



*PowerColor 'OC' Stock 975/1250*



*1075/1375 +37mV*



Those VRM1 temps are with the NZXT fan pulling hot air off VRMs since this card is in a Corsair 250D









The nice 'Tetris Block' VRM2 sink does very well for not getting any airflow over it in my case.

Quote:


> Originally Posted by *Crossed*
> 
> New here, but I've been stalking this thread for a bit. I'm about to pull the trigger on a Vapor X r9 290 - could you give the length, width and height - mini itx builds can be a pain!


They can be a pain, but when you have that much power in such a small package it's worth the extra effort. My Mini-ITX build is mostly complete; I still have a few things pending if you want to scope it out.

http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition

Quote:


> Originally Posted by *heroxoot*
> 
> I even gave them my S/N for my card and I have no replies past a Mod asking for S/N. It is an unhelpful place and very slow for anything.


I would even try calling them. That made the RMA for my ASUS 7870 get approved with *the quickness*


----------



## heroxoot

Nah, I don't need an RMA, I need a new BIOS. But calling got me the 290X upgrade from my 7970. I just need them to fix this issue. I told the guy it doesn't work with the Gaming App or MSI AB on any driver newer than 13.12.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *heroxoot*
> 
> I doubt that would even work. I tried older bios for my specific 290X from techpowerup's VGA collection but they all make my GPU crash one time or another so I quit using it. I'll have to bug them again tomorrow.


Did you try flashing the Asus 290X BIOS or the PT1T / PT1?


----------



## heroxoot

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Did you try flashing the Asus 290X BIOS or the PT1T / PT1?

It's a custom card, so it won't work with other BIOSes. I've tried that with past cards of generally the same nature. MSI Twin Frozr cards are custom PCBs.


----------



## sugarhell

Quote:


> Originally Posted by *heroxoot*
> 
> Custom card. It won't work with other bios. I've tried such with past cards of generally the same nature. MSI cards with twin frozr are custom PCB.


No, it's a reference card with different capacitors.

Check the cooling configurator.


----------



## blue1512

Quote:


> Originally Posted by *heroxoot*
> 
> I'm starting to want to water cool with this fan control problem. I hate seeing my GPU get 85c on its small OC. Since my explanations are bad I took a pic for another forum. This is what I see.
> 
> 
> 
> Seems this is a problem with the MSI Gaming cards period.


I had exactly the same issue with a 290X Gaming. It turned out that one of the two fans just stopped spinning on its own; I had to start it with my finger. It has happened after every cold boot until now.

Too bad I bought it on eBay, so I'm not sure about an RMA.


----------



## Arizonian

Quote:


> Originally Posted by *250Gimp*
> 
> Hi All
> 
> I'm new to this forum so thought I would start here since I just got a used xfx R9 290, reference spec.
> 
> I didn't like the noise of the fan, so I just hooked up an NZXT G10 with a cheap Corsair H55 to cool.
> 
> WAY quieter!!!
> 
> Here is a screen shot from GPU-Z, and a pic of the card in my case.
> 
> Now to start thinking about a mild OC!!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers


Welcome to OCN and the 290X / 290 Owners Club - congrats - added









http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21860#post_22204820

Just add a GPU-Z validation link with your OCN name in your post would be appreciated.


----------



## Mega Man

Quote:


> Originally Posted by *sugarhell*
> 
> At least for lightning worked.
> 
> PS megaman you owe 1 dollar to someone


No I don't. First, I never agreed to the bet; second, even if I had, he said "a month after I get back to the US" (I was in China till 4/30), which would mean till 5/30 even if I did, and at the moment it's 5/7.

Don't let him sugar-talk you into anything.


----------



## bluedevil

Forgot to post this.









bluedevil
http://www.techpowerup.com/gpuz/45pn3/


----------



## Matt-Matt

Quote:


> Originally Posted by *blue1512*
> 
> I had exactly the same issue with a 290X Gaming. It turned out that one of the two fans just stopped spinning on its own; I had to start it with my finger. It has happened after every cold boot until now.
> 
> Too bad I bought it on eBay, so I'm not sure about an RMA.


An RMA should still be fine, as MSI do serial-based warranties. I didn't RMA my MSI 7950, but I asked about it and they were fine with it being returned if need be (I got mine "broken" off eBay).
Quote:


> Originally Posted by *bluedevil*
> 
> Forgot to post this.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> bluedevil
> http://www.techpowerup.com/gpuz/45pn3/


Pretty sure the submission name must match your OCN name.


----------



## bluedevil

Quote:


> Originally Posted by *Matt-Matt*
> 
> RMA should still be fine as MSI do serial based warranties. I didn't RMA my MSI 7950 but I asked about it and they were fine with it being returned if need be (got mine "Broken" off ebay)
> Pretty sure the submission name must match your OCN name.


And it does....


----------



## Matt-Matt

Quote:


> Originally Posted by *bluedevil*
> 
> Forgot to post this.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> bluedevil
> http://www.techpowerup.com/gpuz/45pn3/


Quote:


> Originally Posted by *bluedevil*
> 
> And it does....


Wow, sorry... I somehow read that as "Blue" with random numbers. It hasn't been my day today...


----------



## heroxoot

Quote:


> Originally Posted by *blue1512*
> 
> I had exactly the same issue with a 290X Gaming. It turned out that one of the two fans just stopped spinning on its own; I had to start it with my finger. It has happened after every cold boot until now.
> 
> Too bad I bought it on eBay, so I'm not sure about an RMA.

Luckily for me this isn't the issue. My fans spin perfectly, thank god, because the 280X originally given to me did not even spin at 1000 RPM. Both my fans spin fine, they just make some noise at higher RPM; not the usual jet noise, it's a little squeaky. At low speed it is completely silent though, and at jet speed it's not too bad.


----------



## Jflisk

Quote:


> Originally Posted by *Mega Man*
> 
> no i dont, first i never agreed to the bet, second even if i did, he said " a month after i get back to the us" ( i was in china till 4/30) which would mean till 5/30 even if i did and atm it is 5/7
> 
> dont let him sugar talk you into anything


Mega, did you ever get the quad 290X setup going, and if so how's it running, temps etc.?


----------



## sinnedone

So I've been working on an overclock for a Sapphire 290 and am noticing a couple of things that are different from previous generations.

The voltage adjustment in Afterburner, I've noticed, is not fixed but more of an offset. Is this correct?

It seems this card will not cause a driver dump or computer freeze/reset, but rather artifacts instead. Is this correct?

I've been using my normal apps to test the overclock, like 3DMark Vantage, 3DMark, Heaven, Valley, and playing some BF3. Any other programs anyone can recommend that will debunk an unstable overclock quickly?


----------



## sena

Guys, why does my card throttle when I increase the voltage in MSI Afterburner? Temps are below 80C and the power limit is +50%.


----------



## The Mac

What driver?

Any of the 14.x except 14.4 doesn't register the power-limit increase.


----------



## heroxoot

Quote:


> Originally Posted by *The Mac*
> 
> what driver?
> 
> any of the 14.x except the 14.4 doesnt register power limit increase.


It doesn't feel like 14.4 registers the increase either. With the limit at +50%, my card won't use more than 1.181V during usage.


----------



## The Mac

If it throttles when you put it to 0, then it's registering.


----------



## sena

Fixed by installing 14.4 WHQL.

Although I saw downclocking two or three times in Heaven.


----------



## The Mac

nice...


----------



## Dire Squirrel

Ordered my 290 this evening.
Too late to have it shipped today, but it should be here by Friday.


----------



## blue1512

Quote:


> Originally Posted by *sena*
> 
> Guys, why my card throotles when i increase voltage in msi afterburner, temps are below 80C, power limit is +50%?


AB isn't fully compatible with 14.4 yet. You can try OCing with AMD Overdrive.


----------



## piquadrat

Does anybody know whether a standard 3-pin fan can be RPM-controlled from the 4-pin fan header on a reference R9 290 PCB?
Or am I stuck with a constant 12V regardless of the current core temperature?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *piquadrat*
> 
> Anybody knows if a standard 3-pin fan can be rpm-controlled by a reference R9 290 pcb 4-pin fan header?
> Or am I stuck with constant 12V independent on current core temperature?


You're stuck with 12V, I believe.

You can grab a single-fan fan controller that will fit into a PCI bracket for pretty cheap. I have a few that have dials on them, and because they adjust voltage directly they will work with 3-pin fans.

Something like this; I just looked quickly, I'm sure you can find one cheaper.
http://www.newegg.com/Product/Product.aspx?Item=N82E16811995073


----------



## kpoeticg

GPU fans are PWM. A standard PWM fan can be controlled from the header. Most WC/modding resellers sell the GPU fan pins and connectors.


----------



## piquadrat

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> You're stuck with 12V, I believe.
> 
> You can grab a single fan fancontroller that will fit into a PCI bracket for pretty cheap. I have a few that have dials on them, and because they adjust voltage directly they will work with 3 pin fans.
> 
> Something like this, I just looked quick, I'm sure you can find one for cheaper.
> http://www.newegg.com/Product/Product.aspx?Item=N82E16811995073


But are there any working standalone solutions to do a conversion like this: PWM signal on the 4-pin header (out of the card) -> voltage signal on a 3-pin header (fan input)? I think it is doable, but is anyone selling these things?
I'm trying to auto-control the RPM of 3-pin fans using the card's header, with the card's core and VRM1/2 temperatures as the controlling variables. Can it be done in software without additional hardware fan controllers?
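For illustration, the core_temp -> fan_speed mapping those utilities apply is just piecewise-linear interpolation between curve points. A minimal sketch (generic, not any particular tool's code; the curve points are made-up examples):

```python
# Piecewise-linear fan curve: (core temp in C, fan duty in %) points,
# interpolated the way Afterburner/SpeedFan-style curves behave.
CURVE = [(30, 20), (50, 35), (70, 60), (85, 100)]

def fan_duty(temp_c, curve=CURVE):
    """Return the fan duty (%) for a core temperature, clamping at the ends."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # Linear interpolation between the two surrounding curve points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(25))   # below the curve: minimum duty
print(fan_duty(60))   # between points: interpolated
print(fan_duty(90))   # above the curve: full speed
```

Whatever drives the fan (PWM header or a voltage controller) would then poll the core temp and apply the returned duty.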


----------



## kpoeticg

It's much simpler to just buy PWM fans for the header. You don't want to voltage-control fans from your GPU; a GPU's not meant to be a fan controller. The voltages running through it can be sensitive enough without voltage-controlling fans on top of it.


----------



## The Mac

Quote:


> Originally Posted by *blue1512*
> 
> AB isn't fully compatible with 14.4 yet. You can try OC with AMD Overdrive.


It's an ADL command; if AB doesn't work, neither will CCC OD or any of the other programs, as every one of them uses ADL.


----------



## rdr09

Quote:


> Originally Posted by *kpoeticg*
> 
> It's much simpler to just buy pwm fans for the header. You don't wanna voltage control fans from your gpu. GPU's not meant to be a fan controller. The voltages running through it can be sensitive enough without voltage controlling fans on top of it.


+rep.


----------



## Mega Man

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> no i dont, first i never agreed to the bet, second even if i did, he said " a month after i get back to the us" ( i was in china till 4/30) which would mean till 5/30 even if i did and atm it is 5/7
> 
> dont let him sugar talk you into anything
> 
> 
> 
> Mega you ever get the Quad 290X set up and if so hows it running temps excetra.
Click to expand...

The last 2 cards will be here today before nine (I called and they are keeping the delivery driver out till nine tonight... ouch...) and I ordered 4x Komodos for them as well!


----------



## olil170

What do you guys think is the best non-reference, non-X 290 card out there? I'm looking to buy a new GPU and hesitating between the R9 290 and a GTX 780. Also, are any of you running two 290s on an AX860 or similar? Thanks for your advice.


----------



## 250Gimp

Quote:


> Originally Posted by *Roboyto*
> 
> The Gelid VRM upgrade kit with the G10 keeps VRM1 respectably cool, even with a mild OC of 1075/1375 +37mV on my PowerColor reference. The nice 'Tetris Block' VRM2 sink does very well for not getting any airflow over it in my case.

Thanks for the info Roboyto!

Now I just have to try to find those heat sinks semi locally.

Cheers


----------



## 250Gimp

Quote:


> Originally Posted by *Arizonian*
> 
> Welcome to OCN and the 290X / 290 Owners Club - congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21860#post_22204820
> 
> Just add a GPU-Z validation link with your OCN name in your post would be appreciated.


Thanks Arizonian.

Will add the link to original post.

Here it is now as well.

http://www.techpowerup.com/gpuz/nfvc5/

Cheers


----------



## Arizonian

Quote:


> Originally Posted by *bluedevil*
> 
> Forgot to post this.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> bluedevil
> http://www.techpowerup.com/gpuz/45pn3/


Congrats - added


----------



## Roboyto

Quote:


> Originally Posted by *250Gimp*
> 
> Thanks for the info Roboyto!
> 
> Now I just have to try to find those heat sinks semi locally.
> 
> Cheers


You're welcome

If there is nothing local, eBay is always an option for something very similar.


----------



## Spectre-

Quote:


> Originally Posted by *olil170*
> 
> What do you guys think is the best non-ref 290 non-x card out there? Looking to buy a new gpu and hesitating between the r9 290 or a gtx 780. Also, are any of you running 2 290's on an ax860 or similar? Thanks for your advice.


Sapphire Tri-X is the best non-ref card.


----------



## The Mac

Naw, Sapphire Vapor-X. Same Tri-X cooler, better parts...

The Tri-X is actually a reference board with a better cooler.


----------



## UZ7

Quote:


> Originally Posted by *Spectre-*
> 
> Sapphire Tri-X is the best non-ref card.


The Tri-X is reference with a non-reference cooler.
The Vapor-X is a non-reference PCB with a better cooler than the Tri-X.


----------



## The Mac

Actually, the Vapor-X is the same Tri-X cooler, just on a better non-reference board.


----------



## UZ7

Quote:


> Originally Posted by *The Mac*
> 
> actually, vapor x is the same tri-x cooler, just a better non reference board.


Well yeah, overall, if you add the custom PCB, custom chokes, custom VRM sinks and backplate, it becomes overall a better card for air. On top of that you can set the fans to use 1 fan on idle/2D, and the LED logo is a nice touch as well.

Which is why I think it's a better cooler lol


----------



## blue1512

Quote:


> Originally Posted by *The Mac*
> 
> actually, vapor x is the same tri-x cooler, just a better non reference board.


It's not the same Tri-X cooler; a vapor chamber was added to the Vapor-X, the same approach as any Sapphire Vapor-X edition. The "1 fan" feature is pretty dumb because it makes your card idle at more than 50C. Luckily it can be switched off, though.

As for the parts, the Vapor-X has a custom PCB, 8+8 power delivery and better VRMs.


----------



## Aussiejuggalo

I forgot that Thief came out with a Mantle patch a little while ago, so I decided to check it out, and this is what I got.

Very High, windowed, no vsync & 1080p

Mantle

Min. 58.3
Max. 94.5
Avg. 69.2

Normal

Min. 9.5
Max. 84.5
Avg. 60.2

Boosts the max and avg a little bit, but the min...







That's insane, hope more Mantle games come out soon







not like that bf4 crap
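For anyone curious how much of a jump that actually is, here's a quick sketch working out the ratios from the figures above (the minimum-FPS gain is the striking one):

```python
# Frame-rate uplift from the Mantle vs. normal (D3D) Thief runs posted above.
mantle = {"min": 58.3, "max": 94.5, "avg": 69.2}
normal = {"min": 9.5, "max": 84.5, "avg": 60.2}

for key in ("min", "max", "avg"):
    gain = mantle[key] / normal[key]
    print(f"{key}: {normal[key]} -> {mantle[key]} FPS ({gain:.2f}x)")
```

So the average only moves about 15%, but the minimum goes up roughly sixfold, which is why the stutter difference feels so large.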


----------



## Devotii

Nice, how do you enable Mantle? Is it in-game? I thought it was just "on" when the game has it, but I guess not?


----------



## piquadrat

Quote:


> Originally Posted by *kpoeticg*
> 
> It's much simpler to just buy pwm fans for the header. You don't wanna voltage control fans from your gpu. GPU's not meant to be a fan controller. The voltages running through it can be sensitive enough without voltage controlling fans on top of it.


Maybe it's not meant to, but as a matter of fact it is.
As I'm thinking of putting a not-full-cover water block on my 290, there is a problem with VRM1 and VRM2 active cooling. Using the card's own fan header is the most straightforward solution, as most controlling and monitoring utilities out there give the necessary (adjustable) function core_temperature -> card_fan_speed "out of the box".
There are almost no super-low-profile (10mm) PWM-controlled fans on the market where I live. I would have to buy one from China, which I'm trying to avoid.

The other problem is the VRM1 and VRM2 temperatures. AFAIK only two applications report them: HWiNFO and GPU-Z. Unfortunately SpeedFan 4.49 does not (core temp only).
From your observations, is there a strong correlation between the core temp and the VRM1/2 temps?
That would allow me to use the core temp as a controlling variable for the VRM cooling fans.
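One way to check that correlation yourself: log core and VRM1 temps during a load run (GPU-Z can log sensors to a file) and compute the Pearson coefficient over the two columns. A minimal sketch with made-up sample readings (the real values would come from your own log; column names vary by GPU-Z version):

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Example readings: core temp vs. VRM1 temp sampled during a load ramp.
# (Illustrative numbers only; substitute columns from your own sensor log.)
core = [45, 52, 60, 68, 75, 81]
vrm1 = [50, 61, 72, 85, 96, 104]
r = pearson(core, vrm1)
print(f"r = {r:.3f}")  # near 1.0 means core temp is a usable proxy for VRM temp
```

If r comes out close to 1 for your card and load, driving the VRM fans off the core temp should be safe enough, as long as the curve is tuned conservatively.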


----------



## Dasboogieman

Quote:


> Originally Posted by *The Mac*
> 
> actually, vapor x is the same tri-x cooler, just a better non reference board.


While the Tri-X has fantastic core temperatures, its key weakness is VRM cooling and coil whine. While it's not a fault of the Tri-X cooler, the reference VRMs get very, very hot, yet they also have a tiny surface area to disperse the heat to any heatsink. Most overclocks on the Tri-X will ultimately be limited by VRM temperatures rather than core temperatures. Additionally, the inductors on the reference design are susceptible to coil whine; I actually solved this by taping some Sekisui 5760 tape on the inductors, others solved it with hot glue.

I would expect much better core temperatures from the Vapor-X model due to the vapor chamber (not that the Tri-X was limited by core temperatures anyway). The real difference, however, is the VRMs: the improved type, composition, PCB positioning, quantity and heatsink backplate will definitely result in better temperatures (hence higher OC headroom) and much less coil whine (with the custom inductors). I'm hoping the vapor chamber makes direct contact with the VRM assembly for even better efficiency.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Devotii*
> 
> Nice, how to you enable mantle? Is it in game? I thought it was just "on" when the game has it, but I guess not?


In the launcher go to options and it should be there



Think it came in the 1.4 update


----------



## kpoeticg

Quote:


> Originally Posted by *piquadrat*
> 
> Maybe not meant to but as a matter of fact it is.
> As I'm thinking of putting not-full-cover water block on my 290 there is a problem with vrm1 and vrm2 active cooling. Using card's own fan header is the most straightforward solution as most controlling and monitoring utilities out there give necessary (adjustable) function: core_temperature -> cards_fan_speed "out of the box".
> There is almost no super low profile (10 mm) PWM controlled fans on the market in the place where I live. I would have to buy one in China which I'm trying to avoid.
> 
> The other problem is the vrm1 and vrm2 temperatures. AFAIK only two applications report them: HWiNFO and GPU-Z. Unfortunately SpeedFan 4.49 does not (only core temp).
> From your observations is there a strong correlation between core temp and vrm1/2 temps?
> That would allow me to get core temp as a controlling variable for vrm cooling fans.


The core temp, VRM1, and VRM2 are related but unrelated lol. They usually rise and fall at the same time, but they're completely separate and on different parts of the PCB. The Aquacomputer 290/X block has the best VRM cooling of the bunch; as long as you get the backplate (it doesn't need to be the active backplate), your VRM temps should stay low.

I still wouldn't try to voltage-control any fans from your card, though (or your mobo for that matter). Everything you're describing is why PWM control got popular. If you used PWM fans, you could let the card control the fan or just set the curve in Afterburner. Where do you live? Gelid, Akasa, Prolimatech, and Titan all make 120x12/15mm PWM fans. I don't know of any 120x10mm fans.

Afterburner and AIDA64 also report VRM 1 & 2 temps.


----------



## piquadrat

Quote:


> Originally Posted by *kpoeticg*
> 
> The core temp, vrm1, and vrm2 are related but unrelated lol. They usually rise and fall at the same time, but they're completely seperate and on different parts of the pcb. The Aquacomputer 290/x block has the best vrm cooling out of the bunch, as long as you get the backplate (doesn't need to be the active backplate) your vrm temps should stay low.
> 
> I still wouldn't try to voltage control any fans from your card though (or your mobo for that matter). Everything your describing is why pwm control got popular. If you used pwm fans, you could let the card control the fan or just set the slope in afterburner. Where do u live? Gelid, Akasa, Prolimatech, and Titan all make 120x12/15mm PWM fans. I don't know any 120x10mm fans.
> 
> Afterburner and Aida64 also recognizes vrm 1 & 2 temps.


I have a water block already, no need to buy another. Does anyone know if there are widely available totally passive VRM heatsinks for the 290 out there?
If the core and VRM1/2 temps are logically dependent, that suffices for me.
There are virtually no 15mm, or better 10mm, PWM fans on the market where I live. I would have to buy some Twin Frozr replacement fans from abroad (China is the cheapest place: 16 USD a piece or 22 USD for a double pack)
...and will use SpeedFan or Afterburner for core_temp -> fan_speed curve adjustments.
If there is a better way (also from the financial point of view), please let me know.


----------



## kpoeticg

Alphacool and Swiftech both make little waterblocks for RAM or VRM areas. They're meant to go with universal GPU blocks; dunno how you mount em though. Heatkiller makes 2 different-sized blocks that are meant for GPU VRMs too.

There are definitely passive heatsinks out there that'll work. You should put heatsinks on there even with fans cooling them off. Search through this thread, I've seen a bunch of 290Xs with 3rd-party heatsinks. Swiftech, Enzotech, and ModMyToys have a bunch of different sizes.



lol....


----------



## piquadrat

Quote:


> Originally Posted by *kpoeticg*
> 
> Alphacool and Swiftech both make little waterblocks for ram or vrm area's. They're meant to go with Universal GPU Blocks, dunno how you mount em though. Heatkiller makes 2 different sized blocks that are meant for gpu vrm's too.
> 
> There's definitely passive heatsinks out there that'll work. You should put heatsinks on there even with fans cooling em off. Search through this thread, i've seen a bunch of 290x's with 3rd party heatsinks. Swiftech, Enzotech, and ModMyToys have a bunch of different sizes.


I suspect (and correct me if I'm wrong) that if I want to buy blocks for the RAM and VRMs separately, the price would generally be higher than one full-cover block.
It seems that active air cooling for the VRMs and RAM, and water for the GPU (as I have a GPU block already), is the most economically justified.


----------



## kpoeticg

Yeah, I've never seen anyone actually use those little blocks lol. That card must have crazy restriction









I'd still get the little heatsinks for your VRMs though. Someone in here can probably tell you a good size if you keep checking the thread.


----------



## piquadrat

I think I'm going to follow this route concerning the VRMs: http://www.mckeemaker.com/2014/01/diy-vrm-heat-sinks-for-amd-r9-290.html


----------



## diggiddi

Quote:


> Originally Posted by *Dasboogieman*
> 
> While the Tri-X had fantastic core temperatures, it's key weakness was the VRM cooling and coil whine. While it is not a fault of the Tri-X cooler, the reference VRMs get very very hot yet they also have a tiny surface area to disperse the heat to any heatsink. Most overclocks on the Tri-X will ultimately be limited by the VRM temperatures rather than the core temperatures. Additionally, the inductors on the reference design are susceptible to coil whine, I actually solved this by taping some Sekisui 5760 tape on the inductors, others solved it with hot glue.
> 
> I would expect much better core temperatures due to the Vapor chamber (not that the Tri X was limited by core temperatures anyway) of the VaporX model. The real difference however, is the VRMs, the improved type, composition, PCB positioning, quantity and Heatsink backplate will definitely result in better temperatures (hence high OC headroom) and much less coil whine (with the custom inductors). I'm hoping that the Vapor chamber makes direct contact with the VRM assembly for even better efficiency.


So which is the best non-reference cooling solution and card?


----------



## kpoeticg

Probably the Lightning. Usually the MSI Lightning and Asus Matrix are the top cards, but the Matrix 290X hasn't gotten great reviews...


----------



## Devotii

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> In the launcher go to options and it should be there
> 
> Think it came in the 1.4 update


I don't have Thief, but I meant in general for other games (BF4 mostly). I'll have a look in the options next time I play!

Quote:


> Originally Posted by *diggiddi*
> 
> So which is the best non reference cooling solution and card


I'd say the Vapor-X, but it might not be the best £ for £ (or $ for $, etc.).
I am looking at the PCS+ for best £ for £, but I'm still researching. Not many reviews for it, sadly.


----------



## diggiddi

Yeah, the Matrix and Lightning tend to be rather E$$$$$$$pensive


----------



## phantomowl

Quote:


> Originally Posted by *diggiddi*
> 
> Yeah the matrix and lightning tend to be rather E$$$$$$$pensive


I think it's worth it if you do a lot of heavy overclocking.


----------



## Norse

Just wondering, is there anyone in the UK or EU who has a reference card and swapped the ref cooler for an aftermarket one?

I want to get my hands on a spare shroud to test something for a build I might be doing soon.


----------



## Dasboogieman

Quote:


> Originally Posted by *diggiddi*
> 
> So which is the best non reference cooling solution and card


Depends on what you define as the "best". Also you have to consider whether you want to deal with the limitations of the reference PCB or not.

Reference designs:
The best VRM temperatures I've seen, ironically enough, comes from the original blower coolers. This is because the VRM heatsink is directly beneath the blower fan. However, the crap core temperatures mean you are going to be core temperature limited before you are VRM temperature limited.

The Tri-X as already mentioned is probably your best bet at the moment for reference design if you are aware of the VRM cooling limitation.
Probably should give an honourable mention to the PowerColor PCS+ monstrosity, it technically is the better performer though apparently some reviewers found it to be significantly louder.

Non-Reference designs:
Hands down the MSI Lightning for core temperatures, just from the sheer amount of metal on the GPU (3 slots). As for the VRMs, the fact that it has 15 phases means there's more heat-dispersal area, plus the individual VRM phases don't have to work as hard, so it has a huge advantage already.

The Vapor-X is shaping up to be a more svelte alternative to the MSI Lightning; while reviews are not out yet, I expect the core and VRM cooling to be quite competitive with the Lightning.


----------



## diggiddi

Quote:


> Originally Posted by *Dasboogieman*
> 
> Depends on what you define as the "best". Also you have to consider whether you want to deal with the limitations of the reference PCB or not.
> 
> Reference designs:
> The best VRM temperatures I've seen, ironically enough, comes from the original blower coolers. This is because the VRM heatsink is directly beneath the blower fan. However, the crap core temperatures mean you are going to be core temperature limited before you are VRM temperature limited.
> 
> The Tri-X as already mentioned is probably your best bet at the moment for reference design if you are aware of the VRM cooling limitation.
> Probably should give an honourable mention to the PowerColor PCS+ monstrosity, it technically is the better performer though apparently some reviewers found it to be significantly louder.
> 
> Non-Reference designs:
> Hands down the MSI lightning for core temperatures, just the sheer amount of metal on the GPU (3 slots). As for the VRMs, just the fact that it has 15 phases means there's more heat dispersal area plus individual VRM phases don't have to work as hard, so it has a huge advantage already.
> 
> The Vapor X is shaping up to be a more svelte alternative to the MSi lightning, while reviews are not out yet, I expect the core and VRM cooling to be quite competitive with the Lightning.


That Tri X be uugly tho, congrats on your 2nd rep
So ideally one might want to slap an AIO onto a ref card with the reference fan for best results?


----------



## Dire Squirrel

Quote:


> Originally Posted by *diggiddi*
> 
> So which is the best non reference cooling solution and card


To be perfectly honest, the right way to put it would be "least terrible". Having seen and heard a vast number of different designs, I have yet to find one that I would consider tolerable.
MSI, Sapphire and Gigabyte all have cooler designs that are a great improvement over reference, but they are all still very loud, their performance does not come close to aftermarket solutions, and the price is not justified in my opinion.

If you want performance and quiet operation, you need to look at aftermarket coolers. Water would be at the top, but it's expensive and not worth it for everyone. A very close second would be high-end air coolers like Prolimatech and Alpenföhn.

Prolimatech mk-26:


A reference card (or a cheaper non-reference one) plus one of those is actually cheaper than some of the better non-reference cooler designs. And in terms of performance, only significantly more expensive custom water cooling can beat it.
And it's pretty as well.
Quote:


> Originally Posted by *Norse*
> 
> Just wondering but anyone in UK or EU that has a reference card and swapped the ref cooler for aftermarket?
> 
> I want to get my hands on a spare shroud to test something for a build i might be doing soon


I have a reference 290 and a matching waterblock on the way. What sort of mad scientist stuff are you planning?


----------



## Dasboogieman

Quote:


> Originally Posted by *diggiddi*
> 
> That Tri X be uugly tho, congrats on your 2nd rep
> So ideally one might wont to slap an aio onto a ref card with the reference fan for best results


Thanks









Leaving the ref blower on is a no-go for AIOs; apparently it interferes with most of the AIO mounting systems, so you'd probably need to be handy with a Dremel.

If you are going the AIO route, you might want to consider the ASUS 290X DirectCU II (non-reference) or the PowerColor PCS+ (reference PCB) cards, since their VRM assemblies both come with their own heatsinks that are totally independent of the main cooling unit. You can slap an Asetek AIO of your choice on a G10 bracket for excellent core and good VRM temperatures. No hardmodding always makes life easy.








The ASUS card has the advantage of a higher quantity of more efficient VRMs with a smaller heatsink, so you might get better mileage overall for large voltmods under AIO cooling.
The PCS+ has the reference 6 phases, but its VRM heatsink is bigger. You might not do quite as well as the ASUS card under an AIO, but I think this card is cheaper.


----------



## Norse

Quote:


> Originally Posted by *Dire Squirrel*
> 
> I have a reference 290 and a matching waterblock on the way. What sort of mad scientist stuff are you planning?


Nothing too odd really, just wanting to test spraying it white for a new build I am doing (Fractal Design R4 case, Gigabyte Z87-HD3 mobo, which is black), then have white SATA and PSU extension cables and white graphics cards so it's not all dark


----------



## Rainmaker91

A cool thread. I don't have a card myself yet, but I have been on the fence for a while now about getting one. What surprises me is that there aren't more water blocks mentioned on the first page (they may be mentioned elsewhere, but I'm not going to read through 2219 pages atm). From what I could gather, the AquaComputer kryographics Hawaii should be the best-performing one, so it surprised me that it's not mentioned. Anyway, I might be joining you all soon with my own build, unless I manage to hold out for Bermuda XTX before upgrading.


----------



## sena

Quote:


> Originally Posted by *sugarhell*
> 
> Guys its easy to give more volts on msi.
> 
> Just use /wi4,30,8d,10 for 100mv. The step is 6.25 mV and the offset is given in hexadecimal. So in decimal: 0x10 = 16, and 16 * 6.25 = 100 mV. For 50mv you need 8. For 200mv you need 20 (0x20 = 32 in decimal, so 32 * 6.25 = 200 mV).
> 
> The easy way to do changes:
> 
> Create a txt on desktop. Write
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi4,30,8d,10
> 
> and then save it as a .bat file. Every time you start this bat file MSI will start with +100mv
> 
> For 50mv: 8
> For 100mv:10
> For 125mv:14
> For 150mv:18
> For 175mv:1C
> For 200mv:20
> 
> I wouldn't go over this point because
> 1) You are close to leaving the sweet spot of the ref PCB VRMs' efficiency
> 2) These commands add 200mv on top of the 100mv offset through the AB GUI. That means 300mv
> 
> By default the /wi command applies to the current GPU only. So if you have 2 or more GPUs you must use the /sg command. That means the command line looks something like this
> ex: MsiAfterburner.exe /sg0 /wi4,30,8d,10 /sg1 /wi4,30,8d,10


It's not working for me; I don't know what I did wrong. I created a Notepad file, wrote the command in it, saved it as a .bat file, but nothing.
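For what it's worth, the offset arithmetic in the quoted table checks out. A short script (purely illustrative; the 6.25 mV step and hex values are taken straight from the quoted post):

```python
# Sanity-check the hex-offset table from the quoted post:
# each unit of the /wi offset is 6.25 mV, and the value is written in hex.
STEP_MV = 6.25

def offset_mv(hex_value: str) -> float:
    """Convert a hex offset (the last field of /wi4,30,8d,<hex>) to millivolts."""
    return int(hex_value, 16) * STEP_MV

# hex value -> extra millivolts, as listed in the post
table = {"8": 50, "10": 100, "14": 125, "18": 150, "1C": 175, "20": 200}
for hex_val, expected_mv in table.items():
    assert offset_mv(hex_val) == expected_mv, (hex_val, offset_mv(hex_val))
```

So the table itself is consistent; if the .bat does nothing, the problem is elsewhere (path, or the /wi slot index).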


----------



## Dire Squirrel

Quote:


> Originally Posted by *Norse*
> 
> Nothing too odd really, just wanting to test spraying it white for a new build i am doing (Fractal Design R4 case, Gigabyte Z87-HD3 Mobo which is black) then have white SATA and PSU extension cables and have white graphics cards so its not all dark


So something like this:?



It would look awesome with a few white LED's under the shroud, illuminating the blower. Small 2-3mm. ones should fit with no issues.

Should be a pretty simple mod.

Please excuse the rather hastily done PS work.


----------



## imadorkx

Hi guys.

I'm new to this forum.

I bought the r9 290 tri x early this month.. Might as well join this club.

Looking forward to learning new stuff.

Here is a screen shot from GPU-Z and the gpu itself.

Thanks guys.


----------



## Norse

Quote:


> Originally Posted by *Dire Squirrel*
> 
> So something like this:?
> 
> 
> 
> It would look awesome with a few white LED's under the shroud, illuminating the blower. Small 2-3mm. ones should fit with no issues.
> 
> Should be a pretty simple mod.
> 
> Please excuse the rather hastily done PS work.


Yes like that would be perfect, am pondering having "R9 290X" on the side in black though like how some of the Nvidia series says "GTX 780" on the side kinda thing


----------



## Widde

Quote:


> Originally Posted by *imadorkx*
> 
> Hi guys.
> 
> I'm new to this forum.
> 
> I bought the r9 290 tri x early this month.. Might as well join this club.
> 
> Looking forward to learn new stuff.
> 
> Here is a screen shot from GPU-Z and the gpu itself.
> 
> Thanks guys.


If you have the time/energy, there are over 20,000 posts







Not all of them info but many xD


----------



## Dire Squirrel

Quote:


> Originally Posted by *Norse*
> 
> Yes like that would be perfect, am pondering having "R9 290X" on the side in black though like how some of the Nvidia series says "GTX 780" on the side kinda thing


----------



## Norse

Quote:


> Originally Posted by *Dire Squirrel*


I was thinking the side of the card so its visible through case window


----------



## imadorkx

Quote:


> Originally Posted by *Widde*
> 
> If you have the time/energy there is over 20000 posts
> 
> 
> 
> 
> 
> 
> 
> Not all of them info but many xD


Haha. Still on page 35 or so... reading all the comments.


----------



## Sgt Bilko

Quote:


> Originally Posted by *imadorkx*
> 
> Haha. Still at page 35 or so.. reading all the comment.


Get your caffeine of choice, some good music and sit back... you're in for a long, long ride


----------



## Dire Squirrel

Quote:


> Originally Posted by *Norse*
> 
> I was thinking the side of the card so its visible through case window


I probably should have realized that. But I have been so spoiled by the glory of having a horizontal motherboard, that I almost forget that this is not the norm.


----------



## Devotii

Quote:


> Originally Posted by *Dire Squirrel*
> 
> I probably should have realized that. But I have been so spoiled by the glory of having a horizontal motherboard, that I almost forget that this is not the norm.


Got any posts with pictures? Sounds great!


----------



## Norse

Quote:


> Originally Posted by *Dire Squirrel*
> 
> I probably should have realized that. But I have been so spoiled by the glory of having a horizontal motherboard, that I almost forget that this is not the norm.


hehe it should be a fun build once funds allow


----------



## Dire Squirrel

Quote:


> Originally Posted by *Devotii*
> 
> Got any posts with pictures? Sounds great!


Check out the HAF-XB Club
That is the case I'm using.


----------



## velocityx

How is everybody feeling about OCing in Crossfire? I'm wondering about it because it's something I haven't seen with my previous 6xxx-series CF. I have two Gigabyte 290s; I've put a BIOS from a Sapphire Tri-X reference card on them, and it has a stock 1000 MHz core clock.

When I game with them both at stock, in BF4 the perfoverlay.drawgraph GPU line is a perfect green line. Nothing is throttling, all is fine, and it's perfectly smooth.

But as soon as I put them at 1025/1050/1075/1100 and start to increase aux and voltage (I can go as far as 1150), even at a 25 MHz overclock that perfectly smooth line in the graph gets distorted and gameplay isn't that smooth anymore, as if there's lag introduced into the pipeline. FPS isn't that much higher; I even feel like it's getting lower FPS (instead of, say, 100 it's around 90).

Running 14.4 and OCing through Afterburner. I would have to take one card out and OC it alone to see how it performs, because I don't know if it's CF that doesn't like OCing or what.


----------



## Sgt Bilko

Quote:


> Originally Posted by *velocityx*
> 
> How is everybody feeling about OCing in crossfire? I'm wondering about it and it's something I haven't seen in my previous 6xxx series CF. cause I have two gigabyte 290, i've put a bios from a tri-x sapphire reference card and it has a stock 1000 core clock.
> 
> when I game with them, they both are stock, and in bf4, in my perfoverlay.drawgraph 1 gpu is a perfect green line. nothing is throttling all is fine. it's also perfectly smooth.
> 
> but as soon as I put them on 1025/1050/1075/1100 and I start to increase aux and voltage I can go as far as 1150 but the point is, as soon as I oc, even put that 25mhz overclock, that perfectly smooth line in the graph drawing gets distorted and gameplay isn't that smooth anymore, as if there's a lag introduced into the pipeline. FPS isn't that much higher, I even feel like it's getting lower fps (instead of lets say 100 its gonna be around 90)
> 
> running 14.4 and ocing through Afterburner. I would have to take one card out and oc it and see how it performs, because I donno it's CF that doesnt like ocing or I donno.


Try setting the power limit in CCC and then overclocking with Afterburner; the power limit still seems to be a little dodgy for some people with the 14.x drivers.

Other than that, give the 13.12 driver a try and see how it goes.

I run 1100/1400 daily on 14.4 and it's fine through Afterburner, but as soon as I use Trixx (for big OCs) the core clock starts jumping around due to the power limit not being set anymore, so I need to manually set it in CCC to get it to work.


----------



## velocityx

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Try and set the powerlimit in CCC then overclock with Afterburner, powerlimit still seems to be a little dodgy for some people with the 14.x drivers.
> 
> Other than that give the 13.12 driver and see how it goes.
> 
> I run 1100/1400 daily on 14.4 and it's fine through Afterburner but as soon as i use Trixx (for big OC's) the core clock starts jumping around due to the powerlimit not being set anymore so i need to manually set it in CCC to get it to work.


hmm alright, I wasn't even touching CCC in them as I know it can cause trouble so I was setting power limit through AB. I will check the CCC and report back. thx.


----------



## Talon720

Quote:


> Originally Posted by *sena*
> 
> Its not working for me, i dont know what i did wrong, created notepad file, writen command in its, saved as bat file, but nothing.


I thought the wi4 part is a 6 now, so wi6.
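If so, the .bat recipe from the earlier post would become something like this (unverified; the slot index depends on your Afterburner version, and the install path may differ on your system):

```
@echo off
rem Launch MSI Afterburner with a +100 mV offset (hex 10 = 16 units x 6.25 mV).
rem "wi6" per the suggestion above; older builds used "wi4". Verify for your version.
cd /d "C:\Program Files (x86)\MSI Afterburner"
MSIAfterburner.exe /wi6,30,8d,10
```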


----------



## velocityx

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Try and set the powerlimit in CCC then overclock with Afterburner, powerlimit still seems to be a little dodgy for some people with the 14.x drivers.
> 
> Other than that give the 13.12 driver and see how it goes.
> 
> I run 1100/1400 daily on 14.4 and it's fine through Afterburner but as soon as i use Trixx (for big OC's) the core clock starts jumping around due to the powerlimit not being set anymore so i need to manually set it in CCC to get it to work.


So I went into CCC, set the power limit to +50 for the cards, applied the overclock without the power limit in AB, and the line was even and gameplay was smooth. However, when I look at the CCC settings now, I see the power limit is at 0 again. Not that I need to overclock, but I don't think it should behave like that?


----------



## Sgt Bilko

Quote:


> Originally Posted by *velocityx*
> 
> so I went in CCC, set power limit +50 for the cards, applied overclock without power limit in AB, the line was even and gameplay was smooth. However now when I look at the ccc settings I see the power limit is at 0 again. Not that I need to overclock, but I think it shouldn't behave like that?


I'm not sure what actually causes it but it's pretty similar to what was happening with the old AB and drivers:
Quote:


> Using AB 3.0.0 Beta 17, once you have applied your overclock you must go back into CCC and reset the Power Limit back to 50% again. Even though it says it's 50% in AB it's NOT 50% in CCC.
> 
> However IF CCC is disabled you DO NOT ENABLE OVERDRIVE in CCC and your AB settings will work.
> 
> NOTE: Whenever you set a new clock and your screen flashes you need to bump the power limit back up in CCC as well as it resets every time the driver does.


^ Taken from the OP

You can try and untick the "reset display mode on applying unofficial overclocking" box in AB's settings, might help.


----------



## bond32

Broke 12k in firestrike!! http://www.3dmark.com/3dm/3028227?


----------



## rdr09

Quote:


> Originally Posted by *bond32*
> 
> Broke 12k in firestrike!! http://www.3dmark.com/3dm/3028227?


You certainly hit the lottery, bond. Once a 290X hits 1270 on the core, it's very hard for a 290 to catch it in benching. Nice OC.


----------



## yknot

Newbie warning......









Hi to all,

I've got myself a reference 290X (Gigabyte) and watercooled it... but I was wondering where the voltage monitoring points are located so I can connect a DMM.

I know I can use GPU-Z, but from reading this thread people seem to use DMMs quite regularly, and I was puzzled as to where the connections are on the card.

Thanx


----------



## bond32

Quote:


> Originally Posted by *rdr09*
> 
> you certainly hit the lottery, bond. once a 290X hit 1270 on the core it would be very hard for a 290 to catch it in benching. nice oc.


Thanks! It's nice after all the components I have had to finally get a good one!


----------



## rdr09

Quote:


> Originally Posted by *bond32*
> 
> Thanks! It's nice after all the components I have had to finally get a good one!


I see you are using the 13.x drivers. For benching, I do too; can't give them up. I had to add volts using 14.4. A lot more volts.

I know we haven't seen the last of your card. I only use Trixx to OC, and that +200 offset really helps. Keep it coming. You should be able to hit 1300 on the core under cooler conditions.


----------



## bond32

Quote:


> Originally Posted by *rdr09*
> 
> i see you are using 13 drivers. for benching - i do, too. can't give it up. i had to add volts using 14.4. a lot more volts.
> 
> i know we haven't seen the last from your card. i only use Trixx to oc and that +200 offset really helps. keep it coming. you should be able to hit 1300 on the core under cooler conditions.


Doubt my card will do more... VRM1 temp when running many benchmarks max was 51 C which is the lowest I have seen it. Core max was around 44 C. If I push the core more I get artifacts. I may at some point try +300mv with afterburner...


----------



## sena

Quote:


> Originally Posted by *Talon720*
> 
> The wi4 part is a 6 now i thought, so wi6.


Now I am lost; which part of what?


----------



## rdr09

Quote:


> Originally Posted by *bond32*
> 
> Doubt my card will do more... VRM1 temp when running many benchmarks max was 51 C which is the lowest I have seen it. Core max was around 44 C. If I push the core more I get artifacts. I may at some point try +300mv with afterburner...


Yeah, that means more volts, and 300 is scary. You can try lowering your mem to 1600 or even 1575 and raising the core. Still a nice card you have.


----------



## sena

Quote:


> Originally Posted by *sena*
> 
> Now i am lost, what is part of what?


Changed 4 to 6 and it works, thx man.

EDIT: It doesn't work; voltage in 3D is still not at +1.3V, damn.


----------



## bond32

Quote:


> Originally Posted by *rdr09*
> 
> yah, that means more volts. 300 is scary. you can try lower your mem to 1600 or even 1575 and raise the core. still nice card you have.


Thanks, yeah I have been meaning to play around with different scenarios, just haven't had the time due to school. No more school now though, so hopefully I can get it a tad higher. I see a few 290Xs in the list of top single-card Fire Strike scores, but I doubt I will get as high as them; my limiting factor is the 4770K. 4.9 is about as high as I can go. I tried 5.0 GHz, but even with an absurd vcore it still won't bench.


----------



## cennis

Quote:


> Originally Posted by *sena*
> 
> Chagned 4 to 6 and it works, thx man.
> 
> EDIT: It doesnt work, voltage in 3d is still not at +1.3V, damn.


Increase the voltage higher.

Under load the voltage droops (vdroop), so if you set 1.3 V it will not run at 1.3 V, probably closer to 1.2 V.


----------



## rdr09

Quote:


> Originally Posted by *bond32*
> 
> Thanks, yeah I have been meaning to just play around with different scenarios just haven't had the time due to school. No more school now though so hopefully I can get it a tad higher. I see a few 290x's in the list for the top single card fire strike scores but I doubt I will get as high as them - my limiting factor is the 4770k. 4.9 is about as high as I can go, I tried 5.0 ghz but even with an absurd vcore it still won't bench.


Yeah, in order for us to compete at the top we're gonna need an X79 system. I'd focus on the graphics score. There is a thread in the benchmarking forum that allows tessellation to be turned off in CCC, and that would at least bump the overall score.


----------



## diggiddi

Quote:


> Originally Posted by *Dasboogieman*
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Leaving the Ref blower is a no-go for AIOs, apparently it fudges with most of the AIO mounting systems so you probably need to be handy with a dremel.
> 
> If you are going the AIO route, you might want to consider the ASUS 290X direct CU II (Non-Reference) or the PowerColor PCS+ (Reference) cards since their VRM assemblies both come with their own heatsinks which are totally independent of the main cooling unit, you can slap an Asetek AIO of your choice on a G10 bracket for excellent core and good VRM temperatures. No hardmodding always makes life easy
> 
> 
> 
> 
> 
> 
> 
> 
> The ASUS card has the advantage of having a higher quantity and more efficient VRMs with a smaller heatsink so you might get better mileage overall for large voltmods under AIO cooling.
> The PCS+ card has the reference 6 phases but the VRM heatsink is bigger. You might not do quite as well as the ASUS card under the AIO but I think this card is cheaper.


You'll be Repped again for the recommend







as soon as it starts working


----------



## chiknnwatrmln

You could also get the Gelid VRM sink enhancement kit. I think it's like $20 and I've heard it performs pretty well.


----------



## The Mac

i have one coming, backordered till the end of may. grrrrr


----------



## cennis

Quote:


> Originally Posted by *diggiddi*
> 
> You'll be Repped again for the recommend
> 
> 
> 
> 
> 
> 
> 
> as soon as it starts working


I have the DirectCU II card.
I just attached the H90 onto it without using the G10 bracket.
I punched some holes into the stock Intel mounting bracket that came with the H90, got some M3 nuts and bolts, and it fits very nicely. As for the VRM, I use one of the stock fans on it.

If you get the G10, I think you cannot use the original backplate of the DCUII with the G10 backplate.

Also, I believe reference cards have better VRMs in terms of heat. The DCUII uses custom VRMs rated for higher amps, of the kind usually used on motherboards (forgot where I read that), and they produce extra heat. The advertised "better" overclocking feels like a hoax too.

Will upload some pics and thermal results later.


----------



## 1rad3

A good idea would be for NZXT to make the G10 compatible with custom backplates. It would definitely add to the rigidity of longer cards like the 290(X).


----------



## Rainmaker91

As far as I know there is nothing stopping you from mounting the G10 with a backplate. If you're thinking of the DirectCU II backplate, then you may only need a spacer for each screw to fit it; I even custom-fitted an Alphacool backplate on my 7950 with a custom-made AIO bracket, so there isn't much more work to it than you make it. The rule of thumb here is that if the backplate has holes in it for mounting a cooler to the GPU, then you can mount any GPU cooler through those exact same holes.









As for AIO alternatives, you should check out my thread. As you can see, there are several options other than the G10, and the best part is that all of them are in stock.


----------



## sinnedone

So this "AUX Voltage": what exactly does it do?

Is it like VCCIO for memory on CPUs, or something completely different?

I'm trying to push some overclocks and wondering how this particular voltage can help me. Thanks


----------



## the9quad

Finally got a kill-a-watt.

In Fire Strike Extreme on the last test, at stock settings except an OC'd processor, I pull 1080 watts at the wall. Eeek, I didn't realize I was that close to the 1200-watt power supply. Think this will hurt anything?


----------



## VSG

You are fine, remember that is not accounting for PSU efficiency. So you are pulling closer to 900-950W depending on your PSU.
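The back-of-the-envelope version of that, assuming an efficiency somewhere in the 83-88% range at that load (illustrative numbers, not measured for any particular PSU):

```python
def dc_load_watts(wall_watts: float, efficiency: float) -> float:
    """Power actually delivered by the PSU, given wall draw and efficiency (0-1)."""
    return wall_watts * efficiency

# 1080 W at the wall ends up as roughly 900-950 W of DC load on the PSU:
low = dc_load_watts(1080, 0.83)   # about 896 W
high = dc_load_watts(1080, 0.88)  # about 950 W
```

The 1200 W rating is the DC side, so the real headroom is bigger than the wall reading suggests.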


----------



## Tokkan

Quote:


> Originally Posted by *the9quad*
> 
> Finally got a kill-a-watt.
> 
> In firestrike extreme on the last test at stock settings except an OC processor, I pull 1080 watts at the wall. eeek I didnt realize I was that close to the 1200 watt power supply. Think this will hurt anything?


Like he said, you're not even close to hitting your PSU's limits; that is what efficiency means. You would only be close to the 1200 W delivered by your PSU if it had 100% efficiency. But it doesn't, and efficiency also changes with load, so you're still safe to put another R9 290X on it


----------



## VSG

Ok, not another R9-290x but the point remains


----------



## the9quad

Quote:


> Originally Posted by *geggeg*
> 
> You are fine, remember that is not accounting for PSU efficiency. So you are pulling closer to 900-950W depending on your PSU.


Quote:


> Originally Posted by *Tokkan*
> 
> Like said, you're not even close to hitting your PSU limits. That is what efficiency is for. You would only be close to hitting the 1200w delivered by your PSU if it had 100% efficiency. But it doesn't and efficiency also changes with load so you're still safe to put another r9 290x on it


Thank you gents


----------



## DeadlyDNA

Quote:


> Originally Posted by *the9quad*
> 
> Finally got a kill-a-watt.
> 
> In firestrike extreme on the last test at stock settings except an OC processor, I pull 1080 watts at the wall. eeek I didnt realize I was that close to the 1200 watt power supply. Think this will hurt anything?


Not sure if it matters, but try running BF4 maxed out if you can. Right now it is the top benchmark for me on wattage consumption, beating out the others, though I'm pushing massive resolutions; it's a very efficient game and uses every ounce of my quads. Really impressed with the game engine, and I have hated on EA for so long. My mistake though, hats off to DICE on this one.


----------



## the9quad

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Not sure if it matters, try running BF4 maxed out if you can, right now it is the top benchmark for me on wattage consumption, its beating out the others however i'm trying to push massive resolutions and it's a very efficient game and uses every ounce of my quads. Really impressed with the game engine and i have hated on EA so long. My mistake though hats off to Dice on this.


BF4 maxed out is about the same actually (1070 watts ish). If I run BF4 Mantle with frames locked to 130 fps, it is 700 watts (that is where I will keep it)


----------



## chronicfx

Quote:


> Originally Posted by *the9quad*
> 
> Finally got a kill-a-watt.
> 
> In firestrike extreme on the last test at stock settings except an OC processor, I pull 1080 watts at the wall. eeek I didnt realize I was that close to the 1200 watt power supply. Think this will hurt anything?


did you ever get 3dmark11 to work on the generic setting for a Pscore? Interested to see if your graphics score with the 4930k is different than mine is with the 3570k+reduced pcie bandwidth with 3 gpu at stock.

Here is mine http://www.3dmark.com/3dm11/7712467


----------



## blue1512

Quote:


> Originally Posted by *the9quad*
> 
> bf4 maxed out is about the same actually (1070 watts ish). If I run BF4 mantle frames locked to 130fps, it is 700 watts (that is where I will keep it)


This is why Mantle is so sweet: less stress on the CPU and more efficiency from the GPU. I hope Mantle appears in more games, so that I can stick with my beloved i7 2600


----------



## the9quad

Quote:


> Originally Posted by *chronicfx*
> 
> did you ever get 3dmark11 to work on the generic setting for a Pscore? Interested to see if your graphics score with the 4930k is different than mine is with the 3570k+reduced pcie bandwidth with 3 gpu at stock.
> 
> Here is mine http://www.3dmark.com/3dm11/7712467


Reinstalling windows as I type from my ipad lol.


----------



## chronicfx




----------



## DeadlyDNA

Quote:


> Originally Posted by *blue1512*
> 
> This is why Mantle is so sweet, no stress on CPU and more efficiency from GPU. I hope that Mantle will appear in more games, so that I could still stick to my beloved i7 2600


Sadly I wouldn't know about Mantle, since the 14.x drivers break everything else for me. :-(

What I would give to see it in action at extreme resolutions.


----------



## Widde

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Sadly I wouldn't know about mantle since 14.x drivers break everything else for me. :-(
> 
> what I would give to see it in action at extreme resolutions.


Mantle right now is unplayable for me with 14.4 whql







Massive frame drops compared to DX11


----------



## phallacy

Quote:


> Originally Posted by *the9quad*
> 
> bf4 maxed out is about the same actually (1070 watts ish). If I run BF4 mantle frames locked to 130fps, it is 700 watts (that is where I will keep it)


This is very similar to what I had with 3 cards. Between 1010 and 1100 depending on the game or benchmark. Low was valley and high was crysis 3.


----------



## bond32

Got 12607 in Fire Strike with tess off. Here's a SS


----------



## LA_Kings_Fan

Hey there Arizonian









I'm a little late to this party I know, but just picked up this guy from NewEgg's OpenBox section today ...

*POWERCOLOR* Radeon *R9-290X* w/ *PCS+* 3 Fan cooler Model # *AXR9 290X 4GBD5-PPDHE*











I gotta still install it and move my Sapphire TOXIC HD-695/70 into the back-up rig, so I'll get you the GPU-Z screenshot then.









But not too bad for only *$440.00*







and the box was still factory sealed, never even opened; the card was still in the factory static bag, unopened









Hopefully get it installed and find I got a winner on my hands


----------



## cennis

*Some pics and results of my asus dcuii h90 mount:

*


Spoiler: Warning: Spoiler!


----------



## eternal7trance

Someone is selling R9 290s on eBay for $250, just wow. They are XFX, so I bet they might even unlock


----------



## rdr09

Quote:


> Originally Posted by *bond32*
> 
> Got 12607 in firestrike with tess off. Heres a SS


for benching . . . 290X it is . . .

http://www.3dmark.com/3dm/2071136


----------



## Dire Squirrel

Just got my 290 today. Got It installed and updated driver.

I'm experiencing a few issues.

My second monitor is always turned off when I'm not actively using it. This has never been a problem. But now, whenever I turn it on or off, Windows reacts as if it were just being plugged in, and when it is turned off, Windows does not even recognize it. This has never happened before.
The second monitor is plugged into HDMI and the main one is on DVI, just as it has always been.

And it gets worse. Whenever I play something (youtube video etc.) in full screen on the secondary monitor, things really turn bad. When I close full screen mode, the mouse cursor turns into this:


(I had to take a photo of the screen since for some reason the messed up cursor does not show up on screenshots).

At that point I can't move the cursor back to the main screen until I completely turn the secondary monitor off. And it keeps looking like the picture. So far the only thing I have found that gets it back to normal, is a complete system reboot.

And when booting into Windows, the mouse does not respond at all. I have to physically unplug it and plug it back in. Sometimes several times.

Any idea as to what is going on and more importantly, how to fix it?

Edit to add:

Forgot this:


----------



## Matt-Matt

Quote:


> Originally Posted by *eternal7trance*
> 
> Someone is selling R9 290s on ebay for $250, just wow. They are XFX so I bet they might even unlock


You sure it's not a scam? Or is it bidding? Because DAMN, you could buy them all out, sell them at $380 apiece, keep the best one for yourself, and you'd be on top.


----------



## eternal7trance

Quote:


> Originally Posted by *Matt-Matt*
> 
> You sure it's not a scam? Or is it bidding? Because DAMN you could buy them all out and sell them at $380 a piece and keep the best one for yourself and you'd be on top.


I dunno, the guy has good feedback and $250 is the Buy It Now price.


----------



## Arizonian

Quote:


> Originally Posted by *imadorkx*
> 
> Hi guys.
> 
> I'm new to this forum.
> 
> I bought the r9 290 tri x early this month.. Might as well join this club.
> 
> Looking forward to learn new stuff.
> 
> Here is a screen shot from GPU-Z and the gpu itself.
> 
> Thanks guys.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Just got my 290 today. Got It installed and updated driver.
> 
> I'm experiencing a few issues.
> 
> My second monitor is always turned off when I'm not actively using it. This has never been a problem. But now, whenever I turn it on or off, Windows reacts as if it had just been plugged in, and when it is turned off, Windows does not even recognize it at all. This has never happened before.
> The second monitor is plugged into HDMI and the Main is on DVI. Just as it has always been.
> 
> And it gets worse. Whenever I play something (youtube video etc.) in full screen on the secondary monitor, things really turn bad. When I close full screen mode, the mouse cursor turns into this:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> (I had to take a photo of the screen since for some reason the messed up cursor does not show up on screenshots).
> 
> At that point I can't move the cursor back to the main screen until I completely turn the secondary monitor off. And it keeps looking like the picture. So far the only thing I have found that gets it back to normal, is a complete system reboot.
> 
> And when booting into Windows, the mouse does not respond at all. I have to physically unplug it and plug it back in. Sometimes several times.
> 
> Any idea as to what is going on and more importantly, how to fix it?
> 
> Edit to add:
> 
> Forgot this:
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - sorry to hear about your problem.

Hope you get that figured out. The only thing missing is which 290 you got; for now you're partially added to the list.

Quote:


> Originally Posted by *LA_Kings_Fan*
> 
> Hey there Arizonian
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm a little late to this party I know, but just picked up this guy from NewEgg's OpenBox section today ...
> 
> *POWERCOLOR* Radeon *R9-290X* w/ *PCS+* 3 Fan cooler Model # *AXR9 290X 4GBD5-PPDHE*
> 
> 
> 
> I gotta still install it and move my Sapphire TOXIC HD-695/70 into the back-up rig, so I'll get you the GPU-Z screenshot then.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But not to bad for only *$440.00*
> 
> 
> 
> 
> 
> 
> 
> and the box was still factory sealed never even opened, card still in the factory static bag un-opened
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hopefully get it installed and find I got a winner on my hands


Hey LA_Kings_Fan - congrats - added

Been a while since the Sapphire Club - I'm surprised yours wasn't a Sapphire. Sweet price on your 290X though; that would have been hard to pass up.


----------



## The Mac

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Just got my 290 today. Got It installed and updated driver.
> 
> I'm experiencing a few issues.
> 
> My second monitor is always turned off when I'm not actively using it. This has never been a problem. But now, whenever I turn it on or off, Windows reacts as if it had just been plugged in, and when it is turned off, Windows does not even recognize it at all. This has never happened before.
> The second monitor is plugged into HDMI and the Main is on DVI. Just as it has always been.
> 
> And it gets worse. Whenever I play something (youtube video etc.) in full screen on the secondary monitor, things really turn bad. When I close full screen mode, the mouse cursor turns into this:
> 
> (I had to take a photo of the screen since for some reason the messed up cursor does not show up on screenshots).
> 
> At that point I can't move the cursor back to the main screen until I completely turn the secondary monitor off. And it keeps looking like the picture. So far the only thing I have found that gets it back to normal, is a complete system reboot.
> 
> And when booting into Windows, the mouse does not respond at all. I have to physically unplug it and plug it back in. Sometimes several times.
> 
> Any idea as to what is going on and more importantly, how to fix it?


Yeah, the primary will disconnect/reconnect if it goes into power save or you turn it off as well.

It's something with 14.x; it was fine on 13.12.

I filed a bug report, and Sam was notified and informed the driver team directly as well.


----------



## Dire Squirrel

Quote:


> Originally Posted by *The Mac*
> 
> Yeah, the primary will disconnect/reconnect if it goes into power save or you turn it off as well.
> 
> It's something with 14.x; it was fine on 13.12.
> 
> I filed a bug report, and Sam was notified and informed the driver team directly as well.


So there is nothing to do other than "downgrade" to 13.12?
Will that take care of all the issues?


----------



## Talon720

I'm having some issues. I added a third card, a 290X with a waterblock, and finally got everything working. I put the cards in order of 67% ASIC with Hynix memory first, then 77% Hynix, then 76% Elpida. It probably doesn't matter what order, but just in case. When I installed the new card I went from a dual serial block setup to a triple parallel block. My core temps are now high, like 95C high, to the point where the cards throttle. All three cores run hot, though if I squeeze the tubing or tilt the case the temps drop, which I assume means trapped air, but I can't seem to get it out this time. The VRM temps, though hotter than they were, are not going past 75C, which also makes me think air is still in the system. I'm using one D5 with a res.

Also, with 14.4, BF4 on Mantle in TriFire gives me a Mantle error and a bunch of artifacts that look like glass shards. On the 13.x WHQL driver, or under DirectX 11, it runs fine. Anyone else have Mantle errors with 14.4?

One more question: how durable or long-lasting is CLU?


----------



## The Mac

Quote:


> Originally Posted by *Dire Squirrel*
> 
> So there is nothing to do other than "downgrade" to 13.12?
> Will that take care of all the issues?


Correct


----------



## Matt-Matt

Quote:


> Originally Posted by *eternal7trance*
> 
> I dunno, the guy has good feedback and $250 is the Buy It Now price.


Yeah nice, that's a steal! Shame it's not in Australia


----------



## sinnedone

Quote:


> Originally Posted by *eternal7trance*
> 
> I dunno, the guy has good feedback and $250 is the Buy It Now price.


It might not be a scam. I purchased 2 290's for $200 each just 2 weeks ago. Like always, it's a crapshoot on eBay though. Pay with your credit card just in case; you can't trust eBay or PayPal to get your money back, but your credit card will usually have your money back within 2 weeks.


----------



## chronicfx

Mining has recently become unprofitable unless you can sit in front of your screen and day trade. This is just simple supply and demand: people who owned 290Xs for farming are now trying to cash in their hardware all at once, so prices are on the buyer's side. Of course, a lot of these cards have been run hot, overclocked and overvolted to the point of instability, with high fan speeds (80%+) 24/7 for months and months, and they may be using whatever BIOS has the best hashrate, leaving you to flash it back. That is what you need to know going in.


----------



## Dire Squirrel

Quote:


> Originally Posted by *The Mac*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dire Squirrel*
> 
> So there is nothing to do other than "downgrade" to 13.12?
> Will that take care of all the issues?
> 
> 
> 
> Correct
Click to expand...

Downloaded and installed 13.12. But apparently AMD does not allow older drivers to overwrite newer ones, because exactly nothing has happened; 14.4 is still the driver in use.

Just to be absolutely safe here, what is the correct way to go from a newer to an older driver?
I have never had to do this, and threads like this one: http://www.overclock.net/t/1486368/important-dont-ever-use-the-amd-clean-uninstall-utility make it clear that there are at least some wrong ways to do it.


----------



## diggiddi

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Downloaded and installed 13.12. But apparently AMD does not allow older drivers to overwrite newer ones, because exactly nothing has happened; 14.4 is still the driver in use.
> 
> Just to be absolutely safe here, what is the correct way to go from a newer to an older driver?
> I have never had to do this, and threads like this one: http://www.overclock.net/t/1486368/important-dont-ever-use-the-amd-clean-uninstall-utility make it clear that there are at least some wrong ways to do it.


Uninstall the driver with DDU in safe mode.


----------



## bond32

Quote:


> Originally Posted by *Talon720*
> 
> I'm having some issues. I added a third card, a 290X with a waterblock, and finally got everything working. I put the cards in order of 67% ASIC with Hynix memory first, then 77% Hynix, then 76% Elpida. It probably doesn't matter what order, but just in case. When I installed the new card I went from a dual serial block setup to a triple parallel block. My core temps are now high, like 95C high, to the point where the cards throttle. All three cores run hot, though if I squeeze the tubing or tilt the case the temps drop, which I assume means trapped air, but I can't seem to get it out this time. The VRM temps, though hotter than they were, are not going past 75C, which also makes me think air is still in the system. I'm using one D5 with a res. Also, with 14.4, BF4 on Mantle in TriFire gives me a Mantle error and a bunch of artifacts that look like glass shards. On the 13.x WHQL driver, or under DirectX 11, it runs fine. Anyone else have Mantle errors with 14.4? One more question: how durable or long-lasting is CLU?


Something isn't installed properly. Also if you really have that much air in the loop you shouldn't be running power to anything but the pump. Are you sure the parallel block is configured properly? Perhaps you have incorrect flow direction somewhere.


----------



## Jflisk

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Downloaded and installed 13.12. But apparently AMD does not allow older drivers to overwrite newer ones, because exactly nothing has happened; 14.4 is still the driver in use.
> 
> Just to be absolutely safe here, what is the correct way to go from a newer to an older driver?
> I have never had to do this, and threads like this one: http://www.overclock.net/t/1486368/important-dont-ever-use-the-amd-clean-uninstall-utility make it clear that there are at least some wrong ways to do it.


Grab DDU (Display Driver Uninstaller) from the link below. Run it, follow the instructions, then reinstall your drivers. If you can find the 14.3 beta V1 drivers, they should treat you okay.

http://www.wagnardmobile.com/DDU/
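One gotcha when juggling these driver versions: Catalyst version strings like 14.12 and 14.4 do not sort correctly as plain strings, so any script that picks the "latest" installer by filename can grab the wrong one. A minimal sketch of the fix (the helper name is mine, not part of any AMD tooling):

```python
def catalyst_key(version):
    """Split a Catalyst version string like "14.12" into a tuple of ints
    so releases compare numerically instead of lexically."""
    return tuple(int(part) for part in version.split("."))

versions = ["13.12", "14.3", "14.4", "14.12"]
print(max(versions, key=catalyst_key))  # → 14.12
print("14.12" > "14.4")                 # → False: plain string compare gets it wrong
```

Splitting on the dots and comparing integer tuples gives the correct ordering; raw string comparison puts 14.12 before 14.4.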


----------



## Dire Squirrel

Quote:


> Originally Posted by *diggiddi*
> 
> Uninstall driver with DDU in safe mode


Quote:


> Originally Posted by *Jflisk*
> 
> Find DDU display driver uninstall utility link below. Run - follow instructions re install drivers. If you can find 14.3 beta V1 drivers they should treat you okay.
> 
> http://www.wagnardmobile.com/DDU/


Will try the DDU thing.

So the 14.3 beta is better than 13.12 and without any of the issues that I have been having?
Found this: http://www.guru3d.com/news_story/download_amd_catalyst_14_3_beta_driver.html


----------



## Jflisk

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Will try the DDU thing.
> 
> So the 14.3 beta is better than 13.12 and without any of the issues that I have been having?
> Found this: http://www.guru3d.com/news_story/download_amd_catalyst_14_3_beta_driver.html


I have 2x 290X and I don't have any problems with the beta above; the 14.4 drivers cause me a great deal of problems.


----------



## Ized

What options do I have for reducing the minimum fan speed while idle in day-to-day desktop work?

I can't find any info on BIOS editing for Hawaii, so any options would be welcome.

A minimum of 20% seems to be enforced.
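The behaviour described above can be modeled as a simple clamp: any requested speed below the floor is held at the floor. A toy sketch only; the 20% figure comes from the post, and the actual enforcement lives in the card's BIOS/firmware, not in software like this:

```python
def effective_fan_pct(requested_pct, floor_pct=20):
    """Model the enforced minimum: requests below the floor are held at
    the floor; everything else passes through, capped at 100%."""
    return min(max(requested_pct, floor_pct), 100)

print(effective_fan_pct(10))   # → 20 (request below the floor is ignored)
print(effective_fan_pct(45))   # → 45
```

This is why software fan controls appear to "stick" at 20%: the request goes through, but the firmware clamps it.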


----------



## The Mac

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Will try the DDU thing.
> 
> So the 14.3 beta is better than 13.12 and without any of the issues that I have been having?
> Found this: http://www.guru3d.com/news_story/download_amd_catalyst_14_3_beta_driver.html


All 14.x have the monitor issue.

As I mentioned before, use 13.12.


----------



## LA_Kings_Fan

Quote:


> Originally Posted by *Arizonian*
> 
> Hey LA_Kings_Fan - congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Been a while since the Sapphire Club - I'm surprised yours wasn't a Sapphire. Sweet price on your 290X though; that would have been hard to pass up.


Yeah, I was looking at the Sapphire Tri-X 290X... but reviewers claimed this PowerColor PCS+ was better: metal build rather than plastic, quieter, and it ran cooler, so it might OC further.
Add to that, the best deal I could find on the Sapphire Tri-X was $550.00... so yeah, this was too good a deal to pass up at $110 cheaper.

Thought about picking up a REF version and maybe getting into WC'ing, but I'm not sure that's an avenue I wanted to go down.

Wish Sapphire had come out with a ballin' R9-290X TOXIC edition. I waited and waited... and waited... I don't think we'll see one. On top of that, the latest TOXIC edition cards don't seem much better than the Tri-X or Vapor-X series, so I couldn't really justify putting this off any longer; it seems like Sapphire has made the TOXIC an irrelevant card in the lineup now. Kind of bummed about that actually.

And again, from the reviews I checked out, everyone seemed to place this PCS+ card above the Asus and Sapphire offerings, and the MSI Lightning is still listed at $700.

Heck, I was even thinking of going back to green (a GTX 780) for a little while... but I haven't seen too many 780 Tis in Newegg's open box section to consider those.

Anyway, glad to have bumped up to this generation of GPUs... most likely the next purchase is finally a bigger case.


----------



## cennis

Quote:


> I can't find any info on BIOS editing for hawaii so any options would be welcome.
> 
> A minimum of 20% seems to be enforced.


Try controlling it with SpeedFan?


----------



## Slomo4shO

Quote:


> Originally Posted by *sinnedone*
> 
> It might not be a scam.


There are too many miners dumping cards currently due to an influx of scrypt ASIC miners...


----------



## sinnedone

Quote:


> Originally Posted by *Slomo4shO*
> 
> There are too many miners dumping cards currently due to an influx of scrypt ASIC miners...


Quote:


> Originally Posted by *sinnedone*
> 
> It might not be a scam. I purchased 2 290's for $200 each just 2 weeks ago. Like always, it's a crapshoot on eBay though. Pay with your credit card just in case; you can't trust eBay or PayPal to get your money back, but your credit card will usually have your money back within 2 weeks.


Please see the rest of that response.


----------



## Ized

Quote:


> Originally Posted by *cennis*
> 
> try controlling with speedfan?


As I mentioned, the card/tools (not 100% sure which) seem to enforce a 20% minimum speed, which is a bit loud on my setup.


----------



## BIGARCANGEL

Regarding power consumption: running FurMark, I pull 1280 watts from the wall with 2 R9 290Xs under water at 1200/1400 and +200mV, and in BF4 at 2560x1440 with 4xMSAA and 120% resolution scale, 1005 watts! Love my cards, they destroy games.
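Worth remembering when reading wall numbers like these: a Kill-A-Watt style meter reads AC draw at the wall, while the PSU delivers less than that as DC because of conversion losses. A rough sketch of the arithmetic; the 90% efficiency figure is an assumption, as the real curve depends on the unit and load:

```python
def dc_load_watts(wall_watts, efficiency):
    """Convert an AC wall-meter reading into the approximate DC load
    the PSU is actually delivering to the components."""
    return wall_watts * efficiency

# Assuming roughly 90% efficiency at this load:
print(round(dc_load_watts(1280, 0.90)))  # → 1152
print(round(dc_load_watts(1080, 0.90)))  # → 972
```

So a 1280 W wall reading corresponds to roughly 1150 W of actual DC load, and a 1080 W reading to under 1000 W, which is why a 1200 W unit can still be within spec at those wall figures.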


----------



## Dire Squirrel

Quote:


> Originally Posted by *Jflisk*
> 
> I have 2x 290X and I don't have any problems with the beta above; the 14.4 drivers cause me a great deal of problems.


Quote:


> Originally Posted by *The Mac*
> 
> All 14.x have the monitor issue.
> 
> As I mentioned before, use 13.12.


I have now tried both 14.3 and 13.12 and both seem identical. On 13.12 now.

They both fixed the problem with the second monitor not being recognized while turned off. But the infinitely more annoying problem with the cursor is still there. As soon as I move my mouse to the second monitor, the cursor turns into this:



It looks different than before, but still utterly useless.
I desperately need a fix for this. It is a big enough problem to make me throw the 290 away and go back to the 7870 if it's not fixed VERY quickly. It is beyond intolerable, and AMD really needs to get their freaking act together.

It also seems like it is running hotter with this driver.



1% load, fans at 44%, and still 53C. And that is pretty much where it sits. I had the fans at 70% for a few minutes and got it down to 50C, but a few minutes later it was right back up.


----------



## The Mac

Try 13.11. I know a lot of people prefer it.

14.3 had the monitor problem for me. However I'm on a single monitor.


----------



## BIGARCANGEL

Hey bud, have you looked in CCC to see if your 13.12 driver install was successful? You might still be using the 14.x drivers without realizing it. Sometimes the driver doesn't take for some stupid reason.


----------



## Dire Squirrel

Quote:


> Originally Posted by *BIGARCANGEL*
> 
> Hey bud have you looked in ccc to see if it your driver install of 13.12 was successful ? As you might still be using 14x drivers without realizing . Sometimes the driver doesnt take for some stupid reason .


Both CCC and GPU-Z say 13.12.
I used DDU to uninstall the previous drivers, so there shouldn't be anything left of them.

Discovered that enabling "pointer trails" partially fixes it. But only partially, and with the added horror of that ridiculous gimmick that somehow survived from the '90s.


----------



## Slomo4shO

Quote:


> Originally Posted by *sinnedone*
> 
> Please see the rest of that response.


So would my explanation of the card dumping and the drop in prices be any more relevant if I quoted the rest of your post?


----------



## PJFT808

I would like to be added to the club, finally.

http://www.techpowerup.com/gpuz/b2dwb/

ASUS R9290-4GD5

Water

I got this on day one from the Egg. I used it reference-cooled until a few weeks ago. Modded with a cheapo H55, NZXT bracket, XSPC backplate, and Gelid VRM kit.

Some images:


----------



## cennis

Quote:


> Originally Posted by *PJFT808*
> 
> I would like to be added to the club, finally.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/b2dwb/
> 
> ASUS R9290-4GD5
> 
> Water
> 
> I got this day one from the Egg. I've used it reference cooled until a few week ago. Modded with a cheapo H55 ,NZXT bracket, XSPC backplate, and Gelid VRM kit.
> 
> Some images:


clock and temps?


----------



## BIGARCANGEL

dam sorry bud


----------



## grunion

Quote:


> Originally Posted by *The Mac*
> 
> All 14.x have the monitor issue.
> 
> As I mentioned before, use 13.12.


What is this monitor issue that you speak of, does it affect single, dual, triple, etc?


----------



## DeadlyDNA

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Both CCC and GPU-Z says 13.12.
> I used DDU to uninstall previous drivers, so there shouldn't be anything left of them.
> 
> Discovered that enabling "pointer trails" partially fixes it. But only partially and with the added horror of that ridiculous gimmick that somehow survived from the 90's.


can you try a whole new set of mouse cursors? just for kicks?


----------



## PJFT808

Quote:


> Originally Posted by *cennis*
> 
> clock and temps?


At 1200/1400, after leaving Valley running on max settings for half an hour, the temps are 60C core, 69C VRM1, and 55C VRM2.


----------



## Dire Squirrel

Quote:


> Originally Posted by *DeadlyDNA*
> 
> can you try a whole new set of mouse cursors? just for kicks?


Tried that. It makes absolutely no difference.

I did, however, discover that this is apparently a "bug" that has been around in AMD drivers for donkey's years, commonly known as ATI or AMD cursor corruption. So far I have not found any sort of permanent solution.

It really says a lot about how little AMD thinks of its customers that such a problem can be present for years (at least 8-10 from what I can tell) and they don't care enough to actually fix it.

Nvidia is starting to look pretty good.


----------



## cennis

Quote:


> Originally Posted by *PJFT808*
> 
> @ 1200/1400 Leaving Valley on max settings for half an hour the temps are 60c core, 69C Vrm 1, and 55C Vrm 2


good temps


----------



## The Mac

Quote:


> Originally Posted by *grunion*
> 
> What is this monitor issue that you speak of, does it affect single, dual, triple, etc?


Both.

If you turn off your monitor it will disconnect, then reconnect when you turn it back on, or when it goes into power save.

Basically, the virtual desktop is disconnected.

It's more of an annoyance with single monitors, but it really hoses up Eyefinity when it disconnects.

If you change your monitor to a generic one, it won't happen, but then you lose color profiles if you have them.

I tried blocking the pin on the HDMI cable that controls power detect (pin 19), but then it wouldn't detect the monitor on startup.


----------



## sinnedone

Quote:


> Originally Posted by *Slomo4shO*
> 
> 
> 
> 
> 
> 
> 
> 
> So would my explanation of the card dumping and the drop in prices be any more relevant if I quoted the rest of your post?


My apologies; I understood you to be quoting only that section, implying that yes, it was a scam.


----------



## hotrod717

Just a little update on the progress of my Lightning. Even though Valley is Nvidia-biased, it's a nice comparison from 290X to 290X.

1275/1625 - 1.38v actual (DMM)

1285/1625 - 1.38v actual (DMM)


Something to shoot for.


----------



## grunion

Quote:


> Originally Posted by *The Mac*
> 
> Both.
> 
> If you turn off your monitor it will disconnect, then reconnect when you turn in back on. or if it goes into powersave.
> 
> basically, the virtual desktop is disconnected.
> 
> It more of just an annoyance with single monitors, but really hoses up eyefinity when it disconnects.
> 
> If you change your montior to a generic, it wont happen, but then you lose color profiles if you have them.
> 
> I tried blocking the pin on the hdmi cable that controls power detect (pin 19), but then it wouldnt detect the monitor on startup


So only HDMI?
I don't see this issue, just tried, DP and DVI-D.


----------



## The Mac

Others with DVI and DP have seen it as well.

You're lucky; it's soooo freakin' annoying.

Or perhaps your monitor doesn't report the low-power state.

Some expensive monitors have an option to turn off that reporting.


----------



## grunion

Quote:


> Originally Posted by *The Mac*
> 
> other with dvi and dp have seen it as well.
> 
> you are lucky, its sooooo freakin annoying.
> 
> or perhaps your monitor doesnt report low power state.
> 
> Some expensive monitors have an option to turn off the reporting.


My Samsung has drivers, but my Acer is generic PnP.
I'm going to switch some things around and also round up an HDMI cable to see if I can reproduce it.

And holy crap hotrod


Gonna see if my Ti can keep up with that.


----------



## hotrod717

Quote:


> Originally Posted by *grunion*
> 
> My samsung has drivers but my acer is g pnp.
> I'm going to switch some things around and also round up an hdmi cable to see if I can reproduce it.
> 
> And holy crap hotrod
> 
> 
> 
> 
> 
> 
> 
> 
> Gonna see if my Ti can keep up with that.


I'm sure your Ti can and will beat that in Valley. Unfortunately Valley is very Nvidia-biased, but as I said, it's a good 290X-to-290X comparison. Hopefully I'll be able to get some good 3DMark Performance runs in; that's a better comparison for Nvidia vs. AMD.


----------



## Mega Man

Quote:


> Originally Posted by *Rainmaker91*
> 
> A cool thread. I'm not yet honored with a card myself, but I have been on the fence for a while now about getting one. What surprises me is that there aren't more water blocks mentioned on the first page (they may be mentioned elsewhere, but I'm not going to read through 2219 pages atm). From the information I could gather, the AquaComputer kryographics Hawaii should be the best-performing one, so it surprised me that it's not mentioned. Anyway, I might be joining you all soon with my own build, unless I manage to hold out for Bermuda XTX before upgrading.


I'll comment on how well the Komodo does; mine will be in on Wednesday, and I should be able to install it Friday/Saturday.
Quote:


> Originally Posted by *yknot*
> 
> Newbie warning......
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hi to all,
> 
> I've got myself a reference 290x (Gigabyte) and watercooled it....................but.........I was wondering where the voltage monitoring points were located so I can connect a DMM.
> 
> I know I can use GPUZ but from reading this thread ppl seem to be using DMMs quite regular and I was puzzled as to where the connections were on the card.
> 
> Thanx


You read directly from the VRMs on the back of the board. DO NOT do this unless you know what you are doing; you will brick your card.
Quote:


> Originally Posted by *cennis*
> 
> Quote:
> 
> 
> 
> Originally Posted by *diggiddi*
> 
> You'll be Repped again for the recommend
> 
> 
> 
> 
> 
> 
> 
> as soon as it starts working
> 
> 
> 
> I have the direct CU II card.
> I just attached the h90 onto it without using the G10 bracket.
> I punched some holes into the stock intel mounting bracket that came with the h90, got some M3 nuts and bolts and it fits very nicely. As for VRM I use one of the stock fans on it.
> 
> If you get G10 I think you cannot use the original backplate of the DCUII with the g10 backplate.
> 
> Also i believe reference cards have better VRM in terms of heat. The DCUII uses custom vrm with higher amps usually used on motherboards (forgot where I read that) and cause extra heat. The advertised "better" overclocking feels like a hoax too.
> 
> Will up some pics later and thermal results
Click to expand...

I don't know why people care so much about aftermarket cards. Last gen, the best cards were generally reference, simply because you don't need 20 million phases..... (looking at you, Matrix 7970)
Quote:


> Originally Posted by *the9quad*
> 
> Finally got a kill-a-watt.
> 
> In firestrike extreme on the last test at stock settings except an OC processor, I pull 1080 watts at the wall. eeek I didnt realize I was that close to the 1200 watt power supply. Think this will hurt anything?


No. Even if it is maxing out your PSU, a quality PSU is designed to run at full load without issue. Peak loads, on the other hand, are different.
Quote:


> Originally Posted by *Arizonian*
> 
> Hey LA_Kings_Fan - congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Been a while since the Sapphire Club - I'm surprised yours wasn't a Sapphire. Sweet price on your 290X though; that would have been hard to pass up.


!!! You add him with a pic but not meh !!! I'll have my 4 GPUs up and running with GPU-Z proof later (hoping tonight) using my existing rads.
All blocks and 480 rads should be here Wednesday.

I have one 45mm, one 60mm, and two 80mm (total of 3 480 80mm rads in this build); hope it's enough to cool this beast.
Quote:


> Originally Posted by *Dire Squirrel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DeadlyDNA*
> 
> can you try a whole new set of mouse cursors? just for kicks?
> 
> 
> 
> Tried that. It makes absolutely no difference.
> 
> I did however discover that this is apparently a "bug" that has been around in AMD drivers for donkey's years.
> Commonly known as ATI or AMD cursor corruption.
> So far I have not found any sort of permanent solution.
> 
> It really says a lot about how little AMD thinks of its customers, that such a problem can be present for years (at least 8-10 from what I can tell) and they don't care enough to actually fix it.
> 
> Nvidia is starting to look pretty good.
Click to expand...

All I hear is whining. Big deal; move your cursor around really fast.


----------



## HOMECINEMA-PC

@Mega Man
Quote:


> all i hear is whining, big deal. move your cursor around really fast


LoooooooooooL


----------



## DeadlyDNA

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Tried that. It makes absolutely no difference.
> 
> I did however discover that this is apparently a "bug" that has been around in AMD drivers for donkey's years.
> Commonly known as ATI or AMD cursor corruption.
> So far I have not found any sort of permanent solution.
> 
> It really says a lot about how little AMD thinks of its customers, that such a problem can be present for years (at least 8-10 from what I can tell) and they don't care enough to actually fix it.
> 
> Nvidia is starting to look pretty good.


The grass is always greener on the other side, pun intended on that one 
I've never heard of this issue. Not sure how much you're willing to do to fix it. Disable cursor shadows? It sounds like it's the cursor files or something. Can you try an alternate Windows installation, or safe mode or something?

And as for 14.4 this was my experience with the driver on eyefinity


----------



## Jpmboy

As an ex-290X owner, I thought I'd post this proud-father pic here... it will replace SLI Titans, and we'll see how she does.


----------



## DeadlyDNA

Maybe try this as well


----------



## Goride

Quote:


> Originally Posted by *Goride*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> Just some thoughts
> 
> I have mounted a Corsair H90 on to a Sapphire 290 Tri X using the NZXT G10. My case is the CM HAF 932 and my CPU cooler is the Noctua NH D14
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 1. MEASURE YOUR 120/140mm slot before getting an AIO. I have a HAF 932 and the H90 doesn't actually fit into the 140mm back slot (with hoses oriented towards the ground) with a Noctua NH D14; I couldn't fit it sideways since the pump blocks the side panel from closing. In the end, I lashed the radiator into the 5.25 inch bays. In hindsight, I would've purchased a 120mm radiator AIO knowing this.
> 
> 2. AIO Hose length, more is better. I would advise against the H90 for future buyers, this is cost optimized for the CPU cooling thus the hose is 4 inches shorter than the Kraken X40 despite identical specs (both are identical Asetek units) This extra 4 inches would've allowed me to mount the rad at the rear with the hoses oriented towards the ceiling (thus not needing to be zip tied) and still use the NH D14.
> 
> 3. Mounting: the bracket has to be mounted to the GPU with the AIO. I suggest clearing space in the case prior to doing this, since it will ease the final insertion of the GPU + AIO assembly later. Only 2-3 turns of the screwdriver are necessary; the fit is quite snug, and any more and you risk cracking the GPU die or warping the PCB.
> 
> Results
> The Sapphire Tri X cooler was a brilliant design, the fans were placed in such a way that the air from the 3rd fan blows directly on to the VRM 1 area which was direct contact cooled with the whole heatsink frame.
> Thus, at 50% fan speed (barely audible) the core is 72 degrees (ambient 22) and VRM1 is around (75 degrees), VRM 2 is 45 degrees in EVGA OC scanner (a variant of Furmark)
> 
> Kraken G10 + H90 with no additional VRM cooling
> The overall noise is equivalent to the Tri X at 40% fan speed
> The core never peaks above 45 degrees but VRM 1 hovers around 82 degrees, VRM 2 reaches 60 degrees in EVGA OC scanner
> 
> Thus, I strongly recommend aftermarket VRM heatsinks if you want to overclock or use high-stress apps. While the 92mm fan does a respectable job, there's actually an air dead zone in the VRM 1 assembly which lies directly under the 92mm fan's motor. If any of you have seen the Puget Systems review, I believe this manifests as the giant red spot on the thermal camera.
> I also recommend either thermal adhesive (Arctic Alumina is the safest, as I have read one report of someone blowing a board fuse using excessive Arctic Silver adhesive) or a screw-type heatsink assembly.
> DON'T use Sekisui 5760 for the VRM 1 area. This tape has maximum adhesion around the 50-60 degree temperature range; at 70-80 the bond starts to degrade as the glue becomes less viscous, and debonding starts to happen at 90-100 degrees over repeated cycles.
> 
> Useful dimensions and features:
> On a reference AMD board, the distance from the first screw hole centre to the second centre on the VRM 1 area is 85mm
> The height clearance from the RAM chip to the NZXT bracket is ~5mm
> The height clearance from the VRM 1 zone to the NXZT bracket is ~6mm
> The width of the VRM area from the Choke to the nearest capacitor is 13mm
> The whole assembly takes 2 slots.
> The AMD VRMs are metallic and despite the size difference, are the exact same height.
> 
> I have just ordered the GELID Enhancement kit for 290X from these guys http://www.feppaspot.com/servlet/the-897/GELID-Solutions-Icy-Vision/Detail so I don't have to DIY my own VRM sink.
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roboyto*
> 
> *I'm in the midst of a post in one of my build logs for this heatsink kit with a Kraken G10. I'll share here first though.*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *NZXT Kraken G10 + Gelid R9 290(X) VRM Cooling Enhancement Kit*
> 
> I paid the additional $24 Courier Express shipping from Hong Kong. Ordered on Saturday Night 4/26 and it arrived Tuesday afternoon 4/29!
> FeppaSpot's customer service was great. They let me know when the item was packaged/shipped.
> They ask for a phone number so DHL can text you information to: check tracking info, digitally sign for your package, specify delivery instructions, or even change delivery address.
> Superb service between FeppaSpot and DHL!
> 
> 
> 
> *Long VRM1 Sink & Thermal Pad - VRM1 Mounting Studs/Nuts - Plastic Shims - VRM2 Sink & Thermal Tape*
> 
> 
> 
> *3M 8810 RAM Sinks* - http://www.performance-pcs.com/catalog/index.php?main_page=product_info&products_id=39462
> Thermal tape on these isn't the greatest; I accidentally knocked sinks off a couple of times when handling the card. It's a pro and a con that they come off so easily.
> You may want to consider a similar sink with some different thermal tape.
> 
> 
> 
> *VRM1 sink is perfectly tailored to these cards. Nice job Gelid!*
> 
> 
> 
> *Tiny thumb nuts from the back side.*
> 
> 
> 
> *Junpus Thermal Tape* instead of supplied super thin tape - http://www.performance-pcs.com/catalog/index.php?main_page=product_info&products_id=37756
> Sold by the foot
> 
> 
> 
> *All Sinks Installed* *- It looks purdy*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *This low profile sink comes in the kit*, but was not pictured above because I didn't think I would need it.
> If you want a heatsink on the bottom most RAM chip then you will need this low profile sink!
> 
> 
> 
> *NZXT + Gelid Results*
> 
> *Reference Cooler on PowerColor R9 290 with auto fan settings*
> 
> 
> 
> *Kraken G10 + Antec 620 + SilenX Effizio + Gelid R9 290 Heatsink Upgrade Kit*
> 
> 
> 
> *Before*
> 
> Core - 94C
> VRM1 - 65C
> VRM2 - 83C
> 
> *After*
> Core - 53C
> VRM1 - 56C
> VRM2 - 61C
> 
> Temps are outstanding, power consumption is down ~12% to 160W at 975/1250, and the computer is absolutely silent using the SilenX Effizio.
> Even with 0 airflow across VRM2, its temp is down 22C due to no longer getting all the hot air from core and VRM1 blown across it.
> 
> The 92mm NZXT fan is pulling heat from VRM1 because of the card's placement in my 250D. Yes, an R9 290 with a Kraken G10 and Antec 620*** will**** fit inside a Corsair 250D.
> ****I had to remove a screw from the pump housing so the hoses/fittings would rotate further. If I hadn't done this, the side panel would not have fit, as the hoses/fittings protruded too far out.****
> ***I don't know if all of the Asetek-based coolers have this little screw that can be removed.***
> Take care when turning the fitting on the pump further than was previously allowed, as I accidentally pulled it completely out! It dripped a little fluid, but did _*NOT*_ break anything
> 
> 
> 
> *It takes a little finesse and patience, but it all fits in there with the radiator in the front of the case.*
> Please excuse my poor quality cell phone photos & current lack of cable management
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *A few things I still need to test/tinker around with:*
> 
> Temps when overclocking the card
> Did I gain any additional overclocking headroom?
> Reference cooler best clocks were 1160/1550 +200mV Trixx
> 
> Power consumption at max clocks/voltages
> Swapping thermal tape on VRM2 for some Fujipoly Ultra or Ultra Extreme.
> The Fuji Pad should hold sinks with the card standing upright as it worked great for a VRM heatsink upgrade on my Devil 270X
> 
> 
> 
> 
> 
> 
> 
> I wish I had seen Roboyto's and Dasboogieman's posts earlier.
> 
> I just ordered 15 of these heatsinks:
> http://www.frozencpu.com/products/17727/vid-191/Micro_Thermal_Heatsink_-_65mm_x_65mm_x_12mm_for_Motherboard_MOS.html
> 
> They are 6.5mm x 6.5mm x 12mm. I was going to put 10-12 of them in a row over the VRM 1 area, and 3 of them over the VRM 2 area. I could not find much information on sizes, but people seemed to be saying that the clearance was almost 14mm, so I thought 12mm would be good. But according to these posts, that is much too tall.
> 
> I also ordered these sinks for the memory:
> http://www.newegg.com/Product/Product.aspx?Item=9SIA3TR18A6835
> 
> They are 15mm x 15mm x 8mm. So it appears that they are going to be too tall as well.
> 
> I wanted to get the Gelid 290x enhancement kit, but it was (and still is) out of stock everywhere.

I just wanted to come back and say that all of these heatsinks fit under the G10+fan just fine, with the exception of one sink.

One memory chip heatsink just below the GPU was too tall because the water cooler's contact point hovered too close over it. To fix this, all I needed to do was cut some of the height off part of it, and then it fit in just fine.

With that said, the 6.5mm x 6.5mm x 12mm heatsinks that I linked did not help reduce the VRM 1 temps much at all. I used Fujipoly tape on the VRMs and placed 12 of those heatsinks in a row on the strip. I also put a very small amount of Gelid Extreme3 paste in between each heatsink to make sure they had good contact with each other and transferred heat evenly.

GPU-Z said VRM 1 was getting to 96C at stock voltage and settings. When I tried to OC it to 1200MHz core / 1500MHz RAM and +133mV, the VRM 1 temps got to 105C before I cut that off. When I touched the heatsinks they were scorching hot - so I know the heat was transferring to them. The G10's fan just could not keep them cool.

I ordered a Gelid Enhancement Kit, now that people have it in stock again. Hopefully the larger radiating area will allow the G10's fan to keep the temps down.

On the bright side, the GPU core temp never went over 67c - and that was with it OC'd to 1200mhz core / 1500mhz ram and +133 mV.

Also, the ramsinks on the memory chips felt pretty hot to the touch - somewhere between the VRM 2 heatsinks and the VRM 1 heatsinks. VRM 2 was around 65C and VRM 1 was around 96C.


----------



## cennis

Quote:


> Originally Posted by *Goride*


You need airflow on the memory if you overclock it.

Also, you might consider remounting; 67C is not very cool.

Which AIO and what fan speed are you running?


----------



## grunion

Quote:


> Originally Posted by *hotrod717*
> 
> I'm sure your Ti can and will beat that in Valley. Unfortunately Valley is very Nvidia-biased but, as I said, it's a good comparison 290X to 290X. Hopefully I'll be able to get some good 3DMark perf runs in. That's a better comparison for Nvidia vs AMD.


Maybe my DCII could, but my ref Ti boosts 150 less.
I removed the DCII because it dumped too much heat into my Prodigy.


----------



## hotrod717

Having a good night!

http://www.3dmark.com/3dm11/8314059


----------



## Arizonian

Quote:


> Originally Posted by *PJFT808*
> 
> I would like to be added to the club, finally.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/b2dwb/
> 
> ASUS R9290-4GD5
> 
> Water
> 
> I got this day one from the Egg. I used it reference-cooled until a few weeks ago. Modded with a cheapo H55, NZXT bracket, XSPC backplate, and Gelid VRM kit.
> 
> Some images:
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added
















Quote:


> Originally Posted by *Mega Man*
> 
> -SNIP-
> 
> !!! you add him with a pic but not meh !!! will have my 4 gpus up and running with gpuz proof later ( hoping tonight ) using my existing rads
> all blocks and 480 rads should be here wed,
> 
> -SNIP-


Did I miss your entry? I could have sworn I had added you but I just went over the roster and I haven't. Just link me bud, I'm sorry.


----------



## Goride

Quote:


> Originally Posted by *Goride*


Quote:


> Originally Posted by *cennis*
> 
> You need airflow on the memory if you overclock it.
> 
> Also, you might consider remounting; 67C is not very cool.
> 
> Which AIO and what fan speed are you running?


The memory felt that hot at stock speeds. But I do have a fan blowing across the card from the side of the case, though I doubt it hits all of the memory chips perfectly. (At the time of testing I did not have this fan blowing on my 290x, as I had the side of the case off. When I put the side back on, the VRM 1 temps actually dropped down into the mid 80s. But that was with the side fan at max speed, which I will not typically have on.)

I figured 67c was decent for 1200/1500 @ +133mV. It runs around 52-55c stock speed/voltage.

My AIO is an AMD Liquid Cooler System from the FX-8150. I believe it is just a differently badged version of the Antec Kuhler 920. Instead of the normal fans, I replaced them with 2 Gentle Typhoon fans in a push-pull configuration. I do not know their exact RPM, but I believe they were running at a default slow speed. When I have time, I can try to turn them up a bit and see what happens.


----------



## cennis

Quote:


> Originally Posted by *Goride*
> 
> The memory felt that hot at stock speeds. But I do have a fan blowing across the card from the side of the case, though I doubt it hits all of the memory chips perfectly. (At the time of testing I did not have this fan blowing on my 290x, as I had the side of the case off. When I put the side back on, the VRM 1 temps actually dropped down into the mid 80s. But that was with the side fan at max speed, which I will not typically have on.)
> 
> I figured 67c was decent for 1200/1500 @ +133mV. It runs around 52-55c stock speed/voltage.
> 
> My AIO is an AMD Liquid Cooler System from the FX-8150. I believe it is just a differently badged version of the Antec Kuhler 920. Instead of the normal fans, I replaced them with 2 Gentle Typhoon fans in a push-pull configuration. I do not know their exact RPM, but I believe they were running at a default slow speed. When I have time, I can try to turn them up a bit and see what happens.


Fan speed could be a reason.
My H90 runs that same overclock under 60C looping 3DMark 11 GT4,
max fan 1500 RPM, push only.


----------



## Matt-Matt

So I've left it for a bit to see if the drivers/utilities improved.
But I'm still getting stable overclocks fine; it's just that when the PC reboots, it's like it's "forgetting" to set the voltage, but the core/memory still run overclocked, and therefore I crash.









I've tried both Afterburner and Trixx. Going to try them now but they're on the same version so I doubt it's gonna do anything.


----------



## aznsniper911

Maybe I'm too stupid for this in the middle of the night, but is anyone having issues using "/wi6,32,8d,20" with beta 19 of Afterburner? I can't seem to get past the +100mV slider!


----------



## Mercy4You

Guys, I need some help from you to determine why my Sapphire Tri-X locks up in the 'older' game Battlefield: Bad Company 2.









About 20-30 minutes in-game, *a freeze and total system lockup*. I've tried underclocking the card, enabling/disabling Vsync, etc...

In the logfile in the attachment I have 3 crashes, easy to see, at 3 different core/memory clock speeds:

1: 900/1100,
2: 950/1200
3: 1000/1250

GPUBFBC2.txt 1002k .txt file


I know my card is overkill for this game @ 1920x1080, but it should work anyway I guess









Maybe it's a needle in a haystack, but I appreciate any input









*BTW: card runs just fine on BF3 and BF4*


----------



## sena

Guys, how safe is it to keep a 290X at +200mV (Trixx) for everyday gaming on a full-cover waterblock?


----------



## King4x4

+50mv is my max for daily use.


----------



## Forceman

Quote:


> Originally Posted by *Matt-Matt*
> 
> So I've left it for a bit to see if the Drivers/Utilities improved..
> But I'm still getting stable overclocks fine, it's just when the PC reboots it's like it's "Forgetting" to set the voltage but the core/memory run overclocked and therefore I crash..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've tried both Afterburner and Trixx. Going to try them now but they're on the same version so I doubt it's gonna do anything.


I had the same problem, only mine happened after the monitor went to sleep. If it is happening on boot, someone suggested going into Task Scheduler and delaying Afterburner on startup - apparently the problem happens if CCC starts after AB and resets the voltage. That didn't help my problem, so I just flashed the PCS+ BIOS, which has the +50mV already incorporated, instead.


----------



## Widde

Running +100mV 24/7 on ref cooling


----------



## dureiken

Hi,

first, thanks for this forum and this post, very helpful. I would like to be added to the 290X owners club: I have 2 Sapphire 290Xs, watercooled.

http://www.techpowerup.com/gpuz/74ykr/

I would like to know if any 290(X) owners have managed to downsample at a resolution higher than 2720x1530? I can't do more than that with the registry method and Catalyst 13.12.

Thanks a lot


----------



## centvalny

290X on Z97



http://imgur.com/hCF3I3Y


----------



## sena

Quote:


> Originally Posted by *Widde*
> 
> Running +100mV 24/7 on ref cooling


----------



## Blue Dragon

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Mercy4You*
> 
> Guys, I need some help from you to determine why my Sapphire Tri-X locks up in the 'older' game Battlefield: Bad Company 2.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> About 20-30 minutes in-game, *a freeze and total system lockup*. I've tried underclocking the card, enabling/disabling Vsync, etc...
> 
> In the logfile in the attachment I have 3 crashes, easy to see, at 3 different core/memory clock speeds:
> 
> 1: 900/1100,
> 2: 950/1200
> 3: 1000/1250
> 
> GPUBFBC2.txt 1002k .txt file
> 
> 
> I know my card is overkill for this game @ 1920x1080, but it should work anyway I guess
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Maybe it's a needle in a haystack, but I appreciate any input
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *BTW: card runs just fine on BF3 and BF4*






Have you tried re-installing the game?


----------



## Roboyto

Quote:


> Originally Posted by *Widde*
> 
> Running +100mV 24/7 on ref cooling


Can I take a guess...65% fan?

I even tried upgrading TIM and pads on my 2nd reference card before attaching G10, and it made little difference.


----------



## Mercy4You

Quote:


> Originally Posted by *Blue Dragon*
> 
> 
> have you tried re-installing game?


Not yet, though it's a good idea!


----------



## lionheart73

I would like to be added to the club. I am running two Sapphire Tri-x 290's at stock. I also have one Sapphire 290X that's not installed. I plan on switching to water eventually.

http://www.techpowerup.com/gpuz/6b67v/.


----------



## Widde

Quote:


> Originally Posted by *Roboyto*
> 
> Can I take a guess...65% fan?
> 
> I even tried upgrading TIM and pads on my 2nd reference card before attaching G10, and it made little difference.


41% fan speed at idle both cards at 42 degrees ^^

Pic of the fan profile http://piclair.com/d00pj







Many would consider it very loud, but as long as I get the performance I don't mind


----------



## Roboyto

Quote:


> Originally Posted by *Widde*
> 
> 41% fan speed at idle both cards at 42 degrees ^^
> 
> Pic of the fan profile http://piclair.com/d00pj
> 
> 
> 
> 
> 
> 
> 
> Many would consider it very loud but as long as I get performance I dont mind


At that low a fan speed they must come very close to, or hit, the throttling temperature under heavy load. The default fan setting for 290s was 42% I think, and then they changed it to 47% to keep the cards from throttling. With +100mV I don't see how they couldn't be throttling.

40% isn't horrible; once you breach 55% is when it becomes ridiculous.

A couple of Gelid Icy Visions, with their VRM kit, would be a cheap way to make them near silent.


----------



## Blue Dragon

I'd like to join








picked up an Asus 290 OC for the low
http://www.techpowerup.com/gpuz/wc499/



Spoiler: had little trouble with GPU usage...



I used Afterburner to run my 7870 CF and loved the fan profile and easy overclocking features, but alas... it just doesn't work for me with this card.
So the problem I was having was my GPU usage jumping around, causing terrible frames in games and benches... things would work, but the usage polling or whatever would slow my system down just long enough to drop my FPS and cause lag. It was being caused (as far as I can tell) by AB and CCC together. AB wasn't the only thing affecting usage, it was just more noticeable through AB; HWiNFO and GPU-Z were also showing usage spikes and drops, just not as pronounced as in AB's graph.
290 f'ed-

compared it to my 7870 hawks-

Here's where it gets a little blurry... looking through this thread I tried several different settings and resets between AB and CCC, and tried 13.12 - 14.4 drivers, etc. I would always figure out a way to get it to run properly for one thing (bench or game), then it would always go back once I hit idle afterwards.










but finally (after many long hours) I found settings/drivers/programs that keep usage under control-

~CCC - 13.251-131206a-165807C-Asus (14.4) - don't be fooled by the date on their site like I was at first...








~Asus GPU Tweak 2.4.9.2
I disabled Overdrive in CCC and my Tweak prog shows the power target at 100% - not sure what the deal is with 0% in AB and CCC... assuming it means 50% would be equal to 150% in Tweak?
idle-



Side note - I did not have to uninstall AB, I just stopped it from loading at Windows start-up.

The main thing I had a problem with is the Asus drivers. If their site hadn't been wrong about the dates, I might not have wasted so much time jumping between drivers and power limits etc...
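If that guess about the two power sliders is right, the conversion is just an offset scale versus an absolute scale. A toy sketch of that assumption (the function name is made up, and nothing here reflects documented AMD/Asus behavior - it only illustrates the guess above):

```python
def ab_offset_to_tweak(ab_power_offset_pct):
    """Convert an Afterburner-style power-limit offset (e.g. +50)
    to a GPU Tweak-style absolute power target (e.g. 150).
    Purely illustrative of the guess above, not documented behavior."""
    return 100 + ab_power_offset_pct

# AB's default 0% would read as 100% in Tweak, and +50% as 150%.
print(ab_offset_to_tweak(0), ab_offset_to_tweak(50))
```

So under that assumption, the sliders describe the same limit in two notations.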


----------



## Roy360

I use my R9 290s to mine whenever I'm not using the PC, but for some reason the combined hashrate of all 3 watercooled cards dropped to 20kh/s. CGMiner reports over 2400, but every pool I try gives me 20 at most. HW errors are in the thousands.....

Cards damaged?

EDIT: NVM, it was the RAM. I took some of the RAM from my PC to build a new one, and that lack of RAM seems to have been causing the errors...


----------



## Widde

Quote:


> Originally Posted by *Roboyto*
> 
> At that low of a fan speed they must come very close to, or hit, the throttling temperature under heavy load. Default fan/clocks for 290s is/was 42% fan I think, and then they changed it to 47% to keep the cards from throttling. With 100mV I don't see how they couldn't be throttling.
> 
> 40% isn't horrible, once you breach 55% is when it becomes ridiculous.
> 
> A couple Gelid Icy Vision, with their VRM kit, would be a cheap solution to making them near silent.


80 degrees while playing bf4







But it is noisy - just not to the point that it bothers me







between 70-80% fan speed


----------



## TheDarkLord100

Switched from SLI 680s 4GB to CF 290s (both Sapphire), and DAMN they're loud (I've heard people complain, but damn). Definitely going under water soon


----------



## Roboyto

Quote:


> Originally Posted by *Widde*
> 
> 80 degrees while playing bf4
> 
> 
> 
> 
> 
> 
> 
> But it is noisy but not to the point that it bothers me
> 
> 
> 
> 
> 
> 
> 
> between 70-80% fan speed


Ahh, 41% for idle, 70%+ for gaming - that makes more sense.

I don't know how you put up with that. I benched both my 290s with the stock cooler first, and it is obnoxiously loud at that point. Fan @ 75% is good up to ~125mV and ~1250 core, if your card will go that far. Temps were in the high 80s if I remember correctly.


----------



## Tokkan

Quote:


> Originally Posted by *Roboyto*
> 
> At that low of a fan speed they must come very close to, or hit, the throttling temperature under heavy load. Default fan/clocks for 290s is/was 42% fan I think, and then they changed it to 47% to keep the cards from throttling. With 100mV I don't see how they couldn't be throttling.
> 
> 40% isn't horrible, once you breach 55% is when it becomes ridiculous.
> 
> A couple Gelid Icy Vision, with their VRM kit, would be a cheap solution to making them near silent.


You got it all wrong. He said 40% at idle, not at load. According to his fan profile, 90ºC=100% Fan speed.


----------



## Mercy4You

Quote:


> Originally Posted by *Blue Dragon*
> 
> 
> have you tried re-installing game?


Guess what, I just re-installed the game and no crashes anymore, thx


----------



## heroxoot

Quote:


> Originally Posted by *TheDarkLord100*
> 
> Switched from SLI 680s 4GB to CF 290s (both Sapphire) and DAMN they're loud (i've heard ppl complain but damn), definitely going under water soon


That's why you only buy the ref cooler if you plan to water cool. Aftermarket coolers like the MSI Twin Frozr or the Sapphire Toxic are much quieter all around in comparison.


----------



## TheDarkLord100

Quote:


> Originally Posted by *heroxoot*
> 
> That's why you only buy ref cooler if you plan to water cool. 2nd party coolers like MSI Twin Frozr or the Sapphire toxic coolers are much quieter all around in comparison.


lol I know. One of the reasons I switched was that my 680s were the Galaxy OC edition, which I couldn't find any blocks for.


----------



## Blue Dragon

Quote:


> Originally Posted by *Mercy4You*
> 
> Guess what, I just re-installed the game and no crashes anymore, thx


The only problem is that whatever program or game you installed after that original install will probably freeze as well (or just not launch) because of something they share (i.e. DirectX), so keep that in mind if you start having problems with one of your other games. Though there is a slight chance it was just a bad cluster on your HDD and it will never pop up again.


----------



## Mercy4You

Quote:


> Originally Posted by *Blue Dragon*
> 
> The only problem is that whatever program or game you installed after that original install will probably freeze as well (or just not launch) because of something they share (i.e. DirectX), so keep that in mind if you start having problems with one of your other games. Though there is a slight chance it was just a bad cluster on your HDD and it will never pop up again.


Meaning in that case I'd also have to reinstall the other games, or even reinstall my OS? I do get a DirectX error in BF4 once in a while...


----------



## Blue Dragon

Quote:


> Originally Posted by *Mercy4You*
> 
> Meaning in that case I also have to reinstall the other games or even reinstall my OS? I do have a DirectX error on BF4 once in a while...


Ouch. If you can do it without losing anything, it would probably be the safer way, but with all the new Mantle tech and whatnot, it could possibly just be driver issues alongside whatever problem you had with this game... Sorry, that doesn't really answer your question, but I'd hate to suggest wiping the OS and have the problem turn out to be just that you can't have two certain games loaded on your system at the same time...


----------



## Mercy4You

Quote:


> Originally Posted by *Blue Dragon*
> 
> ouch, if you can do it without losing anything it would probably be the safer way, but with all the new mantle tech and what not, could possible just be driver issues along whatever problem you had with this game... sorry that doesn't really answer ur question but I'd hate to suggest wiping the OS and the problem ends up just being that you can have two certain games loaded on your system at the same time...


I get your point









Let's see where this gets me for now; BFBC2 is working, so that's a win. The other DirectX BF4 problem was once a week or so - like you say, maybe ongoing driver/Mantle/BF4 bugs...
No, I'm not eager to reinstall my OS; it'll cost me a day's work to get everything back up and running.

But I value your input very much, and I now know that locking up/freezing is not necessarily GPU-related, as I was assuming in this case


----------



## kayan

Quote:


> Originally Posted by *geggeg*
> 
> I have a sinking feeling that it is Windows 8 at fault, honestly, and nothing to do with the GPU.


Quote:


> Originally Posted by *eternal7trance*
> 
> I would also try a different power supply just to be sure since you probably have another you can use


Quote:


> Originally Posted by *Roboyto*
> 
> Used mining card is going to be a gamble anyway you look at it. If the card was abused then it is a possibility it could have issues.
> 
> A 250W cooler would probably do OK, but you wouldn't be able to run any earth-shattering OCs on it. The more important question is if the cooler would be compatible with the 290.
> 
> A GTX 560 is nothing like the volcanic 290(X), don't expect those temperatures, with that cooler, on one of these beasts
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Go with an AIO, way better option and you're not epoxying heatsinks to everything.
> 
> NZXT Kraken G10 since they're readily available: http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=kraken+g10&N=-1&isNodeId=1
> 
> Order one of these heatsinks: http://www.feppaspot.com/servlet/the-897/GELID-Solutions-Icy-Vision/Detail
> 
> Zalman LQ310 $65 - $20 MIR: http://www.newegg.com/Product/Product.aspx?Item=N82E16835118134
> 
> And get results like I experienced: http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition/20#post_22214255
> 
> Try swapping RAM from another PC if you have it. I've had similar BSOD with that NTOSKRNL.exe and RAM was the culprit on 2 occasions I believe.
> 
> Ah, just remembered something! I just had a laptop I repaired with Win8 on it that was having video flickering issues; it would happen as soon as you left the "Metro" tile interface by going to the desktop. I couldn't remedy the problem, so I used the "Refresh My PC" option of Win8 and all was well. Glorious feature to reinstall the OS without losing/affecting anything else. Ah, and don't forget the free 8.1 upgrade if you haven't taken advantage of it.


So, just an update for everyone who tried to help me out. It turns out that the issue was actually two-fold. I swapped out my wife's RAM for a single stick I had laying around here somewhere, and it stopped blinking the screen off and on, but continued crashing. We then did a driver re-install of 14.4 and wham, problem solved. No crashes, no blue screens. Her PC has been rock-solid since the RAM swap + driver install.

Thank you all for your help and suggestions.

Also, I got around to overclocking the 290X I bought her, and it OCs better than the one I have on water - and hers is on air. She'll hit 1100/1400 at 50% power, no additional voltage. I can hit 1100/1350ish with 50% power. Contemplating swapping our cards... *shhh* don't tell her.


----------



## Widde

Quote:


> Originally Posted by *Roboyto*
> 
> Ahh, 41% for idle. 70%+ for gaming, that makes more sense.
> 
> I don't know how you do that. I benched both my 290s first with the stock cooler and it is obnoxiously loud at that point. Fan @75% is good up to ~125mV and ~1250 core if your card will go that far. Temps were in the high 80s if I remember correctly.


My cards max out at 1125; after that it doesn't matter if I pump +200mV through them







Doing 1080/1350 with +100mV. I think I have the worst cards of the bunch


----------



## Mercy4You

Quote:


> Originally Posted by *Widde*
> 
> My cards max out at 1125; after that it doesn't matter if I pump +200mV through them
> 
> 
> 
> 
> 
> 
> 
> Doing 1080/1350 with +100mV. I think I have the worst cards of the bunch


Well, for 1920x1080 you're just fine


----------



## Widde

Quote:


> Originally Posted by *Mercy4You*
> 
> Well, for 1920x1080 you're just fine


I don't "need" it, but I want more


----------



## Mercy4You

Quote:


> Originally Posted by *Widde*
> 
> I dont "need" it but I want more


Then, you should have bought the X


----------



## cennis

Anyone know how 290 CrossFire/TriFire works when the cards are clocked differently?

i.e. core clocks 1200/1200/1200 vs 1200/1200/1150 vs 1150/1150/1150

Nvidia's drivers seem to limit all of them to the lowest one


----------



## heroxoot

Quote:


> Originally Posted by *cennis*
> 
> Anyone know how 290 cross/trifire works when cards are clocked differently?
> 
> i.e. core clocks 1200 1200 1200 vs 1200 1200 1150 vs 1150 1150 1150
> 
> nvidia's driver seem to limit them all of them to the lowest one


Like anything of the sort, they throttle down to match the lowest frequency.
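In other words, the effective clock is just the minimum across the cards. A toy illustration of that rule of thumb (the clock numbers are the hypothetical ones from the question above; the real driver behavior is of course more involved than a plain min()):

```python
def effective_clock(card_clocks_mhz):
    """CrossFire-style pairing: every card is held to the slowest card's clock."""
    return min(card_clocks_mhz)

# The three hypothetical tri-fire setups from the question:
for setup in ([1200, 1200, 1200], [1200, 1200, 1150], [1150, 1150, 1150]):
    print(setup, "->", effective_clock(setup), "MHz effective")
```

So a single card clocked 50MHz lower drags the whole set down by that 50MHz.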


----------



## Roboyto

Quote:


> Originally Posted by *Widde*
> 
> My cards max out at 1125 after that doesnt matter if I pump +200mV through them
> 
> 
> 
> 
> 
> 
> 
> Doing 1080/1350 with +100mV, Think I have the worst cards out of the bunch


Cooling the cards with something superior to the reference blower may allow you to get more out of them. When they're that hot they aren't being very efficient.


----------



## dureiken

Hi

I have 2 Sapphire 290X BF4 Editions (Hynix memory) under watercooling, and I can't OC the GPU clock past 1180 with +100mV. I tried with the PT1 BIOS - same.

On top of that, if I put more than +100mV on both cards, I get black screens, artifacts...

Could someone help me please ?

Thanks


----------



## Roboyto

Quote:


> Originally Posted by *dureiken*
> 
> Hi
> 
> I have 2 Sapphire 290X BF4 Editions (Hynix memory) under watercooling, and I can't OC the GPU clock past 1180 with +100mV. I tried with the PT1 BIOS - same.
> 
> On top of that, if I put more than +100mV on both cards, I get black screens, artifacts...
> 
> Could someone help me please ?
> 
> Thanks


What's your system as a whole looking like, could we get a list of components? One of the most important is going to be your PSU.

The black screen is more than likely from memory OC, are you overclocking the memory?

What kind of temperatures are you getting for core and more importantly VRM1 on both cards?

Xfire overclocking is a bit more work than a single card. Typically the best practice is to *bench both cards separately* to find their limits individually. Both cards can only go as fast as the slower of the two.

If other components are OCd then please list those as well.

The more information you give, the quicker and more likely the community will be able to help you out.


----------



## dureiken

I have an i7 3770K @ 4.6GHz 1.25V, and an Akasa 1000W PSU.

I tried with and without the memory OC; same result. I also tried each card (one at stock, the other OC'd); same.


----------



## lawson67

For anyone who's interested: Gigabyte released the F4 and F4L BIOS updates last week for the Elpida-memory Gigabyte Windforce R9 290 cards, and with the new BIOS I've managed to overclock my cards' RAM to 1600MHz, where before I could only hit 1450MHz and needed more voltage. So now I can hit 1600MHz with less voltage than I needed to hit my old limit of 1450MHz before the flash.

However, for whatever reason I don't see a performance gain in Heaven Benchmark 4.0 with anything over 1500MHz. Strange?


----------



## Arizonian

Quote:


> Originally Posted by *dureiken*
> 
> Hi,
> 
> first, thanks for this forum and this post, very helpful. I would like to be added to 290X owner's club : I have 2 Sapphire 290X, watercooled.
> 
> http://www.techpowerup.com/gpuz/74ykr/
> 
> I would like to know if any 290(X) owners have managed to downsample at a resolution higher than 2720*1530? I can't do more than that with the registry method and Catalyst 13.12.
> 
> Thanks a lot


Congrats - added
















Quote:


> Originally Posted by *centvalny*
> 
> 290X on Z97
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://imgur.com/hCF3I3Y










I was so about to build a Z97 rig and then 2560x1440 120 Hz monitors have to go and finally show up. Nice man - and you forgot to mention that's a Matrix 290X on Z97.









Quote:


> Originally Posted by *lionheart73*
> 
> I would like to be added to the club. I am running two Sapphire Tri-x 290's at stock. I also have one Sapphire 290X that's not installed. I plan on switching to water eventually.
> 
> http://www.techpowerup.com/gpuz/6b67v/.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added

Quote:


> Originally Posted by *Blue Dragon*
> 
> I'd like to join
> 
> 
> 
> 
> 
> 
> 
> 
> picked up an Asus 290 OC for the low
> http://www.techpowerup.com/gpuz/wc499/
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: had little trouble with GPU usage...
> 
> 
> 
> I used Afterburner to run my 7870 CF and loved the fan profile and easy overclocking features, but alas... it just doesn't work for me with this card.
> So the problem I was having was my GPU usage jumping around, causing terrible frames in games and benches... it would work, but the usage polling or whatever would slow my system down just long enough to drop my FPS and cause lag. It was being caused (as far as I can tell) by AB and CCC together. AB wasn't the only thing affecting usage, it was just more noticeable through AB; HWiNFO and GPU-Z were also showing usage spikes and drops, just not as pronounced as in AB's grid.
> 290 f'ed-
> 
> compared it to my 7870 hawks-
> 
> here's where it gets a little blurry... looking through this thread I tried several different settings and resets between AB and CCC, and tried 13.12 - 14.4 drivers etc. I would always figure out a way to get it to run properly for one thing (bench or game), then it would always go back once it hit idle afterwards.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> but finally (after many long hours) I found settings/drivers/programs that keep usage under control-
> 
> ~CCC- 13.251-131206a-165807C-Asus (14.4) - don't be fooled by the date on their site like I was at first...
> 
> 
> 
> 
> 
> 
> 
> 
> ~Asus GPU Tweak 2.4.9.2
> I disabled OverDrive in CCC and my Tweak prog shows the power target at 100% - not sure what the deal is with 0% in AB and CCC... assuming it means 50% there would be equal to 150% in Tweak?
> idle-
> 
> 
> 
> side note- I did not have to uninstall AB - just stopped it from loading at windows start-up.
> 
> main thing I had problem with is Asus drivers. if their site wasn't wrong about the dates, I might not have wasted so much time jumping between drivers and power limits etc...
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *TheDarkLord100*
> 
> Switched from SLI 680s 4GB to CF 290s (both Sapphire) and DAMN they're loud (i've heard ppl complain but damn), definitely going under water soon
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Roboyto

Quote:


> Originally Posted by *dureiken*
> 
> I have a i7 3770k 4.6Ghz 1.25V, and Akasa 1000W PSU.
> 
> I tried with/without OC memory, the same. I also tried each card (one stock, other OC), same.


Your answer is still vague; like I said, the more specific you are, the easier it is to help you. What power supply exactly?

I meant you have to run each card entirely separately, not just leave one at stock clocks and OC the other. Disable the PCIe slot, disconnect the PCIe power cables, or remove the card from the slot if possible.

For it to black screen without a RAM OC, you probably have some other issue.

What drivers are you running? 13.12 are the only drivers I have had work flawlessly so far.


----------



## dureiken

I have an Akasa Venom 1000W.

I will try tomorrow to disable one card and test the other. What method do you suggest? Is there a guide to OC the 290X the proper way?

Do I have to go back to the original BIOS?

I use the 13.12 drivers under Windows 7 64-bit.

Thanks


----------



## Talon720

Quote:


> Originally Posted by *bond32*
> 
> Something isn't installed properly. Also if you really have that much air in the loop you shouldn't be running power to anything but the pump. Are you sure the parallel block is configured properly? Perhaps you have incorrect flow direction somewhere.


I followed the directions and picture from EK for the parallel FC terminal. Everything was working fine with the two blocks and the serial terminal. I was able to get more air bubbles out of the mobo VRM block, but ever since the 3rd GPU block was added in parallel it's not bleeding as easily as it was. Water temp is currently 25.3C. If I rock the case back and forth, GPU temps will fluctuate. It almost seems like the 3rd block added enough restriction that my single D5 pump can't push all the air out like before. I might have to drain my loop again because of a bad/leaky rotary fitting.


----------



## Roboyto

Quote:


> Originally Posted by *dureiken*
> 
> I have a Akasa Venom 1000W.
> 
> I will try tomorrow to disable a card and test the other one. What method do you suggest ? Is there a guide to OC 290X in a proper way ?
> 
> Do I have to go back to original bios ?
> 
> I use 13.12 drivers under 7 64 bits.
> 
> Thanks


Good information for nearly all things concerning 290(X): http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781

Whatever drivers you came from previously it is a good idea to make sure all old traces are gone with Display Driver Uninstaller. Download, run it, and re-install 13.12 just to be sure there isn't a driver problem.

OK, so power likely isn't your issue.

I would say go back to the original BIOS if the flash didn't help you any.

What's your water loop composed of? Blocks, radiators, etc? How hot is VRM1? VRM1 temps can still be problematic under water.

OC the core first, and worry about memory later. It doesn't give very much performance boost with the large bus on these cards.

When OCing it's best to take notes so you can find a trend of where your card performs its best.

Most common benches used are: Anything 3DMark, Unigine Heaven/Valley, Catzilla, Tomb Raider, Bioshock, etc. *FurMark is a poor choice.*

Run card at stock settings with your benchmark of choice and record the results.

If you're using CCC/OverDrive for your OC, disable it immediately.

Use one of the following:


Trixx
Afterburner
HIS iTurbo
ASUS GPU Tweak (needs an ASUS BIOS)



When you start overclocking you can make fairly large jumps in core speed; say 30-40MHz. 
*Pushing the core too far almost always ends up with artifacts, tearing, or screen flickering. *
This means you need more voltage.




Apply more voltage, but do so slowly; 10-15mV at a time.
Re-Run your bench and see if the artifacting/tearing/flickering are gone, and if it runs properly the score should increase.
If the bench runs normal and your score increases, then you can push the core clock further. 


If it doesn't improve, then add a little more voltage and re-test.


Eventually you will hit a point where you can't increase the core clock by ~30MHz.
Use a smaller interval to fine tune the card. 
You can overclock down to the MHz, but that is completely up to you if you want to spend the time.



Personally, I haven't had a core overclock cause a black screen on either of my 290s; the worst-case scenario was EXTREME flickering of the video signal.

Blackscreen/Lockup has always been caused by RAM clocks.

Under water you will *probably* run out of voltage to apply before you run into a temperature issue; Xfire 290s do produce a *crap load* of heat however. VRM1 would probably overheat before the core.

Your core clock of 1180 is still fairly strong, considering it's an 18% increase over the 290X's 1000MHz stock clock. Watercooling doesn't guarantee maximum clocks; luck in the silicon lottery definitely plays its part.


----------



## Widde

Quote:


> Originally Posted by *Roboyto*
> 
> Cooling the card with something superior to the reference blower may allow you to get more out of them. When they're that hot they aren't being very efficient.


Gonna watercool the rig in the near future


----------



## Roboyto

Quote:


> Originally Posted by *lawson67*
> 
> For anyone that is interested Gigabyte have released last week the F4 and F4L Bios update for ELPIDA Memory for the Gigabyte windforce R9 290 cards and i have managed to overclock my cards ram with the new Bios to 1600mhz when before i could only hit 1450mhz and i needed more voltage!...So now i can hit 1600mhz with less voltage than i needed to hit my old limit of 1450mhz before the flash.
> 
> However for whatever reason i do not see a performance gain in Heaven Benchmark 4.0 with anything over 1500mhz strange?


Heaven didn't respond very well to memory clocks in my experience. The bench score only increased 6.42% when pushing the RAM to 1675, compared to the stock 1250.



These were all done at 1080P resolution. Been meaning to test at 5760*1080, but haven't gotten around to it yet.
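For what it's worth, that works out to very weak scaling; a quick back-of-the-envelope using only the numbers from the post:

```python
# Scaling arithmetic from the post: RAM pushed from 1250 to 1675MHz,
# Heaven score up only 6.42% at 1080p.
stock_mem, oc_mem = 1250, 1675
mem_gain = (oc_mem - stock_mem) / stock_mem   # 0.34, i.e. a 34% memory OC
score_gain = 0.0642                           # measured score increase
efficiency = score_gain / mem_gain            # roughly 0.19

print(f"memory +{mem_gain:.0%} -> score +{score_gain:.2%}, "
      f"~{efficiency:.0%} scaling efficiency")
```

In other words, each percent of memory clock bought only about a fifth of a percent of Heaven score, which fits the earlier point that the 512-bit bus leaves little to gain from memory OC.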


----------



## Roy360

Has anyone tried cooling an R9 290 with an EK uni block? I'm thinking of cooling my final R9 290 with my old uni block instead of another expensive full-cover block. I'll probably add my Club3D R9 290X to my build instead of the ASUS R9 290, since the reference VRMs run cooler. Fujipads or direct contact to attach the small VRM heatsinks, and industrial zip ties to mount a 2150rpm Gentle Typhoon to cool VRM1. (Could I wire it to the fan connector on the R9 290?)

Does that sound like it would work? VRM2 won't have any direct airflow, though I guess I could get a 30mm fan.

By the way, a 1200W should be enough, right? My rig currently pulls 900W at the wall with 3 R9 290s mining.
I thought about adding a CX430 to the build, but I'm not entirely sure that's safe, even if I do connect the grounds.


----------



## Roboyto

Quote:


> Originally Posted by *Widde*
> 
> Gonna watercool the rig in the near future


Do iiiiiiiiiiiiiiiiit


----------



## FlighterPilot

Quote:


> Originally Posted by *cennis*
> 
> Anyone know how 290 cross/trifire works when cards are clocked differently?
> 
> i.e. core clocks 1200 1200 1200 vs 1200 1200 1150 vs 1150 1150 1150
> 
> nvidia's driver seem to limit them all of them to the lowest one


I have a 290, and a 290x in crossfire. I have the 290x clocked at 949Mhz, and the 290 at 1049mhz to try and even out the performance -- haven't had any issues so far.

Thats with different cards, _and_ different clocks.


----------



## Roboyto

Quote:


> Originally Posted by *Roy360*
> 
> Has anyone tried cooling a R9 290 with an EK uni block? I"m thinking of cooling my final R9 290 with my old uni block instead of another expensive full cover. I"ll probably add my Club3D R9 290X to my build, instead of the ASUS R9 290, since the reference VRMs are cooler. Fujipads or direct contact to connect the small VRMs heatsyncs, and use industrial zip ties to connect a 2150rpm gentle typhoon to cool VRM1. (Could I wire it to the fan connector on the R9 290?)
> 
> Does that sound like it would work? VRM2 won't have any direct airflow, thought I guess I could get a 30mm fan.
> 
> by the way, a 1200W should be enough right? My rig currently sucks 900W at the wall with 3 R9 290s mining.


I have a 290 in my HTPC with a Kraken G10 on it, so you should be able to do it and get good results.

The Gelid VRM heatsink kit is the way to go if they are available; Kit information in link below.

Check out my thread here to see: http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition/20#post_22214255

VRM2 doesn't need any airflow whatsoever. In my experience it doesn't even break 70C with only a heatsink attached with JunPus thermal tape.

1200W should be enough, depending on how far you plan on overclocking them.
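Rough headroom arithmetic for that (the 90% PSU efficiency figure is an assumption, not from the posts): a wall meter reads AC input, while the PSU is rated on its DC output, so the two aren't directly comparable.

```python
# Wall draw vs PSU capacity: divide the AC wall reading by efficiency
# to estimate the DC load the PSU actually delivers.
wall_w = 900                  # measured at the wall, 3x 290 mining (from the post)
efficiency = 0.90             # ASSUMED figure for a decent 80 PLUS Gold unit
dc_load = wall_w * efficiency # ~810W drawn from the PSU's rails
headroom = 1200 - dc_load     # margin on a 1200W unit, before any further OC

print(f"DC load ~{dc_load:.0f}W, ~{headroom:.0f}W headroom on a 1200W PSU")
```

So at stock-ish clocks there's a few hundred watts of margin; a heavy voltage bump on three cards would eat into it quickly.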


----------



## devilhead

Quote:


> Originally Posted by *dureiken*
> 
> Hi
> 
> I have 2 Sapphire 290x BF4 Edition (Hynix memory) under watercooling, I can't achieve to OC GPU clock more than 1180+100mV. I tried with PT1 bios, same.
> 
> More than that, If I put more than +100mV on both cards, I have black screens, artefacts ...
> 
> Could someone help me please ?
> 
> Thanks


Hi, i have 5x 290x







and the black screen problem is the output ports: if you connect your monitor over HDMI and set it to 30Hz, you will get the maximum clock without black screens







I'm using a 120Hz monitor, so for my 290X the max is 1200 on the core without black screens. But over a 30Hz HDMI connection for tests I can reach 1300.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *devilhead*
> 
> Hi, i have 5x 290x
> 
> 
> 
> 
> 
> 
> 
> and the problem with blackscreen is bad output ports, if you will connect to hdmi your monitor and set to 30hz, you will get maximum without black screen
> 
> 
> 
> 
> 
> 
> 
> i'm using 120hz monitor so for my 290X is max. 1200 on core without black screen. But with 30hz hdmi cable for tests i can reach 1300


Hey mate








So if I was to set my DP / HDMI to 30hz , it should eliminate the cutting in and out while benching








Gonna give that a go








Im already benching at 1300mhz +









Stuff I benched last nite........
Catzilla CF 290 all were benched at 1300 / 1487 @ 1.375v - 1.41v vdroop
576p

720p

1080p

1440p

@Sgt Bilko Finally beat your pesky 1440p sub









HOMECINEMA-PC [email protected]@2400 CF WB [email protected]@1487 *27861*









http://www.3dmark.com/3dm11/8316420
That gets me no 1 CF 290 on HWBOT and just scraped in by 1 point










Had a good run apart from a bugged CF Firestrike Extreme sub









http://www.3dmark.com/fs/2125881
Big time bugged combined score


----------



## Dasboogieman

Quote:


> Originally Posted by *devilhead*
> 
> Hi, i have 5x 290x
> 
> 
> 
> 
> 
> 
> 
> and the problem with blackscreen is bad output ports, if you will connect to hdmi your monitor and set to 30hz, you will get maximum without black screen
> 
> 
> 
> 
> 
> 
> 
> i'm using 120hz monitor so for my 290X is max. 1200 on core without black screen. But with 30hz hdmi cable for tests i can reach 1300


So essentially we are bypassing the possibly unstable Video output block (at high OC speeds) on the GPU die to get higher headroom?


----------



## heroxoot

Quote:


> Originally Posted by *FlighterPilot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cennis*
> 
> Anyone know how 290 cross/trifire works when cards are clocked differently?
> 
> i.e. core clocks 1200 1200 1200 vs 1200 1200 1150 vs 1150 1150 1150
> 
> nvidia's driver seem to limit them all of them to the lowest one
> 
> 
> 
> I have a 290, and a 290x in crossfire. I have the 290x clocked at 949Mhz, and the 290 at 1049mhz to try and even out the performance -- haven't had any issues so far.
> 
> Thats with different cards, _and_ different clocks.
Click to expand...

How well do your 290s and 290X fare at the same clocks? My 290X was underperforming a little compared to others, but it may be my CPU. I'm curious how much more performance the shader difference really gives. With the 7950/70 the differences were only 4 - 5fps versus a friend using a similar rig.


----------



## lawson67

Quote:


> Originally Posted by *Roboyto*
> 
> Heaven didn't respond very well to memory clocks from my experience. Bench score only increased 6.42% when pushing RAM to 1675, compared to stock 1250.
> 
> 
> 
> 
> These were all done at 1080P resolution. Been meaning to test at 5760*1080, but haven't gotten around to it yet.


Try running the tests again: set 1500MHz on the first run and note your FPS, then on the second run set 1600MHz (or anything over 1500MHz) and see what happens. I lose 1 FPS at 1600MHz compared to 1500MHz; with anything over 1500MHz I lose FPS. All the way up to 1500MHz I gain FPS big time, but anything over 1500MHz and I start to *lose* FPS. Strange indeed!


----------



## lawson67

Arizonian, can you list me please? I have two R9 290 cards: one is a PowerColor PCS+ and the other is a Gigabyte Windforce R9 290... Thanks


----------



## cennis

I just installed crossfire 290s; why is GT1 in 3DMark 11 not at 100% GPU load?

GT4 is close to 100% load.


----------



## Forceman

Quote:


> Originally Posted by *lawson67*
> 
> Try running the tests again this time set 1500mhz on first run and note your FPS....Then on your second run set 1600mhz or anything over 1500mhz and see what happens?....I lose 1 FPS at 1600mhz compared to 1500mhz!.....or anything over 1500mhz i lose FPS strange indeed!....All the way up to 1500mhz i gaining FPS big time but anything over 1500mhz and i start to *lose* FPS?


Error correction.


----------



## lawson67

Quote:


> Originally Posted by *Forceman*
> 
> Error correction.


What do you mean?


----------



## cennis

Im getting utilization drops on all programs with crossfire, using 13.12

anyone any ideas?


----------



## Forceman

Quote:


> Originally Posted by *lawson67*
> 
> What do you mean?


If you overclock the memory too far it generates errors, and the built in error correction causes it to run slower. Pretty common to see performance tail off if you push the VRAM too far. Happens on Nvidia too.
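A toy model of why scores peak and then tail off (all numbers here are invented, purely to illustrate the replay penalty; GDDR5 error detection forces failed transfers to be retried, so past some clock the retries cost more than the extra MHz gain):

```python
def effective_throughput(clock_mhz, edc_knee=1500, steepness=0.01):
    """Toy GDDR5 EDC model: error/replay rate is zero below the knee,
    then climbs steeply with clock. edc_knee and steepness are made up;
    the knee is set at 1500MHz to mirror the behaviour described above."""
    replay = max(0.0, (clock_mhz - edc_knee) * steepness)  # fraction replayed
    return clock_mhz * (1.0 - min(replay, 1.0))            # useful transfers

# Sweep 1400-1700MHz in 25MHz steps and find the sweet spot.
best = max(range(1400, 1701, 25), key=effective_throughput)
```

Under this model the effective throughput rises linearly up to the knee and falls off beyond it, which is exactly the "gains up to 1500MHz, losses above" pattern reported, even though the card never visibly artifacts.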


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Hey mate
> 
> 
> 
> 
> 
> 
> 
> 
> So if I was to set my DP / HDMI to 30hz , it should eliminate the cutting in and out while benching
> 
> 
> 
> 
> 
> 
> 
> 
> Gonna give that a go
> 
> 
> 
> 
> 
> 
> 
> 
> Im already benching at 1300mhz +
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Stuff I benched last nite........
> Catzilla CF 290 all were benched at 1300 / 1487 @ 1.375v - 1.41v vdroop
> 576p
> 
> 720p
> 
> 1080p
> 
> 
> 
> 1440p
> 
> @Sgt Bilko Finally beat your pesky 1440p sub
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> HOMECINEMA-PC [email protected]@2400 CF WB [email protected][email protected] *27861*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8316420
> That gets me no 1 CF 290 on HWBOT and just scraped in by 1 point
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Had a good run apart from a bugged CF Firestrike Extreme sub
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/2125881
> Big time bugged combined score


Grats on the scores there dude,
i didn't think you would have needed 1300 on the core to beat me though


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Grats on the scores there dude,
> i didn't think you would have needed 1300 on the core to beat me though


I wanted to make sure it wouldn't happen again soon








I managed to find another 1000pts , that I think is well pretty cool eh ?


----------



## dureiken

Quote:


> Originally Posted by *devilhead*
> 
> Hi, i have 5x 290x
> 
> 
> 
> 
> 
> 
> 
> and the problem with blackscreen is bad output ports, if you will connect to hdmi your monitor and set to 30hz, you will get maximum without black screen
> 
> 
> 
> 
> 
> 
> 
> i'm using 120hz monitor so for my 290X is max. 1200 on core without black screen. But with 30hz hdmi cable for tests i can reach 1300


Hi, sorry but I don't understand: you set the refresh rate to 30Hz? I'm on HDMI right now, but I have an active DP->HDMI adapter in stock.

Thanks


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I wanted to make sure it wouldn't happen again soon
> 
> 
> 
> 
> 
> 
> 
> 
> I managed to find another 1000pts , that I think is well pretty cool eh ?


Ah well, you can keep it for a few months









Gonna be a challenge though


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Ah well, you can keep it for a few months
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Gonna be a challenge though


It was a challenge after all this time to finally work the buggars out mate . You put me on da path with ccc and all that


----------



## Mega Man

Quote:


> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cennis*
> 
> Anyone know how 290 cross/trifire works when cards are clocked differently?
> 
> i.e. core clocks 1200 1200 1200 vs 1200 1200 1150 vs 1150 1150 1150
> 
> nvidia's driver seem to limit them all of them to the lowest one
> 
> 
> 
> Like anything of the sort they throttle down to match the lowest frequency.
Click to expand...

no they dont, amd stopped doing that many drivers ago

in other news i only have tri fire, one card is DOA :/

but besides that,..... i have tri fire !!!!!


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> It was a challenge after all this time to finally work the buggars out mate . You put me on da path with ccc and all that


Good to see it worked out for someone else


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Mega Man*
> 
> no they dont, amd stopped doing that many drivers ago
> 
> in other news i only have tri fire, one card is DOA :/
> 
> but besides that,..... i have tri fire !!!!!


DOA already DAMN YOU MURPHY

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Good to see it worked out for someone else


Didn't it work for you ??


----------



## Tokkan

Quote:


> Originally Posted by *FlighterPilot*
> 
> I have a 290, and a 290x in crossfire. I have the 290x clocked at 949Mhz, and the 290 at 1049mhz to try and even out the performance -- haven't had any issues so far.
> 
> Thats with different cards, _and_ different clocks.


Further proof of what I said: they run at their own performance specs. If one of them is slower, it runs maxed out while the other one shows lower usage, basically waiting.
People's crossfire misinformation is simply amazing; every AMD GPU thread should have a sticky on its first page correcting this.
Crossfire != SLI


----------



## Sleazybigfoot

Sapphire R9 290's - Stock cooler


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Didn't it work for you ??


Yea, worked for me but had some issues regardless.

I've got a couple of small things to try out with Catzilla as well, might make a difference, might not


----------



## lawson67

Quote:


> Originally Posted by *Forceman*
> 
> Error correction.


Oh ok thanks for letting me know Forceman i did not know that happened!...However it explains things


----------



## Blue Dragon

Quote:


> Originally Posted by *Tokkan*
> 
> Further proof of what I said, they will run at their according performance specs. If one of them is slower it will run maxed out while the other one gets lower usage, basicly waiting.
> People and their crossfire misinformation is simply amazing and I believe every AMD GPU thread should have it on their 1st page and it should be a sticky correcting this.
> Crossfire != SLI


Some of this may be due to Afterburner having an option (on by default when I installed) that applies the same core/mem clocks to all cards in the system, coupled with the good old 'apply at system startup' setting that everyone loves so much with these R9's. So there are probably a lot of people sticking their cards in who use Afterburner, never knowing that it's AB syncing the clocks and not the AMD drivers.


----------



## sugarhell

Quote:


> Originally Posted by *lawson67*
> 
> What do you mean?


Error correction. Gaming cards produce artifacts all the time, but you can't see them. Overclocking the memory too high produces more errors, so when you're completely unstable the memory is doing double work.


----------



## Dire Squirrel

Really loving this GPU.

First spent half a day finding a driver that only has "minor" issues and now the black screen/lock up starts.
Only happened once so far, but from what I have read, it is due to subpar ram that AMD have apparently known about since day one, but has yet to own up to.

Is AMD actively trying to be a worse company than EA? If so, they are well on their way.


----------



## lawson67

Quote:


> Originally Posted by *sugarhell*
> 
> Error correction. Gaming cards produce artifacts all the times but you cant see them. Overclocking memory too high produce more errors so when you are completely unstable memory does double job.


Thank you for explanation sugarhell


----------



## zGunBLADEz

This is the first card from amd that i had plug and 0 issues with drivers lol

So far from this card
290 @ 1300/1700
P16058
GS18408

http://www.3dmark.com/3dm11/8319711

BTW: you can mix and match a 290X with a 290 in CFX; they will clock at their own speeds.


----------



## motokill36

Hi all,
I want to crossfire my R9 290 and was just wondering how smooth Battlefield 4 is in crossfire.

I had SLI back in the days of the GTX 460s and it used to judder a bit.
I want to go 4K in the end.

Any advice?
Thanks


----------



## velocityx

Crossfire is butter smooth: a single-card-like feeling with double the performance in Mantle. 2x 290 and an FX-8320 @ 4.7 here.


----------



## Mega Man

Quote:


> Originally Posted by *Tokkan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *FlighterPilot*
> 
> I have a 290, and a 290x in crossfire. I have the 290x clocked at 949Mhz, and the 290 at 1049mhz to try and even out the performance -- haven't had any issues so far.
> 
> Thats with different cards, _and_ different clocks.
> 
> 
> 
> Further proof of what I said, they will run at their according performance specs. If one of them is slower it will run maxed out while the other one gets lower usage, basicly waiting.
> People and their crossfire misinformation is simply amazing and I believe every AMD GPU thread should have it on their 1st page and it should be a sticky correcting this.
> Crossfire != SLI
Click to expand...

they used to downclock gpus, it did happen in old drivers, they changed it recently ( last year or so )


----------



## cennis

Quote:


> Originally Posted by *kizwan*
> 
> Thanks to madman (@HOMECINEMA-PC), I fixed it. Turned out that "Enable GPU Scaling" was not enabled in CCC. 3dmark 11 was running centred in 1080p screen, loosing FPS this way. I thought 3dmark 11 update was causing this *issue*.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8258906


http://www.3dmark.com/3dm11/8320018 - my cards seem to have low load in GT1;
same clocks and similar CPU settings, but my GT1 is over 10fps slower... anyone know why?

frame pacing and vsync off, gpu scaling on,
stretched


----------



## cennis

Quote:


> Originally Posted by *cennis*
> 
> http://www.3dmark.com/3dm11/8320018 my card seem to have low load in GT1,
> same clocks and similar cpu settings but my GT1 is over 10fps slower.. anyone knows why?
> 
> frame pacing and vsync off, gpu scaling on,
> stretched


im using 14.4 btw


----------



## Tokkan

Quote:


> Originally Posted by *Mega Man*
> 
> they used to downclock gpus, it did happen in old drivers, they changed it recently ( last year or so )


I'm coming from a 2011 Crossfire of 6850's that had this exact behavior... since 2011 and I really doubt it was from that time.
2nd gen crossfire allowed this, not driver changes. r9 290 and 290x use the 3rd gen, which I believe is named XDMA.
Quote:


> Originally Posted by *cennis*
> 
> http://www.3dmark.com/3dm11/8320018 my card seem to have low load in GT1,
> same clocks and similar cpu settings but my GT1 is over 10fps slower.. anyone knows why?
> 
> frame pacing and vsync off, gpu scaling on,
> stretched


DRAM speed and timings? NB? HT?


----------



## Matt-Matt

Quote:


> Originally Posted by *Mega Man*
> 
> they used to downclock gpus, it did happen in old drivers, they changed it recently ( last year or so )


They haven't downclocked since the 5850's at least; you've been able to run individual clocks since the 68xx cards (6850, 6870, etc).

As for the RAM issues, my housemate got a used card off eBay cheap a bit back and it has the RAM issues of black screens and black/white fuzzies


----------



## nemm

I need some input as to whether I should upgrade my 2x R9 290s and add a third for my upcoming build/upgrade to an 8-core Haswell-E when it's released, if my budget allows. Bear in mind no more upgrades will take place after this for the next 2 years, other than a 4K monitor in 2015.

Thank you in advance


----------



## kizwan

Quote:


> Originally Posted by *Talon720*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Something isn't installed properly. Also if you really have that much air in the loop you shouldn't be running power to anything but the pump. Are you sure the parallel block is configured properly? Perhaps you have incorrect flow direction somewhere.
> 
> 
> 
> I Followed the directions and picture from ek on the parallel fc terminal. Everything was working fine with the two blocks and serial block. I was able to get more air bubbles out of the mobo vrm block, but ever since the 3rd block gpu was added in parallel its not bleeding as easy as it was. Water temp currently is 25.3c. If i rock the case back and forth gpu temps will flucuate. I almost seemed like the 3rd block added enough restriction that my single d5 pump cant push all the air out like before. I might have to drain my loop again because of a bad/leaky rotory fitting.
Click to expand...

In parallel, with two cards, the flow through each block is halved, and with three cards it's down to 1/3. You probably need a second pump with three GPUs in parallel.

My advice is to power only the pump when bleeding. Shake and tilt the case to dislodge the trapped air bubbles.
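As a rough sketch of that split (assuming the pump's total output stays about constant, which added restriction will bend somewhat in practice):

```python
def per_block_flow(total_lph, n_parallel):
    # In a parallel bridge the loop's flow divides across the blocks,
    # so each block sees roughly 1/n of the total (litres per hour here).
    return total_lph / n_parallel

# e.g. a pump managing 300 L/h through the bridge (illustrative number):
two_cards = per_block_flow(300, 2)    # ~150 L/h per block
three_cards = per_block_flow(300, 3)  # ~100 L/h per block
```

That drop in per-block velocity is also why trapped air bleeds out of a parallel bridge so much more slowly than out of a serial one.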


----------



## cennis

Quote:


> Originally Posted by *Tokkan*
> 
> I'm coming from a 2011 Crossfire of 6850's that had this exact behavior... since 2011 and I really doubt it was from that time.
> 2nd gen crossfire allowed this, not driver changes. r9 290 and 290x use the 3rd gen, which I believe is named XDMA.
> DRAM speed and timings? NB? HT?


i thought dram, NB, HT affect physics scores?


----------



## Dire Squirrel

Help me out here guys.

This can't possibly be normal, can it?


And this is NOT right after heavy load.


----------



## rt123

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Help me out here guys.
> 
> This can't possibly be normal, can it?
> 
> 
> And this is NOT right after heavy load.


Is POWERPLAY on in Afterburner?

If not, then turn it ON.


----------



## Talon720

Quote:


> Originally Posted by *aznsniper911*
> 
> Maybe i'm too stupid for this in the middle of the night but is anyone having issues using " /wi6,32,8d,20" for beta 19 of afterburner? I can't seem to get pass the 100mw slider!


Quote:


> Originally Posted by *kizwan*
> 
> In parallel, with two cards the flow per block is halved, and with three cards it is down to 1/3. You probably need a second pump with three GPUs in parallel.
> 
> My advice is to power the pump only when bleeding. Shake and tilt the case to dislodge the trapped air bubbles.


Well, that's what I did. At first I jumped the 24-pin and have a springy switch so it doesn't stay on unless I push it. Filled, ran the pump, stopped, filled, rinsed and repeated while rocking. I turned the case upside down with the pump off. My pump/res is kinda hard to remove, but it's at the highest point. I was reading that, depending on the GPU block (EKs are supposed to do well on low pressure), my D5 might not be enough. Still seems like it should be doing way better, but these cards are super hot. I think I'll just hook up the original single terminals and connect them in series. I can't get my Photon 170 D5 pump to fit and don't have a separate top to put it in. The other thing is my CPU temps are fine.


----------



## Dire Squirrel

Quote:


> Originally Posted by *rt123*
> 
> Is PowerPlay ON in AfterBurner?
> 
> If not, then turn it ON.


I have no such option.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Dire Squirrel*
> 
> I have no such option.


i think he means ULPS?


----------



## rt123

Quote:


> Originally Posted by *Dire Squirrel*
> 
> I have no such option.


Open AfterBurner, go into settings, General

Third option from Bottom.

http://www.overclock.net/t/1472341/lightbox/post/22199327/id/1997398

Unlike that image, you want yours to say *with PowerPlay support*, not without.

Edit:
And yes, you want *ULPS ON* too; these two things prevent the card from downclocking, much like Nvidia's K-Boost.


----------



## Dire Squirrel

Quote:


> Originally Posted by *rt123*
> 
> Open AfterBurner, go into settings,
> 
> Third option from Bottom.
> 
> http://www.overclock.net/t/1472341/lightbox/post/22199327/id/1997398
> 
> Unlike that image, you want yours to say *with PowerPlay support*, not without.
> 
> Edit:
> And yes, you want *ULPS ON* too; these two things prevent the card from downclocking, much like Nvidia's K-Boost.


As I said, I do not have that option:


----------



## rdr09

Quote:


> Originally Posted by *zGunBLADEz*
> 
> This is the first card from amd that i had plug and 0 issues with drivers lol
> 
> So far from this card
> 290 @ 1300/1700
> P16058
> GS18408
> 
> http://www.3dmark.com/3dm11/8319711
> 
> BTW: you can mix match 290x with 290 on cfx they will clock at their own speeds..


Use the 13.xx drivers for benching. They require less VDDC and get a higher graphics score . . .

http://www.3dmark.com/3dm11/7716320


----------



## hwoverclkd

Any thoughts on the new Sapphire Radeon Vapor-X R9 290X 4GB [11226-09-40G]? Looks like it comes with a factory-overclocked GPU and memory.


----------



## zGunBLADEz

Quote:


> Originally Posted by *rdr09*
> 
> Use the 13.xx drivers for benching. They require less VDDC and get a higher graphics score . . .
> 
> http://www.3dmark.com/3dm11/7716320


how much less voltage are we talking about?


----------



## rt123

Quote:


> Originally Posted by *Dire Squirrel*
> 
> As I said, I do not have that option:


Oh..









Try ticking Unlock Voltage Control & Unlock Voltage Monitoring; maybe then the bottom dialog will show up.


----------



## Dire Squirrel

Quote:


> Originally Posted by *rt123*
> 
> Oh..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Try ticking Unlock Voltage Control & Unlock Voltage Monitoring; maybe then the bottom dialog will show up.


Nope. Tried that. And as you can see, I have the latest version, so it is not a matter of it being outdated.


----------



## rt123

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Nope. Tried that. And as you can see, I have the latest version, so it is not a matter of it being outdated.


Well this is surprising.

There is an Afterburner unlocking guide in this thread, which should maybe unlock those options for you:
http://www.overclock.net/t/1265543/the-amd-how-to-thread

But first try disabling OverDrive completely from Catalyst Control Centre, and reset any OC you have in Afterburner or Trixx.
If this fixes the issue then you can unlock AB later at your leisure.


----------



## rdr09

Quote:


> Originally Posted by *zGunBLADEz*
> 
> how much less voltage are we talking about?


I use 13.11. I know 13.12 sets my VDDC a tad higher, thus my idle and load temps are higher.

I can bench my 290 all the way to 1175 without voltage tweaks (0 offset), just maxing the Power Limit (Trixx). Now, with 14.4, I had to add a +40 offset and max out the PL. Your card is better than mine; I can't get 1300 stable for benching using the 14.x drivers.

same oc but cpu at 4.9GHz . . .

http://www.3dmark.com/3dm11/7748408

compare the graphics scores.


----------



## Dire Squirrel

Quote:


> Originally Posted by *rt123*
> 
> Well this is surprising.
> 
> There is an Afterburner unlocking guide in this thread, which should maybe unlock those options for you:
> http://www.overclock.net/t/1265543/the-amd-how-to-thread
> 
> But first try disabling OverDrive completely from Catalyst Control Centre, and reset any OC you have in Afterburner or Trixx.
> If this fixes the issue then you can unlock AB later at your leisure.


Overdrive has not been enabled and the card has never been OC'ed.

That thread looks like a mouthful. I will take a look at it after I have had my dinner.


----------



## rt123

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Overdrive has not been enabled and the card has never been OC'ed.
> 
> That thread looks like a mouthful. I will take a look at it after I have had my dinner.


Yes, or maybe somebody else will jump in to help you. I have a Lightning; I didn't know that other cards didn't have these options. I tried, sorry.
Try a reboot if you haven't already.

The issue you are having is pretty common though. Google "AMD R9 290 stuck at 3D clocks" and you will see a lot of complaints.

From what I gather, your card is somehow running at max clocks, and therefore hot, even though you are not playing any games or anything. Normally it should downclock, but that is not happening here.
Good luck.


----------



## sugarhell

Quote:


> Originally Posted by *Dire Squirrel*
> 
> As I said, I do not have that option:


Old version AB


----------



## Norse

Woooooooo Just got my second Sapphire 290x £263 and its BF4 edition with game


----------



## Dire Squirrel

Quote:


> Originally Posted by *sugarhell*
> 
> Old version AB


Take a look at the date.


----------



## Forceman

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Nope. Tried that. And as you can see, I have the latest version, so it is not a matter of it being outdated.


Quote:


> Originally Posted by *Dire Squirrel*
> 
> Take a look at the date.


Make sure you are checking for Beta versions. I think all the recent versions are still considered beta. You need 3.0.0 something.


----------



## rt123

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Take a look at the date.


You need the latest Beta 19.
That is what everyone is using.
Forgot to ask you that.


----------



## Dire Squirrel

Quote:


> Originally Posted by *rt123*
> 
> You need the latest Beta 19.
> That is what everyone is using.
> Forgot to ask you that.


I see.
Is that stable?

After having been through just about every halfway viable AMD driver to find one that is only slightly messed up, I have zero patience for software that does not simply work. In fact, I am very close to declaring this GPU not worth the effort and buying Nvidia.


----------



## rt123

Quote:


> Originally Posted by *Dire Squirrel*
> 
> I see.
> Is that stable?
> 
> After just having been through just about every halfway viable AMD driver to find one that is only slightly messed up, I have zero patience for software that does not simply work. In fact I am very close to declaring this GPU not worth the effort and buying Nvidia.


It is 100% stable.
Been using it & Beta 18 for last 3 months.
First on a 780 Classy & now on a 290X Lightning.

I don't get why MSI doesn't just release one calling it Stable.

Edit: Also, if you Jump drivers, use this
http://support.amd.com/en-us/kb-articles/Pages/AMD-Clean-Uninstall-Utility.aspx

I jumped back and forth between 13.12 WHQL and 14.4 WHQL four times within 2 hours and had zero issues, as long as I used that utility to uninstall my drivers.


----------



## Dire Squirrel

Quote:


> Originally Posted by *rt123*
> 
> It is 100% stable.
> Been using it & Beta 18 for last 3 months.
> First on a 780 Classy & now on a 290X Lightning.
> 
> I don't get why MSI doesn't just release one calling it Stable.
> 
> Edit: Also, if you Jump drivers, use this
> http://support.amd.com/en-us/kb-articles/Pages/AMD-Clean-Uninstall-Utility.aspx
> 
> I jumped back & forth between 13.12 Whql & 14.4 Whql four times within 2 hours & I had zero issues as long as i used that utility to Uninstall my Drivers.


I used Display Driver Uninstaller, as I have read enough bad things about the AMD utility to not go anywhere near it.

Anyway. Got beta 19 now and the extra options are there.
Just to make sure. What are the EXACT settings to go for? and WHY. I have found no useful explanation of exactly what powerplay is supposed to do in this particular situation.


----------



## rt123

Quote:


> Originally Posted by *Dire Squirrel*
> 
> I used Display Driver Uninstaller, as I have read enough bad things about the AMD utility to not go anywhere near it.
> 
> Anyway. Got beta 19 now and the extra options are there.
> Just to make sure. What are the EXACT settings to go for? and WHY. I have found no useful explanation of exactly what powerplay is supposed to do in this particular situation.


http://www.overclock.net/t/1472341/lightbox/post/22199327/id/1997398

This is my benching profile; things you want to change:

1) Unlike me you *don't* want to *tick* *Disable ULPS*

Because, (leeching somebody else's explanation)
Quote:


> ULPS quite literally stands for Ultra-Low Power State. Think of it as a stand-by state for your graphics card(s). The idea is that when your displays are ordered into stand-by mode by the OS, the graphics card does the same and powers down everything that may be necessary to actually put a picture on the screen (presumably in conjunction with lowering its clocks). It also appears to affect secondary cards in Crossfire setups, turning them "off" when they're not in use.
> 
> Ideally it shouldn't have any effect at all when that's not happening, but various reports exist which say that disabling it has a positive effect. If you don't use stand-by features (like myself) then there's no harm at all in disabling it. No harm disabling it otherwise, either - just be aware that the card could be drawing 10W+ (depending on model, clock speeds, voltages, etc.) over what it needs to when your system is on but the screens are in stand-by. Crossfire systems will draw that much more per extra card, even with the screens active.


Now, I personally keep ULPS enabled except when mining or benching, as then I am not idling and need the constant performance.
So don't disable it if you are running stock. It shouldn't interfere with your OCs, but some people say otherwise, so I disable it for benchmarks.

2) You want *Unofficial Overclocking mode With PowerPlay Support*.

PowerPlay is like Nvidia's K-Boost (you would know if you have a recent Nvidia card).

It will force the card to max clocks all the time. Again, good for benchmarking, not so good for daily use. (I believe this might be the cause of your issue.)

3) You can copy basically everything else.
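For anyone wanting to toggle ULPS without Afterburner: the "Disable ULPS" checkbox is generally reported to map to the `EnableUlps` registry value on Windows. A hedged sketch of the equivalent .reg fragment follows; the `0000` subkey index varies per machine, so treat the exact path as an example, and back up the registry before touching it:

```
Windows Registry Editor Version 5.00

; Display adapter class key -- the 0000/0001/... subkey index varies per
; machine; find the one whose DriverDesc matches your Radeon card.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
; 0 = ULPS disabled (card won't enter the ultra-low power state), 1 = enabled
"EnableUlps"=dword:00000000
```

A reboot is needed for the change to take effect; re-installing drivers typically resets it.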


----------



## Dire Squirrel

Quote:


> Originally Posted by *rt123*
> 
> http://www.overclock.net/t/1472341/lightbox/post/22199327/id/1997398
> 
> This is my benching profile; things you want to change:
> 
> 1) Unlike me you *don't* want to *tick* *Disable ULPS*


By default, disable ULPS is NOT ticked.
Quote:


> Originally Posted by *rt123*
> 
> 2) You want *Unofficial Overclocking mode With PowerPlay Support*.
> 
> PowerPlay is like Nvidia's K-Boost (you would know if you have a recent Nvidia card).
> 
> It will force the Card at max clocks all the time. Again, good for Benchmarking, not so good for daily use. ( I believe this might be the cause of your issue)


So that would mean that I should just not touch "unofficial overclocking mode". I do NOT bench, and have absolutely zero need for or interest in anything other than stability. So-called "daily use" is what this rig is for.

So unless I have missed something, I should keep everything exactly as it is, which means I am back where I started.


----------



## rt123

Quote:


> Originally Posted by *Dire Squirrel*
> 
> By default, disable ULPS is NOT ticked.
> So that would mean that I should just not touch "unofficial overclocking mode". I do NOT bench, and have absolutely zero need for or interest in anything other than stability. So-called "daily use" is what this rig is for.
> 
> So unless I have missed something, I should keep everything exactly as it is, which means I am back where I started.


Yes, I know Disable ULPS is not ticked by default. Yep, just leave it at that.

I want you to try enabling Unofficial Overclocking mode with PowerPlay enabled. I think your issue lies there. Can you try it once?
Ticking Unofficial mode doesn't OC your card till you actually raise the clocks/voltage.
However, the PowerPlay setting still takes effect. My card's idle temp rises +10C going from PowerPlay enabled to disabled.

I urge you to try it. If it doesn't work, you can always revert, and then we are back to square one.









Edit: If these things don't work, then the last thing would be trying to change the ULPS state.
Beyond this I don't think I can help you. Sorry again.


----------



## Foresight

Just tried OC'ing my Gigabyte R9 290 OC.
Hmm, how's this so far?


+100mv, 50% power
1154/1360


----------



## rdr09

Quote:


> Originally Posted by *Foresight*
> 
> Just tried oc'ing my gigabte r9 290 oc
> Hmm hows this so far?


let me guess . . . 1150 MHz?


----------



## ebhsimon

I'm upgrading from a 7970 OC'd to 1180/1500

Sapphire Vapor-X R9 290

Or

Giga Windforce (non oc) R9 290 + Kraken G10 + VRM Heatsinks + H75 AIO

The Giga option is about $30 more expensive. I know that rationally the Giga option is better, but I can't help but gravitate towards the Vapor-X. Please convince me either way. I will be trying to overclock as hard as possible.


----------



## Dire Squirrel

Quote:


> Originally Posted by *rt123*
> 
> Yes I know Disable ULPS is not ticked by default. Yep, just leave it at that.
> 
> I want you try to Enable Unofficial Overclocking mode with PowerPlay enabled. I think your issue lies there. Can you try it once.
> Ticking Unofficial mode doesn't OC your card till you actually raise the Clocks/Voltage.
> However the PowerPlay thing still takes effect. Mine card's idle temp raise +10C by going from PowerPlay enabled to disabled.
> 
> I urge you to try it. If if doesn't work, you can always revert & then we are back to square zero.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: If these things don't work than the last thing would be trying changing the ULPS state.
> Except this I don't think I can help you. Sorry Again.


I set things as per your suggestion. So far I have not seen any changes, but I'll give it a day or so while I look for other things that could be the root cause. Some rep has been thrown your way.

I noticed that certain programs do not agree with this thing at all. I went through the list of browsers, and it would appear that some use insane amounts of hardware acceleration for pretty much no reason, Opera and IE for example. Opera by far seems to be the worst.

So now I'm looking over everything else to see if something is clashing with this tape-and-baling-twine driver.
I just hope I find it before the waterblock for the card arrives. I can't in good conscience stick a block on it and potentially cover up a problem that I have not yet found. And I absolutely need to get this noisy bastard on water as soon as possible, to avoid going crazy.


----------



## Widde

I'm getting something weird every time I install the WHQLs :S http://piclair.com/owfmu

And this is from the XML: http://piclair.com/b5epj http://piclair.com/xeds8 Not one error message :S Anyone know what's wrong?

Tried both with only DDU, and with AMD's tool + DDU, and still the same results.


----------



## Devotii

Quote:


> Originally Posted by *ebhsimon*
> 
> I'm upgrading from a 7970 OC'd to 1180/1500
> 
> Sapphire Vapor-X R9 290
> 
> Or
> 
> Giga Windforce (non oc) R9 290 + Kraken G10 + VRM Heatsinks + H75 AIO
> 
> The giga optin is about $30 more expensive. I know that rationally the giga option is better, but I can't help but gravitate towards the Vapor-X. Please convince me either way. I will be trying to overclock as hard as possible.


The VaporX runs cooler and quieter (from reviews).


----------



## hwoverclkd

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Help me out here guys.
> 
> This can't possibly be normal, can it?
> 
> 
> And this is NOT right after heavy load.


This happened to my 290X Lightning twice. I just rebooted it, and since then I have re-installed several drivers (from 13.12 to 14.4) because I was troubleshooting a black screen issue. That 'stuck' clock issue never came back; I'm not sure if the reinstallation of drivers helped clear it out. The black screen is still occurring though.


----------



## kayan

So, this is odd. I bought a new (used) 2nd 290x for my system, plugged it in, everything is fine. It's detected in GPU-Z, W8.1 64bit, CCC, but where the heck do I enable Crossfire? I don't see an option in CCC at all? And this:



The 2nd card is disabled, but when I go to device manager it's working properly and enabled?

I already tried re-installing the drivers. I even updated the drivers.


----------



## Germanian

Quote:


> Originally Posted by *kayan*
> 
> So, this is odd. I bought a new (used) 2nd 290x for my system, plugged it in, everything is fine. It's detected in GPU-Z, W8.1 64bit, CCC, but where the heck do I enable Crossfire? I don't see an option in CCC at all? And this:
> 
> 
> 
> The 2nd card is disabled, but when I go to device manager it's working properly and enabled?
> 
> I already tried re-installing the drivers. I even updated the drivers.


Something is interfering with your card. Why does it show disabled? Try turning off ULPS; maybe that fixes it. Crossfire should be under the PERFORMANCE or GAMING tab, at least on mine with 14.4.


----------



## weebeast

Getting my 290x tomorrow, i have a 6970 at the moment.







.


----------



## DeadlyDNA

Quote:


> Originally Posted by *weebeast*
> 
> Getting my 290x tomorrow, i have a 6970 at the moment.
> 
> 
> 
> 
> 
> 
> 
> .


Sounds like you're in for a treat!


----------



## kayan

Quote:


> Originally Posted by *Germanian*
> 
> something is interfering with your card. Why does it show disabled? Try turning off ULPS maybe it fixes it. Crossfire should be under PERFORMANCE and GAMING tab at least mine does 14.4


I opened everything, and more screenshots:




Device Manager shows that both cards are functioning properly. I can't really unplug my main card as it's in a water loop, and I don't want to drain it and retest my loop. Anyway, according to CCC the 2nd card is still disabled. According to Device Manager, MSI AB, and GPU-Z they are both being recognized. I also ran a 3DMark bench, and my score is actually lower with both plugged in than it was solo, by almost 900 points.

Anybody have any suggestions?

EDIT: Problem solved. I unplugged my main card's power, put my DisplayPort into the new card, and tested it. It seemed to be OK. So I then plugged the power back into my main card, and after rebooting they both worked; CCC told me it had some important info and took me right to the XFire approval page. Turned it on, and it's working now.

So strange....


----------



## Roboyto

Quote:


> Originally Posted by *Foresight*
> 
> Just tried oc'ing my gigabte r9 290 oc
> Hmm hows this so far?
> 
> 
> +100mv, 50% power
> 1154/1360


Clocks are fair for +100mV. I would set RAM back to stock and see how far the core can go. Just be mindful of VRM1 temperatures when adding voltage; <90C is where you want to be.


----------



## hwoverclkd

Quote:


> Originally Posted by *weebeast*
> 
> Getting my 290x tomorrow, i have a 6970 at the moment.
> 
> 
> 
> 
> 
> 
> 
> .


hope you won't get any black screen


----------



## weebeast

Quote:


> Originally Posted by *DeadlyDNA*
> 
> sounds like your in for a treat!


Haha yes can't wait

@acupalypse is black screen a common problem with the 290x?


----------



## bond32

Quote:


> Originally Posted by *weebeast*
> 
> Haha yes can't wait
> 
> @acupalypse is black screen a common problem with the 290x?


Yeah, unfortunately. My first 290X, which I got at release, had black screens, but not as bad as some on here. Although it occurs with both Hynix and Elpida, it seems to be much worse on Elpida cards. I RMA'ed mine and got a killer overclocker from Sapphire, so personally I had good luck.


----------



## weebeast

Quote:


> Originally Posted by *bond32*
> 
> Yeah, unfortunately. My first 290x I got at release had black screens but not as bad as some on here. Although it occurs with both Hynix and Elpida, it seems to be much worse on Elpida cards. RMA'ed mine and got a killer overclocker from sapphire so personally I had good luck.


Ah, I didn't know about that. The card is not new but used, so I hope it works well. I will keep my 6970 till I'm sure that my 290X is fine.


----------



## Roboyto

Quote:


> Originally Posted by *weebeast*
> 
> Haha yes can't wait
> 
> @acupalypse is black screen a common problem with the 290x?


14.4 was giving me blackscreens. I still say 13.12 is the only guarantee for flawless operation.


----------



## weebeast

Ah, well, I will try 14.4 first; we will see how it goes. By the way, I just need to delete the old drivers and then reinstall the new ones, am I right? Or do I need to format my PC so I can be sure that I get the right performance out of it? Also, I am using PCI-E 2.0; is that a problem?


----------



## ArbyWan

Hello everyone, just got my 290X in today, picture is to follow....



Hope that gets me in, will take screens when I get home from work







Can't wait to unleash this beast as I am coming from a single HD7770GHz


----------



## VSG

Unless there are 34 months in the year 2025, that ain't happening


----------



## DeadlyDNA

Quote:


> Originally Posted by *geggeg*
> 
> Unless there are 34 months in the year 2025, that ain't happening


I corrected it for the technical junkies


----------



## Dire Squirrel

Quote:


> Originally Posted by *Roboyto*
> 
> 14.4 was giving me blackscreens. I still say 13.12 is the only guarantee for flawless operation.


I have had black screen with 13.12


----------



## VSG

Quote:


> Originally Posted by *DeadlyDNA*
> 
> i corrected it for the technical junkies












+1 for the humour


----------



## Krusher33

I'm on a freshly installed Win8.1 and 14.4 drivers. No problems here.


----------



## zGunBLADEz

I used to have black screens with my 7970, and it was related to AMD drivers. Only one set of drivers would fix it (I was stuck on the April 18 drivers for a while). It's not the card; it's a misconfig somewhere in the drivers, as the same thing happened to me with 2 different 7970's.

Like my 290: I open Trixx and it's at +50 by default (even GPU-Z confirms that); I didn't do it. I have to click reset so it drops that extra +50 of offset.

My 7970 CFX 2nd card's memory volts were always wrong as well, causing freezes and crashes. If I overclock, or enable or disable CFX, I have to make sure it wasn't at 1.1V on the mem. XD


----------



## hwoverclkd

Quote:


> Originally Posted by *Dire Squirrel*
> 
> I have had black screen with 13.12


You're not alone, mate... I get black screens regardless of driver version.


----------



## hwoverclkd

Quote:


> Originally Posted by *weebeast*
> 
> Haha yes can't wait
> 
> @acupalypse is black screen a common problem with the 290x?


Apparently... and there are only a few chosen ones who don't experience it. I have a 290X Lightning; I returned the first due to excessive whine and black screens, and now the 2nd has no more whine but is worse on black screens (better overclocker than the first, though).


----------



## hwoverclkd

Quote:


> Originally Posted by *ArbyWan*
> 
> Hello everyone, just got my 290X in today, picture is to follow....
> 
> 
> 
> Hope that gets me in, will take screens when I get home from work
> 
> 
> 
> 
> 
> 
> 
> Can't wait to unleash this beast as I am coming from a single HD7770GHz


Excellent choice there, mate... just voice any concern right away and I'm sure someone will jump in and... cry with you


----------



## DeadlyDNA

Quote:


> Originally Posted by *acupalypse*
> 
> apparently....and there are only few chosen ones who don't experience it.... i have 290x lightning, returned the first due to excessive whine and black screen...and now the 2nd has no more whine but worse on black screen (better overclocker than the first though
> 
> 
> 
> 
> 
> 
> 
> )


I don't have a 290X, but I have four 290's. The only time I have gotten a black screen was when I overclocked; I have never had a black screen on stock settings that I recall. Once I started messing with OverDrive and such, then I had a few black screens. There are others who say they never overclocked and still got black screens....


----------



## weebeast

I will do a format tomorrow when i installed the card. Will let you guys know if i get any black screen problems


----------



## bak3donh1gh

Hey guys, I'm thinking of buying a 290. Does anyone know which versions of the XFX cards can be unlocked to a 290X, or if that's even possible anymore?


----------



## Roboyto

My first 290 never had a black screen problem on 13.12, only when using anything 14.x.

Tried 14.4 on my 2nd 290, fresh Windows installation, and within the first 3-4 hours it black screened twice while browsing the web or sitting idle. Reverted to 13.12 and it's back to normal.

Both my machines are running Win7 x64 Ultimate.
Quote:


> Originally Posted by *acupalypse*
> 
> apparently....and there are only few chosen ones who don't experience it.... i have 290x lightning, returned the first due to excessive whine and black screen...and now the 2nd has no more whine but worse on black screen (better overclocker than the first though
> 
> 
> 
> 
> 
> 
> 
> )


Quote:


> Originally Posted by *acupalypse*
> 
> you're not alone mate...i get black screen regardless of driver version


Quote:


> Originally Posted by *Dire Squirrel*
> 
> I have had black screen with 13.12


Quote:


> Originally Posted by *DeadlyDNA*
> 
> I dont have 290x but i have 4 290's only time i have gotten a black screen was when i overclocked. I have never had a black screen on stock settings that i recall.. Once i started messing with Overdrive and stuff then i had a few black screens. There are others who say they never overclocked and still got black screens....


Quote:


> Originally Posted by *bak3donh1gh*
> 
> Hey guys im thinking of buying a 290 does anyone know which versions of the xfx cards can be unlocked to a 290x or if that even possible anymore ?


Not possible anymore, that was only with very early reference edition cards. Your only chance is to gamble on a used mining card that may be able to unlock.

The $150+ premium for 290X really isn't worth it for the ~5% performance boost, unless you're going for the boss Lightning edition IMO


----------



## Dire Squirrel

2 black screens and a hardlock in less than 15 minutes.
And AMD doesn't care one single bit. They got their money, so as far as they are concerned, we don't exist.

I really hope they lose a lot of customers permanently over this.


----------



## bond32

Quote:


> Originally Posted by *Dire Squirrel*
> 
> 2 black screens and a hardlock in less than 15 minutes.
> And AMD doesn't care one single bit. They got their money, so as far as they are concerned, we don't exist.
> 
> I really hope they lose a lot of customers permanently over this.


What brand? RMA it... It's something more than memory I believe, something faulty. Def hardware related.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Dire Squirrel*
> 
> 2 black screens and a hardlock in less than 15 minutes.
> And AMD doesn't care one single bit. They got their money, so as far as they are concerned, we don't exist.
> 
> I really hope they lose a lot of customers permanently over this.


Can you return your card? You haven't been happy with it, so maybe cut your losses, return it, and go green?


----------



## weebeast

To be honest, if I had known about this beforehand, I maybe wouldn't have bought the 290X. Hoping for the best..


----------



## Dire Squirrel

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Can you return your card? You haven't been happy with the card so maybe cut your loss and return go green?


I could, but at a significant loss, as I have already ordered a waterblock for it.

But that is not really the point. AMD has known that this is a common problem since day one. And instead of owning up to it and fixing it, they chose to take a huge steaming dump on their customers. And we let them, by accepting it and finding alternative solutions.
AMD has already made its money on this card. Returning it does nothing to change that, as it will never go any further up the chain than the manufacturer that put their sticker on it, and since this is not a defective card as such, it will be sold again. We all know this.

We need to actually hold them accountable and as a community DEMAND that they take responsibility for their mistakes.


----------



## sugarhell

AIBs make the card. If it's a hardware problem, return it. If your card black screens, either you bench at 1.6 volts or your card needs an RMA. No need for a rant with misinformation.


----------



## Dire Squirrel

Quote:


> Originally Posted by *sugarhell*
> 
> AIBs make the card. If it's a hardware problem, return it. If your card black screens, either you bench at 1.6 volts or your card needs an RMA. No need for a rant with misinformation.


Have you not been paying attention?
This is A COMMON PROBLEM. It has been linked to subpar RAM and has NOT been accepted as a manufacturing flaw. As such, these thousands of cards are officially not defective.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Raephen*
> 
> ROFL!


Glad to make someone laugh today, we need it after pulling our hair out 

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Have you not been paying attention?
> This is A COMMON PROBLEM. It has been linked to subpar RAM and has NOT been accepted as a manufacturing flaw. As such, these thousands of cards are officially not defective.


You have had many complaints though, not just black screens:

1. Mouse cursor corruption
2. Idle clocks/power consumption
3. Driver issues (I think)
4. Black screens

Your overall comments aren't just asking about your issues and whether anyone can help; they have also been putting down AMD. Not saying your anger and frustration isn't justified. I think you're better off cutting your losses and voting with your wallet.

I was feeling close to how you feel when I first went through the mill with my cards. I have since accepted the pros and cons. Nvidia isn't perfect either, but maybe they will provide you with a better experience.


----------



## chronicfx

In case a couple of you remember, a few weeks back I mentioned I was having an issue with my PC restarting itself randomly during gaming. I picked up a Kill-A-Watt to check my load wattage with 3x R9 290X, and to my surprise they are more power-thrifty than I anticipated: Crysis 3, Far Cry 3, and The Witcher 3 all drew less than 1000 watts with my 3570K at 4.8GHz and more accessories, lights and fans than most have. So I concluded that wattage was not the issue.

I then checked HWMonitor and saw that my voltages were actually on the low side. My +3.3V was reading about 2.7V minimum, and my +12V was reading all the way down to 11.5V during gaming. I was still getting the restarts, but I was never able to catch whether that 11.5V dipped any lower, because the PC would restart and the info was gone.

Very early on in my ownership of the PSU there was a recall, and EVGA cross-shipped me a new PSU BNIB for just the old PSU; I could keep the accessories and cables. After exchanging the 24-pin and all the GPU cables with the new unused ones from the other box, my voltages are now +3.305V min and +12.024V min. No more restarts, it seems. Maybe 5 months of 24/7 mining degraded the cables a bit? Anyway, everything is all right again now!
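As a sanity check on those readings: the ATX specification allows roughly ±5% regulation on the +3.3 V and +12 V rails, so a quick sketch (using the numbers from the post) shows why 2.7 V pointed at the cables/PSU while 11.5 V was still technically in spec:

```python
def within_atx_tolerance(nominal: float, measured: float, tol: float = 0.05) -> bool:
    """True if a rail reading is within the ATX +/-5% tolerance band."""
    return abs(measured - nominal) <= nominal * tol

# Readings from the post, before the cable swap:
print(within_atx_tolerance(3.3, 2.7))    # False -- ~18% low, far out of spec
print(within_atx_tolerance(12.0, 11.5))  # True  -- low, but above the ~11.4 V floor
# And after swapping the cables:
print(within_atx_tolerance(3.3, 3.305))  # True
```

A 3.3 V rail sagging to 2.7 V under load is a strong sign of excessive resistance somewhere in the delivery path, which fits the degraded-cables conclusion.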


----------



## Raephen

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Glad to make someone laugh today; we need it after pulling our hair out.


Well, I guess I've been lucky with my close-to-launch Sapphire 290, so I can laugh.









Apart from a bit of coil whine (software-specific, and then only above 100 fps, so I just limit my display to 100 Hz), I have had no trouble.

So if you ever have another good joke brewed up, please share it.

I for one love to laugh at myself (still on 13.12 - never saw the need to change)


----------



## Dire Squirrel

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Your comments haven't just been asking about issues and whether anyone can help; they have also been putting down AMD. I'm not saying your anger and frustration aren't justified, but I think you're better off cutting your losses and voting with your wallet.
> 
> I felt much the same when I first went through the mill with my cards; I've since accepted the pros and cons. Nvidia isn't perfect either, but maybe they will provide you with a better experience.


I'm sorry, but that just isn't acceptable to me. Simply giving in is the same as telling them to carry on doing the same thing.
This is the reason Apple keeps making money: their customers keep accepting everything they get and never question anything.

As I said, it is too late to "vote with my wallet". At this point AMD has gotten their money and has nothing further to do with this specific product. And since it is "technically" not defective, if returned it will go back to the manufacturer and eventually be sold again, since it will pass any test that it passed on its first trip through QC.

It is in the best interest of the community as a whole to hold companies accountable when they mess up, especially when there are only two companies on the market.


----------



## sugarhell

Quote:


> Originally Posted by *Dire Squirrel*
> 
> I'm sorry, but that just isn't acceptable to me. Simply giving in is the same as telling them to carry on doing the same thing.
> This is the reason Apple keeps making money: their customers keep accepting everything they get and never question anything.
> 
> As I said, it is too late to "vote with my wallet". At this point AMD has gotten their money and has nothing further to do with this specific product. And since it is "technically" not defective, if returned it will go back to the manufacturer and eventually be sold again, since it will pass any test that it passed on its first trip through QC.
> 
> It is in the best interest of the community as a whole to hold companies accountable when they mess up, especially when there are only two companies on the market.


If they can't reproduce it, then it's not defective, and AMD can't do anything about that. The AIBs control the RMA process, not AMD. You got a bad GPU; I've gotten bad GPUs for over 10 years. Just stop ranting and accept the fact that sometimes you get a bad product. The black screen issue with Hawaii is a really strange one: I had a Hawaii card that black-screened on my MSI board but didn't on my ASUS board.


----------



## rdr09

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> I forgot that Thief came out with a Mantle patch a little while ago, decided to check it out and this is what I got
> 
> Very High, windowed, no vsync & 1080p
> 
> Mantle
> 
> Min. 58.3
> Max. 94.5
> Avg. 69.2
> 
> Normal
> 
> Min. 9.5
> Max. 84.5
> Avg. 60.2
> 
> Boosts the max and avg a little bit, but the min...
> 
> 
> 
> 
> 
> 
> 
> that's insane. Hope more Mantle games come out soon
> 
> 
> 
> 
> 
> 
> 
> not like that BF4 crap


May have to try this out. Thanks.


----------



## Dire Squirrel

Quote:


> Originally Posted by *sugarhell*
> 
> If they can't reproduce it, then it's not defective, and AMD can't do anything about that. The AIBs control the RMA process, not AMD. You got a bad GPU; I've gotten bad GPUs for over 10 years. Just stop ranting and accept the fact that sometimes you get a bad product. The black screen issue with Hawaii is a really strange one: I had a Hawaii card that black-screened on my MSI board but didn't on my ASUS board.


When it is this common a problem, you can't simply excuse it with "you got a bad one".
And since this many cards have been able to get through QA in the first place, it is quite clear that however they "test" them, they are not doing it in a way that realistically resembles the average buyer's use.

And you can't simply take all the responsibility away from AMD. It makes no difference that they do not make the cards in house. It's just like the monitor I RMA'd last year: the company whose brand name is on it did not make the actual panel, but they put their name on it, and in doing so they took on the responsibility. They are also the last ones to do QA and give the product the green light to be sold.
If they so choose, they can take it up with the panel supplier, but as far as the end customer is concerned, the buck stops with the company whose logo is on the product.

By your logic we could also just blame pretty much everything on Foxconn, since they supply pretty much everyone and pretty much every product that is even remotely electronic. But obviously that would make no sense.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dire Squirrel*
> 
> When it is this common a problem, you can't simply excuse it with "you got a bad one".
> And since this many cards have been able to get through QA in the first place, it is quite clear that however they "test" them, they are not doing it in a way that realistically resembles the average buyer's use.
> 
> And you can't simply take all the responsibility away from AMD. It makes no difference that they do not make the cards in house. It's just like the monitor I RMA'd last year: the company whose brand name is on it did not make the actual panel, but they put their name on it, and in doing so they took on the responsibility. They are also the last ones to do QA and give the product the green light to be sold.
> If they so choose, they can take it up with the panel supplier, but as far as the end customer is concerned, the buck stops with the company whose logo is on the product.
> 
> By your logic we could also just blame pretty much everything on Foxconn, since they supply pretty much everyone and pretty much every product that is even remotely electronic. But obviously that would make no sense.


My understanding was that this was a day-1 batch issue that was sorted out in later batches.

My 290X (day 1) had a black screen once. I updated to the driver that AMD released to fix the issue and never had a problem until the card itself died. Went through RMA and got a couple of 290s instead... no black screens, no issues.

Is your card an ex-miner? If so, then it's probably from a day-1 batch.

And I'm sorry, but it was silly to buy a block for the card without testing it out thoroughly first.


----------



## bak3donh1gh

Quote:


> Not possible anymore, that was only with very early reference edition cards. Your only chance is to gamble on a used mining card that may be able to unlock.
> 
> The $150+ premium for 290X really isn't worth it for the ~5% performance boost, unless you're going for the boss Lightning edition IMO


Thanks, I figured; I was just hoping I could still get lucky.

There seem to be a lot of people complaining about a black screen issue. Do you guys think I should just get an Nvidia card instead?


----------



## imadorkx

Quote:


> Originally Posted by *Dire Squirrel*
> 
> When it is this common a problem, you can't simply excuse it with "you got a bad one".
> And since this many cards have been able to get through QA in the first place, it is quite clear that however they "test" them, they are not doing it in a way that realistically resembles the average buyer's use.
> 
> And you can't simply take all the responsibility away from AMD. It makes no difference that they do not make the cards in house. It's just like the monitor I RMA'd last year: the company whose brand name is on it did not make the actual panel, but they put their name on it, and in doing so they took on the responsibility. They are also the last ones to do QA and give the product the green light to be sold.
> If they so choose, they can take it up with the panel supplier, but as far as the end customer is concerned, the buck stops with the company whose logo is on the product.
> 
> By your logic we could also just blame pretty much everything on Foxconn, since they supply pretty much everyone and pretty much every product that is even remotely electronic. But obviously that would make no sense.


Well, I hope your problem sorts itself out eventually. Good luck, mate. I myself have been using AMD for a while and have never had a black screen, except back when I was overclocking the crap out of my 7950.

I've been using an R9 290 Tri-X for half a month now, and it's safe to say I have not encountered a single black screen. The 13.12 driver really is a stable one; I tried 14.4 and got an fps drop in BioShock Infinite.

And heck, I'm using a CM Extreme Power Plus 550 W with an i5 2500K OC'd to 4.5 GHz, the R9 290 Tri-X (stock, not gonna OC with that PSU), 3 case fans, 2 on the CPU cooler, 3 HDDs, and an LED strip. Smooth sailing here.

Well, maybe you got the defective one out of all the batches.


----------



## hwoverclkd

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I don't have a 290X, but I have four 290s. The only time I have gotten a black screen was when I overclocked; I have never had one at stock settings that I recall. Once I started messing with Overdrive and such, I had a few black screens. There are others who say they never overclocked and still got black screens...


Quote:


> Originally Posted by *DeadlyDNA*
> 
> Can you return your card? You haven't been happy with it, so maybe cut your losses, return it, and go green?


Quote:


> Originally Posted by *Dire Squirrel*
> 
> 2 black screens and a hardlock in less than 15 minutes.
> And AMD doesn't care one single bit. They got their money, so as far as they are concerned, we don't exist.
> 
> I really hope they lose a lot of customers permanently over this.


I really believe it's the AMD GPU. I'm on my 2nd MSI 290X Lightning and still having the black screen problem... I don't think it's MSI's fault; I suspect they're all aware but just staying silent about it. Whether it's just a specific lot or a design problem to begin with, maybe they're also scratching their heads.









I'd probably give AMD another chance. That new Sapphire Radeon VAPOR-X R9 290X 4GB is tempting and could be one of those 'golden breeds' of 290X (you can never tell), and it costs $40 less than the Lightning. I'm not going to wait for my HDD to fail all because of unexpected reboots from black screens.


----------



## bak3donh1gh

The thing is, I don't like the Nvidia 780, especially since it's only 3 GB, and clocks aside the 290 is a better card.


----------



## hwoverclkd

Quote:


> Originally Posted by *bak3donh1gh*
> 
> The thing is, I don't like the Nvidia 780, especially since it's only 3 GB, and clocks aside the 290 is a better card.


EVGA has just released a 6 GB version of the 780... and a 780 Ti 6 GB is 'possibly' coming out too... just so you know.


----------



## DeadlyDNA

Quote:


> Originally Posted by *acupalypse*
> 
> Yeah, go GREEN ...that's what the world needs today


I can see the company slogans now...

Nvidia, Go green! because sometimes you need more than just a black screen!

Linux, Go Gentoo! because sometimes you need more on the screen than just blue!

AMD Go red! because sometimes a driver update stops your fan dead!


----------



## magicase

Will I have any overheating issues if I have 3 Gigabyte WF 290s or 3 Sapphire Tri-X OC 290s next to each other?


----------



## bak3donh1gh

Quote:


> Originally Posted by *acupalypse*
> 
> EVGA has just released a 6 GB version of the 780... and a 780 Ti 6 GB is 'possibly' coming out too... just so you know.


Well, I'm looking at sub-$500 with a deal (pre-tax).


----------



## Dire Squirrel

Quote:


> Originally Posted by *magicase*
> 
> Will I have any overheating issues if I have 3 Gigabyte WF 290s or 3 Sapphire Tri-X OC 290s next to each other?


They definitely won't be cool, but they should survive. I have seen it done with the horrible reference cooler.
AMD GPUs are generally very good at handling temperatures that would make Nvidia GPUs blow up.

I would make sure to have very good airflow around them.


----------



## chiknnwatrmln

Only time I ever get blackscreens is if my memory is too high (over 1650 or so) or if I run reaaaally high voltage (higher than +200mv offset) and I have an early batch card. I bought it back in October.

I went through 3 670 FTW's in a month when I bought my first rig, sometimes you just get bum hardware. It's not specific to any manufacturer or device, hell I had 4 Xbox 360's crap out on me over the course of 2 years. Also had a PS2 crap out, several controllers, a phone, two monitors, etc etc.

Basically if you don't like technology breaking don't buy technology, because you WILL get defective stuff. It's just the game.


----------



## The Mac

same here. only black screens i get are when i push the mem too far.


----------



## Roboyto

Quote:


> Originally Posted by *Dire Squirrel*
> 
> When it is this common a problem, you can't simply excuse it with "you got a bad one".
> And since this many cards have been able to get through QA in the first place, it is quite clear that however they "test" them, they are not doing it in a way that realistically resembles the average buyer's use.
> 
> And you can't simply take all the responsibility away from AMD. It makes no difference that they do not make the cards in house. It's just like the monitor I RMA'd last year: the company whose brand name is on it did not make the actual panel, but they put their name on it, and in doing so they took on the responsibility. They are also the last ones to do QA and give the product the green light to be sold.
> If they so choose, they can take it up with the panel supplier, but as far as the end customer is concerned, the buck stops with the company whose logo is on the product.
> 
> By your logic we could also just blame pretty much everything on Foxconn, since they supply pretty much everyone and pretty much every product that is even remotely electronic. But obviously that would make no sense.


Bad cards happen. In the last 2 months I have gotten 3 bad cards... 1 DOA and 2 with issues. I pinpointed the problems early and painlessly RMA'd with Newegg. I am ruthless on cards within the first month to see if I can get them to act up; this usually avoids headaches later on.

Issues have happened to me with Nvidia and AMD; it's the name of the game. You're upset, that's understandable, but whining here repeating the same thing over and over isn't going to solve your problem. If you're so terribly upset with your purchase, then:


1. Buy an Nvidia card
2. RMA your 290(X)
3. Sell the 290(X) when you get the replacement
4. Sell the waterblock
5. You're done.

Quote:


> Originally Posted by *bak3donh1gh*
> 
> Thanks I figured that I was just hoping I could still get lucky.
> 
> Theres seems to be a lot of people complaining of a black screen issue do you guys think I should just get a nvidia card instead?


If you want to avoid issues then you must properly uninstall old drivers, install 13.12 drivers, keep these volcanic cards cooled properly, have a good PSU, and know the limits when overclocking a card.

Have other questions regarding the 290(X)? See here: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781

I have pushed 3 of these cards to their individual limits and can tell you, without a doubt, that *if you try to push the RAM too hard, you will get black screen and lockup problems*. With their 512-bit bus, these cards are unlike most anything else that came before: RAM clocks aren't that important. But some people are dead set on seeing a RAM speed similar to the 79XX and GTX 780 cards. *A) It probably won't happen, and B) it's not necessary.*

These cards come equipped with 5 GHz RAM modules, so it's unlikely you are going to get to 7 GHz+ like the 79XX cards. Yes, some people do get lucky and get phenomenal RAM clocks; I am one of them, with a card that will run at 1300/1700. But the RAM speeds mean squat compared to the core.
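For anyone puzzled by the two sets of numbers: the MHz figure shown in overclocking tools is the memory clock, while the "GHz" marketing number is the effective data rate, since GDDR5 transfers data four times per clock. A quick sketch of the conversion (the specific clocks are just the ones mentioned above):

```python
# GDDR5 moves data four times per memory-clock cycle, so the figure shown
# in overclocking tools is one quarter of the effective data rate.
def effective_gddr5_mhz(tool_mhz):
    return tool_mhz * 4

print(effective_gddr5_mhz(1250))  # 5000 -> the stock "5 GHz" modules
print(effective_gddr5_mhz(1700))  # 6800 -> still short of "7 GHz" (1750 MHz)
```

So chasing "7 GHz" on these cards means asking 5 GHz-rated modules to run 40% over spec, which is why it rarely works and rarely matters.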

Yes, I *briefly* experienced black screens on 2 of the 3 cards I own(ed); *however, as soon as this happened, I removed the 14.X drivers and all was well.*

You can certainly buy an Nvidia card, that is up to you, but you have a similar chance to get a problematic Nvidia card as well...silicon lottery.

Plenty of people in this forum love these cards, myself included, which is why there are 22,500 posts in a little over 6 months.

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Only time I ever get blackscreens is if my memory is too high (over 1650 or so) or if I run reaaaally high voltage (higher than +200mv offset) and I have an early batch card. I bought it back in October.
> 
> I went through 3 670 FTW's in a month when I bought my first rig, sometimes you just get bum hardware. It's not specific to any manufacturer or device, hell I had 4 Xbox 360's crap out on me over the course of 2 years. Also had a PS2 crap out, several controllers, a phone, two monitors, etc etc.
> 
> *Basically if you don't like technology breaking don't buy technology, because you WILL get defective stuff. It's just the game.*


There must have been a bad batch of 670s at some point... a friend of mine and I each had 670s crap out within a few days of each other, for no reason whatsoever. And another one I'd planned for SLI was DOA.


----------



## rt123

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Whoops was trying to reply and i got a black screen.
> Sorry guys best of luck!


Quote:


> Originally Posted by *rdr09*
> 
> I got one, too, lol . . .
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 13.11 and 14.3 for mantle here.


Can you gents please tell me how to take a screenshot and save it while my PC is black-screened?








I would like to try this.
Quote:


> Originally Posted by *rdr09*
> 
> yah, new architecture. it will have some kinks that need straightening. my 290 blackscreened when i benched my memory over 1620.
> 
> tbh, some people are just meant to use xbox.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit: don't flame me pls. lol


Too high of a memory clock; you can't blame AMD for that.
But I think you already knew that.
Quote:


> Originally Posted by *The Mac*
> 
> same here. only black screens i get are when i push the mem too far.


Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Only time I ever get blackscreens is if my memory is too high (over 1650 or so) or if I run reaaaally high voltage (higher than +200mv offset) and I have an early batch card. I bought it back in October.
> 
> I went through 3 670 FTW's in a month when I bought my first rig, sometimes you just get bum hardware. It's not specific to any manufacturer or device, hell I had 4 Xbox 360's crap out on me over the course of 2 years. Also had a PS2 crap out, several controllers, a phone, two monitors, etc etc.
> 
> Basically if you don't like technology breaking don't buy technology, because you WILL get defective stuff. It's just the game.


Same here; I haven't had a single black screen unless I was pushing the memory too high.

To be honest, seeing these people here, I feel like I am the Chosen One.









I had two Nvidia 780s before; I moved to AMD and just uninstalled the Nvidia drivers.
Worked perfectly.
I've moved back and forth between 13.12 and 14.4 four times within 2 hours, using DDU and nothing else.
Zero problems. Plus I am on Win 8.1, which people hate with a passion.

I don't get what is wrong with these people or their computers.


----------



## Dasboogieman

Quote:


> Originally Posted by *magicase*
> 
> Will I have any overheating issues if I have 3 Gigabyte WF 290s or 3 Sapphire Tri-X OC 290s next to each other?


Your biggest issue with that setup is extracting the hot exhaust air. Assuming you have a monstrous case + motherboard that allows the requisite 1-slot gap between the cards, you still have the dilemma that all those cards have horizontal heatsink fins, i.e. the hot air goes sideways into the motherboard and out the side. Your secondary issue is ensuring the cards have a steady supply of cold air to prevent re-use of the hot air.

You will need a very high-CFM side case fan (about 180 CFM recommended) to extract the hot air (I personally used a Silverstone 180 mm Air Penetrator back when I had SLI Palit GTX 570 Dual Fans, which also dumped their hot air sideways), plus a reasonable-CFM front fan, preferably with a shroud that covers all the GPUs (my Cooler Master HAF 932 case came with this accessory and a 120 mm fan), to force-feed cold air and minimize the risk of the top cards swallowing the hot air from the bottom cards. Even then, the top cards usually run 4-5 degrees hotter than the bottom ones (possibly more with the 290).

Your case overall should be moving at least 250 CFM of air (intake and exhaust) for this kind of open-air Crossfire setup. That being said, AMD cards are engineered for 95-degree operation. What I worry about is the VRM cooling: for the Tri-X, with the fans at 50% (the threshold of silence), your VRMs can reach 70 degrees, and that's a single GPU in a well-ventilated case. Expect 80-90 (or even more with overclocking) with 3 of these open-air cards in close proximity.


----------



## chiknnwatrmln

Honestly if I was running Trifire on air I'd just never use a side panel and get a giant window fan and point it directly at my rig. Get one of those power saving power strips, use your PC as master and fan as slave so every time your PC turns on the fan spins up. Profit.

That or you could install a mechanism that automatically transports ice from your freezer to your PC and gets rid of the melted water.. A water-cooled-air-cooled rig, now that would be impressive.


----------



## Roboyto

Quote:


> Originally Posted by *magicase*
> 
> Will I have any overheating issues if I have 3 Gigabyte WF 290s or 3 Sapphire Tri-X OC 290s next to each other?


Quote:


> Originally Posted by *Dasboogieman*
> 
> *Your biggest issue with that setup is extracting the hot exhaust air*. Assuming you have a monstrous case + motherboard that allows the requisite 1-slot gap between the cards, you still have the dilemma that all those cards have horizontal heatsink fins, i.e. the hot air goes sideways into the motherboard and out the side. Your secondary issue is ensuring the cards have a steady supply of cold air to prevent re-use of the hot air.
> 
> You will need a very high-CFM side case fan (about 180 CFM recommended) to extract the hot air (I personally used a Silverstone 180 mm Air Penetrator back when I had SLI Palit GTX 570 Dual Fans, which also dumped their hot air sideways), plus a reasonable-CFM front fan, preferably with a shroud that covers all the GPUs (my Cooler Master HAF 932 case came with this accessory and a 120 mm fan), to force-feed cold air and minimize the risk of the top cards swallowing the hot air from the bottom cards. Even then, the top cards usually run 4-5 degrees hotter than the bottom ones (possibly more with the 290).
> 
> Your case overall should be moving at least 250 CFM of air (intake and exhaust) for this kind of open-air Crossfire setup. *That being said, AMD cards are engineered for 95-degree operation. What I worry about is the VRM cooling: for the Tri-X, with the fans at 50% (the threshold of silence), your VRMs can reach 70 degrees, and that's a single GPU in a well-ventilated case. Expect 80-90 (or even more with overclocking) with 3 of these open-air cards in close proximity.*


Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Honestly if I was running Trifire on air I'd just never use a side panel and get a *giant window fan* and point it directly at my rig. Get one of those power saving power strips, use your PC as master and fan as slave so every time your PC turns on the fan spins up. Profit.
> 
> That or you could install a mechanism that automatically transports ice from your freezer to your PC and gets rid of the melted water.. A water-cooled-air-cooled rig, now that would be impressive.


Heed this advice.

Make no mistake, there is going to be a TON of heat to deal with. As Boogie said, your VRM1s will be cooking, and that will cause issues/instability far before 90C+ core temps will.

If you're going to run 3 cards and you don't want issues, you should run a custom loop if the budget allows. The next best solution is probably to use AIOs with Kraken G10 brackets on some (if not all) of the cards, so you're directing most of the heat out of the case through a radiator. This will significantly help your cause of keeping the VRM1s cool.

I got phenomenal results with the Kraken G10 in the little Corsair 250D. Even OC'd to 1075/1375 +37 mV, the core and VRM1 stay plenty cool. You can see it in the link below:

http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition/20#post_22214255

Quote:


> Originally Posted by *DeadlyDNA*
> 
> but wait theres more!! use the heated water to supply your shower and sinks!


Now that's being green!


----------



## Dire Squirrel

Quote:


> Originally Posted by *Roboyto*
> 
> Bad cards happen. In the last 2 months I have gotten 3 bad cards...1 DOA and 2 with issues.


This makes me sad: that you are willing to keep going through this and still come running to their rescue when someone dares to question the state of things.

This is not about "being upset" or, as you put it, "whining". It is about not accepting such a horrible level of quality. If you had 3 coffee makers go bad in such a short time, I dare say you would not be running out to buy number 4. When we only have 2 companies to choose from, we need to be even less tolerant.

I have no general problem with either company. I have used an AMD card for the past year without a single problem, Nvidia before that, and AMD before that one. I have very rarely had any issues worse than the occasional unstable driver.

But past success does not excuse current failure. They are in it for the money. If we allow such common and serious problems to go unchecked, they will care just a little bit less about quality next time.
Just look at Apple. That is exactly what happened to them.


----------



## magicase

Quote:


> Originally Posted by *Dasboogieman*
> 
> Your biggest issue with that setup is extracting the hot exhaust air. Assuming you have a monstrous case + motherboard that allows the requisite 1-slot gap between the cards, you still have the dilemma that all those cards have horizontal heatsink fins, i.e. the hot air goes sideways into the motherboard and out the side. Your secondary issue is ensuring the cards have a steady supply of cold air to prevent re-use of the hot air.
> 
> You will need a very high-CFM side case fan (about 180 CFM recommended) to extract the hot air (I personally used a Silverstone 180 mm Air Penetrator back when I had SLI Palit GTX 570 Dual Fans, which also dumped their hot air sideways), plus a reasonable-CFM front fan, preferably with a shroud that covers all the GPUs (my Cooler Master HAF 932 case came with this accessory and a 120 mm fan), to force-feed cold air and minimize the risk of the top cards swallowing the hot air from the bottom cards. Even then, the top cards usually run 4-5 degrees hotter than the bottom ones (possibly more with the 290).
> 
> Your case overall should be moving at least 250 CFM of air (intake and exhaust) for this kind of open-air Crossfire setup. That being said, AMD cards are engineered for 95-degree operation. What I worry about is the VRM cooling: for the Tri-X, with the fans at 50% (the threshold of silence), your VRMs can reach 70 degrees, and that's a single GPU in a well-ventilated case. Expect 80-90 (or even more with overclocking) with 3 of these open-air cards in close proximity.


Quote:


> Originally Posted by *Dire Squirrel*
> 
> They definitely won't be cool, but they should survive. I have seen it done with the horrible reference cooler.
> AMD GPUs are generally very good at handling temperatures that would make Nvidia GPUs blow up.
> 
> I would make sure to have very good airflow around them.


I have the Silverstone FT02 case, so airflow won't be an issue. The cards will be sitting vertically, so that should help a bit.

Another option is to buy the In Win D-Frame case, which is fully open, and add Scythe GT AP-15 fans to it.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Dire Squirrel*
> 
> This makes me sad: that you are willing to keep going through this and still come running to their rescue when someone dares to question the state of things.
> 
> This is not about "being upset" or, as you put it, "whining". It is about not accepting such a horrible level of quality. If you had 3 coffee makers go bad in such a short time, I dare say you would not be running out to buy number 4. When we only have 2 companies to choose from, we need to be even less tolerant.
> 
> I have no general problem with either company. I have used an AMD card for the past year without a single problem, Nvidia before that, and AMD before that one. I have very rarely had any issues worse than the occasional unstable driver.
> 
> But past success does not excuse current failure. They are in it for the money. If we allow such common and serious problems to go unchecked, they will care just a little bit less about quality next time.
> Just look at Apple. That is exactly what happened to them.


Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Basically if you don't like technology breaking don't buy technology, because you WILL get defective stuff. It's just the game.


Every company is in it for the money. If you don't want to support either graphics card manufacturer, then don't, and have fun running Intel graphics...


----------



## bak3donh1gh

OK, do you guys have any reason I should go with this ASUS card over this PowerColor card?

Thanks


----------



## VSG

None, unless you are watercooling. Otherwise, get the PowerColor PCS+.


----------



## chiknnwatrmln

Yeah man, I've heard of fan issues with the ASUS cards; more specifically, they can leak oil. Also, the DCUII cooler is not optimized for Hawaii and does not cool it as well.

The PCS+ has some of the best cooling for Hawaii, and IMO the best looks.


----------



## hwoverclkd

Quote:


> Originally Posted by *bak3donh1gh*
> 
> OK, do you guys have any reason I should go with this ASUS card over this PowerColor card?
> 
> Thanks


c'mon man, get a 290x and join the fun


----------



## bak3donh1gh

And spend an extra $150? No thanks, this is going to drain me as it is, with a headset or receiver to buy too.


----------



## Roboyto

Quote:


> Originally Posted by *bak3donh1gh*
> 
> OK, do you guys have any reason I should go with this ASUS card over this PowerColor card?
> 
> Thanks


Quote:


> Originally Posted by *geggeg*
> 
> None, unless you are watercooling. Otherwise, get the PowerColor PCS+.


Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Yeah man, I've heard of fan issues with the ASUS cards; more specifically, they can leak oil. Also, the DCUII cooler is not optimized for Hawaii and does not cool it as well.
> 
> The PCS+ has some of the best cooling for Hawaii, and IMO the best looks.


What they said.

Only 3 of the 5 heatpipes on the DCUII cooler make contact with the 290 die, as the cooler is taken directly from the 780. PowerColor is the better card here.



Things you should know if you want a 290(X)

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781


----------



## Arizonian

Quote:


> Originally Posted by *lawson67*
> 
> Arizonian, can you list me please? I have two R9 290 cards: one is a PowerColor PCS+ and the other is a Gigabyte Windforce R9 290... Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *Sleazybigfoot*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Sapphire R9 290's - Stock cooler


Congrats - added









_Nice screen shot for submission_

Quote:


> Originally Posted by *ArbyWan*
> 
> Hello everyone, just got my 290X in today, picture is to follow....
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Hope that gets me in, will take screens when I get home from work
> 
> 
> 
> 
> 
> 
> 
> Can't wait to unleash this beast as I am coming from a single HD7770GHz


Congrats - added









_I had a feeling with dropped prices again that ownership would be on the uptick._


----------



## hwoverclkd

Quote:


> Originally Posted by *Dire Squirrel*
> 
> This makes me sad. That you are willing to keep going through this and still come running to their rescue when someone dare question the state of things.
> 
> This is not about "being upset" or as you put it "whining". It is about not accepting such a horrible level of quality. If you had 3 coffee makers go bad in such a short time, I dare say you would not be running out to buy number 4. When we only have 2 companies to choose from, we need to be even less tolerant.
> 
> I have no general problems with either company. I have used a AMD card for the past year with not a single problem. Nvidia before that and AMD before that one. I have very rarely had any issues worse than the occasional unstable driver.
> 
> But past success does not excuse current failure. They are in it for the money. If we allow such common and serious problems to go unchecked, they will care just a little bit less about quality, next time.
> Just look at Apple. That is exactly what happened to them.


Conceal, don't feel, don't let them know...well, now they know...let it go, let it go


----------



## DeadlyDNA

Quote:


> Originally Posted by *acupalypse*
> 
> Conceal, don't feel, don't let them know...well, now they know...let it go, let it go


I'd like to see it work well for the guy. The more frustrated he gets with it, the less likely he will be able to troubleshoot issues. Of course it would be nice to have it working out of the box. Sometimes it just isn't meant to be.


----------



## Goride

Anyone have a clue why my temps are so hot?

I have a NZXT g10 installed on my 290x. The AIO is an AMD Liquid Cooler System from the FX-8150 (rebadged Antec Kuhler 920). I replaced the fans on the AIO with Gentle Typhoon fans in a push-pull configuration.

I have the g10 fan blowing on the vrm1 area.

I have a Gelid enhancement kit for the vrm1 heatsinks and vrm2 heat sinks. I replaced the thermal padding of the vrm1 heatsink with fujipoly.

When I touch the heatsink with my fingers it is scorching hot, so it is transferring heat to the sink.

At 1050mhz core / 1350mhz vram and stock voltage, at load while running Heaven, my GPU temp is 65c. This is a bit higher than others, but I can live with this. (I have reseated the AIO a couple times now, and I am using Gelid Extreme3 paste).

My vrm1 temps though are 95c. The g10 fan just cannot seem to cool the heatsink down. This is what I am really concerned about. This seems way too high. (I had a different heatsink setup on it, and I thought that was the problem, but I know others are getting good results with the gelid enhancement kit heatsink). If I try to overclock it, and put +133 voltage so I can go 1200/1500, the vrm1 temps got to 112c before I shut it down.

This was all with my case side panel open. When I put my side panel back on, with a fan blowing at the 290x, the vrm1 temps drop to about 91c @ stock.

Any ideas?


----------



## heroxoot

Quote:


> Originally Posted by *Goride*
> 
> Anyone have a clue why my temps are so hot?
> 
> I have a NZXT g10 installed on my 290x. The AIO is an AMD Liquid Cooler System from the FX-8150 (rebadged Antec Kuhler 920). I replaced the fans on the AIO with Gentle Typhoon fans in a push-pull configuration.
> 
> I have the g10 fan blowing on the vrm1 area.
> 
> I have a Gelid enhancement kit for the vrm1 heatsinks and vrm2 heat sinks. I replaced the thermal padding of the vrm1 heatsink with fujipoly.
> 
> When I touch the heatsink with my fingers it is scorching hot, so it is transfering heat to the sink.
> 
> At 1050mhz core / 1350mhz vram and stock voltage, at load while running Heaven, my GPU temp is 65c. This is a bit higher than others, but I can live with this. (I have reseated the AIO a couple times now, and I am using Gelid Extreme3 paste).
> 
> My vrm1 temps though are 95c. The g10 fan just cannot seem to cool the heatsink down. This is what I am really concerned about. This seems way too high. (I had a different heatsink setup on it, and I thought that was the problem, but I know others are getting good results with the gelid enhancement kit heatsink). If I try to overclock it, and put +133 voltage so I can go 1200/1500, the vrm1 temps got to 112c before I shut it down.
> 
> This was all with my case side panel open. When I put my side panel back on, with a fan blowing at the 290x, the vrm1 temps drop to about 91c @ stock.
> 
> Any ideas?


Cooler than mine. On 1030 stock clock it gets to 78c and with my 1100 it hits 85c during heaven. Stays lower in games.


----------



## hwoverclkd

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I'd like to see it work good for the guy. The more frustrated he gets with it the less likely he will be able troubleshoot issues. Of course it would be nice to have it working out the box. Sometimes it just isn't meant to be.


i understand. It's just sad for such a high end gpu.


----------



## kizwan

No black screen here with both of my 290's at stock & overclock, idle or under load. I did get BSOD/freeze/crash when overclocked too high or voltage too low but never black screen.
Quote:


> Originally Posted by *Roboyto*
> 
> My first 290 has never black screen problem on 13.12; only when using anything 14.X
> 
> Tried 14.4 on my 2nd 290, fresh Windows installation, and within the first 3-4 hours it black screened twice while browsing the web or sitting idle; Reverted to 13.12 and it's back to normal.
> 
> Both my machines running Win7 x64 Ultimate
> Quote:
> 
> 
> 
> Originally Posted by *acupalypse*
> 
> apparently....and there are only few chosen ones who don't experience it.... i have 290x lightning, returned the first due to excessive whine and black screen...and now the 2nd has no more whine but worse on black screen (better overclocker than the first though
> 
> 
> 
> 
> 
> 
> 
> )
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *acupalypse*
> 
> you're not alone mate...i get black screen regardless of driver version
> 
> Click to expand...
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dire Squirrel*
> 
> I have had black screen with 13.12
> 
> Click to expand...
> 
> Quote:
> 
> 
> 
> Originally Posted by *DeadlyDNA*
> 
> I dont have 290x but i have 4 290's only time i have gotten a black screen was when i overclocked. I have never had a black screen on stock settings that i recall.. Once i started messing with Overdrive and stuff then i had a few black screens. There are others who say they never overclocked and still got black screens....
> 
> Click to expand...
Click to expand...

Quote:


> Originally Posted by *Dire Squirrel*
> 
> 2 black screen and hardlock in less than 15 minutes.
> And AMD don't care one single bit. They got their money so as far as they are concerned, we don't exist.
> 
> I really hope they loose a lot of customers permanently over this.


Don't want to RMA? Obviously something is wrong with your cards. Sell your cards & return to the Nvidia camp.
Quote:


> Originally Posted by *Dire Squirrel*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> If they cant reproduce it then its not defective. And amd cant do anything about that. AIBs controls the RMA not amd. You got a bad gpu i got bad gpus for over 10 years. Just stop ranting and accept the fact that sometimes you get a bad product. Black screen issue with hawaii is really a strange one i had a hawaii on my msi i had black screens on my asus mobo i didint.
> 
> 
> 
> 
> 
> 
> 
> When it is this common a problem, you can't simply excuse it with "you got a bad one".
> And since this many have been able to get through QA in the first place, it is quite clear that however they "test" them, they are not doing it in a way that realistically resembles the average buyer's use.
> 
> And you can't simply take all the responsibility away from AMD. It makes no difference if they do not make them in house. Just like the monitor I RMA'ed last year. The company who's brand name is on it, did not make the actual panel. But they put their name on it and in doing so they took on the responsibility. They are also the last ones to do QA and give the product the green light to be sold.
> If they so choose, they can take it up with the panel supplier, but as far as the end customers is concerned, the ball stops at the company who's logo is on the product.
> 
> *By your logic we could also just blame pretty much everything on Foxconn, since they supply to pretty much everyone and pretty much every product that is even remotely electronic.* But obviously that would make no sense.
Click to expand...









AMD/ATI AIB partners include Sapphire, PowerColor, MSI, etc. So, they're the ones who handle RMA, not AMD. Like you said, as far as the end customers are concerned, the ball stops at the AIB partners. They may or may not take it up to AMD, depending on the problem.


----------



## DeadlyDNA

Quote:


> Originally Posted by *rt123*
> 
> Can you Gents please tell me how to take a ScreenShot & save it while my PC is BlackScreened .?
> 
> 
> 
> 
> 
> 
> 
> 
> I would like to try this.
> Too high of memory clock, you can't blame AMD for it.
> But I think you already knew it.
> 
> Same here I haven't had a single blackscreen unless pushing the memory too high.
> 
> To be Honest, seeing these people here, I feel like I am the Chosen one.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Had 2 Nvidia 780s before, moved to AMD & just uninstalled Nvidia Drivers.
> Worked perfectly.
> Moved back & forth between 13.12 & 14.4 four times within 2 hours & used AMD's DDU, nothing else.
> Zero problems. Plus I am on Win 8.1 which people hate with a passion.
> 
> I don't get what is wrong with these people or their Computers.


To print screen a black screen. Start>Run>type "mspaint" hit enter>click paint bucket icon> click in empty area>save as jpg>upload = WINNING! j/k

My background on desktop is black, i come prepared for black screens!

Sometimes it's just luck of the draw, i guess... I just get aggravated with drivers, the 14.x ones to be specific.


----------



## Dasboogieman

Quote:


> Originally Posted by *Goride*
> 
> Anyone have a clue why my temps are so hot?
> 
> I have a NZXT g10 installed on my 290x. The AIO is an AMD Liquid Cooler System from the FX-8150 (rebadged Antec Kuhler 920). I replaced the fans on the AIO with Gentle Typhoon fans in a push-pull configuration.
> 
> I have the g10 fan blowing on the vrm1 area.
> 
> I have a Gelid enhancement kit for the vrm1 heatsinks and vrm2 heat sinks. I replaced the thermal padding of the vrm1 heatsink with fujipoly.
> 
> When I touch the heatsink with my fingers it is scorching hot, so it is transfering heat to the sink.
> 
> At 1050mhz core / 1350mhz vram and stock voltage, at load while running Heaven, my GPU temp is 65c. This is a bit higher than others, but I can live with this. (I have reseated the AIO a couple times now, and I am using Gelid Extreme3 paste).
> 
> My vrm1 temps though are 95c. The g10 fan just cannot seem to cool the heatsink down. This is what I am really concerned about. This seems way too high. (I had a different heatsink setup on it, and I thought that was the problem, but I know others are getting good results with the gelid enhancement kit heatsink). If I try to overclock it, and put +133 voltage so I can go 1200/1500, the vrm1 temps got to 112c before I shut it down.
> 
> This was all with my case side panel open. When I put my side panel back on, with a fan blowing at the 290x, the vrm1 temps drop to about 91c @ stock.
> 
> Any ideas?


When you installed the GELID enhancement heatsink, did you use the washers provided? If you did, try removing them and install the sink bare with 1mm thick thermal pads (which is what I did). The only time my VRMs got that hot with my G10 setup was when I was running EVGA OC scanner and I forgot to plug in the 92mm fan.


----------



## PJFT808

Quote:


> Originally Posted by *Goride*
> 
> Anyone have a clue why my temps are so hot?
> 
> I have a NZXT g10 installed on my 290x. The AIO is an AMD Liquid Cooler System from the FX-8150 (rebadged Antec Kuhler 920). I replaced the fans on the AIO with Gentle Typhoon fans in a push-pull configuration.
> 
> I have the g10 fan blowing on the vrm1 area.
> 
> I have a Gelid enhancement kit for the vrm1 heatsinks and vrm2 heat sinks. I replaced the thermal padding of the vrm1 heatsink with fujipoly.
> 
> When I touch the heatsink with my fingers it is scorching hot, so it is transfering heat to the sink.
> 
> At 1050mhz core / 1350mhz vram and stock voltage, at load while running Heaven, my GPU temp is 65c. This is a bit higher than others, but I can live with this. (I have reseated the AIO a couple times now, and I am using Gelid Extreme3 paste).
> 
> My vrm1 temps though are 95c. The g10 fan just cannot seem to cool the heatsink down. This is what I am really concerned about. This seems way too high. (I had a different heatsink setup on it, and I thought that was the problem, but I know others are getting good results with the gelid enhancement kit heatsink). If I try to overclock it, and put +133 voltage so I can go 1200/1500, the vrm1 temps got to 112c before I shut it down.
> 
> This was all with my case side panel open. When I put my side panel back on, with a fan blowing at the 290x, the vrm1 temps drop to about 91c @ stock.
> 
> Any ideas?


*I noticed when I was installing mine the vrm1 sink wasn't really making good enough contact. The supplied pad wasn't thick enough. So I had to use what I had. I layered some junplus until it was thick enough to make nice contact (visible squish). I didn't want to do it that way but the temps turned out great.*


----------



## Dire Squirrel

Quote:


> Originally Posted by *acupalypse*
> 
> Conceal, don't feel, don't let them know...well, now they know...let it go, let it go


Quote:


> Originally Posted by *DeadlyDNA*
> 
> I'd like to see it work good for the guy. The more frustrated he gets with it the less likely he will be able troubleshoot issues. Of course it would be nice to have it working out the box. Sometimes it just isn't meant to be.


Do not mistake harsh words for a lack of composure.
It is more than a little patronizing and downright offensive to simply assume that someone you don't agree with is merely having some sort of emotional episode, like a spoiled teenager.
Nothing could be further from the truth.

Am I frustrated? A little bit. But not on any level that would even register as a reaction. The kind of frustration you are thinking of is very unproductive.
What I feel is closer to the sort of low-level anger that gets things done. The kind that makes you want to fix things rather than just give up. That is a productive response.

You may think I'm crazy, or that it is not worth it, but I do feel strongly that you should never just accept things. Sure, there will always be a few bad ones in every batch; that is near impossible to avoid. But in this case it is not a few bad ones, but so many subpar products that it is considered a lottery (not the common silicon lottery, mind you). And every time someone just accepts it or excuses it, we are telling them it's OK not to care about quality.

This is not the good old days when things were built by craftsmen who took pride in their work. Today, you will only ever get as good a product as the collective market demands. As the ones spending the money, we should have power. But we are losing that power by not using it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Do not mistake harsh words for a lack of composure.
> It is more than a little patronizing and downright offensive to simply assume that someone you don't agree with, is merely having some sort of emotional episode, like a spoiled teenager.
> Nothing could be further from the truth.
> 
> Am I frustrated? A little bit. But not on any level that would even register as a reaction. The kind of frustration that you are thinking about, is very unproductive.
> What I feel is closer to the sort of low level anger that gets things done. The kind that makes you want to fix things rather than just give up. That is a productive response.
> 
> You may think I'm crazy, or that it is not worth it, but I do feel strongly that you should never just accept things. Sure there will always be a few bad ones in every batch. That is near impossible to avoid. But in this case it is not a few bad ones, but so many subpar products that it is considered a lottery (not the common silicon lottery, mind you). And every time someone just accepts it or excuse it, we are telling them that it''s OK for them to not care about quality.
> 
> This is not the good old days where things were build by craftsmen who took pride in their work. Today, you will only ever get as good a product as the collective market demands. As the ones spending the money, we should have power. But we are loosing that power by not using it.


There aren't as many as you seem to think. I've been following this thread from day one and I can honestly say that there are (I think) fewer than 20 complaints about black screens in here on stock clocks; nearly every single one was from a day 1 batch or a used card (likely day 1 as well).

You never answered before either: is your card an ex-mining card, or is it used?

And AMD did acknowledge the black screen issue; they released a driver for it.

EDIT: Look what i found: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/6560_40#post_21223467

And for the majority of people (myself included) this fixed the issue.


----------



## Dire Squirrel

Quote:


> Originally Posted by *Sgt Bilko*
> 
> There aren't as many as you seem to think, I've been following this thread from day one and i can honestly say i that there are (i think) less than 20 complaints about black screens in here being on stock clocks, nearly every single one was from a day 1 batch or a used card (likely day 1 as well)


Take a quick peek outside this thread. On this forum there are several threads devoted to just that. And if you take a look around the internet as a whole you will be surprised.
This is a good place to start: http://www.overclock.net/t/1441349/290-290x-black-screen-poll

And one single post mentioning some mythical magic driver, which is never even mentioned by name/number, is meaningless. Especially when not a single AMD driver release was anywhere near that date (should be 21-11-13 based on the post). The closest to that is 13.12, which was 3 weeks later and certainly did not fix anything.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Take a quick peek outside this thread. On this forum there are several threads devoted to just that. And if you take a look around the internet as a whole you will be surprised.
> This is a good place to start: http://www.overclock.net/t/1441349/290-290x-black-screen-poll
> 
> And one single post mentioning some mythical magic driver which is never even mentioned by name/number, is meaningless. Especially when not a single AMD driver release was anywhere near that date (should be 21-11-13 based on the post). The closest to that is 13.12 which was 3 weeks later and certainly did not fix anything.


It was a beta driver release (13.11b, I believe) which was later incorporated into the 13.12 driver.

I am aware of the Black Screen poll thread as it was posted here several times; however, you still haven't answered a simple question.

Was your card an ex-miner or used?

If it was, it was probably a day 1 batch, which we (and AMD) now know to be prone to these issues.

Other than that, plenty of people have given you options: RMA it, sell it (and the block), or just switch to the Green team.

Also, have you submitted a bug report to AMD about this?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *bak3donh1gh*
> 
> ok do you guys have any reason I should go with this Asus card over this powercolor card?
> 
> Thanks


The Asus air cooler is not as good as the Powercolor's, and flash the PT1T unlocked vbios on to it first . If you're seeking even more out of it, and there is a block for it, I would watercool it as well later on ......


----------



## bak3donh1gh

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> flash PT1T unlocked vbios on to it first .


Thanks for the info! It was just the thing I was looking for. Does this mean that the Powercolor is a reference model and it's possible to unlock, or are you just spreading info around?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *bak3donh1gh*
> 
> Thanks for the info! it was just the thing I was looking for. Does this mean that the powercolor is reference model and its possible to unlock or are you just spreading info around?


No, that Powercolor is not a ref one . Reference cards have the loud single fan cooler and are the easiest ones to waterblock and see the temps drop 50% . XFX here in OZ do unlock to 290x, I've got one . Powercolor ref, not so much


----------



## Sgt Bilko

Quote:


> Originally Posted by *bak3donh1gh*
> 
> Thanks for the info! it was just the thing I was looking for. Does this mean that the powercolor is reference model and its possible to unlock or are you just spreading info around?


You'd be lucky to find a 290 that unlocks, it would need to be a Ref model (Ref PCB and Cooler, very early batch)

The Asus DCU II and Matrix cards have terrible air-cooling capabilities, while the Powercolor PCS+ is a very good air cooler; the only downside is that it's a 2.5-slot card as opposed to most of the 2-slot cards around.

The PT1 bios is a modded bios that allows extra overclocking capabilities.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You'd be lucky to find a 290 that unlocks, it would need to be a Ref model (Ref PCB and Cooler, very early batch)
> 
> The Asus DCU II and Matrix cards have terrible air-cooling capabilities while the Powercolor PCS+ is a very good air cooler and the only downside to it is it's a 2.5 slot card opposed to most of the 2 slot cards around.
> 
> The PT1 bios is a modded bios that allows extra overclocking capabilities.


Yep what he said too


----------



## bak3donh1gh

Aah thanks Bilko, I thought homecinema might be spreading info that could possibly fry my card if I wasn't careful. It's a little late now, but could this card possibly have been unlocked?


----------



## Sgt Bilko

Quote:


> Originally Posted by *bak3donh1gh*
> 
> Aah thanks bilko i thought homecinema might be spreading info that could possibly fry my card if I wasnt careful. its a little late now but could this card possibly been unlcoked?


PT1 won't fry your card unless you are adding an obscene amount of voltage to it.

HC-PC knows what he is doing............ (sometimes)

And that has a *chance* of unlocking, no promises.......lottery and all that


----------



## bak3donh1gh

yea i figured, it's not worth the extra $100 though


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *bak3donh1gh*
> 
> Aah thanks bilko i thought homecinema might be spreading info that could possibly fry my card if I wasnt careful. its a little late now but could this card possibly been unlcoked?


Wadda you mean spreading info that could fry your card? Only YOU can fry your card if you don't know fully what you're doing .

Quote:


> Originally Posted by *Sgt Bilko*
> 
> PT1 won't fry your card unless you are adding an obscene amount of voltage to it.
> 
> HC-PC knows what he is doing............(sometimes
> 
> 
> 
> 
> 
> 
> 
> )
> 
> And that has a *chance* of unlocking, no promises.......lottery and all that


40 HWBOT world records overclocking 290s in single, CF and TRI in most benchmarks makes me very qualified, dontchya think


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> .
> 40 HWBOT world records overclocking 290 in single CF and TRI in a most benchmarks makes me very qualified dontchya think


I thought i was giving you a compliment









(It's ok, i know you are qualified)


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I thought i was giving you a compliment
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (It's ok, i know you are qualified)


Yeah I know, it's all good

I wanted to say: spread some more info LooooL


----------



## bak3donh1gh

I thought that you might be telling people to flash their card with an incompatible bios. I didn't mean it as an insult, just that you might be trying to be helpful and unintentionally screwing people over. I didn't know you do this as your job.


----------



## hwoverclkd

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Do not mistake harsh words for a lack of composure.
> It is more than a little patronizing and downright offensive to simply assume that someone you don't agree with, is merely having some sort of emotional episode, like a spoiled teenager.
> Nothing could be further from the truth.
> 
> Am I frustrated? A little bit. But not on any level that would even register as a reaction. The kind of frustration that you are thinking about, is very unproductive.
> What I feel is closer to the sort of low level anger that gets things done. The kind that makes you want to fix things rather than just give up. That is a productive response.
> 
> You may think I'm crazy, or that it is not worth it, but I do feel strongly that you should never just accept things. Sure there will always be a few bad ones in every batch. That is near impossible to avoid. But in this case it is not a few bad ones, but so many subpar products that it is considered a lottery (not the common silicon lottery, mind you). And every time someone just accepts it or excuse it, we are telling them that it''s OK for them to not care about quality.
> 
> This is not the good old days where things were build by craftsmen who took pride in their work. Today, you will only ever get as good a product as the collective market demands. As the ones spending the money, we should have power. But we are loosing that power by not using it.


I don't know about the others, but I can certainly speak for myself, and I can certainly feel ya since I'm still in the same boat as you are. Like I said, this is already my 2nd 290x Lightning and I'm tired of this issue and no longer seeking an answer, especially after MSI said it's the GPU going bad when I described my issue. They didn't say I WAS the problem ...I'm sure neither are you. I trust MSI is pointing to the right root cause when I talked to them.

I don't care what others would say; I know what I know. They can call me crazy, but I know myself better than they do. Sad that I spent $$$ on this high-end gpu and yet my 580, 760, and 780 Ti ARE JUST FINE (not to mention the on-board intel is as well). That alone, without deep troubleshooting, would almost be enough to narrow down the cause. I have caught this on video many times and I can easily post it if anyone requests it, together with voltage readings (gpu, memory). But then again, I am aware there ARE many variants of black screen, so it may not always be caused by a bad gpu.


----------



## hwoverclkd

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Other than that, plenty of people have given you options, RMA it, Sell it (and the block) or just switch to the Green team.
> 
> Also, have you submitted a bug report to AMD about this?


This is actually my problem (not the black screen anymore): which among the green team members should I choose without spending extra?







I know this lightning is very close to 780 ti sc but I already have that lol


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *bak3donh1gh*
> 
> I thought that you might be telling people to fLash there card with a incompatible bios. I didnt mean it as an insult just that you're trying to be helpful and screwing people over . I didn't know you do this as you're job.


I think you missed some words there .








As far as I'm aware you can flash any kind of 290 / 290x bios, just the results will differ . As long as you keep the original bios on one of the bios switch positions, you can easily recover from a bad flash.
Flashing is easy . I was hesitant too till I did it, and apart from watercooling my 290s it is one of the best modifications you can do to these red things.
Benchmarking isn't my job, it's my hobby.
Driving trucks is my job.
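For anyone hesitant about the process, the usual routine with the ATIFlash command-line tool can be sketched roughly as below. The adapter index and ROM filenames here are illustrative only, not specific to any card in this thread; always save the stock BIOS first, and leave the card's switch on the untouched position until the new flash is verified:

```shell
# List installed adapters and their indices to confirm which one is the 290
atiflash -i

# Save a backup of adapter 0's current BIOS before changing anything
atiflash -s 0 stock_290_backup.rom

# Flip the card's BIOS switch to the position you intend to overwrite,
# then program the modded ROM (filename is illustrative, e.g. a PT1/PT1T image)
atiflash -p 0 pt1_290.rom

# Reboot so the new BIOS takes effect; if the card fails to POST,
# flip the switch back to the stock position and re-flash the backup ROM
```

If a flash does go bad, booting on the stock-BIOS switch position and re-running the `-p` step with the backup ROM restores the card, which is exactly why the dual-BIOS switch makes this fairly safe.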


----------



## rdr09

Quote:


> Originally Posted by *acupalypse*
> 
> This is actually my problem (not the black screen anymore), which among the green team members should I choose without spending extra?
> 
> 
> 
> 
> 
> 
> 
> I know this lightning is very close to 780 ti sc but I already have that lol


lol. you can't even beat a 290 with your lightning. how can you even compare to the Ti?

the issue is in front of the monitor. you can't even get your cpu oc stable.


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I think you missed some words there .
> 
> 
> 
> 
> 
> 
> 
> 
> As far as im aware you can flash any kind of 290 / 290x bios's just the results will differ . As long as you keep the original bios on setting on one of the bios switch positions , you can easily recover from a bad flash
> Flashing is easy . I was hesitant too till I did it and apart from watercooling my 290s it is one of the best modifcations you can do to these red things
> Benchmarking isn't my job , its my hobby
> Driving trucks is my job


This ^


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> lol. you can't even beat a 290 with your lightning. how can you even compare to the Ti?
> 
> the issue is in front of the monitor. you can't even get your cpu oc stable.


LooooL burned him good esse


----------



## Tokkan

@Homecinema
Oh the irony...
I'm an IT tech who stress tests components and repairs computers for a living, and one of my hobbies is just sitting in a truck driver simulator with some nice 80's music, tagging along.









Obviously it's not even remotely the same as driving a real truck, but it's still a fun comparison eh?









Edit: Regarding flashing the vbios yea you're correct.
These cards have dual bios, so even if someone screws up they can just flip the switch to boot, then flip it back and flash the correct one again; you can have a benchmarking bios and a 24/7 bios, as long as everything is kept under control. But no amount of words said to anyone can make them ruin anything; it is doing it without knowledge that does that.

I've had a 16 year old kid purchase a 1200€ system from me unassembled. I told his mother that I could've assembled it for a measly 20€ on top of the current price.
Well the kid said he knew how to do it cause he saw videos on youtube.
Well okay then... if you're sure. Conclusion: He didn't put in the motherboard standoffs and fried it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Tokkan*
> 
> @Homecinema
> Oh the irony...
> I'm a IT tech that stress tests components and repairs computers for a living and one of my hobbies is just sitting on truck driver simulator with some nice 80's music and just tag along.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Obviously it's not even remotely the same as driving a real truck, but it's still a fun comparison, eh?


I operate earth-moving equipment for a living......all walks of life hey?


----------



## Tokkan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I operate earth-moving equipment for a living......all walks of life hey?


Shouldn't you have a bulldozer then?








Or are we not talking about that type of earth-moving equipment?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Tokkan*
> 
> Shouldn't you have a bulldozer then?
> 
> 
> 
> 
> 
> 
> 
> 
> Or are we not talking about that type of earth-moving equipment?


I had a Bulldozer (Wife's Rig), now it's Piledriver and then maybe Excavator or whichever AMD launch in 2016


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Tokkan*
> 
> @Homecinema
> Oh the irony...
> I'm an IT tech who stress tests components and repairs computers for a living, and one of my hobbies is just sitting in a truck driving simulator with some nice 80's music, tagging along.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Obviously it's not even remotely the same as driving a real truck, but it's still a fun comparison, eh?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Regarding flashing the vbios, yeah you're correct.
> These cards have dual BIOS, so even if someone screws up they can just flip the switch to boot, then flip it back to flash the correct one again; you can have a benchmarking BIOS and a 24/7 BIOS, as long as everything is kept under control. But no amount of words can stop anyone from ruining anything. It is doing it without knowledge that causes that.
> 
> I've had a 16-year-old kid purchase a 1200€ system from me unassembled. I told his mother that I could have assembled it for a measly 20€ on top of the price.
> Well, the kid said he knew how to do it because he saw videos on YouTube.
> Well okay then... if you're sure. Conclusion: he didn't put in the motherboard standoffs and fried it.


If I could have got that on video it would make for a good avatar


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> LooooL burned him good esse


i was just gonna say - XBOX but . . .


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> i was just gonna say - XBOX but . . .


Pppffffftttttt








I just might put that quote in my sig ..... Gold


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I had a Bulldozer (Wife's Rig), now it's Piledriver and then maybe Excavator or whichever AMD launch in 2016


Hyperthreading is a good start


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Hyperthreading is a good start


Depending on how they go it might have a version of hyper threading









Too early to tell yet, but here's hoping for some good competition


----------



## cennis

For all you water cooling experts,

With ambient temp water and a blocked 290, is it easy to maintain <50C with a 1200MHz core overclock and a 150-200mV offset?
If so, how much rad space do you need per 290?

I am asking because my H90 is basically a 140mm rad dedicated to a 290, with pretty good mount pressure, and in Valley/BF4 I've seen temps peak at 60C in a 25C room with max stock fan.
To achieve the same performance (50C) as a custom loop, do I just need more rad space, or is it the H90 cold-plate that needs to be lapped?


----------



## Matt-Matt

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> No, that PowerColor is not a ref one. Reference cards have the loud single-fan cooler and are the easiest ones to waterblock and see the temps drop 50%. XFX here in OZ do unlock to 290X; I've got one
> 
> 
> 
> 
> 
> 
> 
> PowerColor ref, not so much


For reference cards it's just the same unlock chance as everyone else, though... Mine didn't unlock, mind you I was a bit late to the party too (my card is v1.1).


----------



## Dasboogieman

Quote:


> Originally Posted by *cennis*
> 
> For all you water cooling experts,
> 
> With ambient temp water and a blocked 290, is it easy to maintain <50C with a 1200MHz core overclock and a 150-200mV offset?
> If so, how much rad space do you need per 290?
> 
> I am asking because my H90 is basically a 140mm rad dedicated to a 290, with pretty good mount pressure, and in Valley/BF4 I've seen temps peak at 60C in a 25C room with max stock fan.
> To achieve the same performance (50C) as a custom loop, do I just need more rad space, or is it the H90 cold-plate that needs to be lapped?


How are you mounting your H90? Those temps seem a bit on the high side.
I've got an NZXT G10 with the H90 and I average around 45 degrees under load (25-ish degrees ambient) at stock, and maybe 52 degrees with +150mV. The contact from the H90 with the GPU is pretty good, with enough pressure, since the copper plate is slightly curved out from the central flat portion.

As for the full WC loop, assuming your flow rate is sufficient (i.e. no underpowered pump) I would check this site out:
http://skinneelabs.com/triple-radiator-comparison-v2/3/

Essentially from this data, it seems the minimum for quiet operation (~1000RPM fan speed) and a 15 degree delta (350W from GPU, 150W from CPU) is a single XSPC RX360 caliber 360mm Rad.
If you want a 10 degree delta then you probably need an additional 140mm/120mm rad too.
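Those delta figures can be turned into a rough calculator. The watts-per-degree constant below is my extrapolation from the ~1000 RPM numbers above (500W total over a 360mm rad giving a ~15C water-to-air delta), not measured data:

```python
# Back-of-envelope radiator sizing. 500 W over a 360 mm rad -> ~15 C delta,
# i.e. roughly 11 W per degree C per 120 mm of radiator at ~1000 RPM.
# Treat the constant as a ballpark assumption extrapolated from one data point.
W_PER_C_PER_120MM = 500 / 15 / 3  # ~11.1

def water_delta(load_watts, rad_mm_total):
    """Estimated water-over-ambient delta for a heat load and total rad length."""
    segments = rad_mm_total / 120
    return load_watts / (W_PER_C_PER_120MM * segments)

print(round(water_delta(500, 360), 1))  # the single RX360 case above: ~15
print(round(water_delta(500, 480), 1))  # add another 120mm: ~11, near the 10 C target
```

It lines up with the advice above: one extra 120/140 segment is roughly what it takes to move from a 15C to a ~10C delta at the same load.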


----------



## phallacy

Quote:


> Originally Posted by *cennis*
> 
> For all you water cooling experts,
> 
> With ambient temp water and 290 blocked, is it easy to maintain <50c with 1200mhz core overclocks? 150~200mw offset?
> If so, how much rads space do you need per 290?
> 
> I am asking because my h90 is basically a 140mm rad dedicated to a 290, with pretty good mount pressure, and in valley/bf4 I've seen temps peak at 60c, in a 25c room with max stock fan.
> To achieve the same performance (50c) as a custom loop, do I just need more rad space or is the h90 cold-plate that needs to be lapped?


I can give you my experience about keeping those temps.

2 cards in series flow with a 360mm (30mm thick) rad + 280mm Monsta (86mm thick) was able to keep both cards @50C at 1200/1600 with around +130-150mV, I believe. The CPU was in this loop as well.
With 3 cards in series flow, the same two rads mentioned above and the CPU, temps were up to 60C at the same clocks and overvoltage.

With 4 cards in parallel flow going to a dedicated loop with a 480mm (60mm thick), 360mm (30mm thick) and 280mm Monsta, I'm able to get temps usually between 47-51C depending on game/benchmark and duration (30-45min vs 3+ hours).

I can't tell you much about the coldplate you're talking about for the h90 but I'd say a 240mm rad or one of those double AIOs should be able to keep the core under 50 with the clocks and voltages you're using without too much hassle. It's not really the core you should worry about though, it's the VRMs and specifically VRM1 that causes lots of instability with these cards when overclocking


----------



## Goride

Quote:


> Originally Posted by *Dasboogieman*
> 
> When you installed the GELID enhancement heatsink, did you use the washers provided? If you did, try removing them and install the sink bare with 1mm thick thermal pads (which is what I did). The only time my VRMs got that hot with my G10 setup was when I was running EVGA OC scanner and I forgot to plug in the 92mm fan.


I did use the washer.

I did not use the thermal pad that it came with. I double layered some 0.5mm Fujipoly pad that I had. I believe it is making decent contact though, because when I touch the heatsink, it burns my finger.

The problem seems not to be the thermal pads or transferring heat from the VRM to the heatsink. The problem seems to be that the fan on the G10 does not spin fast enough to blow enough air over the heatsink.

I have tried plugging the fan directly into a power source (rather than the MB), which should be forcing it to spin at max speed, but I still do not feel much air coming off it.

I can try your suggestion, but if the heatsink is burning hot, it would seem that this is not the issue.


----------



## hwoverclkd

Quote:


> Originally Posted by *rdr09*
> 
> lol. you can't even beat a 290 with your lightning. how can you even compare to the Ti?
> 
> the issue is in front of the monitor. you can't even get your cpu oc stable.


man, you can say what you want. I have my own results. All you nitpick is the single unstable run out of >10 results... you know how stats work, right? I can tell you're an AMD fanboy







I'm just here to see which card best suits me. I have both cards so I can say which one gives better gaming results. If your card says otherwise, that's good for you. I'm not here to compete


----------



## rdr09

Quote:


> Originally Posted by *acupalypse*
> 
> man, you can say what you want. I have my own results. All you nitpick is the single unstable run out of >10 results... you know how stats work, right? I can tell you're an AMD fanboy
> 
> 
> 
> 
> 
> 
> 
> I'm just here to see which card best suits me. I have both cards so I can say which one gives better gaming results. If your card says otherwise, that's good for you. I'm not here to compete


PS4?

http://www.overclock.net/t/1489262/eurogamer-witcher-3-may-run-better-on-ps4-than-xbox-one#post_22257672


----------



## hwoverclkd

Quote:


> Originally Posted by *rdr09*
> 
> i was just gonna say - XBOX but . . .


you should have said, i don't care


----------



## rdr09

Quote:


> Originally Posted by *acupalypse*
> 
> you should have said, i don't care


why not? it is plug and play. suits you.

just looking after you. now, if you really care . . . ask hotrod for advice instead of yapping here.


----------



## hwoverclkd

Quote:


> Originally Posted by *rdr09*
> 
> why not? it is pnp. suits you.
> 
> just looking after you. now, if you really care . . . ask hotrod for advise instead of yapping here.


Like I said many times, I'm not seeking an answer... I appreciate the help though. You're doing great


----------



## rdr09

Quote:


> Originally Posted by *acupalypse*
> 
> Like I said many times, I'm not seeking an answer... I appreciate the help though. You're doing great


tell us how it goes with the ps4. it should be easy to set up.


----------



## cennis

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Dasboogieman*
> 
> How are you mounting your H90? Those temps seem a bit on the high side.
> I've got an NZXT G10 with the H90 and I average around 45 degrees under load (25-ish degrees ambient) at stock, and maybe 52 degrees with +150mV. The contact from the H90 with the GPU is pretty good, with enough pressure, since the copper plate is slightly curved out from the central flat portion.
> 
> As for the full WC loop, assuming your flow rate is sufficient (i.e. no underpowered pump) I would check this site out:
> http://skinneelabs.com/triple-radiator-comparison-v2/3/
> 
> Essentially from this data, it seems the minimum for quiet operation (~1000RPM fan speed) and a 15 degree delta (350W from GPU, 150W from CPU) is a single XSPC RX360 caliber 360mm Rad.
> If you want a 10 degree delta then you probably need an additional 140mm/120mm rad too.


Quote:


> Originally Posted by *phallacy*
> 
> I can give you my experience about keeping those temps.
> 
> 2 cards in series flow with a 360mm (30mm thick) rad + 280mm Monsta (86mm thick) was able to keep both cards @50C at 1200/1600 with around +130-150mV, I believe. The CPU was in this loop as well.
> With 3 cards in series flow, the same two rads mentioned above and the CPU, temps were up to 60C at the same clocks and overvoltage.
> 
> With 4 cards in parallel flow going to a dedicated loop with a 480mm (60mm thick), 360mm (30mm thick) and 280mm Monsta, I'm able to get temps usually between 47-51C depending on game/benchmark and duration (30-45min vs 3+ hours).
> 
> I can't tell you much about the coldplate you're talking about for the h90 but I'd say a 240mm rad or one of those double AIOs should be able to keep the core under 50 with the clocks and voltages you're using without too much hassle. It's not really the core you should worry about though, it's the VRMs and specifically VRM1 that causes lots of instability with these cards when overclocking






I punched holes into the Intel mounting bracket of the H90 to match the GPU, and used screws/washers to hold it down.
I don't think my mounting is going to be worse compared to a G10; it's essentially the same thing, without a fan bracket.
*Some pics and results of my asus dcuii h90 mount:

*


Spoiler: Warning: Spoiler!















Dasboogieman, what kind of loads did you run? Maybe looping Valley at max settings, or GT4 of 3DMark11?
I am using the stock thermal paste with brand new H90s;
the mount seems to be solid, but at max H90 fan speed the temps stabilize around 58-60C at 1200MHz.

phallacy, that is much more rad space, as it's 360mm + 280mm while my two H90s combined are 140+140, so it seems reasonable that they wouldn't maintain 50C.
Also, if 360mm + 280mm reaches 60C with 3 cards, then having 3x140 (3 H90s) do the same is not bad.


----------



## rt123

Quote:


> Originally Posted by *acupalypse*
> 
> Like I said many times, I'm not seeking an answer... I appreciate the help though. You're doing great


If you aren't seeking answers or solutions, then posting again & again about your issue is just going to piss people off.

Which is what is happening here.


----------



## hwoverclkd

Quote:


> Originally Posted by *rt123*
> 
> If you aren't seeking answers or solutions, then posting again & again about your issue is just going to piss people off.
> 
> Which is what is happening here.


no problem, just sharing some facts based on personal experience. Do you think that's bad?

EDIT: just reviewed my posts; how or where did I nag about the issue to the point that it became irritating?


----------



## rt123

Quote:


> Originally Posted by *acupalypse*
> 
> no problem, just sharing some facts based on personal experience. Do you think that's bad?


It isn't a problem.
But there is no solid evidence & too many variables, to establish what you claim to be a fact.

You said this is your 2nd Lightning. You know how rare it is for someone to receive two back-to-back defective cards. Not that it can't happen, but the chances are really slim.
Maybe you are doing something wrong, or something else in your system is the culprit.


----------



## hwoverclkd

Quote:


> Originally Posted by *rt123*
> 
> It isn't a problem.
> But there is no solid evidence & too many variables, to establish what you claim to be a fact.
> 
> You said this is your 2nd Lightning. You know how rare it is for someone to receive two back-to-back defective cards. Not that it can't happen, but the chances are really slim.
> Maybe you are doing something wrong, or something else in your system is the culprit.


Agreed. It's a fact based only on my experience and on my conversation with MSI support. But I try (maybe failing) to give a hint that it's only applicable to my setup, not necessarily to everyone else. Every reader has a right to take it with a grain of salt and use common sense and practical reasoning. Not everything you get off the internet is true, right?

So maybe it's me... or something in my system. But here's where I'm coming from:
- MSI was too quick to point to a bad GPU when I described it
- Why are other cards doing fine while only the 290X is bad? I could easily buy another, cheaper AMD card (270X) to compare, but the wife is going to kill me
- I already did research and saw the issues even before I bought the 1st 290X... you know, self-inflicted blindness, because I love this badass card.
- it's rare to get a defective card twice in a row, but if it's a design flaw or a compatibility issue to begin with, you can never tell without thorough testing and proper tools.

The only dilemma I have right now is which card to exchange it for, or whether to just get a refund.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *acupalypse*
> 
> Agreed. It's a fact based only on my experience and on my conversation with MSI support. But I try (maybe failing) to give a hint that it's only applicable to my setup, not necessarily to everyone else. Every reader has a right to take it with a grain of salt and use common sense and practical reasoning. Not everything you get off the internet is true, right?
> 
> So maybe it's me... or something in my system. But here's where I'm coming from:
> - MSI was too quick to point to a bad GPU when I described it
> - Why are other cards doing fine while only the 290X is bad? I could easily buy another, cheaper AMD card (270X) to compare, but the wife is going to kill me
> - I already did research and saw the issues even before I bought the 1st 290X... you know, self-inflicted blindness, because I love this badass card.
> - it's rare to get a defective card twice in a row, but if it's a design flaw or a compatibility issue to begin with, you can never tell without thorough testing and proper tools.
> 
> The only dilemma I have right now is which card to exchange it for, or whether to just get a refund.


Did you re-install Windows?

If you've had previous graphics drivers, then there are probably bits and pieces left in your OS that can cause problems. No display driver remover gets them all, you have to manually check several hidden folders and registry entries. I can try and find you the page with the entire list if you want.


----------



## rt123

Quote:


> Originally Posted by *acupalypse*
> 
> Agreed. It's a fact based only on my experience and on my conversation with MSI support. But I try (maybe failing) to give a hint that it's only applicable to my setup, not necessarily to everyone else. Every reader has a right to take it with a grain of salt and use common sense and practical reasoning. Not everything you get off the internet is true, right?
> 
> So maybe it's me... or something in my system. But here's where I'm coming from:
> - MSI was too quick to point to a bad GPU when I described it
> - Why are other cards doing fine while only the 290X is bad? I could easily buy another, cheaper AMD card (270X) to compare, but the wife is going to kill me
> - I already did research and saw the issues even before I bought the 1st 290X... you know, self-inflicted blindness, because I love this badass card.
> - it's rare to get a defective card twice in a row, but if it's a design flaw or a compatibility issue to begin with, you can never tell without thorough testing and proper tools.
> 
> The only dilemma I have right now is which card to exchange it for, or whether to just get a refund.


Just get a refund I guess.

For the people complaining about design flaws, there are also people on the other side of the fence who have had barely any issues. So I guess the argument goes both ways.

One of the major issues here is that (general case, not specific to you) AMD cards' memory tends to black-screen when you OC it too much. Nvidia's, on the other hand, just artifacts; it would have been better if AMD's memory artifacted too, so people could understand that it is the OC that is the issue, not the card being defective.

It should be:
artifacts when there's too much overclock,
black screen when actual hardware is at fault.

The last ditch effort would be as *chiknnwatrmln* pointed out, an OS re-install.
But before you try that, download CCleaner & use its Registry Cleaner to clean/delete useless entries from your Registry. _Maybe_ that will work.


----------



## hwoverclkd

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Did you re-install Windows?
> 
> If you've had previous graphics drivers, then there are probably bits and pieces left in your OS that can cause problems. No display driver remover gets them all, you have to manually check several hidden folders and registry entries. I can try and find you the page with the entire list if you want.


Yes, I have 2 extra hard drives; I installed Win7 and 8.1... extra monitor and PSU too (in case someone thinks it could be a PSU problem). Also tried DDU, AMD Cleanup and MS Fixit. None of them fixed it.

I was also planning to attach a debugger to see if it's something to do with software, but I quit after hearing from MSI that it's the GPU.

Thanks for checking. I'd love to see some instructions beyond what an intermediate PC user would do







But pls don't spend more than 5 mins on it...totally fine, i already accepted its fate


----------



## hwoverclkd

Quote:


> Originally Posted by *rt123*
> 
> Just get a refund I guess.
> 
> For the people complaining about design flaws, there are also people on the other side of the fence who have had barely any issues. So I guess the argument goes both ways.
> 
> One of the major issues here is that (general case, not specific to you) AMD cards' memory tends to black-screen when you OC it too much. Nvidia's, on the other hand, just artifacts; it would have been better if AMD's memory artifacted too, so people could understand that it is the OC that is the issue, not the card being defective.
> 
> It should be,
> Artifacts when too much Overclock,
> BlackScreen when an actual HW faults.


Right, hence my comment 'chosen ones' or 'golden chips'

Mine does black-screen, whether OC'd or not. It's been a while since I used AMD, and I agree it's a different animal. The last issue I had with any GPU card was back in CRT times (no LCDs yet): some Hynix batch caused faint horizontal gray lines when matched with a specific mobo chipset. That was due to a compatibility problem.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *acupalypse*
> 
> Yes, I have 2 extra hard drives; I installed Win7 and 8.1... extra monitor and PSU too (in case someone thinks it could be a PSU problem). Also tried DDU, AMD Cleanup and MS Fixit. None of them fixed it.
> 
> I was also planning to attach a debugger to see if it's something to do with software, but I quit after hearing from MSI that it's the GPU.
> 
> Thanks for checking. I'd love to see some instructions beyond what an intermediate PC user would do
> 
> 
> 
> 
> 
> 
> 
> But pls don't spend more than 5 mins on it...totally fine, i already accepted its fate


Judging by the sound of it you got two bum cards in a row. Has happened to me with 670's. Nothing you can do but RMA and hope for the best.

http://www.overclock.net/t/988215/how-to-remove-your-amd-ati-gpu-drivers

This most likely won't help your current card, but driver uninstallers (even AMD's) always leave scraps behind, this will ensure you get everything. I never understood why both AMD and Nvidia can't make their driver uninstallers actually uninstall their drivers. I mean, they made them, surely they know everywhere the drivers store files, no? Just makes it a hassle to uninstall/change drivers.

If you're not familiar with regedit it might be a good idea to back up your registry before you start. I recommend you don't use any registry cleaners, however if you use CCleaner's cleaner (but not registry cleaner) you will be fine.
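The registry-backup step can be scripted ahead of time. `reg export` is the built-in Windows command for this; the exact keys below are just examples of where display-driver entries tend to live, so adjust them to the list in the linked guide:

```python
# Build `reg export` commands to back up registry keys before manually
# cleaning driver leftovers. The key paths are illustrative examples only.
BACKUP_KEYS = [
    r"HKLM\SYSTEM\CurrentControlSet\Control\Class",  # device class entries
    r"HKLM\SOFTWARE\AMD",                            # example vendor key
]

def export_cmds(out_prefix="regbackup"):
    """One `reg export ... /y` command per key (/y overwrites existing files)."""
    return [
        ["reg", "export", key, f"{out_prefix}_{i}.reg", "/y"]
        for i, key in enumerate(BACKUP_KEYS)
    ]

for cmd in export_cmds():
    print(" ".join(cmd))  # review, then run them yourself in an elevated prompt
```

Restoring is then just double-clicking the saved `.reg` files if a cleanup goes wrong.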


----------



## DeadlyDNA

What about underclocking the card, for testing's sake? Not sure what they set the clocks to at shipping, but try dropping them some. Both core clock and memory, or one at a time.


----------



## Norse

Should prolly be fully in this club now

Crossfire Sapphire 290X's
Ignore the bad quality!


----------



## Sempre

Quote:


> Quote:
> 
> 
> 
> Originally Posted by *Loktar Ogar*
> 
> Follow up question: What is the average OC of R9 290 anyway? CC 1150 or CC 1200?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *battleaxe*
> 
> Average would be much closer to 1150 or even 1100 than 1200. Not too many can do 1200. Very few can do 1250 and only a handful reach 1300.
> 

What about water-cooled 290s?


----------



## hwoverclkd

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Judging by the sound of it you got two bum cards in a row. Has happened to me with 670's. Nothing you can do but RMA and hope for the best.
> 
> http://www.overclock.net/t/988215/how-to-remove-your-amd-ati-gpu-drivers
> 
> This most likely won't help your current card, but driver uninstallers (even AMD's) always leave scraps behind, this will ensure you get everything. I never understood why both AMD and Nvidia can't make their driver uninstallers actually uninstall their drivers. I mean, they made them, surely they know everywhere the drivers store files, no? Just makes it a hassle to uninstall/change drivers.
> 
> If you're not familiar with regedit it might be a good idea to back up your registry before you start. I recommend you don't use any registry cleaners, however if you use CCleaner's cleaner (but not registry cleaner) you will be fine.


this is why i love hanging around here, there are people who'll pull you back when you're about to jump off the cliff









I installed Windows and AMD drivers only on one of my spare drives; unfortunately the story went the same way.
Quote:


> Originally Posted by *DeadlyDNA*
> 
> What about underclocking the card, for testing's sake? Not sure what they set the clocks to at shipping, but try dropping them some. Both core clock and memory, or one at a time.


Now this is something I haven't done yet. If it won't black-screen after lowering the clocks, I'm going to go straight to RMA. Makes no sense using a GPU that only operates 'properly' below the advertised stock settings.


----------



## DeadlyDNA

Quote:


> Originally Posted by *acupalypse*
> 
> this is why i love hanging around here, there are people who'll pull you back when you're about to jump off the cliff
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I installed Windows and AMD drivers only on one of my spare drives; unfortunately the story went the same way.
> Now this is something I haven't done yet. If it won't black-screen after lowering the clocks, I'm going to go straight to RMA. Makes no sense using a GPU that only operates 'properly' below the advertised stock settings.


yes, if it runs on lower clocks that gives you something to go with. If not, well, no harm done; keep looking


----------



## battleaxe

Quote:


> Originally Posted by *acupalypse*
> 
> this is why i love hanging around here, there are people who'll pull you back when you're about to jump off the cliff
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I installed Windows and AMD drivers only on one of my spare drives; unfortunately the story went the same way.
> Now this is something I haven't done yet. If it won't black-screen after lowering the clocks, I'm going to go straight to RMA. Makes no sense using a GPU that only operates 'properly' below the advertised stock settings.


This is what I would put my money on. For some reason your cards cannot run stock clocks on the memory. The only time I've seen black screens is in relation to too high a memory OC. My guess is you've got a dud chip on those things somewhere, and it's what is causing them to not even be able to pull stock mem clocks. Down-clocking should fix it. Or, as you said, RMA, cause who wants a card that can't pull stock clocks? Not I.
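That isolation test (step the clock down until the black screens stop) amounts to a bisection over clock values. `is_stable` here is a hypothetical stand-in for whatever stress run you actually use, say a 30-minute Valley loop at each clock:

```python
# Bisect for the highest clock that passes a stability check, in `step` MHz
# increments. `is_stable` is a placeholder for a real stress-test run.
def max_stable_clock(lo, hi, is_stable, step=25):
    """Highest clock in [lo, hi] (multiples of `step` above lo) that passes."""
    best = None
    while lo <= hi:
        mid = lo + ((hi - lo) // (2 * step)) * step  # midpoint snapped to the step
        if is_stable(mid):
            best = mid        # passed: remember it and search higher
            lo = mid + step
        else:
            hi = mid - step   # failed: search lower
    return best

# Toy stand-in: pretend anything at or below 1250 MHz is stable.
print(max_stable_clock(1000, 1500, lambda mhz: mhz <= 1250))  # -> 1250
```

If even the lowest clock fails (the function returns `None`), that points at the card or the rest of the system rather than the overclock, which is exactly the RMA case above.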


----------



## PJFT808

Quote:


> Originally Posted by *Goride*
> 
> I did use the washer.
> 
> I did not use the thermal pad that it came with. I double layered some 0.5mm Fujipoly pad that I had. I believe it is making decent contact though, because when I touch the heatsink, it burns my finger.
> 
> The problem seems not to be the thermal pads or transferring heat from the VRM to the heatsink. The problem seems to be that the fan on the G10 does not spin fast enough to blow enough air over the heatsink.
> 
> I have tried plugging the fan directly into a power source (rather than the MB), which should be forcing it to spin at max speed, but I still do not feel much air coming off it.
> 
> I can try your suggestion, but if the heatsink is burning hot, it would seem that this is not the issue.


I used the 0.5mm pads and had to layer 4, maybe 5 layers before it made the contact I thought was appropriate. I started at 2 and I could tell the thumbscrews were at the limit. I still wouldn't rule it out completely; if the heatsink is only making half contact, I'm sure it would still get hot to the touch over time. I 100% agree the supplied NZXT fan felt like 15 CFM, or someone breathing on your neck. I used a seasoned Delta (parted from a Dell); it's like 70 CFM, and I rewired it to 7V. I wouldn't doubt that the fan is a main culprit.
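For a rough idea of what the 7V rewire does to fan speed, assuming RPM scales close to linearly with voltage (a common rule of thumb, not a datasheet value; real fans deviate near their minimum start voltage, and the 3000 RPM rating below is just an assumed figure for a high-flow Delta):

```python
# Estimate fan RPM at a reduced voltage, assuming roughly linear
# RPM-vs-voltage behaviour. Rated RPM here is an assumed example value.
def scaled_rpm(rated_rpm, volts, rated_volts=12.0):
    return rated_rpm * volts / rated_volts

print(round(scaled_rpm(3000, 7)))  # assumed 3000 RPM Delta at 7 V -> 1750
```

Even at 7V, a high-flow Delta still moves far more air than the stock G10 fan at full speed, which is the point of the swap.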


----------



## heroxoot

Finally reinstalled BF4 and would ya believe it the performance is great again. 90 - 100fps with MSAA off, 80 - 90 with it on, using Mantle. The voltage increase is indeed working now too. I don't see it reflect my increases on Heaven or Valley but in BF4 it was running 1.23v I think. Seems a bit high though. Running 1100/1250 on 14.4 and it was with a +25mV increase. I upped it to +30mV because sometimes I see a weird white checkered artifacting once in a while on all my games. It is an odd thing I never saw till 14.4 driver and it happens on Mantle and DX11 so I assume it's slight instability. No jetting artifacts like polygons stretching off screen at all. Temps were great even on stock fan (which I still cannot adjust). 85C, 86C for a few seconds on and off. VRM1 67c and VRM 2 52c which never moves. My room is very hot too so these temps are not bad at all. It's probably over 70F in my room right now.

No clock throttle whatsoever during play, and it pegged 100% GPU usage for the majority of the play too. Very happy right now. I'm scared to OC higher though because I cannot set the fan to run 100% at 80C+, and it was only running the fan at 2500RPM, which I think is about 70% fan? Not sure, but 85C is an OK temp from what I have read.


----------



## bak3donh1gh

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I think you missed some words there .
> 
> 
> 
> 
> 
> 
> 
> 
> As far as I'm aware you can flash any kind of 290 / 290X BIOS, just the results will differ. As long as you keep the original BIOS on one of the bios switch positions, you can easily recover from a bad flash
> Flashing is easy. I was hesitant too till I did it, and apart from watercooling my 290s it is one of the best modifications you can do to these red things
> Benchmarking isn't my job, it's my hobby
> Driving trucks is my job


sorry, it was early, but it's only recently that BIOS flashing has become less risky (for anything besides video cards). I don't know about you, but I've run into people/apps that were meant to be helpful (or malicious) but have screwed me over in the short run. Plus I haven't kept up with hardware since the last time I upgraded.


----------



## AMDMAXX

Asus R9 290 Reference card bought at microcenter...

going to eventually add it to my watercooling loop...


----------



## Widde

Quote:


> Originally Posted by *heroxoot*
> 
> Finally reinstalled BF4 and would ya believe it the performance is great again. 90 - 100fps with MSAA off, 80 - 90 with it on, using Mantle. The voltage increase is indeed working now too. I don't see it reflect my increases on Heaven or Valley but in BF4 it was running 1.23v I think. Seems a bit high though. Running 1100/1250 on 14.4 and it was with a +25mV increase. I upped it to +30mV because sometimes I see a weird white checkered artifacting once in a while on all my games. It is an odd thing I never saw till 14.4 driver and it happens on Mantle and DX11 so I assume it's slight instability. No jetting artifacts like polygons stretching off screen at all. Temps were great even on stock fan (which I still cannot adjust). 85C, 86C for a few seconds on and off. VRM1 67c and VRM 2 52c which never moves. My room is very hot too so these temps are not bad at all. It's probably over 70F in my room right now.
> 
> No clock throttle what so ever during play and it pegged 100% GPU Usage majority of the play too. Very happy right now. I'm scared to OC higher though because I cannot set the fan to run 100% at 80c+ and it only was using the fan at 2500RPM and I think this is about 70% fan? Not sure but 85c is a ok temp from what I have read.


Do I need to reinstall my game or OS?







Getting a massive fps drop with Mantle enabled >:| Have reinstalled the drivers 3 times using DDU, on 14.4 WHQL.


----------



## ebhsimon

How's everyone's overclocks going?

I just got a Vapor-X R9 290 last night. Extremely well built card, super heavy, and just sexy. Very large too, with a 2.5-slot cooler. It runs extremely quiet compared to a Twin Frozr 7970; even with the fans turned up to 100% they don't sound high pitched, you just hear the air. Temperatures are good, VRM temps are good, and the core idles at ~50C with IFC on (it only runs one of the three fans when the temp is under 60C).

I could only get to 1135 core and 1580 mem before I started to get artifacts (pushing +100mV and +50% power limit, roughly 330W according to GPU-Z). I could push the memory up to 1675, but there was no point since there were artifacts. I haven't black-screened it yet either.

I feel like the memory overclocks well, but I'd much rather have the core overclock to at least 1175.

This card is 70% ASIC quality, while my 60% ASIC 7970 got to a 1175 core clock. I guess ASIC doesn't tell the whole story, but I would still have liked a card that overclocked higher since I paid such a high premium for it :/


----------



## hwoverclkd

Quote:


> Originally Posted by *ebhsimon*
> 
> How's everyone's overclocks going?
> 
> I just got a Vapor-X R9 290 last night. Extremely well built card, super heavy and is just sexy. Very large too - 2.5 slot cooler. Runs extremely quiet compared to a twin frozr 7970 and even when fans turned up to 100% they don't sound high pitched but rather you just hear the air, temperatures are good, vrm temps are good, core idles at ~50C with IFC on (only runs one out of the three fans when temp is under 60C).
> 
> I could only get to 1135core and 1580mem before I started to get artifacts (pushing +100mV and 50% powerlimit, roughly 330W according to GPU-Z). I could push the memory up to 1675, but there was no point since there were artifacts. I haven't black screened it yet either.
> 
> I feel like the memory overclocks well, but I'd much rather my core to overclock to at least 1175..
> 
> This card is 70% ASIC quality, while my 60% ASIC 7970 got to 1175 core clock. I guess ASIC doesn't tell the whole story then, but still I would have liked a card that could have overclocked higher since I paid such a high premium for it :/


OC version? i.e. 1080 MHz / 1410 MHz stock clocks?


----------



## ebhsimon

Quote:


> Originally Posted by *acupalypse*
> 
> OC version? i.e. 1080mhz / 1410mhz stock clocks?


It's stock clocked at 1030core, 1400mem.

Some cards get higher factory overclocks than mine... And the Vapor-X is the most expensive 290 card.

EDIT: Scratch that, I was thinking of Nvidia cards; they clock higher but have less performance per clock.


----------



## hwoverclkd

Quote:


> Originally Posted by *ebhsimon*
> 
> It's stock clocked at 1030core, 1400mem.
> 
> Some cards get higher stock factory overclocks than I have... And the Vapor-X is the most expensive 290 card.
> 
> EDIT: Scratch that, I was thinking about Nvidia cards, they clock higher but have less performance per clock.


oh ok, i thought you got this newer version:

http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&pid=2283&psn=&lid=1&leg=0

I'm waiting for amazon to have it in stock


----------



## rdr09

Quote:


> Originally Posted by *ebhsimon*
> 
> How's everyone's overclocks going?
> 
> I just got a Vapor-X R9 290 last night. Extremely well built card, super heavy and is just sexy. Very large too - 2.5 slot cooler. Runs extremely quiet compared to a twin frozr 7970 and even when fans turned up to 100% they don't sound high pitched but rather you just hear the air, temperatures are good, vrm temps are good, core idles at ~50C with IFC on (only runs one out of the three fans when temp is under 60C).
> 
> I could only get to 1135core and 1580mem before I started to get artifacts (pushing +100mV and 50% powerlimit, roughly 330W according to GPU-Z). I could push the memory up to 1675, but there was no point since there were artifacts. I haven't black screened it yet either.
> 
> I feel like the memory overclocks well, but I'd much rather my core to overclock to at least 1175..
> 
> This card is 70% ASIC quality, while my 60% ASIC 7970 got to 1175 core clock. I guess ASIC doesn't tell the whole story then, but still I would have liked a card that could have overclocked higher since I paid such a high premium for it :/


you oc'ed the core and memory at the same time during the test? try oc'ing your core first with the mem at stock.


----------



## ebhsimon

Quote:


> Originally Posted by *rdr09*
> 
> you oc'ed the core and memory at same time during test? try oc'ing your core first with the mem at stock.


Does this actually work? I upped my core clock first until it started artifacting, then I dialed it back until they stopped. Then I started with the memory.
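For what it's worth, the "step up until artifacts, then back off" routine described above is just a linear search. A minimal sketch of it, where `is_stable` is a hypothetical stand-in for a real artifact test (e.g. a benchmark loop you watch for glitches), not any real tool's API:

```python
def find_max_stable(start_mhz, step_mhz, limit_mhz, is_stable):
    """Raise the clock in steps until instability, then settle on the last
    passing step. `start_mhz` is assumed to be a known-stable clock.
    `is_stable` takes a clock in MHz and returns True/False."""
    clock = start_mhz
    while clock + step_mhz <= limit_mhz and is_stable(clock + step_mhz):
        clock += step_mhz
    return clock

# Toy example: pretend the card artifacts above 1135 MHz.
core = find_max_stable(1030, 5, 1300, lambda mhz: mhz <= 1135)
print(core)  # 1135
```

The same routine works for the memory pass afterwards; only the callback and the clock range change.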


----------



## rdr09

Quote:


> Originally Posted by *ebhsimon*
> 
> Does this actually work? I upped my core clock first until it started artifacting, then I dialed it back until they stopped. Then I started with the memory.


guess you did it right the first time. i was going to suggest 1150/1500. you are prolly using 14.x drivers.


----------



## ebhsimon

Quote:


> Originally Posted by *rdr09*
> 
> guess you did it right the first time. i was going to suggest 1150/1500. you are prolly using 14 drivers.


I'm using 13.251


----------



## rdr09

Quote:


> Originally Posted by *ebhsimon*
> 
> I'm using 13.251


best for benching.









13.11 in one HDD and 14.3 for mantle on the other.


----------



## ebhsimon

Quote:


> Originally Posted by *rdr09*
> 
> best for benching.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 13.11 in one HDD and 14.3 for mantle on the other.


Thanks I'll keep note of this for future reference +1 rep.

However, I'm not interested in benchmarks but stable overclocks which I can set and forget.


----------



## rdr09

Quote:


> Originally Posted by *ebhsimon*
> 
> Thanks I'll keep note of this for future reference +1 rep.
> 
> However, I'm not interested in benchmarks but stable overclocks which I can set and forget.


can't help you there 'cause my 24/7 card is at stock. if i were to oc for daily use . . . a +100 offset would prolly be my highest, but it also depends on the temp. i suggest keeping your temps below 80C. i would also not recommend leaving the fan on auto when oc'ed. a manual curve or a tolerable fixed % would be ideal, like 55 maybe, so long as your temps are in check.

edit: wait, you got a Vapor-X? then forget about the fan curve, just make sure your temps are good at load. i test my load temps using the GPU-Z render test and BF4 MP.
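A manual curve like the one suggested above is just a piecewise-linear map from temperature to fan speed. A minimal sketch; the curve points are invented for illustration and are not recommended values or how Afterburner actually stores a curve:

```python
def fan_pct(temp_c, curve):
    """Linear interpolation over sorted (temp C, fan %) points,
    clamped below the first point and above the last."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, p0), (t1, p1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]

# Made-up curve: gentle below 70C, steeper approaching the throttle point.
curve = [(40, 30), (70, 55), (85, 80), (95, 100)]
print(fan_pct(77, curve))  # between 55 and 80
```

The useful property versus a fixed % is that noise only ramps when temps actually climb.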


----------



## ebhsimon

Quote:


> Originally Posted by *rdr09*
> 
> can't help you there 'cause my 7/24 is at stock. if i were to oc for daily use . . . +100 offset would prolly be my highest but it also depends on the temp. i suggest keeping your temps below 80C. i would also not recommend leaving the fan in auto when oc'ed. a manual curve or a tolerable % would be ideal like 55 maybe. so long as your temps are in check.
> 
> edit: wait, you got a vaporX? then forget about the fan curve just make sure your temps are good when at load. i test my load temps using GPUZ render test and BF4 MP.


yep my temps get to 77C core and 72C VRM1 max for this overclock (1135/1580). I just keep fans on auto, and they only get to 37% I think, but once I pass 80 on the core it starts to ramp up way more aggressively - 80C is like 50% fan speed already. My noisiest part when under load is probably my PSU (Xigmatek Vector G 650W) since it's only a 650W and having a GPU draw out almost 350W (according to GPU-Z) along with other shiz probably puts it at around 70% load.
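That ~70% figure checks out roughly. A quick sketch of the arithmetic: the GPU draw is the GPU-Z reading quoted above, but the rest-of-system wattage is a guess, not a number from the post:

```python
psu_w = 650      # Xigmatek Vector G capacity
gpu_w = 350      # GPU-Z reading quoted above
rest_w = 105     # assumed CPU + drives + fans (not from the post)

load_pct = 100 * (gpu_w + rest_w) / psu_w
print(f"{load_pct:.0f}% load")  # 70% load
```

Running a PSU near 70% sustained is within spec, but it is also the region where many units ramp their fans, which fits the noise complaint.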


----------



## rdr09

Quote:


> Originally Posted by *ebhsimon*
> 
> yep my temps get to 77C core and 72C VRM1 max for this overclock (1135/1580). I just keep fans on auto, and they only get to 37% I think, but once I pass 80 on the core it starts to ramp up way more aggressively - 80C is like 50% fan speed already. My noisiest part when under load is probably my PSU (Xigmatek Vector G 650W) since it's only a 650W and having a GPU draw out almost 350W (according to GPU-Z) along with other shiz probably puts it at around 70% load.


don't let shilka catch you with that Xig. I'd keep the gpu at stock for now. for most of us . . . it's the cpu that needs oc'ing.

edit: at 1100 it is already faster than a reference 290X.


----------



## blue1512

Quote:


> Originally Posted by *ebhsimon*
> 
> yep my temps get to 77C core and 72C VRM1 max for this overclock (1135/1580). I just keep fans on auto, and they only get to 37% I think, but once I pass 80 on the core it starts to ramp up way more aggressively - 80C is like 50% fan speed already. My noisiest part when under load is probably my PSU (Xigmatek Vector G 650W) since it's only a 650W and having a GPU draw out almost 350W (according to GPU-Z) along with other shiz probably puts it at around 70% load.


It seems that the memory timing for the Vapor-X is the same as the Tri-X: strictly 1250 MHz or 1500 MHz, or you will lose performance.
You should try again with the memory starting at 1250 or 1500. The artifacts you saw were mostly caused by the bad timings at the default mem clock of 1300.
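If the strap claim above holds (it is unverified here, and the next post disputes the stated default clock), snapping a target memory clock to the nearest claimed strap point is trivial:

```python
def snap_to_strap(target_mhz, straps=(1250, 1500)):
    """Pick the strap closest to the requested memory clock.
    The 1250/1500 MHz points come from the claim quoted above;
    they are not confirmed values for this card."""
    return min(straps, key=lambda s: abs(s - target_mhz))

print(snap_to_strap(1580))  # 1500
```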


----------



## ebhsimon

Quote:


> Originally Posted by *rdr09*
> 
> don't let shilka catch you with that Xig.


Yolo.
Quote:


> Originally Posted by *blue1512*
> 
> It seems that the memory timing for vapor X is the same as triX, strictly at 1250 MHz or 1500 MHz or you will lose performance.
> You should try again with memory starting at 1250 or 1500. The artifact you saw mostly caused by the bad timing of the default mem clock 1300.


Default is 1400 MHz for the memory clock. I've already got it up to 1580, but you're saying chuck it to 1250 or 1500 and then test out core overclocks?


----------



## Roboyto

Quote:


> Originally Posted by *rdr09*
> 
> you oc'ed the core and memory at same time during test? try oc'ing your core first with the mem at stock.





> Originally Posted by *ebhsimon*
> 
> Does this actually work? I upped my core clock first until it started artifacting, then I dialed it back until they stopped. Then I started with the memory.


Yes absolutely. Core first, RAM second.

If you've already got 1400 stock then I would push the core as far as possible, as the RAM really plays a small role in performance with the very large bus on these cards.

See the "290(X) Need to Know" in my sig for more information.

Quote:


> Originally Posted by *blue1512*
> 
> It seems that the memory timing for vapor X is the same as triX, strictly at 1250 MHz or 1500 MHz or you will lose performance.
> You should try again with memory starting at 1250 or 1500. *The artifact you saw mostly caused by the bad timing of the default mem clock 1300.*


I've never gotten artifacts from a memory clock, especially at stock settings. Pushing RAM too far has always resulted in blackscreen/lockups for me. Artifacts are practically guaranteed to be from too high of a core clock or not enough voltage for a given core clock. If you have no more voltage to apply, dial down the core clock a little and the artifacts should go away.



> Originally Posted by *rdr09*
> 
> don't let shilka catch you with that Xig. I'd keep the gpu at stock for now. for most of us . . . it's the cpu that needs oc'ing


You should consider a higher quality PSU with one of these beastly cards, as @rdr09 is hinting at; poor power could be contributing to your issues.

My little Rosewill Capstone 450W modular is powering


3770k @ 4GHz 1.2V
R9 290 1075/1375 +37mV
8GB RAM 2400MHz 1.65V
2 AIOs and 3 fans
1 SSD, 2 HDD, and a BD-RW

It does so without making a peep, even after my wife has been on Tomb Raider for 4-5 hours straight. To hear anything happening in the Corsair 250D all of the aforementioned is housed in, I have to put my ear up next to it. With my face against the case, the Antec 620 pumps are the loudest components, and they're not loud by any standard.

I'm using the 650W Capstone in my main rig as well. Solid, silent PSUs, made by Super Flower, for a very reasonable price, with a fantastic warranty.


----------



## Arizonian

Quote:


> Originally Posted by *Norse*
> 
> Should prolly be fully in this club now
> 
> Crossfire Sapphire 290X's
> Ignore the bad quality!
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *AMDMAXX*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Asus R9 290 Reference card bought at microcenter...
> 
> going to eventually add it to my watercooling loop...


Congrats - added


----------



## ace1ndahole

Add me to the R9 290 Club









My card is the Sapphire R9 290 Vapor-X

Definitely not reference cooler, it has Tri-x cooler with vapor chamber.


----------



## Arizonian

Quote:


> Originally Posted by *ace1ndahole*
> 
> Add me to the R9 290 Club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My card is the Sapphire R9 290 Vapor-X
> 
> Definitely not reference cooler, it has Tri-x cooler with vapor chamber.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Like to know your gaming temps on the Vapor X.


----------



## ace1ndahole

I think my gaming temps are pretty impressive since my NZXT H630 doesn't have much airflow at all haha. I get around 70-74C with a slight overclock to 1100 core and 1500 memory. One time it did get to 79C, but that was because I had pulled the modular cable for the two 200mm exhaust fans on top of the case and the exhaust fan sucking away the dispersed heat at the bottom. But this thing is impressive, because my old HIS 7850 2GB got up to 76C during long hours of gaming.


----------



## ebhsimon

Yo add me too.


Yes, I removed all the expansion slot covers, as per my thread http://www.overclock.net/t/1460812/a-little-test-removing-pci-slot-covers-does-make-your-gpu-cooler-pic-heavy showing it does allow your GPU to run cooler.

And @ace1ndahole, damn son that's some nice value ram you got there. First build I've seen to use it.

*Temperature runs: Running Kombustor.
Stock: 1030 core, 1400 mem.
*25.0C at the intake.

*10 minutes in, core maxes out @ 74C, VRM1 @ 75C. Fans on auto reaching 40%*

Then applying overclock right after running 10 minutes of kombuster on stock:
*Overclock 1135 core, 1580 mem, +100mV, +50% power limit*
25.0C at the intake.

*10 minutes in, core maxes out @ 79C, VRM1 @ 99C. Fans on auto reaching 46%*

Now I have to crack open a window because of all the heat that was just dumped into my room...


----------



## ace1ndahole

HAHAHA I was waiting for someone to comment on that. Yeah, I figured my mobo and cpu are pretty aged, so I think the value ram looks beautiful haha.

Truth is I'm a broke college student and my broke ass is too cheap to shell out more dough







Maybe someone will be kind enough when they see my ram and donate some nice ones my way.....maybe...here's hoping


----------



## Mega Man

Quote:


> Originally Posted by *Matt-Matt*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HOMECINEMA-PC*
> 
> No that powercolor is not ref one . Refernce cards have loud single fan cooler and are the easiest ones to waterblock and see the temps drop 50% . XFX here in OZ do unlock to 290x ive got one
> 
> 
> 
> 
> 
> 
> 
> . Powercolor ref not so much
> 
> 
> 
> For reference it's just the same chance as everyone else though.. Mine didn't, mind you I was a bit late to the party too (my card is V1.1).

to my understanding this is not true, as they have started to laser-cut the die, so you physically cannot unlock them
Quote:


> Originally Posted by *acupalypse*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> lol. you can't even beat a 290 with your lightning. how can you even compare to the Ti?
> 
> the issue is in front of the monitor. you can't even get your cpu oc stable.
> 
> 
> 
> man you can say what you want. I have my own results. All you nitpick is the single unstable out of >10 results...you know how stats work right? I can tell you're an AMD fanboy
> 
> 
> 
> 
> 
> 
> 
> I'm just here to see which card best suits me. I have both cards so I can say which one gives better gaming results. If your card says otherwise, that's good for you. I'm not here to compete

sorry i think he still has you here..
Quote:


> Originally Posted by *acupalypse*
> 
> - MSI was too quick to point to bad gpu when i described it


this is not meant to be personal, so don't take it like that.

but whenever people say this... i laugh. 99% of cust service / tech support are more tech-illiterate than you. only once you speak to an upper-level supervisor do they really begin to be competent

most of them just have a flow chart to use. my favorite is when you list everything you already did, and they still ask "did you restart your pc, did you try it in another slot, another pc"... god it makes me want to smack them through the phone.....
Quote:


> Originally Posted by *ebhsimon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> don't let shilka catch you with that Xig.
> 
> 
> 
> Yolo.*my pc will only live once*

fixed it for you !~


----------



## FuriousPop

hey, can anyone recommend a block for my Sapphire R9 290 Tri-X OC??

I am looking at a water cooling kit, but wanted to check whether to get the w/IHS or no-IHS block...

the kit i am currently looking at is the XSPC RayStorm D5 Photon RX360 V3 Water Cooling Kit. i already have 3x Sapphire R9 290 Tri-X OCs and don't think i'm gonna go for a 4th, so i'll stick with the 3x for as long as possible...

any suggestions?

I am currently reading the Tom's Hardware water cooling sticky to get a better understanding...


----------



## bond32

You would need additional rad space with that 360, likely at least another 240. But that kit is top notch; XSPC makes great products and the RX series radiator is their best. The D5 pump is arguably the best pump you can select. Check out Martin's testing to see some heat numbers... you might actually need way more rad space than that.
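That "way more rad space" hunch can be sanity-checked against the common ~100 W per 120 mm of radiator rule of thumb for quiet fan speeds. This is a rough community heuristic, and every wattage below is a ballpark assumption, not a measurement:

```python
gpu_heat_w, n_gpus = 300, 3   # three overclocked 290s, rough per-card heat
cpu_heat_w = 150              # assumed overclocked CPU in the loop
rad_mm = 360                  # the kit's single RX360
w_per_120mm = 100             # rough rule of thumb at quiet fan speeds

heat_w = n_gpus * gpu_heat_w + cpu_heat_w
capacity_w = rad_mm // 120 * w_per_120mm
print(heat_w, "W of heat vs", capacity_w, "W of quiet-speed capacity")
```

On those assumptions the single 360 covers under a third of the load, which is why a second (or third) radiator keeps coming up.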


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *bak3donh1gh*
> 
> sorry it was early but it only recently that bios flashing had become less risky(for anything besides video cards) I dont know about you but i've run into people/apps that have meant to be helpful(or malicious) but have screwed me over in the short run. Plus I haven't kept up with hardware since the last time I upgraded.


I'm disturbed to hear your tale of woe and despair. I'm here to steer you in the right direction, ok









Quote:


> Originally Posted by *ebhsimon*
> 
> How's everyone's overclocks going?
> 
> I just got a Vapor-X R9 290 last night. Extremely well built card, super heavy and is just sexy. Very large too - 2.5 slot cooler. Runs extremely quiet compared to a twin frozr 7970 and even when fans turned up to 100% they don't sound high pitched but rather you just hear the air, temperatures are good, vrm temps are good, core idles at ~50C with IFC on (only runs one out of the three fans when temp is under 60C).
> 
> I could only get to 1135core and 1580mem before I started to get artifacts (pushing +100mV and 50% powerlimit, roughly 330W according to GPU-Z). I could push the memory up to 1675, but there was no point since there were artifacts. I haven't black screened it yet either.
> 
> I feel like the memory overclocks well, but I'd much rather my core to overclock to at least 1175..
> 
> This card is 70% ASIC quality, while my 60% ASIC 7970 got to 1175 core clock. I guess ASIC doesn't tell the whole story then, but still I would have liked a card that could have overclocked higher since I paid such a high premium for it :/
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> you oc'ed the core and memory at same time during test? try oc'ing your core first with the mem at stock.

I reckon take 100 MHz off the mem clock, to 1485


----------



## ebhsimon

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I rekon take 100mhz of the mem clock to 1485


Why?

I don't apply my overclock at startup, so 80% of the time it's at stock 1030/1400. When I put on a game I kick on the overclock 1135/1580.


----------



## heroxoot

Quote:


> Originally Posted by *Widde*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Finally reinstalled BF4 and would ya believe it the performance is great again. 90 - 100fps with MSAA off, 80 - 90 with it on, using Mantle. The voltage increase is indeed working now too. I don't see it reflect my increases on Heaven or Valley but in BF4 it was running 1.23v I think. Seems a bit high though. Running 1100/1250 on 14.4 and it was with a +25mV increase. I upped it to +30mV because sometimes I see a weird white checkered artifacting once in a while on all my games. It is an odd thing I never saw till 14.4 driver and it happens on Mantle and DX11 so I assume it's slight instability. No jetting artifacts like polygons stretching off screen at all. Temps were great even on stock fan (which I still cannot adjust). 85C, 86C for a few seconds on and off. VRM1 67c and VRM 2 52c which never moves. My room is very hot too so these temps are not bad at all. It's probably over 70F in my room right now.
> 
> No clock throttle what so ever during play and it pegged 100% GPU Usage majority of the play too. Very happy right now. I'm scared to OC higher though because I cannot set the fan to run 100% at 80c+ and it only was using the fan at 2500RPM and I think this is about 70% fan? Not sure but 85c is a ok temp from what I have read.
> 
> 
> 
> Do I need to reinstall my game or OS?
> 
> 
> 
> 
> 
> 
> 
> Getting a massive fps drop with mantle enabled >:| Have reinstalled the drivers 3 times using DDU using 14.4whql.

I'd try deleting the Mantle cache first; I didn't learn about it till later. I did get some FPS drops, but not many. The GPU load just drops now and then at random. It isn't unheard of for your GPU to drop load during a game; the question is how often. Dropping load hard more than once in a while is obviously bad.

The maps I played during my testing were Zavod 311 and Operation Locker.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *ebhsimon*
> 
> Why?
> I don't apply my overclock at startup, so 80% of the time it's at stock 1030/1400. When I put on a game I kick on the overclock 1135/1580.


You were having black screen probs with too high a mem o/clock, weren't you??
That's why I chimed in and added my opinion.
And anyway, mem clocks don't really do that much with the 512-bit bus, so I'd be upping the core more than the mem.


----------



## Dasboogieman

Quote:


> Originally Posted by *cennis*
> 
> 
> I punched holes into the intel mounting bracket of the h90 to match the gpu, and used screw/washers to hold it down.
> I don't think my mounting is going to be worse compared to a G10, essentially the same thing, without a fan bracket.
> *Some pics and results of my asus dcuii h90 mount:
> 
> *
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Dasboogieman, what kind of loads did you run? maybe loop valley max setting or GT4 of 3dmark11?
> I am using the stock thermal paste with brand new h90s,
> mount seems to be solid but at max h90 fan speeds the temps stabilize around 58~60c 1200mhz
> 
> Fallacy, that is much more rad space as its 360mm + 280mm while my two h90s combined is 140+140 so it seems reasonable that It wouldn't maintain 50c.
> Also if 360mm +280mm will reach 60c with 3 cards then having 3x140 (3 h90s) doing the same is not bad.


Hmmm, I'm using EVGA OC Scanner, which is equivalent to Furmark. If it matters, I'm using Phobya HeGrease Extreme, but to be honest the contact between the H90 and the GPU die is so good that the difference between the crappy normal Phobya version (that I used initially) and the uber-expensive Nanogrease (that I remounted with) is about 1 degree at the most.
Actually, the G10 can apply a surprising amount of force via the really stiff foam in the centre; the PCB actually warps if there's too much force. Either way, the difference between you and me is about 3-5 degrees at the most, and that's well within statistical variance.


----------



## ebhsimon

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> You were having black screen probs with too high mem o/clock weren't you ??
> That's why chimed in and added my opinion
> and any way mem clocks don't really do that much with the 512bit bus anyways so id be upping the core more than mem


You might have the wrong person...

My memory overclocked pretty high but no black screen. Just artifacting.
1135 core is the highest I can overclock before artifacts no matter what clock speed the memory is running at.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *ebhsimon*
> 
> You might have the wrong person...
> 
> My memory overclocked pretty high but no black screen. Just artifacting.
> 1135 core is the highest I can overclock before artifacts no matter what clock speed the memory is running at.


Sorry mate that was for Blue1512


----------



## sena

Guys, I notice some spikes over 1.4V when I push the voltage. Is this harmful?


----------



## hwoverclkd

Quote:


> Originally Posted by *sena*
> 
> Guys i notice some spikes over 1.4V when i push voltage, is this harmful?


it depends on what card you have...


----------



## sena

Quote:


> Originally Posted by *acupalypse*
> 
> it depends on what card you have...


Asus stock one, but i will install full cover waterblock in next couple of days.


----------



## Zamoldac

Count me in!








R9 290 (BBA/Press Sample)/ Aquacomputer kryographics R9 290 black edition.


----------



## Gestler

Hello guys, I have an R9 290X, a Sapphire reference design. I'm not really an AMD overclocker; what I need is to make this stock cooler a bit quieter, so I need to set a lower 3D GPU voltage. In 3D, if I use MSI Afterburner, I can run the 3D Vcore 0.1V lower and it is still 100% stable! Temperatures and noise are awesome, but after I go back to 2D it becomes unstable, because the voltage is an offset. Is there some custom BIOS, or a BIOS mod like the ones possible for Nvidia GTX 700 cards? Thx in advance







Forgive me bad English.


----------



## heroxoot

Quote:


> Originally Posted by *Gestler*
> 
> Hello guys, i have R9 290X, Sapphire a reference desing. Im not realy AMD overcloker,what i need is make this noob cooler a bit quiet, i need set a lower 3D GPU voltage. In 3D if i use MSI Afterburen, i can put 3D Vcore 0,1V lower and it is still 100% stable! temperatures and noise are awesome, but after i go back to 2D....it is unstable, because voltage is offset....is here some custom bios, or bios mod like is posible for Nvidia GTX700 cards? Thx in advance
> 
> 
> 
> 
> 
> 
> 
> Forgive me bad English.


There is no way to make a junk stock cooler quieter. You should have bought a card with a custom cooler. Your best bet is to buy a 3rd-party cooler now and slap it on.


----------



## Gestler

Hmmm, as I wrote, the reference cooler can be quiet; I only need to lower the 3D Vcore. I really don't want to buy a 50-euro or more expensive cooler.


----------



## heroxoot

Quote:


> Originally Posted by *Gestler*
> 
> Hmmm, how i wrote....reference cooler can be quiet, i need only lower 3D Vcore...i realy dont want buy 50euro or more expensive cooler.


Because lowering the voltage is not going to make a fan quieter. I lower the voltage on my card to 1V during 2D idle clocks and it doesn't even run cooler. You are not going to get a quieter fan unless you slow the fan down or buy a cooler that isn't bad.


----------



## Sgt Bilko

Quote:


> Originally Posted by *heroxoot*
> 
> Because lowering the voltage is not going to make a fan more quiet. I lower the voltage on my card to 1v during 2D idle clocks and it does not make it even run cooler. You are not going to get a quieter fan unless you slow down the speed or buy one that isn't bad.


The stock cooler isn't that bad really. Just leave the fan speed under 50-60% and you're good to go.
Quote:


> Originally Posted by *Gestler*
> 
> Hmmm, how i wrote....reference cooler can be quiet, i need only lower 3D Vcore...i realy dont want buy 50euro or more expensive cooler.


Lowering the voltage would lower the heat, which in turn would lower the fan speed. Otherwise, set a custom fan curve in Afterburner to your liking and go from there.

It will only throttle at 95c on the core and the Ref design has excellent VRM cooling so that's not an issue either.
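For what it's worth, the "lower voltage, lower heat" point matches the first-order dynamic power model, where switching power scales with f·V². A rough sketch using the 1030 to 1135 MHz, +100 mV overclock discussed earlier in the thread; the stock voltage is an assumed round number, not a measured value from any card:

```python
# First-order dynamic power model: P scales with f * V^2 (switching power
# only; leakage and board power are ignored).
f_stock, v_stock = 1030, 1.18   # stock clock (MHz), assumed stock Vcore (V)
f_oc, v_oc = 1135, 1.28         # OC clock, assumed stock voltage + 100 mV

scale = (f_oc / f_stock) * (v_oc / v_stock) ** 2
print(f"~{(scale - 1) * 100:.0f}% more dynamic power")
```

Note this models load power; at idle, fixed leakage and board power dominate, which is consistent with the observation that undervolting barely changes idle temps.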


----------



## heroxoot

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Because lowering the voltage is not going to make a fan more quiet. I lower the voltage on my card to 1v during 2D idle clocks and it does not make it even run cooler. You are not going to get a quieter fan unless you slow down the speed or buy one that isn't bad.
> 
> 
> 
> The stock cooler isn't that bad really, Just leave the fan speed under 50-60% are you're good to go.
> Quote:
> 
> 
> 
> Originally Posted by *Gestler*
> 
> Hmmm, how i wrote....reference cooler can be quiet, i need only lower 3D Vcore...i realy dont want buy 50euro or more expensive cooler.
> 
> 
> Lowering voltage would lower heat which in turn would lower the fan speed. otherwise set a custom fan curve in Afterburner to your liking and go from there.
> 
> It will only throttle at 95c on the core and the Ref design has excellent VRM cooling so that's not an issue either.

I doubt lowering the voltage will lower the heat. Again, I have my voltage lowered by a lot, and the clocks as well, and it doesn't idle any colder. The card would normally idle at 350/1250, but I have it at 515/625 for 2D with multiple monitors, and it idles no cooler with the memory set way down.


----------



## kizwan

Quote:


> Originally Posted by *heroxoot*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Because lowering the voltage is not going to make a fan more quiet. I lower the voltage on my card to 1v during 2D idle clocks and it does not make it even run cooler. You are not going to get a quieter fan unless you slow down the speed or buy one that isn't bad.
> 
> 
> 
> The stock cooler isn't that bad really, Just leave the fan speed under 50-60% are you're good to go.
> Quote:
> 
> 
> 
> Originally Posted by *Gestler*
> 
> Hmmm, how i wrote....reference cooler can be quiet, i need only lower 3D Vcore...i realy dont want buy 50euro or more expensive cooler.
> 
> 
> Lowering voltage would lower heat which in turn would lower the fan speed. otherwise set a custom fan curve in Afterburner to your liking and go from there.
> 
> It will only throttle at 95c on the core and the Ref design has excellent VRM cooling so that's not an issue either.
> 
> 
> 
> 
> 
> I doubt lowering the voltage will lower the heat. Again I have my voltage lowered by a lot and the clocks as well and it doesn't idle any colder. The card normally would idle 350/1250 but I have it at 515/625 for 2D with multi monitors and it idles no cooler with the memory set way down.

You misunderstand, I think. Gestler is trying to achieve a lower temp in 3D mode, i.e. when under load, not at idle. Gestler did achieve the objective (good temps & less fan noise) by lowering the voltage. However, the problem is that when the card returns to 2D mode, i.e. idle, it becomes unstable. So Gestler is looking for a modified BIOS with a fixed voltage, or a way to set a fixed voltage instead of an offset.

@Gestler, I think with the ASUS PT1 VBIOS you can set an absolute voltage instead of an offset using the GPU Tweak app. Check it out. madman (@HOMECINEMA-PC) can help you, I think.

*[EDIT]* I forgot that the PT1 VBIOS doesn't have a 2D mode.
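For reference, the kind of custom fan curve Sgt Bilko mentions (Afterburner-style) is just piecewise-linear interpolation between temperature/speed points. A minimal sketch, with made-up example points rather than recommended values:

```python
# Hypothetical sketch of an Afterburner-style custom fan curve:
# piecewise-linear interpolation between (temp C, fan %) points.
# The points below are illustrative examples, not recommendations.

CURVE = [(40, 20), (60, 35), (75, 50), (85, 70), (95, 100)]

def fan_speed(temp_c):
    """Return fan duty (%) for a core temperature, interpolating the curve."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            # linear blend between the two neighbouring points
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
```

Lowering voltage shifts where on a curve like this the card settles under load, which is why undervolting and a custom curve work well together.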


----------



## hwoverclkd

Quote:


> Originally Posted by *Mega Man*
> 
> to my understanding this is not true, as they have started to laser-cut the die, so you physically cannot unlock them
> sorry, i think he still has you here..
> this is not meant to be personal, so don't take it like that.
> 
> but whenever people say this... i laugh. 99% of customer service / tech support are more tech illiterate than you. only once you speak to an upper-level supervisor do they really begin to be competent
> 
> most of them just have a flowchart to use. my favorite is when you list everything you already did, and they still ask "did you restart your pc, did you try it in another slot, another pc" god it makes me want to smack them through the phone.....


Did he really? That's fine... I don't really mind. Besides, that's not the whole story, and he's entitled to his opinion. Anyway, it doesn't say anything about who I am, but it does say something about him, I guess









On the MSI response, I'm aware of that. There's a point in your life when you realize that even if you know the truth, you'd rather stick to what is important to you. In this case, time.


----------



## weebeast

Hey, I received my 290X and I'm going to install it soon. I would like to know if there is a way to tell whether the first owner used it for mining, because he bought 12 of these cards in February 2014 lol (per the proof of purchase that I got from him).


----------



## Terrere

Quote:


> Originally Posted by *weebeast*
> 
> Hey, I received my 290X and I'm going to install it soon. I would like to know if there is a way to tell whether the first owner used it for mining, because he bought 12 of these cards in February 2014 lol (per the proof of purchase that I got from him).


Odds are that if they bought 12 cards, they used them for mining.


----------



## velocityx

Why would he buy 12 of these if not to mine with them?


----------



## weebeast

yeh lol, I was thinking the same, but why would he sell this one after 3 months of use?

It really looks brand new.

Edit:

seems he has sold 3 video cards (including mine) in total, ah well, hopefully it works well; if not, I can send it in for RMA. No worries


----------



## VSG

Mining is not very profitable now even with AMD cards, hence he is likely selling them off one by one.


----------



## kckyle

Ah, I have two options: one is a Sapphire 290 with a very questionable warranty (no invoice, so no guarantee),

or I can spend 50 bucks more and get an MSI 290, whose warranty, from what I read, is based on the serial number. If that's the case, then I should have 2 years left on the card, since the seller states it's 3 months old.


----------



## eternal7trance

Most used AMD cards being sold recently were bought for mining and are now being offloaded since GPU mining sucks


----------



## heroxoot

Quote:


> Originally Posted by *geggeg*
> 
> Mining is not very profitable now even with AMD cards, hence he is likely selling them off one by one.


What happened? Is mining really that dead?

Anyway, if you buy from ebay and they do not work, ebay seriously protects buyers. You are not in danger realistically.


----------



## eternal7trance

Quote:


> Originally Posted by *heroxoot*
> 
> What happened? Is mining really that dead?
> 
> Anyway, if you buy from ebay and they do not work, ebay seriously protects buyers. You are not in danger realistically.


It's not dead, just GPU mining is dead


----------



## kckyle

Quote:


> Originally Posted by *heroxoot*
> 
> What happened? Is mining really that dead?
> 
> Anyway, if you buy from ebay and they do not work, ebay seriously protects buyers. You are not in danger realistically.


My worry is what happens after 30 days. I've had perfectly fine cards flake out on me after 5 months of normal gaming use.


----------



## weebeast

Well, I have 21 months of warranty left since I got the invoice. I just don't have the box.

What's the best way to test the card?


----------



## EliteReplay

Quote:


> Originally Posted by *Roboyto*
> 
> As Kokin said, keep away from CCC. It will likely cause more problems than give you results.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Afterburner/Trixx/GPU Tweak are the most common OC utility choices, I would stick to them.
> 
> I don't believe any 290(X) is voltage locked, but I am not 100% certain; I haven't seen anyone mention it. It wouldn't really make sense since these are the enthusiast class cards.
> 
> This is the thread you want to pay attention to for 290(X) information. There can be a lot to wade through, but if you take your time there is plenty of information.
> 
> Feel free to ask away in the thread, or send a PM, but don't be discouraged if you don't get a response as this thread moves extremely quickly at times and your post can get buried; Re-post if need be
> 
> *The R9 290(X) "Need to Know"*
> 
> *Anyone please feel free to PM/quote/@Roboyto
> to correct/add information.*
> 
> _*These cards make a lot of heat. Have great airflow, a great heatsink, or plan on watercooling them.*_
> Card will continue to run normally up to 94C, _*they throttle at 95C*_ reducing clocks/voltages.
> 
> 
> *Keep an eye on VRM1 temps, <90C is where you want to be for VRM1.*
> VRM1 controls core voltage and is the *long vertical strip* closest to the PCIe power connectors.
> 
> *VRM Locations*
> 
> 
> 
> *VRM2 controls RAM* voltage and is nearest the video output connections.
> You won't likely need to be concerned with keeping them cool since they don't run as hot.
> From my experience with a Kraken G10 on a 290, all you need is a good heatsink(s) on VRM2 to keep it cool; no airflow is really even necessary.
> 
> Kraken & Gelid Heatsink Cooling - http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition/20#post_22214255
> 
> 
> 
> 
> 
> *What should you Overclock with?*
> Stay away from Catalyst Control Center aka OverDrive
> Afterburner
> Trixx
> GPU Tweak
> 
> Requires ASUS card or ASUS BIOS
> 
> 
> 
> *What should you benchmark with? *
> Remember a benchmark is a benchmark. Stability in any of the following is a good test, _*but does not guarantee*_ gaming/daily use stability.
> 3DMark
> Unigine
> Catzilla
> Tomb Raider
> Bioshock
> FFXIV (my fav hands down "It's Stable" test)
> etc
> 
> 
> 
> *FurMark is a poor choice*
> it gives unrealistic power draw
> It is also a poor representation of what happens when playing a game
> don't know many that involve continuous repetition of a swaying furry doughnut.
> 
> 
> 
> *Your benchmark score went down after OCing? *
> The card is _*throttling*_ due to temperature.
> "May indicate instability. Reduce the overclock and retest" @BradleyW
> Add a little voltage/power limit *IF temperatures will allow*
> 
> Make sure you *Force Constant Voltage* in your OC utility
> Make sure you *Disable ULPS* via registry or OC Utility; especially with multiple GPUs!
> 
> 
> 
> *With high OC/voltage (1) card can pull 350W at full load. *
> Max voltage in AB or Trixx will put you around 1.4V core; unsure for GPU Tweak.
> It's "safe" if you can keep core and VRM1 cool.
> 
> *Latts O' Watts! 1315/1700 +200mV +50% Power Limit*
> 
> 
> 
> 
> *Core clock is king. * *Overclock core 1st, worry about memory 2nd. The 512-bit bus makes up for the 5GHz default memory speeds.*
> If you experience artifacting/tearing/flashing then odds are you:
> pushed core clock too high
> need to add more voltage (if temperatures will allow)
> should increase power limit (if temperatures will allow)
> If you can't add more voltage, bring the core clock down a little
> 
> 
> If you experience black screen, BSOD, or other lockups then odds are you have pushed the RAM clock too high.
> MSI Afterburner AUX voltage can sometimes assist in RAM clocks.
> Make sure you *Force Constant Voltage* in your OC utility
> Make sure you *Disable ULPS* via registry or OC Utility; especially with multiple GPUs!
> 
> 
> *RAM Speed Comparisons I found @ 1080P*
> 
> 
> 
> 
> *Elpida or Hynix, which is better?*
> _*Most*_ of the time you have a better _*chance*_ to get a good OC out of Hynix memory.
> This isn't always the case, as there are several users who have _*reported excellent Elpida RAM speeds*_.
> But, remember, _*RAM speeds are 2nd fiddle*_ to core so I wouldn't fret over this all that much.
> 
> 
> *Don't be so worried about ASIC quality. *
> Test the card to see how well it does.
> 
> 
> *Above average cards can get to ~1100 core clock without needing additional voltage. *
> This is of course dependent upon:
> silicon lottery
> specific GPU
> cooling
> voltage regulation
> and the varying card/BIOS default voltage settings.
> 
> 
> 
> *Want stable drivers?*
> 13.12 was the bees knees up until 4/25/14 when 14.4 WHQL released.
> 14.4 seems OK so far, but if you have issues *13.12 is ROCK SOLID.*
> _I have experienced blackscreen/lockup with 14.4 & Win7 64-bit._
> 
> *Make sure you remove ANY old drivers properly before installing new ones:*
> *Display Driver Uninstaller* http://www.guru3d.com/files_get/display_driver_uninstaller_download,9.html
> Courtesy of @BradleyW
> http://www.overclock.net/t/988215/how-to-properly-uninstall-ati-amd-software-drivers-for-graphics-cards
> Courtesy of @BradleyW
> http://www.overclock.net/t/1150443/how-to-remove-your-nvidia-gpu-drivers#post_15432476
> 
> 
> 
> *How big of a PSU do you need? *
> 
> *This is dependent upon:*
> CPU
> # of GPUs
> what clocks/voltages you will run them at.
> 
> Reference R9 290 @ stock settings is in the ~180W range
> 290(X) with high overclock/voltage can be in ~350W range
> If you're using an AMD FX 8-core chip with a high overclock, then be mindful of their power hungry tendencies; they can draw ~300W alone
> 
> *Cold hard facts for overclocked 290X power usage courtesy of* @SonDa5
> http://www.overclock.net/t/1441118/290x-psu-power-output-tests#post_21156921
> 
> 
> *Need a good PSU? 3 guides courtesy of* @shilka
> http://www.overclock.net/t/1482157/700-750-watts-comparison-thread#post_22109815
> http://www.overclock.net/t/1438987/1000-1050-watts-comparison-thread#post_21108368
> http://www.overclock.net/t/1483789/1200-1350-watts-comparison-thread#post_22141960
> 
> 
> *From my experience my HTPC flawlessly runs:*
> 450W Rosewill Capstone
> 290 @ stock
> 3770k @ stock
> 1 SSD, 2 HDD and 1 ODD
> 2 AIO pumps & 3 fans presently
> A 3770k at stock is easily under 100W
> The 290 is only drawing ~160W with an AIO on it
> After you factor in AIO water pumps, fans, SSD, HDD and ODD...this system is still easily under the 450W mark.
> 
> 
> 
> *My main desktop flawlessly runs:*
> 650W Rosewill Capstone
> (1) 290 up to 1300/1700 +200mV @ 5760*1080 benching
> 1200/1500 +87mV @ 5760*1080 everyday gaming clocks
> 
> 4770k @ 4.5GHz 1.259V
> 16GB RAM 2400MHz 1.65V
> 3 SSDs & 1 HDD
> 6 fans & 1 water pump
> 
> 
> *Which waterblock/backplate is best?*
> 
> Pretty much all waterblocks will cool the GPU core the same
> The core isn't the *problem child* of the 290(X) though, it's *VRM1 *
> 
> 
> *Best overall waterblock performance when using exactly what is given to you in the box goes to Aquacomputer.*
> Whether you choose the passive/active backplate, you will get the best VRM1 temps here.
> 
> 
> 
> 
> Spoiler: Active and Passive Backplates
> 
> 
> 
> 
> 
> Card looks a lot better with the active attachment; but it is not necessary as you get 90% of the performance without it.
> 
> 
> 
> 
> 
> *The other manufacturers (EK, XSPC, Koolance) seem to have slacked in the VRM1 area, especially when you start pushing these cards...but there is an easy fix!*
> Replacing the thermal pads with Fujipoly Ultra Extreme will slash VRM temps.
> _*See my thread here:*_ http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures
> Make sure you double check pad thickness for your specific block in my ^ thread
> the 15x100mm strip of Fujipoly Ultra Extreme is enough to do 2-3 cards if you cut thin enough.
> 
> 
> *I want to add a full cover waterblock, do I have a reference design card?*
> 
> Check EK's site: www.coolingconfigurator.com
> A Review of your card may tell you as well: http://www.techpowerup.com/reviewdb/Graphics-Cards/AMD/R9-290/
> 
> 
> 
> Spoiler: Reference Design
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *What non-reference cards are there full cover blocks for?*
> ASUS DC2 & Matrix - EK
> MSI Gaming & Lightning - EK
> Gigabyte made a change to their PCB - EK has updated their reference blocks to Rev 2.0
> 
> 
> 
> *Which card/brand has the best air cooler?*
> 
> For the *290X the MSI Lightning* undoubtedly takes the cake
> For a *290* this probably goes to the *PowerColor PCS+* and its massive *3-slot* cooler. If you have room for it then you may not need to look much further.
> If you want one that is *2-slot* then you have plenty of good choices
> MSI, Sapphire, Gigabyte, VisionTek and XFX all have capable coolers on their cards
> However, nearly all of these add to the already ridiculous length of the 290(X) PCB at 278mm/~11 in.
> It appears that the MSI Gaming and VisionTek are the only two coolers that don't extend past the PCB
> 
> 
> 
> *You didn't mention the ASUS DC2?*
> That is because it has the worst performance from outright laziness on ASUS' part
> They took the cooler from the GK104, GTX 780, and slapped it on the 290(X)
> 290(X) die is much smaller than GK104, so only 3/5 heatpipes make contact; severely inhibiting performance
> I wouldn't bother unless you plan to use a waterblock
> I do think the heatpipe contact could _*possibly*_ be remedied with a thin copper shim to help distribute heat to the other 2 pipes
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *My card runs too hot, is too noisy, and/or custom water cooling isn't possible for me...what are my modding options?*
> 
> *Gelid Icy Vision 2* - http://www.newegg.com/Product/Product.aspx?Item=N82E16835426026
> Courtesy of @tony_choi
> http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x
> 
> 
> *Arctic Cooling Accelero Extreme IV* - http://www.newegg.com/Product/Product.aspx?Item=N82E16835186097
> Xtreme III results courtesy of TomsHardware - http://www.tomshardware.com/reviews/r9-290-accelero-xtreme-290,3671.html
> 
> 
> *Attach an AIO water cooler*
> *Arctic Accelero Hybrid* - http://www.newegg.com/Product/Product.aspx?Item=N82E16835186095
> *NZXT Kraken G10* - http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=kraken+g10&N=-1&isNodeId=1
> *Kraken G10 in my 250D* - http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition/20#post_22214255
> 
> 
> 
> *Need Heatsinks for VRMs or RAM?*
> *Gelid VRM Heatsink Kit* - http://www.feppaspot.com/servlet/the-897/GELID-Solutions-Icy-Vision/Detail
> Definitely best solution for reference cards as no modification is needed
> You can see results from this kit in my 250D thread ^ just above ^
> 
> *You have many options for RAM sinks*
> Here is the cheap/easy solution I chose from 3M
> *3M 8810* - http://www.performance-pcs.com/catalog/index.php?main_page=product_info&products_id=39462
> Thermal tape is pre-applied. Easy to install _*and*_ easy to remove if need be.
> 
> 
> 
> *Gelid VRM Kit + 3M 8810 Ram Sinks*
> 
> 
> 
> *What thermal paste, or TIM, should I use for my card?*
> 
> This boils down to a bit of personal preference as there are so many to choose from, but the most popular TIMs are popular because they work
> *Gelid GC-Extreme*
> *Arctic MX-4*
> Both of these deliver optimum results immediately (non-curing), are non-conductive, and have a long service life.
> 
> 
> Only 2 products...you didn't list my favorite thermal paste?
> There are countless different products that deliver similar/better results; Xigmatek PTI-G4512 is my personal fav
> Gelid GC-Extreme and Arctic MX-4 simply work, are safe, cheap and easily attainable
> 
> *I've heard about CoolLaboratory Liquid Pro/Ultra (CLP/CLU), does it really work that much better?*
> 
> It does deliver superior results compared to traditional paste, but there are a couple things to be mindful of
> It's metal and conducts electricity, be very careful when applying/removing this stuff!
> Put some masking tape down around the GPU die; better safe than $400+ sorry
> 
> It can have a shorter service life than a traditional paste
> This depends on how rough you are on your GPU(s), so keep an eye on those temperatures
> 
> It can be difficult to apply and even more difficult to remove
> _*I have 0 first hand experience*_, but have read my share of info and have decided it is not for me.
> Ultra is supposed to be easier to apply and remove than Pro
> 
> Is it OK to use with a bare copper heatsink/waterblock?
> I have read of more than one instance where a heatsink, block, or lapped/polished IHS "fused" together when cooling a CPU
> It is probably best to use it with a nickel-plated heatsink/waterblock so this doesn't happen
> http://www.overclock.net/t/1377337/how-should-i-apply-coollaboratory-liquid-pro-to-my-cpu
> http://www.overclock.net/t/1364267/question-about-coollaboratory-liquid-ultra-thermel-paste/10
> 
> 
> *Anyone please feel free to PM/quote/@Roboyto
> to correct/add information.*


This info should be in the OP on the first page... it has a lot of info that can't be found once the thread runs on for many pages.
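As a side note, the PSU sizing ballpark from that quoted guide can be sketched as a quick estimator. The wattage figures are the guide's rough numbers; the helper itself is a made-up illustration, not a substitute for checking your actual hardware:

```python
# Rough PSU headroom estimator using the ballpark figures from the
# "Need to Know" post. All numbers are approximations, not measured specs.

DRAW_W = {
    "r9_290_stock": 180,      # reference 290 at stock (~180 W per the guide)
    "r9_290_oc": 350,         # heavily overclocked/overvolted 290(X)
    "fx8_oc": 300,            # overclocked AMD FX 8-core, worst case
    "intel_quad_stock": 100,  # e.g. 3770k at stock, easily under 100 W
}

def estimate_load(cpu, gpus, misc_w=75):
    """Sum CPU + GPU draw plus a flat allowance for drives, fans, pumps."""
    return DRAW_W[cpu] + sum(DRAW_W[g] for g in gpus) + misc_w

def psu_ok(psu_watts, cpu, gpus, headroom=0.8):
    """True if the estimated load stays under `headroom` of the PSU rating."""
    return estimate_load(cpu, gpus) <= psu_watts * headroom

# e.g. the 450 W HTPC example: 100 + 180 + 75 = 355 W estimated
```

This matches the HTPC example in the guide (a stock 3770k plus a stock 290 sits comfortably under a quality 450 W unit), while two overvolted cards on an overclocked FX chip clearly don't fit a 650 W budget.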


----------



## eternal7trance

Quote:


> Originally Posted by *weebeast*
> 
> Well i have 21 months left of warranty since i got the invoice. I just don't have a box.
> 
> What's the best way to test the card?


You could always just let it sit in the valley benchmark for a while and see how it does


----------



## Roboyto

Quote:


> Originally Posted by *EliteReplay*


It is in the OP :-D


----------



## kckyle

nvm


----------



## kayan

Well, I got my 2nd 290x, reference XFX board. One of them has a block on it, while the other does not....yet. Everything is at stock, everything. This is my 3dMark run:



I am broke at the moment, so the 2nd block and overclocking will have to wait a while methinks.

Edit: Sorry for the Steam chat in the bottom.


----------



## bond32

Quote:


> Originally Posted by *kayan*
> 
> Well, I got my 2nd 290x, reference XFX board. One of them has a block on it, while the other does not....yet. Everything is at stock, everything. This is my 3dMark run:
> 
> 
> 
> I am broke at the moment, so the 2nd block and overclocking will have to wait a while methinks.
> 
> Edit: Sorry for the Steam chat in the bottom.


Wow... That 9370 is really holding your score back. I got almost the exact same score but with a single 290x... http://www.3dmark.com/fs/2139361

Always wanted to go back to an 8350 and crosshair v formula z board, but no way now...


----------



## DeadlyDNA

Quote:


> Originally Posted by *bond32*
> 
> Wow... That 9370 is really holding your score back. I got almost the exact same score but with a single 290x... http://www.3dmark.com/fs/2139361
> 
> Always wanted to go back to an 8350 and crosshair v formula z board, but no way now...


Okay, I'm confused, because this is the 290/290X owners thread. So I am assuming we're comparing the GPUs.
If that's true:

his GPU score = 20528

vs

your GPU score = 13903

If you're comparing the overall system score, then yes, your CPU scored more points and HIS GPUs scored more points.... am I missing something?


----------



## bond32

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Okay im confused because this si the 290/290x owners thread. So i am assuming were comparing the GPU's.
> If thats true,
> 
> his GPU SCORE = 20528
> 
> vs
> 
> Your GPU SCORE = 13903
> 
> If your comparing the overall system score then yes, your CPU scored more points and HIS GPU's scored more points.... am i missing something?


You quoted my post, but you seem as if you didn't read it. I was pointing out how his low physics score (caused by the 9370) compared to intel users contributes to the overall lower score.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> You misunderstand, I think. Gestler trying to achieve lower temp in 3D mode, i.e. when under load, not idle. Gestler did achieved the objective (good temp & less fan noise) by lowering the voltage. However the problem is, when the card return to 2D mode, i.e. idle, it become unstable. So, Gestler is looking if there is any modified BIOS with fixed voltage or can set fixed voltage instead of offset (?).
> 
> @Gestler, I think with ASUS PT1 VBIOS, you can set voltage instead of offset using GPU Tweak apps. Check it out. madman (@HOMECINEMA-PC) can help you I think.
> 
> *[EDIT]* I forgot that PT1 VBIOS don't have 2D mode.


@Gestler
I run the Asus PT1T unlocked vbios. You will be able to drop the voltage some, but not the clock; it stays at 1000/1250 using Asus GPU Tweak.
http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/0_20#


----------



## DeadlyDNA

Quote:


> Originally Posted by *bond32*
> 
> You quoted my post, but you seem as if you didn't read it. I was pointing out how his low physics score (caused by the 9370) compared to intel users contributes to the overall lower score.


Okay.


----------



## heroxoot

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Wow... That 9370 is really holding your score back. I got almost the exact same score but with a single 290x... http://www.3dmark.com/fs/2139361
> 
> Always wanted to go back to an 8350 and crosshair v formula z board, but no way now...
> 
> 
> 
> Okay im confused because this si the 290/290x owners thread. So i am assuming were comparing the GPU's.
> If thats true,
> 
> his GPU SCORE = 20528
> 
> vs
> 
> Your GPU SCORE = 13903
> 
> If your comparing the overall system score then yes, your CPU scored more points and HIS GPU's scored more points.... am i missing something?

He is pointing out that the crossfire setup is getting less than double the score compared to a single 290X. This is because the physics score is poor on AMD CPUs. I also bench slightly low compared to people with Intel CPUs; I see about 4-5 fps less than people with the same build on Intel.
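For anyone curious why a low physics/combined result pulls the overall number down so hard: Fire Strike's overall score is a weighted harmonic mean of the sub-scores. The weights below are from memory of the 3DMark technical guide, so treat them as approximate:

```python
# Sketch of Fire Strike's overall score: a weighted harmonic mean of the
# graphics, physics, and combined sub-scores. Weights are approximate
# (recalled from the 3DMark technical guide, not verified here).

W_GRAPHICS, W_PHYSICS, W_COMBINED = 0.75, 0.15, 0.10

def overall(graphics, physics, combined):
    return (W_GRAPHICS + W_PHYSICS + W_COMBINED) / (
        W_GRAPHICS / graphics + W_PHYSICS / physics + W_COMBINED / combined
    )

# A harmonic mean punishes the low term: doubling the graphics score
# raises the overall far less than 2x if physics/combined stay put.
```

That's why a CrossFire setup behind an FX chip can show a near-doubled graphics sub-score yet an overall number close to a single card on an Intel system.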


----------



## Gil80

Hi all,

I purchased the R9 290 Gigabyte Windforce OC BF4 edition.
Now I really want to go CF, but do I have to buy the exact same GPU?
In addition, I wanted to extend my water cooling to the GPU, but I couldn't find a water block for my Gigabyte Windforce edition.
If I could get the reference R9 290, then I could water cool it, but I don't know if it will work well in CF with the Gigabyte GPU.

Any recommendations?

Thanks


----------



## kayan

Quote:


> Originally Posted by *bond32*
> 
> Wow... That 9370 is really holding your score back. I got almost the exact same score but with a single 290x... http://www.3dmark.com/fs/2139361
> 
> Always wanted to go back to an 8350 and crosshair v formula z board, but no way now...


Quote:


> Originally Posted by *DeadlyDNA*
> 
> Okay im confused because this si the 290/290x owners thread. So i am assuming were comparing the GPU's.
> If thats true,
> 
> his GPU SCORE = 20528
> 
> vs
> 
> Your GPU SCORE = 13903
> 
> If your comparing the overall system score then yes, your CPU scored more points and HIS GPU's scored more points.... am i missing something?


Quote:


> Originally Posted by *heroxoot*
> 
> He is pointing out the crossfire setup is getting less than double the score compared to a single 290X. This is because the physics score is crap on AMD CPU. I also bench slightly low compared to people with Intel CPU. I see about 4 - 5fps less than people with the same build on Intel.


@everyone -- You know, my benchmarks aren't as high as comparable Intel machines, but the experience (in my experience) is and has been smoother. I had a 3770k that was rock solid for a little over a year, and I sold it to my brother for cheap, because his old Phenom crapped out and he was short on funds. I replaced it with a 4770k + Gigabyte UD4 board, and the PC was never stable, it was crashing all the time. I took it back to Microcenter, and got a diff 4770k + MSI MPower board, and this board would reset all my BIOS settings on reboot, every single time. I took that back and went back to an AMD CPU, because for the many many years I was with AMD, I never had a single issue that wasn't caused by me. I got the 9370 + Crosshair V-Z for about 200 less than either of the 4770k combos I had, and with the money saved I also upgraded my then fiancé's PC (now wife).

In games, the 290x has been flawless, but I wanted to max everything so I grabbed another (since they're cheap on ebay at the moment), and it's awesome. My benchmarks aren't as good as Intel builds, but again the performance in games has been stellar, with no issues in anything I play. And, speaking from owning both sides of the coin, I had stuttering and skipping with the same stuff on a 4770k.

*Rant over*

I uploaded a screenie so everyone could see what it was like on an AMD machine. Hope it helps someone.


----------



## Sgt Bilko

Quote:


> Originally Posted by *bond32*
> 
> You quoted my post, but you seem as if you didn't read it. I was pointing out how his low physics score (caused by the 9370) compared to intel users contributes to the overall lower score.


Intel usually does better on the physics test, but it's the combined test where AMD falls short....

Firestrike doesn't load up all the cores during the combined test, so the score is not as good; gaming-wise they are very comparable and there is little to no difference really.


----------



## Arizonian

Quote:


> Originally Posted by *ebhsimon*
> 
> Yo add me too.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Yes I removed all the expansion slot covers as per http://www.overclock.net/t/1460812/a-little-test-removing-pci-slot-covers-does-make-your-gpu-cooler-pic-heavy my thread showing it does allow your gpu to be cooler.
> 
> And @ace1ndahole, damn son that's some nice value ram you got there. First build I've seen to use it.
> 
> *Temperature runs: Running Kombustor.
> Stock: 1030 core, 1400 mem.
> *25.0C at the intake.
> 
> *10 minutes in, core maxes out @ 74C, VRM1 @ 75C. Fans on auto reaching 40%*
> 
> Then applying overclock right after running 10 minutes of kombuster on stock:
> *Overclock 1135 core, 1580 mem, +100mV, +50% power limit*
> 25.0C at the intake.
> 
> *10 minutes in, core maxes out @ 79C, VRM1 @ 99C. Fans on auto reaching 46%*
> 
> Now I have to crack open a window because of all the heat that was just dumped into my room...


Congrats - added










If you could add a GPU-Z validation link with your OCN name on it to your original post, that would be great.

Quote:


> Originally Posted by *Zamoldac*
> 
> Count me in!
> 
> 
> 
> 
> 
> 
> 
> 
> R9 290 (BBA/Press Sample)/ Aquacomputer kryographics R9 290 black edition.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added
















If you could add a GPU-Z validation link with your OCN name on it to your original post, that would be great as well.
Quote:


> Originally Posted by *EliteReplay*
> 
> this info should be on the OP on the frist page... it has a lot of info that cant be found once the thread has many pages on and on...


Quote:


> Originally Posted by *Roboyto*
> 
> It is in the OP :-D












*The R9 290(X) "Need to Know" by OCN member Roboyto*









*The R9 290(X) "Need to Know" Post*


----------



## heroxoot

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> You quoted my post, but you seem as if you didn't read it. I was pointing out how his low physics score (caused by the 9370) compared to intel users contributes to the overall lower score.
> 
> 
> 
> Intel usually does better on the physics test but its the combined test where AMD falls short....
> 
> Firestrike doesnt load up all the cores during the combined so the score is not as good, gaming wise there are very comparable and there is little to no difference really.

This is literally why I just don't like benching anymore. Getting 100 or more FPS on my crappy 8150 makes no difference to me, because it already performs very well by comparison. Otherwise, everyone's 290X looks good.


----------



## Roboyto

Quote:


> Originally Posted by *Gil80*
> 
> Hi all,
> 
> I purchased the R9 290 Gigabyte Windforce OC BF4 edition.
> Now I really want to go CF but do I have to buy the exact same GPU?
> In addition to that, I wanted to expend my water cooling to the GPU but for my Gigabyte Windforce edition I couldn't find GPU water block.
> If I could get the ATI R9 290, then I can water cool it but I don't know if it will work well in CF with the Gigabyte GPU.
> 
> Any recommendations?
> 
> Thanks


The normal Windforce OC will accept EK's Rev. 2.0 blocks. I doubt the BF4 version is any different aside from including the game. Go to www.coolingconfigurator.com to check it out.

If you must be 100% certain before purchasing you can always e-mail Gigabyte or EK for validation of this fact.

It's not necessary to have identical cards, but it can potentially eliminate some issues you could run into.


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> You quoted my post, but you seem as if you didn't read it. I was pointing out how his low physics score (caused by the 9370) compared to intel users contributes to the overall lower score.
> 
> 
> 
> Intel usually does better on the physics test but its the combined test where AMD falls short....
> 
> Firestrike doesnt load up all the cores during the combined so the score is not as good, *gaming wise there are very comparable and there is little to no difference really*.

^^ This.


----------



## kayan

Quote:


> Originally Posted by *kizwan*
> 
> ^^ This.


See, this is my whole point. It's kinda pointless to bench on my hardware, but I do it to check stability and whatnot. If I can bench stable (regardless of score, even though I do try to get mine higher), then I surely can game stable. My hobby is gaming, not benching, and if I get the kind of performance that I crave on AMD or Intel, then so be it. It doesn't matter to me, as long as I'm stable and can crank the settings in game. I've used both sides of the camp for CPUs and the other two sides for GPUs (and ATI before it was bought by AMD, so I guess there are 3 sides there), and I've switched along the way, back and forth. I'm not trying to start a flame war by any means, but if you get anything out of my post, please get this....

We all have a love of computing, and we all want the best hardware, but what is "best" to you, may not be what is "best" for me. Some want the fastest, no holds barred, while others want stable and fast (even if it isn't the fastest of the fast), but we all come here to discuss these things -- in all their many facets. We should be encouraging each other, not saying some of the stuff from the last page, because of their choice in components. The more benchers/gamers/media enthusiasts that there are, the better it is for all of us. *All of us* are what pushes these companies (ALL of them) to innovate and push cutting edge.

So, please try to be a little more accepting and not mock each other for our choice of hardware, because of what we value in our rigs.

Back on topic, these two video cards are awesome, and I can't wait to try 4k. Is anybody here pushing 4k monitors with these cards? Or even 780s/Titans?


----------



## Mega Man

Quote:


> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kayan*
> 
> Well, I got my 2nd 290x, reference XFX board. One of them has a block on it, while the other does not....yet. Everything is at stock, everything. This is my 3dMark run:
> 
> 
> 
> I am broke at the moment, so the 2nd block and overclocking will have to wait a while methinks.
> 
> Edit: Sorry for the Steam chat in the bottom.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wow... That 9370 is really holding your score back. I got almost the exact same score but with a single 290x... http://www.3dmark.com/fs/2139361
> 
> Always wanted to go back to an 8350 and crosshair v formula z board, but no way now...

to note: both intel and amd score way, way lower in windows 8 on physics (500-2k; different systems will have different scores: on one of my 8350s it is ~500, on another ~1500)
Quote:


> Originally Posted by *kayan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Wow... That 9370 is really holding your score back. I got almost the exact same score but with a single 290x... http://www.3dmark.com/fs/2139361
> 
> Always wanted to go back to an 8350 and crosshair v formula z board, but no way now...
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *DeadlyDNA*
> 
> Okay, I'm confused, because this is the 290/290X owners thread, so I am assuming we're comparing the GPUs.
> If that's true:
> 
> his GPU SCORE = 20528
> 
> vs
> 
> your GPU SCORE = 13903
> 
> If you're comparing the overall system score, then yes, your CPU scored more points and HIS GPUs scored more points.... am I missing something?
> 
> 
> You're missing the e-peen; he is after numbers, and most benches are Intel-favored.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> He is pointing out that the crossfire setup is getting less than double the score compared to a single 290X. This is because the physics score is crap on AMD CPUs. I also bench slightly low compared to people with Intel CPUs. I see about 4-5 fps less than people with the same build on Intel.
> 
> 
> @everyone -- You know, my benchmarks aren't as high as comparable Intel machines, but the experience (in my experience) is and has been smoother. I had a 3770K that was rock solid for a little over a year, and I sold it to my brother cheap because his old Phenom crapped out and he was short on funds. I replaced it with a 4770K + Gigabyte UD4 board, and the PC was never stable; it was crashing all the time. I took it back to Micro Center and got a different 4770K + MSI MPower board, and this board would reset all my BIOS settings on reboot, every single time. I took that back and went back to an AMD CPU, because in the many, many years I was with AMD, I never had a single issue that wasn't caused by me. I got the 9370 + Crosshair V-Z for about $200 less than either of the 4770K combos I had, and with the money saved I also upgraded my then fiancée's PC (now wife's).
> 
> In games, the 290x has been flawless, but I wanted to max everything so I grabbed another (since they're cheap on ebay at the moment), and it's awesome. My benchmarks aren't as good as Intel builds, but again the performance in games has been stellar, with no issues in anything I play. And, speaking from owning both sides of the coin, I had stuttering and skipping with the same stuff on a 4770k.
> 
> *Rant over*
> 
> I uploaded a screenie so everyone could see what it was like on an AMD machine. Hope it helps someone.

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> You quoted my post, but you seem as if you didn't read it. I was pointing out how his low physics score (caused by the 9370) compared to intel users contributes to the overall lower score.
> 
> 
> 
> Intel usually does better on the physics test, but it's the combined test where AMD falls short....
> 
> Firestrike doesn't load up all the cores during the combined test, so the score is not as good. *Gaming-wise they are very comparable, and there is little to no difference really*.
> 
> 
> ^^ This.

Not as bad in 3DMark 11:
http://www.3dmark.com/3dm11/6664211


Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *kayan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> ^^ This.
> 
> 
> 
> See, this is my whole point. It's kinda pointless to bench on my hardware, but I do it to check some stabilities and what not out. If I can bench stable (regardless of score, even though I do try to get mine higher), then I surely can game stable. My hobby is gaming, not benching, and if I get the kind of performance, that I crave, on AMD or Intel, then so be it. Doesn't matter to me, if I'm stable and can crank the settings in game. I've used both sides of the camp for CPUs and the other two sides for GPUs (and ATI before it was bought by AMD, so I guess it's a 3 side there) and I've switched along the way, back and forth. I'm not trying to start a flame war by any means, but if you get anything out of my post, please get this....
> 
> We all have a love of computing, and we all want the best hardware, but what is "best" to you, may not be what is "best" for me. Some want the fastest, no holds barred, while others want stable and fast (even if it isn't the fastest of the fast), but we all come here to discuss these things -- in all their many facets. We should be encouraging each other, not saying some of the stuff from the last page, because of their choice in components. The more benchers/gamers/media enthusiasts that there are, the better it is for all of us. *All of us* are what pushes these companies (ALL of them) to innovate and push cutting edge.
> 
> So, please try to be a little more accepting and not mock each other for our choice of hardware, because of what we value in our rigs.
> 
> Back on topic, these two video cards are awesome, and I can't wait to try 4k. Is anybody here pushing 4k monitors with these cards? Or even 780s/Titans?





shhhhhh! please don't let everyone know the secret. personally, i am sick of the trolls in the amd threads









and, i have both a 2011 and an 8350, and i much much much prefer my 8350. i really hope amd does not abandon this, because the 8350 feels so much smoother day to day (windows, internet); i can see the hangs on my 2011 vs my 8350.

also, i would like to throw this in here: i am starting to mess with firestrike, and when i get my 8350 back up i'll mess with it (atm it is ripped apart for a complete overhaul that will probably take quite a while; i need some stuff and i am spending my monies elsewhere).

in other news, fedex came with my komodos and blocks/rads (4x 480s







RIVBE block), but he didn't leave them; i need to sign for them :/


----------



## bak3donh1gh

Hey guys, so I've got my 290 installed. I did get a black screen while overclocking, so I'm back to stock, but now I'm getting a lot of CTDs in Titanfall. Anyone know what could be causing this?

I uninstalled, went into safe mode, and used CCleaner and Driver Sweeper, so I'm pretty confident in the driver install (14.4).

Also, I'm sure it's stupidly obvious, but where can I find the PT1 BIOS? Thanks!


----------



## kayan

Quote:


> Originally Posted by *bak3donh1gh*
> 
> Hey guys, so I've got my 290 installed. I did get a black screen while overclocking, so I'm back to stock, but now I'm getting a lot of CTDs in Titanfall. Anyone know what could be causing this?
> 
> I uninstalled, went into safe mode, and used CCleaner and Driver Sweeper, so I'm pretty confident in the driver install (14.4).
> 
> Also, I'm sure it's stupidly obvious, but where can I find the PT1 BIOS? Thanks!


After I stuck a waterblock on my 290X and started overclocking, I had the exact same issue in Titanfall: crashes to desktop. I also had the same problem in Skyrim and Battlefield 4. I needed to turn the memory overclock down and up the power limit to +50% in AB/Trixx. Try overclocking just your core and see if the card holds up while gaming.


----------



## Dasboogieman

Quote:


> Originally Posted by *bak3donh1gh*
> 
> Hey guys, so I've got my 290 installed. I did get a black screen while overclocking, so I'm back to stock, but now I'm getting a lot of CTDs in Titanfall. Anyone know what could be causing this?
> 
> I uninstalled, went into safe mode, and used CCleaner and Driver Sweeper, so I'm pretty confident in the driver install (14.4).
> 
> Also, I'm sure it's stupidly obvious, but where can I find the PT1 BIOS? Thanks!


Here ya go: http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread
4 spoilers down, under Other BIOSes.

Please be careful with these BIOSes, especially the PT3 one which disables all safety measures.

What is the OC, voltage, Memory type (Elpida, Samsung or Hynix) and ASIC quality of your chip?


----------



## Sgt Bilko

Quote:


> Originally Posted by *kayan*
> 
> See, this is my whole point. It's kinda pointless to bench on my hardware, but I do it to check some stabilities and what not out. If I can bench stable (regardless of score, even though I do try to get mine higher), then I surely can game stable. My hobby is gaming, not benching, and if I get the kind of performance, that I crave, on AMD or Intel, then so be it. Doesn't matter to me, if I'm stable and can crank the settings in game. I've used both sides of the camp for CPUs and the other two sides for GPUs (and ATI before it was bought by AMD, so I guess it's a 3 side there) and I've switched along the way, back and forth. I'm not trying to start a flame war by any means, but if you get anything out of my post, please get this....
> 
> We all have a love of computing, and we all want the best hardware, but what is "best" to you, may not be what is "best" for me. Some want the fastest, no holds barred, while others want stable and fast (even if it isn't the fastest of the fast), but we all come here to discuss these things -- in all their many facets. We should be encouraging each other, not saying some of the stuff from the last page, because of their choice in components. The more benchers/gamers/media enthusiasts that there are, the better it is for all of us. *All of us* are what pushes these companies (ALL of them) to innovate and push cutting edge.
> 
> So, please try to be a little more accepting and not mock each other for our choice of hardware, because of what we value in our rigs.
> 
> Back on topic, these two video cards are awesome, and I can't wait to try 4k. Is anybody here pushing 4k monitors with these cards? Or even 780s/Titans?


Agreed









On the 4k note, @DeadlyDNA Has been running triple 4k Eyefinity with 4 of these beasts and he is very comprehensive in the testing









http://www.overclock.net/t/1481154/4k-eyefinity-crossfire-scaling-from-1-2-3-4-gpus-benchmarks/0_40

http://www.overclock.net/t/1487232/56k-warning-49-megapixel-gaming-test-warning-cell-phones-too/0_40


----------



## bak3donh1gh

Quote:


> Originally Posted by *Dasboogieman*
> 
> What is the OC, voltage, Memory type (Elpida, Samsung or Hynix) and ASIC quality of your chip?


Hynix and 78.8%. What's the best way to give you voltages? It's a PC PCS+. Thanks for the info.


----------



## Dasboogieman

hmmm ok

With that setup, you can expect about 1085MHz/1500MHz on stock voltage, and perhaps 1100MHz on the core and 1500MHz on the memory with +12mV. You will probably cap out on air cooling at 1200MHz core and 1600MHz memory at +175mV.


----------



## Devotii

Does the PC PCS+ R9 290 still have manufacturing problems? They are a good price, but I'm not sure whether it's worth the grief if it faults.


----------



## battleaxe

Quote:


> Originally Posted by *Zamoldac*
> 
> Count me in!
> 
> 
> 
> 
> 
> 
> 
> 
> R9 290 (BBA/Press Sample)/ Aquacomputer kryographics R9 290 black edition.


All that copper... I bet that's heavy as a tank... ?


----------



## Raephen

Quote:


> Originally Posted by *battleaxe*
> 
> All that copper... I bet that's heavy as a tank... ?


My 290 is cooled by an AC Kryographics block too, and it isn't that bad.


----------



## hokochu

Here is the link to my gpu-z validation

http://www.techpowerup.com/gpuz/6m9/

They were both ASUS cards with the stock cooler, but now they are rocking EK copper/acetal water cooling blocks. Keeps the temps nice and cool for those serious clocks.


----------



## kayan

Random Q, has anyone bought one of those new Swiftech water blocks for their 290/x?

Those things look sexy as hell! Even more attractive than the XSPC Razor blocks.


----------



## Roy360

I'm thinking of swapping my i5 3570K/ASUS Formula V for an Intel Core i7-3820/ASUS P9X79 (I'm hoping it's the Deluxe, but I don't think it is).

But I just realized that the P9X79 only has 3 PCIe slots.....

Will I see much of a difference if I upgrade? I am running 3 R9 290s (fourth still in the box), and two cards are running at 4x.

Otherwise, I'm just waiting for Haswell-E and for Intel Retail Edge for a deal on a CPU, and then I'll buy new RAM and a motherboard.


----------



## Roaches

Quote:


> Originally Posted by *Roy360*
> 
> I"m thinking of swapping my i5 3570k/ ASUS Formula V for a Intel Core i7-3820/ASUS P9X79.
> 
> But I just realized that the P9X79 only has 3 PCIE slots..... the only ASUS motherboard I can find with 4 slots costs over 500$, so its out of the question.
> 
> Will I see much of a difference if I upgrade? I am running 3 R9 290s(forth still in box), and two cards are running at 4x.
> 
> Otherwise, I"m just waiting for Haswell -E, for Intel Retail Edge for a deal on a CPU, and then buy new RAM and motherboard.


If you're looking for a sub-$500 ASUS board with 4 PCIe x16 slots, then the P9X79-WS fits the sub-$500 price range... heck, even the X79-E WS with a full 7 x16 slots costs less than a Rampage Extreme Black Edition.

http://www.newegg.com/Product/Product.aspx?Item=N82E16813131798

http://www.newegg.com/Product/Product.aspx?Item=N82E16813131971


----------



## eternal7trance

Quote:


> Originally Posted by *Roy360*
> 
> I"m thinking of swapping my i5 3570k/ ASUS Formula V for a Intel Core i7-3820/ASUS P9X79.
> 
> But I just realized that the P9X79 only has 3 PCIE slots..... the only ASUS motherboard I can find with 4 slots costs over 500$, so its out of the question.
> 
> Will I see much of a difference if I upgrade? I am running 3 R9 290s(forth still in box), and two cards are running at 4x.
> 
> Otherwise, I"m just waiting for Haswell -E, for Intel Retail Edge for a deal on a CPU, and then buy new RAM and motherboard.


It mostly depends on the clock difference between the 3570K and the 3820. The other difference would come from the higher-bandwidth PCIe slots, maybe 3-4 fps more.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Roy360*
> 
> I"m thinking of swapping my i5 3570k/ ASUS Formula V for a Intel Core i7-3820/ASUS P9X79.
> 
> But I just realized that the P9X79 only has 3 PCIE slots..... the only ASUS motherboard I can find with 4 slots costs over 500$, so its out of the question.
> 
> Will I see much of a difference if I upgrade? I am running 3 R9 290s(forth still in box), and two cards are running at 4x.
> 
> Otherwise, I"m just waiting for Haswell -E, for Intel Retail Edge for a deal on a CPU, and then buy new RAM and motherboard.


I would suggest waiting; I don't think there is enough difference to justify the cost and aggravation. My testing for 290s in my scaling thread compared 8x 2.0 vs. 8x 3.0. I can't do 4x that I am aware of, but it's probably like a 10% difference? Also, running 4x 290s may present some issues in games. All my scaling tests in most titles keep scaling beyond 3 cards, but I'm using really high resolutions. I will do 4K testing, I think, since that seems to be more sought after.


----------



## Roy360

Quote:


> Originally Posted by *Roaches*
> 
> If you're looking for a sub 500 ASUS board with 4 PCIeX16 Slots then the P9X79-WS fits the sub 500 dollar price range...heck even the X79-E WS with full 7 X16 Slots cost less than a Rampage Extreme Black Edition.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16813131798
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16813131971


Hmm, I was just looking at NCIX and Tiger Direct; completely forgot about Newegg.

Now the question is, will that $460 be worth it?
Quote:


> Originally Posted by *eternal7trance*
> 
> It mostly depends on your clock difference from the 3570k to the 3820. The other difference would come from the higher bandwidth PCIe slots maybe like 3-4 fps more.


The 3570K is at 5GHz, but it currently runs at 100% in every game I play. I've been told that my CPU is the bottleneck.


----------



## kizwan

Quote:


> Originally Posted by *Roy360*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roaches*
> 
> If you're looking for a sub 500 ASUS board with 4 PCIeX16 Slots then the P9X79-WS fits the sub 500 dollar price range...heck even the X79-E WS with full 7 X16 Slots cost less than a Rampage Extreme Black Edition.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16813131798
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16813131971
> 
> 
> 
> hmm, I was just looking at NCIX and Tigerdirect, completely forgot about Newegg.
> 
> NOw the question is, will that 460$ worth it
> Quote:
> 
> 
> 
> Originally Posted by *eternal7trance*
> 
> It mostly depends on your clock difference from the 3570k to the 3820. The other difference would come from the higher bandwidth PCIe slots maybe like 3-4 fps more.
> 
> Click to expand...
> 
> 
> 
> 
> 
> 3570k is at 5GHz, but it's currently runs at 100% in every game I play. I've been told that my CPU was the bottleneck
Click to expand...

At what resolution do you play your games?


----------



## Roaches

You can always go for the X79-UP4 if you want to save some cash but still have 4 slots for a 4-way SLI/CFX configuration.








Plus, the aesthetics are right up there. Mmmmm, black.









http://www.newegg.com/Product/Product.aspx?Item=N82E16813128562


----------



## Roy360

Quote:


> Originally Posted by *kizwan*
> 
> At what resolution you play the games?


Been testing at 1080p; however, I will be gaming at 5760x1080. The upgrade to LGA 2011 with the original mobo will be more or less free. It's just going to be a huge pain to swap motherboards in my watercooled rig.

P.S. LGA 2011 supports Sandy Bridge-E and Ivy Bridge-E, right? Only Haswell-E requires a new socket.


----------



## Roaches

Yes, existing LGA 2011 boards just need a BIOS flash in order to support Ivy Bridge-E.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Roaches*
> 
> You can always go for the X79-UP4 if you want to save some cash but still have 4 slots for a 4 way SLI/CFX configuration.
> 
> 
> 
> 
> 
> 
> 
> 
> Plus the Aesthetics are right up there, mmmmm black
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16813128562


Stay away from the UP4; its BIOS is unworkable for overclocking... I have one.


----------



## Agoniizing

If I get a lower score in Valley with my 290X memory at 1525 compared to 1500, does that mean I'm getting error correction?


----------



## heroxoot

Quote:


> Originally Posted by *Agoniizing*
> 
> If I get a lower score on valley with my 290x mem at 1525 compared to 1500, does that mean im getting error correction?


Errors or throttle.


----------



## Agoniizing

I'm pretty sure it's errors, because once I put it back to 1500 my score instantly jumped.
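For anyone else chasing this, the sweep-and-compare approach above can be sketched in a few lines: run the benchmark at several memory clocks, record the fps, and take the clock where the score peaks. This is a minimal illustration, not part of any tool; `best_memory_clock` and the (clock, fps) numbers are made up for the example.

```python
# Sketch: find the memory clock where benchmark scores stop improving.
# On GDDR5, a score that *drops* as the clock rises usually means error
# detection/retry is eating the extra bandwidth, not real instability.

def best_memory_clock(runs):
    """Return the (clock_mhz, fps) pair with the highest score.

    runs: list of (clock_mhz, fps) tuples from repeated Valley runs
    at otherwise identical settings.
    """
    return max(runs, key=lambda r: r[1])

# Hypothetical example numbers for a 290X memory sweep:
runs = [
    (1250, 54.0),  # stock
    (1400, 55.5),
    (1500, 56.8),
    (1525, 55.9),  # score drops: error correction likely kicking in
]

clock, fps = best_memory_clock(runs)
print(f"best clock: {clock} MHz at {fps} fps")  # -> best clock: 1500 MHz at 56.8 fps
```

The point is just to trust the score, not the absence of artifacts: back off to the last clock where fps still went up.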


----------



## AMDMAXX

Is this decent for my sig rig?

http://www.3dmark.com/3dm/3069886


----------



## DeadlyDNA

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Stay away from the UP4 its bios is unworkable for overclocking ........ I have one


I 2nd this. I have one of these, dead right now. All it did was lose BIOS settings, then crash and burn. I didn't even get to OC, because it wouldn't even work at defaults without acting erratically.

Hands down one of the worst mobos I have ever had. I wouldn't even give it to my worst enemy, it's that bad. Of course, YMMV.


----------



## Roaches

I guess that's why it's so cheap... I'll remind myself not to recommend it from here on out. It's a nice-looking board, though.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Roaches*
> 
> I guess thats why its so cheap...I'll remind myself not to recommend it from here on out. Its a nice looking board though.


It's funny you say that; I actually recommended it to a friend. I thought it was awesome looking, and it supported 4-way SLI/CF to boot. He has 4x GTX 570 cards. He got it and had nothing but issues with it. Then it died, and he RMA'd it. The one he got back was doing the same thing, so out of a guilty conscience I bought it from him and he got an RIVE. Well, I literally pulled my hair out with it and gave up on it. I think the board itself is physically sound; the BIOS was where the issue was. So they may have corrected this and moved on...


----------



## Roy360

The motherboard is the Deluxe.









Looks like this: 

and I can upgrade to it for free.

Now the question is: how can I fit 4 GPUs and a sound card in it? I have 16x risers, but I don't know if they are that flexible. Anyone know any simple tricks to make the cards single-slot? The case has 9 PCI slots, so no limitations on that front.

All my cards are watercooled with full cover blocks.


----------



## hotrod717

Sorry if this has been brought up recently, but does anyone know when ASUS will finally release the 290X Matrix to retail??? This has to be one of the longest announcement-to-retail gaps of theirs that I've seen. Revamping the cooler, possibly??? Jeeez!


----------



## Aussiejuggalo

What kinda temps do you guys see with full cover blocks on 290s?


----------



## Agoniizing

Why am I getting a low Valley score on my 290X? I'm averaging around 56 fps at 1115/1500.


----------



## pkrexer

Looks like I'm going CrossFire, lol. Wasn't really looking to do it, but how could I pass up $250 for a 290? It's a Gigabyte reference 290 that I'll be CrossFiring with my 290X. Only bummer is I need to pick up another waterblock.

Any unlockable Gigabytes out there? I'm crossing my fingers, but not getting my hopes up.


----------



## Enzarch

Does anyone else have stability issues with over-volting alone? I have a reference Gigabyte 290 (vBIOS ...3523) on 14.4, Win 8.1.1, and it seems going much beyond about +100mV, regardless of clocks or aux voltage, causes instability. This card fails in a way I have never seen before, whether from too high a clock or from this voltage issue: about half the image, in thin, even vertical bands, will shift left or right, making it appear like a garbled double image with black stripes, sometimes with pink bands on one edge; in extreme cases it'll black screen.

Either way, the driver never crashes/restarts and there are no typical over-overclocking artifacts; the image will just stay that way until I reboot.

I use Afterburner Beta 19 with force constant voltage. The card is under hybrid cooling and temps are great. PC Power 950W, single rail, so plenty of power.

I will test again with the 13.12 drivers. I am not sure if a different vBIOS could make a difference, as all but the Sapphire have the same version numbers. Gigabyte did release a new UEFI vBIOS recently that I may try.

This isn't a dealbreaker, as I can hit a stable 1150/1500 at +100mV, but I think this chip can go much further, as I was able to bench at 1250/1600 +165mV before it **** itself.

Sorry for the long post; I will make a new thread if needed. Suggestions?


----------



## Roy360

My first time playing a game on my R9 290s....










anyone know how to fix that?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Roy360*
> 
> My first time playing a game on my R9 290s....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> anyone know how to fix that?


Use splash?









Anyways, looks like the clocks are unstable.

Are your monitors connected via DVI? If so, then tick these boxes in CCC:


----------



## Agoniizing

Why is my Valley score low with my 290X at 1115/1500?


----------



## heroxoot

Quote:


> Originally Posted by *Agoniizing*
> 
> Why is my valley score low with my 290x at 1115/1500?


Windows 8. Try it on 7; I bet you will get 1-3 more fps, like I do. The difference for my rig is 49 fps on Windows 8 and 52 on 7 at 1100/1250.


----------



## rdr09

Quote:


> Originally Posted by *Agoniizing*
> 
> Why is my valley score low with my 290x at 1115/1500?


either Win8, the 14.x driver, or both. that is almost equal to my 290 at stock. what are your temps, core and VRMs?


----------



## Agoniizing

Quote:


> Originally Posted by *rdr09*
> 
> either Win8, 14 driver, or both. that is almost equal to my 290 at stock. what are your temps - core and vrms?


Core is 79°C and VRMs are under 70°C.


----------



## rdr09

Quote:


> Originally Posted by *Agoniizing*
> 
> Core is 79C and VRMs under 70C


looks fine. what driver? are you adding VDDC on the core? how about Power Limit?


----------



## Agoniizing

Quote:


> Originally Posted by *rdr09*
> 
> looks fine. what driver? are you adding VDDC on the core? how about Power Limit?


14.4 driver, +38mV, +50 power limit. I'm overclocking in AB Beta 19.


----------



## Mega Man

Quote:


> Originally Posted by *kayan*
> 
> Random Q, has anyone bought one of those new Swiftech water blocks for their 290/x?
> 
> Those things look sexy as h.ell! Even more attractive than the XSPC Razor blocks.


i bought 4.
fedex was supposed to deliver yesterday, and the retailer asked for a sig... so again they wait till no one is home to deliver (i swear it is a conspiracy... like they have a lookout and they are like "wait..... ok GOGOGOGOGOGO"), and even though i asked them this morning to deliver the dang thing a 2-min drive from my house... it takes them till tomorrow...... i mean, really???
Quote:


> Originally Posted by *Roy360*
> 
> The motherboard is the deluxe
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks like this:
> 
> and I can upgrade to it for free.
> 
> Now the question is, how can I fit 4 GPUs and a sound card in it. I have 16x risers, but I don't know if they are that flexible. Anyone know any simple tricks to make the cards single slot? The case has 9 PCI slots, so no limitations on that front
> 
> All my cards are watercooled with full cover blocks.


yes, buy 295xs !

on the bright side, i finally got my 4th card (5th really, but one didn't work). i think i may buy a fourth powercolor... as the back i/o is black, unlike sapphire's (sounds petty, but it really does look better, and when the covers match it just looks good!)... 3/4 of my cards are hynix!



@Arizonian does that work ?

http://www.techpowerup.com/gpuz/mwvyu/

http://www.techpowerup.com/gpuz/4rn4g/

http://www.techpowerup.com/gpuz/gxrb/

http://www.techpowerup.com/gpuz/7cd2h/


----------



## rdr09

Quote:


> Originally Posted by *Agoniizing*
> 
> 14.4 driver, +38mV, +50 power limit. Im overclocking in AB beta 19


with 14.4 driver i noticed that i have to add more volts for the same oc that i use for 13.11. a lot more volts . . . like +40 offset more. if you are not using mantle, i advise you try 13.12. i get lower scores with 14.4 in 3DMarks.

i use Trixx to oc.


----------



## Agoniizing

Quote:


> Originally Posted by *rdr09*
> 
> with 14.4 driver i noticed that i have to add more volts for the same oc that i use for 13.11. a lot more volts . . . like +40 offset more. if you are not using mantle, i advise you try 13.12. i get lower scores with 14.4 in 3DMarks.
> 
> i use Trixx to oc.


I don't use Mantle, so should I go back to 13.12? And is Trixx better to use?


----------



## rdr09

Quote:


> Originally Posted by *Agoniizing*
> 
> I dont use mantle so should I go back to 13.12? And is Trixx better to use?


it is up to you. try it. see if you can achieve the same oc without adding VDDC, just maxing the PL. I use Trixx 'cause it allows a +200 offset, which i used for suicide runs (1300/1600) last winter. again, it could be a combination of win8 and the 14.x driver. what brand of 290?


----------



## hwoverclkd

Quote:


> Originally Posted by *Agoniizing*
> 
> Why is my valley score low with my 290x at 1115/1500?


This seems a bit low... I got a lemon 290X but am still getting 69-ish at the same clocks (1115/1500).


----------



## Agoniizing

Quote:


> Originally Posted by *rdr09*
> 
> it is up to you. try it. see if you can achieve the same oc without adding VDDC just maxing the PL. I use Trixx 'cause it allows +200 offset, which i used for suicide runs (1300/1600) last winter. again, it could be a combination of win8 and 14 driver. what brand of 290?


What is PL? And I have a reference XFX 290X


----------



## rdr09

Quote:


> Originally Posted by *Agoniizing*
> 
> What is PL? And I have a reference XFX 290X


Sorry, PL is Power Limit. I thought you had one of those underperforming PowerColors.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agoniizing*
> 
> What is PL? And I have a reference XFX 290X


PL is powerlimit


----------



## Agoniizing

Quote:


> Originally Posted by *rdr09*
> 
> sorry, PL is Power Limit. I thought you have one of those underperforming powercolors.


Do I have to do anything in AMD OverDrive? Or can I just do everything in AB?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agoniizing*
> 
> Do I have to do anything in amd overdrive? Or can I just do everything in AB?


Disable AMD OverDrive and up the power limit in AB.


----------



## Ricey20

Been a while since I've posted here, but I take it the black screen bug still isn't fixed outside of 13.12? I installed 14.4 and can OC the RAM to 1500 and run Valley for hours, but after 15 minutes in a game it will black screen, even at 1255 (5MHz above stock).


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Disable AMD overdrive and up the powerlimit in AB,


^this. just AB or just Trixx.


----------



## rdr09

Quote:


> Originally Posted by *Ricey20*
> 
> Been awhile since I've posted here but I take it the black screen bug still isn't fixed outside of 13.12? Installed 14.4 and I can OC ram to 1500 and run valley for hours but after 15 mins in a game it will black screen, even at 1255 (5mhz above stock).


so, it does not black screen when the memory is at stock? if so, then leave it at stock. i read there is no benefit in oc'ing memory for games. i game with the gpu at stock; my i7 needs oc'ing, otoh.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> so, it does not bs when the memory is at stock? if so, then leave it at stock. i read there is no benefit in oc'ing memory for games. i game with the gpu at stock. my i7 needs oc'ing, otoh.


Agreed, OCing memory isn't that useful in the real world for these cards; going from the stock 1250MHz to 1500MHz (on both cards) I might gain an extra 2-3 fps.
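To put numbers on why that memory OC gains so little: the 290/290X's 512-bit bus and GDDR5's 4x-per-clock transfer rate are stock specs, so a quick back-of-the-envelope calculation (illustrative sketch only; `gddr5_bandwidth_gbps` is a made-up helper name) shows the raw bandwidth change:

```python
# Peak GDDR5 bandwidth on a 512-bit bus: clock (MHz) x 4 transfers/clock
# x bus width in bytes, converted to GB/s.

def gddr5_bandwidth_gbps(clock_mhz, bus_bits=512):
    """Peak memory bandwidth in GB/s for a GDDR5 card."""
    return clock_mhz * 4 * (bus_bits // 8) / 1000

stock = gddr5_bandwidth_gbps(1250)  # 320.0 GB/s (290/290X stock spec)
oc = gddr5_bandwidth_gbps(1500)     # 384.0 GB/s
print(f"{stock:.0f} -> {oc:.0f} GB/s ({(oc / stock - 1) * 100:.0f}% more)")
```

So a 20% bandwidth bump buying only 2-3 fps suggests these cards aren't bandwidth-starved at stock in most games, which matches the advice above to leave memory alone.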


----------



## DeadlyDNA

Try disabling tessellation in CCC for a Valley run, too. It will help your score.


----------



## Gil80

Quote:


> Originally Posted by *Roboyto*
> 
> The normal Windforce OC will accept EK's Rev. 2.0 blocks. I doubt the BF4 version is any different aside from including the game. Go to www.coolingconfigurator.com to check it out.
> 
> If you must be 100% certain before purchasing you can always e-mail Gigabyte or EK for validation of this fact.
> 
> It's not necessary to have identical cards, but it can potentially eliminate some issues you could run into.


Out of the EK waterblock list, only the rev 2 will fit?


----------



## Agoniizing

Quote:


> Originally Posted by *rdr09*
> 
> it is up to you. try it. see if you can achieve the same oc without adding VDDC just maxing the PL. I use Trixx 'cause it allows +200 offset, which i used for suicide runs (1300/1600) last winter. again, it could be a combination of win8 and 14 driver. what brand of 290?


I don't get it. I went to the 13.12 drivers and I'm still getting bad results. Every 290X Valley run I've seen is way better than mine, even when their OC is lower.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Agoniizing*
> 
> I dont get it, I went to 13.12 drivers and im still getting bad results. Every 290x valley run ive seen is way better than mine even when their OC is lower.


Can you give numbers for comparison? Can you verify GPU usage, and that it's not being throttled? Also, make sure to turn off any overlay during your bench. Disable tessellation maybe?


----------



## Arizonian

Quote:


> Originally Posted by *hokochu*
> 
> Here is the link to my gpu-z validation
> 
> http://www.techpowerup.com/gpuz/6m9/
> 
> They were both ASUS cards with the stock cooler but now they are rocking EK copper acetal water cooling blocks. keeps them temps nice and cool for that serious clock.


Congrats - added















Quote:


> Originally Posted by *Mega Man*
> 
> i bought 4
> fed ex was supposed to deliver yesterday, and the retailer asked for a sig ... so again they wait till no one is home to deliver ( i swear it is a conspiracy... like they have a look out and they are like " wait..... ok GOGOGOGOGOGO ") and even though i asked them this morning to deliver the dang thing 2 min drive from my house... it takes them till tomorrow...... i mean really ???
> yes, buy 295xs !
> 
> on the bright side i finally got my 4th card ( 5th really but one didnt work ) i think i may buy a fourth powercolor... as the back i/o is black and saphires ( sounds petty but it really does look better, and the covers when they match it just looks good ! ) .... not but ... 3/4 of my cards are hynix !
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> @Arizonian does that work ?
> 
> http://www.techpowerup.com/gpuz/mwvyu/
> 
> http://www.techpowerup.com/gpuz/4rn4g/
> 
> http://www.techpowerup.com/gpuz/gxrb/
> 
> http://www.techpowerup.com/gpuz/7cd2h/


I could have sworn I had added you a while back. LOL. I guess not because I had to check. Welcome aboard - congrats


----------



## Agoniizing

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Can you give numbers for comparison? Can you verify GPU usage, and that it's not being throttled? Also, make sure to turn off any overlay during your bench. Disable tessellation maybe?


GPU usage is 100%. This is a run at 1115/1500


----------



## kizwan

Quote:


> Originally Posted by *Roy360*
> 
> The motherboard is the deluxe
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks like this:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> and I can upgrade to it for free.
> 
> Now the question is, how can I fit 4 GPUs and a sound card in it. I have 16x risers, but I don't know if they are that flexible. Anyone know any simple tricks to make the cards single slot? The case has 9 PCI slots, so no limitations on that front
> 
> All my cards are watercooled with full cover blocks.


I don't think you can run 4 GPUs on that board. I could be wrong though. The second white PCIe x16 slot (x8 mode) seems to remain active/enabled even when the first white PCIe x16 slot (x8 mode) is occupied.


----------



## Forceman

Quote:


> Originally Posted by *Agoniizing*
> 
> GPU usage is 100%. This is a run at 1115/1500


Have you tried keeping the Afterburner monitoring window open while you run the test so you can see if the clock speeds are dropping? Maybe try it with stock memory settings - the memory overclock may be too high and it is losing performance because of error correction? What are your scores stock? Are you sure the power limit increase is taking? Sometimes setting it in AB doesn't work - you should be able to look in CCC and see if it is showing +50% there. Have you tried Heaven, or 3DMark to see if they also show low performance?


----------



## Agoniizing

Quote:


> Originally Posted by *Forceman*
> 
> Have you tried keeping the Afterburner monitoring window open while you run the test so you can see if the clock speeds are dropping? Maybe try it with stock memory settings - the memory overclock may be too high and it is losing performance because of error correction? What are your scores stock? Are you sure the power limit increase is taking? Sometimes setting it in AB doesn't work - you should be able to look in CCC and see if it is showing +50% there. Have you tried Heaven, or 3DMark to see if they also show low performance?


I just ran Heaven at 1110/1350 at stock volts and I still got a low score. I also had AB open and my core clock stayed at 1110 the whole time.


----------



## Forceman

Are you coming from an Nvidia card?

Did you reset the 3D application settings in CCC? Maybe you have something non-standard set there?


----------



## Agoniizing

Quote:


> Originally Posted by *Forceman*
> 
> Are you coming from a Nvidia card?
> 
> Did you reset the 3D application settings in CCC? Maybe you have something non-standard set there?



I'm on a fresh install of Windows, and everything is default in the 3D app settings.


----------



## Forceman

Have you tried 3DMark? Check the graphics score specifically - might tell you if it's the card or something else in your system. Otherwise maybe something else GPU specific, like ShaderToyMark or CompuBench?


----------



## ebhsimon

Validation link: http://www.techpowerup.com/gpuz/2mcvf/

Also, I freaking love this Vapor-X cooler.



what the frickety frack?! my msi twin frozr 7970 is not even in the same league. Running kombustor at stock my 7970 got to about 75C, and this cooler is SO much quieter.

That's a *15 minute kombustor run* at stock clock (1030/1400). Holy balls is all I can say. Maybe the fact that I don't put on my side panel helps? But it only ever helped by a few degrees.

*VRM1 temperature: 58C
GPU temperature: 60C*

Room temperature: 23C

EDIT: Lol... I think I know why Noctua closed up the sides of the NH-D15 instead of having it fully open like the D14. All the hot air from the GPU is basically flowing through the D14 CPU cooler, and as a result my CPU isn't being cooled very well. I went from 35-40 idle to 45-50 while running Kombustor with overclocks (1135/1580).


----------



## Agoniizing

Quote:


> Originally Posted by *Forceman*
> 
> Have you tried 3DMark? Check the graphics score specifically - might tell you if it's the card or something else in your system. Otherwise maybe something else GPU specific, like ShaderToyMark or CompuBench?


Here's my 3DMark 11 score. I dont know if this is good or bad for 3DMark 11.
http://www.3dmark.com/3dm11/8332792


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agoniizing*
> 
> Here's my 3DMark 11 score. I dont know if this is good or bad for 3DMark 11.
> http://www.3dmark.com/3dm11/8332792


Graphics score is pretty much spot on with mine at those clocks: http://www.3dmark.com/3dm11/7423142


----------



## Agoniizing

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Graphics score is pretty much spot on with mine at those clocks: http://www.3dmark.com/3dm11/7423142


Did you run stretched or centered?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agoniizing*
> 
> Did you run stretched or centered?


Tbh I really don't remember; that was when I had an 8150 and a 290X in my rig. It was most likely the default though.


----------



## kizwan

Quote:


> Originally Posted by *Agoniizing*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Graphics score is pretty much spot on with mine at those clocks: http://www.3dmark.com/3dm11/7423142
> 
> 
> 
> Did you run stretched or centered?
Click to expand...

Are you running centered? That's because you didn't enable GPU scaling in CCC (Digital Flat Panel section), but it shouldn't affect the 3DMark score. I ran some tests & found it only affected the score in Crossfire. Can you set the clock to 1000/1300 & run 3DMark 11 again? If not, I'll set mine to 1110/1350 & run 3DMark 11 this arvo. I have a 290, but we'll be able to see whether you get a correct score.


----------



## Roboyto

Quote:


> Originally Posted by *Gil80*
> 
> Out of the EK waterblock list, only the rev 2 will fit?


I would get the Rev 2.0 just to be sure. They changed the height/location of caps/chokes near the PCIe power connectors if I remember what the pics looked like.



https://www.facebook.com/EKWaterBlocks/photos/a.204208322966540.61821.182927101761329/633154450071923/?type=1


----------



## Durvelle27

I have a question. Will i lose performance by crossfiring a R9 290 with R9 290X ?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> I have a question. Will i lose performance by crossfiring a R9 290 with R9 290X ?


What do you mean by lose performance?

If you CF a 290X with a 290, they will run at their respective clocks, and games/apps that support CF will use both cards.


----------



## BradleyW

Quote:


> Originally Posted by *Durvelle27*
> 
> I have a question. Will i lose performance by crossfiring a R9 290 with R9 290X ?


No, you won't really see a performance difference between 290X CFX and 290 + 290X CFX. They practically perform the same since they are basically the same card. At most a 2-3 fps difference in a very GPU-demanding game.


----------



## Durvelle27

Quote:


> Originally Posted by *BradleyW*
> 
> No, you won't really see a performance different between 290X CFX and 290 + 290X CFX. They practically perform the same since they are basically the same card. At most a 2-3c difference on a very GPU demanding game.


Alright thx bud. Guess its time for an upgrade lol


----------



## blue1512

Quote:


> Originally Posted by *Durvelle27*
> 
> I have a question. Will i lose performance by crossfiring a R9 290 with R9 290X ?


Quote:


> Originally Posted by *BradleyW*
> 
> No, you won't really see a performance different between 290X CFX and 290 + 290X CFX. They practically perform the same since they are basically the same card. At most a 2-3c difference on a very GPU demanding game.


When you CF a 290 with a 290X, the performance will be exactly like 290 CF. It's even worse if you can't keep them cool, because the 290X runs hotter than the 290.
So the better choice is to buy another 290 for 290 CF. Or, if your cooling is good, sell the 290 and get a couple of 290Xs for the best CF performance.


----------



## Sgt Bilko

Quote:


> Originally Posted by *blue1512*
> 
> When you CF 290 with a 290x, the performance will be exactly like 290 CF. It is even worse in case you can't keep them cool, because 290x is hotter than 290.
> So the better choice is to buy another 290 for 290 CF. Or if your cooling is good, sold the 290 and get a couple of 290X for the best CF performance


How will it be like 290 CF?

He has a 290x and was thinking of adding a 290 alongside it.

Performance would be better than 290 CF but a little slower than 290x CF due to the 290x having more shaders.


----------



## blue1512

Quote:


> Originally Posted by *Sgt Bilko*
> 
> How will it be like 290 CF?
> 
> He has a 290x and was thinking of adding a 290 alongside it.
> 
> Performance would be better than 290 CF but a little slower than 290x CF due to the 290x having more shaders.


CF mirrors everything, from RAM to shaders. So when you CF a 290 with a 290X, only 2560 of the 290X's 2816 shaders will be used. The performance therefore will be the same as 290 CF.


----------



## BradleyW

Quote:


> Originally Posted by *blue1512*
> 
> CF mirrors everything, from RAM to shader. So when you CF a 290 to 290x, only 2560 of 2816 shaders from 290x will be used. The performance therefore will be the same as 290 CF.


Assuming the game is using more than 2560 shaders.


----------



## Sgt Bilko

Quote:


> Originally Posted by *blue1512*
> 
> CF mirrors everything, from RAM to shader. So when you CF a 290 to 290x, only 2560 of 2816 shaders from 290x will be used. The performance therefore will be the same as 290 CF.


AFAIK you can still run the 2 cards at their respective core and mem speeds and nothing else would change.

So that means the shaders would still be unlocked as well, I guess.


----------



## BradleyW

The only issue with mismatched cards is that frame latency can increase slightly.


----------



## Durvelle27

Quote:


> Originally Posted by *blue1512*
> 
> When you CF 290 with a 290x, the performance will be exactly like 290 CF. It is even worse in case you can't keep them cool, because 290x is hotter than 290.
> So the better choice is to buy another 290 for 290 CF. Or if your cooling is good, sold the 290 and get a couple of 290X for the best CF performance


The 290 is to go along side with my 290X. Not willing to part with it just for 2x 290s


----------



## nightfox

Quote:


> Originally Posted by *BradleyW*
> 
> The only issue with mismatch cards is that latency can slightly increase a little.


interesting. could you possibly explain more?


----------



## blue1512

Quote:


> Originally Posted by *Sgt Bilko*
> 
> afaik you can still run 2 cards at their respectable core and mem speeds and nothing else would change.
> 
> So that means the shaders would still but unlocked as well i guess.


Yes, the 2816 shaders of the 290X are still there, but only 2560 of them are used when you put it in CF with a 290. That's how CF has been working for ages.
The cards can be set to different core/mem speeds separately, but that will lead to some serious latency and stuttering. The best way is to mirror everything though.


----------



## BradleyW

Quote:


> Originally Posted by *nightfox*
> 
> interesting. could you possibly explain more?


Well, think about it. Two cards rendering at the same time, each slightly different from the other in terms of ICs, traces, resistors, architectural differences, and IC timings + clock speeds. The faster card will always wait for the slower card. That "could" slightly increase frame time latency.
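To make the "faster card waits" point concrete, here is a toy model of unpaced alternate-frame rendering with mismatched cards. This is purely illustrative - it is not AMD's actual frame-pacing logic:

```python
# Toy AFR model: frames alternate between two cards with different render
# times, so presented frame times oscillate unless frame pacing holds both
# to the slower card's rate. Illustrative only - not AMD's real scheduler.
def afr_frame_times_ms(t_card_a, t_card_b, frames=6):
    """Unpaced AFR: even frames come from card A, odd frames from card B."""
    return [t_card_a if i % 2 == 0 else t_card_b for i in range(frames)]

times = afr_frame_times_ms(14.0, 16.0)
jitter = max(times) - min(times)
print(times, f"-> {jitter:.1f} ms of frame-time jitter")
```

The oscillation is what shows up as microstutter; pacing trades it away by holding everything to the slower card's frame time.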


----------



## blue1512

Quote:


> Originally Posted by *BradleyW*
> 
> Well think about it. Two cards rendering at the same time. Both cards slightly different from one another in terms of IC's, tracers, resistors, architectural differences and IC timings + clock speeds. The faster card will always wait for the slower card. That "could" slighty increase frame time latency.


CF is not exactly like that. It's actually is "two set of same shaders rendering at the same time". Please do some research about pipeline processing so that you could have better understanding about these stuffs.
Quote:


> Originally Posted by *Durvelle27*
> 
> The 290 is to go along side with my 290X. Not willing to part with it just for 2x 290s


The performance will be the same as 2x 290s. But the good thing is you can sell your 290 and get another 290X later.


----------



## hotrod717

Quote:


> Originally Posted by *blue1512*
> 
> Yes, the 2816 shaders of the 290X are still there, but only 2560 of them are used when you put it in CF with a 290. *That's how CF has been working for ages.*
> The cards can be set to different core/mem speeds separately, but that will lead to some serious latency and stuttering. The best way is to mirror everything though.


This. I don't want to offend anybody, but it's interesting to see people giving so many opinions when they don't know this very basic stuff.


----------



## BradleyW

Quote:


> Originally Posted by *blue1512*
> 
> CF is not exactly like that. It's actually is "two set of same shaders rendering at the same time". Please do some research about pipeline processing so that you could have better understanding about these stuffs.
> The performance is will be the same as 2x 290s. But the good thing is you can sell your 290 and get another 290x later


I'm trying to explain in an easier way. Of course I understand pipelines. I'm a Computer Science student. I believe you were meant to say "about this stuff".


----------



## the9quad

Quote:


> Originally Posted by *blue1512*
> 
> CF is not exactly like that. It's actually is "two set of same shaders rendering at the same time". Please do some research about pipeline processing so that you could have better understanding about these stuffs.
> The performance is will be the same as 2x 290s. But the good thing is you can sell your 290 and get another 290x later


Quote:


> Originally Posted by *hotrod717*
> 
> This. Don't want to offend anybody, but it's interesting to see people giving so many opinions, but don't know this very basic stuff.


Sorry to burst your bubbles (kind of) - from the HardOCP review:

"The big question is; do the two 1018MHz GPUs on the AMD Radeon R9 295X2 downclock to 1000MHz since our XFX Radeon R9 290X DD video card runs at 1000MHz? The answer is a big fat NO. *Indeed, we checked in every game we tested and the two GPUs on the AMD Radeon R9 295X2 maintained a consistent 1018MHz clock speed while the XFX Radeon R9 290X DD maintained a consistent 1000MHz clock speed. This means each video card was running at its maximum performance. No throttling and no downclocking*."

AMD CrossFire doesn't work like SLI - shaders might be lower, but clock speeds are what they are.

http://www.hardocp.com/article/2014/05/13/amd_radeon_r9_295x2_xfx_290x_dd_trifire_review#.U3YmqvldWip
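The shader-count side of this can be put in rough numbers. A minimal sketch, assuming the reference specs (2816 shaders @ 1000MHz for the 290X, 2560 @ 947MHz for the 290) and 2 FLOPs per shader per clock - whether the 290X's extra shaders actually get used in mixed CF is exactly what is being debated above:

```python
# Theoretical FP32 throughput per card, assuming reference specs and
# 2 FLOPs (one FMA) per shader per clock. Nominal figures, not benchmarks.
def gflops(shaders, core_mhz):
    return shaders * 2 * core_mhz / 1000.0

r9_290x = gflops(2816, 1000)  # 5632.0 GFLOPS
r9_290 = gflops(2560, 947)    # ~4849 GFLOPS
print(f"290X: {r9_290x:.0f} GFLOPS, 290: {r9_290:.0f} GFLOPS "
      f"({(r9_290x / r9_290 - 1) * 100:.0f}% gap)")
```

Either way the theoretical gap between the two cards is on the order of 15-16%, which is why a mixed pair lands between 290 CF and 290X CF in practice.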


----------



## Sgt Bilko

Quote:


> Originally Posted by *the9quad*
> 
> sorry to burst your bubbles(kind of) from hardocp review
> 
> "The big question is; do the two 1018MHz GPUs on the AMD Radeon R9 295X2 downclock to 1000MHz since our XFX Radeon R9 290X DD video card runs at 1000MHz? The answer is a big fat NO. *Indeed, we checked in every game we tested and the two GPUs on the AMD Radeon R9 295X2 maintained a consistent 1018MHz clock speed while the XFX Radeon R9 290X DD maintained a consistent 1000MHz clock speed. This means each video card was running at its maximum performance. No throttling and no downclocking*."
> 
> amd crossfire doesnt work like SLI, shaders might be lower but clock speeds are what they are.
> 
> http://www.hardocp.com/article/2014/05/13/amd_radeon_r9_295x2_xfx_290x_dd_trifire_review#.U3YmqvldWip


^ This

That's what I meant - 290 and 290X CF would be different though due to shaders; guess you could just clock the 290 higher to compensate?


----------



## Durvelle27

Quote:


> Originally Posted by *blue1512*
> 
> CF is not exactly like that. It's actually is "two set of same shaders rendering at the same time". Please do some research about pipeline processing so that you could have better understanding about these stuffs.
> The performance is will be the same as 2x 290s. But the good thing is you can sell your 290 and get another 290x later


That's too much of a hassle.


----------



## hotrod717

Quote:


> Originally Posted by *the9quad*
> 
> sorry to burst your bubbles(kind of) from hardocp review
> 
> "The big question is; do the two 1018MHz GPUs on the AMD Radeon R9 295X2 downclock to 1000MHz since our XFX Radeon R9 290X DD video card runs at 1000MHz? The answer is a big fat NO. *Indeed, we checked in every game we tested and the two GPUs on the AMD Radeon R9 295X2 maintained a consistent 1018MHz clock speed while the XFX Radeon R9 290X DD maintained a consistent 1000MHz clock speed. This means each video card was running at its maximum performance. No throttling and no downclocking*."
> 
> amd crossfire doesnt work like SLI, shaders might be lower but clock speeds are what they are.
> 
> http://www.hardocp.com/article/2014/05/13/amd_radeon_r9_295x2_xfx_290x_dd_trifire_review#.U3YmqvldWip


This doesn't even reference what I quoted. ???? The discussion was about crossfiring a 290/290X, not the 295X2.


----------



## the9quad

Quote:


> Originally Posted by *hotrod717*
> 
> This doesn't even reference what I quoted. ???? Discussion was xfiring with 290/290X, not 295x2.


Well, I apologize - I didn't research your post (aka read it thoroughly), just went all multi-quote crazy! The same principle applies though: the core and mem speeds stay what they are; the only difference is in the # of shaders being used, which is probably what you were saying.


----------



## Rozayz

I'm using an R9 290X at the moment, which I purchased specifically for my 4K [Samsung U28D590D] setup - but it won't even run BF4 at "Ultra" settings; generally I have to turn off MSAA to sustain anything above 50 fps. I would imagine grabbing another card and running them in CF would demolish BF4.

Currently I have 2 [Samsung U28D590D], but I'm only running one (the other is setup with another rig).

I'm contemplating buying another and using three of them together.

My question is, should I buy another R9 290X and CF them for the triple (or even double, both @ 60 Hz) 4k setup? I will only ever be gaming using the 1 display, but having to run x2 displays at 30Hz using HDMI puts me off even for non-gaming use.

Thoughts? H4LP.


----------



## the9quad

Quote:


> Originally Posted by *Rozayz*
> 
> I'm using an R9 290X at the moment, which I purchased specially for my 4k [Samsung U28D590D] setup - but it won't even run BF4 at "Ultra" settings, generally I have to turn off MSAA to sustain anything above 50 fps. I would imagine grabbing another card and running them in CF would anal BF4.
> 
> Currently I have 2 [Samsung U28D590D], but I'm only running one (the other is setup with another rig).
> 
> I'm contemplating buying another and using three of them together.
> 
> My question is, should I buy another R9 290X and CF them for the triple (or even double, both @ 60 Hz) 4k setup? I will only ever be gaming using the 1 display, but having to run x2 displays at 30Hz using HDMI puts me off even for non-gaming use.
> 
> Thoughts? H4LP.


Well, no single card is going to push 4K @ 60 fps with ultra settings, period. As you can see here, the fastest single card (which is actually a dual-GPU card) is barely getting 60 fps. So yeah, you're going to need to crossfire to hit 60 fps, and more than likely trifire (which is the sweet spot) to stay above it. Good news is that used 290Xs are cheap now.


----------



## Rozayz

Quote:


> Originally Posted by *the9quad*
> 
> well no single card is going to push 4k @ 60 fps with ultra settings period.


That's kind of my point. So +1 R9 290X and CF seems like a feasible path.

Heh. That stealth edit.
Quote:


> Originally Posted by *the9quad*
> well no single card is going to push 4k @ 60 fps with ultra settings period.
> 
> two 290x's pull about 70 fps USING MANTLE at 4k
> Three pull about 82 FPS using mantle,


So is there even a point in going CF R9 290Xs? Or just give up until 4K gaming is attainable for non-millionaires?
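The Mantle figures quoted above (70 fps on two 290Xs, 82 fps on three) also give a feel for how far CrossFire scaling falls from linear; a quick calculation using only those two numbers:

```python
# Fraction of ideal linear scaling realized when adding GPUs, computed
# from the fps figures quoted above (no other numbers assumed).
def scaling_efficiency(fps_before, cards_before, fps_after, cards_after):
    actual_speedup = fps_after / fps_before
    ideal_speedup = cards_after / cards_before
    return actual_speedup / ideal_speedup

eff = scaling_efficiency(70, 2, 82, 3)
print(f"2 -> 3 cards: {eff:.0%} of linear scaling")  # roughly 78%
```

Diminishing returns like this are why trifire tends to be the sweet spot and a fourth card rarely pays off.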


----------



## Sgt Bilko

Quote:


> Originally Posted by *Rozayz*
> 
> I'm using an R9 290X at the moment, which I purchased specially for my 4k [Samsung U28D590D] setup - but it won't even run BF4 at "Ultra" settings, generally I have to turn off MSAA to sustain anything above 50 fps. I would imagine grabbing another card and running them in CF would anal BF4.
> 
> Currently I have 2 [Samsung U28D590D], but I'm only running one (the other is setup with another rig).
> 
> I'm contemplating buying another and using three of them together.
> 
> My question is, should I buy another R9 290X and CF them for the triple (or even double, both @ 60 Hz) 4k setup? I will only ever be gaming using the 1 display, but having to run x2 displays at 30Hz using HDMI puts me off even for non-gaming use.
> 
> Thoughts? H4LP.


http://www.overclock.net/t/1481154/4k-eyefinity-crossfire-scaling-from-1-2-3-4-gpus-benchmarks/0_40

Have a read









Personal opinion: get another 290x and possibly a 3rd depending on graphics settings.


----------



## Durvelle27

Quote:


> Originally Posted by *Rozayz*
> 
> I'm using an R9 290X at the moment, which I purchased specially for my 4k [Samsung U28D590D] setup - but it won't even run BF4 at "Ultra" settings, generally I have to turn off MSAA to sustain anything above 50 fps. I would imagine grabbing another card and running them in CF would anal BF4.
> 
> Currently I have 2 [Samsung U28D590D], but I'm only running one (the other is setup with another rig).
> 
> I'm contemplating buying another and using three of them together.
> 
> My question is, should I buy another R9 290X and CF them for the triple (or even double, both @ 60 Hz) 4k setup? I will only ever be gaming using the 1 display, but having to run x2 displays at 30Hz using HDMI puts me off even for non-gaming use.
> 
> Thoughts? H4LP.


That's the only reason I'm thinking of getting a second R9 290 and running CFX - just for 4K. Plus, I see 290s going for $270 now, so I might run an R9 290X + 2x R9 290s, or R9 290X + R9 290.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> The only reason i'm thinking of getting a second R9 290 and run CFx. Just for 4K and plus i see 290s going for $270 now so i might run R9 290X + 2x R9 290s or R9 290X + R9 290


@HOMECINEMA-PC is running a 290X + 2 R9 290s in tri-fire atm.

From memory, that 290X is a 290 that unlocked, so he would be the best source of real-world info and experience here.


----------



## Rozayz

Not interested in Eyefinity, just wanted to deal with FPS issues whilst gaming on a single 4k monitor.









I guess I also wondered: would one DP 1.2 port per 4K monitor be preferred, or do people find HDMI/30Hz okay outside of gaming?
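The DP 1.2 vs HDMI question comes down to link bandwidth. A rough sketch, assuming 24-bit color and ignoring blanking overhead; the ~8.16 and ~17.28 Gbps payload figures are the nominal limits of HDMI 1.4 and 4-lane DP 1.2 (HBR2) after 8b/10b coding:

```python
# Uncompressed pixel data rate vs. nominal link payloads (blanking ignored).
def pixel_rate_gbps(width, height, refresh_hz, bits_per_pixel=24):
    return width * height * refresh_hz * bits_per_pixel / 1e9

hdmi_14_gbps = 10.2 * 0.8   # ~8.16 Gbps payload after 8b/10b coding
dp_12_gbps = 21.6 * 0.8     # ~17.28 Gbps payload after 8b/10b coding

need_60 = pixel_rate_gbps(3840, 2160, 60)  # ~11.9 Gbps
need_30 = pixel_rate_gbps(3840, 2160, 30)  # ~6.0 Gbps
print(f"4K60 needs {need_60:.1f} Gbps: fits HDMI 1.4? {need_60 <= hdmi_14_gbps}, "
      f"fits DP 1.2? {need_60 <= dp_12_gbps}")
print(f"4K30 needs {need_30:.1f} Gbps: fits HDMI 1.4? {need_30 <= hdmi_14_gbps}")
```

In short, HDMI 1.4 simply doesn't have the payload for 4K60, which is why these panels only do 60Hz over DP 1.2.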


----------



## Sgt Bilko

Quote:


> Originally Posted by *Rozayz*
> 
> Not interested in Eyefinity, just wanted to deal with FPS issues whilst gaming on a single 4k monitor.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I guess I also wondered, would x1 DP 1.2 port per 4k monitor be preferred, or do people find HDMI/30Hz okay outside of gaming?


It should give you a good indication though - for a single 4K screen there are reviews all over the place with these cards:

http://www.tweaktown.com/tweakipedia/32/sapphire-radeon-r9-290x-tri-x-benchmarked-at-4k/index.html

http://hexus.net/tech/reviews/graphics/61629-amd-radeon-r9-290x-vs-nvidia-geforce-gtx-titan-4k/

http://www.eteknix.com/4k-gaming-showdown-amd-r9-290x-crossfire-vs-nvidia-gtx-780-ti-sli/

http://www.legitreviews.com/xfx-radeon-r9-290-crossfire-video-card-review-at-4k-ultra-hd_139418

etc etc


----------



## DeadlyDNA

Quote:


> Originally Posted by *Rozayz*
> 
> That's kind of my point. So +1 R9 290X and CF seems like a feasible path.
> 
> Heh. That stealth edit.
> So is there even a point in going CF R9 290X's? or just give up until 4k gaming is optimizable for non-millionaires?


Keep in mind you don't have to max settings at 4K - it really looks better just on its own. I bet if I made a poll on which screenshot was low, medium, high, etc., people probably wouldn't get it right that often.


----------



## Tokkan

I have to say, I am really satisfied that people are now giving correct info regarding crossfire rather than the simple false statements of just a few weeks ago, when they all yelled that the cards would run at the same speed.
The closest explanation to what really happens is:
Quote:


> Originally Posted by *BradleyW*
> 
> Well think about it. Two cards rendering at the same time. Both cards slightly different from one another in terms of IC's, tracers, resistors, architectural differences and IC timings + clock speeds. The faster card will always wait for the slower card. That "could" slighty increase frame time latency.


It can be seen through GPU usage. (The card "waiting")

Hope that now all the crossfire lies that have been spread will be cleansed from OCN.


----------



## Tokkan

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Keep in mind, you don't have to max settings at 4k, it really looks better just on its own. I bet if i make a poll on what screenshot was low,medium.high.etc... people would probably not get it right that much.


Please do, I'd like to check the results








Poll for several games too! BF4, BF3, Thief, Metro, Crysis 3 are some of the names that come to mind.

Edit: Sorry for double post


----------



## lolwatpear

can anyone tell me how they're liking their gigabyte windforce r9 290?


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> @HOMECINEMA-PC Is running a 290x + 2 R9 290's in Tri-fire atm.
> 
> from memory the 290x is a 290 that unlocked so he would be the best source of info here for real world info and experience


I'll shoot him a PM


----------



## sinnedone

Quote:


> Originally Posted by *Rozayz*
> 
> Not interested in Eyefinity, just wanted to deal with FPS issues whilst gaming on a single 4k monitor.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I guess I also wondered, would x1 DP 1.2 port per 4k monitor be preferred, or do people find HDMI/30Hz okay outside of gaming?


Just wondering, but do you even see the difference aa makes at 4k?


----------



## DeadlyDNA

Quote:


> Originally Posted by *sinnedone*
> 
> Just wondering, but do you even see the difference aa makes at 4k?


For me personally, yes and no. Some games have horrible jaggies; others, maybe through post-processing, not so much. Overall it's way less noticeable than at lower resolutions. In my own testing I'm finding AA reduces the quality, so I leave it off in pretty much all my testing, and in most cases I have to look for aliasing. I think if you have, say, the 24-inch Dell 4K, it's probably even harder to see.


----------



## weebeast

Well, I installed my 290X a couple of days ago. I formatted my PC yesterday and haven't had any problems. Battlefield 4 works great at 1440p.

The card is so silent hahaha, and it stays at a max of 70 degrees.

I also tested the card with Valley - let it run for 20 minutes. No artifacts, so I'm very happy with the card.


----------



## motokill36

Hi All

Just got an R9 290 off eBay.
It runs OK but has coil whine.
Is this a bad sign or just one of those things?
Funny the seller never mentioned it.


----------



## weebeast

Almost all my cards have had coil whine, to be honest. It should be fine.


----------



## motokill36

Quote:


> Originally Posted by *weebeast*
> 
> Almost all my cards have had coil whine, to be honest. It should be fine.


Once the fans spin up you can't hear it, but I was going to put it under water, so it would get really annoying.









Thanks for reply


----------



## bencher

I am in guys







This thing is so loud, omg.

72C on the core after playing Crysis 3 for a while at 60% fan speed.


----------



## motokill36

Quote:


> Originally Posted by *bencher*
> 
> I am in guys
> 
> 
> 
> 
> 
> 
> 
> . This thing is so loud omg.
> 
> 72c on core after playing crysis 3 for a while with 60% fan speed.


Welcome

LOL Yep they are Loud for sure lol


----------



## Agoniizing

Quote:


> Originally Posted by *kizwan*
> 
> Are you running centered? That's because you didn't enable GPU scaling in CCC (Digital Flat Panel section), but it shouldn't affect the 3DMark score. I ran some tests & found it only affected the score in Crossfire. Can you set the clock to 1000/1300 & run 3DMark 11 again? If not, I'll set mine to 1110/1350 & run 3DMark 11 this arvo. I have a 290, but we'll be able to see whether you get a correct score.


Here I ran it at 1000/1300 like you said. I still think this is a bit low for a 290X.


----------



## rdr09

Quote:


> Originally Posted by *bencher*
> 
> I am in guys
> 
> 
> 
> 
> 
> 
> 
> . This thing is so loud omg.
> 
> 72c on core after playing crysis 3 for a while with 60% fan speed.


that's Trueaudio.


----------



## the9quad

Quote:


> Originally Posted by *rdr09*
> 
> that's Trueaudio.


thats pretty funny


----------



## rdr09

Quote:


> Originally Posted by *the9quad*
> 
> thats pretty funny


i know exactly where bencher is coming from. i used my 290 with the stock cooler for about a week. i heard the fan at around 55%. to me it wasn't that bad really, but it was loud.

anyways, i've been recommending the 290 a lot 'cause of my own personal experience in the games i play with it. other than a few issues with the 14.x drivers it has been flawless, solid, and these . . .



i've seen better scores but mine are not too bad. Tess off, all at 1320/1620. best of all - no blackscreen until past 1620.


----------



## the9quad

Quote:


> Originally Posted by *rdr09*
> 
> i know exactly where bencher is coming from. i used my 290 with the stock cooler for about a week. i heard the fan at around 55%. to me it wasn't that bad really but it was loud.


Try running three 290x's on stock coolers.......you don't know the meaning of loud!


----------



## motokill36

Any one running crossfire on 14.4 Catalyst ?


----------



## kpoeticg

I was, but one of my cards completely died on me. So now i'm back down to one during RMA


----------



## the9quad

Quote:


> Originally Posted by *motokill36*
> 
> Any one running crossfire on 14.4 Catalyst ?


yes I am


----------



## motokill36

ok thanks ill give it a go


----------



## rdr09

Quote:


> Originally Posted by *the9quad*
> 
> Try running three 290x's on stock coolers.......you don't know the meaning of loud!


lol. wth? how can you stand those? must be located in another room.


----------



## Faster_is_better

Some of these used 290's are becoming enticingly cheap. Is it worth it to even bother going for a non-reference card if I intend to watercool? I'm not sure the custom-PCB cards are that much better (other than the really expensive models).


----------



## rdr09

Quote:


> Originally Posted by *Faster_is_better*
> 
> Some of these used 290's are becoming enticingly cheap. Is it worth it to even bother going for a non reference if I intend to watercool? I'm not sure if the custom PCB cards are that much better (other than the really expensive models).


reference if watercooling.

http://www.overclock.net/t/1489992/two-r9-290-reference-and-two-r9-290-tri-x


----------



## the9quad

Quote:


> Originally Posted by *rdr09*
> 
> lol. wth? how can you stand those? must be located in another room.


headphones always.


----------



## DeadlyDNA

Quote:


> Originally Posted by *the9quad*
> 
> headphones always.


i had 3 on stock coolers for a week or two before i went water cooling. these are the loudest video cards i have ever owned. Callsignvega had a video clip of his vacuum measuring lower dB than his 290x cards. i think he had 2 of them.


----------



## IBIubbleTea

Not sure what is going on....

So I finished leak testing my first WC build. My R9 290 is making a weird noise whenever it hits 90% load or higher. It sounds like something vibrating very quickly near the power plug area. I didn't have this before when I was on air. What is going on?


----------



## 250Gimp

Quote:


> Originally Posted by *DeadlyDNA*
> 
> i had 3 on stock coolers for a week or two before i went water cooling. these are the loudest video cards i have even owned. Callsignvega had a video clip of his vacuum being less DB than his 290x cards. i think he had 2 of them.


I didn't even make it a week with reference cooler before putting a G10 on it.

Core temps are good, just waiting on Gelid VRM heatsinks before cranking it up!

Cheers


----------



## VSG

Quote:


> Originally Posted by *IBIubbleTea*
> 
> Not sure what is going on....
> 
> So I finished leak testing my first wc build. My r9 290 is making this weird noise when it starts to have 90% load or higher. It sounds like something is vibrating very quickly near the power plug area. I didnt have this before when I was on air. What is going on?


Coil whine?


----------



## Dhalgren65

Answer for fasterisbetter

I went with the MSI R9 290 4G Gaming (x2) as soon as there were full-cover blocks for them.
I have a preference for their PCBs & components...
I have never used Gigabyte's before, and was not finding compatibility info for HIS fast enough.
I heard some less than flattering things about PowerColor...
Would never AGAIN go XFX under any circumstances...
Additional custom features almost never hurt unless they preclude block fitment...


----------



## IBIubbleTea

Quote:


> Originally Posted by *geggeg*
> 
> Coil whine?


I think it is coil whine but I don't really know why I have it as it wasn't there when I was on air.


----------



## Mega Man

Quote:


> Originally Posted by *blue1512*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> How will it be like 290 CF?
> 
> He has a 290x and was thinking of adding a 290 alongside it.
> 
> Performance would be better than 290 CF but a little slower than 290x CF due to the 290x having more shaders.
> 
> 
> 
> CF mirrors everything, from RAM to shader. So when you CF a 290 to 290x, only 2560 of 2816 shaders from 290x will be used. The performance therefore will be the same as 290 CF.
Click to expand...

Absolutely incorrect. They do not mirror the shaders. They can have different shader counts and clock speeds.
They have done this since at least the 79xx series.
Quote:


> Originally Posted by *IBIubbleTea*
> 
> Not sure what is going on....
> 
> So I finished leak testing my first wc build. My r9 290 is making this weird noise when it starts to have 90% load or higher. It sounds like something is vibrating very quickly near the power plug area. I didnt have this before when I was on air. What is going on?


Quote:


> Originally Posted by *geggeg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *IBIubbleTea*
> 
> Not sure what is going on....
> 
> So I finished leak testing my first wc build. My r9 290 is making this weird noise when it starts to have 90% load or higher. It sounds like something is vibrating very quickly near the power plug area. I didnt have this before when I was on air. What is going on?
> 
> 
> 
> Coil whine?
Click to expand...

Yes. And you did have it. You just could not hear it

Please forgive all auto correct mistakes


----------



## sinnedone

custom fan profile? Maybe you couldn't hear it before?


----------



## Mirob0t

Hey guys

i just got my Sapphire R9 290X Tri-X version and overclocked it to 1150/1550, but i noticed that the VRM temp is reaching 100°C
do you guys know what the max VRM temp is on this gpu?
also if u guys have better overclock settings, feel free to share with me

peace


----------



## IBIubbleTea

Quote:


> Originally Posted by *Mega Man*
> 
> Yes. And you did have it. You just could not hear it
> 
> Please forgive all auto correct mistakes


Really? I swear I didn't hear it, my fan profile wasn't that high....


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> How will it be like 290 CF?
> 
> He has a 290x and was thinking of adding a 290 alongside it.
> 
> Performance would be better than 290 CF but a little slower than 290x CF due to the 290x having more shaders.


This is true, but not by much performance-wise.


----------



## airisom2

Just to correct something roboyto posted in his super post a couple pages back:



The top left phase area is for PLL (trio of mosfets), which is responsible for handling PCIe voltage (it also includes the large rectangular choke I forgot to encircle in the picture). The phase on the top of the group of 6 phases is memory (it's separated a bit more than the other 5), and the last 5 phases are for the core.

VRM2 shows the PLL VRM temp, and VRM1 shows the core/memory VRM temps.


----------



## hwoverclkd

Quote:


> Originally Posted by *Agoniizing*
> 
> Here I ran it at 1000/1300 like you said. I still think this is a bit low for a 290X.


Couple of things:
- what does your core clock graph look like? voltages?
- cpu?
- could you try using a different gpu port, e.g. DisplayPort? I know it's very unlikely to be the root cause, but just for troubleshooting's sake.
- it might be a good idea to log GPU-Z sensor data to a file and upload it so we can take a look
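Since GPU-Z's "Log to file" option writes a plain CSV of sensor samples, a quick script can pull the min/max out of a benchmark run to spot throttling. This is just a sketch: the column names and values below are assumptions based on typical GPU-Z output, and would need to match the header of your own log file.

```python
# Minimal sketch: summarize a GPU-Z sensor log (CSV, one sample per line).
# Column names here are assumptions -- check them against your own log.
import csv
import io

# Stand-in for a real log; in practice use open("GPU-Z Sensor Log.txt").
SAMPLE_LOG = """\
GPU Core Clock [MHz],GPU Memory Clock [MHz],GPU Temperature [C],VRM Temperature [C],VDDC [V]
1000.0,1250.0,74,88,1.180
947.0,1250.0,90,101,1.141
1000.0,1250.0,85,96,1.172
"""

def summarize(log_text):
    """Return a dict of (min, max) for each numeric column in the log."""
    reader = csv.DictReader(io.StringIO(log_text))
    stats = {}
    for row in reader:
        for name, value in row.items():
            v = float(value)
            lo, hi = stats.get(name, (v, v))
            stats[name] = (min(lo, v), max(hi, v))
    return stats

stats = summarize(SAMPLE_LOG)
# A core clock dipping below the set 1000 MHz alongside high VRM temps
# is the classic sign of throttling dragging down a 3DMark score.
for name, (lo, hi) in stats.items():
    print(f"{name}: min {lo}, max {hi}")
```

A dip in the core clock column that lines up with a VRM temperature spike points at throttling rather than a driver or CPU issue.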


----------



## ace1ndahole

Hey guys, my Sapphire 290 Vapor-X sometimes makes a pretty irritating buzzing noise, and I just RMA'd it for a replacement on Newegg. Can you tell me if it was the right thing to do?
I read somewhere that the buzzing noise could come from a power supply choke, but the thing is I have a Seasonic SS-760 Platinum, one of the best power supplies on the market right now, so I don't think that is the cause. However, I'm still not sure, because the power requirement for the R9 290 is relatively high at a recommended 750 W.

Can you tell me what you guys think the cause is? I will be getting my replacement next week, and hopefully it was just a bad card and not my PSU lacking juice.


----------



## blue1512

Quote:


> Originally Posted by *Dhalgren65*
> 
> Answer for fasterisbetter
> 
> I went w/MSI R9 2904G Gaming (2x) as soon as there were fullcovers for them
> I have a pref for their PCB 's & components...
> I have never used Giga's before,and was not finding compatibility jnfo for HIS fast enough .
> I heard some less than flattering things about Powercolor...
> Would never AGAIN go XFX under any circumstances...
> Add'l custom features almost never hurt unless they preclude block fitment....


The MSI Gaming 290/290x uses a weaker VRM, which leads to lower voltage on the memory. You will have a hard time pushing them up.
The good news is you can flash a reference BIOS onto them; the best ones are the Asus and Sapphire. But the weak VRM on the MSI Gaming needs to be cooled properly, because it can't stand more than 110°C like the VRMs on reference cards can.
Quote:


> Originally Posted by *Mirob0t*
> 
> Hey guys
> 
> i just got my Sapphire r9 290x tri x version and overclocked it to 1150/1550 but i noticed that the vrm temp is reaching 100
> do you guys know waht the max vrm temp is on this gpu?
> also if u guys have better overclock settings.. feel free to share with me
> 
> peace


The Tri-X uses the reference board, so like I just mentioned, 110°C is still safe for them. They will make the card throttle a bit at that temp though, so it is not recommended. Having a 100°C component in your case is not a good idea.


----------



## hwoverclkd

Quote:


> Originally Posted by *ace1ndahole*
> 
> Hey guys, my Sapphire 290 Vapor-X sometimes makes a pretty irritable buzz noise and I just RMA it for a replacement on Newegg. Can you tell me if it was the right thing to do?
> I read somewhere that it buzzing noise could have come from power supply choke but the thing is I have Seasonic SS760 Watts Platinum, one of the best power supply on the market as of now so I don't think that is the cause of it, however, I'm still not sure because the power requirement for the R9 290 is relatively high at 750Watts.
> 
> Can you tell me what you guys think the cause is? I will be getting my replacement next week and hopefully it was just a bad card and not my PSU not having enough juice.


coil whine i guess? It depends. I would treat it like an 'allergy'... to some folks it's very annoying, while others don't really mind (especially if you use headphones most of the time). I think it'll always be there; it's just a matter of how loud it gets. That's one of the reasons why i returned one of my gpu cards, btw. The replacement card I got is better.


----------



## blue1512

Quote:


> Originally Posted by *ace1ndahole*
> 
> Hey guys, my Sapphire 290 Vapor-X sometimes makes a pretty irritable buzz noise and I just RMA it for a replacement on Newegg. Can you tell me if it was the right thing to do?
> I read somewhere that it buzzing noise could have come from power supply choke but the thing is I have Seasonic SS760 Watts Platinum, one of the best power supply on the market as of now so I don't think that is the cause of it, however, I'm still not sure because the power requirement for the R9 290 is relatively high at 750Watts.
> 
> Can you tell me what you guys think the cause is? I will be getting my replacement next week and hopefully it was just a bad card and not my PSU not having enough juice.


On OCN there is a fix for coil whine on the 7970. The idea is simple and works for every card: "Let the card whine continuously until that noise stops on its own"

http://www.overclock.net/t/1259672/7970-coil-whine-a-way-to-fix-it-solved/10


----------



## ace1ndahole

My ears are pretty sensitive to high-pitched noise. I love this card to death because it's completely silent 90 percent of the time, but sometimes it just buzzes like crazy, and that's unacceptable for something that costs $450. I know it's my 290 because my 7850 never had any buzz or whine at all. Here's hoping it's not my PSU, because I just bought it







.

#blue1512

The sound isn't coil whine but a very irritating buzzing.


----------



## ebhsimon

Quote:


> Originally Posted by *ace1ndahole*
> 
> My ears are pretty sensitive to high pitch noise. I love this card to death because it's complete silent 90 percent of the time but sometimes it just buzz like crazy and that's unacceptable for something that costs $450. I know it's my 290 because my 7850 never had any buzz or whine at all. Here's hoping it's not my PSU because I just bought it
> 
> 
> 
> 
> 
> 
> 
> .
> 
> #blue1512
> 
> The sound isn't coil whine but a very irritable buzzing.


Yep, coil whine isn't actually a constant whine but a high-pitched, piercing buzzing sound. I haven't noticed it yet on my Vapor-X 290, so I think you might have to RMA yours. I've been running Kombustor pretty heavily.

At what frame rates are you getting coil whine?
Actually, just for you, I'll try loading some menu screens. My 7970 used to get coil whine above 1000 fps, then it developed into whining at anything over 70% load lol.

EDIT: Nope, no coil whine at over 400 fps. I don't want to push it to the point where it develops coil whine like my old 7970 did, so I'm good with this.


----------



## yawa

This was my biggest fear BTW. That I'd drop $600 on a 290X, put it underwater, and have to suffer through horrific amounts of coil whine.

Luckily my Diamond 290X is a pretty great overclocker, runs extremely smoothly, and is quiet as a mouse in my loop.

I'd RMA that thing ASAP.


----------



## KyGuy

Just thought I would drop in and say that Newegg has ASUS and PowerColor open-box 290Xs for $435 and $415 respectively. Pretty good deal if you ask me. Probably black-screeners that have been refurbished.


----------



## Mega Man

Quote:


> Originally Posted by *ace1ndahole*
> 
> Hey guys, my Sapphire 290 Vapor-X sometimes makes a pretty irritable buzz noise and I just RMA it for a replacement on Newegg. Can you tell me if it was the right thing to do?
> I read somewhere that it buzzing noise could have come from power supply choke but the thing is I have Seasonic SS760 Watts Platinum, one of the best power supply on the market as of now so I don't think that is the cause of it, however, I'm still not sure because the power requirement for the R9 290 is relatively high at 750Watts.
> 
> Can you tell me what you guys think the cause is? I will be getting my replacement next week and hopefully it was just a bad card and not my PSU not having enough juice.


It is OK to RMA it. However, there is no way the manufacturer could anticipate it. Many factors affect it: incoming power (wall voltage), PSU, mobo, etc. You can take the same card and put it into another rig and get no coil whine. It is just physics and luck
Quote:


> Originally Posted by *blue1512*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ace1ndahole*
> 
> Hey guys, my Sapphire 290 Vapor-X sometimes makes a pretty irritable buzz noise and I just RMA it for a replacement on Newegg. Can you tell me if it was the right thing to do?
> I read somewhere that it buzzing noise could have come from power supply choke but the thing is I have Seasonic SS760 Watts Platinum, one of the best power supply on the market as of now so I don't think that is the cause of it, however, I'm still not sure because the power requirement for the R9 290 is relatively high at 750Watts.
> 
> Can you tell me what you guys think the cause is? I will be getting my replacement next week and hopefully it was just a bad card and not my PSU not having enough juice.
> 
> 
> 
> On OCN there is a fix for coil whine in 7970. The idea is simple and works for every card: "Let the card whine continuously until that noise stops on its own"
> 
> http://www.overclock.net/t/1259672/7970-coil-whine-a-way-to-fix-it-solved/10
Click to expand...

Not a 100% fix though
Quote:


> Originally Posted by *ace1ndahole*
> 
> My ears are pretty sensitive to high pitch noise. I love this card to death because it's complete silent 90 percent of the time but sometimes it just buzz like crazy and that's unacceptable for something that costs $450. I know it's my 290 because my 7850 never had any buzz or whine at all. Here's hoping it's not my PSU because I just bought it
> 
> 
> 
> 
> 
> 
> 
> .
> 
> #blue1512
> 
> The sound isn't coil whine but a very irritable buzzing.


Not everyone can hear it. Others can. And as I said, there are a lot of variables that can affect it


----------



## HOMECINEMA-PC

Dudes check it .........

HOMECINEMA-PC [email protected]@2400 WB TRI [email protected]@1454 *31215* Tess off









http://www.3dmark.com/3dm11/8335447
One more push and I reckon P31300


----------



## Red1776

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Dudes check it .........
> 
> HOMECINEMA-PC [email protected]@2400 WB TRI [email protected]@1454 *31215*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8335447
> One more push and I rekon P31300


good god....you're taking over


----------



## ace1ndahole

Ok, so I ran Tomb Raider benchmark on lowest setting and got around 450 FPS average. There is definitely a buzzing when I click on the game or when doing the benchmark. I will try to make a video of it for you guys to see.

EDIT: Here's the link to the video I just made. http://goo.gl/JGtXhp


----------



## Mega Man

well guys, finally 30lbs of watercooling waiting for meh!!! Komodo pics inside, however they turned out very dark.... ill retake later


Spoiler: Warning: Spoiler!





*sigh* look at that sexay REAL chrome. just sexay......























and just because ( China auto show )


----------



## Faster_is_better

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Faster_is_better*
> 
> Some of these used 290's are becoming enticingly cheap. Is it worth it to even bother going for a non reference if I intend to watercool? I'm not sure if the custom PCB cards are that much better (other than the really expensive models).
> 
> 
> 
> reference if watercooling.
> 
> http://www.overclock.net/t/1489992/two-r9-290-reference-and-two-r9-290-tri-x
Click to expand...

Ya I saw those.. great deal lol
Quote:


> Originally Posted by *Dhalgren65*
> 
> Answer for fasterisbetter
> 
> I went w/MSI R9 2904G Gaming (2x) as soon as there were fullcovers for them
> I have a pref for their PCB 's & components...
> I have never used Giga's before,and was not finding compatibility jnfo for HIS fast enough .
> I heard some less than flattering things about Powercolor...
> Would never AGAIN go XFX under any circumstances...
> Add'l custom features almost never hurt unless they preclude block fitment....


I was really interested in the ASUS DCII cards, but there seems to be unanimous hate for them in here over their RMA process and support. Thanks for the info though.


----------



## ace1ndahole

People hate the Asus 290 because it doesn't have any heatsink on the VRM, and it feels like cheap, lazy work from Asus.


----------



## sinnedone

Quote:


> Originally Posted by *Mega Man*
> 
> well guys. finally 30lbs of watercooling waiting for meh !!! komodo piics inside, however they turned out very dark.... ill retake later
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> *sigh* look at that sexay REAL chrome. just sexay......
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and just because ( China auto show )


Definitely want to see how this looks mounted. I'd like to see the finish of the backplate as well. Congrats to you mega man, them some mighty fine looking blocks from what I've seen.


----------



## ebhsimon

Quote:


> Originally Posted by *ace1ndahole*
> 
> Ok, so I ran Tomb Raider benchmark on lowest setting and got around 450 FPS average. There is definitely a buzzing when I click on the game or when doing the benchmark. I will try to make a video of it for you guys to see.
> 
> EDIT: Here's the link to the video I just made. http://goo.gl/JGtXhp


I can't hear it. My MSI 7970 was clearly audible over my voice in videos. Crazy. I'll upload it now for you; it's WAY worse than yours.

EDIT: @ace1ndahole here's a video I took probably about 6 months ago: 




Turn the sound up for this. I ramped up the fans to show the noise wasn't from the fans (and they do sound like turbines, terrible I know). I could not even hear yours.


----------



## ace1ndahole

It's pretty noticeable in person, but to some it might just be a minor gripe. You can hear it if you turn up the volume during the clicking part.
My phone isn't the best recording device for this, haha, sorry about the quality.

@ebhsimon

AOT









Yeah that's some crazy buzz, sounds like a hive of insects got trapped in your computer case haha.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Red1776*
> 
> good god....you're taking over


I have a bit left to go

Quote:


> Originally Posted by *Mega Man*
> 
> well guys. finally 30lbs of watercooling waiting for meh !!! komodo piics inside, however they turned out very dark.... ill retake later
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> *sigh* look at that sexay REAL chrome. just sexay......
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and just because ( China auto show )


Frothin








They look sooo chunky


----------



## ebhsimon

Quote:


> Originally Posted by *ace1ndahole*
> 
> AOT
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah that's some crazy buzz, sounds like a hive of insect got trap in your computer case haha.


It used to be AOT, but I recently changed it to One Piece.

I did love the speed of the 7970, but the constant coil whine every time i started up a game was incredibly annoying, so I sold it, took a leap of faith and hopped on the Vapor-X 290. I love it. I actually keep my side panel off.
If you see my post here: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/22700#post_22273346 you'll see how the VaporX performs. Not bad at all.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Red1776*
> 
> good god....you're taking over


HOMECINEMA-PC [email protected]@2400 WB TRI 290 @ [email protected] *31505







*

http://www.3dmark.com/3dm11/8335781
I smashed it real good


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> HOMECINEMA-PC [email protected]@2400 WB TRI 290 @ [email protected] *31505
> 
> 
> 
> 
> 
> 
> 
> *
> 
> http://www.3dmark.com/3dm11/8335781
> I smashed it real good


Not bad man,









you keep making me want to buy more stuff you know that right?


----------



## ace1ndahole

How did you get such a nice overclock on your card? I see artifacts at only 1100/1500, and that's not even a 10% overclock. Can I see your Afterburner settings?


----------



## phantomowl

Is it ok to crossfire r9 290 with a r9 270x or 280x?


----------



## Sgt Bilko

Quote:


> Originally Posted by *phantomowl*
> 
> Is it ok to crossfire r9 290 with a r9 270x or 280x?


Nope


----------



## Arizonian

Quote:


> Originally Posted by *bencher*
> 
> I am in guys
> 
> 
> 
> 
> 
> 
> 
> . This thing is so loud omg.
> 
> 72c on core after playing crysis 3 for a while with 60% fan speed.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added








Quote:


> Originally Posted by *Mega Man*
> 
> well guys. finally 30lbs of watercooling waiting for meh !!! komodo piics inside, however they turned out very dark.... ill retake later
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> *sigh* look at that sexay REAL chrome. just sexay......
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and just because ( China auto show )


Congrats - updated







Nice
Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> HOMECINEMA-PC [email protected]@2400 WB TRI 290 @ [email protected] *31505
> 
> 
> 
> 
> 
> 
> 
> *
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8335781
> I smashed it real good


sure did......great clocks all around. Way to represent.









@kayan you're updated to crossfire XFX 290X under water.


----------



## kayan

Quote:


> Originally Posted by *Mega Man*
> 
> well guys. finally 30lbs of watercooling waiting for meh !!! komodo piics inside, however they turned out very dark.... ill retake later
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> *sigh* look at that sexay REAL chrome. just sexay......
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and just because ( China auto show )


Those coolers are so sexy. I may grab one for my 2nd 290x, instead of the matching Razor. Let us know how it performs!


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Not bad man,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> you keep making me want to buy more stuff you know that right?


Yerp it does that to ya









Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - updated
> 
> 
> 
> 
> 
> 
> 
> Nice
> sure did......great clocks all around. Way to represent.


This now means that I own 3DMark 11 Performance with single, CF and tri on HWBOT
http://hwbot.org/hardware/videocard/radeon_r9_290/









And I topped it off with my best single card score

HOMECINEMA-PC [email protected]@2428 WB 290 [email protected] *21595* Tess off











http://www.3dmark.com/3dm11/8336159 CL 9 on the ram as well


----------



## kizwan

Quote:


> Originally Posted by *Mega Man*
> 
> well guys. finally 30lbs of watercooling waiting for meh !!! komodo piics inside, however they turned out very dark.... ill retake later
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> *sigh* look at that sexay REAL chrome. just sexay......
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and just because ( China auto show )


Komodo blocks looks very nice. Bulky, I like it!







Are these going into your CaseLabs MAGNUM M8 case?
Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> good god....you're taking over
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> HOMECINEMA-PC [email protected]@2400 WB TRI 290 @ [email protected] *31505
> 
> 
> 
> 
> 
> 
> 
> *
> 
> http://www.3dmark.com/3dm11/8335781
> I smashed it real good
Click to expand...

Nice score madman!








Quote:


> Originally Posted by *ace1ndahole*
> 
> How did you get such a nice overclock on your card, I see artifact at only 1100/1500 that's not even a 10% overclock. Can I see your AfterBurner setting?


If you're referring to @HOMECINEMA-PC, he is using the ASUS PT1 VBIOS. With ASUS GPU Tweak you can push more voltage to your card and therefore achieve a higher overclock. Even with the stock VBIOS & using AB/Trixx, I'm pretty sure you can push more than 1100 on the core.


----------



## Mercy4You

Just got an email from AMD, they are increasing the availability and lowering the prices of the R9 series...


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yerp it does that to ya
> 
> 
> 
> 
> 
> 
> 
> 
> This now means that I own Mk 11 P with Single , CF and TRI on HWBOT
> http://hwbot.org/hardware/videocard/radeon_r9_290/
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And I tops it off with my best single card score
> 
> HOMECINEMA-PC [email protected]@2428 WB 290 [email protected] *21595* Tess off
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8336159 CL 9 on the ram as well


you are a madman. congrats!


----------



## ebhsimon

I went through the list and just realised there are only three people who have VaporX 290s.


----------



## Masochryst

First time posting here, and I have a question I'd love to get answered. I recently got a reference Asus R9 290. At first I was worried about the thermal performance, that it would be noisy and overheat left and right. I'm pleased to find that after more than an hour of Crysis 3 action, with the fan no faster than 50%, it reached a maximum temperature of 90°C. My question is, I read that if you remove the end plate of the card (the bracket that bolts it to the case), temperatures can be reduced by as much as 7°C. Can anyone confirm that temperatures improve decently by modding the card that way?

Sorry if this has already been discussed, but this thread is quite massive and I haven't been able to get much from Google.


----------



## Gobigorgohome

Hello, I will soon be getting my 3x Asus DCUII OC R9 290s at home together with the ASUS RIVBE and an i7-3930K, and I will probably go for another R9 290 in a month or so. I will mostly be doing 4K gaming and benchmarks plus some workstation programs. Will I run into trouble with quad R9 290s and the 3930K (overclocked slightly, like 4.2 GHz)? I plan on watercooling the R9 290s and the CPU in a little while as well.

My other worry is my PSU, which is the EVGA 1300W G2. Will it be enough, or should I get something like the LEPA G1600?


----------



## the9quad

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Hallo, I will soon get my 3x Asus DCUII OC R9 290s to my home together with the ASUS RIVBE and i7-3930K, i will probably go for another R9 290 in a month or so. I will mostly be doing 4K gaming and benchmarks and some workstation programs. Will I run into trouble with the quad R9 290 and the 3930K (overclocked slightly, like 4,2 Ghz)? I plan on watercool the R9 290s and the CPU in a little while aswell.
> 
> My other worry is my PSU which is the EVGA 1300W G2, will it be enough or should i get something like the Lepa G1600?


Either get two power supplies or the 1600W unit for quad. 1200-1300W is barely enough for 3. I'm on reference cooling with two hard drives and an overclocked 4930K and pull 1100W at the wall. I imagine adding pumps and any OC on my cards would put me over the capacity of my power supply. So yeah, 1600 watts for room to breathe.
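The back-of-the-envelope math above can be sanity-checked with a quick sketch. The per-component wattages here are ballpark assumptions (roughly 300 W per overclocked 290, 200 W for an overclocked hex-core, 100 W for drives, fans and pumps), not measured figures; plug in your own numbers.

```python
# Rough system power budget for a multi-GPU build, as a PSU sizing check.
# All wattages are ballpark assumptions, not measured draws.
def estimate_draw(num_gpus, watts_per_gpu=300, cpu_watts=200, rest_watts=100):
    """Estimated DC load in watts for the whole system."""
    return num_gpus * watts_per_gpu + cpu_watts + rest_watts

def headroom(psu_watts, load_watts):
    """Fraction of PSU capacity left unused at this load."""
    return 1.0 - load_watts / psu_watts

# Quad 290: 4 x 300 + 200 + 100 = 1500 W, leaving ~6% headroom on 1600 W.
load = estimate_draw(4)
print(load, round(headroom(1600, load), 2))
```

With these assumptions, a 1300 W unit would be running essentially at its limit for quad cards before any further overclocking, which is why the 1600 W (or dual-PSU) route gives room to breathe.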


----------



## alancsalt

My LEPA only lasted six months. PCCASEGEAR no longer stocks them. When I applied for a warranty repair they just gave me new-price credit ($AU430). That is not normal here in Oz, not for me. Some of my returns have been harder fought than the Battle of Verdun.

I got the LEPA 1600W thinking it might supply enough juice for my 3930K and quad 580 setup, but it would fail in the same place as my SilverStone 1500W. The plugs and sockets look kinda el cheapo too. So I feel it's no better than the SilverStone, just higher priced.


----------



## pkrexer

The corsair AX1500i should be out soon and looks beastly.

http://www.corsair.com/en-us/landing/ax1500i


----------



## hwoverclkd

Quote:


> Originally Posted by *ace1ndahole*
> 
> Ok, so I ran Tomb Raider benchmark on lowest setting and got around 450 FPS average. There is definitely a buzzing when I click on the game or when doing the benchmark. I will try to make a video of it for you guys to see.
> 
> EDIT: Here's the link to the video I just made. http://goo.gl/JGtXhp


Man i couldn't hear anything bad. Unless that clicking noise wasn't you clicking your mouse button


----------



## hwoverclkd

Quote:


> Originally Posted by *Mercy4You*
> 
> Just got an email from AMD, they are increasing the availability and lowering the prices of the R9 series...


Same here. I think they're just trying to control the prices, as they were driven up by other market conditions. I'm sure you know what i'm referring to here, right


----------



## ABADY

The Gigabyte R9 290 costs $380 on Amazon. Is it worth it over the Tri-X?

http://www.amazon.com/gp/product/B00HS84DFU/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=ATVPDKIKX0DER


----------



## bencher

Quote:


> Originally Posted by *ABADY*
> 
> gigabyte r9 290 costs 380$ on amazon . is it worth it over the tri-x ?


Can you post a link to what you are talking about?


----------



## ABADY

Quote:


> Originally Posted by *bencher*
> 
> Can you post a link to what you are talking about?


http://www.amazon.com/gp/product/B00HS84DFU/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=ATVPDKIKX0DER


----------



## bencher

Quote:


> Originally Posted by *ABADY*
> 
> http://www.amazon.com/gp/product/B00HS84DFU/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=ATVPDKIKX0DER


From what I am seeing the Tri-x is more expensive.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ABADY*
> 
> gigabyte r9 290 costs 380$ on amazon . is it worth it over the tri-x ?
> 
> http://www.amazon.com/gp/product/B00HS84DFU/ref=ox_sc_act_title_1?ie=UTF8&psc=1&smid=ATVPDKIKX0DER


Yes, for that price you'd be mad to get a Tri-X over the WF3.

They cool roughly the same.


----------



## ABADY

Quote:


> Originally Posted by *bencher*
> 
> From what I am seeing the Tri-x is more expensive.


Yeah, I'm not talking about the Tri-X. I was thinking of getting the Tri-X (great cooling at a good price), but since Gigabyte's dropped to $380, is it worth it over the Tri-X?


----------



## bencher

Quote:


> Originally Posted by *ABADY*
> 
> yeah am not talking about the tri-x . i was thinking of getting the tri-x ( great cooling at good price ) but since gigabyte's dropped to 380$ is it worth it over the tri-x ?


All the non-reference cards have great cooling. Just get the cheapest one with good VRM cooling. It seems like Gigabyte is it now.


----------



## hwoverclkd

So there were 4 Sapphire R9 290X Vapor-X (1080/1410) in stock at Amazon, which sold out in just one day. I hope whoever bought them will post their feedback here.


----------



## Gobigorgohome

Quote:


> Originally Posted by *alancsalt*
> 
> My LEPA only lasted six months. PCCASEGEAR no longer stock them. When I applied for warranty repair they just gave me new price credit ($AU430). That is not normal here in OZ, not for me. Some of my returns have been harder fought than the battle of Verdun.
> 
> I got the LEPA1600W thinking it might supply enough juice for my 3930k and quad 580 setup, but it would fail in the same place as my SilverStone 1500W. The plugs and sockets look kinda el cheapo too. So I feel it no better than the Silverstone, just higher priced.


So? Which power supply should I use for the quad R9 290s then? I will probably try my EVGA 1300W G2 (based on the Xtreme PSU calculator, 1300W should be enough for a 3930K, 2x SSD, 3x HDD, 3x pumps, 13 fans and four R9 290s, which came out to around 1265 watts). That's almost 1300 watts, but I guess it will last a few months until I buy a new PSU that's a bit quieter and has some more wattage. What about the Enermax Platimax?

Reason I don't want to do dual PSUs is mostly because I am going with custom watercooling, and the 900D doesn't support the hardware I need (watercooling and components) with 2x PSUs.


----------



## the9quad

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Soo? Which powersupply should I use for the quad R9 290s then? I will probably try my EVGA 1300W G2 (based on the Xtreme PSU calculator 1300W should be enough for 3930K, 2x SSD, 3x HDD, 3x Pumps, 13 fans and four R9 290s ... which was supposed to use around 1265 watts, it is almost 1300 watts but I guess it will last a few months until I buy a new PSU that are a bit quieter and have som more wattage. What about Enermax Platimax?
> 
> Reason I don't want to do dual PSUs is mostly because I am going with custom watercooling, and the 900D doesn't support the hardware I need (watercooling and components) with 2x PSUs.


Is the Rosewill Hercules any good? Also, that Corsair AX1500i Titanium power supply is getting good reviews and looks awesome.

http://www.guru3d.com/articles_pages/corsair_ax1500i_psu_review,1.html


----------



## kpoeticg

Anybody around right now have a RIVE BE? I just got a new board back from RMA and did a fresh Windows install. Now my card's acting buggy again; it wasn't on the ASRock X79 Extreme6 I was using during the RMA.

I'm getting the screen freeze with audio skip/loop and a lot of screen tearing. I haven't had any black screens while I was around, though it seems like one happened while I was sleeping. I think it could be the Realtek audio drivers for the board. I like the ROG Realtek GUI, but obviously my 290X working is more important to me.

The card was working great on the other board. Anybody able to use the RIVE BE Realtek with 14.4?


----------



## ABADY

Quote:


> Originally Posted by *bencher*
> 
> All the Non reference cards have great cooling. just get the cheapest one with good vrm cooling. It seem like gigabyte is it now.


Done. It's only 14 days until I get it.


----------



## Gobigorgohome

Quote:


> Originally Posted by *the9quad*
> 
> is the rosewill hercules any good? also that Corsair AX1500i Titanium Power Supply is getting good reviews and looks awesome
> 
> http://www.guru3d.com/articles_pages/corsair_ax1500i_psu_review,1.html


I would actually like to own a Rosewill Hercules (I don't know about the stability though), but that PSU is not available in my country.

The Corsair AX1500i seems like a good PSU, but I have my own thoughts on Corsair's quality after receiving the Corsair 900D (which in NO WAY can be compared to Cooler Master cases when it comes to paint quality, build quality and price). Anyway, the price tag, which Guru3D listed at 450 USD, will be something like 570 USD here (or more, probably more), which is kind of silly.

I guess that leaves me with the option of getting a second PSU.


----------



## bencher

Quote:


> Originally Posted by *ABADY*
> 
> done its only 14 days until i get it .


----------



## Mercy4You

Quote:


> Originally Posted by *acupalypse*
> 
> So there were 4 sapphire r9 290x vapor-x (1080/1410) in stock at amazon which were immediately sold out in just 1 day. I hope whoever bought them would post their feedback here


Of course, it's a very tempting card even for Tri-X owners like me








- 3 fans, of which 2 only run above 65°C
- backplate
- adapted PCB, chokes and capacitors...

But is it wise to buy a card to last 5 years, I wonder.


----------



## motokill36

Quote:


> Originally Posted by *IBIubbleTea*
> 
> I think it is coil whine but I don't really know why I have it as it wasn't there when I was on air.


You just could not hear it, maybe.
Mine does the same.
Works fine though.


----------



## ebhsimon

Quote:


> Originally Posted by *acupalypse*
> 
> So there were 4 sapphire r9 290x vapor-x (1080/1410) in stock at amazon which were immediately sold out in just 1 day. I hope whoever bought them would post their feedback here


I've got one, what do you need?


----------



## DeadlyDNA

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Soo? Which powersupply should I use for the quad R9 290s then? I will probably try my EVGA 1300W G2 (based on the Xtreme PSU calculator 1300W should be enough for 3930K, 2x SSD, 3x HDD, 3x Pumps, 13 fans and four R9 290s ... which was supposed to use around 1265 watts, it is almost 1300 watts but I guess it will last a few months until I buy a new PSU that are a bit quieter and have som more wattage. What about Enermax Platimax?
> 
> Reason I don't want to do dual PSUs is mostly because I am going with custom watercooling, and the 900D doesn't support the hardware I need (watercooling and components) with 2x PSUs.


I am using a LEPA GS1600W now myself for my 4x 290s. It's been rock solid for me. I haven't capped it out yet, but I've been just under 1600 watts at the wall. Shilka is a knowledgeable PSU guy around these forums. Maybe he can help guide you?


----------



## Mega Man

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> well guys. finally 30lbs of watercooling waiting for meh !!! komodo piics inside, however they turned out very dark.... ill retake later
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> *sigh* look at that sexay REAL chrome. just sexay......
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and just because ( China auto show )
> 
> 
> 
> 
> 
> 
> 
> Komodo blocks looks very nice. Bulky, I like it!
> 
> 
> 
> 
> 
> 
> 
> Is these going into your Caselabs MAGNUM M8 Case?
Click to expand...

Nope, in my lesser-known TH10: 3x 480 Monstas, 1x 480 UT60, 1x 480 45mm.
Also changing (more like modifying) all LEDs to RGB. Today I found the ones I will be buying, and the ones I need to make the front glow.
Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *alancsalt*
> 
> My LEPA only lasted six months. PCCASEGEAR no longer stock them. When I applied for warranty repair they just gave me new price credit ($AU430). That is not normal here in OZ, not for me. Some of my returns have been harder fought than the battle of Verdun.
> 
> I got the LEPA1600W thinking it might supply enough juice for my 3930k and quad 580 setup, but it would fail in the same place as my SilverStone 1500W. The plugs and sockets look kinda el cheapo too. So I feel it no better than the Silverstone, just higher priced.
> 
> 
> 
> Soo? Which powersupply should I use for the quad R9 290s then? I will probably try my EVGA 1300W G2 (based on the Xtreme PSU calculator 1300W should be enough for 3930K, 2x SSD, 3x HDD, 3x Pumps, 13 fans and four R9 290s ... which was supposed to use around 1265 watts, it is almost 1300 watts but I guess it will last a few months until I buy a new PSU that are a bit quieter and have som more wattage. What about Enermax Platimax?
> 
> Reason I don't want to do dual PSUs is mostly because I am going with custom watercooling, and the 900D doesn't support the hardware I need (watercooling and components) with 2x PSUs.
Click to expand...

And this is why I go CL: I get 2 PSUs and my epic watercooling.
Quote:


> Originally Posted by *kpoeticg*
> 
> Anybody around right now have a RIVE BE? I just got a new board from rma and did a fresh windows install. Now my cards acting buggy again, it wasn't on the Asrock x79 Extreme6 i was using during rma.
> 
> I'm getting the screen freeze with audio skip/loop and alot of screen-tearing. I haven't had any black screens while i was around, it seems like one happeneed while i was sleeping tho. I think it could be the Realtek audio drivers for the board. I like the ROG Realtek GUI, but obviously my 290x's working is more important to me.
> 
> The card was working great on the other board. Anybody able to use the RIVE BE Realtek with 14.4?


I do, without issues (at stock; slowly working on OCing these, but I do mean slowly, I need to wrap my head around how to).


----------



## DeadlyDNA

I am pondering the idea of getting 290Xs off eBay from ex-miners, and selling my 290s. Any suggestions on this? 290Xs are under $350 now on eBay.


----------



## bencher

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I am pondering the idea of getting 290x's off ebay from ex miners, and selling my 290's. Any suggestions on this? 290x are under 350$ now on ebay.


I don't think it is worth the hassle and added cost for 10% improvement.


----------



## hotrod717

Just thought I'd share -


----------



## DeadlyDNA

Quote:


> Originally Posted by *bencher*
> 
> I don't think it is worth the hassle and added cost for 10% improvement.


Yes, but roughly 10% per card? I think Mega Man needs to answer this NOW.


----------



## kpoeticg

Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kpoeticg*
> 
> Anybody around right now have a RIVE BE? I just got a new board from rma and did a fresh windows install. Now my cards acting buggy again, it wasn't on the Asrock x79 Extreme6 i was using during rma.
> 
> I'm getting the screen freeze with audio skip/loop and alot of screen-tearing. I haven't had any black screens while i was around, it seems like one happeneed while i was sleeping tho. I think it could be the Realtek audio drivers for the board. I like the ROG Realtek GUI, but obviously my 290x's working is more important to me.
> 
> The card was working great on the other board. Anybody able to use the RIVE BE Realtek with 14.4?
> 
> 
> 
> i do . with out issues ( at stock, slowly working on ocing these, but i do mean slowly need to wrap my head on how to )
Click to expand...

Thanx for responding. Maybe it's the order I installed everything in. I installed the GPU drivers last, after everything else. Maybe I shoulda installed em first...

Can't figure it out. Card was running great on the Extreme6. I know I've read somewhere about Realtek/AMD conflicts, but the Extreme6 used Realtek also.


----------



## kayan

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I am pondering the idea of getting 290x's off ebay from ex miners, and selling my 290's. Any suggestions on this? 290x are under 350$ now on ebay.


I'd agree that it probably isn't worth the hassle, but if you want the very best for your rig, then do it. I've bought 2 reference 290x off of eBay in the last 3 weeks and I paid 340 (shipped) and 320 (shipped) for each of them. Which is awesome considering I got mine at launch for 550. I now have a crossfire setup, and my wife got an upgrade from a 5830 to a 290x.

As far as experience, the 1st one overclocks quite well, whereas the 2nd doesn't so much. The second also has a strange smell to it (I'm thinking it was watercooled). I messaged the seller and he said he bought it from another eBayer 2 weeks before reselling it, and claimed it was because of how loud the reference card is compared to his other Tri-X. /shrug. It performs fine in games, just doesn't overclock well.

Both have been running solidly though.


----------



## bencher

Quote:


> Originally Posted by *DeadlyDNA*
> 
> yes, but roughly 10% per card? I think mega man needs to answer this NOW


10% per card still averages out to 10% overall.


----------



## sugarhell

Quote:


> Originally Posted by *hotrod717*
> 
> Just thought I'd share -


I expected a Lightning to be faster than a ref. But it's not:

http://www.3dmark.com/fs/1877211

http://www.3dmark.com/fs/1881364


----------



## heroxoot

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *hotrod717*
> 
> Just thought I'd share -
> 
> 
> 
> 
> 
> 
> 
> I expected a lightning to be faster than a ref. But its not
> 
> http://www.3dmark.com/fs/1877211
Click to expand...

There is a guy somewhere who gets much lower benchmarks with an OC on his Lightning than he does at stock. The Lightnings seem problematic this time around. Really sucks, because my 7970 Lightning OC'd better than others.


----------



## Sgt Bilko

Quote:


> Originally Posted by *sugarhell*
> 
> I expected a lightning to be faster than a ref. But its not
> 
> http://www.3dmark.com/fs/1877211


The graphics score is higher on the Lightning even though it's clocked lower......

EDIT: nvm, Tess modified, didn't see that









Weird, but not every 290X can hit 1300 on the core; it takes a special ref card to do that. I'd guess that nearly every Lightning could, though.


----------



## aaroc

Quote:


> Originally Posted by *ace1ndahole*
> 
> Ok, so I ran Tomb Raider benchmark on lowest setting and got around 450 FPS average. There is definitely a buzzing when I click on the game or when doing the benchmark. I will try to make a video of it for you guys to see.
> 
> EDIT: Here's the link to the video I just made. http://goo.gl/JGtXhp


I bought 2 new XFX R9 290s when they came out that had coil whine. When I first ran 3DMark they were coil whining SO loud, like a cat crying, so I thought the only option was an RMA. For a lot of reasons I couldn't RMA them right away, and one month later the coil whine was halved. Now I don't hear it with my PC fans on low; you have to put your ear inside the PC case to hear any coil noise. If you can't stand the noise, RMA them right now, but it's possible that the noise goes away in the near future. I have another MSI R9 290 and that card didn't have coil noise. Hope it helps.


----------



## Gobigorgohome

Quote:


> Originally Posted by *DeadlyDNA*
> 
> i am using a GS1600w Lepa now myself for my 4x290s. It's been rock solid for me. I havent capped it out yet, but i been just under 1600watts at the wall. Shilka is a knowledgeable PSU guy around these forums. Maybe he can help guide you?


I see, but what is the difference between the LEPA G1600 and the LEPA GS1600? It seems I can only get the LEPA G1600 in my country.

I own a Silver Power 500W (leak-testing PSU) and the EVGA G2 1300W, so I can manage if I get the "little one" hooked up to one GPU, the CPU and maybe a couple of pumps. Then again I have a bit of a problem with space ... hmmm ... first off I am going 3x R9 290, so I guess the 1300W EVGA will do fine until I buy the last one.

Thanks for the reply, much appreciated!


----------



## rdr09

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I am pondering the idea of getting 290x's off ebay from ex miners, and selling my 290's. Any suggestions on this? 290x are under 350$ now on ebay.


I read that the extra shaders help at higher res. Who is using high res?

You


----------



## ace1ndahole

Yeah, I know the video doesn't have the best sound quality because it's a smartphone camera.

But the noise is very much audible in person, even over my two 200mm fans and a 140mm fan inside an H630 Silent case.

Hopefully the replacement won't have the same buzzing.


----------



## hotrod717

Quote:


> Originally Posted by *sugarhell*
> 
> I expected a lightning to be faster than a ref. But its not
> 
> http://www.3dmark.com/fs/1877211
> 
> http://www.3dmark.com/fs/1881364


Lol! Even though the examples you give are clocked quite a bit higher, they scored lower. I've only started to push this card and don't really think it is one of the better examples. From what I've seen, most Lightnings clock higher than reference on average.


----------



## hotrod717

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The Graphics score is higher on the Lightning even though it's clocker lower......
> 
> EDIT: nvm, Tess modified, didn't see that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Weird though, but not every 290x can hit 1300 on the core though, takes a special ref card to do that, i'd guess that nearly every lightning could though


Pretty much standard for AMD to run tess off. And a large majority of Lightnings can do 1250 core on air, FYI. The only bad instances I see are people on the hater parade trying to discredit a non-reference model. It was pretty much the same for the 7970s as well. Here we go again.


----------



## sugarhell

Quote:


> Originally Posted by *hotrod717*
> 
> Lol! Even though the examples you give are clocked quite higher and scored lower. I've only started to push this card and don't really think it is one of the better examples. From what I've seen most Lightnings clock higher than reference on average.


You mean tess off vs tess on?


----------



## hotrod717

Quote:


> Originally Posted by *sugarhell*
> 
> You mean tess off vs tess on?


No, you mean hwbot bench rules vs. someone posting other peoples scores trying to hate?


----------



## sugarhell

Quote:


> Originally Posted by *hotrod717*
> 
> No, you mean hwbot bench rules vs. someone posting other peoples scores trying to hate?


We are on hwbot? I just told you that the score is low for a Lightning with tess off. Get over it.


----------



## hwoverclkd

Quote:


> Originally Posted by *ebhsimon*
> 
> I've got one, what do you need?


Thought yours was a 290. Clocks, voltage and temp, please?


----------



## hwoverclkd

Quote:


> Originally Posted by *heroxoot*
> 
> There is a guy somewhere who gets way lower benchmarks on OC on his lightning than he does on stock. The lightnings seem problematic this time around. Really sucks because my 7970 lightning OC'd better than others.


Vladimir is one... I think he was trying to RMA it and someone was trying to help him out.


----------



## hwoverclkd

Quote:


> Originally Posted by *sugarhell*
> 
> I expected a lightning to be faster than a ref. But its not
> 
> http://www.3dmark.com/fs/1877211
> 
> http://www.3dmark.com/fs/1881364


sad but true


----------



## Goride

Last ditch effort before I RMA this 290x:

Why are my vrm1 temps so damn hot?

I posted a few times in this thread, but it's pretty buried, so I will recap.

I have an NZXT G10, with an AMD Liquid Cooling System AIO (rebadged Antec Kuhler 920). I have a Gelid Enhancement Kit for the heatsink over the vrm1 and 2. I have some decent aluminum heatsinks over the 16 memory chips.

I have tried using 2 layers and 4 layers (it was suggested that maybe I need more padding to transfer the heat better) of 0.5mm Fujipoly Extreme thermal pad. The Gelid heatsink gets scorching hot... as in it burns my finger to touch it. Even the little nut on the back of the card is flaming hot. So it appears the heat is transferring to the heatsink ok.

I have the G10 fan blowing directly on VRM1 from the front. I have another fan blowing on the 290X from the side. I have the case panel off, so the ambient temp of the case should not be more than room temperature.

With stock voltage and clock speeds of 1030/1250, I get VRM1 temps of 97°C. When I overclock it to 1200/1500 with +133 mV, VRM1 will get to 125°C (running 3DMark and Heaven to load the GPU; I do NOT use Furmark).

The stock g10 fan does seem pretty weak, others are getting like 60-70c on the vrm1 with a very similar setup.

(This card also seems to always score quite a bit lower on every benchmark compared with other like systems, and same clock speeds. I am not sure if this has anything to do with it either.)

Any ideas?

I have tried various case fan placements, nothing seems to help. I have plugged the g10 in a few different power ports on the motherboard, as well as plugging it directly into a molex coming off the PSU, which should force the fan to its max speed - no help.

The card is running seemingly abnormally hot, and yields abnormally low scores in benchmarks and FPS while in real games compared to others. But the card does not artifact, and I used GPU-Z to verify the card was not throttling itself.

I was having these issues with the stock cooler on it as well. It seemed that the answer to the low scores was that the card was overheating, and if I water cooled it, this would be fixed. But that does not seem to be the case.

Any last suggestions before I RMA it? I am trying to do all the due diligence I can before I send it back.
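One detail in the post above worth flagging: stacking extra thermal pads usually makes VRM temps worse, not better, because each layer adds conduction resistance in series (R = t / (k·A)). A minimal sketch of the arithmetic, where the pad conductivity, power and contact area are assumed round numbers for illustration, not specs for this exact card or pad:

```python
# Series thermal-resistance estimate for stacked thermal pads.
# k (W/m.K), power, and contact area are assumed values, for illustration only.

def pad_resistance(thickness_m, conductivity_w_mk, area_m2):
    """Conduction resistance of one pad layer, R = t / (k * A), in K/W."""
    return thickness_m / (conductivity_w_mk * area_m2)

def temp_rise(power_w, layers, thickness_m=0.0005,
              conductivity_w_mk=11.0, area_m2=0.0008):
    """Temperature drop across a stack of identical pads at a given power."""
    return power_w * layers * pad_resistance(thickness_m, conductivity_w_mk, area_m2)

# e.g. ~40 W through the VRM pad stack over ~8 cm^2 of 0.5 mm pads:
print(temp_rise(40, 2))  # two layers: ~4.5 K with these assumptions
print(temp_rise(40, 4))  # four layers: double the resistance, double the delta-T
```

Under these assumptions, going from two layers to four doubles the pad stack's contribution to the VRM-to-heatsink temperature difference, which is consistent with the observation that adding layers didn't help; a single thin, well-compressed pad is generally the better configuration.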


----------



## bencher

Quote:


> Originally Posted by *Goride*
> 
> Last ditch effort before I RMA this 290x:
> 
> Why are my vrm1 temps so damn hot?
> 
> I posted a few times in this thread, but its pretty buried so I will recap.
> 
> I have an NZXT G10, with an AMD Liquid Cooling System AIO (rebadged Antec Kuhler 920). I have a Gelid Enhancement Kit for the heatsink over the vrm1 and 2. I have some decent aluminum heatsinks over the 16 memory chips.
> 
> I have tried using 2 layers and 4 layers of 0.5mm Fujipoly Extreme thermal pad. The Gelid heatsink gets scorching hot... as in it burns my finger to touch it. Even the little nut on the back of the card is flaming hot. So it appears the heat is transferring to the heatsink ok.
> 
> I have both the g10 fan blowing directly on the vrm1 from the front. I have another fan blowing on the 290x from the side. I have the case panel off, so the ambient temp of the case should not be more than room temperature.
> 
> With stock voltage and a clocks speed of 1030/1250, I get vrm1 temps of 97c. When I overclock it to 1200/1500 with a +133 mV, the vrm1 will get to 125c (running 3dmark and Heaven to load the GPU, I do NOT use furmark).
> 
> The stock g10 fan does seem pretty weak, others are getting like 60-70c on the vrm1 with a very similar setup.
> 
> (This card also seems to always score quite a bit lower on every benchmark compared with other like systems, and same clock speeds. I am not sure if this has anything to do with it either.)
> 
> Any ideas?
> 
> I have tried various case fan placements, nothing seems to help. I have plugged the g10 in a few different power ports on the motherboard, as well as plugging it directly into a molex coming off the PSU, which should force the fan to its max speed - no help.
> 
> The card is running seemingly abnormally hot, and yields abnormally low scores in benchmarks and FPS while in real games compared to others. But the card does not artifact, and I used GPU-Z to verify the card was not throttling itself.
> 
> I was having these issues with the stock cooler on it as well. It seemed that the answer to the low scores was that the card was overheating, and if I water cooled it, this would be fixed. But that does not seem to be the case.
> 
> Any last suggestions before I RMA it? I am trying to do all the due diligence I can before I send it back.


Sounds like user error....








Quote:


> Originally Posted by *airisom2*
> 
> Just to correct something roboyto posted in his super post a couple pages back:
> 
> 
> 
> The top left phase area is for PLL (trio of mosfets), which is responsible for handling PCIe voltage (it also includes the large rectangular choke I forgot to encircle in the picture). The phase on the top of the group of 6 phases is memory (it's separated a bit more than the other 5), and the last 5 phases are for the core.
> 
> VRM2 Shows PLL VRM temps, and VRM1 shows Core/Mem VRM temps.


Take a look at this pic.


----------



## Sgt Bilko

Quote:


> Originally Posted by *hotrod717*
> 
> Pretty much standard for AMD to run tess off. And a large majority of Lightnings can do 1250 core on air, fyi. The only bad instances I see, are people on the hater parade trying to discredit a non-reference model. It was pretty much the same for the 7970's as well. Here we go again.


If i'm submitting a score to the Bot then i'll run Tess off but most of the time i'll leave it on for my own testing.
Quote:


> Originally Posted by *sugarhell*
> 
> We are on hwbot? I just told you that the score is low for a lightning with tess off. Get over it


Sugar, how many 290x's have you seen that can pull 1300 on the core? (without LN2)

Honest question here.









I haven't seen that many really, and i've watched the lightning thread a bit and most users are pulling 1250 on air cooling, you can't say that's not special.


----------



## Goride

Quote:


> Originally Posted by *bencher*
> 
> Sounds like user error....
> 
> 
> 
> 
> 
> 
> 
> 
> Take a look at this pic.


It shows the vrm1 and vrm2 locations.

What about that picture?


----------



## sugarhell

Quote:


> Originally Posted by *Sgt Bilko*
> 
> If i'm submitting a score to the Bot then i'll run Tess off but most of the time i'll leave it on for my own testing.
> Sugar, how many 290x's have you seen that can pull 1300 on the core? (without LN2)
> 
> Honest question here.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I haven't seen that many really, and i've watched the lightning thread a bit and most users are pulling 1250 on air cooling, you can't say that's not special.


At least 12. And not with binning


----------



## bencher

Quote:


> Originally Posted by *Goride*
> 
> It shows the vrm1 and vrm2 locations.
> 
> What about that picture?


Do you have both rows of vrm 2 covered with sinks?


----------



## Silent Scone

1250 on air isn't special lol. I pulled 1285 out of my Lightning on air. It's a brilliant cooler. They're not binned though, so you're not likely to get any better than you are on reference.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Silent Scone*
> 
> 1250 on air isn't special lol. I pulled 1285 out of my Lightning on air. It's a brilliant cooler.They're not binned though so you're not likely to get any better than you are on reference


You missed the point of my post it seems
Quote:


> Originally Posted by *sugarhell*
> 
> At least 12. And not with binning


I'm just saying that the Lightnings seem to be more capable of hitting higher clocks on air (not sure about water) than a normal 290X, probably more to do with the cooler than anything, but still.


----------



## Goride

Quote:


> Originally Posted by *bencher*
> 
> Do you have both rows of vrm 2 covered with sinks?


Yes, and my VRM2 temps are not the problem. It's my VRM1 temps.


----------



## devilhead

Quote:


> Originally Posted by *Sgt Bilko*
> 
> If i'm submitting a score to the Bot then i'll run Tess off but most of the time i'll leave it on for my own testing.
> Sugar, how many 290x's have you seen that can pull 1300 on the core? (without LN2)
> 
> Honest question here.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I haven't seen that many really, and i've watched the lightning thread a bit and most users are pulling 1250 on air cooling, you can't say that's not special.


I have 4 ref 290Xs and all of them can reach 1300 DD, but the memory on 2 of them is Elpida, that's bad; the other 2 are OK, 1700MHz Hynix.


----------



## Silent Scone

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You missed the point of my post it seems
> I'm just saying that the lightning's seem to be more capable of hitting higher clocks on air (not sure about water) than a normal 290x, probably more to do with the cooler than anything but still.


The reference cooler certainly is nowhere near as good as the Lightning's, but as long as you can keep it under 1.4V for short runs, 1300 isn't as rare as some people make out. I know of two people personally who have done 1300 on air.

That's not to say every card should do it, but you've just as much chance on both variants. The Lightning, however, could do 1250/1650+ all day long and not break a sweat, near silent in fact. Hats off to MSI for the cooler. Unfortunately the PCB seems to throttle amperage over 1.55V, so for water you're better off saving your money and buying reference IMO.


----------



## sugarhell

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You missed the point of my post it seems
> I'm just saying that the lightning's seem to be more capable of hitting higher clocks on air (not sure about water) than a normal 290x, probably more to do with the cooler than anything but still.


Better cooler. Just that


----------



## Sgt Bilko

Quote:


> Originally Posted by *devilhead*
> 
> i have 4 ref. 290x all of them can reach 1300 DD but the memory 2x of them is elpidia, thats bad, other 2x is ok, 1700mhz hynix


Water or air?

If water, then I'm impressed; if it's air, then it would seem there isn't much special about the Lightnings at all.

Quote:


> Originally Posted by *Silent Scone*
> 
> The reference cooler certainly is no where near as good as the lightning but as long as you can manage it under 1.4v for short runs 1300 isn't as rare as some people make out. I know of two people personally who have done 1300 on air


That's pretty much what I'm figuring out now; it seems the cooler is the only thing special about the Lightning outside of LN2 purposes..... even then there are a fair few 290Xs that have hit 1500+.


----------



## Sgt Bilko

Quote:


> Originally Posted by *sugarhell*
> 
> Better cooler. Just that


Yep, getting that now.

Learn something new every day. Stock VRMs are good enough then.


----------



## Mega Man

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bencher*
> 
> I don't think it is worth the hassle and added cost for 10% improvement.
> 
> 
> 
> yes, but roughly 10% per card? I think mega man needs to answer this NOW
Click to expand...

Huh? How do I know? Is it worth it to you? Top-tier cards always cost more; you have to decide if that performance is worth it for you.
Quote:


> Originally Posted by *kpoeticg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kpoeticg*
> 
> Anybody around right now have a RIVE BE? I just got a new board from rma and did a fresh windows install. Now my cards acting buggy again, it wasn't on the Asrock x79 Extreme6 i was using during rma.
> 
> I'm getting the screen freeze with audio skip/loop and alot of screen-tearing. I haven't had any black screens while i was around, it seems like one happeneed while i was sleeping tho. I think it could be the Realtek audio drivers for the board. I like the ROG Realtek GUI, but obviously my 290x's working is more important to me.
> 
> The card was working great on the other board. Anybody able to use the RIVE BE Realtek with 14.4?
> 
> 
> 
> i do . with out issues ( at stock, slowly working on ocing these, but i do mean slowly need to wrap my head on how to )
> 
> Click to expand...
> 
> Thanx for responding. Maybe it's the order i installed everything. I installed the GPU drivers last out of everything. Maybe i shoulda installed em first...
> 
> Can't figure it out. Card was running great on the Extreme6. I know i've read somewhere about Realtek/AMD conflicts, but the Extreme6 used Realtek also.
Click to expand...

There are new audio drivers out, have you tried them? TBH I use headphones atm because my rig is by the TV atm


----------



## devilhead

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Water or Air?
> 
> If water then i'm impressed, if it's air then it would seem there isn't much special about the lightnings at all.
> That's pretty much what i figuring out now, seems that the cooler is the only thing special about the lightning outside of LN2 purposes.....even then there are a fair few 290x's that have hit 1500+


All of them are under water


----------



## Clockster

So I might be heading back to the AMD camp this week, I got an offer for 2x Brand new Sapphire TRI-X R9-290 + R1500 (Roughly $150) for my single 780Ti...So tempted to do it lol


----------



## sugarhell

Quote:


> Originally Posted by *Clockster*
> 
> So I might be heading back to the AMD camp this week, I got an offer for 2x Brand new Sapphire TRI-X R9-290 + R1500 (Roughly $150) for my single 780Ti...So tempted to do it lol


Is he insane?


----------



## Clockster

Quote:


> Originally Posted by *sugarhell*
> 
> He is insane?


lol Nah, he wants my card and then he wants to add another in SLI, and he's desperate for mine because it clocks like a beast on air lol


----------



## Sgt Bilko

Quote:


> Originally Posted by *Clockster*
> 
> So I might be heading back to the AMD camp this week, I got an offer for 2x Brand new Sapphire TRI-X R9-290 + R1500 (Roughly $150) for my single 780Ti...So tempted to do it lol


I'd take the deal, but if your 780 already clocks well then I dunno......up to you, but I'd rather have the 290s


----------



## Tokkan

With this GPU I finally got to mod Skyrim heavily on the visuals. Mods have improved its visual quality by A LOT, but my R9 290 is at 100% usage, only giving me a minimum of 20 FPS and an average of 40.








Such game... The GTX 765M on my laptop crawls to an amazing 2 FPS minimum, so yeah









More of a reason to OC it


----------



## Clockster

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'd take the deal but if your 780 already clocks well then i dunno......up to you but i'd rather the 290's


Yeah, that's the thing: I know what my Ti is capable of, and I think I would have to swap my brand new MSI Z97M Gaming for something like the Gaming 5 or 7 so I can have some spacing between the cards.


----------



## heroxoot

Has anyone witnessed a weird white checkered pattern over their games, usually near the top of the screen? I have seen it in BF4 and SSFIV:AE. It happens once in a while and I have no idea why. It doesn't seem like artifacting, but it might be? I tried upping the voltage, and I think I saw it once on stock clocks too. Pretty sure I never saw it before 14.4.

Also, this checkering does not appear in recordings from programs like OBS.


----------



## Forceman

It's artifacting. Mine does the same thing when I push it too far.


----------



## Frontside

I had the artifacts in BF4 too. Happened to me once: a strange green square with rounded corners appeared in the lower part of the screen. I restarted the game and it was gone.
I also have artifacts in Crysis 2: red lines all over the screen, but only when I'm stealth-killing Cephs.


----------



## kpoeticg

My 290X is giving me screen tearing and artifacts on my RIVE BE. Can't figure out why. It was working perfectly and even OC'ing pretty decently on the X79 Extreme6. I can't get rid of these artifacts and screen tears on the RIVE BE when playing any video =\


----------



## vuldin

The link to the Asus GPU Tweak download is broken in the first post. Here is the correct link:
http://support.asus.com/download.aspx?SLanguage=en&p=9&m=ASUS%20GPU%20Tweak%20for%20Graphics%20cards&hashedid=n/a


----------



## Mega Man

Quote:


> Originally Posted by *kpoeticg*
> 
> My 290x is giving me screen tearing and artifacts on my RIVE BE. Can't figure out why. It was working perfectly and even oc'ing pretty decent on the x79 Extreme6. I can't get rid of these artifacts and screen tears on the RIVE BE when playing any video =\


What bios are you on


----------



## kpoeticg

It came with 0602 already on it, so I haven't changed the BIOS.

I'm bout to wipe my SSD again and reinstall Win7. After how smooth this card was running on the Extreme6, I wasn't expecting any issues with a fresh Windows install on my new mobo. Dunno if one of the Asus drivers/software is conflicting, or if I shoulda installed the GPU driver first or something. Kinda strange.

It's funny cuz when I got my new board from RMA I just switched all the hardware over without wiping the SSD and everything was running fine. Then after a couple hours I decided to wipe it and do a fresh Win7 install just to be on the safe side and not have any leftover drivers from a different motherboard lol

I'm mainly just having a lot of artifacts and screen tears during video playback, which is a mild issue compared to black screens. I just can't figure out the source of the problem. I've tried with and without the extra auxiliary molex. Not OC'ing. Just have AB running for the fan profile, which I also had running on my Extreme6. I've tried enabling and disabling just about everything in CCC. Uninstalled the Realtek driver and downloaded the latest one from TPU... =\


----------



## Arizonian

Quote:


> Originally Posted by *vuldin*
> 
> The link to the Asus GPU Tweak download is broken in the first post. Here is the correct link:
> http://support.asus.com/download.aspx?SLanguage=en&p=9&m=ASUS%20GPU%20Tweak%20for%20Graphics%20cards&hashedid=n/a


Thank you vuldin. OP ASUS link updated









I see you're new to OCN, welcome aboard. Submit proof with your 290X and join the roster of owners.









Quote:


> To be added on the member list please submit the following in your post
> 
> 1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
> 2. Manufacturer & Brand - Please don't make me guess.
> 3. Cooling - Stock, Aftermarket or 3rd Party Water


----------



## Roboyto

290(X) "Need to Know" edited/organized + info still being added

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781


----------



## Arizonian

Quote:


> Originally Posted by *Roboyto*
> 
> 290(X) "Need to Know" edited/organized + info still being added
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781


Awesome contribution.









Thank you.


----------



## hotrod717

Quote:


> Originally Posted by *sugarhell*
> 
> We are on hwbot? I just told you that the score is low for a lightning with tess off. Get over it














Quote:


> Originally Posted by *Sgt Bilko*
> 
> If i'm submitting a score to the Bot then i'll run Tess off but most of the time i'll leave it on for my own testing.
> Sugar, how many 290x's have you seen that can pull 1300 on the core? (without LN2)
> 
> Honest question here.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I haven't seen that many really, and i've watched the lightning thread a bit and most users are pulling 1250 on air cooling, you can't say that's not special.


Exactly. Sugar has been pulling this since the 7970 thread and just keeps hating on anything other than reference. Funny, I've had 2 290x reference release cards that couldn't break 1225 and 1250 respectively.

Quote:


> Originally Posted by *sugarhell*
> 
> At least 12. And not with binning










Hurry up and go get TSM so he can give it to you. You might want to try posting your own results instead of riding on someone else's coattails.

Please anyone that has a card pulling 1300 core, post here - http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad


----------



## vuldin

Thanks Arizonian! I'll join the club once my card arrives sometime (hopefully early) next week.


----------



## sugarhell

Quote:


> Originally Posted by *hotrod717*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Exactly. Sugar has been pulling this since the 7970 thread and just keeps hating on anything other than reference. Funny, I've had 2 290x reference release cards that couldn't break 1225 and 1250 respectively.
> 
> 
> 
> 
> 
> 
> 
> Hurry up and go get TSM so he he can give you it. You might want to try posting your own results instead of riding on someone else's coattails.
> 
> Please anyone that has a card pulling 1300 core, post here - http://www.overclock.net/t/1361939/top-30-3dmark11-scores-for-single-dual-tri-quad


Lol. You're mad over nothing. I posted you a score without tess off and you're whining. And you call it slower when your score is with tess off. And yeah: TSM 3 ref cards, jomama 4 cards, ranger 4 cards, devilhead 4 cards. You don't even know me and you insult me. I know when ref is better and when it's not. But you probably don't. You don't even push 1.6 volts like someone that you hate. For regular use, even LN2, ref is capable. Hawaii doesn't scale at all; go look at HWBot LN2 results. Lightning results are the same as the ref card.

At the end of the day, if you get a good chip, it doesn't matter whether you use a ref PCB or a Lightning PCB; you will get good clocks. But you're a marketing kid that believes anything. AMD cards scale with temps, and the Lightning has a way better cooler than the AMD ref cooler. But when you switch to water, the Lightning is the same as a ref PCB. But of course you'll rage and insult more because you want to justify your card.

Also, you posted a score with tess off, and I told you it's slow for a Lightning with tess off. I don't care about HWBot, but the link I posted you has tess on and it's really close to yours. Fix your card. Do you remember when I told you to fix your card for Valley too, with your 7970 Matrix? I was correct.

tl;dr: For regular use, even for LN2, a custom PCB will give you the same results as a ref PCB. Your clocks depend only on the binning. Don't believe marketing.


----------



## bond32

Quote:


> Originally Posted by *sugarhell*
> 
> Lol. You are mad over nothing. I post you a score without tess off and you are whining.And you call it slower when your score is with tess off. And yeah tsm 3 ref cards jomama 4 cards ranger 4 cards devilheadas 4 cards. You dont even know me and you insult me. I know when ref is better and when its not. But you probably dont. You dont even push 1.6 volts like someone that you hate. For regular use even ln2 ref is capable. Hawaii doesnt scale at all go look hwbot ln2. Lightning results is the same as the ref card.
> 
> At the end of the day if you get a good chip no matter what you use ref pcb or lightning pcb you will get good clocks. But you are a marketing kid that believes anything. Amd cards scale with temps and lightning has a way better cooler than amd ref cooler. But when you change to water lightning is the same as a ref pcb. But ofc you will rage and insult more because you want to justify your card.
> 
> Also you post a score with tess off. And i told you its slow for a lightning with tess off. I dont care about hwbot but the link that i post you has tess on and its really close to you.Fix your card. DO you remember when i told you to fix your card for valley too with your 7970 matrix? I was correct
> 
> tldr; For regular use even for ln2 a custom pcb will give you the same results as a ref pcb. You clocks depends only at the binning. Dont believe marketing.


Having owned a 780 Lightning, I'm inclined to agree with you. Which raises the question: what are the extra phases/quality components in cards like the Lightning actually good for? Are there any cards out there (like the 780 Ti) that could get any benefit from the extra phases?


----------



## sugarhell

Quote:


> Originally Posted by *bond32*
> 
> Having owned a 780 Lightning I'm inclined to agree with you. Which begs the question, the extra phases/quality components in the cards like the lightning are good for what?? Are there any cards out there (like the 780ti) that could get any benefit from the extra phases?


The Classified is really good because of the BIOS and the extra volts. But then you see even Titans can do 1400+


----------



## HOMECINEMA-PC

Two of my three 290's will pull 1300+ on da core and my unlocked 290 / 290x so far will do 1280 . All 3 are ref and do about 1500 on da mem max...... I lucky


----------



## Foresight

How do you get more than +100mv on AB?


----------



## hwoverclkd

Quote:


> Originally Posted by *Foresight*
> 
> How do you get more than +100mv on AB?


I'm using beta 19 and it's +200mV max. Have you checked all the pertinent settings yet, e.g. unlock voltage control, etc.?


----------



## Unknownm

Just upgraded to a 290! holy cow for the price (used) it is a nice upgrade!!!

Also replaced the thermal paste because the stock was crap


----------



## bencher

Quote:


> Originally Posted by *Unknownm*
> 
> Just upgraded to a 290! holy cow for the price (used) it is a nice upgrade!!!
> 
> Also replaced the thermal paste because the stock was crap


Is that 96c 0_o?


----------



## Unknownm

Quote:


> Originally Posted by *bencher*
> 
> Is that 96c 0_o?


whats 96c?


----------



## bencher

Quote:


> Originally Posted by *Unknownm*
> 
> whats 96c?


The temp in the gpu-z screenshot.


----------



## velocityx

Quote:


> Originally Posted by *bencher*
> 
> The temp in the gpu-z screenshot.


At one point I saw 98 with mine ;]


----------



## sugarhell

Quote:


> Originally Posted by *bencher*
> 
> The temp in the gpu-z screenshot.


Its 59?


----------



## bencher

Quote:


> Originally Posted by *sugarhell*
> 
> Its 59?


The pic is so small I can barely see it.


----------



## BradleyW

Is it normal to have an 18C difference between VRM1 and VRM2? (290X)
Thank you.


----------



## Goride

Quote:


> Originally Posted by *Goride*
> 
> Last ditch effort before I RMA this 290x:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Why are my vrm1 temps so damn hot?
> 
> I posted a few times in this thread, but its pretty buried so I will recap.
> 
> I have an NZXT G10, with an AMD Liquid Cooling System AIO (rebadged Antec Kuhler 920). I have a Gelid Enhancement Kit for the heatsink over the vrm1 and 2. I have some decent aluminum heatsinks over the 16 memory chips.
> 
> I have tried using 2 layers and 4 layers (it was suggested that maybe I need more padding to transfer the heat better) of 0.5mm Fujipoly Extreme thermal pad. The Gelid heatsink gets scorching hot... as in it burns my finger to touch it. Even the little nut on the back of the card is flaming hot. So it appears the heat is transferring to the heatsink ok.
> 
> I have the G10 fan blowing directly on VRM1 from the front. I have another fan blowing on the 290X from the side. I have the case panel off, so the ambient temp of the case should not be more than room temperature.
> 
> With stock voltage and a clock speed of 1030/1250, I get VRM1 temps of 97C. When I overclock it to 1200/1500 with +133 mV, VRM1 will get to 125C (running 3DMark and Heaven to load the GPU; I do NOT use Furmark).
> 
> The stock g10 fan does seem pretty weak, others are getting like 60-70c on the vrm1 with a very similar setup.
> 
> (This card also seems to always score quite a bit lower on every benchmark compared with other like systems, and same clock speeds. I am not sure if this has anything to do with it either.)
> 
> Any ideas?
> 
> I have tried various case fan placements, nothing seems to help. I have plugged the g10 in a few different power ports on the motherboard, as well as plugging it directly into a molex coming off the PSU, which should force the fan to its max speed - no help.
> 
> The card is running seemingly abnormally hot, and yields abnormally low scores in benchmarks and FPS while in real games compared to others. But the card does not artifact, and I used GPU-Z to verify the card was not throttling itself.
> 
> I was having these issues with the stock cooler on it as well. It seemed that the answer to the low scores was that the card was overheating, and if I water cooled it, this would be fixed. But that does not seem to be the case.
> 
> Any last suggestions before I RMA it? I am trying to do all the due diligence I can before I send it back.


I just tried replacing the G10's stock 92mm fan with a smaller ThermalTake fan I had lying around.

With that fan blowing on the gelid enhancement kit heatsink, while running Heaven @ 1200/1500 with +133mV, I am now down to 87-90c (down from over 120c with the g10's stock fan at same settings).

Replacing the fan helped a lot, but this still seems abnormally high, especially since others with similar setups see much lower temps.


----------



## Foresight

Quote:


> Originally Posted by *hwoverclkd*
> 
> i'm using beta 19 and it's +200mV max. Have you checked off all pertinent settings yet, e.g. unlock voltage ctrl, etc?


I am also using beta 19 with everything unlocked =).
But my max is +100mv.


----------



## Greg.m

Here are my Sapphire 290X reference beauties running mostly @ stock... Coooool and quiet...


----------



## ace1ndahole

Hey, can you see if my 290 Vapor-X overclock settings and results are decent?
Do you think I can push more, or am I being conservative?

Here's the Afterburner setting and the Unigine Valley result it got. When I try to push it to 1150 core and 1550 memory, artifacts and some black screens show up.

But I got a stable overclock at 1130/1550


----------



## Unknownm

Quote:


> Originally Posted by *bencher*
> 
> The temp in the gpu-z screenshot.


Quote:


> Originally Posted by *sugarhell*
> 
> Its 59?


Quote:


> Originally Posted by *bencher*
> 
> The pic is so small I can barely see it.


Have you clicked the screenshot? It goes full size 1080p.... it says 59c. I'm so confused


----------



## Durvelle27

Anybody here running 2560x1440


----------



## owcraftsman

Quote:


> Originally Posted by *Durvelle27*
> 
> Anybody here running 2560x1440


Yep


----------



## kayan

As am I, 2560x1440, but my cell doesn't like to quote.


----------



## Durvelle27

Quote:


> Originally Posted by *owcraftsman*
> 
> Yep


Do you play BF4


----------



## bencher

Quote:


> Originally Posted by *Unknownm*
> 
> have you clicked the screenshot? it goes full size 1080p.... it says 59c. I'm so confused


When I click it, it is still tiny :/


----------



## bluedevil

1440p here... can run BF4 at 96Hz on High.


----------



## the9quad

Quote:


> Originally Posted by *Durvelle27*
> 
> Anybody here running 2560x1440


I do, and I seriously recommend Mantle. With Mantle I can keep frames locked at 125 FPS on Ultra and it is solid as a rock, but with DX11 frames are all over the place. It might take uninstalling and re-installing the 14.4s a few times, but once they actually install correctly Mantle runs flawlessly.


----------



## kayan

Quote:


> Originally Posted by *Durvelle27*
> 
> Do you play BF4


I play BF4 @ 1440p. What do you want to know?


----------



## Roy360

should my middle R9 290 be bending like that?


bigger -> http://s3.postimg.org/5k2uo3n8x/IMG_2294.jpg


----------



## hwoverclkd

Quote:


> Originally Posted by *Foresight*
> 
> I am also using beta 19 with everything unlocked =).
> But my max is +100mv.


I figured. Could depend on your card. How about Trixx? You may try the rbby voltage tool also.


----------



## Durvelle27

Quote:


> Originally Posted by *kayan*
> 
> I play BF4 @ 1440p. What do you want to know?


What FPS do you get, and at what settings?

Also, Mantle didn't make any difference for me with my i7


----------



## kayan

Quote:


> Originally Posted by *Durvelle27*
> 
> What FPS do you get and what settings ?
> 
> Also mantle didn't make any difference for me with my i7


Honestly, I have yet to use Mantle. I play with everything on Ultra except AA, which is around 2x, and my frames stay constant in the 50s. Granted, this was back on air and not overclocked. I don't have a frame counter installed, but I could possibly check tomorrow if I can still get into my FRAPS account.

Edit: I ONLY play on full 64-person servers.


----------



## the9quad

Quote:


> Originally Posted by *kayan*
> 
> Honestly, I have yet to use mantle. I play on Ultra everything, except AA which is around x2, and My frames stay constant in the 50s. Granted this was back on air, and not overclocked. I don't have a frame counter installed, but I could possibly check tomorrow, if I can still get into my FRAPS account.
> 
> Edit: I ONLY play on full 64-person servers.


Do this, no need for FRAPS:

While in game, open the console and type

PerfOverlay.FrameFileLogEnable 1

This will generate a CSV file of your frametimes.

When done (after about 10 minutes of game time), type

PerfOverlay.FrameFileLogEnable 0

When you exit the game, load the CSV file into BF4FTA (thread here: http://www.overclock.net/t/1469627/bf4fta-battlefield-4-frame-time-analyzer-version-4-2-released-major-release)

This will show you exactly how constant your 50 FPS is and will give a true benchmark of your BF4 gameplay. My guess is you will be surprised how often you are not at 50 fps.
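(If you want a quick sanity check on the log before feeding it to BF4FTA, the same idea fits in a few lines of Python. Minimal sketch only: it assumes you've already reduced the log to a plain list of frametimes in milliseconds, which may not match the raw PerfOverlay CSV layout exactly.)

```python
# Minimal frametime summary, similar in spirit to what BF4FTA reports.
# Assumption: frametimes_ms is a non-empty list of frametimes in
# milliseconds (parse your actual PerfOverlay CSV into this shape first).

def analyze_frametimes(frametimes_ms):
    """Return average FPS and the 99th-percentile frametime for a run."""
    times = sorted(frametimes_ms)
    avg_ms = sum(times) / len(times)
    # Nearest-rank 99th percentile; integer math avoids float surprises.
    p99_ms = times[(99 * len(times)) // 100]
    return {
        "avg_fps": 1000.0 / avg_ms,
        "p99_frametime_ms": p99_ms,  # spikes here are the stutter you feel
    }

if __name__ == "__main__":
    # A "constant 50 FPS" run: 99 frames at 20 ms plus one 40 ms hitch.
    print(analyze_frametimes([20.0] * 99 + [40.0]))
```

A big gap between the 99th-percentile frametime and the average is exactly the inconsistency the frametime log will reveal, even when the FPS counter looks steady.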


----------



## owcraftsman

Quote:


> Originally Posted by *Durvelle27*
> 
> Quote:
> 
> 
> 
> Originally Posted by *owcraftsman*
> 
> Yep
> 
> 
> 
> Do you play BF4
Click to expand...

Hope this helps. From a round of Naval Strike multiplayer, 42,000+ frames. Results are subjective; YMMV depending on map, number of players, etc.



If I get a chance I'll do DX11.1 for comparison, but I use Mantle with cores unparked on W8.1 x64


----------



## Roboyto

Quote:


> Originally Posted by *ace1ndahole*
> 
> Hey can you see if my 290 Vapor X overclock settings are results are decent?
> Do you think I can push more or am I being conservative?
> 
> Here's Afterburner setting and Uningine Valley result that the setting got. When I try to push it to 1150 core and 1550 memory, artifacts and some black screens shows up.
> 
> But I got stable overclock on 1130/1550





Spoiler: Warning: Spoiler!






>






Getting artifacts and blackscreens means your core and memory overclocks are too high.


Artifacts are from not enough voltage for the clock speed you're running
Blackscreens are from RAM speeds being too high for the voltage you're running or just too high for your cards capabilities in general
Remember that all 290(X) cards ship with only 5GHz-rated RAM modules, and some don't overclock well at all
1550 = 6.2GHz


My suggestion is to set the RAM clocks back to factory and concentrate on the core clock first.


RAM speeds give a minimal performance boost compared to the core clock on these cards. 
Once you get around to overclocking the RAM, you can try adjusting the AUX voltage in Afterburner to assist with stability.

As long as you can keep temperatures on the core and VRM1 in check then you can keep pushing


Card will throttle by reducing clocks/voltages once you exceed 94C
Typically best to keep VRM1 temperatures under 90C

More information in general for 290(X): http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781
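(The "1550 = 6.2GHz" line above is just GDDR5's quad data rate; a throwaway sketch for double-checking your own Afterburner numbers:)

```python
# GDDR5 transfers data four times per memory-clock cycle, so the
# "effective" speed on spec sheets is 4x the clock you set in Afterburner.

def gddr5_effective_mhz(mem_clock_mhz):
    """Effective GDDR5 data rate in MHz for a given memory clock in MHz."""
    return mem_clock_mhz * 4

print(gddr5_effective_mhz(1250))  # stock 290(X) memory: 5000 MHz (5 GHz)
print(gddr5_effective_mhz(1550))  # the overclock above: 6200 MHz (6.2 GHz)
```

That is also why the stock modules are called "5GHz" parts despite running a 1250 MHz clock.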


----------



## Durvelle27

Quote:


> Originally Posted by *kayan*
> 
> Honestly, I have yet to use mantle. I play on Ultra everything, except AA which is around x2, and My frames stay constant in the 50s. Granted this was back on air, and not overclocked. I don't have a frame counter installed, but I could possibly check tomorrow, if I can still get into my FRAPS account.
> 
> Edit: I ONLY play on full 64-person servers.


On Ultra with 4xMSAA @ 2560x1080 I'm only getting 30-70 FPS in 64-player multiplayer with my 290X at 1125/1400


----------



## Aussiejuggalo

Gonna be sticking my 290 underwater soon, hopefully I'll be able to clock it and see what she can do









Is there anything I need to be careful of when sticking a block on my card, or just the usual don't overtighten and use the paper washer thingys?


----------



## bencher

After I enabled Mantle BF4 crashes on start up


----------



## bond32

Quote:


> Originally Posted by *bencher*
> 
> After I enabled Mantle BF4 crashes on start up


Are you overclocked? I've found Mantle is more stressful on the CPU/RAM; a stable DirectX OC can fail on Mantle.


----------



## bencher

Quote:


> Originally Posted by *bond32*
> 
> Are you overclocked? I've found mantle is more stressful on the cpu ram, a stable directx OC can fail on mantle.


Nope... BF4 doesn't even start.

Everything was fine with my 7970 in.


----------



## bond32

Honestly, I would just stick with the 13.12 drivers. Mantle doesn't improve anything for most of us


----------



## Roboyto

Quote:


> Originally Posted by *bencher*
> 
> After I enabled Mantle BF4 crashes on start up


Quote:


> Originally Posted by *bond32*
> 
> Honestly, I would just stick with the 13.12 drivers. Mantle doesn't improve anything for most of us


^

What he said. 13.12 doesn't fail.


----------



## Roboyto

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Gonna be sticking my 290 underwater soon, hopefully I'll be able to clock it and see what she can do
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is there anything I need to be careful of when sticking a block on my card or just the usual dont over tighten and use the paper washer thingys?


Just take normal precautions when taking everything apart.

Clean off all previous TIM and even the residue left by old thermal pads for best results. A tiny dab of TIM is all that is necessary for a perfectly flat block directly contacting the die.

Usually there is a designated order to tighten the screws, it is best to follow that, use the washers, and don't overtighten anything. If you do over tighten, you will likely be able to see the PCB bending/warping.

Unless you're using the AquaComputer block, I would suggest upgrading thermal pads to keep VRM1 as cool as the core: http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures

Leak test extensively before giving anything power and all should be well


----------



## SwaggerDragon

Hi everyone, new OCN user here. I'd like to join this club with my retired Dogecoin miner!
GPU-Z validation: http://www.techpowerup.com/gpuz/aww8q/
Screenshot:

It's a Gigabyte reference model, got it not long after release.
Cooling is stock for now, looking to do the NZXT G10 block later!


----------



## Mega Man

Quote:


> Originally Posted by *Goride*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Goride*
> 
> Last ditch effort before I RMA this 290x:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Why are my vrm1 temps so damn hot?
> 
> I posted a few times in this thread, but its pretty buried so I will recap.
> 
> I have an NZXT G10, with an AMD Liquid Cooling System AIO (rebadged Antec Kuhler 920). I have a Gelid Enhancement Kit for the heatsink over the vrm1 and 2. I have some decent aluminum heatsinks over the 16 memory chips.
> 
> I have tried using 2 layers and 4 layers (it was suggested that maybe I need more padding to transfer the heat better) of 0.5mm Fujipoly Extreme thermal pad. The Gelid heatsink gets scorching hot... as in it burns my finger to touch it. Even the little nut on the back of the card is flaming hot. So it appears the heat is transferring to the heatsink ok.
> 
> I have both the g10 fan blowing directly on the vrm1 from the front. I have another fan blowing on the 290x from the side. I have the case panel off, so the ambient temp of the case should not be more than room temperature.
> 
> With stock voltage and a clocks speed of 1030/1250, I get vrm1 temps of 97c. When I overclock it to 1200/1500 with a +133 mV, the vrm1 will get to 125c (running 3dmark and Heaven to load the GPU, I do NOT use furmark).
> 
> The stock g10 fan does seem pretty weak, others are getting like 60-70c on the vrm1 with a very similar setup.
> 
> (This card also seems to always score quite a bit lower on every benchmark compared with other like systems, and same clock speeds. I am not sure if this has anything to do with it either.)
> 
> Any ideas?
> 
> I have tried various case fan placements, nothing seems to help. I have plugged the g10 in a few different power ports on the motherboard, as well as plugging it directly into a molex coming off the PSU, which should force the fan to its max speed - no help.
> 
> The card is running seemingly abnormally hot, and yields abnormally low scores in benchmarks and FPS while in real games compared to others. But the card does not artifact, and I used GPU-Z to verify the card was not throttling itself.
> 
> I was having these issues with the stock cooler on it as well. It seemed that the answer to the low scores was that the card was overheating, and if I water cooled it, this would be fixed. But that does not seem to be the case.
> 
> Any last suggestions before I RMA it? I am trying to do all the due diligence I can before I send it back.
> 
> 
> 
> 
> 
> 
> I just tried replacing the G10's stock 92mm fan with a smaller ThermalTake fan I had lying around.
> 
> With that fan blowing on the gelid enhancement kit heatsink, while running Heaven @ 1200/1500 with +133mV, I am now down to 87-90c (down from over 120c with the g10's stock fan at same settings).
> 
> Replacing the fan helped a lot, but this still seems abnormally high. Especially, since others with similar setups are still much lower in temps.
Click to expand...

I hate when people say "similar setups". What are their ambient temps vs yours?
Quote:


> Originally Posted by *Durvelle27*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kayan*
> 
> Honestly, I have yet to use mantle. I play on Ultra everything, except AA which is around x2, and My frames stay constant in the 50s. Granted this was back on air, and not overclocked. I don't have a frame counter installed, but I could possibly check tomorrow, if I can still get into my FRAPS account.
> 
> Edit: I ONLY play on full 64-person servers.
> 
> 
> 
> On Ultra with 4xMSAA @2560x1080 I'm only getting 30-70 FPS 64MP with my 290X at 1125/1400
Click to expand...

Sounds like you need to go back to your 780 Ti (or whatever it is)?

On a side note, do people ever use search, or do they just ask questions that have been answered a few hundred times in this thread.....


----------



## bencher

Quote:


> Originally Posted by *bond32*
> 
> Honestly, I would just stick with the 13.12 drivers. Mantle doesn't improve anything for most of us


Quote:


> Originally Posted by *Roboyto*
> 
> ^
> 
> What he said. 13.12 doesn't fail.


I will try that when i have the time.

Are there any beta drivers I can try?


----------



## kizwan

Quote:


> Originally Posted by *hotrod717*
> 
> Funny, I've had 2 290x reference release cards that couldn't break 1225 and 1250 respectively.


Some cards need more volts to break 1250 & 1300. It depends how much voltage you willing to push to your cards.

Quote:


> Originally Posted by *bond32*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> Lol. You are mad over nothing. I post you a score without tess off and you are whining.And you call it slower when your score is with tess off. And yeah tsm 3 ref cards jomama 4 cards ranger 4 cards devilheadas 4 cards. You dont even know me and you insult me. I know when ref is better and when its not. But you probably dont. You dont even push 1.6 volts like someone that you hate. For regular use even ln2 ref is capable. Hawaii doesnt scale at all go look hwbot ln2. Lightning results is the same as the ref card.
> 
> At the end of the day if you get a good chip no matter what you use ref pcb or lightning pcb you will get good clocks. But you are a marketing kid that believes anything. Amd cards scale with temps and lightning has a way better cooler than amd ref cooler. But when you change to water lightning is the same as a ref pcb. But ofc you will rage and insult more because you want to justify your card.
> 
> Also you post a score with tess off. And i told you its slow for a lightning with tess off. I dont care about hwbot but the link that i post you has tess on and its really close to you.Fix your card. DO you remember when i told you to fix your card for valley too with your 7970 matrix? I was correct
> 
> tldr; For regular use even for ln2 a custom pcb will give you the same results as a ref pcb. You clocks depends only at the binning. Dont believe marketing.
> 
> 
> 
> 
> 
> 
> Having owned a 780 Lightning I'm inclined to agree with you. Which begs the question, the extra phases/quality components in the cards like the lightning are good for what?? Are there any cards out there (like the 780ti) that could get any benefit from the extra phases?

Even with the extravagant components, it still depends on the chip quality. The quality components on non-ref cards are useful if the ref cards use weak components, but for the ref 290/290X cards I don't think that's the case. I think the 290 Lightning uses chips from the same pile the manufacturer uses to make ref cards.


----------



## rdr09

Quote:


> Originally Posted by *Mega Man*
> 
> i hate when people say " similar setups " what are their ambient temps vs yours ?
> sounds like *you need to go back to your 780ti?* ( or w.e. it is )
> 
> on a side note do people ever use search or do they just ask a question which has been answered a few hundred times in this thread .....


Nah, either get another 290X or give up some eye candy.


----------



## kizwan

Quote:


> Originally Posted by *bencher*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Unknownm*
> 
> have you clicked the screenshot? it goes full size 1080p.... it says 59c. I'm so confused
> 
> 
> 
> When I click it, it is still tiny :/

Right click the image & open in new tab.


----------



## Durvelle27

Quote:


> Originally Posted by *Mega Man*
> 
> i hate when people say " similar setups " what are their ambient temps vs yours ?
> sounds like you need to go back to your 780ti? ( or w.e. it is )
> 
> on a side note do people ever use search or do they just ask a question which has been answered a few hundred times in this thread .....


I use the search sometimes but not often.


----------



## Roboyto

Quote:


> Originally Posted by *Mega Man*
> 
> i hate when people say " similar setups " what are their ambient temps vs yours ?
> 
> on a side note do people ever use search or do they just ask a question which has been answered a few hundred times in this thread .....


I've gotten great results with my 290 and Kraken G10, using the Gelid VRM kit and an Antec 620; ambient for my condo is ~21C. From the sounds of that post, the heatsink/pads aren't installed properly. I don't know how/why using 4 layers of 0.5mm pads would be a good idea when the included pad works just fine.

It would seem people never use search. If they did, they would probably land on my post which includes answers for most any question regarding these cards: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781

Quote:


> Originally Posted by *BradleyW*
> 
> Is it normal to have an 18C difference between VRM 1 and VRM 2? (290X)
> Thank you.


Looks like you have EK blocks, so with the stock EK pads this would be normal. If you get yourself some Fujipoly Ultra Extreme pads and use them on the VRMs you should be able to get core and both VRMs very close to the same temperature.

Check my thread out here for thermal pad upgrade: http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures

After the pads have 'broken in' over the last few months, my VRM temps are actually under core temp.


----------



## Mega Man

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> i hate when people say " similar setups " what are their ambient temps vs yours ?
> sounds like *you need to go back to your 780ti?* ( or w.e. it is )
> 
> on a side note do people ever use search or do they just ask a question which has been answered a few hundred times in this thread .....
> 
> 
> 
> nah, either get another 290X or give up some eyecandies.

i stand by my original comment


----------



## kizwan

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Gonna be sticking my 290 underwater soon, hopefully I'll be able to clock it and see what she can do
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is there anything I need to be careful of when sticking a block on my card or just the usual dont over tighten and use the paper washer thingys?


Yup, don't over tighten. There is one screw hole above the 290 chip with tiny resistors very close to it. You definitely don't want to over tighten there. I didn't put a screw there at all.


----------



## Goride

Quote:


> Originally Posted by *Roboyto*
> 
> I've gotten great results with my 290 and Kraken G10, using the Gelid VRM kit and an Antec 620; ambient for my condo are ~21C. From the sounds of that post, the heatisnk/pads aren't installed properly. I don't how/why using 4 layers of 0.5mm pads would be a good idea when the included pad works just fine.
> 
> It would seem people never use search. If they did, they would probably land on my post which includes answers for most any question regarding these cards: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781


I did in fact use the search. And I also viewed and quoted your posts regarding this in the past.

I have tried using a row of aluminum heatsinks w/ Fujipoly Extreme thermal pads, but had high VRM1 temps. I saw that you, and some others, were getting good results with the Gelid enhancement kit heatsink. So I ordered it and tried it. I used 2 layers of 0.5mm Fujipoly Extreme pads. Same temps as the aluminum heatsinks. Another user said that he tried the same thing as me, but that 2 layers was not enough, and suggested that I use more. So I tried more. Same temps. I have also, for the heck of it, tried the padding that the g10 came with; roughly the same temps.

At stock voltage and clock speeds it will be 95-100c in Heaven. When I overclock, it has gotten to 120c before I shut it down; I am sure it would have gone higher.

The Gelid heatsink (and the first aluminum heatsinks that I used) were absolutely scorching hot when I touched it. The heat seems to have transferred to the sink fine each time.

The only thing that has lowered the temps so far was replacing the stock g10 fan, but it is still too hot.

(Also, as I said before, whenever I run a benchmark I also get substantially lower scores than others. For instance, at stock I get about 8600 in 3DMark Firestrike 1.1; when I overclock to 1200/1500 I get about 9600. I have a 3570K overclocked stable at 4.4GHz. I also tried running the CPU at stock. I have tried two different RAM kits, one Samsung 8GB kit and another Patriot 16GB kit. The scores only vary slightly when I do this. Most people seem to be able to break 10k in Firestrike, I believe at stock speeds. I have not even come close with it overclocked.)
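The score gap described above is easy to put in rough percentage terms. This is just a quick sketch using only the figures from the post itself; the ~10000 "typical" stock Firestrike score is the post's own estimate, not a measured reference number.

```python
# Rough percentage gap between the reported Firestrike scores and the
# ~10000 "typical" stock score mentioned in the post (all figures come
# from the post above; none of these are official reference scores).
def shortfall_pct(score, typical=10000):
    return 100.0 * (typical - score) / typical

print(f"stock (8600):       {shortfall_pct(8600):.0f}% below typical")
print(f"overclocked (9600): {shortfall_pct(9600):.0f}% below typical")
# -> stock is ~14% below typical; even overclocked it is ~4% below
```

A ~14% deficit at stock is well outside normal run-to-run variance for Firestrike, which is consistent with the suspicion that something is wrong with the card rather than the rest of the system.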


----------



## Aussiejuggalo

Quote:


> Originally Posted by *kizwan*
> 
> Yup, don't over tighten. There is one screw holes above the 290 chip where tiny resistors are very close to it. Definitely you don't want over tighten there. I didn't put screw there at all.


Ah ok thanks







, getting a block and backplate so it should be purdy









Damaged a card from overtightening and crushed the core once


----------



## kizwan

Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Goride*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Goride*
> 
> Last ditch effort before I RMA this 290x:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Why are my vrm1 temps so damn hot?
> 
> I posted a few times in this thread, but its pretty buried so I will recap.
> 
> I have an NZXT G10, with an AMD Liquid Cooling System AIO (rebadged Antec Kuhler 920). I have a Gelid Enhancement Kit for the heatsink over the vrm1 and 2. I have some decent aluminum heatsinks over the 16 memory chips.
> 
> I have tried using 2 layers and 4 layers (it was suggested that maybe I need more padding to transfer the heat better) of 0.5mm Fujipoly Extreme thermal pad. The Gelid heatsink gets scorching hot... as in it burns my finger to touch it. Even the little nut on the back of the card is flaming hot. So it appears the heat is transferring to the heatsink ok.
> 
> I have both the g10 fan blowing directly on the vrm1 from the front. I have another fan blowing on the 290x from the side. I have the case panel off, so the ambient temp of the case should not be more than room temperature.
> 
> With stock voltage and a clocks speed of 1030/1250, I get vrm1 temps of 97c. When I overclock it to 1200/1500 with a +133 mV, the vrm1 will get to 125c (running 3dmark and Heaven to load the GPU, I do NOT use furmark).
> 
> The stock g10 fan does seem pretty weak, others are getting like 60-70c on the vrm1 with a very similar setup.
> 
> (This card also seems to always score quite a bit lower on every benchmark compared with other like systems, and same clock speeds. I am not sure if this has anything to do with it either.)
> 
> Any ideas?
> 
> I have tried various case fan placements, nothing seems to help. I have plugged the g10 in a few different power ports on the motherboard, as well as plugging it directly into a molex coming off the PSU, which should force the fan to its max speed - no help.
> 
> The card is running seemingly abnormally hot, and yields abnormally low scores in benchmarks and FPS while in real games compared to others. But the card does not artifact, and I used GPU-Z to verify the card was not throttling itself.
> 
> I was having these issues with the stock cooler on it as well. It seemed that the answer to the low scores was that the card was overheating, and if I water cooled it, this would be fixed. But that does not seem to be the case.
> 
> Any last suggestions before I RMA it? I am trying to do all the due diligence I can before I send it back.
> 
> 
> 
> 
> 
> 
> I just tried replacing the G10's stock 92mm fan with a smaller ThermalTake fan I had lying around.
> 
> With that fan blowing on the gelid enhancement kit heatsink, while running Heaven @ 1200/1500 with +133mV, I am now down to 87-90c (down from over 120c with the g10's stock fan at same settings).
> 
> Replacing the fan helped a lot, but this still seems abnormally high. Especially, since others with similar setups are still much lower in temps.
> 
> 
> i hate when people say " similar setups " what are their *ambient temps vs yours* ?

^^ Exactly.


----------



## PJFT808

On my ref Asus 290 the Gelid VRM pad didn't make contact well. Like I said, I used what I had; it made the best contact with 4 layers. I can play BF4 for hours with my temps all nice @ 1000/5300, nothing major. Kraken G10 and an H55 push/pull: 51C core, 53C on both VRMs. In my case it worked best. I don't think these are all exact for everyone; seems my tolerances varied for some reason. But those temps you have are still way high.


----------



## Mega Man

Quote:


> Originally Posted by *Durvelle27*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> i hate when people say " similar setups " what are their ambient temps vs yours ?
> sounds like you need to go back to your 780ti? ( or w.e. it is )
> 
> on a side note do people ever use search or do they just ask a question which has been answered a few hundred times in this thread .....
> 
> 
> 
> I use the search sometimes but not often.

It was not directed at you... but seriously, over idk the last week the same question has been asked ~10 times...


----------



## Goride

Quote:


> Originally Posted by *Mega Man*
> 
> i hate when people say " similar setups " what are their ambient temps vs yours ?


Quote:


> Originally Posted by *kizwan*
> 
> ^^ Exactly.


The answer was pretty much in the post I quoted of myself.

I am getting 95-100c at load in Heaven w/ stock voltage and clock speeds. Others seem to be anywhere between 50-70c in Heaven.

My ambient vrm1 temps without a load are around 35c with the new fan. With the stock fan it was usually somewhere in the 50c range. This is also somewhat higher than others, but well within acceptable range. What is not acceptable is the at load temps.

And by similar setups I mean people running a single 290/290X with a G10 + AIO water cooling system, which is what I said I was running.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Mega Man*
> 
> i hate when people say " similar setups " what are their ambient temps vs yours ?
> sounds like you need to go back to your 780ti? ( or w.e. it is )
> 
> on a side note do people ever use search or do they just ask a question which has been answered a few hundred times in this thread .....


I don't, too busy benching


----------



## kizwan

Quote:


> Originally Posted by *Goride*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> i hate when people say " similar setups " what are their ambient temps vs yours ?
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> ^^ Exactly.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The answer was pretty much in the post I quoted of myself.
> 
> I am getting 95-100c at load in Heaven w/ stock voltage and clock speeds. Others seem to be between 60-70c in Heaven.
> 
> My ambient temps without a load are around 40c with the new fan. With the stock fan it was usually somewhere in the 50-60c range. This is also somewhat higher than others, but well within acceptable range. What is not acceptable is the at load temps.
> 
> Any by similar setups I mean people running a single 290/290x with a g10 + AIO water cooling system, which is what I said I was running.

Ambient temp refers to indoor temp, i.e. the room temp where your computer is located. But case temp is useful as well.

Can you take pictures? I would like to see how you installed the thermal pad & whether the heatsink has proper contact with VRM1. You might want to try pointing another fan at the back of the card (behind VRM1). It should help with VRM1 cooling.


----------



## ace1ndahole

Wow thanks, I'm a total noob.


----------



## Mega Man

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Goride*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> i hate when people say " similar setups " what are their ambient temps vs yours ?
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> ^^ Exactly.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The answer was pretty much in the post I quoted of myself.
> 
> I am getting 95-100c at load in Heaven w/ stock voltage and clock speeds. Others seem to be between 60-70c in Heaven.
> 
> My ambient temps without a load are around 40c with the new fan. With the stock fan it was usually somewhere in the 50-60c range. This is also somewhat higher than others, but well within acceptable range. What is not acceptable is the at load temps.
> 
> Any by similar setups I mean people running a single 290/290x with a g10 + AIO water cooling system, which is what I said I was running.
> 
> 
> *Ambient temp is referring to indoor temp or your room temp* where your computer located. But case temp useful as well.
> 
> Can you take pictures? I would like to see how you installed the thermal pad & whether the heatsink have proper contact with the VRM1. You might want to try pointing another fan on the back of the card (at the back of VRM1). It should help in VRM1 cooling.

Case vs ambient can tell a lot.


----------



## Goride

Quote:


> Originally Posted by *kizwan*
> 
> Ambient temp is referring to indoor temp or your room temp where your computer located. But case temp useful as well.
> 
> Can you take pictures? I would like to see how you installed the thermal pad & whether the heatsink have proper contact with the VRM1. You might want to try pointing another fan on the back of the card (at the back of VRM1). It should help in VRM1 cooling.


I can take a pic. But as I said, I tried several pads and thicknesses. Right now it is extra thick, so it has a bit more than normal squishing out the sides. But each time I took the pads off and replaced them, every module had an imprint in the pad, and there was no space between it and the heatsink (solid thermal pad in between).

I have also tried many different fan configurations. I have tried positioning a fan at the end of the card towards VRM1, from the side of the card (where the side panel of the case would be), and in front of the g10's fan to try and bring extra cool air over there. I have tried up to 4 fans blowing on it, side panel on/off, etc. No real difference in temps; depending on the fan configuration it fluctuates between 90-100c at stock settings.

Ambient temp of my room is about 20-24c depending.




(I know these are not the best pics, but I could not do much better as I need the computer running at the moment. For what it is worth, the padding makes solid contact like that all the way down. As I said above, there is more on it than necessary at the moment. It is maybe about 2-3c hotter with these extra layers than what it was with 2 layers, but I tried the extra layers on another user's suggestion. Adding the layers made it slightly worse, but basically the same.)


----------



## Arizonian

Quote:


> Originally Posted by *Unknownm*
> 
> Just upgraded to a 290! holy cow for the price (used) it is a nice upgrade!!!
> 
> Also replaced the thermal paste because the stock was crap
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added








Quote:


> Originally Posted by *Greg.m*
> 
> Here are my Sapphire 290X reference beauties running mostly @ stock... Coooool and quiet...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added








Quote:


> Originally Posted by *SwaggerDragon*
> 
> Hi everyone, new OCN user here. I'd like to join this club with my retired Dogecoin miner!
> GPU-Z validation: http://www.techpowerup.com/gpuz/aww8q/
> Screenshot:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Its a Gigabyte Reference model, got it not long after release.
> Cooling is stock for now, looking to do the NZXT G10 block later!


Congrats - added


----------



## Talon720

Now I dunno if anyone remembers my issue. I have a trifire 290x setup plumbed in parallel with bad temps at idle and load. I couldn't keep the temps down; it always seemed like I couldn't get all the air out despite my best effort. Well, I decided to install a second d5 pump I had laying around, and it was a night and day difference. For one, the system bled so much easier and faster. Second, the temperature is way lower comparatively and right where it should be. Now I just might need to shoehorn a 3rd rad in and I should be set. Figured I'd give this update to anyone who might be reading through all the pages like I did.


----------



## kizwan

Quote:


> Originally Posted by *Goride*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Ambient temp is referring to indoor temp or your room temp where your computer located. But case temp useful as well.
> 
> Can you take pictures? I would like to see how you installed the thermal pad & whether the heatsink have proper contact with the VRM1. You might want to try pointing another fan on the back of the card (at the back of VRM1). It should help in VRM1 cooling.
> 
> 
> 
> 
> 
> 
> 
> I can take pic. But as I said I tried several pads and thicknesses. Right now it is extra thick, so it has a bit more than normal squishing out the sides. But each time I took the pads off and replaced them, every module had an imprint in the pad, and there was no space between it and the heatsink (solid thermal pad in between).
> 
> I have also tried many different fan configurations. I have tried positioning a fan at the end of the card towards the vrm1, from the side of the card (where the side panel of the case would be), and infront of the g10's fan to try and bring extra cool air over there. I have tried up to 4 fans blowing on iit, side panel on/off, etc. No real difference in temps - depending on the fan configuration it fluctuates between 90-100c with stock temps.
> 
> Ambient temp of my room is about 20-24c depending.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> (I know these are not the best pics, but I could not do much better as I need the computer running at the moment. For what it is worth, the padding makes solid contact like that all the way down. As I said above there is more on it than necessary at the moment. It is maybe about 2-3c hotter with this extra layers than what it was with 2 layers. But tried extra layers on another users suggestion. Adding the layers made it slightly worse, but basically the same.)

You're using a different fan, right? So, Heaven @ 1200/1500 with +133mV, down to 87-90c (VRM1). Basically this looks good considering you're overclocked & overvolted. What do you get at stock clocks?

IMO, no more than 2 layers of thermal pads though. BTW, do you have the fan blowing air toward VRM1 or sucking air from it? I'm assuming you tried both orientations?

I can't see anything wrong from the picture. Looks ok to me. If you got the same problem with the stock cooler, even at stock clocks, it's probably a good idea to RMA the card.
Quote:


> Originally Posted by *Talon720*
> 
> Now I dunno if anyone remembers my issue. I have a trifire 290x setup in parrallel with bad temps idle and load. I couldn't keep the temps down. It always seemed like I couldn't get all the air out despite my best effort. Well I decided to install second d5 pump I had laying around, it was a night and day difference. For one the system bled so much easier and faster. Second thing is, the temperature is lower way lower comparatively and right where it should be. Now I just might need to shoehorn a 3rd rad in and I should be set. Figured I'd give this update to anyone who might be reading through all the pages like I did.


I told you so.


----------



## Unknownm

so my card is now a jet engine at 90% fan speed...


----------



## Durvelle27

Quote:


> Originally Posted by *Mega Man*
> 
> it was not directed to you... but seriously over ...idk the last week the same question has been asked ~10 times...


Ahhh I see. Guess I'll try OCing further and see what it yields.


----------



## hotrod717

Quote:


> Originally Posted by *sugarhell*
> 
> Lol. You are mad over nothing. I post you a score without tess off and you are whining.And you call it slower when your score is with tess off. And yeah tsm 3 ref cards jomama 4 cards ranger 4 cards devilheadas 4 cards. You dont even know me and you insult me. I know when ref is better and when its not. But you probably dont. You dont even push 1.6 volts like someone that you hate. For regular use even ln2 ref is capable. Hawaii doesnt scale at all go look hwbot ln2. Lightning results is the same as the ref card.
> 
> At the end of the day if you get a good chip no matter what you use ref pcb or lightning pcb you will get good clocks. But you are a marketing kid that believes anything. Amd cards scale with temps and lightning has a way better cooler than amd ref cooler. But when you change to water lightning is the same as a ref pcb. But ofc you will rage and insult more because you want to justify your card.
> 
> Also you post a score with tess off. And i told you its slow for a lightning with tess off. I dont care about hwbot but the link that i post you has tess on and its really close to you.Fix your card. DO you remember when i told you to fix your card for valley too with your 7970 matrix? I was correct
> 
> tldr; For regular use even for ln2 a custom pcb will give you the same results as a ref pcb. You clocks depends only at the binning. Dont believe marketing.


I post a score and you post things to try and chop it down. That's hating. I don't hate on anyone for getting decent scores; I congratulate them. As for the Matrix, you continually trolled me in the 7970 thread every time I posted a score, just like now. And when I respond to your hate and insults, I'm whining?? Lol! Again, try posting your own scores and stop trolling, looking to find fault in anything but what you believe.


----------



## ebhsimon

Quote:


> Originally Posted by *ace1ndahole*
> 
> Hey can you see if my 290 Vapor X overclock settings are results are decent?
> Do you think I can push more or am I being conservative?
> 
> Here's Afterburner setting and Uningine Valley result that the setting got. When I try to push it to 1150 core and 1550 memory, artifacts and some black screens shows up.
> 
> But I got stable overclock on 1130/1550


What's your power limit and voltage offset?

I'm only getting 1135/1550 at +100mV and +50% power limit. Playing titanfall VRM temps max out at about 71C, core maxes out at 67C with an ambient of 23C.

EDIT: I just chucked a case fan on top of my harddrive cage so it's blowing directly over the GPU. Max of 60C VRM1 and 67 core.


----------



## rdr09

Quote:


> Originally Posted by *ebhsimon*
> 
> What's your power limit and voltage offset?
> 
> I'm only getting 1135/1550 at +100mV and +50% power limit. Playing titanfall VRM temps max out at about 71C, core maxes out at 67C with an ambient of 23C.
> 
> EDIT: I just chucked a case fan on top of my harddrive cage so it's blowing directly over the GPU. Max of 60C VRM1 and 67 core.


Nice temps on the VRMs. It is like watercooled, considering the OC.


----------



## ebhsimon

Quote:


> Originally Posted by *rdr09*
> 
> nice temps on the vrms. it is like watercooled considering the oc.


Keep in mind this is on Titanfall. Even though it's at 1440p maxed settings it's locked to 60fps. Doesn't produce anywhere near as much heat as benchmarking or stress testing.

I took the fan out, and I still only got 61C for VRM1 and 68C for core. But like I have said before, I'm sure it's because I leave my side panel off.


----------



## rdr09

Quote:


> Originally Posted by *ebhsimon*
> 
> Keep in mind this is on Titanfall. Even though it's at 1440p maxed settings it's locked to 60fps. Doesn't produce anywhere near as much heat as benchmarking or stress testing.
> 
> I took the fan out, and I still only got 61C for VRM1 and 68C for core. But like I have said before, I'm sure it's because I leave my side panel off.


My panel is off, too. Who needs it when that side is against the wall.

Edit: do you really see any benefit OC'ing the memory?


----------



## Goride

Quote:


> Originally Posted by *kizwan*
> 
> You're using different fan right? So, Heaven @ 1200/1500 with +133mV, down to 87-90c (VRM1). Basically this look good considering you're overclocked & overvolted. What do you get at stock clock?


No. Those temps are at STOCK voltage and STOCK clock speeds, WITH the replacement fan. With the replacement fan at the overclocked speeds and voltage I am about 100c. With the g10's original fan I would be over 120c. With the original fan at stock voltage and stock clock speeds I would be at 97-100c.

Quote:


> IMO, no more than 2 layers of thermal pads though.


Yes, I agree. But since I was getting horrible temps with 2 layers, someone suggested that I try more to ensure I was making good contact. Being at a loss for any other reason why my temps are so high, I tried it out. I gained 2-3c on my temps, but otherwise basically the same.
Quote:


> BTW, do you have the fan blowing air toward VRM1 or sucking air from it? I'm assuming you tried both orientation?
> 
> I can't see anything wrong from the picture. Look ok to me. If you got the same problem with stock cooler, even at stock clock, probably good idea to RMA the card.
> I told you so.


Yeah, it is blowing directly on the heatsink. I have not actually tried to orientate it away from the heatsink. I suppose I could try, but the sink really needs air blowing on it.

I think I will be RMAing the card as well. I am just trying to make sure there is nothing that I have not tried before I do so.
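For what it's worth, the "no more than 2 layers" advice matches a quick 1-D conduction estimate: the temperature drop across a pad scales with its thickness. This is only a back-of-the-envelope sketch, and the numbers in it are assumptions (roughly 40W through VRM1, a ~10 cm2 contact patch, and the ~17 W/mK rating Fujipoly quotes for its top pads); real VRM geometry is messier.

```python
# Rough 1-D conduction estimate: delta-T across a thermal pad = P * t / (k * A).
# All inputs are assumptions for illustration, not measured values:
#   power_w  ~ heat through VRM1 (assumed 40 W under load)
#   k_w_mk   ~ pad conductivity (Fujipoly's quoted ~17 W/mK rating)
#   area_cm2 ~ contact patch (assumed 10 cm^2)
def pad_delta_t(power_w, thickness_mm, k_w_mk=17.0, area_cm2=10.0):
    return power_w * (thickness_mm / 1000.0) / (k_w_mk * area_cm2 * 1e-4)

for layers in (1, 2, 4):
    dt = pad_delta_t(40, 0.5 * layers)
    print(f"{layers} x 0.5mm: ~{dt:.1f}C across the pad")
```

Under these assumptions each extra pair of 0.5mm layers adds a couple of degrees, which lines up with the ~2-3c penalty reported above for 4 layers vs 2, so stacking pads can only confirm contact, not improve cooling.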


----------



## ebhsimon

Quote:


> Originally Posted by *rdr09*
> 
> my panel is off, too. who needs it when that side is against the wall.
> 
> edit: do you really see any benefit oc'ing the memory?


Probably not much, but seeing as titanfall with max (insane) textures uses 3.6gb of ram, and the temps don't rise too high I don't mind. I haven't measured the performance gain though.

Interestingly, I just tested temperatures with my side panel off and on, both with case fans off and on - 15 minute runs with prime95 and kombustor.

With the side panel off, it didn't make a difference to the temperatures if the case fans were off or max (I have 7 with slight positive pressure).
With the side panel on, the case fans on max were 2 degrees cooler than with the side panel off. When I turned the case fans off, I had to abort the stress tests within 5 minutes because my computer would have melted down.

*Conclusion: if you want quiet, don't bother changing your fans. Just turn them all off with a fan controller (or disconnect/remove) and take off the side panel. It's the same temperature as having your case fans on maximum (12v) with the side panel on.*

Note: this is for cases with no side panel mounted fan.

I have all my testing data and pictures saved if anyone wants me to make a separate post.


----------



## kizwan

Quote:


> Originally Posted by *Goride*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> BTW, do you have the fan blowing air toward VRM1 or sucking air from it? I'm assuming you tried both orientation?
> 
> 
> 
> Yeah, it is blowing directly on the heatsink. I have not actually tried to orientate it away from the heatsink. I suppose I could try, but the sink really needs air blowing on it.

I'm asking because @Roboyto has the fan in "sucking" orientation. See *here*. It's a long shot but couldn't hurt to try.


----------



## bencher

I am at 1049/1250 on my R9 290, just started overclocking.

Stock volts stable with 2 hours of Crysis 3


----------



## Goride

Quote:


> Originally Posted by *kizwan*
> 
> I'm asking because @Roboyto have the fan in "sucking" orientation. See *here*. It's a long shot but couldn't hurt to try.


I can give it a shot as a last-ditch effort. It does look like he put the fan on backwards, pulling the air.

The fan I replaced the stock fan with can really push air, so I doubt the reversed orientation can do better than this fan.

But I might as well give it a try.


----------



## Gobigorgohome

Hey, this might seem like an idiotic question, but how do 4x R9 290s work at 1080p (yes, 1920x1080, one monitor)? I am upgrading to 4K as soon as the Samsung LU28D590 is released in my country, but it got another delay, so I guess it will come next month.


----------



## hwoverclkd

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Hey, this might seem like a idiotic question, but how does 4x R9 290s work on 1080P (yes, 1920x1080, one monitor)? I am upgrading to 4K as soon as they release the Samsung LU28D590 to my country, and it got another delay, so I guess it will come next month.


It all depends on what you plan to do down the road. 4x 290s? Overkill for 1080p. They should have more than enough muscle for 4K, and the 4GB of VRAM should be enough for most games at that resolution, but not all.


----------



## kayan

Quote:


> Originally Posted by *the9quad*
> 
> do this no need for fraps:
> 
> while in game type
> 
> PerfOverlay.FrameFileLogEnable 1
> 
> This will generate a csv file of your frametimes
> 
> When done type (after about 10 minutes of gametime)
> 
> PerfOverlay.FrameFileLogEnable 0
> 
> when you exit the game, load the csv file into BF4FTA (thread here: http://www.overclock.net/t/1469627/bf4fta-battlefield-4-frame-time-analyzer-version-4-2-released-major-release)
> 
> This will show you exactly how constant your 50 fps is and will give a true bench of your BF4 gameplay. My guess is you will be surprised how often you are not at 50 fps.


Quote:


> Originally Posted by *Durvelle27*
> 
> On Ultra with 4xMSAA @2560x1080 I'm only getting 30-70 FPS 64MP with my 290X at 1125/1400


I just played 2 matches of BF4 for you, just to check for sure. Metro 2014, 61-64 players, 2560x1440, 2x MSAA, Ultra everything else: my FRAPS counter jumped between 47 and 67 on DX11. I played for about 1.5 hours on this map and frames were steady, mostly hovering around 55-63, with dips as low as 47 and highs as high as 73.

I was in single card mode, as I turned off my extra card (no power at all), and there was no overclock on my one card.
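The PerfOverlay log the9quad describes above can also be summarized with a few lines of scripting instead of BF4FTA. A rough sketch only: the assumption that the CSV holds one frame time in milliseconds in its first column may not match your log's actual layout, so check your own file first.

```python
import csv

def summarize(path, target_fps=50.0):
    """Summarize a frame-time CSV: average fps and % of slow frames."""
    budget_ms = 1000.0 / target_fps            # 20 ms budget at 50 fps
    times = []
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if row:
                try:
                    times.append(float(row[0]))
                except ValueError:
                    pass                        # skip header rows
    slow = sum(1 for t in times if t > budget_ms)
    avg_fps = 1000.0 / (sum(times) / len(times))
    return {"frames": len(times),
            "avg_fps": round(avg_fps, 1),
            "pct_below_target": round(100.0 * slow / len(times), 1)}
```

`pct_below_target` is the share of frames that missed the 50 fps budget, which is exactly the "how often are you not at 50 fps" question from the quote.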


----------



## Gobigorgohome

Quote:


> Originally Posted by *acupalypse*
> 
> It all depends on what you plan to do down the road. 4x 290's? overkill for 1080p...that should still have more than enough muscle for 4k but the 4gb vram should be enough for most games, but not all.


I am not really that worried about the 4GB of VRAM on the cards; I was thinking more of a bottleneck because of the resolution. I will definitely go 4K as soon as they release the monitor I want. I am only going to play on one 4K screen, not like others on the forum running three.


----------



## Cool Mike

Any ideas when resellers like Newegg will receive the Sapphire 290X Toxic?

Thanks...


----------



## the9quad

Quote:


> Originally Posted by *kayan*
> 
> I just played 2 matches of BF4 for you, just to check for sure. Metro2014, 61-64 players, 2560x1440, 2xMSAA, Ultra everything else: my fraps jumped from 47-67 on DX11. Played for about 1.5 hours on this map, and frames were steady, mostly hovered around 55-63, dips as low as 47, highs as high as 73.
> 
> I was in single card mode, as I turned off my extra card (no power at all), and there was no overclock on my one card.


Here are some BF4 results:
64 Player Caspian Sea (I find this the most demanding)
1440p *TRUE* Ultra-100% Resolution Scale-90 FOV-4xMSAA-High Post AA
API-Mantle
Frame Pacing Off in CCC and In-Game


Spoiler: Warning: Spoiler! (benchmark screenshots)

64 Player Caspian Sea
1440p *TRUE* Ultra-100% Resolution Scale-90 FOV-4xMSAA-High Post AA
API-Mantle
Frame Pacing Method 2-default-highest (Crazy performance hit)


Spoiler: Warning: Spoiler! (benchmark screenshots)


----------



## motokill36

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I am not really that worried about the 4GB vRAM on the cards, I thought more like bottlenecking because of the resolution, I will definitively go 4K as soon as they release the monitor I want. I am only going to play on one 4K screen, not like others on the forum working with three.


This is what I had in mind, but I can't get Crossfire to work very well at all; it won't make it through one game of BT4 without the driver crashing.


----------



## Gobigorgohome

Quote:


> Originally Posted by *motokill36*
> 
> This is what i had in mind but cant get crossfire to work very well at all wont make it through one game of BT4 with out driver crashing .


I do not get what you are trying to say here (at all)... are you talking about 4K Eyefinity, quad Crossfire on one 4K screen, or quad Crossfire on one 1080p screen? Please specify what you are trying to say to me; it is very frustrating.

And what is BT4? Did you mean BF4? And what is this you are saying about the driver crash: is that with Crossfire (two cards) or with three or four cards? And again, what resolution are you talking about?


----------



## armartins

Quote:


> Originally Posted by *Goride*
> 
> I can give it a shot as a last ditch effort. It does look like he put the fan on backwards, pulling the air.
> 
> The fan I replaced that stock fan with can really push air. So I doubt it can do better than this fan.
> 
> But, I might as well give it a try.


Maybe you have a really, really low ASIC card that needs a lot of juice at stock clocks... what's your default core voltage?


----------



## Widde

Quote:


> Originally Posted by *the9quad*
> 
> Here are some BF4 results-:
> 64 Player Caspian Sea (I find this the most demanding)
> 1440p *TRUE* Ultra-100% Resolution Scale-90 FOV-4xMSAA-High Post AA
> API-Mantle
> Frame Pacing Off in CCC and In-Game
> 
> 
> Spoiler: Warning: Spoiler! (benchmark screenshots)
> 
> 64 Player Caspian Sea
> 1440p *TRUE* Ultra-100% Resolution Scale-90 FOV-4xMSAA-High Post AA
> API-Mantle
> Frame Pacing Method 2-default-highest (Crazy performance hit)
> 
> 
> Spoiler: Warning: Spoiler! (benchmark screenshots)

Is it the framepacing that makes my fps tank with mantle?







Getting a massive fps drop with Mantle enabled on the 14.4 drivers; I reinstalled the game as well and cleared the Mantle cache.


----------



## sugarhell

Frame pacing method 1 is the best for bf4


----------



## the9quad

Quote:


> Originally Posted by *sugarhell*
> 
> Frame pacing method 1 is the best for bf4


Exactly this. The difference in actual frame-time variance between the default method (2) and the simpler method (1) is negligible; you'd never notice it at all, but method 2 is a serious framerate hit.

I should have done one more bench with method 1 to show it; I have to go to work, but I will try tomorrow. Anyway, it's nearly as fast as no frame pacing and nearly as flat as method 2.

Here is an old bench I did comparing the two. Green is method 1 and red is method 2; lower is faster, btw.



One other odd thing about method 2: when it hits a framerate hiccup, the algorithm will try to hold frames there, as if it will not allow frame times to return to the faster value right away. You can see it near the end on the red line, where it takes a step change.
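the9quad's variance-vs-framerate comparison can be reproduced from any two frame-time logs. A minimal sketch; the sample lists below are made-up placeholders standing in for your own logged frame times (ms):

```python
from statistics import mean, pstdev

def pacing_stats(frametimes_ms):
    # jitter (population std. dev.) is a crude stand-in for frame-time
    # variance: lower means smoother pacing
    return {"avg_ms": round(mean(frametimes_ms), 2),
            "jitter_ms": round(pstdev(frametimes_ms), 2)}

method1 = [16.1, 16.4, 16.0, 16.5, 16.2]   # hypothetical samples
method2 = [18.9, 19.1, 18.8, 19.2, 19.0]   # hypothetical samples

# pacing_stats(method1) -> {'avg_ms': 16.24, 'jitter_ms': 0.19}
# pacing_stats(method2) -> {'avg_ms': 19.0, 'jitter_ms': 0.14}
```

With these invented numbers, method 2 comes out slightly flatter (0.14 vs 0.19 ms of jitter) but roughly 3 ms slower per frame, which is the negligible-smoothness-for-real-fps trade described in the post.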


----------



## the9quad

Quote:


> Originally Posted by *Widde*
> 
> Is it the framepacing that makes my fps tank with mantle?
> 
> 
> 
> 
> 
> 
> 
> Getting a massive fps drop with mantle enabled, 14.4 drivers, reinstalled the game aswell and cleared the mantle cache


try this and see if it helps
RenderDevice.FramePacingMethod 1


----------



## Widde

Quote:


> Originally Posted by *the9quad*
> 
> try this and see if it helps
> RenderDevice.FramePacingMethod 1


That did it







Thx ^^

Went from 50-60ish to 140+


----------



## bencher

Hi guys,

Is this valley score ok for a stock r9 290 (940/1250)?


----------



## the9quad

Quote:


> Originally Posted by *Widde*
> 
> That did it
> 
> Thx ^^
> 
> Went from 50-60ish to 140+


Glad it worked for ya. The interesting thing about the default frame pacing algorithm is that, because of the way it works, if you go into options you can apply and reapply settings like 4x AA and 2x AA back and forth, each time gaining fps. If you do it enough times the framerate will get pretty high, but it's a pain.


----------



## bencher

Quote:


> Originally Posted by *bencher*
> 
> Hi guys,
> 
> Is this valley score ok for a stock r9 290 (940/1250)?


Any insight?


----------



## cennis

http://www.3dmark.com/3dm11/8344976

Hey everyone, how's this score for 2x 290 at 1200/1500, with an i7-3930K at 4.9GHz and 2133MHz CL10 RAM?


----------



## Germanian

Quote:


> Originally Posted by *bencher*
> 
> Any insight?


sounds about right.


----------



## Germanian

Quote:


> Originally Posted by *cennis*
> 
> http://www.3dmark.com/3dm11/8344928
> 
> Hey everyone, how's this score for 2x 290 at 1200/1500, with an i7-3930K at 4.9GHz and 2133MHz CL10 RAM?


I don't have a 3DMark11 score, but if you have time, download the Unigine Valley benchmark and run it (it's free).
https://unigine.com/products/valley/download/

*With my 2x R9 290 @1080/1450 on 14.4 driver I got*



Your score should be better, since you're at 1200 with higher 1500 memory.
If your score comes out lower than mine, your memory might not be stable; try 1450. The memory will apply error correction if it's not stable and slow performance down.

EDIT: BRB, I am testing 1170/1350 and will report back new results. I get artifacts over 1180, but for some reason I get throttling at +100mV, so your score might be better.
I suspect an Afterburner Beta 19 bug with 14.4.
UPDATE: I get heavy throttling at 1170; it drops to 983MHz. I think it's the Afterburner program, sorry, but 1080 should be fine; let me double-check.
On the 2nd run it does some light throttling down to 1020MHz with the core set to 1080; the power limit is broken for me, I think.

On the 2nd run I got 100 FPS; your score should at least be better than 102 FPS.
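Throttling like Germanian describes (core set to 1170, dipping to 983 under load) is easy to spot in a logged clock trace. A hedged sketch: the 2% tolerance and the sample list are assumptions, and the trace would come from whatever clock log your monitoring tool exports.

```python
def throttle_report(samples_mhz, target_mhz, tolerance=0.02):
    """Count clock samples that fall notably below the set clock."""
    floor = target_mhz * (1 - tolerance)
    throttled = [s for s in samples_mhz if s < floor]
    return {"samples": len(samples_mhz),
            "throttled": len(throttled),
            "worst_mhz": min(samples_mhz)}

# e.g. a trace with dips like the ones described above:
report = throttle_report([1170, 1168, 983, 1020, 1170], target_mhz=1170)
# -> {'samples': 5, 'throttled': 2, 'worst_mhz': 983}
```

A high `throttled` count with a low `worst_mhz` points at a power/temperature limit (or a buggy overclocking tool) rather than an unstable overclock, which crashes instead of slowing down.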


----------



## cennis

Quote:


> Originally Posted by *Germanian*
> 
> I don't have a 3Dmark11 score, but if you got time download uniengine valley benchmark and run it (it's free).
> https://unigine.com/products/valley/download/
> 
> *With my 2x R9 290 @1080/1450 on 14.4 driver I got*
> 
> 
> 
> Your score should be better since your at 1200 and higher memory at 1500.
> If your score comes out lower than mine your memory might not be stable try 1450. The memory will use ECC correction if it's not stable and slow down performance.
> 
> EDIT: BRB i am testing 1170/1350 and report back new results. I get artifacts over 1180, but for some reason I get throttling @+100mv so your score might be better.
> I suspect afterburner beta 19 bug with 14.4.
> UPDATE: I get heavy throttling at 1170 goes to 983 clocks. I think it's the afterburner program sorry, but the 1080 should be fine let me double check.
> 2nd run it does some light throttling up to 1020 clocks with core set to 1080 power limit is broken for me I think.
> 
> On 2nd run I got 100 FPS your score should be at least better than 102 FPS.




I think the CPU makes a difference with the RAM.


----------



## bencher

Quote:


> Originally Posted by *Germanian*
> 
> sounds about right.


Thank you sir







+Rep


----------



## Unknownm

Quote:


> Originally Posted by *bencher*
> 
> Hi guys,
> 
> Is this valley score ok for a stock r9 290 (940/1250)?


I just bought my XFX R9 290 used for $300 two days ago. Here is the Valley benchmark at stock speeds.

The temp sensor works fine in GPU-Z, but for some reason my GPU reads 1,885,496°C in Valley, which, if it were possible, would be burning through my floor and into the ground.


----------



## kizwan

Quote:


> Originally Posted by *the9quad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Widde*
> 
> Is it the framepacing that makes my fps tank with mantle?
> 
> 
> 
> 
> 
> 
> 
> Getting a massive fps drop with mantle enabled, 14.4 drivers, reinstalled the game aswell and cleared the mantle cache
> 
> 
> 
> try this and see if it helps
> RenderDevice.FramePacingMethod 1
Click to expand...

Good to know!







I don't remember whether I did set this already though.









*[EDIT]* Do I need to enable Frame Pacing in CCC? Frame Time analyzer doesn't show any difference between FP Method 1 & 2.









*[EDIT #2]* Red = FP method 1 & Green = FP method 2. Yup, method 1 look better.
_(290 Crossfire 1000/1300, BF4, 1080p, 200% resolution scale (2160p, 4K), Ultra + no AA)_

If the result look wrong to you, please let me know. Thanks.


----------



## hht92

Hi guys, I am planning to go with a 290 Tri-X. Is my XFX PSU enough?


----------



## bencher

Quote:


> Originally Posted by *Unknownm*
> 
> I just bought my XFX R9 290 used for $300 dollars two days ago. Here is valley benchmark at stock speeds
> 
> The temp sensor works fine in GPUZ but for some reason my GPU is 1,885,496c in valley which if was possible would be burning through my floor and the ground.


What driver are you using?


----------



## Sgt Bilko

Quote:


> Originally Posted by *bencher*
> 
> What driver are you using?


14.4 from the looks of it.

I'm on 13.12 atm, i'll give valley a run on single and dual cards stock settings if you'd like?


----------



## bluedevil

Quote:


> Originally Posted by *hht92*
> 
> Hi guys i am planning to go with 290 tri-x. Is my xfx psu enough??


You will be fine. I run about the same system on my 550w PSU just fine.


----------



## hht92

Quote:


> Originally Posted by *bluedevil*
> 
> You will be fine. I run about the same system on my 550w PSU just fine.


Thank you


----------



## Unknownm

Quote:


> Originally Posted by *bencher*
> 
> What driver are you using?


14.4


----------



## Dasboogieman

Ok finally fixed the second of the major Sapphire Tri X design flaws.
(The first being lack of integrated backplate to cool VRMs and the usage of rare M1.5 screws which preclude the use of aftermarket backplates with the Tri X cooler)

I don't know if anybody with a Tri-X has noticed, at specific fan speeds (mine were at 47% 50% 55% 60%), you get this rhythmic rattling sound. Mine was more of a pulsating buzz. The noise goes away completely at higher or lower fan speeds of >70% or <30%. The noise also can come on when the fan quickly ramps speed (e.g. steep fan curves)
Now normally, this buzz isn't a problem if the fans are loud but it is extremely prominent due to the sheer silence of the Tri X design.

I first thought the issue was caused by the VRMs, so I opened up the cooler and placed a small square of Sekisui 5760 on each choke. The coil buzz definitely went away but the rattling remained. I next thought it may be due to shoddy fans so I opened up an RMA with Sapphire. While waiting for the RMA ticket, I happened upon this youtube video.





So I got some garden rubber gloves from Bunnings, cut them up into rectangles and placed them so that the fan frame no longer contacts the HS. Not only has the rattling gone away, the fan transitions are now buttery smooth and the whole assembly even sounds slightly less loud at full speed.

Basically, this issue is a design flaw: there is a lack of padding between the fan assembly and the HS assembly, leading to resonance of the metal.

Thought it would be worth posting for anyone with a Tri-X: try this mod and the VRM mod before RMAing the card.


----------



## Sgt Bilko

Quote:


> Originally Posted by *bencher*
> 
> What driver are you using?


Quote:


> Originally Posted by *Unknownm*
> 
> 14.4


Thought you guys might find this interesting as will other people:

13.12 on the left and 14.4 on the right,


----------



## hht92

Quote:


> Originally Posted by *Dasboogieman*
> 
> Ok finally fixed the second of the major Sapphire Tri X design flaws.
> (The first being lack of integrated backplate to cool VRMs and the usage of rare M1.5 screws which preclude the use of aftermarket backplates with the Tri X cooler)
> 
> I don't know if anybody with a Tri-X has noticed, at specific fan speeds (mine were at 47% 50% 55% 60%), you get this rhythmic rattling sound. Mine was more of a pulsating buzz. The noise goes away completely at higher or lower fan speeds of >70% or <30%. The noise also can come on when the fan quickly ramps speed (e.g. steep fan curves)
> Now normally, this buzz isn't a problem if the fans are loud but it is extremely prominent due to the sheer silence of the Tri X design.
> 
> I first thought the issue was caused by the VRMs, so I opened up the cooler and placed a small square of Sekisui 5760 on each choke. The coil buzz definitely went away but the rattling remained. I next thought it may be due to shoddy fans so I opened up an RMA with Sapphire. While waiting for the RMA ticket, I happened upon this youtube video.
> 
> 
> 
> 
> 
> So I got some garden rubber gloves from Bunnings, cut them up in to rectangles and placed them so that the fan frame would not contact the HS anymore. Not only has the rattling speed gone away, the fan transitions are now buttery smooth and the whole assembly even sounds slightly less loud at full speed.
> 
> Basically, this issue is a design flaw where there is a lack of padding between the Fan assembly and the HS assembly leading to resonance of the metal.
> 
> Thought would be worth posting for anyone with a Tri-X, try this mod and the VRM mod before RMAing the card.


Does the Tri-X have many bugs like that? I might reconsider buying a 290.


----------



## motokill36

Hi All

Any sugestions to why BF4 freezes after 5 mins with cards in crossfire .? stock clocks

Had both cards on there own no problems at all .
Tried both drivers same thing happens


----------



## Unknownm

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Thought you guys might find this interesting as will other people:
> 
> 13.12 on the left and 14.4 on the right,


My card's stock BIOS was 942MHz or something like that; I flashed to a custom BIOS that sticks to 1GHz as stock and added a 6% GPU limit (1060MHz). It seems stable so far and I haven't touched any voltages.


----------



## Dasboogieman

Quote:


> Originally Posted by *hht92*
> 
> Is the tri-x have many that bugs ? I might reconsider buy 290?


Actually, not that many bugs: just that one and the VRM issue (which doesn't really become a problem, since you need watercooling before you can push that much voltage through). Hell, at least the bugs are fixable. The Tri-X is still extremely good compared to the other coolers (possibly the most efficient performance-vs-noise two-slot cooler available, other than the Vapor-X).


----------



## hht92

Quote:


> Originally Posted by *Dasboogieman*
> 
> Actually not that many bugs, just that one and the VRM issue (which doesn't really become a problem since you need WC before you can push that much voltage thru). Hell, at least the bugs are fixable. The Tri X is still extremely good compared to the other coolers (possibly the most efficient performance vs noise 2 slot cooler available other than the Vapor X).


I am asking because my next choice is a 780, which costs 50 euros more.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Unknownm*
> 
> my card stock bios was 942mhz or something like that, flashed to custom BIOS that sticks to 1GHZ as stock and added 6% gpu limit (1060mhz). Seems stable so far and haven't touch any voltages
> 
> 
> Spoiler: Warning: Spoiler!


I'm still running the default DD Black Edition BIOS (980/1250), but I'm running it at Tri-X clocks (1000/1300) on stock voltage.

I was just surprised by the difference (or lack thereof) between 13.12 and 14.4 in Valley. Anyway, 13.12 or 13.11 are still better for Firestrike etc., but for Valley, 14.4 seems to be the way to go.


----------



## Sgt Bilko

Quote:


> Originally Posted by *hht92*
> 
> The tri-x has many these bugs like
> I am asking cause my next choice is a 780 which comes with 50 more euros


Gigabyte Windforce 3, Sapphire Tri-X and Powercolor PCS+

Those are the three choices you have with these cards for the best cooling options.

They all cool effectively; the only difference is that the PCS+ is a 2.5-slot card compared to the other two.

Honourable mention goes to the Sapphire Vapor-X (too pricey) and the XFX DDs (not quite as cool as the others).


----------



## hht92

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Gigabyte Windforce 3, Sapphire Tri-X and Powercolor PCS+
> 
> Those are the three choices you have with these cards for the best cooling options.
> 
> They all cool effectively the only difference is the PCS+ is a 2.5 slot card compared to the other 2.
> 
> Honourable mention goes to the Sapphire Vapor-X (too pricy) and the XFX DD's (not quite as cool as the others)


OK, thanks for the advice. Vapor-X: yep, too pricey; same price as a 780, so not worth it.


----------



## Mercy4You

Quote:


> Originally Posted by *hht92*
> 
> ok thanks for the advise.Vapor-X yep too pricy same price with 780 so not worth


I own(ed) 2 Tri-X cards and they both run extremely well


----------



## hht92

Quote:


> Originally Posted by *Mercy4You*
> 
> I own(ed) 2 Tri-X cards and they both run extremely well


Nice thanks


----------



## imadorkx

Quote:


> Originally Posted by *Dasboogieman*
> 
> Ok finally fixed the second of the major Sapphire Tri X design flaws.
> (The first being lack of integrated backplate to cool VRMs and the usage of rare M1.5 screws which preclude the use of aftermarket backplates with the Tri X cooler)
> 
> I don't know if anybody with a Tri-X has noticed, at specific fan speeds (mine were at 47% 50% 55% 60%), you get this rhythmic rattling sound. Mine was more of a pulsating buzz. The noise goes away completely at higher or lower fan speeds of >70% or <30%. The noise also can come on when the fan quickly ramps speed (e.g. steep fan curves)
> Now normally, this buzz isn't a problem if the fans are loud but it is extremely prominent due to the sheer silence of the Tri X design.
> 
> I first thought the issue was caused by the VRMs, so I opened up the cooler and placed a small square of Sekisui 5760 on each choke. The coil buzz definitely went away but the rattling remained. I next thought it may be due to shoddy fans so I opened up an RMA with Sapphire. While waiting for the RMA ticket, I happened upon this youtube video.
> 
> 
> 
> 
> 
> So I got some garden rubber gloves from Bunnings, cut them up in to rectangles and placed them so that the fan frame would not contact the HS anymore. Not only has the rattling speed gone away, the fan transitions are now buttery smooth and the whole assembly even sounds slightly less loud at full speed.
> 
> Basically, this issue is a design flaw where there is a lack of padding between the Fan assembly and the HS assembly leading to resonance of the metal.
> 
> Thought would be worth posting for anyone with a Tri-X, try this mod and the VRM mod before RMAing the card.


My R9 290 Tri-X used to have that problem. I'm not sure exactly what I did, but I think I cleaned the middle fan and tightened the screw.


----------



## hht92

Quote:


> Originally Posted by *imadorkx*
> 
> my r9 290 tri x used to have that problem.. im not sure what id done but i think i cleaned the middle fan and tighten the screw.


Thanks guys, because I don't want to waste 395 euros.


----------



## Mercy4You

Quote:


> Originally Posted by *hht92*
> 
> Nice thanks


There's one thing, though: the Tri-X can have a bit of fan rattle. Mine had some too; I solved it myself...

Edited: Just saw *imadorkx* mentioned this problem already, so I can confirm it.


----------



## cennis

Is it normal for my XFX Platinum 1000W PSU to spin up very loudly in 3DMark GT3 and GT4?
By loud I mean louder than an H100 on full speed plus some other fans around 1000rpm.
I am running 2x 290 at 1200/1500, 1.3V under load, and a 3930K at 4.9GHz, 1.44V.

The load should be at most 800W; my 1000W PSU shouldn't break a sweat?


----------



## sugarhell

Quote:


> Originally Posted by *cennis*
> 
> Is it normal for my XFX platinum 1000W PSU to spin up very loudly in 3dmark GT3, GT4?
> By loud I mean louder than H100 on full speed + some other fans around 1000rpm
> I am running 2x290 1200/1500 1.3v under load and 3930k 4.9 1.44v
> 
> The load should be at most 800W, on my 1000W PSU it should'nt break a sweat?


More like close to 1000


----------



## cennis

Quote:


> Originally Posted by *sugarhell*
> 
> More like close to 1000


Mind explaining?
290s clocked that way pull 250W?
And the CPU probably less than 300W?

And GT3/GT4 are not very CPU-intensive, so I doubt the CPU even pulls 250.


----------



## kizwan

Quote:


> Originally Posted by *motokill36*
> 
> Hi All
> 
> Any sugestions to why BF4 freezes after 5 mins with cards in crossfire .? stock clocks
> 
> Had both cards on there own no problems at all .
> Tried both drivers same thing happens


Did you try swapping the cards? Second GPU to the first PCIe slot and first GPU to the second PCIe slot.


----------



## motokill36

No can do, as I have a 4-slot air cooler on the card in the second slot, and if I swap them it covers slot 2, so then I would have to run the reference card in slot 3, if it would even fit.

Can I still run Crossfire in slots 1 and 3?
It's on an X79 board, an ASRock Extreme4.
I played on both cards for days and not one glitch, but together, no go.


----------



## Germanian

Quote:


> Originally Posted by *motokill36*
> 
> No Can do as i have 4 Slot Air cooler on second slot card .and if i swap them it covers slot 2 so then i would have to run Ref card in slot 3 if it would even fit .
> 
> Can i still run Cross f in slot one and 3 ?
> Its on X79 Board Asrock Extr 4
> Played on both cards for Days and not one glitch but together No go .


Could be the power supply not being able to cope with the load.

Try undervolting both cards by -10mV and setting the clocks to 865 core and 1250 memory on both. If it then runs in Crossfire, it means you don't have enough power from your current power supply.


----------



## motokill36

Quote:


> Originally Posted by *Germanian*
> 
> could be power supply not being able to cope with load.


It's a 950W Corsair.
I think it's OK, lol.


----------



## sugarhell

Quote:


> Originally Posted by *cennis*
> 
> mind explaining?
> 290s clocked that way pull 250w?
> and the cpu probably less than 300w?
> 
> and GT3, GT4 are not very cpu intensive so I doubt CPU even pulls 250


290s at stock pull more like 260-300W depending on the app; overclocked with a +50% power limit, up to 400-450W.
CPU: 250-300W.
HDDs, SSDs, fans, etc.

From the wall it's probably around 1000W.
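For anyone wanting to sanity-check sugarhell's estimate, the arithmetic is easy to script. A rough sketch where every wattage figure is a forum estimate, not a measurement:

```python
# Back-of-envelope DC load from per-component estimates (all numbers
# here are rough forum figures, not measurements).
def system_draw_watts(gpu_each, n_gpus, cpu, rest=75):
    # rest = drives, fans, pumps, motherboard overhead (assumed)
    return gpu_each * n_gpus + cpu + rest

low = system_draw_watts(gpu_each=260, n_gpus=2, cpu=250)   # near stock
high = system_draw_watts(gpu_each=450, n_gpus=2, cpu=300)  # heavy OC
# low -> 845 W, high -> 1275 W: a hard-overclocked pair can push a
# 1000 W unit to (or past) its rating, so a loud PSU fan is plausible.
```

Wall draw would be a bit higher still (divide the DC figure by PSU efficiency, roughly 0.90-0.92 for a platinum unit).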


----------



## bluedevil

Quote:


> Originally Posted by *sugarhell*
> 
> 290s stock more like 260-300 watt based on the app. Oced with 50% power limit up to 400-450 watt
> Cpu 250-300 watt
> Hdds,ssds,fans etc etc etc
> 
> From the wall probably is around 1000 watt


Wouldn't that really fry a 290 @ 150%?


----------



## motokill36

I had a wall meter on it.
It never went over 600W.


----------



## cennis

Quote:


> Originally Posted by *sugarhell*
> 
> 290s stock more like 260-300 watt based on the app. Oced with 50% power limit up to 400-450 watt
> Cpu 250-300 watt
> Hdds,ssds,fans etc etc etc
> 
> From the wall probably is around 1000 watt


Okay, higher than I expected.
Even then, should a PSU fan be that loud? Louder than my H100 + 2x H90s + case fans combined?

From my previous experience with a 650W PSU pulling close to its maximum load from an overclocked 4770K plus an overclocked GTX 690, the PSU fan noise never varied much.


----------



## sugarhell

Quote:


> Originally Posted by *cennis*
> 
> Okay, higher than i expected,
> even then, should a PSU fan be that loud, louder than my H100 + 2x H90s +case fans combined?
> 
> From my previous experience with a 650W psu pulling close to its maximum load with a overclocked 4770k+GTX 690 overclocked, the psu fan sound never varied much


Depends on the PSU fan. You should ask shilka about PSUs.


----------



## bencher

Question... under load, GPU-Z is reading 11.77V on the 12V rail.

What are you guys getting?
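For reference, the ATX spec allows the 12V rail to sit within ±5% of nominal (11.40-12.60V), so a reading can be checked in one line. Software sensors are often imprecise, so treat a GPU-Z figure as approximate:

```python
# Check a measured rail voltage against the ATX +/-5% tolerance.
def rail_in_spec(measured_v, nominal_v=12.0, tolerance=0.05):
    return abs(measured_v - nominal_v) <= nominal_v * tolerance

rail_in_spec(11.77)   # bencher's reading -> True, within spec
rail_in_spec(11.20)   # -> False, would be out of spec
```

11.77V is comfortably inside the window; a multimeter on a spare Molex connector is the way to confirm it if the software number looks suspicious.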


----------



## Gobigorgohome

Today I received three Asus DirectCU II OC R9 290s in the mail. I unpacked them, and I am actually a little sorry that I want to watercool them, because they are soooo beautiful with the stock coolers. On the other hand, the graphics cards did hit 80 degrees Celsius under the first "burn-in test", so maybe it is not that silly after all.

The R9 290 actually surprised me a bit: it is leaving the GTX 780 in the dust (even overclocked) and at a lower cost. Surprisingly good behavior for just one card at 1080p; it did Crysis 3 at Very High with 4x MSAA (which I could only dream about with my previous GTX 660 Ti SLI configuration, or even 3-way GTX 660 Ti). I am in the middle of testing all the cards (one by one) and I will probably go on to test them in Crossfire, Tri-Fire and, when my fourth card arrives, Quadfire, all at 1080p though.


----------



## bak3donh1gh

Anyone know why the PT1 BIOS seems to cause instability? I'm getting a lot more fps drops.


----------



## bak3donh1gh

Quote:


> Originally Posted by *Gobigorgohome*
> 
> The R9 290 actually surprised me a bit: it is leaving the GTX 780 in the dust (even overclocked) and at a lower cost. Surprisingly good behavior for just one card at 1080p; it did Crysis 3 at Very High with 4x MSAA (which I could only dream about with my previous GTX 660 Ti SLI configuration, or even 3-way GTX 660 Ti). I am in the middle of testing all the cards (one by one) and I will probably go on to test them in Crossfire, Tri-Fire and, when my fourth card arrives, Quadfire, all at 1080p though.

Can anyone explain why you would spend the money on Quadfire? Especially, why not just buy a 295X2?


----------



## Gobigorgohome

Quote:


> Can anyone explain why you would spend the money on quad fire? especially why not just buy a 290x2?


Easy: the R9 295X2 costs the same as three R9 290s (in my country, anyway). If I bought two of those it would be about 3835 USD, while 4x R9 290 is 2135 USD. A simple choice, for me at least.

And I am going to play at 4K; Quadfire R9 290 is the cheapest way to get the best performance for your money (my thoughts).


----------



## the9quad

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Easy, the R9 295x2 costs the same as three R9 290 (in my country anyways), if I would buy two of those it is about 3835 USD while 4x R9 290 is 2135USD. Simple choice for me at least.
> 
> And I am going to play at 4K, quadfire R9 290 is the cheapest way to get the best performance for your money (my thoughts).


Makes sense. It didn't at first, when I saw you say you were going quad for 1080p, but for 4K, yep, 290s are the best bang for the buck by far. Hope you enjoy them.


----------



## Mercy4You

Quote:


> Originally Posted by *Gobigorgohome*
> 
> And I am going to play at 4K, quadfire R9 290 is the cheapest way to get the best performance for your money (my thoughts).


LOL, bit of an understatement, 'the cheapest way', for a rig that's gonna skyrocket your electric bill...


----------



## rdr09

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Today I received three Asus DirectCU II OC R9 290s in the mail, I unpacked them and I am actually a little sorry that I want to water cool them, because they are soooo beautiful with the stock coolers. On the other side the graphic cards did hit 80 degrees celsius under the first "burn-in-test", so it is maybe not that silly after all.
> 
> The R9 290 actually surprised me a bit, it is laing the GTX780 in the dirt (even overclocked) and for a lower cost. Surprisingly good behavoir for just one card at 1080P, it did Crysis 3 at Very High with MSAA x4 (which I could only dream about with my previous GTX 660 Ti SLI configuration or 3-way GTX 660 Ti). I am in the middel of testing all the cards (one by one) and I will probably go in to test them in crossfire, trifire and when my fourth card arrive in quadfire, all at 1080P though.


What CPU are you using? Try C3 at Very High with 8x MSAA.


----------



## bak3donh1gh

Oh, OK, 4K resolution. If I were running only 60 FPS the 290 would be maxing everything, no problem.


----------



## rdr09

Quote:


> Originally Posted by *bak3donh1gh*
> 
> oh ok 4k resolution. if I was running only 60 fps the 290 would be maxing everything np.


How many 290s do you think one needs for 4K? I estimate three; if so, the 290 solution will be cheaper than the 295, right?

Here I am shooting for 1440 while others go 4K.


----------



## bencher

Question... Under load GPU-Z is showing 11.77v. Is anyone getting the full 12v?


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bak3donh1gh*
> 
> oh ok 4k resolution. if I was running only 60 fps the 290 would be maxing everything np.
> 
> 
> 
> how many 290s do you think one needs for 4K? I estimate 3, if so then the 290 solution will be cheaper than the 295, right?
> 
> here I am shooting for 1440 while others 4K.

I think two 290s should be the minimum for 4K.


----------



## Gobigorgohome

Quote:


> Originally Posted by *the9quad*
> 
> makes sense, didnt at first when I saw you say you were going quad for 1080p, but for 4k yep, 290's are the best bang for the buck by far. Hope you enjoy them.


The only reason I am still at 1080P is that they have not released the "cheap" 4K panels in my country yet, but they probably will at the beginning of next month. So then it will be 4K. I do have a little experience with Nvidia Surround though; I ran 5760x1080 and 3240x1920 for a couple of months, until the bezels took away most of the fun. I hope a 4K panel can offer a better route to a higher resolution in games.


----------



## Gobigorgohome

Quote:


> Originally Posted by *Mercy4You*
> 
> LOL, bit of an understatement 'the cheapest way' for a rig that's gonna sky rocket your electric bill...


If you can't afford to pay for the gas you don't buy a Ferrari, simple as that.


----------



## Gobigorgohome

Quote:


> Originally Posted by *rdr09*
> 
> what cpu are you using? Try C3 using Very High and 8MSAA.


I am going to use an i7-3930K with these GPUs.


----------



## Mercy4You

;-) Any idea how much that will pull with a 4K monitor?


----------



## Germanian

Quote:


> Originally Posted by *Mercy4You*
> 
> ;-) any idea how much that will pull with 4K monitor?


With 2 cards:
around 30-40 FPS with most new games maxed out. You will need to lower settings and drop AA to get 60 FPS.

Older games you will be able to get 60 FPS easily.


----------



## josephimports

Quote:


> Originally Posted by *bencher*
> 
> Question... On load gpuz is working 11.77v. Is anyone getting full 12v?


Card 1 = 12.00v idle. 11.88v load.
Card 2 = 11.88v idle 11.75-11.63v load.

It's normal.
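As a point of reference, the ATX specification allows a +/-5% tolerance on the +12V rail, so readings like these can be sanity-checked. A minimal sketch (the function and names are illustrative, not from any tool):

```python
# Sanity check of +12V rail readings against the ATX spec's +/-5%
# tolerance window (11.40V to 12.60V). Readings are the ones posted above.

ATX_12V_NOMINAL = 12.0
ATX_TOLERANCE = 0.05  # +/-5% per the ATX spec

def within_spec(volts: float) -> bool:
    """Return True if a +12V reading is inside the ATX +/-5% window."""
    lo = ATX_12V_NOMINAL * (1 - ATX_TOLERANCE)  # 11.40V
    hi = ATX_12V_NOMINAL * (1 + ATX_TOLERANCE)  # 12.60V
    return lo <= volts <= hi

readings = [12.00, 11.88, 11.75, 11.63]
for v in readings:
    print(f"{v:.2f}V -> {'OK' if within_spec(v) else 'out of spec'}")
```

Even the lowest load reading posted (11.63v) is comfortably inside the window. Software sensors add their own error on top, so treat the numbers as ballpark figures.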


----------



## Mega Man

Quote:


> Originally Posted by *josephimports*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bencher*
> 
> Question... On load gpuz is working 11.77v. Is anyone getting full 12v?
> 
> 
> 
> Card 1 = 12.00v idle. 11.88v load.
> Card 2 = 11.88v idle 11.75-11.63v load.
> 
> It's normal.
Click to expand...

When will people learn not to trust software sensors?


----------



## Red1776

Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *josephimports*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bencher*
> 
> Question... On load gpuz is working 11.77v. Is anyone getting full 12v?
> 
> 
> 
> Card 1 = 12.00v idle. 11.88v load.
> Card 2 = 11.88v idle 11.75-11.63v load.
> 
> It's normal.
> 
> when will people learn not to trust software sensors

What do you mean?

My 8350 is running at 5.2GHz @ 1.21v and that's that!


----------



## imadorkx

Quote:


> Originally Posted by *rdr09*
> 
> how many 290s do you think one needs for 4K? I estimate 3, if so then the 290 solution will be cheaper than the 295, right?
> 
> here I am shooting for 1440 while others 4K.


I feel you. I'm still aiming for 1440p... meanwhile others are going 4K.


----------



## the9quad

Quote:


> Originally Posted by *imadorkx*
> 
> i feel you. Im still aiming for 1440p.. meanwhile others going 4K.


1440p at 120hz


----------



## Sgt Bilko

Quote:


> Originally Posted by *the9quad*
> 
> 1
> 1440p at 120hz


1440p @ 110hz









Close enough


----------



## edo101

Just wondering what are max safe temps for VRMs?


----------



## Sgt Bilko

Quote:


> Originally Posted by *edo101*
> 
> Just wondering what are max safe temps for VRMs?


IMO anything over 100C is too much.

Try to keep them under 90C if you can


----------



## Aussiejuggalo

Quote:


> Originally Posted by *imadorkx*
> 
> i feel you. Im still aiming for 1440p.. meanwhile others going 4K.


And here I am on good ol' 1080p


----------



## edo101

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Imo anything over 100c is too much.
> 
> Try to keep them under 90c if you can


What about GPU core? And what is the bigger factor when it comes to thermal throttling? VRM or GPU core?

And when does throttling kick in?


----------



## Forceman

Quote:


> Originally Posted by *edo101*
> 
> What about GPU core? And what is the bigger factor when it comes to thermal throttling? VRM or GPU core?


By default, the core throttles at 94C, and you are more likely to get core throttling than VRM throttling (although that does depend on the cooler)


----------



## edo101

Quote:


> Originally Posted by *Forceman*
> 
> By default, the core throttles at 94,and you are more llikely to get core throttling than VRM throttling (although that does depend on the cooler)


Yeah, I have the Windforce OC X3. It's not performing up to the level of a reference 290 (which is crazy considering the higher factory clock)...

I am looking at OCing it to 1150, but this card needs +131mV to get stable. My VRMs go to 90C at that point but the core stays around 85C (my fan is not at max though). Just checking to see where I hit diminishing returns, 'cause I can get to 1140 with ~+80mV.

I would like to get 1200, but the max allowable voltage in TRIXX gives me artifacts.


----------



## Mega Man

Well, it's official: I have 4 PowerColor 290Xs for my main rig and 1 Sapphire for my A10.


----------



## Forceman

The non-reference coolers often have higher VRM temps, but 90C shouldn't be causing any throttling. Have you kept the monitoring window open to make sure the core clock is holding steady?


----------



## Germanian

Quote:


> Originally Posted by *josephimports*
> 
> Card 1 = 12.00v idle. 11.88v load.
> Card 2 = 11.88v idle 11.75-11.63v load.
> 
> It's normal.


I get the same thing here
Quote:


> Originally Posted by *Mega Man*
> 
> well its official i have 4 powercolor 290xs for my main rig and 1 sapphire for my a10


I hope your power supply is ready








I think even my Enermax 1500w wouldn't be able to handle all 4 overclocked; stock should be doable though.


----------



## edo101

Quote:


> Originally Posted by *Forceman*
> 
> The non-reference coolers often have higher VRM temps, but 90C shouldn't be causing any throttling. Have you kept the monitoring window open to make sure the core clock is holding steady?


Yeah, I thought my old X58 might be the problem, but I haven't looked too much into it. Haven't had time to get to the bottom of it. I just hate the fact that my higher-clocked card is getting beat by REFERENCE cards... it's still a good upgrade from my GTX 470 for 1440p gaming though.

So throttling happens when the core hits 90C?


----------



## Mega Man

Quote:


> Originally Posted by *Germanian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *josephimports*
> 
> Card 1 = 12.00v idle. 11.88v load.
> Card 2 = 11.88v idle 11.75-11.63v load.
> 
> It's normal.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i get the same thing here
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> well its official i have 4 powercolor 290xs for my main rig and 1 sapphire for my a10
> 
> 
> i hope your power supply is ready
> 
> 
> 
> 
> 
> 
> 
> 
> I think even my Enermax 1500w wouldn't be able to handle all 4 overclocked; stock should be doable though.
Click to expand...

I have 2 sets. The ones in it are 1000w x2 (Super Flower Leadex; I would have the 1200w, but the distributor {only one in China} did not have it when I went overseas)... or 1250w x2 (XFX 1250w). So yeah, 2k-2.5k, pretty sure it can.


----------



## Forceman

Quote:


> Originally Posted by *edo101*
> 
> Yeah I though my old X-58 might be the problem but i haven't lokoed too much into it. Haven't had time to get through with the problem. I just hate the fact that my higher clocked card is getting beat by REFERENCE cards...its still a good upgrade from my GTX 470 at 1440p gaming
> 
> So throttling happens when Vcore hits 90C?


For reference cards it is 94C, but non-reference may be different.


----------



## Roboyto

Quote:


> Originally Posted by *edo101*
> 
> Yeah I have the Windforce OC X3. It's not performing up to the level of a reference 290 (which is crazy considering the higher factory clock)...
> 
> I am looking at OC'ing it to 1150 but this card needs +131mV to get stable. My VRM goes to 90C at that point but core stays around 85C (my fan is not a max though). Just checking to see where I am meeting diminishing returns...Cause i can get to 1140 with ~+80mV.
> 
> Would like to get 1200 but max allowable voltage on TRIXX gives me artifacts.




Looks like VRM1 gets blasted by a lot of the heat from all 5 pipes unfortunately. You may be able to get better VRM temps with a thermal pad upgrade. I did this on a Devil 270X and the temps came down ~10%; the card layout and cooling were setup differently though.

Not 100% certain of the thickness of the particular thermal pad on your Gigabyte card but I would wager 1mm; you can always remove the HSF and check the thickness or double up on new pads to be certain there is good contact. http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html

Maybe try a different OC utility? Some cards do better with different utilities.

Friend of mine just bought an HIS 290 that performed pretty well out of the box. He loaded reference ASUS 290X BIOS and it got even better; could be worth a shot since you have dual BIOS to play around with. Also if you load ASUS BIOS you'd have the ability to try GPU Tweak.

If you can't go full blown WC then a Kraken G10 and AIO could be a viable solution to keep temps in check: http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition/20#post_22214255

You may also want to consider a backplate or attaching a sink to the backside of the VRMs to assist in cooling them down; at the very least steady airflow over the backside of the card.


----------



## edo101

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> Looks like VRM1 gets blasted by a lot of the heat from all 5 pipes unfortunately. You may be able to get better VRM temps with a thermal pad upgrade. I did this on a Devil 270X and the temps came down ~10%; the card layout and cooling were setup differently though.
> 
> Not 100% certain of the thickness of the particular thermal pad on your Gigabyte card but I would wager 1mm; you can always remove the HSF and check the thickness or double up on new pads to be certain there is good contact. http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html
> 
> Maybe try a different OC utility? Some cards do better with different utilities.
> 
> Friend of mine just bought an HIS 290 that performed pretty well out of the box. He loaded reference ASUS 290X BIOS and it got even better; could be worth a shot since you have dual BIOS to play around with. Also if you load ASUS BIOS you'd have the ability to try GPU Tweak.
> 
> If you can't go full blown WC then a Kraken G10 and AIO could be a viable solution to keep temps in check: http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition/20#post_22214255
> 
> You may also want to consider a backplate or attaching a sink to the backside of the VRMs to assist in cooling them down; at the very least steady airflow over the backside of the card.


Yeah, the card does have two BIOSes (at least I think so, because it has a switch). But I think I'll drop down to 1140 and lower my VRM heat (it sits around 80-85C at that setting). My card doesn't OC well, so there's no point in spending any extra money to get nowhere pushing volts. I haven't tried other OC tools besides AFB and Trixx (it seems to do better on Trixx).

And my card is a Rev 1 of their Windforce OC line, so I am not sure if a BIOS flash to a 290X will work since ultimately it's not a reference card (need to investigate)... I'm guessing having two BIOSes means I can flash one and the card will still boot with the other one if it bricks? I don't have UEFI boot though, 'cause I'm still on X58. I know my card has one switch position for the legacy BIOS and the other for UEFI. Can a graphics card boot into UEFI mode even if its motherboard doesn't have UEFI?


----------



## kizwan

Quote:


> Originally Posted by *edo101*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roboyto*
> 
> 
> 
> Looks like VRM1 gets blasted by a lot of the heat from all 5 pipes unfortunately. You may be able to get better VRM temps with a thermal pad upgrade. I did this on a Devil 270X and the temps came down ~10%; the card layout and cooling were setup differently though.
> 
> Not 100% certain of the thickness of the particular thermal pad on your Gigabyte card but I would wager 1mm; you can always remove the HSF and check the thickness or double up on new pads to be certain there is good contact. http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html
> 
> Maybe try a different OC utility? Some cards do better with different utilities.
> 
> Friend of mine just bought an HIS 290 that performed pretty well out of the box. He loaded reference ASUS 290X BIOS and it got even better; could be worth a shot since you have dual BIOS to play around with. Also if you load ASUS BIOS you'd have the ability to try GPU Tweak.
> 
> If you can't go full blown WC then a Kraken G10 and AIO could be a viable solution to keep temps in check: http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition/20#post_22214255
> 
> You may also want to consider a backplate or attaching a sink to the backside of the VRMs to assist in cooling them down; at the very least steady airflow over the backside of the card.
> 
> 
> 
> Yeah the card does have two bios (at least I think because it has a switch). But I think i'll drop down to 1140 and lower my VRM heat (it seats aorund 80-85C at this setting). My card doesn't OC well so no point in spending any extra money to get no where in pushing Volts. I haven't tried other OC tools besides AFB and Trixx (it seems to do better on Trixx).
> 
> And my card is a Rev 1 for their Windforce OC line so I am not sure if a Bios flash to a 290X will work since ultimately its not a reference card. (need to investigate)...I'm guessing have two bios means I can flash one and it will still boot with the other one if it bricks? I don't have a UEFI boot though cause I'm still on X-58. I know my card (one switch is for legacy bios and the other is UEFI).
> 
> 
> Can a graphics card boot into UEFI mode even if it's motherboard doesn't have UEFI?

You need to use a boot loader like UEFI DUET, and that is a lot of work. BTW, why do you want it to boot in UEFI mode anyway?


----------



## edo101

Quote:


> Originally Posted by *kizwan*
> 
> You need to use boot loader like UEFI DUET & this is a lot of work though. BTW, why do you want it boot in UEFI mode anyway?


I don't want to boot my computer into UEFI. I just want to know how a dual-BIOS GPU works. My WF3 has a legacy and a UEFI BIOS. If I were to flash one (probably the legacy one), would the card still work when it is switched to its UEFI mode on a non-UEFI motherboard?


----------



## Mega Man

.... not that kind of dual bios ...


----------



## kizwan

Quote:


> Originally Posted by *edo101*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> You need to use boot loader like UEFI DUET & this is a lot of work though. BTW, why do you want it boot in UEFI mode anyway?
> 
> 
> 
> 
> 
> 
> 
> I don't want to boot my computer into UEFI. I just want to know how a dual bios GPU works. My WF3 uses a legacy and UEFI bios. If I were to flash one (probably the legacy one), would the card still be able to work when it is switched to it's UEFI mode on a non UEFI motherboard

What is WF3? Dual BIOS on 290 cards is like two profiles you can choose between. By default, on the 290X, you have two fan profiles: when you want to play games you can select the one with the higher max fan speed, and when you're not you can choose the lower max fan speed for less noise. 290's have two BIOS chips too, but usually both contain an identical BIOS. If you flash your card, you only flash one BIOS chip and the second chip is not affected. If you flashed the legacy BIOS to a UEFI BIOS, both BIOSes would then be UEFI. In that case you should still be able to boot with the card on a non-UEFI motherboard, but there would be no signal on the monitor until Windows/the OS loads.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> *What is WF3?* Dual bios on 290 cards are like two profiles you can choose. By default, in 290X, you have two fan profiles. So, when you want to play games, you can select the one that have higher max fan speed & when you're not you can choose lower max fan speed for low noise. 290's have two BIOSes (chips) too but usually both contain identical BIOS. If you're going to flash your card, you're going to only flash one BIOS chip & the second BIOS chip will not affected. If you flashed the legacy BIOS to UEFI BIOS, which means both BIOSes are UEFI BIOS. In this case, you should be able to boot with that card on a non UEFI motherboard but there will be no signal on the monitor until Windows/OS loaded.


Gigabyte Windforce 3 Cooler


----------



## edo101

Quote:


> Originally Posted by *kizwan*
> 
> What is WF3? Dual bios on 290 cards are like two profiles you can choose. By default, in 290X, you have two fan profiles. So, when you want to play games, you can select the one that have higher max fan speed & when you're not you can choose lower max fan speed for low noise. 290's have two BIOSes (chips) too but usually both contain identical BIOS. If you're going to flash your card, you're going to only flash one BIOS chip & the second BIOS chip will not affected. If you flashed the legacy BIOS to UEFI BIOS, which means both BIOSes are UEFI BIOS. In this case, you should be able to boot with that card on a non UEFI motherboard but there will be no signal on the monitor until Windows/OS loaded.


Yeah, I understand now, thanks. The UEFI BIOS on my Gigabyte R9 290 Windforce 3X (WF3) is the low-performance one. I suppose, since I'm still on a non-UEFI build, if I were to flash the card I should flash over the UEFI low-performance BIOS.

I'll have to look into whether my non-reference card can be unlocked


----------



## Mercy4You

Quote:


> Originally Posted by *imadorkx*
> 
> i feel you. Im still aiming for 1440p.. meanwhile others going 4K.
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *the9quad*
> 
> 1
> 1440p at 120hz

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 1440p @ 110hz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Close enough


It's around the corner









ROG SWIFT PG278Q Premium Gaming Monitor, 2560 x 1440, 120 Hz, 1ms


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mercy4You*
> 
> It's around the corner
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ROG SWIFT PG278Q Premium Gaming Monitor, 2560 x 1440, 120 Hz, 1ms


With a $800 USD price tag









means around $1k here; no way I was going to spend that, so I grabbed a Qnix Evo II, currently running at 110hz


----------



## imadorkx

Quote:


> Originally Posted by *Sgt Bilko*
> 
> With a $800 USD price tag
> 
> 
> 
> 
> 
> 
> 
> 
> 
> means around $1k here, no way i was going to spend that so i grabbed a Qnix Evo II, currently running at 110hz


It's gonna wreck my wallet for sure. Quick question: is the Qnix Evo II worth it? It's so hard to find in my country.


----------



## Mercy4You

Quote:


> Originally Posted by *Sgt Bilko*
> 
> With a $800 USD price tag
> 
> 
> 
> 
> 
> 
> 
> 
> 
> means around $1k here, no way i was going to spend that so i grabbed a Qnix Evo II, currently running at 110hz


I agree, that's *a lot of money* for a monitor...


----------



## Sgt Bilko

Quote:


> Originally Posted by *imadorkx*
> 
> its gonna rape my wallet for sure. quick question, is the Qnix Evo II worth it? so hard to find it in my country.


I went from a 24" 1080p, 5ms, 60Hz TN monitor to the Qnix, which is a 27" 1440p, 8ms (not 100% sure there) PLS monitor.

Well worth it for me, IMO. Some people have backlight bleed issues or dead pixels, but mine was damn near perfect if I'm honest.

If you do get one, get the single-DVI-input version; the multi-input models will not overclock (unless of course you're happy with 60Hz)


----------



## RooTxBeeR

Would like to join the club!!! I have 2 Gigabyte Windforces. Cooling is an aftermarket Hyper 212 EVO.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mercy4You*
> 
> I agree, that's *a lot of money* for a monitor...


For a 4K IPS panel it would be worth it, but Asus always has to jack up the price for some reason


----------



## Mercy4You

Quote:


> Originally Posted by *Sgt Bilko*
> 
> For a 4k IPS panel then it's worth it, but Asus always have to jack up the price for some reason


Since I plan never to buy an ASUS product again, I guess I have to wait a while longer for a 2560 x 1440, 120 Hz, 1 ms monitor to come along


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> *What is WF3?*
> 
> 
> 
> Gigabyte Windforce 3 Cooler

I know that.








Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *imadorkx*
> 
> its gonna rape my wallet for sure. quick question, is the Qnix Evo II worth it? so hard to find it in my country.
> 
> 
> 
> I went from a 24" 1080p, 5ms, 60hz TN Monitor to the Qnix which is a 27" 1440p 8ms (not 100% sure there), PLS Monitor.
> 
> Well worth it for me imo, some people have *backlight bleed issues or dead pixels* but mine was damn near perfect if im honest.
> 
> If you do get one then get the Single DVI Input version, the Multi input models will not overclock (unless of course you're happy with 60hz)

That's the only reason I hesitate to get one, though it's not expensive. A re-seller here even offers to overclock it for the customer if they want one that can do 120Hz. You need to pay more though.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> I know that.


You do now








Quote:


> The only reason I hesitate to get one but it's not expensive though. A re-seller here even offer to overclocked it for the customer if they want one that can do 120Hz. Need to pay more though.


I admit it was a concern for me as well, but the panels are apparently getting better: fewer people are reporting dead pixels, and the backlight bleed can be fixed if it's serious and you don't mind cracking open the cover


----------



## Gobigorgohome

Quote:


> Originally Posted by *Mercy4You*
> 
> ;-) any idea how much that will pull with 4K monitor?


Do you mean wattage or frames?

Well, for wattage I do not know, but I guess close to 1,300 watts (which my PSU should handle).

For frames I do not know either, but it should do okay, considering that adding anti-aliasing is kind of a waste at 4K (because of the pixel density). I think it will do 50+ FPS in BF4, Hitman Absolution, Metro 2033 (maybe), Metro LL (maybe), C3 (maybe); at least then I am happy. I hope that is a precise enough answer, and I will know for sure in about three weeks, I'm guessing.
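A wattage guess like that can be sanity-checked with back-of-the-envelope numbers. AMD's typical board-power figure for the R9 290 is about 275W and Intel's TDP for the i7-3930K is 130W; the overclock overhead and rest-of-system figures below are my own rough assumptions, not measurements:

```python
# Back-of-the-envelope load power estimate for a quadfire R9 290 rig.
# 275W is AMD's typical board-power figure for the R9 290; the overclock
# overhead and rest-of-system numbers are rough assumptions.

R9_290_BOARD_POWER = 275   # watts, typical board power per card
OC_OVERHEAD = 1.15         # assume ~15% extra draw when overclocked
CPU_3930K_TDP = 130        # watts, Intel's TDP for the i7-3930K
REST_OF_SYSTEM = 100       # watts: board, RAM, drives, fans (guess)

def estimate_load_watts(num_gpus: int, overclocked: bool = False) -> int:
    """Rough full-load system draw in watts for num_gpus R9 290s."""
    gpu_draw = R9_290_BOARD_POWER * (OC_OVERHEAD if overclocked else 1.0)
    return round(num_gpus * gpu_draw + CPU_3930K_TDP + REST_OF_SYSTEM)

print(estimate_load_watts(4))                    # stock quadfire
print(estimate_load_watts(4, overclocked=True))  # overclocked quadfire
```

Under those assumptions the estimate lands right around the ~1,300W guess at stock, and closer to 1,500W overclocked, which matches the PSU concerns raised elsewhere in the thread.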


----------



## ebhsimon

Quote:


> Originally Posted by *edo101*
> 
> But I think i'll drop down to 1140 and lower my VRM heat (it seats aorund 80-85C at this setting). My card doesn't OC well so no point in spending any extra money to get no where in pushing Volts. I haven't tried other OC tools besides AFB and Trixx (it seems to do better on Trixx).


Is this while benchmarking/stress testing or gaming? Because if that's during stress testing or benchmarking then 85C is not too bad of a temperature to hit.
However if you're getting these temps while gaming I would definitely reduce the core clock (or turn on vsync if you're getting mad fps that isn't really necessary).


----------



## HOMECINEMA-PC

Stuff I've done so far this week with me trusty tri-290s









Mk 11 P

http://www.3dmark.com/3dm11/8348490

FSE

http://www.3dmark.com/fs/2169491

FS

http://www.3dmark.com/fs/2169548


----------



## Sgt Bilko

Very nice work there bud


----------



## Mercy4You

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I went from a 24" 1080p, 5ms, 60hz TN Monitor to the Qnix which is a 27" 1440p 8ms (not 100% sure there), PLS Monitor.
> 
> Well worth it for me imo, some people have backlight bleed issues or dead pixels but mine was damn near perfect if im honest.
> 
> If you do get one then get the Single DVI Input version, the Multi input models will not overclock (unless of course you're happy with 60hz)


I've read that on a 27 inch 2560x1440 screen, items look smaller than on a 24 inch 1920x1080....

Can you confirm this? How does this work in games? Do you get to see more, just smaller?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mercy4You*
> 
> I've read that on 27 inch 2560x1440 items look smaller than on 24 inch 1920 x 1080....
> 
> Can you confirm this? How does this work in games, do you get to see more although smaller?


For web browsing and such (this forum, for example) I've had to go to 125% zoom to read most of the text.

The easiest way to test for yourself is to run a 1080p monitor at 720p for a week or so and then take it back to 1080p; same difference (1.5x the previous res).

I don't really see more in games; it's just more detail (pixel density). The HUD does appear smaller in a few games, but nothing so major that I can't read or understand it. From what I've seen there is usually a HUD size adjustment option in most games now anyway.

Hope that answers your question in some way
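The size difference being described comes down to pixel density, and the arithmetic is easy to sketch (standard PPI formula, nothing monitor-specific):

```python
# Pixel density (PPI) comparison: 24" 1080p vs 27" 1440p.
# PPI = diagonal resolution in pixels / diagonal size in inches.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a panel of the given resolution and diagonal."""
    return math.hypot(width_px, height_px) / diagonal_in

ppi_24_1080 = ppi(1920, 1080, 24)   # ~91.8 PPI
ppi_27_1440 = ppi(2560, 1440, 27)   # ~108.8 PPI

# On-screen elements shrink roughly in proportion to the density ratio:
print(f'24" 1080p: {ppi_24_1080:.1f} PPI')
print(f'27" 1440p: {ppi_27_1440:.1f} PPI')
print(f"shrink factor: {ppi_27_1440 / ppi_24_1080:.2f}x")
```

So items on a 27" 1440p panel render about 1.19x smaller than on 24" 1080p: noticeable, but well short of the 1.5x you would get keeping the same panel size, because the larger diagonal compensates for part of the extra resolution.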


----------



## Mercy4You

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Web browsing and such (This forum for example) i've had to go to 125% zoom to read most of the text and such.
> 
> Easiest way to test for yourself is to run a 1080p Monitor in 720p for a week or so and then take it back to 1080p, same difference (1.5x the previous res)
> 
> and i don't really see more in games it's just more detail (pixel density) and the HUD does appear smaller in a few games but nothing so major as i can't read or understand it. From what i've seen there usually is a HUD size adjust option in most games now anyway.
> 
> Hope that answers your question in some way


Thx







I'm gonna try that 720p idea!

Guess the best for gaming would be a 27-inch monitor at 1080p: bigger screen and bigger items too, but I'm afraid it's not going to be sharp enough for me


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mercy4You*
> 
> Thx
> 
> 
> 
> 
> 
> 
> 
> , I'm gonna try that 720p idea!
> 
> Guess the best for gaming would be a 27 inch monitor on 1080p, screen bigger and items also but I'm afraid it is not going to be sharp enough for me


I wouldn't go any bigger than 24" for 1080p; the pixels stretch a bit too much IMO.

I'm very happy with my Qnix though. It will be a few years before I move up to 4K, so this is a nice stopgap.


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Stuff ive done so far this week with me trusty TRI 290's
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Mk 11 P
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8348490
> 
> FSE
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/2169491
> 
> FS
> 
> http://www.3dmark.com/fs/2169548


Wow, a lot of hard work there m8. You need a 4930/60K. Great job!


----------



## Mercy4You

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I wouldn't go any bigger than 24" for 1080p, The pixels stretch a bit too much imo.
> 
> I'm very happy with my Qnix though, will be a few years before i move up to 4k so this is a nice gap in between


Well, too much pixel stretch was also my opinion from the beginning of the 27 inch screens on 1080...

I tried your 720 suggestion and THAT DID IT









Items are way bigger, so at 1440 they will be way smaller. I play BF3/4 Death Match mostly, so that is not going to work for me. The soldiers are already small at 1080, so...

Sgt Bilko, you just saved me a lot of trouble (and money, and heat LOL), just love this forum


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mercy4You*
> 
> Well, too much pixel stretch was also my opinion from the beginning of the 27 inch screens on 1080...
> 
> I tried your 720 suggestion and that DID IT
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Items are way bigger, so 1440 will be way smaller. I play BF3/4 Death Match mostly, so that is not going to work for me. The soldiers are already small on 1080, so...
> 
> Sgt Bilko, you just saved me a lot of trouble (and money, and heat lol), just love this forum


Here are a couple of 1440p screencaps i found:

http://i.imgur.com/wIXddoy.jpg

http://i.imgur.com/sdzs6dP.jpg

And here is a res scale for you: http://i.imgur.com/ql5wOQK.png

Remember that 1440p is also spread across 27", so you have that extra screen real estate for the extra pixels.


----------



## imadorkx

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Here are a couple of 1440p screencaps i found:
> 
> http://i.imgur.com/wIXddoy.jpg
> 
> http://i.imgur.com/sdzs6dP.jpg
> 
> And here is a res scale for you: http://i.imgur.com/ql5wOQK.png
> 
> remember that 1440p is also across 27" as well so you have that extra screen real estate for the extra pixels


Is that SS without AA?


----------



## Sgt Bilko

Quote:


> Originally Posted by *imadorkx*
> 
> is that SS without AA?


I believe so. They aren't mine, so I'm assuming they ran BF4 without AA for the extra performance... guess they aren't running 290/Xs, huh?


----------



## Mercy4You

Quote:


> Originally Posted by *Sgt Bilko*
> 
> remember that 1440p is also across 27" as well so you have that extra screen real estate for the extra pixels


I get your point. The item-shrinking effect is partly compensated by the extra screen surface you get... so the items will not look a factor of 1.5 smaller, right?


----------



## cennis

Just got a Samsung 4K 28".

My tested and stable 290 CrossFire overclock (1200/1500, +200mV) produces some flashing black screens on this monitor in certain benchmarks.

On 4K:
3DMark11 P mode: even though it's 720p, it flashes a black screen at times; *however, scores are not reduced and there is no crashing.*
Heaven/Valley: black screens here and there.
Battlefield 4: mostly OK.

On the 1080p monitor:
no black screens or artifacts anywhere.

At stock clocks there are no black screens on either monitor.

I *reduced* my overclock voltage by 25mV (to +175mV), keeping 1200/1500, and there were far fewer black screens in 3DMark11 P on the 4K monitor.

I tried 1200/1250 (stock memory); still black screens as long as I apply +200mV.

Any ideas?

TL;DR: black screens on 4K but none on 1080p


----------



## Sgt Bilko

Quote:


> Originally Posted by *Mercy4You*
> 
> I get your point. The item reducing effect is partly compensated by the extra screen surface you get... So, the items will not look factor 1.5 smaller, right?


Correct









Smaller icons, etc., yes, but not by such a large margin.
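A quick back-of-the-envelope pixel-density check (a rough sketch using the panel sizes discussed above; UI elements rendered at a fixed pixel size shrink in proportion to the PPI ratio) shows why the jump isn't a full 1.5x:

```python
import math

def ppi(w_px: int, h_px: int, diagonal_in: float) -> float:
    """Pixels per inch for a panel of given resolution and diagonal size."""
    return math.hypot(w_px, h_px) / diagonal_in

ppi_1080_24 = ppi(1920, 1080, 24)  # ~91.8 PPI on a 24" 1080p panel
ppi_1080_27 = ppi(1920, 1080, 27)  # ~81.6 PPI on a 27" 1080p panel
ppi_1440_27 = ppi(2560, 1440, 27)  # ~108.8 PPI on a 27" 1440p panel

# A fixed-pixel HUD element shrinks by the PPI ratio, not the raw resolution jump:
shrink = ppi_1440_27 / ppi_1080_24  # ~1.19x smaller than on a 24" 1080p screen
```

So compared to a 24" 1080p monitor, on-screen items on a 27" 1440p panel come out roughly 1.19x smaller, noticeably less than the 1.33x resolution jump, because the bigger panel gives some of the size back.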


----------



## Sgt Bilko

Quote:


> Originally Posted by *imadorkx*
> 
> is that SS without AA?


Added some of my own here:

BF4, 1440p, Ultra Settings:


Spoiler: Warning: Spoiler!



http://i.imgur.com/XjDlGji.png

http://i.imgur.com/reG9xP4.png

http://i.imgur.com/AYPaTnR.png

http://i.imgur.com/uAttxIW.png

http://i.imgur.com/P6jVTwZ.png

http://i.imgur.com/qiBetba.png


----------



## imadorkx

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I believe so, They aren't mine so i'm assuming they ran BF4 without AA for the extra performance.......guess they aren't running 290/x's huh?


Thanks. Really liking the idea of playing at 1440p... but the monitor you mentioned earlier isn't available in my country. Bummer.


----------



## bencher

I am at 1100/1250 stable (1 hr of Crysis 3) on stock voltages.









The key is keeping the GPU under 80°C, which takes 60% fan speed for me.

My Asic quality is 66 btw.


----------



## Arizonian

Quote:


> Originally Posted by *RooTxBeeR*
> 
> Would like to join the club!!! I have 2 Gigabyte Windforces's. Have aftermarket Hyper EVO 212 for cooling.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Mercy4You

Quote:


> Originally Posted by *Sgt Bilko*
> 
> BF4, 1440p, Ultra Settings:]


You play BF4 too, I see. How fluid is BF4 for you at 1440p compared to 1080p?

I notice less fluid gameplay when I go from High settings to Ultra settings, while my FPS stays above 80. I suspect this is a frametime issue, which would probably also happen when going to 1440p...

Edited:
Found 'proof' for my theory:
http://www.techspot.com/review/734-battlefield-4-benchmarks/page5.html

So much for Fraps
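For what it's worth, the gap between a healthy average FPS and perceived smoothness is easy to see in a per-frame log. A toy sketch (the frame times below are made up for illustration, not real BF4 data):

```python
# Hypothetical per-frame render times in milliseconds, e.g. from a frame-time
# log: 95 smooth 10 ms frames plus a handful of 45 ms stutter spikes.
frame_ms = [10.0] * 95 + [45.0] * 5

# Average FPS looks great: total time / frame count, inverted.
avg_fps = 1000.0 / (sum(frame_ms) / len(frame_ms))  # ~85 FPS, "above 80"

# ...but the slowest 1% of frames tell the stutter story.
# Sort the frame times and take the value 99% of frames stay under.
worst_1pct_ms = sorted(frame_ms)[int(len(frame_ms) * 0.99) - 1]  # 45 ms ≈ 22 FPS
```

The average still reads ~85 FPS, yet one frame in twenty takes 45 ms (an instantaneous ~22 FPS), which is exactly the kind of hitching an FPS counter hides.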


----------



## Imprezzion

As the Lightning thread is pretty much dead, I'll just rant here.

Jumped on a deal for a secondhand (month-old) 290X Lightning, thinking it would be a nice sidegrade from my Tri-X, as the Lightning should clock MUCH better and run cooler and quieter.

Well, have I ever been so wrong...

It's louder, it's quite a lot hotter, it clocks like absolute garbage, and I have no idea how it can even run its 1080MHz base clock, as at +200mV it won't even run 1200MHz for 5 flipping seconds in Valley, let alone something harsher like 3DMark.

It hits like 75°C on auto fans at stock volts, while my Tri-X barely hits that at +120mV with auto fans...

Bah... very, very disappointed with the Lightning.


----------



## heroxoot

The Lightning 290X is a huge disappointment compared to the 7970 Lightning. I'd sell that thing back off. From everything I have seen, lower performance to bad OCing, it just isn't what a Lightning should be; the 7970 Lightning was an amazing overclocker, too.


----------



## bencher

Quote:


> Originally Posted by *Imprezzion*
> 
> As the Lightning thread is pretty much dead i'll just rant here.
> 
> Jumped on a deal for a secondhand (month old) 290X Lightning thinkin' it would be a nice sidegrade for my Tri-X as the Lightning should clock MUCH better and should be cooler and quiter.
> 
> Well, have I ever been so wrong...
> 
> It's louder, it's quite a lot hotter, it clocks like absolute garbage and I have no idea how it can even run it's 1080Mhz base clocks as at +200mV it won't even run 1200Mhz for 5 flipping seconds in Valley let alone something worse like 3dmark..
> 
> It hits like, 75c on Auto fans on stock volts while my Tri-X barely hits that at +120mV with Auto fans...
> 
> Bah.. very very disappointed with the Lightning.


Sorry to hear that. I knew it would be a disappointment based on the review.


----------



## Mercy4You

Quote:


> Originally Posted by *Imprezzion*
> 
> Jumped on a deal for a secondhand (month old) 290X Lightning thinkin' it would be a nice sidegrade for my Tri-X as the Lightning should clock MUCH better and should be cooler and quiter.
> 
> Well, have I ever been so wrong...
> 
> It's louder, it's quite a lot hotter....


The hotter thing is weird. I would complain to the seller. Sorry for your loss...


----------



## Imprezzion

Hmm... mistake on my end, then, for still believing the whole "the Lightning is the best there is" thing.

Well, it cost me a couple of bucks, but if that's all it costs... lesson well learned.









I mean, those idle temps, for example... what's with that? 28°C ambients, almost 45°C idle... My Tri-X, with much higher voltage, idles at 34°C...
I replaced the paste on the Lightning as well. It made no difference at all.


----------



## Mercy4You

I think the best card so far is the Sapphire Vapor-X.


----------



## Imprezzion

Quote:


> Originally Posted by *Mercy4You*
> 
> I think the best card till now is the Sapphire Vapor-X


It's not available in Holland; otherwise I'd have bought one already. My entire case + mods are blue, so I'd trade in my Tri-X for a Vapor-X no matter what it costs extra. However, it's still not available here...


----------



## Mercy4You

Quote:


> Originally Posted by *Imprezzion*
> 
> It's not available in Holland otherwise i'd have bought one already. My entire case + mods are blue so i'd trade in my Tri-X for a Vapor-X no matter what it costs extra however it's still not available here..


???? Max ICT, 79 in stock.....


----------



## hwoverclkd

Quote:


> Originally Posted by *Imprezzion*
> 
> It's not available in Holland otherwise i'd have bought one already. My entire case + mods are blue so i'd trade in my Tri-X for a Vapor-X no matter what it costs extra however it's still not available here..


Check out Amazon stock: the 290X Vapor-X (1080/1410 clocks) undercuts the Lightning's price by $40. I believe it's available for shipment outside the US.


----------



## Imprezzion

Quote:


> Originally Posted by *Mercy4You*
> 
> ???? Max ICT, 79 in stock.....


I will never, ever, put that kind of money in the hands of that sad excuse for a store, lol.
Also, it's the only store with so-called stock, and they're known to just make up random numbers for stock. So it probably isn't even actually there.


----------



## sugarhell

Quote:


> Originally Posted by *Imprezzion*
> 
> I will never, ever, put that kind of money in the hands of that sad excuse of a store lol.
> Also, it's the only store with a so called stock and they're known to just make up random numbers for stock. So it probably isn't even actually there.


http://www.overclockers.co.uk/showproduct.php?prodid=GX-351-SP&groupid=701&catid=56&subcat=1752


----------



## rdr09

Quote:


> Originally Posted by *Imprezzion*
> 
> Hmm.. Mistake on my end then still believing in the whole ''Lightning" is the best there is.
> 
> Well, it costs me a couple of bucks but if that's all it costs.. Lesson well learned
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I mean, those idle temps for example.. What's with that.. 28c ambients, almost 45c idle.. My Tri-X with much higher voltage idles at 34c...
> Replaced the paste on the Lightning as well. Made no difference at all.


Your Lightning is worse than acupalypse's. Maybe not... he had the black screens.


----------



## hwoverclkd

Quote:


> Originally Posted by *rdr09*
> 
> your lightning is worst than acupalyspe. maybe not, his blackscreens.


Forgot to tell ya: no more black screens.

Let me know when you find a good deal on a 1440p monitor; I've been watching B&H prices since December.


----------



## rdr09

Quote:


> Originally Posted by *acupalypse*
> 
> forgot to tell ya, no more black screens.
> 
> let me know when you found good deal on 1440p, been watching b&h prices since december.


I have not even considered B&H; I'd skip tax buying from there. I will now.

Good for you, no more black screens.


----------



## hwoverclkd

Quote:


> Originally Posted by *rdr09*
> 
> i have not even considered b&h. i skip tax buying from there. i will.
> 
> good for you no more bs.


Order it online and they won't charge tax; they only do if you buy in-store. The return policy is 30 days for most electronics.


----------



## velocityx

I think I've pinpointed the cause of my black screens in BF4: they're gone when I take my Sound Blaster Z out of the system. Googling turned up nothing, so I started doing some tests, and then I noticed something isn't right about how the card fits in the PCIe slot. When I screw it to the case, the PCIe connector comes out of the slot a little, but when I seat it 100%, the SBZ bracket doesn't align with my case. The Radeons fit nicely, so I guess the SBZ has a bad bracket? I can't really explain it: there's a 90-degree angle there, the board has all its standoffs, and the Radeons fit, but if I screw the SBZ down, the connector isn't 100% in. I took the bracket off entirely, and for now the card sits there held by the cables connected to it, so it's OK.

I'm seriously considering switching to the Z97 Hero and a 4770K or DC and selling the SBZ; the onboard sound on that board is supposedly close to the Xonar stuff.


----------



## Loktar Ogar

Possibly the bracket is not aligned perfectly; I experience that with most of my video cards... Just slightly bend the bracket or the mounting holes in your PC case. Even the PC case is not always a perfect fit.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Very nice work there bud
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I wouldn't go any bigger than 24" for 1080p, The pixels stretch a bit too much imo.
> 
> I'm very happy with my Qnix though, will be a few years before i move up to 4k so this is a nice gap in between
Click to expand...

Thanks maaaaate








I love my 1440p monitor. It makes my 1080p TV shows and movies look so much better, with more depth; games too.









Quote:


> Originally Posted by *rdr09*
> 
> Wow, a lot of hard work there m8. You need a 4930/60K. Great job!


Thanks maaaate








That's the next sidegrade. Need to bench the hexy with 2666 on the mem.








Quote:


> Originally Posted by *Imprezzion*
> 
> As the Lightning thread is pretty much dead i'll just rant here.
> 
> Jumped on a deal for a secondhand (month old) 290X Lightning thinkin' it would be a nice sidegrade for my Tri-X as the Lightning should clock MUCH better and should be cooler and quiter.
> 
> Well, have I ever been so wrong...
> 
> It's louder, it's quite a lot hotter, it clocks like absolute garbage and I have no idea how it can even run it's 1080Mhz base clocks as at +200mV it won't even run 1200Mhz for 5 flipping seconds in Valley let alone something worse like 3dmark..
> 
> It hits like, 75c on Auto fans on stock volts while my Tri-X barely hits that at +120mV with Auto fans...
> 
> Bah.. very very disappointed with the Lightning.


Spewing man








Looks like the ref 290s perform better-ish.


----------



## velocityx

Quote:


> Originally Posted by *Loktar Ogar*
> 
> Possibly the bracket is not aligned perfectly and i do experience that for most of my video cards... Just slighty bend the bracket or the mounting holes in your PC case. Even the PC case is not always a perfect fit.


Before I bend anything, I need to be perfectly sure it's not the SBZ causing all this. It seems CrossFire over PCIe doesn't like interference. I'll run onboard audio for a day and see if I get a black screen.

EDIT: Well, I spoke too soon; black screens on onboard also. I'll switch off Mantle and check in DirectX. Gee.


----------



## chronicfx

Quote:


> Originally Posted by *velocityx*
> 
> Before I bend anything I need to be perfectly sure it's not the SBZ causing all this. Seems CF over PCIE doesnt like interference. Will run onboard audio for a day see if I get a black screen,
> 
> EDIT- well I spoke too soon. onboard BS also. Will switch off mantle, check in direct x. gee.


No warranty for the SBZ? I would not trash it; I like mine. I think the PCIe lanes for the GPU come from the CPU and the x1 lanes come from the motherboard, so I don't think there would be interference or bandwidth sharing going on.


----------



## IBIubbleTea

Hey guys, I was wondering which brand I should get for my friend: the XFX R9 290 Double Dissipation or the MSI Gaming R9 290? Or any other brand... I don't think looks matter that much; it's more about customer service and price...


----------



## Dasboogieman

Quote:


> Originally Posted by *bencher*
> 
> I am at 1100/1250 stable (1hr crysis 3) on stick voltages
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The key is keeping the gpu under 80c which takes 60% fan speed for me.
> 
> My Asic quality is 66 btw.


Very, very nice. What's the stock voltage on your card? My Tri-X is 78.9% ASIC and it actually requires +12mV to achieve 1100MHz. I suspect that's because, since it's lower leakage, Sapphire programmed a lower stock voltage of 1.212V.

Quote:


> Originally Posted by *Imprezzion*
> 
> As the Lightning thread is pretty much dead i'll just rant here.
> 
> Jumped on a deal for a secondhand (month old) 290X Lightning thinkin' it would be a nice sidegrade for my Tri-X as the Lightning should clock MUCH better and should be cooler and quiter.
> 
> Well, have I ever been so wrong...
> 
> It's louder, it's quite a lot hotter, it clocks like absolute garbage and I have no idea how it can even run it's 1080Mhz base clocks as at +200mV it won't even run 1200Mhz for 5 flipping seconds in Valley let alone something worse like 3dmark..
> 
> It hits like, 75c on Auto fans on stock volts while my Tri-X barely hits that at +120mV with Auto fans...
> 
> Bah.. very very disappointed with the Lightning.


Thats rotten luck








I don't get why companies like MSI don't bother binning ASICs for performance or low leakage if they intend to charge this much of a premium. At this price range, you shouldn't have to beseech the chip-lottery gods as much.

What are the VRM temperatures like on the Lightning? The Tri-X has fairly weak VRM cooling, so even if the core is <75°C, the VRMs need >50% fan speed anyway; otherwise they easily hit 90 or 100 degrees when overvolted.
At this stage, the Vapor-X looks to be the best buy, since its price is reasonable compared to the Lightning. Here in Australia, the Lightning is on special and it's still equivalent in price to a stock GTX 780 Ti (which is faster, cooler, and quieter).


----------



## ace1ndahole

Get Sapphire or Gigabyte. The ASUS 290 lacks quality and doesn't have cooling for the VRMs, and I've read a lot of horror stories about MSI's infamous customer service.


----------



## Roy360

So basically, all my R9s are good for is the Heaven benchmark. I just tried to play Medieval II: Total War, and my screen has random squares flashing and a shaded bar that keeps going from bottom to top.

My cards aren't even overclocked.

Edit: looks like I have to change my post to: all CrossFire is good for is the Heaven benchmark.

Disabled CrossFire and everything is working. Question: why do games that break in CrossFire have profiles? I thought only games that supported CrossFire would use it.
Quote:


> Originally Posted by *ace1ndahole*
> 
> Get Sapphire or Gigabyte. The asus 290 lacks quality, doesn't have cooling for VRM and I read a lot of horror stories of MSI infamous customer service.


Sapphire isn't that good either (customer service). Not sure about Gigabyte.

ASUS has cooling; it's just that the fan speed is too low. If you up the fan to 60%, like most other manufacturers (Club3D, XFX and VisionTek), the VRMs are actually quite cool, ~40-50°C.


----------



## Goride

Quote:


> Originally Posted by *kizwan*
> 
> I'm asking because @Roboyto have the fan in "sucking" orientation. See *here*. It's a long shot but couldn't hurt to try.


Tried the fan thing, did not work.
Quote:


> Originally Posted by *armartins*
> 
> Maybe you have a really, really low ASIC card that needs a lot of juice at stock clocks... what's your default core voltage?


The ASIC score was 72.x%. Not the best, but not the worst.

I RMA'd the 290x. I bought a 290 reference that was on sale at newegg for $299.

The 290 arrived today. I set the 290 to the same clock speed as the stock 290x (1030/1250). In Firestrike the 290x was giving me around 8600 at those speeds. The 290 gave me over 9000 *insert vegeta pic*. When I had the 290x at 1200/1500 the best score I could get was 9600. When I had the 290 at 1150/1500 I got a 9867. But there was some artifacting. I would wager if I turned up the voltage a bit more to stabilize it, I would have gotten even higher.

The 290 only got to about 50°C at stock while running the benchmark, with the reference cooler; the 290X was doing around 90-100°C at stock voltage. With the 290 overclocked at +133mV, the highest VRM1 temp was 67°C, also on the reference cooler.

The ASIC of this 290 is only 71.7%.

Also while watching the benchmark videos during the 290 tests, the graphics looked much smoother.

Clearly there was just something wrong with that 290X. It was unreasonably hot and it was seriously underperforming. This 290 is outperforming it by a large margin while remaining significantly cooler. The 290 is NOT unlocked to run as a 290X.


----------



## kizwan

Quote:


> Originally Posted by *Roy360*
> 
> so basically all my r9 are good for is Heaven's benchmark. I just tried to play Total Medival War 2, and my screen has random squares flashing, and a shaden bar the keeps going from the bottom to top.
> 
> My card isn't even overclocked.
> 
> edit: looks like I have to change my post to: all crossfirre is good for is Heaven's benchmark
> 
> disabled crossfire and everything is working. Question why do games they break in crossfire have profiles? I thought only games that supported crossfire would use it
> Quote:
> 
> 
> 
> Originally Posted by *ace1ndahole*
> 
> Get Sapphire or Gigabyte. The asus 290 lacks quality, doesn't have cooling for VRM and I read a lot of horror stories of MSI infamous customer service.
> 
> 
> 
> Sapphire isn't that good either(customer service). Not sure about Gigabyte.
> 
> ASUS has cooling, it's just that the fan is too low. If up the fan to 60%, like most other manufacturers(Club3s, XFX and Visiontek), the vrms are actually quite cool ~40-50.
Click to expand...

I see you edited your signature. Happy, or buyer's/trader's remorse?









Please refresh my memory: is the square-flashing issue carried over from your previous rig, or does it only happen on the current rig? Even with the power limit set to max OR all the cards underclocked, does it still happen? I wonder whether you could disable one card in Device Manager, then enable CrossFire with just two cards and see whether it still happens.

BTW, what monitor do you have: 60Hz or 120Hz?


----------



## kckyle

Quote:


> Originally Posted by *Goride*
> 
> Tried the fan thing, did not work.
> The ASIC score was 72.x%. Not the best, but not the worst.
> 
> I RMA'd the 290x. I bought a 290 reference that was on sale at newegg for $299.
> 
> The 290 arrived today. I set the 290 to the same clock speed as the stock 290x (1030/1250). In Firestrike the 290x was giving me around 8600 at those speeds. The 290 gave me over 9000 *insert vegeta pic*. When I had the 290x at 1200/1500 the best score I could get was 9600. When I had the 290 at 1150/1500 I got a 9867. But there was some artifacting. I would wager if I turned up the voltage a bit more to stabilize it, I would have gotten even higher.
> 
> The temps of the 290 only got to about 50c at stock while running the benchmark, with the reference cooler. The 290x was doing around 90-100c stock voltage. When I had the 290 overclocked with +133mV, the highest the vrm1 temps got to was 67c, reference cooler.
> 
> The ASIC of this 290 is only 71.7%.
> 
> Also while watching the benchmark videos during the 290 tests, the graphics looked much smoother.
> 
> Clearly there was just something wrong with that 290x. It was unreasonably hot, and it was seriously under performing. This 290 is out performing it by a large margin, while remaining significantly cooler. The 290 is NOT unlocked to run as a 290x.


There are a number of things I'm having a hard time believing.

1st: $299? New? Can you show an invoice or receipt? I know even some used 290s on eBay don't go that low.

2nd: what fan speed are you running to achieve 50°C under heavy load (assuming your benchmark was pushing it to 100% usage)? Even when I play NFS Most Wanted, the fan spins up to 40-50 percent with the temp hovering around 80-90°C.


----------



## sinnedone

Quote:


> Originally Posted by *kckyle*
> 
> there are a number of things i'm having a hard time believing.
> 
> 1st. 299? new? can you show invoice or receipt? i know even some used 290 on ebay dont hit that low.


You just have to keep looking. I purchased two 290s for $400, used.


----------



## Roy360

Quote:


> Originally Posted by *ace1ndahole*
> 
> Get Sapphire or Gigabyte. The asus 290 lacks quality, doesn't have cooling for VRM and I read a lot of horror stories of MSI infamous customer service.


Quote:


> Originally Posted by *kizwan*
> 
> I see you edited your signature. Happy or buyer/trader remorse?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please refresh my memory. The square flashing issue is carried forward from previous rig or just happen on current rig? Even with power limit set to max OR all the cards underclock, it still happening? I wonder whether you can disabled one card in Device Manager, then enabled Crossfire just for two cards & see whether it still happening.
> 
> BTW, what monitor do you have? 60Hz or 120Hz?


Lost a few miscellaneous features (optical audio in), but for the most part I'm happy. Went from a 4.7GHz i5 (overvolted) to a 4.4GHz i7 (undervolted) for $50.









No performance gain at 1080p, though. I don't know why so many people on this forum said an i5 wasn't enough. With that said, I forgot to test higher resolutions on my old machine.

The cards are running at pure stock settings with Catalyst 13.11. I never bothered to check two cards at once; I'll check that out.

The AMD drivers don't seem to be syncing the cards, so I have one at 1000/1260, one at 947/1250, and one at 977/1250.

I'm running 3 monitors at 60Hz, but Total War doesn't seem to like multi-screen.


----------



## heroxoot

Quote:


> Originally Posted by *ace1ndahole*
> 
> Get Sapphire or Gigabyte. The asus 290 lacks quality, doesn't have cooling for VRM and I read a lot of horror stories of MSI infamous customer service.


Really? Because MSI's "terrible" customer service upgraded my broken 7970 to a 290X. Yeah, they're a really bad company.


----------



## kckyle

Quote:


> Originally Posted by *sinnedone*
> 
> You just have to keep looking. I purchased 2 290's for 400 used.


$400 for two 290s, yeah, I understand; a couple of guys on eBay were selling Sapphire 290s for $200 each a week ago. I know because I almost bought one, but I decided to spend a little extra for warranty's sake (saw that on GPUScanner). But $299 brand new from Newegg? I just bought an MSI 290 for $250 from eBay; I highly doubt Newegg is selling brand new ones for only 50 bucks more.


----------



## ace1ndahole

Haha, really? A 7970 to a 290X? Sounds exaggerated.

But many people have complained about MSI's service; maybe they have gotten better because of the feedback.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *ace1ndahole*
> 
> Haha really? a 7970 to 290X? sound exaggerated.
> 
> But many people have complain about MSI service but maybe they have gotten better because of feedback.


I think he means they gave him a 290X instead, not a BIOS flash or whatever...








@Arizonian
Could you put my third 290 on the list please? It's an unlocked, waterblocked XFX Core Edition with, so far, a 1290/1500 bench clock.


It had this thing on it


Fitted an XSPC block to it; much better.


Ghetto second 1200W PSU and an extra 360/45mm QR rad mod.


----------



## piquadrat

I have a small but annoying problem with overvolting my new Sapphire R9 290.
Whatever program I use (AB, Trixx), once a non-zero core voltage offset is applied to the card, I'm not able to reset it to the default.
The idle voltage returns to its default (0.992V), but the load voltage remains at the previous level.
Currently I'm working around this by defining a profile with stock clocks and a very small offset (+6mV), which surprisingly works.
Strange.
After a reboot, and before any overvolting, VDDC behaves as it should (1.15V under load, in my case).
Do you have any thoughts on this?


----------



## Bagmup

I just picked up my first Gigabyte 290 for my Prodigy build.


----------



## Goride

Quote:


> Originally Posted by *kckyle*
> 
> there are a number of things i'm having a hard time believing.
> 
> 1st. 299? new? can you show invoice or receipt? i know even some used 290 on ebay dont hit that low.
> 
> 2nd, at what fan speed are you running at for you to achieve 50c under heavy load(assuming your benchmark was pushing it to 100 usage). even when i play nfs most wanted the fan spins up to 40-50 percent with temp hovering around 80-90c


http://www.newegg.com/Product/Product.aspx?Item=N82E16814131544

It is back on sale again. With that 8% off promo code, it is $301 after a $30 rebate, plus the 3 free games.

(EDIT: This sale is $2 higher than what it was last weekend.)

I was running the fans at 70%. I was benchmarking and wanted to make sure it did not throttle.

But on my 290X, even with 70% fans, it was getting into the 90-100°C range when overclocked.


----------



## blue1512

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I think he means they gave him a 290x instead . Not bios flash or whatever ....
> 
> 
> 
> 
> 
> 
> 
> 
> @Arizonian
> Could you put my third 290 on the list please its a unlocked , waterblocked XFX Core edition with so far a 1290 / 1500 bench clock
> 
> 
> It had this thing on it
> 
> 
> Fitted XSPC Block to it much better
> 
> 
> Ghetto 2nd 1200w psu and extra 360 / 45mm QR rad modd


I am pretty sure I lost the auction for this exact card on eBay.


----------



## Imprezzion

Quote:


> Originally Posted by *Dasboogieman*
> 
> Thats rotten luck
> 
> 
> 
> 
> 
> 
> 
> 
> I don't get why companies like MSI don't bother binning ASICs for performance or low leakage if they intend on charging this much of a premium. At this price range, you shouldn't have to beseech the chip lottery gods as much.
> 
> Whats the VRM temperatures like on the Lightning? The Tri X had fairly weak VRM cooling so even if the core was <75c, the VRMs would need >50% fan speed anyway otherwise they easily hit 90 or 100 degrees when overvolted.
> At this stage, the Vapor-X looks to be the best since the price is reasonable compared to the lightning. In Australia here, the Lightning is on special and its still equivalent in price to a stock GTX 780Ti (which is faster, cooler and less loud).


VRM cooling on the Lightning is godlike. I mean, the VRMs barely even touched 60°C after 30 minutes of running +200mV.

The Tri-X has weaker VRM cooling, but that's not really down to the cooler itself.
First of all, they use very bad thermal pads. I upgraded to the famous Fujipoly Ultra Extreme 1mm pads and temps dropped ~20°C on VRM1.
Also, it still has the stock reference power delivery, so don't expect miracles: at, say, +100mV it's getting close to its operational limits, so of course it'll get hot. The Lightning has twice the phases.

However, on air I can comfortably run up to +150mV without the VRMs going into the 90s.

Right now I ran 1200MHz core, 1500MHz VRAM, +165mV core, +50mV AUX, and at 67% fan speed the core runs at about 80-82°C flat and VRM1 tops out at about 92-93°C. The card is still reasonably quiet then; it's audible, but not overly annoying.

A little more fan speed, 80%, louder of course, drops temps to 75°C core and 87°C VRM1.

But in all honesty, I liked my 780s a LOT more than I do my 290X. The 290X performs way better than any of my 780s, even the 1300MHz+ ones, but the 780s are such a joy to play with. I just love playing with a card and OCing it... all the BIOS flashing and modding, the command-line tools... everything with the 780s is just much nicer to tinker with, and they have WAY more headroom. So yeah, I'd really want a 780 Ti, but here in Holland they are usually around €500-550 secondhand and €575-650 new, which is a LOT more than any 290X.


----------



## Silent Scone

I echo most of the above: from my brief testing with the Lightning, the VRMs never exceeded 70°C. That's with an indicated +325mV, which was showing as roughly over 1.51V on the multimeter. This was obviously over bench runs and in no way reflects daily use. However, that is an incredible feat for air cooling. It's probably the best air cooler ever made. Genuinely.


----------



## Imprezzion

Just for the hell of it, I'm running the Tri-X at +200mV now with the Fujipolys. Going for a round of Battlefield 4 CTE on Locker (highest temps of any map), and I'll report temps and such.

Core speed is 1220MHz, memory 1500MHz, +200mV, 80% fan speed.
Game on all Ultra with 2x AA and 125% resolution scale, running Mantle on the 14.4 drivers.

EDIT: Results are in.
Flat lines for temps at 78°C core and 95°C VRM1, and that's at the full +200mV. Not all that bad, but I wouldn't really want to run 95°C on VRM1 at 24/7 clocks... or do you guys think 95°C @ +200mV isn't even that bad?


----------



## Dasboogieman

Quote:


> Originally Posted by *Imprezzion*
> 
> VRM cooling on the Lightning is godlike. I mean, barely even touched 60c VRM's after 30 minutes of running +200mV.
> 
> The Tri-X has weaker VRM cooling but that's not really down to the cooling itsself.
> First of all, they use very bad thermal pads. I upgraded to the famous Fujipoly Ultra Extreme 1MM pads and temps dropped ~20c on VRM1.
> Also, it still has the stock reference power delivery system so don't expect miracles as on say, +100mV it's getting close to it's operational limits so ofcourse it'll get hot. The Lightning has twice the phases.
> 
> However, I can, on air, comfortably run up to +150mV without VRM's going into the 90's.
> 
> Right now, I ran 1200Mhz core, 1500Mhz VRAM, +165mV core, +50mV AUX and at 67% fanspeed core runs at about 80-82c flat and VRM1 tops out at about 92-93c. Card is still reasonably quiet then. It's audible, but not overly annoying.
> 
> Little more fanspeed, 80%, louder ofcourse, drops temps to 75c core and 87c VRM1.
> 
> But, in all honestly, I liked my 780's a LOT more then I do my 290X. The 290X performs way better then any of my 780's, even the 1300Mhz+ ones, but they are such a joy to play with. I just love playing with a card and OCing it.. I mean, all the BIOS flashing and modding, the command line codes.. everything with the 780's is just much nicer to play with and they have WAY more headroom. So yeah, i'd really wanna have a 780Ti but here in Holland they are usually around €500-550 secondhand and €575-650 new. which is a LOT more than any 290x.


Wow, yeah, I guessed the combination of more phases, efficient MOSFETs and sheer metallic bulk would contribute to the Lightning's strong overvoltage headroom.
That being said, a backplate on the Tri-X would drop temperatures a lot too, but sadly Sapphire uses these really rare M1.5 screws that don't come in long varieties (6mm+) without resorting to custom screws, which makes it impossible to attach an aftermarket backplate.

I've already Fujipoly-modded mine and yeah, for me also +165mV, 1200/6000, 65% fan speed is the absolute limit the VRMs can take, though the core is still happy at 80 degrees.

I agree, I used to run a GTX 570 SLI system and my goodness, NVIDIA was so much easier to fiddle with. They were a tinkerer's dream because stuff just worked and was better supported. No PowerPlay crap, no enforced BIOS signatures, NvidiaInspector, etc. However, the value for money was always poor and longevity was really bad. NVIDIA cards just didn't seem to age well because their hardware simply wasn't futureproofed (the 1.28GB VRAM on the GTX 570 was criminal). Plus, NVIDIA hardware also seemed to be more fragile under extreme conditions (I thankfully wasn't affected by the 570 VRM fiasco).


----------



## velocityx

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Fitted XSPC Block to it much better


I see the core VRM has no thermal pads. Did you add them, or were they just not fitted yet in the picture?


----------



## nightfox

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I think he means they gave him a 290x instead . Not bios flash or whatever ....
> 
> 
> 
> 
> 
> 
> 
> 
> @Arizonian
> Could you put my third 290 on the list please its a unlocked , waterblocked XFX Core edition with so far a 1290 / 1500 bench clock
> 
> 
> It had this thing on it
> 
> 
> Fitted XSPC Block to it much better
> 
> 
> Ghetto 2nd 1200w psu and extra 360 / 45mm QR rad modd


nice.. you need a bigger case


----------



## alancsalt

Quote:


> Originally Posted by *velocityx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HOMECINEMA-PC*
> 
> Fitted XSPC Block to it much better
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I see the core VRM has no thermal pads. did you put it or its just the picture doesnt have it yet?
Click to expand...

Do you mean the 2 x "2" on the left?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Bagmup*
> 
> I just picked up my first giggy 290 for my prodigy build


Hope you got it for a good price .........









Quote:


> Originally Posted by *blue1512*
> 
> T
> I am pretty sure that I lost the auction of this exact card on eBay


That's a shame, could have been an unlockable one

Quote:


> Originally Posted by *velocityx*
> 
> I see the core VRM has no thermal pads. did you put it or its just the picture doesnt have it yet?


Wasn't fitted at time of picture








@alancsalt
Yes that one









Quote:


> Originally Posted by *nightfox*
> 
> nice.. you need a bigger case


Suggestions ??

And I need a better camera, I'm using a potato ATM.....



and

HOMECINEMA-PC [email protected]@2417 TRI 290 WB @ 1300 / 1400 *32194*

http://www.3dmark.com/3dm11/8351447


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Suggestions ??


Case Labs S8,

Open bench maybe


----------



## VSG

The S8 by itself doesn't support dual PSU.


----------



## Bagmup

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Hope you got it for a good price .........


Oh yeah, not bad


----------



## Sgt Bilko

Quote:


> Originally Posted by *geggeg*
> 
> The S8 by itself doesn't support dual PSU.


Good point, forgot he's using duals









Magnum M8 then?

Not 100% familiar with Case Labs cases, every time I look at them I want to buy....


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Sgt Bilko*
> 
> not 100% familiar with Case Labs cases, everytime i look at them i want to buy....


Maybe you should buy.... they're purdy


----------



## Sgt Bilko

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Maybe you should buy.... there purdy


Noooooooo..........Maybe.......Yeah.........Nope Nope Nope.

need to save some cash to go overseas









But they are purdy though


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Noooooooo..........Maybe.......Yeah.........Nope Nope Nope.
> 
> need to save some cash to go overseas
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But they are purdy though


Get out! lol

They're awesome cases














my SM5


----------



## VSG

Why go overseas for a short trip when you can get a caselabs case and have the overseas come to you forever?


----------



## velocityx

I was researching dual PSUs too, but read that it's not recommended because the 5V line isn't loaded, which causes drops on the 12V rail that can cause instability.

Can anyone confirm?


----------



## Sgt Bilko

Quote:


> Originally Posted by *geggeg*
> 
> Why go overseas for a short trip when you can get a caselabs case and have the overseas come to you forever?


To see my Wife


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Sgt Bilko*
> 
> To see my Wife


Wives are only temporary, Caselabs is forever


----------



## Gobigorgohome

Okay guys, I have a BIG problem. I just got two R9 290s, and when I installed the second card the "CrossFire" option was nowhere to be found. It isn't under the "Performance" menu. I read a little on the internet and uninstalled the drivers (and everything else from AMD), then installed 13.12, and the same thing happened. The only thing under the "Performance" category is "AMD Overdrive", and that's it.
So I have tried two drivers and neither of them works for me.

Both cards appear in device manager and they are installed properly.

Thanks in advance.

PS: Had the 14.4 installed at first


----------



## hwoverclkd

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Wives are only temporary, Caselabs is forever


if the wife is happy, you can setup any rig you want


----------



## Widde

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Okay guys, I have a BIG problem. I have just got two R9 290s and when I installed the second card the "Crossfire" option was nowhere to find. It did not lie under the "Performance" menu, then I read a little on the internet and uninstalled the drivers (and everything else from AMD), then I installed the 13.12 and same thing happens. The only thing that is under the "Performance" category is "AMD Overdrive" and that's it.
> So I have tried two drivers and none of them are working for me.
> 
> Both cards appear in device manager and they are installed properly.
> 
> Thanks in advance.
> 
> PS: Had the 14.4 installed at first


Try both cards individually just to eliminate the possibility of a faulty card


----------



## Aussiejuggalo

Quote:


> Originally Posted by *acupalypse*
> 
> if the wife is happy, you can setup any rig you want


Or just stay single









What's the average OC people are getting on Tri-X 290s?


----------



## Gobigorgohome

Quote:


> Originally Posted by *Widde*
> 
> Try both cards individually just to eliminate the possibility of a faulty card


I have checked both cards individually and there is nothing wrong with either of them; they are functioning just fine.

I am going to check if the motherboard will do SLI in the PCI-E ports instead of CFX. It is a Sabertooth Z77 by the way.

PS: Motherboard failure


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> @Arizonian
> Could you put my third 290 on the list please its a unlocked , waterblocked XFX Core edition with so far a 1290 / 1500 bench clock
> 
> 
> It had this thing on it
> 
> 
> Fitted XSPC Block to it much better
> 
> 
> 
> 
> Ghetto 2nd 1200w psu and extra 360 / 45mm QR rad modd


Nice!







I can't do that ^^, my cats will destroy them.








Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> and I need a better camera im using a potatoe ATM.....
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> and
> 
> HOMECINEMA-PC [email protected]@2417 TRI 290 WB @ 1300 / 1400 *32194*
> 
> http://www.3dmark.com/3dm11/8351447





















You can get a beginner/entry level DSLR/SLT camera & they're cheap.
Quote:


> Originally Posted by *velocityx*
> 
> I was researching dual gpus too but read that its not recommemded because the 5volt line isnt used so it causes drops on the 12v rail thst can cause instability.
> 
> Can anyone confirm?


Don't know about that but many people use dual GPUs without any problem.
Quote:


> Originally Posted by *Gobigorgohome*
> 
> Okay guys, I have a BIG problem. I have just got two R9 290s and when I installed the second card the "Crossfire" option was nowhere to find. It did not lie under the "Performance" menu, then I read a little on the internet and uninstalled the drivers (and everything else from AMD), then I installed the 13.12 and same thing happens. The only thing that is under the "Performance" category is "AMD Overdrive" and that's it.
> So I have tried two drivers and none of them are working for me.
> 
> Both cards appear in device manager and they are installed properly.
> 
> Thanks in advance.
> 
> PS: Had the 14.4 installed at first


What does GPU-Z say? Look at the "ATI CrossFire" field in GPU-Z.


----------



## DeadlyDNA

I will be swapping my 4 R9 290s for 4 R9 290Xs sometime next week. Thank you, mining craze; people are selling really cheap. I just have to hope they're good to go.
After I test them at stock I will put them under water and sell off my 290s.


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Hope you got it for a good price .........
> 
> 
> 
> 
> 
> 
> 
> 
> That's a shame could of been a unlockable one
> Wasn't fitted at time of picture
> 
> 
> 
> 
> 
> 
> 
> 
> @alancsalt
> Yes that one
> 
> 
> 
> 
> 
> 
> 
> 
> Suggestions ??
> 
> and I need a better camera im using a potatoe ATM.....
> 
> 
> 
> You have those size potatoes? While in Hobart I figured I'd try this steak place, and they served this huge steak that came with 2 golf-ball-sized potatoes. The loin was tops.
> 
> and
> 
> HOMECINEMA-PC [email protected]@2417 TRI 290 WB @ 1300 / 1400 *32194*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8351447


you bin those 290s? 1300 core on three.


----------



## Talon720

I'm having a weird issue. I put in my 3rd card, and in BF4 + Mantle I got artifacting I hadn't seen before. If I continued to play I got a mantle-gpu-function-error, driver related (14.4 RC). I had chalked it up to high temps at the time :shrug: . So I fixed the temp issue and replaced the CPU after a delidding gone bad. When I put everything back together the artifacting was gone and it ran beautifully and smooth. I decided to try to reach my max daily OC in Afterburner and got some 116 BSODs and some 124 + 101 from my CPU OC. Now both are stable, but BF4 started showing the crazy artifacting again. It doesn't artifact in DX11, but GPU usage seems to suck on 2 of the GPUs according to HWiNFO64, whereas Mantle had all 3 GPUs at full load. I tried DDU and reinstalling 14.4 RC and the 14.4 press release; I guess I'm going to keep trying drivers. Also, the artifacting in Mantle seems to only happen in trifire; crossfire and single-GPU configs seem fine. I'm almost positive it's driver related and not a broken GPU, but I dunno how it got fixed the first time or how to do it again. Anyone have a similar issue with low GPU usage in DX11 and artifacting in Mantle?


----------



## cennis

Anyone had black screening with Displayport?

I am getting black screens when I push voltage over +100mV, ONLY when connected via DisplayPort, regardless of resolution.

On DVI I have 1200/1500 +175mV stable


----------



## piquadrat

Are there any thermal photos of the back of the reference R9 290 available somewhere?
I wonder where the hottest area on the back of the card is. I want to build a custom heatsink/(partial) backplate for it and connect it to the PCB at the essential points through thermal pads.
Those VRM temps are way too high for me (almost 80C at 1200MHz/1500MHz) even with a large custom-built heatsink lying on VRM1.
I have to cool the back as efficiently as possible. I think that's the only path I can follow at the moment.


----------



## owcraftsman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mercy4You*
> 
> I've read that on 27 inch 2560x1440 items look smaller than on 24 inch 1920 x 1080....
> 
> Can you confirm this? How does this work in games, do you get to see more although smaller?
> 
> 
> 
> Web browsing and such (This forum for example) i've had to go to 125% zoom to read most of the text and such.
> 
> Easiest way to test for yourself is to run a 1080p Monitor in 720p for a week or so and then take it back to 1080p, same difference (1.5x the previous res)
> 
> and i don't really see more in games it's just more detail (pixel density) and the HUD does appear smaller in a few games but nothing so major as i can't read or understand it. From what i've seen there usually is a HUD size adjust option in most games now anyway.
> 
> Hope that answers your question in some way
Click to expand...

A little keyboard/mouse trick I use to keep 100% DPI is Ctrl + scroll wheel to zoom browser content. I actually prefer the smaller content at 100% DPI, but admit things can be a bit too small at times in a browser window. Try this tip, you'll like it.


----------



## velocityx

Quote:


> Originally Posted by *piquadrat*
> 
> Are there any thermal photos of the back of the reference R9 290s available somewhere?
> I wonder where is the hottest area at the back of the card. Want to build a custom heatsink/(partial-)backplate for it and connect it with the pcb with essential points through pads.
> Those vrm temps are way too high for me (almost 80C-1200Mhz/1500MHz) even with custom built large heatsink laying on the vrm1.
> Have to cool the back as efficient as possible. I think, the only path I can follow at the moment.


Guru3d has thermal imaging in their reviews.


----------



## Sgt Bilko

Quote:


> Originally Posted by *owcraftsman*
> 
> Little keyboard/mouse trick I use to keep 100% DPI is Ctrl + Scroll Wheel to zoom browser content. I actually prefer the smaller content with 100% DPI but admit things can be a bit to small at times in a browser window. Use this tip you'll like it.


I already do that, and I have on-the-fly DPI switching on my mouse; I use it more in Windows than in gaming actually









Thanks though, they're great tips nonetheless


----------



## hwoverclkd

Quote:


> Originally Posted by *cennis*
> 
> Anyone had black screening with Displayport?
> 
> I am having blackscreening when I push voltage over 100mw ONLY when connected to displayport regardless of resolution,
> 
> on DVI I have 1200/1500 +175mw stable


Same here... black screen when coming out of idle regardless of the clock/voltage setting, even at stock. I also thought it could be because of DisplayPort, so I tried HDMI and DVI; those did NOT black-screen, but the card reboots on its own during idle. Yours is pretty common. There's a black screen poll thread, in case you haven't seen it yet.


----------



## Red1776

everyone in the pool...


----------



## velocityx

Kinda wish OEMs would start selling cards without coolers. Kinda sad that these 4 excellent coolers are gonna go to waste.

Are they reference boards?


----------



## Red1776

Quote:


> Originally Posted by *velocityx*
> 
> Kinda wish oems would start selling cards without coolers. Kinda sad that these excellent 4 coolers are gonna go to waste.
> 
> are they reference boards?


Well, they follow the reference design with one exception: MSI put taller capacitors around the VRM area. That's why I needed to wait for the EK rev 2.0 FC blocks


----------



## VSG

Those MSI Gaming cards, if bought recently, use the reference design but with taller caps. As for why they sell with coolers: it is way easier to test for DOA or any other fault with the cooler on, and it also lowers the liability of damage from users or shipping.


----------



## sugarhell

You always need an air cooler.


----------



## velocityx

I wonder if these fit on a reference amd board...


----------



## Gobigorgohome

Quote:


> Originally Posted by *DeadlyDNA*
> 
> I will be changing out my 4 r9 290s for 4 r9 290x sometime next week. Thank you mining craze, people are selling really cheap. I just have to hope they are good to go.
> after I test them on stock I will put them under water and sell off my 290's


I got to ask, why do you do that upgrade?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> To see my Wife
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *acupalypse*
> 
> if the wife is happy, you can setup any rig you want
Click to expand...

Happy wife happy life ............. sentence









Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Or just stay single
> 
> 
> 
> 
> 
> 
> 
> 
> 
> whats the avg OC people are getting on Tri-X 290s?


LooooooooL









Quote:


> Originally Posted by *kizwan*
> 
> Nice!
> 
> 
> 
> 
> 
> 
> 
> I can't do that ^^, my cats will destroy them.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You can get a beginner/entry level DSLR/SLT camera & they're cheap.
> Don't know about that but many people use dual GPUs without any problem.
> What GPU-Z say? Look at "ATI Crossfire" field in GPU-Z.


I will look for better camera OR find the charger for my Sony Cybershot

Quote:


> Originally Posted by *rdr09*
> 
> you bin those 290s? 1300 core on three.


Yep, 1300 on tri, 1330 so far on CF, and 1350 single, and the mem overclock does pretty much jack. Max is 1400 - 1525MHz

Quote:


> Originally Posted by *cennis*
> 
> Anyone had black screening with Displayport?
> 
> I am having blackscreening when I push voltage over 100mw ONLY when connected to displayport regardless of resolution,
> 
> on DVI I have 1200/1500 +175mw stable


Yes, my DP does it too. I have to bench on a 46" 1080p screen over HDMI instead of the 1440p monitor









Quote:


> Originally Posted by *Red1776*
> 
> everyone in the pool...











Quote:


> Originally Posted by *sugarhell*
> 
> You always need an air cooler.


Especially if you don't run watercooling


----------



## sugarhell

Lol, like the guys that bought an EVGA Hydro Copper without having a loop and just fired up the GPU.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I got to ask, why do you do that upgrade?


I need every advantage I can get at 4K Eyefinity; well, I can try at least. These should help by a decent margin at high resolutions, I hope. And they are dirt cheap on eBay, even if they are ex-mining cards.


----------



## DeadlyDNA

Quote:


> Originally Posted by *sugarhell*
> 
> Lol like the guys that bought evga hydro without having a loop and just fire up the gpu.


wait what???

oh no i googled it.... what the....


----------



## Gobigorgohome

Quote:


> Originally Posted by *DeadlyDNA*
> 
> i need every advantage i can get at 4k eyefinity. well i can try at least. These should help by a decent margin i hope at high resolutions. And they are dirt cheap on ebay even if they are ex miner cards.


Why don't you just go with 4x GTX Titan Black and overvolt like crazy? I have seen some pretty impressive numbers coming from a few of those cards at high voltage. If you have the money to buy 4x R9 290s and 4x R9 290Xs, I am sure you can scrape together the cash for four of those too. You probably have a lot more experience with GPUs than me, but from what I have seen there is not really any question about the GTX Titan Black at high resolutions.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *sugarhell*
> 
> Lol like the guys that bought evga hydro without having a loop and just fire up the gpu.












Quote:


> Originally Posted by *DeadlyDNA*
> 
> wait what???
> 
> oh no i googled it.... what the....


I'm lazy, link pls


----------



## DeadlyDNA

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Im lazy link pls


http://www.tomshardware.com/forum/384756-33-hydro-copper-help

http://rog.asus.com/forum/showthread.php?35861-need-help-with-EVGA-GeForce-GTX-TITAN-Hydro-Copper-Signature

Just a couple; apparently this happens more than once.
Quote:


> Originally Posted by *Gobigorgohome*
> 
> Why don't you just go with 4x GTX Titan Black and overvolt like crazy? I have seen some pretty impressive numbers coming from a few of those cards at high voltage. If you have the money to buy 4x R9 290s and 4x R9 290Xs I am sure you can wipe together the cash for four of those too. You probably have a lot more experience than me at GPUs, but from what I have seen there is not really any question about the GTX Titan Black at high resolutions.


Believe it or not, even buying 4 of both, I still paid less than 4 Titan Blacks would cost; I would still need at least another $1k for 4 Titan Blacks.
And I will be selling off my 290s to offset the cost, to boot.


----------



## Bagmup

Quote:


> Originally Posted by *DeadlyDNA*
> 
> http://www.tomshardware.com/forum/384756-33-hydro-copper-help
> 
> http://rog.asus.com/forum/showthread.php?35861-need-help-with-EVGA-GeForce-GTX-TITAN-Hydro-Copper-Signature
> 
> just a couple apparently this happens more than once.


Guy buys two Hydro Copper Titans, doesn't realise they need water... wait wut??

All the gear and no idea...


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *DeadlyDNA*
> 
> http://www.tomshardware.com/forum/384756-33-hydro-copper-help
> 
> http://rog.asus.com/forum/showthread.php?35861-need-help-with-EVGA-GeForce-GTX-TITAN-Hydro-Copper-Signature
> 
> just a couple apparently this happens more than once.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> believe it or not, even buying 4 of both, i still paid less than 4 titan blacks. i would still need at least another 1k for 4 titan blacks.
> And i will be selling off my 290's to offset the cost to boot.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Bagmup*
> 
> Guys buys two hydro copper titans, doesn't realise they need water...wait wut??
> 
> All the gear and no idea...


You would think the different look or the extra cost would mean something. I guess some people just think the more money, the better, maybe?


----------



## Gobigorgohome

Quote:


> Originally Posted by *DeadlyDNA*
> 
> just a couple apparently this happens more than once.
> believe it or not, even buying 4 of both, i still paid less than 4 titan blacks. i would still need at least another 1k for 4 titan blacks.
> And i will be selling off my 290's to offset the cost to boot.


Okay, I see. For the best performance at 4K Eyefinity/Nvidia Surround I still think the Titan Black will serve you best.


----------



## Bagmup

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Okay, I see. For the best performance at 4K Eyefinity/Nvidia Surround I still think the Titan Black will do you best.


Price to performance, the 290x smashes the Titan blacks.


----------



## Gobigorgohome

Quote:


> Originally Posted by *Bagmup*
> 
> Price to performance, the 290x smashes the Titan blacks.


Yes, I know that. I didn't think the 4K panels were dirt cheap, my bad.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Yes, I know that. I thought the 4K panels wasen't dirt cheap, my bad.


The ones I have are, compared to what's on the market. 39-inch Seiki screens can be found used for good prices. They only run 30Hz at 4K, so they are not generally recommended. However, for benching at 4K Eyefinity, and as that becomes more popular... it's for science







It's a poor man's setup


----------



## Dasboogieman

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Or just stay single
> 
> 
> 
> 
> 
> 
> 
> 
> 
> whats the avg OC people are getting on Tri-X 290s?


Mine has 78.9% ASIC quality

Stock voltage is 1.21V
1100/1500 at +12mV
1150/1500 at +125mV
1200/1500 at +165mV

Been gaming at these settings for months. Sometimes I'd get flickering on the screen, though dropping the mem clock 25MHz fixed that. I read it was caused by the memory controller becoming unstable when it heats beyond 75 degrees. I can confirm the flickering goes away under water cooling.
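For a quick sanity check, the offsets above can be turned into rough absolute core voltages. This assumes Afterburner's offset simply adds on top of the 1.21V stock VID, which is only an approximation (vdroop and load-line behavior shift the real readings):

```python
# Rough absolute core voltage per OC step, assuming the Afterburner
# offset (mV) adds directly to the 1.21 V stock VID -- an approximation;
# vdroop under load will shift the real sensor readings.
STOCK_VID = 1.21  # volts

steps = [(1100, 12), (1150, 125), (1200, 165)]  # (core MHz, offset mV)

for clock_mhz, offset_mv in steps:
    volts = STOCK_VID + offset_mv / 1000
    print(f"{clock_mhz} MHz -> ~{volts:.3f} V")
```

So the 1200MHz step lands around 1.375V requested, which lines up with VRM temps becoming the limit on air.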


----------



## Bagmup

Can I ask a quick noob question, will the Aquacomputer full cover block fit a rev1 giggy windforce pcb? I'm assuming the rev 1 is reference design with the rev 2 having taller caps?

http://www.frozencpu.com/products/22294/ex-blc-1593/Aquacomputer_Kryographics_Hawaii_Radeon_R9_290X_Full_Coverage_Liquid_Cooling_Block_-_Copper_Acrylic_23580.html?tl=c357s1657b200#blank


----------



## devilhead

Can somebody tell me how to add +250mV through MSI Afterburner?
Tested the PT1 BIOS and 1.5V







and I think the original Sapphire BIOS is better, it just needs more voltage; usually I run TriXX at +200mV to test


----------



## Bagmup

Quote:


> Originally Posted by *devilhead*
> 
> Can somebody tell how to add +250mv thru MSI Afterburner?
> tested pt1 bios and 1.5v
> 
> 
> 
> 
> 
> 
> 
> and i think original sapphire bios is better, just need more voltage, usualy i run triXX +200mv for test


Is the voltage slider unlocked?


----------



## devilhead

Quote:


> Originally Posted by *Bagmup*
> 
> Is the voltage slider unlocked?


In MSI Afterburner you can use +100mV, but there is some hack where you modify an MSI Afterburner file and can get +250mV; you just need to know how...


----------



## Bagmup

Quote:


> Originally Posted by *devilhead*
> 
> in MSI Afterburner you can use +100mv, but it is some hack, to modify MSI Afterburner file and you can get 250mv, just need to know how...


I think you will have to flash the BIOS to do that.
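For reference, since there are no public BIOS editors for Hawaii, the usual route is flashing a known-good or pre-modded ROM (e.g. the PT1 BIOS mentioned above) with ATIFlash. A rough command sketch, assuming adapter 0 and illustrative file names; double-check a flashing guide before trying this, since a bad flash can leave the card unbootable:

```
atiflash -i               rem list detected adapters
atiflash -s 0 backup.rom  rem save the current BIOS of adapter 0 first
atiflash -p 0 pt1.rom -f  rem program adapter 0; -f forces past subsystem ID mismatches
```

On reference 290/290X cards the dual-BIOS switch gives you a fallback: if the flashed position fails to POST, flip to the other position, boot, and re-flash.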


----------



## EpIcSnIpErZ23

Just bought a reference Sapphire 290x for 300 bucks shipped! Gonna watercool it with the Aquacomputer Hawaii Block







. Can't wait for it to arrive!


----------



## DeadlyDNA

Quote:


> Originally Posted by *EpIcSnIpErZ23*
> 
> Just bought a reference Sapphire 290x for 300 bucks shipped! Gonna watercool it with the Aquacomputer Hawaii Block
> 
> 
> 
> 
> 
> 
> 
> . Can't wait for it to arrive!


Grats on your deal, man; they are dirt cheap right now. If they are problem-free they are extremely good deals.


----------



## EpIcSnIpErZ23

Quote:


> Originally Posted by *DeadlyDNA*
> 
> grats on your deal man, they are dirt cheap right now. If they are problem free they are extremely good deals right now.


Thanks! I'll let you know how it plays out. He claims it was mined on for 2 weeks. The receipt shows he bought it 4 weeks ago, so it seems legit. It has the BF4 coupon and original box. Since I already have BF4, imma do a free giveaway for the coupon lol.


----------



## Roy360

Well, this is interesting. I started up Medieval Wars II with different combinations of the video cards. There was one instance where I thought the flickering went away, but as soon as I re-maximized the window it came back. For now I'm just going to blame it all on the game. One card is more than enough to max all the settings and probably get over 100fps.

It seems the flickering only starts 2-4 seconds into the game, and by game I mean the start menu.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Roy360*
> 
> well this is interesting. I started up Medieval Wars II with different combinations of the video cards. There was one instance when I thought the flickering went away, but as soon as I re-maximized the window, it came back up. For now I'm just going to blame it all on the game. One card is more than enough to max all the settings and probably get over 100fps.
> 
> it seems the flickering only starts, 2-4 seconds into the game, and by game I mean the start menu.


Can you try the compatibility mode checkbox in the shortcut properties? If it's in Steam, try making a direct shortcut by browsing to the game folder and the game exe.


----------



## bencher

Hi guys, what can I use to edit the BIOS of the R9 290? RBE is saying "invalid file".

And what is the difference between VDDC and VDDCI?


----------



## Arizonian

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I think he means they gave him a 290x instead . Not bios flash or whatever ....
> 
> 
> 
> 
> 
> 
> 
> 
> @Arizonian
> Could you put my third 290 on the list please its a unlocked , waterblocked XFX Core edition with so far a 1290 / 1500 bench clock
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> It had this thing on it
> 
> 
> Fitted XSPC Block to it much better
> 
> 
> Ghetto 2nd 1200w psu and extra 360 / 45mm QR rad modd


Updated - congrats
















Quote:


> Originally Posted by *Red1776*
> 
> everyone in the pool...
> 
> 
> Spoiler: Warning: Spoiler!


Updated - Congrats as well


----------



## kizwan

Quote:


> Originally Posted by *devilhead*
> 
> Can somebody tell how to add +250mv thru MSI Afterburner?
> tested pt1 bios and 1.5v
> 
> 
> 
> 
> 
> 
> 
> and i think original sapphire bios is better, just need more voltage, usualy i run triXX +200mv for test


Use GPU Tweak.

Sorry, misunderstood your post.


----------



## Forceman

Quote:


> Originally Posted by *bencher*
> 
> hi guys what can I use to edit the bios of the R9 290? RBE is saying "invalid file"
> 
> And what is the difference between VDDC and VDDCI.


You can't.

VDDC is core voltage, VDDCI is something else, possibly PLL Voltage (since the theory is that's what AUX voltage is in AB, and they are the same voltage).


----------



## bencher

Quote:


> Originally Posted by *Forceman*
> 
> You can't.
> 
> VDDC is core voltage, VDDCI is something else, possibly PLL Voltage (since the theory is that's what AUX voltage is in AB, and they are the same voltage).


What is your VDDC currently at?

Are you sure? The internet says VDDC is VRAM voltage :/


----------



## bencher

Quote:


> Originally Posted by *Forceman*
> 
> You can't.
> 
> VDDC is core voltage, VDDCI is something else, possibly PLL Voltage (since the theory is that's what AUX voltage is in AB, and they are the same voltage).


Quote:


> Originally Posted by *bencher*
> 
> What is your VDDC currently at?
> 
> Are you sure? the internet says VDDC is Vram voltage :/


Ok, you are right.

I just realized my card's idle voltage is 1.227V and load is 1.18V.... Yes, the load is lower than idle.

How can I edit my BIOS?


----------



## Forceman

Quote:


> Originally Posted by *bencher*
> 
> Ok you are right.
> 
> I just realized my cards idle voltage is 1.227 and load is 1.18v.... Yes the load is lower than idle.
> 
> How can I edit my bios?


There is no way to edit the BIOS - the card enforces BIOS signing or something, which makes them uneditable. Or at least that's my understanding of why there are no editing tools available. In any case, there are no tools to edit it.

Strange that your card doesn't idle lower - at stock mine is also 1.18V under load, but idles at 0.96V (or something close to that).


----------



## the9quad

Quote:


> Originally Posted by *Forceman*
> 
> Strange that your card doesn't idle lower - at stock mine is also 1.18V under load, but idles at 0.96V (or something close to that).


I am betting this is why: "*3x 23" 1080p*"


----------



## Forceman

Quote:


> Originally Posted by *the9quad*
> 
> I am betting this is why: "*3x 23" 1080p*"


Yeah, I didn't see that. Sounds like the culprit.


----------



## bencher

Quote:


> Originally Posted by *Forceman*
> 
> There is no way to edit the BIOS - the card enforces BIOS signing or something, which makes them uneditable. Or at least that's my understanding of why there are no editing tools available. In any case, there are no tools to edit it.
> 
> Strange that your card doesn't idle lower - at stock mine is also 1.18V under load, but idles at 0.96V (or something close to that).


Quote:


> Originally Posted by *the9quad*
> 
> I am betting this is why: "*3x 23" 1080p*"


Quote:


> Originally Posted by *Forceman*
> 
> Yeah, I didn't see that. Sounds like the culprit.


I force flashed another bios on it and now it is idling at 0.9v and loading at 1.18v







My room is a lot cooler now.


----------



## Blue Dragon

Quote:


> Originally Posted by *Forceman*
> 
> You can't.
> 
> VDDC is core voltage, VDDCI is something else, possibly PLL Voltage (since the theory is that's what AUX voltage is in AB, and they are the same voltage).


straight from MSI's 7870 Hawk page-


----------



## HOMECINEMA-PC

Done this vvvv
HOMECINEMA-PC [email protected]@2417 TRI 290 WB @ 1300 / 1400 *32194*

http://www.3dmark.com/3dm11/8351447


----------



## the9quad

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Done this vvvv
> HOMECINEMA-PC [email protected]@2417 TRI 290 WB @ 1300 / 1400 *32194*
> 
> http://www.3dmark.com/3dm11/8351447


NICE JOB!!!!!


----------



## sinnedone

Quote:


> Originally Posted by *Blue Dragon*
> 
> straight from MSI's 7870 Hawk page-


So VDDCI aka "aux voltage" pulls extra from the PCIe connection?

This is new to me, so if anyone has any tips on this please post, i.e. motherboard BIOS settings etc.


----------



## BradleyW

Which CCC should I use for Wolfenstein? 14.4 Beta or 14.4 WHQL?
Does 290X CFX work in this game?

Thank you.


----------



## devilhead

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Done this vvvv
> HOMECINEMA-PC [email protected]@2417 TRI 290 WB @ 1300 / 1400 *32194*
> 
> http://www.3dmark.com/3dm11/8351447


that voltage







so after vdrop 1.5v?


----------



## Gunderman456

Quote:


> Originally Posted by *BradleyW*
> 
> Which CCC should I use for Wolfenstein? 14.4 Beta or 14.4 WHQL?
> Does 290X CFX work in this game?
> 
> Thank you.


I would use 14.4 WHQL since it's not beta, and I'm pretty sure reviews of the game indicated that it will not work with Crossfire; that's why I'm not buying it.


----------



## BradleyW

Quote:


> Originally Posted by *Gunderman456*
> 
> I would use 14.4 WHQL since it's not beta and I'm pretty sure reviews of the game indicated that the game will not work with Crossfire that is why I'm not buying it.


A few websites have suggested using 14.4 WHQL for Crossfire. Maybe they are just talking out of their you know what! I don't mind the lack of CFX since my 290X should be able to play the game without issues.


----------



## Gunderman456

Problem is that the game is capped at 60fps, so even if you had both cards working, you will not get the benefit of more frames.


----------



## bencher

My other R9 290 is on its way









I am having a hard time keeping one cool now; how am I going to keep two cool :/


----------



## pkrexer

So I just got my 290, which I plan to Crossfire with my 290X. I would like to test the 290 before crossfiring it and adding it to my water loop. Instead of completely removing my 290X, would it be OK if I disconnected the power from it and left it in the PCIe slot while I test just the 290?

This is my first time going dual GPU.


----------



## bencher

Quote:


> Originally Posted by *pkrexer*
> 
> So I just got my 290 which I plan to crossfire with my 290x. I would like to test my 290 before crossfiring and adding to my water loop. Instead of completely removing my 290x, would it be ok if disconnected the power from it and leave it the pcie slot while i test just the 290?
> 
> This is my first time going dual GPU.


I think you have to remove it


----------



## Bagmup

Quote:


> Originally Posted by *pkrexer*
> 
> So I just got my 290 which I plan to crossfire with my 290x. I would like to test my 290 before crossfiring and adding to my water loop. Instead of completely removing my 290x, would it be ok if disconnected the power from it and leave it the pcie slot while i test just the 290?
> 
> This is my first time going dual GPU.


Disable crossfire in CCC.


----------



## velocityx

Quote:


> Originally Posted by *pkrexer*
> 
> So I just got my 290 which I plan to crossfire with my 290x. I would like to test my 290 before crossfiring and adding to my water loop. Instead of completely removing my 290x, would it be ok if disconnected the power from it and leave it the pcie slot while i test just the 290?
> 
> This is my first time going dual GPU.


The only time my system allowed me to keep a GPU in without power cables, and not scream about it with beeps, is when the GPU is in the secondary slot. The first one has to have cables or it won't boot, I think.


----------



## BradleyW

Add the second card and plug the screen into the second card. Then, disable CFX in CCC. Done!
No need to mess around with GPU 1.


----------



## Unknownm

Not too much of an issue, but when I turn off the main monitor (1x HDMI, 1x S-DVI) everything's fine; the second screen works great. However, when the main monitor is turned back on, both monitors go into standby mode, then all the windows reset to the main monitor.

I also hear the "do-do" sound of a device being re-added (like when a USB flash drive is plugged into a slot) when this happens. Any way to fix it? My R9 280X never had this issue.


----------



## pkrexer

Cool! Thanks for the help
Quote:


> Originally Posted by *BradleyW*
> 
> Add the second card and plug the screen into the second card. Then, disable CFX in CCC. Done!
> No need to mess around with GPU 1.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Unknownm*
> 
> Not to much of a issue but when I turn off the main monitor (1x HDMI 1x S-DVI) everythings fine, the second screen works great. However when the main monitor is turned back on , both monitors go into stand by mode than all the windows reset to the main monitor.
> 
> I also hear the "do-do" when a device is re-added (like when USB flash drive is plugged into a slot) when this happens. Anyway to fix it? my R9 280x never had this issue


Not sure why it didn't do this before, but welcome to Windows and hardware hot-plug detection at its finest. I hate this with a passion.


----------



## Red1776

Quote:


> Originally Posted by *Gunderman456*
> 
> Problem is that the game is capped at 60fps, so even if you had both cards working, you will not get benefit of more frames.


remove the 60FPS cap


----------



## BradleyW

Quote:


> Originally Posted by *Red1776*
> 
> remove the 60FPS cap


And how would you do that since 60 is hard coded directly into this OpenGL based engine? That's right, you can't.


----------



## Kiros

I managed to grab a 290X Sapphire BF4 edition off of ebay for $275..great price but these cards run very hot!


----------



## EpIcSnIpErZ23

Quote:


> Originally Posted by *Kiros*
> 
> I managed to grab a 290X Sapphire BF4 edition off of ebay for $275..great price but these cards run very hot!


Damn! I picked mine up yesterday for 300 shipped.


----------



## Kiros

300 was what I bid, but luckily eBay doesn't make you pay your full bid. The other person who was bidding just stopped at 274, and that's how I got my price.

I might have to watercool this card unfortunately... the heat-to-noise ratio is unacceptable! Sounds like the card wants to take off!


----------



## BradleyW

Does anyone think we will see Driver support for Watch Dogs?


----------



## Red1776

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> remove the 60FPS cap
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And how would you do that since 60 is hard coded directly into this OpenGL based engine? That's right, you can't.

I figured seeing the smiley you would recognize I was kidding, oh that's right, you can't


----------



## BradleyW

Quote:


> Originally Posted by *Red1776*
> 
> I figured seeing the smiley you would recognize I was kidding, oh that's right, you can't


I hope you can recognise that I can recognise the smiley! That's right, you can't!








I sure hope they remove that cap! Please!


----------



## Red1776

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> I figured seeing the smiley you would recognize I was kidding, oh that's right, you can't
> 
> 
> 
> I hope you can recognise that I can recognise the smiley! That's right, you can't!
> 
> 
> 
> 
> 
> 
> 
> 
> I sure hope they remove that cap! Please!

Actually, there was a guy here at OCN who claimed to know how to remove the frame cap. He posted evidence of it, the lines to edit, and how to do it.

I never tried it because, as much as I like a great frame rate, I wouldn't know game code if it bit me in the butt.

But that's right... I don't


----------



## kckyle

I upgraded to a Xeon just for this damn game; watch, when it comes out an average i5 will run it ultra.


----------



## bencher

Quote:


> Originally Posted by *kckyle*
> 
> i upgraded to xeon just for this damn game, watch when it comes out an average i5 can run it ultra


what game?


----------



## heroxoot

Quote:


> Originally Posted by *Kiros*
> 
> 300 was what I bidded mine on but luckily ebay doesn't make it so You Are Bidding 300. The other person who was bidding just stopped at like 274 and that's how I got my price.
> 
> I might have to watercool this card unfortunately...the heat and noise ratio is unacceptable! Sounds like the card wants to take off!


I wouldn't be shocked if that is why it was for sale. The only reason to buy reference is if you want to watercool. When I was given the choice between the 290X ref and Gaming, I told the MSI rep the Gaming, because the fan is way better. The reference fan always sucks. But hey, 275 bucks? Not bad.


----------



## bencher

Quote:


> Originally Posted by *Kiros*
> 
> 300 was what I bidded mine on but luckily ebay doesn't make it so You Are Bidding 300. The other person who was bidding just stopped at like 274 and that's how I got my price.
> 
> I might have to watercool this card unfortunately...the heat and noise ratio is unacceptable! Sounds like the card wants to take off!


I got my first for $250 and my second for $235


----------



## gerardfraser

Quote:


> Originally Posted by *Red1776*
> 
> actually there was a guy here at OCN that claimed to know how to remove the frame cap. he posted evidence of it and posted the lines to edit and how to do it.
> I never tried it because I like a great frame rate, I wouldn't know game code if it bit me in the butt.
> but that's right...I don't


Could you post a link, or which thread, if you remember? I would like to try what he did.
Thanks


----------



## the9quad

Keep in mind that idtech 5 has the *physics engine HARD CODED to 60 FPS*. So even if you unlock the frames, expect it to be wonky as all get out.


----------



## Red1776

Quote:


> Originally Posted by *the9quad*
> 
> Keep in mind, that idtech 5 has the *physics engine HARD CODED to 60 FPS*. So even if you unlock teh frames expect it be wonky as all get out.


 Yeah I thought it was an interesting idea, but passed on his instruction for that reason. If I don't understand the engine, I don't screw with it.


----------



## maynard14

Hi guys, I have an NZXT G10 and an R9 290... which AIO cooler is best for the R9 290? Kraken X60 or H110?


----------



## bluedevil

Quote:


> Originally Posted by *maynard14*
> 
> hi guys,. i have nzxt g10 and r9 290,.. which aio cooler is the best for r9 290? kraken x60 or h110?


Take your pick. Whatever is cheapest/goes with your setup.


----------



## maynard14

Quote:


> Originally Posted by *bluedevil*
> 
> Take your pick. Whatever is cheapest/goes with your setup.


thank you


----------



## Dasboogieman

Quote:


> Originally Posted by *maynard14*
> 
> hi guys,. i have nzxt g10 and r9 290,.. which aio cooler is the best for r9 290? kraken x60 or h110?


Go for the Kraken; the H110 is identical except it has 4 inches less hose length. That extra length is a godsend when it comes to mounting locations.
I'm using an H90 myself and am seriously regretting not getting the X40.


----------



## maynard14

Thanks sir... how about the Kraken X60 vs the Corsair H105? Because I already have 2 Corsair SP fans here and they won't fit with the X60.


----------



## BradleyW

Hello fellow AMD users,
Has anyone found a way to force 120/144 Hz in Wolfenstein? (Not to be confused with the fps cap)

Thank you.


----------



## bluedevil

Quote:


> Originally Posted by *maynard14*
> 
> thanks sir... how about kraken x60 vs corsair h105. coz i already have 2 corsair sp fans here and they wont fit with x60


If you want my opinion.....

http://www.overclock.net/t/1479969/cm-seidon-120xl-r9-290


----------



## EpIcSnIpErZ23

I have a corsair H50 on my GTX 770, and it keeps my load temps under 45C.


----------



## pkrexer

So I got Crossfire up and running without any issues. I was a bit concerned with having only an 860 watt power supply, but I played an hour of BF4 and ran some benches without a problem. My power supply has a hybrid fan that's supposed to kick on above a certain % of load... As far as I know, it's yet to kick on, which worries me a little.

Also, when I try running BF4 with Mantle, I have a hard time breaking 60fps. Change to DX11 and I'm running 100fps+ (1080p, ultra, 4xAA, 160% res scale). Is Mantle broken with Crossfire? It was working fine with my single 290X.


----------



## Talon720

Quote:


> Originally Posted by *pkrexer*
> 
> So I got crossfire up and running without any issues. I was a bit concerned with having only a 860watt power supply, but played an hour of bf4 and ran some benches without a problem. My power supply has a hybrid fan that's supposed to kick on under a % of load... Far as I know, its yet to kick on which worries me little.
> 
> Also when I try running bf4 with mantle, I have a hard time breaking 60fps. Change to dx11 and I'm running 100fps+ (1080p, ultra, 4xAA, 160%) Is mantle broke with crossfire? It was working fine with my single 290x.


What driver are you using... and did you use DDU to wipe the old driver before installing the new one? Also, did you disable ULPS in AB or by editing the registry? (Both are the same.) When I went Crossfire I think I needed to wipe and reinstall the driver. I hadn't had any Crossfire issues until I went trifire; that's when I started to get weird issues. Also check CCC to make sure Crossfire is enabled; your fps should be higher with Mantle. I'd also check the specs on that PSU for the fan profile. I'm not saying anything's wrong, but my HX1050 comes on at 40% or 60% load and is loud; it sometimes came on at idle until I attached a little 92mm Noctua fan to the grill. I really wanna change the fan, but it's a super-high-CFM Delta fan. It could be your PSU just has a lot of airflow around it to keep it cool.


----------



## gerardfraser

Quote:


> Originally Posted by *pkrexer*
> 
> So I got crossfire up and running without any issues. I was a bit concerned with having only a 860watt power supply, but played an hour of bf4 and ran some benches without a problem. My power supply has a hybrid fan that's supposed to kick on under a % of load... Far as I know, its yet to kick on which worries me little.
> 
> Also when I try running bf4 with mantle, I have a hard time breaking 60fps. Change to dx11 and I'm running 100fps+ (1080p, ultra, 4xAA, 160%) Is mantle broke with crossfire? It was working fine with my single 290x.


Just checked Mantle with R9 290 Crossfire in MP; all seems fine, although the settings you have are quite high on the res scale.
Setting 1920x1080-Ultra 4xMSAA-Res Scale 160%.
Driver 14.4 Whql
3770K CPU
GPU 1000/1300



EDIT:ADDED DX11 Run
Setting 1920x1080-Ultra 4xMSAA-Res Scale 160%.
Driver 14.4 Whql
3770K CPU
GPU 1000/1300


----------



## pkrexer

I did run DDU and installed the 14.4 WHQL drivers. I found someone that suggested the command devicerender.framepacing 1, which seemed to fix the issue... However, fps is still a little sporadic and averages less than using DX11.

Quote:


> Originally Posted by *Talon720*
> 
> What driver are you using... And did you use DDU to wipe the old driver before installing new? Also did you disable upls in AB or by editing registry? Both are the same .Also when i went crossfire i think i needed to wipe and reinstall the driver. I havnt had any crossfire issues up till i went trifire thats when i started to get weird issues. Also check CCC to make sure crossfire is enabled your fps should be higher with mantle. Also id check the specs on that psu for the fan profile. I'm not saying anythings wrong, but my hx1050 comes on at 40% or 60% and is loud sometimes at idle it came on till i attached a little 92mm noctura fan to the grill. I really wanna change the fan, but its a super high cfm delta fan. It could be your psu just has a lot of air flow around it to keep it cool.


----------



## kizwan

Quote:


> Originally Posted by *pkrexer*
> 
> I did run ddu and installed 14.4 whql drivers. I found someone that suggested the command devicerender.framepacing 1 which seemed to fix the issue... However fps is still a little sporadic and averages less then using dx11.


Did you try deleting the Mantle cache? (C:\Users\[user-name]\Documents\Battlefield 4\cache) It's a long shot though.


----------



## heroxoot

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pkrexer*
> 
> I did run ddu and installed 14.4 whql drivers. I found someone that suggested the command devicerender.framepacing 1 which seemed to fix the issue... However fps is still a little sporadic and averages less then using dx11.
> 
> 
> 
> Did you try delete the Mantle cache? (C:\Users\[user-name]\Documents\Battlefield 4\cache) It's a long shot though.

There needs to be some kind of program that does this for you. The Mantle cache doesn't get touched when a driver is uninstalled, and deleting it helps solve major performance issues. Deleting the Mantle cache for BF4 improved my fps by a lot: I went from hardly 80fps to a 90fps average, and 100+ in multiple areas.

The only map I get bad FPS on is Hainan Resort. I get hardly 60fps on that map, but I get 80 - 90 on CQ Large maps fine.
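Since there is no official tool for this, a throwaway script can do the wipe. A minimal sketch in Unix-shell syntax (on Windows you would just delete the folder in Explorer, or `rmdir /s /q` it from a command prompt); the path is the one quoted above, assumed to be under your home/Documents folder:

```shell
#!/bin/sh
# Wipe the BF4 Mantle shader cache so the game rebuilds it on next launch.
# Path assumed from the post above; adjust if your Documents folder is elsewhere.
CACHE="$HOME/Documents/Battlefield 4/cache"

if [ -d "$CACHE" ]; then
    rm -rf "$CACHE"
    echo "Mantle cache removed: $CACHE"
else
    echo "No Mantle cache found at: $CACHE"
fi
```

Run it with the game closed; the first Mantle launch afterwards will be slower while the cache rebuilds.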


----------



## velocityx

Quote:


> Originally Posted by *pkrexer*
> 
> So I got crossfire up and running without any issues. I was a bit concerned with having only a 860watt power supply, but played an hour of bf4 and ran some benches without a problem. My power supply has a hybrid fan that's supposed to kick on under a % of load... Far as I know, its yet to kick on which worries me little.
> 
> Also when I try running bf4 with mantle, I have a hard time breaking 60fps. Change to dx11 and I'm running 100fps+ (1080p, ultra, 4xAA, 160%) Is mantle broke with crossfire? It was working fine with my single 290x.


Go in game and open the console with the tilde (`) key, then type renderdevice.framepacingmethod 1

or create a user.cfg file in the game directory and put it there.

You can also put render.drawscreeninfo 1 and perfoverlay.drawfps 1 in there.

It will sort your fps.
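For reference, a user.cfg carrying those commands is just the three lines above written one per line (the file goes in the game's install directory, next to the game executable):

```
renderdevice.framepacingmethod 1
render.drawscreeninfo 1
perfoverlay.drawfps 1
```

The game reads the file at launch, so you don't have to retype the console commands every session.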


----------



## Talon720

Quote:


> Originally Posted by *pkrexer*
> 
> I did run ddu and installed 14.4 whql drivers. I found someone that suggested the command devicerender.framepacing 1 which seemed to fix the issue... However fps is still a little sporadic and averages less then using dx11.


That sucks. This 14.4 WHQL sucks so far for me; it keeps crashing and causing artifacting in trifire. I hope they fix this soon with a new driver. Lately I've been getting 116 BSODs with this driver. You could always try the press-release 14.4.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Talon720*
> 
> That sucks. This 14,4 whql sucks so far for me it keeps crashing and causing artifacting for me in trifire. I hope they fix this soon with a new driver. Latley ive been getting 116 bsod with this driver you could always try the press release 14.4


Maybe not for everyone using 14.4, but Crossfire plus Eyefinity really is the pits with 14.4. I think this version was for the 295X2 release...


----------



## lazerpickle

Hello, I am trying to remove the stock cooler on my stock XFX R9 290X, and I'm 95% sure I've gotten all the screws out, but the heat sink still feels stuck to the PCB. I am unable to separate it without feeling as if the PCB is going to break. Does anybody know what this problem could be / what I can do to fix it?


----------



## Red1776

Quote:


> Originally Posted by *lazerpickle*
> 
> Hello, I am trying to remove the stock cooler on my stock XFX R9 290x, and I'm 95% sure i've gotten all the screws out, put the heat sink still feels stuck to the PCB. I am unable to separate it without feeling as if the PCB is going to break. Does anybody know what this problem could be / what I can do to fix it?


Probably a missed pair of tiny screws on the sides of the blower cover. If you can't at least twist the pcb then there is a mechanical fastener still on there.


----------



## DeadlyDNA

Yes, sometimes the TIM has hardened and may require a slight twist as you pull them apart. I had to use a tool as a lever between the cooler and PCB to gently separate them, working slowly around the GPU core. The 2 screws I think Red didn't mention are on the metal slot cover near the DVI connectors.


----------



## rdr09

Quote:


> Originally Posted by *lazerpickle*
> 
> Hello, I am trying to remove the stock cooler on my stock XFX R9 290x, and I'm 95% sure i've gotten all the screws out, put the heat sink still feels stuck to the PCB. I am unable to separate it without feeling as if the PCB is going to break. Does anybody know what this problem could be / what I can do to fix it?


like Red said, there might still be some screws still holding it. check the bracket where the ports are (DVI/HDMI).


----------



## Red1776

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Yes sometimes the TIM has hardened and may require a slight twist as you pull apart. I had to use a tool as a lever between the cooler and pcb gently separate slowly around the gpu core. The 2 screws I think red didnt mention are on the metal slot cover near dvi connectors.


Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lazerpickle*
> 
> Hello, I am trying to remove the stock cooler on my stock XFX R9 290x, and I'm 95% sure i've gotten all the screws out, put the heat sink still feels stuck to the PCB. I am unable to separate it without feeling as if the PCB is going to break. Does anybody know what this problem could be / what I can do to fix it?
> 
> 
> 
> like Red said, there might still be some screws still holding it. check the bracket where the ports are (DVI/HDMI).

They put tiny screws along the side of the housing (you need a driver the size of one from a glass-repair kit) and they are black and easy to miss (if they are not the screws on the port end the guys mentioned):

This is a housing/heatsink I removed from an XFX


----------



## motokill36

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Okay guys, I have a BIG problem. I have just got two R9 290s and when I installed the second card the "Crossfire" option was nowhere to find. It did not lie under the "Performance" menu, then I read a little on the internet and uninstalled the drivers (and everything else from AMD), then I installed the 13.12 and same thing happens. The only thing that is under the "Performance" category is "AMD Overdrive" and that's it.
> So I have tried two drivers and none of them are working for me.
> 
> Both cards appear in device manager and they are installed properly.
> 
> Thanks in advance.
> 
> PS: Had the 14.4 installed at first


Have you tried swapping the cards between PCIe slots?
Worked for me.
No idea why it worked, though.


----------



## HOMECINEMA-PC

Deskputer


----------



## Mega Man

Quote:


> Originally Posted by *sinnedone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Blue Dragon*
> 
> straight from MSI's 7870 Hawk page-
> 
> 
> 
> 
> So VDDCI aka "aux voltage" pulls extra from the pci connection?
> 
> This is new to me, so if anyone has any tips on this please post. ie motherboard bios settings etc .

No, your mobo and the PCIe power connectors supply 12V, which is then converted to lower voltages via the VRMs. That allows it to supply more voltage on the low-voltage side, not the high-voltage side.
Quote:


> Originally Posted by *pkrexer*
> 
> So I just got my 290 which I plan to crossfire with my 290x. I would like to test my 290 before crossfiring and adding to my water loop. Instead of completely removing my 290x, would it be ok if disconnected the power from it and leave it the pcie slot while i test just the 290?
> 
> This is my first time going dual GPU.


I did that with my 7970s with no ill effects. I don't do it now, as the 7970s are in my 8350 rig and I am using my 3930K with the 290Xs; the RIVBE has PCIe DIP switches.
Quote:


> Originally Posted by *lazerpickle*
> 
> Hello, I am trying to remove the stock cooler on my stock XFX R9 290x, and I'm 95% sure i've gotten all the screws out, put the heat sink still feels stuck to the PCB. I am unable to separate it without feeling as if the PCB is going to break. Does anybody know what this problem could be / what I can do to fix it?


You can also apply heat to the heatsink to help soften the TIM.

In other news, my 5th 290X is here!


----------



## hwoverclkd

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Deskputer


def looks a lot better than i initially thought


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Deskputer


Is it safe in your neighborhood? You've got QDCs... just stage your rads outside the window.


----------



## alancsalt

He'll just run an air-con unit thru the rads..


----------



## rdr09

Quote:


> Originally Posted by *alancsalt*
> 
> He'll just run an air-con unit thru the rads..


lol. that's how he had it last summer.


----------



## HOMECINEMA-PC

I run A/C even in winter because, well, winter here is not very cold... sometimes.
Plus I have fully ducted A/C in the house and can turn different zones on and off, fed by a Fujitsu 15.5kW indoor/outdoor setup. I can get it below 15C ambient.


----------



## ebhsimon

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Deskputer


This is the only way to do it.


----------



## Dasboogieman

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Deskputer


...its...its...its...alive?


----------



## pdasterly

I have a Sapphire R9 290X with a Kraken G10 bracket and a Corsair H75 AIO cooler.
What settings should I change to take advantage of the liquid cooler?


----------



## Mega Man

For anyone running CFX etc., the 1300W EVGA is at a great price in the Newegg Shell Shocker.

Edit: has anyone seen this? Cheapest I ever saw it!


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Mega Man*
> 
> for anyone running CFX ect the 1300w evga is at a great price from the newegg shell shocker
> 
> edit anyone seen this, cheapest i ever saw it !


That's $120AU cheaper than cheapest priced ref 290x here


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Deskputer


Upgrading to new case/chassis?


----------



## sena

Guys, how does your GPU usage look in CFX? Mine is not very good.

Here is screen.


----------



## Red1776

Quote:


> Originally Posted by *sena*
> 
> Guys how you gpu usage looks in cfx, mine is not very good.
> 
> Here is screen.


That depends on what you were running. Was that a benchmark? Because it's up at 100%, then it looks like it completed the benchmark.


----------



## sena

This was BF4.


----------



## kizwan

Quote:


> Originally Posted by *sena*
> 
> Guys how you gpu usage looks in cfx, mine is not very good.
> 
> Here is screen.


Quote:


> Originally Posted by *sena*
> 
> This was BF4.


Are you experiencing poor performance? If not, GPU usage like that is normal with 290/290X Crossfire. BTW, @1440p?

BF4, 1 hour, 14.4, Mantle, Ultra NO AA, 1080p 200% res scale - GPU usage
_(Note: The two 0% gpu usage drop are because I exit full screen for checking temps)_


----------



## sena

Performance in BF4 was pretty good, over 100 fps most of the time; Mantle API, 1440p, 4xMSAA.

Although, I just benched Heaven 4.0 (1080p, 4xMSAA, ultra, extreme) and it scored less than two R9 290s at 1050/1350, hmmmm.

Also, what temps are you getting? It feels like I am getting too-high temps: 55C at stock, 60C overclocked.

I have one RX360, one EK 240 XT, and a Laing DDC 18W pump.

EDIT: I just tested with 3DMark 11 and it seems low: http://www.3dmark.com/3dm11/8361169


----------



## KnownDragon

Hey guys, I just ordered the XFX R9 290 (R9290AENFC). I ordered it because I believe it can be unlocked to a 290X, and because even if it can't, the price-to-performance was good for me. I know that the cooler on the one I am getting sucks. My question is: which water block from the OP is the best? I don't think I saw any raw data on any of them. I am going to start skimming the posts in this thread to see if I can find the information I've asked about; there are a lot of posts. So when I can meet the requirements of proving my R9 290, I would like to join.


----------



## kizwan

Quote:


> Originally Posted by *sena*
> 
> Performance in bf4 was pretty good, over 100 fps for most of time, mantle api, 1440p, 4xmsaa.
> 
> ALtrough i just benched heaven 4.0, 1080p, 4msaa, ultra, extreme, and it scored less than two r290 at 1050/1350, hmmmm.
> 
> Also what temps you are getting, it feels i am getting too high temps, 55C on stock, 60C overclocked.
> 
> I have one rx360 and one ek240xt, laing ddc 18W pump.
> 
> EDIT; i just tested with 3dmark11
> And it seems low http://www.3dmark.com/3dm11/8361169


You can take a peek at the picture in my previous post. Look in the Max column for temps while playing BF4 (Tri-X 290 at stock clocks: 1000/1300). Ambient (indoor) temp is around 29 - 30C. VRM1 stays between the high 40s and low 50s Celsius, thanks to a Fujipoly Extreme thermal pad.

I have one HL Black Ice GTS 360 (360mm), an XSPC EX240 (240mm) & an HL Black Ice SR-1 120 (120mm). It's pretty hot here: ambient (outside) is 35 - 37 Celsius with high humidity. Even with a 12830 BTU/h A/C, I still feel I need more rads though.

For 3DMark 11, did it run centered or full screen? Also, the score you're comparing to was probably run with tessellation off.
Quote:


> Originally Posted by *KnownDragon*
> 
> Hey guys, I just ordered the XFX R9 290 (R9290AENFC). I ordered it because I believe it can be unlocked to the 290X and because if it doesn't the price to performance was good for me. I know that the cooler on the one I am getting sucks. My question is. Which water block from the op is the best? I don't think I saw any raw data on any of them. I am going to start skimming the posts on the thread to see if I can find any of the information that I have asked. There are a lot of post. So when I can meet the requirements of proving my r9 290 I would like to join.


All water blocks perform close to each other on GPU core cooling. When it comes to VRM1 cooling, the Aquacomputer Kryographics water block + active/passive backplate performs a lot better than the rest. With other water blocks, you'll need a Fujipoly Extreme or Ultra Extreme thermal pad to get proper/excellent VRM1 cooling.

http://www.xtremesystems.org/forums/showthread.php?288109-Stren-s-R9-290-290x-Water-Block-Testing


----------



## sugarhell

Quote:


> Originally Posted by *sena*
> 
> Performance in BF4 was pretty good, over 100 fps most of the time, Mantle API, 1440p, 4xMSAA.
> 
> Although I just benched Heaven 4.0, 1080p, 4xMSAA, ultra, extreme, and it scored less than two R9 290s at 1050/1350, hmmm.
> 
> Also, what temps are you getting? It feels like I am getting too high temps: 55C stock, 60C overclocked.
> 
> I have one RX360 and one EK 240 XT, Laing DDC 18W pump.
> 
> EDIT: I just tested with 3DMark 11
> And it seems low http://www.3dmark.com/3dm11/8361169


Change the frame pacing method from 2 to 1 for BF4. You will get more fps.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Deskputer











Quote:


> Originally Posted by *acupalypse*
> 
> def looks a lot better than i initially thought
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *ebhsimon*
> 
> This is the only way to do it.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> ...its...its...its...alive?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Upgrading to new case/chassis?
> 
> 
> 
> 
> 
> 
> 
> 

Thanks guys, it's my first attempt at a Deskputer and it turned out OK


----------



## sena

Quote:


> Originally Posted by *kizwan*
> 
> You can take a peek in the picture in my previous post. Look in Max column for temps while playing BF4 (Tri-X 290 stock clock: 1000/1300). Ambient (indoor) temp is around 29 - 30C. VRM1 between high 40s to low 50s Celsius, thanks to Fujipoly Extreme thermal pad.
> 
> I have one HL Black Ice GTS 360 (360mm), XSPC EX240 (240mm) & HL Black Ice SR-1 120 (120mm). It's pretty hot here. Ambient (outside) 35 - 37 Celsius with high humidity. Even with 12830 BTU/h A/C, I still feels I need more rads though.
> 
> For 3dmark 11, did it run centered or full screen? Also the score you're comparing too probably with tessellation off.
> All water blocks perform close to each other (GPU core cooling). When it come to VRM1 cooling, the Aquacomputer Kryographics water block + active/passive backplate performs a lot better than the rest. With other water blocks, you'll need Fujipoly Extreme or Ultra Extreme thermal pad to get proper/excellent VRM1 cooling.
> 
> http://www.xtremesystems.org/forums/showthread.php?288109-Stren-s-R9-290-290x-Water-Block-Testing



I can't see it in the picture, it looks like 50C, but I am not sure. Do you have a bigger picture?


----------



## kizwan

Quote:


> Originally Posted by *sena*
> 
> I cant see in the picture, it looks like 50C, but i am not sure, do you have bigger picture?


Right-click on the picture & open it in a new tab.


----------



## KnownDragon

Quote:


> Originally Posted by *kizwan*
> 
> You can take a peek in the picture in my previous post. Look in Max column for temps while playing BF4 (Tri-X 290 stock clock: 1000/1300). Ambient (indoor) temp is around 29 - 30C. VRM1 between high 40s to low 50s Celsius, thanks to Fujipoly Extreme thermal pad.
> 
> I have one HL Black Ice GTS 360 (360mm), XSPC EX240 (240mm) & HL Black Ice SR-1 120 (120mm). It's pretty hot here. Ambient (outside) 35 - 37 Celsius with high humidity. Even with 12830 BTU/h A/C, I still feels I need more rads though.
> 
> For 3dmark 11, did it run centered or full screen? Also the score you're comparing too probably with tessellation off.
> All water blocks perform close to each other (GPU core cooling). When it come to VRM1 cooling, the Aquacomputer Kryographics water block + active/passive backplate performs a lot better than the rest. With other water blocks, you'll need Fujipoly Extreme or Ultra Extreme thermal pad to get proper/excellent VRM1 cooling.
> 
> http://www.xtremesystems.org/forums/showthread.php?288109-Stren-s-R9-290-290x-Water-Block-Testing


Thank you for your quick response. I believe I will be going the Kryographics route. This will be my first time putting a GPU under water. Is Arctic Silver 5 okay for the TIM?


----------



## sena

Quote:


> Originally Posted by *kizwan*
> 
> Right-click on the picture & open it in new tab.


Thanks man. My max after 30 minutes of Heaven is 64C; that 5C difference is from the one extra radiator you have compared to me.


----------



## dspacek

Hello, guys,
I have had my R9 290 for 6 months without problems. Now I decided to reinstall my Win7 as usual from my USB drive. Since then, whenever I install Catalyst 13.12 or the latest, my graphics card gets a BLACK SCREEN after the Windows loading screen. Is there any way to solve this? I use the ASUS BIOS on a Sapphire card.
The only way to get a working PC is to remove the display driver in Windows safe mode and then run without a display driver. Every attempt at installing Catalyst leads to a black-screen crash at loading.









I'd very much appreciate any of your advice.


----------



## Mega Man

Quote:


> Originally Posted by *KnownDragon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Thank you for your quick response. I believe I will be going the Kryographics route. This will be my first time putting a gpu on Water. Is artic silver 5 okay for the tim?
> 
> 
> 
> It can be, but it is capacitive, so if you get it on the resistors around the die you can damage the card. Most recommend using non-conductive/non-capacitive TIMs your first time(s).


----------



## kizwan

Quote:


> Originally Posted by *KnownDragon*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> You can take a peek in the picture in my previous post. Look in Max column for temps while playing BF4 (Tri-X 290 stock clock: 1000/1300). Ambient (indoor) temp is around 29 - 30C. VRM1 between high 40s to low 50s Celsius, thanks to Fujipoly Extreme thermal pad.
> 
> I have one HL Black Ice GTS 360 (360mm), XSPC EX240 (240mm) & HL Black Ice SR-1 120 (120mm). It's pretty hot here. Ambient (outside) 35 - 37 Celsius with high humidity. Even with 12830 BTU/h A/C, I still feels I need more rads though.
> 
> For 3dmark 11, did it run centered or full screen? Also the score you're comparing too probably with tessellation off.
> All water blocks perform close to each other (GPU core cooling). When it come to VRM1 cooling, the Aquacomputer Kryographics water block + active/passive backplate performs a lot better than the rest. With other water blocks, you'll need Fujipoly Extreme or Ultra Extreme thermal pad to get proper/excellent VRM1 cooling.
> 
> http://www.xtremesystems.org/forums/showthread.php?288109-Stren-s-R9-290-290x-Water-Block-Testing
> 
> 
> 
> 
> 
> 
> Thank you for your quick response. I believe I will be going the Kryographics route. This will be my first time putting a gpu on Water. Is artic silver 5 okay for the tim?

AS5 is fine, but remember it's slightly capacitive. Make sure it doesn't spill off the GPU core/chip.

*[EDIT]* Ninja'd! I took too long to compose a reply.








Quote:


> Originally Posted by *sena*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Right-click on the picture & open it in new tab.
> 
> 
> 
> Thx man, mine max after 30 minutes of heaven is 64C, that 5C difference is in one radiator that you have compared to me.

That is correct if your ambient (indoor) temp is 29C - 30C. What is your ambient (indoor) temp?


----------



## sena

Quote:


> Originally Posted by *kizwan*
> 
> AS5 is fine but remember it's slightly capacitive. Make sure it's not spilled from the GPU core/chip.
> That is correct if your ambient (indoor) temp is 29C - 30C. What is your ambient (indoor) temp?


It's about 26C.

Hmm, now I think they run hotter than they should. What paste are you using?

Also bear in mind that I am running +100mV and 1125/1425 clocks.


----------



## aaroc

What benchmarks do you recommend for a before (air)/after (water) comparison? What tools do you use to save the temps, volts, etc... to compare later? Thanks for your help.


----------



## kizwan

Quote:


> Originally Posted by *sena*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> AS5 is fine but remember it's slightly capacitive. Make sure it's not spilled from the GPU core/chip.
> That is correct if your ambient (indoor) temp is 29C - 30C. What is your ambient (indoor) temp?
> 
> 
> 
> Its about 26C.
> 
> Hmmm, now i think they run hotter then they should, what paste you are using?

I'd say yours just run *slightly* hotter than mine. You need more rad(s), I think. BTW, I use Shin-Etsu G751 TIM. It has never failed me since the first-gen Intel Core CPUs.


----------



## sena

Quote:


> Originally Posted by *kizwan*
> 
> AS5 is fine but remember it's slightly capacitive. Make sure it's not spilled from the GPU core/chip.
> 
> *[EDIT]* Ninja'd! I took too long to compose a reply.
> 
> 
> 
> 
> 
> 
> 
> 
> That is correct if your ambient (indoor) temp is 29C - 30C. What is your ambient (indoor) temp?


Also bear in mind I am running +100mV and 1125MHz clocks
Quote:


> Originally Posted by *kizwan*
> 
> I'd say yours just *slightly* run hotter than mine. You need more rad(s) I think. BTW, I use Shin-Etsu G751 TIM. It never fail me since first gen Intel Core CPU's.


What voltage are you running?

I am running +100mV.


----------



## dspacek

Please read my question (post 23 305). I'm really in trouble. Help please


----------



## rdr09

Quote:


> Originally Posted by *aaroc*
> 
> What benchmarks do you recommend for a before (air)/after (water) comparison? What tools do you use to save the temps, volts, etc... to compare later? Thanks for your help.


I would recommend the longer benchmarking tools like Valley and Heaven, but you might as well do all the Futuremark ones like 3DMark 11 and 3DMark 13 (Firestrike). For Firestrike I recommend the free version, since you are after differences in temps.

Just use the Windows Snipping Tool to take a screenshot of the results. I use GPU-Z (2 windows) to show the core and VRM temps. HWiNFO64 is also a good tool for showing temps.
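If you prefer raw numbers over screenshots, HWiNFO64 can also log its sensors to a CSV file while a benchmark runs, and a small script can then pull out the maxima for an air-vs-water comparison. A minimal sketch, assuming hypothetical file names and column headers (adjust the column strings to whatever your HWiNFO log actually contains):

```python
import csv

def max_temps(path, columns=("GPU Temperature [°C]", "GPU VRM1 Temperature [°C]")):
    """Return the maximum value seen for each temperature column in a CSV sensor log."""
    maxima = {c: float("-inf") for c in columns}
    with open(path, newline="", encoding="utf-8", errors="ignore") as f:
        for row in csv.DictReader(f):
            for c in columns:
                try:
                    maxima[c] = max(maxima[c], float(row[c]))
                except (KeyError, TypeError, ValueError):
                    pass  # column missing or non-numeric cell; skip it
    return maxima

# Hypothetical log names: one run on air, one on water
# air = max_temps("valley_air.csv")
# water = max_temps("valley_water.csv")
# for col in air:
#     print(f"{col}: {air[col]:.0f}C -> {water[col]:.0f}C")
```

Run the same benchmark for the same duration in both logs, or the maxima won't be comparable.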

Quote:


> Originally Posted by *dspacek*
> 
> Pls read my 23 305 question. Im really in trouble. Help pls


Try the other BIOS (switch 2) for installing the driver. I would also clear the CMOS before doing anything: turn off the computer and switch to BIOS no. 2. Stick with 13.12 for this purpose.


----------



## Loktar Ogar

Hi Guys,

I just wanted to ask if somebody has tried and posted temperature results using the Fujipoly Extreme thermal pad on an R9 290 Tri-X? Was there any significant gain in lowering VRM1 temps?

Can you direct me to the right post if there was one please. Thanks.


----------



## Yvese

So I finally got my R9 290. Anyone else have issues with the GPU and memory clocks fluctuating between 300/150 and 4xx/stock mem while watching Flash videos (mainly Twitch)? I have hardware acceleration disabled, so this is pretty odd.

Didn't have this problem on my 7950's. Using latest 14.4 WHQL.

Problem goes away if I exit any flash videos or go to another tab.


----------



## smarteck

///

*=)* Hello everyone,

I have always been using Nvidia cards, and now I need to renew my GPU; looking at price and performance, the AMD 290X is the best option.

But I'm undecided about switching from Nvidia to AMD, worried about having some sort of problem or losing some features, such as:

*- Can I use/force SGSSAA on DX9 games with AMD?
- Can I leave my old nvidia card just for Physx?
- Is it possible to use HBAO+ with AMD?
- Are AMD drivers good?
- Someone told me that the 290X had temperature problems... is that solved with a 3-fan model?

- And most importantly, the 290x is a good choice to play 4K smoothly?*

Please, you guys already have this card, tell me... *290x is it a good option?*

*Thanks for ANY advice. ; )*

//////


----------



## heroxoot

Quote:


> Originally Posted by *smarteck*
> 
> ///
> 
> *=)* Hello everyone,
> 
> I have always been using Nvidia cards, and now I need to renew my GPU and I watched it for the price and performance of AMD 290x is the best option.
> 
> But I'm undecided about switch from Nvidia to AMD, thinking of having some sort of problem or lose some features such as:
> 
> *- Can I use/force SGSSAA on DX9 games with AMD?
> - Can I leave my old nvidia card just for Physx?
> - Is it possible to use HBAO+ with AMD?
> - AMD Drivers are good?
> - Someone told me that the 290x had temperature problems... does that is solved with a 3 fans model?
> 
> - And most importantly, the 290x is a good choice to play 4K smoothly?*
> 
> Please, you guys already have this card, tell me... *290x is it a good option?*
> 
> *Thanks for ANY advise. ; )*
> 
> //////


- Don't know
- Yes, there is a mod to make an Nvidia card run alongside an AMD card for PhysX
- Yes
- Depends on the person. Some people have no problems and others do. I personally have 1 slight problem that does not hinder performance and is specific to me only.
- My 290X is an MSI Gaming and it does not have temp issues with my OC; it maxes out around 85C. Don't buy a reference-fan model and you should be OK. They are rated for around 95C max temps during usage.
- You can find 4K benchmarks around here. I've seen them do pretty well in comparison to Nvidia cards.


----------



## Tobiman

Quote:


> Originally Posted by *dspacek*
> 
> Hello, guys,
> I have R9-290 for 6 months without problem. Now I decided to reinstall my Win7 as usual from my USB drive. So since that I install catalys 13.12 or the latest my graphic card gets BLACK SCREEN after Windows loading screeen. Is any way how to solve this? I use ASUS BIOS on saphire.
> Only way how to get into working PC is in Windows save mode remove dispaly driver and then run without display driver. Every try for instalation of catalyst leads to black screeen loading crash
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Very appreciate every of your advice.


Yeah, try using the second BIOS. If it doesn't work, then RMA.


----------



## Roboyto

Quote:



> Originally Posted by *smarteck*
> 
> ///
> 
> *=)* Hello everyone,
> 
> I have always been using Nvidia cards, and now I need to renew my GPU and I watched it for the price and performance, AMD 290x is the best option.
> 
> But I'm undecided about switch from Nvidia to AMD, thinking of having some sort of problem or lose some features such as:
> 
> *- Can I use/force SGSSAA on DX9 games with AMD?
> - Can I leave my old nvidia card just for Physx?
> - Is it possible to use HBAO+ with AMD?
> - AMD Drivers are good?
> - Someone told me that the 290x had temperature problems... does that is solved with a 3 fans model?
> 
> - And most importantly, the 290x is a good choice to play 4K smoothly?*
> 
> Please, you guys already have this card, tell me... *290x is it a good option?*
> 
> *Thanks for ANY advise. ; )*
> 
> //////


SGSSAA - Not sure

Nvidia Card for PhysX - This is possible by running Hybrid PhysX. It tricks the Nvidia drivers/software into thinking the Nvidia card is your primary. I have been down this road and it can really be a pain to get working properly; most games need some small tweaks. Honestly not worth it IMHO.

HBAO - not sure; quick google search says yes?

The best drivers to date are still 13.12. If you are running Windows 8/8.1, then 14.4 seems to be working better there.

The non-reference models do a good job of cooling these cards, depending on how far you want to push them. Before purchasing one of these you must tell yourself that these cards are perfectly capable of running at higher temperatures than what you're used to seeing. AMD stands behind the fact that these cards can run at 90C all day, every day, without problems. That being said, you will more than likely hit a temperature wall on VRM1 before the core, depending on which card you choose. More information in the link below.

The extra $ on 290X may or may not be worth it. If you get a mediocre 290X that doesn't clock well, then a 290 can surpass it.

A single 290(X) will make playable frame rates for 4K depending on the game and settings applied. Don't expect 60fps with a single 290(X) on the newest games with every setting maxed; compromises will have to be made. As @heroxoot said you should check the benches that people have posted here, or check the hardware review sites for 4K results. If you want a truly flawless experience for 4K then your best bet is probably going to be 2 cards.

This will help answer other questions you have regarding these cards: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *dspacek*
> 
> Hello, guys,
> I have R9-290 for 6 months without problem. Now I decided to reinstall my Win7 as usual from my USB drive. So since that I install catalyst 13.12 or the latest my graphic card gets BLACK SCREEN after Windows loading screen. Is any way how to solve this? I use ASUS BIOS on sapphire.
> Only way how to get into working PC is in Windows save mode remove display driver and then run without display driver. Every try for installation of catalyst leads to black screen loading crash
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Very appreciate every of your advice.


Try flashing the PT1T BIOS; it tricks mobos into stopping the black screen.
http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/0_20


Quote:


> Originally Posted by *aaroc*
> 
> What benchmarks do you recommend for a before (air)/after (water) comparison? What tools do you use to save the temps, volts, etc... to compare later? Thanks for your help.


Nice bench man


----------



## motokill36

Is there a glitch in Afterburner?
It says I'm using over 4GB of VRAM in BF4


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *motokill36*
> 
> Is there a Glitch in after burner .
> Says im using over 4Gb of Vram in BF4


I would say yes mate









That's on one card though?


----------



## caenlen

Is that the new retail version of Afterburner that just came out?


----------



## sugarhell

Quote:


> Originally Posted by *motokill36*
> 
> Is there a Glitch in after burner .
> Says im using over 4Gb of Vram in BF4


GPU-Z and MSI Afterburner report CrossFire setups as 8GB. If it shows 4GB of usage, then the real usage is around 2GB.
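In other words (assuming the tools really are summing the mirrored per-GPU allocations, as described above), the real per-card figure is just the reported number divided by the GPU count. A trivial sketch of that correction; the function name is purely illustrative:

```python
def real_vram_usage(reported_mb: float, num_gpus: int) -> float:
    """CrossFire mirrors the same data on each card, so some monitoring tools
    report the total across all GPUs; divide by the GPU count for per-card usage."""
    if num_gpus < 1:
        raise ValueError("need at least one GPU")
    return reported_mb / num_gpus

# A reported 4096 MB on a 2-card CrossFire setup is really ~2048 MB per card
```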


----------



## motokill36

Quote:


> Originally Posted by *sugarhell*
> 
> Gpu-z and msi reports crossfire setups as 8gb. If you use 4gb then the real usage is around 2gb


Ooh OK, that makes sense.

Yes, it's in CrossFire.

Thanks
All


----------



## kizwan

Quote:


> Originally Posted by *sena*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> AS5 is fine but remember it's slightly capacitive. Make sure it's not spilled from the GPU core/chip.
> 
> *[EDIT]* Ninja'd! I took too long to compose a reply.
> 
> 
> 
> 
> 
> 
> 
> 
> That is correct if your ambient (indoor) temp is 29C - 30C. What is your ambient (indoor) temp?
> 
> 
> 
> Also bear in mind i am running +100mV and 1125 MHz clocks
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I'd say yours just *slightly* run hotter than mine. You need more rad(s) I think. BTW, I use Shin-Etsu G751 TIM. It never fail me since first gen Intel Core CPU's.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What voltage you are running?
> 
> I am running +100mV.

I don't have Heaven data, but I do have Valley & 3DMark 11 data.

*Valley 1150-1600 +100mV ambient 27.3C*



*3dmark 11 1175-1400 +200mV ambient 27.6C*


----------



## chronicfx

Quoted the reply..


----------



## Mega Man

Not bad... 100% stock GPUs, except for 100% fan. Love ref cards

http://www.3dmark.com/3dm/3132310


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Mega Man*
> 
> not bad... 100% stock gpus except 100% fan love ref
> 
> http://www.3dmark.com/3dm/3132310


3 290's








http://www.3dmark.com/3dm/3102791


----------



## Germanian

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 3 290's
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/3102791


#REKT, nice overclock on 3 R9 290's


----------



## phallacy

My result with 4 290x overclocked with a not so great 4770k; can only reach 4.4/4.5 stable.

http://www.3dmark.com/fs/2112220


----------



## igrease

Would running CrossFire 290s at x16 & x4 cause any real big issues? I've read that for AMD cards the x4 slot doesn't really decrease performance.


----------



## Mega Man

It does, but it is not horrible.


----------



## pdasterly

Found some good stuff for the 290X; don't know if it's been posted in here already (2334 pages is a lot).
I did this mod and gained numbers in 3DMark in every test.
My card is a Sapphire reference 290X.
http://forums.overclockers.co.uk/showthread.php?t=18552408


----------



## RooTxBeeR

Been trying to find an answer to this: how hot is too hot on my R9 290 before I should turn games off?


----------



## Sgt Bilko

Quote:


> Originally Posted by *RooTxBeeR*
> 
> Been trying to find a answer to this. How hot is too hot on my R9 290 before I should turn games off???


There is no "too hot"; it will hit 95C and then downclock itself.

If you are comfortable with those temps then by all means just keep it running.


----------



## RooTxBeeR

I am fine with that; I have gone over that temp before. That's why I am asking what is too hot.


----------



## Sgt Bilko

Quote:


> Originally Posted by *RooTxBeeR*
> 
> I am fine, I have gone over that temp. That's why I am asking, what is too hot.


What R9 290's are you using?


----------



## RooTxBeeR

Quote:


> Originally Posted by *Sgt Bilko*
> 
> What R9 290's are you using?


2 Gigabyte windforces


----------



## Sgt Bilko

Quote:


> Originally Posted by *RooTxBeeR*
> 
> 2 Gigabyte windforces


Are they still using the original BIOS?

The Windforce 3 is a very effective cooler and you shouldn't be getting even close to those temps really.

Have you set a custom fan curve and made sure you have adequate spacing between the cards?


----------



## KyadCK

Quote:


> Originally Posted by *RooTxBeeR*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> What R9 290's are you using?
> 
> 
> 
> 2 Gigabyte windforces

Looked in your album.

This is bad:









Your Maximus has 4 red x16 lanes. Put the cards in the 1st and 3rd slots, and make sure the BIOS is in the "performance" position (switch toward the display outs).


----------



## the9quad

Quote:


> Originally Posted by *KyadCK*
> 
> Looked in your album.
> 
> This is bad:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Your Maximus has 4 red x16 lanes. Put the cards in the 1st and 3rd slots. and make sure the BIOS is in the "performance" position (switch toward the display outs).


4 x16 lanes?????? I didn't think anything had 64 lanes, but maybe some Xeon boards.


----------



## sena

Quote:


> Originally Posted by *kizwan*
> 
> I don't have Heaven data but I do have Valley & 3dmark 11 data.
> 
> *Valley 1150-1600 +100mV ambient 27.3C*
> 
> 
> 
> *3dmark 11 1175-1400 +200mV ambient 27.6C*


Thx mate for info.

I have one question: how are your cards colder at +200mV compared to +100mV?


----------



## Sgt Bilko

Quote:


> Originally Posted by *the9quad*
> 
> 4 x16 lanes??????I didn't think anything had 64 lanes but maybe some xeon boards.


Would have 4 x16 capable slots, not 64 lanes


----------



## the9quad

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Would have 4 x16 capable slots, not 64 lanes


Ah gotcha. Looking at the pic, it looks like only the top two will do 16 lanes, and only one card will ever do 16 in SLI/CFX. So the best you can hope for is 16/8/8?


----------



## Sgt Bilko

Quote:


> Originally Posted by *the9quad*
> 
> ah gotcha, looking at the pick it looks like only the top two will do 16 lanes, and only one card will ever do 16 in sli/cfx. So the best you can hope for is 16,8,8?


Yep, which would be perfectly fine for quadfire, seeing how x8 easily provides enough bandwidth.


----------



## the9quad

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yep, which would be perfectly fine for quadfire seeing how x8 provides enough bandwidth easy


Maybe today, but what about his grandchildren? j/k, yeah I know, I was just confused; when he said 4 x16 I was like whoa, I've got to see this!


----------



## kizwan

Quote:


> Originally Posted by *sena*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I don't have Heaven data but I do have Valley & 3dmark 11 data.
> 
> *Valley 1150-1600 +100mV ambient 27.3C*
> 
> 
> 
> *3dmark 11 1175-1400 +200mV ambient 27.6C*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thx mate for info.
> 
> I have one question, how are you cards colder on 200mV compared to 100mV?

That is because Valley is *heavier* on the GPU than 3DMark 11. When running Valley the GPU is consistently under load the whole time, but in 3DMark 11 the GPU has enough time to cool down between scenes (the scenes are too short to fully heat it up).
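That duty-cycle effect can be illustrated with a crude first-order thermal model: a GPU loaded only part of the time settles at a lower peak temperature than one loaded continuously. This is purely an illustrative sketch with made-up constants, not measured data:

```python
def simulate_temp(duty_cycle, ambient=27.0, heat_rise=40.0, tau=60.0,
                  period=30.0, steps=10000, dt=0.1):
    """First-order (RC-style) thermal model: temperature relaxes toward
    ambient + heat_rise while loaded, and back toward ambient while idle.
    duty_cycle is the fraction of each period the GPU spends under load."""
    temp = ambient
    peak = ambient
    for i in range(steps):
        t = (i * dt) % period
        target = ambient + heat_rise if t < duty_cycle * period else ambient
        temp += (target - temp) * dt / tau  # exponential approach to target
        peak = max(peak, temp)
    return peak

# A constant-load run (Valley-style) peaks higher than a bursty run
# (short 3DMark scenes with idle gaps between them):
# simulate_temp(1.0) > simulate_temp(0.5)
```

The absolute numbers are meaningless; only the qualitative point matters: with the same component and coolant, intermittent load yields a lower peak.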


----------



## Tokkan

Quote:


> Originally Posted by *the9quad*
> 
> maybe today, but what about his grandchildren? j/k yeah I know i was just confused, when he said 4x16 I was like whoa i got to see this!


I'm gonna answer your question with another question.
How many X58 users are running quad/tri cards (290/290X/780/780 Ti/Titan)? Think about it.


----------



## sena

Quote:


> Originally Posted by *kizwan*
> 
> That is because Valley is *heavy* on the GPU than 3dmark 11. When running Valley, GPU consistently under load all the time but in 3dmark 11, GPU have enough time to cool down between scenes (scenes too short, hence not enough time to heat up).


Thanks man for all the help and info.

+1 rep.


----------



## dspacek

I tried many BIOSes, including the original and ASUS etc., but nothing changed. RMA? I have the card from my friend, without purchase documents. *Is this card at least usable for mining?* If anyone wants it, I'll sell it for 300 dollars. ;-)


----------



## dspacek

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Try flashing the PT1T bios it tricks mobos to stop black screen
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/0_20
> 
> Nice bench man


Thanks,
but I tried this PT1T BIOS and the problem is still present. I'm no rookie at OCing, but this problem is very strange to me. I had hoped my Gigabyte X79 wasn't as bad a mobo as it is.
Or did I get a defective card? Really RMA, no other solution?

- I tried clearing the CMOS, running the system, and then reinstalling the system and drivers
- Reinstalled the system and drivers with a CMOS clear for every slot of the mobo
- Reinstalled the system and drivers with a different VGA ROM
- Used drivers 13.12 first, then tried 14.4

- Everything began with the first installation of Catalyst 14.4; since then I have been in trouble.


----------



## piquadrat

Hello guys.
My little project named "R9 on the water" is heading toward a happy ending, but I have one question for the Fujipoly "experts".
Looking at the spec of the Fujipoly XR-j (14 W/mK) series, one can see that the 0.5mm pad thickness (generally the whole range below 1mm) is reserved for the "hardened top surface" variety. What does that really mean? Does it mean the protective foil is hardened on one of the surfaces, or that the compound itself is anisotropic? Is the hardened-top-surface variety as good as the other one for the VRM1 application?

And what do you think about the XR-e series, rated 11 W/mK? Cheaper, but worth the attention? Good between VRM1 and the heatsink?


----------



## RooTxBeeR

Yeah, I am still running the stock BIOS; I have yet to change anything software-wise. And whoever mentioned making sure it's in performance mode, where is that exactly? I had asked in another topic if I could move my card down, but no one ever answered. Thanks guys, I'll move the second card down a slot.


----------



## Jeffro422

Just bought a Sapphire reference 290x. Performance, temperature and noise are off the charts. Hit 90c on 3dmark 11 stock clocks, but those scores! For the price paid this card is a nice upgrade to the 7950.


----------



## pdasterly

Quote:


> Originally Posted by *RooTxBeeR*
> 
> Ya I am still running stock bios. I have yet to change anything software wise. And who ever said about making sure it's in performance mode where is that exactly? I had asked in another topic of I could move my car down but no one ever answered. Thanks guys, I'll move the second card down a slot.


Uber mode; there's a small DIP switch on the board, flip it away from the monitor connectors.


----------



## pdasterly

Quote:


> Originally Posted by *Jeffro422*
> 
> Just bought a Sapphire reference 290x. Performance, temperature and noise are off the charts. Hit 90c on 3dmark 11 stock clocks, but those scores! For the price paid this card is a nice upgrade to the 7950.


I have the Kraken G10 with a Corsair H75; the GPU temp never goes over 57C. I had to unplug the fans on the cooler to let the temp rise to 90C, and as soon as the fans came back on the temp dropped back to 57C. VRM1 got up to 60C, and 70C for VRM2.


----------



## Jeffro422

Quote:


> Originally Posted by *pdasterly*
> 
> I have the kraken g10 with corsair h75, gpu temp never goes over 57c. I had to unplug fans on cooler to let temp rise to 90c, soon as fans came back on temp dropped back to 57c. vrm got up to 60c and 70c for vrm2


Wow. Didn't know NZXT made the G10. Once upon a time I was looking at the dwood bracket, but my Windforce 7950 ran cool even at a 1300 core clock. Now that I have this I think the G10 is more of a necessity. I have an H50 sitting on the shelf; going to order the G10. Did you use heatsinks on the VRM components?


----------



## pdasterly

Quote:


> Originally Posted by *Jeffro422*
> 
> Wow. Didn't know NZXT made the G10. Once upon a time I was looking at the dwood bracket but my Windforce 7950 ran cool even at a 1300 core clock. Now that I have this I think the G10 is more of a necessity. I have an H50 sitting on the shelf. Going to order the G10. Did you use heatsinks on the VRM components?


----------



## RooTxBeeR

Quote:


> Originally Posted by *pdasterly*
> 
> uber mode, there's a small dip switch on board, flip it away from monitor connectors


Thanks, when I get back from the gym right now I am going to drop my first card down a slot and make sure what you speak of is enabled.


----------



## Jeffro422

Quote:


> Originally Posted by *pdasterly*


Those are the Kootek heatsinks? http://www.amazon.com/dp/B00CJRZP9I/ref=cm_sw_su_dp


----------



## pdasterly

Ebay copper and gelid kit


----------



## sena

Quote:


> Originally Posted by *pdasterly*


Clocks?


----------



## bencher

How do I enable crossfire?

I got my second R9 290.


----------



## pdasterly

Quote:


> Originally Posted by *sena*
> 
> Clocks?


1000MHz stock; I turned it up to 1100MHz with no problems. Turned it up to 1300MHz and started having problems: no heat problems, but stability issues. VRMs warmed up but didn't overheat. My AIO cooler started putting out some real heat; the office got hot from the GPU air.


----------



## cennis

Quote:


> Originally Posted by *pdasterly*
> 
> ,
> 1000mhz, I turned up to 1100mhz with no problems. Turned up to 1300mhz and started having problems. No heat problems but stability issues. Vrms warmed up but didnt overheat. My aio cooler started putting out some real heat. Office got hot from gpu air


Do you mean 1200? What volts are you running?
1300 requires significant overvolting.


----------



## pdasterly

Quote:


> Originally Posted by *cennis*
> 
> Do you mean 1200? what volts are you running?
> 1300 requires significant overvolting


I just set it to the max, 14; the Asus BIOS allows for volts to be adjusted.


----------



## pdasterly

Best mobo for 290x crossfire? Leaning towards a 4770k CPU and a fancy board.


----------



## DeadlyDNA

Quote:


> Originally Posted by *the9quad*
> 
> 4 x16 lanes??????I didn't think anything had 64 lanes but maybe some xeon boards.


http://www.newegg.com/Product/Product.aspx?Item=N82E16813131971

ASUS P9X79-E WS LGA 2011 Intel X79 SATA 6Gb/s USB 3.0 SSI CEB Intel Motherboard

I was tempted to get this board.


----------



## Kainn

Just picked up the XFX Double D fan model on Newegg for $330 after coupons, rebates, and discounts. Replacing my 670 FTW, and once I sell my waterblock for it, it'll only cost around $50 to upgrade.


----------



## RooTxBeeR

Quote:


> Originally Posted by *DeadlyDNA*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16813131971
> 
> ASUS P9X79-E WS LGA 2011 Intel X79 SATA 6Gb/s USB 3.0 SSI CEB Intel Motherboard
> 
> I was tempted to get this board.


o.o oh wowzors


----------



## Yvese

K so I just did some light OCing with my new 290. Got it to 1100/1400 on stock voltages and 50% power limit. Did a few rounds of BF4 1k ticket 64p locker and no crashes or artifacts. However, I did notice that my GPU fluctuates from 1100 down to 950 then back up all in a span of 2 seconds.

After that, it stays at 1100 for the next 30 or so seconds. My 7950's didn't have boost so not sure how this works. I assume giving it some voltage would fix it? Temps averaged 72C on both core and mem.


----------



## vuldin

Here's my entry into the owners club:

1. http://www.techpowerup.com/gpuz/g6gzn/
2. Asus DirectCU II OC R9290X
3. stock cooling



I've been slow to post this as I was hoping to spend lots of time seeing how far I can overclock and still be stable before posting pics of GPU-Z. Instead I've spent lots of time playing games







. The only overclocking I've done to the GPU is increased the clock speed to 1100MHz (from the default 1050MHz for this card). I've also increased the voltage to 1300mV, but I'm not yet sure if this was needed. When I increased the GPU memory clock speed I ended up getting lower 3dmark Firestrike scores, so I need to do more testing to see what the best settings are.


----------



## Roboyto

Quote:


> Originally Posted by *Yvese*
> 
> K so I just did some light OCing with my new 290. Got it to 1100/1400 on stock voltages and 50% power limit. Did a few rounds of BF4 1k ticket 64p locker and no crashes or artifacts. However, I did notice that my GPU fluctuates from 1100 down to 950 then back up all in a span of 2 seconds.
> 
> After that, it stays at 1100 for the next 30 or so seconds. My 7950's didn't have boost so not sure how this works. I assume giving it some voltage would fix it? Temps averaged 72C on both core and mem.


Have ULPS disabled and force constant voltage checked in your utility? This may solve your issue.


----------



## Yvese

Quote:


> Originally Posted by *Roboyto*
> 
> Have ULPS disabled and force constant voltage checked in your utility? This may solve your issue.


ULPS is for CF though. That and I prefer keeping the power savings in regards to voltage.

I just did some loops of the Tomb Raider benchmark and clocks didn't budge at all. This leads me to believe it's a BF4 mantle/driver issue.

Curious if you or anyone else experiences this in BF4 Mantle.

Lastly, I get clock spikes when scrolling up and down on my browser. Weird...


----------



## Mega Man

Quote:


> Originally Posted by *pdasterly*
> 
> best mobo for 290x crossfire, leaning towards 4770k cpu and a fancy board


either RIVE or RIVBE but you would need a different CPU


----------



## kizwan

Quote:


> Originally Posted by *Yvese*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roboyto*
> 
> Have ULPS disabled and force constant voltage checked in your utility? This may solve your issue.
> 
> 
> 
> 
> 
> 
> ULPS is for CF though. That and I prefer keeping the power savings in regards to voltage.
> 
> I just did some loops of the Tomb Raider benchmark and clocks didn't budge at all. This leads me to believe it's a BF4 mantle/driver issue.
> 
> Curious if you or anyone else experiences this in BF4 Mantle.
> 
> Lastly, I get clock spikes when scrolling up and down on my browser. Weird...
Click to expand...

I only noticed very minor clock fluctuation when playing BF4 with a single GPU (WHQL drivers). Nothing like yours though. I think I saved a graph or two that I can share with you when I can access my computer.

It seems 290s like to get involved in our lives. I'm not worried about these little things, e.g. clock spikes, as long as they don't cause any issues or problems for me.
Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pdasterly*
> 
> best mobo for 290x crossfire, leaning towards 4770k cpu and a fancy board
> 
> 
> 
> either RIVE or RIVBE but you would need a different CPU
Click to expand...

For fancy board, I agree.


----------



## Mega Man

I have yet to be able to even play that game; I always get errors and last at most 5 min... looked fun... big waste of cash


----------



## pdasterly

Quote:


> Originally Posted by *Mega Man*
> 
> either RIVE or RIVBE but you would need a different CPU


That board and cpu is more than i want to spend


----------



## kizwan

Quote:


> Originally Posted by *pdasterly*
> 
> That
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> either RIVE or RIVBE but you would need a different CPU
> 
> 
> 
> That board and cpu is more than i want to spend
Click to expand...

With a quad-core CPU you can reduce the cost; you don't necessarily need to get a hexacore with that board.

If you want to stick with Haswell, for a fancy board an Asus ROG board will be suitable. Are you going to run more than two cards in the future? If so, choose one with a PLX chip, I think.


----------



## Mega Man

Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pdasterly*
> 
> _*best mobo*_ for 290x crossfire, leaning towards 4770k cpu and a fancy board
> 
> 
> 
> either RIVE or RIVBE but you would need a different CPU
Click to expand...

Quote:


> Originally Posted by *pdasterly*
> 
> That
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> either RIVE or RIVBE but you would need a different CPU
> 
> 
> 
> That board and cpu is more than i want to spend
Click to expand...

you asked for best. i gave it to you


----------



## Jeffro422

Took a quick shot of my 290x installed. Looking forward to getting the Kraken G10 pdasterly recommended.


----------



## Roboyto

Quote:


> Originally Posted by *Yvese*
> 
> ULPS is for CF though. That and I prefer keeping the power savings in regards to voltage.
> 
> I just did some loops of the Tomb Raider benchmark and clocks didn't budge at all. This leads me to believe it's a BF4 mantle/driver issue.
> 
> Curious if you or anyone else experiences this in BF4 Mantle.
> 
> Lastly, I get clock spikes when scrolling up and down on my browser. Weird...


Disabling ULPS and forcing constant voltage aren't going to inhibit your card's ability to downclock. Doing this can help with issues like you're describing, and with stability when pushing the GPU.

If you're using 14.4, that may have something to do with your problem as well. Make sure you have scrubbed whatever drivers you came from previously with DDU.

If you have a capable CPU then Mantle may cause more headaches than benefits. It was in fact designed for low/mid-end hardware, not AMD's flagship GPU.

My suggestion is still 13.12 drivers.

The only issue I have had in the browser is occasional artifacting for no apparent reason. Scrolling up and then down on the page makes it go away.

More 290(X) information:
http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781


----------



## maynard14

http://www.guru3d.com/files_get/amd_catalyst_14_6_beta_download,1.html

AMD Catalyst 14.6 Beta out now :0


----------



## JordanTr

Quote:


> Originally Posted by *maynard14*
> 
> http://www.guru3d.com/files_get/amd_catalyst_14_6_beta_download,1.html
> 
> AMD Catalyst 14.6 Beta out now :0


Any noticeable improvements?


----------



## BradleyW

Quote:


> Originally Posted by *JordanTr*
> 
> Any noticeable improvements?


Only for Watch Dogs, Murdered SS and eyefinity. No other improvements.


----------



## velocityx

Quote:


> Originally Posted by *BradleyW*
> 
> Only for Watch Dogs, Murdered SS and eyefinity. No other improvements.


well, mantle bf4 runs a tiny bit better. getting better averages than 14.4


----------



## rdr09

Quote:


> Originally Posted by *velocityx*
> 
> well, mantle bf4 runs a tiny bit better. getting better averages than 14.4


so, you've tried it? how did you install it? you mind running a bench? 3DMark11 pls.


----------



## velocityx

Quote:


> Originally Posted by *rdr09*
> 
> so, you've tried it? how did you install it? you mind running a bench? 3DMark11 pls.


I uninstalled 14.4, then ran DDU, then installed these.

I don't have 3DMark.


----------



## rdr09

Quote:


> Originally Posted by *velocityx*
> 
> I uninstalled 14.4 then did a ddu then installed these.
> 
> I dont have 3d mark.


the dreaded DDU (to me). +rep.

not 3DMark but 3DMark11 . . .

http://www.techpowerup.com/downloads/Benchmarking/Futuremark/

it's free and short, third from the list. Thanks. I want to finally get rid of 13.11.


----------



## velocityx

Quote:


> Originally Posted by *rdr09*
> 
> the dreaded DDU (to me). +rep.
> 
> not 3DMark but 3DMark11 . . .
> 
> http://www.techpowerup.com/downloads/Benchmarking/Futuremark/
> 
> its free and short. third from the list. thanks. I want to finally rid of 13.11.


Well, I didn't exactly want to use DDU, but manually uninstalling from Catalyst in Control Panel gave me errors and nothing uninstalled except list entries in the install manager; the drivers stayed, so I had to use DDU. It's hard for me to comprehend how a company like AMD, which makes microscopic chips and other mind-bending electronics, can't create software for them. One would think hardware is harder to make than software lol

anyway, I'm downloading 3d mark.


----------



## rdr09

Quote:


> Originally Posted by *velocityx*
> 
> Well, I didn't exactly want to use DDU, but manually uninstalling from Catalyst in Control Panel gave me errors and nothing uninstalled except list entries in the install manager; the drivers stayed, so I had to use DDU. It's hard for me to comprehend how a company like AMD, which makes microscopic chips and other mind-bending electronics, can't create software for them. One would think hardware is harder to make than software lol
> 
> anyway, I'm downloading 3d mark.


For the longest time, since my HD 4555, all I did was use the newer driver to uninstall the old, express install, and it worked. With 14.4, it did not work. I am using 14.3 and 13.11 (2 HDDs).

no ddus or driver sweeper in this system. just no.


----------



## velocityx

Quote:


> Originally Posted by *rdr09*
> 
> for the longest time, since my HD 4555, all all I did was use the newer driver to uninstall the old. express and it worked. with 14.4, did not work. I am using 14.3 and 13.11 (2 hdds).
> 
> no ddus or driver sweeper in this system. just no.


I can't really use express, as it will dump my USB and chipset drivers, which leaves me with no input, and then you need to reinstall Windows from scratch lol. Catch-22.


----------



## Dasboogieman

Good lord, just ran Watch Dogs.
Ultra texture settings with Enhanced SMAA at 1080p:
45-90 FPS range,
but damn, 3.5GB VRAM usage with 600MB non-dedicated usage.

It was hitting 3.9GB VRAM with 1200MB non-dedicated usage with 4x MSAA.


----------



## rdr09

Quote:


> Originally Posted by *velocityx*
> 
> I cant really use express as it will dump my usb and chipset drivers so that leaves me with no input and you need to reinstall windows from scratch lol. catch 22.


you must have an amd rig. I do the same with my amd rig, though, without issue. hmmm.


----------



## BradleyW

Quote:


> Originally Posted by *Dasboogieman*
> 
> Good lord just ran Watchdogs.
> Ultra Texture settings with Enhanced SMAA on 1080p
> 45-90 FPS range
> but damn, 3.5Gb VRAM usage with 600mb non-dedicated usage.
> 
> It was hitting 3.9Gb VRAM with 1200mb non-dedicated usage with x4 MSAA.


Rig specs and drivers please?


----------



## Arizonian

Quote:


> Originally Posted by *vuldin*
> 
> Here's my entry into the owners club:
> 
> 1. http://www.techpowerup.com/gpuz/g6gzn/
> 2. Asus DirectCU II OC R9290X
> 3. stock cooling
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I've been slow to post this as I was hoping to spend lots of time seeing how far I can overclock and still be stable before posting pics of GPU-Z. Instead I've spent lots of time playing games
> 
> 
> 
> 
> 
> 
> 
> . The only overclocking I've done to the GPU is increased the clock speed to 1100MHz (from the default 1050MHz for this card). I've also increased the voltage to 1300mV, but I'm not yet sure if this was needed. When I increased the GPU memory clock speed I ended up getting lower 3dmark Firestrike scores, so I need to do more testing to see what the best settings are.


Congrats - added









You're right; for an 1100 MHz core OC you should be able to do that without any voltage bump. It'll just mean higher temps.


----------



## heroxoot

I had to bump my voltage from the default +25mV to +30mV to get it to stop showing weird checkered artifacts in games. No polygon type artifacts at all tho. Not sure what it was but the +5 bump fixed it. I guess I'm just unlucky.

Does anyone know what I can do to make my GPU idle like it does on stock, but with my OC? On stock my GPU idles at 350/1250, and it idles cooler than my 2D profile in MSI AB of 515/625. I cannot make the slider go down any more on the GPU and it's annoying me to no end. I could be idling in the mid 40s, but instead it idles at 52C for a whole 200MHz.


----------



## chronicfx

Anyone tried watchdogs in xfire yet using 14.6? How does it run and what resolution are you using?


----------



## phallacy

Quote:


> Originally Posted by *chronicfx*
> 
> Anyone tried watchdogs in xfire yet using 14.6? How does it run and what resolution are you using?


I've got 14.6; can't seem to even register on Uplay to download Watch Dogs though. Seems Uplay is down for a lot of people, PCs and consoles alike.


----------



## DolanTheDuck

Anyone with an R9 290 who played Watch Dogs, what are your average fps? I'm getting too-low framerates! (stupid nvidia/ubisoft)

Specs:
Ultra settings, ultra textures, temporal smaa @ 1080p
35-50 fps while walking outside in the game.

i5 4670k, r9 290 tri-x, 8gb ram, win 8.1, catalyst 14.6


----------



## edo101

Quote:


> Originally Posted by *DolanTheDuck*
> 
> Anyone with r9 290 who played watch dogs, what are your average fps? I'm getting too low framerates!(stupid nvidia/ubisoft)
> 
> Specs:
> Ultra settings, ultra textures, temporal smaa @ 1080p
> 35-50 fps while walking outside in the game.
> 
> i5 4670k, r9 290 tri-x, 8gb ram, win 8.1, catalyst 14.6


Oh man, then I'm screwed on 1440p


----------



## rdr09

Quote:


> Originally Posted by *DolanTheDuck*
> 
> Anyone with r9 290 who played watch dogs, what are your average fps? I'm getting too low framerates!(stupid nvidia/ubisoft)
> 
> Specs:
> Ultra settings, ultra textures, temporal smaa @ 1080p
> 35-50 fps while walking outside in the game.
> 
> i5 4670k, r9 290 tri-x, 8gb ram, win 8.1, catalyst 14.6


see post # 23394. not sure what driver was used.

Anyway, I installed 14.6 Beta using the express method. No DDU, no driver sweeper crap, just how I used to do it . . .







compare graphics scores.

13.11

http://www.3dmark.com/3dm11/8233743

14.6 Beta

http://www.3dmark.com/3dm11/8369902

both at 1200/1500. Gonna try BF4 soon.







Quote:


> Originally Posted by *heroxoot*
> 
> 14.4
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 14.6 beta
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What an increase!
> 
> But I still again, have no fan control since 13.12. I do not know why. Also I saw my GPU go up to 90c near the end of this benchmark and that seems a little scary.


----------



## heroxoot

14.4


Spoiler: Warning: Spoiler!















14.6 beta


Spoiler: Warning: Spoiler!















What an increase!

But again, I still have no fan control since 13.12; I do not know why. Also I saw my GPU go up to 90C near the end of this benchmark, and that seems a little scary.


----------



## edo101

Quote:


> Originally Posted by *heroxoot*
> 
> 14.4
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 14.6 beta
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What an increase!
> 
> But I still again, have no fan control since 13.12. I do not know why. Also I saw my GPU go up to 90c near the end of this benchmark and that seems a little scary.


When you say fan control, can't you use Trixx or AB to get the fans to operate how you want?


----------



## pdasterly

Quote:


> Originally Posted by *Mega Man*
> 
> you asked for best. i gave it to you


picking up 4770k and maximus vi hero(z87), is this a good choice


----------



## Ramzinho

I feel like buying one of those Korean IPS monitors. Is it worth it to buy a 290X when the 390X is out, or am I gonna be fine with my GPU until the 390X is a little bit cheaper? I am torn.. if I wanna take that step towards 1440p I need to start saving soon so I don't get into a financial crisis.


----------



## rdr09

Quote:


> Originally Posted by *heroxoot*
> 
> 14.4
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 14.6 beta
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What an increase!
> 
> But I still again, have no fan control since 13.12. I do not know why. Also I saw my GPU go up to 90c near the end of this benchmark and that seems a little scary.


I edited my post above yours, resetting my CPU to stock with HT off to match the 13.11 run. Almost a 500 pt increase in the graphics score.









Happy for you, Hex. I just wish BF4 didn't crash. I'll test now.


----------



## Faster_is_better

Quote:


> Originally Posted by *Ramzinho*
> 
> i feel like buying one of them korean IPS monitors. is it worth it to buy a 290X when the 390X is out? or Am i gonna be fine with my GPU until the 390X is a little bit cheaper? i am torn.. if i wanna take that step towards 1440p i need to start saving soon so i dont get into a financial crisis.


I suppose it depends on what you use your rig for. 7970 is still a decent card and will play a lot of games at med-high/ultra settings at 1440p. On the other hand I don't know if we even have details of 390x, even rumors so its release is probably a long ways out. By the time it has released and been on the market long enough to drop in price, 290s will be old, but I would guess still very capable cards.

If you will be upgrading monitor soon, then a 290 would be a nice interim card. If you don't mind buying used, there are some ridiculous prices on these cards right now due to the huge flood of miners dumping them.


----------



## Ramzinho

Quote:


> Originally Posted by *Faster_is_better*
> 
> I suppose it depends on what you use your rig for. 7970 is still a decent card and will play a lot of games at med-high/ultra settings at 1440p. On the other hand I don't know if we even have details of 390x, even rumors so its release is probably a long ways out. By the time it has released and been on the market long enough to drop in price, 290s will be old, but I would guess still very capable cards.
> 
> If you will be upgrading monitor soon, then a 290 would be a nice interim card. If you don't mind buying used, there are some ridiculous prices on these cards right now due to the huge flood of miners dumping them.


I can't flip cards that fast. The idea of paying 400 for the monitor and 400 for the card in a span of 4 months, then wanting to flip the GPU for a newer one, is scary for me, considering I'll have to pay a lot of money to go from a 290X to the next gen.

There are rumors about the 390X being released later this year "around November". I'm getting my monitor in mid July, but seeing that some games won't even run at medium on my 7970 makes me think it's time to turn it over. However, I don't know if it's still viable to go 290X and keep it until the 490X is out, or just hold out on my 7970 until the 390X is around $400.


----------



## heroxoot

Quote:


> Originally Posted by *edo101*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> 14.4
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 14.6 beta
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What an increase!
> 
> But I still again, have no fan control since 13.12. I do not know why. Also I saw my GPU go up to 90c near the end of this benchmark and that seems a little scary.
> 
> 
> 
> When you say fan control, Cant you use Trixx or AFB to get the fans to operate how you want it?
Click to expand...

There are no controls or options no. Doing it in TRIXX has no effect, MSI AB shows no fan control or fan curve tab in settings. Changing it in CCC just reverts to default when applied.


----------



## sugarhell

You probably need a new bios from msi.


----------



## Krusher33

Quote:


> Originally Posted by *Ramzinho*
> 
> There are rumors about the 390X being released later this year "around november"


First I've heard of this...


----------



## heroxoot

Quote:


> Originally Posted by *sugarhell*
> 
> You probably need a new bios from msi.


I've already called, and they won't do anything because no one else reports the problem. The older BIOS for my card doesn't have the problem, but that BIOS crashes my GPU, so I won't use it.


----------



## Ramzinho

Quote:


> Originally Posted by *Krusher33*
> 
> First I've heard of this...


said rumors and here is the post

*Link*


----------



## Hl86

Summer warmth is too much for my 290x; my VRM1 hits 100C in 3 mins overclocked with the Tri-X cooler. Should I get AC or watercooling?


----------



## heroxoot

Quote:


> Originally Posted by *Hl86*
> 
> Summer varmth is too much for my 290x, my vrm1 hits 100c in 3 mins overclocked with tri-x cooler. Should i get AC or Watercooling?


I'd try an AC first so you don't die with it.


----------



## bencher

Quote:


> Originally Posted by *Hl86*
> 
> Summer varmth is too much for my 290x, my vrm1 hits 100c in 3 mins overclocked with tri-x cooler. Should i get AC or Watercooling?


Even at max fanspeed?

My r9 290s' vrms are cooler than the gpu lol. They stay below 70c while the gpu is 80C+


----------



## heroxoot

Quote:


> Originally Posted by *bencher*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Hl86*
> 
> Summer varmth is too much for my 290x, my vrm1 hits 100c in 3 mins overclocked with tri-x cooler. Should i get AC or Watercooling?
> 
> 
> 
> Even at max fanspeed?
> 
> My r9 290s' vrms are cooler than the gpu lol. They stay below 70c while the gpu is 80C+
Click to expand...

Same here. Mine sit around 68c actually. I'd get that checked out honestly. VRM1 never passes 70c for me.


----------



## Mercy4You

Has anyone tried 14.6 Beta in BF4 already?


----------



## Hl86

Quote:


> Originally Posted by *bencher*
> 
> Even at max fanspeed?
> 
> My r9 290s' vrms are cooler than the gpu lol. They stay below 70c while the gpu is 80C+


Yeah, 100% fan speed; it was 30-35 degrees though, and no airflow in the room.


----------



## heroxoot

Does anyone know a way to force my underclock lower than MSI AB will let me? The difference of 200MHz keeps my GPU a lot hotter than it should be. When my GPU downclocks on stock clocks to 300/1250 it stays like 3C cooler, yet when I downclock it to 515/625 it stays hotter. Right now I have it switching between my OC and the default clock just to see if it gets cooler, and it does, but I think if I were able to have my RAM running at half speed with it, I'd get it a bit cooler at idle.

Just looking for ways to cool it during the summer so I don't fry.


----------



## edo101

Quote:


> Originally Posted by *heroxoot*
> 
> Does anyone know a way to force my underclock lower than MSI AB will let me? The difference of 200mhz keeps my gpu a lot hotter than it should. When my GPU downclocks on stock clocks to 300/1250 it stays like 3c cooler. Yet when I down clock it to 515/625 it stays hotter. Right now I got it switching between my OC and default clock just to see if it gets cooler and it does, but I think if I were able to have my ram running half speed with it, I'd get it a bit cooler at idle.
> 
> Just looking for ways to cool it during the summer so I don't fry.


See if Trixx will let you go lower


----------



## heroxoot

I tried Trixx, but it doesn't seem to be able to rotate 2D and 3D clocks for me. That is an issue because I don't want max clocks going all day. Trixx seems to lack a lot of features MSI AB has. It seems the limit for MSI AB is half your stock clock. Maybe I should make a modded BIOS with a lower default clock so I can get it to 300 for idle? Not sure. My GPU does stay 2C colder letting the default clocks run, so for now the 2D profile runs the 1030/1250 stock clocks, in which case it downclocks to 300/1250 and stays 2C colder.


----------



## DolanTheDuck

Edit: nevermind mobile site totally screwd quote.


----------



## chiknnwatrmln

What I do is run a few different profiles. If you have an OC profile that's set to, say, 1150/5500 with +100 mV, then as long as you're on that profile you're running +100mV. Even when idling, you have increased voltages which increase temps.

With my profile that runs +168mV, instead of idling at .961v (stock) it idles around 1.117v. I remedy this by using a secondary profile that's set to 500/2500 on stock voltage. The result is an even lower idle voltage (.951v) and lower temps. It usually idles a few C above ambient, but that's because water loops hold heat longer than air.

The card should automatically downclock (to around 300/600) on all profiles, unless you're forcing 3D clocks. I don't mess with 2D clocks, 300/600 is just about perfect imo. What I think is happening is your card is going into 3D mode and increasing voltage. Even though the clocks are low, the GPU is being fed more volts and that causes higher temperatures.
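The switching logic described above can be sketched in a few lines of Python. This is only a rough illustration of the idea: the clocks, the voltage offsets, and the 10% load threshold are made-up numbers for the sketch, not anything Afterburner actually exposes.

```python
# Hypothetical sketch of idle/load profile switching: apply the +mV offset
# only while the GPU is actually loaded, and fall back to a stock-voltage,
# low-clock profile at idle so temps drop.
PROFILES = {
    "idle": {"core_mhz": 500, "mem_mhz": 2500, "offset_mv": 0},    # stock voltage at idle
    "oc":   {"core_mhz": 1150, "mem_mhz": 5500, "offset_mv": 100}, # gaming OC profile
}

def pick_profile(gpu_load_pct, profiles=PROFILES):
    """Return the idle profile under light load, else the OC profile."""
    return profiles["idle"] if gpu_load_pct < 10 else profiles["oc"]
```

The point is just that the voltage offset rides along with whichever profile is active, so an idle profile at stock voltage idles cooler than the OC profile ever will.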


----------



## DolanTheDuck

Quote:


> Originally Posted by *rdr09*
> 
> see post # 23394. not sure what driver was used.
> 
> anyway, i installed 14.6 Beta using express method. no ddu, no driver sweeper crap just how i used to do it . . .
> 
> 
> 
> 
> 
> 
> 
> compare graphics scores.
> 
> 13.11
> 
> http://www.3dmark.com/3dm11/8233743
> 
> 14.6 Beta
> 
> http://www.3dmark.com/3dm11/8369902
> 
> both at 1200/1500. Gonna try BF4 soon.


Great method: first the old driver, then the new one. Gonna see if I gain fps. A strange thing is that when I change the settings to high/medium the fps stays the same, but when I change the settings to low I get a sudden 80+ fps.


----------



## Faster_is_better

Quote:


> Originally Posted by *Ramzinho*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Faster_is_better*
> 
> I suppose it depends on what you use your rig for. 7970 is still a decent card and will play a lot of games at med-high/ultra settings at 1440p. On the other hand I don't know if we even have details of 390x, even rumors so its release is probably a long ways out. By the time it has released and been on the market long enough to drop in price, 290s will be old, but I would guess still very capable cards.
> 
> If you will be upgrading monitor soon, then a 290 would be a nice interim card. If you don't mind buying used, there are some ridiculous prices on these cards right now due to the huge flood of miners dumping them.
> 
> 
> 
> i can't flip cards that fast. the idea of paying 400 for the Monitor and 400 for the card in a span of 4 month then wanting to flip the gpu for a newer one is probably scary for me regarding i'll have to pay a lot of money to go from 290X to the next gen..
> 
> There are rumors about the 390X being released later this year "around november" i'm getting my Monitor in mid July .. but then seeing that some games are not gonna be even run at medium using my 7970 is making me think it's time to turn it.. However i don't know if it's still viable to go 290X and keep it until the 490X is out or just bite on my 7970 until the 390X is a around 400$
Click to expand...

Well, if the release truly is that close then you might just get the monitor and see how your 7970 does; maybe you can make do with it until the next ones release. But I suspect even if they were to drop in November, it's going to be a few months before prices drop after release. It also depends on what competition the cards are facing at the time. If AMD's release beats out Nvidia's next series, then they may be on top for a while as far as single-card performance goes, with little pressure to lower prices...

That rumored release seems too near to me. It would be cool if AMD could do it, but I'm expecting more of a Q1 2015 release. Maybe we will have some better info in July for you to go on.


----------



## bencher

Quote:


> Originally Posted by *Hl86*
> 
> Yeah 100 fanspeed, it was 30-35 degrees though and no air flow in room


I have 70% fan speed. My room is 31C lol


----------



## Hl86

Quote:


> Originally Posted by *bencher*
> 
> I have 70% fan speed. My room in 31c lol


Is it reference?

I'll go check some reviews; if it's true, the VRM shouldn't be that high.


----------



## Hl86

This pic shows temps at stock; it's crazy. Add some voltage and it easily hits 100°C on a hot day.
http://www.legitreviews.com/wp-content/uploads/2014/01/sapphire-vrm-load-645x405.jpg

From this review
http://www.legitreviews.com/asus-r9-290x-directcu-ii-sapphire-r9-290x-tri-x-video-card-reviews_132158/11


----------



## rdr09

Quote:


> Originally Posted by *Hl86*
> 
> This pic shows temps at stock; it's crazy. Add some voltage and it easily hits 100°C on a hot day.
> http://www.legitreviews.com/wp-content/uploads/2014/01/sapphire-vrm-load-645x405.jpg
> 
> From this review
> http://www.legitreviews.com/asus-r9-290x-directcu-ii-sapphire-r9-290x-tri-x-video-card-reviews_132158/11


Look at the fan speed RPM and percentage. When OC'ing, you don't want to leave the fan(s) on auto; either set them manually or make a fan curve. These cards have been out a while, and it's no secret Hawaii runs hot. The best way to cool these cards is to watercool.
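For anyone scripting this instead of using Afterburner's curve editor, a fan curve is just a set of (temperature, fan %) points with linear interpolation between them. A minimal sketch - the points below are made-up examples, not recommended values for Hawaii cards:

```python
# Hypothetical fan curve: (deg C, fan %) points, linearly interpolated.
# These numbers are illustrative only, not tuned for any particular card.
CURVE = [(40, 30), (60, 45), (75, 70), (85, 100)]

def fan_percent(temp_c, curve=CURVE):
    """Return the fan speed (%) for a GPU temperature via linear interpolation."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the first point: pin to minimum
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # above the last point: pin to maximum
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:      # interpolate within the enclosing segment
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
```

With this example curve, 50°C gives 37.5% fan, and anything past 85°C pins the fan at 100%.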


----------



## rdr09

Quote:


> Originally Posted by *Hl86*
> 
> Summer warmth is too much for my 290X; my VRM1 hits 100°C in 3 minutes overclocked with the Tri-X cooler. Should I get AC or watercooling?


watercool.


----------



## Jeffro422

So I ordered the Kraken G10; should be here tomorrow. Still need VRM heatsinks. But a question: with multiple displays, is there any way to keep the memory from running at 1250MHz at the desktop with 0% usage?


----------



## Dispersion

Woo, summer heat...

Anyways, I guess I'll join this club!

GPU-Z Validation
Reference Sapphire R9 290 with EK FC R9-290X.



Just my summer clock; guess I got some kind of semi-golden sample or something, 80.3% ASIC.


----------



## Dasboogieman

Quote:


> Originally Posted by *DolanTheDuck*
> 
> Anyone with an R9 290 who played Watch Dogs, what's your average fps? I'm getting too-low framerates! (stupid Nvidia/Ubisoft)
> 
> Specs:
> Ultra settings, ultra textures, temporal smaa @ 1080p
> 35-50 fps while walking outside in the game.
> 
> i5 4670k, r9 290 tri-x, 8gb ram, win 8.1, catalyst 14.6


DW dude, my rig is similar to yours except with a 2500K, at pretty much the same WD settings.

I'm on 14.4 and I'm getting between 45 and 90 fps; about 35 to 60 outdoors.

VRAM usage is a problem though - about 3.5GB.


----------



## BradleyW

I'm struggling to keep 40fps with 290X CFX and 3930K in Watch Dogs! Constant stuttering. Can't drive due to the stutter. The game is unplayable.


----------



## Aussiejuggalo

I see a lot of complaints about Watch Dogs and CFX, So... how does Watch Dogs play on single 290s?









Still waiting for mine to get delivered


----------



## Elmy

Quote:


> Originally Posted by *BradleyW*
> 
> I'm struggling to keep 40fps with 290X CFX and 3930K in Watch Dogs! Constant stuttering. Can't drive due to the stutter. The game is unplayable.


Have you tried the 14.6 Beta drivers that came out today?


----------



## BradleyW

Yes. I am on a fresh install of Win 8.1 with CCC 14.6 from the AMD website. The game started off at 100+ fps maxed out with FXAA, but as soon as I got halfway into the first level, my fps plummeted to 40. Outside I constantly fluctuate between 30 and 70 fps. Driving is impossible and running is almost impossible; all I can do is stand still and enjoy around 55 fps. Plus, CFX is only giving me a 10fps gain. All other games work perfectly fine.


----------



## chronicfx

Quote:


> Originally Posted by *BradleyW*
> 
> Yes. I am on a fresh install of Win 8.1 with CCC 14.6 from the AMD website. The game started off at 100+ fps maxed out with FXAA, but as soon as I got halfway into the first level, my fps plummeted to 40. Outside I constantly fluctuate between 30 and 70 fps. Driving is impossible and running is almost impossible; all I can do is stand still and enjoy around 55 fps. Plus, CFX is only giving me a 10fps gain. All other games work perfectly fine.


I am out of town till Friday so I cannot play it, but I am very sad to hear this.
I guess I will pull out my third 290X and just play it with crossfire, if that even works.


----------



## buddatech

Guess I'll join for now


----------



## pkrexer

I'm still having trouble with Crossfire and Mantle running poorly on 14.6; my GPU usage is jumping all over the place. However, DX11 performance seems even better, so I'm happy with them.


----------



## Mega Man

Quote:


> Originally Posted by *Ramzinho*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Faster_is_better*
> 
> I suppose it depends on what you use your rig for. 7970 is still a decent card and will play a lot of games at med-high/ultra settings at 1440p. On the other hand I don't know if we even have details of 390x, even rumors so its release is probably a long ways out. By the time it has released and been on the market long enough to drop in price, 290s will be old, but I would guess still very capable cards.
> 
> If you will be upgrading monitor soon, then a 290 would be a nice interim card. If you don't mind buying used, there are some ridiculous prices on these cards right now due to the huge flood of miners dumping them.
> 
> 
> 
> i can't flip cards that fast. The idea of paying 400 for the monitor and 400 for the card in a span of 4 months, then wanting to flip the GPU for a newer one, is pretty scary for me, considering I'll have to pay a lot of money to go from a 290X to the next gen.
> 
> There are rumors about the 390X being released later this year, "around November". I'm getting my monitor in mid July, but seeing that some games won't even run at medium on my 7970 is making me think it's time to turn it in. However, I don't know if it's still viable to go 290X and keep it until the 490X is out, or just hold onto my 7970 until the 390X is around $400.

Sounds like the drivers need optimizing, imo.


----------



## Jeffro422

Quote:


> Originally Posted by *Ramzinho*
> 
> i can't flip cards that fast. the idea of paying 400 for the Monitor and 400 for the card in a span of 4 month then wanting to flip the gpu for a newer one is probably scary for me regarding i'll have to pay a lot of money to go from 290X to the next gen..


I grabbed my 290X for $280 and an X-Star DP2710 LED for $300 this week, so it's not quite as much as you think.


----------



## Elmy

Quote:


> Originally Posted by *BradleyW*
> 
> Yes. I am on a fresh install of Win 8.1 with CCC 14.6 from AMD website. The game started off at 100+ max out with FXAA, but as soon as I got half way into the first level, my fps plummeted to 40fps. Outside I constantly fluctuate between 30 to 70 fps. Driving is impossible and running is almost impossible. All I can do is stand still and enjoy around 55 fps. Plus CFX is only giving me a 10fps gain. All other games work perfectly fine.


Sucks to hear that... I heard Nvidia paid specifically so that AMD would not have access to any of the game's code.

If true, shame on Nvidia.


----------



## Mega Man

Quote:


> Originally Posted by *Elmy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BradleyW*
> 
> Yes. I am on a fresh install of Win 8.1 with CCC 14.6 from AMD website. The game started off at 100+ max out with FXAA, but as soon as I got half way into the first level, my fps plummeted to 40fps. Outside I constantly fluctuate between 30 to 70 fps. Driving is impossible and running is almost impossible. All I can do is stand still and enjoy around 55 fps. Plus CFX is only giving me a 10fps gain. All other games work perfectly fine.
> 
> 
> 
> Sucks to hear that... I heard Nvidia paid specifically so that AMD would not have access to any of the game's code.
> 
> If true, shame on Nvidia.

Pretty typical actually, especially of Nvidia.


----------



## Dasboogieman

Quote:


> Originally Posted by *Elmy*
> 
> Sucks to hear that... I heard Nvidia paid specifically so that AMD would not have access to any of the game's code.
> 
> If true, shame on Nvidia.


Well, Ubisoft has got to get on the ball; the consoles use AMD hardware, last I checked. This usage of proprietary NVIDIA libraries will only hurt Ubisoft.
Plus, the 780 Ti is in a poor position to fully let WD stretch, since its 3GB of RAM is killing performance.
At this rate, anyone with less than a Titan Black will not be able to properly run this game.


----------



## phantomowl

I got the weirdest bug on AMD driver 14.4 WHQL last night. I installed it and it detected a 2nd monitor (which I don't have), then it proceeded to black screen. I rolled back to 14.3 and will definitely try the latest driver.


----------



## Mega Man

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Elmy*
> 
> Sucks to hear that... I heard Nvidia paid specifically so that AMD would not have access to any of the game's code.
> 
> If true, shame on Nvidia.
> 
> 
> 
> Well, Ubisoft has got to get on the ball; the consoles use AMD hardware, last I checked. This usage of proprietary NVIDIA libraries will only hurt Ubisoft.
> Plus, the 780 Ti is in a poor position to fully let WD stretch, since its 3GB of RAM is killing performance.
> At this rate, anyone with less than a Titan Black will not be able to properly run this game.

Meh, when was Ubi good? The only real game they have is AC, and even that... meh, rinse and repeat; soon it will be old.

As much as I hate that Squaresoft merged and made Square Enix, at least they continued to make decent RPGs (until 13, imo - decent story, junky gameplay!)


----------



## IBIubbleTea

Can anyone help me with GPU scaling?
http://www.overclock.net/t/1492238/scaling-help


----------



## Arizonian

Quote:


> Originally Posted by *Dispersion*
> 
> Woo, summer heat...
> 
> Anyways, I guess I'll join this club!
> 
> GPU-Z Validation
> Reference Sapphire R9 290 with EK FC R9-290X.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just my summer clock; guess I got some kind of semi-golden sample or something, 80.3% ASIC.


Congrats - added
















Quote:


> Originally Posted by *buddatech*
> 
> Guess I'll join for now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 


Congrats - added


----------



## edo101

So um, how is Watch Dogs doing for y'all? I get 45 fps on average with default video settings and Ultra graphics settings.

I do notice that my card dips below 100% load more frequently than I'd like.

And this game seems very CPU-heavy... and eats up a lot of system RAM and GPU RAM, although for the GPU it's never gone past 3GB.

I had to terminate Mozilla to keep the game going, because 6GB of system RAM is not enough to keep my many-tabbed Firefox open.


----------



## Foresight

With the 14.4 drivers I got ~30 fps at any settings (lol) with crossfire R9 290s.
With 14.6 I get ~60 fps, up to 80 fps, but it sometimes dips down to 40 fps at 2560x1440 Ultra.


----------



## edo101

Quote:


> Originally Posted by *Foresight*
> 
> With the 14.4 drivers I got ~30 fps at any settings (lol) with crossfire R9 290s.
> With 14.6 I get ~60 fps, up to 80 fps, but it sometimes dips down to 40 fps at 2560x1440 Ultra.


So Xfire is working now for you Xfire users?


----------



## Foresight

It is working for me =).


----------



## the9quad

Runs like utter garbage on my rig. An unplayable mess of stuttering, even with AA off, even if I go down from 1440p to 1080p, even if I disable crossfire. Complete turd. Using 14.6, btw.


----------



## Mercy4You

Quote:


> Originally Posted by *the9quad*
> 
> Runs like utter garbage on my rig. An unplayable mess of stuttering, even with AA off, even if I go down from 1440p to 1080p, even if I disable crossfire. Complete turd. Using 14.6, btw.


This is really new to me; for the last 5 years I always used the betas and was happy. Since the R9 series, it's another story...


----------



## Dasboogieman

Some may find this interesting. I present: Watch Dogs.

i5-2500K at 4.5GHz
16GB 1600MHz RAM
AMD 290 Tri-X at 1150/1475

Ultra Textures
Temporal SMAA
Medium Details, HBAO, Medium Shadows and Medium Reflections

FPS is around 55 on average when I'm on the streets; it drops to 45 when I'm moving or in a car.
It goes to about 90 when I'm looking at a wall or inside a shop.



Notice the VRAM usage









Dat Optimization


----------



## Aussiejuggalo

Dat GPU load.... so erratic









How does the 2500k handle it?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dasboogieman*
> 
> Some may find this interesting. I present: Watch Dogs.
> 
> i5-2500K at 4.5GHz
> 16GB 1600MHz RAM
> AMD 290 Tri-X at 1150/1475
> 
> Ultra Textures
> Temporal SMAA
> Medium Details, HBAO, Medium Shadows and Medium Reflections
> 
> FPS is around 55 on average when I'm on the streets; it drops to 45 when I'm moving or in a car.
> It goes to about 90 when I'm looking at a wall or inside a shop.
> 
> 
> 
> Notice the VRAM usage
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Dat Optimization


Turn off Temporal SMAA; I've heard it's been having some issues with AMD cards :/


----------



## Unknownm

What's everyone's ASIC quality?


----------



## rdr09

Loving 14.6 Beta. I may finally give up 13.11 on my other HDD. Can't wait for Homecinema to try it.

Firestrike at 1200/1500

13.11

http://www.3dmark.com/3dm/2859470?

14.6 Beta

http://www.3dmark.com/3dm/3147012?

@Unknownm, my ASIC is the same as yours.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> Loving 14.6 Beta. I may finally give up 13.11 on my other HDD. *Can't wait for Homecinema to try it.*
> 
> Firestrike at 1200/1500
> 
> 13.11
> 
> http://www.3dmark.com/3dm/2859470?
> 
> 14.6 Beta
> 
> http://www.3dmark.com/3dm/3146872?
> 
> @Unknownm, my ASIC is the same as yours.


I can see some records being smashed wide open









I've only used it to test out Thief and Mantle Crossfire... works pretty well, actually. Nothing like the boost I got in BF4, but that's to be expected, and I've heard some good reports about the mixed-res Eyefinity.


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I can see some records being smashed wide open
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've only used it to test out Thief and Mantle Crossfire... works pretty well, actually. Nothing like the boost I got in BF4, but that's to be expected, and I've heard some good reports about the mixed-res Eyefinity.


It requires a bit more voltage and may present problems for those on air, but nothing serious.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> It requires a bit more voltage and may present problems for those on air, but nothing serious.


I haven't done any serious benching on this driver yet, but I'll keep that in mind, thanks.


----------



## Dasboogieman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Turn off Temporal SMAA; I've heard it's been having some issues with AMD cards :/


Yeah, everything seems much smoother on 2x MSAA.
Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Dat GPU load.... so erratic
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How does the 2500k handle it?


The 2500K handles it fine, though I'm sure others with chips at 4.8GHz and/or faster RAM may notice better results.

The erratic GPU loading is because I have Vsync on. Everything is reasonably smooth now, but damn, any setting on High is firmly out of reach until they either optimize it or I brute-force it with another 290.

I honestly don't see how the 780 Ti is gonna handle this. It technically has the rendering power to handle the higher settings better than the 290, but its 3GB of memory will hold it back a lot, especially at 1440p.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dasboogieman*
> 
> Yeah, everything seems much smoother on 2x MSAA.


I'm in the process of attempting to download it, but I haven't got high hopes; pretty sure I'll run out of data before it's finished.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Dasboogieman*
> 
> The 2500K handles it fine, though I'm sure others with chips at 4.8GHz and/or faster RAM may notice better results.
> 
> The erratic GPU loading is because I have Vsync on. Everything is reasonably smooth now, but damn, any setting on High is firmly out of reach until they either optimize it or I brute-force it with another 290.
> 
> I honestly don't see how the 780 Ti is gonna handle this. It technically has the rendering power to handle the higher settings better than the 290, but its 3GB of memory will hold it back a lot, especially at 1440p.


Cool, wonder how mine will go at stock. PITA thing won't stay stable over it.

Ah yeah, didn't think of that lol. I think it may run better with a stable driver update and a game update.

Apparently even 780 Tis are having trouble running it, according to people on the Watch Dogs thread - stuttering, lag, low FPS, etc.

Seems like Watch Dogs is either really hard to run or a seriously bad console port.


----------



## Sgt Bilko

Gameworks.....

http://www.forbes.com/sites/jasonevangelho/2014/05/26/why-watch-dogs-is-bad-news-for-amd-users-and-potentially-the-entire-pc-gaming-ecosystem/

I was very sad when I first read about this.


----------



## Mercy4You

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Gameworks.....
> 
> http://www.forbes.com/sites/jasonevangelho/2014/05/26/why-watch-dogs-is-bad-news-for-amd-users-and-potentially-the-entire-pc-gaming-ecosystem/
> 
> I was very sad when I first read about this.


We gamers LOVE a bit of war, don't we...


----------



## DolanTheDuck

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Gameworks.....
> 
> http://www.forbes.com/sites/jasonevangelho/2014/05/26/why-watch-dogs-is-bad-news-for-amd-users-and-potentially-the-entire-pc-gaming-ecosystem/
> 
> I was very sad when I first read about this.


I have no words for this. Every gamer should support AMD right now, in my opinion, because if we don't, AMD goes bankrupt and then we're going to pay 600 dollars for a GTX 750, I guess.
Besides that, it's nice of AMD to invent features that also work great on Nvidia hardware.


----------



## Silent Scone

Quote:


> Originally Posted by *Dasboogieman*
> 
> Yea everything seems much smoother on x2MSAA
> The 2500k handles it fine, tho i'm sure if others with units at 4.8ghz and/ore faster RAM may notice better improvements.
> 
> The erratic GPu loading is because I have Vsync on, everything is reasonably smooth now but damn, any settings on High is firmly out of reach until they either optimize it or I'm going to have to brute force it with another 290.
> 
> I honestly don't see how the 780Ti is gonna handle this, it technically has the rendering power to better handle the higher settings than the 290 but its 3Gb Memory will hold it back a lot, especially at 1440p.


It's fine for me with Ultra settings and 'High' textures with 2x MSAA at 1440p on my Ti. SLI is broken, however; it becomes a stutter fest, as does the Ultra texture setting.

The 8GB 290s / Titans look less ridiculous right about now. The game's broken, though.


----------



## cplifj

I for one don't care if AMD goes bankrupt.

If BMW sold their M model line with too small a radiator, and onboard software that stopped the engine when you're driving 200 mph...

Just to say, it wouldn't fly, so why should AMD be allowed to do this?


----------



## Elmy

Quote:


> Originally Posted by *cplifj*
> 
> I for one don't care if AMD goes bankrupt.
> 
> If BMW sold their M model line with too small a radiator, and onboard software that stopped the engine when you're driving 200 mph...
> 
> Just to say, it wouldn't fly, so why should AMD be allowed to do this?


Worst analogy ever....

With that 120mm radiator, temps sit at 70°C on full load - that's in my rig.

And if the watercooling system fails on a 295X2, it should stop working....

It's acclaimed as an engineering marvel by almost every reviewer on the planet.

I would take their word over yours any day of the week.


----------



## Sgt Bilko

Quote:


> Originally Posted by *cplifj*
> 
> I for one don't care if AMD goes bankrupt.
> 
> If BMW sold their M model line with too small a radiator, and onboard software that stopped the engine when you're driving 200 mph...
> 
> Just to say, it wouldn't fly, so why should AMD be allowed to do this?


Ummm.........Wut?


----------



## rdr09

Quote:


> Originally Posted by *cplifj*
> 
> I for one don't care if AMD goes bankrupt.
> 
> If BMW sold their M model line with too small a radiator, and onboard software that stopped the engine when you're driving 200 mph...
> 
> Just to say, it wouldn't fly, so why should AMD be allowed to do this?


Do what? Do this . . .

13.11

http://www.3dmark.com/3dm/2859470?

14.6 Beta

http://www.3dmark.com/3dm/3147012?

I'll take it. No need to dwell here. Go here . . .

http://www.overclock.net/f/69/nvidia

It's not like AMD is forcing you. Go! I bet you are/were a miner.


----------



## Dasboogieman

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Dat GPU load.... so erratic
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How does the 2500k handle it?


Quote:


> Originally Posted by *Silent Scone*
> 
> It's fine for me with Ultra settings and 'High' textures with 2x MSAA at 1440p on my Ti. SLI is broken, however; it becomes a stutter fest, as does the Ultra texture setting.
> 
> The 8GB 290s / Titans look less ridiculous right about now. The game's broken, though.


I agree; something is also broken with the engine when it comes to rendering motion. It just doesn't achieve any degree of fluidity. I would love to see how the 290X 8GB or Titan Black users are coping; however, they seem to be rarer than an honest politician.


----------



## edo101

Yeah, no way this game, with its crappy/average last-gen visuals, is this hard to run. THIS is just a bad port. The worst I've seen since GTA 4.


----------



## Ramzinho

Quote:


> Originally Posted by *Jeffro422*
> 
> I grabbed my 290x for $280 and X-star DP2710 LED for $300 this week so it's not quite as much as you think.


I live in Egypt. Your total is $580, and I'll be adding customs + shipping, which is at least about $120-150 more. Also, the cheapest guy shipping the X-Star or Qnix is like $350, so 300 + 350 + 150 - that's around $800.


----------



## hwoverclkd

Quote:


> Originally Posted by *heroxoot*
> 
> I've already called and they won't do jack crap because no one else claims the problem as well. The older bios for my card doesn't have the problem but the bios crashes my GPU so I won't use it.


Which MSI card do you have?


----------



## heroxoot

Quote:


> Originally Posted by *acupalypse*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> I've already called and they won't do jack crap because no one else claims the problem as well. The older bios for my card doesn't have the problem but the bios crashes my GPU so I won't use it.
> 
> 
> 
> which msi card do u have?

Gaming series. It puzzles me that a Twin Frozr can have this much heat running through it. My Lightning 7970 never saw heat like this.


----------



## DolanTheDuck

On 14.6 - so yeah, GPU working normally, right?


----------



## heroxoot

Yeah, your GPU looks great.

I cannot get 1125/1250 on air, though. 14.6 pushes my GPU to 92°C at this clock, and the clock will start to throttle down to save itself. I got it voltage-stable at +56mV core and +31mV aux (not sure if the aux is helping), but it was throttling down when it hit 92°C. So I guess 1100 core will be my max clock on air. Not so bad, I guess.
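The throttling behaviour described above - the card stepping its core clock down once it sits at its temperature target - can be sketched roughly like this. The 92°C limit is from the post; the step size and clock floor are made-up illustrative numbers, not Hawaii's actual PowerTune parameters:

```python
# Rough sketch of temperature throttling: while the GPU is at or above its
# temperature target, step the core clock down. 92 C is the limit reported
# above; the 13 MHz step and 300 MHz floor are illustrative assumptions.
def throttle(clock_mhz, temp_c, temp_limit=92, step_mhz=13, floor_mhz=300):
    """Return the next core clock given the current temperature reading."""
    if temp_c >= temp_limit:
        return max(floor_mhz, clock_mhz - step_mhz)
    return clock_mhz

clock = 1125
for temp in (80, 90, 93, 94, 92):   # a made-up temperature trace
    clock = throttle(clock, temp)   # steps down once per reading at/over 92 C
```

The real hardware adjusts voltage and clock together in much finer-grained steps; this only shows why the clock sags instead of hard-crashing at the limit.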


----------



## TTheuns

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Gameworks.....
> 
> http://www.forbes.com/sites/jasonevangelho/2014/05/26/why-watch-dogs-is-bad-news-for-amd-users-and-potentially-the-entire-pc-gaming-ecosystem/
> 
> I was very sad when I first read about this.


I swear, whenever AMD gets its chance, they will completely destroy Nvidia with just one generation of cards. And I think/hope it is coming soon, with rumors of stacked DRAM for next-gen AMD cards and Nvidia only implementing it two generations from now.


----------



## DolanTheDuck

Quote:


> Originally Posted by *heroxoot*
> 
> Yeah, your GPU looks great.
> 
> I cannot get 1125/1250 on air, though. 14.6 pushes my GPU to 92°C at this clock, and the clock will start to throttle down to save itself. I got it voltage-stable at +56mV core and +31mV aux (not sure if the aux is helping), but it was throttling down when it hit 92°C. So I guess 1100 core will be my max clock on air. Not so bad, I guess.


Lol, you're overclocking like a madman and I'm just at stock, 2 fps behind you.


----------



## kizwan

Quote:


> Originally Posted by *DolanTheDuck*
> 
> 
> 
> @14.6 so yeah gpu working normal right?


Clocks? It seems low to me though. Can you do 1000/1300?


----------



## TTheuns

A question for you guys:
Have any of you painted your original R9 290X shroud?


----------



## DolanTheDuck

Quote:


> Originally Posted by *kizwan*
> 
> Clocks? It seems low to me though. Can you do 1000/1300?


Yes, constant 1000/1300.

Specs:
R9 290 Tri-X
i5 4670K
8GB RAM

Compared to heroxoot, who has an overclocked R9 290X and gets 53 fps, it's really, really good.


----------



## kizwan

Quote:


> Originally Posted by *DolanTheDuck*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Clocks? It seems low to me though. Can you do 1000/1300?
> 
> 
> 
> 
> 
> 
> 
> Yes, constant 1000/1300.
> 
> Specs:
> r9 290 tri-x
> i5 4670k
> 8gb ram
> 
> Compared to heroxoot, who has an overclocked R9 290X and gets 53 fps, it's really, really good.

My apologies! I forgot about Tessellation. Your score is what you should get at that clock with Tessellation ON (default).


----------



## heroxoot

Quote:


> Originally Posted by *DolanTheDuck*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Clocks? It seems low to me though. Can you do 1000/1300?
> 
> 
> 
> Yes, constant 1000/1300.
> 
> Specs:
> r9 290 tri-x
> i5 4670k
> 8gb ram
> 
> Compared to heroxoot, who has an overclocked R9 290X and gets 53 fps, it's really, really good.

The difference is I'm being held back by this 8150. Were I on an i5 or i7 like you, I would probably be seeing 55+ easily. Heaven only uses 1 core, and my 8150 doesn't do single-threading too well. On another note, Windows 8.1 scores lower on physics too, so that also lowers my score. Still, my performance is pretty good, I'd say.


----------



## Lee17

I can't find how to add +100mV over the +100mV you can add in Afterburner 3.0.0 Beta 19. I have tried this without success:

CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /sg0 /wi4,30,8d,10 /sg1 /wi4,30,8d,10

I have 2 GPUs, and that is why I have /sg0 and /sg1; I read somewhere that I should do that. I have also tried without the /sgX, but no luck. Anyone know how to put +100mV on these bad boys?

Thanks, Lee


----------



## MojoW

Quote:


> Originally Posted by *Lee17*
> 
> I can't find how to add +100mV over the +100mV you can add in Afterburner 3.0.0 Beta 19. I have tried this without success:
> 
> CD C:\Program Files (x86)\MSI Afterburner
> 
> MSIAfterburner.exe /sg0 /wi4,30,8d,10 /sg1 /wi4,30,8d,10
> 
> I have 2 GPUs, and that is why I have /sg0 and /sg1; I read somewhere that I should do that. I have also tried without the /sgX, but no luck. Anyone know how to put +100mV on these bad boys?
> 
> Thanks, Lee


MSIAfterburner.exe /sg0 /wi*6*,30,8d,10 /sg1 /wi*6*,30,8d,10

The one you used was for Beta 18.
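If launching this from a .bat keeps failing, the same invocation can be built and launched from a short script; a sketch in Python, where the install path and the Beta 19 `/sgN` / `/wi6,...` flags are taken from the posts in this thread (assumptions about one particular setup, not documented MSI behaviour):

```python
import os
import subprocess

# Default install path assumed; adjust for your own machine.
AFTERBURNER = r"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe"

def build_cmd(exe=AFTERBURNER, gpus=(0, 1)):
    """One /sgN selector followed by the voltage-offset flags per GPU."""
    cmd = [exe]
    for n in gpus:
        cmd += [f"/sg{n}", "/wi6,30,8d,10"]
    return cmd

if os.name == "nt":  # only meaningful on a Windows box with Afterburner installed
    subprocess.run(build_cmd())
```

Passing the arguments as a list sidesteps the quoting problems a .bat can hit with the space in "Program Files (x86)".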


----------



## Learath2

I feel alone out here - can I join you guys?









GPU-Z Validation Link
It's an ASUS R9 290X reference card. I will be watercooling it, but not yet, as I'm still not sure whether the card is actually stable, because of the totally unbalanced rig I have.


----------



## Lee17

It still doesn't launch... Should I try to find Beta 18?

Quote:


> Originally Posted by *Learath2*
> 
> GPU-Z Validation Link
> It's an ASUS R9 290X reference card. I will be watercooling it, but not yet, as I'm still not sure whether the card is actually stable, because of the totally unbalanced rig I have.


I hope you will be able to use that monster! Enjoy it as much as you can!


----------



## MojoW

Quote:


> Originally Posted by *Lee17*
> 
> It still doesn't launch... Should I try to find Beta 18?
> I hope you will be able to use that monster! Enjoy it as much as you can!


Are you running that CMD from the Afterburner folder?


----------



## Lee17

Quote:


> Originally Posted by *MojoW*
> 
> Are you running that CMD from the Afterburner folder?


I start the bat file with:

CD C:\Program Files (x86)\MSI Afterburner

It should load the MSI Afterburner executable. BTW, it works to open the shortcut without the extension to add +100mV.


----------



## rdr09

Just taking advantage of the slightly chilly weather . . .

13.11 1300/1500 Graphics Score: 18500

http://www.3dmark.com/3dm11/7716320

14.6 Beta 1300/1500 Graphics Score: 18800

http://www.3dmark.com/3dm11/8373308
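For the curious, the driver-to-driver gain in that pair of runs works out to about 1.6%:

```python
# Percentage gain between the two graphics scores posted above.
old_score, new_score = 18500, 18800
gain = (new_score - old_score) / old_score * 100
print(f"14.6 Beta vs 13.11: +{gain:.1f}%")
```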


----------



## heroxoot

Quote:


> Originally Posted by *DolanTheDuck*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Yeah, your GPU looks great.
> 
> I cannot get 1125/1250 on air, though. 14.6 pushes my GPU to 92°C at this clock, and the clock will start to throttle down to save itself. I got it voltage-stable at +56mV core and +31mV aux (not sure if the aux is helping), but it was throttling down when it hit 92°C. So I guess 1100 core will be my max clock on air. Not so bad, I guess.
> 
> 
> 
> Lol, you're overclocking like a madman and I'm just at stock, 2 fps behind you.

There is nothing mad about it. If my CPU wasn't holding me back, I would be 5 fps ahead of you. FYI, 947MHz or so is stock; 1000 is an overclock for the 290. Congrats, your GPU is overclocked.
Quote:


> Originally Posted by *rdr09*
> 
> Just taking advantage of the slightly chilly weather . . .
> 
> 13.11 1300/1500 Graphics Score: 18500
> 
> http://www.3dmark.com/3dm11/7716320
> 
> 14.6 Beta 1300/1500 Graphics Score: 18800
> 
> http://www.3dmark.com/3dm11/8373308


You're crazy but I love the score.


----------



## Sgt Bilko

Quote:


> Originally Posted by *heroxoot*
> 
> There is nothing mad about it. If my CPU wasn't holding me back, I would be 5 fps ahead of you. FYI, 947MHz or so is stock; 1000 is an overclock for the 290. Congrats, your GPU is overclocked.
> You're crazy but I love the score.


1000/1300 is stock for a Tri-X.

And yes, even in graphics-heavy benches, AMD chips have a performance penalty.

Anyone who didn't already know that is an idiot


----------



## phantomowl

I'm having problems with AMD's latest driver.

I installed it, and after reboot there was just a black screen and the "no signal" prompt from my monitor. I've been having this problem since 14.4 WHQL.

And the weird part is, after the Windows logo it goes black, and you can hear it reach the Windows desktop.


----------



## rdr09

Quote:


> Originally Posted by *heroxoot*
> 
> There is nothing mad about it. If my CPU wasn't holding me back, I would be 5 fps ahead of you. FYI, 947MHz or so is stock; 1000 is an overclock for the 290. Congrats, your GPU is overclocked.
> You're crazy but I love the score.


Wait till Homecinema gets a hold of this driver.
Quote:


> Originally Posted by *phantomowl*
> 
> I'm having problems with AMD's latest driver.
> 
> I installed it, and after reboot there was just a black screen and the "no signal" prompt from my monitor. I've been having this problem since 14.4 WHQL.
> 
> And the weird part is, after the Windows logo it goes black, and you can hear it reach the Windows desktop.


That happened to me when I installed 14.4 with the "help" of DDU. If you are using HDMI, try unplugging it and plugging it back in; it might work with other ports too, but I'm not sure. If you get it back up, then use 14.6 Beta. Not sure if it will work for you, but all I did was . . .

1. Download 14.6 Beta and use it to uninstall the current driver (I used Express). Reboot.
2. Run it again (14.6) and install it. Again, I used Express.

BTW, I'm still using Win7.


----------



## mojobear

Quote:


> Originally Posted by *phantomowl*
> 
> im having problems with the amds latest driver
> 
> i installed itt and after reboot there was just black screen and the "no signal" prompt from my monitor pops out. im having this problem since 14.4whql.
> 
> and the weird part is after the windows logo it goes black and you can hear it reach the windows desktop.


I had the same problem when I went to 14.6 with Eyefinity . . .

Had to do a system restore to get things working again.


----------



## mojobear

In other news . . . for those of you who don't have Battlefield 3, it's on the house on Origin until June 3rd! FFREEEEEEEEEEEE!!


----------



## heroxoot

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> There is nothing mad about it. If my CPU wasn't holding me back I would be 5fps ahead of you. FYI, 947mhz or so is stock. 1000 is a overclock to the 290. Congratz your GPU is overclocked.
> You're crazy but I love the score.
> 
> 
> 
> 1000/1300 is stock for a Tri-X
> 
> And yes, even in graphics heavy benches AMD chips have a performance penalty.
> 
> Anyone who didnt already know that is an idiot
Click to expand...

I think you have confused stock and factory overclock. 1000/1300 is a factory overclock. Reference design clocks are lower and considered stock.


----------



## Sgt Bilko

Quote:


> Originally Posted by *heroxoot*
> 
> I think you have confused stock and factory overclock. 1000/1300 is a factory overclock. Reference design clocks are lower and considered stock.


I consider whatever the clocks were when it left the factory to be stock.

For the ref design it's 947/1250, for my cards it's 980/1250, and for the Tri-X it's 1000/1300.


----------



## phantomowl

Quote:


> Originally Posted by *rdr09*
> 
> wait till Homecinema gets a hold of this driver.
> that happened to me when i installed 14.4 with the "help" of DDU. If you are using HDMI. try unplugging and plugging it back. might work with other ports but not sure. If you get it back up, then use 14.6 Beta. Not sure if it will work for you but all I did was . . .
> 
> 1. Download 14.6 Beta and used it to uninstall the current driver.I used Express. Reboot.
> 2. Ran it again (14.6) and installed it. Again, used Express.
> 
> BTW, I'm still using Win7.


I'm still using Windows 7 too.

I uninstalled the driver with DDU too. Is there any option to uninstall your current driver in 14.6?

I already tried the unplugging-the-HDMI method and it's no good.


----------



## rdr09

Quote:


> Originally Posted by *phantomowl*
> 
> im still using windows 7 too.
> 
> i uninstalled the driver with ddu too, is there any option to uninstall your current driver on the 14.6?
> 
> i already tried the unpluging of hdmi method and it's no good.


Every AMD driver, AFAIK, has an option to uninstall the current or existing driver. Run it and you'll be given a choice whether to uninstall or install.

Skipping DDU worked for me. 14.4 was the only time I used DDU and that will be the last time.


----------



## Roboyto

Quote:


> Originally Posted by *phantomowl*
> 
> im having problems with the amds latest driver
> 
> i installed itt and after reboot there was just black screen and the "no signal" prompt from my monitor pops out. im having this problem since 14.4whql.
> 
> and the weird part is after the windows logo it goes black and you can hear it reach the windows desktop.


Quote:


> Originally Posted by *phantomowl*
> 
> im still using windows 7 too.


Quote:


> Originally Posted by *rdr09*
> 
> BTW, I'm still using Win7.


I still think 13.12 is the only viable option for Win7. I briefly tried every release from 14.1 through 14.4 while running Win7, and they each lasted an hour at most before I saw some kind of issue.

I just made the switch to 8.1 Pro last night and am running 14.6 without issue in Windows so far with 5760x1080 Eyefinity. Still need to load up a game or two and see where that goes . . . will be back with more info on that.

Quote:


> Originally Posted by *rdr09*
> 
> every amd driver, afaik, has an option to uninstall current or existing driver. Run it and you'll be given a choice whether to uninstall or install.
> 
> Skipping DDU worked for me. 14.4 was the only time i used DDU and that will be the last time.


Can always use the AMD uninstall utility: http://support.amd.com/en-us/kb-articles/Pages/AMD-Clean-Uninstall-Utility.aspx

I only had an issue with DDU one time. I had to go through Add/Remove Programs to uninstall CCC/drivers and it solved the problem.


----------



## bluedevil

It's starting to come together.


----------



## phantomowl

OK, I'll try them. Thanks.

The last driver I used was 14.3, and I think I'll roll back if your suggestion doesn't work.

Thanks!

Quote:


> Originally Posted by *mojobear*
> 
> i had the same problem when I went to 14.6 with eyefinity...
> 
> Had to do a system restore to get things working again.


I reinstalled Win 7 last night; will that work?


----------



## heroxoot

The 14.6 driver nets me 51.7fps at the default 1030 clock and 53.8 at 1100 in Heaven, but the heat is just too much: 90C with my max OC during any kind of stress, 85C tops without an OC. BF4 runs smoothly enough on default clocks, so I think I'm going to revert to stock clocks for now. I'm netting good FPS in BF4 on default though; before 14.6 I could hardly keep 70fps, and now I see 60fps minimum and it goes as high as 120 in certain areas. The clock never likes to sit at exactly one speed on default, but seeing mostly 70-80fps is good enough for me.

It's a shame though; I could get 1125 at +51mV, but the heat causes clock throttling.


----------



## bak3donh1gh

Anyone know why my card black screens when my core voltage is set at +50mV? Also, anyone know why my boot time has increased so much? I'm running an SSD, so at first I thought it was hanging.


----------



## bak3donh1gh

Quote:


> Originally Posted by *heroxoot*
> 
> 14.6 driver nets me 51.7fps .


What was your score on 14.4?


----------



## heroxoot

Quote:


> Originally Posted by *bak3donh1gh*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> 14.6 driver nets me 51.7fps .
> 
> 
> 
> what was your 14.4?
Click to expand...

About 48fps in Heaven, WITH an overclock. My BF4 performance is about the same with either driver under Mantle.


----------



## Mega Man

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Gameworks.....
> 
> http://www.forbes.com/sites/jasonevangelho/2014/05/26/why-watch-dogs-is-bad-news-for-amd-users-and-potentially-the-entire-pc-gaming-ecosystem/
> 
> Was very sad when i first read about this


Quote:


> Originally Posted by *mojobear*
> 
> In other news...for those of you who dont have battlefield 3...its on the house on origin until June 3rd! FFREEEEEEEEEEEE!!


Meh, nothing new for Nvidia.

Sad, really. Instead of pushing innovation they just hold back the market.

With that said, FreeSync is part of the monitor standards (IIRC DP 1.2), not even AMD's . . . and Nvidia is just ripping it off and making it proprietary rather than using the built-in one . . . because theirs is "better".


----------



## Enzarch

Quote:


> Originally Posted by *bak3donh1gh*
> 
> anyone know why when I black screen my core voltage is set at +50mv? Also anyone know why my boot time has increased to so much im running a ssd so at first I thought it was hanging.


I have this same issue; mine will black screen at >+80mV, but ONLY at 120Hz refresh, and I see you have a 120Hz display.

I do my benchmark runs @ 60Hz and can run +200mV.

I have tried many drivers and a couple of vBIOSes. Nothing seems to change this behavior, unfortunately.

See if you have similar results at 60Hz; I have seen a few other reports of the same thing.


----------



## imadorkx

Hey guys, I need some clarification. I just bought a Seasonic X Series 650W. I was thinking the PCIe cable for the GPU has two 6+2 pin connectors.

Is it dangerous if I only use one cable that has the 6+2 pin connectors?

It looks like this, but mine has 6+2 pin to 6+2 pin instead.


----------



## rdr09

Quote:


> Originally Posted by *imadorkx*
> 
> Hey guys. I need some clarification. I just bought a seasonic x series 650w. i was thinking the pcie cable for gpu has two 6+2pin.
> 
> Is it dangerous if i only use one cable that has 6+2pin.
> 
> it look like this but mine it has 6+2pin to 6+2pin instead.


The GPU should have 2 connectors - one 6-pin and one 8-pin (6+2). Each needs a separate power line coming from the PSU. That adapter will work for one connector.


----------



## imadorkx

Quote:


> Originally Posted by *rdr09*
> 
> the gpu should have 2 connectors - 1/6-pin and 1/8 pin (6+2). each needs separate power line coming from the psu. that adapter will work for one connector.


Thanks mate. Much appreciated.







+1 Rep.


----------



## chronicfx

Quote:


> Originally Posted by *rdr09*
> 
> the gpu should have 2 connectors - 1/6-pin and 1/8 pin (6+2). each needs separate power line coming from the psu. that adapter will work for one connector.


Are you sure about this? My EVGA 1300W G2 comes with those cables too, the kind that splits into a 6+2 pin and a 6 pin, and a single one of those cables powered my 290X just fine. I'm pretty sure that's what they are for. You just plug them in and run one single cable from the PSU with less clutter.


----------



## Sgt Bilko

Quote:


> Originally Posted by *chronicfx*
> 
> Are you sure about this? My evga 1300w g2 comes with those cables too that are 6+2 split to a 6 pin and a single one of those cables from the psu powered my 290x just fine. I am pretty sure that is what they are for. You just plug em in and run one single cable from the psu with less clutter.


As did my Silverstone Strider.


----------



## rdr09

Quote:


> Originally Posted by *chronicfx*
> 
> Are you sure about this? My evga 1300w g2 comes with those cables too that are 6+2 split to a 6 pin and a single one of those cables from the psu powered my 290x just fine. I am pretty sure that is what they are for. You just plug em in and run one single cable from the psu with less clutter.


Now I am not sure. If this is what ima has . . .



I would use 2. Maybe shilka is the right person to ask.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> now i am not sure. if this is what ima has . . .
> 
> 
> 
> i would use 2. maybe shilka is the right person to ask.


I've clocked my cards fairly high using just a single 6-pin to 6+2 & 6-pin cable.

I'd think it would be perfectly fine; otherwise we'd need 4 of those connectors just for CrossFire.

I understand splitting it for full-on LN2 benching, but for daily use and some air/water benching it would be fine IMO.

Of course, it's always nice to have an educated opinion on things like this though . . . shilka around anywhere?


----------



## chronicfx

Mention buying a Corsair.


----------



## imadorkx

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I've clocked my cards fairly high when just using a single 6 pin to 6+2 & 6 pin connector.
> 
> I'd think it would be perfectly fine otherwise we'd need 4 of those connectors just for crossfire.
> 
> i understand splitting it if they were full on LN2 benching but just for daily and some air/water benching then it would be fine imo
> 
> of course it's always nice to have an educated opinion on things like this though.....shilka around anywhere?


I'd better ask shilka first, because my case is already tight as it is. Another cable is so much hassle for cable management, and I'm so eager to OC my R9 290.


----------



## Sgt Bilko

Quote:


> Originally Posted by *imadorkx*
> 
> I better as shilka 1st because me case is already tight as it is. with another cable so much hassle for cable management and im so eager to OC my r9 290


I've had mine up to 1250/1550 on both cards with no hassles, but more info is always nice.









rdr09 has had his at 1300/1625 though, so I'm not sure if this has something to do with it or if it's just silicon lottery.


----------



## imadorkx

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I've had mine up to 1250/1550 on both cards with no hassles but more info is always nice
> 
> 
> 
> 
> 
> 
> 
> 
> 
> rdr09 has had his at 1300/1625 though so not sure if this may have something to do with it or just silicon lottery


Yeah, better safe than sorry. Will update more info after Shilka reply my PM.


----------



## bluedevil

So now that my 290 is Red Modded under water, should I crank the VDDC offset to max and see what it will do?


----------



## sena

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Your right, for a 1100 Mhz Core OC you should be able to do that without any voltage bump. It'll just higher temps.


My cards are really bad; they need +100mV (maybe a little less) for 1100 MHz. ASIC 70% and 72%.


----------



## cplifj

Quote:


> Originally Posted by *Elmy*
> 
> Worst analogy ever....
> 
> With that 120 radiator temps sit at 70c on full load thats in my rig.
> 
> And if the watercooling system fails on a 295X2 it should stop working....
> 
> It is an acclaimed engineering marvel by almost every reviewer on the planet.
> 
> I would take their word over yours any day of the week.


Yeah, sure, an engineering wonder that is only wonderful on paper; the thing that gets sold as wondrous is obviously flawed in many ways.

All judging by the users' experiences with this card. The internet is full of complaints about it.

A fact no marketing monkey can overcome.

Are people really so simple that they "buy" into everything as long as it gets repeated enough????

Sure, AMD did the impossible: they sold peas for pearls.

The saddest thing is that most people seem to accept this way of business, like they never had to work for their money and it's all coming for free.

Forever strong in finding ways to defend something that is flawed. Period.

Or how every new generation thinks they will re-invent hot water. I'm sure marketing people hate old and experienced-in-life people.


----------



## sinnedone

Quote:


> Originally Posted by *bluedevil*
> 
> So now that my 290 is Red Modded under water, should I crank the VDDC offset to max and see what it will do?


Why not









As long as those VRM temps stay in check, it's always nice to see. Do you game? I ask because even when I bench with no artifacts, the true test of a stable overclock is prolonged gaming sessions.


----------



## bluedevil

Quote:


> Originally Posted by *sinnedone*
> 
> Why not
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As long as those vrm temps stay in check, its always nice to see. Do you game? I ask because even though usually when I bench and there are no artifacts the true test is prolonged gaming sessions of a stable overclock.


Been playing AC: Black Flag and BF4 lately.


----------



## sinnedone

Then go for it.

The only thing is that it's such a process. Play with voltages/benchmarks/clocks for hours, then finally get to gaming only to freeze or dump the driver. Then you have to drop your clocks a few times while your games freeze. Ugh.


----------



## Sgt Bilko

Quote:


> Originally Posted by *cplifj*
> 
> yeah sure an engineering wonder that only is wonderfull on paper, the thing that gets sold as wonderous is obviously flawed in many ways.
> 
> all judging by the Users Experiences with this card. The internet is full off complaints about it.
> 
> a faqt no marketing monkey can overcome.
> 
> are people really so simple they "buy" into everything aslong as it gets repeated enough ????
> 
> sure amd did the impossible , they sold peas for Pearls.
> 
> the sadest thing is that most people seem to accept this way of bussines, like they never had to work for their money and its all comming for free.
> 
> for ever strong in finding ways to defend something that is flawed. period.
> 
> or how every new generation thinks they will re-invent hot water. i'm sure marketing people hate old and experienced in life people.


I assume you are talking about the 295X2. You realise this is the 290/290X club, right?

Wrong thread, mate . . . at least get that right.

There is nothing wrong with these cards. Stop trolling, please?


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> So now that my 290 is Red Modded under water, should I crank the VDDC offset to max and see what it will do?


Hell yeah! Could you show some pics?

BTW, I use a +200 offset in Trixx. Just for benches, though.


----------



## imadorkx

Is this normal, or is something wrong with my R9? The GPU load is messed up.

BTW, I was playing Witcher 2 for half an hour and checked GPU-Z, and it showed me this. Is this normal or is something wrong?


----------



## Sgt Bilko

Quote:


> Originally Posted by *imadorkx*
> 
> is this normal or something is wrong with my r9? the gpu load are messed up.
> 
> btw, im playing wither 2 half an hour and check gpu-z and it show me this. is this normal or something is wrong.


Core clocks are constant.

Do you have MSI Afterburner installed or AMD overdrive enabled?


----------



## imadorkx

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Core clocks are constant.
> 
> Do you have MSI Afterburner installed or AMD overdrive enabled?


Yeah, I enabled OverDrive in RadeonPro and set "always use highest performance clocks while gaming".


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> hell yeah! could you show some pics?
> 
> btw, i use +200 offset in Trixx. just for benches, though.


Here's a preliminary.


----------



## Sgt Bilko

Quote:


> Originally Posted by *imadorkx*
> 
> yeah i enabled overdrive in RadeonPro and set "always use highest performance clocks while gaming".


Do you have the power limit set to 50%?

Core clocks are constant, but the voltage and usage are all over the place . . . might be The Witcher 2 though, I haven't played it for a while.


----------



## imadorkx

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Do you have the powerlimit set to 50%?
> 
> core clocks are constant but the voltage and usage is all over the place......might be the witcher 2 though, haven't played it for a while though.


Yeah, I already set the power limit to 50%, and I'm playing at 1080p on the Ultra setting with ubersampling enabled, but I set ubersampling=1 in the config file.

Got 45-59fps in Flotsam town and 55-60fps in the jungle. Just started playing it again.

I'm using the 14.4 driver.


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> Here's a preliminary.


Nice. I'd suggest putting a heatsink on VRM2 prior to OC'ing, just to be safe, unless you have already.


----------



## Sgt Bilko

Quote:


> Originally Posted by *imadorkx*
> 
> yeah, already set powerlimit to 50% and im playing at 1080p with Ultra setting+ ubersampling enabled but i set ubersampling=1 in config file.
> 
> got 45-59fp in flotsam town and 55-60fps in the jungle. just started playing it again.
> 
> im using 14.4 driver.


I'd say it's just the game then. So long as the fps is constant and not choppy, there's no reason to worry too much.


----------



## imadorkx

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'd say it's just the game then, so long as the fps is constant and not choppy then no reason to worry too much


Oh, okay then. Thanks mate. +1 rep.

BTW, Shilka replied to my PM, and I quote: "The cable is rated for far more power then your video card could pull so one cable is more then enough".
So one cable is enough.
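For what it's worth, the spec-level arithmetic backs that answer up. A quick sketch using the PCIe CEM spec figures (the variable names are just for illustration):

```python
# PCIe power-delivery figures from the PCI Express CEM specification:
SLOT_W = 75        # delivered through the x16 slot itself
SIX_PIN_W = 75     # 6-pin auxiliary connector
EIGHT_PIN_W = 150  # 8-pin (6+2) auxiliary connector

# A reference 290/290X takes one 6-pin and one 8-pin, so a single
# daisy-chained cable carries at most the two connectors' spec load:
cable_w = SIX_PIN_W + EIGHT_PIN_W   # 225 W over the one cable
total_w = SLOT_W + cable_w          # 300 W total spec budget for the card

print(cable_w, total_w)  # 225 300
```

So even at the connector spec ratings, one daisy-chained cable only has to carry 225 W, which is why a properly rated cable handles it fine.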


----------



## Sgt Bilko

Quote:


> Originally Posted by *imadorkx*
> 
> Oh, Okay then. Thanks mate. +1 rep
> 
> Btw, Shilka replied my PM and i quote this , he said "The cable is rated for far more power then your video card could pull so one cable is more then enough".
> So one cable is enough.


No worries

Always good to have confirmation though


----------



## chronicfx

Quote:


> Originally Posted by *imadorkx*
> 
> yeah, already set powerlimit to 50% and im playing at 1080p with Ultra setting+ ubersampling enabled but i set ubersampling=1 in config file.
> 
> got 45-59fp in flotsam town and 55-60fps in the jungle. just started playing it again.
> 
> im using 14.4 driver.


I just finished Witcher 2 about a month ago. I had very smooth gameplay, 70-90fps at 1440p, with three 290Xs and a 3570K at 5.0GHz. Keep vsync off and your CPU overclock high.

And I don't normally use full AA at 1440p. I forget if there is an option, but if it was 1, 2, 4 or 8, I would have used 2. You can use 4, I guess, to match at 1080p.


----------



## Learath2

I did make a thread of my own, but no one seems to have noticed it.







So my R9 290X totally randomly causes a BSOD. I've checked the RAM, as that was the cause last time. However, this time I get 0x1000007E; it seems to happen while watching YouTube videos. I know the Flash player might cause a BSOD, but it's the HTML5 player I am using. I'm using the 14.6 beta drivers in hopes that it might help.

The beginning of the GPU-Z log is from the Heaven bench; the rest is totally idle. There were some random clock drops, which caused the stutters and FPS drops in Heaven. The card seems to be stable under heavy load but causes a stupid BSOD at the most unrelated times.


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> No worries
> 
> Always good to have confirmation though


indeed.


----------



## tweb321

Is there any way to force downsampling with AMD cards? I'm playing Watch Dogs with the supersampling .ini tweak and it looks great for the most part, but there are a few issues: it doesn't display the hacking minigame correctly, and water reflections can get distorted. The GeForce Watch Dogs guide states that forcing downsampling through the Nvidia control panel gave better performance without those issues. How would I do this with the R9 290?


----------



## AmcieK

My XFX R9 290 DD.
Core clock: 1015
Memory clock: 1300
Drivers: 14.6

http://i.imgur.com/fgjLbNh.jpg
http://i.imgur.com/FZE84J9.jpg

On 14.4 my max fps was more than 100. After installing 14.6 it's synced to 60Hz, but I don't know why . . . vsync is off in Unigine and CCC ;/


----------



## heroxoot

Quote:


> Originally Posted by *AmcieK*
> 
> My XFX r9 290 DD.
> CC 1015
> Mc 1300
> Drivers 14.6
> 
> http://i.imgur.com/fgjLbNh.jpg
> http://i.imgur.com/FZE84J9.jpg
> 
> On 14.4 my max fps was more then 100 . After instal 14.6 have synchro on 60 Hz but know why ... Its off on unigine and ccc ;/


I would assume you do have Vsync on. There are at least 2 spots in Heaven where everyone gets a huge FPS jump; one is exiting the castle and going over the wall to the bridge, where the fps jumps to 90 at least. So I would check whether Vsync got turned on somewhere, because 59fps is too exact a number. Check in CCC under Gaming.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *tweb321*
> 
> Is there any way to force downsampling with amd cards? I'm playing watch dogs with the supersampling ini tweak and it looks great for the most part but there are a few issues. Its doesn't display the hacking minigame correctly and water reflections can get distorted. In the geforce Watch Dogs guide they state that forcing downsampling through the nvidia control panel had better performance without the issues. How would I do this with the r9 290?


Lots of information here: http://forums.guru3d.com/showthread.php?t=366244


----------



## tweb321

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Lots of information here: http://forums.guru3d.com/showthread.php?t=366244


Thanks, I have seen that, but it doesn't work for me. I know it has issues with Windows 8.1, and I don't believe you can do 4K with it anyway. I was looking more for a driver-level solution. It seems so simple to do with Nvidia Inspector; I was just hoping AMD had something similar.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *tweb321*
> 
> Thanks, I have seen that but it doesn't work for me. I know it has issues with windows 8.1 and I don't believe you can do 4k with that anyway. I was looking more for a driver level solution. It seems so simple to do with Nvidia inspector I was just hoping amd had something similar.


Yeah, not sure why more games don't have a resolution scale option. Hell, even DayZ has it lol


----------



## NixZiZ

I somehow managed this OC... on air... with a stock cooler. Thing is, fan is at 100% to maintain it. Not too bad. I think I can bump it down to 80 and still be fine.

33%! Best I have ever done. Tell me if that is good or not.

http://www.techpowerup.com/gpuz/5zv7n/

EDIT: Had to bump down to 1253 for stability reasons.


----------



## VSG

How did you test for the OC stability?


----------



## NixZiZ

Still testing it. Not final, I guess. Actually, I had to bump it down to 1253 now.

OCCT is a good GPU stability tool, correct?

EDIT: Installing latest drivers.


----------



## VSG

I haven't ever used it, to be honest, so I can't vouch for it. Come to think of it, I don't see anyone using it to test GPU OC stability either. If it uses the Furmark test, stay away from it.


----------



## heroxoot

Quote:


> Originally Posted by *NixZiZ*
> 
> still testing it. Not final, I guess. Actually, I had to bump it down to 1253 now.
> 
> OCCT is a good GPU stability tool, correct?
> 
> EDIT: Installing latest drivers.


God no. Run some benchmarks with Heaven/Valley and try some games. Testing how it handles in games is worth much more than any stability test.


----------



## Arizonian

Quote:


> Originally Posted by *NixZiZ*
> 
> I somehow managed this OC... on air... with a stock cooler. Thing is, fan is at 100% to maintain it. Not too bad. I think I can bump it down to 80 and still be fine.
> 
> 33%! Best I have ever done. Tell me if that is good or not.
> 
> http://www.techpowerup.com/gpuz/5zv7n/
> 
> EDIT: Had to bump down to 1253 for stability reasons.


Now bench it.


----------



## Radmanhs

I am trying to overvolt my 290 with Afterburner, but I'm getting confused. I created the document and put in the two lines that are right under "write" in the original post. How do I save it as a .bat file? It's been such a long time since I've done that that I forgot. Also, these:

-

"For 50mv: 8
For 100mv: 10
For 125mv: 14
For 150mv: 18
For 175mv: 1C
For 200mv: 20"

Do I just change the last number on the lines in the text document to the value corresponding to the offset I want?
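In case it helps anyone sanity-check that table: the hex values quoted are just the millivolt offset divided by a 6.25 mV step, written in hexadecimal. That step size is inferred from the table itself, so verify against the original guide before applying anything; `offset_to_hex` below is a hypothetical helper for checking the numbers, not part of Afterburner.

```python
def offset_to_hex(mv):
    """Convert a millivolt offset to the hex value used in the .bat lines,
    assuming a 6.25 mV step (consistent with the table in the post)."""
    steps = mv / 6.25
    if steps != int(steps):
        raise ValueError("offset must be a multiple of 6.25 mV")
    return format(int(steps), "X")

# Reproduce the table from the post:
for mv in (50, 100, 125, 150, 175, 200):
    print(f"{mv} mV -> {offset_to_hex(mv)}")
```

Running it prints the same pairs as the quoted table (50 mV -> 8 . . . 200 mV -> 20), which is a quick way to check a value you don't see listed.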


----------



## hokochu

Anybody else play Skyrim with CrossFire 290Xs? I have mine modded with like 250+ mods. It was running pretty well with the last official drivers, but I think I've noticed considerably more framerate variance since the new beta drivers were released. Anybody else notice this problem?


----------



## kpoeticg

I just got my 2nd 290X back from RMA today, and I'll be ordering my 3rd in a couple of weeks. I played Skyrim on my laptop for a few months when it first came out, but since I started this build I've been saving it for my Tri-CFX setup. Can't wait to get back into it.


----------



## Yvese

How hard is it for you guys to hit 1200? I just tried +100mV (the max MSI AB allows me to add) but get tons of artifacts in Heaven. I'm not going to use that speed for gaming, but I was curious how far it could go.

My card is pretty good though. 1150 core only needs +31mV, and stock voltage can do 1100 np.







Was hoping it could continue being awesome and hit 1200 without adding too much, but alas.


----------



## braincracking

^ Be careful in games. Mine can do 1150 in benches on that voltage no problem, but play some really demanding games and I need more. Maybe my cards aren't the best, though.


----------



## NixZiZ

Quote:


> Originally Posted by *heroxoot*
> 
> God no. Run some benchmarks with Heaven/valley, and try some games. Testing how to handles in games is much more than any stability test.


Ah, OK. I'll download Valley then.
Quote:


> Originally Posted by *geggeg*
> 
> I haven't ever used it to be honest so I can't vouch for it. Come to think of it, I don't see anyone using it to test GPU OC stabilities either. If it uses the Furmark test, stay away from it.


It does use the Furmark test. I guess that is bad? I'll try with Valley. I noticed that it does not artifact at 1253 in Minecraft (OK, not a demanding game, I know), but it throws out tons of errors in OCCT.
Quote:


> Originally Posted by *Arizonian*
> 
> Now bench it.


Ok. I will use Valley and Catzilla.

Quote:


> Originally Posted by *Yvese*
> 
> How hard is it for you guys to hit 1200? I just tried +100mv ( max MSI AB allows me to add ) but get tons of artifacts in Heaven. I'm not going to use that speed for gaming but was curious how far it could go.
> 
> My card is pretty good though. 1150 core only needs +31mv. Stock voltages can do 1100 np
> 
> 
> 
> 
> 
> 
> 
> . Was hoping it could continue being awesome and hit 1200 without adding too much, but alas


Try GPU tweak. Seems pretty ok now.


----------



## imadorkx

Quote:


> Originally Posted by *chronicfx*
> 
> I just finished witcher 2 about a month ago I had very smooth gameplay 70-90fps 1440p with three 290x and a 3570k at 5.0ghz. Keep vsync off and your cpu overclock high.
> 
> And i don't normally use full AA with 1440p I forget if there is an option but if it was 1,2,4 or 8 i would have used 2. You can use 4 i guess to match on 1080p


Yeah, if I OC my i5 2500K higher than 4.5GHz it needs more than 1.3V to be stable, though. Right now it's sitting between 1.3V at idle and 1.28V on load.

My framerate was about 50fps-ish, so it's okay; at least it's not choppy. I was worried because my GPU load was a mess.

Thanks though.


----------



## imadorkx

Quote:


> Originally Posted by *NixZiZ*
> 
> Ah, ok. I'll download valley then.
> It does use the furmark test. I guess that is bad? I'll try with Valley. I noticed that it does not artifact at 1253 in minecraft (Ok, not a demanding game, I know), but it throws out tons of errors in OCCT.
> .


Furmark really stresses your GPU beyond belief. Most of the time I read that it's not that good for the GPU because it increases your temps too much. I always do the Metro: Last Light benchmark for a stability test if I'm lazy, and just play games if I'm bored.

Playing games is considered a stability test too.

Make sure it's a graphics-intensive game.


----------



## Yvese

Quote:


> Originally Posted by *braincracking*
> 
> ^ be careful in games, mine can do 1150 in benches on that voltage no problem, but play some really demanding games and I need more. Maybe my cards aren't the best do


I actually did some spectator BF4 tests (64p, 1k tickets, Op Locker) and there were no issues.








Quote:


> Originally Posted by *NixZiZ*
> 
> Try GPU tweak. Seems pretty ok now.


I was hoping there was a way to increase it in MSI AB


----------



## NixZiZ

Benchmarks with my OC. Stable, pretty decent I guess.

GPU-Z validation:
http://www.techpowerup.com/gpuz/aeras/ (30%!)

Catzilla bench photo:


Spoiler: Warning: Spoiler!







Valley bench:


Spoiler: Warning: Spoiler!


----------



## kpoeticg

Here's a proof screenshot for my 2nd 290x. Everything's at stock except fan profiles and 3D Application Settings. It's a reference Powercolor 290x (Like my other/1st card) with stock/reference coolers on both. I have the Kryographics blocks + Passive Backplates for both, just haven't attached em yet...


----------



## phantomowl

I'm having this problem with the latest drivers. It says that I'm on a mobile PC display . . .

I can't disable it, and even if I do, the black screen thing will happen again.


----------



## Radmanhs

Hey guys, just wondering: how will I know if the voltage boost is working?


----------



## Forceman

Quote:


> Originally Posted by *Radmanhs*
> 
> hey guys, just wondering how will I know if the voltage boost is working?


Easiest way is to check the before and after voltage in GPU-Z. It's applied to the idle voltage too, so it's easy to check.


----------



## Radmanhs

Should the .bat file stay open? Because when I click on it, the window flashes up for a second and then goes away, and I don't notice any difference in GPU-Z.


----------



## Forceman

Quote:


> Originally Posted by *Radmanhs*
> 
> Should the .bat file stay open? because when I click on it the file flashes up for a second then goes away and I don't notice any difference in gpuz


What .bat file? You using Afterburner to overclock?


----------



## Radmanhs

Yeah, I read in the OP that this is one of the ways to boost the voltage. For some reason I'm unable to adjust it at all. I checked the options to control/monitor voltage in the settings menu, but the option didn't "light up" on the main menu.


----------



## Arizonian

Quote:


> Originally Posted by *kpoeticg*
> 
> Here's a proof screenshot for my 2nd 290x. Everything's at stock except fan profiles and 3D Application Settings. It's a reference Powercolor 290x (Like my other/1st card) with stock/reference coolers on both. I have the Kryographics blocks + Passive Backplates for both, just haven't attached em yet...
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated
















Quote:


> Originally Posted by *NixZiZ*
> 
> Benchmarks with my OC. Stable, pretty decent I guess.
> 
> GPU-Z validation:
> http://www.techpowerup.com/gpuz/aeras/ (30%!)
> 
> Catzilla bench photo:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Valley bench:
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## kpoeticg

Thanx bro









When it died on me last time, I'd been running Firestrike and Valley for a couple of days pushing both my GPUs to test them. I decided that the next day when I woke up I was gonna put them under water. Then at like 4am, right before I was gonna go to sleep, the GPU just DIED out of nowhere lol. That's the only reason they're not swimming yet.









Guess it wasn't that bad though, because PowerColor repaired it instead of replacing it. It's running great now. I always thought that with GPUs, dead = dead. My mobo wouldn't even recognize that there was anything in the PCIe slot.


----------



## DeadlyDNA

Am I wrong for saying people buying the R9 295X2 are insane? Aside from saving a PCIe slot, and maybe a lack of slots, what's the point?
290X ex-miner cards are insanely cheap, and even two new ones cost less.


----------



## blue1512

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Am I wrong for saying people buying r9 295x2 are insane? Aside from saving a slot for pcie and maybe lack of slots whats the point?
> 290x ex miner cards are insanely cheap or even new ones cost less in quantity of 2????


The 295X2 is the absolute solution now: fast, cool and quiet. You can even save $1500 for a trip to Hawaii compared to a certain dual-GPU card from nVidia.


----------



## kizwan

Quote:


> Originally Posted by *kpoeticg*
> 
> I just got my 2nd 290x back from rma today and i'll be ordering my 3rd in a cpl weeks. I played Skyrim on my laptop for a few months when it first came out, but since i started this build i've been saving it for my Tri-CFX setup. Can't wait to get back into it


Will three 290Xs be your max, or are you going quad in the future? Now I'm thinking of getting a 3rd card too.








Quote:


> Originally Posted by *Yvese*
> 
> How hard is it for you guys to hit 1200? I just tried +100mv ( max MSI AB allows me to add ) but get tons of artifacts in Heaven. I'm not going to use that speed for gaming but was curious how far it could go.
> 
> My card is pretty good though. 1150 core only needs +31mv. Stock voltages can do 1100 np
> 
> 
> 
> 
> 
> 
> 
> . Was hoping it could continue being awesome and hit 1200 without adding too much, but alas


Use Trixx. It can do +200mV.
Quote:


> Originally Posted by *braincracking*
> 
> ^ be careful in games, mine can do 1150 in benches on that voltage no problem, but play some really demanding games and I need more. Maybe my cards aren't the best though


That is usually the case.


----------



## kpoeticg

I was originally planning on quads when the 290Xs were first coming out. Then I watched that comparison video Paul from Newegg did showing the actual performance gains from single => dual => triple => quad. Even the gains for triple aren't really worth the money of an extra card, but I already bought my RIVE BE. Wouldn't feel right going with a 2011 build and not using the extra lanes.









I'm probably not gonna go quads though, unless I decide I wanna get into mining maybe.

Edit: Anybody know why AB is causing 0x0000007E atikmdag.sys blue screens for me? It's a fresh Windows install with 14.4, CFX, ULPS disabled in the registry, and it's happening both with Unofficial With PowerPlay and Without PowerPlay support (1 & 2 in the cfg file).
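For anyone following along, Afterburner's unofficial overclocking switch lives in MSIAfterburner.cfg. The fragment below is a sketch from memory, so the section name, key names and especially the exact EULA wording may differ between AB versions; copy the strings from your own cfg rather than from here:

```ini
[ATIADLHAL]
; The EULA line must be typed exactly as Afterburner expects it, in full.
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
; Per the post above, 1 and 2 select the "with PowerPlay" / "without PowerPlay"
; modes; 0 disables unofficial overclocking.
UnofficialOverclockingMode = 1
```

Afterburner has to be closed while you edit the file, or it will overwrite your changes on exit.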


----------



## Unknownm

Quote:


> Originally Posted by *bluedevil*
> 
> It's starting to come together.


Wait, don't you need VRAM heatsinks?


----------



## kpoeticg

Oops, I think I figured it out. I guess it's important to install the chipset driver after all...


----------



## blue1512

^^ Basically, when you're not allowed to alter the memory voltage, you don't really need to cool the memory chips. They will run a bit warmer though.


----------



## Forceman

Quote:


> Originally Posted by *Radmanhs*
> 
> ya, i read on the op that this is one of the ways to boost the voltage. For some reason I am unable to adjust it at all. I checked the options to control/monitor voltage on the settings menu but the option didn't "light up" on the main menu


Are you using the 3.0.0 version of Afterburner?


----------



## Dasboogieman

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Am I wrong for saying people buying r9 295x2 are insane? Aside from saving a slot for pcie and maybe lack of slots whats the point?
> 290x ex miner cards are insanely cheap or even new ones cost less in quantity of 2????


Not necessarily. You've got to factor in the added cost of the WC system, and the as-yet-unknown frame latency benefits of having the card work over a single PCIe slot instead of shuffling data between two slots.

It's actually not a bad solution for heavily space-constrained systems, since it's easier and quieter to find a mount point for a single 120mm rad in smaller cases than it is to install enough powerful fans to extract the hot air from two air-cooled 290X cards.

That being said, for that cost I'd personally go for two 290X cards and a custom WC loop, and it would probably come to about $1500.
The biggest weakness of the 295X2 design is the VRM assembly and cooling. There is an INSANE amount of heat density within a very small area, plus there are actually FEWER VRM phases than the ideal amount (10 vs 12). This adds up to a huge amount of heat output within a small area with only a small air-cooled HSF and little ventilation. It might have been better for AMD to team up with EK for a custom block for the PCB and then approach Asetek for a sealed solution using that block. Then at least the VRMs wouldn't cook, and the card would be pretty much completely silent.


----------



## boldenc

Anyone using a Sapphire R9 290/290X Tri-X with the Cooler Master CM Storm Trooper case? Does the card fit well?


----------



## Sgt Bilko

Quote:


> Originally Posted by *boldenc*
> 
> anyone using Sapphire R9 290/x Tri-X with Cooler Master CM Storm Trooper case? Does the card fit well?


I have two XFX DD cards and they fit fine. I also had a reference 290X with an Accelero Xtreme III cooler on it and it fit, so a Tri-X will be fine.

EDIT: max GPU length in the Storm Trooper is 322mm, and the Tri-X dimensions are 305(L) x 113(W) x 38(H) mm.

You will be fine.


----------



## Silent Scone

Does anyone know if the Tri-X 8GB 290X fits the reference EK block? Or if there is a block available for them?

Thanks


----------



## blue1512

Quote:


> Originally Posted by *Silent Scone*
> 
> Does anyone know if the Tri-X 8GB 290X fits the reference EK block? Or if there is a block available for them?
> 
> Thanks


You mean the Vapor-X, right? There is no block for it at the moment.


----------



## Nephalem

1. http://www.techpowerup.com/gpuz/fxxb5/
2. MSI R9 290 Gaming OC
3. Stock


----------



## Talon720

So I think I found what's causing my artifacting: tessellation. BF4 has gotten to the point where I can't use Mantle on any combination of cards (1, 2, 3, 1-2, 1-2-3). Just recently, with 14.6 beta, DX11 fullscreen mode started causing this too, but not borderless. I ran Unigine Valley with no problem, all three cranking away. Heaven, though, with tessellation on, caused the same artifacting as BF4; with tessellation off it ran like butter, no artifacts. I have a gut feeling it's driver-related and hopefully not a dying card.

Any time I quit BF4 and try to restart it, I get BSOD 116, which is graphics/display-related. It's just weird that this all stopped after replacing the CPU and removing and reinstalling one GPU, then came back after overclocking and a couple of BSODs. Maybe DDU is causing issues? I know on overclockers.com there was talk of AMD's installer causing issues. I don't know what to do; maybe I'll go back to the PT1 BIOS...

Do you guys think it could be my PSU (HX1050)? I had to use a molex-to-PCIe 6-pin adapter since all the PCIe cables are used; the 4-pin CPU cable uses one, which sucks. I also have the extra GPU power molex plug plugged in.


----------



## Radmanhs

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Radmanhs*
> 
> ya, i read on the op that this is one of the ways to boost the voltage. For some reason I am unable to adjust it at all. I checked the options to control/monitor voltage on the settings menu but the option didn't "light up" on the main menu
> 
> 
> 
> Are you using the 3.0.0 version of Afterburner?
Click to expand...

Looks like that fixed it. Even though I auto-checked for updates, I guess it didn't want to update... So far I've got 1140 on the core and 1400 on the memory at stock voltage; I could maybe even go higher on the memory. What's the max I should go on power limit and voltage? I have a Zalman LQ320 water cooler, so temps aren't too big of a deal; I get 63-64C max on the core.


----------



## Silent Scone

Quote:


> Originally Posted by *blue1512*
> 
> You mean VaporX, right? There is no block for it at this moment


Sorry, yes. Lazy me, I couldn't be bothered to double-check. But it's the only 8GB card, so I knew someone would know. Didn't think there was one... shame.


----------



## mkmitch

Just ordered the HIS iPower IceQ 290X through the Egg for $479.99. I've been wanting to upgrade from my 2GB 6870, and this price was too good to pass up. Much better card than the 770 I almost pulled the trigger on. Also I get 3 free games, so why not.









----------



## Arizonian

Quote:


> Originally Posted by *Nephalem*
> 
> 1. http://www.techpowerup.com/gpuz/fxxb5/
> 2. MSI R9 290 Gaming OC
> 3. Stock


Congrats - added










Thank you for the perfect submission.


----------



## disintegratorx

Sometimes I've found that the PowerPlay limit setting can cause that problem. Other than that, having your memory clocked too high can cause it. You could try turning both of those settings down until the issue stops, and make sure to restart your system before you attempt to play in 3D mode again. I've found that even restarting a couple of times is sometimes necessary. I successfully have games running at 1124/1450 with the PowerPlay limit at 8%, but with some other programs running at the same time I can still get a system freeze, so watch out for that too. A lot of times, with programs that aren't fully compatible with the Windows OS, you'll get problems that recur either way. Good luck.


----------



## Radmanhs

If I want to update my GPU on the list, should I just submit my card again? Also wondering what you guys' Fire Strike scores are. Just want to see how I stack up in the crowd.


----------



## rdr09

Quote:


> Originally Posted by *Radmanhs*
> 
> If i want to update my gpu on the list should i just submit my card again? Also wondering what your guys fire strike scores are? Just want to see how i manage in the crowd


Yes, Arizonian will require that you re-submit. Just state "Update" as the heading.

Here are my FS scores (CPU varies, so compare graphics scores) . . .

Stock 290 reference

http://www.3dmark.com/3dm/3158989?

1200

http://www.3dmark.com/3dm/3146872?

1300

http://www.3dmark.com/3dm/3154307?

Using the latest beta gives at least 300 more points in graphics.


----------



## sugarhell

Quote:


> Originally Posted by *rdr09*
> 
> yes, Arizonian will require that you re-submit. Just state Update as heading.
> 
> here are my FS scores (cpu varies, so compare graphics scores) . . .
> 
> Stock 290 reference
> 
> http://www.3dmark.com/3dm/3158989?
> 
> 1200
> 
> http://www.3dmark.com/3dm/3146872?
> 
> 1300
> 
> http://www.3dmark.com/3dm/3154307?
> 
> using the latest Beta gives at least 300 more points in graphics.


My 7970s' 24/7 clocks are not that far behind.

http://www.3dmark.com/fs/1232254
http://www.3dmark.com/fs/2208420

At max clocks it would probably be a lot closer.


----------



## rv8000

Does anyone know the thermal pad thickness for VRM 1 and 2 on a Sapphire 290 Tri-X? I bought 1mm Fujipoly Ultra Extreme and I've seen no decrease in temperatures whatsoever. I wasn't expecting more than a 3-5C drop, since changing the pads out on air isn't going to yield as good results as running on water would, but what gives... I also went to check whether the pads were properly aligned and covering the components. They were, but it appears the pads were dried out; the VRM 1 strip crumbled/broke into pieces. Were the pads I bought bad to begin with?


----------



## LA_Kings_Fan

Well ... POOP !









Transferred my build into its new home, a nice new Corsair Obsidian Series 450D gaming case, and it looks like I've got to send my PowerColor PCS+ R9 290X back to Newegg via RMA. It was an open-box buy, and even though it looked factory sealed when I got it, something is wrong with the card. It's getting too hot or something, at the PCIe slot I think, and after 30 minutes or so of intense gaming my rig shuts down and auto-reboots, like someone hit the reset switch.

I took the 290X out of the system, re-installed my old TOXIC HD 6950/70 card, and I don't get this issue, meaning it's something within the PowerColor card itself. So I guess, why risk it: RMA it and get my money back.









Which means I'll be looking to replace it, and I'm now leaning towards maybe getting one of the $300 to $350 USED reference R9 290X cards off Amazon or eBay, and maybe water-cooling it?









I've NEVER watercooled before, and had some QUESTIONS though ....

- I know a FULL WC block would be best. I didn't see this one listed on the OP front page (_is there a reason for that, Arizonian?_), and wondered how good it is, people's opinions on it, etc.
(http://www.swiftech.com/KOMODO-R9-LE.aspx) IMO it's the best-LOOKING option of the bunch, with the XSPC as runner-up. But which performs better?


- Next, what are people's opinions on using an AIO WC kit for just the GPU, like this Swiftech unit where the pump is part of the radiator/reservoir unit, and are there any other options out there to do this? (http://www.swiftech.com/H2O-x20-Edge-HD.aspx) Or is this a mistake and should I just be looking at a FULL custom-loop WC build? I think I know I'm gonna get told to do a FULL custom loop, but I like the idea of separate loops for each for some reason.









I'm using a Noctua air cooler on the CPU now, and was looking at getting a Corsair Hydro Series H105 to maybe replace that, and then maybe WC the VGA off a separate loop as outlined above? Or maybe getting either of these Swiftech AIO units that are expandable to incorporate the GPU ... (http://www.swiftech.com/H220.aspx) or (http://www.swiftech.com/H2O-x20-Elite.aspx)
 

... any thoughts and/or feedback would be appreciated, thanks guys


----------



## Forceman

Quote:


> Originally Posted by *Radmanhs*
> 
> looks like that fixed it, even though i auto checked for update i guess it didnt want to update... so far i got 1140 on the core and 1400 on the mem stock voltage, could maybe even go more on the memory. What's the max i should go on power limit and voltage? I have a zalman lq320 water cooler so temps aren't to big of a deal, i get 63-64 max on core


You can safely max the power limit, and I think the +100mV limit in AB is plenty safe as long as your temps are okay.


----------



## sugarhell

Quote:


> Originally Posted by *LA_Kings_Fan*
> 
> snip


No, don't use an AIO. Get a good custom loop and keep it for years. You will probably need a 360 rad for the GPU and a 240 rad for the CPU (depending on the CPU).

I always suggest my own loop: get an external 1080 Nova rad, an MCP35X pump with top, a small res, EK blocks, cheap XSPC fittings and PrimoChill tubing. Future-proof and cheap.
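As a rough sanity check on those radiator sizes, here's a sketch of the common forum rule of thumb of one 120mm radiator section per ~100 W of heat. It's a heuristic only, and the wattage figures in the example are my guesses, not measurements:

```python
import math

def radiator_120mm_sections(total_watts, watts_per_120mm=100):
    """Suggest how many 120mm radiator sections a loop needs, using the
    common rule of thumb of one 120mm section per ~100 W of heat.
    This is a heuristic, not a measured figure."""
    return math.ceil(total_watts / watts_per_120mm)

# Overclocked 290X (~300 W) plus a hot overclocked CPU (~150 W), rough estimates:
print(radiator_120mm_sections(300 + 150))  # 5 sections, e.g. a 360 + a 240
```

By that heuristic, an overclocked 290/290X plus a hot CPU lands at about five 120mm sections, which matches the 360 + 240 combination suggested above.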


----------



## rdr09

Quote:


> Originally Posted by *sugarhell*
> 
> My 7970s 24/7 clocks are not that behind.
> 
> http://www.3dmark.com/fs/1232254
> http://www.3dmark.com/fs/2208420
> 
> Probably max clocks it would be a lot closer


That's about right. A 290 is about 30% faster than a 7970, and two 7970s are about 30% faster than a 290.
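Chaining those two rough ratios together also gives you the implied CrossFire scaling for the 7970 pair:

```python
# Approximate ratios quoted in this thread, normalized to a single 7970.
hd7970 = 1.00
r9_290 = hd7970 * 1.30      # "a 290 is about 30% faster than a 7970"
dual_7970 = r9_290 * 1.30   # "two 7970s are about 30% faster than a 290"

# Implied CrossFire scaling for the 7970 pair:
scaling = dual_7970 / (2 * hd7970)
print(dual_7970, scaling)   # ~1.69x a single 7970, i.e. ~85% scaling
```

~85% is right in the usual range for CrossFire scaling, so the two "30%" figures are at least consistent with each other.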


----------



## LA_Kings_Fan

Quote:


> Originally Posted by *sugarhell*
> 
> No dont use an AIO. Get a good custom loop and keep it for years. You will probably need a 360 rad for the gpu and a 240 rad for the cpu (*depends the cpu*).
> 
> I always suggest my own loop. Get an external 1080 nova rad, a mcp35x pump with top, a small res,ek blocks,cheap xspc fittings and primochill tubing. Futureproof and cheap.


My sig rig ... running a Sandy Bridge Core i7 2600K @ 4.6GHz on an Asus ROG Maximus IV Gene-Z68 mobo, w/ 16GB of RAM.

... which reminded me of another question I forgot to bring up: if I go WC'ing, does it require much additional power?

I need to replace my old octopus of a PSU (a BFG LS-680); it's a mess in such a nice new case. I was looking at maybe getting the Corsair AX860i modular PSU. Would that be future-proof enough to CrossFire two R9 290Xs and WC it all, or would I need the AX1200i?


----------



## sugarhell

Quote:


> Originally Posted by *LA_Kings_Fan*
> 
> My Sig Rig ... running a SandyBridge Core i7 2600K @ 4.6GHz on an Asus ROG Maximus IV Gene-Z68 mobo, w/ 16 GB of RAM
> 
> ... which reminded me of another question I forgot to bring up, related to if I go WC'ing ... does it require much additional power ?
> 
> I need to replace my Old Octopus of a PSU (a BFG LS-680) it's a mess in such a nice new case, and I was looking at maybe getting the Corsair AX860i Modular PSU, would that be future proof enough to X-fire two R9 290x's and WC it all ? or would I need the AX1200i ?


Get an EVGA G2 1000W. Superior PSU.

Yeah, it requires additional power, based on the number of fans.
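A quick back-of-envelope check on the PSU question; the per-component wattages here are rough assumptions on my part, not measurements:

```python
# Rough peak system draw for two overclocked 290Xs plus an overclocked 2600K.
loads = {
    "R9 290X (OC) x2": 2 * 300,
    "i7 2600K @ 4.6GHz": 150,
    "board/RAM/drives/fans/pump": 100,
}
total = sum(loads.values())   # ~850 W peak
recommended = total / 0.80    # keep the PSU at or below ~80% load
print(total, recommended)     # ~850 W draw suggests a ~1000 W+ unit
```

Which is why a quality 1000 W unit is a safer bet than the AX860i for two overclocked 290Xs; an 860 W unit would be running near its limit under load.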


----------



## Learath2

Quote:


> Originally Posted by *Learath2*
> 
> I feel alone out here can I join you guys
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GPU-Z Validation Link
> It's an ASUS R9 290X reference card. I will be water-cooling it, but not yet, as I'm still not sure the card is actually stable because of the totally unbalanced rig that I have.


You somehow missed my application







. I always feel so buried up in forums.


----------



## Faster_is_better

Quote:


> Originally Posted by *LA_Kings_Fan*
> 
> Well ... POOP !
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Transferred my build into its new home, a nice new Corsair Obsidian Series 450D gaming case, and looks like I got to send my PowerColor PCS+ R9 290X back to NewEgg via RMA. It was an OpenBox buy, and even though it looked factory sealed when I got it, something is wrong with the card ... it's getting too hot or something ? at the PCIe slot I think and after 30 minutes or so of intense gaming my computer rig shuts down and auto-reboots for some reason ? it's like someone hit the reset switch or something ?
> 
> I took the 290X card out of the system, re-installed my old TOXIC HD-6950/70 card and I don't get this issue, meaning it's something within the PowerColor card itself, so I guess why risk it, RMA it and get my money back.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Which means I'll be looking to replace it, and was now leaning towards maybe getting one of the $300 to $350 USED reference R9 290x cards off Amazon /or Ebay, and maybe water cooling it ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've NEVER watercooled before, and had some QUESTIONS though ....
> 
> ... any thoughts and/or feedback would be appreciated, thanks guys


You can probably troubleshoot that card more if you haven't sent it back already. Windows should provide some idea of the reason; you can check the Event Viewer to see why the system shut down. It might tell you if it was a BSOD too.

As far as watercooling, I think all of the blocks out there are relatively close in performance, maybe a 5C difference between the best and worst of the full-cover blocks. A full custom loop is certainly the best-case scenario, and is fun in itself, but can get expensive quickly... it's addictive, seriously lol. An OK solution is to get one of those NZXT GPU brackets + an AIO unit; it's a decent first step into watercooling, not totally full-blown but alright. For a high-end, hot card like the 290 series, it's probably worth it to get a full-cover block and a proper custom watercooling kit.


----------



## Roboyto

Quote:


> Originally Posted by *LA_Kings_Fan*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Well ... POOP !
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Transferred my build into its new home, a nice new Corsair Obsidian Series 450D gaming case, and looks like I got to send my PowerColor PCS+ R9 290X back to NewEgg via RMA. It was an OpenBox buy, and even though it looked factory sealed when I got it, something is wrong with the card ... it's getting too hot or something ? at the PCIe slot I think and after 30 minutes or so of intense gaming my computer rig shuts down and auto-reboots for some reason ? it's like someone hit the reset switch or something ?
> 
> I took the 290X card out of the system, re-installed my old TOXIC HD-6950/70 card and I don't get this issue, meaning it's something within the PowerColor card itself, so I guess why risk it, RMA it and get my money back.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Which means I'll be looking to replace it, and was now leaning towards maybe getting one of the $300 to $350 USED reference R9 290x cards off Amazon /or Ebay, and maybe water cooling it ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've NEVER watercooled before, and had some QUESTIONS though ....
> 
> - I know a FULL WC Block would be best, didn't see this one listed up the OP Front page (_is there a reason for that Arizonian_ ?) , and wondered how good it is, people's opinions on it, etc.
> (http://www.swiftech.com/KOMODO-R9-LE.aspx) IMO, it's the best LOOKING option of the bunch, the XSPC is the runner up. But which performs better ?
> 
> 
> - Next, what are peoples opinions on using an AIO WC Kit for just the GPU, like this Swiftech unit where the pump is apart of the radiator / reservoir unit, and are there any other options out there to do this ? (http://www.swiftech.com/H2O-x20-Edge-HD.aspx) or is this a mistake and should I just be looking at a FULL Custom Loop WC system build ? I think I know I'm gonna get told to do a FULL Custom loop, but I like the idea of separate loops for each for some reason.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm using a Noctua Air Cooler on the CPU now, and was looking at getting a Corsair Hydro Series H105 to maybe replace that, and then maybe WC the VGA off a different separate loop as outlined above ? or maybe getting either of these Swiftech AIO units that are expandable to incorporate the GPU ... (http://www.swiftech.com/H220.aspx) /or (http://www.swiftech.com/H2O-x20-Elite.aspx)
> 
> 
> ... any thoughts and/or feedback would be appreciated, thanks guys


This will answer a few of the questions you have.

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781

On my phone, can give a more in depth answer later.


----------



## Roboyto

Quote:


> Originally Posted by *LA_Kings_Fan*
> 
> Well ... POOP !
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Transferred my build into its new home, a nice new Corsair Obsidian Series 450D gaming case, and looks like I got to send my PowerColor PCS+ R9 290X back to NewEgg via RMA. It was an OpenBox buy, and even though it looked factory sealed when I got it, something is wrong with the card ... it's getting too hot or something ? at the PCIe slot I think and after 30 minutes or so of intense gaming my computer rig shuts down and auto-reboots for some reason ? it's like someone hit the reset switch or something ?
> 
> I took the 290X card out of the system, re-installed my old TOXIC HD-6950/70 card and I don't get this issue, meaning it's something within the PowerColor card itself, so I guess why risk it, RMA it and get my money back.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Which means I'll be looking to replace it, and was now leaning towards maybe getting one of the $300 to $350 USED reference R9 290x cards off Amazon /or Ebay, and maybe water cooling it ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've NEVER watercooled before, and had some QUESTIONS though ....
> 
> - I know a FULL WC Block would be best, didn't see this one listed up the OP Front page (is there a reason for that Arizonian ?) , and wondered how good it is, people's opinions on it, etc.
> (http://www.swiftech.com/KOMODO-R9-LE.aspx) IMO, it's the best LOOKING option of the bunch, the XSPC is the runner up. But which performs better ?
> 
> 
> - Next, what are peoples opinions on using an AIO WC Kit for just the GPU, like this Swiftech unit where the pump is apart of the radiator / reservoir unit, and are there any other options out there to do this ? (http://www.swiftech.com/H2O-x20-Edge-HD.aspx) or is this a mistake and should I just be looking at a FULL Custom Loop WC system build ? I think I know I'm gonna get told to do a FULL Custom loop, but I like the idea of separate loops for each for some reason.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm using a Noctua Air Cooler on the CPU now, and was looking at getting a Corsair Hydro Series H105 to maybe replace that, and then maybe WC the VGA off a different separate loop as outlined above ? or maybe getting either of these Swiftech AIO units that are expandable to incorporate the GPU ... (http://www.swiftech.com/H220.aspx) /or (http://www.swiftech.com/H2O-x20-Elite.aspx)
> 
> 
> ... any thoughts and/or feedback would be appreciated, thanks guys


http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781

Your issue could be caused by drivers. Have a few links for driver removal in my post above. If you're on Win7 I would highly suggest using 13.12, anything 14.X with Win7 gave me problems. 14.6 beta seems to be pretty good on Win8.1 Pro in Eyefinity so far; Tomb Raider plays flawlessly.

If you're getting a BSOD or if Windows is logging the issue as @Faster_is_better mentioned, then something like BlueScreenView can help you decipher the issue: http://www.nirsoft.net/utils/blue_screen_view.html#DownloadLinks

Used cards are going to be a gamble; you don't want to end up with a card that was severely abused. I would spend the extra ~$100 and get a brand new card with a warranty. If you do in fact have a defective card, I would RMA it. I would sign up for Newegg Premier, it's free for first 30 days, to take advantage of the free returns and expedited shipping. RMA for a refund, and just order another one so you don't have any downtime. The HIS IceQ isn't a bad choice right now for $379 +$20 MIR; buddy of mine just bought one and it's a great performer.

All blocks are going to perform nearly identically when cooling the core. VRM1 is the problematic area, though, so unless you go with the Aquacomputer block/backplate, a thermal pad upgrade is highly suggested to keep the VRMs as cool as, or cooler than, the core.

For the first Swiftech unit you linked, you could run a FC block with it. A 240mm radiator would be pretty ideal for one of these cards, even with very high OCs, as long as you were just cooling the GPU; adding the CPU would likely require an additional 240mm radiator, depending on what clocks you want to run. The size of that radiator/reservoir/pump combo may be hard to fit somewhere in your case; I'd scope that out thoroughly beforehand.

My system gets very respectable temperatures using (2) 120mm radiators cooling my 290 and 4770k. My 290 clocks up to 1300/1700 and my 4770k @ 4.5GHz. Temp information can be found in my build log: http://www.overclock.net/t/1456279/honey-i-shrunk-the-ultra-tower-beast-my-journey-to-creating-a-more-compact-pc-with-an-r9-290/40

Going with a Corsair Hydro for the CPU may not be a bad idea, but separating the 2 components if you run a custom loop isn't really worth it.

You could also use an AIO like the Corsairs to cool the 290, although this isn't ideal because of VRM temperatures. I am cooling a reference 290 in a 250D with a small Antec 620 AIO. The temps I'm getting aren't breaking any records, but I'm using a completely silent fan for the radiator. I could get better temps, but silence is my top priority. I'm extremely pleased with the results for a mild overclock of 1075/1375. I have a freakin powerhouse i7/290 system, in a little cube, that you can't hear running unless your face is pressed up against it







http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition/20

Custom loop gets pretty expensive pretty quickly, so this must be taken into consideration. You have many, many options for each component and how you install them in your case..and the more extravagant/flashy items add up $$ very quickly. I won't get into a huge sidetracked watercooling rant but, the one item that is worth the extra $ out of everything is going to be compression fittings IMO. You will probably rack up a $200 bill for compression fittings in a CPU/GPU dual-radiator loop, but they are much easier to work with than any kind of clamp and look 100 times better.

It does require a bit of patience and planning to get a custom loop done properly. Don't expect to order your parts and toss everything in your case in a matter of a couple hours. The time/effort/$ are worth it though, anyone who has a custom loop will tell you that 

If you choose to go down this rabbit hole feel free to send a PM, I enjoy a good watercooling discussion


----------



## Roboyto

Quote:


> Originally Posted by *rv8000*
> 
> Does anyone know the thickness thermal pad for vrm 1 and 2 on a Sapphire 290 Tri-X. I bought 1mm fujipoly ultra extreme and I have seen no decrease in temperatures whatsoever. I was not expecting more than a 3-5c drop as changing the pads out on air aren't going to yield as good of results as running on water would but what gives... I also went to check if the pads were properly aligned and covering properly. They were but it appears the pads are dried out, the vrm 1 strip crumbled/broke into pieces
> 
> 
> 
> 
> 
> 
> 
> . Were the pads I bought bad to begin with?


The Ultra Extreme is a bit flaky/fragile, this seems to be par for the course from my experience with it.

Even if the pad seems dried/crumbly you can take all the fragments and roll them into one homogenous piece again. You can then flatten it out into the necessary shape. I would suggest using a piece of wax paper so it doesn't stick to whatever you're using to flatten it out.



I used the Fujipoly Ultra thermal pads on a PowerColor Devil 270X and saw a 7C drop in temperature. This was only the case when running overclocked/volted with a fairly high forced fan speed. If I left the fans on auto, there wasn't much of a difference in temperatures.

There is also the possibility that Sapphire is already using a good thermal pad for their cooler as it is no secret that VRM1 on these cards is the limiting factor for OCing.

Do you have some pics of what the pad contact looked like, and of it breaking apart?


----------



## rv8000

Quote:


> Originally Posted by *Roboyto*
> 
> The Ultra Extreme is a bit flaky/fragile, this seems to be par for the course from my experience with it.
> 
> Even if the pad seems dried/crumbly you can take all the fragments and roll them into one homogenous piece again. You can then flatten it out into the necessary shape. I would suggest using a piece of wax paper so it doesn't stick to whatever you're using to flatten it out.
> 
> 
> 
> I used the Fujipoly Ultra thermal pads on a PowerColor Devil 270X and saw a 7C drop in temperature. This was only the case when running overclocked/volted with a fairly high forced fan speed. If I left the fans on auto, there wasn't much of a difference in temperatures.
> 
> There is also the possibility that Sapphire is already using a good thermal pad for their cooler as it is no secret that VRM1 on these cards is the limiting factor for OCing.
> 
> Do you have some pics of what the pad contact looked like, and of it breaking apart?


Thank you. I don't have any pictures; I still have enough pad left to re-cover the VRM 2 section, so I'll take that pad off and take some pictures. I may just sidegrade to a Vapor-X. I also have to consider my case being an issue, and the fan speed hovers around 40-50% on the custom curve I have, so that's probably not the best for seeing the drop in temps I want.


----------



## Roboyto

Quote:


> Originally Posted by *rv8000*
> 
> Thank you. I don't have any pictures, i still have enough pad left to recover the vrm 2 section so ill take that pad off and take some pictures. I may just side grade to a Vapor-X. I also have to think about my case being an issue, and the card speed hovers around 40-50% on the custom curve I have so thats probably not the best for seeing the drop in temps I want.


You do need exceptional airflow going to these cards to keep things under control. Have any pics of your case and how things are setup now?

Your fractal case is great for noise, but lacks in the airflow department I think.

You may want to consider attaching a heatsink(s) to the rear of the VRMs, this could prove useful as well.


----------



## Mega Man

Quote:


> Originally Posted by *LA_Kings_Fan*
> 
> Well ... POOP !
> 
> Transferred my build into its new home, a nice new Corsair Obsidian Series 450D gaming case, and it looks like I've got to send my PowerColor PCS+ R9 290X back to Newegg via RMA. It was an open-box buy, and even though it looked factory sealed when I got it, something is wrong with the card... it's getting too hot or something, at the PCIe slot I think, and after 30 minutes or so of intense gaming my rig shuts down and auto-reboots, like someone hit the reset switch.
> 
> I took the 290X card out of the system, re-installed my old TOXIC HD-6950/70 card and I don't get this issue, meaning it's something within the PowerColor card itself, so I guess why risk it, RMA it and get my money back.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Which means I'll be looking to replace it, and was now leaning towards maybe getting one of the $300 to $350 USED reference R9 290x cards off Amazon /or Ebay, and maybe water cooling it ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've NEVER watercooled before, and had some QUESTIONS though ....
> 
> - I know a FULL WC block would be best; I didn't see this one listed on the OP front page (_is there a reason for that, Arizonian_?), and wondered how good it is, people's opinions on it, etc.
> (http://www.swiftech.com/KOMODO-R9-LE.aspx) IMO, it's the best-LOOKING option of the bunch, with the XSPC as runner-up. But which performs better?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> - Next, what are people's opinions on using an AIO WC kit for just the GPU, like this Swiftech unit where the pump is a part of the radiator/reservoir unit, and are there any other options out there to do this? (http://www.swiftech.com/H2O-x20-Edge-HD.aspx) Or is this a mistake and should I just be looking at a FULL custom loop WC system build? I think I know I'm gonna get told to do a FULL custom loop, but I like the idea of separate loops for each for some reason.
> 
> 
> 
> 
> 
> 
> 
> 
> ... any thoughts and/or feedback would be appreciated, thanks guys
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My Sig Rig ... running a SandyBridge Core i7 2600K @ 4.6GHz on an Asus ROG Maximus IV Gene-Z68 mobo, w/ 16 GB of RAM
> 
> ... which reminded me of another question I forgot to bring up, related to if I go WC'ing ... does it require much additional power ?
> 
> 
> I need to replace my old octopus of a PSU (a BFG LS-680), it's a mess in such a nice new case, and I was looking at maybe getting the Corsair AX860i modular PSU. Would that be future-proof enough to Crossfire two R9 290Xs and WC it all? Or would I need the AX1200i?


For the PSU I recommend a 1kW unit; the EVGA is a good option.

I can get you better pics of the Komodo, it does look sharp. I am trying to install them starting tomorrow, so I can update you on how they perform!

I have the kit above and I love it, great to start with! The others are a good way too, I just prefer to keep my pump off the block!


----------



## rv8000

Quote:


> Originally Posted by *Roboyto*
> 
> You do need exceptional airflow going to these cards to keep things under control. Have any pics of your case and how things are setup now?
> 
> Your fractal case is great for noise, but lacks in the airflow department I think.
> 
> You may want to consider attaching a heatsink(s) to the rear of the VRMs, this could prove useful as well.


Here's a picture of the current setup. All HD cages have been removed, all cabling routed through the back, etc., for minimal interference. 4x 140mm Cougar 1200rpm fans, 3 intake, 1 exhaust.


----------



## blue1512

Quote:


> Originally Posted by *Roboyto*
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781
> 
> My system gets very respectable temperatures using (2) 120mm radiators cooling my 290 and 4770k. My 290 clocks up to 1300/1700 and my 4770k @ 4.5GHz. Temp information can be found in my build log: http://www.overclock.net/t/1456279/honey-i-shrunk-the-ultra-tower-beast-my-journey-to-creating-a-more-compact-pc-with-an-r9-290/40


Reading your build log makes me envious every time. I wish my gf could be like your wife lol


----------



## ebhsimon

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Deskputer


----------



## Arizonian

Quote:


> Originally Posted by *Learath2*
> 
> I feel alone out here can I join you guys
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GPU-Z Validation Link
> Its a ASUS R9 290x reference card. I will be watercooling it but not yet as I still am not sure if the card is actually stable because of the totally unbalanced rig that I have.


Oooops, very sorry. You've been slotted right after buddatech, where you'd have been had I not missed you. You're #375 out of 377 members. Congrats - added

Quote:


> Originally Posted by *Radmanhs*
> 
> If I want to update my GPU on the list, should I just submit my card again? Also wondering what your Fire Strike scores are - just want to see how I fare in the crowd


What did you need updated? If it's a new GPU then just a GPU-Z validation link will suffice.


----------



## Radmanhs

I just wanted to update that I'm under water now


----------



## LA_Kings_Fan

Quote:


> Originally Posted by *Roboyto*
> 
> Custom loop gets pretty expensive pretty quickly, so this must be taken into consideration. You have many, many options for each component and how you install them in your case..and the more extravagant/flashy items add up $$ very quickly. I won't get into a huge sidetracked watercooling rant but, the one item that is worth the extra $ out of everything is going to be compression fittings IMO. You will probably rack up a $200 bill for compression fittings in a CPU/GPU dual-radiator loop, but they are much easier to work with than any kind of clamp *and look 100 times better*.


1st off ... THANK YOU SO MUCH for all the time you took on that response, appreciate it ... a lot to digest and think over.

2nd off ... yeah the cost for full custom loops (on top of the whole mixing water & electricity thing) is something that has kept me from considering it, but these Swiftech-type units that bridge the gap between simple AIO units and full-blown custom loops have me rethinking it.

Something along the lines of using the following ...

- a CORSAIR H105 AIO for the CPU ... self-contained, looks good, known reliability, cost-effective at under *$150* complete.
- a Swiftech KOMODO-R9-LE GPU FC Block w/ Backplate for the R9 290X ... about *$180*.
- coupled to a Swiftech H2O-X20 Edge HD Series Radiator/Pump/Reservoir combo kit ... about *$230* complete.

= so about *$560* ... hmm, that's more than I thought it would be, but I guess still cheaper than a full custom loop might be?

as for the *CLAMPS* looking bad, I was thinking of dressing them up with automotive anodized-aluminum clamp covers, like so ...


----------



## Arizonian

Quote:


> Originally Posted by *LA_Kings_Fan*
> 
> 1. *-SNIP-*
> 
> Transferred my build into its new home, a nice new Corsair Obsidian Series 450D gaming case
> 
> 2. *SNIP -*
> 
> - I know a FULL WC Block would be best, didn't see this one listed up the OP Front page (_*is there a reason for that Arizonian*_ ?) , and wondered how good it is, people's opinions on it, etc.
> 
> (http://www.swiftech.com/KOMODO-R9-LE.aspx) IMO, it's the best LOOKING option of the bunch, the XSPC is the runner up. But which performs better ?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 3. *-SNIP-*
> 
> I took the 290X card out of the system, re-installed my old TOXIC HD-6950/70 card and I don't get this issue, meaning it's something within the PowerColor card itself, so I guess why risk it, RMA it and get my money back.
> 
> 4. *-SNIP-*
> 
> I need to replace my Old Octopus of a PSU (a BFG LS-680) it's a mess in such a nice new case, and I was looking at maybe getting the Corsair AX860i Modular PSU, would that be future proof enough to X-fire two R9 290x's and WC it all ? or would I need the AX1200i ?


1. Been looking at the Obsidian 450D myself for my next build. I can't find any other mid-tower I'm interested in. Nice.

2. Nope, no reason at all, and thank you for the link. It's been added to the water block section; anyone else who'd like their WBs up there, feel free to PM me.

3. Sorry to hear this. Hope you find your replacement soon.

4. I agree with Mega Man on going with the EVGA SuperNOVA 1000 G2, for about the same price as the Corsair AX860i. In fact, with current sales you can pick one up for a bit less, at about $150.

See the OP info section for Shilka's guides to 1000W PSUs, including reviews of the 1000 G2.
_In fact, Mega Man's 'heads up, Newegg sale' posts in this club and the 7970 club pushed me over the cliff to pick up a SuperNOVA 1300 G2 PSU for $160 after rebate. A deal I couldn't pass up. Thanks bud._

Spoiler: PSU Pic!



Quote:


> Originally Posted by *Mega Man*
> 
> for anyone running CFX ect the 1300w evga is at a great price from the newegg shell shocker
> 
> edit anyone seen this, cheapest i ever saw it !






Quote:


> Originally Posted by *Radmanhs*
> 
> I just wanted to update that im under water now


Congrats - updated


----------



## Mega Man

glad i could help


----------



## imadorkx

Quote:


> Originally Posted by *Arizonian*
> 
> 1. Been looking at the Obsidian 450D myself for next build. I can't find any other mid-tower I'm interested in. Nice.
> 
> 2. Nope no reason at all and thank you for the link It's been added to the water block section and anyone else who'd like their WB's if not up there, feel free to PM me.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 3. Sorry to hear this. Hope you find your replacement soon.
> 
> 4. I agree with Mega Man on going with EVGA Supernova 1000W G2, for about the same price of the Corsair AX860i. In fact with current sales you can pick one up for a bit less at about $150.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> See the OP info section for guides to 1000 Watt PSU's by Shilka for the reviews on the 1000W G2.
> _
> In fact Mega Man's 'heads up Newegg sale" posted in this club and the 7970 club pushed me over the cliff to pick up a Supernova 1300W G2 PSU for $160 after rebate. A deal I couldn't pass up. Thanks bud._
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: PSU Pic!
> 
> 
> 
> 
> 
> 
> Congrats - updated


You should be. I just bought a 450D myself a few days ago, and all I can say is money well spent. Still looking for parts for my watercooling setup. You can fit a 360mm radiator up top and a 280mm radiator in front; one thing to note, though, is that you can't use the optical drive bay then.

This is my 450D as it stands now.

And I found this on the Corsair website with a watercooling setup. Looking good.


----------



## Forceman

Quote:


> Originally Posted by *LA_Kings_Fan*
> 
> 1st off ... THANK YOU SO MUCH for all the time you took in the response appreciate it ... a lot to digest and think over
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2nd off ... yeah the cost for full custom loops (on top of the whole mixing water & electricity thing) is something that has kept me from considering it, but these Swiftech-type units that bridge the gap between simple AIO units and full-blown custom loops have me rethinking it.
> 
> Something along the lines of using the following ...
> 
> - a CORSAIR H105 AIO for the CPU ... self contained, looks good, know reliability, cost effective at under *$150* complete.
> - a Swiftech KOMODO-R9-LE GPU FC Block w/ Backplate for the R9 290x ... about *$180*.
> - coupled to a Swiftech H2O-X20 Edge HD Series Radiator / Pump / Reservior Combo kit ... about *$230* complete.
> 
> = so about *$560* ... hmmm that's more than I thought it might be, but I guess cheaper than an otherwise full custom loop might be ?
> 
> as for the *CLAMPS* looking bad I was thinking of dressing them up using Automotive anodized aluminum Clamp covers like so ...


Depending on where you live, you can do a custom CPU + single-GPU loop for around $400. The price climbs rapidly when you want to get fancy (acrylic, colored dye, high-cost fans, fancy fittings), but if all you want is okay looks and great cooling, it's not all that expensive. Although I guess that's relative. Just buying an XSPC kit and adding a GPU block to it (along with some fittings) is an easy way to go. I built a full custom loop a couple of months ago, and I think the all-in cost was under $450.


----------



## Unknownm

Quote:


> Originally Posted by *blue1512*
> 
> ^^ Basically, when you are not allowed to alter memory voltage, you don't really need to cool them. They will run a bit warmer though.


Thanks!

Also, I upped the aux voltage to get a better memory overclock. The RAM can get up to 1530MHz before showing artifacts, but just to be on the stable side I lowered it to 1500 (6000MHz effective). Stock RAM clock is 1250 (5000MHz effective).

Is it safe to push more than +100mV on the aux voltage, or is it not recommended? Is it even possible...
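For anyone wondering where the doubled numbers above come from: GDDR5 is quad-pumped, so the effective rate is 4x the actual memory clock. A quick sketch of the arithmetic (the 512-bit bus is the 290's spec; the function names are just for illustration):

```python
# GDDR5 transfers four bits per pin per clock, so effective rate = 4x clock.
def effective_mhz(mem_clock_mhz: int) -> int:
    return mem_clock_mhz * 4

# Theoretical memory bandwidth in GB/s for a given bus width.
def bandwidth_gbs(mem_clock_mhz: int, bus_width_bits: int = 512) -> float:
    return effective_mhz(mem_clock_mhz) * 1e6 * (bus_width_bits // 8) / 1e9

print(effective_mhz(1250))  # 5000 -> the stock "5000MHz"
print(effective_mhz(1500))  # 6000 -> the OC "6000MHz"
print(bandwidth_gbs(1250))  # 320.0 GB/s at stock
```

So that 1250 -> 1500 bump is a straight 20% increase in theoretical memory bandwidth.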


----------



## Sgt Bilko

Quote:


> Originally Posted by *Unknownm*
> 
> Thanks!
> 
> Also I upped the aux voltage to get better memory overclock. The ram can get up to 1530mhz before showing artifacts, but just to be on the stable side I lowered to 1500 (6000Mhz). Stock ram clock is 1250 (5000Mhz)
> 
> Is it safe to push more than 50 100+mV on Aux voltage or not recommended? is it even possible...


I've pushed +200mV through both my cards to get them up to 1250/1550 before. It's OK for short sessions, and +100mV would be fine for daily use provided you can keep the cards cool enough.


----------



## Roboyto

Quote:


> Originally Posted by *rv8000*
> 
> Heres a picture of the current setup. All HD cages have been removed, all cabling routed through the back etc for minimal interference. 4x 140mm cougar 1200rpm fans, 3 intake 1 exhaust.


Do you run it with the side panel on or off? The location of the side panel fan doesn't look ideal for assisting VRM1 cooling unfortunately, but it could be worth a shot to put a fan there and see what happens.

I think the front intake fans would have a little trouble moving air to the card with only those little slits to draw air from. I was considering a Fractal case like that for my build until I went and scoped it out in person. I really liked the way it looked, but the potential lack of airflow was a concern for me.

Heat from your CPU cooler blowing onto the back of the GPU definitely isn't helping your situation. I'm assuming the fans are blowing down through the heatsink, which would be forcing the air straight onto the MoBo/RAM and consequently onto the back of your GPU. Maybe try reversing the fans so they pull the heat up away from the board? This may help, depending on whether you use the side panel or not. A tower cooler or AIO for the CPU may help direct heat away from other components. You may also want to try moving your GPU down to the next slot. It will run at PCIe 3.0 x8, but you shouldn't see a performance drop, as that is the same bandwidth as PCIe 2.0 x16.
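That x8 vs. x16 claim checks out on paper: the per-lane line rates and encodings are from the PCIe specs, and the rest is arithmetic. A rough sketch, ignoring packet/protocol overhead beyond the line encoding:

```python
# Per-lane throughput after line-code overhead, in MB/s.
# PCIe 2.0: 5 GT/s with 8b/10b encoding
# PCIe 3.0: 8 GT/s with 128b/130b encoding
def lane_mbs(gts: float, payload_bits: int, total_bits: int) -> float:
    return gts * 1e9 * payload_bits / total_bits / 8 / 1e6  # /8 bits-per-byte

pcie2 = lane_mbs(5.0, 8, 10)     # ~500 MB/s per lane
pcie3 = lane_mbs(8.0, 128, 130)  # ~985 MB/s per lane

print(pcie2 * 16)  # PCIe 2.0 x16: ~8000 MB/s
print(pcie3 * 8)   # PCIe 3.0 x8:  ~7877 MB/s, effectively the same
```

The denser 128b/130b encoding is why PCIe 3.0 nearly doubles per-lane throughput despite the raw transfer rate only going from 5 to 8 GT/s.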

Quote:


> Originally Posted by *blue1512*
> 
> Reading your build log makes me envy all the time. I wish my gf could be like your wife lol


My wife usually doesn't mind my lengthy PC projects, although that build took MUCH longer than I anticipated. Towards the end she was 'hinting' at when she was going to get her dining table back.

I make a decent amount of extra $ doing PC stuff on the side, so she usually doesn't mind my projects/builds happening. She is also aware of how spoiled she is when it comes to her PCs... her ultrabook is a dual-SSD system, for cryin' out loud.

Quote:


> Originally Posted by *LA_Kings_Fan*
> 
> 1st off ... THANK YOU SO MUCH for all the time you took in the response appreciate it ... a lot to digest and think over
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2nd off ... yeah the cost for full custom loops (on top of the whole mixing water & electricity thing) is something that has kept me from considering it, but these Swiftech-type units that bridge the gap between simple AIO units and full-blown custom loops have me rethinking it.
> 
> Something along the lines of using the following ...
> 
> - a CORSAIR H105 AIO for the CPU ... self contained, looks good, know reliability, cost effective at under *$150* complete.
> - a Swiftech KOMODO-R9-LE GPU FC Block w/ Backplate for the R9 290x ... about *$180*.
> - coupled to a Swiftech H2O-X20 Edge HD Series Radiator / Pump / Reservior Combo kit ... about *$230* complete.
> 
> = so about *$560* ... hmmm that's more than I thought it might be, but I guess cheaper than an otherwise full custom loop might be ?
> 
> as for the *CLAMPS* looking bad I was thinking of dressing them up using Automotive anodized aluminum Clamp covers like so ...


You're welcome, glad I could help.

That's not a bad idea with the automotive clamp covers.

It really depends on your budget and how smart you want to shop, as @Forceman said. For ~$560 you could probably be into a full custom loop. The largest downside I'm seeing is your modified HP case, unless your sig isn't up to date; one suited for radiator mounting would be highly suggested. I'll mock something up price-wise and see where it ends up.

Quote:


> Originally Posted by *Unknownm*
> 
> Thanks!
> 
> Also I upped the aux voltage to get better memory overclock. The ram can get up to 1530mhz before showing artifacts, but just to be on the stable side I lowered to 1500 (6000Mhz). Stock ram clock is 1250 (5000Mhz)
> 
> Is it safe to push more than 50 100+mV on Aux voltage or not recommended? is it even possible...


Typically artifacts are caused by core clocks set too high or not enough core voltage. What were your core clocks/voltages? At the memory speed where you were experiencing artifacts, try giving the core a little boost in voltage and see what happens.

As @Sgt Bilko said, the voltage is 'safe' as long as you can keep the card cool enough.


----------



## pkrexer

I'm loving these cheap 290s on eBay. Picked up an XFX 290X reference for $300. Wasn't expecting much, but it looks like I scored a good one: zero coil whine, Hynix memory, and it can do 1120/1450 on stock volts before showing artifacts.

Can't wait to get this baby under water to see what she can really do.


----------



## Antoniv

Hiya, I'm thinking of grabbing a 290 in the coming weeks. Thing is, I want to get an MSI model. Are there any issues that I should be aware of when buying? Two weeks ago my ASUS R9 280X died and got RMA'd; the second card was faulty too, which made me wary of buying ASUS GPUs. Anyway, the MSI R9 290 - any issues revolving around this card? Or am I good to go?


----------



## Nephalem

Quote:


> Originally Posted by *Antoniv*
> 
> Hiya, I'm thinking of grabbing a 290 in the following weeks. Thing is, I want to get a msi model. Are there any issues that I should be aware of when buying? 2 weeks ago, my asus r9 280x died, got rma'd, second card was faulty too, made me aware of buying asus gpu's. Anyways, MSI r9 290, any issues revolving around with this card? Or am I good to go?


I've only had mine for a day, but I have to say I'm really impressed. Afterburner makes overclocking a breeze (got mine at 1130MHz on stock voltage), it has a backplate attached already, and it looks pretty friggin' sweet. All in all, can't complain so far.


----------



## Roboyto

Quote:


> Originally Posted by *Antoniv*
> 
> Hiya, I'm thinking of grabbing a 290 in the following weeks. Thing is, I want to get a msi model. Are there any issues that I should be aware of when buying? 2 weeks ago, my asus r9 280x died, got rma'd, second card was faulty too, made me aware of buying asus gpu's. Anyways, MSI r9 290, any issues revolving around with this card? Or am I good to go?


http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781

That should answer any questions you have, and even some you don't.

MSI's are solid cards, and if you want to put yours under water, I believe EK has made a revision to their reference block to fit the Gaming edition cards.

Quote:


> Originally Posted by *pkrexer*
> 
> I'm loving these cheap 290's on eBay. Picked up a xfx 290x reference for $300. Wasn't expecting much, but looks like I scored a good one. 0 coil whine, hynix memory and can do 1120 / 1450 on stock volts before showing artifacts.
> 
> Can't wait to get this baby under water to see what she can really do.


I was pleased with both of my XFX cards; sold one. It may not exhibit coil whine under normal voltages, but when you start pushing the envelope you may get some. My card at 1300/1700 with +50% power & +200mV lets out a bit of a cry when benching at those speeds. At my 1200/1500 +87mV gaming clocks it's quiet as a church mouse though.


----------



## rdr09

Quote:


> Originally Posted by *Roboyto*
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781
> 
> That should answer any questions you have, and even some you don't
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was pleased with both of my XFX; sold one. It may not exhibit coil while under normal voltages, but when you start pushing the envelope you may get some. My card at 1300/1700 +50% power & +200mV lets out a bit of a cry when benching at those speeds. 1200/1500 +87mV gaming clocks it's quiet as a church mouse though


Rob, is that +87mV using 14.6 Beta? My 1200 now needs a bit more. This latest driver works for all my games, so I can finally give up 13.11, lol


----------



## blue1512

Quote:


> Originally Posted by *rdr09*
> 
> Rob, it that +87mV using 14.6 Beta? my 1200 now needs a bit more. this latest driver works for all my games, so i can finally give up 13.11,lol


The latest driver is solid, but it requires a bit more juice for the card. I think AMD made some tweaks to PowerTune.


----------



## Roboyto

Quote:


> Originally Posted by *rdr09*
> 
> Rob, it that +87mV using 14.6 Beta? my 1200 now needs a bit more. this latest driver works for all my games, so i can finally give up 13.11,lol


That is on 14.6, and I just moved to 8.1 Pro. The only game I've played so far is Tomb Raider... but it is solid at 5760x1080, all settings maxed, 43 FPS average. Not bad for one 290.

I'm going to drop the AA down to boost my frames a little, but the 43 is smooth as butter.

I was stuck on 13.12 myself. Everything prior to 14.6 gave me problems, but I was running Win7 at the time. I think the OS is part of the problem/solution.

The only odd thing I have noticed since moving to 8.1/14.6 is that when I turn off the 3rd monitor in my array, farthest to the right, Windows 'boops' as if a piece of hardware has been connected/disconnected and I get no video signal to any monitor. I thought the computer had locked up overnight when I got on it this morning, but turning that 3rd monitor on/off/on fixes the problem and video works.

Any thoughts?


----------



## blue1512

Quote:


> Originally Posted by *Roboyto*
> 
> That is on 14.6, and I just moved to 8.1 Pro. The only game I've played is Tomb Raider so far...but it is solid 5760*1080, all settings maxed, 43 FPS average. Not bad for one 290
> 
> 
> 
> 
> 
> 
> 
> I'm going to drop the AA down to boost my frames a little, but the 43 is smooth as butter.
> 
> The only odd thing I have noticed since moving to 8.1/14.6 is when I turn off the 3rd monitor in my array, farthest to the right, windows 'boops' as if a piece of hardware has been connected/disconnected and I get no video signal to any monitor. I thought the computer locked up overnight when I got on it this morning, but turning that 3rd monitor on/off/on fixes the problem and video works.
> 
> 
> 
> 
> 
> 
> 
> Any thoughts?


It might be related to the new CFX handling in 14.6. I think you should send AMD feedback about this driver issue.


----------



## kizwan

*14.4 WHQL vs. 14.6 Beta*

*Valley*
_(Note: No comparison for Crossfire.)_

Single 290: 14.4 WHQL vs. 14.6 beta (result links lost)

*Heaven*

Single 290 and 290 Crossfire, Tess ON / Tess OFF: 14.4 WHQL vs. 14.6 beta (result links lost)

*3DMark 11*

Single 290, Tess ON: 14.4 WHQL http://www.3dmark.com/3dm11/8371754 | 14.6 beta http://www.3dmark.com/3dm11/8382247
Single 290, Tess OFF: 14.4 WHQL http://www.3dmark.com/3dm11/8371767 | 14.6 beta http://www.3dmark.com/3dm11/8382340
290 Crossfire, Tess ON: 14.4 WHQL http://www.3dmark.com/3dm11/8371731 | 14.6 beta http://www.3dmark.com/3dm11/8382505
290 Crossfire, Tess OFF: 14.4 WHQL http://www.3dmark.com/3dm11/8371717 | 14.6 beta http://www.3dmark.com/3dm11/8382617

*Firestrike*

Single 290, Tess ON: 14.4 WHQL http://www.3dmark.com/fs/2200589 | 14.6 beta http://www.3dmark.com/fs/2215138
Single 290, Tess OFF: 14.4 WHQL http://www.3dmark.com/fs/2200651 | 14.6 beta http://www.3dmark.com/fs/2215231
290 Crossfire, Tess ON: 14.4 WHQL http://www.3dmark.com/fs/2200675 | 14.6 beta http://www.3dmark.com/fs/2215393
290 Crossfire, Tess OFF: 14.4 WHQL http://www.3dmark.com/fs/2200723 | 14.6 beta http://www.3dmark.com/fs/2215496


----------



## rdr09

Quote:


> Originally Posted by *blue1512*
> 
> The lastest driver is solid, but it requires a bit of juice for the card. I think AMD has some tweaks on Powertune.


I play at stock, so no need for extra juice, but you are right. I have yet to figure out how much added voltage is needed when I'm benching. My current estimate using Trixx is about a +40 offset, which is a lot.

Quote:


> Originally Posted by *Roboyto*
> 
> That is on 14.6, and I just moved to 8.1 Pro. The only game I've played is Tomb Raider so far...but it is solid 5760*1080, all settings maxed, 43 FPS average. Not bad for one 290
> 
> 
> 
> 
> 
> 
> 
> I'm going to drop the AA down to boost my frames a little, but the 43 is smooth as butter.
> 
> I was stuck 13.12 myself. Everything prior to 14.6 gave me problems, but I was running Win7 at the time. I think the OS is part of the problem/solution.
> 
> The only odd thing I have noticed since moving to 8.1/14.6 is when I turn off the 3rd monitor in my array, farthest to the right, windows 'boops' as if a piece of hardware has been connected/disconnected and I get no video signal to any monitor. I thought the computer locked up overnight when I got on it this morning, but turning that 3rd monitor on/off/on fixes the problem and video works.
> 
> 
> 
> 
> 
> 
> 
> Any thoughts?


Quote:


> Originally Posted by *blue1512*
> 
> The lastest driver is solid, but it requires a bit of juice for the card. I think AMD has some tweaks on Powertune.


Thanks. I'll try playing BF4 at 1200 and compare my fps.

Quote:


> Originally Posted by *kizwan*
> 
> *14.4 WHQL vs. 14.6 Beta*
> 
> *Valley*
> _(Note: No comparison for Crossfire.)_
> 
> Single 290: 14.4 WHQL vs. 14.6 beta (result links lost)
> 
> *Heaven*
> 
> Single 290 and 290 Crossfire, Tess ON / Tess OFF: 14.4 WHQL vs. 14.6 beta (result links lost)
> 
> *3DMark 11*
> 
> Single 290, Tess ON: 14.4 WHQL http://www.3dmark.com/3dm11/8371754 | 14.6 beta http://www.3dmark.com/3dm11/8382247
> Single 290, Tess OFF: 14.4 WHQL http://www.3dmark.com/3dm11/8371767 | 14.6 beta http://www.3dmark.com/3dm11/8382340
> 290 Crossfire, Tess ON: 14.4 WHQL http://www.3dmark.com/3dm11/8371731 | 14.6 beta http://www.3dmark.com/3dm11/8382505
> 290 Crossfire, Tess OFF: 14.4 WHQL http://www.3dmark.com/3dm11/8371717 | 14.6 beta http://www.3dmark.com/3dm11/8382617
> 
> *Firestrike*
> 
> Single 290, Tess ON: 14.4 WHQL http://www.3dmark.com/fs/2200589 | 14.6 beta http://www.3dmark.com/fs/2215138
> Single 290, Tess OFF: 14.4 WHQL http://www.3dmark.com/fs/2200651 | 14.6 beta http://www.3dmark.com/fs/2215231
> 290 Crossfire, Tess ON: 14.4 WHQL http://www.3dmark.com/fs/2200675 | 14.6 beta http://www.3dmark.com/fs/2215393
> 290 Crossfire, Tess OFF: 14.4 WHQL http://www.3dmark.com/fs/2200723 | 14.6 beta http://www.3dmark.com/fs/2215496


Nice work, K. +rep. Not sure how to spoiler them, sorry.


----------



## rv8000

Quote:


> Originally Posted by *Roboyto*
> 
> Do you run it with the side panel on or off? The location for the side panel fan doesn't look idea to assist in VRM1 cooling unfortunately, but it could be worth a shot to put a fan there to see what happens.
> 
> I think the front intake fans would have a little trouble moving air to the card when only having those little slits to draw air from. I was considering a Fractal case like that for my build until I went and scoped it out in person. I really liked the way it looked, but potential lack of airflow was a concern for me.
> 
> Heat from your CPU cooler blowing onto the back of the GPU definitely isn't helping your situation. I'm assuming the fans are blowing down through the heatsink, which would be forcing the air straight onto the MoBo/RAM and conequently to the back of your GPU. Maybe try to reverse the fans so they suck the heat up away from the board? This may help depending on if you use the side panel or not. A tower cooler or AIO for the CPU may help your cause to direct heat away from other components. You may also want to try moving your GPU down to the next slot? It will run in PCIe 3.0 8x, but you shouldn't see a performance drop as that is same bandwidth as PCIe 2.0 16x.
> 
> -Snip-


I run it with the side panel on; the side panel fan wouldn't be in a good place for cooling VRM 1. I originally bought the case for a watercooling build, but I never ended up getting some of the parts I ordered, so I returned or sold the parts I did have. I may consider some faster intake fans, because the intake from the front of the case is definitely poor, though at that point it starts defeating the purpose of the quiet case. All of the AIOs I've dealt with are frustratingly noisy for the performance they give. If I find a good deal at some point I may switch back, but currently I'm not even OC'd and my CPU hardly sees temps above 50C.

I cannot remember if you answered one of my original questions or not: do you know if 1mm is the proper pad thickness for the Tri-X VRMs? Another user (I think imprezzion) replaced his pads with 1mm and ended up seeing a 3-5C drop, which is why I'm very disappointed with the thermal pads. Thanks for all the help!


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> nice work K. +rep. not sure how to spoiler them. sorry.


Thanks!


----------



## Roboyto

Quote:


> Originally Posted by *rv8000*
> 
> I run it with the side panel on; the case panel fan wouldn't be in a good place for cooling VRM1. I originally bought the case for a water cooling build, but I never ended up getting some of the parts I ordered, so I returned or sold the parts I did have. I may consider some faster intake fans because the intake from the front of the case is definitely poor, though at that point it begins defeating the purpose of the quiet case. All of the AIOs I've dealt with are frustratingly noisy for the performance they give. If I find a good deal at some point I may switch back, but currently I'm not even OC'd and my CPU hardly sees temps above 50C.
> 
> I cannot remember if you answered one of my original questions or not: do you know if 1mm is the proper thickness pad for the Tri-X VRMs? Another user (I think imprezzion) replaced his pads with 1mm and ended up seeing a 3-5C drop, which is why I'm still very disappointed with the thermal pads. Thanks for all the help!


Most AIOs do come with fairly noisy fans in my experience. I've used the Antec 620s, Zalman LQ310, Thermaltake Performer 2.0, and several Corsair models, all with very respectable results. I always use a different fan than what comes with them, however; usually a Corsair Air Series of some sort.

If you can't give the card a plentiful supply of fresh cool air, you are going to have temperature issues while air cooling the card. Those Cougar fans are fair, but there are definitely better options out there for a reasonable price.

You could always upgrade the cooler to the Arctic Accelero Extreme IV, that monstrosity keeps the core and VRMs plenty cool. They have made a large change coming from the Extreme III, so you don't have to deal with the permanent thermal epoxy for RAM/VRMs anymore.

Your other option is to attach an AIO with the NZXT Kraken G10 for $30. I believe the Corsair H55 Quiet edition cooler is on sale at NewEgg again for around $50. Then you need a good heatsink for the VRM1 which can be found here for $8 + shipping from China: http://www.feppaspot.com/servlet/the-897/GELID-Solutions-Icy-Vision/Detail

I am getting reasonable temperatures using a very quiet low speed fan on the radiator. The VRM1 temps could be better, but in the tiny 250D there isn't any option for airflow where the GPU sits...so using the fan to pull hot air off the VRM1 sink is my best option; I have a new fan I've yet to install for the VRM1. You would likely have better results in your case for VRM1 temps. That 120mm fan location in your case in front of the PSU would be perfect to mount the rad and push the hot GPU air out the bottom.

290 w/ NZXT Kraken in 250D: http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition/20

I don't know for certain what the thickness of the pads for the air cooled cards is. You will want to use the search function for the thread to hopefully find that post, or send him a PM. 1mm would be my best guesstimate, but if they got crushed and crumbled then maybe 1mm is too thick?

Just thought of this. Your card does have a backplate on it, did you remove it and see if there is a thermal pad on the rear of VRM1? If not, adding the pad will definitely help with temperatures...if it does already have pads, upgrading them might help as well.


----------



## rv8000

Quote:


> Originally Posted by *Roboyto*
> 
> Most AIOs do come with fairly noisy fans in my experience. I've used the Antec 620s, Zalman LQ310, Thermaltake Performer 2.0, and several Corsair models, all with very respectable results. I always use a different fan than what comes with them, however; usually a Corsair Air Series of some sort.
> 
> If you can't give the card a plentiful supply of fresh cool air, you are going to have temperature issues while air cooling the card. Those Cougar fans are fair, but there are definitely better options out there for a reasonable price.
> 
> You could always upgrade the cooler to the Arctic Accelero Extreme IV, that monstrosity keeps the core and VRMs plenty cool. They have made a large change coming from the Extreme III, so you don't have to deal with the permanent thermal epoxy for RAM/VRMs anymore.
> 
> Your other option is to attach an AIO with the NZXT Kraken G10 for $30. I believe the Corsair H55 Quiet edition cooler is on sale at NewEgg again for around $50. Then you need a good heatsink for the VRM1 which can be found here for $8 + shipping from China: http://www.feppaspot.com/servlet/the-897/GELID-Solutions-Icy-Vision/Detail
> I am getting reasonable temperatures using a very quiet low speed fan on the radiator. The VRM1 temps could be better, but in the tiny 250D there isn't any option for airflow where the GPU sits...so using the fan to pull hot air off the VRM1 sink is my best option; I have a new fan I've yet to install for the VRM1. You would likely have better results in your case for VRM1 temps. That 120mm fan location in your case in front of the PSU would be perfect to mount the rad and push the hot GPU air out the bottom.
> 
> 290 w/ NZXT Kraken in 250D: http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition/20
> 
> I don't know for certain what the thickness of the pads for the air cooled cards are. You will want to use the search function for the thread to hopefully find that post or send him a PM. 1mm would be my best guesstimate, but if they got crushed and crumbled then maybe 1mm is too thick?
> 
> Just thought of this. Your card does have a backplate on it, did you remove it and see if there is a thermal pad on the rear of VRM1? If not, adding the pad will definitely help with temperatures...if it does already have pads, upgrading them might help as well.


I may pick up the Zalman LQ-315 if I see it on sale and swap the fans out, getting rid of the heat dump and overall space restriction would be helpful/nice. The backplate is plexi, and also has spacers to raise the backplate away from the pcb so not gonna be any luck with thermal padding there. The only option to really get temps down would probably be the accelero or going h2o, and for that hassle I'd rather just swap to a Vapor-X card for better quality components and mildly better cooling.


----------



## Roboyto

Quote:


> Originally Posted by *rv8000*
> 
> I may pick up the Zalman LQ-315 if I see it on sale and swap the fans out, getting rid of the heat dump and overall space restriction would be helpful/nice. The backplate is plexi, and also has spacers to raise the backplate away from the pcb so not gonna be any luck with thermal padding there. The only option to really get temps down would probably be the accelero or going h2o, and for that hassle I'd rather just swap to a Vapor-X card for better quality components and mildly better cooling.


You could get a watercooling backplate and put it on there; they are relatively inexpensive and would definitely help with your temps. I believe some folks have put small sinks on the back of the card; since it lies flat, a little thermal pad is all you would need.

You have to keep in mind that pretty much all of the coolers that have the round pump with the notches for mounting are more or less identical, each a rebranding of an Asetek model. The newest ones, like the Thermaltake 3.0, are the third revision from Asetek and have improved on pump noise/heat output to help reduce temps. The largest variation between brands is typically what fan(s) are included and how long the hoses are.


----------



## kpoeticg

Thanx for those comparisons Kizwan. I really had no intention of upgrading to the Beta til now. After all the headaches i had trying to fix my old black screen plagued card, i've only used the last 2 official releases since. That's a noticeable improvement tho.

Not sure if it's enough to get me to leave Windows 7 tho


----------



## DeadlyDNA

Thoroughly testing some 290x cards. I forgot just how loud the reference cooler is... hahaha I am spoiled by watercooling now


----------



## DeadlyDNA

is Hynix the better memory or just a myth?


----------



## sugarhell

Quote:


> Originally Posted by *DeadlyDNA*
> 
> is Hynix the better memory or just a myth?


Better latency, better OC speeds. Depends on the BIOS. Sapphire, HIS, and Asus are the best.


----------



## DeadlyDNA

Quote:


> Originally Posted by *sugarhell*
> 
> Better latency, better OC speeds. Depends on the BIOS. Sapphire, HIS, and Asus are the best.


I think all my ex-miner 290X cards are Hynix. Hmmmm, am I that lucky? I never get anything golden, so I will take a better memory chip then


----------



## heroxoot

Quote:


> Originally Posted by *DeadlyDNA*
> 
> is Hynix the better memory or just a myth?


better everything for sure.


----------



## Roboyto

Quote:


> Originally Posted by *DeadlyDNA*
> 
> is Hynix the better memory or just a myth?


Typically the Hynix outperforms the Elpida and usually has a better probability of OC. However, there are a few users in this thread who have outstanding performance from Elpida memory. One of my XFX cards was equipped with Hynix and was an underwhelming performer; I barely got a 10% OC out of the memory... just luck of the draw.

For these cards though, RAM speeds don't affect performance all that much because of the enormous 512-bit bus. Your best bet is to OC the core as far as you can and work on the memory afterwards.

You also have to take into consideration these cards have 5GHz RAM modules by default, unlike last Gen which were 6GHz by default. It is much less likely to see such high RAM speeds on these cards.
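The 512-bit bus point above is easy to put numbers on: peak GDDR5 bandwidth is just the effective data rate times the bus width. A quick sketch (the specs are the cards' stock figures; the helper function is illustrative, not from any tool in this thread):

```python
def gddr5_bandwidth_gbs(effective_clock_ghz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: effective data rate x bus width / 8 bits per byte."""
    return effective_clock_ghz * bus_width_bits / 8

# R9 290/290X: 5 GHz effective GDDR5 on a 512-bit bus
print(gddr5_bandwidth_gbs(5.0, 512))   # 320.0 GB/s
# Last gen (e.g. 7970 GHz Edition): 6 GHz effective on a 384-bit bus
print(gddr5_bandwidth_gbs(6.0, 384))   # 288.0 GB/s
# Even a modest 10% memory OC (5.5 GHz effective) on the 512-bit bus:
print(gddr5_bandwidth_gbs(5.5, 512))   # 352.0 GB/s
```

So the 290 starts well ahead on bandwidth even at its lower memory clock, which is why core OC tends to pay off more than chasing high RAM speeds.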


----------



## Arizonian

@kizwan

Great work on the driver comparison chart.









Good to see 14.6 beta performance boost to the benchmarks.


----------



## Jflisk

Quote:


> Originally Posted by *DeadlyDNA*
> 
> is Hynix the better memory or just a myth?


Hynix is better than the Elpida. I have one card with each chipset; for mining, the Hynix is the bomb, the Elpida not so much. Elpida takes more to tune in and still comes in below the Hynix.


----------



## Dasboogieman

Quote:


> Originally Posted by *DeadlyDNA*
> 
> is Hynix the better memory or just a myth?


I've read a mention from Anandtech that the Hynix manufacturing process is extremely mature thus explaining the excellent performance of the RAM. The only manufacturer better than Hynix is arguably Samsung when it comes to high performance RAM.
However, Samsung GDDR5 is extremely rare so the sample size is too small to directly compare.
From what I gather, Elpida is more of a budget offering as it seems to be cheaper in the BOM.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> i play at stock, so no need extra juice but you are right. i have yet to figure out how much added volts is needed when i am benching. my current estimate using Trixx is about +40 offset, which is a lot.
> 
> thanks. i'll try to play BF4 at 1200 and compare my fps.
> nice work K. +rep. not sure how to spoiler them. sorry.
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Thanks!

Wow you have been busy









Benched these again with tessellation on and with degraded IMC on Triple Ch ........

HOMECINEMA-PC [email protected]@2428 TRI 290 [email protected] *31979* Tess On









http://www.3dmark.com/3dm11/8382588
No 4 in HOF for TRI ........ Test version 1.0.5

HOMECINEMA-PC [email protected]@2428 TRI 290 [email protected] *25063* Tess On









http://www.3dmark.com/fs/2215560
No 9 in HOF for TRI


----------



## Arizonian

^^^^Congrats^^^^ HOMECINEMA-PC


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Arizonian*
> 
> ^^^^Congrats^^^^ HOMECINEMA-PC


Thanks maaaaate








Thought you would dig that








Gonna do a CF and a single run tess on then install this 14.6 beta ....... when I get back from breakfast


----------



## El-Fuego

Has anyone upgraded from a 7870? I'm running two 7870s in CrossFire and now running into problems with games like Watch Dogs; I had to lower all the settings to medium to get a playable fps. I know I also have some low fps due to HDD load times (I see my HDD working harder with this game), but anyway, I feel I need to change to something newer since mine is getting older and more and more games will give me the same problem now.


----------



## Nephalem

Quote:


> Originally Posted by *El-Fuego*
> 
> Has anyone upgraded from a 7870? I'm running two 7870s in CrossFire and now running into problems with games like Watch Dogs; I had to lower all the settings to medium to get a playable fps. I know I also have some low fps due to HDD load times (I see my HDD working harder with this game), but anyway, I feel I need to change to something newer since mine is getting older and more and more games will give me the same problem now.


Just a thought: have you disabled CrossFire when playing Watch_Dogs? Because I know everyone here is saying that CrossFire/SLI causes issues.


----------



## Aussiejuggalo

Anyone else having driver crashes with 14.4 WHQL and Minecraft? It's been doing it the last couple of days.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *El-Fuego*
> 
> Has anyone upgraded from a 7870? I'm running two 7870s in CrossFire and now running into problems with games like Watch Dogs; I had to lower all the settings to medium to get a playable fps. I know I also have some low fps due to HDD load times (I see my HDD working harder with this game), but anyway, I feel I need to change to something newer since mine is getting older and more and more games will give me the same problem now.


SSD for your sig rig ?

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Anyone else having driver crashes with 14.4 WHQL and Minecraft? It's been doing it the last couple of days.


Sorry mate not me


----------



## DeadlyDNA

Welp, I am about to drown my 4 290Xs in water; they appear to be good to go. All Hynix memory as well, so I hope they will OC well


----------



## El-Fuego

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> SSD for your sig rig ?


No, I'm planning on getting one for the OS, but that won't change much since I most likely won't put games on it, unless I buy two: one for the OS and one for games.


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Thanks maaaaate
> 
> 
> 
> 
> 
> 
> 
> 
> Thought you would dig that
> 
> 
> 
> 
> 
> 
> 
> 
> Gonna do a CF and a single run tess on then install this 14.6 beta ....... when I get back from breakfast


You truly are Mad.....but in a good way of course








Quote:


> Originally Posted by *DeadlyDNA*
> 
> Welp i am about to drown my 4 290x's in water, they appear to be good to go. All hynix memory as well so i hope they will OC well


Good luck !!


----------



## Kriant

Sooo....question here guys:
I have tri-fire R9 290's under water....I also have one R9 290x lying around. Should I: a) add R9 290x as my main card for quadfire and call it a day. b) sell tri-fire R9 290 and swap for R9 290x tri-fire ?

Which will benefit me more at the end of the day?


----------



## Forceman

Quote:


> Originally Posted by *Kriant*
> 
> Sooo....question here guys:
> I have tri-fire R9 290's under water....I also have one R9 290x lying around. Should I: a) add R9 290x as my main card for quadfire and call it a day. b) sell tri-fire R9 290 and swap for R9 290x tri-fire ?
> 
> Which will benefit me more in the end of the day?


Why not C) sell the 290X and get a bigger SSD to put your OS and games on instead? Diminishing returns with quadfire, and not much of a boost over 290 by going all 290X.


----------



## Kriant

Quote:


> Originally Posted by *Forceman*
> 
> Why not C) sell the 290X and get a bigger SSD to put your OS and games on instead? Diminishing returns with quadfire, and not much of a boost over 290 by going all 290X.


I've already snatched those tasty SSDs for $37.50 from Amazon during the PNY price error sale


----------



## Sgt Bilko

Quote:


> Originally Posted by *Kriant*
> 
> Sooo....question here guys:
> I have tri-fire R9 290's under water....I also have one R9 290x lying around. Should I: a) add R9 290x as my main card for quadfire and call it a day. b) sell tri-fire R9 290 and swap for R9 290x tri-fire ?
> 
> Which will benefit me more in the end of the day?


Add the 290x as the main card then call it a day unless you are planning on 3 x 4k monitors


----------



## Kriant

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Add the 290x as the main card then call it a day unless you are planning on 3 x 4k monitors


Nah, I'm perfectly fine with my 1080p Eyefinity setup; besides, 4K will require 8 gigs of VRAM per card lol.

Ok, then quad-fire it is yey.


----------



## ebhsimon

Quote:


> Originally Posted by *Kriant*
> 
> Sooo....question here guys:
> I have tri-fire R9 290's under water....I also have one R9 290x lying around. Should I: a) add R9 290x as my main card for quadfire and call it a day. b) sell tri-fire R9 290 and swap for R9 290x tri-fire ?
> 
> Which will benefit me more in the end of the day?


When you say benefit you at the end of the day, are you actively using these GPUs for something? Are you willing to pay the price premium for the extra 5% performance? Are you willing to put in a few more hours of work?

I'd say keep the 290 tri-fire and sell the 290X unless you really need it (most people think they do, but they don't). Or maybe chuck it in an HTPC?


----------



## StonedAlex

What kind of temps are you guys getting with the R9 290 Tri-X in crossfire? I'm thinking about picking a couple of these up since they are so cheap.


----------



## Kriant

Quote:


> Originally Posted by *StonedAlex*
> 
> What kind of temps are you guys getting with r9 290 trixx in crossfire? I'm thinking about picking a couple of these up since they are so cheap.


with or without water?
without they are hot and loud and hot and loud, and I wouldn't put them close together without some additional fans to pull air through cards.


----------



## StonedAlex

Quote:


> Originally Posted by *Kriant*
> 
> with or without water?
> without they are hot and loud and hot and loud, and I wouldn't put them close together without some additional fans to pull air through cards.


Without. That's what I was afraid of. My case doesn't have that much room either. I guess I'll just stick with my 780 ti. I'm really itching to upgrade something though lol.


----------



## the9quad

Quote:


> Originally Posted by *Kriant*
> 
> with or without water?
> without they are hot and loud and hot and loud, and I wouldn't put them close together without some additional fans to pull air through cards.


If you have a case with good air flow, they are just loud, not so hot really. I have 3 ref cooler 290Xs and it's just loud. Doesn't bother me enough to drop another grand and a half on putting them under water though.


----------



## Jeffro422

Quote:


> Originally Posted by *the9quad*
> 
> If you have a case with good air flow, they are just loud, not so hot really. I have 3 ref cooler 290Xs and it's just loud. Doesn't bother me enough to drop another grand and a half on putting them under water though.


My ref 290X hits 94C even with the fan at 50%, which you know is loud. For about $100 you can buy the Kraken G10, an AIO water cooler, and heatsinks for the VRAM and VRMs and be good to go. Not as good as a full block watercooling solution, but it does wonders for the 290X.


----------



## the9quad

Quote:


> Originally Posted by *Jeffro422*
> 
> My ref 290X hits 94C even with the fan at 50%, which you know is loud. For about $100 you can buy the Kraken G10, an AIO water cooler, and heatsinks for the VRAM and VRMs and be good to go. Not as good as a full block watercooling solution, but it does wonders for the 290X.


My cards stay around 68% fan speed when gaming and 78C. An AIO cooler is not an option in tri-fire; too big.


----------



## kizwan

Quote:


> Originally Posted by *kpoeticg*
> 
> Thanx for those comparisons Kizwan. I really had no intention of upgrading to the Beta til now. After all the headaches i had trying to fix my old black screen plagued card, i've only used the last 2 official releases since. That's a noticeable improvement tho.
> 
> Not sure if it's enough to get me to leave Windows 7 tho


I'm lucky because I don't have serious problems with beta drivers. Just the typical crash & BSOD when overclocking. Usually when the water in the loop gets hot, even though card temps are not high. I can play BF4 @1200/1500 or 1200/1600 with +200mV but may crash after a couple of hours in game.

BTW, with the 14.6 beta driver, I'm getting a black screen when I reset the clocks to default/stock (basically when I press the reset button in Trixx). Windows hasn't crashed though, because I can see SSD activity. No black screen when idle or when under load though. I don't have this problem with previous drivers. The workaround would be restarting the computer instead of resetting the clocks in Windows.

Yeah, I want to try Windows 8 too, but maybe when I have an extra SSD. I don't know whether it's worth it to format my current Windows 7 install while I don't have problems with it.
Quote:


> Originally Posted by *Arizonian*
> 
> @kizwan
> 
> Great work on the driver comparison chart.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Good to see 14.6 beta performance boost to the benchmarks.


Thanks! The performance boost is pretty consistent too.







I want to do a BF4 comparison, but I don't remember what map I played last time & also it's time consuming.
Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Wow you have been busy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Benched these again with tessellation on and with degraded IMC on Triple Ch ........
> 
> HOMECINEMA-PC [email protected]@2428 TRI 290 [email protected] *31979* Tess On
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8382588
> No 4 in HOF for TRI ........ Test version 1.0.5
> 
> HOMECINEMA-PC [email protected]@2428 TRI 290 [email protected] *25063* Tess On
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/2215560
> No 9 in HOF for TRI


Thanks madman!









Nice scores! Congrats!








Quote:


> Originally Posted by *El-Fuego*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HOMECINEMA-PC*
> 
> SSD for your sig rig ?
> 
> 
> 
> > No, I'm planning on getting one for the OS, but that won't change much since I most likely won't put games on it, unless I buy two: one for the OS and one for games.

Buy two 128GB SSDs & RAID 0 them. Enough for the OS and games, and you get a nice performance boost from RAID 0.
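For anyone weighing the RAID 0 suggestion, the performance boost comes from striping: data is split into fixed-size chunks written alternately to each drive, so sequential transfers hit both drives at once. A toy sketch (the stripe size and helper are made up for illustration; note RAID 0 has no redundancy, so one failed drive loses the whole array):

```python
# Toy illustration of RAID 0 striping. Real arrays use stripe sizes
# like 64KB-128KB; 4 bytes here just to make the split visible.
STRIPE = 4

def stripe(data: bytes, n_drives: int = 2):
    """Distribute data across drives in round-robin stripes."""
    drives = [bytearray() for _ in range(n_drives)]
    for i in range(0, len(data), STRIPE):
        drives[(i // STRIPE) % n_drives] += data[i:i + STRIPE]
    return drives

d0, d1 = stripe(b"ABCDEFGHIJKLMNOP")
print(d0)  # bytearray(b'ABCDIJKL')
print(d1)  # bytearray(b'EFGHMNOP')
```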


----------



## rv8000

What's the deal with the resale market for the 290/290X? Did it really bottom out that badly with the flood of used miner cards? I see some cards going for ridiculously low prices; I probably can't even get $320 for my Tri-X







*shakes fists at miners*


----------



## chronicfx

Quote:


> Originally Posted by *rv8000*
> 
> What's the deal with the resale market for the 290/290X? Did it really bottom out that badly with the flood of used miner cards? I see some cards going for ridiculously low prices; I probably can't even get $320 for my Tri-X
> 
> 
> 
> 
> 
> 
> 
> *shakes fists at miners*


Now people are raging that you can pick one up dirt cheap


----------



## hotrod717

Quote:


> Originally Posted by *chronicfx*
> 
> Now people are raging that you can pick one up dirt cheap


I think it's more about miners screwing the market up, yet again. The swings are extreme. You pay $600 for a card and a month later you can barely get half its price on return. Also, there's more of a risk of getting a biffed card if you're buying used.


----------



## eternal7trance

I love all the miners dumping their cards now. So many dirt cheap 290s


----------



## rdr09

Quote:


> Originally Posted by *chronicfx*
> 
> Now people are raging that you can pick one up dirt cheap


depends which side of the fence you are standing on - buyer or seller.


----------



## chronicfx

Yeah, I know, it was a little joke. I understand perfectly, as I have a handful of 7970s sitting on a shelf from my mining days that I refuse to sell for the paltry $125-$150 you end up with after eBay fees. I actually think they may have greater value a year from now. But from his perspective (and my own) it stinks, because I was hoping that selling them would fund my upgrade to Broadwell or even Devil's Canyon, but after selling four cards just to get a Maximus and a CPU... well... I bet the framerate increase over my current i5 setup will not even be worth it. The prices are definitely in the buyer's court right now.


----------



## eternal7trance

Quote:


> Originally Posted by *chronicfx*
> 
> Yeah, I know, it was a little joke. I understand perfectly, as I have a handful of 7970s sitting on a shelf from my mining days that I refuse to sell for the paltry $125-$150 you end up with after eBay fees. I actually think they may have greater value a year from now. But from his perspective (and my own) it stinks, because I was hoping that selling them would fund my upgrade to Broadwell or even Devil's Canyon, but after selling four cards just to get a Maximus and a CPU... well... I bet the framerate increase over my current i5 setup will not even be worth it. The prices are definitely in the buyer's court right now.


Why don't you sell them for 200-220 on here then?


----------



## rdr09

Quote:


> Originally Posted by *chronicfx*
> 
> Yeah I know it was a little joke. I understand perfectly as I have a handful of 7970's sitting on a shelf from my mining days that I refuse to sell for the paultry $125-$150 you end up with after ebay fees. I actually think they may have greater value a year from now. But from his perspective (and my own) it stinks because I was hoping that selling them would fund my upgrade to broadwell or even devils canyon but after selling four cards just to get a maximus and a cpu... well...l bet the framerate increase over my current i5 setup will not even be worth it. The prices are definitely in the buyers court right now.


i read the market fine last december when i sold my 7950 for $325 and my 7970 for $375. those prices are more than what i paid for them after using them 6 months or so. i don't mine, so nothing really stopped me from selling them.


----------



## chronicfx

Quote:


> Originally Posted by *rdr09*
> 
> i read the market fine last december when i sold my 7950 for $325 and my 7970 for $375. those prices are more than what i paid for them after using them 6 months or so. i don't mine, so nothing really stopped me from selling them.


I was still mining then. Right up to the point where it became negative profit without day trading, watching charts and candlesticks all day. I knew the market would be flooded, but I anticipated making a gaming computer for each card and craigslisting them for extra $$. She decided for us to do the roof, windows, and siding, build a retaining wall, give my kids 2 extra days at preschool, hire a landscaper full time, and flatten and reseed the entire yard instead. Back to work for me lol. No $$$$


----------



## melodystyle2003

I think I am ready to sail with red flags, at least until Maxwell arrives.
How does the Sapphire Tri-X OC R9 290 (no X) stand as a choice? Some reviews I've read were OK, but user experience is what I am looking for.









Thanks!


----------



## rdr09

Quote:


> Originally Posted by *chronicfx*
> 
> I was still mining then. Right up to the point where it became negative profit without day trading, watching charts and candlesticks all day. I knew the market would be flooded, but I anticipated making a gaming computer for each card and craigslisting them for extra $$. She decided for us to do the roof, windows, and siding, build a retaining wall, give my kids 2 extra days at preschool, hire a landscaper full time, and flatten and reseed the entire yard instead. Back to work for me lol. No $$$$


ahhhh, the joy of owning a home and having kids. i just had my house painted and i can't imagine how much the re-roofing is gonna cost. my house was built back in 1999, so it is coming. $15K maybe.

edit: anyway, i am happy with 14.6 Beta for my single 290 and single monitor. i played Shogun, BI, BF3 & 4, C3 and have yet to try Metro and C2.


----------



## Sgt Bilko

Just started folding again....270k ppd so far on stock clocks


----------



## Durvelle27

Guys i really need some help


----------



## Dasboogieman

Quote:


> Originally Posted by *melodystyle2003*
> 
> I think I am ready to sail with red flags, at least until Maxwell arrives.
> How does the Sapphire Tri-X OC R9 290 (no X) stand as a choice? Some reviews I've read were OK, but user experience is what I am looking for.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks!


Essentially inaudible up to 47% fan speed. Barely audible at 50%. From here, the fan noise increases almost exponentially with every 10% extra. Loud at 100%, sounds like a hairdryer except less screeching and more whooshing.
At 50%, the cooler has capacity for 1100mhz core, 1500mhz RAM and +12mV keeping the core at around 72ish and the VRMs at 75-78.
The cooler tops out at around +125mV (even with Fujipoly mods) at 100% fan speed, due to the VRMs reaching 90+.

Big Pluses:
1. Insane cooling capacity, especially for the core, adequate for the VRMs
2. 2 Slot
3. Hynix Memory
4. Very quiet

Some minor annoyances you may notice
1. Mild to god awful coil whine depending on your luck and hearing sensitivity. Can be fixed by hot glue gun mod or simply taping the chokes with Sekisui 5760
2. An aftermarket WC backplate is impossible due to the use of unusual M1.5 7mm screws, which have no longer variants available.
3. May notice resonance noise. This is caused by the fan assembly vibrating against the Heatsink, very very obvious if present. Manifests at around 47-50% fan speed and 55%. Can be fixed by wedging rubber (I cut up an old kitchen glove) in the places where the Fan assembly touches the heatsink.
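The noise thresholds above (quiet up to ~47-50%, ramping hard past that) translate naturally into a custom fan curve of the kind you'd set in Trixx or Afterburner. A sketch of how such a curve interpolates; the (temp C, fan %) points below are illustrative guesses built around those thresholds, not Sapphire's stock curve:

```python
# Hypothetical fan curve: stay at/below ~50% where the Tri-X is quiet,
# then ramp hard past 75C to keep the VRMs out of the 90+ range.
CURVE = [(0, 20), (60, 35), (70, 47), (75, 55), (85, 80), (95, 100)]

def fan_percent(temp_c: float) -> float:
    """Linearly interpolate fan duty cycle between curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pin at max above the last point

print(fan_percent(65))   # 41.0 -- still in the "inaudible" band
print(fan_percent(72))   # ~50.2 -- crossing the audible threshold
print(fan_percent(100))  # 100 -- flat out
```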


----------



## bond32

Seeing about 200 point score increase in firestrike with the new beta 14.6 drivers... Still running tests, but here's the best run so far at 1280/1625:

http://www.3dmark.com/3dm/3176965?


----------



## melodystyle2003

Quote:


> Originally Posted by *Dasboogieman*
> 
> Essentially inaudible up to 47% fan speed. Barely audible at 50%. From here, the fan noise increases almost exponentially with every 10% extra. Loud at 100%, sounds like a hairdryer except less screeching and more whooshing.
> At 50%, the cooler has capacity for 1100mhz core, 1500mhz RAM and +12mV keeping the core at around 72ish and the VRMs at 75-78.
> The cooler tops out at around +125mV (even with Fujiploy mods), 100% fan speed due to the VRMs reaching 90+.
> 
> Big Pluses:
> 1. Insane cooling capacity, especially for the core, adequate for the VRMs
> 2. 2 Slot
> 3. Hynix Memory
> 4. Very quiet
> 
> Some minor annoyances you may notice
> 1. Mild to god awful coil whine depending on your luck and hearing sensitivity. Can be fixed by hot glue gun mod or simply taping the chokes with Sekisui 5760
> 2. Aftermarket WC Backplate impossible due to the usage of unusual M1.5 7mm screws which have no longer variants.
> 3. May notice resonance noise. This is caused by the fan assembly vibrating against the Heatsink, very very obvious if present. Manifests at around 47-50% fan speed and 55%. Can be fixed by wedging rubber (I cut up an old kitchen glove) in the places where the Fan assembly touches the heatsink.


Complete answer. +rep


----------



## bond32

Quote:


> Originally Posted by *Dasboogieman*
> 
> Essentially inaudible up to 47% fan speed. Barely audible at 50%. From here, the fan noise increases almost exponentially with every 10% extra. Loud at 100%, sounds like a hairdryer except less screeching and more whooshing.
> At 50%, the cooler has capacity for 1100mhz core, 1500mhz RAM and +12mV keeping the core at around 72ish and the VRMs at 75-78.
> The cooler tops out at around +125mV (even with Fujipoly mods), 100% fan speed due to the VRMs reaching 90+.
> 
> Big Pluses:
> 1. Insane cooling capacity, especially for the core, adequate for the VRMs
> 2. 2 Slot
> 3. Hynix Memory
> 4. Very quiet
> 
> Some minor annoyances you may notice
> 1. Mild to god awful coil whine depending on your luck and hearing sensitivity. Can be fixed by hot glue gun mod or simply taping the chokes with Sekisui 5760
> 2. Aftermarket WC Backplate impossible due to the usage of unusual M1.5 7mm screws which have no longer variants.
> 3. May notice resonance noise. This is caused by the fan assembly vibrating against the Heatsink, very very obvious if present. Manifests at around 47-50% fan speed and 55%. Can be fixed by wedging rubber (I cut up an old kitchen glove) in the places where the Fan assembly touches the heatsink.


If I recall, some had luck with rubbing the VRM with acetone (nail polish remover). Might be something to try if coil whine is an issue.


----------



## rv8000

Quote:


> Originally Posted by *melodystyle2003*
> 
> Complete answer. +rep


I agree with everything his post said. Coil whine is just a luck thing; I have zero whine even under heavy loads and insane fps (first card without any in a long time). Also, at 50% the fan starts to become somewhat loud for me, but with no annoying whines or sounds from the fans, just the sound of air moving and fans spinning. Good card overall. If you're buying new I'd look at the Vapor-X.


----------



## sugarhell

Quote:


> Originally Posted by *rv8000*
> 
> I agree with all his post said. Coil whine is just a luck thing, I have zero whine even under heavy loads and insane fps (first card without any in a long time). Also at 50% the fan starts to became somewhat loud for me, no annoying whines or sounds from the fans just the sound of air moving and fans spinning. Good card overall. If you're buying new I'd look at the vapor-x.


Just put some neutral-cure silicone on the chokes.


----------



## melodystyle2003

The Vapor-X is more expensive by a significant margin, even pricier than a GTX 780, and if needed I will move to an AIO solution on the Tri-X.
Thank you all for your clarifications and inputs.


----------



## pdasterly

If you get the AIO cooler setup, then the reference card will be fine, and you could sell the fans to recoup some of the cost


----------



## melodystyle2003

The reference card costs more than the Tri-X, as far as I can see. I will go AIO only if needed, since you can keep it and use it on another GPU. Plus, I have some of the AIO setup almost ready, but no time for mods and tests at the moment.


----------



## chalkypink

Finally got my Kryographics waterblock installed (with the "active" backplate for vrm cooling). Temps are amazing now.
1225/1467, no artifacts for 30 minutes of Valley (ExtremeHD preset), and it never broke 50 degrees. It's idling at around 35 though. Is that high? I think I might have put too thick a layer of TIM on when installing the block. In any case, I don't think it really matters if the idle is a little high as long as the load temps are fine, right?


----------



## Durvelle27

How do I connect 3x VGA monitors to my 290X guys. Thanks for any help.


----------



## pkrexer

Finally crossfired now. Just put my 2nd (XFX 290X) under water, so please update me







Pretty happy with the results. Both are flashed to the 290X PCS BIOS (to get the +50mV by default) and running at 1150/1450.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *chalkypink*
> 
> Finally got my Kryographics waterblock installed (with the "active" backplate for vrm cooling). Temps are amazing now.
> 1225/1467 no artifacts for 30 minutes of Valley (extremehd preset) and never broke 50 degrees. It's idling at around 35 though. Is that high? Think I might have put too thick a layer of TIM on when installing the block. In any case I don't think it really matters if the idle is a little high as long as the load temps are fine, right?


Good OC, what voltage offset are you running?

Full water loops make your GPU + CPU idle warmer; water has a much higher heat capacity than air, so the water loop holds heat much, much longer. My 3770K used to idle at ambient with my old H60i; now on a full loop it idles a couple of °C above ambient on the coolest cores.
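A rough back-of-the-envelope sketch of that heat-capacity point (textbook specific heats; the loop volume is an assumed figure, not taken from any build in this thread):

```python
# Back-of-the-envelope comparison of water vs air as a heat reservoir.
# Specific heats are textbook values; the loop volume is a hypothetical
# example, not measured from any particular rig.
C_WATER = 4186.0  # J/(kg*K), liquid water
C_AIR = 1005.0    # J/(kg*K), air at constant pressure

loop_litres = 1.5             # assumed coolant volume in the loop
water_kg = loop_litres * 1.0  # ~1 kg per litre of water

# Energy needed to warm the coolant by just 2 K:
joules = water_kg * C_WATER * 2.0
print(f"Warming {loop_litres} L of coolant by 2 K stores {joules / 1000:.1f} kJ")

# Per kilogram, water soaks up far more heat than air for the same temp rise:
print(f"Water/air heat-capacity ratio: {C_WATER / C_AIR:.1f}x")
```

With those numbers, even a modest loop stores over 12 kJ for every 2 K of coolant warm-up, which is why idle temps drift back down slowly after load instead of snapping back the way an air cooler does.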


----------



## MojoW

Quote:


> Originally Posted by *MojoW*
> 
> I'm joining in, with my Sapphire R9 290 with elpida memory.
> It's running on a comfortable 1100/1300 on the stock uber bios with the stock cooler.(Custom fan profile)
> I'm running Windows 8.1 with 13.11 B9.2 no black screens so far but i always clean thoroughly after driver uninstall.(Came from 6990 quad)
> 
> (Don't mind the pic my phone lens is busted)


Hey Arizonian,

I'm crossfired now could you add me for a second 290?
Both cards are reference Sapphire on stock cooling.
Hynix and elpida.

Validation 1
Validation 2

Thank you.


----------



## Arizonian

Quote:


> Originally Posted by *MojoW*
> 
> Hey Arizonian,
> 
> I'm crossfired now could you add me for a second 290?
> Both cards are reference Sapphire on stock cooling.
> Hynix and elpida.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Validation 1
> Validation 2
> 
> 
> 
> Thank you.


Sure can. Congrats - updated








Quote:


> Originally Posted by *pkrexer*
> 
> Finally crossfired now. Just put my 2nd (xfx 290x) under water, so please update me
> 
> 
> 
> 
> 
> 
> 
> Pretty happy with the results. Both flash to the 290x PCS bios (to get the +50mv by default) and running 1150/1450.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> How do I connect 3x VGA monitors to my 290X guys. Thanks for any help.


DVI to VGA adapters and an HDMI to VGA adapter would be the simplest way imo


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Dvi to VGA adapters and a hdmi to vga adapter would be the most simple way imo


Have you tried DVI to VGA ?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> Have you tried DVI to VGA ?


I don't have VGA monitors anymore, sorry.

Can't see why it wouldn't work though. HDMI to VGA might be different though


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I dont have vga monitors anymore sorry
> 
> Cant see why it wouldnt work though. Hdmi to vga might be different though


People keep telling me the R9 290(X) no longer supports analogue (VGA) output


----------



## Dasboogieman

Quote:


> Originally Posted by *bond32*
> 
> If I recall, some had luck with rubbing the vrm with acetone (nail polish). Might be something to try if coil whine is an issue.


Youch, sounds a bit extreme, but whatever works. I thought of applying the tape since, when I had a close look at the choke, it's actually a cube made up of two metal sub-pieces with a horizontal seam. I reasoned that PowerColor were on to something when they smeared hot glue all over theirs. It must be the two pieces vibrating together, so I just ringed the cube with adhesive tape.
Quote:


> Originally Posted by *melodystyle2003*
> 
> The vapor-x is by a significant margin more expensive, even higher than gtx780 and if needed i will proceed to an AIO solution on the tri-x.
> Thank you all for your clarifications and inputs.


Actually, the Vapor-X brings some additional benefits that I believe are worth the extra cost. The VRM assembly is much larger (bigger surface area for heat transfer, which the reference design lacks) and connected better via the vapor chamber, so your overvoltage headroom under air is much, much better. Plus, the backplate is a huge help when it comes to additional RAM and VRM cooling. Not many users have reported back, but I believe the Vapor-X can sustain at least +168mV continuously, which for me on the Tri-X would allow 1200/1500MHz clocks at maybe 60%-ish fan speeds. The lack of a backplate on the Tri-X makes it difficult to rig additional underside cooling for the VRM. In a nutshell, if you don't intend to custom watercool, then the Vapor-X is better.

I actually had an idea to place an NVIDIA GT 210 cooler directly on the VRM area, if only it were possible to add a backplate to the Tri-X.

I did have a phase where I used an AIO cooler on my Tri-X. The biggest problem was never core temperatures; it was actually VRM and VRAM temperatures. The G10 can't guarantee adequate cooling for either. The VRM cooling was inferior to the Tri-X cooler even with the GELID enhancement kit, plus I had no idea how the RAM was doing, but after a few games of BF3 I started getting memory artifacts at stock, so I reverted to air cooling to wait for a proper WC loop. On my MSI 270X, the RAM can get up to 80 degrees when overclocked to 1500MHz, and that's with the Hawk cooler + cooling plate for the RAM.


----------



## DeadlyDNA

So I removed the coolers on my 290X cards and I noticed 2 of them have capacitors that are a hair taller. In the pic, it's the blue caps on the cards in the middle.



Will this be an issue with waterblocks?


----------



## ebduncan

Not sure if this is any good, but here is a run of 3DMark I did a bit ago with my R9 290 under water: 1250/1650 using the 14.4 drivers.

http://www.3dmark.com/fs/2123899

And yes, the 290(X) series cards do not support analog connections; digital only.


----------



## pdasterly

Ugh, screen tearing is an issue now


----------



## Irev

Hi fellas

I have a Sapphire R9 290 Tri-X OC card. I'm running it @ 1075 core and 1300 memory with +50% power limit but no voltage changes. Can anyone confirm whether this is going to hurt the card in daily use at those clocks? I'm just unsure...

I have managed to get 1125 core with no voltage adjustments before artifacts. I haven't tried changing voltage, but I'm happy to run @ 1075 providing it won't do long-term damage; I plan on keeping the card for at least 2 years. A reply from someone with some knowledge would be great!


----------



## Dasboogieman

You are fine. The Tri-X can sustain up to +125mV daily provided the fan speed is at least 50%. It can sustain about +168mV for short periods with at least 80% fan speeds.


----------



## rdr09

Quote:


> Originally Posted by *ebduncan*
> 
> Not sure if this is any good
> 
> but here is a run of 3dmark i did a bit ago with my R9-290 under water. 1250/1650 using the 14.4 drivers.
> 
> http://www.3dmark.com/fs/2123899
> 
> and yes the 290/X series cards do not support analog connections, *Digital only*.


I did not know this. Even with the use of an adapter? I may have to try it myself.

Anyway, ed, you can achieve about the same score in that bench using the latest beta at only 1200 on the core. It is like AMD turned our 290 into a 290X. But of course, the 290X is still faster.


----------



## Widde

Did a Heaven run with the 14.6 betas, tess off







http://piclair.com/3am6n


----------



## melodystyle2003

Quote:


> Originally Posted by *Dasboogieman*
> 
> Actually, the Vapor X brings some additional benefits that I believe are worth the extra cost. The VRM assembly is much larger (bigger surface area for heat transfer which the reference design lacks) and connected better via Vapor Chamber so your Over voltage headroom under Air is much much better. Plus, the Backplate is of huge help when it comes to additional RAM and VRM cooling. Not many users have reported back but I believe the Vapor X can sustain at least +168mV continuously, which for me on the Tri X would allow 1200/1500mhz clocks maybe at 60% ish fan speeds. The lack of backplate on the Tri X makes it difficult to rig additional underside cooling for the VRM. In a nutshell, if you don't intend to custom watercool, then the Vapor X is better.
> 
> I actually had an idea to place an NVIDIA GT 210 cooler directly on the VRM area if only it were possible to add a backplate to the Tri X.
> 
> I did have a phase where I placed an AIO cooler on my Tri X. The biggest problem was never core temperatures, it was actually VRM and VRAM temperatures. The G10 can't guarantee adequate cooling on neither. The VRM cooling was inferior to the Tri X cooler even with the GELID Enhancement kit plus I have no idea how the RAM was doing but after a few games of BF3, I started getting memory artifacts at stock so I reverted back to air cooling to wait for a proper WC loop. On my MSI 270X, it shows the RAM can get up to 80 degrees when overclocked to 1500mhz and thats with the Hawk cooler + cooling plate for the RAM.


Thanks for your input! Your experience with the AIO and GELID kit gives me pause, tbh.
The fact is that I can find the 290 Tri-X OC close to the GTX 770's price, a card I also had and returned. The GTX 780 was too much for 1080p to my eyes, while the GTX 770 was fine, so getting nearly 780 performance at the same price as the latter is enough for my standards, at least until GM2xx arrives. Thus, having the 290 Tri-X OC at stock or at 1.1/1.4GHz, which as I read is almost surely possible with low noise and, if lucky, without coil whine, is the sweet spot I am after.
In a nutshell, I don't intend to spend time modding its cooling system, since it's capable and has some OC headroom. I would like to unpack it, install it in my PC, clock it to 1.1/1.4GHz and, when I have time, enjoy some gaming.


----------



## Brian18741

http://www.3dmark.com/fs/2224915

My CPU cooler died a little while ago, so I'm in the process of saving up to put everything under water, but until then everything is stock. 14.3 drivers. Is this result any good?

:EDIT: Also, what sort of score should I be aiming for? My CPU usually runs at 4.5GHz (maybe more when WC'd) and I'll squeeze as much as I can out of the cards also.

:EDIT 2: Just upgraded to the 14.6 drivers and squeezed another 388 points out of Fire Strike! http://www.3dmark.com/fs/2225056

It also added another 7°C to my top card though! It maxed out at 87°C this run, with GPU and case fans on 100%. Need to get this rig under water asap!


----------



## Talon720

I found out, at least I'm pretty sure, that I have a bad card. It's strange: it artifacted when first put in, and then worked flawlessly for a bit. I isolated it using tessellation in the Heaven benchmark and BF4; it doesn't artifact in Valley or Heaven with tessellation disabled. I tried overvolting core/aux and underclocking core/mem with no change. I've also removed some of the stickers on the back for the backplate that covered the VRM. It's a reference Sapphire, and their warranty doesn't look good for my chances of RMA: they state no modifications, no removal of any stickers (even though it's in a dumb spot). Oh, and I'm the second owner, though I think the guy sent me the invoice. Am I screwed?


----------



## chronicfx

Quote:


> Originally Posted by *Talon720*
> 
> I found out, at least I'm pretty sure, thatI I have a bad card. Even though its strange it artifacted when first put in, and then worked flawlessly for a bit. I isolated it using tessellation in Heaven benchmark and bf4. Otherwise it doesn't artifact in Valley or Heaven with tessellation disabled. I tried overvolting core/aux, & underclocked core/mem with no change. Ive also removed some of the stickers on the back for the backplate that covered the vrm. Its a reference sapphire there warranty dosn't look good for my chances of rma. They stat no modifications no removal of any stickers (even though it in a dumb spot) Oh and im second owner though i think the guy sent me the invoice. Am i screwed?


Not necessarily. Call and tell them. I have no experience with Sapphire, but PowerColor took an RMA from me on a 7990 that I had thrown the stock thermal pads away from. Sometimes they are nice; just be honest and nice about it.


----------



## Willi

Gone from the old 6970 to the R9 290X Gigabyte with Elpida memory. Love this beast. All I have to do now is wait for the waterblock to arrive so I can put the loop back together.

For some weird reason GPU-Z crashes my system, so I'll take pictures once I take the cooler off for the waterblock. I'm keeping it at 1000MHz core/1250 mem since this beast is HOT and is still on the stock cooler. Reference card btw; keeping it as cool as possible until I get the waterblock.

EDIT: forgot to mention, running the PT1 BIOS. Rock solid so far.


----------



## KnownDragon

Loving my r9 290! Will do a verification today!


----------



## Talon720

Quote:


> Originally Posted by *chronicfx*
> 
> Not necessarily. Call and tell them, i have no experience with sapphire but powercolor took an rma from me on a 7990 that I threw away the stock thermal pads. Sometimes they are nice, just be honest and nice about it.


Well, I've read and heard PowerColor's warranty is similar to Sapphire's. I guess I should call them, even though I hate phone calls; it's harder to read people. I suppose I could email and call. Would there happen to be a Sapphire rep on this site/forums? Thanks for your advice.


----------



## rdr09

Quote:


> Originally Posted by *DeadlyDNA*
> 
> So I removed the coolers on my 290x cards and I noticed 2 of them have capacitors that are a hair taller. In the pic its the blue caps on cards in the middle.
> 
> 
> 
> Will this be an issue with waterblocks?


Deadly, I am not sure if you'll be using a better thermal pad than the one that came with the kit, but you can use the cheap pads, apply them over all the components that need them, and test for contact. This is without applying thermal paste on the core. You might want to cover the core with something clean and thin enough just to protect it from getting scratched; sort of a test run. I would go as far as tightening the screws, at least in that area.

I have a question for you . . . for a single 4K, will you need any sort of MSAA in any of the games that use it? I thought I read you said you have to in order to lessen jagginess.


----------



## DeadlyDNA

Quote:


> Originally Posted by *rdr09*
> 
> Deadly, i am not sure if you'll be using better thermal pad than the one that came with the kit but you can use the cheap pads and apply them over all the components that need them and test for contact. This is without applying thermal paste on the core. You might want to cover the core with something clean and thin enuf just to protect it from getting scratched. sort of a test run. i would go as far as tightening the screws at least in that area.
> 
> I have a question for you . . . for a single 4K, will you need any sort of MSAA in any of the games that uses it? I thought I read you said, you have to in order to lessen jagginess.


It really depends on the game, imo. Overall, the most I would use is 2x in games that can use it. It's really more a personal preference, but some games seem to reduce image quality with AA. I don't like blur, and if I were to choose, I would take jaggies over blur. I think if you're coming from 1080p or 1440p you will probably be happy with the higher resolution by itself, and AA will be more of an afterthought.


----------



## KnownDragon

Can I join? Gpu-z Validation


----------



## Arizonian

Quote:


> Originally Posted by *Willi*
> 
> gone from the old 6970 to the R9 290X Gigabyte with Elpida memory. Love this beast. All I have to do now is wait for the waterblock to arrive so I can put the loop back together.
> 
> For some weird reason GPUz crashes my system, so I'll take pictures once I take it off for the waterblock. I'm keeping it on 1000mhz core/1250mem since this beast is HOT and is still on stock cooler. Reference card btw, keeping is as cool as possible until I get the waterblock.
> 
> EDIT: forgot to mention, running PT1 BIOS. Rock solid so far.


Cool. Submit proof to join the roster.









To be added on the member list please submit the following in your post

1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
2. Manufacturer & Brand - Please don't make me guess.
3. Cooling - Stock, Aftermarket or 3rd Party Water

Quote:


> Originally Posted by *KnownDragon*
> 
> Can I join? Gpu-z Validation


I didn't know which manufacturer, so by default it's Sapphire stock cooling unless otherwise specified, since that's the most popular and raises my chances of guessing correctly.









Congrats - added


----------



## galil3o

I'm having some voltage trouble on my 290X. It's a Sapphire reference card flashed with the ASUS BIOS, and I'm using GPU Tweak to overclock (1175MHz core). I have the voltage slider maxed out at 1.412v with a 150% power target, but it sits around 1.2v to 1.25v during 100% utilization (BF4). The idle 2D voltage is pretty constant at 1.117v.

I know there is some vdroop built in, but 150 to 200mV seems like a lot under load. It's water cooled and maxes out around 60 to 65C, so I think I can go a bit further if I can get the voltage consistently up around 1.3v (it will spike to 1.312v, but only for a moment and not very often).

Anyone else experience something similar?
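For what it's worth, the droop you describe works out like this (figures taken straight from the post above; a quick sanity-check sketch, not a diagnosis):

```python
# Sanity-checking the reported vdroop: set voltage minus observed load
# voltage. All figures come from the post above.
v_set = 1.412                        # volts, slider maximum
v_load_lo, v_load_hi = 1.20, 1.25    # observed range under BF4 load

droop_max_mv = (v_set - v_load_lo) * 1000
droop_min_mv = (v_set - v_load_hi) * 1000
print(f"Effective droop: {droop_min_mv:.0f}-{droop_max_mv:.0f} mV")
# -> roughly 162-212 mV, consistent with the "150 to 200mV" estimate
```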


----------



## kizwan

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KnownDragon*
> 
> Can I join? Gpu-z Validation
> 
> 
> 
> I didn't know which manufacturer - so by default it's Sapphire stock cooliing unless otherwise specified due to popularity and highers my chances of guessing correctly. .
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
Click to expand...

Subvendor: XFX Pine Group (1682 - 9295)


----------



## KnownDragon

Quote:


> Originally Posted by *Arizonian*
> 
> Cool. Submit proof to join the roster.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> To be added on the member list please submit the following in your post
> 
> 1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
> 2. Manufacturer & Brand - Please don't make me guess.
> 3. Cooling - Stock, Aftermarket or 3rd Party Water
> I didn't know which manufacturer - so by default it's Sapphire stock cooliing unless otherwise specified due to popularity and highers my chances of guessing correctly. .
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added


xfx r9 290 reference! Stock cooler! No voltage bump overclock! firestrike

XFX RADEON R9 290 947MHz 4GB DDR5 DP HDMI 2XDVI Graphics Card (R9290AENFC

http://www.techpowerup.com/gpuz/8a58p/


----------



## pkrexer

So now that I have both my 290Xs under water, I'm seeing a pretty significant temp difference between my two cards. My 1st is maxing out around 62°C while the 2nd GPU is around 4-5°C cooler, hitting around 57°C.

Is this typical? It has me wondering if I should reinstall the block on my 1st GPU.
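If the two blocks are plumbed in series, part of that gap is expected, since one card receives coolant pre-warmed by the other. A rough sketch of the coolant temperature rise across one block, using assumed round numbers for power draw and flow rate (not measurements from this rig):

```python
# Rough estimate of coolant temperature rise across one GPU block in a
# serial loop: dT = P / (mass_flow * c_water). The power and flow
# figures below are assumed round numbers, not measured values.
C_WATER = 4186.0            # J/(kg*K), specific heat of water

power_w = 300.0             # assumed heat dumped by one overclocked 290X
flow_lpm = 1.0              # assumed loop flow in litres per minute
mass_flow = flow_lpm / 60.0 # kg/s (1 L of water ~ 1 kg)

dT = power_w / (mass_flow * C_WATER)
print(f"Coolant rise across the first block: ~{dT:.1f} K")
```

Under those assumptions the rise comes out around 4 K, the same ballpark as the observed 4-5°C gap, so the difference isn't necessarily a mounting problem.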


----------



## rdr09

Quote:


> Originally Posted by *DeadlyDNA*
> 
> it really imo depends on the game. Overall the most I would use is 2x in games that can use it. Its really more a personal preference but some games seem to reduce image quality with AA. I dont like blur and if I were to choose I would take jaggies over blur. I think if your coming from 1080p or 1440p you will probably be happy with the higher resolution by itself and AA will be more of an afterthought.


thanks, Deadly. I don't like blur either.


----------



## DeadlyDNA

Quote:


> Originally Posted by *DeadlyDNA*
> 
> So I removed the coolers on my 290x cards and I noticed 2 of them have capacitors that are a hair taller. In the pic its the blue caps on cards in the middle.
> 
> 
> 
> Will this be an issue with waterblocks?


Alright, I'm still moving stuff around, and to answer my own question with a picture: I have full copper EK blocks on my 290s and I took a pic of the edge where the caps are. Looks like I will have enough clearance judging by this. I dread tearing apart my water loop :-(


----------



## TommyGunn123

ASUS is bringing out the "Ares 3", a 295X2 with a custom EK block, and might I say, it looks ***tastic!





sauce: http://www.anandtech.com/show/8113/asus-rog-press-conference-live-blog


----------



## DeadlyDNA

Quote:


> Originally Posted by *TommyGunn123*
> 
> ASUS bringing out the "Ares 3", a 295X2 with a custom EK block, and might I say, it looks ***tastic!
> 
> 
> 
> 
> 
> sauce: http://www.anandtech.com/show/8113/asus-rog-press-conference-live-blog


Epic


----------



## Roboyto

Quote:


> Originally Posted by *galil3o*
> 
> I'm having some voltage trouble on my 290x. Its a Sapphire reference card flashed with the asus bios and I'm using GPU Tweak to overclock (1175MHz core). I have the voltage slider maxed out to 1.412v with 150% power target but it sits around 1.2v to 1.25v during 100% utilization (BF4). The idle 2D voltage is pretty constant at 1.117v.
> 
> I know there is some vdroop built it but 150 to 200mV seems like alot under load. Its water cooled and maxes out around 60 to 65C so I think I can go a bit further if I can get the voltage consistently up around 1.3v. (it will spike to 1.312v but only for a moment and not very often)
> 
> Anyone else experience something similar?


Which ASUS BIOS? There are a few that people use for overclocking. Is that as high as the core clock will go when you set 1412mV in GPU Tweak?

Did you try overclocking the card with the Sapphire BIOS and another utility? Does it exhibit the same voltages?

When running 3DMark or Unigine, do the voltages go any higher? These aren't nearly as CPU dependent as BF4.

Is your i5 overclocked? BF4 will run all 4 threads to 90-95% if need be. From what I've seen in this thread, 4.5GHz is about where you want to be with an i5 and these cards for BF4.


----------



## Roboyto

Quote:


> Originally Posted by *DeadlyDNA*
> 
> allright im still moving stuff around and to answer my own question with a picture. I have full copper ek blocks on my 290s and I took a pic of the edge where the caps are. Looks like I will Have enough clearance judging by this. I dread tearing apart my water loop :-(


What brand are the cards with the blue caps? I know Gigabyte made a change to their 290(X) which required EK to make a revision to their blocks. I believe cap size was the primary difference with the MSI Gaming cards as well..?

https://www.facebook.com/EKWaterBlocks/photos/a.204208322966540.61821.182927101761329/633154450071923/?type=1


----------



## DeadlyDNA

Quote:


> Originally Posted by *Roboyto*
> 
> What brand are the cards with the blue caps? I know Gigabyte made a change to their 290(X) which required EK To make a revision to their blocks. I believe that cap size was the primary difference in MSI gaming cards as well..?
> 
> https://www.facebook.com/EKWaterBlocks/photos/a.204208322966540.61821.182927101761329/633154450071923/?type=1


They are supposed to be Sapphire 290Xs. I have been trying to find a way to confirm that the cards are in fact 290Xs and not 290s flashed to 290X. I have s/n stickers etc. on the top side of the GPUs which all say 290X, and so do the coolers. However, I know it's possible the board could be a 290 and the stickers/cooler were just modified/added. Short of finding a way to check the s/n, everything else checks out so far.

Edit: the waterblocks I have are not full coverage, I guess; they don't cover the last few inches of the PCB.


----------



## Roboyto

Quote:


> Originally Posted by *DeadlyDNA*
> 
> they are supposed to be sapphire 290x, i have been trying to find a way to confirm in fact the cards are 290x and not 290's flashed to 290x. I have s/n stickers etc on the top side of the gpu's which all say 290x and so does the coolers. However i know it's possible the board could be a 290 and stickers/cooler were just modified/added. Short of finding a way to check s/n everything else checks out so far.
> 
> edit :The waterblocks i have are not full coverage i guess. they don't cover that last few inches of the pcb.


CoolingConfigurator shows the caps with pink for the reference Sapphire 290X. I've had 2 XFX, a Powercolor and a HIS apart and all were identical with the silver/pink caps; all 290s.

You could probably check the 290 unlock thread for information on how to tell them apart.

But your best bet would probably be to e-mail Sapphire and have them confirm the serial numbers.


----------



## Irev

Okay, running core clock @ 1075 and power limit +30%, memory stock, custom fan profile... At 54% fan speed the card is sitting at 79 degrees Celsius playing games. Is that good, or is it a little too hot?
(Sapphire R9 290 Tri-X OC)
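As an aside, a "custom fan profile" like the one mentioned is just a piecewise-linear map from GPU temperature to fan duty. A minimal sketch with made-up breakpoints (purely to illustrate the mechanism, not a recommended curve):

```python
# A custom fan profile is a piecewise-linear map from GPU temperature
# to fan duty. The breakpoints below are hypothetical examples, not a
# recommended curve for any particular card.
CURVE = [(40, 20), (60, 35), (75, 54), (85, 75), (95, 100)]  # (degC, %)

def fan_duty(temp_c: float) -> float:
    """Linearly interpolate the fan % for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # Interpolate between the two surrounding breakpoints.
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # clamp above the last breakpoint

print(fan_duty(79))  # falls between the 75C and 85C breakpoints
```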


----------



## DeadlyDNA

Quote:


> Originally Posted by *Roboyto*
> 
> CoolingConfigurator shows the caps with pink for the reference Sapphire 290X. I've had 2 XFX, a Powercolor and a HIS apart and all were identical with the silver/pink caps; all 290s.
> 
> You could probably check the 290 unlock thread for information on how to tell them apart.
> 
> But your best bet would probably be to e-mail Sapphire and have them confirm the serial numbers.


It appears the ones with blue caps are Sapphire Tri-X; I just compared them to this picture and they are identical: the caps, the Sapphire sticker on the DP, the little white sticker that says b0191 on the tongue next to the PCIe connection fingers. The funny thing is all 4 came with reference coolers. So I will keep looking; if it is a Tri-X then that's kinda better, but still a ref card?

Sample photo for Tri-x


My photo(sorry crappy cell phone)


----------



## Roboyto

Quote:


> Originally Posted by *DeadlyDNA*
> 
> it appears like the ones with blue caps are Sapphire tri-x as i just compared them to this picture and they are identical to caps, sapphire sticker on DP, the little white sticker that says b0191 on the tongue next to PCIE connection fingers. The funny thing is all 4 came on reference coolers. So i will keep looking as if it is a tri-x then that's kinda better but still a ref card?
> 
> Sample photo for Tri-x
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> My photo(sorry crappy cell phone)


The reference design isn't really all that bad if you consider the clocks that people get on them; granted, this relies on the silicon lottery, but a good chip benefits any card no matter how extravagantly upgraded the other components are. I haven't seen anything miraculous out of Matrix or Lightning cards yet under 'normal conditions'. They are obviously better suited to extreme cooling solutions, but a great overclocking reference card would probably do pretty well with dry ice or LN2 as well.

Did you buy them from a miner? They may have had a surplus of reference coolers and used the Tri-X coolers for something else?

Edit: The caps are the same as the Tri-X, but the chokes are the same as reference, it looks like?


----------



## Roboyto

Quote:


> Originally Posted by *Irev*
> 
> okay running core clock @ 1075 and power limit +30% , memory stock, custom fan profile.... fan speed @ 54% the card is sitting at 79degrees Celsius playing games... is that good or is it a little too hot?
> (sapphire r9 290 tri-x oc)


What voltage? You may not need the additional power limit at that core speed. My reference PowerColor, with a Kraken/AIO, makes 1075/1375 with +37mV and 0% power limit.

Not sure what the trend for Tri-X coolers is, but as long as these cards are under 95C they will keep chugging along.

The more important question is where the VRM1 temp is sitting.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Roboyto*
> 
> The reference design isn't really all that bad if you consider the clocks that people get on them, granted this relies on silicon lottery...but a good chip effects any card no matter how extravagantly upgraded the other components are. I haven't seen anything miraculous out of Matrix or Lightning cards yet under 'normal conditions'. They are obviously better suited to extreme cooling solutions, but a great overclocking reference card would probably do pretty well with dry ice or LN2 as well.
> 
> Did you buy them from a miner? They may have had a surplus of reference coolers and used the Tri-X coolers for something else?
> 
> Edit: The caps are the same as the Tri-X, but the chokes look the same as reference?


Yep, the chokes are different. I am pretty sure these are 290Xs. I just hope the Hynix gives me a little love, since I think 3 or 4 of my 290s are Elpida. Yes, got them from a miner, and they all tested good; they just need to be added to the water blocks. I think I will keep a 290 around in case one dies; don't think I will be able to do anything under warranty with Sapphire. They are tied to the purchaser, it appears.


----------



## Roboyto

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Yep, the chokes are different. I am pretty sure these are 290Xs. I just hope the Hynix gives me a little love, since I think 3 or 4 of my 290s are Elpida. Yes, got them from a miner, and they all tested good; they just need to be added to the water blocks. I think I will keep a 290 around in case one dies; don't think I will be able to do anything under warranty with Sapphire. They are tied to the purchaser, it appears.


Your odds are definitely better with Hynix, but it's no guarantee. I initially bought a pair of 290s; XFX Black Edition. I sold the lesser performer, with Hynix, which only clocked to 1080/1350 no matter what I did with it.

Best of luck to you though







Look forward to seeing the results.


----------



## Mega Man

So I love my Komodos, they do help cool the back VRMs... but... my SLI fittings leaked... so yeah, no power on yet... so sad...


----------



## Irev

Quote:


> Originally Posted by *Roboyto*
> 
> What voltage? You may not need the additional power limit at that core speed. My reference PowerColor, with Kraken/AIO, makes 1075/1375 with +37mV and 0% power limit.
> 
> Not sure what the trend for Tri-X coolers is, but as long as these cards are under 95C they will keep chugging along.
> 
> The more important question is: where is the VRM1 temp sitting?


Just ran a 10 minute furmark test.... @ 1075 core... 54% fan speed, 79degree celcius gpu temp.... 89 degree celcius VRM1 Temp and 58 degree celcius VRM2 Temp
I would rather not play with voltage im happy with a 1075 core daily .. i have found if i dont adjust the power limit then the card seems to back off to 930mhz core clock with 0% power limit


----------



## Widde

Quote:


> Originally Posted by *Roboyto*
> 
> Your odds are definitely better with Hynix, but it's no guarantee. I initially bought a pair of 290s; XFX Black Edition. I sold the lesser performer, with Hynix, which only clocked to 1080/1350 no matter what I did with it.
> 
> Best of luck to you though
> 
> 
> 
> 
> 
> 
> 
> Look forward to seeing the results.


You seem to have had my kind of luck







Got 2 290s stuck on 1080/1350 @ +100mV (although ref coolers atm), but the lottery is restricting me more than temps :< Anyone wanna switch?


----------



## piquadrat

But did this luck have its reflection in the ASIC Quality rating?
I wonder if that number means anything in practice.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Mega Man*
> 
> So I love my Komodos, they do help cool the back VRMs... but... my SLI fittings leaked... so yeah, no power on yet... so sad...


Ouch, was it on when this happened?

I had a leak 1 month after I set up mine; it got all over the place, in the PSU, etc. I was lucky, distilled water wasn't conductive I guess. Nothing got hurt.
Hope that is the same for you! (everything is safe)


----------



## kizwan

Corsair HG10 GPU Cooling Bracket

I wonder whether this one provides better VRM1 cooling or not.









(Pics taken from techpowerup & guru3d)


----------



## Sgt Bilko

Looks like Corsair has taken the Red Mod idea and thrown their name on it









Well, it should provide better VRM1 cooling over the NZXT G10, due to the fact that it uses the fan from the ref design.

From what I read, you are supposed to use these on ref design versions and take the fan from it in the process.


----------



## pdasterly

Having problems with Sapphire Trixx. It seems to keep adjusting settings; I tried saving profiles, but either Trixx or CCC is changing the memory to 650. I have to open CCC and set defaults, but when I restart, everything goes back. Annoying. No remedy, just removed Trixx.


----------



## JordanTr

If they use the ref fan, then it means it's still going to be the same old jet engine, just better temps for the GPU.


----------



## PJFT808

Only if the fan's RPM is set high, and it most likely won't have to be. The reference fan doesn't start to get too loud until about 60% speed. I dunno, but it looks like the bracket touches the RAM. I'm wondering if it comes with thermal pads. It could just have an air gap though.


----------



## JordanTr

For me it was loud even when it passed 45%.


----------



## PJFT808

Yeah, I have a higher sound tolerance than most, and pain for that matter lol. It won't have to run very high RPM most likely. The ref blowers ran high because they cooled everything. Now, with the AIO taking care of the core, the fan shouldn't have to run too fast to keep the VRMs and RAM cooled.


----------



## ebduncan

I didn't think the reference cooler was loud under about 60% either. Even at that speed it wasn't annoying (no high-pitched whine), just a gush of air you could hear.
Quote:


> Your odds are definitely better with Hynix, but it's no guarantee. I initially bought a pair of 290s; XFX Black Edition. I sold the lesser performer, with Hynix, which only clocked to 1080/1350 no matter what I did with it.


Dang, that sucks. I guess I got lucky; my XFX Black Edition 290 can do 1300MHz core and 1800MHz memory (Hynix). Now if only it unlocked to a 290X :-D


----------



## Roboyto

Quote:


> Originally Posted by *Irev*
> 
> Just ran a 10-minute FurMark test... @ 1075 core, 54% fan speed: 79C GPU temp, 89C VRM1 temp, and 58C VRM2 temp.
> I would rather not play with voltage; I'm happy with a 1075 core daily. I have found that if I don't adjust the power limit, the card backs off to a 930MHz core clock at 0% power limit.


I wouldn't suggest FurMark especially with the nature of these cards, they get hot enough on their own without that terrible torture test. There are plenty of benchmarks that give a more realistic scenario than FurMark.

That 89C for VRM1 is going to be your concern before the core, I wouldn't run it much hotter than that. You may want to keep track of that VRM1 temp with a different bench or during a long gaming session. You need lots of cool fresh air for these cards to keep reasonable temperatures with air cooling.

A very small bump in voltage could yield the same results and possibly better temps for VRM1; it could be worth a shot.
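One low-effort way to keep track of VRM1 over a long session is to let GPU-Z (or HWiNFO) write its sensor log to a CSV file and summarize the peaks afterward. A minimal sketch in Python; the column names here are assumptions, so check them against your own log's header line:

```python
import csv
import io

def peak_sensors(log_text, columns=("GPU Temperature [C]", "VRM Temperature #1 [C]")):
    """Return the highest value logged for each requested sensor column."""
    reader = csv.DictReader(io.StringIO(log_text))
    peaks = {c: float("-inf") for c in columns}
    for row in reader:
        for c in columns:
            peaks[c] = max(peaks[c], float(row[c].strip()))
    return peaks

# Hypothetical excerpt of a sensor log:
sample = """GPU Temperature [C],VRM Temperature [C]
62.0,71.0
79.0,89.0
75.0,85.0
""".replace("VRM Temperature [C]", "VRM Temperature #1 [C]")

print(peak_sensors(sample))  # worst-case GPU and VRM1 temps over the session
```

Run it against the full log after a gaming session and you get the worst-case temps without staring at an overlay.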

Quote:


> Originally Posted by *Mega Man*
> 
> So I love my Komodos, they do help cool the back VRMs... but... my SLI fittings leaked... so yeah, no power on yet... so sad...










oh no

Quote:


> Originally Posted by *Widde*
> 
> You seem to have had my kind of luck
> 
> 
> 
> 
> 
> 
> 
> Got 2 290s stuck on 1080/1350 @ +100mV (although ref coolers atm), but the lottery is restricting me more than temps :< Anyone wanna switch?


I bought a pair of them on a Newegg combo deal back in October for $899. One went to 1080/1350, but the one I kept is a beast @ 1300/1700.









Quote:


> Originally Posted by *piquadrat*
> 
> But did this luck have its reflection in the ASIC Quality rating?
> I wonder if that number means anything in practice.


ASIC doesn't seem to mean too much with these cards.

Quote:


> Originally Posted by *kizwan*
> 
> Corsair HG10 GPU Cooling Bracket
> 
> I wonder whether this one provides better VRM1 cooling or not.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (Pics taken from techpowerup & guru3d)
> 
> 
> 
> Spoiler: Warning: Spoiler!


Nice. My Kraken does pretty well with the Gelid VRM heatsink kit. I also just installed a better fan to cool VRM1, and it dropped temps by 5-7C depending on what is going on.

Quote:


> Originally Posted by *JordanTr*
> 
> If they use the ref fan, then it means it's still going to be the same old jet engine, just better temps for the GPU.


Not necessarily, as the blower was so loud because the GPU was getting all the hot air from VRM1. If you're just cooling VRM1, then the fan RPM could probably be kept fairly low and practically silent.

Quote:


> Originally Posted by *pdasterly*
> 
> Having problems with Sapphire Trixx. It seems to keep adjusting settings; I tried saving profiles, but either Trixx or CCC is changing the memory to 650. I have to open CCC and set defaults, but when I restart, everything goes back. Annoying. No remedy, just removed Trixx.


If you have OverDrive enabled it will interfere quite badly. Disable that.

I also wouldn't suggest having Trixx load or apply settings on startup. Just save an overclock profile and enable/disable it as necessary.


----------



## Roboyto

Quote:


> Originally Posted by *ebduncan*
> 
> I didn't think the reference cooler was loud under about 60% either. Even at that speed it wasn't annoying (no high-pitched whine), just a gush of air you could hear.
> Dang, that sucks. I guess I got lucky; my XFX Black Edition 290 can do 1300MHz core and 1800MHz memory (Hynix). Now if only it unlocked to a 290X :-D


The card I kept does 1300/1700 so I'm pretty happy with those results.

I don't run it that high unless I'm benching, so everyday gaming clocks sit at 1200/1500 +87mV offset. Gives me great performance for Eyefinity 5760*1080. Couldn't be happier with what the card does, and it will only get better as drivers mature.

Quote:


> Originally Posted by *PJFT808*
> 
> Yeah, I have a higher sound tolerance than most, and pain for that matter lol. It won't have to run very high RPM most likely. The ref blowers ran high because they cooled everything. Now, with the AIO taking care of the core, the fan shouldn't have to run too fast to keep the VRMs and RAM cooled.


The one thing I don't like about that Corsair contraption is that all the hot air from VRM1 is going to be blasted right at your AIO pump. It probably won't make that much of a difference... but the VRMs do produce a fair amount of heat on these cards. I like my Kraken, where I have the fan oriented to pull heat off the VRM as opposed to blowing it down at the PCB.

RAM/VRM2 really don't need any airflow at all. I have some pretty low conductance thermal tape, 1.8 W/mK, on VRM2 with the Gelid sink. Even after several hours of gaming I haven't seen temps over 65C, with zero airflow.

I am intrigued by the design and will be watching closely for reviews to see how well it does. If it can give me a 10C drop from where I'm at then I'd probably buy it.


----------



## galil3o

Quote:


> Originally Posted by *Roboyto*
> 
> Which ASUS BIOS? There are a few that people use for overclocking. Is that as high as the core clock will go when you set 1412mV in GPU Tweak?
> 
> Did you try overclocking the card with the Sapphire BIOS and another utility? Does it exhibit the same voltages?
> 
> When running 3DMark or Unigine do the voltages go any higher? These aren't nearly as CPU dependent as BF4.
> 
> Is your i5 overclocked? BF4 will run all 4 threads to 90-95% if need be. From what I've seen in this thread 4.5GHz is about where you want to be with an i5 and these cards for BF4.


I just updated the ASUS BIOS to version 67B0HB.15.39.0.6.AS02S. As for the max clock, I can push it to 1200MHz stable in FurMark, and it works fairly well for gaming, though BF4 blackscreens once in a blue moon at that clock, so I backed it off a bit.

The original Sapphire Uber BIOS (tried with both Trixx and GPU Tweak) produces slightly less voltage and a lower overclock ceiling.

Unigine doesn't produce a different result than BF4, aside from the CPU usage, which in Unigine is around 50% and in BF4 gets up to 70-85%. This is with an i5-3570K @ 4.2GHz.

Interestingly, I notice that when GPU Tweak shows a voltage spike up to 1.3V every once in a while, it's when I'm just sitting at my desktop, not in-game...


----------



## Roboyto

Quote:


> Originally Posted by *galil3o*
> 
> I just updated the ASUS BIOS to version 67B0HB.15.39.0.6.AS02S. As for the max clock, I can push it to 1200MHz stable in FurMark, and it works fairly well for gaming, though BF4 blackscreens once in a blue moon at that clock, so I backed it off a bit.
> 
> The original Sapphire Uber BIOS (tried with both Trixx and GPU Tweak) produces slightly less voltage and a lower overclock ceiling.
> 
> Unigine doesn't produce a different result than BF4, aside from the CPU usage, which in Unigine is around 50% and in BF4 gets up to 70-85%. This is with an i5-3570K @ 4.2GHz.
> 
> Interestingly, I notice that when GPU Tweak shows a voltage spike up to 1.3V every once in a while, it's when I'm just sitting at my desktop, not in-game...


OK, so you're not running one of the modified BIOSes that force voltage and don't downclock. I presume that is a stock BIOS from another card.

FurMark is a poor test for stability; all it does is make the card run hotter than most other scenarios.

Is your RAM overclocked too? Blackscreen/lockup is caused by a RAM overclock 9 times out of 10.

Have you tried monitoring the voltage in GPU-Z or HWiNFO? Those may give you a more accurate reading.

Is GPU Tweak an offset voltage or forced? An offset for the GPU, I would imagine, would apply voltage as needed according to load, like a CPU. So maybe the card isn't needing 1.4V at 1175MHz. I wouldn't think it would require that much unless you have a fairly poor quality chip.

My XFX BE doesn't get to the 1.4V+ mark unless I'm running near max clocks of 1300/1700 with +200mV in Trixx and 50% power limit.


----------



## ebduncan

Quote:


> The card I kept does 1300/1700 so I'm pretty happy with those results.
> 
> I don't run it that high unless I'm benching, so everyday gaming clocks sit at 1200/1500 +87mV offset. Gives me great performance for Eyefinity 5760*1080. Couldn't be happier with what the card does, and it will only get better as drivers mature.


I run the same resolution in some games on my triple monitor setup (Samsung MD230X3 23" 1080p IPS). Honestly though, I usually game on my 40" LED TV at 1080p. My daily clocks are 1200 core / 1500 mem with +100mV core. It takes +200mV to get to 1300 core. Temps are nice and cool, under 50C at full load.

Something about sitting back on the sofa with the 360 controller and the large 40" just appeals to me, although sitting at the desk in front of the three 23" screens is really nice for some games, especially the racing games.


----------



## galil3o

Quote:


> Originally Posted by *Roboyto*
> 
> OK, so you're not running one of the modified BIOSes that force voltage and don't downclock. I presume that is a stock BIOS from another card.
> 
> FurMark is a poor test for stability; all it does is make the card run hotter than most other scenarios.
> 
> Is your RAM overclocked too? Blackscreen/lockup is caused by a RAM overclock 9 times out of 10.
> 
> Have you tried monitoring the voltage in GPU-Z or HWiNFO? Those may give you a more accurate reading.
> 
> Is GPU Tweak an offset voltage or forced? An offset for the GPU, I would imagine, would apply voltage as needed according to load, like a CPU. So maybe the card isn't needing 1.4V at 1175MHz. I wouldn't think it would require that much unless you have a fairly poor quality chip.
> 
> My XFX BE doesn't get to the 1.4V+ mark unless I'm running near max clocks of 1300/1700 with +200mV in Trixx and 50% power limit.


Thanks for all the info.

Yeah, it was originally a stock ASUS BIOS from when the cards first came out (flashed it back in Nov 2013). I have two 8GB sticks of DDR3-2400 running at their rated speed.

GPU Tweak isn't an offset, I don't believe. Default sets the voltage slider to 1.25V, and GPU-Z reads the idle voltage at about 0.96V and load at 1.17V. Maxing out the voltage slider to 1.412V brings idle up to 1.12V and load to 1.20-1.26V, so it seems to add a 150mV offset, so I guess it's not a forced voltage.

I can bring the core up to 1250MHz with a bit of artifacting under load, but the voltage still doesn't go beyond ~1.26V.

This is all with the VRAM left at default (1250 / 5000MHz).


----------



## galil3o

I would be open to switching BIOS and OC utility; the ASUS / GPU Tweak route was just the simplest option when I first got my waterblock back at the end of last year.

Any suggestions?


----------



## Roboyto

Quote:


> Originally Posted by *ebduncan*
> 
> I run the same resolution in some games on my triple monitor setup (Samsung MD230X3 23" 1080p IPS). Honestly though, I usually game on my 40" LED TV at 1080p. My daily clocks are 1200 core / 1500 mem with +100mV core. It takes +200mV to get to 1300 core. Temps are nice and cool, under 50C at full load.
> 
> Something about sitting back on the sofa with the 360 controller and the large 40" just appeals to me, although sitting at the desk in front of the three 23" screens is really nice for some games, especially the racing games.


I've got a Logitech F710, but the only thing I've ever programmed it for is ROMs. I've been considering grabbing a 360 controller to use at my desk, but I've been a PC gamer since the original Duke Nukem, Wolfenstein, Doom, etc. There are only a few games I prefer a controller for outside the realm of a Nintendo system.

I have the wife set up on the 60" in the living room with MotioninJoy for the PS3 DS3 controller, and she is loving life playing Tomb Raider maxed out









Since getting into three monitors, I haven't been able to play much on a single screen. This is a great fix for when the side monitors are skewed and look funky: http://www.flawlesswidescreen.org/index.php/Flawless_Widescreen


----------



## pdasterly

The Xbox 360 controller is nice.


----------



## lawson67

Quote:


> Originally Posted by *Roboyto*
> 
> That 89C for VRM1 is going to be your concern before the core, I wouldn't run it much hotter than that. You may want to keep track of that VRM1 temp with a different bench or during a long gaming session. You need lots of cool fresh air for these cards to keep reasonable temperatures with air cooling


Is VRM1 designed to run up to 120C? Or is it rated to 120C, i.e. designed to withstand temps up to 120C, from all the info I have read?


----------



## PounceBounce

Good night, folks. Sorry if I am asking in the wrong thread. I'm standing at a crossroads now: I cannot decide whether to buy the R9 290 Vapor-X or the GTX 780 iChill. The second one is much cooler and is really silent. The first one has 4GB VRAM and a 512-bit bus.

The best way out, I think, is to use a 290 + some custom water cooler (like a G10 + H55). The problem is that I cannot order a G10 with delivery to Belarus (yep, some may not even know where it is).

Is it possible to use some GPU bracket with the Vapor-X version? Is the game worth the candle?


----------



## Roboyto

Quote:


> Originally Posted by *lawson67*
> 
> Is VRM1 designed to run up to 120C? Or is it rated to 120C, i.e. designed to withstand temps up to 120C, from all the info I have read?


Designed to withstand, yes. Good for stability, efficiency, or longevity of the card, most definitely not.

The core can handle 94C all day long without throttling, but I wouldn't want to run it at that temp all the time.


----------



## Frag Mortuus

Well, I pulled the trigger and bought two XFX 290s. Yeah, they are going into my signature machine. But I plan to upgrade it to a Haswell-E 5820K, DDR4 RAM, and whatever EVGA or ASUS E-ATX boards are available at launch.

I'm impatiently waiting for them to be delivered tomorrow!

I'm jumping ship from Nvidia, and judging from the reviews and benchmarks I've read, I don't think I'm going to be sorry.


----------



## rv8000

Vapor-X 290



Had to do it


----------



## PounceBounce

Quote:


> Originally Posted by *rv8000*
> 
> Vapor-X 290
> 
> 
> 
> Had to do it


Temps and rpm, please?


----------



## Dasboogieman

Quote:


> Originally Posted by *Roboyto*
> 
> Designed to withstand, yes. Good for stability, efficiency, or longevity of the card, most definitely not.
> 
> The core can handle 94C all day long without throttling, but I wouldn't want to run it at that temp all the time.


I agree yes, withstand =/= safe for sustained operation

For anyone interested in how stable your memory OC really is, I suggest checking out MemtestCL: https://simtk.org/project/xml/downloads.xml?group_id=385
You'll have to register an account to download, since it's an academic institution.

As it turns out, my 1500MHz mem clock was throwing out loads of silent errors, but everything stabilized once I dialed it back to 1475MHz.

Also, I'm curious as to what results the people with chronic black screens get at stock; if it throws out loads of errors, then it pretty much confirms that the blackscreen is caused by unstable memory.

This nifty little app may also answer the question of whether Aux Voltage helps memory OCs once and for all, since it is far more sensitive than any benchmark.

I should probably mention I'm using the latest AMD driver (the one for Watch Dogs). IIRC 13.12 and 14.4 throw up a lot of false negatives.

Edit: I also just tested with the fans at 100% at stock; extra cooling on the core + VRAM doesn't seem to help the mem stability @ 1500MHz. I guess Roboyto was correct when he tried to use Fujipoly for the RAM.
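The "dial it back until the errors stop" workflow above can be sketched as a simple downward search over memory clocks. A minimal sketch, where `passes_memtest` is a stand-in for actually running MemtestCL at a given clock and checking for zero errors; the clock range and step are illustrative:

```python
def find_stable_clock(passes_memtest, lo=1250, hi=1500, step=25):
    """Step down from the highest candidate memory clock (MHz) until the
    memory test reports no errors; return the first clean clock found."""
    clock = hi
    while clock >= lo:
        if passes_memtest(clock):
            return clock
        clock -= step
    return lo  # fall back to the stock clock

# Stand-in predicate mirroring the result above: silent errors appear past 1475MHz.
print(find_stable_clock(lambda mhz: mhz <= 1475))  # → 1475
```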


----------



## Mr357

Quote:


> Originally Posted by *Dasboogieman*
> 
> I agree yes, withstand =/= safe for sustained operation
> 
> For anyone interested in how stable your memory OC really is, I suggest checking out MemtestCL: https://simtk.org/project/xml/downloads.xml?group_id=385
> You'll have to register an account to download, since it's an academic institution.
> 
> As it turns out, my 1500MHz mem clock was throwing out loads of silent errors, but everything stabilized once I dialed it back to 1475MHz.
> 
> Also, I'm curious as to what results the people with chronic black screens get at stock; if it throws out loads of errors, then it pretty much confirms that the blackscreen is caused by unstable memory.
> 
> This nifty little app may also answer the question of whether Aux Voltage helps memory OCs once and for all, since it is far more sensitive than any benchmark.
> 
> Edit: I also just tested with the fans at 100% at stock; extra cooling on the core + VRAM doesn't seem to help the mem stability @ 1500MHz. I guess Roboyto was correct when he tried to use Fujipoly for the RAM.


+Rep for you sir!









This should be added to the OP.


----------



## rv8000

Quote:


> Originally Posted by *PounceBounce*
> 
> Temps and rpm, please?


I'll test everything tomorrow when I come back home from work, too busy to give this card the tlc it deserves tonight.


----------



## lawson67

Quote:


> Originally Posted by *Roboyto*
> 
> Designed to withstand, yes. Good for stability, efficiency, or longevity of the card, most definitely not.
> 
> The core can handle 94C all day long without throttling, but I wouldn't want to run it at that temp all the time.


My cards have a 3-year warranty, and I do not believe for one minute that VRM1 running at 89C or even 100C would affect the stability, efficiency, or longevity of the card; and if it did, I would get a brand new card for free! Also, 100C is 20C from the design limit. VRM temps have always run hot; it's just that now there is software that shows you how hot. And 89C sounds bad when you think in terms of GPU temps, but these are not GPUs, they are voltage regulator modules, and they are designed to run way hotter than most will ever get!


----------



## Jeffro422

Time to join the club I suppose. http://www.techpowerup.com/gpuz/mnuum/

Just installed my Kraken G10

Small install album:


http://imgur.com/ZVBky


----------



## Gunderman456

Hi Folks;

By endless stress testing/benching, I have proven that by upping VCore, and if required in tandem with VCCIN, you can reach previously unattainable overclocks on the GPU.

Please refer to the last two pages of "The Hawaiian Heat Wave" Build Log (in sig) for the irrefutable proof!


----------



## Arizonian

Quote:


> Originally Posted by *KnownDragon*
> 
> xfx r9 290 reference! Stock cooler! No voltage bump overclock! firestrike
> 
> XFX RADEON R9 290 947MHz 4GB DDR5 DP HDMI 2XDVI Graphics Card (R9290AENFC
> 
> http://www.techpowerup.com/gpuz/8a58p/


Thanks - corrected









Quote:


> Originally Posted by *rv8000*
> 
> Vapor-X 290
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Had to do it


Congrats - added









Enjoy and report back with clocks / temps on the Vapor X

Quote:


> Originally Posted by *Jeffro422*
> 
> Time to join the club I suppose. http://www.techpowerup.com/gpuz/mnuum/
> 
> Just installed my Kraken G10
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Small install album:
> 
> 
> http://imgur.com/ZVBky


Congrats - added









Kraken G10 looking good.


----------



## rv8000

Quote:


> Originally Posted by *Arizonian*
> 
> Thanks - corrected
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Enjoy and report back with clocks / temps on the Vapor X
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Kraken G10 looking good.


Sadly, that's going to have to wait. The card has a bum fan emitting an electrical buzzing noise, and I also ran into a lot of black screening issues after several driver re-installs (multiple ways, and I never had this problem with a reference, PCS+, or Tri-X). To the Egg for a replacement it goes. I do say, though, VRM temps looked insanely good even for just a few Firestrike runs; VRM1 barely hit over 50C, and 15-30 mins of both GW2 and Diablo didn't seem to get VRM1 much hotter than 58C. No extensive testing, so take it all lightly.


----------



## Roboyto

Quote:


> Originally Posted by *lawson67*
> 
> My cards have a 3-year warranty, and I do not believe for one minute that VRM1 running at 89C or even 100C would affect the stability, efficiency, or longevity of the card; and if it did, I would get a brand new card for free! Also, 100C is 20C from the design limit. VRM temps have always run hot; it's just that now there is software that shows you how hot. And 89C sounds bad when you think in terms of GPU temps, but these are not GPUs, they are voltage regulator modules, and they are designed to run way hotter than most will ever get!


If cooling components didn't make them more stable, efficient, or longer-lasting, then there would be little use for any of the products that are created for PC hardware.

If you enjoy RMAing your parts from unnecessary abuse then that is your prerogative.

You may not want to believe that an electronic component operating at a lower temperature is more efficient, but that is absolutely how it works.

44W reduction in power consumption; one small example.

http://www.legitreviews.com/nzxt-kraken-g10-gpu-water-cooler-review-on-an-amd-radeon-r9-290x_130344/5


----------



## Roboyto

Quote:


> Originally Posted by *Dasboogieman*
> 
> I agree yes, withstand =/= safe for sustained operation
> 
> For anyone interested in how stable your memory OC really is, I suggest checking out MemtestCL: https://simtk.org/project/xml/downloads.xml?group_id=385
> You'll have to register an account to download, since it's an academic institution.
> 
> As it turns out, my 1500MHz mem clock was throwing out loads of silent errors, but everything stabilized once I dialed it back to 1475MHz.
> 
> Also, I'm curious as to what results the people with chronic black screens get at stock; if it throws out loads of errors, then it pretty much confirms that the blackscreen is caused by unstable memory.
> 
> This nifty little app may also answer the question of whether Aux Voltage helps memory OCs once and for all, since it is far more sensitive than any benchmark.
> 
> Edit: I also just tested with the fans at 100% at stock; extra cooling on the core + VRAM doesn't seem to help the mem stability @ 1500MHz. I guess Roboyto was correct when he tried to use Fujipoly for the RAM.


I'm going to have to scope this out. This could explain bench scores dropping unexpectedly when there are no visual signs of problems.

The RAM definitely doesn't generate enough heat to be affected. VRM2 for the RAM barely gets hot even after hours of gaming when passively cooled. I used GPU-Z to monitor my wife's last Tomb Raider session, and after 75 minutes VRM2 peaked at 65C; 1075/1375 +37mV. It only has some JunPus 1.8 W/mK thermal tape and the small Gelid heatsink to cool it.



Spoiler: VRM2 sink

















Spoiler: No airflow back there


----------



## Jeffro422

Quote:


> Originally Posted by *Roboyto*
> 
> I'm going to have to scope this out. This could explain bench scores dropping unexpectedly when there are no visual signs of problems.
> 
> The RAM definitely doesn't generate enough heat to be affected. VRM2 for the RAM barely gets hot even after hours of gaming when passively cooled. I used GPU-Z to monitor my wife's last Tomb Raider session, and after 75 minutes VRM2 peaked at 65C; 1075/1375 +37mV. It only has some JunPus 1.8 W/mK thermal tape and the small Gelid heatsink to cool it.
> 
> 
> Spoiler: VRM2 sink
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: No airflow back there


You have your 92mm fan on backwards unless you didn't want it blowing on the VRMs.


----------



## Mega Man

To all worried: no power was on, I always leak test for 24 hours min!!! Now I won't turn them on for a week to make sure they are dried! And it was not the blocks, they are so sexay. I'll try to post some links; I need to buy some small gauge wire, but I have a test of one with RGBs behind the block. Looks epic; still need to get some RGBs for the front "swiftech" logo.
Quote:


> Originally Posted by *lawson67*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roboyto*
> 
> Designed to withstand, yes. Good for stability, efficiency, or longevity of the card, most definitely not.
> 
> The core can handle 94C all day long without throttling, but I wouldn't want to run it at that temp all the time.
> 
> 
> 
> My cards have a 3-year warranty, and I do not believe for one minute that VRM1 running at 89C or even 100C would affect the stability, efficiency, or longevity of the card; and if it did, I would get a brand new card for free! Also, 100C is 20C from the design limit. VRM temps have always run hot; it's just that now there is software that shows you how hot. And 89C sounds bad when you think in terms of GPU temps, but these are not GPUs, they are voltage regulator modules, and they are designed to run way hotter than most will ever get!
Click to expand...

Feel free! Please do tell us when they go boom though... (please note the "when", not the "if")
Quote:


> Originally Posted by *Gunderman456*
> 
> Hi Folks;
> 
> By endless stress testing/benching, I have proven that by upping VCore, and if required in tandem with VCCIN, you can reach previously unattainable overclocks on the GPU.
> 
> Please refer to the last two pages of "The Hawaiian Heat Wave" Build Log (in sig) for the irrefutable proof!


thanks i will


----------



## Nephalem

Would this OC similarly to the MSI Gaming Edition?


----------



## heroxoot

So I was just about ready to stab that void-if-removed sticker on my 290X... and it seems MSI got to it first; it had a screwdriver mark through the sticker already. So I took it apart, and dear lord, it was disgusting: thermal paste just all over the GPU, in and around the die. I would have taken pictures, but it would make you vomit. The GPU seems to be staying a tad cooler now, too. Whoever did this should be punished to a terrible extent. Rub thermal paste all over them.


----------



## Irev

Quote:


> Originally Posted by *Roboyto*
> 
> I wouldn't suggest FurMark especially with the nature of these cards, they get hot enough on their own without that terrible torture test. There are plenty of benchmarks that give a more realistic scenario than FurMark.
> 
> That 89C for VRM1 is going to be your concern before the core, I wouldn't run it much hotter than that. You may want to keep track of that VRM1 temp with a different bench or during a long gaming session. You need lots of cool fresh air for these cards to keep reasonable temperatures with air cooling.


Hi mate... when playing games @ 1075 core / 1300 mem I get the same result as FurMark = 89 degrees VRM1 temp.
When running the card @ stock 1000MHz core I get an 87 degree VRM1 temp in FurMark.

It seems that from stock 1000/1300 to 1075/1300 I jump up 2 degrees Celsius on VRM1...... is this normal? It should be safe considering VRM1 hits 87 degrees at stock speeds.
BTW I have a well ventilated case, so that can't be the issue.

any ideas?

Edit: I have noticed that bumping up my custom fan profile from 54% to 62% fan speed keeps my VRM1 temp more stable, around 85 degrees Celsius @ 1075 core
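A custom fan profile like the one described above is just a set of (temperature, duty-cycle) control points that the tuning software interpolates between. Here is a minimal Python sketch of that idea; the control points are illustrative, not an actual Afterburner profile:

```python
# Piecewise-linear fan curve: map a temperature reading to a fan duty cycle.
# The control points below are illustrative, not a real Afterburner profile.
CURVE = [(40, 30), (60, 54), (75, 62), (90, 100)]  # (temp C, fan %)

def fan_percent(temp_c: float) -> float:
    """Linearly interpolate the fan duty cycle for a given temperature."""
    if temp_c <= CURVE[0][0]:
        return float(CURVE[0][1])  # clamp below the first point
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return float(CURVE[-1][1])  # clamp above the last point

print(fan_percent(67.5))  # midway between the 60C and 75C points -> 58.0
```

Bumping a mid-range point up (say from 54% to 62%) steepens the curve right where VRM1 peaks, which is consistent with the few-degree drop reported above.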


----------



## ebhsimon

Some updates on my VaporX 290.

I previously thought 1130/1550 was my max overclock. I didn't think there would be a big difference between stock and +100 or +200mV. Boy was I wrong. During the Tomb Raider benchmark, stock power usage was under 200 watts according to GPU-Z; when VDDC was raised by +100mV it went to 330W, and when raised further to +200mV power went up again to 370W. Not sure how accurate the actual number is, but it almost doubled the power usage. Crazy.

Anyway I managed to get 1180/1550 without artifacts. I haven't pushed the memory very far. I've got Hynix memory in mine though, should I try pushing it further?

GPU temp reached 76C max and VRM1 reached 83C max during the benchmarks.
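The jump described above is in the direction the first-order CMOS power model predicts: dynamic power scales with frequency and the square of core voltage, P ~ f * V^2, so small voltage bumps compound quickly. A rough Python sketch; the ~1.2 V stock VDDC is an assumed illustrative value, not taken from the post:

```python
def scaled_power(p_stock_w: float, v_stock: float, dv: float, f_ratio: float = 1.0) -> float:
    """First-order CMOS dynamic-power estimate: P ~ f * V^2.
    Ignores static/leakage power, which also grows with voltage and temperature."""
    v_new = v_stock + dv
    return p_stock_w * f_ratio * (v_new / v_stock) ** 2

# Assuming ~1.2 V stock VDDC (illustrative, not measured):
print(round(scaled_power(200, 1.2, 0.1), 1))  # +100 mV
print(round(scaled_power(200, 1.2, 0.2), 1))  # +200 mV
```

The model predicts less than the 330 W and 370 W GPU-Z reported, which fits the caveat in the comment: leakage rises superlinearly with voltage and temperature, and GPU-Z's power reading has its own error bars.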


----------



## the9quad

Results with the new patch - 64-player Shanghai, building down, 1440p Ultra 2xMSAA. As you can see, no massive CPU spikes at all, so the good news is Mantle seems to be working pretty decently. Cards at stock and CPU at 4.3GHz for that match.

Disregard - I accidentally uploaded an old bench, I'm an idiot.
Instead, here is a quick Golmud Railway: time above 144fps is >99%, and average fps is >200... wut wut. The biggish spike is a round change/autoswitch or whatever (I think, because I never noticed it in game).


----------



## TommyGunn123

Logan from TekSyndicate has done a video showing the new Corsair HG10 red-mod-style GPU bracket, and as most of us expected, the stock fan and VRM cooling solution works great in tandem with the AIO cooler, but I'm just curious about noise. The fan speed % from GPU-Z says 20%, but whether it's really picking up the blower fan's speed I'm not sure. If it is, however, I might be grabbing a cheap ref card just for this...

Link to the temps for the bracket:


----------



## the9quad

Quote:


> Originally Posted by *TommyGunn123*
> 
> Logan from TekSyndicate has done a video showing the new Corsair HG10 red-mod-style GPU bracket, and as most of us expected, the stock fan and VRM cooling solution works great in tandem with the AIO cooler, but I'm just curious about noise. The fan speed % from GPU-Z says 20%, but whether it's really picking up the blower fan's speed I'm not sure. If it is, however, I might be grabbing a cheap ref card just for this...
> 
> Link to the temps for the bracket:


you think three of those would work in this case/mobo:


----------



## TommyGunn123

Quote:


> Originally Posted by *the9quad*
> 
> you think three of those would work in this case/mobo:


Could you get 3 AIO radiators in there? Because it looks a tad cramped in there


----------



## ebhsimon

Quote:


> Originally Posted by *TommyGunn123*
> 
> Logan from TekSyndicate has done a video showing the new Corsair HG10 red-mod-style GPU bracket, and as most of us expected, the stock fan and VRM cooling solution works great in tandem with the AIO cooler, but I'm just curious about noise. The fan speed % from GPU-Z says 20%, but whether it's really picking up the blower fan's speed I'm not sure. If it is, however, I might be grabbing a cheap ref card just for this...
> 
> Link to the temps for the bracket:


Good thing you mention noise. Most people put on aftermarket coolers to get rid of the reference blower fan. The VRM temps, while good, aren't groundbreaking. I think bigger and better VRM heatsinks would better complement the AIO cooler.

EDIT: He says "The VRM temps are significantly higher (on the reference cooler vs the Corsair cooler)"... The camera focuses on the VRM temps and it's 1C higher. Lol... sensationalist much?


----------



## the9quad

Quote:


> Originally Posted by *TommyGunn123*
> 
> Could you get 3 AIO radiators in there? Because it looks a tad cramped in there


I doubt it, lol. And custom watercooling is out, because I don't have $600 for blocks, $440 for a new case, and whatever else rads, pumps, tubing, a reservoir, and fittings would cost.


----------



## TommyGunn123

Quote:


> Originally Posted by *ebhsimon*
> 
> Good thing you mention noise. Most people put on aftermarket coolers to get rid of the reference blower fan. The VRM temps, while good, aren't groundbreaking. I think bigger and better VRM heatsinks would better complement the AIO cooler.
> 
> EDIT: He says "The VRM temps are significantly higher (on the reference cooler vs the Corsair cooler)"... The camera focuses on the VRM temps and it's 1C higher. Lol... sensationalist much?


Yeah well that's why I wanna know the actual speed the blower fan runs at, because if it can stay at say 30%, that's still pretty damn quiet, and we all know how good it is for VRM cooling
Quote:


> Originally Posted by *the9quad*
> 
> I doubt it, lol. And custom watercooling is out, because I don't have $600 for blocks, $440 for a new case, and whatever else rads, pumps, tubing, a reservoir, and fittings would cost.


Haha yeah, well if you get 3 brackets and 3 AIOs it's still not cheap, but you'd probably be able to cram 2 in; that 3rd might be a bit on the dodgy side haha


----------



## ebhsimon

Quote:


> Originally Posted by *TommyGunn123*
> 
> Yeah well that's why I wanna know the actual speed the blower fan runs at, because if it can stay at say 30%, that's still pretty damn quiet, and we all know how good it is for VRM cooling


If you look closely you can see the fan on the corsair bracket is at 20% fan speed, RPM unspecified.
And the fan on the stock cooler is 33% fan speed, RPM unspecified (I'm assuming silent mode?).


----------



## TommyGunn123

Quote:


> Originally Posted by *ebhsimon*
> 
> If you look closely you can see the fan on the corsair bracket is at 20% fan speed, RPM unspecified.
> And the fan on the stock cooler is 33% fan speed, RPM unspecified (I'm assuming silent mode?).


The only reason I can think of for them running it at a stupidly low speed like 33% is to exaggerate the VRM2 temps, but yeah I dunno, I'm still cautiously optimistic about it. Maybe someone who's done pretty much the same thing but pure red-mod style can give us some fan speeds and temps


----------



## Roboyto

Quote:


> Originally Posted by *Jeffro422*
> 
> You have your 92mm fan on backwards unless you didn't want it blowing on the VRMs.


The fan is turned around, but this was intentional. The card is in a Corsair 250D and the fan is pulling the heat off the VRM1 right out the side ventilated panel









That was an old pic; I have since upgraded the VRM1 fan to a nice Zalman one and it gave me a 5-7C drop in VRM1 temps.



Spoiler: Zalman Shark Fin















Quote:


> Originally Posted by *heroxoot*
> 
> So I was just about ready to stab that void-if-removed sticker on my 290X..........and it seems MSI got to it first. It had a screwdriver mark through it already. So I take it apart and dear lord it was disgusting. Thermal paste just all over the GPU. It was in and around the die. I would have taken pictures but it would make you vomit. The GPU seems to be staying a tad cooler too. Whoever did this should be punished to a terrible extent. Rub thermal paste all over them.


That's par for the course for most manufacturers, it seems. Good thing they use non-conductive TIM.









This was my Devil 270X



Spoiler: Too much paste














Quote:


> Originally Posted by *Irev*
> 
> Hi mate... when playing games @ 1075 core / 1300 mem I get the same result as FurMark = 89 degrees VRM1 temp.
> When running the card @ stock 1000MHz core I get an 87 degree VRM1 temp in FurMark.
> 
> It seems that from stock 1000/1300 to 1075/1300 I jump up 2 degrees Celsius on VRM1...... is this normal? It should be safe considering VRM1 hits 87 degrees at stock speeds.
> BTW I have a well ventilated case, so that can't be the issue.
> 
> any ideas?
> 
> Edit: I have noticed that bumping up my custom fan profile from 54% to 62% fan speed keeps my VRM1 temp more stable, around 85 degrees Celsius @ 1075 core


It is surprising that gaming gives you similar temperatures to FurMark; would you happen to have some pics of your rig so we can see how it is set up?

Can't remember if you mentioned this, but what 290(X) are you running?

Quote:


> Originally Posted by *ebhsimon*
> 
> Some updates on my VaporX 290.
> 
> I previously thought 1130/1550 was my max overclock. I didn't think there would be a big difference between stock and +100 or +200mV. Boy was I wrong. During the Tomb Raider benchmark, stock power usage was under 200 watts according to GPU-Z; when VDDC was raised by +100mV it went to 330W, and when raised further to +200mV power went up again to 370W. Not sure how accurate the actual number is, but it almost doubled the power usage. Crazy.
> 
> Anyway I managed to get 1180/1550 without artifacts. I haven't pushed the memory very far. I've got Hynix memory in mine though, should I try pushing it further?
> 
> GPU temp reached 76C max and VRM1 reached 83C max during the benchmarks.


That power draw is about right. The highest I've recorded on mine @ 1300/1700, +200mV, +50% power limit, full waterblock, is 359.5W in GPU-Z.

Quote:


> Originally Posted by *TommyGunn123*
> 
> The only reason i can think of them running it at a stupidly low speed like 33% is to exacerbate the vrm2 temps, but yeah idno im still cautiously optimistic for it, maybe someone whos done pretty much the same thing but pure redmod style can give us some fan speed and temps


VRM2 runs nice and cool when it doesn't have heat from everything else being forced over it. With my Kraken G10 I haven't seen temps any higher than 65C on VRM2 and it is passively cooled with a small heatsink.

I am anxiously awaiting some official reviews of this contraption as well. If it can give better temps than my Kraken + Gelid sinks + Zalman fan, then I wouldn't have a problem shelling out $40 for it.

The big questions are still going to be noise/temps. I know the blower fan can be quiet at low speed, but will it cool properly at low speed? And I still don't like the fact it's going to be blowing all the hot air right at the AIO pump.


----------



## kizwan

Quote:


> Originally Posted by *the9quad*
> 
> Results with the new patch - 64-player Shanghai, building down, 1440p Ultra 2xMSAA. As you can see, no massive CPU spikes at all, so the good news is Mantle seems to be working pretty decently. Cards at stock and CPU at 4.3GHz for that match.
> 
> Disregard - I accidentally uploaded an old bench, I'm an idiot.
> Instead, here is a quick Golmud Railway: time above 144fps is >99%, and average fps is >200... wut wut. The biggish spike is a round change/autoswitch or whatever (I think, because I never noticed it in game).
> 
> 
> Spoiler: Warning: Spoiler!


There's a new patch? My internet quota is less than 2GB.









I have been playing BF4 at 1080p with 200% resolution scale (4K), Ultra, no AA, and average FPS is around the ~80s to ~90s (clocks 1000/1300). Do you think I can get the same FPS with a 4K monitor?
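On that question: 200% resolution scale at 1080p multiplies each axis by 2, so the GPU is already rendering the same 3840x2160 pixel count a 4K monitor would ask for, and frame rates should be in the same ballpark (scanout and display differences aside). A quick sketch of the arithmetic:

```python
def render_resolution(width: int, height: int, scale_pct: float) -> tuple[int, int]:
    """Resolution scale multiplies each axis, so the pixel count grows quadratically."""
    s = scale_pct / 100
    return int(width * s), int(height * s)

w, h = render_resolution(1920, 1080, 200)
print(w, h)                      # 3840 2160
print((w * h) / (1920 * 1080))   # 4.0x the pixels of native 1080p
```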


----------



## Blue Dragon

Been playing around with my 2 Asus 290 OC cards, trying to figure out how to get CF to work properly - it only works right when I disable CF, restart the rig, and then activate CF - it'll usually work for at least one bench, but it always returns to running at half-speed clocks... usually my second card is at a 500MHz clock instead of 1000MHz. The 14.6 beta drivers seem to be keeping CF working right (haven't tested fully), but I noticed as a side issue that my second GPU wasn't going to sleep. Not sure if this is the only cause, but I figured out that Asus Tweak was stopping the second card from sleeping. I disabled the start-at-boot option and now my second card 'sleeps' as it should. Will report back once I figure out whether CF stays stable across several benches and games back-to-back.

cat - 14.6 beta
asus vbios - 015.041.000.002 is what it looks like in GPU-Z (original bios ended in 101 instead of 103)


----------



## sugarhell

Quote:


> Originally Posted by *the9quad*
> 
> I doubt it, lol. And custom watercooling is out, because I don't have $600 for blocks, $440 for a new case, and whatever else rads, pumps, tubing, a reservoir, and fittings would cost.


External. Get a Nova 1080 and enjoy your new high-end loop


----------



## PJFT808

*Originally Posted by Jeffro422*

Looks good, we have similar setups


----------



## heroxoot

Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> So I was just about ready to stab that void-if-removed sticker on my 290X..........and it seems MSI got to it first. It had a screwdriver mark through it already. So I take it apart and dear lord it was disgusting. Thermal paste just all over the GPU. It was in and around the die. I would have taken pictures but it would make you vomit. The GPU seems to be staying a tad cooler too. Whoever did this should be punished to a terrible extent. Rub thermal paste all over them.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's par for the course for most manufacturers it seems. Good thing they use non-conductive TIM
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This was my Devil 270X
> 
> 
> Spoiler: Too much paste

Mine was worse and it spread into the little chip areas around the die too. I had to take a toothbrush and scrub it out.


----------



## Sgt Bilko

Quote:


> Originally Posted by *heroxoot*
> 
> Mine was worse and it spread into the little chip areas around the die too. I had to take a toothbrush and scrub it out.


Toothbrushes are awesome for that job, just remember to keep them separate, otherwise no amount of mouthwash will help you


----------



## Jeffro422

Quote:


> Originally Posted by *Roboyto*
> 
> The fan is turned around, but this was intentional. The card is in a Corsair 250D and the fan is pulling the heat off the VRM1 right out the side ventilated panel
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That was an old pic, I have since upgraded the VRM1 fan to a nice Zalman one and it gave me 5-7C drop in VRM1 temps.
> 
> 
> Spoiler: Zalman Shark Fin


I'm going to look into getting a different 92mm fan then. Did you try having the fan blowing on the VRM vs. pulling heat off the VRM with the Shark Fin to see the difference in temperatures? I'm curious now, but I'll definitely grab a different 92mm fan.


----------



## Roy360

Guys, where's HydraVision? I finally set up my Eyefinity group, and now I want to make "frames" so I can just snap applications into place, but it isn't in Catalyst like it was with my R9 280X. I tried downloading it separately, but the wizard won't open after updating to 14.4. Plus, when it did open, it would only let me switch between different desktops.

How do you create frames so you can snap applications into them?


----------



## chronicfx

Quote:


> Originally Posted by *the9quad*
> 
> you think three of those would work in this case/mobo:


I run my 3x 290X in a HAF X with the same slot config. Use the Afterburner profile and they will be 65-75C gaming. Loud, but I don't hear it with headphones.


----------



## PureBlackFire

Got some time to do tests before I change the reference cooler on this. Which drivers should I install, 14.4 or 14.6?


----------



## lawson67

Quote:


> Originally Posted by *Roboyto*
> 
> If cooling components didn't make them more stable, efficient, or last longer then there would be little use for any of the products that are created for PC hardware.
> 
> If you enjoy RMAing your parts from unnecessary abuse then that is your prerogative.
> 
> You may not want to believe that an electronic component operating at a lower temperature is more efficient, but that is absolutely how it works.
> 
> 44W reduction in power consumption; one small example.
> http://www.legitreviews.com/nzxt-kraken-g10-gpu-water-cooler-review-on-an-amd-radeon-r9-290x_130344/5


Point one: "If cooling components didn't make them more stable, efficient, or last longer then there would be little use for any of the products that are created for PC hardware."

= At 89C or even 100C, VRM1 is well below the designed operating temperature...So it's not less stable or less efficient, and it's not going to have a shorter life!

Point 2: "If you enjoy RMAing your parts from unnecessary abuse then that is your prerogative."

= Can you explain to me and the manufacturer of the voltage regulator modules where the abuse is coming from if you are running a component at 89C or even 100C when it's designed to run up to 120C?...Also I won't need to RMA my card as there is no abuse taking place at 89C or even 100C, as I am sure the manufacturer will be only too happy to point out to you!...However you clearly know better than the people that designed them, and therefore you should tell them where they are going wrong!
Quote:


> Originally Posted by *Mega Man*
> 
> Feel free! Please do tell us when they go boom though... (please note the "when", not the "if")


There will be no need to tell you when they go boom because guess what, they simply will not at 89C or even 100C!...Please educate yourself on the component that you would like to scare others into thinking will blow up at temperatures well below the design limit...Or tell the people that manufacture and build them, as you clearly know better than they do!

Also,
from tech support at Sapphire, 2013-11-12 [10:07]:
"Yes, VRM temp normally over 100c and sometimes goes to 120c"

I have never seen my VRM1 go over 91C, which is way below its designed limit!.. And if I see it hit 91C again I will not be worried one bit, as again it is well within its designed operating limit of 120C...Scaremongering is one thing!... But data and facts are the truth!


----------



## kizwan

Quote:


> Originally Posted by *PureBlackFire*
> 
> got some time to do tests before I change the reference cooler on this. what drivers should I install? 14.4 or 14.6?
> 
> 
> Spoiler: Warning: Spoiler!


14.6


----------



## sena

I forgot to post submision.

Here it is:

1. http://www.techpowerup.com/gpuz/4kggm/
2. 2x Asus R290X reference
3. 3rd party water


----------



## PureBlackFire

OK, I just ran the Valley bench and some things stood out. First is performance: it's much lower than my GTX 780 and much lower than most other R9 290 scores I've seen on here. Odd, and GPU usage stayed at 100% most of the run. Second, the card did not break 70C on the reference cooler. Also, VRM1's highest temp was 47C and VRM2's 59C. My temps are generally better than what many people post on here with the same hardware, so no surprise there, but the numbers and low Valley score seem to suggest that GPU usage is not really 100%. I used 14.4 btw. Gonna try some other benches and 14.6. I think I need to update MSI Afterburner to the newest version as well.


----------



## PureBlackFire

Another Valley run, same result.


----------



## sugarhell

Hmm thats strange. Did you use DDU to uninstall the drivers?


----------



## VSG

That is incredibly low, I first thought may be it was a 1440p run given the low score. Something is really off there!


----------



## Gunderman456

That is close to half of what I get with one card. Two cards in CF give me 112fps, and we have the same CPU.

I'm currently on the MSI AB 3.0 beta 19 (so not even the newest one) and on 14.4 drivers.

Your loads appear fine and your temps are decent like mine; you're not overclocked, but that should not result in such poor fps.

Try changing drivers to 14.6 beta and see what happens.


----------



## sena

I also have a problem with Valley; it's not that low, but it's pretty low for two 290Xs, and there is a lot of stuttering.

I will post score later.


----------



## PureBlackFire

Okay guys, installed 14.6 beta and the final release of AB 3.0.0. My score in Valley is now proper for a stock 290:




core temp went up to 77c. vrm temps still very good:


for comparison a stock gtx780 (863/1502) with 80mhz boost:


----------



## Gunderman456

Excellent news!


----------



## ebhsimon

Wow @PureBlackFire you have some pretty sweet temps for a reference cooler.
77C core and 55C VRM1 @ 100% load? Not too shabby at all mate.
What are the noise levels like under 68% fan speed?

By the way, I think it's time you overclocked


----------



## PureBlackFire

Noise isn't an issue with my earbuds on, but without them, lol. And my H440 has noise-canceling foam. It sounds like the kitchen faucet is wide open lol. Still, I will admit it's not as bad as I'd heard up until now. Here is my Metro LL run:




card is pretty strong. gonna try some overclocking.


----------



## piquadrat

Quote:


> Originally Posted by *lawson67*
> 
> = Can you explain to me and the manufacturer of the voltage regulator modules where the abuse is coming from if you are running a component at 89C or even 100C when it's designed to run up to 120C?...Also I won't need to RMA my card as there is no abuse taking place at 89C or even 100C, as I am sure the manufacturer will be only too happy to point out to you!...However you clearly know better than the people that designed them, and therefore you should tell them where they are going wrong!
> There will be no need to tell you when they go boom because guess what, they simply will not at 89C or even 100C!...Please educate yourself on the component that you would like to scare others into thinking will blow up at temperatures well below the design limit...Or tell the people that manufacture and build them, as you clearly know better than they do!
> (...)
> 
> I have never seen my VRM1 go over 91C, which is way below its designed limit!.. And if I see it hit 91C again I will not be worried one bit, as again it is well within its designed operating limit of 120C...Scaremongering is one thing!... But data and facts are the truth!


The fact is that temperature shortens the life of electronics, semiconductor devices included (electromigration, defect generation, diffusion processes - all dependent on temperature), and in the case of semiconductors it reduces efficiency through thermal generation of charge carriers. A hotter semiconductor dissipates more heat (in non-linear fashion).
But whether the temperature reduces the life of the VRMs on an R9 from 5 to 2 years, or from 25 to 10 years, remains to be seen.
I would not be worried about VRM life so much as the life of the surrounding components, especially the capacitors. They are heated through the metalization layers of the PCB and through air convection.
And if anybody ever tells you that the capacitors mounted on these cards can withstand a hundred degrees for 5 years without consequences, he is living in a dreamworld. After 2-3 years the measured capacitance will be reduced by half, after 4-5 by half of that.
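This point can be made semi-quantitative with the Arrhenius acceleration model commonly used in component-lifetime derating: each fixed temperature rise multiplies the wear-out rate by a roughly constant factor. A Python sketch; the 0.7 eV activation energy is a generic textbook value, not one specified for these VRMs or capacitors:

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def acceleration_factor(t_use_c: float, t_stress_c: float, ea_ev: float = 0.7) -> float:
    """Arrhenius acceleration factor: how much faster wear-out proceeds
    at t_stress_c than at t_use_c, for a given activation energy."""
    t_use = t_use_c + 273.15
    t_stress = t_stress_c + 273.15
    return math.exp((ea_ev / K_B) * (1 / t_use - 1 / t_stress))

# The 89C vs 100C VRM temps from the discussion above (illustrative Ea):
print(round(acceleration_factor(89, 100), 2))
```

Under this model both sides of the argument can be right at once: 100C can sit inside the rated limit while still aging the part roughly twice as fast as 89C does.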


----------



## xDorito

I currently have the Gigabyte R9 290X Windforce card and I was considering purchasing a second. If I were to get a reference 290X, would I run into any trouble getting them to crossfire, or any performance issues?


----------



## PureBlackFire

So it looks like 1097MHz/1400MHz is the highest I can do without touching voltage; any more and I see artifacting.

1047mhz/1350mhz


1097mhz/1400mhz


----------



## Roboyto

Quote:


> Originally Posted by *heroxoot*
> 
> Mine was worse and it spread into the little chip areas around the die too. I had to take a toothbrush and scrub it out.


I've had them like this too, just didn't have any photographic evidence handy.

Quote:


> Originally Posted by *Jeffro422*
> 
> I'm going to look into getting a different 92mm fan then. Did you try having the fan blowing on the VRM vs. pulling heat off the VRM with the Shark Fin to see the difference in temperatures. I'm curious now, but I'll definitely grab a different 92mm fan.


I haven't tried it blowing down because I'd rather not heat up the PCB and anything else around it. Also, with the 250D, the side panels are ventilated, so the fan lines up nearly perfectly to exhaust it all right out the side.

I would imagine if it made this much difference pulling heat off, it would work just as well (if not better) blowing down through the heatsink.

Quote:


> Originally Posted by *PureBlackFire*
> 
> got some time to do tests before I change the reference cooler on this. what drivers should I install? 14.4 or 14.6?
> 
> 
> 
> Spoiler: Warning: Spoiler!


I would definitely say 14.6

Quote:


> Originally Posted by *lawson67*
> 
> Point one: "If cooling components didn't make them more stable, efficient, or last longer then there would be little use for any of the products that are created for PC hardware." = At 89C or even 100C, VRM1 is well below the designed operating temperature...So it's not less stable or less efficient, and it's not going to have a shorter life!
> 
> Point 2: "If you enjoy RMAing your parts from unnecessary abuse then that is your prerogative." = Can you explain to me and the manufacturer of the voltage regulator modules where the abuse is coming from if you are running a component at 89C or even 100C when it's designed to run up to 120C?...Also I won't need to RMA my card as there is no abuse taking place at 89C or even 100C, as I am sure the manufacturer will be only too happy to point out to you!...However you clearly know better than the people that designed them, and therefore you should tell them where they are going wrong!
> There will be no need to tell you when they go boom because guess what, they simply will not at 89C or even 100C!...Please educate yourself on the component that you would like to scare others into thinking will blow up at temperatures well below the design limit...Or tell the people that manufacture and build them, as you clearly know better than they do!










I say good day, and goodbye.

Quote:


> Originally Posted by *PureBlackFire*
> 
> another Vallley run, same result.
> 
> 
> Spoiler: Warning: Spoiler!


I agree with @sugarhell and @Gunderman456: you should definitely do a driver scrub, especially coming from Nvidia; a fresh Windows install may not hurt either, depending on how long it's been.

14.6 is definitely the better bet of the 14.X drivers. I found when I was on Win7 that 13.12 was king and 14.1-14.4 gave me nothing but problems, but since I switched to 8.1 Pro, 14.6 has been glorious thus far.


----------



## heroxoot

Even with the GPU die cleaned and some fresh thermal paste, my GPU doesn't idle as low as it did on 14.4. On 14.6 my idle is 53C with a 78C load temp at stock clocks. On 14.4 the idle was about 48C in a cold room with the nasty excess goop still on. On 13.12 it was also about 48C idle.

Why is 14.6 so damn hot? I remember when the awesome boosting driver for the 7970 came out. It didn't push this much more heat on my card. I saw a 1C difference, not 3-5C.


----------



## Roy360

I'm starting to think I have the wrong program in mind. What's that AMD application that lets you snap applications to certain areas?

Nvm, it's HydraGrid. Why isn't this in Catalyst anymore?

...and there isn't anything under optional downloads.


----------



## sugarhell

Hydra^


----------



## Widde

What might be causing this? :S http://piclair.com/l16yi

Top card is idling now at 80 degrees :S


----------



## sugarhell

Quote:


> Originally Posted by *Widde*
> 
> What might be causing this? :S http://piclair.com/l16yi
> 
> Top card is idling now at 80 degrees :S


Dual monitor? 3d clocks? Flash player?


----------



## Arizonian

Quote:


> Originally Posted by *sena*
> 
> I forgot to post submision.
> 
> Here it is:
> 
> 1. http://www.techpowerup.com/gpuz/4kggm/
> 2. 2x Asus R290X reference
> 3. 3rd party water


Congrats - added


----------



## Widde

Quote:


> Originally Posted by *sugarhell*
> 
> Dual monitor? 3d clocks? Flash player?


Might be the summer heat combined with YouTube then :S It seems ridiculous, though, that it's at 80C at the desktop when the fan is at 45% and ambient is 20-21 degrees. Gonna take them apart tomorrow and clean them, but while gaming the temps are still only in the 75-80s.


----------



## sugarhell

Quote:


> Originally Posted by *Widde*
> 
> Might be the summer heat combined with YouTube then :S It seems ridiculous, though, that it's at 80C at the desktop when the fan is at 45% and ambient is 20-21 degrees. Gonna take them apart tomorrow and clean them, but while gaming the temps are still only in the 75-80s.


Just disable hardware acceleration on flash player


----------



## heroxoot

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Widde*
> 
> Might be the summer heat combined with YouTube then :S It seems ridiculous, though, that it's at 80C at the desktop when the fan is at 45% and ambient is 20-21 degrees. Gonna take them apart tomorrow and clean them, but while gaming the temps are still only in the 75-80s.
> 
> 
> 
> Just disable hardware acceleration on flash player

And in your browser if it uses it. Firefox has hardware acceleration when available. I'd see my GPU clocking up for no reason during Flash; I had to disable it.


----------



## sugarhell

I don't know why they have hardware acceleration on by default. It sucks. HTML5 >>>> Flash


----------



## Widde

Quote:


> Originally Posted by *sugarhell*
> 
> Just disable hardware acceleration on flash player


God I'm stupid >_< I forgot I had enabled hardware acceleration while troubleshooting the bluescreens I was getting while watching videos on the more "questionable" websites
















Works fine in IE though, but not Firefox









Back to 45-50 degrees idle, thanks







^^


----------



## the9quad

Quote:


> Originally Posted by *chronicfx*
> 
> I run my 3x 290X in a HAF X with the same slot config. Use the Afterburner profile and they will be 65-75C gaming. Loud, but I don't hear it with headphones.


Lol, thanks, I have already been doing that since the day they were released. I was asking about 3 of those AIO coolers. Thanks for the reply though, seriously +rep.


----------



## rdr09

Quote:


> Originally Posted by *Roy360*
> 
> I"m starting to think I have the wrong program in mind. What's that AMD application that lets to snap applications to certain areas?
> 
> nvm, its HydraGrid. Why isn't this in catalyst anymore?
> 
> ...and there isn't anything under optional downloads.


found it yet?

http://support.amd.com/en-us/download/desktop/previous/detail?os=Windows%207%20-%2064&rev=13.12#optional-downloads


----------



## Roy360

Quote:


> Originally Posted by *rdr09*
> 
> found it yet?
> 
> http://support.amd.com/en-us/download/desktop/previous/detail?os=Windows%207%20-%2064&rev=13.12#optional-downloads


thanks. I think I downloaded an older version before


----------



## chronicfx

Quote:


> Originally Posted by *the9quad*
> 
> Lol, thanks, I have already been doing that since the day they were released. I was asking about 3 of those AIO coolers. Thanks for the reply though, seriously +rep.


Aww sorry, I was just glancing through on my phone and didn't even realize that was you. If you can pull that off with three, I will fix Crossfire myself for Ubisoft. Hopefully the hoses are long enough to find a spot for the rads.


----------



## Hl86

I'm thinking of getting another 290 for crossfire.
I have my CPU @ 4.9GHz, 1 SSD, 1 hard disk.
Will this PSU be sufficient?
http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story&reid=93


----------



## cephelix

Guys, for the EK backplate, which one is compatible with the rev 2.0 version of the full-cover waterblock? I have the MSI R9 290 Gaming.


----------



## DeadlyDNA

Quote:


> Originally Posted by *cephelix*
> 
> Guys, for the EK backplate, which one is compatible with the rev 2.0 version of the full-cover waterblock? I have the MSI R9 290 Gaming.


Possibly PM red1776, he's doing a build with 4 of the MSI 290X Gamings. Not sure if they are much different from the 290?


----------



## cephelix

Quote:


> Originally Posted by *DeadlyDNA*
> 
> possibly pm red1776 hes doing a build with 4 of the msi 290x gamings. Not sure if they are much diff from290?


Thanks, will PM him then. There isn't a difference, I think, but I just can't find the backplate on FrozenCPU...


----------



## Juub

How is the CrossFire experience with the R9 290 and R9 290X? I heard it is a lot smoother and more stable than with the 7000 series. Any truth to that?


----------



## Jeffro422

1200/1500 +125mv +50 power

Is this pretty good? Couldn't find many single 290x scores.


----------



## FlighterPilot

Quote:


> Originally Posted by *heroxoot*
> 
> How well do your 290s and 290Xs fare at the same clocks? My 290X was underperforming a little bit compared to others, but it may be my CPU. I'm curious how much more performance the shader difference really gives. With the 7950/70 the differences were only 4-5 FPS with a friend using a similar rig.


This is suuuper late, but here is the performance of my *290 and 290X* crossfired, both with the same clocks in Heaven (947MHz core, 1250MHz memory).



Is this on par with typical crossfire 290 setups?


----------



## heroxoot

Seems pretty damn nice to me. My single 290X is at stock right now and I'm seeing about 52 FPS in Heaven on it solo. With the OC it's almost 54.


----------



## Roy360

Quote:


> Originally Posted by *Hl86*
> 
> Im thinking of getting another 290 for crossfire.
> I have [email protected] 4,9ghz, 1 ssd, 1 harddisk.
> Will this psu be sufficient?
> http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story&reid=93


Never heard of that company before, but my 3 R9 290s take ~900W at the wall with my AX1200, so you should be good. I posted my results with 2 cards and 1 card somewhere in this thread.

That's with the cards underclocked to 947/1250:
ASUS DCII card <-- uses extra voltage by default
Reference XFX R9 290
VisionTek R9 290 <-- uses extra voltage by default

P.S. Do you guys think it would be worth swapping one of my R9 290s for an R9 290X? My VisionTek card is really limiting my overclock, and I have a reference Club3D R9 290X that isn't doing anything.


----------



## heroxoot

AMD pretty much just told me my fan problem is unheard of. I can have full control on 13.12 but not 14.1 or higher and they want me to reformat my pc, AGAIN. I just reformatted on Monday and I'm not doing it again.

I feel like I'm being trolled.


----------



## doginpants12

Currently I have a single XFX Core R9 290 (soon to have an EK waterblock); here is my validation. I'm having a problem with Metro: Last Light that I can't find mentioned anywhere: while playing I get frame rates between 15 and 20, and my GPU load percentage jumps up and down randomly. I am running it at 1600x900, everything maxed except SSAA (off). Any ideas?


----------



## Mega Man

Quote:


> Originally Posted by *xDorito*
> 
> I currently have the gigabyte r9 290x windforce card and I was considering purchasing a second. If I were to get a reference 290x, would I run into any trouble with getting them to crossfire/performance issues?


no issues
Quote:


> Originally Posted by *DeadlyDNA*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cephelix*
> 
> Guys,for the EK backplate, which one is compatible with the rev 2.0 version of the full cover waterblock? I have the msi r9 290 gaming
> 
> 
> 
> possibly pm red1776 hes doing a build with 4 of the msi 290x gamings. Not sure if they are much diff from290?

There is no difference on the back of the PCB, just slightly larger caps on the front! It should be the same backplate:

http://www.frozencpu.com/products/21680/ex-blc-1569/EK_R9-290X_VGA_Liquid_Cooling_RAM_Backplate_-_Black_EK-FC_R9-290X_Backplate_-_Black.html


----------



## FlighterPilot

Quote:


> Originally Posted by *Roy360*
> 
> never heard of that company before, but my 3 R9 290s take ~900W at the wall with my AX1200, so you should be good. I posted up my results with 2 and 1 card somewhere on this thread.
> 
> That with the cards underclocked to 947/1250.
> ASUS DCII Card <-- uses extra voltage by default
> Reference XFX R9 290
> Vistontek R9 290 <-- uses extra voltage by default
> 
> p.s, do you guys think it would be worth swapping one of my R9 290s with a R9 290x? My visiontek card is really limiting my overclock, and I have a reference CLUB3D R9 290X that isn't doing anything.


Go for it. Since you're on air there's only a small investment of time required to swap the cards out. I personally haven't had any issues mixing the two.


----------



## cephelix

Quote:


> Originally Posted by *Mega Man*
> 
> no issues
> there is no difference to the back of the pcb, just slightly larger caps in the front ! should be the same backplate
> 
> http://www.frozencpu.com/products/21680/ex-blc-1569/EK_R9-290X_VGA_Liquid_Cooling_RAM_Backplate_-_Black_EK-FC_R9-290X_Backplate_-_Black.html


Thanks for that, hoping to order the stuff I need this weekend.


----------



## Irev

Quote:


> Originally Posted by *Roboyto*
> 
> It is surprising that gaming is giving similar temperature to FurMark, would you happen to have some pics of your rig so we can see how it is set up?
> 
> Can't remember if you mentioned this, but what 290(X) are you running?


I have this case here... though the fans are set up a little differently: I have 1x 120mm rear exhaust, 2x 140mm top exhaust, 1x 120mm side panel exhaust, 1x 140mm front intake.
http://www.casecom.com.tw/pc-case/2012/CS14.jpg

I have the Sapphire R9 290 TRI-X OC


----------



## kizwan

Quote:


> Originally Posted by *Jeffro422*
> 
> 
> 1200/1500 +125mv +50 power
> 
> Is this pretty good? Couldn't find many single 290x scores.


Yup, that looks good.


----------



## Roboyto

Quote:


> Originally Posted by *Irev*
> 
> I have this Case here.... though the fans are setup a little different... I have 1x 120mm rear exhaust, 2x 140mm top exhaust, 1x 120mm side panel exhaust, 1x 140mm front intake
> http://www.casecom.com.tw/pc-case/2012/CS14.jpg
> 
> I have the Sapphire R9 290 TRI-X OC


I think more intake air is needed; a single 140mm isn't going to cut it.

Add another fan in front as intake. Switch side panel to intake.
Try adding a fan to the bottom 120 slot for more fresh air.


----------



## kizwan

Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Irev*
> 
> I have this Case here.... though the fans are setup a little different... I have 1x 120mm rear exhaust, 2x 140mm top exhaust, 1x 120mm side panel exhaust, 1x 140mm front intake
> http://www.casecom.com.tw/pc-case/2012/CS14.jpg
> 
> I have the Sapphire R9 290 TRI-X OC
> 
> 
> 
> I think more intake air is needed; a single 140mm isn't going to cut it.
> 
> Add another fan in front as intake. Switch side panel to intake.
> Try adding a fan to the bottom 120 slot for more fresh air.

Great minds think alike!


----------



## sena

Quote:


> Originally Posted by *Juub*
> 
> How is the Crossfire experience with the R9 290 and R9 290X? I heard it is a lot smoother and stable than with the 7000 series. Any truth to that?


It's superb; it's even smoother than my old GTX 780 SLI setup.


----------



## King4x4

I can confirm that 290s are much, much smoother than 780s.

Tri-780s Vs Quad 290x

The Quads are much smoother (which is a start!)


----------



## Enzarch

Quote:


> Originally Posted by *Dasboogieman*
> 
> I agree yes, withstand =/= safe for sustained operation
> 
> For anyone interested in really how stable your memory OC is. I suggest checking out MemtestCL https://simtk.org/project/xml/downloads.xml?group_id=385
> You'll have to register an account to download since it's an academic institution.
> 
> As it turns out, my 1500mhz mem clocks were throwing out loads of silent errors but everything stabilized once I dialed it back to 1475mhz.
> 
> Also, I'm curious as to what results the people with chronic black screen get at stock, if it throws out loads of errors then it pretty much confirms that blackscreen is caused by unstable memory.
> 
> This nifty little app may also answer the question of whether Aux Voltage helps memory OCs once and for all since it is far more sensitive than any benchmark.
> 
> Should probably mention, I'm using the latest AMD driver (the one for Watch Dogs). IIRC, 13.12 and 14.4 throw up a lot of false negatives.
> 
> Edit: Just also tested with the fans at 100% at stock; extra cooling on the core + VRAM doesn't seem to help the mem stability @ 1500MHz. I guess Roboyto was correct when he tried to use Fujipoly for the RAM.


Wow, thank you for finding this; this definitely confirms (for me) that black screens are caused by memory instability.

My black screens happen when I use higher core voltages (>80mV) at 120Hz only.

This is mirrored exactly in MemtestCL, with thousands of errors when I set +150mV core and a 120Hz refresh, and ZERO errors below about +75mV and/or at 60Hz.

I sure hope this, in some way, gets us closer to a resolution, as I am one of many who have this identical problem.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Juub*
> 
> How is the Crossfire experience with the R9 290 and R9 290X? I heard it is a lot smoother and stable than with the 7000 series. Any truth to that?


It is sublime, if I'm honest. Now we just need CrossFire support in borderless windowed mode and it's perfect.








Quote:


> Originally Posted by *sena*
> 
> Its superb, its even smoother then my old gtx 780 SLI setup.


Quote:


> Originally Posted by *King4x4*
> 
> I can confirm that 290s are much much smoother then 780s.
> 
> Tri-780s Vs Quad 290x
> 
> The Quads are much smoother (which is a start!)


^


----------



## sugarhell

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It is sublime if im honest, now we just need Xfire support in windowed borderless mode and it's perfect
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ^


You have CFX support in borderless with Mantle.


----------



## Sgt Bilko

Quote:


> Originally Posted by *sugarhell*
> 
> You have cfx support in borderless with mantle


Really?

Never tested it out before.....AWESOME!!!!


----------



## sena

Guys, I have a problem with monitor sleep: when my monitor enters sleep it can't wake, and I have to turn off the PC to get an image again.

ULPS is disabled; TriXX is used for overclocking.

Can anyone shed some light on this?

Thanks

EDIT: Now my VRAM is stuck at 3D speed in 2D, damn.


----------



## jtom320

Does anyone have any thoughts on the 290 Heatkiller or Aquacomputer blocks?

I'd love to get the Aquacomputer, but I don't think the active backplate is ever going to be in stock, so my main focus is the Heatkillers. This would be for two reference 290s.


----------



## heroxoot

After tons of emails in what seems like the poorest English I have ever seen, AMD tells me my fan problem isn't a driver issue. So why does it work on 13.12 and not on 14.1 and up? They then had me submit a ticket about the 14.6 driver specifically, because I mentioned it causes my GPU more heat than 14.4.

Kind of mad, but it seems I'm the only person who cannot control their GPU fan speed. No idea why.


----------



## sugarhell

Quote:


> Originally Posted by *heroxoot*
> 
> After tons of emails with what seems like the most poor english I have ever seen, AMD tells me my fan problem isn't a driver issue. So why does it work on 13.12 and not 14.1 and up? They then had me submit a ticket about the 14.6 driver specifically because I mentioned it causes my GPU more heat than 14.4.
> 
> Kind of mad, but it seems im the only person who cannot control their GPU fan speed. No idea why.


I already told you, it's a BIOS problem. But MSI doesn't care at all.

The Lightning has fan control on 13.12; with 14.x it doesn't.


----------



## heroxoot

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> After tons of emails with what seems like the most poor english I have ever seen, AMD tells me my fan problem isn't a driver issue. So why does it work on 13.12 and not 14.1 and up? They then had me submit a ticket about the 14.6 driver specifically because I mentioned it causes my GPU more heat than 14.4.
> 
> Kind of mad, but it seems im the only person who cannot control their GPU fan speed. No idea why.
> 
> 
> 
> I already told you its a bios problem. But msi doesnt care at all
> 
> Lightning 13.12 has fan control with 14.x it doesnt.

Actually, you're wrong! Just got off the phone with MSI, and my 290X has a new uber BIOS or something, he said. Still don't have the email with it, though.


----------



## hwoverclkd

Quote:


> Originally Posted by *heroxoot*
> 
> AMD pretty much just told me my fan problem is unheard of. I can have full control on 13.12 but not 14.1 or higher and they want me to reformat my pc, AGAIN. I just reformatted on Monday and I'm not doing it again.
> 
> I feel like I'm being trolled.


That perhaps is what 'they' read on the other forums as the solution.


----------



## Yvese

Quote:


> Originally Posted by *Jeffro422*
> 
> 
> 1200/1500 +125mv +50 power
> 
> Is this pretty good? Couldn't find many single 290x scores.


Is that with AF set to Performance in CCC? This is what I get with a 290 @ 1150/1500 with AF set to Performance:


----------



## hwoverclkd

Quote:


> Originally Posted by *heroxoot*
> 
> Actually you're wrong! Just got off the phone with MSI and my 290X has a new uber bios or something he said. Still don't have the email with it tho.


You don't have a Lightning, do you? For the 290X Lightning, it's a BIOS update that fixes the fan control on 14.x.


----------



## heroxoot

Quote:


> Originally Posted by *acupalypse*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Actually you're wrong! Just got off the phone with MSI and my 290X has a new uber bios or something he said. Still don't have the email with it tho.
> 
> 
> 
> you don't have a lightning, do you? For 290x lightning, it's a bios update that fixes the fan control on 14.x.

No, I have the 4G Gaming, but there seems to be an update for it too. It's not the same problem; my fans all work, I just can't adjust their speed.

I never got the BIOS. Called MSI, and instead they are going to test the same card and see if they can find the issue I'm having. If not, I might have to RMA... AGAIN. The guy said if my BIOS is a different version they will send the new one first to try, but if my issue is isolated I will need to RMA.

Lovely.


----------



## Jeffro422




Quote:


> Originally Posted by *Yvese*
> 
> Is that with AF set to Performance in CCC? This is what I get with a 290 @ 1150/1500 with AF set to Performance:






Just ran it with AF set to Performance. 1150/1500 got me a score of 1568 at 62.3 FPS. Print Screen blacked out Heaven and only captured my second monitor; I'll do it again later. I didn't know about changing the AF, although in CCC on 14.6 it's called Texture Filtering Quality; is that what you were referring to? I set that to Performance. It did net me the same score as when I had the core at 1200. Will try 1200/1500 and beyond later, but can't right now.


----------



## Roboyto

Quote:


> Originally Posted by *sena*
> 
> Guys i have problem with monitor sleep, when my monitors enters sleep it cant awake, i have to turn off pc to have image again.
> 
> ULPS is disabled, trixx is used for overclocking.
> 
> Can anyone throw some light on this.
> 
> Thanks
> 
> EDIT: Now my vram is stuck at 3d speed in 2d, damn.


I'm seeing this issue as well. I need to turn the 3rd monitor in my array(far right) on, off, on to get all 3 to wake properly.
Quote:


> Originally Posted by *jtom320*
> 
> Does anyone have any thoughts on the 290 Heatkiller or Aquacomputer blocks?
> 
> I'd love to get the Aquacomputer but I don't think the active backplate is every going to be in stock so my main focus is the heatkillers. This would be for two reference 290s.


The Aquacomputer with either backplate gives you best VRM1 cooling out of the box. Don't know much about Heatkiller.


----------



## Yvese

Quote:


> Originally Posted by *Jeffro422*
> 
> Just ran it with AF set to performance. 1150/1500 got me 1568 62.3fps. Print screen blacked out Heaven and only print screened my second monitor. I'll do it again later. I didn't know about changing the AF. Although in CCC on 14.6 it's called Texture Filtering Quality, is that what you were referring to? I set that to performance. It did net me the same score as when I had the core at 1200. Will try 1200/1500 and beyond later but can't right now.


Yeah, Texture Filtering Quality is the one. It surprisingly gives a good boost to benchmarks. I always have mine on High Quality, which puts me at 56.7 in Heaven.


----------



## heroxoot

Wooooow, I had no idea setting the AF mattered. That is a huge reason why my 290X was performing even lower on 14.4; I always set it to Quality. Leaving it on Standard or even changing it to Performance looks no different, in all honesty. The FPS difference is about 2 FPS in Heaven also: 50.x on Quality, 52.x on Performance. Standard is somewhere in the middle, so I just left it on Standard.


----------



## Jeffro422

Quote:


> Originally Posted by *Yvese*
> 
> Yea Texture Filtering Quality is the one. It surprisingly gives a good boost to benchmarks. I always have mine on High Quality which put me at 56.7 in Heaven.


Had the chance to try 1210/1500 with TFQ set to Performance. My previous post was 1200/1500, with 62.3 FPS and a score of 1568.
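As a side note on reading these Heaven 4.0 numbers: the overall score appears to be a fixed multiple of the average FPS (the 62.3 FPS / 1568 run gives 1568 / 62.3 ≈ 25.2), so a reported score and FPS figure can be cross-checked against each other. A small sketch; the multiplier is inferred purely from numbers posted in this thread, not from any official Unigine documentation:

```python
# Cross-check a Unigine Heaven 4.0 result: the overall score seems to be
# a fixed multiple of average FPS. The factor below is inferred from a
# run posted in this thread (62.3 FPS -> score 1568); it is an assumption.
SCORE_PER_FPS = 1568 / 62.3  # ~25.17

def expected_score(avg_fps):
    """Score implied by an average FPS figure."""
    return avg_fps * SCORE_PER_FPS

def consistent(avg_fps, score, tolerance=0.03):
    """True if a reported score is within `tolerance` (3% by default)
    of the score implied by the reported average FPS."""
    return abs(score - expected_score(avg_fps)) <= tolerance * score

print(consistent(62.3, 1568))   # the run above -> True
print(consistent(62.3, 2000))   # mismatched pair -> False
```

This is only a plausibility check for transcription mistakes (e.g. a cut-off screenshot), not a benchmark of anything itself.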


----------



## ledzepp3

What've y'all been using for stability testing? I've always used MSI Kombustor with a variety of tests to stress the memory plus the core.

-Zepp


----------



## pdasterly

BF4 gets things cooking fast; I use Unigine also.


----------



## Roboyto

Quote:


> Originally Posted by *Hl86*
> 
> Im thinking of getting another 290 for crossfire.
> I have [email protected] 4,9ghz, 1 ssd, 1 harddisk.
> Will this psu be sufficient?
> http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story&reid=93


Might want something a little newer with better efficiency, especially if you plan to OC the cards; looks like that review is from 2008. You could likely get a new 850W and be OK.

Something like this would do:

http://www.newegg.com/Product/Product.aspx?Item=N82E16817139011

Similar price but 1000W:

http://www.newegg.com/Product/Product.aspx?Item=N82E16817182285

I'm very happy with my Capstone 450/650 units and considering the 1000W is a SuperFlower product as well, I would bet it would perform very well.
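Roy360's wall-socket measurement earlier gives a rough way to reason about headroom: add up estimated component draws and compare against the supply's rating with some safety margin. A minimal sketch; every wattage below is a ballpark assumption (not a measurement), and the 0.8 derate is just a conservative rule of thumb:

```python
# Rough PSU headroom check for a CrossFire build.
# All wattage figures below are ballpark assumptions, not measurements.

def psu_headroom(psu_watts, component_watts, derate=0.8):
    """Return (total_draw, budget, ok): `budget` applies a derate so the
    PSU isn't asked to run at its full label rating under sustained load."""
    total = sum(component_watts.values())
    budget = psu_watts * derate
    return total, budget, total <= budget

build = {                      # assumed figures for an i7 + 2x R9 290 rig
    "i7 (overclocked)":  160,
    "R9 290 #1":         260,
    "R9 290 #2":         260,
    "board/RAM/drives":   80,
}

for psu in (850, 1000):
    total, budget, ok = psu_headroom(psu, build)
    print(f"{psu}W unit: draw ~{total}W vs {budget:.0f}W budget -> "
          f"{'OK' if ok else 'tight'}")
```

With a less conservative derate (say 0.9), the same assumed build clears an 850W unit, which lines up roughly with the 850W suggestion above.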


----------



## Roboyto

Quote:


> Originally Posted by *ledzepp3*
> 
> What've y'all been using for stability testing? I've always used MSI Kombustor with a variety of tests to stress the memory plus the core.
> 
> -Zepp


Kombustor/FurMark are a poor representation of stability for gaming. They usually just give a worst-case scenario for power draw and temperatures.

You're better off with:


Anything 3DMark
Unigine
Catzilla
Various game benches such as Tomb Raider, Bioshock, etc
My personal favorite is the FFXIV bench, as it always gives me the lowest stable clocks. Any settings that can run the FFXIV bench will run anything else.


----------



## Yvese

Quote:


> Originally Posted by *Jeffro422*
> 
> Had the chance to try 1210/1500 with TFQ set to performance. My previous post was 1200/1500 and 62.3 fps score: 1568


I can't test 1200, but your 1150 score seems about right. Here are specopsFI's results @ 1150/1500; you could use them as a reference for your 290X.

His older results (linked in that thread) show how it compares with a 290 as well. The numbers are pretty interesting.


----------



## hwoverclkd

Quote:


> Originally Posted by *ledzepp3*
> 
> What've y'all been using for stability testing? I've always used MSI Kombustor with a variety of tests to stress the memory plus the core.
> 
> -Zepp


Valley and Tomb Raider here... at least two passes each. Firestrike is also good, although I've had OCs that worked flawlessly in Firestrike but failed on both TR and Valley. MSI Kombustor tests draw the most power from the wall but have not been a good test in terms of stability. That's just me...


----------



## heroxoot

Just got a call from MSI. They finally replicated my fan problem on the same card with the same BIOS. Now I'm waiting for a callback tomorrow with some information as to whether they can fix it or have to escalate the problem to the main HQ. I don't know why I couldn't get anyone to run this test on a previous call, but I guess it's better late than never. If it isn't a BIOS issue, I was told they will most likely collaborate with AMD on a driver fix. I'm sure this is a BIOS problem, though.


----------



## cephelix

Quote:


> Originally Posted by *heroxoot*
> 
> Just got a call from MSI. They replicated my fan problem finally on the same card with the same bios. Now I'm waiting for a call back tomorrow with some information as to if they could fix it or if they have to outsource the problem to the main HQ. Idk why I couldnt get anyone to run this test on a previous call but I guess its better late than never. If it isn't a bios issue I was told they will most likely collab with AMD on a driver fix. I'm sure this is a bios problem tho.


Good to know that they at least responded. Does that mean you won't have to RMA your card, then?


----------



## heroxoot

Quote:


> Originally Posted by *cephelix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Just got a call from MSI. They replicated my fan problem finally on the same card with the same bios. Now I'm waiting for a call back tomorrow with some information as to if they could fix it or if they have to outsource the problem to the main HQ. Idk why I couldnt get anyone to run this test on a previous call but I guess its better late than never. If it isn't a bios issue I was told they will most likely collab with AMD on a driver fix. I'm sure this is a bios problem tho.
> 
> 
> 
> gd to know that they at least responded..does tht mean that you wouldn't have to rma your card thn?

Yes, I can keep my card and not RMA it. Flashing is very easy from DOS. So either the BIOS will receive an update or MSI will talk to AMD and help them make it work on this GPU.


----------



## cephelix

Great. Been out of the loop a while; finally ordering my waterblock this weekend. What's the latest stable driver?


----------



## pdasterly

Unigine score is low, but games seem to run fine.
Getting 59 FPS on the 14.6 driver. Cards aren't throttling; max GPU temp is 70°C and VRM1 up to 90°C, but that was OC'd. The OC doesn't help much but does bring minimum FPS up.


----------



## Roy360

Does anyone else get an annoying message like:

"The current input timing is not supported by the monitor display. Please change your input timing to [email protected] or any other....."

I only ever get this on my central monitor, and usually just turn the monitor off and on until it works.

Using a 50ft HDMI cable and an HDMI-to-DVI adapter to connect the monitor. (I'd use a normal cable, but at the moment I can't find a regular DVI one; my room is currently a mess.)


----------



## heroxoot

Quote:


> Originally Posted by *cephelix*
> 
> Great..been out of the loop a while,finally ordering my wb this weekend..whts the latest stable driver??


14.6 is very good, but pushes more heat for many people than 14.4. Not a big issue, because my performance is way better on 14.6 without an OC compared to 14.4 with an OC.


----------



## joeh4384

I snagged an MSI Gaming 290X from the forums here. I have it stuffed in my gaming HTPC, and it certainly heats it up, but so far it's OK.

http://www.techpowerup.com/gpuz/3zgrz/

https://pcpartpicker.com/b/7yGXsY


----------



## cephelix

Quote:


> Originally Posted by *heroxoot*
> 
> 14.6 is very good but pushes more heat for many people than 14.4. Not a big issue because my performance is way better on 14.6 without an OC compared to 14.4 with an OC.


Good to know. I'll have to update Catalyst and find out if there's an updated BIOS for my particular card before I start messing about with overclocking.


----------



## sinnedone

Quote:


> Originally Posted by *ledzepp3*
> 
> What've y'all been using for stability testing? I've always used MSI Kombustor with a variety of tests to stress the memory plus the core.
> 
> -Zepp


I usually use 3DMark Vantage and Valley, then follow that up with a couple of hours of BF4. Benchmark stable is very different from gaming stable.

On my cards I can even get those benchmarks to artifact some and continue upping the MHz, and the scores will still climb, even though they are nowhere near gaming stable.

Takes almost as long as CPU overclocking.


----------



## heroxoot

I'm getting some really good temperature results now that I changed the thermal paste: 75°C max playing BF4 on Ultra, 78°C during Heaven. Heaven squeezes out more heat than a game ever will, though. If the 14.6 driver didn't cause me more heat, I think I'd be around 72°C under load.


----------



## pdasterly

Having a problem with my CrossFire setup.

Ran DDU and reinstalled 14.6 beta.
Enabled CrossFire and set power limit to 50. Eyefinity disabled.
Unigine Heaven 4.0:
FPS 59
Score 1502
Min FPS 26
Max FPS 68


----------



## pdasterly

Sorry, my V-sync was enabled; I was just momentarily sick.

Same settings, Eyefinity enabled:
FPS 106
Score 2691
Min FPS 28
Max FPS 220


----------



## givmedew

Dead card, bad solder.

Has anyone seen this before? This is from HIS; it is most definitely NOT from any abuse of the card. Does anyone think I will have trouble getting this handled through HIS? These pieces fell off from heat.





This last picture is just of the pieces laid on top of where they belong.


----------



## cephelix

That totally sucks, dude... but if there's no wrongdoing on your part, I don't think it should be any trouble. Though I may be wrong.


----------



## Arizonian

Quote:


> Originally Posted by *joeh4384*
> 
> I snagged a MSI gaming 290x from the forums here. I have mine stuffed in my gaming HTPC and it certainly heats it up but so far its ok.
> 
> http://www.techpowerup.com/gpuz/3zgrz/
> 
> https://pcpartpicker.com/b/7yGXsY


Congrats - added


----------



## heroxoot

Quote:


> Originally Posted by *joeh4384*
> 
> I snagged a MSI gaming 290x from the forums here. I have mine stuffed in my gaming HTPC and it certainly heats it up but so far its ok.
> 
> http://www.techpowerup.com/gpuz/3zgrz/
> 
> https://pcpartpicker.com/b/7yGXsY


Are you able to control the fan speed in MSI AB? If so, what BIOS version do you have? There seems to be an issue with fan control on the newer BIOS for the card. I've got MSI looking into whether the problem is driver or BIOS related. Fan control can't be altered on 14.1 or higher for my card.


----------



## kizwan

Quote:


> Originally Posted by *cephelix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> 14.6 is very good but pushes more heat for many people than 14.4. Not a big issue because my performance is way better on 14.6 without an OC compared to 14.4 with an OC.
> 
> 
> 
> Gd to know. Will have to update catalyst and find out if there is an updated bios for my particular card before i start messing about with overclocking

14.4 WHQL vs. 14.6 beta: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/23630#post_22346469

14.6 beta, BF4 1080p 200% res scale Ultra NO AA


----------



## hwoverclkd

Quote:


> Originally Posted by *heroxoot*
> 
> No I have the 4G gaming but there seems to be an update for it too. Its not the same problem. My fans all work I just cant adjust their speed.
> 
> never got the bios. Called MSI and instead they are going to test the same card and see if they can find the issue I'm having. If not I might have to RMA......AGAIN. The guy said if my bios is a different version they will send the new one first to try but if my issue is isolated I will need to RMA.
> 
> Lovely.


Good to see they are willing to troubleshoot on their end. Good luck! Hope you get this all sorted out soon.


----------



## Blue Dragon

Quote:


> Originally Posted by *heroxoot*
> 
> Are you able to control the fan speed in MSI AB? If so what bios version do you have? There seems to be an issue with fan control on the newer bios for the card. I got MSI looking into it to see if the problem is driver or bios related. Fan control is not able to be altered on 14.1 or higher for my card.


Probably has nothing to do with the 290Xs... but my MSI 7870 Hawks got stuck at 50% fan when my fan curve should have had them around 30% at 28°C.
Using AB 3.0.0, I had to hit the reset button a couple of times to get it to go back to default, then reapply my user-defined settings. After several reboots, etc., it did it once more (twice in one day of continuous use). I'm on 13.12 Cat. The only reason I mention it is that while it's stuck at 50% I have absolutely no control, even when I manually move the slider. If I reboot without doing anything, the problem persists. I never had this problem with the previous AB, so I suspect that might be the cause.


----------



## joeh4384

I haven't really had a chance to play around with fan speed yet. I am hoping to get some time to really test it out this weekend. Right now I'm idling at 41°C, and during BF4 I was in the low 80s the one time I really pushed it. I noticed it also runs warm, at 65°C, when it's running TV in Media Center.


----------



## cephelix

Quote:


> Originally Posted by *kizwan*
> 
> 
> 
> 
> 
> 
> 
> 14.4 WHQL vs. 14.6 beta: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/23630#post_22346469
> 
> 14.6 beta, BF4 1080p 200% res scale Ultra NO AA


Thanks a bunch, kizwan... I'm still on the 13-whatever driver.


----------



## TommyGunn123

Quote:


> Originally Posted by *spongeyturtle*
> 
> So this is a little project I had over the weekend- using a closed loop watercooler (an Antec Kuhler 620 in this case) and using it to cool my r9 290...
> 
> took the whole day but I am pleased with the results-
> 
> idle: [email protected] 35C [email protected] [email protected]
> Full Load: [email protected] [email protected] [email protected]
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for looking


Here's the mod that I remember looking at; this is pretty much the Corsair bracket that's coming out. It doesn't have load pics per fan speed, but the temps and such back up the ones Corsair is showing.


----------



## cephelix

How is he keeping his VRMs that cool under load?


----------



## Durvelle27

Is anybody here using a universal GPU block on their R9 290X?


----------



## cephelix

I think there is a thread of someone using a Heatkiller block on his... not sure how much help it would be to you, though.


----------



## heroxoot

Quote:


> Originally Posted by *Blue Dragon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Are you able to control the fan speed in MSI AB? If so what bios version do you have? There seems to be an issue with fan control on the newer bios for the card. I got MSI looking into it to see if the problem is driver or bios related. Fan control is not able to be altered on 14.1 or higher for my card.
> 
> 
> 
> probably has nothing to do with the 290x's... but my MSI 7870 Hawks got stuck at 50% fan when my fan curve should have had them around 30% at 28C...
> using AB 3.0.0 I had to hit the reset button couple of times to get it to go back to default then reapply user define settings. after several reboots, etc... it did it once more (twice in one day of continuous use.) I'm on 13.12 Cat. only reason I mention it is while it's stuck at 50% I have absolutely no control, even when I manually move the slider. if I reboot without doing anything the problem persists. never had this problem w/ previous AB, so I suspect that might be the cause.
Click to expand...

This is most definitely a problem with the BIOS for the 290X, and only the newer BIOS at that. Hopefully they give me good news today.


----------



## Brian18741

Anyone having trouble with GPU-Z? It's making my system lag. I'm running version 0.7.8. Also, when I open a second instance to monitor my second GPU, the sensors aren't showing any readings for the second card.

14.6 drivers.
Asus R9 290 DUII crossfire.
Win7 64bit


----------



## sena

I think I have the worst cards on the forum; for 1100 MHz they need +125 mV to be stable in Crysis 2.


----------



## Brian18741

To answer my own question: ULPS was enabled. It must have re-enabled itself when I updated to 14.6. Disabled it and the problem is resolved!


----------



## sena

Quote:


> Originally Posted by *sena*
> 
> I think I have the worst cards on the forum; for 1100 MHz they need +125 mV to be stable in Crysis 2.


Quote:


> Originally Posted by *Brian18741*
> 
> Anyone having trouble with GPUz? It's making my system lag. Running version 0.7.8. Also when I open a second one to monitor my second GPU, the sensors aren't showing any readings for the second card.
> 
> 14.6 drivers.
> Asus R9 290 DUII crossfire.
> Win7 64bit


Disable ulps.


----------



## koekwau5

Here is mine to enter the club:

http://www.techpowerup.com/gpuz/cwxs2/

Brand and model: MSI R9 290X Gaming 4GB.


----------



## Durvelle27

Quote:


> Originally Posted by *cephelix*
> 
> i think there is a thread of someone using a heatkiller block on his....not sure how much help it would be to you though..


Full or universal


----------



## cephelix

Quote:


> Originally Posted by *Durvelle27*
> 
> Full or universal


Universal.....
Link

Post #6


----------



## dartuil

Hi,
Has anyone tried mixed Eyefinity?
http://www.techpowerup.com/img/14-05-27/120b.jpg
Like this?
I have a 27" and would like to buy two 24" monitors to go Eyefinity.
Do you think I can do that if the 27" is taller?


----------



## Durvelle27

Quote:


> Originally Posted by *dartuil*
> 
> HI,
> Someone tried the mixed eyefinity?
> http://www.techpowerup.com/img/14-05-27/120b.jpg
> LIke this?
> I mean i have a 27 and would like to buy two 24 to go eyefinity.
> U think i can do that if the 27 is taller?


I'll let you know on Monday, as I'll be using 2x 15" and one 21.5".


----------



## Rainmaker91

Quote:


> Originally Posted by *Durvelle27*
> 
> I'll let you know on Monday, as I'll be using 2x 15" and one 21.5".


I would think it would be a bit strange in games, especially if you have the same resolution on screens of different sizes, for example a 24" 1080p monitor beside an 18" 1080p monitor. A more outdated setup, but interesting nonetheless, is something like a single 800x480 or 854x480 monitor in the center and a couple of 480x320 monitors on each side rotated to portrait. That would give you a nice transition between the screens, especially if the 480x320 screens are as wide as the 800x480 screen is tall.

I haven't run an Eyefinity setup yet, so I'm not one to talk, but for desktop use you can more or less put any size screens beside each other and be golden (as I have done in the past). Games, however, work best with fluid motion between all the screens, and you don't want your hand to be smaller on your left and right screens than on the center screen. I do hope you get it working great though.

I'm drooling over a triple 27" 2560x1440 setup myself, as I think a triple setup with the side screens rotated to portrait just looks so cool.
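The "as wide as the other screen is tall" matching above is simple geometry: given a panel's diagonal and resolution, you can recover its physical width, height, and PPI and check how closely two panels will line up. A quick sketch (assuming square pixels; the monitor sizes are just illustrative, not anyone's actual setup):

```python
import math

def panel_dimensions(diagonal_in, res_w, res_h):
    """Return (width_in, height_in, ppi) for a panel, assuming square pixels."""
    diag_px = math.hypot(res_w, res_h)   # diagonal length in pixels
    ppi = diag_px / diagonal_in          # pixels per inch
    return res_w / ppi, res_h / ppi, ppi

# Example: how does a 24" 1080p panel in portrait compare to a 27" 1440p in landscape?
w27, h27, _ = panel_dimensions(27, 2560, 1440)
w24, h24, _ = panel_dimensions(24, 1920, 1080)
print(f'27" 2560x1440 landscape: {w27:.1f}" wide x {h27:.1f}" tall')
print(f'24" 1920x1080 portrait:  {h24:.1f}" wide x {w24:.1f}" tall')
```

Running numbers like these before buying side monitors shows how close (or far) the panel heights will actually be.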


----------



## Durvelle27

Quote:


> Originally Posted by *Rainmaker91*
> 
> would think it would be a bit strange in games, especially if you have the same resolution on them. For Example a 24" 1080p monitor besides a 18" 1080p monitor. A more outdated setrup but interesting non the less is something like a single 800x480 or 854x480 monitor in center and a couple of 480x320 monitors on each side flipped vertically . That would give you a nice transition between the screens, especially if the 480x320 screens are as wide as the 800x480 screen is tall.
> 
> I haven't run an eyfinity setup yet though so I'm not one to talk but for desktop use you can more or less use any size screen besides eachother and be golden (as I have done in the past). Games however is best used with a fluid motion between all the screens, and you don't want your hand to be smaller on your left and right screen then on the center screen. I do hope you get it working great though.
> 
> I'm drooling on a triple 27" 2560x1440 screen setup myself, as I think a triple setup with all screens vertically flipped just looks so cool.


I'll be using

15" - 1280x1024
15" - 1024x768
21.5" - 1366x768
Quote:


> Originally Posted by *cephelix*
> 
> Universal.....
> Link
> 
> Post #6


Thanks, I saw. I wish he were here so I could ask some questions.


----------



## pdasterly

Quote:


> Originally Posted by *Rainmaker91*
> 
> would think it would be a bit strange in games, especially if you have the same resolution on them. For Example a 24" 1080p monitor besides a 18" 1080p monitor. A more outdated setrup but interesting non the less is something like a single 800x480 or 854x480 monitor in center and a couple of 480x320 monitors on each side flipped vertically . That would give you a nice transition between the screens, especially if the 480x320 screens are as wide as the 800x480 screen is tall.
> 
> I haven't run an eyfinity setup yet though so I'm not one to talk but for desktop use you can more or less use any size screen besides eachother and be golden (as I have done in the past). Games however is best used with a fluid motion between all the screens, and you don't want your hand to be smaller on your left and right screen then on the center screen. I do hope you get it working great though.
> 
> I'm drooling on a triple 27" 2560x1440 screen setup myself, as I think a triple setup with all screens vertically flipped just looks so cool.


Be prepared to add another card if you only have 1


----------



## Rainmaker91

Quote:


> Originally Posted by *Durvelle27*
> 
> I'll be using
> 
> 15" - 1280x1024
> 15" - 1024x768
> 21.5" - 1366x768
> Thx i saw. Wish here was here so i could ask some questions


That will be sort of a strange setup, since your screens fit perfectly two by two but clash as a whole (the 1024x768 15" screen rotated to portrait next to the other 15", and the 1024x768 15" beside the 21.5" for a smooth transition of resolution in height). It might work well though, so I do hope you like it.
Quote:


> Originally Posted by *pdasterly*
> 
> Be prepared to add another card if you only have 1


It would likely require a full upgrade of my GPU line, since I'm not qualified to participate in this club yet with my HD 7950. It's just something I have been considering getting for years now, but I am content with my single 24" 1920x1200 screen, so I'm not about to blow my money on that anytime in the near future. I mean, unless I see a sale that is 70-80% off the normal price.


----------



## Durvelle27

Quote:


> Originally Posted by *Rainmaker91*
> 
> That will be sort of a strange setup since two and two of your screens fit perfectly but crash together (the 1024x768 15" screen flipped vertically with the other 15", and the 1024x768 15" besides the 21.5" for a smooth transition of resolution in hight). It might work well though so I do hope you like it.
> It would likely require a full upgrade of my GPU line since I'm not qualified to participate in this club yet with my HD7950. Its just something I have ben considering getting for years now, but I am content with my single 24" 1920x1200 screen so I'm not about to spoil my money on that anytime in the near future. I mean unless I see a sale that is 70-80% off normal price


I'll just see. I'm waiting on my Displayport adapters before i can set it up.


----------



## Spooked25

Hello everyone,

Anyone else having trouble installing the new 14.6 beta drivers? Every time I attempt to install the new beta, it black screens and hard locks. It's starting to do it when I install 14.4 as well. Right now I have the original drivers that came with the card brand new; I forget which version that is. I did a fresh install of Windows 8.1 Pro 64-bit, but that does not help.


----------



## Faster_is_better

Just got my vacuum cleaner.. I mean... r9 290 installed.









http://www.techpowerup.com/gpuz/9kbve/

MSI - reference cooling (for now)

ASIC 72.3%


----------



## Enzarch

Quote:


> Originally Posted by *Durvelle27*
> 
> Anybody here using a Universal GPU block on their R9 290X


I am using an EK Supremacy VGA on my reference 290; ask away.

Quote:


> Originally Posted by *cephelix*
> 
> how is he keeping his vrms that cool under load?


He is using my mod, seen HERE


----------



## blue1512

Quote:


> Originally Posted by *Spooked25*
> 
> Hello everyone,
> 
> Anyone else having trouble installing the new 14.6 beta drivers ? Every time I make an attempt to install the new beta. It will black screen and hard lock. It's starting to do it when I install the 14.4 as well. Right now I have the orignal drivers that came with the card brand new. I forget what driver that is. I did a fresh install of OS 8.1 pro 64 bit. But that does not help.


For multi-card setups, you should disable all but one card (the main one) and then install the driver to avoid the black screen crash. The other cards can be enabled after the installation. The disabling/enabling is done in Device Manager, FYI.


----------



## Draygonn

Just picked up an unused Sapphire 290x off ebay. I'm very excited. Will transfer the Gelid Icy Vision from my 680 as soon as the VRM kit arrives.


----------



## ebhsimon

Quote:


> Originally Posted by *Spooked25*
> 
> Hello everyone,
> 
> Anyone else having trouble installing the new 14.6 beta drivers ? Every time I make an attempt to install the new beta. It will black screen and hard lock. It's starting to do it when I install the 14.4 as well. Right now I have the orignal drivers that came with the card brand new. I forget what driver that is. I did a fresh install of OS 8.1 pro 64 bit. But that does not help.


Disable ULPS (via MSI afterburner if CCC keeps crashing), that's what I had to do.


----------



## Dasboogieman

Woot, some new developments, a little unrelated to the 290.

Just got me an i5-3570K. It's gonna be the last upgrade to future-proof this platform (PCIe 3.0 for future SSDs) before the jump to Devil's Canyon. After some tinkering, just wow. My luck in the silicon lottery is horrible.

My previous i5-2500K (Malay unit) was absolutely rubbish; it needed 1.42 V (1.392 V when adjusted for LLC) to hit 4.5 GHz and absolutely refused to do any higher.

My current i5-3570K (late Costa Rica model) is marginally better at 1.36 V (1.32 V LLC) at 4.6 GHz, and will do 4.7 GHz at 1.42 V (which is apparently equivalent to 1.5 V on Sandy, aka ridiculous).

I honestly have no clue how some people manage 4.5 GHz at 1.22 V or 5 GHz at 1.35 V. Either I'm missing something fundamental about overclocking or those people aren't testing for stability rigorously enough. Mine all did 30 passes of IBT on extreme, 12 hrs of Prime95 AVX, 8 hrs of AIDA64, and 24 hrs of FAH. I even monitor the WHEA logs for parity errors.

And this is all on my trusty NH-D14 cooler too.

Oh well, at least Devil's Canyon is guaranteed 4.4 GHz.

On the topic of the 290, I recently discovered that +13 mV is not stable for 1100 MHz, so I bumped it to +25 mV; FAH was whining about some silent errors.
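As a rough rule of thumb for why those small voltage bumps matter thermally, dynamic power scales roughly with frequency times voltage squared (P ∝ f·V²). A sketch with illustrative numbers (the 947 MHz / 1.200 V baseline is an assumed reference point for a stock 290, not a measured value):

```python
def relative_power(f0, v0, f1, v1):
    """Approximate dynamic power ratio using the rule of thumb P ~ f * V^2."""
    return (f1 * v1 ** 2) / (f0 * v0 ** 2)

# Illustrative: 947 MHz at an assumed 1.200 V vs. 1100 MHz at 1.225 V (+25 mV)
ratio = relative_power(947, 1.200, 1100, 1.225)
print(f"Roughly {(ratio - 1) * 100:.0f}% more dynamic power")
```

Leakage current adds on top of this, so real-world power draw (and VRM heat) tends to climb even faster than the estimate.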


----------



## KnownDragon

Quote:


> Originally Posted by *Dasboogieman*
> 
> Woot, some new developments a little unrelated to the 290
> 
> Just got me an i5-3570k. Gonna be the last upgrade to future proof this platform (PCIe 3.0 for future SSDs) before the jump to Devil's Canyon.
> After some tinkering, just wow. My luck with Silicon lottery is horrible.
> 
> My previous i5-2500k (Malay unit) was absolutely rubbish, needed 1.42 V (1.392 when adjusted for LLC) to hit 4.5ghz. Absolutely refused to do any higher.
> 
> My current i5-3570k (Late Costa Rica model) is marginally better at 1.36V (1.32V LLC) at 4.6ghz, will do 4.7ghz at 1.42 V (which is apparently equivalent to 1.5V on Sandy, aka ridiculous)
> 
> I honestly have no clue how some people manage 4.5ghz at 1.22V or 5Ghz at 1.35V. Either I'm missing something fundamental about overclocking or those people aren't testing for stability rigorously enough. Mine all did 30 passes of IBT on extreme, 12hrs of Prime 95 AVX, 8hrs of AIDA64 and 24hrs of FAH. Even monitor the WHEA logs too for parity errors.
> 
> And this is all on my trusty NH-D14 cooler too.
> 
> Oh well, at least Devils Canyon is guaranteed 4.4ghz.
> 
> On the topic of the 290, recently discovered that +13mV is not stable for 1100mhz so I bumped it to +25mV, FAH was whining about some silent errors.


Sorry to hear about the silicon luck. I would have thought the same about the stability, with some people not testing. Right now I have a 3770K at 4.7 GHz at 1.36 V with LLC set. The i5-3570K I bought my wife is crazy; it's at 4.7 GHz on an EVO 212 with 1.3 V core.

I want to say my R9 290 can push that with no voltage increase and gain performance, but when I do bump the voltage at 1100 MHz it gets a little more performance still. I can't wait until I can get it water blocked. Right now I am trying to decide if I should block it or get a second R9 290 and then wait to block them both. I don't think I will run more than 3 monitors.


----------



## Dasboogieman

Quote:


> Originally Posted by *KnownDragon*
> 
> Sorry to hear about the Silicon luck. I would have that the same about the stability with some people not testing. Right now I have a 4.7 3770k at 1.36 with llc set. The I5 3570k I bought my wife is crazy it is at 4.7 with a evo 212 and volts at 1.3 core.
> 
> I want to say my r9 290 can push that with no voltage increase and gain performance but when I do bump the voltage at the 1100mhz it also gets a little more performance out of it. I can't wait till I can get it water blocked. Right now I am trying to decide if I should block it or get a second r9 290 and then wait to block them both. I don't think I will run more then 3 monitors.


Yeah, now those are numbers I can believe. Chips should only deviate in VID by about ±0.03 V between the best and the worst, aside from the obvious walls which may be present. Any less and I'm skeptical that adequate stability testing was done (since the outlier cases should be really, really rare).


----------



## kizwan

Quote:


> Originally Posted by *KnownDragon*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> Woot, some new developments a little unrelated to the 290
> 
> Just got me an i5-3570k. Gonna be the last upgrade to future proof this platform (PCIe 3.0 for future SSDs) before the jump to Devil's Canyon.
> After some tinkering, just wow. My luck with Silicon lottery is horrible.
> 
> My previous i5-2500k (Malay unit) was absolutely rubbish, needed 1.42 V (1.392 when adjusted for LLC) to hit 4.5ghz. Absolutely refused to do any higher.
> 
> My current i5-3570k (Late Costa Rica model) is marginally better at 1.36V (1.32V LLC) at 4.6ghz, will do 4.7ghz at 1.42 V (which is apparently equivalent to 1.5V on Sandy, aka ridiculous)
> 
> I honestly have no clue how some people manage 4.5ghz at 1.22V or 5Ghz at 1.35V. Either I'm missing something fundamental about overclocking or those people aren't testing for stability rigorously enough. Mine all did 30 passes of IBT on extreme, 12hrs of Prime 95 AVX, 8hrs of AIDA64 and 24hrs of FAH. Even monitor the WHEA logs too for parity errors.
> 
> And this is all on my trusty NH-D14 cooler too.
> 
> Oh well, at least Devils Canyon is guaranteed 4.4ghz.
> 
> On the topic of the 290, recently discovered that +13mV is not stable for 1100mhz so I bumped it to +25mV, FAH was whining about some silent errors.
> 
> 
> 
> 
> 
> 
> Sorry to hear about the Silicon luck. I would have that the same about the stability with some people not testing. Right now I have a 4.7 3770k at 1.36 with llc set. The I5 3570k I bought my wife is crazy it is at 4.7 with a evo 212 and volts at 1.3 core.
> 
> I want to say my r9 290 can push that with no voltage increase and gain performance but when I do bump the voltage at the 1100mhz it also gets a little more performance out of it. I can't wait till I can get it water blocked. Right now I am trying to decide if I should block it or get a second r9 290 and then wait to block them both. I don't think I will run more then 3 monitors.
Click to expand...

How often do you overclock your card? If you always play games at stock clocks, there's no need to get a water block right now. I would get the second card and put them both under water later.


----------



## KnownDragon

I do play at stock clocks for the time being. Here is a 3DMark run at 1100 MHz core with no voltage bump: 1100mhz

That was my same thought process: get the second card, then block them. I was originally thinking the kryographics block with active backplate, but I wonder if the card might sag? However, the price of blocking the card is almost half the price of a second one. I also have to get more radiator capacity for the GPUs, because I use the Maximus V Formula board and cool its VRMs by water.

Also, is it normal for the 3DMark results to say the graphics driver is not approved? I am using the latest beta driver, so it puzzles me...


----------



## alancsalt

Quote:


> Originally Posted by *KnownDragon*
> 
> I do play at stock clocks for the time being. Here is a 3dMark 1100mhz core with no voltage bump. 1100mhz
> 
> That was my same thought process. Get the second card then block them. I was originally thinking the cryographics block with active backplate but I wander if the card might sag? However the price for blocking the card is almost half the price of a second. I have to get more radiator for the gpu's because I use the Maximus V formula board and cool the vrm's by water.
> 
> Also is it normal for the 3dmark results to say graphics driver not approved? I am using the latest beta driver. So it puzzles me....


Beta drivers are never Futuremark-approved, and there is even a lag in approving WHQL drivers.


----------



## KnownDragon

So, in order to go over 1120 MHz on the core I am going to have to give it some voltage. I am going to try 1140 MHz with some voltage just to see if the performance gain is, in my mind, worth it. Right after I get this coffee. Sorry, I've been working so much I haven't had any time to play with this card.


----------



## Rainmaker91

In the last few months I have been building up a small guide for those who wish to use closed-loop coolers on their GPUs. I have managed to gather the most well-known pieces there along with a few less-known ones, but what I really need now is people's experiences with them. So I encourage all who wish to do so to stop by my thread and post your experiences. If you happen to know of solutions that have not been mentioned in the thread, I would be happy to include them as well; just post in the thread and I will add them.

I am aware that not everyone is a big fan of AIO coolers instead of an open loop, but there are people who are, and I made this guide for them. I do hope you will take the time to stop by, and I'm happy to take any constructive criticism and apply it to the thread as well. The guide is for all the users, after all, and I want to offer the best possible help I can to those on the hunt for something other than regular air coolers.

Once again, check it out here, and thank you for your time.


----------



## Durvelle27

Quote:


> Originally Posted by *Enzarch*
> 
> I am using a EK Supremacy VGA on my reference 290; Ask away
> He is using my mod, seen HERE


How are the temps and what sinks did you use on the VRMs


----------



## Widde

Quote:


> Originally Posted by *KnownDragon*
> 
> Sorry to hear about the Silicon luck. I would have that the same about the stability with some people not testing. Right now I have a 4.7 3770k at 1.36 with llc set. The I5 3570k I bought my wife is crazy it is at 4.7 with a evo 212 and volts at 1.3 core.
> 
> I want to say my r9 290 can push that with no voltage increase and gain performance but when I do bump the voltage at the 1100mhz it also gets a little more performance out of it. I can't wait till I can get it water blocked. Right now I am trying to decide if I should block it or get a second r9 290 and then wait to block them both. I don't think I will run more then 3 monitors.


4.5 GHz here at 1.37 V.

4.7 GHz needs 1.46 V.

I want to find a kind soul willing to trade me another 3570K or a 3770K and see if I have better luck.


----------



## Willi

Okay, after two weeks of my R9 290X running rock solid on the PT1 BIOS with no overclock, it has decided it's time to start black-screening on me. I tried reinstalling the 13.12 WHQL drivers and tried 14.4, but it won't even detect the GPU (possibly because of the BIOS). I tried flashing the card back to the original BIOS, and that still doesn't seem to solve the problem.
I boot the PC, get as far as the Windows logo, and then black screen; the keyboard lights shut down. I can see the monitor still has its backlight on, but it stays black.
Mine is a reference Gigabyte REV1 (Gigabyte made some changes to the PCB, it seems). I bought it from a friend and saw it working without any issues before buying. I hope it didn't decide to die on me. It was never overclocked and still uses the stock cooler (although the waterblock is on its way).
I don't know if I should wait for the waterblock or just try to call the previous owner and RMA it. It was purchased in December last year and used for six months with 13.12 WHQL, no problems. A few black screens in January, and then it worked flawlessly. I don't really know what to do.
I might even try the PT2 BIOS once the waterblock arrives, though I don't want to risk frying the card.
Any ideas?


----------



## PachAz

I have a Sapphire Tri-X R9 290 OC; can I just slap in another reference R9 290 and OC it to my first card's settings, or should I get a second Sapphire Tri-X?


----------



## Sgt Bilko

Quote:


> Originally Posted by *PachAz*
> 
> I have a sapphire tri-x r9 290 OC, can I just slap in another reference r9 290 and OC it to my first cards settings, or should I get a second sapphire tri-x?


You can Crossfire a reference model with an AIB model.

I'd just do as you said and oc it to the Tri-X's clocks.


----------



## PachAz

Yeah, I hope the reference card will OC to the Tri-X's moderate factory clocks (1000/1300). I will be using water blocks on both of them.


----------



## Sgt Bilko

Quote:


> Originally Posted by *PachAz*
> 
> Yeah, I hope the reference will OC to the tri-x moderate factory clock (1000/1300). I will be using WB on both of them.


It should be able to do it on stock voltage, especially under water.


----------



## Durvelle27

Quote:


> Originally Posted by *Rainmaker91*
> 
> That will be sort of a strange setup since two and two of your screens fit perfectly but crash together (the 1024x768 15" screen flipped vertically with the other 15", and the 1024x768 15" besides the 21.5" for a smooth transition of resolution in hight). It might work well though so I do hope you like it.
> It would likely require a full upgrade of my GPU line since I'm not qualified to participate in this club yet with my HD7950. Its just something I have ben considering getting for years now, but I am content with my single 24" 1920x1200 screen so I'm not about to spoil my money on that anytime in the near future. I mean unless I see a sale that is 70-80% off normal price


This is how they look



*I used a 17" instead of the 21.5"


----------



## Rainmaker91

Quote:


> Originally Posted by *Durvelle27*
> 
> This is how they look
> 
> 
> 
> *I used a 17" instead on the 21.5"


looks cool, I hope you get it to work exactly how you want to.


----------



## sugarhell

I had this setup 6 years ago or something









----------



## Durvelle27

Quote:


> Originally Posted by *Rainmaker91*
> 
> looks cool, I hope you get it to work exactly how you want to.


I hope so too lol


----------



## b0x3d

Can you update me to having 2x 290x's please?


----------



## Enzarch

Quote:


> Originally Posted by *Enzarch*
> 
> I am using a EK Supremacy VGA on my reference 290; Ask away
> He is using my mod, seen HERE


Quote:


> Originally Posted by *Durvelle27*
> 
> How are the temps and what sinks did you use on the VRMs


Your answers are within the link in my post, lol.


----------



## Durvelle27

Quote:


> Originally Posted by *Enzarch*
> 
> Your answers are within the link in my post , lol


Don't see it


----------



## Arizonian

Quote:


> Originally Posted by *Faster_is_better*
> 
> Just got my vacuum cleaner.. I mean... r9 290 installed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/9kbve/
> 
> MSI - reference cooling (for now)
> 
> ASIC 72.3%


Congrats - added









Quote:


> Originally Posted by *b0x3d*
> 
> Can you update me to having 2x 290x's please?
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated








Quote:


> Originally Posted by *Draygonn*
> 
> Just picked up an unused Sapphire 290x off ebay. I'm very excited. Will transfer the Gelid Icy Vision from my 680 as soon as the VRM kit arrives.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added








Quote:


> Originally Posted by *Rainmaker91*
> 
> In the last few months I have been building up a small guide for those who wish to use closed loop coolers on their GPUs. I have managed to gather the most known pieces there along with a few less known ones, but what I really need now is peoples experiences with them. So I encourage all who wish to do so to stop by my thread and post your experiences. If you would happen to know of some solutions that has not been mentioned in the thread I would be happy to include them as well, just post a post in the thread and I will add it.
> 
> I am aware that not everyone is a big fan of the use of AIO coolers instead of an open loop, but there is people who are and I made this guide for them. I do hope you will take your time to stop by, and I'm happy to take any constructive criticism and apply it to the thread as well. The guide is for all the users after all and I want to offer the best possible help I can for those on the hunt for something other then regular air coolers.
> 
> Once again check it out here, and thank you for your time.


Nice work.

Side note: Did I miss your club submission? I didn't see one.

Also for any members who haven't yet submitted proof of ownership who own a 290 or 290X please join our roster of owners. If you did and I missed you please let me know.

To be added on the member list please submit the following in your post
1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
2. Manufacturer & Brand - Please don't make me guess.
3. Cooling - Stock, Aftermarket or 3rd Party Water

EDIT: *Member count : 385 with 556 GPU's*


----------



## falcon26

OK, I got my MSI 290 installed. I would like to enable dynamic vsync, but I cannot find that anywhere in CCC. Where the heck is that option? Also, this has me a little worried: when gaming with my GTX 780 in BF4 it was butter smooth at 2560x1440, but on the 290 at the exact same settings it's choppy, and I have tried both DX11 and Mantle. Is there something I am missing in CCC to enable or disable? I really want dynamic vsync to work; I loved it on my GTX 780, it worked like a charm...


----------



## Rainmaker91

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - updated
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Nice work.
> 
> Side note: Did I miss your club submission? I didn't see one.
> 
> Also for any members who haven't yet submitted proof of ownership who own a 290 or 290X please join our roster of owners. If you did and I missed you please let me know.
> 
> To be added on the member list please submit the following in your post
> 1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
> 2. Manufacturer & Brand - Please don't make me guess.
> 3. Cooling - Stock, Aftermarket or 3rd Party Water


I started hanging around here when I had saved up some money for a 290X, but it fell through, so I'm stuck with my current 7950, and it works well enough for the moment. I just stuck around, but if the consensus is that no one other than club members should be here, then by all means I will go away. If not, then I figure I might as well be here and follow what happens with Hawaii and how people like it. I hope I'm not intruding.

As for that post, I will admit that I took the liberty of posting it in many of the official threads to get the attention of as many people as possible with experience with the mod.


----------



## Arizonian

Quote:


> Originally Posted by *Rainmaker91*
> 
> I started hanging here when I had saved up some money for a 290x but it fell through so I'm stuck with my current 7950 and it works well enough for the moment. I just stuck around, but if the consensus is that none other then club members are to be here then by all means I will go away. If not then I figure I might as well be here and follow what happens with Hawaii and how people like it. I hope I'm not intruding.
> 
> As for that post, I will have to admit that I took the liberty of posting it in many of the official threads to get the attention of as many as possible with experience about the mod.


Having a 290 / 290X card is not a requirement to hang out in the club; I'm just making sure I didn't miss you is all.









You're more than welcome to hang out and talk GPUs all day. I enjoy reading the thread every day and I don't even own one anymore.


----------



## Durvelle27

What do you guys think of these

http://pages.ebay.com/link/?nav=item.view&id=141264980186&alt=web


----------



## battleaxe

I'm using them to cool the RAM on my 290s. They work fine. They get warm but not hot to the touch, so I guess they are doing the job. I haven't had any trouble with them. They stick on really well too, so that's good; they won't fall off like some thermal pads will.


----------



## Rainmaker91

Quote:


> Originally Posted by *Durvelle27*
> 
> What do you guys think of these
> 
> http://pages.ebay.com/link/?nav=item.view&id=141264980186&alt=web


They look exactly like the Zalman ones that I bought, though those come in packs of eight and cost a lot more. If they're the same, then I would change the thermal tape, since it's not that good; I use Alphacool thermal tape myself, but there is better out there. They do an adequate job of cooling my memory, so you should be fine with those.


----------



## sena

Just when I thought it couldn't get worse, it got worse.

Freeze in Crysis 2 at 1080 MHz +100 mV; damn, damn, damn, damn.

I feel frustrated and disappointed.

Every other game is OK. Crysis 2 is super heavy on GPUs; I recommend it for stability testing.


----------



## CoolRonZ

I guess I paid too much... but still happy.


----------



## Durvelle27

Quote:


> Originally Posted by *Rainmaker91*
> 
> They look exactly like the Zalman ones that I bought. they come in packs of eight though and cost a lot more. If it's the same then I would change the thermal tape since it's not that good, I use Alphacool thermal tape myself but there is better out there. They do an adequate job of cooling my memory so you should be fine with those.


Are those all I need


----------



## rdr09

Quote:


> Originally Posted by *sena*
> 
> Just when I thought it couldn't get any worse, it got worse.
> 
> Froze in Crysis 2 at 1080 MHz +100 mV. Damn, damn, damn, damn.
> 
> I feel frustrated and disappointed.
> 
> Every other game is OK. Crysis 2 is super heavy on GPUs; I recommend it for stability testing.


i just played it at 1100 core on my 290 . . .



About half an hour, no voltage tweaks, just maxed out the power limit using TriXX. I normally play at stock. C2 is looking very much like C3; had to alt-tab to take a screenie.


----------



## heroxoot

Quote:


> Originally Posted by *Durvelle27*
> 
> What do you guys think of these
> 
> http://pages.ebay.com/link/?nav=item.view&id=141264980186&alt=web


I think copper would be better, but these all help nonetheless. Sometimes more than you would expect from a piece of metal with no fan.


----------



## cephelix

Well, the ones I see recommended the most are the Enzotech forged copper heatsinks...
Something like this, but I'm unsure about the sizes used.


----------



## Rainmaker91

Quote:


> Originally Posted by *Durvelle27*
> 
> Are those all I need


They are good to go for memory, and if you cut one of them to size you can use it on VRM2 as well. You will need some others for your main VRMs though, as you need either a long, thin one or several small ones (if you get those, you can use them on VRM2 too). I know both Alphacool and Enzotech offer heatsinks that fit on top of each VRM; the Enzotech are copper and the Alphacool are aluminum. Copper excels when you have low airflow while aluminum is best with high airflow, because copper holds more heat but dissipates it more slowly than aluminum. That is how I see it, but the reality may be different (entirely dependent on whether my source is reliable or not). Either way, aluminum isn't going to cripple you that much.
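For reference, here is a quick sketch with standard handbook values for the two metals. These are bulk properties only; actual heatsink performance depends far more on fin area and airflow than on the base metal:

```python
# Approximate room-temperature handbook values for bulk copper and aluminum.
materials = {
    #            (conductivity W/m-K, density kg/m^3, specific heat J/kg-K)
    "copper":   (401, 8960, 385),
    "aluminum": (237, 2700, 897),
}

for name, (k, rho, cp) in materials.items():
    # Volumetric heat capacity: how much heat a given volume stores per degree.
    vol_heat_cap = rho * cp / 1e6  # MJ per m^3 per K
    print(f"{name:9s} conductivity {k} W/m-K, "
          f"volumetric heat capacity {vol_heat_cap:.2f} MJ/m^3-K")
```

By these numbers copper both conducts faster and stores more heat per unit volume; aluminum's practical advantages are weight and cost.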


----------



## Durvelle27

Quote:


> Originally Posted by *Rainmaker91*
> 
> They are good to go for memory, and if you cut one of them to size you can use it on VRM2 as well. You will need some others for your main VRMs though, as you need either a long, thin one or several small ones (if you get those, you can use them on VRM2 too). I know both Alphacool and Enzotech offer heatsinks that fit on top of each VRM; the Enzotech are copper and the Alphacool are aluminum. Copper excels when you have low airflow while aluminum is best with high airflow, because copper holds more heat but dissipates it more slowly than aluminum. That is how I see it, but the reality may be different (entirely dependent on whether my source is reliable or not). Either way, aluminum isn't going to cripple you that much.


Quote:


> Originally Posted by *Rainmaker91*
> 
> They look exactly like the Zalman ones that I bought. they come in packs of eight though and cost a lot more. If it's the same then I would change the thermal tape since it's not that good, I use Alphacool thermal tape myself but there is better out there. They do an adequate job of cooling my memory so you should be fine with those.


I'll be using 2x 120 mm fans blowing over the card. What size for the VRMs?


----------



## Rainmaker91

Quote:


> Originally Posted by *Durvelle27*
> 
> I'll be using 2x 120MM blowing over the card. What size for the VRMs


These would cover 1 VRM chip each, and you can see the small size. As for that airflow, that will give some massive cooling...

These are the ones I mentioned:
http://www.aquatuning.no/product_info.php/language/en/info/p7046_Enzotech-Mosfet-cooler-MOS-C1---passive.html
http://www.aquatuning.no/product_info.php/language/en/info/p13664_Alphacool-GPU-Heatsinks-7x7mm---black-10-Stk-.html

Just go to the US Site and you will get USD prices.


----------



## Durvelle27

Quote:


> Originally Posted by *Rainmaker91*
> 
> These would be to cover 1 VRM chip each and you can see the small size. As for that air flow, that will give some massive cooling...
> 
> These are the ones I mentioned:
> http://www.aquatuning.no/product_info.php/language/en/info/p7046_Enzotech-Mosfet-cooler-MOS-C1---passive.html
> http://www.aquatuning.no/product_info.php/language/en/info/p13664_Alphacool-GPU-Heatsinks-7x7mm---black-10-Stk-.html
> 
> Just go to the US Site and you will get USD prices.


How many would I need


----------



## Rainmaker91

Quote:


> Originally Posted by *Durvelle27*
> 
> How many would I need


I don't quite know how many VRMs the 290X has, but if you tell me the brand and so on, and whether or not it's a reference card, I can count them for you in a picture







You will need 16 RAM heatsinks though, that much I know.


----------



## Durvelle27

Quote:


> Originally Posted by *Rainmaker91*
> 
> Don't quite know how many VRMs 290X has but if you tell me what brand and so on it is and whether or not it's reference then I can count them for you in a picture
> 
> 
> 
> 
> 
> 
> 
> You will need 16 Ram heatsinks though, that much I know.


I have an original AMD Unbranded Reference R9 290X


----------



## Rainmaker91

Quote:


> Originally Posted by *Durvelle27*
> 
> I have an original AMD Unbranded Reference R9 290X


From what I can see on the reference design, you will need 15 VRM heatsinks: 12 in a row and 3 by themselves right next to the CrossFire contact (or where it used to be).


----------



## Durvelle27

Quote:


> Originally Posted by *Rainmaker91*
> 
> From what I can see on the reference design, you will need 15 VRM heatsinks: 12 in a row and 3 by themselves right next to the CrossFire contact (or where it used to be).


Like these

http://www.frozencpu.com/products/17727/vid-191/Micro_Thermal_Heatsink_-_65mm_x_65mm_x_12mm_for_Motherboard_MOS.html?tl=g40c16s1859


----------



## Rainmaker91

Quote:


> Originally Posted by *Durvelle27*
> 
> Like these
> 
> http://www.frozencpu.com/products/17727/vid-191/Micro_Thermal_Heatsink_-_65mm_x_65mm_x_12mm_for_Motherboard_MOS.html?tl=g40c16s1859


Sure, those would work. Get 16 heatsinks, something like 1.3 cm x 1.3 cm to 1.5 cm x 1.5 cm in size, and you're good to go for the card. I would, however, add a good thermal tape too if the one that comes with them fails to attach properly. Something like this, or one of the 3M ones; I'm not sure which has the best conductivity, since I'm not that familiar with that way of measuring things, but the Akasa seems to be OK. There is better out there with way higher thermal conductivity, but I can't remember the name or where I found it...


----------



## Durvelle27

Quote:


> Originally Posted by *Rainmaker91*
> 
> Sure, those would work. Get 16 heatsinks, something like 1.3 cm x 1.3 cm to 1.5 cm x 1.5 cm in size, and you're good to go for the card. I would, however, add a good thermal tape too if the one that comes with them fails to attach properly. Something like this, or one of the 3M ones; I'm not sure which has the best conductivity, since I'm not that familiar with that way of measuring things, but the Akasa seems to be OK. There is better out there with way higher thermal conductivity, but I can't remember the name or where I found it...


Then I'll look around. Very excited to slip a better cooler on


----------



## Aussiejuggalo

Does PCIe 3.0 do anything for these cards? Gotta change my CPU to a 3570K and I'm just curious

Also got my waterblock. Still missing some parts, but soon this beast will be wet


----------



## pdasterly

How do the 3D application settings work in Catalyst 14.6? Are the profiles loaded automatically, and/or can they be modified?


----------



## KyadCK

Quote:


> Originally Posted by *pdasterly*
> 
> How does the 3d application settings work in catalyst 14.6, are the profiles loaded automatically and/or can they be modified?


Same as every other version of Catalyst?
Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Does PCIE 3.0 do anything for these cards? gotta change my CPU to a 3570k and just curious
> 
> Also got my waterblock but still missing some parts but soon this beast will be wet


Shouldn't for single card.


----------



## Dasboogieman

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Does PCIE 3.0 do anything for these cards? gotta change my CPU to a 3570k and just curious
> 
> Also got my waterblock but still missing some parts but soon this beast will be wet


Probably not very much for gaming, but it'll help tons for OpenCL GPGPU computation. You will basically have double the bandwidth for memory copy operations.
I primarily upgraded from the 2500K to the 3570K so I could have PCIe 3.0 ready for when PCIe SSDs become common. I expect the 2nd or 3rd generation of those drives to saturate a PCIe 2.0 x8 link; this computer will be long obsolete before a PCIe SSD can saturate an x8 3.0 link.
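The rough doubling can be sanity-checked from the per-lane link parameters (theoretical maxima; real copy benchmarks land somewhat lower):

```python
def pcie_bandwidth_gbs(gen, lanes):
    """Theoretical one-direction PCIe bandwidth in GB/s."""
    # (transfer rate in GT/s, line-encoding efficiency)
    specs = {2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}
    rate, eff = specs[gen]
    return rate * eff * lanes / 8  # Gb/s per lane -> GB/s across all lanes

for gen in (2, 3):
    for lanes in (8, 16):
        print(f"PCIe {gen}.0 x{lanes}: {pcie_bandwidth_gbs(gen, lanes):.2f} GB/s")
```

PCIe 2.0 x16 works out to 8 GB/s and PCIe 3.0 x16 to about 15.75 GB/s, so a 3.0 link is very nearly double a 2.0 link at the same lane count (the 8b/10b vs 128b/130b encoding is why it isn't exactly 2x at 5 vs 8 GT/s).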


----------



## Roy360

I just realized this.... why is the R9 295X2 dual slot? With the price tag of 1.5k, couldn't they have gone for a single-slot full-body cover?

AMD put a stupid DVI port on the regular cards, so we can't just switch the I/O shield on the regular R9 290s, but the 295X2 doesn't even have I/O on the 2nd slot.

sigh... the PCIe riser I'm using basically cancels out any benefits of my Xonar STX







I get a stupid buzzing noise during certain high frequencies... maybe I should just go back to onboard.

my motherboard :


ASUS basically crammed all their PCIe slots together....


----------



## Aussiejuggalo

Quote:


> Originally Posted by *KyadCK*
> 
> Shouldn't for single card.


Thanks
Quote:


> Originally Posted by *Dasboogieman*
> 
> probably not very much for gaming but it'll help tons for OpenCL GPGPU computation. You will basically have double the bandwidth for Memory copy operations.
> I primarily upgraded from the 2500k to the 3570k so I could have PCIe 3.0 ready for the future when PCIe SSDs become common. I expect the 2nd or 3rd generation of those drives can saturate a PCIe 2.0 X8 link, this computer will be long obsolete before the PCIE SSD can saturate a x8 3.0 link.


Thanks

I'm only upgrading coz my CPU is dying and I want this rig to last till Skylake


----------



## Sgt Bilko

Quote:


> Originally Posted by *Roy360*
> 
> I just realized this.... why is the R9 295X double slot? With the price tag of 1.5k, couldn't they have gone for a single PCI full body cover?
> 
> AMD put a stupid DVI port on the regular cards, so we can't just switch the I/O shield on the regular R9 290s, but these R295X don't even have I/O on the 2nd slot.
> 
> sigh... the PCIe riser I'm using basically cancels out any benefits of my Xonar STX
> 
> 
> 
> 
> 
> 
> 
> . I get a stupid buzzing noise during certain high freqs.... maybe I should just go back to onboard.


The height of the AIO pump/blocks means it could not have been done as a single-slot card. The Asus Ares III is single slot, but that's only because it's using an EK waterblock.


----------



## pdasterly

Coming from Nvidia; haven't used Catalyst since the ATI All-in-Wonder.


----------



## Roy360

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The height of the AIO Pump/blocks could not have been done as a single slot cards, the Asus Ares III is a single slot but that's only because it's using an EK waterblock.


ooo, so they do have it. but it won't be released until Q3?
source: http://rog.asus.com/323592014/labels/rog-exclusive/rog-at-computex-2014-ares-iii-gr8-crossblade-gladius-gx500-formula-impact-g20-gk2000/

Looks like I have a reason to sell my 3 R9 290s and R9 290x. Is what I would say, if R9 290s weren't selling for so cheap right now. I probably won't be able to cover hte cost of a single R9 295X even if I sold all 4 cards.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Roy360*
> 
> ooo, so they do have it. but it won't be released until Q3?
> source: http://rog.asus.com/323592014/labels/rog-exclusive/rog-at-computex-2014-ares-iii-gr8-crossblade-gladius-gx500-formula-impact-g20-gk2000/
> 
> Looks like I have a reason to sell my 3 R9 290s and R9 290x. Is what I would say, if R9 290s weren't selling for so cheap right now. I probably won't be able to cover hte cost of a single R9 295X even if I sold all 4 cards.


Or save some cash and get a normal 295x2 and slap a single slot EK block on it









As for selling....yeah, it's a buyers market atm


----------



## Rainmaker91

Quote:


> Originally Posted by *Roy360*
> 
> I just realized this.... why is the R9 295X double slot? With the price tag of 1.5k, couldn't they have gone for a single PCI full body cover?
> 
> AMD put a stupid DVI port on the regular cards, so we can't just switch the I/O shield on the regular R9 290s, but these R295X don't even have I/O on the 2nd slot.
> 
> sigh... the PCIe riser I'm using basically cancels out any benefits of my Xonar STX
> 
> 
> 
> 
> 
> 
> 
> . I get a stupid buzzing noise during certain high freqs.... maybe I should just go back to onboard.
> 
> my motherboard :
> 
> 
> ASUS basically crammed all their PCIe slots together....


Good luck with going back to on board audio. I will never revert back to the stoneage again, not since I got my STX a couple of years ago


----------



## Roboyto

Quote:


> Originally Posted by *Durvelle27*
> 
> I have an original AMD Unbranded Reference R9 290X


Gelid makes a heatsink kit for 290(X), I would highly recommend it if you will have the clearance for it. Link for it and temp results in the following:

http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition/20#post_22214255

Just about any heatsink you stick on the RAM will be more than sufficient considering there are plenty of cards that go without heatsinks on the RAM.

For VRM1, if you do decide on something besides the Gelid kit, make sure you are using a good thermal pad; my suggestion is Fujipoly Ultra Extreme. Yes, it is expensive, but it works very well. http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html

Thermal tape has very poor thermal conductance compared to even mediocre pads, and VRM1 gets hot, hot, hot on these cards, especially when you start adding voltage.

VRM2 is a different story; it runs much cooler, so I'm using JunPus thermal tape on it. More information about that in the link above.
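The pad-versus-tape gap is easy to see from the conduction formula R = t / (k * A). The geometry below is made up for illustration; 0.9 W/m-K is a typical rating for generic thermal tape, and 17.0 W/m-K is the conductivity in the linked Fujipoly listing:

```python
def interface_resistance(thickness_m, k_w_mk, area_m2):
    """Thermal resistance of a pad or tape layer in K/W: R = t / (k * A)."""
    return thickness_m / (k_w_mk * area_m2)

# Illustrative geometry: a 1 mm layer over a 10 mm x 10 mm MOSFET footprint.
t, area = 1e-3, 10e-3 * 10e-3
for name, k in [("generic thermal tape, ~0.9 W/m-K", 0.9),
                ("Fujipoly Ultra Extreme, 17.0 W/m-K", 17.0)]:
    r = interface_resistance(t, k, area)
    print(f"{name}: {r:.2f} K/W -> {r * 5:.1f} C rise at 5 W")
```

At the same thickness and contact area, the interface resistance scales inversely with conductivity, so a low-k tape adds tens of degrees across the joint where a good pad adds only a few.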


----------



## Durvelle27

Quote:


> Originally Posted by *Roboyto*
> 
> Gelid makes a heatsink kit for 290(X), I would highly recommend it if you will have the clearance for it. Link for it and temp results in the following:
> 
> http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition/20#post_22214255
> 
> Just about any heatsink you stick on the RAM will be more than sufficient considering there are plenty of cards that go without heatsinks on the RAM.
> 
> For VRM1 if you do decide for something besides the Gelid kit, make sure you are using a good thermal pad; my sugesstion is FujiPoly Ultra Extreme. Yes it is expensive, but it does work very well. http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html
> 
> Thermal tape has very poor thermal conductance compared to even mediocre pads and VRM1 gets hot, hot, hot on these cards especially when you start adding voltage.
> 
> VRM2 is a different story, it runs much cooler, as I'm using JunPus thermal tape; more information about that in the link above.


Wow thanks. Much more reasonably priced and what I need.


----------






## KyadCK

Quote:


> Originally Posted by *Roy360*
> 
> I just realized this.... why is the R9 295X double slot? With the price tag of 1.5k, *couldn't they have gone for a single PCI full body cover?*
> 
> AMD put a stupid DVI port on the regular cards, so we can't just switch the I/O shield on the regular R9 290s, but these R295X don't even have I/O on the 2nd slot.
> 
> sigh... the PCIe riser I'm using basically cancels out any benefits of my Xonar STX
> 
> 
> 
> 
> 
> 
> 
> . I get a stupid buzzing noise during certain high freqs.... maybe I should just go back to onboard.
> 
> my motherboard :
> 
> ASUS basically crammed all their PCIe slots together....


No, because not everyone who wants a 295X2 has a loop?

Pumps take space.
Quote:


> Originally Posted by *pdasterly*
> 
> Coming from Nvidia; haven't used Catalyst since the ATI All-in-Wonder.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *KyadCK*


I was looking for that every where a few weeks ago


----------



## pdasterly

OK, let me clarify: are the game profiles loaded automatically, or do I have to add one for each game?


----------



## KyadCK

Quote:


> Originally Posted by *pdasterly*
> 
> Ok let me clarify, are the game profiles automatically loaded or do I have to add for each game


They have defaults. You "add" them to customize.


----------



## steadly2004

Just got a higher 3DMark 11 score with my new-to-me 290X than I could get maxed out on my 780 Ti, lol







This is with my i7-3820 @ 4.7 GHz and my memory @ 2333


----------



## pdasterly

Quote:


> Originally Posted by *KyadCK*
> 
> They have defaults. You "add" them to customize.


Should I add each game? I'm playing Rome: Total War and FPS is low. I added it to the list and there is a profile for it in settings, but it seems like it didn't load until I added it manually. The game's benchmark averages 16 FPS, and 25 FPS when I load the profile. Eyefinity resolution, 5780x1080 or something like that.


----------



## VSG

24000th reply! Who would have thought?


----------



## pdasterly

Quote:


> Originally Posted by *geggeg*
> 
> 24000th reply! Who would have thought?


Respectable. The 290X is like a Camaro, aka the poor man's Vette: why buy a Vette when the Camaro offers similar performance for about half the cost? Why buy a GTX 780 Ti when the 290X is about half the cost, lol. It might go around a few corners a little faster, but the latter isn't behind by much. I love cars lol


----------



## Roy360

this is giving me ideas










but I'm guessing putting the DVI back will be next to impossible....


----------



## pdasterly

Any testing of the 290X with the Corsair HG10? Interested in VRM1 temps with a high overclock.


----------



## VSG

It isn't even out for retail yet, still not a finished product. So you will have to wait a while.


----------



## pdasterly

Posted this on another thread; just looking for the best solution.

Using the Kraken G10 bracket.

Going to buy a backplate for the bracket; my cards are flexing. Tried everything. I have the machine laying on its side so the cards are vertical, unscrewed the 4 nuts holding the bracket to the card, measured front and rear; no matter how loosely I tighten the nuts, the card flexes and sags in the rear. I read another post where a person mounted the Heatkiller backplate to the card with the G10 with minimal modifications. Since it has been done, which backplate would you go with: Heatkiller or Bitspower? Open to any other suggestions. I have a reference Sapphire 290X. Also, if any offer additional VRM cooling, that would be a bonus.


----------



## motokill36

Hi all
Just checked max watts on my wall plug meter and it says 902 watts.
Is this right?
When gaming I've been checking and it looks around the 780 W mark.
Could it really peak that high?
PSU is a 950 W Corsair unit,
290s in CrossFire.


----------



## Red1776

Quote:


> Originally Posted by *motokill36*
> 
> Hi all
> Just checked max Watts on wall plug meter and it says 902 Watts
> Is this right ?
> When gaming iv been checking and looks around 780w mark
> Could it really peak that high ?
> Psu is 950 corsair unit
> 290 in crossfire


It is, I have pulled 1835w myself.
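For a rough sanity check on readings like these, you can work backwards from assumed component loads (the figures below are ballpark guesses, not measurements). A wall meter sits upstream of the PSU, so the DC load gets divided by the PSU's efficiency:

```python
def wall_draw_w(dc_load_w, psu_efficiency):
    """A wall meter reads the DC load plus the PSU's conversion losses."""
    return dc_load_w / psu_efficiency

# Assumed ballpark loads for a 290 CrossFire rig, not measured values:
dc_load = 2 * 300 + 150 + 75  # two OC'd 290s + OC'd CPU + rest of the system
print(f"~{wall_draw_w(dc_load, 0.90):.0f} W at the wall")  # ~917 W
```

So a ~900 W wall reading implies roughly 810 W of actual DC load, which is within a 950 W PSU's rating but leaves little headroom once both cards are overvolted.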


----------



## MeynardMayhem

So I'm about to get a 290 with a waterblock already installed. Excited to get a significant step up from my old GT 640!


----------



## aneutralname

Quote:


> Originally Posted by *Roboyto*
> 
> [*] *Keep an eye on VRM1 temps, <90C is where you want to be for VRM1*


Interesting that my 290 Tri-X OC reaches 89C on VRM1 just from mining for 5 hours with stock settings. Does this mean there is virtually no safe overclocking headroom without modding?


----------



## rdr09

Quote:


> Originally Posted by *aneutralname*
> 
> Interesting that my 290 Tri-X OC reaches 89C on VRM1 just from mining for 5 hours with stock settings. Does this mean there is virtually no safe overclocking headroom without modding?


You left the fans on auto? OC'ing may require its own fan curve or, better yet, setting them in manual mode, like 55-65%, depending on the amount of OC and other stuff like ambient.
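A manual fan curve is just piecewise-linear interpolation between (temperature, fan %) breakpoints. A minimal sketch, with made-up breakpoints rather than recommended settings:

```python
def fan_percent(temp_c, curve):
    """Piecewise-linear fan curve; curve is a sorted list of (temp C, fan %)."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, p0), (t1, p1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # Linear interpolation between the two surrounding breakpoints
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)

# Hypothetical breakpoints, not recommended settings -- tune to your card:
curve = [(40, 30), (60, 45), (75, 60), (85, 80), (95, 100)]
print(fan_percent(70, curve))  # 55.0
```

Tools like MSI Afterburner or TriXX apply the same idea: you drag the breakpoints and the software interpolates between them.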


----------



## ebhsimon

Quote:


> Originally Posted by *aneutralname*
> 
> Interesting that my 290 Tri-X OC reaches 89C on VRM1 just from mining for 5 hours with stock settings. Does this mean there is virtually no safe overclocking headroom without modding?


What %fan speed did it get up to, and what is the ambient temperature in the room? What did the core temp get up to?

I feel that mining might be more intensive than games, which is what people usually overclock for. I know a lot of people undervolt gpus when mining to minimise heat output and power consumption.

Not a direct comparison, but I'm going to run 2 instances of kombustor for a few hours and report back my core and VRM1 temperatures. I'll be using fan speeds on auto (but I'll include that in the post when I report back).


----------



## motokill36

Quote:


> Originally Posted by *Red1776*
> 
> It is, I have pulled 1835w myself.


Just saw 988 W with 1200 on the cores.
Think I'll call it a day there, as the PSU is 950, lol.


----------



## aneutralname

Quote:


> Originally Posted by *rdr09*
> 
> you left the fans in auto? oc'ing may require its own fan curve or, better yet, setting them in manual mode like 55 - 65% depending on the amount of oc and other stuff like ambient.


Quote:


> Originally Posted by *ebhsimon*
> 
> What %fan speed did it get up to, and what is the ambient temperature in the room? What did the core temp get up to?
> 
> I feel that mining might be more intensive than games, which is what people usually overclock for. I know a lot of people undervolt gpus when mining to minimise heat output and power consumption.
> 
> Not a direct comparison, but I'm going to run 2 instances of kombustor for a few hours and report back my core and VRM1 temperatures. I'll be using fan speeds on auto (but I'll include that in the post when I report back).


Yeah, fully stock settings. Ambient temp is about 25 degrees (I have no AC). The fan only got up to 49% with VRM1 at 89C, so I can think of 2 possibilities:

1) Sapphire calibrates fans poorly
2) Going above 90 in VRMs is not too dangerous after all

I guess I could set a more aggressive fan curve, but I was looking for an overclock that wasn't too loud.

Also, increasing the speed of my chassis fan (the side one) will bring down VRM1 temps by 5C or so.


----------



## rdr09

Quote:


> Originally Posted by *aneutralname*
> 
> Yeah, fully stock settings. Ambient temp is about 25 degrees (I have no AC). The fan only got up to 49% with the VRM1 at 89C so I can think of 2 possibilities
> 
> 1) Sapphire calibrates fans poorly
> 2) Going above 90 in VRMs is not too dangerous after all
> 
> I guess I could set a more aggressive fan curve, but I was looking for an overclock that wasn't too loud.
> 
> Also increasing the speed of my chasis fan (side one) will bring down VRM1 temps by 5C or so.


What was the core's temp? I am just guessing . . . it could be that they calibrate the fan RPM based on the core's temp. VRM1, like you said, can handle more. Nothing better than water for Hawaii, or any GPU I guess.


----------



## killbom

Anyone tried the Kraken G10 or similar on their cards? How are temperatures?


----------



## aneutralname

Quote:


> Originally Posted by *rdr09*
> 
> what was the core's temp? i am just guessing . . . it could be that they calibrate the fan rpm based on the core's temp. vrm1, like you said, can handle more. nothing better than water for Hawaii or any gpu i guess.


About 79 as I remember. Usually I get a 10C difference between core temp and VRM1 temp when mining with stock settings, unless I increase the speed of my side chassis fan, which affects the VRM1 temps more.


----------



## ebhsimon

Quote:


> Originally Posted by *aneutralname*
> 
> Yeah, fully stock settings. Ambient temp is about 25 degrees (I have no AC). The fan only got up to 49% with the VRM1 at 89C so I can think of 2 possibilities
> 
> 1) Sapphire calibrates fans poorly
> 2) Going above 90 in VRMs is not too dangerous after all
> 
> I guess I could set a more aggressive fan curve, but I was looking for an overclock that wasn't too loud.
> 
> Also increasing the speed of my chasis fan (side one) will bring down VRM1 temps by 5C or so.


I'm about *50 minutes in* (I'll keep it running until it hits 2 hours probably), and my 290 (vapor x) has been at 100% load the whole way running stock (1030/1400 vs your 1000/1300).

*My gpu is at 66C core, 33% fan speed, VRM1 61C,VRM2 59C, VDDC Power In: 200W, VDDC Power out: 170W.*

Temperatures haven't budged for a while. Is your case in a position for you to remove your side panel? Mine's on my desk so I've just removed both side panels so it's got pretty much all the fresh air it needs, and the open air cooler naturally blows air away from it. I only have one case fan turned on (the exhaust to suck the hot cpu air out) but I could probably get away with turning that off if I wanted.

Also a benefit of this set up is that it's extremely quiet.


----------



## rdr09

Quote:


> Originally Posted by *aneutralname*
> 
> About 79 as I remember. Usually I get a 10C difference between core temp and vrm1 temp when mining with stock settings, unless I increase the speed of my side chasis fan, which affects more the VRM1 temps.


Like eb said, try with the side panel off. Aren't the fans supposed to suck air in and push it out the top of the GPU? If so, then the fan on the side panel might be opposing the GPU fans.


----------



## aneutralname

Just a general question: let's say you are going on a trip and want to leave your computer mining. Would there be any negatives to setting the fan at a steady 100%? Do fans wear out with time, or is noise the only reason they are set as low as possible?


----------



## ebhsimon

Quote:


> Originally Posted by *aneutralname*
> 
> Just a general question: let's say that you are going on a trip and want to leave your computer mining, would there be any negatives to setting the fan at a steady 100%? Do fans wear out with time or is it only a matter of noise why they are set as low as possible?


I don't see a problem with doing that. Plenty of miners do that (although not all leave it on 100%, but 60, or 70%). They leave it on 24/7 with no air filters or anything, just in an open crate.
The only bad thing I can think of is maybe if something somehow gets in the fan and breaks the fan since it's spinning at full speed.

I'd test it for 24 hours first to make sure the temperatures when stable aren't at a level you're uncomfortable with. Use GPU-Z to monitor your temperatures (you can see current, max, min and avg readings).


----------



## aneutralname

Quote:


> Originally Posted by *ebhsimon*
> 
> I'm about *50 minutes in* (I'll keep it running until it hits 2 hours probably), and my 290 (vapor x) has been at 100% load the whole way running stock (1030/1400 vs your 1000/1300).
> 
> *My gpu is at 66C core, 33% fan speed, VRM1 61C,VRM2 59C, VDDC Power In: 200W, VDDC Power out: 170W.*
> 
> Temperatures haven't budged for a while. Is your case in a position for you to remove your side panel? Mine's on my desk so I've just removed both side panels so it's got pretty much all the fresh air it needs, and the open air cooler naturally blows air away from it. I only have one case fan turned on (the exhaust to suck the hot cpu air out) but I could probably get away with turning that off if I wanted.
> 
> Also a benefit of this set up is that it's extremely quiet.


If I removed my side panel, the system would be much louder, never mind that it would get a lot of dust. I don't think it's sustainable in the long run. Interesting that you get lower VRM1 temps than GPU temps.


----------



## ebhsimon

Quote:


> Originally Posted by *aneutralname*
> 
> If I removed my side panel, the system would be much louder never mind it would get a lot of dust. I don't think it's sustainable in the long run. Interesting that you get lower VRM1 temps than GPU temps.


Elevate the computer if needed. But if you're uncomfortable with that, it's not necessary at all. Is your side panel fan intake or exhaust? It should be exhaust.

Interestingly enough when I OC (1180/1550) my VRM1 temps go higher than my GPU temps.


----------



## rdr09

Quote:


> Originally Posted by *ebhsimon*
> 
> Elevate the computer if needed. But if you're uncomfortable with that, it's not necessary at all. Is your side panel fan intake or exhaust? It should be exhaust.
> 
> Interestingly enough when I OC (1180/1550) my VRM1 temps go higher than my GPU temps.


That is pretty normal for air-cooled, where the core is in direct contact with the bulk of the heatsink. Watercooled . . . the core will be higher.

Like I said earlier to anue, the fan RPM could be calibrated to the core temp.


----------



## aneutralname

Quote:


> Originally Posted by *ebhsimon*
> 
> Elevate the computer if needed. But if you're uncomfortable with that, it's not necessary at all. Is your side panel fan intake or exhaust? It should be exhaust.
> 
> Interestingly enough when I OC (1180/1550) my VRM1 temps go higher than my GPU temps.


Why do you say exhaust? Most people seem to set side panel fans as intake. I think that fan does a good job; it cools down VRM1 by 5C or so, and the air hits it more or less directly.

I tried removing that side panel and I get an 85C VRM1 temp while mining, against the usual 88/87. I think VRM1 actually runs cooler with the side panel fan at max than with the side panel taken out.







(_edit_ confirmed: it does run cooler 84 vs 85).

Undervolting did make a difference. I can run my card with -31 mV at the same frequencies resulting in a VRM temperature of 84.









Undervolting + side panel fan at max= 81C VRM1.


----------



## ebhsimon

Quote:


> Originally Posted by *aneutralname*
> 
> Why do you say exhaust? Most people seem to set the side panel fans as intake. I think that fan does a good job, it cools down the VRM1 by 5C or so, the air hits it more or less directly.
> 
> I tried removing that side panel and I get 85 VRM1 temp while mining against 88/87 as usual. I think VRM1 actually runs cooler with the side panel fan at max than taking out the side panel.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Undervolting did make a difference. I can run my card with -31 mV at the same frequencies resulting in a VRM temperature of 84.


An open air cooler sucks air from the bottom and pushes hot air out towards the side panel. While turning the side panel fan to max might make it cooler, that's because you're feeding it cool air. But you're also opposing the hot air that is being pushed out of it. Usually in a case with good airflow the side panel should be exhaust to take the heat from the gpu directly out of the case.

With minimal airflow, the hot air just swirls around inside the case and gets recycled, and that's what makes your components hotter (mainly GPU and CPU). If your GPU temps are lower with the side panel fan as an intake rather than an exhaust, that means your GPU isn't getting enough cool air (which you can fix by using an intake on the bottom of the case).

But if you don't have a spare fan, then I'd take an intake from somewhere else in the case and put it as a bottom intake - since you're mining, you'd want to prioritise the cooling of your GPU over everything else.

And an update on running kombustor at 1 hour 50 minutes, gpu core is 70C, vrm1 68C and 37% fan speed. Ambient temperature of 21C.


----------



## Rainmaker91

Quote:


> Originally Posted by *pdasterly*
> 
> Posted this on another thread, just looking for best solution
> 
> Using kraken g10 bracket
> 
> Going to buy a backplate for the bracket; my cards are flexing. Tried everything. I have the machine lying on its side so the cards are vertical, unscrewed the 4 nuts holding the bracket to the card, and measured front and rear; no matter how loosely I tighten the nuts, the card flexes and sags in the rear. I read another post where a person mounted the Heatkiller backplate to the card with the G10 with minimal modifications. Since it has been done, which backplate would you go with: Heatkiller or Bitspower? Open to any other suggestions. I have a reference Sapphire 290X. Also, if any offer additional VRM cooling, that would be a bonus.


You should check out my thread, as I address this issue there. I'm just using a backplate on my card, but I stabilized it by fitting my custom-made full-cover bracket to the PCI cover. I also ordered the Bitspower VGA support, so I will be able to give some opinions on that way of stabilizing your card later on. For the 290 and 290X, EKWB has a few reinforcers you should check out. It's all in my thread though, so swing by and take a look.
Quote:


> Originally Posted by *killbom*
> 
> Anyone tried the Kraken G10 or similar on their cards? How are temperatures?


You should check out The Red Mod as that is exactly what they do.

Edit: added another comment so to not double post


----------



## doginpants12

Can I be added to the club? Here is my verification, I am running a XFX Core Edition R9 290 on water.


----------



## pdasterly

Quote:


> Originally Posted by *Rainmaker91*
> 
> You should check out my thread as I address this issue there. I'm just using a backplate on my card but I stabilized it by witting my custom made all cover fitting bracket to the PCI cover. I also ordered myself the Bitspower support for VGA so I will be able to give some opinions on that way of stabilizing your card later on. For 290 and 290x EKWB has a few reinforcer that you should check out. It's all stated in my thread though so swing by and take a look.
> You should check out The Red Mod as that is exactly what they do.
> 
> Edit: added another comment so to not double post


What are your temps when OC'd?
I ordered two Heatkiller backplates.


----------



## Silent Scone

Must ask EK if they plan on doing a block for the 8gb Vapor.


----------



## Rainmaker91

Quote:


> Originally Posted by *pdasterly*
> 
> What are your temps when oc'd?
> I ordered two heatkiller backplates


The temps on my 7950 at 1210MHz @ 1.3V, memory at 1500MHz @ 1.6V, are about 50°C for the core, and the VRMs get into the 70s after hours of gameplay. They stay stable at 50-60 during a 30-minute Valley run though.

Edit: This mod depends on what cooler you use though, as my H105 will be among the better ones. If you want something other than the G10, you can wait for Fractal Design to release theirs and use Richie's universal mounting kit to fit them. They will likely crush everything in the AIO market at the moment, and they are expandable in the future if you want, since they use G1/4 threads.

Edit 2: Forgot to re-read your post - you already have the G10.


----------



## pdasterly

I have the G10 with an H75 AIO.


----------



## Roboyto

Quote:


> Originally Posted by *aneutralname*
> 
> Interesting that my 290 Tri-X OC reaches 89C on VRM1 just from mining for 5 hours at stock settings. Does this mean there is virtually no safe overclocking headroom without modding?


You are mining, so temps should be expected to be a little on the high side, since you're likely pushing the card to (near) max usage for extended periods of time.

The VRMs are designed to withstand temperatures well above 90C - more in the realm of ~120C. However, the general consensus in this thread seems to be that it is in the best interest of the health and longevity of your card to keep it under that mark. It is the same scenario as with your CPU or GPU: they have an operating threshold, and it is usually not the best idea to run them at their maximum operating temperature. Some may disagree and say "let it burn" because it's designed to do so, but it is simply not in the best interest of the stability, longevity or efficiency of the GPU as a whole for the VRMs to run at exceedingly high temperatures all the time.

Air cooling these cards requires doing everything you can to get a nice breeze of fresh air to the GPU, especially in/around the VRM1 area. If you haven't done so already, I would reposition or add fans as necessary to give your GPU as much air as possible.
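If you want to keep an eye on that mark over a long mining session, HWiNFO can log its sensors to a CSV file, and a few lines of script can flag excursions afterwards. A minimal sketch, assuming a hypothetical log layout; the `VRM1` column name is an assumption, so adjust it to whatever your logger actually writes:

```python
import csv
import io

def vrm_excursions(log_text: str, column: str = "VRM1", limit: float = 90.0):
    """Return (row_index, temp) pairs where the VRM column exceeds limit.

    The column name and CSV layout are assumptions about the sensor log;
    rename `column` to match your own logger's header.
    """
    reader = csv.DictReader(io.StringIO(log_text))
    hits = []
    for i, row in enumerate(reader):
        try:
            temp = float(row[column])
        except (KeyError, ValueError):
            continue  # skip malformed rows rather than abort a long log
        if temp > limit:
            hits.append((i, temp))
    return hits

# Inline sample standing in for a real sensor log:
sample = "Time,GPU,VRM1\n00:01,71,84\n00:02,72,91.5\n00:03,70,88\n"
print(vrm_excursions(sample))  # prints [(1, 91.5)]
```

Point it at a real log instead of the inline sample and you get a quick list of when, and by how much, the VRMs went over the line.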


----------



## Roboyto

Quote:


> Originally Posted by *killbom*
> 
> Anyone tried the Kraken G10 or similar on their cards? How are temperatures?


http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition/20#post_22214255

Kraken + Gelid 290(X) heatsink kit on reference 290 in a little Corsair 250D.


----------



## motokill36

Hi All

I'm having real trouble getting the driver to install after a crash while benching Unigine Valley.

Got stuck in a blue screen loop on boot up.
Managed to start in safe mode and uninstall the driver, but once the driver is reinstalled the same thing happens - just a blue screen loop.

Any ideas would be great.


----------



## Faster_is_better

Is there some incompatibility with the latest drivers and MSI Kombustor? When I try to run any of the kombustor tests the program just stops responding and gives an error. I'm running latest version of Afterburner as well.


----------



## aneutralname

Quote:


> Originally Posted by *Faster_is_better*
> 
> Is there some incompatibility with the latest drivers and MSI Kombustor? When I try to run any of the kombustor tests the program just stops responding and gives an error. I'm running latest version of Afterburner as well.


Works fine for me.


----------



## aneutralname

Is there any app that will show VRM temperatures on screen while you are gaming? I couldn't find any so far.


----------



## sena

Quote:


> Originally Posted by *aneutralname*
> 
> IS there any app that will show you VRM temperatures on screen when you are gaming? I couldn't find any so far.


nope


----------



## Faster_is_better

Quote:


> Originally Posted by *aneutralname*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Faster_is_better*
> 
> Is there some incompatibility with the latest drivers and MSI Kombustor? When I try to run any of the kombustor tests the program just stops responding and gives an error. I'm running latest version of Afterburner as well.
> 
> 
> 
> Works fine for me.
Click to expand...

Which version of Kombustor are you using? I gather there is an x64 3.0 version and a legacy 2.x version available.


----------



## heroxoot

Don't use Kombustor/Furmark. All they do is generate heat. It's best to test a GPU by playing games and using benchmarks like Heaven. Mostly playing games though.


----------



## Roboyto

Quote:



> Originally Posted by *motokill36*
> 
> Hi All
> 
> I'm having real trouble getting the driver to install after a crash while benching Unigine Valley.
> 
> Got stuck in a blue screen loop on boot up.
> Managed to start in safe mode and uninstall the driver, but once the driver is reinstalled the same thing happens - just a blue screen loop.
> 
> Any ideas would be great.


Were you overclocking with AMD OverDrive? I'm not sure if this issue has been resolved, but previously it would automatically reapply your last settings even if your computer crashed. Since CCC loads on boot, it could be a real pain.

Otherwise be certain you remove all traces of previous drivers with something like Display Driver Uninstaller, or AMD/Nvidia removal tools.

13.12 or 14.6 are my driver suggestions. Anything 14.x-14.4 gave me problems on Win7 Ultimate so I stuck with 13.12.

8.1 Pro has been very good with 14.6 so far

Quote:


> Originally Posted by *aneutralname*
> 
> IS there any app that will show you VRM temperatures on screen when you are gaming? I couldn't find any so far.


There certainly is. You need to run HWiNFO with RTSS (RivaTuner Statistics Server), and you can display all kinds of information on screen. I have mine set to display CPU/GPU loads, RAM/VRAM usage, and CPU/GPU/VRM temperatures. Download Afterburner and RTSS is checked to install by default with it; they must be installed together. It takes a little finagling to set up, but it is worth it. If you have any questions, feel free to ask.

Hard for me to get a good pic of my monitor with my phone, but this will give you the idea:



I didn't get all the stats in the pic but here you can see:

CPU Usage - Core1 - Core2 - Core3

GPU - GPU Usage - VRM1 - VRM2

FPS


----------



## Faster_is_better

Quote:


> Originally Posted by *heroxoot*
> 
> Don't use Kombuster/Furmark. All they do is cause heat. It's best to test a GPU by playing games and using benchmarks like Heaven. Mostly playing games tho.


It was a decent test for artifacts too, though; since it was extra tough, it could show weakness in an OC really quickly. I usually only ran it for 2-10 minutes just to test the upper limits of an OC - faster than loading up any games or other benchmarks.

I will use games/other benchmark utilities for final stability testing.


----------



## Roboyto

Quote:


> Originally Posted by *Faster_is_better*
> 
> Is there some incompatibility with the latest drivers and MSI Kombustor? When I try to run any of the kombustor tests the program just stops responding and gives an error. I'm running latest version of Afterburner as well.


Quote:


> Originally Posted by *heroxoot*
> 
> Don't use Kombuster/Furmark. All they do is cause heat. It's best to test a GPU by playing games and using benchmarks like Heaven. Mostly playing games tho.


@heroxoot is right. Those programs put a synthetic load on the card, producing higher-than-normal load temperatures. They are better as a cooling test than a stability test, since all you have is a swaying furry object, which doesn't replicate what happens in a game. Benchmarks are good stability tests, but they aren't foolproof; sometimes you need to tweak a little for gaming even after extensive benchmark testing.

My last test for GPU OC stability is the FFXIV benchmark. I've come to find that if it runs smoothly through that bench, then it will likely run smoothly on just about anything else; it is quite brutal on the GPU, and it looks glorious.









Quote:


> Originally Posted by *Faster_is_better*
> 
> It was a decent test for artifacts too though, since it was extra tough it could show weakness in an OC really quick, I only usually ran it for 2-10 minutes just to test upper limits of OC. Faster than loading up any games or other benchmarks.
> 
> I will use games/other benchmark utilities for final stability testing.


You can fit a couple of 3DMark or Unigine runs in 10 minutes, depending on which you choose. If you want lengthy stability testing, run Unigine Heaven/Valley in a loop and see where you end up.


----------



## aneutralname

Quote:


> Originally Posted by *Faster_is_better*
> 
> Which version of Kombustor are you using? I guess there is a x64 3.0 version and a legacy 2.x version available.


2.5.6 Build 14 Nov 2013 11:56:31


----------



## motokill36

Thanks, I was using AB on the 14.6 driver.
That driver won't install now at all, even when I use the uninstallation tool.
Managed to get 13.12 working now though.


----------



## bluedevil

In case anyone else is like me and thinks the G10 and all the other adapter plates are stupid: the CM Seidon 120M/120XL and 240M all bolt up to the 290/X. Four 3mm x 20mm bolts with nylon washers and nuts. Done. If you want additional cooling on the VRMs, a spot-cooling fan with mini heatsinks works very well.


----------



## Yvese

Quote:


> Originally Posted by *aneutralname*
> 
> IS there any app that will show you VRM temperatures on screen when you are gaming? I couldn't find any so far.


IMO, the best thing for that is to get a 2nd monitor and download the HWiNFO gadget. This is what mine looks like on my 2nd screen:



You can customize it any way you want, and you won't have a bunch of numbers on the screen you're gaming on. This is just one of the many great things about using more than one monitor.


----------



## heroxoot

Quote:


> Originally Posted by *Faster_is_better*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Don't use Kombuster/Furmark. All they do is cause heat. It's best to test a GPU by playing games and using benchmarks like Heaven. Mostly playing games tho.
> 
> 
> 
> It was a decent test for artifacts too though, since it was extra tough it could show weakness in an OC really quick, I only usually ran it for 2-10 minutes just to test upper limits of OC. Faster than loading up any games or other benchmarks.
> 
> I will use games/other benchmark utilities for final stability testing.
Click to expand...

Not even a little. I have had my GPU show stable in Furmark and then throw artifacts in BF3. It's terrible and unrealistic; I quit using it then and there. If your games can run without artifacts, you are solid.


----------



## falcon26

Damn, these cards do run very, very hot. I was playing BF3 for about 30 minutes and my 290 got to about 80 degrees. Now it's sitting here idling at about 55 degrees. This is on the MSI Twin Frozr card as well. My GTX 780 would idle at about 32 degrees and load at about 60 degrees. I guess all the reviews were correct about these cards...


----------



## Aussiejuggalo

Quote:


> Originally Posted by *falcon26*
> 
> Dam these cards do run very very hot. I was playing BF3 for about 30 minutes. My 290 got to about 80 degrees. Now its sitting here idling at about 55 degrees. This is on the MSI Twin Frozer card as well. My Gtx 780 would idle about 32 degrees and load about 60 degrees. I guess all the reviews were correct on these cards...


Yeah, they're hot. In Watch Dogs mine gets to about 80, but I have no real airflow seeing as all I have is the rad.









I'd still take this $560 hot cake over the $900 Nvidia equivalent tho


----------



## aneutralname

Quote:


> Originally Posted by *ebhsimon*
> 
> An open air cooler sucks air from the bottom and pushes hot air out towards the side panel. While turning the side panel fan to max might make it cooler, that's because you're feeding it cool air. But you're also opposing the hot air that is being pushed out of it. Usually in a case with good airflow the side panel should be exhaust to take the heat from the gpu directly out of the case.
> 
> With minimal airflow, the hot air just swirls around inside the case and gets recycled and that's what makes your components hotter (mainly gpu and cpu). If your gpu temps are lower with the side panel fan as an intake rather than an exhaust that means that your gpu isn't getting enough cool air (which you can fix by using an intake on the bottom of the case.
> 
> But if you don't have a spare fan, then I'd take an intake from somewhere else in the case and put it as a bottom intake - since you're mining, you'd want to prioritise the cooling of your GPU over everything else.
> 
> And an update on running kombustor at 1 hour 50 minutes, gpu core is 70C, vrm1 68C and 37% fan speed. Ambient temperature of 21C.


Wouldn't a fan on the bottom also oppose the air that's being pulled out of the card?

I have searched online about the side panel fan issue, and opinion appears quite divided on whether the side fan should be intake or exhaust, with a slight majority favouring intake.

After reading your post I tried multiple configurations, and having the fan on the side panel as intake made the card 2 degrees cooler than placing it on the bottom or setting it as exhaust on the side. A current of fresh air straight into the card is probably hard to beat.


----------



## Dasboogieman

Ghetto VRM cooling

Got frustrated with VRM temps, so I strapped a vintage NVIDIA GT 210 cooler to the backside.



Dropped VRM temperatures by about 5-10 degrees. Didn't expect much more due to the inefficiency of the mounting + thick thermal pad contact.

Gosh, this would be so much easier if Sapphire had put a backplate on it. Then the GT 210 cooler would've gone across the plate with excellent contact.

Will probably redo mod once thermal putty arrives.

YESSS I have finally found it.
http://www.helipal.com/m2-x-8mm-flat-head-cap-screws.html?osCsid=8nqp9e491o249eppavas8hf451

If anyone wants to use an XSPC watercooling backplate with your reference AMD 290X, give these screws a try; you'll have to drill extra-large clearance holes for the GPU screws. Theoretically it might work with EK backplates too.
I just tested it: M2 screws do work with the Sapphire AMD 290 Tri-X provided the length is at least 5mm; the 8mm variety will allow the backplate to be mounted.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Dasboogieman*
> 
> Ghetto VRM cooling
> 
> Got frustrated with VRM temps, strapped a vintage NVIDIA GT 210 cooler to the backside
> 
> 
> 
> Dropped VRM temperatures by about 5-10 degrees. Didn't expect much more due to the inefficiency of the mounting + thick thermal pad contact.
> 
> Gosh, this would be so much easier if Sapphire had put a backplate on it. Then the GT 210 cooler would've gone across the plate with excellent contact.
> 
> Will probably redo mod once thermal putty arrives.


I wonder how effective one of those placed on the backside of the core would be.


----------



## heroxoot

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> Ghetto VRM cooling
> 
> Got frustrated with VRM temps, strapped a vintage NVIDIA GT 210 cooler to the backside
> 
> 
> 
> Dropped VRM temperatures by about 5-10 degrees. Didn't expect much more due to the inefficiency of the mounting + thick thermal pad contact.
> 
> Gosh, this would be so much easier if Sapphire had put a backplate on it. Then the GT 210 cooler would've gone across the plate with excellent contact.
> 
> Will probably redo mod once thermal putty arrives.
> 
> 
> 
> I wonder how effective one of those placed on the backside of the core would be.
Click to expand...

Considering the black plate absorbs plenty of heat off the GPU, I'd say it would be plenty effective. It depends on your case, though. I have the HAF XB, so for me it wouldn't help much, as I already have a 200mm fan a good 4 inches away from the side of the card pulling heat off it. Since it's a test-bench-type case, the side of the GPU faces up rather than towards the side of the case, so all the heat is taken away rather quickly.

For other cases, though? I'd say amazing.


----------



## Mega Man

Quote:


> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *chiknnwatrmln*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> Ghetto VRM cooling
> 
> Got frustrated with VRM temps, strapped a vintage NVIDIA GT 210 cooler to the backside
> 
> 
> 
> Dropped VRM temperatures by about 5-10 degrees. Didn't expect much more due to the inefficiency of the mounting + thick thermal pad contact.
> 
> Gosh, this would be so much easier if Sapphire had put a backplate on it. Then the GT 210 cooler would've gone across the plate with excellent contact.
> 
> Will probably redo mod once thermal putty arrives.
> 
> 
> 
> I wonder how effective one of those placed on the backside of the core would be.
> 
> Click to expand...
> 
> Considering the black plate absorbs plenty of heat off the GPU, I'd sight on plenty effective. It depends on your case. I have the Half XB so for me it wouldnt help much as I already have a 200MM fan a good 4 inches away from the side of the card pulling heat off of it. Since its a test bench type case the side of the GPU faces UP rather than to the side of a case. All the heat is taken away rather quick this way.
> 
> For other cases tho? I'd sight on amazing.
Click to expand...

one word for you all..... watercooling


----------



## ebhsimon

Quote:


> Originally Posted by *falcon26*
> 
> Dam these cards do run very very hot. I was playing BF3 for about 30 minutes. My 290 got to about 80 degrees. Now its sitting here idling at about 55 degrees. This is on the MSI Twin Frozer card as well. My Gtx 780 would idle about 32 degrees and load about 60 degrees. I guess all the reviews were correct on these cards...


Did you overclock your card at all? Are you running more than one monitor? Dual/triple/quad/pent etc. monitor setups generally increase the idle temperatures on your gpu.


----------



## heroxoot

Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *chiknnwatrmln*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> Ghetto VRM cooling
> 
> Got frustrated with VRM temps, strapped a vintage NVIDIA GT 210 cooler to the backside
> 
> 
> 
> Dropped VRM temperatures by about 5-10 degrees. Didn't expect much more due to the inefficiency of the mounting + thick thermal pad contact.
> 
> Gosh, this would be so much easier if Sapphire had put a backplate on it. Then the GT 210 cooler would've gone across the plate with excellent contact.
> 
> Will probably redo mod once thermal putty arrives.
> 
> 
> 
> I wonder how effective one of those placed on the backside of the core would be.
> 
> Click to expand...
> 
> Considering the black plate absorbs plenty of heat off the GPU, I'd sight on plenty effective. It depends on your case. I have the Half XB so for me it wouldnt help much as I already have a 200MM fan a good 4 inches away from the side of the card pulling heat off of it. Since its a test bench type case the side of the GPU faces UP rather than to the side of a case. All the heat is taken away rather quick this way.
> 
> For other cases tho? I'd sight on amazing.
> 
> Click to expand...
> 
> one word for you all..... watercooling
Click to expand...

Two words. Too expensive.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Mega Man*
> 
> one word for you all..... watercooling




(as for it being expensive even AIO kits are decent enough)


----------



## ebhsimon

AIO kits + bracket + vrm heatsink + thermal glue when added together are still pretty expensive.
As for ghetto cooling, I was thinking about getting a riser and putting the backplate facing down on top of two 140mm fans and just keep it like that.


----------



## Dasboogieman

Lol, waiting for WC parts. I'm still going to use the 210 cooler on the backplate when I go WC. It basically becomes a ghetto Kryographics backplate.


----------



## joeh4384

My 290X ran really hot in my SilverStone GD05 case (idle at 45 and gaming in the 90s), but I moved it into my Fractal Node 304 rig and it idles in the 30s and games in the low 80s. I think the key is having good enough exhaust. My SilverStone case only has three 120mm intake fans but no exhaust, while the Fractal has a nice big 140mm exhaust fan.


----------



## rdr09

Played some BF4 yesterday and compared the i7 SB usage @ 4.5GHz between HT off (one on the left) and HT on. With a single 290 . . .



this was DX11. I'll see if Mantle makes a difference.


----------



## sena

Can somebody with an Asus R9 290X reference card post its BIOS here? I downloaded a BIOS from TPU, but GPU-Z says ATI instead of Asus.

I am asking this because of an eventual RMA.

Thanks


----------



## aneutralname

Quote:


> Originally Posted by *rdr09*
> 
> Played some BF4 yesterday and compared the i7 SB usage @ 4.5GHz between HT off (one on the left) and HT on. With a single 290 . . .
> 
> 
> 
> this was DX11. I'll see if Mantle makes a difference.


One of the graphs is showing temps.


----------



## rdr09

Quote:


> Originally Posted by *aneutralname*
> 
> One of the graphs is showing temps.


my bad. i posted the wrong one. i'll look for the right one. but here was the usage for both and the game played very smooth for both.



here was the cpu usage with HT on . . .



some core/threads were hardly even used.


----------



## aneutralname

I have a lot of graphical weirdness in Firefox when mining. Have any of you guys found a way around it yet? This is not an issue with the latest drivers, but those cause too many HW errors.


----------



## aneutralname

Quote:


> Originally Posted by *rdr09*
> 
> my bad. i posted the wrong one. i'll look for the right one. but here was the usage for both and the game played very smooth for both.
> 
> 
> 
> here was the cpu usage with HT on . . .
> 
> 
> 
> some core/threads were hardly even used.


Quote:


> Originally Posted by *rdr09*


Multithreading is known not to make a lot of difference in games.


----------



## falcon26

OK, I have to ask, and this is not a fanboy question; I'm serious, I use either Nvidia or ATI depending on price. But if the 290/290X are basically the same as a GTX 780, but run twice as hot and use twice as much power, why would anyone not get the 780 over the 290/290X? Even price-wise they are almost the same. I am using the 290 now, but man oh man, idling at 50 and loading at 90 is pretty insane considering the 780 idles at 30 and loads at 60. That is like a 30 degree difference, which is huge. Then there are drivers. With Nvidia you get Adaptive Vsync; with ATI you do not (officially). I love that feature and it helps me a lot, so I was really bummed to find out ATI doesn't do this. At this point the MSI Gaming 290 I have does look better in my case, but that is about it.


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> Played some BF4 yesterday and compared the i7 SB usage @ 4.5GHz between HT off (one on the left) and HT on. With a single 290 . . .
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> this was DX11. I'll see if Mantle makes a difference.


Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *aneutralname*
> 
> One of the graphs is showing temps.
> 
> 
> 
> my bad. i posted the wrong one. i'll look for the right one. but here was the usage for both and the game played very smooth for both.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> here was the cpu usage with HT on . . .
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> some core/threads were hardly even used.
Click to expand...

I did these in January. Crossfire.

HT OFF, BF4 1080p 100% res scale Ultra


HT ON, BF4 1080p 100% res scale Ultra


HT ON, BF4 1080p 200% res scale Ultra


----------



## aneutralname

Quote:


> Originally Posted by *falcon26*
> 
> OK I have to ask, and this is not a fan boy question I'm serious I use either Nvidia or Ati depending on price. But if the 290 290X are basically the same as a GTX 780, but run twice as hot and use twice as much power why would anyone not get the 780 over the 290 290x? Even price wise they are almost the same. I am using the 290 now, but man oh man, idling at 50 loading at 90 that is pretty insane considering the 780 idles at 30 and loads at 60. That is like a 30 degree difference which is huge. Then their is drivers. WIth Nvidia you get Adaptive Vysnc with Ati you do not (Officially) I love that feature and it helps me alot, I was really bummed to find out Ati doesn't do this. At this point the MSI gaming 290 I have does look better in my case  but that is about it.


ATI cards look better









Seriously, in my case it's because I'm interested in mining and because my mobo does not support SLI (stuck-up Nvidia won't approve PCIe 4x connections for SLI despite essentially no performance difference).

To be perfectly honest, if you are only going to game, then everything else being equal I would say the GTX 780 is clearly superior to the R9 290: it runs much cooler, has more overclocking potential and better driver support for games, and downsampling with Nvidia is pretty easy; I haven't got it to work with ATI yet.

And the 780 is almost half a year older than the R9 290. It goes to show that AMD has some catching up to do, or it will end up where it is in the processor scene.


----------



## Durvelle27

Can I use a non-powered x16-to-x16 riser with my R9 290X?


----------



## Red1776

Quote:


> Originally Posted by *Durvelle27*
> 
> Can i use a non-powered x16 to x16 rise with my R9 290X


You actually want to use a passive pass-through riser.


----------



## alex22808

Hey all, I have an XFX R9 290 with the reference cooler design. What is the best/most cost-efficient aftermarket cooler for it? With the recent hot weather where I live, my card has really struggled to stay cool with the reference cooler, so I am looking to replace it.
Many thanks


----------



## rdr09

Quote:


> Originally Posted by *aneutralname*
> 
> Multithreading is known not to make a lot of difference in games.


i did notice higher fps, especially minimums in Battlefield.



Quote:


> Originally Posted by *kizwan*
> 
> I did these in January. Crossfire.
> 
> HT OFF, BF4 1080p 100% res scale Ultra
> 
> 
> HT ON, BF4 1080p 100% res scale Ultra
> 
> 
> HT ON, BF4 1080p 200% res scale Ultra


i think with a single 290 the i7 with HT off is very adequate but with 2 290s or more, then the HT might help. this is assuming the game uses the extra threads.


----------



## rdr09

Quote:


> Originally Posted by *falcon26*
> 
> OK I have to ask, and this is not a fan boy question I'm serious I use either Nvidia or Ati depending on price. But if the 290 290X are basically the same as a GTX 780, but run twice as hot and use twice as much power why would anyone not get the 780 over the 290 290x? Even price wise they are almost the same. I am using the 290 now, but man oh man, idling at 50 loading at 90 that is pretty insane considering the 780 idles at 30 and loads at 60. That is like a 30 degree difference which is huge. Then their is drivers. WIth Nvidia you get Adaptive Vysnc with Ati you do not (Officially) I love that feature and it helps me alot, I was really bummed to find out Ati doesn't do this. At this point the MSI gaming 290 I have does look better in my case  but that is about it.


I bought my reference 290 at launch and it was $400; I can't remember the price of the 780 back then, but it was higher. I went reference because I planned on putting a block on it.

Another reason is the extra VRAM, and last but not least, I play Battlefield. I've seen what Nvidia drivers have done to BF3 and BF4, and I can't take that chance - the other players in my house would curse me. lol

edit: btw,

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127746


----------



## Durvelle27

Quote:


> Originally Posted by *Red1776*
> 
> you actually want to use a passive pass thru riser.


This is the adapter I have.


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *aneutralname*
> 
> Multithreading is known not to make a lot of difference in games.
> 
> 
> 
> i did notice higher fps, especially minimums in Battlefield.
Click to expand...

It definitely makes a lot of difference in two situations: 1) CPU-intensive games, 2) multi-GPU.

Quote:


> Originally Posted by *rdr09*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I did these in January. Crossfire.
> 
> HT OFF, BF4 1080p 100% res scale Ultra
> 
> 
> HT ON, BF4 1080p 100% res scale Ultra
> 
> 
> HT ON, BF4 1080p 200% res scale Ultra
> 
> 
> 
> 
> 
> 
> 
> 
> i think with a single 290 the i7 with HT off is very adequate but with 2 290s or more, then the HT might help. this is assuming the game uses the extra threads.
Click to expand...

I'll check this out later using HWiNFO. You're correct though, a 4-core/4-thread CPU is adequate for a single 290.


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> It definitely make a lot of difference in two situations; 1) CPU intensive games, 2) multi-GPU
> I'll check this out later using HWiNFO. You're correct though, 4-cores/threads CPU is adequate for single 290.


looking forward to the results. +rep.


----------



## PachAz

I just bought a used but never-mounted Sapphire R9 290:



I paid 2000 SEK, and a new one costs like 3300 SEK, so I paid about 60% of the price of a new one. Please tell me that was a good deal?

I also got a brand new EK 290X waterblock, a nickel one because the shop didn't have a regular copper one. With the discount I paid 560 SEK instead of 865 SEK, about 35% off.



I plan on mounting the setup like this:


----------



## Gobigorgohome

Can the Asus R9 290 DCU II OC backplate be used with the EK-FC R9-290X DCII - Acetal+Nickel?

In case that does not work, how much temperature difference is there with and without a backplate on the card? A couple of degrees Celsius?


----------



## cennis

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Can the Asus R9 290 DCU II OC backplate be used with the EK-FC R9-290X DCII - Acetal+Nickel?
> 
> In the case that do not work, how much temperature difference is it with and without backplates on the cards? Couple degrees Celsius?


http://www.ekwb.com/shop/EK-IM/EK-IM-3831109868744.pdf
It seems like all those screw holes will be used for the waterblock.
The block's screws go in bottom -> top; the original backplate is meant to be threaded the opposite direction (top -> bottom), and the thread ends at the bottom side, IIRC.

If the threads do not end, you just need to find long enough M3 screws and put the backplate on.

Backplate cooling for the VRMs works best with direct contact, so you can put a couple of layers of thermal pads between the backplate and the PCB if you manage to mount it.

For reference:
Quote:


> Originally Posted by *cennis*
> 
> *Some pics and results of my asus dcuii h90 mount:
> 
> *
> 
> 
> Spoiler: Warning: Spoiler!


----------



## Roboyto

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Can the Asus R9 290 DCU II OC backplate be used with the EK-FC R9-290X DCII - Acetal+Nickel?
> 
> In the case that do not work, how much temperature difference is it with and without backplates on the cards? Couple degrees Celsius?


I'm not certain regarding your backplate question; however, your choice of thermal pads on the block is going to affect temperatures more than the backplate on the EK cards.

Read here for thermal pad info: http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> looking forward to the results. +rep.


290 CFX, BF4 1080p 200% res scale Ultra NO AA




Spoiler: Look here for individual Core usage


----------



## MTDEW

Quote:


> Originally Posted by *alex22808*
> 
> Hey all, i have an xfx r9 290 with the reference cooler design. what is the best/most cost efficient aftermarket cooler for it? with the recent hot weather i have been having where i live, my card has really struggled to keep cool with the reference cooler so i am looking to replace it.
> Many thanks


The Gelid Icy Vision rev.2 is around $55.00 on Newegg.
And you'll need the updated VRM cooling kit which is around $14.00 on Newegg.

You can watch an installation video.
You can ignore any of the modding he does with a dremel since with the updated VRM kit, you won't have any of those issues.

Be aware, though, that even with the updated VRM cooling kit, the VRMs still run a bit hotter than with the reference cooler.

You can use the original plate from your reference cooler for better VRM cooling instead of using the updated VRM kit if you don't mind doing some modding.


Here is a link with pics of how to mod the reference plate to fit with the Gelid Icy Vision rev.2 cooler: http://imgur.com/F2bBr
In the pics, he is doing it for a water block, but the procedure is exactly the same for the Gelid cooler.

Same thing HERE; he is doing it for a different cooler, but the procedure is pretty much the same.

Hope that helps some.


----------



## PachAz

Well, that's at 200% scale, but what if you ran it at regular scale, which increases the load on the CPU? Anyway, soon I will test 2x R9 290 in BF4 at 1080p with my 3570K, before I upgrade to a 4930K, and see the CPU and GPU usage.


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> 290 CFX, BF4 1080p 200% res scale Ultra NO AA
> 
> 
> 
> 
> Spoiler: Look here for individual Core usage


are you gonna do one with HT off? thanks.

Quote:


> Originally Posted by *PachAz*
> 
> Well, at 200% scale, but what if you would run it ar regular scale, which increase the load on the cpu? Anyways, soon I will test 2x r9 290 in bf4 at 1080p with my 3570k, before I upgrade to a 4930k, and see the cpu usage and gpu usage.


and thank you.


----------



## fateswarm

So I ordered a tri-x 290. Tell me what I need to know. I have no idea.

Any URLs or pointers will be appreciated.

For OCing, or for not blowing it up.


----------



## chronicfx

Quote:


> Originally Posted by *Durvelle27*
> 
> This is how they look
> 
> 
> 
> *I used a 17" instead of the 21.5"


Haha, I was thinking about it. I have a 24-inch 1080p, a 27-inch 1440p IPS, and a square 1024x768 that's maybe a 17-inch like you said. I want to try it, but the 17 is VGA and it just seems silly.

Lemme know how it works. I may trade the 17 for my father-in-law's 1080p Dell monitor. He just watches Italian TV anyway.


----------



## KyadCK

Quote:


> Originally Posted by *ebhsimon*
> 
> AIO kits + bracket + vrm heatsink + thermal glue when added together are still pretty expensive.
> As for ghetto cooling, I was thinking about getting a riser and putting the backplate facing down on top of two 140mm fans and just keep it like that.


http://www.newegg.com/Product/Product.aspx?Item=N82E16835186095
Quote:


> Originally Posted by *falcon26*
> 
> OK I have to ask, and this is not a fan boy question I'm serious I use either Nvidia or Ati depending on price. But if the 290 290X are basically the same as a GTX 780, but run twice as hot and use twice as much power why would anyone not get the 780 over the 290 290x? *Even price wise they are almost the same.* I am using the 290 now, but man oh man, idling at 50 loading at 90 that is pretty insane considering the 780 idles at 30 and loads at 60. That is like a 30 degree difference which is huge. Then their is drivers. WIth Nvidia you get Adaptive Vysnc with Ati you do not (Officially) I love that feature and it helps me alot, I was really bummed to find out Ati doesn't do this. At this point the MSI gaming 290 I have does look better in my case  but that is about it.


We'll start with the fact that you can buy brand new R9 290s for $360 (they average around $400), and move on to the fact that the cheapest 780 costs $480 (averaging around $500).

So... "almost the same"?
Quote:


> Originally Posted by *aneutralname*
> 
> Quote:
> 
> 
> 
> Originally Posted by *falcon26*
> 
> OK I have to ask, and this is not a fan boy question I'm serious I use either Nvidia or Ati depending on price. But if the 290 290X are basically the same as a GTX 780, but run twice as hot and use twice as much power why would anyone not get the 780 over the 290 290x? Even price wise they are almost the same. I am using the 290 now, but man oh man, idling at 50 loading at 90 that is pretty insane considering the 780 idles at 30 and loads at 60. That is like a 30 degree difference which is huge. Then their is drivers. WIth Nvidia you get Adaptive Vysnc with Ati you do not (Officially) I love that feature and it helps me alot, I was really bummed to find out Ati doesn't do this. At this point the MSI gaming 290 I have does look better in my case  but that is about it.
> 
> 
> 
> ATI cards look better
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Seriously, in my case it's because I'm interested in mining and because my mobo does not support SLI (stuck-up Nvidia won't approve PCIe 4x connections for SLI despite essentially no performance difference).
> 
> To be perfectly honest, if you are only going to game, and everything else being equal, I would say the GTX 780 is clearly superior to R9 290. It runs much cooler, more overclocking potential, better driver support for games and downsampling with Nvidia is pretty easy. I haven't got it to work with ATI yet.
> 
> And 780 is almost half a year older than R9 290. Goes to show that AMD has some catching up to do or it will end up like in the processor scene.

Nvidia and AMD have staggered releases: Nvidia in the middle of the year, AMD in December/January.

These cards are not made in a year; they take several years to get to market. As a result, you compare generations, not release dates.

If we're comparing your way, the 770 is rather pathetic compared to the much older 7970.


----------



## Durvelle27

Quote:


> Originally Posted by *chronicfx*
> 
> Haha i was thinking about it. I have a 24 inch 1080p a 27' 1440p ips and a square 1024x768 not sure but may be a 17 like you said. I want to try it but the 17 is vga and it just seems silly
> 
> 
> 
> 
> 
> 
> 
> lemme know how it works. I may trade the 17 for my father in laws 1080p dell monitor. He just watches italian tv anyways


Was hoping to get some results today for you guys but was sent the wrong adapters so have to wait for them to reship.


----------



## Dasboogieman

Quote:


> Originally Posted by *falcon26*
> 
> OK I have to ask, and this is not a fan boy question I'm serious I use either Nvidia or Ati depending on price. But if the 290 290X are basically the same as a GTX 780, but run twice as hot and use twice as much power why would anyone not get the 780 over the 290 290x? Even price wise they are almost the same. I am using the 290 now, but man oh man, idling at 50 loading at 90 that is pretty insane considering the 780 idles at 30 and loads at 60. That is like a 30 degree difference which is huge. Then their is drivers. WIth Nvidia you get Adaptive Vysnc with Ati you do not (Officially) I love that feature and it helps me alot, I was really bummed to find out Ati doesn't do this. At this point the MSI gaming 290 I have does look better in my case  but that is about it.


In a sense you are correct. NVIDIA designs are, without a doubt, more elegant and refined; they are all about features, user friendliness and efficiency. However, there's a little thing I've discovered about NVIDIA designs over the years (I came from an SLI GTX 570 system).

When NVIDIA designs a GPU, they tend to razor-optimize the chip (along the whole product stack, from the drivers to the VRAM) for current trends only. That is, it will most definitely outperform the contemporary competition, but the performance has poor futureproofing. For example, you only have to look at the focus of the GPUs themselves: the GTX 570 actually trailed the AMD HD 6970 in just about every hardware metric (shader array, RAM), but its sheer efficiency arguably gave it the upper hand in all 1080p scenarios. Fast forward to today, and many HD 6970 chips are still very usable, while the 570 is severely bottlenecked by its RAM.

My point is, NVIDIA chips are almost always superior in the current term, and maybe a year down the track, but their performance tapers very quickly once you start throwing out scenarios that the original designers didn't optimize for.

AMD hardware has a lot more engineering headroom built in; sometimes I reason this is because they are compensating for their weaker driver optimizations to squeeze out performance.
With the 290X, all that heat and power isn't just to match the 780 at 1080p; this card was designed to excel at high resolutions and high levels of anti-aliasing. Throw the latter scenarios at the 780 and you realize how fast the performance melts.

I must admit adaptive vsync was nice, but the one feature I miss from NVIDIA is supersample AA. It was much easier to apply and had fewer bugs, though ironically, AMD hardware is better built to perform SSAA.

As for the temperatures, I would actually agree with AMD: there is nothing inherently wrong with running an ASIC at 95 degrees provided it is properly engineered for (leakage-wise, plus interconnect and package component thermal tolerance). Part of the reason the NVIDIA chips run at 80-85 degrees is that they cannot handle 90+ degrees of sustained load; it simply wasn't taken into account in the transistor selection.
Even the VRMs on the 290X were properly designed for high temperatures; they use expensive high-density designs with even more expensive metallic packaging for more efficient heat transfer.


----------



## cephelix

Quote:


> Originally Posted by *Dasboogieman*
> 
> In a sense you are correct. NVIDIA designs are, without a doubt, more elegant and refined. They are all about features, user friendliness and efficiency. However theres a little thing I've discovered about NVIDIA designs over the years (I came from an SLI GTX 570 system).
> 
> When NVIDIA designs a GPU, they will tend to razor optimize the chip (along the whole product stack, from the Drivers to the VRAM) for the current trends only. I.e. they will most definitely outperform the contemporary competition but the performance has poor futureproofing. For example, you only have to look at the focus of the GPUs themselves. The GTX 570 actually trailed the AMD HD 6970 in just about every hardware metric (Shader array, RAM) but its sheer efficiency gave it, arguably, the upper hand in all 1080p scenarios. Fast forward to today, many HD 6970 chips are still very useable but the 570 is severely bottlenecked by its RAM.
> 
> My point is, NVIDIA chips are almost always superior in the current term and maybe a year down the track but their performance tapers very very quickly once you start throwing out scenarios that the original designers didn't optimize for.
> 
> AMD hardware has a lot more engineering headroom built in, sometimes I reason this is because they are compensating for their weaker Driver optimizations to squeeze performance.
> With the 290x all that heat and power isn't just to match the 780 at 1080p, this card was designed to excel at high resolution and high levels of Anti Aliasing. Throw the latter scenarios at the 780 and you realize how fast the performance melts.
> 
> I must admit, adaptive vsync was nice but the one feature I miss from NVIDIA was the SuperSample AA. It was much easier to apply and with less bugs but ironically, AMD hardware is better built to perform SSAA.
> 
> As for the temperatures, I would actually agree with AMD, there is nothing inherently wrong with running an ASIC at 95 degrees provided it is properly engineered for (leakage wise and interconnect + package component thermal tolerance). Part of the reason the NVIDIA chips run at 80-85 degrees is that they cannot handle 90+ degrees sustained load, it simply wasn't taken in to account with the transistor selection.
> Even the VRMs on the 290X was properly designed for high temperatures, they use expensive high density designs with even more expensive metallic packaging for more efficient heat transfer.


Can't really comment since this is only my 3rd card, coming from a GTX 460 and 570, and it's the first time I'm going to seriously try to OC once my blocks arrive. But it is informative. If only we could get the best of both worlds, Nvidia's drivers and efficiency plus AMD's robust designs and prices... but that's most likely just a pipe dream.


----------



## fateswarm

It was a no-brainer here. A Tri-X OC Sapphire 290 cost 380 euros and the first half-decent 780 was 470. Then I noticed that the Sapphire had very impressive benchmarks, occasionally even beating the 780 Ti.

Also, it gave me an excuse to get a 1000W PSU (in case it goes Crossfire in the future, plus CPU OC).

Compared to a 780 plus an 850W PSU, it came out 50 euros cheaper as well.


----------



## falcon26

I don't understand why AMD can't do adaptive vsync like Nvidia does. Can't they just write that into their drivers like Nvidia does?

Sent from my Nexus 10 using Tapatalk


----------



## maynard14

Finally, after installing Gelid VRM heatsinks, my R9 290 (unlocked to 290X) can now be OC'd properly. Using stock volts I managed to OC it to 1060 core and 1300 memory; VRM1 max load is 60C and VRM2 55C.

Now I got this score:

http://www.3dmark.com/3dm11/8413998

Here's my setup:


----------



## PachAz

I would always choose AMD GPUs over Nvidia, because you get more performance for the money and more VRAM. Why pay more for less? The money you "save" you can put toward other things, like waterblocks. With an Intel CPU and an AMD GPU you have a killer setup. A stock R9 290 even beats the GTX Titan and 780 Ti in many games, and two R9 290/290Xs are on par with 3x Titan or 3x 780 Ti at least. Even the older 7970 GHz in CF is like 10-20 fps slower than 2x 780 or 2x 780 Ti in many games. So I do think that the top-end AMD cards are very powerful, and they can be even meaner when overclocking.

I don't get why people buy Nvidia at all.


----------



## rv8000

Quote:


> Originally Posted by *PachAz*
> 
> I would always choose amd gpus over nvidia, because you get more performance for the money and better vram. Why pay more for less? The extra money you "save" you can put on other things, like waterblocks. I mean intel cpu and amd gpu, you have a killer setup
> 
> 
> 
> 
> 
> 
> 
> . A stock r9 290 even beats the gtx titan and 780ti in many games, and two r9 290/290x is on pair with 3x titan or 3x 780ti at least. Even the older 7970 ghz in cf is like 10-20 fps slower than 2x 780 or 2x 780ti in many games. So I do think that the top end amd cards are very powefull and they can be even meaner while overclocking.
> 
> *I dont get why people buy nvidia at all*.


Marketing, specific performance in certain games, reviews, features (gimmicky or not is subject to your opinion). I've owned a reference 780, an EVGA 780 Classy, an MSI 780 Lightning, a ref 290, a 290 PCS+, a 290 Tri-X, and now a 290 Vapor-X; all in all, the 290 is currently the better buy IMO now that prices have come down from the heavens and there are plenty of aftermarket cooling solutions available.


----------



## PachAz

AMD better be better than "Nevidja" because I just got a second R9 290 used, and people here know I can't accept "no" as an answer. Also, Red, what do you mean by 5..4..3..2..1?


----------



## Red1776

Quote:


> Originally Posted by *PachAz*
> 
> AMD better be better than "Nevidja" because I just got a second r9 290 used, and people here know I cant accept "no" as an answer
> 
> 
> 
> 
> 
> 
> 
> . Also red, what do you mean by 5..4..3..2..1??


I meant that a statement like that is usually, and quickly, followed by many 'spirited' retorts.


----------



## DeadlyDNA

Quote:


> Originally Posted by *PachAz*
> 
> AMD better be better than "Nevidja" because I just got a second r9 290 used, and people here know I cant accept "no" as an answer
> 
> 
> 
> 
> 
> 
> 
> . Also red, what do you mean by 5..4..3..2..1??


He is probably reading that post the way I did. It is your opinion and boldly written to invite brand wars.


----------



## Red1776

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PachAz*
> 
> AMD better be better than "Nevidja" because I just got a second r9 290 used, and people here know I cant accept "no" as an answer
> 
> 
> 
> 
> 
> 
> 
> . Also red, what do you mean by 5..4..3..2..1??
> 
> 
> 
> He is probably reading that post the way I did. It is your opinion and boldly written to invite brand wars.

Could not have said it better myself.


----------



## PachAz

I can even go so far as to say that two R9 290s will beat 4-way SLI 780 Tis in most games. And the reason I say so is because I actually believe that.


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> 290 CFX, BF4 1080p 200% res scale Ultra NO AA
> 
> 
> 
> 
> Spoiler: Look here for individual Core usage
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> are you gonna do one with HT off? thanks.

Will do it later tonight.


----------



## Aussiejuggalo

Any idea when the new WHQL drivers are gonna be out? Wanna see how much of an improvement there is for Watch Dogs.


----------



## Arizonian

Quote:


> Originally Posted by *doginpants12*
> 
> Can I be added to the club? Here is my verification, I am running a XFX Core Edition R9 290 on water
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> .


Congrats - added


----------



## hwoverclkd

Quote:


> Originally Posted by *falcon26*
> 
> I don't understand why AMD can't do adaptive vsync like nvidia does. Can't they just write that into their divers like nvidia does?
> 
> Sent from my Nexus 10 using Tapatalk


Probably there's a patent around it. I've also seen companies filing patents on competitors' products; the competition then can't implement the feature without paying huge $$ to the patent owner. So if company XYZ seems dumb and couldn't design something, it could be because company WXY holds a key patent.


----------



## J4ckV4lentine

Quote:


> Originally Posted by *lawson67*
> 
> There is clearly a fault with this card guys and i have heard about some R9 290 cards black screening but do not know how this is solved?...Also why would it run fine in crossfire as the bottom card and not the top card without crashing the drivers (amdkmdap stopped responding and has successfully recovered) ?..And why will not run alone without doing the same..And why is it fine as the bottom card but *not* the top card in a crossfire configuration?....Any thoughts or help guys would be well appreciated


Hi, I have a Gigabyte R9 270X OC and I've noticed the display driver stops responding and comes back to life again, usually when I browse with Chrome. I don't think there is a cure for this unless Gigabyte or AMD fix the drivers. Can you confirm whether the driver stops responding while you are browsing with Chrome? I'm fairly sure that problem is with Chrome, but I think you need to send the card back for replacement on this issue...


----------



## Naxxy

Could i be added to the club please?

Running a Sapphire 290X , Tri-X cooler.

GPU Z Validation


----------



## Dasboogieman

Today, purely on impulse, I had a YOLO moment and bought an MSI 290 Gaming for $495.

It took a good part of the afternoon reshuffling parts in the case so I wouldn't have a nuclear meltdown.

Surprisingly, all the aftermarket 290/X cards that actually have decent VRM temperatures have one thing in common: they all have active backplates (with a piece of thermal pad on the backside).

The MSI card has crazy low VRM1 temps but really high VRM2 temps (64 degrees, which doesn't change).

ASIC quality was 79.8% (no golden chips here, just two slightly above average ones).

The cards seem to get along. Watch Dogs decided to crap itself when I ran it on the Crossfired cards; I checked MSI Afterburner and voila, the game was requesting 6.2 GB of VRAM and climbing before it crashed.

Now I have a dilemma: my custom WC loop is sure as hell not going to cope with this thermal load. So I'm thinking of dedicating the H90 to the CPU (i5-3570K) and the full 360mm rad and D5 pump to the two GPUs. Sigh, gonna need another waterblock.


----------



## rdr09

Quote:


> Originally Posted by *Dasboogieman*
> 
> Today, purely on impulse, I had a yolo moment and bought an MSI 290 Gaming for $495.
> 
> Took a good part of the afternoon reshuffling parts in the case so I wouldn't have a nuclear meltdown.
> 
> Surprisingly, all the aftermarket 290/X cards that actually have decent VRM temperatures have one thing in common, they all have active backplates (with a piece of Thermal pad on the backside).
> 
> The MSI card has crazy low VRM 1 temps but really high VRM2 temps (of 64 degrees which doesn't change).
> 
> ASIC quality was 79.8% (no golden chips here, just 2 slightly above average ones)
> 
> Cards seem to get along. Watchdogs decided to crap itself when I ran it on the Crossfired cards. Checked MSI afterburner and Voila, the game was requesting 6.2 GB of VRAM and climbing before it crashed.
> 
> Then I now have a dillemma, my Custom WC loop is sure as hell not going to cope with this thermal load. So I'm thinking, of shuffling the H90 as dedicated to the CPU (i5-3570k) and the full 360mm rad and D5 pump for the 2 GPUs. Sigh, gonna need another Waterblock.


That card is kinda expensive; I saw one for less than $400 a few days ago. Also, not sure if ASIC matters, but mine is about the same as yours and can do 1300 core.

Can you still return it?

http://www.overclock.net/t/1471215/sapphire-290x-with-ek-fc-290x-acrylic-nickel-waterblock

hotrod is not a miner.


----------



## BradleyW

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Any idea when the new WHQL drivers gonna be out? wanna see how much of an improvement it is for Watch Dogs


I would not be surprised if we don't see improvements for Watch Dogs. But I'm sure we will see some more improvements for Murdered Soul Suspect which I welcome!


----------



## Dasboogieman

Quote:


> Originally Posted by *rdr09*
> 
> that card is kinda expensive. i've seen one for less than $400 a few days ago. also, not sure if asic matters but mine is about the same as yours and can do 1300 core.
> 
> can you still return it?
> 
> http://www.overclock.net/t/1471215/sapphire-290x-with-ek-fc-290x-acrylic-nickel-waterblock
> 
> hotrod is not a miner.


Unfortunately, when the Australia Tax is factored in, it will probably end up costing about the same. The card's brand new from my local PC store, so returns aren't possible. Thanx for the recommendation though.


----------



## alancsalt

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> that card is kinda expensive. i've seen one for less than $400 a few days ago. also, not sure if asic matters but mine is about the same as yours and can do 1300 core.
> 
> can you still return it?
> 
> http://www.overclock.net/t/1471215/sapphire-290x-with-ek-fc-290x-acrylic-nickel-waterblock
> 
> hotrod is not a miner.
> 
> 
> 
> Unfortunately, when Australia Tax is factored in, it will probably only end up the same. Card's brand new from my local PC store, returns aren't possible. Thanx for the recommend tho.

No tax for imported items under $AU1000.00, but that doesn't affect our "returns" situation here.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dasboogieman*
> 
> Unfortunately, when Australia Tax is factored in, it will probably only end up the same. Card's brand new from my local PC store, returns aren't possible. Thanx for the recommend tho.


Ordering from the US is a bit easier than you would think; even shipping isn't that much of a pain anymore.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *PachAz*
> 
> I can even go so far to say that two r9 290 will beat 4 way sli 780ti in most of the games. And the reason I say so is because I actually believe that.


I've got some Classies coming; I will test that theory.









Quote:


> Originally Posted by *rdr09*
> 
> that card is kinda expensive. i've seen one for less than $400 a few days ago. also, not sure if asic matters but mine is about the same as yours and can do 1300 core.
> 
> can you still return it?
> 
> http://www.overclock.net/t/1471215/sapphire-290x-with-ek-fc-290x-acrylic-nickel-waterblock
> 
> *hotrod is not a miner* .


LoOoOoOoL








My cards' ASIC is 67%-69%, and you know what, those things have done 1350 single, 1330 CF and 1300 tri.









Quote:


> Originally Posted by *Dasboogieman*
> 
> Unfortunately, when Australia Tax is factored in, it will probably only end up the same. Card's brand new from my local PC store, returns aren't possible. Thanx for the recommend tho.


Yes damn GST and customs brokers man








Quote:


> Originally Posted by *alancsalt*
> 
> No tax for imported items under $AU1000.00, but that doesn't affect our "returns" situation here.


Yeah so don't get all three at once like 780ti's and the like


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Ive got some Classies coming I will test that theory
> 
> 
> 
> 
> 
> 
> 
> 
> LoOoOoOoL
> 
> 
> 
> 
> 
> 
> 
> 
> My cards asic is 67% - 69% and you now what those things have done 1350 single, 1330 CF and 1300 TRI
> 
> 
> 
> 
> 
> 
> 
> 
> Yes damn GST and customs brokers man
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah so don't get all three at once like 780ti's and the like


Pach, man. Tsk tsk. I look up to him when it comes to watercooling... but GPUs? I might as well go by Tom's. lol


----------



## dartuil

Quote:


> Originally Posted by *Durvelle27*
> 
> This is how they look
> 
> 
> 
> *I used a 17" instead of the 21.5"


What does that look like in mixed Eyefinity?


----------



## Durvelle27

Quote:


> Originally Posted by *dartuil*
> 
> What that look in mixed eyefinity?


Quote:


> Originally Posted by *Durvelle27*
> 
> *Was hoping to get some results today for you guys but was sent the wrong adapters so have to wait for them to reship*.


Read above post please


----------



## dartuil

Didn't see the whole thread, sorry.
I just arrived, my bad.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *BradleyW*
> 
> I would not be surprised if we don't see improvements for Watch Dogs.


Don't say that














lol


----------



## Durvelle27

Quote:


> Originally Posted by *Durvelle27*
> 
> Was hoping to get some results today for you guys but was sent the wrong adapters so have to wait for them to reship.


Quote:


> Originally Posted by *dartuil*
> 
> Didnt see all the thread sorry.
> Im just arrived my bad.


You're fine, bud. I ordered 2x active HDMI-to-VGA adapters to use with Eyefinity, and the company accidentally sent me 2x iPhone/iPod-to-HDMI adapters.


----------



## Dasboogieman

Holy smokes, I am so confused at the moment. I've been trying for the WHOLE AFTERNOON to figure out why my new Crossfire setup runs so badly, with stuttering and such.

I chanced a look at GPU-Z and, as it turns out, my PC shop sent me an *MSI 290X* instead of a regular 290. I don't know if it's Crossfire doing funky things to GPU-Z (the MSI Gaming card is the "slave"), but nope, it says right there: 2816 shaders.



Also checked the box; it clearly says AMD 290. Tell me I'm dreaming and slap me until I cry mommy. I just bought a 290 for $500 on impulse and got a 290X because MSI derped with QA.

Gonna install the MSI card solo and I'll keep you guys in the loop. Also, how do you tell the difference between a 290 and a 290X with benchmarks? I was under the impression they were really close.

If it really is a 290X then I have absolutely no idea what to do; it stutters like hell when paired with the 290.


----------



## shwarz

GPU-Z doesn't read right in Crossfire; one of my 290s shows as a 290X as well.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dasboogieman*
> 
> holy smokes, I am so confused at the moment. I've been trying for the WHOLE AFTERNOON figuring out why my new Crossfire setup runs so badly with stuttering and such.
> 
> I chanced looked on GPUz and as it turns out, my PC shop sent me an *MSI 290X* instead of a regular 290. I don't know if its the crossfire doing funky things to GPUz (The MSI Gaming card is the "slave") but nope, it says there, 2816 shaders
> 
> 
> 
> Also checked the box, clearly says AMD 290. Tell me I'm dreaming and slap me until I cry mommy. I just bought a 290 for $500 on impulse and got a 290X because MSI derped with QA.
> 
> Gonna install the MSI card single and I'll keep you guys in the loop. Also how do you tell the difference between 290 and 290X with benchmarks? I was under the impression they were really close.
> 
> If it really is a 290X then I have absolutely no idea what to do, it stutters like hell when paired with the 290.


Disable ULPS and then reload GPU-Z. Sometimes it derps out.
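For anyone wondering what that means: ULPS (Ultra Low Power State) puts the slave card to sleep, which is why monitoring tools sometimes misread it. MSI Afterburner has a "Disable ULPS" option in its settings; doing it by hand means setting `EnableUlps` to 0 under each display adapter's registry subkey. A sketch of the change, assuming (check in regedit first, this is only an example) your card sits under the `0000` subkey:

```reg
Windows Registry Editor Version 5.00

; Hypothetical example -- search regedit for "EnableUlps" to find the
; subkeys that actually belong to your cards (0000, 0001, ...).
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Reboot afterwards; set it back to 1 to re-enable ULPS.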


----------



## Dasboogieman

Yeah tried the MSI card solo.

Got excited over nothing









Turns out it's just a mundane, run-of-the-mill MSI 290


----------



## shwarz

Sweet. As for the stuttering, does it happen if both cards are at stock clocks?


----------



## fateswarm

How high can a run of the mill Sapphire 290 Tri-x overclock?


----------



## rdr09

Quote:


> Originally Posted by *fateswarm*
> 
> How high can a run of the mill Sapphire 290 Tri-x overclock?


it varies by silicon. try raising the core 10MHz at a time without touching the VDDC, but max out the Power Limit each time. see how far the card will oc first without voltage tweaks. my run-of-the-mill msi reference 290 starts needing extra volts past 1170 core.

you can use a combination of Heaven, Valley and/or any of the Futuremark benchmarks like Fire Strike. the best tool(s) will be the games you play.

keep the vram at stock first. my vram maxes out at 1600.
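rdr09's step-up routine is easy to keep straight if you think of it as a loop. Below is a minimal Python sketch of that logic only; `apply_clock` and `is_stable` are placeholders for whatever OC tool and stress test you actually use (e.g. Afterburner plus a Heaven run), not a real API:

```python
def find_max_stable_core(apply_clock, is_stable,
                         start_mhz=1000, step_mhz=10, ceiling_mhz=1300):
    """Raise the core clock step by step (VDDC untouched, Power Limit maxed)
    and return the last clock that passed the stability test."""
    best = start_mhz
    clock = start_mhz
    while clock + step_mhz <= ceiling_mhz:
        clock += step_mhz
        apply_clock(clock)       # placeholder: push the clock via your OC tool
        if not is_stable():      # placeholder: run Heaven/Valley/your games
            break                # first failure: fall back to the last good clock
        best = clock
    apply_clock(best)            # leave the card at the best stable clock
    return best
```

The same loop works for the memory pass afterwards; just point it at the VRAM clock once the core is settled.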


----------



## fateswarm

Thanks. Rep. And temp limits?


----------



## rdr09

Quote:


> Originally Posted by *fateswarm*
> 
> Thanks. Rep. And temp limits?


imo - keep the core and VRMs below 80C. when oc'ing i would advise not to leave the fans on auto. set them to manual, maybe at a level tolerable to your ears while maintaining those temps. the small fans normally start making noise at around 55%. ambient has a lot to do with these temps as well.


----------



## Naxxy

Quote:


> Originally Posted by *fateswarm*
> 
> How high can a run of the mill Sapphire 290 Tri-x overclock?


I've reached 1270 core and 1650 memory stable, but it required a lot of extra voltage, and temps and fan RPM rose a lot. The sweet spot I've reached for everyday use is 1150 core and 1400 memory.


----------



## Dasboogieman

Quote:


> Originally Posted by *fateswarm*
> 
> Thanks. Rep. And temp limits?


Based on my Tri-X, it seems the core cooling is extremely solid but the VRM cooling leaves much to be desired.
You basically won't breach 80 degrees on the core with the Tri-X; your VRMs will shut down long before that happens.

At 50% fan speed, you can expect around 80-85 degrees on VRM1 and 70-75 degrees core with a +125mV OC if you are drawing about 225W of load.

The highest I've ever gone for long periods of time is +167mV with the fans on 65%. Core was around 77 degrees, VRMs were 85-90, and I was drawing about 245W of load playing Mass Effect 3 with 8x supersample AA.


----------



## Gobigorgohome

Quote:


> Originally Posted by *cennis*
> 
> http://www.ekwb.com/shop/EK-IM/EK-IM-3831109868744.pdf
> it seems like all those screw holes will be used for the waterblocks.
> screws go in bottom -> top
> the original backplate is meant to be threaded the opposite direction (top-> bottom) and the thread ends at the bottom side iirc.
> 
> If the threads do not end you just need to find long enough m3 screws and put the backplate on.
> 
> backplate cooling for vrm only work best on direct contact, so you can put a couple layers of thermal pads between the backplate and pcb if you manage to mount it.
> 
> For reference:


I am all for not doing any modifications to the original backplate; the EK backplates are pretty cheap, so it is not a big hit. But then I might just go for the Sapphire Radeon R9 290X instead of the Asus DCU II OC R9 290... how big of a difference is there between the R9 290 and the R9 290X on water anyway? The price difference for two cards is just 234 USD, and I guess the Sapphire Radeon R9 290X is a better overclocker on water than the Asus DCU II OC R9 290 is?

Any other suggestions for the best water performer of the R9 290/R9 290X? I will be gaming on a Samsung U28D590D at 4K. The MSI R9 290X Lightning would maybe be the best, but it is almost as expensive as the GTX 780 Ti, and it is not that easy to find waterblocks for those cards either. Not in my country at least.
Quote:


> Originally Posted by *Roboyto*
> 
> I'm not certain regarding your Backplate question, however your choice in thermal pads on the block are going to effect temperatures more than the backplate for the EK cards.
> 
> Read here for thermal pad info: http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures


I was thinking more of a concrete temperature difference with and without the backplate mounted on the graphics cards. Sure the thermal pad has a say, but does anyone know which temperatures you get with and without backplates? I am just trying to cut expenses somewhat, even though two backplates are only 84 USD.


----------



## cennis

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I am all for not doing any modifications to the original backplate, the EK-backplates is pretty cheap so it is not a big hit. But I then maybe just go for the Sapphire Radeon R9 290X instead of the Asus DCU II OC R9 290 .... how big of a difference is it between the R9 290 and the R9 290X on water anyways? The price difference for two cards is just 234 USD and I guess the Sapphire Radeon R9 290X is a better overclocker on water than the Asus DCU II OC R9 290 is?
> 
> Any other suggestions for the best water-performer of the R9 290/R9 290X, I will be gaming on a Samsung U28D590D at 4K. The MSI R9 290X Lightning would maybe be the best, but it is almost as expensive as the GTX 780 Ti and there is not that easy to find waterblocks to those cards either. Not in my country at least.
> I was thinking more of a concrete temperature difference with and without the backplate mounted on the graphic cards. Sure the thermal pad has something to say, but anyone know which temperatures you get with and without backplates? I am just trying to cut the expenses somewhat, even though two backplates only is 84 USD.


The Samsung U28D590D requires DisplayPort.

From my experience (and homecinema-PC's), cards flash black screens when pushed with high voltage over DP, but are totally stable on HDMI/DVI. Not sure why.


----------



## Roboyto

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I was thinking more of a concrete temperature difference with and without the backplate mounted on the graphic cards. Sure the thermal pad has something to say, but anyone know which temperatures you get with and without backplates? I am just trying to cut the expenses somewhat, even though two backplates only is 84 USD.


If you want to cut cost, then run without the backplates. All the temps shown for my 290 are with a backplate installed, but with no pads to make it useful as a heatsink. The XSPC and EK blocks get very similar VRM temps once you use a good thermal pad. Better to spend $20 on Fujipoly Ultra Extreme than $80 on backplates.


----------



## kizwan

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> 290 CFX, BF4 1080p 200% res scale Ultra NO AA
> 
> 
> 
> 
> Spoiler: Look here for individual Core usage
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> are you gonna do one with HT off? thanks.
> 
> Click to expand...
> 
> will do it later tonight
Click to expand...

Note: These are using Mantle.

*290 CFX, BF4 1080p 100% vs. 200% res scale Ultra NO AA*




Spoiler: Look here for individual Core usage!










*290 CFX, BF4 1080p 100% vs. 200% res scale Ultra NO AA - HT OFF*
HWiNFO was causing severe stuttering/lagging in BF4, so I use Open Hardware Monitor instead. I actually forgot about AB.



For comparison, this is with DirectX but on a different driver, either 13.11 or 13.12, which I posted in my previous post. With this one I remember the CPU almost bottlenecking.


----------



## PachAz

Nice test.

What I meant was that two R9 290s will be close to 4x 780 Ti in some games due to the scaling limitations that exist, and hence 4x 780 Ti is totally unneeded. But I still believe AMD gives the best performance for the price, both in games and in benchmarks. If I'm not wrong, all the benchmark records are set with Intel CPUs + AMD GPUs.


----------



## pdasterly

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I am all for not doing any modifications to the original backplate, the EK-backplates is pretty cheap so it is not a big hit. But I then maybe just go for the Sapphire Radeon R9 290X instead of the Asus DCU II OC R9 290 .... how big of a difference is it between the R9 290 and the R9 290X on water anyways? The price difference for two cards is just 234 USD and I guess the Sapphire Radeon R9 290X is a better overclocker on water than the Asus DCU II OC R9 290 is?
> 
> Any other suggestions for the best water-performer of the R9 290/R9 290X, I will be gaming on a Samsung U28D590D at 4K. The MSI R9 290X Lightning would maybe be the best, but it is almost as expensive as the GTX 780 Ti and there is not that easy to find waterblocks to those cards either. Not in my country at least.
> I was thinking more of a concrete temperature difference with and without the backplate mounted on the graphic cards. Sure the thermal pad has something to say, but anyone know which temperatures you get with and without backplates? I am just trying to cut the expenses somewhat, even though two backplates only is 84 USD.


Interesting to know this as well. I ordered two Heatkiller backplates; should I add thermal pads also?


----------



## Jeffro422

Quote:


> Originally Posted by *Dasboogieman*
> 
> holy smokes, I am so confused at the moment. I've been trying for the WHOLE AFTERNOON to figure out why my new Crossfire setup runs so badly with stuttering and such.
> 
> I chanced a look at GPU-Z, and as it turns out, my PC shop sent me an *MSI 290X* instead of a regular 290. I don't know if it's Crossfire doing funky things to GPU-Z (the MSI Gaming card is the "slave"), but nope, it says right there: 2816 shaders
> 
> 
> 
> Also checked the box, clearly says AMD 290. Tell me I'm dreaming and slap me until I cry mommy. I just bought a 290 for $500 on impulse and got a 290X because MSI derped with QA.
> 
> Gonna install the MSI card solo and I'll keep you guys in the loop. Also, how do you tell the difference between a 290 and a 290X with benchmarks? I was under the impression they were really close.
> 
> If it really is a 290X then I have absolutely no idea what to do; it stutters like hell when paired with the 290.


Dang, I thought maybe you'd gotten a 290X. My 290X came in a 290 box, but the serials on the box and on the card say it's a 290X. Weird.


----------



## Gobigorgohome

Quote:


> Originally Posted by *cennis*
> 
> The Samsung U28D590D requires DisplayPort.
> 
> From my experience (and homecinema-PC's), cards flash black screens when pushed with high voltage over DP, but are totally stable on HDMI/DVI. Not sure why.


Hmm, so there are problems with using DP and high voltages at 4K? Darn it. I want 60Hz, so I guess I have to push it easy then.
Quote:


> Originally Posted by *Roboyto*
> 
> If you want to cut cost then run without the backplates. All my temps shown for my 290 are with a backplate installed, but no pads to make it useful as a heatsink. The XSPC and EK blocks get very similar VRM temps once you use a good thermal pad. Better to spend the $20 for Fujipoly Ultra Extreme than $80 on backplates.


I thought you had to use thermal pads between the backside of the PCB and the backplate so there would be no shorts (or however it is spelled). I might just add the backplate; I have to look a little at it first.


----------



## Roboyto

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I thought you had to use thermal pads between the backside of the PCB and the backplate so there would be no shorts (or however it is spelled). I might just add the backplate; I have to look a little at it first.


Nope, it's not necessary to have pads there for the backplate. You will still get far more performance from using the high-performance pads on the water block than from the passive backplate.


----------



## naved777

Will a 1200W PSU be enough for 3 OCed 290Xs?


----------



## Roboyto

Quote:


> Originally Posted by *naved777*
> 
> Will a 1200W PSU be enough for 3 OCed 290Xs?


Could be. The max I've seen benching is 360W with a very high OC under water: 1300/1700, +200mV & +50% power limit.

Depends on the rest of your system's draw and what you'll be doing.

At milder clocks and voltages when gaming I'm more in the 270W range: 1200/1500, +87mV & 0% power limit.
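If you want to sanity-check the math yourself, here's the budget arithmetic in Python, reading the figures above as per-card draw. The 250W allowance for the rest of the system is an assumption for illustration, not a measured number:

```python
def psu_headroom(psu_watts, cards, watts_per_card, system_watts=250):
    """Watts left over after the GPUs and an assumed rest-of-system draw."""
    draw = cards * watts_per_card + system_watts
    return psu_watts - draw

# Three cards at the ~360W extreme-bench figure blow past 1200W:
# 3 * 360 + 250 = 1330W total draw
print(psu_headroom(1200, 3, 360))   # -130 (over budget)

# At the milder ~270W gaming figure it fits with a little room:
# 3 * 270 + 250 = 1060W total draw
print(psu_headroom(1200, 3, 270))   # 140
```

Which is the long way of saying: fine for gaming clocks, tight to negative for heavy benching voltages.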


----------



## Gunderman456

Hi Roboyto,

I see you've been recommending the FFXIV Benchmark heavily. It looks like you can download it independently of owning the game, and according to you it will give a good indication of GPU overclock stability (keeping in mind nothing is foolproof, of course).

In that respect, I will download and try the Benchmark to stress my OCs since I'm tired of testing in BF4 only to lose all my stats when the game freezes due to unstable overclocks. Mind you that has not happened in a while; however I have run through a fresh set of extensive stability tests on CPU and RAM OCs using the new x264 v.5.0.1 which has only served to strengthen and improve my OCs on the CPU and RAM.

In so doing, I will be running my GPUs through the gauntlet again to see if I can benefit from all the work I have just completed on the CPU and RAM.

Again, thanks for suggesting FFXIV Benchmark which I will add to my arsenal of Benchmarks/Stressors.


----------



## phallacy

Quote:


> Originally Posted by *naved777*
> 
> Will a 1200W PSU be enough for 3 OCed 290Xs?


I run 3 290Xs that are quite OCed off an EVGA 1300W and it's been completely steady. The 4th one is powered by an EVGA 1000W along with the CPU, mobo, SSDs, fans, etc.


----------



## Roboyto

Quote:


> Originally Posted by *Gunderman456*
> 
> Hi Roboyto,
> 
> I see you've been recommending the FFXIV Benchmark heavily. It looks like you can download it independently of owning the game, and according to you it will give a good indication of GPU overclock stability (keeping in mind nothing is foolproof, of course).
> 
> In that respect, I will download and try the Benchmark to stress my OCs since I'm tired of testing in BF4 only to lose all my stats when the game freezes due to unstable overclocks. Mind you that has not happened in a while; however I have run through a fresh set of extensive stability tests on CPU and RAM OCs using the new x264 v.5.0.1 which has only served to strengthen and improve my OCs on the CPU and RAM.
> 
> In so doing, I will be running my GPUs through the gauntlet again to see if I can benefit from all the work I have just completed on the CPU and RAM.
> 
> Again, thanks for suggesting the FFXIV Benchmark, which I will add to my arsenal of Benchmarks/Stressors.


I'm sub'd to your build log and have been following it for some time now. I'm going to be re-exploring my CPU clocks to see if I can get 4.7GHz stable for benching due to your relentless tinkering with your clocks and settings.

FFXIV bench gives me the lowest stable clocks compared to every other "common" test, so it has been my go to for a little while now. If it doesn't crash in the midst of the bench, or artifact in the last scene, then odds are pretty good that you can load up just about any game and have at it.

I don't have BF4 however, so this will be good information to have from you. Keep me posted on your results.


----------



## Gunderman456

Will do!


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> Note: These are using Mantle.
> 
> *290 CFX, BF4 1080p 100% vs. 200% res scale Ultra NO AA*
> 
> 
> 
> 
> Spoiler: Look here for individual Core usage!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *290 CFX, BF4 1080p 100% vs. 200% res scale Ultra NO AA - HT OFF*
> HWiNFO causing severe stuttering/lagging in BF4. I use Open Hardware Monitor instead. I actually forgot about AB.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> For comparison, this is with DirectX but with different driver, either 13.11 or 13.12, that I posted in my previous post. This one I remember CPU almost bottleneck.


Kizwan, i've never used that tool, so i tried to decipher it but failed. what do the red and green colors represent? is one the core and the other the thread?


----------



## falcon26

Just played some BF4. On some maps and in some spots it got choppy. I tried Mantle and DX11. My 780 never got choppy, ever. This 290 also becomes quite the hairdryer under load, very very loud...


----------



## Widde

Just did a valley and heaven run for fun http://piclair.com/lxq52 http://piclair.com/86p7r


----------



## falcon26

Maybe it's poor coding on BF4's part. When I play BF3 it's smooth as butter on all maps. The card also doesn't get that loud or hot either...


----------



## heroxoot

Quote:


> Originally Posted by *falcon26*
> 
> Maybe it's poor coding on BF4's part. When I play BF3 it's smooth as butter on all maps. The card also doesn't get that loud or hot either...


It took BF3 ages to get running properly, too. Plus BF3 needs at most about a 7870 to max out, and the 290X soars far above that. BF4 is far harder on the GPU.


----------



## Dasboogieman

Woot
Crossfire vetting complete
No overheating or anything with both cards at +37mV. The Sapphire one gets to about 85 degrees on the VRMs at 45% fan speed, but the MSI one is doing even better.

Now I can use all the SSAA and high-res textures I want









The power supply (CM Silent Pro 1000M) is crying a bit though. I've actually never heard the fan spin up before, but that might be because the previous SLI 570s were loud as hell.


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Note: These are using Mantle.
> 
> *290 CFX, BF4 1080p 100% vs. 200% res scale Ultra NO AA*
> 
> 
> 
> 
> Spoiler: Look here for individual Core usage!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *290 CFX, BF4 1080p 100% vs. 200% res scale Ultra NO AA - HT OFF*
> HWiNFO causing severe stuttering/lagging in BF4. I use Open Hardware Monitor instead. I actually forgot about AB.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> For comparison, this is with DirectX but with different driver, either 13.11 or 13.12, that I posted in my previous post. This one I remember CPU almost bottleneck.
> 
> 
> 
> 
> 
> 
> 
> 
> Kizwan, i've never used that tool, so i tried to decipher it but failed. what do the red and green colors represent? is one the core and the other the thread?
Click to expand...

With that tool I can load two files at the same time to see the graphical difference between the two. On the top left, to the left of the filename fields, red is the 100% res scale run & green is the 200% res scale run. Also at the top right of each graph there is a label showing which core & thread it represents.
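The overlay described above boils down to lining up two time-series logs and comparing them series by series. Here's a small Python sketch of that comparison, assuming a simple CSV layout (one column per core, one row of usage percentages per sample); this layout is an assumption for illustration, not the tool's actual file format:

```python
import csv
import io

def per_core_average(log_text):
    """Average each per-core usage column from a CSV log.
    Assumed layout: header row of core names, then one row per sample."""
    rows = list(csv.reader(io.StringIO(log_text)))
    header, samples = rows[0], rows[1:]
    return {name: sum(float(r[i]) for r in samples) / len(samples)
            for i, name in enumerate(header)}

def compare_logs(log_a, log_b):
    """Pair the per-core averages from two logs
    (e.g. the 100% vs. 200% res scale runs)."""
    a, b = per_core_average(log_a), per_core_average(log_b)
    return {core: (a[core], b[core]) for core in a}
```

With two such logs loaded, each core's pair of numbers tells you at a glance whether raising the res scale shifted load off the CPU.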


----------



## Arizonian

Quote:


> Originally Posted by *Naxxy*
> 
> Could i be added to the club please?
> 
> Running a Sapphire 290X , Tri-X cooler.
> 
> GPU Z Validation
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## DeadlyDNA

Quote:


> Originally Posted by *Dasboogieman*
> 
> Woot
> Crossfire vetting complete
> No overheating or anything with both cards at +37mV. The Sapphire one gets to about 85 degrees VRM at 45% fan speed but the MSI one is doing even better.
> 
> Now I can use all the SSAA and high res textures I want
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Power Supply (CM Silent Pro 1000M) is crying a bit though, I've actually never heard the fan spin up before, though that might be because the previous SLI 570s were loud as hell.


haha, my PSU gets loud when I OC and bench. I never knew what people were talking about in the reviews for it. Apparently these R9 cards eat insane amounts of power.


----------



## Red1776

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> Woot
> Crossfire vetting complete
> No overheating or anything with both cards at +37mV. The Sapphire one gets to about 85 degrees VRM at 45% fan speed but the MSI one is doing even better.
> 
> Now I can use all the SSAA and high res textures I want
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Power Supply (CM Silent Pro 1000M) is crying a bit though, I've actually never heard the fan spin up before, though that might be because the previous SLI 570s were loud as hell.
> 
> 
> 
> haha my psu gets loud when I oc and bench. Never knew what people were talking about on the reviews for it. Apparently these r9 cards eat insane amounts of power.
Click to expand...

well, get a decent PSU... mine stays whisper quiet powering all 4 OC'd R9 290Xs... hehehe


----------



## DeadlyDNA

Quote:


> Originally Posted by *Red1776*
> 
> well get a decent PSU...mine stays whisper quiet powering all 4 oc'd R290X ...hehehe


PSU BRAND WARS! Quick, everyone jump in!

I might consider changing to dual PSUs if it's worth it. The sad thing is I just bought my Lepa, I think in Nov or Dec. :-(


----------



## heroxoot

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> well get a decent PSU...mine stays whisper quiet powering all 4 oc'd R290X ...hehehe
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PSU BRAND WARS! Quick everyone jump in!
> 
> I might consider changing to dual PSU if its worth it. Sad thing is i just bought my Lepa i think in nov, or dec. :-(
Click to expand...

I've been a fan of OCZ PSUs myself. On average they have lasted 5+ years before they start to have issues at all. The last one that broke was 6 years old; the PSU still functioned, but the 24-pin busted somewhere. Not modular, sadly. I wound up buying a new one of the same brand but a different model.


----------



## Red1776

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> well get a decent PSU...mine stays whisper quiet powering all 4 oc'd R290X ...hehehe
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PSU BRAND WARS! Quick everyone jump in!
> 
> I might consider changing to dual PSU if its worth it. Sad thing is i just bought my Lepa i think in nov, or dec. :-(
Click to expand...

hehe, how is the Lepa 1600 treating you anyway?


----------



## DeadlyDNA

So far so good, though I think it's getting pushed even harder now. I have had no issues yet, but I hit 1650W at the wall earlier, so I guess I still have room. The watercooling and fans are on a PSU by themselves. I do have 2 other PSUs floating around, 1200W and 1100W, that I could use, but the last time I used 2 PSUs on a 4-way SLI GTX 480 build I ended up melting some 12V leads on the 24-pin ATX connector. Ever since then I've been paranoid, although back then it was my fault for not using the added power connector for the PCIe ports. Is the add2psu adapter the item to get for 2 PSUs? Should I match the units by brand/wattage/model?


----------



## kizwan

Yup, you will need that adapter if you don't want to turn on the PSUs one by one yourself. No need to match wattage/brand.


----------



## Red1776

Quote:


> Originally Posted by *DeadlyDNA*
> 
> So far so good, though I think it's getting pushed even harder now. I have had no issues yet, but I hit 1650W at the wall earlier, so I guess I still have room. The watercooling and fans are on a PSU by themselves. I do have 2 other PSUs floating around, 1200W and 1100W, that I could use, but the last time I used 2 PSUs on a 4-way SLI GTX 480 build I ended up melting some 12V leads on the 24-pin ATX connector. Ever since then I've been paranoid, although back then it was my fault for not using the added power connector for the PCIe ports. Is the add2psu adapter the item to get for 2 PSUs? Should I match the units by brand/wattage/model?


The adapter is just so both PSUs get the 'wake up' power-on signal from the motherboard.


----------



## diggiddi

Quote:


> Originally Posted by *kizwan*
> 
> *HWiNFO causing severe stuttering/lagging in BF4. I use Open Hardware Monitor instead. I actually forgot about AB.*


This!! HWiNFO eats up a ton of CPU cycles and was causing stuttering in BF3


----------



## Sgt Bilko

Quote:


> Originally Posted by *diggiddi*
> 
> This!! Hwinfo eats up a ton of CPU cycles and was causing stuttering in BF3


Really?

Can't say I've ever noticed, tbh


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *diggiddi*
> 
> This!! Hwinfo eats up a ton of CPU cycles and was causing stuttering in BF3
> 
> 
> 
> Really?
> 
> Can't say i've ever noticed tbh
Click to expand...

For me it only happens if HT is OFF (in BF4), i.e. when the CPU is running with only 4 cores/threads.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> For me it only happens if HT is OFF (in BF4), i.e. when the CPU is running with only 4 cores/threads.


Ah, well no HT for me obviously









I have a tendency to have everything running at once when I game


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> For me it only happen if HT is OFF (in BF4); CPU is running with only 4 cores/threads.
> 
> 
> 
> 
> 
> 
> 
> Ah, well no HT for me obviously
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have a tendency to have everything running at once when i game
Click to expand...

It's not about HT per se. When the CPU is running with only 4 cores/threads, HWiNFO may suck the life out of your CPU in BF4.

I don't turn anything off when gaming either. I just did it to record CPU usage with Crossfire in BF4, to see what is likely to happen with a 4-core/4-thread CPU (e.g. an i5).


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> With that tool I can load two files at the same time to see graphical difference between the two. You can see on the top left, on the left of the filename fields, the red is for 100% res scale & the green is for 200% res scale. Also on the top right on each of the graph, there is a label showing which core & which thread it represent.


thanks, Kiz, pretty neat tool. even the cores of an i7 3820 with HT off can get up there with 2 290s. raising the res scale helps, though, which means using a higher res relieves the cpu quite a bit. you've got a hefty oc on the cpu and i think that, too, helps.


----------



## Dasboogieman

Quote:


> Originally Posted by *Red1776*
> 
> well get a decent PSU...mine stays whisper quiet powering all 4 oc'd R290X ...hehehe


Probs should, but mine was apparently the price-performance-reliability king back in the day. I was hoping to keep it for at least a decade, since I thought 1000W was waaaaay overkill to begin with (I had planned for tri-SLI 570s back then)... I guess I was wrong, since these 290s eat juice like anything.


----------



## Durvelle27

I haven't gotten an answer for this yet. Is it safe to use an unpowered x16-to-x16 riser with a 290X?


----------



## Red1776

Quote:


> Originally Posted by *Durvelle27*
> 
> I haven't gotten an answer for this yet. Is it safe to use an unpowered x16-to-x16 riser with a 290X?


Yes, it is safe to use a passive PCIe x16 riser with an R9 290, or any GPU for that matter. I have used passive risers for GPUs many times myself, mostly to make use of the ASUS Crosshair IV and V Formula-Z. Just some advice: stay away from the cheap risers, you get what you pay for.

here are two examples of my riser applications





Greg


----------



## Durvelle27

Quote:


> Originally Posted by *Red1776*
> 
> Yes, it is safe to use a passive PCIe x16 riser with an R9 290, or any GPU for that matter. I have used passive risers for GPUs many times myself, mostly to make use of the ASUS Crosshair IV and V Formula-Z. Just some advice: stay away from the cheap risers, you get what you pay for.
> 
> here are two examples of my riser applications
> 
> 
> 
> 
> 
> 
> 
> Greg


Did you see the pic of the riser I have?


----------



## Silent Scone

I've recently been able to get my hands on (courtesy of AMD's Roy Taylor) the Sapphire 290X Vapor-X 8GB. I will be putting it (possibly two) through its paces at some point in the next couple of weeks at 4K.


----------



## Red1776

Quote:


> Originally Posted by *Durvelle27*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Yes it is safe to use a passive Pcie X16 riser with a R290 or any GPU for that matter. I have used passive risers for GPU's many times myself . mostly to make use of the ASUS Crosshair IV and V -Z. just sme advice , stay away from the cheap risers, you get what you pay for.
> 
> here are two examples of my riser applications
> 
> 
> 
> 
> 
> Greg
> 
> 
> 
> did you see the pic of the riser i have
Click to expand...

I did, I was not taking a shot at it by any means. I don't know what brand it is or anything about it.

I've just learned from experience that cheap passive risers are more trouble than they are worth. So if you have a good one, great!


----------



## Durvelle27

Quote:


> Originally Posted by *Red1776*
> 
> I did, I was not taking a shot at it by any means. I don't know what brand it is or anything about it.
> I just have learned from experience that cheap passive risers are more trouble than they are worth. So if you have a good one great!


Here's a link for it

http://www.ebay.com/itm/PCI-Express-PCI-E-16X-Riser-Card-Flex-Flexible-Ribbon-Extender-Extension-Cable-/400504001453?ru=http%3A%2F%2Fwww.ebay.com%2Fsch%2Fi.html%3F_sacat%3D0%26_from%3DR40%26_nkw%3D400504001453%26_rdc%3D1&nma=true&si=aXaw1B36Pw2HNhsOJn%252B9wq9EBHY%253D&orig_cvip=true&rt=nc&_trksid=p2047675.l2557


----------



## piquadrat

Quote:


> Originally Posted by *aneutralname*
> 
> IS there any app that will show you VRM temperatures on screen when you are gaming? I couldn't find any so far.


Riva tuner statistic server + HWiNFO


----------



## Red1776

Quote:


> Originally Posted by *Durvelle27*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> I did, I was not taking a shot at it by any means. I don't know what brand it is or anything about it.
> I just have learned from experience that cheap passive risers are more trouble than they are worth. So if you have a good one great!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here's a link for it
> 
> http://www.ebay.com/itm/PCI-Express-PCI-E-16X-Riser-Card-Flex-Flexible-Ribbon-Extender-Extension-Cable-/400504001453?ru=http%3A%2F%2Fwww.ebay.com%2Fsch%2Fi.html%3F_sacat%3D0%26_from%3DR40%26_nkw%3D400504001453%26_rdc%3D1&nma=true&si=aXaw1B36Pw2HNhsOJn%252B9wq9EBHY%253D&orig_cvip=true&rt=nc&_trksid=p2047675.l2557
Click to expand...

I tried buying a couple of the cheaper $10 risers off eBay and had nothing but problems.

If you run into the same problem, check this company out. I paid $60.00 for my x16 riser, but it is high quality and worked perfectly.

http://www.adexelec.com/pciexp.htm


----------



## Durvelle27

Quote:


> Originally Posted by *Red1776*
> 
> I tried buying a couple of the cheaper $10 risers off eBay and had nothing but problems.
> If you run into the same problem, check this company out. I paid $60.00 for my x16 riser, but it is high quality and worked perfectly.
> 
> http://www.adexelec.com/pciexp.htm


What kind of problems?


----------



## Red1776

Quote:


> Originally Posted by *Durvelle27*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> I tried buying a couple of the cheaper $10 risers off eBay and had nothing but problems.
> If you run into the same problem, check this company out. I paid $60.00 for my x16 riser, but it is high quality and worked perfectly.
> 
> http://www.adexelec.com/pciexp.htm
> 
> 
> 
> 
> 
> What kind of problems
Click to expand...

The first one just flat out did not work and did not register as the 4th card. The second one would cut out and not register on about every other start-up.

I finally broke down and purchased the ADEX and had no problems at all.


----------



## Gobigorgohome

How high of an overclock am I looking at before DisplayPort starts to give me trouble with 2x Sapphire Radeon R9 290X in Crossfire?

I have not bought the cards yet, but I will order as soon as the paycheck comes, and will buy the water cooling at the same time.


----------



## cennis

Quote:


> Originally Posted by *Gobigorgohome*
> 
> How high of an overclock am I looking at before the Displayport will start to give me trouble with 2x Sapphire Radeon R9 290X's in crossfire?
> 
> I have not bought the cards yet, but will order as soon as the paycheck comes, will buy the water cooling at the same time.


my cards need +175mV to run 1200/1500,
and I can only do +75mV or less, which results in 1100/1500.

It's not the clock speed, it's the voltage.

i.e. +175mV at stock speeds still gives the flashing black screen


----------



## rdr09

Quote:


> Originally Posted by *Gobigorgohome*
> 
> How high of an overclock am I looking at before the Displayport will start to give me trouble with 2x Sapphire Radeon R9 290X's in crossfire?
> 
> I have not bought the cards yet, but will order as soon as the paycheck comes, will buy the water cooling at the same time.


you are talking about reference 290Xs, right? i would suggest delaying the waterblock. test the gpus first (for about a week), find out which one OCs better than the other on air, and make that the primary card. if both test fine, then get the blocks for them.


----------



## Scorpion49

Hey guys, I just got a new Sapphire R9 290 Vapor-X, and I'm having a bit of an issue. Sometimes when watching a video, min/maxing a window, or sometimes in a game switching between loading screens I get a couple of horizontal lines across the screen very briefly, for maybe one frame. I'm worried that the card may not be stable at its stock 1030/1400 clocks. Anyone else seen something like this? I did not have it with any of the other 290's I've had, and I am using the 14.4 WHQL drivers.


----------



## fateswarm

When I get a tri-x 290, should I buy a little fan pointing at the back of the pcb underneath the vrm mosfets?


----------



## VSG

Not really, that Tri-X cooler is really good for a stock cooler. If you are sticking to air, there isn't much that cooler can't handle (other than crazy volts that is).


----------



## fateswarm

But everyone is complaining about VRM temps.

Oh, they're on water?


----------



## VSG

The complaints are from stock coolers or 3rd party coolers that paid no attention to the VRMs. Sapphire has done an excellent job with their Tri-X (along with Vapor-X and the teased Toxic) cooler as did Powercolor with their PCS and MSI with the Lightning.


----------



## fateswarm

Alright, thanks. I guess I did a good job changing my mind at the last moment.

Though I had mainly set my mind on a 780.

I later found a Tri-X for €100 less.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Red1776*
> 
> I did, I was not taking a shot at it by any means. I don't know what brand it is or anything about it.
> I just have learned from experience that cheap passive risers are more trouble than they are worth. So if you have a good one great!


Quote:


> Originally Posted by *Durvelle27*
> 
> Here's a link for it
> 
> http://www.ebay.com/itm/PCI-Express-PCI-E-16X-Riser-Card-Flex-Flexible-Ribbon-Extender-Extension-Cable-/400504001453?ru=http%3A%2F%2Fwww.ebay.com%2Fsch%2Fi.html%3F_sacat%3D0%26_from%3DR40%26_nkw%3D400504001453%26_rdc%3D1&nma=true&si=aXaw1B36Pw2HNhsOJn%252B9wq9EBHY%253D&orig_cvip=true&rt=nc&_trksid=p2047675.l2557


Quote:


> Originally Posted by *Red1776*
> 
> I tried buying a couple of the cheaper $10 risers off eBay and had nothing but problems.
> If you run into the same problem, check this company out. I paid $60.00 for my x16 riser but it is high quality and worked perfectly.
> 
> http://www.adexelec.com/pciexp.htm


Quote:


> Originally Posted by *Durvelle27*
> 
> What kind of problems


Red1776 is giving sound advice on this; I can somewhat back it up.

I bought a couple of risers around two years back, when I wanted to run quad CrossFire and SLI on a 3-way Classified mobo. The first one I bought was like the one you linked from eBay, but I found out quickly it was not PCIe 2.0 compliant: whenever I connected a 2.0-capable GPU to a 2.0 slot through it, the system BSOD'd.

I then bought one just like the one Red has in his pic. It worked great except for one issue, which didn't stop me from using it. You will notice the shielding on the one Red is recommending; if you peel it back, the wiring is higher quality. The one you linked from eBay uses wiring that's basically like the old FDD and IDE cables. In fact, I think (not entirely sure) the PCIe 2.0 riser Red shows has two wires per connection.

Anyway, the only issue I had with my PCIe 2.0 riser like Red's was that I had it on my desk near my wireless router, and when I was gaming it emitted interference that interrupted my router's signal. It literally took me a week to figure this out; at first I thought my router was failing. So word to the wise: if you're going to operate it near a router, it may interfere, say within 6-10 feet.


----------



## Hl86

The problem is that the VRM overheats before the core does, sadly.


----------



## aNoN_

Hey guys. Just got myself a Sapphire 290 reference w/ stock cooler. I am just starting my overclocking and stumbled upon some issues.

After raising the core clock to 1100 and the power limit to 50% I ran Unigine Valley; temps were fine, not exceeding 80C with my current fan-speed profile. But the clock speed fluctuates up and down when the GPU is stressed, and when fps goes up again the clock is raised to 1100 again. What is causing this? I had the same problem with my old 7950.



My afterburner settings:


Thanks in advance.


----------



## rdr09

Quote:


> Originally Posted by *Hl86*
> 
> The problem is that the VRM overheats before the core does, sadly.


not when watercooled - see post #24018:

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/24010


----------



## DeadlyDNA

Not sure how this works, but I have both 290s and 290Xs now.

My 290Xs are in the machine though, and that's what I changed over from the 290s, so here she is:

4x 290x Sapphire all Hynix Watercooled on EK copper blocks


Spoiler: Warning: Spoiler!






Update meh!


----------



## heroxoot

Got a call from MSI today. The fan problem has been escalated to their main HQ now. I think it's definitely a BIOS problem, but the guy who called me had no specific information; understandably, he doesn't want to make an assumption. Either way, I'm happy to see my problem is acknowledged and it's not just me.


----------



## b0x3d

Quote:


> Originally Posted by *aneutralname*
> 
> Wouldn't a fan on the bottom also oppose the air that's being pulled out of the card?
> 
> I have searched online about the side panel fan issue, and it appears to be quite divided whether the side fan should be intake or exhaust, with a slight majority opining intake.
> 
> After reading your post I tried multiple configurations, and having the fan on the side panel as intake made the card 2 degrees cooler than placing it on the bottom or setting it as exhaust on the side. A current of fresh air straight into the card is probably hard to beat.


My case didn't come with a side fan. My previous 6970s blew hot air out the rear, so the exhaust worked perfectly. However, my Windforce 290Xs blow a load of hot air out the side, and I couldn't have the side panel on without them overheating (going to 94 degrees and downclocking), despite having the 120mm exhaust fan on the back of the cards, as well as a 140mm rear exhaust and 3x 120mm in the top.

The first thing that made a noticeable difference was adding 2x 120mm fans on the HDD bay blowing air onto the end of the cards. I also tried adding a bottom fan, but it didn't do anything. Even with these additional fans the cards were overheating when I put the side panel on. So I drilled a hole in the side panel with a jigsaw and installed another 140mm fan. I thought exhaust would be best to suck the hot air out from the cards, but it didn't make any difference at all; the cards were still overheating. I'd almost given up and resigned myself to leaving the side panel off - without it they max out at 87 degrees, so still pretty good. Anyway, I tried one last thing: I turned the side fan around to intake, blowing air into the card. I was not expecting it to work, but it did! Temps stabilised at 75-80 degrees (depending on game). I couldn't believe how much of a difference it made.

The other problem I had was that I'd upgraded from an H70 with the rad mounted on the rear to an H100i with the rad mounted on the top. The hot air was blowing up through the rad and causing the CPU to overheat (60 degrees plus). So I installed the rad on the HDD bay in push-pull config facing the gfx cards. This keeps air blowing on the cards and also keeps the CPU cooler away from the hot air, and now the CPU temp stays below 50 degrees. Finally I can have the side panel on whilst keeping both gfx cards and CPU at respectable temps - it's taking 10 fans to do it though - 16 if you include the 6 on the cards!


----------



## Gobigorgohome

Quote:


> Originally Posted by *cennis*
> 
> my cards need +175mV to run 1200/1500,
> and I can only do +75mV or less, which results in 1100/1500.
> 
> It's not the clock speed, it's the voltage.
> 
> i.e. +175mV at stock speeds still gives the flashing black screen


Oh, that's bad. I see; that is kind of sad. I have the flashing black screen in BF4 on my GTX 660 Ti SLI - is this the same problem?
Quote:


> Originally Posted by *rdr09*
> 
> you are talking about reference 290Xs, right? i would suggest delaying the waterblock. test the gpus first (for about a week), find out which one OCs better than the other on air, and make that the primary card. if both test fine, then get the blocks for them.


I was talking about reference 290X's, yes. The only reason I would go for waterblocks is the temps; I do not think 90C+ is good. Maybe I should throw the extra money into a better aftermarket cooler instead?

My only question is: is the MSI Lightning R9 290X worth the money? (Taking price, temperature and flicker-free 4K performance into account.)


----------



## rdr09

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Oh, that's bad. I see; that is kind of sad. I have the flashing black screen in BF4 on my GTX 660 Ti SLI - is this the same problem?
> I was talking about reference 290X's, yes. The only reason I would go for waterblocks is the temps; I do not think 90C+ is good. Maybe I should throw the extra money into a better aftermarket cooler instead?
> 
> *the best aftermarket cooler is a waterblock. stay the course.*
> 
> My only question is: is the MSI Lightning R9 290X worth the money? (Taking price, temperature and flicker-free 4K performance into account.)


i'll pass this to a Lightning owner.


----------



## Gobigorgohome

Quote:


> Originally Posted by *rdr09*
> 
> i'll pass this to a Lightning owner.


Okay, where are they? I would love feedback from someone running crossfired MSI Lightning R9 290Xs at 4K via DisplayPort.

Anyway, I do not know in which games the flickering black screen appears, but I am currently having that issue at stock clocks and stock voltage on my 2x GTX 660 Tis in SLI, so that tells me this is not a card problem, more a DisplayPort-at-4K problem.

In which games do you run into the black-screen flickering problem? I would like to know exactly which titles, and I would really like that question answered before pushing the buy button on cards that will cost me 1700 USD. I can do air cooling, no problem; I will be water cooling my 3930K anyway, so adding a couple of radiators is no problem if I decide to go water cooling in a little while.

Understand me right: I just want something that works well and can keep a "low" temperature in a closed space (low is relative, but temps under 80 degrees Celsius, which the Asus DCU II OC did not manage).

I'll throw in another question: how much of a performance hit will I get from running the cards at x16 and x8 instead of x16 and x16? (I am on an Asus Rampage IV Gene with slots one and two at x16; the third slot is x8.)


----------



## VSG

I would check here: http://www.overclock.net/t/1472341/official-msi-r9-290x-lightning-thread/0_50


----------



## Gobigorgohome

Quote:


> Originally Posted by *geggeg*
> 
> I would check here: http://www.overclock.net/t/1472341/official-msi-r9-290x-lightning-thread/0_50


I have posted there now, thanks.


----------



## Scorpion49

Does anyone get anything like this on their card?

Sig rig is accurate, I don't have any issue with my spare HD 5770 at all. These lines appear only during certain times:

- When mousing over video/picture icons
- When mousing over edges of a window that can be interacted with to make larger or smaller
- When initializing a game, not during play

It is driving me nuts, I'm trying to figure out if it is a windows issue or the new R9 290 Vapor-X causing the problem.


----------



## aneutralname

This is so far the only downsampling method I've found that works with the R9 290. It only downsamples from 1440p for me, but it's better than nothing. Share if you know other methods.

http://www.overclock.net/t/1459584/amd-downsampling/10#post_22404939


----------



## aneutralname

Quote:


> Originally Posted by *Scorpion49*
> 
> Does anyone get anything like this on their card?
> 
> Sig rig is accurate, I don't have any issue with my spare HD 5770 at all. These lines appear only during certain times:
> 
> - When mousing over video/picture icons
> - When mousing over edges of a window that can be interacted with to make larger or smaller
> - When initializing a game, not during play
> 
> It is driving me nuts, I'm trying to figure out if it is a windows issue or the new R9 290 Vapor-X causing the problem.


Looks totally like a video card issue. Try increasing the voltage or lowering the clocks; I get similar lines when I overclock or undervolt too much. Different drivers could also solve the problem. Try getting rid of third-party applications like Afterburner, etc.


----------



## Scorpion49

Quote:


> Originally Posted by *aneutralname*
> 
> Looks totally like a video card issue. Try increasing the voltage or lowering the clocks. I get similar lines when I overclock or undervolt too much.


This is fresh out of the box and into the machine, no overclock. If it isn't stable stock then it is going to get returned I guess.


----------



## aaroc

Quote:


> Originally Posted by *Red1776*
> 
> I tried buying a couple of the cheaper $10 risers off eBay and had nothing but problems.
> If you run into the same problem, check this company out. I paid $60.00 for my x16 riser but it is high quality and worked perfectly.
> 
> http://www.adexelec.com/pciexp.htm


Which model from that list, and in what length, did you buy? I want one for quad CrossFireX on a CHVFZ.


----------



## aneutralname

Quote:


> Originally Posted by *Scorpion49*
> 
> This is fresh out of the box and into the machine, no overclock. If it isn't stable stock then it is going to get returned I guess.


Try reinstalling the drivers and/or shutting off the system entirely and letting it rest. Mine went crazy on me like that when I overclocked the memory too much. If it persists for a week or so, I would RMA it.


----------



## alancsalt

Quote:


> Originally Posted by *Scorpion49*
> 
> Quote:
> 
> 
> 
> Originally Posted by *aneutralname*
> 
> Looks totally like a video card issue. Try increasing the voltage or lowering the clocks. I get similar lines when I overclock or undervolt too much.
> 
> 
> 
> This is fresh out of the box and into the machine, no overclock. If it isn't stable stock then it is going to get returned I guess.
Click to expand...

Do a full uninstall of all graphics card drivers of any kind, use a third party driver cleaner, and only then reinstall the right drivers?


----------



## Roboyto

Quote:


> Originally Posted by *Scorpion49*
> 
> Hey guys, I just got a new Sapphire R9 290 Vapor-X, and I'm having a bit of an issue. Sometimes when watching a video, min/maxing a window, or sometimes in a game switching between loading screens I get a couple of horizontal lines across the screen very briefly, for maybe one frame. I'm worried that the card may not be stable at its stock 1030/1400 clocks. Anyone else seen something like this? I did not have it with any of the other 290's I've had, and I am using the 14.4 WHQL drivers.


What version of Windows are you running? I had nothing but issues with 14.X drivers while running Win7 64 Ultimate.

I have since switched to 8.1 Pro and 14.6 Beta have been very good to me.

Did you properly remove your old drivers before upgrading to 14.4? DDU or AMD Clean Uninstall and a registry cleaning are highly recommended.

Quote:


> Originally Posted by *Durvelle27*
> 
> What kind of problems


I have no personal experience with them, but I have read on several occasions that the cheapies will be nothing but trouble, as @Red1776 is stating. The most recent thing I read was that epic wall-mount build from a few months back; 3M risers were the way to go in that instance. http://www.overclock.net/t/1424387/gallery-build-log-ultimate-wall-mount-rig-maxxplanck-v2-completed

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Anyways, the flickering black screen I do not know in which games that appear, but I am currently having that issue with "stock clocks and stock voltage" on my 2x GTX 660 Ti's in SLI so that tell me that this is not a card-problem, more a displayport at 4K problem.
> 
> In which games do you guys run into the problem of black flickering screen? I would like to know exactly which titles it is. And I would really like that question answered before pushing the buy button on cards that will cost me 1700 USD. I can do air-cooling, no problem, I will be doing water cooling for my 3930K anyways, so adding a couple of radiators is no problem if I find out I will do water cooling in a little while.
> 
> Understand me right, I just want something that works good and can keep a "low" temperature in close space. (low temperature is relative, but temps under 80 degree celsius which the Asus DCU II OC did not clear).
> 
> I just throw in another question, how much of a performance hit will I get of running the cards in x16 and x8, instead of x16 and x16 (I am on a Asus Rampage IV Gene with slot one and two as x16, the third slot is x8.


Under normal circumstances, flickering/black-screen issues are caused by unstable RAM OCs on the 290(X) cards. The 4K/DP scenario I don't have experience with.

The ASUS DCII has a serious flaw with the cooler. ASUS was extremely lazy and took the cooler from the GK104-based 780 and slapped it on the 290(X). The problem is that only 3 of 5 heatpipes make contact with the GPU die. The temps are passable when compared to reference, but could honestly be significantly better if they had exerted a little effort.

If you want cooling that works well in a small space with Xfire, then the only suggestion I can make is full cover blocks. Air cooling a pair of these is a challenge regardless of what cooler you have, especially if you enjoy overclocking.

A 240mm rad for each 290(X) would probably be the minimum, depending on your temperature goals, plus radiator space for your 3930K. You will probably want to check out @Gunderman456's build log, or shoot him a PM, for loads of Xfire information. http://www.overclock.net/t/1442038/build-log-the-hawaiian-heat-wave

Performance will not suffer with x16 and x8 from what I have seen in this thread.
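To see why x8 rarely hurts, the raw link bandwidth can be worked out from the per-lane rates in the PCIe spec. This is only a back-of-envelope sketch (real game traffic uses a fraction of the available bandwidth, which is why benchmarks show little difference):

```python
# Back-of-envelope PCIe bandwidth comparison. Per-lane figures follow the
# PCIe spec: gen2 runs at 5 GT/s with 8b/10b encoding (0.5 GB/s per lane),
# gen3 at 8 GT/s with 128b/130b encoding (~0.985 GB/s per lane).
def effective_gb_per_s(lanes, gen=3):
    per_lane = {2: 0.5, 3: 8 * 128 / 130 / 8}[gen]  # GB/s per lane
    return lanes * per_lane

x16 = effective_gb_per_s(16)
x8 = effective_gb_per_s(8)
print(f"x16: {x16:.1f} GB/s, x8: {x8:.1f} GB/s")
```

So an x8 gen3 slot still offers roughly 7.9 GB/s each way, far more than a single 290(X) typically pushes over the bus.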

Tons of info for 290(X) before you purchase to be found here: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781

There are a couple specific links you will want to check out within ^ that post.


The Kraken G10 cooling from my 250D build, since you were wondering about alternatives to full-blown water.
A VRM thermal pad upgrade is strongly suggested with full cover blocks, so you'll want to take a peek at my post and @kizwan's, which I've linked within mine.

My opinion on the Lightning is that it is great if you plan on extreme cooling. Otherwise the $ can be wasted if you are expecting higher-than-average OCs, as the silicon lottery plays a bigger role in OCability with 'normal' forms of cooling.


----------



## aNoN_

Hey guys. Just got myself a Sapphire 290 reference w/ stock cooler. I am just starting my overclocking and stumbled upon some issues.

After raising the core clock to 1100 and the power limit to 50% I ran Unigine Valley; temps were fine, not exceeding 80C with my current fan-speed profile. But the clock speed fluctuates up and down when the GPU is stressed, and when fps goes up again the clock is raised to 1100 again. What is causing this? I had the same problem with my old 7950.



My afterburner settings:


Thanks in advance.


----------



## heroxoot

It's either the program or the driver causing your clock to fluctuate. When I benchmark I see it pegged constantly at my clock, but when I play a game it likes to sit at 1029.9MHz instead of the 1030MHz default clock. It does not seem to affect actual performance.


----------



## pdasterly

Remove MSI AB and just use TriXX. I have pushed my cards far, with VRM1 temps the only thing holding me back. Sapphire reference cards.

http://www.techpowerup.com/forums/threads/hawaii-290x-p-overclocking-guide.199077/


----------



## heroxoot

Quote:


> Originally Posted by *pdasterly*
> 
> Remove MSI AB and just use TriXX. I have pushed my cards far, with VRM1 temps the only thing holding me back. Sapphire reference cards.
> 
> http://www.techpowerup.com/forums/threads/hawaii-290x-p-overclocking-guide.199077/


I have no issues using MSI AB on the newest version. I dislike that TriXX doesn't let me set 2D and 3D profiles for automatic switching.


----------



## Mega Man

Quote:


> Originally Posted by *aaroc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> I tried buying a couple of the cheaper $10 risers off eBay and had nothing but problems.
> If you run into the same problem, check this company out. I paid $60.00 for my x16 riser but it is high quality and worked perfectly.
> 
> http://www.adexelec.com/pciexp.htm
> 
> 
> 
> 
> 
> 
> Which model from that list, and in what length, did you buy? I want one for quad CrossFireX on a CHVFZ
Click to expand...

You can talk to them and they will tell you what length is acceptable for PCIe 2.0 and/or 3.0 if you are interested. Mine also cost $60, but they are awesome; they are all made by hand to order, and hand-tested.


----------



## Red1776

Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *aaroc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> I tried buying a couple of the cheaper $10 risers off eBay and had nothing but problems.
> If you run into the same problem, check this company out. I paid $60.00 for my x16 riser but it is high quality and worked perfectly.
> 
> http://www.adexelec.com/pciexp.htm
> 
> 
> 
> 
> 
> Which model from that list, and in what length, did you buy? I want one for quad CrossFireX on a CHVFZ
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> you can talk to them and they will tell you what length is acceptable for pcie 2.0 and or 3.0 if you are interested, mine also cost 60, but they are awesome. they are all made by hand to order, and hand tested
Click to expand...

Right, they will make a custom length for you. I ordered a 4" cable and was using it for a CHVFZ as well. I also ordered one a few years ago for the Crosshair IV. They worked beautifully.


----------



## Durvelle27

Ok, I see what you mean. The first problem I encountered was that the PC wouldn't boot when I set it to PCIe 3.0, but it did OK with PCIe 2.0.


----------



## Arizonian

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Not sure how this works but i have both 290s and 290x's now.
> 
> My 290x's are in the machine though and thats what i changed over from 290's so here she is:
> 
> 4x 290x Sapphire all Hynix Watercooled on EK copper blocks
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Update meh!


Congrats - updated


Reminder
Any members who haven't yet done so please submit submission proof per OP and join the roster.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Durvelle27*
> 
> Ok, I see what you mean. The first problem I encountered was that the PC wouldn't boot when I set it to PCIe 3.0, but it did OK with PCIe 2.0.


Yep, PCIe compliance is the big thing with those. Now that the AMD R9 series doesn't require CrossFire bridges, I am surprised these don't sell better. Think about it: you could run 3 or 4 290s/290Xs on stock coolers and use risers to separate the cards so they can breathe. I always wanted to do a custom build where the GPUs mount on the outside wall of the case, so they are visible to people who come and look at the machine. Some video cards look really nice, and you can keep air cooling, though then they will be noisier. I think it would look cool as hell. Hell, I may do this on my secondary PC. I really fell in love with the way the GTX 480 looked. Sometimes they look so good they should be seen!


----------



## Red1776

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Durvelle27*
> 
> Ok, I see what you mean. The first problem I encountered was that the PC wouldn't boot when I set it to PCIe 3.0, but it did OK with PCIe 2.0.
> 
> 
> 
> Yep, PCIe compliance is the big thing with those. Now that the AMD R9 series doesn't require CrossFire bridges, I am surprised these don't sell better. Think about it: you could run 3 or 4 290s/290Xs on stock coolers and use risers to separate the cards so they can breathe. I always wanted to do a custom build where the GPUs mount on the outside wall of the case, so they are visible to people who come and look at the machine. Some video cards look really nice, and you can keep air cooling, though then they will be noisier. I think it would look cool as hell. Hell, I may do this on my secondary PC. I really fell in love with the way the GTX 480 looked. Sometimes they look so good they should be seen!
Click to expand...

I always wondered, though, how much latency using the risers adds.


----------



## DeadlyDNA

Quote:


> Originally Posted by *Red1776*
> 
> I always wondered, though, how much latency using the risers adds.


I have no clue; I never checked into that with the one I bought. It was only a 4-inch like yours, though.

I have 3 of them, and I don't remember why.
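For a rough sense of scale, the propagation delay a short riser adds can be estimated from the cable length alone. This is a back-of-envelope sketch: the 0.6c velocity factor is an assumed typical value for ribbon cable, and at these lengths signal integrity matters far more than the added latency.

```python
# Rough estimate of the extra one-way propagation delay a riser adds.
# Assumes signals travel at ~0.6c in the cable (assumed velocity factor;
# the exact value depends on the cable's dielectric).
C = 299_792_458          # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.6    # assumed fraction of c in the cable

def riser_delay_ns(length_inches):
    length_m = length_inches * 0.0254
    return length_m / (C * VELOCITY_FACTOR) * 1e9

print(f"4-inch riser: ~{riser_delay_ns(4):.2f} ns extra each way")
```

That works out to roughly half a nanosecond for a 4-inch cable, which is negligible next to the microsecond-scale round trips of PCIe transactions.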


----------



## rt123

Quote:


> Originally Posted by *aNoN_*
> 
> Hey guys. Just got myself a Sapphire 290 reference w/ stock cooler. I am just starting my overclocking and stumbled upon some issues.
> 
> After raising the core clock to 1100 and the power limit to 50% I ran Unigine Valley; temps were fine, not exceeding 80C with my current fan-speed profile. But the clock speed fluctuates up and down when the GPU is stressed, and when fps goes up again the clock is raised to 1100 again. What is causing this? I had the same problem with my old 7950.
> 
> 
> 
> My afterburner settings:
> 
> 
> Thanks in advance.


While it might not help your issue, I highly suggest getting rid of that "force constant voltage" tick, and maybe set unofficial overclocking mode WITH PowerPlay support for 24/7 use outside of OCing. Otherwise you might kill your GPU faster, especially with the first one.

As for your issue, maybe your OC needs more voltage - or what wattage PSU are you running?

Also, this clock fluctuation might be from the brief moment when the screen goes black as Valley changes to a new scene; there's no need to run max clocks during a scene change.


----------



## b0x3d

I have CF 290Xs - does anyone know if I could add my spare 6970 to the mix for a kind of hybrid tri-fire? Would I need to bridge it? What sort of performance boost would I get?


----------



## Sgt Bilko

Quote:


> Originally Posted by *b0x3d*
> 
> I have CF 290Xs - does anyone know if I could add my spare 6970 to the mix for a kind of hybrid tri-fire? Would I need to bridge it? What sort of performance boost would I get?


No, you can't do that; it will not work.


----------



## heroxoot

Quote:


> Originally Posted by *b0x3d*
> 
> I have CF 290Xs - does anyone know if I could add my spare 6970 to the mix for a kind of hybrid tri-fire? Would I need to bridge it? What sort of performance boost would I get?


Why would you even ask if you could crossfire 2 identical GPUs with one of a completely different make? On top of that, you want to crossfire a card that uses a bridge with 2 cards that do not use a bridge.

You could use the 6970 for mining maybe.


----------



## kizwan

Quote:


> Originally Posted by *aNoN_*
> 
> Hey guys. Just got myself a Sapphire 290 reference w/ stock cooler. I am just starting my overclocking and stumbled upon some issues.
> 
> After raising the core clock to 1100 and the power limit to 50% I ran Unigine Valley; temps were fine, not exceeding 80C with my current fan-speed profile. But the clock speed fluctuates up and down when the GPU is stressed, and when fps goes up again the clock is raised to 1100 again. What is causing this? I had the same problem with my old 7950.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> My afterburner settings:
> 
> 
> Thanks in advance.


Try unticking voltage monitoring. Just because there are checkboxes doesn't mean you need to tick all of them.

BTW, MSI AB v3 (non-beta) is now available.


----------



## Dasboogieman

Quote:


> Originally Posted by *aNoN_*
> 
> Hey guys. Just got myself a Sapphire 290 reference w/ stock cooler. I am just starting my overclocking and stumbled upon some issues.
> 
> After raising the core clock to 1100 and the power limit to 50% I ran Unigine Valley; temps were fine, not exceeding 80C with my current fan-speed profile. But the clock speed fluctuates up and down when the GPU is stressed, and when fps goes up again the clock is raised to 1100 again. What is causing this? I had the same problem with my old 7950.
> 
> 
> 
> My afterburner settings:
> 
> 
> Thanks in advance.


Any particular reason you need the unofficial overclocking mode?
I presume it is because you want constant load clocks. However, this is also the cause of your clock fluctuation.

I posted about this a while back; basically, my conclusion was not to stress over the fact that the clocks ramp up and down. AMD actually did a good job with this (i.e. tying it to the load appropriately), and it is pretty much as responsive as Intel's Turbo Boost. You can overclock with TriXX or MSI Afterburner, it doesn't matter (obviously ruling out Overdrive), as long as unofficial mode is off. If you want constant clocks for some reason (e.g. extreme benching), then consider the ASUS PT1T BIOS, which forces the chip to operate at a minimum of 1000MHz. To be honest, I didn't notice a difference compared to the automatic clocking scheme, so I don't think it's worth it.
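To illustrate the behaviour described above, here is a toy model of load-proportional clock scaling. This is NOT AMD's actual PowerTune algorithm, just a sketch of why the reported clock dips when demand momentarily drops (e.g. during a Valley scene change) and snaps back when load returns:

```python
# Toy model of load-proportional GPU clock scaling. The base and max
# clocks are assumed example values, not any card's real P-states.
BASE_MHZ = 300    # assumed idle clock
MAX_MHZ = 1100    # overclocked target

def target_clock(load):
    """Scale the clock linearly with GPU load (0.0 to 1.0)."""
    load = max(0.0, min(1.0, load))  # clamp to valid range
    return BASE_MHZ + (MAX_MHZ - BASE_MHZ) * load

# During a scene change, load briefly drops, so the clock dips too:
for load in (1.0, 0.2, 1.0):
    print(f"load {load:.0%} -> {target_clock(load):.0f} MHz")
```

The point of the sketch is that the dips are the governor responding to demand, not instability: full load always maps back to the full overclocked frequency.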


----------



## b0x3d

Quote:


> Originally Posted by *heroxoot*
> 
> Why would you even ask if you could crossfire 2 identical GPUs with one of a completely different make? On top of that, you want to crossfire a card that uses a bridge with 2 cards that do not use a bridge.


Thought maybe I could put it in slot 4 and bridge it to the second 290X in slot 3 for a bit of extra power. Thanks for putting me right; I'll leave it on eBay!


----------



## b0x3d

Quote:


> Originally Posted by *Red1776*
> 
> well get a decent PSU...mine stays whisper quiet powering all 4 oc'd R290X ...hehehe


What PSU you got?


----------



## b0x3d

Quote:


> Originally Posted by *falcon26*
> 
> Just played some BF4. In some maps and parts it got choppy. I tried mantle and DX11. My 780 never got choppy ever. This 290 also becomes quite the hairdryer on load very very loud...


Why did you swap a 780 for a 290?


----------



## Red1776

Quote:


> Originally Posted by *b0x3d*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> well get a decent PSU...mine stays whisper quiet powering all 4 oc'd R290X ...hehehe
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What PSU you got?
Click to expand...

I have three in this build.

1 x Corsair AX 1200W

2 x FSP Group X5 500W each

Total= 2200W


----------



## heroxoot

Quote:


> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *b0x3d*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> well get a decent PSU...mine stays whisper quiet powering all 4 oc'd R290X ...hehehe
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What PSU you got?
> 
> Click to expand...
> 
> I have three in this build.
> 1 x Corsair AX 1200W
> 2 x FSP Group X5 500W each
> 
> Total= 2200W
Click to expand...

You could run the city with your PC.


----------



## b0x3d

Quote:


> Originally Posted by *Red1776*
> 
> I have three in this build.
> 1 x Corsair AX 1200W
> 2 x FSP Group X5 500W each
> 
> Total= 2200W


Cool - just looking at your pics - why have you got bridges over the gfx cards?


----------



## MojoW

Quote:


> Originally Posted by *b0x3d*
> 
> Cool - just looking at your pics - why have you got bridges over the gfx cards?


If you look closely you can see those are 7970s.
My guess is that those are the previous cards in the system.


----------



## Scorpion49

Quote:


> Originally Posted by *aneutralname*
> 
> Looks totally like a video card issue. Try increasing the voltage or lowering the clocks. I get similar lines when I overclock or undervolt too much. Also different drivers could solve the problem. Try getting rid of third party applications like Afterburner, etc.


I tried several drivers now.

Quote:


> Originally Posted by *aneutralname*
> 
> Try reinstalling drivers and/or shutting off the system entirely and letting it rest. I had it become crazy on me like that when I overclocked the memory too much. If it persists for a week or so I would RMA it.


Shutting it off has no effect, it stays off all night. Reinstalling drivers doesn't help.

Quote:


> Originally Posted by *alancsalt*
> 
> Do a full uninstall of all graphics card drivers of any kind, use a third party driver cleaner, and only then reinstall the right drivers?


I always use DDU to clean drivers before I put a new card in, I installed 14.4 first and then removed and tried 14.6.
Quote:


> Originally Posted by *Roboyto*
> 
> What version of Windows are you running? I had nothing but issues with 14.X drivers while running Win7 64 Ultimate.
> 
> I have since switched to 8.1 Pro and 14.6 Beta have been very good to me.
> 
> Did you properly remove your old drivers before upgrading to 14.4? DDU or AMD Clean Uninstall and a registry cleaning are highly recommended.


Was running on Win7 Pro, I just did a full re-install with Windows 8 Pro and the problem persists. I think the GPU is bad, or there is an incompatibility with my X58 chipset as my 5770 has no issue with the same drivers. The card plays games perfectly fine for hours with no problems at all, but only in Windows does it seem to have this issue.


----------



## cennis

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Okay, where are they? I would love to get feedback from someone running Crossfire MSI Lightning R9 290X's at 4K via Displayport.
> 
> Anyways, the flickering black screen I do not know in which games that appear, but I am currently having that issue with "stock clocks and stock voltage" on my 2x GTX 660 Ti's in SLI so that tell me that this is not a card-problem, more a displayport at 4K problem.
> 
> In which games do you guys run into the problem of black flickering screen? I would like to know exactly which titles it is. And I would really like that question answered before pushing the buy button on cards that will cost me 1700 USD. I can do air-cooling, no problem, I will be doing water cooling for my 3930K anyways, so adding a couple of radiators is no problem if I find out I will do water cooling in a little while.
> 
> Understand me right, I just want something that works good and can keep a "low" temperature in close space. (low temperature is relative, but temps under 80 degree celsius which the Asus DCU II OC did not clear).
> 
> I just throw in another question, how much of a performance hit will I get of running the cards in x16 and x8, instead of x16 and x16 (I am on a Asus Rampage IV Gene with slot one and two as x16, the third slot is x8.


Seems like most questions have been answered, but from my first-hand experience:

Flickering occurs in 3DMark 11, Unigine Valley, Unigine Heaven, and BF4 (fewest occurrences) at 4K, if I push the voltage past +75mV.


----------



## battleaxe

Quote:


> Originally Posted by *Scorpion49*
> 
> Does anyone get anything like this on their card?
> 
> Sig rig is accurate, I don't have any issue with my spare HD 5770 at all. These lines appear only during certain times:
> 
> - When mousing over video/picture icons
> - When mousing over edges of a window that can be interacted with to make larger or smaller
> - When initializing a game, not during play
> 
> It is driving me nuts, I'm trying to figure out if it is a windows issue or the new R9 290 Vapor-X causing the problem.


Yes, you need to bump the voltage slightly. Its just a bit too low for some reason. Either it was set too low at factory, your core isn't great, or other factors, but giving it about 20mv should probably fix it.


----------



## Scorpion49

Quote:


> Originally Posted by *battleaxe*
> 
> Yes, you need to bump the voltage slightly. Its just a bit too low for some reason. Either it was set too low at factory, your core isn't great, or other factors, but giving it about 20mv should probably fix it.


I've gone all the way up to +50mV, no change. I've tried forcing 3D clocks in 2D mode as well, also with no change. To be clear, this only happens in Windows, not in games (besides the initial transition from Windows to the game loading). I can loop Heaven or Valley for 12 hours and not a peep out of it. I suspect a vBIOS incompatibility with my X58 setup, but who knows. I submitted a ticket to Sapphire to see what they say; probably some stupid amateur crap about installing the drivers properly.


----------



## kpoeticg

Finally got my blocks attached. Need to spend some time today and tomorrow getting my acrylic routed so i can finally test these guys with some proper cooling


----------



## Scorpion49

Ok, I did a little more digging and brought out my old C2D setup and put the 290 in there. Initially there was no artifacts at all, but then I realized my monitor was running at 60hz not 120hz. As soon as I set the refresh rate to 120hz they began to appear on that machine as well. I moved the card back to my main rig, and sure enough at 60hz there are none.

I did change out the DL-DVI cable, to no effect. What could cause these artifacts to show up only at 120hz with this card? I get nothing with my 9800GT, HD5450, or HD5770. Only the 290 does this. Only at desktop as well, nothing when gaming.


----------



## aneutralname

Quote:


> Originally Posted by *Scorpion49*
> 
> Ok, I did a little more digging and brought out my old C2D setup and put the 290 in there. Initially there was no artifacts at all, but then I realized my monitor was running at 60hz not 120hz. As soon as I set the refresh rate to 120hz they began to appear on that machine as well. I moved the card back to my main rig, and sure enough at 60hz there are none.
> 
> I did change out the DL-DVI cable, to no effect. What could cause these artifacts to show up only at 120hz with this card? I get nothing with my 9800GT, HD5450, or HD5770. Only the 290 does this. Only at desktop as well, nothing when gaming.


Could be driver problems. Welcome to ATI.

Try other driver versions and other operating systems and check if you have the same problem.


----------



## sugarhell

Quote:


> Originally Posted by *aneutralname*
> 
> Could be driver problems. Welcome to ATI.
> 
> Try other driver versions and other operating systems and check if you have the same problem.


I still have a 2900xt


----------



## Faster_is_better

Quote:


> Originally Posted by *Scorpion49*
> 
> Ok, I did a little more digging and brought out my old C2D setup and put the 290 in there. Initially there was no artifacts at all, but then I realized my monitor was running at 60hz not 120hz. As soon as I set the refresh rate to 120hz they began to appear on that machine as well. I moved the card back to my main rig, and sure enough at 60hz there are none.
> 
> I did change out the DL-DVI cable, to no effect. What could cause these artifacts to show up only at 120hz with this card? I get nothing with my 9800GT, HD5450, or HD5770. Only the 290 does this. Only at desktop as well, nothing when gaming.


Did you by chance try any of those other cards at 120hz also?


----------



## Scorpion49

Quote:


> Originally Posted by *aneutralname*
> 
> Could be driver problems. Welcome to ATI.
> 
> Try other driver versions and other operating systems and check if you have the same problem.


I've tried 6 driver versions I have already. I've tried on Windows 7 and Windows 8. My Sapphire Tri-X 290 I just had a few weeks ago did not exhibit this problem.

Quote:


> Originally Posted by *Faster_is_better*
> 
> Did you by chance try any of those other cards at 120hz also?


Of course, I always run my monitor at 120hz. I'm getting the feeling this is specific to the Vapor-X model and its vBIOS, as I said I just had a Sapphire Tri-X 290 normal version that had zero issue in the exact same environment with the same drivers.


----------



## DeadlyDNA

Quote:


> Originally Posted by *sugarhell*
> 
> I still have a 2900xt


Isn't that the card that caught hell over heat, noise, and lackluster performance? I was an Nvidia fanboy back then.


----------



## Yvese

Quote:


> Originally Posted by *Scorpion49*
> 
> I've tried 6 driver versions I have already. I've tried on Windows 7 and Windows 8. My Sapphire Tri-X 290 I just had a few weeks ago did not exhibit this problem.
> Of course, I always run my monitor at 120hz. I'm getting the feeling this is specific to the Vapor-X model and its vBIOS, as I said I just had a Sapphire Tri-X 290 normal version that had zero issue in the exact same environment with the same drivers.


Definitely seems like it may be an issue with the card. I have a 290 Tri-X and I also have zero issues with it. My monitor is 144hz but I use 120 on desktop with no issues.


----------



## Faster_is_better

Quote:


> Originally Posted by *Yvese*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scorpion49*
> 
> I've tried 6 driver versions I have already. I've tried on Windows 7 and Windows 8. My Sapphire Tri-X 290 I just had a few weeks ago did not exhibit this problem.
> Of course, I always run my monitor at 120hz. I'm getting the feeling this is specific to the Vapor-X model and its vBIOS, as I said I just had a Sapphire Tri-X 290 normal version that had zero issue in the exact same environment with the same drivers.
> 
> 
> 
> Definitely seems like it may be an issue with the card. I have a 290 Tri-X and I also have zero issues with it. My monitor is 144hz but I use 120 on desktop with no issues.
Click to expand...

It is definitely looking that way, seems like you tested basically everything you can, which just leaves the GPU being at fault.


----------



## Scorpion49

Quote:


> Originally Posted by *Yvese*
> 
> Definitely seems like it may be an issue with the card. I have a 290 Tri-X and I also have zero issues with it. My monitor is 144hz but I use 120 on desktop with no issues.


Darn, Sapphire responded to my ticket with a "submit RMA request". Sucks because I'm pretty sure they only send out refurb units like the rest of the manufacturers, and my 4K monitor is in shipment right now. I don't want to go weeks without a decent GPU.


----------



## battleaxe

Quote:


> Originally Posted by *Scorpion49*
> 
> Darn, Sapphire responded to my ticket with a "submit RMA request". Sucks because I'm pretty sure they only send out refurb units like the rest of the manufacturers, and my 4K monitor is in shipment right now. I don't want to go weeks without a decent GPU.


But as it stands, do you have a decent GPU now (given the current issue)? May as well get 'er replaced.


----------



## PuffinMyLye

Hey guys, I'm an nVidia fanboy looking to make the move to a 290 or 290X and I'm looking for a little guidance. I'm really unsure of what brands are good on the AMD side, and also what my *PSU* can handle (I'm thinking a 290 max, right?). My priorities are noise (I like everything as quiet as possible) and good warranty service. I'm not a big OCer of GPUs and I don't upgrade my GPU every year. I just want a good-performing card that will be fairly quiet, cool, and last long. Suggestions?


----------



## Scorpion49

Quote:


> Originally Posted by *battleaxe*
> 
> But as it stands, do you have a decent GPU now (given the current issue)? May as well get 'er replaced.


Well that's the thing: this issue is an annoyance at best. Gaming is 100% fine, BF4 runs great, and all my other games do too. No lines, no problems. The card is cool and quiet. It's a very strange issue; normally with artifacts you get them all the time, or mostly at 3D clocks. That's why this has me so stumped.

Quote:


> Originally Posted by *PuffinMyLye*
> 
> Hey guys, I'm an nVidia fanboy looking to make the move to a 290 or 290X and I'm looking for a little guidance. I'm really unsure of what brands are good on the AMD side, and also what my *PSU* can handle (I'm thinking a 290 max, right?). My priorities are noise (I like everything as quiet as possible) and good warranty service. I'm not a big OCer of GPUs and I don't upgrade my GPU every year. I just want a good-performing card that will be fairly quiet, cool, and last long. Suggestions?


Most of the same brands are good. Gigabyte, MSI, Asus, Sapphire, XFX has some good cards as well. I would suggest a 290, since the prices are lower and you can OC it just a tad and get 290X performance. Your PSU will be fine with any single card you choose, I ran a 290 overclocked on a 450W SFX PSU with no problem. If you want quiet and good warranty, I would look at the MSI gaming, Gigabyte Windforce or the Sapphire Tri-X cards. Asus is notorious for bad warranty service, I've been a victim of it several times myself. XFX seems to have some models that aren't as good as others, so I don't pay much attention to them. Good luck in your upgrade.


----------



## killbom

Quote:


> Originally Posted by *PuffinMyLye*
> 
> Hey guys, I'm an nVidia fanboy looking to make the move to a 290 or 290X and I'm looking for a little guidance. I'm really unsure of what brands are good on the AMD side, and also what my *PSU* can handle (I'm thinking a 290 max, right?). My priorities are noise (I like everything as quiet as possible) and good warranty service. I'm not a big OCer of GPUs and I don't upgrade my GPU every year. I just want a good-performing card that will be fairly quiet, cool, and last long. Suggestions?


You can run a 290X without problems on that PSU. I would recommend a 290 with a good aftermarket cooler (like Asus, Gigabyte or Powercolor)


----------



## Roboyto

Quote:


> Originally Posted by *Scorpion49*
> 
> Ok, I did a little more digging and brought out my old C2D setup and put the 290 in there. Initially there was no artifacts at all, but then I realized my monitor was running at 60hz not 120hz. As soon as I set the refresh rate to 120hz they began to appear on that machine as well. I moved the card back to my main rig, and sure enough at 60hz there are none.
> 
> I did change out the DL-DVI cable, to no effect. What could cause these artifacts to show up only at 120hz with this card? I get nothing with my 9800GT, HD5450, or HD5770. Only the 290 does this. Only at desktop as well, nothing when gaming.


Have you tried 14.6 beta?

Or is there an updated GPU BIOS?

Last ditch efforts, but worth a shot


----------



## Yvese

Quote:


> Originally Posted by *Scorpion49*
> 
> Darn, Sapphire responded to my ticket with a "submit RMA request". Sucks because I'm pretty sure they only send out refurb units like the rest of the manufacturers, and my 4K monitor is in shipment right now. I don't want to go weeks without a decent GPU.


Where and how long ago did you buy it? If it hasn't been 30 days you could just get a replacement directly from Newegg/Amazon. They always send out new ones if the one you got is faulty, and Amazon pays the shipping (haven't bought from Newegg in a while so I'm not sure if they do the same).


----------



## Scorpion49

Quote:


> Originally Posted by *Yvese*
> 
> Where and how long ago did you buy it? If it hasn't been 30 days you could just get a replacement directly from newegg/amazon. They always send out new ones if the one you got is faulty. That and Amazon pays the shipping ( haven't bought from newegg in awhile so not sure if they do the same )


I got it at newegg, it wasn't available on Amazon. Newegg has a history of sending out other peoples returns to you when you send something like this back, but I may end up having to do that.


----------



## Roboyto

Quote:


> Originally Posted by *Scorpion49*
> 
> I got it at newegg, it wasn't available on Amazon. Newegg has a history of sending out other peoples returns to you when you send something like this back, but I may end up having to do that.


If you buy open box you get a returned item.

I've been shopping there for a long time and have never received a new item that was already open. If you want to expedite and ease the return process their Premier Service for $40/year is hard to beat. You get free 3 day shipping on just about everything, no Restock fees and they cover return shipping.


----------



## Scorpion49

Quote:


> Originally Posted by *Roboyto*
> 
> If you buy open box you get a returned item.
> 
> I've been shopping there for a long time and have never received a new item that was already open. If you want to expedite and ease the return process their Premier Service for $40/year is hard to beat. You get free 3 day shipping on just about everything, no Restock fees and they cover return shipping.


I've had at least four graphics cards from them that I bought as new arrive open and used, and for one that I returned I got a refurb from a totally different brand. Newegg used to be really good, but Google "Newegg returns" and you will find thousands of people having problems with them.


----------



## PuffinMyLye

Quote:


> Originally Posted by *Scorpion49*
> 
> Most of the same brands are good. Gigabyte, MSI, Asus, Sapphire, XFX has some good cards as well. I would suggest a 290, since the prices are lower and you can OC it just a tad and get 290X performance. Your PSU will be fine with any single card you choose, I ran a 290 overclocked on a 450W SFX PSU with no problem. If you want quiet and good warranty, I would look at the MSI gaming, Gigabyte Windforce or the Sapphire Tri-X cards. Asus is notorious for bad warranty service, I've been a victim of it several times myself. XFX seems to have some models that aren't as good as others, so I don't pay much attention to them. Good luck in your upgrade.


Quote:


> Originally Posted by *killbom*
> 
> You can run a 290X without problems on that PSU. I would recommend a 290 with a good aftermarket cooler (like Asus, Gigabyte or Powercolor)


Thanks for the responses guys. I've heard some 290's can be "unlocked" so that you can OC them to 290X speeds and some you can't. Is this the case? If so which ones should I be avoiding?


----------



## specopsFI

Guys, I know this is highly speculative but I thought I'd ask anyway, since sometimes it's just fun to speculate (well, that and I'm actually considering buying).

If you could get a reference 290X or a reference 290X BF4 Edition at the same cost, which one would you get? AFAIK the BF4 voucher isn't valid anymore, so the game itself is not a factor. So the speculation is on the likelihood of getting good silicon and good memory chips. The BF4 Edition cards are all early batches, and they seemed to come with both Hynix and Elpida memory. I suppose that is the same with the potentially later batches of non-BF4 Edition cards too? How about the silicon? Is there any indication of the likelihood of getting a good chip getting better or worse with the later batches? Feel free to comment on gut feeling.

BTW, I know Elpida isn't necessarily bad. It's just that Hynix is usually better.


----------



## kizwan

Quote:


> Originally Posted by *PuffinMyLye*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scorpion49*
> 
> Most of the same brands are good. Gigabyte, MSI, Asus, Sapphire, XFX has some good cards as well. I would suggest a 290, since the prices are lower and you can OC it just a tad and get 290X performance. Your PSU will be fine with any single card you choose, I ran a 290 overclocked on a 450W SFX PSU with no problem. If you want quiet and good warranty, I would look at the MSI gaming, Gigabyte Windforce or the Sapphire Tri-X cards. Asus is notorious for bad warranty service, I've been a victim of it several times myself. XFX seems to have some models that aren't as good as others, so I don't pay much attention to them. Good luck in your upgrade.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *killbom*
> 
> You can run a 290X without problems on that PSU. I would recommend a 290 with a good aftermarket cooler (like Asus, Gigabyte or Powercolor)
> 
> Click to expand...
> 
> 
> 
> 
> 
> Thanks for the responses guys. I've heard some 290's can be "unlocked" so that you can OC them to 290X speeds and some you can't. Is this the case? If so which ones should I be avoiding?
Click to expand...

A bit late for that, unless you can get a (used) early-batch card, and even then it's not necessarily unlockable.


----------



## KyadCK

Quote:


> Originally Posted by *PuffinMyLye*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scorpion49*
> 
> Most of the same brands are good. Gigabyte, MSI, Asus, Sapphire, XFX has some good cards as well. I would suggest a 290, since the prices are lower and you can OC it just a tad and get 290X performance. Your PSU will be fine with any single card you choose, I ran a 290 overclocked on a 450W SFX PSU with no problem. If you want quiet and good warranty, I would look at the MSI gaming, Gigabyte Windforce or the Sapphire Tri-X cards. Asus is notorious for bad warranty service, I've been a victim of it several times myself. XFX seems to have some models that aren't as good as others, so I don't pay much attention to them. Good luck in your upgrade.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *killbom*
> 
> You can run a 290X without problems on that PSU. I would recommend a 290 with a good aftermarket cooler (like Asus, Gigabyte or Powercolor)
> 
> Click to expand...
> 
> Thanks for the responses guys. I've heard some 290's can be "unlocked" so that you can OC them to 290X speeds and some you can't. Is this the case? If so which ones should I be avoiding?
Click to expand...

Unlocking means unlocking the remaining 10% of shaders (2560 vs 2816), not unlocking overclock potential. 290s are not locked from overclocking.


----------



## pdasterly

Why not go for the BF4 edition? Both my reference cards came with Hynix, guess I got lucky.


----------



## PuffinMyLye

Quote:


> Originally Posted by *KyadCK*
> 
> Unlocking means unlocking the remaining 10% of shaders (2560 vs 2816), not unlocking overclock potential. 290s are not locked from overclocking.


Thanks for clearing that up!
Quote:


> Originally Posted by *pdasterly*
> 
> Why not go for bf4 edition, both my reference cards came with hynix, guess I got lucky


I'd rather not buy a reference card because I want quiet cooling so I'd have to buy an aftermarket cooler. The only way I'd do that is if I found a really cheap used one so I had enough money left to buy a decent aftermarket cooler.


----------



## pdasterly

The BF4 cards look exactly like my reference cards.


----------



## PuffinMyLye

Do you know of any good aftermarket coolers for the 290?


----------



## pdasterly

Are you going to OC? There's the Kraken G10 and Corsair HG10, and Arctic sells nice air coolers.


----------



## PuffinMyLye

Quote:


> Originally Posted by *pdasterly*
> 
> Are you going to oc? Kraken G10, corsair hg10 and artic sells nice air coolers


Maybe a little but I'm more interested in keeping noise down. I know the reference coolers are really loud.


----------



## Scorpion49

Quote:


> Originally Posted by *PuffinMyLye*
> 
> Maybe a little but I'm more interested in keeping noise down. I know the reference coolers are really loud.


I had a pair of reference cards, they really aren't bad if you just stick them in and leave them be. They may throttle slightly but they will not be loud like the old reference 7970 was, with two in my rig just left at stock they were barely noticeable.


----------



## JordanTr

Quote:


> Originally Posted by *PuffinMyLye*
> 
> Maybe a little but I'm more interested in keeping noise down. I know the reference coolers are really loud.


I read some tests of the Arctic Xtreme IV on the R9 290 and the results are impressive: around 55-60°C on the core and a few degrees more on VRM1. That's with 50-70% fan speed, which is very, very quiet.


----------



## fateswarm

Will I see any benefit if I replace a tri-x's thermal compound with NT-H1?


----------



## PuffinMyLye

Will the 290X Tri-X be able to run games at 5760x1080, or should I not even bother trying to game on my triple monitors (and just use the center monitor to game)?


----------



## PachAz

I just got my second Sapphire R9 290, which was used but never mounted according to the seller. My luck didn't last long though: as soon as I started the Valley and Heaven benchmarks, the screen froze and went black with a sound loop. I reinstalled the drivers and tested benchmarking with one card only, and the issue still occurred 3-4 times. At first everything works fine, but as the card gets hotter and the fan spins faster, suddenly I get this freeze/sound loop. The GPU has Elpida RAM though. According to the seller the GPU was a 2nd-grade one, i.e. sold at a reduced price for whatever reason.

Luckily the shop accepts RMAs, since I got a warranty receipt from the seller. However, their policy is that they can only repair or give a similar product to a second-hand customer. I really hope I get a reference R9 290; if not, then I hope the seller will accept my return so he can claim a refund from the shop. I am sad and hurt.

It seems like the card gets so hot that it suddenly just stops working, like the throttling function doesn't work or something. There's also no reason a reference card should be that hot anyway, because I had no side panel on and it had only been running for about 3 minutes. Anyway, this card is faulty, and I don't think the shop even tested it before selling it; even if they did, they certainly didn't run any GPU benchmarks.


----------



## fateswarm

Isn't a black screen a sign the power supply is too weak?


----------



## bluedevil

Quote:


> Originally Posted by *fateswarm*
> 
> Isn't a black screen a sign the power supply is too weak?


His PSU is fine.


----------



## KyadCK

Quote:


> Originally Posted by *PuffinMyLye*
> 
> Will the 290X Tri-X be able to run games at 5760x1080, or should I not even bother trying to game on my triple monitors (and just use the center monitor to game)?


They're designed for 4k, and do very well at that resolution. There may be some games (mostly newer ones) that it may not be able to max at tri-1080, but it should do very well.

I've personally got my 290X running my portrait tri-1080 eyefinity, it does well for being a single card.


----------



## PachAz

A weak power supply? I have two R9 290s and only one of them is working. The first one I have been using for some months now and it works fine in games and benchmarks. Sometimes it's a black screen, and sometimes the picture freezes with some artifacts. Also, a 1000W PSU should be able to power a stock R9 290. I have tested with Crossfire and with each card alone to make sure it is not my first GPU or my PSU. I also tested with different cables from the PSU to make sure everything is working.


----------



## BradleyW

It just sounds like the card is defective. Return it. Just out of interest, does it crash when underclocked, and have you tried 100% fan speed?


----------



## Roboyto

Quote:


> Originally Posted by *PuffinMyLye*
> 
> Will the 290X Tri-X be able to run games at 5760x1080, or should I not even bother trying to game on my triple monitors (and just use the center monitor to game)?


You should be able to get very respectable frame rates in 5760x1080 with a single 290X. I run that res with my 290 and it does very well, just don't expect to get 100fps with every setting maxed.

Last time I logged FPS in Tomb Raider, with all settings maxed, I was averaging low 40s with 29 being the minimum and 62 max. My 290 runs at 1200/1500 for gaming purposes. I could likely reduce AA and bring the frame rates up a little bit, but even in the low 40s it is an extremely smooth gaming experience.
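
If you want to crunch a frametime log yourself, here's a quick sketch (the frame times are made-up sample values, roughly matching that range, in milliseconds). One subtlety worth showing: average the frame times and convert once, rather than averaging per-frame FPS, since the latter overweights the fast frames.

```python
# Hypothetical frametime log in milliseconds (e.g. from a benchmark logger).
frametimes_ms = [24.1, 33.9, 16.2, 25.0, 34.5, 22.8]

# Per-frame FPS: 1000 ms / frametime.
fps = [1000.0 / ft for ft in frametimes_ms]

# Average FPS computed from the mean frame time, not the mean of per-frame FPS.
avg_fps = 1000.0 / (sum(frametimes_ms) / len(frametimes_ms))

print(f"min {min(fps):.1f} / avg {avg_fps:.1f} / max {max(fps):.1f}")
```

With those sample numbers this prints min 29.0 / avg 38.3 / max 61.7, i.e. the min/max correspond to the slowest and fastest single frames in the log.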

Quote:


> Originally Posted by *PuffinMyLye*
> 
> Do you know of any good aftermarket coolers for the 290?


Gelid Icy Vision, Arctic Extreme IV, or using the Kraken G10 with an AIO are your 3 most popular choices.

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781

Lots of info for the 290(X) in my post there; check towards the bottom of that link for aftermarket cooling information.

The Gelid Icy Vision is the cheapest route and will give respectable temps with low noise, but don't expect to get huge overclocks from it.

The Arctic Extreme IV is a nice step forward from the III since you no longer have to deal with thermal epoxy for the RAM/VRMs. These things are dead silent.

The Kraken G10 with an AIO is a nice option too, you just need to use a good heatsink for VRM1. This info can be found in the link above.

Quote:


> Originally Posted by *fateswarm*
> 
> Will I see any benefit if I replace a tri-x's thermal compound with NT-H1?


Maybe? Only one way to find out.

Quote:


> Originally Posted by *Scorpion49*
> 
> I've had at least four graphics cards from them I bought as new come open and used, and one that I returned I got a refurb from a totally different brand. Newegg used to be really good, but google newegg returns and you will find thousands of people having problems with them.


Intriguing. I've never had any issue of that sort, but if I did I wouldn't stand for it. Anytime I have a problem it is always easily sorted out with a quick support chat; I guess I've been fortunate. I've even received open box items where the exterior seal was broken, while inner packaging was still sealed.


----------



## Forceman

Quote:


> Originally Posted by *fateswarm*
> 
> Will I see any benefit if I replace a tri-x's thermal compound with NT-H1?


Probably not enough to make it worth the hassle.


----------



## blue1512

Quote:


> Originally Posted by *PachAz*
> 
> I just got my second Sapphire R9 290, which was used but never mounted according to the seller. My luck didn't last long though: as soon as I started the Valley and Heaven benchmarks, the screen froze and went black with a sound loop. I reinstalled the drivers and tested benchmarking with one card only, and the issue still occurred 3-4 times. At first everything works fine, but as the card gets hotter and the fan spins faster, suddenly I get this freeze/sound loop. The GPU has Elpida RAM though. According to the seller the GPU was a 2nd-grade one, i.e. sold at a reduced price for whatever reason.
> 
> Luckily the shop accepts RMAs, since I got a warranty receipt from the seller. However, their policy is that they can only repair or give a similar product to a second-hand customer. I really hope I get a reference R9 290; if not, then I hope the seller will accept my return so he can claim a refund from the shop. I am sad and hurt.
> 
> It seems like the card gets so hot that it suddenly just stops working, like the throttling function doesn't work or something. There's also no reason a reference card should be that hot anyway, because I had no side panel on and it had only been running for about 3 minutes. Anyway, this card is faulty, and I don't think the shop even tested it before selling it; even if they did, they certainly didn't run any GPU benchmarks.


That is the notorious black screen issue of the early 290/290X. You should try the Sapphire BIOS with "Elpida_debug" in the description.


----------



## MTDEW

Quote:


> Originally Posted by *blue1512*
> 
> That is the notorious black screen issue of early 290/290x. You should try Sapphire BIOS with Elpida_debug in the description


Any idea where to find that bios?
I don't see it on techpowerup.
And my google-fu isn't working.
The guys in the black screen poll thread would probably like to try that bios.

*EDIT:* I see an MSi one with Elpida EDW2032BBBG_DEBUG2.
Is THIS what you're referring to?


----------



## blue1512

Here you go. The Tri-X BIOS is better for UEFI, but your card may not like it, so try both of them:

TriX
http://www.techpowerup.com/vgabios/151348/sapphire-r9290-4096-131211-1.html

Ref
http://www.techpowerup.com/vgabios/151539/sapphire-r9290-4096-131205.html


----------



## MTDEW

Quote:


> Originally Posted by *blue1512*
> 
> Here you go. The BIOS of TriX is better for UEFI but maybe your card don't like it, so try both of them
> 
> TriX
> http://www.techpowerup.com/vgabios/151348/sapphire-r9290-4096-131211-1.html
> 
> Ref
> http://www.techpowerup.com/vgabios/151539/sapphire-r9290-4096-131205.html


Oh, ok, i was looking in the Sapphire reference bios files and missed it.
Thanks!
+rep


----------



## pdasterly

Triple monitors take a lot of horsepower to run; some games are nice, some are unplayable - to me at least.


----------



## PuffinMyLye

Quote:


> Originally Posted by *pdasterly*
> 
> Triple monitors takes alot of horsepower to run, some games are nice, some are unplayable, too me at least


I was able to run a good amount of games on medium settings on my GTX 670 FTW so I would think I should be able to run those same ones on high with a 290X.


----------



## punkafi888

I'm joining the r9 290 club just got a xfx

Sent from my SM-N900T using Tapatalk


----------



## pdasterly

Who wants to play on medium? Max everything out. I'm probably going to dump my triple monitor setup and get dual 30" monitors or a large 50" 4K TV with a high refresh rate. Haven't decided yet.


----------



## Arizonian

Quote:


> Originally Posted by *kpoeticg*
> 
> Finally got my blocks attached. Need to spend some time today and tomorrow getting my acrylic routed so i can finally test these guys with some proper cooling
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - I already had you updated to water.

Did I understand you're going tri or quad soon?

Quote:


> Originally Posted by *punkafi888*
> 
> I'm joining the r9 290 club just got a xfx
> 
> Sent from my SM-N900T using Tapatalk


Sounds great. Here's what you need for submission.

1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or a pic of GPU with piece of paper with OCN name showing.
2. Manufacturer & Brand
3. Cooling


----------



## th3illusiveman

can you still unlock 290's?


----------



## heroxoot

Quote:


> Originally Posted by *th3illusiveman*
> 
> can you still unlock 290's?


It's only certain brands, I think. PowerColor may be one of them, but don't quote me.


----------



## kpoeticg

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - I already had you updated to water
> 
> Did I understand your going tri or quad soon?
> 
> Sounds great. Here's what you need for submission.
> 
> 1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or a pic of GPU with piece of paper with OCN name showing.
> 2. Manufacturer & Brand
> 3. Cooling

Yesssir =)

Probably not quad; I plan on grabbing my 3rd within the next couple weeks though, along with some other stuff I need for my build (good monitor/s, Razer Orbweaver Stealth, good mouse & keyboard, Blu-ray burner).


----------



## Silent Scone

Any fellow Vapor-X 8GB owners here?


----------



## DeadlyDNA

Quote:


> Originally Posted by *Silent Scone*
> 
> Any fellow Vapor-X 8GB owners here?


Aren't they only available in the UK?


----------



## Silent Scone

Unfortunately, yes. I can't really brag though, as I've spoken to EK and there are no plans for a water block for the Vapor-X range!

Shame.

From what I understand, Sapphire made a small batch of these, but it seems they had their wires crossed with AMD over it. AMD gave the green light to sell the units they had already made... makes this a very rare card indeed.


----------



## Ramzinho

Quote:


> Originally Posted by *Silent Scone*
> 
> Any fellow Vapor-X 8GB owners here?


WHAT? If you can buy another one, do it... even the next gen won't get close to that.


----------



## Silent Scone

I've been contemplating it for 4K, coming from three 780 Tis. However, the prospect of no water is kind of a big deal.


----------



## Ized

(UK) What do we think a realistic price for a fully unlocked reference PowerColor AXR9 290 is these days?

I don't even know where to begin looking or selling. It has some coilwhine which probably isn't going to help my chances.

I wonder how much of my £314 I could recover.

Any advice or guesses?


----------



## Willi

Quote:


> Originally Posted by *Ized*
> 
> (UK) What do we think a realistic price for a fully unlocked reference PowerColor AXR9 290 is these days?
> 
> I don't even know where to begin looking or selling. It has some coilwhine which probably isn't going to help my chances.
> 
> I wonder how much of my £314 I could recover
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any advice or guesses?


I remember some horrible coil whine back in the day with Nvidia's dual-PCB GTX 295. If you encase the source of the whine with nail polish or anything similar, the noise can stop completely. I used nail polish on my old 295; I think it might work on an R9 290 as well. I'll look around to see if someone found the exact source of the noise and a way to solve it.


----------



## Scorpion49

OK, so a small update to my Vapor-X artifact saga. I've narrowed it down even more: it only happens at 120Hz when Google Chrome is open (I don't have to be in the Chrome window, just have the program open). I've been using IE this morning as an experiment and haven't seen a single line across my screen. As soon as I open up Chrome I start getting them.

I've tried so many drivers that I doubt it's a driver incompatibility, and plenty of people use Chrome. What the heck.


----------



## battleaxe

Quote:


> Originally Posted by *Scorpion49*
> 
> Ok, so small update to my Vapor-X saga of artifacts. I've narrowed it down even more, it only happens at 120hz when google chrome is open (doesn't have to be in the chrome windows, just the program open). Been using IE this morning as an experiment and haven't seen a single line across my screen. As soon as I open up chrome I start getting them.


Try firefox?

Chrome operates with Flash running in background. Could that be the culprit?


----------



## heroxoot

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scorpion49*
> 
> Ok, so small update to my Vapor-X saga of artifacts. I've narrowed it down even more, it only happens at 120hz when google chrome is open (doesn't have to be in the chrome windows, just the program open). Been using IE this morning as an experiment and haven't seen a single line across my screen. As soon as I open up chrome I start getting them.
> 
> 
> 
> Try firefox?
> 
> Chrome operates with Flash running in background. Could that be the culprit?

I think Firefox runs Flash in its own plugin container. No artifacts, but sometimes Flash lags for me on Twitch TV, so I have to full-screen it, which for whatever reason fixes that.

I hate Flash so much.


----------



## Notion

Hi, does the R9 290 require an active DisplayPort converter to work with Eyefinity?
Thanks


----------



## nightfox

Quote:


> Originally Posted by *Notion*
> 
> Hi does the R 290 require active display port converter to work with eyefinity?
> Thanks


If you're going to use the DisplayPort output, yes.

BUT,

if your monitor has a DP port, you don't need a converter.

If your monitor resolutions are 1080p and below, you can just directly connect them to the HDMI + DVI (x2) outputs. If your resolution is above 1200p, then I'm afraid you need an active DP-to-DVI adapter.
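That rule of thumb boils down to a tiny decision helper - sketched here purely for illustration; the function name and its inputs are hypothetical, not part of any real tool:

```python
def needs_active_adapter(monitor_has_dp: bool, above_1200p: bool) -> bool:
    """Rough Eyefinity rule of thumb for a reference R9 290, per the post above:
    returns True when an active DP-to-DVI adapter is needed."""
    if monitor_has_dp:
        return False   # native DisplayPort cable, no adapter at all
    if not above_1200p:
        return False   # 1080p and below: HDMI + 2x DVI cover it directly
    return True        # above 1200p without a DP input: active adapter required

# Monitors with DP inputs never need an adapter
print(needs_active_adapter(True, True))    # False
# DVI/HDMI-only monitors above 1200p do
print(needs_active_adapter(False, True))   # True
```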


----------



## aneutralname

I have a problem with the 13.12 drivers: the monitor won't go into sleep/power-save mode. I didn't have this problem with previous graphics cards. Any ideas?


----------



## Notion

Quote:


> Originally Posted by *nightfox*
> 
> If you gonna use the display port yes.
> 
> BUT,
> 
> if your monitor has a DP port, you dont need converter.
> 
> If your monitor resolutions are 1080p and below, you can just directly connect it to HDMI+DVI (2). If you res is above 1200p, then I'm afraid you that you need DP to DVI (active)


KK, well, my monitors don't have DisplayPort; however, they have 2x HDMI, a DVI and a VGA.

So I'm guessing I would need an active DisplayPort converter.


----------



## rdr09

Quote:


> Originally Posted by *Notion*
> 
> kk well my monitors don't have display port however they have 2x hdmi and a dvi and vga..
> 
> So i am guessing i would need a active display port converter..


These figures came from the OP (post #2) . . .

Look closely at the pins of the DVIs and compare them to what you have. It might not be very clear.

Here . . .


----------



## Faster_is_better

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scorpion49*
> 
> Ok, so small update to my Vapor-X saga of artifacts. I've narrowed it down even more, it only happens at 120hz when google chrome is open (doesn't have to be in the chrome windows, just the program open). Been using IE this morning as an experiment and haven't seen a single line across my screen. As soon as I open up chrome I start getting them.
> 
> 
> 
> Try firefox?
> 
> Chrome operates with Flash running in background. Could that be the culprit?

I think Chrome may also use hardware acceleration by default; if you disable it, maybe that will fix it. Still, it shouldn't be happening to begin with. Maybe there is some sort of bug with high refresh rates and Chrome?


----------



## rdr09

Skydive 290 @ 1200/1500

http://www.3dmark.com/3dm/3272905

edit: I'll try 1300 core and see if I hit 47K in graphics. Doubt it. Need a 290X. lol


----------



## grunion

Skydiver?


----------



## hotrod717

Quote:


> Originally Posted by *rdr09*
> 
> that card is kinda expensive. i've seen one for less than $400 a few days ago. also, not sure if asic matters but mine is about the same as yours and can do 1300 core.
> 
> can you still return it?
> 
> http://www.overclock.net/t/1471215/sapphire-290x-with-ek-fc-290x-acrylic-nickel-waterblock
> 
> hotrod is not a miner.


Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Ive got some Classies coming I will test that theory
> 
> 
> 
> 
> 
> 
> 
> 
> LoOoOoOoL
> 
> 
> 
> 
> 
> 
> 
> 
> My cards asic is 67% - 69% and you now what those things have done 1350 single, 1330 CF and 1300 TRI
> 
> 
> 
> 
> 
> 
> 
> 
> Yes damn GST and customs brokers man
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah so don't get all three at once like 780ti's and the like


Quote:


> Originally Posted by *rdr09*
> 
> Pach man. tsk tsk. i look up to him when it comes to watercooling . . . but gpu? i might as well go by tom's. lol


Little late to the conversation, but what gives?


----------



## pdasterly

What program can I use to check for errors for oc'ing?


----------



## rdr09

Quote:


> Originally Posted by *hotrod717*
> 
> Little late to the conversation, but what gives?


Pach was saying that Crossfire 290s can beat 780 Tis in SLI. We thought he went a bit overboard with that comment; I guess he meant at lower resolutions like 1080p.


----------



## hotrod717

Quote:


> Originally Posted by *rdr09*
> 
> Pach was saying that crossfire 290s can beat 780 Tis in sli. We thought he went a bit overboard with the comment. i guess he meant in lower rez like 1080.


I saw H-C bold "hotrod is not a miner" and a "LoL" and wondered what I was missing. I didn't think it was a secret that I'm not - I despise what mining has done to the market. Thanks for the reference to my card, by the way.


----------



## rdr09

Quote:


> Originally Posted by *hotrod717*
> 
> I saw H-C bold " hotrod is not a miner" and a "Lol" and wondered what I was missing. Didn't think it was a secret, that I'm not and despise what they've done to the market. Thanks for the reference to my card by the way.


Oh, that. I recommended the 290X that you're selling and made the person aware that it was not used for mining. lol


----------



## fateswarm

Is there a chance of a 290 unlocking to a 290X with a new Tri-X?


----------



## cephelix

I thought the unlocking thing was confined to the first batch of cards, when there was a shortage of 290s, so they just put a 290 BIOS on a 290X...


----------



## Roaches

To unlock your 290, or to see whether it can be unlocked:

See: http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread#post_21203153


----------



## fateswarm

No go on tri-x it seems.


----------



## Sgt Bilko

Quote:


> Originally Posted by *fateswarm*
> 
> No go on tri-x it seems.


It was only for the first batch of reference cards, IIRC; PowerColor and XFX seemed to have the highest rate of unlocks.

I've yet to hear of or see any custom 290 being unlocked.


----------



## fateswarm

Oh well, we get lower power consumption at least. Hooray...


----------



## Dasboogieman

Just before watercooling, a problem has cropped up.

As it turns out, my modest Cooler Master Silent Pro 1000W M (960W on the 12V rail) cannot actually handle two 290s in Crossfire plus an i5-3570K at 4.5 GHz (1.27V). By my calculations, even at 90% system load the PSU is running at roughly 80-85% capacity (assuming fresh capacitors; I haven't accounted for capacitor ageing from two years of use with SLI GTX 570s). As a result, the PSU fan ramps up rather spectacularly, producing noise equivalent to a stock 290X cooler at 80%.

Thus, there is absolutely no point in watercooling until I can upgrade the PSU.

At this wattage level, I've pretty much narrowed it down to a short list of PSUs within $400 (i.e. the superb Corsair AX1200i is ruled out), from data gleaned from jonnyGURU.

Analog units

1. Evga SuperNova G2 1300W
+ best in class ripple suppression on all rails
+ second best in class voltage regulation
+ Highest rated wattage for price range
+ 10-year warranty
- possible issue with a buzzing fan due to loose insulation (though this might be an isolated incident)
- $335

2. Seasonic X-1250, 1250W
+ best in class voltage regulation
+ third best ripple suppression
+ 7-year warranty
+ silent fan mode available via switch
+ extremely reputable brand
- $349

3. Enermax Revolution 87+ 1200W
+ multiple 30A rails (i.e. slightly safer if there is a partial short)
+ excellent regulation
- so-so ripple suppression (compared to the top three)
- unproven regulation at 1200W (the other data was gleaned from the 1000W model)
- runs hot at full load
- $229

4. Possibly Seasonic X-1200W Platinum (Though I see little point in paying extra for Plat)
+ more or less the same as the X-1250
+ more efficient
- less wattage
- $379

So what do you guys think? I'm leaning towards either 1 or 2; the $229 Enermax unit looks really tempting, but it's more unproven and seems less robust to pure abuse.

Or I can Yolo and get the AX1200i or AX1500i and have no compromises whatsoever with dat sweet sweet DSP
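For anyone wanting to sanity-check that 80-85% figure, here's a back-of-the-envelope sketch. Every wattage below is an assumed ballpark, not a measurement:

```python
# Rough 12V-rail load estimate for 2x R9 290 + overclocked i5-3570K.
# All wattages are ballpark assumptions, not measured values.
GPU_W = 275        # one R9 290 under gaming load, approx.
CPU_W = 130        # i5-3570K at 4.5 GHz / 1.27 V, approx.
REST_W = 75        # motherboard, drives, fans, pumps
PSU_12V_W = 960    # Silent Pro 1000W M's rated 12V capacity

draw = 2 * GPU_W + CPU_W + REST_W     # total estimated draw in watts
load_pct = 100 * draw / PSU_12V_W     # fraction of the 12V rail in use
print(f"{draw} W -> {load_pct:.0f}% of the 12V rail")
```

Swap in your own numbers; sustained draw around 80% of the rail is roughly where most PSU fans start to ramp hard.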


----------



## Red1776

Quote:


> Originally Posted by *Dasboogieman*
> 
> Just before Watercooling, problem has cropped up.
> 
> As it turns out, my modest Cooler Master Silent Pro 1000W M (supplies 960W on the 12V rail) cannot actually take 290s in Dual Crossfire and an i5-3570k at 4.5 Ghz (1.27V). According to my calculations, at even 90% load, the PSU is being used roughly 80-85% capacity (I'm assuming fresh capacitors, I haven't taken in to account capacitor ageing from 2 years of being used with SLI GTX 570s). Therefore, the PSU fans ramp up rather spectacularly to create the equivalent noise to a stock 290X cooler at 80%.
> 
> Thus, there is absolutely no point in watercooling until I can upgrade the PSU.
> 
> At this wattage level, I've pretty much narrowed down to a short list of PSUs within $400 (I.e. the Superb Corsair AX1200i is ruled out) from data I've gleaned from JonnyGuru
> 
> Analog units
> 
> 1. Evga SuperNova G2 1300W
> + best in class ripple suppression on all rails
> + second best in class voltage regulation
> + Highest rated wattage for price range
> + 10 years warranty
> - possible issue with buzzing fan due to loose insulation (though this might be an isolated incidence)
> - $335
> 
> 2. Seasonic X-1250, 1250W
> + best in class voltage regulation
> + third best ripple suppression
> + 7 years warranty
> + silent fan mode available via switch
> + extremely reputable brand
> - $349
> 
> 3. Enermax Revolution 87+ 1200W
> + multiple 30A rails (i.e. slightly safer if there is a partial short)
> + excellent regulation
> - so so ripple suppression (compared to the top 3)
> - unproven regulation capability at 1200W (the other data was gleaned from the 1000W model)
> - hot at full load
> - $229
> 
> 4. Possibly Seasonic X-1200W Platinum (Though I see little point in paying extra for Plat)
> + more or less the same as the X-1250
> + more efficient
> - less wattage
> - $379
> 
> So what do you guys think? I'm leaning towards either 1 or 2, the $229 Enermax unit looks really tempting but its more unproven and seems less robust to pure abuse.
> 
> Or I can Yolo and get the AX1200i or AX1500i and have no compromises whatsoever with dat sweet sweet DSP


I am surprised this one did not make your list.

The JG review can be had here:

http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story5&reid=189

Corsair AX 1200W

Highest rated PSU (9.8) from JG until the AX 1500i

Incredible Ripple suppression (15mv or lower on most load tests)

1200W unit held down 1550W in the 100% load test

90% efficiency

runs cool Intake/exhaust

7 year warranty

I own three of these units and they are incredible (I have run 4 x R9 290X OC off one unit as a test)

The OEM is not the usual Seasonic but Flextronics


----------



## rdr09

Quote:


> Originally Posted by *grunion*
> 
> Skydiver?


lol, Sky Diver. Two words. My bad.

Here it is at 1280 . . .

http://www.3dmark.com/3dm/3281686


----------



## KyadCK

Quote:


> Originally Posted by *Red1776*
> 
> I am surprised this one did not make your list.
> The JG review can be had here:
> 
> http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story5&reid=189
> Corsair AX 1200W
> 
> Highest rated PSU (9.8) from JG until the AX 1500i
> 
> Incredible Ripple suppression
> 1200W unit holds down 1550w in 100% load test
> 90% efficiency
> I own three of these units and they are incredible units
> The OEM is not the usual Seasonic but Flextronics


Well that certainly makes me feel better about my AX1200s. Wish it had 2 more CPU/PCI-e ports on it though.


----------



## Gobigorgohome

Red1776, what do you think is the smartest thing to do?

3x MSI Lightning R9 290X's in Tri-Fire (gaming at 4K)

OR

4x Sapphire Radeon R9 290X's in Quad-Fire (gaming at 4K) - reference cards

I will do water cooling anyways.

Price difference is 52 USD, which I do not care for.

The system is pretty much the same as the sig rig, besides an Asus Rampage IV Formula and a new case, plus a Silver Power 500W PSU.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Red1776, what do you think is the smartest thing to do.
> 
> 3x MSI Lightning R9 290X's in Tri-Fire (gaming at 4K)
> 
> OR
> 
> 4x Sapphire Radeon R9 290X's in Quad-Fire (gaming at 4K) - reference cards
> 
> I will do water cooling anyways.
> 
> Price difference is 52 USD, which I do not care for.
> 
> System pretty much the same as the sig rig, beside Asus Rampage IV Formula and a new case. + Silver Power 500W PSU.


I don't think the Lightnings are worth it if you're gonna watercool them anyway.

Not sure if you can do quad-fire on the PSU you have, though; tri-fire ref cards with blocks would be the best option for you, I'd say.


----------



## HOMECINEMA-PC

Yo wassup Bilksy


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yo wassup Bilksy


Not much man, see you went out and spent some more cash


----------



## Gobigorgohome

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I don't think the lightnings are worth it if you are gonna watercool them anyways.
> 
> Not sure if you can do quad-fire on the PSU you have though, Tri Fire Ref cards with blocks i'd say would be best option for you


Maybe, but I will use my 500 watt PSU for the motherboard, CPU and water pumps. I am not a big-time overclocker, so I think I am fine with 4x R9 290Xs on the 1300W EVGA G2. I got the impression from the PSU god on OCN that the EVGA G2 1300 was enough for the whole system if I did not overclock much.

Are there many problems with quad-fire over tri-fire?


----------



## Dasboogieman

Quote:


> Originally Posted by *Red1776*
> 
> I am surprised this one did not make your list.
> The JG review can be had here:
> 
> http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story5&reid=189
> Corsair AX 1200W
> 
> Highest rated PSU (9.8) from JG until the AX 1500i
> 
> Incredible Ripple suppression (15mv or lower on most load tests)
> 1200W unit holds down 1550w in 100% load test
> 90% efficiency
> runs cool Intake/exhaust
> 7 year warranty
> I own three of these units and they are incredible units (I have run 4 x R290X OC off unit) as a test
> The OEM is not the usual Seasonic but Flextronics


Yeah, that looks incredible - really close to the AX1200i.
I think I missed it because it wasn't available at my preferred PC store, and I'd prefer to get this PSU by Wednesday; my current one is really groaning, and I'd rather not wait unless the alternative is significantly better. Shame the AX1200i and AX1500i cost like $80 for express shipping (presumably because they're extremely bulky), otherwise I'd have bought the AX1500i in a heartbeat. The EVGA and Seasonic units only cost $17 for express shipping.


----------



## Red1776

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Red1776, what do you think is the smartest thing to do.
> 
> 3x MSI Lightning R9 290X's in Tri-Fire (gaming at 4K)
> 
> OR
> 
> 4x Sapphire Radeon R9 290X's in Quad-Fire (gaming at 4K) - reference cards
> 
> I will do water cooling anyways.
> 
> Price difference is 52 USD, which I do not care for.
> 
> System pretty much the same as the sig rig, beside Asus Rampage IV Formula and a new case. + Silver Power 500W PSU.


Well, quad-fire for 4K. However, I would check on blocks, because the last word from all the manufacturers was no full-cover VGA blocks for the MSI Lightnings. If you are going with universal VGA water blocks, then it's a non-issue.

My solution gearing up for 4K was 4x MSI Gaming edition (reference layout with larger caps) and EK-FC Rev 2.0 blocks.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Maybe, but I will use my 500 watt PSU for motherboard, CPU and waterpumps. I am not a big time overclocker so I think I am fine with 4x R9 290X's on the 1300W EVGA G2. I got the impression by the PSU-GOD on OCN that the EVGA G2 1300 was enough for the hole system if I did not overclock much.


If you are using a 500W for the CPU/mobo, then you have enough there for quad-fire.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Not much man, see you went out and spent some more cash


Yerp I sure did

67" 4K Currrvvvyyy Sammy Frothin bro frothin


----------



## Sgt Bilko

Quote:


> Originally Posted by *Red1776*
> 
> Well the quadfire for 4K , however I would check on blocks because the last word from all manufacturers was no FC VGA blocks for the MSI lightning's. If you are going with the universal VGA WC blocks then its a non issue.
> 
> My solution gearing up for 4k was 4 x MSI game edition (ref layout with larger caps) and EK FC Rev 2.0 blocks.


Pretty sure EK made a block for the 290X Lightning; it's been a while since I checked the site, though.


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yerp I sure did
> 
> 
> 
> 
> 
> 
> 
> 
> 67" 4K Currrvvvyyy Sammy Frothin bro frothin
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Dude.....

i kinda hate you right now

Also, Red: here is the block for the Lightning: LINK


----------



## Gobigorgohome

Quote:


> Originally Posted by *Red1776*
> 
> Well the quadfire for 4K , however I would check on blocks because the last word from all manufacturers was no FC VGA blocks for the MSI lightning's. If you are going with the universal VGA WC blocks then its a non issue.
> 
> My solution gearing up for 4k was 4 x MSI game edition (ref layout with larger caps) and EK FC Rev 2.0 blocks.


I think I understand; EK does have water blocks for the Lightnings, but I don't know if they are universal or not.

4x MSI Gaming edition is about the same as the Lightnings (price-wise) in my country; the difference is only 83 USD per card.
The reference is 650 USD, the Gaming is 767 USD, and the Lightning is 850 USD. But anyway, I am hitting 94 degrees Celsius with the Lightning cards too... so they are not as cool as I thought. One card averaged about 75 degrees Celsius after some 4K gaming.

The sound is the problem, I think, but the Lightning cards are not that quiet either - not two of them, at least.


----------



## Gobigorgohome

Quote:


> Originally Posted by *Sgt Bilko*
> 
> If you are using a 500w for the CPU/Mobo then you have enough there for Quad-fire


I think I will be good with that setup, actually. Return slip, here we go for the Lightnings.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I think I understand, EK has waterblocks for Lightning's yes, do not know if they are universal or not though.
> 
> 4x MSI Gaming edition is about the same as Lightnings (price wice) in my country, difference is only 83 USD per card.
> The reference is 650 USD, while the gaming is 767 USD and the Lightnign is 850 USD, but anyways I am hitting 94 degree celsius with the Lightning cards too ... so it is not as cool as I thought. One card averaged out on about 75 degree celsius after some 4K gaming.
> 
> The sound is the problem I think, but the Lightning-cards are not that quiet either, not two of them at least.


See my above post for Lightning blocks.

But in your case, if you are just going to run them at stock and put them under water, then ref cards will do the job well.


----------



## Gobigorgohome

Quote:


> Originally Posted by *Sgt Bilko*
> 
> See my above post for Lightning blocks
> 
> But in your case if you are just going to run them at stock and putting them under water then Ref cards will do the job well


I have already ordered the 2x EK-FC Lightning water blocks with backplates; I need to cancel that too.


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yerp I sure did
> 
> 
> 
> 
> 
> 
> 
> 
> 67" 4K Currrvvvyyy Sammy Frothin bro frothin


suits the name HOMECINEMA-PC!


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> suits the name HOMECINEMA-PC!


Yerp sure is man

But I'm that crazy I'm gonna get a 28" 4K for my bench room. I can't wait to get that and a 3970X, so I can BEEEEENNNNNNCCCCCCHHHHH again


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yerp sure is man
> 
> 
> 
> 
> 
> 
> 
> 
> But im that crazy im gonna get a 28" 4k for my bench room , I cant wait to get that and a 3970X . So I can BEEEEENNNNNNCCCCCCHHHHH again


Please post screenshots of some games you play, like Deadly does. Thanks. Do you even play games?


----------



## kizwan

Quote:


> Originally Posted by *Dasboogieman*
> 
> Just before Watercooling, problem has cropped up.
> 
> As it turns out, my modest Cooler Master Silent Pro 1000W M (supplies 960W on the 12V rail) cannot actually take 290s in Dual Crossfire and an i5-3570k at 4.5 Ghz (1.27V). According to my calculations, at even 90% load, the PSU is being used roughly 80-85% capacity (I'm assuming fresh capacitors, I haven't taken in to account capacitor ageing from 2 years of being used with SLI GTX 570s). Therefore, the PSU fans ramp up rather spectacularly to create the equivalent noise to a stock 290X cooler at 80%.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Thus, there is absolutely no point in watercooling until I can upgrade the PSU.
> 
> At this wattage level, I've pretty much narrowed down to a short list of PSUs within $400 (I.e. the Superb Corsair AX1200i is ruled out) from data I've gleaned from JonnyGuru
> 
> Analog units
> 
> 1. Evga SuperNova G2 1300W
> + best in class ripple suppression on all rails
> + second best in class voltage regulation
> + Highest rated wattage for price range
> + 10 years warranty
> - possible issue with buzzing fan due to loose insulation (though this might be an isolated incidence)
> - $335
> 
> 2. Seasonic X-1250, 1250W
> + best in class voltage regulation
> + third best ripple suppression
> + 7 years warranty
> + silent fan mode available via switch
> + extremely reputable brand
> - $349
> 
> 3. Enermax Revolution 87+ 1200W
> + multiple 30A rails (i.e. slightly safer if there is a partial short)
> + excellent regulation
> - so so ripple suppression (compared to the top 3)
> - unproven regulation capability at 1200W (the other data was gleaned from the 1000W model)
> - hot at full load
> - $229
> 
> 4. Possibly Seasonic X-1200W Platinum (Though I see little point in paying extra for Plat)
> + more or less the same as the X-1250
> + more efficient
> - less wattage
> - $379
> 
> So what do you guys think? I'm leaning towards either 1 or 2, the $229 Enermax unit looks really tempting but its more unproven and seems less robust to pure abuse.
> 
> Or I can Yolo and get the AX1200i or AX1500i and have no compromises whatsoever with dat sweet sweet DSP


I seriously believe your CM PSU is enough for dual 290s, even overclocked. I'm pretty sure your computer's power consumption is less than mine. Even if you still want to "upgrade" the PSU, a Seasonic X-1050 is more than enough for an i5 & 2x 290s overclocked + watercooling.
Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> suits the name HOMECINEMA-PC!
> 
> 
> 
> Yerp sure is man
> 
> 
> 
> 
> 
> 
> 
> 
> But im that crazy im gonna get a 28" 4k for my bench room , I cant wait to get that and a 3970X . So I can BEEEEENNNNNNCCCCCCHHHHH again
Click to expand...









I need to get myself a bench room!


----------



## fateswarm

I would feel better having a second small PSU for one of the cards or so, if there is ample space in the case. I'm sure there are ways to turn them on simultaneously without having to switch one on manually.


----------



## Red1776

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HOMECINEMA-PC*
> 
> Yerp I sure did
> 
> 
> 
> 
> 
> 
> 
> 
> 67" 4K Currrvvvyyy Sammy Frothin bro frothin
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Dude.....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i kinda hate you right now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, Red: here is the block for the Lightning: LINK
Click to expand...

 cool Sarge,

I asked Peter Sanjay a couple of months ago and the official line was "no plans for an MSI Lightning block"

I'm sure a lot of folks who bought Lightnings and didn't get the OC performance they expected will be happy about that.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Red1776*
> 
> cool Sarge,
> I asked Peter Sanjay a couple months ago and the official line was "no plans for a MSI lightning block"
> I'm sure a lot of folks who bought Lightning's and didn't get the OC performance they expected will be happy about that.


Yeah, I think I posted it in the Lightning owners thread a while back, but tbh I post in a fair few topics so I lose track from week to week


----------



## Durvelle27

Seems Eyefinity has been very problematic for me with VGA


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> pls post screenshots of some games you play like Deadly does. thanks. do you even play games?


Sure do. Will do too









Quote:


> Originally Posted by *kizwan*
> 
> I seriously believe your CM PSU is enough for dual 290's, even overclocked. I'm pretty sure your computer's power consumption is less than mine. Even if you still want to "upgrade" the PSU, a Seasonic X-1050 is more than enough for an i5 & 2 x 290's overclocked + watercooling.
> 
> 
> 
> 
> 
> 
> 
> I need to get myself a bench room!


Yep they are good for hiding stockpiled puter bits


----------



## PachAz

A Cooler Master 1000W not being enough for two R9 290s? I have a hard time believing that, since a stock R9 290 consumes about 200W at full load according to GPU-Z. Your CPU consumes about 77W at most. I will use my Fractal R3 1000W with two R9 290s and an OC'd 4930K, and I have heard such a PSU should be more than enough.


----------



## sugarhell

Quote:


> Originally Posted by *PachAz*
> 
> A Cooler Master 1000W not being enough for two R9 290s? I have a hard time believing that, since a stock R9 290 consumes about 200W at full load according to GPU-Z. Your CPU consumes about 77W at most. I will use my Fractal R3 1000W with two R9 290s and an OC'd 4930K, and I have heard such a PSU should be more than enough.


>Reading your post.
>You say according to gpu-z.
>Software power consumption measure. (facepalm)

A 290 at stock is 250 watts. A 4930K is 150-200 watts at stock.
I don't know any CPU that will consume only 77 watts at stock (except dual cores). Or do you mean the 3770K's 77W TDP = 77 watts power consumption?

Intel's TDP is the average power consumption, not the maximum.

>bye


----------



## PachAz

Well, how should I know, that is why I am asking.


----------



## aneutralname

Quote:


> Originally Posted by *PachAz*
> 
> A Cooler Master 1000W not being enough for two R9 290s? I have a hard time believing that, since a stock R9 290 consumes about 200W at full load according to GPU-Z. Your CPU consumes about 77W at most. I will use my Fractal R3 1000W with two R9 290s and an OC'd 4930K, and I have heard such a PSU should be more than enough.


I have a hard time believing it myself. But he said his problems were the loud fan speeds? Maybe that's what he's concerned about. Sounds like a non-issue to me.



Spoiler: Warning: Spoiler!



4930k 150-200 watt at stock.



Source?


----------



## Scorpion49

I ran a 290 Tri-X overclocked with a 4770K at 4.5GHz on a 450W SFX PSU with zero problems, so they can't be that bad. More than likely that CM PSU just has a big ramp-up in fan speed over a certain load (I have the same PSU in my machine right now with an overclocked 1366 Xeon and the 290).


----------



## Dasboogieman

Quote:


> Originally Posted by *kizwan*
> 
> I seriously believe your CM PSU is enough for dual 290's, even overclocked. I'm pretty sure your computer's power consumption is less than mine. Even if you still want to "upgrade" the PSU, a Seasonic X-1050 is more than enough for an i5 & 2 x 290's overclocked + watercooling.
> 
> 
> 
> 
> 
> 
> 
> I need to get myself a bench room!


I'm not happy about upgrading the PSU either.

I have no doubt the wattage on the CM unit is sufficient; I had over-planned for tri-SLI GTX 570s when I first built this machine. However, what I didn't anticipate was the noise the PSU makes when it surpasses the ideal 60-70% loading mark. It is seriously loud; all the other fans in my PC are quiet Noctuas or massive 230mm Bitfenixes. Both 290s make a distinctive high-pitched whirring noise, but this one is more of a rumble, very similar to the type the stock 290 fans give out. I can manually force the GPU fans to 20% and the rumble continues, so it's definitely the PSU fan.

Despite being rated for 1000W, JonnyGuru actually found that this unit can only really supply 932W. I'm estimating about 10% capacitor degradation due to the age of my unit and the heavy F@H I used to do, so call it roughly 840W.
Tom's found the average consumption of a 290 to be about 232W at stock on a Metro LL workload.
My PSU fan only spins up when both GPUs are OC'd to 1100MHz +37mV.
Thus average 290 power consumption when OC'd is about 270W under Metro LL.
2 x 270W is 540W.
An i5-3570K at 4.6GHz 1.35V is around 110W.
So all up around 650W, not accounting for conversion inefficiencies. That's already about 78% of my current estimated PSU capacity. Actual numbers may be as much as 85% of my PSU capacity, so I can understand the delivery components are gonna get really hot.

The reason I'm aiming for at least 1200W is to guarantee the PSU isn't loaded more than 60-70%, so the heat doesn't prompt the PSU fan to spin up. Not much point watercooling if the PSU whines all day.
Shame you can't watercool a PSU
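As a sanity check, that estimate reduces to a few lines of arithmetic. A quick sketch (all inputs are the post's own estimates, not measurements: JonnyGuru's 932W measured capacity, ~10% capacitor aging, ~270W per OC'd 290, ~110W for the CPU):

```python
# Rough sketch of the PSU headroom estimate above.
# All figures are the post's own estimates, not measurements.
def estimated_load_fraction(rated_w, derating, gpu_w, n_gpus, cpu_w):
    """Return estimated DC-side draw as a fraction of derated PSU capacity."""
    capacity = rated_w * (1 - derating)   # e.g. ~10% capacitor aging
    draw = gpu_w * n_gpus + cpu_w         # ignores conversion inefficiencies
    return draw / capacity

# 932W measured capacity, ~10% aging, two ~270W OC'd 290s, ~110W i5-3570K
frac = estimated_load_fraction(932, 0.10, 270, 2, 110)
print(f"{frac:.0%}")  # roughly 77-78% of remaining capacity
```

On those assumptions the rig sits close to 80% load, which is well past the 60-70% sweet spot that keeps semi-passive PSU fans quiet.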


----------



## pdasterly

Newegg has the evga 1300 supernova on sale. I think I paid 165 for it earlier this month


----------



## steadly2004

Quote:


> Originally Posted by *pdasterly*
> 
> Newegg has the evga 1300 supernova on sale. I think I paid 165 for it earlier this month


I just bought the same PSU, I don't think I'm pushing it just yet, but It's definitely rock steady


----------



## b0x3d

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yerp I sure did
> 
> 
> 
> 
> 
> 
> 
> 
> 67" 4K Currrvvvyyy Sammy Frothin bro frothin


Just in time for the World Cup!


----------



## Sgt Bilko

Quote:


> Originally Posted by *b0x3d*
> 
> Just in time for the World Cup!


For the what?

Oh...that thing where they kick the thing and people scream and go crazy?


----------



## fateswarm

Quote:


> Originally Posted by *Sgt Bilko*
> 
> For the what?
> 
> Oh...that thing where they kick the thing and people scream and go crazy?


I've no idea why anyone likes any sport. If I liked a sport, I'd play it.


----------



## Silent Scone

Well, after getting annoyed at the hitching in BF4 at 4K on my TIs (VRAM), I decided to install the first 290X Vapor-X 8GB to give it a try.

...Not a good start: it won't display 60Hz without flickering like mad over DP 1.2. I have made sure it's set to 1.2 in the OSD. Is there something I'm missing here? Over HDMI, 60Hz works fine (not at 4K, obviously). Has anyone else experienced this issue? I can't see how it can be the cable, as it was fine on my TIs.


----------



## Ramzinho

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yerp I sure did
> 
> 
> 
> 
> 
> 
> 
> 
> 67" 4K Currrvvvyyy Sammy Frothin bro frothin


i hope you are behind a VPN cause i'm tracking you down and stealing that in the near future







... Now to think how to move this without being noticed







LOL


----------



## Silent Scone

Quote:


> Originally Posted by *aneutralname*
> 
> As I said before, welcome to ATI. There is a reason Nvidia has most of the market share, and it's not price/performance.


Helpful, thanks.









Not heard the name ATi for a good few years. Rock + trapped?


----------



## fateswarm

I'm probably an exception: while I get the *idea* that NVIDIA is better, I had the better experience with AMD. I had an NVIDIA SLI setup that just exploded and an AMD card that lasted for years.


----------



## Silent Scone

Well, I must say this 4K DP 1.2 issue I'm getting isn't a great start to 4K gaming on AMD







. But I'll reserve judgement. There must be someone else having the same problem, surely?


----------



## pkrexer

My 860W Seasonic pushes my overclocked crossfired 290Xs + 4770K @ 4.6 without any issues. The fan does kick on under heavy load, but I can't hear it over my rad fans. I would think 1000W would be more than enough.


----------



## PachAz

I will be using an overclocked 4930K and they consume a lot of power. The GPU I will be running at stock, like 1000/1300.


----------



## aneutralname

Quote:


> Originally Posted by *Silent Scone*
> 
> Helpful, thanks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not heard the name ATi for a good few years. Rock + trapped?


It was intentional, as the driver issues date to the ATI years. Having used both brands, I always run into a lot of "unexpected" issues with ATI cards.


----------



## Red1776

I am somewhat finicky about power, both quality and quantity. I run 2200W and have seen over 1900W from the wall. A 1000W PSU with 2, or certainly 3, cards and a WC system is pushing it a bit for my taste. But that's just me.


----------



## rdr09

Quote:


> Originally Posted by *aneutralname*
> 
> It was intentional, as the driver issues date to ATI years. Having used both brands, I always run into a lot of "unexpected" issues with ATI cards.
> truth much


have you ever thought of using and sticking with Xbox? PnP.

edit: not being sarcastic. serious question.


----------



## PachAz

2 or 3 cards... big difference, dude, when we are talking about the power-hungry R9 290/290X.


----------



## aneutralname

Quote:


> Originally Posted by *Red1776*
> 
> I am somewhat finicky about power, both quality and quantity. I run 2200W and have seen over 1900W from the wall. A 1000W PSU with 2, or certainly 3, cards and a WC system is pushing it a bit for my taste. But that's just me.


You need to factor efficiency when looking at values from the wall. It does not translate to the rated wattage.
Quote:


> Originally Posted by *rdr09*
> 
> have you ever thought of using and sticking with Xbox? PnP.
> 
> edit: not being sarcastic. serious question.


Expecting a product to behave like it's supposed to and like what you paid for has nothing to do with desiring simplicity. It's rather a matter of quality and respect for customers, which are things that ATI has trouble understanding.


----------



## Silent Scone

Bit pissed off, TBH. It would appear there is no official support for DP 1.2 @ 60Hz 4K... a few people are having the same issue.


----------



## rdr09

Quote:


> Originally Posted by *aneutralname*
> 
> You need to factor efficiency when looking at values from the wall. It does not translate to the rated wattage.
> Expecting a product to behave like it's supposed to and like what you paid for has nothing to do with desiring simplicity. It's rather a matter of quality and respect for customers, which are things that ATI has trouble understanding.


not sure what's going on with your hawaii card but mine is flawless. when you point at the card as being at fault, check how many fingers are pointing back at you.

i am not saying AMD is perfect, since i come here almost every day to help others with issues they have. at least i try. i think the response from silent to you says it all.


----------



## aneutralname

Quote:


> Originally Posted by *rdr09*
> 
> not sure what's going on with your hawaii card but mine is flawless. when you point at the card as at fault check how many fingers are pointing back at you.
> 
> i am not saying AMD is perfect, since i come here almost everyday to help others with issues they have. at least i try. i think the response from silent to you says it all.


Compatibility problems. Programs act up, certain drivers don't work for mining, certain resolutions and refresh rates won't work by default on certain games. Downsampling is a nightmare. Monitor won't go into sleep mode.

This is only with this card; I could tell you about when I owned a 1950GT and couldn't get it to work properly for 2 months. I wasted a lot of money on a new power supply and a lot of time trying to figure out the problem, which was that only some old drivers were able to make the card perform decently. I remember compatibility problems with games back then too.
Quote:


> i think the response from silent to you says it all.


That statement has no weight since we are in an ATI thread, and I have only posted my views a few minutes ago.


----------



## kizwan

Quote:


> Originally Posted by *aneutralname*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> not sure what's going on with your hawaii card but mine is flawless. when you point at the card as at fault check how many fingers are pointing back at you.
> 
> i am not saying AMD is perfect, since i come here almost everyday to help others with issues they have. at least i try. i think the response from silent to you says it all.
> 
> 
> 
> Compatibility problems. Programs act up, *certain drivers don't work for mining*, certain resolutions and refresh rates won't work by default on certain games. Downsampling is a nightmare. Monitor won't go into sleep mode.
> 
> This is only with this card, I could tell you about when I owned a 1950gt, and I couldn't get it to work properly for 2 months. I wasted a lot of money on a new power supply and a lot of time trying to figure out the problem, which was that only some old drivers were able to make the card perform decently. I remember compatibility problems with games back then too.
> Quote:
> 
> 
> 
> i think the response from silent to you says it all.
> 
> Click to expand...
> 
> That statement has no weight since we are in an ATI thread, and I have only posted my views a few minutes ago.
Click to expand...

Aaahhh! The plot thickens...


----------



## Gobigorgohome

Do I need to look for anything special when I do the switch from 2x MSI Lightning R9 290Xs to 4x Sapphire Radeon R9 290Xs? The power usage, of course, but aside from that? Is it harder to configure than normal CrossFire? I will not do much overclocking, but I will watercool them soon after I get them.


----------



## rdr09

Quote:


> Originally Posted by *aneutralname*
> 
> Compatibility problems. Programs act up, certain drivers don't work for mining, certain resolutions and refresh rates won't work by default on certain games. Downsampling is a nightmare. Monitor won't go into sleep mode.
> 
> This is only with this card, I could tell you about when I owned a 1950gt, and I couldn't get it to work properly for 2 months. I wasted a lot of money on a new power supply and a lot of time trying to figure out the problem, which was that only some old drivers were able to make the card perform decently. I remember compatibility problems with games back then too.
> That statement has no weight since we are in an ATI thread, and I have only posted my views a few minutes ago.


you should really try Xbox. You plug it in and play. For mining, Nvidia cards can now mine just as well.


----------



## aneutralname

Quote:


> Originally Posted by *kizwan*
> 
> Aaahhh! The plot thickens...


Ehm, not really. Nvidia cards don't have more issues when mining; they just suck at it, but you know that before you buy them. The equivalent would be something like PhysX not working for certain games on Nvidia, which I never experienced.


----------



## aneutralname

Quote:


> Originally Posted by *rdr09*
> 
> you should really try Xbox. You plug it and play. For mining, Nvidia cards can now mine just as good.


No they can't. Blaming customers for a company's poor quality is simply pathetic.


----------



## rdr09

Quote:


> Originally Posted by *aneutralname*
> 
> No they can't. Blaming customers for a company's poor quality is simply pathetic.


mine works flawlessly but i don't mine. had mine since November.

anyways, we have other owners wanting to be heard and can't 'cause of us, so . . . see ya!


----------



## aneutralname

Quote:


> Originally Posted by *rdr09*
> 
> mine works flawlessly but i don't mine. had mine since November.
> 
> anyways, we have other owners wanting to be heard and can't 'cause of us, so . . . see ya!


What Nvidia card mines as well as an R9 290?

I only speak from experience; I didn't have anything against ATI and gave them a fair chance, tired of Nvidia's pricing and willingness to push people to adopt new standards. I didn't like the results so far.

Have a nice day.


----------



## rdr09

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Do I need to look for anything special when I do the switch from 2x MSI Lightning R9 290Xs to 4x Sapphire Radeon R9 290Xs? The power usage, of course, but aside from that? Is it harder to configure than normal CrossFire? I will not do much overclocking, but I will watercool them soon after I get them.


anyone?

i recommend testing them individually. install one card and install driver before crossfiring.


----------



## Notion

Hi peeps, looking for some knowledge and advice please. I am looking for the best hassle-free card for 3 monitors. Just got rid of my 7950, as the active cable thing was a hassle: 2 broke and they're an ar*e to replace. I am looking at a 290 or a 770 4GB or a 780. I hear the 780 has issues with FPS, but I'm not convinced. Just don't know which way to go.


----------



## Red1776

Quote:


> Originally Posted by *PachAz*
> 
> 2 or 3 cards...big differance dude, when we are talking about the power hungry r9 290/290x.


You need to read what I wrote if you are going to respond. I see a lot of members waxing on about powering 2- or 3-GPU systems with 1000W-1200W PSUs (and OC'd at that) with WC loops. My point is I prefer (and think it is a very good idea) to have a good amount of headroom, like 25-30%, for that amount of draw. I thought that was a fairly obvious conclusion... my mistake.

Quote:


> Originally Posted by *aneutralname*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> I am somewhat finicky about power, both quality and quantity. I run 2200W and have seen over 1900w from the wall. a 1000w PSU with 2 or certainly three cards and WC system is pushing it a bit for my taste. but that's just me.
> 
> 
> 
> You need to factor efficiency when looking at values from the wall. It does not translate to the rated wattage.
> 
No kidding, see above response to PachAz. I don't care how efficient your PSU of choice is, 1000W is not enough for a 2- or 3-GPU system. And your comment about factoring efficiency buttresses my point even further. Never mind the fact that I added the caveat about this being my opinion and/or preference.
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> have you ever thought of using and sticking with Xbox? PnP.
> 
> edit: not being sarcastic. serious question.
> 
> Click to expand...
> 
> Expecting a product to behave like it's supposed to and like what you paid for has nothing to do with desiring simplicity. It's rather a matter of quality and respect to customers, which are things that ATI has trouble understanding.
Click to expand...

Maybe the issue is that you are stuck in the ATI era. I have been building high-end gaming rigs since 2007, mostly multi-GPU rigs, including a quadfire machine for my own personal use, every 9 months or so.



Both team red and team green have had periods when they excelled and some low spots, but I have not had more problems with one than the other (and I work largely with multi-GPU setups). If AMD GPUs were so bad and troublesome, believe me, I would have left AMD so fast my shorts would have had to catch the next train.

Quote:


> It's rather a matter of quality and respect to customers, which are things that ATI has trouble understanding.


That is a throwaway line. When I have needed support from ATI... er, AMD, they have been responsive, respectful, and timely.


----------



## Gobigorgohome

Quote:


> Originally Posted by *rdr09*
> 
> anyone?
> 
> i recommend testing them individually. install one card and install driver before crossfiring.


Yes, I always do that. The driver is already installed; I am not going to use the computer before I get the new cards. I am also returning my motherboard, so I will be without a computer for a couple of weeks.


----------



## b0x3d

Reading through forums recently, all I ever seem to see is people returning hardware. It doesn't seem normal for so many faulty products to be sent out. Or do you think people return stuff without properly trying to make it work? If so much stuff is being sent back, it makes me wonder how much stuff I'm buying that has already been returned by someone!


----------



## Gobigorgohome

Quote:


> Originally Posted by *b0x3d*
> 
> Reading through forums recently, all I ever seem to see is people returning hardware. It doesn't seem normal for so many faulty products to be sent out. Or do you think people return stuff without properly trying to make it work? If so much stuff is being sent back, it makes me wonder how much stuff I'm buying that has already been returned by someone!


If something has been used, it usually is re-sold as demo/used, at least in my country. I return a lot of hardware which I do not think is good enough, or for the simple reason that I do not want it. It is an easy way to make changes for people who do not mind returning some packages. As a rule of thumb I do not buy used hardware, even though I am sure some of the parts I have gotten, which should be new, have been used. Therefore I only purchase products from a trusted seller which does not send used hardware while claiming it is new. That is a scam and I do not like it; with all the return policies of today there is nothing wrong with testing a product for two weeks and then sending it back to get your money back, at least that is how it works here.

So the conclusion you have to draw from this is: ALWAYS buy hardware from trusted sellers, unless you do not care if you got something used for the price of something new. I do care about that.

I am just pointing it out.


----------



## Silent Scone

Quote:


> Originally Posted by *b0x3d*
> 
> Reading through forums recently, all I ever seem to see is people returning hardware. It doesn't seem normal for so many faulty products to be sent out. Or do you think people return stuff without properly trying to make it work? If so much stuff is being sent back, it makes me wonder how much stuff I'm buying that has already been returned by someone!


You can count me sending mine back if they don't add proper 4k DP 1.2 support.


----------



## Widde

Quote:


> Originally Posted by *b0x3d*
> 
> Reading through forums recently, all I ever seem to see is people returning hardware. It doesn't seem normal for so many faulty products to be sent out. Or do you think people return stuff without properly trying to make it work? If so much stuff is being sent back, it makes me wonder how much stuff I'm buying that has already been returned by someone!


I think some of it is people not being comfortable with the heat/noise from the ref cards. My Sapphire was a return that I bought back in January.


----------



## Mega Man

Quote:


> Originally Posted by *Silent Scone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *b0x3d*
> 
> Reading through forums recently, all I ever seem to see is people returning hardware. It doesn't seem normal for so many faulty products to be sent out. Or do you think people return stuff without properly trying to make it work? If so much stuff is being sent back, it makes me wonder how much stuff I'm buying that has already been returned by someone!
> 
> 
> 
> You can count me sending mine back if they don't add proper 4k DP 1.2 support.
Click to expand...

i love how you keep pointing at the GPU and you don't even think it could be a problem on the display side!

plenty of people have used it with 4k without issue


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Ramzinho*
> 
> i hope you are behind a VPN cause i'm tracking you down and stealing that in the near future
> 
> 
> 
> 
> 
> 
> 
> ... Now to think how to move this without being noticed
> 
> 
> 
> 
> 
> 
> 
> LOL


This will be waiting for you ....


----------



## dartuil

LOL


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *b0x3d*
> 
> Just in time for the World Cup!


Just watched Italy beat England 2-1









Quote:


> Originally Posted by *Sgt Bilko*
> 
> For the what?
> 
> Oh...that thing where they kick the thing and people scream and go crazy?


LooooooL kick the thing


----------



## Roy360

How much would a G530 bottleneck an R9 290X?

I'm thinking of trading my spare G530 and R9 290X for my brother's 4670K. I realize I'm getting ripped off, but I'm wondering if he'll be able to use the R9 with that processor.

Will the R9 compensate for the CPU, or will he be stuck at low settings?


----------



## Roy360

Quote:


> Originally Posted by *aneutralname*
> 
> What Nvidia card mines as well as an R9 290?
> 
> I only speak from experience, didn't have anything against ATI and gave them a fair chance tired of Nvidia's pricing and willingness to push people to adopt new standards. Didn't like the results so far.
> 
> Have a nice day.


For X11 coins, 750 Tis are known to get 2.6 MH/s whereas R9 290s get 3 MH/s.

$169.99 vs $409.99

60W vs 200W
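Normalizing that comparison as hashrate per watt and per dollar (using the figures quoted above; the prices and rates are the post's, not current data) shows why the 750 Ti is the efficiency pick even though the 290 has the higher raw rate:

```python
# X11 mining efficiency from the figures above (the post's numbers, not current data)
cards = {
    "GTX 750 Ti": {"mhs": 2.6, "watts": 60, "price": 169.99},
    "R9 290":     {"mhs": 3.0, "watts": 200, "price": 409.99},
}

for name, c in cards.items():
    per_watt = c["mhs"] / c["watts"] * 1000    # kH/s per watt
    per_dollar = c["mhs"] / c["price"] * 1000  # kH/s per dollar spent
    print(f"{name}: {per_watt:.1f} kH/s per watt, {per_dollar:.1f} kH/s per dollar")
```

By these numbers the 750 Ti delivers roughly 43 kH/s per watt versus 15 for the 290, so the 290 only wins if raw hashrate per card slot is what matters.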

Not related to R9 290s, but in a case with literally no air flow, would anyone suggest a R9 270X over a GTX 750 ti?
I've noticed that powercolor released a passive cooler for the card, so maybe I could survive with the DualX version?


----------



## Gobigorgohome

Can anyone do me the favour of running two reference R9 290Xs (Sapphire Radeon R9 290X) in 3DMark 11 Xtreme? Just benchmarks.

My system is completely stock with a 3930K at 3.8GHz, 2x MSI Lightning R9 290X and some other stuff, and I get X9201 (which I think is good). Do the reference cards get over 9000?


----------



## Scorpion49

Quote:


> Originally Posted by *Roy360*
> 
> How much would a G530 bottleneck a R9 290X?
> 
> I'm thinking of trading my spare G530 and R9 290X for my brother's 4670k. I realize I'm getting ripped, but I"m wondering if he'll be able to use the R9 with that processor.
> 
> Will the R9 compensate for the CPU? or will he be stuck at low settings?


Badly. Really badly. I tried one of those dual core pentiums with my Titan a while ago and it was awful.


----------



## CoolRonZ

add me please

http://www.techpowerup.com/gpuz/hang8/

POWERCOLOR AXR9 290X OC BF4 edition watercooled with an XSPC block


----------



## Arizonian

Quote:


> Originally Posted by *CoolRonZ*
> 
> add me please
> 
> http://www.techpowerup.com/gpuz/hang8/
> 
> POWERCOLOR AXR9 290X OC BF4 edition watercooled with an XSPC block
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## pdasterly

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> This will be waiting for you ....


Lol, peashooter. Seriously though, I'm having a problem with my system. I'm thinking it's my power supply, an EVGA 1300 SuperNOVA. The VGA power cable has both a 6+2 and a 6-pin plug on a single lead going to the PSU; does this provide enough juice to the video card?

In my previous setup I had two cables from the PSU going to the card, one 6+2 and one 6-pin.


----------



## Ramzinho

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> This will be waiting for you ....


Good, now I have to wear a bulletproof vest


----------



## Ramzinho

Quote:


> Originally Posted by *pdasterly*
> 
> Lol peashooter, seriously though im having problem with system. Im thinking its my power supply. evga 1300 supernova. The vga power plug has both 6+2 and 6 pin plug on single plug going to psu, does this provide enough juice to the video card?
> 
> Previous setup I had two cables from psu going to card, one 6+2 and one 6 pin


if it's the EVGA SuperNOVA G2, it's an awesome PSU. Does your GPU require 2x 8-pin, or 1x 6-pin + 1x 8-pin?


----------



## HOMECINEMA-PC

@pdasterly
How many cards are you running? That's a normal plug arrangement for VGA cards. How many amps are running off the 12V rail? 45A minimum I think you need.
1200 watter will run 3 mildly clocked r9 290's and a hex @ [email protected] . But I added a 2nd psu so I could bench 3 290'[email protected]
Peashooter LooooL


----------



## pdasterly

2 cards. Just got off the phone with EVGA; they have a 24/7 help line. They said the PSU should supply all the watts the cards need, BUT they recommend that I run two separate power lines to the GPU, because the 290X is a fat card. Switched power cables and now all my random crashes and BSODs are gone. Wiped out my SSD for nothing; at least now I have a fresh clean install. They said it's the amps it's pulling, not the watts.


----------



## KyadCK

Quote:


> Originally Posted by *pdasterly*
> 
> 2 cards, just got off phone with evga, they have 24/7 help line. They said psu should supply all the watts the cards need, BUT they recommend that I run two seperate power lines to the gpu, cause the 290x is a fat arse. Switched power cables now all my random crashes and bsod are gone. Wiped out my ssd for nothing, at least now I have a fresh clean install. *They said It's the amps its pulling not the watts.*


Well, they were a bit misleading... Volts (12) x Amps (25) = Watts (300). Since the voltage going to the GPU always will be (or should be) 12, watts and amps describe the same draw; the wattage is just 12 times the amp figure. But at least they got your problem solved.

And yes, I've never encouraged people to run a single line to a GPU; it's just a bad idea. Honestly, you're lucky: trying to pull too many amps over a wire that isn't designed for it (isn't a strong enough gauge) can result in the insulation melting and then the wires shorting on each other. Sounds like the SuperNOVA caps how much power can be drawn from each power output, so the result is a starved GPU but no permanent damage done.

And yes, the 290X is a big boy. So's GK110 and Fermi. Go big or go home.
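The V x A = W relationship above reduces to two one-line conversions; 12V is the nominal ATX rail voltage, and 25A/300W is the example from the post:

```python
# Ohm's-law bookkeeping for a 12V GPU power rail (nominal ATX voltage).
RAIL_V = 12.0

def amps_to_watts(amps, volts=RAIL_V):
    """Power drawn at a given current on the rail."""
    return volts * amps

def watts_to_amps(watts, volts=RAIL_V):
    """Current needed to deliver a given power on the rail."""
    return watts / volts

print(amps_to_watts(25))   # 300.0 W, the example from the post
print(watts_to_amps(300))  # 25.0 A
```

This is why a PSU's 12V amp rating and its wattage rating are interchangeable figures for GPU loads, and why per-cable current limits, not total wattage, were the bottleneck here.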


----------



## fh12volvo

Hello, now I have 2 R9 290s
1. http://www.techpowerup.com/gpuz/9cx2d/ , http://www.techpowerup.com/gpuz/ghv65/
2. Sapphire & VTX3D
3. Stock


----------



## KnownDragon

I have a question. My XFX R9 290 will overclock to 1126 core and 1400 mem with no added voltage. When I attempt to add even a little voltage, the performance at the same overclock goes down drastically, e.g. Firestrike without added voltage is 10165, and with voltage added at the same clocks it's 9089. Does anyone have any thoughts about this?


----------



## Xylene

Quote:


> Originally Posted by *KnownDragon*
> 
> I have a question. My XFX R9 290 will overclock to 1126 core and 1400 mem with no added voltage. When I attempt to add even a little voltage, the performance at the same overclock goes down drastically, e.g. Firestrike without added voltage is 10165, and with voltage added at the same clocks it's 9089. Does anyone have any thoughts about this?


Are you running the fan set on auto? If so, it's probably throttling.


----------



## Rimasleon

Hello guys. First, sorry about my English, it's not very good.
I think I have a problem but I'm not sure. I just bought an MSI R9 290 Twin Frozr from a miner and I think my temperatures are too high.
In Gamer mode in Valley Benchmark 1.0 I hit 85C at 72% fan speed; is that normal? My PC case is well ventilated. The card is running at stock speeds, 977MHz and 1250MHz. I have replaced the TIM and tightened up the backplate screws.
Yesterday I got 90C at 60% fan speed in Battlefield 4.

In 3DMark I hit 82C and ~62% fan speed. http://www.3dmark.com/3dm/3294183
Today I ran Valley Benchmark 1.0 at 100% fan speed and hit 76C max.

So are these temperatures normal for this model, or should I RMA it?


----------



## fateswarm

Quote:


> Originally Posted by *KyadCK*
> 
> Well, they were a bit misleading: Volts (12) x Amps (25) = Watts (300). Since the voltage going to the GPU will always be (or should be) 12V, watts and amps describe the same load; the wattage is just 12 times the amp figure. But at least they got your problem solved.


Maybe they think the voltage droops.


----------



## KyadCK

Quote:


> Originally Posted by *fateswarm*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KyadCK*
> 
> Well, they were a bit misleading: Volts (12) x Amps (25) = Watts (300). Since the voltage going to the GPU will always be (or should be) 12V, watts and amps describe the same load; the wattage is just 12 times the amp figure. But at least they got your problem solved.
> 
> 
> 
> Maybe they think the voltage droops.
Click to expand...

On a SuperNOVA 1300? I sure hope the 12V doesn't drop out of spec; that rail is rated very highly.

Even putting a 6-pin and an 8-pin connector on a single line is a bit scary, though. I personally don't have experience with their cables, but to me it has always seemed as wrong as actually using one of those Molex-to-6-pin adapters vendors sometimes include.


----------



## Scorpion49

Quote:


> Originally Posted by *Rimasleon*
> 
> Hello guys. First, sorry about my English; it's not very good.
> I think I have a problem, but I'm not sure. I just bought an MSI R9 290 Twin Frozr from a miner, and I think my temperatures are too high.
> In gamer mode in Valley Benchmark 1.0 I hit 85C at 72% fan speed. Is that normal? My PC case is well ventilated. The card is running at stock speeds (977MHz and 1250MHz). I have replaced the TIM and tightened up the backplate screws.
> Yesterday I got 90C at 60% fan speed in Battlefield 4.
> 
> In 3DMark I hit 82C at ~62% fan speed. http://www.3dmark.com/3dm/3294183
> Today I ran Valley Benchmark 1.0 at 100% fan speed and hit 76C max.
> 
> So are these normal temperatures for this model, or can I RMA it?


That seems quite high for that cooler. I'm sure someone else who has one will comment, but my Vapor-X and Tri-X cards never went above 70*C even overclocked, and I know the Twin Frozr IV is a good cooler.


----------



## KyadCK

Alright, so Bilko and I looked into the PSU stuff about needing a second line, since he has a PSU that piggybacks the 6-pin off the 8-pin as well. His wires are 18 AWG, which with a normal card would be perfectly fine, but when Hawaii gets involved things can get weird.

Unlike other cards, Hawaii draws almost nothing from motherboard power, offloading its hunger onto the PCIe power cables. Combined with the single line feeding both the 6- and 8-pin, that leaves just three 18 AWG wires trying to deliver about 250W.

250W / 12V ≈ 21A.

21A / 3 wires = 7A per wire.

18 AWG wire is rated for about 7A per wire.

If his card manages to pull much over 250W under load (which, let's face it, 290Xs do), he would be overdrawing the line and the card would be starved for power.

If AMD's cards followed the PCIe spec perfectly, the wires would be able to handle the load. But they don't, so the wires can't. Live and learn, eh?

TL;DR: EVGA tech support knows their stuff. Props to them, although this makes me wish they would use thicker-gauge wire, or just not do that piggyback thing at all.
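The arithmetic above can be sanity-checked in a few lines of Python. The 7 A figure is the per-wire rating assumed in the post; real ampacity tables vary with insulation type and bundling:

```python
# Per-wire current for a card fed by one piggybacked 6-pin + 8-pin
# cable with three +12 V conductors (figures taken from the post above).
RAIL_VOLTAGE = 12.0
WIRE_AMPACITY_18AWG = 7.0  # per-wire rating assumed in the post; real
                           # ratings vary with insulation and bundling

def amps_per_wire(card_watts: float, num_12v_wires: int) -> float:
    return (card_watts / RAIL_VOLTAGE) / num_12v_wires

# ~250 W over three wires sits right at the assumed 7 A rating...
print(round(amps_per_wire(250, 3), 2))              # 6.94
# ...while a heavier load clearly exceeds it.
print(amps_per_wire(300, 3) > WIRE_AMPACITY_18AWG)  # True
```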


----------



## fateswarm

Do those cables stop drawing current over ampacity or just overheat or both?


----------



## battleaxe

Quote:


> Originally Posted by *fateswarm*
> 
> Do those cables stop drawing current over ampacity or just overheat or both?


They don't stop drawing current. But as you push too much power through them, the voltage drop across the cable grows to the point that the card is being starved. (The power supply might be sourcing 11 amps, but by the time it crosses the cable the card isn't getting the voltage, and therefore the watts, it needs.) So it will lock up or make the PC shut down; at the very least you would start seeing artifacts. This is obviously very bad for your hardware, BTW.

The heat is a result of that same resistance when you push too many amps through the cables.
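What battleaxe is describing is resistive voltage drop, V = I x R, with cable self-heating of I² x R. A minimal Python sketch; the 0.05 Ω round-trip cable resistance is an illustrative assumption, not a measured value for any particular PSU cable:

```python
# Resistive voltage drop (V = I * R) and cable self-heating (P = I^2 * R).
# The 0.05 ohm round-trip cable resistance is an illustrative assumption.
CABLE_RESISTANCE_OHMS = 0.05

def voltage_at_card(rail_volts: float, amps: float) -> float:
    # The card sees the rail voltage minus the drop across the cable.
    return rail_volts - amps * CABLE_RESISTANCE_OHMS

def heat_in_cable_watts(amps: float) -> float:
    # Heat dissipated in the cable itself grows with the square of current.
    return amps ** 2 * CABLE_RESISTANCE_OHMS

# Doubling the current doubles the drop but quadruples the cable heating.
print(voltage_at_card(12.0, 10))  # 11.5
print(heat_in_cable_watts(10))    # 5.0
print(heat_in_cable_watts(20))    # 20.0
```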


----------



## Scorpion49

Quote:


> Originally Posted by *fateswarm*
> 
> Do those cables stop drawing current over ampacity or just overheat or both?


Generally, when a wire is in an overcurrent situation it will heat up, eventually melting its insulation and shorting to whatever is near it. Hopefully the PSU's OCP would kick in before then, though.


----------



## fateswarm

Quote:


> Originally Posted by *battleaxe*
> 
> They don't stop drawing current. But as you push too much power through them, the voltage drop across the cable grows to the point that the card is being starved. (The power supply might be sourcing 11 amps, but by the time it crosses the cable the card isn't getting the voltage, and therefore the watts, it needs.) So it will lock up or make the PC shut down; at the very least you would start seeing artifacts. This is obviously very bad for your hardware, BTW.
> 
> The heat is a result of that same resistance when you push too many amps through the cables.


Yeah, I meant reduce current.







That "stop current" does sound extreme. Thanks.


----------



## Gobigorgohome

Is it possible to overclock the reference R9 290X (Sapphire, non-Tri-X) to about 1080 on the core and leave everything else (voltage and memory) at stock?


----------



## Ramzinho

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Is it possible to overclock the reference R9 290X (Sapphire Radeon non TRI-X) to about 1080 on core and leave anything else at stock (voltage and memory)?


You gotta try, but you might need some cooling..


----------



## CoolRonZ

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Is it possible to overclock the reference R9 290X (Sapphire Radeon non TRI-X) to about 1080 on core and leave anything else at stock (voltage and memory)?


I would think so, easily. My PowerColor 290X OC could do that, and as far as I know it's reference; it's got the same markings on the PCB as the PCS+, so I'm assuming it's exactly the same and has the exact same BIOS as the MSI Gaming series. The only thing needed was to take the cooler off and reapply thermal paste. From the factory there was way too much, like a big glob of gum; it cooled way better after reapplying. It was really loud, but stable. Overvolting never really made any difference, even under water; my card seemed to be 1100/1500 on air and 1150/1525 under water, and since overvolting only added maybe 25MHz on the GPU, it's not really worth it in my eyes. I guess it really depends on the luck of the draw, but I would say it can do what you would like it to do.


----------



## Mega Man

Yes, it will be fine; you could run far greater current through that wire and still be OK.


----------



## Gobigorgohome

Quote:


> Originally Posted by *Ramzinho*
> 
> You gotta try, but you might need some cooling..


I will very likely go water cooling, with full-cover waterblocks, backplates and everything.
Quote:


> Originally Posted by *CoolRonZ*
> 
> I would think so, easily. My PowerColor 290X OC could do that, and as far as I know it's reference; it's got the same markings on the PCB as the PCS+, so I'm assuming it's exactly the same and has the exact same BIOS as the MSI Gaming series. The only thing needed was to take the cooler off and reapply thermal paste. From the factory there was way too much, like a big glob of gum; it cooled way better after reapplying. It was really loud, but stable. Overvolting never really made any difference, even under water; my card seemed to be 1100/1500 on air and 1150/1525 under water, and since overvolting only added maybe 25MHz on the GPU, it's not really worth it in my eyes. I guess it really depends on the luck of the draw, but I would say it can do what you would like it to do.


It seems good. I won't use the air cooler very long, though. I thought the MSI Lightning R9 290X was too loud, so I guess my mind will be blown by the reference cards. I did some benchmarking and gaming with 2x MSI Lightning R9 290Xs, and I had to keep the windows in my room open or the cards would thermal throttle at 94 degrees Celsius, and I live in Norway.








Quote:


> Originally Posted by *Mega Man*
> 
> Yes, it will be fine; you could run far greater current through that wire and still be OK.


I am not a big-time overclocker; I just like things to run cool and quiet, and a gaming machine with air cooling, loud noise, and temperatures like the Sahara doesn't really appeal to me. The headset is for sound, not noise cancellation, and with four reference cards (or any cards) that wouldn't work out. I will be very happy if I can do 1080 on the core, maybe 1100 if my PSU can take the power draw.


----------



## aaroc

With the PCIe cables that the Corsair AX860i and AX1200i come with, is it safe to run one cable to each R9 290? Or do I need to use 2 cables for each R9 290?


----------



## Scorpion49

Well, after one week of use my Sapphire R9 290 Vapor-X has officially died. I played about an hour of BF4 and then the screen went black; no display on two different PCs with it now. What a great investment that card was.


----------



## fateswarm

What were the temps?


----------



## Scorpion49

Quote:


> Originally Posted by *fateswarm*
> 
> What were the temps?


Max I saw was 72*C; regardless, the card should handle up to 95*C just like every other 290. I've already been having problems with it, with lines on the desktop at 120Hz refresh rate, so I am not surprised it finally died.


----------



## kizwan

Quote:


> Originally Posted by *aaroc*
> 
> With the PCIe cables that the Corsair AX860i and AX1200i come with, is it safe to run one cable to each R9 290? Or do I need to use 2 cables for each R9 290?


I have a Seasonic PSU and I use one cable for each 290, but the cable is actually two cables into one connector.


----------



## bluedevil

Quote:


> Originally Posted by *kizwan*
> 
> I have Seasonic PSU & I use one cable to each 290 but the cable was actually two cables in one connector.


I have my Capstone 550M with 2 PCIe connectors on one cable and it's fine.


----------



## Gobigorgohome

Quote:


> Originally Posted by *Scorpion49*
> 
> Well, after one week of use my Sapphire R9 290 Vapor-X has officially died. I played about an hour of BF4 and then the screen went black, no display on two different PC's with it now. What a great investment that card was.


Maybe a silly question, but have you changed the DisplayPort cable (if you are at 4K)? That was the problem with my black flickering screen on my Samsung U28D590D (only in BF4, though).

Just RMA it if you bought it new.


----------



## Scorpion49

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Maybe a silly question, but have you changed the displayport-cable (if you are at 4K)? That was the problem with my black flickering screen with my Samsung U28D590D (only in BF4 though).
> 
> Just RMA it if you bought it new.


I do not have my 4K monitor yet; it will be here this week. This card is as dead as dead can be: no HDMI, DVI, or DP output at all now, on two different machines.


----------



## Arizonian

Quote:


> Originally Posted by *fh12volvo*
> 
> Hello now i have 2 R9 290 biggrin.gif
> 1. http://www.techpowerup.com/gpuz/9cx2d/ , http://www.techpowerup.com/gpuz/ghv65/
> 2. Sapphire & VTX3D
> 3. Stock


Congrats- added


----------



## pdasterly

How do I join the club?


----------



## Forceman

Quote:


> Originally Posted by *KnownDragon*
> 
> I have a question. My xfx r9 290 will overclock to 1126 core and 1400 mem with no added voltage. I have attempted to add voltage just a little and when I do the performance of the same overclock goes down drastically. I.E. Firestrike without voltage is 10165 and with voltage added to same clock 9089. Does anyone have any thoughts about this?


Did you increase the power limit? Might be power throttling once you increase the voltage.


----------



## mAs81

Hello everyone!
I now own a Sapphire VAPOR-X R9 290 4GB TRI-X OC,and would like to join the club with all the other cool kids









GPU-Z validation


----------



## Silent Scone

Only 4GB?










Great purchase. One of the best Hawaii products on the market!


----------



## mAs81

Quote:


> Originally Posted by *Silent Scone*
> 
> Only 4GB?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Great purchase. One of the best Hawaii products on the market!


Thanks man , I love it so far...









here are some benchmarks
Firestrike

Valley:


Runs a little hot in my Obsidian 350D case, but it's summer in Greece now, so I guess that is to be expected



Though I haven't overclocked it yet, I'm having some weird black screen issues at startup that I need to look into..


----------



## kizwan

Quote:


> Originally Posted by *mAs81*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Silent Scone*
> 
> Only 4GB?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Great purchase. One of the best Hawaii products on the market!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks man , I love it so far...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> here are some benchmarks
> Firestrike
> 
> Valley:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Runs a little hot in my Obsidian 350D case,but its summer in Greece now,so I guess that is to be expected
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Though I haven't overclocked it yet,I'm having some weird black screen issues at startup that I need to look into..
Click to expand...

Try increasing the voltage by +50mV & the power limit to +50%. See if that fixes the black screen issues.


----------



## mAs81

Quote:


> Originally Posted by *kizwan*
> 
> Try increasing the voltage by +50mV & the power limit to +50%. See if that fixes the black screen issues.


I don't have Afterburner atm; I use TriXX, and this is how it is now:



Should I up the VDDC Offset you think?


----------



## kizwan

Quote:


> Originally Posted by *mAs81*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Try increasing the voltage by +50mV & the power limit to +50%. See if that fixes the black screen issues.
> 
> 
> 
> 
> 
> 
> 
> I don't have Afterburner atm, i use TriXX, and this is how it is now:
> 
> 
> 
> Should I up the VDDC Offset you think?
Click to expand...

Yup, try +50mV & also max out power limit.


----------



## mAs81

Quote:


> Originally Posted by *kizwan*
> 
> Yup, try +50mV & also max out power limit.


Will do, even though the black screens are totally random and only on Windows startup..
Thank you, man, I appreciate it. I want to be done with this so I can overclock with peace of mind


----------



## Silent Scone

Have a second still to come, but recently (for the intention of 4K) swapped my 780s out for 2x Vapor-X 290X 8GBs.

Just the one for now...

80.9% ASIC (Although I don't really pay much attention to this)

Wasn't too keen on going back to air, but Sapphire have done a top job on the cooler on these. It's near silent and tops out at 70C.


----------



## KnownDragon

Okay, I added a second power line instead of splitting one, and performance now scales fairly with added voltage; two lines let me overclock with voltage. It started to throttle at 1149. I pushed it to 1200MHz on the stock cooler, with throttling and a little added power. Going past that caused severe throttling with a huge performance loss. I need to cool this thing; I bet 1300MHz core would be nice.


----------



## Skanic

Can someone tell me why Afterburner is so bad?
I have an ASUS R9 290 DirectCU II OC.
I have core at 1100 and memory at 1360.
And it won't apply the OC, even though it's set up to start with Windows and apply the OC.

Are there any other good OC programs for AMD GPUs?


----------



## cennis

Use Asus GPU Tweak, since you have an Asus card anyway; people here seem to prefer it.

I have the same card and Afterburner works perfectly for me:

just move the sliders and choose apply.
Quote:


> Originally Posted by *Skanic*
> 
> Can someone tell me why Afterburner is so bad?
> I have an ASUS R9 290 Direct CU ii OC.
> I have Core 1100 memory 1360 .
> And it wont apply the oc even though its setup to start up with windows and apply the OC.
> 
> Are there any other good OC software for AMD gpu's?


----------



## Skanic

Quote:


> Originally Posted by *cennis*
> 
> Use asus gpu tweak, since you have a asus card anyway. People seem to prefer it here.
> 
> I have the same card and afterburner works perfectly for me
> 
> just move sliders and choose apply


So Asus GPU Tweak is the only good one?
Because even with TriXX I had problems; it wouldn't correctly apply the OC on startup.


----------



## pkrexer

Do you have Overdrive enabled in CCC? For me, if I don't enable Overdrive, it reverts my clocks back to stock on startup. I'm sure there is a way to delay the startup of AB so it starts after CCC, but I haven't messed with it. Enabling Overdrive works for me and doesn't affect my ability to overclock.
Quote:


> Originally Posted by *Skanic*
> 
> Can someone tell me why Afterburner is so bad?
> I have an ASUS R9 290 Direct CU ii OC.
> I have Core 1100 memory 1360 .
> And it wont apply the oc even though its setup to start up with windows and apply the OC.
> 
> Are there any other good OC software for AMD gpu's?


----------



## Blue Dragon

Quote:


> Originally Posted by *Skanic*
> 
> So the ASUS Tweak one is the only good one?
> Because even with Trixx i had problems, that it wont correctly apply the OC upon start up.


I had no problems with Asus Tweak when running a single card, but with CF it seems to give my second card problems. Right now I'm running the factory OC with no OC programs enabled and the 14.6 beta drivers, and CF has been working sweetly so far. I can always open Tweak manually to OC when benching.


----------



## Arizonian

Quote:


> Originally Posted by *pdasterly*
> 
> How do I join the club?
> 
> 
> Spoiler: Warning: Spoiler!


Per OP: To be added on the member list please submit the following in your post
1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
2. Manufacturer & Brand
3. Cooling








Quote:


> Originally Posted by *mAs81*
> 
> Hello everyone!
> I now own a Sapphire VAPOR-X R9 290 4GB TRI-X OC,and would like to join the club with all the other cool kids
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GPU-Z validation


Congrats - added









Also, I'd like to say: saweeet theme inside. Love the cabling. Nice mod work and combination; you planned it out just right.


----------



## KyadCK

Quote:


> Originally Posted by *aaroc*
> 
> With the PCIe cables that the Corsair AX860i and AX1200i come with, is it safe to run one cable to each R9 290? Or do I need to use 2 cables for each R9 290?


Check the gauge. If it is 18 AWG like my AX1200's, then you are cutting it close in theory. I'd just run another line, not cable-managed, and see what happens.
Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *aaroc*
> 
> With the PCIe cables that the Corsair AX860i and AX1200i come with, is it safe to run one cable to each R9 290? Or do I need to use 2 cables for each R9 290?
> 
> 
> 
> I have Seasonic PSU & I use one cable to each 290 but the cable was actually two cables in one connector.
Click to expand...

As long as you have more than 3 yellow wires coming off the PSU, then it should be fine.
Quote:


> Originally Posted by *bluedevil*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I have Seasonic PSU & I use one cable to each 290 but the cable was actually two cables in one connector.
> 
> 
> 
> 
> 
> I have my Capstone 550m with 2 pcie on one cable and its fine.
Click to expand...

Power supplies themselves have nothing to do with it; what matters here is how many load lines (yellow wires) come off the PSU. After the guy with the SuperNOVA had issues, everyone is questioning their rigs, is all.

Who knows... Maybe running one line to a card is starving it for power and reducing the potential overclock.


----------



## pdasterly

I'm not sure the PSU was the problem; I'm leaning toward a bad stick of RAM. The machine runs fine with auto settings, but when I enable XMP all he!! breaks loose. Damn, PCs haven't really changed in 20 years. But officially, EVGA states the 290X is power hungry and recommends two separate wires per card. According to the very nice tech I spoke with, I wasn't the only person with a 290X to give them a call.


----------



## Mega Man

Quote:


> Originally Posted by *Skanic*
> 
> Can someone tell me why Afterburner is so bad?
> I have an ASUS R9 290 Direct CU ii OC.
> I have Core 1100 memory 1360 .
> And it wont apply the oc even though its setup to start up with windows and apply the OC.
> 
> Are there any other good OC software for AMD gpu's?


There are 3 decent programs:

AB, TriXX, and GPU Tweak.

As you have an Asus card, you are limited to GPU Tweak ONLY. It has nothing to do with the other programs being "crappy"; it has to do with Asus locking down your card at the BIOS level to only work with GPU Tweak.


----------



## kizwan

Quote:


> Originally Posted by *KyadCK*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *aaroc*
> 
> With the PCIe cables that the Corsair AX860i and AX1200i come with, is it safe to run one cable to each R9 290? Or do I need to use 2 cables for each R9 290?
> 
> 
> 
> I have Seasonic PSU & I use one cable to each 290 but the cable was actually two cables in one connector.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> As long as you have more than 3 yellow wires coming off the PSU, then it should be fine.
Click to expand...

It should be. I've benched the hell out of these cards, back-to-back, a couple of times with +200mV (1.35 - 1.4V).
Quote:


> Originally Posted by *KyadCK*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *bluedevil*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I have Seasonic PSU & I use one cable to each 290 but the cable was actually two cables in one connector.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have my Capstone 550m with 2 pcie on one cable and its fine.
> 
> Click to expand...
> 
> 
> 
> 
> 
> Power supplies themselves have nothing to do with it. How many load lines (yellow) there are coming from the PSU matters in this case. After the guy with the SuperNOVA had issues, everyone is questioning their rigs is all.
> 
> Who knows... Maybe running one line to a card is starving it for power and reducing the potential overclock.
Click to expand...

Rule of thumb with GPUs: always one cable (from the PSU) per PCIe power connector, but usually you can get away with one cable per card. Emphasis on "usually".


----------



## Forceman

Quote:


> Originally Posted by *Mega Man*
> 
> there are 3 decent programs.
> 
> AB trixx and gpu tweak,
> 
> as you have a asus card, you are limited to gpu tweak ONLY. has nothing to do with the other programs being "crappy" has to do with asus locking down your card on a bios level to only work with gpu tweak


That's not true. Only Asus cards can use GPU Tweak, but Asus cards can use AB just fine.

Custom cards like the Lightning might be locked down, but regular Asus cards are not.


----------



## Red1776

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> there are 3 decent programs.
> 
> AB trixx and gpu tweak,
> 
> as you have a asus card, you are limited to gpu tweak ONLY. has nothing to do with the other programs being "crappy" has to do with asus locking down your card on a bios level to only work with gpu tweak
> 
> 
> 
> That's not true. Only Asus cards can use GPU Tweak, but Asus cards can use AB just fine.
> 
> Custom cards like the Lightning might be locked down, but regular Asus cards are not.
Click to expand...

ASUS GPU Tweak works on most cards. I have used it with MSI cards.

In fact it is advertised as working on cards of any make.


----------



## Forceman

Quote:


> Originally Posted by *Red1776*
> 
> ASUS GPU Tweak works on most cards. I have used it with MSI cards.
> In fact it is advertised as working on cards of any make.


Huh. Guess we were both wrong then.


----------



## Skanic

I kinda feel burned for having bought the R9 290 DirectCU II OC, since people with a Sapphire Tri-X or a PowerColor PCS+ seem to get way better temps and overclocks.
Before the R9 290 I had a stock HD 7950 from Sapphire, which I equipped with a Prolimatech MK-26; it wouldn't go over 65 °C under load, and overclocked its clocks were
1250 core / 1500 memory.


----------



## Blue Dragon

Quote:


> Originally Posted by *Skanic*
> 
> I kinda feel burned for having bought the R9 290 DirectCU II OC, since people with a Sapphire Tri-X or a PowerColor PCS+ seem to get way better temps and overclocks.
> Before the R9 290 I had a stock HD 7950 from Sapphire, which I equipped with a Prolimatech MK-26; it wouldn't go over 65 °C under load, and overclocked its clocks were
> 1250 core / 1500 memory.


I haven't OC'd them yet, but my DCUII OCs do OK temp-wise. With a single card I didn't really have heat problems, but CF is a different story; although, with a CM Seidon 120M AIO and the red mod, my cards are barely 10 degrees over my water-cooled 7870 Hawks. But remember, OC headroom is the luck of the draw, not branding.

2x Asus R9 290 DCUII OC, Catzilla 1080p with the red mod.

The second card was reaching the 90's before the mod. The temps above are with no heatsink on VRM2 or the memory; it would most likely get a better OC with sinks.

Edit: I should probably mention that my second card was messed up from the factory: one of the heatsink stand-offs wasn't screwed in properly. That's most likely why it got higher temps than my first card ever did. It would throttle before I tweaked it a little and dropped the temps down to the 90's...


----------



## mAs81

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also I'd like to say saweeet theme inside. Love the cabling. Nice mod work / combination planning it out just right. .


Thanks for the compliment,I appreciate it!








I will make it even cleaner in the future; I have already ordered cable combs to make the cables neater!

Now, about the card, I have some bad news..
I had a black screen at random times on Windows startup, when Windows begins to load,
with only an unresponsive cursor on screen, and some freezes at system shutdown. But I could boot into safe mode every time..
Googling around, I saw that many people had the same problem as me (only with a responsive cursor).
I have tried literally everything: uninstalling/rolling back drivers, system restore, everything...
I don't really think it's the card, though, because when it works, it works like a charm; I game, I benchmark, everything is okay..
I've noticed, though, that some trouble occurred with the VGA driver installation or something, at least according to what Device Manager reports..
To cut a long story short, I had to format more than once, because a bad driver installation possibly messed with Windows registry entries..
I am at work now, with the PC back home updating without the GPU, and we shall see..
If it works, I'll eventually OC my card and post the results here; if not, I'll have to create a thread here on OCN because I'm at my wits' end..
Sorry for the thread derailment; rant is now over


----------



## p33k

Finally got around to putting these white backplates on my Asus 290 DC2OCs that I got a few weeks ago. Crappy cellphone shot, but they look nice and are good quality; I just wish they didn't have the CFX/SLI cutout and the company logo on them...


----------



## kizwan

Quote:


> Originally Posted by *Red1776*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> there are 3 decent programs.
> 
> AB trixx and gpu tweak,
> 
> as you have a asus card, you are limited to gpu tweak ONLY. has nothing to do with the other programs being "crappy" has to do with asus locking down your card on a bios level to only work with gpu tweak
> 
> 
> 
> That's not true. Only Asus cards can use GPU Tweak, but Asus cards can use AB just fine.
> 
> Custom cards like the Lightning might be locked down, but regular Asus cards are not.
> 
> Click to expand...
> 
> 
> 
> 
> ASUS GPU Tweak works on most cards. I have used it with MSI cards.
> In fact it is advertised as working on cards of any make.
Click to expand...

You can overvolt your MSI with GPU Tweak?


----------



## nightfox

Quote:


> Originally Posted by *p33k*
> 
> Finally got around to putting on these white back plates on my asus 290 dc2oc's that I got a few weeks ago. Crappy cellphone shot, they look nice, good quality but I wish they didn't have the cfx/sli cutout and company logo on them...


How are the MEG backplates? Do they have thermal pads on the back? I keep searching, but the listings I saw online don't have a photo of the back side of the backplate (where the thermal pads would be).

I wanted that backplate as well, but as you know, a backplate on a 290/X is mainly useful for the VRMs, so I wonder about the performance. If you have a picture of the inside part, would you kindly post it here? I've also been thinking of getting the I/O shield; it looks nice.


----------



## CoolRonZ

I got a PowerColor AXR9 290X OC, which seems to be the exact same PCB etc. as the PCS+. I flashed it to a PCS+: same BIOS except for going from BIOS xxx.000 to xxx.300 and the P/N from xxx.001 to xxx.051. It doesn't really act any differently; the memory voltage is still locked. I can get a lot better clocks out of her if I give her more volts, but 1150 is just comfy for me at stock volts. So I'm wondering if all PowerColor PCBs and components are the same?



PS. this is a stock photo taken from EK's cooling configurator.


----------



## p33k

Quote:


> Originally Posted by *nightfox*
> 
> How's the MEG backplates? do they have thermal pads at the back? I keep searching about when I saw online and they dont have a photo shot at the back side of the backplate (where the thermal pads are)
> 
> I wanted that backplate as well but as you know, backplate on 290/x are useful on VRM. I wonder about the performance. if you have a picture of the inside part, would you kindly post it here? Been thinking of getting the I/O shield. looks nice


Actually, there are no thermal pads on the back; it's just aesthetics







The back side of the plate is flat, and they provide some round washers that stand it off the card a bit so it doesn't make contact with the card and short anything out. It's not like the EK backplate, which has indentations where the pads go. I'm not sure whether I could have put some very thick thermal pads on it, but honestly my VRMs aren't getting hot enough with the waterblock to make me worry.


----------



## PachAz

As you all know, I had black screen/freeze problems with my used Sapphire R9 290. What I find interesting is that I ran a couple of loops in Valley yesterday and today, and it seems that whenever I set the power limit to 50% I get a black screen/freeze in benchmarks. Why would that be? With my other Sapphire R9 290 Tri-X I ran benchmarks at a 50% power limit several times and never crashed. Could it be that it draws too much current for the PCIe wires to handle when using the 50% power limit? I have a Fractal Design Newton R3 1000W, and I would think it should handle this fine.

Edit: This theory can't be true, because I ran my R9 290 Tri-X at +50% power limit and it never crashed. Some actual data from my Valley test:

Stock power limit:
1- Pass
2- Pass
3- Pass
4- Pass
5- Pass

Power limit raised:
1 - +50%: fail
2 - +25%: fail
3 - +20%: fail
4 - +15%: fail
5 - +10%: fail
6 - +5%: fail
7 - +3%: pass

I think the problem is that the internal voltage regulation on the card lets the voltage droop too much under load when the power limit is raised in Sapphire TriXX. This can happen even with no power limit, but it occurs less often. According to GPU-Z, the 12 V reading during a benchmark with the power limit raised is sometimes 11.63 V, whereas with no power limit it stays at 11.75 V with only occasional dips to 11.63 V. I think the voltage regulation is poor on this card, which could explain why the crashes happen more often with the power limit at +5% to +50% (the droop under load is higher). Correct me if I'm wrong.
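A way to put numbers behind this theory: GPU-Z can continuously write its sensor readings to a log file, so the log can be scanned for 12 V rail dips instead of eyeballing the reading mid-benchmark. Below is a minimal Python sketch; the column header is an assumption (GPU-Z versions name their sensor columns differently), so adjust it to whatever your log actually contains:

```python
import csv

def check_12v_rail(log_path, threshold=11.70, column="PCIe 12V Input [V]"):
    """Scan a GPU-Z style sensor log (comma separated, one header row)
    and report the minimum 12 V reading plus how many samples dipped
    below `threshold`. The column name is a guess; check your log."""
    minimum, dips = None, 0
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f, skipinitialspace=True):
            try:
                volts = float(row[column])
            except (KeyError, TypeError, ValueError):
                continue  # missing column or a blank/partial sample
            minimum = volts if minimum is None else min(minimum, volts)
            if volts < threshold:
                dips += 1
    return minimum, dips
```

Running this against logs captured during a Valley loop with and without the raised power limit, and seeing far more sub-11.70 V samples in the raised-limit run, would support the droop theory.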


----------



## nightfox

Quote:


> Originally Posted by *p33k*
> 
> Actually, there are no thermal pads on the back; it's just aesthetics.
> 
> 
> 
> 
> 
> 
> 
> The backside of the plate is flat, and they provide some round washers that stand it off the card a bit so it doesn't make contact with the PCB, so there's no shorting it out. It's not like the EK backplate, which has indentations where the pads go. I'm not sure if I could have put some very thick thermal pads on it, but honestly my VRMs aren't getting hot enough with the waterblock to make me worry.


Alright, looks like it's for aesthetics only. It looks great, though.







Oh, nice rig btw. Looks neat.


----------



## pronacyk

Quote:


> Originally Posted by *PachAz*
> 
> As you all know, I had black screen/freeze problems with my used Sapphire R9 290. What I find interesting is that I ran a couple of loops in Valley yesterday and today, and it seems that whenever I set the power limit to +50% I get a black screen/freeze in benchmarks. How come? With my other Sapphire R9 290 Tri-X I ran benchmarks at +50% power limit several times and never crashed. Could it be that at +50% power limit it draws too much current for the PCIe wires to handle? I have a Fractal Newton R3 1000W, and I'd think that one should be able to handle this fine.
> 
> Edit: This theory can't be true, because I ran my R9 290 Tri-X at +50% power limit and it never crashed. Some actual data from my Valley test:
> 
> Stock power limit:
> 1- Pass
> 2- Pass
> 3- Pass
> 4- Pass
> 5- Pass
> 
> Power limit raised:
> 1 - +50%: fail
> 2 - +25%: fail
> 3 - +20%: fail
> 4 - +15%: fail
> 5 - +10%: fail
> 6 - +5%: fail
> 7 - +3%: pass
> 
> I think the problem is that the internal voltage regulation on the card lets the voltage droop too much under load when the power limit is raised in Sapphire TriXX. This can happen even with no power limit, but it occurs less often. According to GPU-Z, the 12 V reading during a benchmark with the power limit raised is sometimes 11.63 V, whereas with no power limit it stays at 11.75 V with only occasional dips to 11.63 V. I think the voltage regulation is poor on this card, which could explain why the crashes happen more often with the power limit at +5% to +50% (the droop under load is higher). Correct me if I'm wrong.


I think you're only passing at low power limits because the card throttles the core clock when you hit the power limit, much like thermal throttling. You're passing because you're effectively running at a lower average clock speed.
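A quick way to test this explanation is to log the core clock during the run and compare its average to the target clock: if the "passing" runs show a mean clock well below target, the power limit is doing the throttling. A tiny sketch in plain Python (the sample values in the usage note are hypothetical):

```python
def throttle_summary(samples_mhz, target_mhz):
    """Given core-clock samples logged during a benchmark (e.g. by
    GPU-Z), return the mean clock and the fraction of samples that
    sat below the target clock, i.e. how often the card throttled."""
    mean = sum(samples_mhz) / len(samples_mhz)
    below = sum(1 for s in samples_mhz if s < target_mhz)
    return mean, below / len(samples_mhz)
```

For example, hypothetical samples of [947, 947, 800, 706] against a 947 MHz target give a mean of 850 MHz with half the samples throttled, which would explain a "pass" at a lower effective speed.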


----------



## PachAz

Okay, but how come my other R9 290 works fine? If I'm not wrong, this has nothing to do with the PSU, I hope, because that would also affect my other R9 290. Running with no power limit, the voltage stays at 11.75 V most of the time, but when I raise the power limit it drops to 11.63 V and stays there until I crash. Is there any way to read the voltage coming from the PCIe cables so I know for sure? I'm in the middle of an RMA process. Basically it happens more often now, and lately I have only been able to replicate the problem by using a +5% to +50% power limit.


----------



## pronacyk

Got a question; maybe some of you could help me out. I've got an XFX 290X reference card, water cooled, with above-average ASIC quality (78%).

When overclocking the core and raising the voltage (1200 MHz @ +125 mV), instead of seeing the traditional white artifacts as an instability symptom (like I've seen on my old R9 290s), everything runs buttery smooth without a glitch... except when (and only when) I want to exit back to the desktop or Alt-Tab out. The video freezes for a few seconds before returning to the desktop, and the core, memory, voltage, and GPU load stay pegged at their max 3D values. Nothing short of a reset will snap it out of its heart attack. At this point I'm thinking the unstable core caused a driver crash (but without a notification in the taskbar?). However, if I go back and fire up a game/benchmark, it still performs flawlessly (with the exception of the long black screen when switching back to 2D).

Further increasing the voltage to +150 mV seems to reduce (though not 100% prevent) this issue.

Has anyone else seen this form of instability when overclocking the core, or do you think it's a hardware issue? I've had four R9 290s, and none of them behaved this way when unstable.


----------



## pronacyk

Quote:


> Originally Posted by *PachAz*
> 
> Okay, but how come my other R9 290 works fine? If I'm not wrong, this has nothing to do with the PSU, I hope, because that would also affect my other R9 290. Running with no power limit, the voltage stays at 11.75 V most of the time, but when I raise the power limit it drops to 11.63 V and stays there until I crash. Is there any way to read the voltage coming from the PCIe cables so I know for sure? I'm in the middle of an RMA process. Basically it happens more often now, and lately I have only been able to replicate the problem by using a +5% to +50% power limit.


Are you running the card at stock clock speed? Increasing power limit shouldn't affect stability if you haven't overclocked it. Did you keep GPU-Z in the background while running your benchmark to see if the core speeds were getting throttled?


----------



## PachAz

I am running stock settings. I had GPU-Z in the background and the core clock did change back and forth, but mostly it stayed close to the stock 947 MHz. As I said, at stock the voltage was 11.75 V; as soon as I raise the power limit, even +5%, the voltage reading in GPU-Z drops to 11.63 V and stays there until I get a black screen/crash. Maybe I should check the actual voltage on the PCIe cable under load, so I'll see.


----------



## jerrolds

Anyone else having issues with the new 14.6 beta drivers and the MPC-HC media player? madVR and the default "Enhanced Video Renderer" seem to be choppy most of the time. There have been one or two times where a movie played fine, though.


----------



## MTDEW

Quote:


> Originally Posted by *mAs81*
> 
> Thanks for the compliment, I appreciate it!
> 
> 
> 
> 
> 
> 
> 
> 
> I will make it even cleaner in the future; I have already ordered cable combs to make the cables neater!
> 
> Now, about the card, I have some bad news.
> I had a black screen at random times on Windows startup, as Windows began to load,
> with only an unresponsive cursor on screen, and some freezes at system shutdown. But I could boot into safe mode every time.
> Googling around, I saw that many people had the same problem as me (only with a responsive cursor).
> I have tried literally everything, from uninstalling/rolling back drivers to System Restore, everything...
> I don't really think it is the card, though, because when it works, it works like a charm: I game, I benchmark, everything is okay.
> I've noticed, though, that some trouble occurred with the VGA driver installation, or at least that is what Device Manager reports.
> To cut a long story short, I had to format more than once, because a bad driver installation possibly messed with Windows registry entries.
> I am at work now, and the PC is back home updating without the GPU, so we shall see.
> If it works, I'll eventually OC my card and post the results here; if not, I'll have to create a thread here at OCN because I'm at my wits' end.
> Sorry for the thread derailment; rant over.


We determined over in the black screen thread that the black screen is caused by the memory jumping to full speed while the core doesn't.
So the memory is forced to run at its full 3D clock of 1250 MHz on the default 2D core voltage.
The core and memory controller share a voltage rail on these GPUs, and the 2D voltage simply isn't enough for some GPUs' memory to handle the sudden jump to full clocks.

So you can bump the core voltage up a bit to fix any black screens you get while browsing or doing anything else in 2D mode that makes the memory suddenly jump to full clocks on the 2D voltage.

But as you now know, if the black screen occurs during a cold boot, before your bumped voltage gets a chance to load, you can still get a black screen on boot-up.

My guess is that the voltage fluctuates, and if it happens to be fluctuating down when the memory jumps to full clocks... bam, black screen.

For the most part, if you cannot RMA the card, a slight core voltage bump pretty much cures the issue, except when it strikes before your desktop and new voltage settings have fully loaded.
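The failure mode described above is, in principle, visible in a sensor log: samples where the memory clock has already jumped to its full 3D value while the core clock (and therefore the shared voltage rail) is still at 2D levels. A rough sketch of counting those moments from logged clock pairs; the 400 MHz and 1250 MHz thresholds are illustrative assumptions, not measured P-state boundaries:

```python
def risky_samples(clock_pairs, core_2d_max=400, mem_3d_min=1250):
    """Count (core_mhz, mem_mhz) log samples in the danger state:
    memory already at full 3D clock while the core is still at a
    2D clock, so the shared voltage rail is still at its 2D level."""
    return sum(1 for core, mem in clock_pairs
               if core <= core_2d_max and mem >= mem_3d_min)
```

A card that black-screens while browsing would be expected to show such samples in the moments just before the crash.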


----------



## PachAz

I just tested my R9 290 Tri-X at +50% power limit and the 12 V reading drops to 11.63 as well, but the card doesn't crash, nor does it throttle. I really think my used R9 290 is faulty; I just don't know why it crashes when the power limit is raised.


----------



## hwoverclkd

Quote:


> Originally Posted by *mAs81*
> 
> Thanks for the compliment, I appreciate it!
> 
> 
> 
> 
> 
> 
> 
> 
> I will make it even cleaner in the future; I have already ordered cable combs to make the cables neater!
> 
> Now, about the card, I have some bad news.
> I had a black screen at random times on Windows startup, as Windows began to load,
> with only an unresponsive cursor on screen, and some freezes at system shutdown. But I could boot into safe mode every time.
> Googling around, I saw that many people had the same problem as me (only with a responsive cursor).
> I have tried literally everything, from uninstalling/rolling back drivers to System Restore, everything...
> I don't really think it is the card, though, because when it works, it works like a charm: I game, I benchmark, everything is okay.
> I've noticed, though, that some trouble occurred with the VGA driver installation, or at least that is what Device Manager reports.
> To cut a long story short, I had to format more than once, because a bad driver installation possibly messed with Windows registry entries.
> I am at work now, and the PC is back home updating without the GPU, so we shall see.
> If it works, I'll eventually OC my card and post the results here; if not, I'll have to create a thread here at OCN because I'm at my wits' end.
> Sorry for the thread derailment; rant over.


Welcome to the world of new AMD GPUs







There's a black screen thread; you might want to check whether someone has already found a fix for an issue like yours. Apparently there are several black screen symptoms, and they may have different root causes.


----------



## PachAz

My problem is also a "new" one, occurring only when raising the power limit, whereas my other R9 290 works fine regardless of the settings in Sapphire TriXX.


----------



## mAs81

Quote:


> Originally Posted by *MTDEW*
> 
> We determined over in the black screen thread that the black screen is caused by the memory jumping to full speed while the core doesn't.
> So the memory is forced to run at its full 3D clock of 1250 MHz on the default 2D core voltage.
> The core and memory controller share a voltage rail on these GPUs, and the 2D voltage simply isn't enough for some GPUs' memory to handle the sudden jump to full clocks.
> 
> So you can bump the core voltage up a bit to fix any black screens you get while browsing or doing anything else in 2D mode that makes the memory suddenly jump to full clocks on the 2D voltage.
> 
> But as you now know, if the black screen occurs during a cold boot, before your bumped voltage gets a chance to load, you can still get a black screen on boot-up.
> 
> My guess is that the voltage fluctuates, and if it happens to be fluctuating down when the memory jumps to full clocks... bam, black screen.
> 
> For the most part, if you cannot RMA the card, a slight core voltage bump pretty much cures the issue, except when it strikes before your desktop and new voltage settings have fully loaded.


Thanks for the info, but I think I have finally found the culprit...
...and its name is Intel Graphics!








I am now home, and while updating the chipset drivers and application software from my motherboard's CD (with the GPU not installed), an error message appeared when the Intel graphics tried to update and... bam! A black screen with an unmovable cursor, like before. What really baffles me is that I had uninstalled the thing, and deactivated it in the BIOS, before doing anything else. So I'm going to keep the drivers from Windows Update and hope for the best for the moment.


----------



## rv8000

Just to chime back in, as I've finally gotten my 290 Vapor-X replacement. Quick temp testing, 3 loops of Valley:

- 1100/1400 @ +25 mV, power limit +50, custom fan curve, ambient room temp of 26°C (the air inside the case is sadly much warmer)

- Max core temp: 73°C
- Max VRM1 temp: 65°C
- Max VRM2 temp: 62°C
- Max fan speed: 55%

Mind you, my case currently has AWFUL airflow. I can check other things on request.


----------



## pkrexer

So I ran into my first black screen. Well, not really a black screen, but a white screen?

After about an hour in BF4 my screen froze, then my monitor started blinking on and off with what appeared to be white checkered static. I powered the computer off and back on, and everything seems fine again.

Anyone run into this before? Probably a sign my memory overclock failed?


----------



## mAs81

Hmm, now this I don't understand...
The problem came back after all, but now I've noticed something else.
I have two monitors, one via HDMI and one via a DVI cable.
When I connect my main monitor to my 290 over HDMI, and the other one via DVI to the motherboard's Intel graphics, I get no problems whatsoever
(though I haven't gamed yet; I had to reinstall Windows).
If I try either of the two DVI ports on my 290 together with the HDMI, I get a black screen with an unmovable cursor again, and the system crashes.
Is that normal? Are both ports damaged on my GPU? Perhaps there's some configuration I've overlooked? Could it be the cable? I currently don't have another one to test with; I'll ask my brother for one tomorrow. I really don't want Intel graphics enabled; I want both monitors on my GPU.
Any thoughts?


----------



## Silent Scone

Despite the fact that I've removed them until AMD sorts out the SST panel issues... these 8GB Vapors do look good


----------



## PachAz

Why don't you have waterblocks on those little cards?


----------



## Zaxis01

Where'd you get those white back plates from?


----------



## p33k

Quote:


> Originally Posted by *Zaxis01*
> 
> Where'd you get those white back plates from?


A Korean company makes them - http://megcustom.com/


----------



## Zaxis01

Cool, thanks!


----------



## battleaxe

Quote:


> Originally Posted by *p33k*
> 
> A Korean company makes them - http://megcustom.com/


Are they sold anywhere in the states? Can't seem to find them.


----------



## bond32

Snagged a pretty good deal: two more 290s with Hynix memory... Wow, I forgot how loud these boys are...

My EVGA G2 1000W PSU is still going strong with all three, though all three are at stock settings. The 4770K is still overclocked, but I know I am at the limits of the PSU.


----------



## PachAz

If I get a new R9 290 from the shop, I will have spent around 5690 SEK on GPUs (two R9 290s, one new and one used), which gives double the performance of a single 780 Ti (which costs 5500 SEK) and similar or better performance than two 780 Tis. So yes, the R9 290 gives unbeatable performance for the price. Basically, two R9 290s in CrossFire cost half the price of two 780 Tis in SLI. I wonder how "some" people feel about that.

I see no reason to buy a 780 Ti, or even a 780, in terms of all-round performance. I'm so proud to be an AMD owner and a member of this club. Wish me luck that the shop sends me back a new R9 290.









----------



## p33k

Quote:


> Originally Posted by *battleaxe*
> 
> Are they sold anywhere in the states? Can't seem to find them.


No, I think they only sell them via their website in Korea.


----------



## Silent Scone

Quote:


> Originally Posted by *Zaxis01*
> 
> Where'd you get those white back plates from?


They don't make them, unfortunately. A shame, as they're the only 8GB variant on the market.


----------



## Mercy4You

So I rolled back to 13.12 because I was getting weird FPS drops in BF4, and it's fine now. Not sure if 14.4 is the reason, given the ongoing issues in BF4









Anyone else got this problem?


----------



## koekwau5

Quote:


> Originally Posted by *bond32*
> 
> Snagged a pretty good deal: two more 290s with Hynix memory... Wow, I forgot how loud these boys are...
> 
> My EVGA G2 1000W PSU is still going strong with all three, though all three are at stock settings. The 4770K is still overclocked, but I know I am at the limits of the PSU.


Nice build man!
Just curious .. where the hell did your PSU go? Is it behind the case and not attached to the case?


----------



## Ramzinho

Quote:


> Originally Posted by *koekwau5*
> 
> Nice build man!
> Just curious .. where the hell did your PSU go? Is it behind the case and not attached to the case?


This case is the Carbide Air 540; the PSU sits in the compartment on the other side. It's one of the best-selling cases in Corsair's history.


----------



## Widde

Quote:


> Originally Posted by *Mercy4You*
> 
> So I rolled back to 13.12 because I was getting weird FPS drops in BF4, and it's fine now. Not sure if 14.4 is the reason, given the ongoing issues in BF4
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone else got this problem?


Not really. I'm on the 14.6 betas at the moment and BF4 runs like a dream.

But 14.4 was really weird for me as well; I had massive FPS drops with Mantle.


----------



## gotmilkmang

Can I get in on this? I'm still waiting for more parts for my new build, but here's one for now.

Sapphire with Aquacool block


----------



## bond32

I have one of my three 290s water cooled... Am I crazy for wanting to put the reference cooler back on it? I like uniformity, and three 290s with reference coolers look awesome.


----------



## p33k

Quote:


> Originally Posted by *bond32*
> 
> I have one of my three 290s water cooled... Am I crazy for wanting to put the reference cooler back on it? I like uniformity, and three 290s with reference coolers look awesome.


Or you could just buy 2 more waterblocks for those 290s! I couldn't deal with 3 of those with reference cooler while gaming


----------



## bond32

Quote:


> Originally Posted by *p33k*
> 
> Or you could just buy 2 more waterblocks for those 290s! I couldn't deal with 3 of those with reference cooler while gaming


Oh, I know... But good grief, it's going to be another $350 for two more Koolance blocks and the GPU link.

Probably won't be able to afford that for a few weeks.


----------



## rdr09

Quote:


> Originally Posted by *bond32*
> 
> Oh, I know... But good grief, it's going to be another $350 for two more Koolance blocks and the GPU link.
> 
> Probably won't be able to afford that for a few weeks.


Watercool the last two and use an AIO for the primary (or keep this one on the reference cooler).


----------



## p33k

Quote:


> Originally Posted by *bond32*
> 
> Oh, I know... But good grief, it's going to be another $350 for two more Koolance blocks and the GPU link.
> 
> Probably won't be able to afford that for a few weeks.


A few weeks' wait and a little uniformity... or the hassle, for only a few weeks, of it looking the same







I'm lazy so you know what I would pick!


----------



## xTristinx

Quote:


> Originally Posted by *pkrexer*
> 
> So I ran into my first black screen. Well, not really a black screen, but a white screen?
> 
> After about an hour in BF4 my screen froze, then my monitor started blinking on and off with what appeared to be white checkered static. I powered the computer off and back on, and everything seems fine again.
> 
> Anyone run into this before? Probably a sign my memory overclock failed?


I've had this problem since the beginning, with no solution.


----------



## xTristinx

Anyone know a Canadian supplier for 290X waterblocks?


----------



## bond32

Quote:


> Originally Posted by *rdr09*
> 
> Watercool the last two and use an AIO for the primary (or keep this one on the reference cooler).


The thought crossed my mind, but it's around $130 for a full block, and I have the radiator space and a good pump. So I might as well go with the full block, as a bracket + CLC would be around $80-100, right?
Quote:


> Originally Posted by *p33k*
> 
> Few weeks and a little uniformity... or a hassle for a few weeks of it looking the same
> 
> 
> 
> 
> 
> 
> 
> I'm lazy so you know what I would pick!


Haha, I think when I get home I will put the reference cooler back on it. I'm sure they will get crazy hot; in just one game of BF4 last night, after setting a fan profile around 60-80%, I could hardly hear BF4 over my Logitech speakers, lol.

If I cap the fans at 40%, they throttle the clocks down, correct? So capping all three fans at 40%, with throttling, would be fine, I think.
Quote:


> Originally Posted by *xTristinx*


My Sapphire 290X, the first one I bought at release, had bad black screen issues. I battled with it for many months before finally sucking it up and sending it in for RMA. I am so glad I did, as I got back one of the top-clocking 290Xs here (12,200 in Fire Strike, single card: http://www.3dmark.com/fs/2221757). I recommend you seek an RMA as well. Just a theory of mine, but I believe there is a compatibility issue between Elpida memory and something on the power delivery side. There isn't a fix I found other than running slightly higher voltage at near-stock clocks.


----------



## rdr09

Quote:


> Originally Posted by *bond32*
> 
> The thought crossed my mind, but it's around $130 for a full block, and I have the radiator space and a good pump. So I might as well go with the full block, as a bracket + CLC would be around $80-100, right?
> Haha, I think when I get home I will put the reference cooler back on it. I'm sure they will get crazy hot; in just one game of BF4 last night, after setting a fan profile around 60-80%, I could hardly hear BF4 over my Logitech speakers, lol.
> 
> If I cap the fans at 40%, they throttle the clocks down, correct? So capping all three fans at 40%, with throttling, would be fine, I think.
> My Sapphire 290X, the first one I bought at release, had bad black screen issues. I battled with it for many months before finally sucking it up and sending it in for RMA. I am so glad I did, as I got back one of the top-clocking 290Xs here (12,200 in Fire Strike, single card: http://www.3dmark.com/fs/2221757). I recommend you seek an RMA as well. Just a theory of mine, but I believe there is a compatibility issue between Elpida memory and something on the power delivery side. There isn't a fix I found other than running slightly higher voltage at near-stock clocks.


Bond, leave that 290X alone. You might mess it up.


----------



## bond32

Quote:


> Originally Posted by *rdr09*
> 
> Bond, leave that 290X alone. You might mess it up.


Ha, you're right, better leave it be. I may use my 240 and 120 radiators to cool the 290X and CPU, and remove my 360 from up front. That would allow more air to the other 290s.


----------



## morreale

ASUS R9290X-DC2OC-4GD5
Stock Asus
username: morreale



I have had issues with this card ever since it was installed: black screens and BSODs. I am using a Corsair Professional Series AX860i, so I know power is not an issue.


----------



## Viscerous

Quote:


> Originally Posted by *morreale*
> 
> ASUS R9290X-DC2OC-4GD5
> Stock Asus
> username: morreale
> 
> 
> 
> I have had issues with this card ever since it was installed: black screens and BSODs. I am using a Corsair Professional Series AX860i, so I know power is not an issue.


BSODs seem to be really common on the more recent drivers. The 14.6 betas arguably made it even worse, because you have higher temperatures to worry about. I'd need a bit more info to tell what your problem is. Watching Flash video on YouTube or Twitch is a major problem right now.


----------



## Naxxy

Quote:


> Originally Posted by *Viscerous*
> 
> BSODs seem to be really common on the more recent drivers. The 14.6 betas arguably made it even worse, because you have higher temperatures to worry about. I'd need a bit more info to tell what your problem is. Watching Flash video on YouTube or Twitch is a major problem right now.


I have the same issue with black screens, but it only happens when I try to resume the computer from sleep mode. The computer powers up, but the screens never come out of standby. Gaming and normal use work fine.

Are these problems related?


----------



## Viscerous

Quote:


> Originally Posted by *Naxxy*
> 
> I have the same issue with black screens, but it only happens when I try to resume the computer from sleep mode. The computer powers up, but the screens never come out of standby. Gaming and normal use work fine.
> 
> Are these problems related?


I'm pretty sure that's a separate driver issue, but it seems to be really random as well. I've seen some people say that the 14.6 drivers fix it, but others have had them cause the sleep-mode black screen. The most foolproof way I've seen to fix a lot of the issues is to revert to 13.12. The problem with that is that you lose overall performance and Mantle support.


----------



## Naxxy

Quote:


> Originally Posted by *Viscerous*
> 
> I'm pretty sure that's a separate driver issue, but it seems to be really random as well. I've seen some people say that the 14.6 drivers fix it, but others have had them cause the sleep-mode black screen. The most foolproof way I've seen to fix a lot of the BSODs is to revert to 13.12. The problem with that is that you lose overall performance and Mantle support.


Thanks for the info, but I'd prefer not to do that. 14.4 had the same issues I have now on 14.6. I've tried everything, from resetting to stock values to software that was supposed to fix the sleep-mode issue, but nothing seems to work. It happens 50% of the time I let the computer go to sleep.

I guess the only option left is disabling sleep mode. This thing is bothering me so much I'm starting to think about changing the card...


----------



## Viscerous

Quote:


> Originally Posted by *Naxxy*
> 
> Thanks for the info, but I'd prefer not to do that. 14.4 had the same issues I have now on 14.6. I've tried everything, from resetting to stock values to software that was supposed to fix the sleep-mode issue, but nothing seems to work. It happens 50% of the time I let the computer go to sleep.
> 
> I guess the only option left is disabling sleep mode. This thing is bothering me so much I'm starting to think about changing the card...


I feel the same way. Every time I seem to fix one issue, another pops up.


----------



## KnownDragon

Still loving my R9 290; the thing must be golden. But I need to get it blocked. The only issue I've had was flashing the BIOS to R9 290X and then flashing it back. I want another XFX.


----------



## rdr09

Quote:


> Originally Posted by *KnownDragon*
> 
> Still loving my R9 290; the thing must be golden. But I need to get it blocked. The only issue I've had was flashing the BIOS to R9 290X and then flashing it back. I want another XFX.


I've had mine about 8 months now. Man, time flies. Zero issues, but I leave mine at stock and OC only during benching.


----------



## Skanic

Are there any custom BIOSes for the R9 290 yet?


----------



## Arizonian

Quote:


> Originally Posted by *morreale*
> 
> ASUS R9290X-DC2OC-4GD5
> Stock Asus
> username: morreale
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I have had issues with this card ever since it was installed: black screens and BSODs. I am using a Corsair Professional Series AX860i, so I know power is not an issue.


Congrats - added









However, sorry to hear about your problems; I hope you get them resolved.


----------



## CoolRonZ

Quote:


> Originally Posted by *xTristinx*
> 
> Anyone know a Canadian supplier for 290x waterblocks??


www.ncix.com

I just got one, plus a 290X, for less than an EVGA 780 SC from them, although I was warned a week ago that the block was very low in stock; now they are supposed to be back in stock in 3-6 weeks...


----------



## Mega Man

What about dazmode.com? (May not be right, but it is something like that.)


----------



## skruppe

Asus R9290X-G-4GD5
http://www.techpowerup.com/gpuz/ce7kn/

Powercolor R9 290X PCS+
http://www.techpowerup.com/gpuz/3dc5w/

Both are water cooled, but don't expect any pictures; my (first) loop looks like crap.


----------



## ashyg

I need help!

I bought a 2-month-old MSI R9 290 Gaming 4G from eBay (a seller with 700 trades and 100% positive feedback) and had it shipped to New Zealand.

The shipping box and card looked immaculate.

I removed my GTX 680, installed the R9 290, booted into safe mode, and used Display Driver Uninstaller to get rid of the old Nvidia drivers.

Booted back into Windows 8 and began installing the AMD 13.12 drivers (tried 14.4 too, same problem). About halfway through the install my display starts artifacting HORRIBLY, and it keeps doing so after a reboot. I have tried upping the voltage and power limit a little, but no difference.

I grabbed a spare HDD, popped it into the machine, and installed a fresh copy of Windows 7. Similar problem: after installing the drivers, the top-left quarter of my screen starts freezing to a single colour, and after a while I get a BSOD (Win 7 only).

I am using a Seasonic SS-650T to power it (it has no 8-pin).
I have the 6-pin plugged in directly,
and...
I have 2x Molex (2 separate rails) into a Molex-to-8-pin adapter
OR
I have a 6-pin to 8-pin adapter

I have tried both combinations for the 8-pin, but both times I get the horrible artifacting.

I think the card is faulty. How do I go about this: do I contact MSI (I think they handle warranties by serial number, but they have no service centre in NZ), or do I contact eBay? Or am I doing something stupidly wrong?

Thanks guys


----------



## MojoW

Quote:


> Originally Posted by *ashyg*
> 
> I need help!
> 
> I bought a 2-month-old MSI R9 290 Gaming 4G from eBay and had it shipped to New Zealand.
> 
> The shipping box and card looked immaculate.
> 
> I removed my GTX 680, installed the R9 290, booted into safe mode, and used Display Driver Uninstaller to get rid of the old Nvidia drivers.
> 
> Booted back into Windows 8 and began installing the AMD 13.12 drivers (tried 14.4 too, same problem). About halfway through the install my display starts artifacting HORRIBLY, and it keeps doing so after a reboot. I have tried upping the voltage and power limit a little, but no difference.
> 
> I grabbed a spare HDD, popped it into the machine, and installed a fresh copy of Windows 7. Similar problem: after installing the drivers, the top-left quarter of my screen starts freezing to a single colour, and after a while I get a BSOD (Win 7 only).
> 
> I am using a Seasonic SS-650T to power it (it has no 8-pin).
> I have the 6-pin plugged in directly,
> I have 2x Molex (2 separate rails) into a Molex-to-8-pin adapter
> OR
> I have a 6-pin to 8-pin adapter
> 
> I have tried both combinations for the 8-pin, but both times I get the horrible artifacting.
> 
> I think the card is faulty. How do I go about this: do I contact MSI (I think they handle warranties by serial number, but they have no service centre in NZ), or do I contact eBay? Or am I doing something stupidly wrong?
> 
> Thanks guys


Sounds like the card is faulty, as you should have enough power to run it.


----------



## Dasboogieman

Just installed the EVGA 1300W G2 unit. Both GPUs overclocked to 1150MHz +125mV, folding at home. This setting used to make the old SilentPro 1000W squeal like a pig before the fan ramped up to a loud roar. The G2 unit takes it completely and is basically inaudible (though I am told this is the loudest of the 1200W+ units). Got it for a steal too: $325 + $17 overnight shipping.
Was basically debating whether to go for broke and get the AX1500i, but dat extra $200 can buy a lot of KFC chicken so I said no; plus the shipping alone (the unit being monstrous) would've added an extra $60.

Now that power delivery silence has been achieved, I can focus back on the other huge task at hand....watercooling


----------



## Hl86

It's a curse; if Nvidia wasn't that damn pricey.


----------



## rdr09

Quote:


> Originally Posted by *Dasboogieman*
> 
> Just installed the EVGA 1300W G2 unit. Both GPUs overclocked to 1150MHz +125mV, folding at home. This setting used to make the old SilentPro 1000W squeal like a pig before the fan ramped up to a loud roar. The G2 unit takes it completely and is basically inaudible (though I am told this is the loudest of the 1200W+ units). Got it for a steal too: $325 + $17 overnight shipping.
> Was basically debating whether to go for broke and get the AX1500i, but dat extra $200 can buy a lot of KFC chicken so I said no; plus the shipping alone (the unit being monstrous) would've added an extra $60.
> 
> Now that power delivery silence has been achieved, I can focus back on to the other huge task at hand....watercooling


Das, how exactly do you OC 2 cards in CrossFire? Do you sync them?
Quote:


> Originally Posted by *Hl86*
> 
> Its a curse, if nvidia wasnt that damn pricy.


i know and i play Battlefield.


----------



## Dasboogieman

Quote:


> Originally Posted by *rdr09*
> 
> Das, how exactly do you oc 2 cards in crossfire? do you sync them?
> i know and i play Battlefield.


I sync them with Sapphire Trixx. Works like a charm. I also have ULPS disabled, since it causes funky things to happen to FAH.


----------



## rdr09

Quote:


> Originally Posted by *Dasboogieman*
> 
> I sync them with Sapphire Trixx. Works like a charm, also have ULPS disabled since it causes funky things to happen to FAH.


thanks.


----------



## kizwan

Quote:


> Originally Posted by *ashyg*
> 
> I need help!
> 
> I bought a 2 month old MSI R9 290 Gaming 4G from Ebay (700 trades, 100% positive feedback seller), and shipped it to New Zealand.
> 
> The shipping box and card looked immaculate.
> 
> I removed my GTX 680 and installed the R9 290, booted into safe mode and used DisplayDriverUninstaller to get rid of the old Nvidia drivers.
> 
> Booted back into Windows 8 and began to install the AMD 13.12 drivers (tried 14.4 too, same problem). About half way through install my display starts artifacting HORRIBLY, and continues to do so after reboot etc. I have tried upping the voltage and power control a little, but no difference.
> 
> I grabbed a spare HDD, popped it into the machine and installed a fresh copy of Windows 7. Similar problem: after installing drivers, the top left 1/4 of my screen starts freezing all one colour, and after a while I get a BSOD (Win 7 only).
> 
> I am using a Seasonic SS-650T to power it (it has no 8 pin).
> I have the 6 pin plugged in directly.
> and...
> I have 2x molex (2 separate rails) into a 2->8 pin adapter
> OR
> I have a 6 pin to 8 pin adapter
> 
> I have tried both combinations to get the 8 pin, but both times I get the horrible artifacting.
> 
> I think the card is faulty. How do I go about this? Contact MSI (I think they do warranties by serial, but have no service centre in NZ)? Do I contact eBay? Or am I doing something stupidly wrong?
> 
> Thanks guys


Firstly, nice avatar! I knew where you're from as soon as I saw it.

Secondly, did you try contacting the seller? I think it's faster & more economical if you can RMA it to MSI yourself. I think the memory on the card is busted. It's only artifacting while in Windows, not in the BIOS, right?
Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> Just installed the EVGA 1300W G2 unit. Both GPUs overclocked to 1150mhz +125mV, folding at home. This setting used to make the old SilentPro 1000W squeal like a pig before the fan ramped up to a loud roar. The G2 units completely takes it and is basically inaudible (though I am told that this is the loudest of the +1200W units). Got it for a steal too, $325 +$17 overnight shipping.
> Was basically debating whether to go for broke and get the AX1500i but dat extra $200 can buy a lot of KFC chicken so I said no, plus the shipping alone (due to the unit being monstrous) would've added an extra $60.
> 
> Now that power delivery silence has been achieved, I can focus back on to the other huge task at hand....watercooling
> 
> 
> 
> Das, how exactly do you oc 2 cards in crossfire? do you sync them?
Click to expand...

You're joking right?


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> Firstly, nice avatar! I knew where you're from as soon as I saw it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Secondly, did you try contacting the seller? I think it's faster & more economical if you can RMA it to MSI yourself. I think the memory on the card is busted. It's only artifacting while in Windows, not in the BIOS, right?
> You're joking right?


No, lol. I crossfired a 7950 and a 7970 and OC'ed just the 7950, thinking that syncing them might hurt performance. I was also worried about my 700W PSU. It worked, though, just OC'ing one card: a stuttery mess in Valley, but it played well in BF3, C3 and C2.


----------



## bond32

Quote:


> Originally Posted by *Dasboogieman*
> 
> Just installed the EVGA 1300W G2 unit. Both GPUs overclocked to 1150mhz +125mV, folding at home. This setting used to make the old SilentPro 1000W squeal like a pig before the fan ramped up to a loud roar. The G2 units completely takes it and is basically inaudible (though I am told that this is the loudest of the +1200W units). Got it for a steal too, $325 +$17 overnight shipping.
> Was basically debating whether to go for broke and get the AX1500i but dat extra $200 can buy a lot of KFC chicken so I said no, plus the shipping alone (due to the unit being monstrous) would've added an extra $60.
> 
> Now that power delivery silence has been achieved, I can focus back on to the other huge task at hand....watercooling


I'm using the 1000 watt evga g2 for 2 290's and a 290x right now...


----------



## MojoW

Quote:


> Originally Posted by *bond32*
> 
> I'm using the 1000 watt evga g2 for 2 290's and a 290x right now...


That's gotta be stock.
I can't imagine trying to OC 3 Hawaii GPUs while cutting it that close.


----------



## Ized

G10 alternative that I don't see many people use:

Fitted on my ref R9 290 perfectly.

Way cheaper than the NZXT too, at less than half the price. You can find it on Amazon UK and eBay.

Supports everything the G10 website lists, I used a Zalman LQ315.


----------



## aaroc

Quote:


> Originally Posted by *Ized*
> 
> G10 alternative that I don't see many people use:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Fitted on my ref R9 290 perfectly.
> 
> Way cheaper than the NZXT too at more than half the price. Can find it on Amazon UK and Ebay.
> 
> Supports everything the G10 website lists, I used a Zalman LQ315.


Those are Dwood clones. I bought two when they were available. I had them on 2x 7870 with 2x Antec 620 on my previous PC. They were popular; look for the Red Mod (ATI) / Green Mod (Nvidia) threads on this forum. I was planning to use them with the R9 290, but my mobo/PC only fits them in dual CFX, and I'm going tri-CFX.


----------



## Notion

Hi

Does the 290 suffer from screen tearing on 3 monitors? Also, can you use DVI, DVI and HDMI?

cheers


----------



## Mega Man

Quote:


> Originally Posted by *aaroc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ized*
> 
> G10 alternative that I don't see many people use:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Fitted on my ref R9 290 perfectly.
> 
> Way cheaper than the NZXT too at more than half the price. Can find it on Amazon UK and Ebay.
> 
> Supports everything the G10 website lists, I used a Zalman LQ315.
> 
> 
> 
> Those are Dwood's clones. I bought two when they were available.. I had them with 2x 7870 and x2 antec 620 on my previous PC. They were popular. Look for the Red mod (Ati) - Green mod (Nvidia) threads on this forum. I was planning to use them with R9 290, but only fit my Mobo/PC in dual CFX, and Im going Tri CFX.
Click to expand...

Hahaha, I was just going to say this. I still think something catastrophic happened to him.


----------



## jerrolds

Quote:


> Originally Posted by *xTristinx*
> 
> Anyone know a Canadian supplier for 290x waterblocks??


NCIX.ca carried XSPC blocks, last time I checked.


----------



## Decade

Hey guys, just got an R9 290 today.

Having some trouble running 3DMark. Reaching peak temps of 88°C, which isn't too concerning, but the random crashes are.
I'm wondering if it may be my PSU? The 12V is split into two rails with 24 amps on each. SeaSonic M12II 620 Bronze ( http://www.newegg.com/Product/Product.aspx?Item=N82E16817151095 )
Also noticed GPU-Z is reporting the GPU's 12V line at 11.63V, which... is concerning.

This GPU is an open-box reference Asus from Newegg; I'm willing to bet it was RMA'd because it coil-whines like a mofo when rendering 1000+ fps.
Any suggestions? I /cannot/ swap the PSU out, as the only other unit I have is a measly Corsair CX430 in my HTPC, and I'm pretty sure that won't run a 4.5GHz 4670K AND an R9 290.

No issues running Furmark for 30 mins, other than hitting the thermal threshold.
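A back-of-envelope rail check may help here (Python sketch; the 300 W peak figure and the assumption that both PCIe connectors land on a single 24 A rail are hypotheticals for illustration, not specs from the post):

```python
# Rough check of whether one 24 A 12 V rail can feed an R9 290's PCIe connectors.
# Assumes (hypothetically) both 6/8-pin connectors share one rail, and that
# the PCIe slot's 75 W comes off the other rail via the motherboard's ATX wiring.
rail_amps = 24.0
rail_watts = rail_amps * 12.0        # 288.0 W available on one rail
slot_watts = 75.0                    # PCIe slot contribution
card_peak_watts = 300.0              # assumed Furmark-class peak for an R9 290
connector_watts = card_peak_watts - slot_watts
print(rail_watts, connector_watts)   # prints: 288.0 225.0
```

On paper 225 W on a 288 W rail is within spec, but transient spikes could in principle brush up against over-current protection, which would fit a random-crash symptom more than a thermal one.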


----------



## ashyg

Quote:


> Originally Posted by *kizwan*
> 
> Firstly, nice avatar! I knew where you from once I saw it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Secondly, did you try contact the seller? I think it's faster & economical if you can RMA it to MSI yourself. I think the memory on the card is busted. It only artifacting while in windows, not in BIOS right?


Hey, cheers for reply

I've made one point of contact with the seller, but still need to reply further. His first reply was basically "other people have had this problem", with links to a thread... which is not the same problem at all.
The seller thinks it is overheating, despite me making it explicitly clear that it starts as soon as the AMD driver is installed and happens on the desktop. Both fans are spinning and the GPU has no load on it whatsoever.

He also mentioned, alongside the heat comment, that he "replaced the thermal paste when he had it because there was hardly any factory thermal paste on there".
Sigh, that sounds like an RMA void. I am wondering now if I should try to reach an agreement where I return it to MSI and he pays the repair bill, because the cost to send the GPU back to the US is about $100 NZD for a start.

And yes, correct, it's only happening in Windows.

I think I may have to upload a video to YouTube and link it to the seller to fully explain the problem. If your hunch is correct (busted memory), do you know if that's a major repair cost?

Cheers

Edit: looks like the MSI service centre is in Australia... so that's still going to be an international parcel (expensive!!!!)... I'm in a bit of a pickle :S


----------



## kizwan

Quote:


> Originally Posted by *ashyg*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Firstly, nice avatar! I knew where you from once I saw it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Secondly, did you try contact the seller? I think it's faster & economical if you can RMA it to MSI yourself. I think the memory on the card is busted. It only artifacting while in windows, not in BIOS right?
> 
> 
> 
> 
> 
> 
> 
> Hey, cheers for reply
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've made one point of contact with the seller, but still need to reply further. His first reply was basically "other people have had this problem", with links to a thread... which is not the same problem at all.
> The seller thinks it is overheating, despite me making it explicitly clear that it starts as soon as the AMD driver is installed and happens on the desktop. Both fans are spinning and the GPU has no load on it whatsoever.
> 
> He also mentioned, alongside the heat comment, that he "replaced the thermal paste when he had it because there was hardly any factory thermal paste on there".
> Sigh, that sounds like an RMA void. I am wondering now if I should try to reach an agreement where I return it to MSI and he pays the repair bill, because the cost to send the GPU back to the US is about $100 NZD for a start.
> 
> And yes, correct, it's only happening in Windows.
> 
> I think I may have to upload a video to YouTube and link it to the seller to fully explain the problem. If your hunch is correct (busted memory), do you know if that's a major repair cost?
> 
> Cheers
> 
> Edit: looks like the MSI service centre is in Australia... so that's still going to be an international parcel (expensive!!!!)... I'm in a bit of a pickle :S
Click to expand...

The YouTube thing is a good idea. Try that.

I would imagine it's going to be a major repair cost. There is an MSI service center in New Zealand, though.


----------



## ashyg

Quote:


> Originally Posted by *kizwan*
> 
> The YouTube thing is a good idea. Try that.
> 
> I would imagine it's going to be a major repair cost. There is an MSI service center in New Zealand, though.


Oh thanks, I didn't see that!

Actually within driving distance too


Perhaps I'll upload a YouTube video for the seller; might post it here too. I'll suggest I take it into the NZ service centre for a quote and see if he's happy to pay it.

Ah well, better get to work for the day and muck around with the GPU later; will probably post an update after the weekend. Thanks for the help, kizwan.

Edit: turns out the NZ service centre only handles laptop repairs. Also turns out the seller was suggesting I replace the thermal paste, not telling me he had done it. Poor wording on his part. I'll have a close inspection of the card tonight to see if I can find evidence that it has been pulled apart.


----------



## webhito

Hi there!

Anyone here know what the best air cooling solution for a reference 290X is?


----------



## Dire Squirrel

Quote:


> Originally Posted by *webhito*
> 
> Hi there!
> 
> Anyone here know what the best air solution for a reference 290x is?


I would say the Prolimatech MK-26 or the Alpenföhn Peter.
Which one to choose comes down to the size of your case: the MK-26 needs a wider case than the Alpenföhn, as it stands a bit over 17cm off the motherboard.


----------



## Skanic

Quote:


> Originally Posted by *Dire Squirrel*
> 
> I would say the Prolimatech MK-26 or the Alpenföhn Peter.
> Which one to choose comes down to the size of your case: the MK-26 needs a wider case than the Alpenföhn, as it stands a bit over 17cm off the motherboard.


The Prolimatech MK-26 is a beast. When I had my HD 7950 I ran it at 1250 core / 1500 memory and it never went above 65°C.


----------



## bond32

Just ran firestrike extreme with my new 290's: 10861 overall http://www.3dmark.com/3dm/3328379?

Stock clocks all around, PT1 BIOS, +50% power limit.

Pretty awesome score. Each of the 290's (stock cooling) pulled a max of 230 watts and the 290X (water) pulled 214.8 watts. The CPU package (cores + cache) pulled 103.7 watts and the DRAM pulled 18.5 watts, which brings the total (according to software measurements, which I know aren't accurate) to 797.6 watts. Other draws on the PSU are a single D5 pump at full speed (around 35 watts max), 8 AP-15's at full speed (assume 1 watt each), an AF-140 at low speed (negligible), an SSD (negligible), a 7200 rpm HDD (assume 5 watts), and negligible lighting. Full draw comes to 845.6 watts. Really wish I had a watt meter...
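For the curious, the tally can be reproduced in a few lines (Python sketch; every figure is the software-reported estimate from the post, and note the listed numbers actually sum to 797.0 W / 845.0 W, a hair under the quoted 797.6 W / 845.6 W, so one of the per-component maxima was presumably rounded):

```python
# Reproduce the software-reported power tally from the post above.
# All figures are estimates from monitoring software, not wall measurements.
gpu_draw = [230.0, 230.0, 214.8]    # two R9 290s (air) + R9 290X (water), watts
cpu_package = 103.7                  # CPU cores + cache
dram = 18.5

software_total = sum(gpu_draw) + cpu_package + dram

# Other loads, estimated in the post
d5_pump = 35.0                       # single D5 at full speed
fans = 8 * 1.0                       # eight AP-15s, ~1 W each (AF-140 negligible)
hdd = 5.0                            # 7200 rpm drive; SSD and lighting negligible

full_draw = software_total + d5_pump + fans + hdd
print(f"software total: {software_total:.1f} W, full draw: {full_draw:.1f} W")
```

Either way, the rig is comfortably inside the 1000 W G2's continuous rating at these clocks.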


----------



## Dire Squirrel

Quote:


> Originally Posted by *Skanic*
> 
> The Prolimatech MK-26 is a beast. When I had my HD 7950 I ran it at 1250 core / 1500 memory and it never went above 65°C.


Indeed it is.
I had one on a 7870 XT. It never went above 50°C, even when running Furmark.


And that was with the fans running at 500RPM.


----------



## Dasboogieman

Quote:


> Originally Posted by *webhito*
> 
> Hi there!
> 
> Anyone here know what the best air solution for a reference 290x is?


As in aftermarket cooling, or AIB partner designs on reference boards?

In terms of aftermarket cooling, there are generally 3 options:
- Accelero Xtreme IV: excellent core temperatures and extremely silent. Very large, though, and the VRM cooling is subpar (compared to the stock cooler)
- GELID Icy Vision: adequate core and VRM temps. Probably the more balanced design
- G10 bracket + AIO: the best possible core temperatures short of full watercooling, but suffers from weak VRM cooling

AIB designs (reference PCB):
- The best in terms of raw performance is definitely the PowerColor PCS+: monstrous 3-slot fin stack + enormous dedicated VRM heatsink + backplate.
- The most space-efficient without compromising much performance is definitely the Sapphire Tri-X; weaker VRM cooling than the PowerColor, mostly because it lacks the backplate, but it only takes 2 slots.
- The most compact is probably the MSI Gaming edition: very balanced core + VRM temperatures, and far smaller than the top two.

AIB designs (non-reference PCB):
- Without question the best performer is the MSI Lightning: massive fin stack, huge VRM assembly and 3 fans. Extremely expensive, though.
- The best runner-up is probably the Sapphire Vapor-X: similar cooling capacity to the PCS+ but with far superior VRM cooling (due to the vapor chamber). The larger VRM assembly also means less heat output.


----------



## Dire Squirrel

Quote:


> Originally Posted by *Dasboogieman*
> 
> As in aftermarket cooling or AIB partner designs with reference boards?
> 
> In terms of aftermarket cooling, theres generally 3 options
> - Accelero extreme IV: provides excellent core temperatures and extremely silent. Very large though and VRM cooling is subpar (compared to stock cooler)
> - GELID Icy Vision: adequate core temperatures and VRM temps. Probably the more balanced design
> - G10 Bracket + AIO: best possible core temperatures short of full watercooling but suffers from weak VRM cooling


As seen from the other suggestions, there are clearly more than 3 options. And the ones you mention are not even the good ones.

-The Accelero, while doing a decent job of cooling, can only be considered quiet compared to the stock cooler. Reason = tiny fans, and smaller fans will always be louder than larger ones.

-The Gelid is cheap, and THAT is why it sells. It does pretty OK for the price, but isn't worthy of anything more than low-to-mid-end GPUs. It also suffers from tiny fans.

-The G10 is (when used with a decent AIO) not at all a bad product. But when you factor in a good AIO and a couple of decent fans (no AIO comes with proper, quiet fans), it is 50%-70% more expensive than the Prolimatech and Alpenföhn coolers, which are both capable of pretty much the same cooling.


----------



## pdasterly

The G10 has hidden costs to do it right:
$30 G10
$20 Fujipoly
$15 92mm fan
$20 Gelid heatsink
$5 VRAM heatsinks
Optional backplate: $40

Doing things the right way: priceless.

You forgot to mention the Corsair HG10 bracket.
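Summing that list (Python sketch; the line items are taken straight from the post above, and the part descriptions are my reading of them):

```python
# Itemized cost of doing the G10 route "right", per the list above (USD).
parts = {
    "NZXT G10 bracket": 30,
    "Fujipoly thermal pads": 20,
    "92mm fan": 15,
    "Gelid heatsink": 20,
    "VRAM heatsinks": 5,
}
base = sum(parts.values())           # 90 USD before the optional extras
with_backplate = base + 40           # optional backplate pushes it to 130 USD
print(base, with_backplate)          # prints: 90 130
```

At roughly $90-130 on top of the AIO itself, it's easy to see why a full-cover block starts to look competitive.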


----------



## gotmilkmang

Boom....Sapphire 290X reference with Aqua Computer block...haven't installed yet...waiting for the 4790K to come in


----------



## nightfox

Quote:


> Originally Posted by *bond32*
> 
> Just ran firestrike extreme with my new 290's: 10861 overall http://www.3dmark.com/3dm/3328379?
> 
> Stock clocks around, PT1 bios, +50% power limit.
> 
> Pretty awesome score. Each of the 290's (stock cooling) pulled a max of 230 watts and the 290x (water) pulled 214.8 watts. CPU Package (cores + cache) pulled 103.7 and dram pulled 18.5 watts which brings the total (according to software measurements, I know not accurate) to 797.6 watts. Other draws on the psu are a single D5 pump at full speed (around 35 watts max), 8 AP-15's at full speed (assume 1 watt per), an AF-140 at slow speed (negligible), ssd (negligible), 7200 rpm hdd (assume 5 watts), and negligible lighting. Full draw comes to 845.6 watts. Really wish I had a watt meter...


Something must be wrong, either with 3DMark or something else.

I just pulled up my previous record,

and comparing both, your 4770K is OC'ed to 4.8 while mine is at stock everything.

http://www.3dmark.com/compare/fs/2311506/fs/1778016

And our scores are not so different. But of course, mine is from 4 months ago and yours is recent. Drivers and updates, probably?


----------



## yawa

Water cool it bub. Let the liquid seduce you.

I love this 290X. The thing never breaks 45°C under full load no matter what I set it to.


----------



## Dasboogieman

Quote:


> Originally Posted by *Dire Squirrel*
> 
> As seen by the other suggestions, there are clearly more than 3 options. And the ones you mention are not even the good ones.
> 
> -The Accelero, while doing a decent job of cooling, can only be considered quiet when compared to stock cooling. Reason = tiny fans. And smaller fans will always be louder than larger ones.
> 
> -The Gelid is cheap. And THAT is why it sells. It does pretty ok for the price, but is not worthy of anything more than low-mid end GPU's. It also suffers from tiny fans.
> 
> -The G10 is (when used with a decent AIO) not at all a bad product. But when you factor in a good AIO and a couple of decent fans (no AIO comes with proper, quiet fans) it is 50%-70% more expensive than the Prolimatech and Alpenföhn coolers which both are capable of pretty much the same cooling.


Well, these 3 seem to be the most common, with reasonably well-understood performance and limitations. Do you, perhaps, know of any alternatives I may have missed that don't take 4+ slots (e.g. the monstrous Raijintek unit) and are readily available?

I mentioned the G10 since it's been on the market for a while and is fairly easy to use (let's face it, a CLC bracket is a bracket, nothing technologically advanced); however, the G10 at least makes a crude attempt at improving the VRM cooling. The Corsair HG10 may well be superior, but it simply isn't available yet. As for the alternative brackets, I can't really vouch for them because I have no experience with them, nor have I seen much data from their usage.
Quote:


> Originally Posted by *pdasterly*
> 
> g10 has hidden cost to do right,
> $30 g10
> $20 fujipoly
> $15 92mm fan
> $20 gelid heatsink
> $5 vram heatsink
> optional backplate $40
> 
> Doing things the right way, Priceless
> 
> You forgot to mention corsair hg10 bracket


Amen to that. I didn't realize that once all this is factored in, the custom-loop WC solution actually looks extremely viable.


----------



## Dire Squirrel

Quote:


> Originally Posted by *Dasboogieman*
> 
> Well these 3 seem to be the most common, available with reasonably well understood performance and limitations. Do you, perhaps, know of any alternatives that I may have missed that don't take 4+ slots (e.g. the monstrous Rajintek unit) and are readily available?


Now you are just moving the goalposts. There was no mention of any size limitations.
Bottom line, when it comes to air coolers, is that they NEED surface area. More surface area = higher potential heat dissipation.

The only real step up from the big air coolers is either the best and most expensive AIOs with really good fans added, or proper water cooling. But at that point the AIO solution is almost as expensive as the cheaper water cooling setups, and therefore not really worth it.

So the logical choice is between the big air coolers or proper water cooling.


----------



## bond32

Quote:


> Originally Posted by *nightfox*
> 
> something must be wrong. Either 3dmark or so.
> 
> I just pulled my previous record
> 
> and comparing both, your 4770k is oced to 4.8 mine is at stock everything.
> 
> http://www.3dmark.com/compare/fs/2311506/fs/1778016
> 
> and our score is not so much difference. But of course, mine is 4 months ago and yours is just recently. Drivers and updates probably?


Not sure what you mean; I believe they should be close. I have a 290X in the first slot (running at 8x) and the others are 290's at 4x. Other than that the GPUs are at 1000/1250, +50%. What RAM do you have? Mine is 2x4GB Trident X at 2600, 11-13-13-37.

Edit: if your GPU clocks are correct, I suppose mine should be higher. Did you set your core to 975? Perhaps it throttled during the bench?


----------



## semahj

I would like to be added to the club, this is my first build and I'm using a powercolor pcs+ 290x


----------



## Dasboogieman

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Now you are just moving the goalposts. There was no mention of any size limitations.
> Bottom line, when it comes to air coolers, is that they NEED surface area. More surface area = higher potential heat dissipation.
> 
> The only real step up from the big air coolers, are either the best and most expensive AIO's with really good fans added, or proper water cooling. But at that point, the AIO solution is almost as expensive as the cheaper water cooling solutions, and therefore not really worth it.
> 
> So the logical choice is between the big air coolers or proper water cooling.


I thought it would be fairly obvious that any air cooling unit beyond 4 slots isn't a very elegant solution to dissipating heat. Even the Accelero Xtreme IV is already pushing it with the huge backplate.
Why waste all that space in your case for maybe an additional 10-15 degrees of cooling? Never mind the implications of hanging close to a kilo of metal off a PCB.
Perhaps I wasn't clear enough in my original posts. These suggestions are meant as replacements for the AMD reference cooler, should the OP decide on air cooling. You don't need a massive 4-slot design to improve on the reference blower.


----------



## Dire Squirrel

Quote:


> Originally Posted by *Dasboogieman*
> 
> I thought it would fairly obvious that any air cooling unit beyond 4 slots isn't a very elegant solution to dissipating heat. Even the accelero extreme IV is already pushing it with the huge backplate.
> Why waste all that space in your case for an additional maybe 10 -15 degrees of cooling. Never mind the implications of hanging probably close to a kilo of metal off a PCB.
> Perhaps I wasn't clear enough in my original posts. These suggestions are meant replacements for the AMD reference cooler should the OP decide on air cooling. You don't need a massive 4 slot design to improve on the reference blower.


Who are you to decide what is "elegant"? Or to decide that "elegance" is now more important than function?
Space that is not otherwise used cannot be wasted. That is all there is to say about that silly excuse for an "argument".

And just to be absolutely clear here: 10-15°C better temps is MASSIVE, especially when you get MUCH lower noise at the same time.
But we are not talking about a 10-15°C improvement with the best air coolers. More like 25-30°C.


----------



## nightfox

Quote:


> Originally Posted by *bond32*
> 
> Not sure what you mean, I believe they should be close. I have a 290X in the first slot (running at 8x) and the others are 290's at 4x. Other than that the gpu's are at 1000/1250, +50%. What ram do you have? Mine is 2x4gb trident x at 2600, 11-13-13-37.
> 
> Edit: if your gpu clocks are correct, suppose mine should be higher. Did you set your core to 975? Perhaps it throttled during the bench?


No, they are stock. Well, I have one PowerColor 290 OC with a 975 stock clock, and I set MSI AB to the same clock, so my other 2 XFX 290s run a little higher than stock (947 to 975). My RAM is Samsung 1600 at stock, 11-11-11-28 I think. Like I said, everything was stock at that time, just to see if there is any difference when I OC the RAM, GPU and CPU.

Probably that's one of the reasons there is a little difference in the PCIe lanes. As my board has the PLX chip, all my GPUs are running at (8x) virtually.

Yes, they should be close, but seeing your CPU is OC'ed to 4.8, you should have a higher physics score compared to mine.

Would you try running everything at stock? CPU, GPU, RAM, I mean everything, to see if there is an improvement when you OC some of the components.


----------



## bond32

Quote:


> Originally Posted by *nightfox*
> 
> no. They are stock. Well I have one powercolor 290 OC 975 stock clock and then I set MSI AB to have same clock thereby my other 2 xfx 290 have clock a little higher than stock (947 to 975). My ram is samsung 1600 at stock. 11-11-11-28 I think. Like I said everything is stock at that time. Just to see if there is any difference if I OC ram, GPU, and CPU.
> 
> Probably thats one of the reason why there is a little difference one the PCI lanes. As my board has the PLX chip, all my GPU's are running (8x) virtually.
> 
> Yes they should be close but seeing your CPU is oced to 4.8, you should have higher physx compared to mine.
> 
> Would you try running everything at stock? CPU, GPU, RAM, I mean everything. see if there is an improvement if you OC some of the components.


My physics score was 13463, yours is 10363, which is reasonable. My RAM is at 2600, though you have 16GB to my 8 (if that mattered). Also my board has a PLX chip as well; I was pretty sure Haswell wasn't capable, even with a PLX, of running tri-fire at anything but 8x/4x/4x... Which board do you have?

I'm about to head to bed, but if I remember I will do a stock run tomorrow.

Edit: NVM, I see your board now.

Likely the difference.


----------



## Dasboogieman

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Do are you to decide what is "elegant"? Or even decide that "elegance" is now more important than function.
> Space that is not otherwise used can not be wasted. That is all there is to say about that silly excuse for a "argument"
> 
> And just to be absolutely clear here. 10-15C better temps is MASSIVE. Especially when you get MUCH lower noise at the same time.
> But we are not talking 10-15C improvement with the best air coolers. More like 25-30C.


Well, say, for example, the stock cooler can achieve 95 degrees at a loud 50-60 dBA, yes?
Say a good two-slot open-air cooler can drop this by about 20 degrees with decent silence (e.g. the Icy Vision). You probably lose one PCIe slot.
Say a four-slot cooler drops the temperatures by a generous 35 degrees. You lose pretty much the remainder of your PCIe slots.
You can also spin this around: say the 4-slot cooler drops 20 degrees at half the noise of the 2-slot one.

It is all a tradeoff.
Is the extra thermal headroom worth losing all your motherboard expansion capability? Is halving already barely audible noise really an improvement?
Elegance is solving a problem by the simplest and most concise means. If a 2-slot cooler can achieve 90% of most reasonable objectives (barring extreme needs), e.g. a conservative 75-degree operation at stock at 40 dBA, then I don't see how the significant sacrifices of a 4-slot cooler justify the added cooling performance (temperatures below 75 without extra VRM cooling) or acoustic gains (lower than the almost inaudible 40 dBA).

I'm not saying 4-slot coolers are entirely useless. You cannot argue with the performance or silence of such a design. What I am saying is that it is a more unbalanced, and hence less elegant, method of dissipating the heat from the 290X.


----------



## Red1776

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dire Squirrel*
> 
> Do are you to decide what is "elegant"? Or even decide that "elegance" is now more important than function.
> Space that is not otherwise used can not be wasted. That is all there is to say about that silly excuse for a "argument"
> 
> And just to be absolutely clear here. 10-15C better temps is MASSIVE. Especially when you get MUCH lower noise at the same time.
> But we are not talking 10-15C improvement with the best air coolers. More like 25-30C.
> 
> 
> 
> Well, say for example. The stock cooler can achieve 95 degrees at a loud 50-60 DbA yes?
> Say a good two slot open air cooler can drop this by about 20 degrees with decent silence (e.g. the icy vision). You probably lose one pcie slot.
> Say a four slot cooler drops the temperatures by a generous 35 degrees. You lose pretty much the remainder of your pcie slots.
> You can also spin this around. Say the 4 slot cooler drops 20 degrees at half the noise of the 2 slot one.
> 
> It is all a tradeoff
> Is the extra thermal headroom worth losing all your motherboard expansion capability? Is half of already barely audible noise really an improvement?
> Elegance is the solution to a problem by the simplest and most concise means. If a 2 slot cooler can achieve 90% of most reasonable (barring extreme needs) objectives (e.g. a conservative 75 degree operation at stock at 40DbA) then I don't see how the significant sacrifices of a 4 slot cooler justify the added cooling performance (temperatures below 75 without extra vrm cooling) or acoustic gains (lower than the almost inaudible 40DbA).
> 
> I'm not saying 4 slot coolers are entirely useless. You cannot argue with the performance or silence of such a design. What I am saying is that it is a more unbalanced, hence, less elegant method of dissipating the heat from the 290x
Click to expand...

That was both mellifluous and eloquent. +1


----------



## Arizonian

Quote:


> Originally Posted by *gotmilkmang*
> 
> boom....sapphire 290x reference, with aqua computer block...havent installed yet...waiting for the 4790k to come in
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added



Quote:


> Originally Posted by *semahj*
> 
> I would like to be added to the club, this is my first build and I'm using a powercolor pcs+ 290x
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Dire Squirrel

Quote:


> Originally Posted by *Dasboogieman*
> 
> I'm not saying 4 slot coolers are entirely useless. You cannot argue with the performance or silence of such a design. What I am saying is that it is a more unbalanced, hence, less elegant method of dissipating the heat from the 290x


Basically what you are doing is defending your original statement, even in the face of clear evidence that it was wrong.

I'll break it down for you one last time:

-Performance - heatsink: More surface area = higher performance potential. That is just a simple fact. What you are arguing is that a 2 x 92 mm heatsink is so close to a 2 x 120 mm (Alpenföhn) or 2 x 140 mm (Prolimatech) one that it makes no difference, or that the difference is not enough to justify the larger units.
You are plain wrong. Simple as that.

-Performance - fans: Larger fan = more air moved at the same RPM (all else being equal). 92 mm fans will NEVER move the same air as 120 mm or 140 mm fans.
Once again, this is a simple fact. And once again you are plain wrong (which is what happens when you argue against facts).

-Noise: Larger fan = less noise. A larger fan can move significantly more air than a smaller one at the same RPM (all else being equal). No matter which way you try to twist it, 120 mm and 140 mm fans will always outperform 92 mm fans in terms of low noise, because they can do more work while running at lower speeds.
Once again, a simple fact.
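The "larger fan, same RPM, more air" point can be sketched with the standard fan-affinity scaling. This is a rough model that assumes geometrically similar fans; real blade and hub designs differ, so treat it as a trend rather than a prediction:

```python
# Fan-affinity scaling behind the "larger fan, same RPM, more air" point.
# Assumes geometrically similar fans; real fans differ in blade and hub
# design, so treat these as trends rather than predictions.

def airflow_scale(d_mm: float, d_ref_mm: float, rpm: float, rpm_ref: float) -> float:
    """Relative volumetric airflow: flow ~ RPM * D^3 for similar fans."""
    return (rpm / rpm_ref) * (d_mm / d_ref_mm) ** 3

# A 120 mm fan at the same RPM as a 92 mm fan moves roughly 2.2x the air:
flow_ratio = airflow_scale(120, 92, 1000, 1000)

# Flip it around: the 120 mm fan matches the 92 mm fan's airflow at
# roughly 45% of its RPM, which is where the noise advantage comes from.
rpm_fraction = (92 / 120) ** 3
```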

As for your silly little made-up "elegance" parameter that apparently means whatever you need it to mean to support your dying "argument":
Using one extra PCI slot that would otherwise not be used is not a sacrifice. You have lost nothing. But let's make it a factor anyway, to make you happy. With the backplate installed, the Accelero is 6.4 mm THICKER than the MK-26.

But if you absolutely want to make simplicity a factor, the larger units, which are nothing more than a bare heatsink with a couple of fans held on by wire clips, will beat your overly elaborate Accelero with all its plastic "shrouds" by a mile.

You may now consider every single one of your arguments and claims debunked and beaten to a pulp.
I'll be in the teachers' lounge.

Class dismissed.


----------



## JordanTr

I have the Gelid Icy Vision cooler. It only dropped 15 degrees from the stock cooler, with lower load noise as well, but it really p**sed me off that at idle it's a couple of times louder than the reference cooler, so I had to sacrifice another 5 degrees and replace the fans with be quiet! Shadow Wings 92 mm to get ultimate silence at all times. However, I can't OC on them, so next week I'm ordering the Xtreme IV. I've heard really good things about that cooler's VRM cooling and noise ratio. Keep in mind I've got a Fractal Design XL R2 with all the fans mounted, so airflow is quite good.


----------



## Dasboogieman

Quote:


> Originally Posted by *JordanTr*
> 
> I have the Gelid Icy Vision cooler. It only dropped 15 degrees from the stock cooler, with lower load noise as well, but it really p**sed me off that at idle it's a couple of times louder than the reference cooler, so I had to sacrifice another 5 degrees and replace the fans with be quiet! Shadow Wings 92 mm to get ultimate silence at all times. However, I can't OC on them, so next week I'm ordering the Xtreme IV. I've heard really good things about that cooler's VRM cooling and noise ratio. Keep in mind I've got a Fractal Design XL R2 with all the fans mounted, so airflow is quite good.


Likewise, I was really annoyed with the G10 + H90 setup. I had a VRM cooling kit and everything. It was dead silent and core temperatures were fantastic, but the VRM temps could never match the stock Tri-X cooler, so I actually didn't gain any OC headroom. Plus there were some serious issues with VRAM heat despite the stick-on heatsinks. It actually made me appreciate how much thought goes into most good AIB cooler designs.

If you are overclocking, I don't think the Accelero Xtreme IV is a good choice: you will be trading better core temperatures (vs the Icy Vision) for far worse VRM temperatures (according to the guys using the Accelero CLC bracket plus the huge backplate), because thermal transfer through the PCB is really inefficient. You may want to use the legacy Accelero glue-on heatsinks in addition to the backplate should you go down this route.

Quite frankly, I haven't found anything that truly increases OC headroom (which requires a simultaneous improvement in VRM and core cooling). Everything short of full-scale watercooling seems to have serious tradeoffs in core temperature, auxiliary part temperatures, or noise/bulk.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dasboogieman*
> 
> Well, say for example. The stock cooler can achieve 95 degrees at a loud 50-60 DbA yes?
> Say a good two slot open air cooler can drop this by about 20 degrees with decent silence (e.g. the icy vision). You probably lose one pcie slot.
> Say a four slot cooler drops the temperatures by a generous 35 degrees. You lose pretty much the remainder of your pcie slots.
> You can also spin this around. Say the 4 slot cooler drops 20 degrees at half the noise of the 2 slot one.
> 
> It is all a tradeoff
> Is the extra thermal headroom worth losing all your motherboard expansion capability? Is half of already barely audible noise really an improvement?
> Elegance is the solution to a problem by the simplest and most concise means. If a 2 slot cooler can achieve 90% of most reasonable (barring extreme needs) objectives (e.g. a conservative 75 degree operation at stock at 40DbA) then I don't see how the significant sacrifices of a 4 slot cooler justify the added cooling performance (temperatures below 75 without extra vrm cooling) or acoustic gains (lower than the almost inaudible 40DbA).
> 
> I'm not saying 4 slot coolers are entirely useless. You cannot argue with the performance or silence of such a design. What I am saying is that it is a more unbalanced, hence, less elegant method of dissipating the heat from the 290x


Very well said, good sir; you may have a +1 from me.


----------



## KyadCK

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> I'm not saying 4 slot coolers are entirely useless. You cannot argue with the performance or silence of such a design. What I am saying is that it is a more unbalanced, hence, less elegant method of dissipating the heat from the 290x
> 
> 
> 
> Basically what you are doing is defending your original statement, even in the face of clear evidence that it was wrong.
> 
> I'l break it down for you one last time:
> 
> -Performance - heatsink: More surface area = higher performance potential. That is just a simple fact. What you are arguing is that 2 X 92mm. heatsink is so close to a 2 x 120mm. (Alpenföhn) 2 x 140mm. (Prolimatech) that it makes no difference or that the difference is not enough to justify the larger units.
> You are plain wrong. Simple as that.
> 
> -Performance - fans: Larger fan = more air being moved at the same RPM (all else being equal). 92mm. fans will NEVER move the same air as 120mm. or 140mm. fans.
> Once again, this is a simple fact. And once again you are plain wrong (which is what happens when you argue against facts).
> 
> -Noise: Larger fan = less noise. A larger fan can move significantly more air than a smaller one, at the same RPM (all else being equal). No matter which way you try to twist it, 120mm. and 140mm. fans will always outperform 92mm. fans in terms of low noise because they can do more work while running at lower speeds.
> Once again, a simple fact.
> 
> *As for your silly little, made up "elegance" parameter that apparently means whatever you need it to mean to support you dying "argument";*
> Using one extra PCI slot that would otherwise not be used, is not a sacrifice. You have lost nothing. But lets just make that a factor anyway, to make you happy. With the backplate installed, the Accelero is 6,4mm THICKER than the MK-26.
> 
> But if you absolutely want to make simplicity a factor, the larger units that are nothing more than a bare heatsink with a couple of fans held on by wire clips, will beat you overly elaborate Accelero with all it's plastic "shrouds" by a mile.
> 
> You may now consider every single one of your arguments and claims debunked and beaten to a pulp.
> I'll be in the teachers lounge.
> 
> Class dismissed.
Click to expand...

Elegance;
Quote:


> a : refined grace or dignified propriety : urbanity
> b : tasteful richness of design or ornamentation
> c : dignified gracefulness or restrained beauty of style : polish
> d : scientific precision, neatness, and simplicity


XFX's 290 cooler is "elegant". A single-slot plain water block and simple backplate is "elegant". A huge, ugly, bulky 4-slot cooler is not "elegant". It is a waste of space that completely rules out any thought of multi-GPU. It is brute force. There is zero elegance in its design.

I understand the appeal of the various cooler styles. Usually the purpose of going for a larger air cooler is to retain air cooling, and the versatility that comes with it, while still getting extra cooling performance. However, as far as actually being on the card, there is only so large you can go before they are ugly, huge, heavy, and take up way too much space for the cooling capacity they provide. Even a relatively small Accelero (pre-backplate) needs a prop because it connects only by the 4 mounting holes, not the whole card, and you don't want to damage the PCB. They also sacrifice cooling of all the little things like RAM and VRMs, which you are then forced to tape or glue heatsinks to, with the potential to damage the card on removal. Then you get into the hidden costs of those extra coolers, plus good fans, and suddenly it's not a very cost-effective option.

If you want real cooling capacity for the entire card, you go water. If you want more room on the motherboard, you go water. If you want it quieter without sacrificing performance or space, you go water.

A water block will beat even the best air cooler to a pulp, be quieter, AND not render an entire motherboard useless.

Lesson is over. Get out of my lounge. Man that sounds arrogant, doesn't it?


----------



## Sgt Bilko

Quote:


> Originally Posted by *KyadCK*
> 
> Elegance;
> XFX's 290 cooler is "elegant". A single-slot plain water block and simple backplate is "elegant". A huge ugly and bulky 4-slot cooler is not "elegant". It is a waste of space that completely disables any thought of multi-GPU. It is brute force. There is zero elegance in it's design.
> 
> I understand the appeal of the various cooler styles. Usually the purpose of going for a larger air cooler is to retain air cooling and the versatility that comes with it while still getting extra cooling performance. However, as far as actually being on the card, there is only so large you can go before they are ugly, huge, heavy, and take up way too much space for the cooling capacity that they allow for. Even a relatively small Accelero (pre-backplate) needs a prop because it only connects by the 4 mounting holes, not the whole card, and you don't want to damage the PCB. They also sacrifice cooling of all the little things like RAM and VRMS, which you are then forced to tape or glue heatsinks to, with the potential to damage the card upon removal. Then you get into the hidden costs of having to get those extra coolers, and good fans, and suddenly it's not a very cost effective option.
> 
> If you want real cooling capacity for the entire card, you go water. If you want more room on the motherboard, you go water. If you want it quieter without sacrificing performance or space, you go water.
> 
> A water block will beat even the best air cooler to a pulp, be quieter, AND not render an entire motherboard useless.
> 
> Lesson is over. Get out of my lounge. Man that sounds arrogant, doesn't it?


Well said


Also, the XFX cooler isn't just elegant, it's also dead sexy


Spoiler: Pics


----------



## Dire Squirrel

Quote:


> Originally Posted by *KyadCK*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Elegance;
> XFX's 290 cooler is "elegant". A single-slot plain water block and simple backplate is "elegant". A huge ugly and bulky 4-slot cooler is not "elegant". It is a waste of space that completely disables any thought of multi-GPU. It is brute force. There is zero elegance in it's design.
> 
> I understand the appeal of the various cooler styles. Usually the purpose of going for a larger air cooler is to retain air cooling and the versatility that comes with it while still getting extra cooling performance. However, as far as actually being on the card, there is only so large you can go before they are ugly, huge, heavy, and take up way too much space for the cooling capacity that they allow for. Even a relatively small Accelero (pre-backplate) needs a prop because it only connects by the 4 mounting holes, not the whole card, and you don't want to damage the PCB. They also sacrifice cooling of all the little things like RAM and VRMS, which you are then forced to tape or glue heatsinks to, with the potential to damage the card upon removal. Then you get into the hidden costs of having to get those extra coolers, and good fans, and suddenly it's not a very cost effective option.
> 
> If you want real cooling capacity for the entire card, you go water. If you want more room on the motherboard, you go water. If you want it quieter without sacrificing performance or space, you go water.
> 
> A water block will beat even the best air cooler to a pulp, be quieter, AND not render an entire motherboard useless.
> 
> Lesson is over. Get out of my lounge. Man that sounds arrogant, doesn't it?


Congratulations on missing the point.

Forget that nonsensical "elegance" argument. It means nothing.
If you want it to be about simplicity, then the simplest solution wins. And in this case it does not get any simpler than a bare heatsink with fans strapped onto it.
If you want it to be about aesthetics, you are moving into the realm of personal preference. And since that is 100% subjective, you can't claim that anything is objectively better or worse. That is just simple logic.

If you absolutely must play necromancer with the long-dead "extra strain on the PCB/motherboard/PCIe slot" nonsense, you might want to take into account that the Accelero is significantly heavier than the MK-26 (and thicker as well), and that full-cover waterblocks are heavier still (my waterblock is easily twice as heavy as the MK-26).
All the parts involved can easily take the extra weight, which is why we never hear about damage done by these products' weight.

As for the VRM/RAM cooling: the heatsinks generally come included, so you can't really claim them as an added cost. And even if they didn't, the total cost would still be less than something like the sad little Accelero.
You clearly have never worked with such things if you think removing them is some kind of massive risk. Sure, if you actually glue them on, getting them off is not really an option; that is a permanent solution that is very rarely used these days. Thermal tape is very easy to remove, and unless you have hammers where other people have hands, there is no chance of any damage, which is why we never hear of such damage.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Congratulations on missing the point.
> 
> Forget that nonsensical "elegance" argument. It means nothing.
> If you wan't it to be about simplicity, than the simplest solution wins. And in this case it does not get any simpler than a bare heatsink with fans strapped onto it.
> If you want it to be about aesthetics, you are moving into the realm of personal preference. And since that is 100% subjective, you can't claim that anything is objectively better or worse. That is just simple logic.
> 
> If you absolutely play necromancer for the long dead "extra strain on the PCB/motherboard/PCI slot" nonsense, you might want to take into account the fact that *the Accelero is significantly heavier* than the MK-26 (and thicker as well) and that full cover waterblocks are even heavier still (my waterblock is easily twice as heavy as the MK-26).
> all the parts involved can easily take the extra weight, which is why we never hear about any damage done by these products based on their weight.
> 
> As for the VRM/RAM cooling profiles. They generally come included so you can't really claim that as an added cost. And even if they didn't, the total cost would still be less than things like the sad little Accelero.
> You clearly have never worked with such things if you actually think that removing them is some kind of massive risk. Sure if you actually glue them on, getting them off is not really an option. It is a permanent solution that is very rarely used these days. Thermal tape is very easy to remove. And unless you have hammers where other people have hands, there is no chance of any damage, which is why we never hear of such damage.


I owned an Accelero Xtreme III and I thought you might like this:

Weight: 653 g (source)

Prolimatech MK-26 weight: 570 g *before* adding fans; throw on a couple of Corsair 120 mm fans and you are looking at another 200 g or so.

"Significantly heavier" isn't a great description, tbh.

Waterblocks have multiple contact points along the PCB which means that they won't warp or flex as much as 3rd party air coolers will.

And the AX III doesn't come with enough heatsinks for the 290/X, so you need to order more (the AX IV has enough, I think).

And just for the icing on the cake: if I were still using my AX III, I wouldn't have been able to fit another GPU in my rig; it takes up too much room for most people.
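For reference, the weights quoted across this exchange line up like this (the Xtreme IV figure appears a few posts later in the thread; the per-fan mass is the rough estimate above, not a spec):

```python
# Cooler weights quoted in this thread, in grams.
# Fan mass (~100 g per 120 mm fan) is a rough estimate, not a spec.
weights = {
    "Accelero Xtreme III": 653,                    # fans included, as quoted
    "Accelero Xtreme IV": 991,                     # figure quoted later in the thread
    "Prolimatech MK-26 (bare)": 570,               # before adding fans
    "Prolimatech MK-26 + 2x 120 mm fans": 570 + 2 * 100,
}

# With fans fitted, the MK-26 lands between the two Acceleros.
heaviest = max(weights, key=weights.get)
```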


----------



## KyadCK

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KyadCK*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Elegance;
> XFX's 290 cooler is "elegant". A single-slot plain water block and simple backplate is "elegant". A huge ugly and bulky 4-slot cooler is not "elegant". It is a waste of space that completely disables any thought of multi-GPU. It is brute force. There is zero elegance in it's design.
> 
> I understand the appeal of the various cooler styles. Usually the purpose of going for a larger air cooler is to retain air cooling and the versatility that comes with it while still getting extra cooling performance. However, as far as actually being on the card, there is only so large you can go before they are ugly, huge, heavy, and take up way too much space for the cooling capacity that they allow for. Even a relatively small Accelero (pre-backplate) needs a prop because it only connects by the 4 mounting holes, not the whole card, and you don't want to damage the PCB. They also sacrifice cooling of all the little things like RAM and VRMS, which you are then forced to tape or glue heatsinks to, with the potential to damage the card upon removal. Then you get into the hidden costs of having to get those extra coolers, and good fans, and suddenly it's not a very cost effective option.
> 
> If you want real cooling capacity for the entire card, you go water. If you want more room on the motherboard, you go water. If you want it quieter without sacrificing performance or space, you go water.
> 
> A water block will beat even the best air cooler to a pulp, be quieter, AND not render an entire motherboard useless.
> 
> Lesson is over. Get out of my lounge. Man that sounds arrogant, doesn't it?
> 
> 
> 
> 
> 
> 
> Congratulations on missing the point.
> 
> Forget that nonsensical "elegance" argument. It means nothing.
> If you wan't it to be about simplicity, than the simplest solution wins. And in this case it does not get any simpler than a bare heatsink with fans strapped onto it.
> If you want it to be about aesthetics, you are moving into the realm of personal preference. And since that is 100% subjective, you can't claim that anything is objectively better or worse. That is just simple logic.
> 
> If you absolutely play necromancer for the long dead "extra strain on the PCB/motherboard/PCI slot" nonsense, you might want to take into account the fact that the Accelero is significantly heavier than the MK-26 (and thicker as well) and that full cover waterblocks are even heavier still (my waterblock is easily twice as heavy as the MK-26).
> all the parts involved can easily take the extra weight, which is why we never hear about any damage done by these products based on their weight.
> 
> As for the VRM/RAM cooling profiles. They generally come included so you can't really claim that as an added cost. And even if they didn't, the total cost would still be less than things like the sad little Accelero.
> You clearly have never worked with such things if you actually think that removing them is some kind of massive risk. Sure if you actually glue them on, getting them off is not really an option. It is a permanent solution that is very rarely used these days. Thermal tape is very easy to remove. And unless you have hammers where other people have hands, there is no chance of any damage, which is why we never hear of such damage.
Click to expand...

It does mean something; you've just decided not to read it. I linked its definition.

PCBs sag. True fact. That's bad for the PCB. The difference between a full-cover block with a backplate and an air cooler is that the full-cover block uses mounting holes along the entire PCB, using its own metal frame to support the card and keep it from bending. Weight isn't the only factor; mounting style is. Even the AX III sags horribly, and that is lighter than the MK-26. At least the AX IV has a strong backplate to reduce the strain on the PCB.

Really, so that big 4-slot MK-26 comes with enough to cover the 290X? It sure doesn't look like it.


----------



## bond32

What are we arguing here? Are we really discussing the effectiveness of a full cover waterblock vs anything else?? Lol.


----------



## rdr09

Quote:


> Originally Posted by *bond32*
> 
> What are we arguing here? Are we really discussing the effectiveness of a full cover waterblock vs anything else?? Lol.


that's what i was wondering, too.

anyway, bond, i saw your benches and i think the X8X4X4 is putting a damper on the results.


----------



## Dasboogieman

Quote:


> Originally Posted by *bond32*
> 
> What are we arguing here? Are we really discussing the effectiveness of a full cover waterblock vs anything else?? Lol.


I think it may have been my fault. It started off as my comparison and recommendation list of AIB and aftermarket cooling units for the 290. There was a difference of opinion and some misunderstanding between myself and Dire Squirrel, and now I'm afraid it may become a full-blown forum melee.


----------



## JordanTr

I think people are mixing up the AX III and AX IV. The AX IV backplate has thermal pads as well and does a great job cooling the VRMs. One guy tested it and VRM1 didn't go above 60 at stock clocks, so for OC I would say no more than 75, which is perfectly fine. I did some dirty OC, just slapped on +100 mV at 1200/1500, 100% stable. But after a while my temps go to 92-93 and VRM 100+.


----------



## bond32

Quote:


> Originally Posted by *rdr09*
> 
> that's what i was wondering, too.
> 
> anyway, bond, i saw your benches and i think the X8X4X4 is putting a damper on the results.


It is. Kinda knew that when I got the board, mistake on my part...

I'll be looking to get the Asus Maximus VI Extreme at some point; it runs at x8/x16/x8.


----------



## rdr09

Quote:


> Originally Posted by *bond32*
> 
> It is. Kinda new that when I got the board, mistake on my part...
> 
> I'll be looking to get the Asus maximus vi extreme at some point, it runs at 8x16x8x


one upgrade leads to another, lol. Not much difference between x8 and x16 based on my experience with Tahitis, but x4 is too low.


----------



## Dire Squirrel

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I owned an Accelero Xtreme III and i thought you might like this:
> 
> Weight: 653 g (source)


Way to keep up with the discussion.

The Accelero in question is the Xtreme IV, which weighs in at 991 g.

Quote:


> Originally Posted by *KyadCK*
> 
> PCB sags. True fact.


No one ever said otherwise. But the simple fact is that it does no damage.
People have been saying exactly what you are saying for literally decades, but time has shown that no damage is done.
Just like with big CPU coolers: even today people keep bringing up the old wives' tale about how they damage the motherboard, but reality keeps showing us that it simply is not the case.

If you were correct, reports of CPU and GPU coolers damaging PCBs, PCIe slots and motherboards would be common. In fact, it would be more common for it to happen than not. Yet this is not and never has been the case, as clearly demonstrated by the complete lack of any such reports.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Way to keep up with the discussion.
> 
> The Accelero in question is the Xtreme IV which weights in at 991g
> 
> No one ever said otherwise. But the simple fact is that it does no damage.
> People have been saying the exact thing you are saying, for literally decades. But time has shown that there is no damage done.
> Just like with big CPU coolers, even today, people keep bringing up the old wives tale about how they damage the motherboard. But reality keeps showing us that it simply is not the case.
> 
> If you were correct, reports of CPU and GPU coolers damaging PCB's, PCI slots and motherboards would be common. In fact it would be more common for it to happen than not. Yet, this is not and has never been the case, which is so clearly demonstrated by the complete lack of any such reports.


Yep, and I chose a lighter cooler that still manages to make the PCB sag, putting stress not only on the PCB itself but also on the PCIe slot.

Sorry if you missed that.


Spoiler: Just leaving this here


----------



## Dasboogieman

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Way to keep up with the discussion.
> 
> The Accelero in question is the Xtreme IV which weights in at 991g
> 
> No one ever said otherwise. But the simple fact is that it does no damage.
> People have been saying the exact thing you are saying, for literally decades. But time has shown that there is no damage done.
> Just like with big CPU coolers, even today, people keep bringing up the old wives tale about how they damage the motherboard. But reality keeps showing us that it simply is not the case.
> 
> If you were correct, reports of CPU and GPU coolers damaging PCB's, PCI slots and motherboards would be common. In fact it would be more common for it to happen than not. Yet, this is not and has never been the case, which is so clearly demonstrated by the complete lack of any such reports.


Regarding the CPU coolers: I thought part of the reason AIO units are so popular among boutique builders (other than being able to sell the "water cooled" logo) was that they significantly cut down on RMAs from motherboard sockets damaged by heavy air coolers during transport.

http://worldthetech.blogspot.com.au/2013/03/four-more-closed-loop-liquid-coolers_8127.html


----------



## nightfox

Quote:


> Originally Posted by *bond32*
> 
> It is. Kinda new that when I got the board, mistake on my part...
> 
> I'll be looking to get the Asus maximus vi extreme at some point, it runs at 8x16x8x


Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yep, and i chose a lighter cooler that still manages to make the PCB sag and thus put stress not only on the PCB itself but also the PCIe slot.
> 
> Sorry if you missed that.
> 
> 
> Spoiler: Just leaving this here


whooooo.. seeing that damage makes me want to tie up my GPUs so they don't sag and damage my PCIe slot.


----------



## Sgt Bilko

Quote:


> Originally Posted by *nightfox*
> 
> whooooo.. seeing that damage makes me wanna tied up my gpu's so they dont sag and damage my pci slot.


It would be an isolated incident, but the potential is still there; that's why I don't like big tower air coolers and heavy aftermarket GPU coolers.

They perform well, no doubt about it, but I just hate the idea of the extra weight.


----------



## Dire Squirrel

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yep, and i chose a lighter cooler that still manages to make the PCB sag and thus put stress not only on the PCB itself but also the PCIe slot.
> 
> Sorry if you missed that.
> 
> 
> Spoiler: Just leaving this here


Good for you. You found a single example.
And once again missed the point.

CAN this happen? Sure. Just like your RAM can spontaneously burst into flames.
But these are extremely rare occurrences. The simple fact is that IF it were likely to happen, we would see this sort of thing all the time. And we simply don't.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Good for you. You found a single example.
> And once again missed the point.
> 
> CAN this happen? sure. Just like your RAM can spontaneously burst into flames.
> But these are extremely rare occurrences. The simple fact is that IF it was likely to happen, we would be seeing that sort of thing all the time. And we simply don't.


See my above post.....you missed it


----------



## nightfox

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Good for you. You found a single example.
> And once again missed the point.
> 
> CAN this happen? sure. Just like your RAM can spontaneously burst into flames.
> But these are extremely rare occurrences. The simple fact is that IF it was likely to happen, we would be seeing that sort of thing all the time. And we simply don't.


Damn, what is wrong with you? You can't force your views on everybody, and you cannot force people to believe you. You simply don't have that "convincing power". Just let it go, man.


----------



## webhito

Thanks for all the suggestions, for those mentioning putting them on water, I actually do have both of them under water but had a scare recently with a leak and don't want it to happen again.

Quote:


> Originally Posted by *Dire Squirrel*
> 
> I would say the Prolimatech MK-26 or Alpenföhn Peter.
> Which one to choose is basically down to the size of you case. The MK-26 needs a wider case than the Alpenföhn, as it is a bit over 17cm from the motherboard.


Is there a way to counter the weight of any of those coolers? Like a backplate or something? I have read that, with fans attached, they are heavy enough that they tend to warp the PCB.


----------



## rdr09

Quote:


> Originally Posted by *webhito*
> 
> Thanks for all the suggestions, for those mentioning putting them on water, I actually do have both of them under water but had a scare recently with a leak and don't want it to happen again.


What happened? No clamps, no drain line, or something else? Go back to the full block and just use chopsticks to keep the card from sagging. Actually, even without a backplate I don't see any sag or droop, so I got rid of the chopstick.


----------



## bond32

Quote:


> Originally Posted by *webhito*
> 
> Is there a way to counter the weight of any of those coolers? Like a backplate or something? I have read that, with fans attached, they are heavy enough that they tend to warp the PCB.


http://www.performance-pcs.com/catalog/index.php?main_page=product_info&cPath=59_971_1018_1038_1209&products_id=40575


----------



## Dasboogieman

Quote:


> Originally Posted by *rdr09*
> 
> What happened? No clamps, no drain line, or something else? Go back to the full block and just use chopsticks to keep the card from sagging. Actually, even without a backplate I don't see any sag or droop, so I got rid of the chopstick.


I gotta see photos of this chopstick mod, sounds slightly ghetto.


----------



## rdr09

Quote:


> Originally Posted by *Dasboogieman*
> 
> I gotta see photos of this chopstick mod, sounds slightly ghetto.


I used it when I had the 7950. The tubing wasn't cut to size, so it was pushing the block down a bit. I got it right with the 290, so I got rid of it . . .



Crappy phone cam. I see you laughing. It worked.


----------



## Dasboogieman

Quote:


> Originally Posted by *rdr09*
> 
> I used it when I had the 7950. The tubing wasn't cut to size, so it was pushing the block down a bit. I got it right with the 290, so I got rid of it . . .
> 
> 
> 
> Crappy phone cam. I see you laughing. It worked.


deliciously ghetto


----------



## wrigleyvillain

Aw yiss my $230 shipped 290 incl an Accelero from ebay ships today. Thanks miners!


----------



## Red1776

Quote:


> Originally Posted by *DeadlyDNA*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dire Squirrel*
> 
> In any case, I'm out.
> My tolerance for fanboy arguments that ignore facts, has been both met and exceeded.
> 
> If you need me, I'll be in reality.
> 
> 
> 
> Fanboy of what exactly? I'm lost.
Click to expand...

I'm not chiming in to be argumentative here, but CPU coolers and GPUs can warp motherboards and GPU PCBs, not only from the weight but also from the heating and cooling cycles.

The first pic is from the first quad I built in 07-08. I had two 4850's (not weighty cards at all) die from warping before I made a support rod (the stainless steel rod with set-screw bearings at the end of the GPUs). I used it in all my builds up until I went liquid, but not before letting my 6970 warp badly. I put the rod back in and they went back closer to flat, but were still a bit convex.

The Silver Arrow has a tie-in at the back to keep it from sagging as well.

Hope this helps someone down the road; I learned it the hard way.


----------



## rdr09

Quote:


> Originally Posted by *Dasboogieman*
> 
> deliciously ghetto


A lot of Japanese and Korean restaurants around my area.

Quote:


> Originally Posted by *wrigleyvillain*
> 
> Aw yiss my $230 shipped 290 incl an Accelero from ebay ships today. Thanks miners!


----------



## battleaxe

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yep, and i chose a lighter cooler that still manages to make the PCB sag and thus put stress not only on the PCB itself but also the PCIe slot.
> 
> Sorry if you missed that.
> 
> 
> Spoiler: Just leaving this here


What are you talking about?

That looks completely fine! Phhhhsssssshhhhh....


----------



## Dasboogieman

On a completely different topic: does anyone here know how to compile .exe files from OpenCL source code?
I have the makefiles for the fixed version of MemtestCL, which would prove extremely useful for people suffering from black screens.
Basically, the existing version of MemtestCL has a crippling flaw: modern AMD GPUs can complete the Random Block iterations so fast that the app doesn't have time to select a new block from the RNG + seed. When the GPU fails to receive the next randomly assigned block, it records a fail for that iteration, so you get a whole heap of false Random Block errors.

A guy called iHaque worked on the original MemtestCL and managed to fix the issue, but he left his work as open-source code and has basically fallen off the face of the earth since 2012. I simply lack the expertise to compile it into something useful for the community, though I've been trying with VS2013 for days.

If someone can help, please do. It's a great program for testing VRAM stability because it is far more sensitive than the traditional "stare at the screen for artifacts" or "run it till it crashes" methodologies.
I'm fairly certain that people with chronic black screens will throw up massive amounts of legitimate memory errors at stock.

Here's the GitHub repository:
https://github.com/ihaque/memtestCL
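For anyone curious about the flaw described above, the producer/consumer race can be sketched in plain Python (no OpenCL or GPU needed). This is a hand-rolled illustration of the described bug pattern, not code from the MemtestCL source; every name and timing here is made up:

```python
import queue
import threading
import time

def host_rng(q, n_blocks, delay_s):
    """Host side: derives each random test block from the RNG + seed.
    The per-block sleep stands in for the block-selection latency."""
    for block_id in range(n_blocks):
        time.sleep(delay_s)
        q.put(block_id)

def run_iterations(n_blocks, delay_s, wait_for_block):
    """'GPU' side: runs one iteration per expected block and returns how
    many iterations were logged as failures."""
    q = queue.Queue()
    t = threading.Thread(target=host_rng, args=(q, n_blocks, delay_s))
    t.start()
    false_errors = 0
    for _ in range(n_blocks):
        if wait_for_block:
            q.get()             # fixed behaviour: wait until the block exists
        else:
            try:
                q.get_nowait()  # broken behaviour: assume the block is ready
            except queue.Empty:
                false_errors += 1  # no block yet -> logged as a "memory error"
    t.join()
    return false_errors

# The fast "GPU" outruns the slow host RNG and logs phantom errors...
print(run_iterations(20, 0.01, wait_for_block=False) > 0)
# ...while the fixed version, which waits for each block, logs none.
print(run_iterations(20, 0.01, wait_for_block=True))
```

The point is that the errors in the broken path come from the handoff, not from faulty VRAM, which is exactly why the false Random Block failures vanish once the app waits for the next block.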


----------



## Durvelle27

My new universal block for my 290X will be arriving later today.


----------



## webhito

Quote:


> Originally Posted by *rdr09*
> 
> what happened? no clamps, no drain line or something else? go back to full block and just use chopsticks to keep the card from sagging. actually even wihout a backplate i don't see any sag or droop, so i got rid of the chopstick.


Everything has compression fittings; one hose from the CPU block decided to slip and I got a few drops on the back of my GPU. Thankfully it was right on top of the backplate.
Quote:


> Originally Posted by *bond32*
> 
> http://www.performance-pcs.com/catalog/index.php?main_page=product_info&cPath=59_971_1018_1038_1209&products_id=40575


Yeah, but that's only for the EK block AFAIK. I want to dump the blocks altogether and go back to air; I've already had a scare.


----------



## Red1776

Quote:


> Originally Posted by *Durvelle27*
> 
> My new universal block for my 290X will be arriving later today.


Hey D,

can ya throw a up a link to the U-block you purchased

Thanks


----------



## rdr09

Quote:


> Originally Posted by *webhito*
> 
> Everything has *compression fittings*; one hose from the CPU block decided to slip and I got a few drops on the back of my GPU. Thankfully it was right on top of the backplate.
> Yeah, but that's only for the EK block AFAIK. I want to dump the blocks altogether and go back to air; I've already had a scare.


Yikes. They are good looking, but . . . barbs and zip ties here.

Actually, a similar thing happened to one of my tubes. I forgot to zip-tie one fitting, and Tygon tubes . . . man, I tell you, they are durable and stay clear longer, but they lose their elasticity. Water leaked all the way down the side of the case, missing my HDDs.


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *webhito*
> 
> Everything has *compression fittings*; one hose from the CPU block decided to slip and I got a few drops on the back of my GPU. Thankfully it was right on top of the backplate.
> Yeah, but that's only for the EK block AFAIK. I want to dump the blocks altogether and go back to air; I've already had a scare.
> 
> 
> 
> Yikes. They are good looking, but . . . barbs and zip ties here.
> 
> Actually, a similar thing happened to one of my tubes. I forgot to zip-tie one fitting, and Tygon tubes . . . man, I tell you, they are durable and stay clear longer, but they lose their elasticity. Water leaked all the way down the side of the case, missing my HDDs.
Click to expand...

I accidentally spilled some while dismantling the blocks to install the Fujipoly. It managed to get under one of the memory chips. I cleaned it as best I could using a paper towel & hair dryer, then proceeded with installing the Fujipoly, put the block back, rebuilt the loop & leak tested, but I left the card to air dry for 24 hours. The next day the card was working just fine.


----------



## webhito

Quote:


> Originally Posted by *rdr09*
> 
> Yikes. They are good looking, but . . . barbs and zip ties here.
> 
> Actually, a similar thing happened to one of my tubes. I forgot to zip-tie one fitting, and Tygon tubes . . . man, I tell you, they are durable and stay clear longer, but they lose their elasticity. Water leaked all the way down the side of the case, missing my HDDs.


Yeah, I also had barbs but decided to go the compression-fitting way, not due to looks but due to availability here in Mexico (I was on 1/2" barbs and they were no longer available), plus the cost of the compression fittings from PPC was about the same as the barbs cost here, so I went that way. My tubing is Swiftech, not the transparent but the black stuff, available here really cheap. More than a tubing issue, I think the compression fitting was simply not tight enough.
Quote:


> Originally Posted by *kizwan*
> 
> I accidentally spilled some while dismantling the blocks to install the Fujipoly. It managed to get under one of the memory chips. I cleaned it as best I could using a paper towel & hair dryer, then proceeded with installing the Fujipoly, put the block back, rebuilt the loop & leak tested, but I left the card to air dry for 24 hours. The next day the card was working just fine.


I was lucky enough to notice the drops on the back of my card. No idea how long they had been there, but the liquid was not dry. Nothing serious, as the drops landed on the EK backplate (glad I purchased it, as I wasn't going to).


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> I accidentally spilled some while dismantling the blocks to install the Fujipoly. It managed to get under one of the memory chips. I cleaned it as best I could using a paper towel & hair dryer, then proceeded with installing the Fujipoly, put the block back, rebuilt the loop & leak tested, but I left the card to air dry for 24 hours. The next day the card was working just fine.


Bet your heart was pumping. I understand, Web; his rig is much more expensive than mine.


----------



## kayan

Quote:


> Originally Posted by *rdr09*
> 
> i used it when i had the 7950. the tubing were not cut to size, so it was pushing the block down a bit. i got it right with the 290, so i got rid of it . . .
> 
> 
> 
> crappy phone cam. i see you lauging. it worked.


That made me giggle, but it's a good idea


----------



## webhito

Quote:


> Originally Posted by *rdr09*
> 
> Bet your heart was pumping. I understand, Web; his rig is much more expensive than mine.


Sure was. Took me a long time to save up for it, too. It's something I really do not want to have to deal with; air might be hotter, but it sure is safer. The reason I put it all under water was the reference card's noise. The 290X is stupid loud compared to the non-reference cards now available and any other AMD card I had beforehand; it reminds me of a GTX 280 I had a few years back.


----------



## wrigleyvillain

Quote:


> Originally Posted by *rdr09*
> 
> Bet your heart was pumping. I understand, Web; his rig is much more expensive than mine.


Man I spill on my stuff all the time. Not powered on, of course, so never lost a thing. Though the worst is when I'm not sure if any got in my fan-up PSU.


----------



## bond32

Quote:


> Originally Posted by *wrigleyvillain*
> 
> Man I spill on my stuff all the time. Not powered on, of course, so never lost a thing. Though the worst is when I'm not sure if any got in my fan-up PSU.


This... I also spill stuff all the time. Knock on wood, never had a failure yet. I've even had a hose blow completely off a fitting on the CPU block before, and the board had power to it....

If you do spill anything, yank the power and dry it completely.

I even took the stock heatsinks off my 290's and washed them in the sink with dish soap. Works even better, plus all clean!


----------



## Durvelle27

Quote:


> Originally Posted by *Red1776*
> 
> Hey D,
> can ya throw a up a link to the U-block you purchased
> 
> Thanks


lol i don't think you'll like the one i got


----------



## Red1776

Quote:


> Originally Posted by *Durvelle27*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Hey D,
> can ya throw a up a link to the U-block you purchased
> 
> Thanks
> 
> 
> 
> lol i don't think you'll like the one i got
Click to expand...

What makes you say that? The U-blocks seem to be becoming really popular in a hurry, and they make sense upgrade-wise.

so....you're not gonna let me see it?







aw c'mon man! LOL


----------



## Durvelle27

omg my heart just dropped guys. I think my 290X just blew


----------



## cephelix

What?! What happened? Do you see any blown capacitors? Burn marks?


----------



## Red1776

Quote:


> Originally Posted by *Durvelle27*
> 
> omg my heart just dropped guys. I think my 290X just blew


Have you figured out whether it's a safety trip or actual damage? Did you have the VRM heatsinks on yet?


----------



## Durvelle27

Quote:


> Originally Posted by *cephelix*
> 
> What?! What happened? Do you see any blown capacitors? Burn marks?


After examining the PCB I don't see any damage
Quote:


> Originally Posted by *Red1776*
> 
> Have you figured out whether it's a safety trip or actual damage? Did you have the VRM heatsinks on yet?


I have no clue. I booted into Windows 8.1 and opened GPU-Z to monitor temps. The core was idling around 37°C and both VRMs read 30°C. I then tried to load the GPU to see what temps I'd get, then the screen artifacted green and flickered, then the PC shut down.


----------



## DeadlyDNA

Psu maybe?


----------



## Red1776

Quote:


> Originally Posted by *Durvelle27*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cephelix*
> 
> What?! What happened? Do you see any blown capacitors? Burn marks?
> 
> 
> 
> After examining the PCB I don't see any damage
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Have you figured if its a safety trip or damaged? did you have the VRM heatsinks on yet?
> 
> Click to expand...
> 
> I have no clue. I booted into Windows 8.1 and opened GPU-Z to monitor temps. The core was idling around 37°C and both VRMs read 30°C. I then tried to load the GPU to see what temps I'd get, then the screen artifacted green and flickered, then the PC shut down.
Click to expand...

Hmm. Green, eh? I'm assuming you had the U-block on? Did you have the memory modules heatsinked?

I ask because green artifacting and discolored pre-rendered scene layers (like the sky turning purple) can be memory overheating/failing.

Good luck. I hope it just needs to cool down or have the memory heatsinked.


----------



## Durvelle27

Quote:


> Originally Posted by *Red1776*
> 
> Hmm. Green, eh? I'm assuming you had the U-block on? Did you have the memory modules heatsinked?
> I ask because green artifacting and discolored pre-rendered scene layers (like the sky turning purple) can be memory overheating/failing.
> 
> Good luck. I hope it just needs to cool down or have the memory heatsinked.


I did the reference HSF mod for the VRAM and VRMs using the stock cooler backplate.


----------



## aneutralname

Quote:


> Originally Posted by *Red1776*
> 
> You need to read what I wrote if you are going to respond. I see a lot of members waxing on about powering two- or three-GPU systems with WC loops on 1000W-1200W PSUs (and OC'd at that). My point is I prefer (and think it is a very good idea) to have a good amount of headroom, like 25-30%, for that amount of draw. I thought that was a fairly obvious conclusion...my mistake.
> 
> Maybe the issue is that you are stuck in the ATI era. I have been building high end gaming rigs since 2007 mostly multi GPU rigs including a quadfire machine for my own personal use every 9 months or so.
> 
> 
> Both team red and green have had periods when they have excelled and some low spots but I have not had more problems with one or the other (and I work largely multi GPU setups) If AMD GPU's were so bad and troublesome, believe me I would have left AMD so fast my shorts would have had to catch the next train.
> That is a throw away line, When I have needed support from ATI...er AMD they have been responsive, respectful, and timely.


I did read what you wrote. No need to get overly defensive about it. I was clarifying, in case it wasn't understood, that wattage from the wall is not to be compared directly to PSU capacity.

You can have whatever preferences you like, but PSUs cost money. I'm currently running 2x 290 and a 3570K @ 4.5 GHz (pretty much the same system as Dasboogieman) with a Cooler Master Silent Pro 720W, and I have had zero power issues. I never even hear the fan of the PSU.

ATI/AMD has made no difference in regards to bugs and driver problems. A lot more games have compatibility problems of different sorts on AMD cards (about 1/3 of all games in my library run noticeably slower or have some other sort of problem on AMD, for example Watch Dogs and Train Simulator), and downsampling is barely possible on AMD cards, whereas it's pretty easy to achieve with Nvidia.


----------



## rdr09

Quote:


> Originally Posted by *aneutralname*
> 
> I did read what you wrote. No need to get overly defensive about it. I was clarifying, in case it wasn't understood, that wattage from the wall is not to be compared directly to PSU capacity.
> 
> You can have whatever preferences you like, but PSUs cost money. I'm currently running 2x 290 and a 3570K @ 4.5 GHz (pretty much the same system as Dasboogieman) with a Cooler Master Silent Pro 720W, and I have had zero power issues. I never even hear the fan of the PSU.
> 
> ATI/AMD has made no difference in regards to bugs and driver problems. A lot more games have compatibility problems of different sorts on AMD cards (about 1/3 of all games in my library run noticeably slower or have some other sort of problem on AMD, for example Watch Dogs and Train Simulator), and downsampling is barely possible on AMD cards, whereas it's pretty easy to achieve with Nvidia.


No one is forcing you to use AMD; there is another choice. Besides, you haven't joined the club.


----------



## Dasboogieman

Quote:


> Originally Posted by *aneutralname*
> 
> I did read what you wrote. No need to get overly defensive about it. I was clarifying in case it wasn't understood that wattage from the wall is not to be compared directly to PSU capacity.
> 
> You can have whatever preferences you like, but PSUs cost money. I'm currently running 2x 290 and a 3570k @ 4,5 GHZ (pretty much the same system as Dasboogieman) with a Cooler Master Silent Pro 720 Watts and I have had zero power issues. I never even hear the fan of the PSU.
> 
> ATI/AMD has made no difference in regards to bugs and driver problems. A lot more games have compatibility problems of different sorts on AMD cards (about 1/3 of all games in my library run noticeably slower or have some other sort of problem on AMD, for example Watch Dogs and Train Simulator), and downsampling is barely possible on AMD cards, whereas it's pretty easy to achieve with Nvidia.


You are very, very lucky with your PSU. Most PSU reviewers simply don't look at fan speeds when the PSU is continuously stressed to 80-90% capacity. That's why the 20-30% headroom recommendation exists: you can't be sure how a PSU's cooling behaves noise-wise, so you overprescribe the wattage to guarantee no hiccups when the heat (no pun intended) is on.

I agree with you about the downsampling. The 290 seems to have a lot of difficulty forcing SSAA in Crossfire mode; I haven't had issues with single-GPU operation. I believe it's related to the XDMA engine, as the SSAA-rendered frames are huge. I can't be sure until I see data from people who run Crossfire 290s on a HEDT platform (x16 PCIe 3.0 lanes for each GPU).

But coming from NVIDIA (I used to run GTX 570 SLI), I agree there is a definite performance edge on the green team, but the advantage doesn't last. NVIDIA GPUs are simply too optimized for the current and short term in hardware design (with the shortcomings mitigated via software), so they don't age very well: I went from being able to max out all settings including SSAA to having to use Medium settings over the span of 1.5 years of game releases, while my friends on AMD 6970 cards are still going fine.

For example, look at the design of the 290X compared to the 780Ti. Game generations tend to follow the trend geometry complexity -> shader complexity -> texture complexity. The 290X sacrifices significant texturing power (geometry complexity circa 2013 had pretty much stagnated due to console limitations, so the focus shifted to shader and texture complexity) and a small degree of peak memory bandwidth in favor of much stronger geometry and raster performance and more VRAM. So the 290X loses out in titles current at the GPU's release, but it is much better equipped for future titles and higher resolutions....once AMD driver support eventually catches up, of course.

It's basically the same comparison as the AMD 7970 vs the 680. You tell me which one was better and cheaper on release, and which one has aged better to this day.


----------



## kizwan

Quote:


> Originally Posted by *Durvelle27*
> 
> omg my heart just dropped guys. I think my 290X just blew


Did this happen right after you installed the U-block?


----------



## Durvelle27

Quote:


> Originally Posted by *kizwan*
> 
> Did this happen right after you installed the U-block?


Yes


----------



## FernTeixe

So the 680 is better than the 7970? Really? Almost every place I go, people just cite some wrong numbers... in almost every case I see a 280X/7970 with better frametimes and better performance.

Also, the 290X is not really up against the 780Ti... it was against the 780/Titan.


----------



## Red1776

Quote:


> Originally Posted by *aneutralname*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> You need to read what I wrote if you are going to respond. I see a lot of members waxing on about powering two- or three-GPU systems with WC loops on 1000W-1200W PSUs (and OC'd at that). My point is I prefer (and think it is a very good idea) to have a good amount of headroom, like 25-30%, for that amount of draw. I thought that was a fairly obvious conclusion...my mistake.
> 
> Maybe the issue is that you are stuck in the ATI era. I have been building high end gaming rigs since 2007 mostly multi GPU rigs including a quadfire machine for my own personal use every 9 months or so.
> 
> 
> Both team red and green have had periods when they have excelled and some low spots but I have not had more problems with one or the other (and I work largely multi GPU setups) If AMD GPU's were so bad and troublesome, believe me I would have left AMD so fast my shorts would have had to catch the next train.
> That is a throw away line, When I have needed support from ATI...er AMD they have been responsive, respectful, and timely.
> 
> 
> 
> I did read what you wrote. No need to get overly defensive about it. I was clarifying in case it wasn't understood that wattage from the wall is not to be compared directly to PSU capacity.
> 
> You can have whatever preferences you like, but PSUs cost money. I'm currently running 2x 290 and a 3570K @ 4.5 GHz (pretty much the same system as Dasboogieman) with a Cooler Master Silent Pro 720W, and I have had zero power issues. I never even hear the fan of the PSU.
> 
> ATI/AMD has made no difference in regards to bugs and driver problems. A lot more games have compatibility problems of different sorts on AMD cards (about 1/3 of all games in my library run noticeably slower or have some other sort of problem on AMD, for example Watch Dogs and Train Simulator), and downsampling is barely possible on AMD cards, whereas it's pretty easy to achieve with Nvidia.
Click to expand...

And no, I'm not defensive about it. I just don't like canned lines about red/green GPU comparisons. You certainly get to purchase whatever product you wish.

You have illustrated, though, that you do not understand what I was speaking of. If you are running 2x R9 290s and an OC'd 3570K @ 4.5GHz, you are demanding more than the rated watts of your PSU while gaming.

and here is what your PSU is aimed at:

Quote:Silent Preview


> Given the overall acoustic and efficiency curves, this CoolerMaster might be just about perfect for an often-used gaming system with a somewhat higher idle or low load power, game play loads of 200~500W, in a modern case with good intake venting for the PSU.


Your pair of R9 290s exceeds that by themselves. Add in your OC on them, your 4.5GHz OC'd 3570K, and whatever else you have attached to the CM 720, and you are over the rated watts of the PSU. You claim to have no issues in this regard but go on and on about your erratic R9 290s. Until you get adequate power to those GPUs (and the entire system), I really don't understand how you can make such a statement; overtaxing your PSU is a fundamental cause of system problems and instability.

At any rate, I didn't know if you had done the math and figured it might be useful.

Good luck.
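To put rough numbers on the headroom argument, here's a back-of-the-envelope sketch in Python. The draw figures are illustrative guesses (roughly 250-300W per loaded R9 290, ~150W for an overclocked quad-core, ~50W for the rest of the system), not measurements from anyone's rig:

```python
def headroom_pct(psu_rated_w, component_draws_w):
    """Percent of the PSU's rated capacity left over after the
    estimated full-system load (negative means overloaded)."""
    load_w = sum(component_draws_w)
    return 100.0 * (psu_rated_w - load_w) / psu_rated_w

# Illustrative draws: two R9 290s, an OC'd quad-core, fans/drives/board.
draws = [300, 300, 150, 50]

print(round(headroom_pct(720, draws), 1))   # -11.1 -> past the 720W rating
print(round(headroom_pct(1150, draws), 1))  # 30.4  -> inside the 25-30% band
```

By this estimate a 720W unit is already past its rating under combined load, while something in the 1100-1200W range lands in the 25-30% headroom band described above.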


----------



## Dasboogieman

Quote:


> Originally Posted by *FernTeixe*
> 
> So the 680 is better than the 7970? Really? Almost every place I go, people just cite some wrong numbers... in almost every case I see a 280X/7970 with better frametimes and better performance.
> 
> Also, the 290X is not really up against the 780Ti... it was against the 780/Titan.


Well, the 680 ranged from slightly worse to significantly better than Tahiti at 925MHz, depending on the benchmark. And yes, I think the 680 did have an edge initially, because NVIDIA were also quite aggressive in improving performance via drivers.
However, when that optimization effort moved on (basically when GK110 came around), Tahiti pulled ahead (partly also because AMD drivers were beginning to address the performance deficit).
Both GPUs had similar shader power, texturing, and ROPs. The only advantage Tahiti had was significantly better memory bandwidth and capacity; this paid real dividends in the years to come, while the 680 fell behind.

C'mon, comparing a fully enabled Hawaii against a GK110 that has 7-25% of its assets cut away is hardly an objective and fair architecture comparison.







Besides, even though 290X was supposed to be pitted against the 780/Titan, it's a happy coincidence that it is also competitive against the 780Ti.


----------



## cephelix

All this talk about PSU overloading got me thinking whether my X750 is sufficient. Looking at the power draw for an OCed CPU and GPU in overclocking reviews, I'm estimating that I'm around 700W before factoring in fans, pumps, and peripherals... along with capacitor aging (the PSU is about 4 years old at this point).

Do I have anything to worry about, or am I just being paranoid?


----------



## Dasboogieman

Well, I changed PSUs due to noise, not because mine was actually overloaded; my unit simply couldn't handle sustained 80% loads without running the fan at an ear-splitting RPM.
I mean, there are heaps of people here running Crossfired systems on 750-850W units (heaven forbid, I don't like cutting it that close, but that's just me) with no issues.

If you aren't getting symptoms like instability or other issues, then I don't think you should be worried.


----------



## motokill36

Quote:


> Originally Posted by *bond32*
> 
> Snagged a pretty good deal - got 2 more 290's with Hynix memory... Wow, I forgot how loud these boys are...
> 
> Still running my EVGA G2 1000 Watt psu strong with all three, but all three at stock settings. 4770k is still overclocked, but I know I am at the limits of the psu.


Do you know what it's pulling power-wise?
I want to try three, but on a 950-watt Corsair PSU.
Not sure I'll get away with it?


----------



## Sgt Bilko

So I was getting these random screen artifacts when I've been folding, right?
Had to reset to fix them. Well, I figured out the issue.

When I'm folding and I leave my monitors on, I don't have a problem, but when I turn them off for the night (as most would do), when I wake up my PC is frozen and I have the tearing/artifacts.

The kind that looks like someone put a rainbow through a woodchipper and spat it at your screen?

Anyways, thought that might help someone out if they had that issue.


----------



## cephelix

Quote:


> Originally Posted by *Dasboogieman*
> 
> Well, I changed PSUs due to noise, not because it was actually overloaded. Just simply because my unit can't handle sustained 80% loads without running the fan at an ear splitting RPM.
> I mean, there are heaps of people here running Crossfired systems on 750-850W units (heaven forbid I don't like cutting it that close but that's just me) with no issues.
> 
> If you aren't getting symptoms like instability or other issues then I don't think you should be worried.


So far nothing of the sort. I don't think I've ever heard, let alone seen, the PSU fan spinning....
Anyone else using AB 3.0.0 finding the GPU fan graph consistently reading 0%?


----------



## Dasboogieman

OK, the Helipal 8mm M2 screws arrived. Begin EK backplate mod.
First up, these screws are still too small for the Tri-X cooler; much closer than anything else I've tried, but maybe 0.25mm too small diameter-wise.
I swear Sapphire commissioned some kind of custom screw for their cooler.


Anyway, a little trick from dental school: if it doesn't fit but is very close, fill in the space with something and let friction do its magic.
Soooo.....duct tape to the rescue!!


Finished product. I used 1.5mm Phobya thermal tape for the VRM 1 area. Interesting factoids about this backplate:
1. They blocked a very important screw hole with their label, so I removed it with fire and steel (hair dryer and screwdriver).
2. There is extra metal near the VRM 1 area to help reduce the thickness of the thermal pad; they just got the area wrong and ended up assisting the area under the inductors. Nice try EK, you had the right idea, just the wrong execution.
Looking very sexay


I saw a flat heat generating area so I couldn't resist


All assembled. Excuse my cable routing, not my finest work. Yes, that is a poor ASUS Xonar STX sandwiched between a hot NH-D14 and a hotter AMD 290.


VRM1 Temps are now down by 5 degrees at stock


----------



## sugarhell

Quote:


> Originally Posted by *Dasboogieman*
> 
> well, the 680 ranged from slightly worse or significantly better than Tahiti at 925Mhz depending on the benchmark. And yes, I think the 680 did have an edge initially because NVIDIA were also quite aggressive in improving the performance via the drivers.
> However, when that optimization was removed (basically when GK110 came around), Tahiti pulled ahead (partly also because AMD drivers were also beginning to address the performance deficit).
> Both GPUs had similar Shader power, Texturing and ROPs. The only advantage Tahiti had was significantly better memory bandwidth and capacity, this paid real dividends in the years to come while the 680 fell behind.
> 
> Cmon, comparing fully enabled Hawaii against GK110 that has 7-25% of its assets cut away is hardly an objective and fair architecture comparison.
> 
> 
> 
> 
> 
> 
> 
> Besides, even though 290X was supposed to be pitted against the 780/Titan, it's a happy coincidence that it is also competitive against the 780Ti.


No. The 680 won on release because it had a 150-200 MHz clock advantage. Nothing else.


----------



## bond32

Quote:


> Originally Posted by *motokill36*
> 
> Do you know what its pulling power wise
> Want to try 3 but on a 950 watt corsair psu
> Not sure ill get away with it ?


Wouldn't recommend that. The only Corsair PSUs up to par quality-wise are the AX series. My system pulled about 897 watts, but that was all software readings.


----------



## BradleyW

Quote:


> Originally Posted by *bond32*
> 
> Wouldn't recommend that. The only corsair psu up to par with quality are the AX series. My system pulled about 897 watts but that was all software readings.


I thought the AX was the only series that was NOT up to par from Corsair?


----------



## Dasboogieman

Quote:


> Originally Posted by *BradleyW*
> 
> I thought the AX was the only series that was NOT up to par from Corsair?


When I did my PSU research, the AX series were supposedly top notch (or at least the AX1200 was). The OEM mostly does server/industrial designs; their only consumer client is Corsair.
Damn, all this discussion is making me regret not tanking the exorbitant shipping fee to get the completely overkill AX1500i. It's like passing up a Ferrari for a more pedestrian Toyota Camry. Sure, both get me from point A to point B, but.....


----------



## bond32

Quote:


> Originally Posted by *BradleyW*
> 
> I thought the AX was the only series that was NOT up to par from Corsair?


I could be wrong, not really familiar with them. I do know that in the 1300 watt bracket, the Corsair AX1200i performs really well compared to others. Check the OP; Shilka is the most knowledgeable about all PSUs.
Quote:


> Originally Posted by *Dasboogieman*
> 
> When I did my PSU research, the AX series were supposedly top notch (or at least the AX1200). The OEM mostly does server/industrial designs, their only consumer client is Corsair.
> Damn, all this discussion is making me regret not tanking the exorbitant shipping fee to get the completely overkill AX1500i. Its like passing up on getting a Ferrari in order to get a more pedestrian Toyota Camry, I mean sure, both get me from point A to point B but.....


How many cards you running? 1500 would be for quad-fire... 1200-1300 is plenty for trifire.
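The sizing rules of thumb being traded here can be sketched as a quick back-of-the-envelope calculation. All the wattage figures below are illustrative assumptions, not measurements (roughly 250 W per stock R9 290, 150 W for the CPU, 100 W for the rest of the rig); overclocked Hawaii cards can pull considerably more, so treat this as a floor, not a guarantee.

```python
# Back-of-the-envelope PSU sizing for a multi-GPU Hawaii build.
# All wattage defaults are rough assumptions, not measured values.

def estimate_draw(num_gpus, gpu_watts=250, cpu_watts=150, rest_watts=100):
    """Estimate peak DC draw in watts for num_gpus cards plus the rest of the system."""
    return num_gpus * gpu_watts + cpu_watts + rest_watts

def suggested_psu(draw_watts, headroom=0.20):
    """Suggest a PSU rating with ~20% headroom over the estimated peak draw."""
    return draw_watts * (1 + headroom)

trifire = estimate_draw(3)
print(trifire, suggested_psu(trifire))  # ~1000 W estimated, ~1200 W PSU suggested
```

With these assumed numbers, trifire lands right around the 1200-1300 W class mentioned above, while quadfire pushes toward 1500 W.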


----------



## Dasboogieman

Quote:


> Originally Posted by *bond32*
> 
> I could be wrong, not really familiar with them. I do know that in the 1300 watt bracket, the corsair ax1200i performs really well compared to others. Check the OP, Shilka is the most knowledgeable about all psu's.
> How many cards you running? 1500 would be for quad-fire... 1200-1300 is plenty for trifire.


Just 2 in CrossFire and a 3570K. I know 1500W is totally obscene, but call me old-fashioned: I think of PSUs like a good ATX case, the sort of thing you splurge on because you keep it for maybe a decade, moving from one build to the next. I was poor at the time, so the CM Silent Pro 1000W was the best I could afford, but alas, it hasn't stood the test of time. The AX1500i is definitely built with industrial time frames in mind. The G2 is a perfectly capable workhorse, but I absolutely loved the engineering that went into the AX1500i. Definitely worth the extra cost.


----------



## PureBlackFire

Back in the game folks. RMA for my Sapphire reference card came in today. They sent me a 290 Tri-X!


----------



## aneutralname

Quote:


> Originally Posted by *rdr09*
> 
> no one is forcing you to use amd. there is another choice. besides, you haven't join the club.


Ok, then. Let's see if I get a cookie.


Spoiler: Warning: Spoiler!



To be added on the member list please submit the following in your post

1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
2. Manufacturer & Brand - Please don't make me guess.
3. Cooling - Stock, Aftermarket or 3rd Party Water





http://www.techpowerup.com/gpuz/5xukx/
http://www.techpowerup.com/gpuz/wq8w7/

One of them is a Sapphire Tri-X OC. The other is an Asus reference card I bought cheap on the used market. The Tri-X is a 290X chip that's impossible to unlock because the card isn't compatible with any BIOS other than its own. ASIC quality 79.1%, Hynix memory. The Asus is a normal 290 chip. ASIC quality 70.8%, Elpida memory.

Cooling is stock for now.


----------



## rdr09

Quote:


> Originally Posted by *aneutralname*
> 
> Ok, then. Let's see if I get a cookie.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> To be added on the member list please submit the following in your post
> 
> 1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
> 2. Manufacturer & Brand - Please don't make me guess.
> 3. Cooling - Stock, Aftermarket or 3rd Party Water
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/5xukx/
> http://www.techpowerup.com/gpuz/wq8w7/
> 
> One of them is a Sapphire Tri-x OC. The other is an Asus reference I bought cheap from the used market. The Tri-X is a 290x chip that's impossible to unlock due to the card not being compatible with any other bios than its own. Asic quality 79.1%. Hynx memory. The Asus is a normal 290 chip. Asic quality 70.8%. Elpida memory.
> 
> Cooling is stock for now.


btw, i am not doubting you own these cards 'cause i know you are a miner. too bad mining took a dive and you might just be bitter. amd is not obligated to come up with a working driver for mining purposes. heck, they are struggling to keep pace with the games. lol. like i said, you do have a choice but i guess our choices are dictated by our purpose, budget, and so on.

as stated by the mining experts here in ocn, the 750Ti is now a better choice than the Hawaiis or any other amd card for mining. don't need to rep me.


----------



## aneutralname

I want it for both mining and games. Can't imagine a 750 Ti holding a candle to a 290 for those purposes.


----------



## rdr09

Quote:


> Originally Posted by *aneutralname*
> 
> I want it for both mining and games. Can't imagine any a 750 ti can hold a candle to 290 for those purposes.


How about the 780? Train Simulator? I play that with an HD 7770 just fine. Watch Dogs, that plays better with Nvidia.


----------



## Dasboogieman

Quote:


> Originally Posted by *aneutralname*
> 
> I want it for both mining and games. Can't imagine any a 750 ti can hold a candle to 290 for those purposes.


Well, maybe not in absolute raw hashrate, but in sheer energy efficiency.
According to this
http://gpus.afs1.netdna-cdn.com/nvidia-gtx-750-ti

The 750 Ti is at least 15-20% more efficient than the 290 on a hashrate-per-watt basis. In mining terms, this cuts your energy consumption by about 20% and increases your returns accordingly. Undervolt the 750 Ti and you will probably gain another 10-20%. There are also lower cooling demands and lower PSU demands.
The only thing you really lose is the space density the 290 can give you.

That being said, the 750ti is not gonna give you the gaming performance you want.
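The efficiency argument above boils down to hashrate per watt rather than raw hashrate. A minimal sketch, using rough scrypt-era ballpark figures that are assumptions for illustration only (not measured values for any specific card):

```python
# Hashrate-per-watt sketch comparing a 290 to a 750 Ti for scrypt mining.
# The kH/s and wattage numbers are rough illustrative assumptions.

cards = {
    "R9 290":     {"khs": 880, "watts": 290},
    "GTX 750 Ti": {"khs": 265, "watts": 60},
}

def efficiency(card):
    """Hashrate per watt of board power (kH/s per W)."""
    return card["khs"] / card["watts"]

for name, card in cards.items():
    print(f"{name}: {efficiency(card):.2f} kH/s per W")
# Raw hashrate favours the 290, but efficiency favours the 750 Ti,
# which is what drives down the power (and heat) cost per coin mined.
```

The same trade-off explains the space-density caveat: you need several 750 Tis (and the slots to hold them) to match one 290's raw output.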


----------



## chevy350

Here's my Diamond 290X with stock clocks, Kraken G10 with ThermalTake Water 3.0 Performer
http://www.techpowerup.com/gpuz/bqa6a/


----------



## aneutralname

Quote:


> Originally Posted by *Dasboogieman*
> 
> Well, maybe not in the absolute raw hashrate but the sheer energy efficiency.
> According to this
> http://gpus.afs1.netdna-cdn.com/nvidia-gtx-750-ti
> 
> The 750ti is at least 15-20% more efficient than the 290 on a hashrate/power basis. In mining terms, this reduces your energy consumption by 20% thus increasing your returns respectively. Now undervolt the 750ti and you will probably get another 10%-20%. Also less cooling demands, less PSU demands.
> The only thing you really lose is the space density that the 290 can give you.
> 
> That being said, the 750ti is not gonna give you the gaming performance you want.


Yep. And power consumption is not an issue for me; I have free electricity. I'm more concerned about the heat generated in my room when I mine. It's uncomfortable. But buying an A/C does not seem worth the money.
Quote:


> Originally Posted by *rdr09*
> 
> how about the 780? Trail simulator? i play that with a HD7770 just fine. Watchdog, that plays better with Nvidia.


The 780 is no good for mining, and my mobo does not support SLI, only CrossFire.

What do you mean Train Simulator plays "just fine"? I get very low FPS, 30ish on occasion.


----------



## nightfox

Quote:


> Originally Posted by *PureBlackFire*
> 
> Back in the game folks. RMA for my Sapphire reference card came in today. They sent me a 290 Tri-X!


Man, wish when I RMA my card they would give me upgraded cooler as well.

Congrats. and enjoy the new card.


----------



## fateswarm

Quote:


> Originally Posted by *Dasboogieman*
> 
> Well, maybe not in the absolute raw hashrate but the sheer energy efficiency.
> According to this
> http://gpus.afs1.netdna-cdn.com/nvidia-gtx-750-ti
> 
> The 750ti is at least 15-20% more efficient than the 290 on a hashrate/power basis. In mining terms, this reduces your energy consumption by 20% thus increasing your returns respectively. Now undervolt the 750ti and you will probably get another 10%-20%. Also less cooling demands, less PSU demands.
> The only thing you really lose is the space density that the 290 can give you.
> 
> That being said, the 750ti is not gonna give you the gaming performance you want.


I wonder if they counted the cost of needing extra motherboards and systems once the 750 Tis no longer fit, for some sizes of setups.


----------



## jerrolds

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yep, and i chose a lighter cooler that still manages to make the PCB sag and thus put stress not only on the PCB itself but also the PCIe slot.
> 
> Sorry if you missed that.
> 
> 
> Spoiler: Just leaving this here


Crazy - i ordered an XSPC backplate (no front) and used zipties to keep the pcb from sagging haha.


----------



## Red1776

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Wouldn't recommend that. The only corsair psu up to par with quality are the AX series. My system pulled about 897 watts but that was all software readings.
> 
> 
> 
> I thought the AX was the only series that was NOT up to par from Corsair?

The JG results are pretty much echoed widely all around.

Highest score until the AX1500i rolled around: exceptionally low ripple, excellent stability, and vastly underrated wattage.

The AX1200 is a Flextronics build, same as the AX1500i.

Quote: From Jonnyguru.com Corsair AX1200W Review


> Performance: 10
> Functionality: 9.5
> Value: 9.5
> Aesthetics: 10
> **Total Score: 9.8**
> 
> *Summary*
> 
> There is a whole lot to like about the new Corsair AX1200, and very little indeed to hate. Lots of power, lots of connectors, and all of those connectors come off if you want them to. And we can't forget about the efficiency, which just about outclasses everything else I've ever tested. Do you need any other reason to buy one? Let's face it - Corsair's new baby is made of awesome sauce, and I know you all want a taste. Stop reading my review and buy one, already!
> 
> *The Good:*
> 
> 
> among the highest efficiency I've tested to date
> extremely low ripple
> lots of connectors
> fully modular
> very stable
> 
> *The Bad:*
> 
> 
> somehow, they're going to have to top this one... eventually


----------



## motokill36

Quote:


> Originally Posted by *bond32*
> 
> Wouldn't recommend that. The only corsair psu up to par with quality are the AX series. My system pulled about 897 watts but that was all software readings.


It's an old Corsair TX950.
Thought they were an OK unit.


----------



## Durvelle27

Ok after reassembling Windows 8.1 no longer detects my R9 290X


----------



## Arizonian

Quote:


> Originally Posted by *PureBlackFire*
> 
> Back in the game folks. RMA for my Sapphire reference card came in today. They sent me a 290 Tri-X!
> 
> 
> Spoiler: Warning: Spoiler!


Congrats, nice little upgrade. Went to update you on the list and realized you never submitted an entry.







This is the closest I got, so I'm taking it. If you could do me a favor and just add a GPU-Z validation link, that would be great. Added.









Quote:


> Originally Posted by *aneutralname*
> 
> Ok, then. Let's see if I get a cookie.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> To be added on the member list please submit the following in your post
> 
> 1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
> 2. Manufacturer & Brand - Please don't make me guess.
> 3. Cooling - Stock, Aftermarket or 3rd Party Water
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/5xukx/
> http://www.techpowerup.com/gpuz/wq8w7/
> 
> One of them is a Sapphire Tri-x OC. The other is an Asus reference I bought cheap from the used market. The Tri-X is a 290x chip that's impossible to unlock due to the card not being compatible with any other bios than its own. Asic quality 79.1%. Hynx memory. The Asus is a normal 290 chip. Asic quality 70.8%. Elpida memory.
> 
> Cooling is stock for now.


Congrats - added











Quote:


> Originally Posted by *chevy350*
> 
> Here's my Diamond 290X with stock clocks, Kraken G10 with ThermalTake Water 3.0 Performer
> http://www.techpowerup.com/gpuz/bqa6a/
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *Durvelle27*
> 
> After examining the PCB I don't see any damage
> I have no clue. I booted into Windows 8.1 and opened GPU-Z to monitor temps. The core was idling around 37°C and the VRMs both said 30°C. I then tried to load the GPU to see what temps I'd get; then the screen artifacted green, flickered, and the PC shut down.


Did you figure out what is going on?


----------



## jerrolds

Quote:


> Originally Posted by *PureBlackFire*
> 
> Back in the game folks. RMA for my Sapphire reference card came in today. They sent me a 290 Tri-X!


Nice how long did the entire process take?


----------



## klepp0906

So guys.. I unlocked my 290s to 290X with a standard ATI reference BIOS. All was fine and dandy until I decided I HAD to have UEFI support. I mean, who ships a card these days without UEFI boot support anyhow, right?

Turns out - with the ASUS bios i'll also get the ability to use an even larger voltage range as well as a few other smaller bells/whistles.

I would much rather my BIOS match the vendor, but XFX threw that out along with the UEFI support, so having it match my mobo is the next best option.









Anyways, the reason I'm posting: A) I am having trouble telling the Uber and Quiet BIOSes apart (if someone could share links or numbers to identify them by, that would be great). As can be ascertained, I am an anal bastard; I want to flash the switches with the proper BIOS in the proper position etc.

More importantly, however - I just flashed to whichever ASUS BIOS tickled my fancy, and I don't know what went wrong, but when I rebooted I assumed all went well because I was able to get into the motherboard BIOS. I went in and disabled Secure Boot (so I could disable DEP real quick) and changed to UEFI instead of CSM.

After the save/exit of the BIOS, upon re-POST all I got was blaaackness...? No idea what happened or why... but my main issue now (as I'm posting from the backup/quiet BIOS) is figuring out how to flash the opposite BIOS from THIS BIOS, if that makes sense?

I have atiwinflash, which I used to change my 290 > 290X, and I am aware of the commands to flash each of the PCIe slots, but how do I flash the other chips from "this" side?

PLEASE PLEASE tell me there's another way, as opposed to actually having to pull and swap cards around?









Also, do you have any insight as to why I got a black screen?

Anywho - <3


----------



## PureBlackFire

Quote:


> Originally Posted by *nightfox*
> 
> Man, wish when I RMA my card they would give me upgraded cooler as well.
> 
> Congrats. and enjoy the new card.


thanks.
Quote:


> Originally Posted by *Arizonian*
> 
> Congrats, nice little upgrade. Went to update you on the list and realized you never submitted entry.
> 
> 
> 
> 
> 
> 
> 
> This is the closes I got, so I'm taking it. If you could do me a favor and just add a GPU-Z validation link would be great. Added


thanks. here you go: http://www.techpowerup.com/gpuz/6b88g/
Quote:


> Originally Posted by *jerrolds*
> 
> Nice how long did the entire process take?


just over a week. When my card arrived at Althon Micro they were quiet for 2 days, but their first reply was to let me know they were sending the Tri-X and give me the tracking info.


----------



## webhito

Hi again! Anyone here know what blank I need for this block posted below?

Decided on sticking with WC as I have already invested too much, but I will change a few things in the loop to make sure I never have a scare again.

Terminal is just like this one:


----------



## Durvelle27

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats, nice little upgrade. Went to update you on the list and realized you never submitted entry.
> 
> 
> 
> 
> 
> 
> 
> This is the closes I got, so I'm taking it. If you could do me a favor and just add a GPU-Z validation link would be great. Added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Did you figure out what is going on?


Read post above


----------



## kizwan

Quote:


> Originally Posted by *klepp0906*
> 
> So guys.. I unlocked my 290's to 290x with a standard ATI reference bios. All was fine and dandy until I decided I HAD to have UEFI support. I mean , who ships a card these days without UEFI boot support anyhow right?
> 
> Turns out - with the ASUS bios i'll also get the ability to use an even larger voltage range as well as a few other smaller bells/whistles.
> 
> I would much rather my bios match the vendor but xfx threw that out with the uefi support so now having it match my mobo is the next best option
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyways, the reason I'm posting. A) I am having trouble discerning between the Uber and Quiet bios' (if someone could share links/or numbers to identify by that would be great) as can be ascertained - I am an anal bastard. I want to flash the switches w/ the proper bios in the proper position etc.
> 
> More importantly however - I just flashed to whichever ASUS bios tickled my fancy and I don't know what went wrong, but I when I rebooted I assumed all went well because I was able to go into the bios. I went in and disabled secure boot (so I could disable DEP rq) and changed to UEFI instead of CSM.
> 
> After the save/exit of the bios - upon repost all I got was blaaackness...? No idea what happened or why... but my main issue now (as im posting from the backup/quiet bios) Is to figure out how to flash the opposite bios from THIS bios, if that makes sense?
> 
> I have atiwinflash which I used to change my 290>290x and I am aware of the commands to flash each of the pcie slots, however how do I do the other chips from "this" side?
> 
> PLEASE PLEASE tell me theirs another way as opposed to actually having to pull and swap around cards?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also if you have any insight as to why I got a black screen?
> 
> Anywho - <3


Are you using the PT1 VBIOS? Please go to this thread for the recovery procedure. I've never done this before, but I thought you could boot with the working/quiet BIOS, then switch to the bad/uber BIOS using the BIOS switch and re-flash with a working VBIOS. Whenever I want to flash a BIOS/VBIOS, I prefer doing it in DOS mode rather than in Windows; there's a higher chance of something going wrong in Windows.
Quote:


> Originally Posted by *webhito*
> 
> Hi again! Anyone here know what blank I need for this block posted below?
> 
> Decided sticking with wc as I have already invested too much but will change a few things in the loop to make sure I never have a scare again.
> 
> Terminal is just like this one:


Depends on whether that is parallel or serial terminal.

Blank for parallel terminal:-
http://www.ekwb.com/shop/blocks/vga-blocks/multiple-block-connectivity/fc-terminals/ek-fc-terminal-blank-parallel.html

Blank for serial terminal:-
http://www.ekwb.com/shop/blocks/vga-blocks/multiple-block-connectivity/fc-terminals/ek-fc-terminal-blank-serial.html


----------



## Red1776

If you are running three or four cards, I highly recommend the parallel terminal bridge (actually semi-parallel).

(I actually do put my money where my mouth is.)


----------



## wrigleyvillain

Quote:


> Originally Posted by *Dasboogieman*
> 
> Ok, Helipal 8mm M2 screws arrived. Begin EK backplate mod.


So you had an EK backplate and the "mod" helps VRM temps? Or you got the backplate to do the mod?

Sorry, just bought a used 290 so new to the thread.


----------



## Arizonian

Quote:


> Originally Posted by *Durvelle27*
> 
> Ok after reassembling Windows 8.1 no longer detects my R9 290X











Quote:


> Originally Posted by *Durvelle27*
> 
> Read post above


That was just seconds prior to my post.

Sorry to hear this.







Any other GPU lying around to test?


----------



## Durvelle27

Quote:


> Originally Posted by *Arizonian*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That was just seconds prior to my post.
> 
> Sorry to hear this.
> 
> 
> 
> 
> 
> 
> 
> . Any other GPU lying around to test?


Tried flipping the BIOS switch and every slot, and still nothing. And I do not, as I gave them all away to friends who wanted to lightly game. So I'm stuck with my iGPU HD 4600.


----------



## webhito

Quote:


> Originally Posted by *kizwan*
> 
> Blank for parallel terminal:-
> http://www.ekwb.com/shop/blocks/vga-blocks/multiple-block-connectivity/fc-terminals/ek-fc-terminal-blank-parallel.html
> 
> Blank for serial terminal:-
> http://www.ekwb.com/shop/blocks/vga-blocks/multiple-block-connectivity/fc-terminals/ek-fc-terminal-blank-serial.html


Thanks!


----------



## sibanez

Hello guys. Apologies if this has already been posted but I couldn't find anything in the run up to making my purchase! The EK R9 290X backplate IS compatible with the Gigabyte Windforce R9 290X OC. I took the plunge and bought 3 for my cards and all of the screws lined up perfectly. If anyone was put off the cards due to this, rejoice away









Here they are doing a marvellous job keeping my cards looking badass:


----------



## kahboom

Bought some XFX R9 290s this week; got two for 400.00 off eBay. Still had the plastic on them and came with all the stuff in the original boxes. Sadly they did not unlock. Looking for an XFX R9 290 non-X uber BIOS to flash on these, or if someone can just point me in the right direction. TechPowerUp doesn't have the BIOSes labeled as quiet or uber.


----------



## klepp0906

A) If they don't unlock, why do you want the BIOS?
B) TechPowerUp has the XFX ones and has them labeled uber/quiet.









Now if only they did the same for the Asus ones.


----------



## kahboom

Sorry about that, I actually need the non-X version R9 290 uber BIOS. Since I got them second-hand, they were already flashed with the 290X BIOS, but my motherboard would black screen with it, so I flashed the quiet version over it; now I just have two quiet versions.


----------



## bond32

I was pretty sure the 290 BIOS was actually the same on both switches... I thought the 290X was the only one with the "uber" BIOS, but I could be wrong. At any rate, did you check all the manufacturers at TechPowerUp? Cards like the Sapphire Tri-X are reference PCB.


----------



## Forceman

Quote:


> Originally Posted by *bond32*
> 
> I was pretty sure the 290 bios was actually the same on both switches... Thought the 290x was the only one with the "beer" bios, but I could be wrong. At any rate, did you check all the manufacturers at techpowerup? Cards like the sapphire tri-x are reference PCB.


You're right, on the 290 both sides are the same.


----------



## kahboom

Yeah, I've checked and they're not labeled on TechPowerUp. http://www.guru3d.com/articles_pages/radeon_r9_290_review_benchmarks,15.html They do have a quiet and an uber mode for the regular 290; it's just little to no difference, like 1-2 FPS, but it should help with throttling and maybe a higher overclock.


----------



## klepp0906

Quote:


> Originally Posted by *kizwan*
> 
> Are you using PT1 VBIOS? Please go to this thread for recovery procedure. I never done this before but I thought you can boot with working/quiet BIOS then switch to bad/uber BIOS using the BIOS switch & re-flash with working VBIOS. Whenever I want to flash BIOS/VBIOS, I prefer doing it in DOS mode than in Windows. There's higher chance something can go wrong in Windows.
> Depends on whether that is parallel or serial terminal.
> 
> Blank for parallel terminal:-
> http://www.ekwb.com/shop/blocks/vga-blocks/multiple-block-connectivity/fc-terminals/ek-fc-terminal-blank-parallel.html
> 
> Blank for serial terminal:-
> http://www.ekwb.com/shop/blocks/vga-blocks/multiple-block-connectivity/fc-terminals/ek-fc-terminal-blank-serial.html


What's the PT1 BIOS? I just used http://www.techpowerup.com/vgabios/148214/asus-r9290x-4096-131014.html afaik. I was going to temporarily flash that until someone could reply with a way to tell the Uber/Quiet versions apart, so I could flash the way I actually wanted.


----------



## Hl86

Thought my 290X was noisy; turns out my PSU dust filter was full and the PSU was making the noise :>


----------



## kahboom

Uber would have a higher PowerTune limit than stock and a higher fan limit or curve set by default. The PT1 BIOS has the lifted limits of an uber BIOS and also has the SSID changed to that of a regular 290. Some cards will still black screen if the vendor ID is different.


----------



## klepp0906

Quote:


> Originally Posted by *kahboom*
> 
> Uber would stay or have a higher powertune then stock and higher fan limit or curve set by default. PT1 bios has lifted limits of an uber bios, also has the ssid for the card changed to that of a regular 290. Some cards still will black screen if the vendor id is still different.


omg where can I find this hawt piece of work?


----------



## klepp0906

FTR - I'm a first-time AMD owner. My main rig has Titans, but I got fed up with Surround issues and figured now was a good time to try AMD.







Quite different as far as Catalyst vs nvcp goes that's for sure







The cards seem to run SMOOTHER than my Titans as far as microstutter goes, but that could be trifire vs. quad-SLI as well; couldn't say. 4 cards gets a bad rap though.

Anyhow, back on topic, that PT1 BIOS, mmmm. I know I'm asking a lot of you, but is it UEFI boot compliant? Or do you know? ;P

Found it - I should read more and talk less







sorry roflmao


----------



## kahboom

Click on the thread you linked for the R9 290 unlock club, then click Other R9 290X ROMs and it will appear to download.

PT3 has an offset+ value to counter vdroop, from what it says, so it adds more voltage than you are actually setting. I would avoid this BIOS; adding more voltage without knowing what your card is really running could blow some parts or have lasting effects.


----------



## klepp0906

Yeah, I'm good on no droop without waterblocks anyhow. XFX cards aren't the best, and who's to say how hardy the VRMs are or aren't.

With that said, should I give the PT1 or PT1T a try? Is the only difference the way it's interpreted by the mobo?


----------



## Kittencake

So excited to finally have something that's not so out of date


----------



## klepp0906

So those BIOSes still show up as ASUS, unfortunately.







They aren't vendor neutral BOO!

I tried the PT1 and the PT1T. Good news is the PC booted with both. Bad news is, my 3rd card isn't showing up at all now.

And seriously, do we have our very own .. Not only female overclocker..and not only hot female overclocker.. but hot Swedish female overclocker?

Or is that an avatar and I fail?







Where have you been all my life?


----------



## Kittencake

Norwegian, not Swedish -_- do not insult thy kitty, and no, it's not just an avatar


----------



## Dasboogieman

Quote:


> Originally Posted by *klepp0906*
> 
> yea im good on no droop without waterblocks anyhow. XFX cards aren't the best and who's to say how hearty the vrm's are or aren't.
> 
> With that said - should I give the PT1 or PT1T a try? is the only difference the way its interpreted by the mobo?


PT1T BIOS is safer, they are both functionally the same but the PT1T insulates you from the possibility of getting a blackscreen on boot.


----------



## wrigleyvillain

Quote:


> Originally Posted by *klepp0906*
> 
> And seriously, do we have our very own .. Not only female overclocker..and not only hot female overclocker.. but hot Swedish female overclocker?
> 
> Or is that an avatar and I fail?
> 
> 
> 
> 
> 
> 
> 
> Where have you been all my life?


Just...just don't.


----------



## Durvelle27

Quote:


> Originally Posted by *wrigleyvillain*
> 
> Just...just don't.


Lol


----------



## Kittencake

Quote:


> Originally Posted by *wrigleyvillain*
> 
> Just...just don't.


Lol im used to it but I'm spoken for anyway


----------



## Dasboogieman

Quote:


> Originally Posted by *wrigleyvillain*
> 
> So you had an EK backplate and the "mod" helps VRM temps? Or you got the backplate to do the mod?
> 
> Sorry, just bought a used 290 so new to the thread.


Both. I got an EK backplate but couldn't attach it to the Sapphire Tri-X cooler because of the funky Sapphire screws. So I got Helipal screws, modded them, and could then attach the backplate. The EK backplate is much better designed for cooling than any other I've seen, except maybe the Aquacomputer one; it's got relief and protrusions (mostly) where appropriate for thermal pads. That being said, I actually expected the performance gain to be a wee bit more, but oh well.

Then, once the backplate is mounted, I have a flat surface to attach more heatsinks to the VRM area. Earlier in this thread, I strapped an old NVIDIA GT 210 cooler to the VRM assembly on the backside and it dropped temperatures by about 5 degrees. I'm thinking of doing it again, so I expect maybe a 10-ish degree drop all up.


----------



## nightfox

Quote:


> Originally Posted by *klepp0906*
> 
> so those bios's still do show up as ASUS unfortunately
> 
> 
> 
> 
> 
> 
> 
> They aren't vendor neutral BOO!
> 
> I tried the PT1 and the PT1T. Good news is the pc booted with both. Bad is, my 3rd card isn't showing up period now.
> 
> And seriously, do we have our very own .. Not only female overclocker..and not only hot female overclocker.. but hot Swedish female overclocker?
> 
> Or is that an avatar and I fail?
> 
> 
> 
> 
> 
> 
> 
> Where have you been all my life?


Tsk tsk. You need to get out of your room and stop playing for a while. Go out and see the world rather than acting so childishly over an avatar, or over knowing there's a female around.


----------



## klepp0906

Quote:


> Originally Posted by *wrigleyvillain*
> 
> Just...just don't.


lol, let me tell ya... I've had game for, like, yeaaaaaaaaaaaaaaaaars. Alas, I'm married. Insta-fail. Aaaanyways, you're a rare breed. I'm just a dumb American (I assumed Norway and Sweden were the same ethnicity, for the most part).

Back to business!

Weirdest thing ever: 1 of my 3 cards is not being picked up after the flash. Did I already say that? Ugh. So I tried to flash again, and no matter what, with the BIOS switch flipped to one side I get 2 cards; flip it back to the other side, 3 cards.

All using the same bios (PT1 but have tried PT1T as well).

I'm stumped. Is there a way to add hardware, or have it search for hardware manually, in Windows 8.1? I would move the card to another slot, but... I'm out of slots.









What to do, what to do!


----------



## Durvelle27

Quote:


> Originally Posted by *Kittencake*
> 
> Lol im used to it but I'm spoken for anyway


I've never seen you around here before

Guys I need help


----------



## Kittencake

I was wondering what temps I will see with a ref sapphire 290x in a room with temps say 25-28c? It gets hot in the summer


----------



## klepp0906

Quote:


> Originally Posted by *Kittencake*
> 
> Lol im used to it but I'm spoken for anyway


That makes two of us. Such a shame! We could have made beautiful babies! Kings and queens, I bet!









But on a more serious note, I bet you are (used to it)! Women who fly airplanes are uncommon. Women in the armed forces, even. Women on a PC enthusiast website that are pre-150lbs and even somewhat attractive... that's RARE!

Stereotypical? Not if it's true!

I meant it as lighthearted as can be, though! Just in case humor is taken a bit differently on your side of the fence.







Oh, I forgot to add: women on an enthusiast forum WITH cutting-edge hardware... literally unheard of!
Quote:


> Originally Posted by *klepp0906*
> 
> Women on a pc enthusiast website that are pre 150lbs and even somewhat attractive.. that's RARE!


Was that as insensitive and rude as it sounded reading it back?









Quote:


> Originally Posted by *Durvelle27*
> 
> I've never seen you around here before
> 
> Guys I need help


I'm sure she means in general. And that makes two of us as well (help).

But it usually helps to post what you need help with.


----------




## devilhead

Quote:


> Originally Posted by *Kittencake*
> 
> I was wondering what temps I will see with a ref sapphire 290x in a room with temps say 25-28c? It gets hot in the summer


Hmm, with the original fan curve you will hit 95°C, but with a modified fan curve you can get 75°C (or even lower); the fan will spin louder, though.


----------



## Durvelle27

Quote:


> Originally Posted by *Kittencake*
> 
> I was wondering what temps I will see with a ref sapphire 290x in a room with temps say 25-28c? It gets hot in the summer


Depends on fan speed
Quote:


> Originally Posted by *klepp0906*
> 
> im sure she means in general. And that makes two of us as well (help)
> 
> 
> 
> 
> 
> 
> 
> But it usually helps to post what you need help with


What else can I try to see if my 290X still has life in it.


----------



## Kittencake

Quote:


> Originally Posted by *devilhead*
> 
> hmm, with original fan curve you will get 95C, but with modified fan curve you can get 75 C(or even lower), but the fan will spin louder


I'm not much bothered by fan noise; I have a set of Turtle Beach P11s and they're fairly decent at blocking noise.


----------



## klepp0906

Quote:


> Originally Posted by *nightfox*
> 
> tsk tsk. you need to get out of your room and stop playing for a while. You need to get out and see the world rather than just seeing an avatar, or knowing there's a female around, acting so childishly


And you need to remove that angle iron from your backside and learn how to have a little fun. Give me a break. Sigh. Or wait... let me guess, you're the "mature" and "chivalrous" one who swept in to save the day?

Pro tip: women don't mind being told they're attractive. Try it.

So now that the ship's sailed, can we move on?


----------



## klepp0906

Quote:


> Originally Posted by *Durvelle27*
> 
> Depends on fan speed
> What else can I try to see if my 290X still has life in it.


I apparently missed your earlier post. Is it not POSTing at all? Did it go dead after a flash, after an overclock, or neither?


----------



## nightfox

Quote:


> Originally Posted by *klepp0906*
> 
> and you need to remove that angle iron from your backside and learn how to have a little fun. Give me a break. Sigh. Or wait.. let me guess your the "mature" and "chivalrous" one who swept in to save the day?
> 
> Pro tip - women dont mind being called on being attractive. Try it.
> 
> So now that the ships sailed. Can we move on?


Haha, no problem. Let's not escalate any more. You're right, let's move on.







peace


----------



## Dasboogieman

Quote:


> Originally Posted by *Kittencake*
> 
> I was wondering what temps I will see with a ref sapphire 290x in a room with temps say 25-28c? It gets hot in the summer


Likely the full 95 degrees with the stock fan curve, maybe even a bit of clock-speed throttling too. It's OK though: the GPU can take 95 degrees continuously without issue, and the VRMs are also quite safe under the ref cooler. Depending on your chip, you may have to increase fan speed north of 60% to remove the clock-speed throttling. Basically, just increase the fan speed to whatever allows minimal throttling and doesn't kill your ears.
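That "raise the fan speed until the throttling stops" advice is essentially a steeper custom fan curve. A minimal sketch of how such a curve maps core temperature to fan duty; the breakpoints here are illustrative guesses, not AMD's stock values:

```python
# Piecewise-linear custom fan curve: core temp (deg C) -> fan duty (%).
# Breakpoints are illustrative, NOT AMD's stock curve.
CURVE = [(40, 30), (60, 45), (75, 65), (85, 80), (95, 100)]

def fan_duty(temp_c: float) -> float:
    """Interpolate fan duty between breakpoints, clamped at both ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
```

Tools like MSI Afterburner let you draw exactly this kind of curve graphically; the point is simply that duty ramps harder as the core approaches the throttle point, and a hotter room just moves you further up the same curve.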


----------



## Durvelle27

Quote:


> Originally Posted by *klepp0906*
> 
> I apparently missed your former post. Is it not posting at all? Go dead post flash or post overclock or neither?


It's on the previous page.


----------



## wrigleyvillain

Quote:


> Originally Posted by *Kittencake*
> 
> Lol im used to it but I'm spoken for anyway


No surprise there. On either thing.
Quote:


> Originally Posted by *klepp0906*
> 
> that makes two of us. Such a shame! we could have made beautiful babies! kings and queens i bet!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But on a more serious note. I bet you are! (used to it). Women who fly airplanes are uncommon. Women in the armed forces even. Women on a pc enthusiast website that are pre 150lbs and even somewhat attractive.. that's RARE!
> 
> Stereotypical? not if its true!
> 
> I meant it as light hearted as can be though! Just in case humor is taken a bit differently on your side of the fence
> 
> 
> 
> 
> 
> 
> 
> Oh.. I forgot to add - women on an enthusiast forum WITH cutting edge hardware.. Literally Unheard of!


You're doing it again. And you're also being sexist and a little demeaning to her even though I know you don't mean to be. Women, even young attractive ones, may not be common but are not at all "unheard of" on the high-end forums. Go look at Darlene's (IT Diva) builds in the Water Cooling Club Pic gallery. She is one of the best in the thread and that is saying a whole helluva lot. There is also another older woman at XS named Shazza who also does mind-blowing liquid cooled builds.
Quote:


> Originally Posted by *Dasboogieman*
> 
> Both, I got an EK Backplate but couldn't attach it to the sapphire Tri X cooler because of the funky sapphire screws. So I got Helipal screws, modded them, so I could now attach the backplate. The EK backplate is much better designed for cooling than any other I've seen except maybe the Aquacomputer one. Its got relief and protrusions (mostly) where appropriate for thermal pads. That being said, I actually expected the performance to be a wee bit more but oh well.
> 
> Then, once the backplate is mounted, I have a flat surface to attach more heatsinks in to the VRM area. Earlier on in this thread, I strapped an old NVIDIA GT 210 cooler to the VRM assembly on the backside and it dropped temperatures by about 5 degrees. I'm thinking of doing it again so I expect maybe a 10 ish degree drop all up.


Thanks +rep


----------



## Mega Man

Would also like to point out that IT Diva is amazingly smart/knowledgeable too! Builds her own PWM fan controllers, great with tools, etc.

Also, to throw into the mix: most of the women mechanics I know are some of the best I have ever met! Throughout my life I have seen most, if not all, of the men/women stereotypes shot full of holes. No matter what sex you are, we really all can be/are the same if you take away the outside features; that's one of the many reasons I love forums.

They take out the physical, leaving only the mental to compete, and IMO you really see people's true selves once behind a PC monitor!


----------



## Dasboogieman

Quote:


> Originally Posted by *Mega Man*
> 
> would also like to point out that i.t. diva, is amazingly smart/knowledgeable too ! builds her own pwm fan controllers, great with tools ect,
> 
> also to throw in the mix most of the woman mechs i know, are some of the best i have ever met ! through out my life i have seen most if not all of the men/women stereotypes with nothing but holes in them, no matter what sex you are, we really all can be/are the same if you take out the outside features, one of the many reasons i love forums,
> 
> takes out the physical leaving only the mental to compete, and imo you really see peoples true selfs once behind a pc monitor !~


There is nothing in this world scarier than a female DOTA 2 mid player. I can't count how many times I've had my behind handed to me by my good friend.


----------



## kizwan

Quote:


> Originally Posted by *klepp0906*
> 
> so those bios's still do show up as ASUS unfortunately
> 
> 
> 
> 
> 
> 
> 
> They aren't vendor neutral BOO!
> 
> I tried the PT1 and the PT1T. Good news is the pc booted with both. Bad is, my 3rd card isn't showing up period now.


You should be able to get it detected again if you switch back to the original/untouched BIOS. If that still fails, try this: shut down & disconnect the PCIe power cables from the third card. Leave it for 5-10 minutes, then plug the PCIe power cables back in & try again. You might want to re-seat the card if it's still not detected.


----------



## Kittencake

Quote:


> Originally Posted by *Dasboogieman*
> 
> Likely the full 95 degrees with the stock fan curve. Maybe even a bit of clockspeed throttling too. Its ok though, the GPU can take 95 degrees continuously without issue, the VRMs are also quite safe due to the ref cooler. Depending on your chip, you may have to increase fan speed north of 60% to remove the clockspeed throttling. Basically, just increase the fan speed to whatever allows minimal clockspeed throttling and doesn't kill your ears.


K, I'm just playing it safe. I don't think it will get that hot, hopefully, since it's my first summer in my new place. I'll also be moving the PC from the loft to downstairs, away from the heat.


----------



## kizwan

Quote:


> Originally Posted by *Kittencake*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> Likely the full 95 degrees with the stock fan curve. Maybe even a bit of clockspeed throttling too. Its ok though, the GPU can take 95 degrees continuously without issue, the VRMs are also quite safe due to the ref cooler. Depending on your chip, you may have to increase fan speed north of 60% to remove the clockspeed throttling. Basically, just increase the fan speed to whatever allows minimal clockspeed throttling and doesn't kill your ears.
> 
> 
> 
> 
> 
> 
> 
> K i'm just playing it safe I don't think it will get that hot hopefully , cause its my first summer in my new place I'll also be moving the pc from the loft to downstairs as well away from the heat

What's the ambient temp in summer? Summer over here goes from 32 to 37°C, and it's summer all year round. You should be fine with a custom fan curve. I remember playing BF4 with stock cooling, stock fan curve, no A/C, at stock clocks, and it ran without any problem at all. It throttles because of 95°C on the core, but no problem otherwise. I don't know which card you have, but with the fan at 50% it shouldn't go over 80°C on the core (maybe into the 80s Celsius depending on ambient) at stock clocks.


----------



## Kittencake

Sapphire 290X, I just ordered it. Right now I'm on a dinky, lousy Asus 9500GT 1GB card till it arrives; my 7950's fan crapped out, though the 7950 card itself is OK.


----------



## kahboom

Went from a pair of 7950s to two 290s for just 1080p gaming. Now to wait and see how FreeSync turns out before getting another monitor or TV.


----------



## anubis1127

Quote:


> Originally Posted by *kahboom*
> 
> Went from a pair of 7950s too two 290s for just 1080p gaming. Now to wait to see how freesync turns out before getting another monitor or TV.


Overkill FTW!!


----------



## Sgt Bilko

Quote:


> Originally Posted by *anubis1127*
> 
> Overkill FTW!!


When you want 200% res scaling then not so much


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *anubis1127*
> 
> Overkill FTW!!
> 
> 
> 
> When you want 200% res scaling then not so much

200% res scaling FTW!!!


----------



## Mega Man

Quote:


> Originally Posted by *kahboom*
> 
> Went from a pair of 7950s too two 290s for just 1080p gaming. Now to wait to see how freesync turns out before getting another monitor or TV.


You do know how long that has been out, right?

It's part of the current DP standard.


----------



## anubis1127

Quote:


> Originally Posted by *Sgt Bilko*
> 
> When you want 200% res scaling then not so much


Overkill FTW!!


----------



## kahboom

Haven't really followed it or paid much attention. How does it seem to work with the R9 cards? What's this 200% scaling? Is it something I can do with a 1080p monitor/TV?


----------



## nightfox

Quote:


> Originally Posted by *kahboom*
> 
> Haven't really followed it or payed much attention. How does it seem to work with the r9 cards? What's this 200% scaling? Is it something I can do with a 1080p monitor/TV?


BF4 settings


----------



## Sgt Bilko

Quote:


> Originally Posted by *kahboom*
> 
> Haven't really followed it or payed much attention. How does it seem to work with the r9 cards? What's this 200% scaling? Is it something I can do with a 1080p monitor/TV?


BF4, Medal of Honour: Warfighter and a few others have the option


----------



## the9quad

Quote:


> Originally Posted by *Mega Man*
> 
> you do know how long that has been out right?
> 
> part of the current DP standard


I don't think a FreeSync monitor has been released yet, and although it's part of the standard, I believe it will still be some time before we see monitors implementing it fully.


----------



## kahboom

So is the scaling an in-game setting or a CCC setting? Never really noticed or looked for it. I will have to check that out this weekend, as well as editing the BIOS on these newer cards. I was looking at the new water block with the backplate cooling, but it's $200 for the pair per card. Is there any other block that's comparable for overclocking and gets decent VRM temps, or is that it?


----------



## Raephen

Quote:


> Originally Posted by *the9quad*
> 
> I don't think a FreeSync monitor has been released yet, and I believe although it is part of the standard, it will still be some time before we see monitors implementing the standard fully.


If I'm not mistaken, Freesync will be supported by DP 1.2*a*.

I don't know if there are any monitors with this newer version of displayport out yet.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kahboom*
> 
> So is the scaling a in game setting of ati CCC setting. Never really noticed or looked for it. I will have to check that out this weekend. As well as editing the bios on these newer cards. Was looking at the new water block with the backplate cooling but $200 for the pair per card. Is there any other block comparable for over clocking and that gets decent vrm temps or is that it?


It's an in-game option.

Some games have it and others don't. It's great as a substitute for AA sometimes.
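For anyone wondering why two 290s stop being overkill here: in BF4-style resolution scaling the percentage applies to each axis, so the rendered pixel count grows with the square of the setting. A quick sketch of the arithmetic (the function name is mine, not from any game):

```python
def scaled_pixels(width: int, height: int, scale_pct: int) -> int:
    """Pixels actually rendered when the scale percentage applies per axis."""
    s = scale_pct / 100
    return round(width * s) * round(height * s)

# 200% at 1920x1080 renders a 3840x2160 buffer: a 4K-sized workload.
```

So "just 1080p" at 200% scale is really a 4K rendering load downsampled to the monitor, which is why it doubles as anti-aliasing.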


----------



## Durvelle27

Does anyone have a spare R9 290X cooler?


----------



## gotmilkmang

i have a reference cooler just chilling here


----------



## chronicfx

Quote:


> Originally Posted by *the9quad*
> 
> I don't think a FreeSync monitor has been released yet, and I believe although it is part of the standard, it will still be some time before we see monitors implementing the standard fully.


Just out of curiosity: if you use Afterburner with your trifire to monitor GPU usage, does your usage graph bounce up and down for the three cards, or do all cards stay near 100% usage, or at least at some stable percentage? With one card I am pegged at 100%; with two I am at 100% more often than not, but the GPU usage bounces around at some points. With three, it's like a three-year-old tried to color in all my graphs. I will screenshot it later on for you.


----------



## fateswarm

When do VRM temps on the 290s become a real practical problem? I noticed MOSFET specs usually show them capable of going up to 130°C, after which they rapidly lose current capacity and eventually switch off at ~150°C. Though I suspect that might not be safe for the surrounding components.
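The datasheet behaviour described above (full rated current up to ~130°C, then a rapid fall-off to shutdown near 150°C) can be sketched as a toy linear derating model. The numbers are illustrative, not from any specific MOSFET datasheet:

```python
# Toy MOSFET derating model: full rated current up to T_FULL,
# linear fall-off, zero output at T_OFF. Illustrative numbers only.
T_FULL, T_OFF = 130.0, 150.0

def current_capacity(temp_c: float, rated_a: float) -> float:
    """Available current (A) at a given case temperature."""
    if temp_c <= T_FULL:
        return rated_a
    if temp_c >= T_OFF:
        return 0.0
    return rated_a * (T_OFF - temp_c) / (T_OFF - T_FULL)
```

The practical point is that capacity collapses over a narrow 20-degree window, which is one reason people keep VRM temps well under 100°C rather than flirting with the absolute limits.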


----------



## anubis1127

It becomes a problem when the card starts throttling performance.


----------



## cephelix

It should be around 95-100°C, right?


----------



## rdr09

Quote:


> Originally Posted by *cephelix*
> 
> it should be around 95-100 deg right?


Ignoring what AMD said, I would not let them go past 90°C; 80°C would be ideal. I'd monitor VRM1 more than VRM2.


----------



## Dasboogieman

Quote:


> Originally Posted by *fateswarm*
> 
> When do vrm temps on the 290s become a real practical problem? I noticed specs of mosfets usually show them capable of going up to 130C, after which point they rapidly diminish current capacity and then eventually switch off at ~150C. Though that might not be safe for surrounding components I suspect.



Well, I've seen a few very isolated cases of VRMs blowing from people who've incorrectly installed Accelero cooling units. I can only guess, but sustained temperatures above 100 degrees are not safe. There is obviously engineering headroom built in, but the surrounding components will start to deteriorate (like the micro surface-mount capacitors on the backside or the buffer solid capacitors nearby). I mean, the VRMs are metal-packaged (which, btw, is very expensive but extremely heat resistant); however, the support components are still plastic-packaged, and IIRC solid caps are only rated for up to 100 degrees (maybe they can survive 120).

Just look at the AMD reference cooler to see which components should be at what temperature. AMD prioritized the VRM1 assembly by putting its heatsink directly beneath the blower, so it has the best temperature differential with ambient of any component (including the core). I think the VRM temperatures on the reference cooler range from 60-80 degrees depending on fan speed.

In general, the VRMs are absolutely fine at stock; the GPU won't draw enough current at normal voltages to stress the assembly. However, because it's only a 5+1+1 phase design, temperatures spike really fast with overvoltage.


----------



## heroxoot

Quote:


> Originally Posted by *cephelix*
> 
> it should be around 95-100 deg right?


If you mean VRM1, I seriously never see mine go much past 70. The GPU stays below 80°C at stock.


----------



## cephelix

Quote:


> Originally Posted by *heroxoot*
> 
> If you mean VRM1, I seriously never see mine go past 70 much. GPU stays below 80c on stock.


With overclocks, the highest my VRM1 went was 76... that is, of course, without overvolt mods...
Quote:


> Originally Posted by *rdr09*
> 
> ignoring what amd said, i would not let them go past 90C. 80C will be ideal. i'd monitor VRM1 more than VRM2.


Yeah, if anything were to overheat, it'd be VRM1 rather than VRM2...


----------



## Durvelle27

Quote:


> Originally Posted by *gotmilkmang*
> 
> i have a reference cooler just chilling here


Are you willing to part with it?


----------



## gotmilkmang

Yea man, I sent you a PM.


----------



## ImGunz

Looking for a waterblock for my MSI R9 290 Twin Frozr. I've got the EK FC R9-290X waterblock right now, but it doesn't fit and I'm gonna have to return it. My question is: which block will fit? The one I got didn't have cutouts for the components marked in red in the picture.


http://imgur.com/fEI3voE


----------



## cephelix

Quote:


> Originally Posted by *ImGunz*
> 
> Looking for a waterblock for my MSI R9 290 Twin Frozr. I got the EK - FC R9-290X waterblock right now, but it doesn't fit and im gonna have to return it. My question is which block will fit? The one i got didn't have cut outs for the things in the picture marked with red
> 
> 
> http://imgur.com/fEI3voE


Did you get the rev 2.0 version?


----------



## Gabkicks

Besides Battlefield 4, what other games really show off crossfire performance gains? I have the betas/alphas of a few games and they don't run very well on AMD; I see no gains from crossfire, and worse performance than my 780, in those beta/alpha games :/


----------



## ImGunz

Yes it's the 2.0 version according to the company that i got it from.


----------



## anubis1127

Quote:


> Originally Posted by *Gabkicks*
> 
> Besides battlefield 4, what other games really show off crossfire performance gains? I have the beta/alphas of a few games and they don't run very well with AMD. I see no gains from crossfire and worse performance than my 780 in those beta/alpha games :/


Crysis 3 had good CFX scaling IIRC.


----------



## Red1776

Quote:


> Originally Posted by *Gabkicks*
> 
> Besides battlefield 4, what other games really show off crossfire performance gains? I have the beta/alphas of a few games and they don't run very well with AMD. I see no gains from crossfire and worse performance than my 780 in those beta/alpha games :/


Almost all major titles.

I have a folder of CPU/GPU graphs with 4 GPUs demonstrating it.


----------



## the9quad

A lot of newer titles are terrible in CFX, or it doesn't work at all. I imagine it's the same on the green team as well: Watch Dogs, Wolfenstein, Rust, DayZ, Titanfall, etc. A lot of other titles work great though: pretty much any Frostbite game, Tomb Raider, Hitman, Crysis 3, etc.


----------



## pdasterly

F1, GRID, Need for Speed, Batman, Metro


----------



## DeadlyDNA

Quote:


> Originally Posted by *Gabkicks*
> 
> Besides battlefield 4, what other games really show off crossfire performance gains? I have the beta/alphas of a few games and they don't run very well with AMD. I see no gains from crossfire and worse performance than my 780 in those beta/alpha games :/


Betas/alphas typically don't have support for multi-GPU, so that's not going to help much. Are you just eyeballing it, or actively benching in a fashion that can back up your claim? Not that I doubt your observation, but a lot of factors are involved. What I mean is, things like "the game feels smoother" are subjective; with a more stable frame rate it could feel faster or slower depending on the person.

Most of the GPU scaling testing I have done shows that 2x GPUs is the sweet spot for the R series. Even games that don't scale well usually still scale with two cards.
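To put a number on it rather than eyeballing: scaling efficiency is just multi-GPU FPS divided by (single-GPU FPS × card count). A minimal sketch, with made-up FRAPS averages rather than real benchmark data:

```python
def scaling_efficiency(single_fps: float, multi_fps: float, n_gpus: int) -> float:
    """Fraction of ideal linear scaling achieved (1.0 = perfect)."""
    return multi_fps / (single_fps * n_gpus)

# Hypothetical averages from the same repeatable benchmark run:
# 60 FPS on one card, 102 FPS on two cards -> 85% of ideal scaling.
```

Anything near 1.0 means the second card is pulling its weight; a value near 0.5 means crossfire is effectively doing nothing for that title.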


----------



## rdr09

Quote:


> Originally Posted by *Gabkicks*
> 
> Besides battlefield 4, what other games really show off crossfire performance gains? I have the beta/alphas of a few games and they don't run very well with AMD. I see no gains from crossfire and worse performance than my 780 in those beta/alpha games :/


not sure if this is still relevant . . .

http://www.hardocp.com/article/2014/01/26/xfx_r9_290x_double_dissipation_edition_crossfire_review/8

Is it for the i7 920 in your sig? Might be a good idea to stick with just one 290.


----------



## Dire Squirrel

Quote:


> Originally Posted by *ImGunz*
> 
> Yes it's the 2.0 version according to the company that i got it from.


It should say so on the box unless I'm very much mistaken.


----------



## Gabkicks

Quote:


> Originally Posted by *rdr09*
> 
> not sure if this is still relevant . . .
> 
> http://www.hardocp.com/article/2014/01/26/xfx_r9_290x_double_dissipation_edition_crossfire_review/8
> 
> is it for your sig - i7 920? might be a good idea to stick with just one 290.


I am upgrading to a 4790K whenever they're finally available. Not sure which Z97 board to go with yet; I didn't think it would make much of a difference, but I could be wrong.
Quote:


> Originally Posted by *DeadlyDNA*
> 
> Beta/alphas typically don't have support for multi-gpu so that's not going to help much. Are you just eyeballing it or actively benching in a fashion that can backup your claim. Not that i doubt your observation but a lot of factors are involved. What i mean is things like "game feels smoother" is subjective to users, and with a more stable frame rate it could feel faster, or slower depending on the person.


Yeah, I did some repeatable benchmark runs using FRAPS and there was barely any difference between single card and crossfire in the two games I'm thinking of. I'll try out the games mentioned above.


----------



## Red1776

Quote:


> Originally Posted by *pdasterly*
> 
> F1, grid, need 4 speed, batman, metro


I find it odd when people comment that CF does not work with "many, most, or a lot of titles", along with "few games use more than two cores".

F1 works with CF; I will get my graphs out for this one.

Batman works with CF: http://www.techpowerup.com/reviews/AMD/R9_295_X2/1.html

Metro? I don't know which one you are referring to, but 2033 is about as in love with more GPUs as any title ever produced, and Last Light continues that: http://www.techpowerup.com/reviews/AMD/R9_295_X2/1.html

TPU is known for benching a whole lot of titles in its GPU reviews, from Assassin's Creed to World of Warcraft.

It's pretty tough to find titles that don't run CF. I will leave this here, as TPU lists the info on the driver used, etc.:

http://www.techpowerup.com/reviews/AMD/R9_295_X2/1.html


----------



## ImGunz

Quote:


> Originally Posted by *Dire Squirrel*
> 
> It should say so on the box unless I'm very much mistaken.


Yes, it does say so on the box. I didn't notice when I got it; I just assumed it would be.


----------



## rdr09

Quote:


> Originally Posted by *Gabkicks*
> 
> I am upgrading to 4790k whenever they finally are available. Not sure which z97 board to go with yet and I didn't think it would make much of a difference but I could be wrong.
> Yeah I did some repeatable benchmark runs using fraps and there was barely any difference between single and crossfire in the 2 games i am thinking of. I'll try out the games mentioned above.


get 3 290s.


----------



## aneutralname

Quote:


> Originally Posted by *pdasterly*
> 
> F1, grid, need 4 speed, batman, metro


NFS Rivals runs much worse with crossfire.

Serious Sam actually runs very well with crossfire. But for the most part, crossfire is a waste.


----------



## Kittencake

I can't wait to try crossfire.

Hopefully within a month or two there will be some major driver improvements for it.


----------



## Pheatton

So I made the mistake of letting the GPU Tweak tool do a VBIOS update. Well, now my primary card shows just a black screen. Any way to reflash the card using an older VBIOS?

I'm running two Asus DirectCU II R9 290s. The second card is fine.


----------



## Durvelle27

Quote:


> Originally Posted by *Kittencake*
> 
> I can't wait to try crossfire ,
> 
> 
> 
> 
> 
> 
> 
> hopefully withing a month or 2 there will be some major improvements in drivers for it


Now is a better time than ever. I see 3x R9 290s going for $225 shipped each.


----------



## Forceman

Quote:


> Originally Posted by *Pheatton*
> 
> So I made the mistake of letting the GPU Tweak tool do a VBIOS update.. Well know my primary card shows just a black screen. Anyway to reflash the card using an older VBIOS?
> 
> I'm running two Asus DirectCU II R290s. The second card is fine.


Download the BIOS from the working card with GPU-Z, then use the DOS ATIFlash tool to flash the black-screened card. Pretty easy, actually. You can find the flash tool in this thread:

http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/0_30


----------



## Kittencake

Well, I just got one 290X...

...and it hasn't arrived yet. That was a bit expensive in itself.


----------



## fateswarm

I'd prefer one good 290X over used eBay 290s that were mining coins.


----------



## pdasterly

Quote:


> Originally Posted by *aneutralname*
> 
> NFS rivals runs much worse with crossfire.
> 
> Serious Sam runs actually very well with crossfire. But for the most part crossfire is a waste.


You're right, NFS is capped at 30fps.
Crossfire isn't a waste if you run high resolutions. I game at eyeFUNity resolutions.


----------



## cephelix

Quote:


> Originally Posted by *ImGunz*
> 
> Yes it's the 2.0 version according to the company that i got it from.


That's weird; EK says the rev 2.0 version is compatible with the MSI R9 290 Gaming.


----------



## ImGunz

I will show you some pictures when I get home, but it doesn't seem to lie flat on the card; it wobbles from side to side, and I noticed, like in the picture,


http://imgur.com/fEI3voE

that it's touching because there's no cutout in the waterblock for those parts. But I might be wrong about that; it's just what I saw when trying to mount it quite a few times.


----------



## kizwan

Quote:


> Originally Posted by *cephelix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ImGunz*
> 
> Yes it's the 2.0 version according to the company that i got it from.
> 
> 
> 
> that is weird, EK says the rev 2.0 version is compatible with the msi r9 290 gaming

Unless I'm mistaken, the card in the picture ImGunz posted is different from the MSI Gaming card.


----------



## Pheatton

Quote:


> Originally Posted by *Forceman*
> 
> Download the BIOS from the other card with GPU-Z, and then use the DOS Atiflash tool to flash the second card. Pretty easy, actually. You can find the flash tool in this thread:
> 
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/0_30


That will work with both BIOSes?


----------



## aaroc

All recent games that I tried, except for F1 2012/2013, worked a lot better with CFX on a multiple-monitor setup (3x 2560x1440).
Red, can you share your configuration for the F1 games in CFX? That is the only game where I could not gain FPS using CFX; it's still playable with one R9 290 on a multiple-monitor setup.
And the stuttering I had with 2x 7870s in CFX is gone with the R9 290 and new drivers.


----------



## aneutralname

Quote:


> Originally Posted by *pdasterly*
> 
> Your right, nfs is capped at 30fps.
> Crossfire isnt a waste if you run high resolutions. I game at eyeFUNity resolutions


It can be unlocked to 60fps. Still, enabling crossfire caused a whole lot of problems for me.


----------



## Dasboogieman

I found that Tomb Raider scales really nicely with crossfire (I can actually use SSAA, though 4x is a smidge too much; I'd need 290X crossfire or at least 1150MHz), but no surprise since it's an AMD co-developed game.

Another really crossfire-friendly title is Skyrim with all the enhancement and texture mods. A single 290 doesn't quite have the texturing power to run the latter smoothly, but it certainly has the VRAM.

Watch Dogs and Wolfenstein run like total crap on AMD. I blame sloppy coding on the devs' side. I find it extremely difficult to believe the Watch Dogs developers' claim that the poor performance is due to lack of VRAM; the game looks like a Saints Row 3 clone (no kidding) on ultra. Sure, it runs great on the 780 Ti, but with massive framerate drops when the VRAM buffer is breached; on AMD it runs at a steady, craptastic 45-50 FPS, just without the huge stutters.

Shogun 2 also really likes crossfire 290s; even though the game can only address 3072MB, it scales quite well. Far Cry 3 works quite well too, though much choppier than I would like, nowhere near the same smoothness as Tomb Raider.
Quote:


> Originally Posted by *ImGunz*
> 
> Looking for a waterblock for my MSI R9 290 Twin Frozr. I got the EK - FC R9-290X waterblock right now, but it doesn't fit and im gonna have to return it. My question is which block will fit? The one i got didn't have cut outs for the things in the picture marked with red
> 
> 
> http://imgur.com/fEI3voE


Exactly which model is your AMD 290? If it is indeed the Gaming Edition, it seems MSI have changed the PCB layout since it was released. I'm looking at the TechPowerUp review and your cluster of capacitors isn't there.


----------



## pdasterly

Wolfenstein is awesome; too bad Crossfire doesn't work. I have to disable XFire just to get a decent FPS, and it's capped @ 60fps. I get driver issues also; sometimes the FPS is so low it's unplayable, but I'll restart the system and it runs fine, even on three monitors. One card blows through this game.


----------



## anubis1127




----------



## Durvelle27

Quote:


> Originally Posted by *Dasboogieman*
> 
> I found that Tomb Raider scales really nicely with Crossfire (I can actually use SSAA, but x4 is a smidge too much; you'd need 290X Crossfire or at least 1150MHz), but no surprise since it's an AMD co-developed game.
> 
> Another really Crossfire-friendly title is Skyrim with all the enhancement and texture mods. A single AMD 290 doesn't have quite the texturing power to run the latter smoothly, but it certainly has the VRAM.
> 
> Watch Dogs and Wolfenstein run like total crap on AMD. I blame sloppy coding on the devs' side. I find it extremely difficult to believe the Watch Dogs developers' claim that the poor performance is due to lack of VRAM; the game looks like a Saints Row 3 clone (no kidding) on ultra. Sure, it runs great on the 780 Ti, but with massive framerate drops when the VRAM buffer is breached; on AMD it runs at a steady craptastic 45-50FPS, just without the huge stutters.
> 
> Shogun 2 also really likes Crossfire 290s; even though the game can only address 3072MB, it scales quite well. Far Cry 3 works quite well too, though much choppier than I would like, nowhere near the same smoothness as Tomb Raider.
> Exactly which model is your AMD 290? If it is indeed the Gaming Edition, it seems MSI have changed the PCB layout since it was released. I'm looking at the TechPowerUp review and your cluster of capacitors isn't there.


Watch Dogs ran fine on my single 290X


----------



## Dasboogieman

Quote:


> Originally Posted by *Durvelle27*
> 
> Watch Dogs ran fine on my single 290X


I was really annoyed that it was hovering at 50-55FPS on my 290 because I was already running 1200MHz at +168mV. I guess those extra 10% shaders on the 290X made it smooth for you. Never mind that Crossfire is funky at the moment.
Quote:


> Originally Posted by *ImGunz*
> 
> Looking for a waterblock for my MSI R9 290 Twin Frozr. I got the EK - FC R9-290X waterblock right now, but it doesn't fit and im gonna have to return it. My question is which block will fit? The one i got didn't have cut outs for the things in the picture marked with red
> 
> 
> http://imgur.com/fEI3voE


Quote:


> Originally Posted by *cephelix*
> 
> That is weird; EK says the rev 2.0 version is compatible with the MSI R9 290 Gaming.


I searched around; he's got the same Gaming edition as mine, basically a late-model one: http://www.coolingconfigurator.com/step1_complist?gpu_gpus=1401. These are actually not reference designs; they use the gold SSC chokes and fewer, larger capacitors near the VRMs. EK doesn't actually support this board, unfortunately, so he might be out of luck.

A little mystery got solved for me today, though. I was wondering why my MSI 290 Gaming's VRMs were always a lot cooler than the Tri-X's; as it turns out, my late version uses a 6+1+1 design instead of the 5+1+1 design on the reference AMD board. I knew it could not simply be the backplate + the heatsink plate (which is actually worse than the Tri-X's). A full 9°C drop (4 of which can be attributed to the backplate) in operating temperatures with just an extra VRM phase.


----------



## Forceman

Quote:


> Originally Posted by *Pheatton*
> 
> That will work with both BIOSes?


Not sure what you mean. Both sides, like quiet and uber? Yes, you can download either one (or both) from the good card and then flash it to the corresponding spot on the other card.


----------



## Durvelle27

Quote:


> Originally Posted by *Dasboogieman*
> 
> I was really annoyed that it was hovering at 50-55FPS on my 290 because I was already running 1200MHz at +168mV. I guess those extra 10% shaders on the 290X made it smooth for you. Never mind that Crossfire is funky at the moment.
> 
> I searched around; he's got the same Gaming edition as mine, basically a late-model one: http://www.coolingconfigurator.com/step1_complist?gpu_gpus=1401. These are actually not reference designs; they use the gold SSC chokes and fewer, larger capacitors near the VRMs. EK doesn't actually support this board, unfortunately, so he might be out of luck.
> 
> A little mystery got solved for me today, though. I was wondering why my MSI 290 Gaming's VRMs were always a lot cooler than the Tri-X's; as it turns out, my late version uses a 6+1+1 design instead of the 5+1+1 design on the reference AMD board. I knew it could not simply be the backplate + the heatsink plate (which is actually worse than the Tri-X's). A full 9°C drop (4 of which can be attributed to the backplate) in operating temperatures with just an extra VRM phase.


I play at 2560x1080 on Ultra w/4xMSAA+HBAO+High with SweetFX mod and Worse Mode. I get 30-80 FPS running at 1100/1400


----------



## Pheatton

Quote:


> Originally Posted by *Forceman*
> 
> Not sure what you mean. Both sides, like quiet and uber? Yes, you can download either one (or both) from the good card and then flash it to the corresponding spot on the other card.


Thanks, that's exactly what I was asking. So I imagine: switch in one position, copy then flash; switch in the other position, then copy and flash, correct?


----------



## Kittencake

I noticed that with the dual DVI on the 290X you can hook up a third monitor via HDMI; I'm curious about this. I'm also curious about idle temps and performance with Eyefinity.


----------



## cephelix

Quote:


> Originally Posted by *Dasboogieman*
> 
> I searched around, he's got the same Gaming edition as mine, basically a late model one http://www.coolingconfigurator.com/step1_complist?gpu_gpus=1401. These are actually not reference designs, they use the gold SSC chokes and a less numerous array of larger capacitors near the VRMs. EK doesn't actually support this board unfortunately so he might be out of luck.


This puts me in a bit of a predicament, then... how do you tell if it's an early or later version without dismantling it?


----------



## grunion

Quote:


> Originally Posted by *Kittencake*
> 
> I noticed that with the dual DVI on the 290X you can hook up a third monitor via HDMI; I'm curious about this. I'm also curious about idle temps and performance with Eyefinity.


Yes, you can run Eyefinity with 2x DVI + HDMI; expect a ~10°C increase in idle temps unless you use workarounds.


----------



## Forceman

Quote:


> Originally Posted by *Pheatton*
> 
> Thanks, that's exactly what I was asking. So I imagine: switch in one position, copy then flash; switch in the other position, then copy and flash, correct?


Yep, exactly.
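For anyone following this later, the whole per-position dance boils down to something like the sketch below at a DOS prompt. The adapter numbers and the ROM filename are placeholders, not gospel; run `atiflash -i` first and double-check which index is which card before programming anything.

```
REM list adapters; note which index is the good card
atiflash -i
REM save the BIOS from the good card (adapter 0 in this sketch)
atiflash -s 0 quiet.rom
REM program the other card (adapter 1) with that image
atiflash -p 1 quiet.rom
REM power down, flip the BIOS switches on both cards to the other position,
REM then boot back to DOS and repeat the save/program pair for the second BIOS
```

As Forceman said, you can also grab the ROM from Windows with GPU-Z instead of the `-s` step; the DOS tool is only strictly needed for the programming side.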


----------



## Pheatton

Quote:


> Originally Posted by *Forceman*
> 
> Yep, exactly.


Again, thank you very much for the help. I was not looking forward to sending it in for an RMA.


----------



## Red1776

Quote:


> Originally Posted by *cephelix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> I searched around; he's got the same Gaming edition as mine, basically a late-model one: http://www.coolingconfigurator.com/step1_complist?gpu_gpus=1401. These are actually not reference designs; they use the gold SSC chokes and fewer, larger capacitors near the VRMs. EK doesn't actually support this board, unfortunately, so he might be out of luck.
> 
> 
> 
> This puts me in a bit of a predicament, then... how do you tell if it's an early or later version without dismantling it?
Click to expand...

I thought I would just repost this:

(This was someone else's card, not yours)

Quote: red1776


> A lot of the partners are notorious for changing up PCBs/layouts without notice, and XFX leads the pack in this. All of the WC companies are also known for being less than expeditious about updating their compatibility lists, and you can't blame them with unannounced changes constantly happening.
> 
> Keep in mind that there are true reference PCBs and "reference layout"; while they will usually be cross-compatible, they are not always, my MSI Gaming editions being a classic example of this, as well as my XFX HD 7970 DDs.
> 
> The best you can do is remove your DD cooler and painstakingly compare it visually to the compatibility-list pictures.
> 
> EK does a great job of providing both PCB close ups and VGA model close ups.
> 
> That can be found here:
> 
> http://www.coolingconfigurator.com/waterblock/3831109869024
> 
> This is your model I believe
> 
> 
> 
> My MSI R290x's had an issue with beefed up VRM's, but everything else appearing to be ref. You might also call XFX and give them the serial number of your cards so they can pinpoint if they were before or after any design changes that took place.


----------



## Dasboogieman

Quote:


> Originally Posted by *cephelix*
> 
> This puts me in a bit of a predicament, then... how do you tell if it's an early or later version without dismantling it?


The absolute easiest is to look for the AMD silkscreen on the PCIe connector (the early Gaming editions had this). Otherwise, look through the cooler at the chokes: if they're gold colored, then it's custom.
Finally, look at the 8-pin header; the reference board has a capacitor pretty much right next to it, but the late-model ones have it set about 1cm further in.


----------



## cephelix

Quote:


> Originally Posted by *Red1776*
> 
> I thought I would just repost this:
> 
> ( This was someone else's card, not yours)


Quote:


> Originally Posted by *Dasboogieman*
> 
> The absolute easiest is to look for the AMD silkscreen on the PCIe connector (the early Gaming editions had this). Otherwise, look through the cooler at the chokes: if they're gold colored, then it's custom.
> Finally, look at the 8-pin header; the reference board has a capacitor pretty much right next to it, but the late-model ones have it set about 1cm further in.


Thanks for the info, guys... I'll have to check my card before I make the purchase, then.


----------



## aaroc

Quote:


> Originally Posted by *Kittencake*
> 
> I noticed that with the dual DVI on the 290X you can hook up a third monitor via HDMI; I'm curious about this. I'm also curious about idle temps and performance with Eyefinity.


56-39-46°C across my 3x R9 290s while reading overclock.net; room temp is 20°C (winter here). I have 3 monitors attached via DVI-DVI-DP.


----------



## kizwan

Quote:


> Originally Posted by *Pheatton*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> Not sure what you mean. Both sides, like quiet and uber? Yes, you can download either one (or both) from the good card and then flash it to the corresponding spot on the other card.
> 
> 
> 
> Thanks, that's exactly what I was asking. So I imagine: switch in one position, copy then flash; switch in the other position, then copy and flash, correct?
Click to expand...

It's always a good idea to flash only one BIOS and leave the other untouched. That way, if you mess up one BIOS, you can still boot with the working one.


----------



## TheBenson

I have tried creating the .bat file referenced on page 14 to increase the voltage limits in Afterburner, but it doesn't seem to work. Is there something different I have to do for the latest version of Afterburner?
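I can't vouch for what that page-14 .bat did exactly, but the knob people usually flip for extended limits lives in MSIAfterburner.cfg, next to the Afterburner executable. This is a hedged sketch only; whether it lifts the *voltage* slider on the latest version is something you'd have to verify yourself:

```
; MSIAfterburner.cfg - close Afterburner before editing
[ATIADLHAL]
UnofficialOverclockingEULAConfirmed=1
UnofficialOverclockingMode=1
```

Restart Afterburner afterwards. If the sliders still don't budge, it may be the Afterburner build itself, since voltage control support for Hawaii has varied between versions.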


----------



## fateswarm

Quote:


> Originally Posted by *kizwan*
> 
> It's always a good idea to flash only one BIOS and leave the other untouched. That way, if you mess up one BIOS, you can still boot with the working one.


They were talking about two different cards there.


----------



## JaseUK

Guys I was hoping you could help me out a little please?

I am looking at upgrading to a 290x but I want to find a card that is reference as I will be watercooling it.
I also want one that has the Hynix memory.
Oh and is there a card that has all this and the manufacturer doesn't void the warranty if I remove the cooler?

I know I'm asking a lot


----------



## Sgt Bilko

Quote:


> Originally Posted by *JaseUK*
> 
> Guys I was hoping you could help me out a little please?
> 
> I am looking at upgrading to a 290x but I want to find a card that is reference as I will be watercooling it.
> I also want one that has the Hynix memory.
> Oh and is there a card that has all this and the manufacturer doesn't void the warranty if I remove the cooler?
> 
> I know I'm asking a lot



Should be able to find used cards fairly easily.

Hynix or Elpida doesn't really make a difference, TBH.

MSI (AFAIK) and Sapphire have a "don't ask, don't tell" policy.


----------



## Dasboogieman

Quote:


> Originally Posted by *JaseUK*
> 
> Guys I was hoping you could help me out a little please?
> 
> I am looking at upgrading to a 290x but I want to find a card that is reference as I will be watercooling it.
> I also want one that has the Hynix memory.
> Oh and is there a card that has all this and the manufacturer doesn't void the warranty if I remove the cooler?
> 
> I know I'm asking a lot


Actually, it's fairly easy; I might add some stuff on top of Bilko's recommendations.
1. There were some indications from a survey that went around a while ago that Hynix memory is more resistant to stock-speed black screens. However, correlation ≠ causation, so take this with a grain of salt; I can't confirm the correlation until I get the re-compiled version of MemtestCL distributed to the known black-screeners.
2. The best and most reliable way of knowing if a card is full-block compatible is to look for the AMD logo on the PCIe connector.
3. I agree with the MSI + Sapphire warranty policies.
4. I can't vouch much for MSI in terms of watercooling anymore, since there are actually two versions of the 290X Gaming. There's a stock-standard one with the AMD logo and everything, but there's also the late-model one (I have the 290 version of this revised edition), which uses a different VRM assembly, doesn't have the AMD logo, and is basically incompatible with current full-cover waterblocks. Inspecting it physically before you buy is the only way to be sure which one you'll get; the revised, incompatible one has gold chokes.

So, applying your criteria, the only brand I can recommend with reasonable confidence is probably Sapphire. It comes with Hynix memory as standard. My Tri-X didn't even come with a "warranty void" sticker on any of the screws, and the PCB is absolutely stock standard (unless there's been a revision since I got mine two months ago).
When you call them for an RMA, they'll ask the obligatory "have you removed the stock cooler"; just play dumb. They'll know full well you derped, otherwise you wouldn't be RMAing, but they'll take your card in regardless.

As long as you didn't write "JaseUK Waz ere herp derp" on the PCB with an oxyacetylene torch, they'll RMA most kinds of damage.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dasboogieman*
> 
> Actually, it's fairly easy; I might add some stuff on top of Bilko's recommendations.
> 1. There were some indications from a survey that went around a while ago that Hynix memory is more resistant to stock-speed black screens. However, correlation ≠ causation, so take this with a grain of salt; I can't confirm the correlation until I get the re-compiled version of MemtestCL distributed to the known black-screeners.
> 2. The best and most reliable way of knowing if a card is full-block compatible is to look for the AMD logo on the PCIe connector.
> 3. I agree with the MSI + Sapphire warranty policies.
> 4. I can't vouch much for MSI in terms of watercooling anymore, since there are actually two versions of the 290X Gaming. There's a stock-standard one with the AMD logo and everything, but there's also the late-model one (I have the 290 version of this revised edition), which uses a different VRM assembly, doesn't have the AMD logo, and is basically incompatible with current full-cover waterblocks. Inspecting it physically before you buy is the only way to be sure which one you'll get.
> 
> So, applying your criteria, the only brand I can recommend with reasonable confidence is probably Sapphire. It comes with Hynix memory as standard. My Tri-X didn't even come with a "warranty void" sticker on any of the screws, and the PCB is absolutely stock standard (unless there's been a revision since I got mine two months ago).
> When you call them for an RMA, they'll ask the obligatory "have you removed the stock cooler"; just play dumb. They'll know full well you derped, otherwise you wouldn't be RMAing, but they'll take your card in regardless.
> 
> As long as you didn't write "JaseUK Waz ere herp derp" on the PCB with an oxyacetylene torch, they'll RMA most kinds of damage.


^ yup this


----------



## JaseUK

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 
> Should be able to find used cards fairly easily.
> 
> Hynix or Elpida doesn't really make a difference, TBH.
> 
> MSI (AFAIK) and Sapphire have a "don't ask, don't tell" policy.


Quote:


> Originally Posted by *Dasboogieman*
> 
> Actually, it's fairly easy; I might add some stuff on top of Bilko's recommendations.
> 1. There were some indications from a survey that went around a while ago that Hynix memory is more resistant to stock-speed black screens. However, correlation ≠ causation, so take this with a grain of salt; I can't confirm the correlation until I get the re-compiled version of MemtestCL distributed to the known black-screeners.
> 2. The best and most reliable way of knowing if a card is full-block compatible is to look for the AMD logo on the PCIe connector.
> 3. I agree with the MSI + Sapphire warranty policies.
> 4. I can't vouch much for MSI in terms of watercooling anymore, since there are actually two versions of the 290X Gaming. There's a stock-standard one with the AMD logo and everything, but there's also the late-model one (I have the 290 version of this revised edition), which uses a different VRM assembly, doesn't have the AMD logo, and is basically incompatible with current full-cover waterblocks. Inspecting it physically before you buy is the only way to be sure which one you'll get; the revised, incompatible one has gold chokes.
> 
> So, applying your criteria, the only brand I can recommend with reasonable confidence is probably Sapphire. It comes with Hynix memory as standard. My Tri-X didn't even come with a "warranty void" sticker on any of the screws, and the PCB is absolutely stock standard (unless there's been a revision since I got mine two months ago).
> When you call them for an RMA, they'll ask the obligatory "have you removed the stock cooler"; just play dumb. They'll know full well you derped, otherwise you wouldn't be RMAing, but they'll take your card in regardless.
> 
> As long as you didn't write "JaseUK Waz ere herp derp" on the PCB with an oxyacetylene torch, they'll RMA most kinds of damage.


Awesome, thank you for the info.

I am looking at this, as it's the cheapest I can find: http://www.scan.co.uk/products/4gb-msi-radeon-r9-290x-pci-e

Does anyone know if the memory is Hynix on these?


----------



## Noviets

Hey guys, for 1080p gaming I want to try to get over 150FPS in most games. What would be the best option?

Right now I have a 7970 DC2 at 1080/1400 (voltage locked). I also have a 360 XT45 and a 240 UT60 rad, so I'm looking at watercooling them; is that enough rad?

I was looking at getting another 7970 (as it's the cheapest option vs. framerate increase), but I don't think that will give me 150fps, and I'll still be limited by the DC2's voltage-locked speed, which really makes watercooling pointless.

The second option would be to get 2x 290s, or a single 290X and a second one a month or so after.

My CPU is an FX-8350 at 5.4GHz.

Where should I go from here?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Noviets*
> 
> Hey guys, for 1080p gaming I want to try to get over 150FPS in most games. What would be the best option?
> 
> Right now I have a 7970 DC2 at 1080/1400 (voltage locked). I also have a 360 XT45 and a 240 UT60 rad, so I'm looking at watercooling them; is that enough rad?
> 
> I was looking at getting another 7970 (as it's the cheapest option vs. framerate increase), but I don't think that will give me 150fps, and I'll still be limited by the DC2's voltage-locked speed, which really makes watercooling pointless.
> 
> The second option would be to get 2x 290s, or a single 290X and a second one a month or so after.
> 
> My CPU is an FX-8350 at 5.4GHz.
> 
> Where should I go from here?


Well, with 2x R9 290s I'm pulling 100fps+ at 1440p, and at 1080p I was getting around 140fps in most games with 98% of the eye candy turned on.

Also.....5.4? Nice work








Quote:


> Originally Posted by *JaseUK*
> 
> Awesome, thank you for the info.
> 
> I am looking at this, as it's the cheapest I can find: http://www.scan.co.uk/products/4gb-msi-radeon-r9-290x-pci-e
> 
> Does anyone know if the memory is Hynix on these?


With the ref cards it's hit and miss regarding memory type.

There is no brand whose ref cards come exclusively with Elpida or Hynix.

Your best option would be a Sapphire Tri-X (ref PCB and Hynix mem) or an MSI Gaming card (rev 2.0 needs a different block; Hynix mem).


----------



## JaseUK

I have the opportunity to get a Sapphire 290X Tri-X OC 4GB.

Just, with them being the OC version, are they truly reference? I ask because this version of the card isn't on Cooling Configurator.


----------



## Sgt Bilko

Quote:


> Originally Posted by *JaseUK*
> 
> I have the opportunity to get a Sapphire 290X Tri-X OC 4GB.
> 
> Just, with them being the OC version, are they truly reference? I ask because this version of the card isn't on Cooling Configurator.


I am 98% sure they are the same as the base card, just factory overclocked. I think there are a few members here that have them, though.


----------



## PureBlackFire

Quote:


> Originally Posted by *JaseUK*
> 
> I have the opportunity to get a Sapphire 290X Tri-X OC 4GB.
> 
> Just, with them being the OC version, are they truly reference? I ask because this version of the card isn't on Cooling Configurator.


The 290 and 290X Tri-X come with Hynix memory and use the reference PCB. Sapphire does not allow you to modify or remove the cooler; if you do, your warranty is void.


----------



## JaseUK

Quote:


> Originally Posted by *PureBlackFire*
> 
> The 290 and 290X Tri-X come with Hynix memory and use the reference PCB. Sapphire does not allow you to modify or remove the cooler; if you do, your warranty is void.


Thank you for the reply. The 290X OC version is going to be reference then?


----------



## PureBlackFire

Quote:


> Originally Posted by *JaseUK*
> 
> Thank you for the reply. The 290X OC version is going to be reference then?


Yeah, it's reference. Don't cause any obvious damage around the screws and you're fine.


----------



## JaseUK

Quote:


> Originally Posted by *PureBlackFire*
> 
> Yeah, it's reference. Don't cause any obvious damage around the screws and you're fine.


Sweet, I am getting two of them at a very good price. I'll be back once I have them installed.


----------



## PachAz

The good thing with Sapphire is that they don't have any stickers on the screws, which means you can mount other coolers without them knowing it. Just save the stock thermal pads so you can put them back on in case of an RMA. My R9 290 is on RMA, and I really hope I can choose the Tri-X as a replacement, because the shop doesn't sell the regular Sapphire R9 290 anymore. I would not want an XFX or Gigabyte; my waterblock doesn't even fit those, because it's rev 1.0, not 2.0. I just need to find a good reason to tell the shop why I want a Tri-X instead, without making it too obvious I'm going to mount a WB. Please give me some tips. I already have one Tri-X, and it would be sweet to get another one.


----------



## anubis1127

Even screws with stickers can be removed, just have to be more careful.


----------



## Sgt Bilko

Quote:


> Originally Posted by *PachAz*
> 
> The good thing with Sapphire is that they don't have any stickers on the screws, which means you can mount other coolers without them knowing it. Just save the stock thermal pads so you can put them back on in case of an RMA. My R9 290 is on RMA, and I really hope I can choose the Tri-X as a replacement, because the shop doesn't sell the regular Sapphire R9 290 anymore. I would not want an XFX or Gigabyte; my waterblock doesn't even fit those, because it's rev 1.0, not 2.0. I just need to find a good reason to tell the shop why I want a Tri-X instead, without making it too obvious I'm going to mount a WB. Please give me some tips. I already have one Tri-X, and it would be sweet to get another one.


The XFX cards are also based on the ref PCB.

Just take what you get back; if you want it replaced with a Tri-X, then tell them that.

No point in being dishonest about it.


----------



## PachAz

The shop only has the XFX DD Black Edition, and that is reference, but it costs about 500 SEK more than the Tri-X. The point is to tell them what I want once they are willing to send me a card back. That's why I want to find as many positive things as possible, so they send me a Tri-X instead. Basically, a Gigabyte Windforce would work out just as well, but not for me, because it's not reference. I must get them to understand what I want, yet not make it too apparent that I will watercool it.


----------



## Dasboogieman

Quote:


> Originally Posted by *JaseUK*
> 
> I have the opportunity to get a Sapphire 290X Tri-X OC 4GB.
> 
> Just, with them being the OC version, are they truly reference? I ask because this version of the card isn't on Cooling Configurator.


It's reference; I'd say it is even more likely to be reference than the MSI Gaming card. Mine has the AMD logo stamped on the PCIe connector, and I'm not aware of any revisions that have been done to the Tri-X design.
Quote:


> Originally Posted by *PachAz*
> 
> The good thing with Sapphire is that they don't have any stickers on the screws, which means you can mount other coolers without them knowing it. Just save the stock thermal pads so you can put them back on in case of an RMA. My R9 290 is on RMA, and I really hope I can choose the Tri-X as a replacement, because the shop doesn't sell the regular Sapphire R9 290 anymore. I would not want an XFX or Gigabyte; my waterblock doesn't even fit those, because it's rev 1.0, not 2.0. I just need to find a good reason to tell the shop why I want a Tri-X instead, without making it too obvious I'm going to mount a WB. Please give me some tips. I already have one Tri-X, and it would be sweet to get another one.


Oh, Sapphire knows, believe me. Like EVGA or MSI, they just helpfully turn a blind eye. It's really easy to come up with an excuse to get a Tri-X: it also happens to be the most efficient and quietest 2-slot air cooler. Just say you want the silence and extra cooling the Tri-X gives compared to the DD because you're going Crossfire.


----------



## Sgt Bilko

Quote:


> Originally Posted by *PachAz*
> 
> The shop only has the XFX DD Black Edition, and that is reference, but it costs about 500 SEK more than the Tri-X. The point is to tell them what I want once they are willing to send me a card back. That's why I want to find as many positive things as possible, so they send me a Tri-X instead. Basically, a Gigabyte Windforce would work out just as well, but not for me, because it's not reference. I must get them to understand what I want, yet not make it too apparent that I will watercool it.


Well, tell them that you already have one Tri-X and you would like matching cards.

Like I said, no point in being dishonest.


----------



## nightfox

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well, tell them that you already have one Tri-X and you would like matching cards.
> 
> Like I said, no point in being dishonest.


+1


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dasboogieman*
> 
> Oh, Sapphire knows, believe me. Like EVGA or MSI, they just helpfully turn a blind eye. It's really easy to come up with an excuse to get a Tri-X: it also happens to be the most efficient and quietest 2-slot air cooler. Just say you want the silence and extra cooling the Tri-X gives compared to the DD because you're going Crossfire.


But......the DDs aren't loud.

Tri-X is better cooling though, I'll give it that. And Sapphire knows.....they definitely know.


----------



## PachAz

Okay, then I will focus on these points:

- Already have a Tri-X in my system
- Tri-X has the best and most efficient cooler
- Only 2 slots
- Hynix memory
- Factory overclocked
- Nice color theme
- ??

Also, how would they know, unless they remove the cooler themselves? Here in Sweden RMAs are handled under the shop's policy, not through the manufacturer.


----------



## Sgt Bilko

Quote:


> Originally Posted by *PachAz*
> 
> Okay, then I will focus on these points:
> 
> *- Already have a Tri-X in my system*
> - Tri-X has the best and most quiet cooler
> - Only 2 slots
> - Hynix memory
> - ??
> 
> Also, how would they know, unless they remove the cooler themselves? Here in Sweden RMAs are handled under the shop's policy, not through the manufacturer.


That will be all you will need to tell them imo


----------



## fateswarm

When you guys say "same PCB", do you mean all the devices on it? Because I noticed a Tri-X has 2 more power phases than the others.


----------



## PachAz

If you intend to use EK blocks, check their compatibility on Cooling Configurator. I have the rev 1.0 block and it fits my Tri-X perfectly.


----------



## InsideJob

I would like to join the club!








http://www.techpowerup.com/gpuz/5p55h/
Asus reference model with stock cooling.


----------



## Mega Man

Quote:


> Originally Posted by *PachAz*
> 
> The shop only has the XFX DD Black Edition, and that is reference, but it costs about 500 SEK more than the Tri-X. The point is to tell them what I want once they are willing to send me a card back. That's why I want to find as many positive things as possible, so they send me a Tri-X instead. Basically, a Gigabyte Windforce would work out just as well, but not for me, because it's not reference. I must get them to understand what I want, yet not make it too apparent that I will watercool it.


Just because it is the same PCB does not mean there can't be more on it; a lot of PCBs have blank spots, for various reasons.

Also of note:

XFX is really the best company for watercooling, as they state in writing on their site that you can change the cooler.

Just my 2 cents.


----------



## PachAz

That doesn't matter, because in Sweden we have to follow the consumer rights law first, i.e. lie to the shop.


----------



## Red1776

Quote:


> Originally Posted by *PachAz*
> 
> That doesn't matter, because in Sweden we have to follow the consumer rights law first, i.e. lie to the shop.


That is actually interesting. Can you link me to something about that consumer law?


----------



## heroxoot

So some friends bought Borderlands 2 in the Steam sale and got me to play. My 290X is constantly dropping its core clock, and I'm getting like 30 FPS in some areas because of it. What's the deal? My 7970 got a solid 120 FPS everywhere, and the clock never throttled like this.


----------



## PachAz

It's in Swedish, but basically it gives the consumer a three-year right to return an item to the seller, and during that time you can either get the product repaired, a new product, or your money back. The law also says that if you don't _change_ or _modify_ the item in a way that damages it, a refund should be okay. However, many of these Swedish web shops are owned by foreign companies that don't put much emphasis on customer service and constantly try to "avoid" local laws. In practice, this means that customers who don't understand their rights just say "okay" to whatever the shop tells them. The reason is that the shops have a lot of policies, like the right to charge the customer additional fees plus shipping if they don't find the problem described in the letter. Only if they do manage to find the issue do you get money back and/or a similar product. This also means that if the shop lacks knowledge about the item, they can state that the product is "working" and charge the customer for service and shipping. Then they sell the same item at a "discount" because the box is open, even though it may very well be broken.

The used Sapphire R9 290 I bought was such a product: sold at a discount to a buyer who never tested the card, and then I bought it, only to find out it was faulty. Since the issue only appears when using the power limit in Sapphire's TriXX software, the problems likely never occurred when the shop tested the GPU, because they never used any power limit while testing. I really hope the shop finds the issue, because that particular shop has a policy giving second-hand customers a two-year warranty if they have the receipt for the product.

What is interesting, though, is why the R9 290 was returned in the first place, and for what reasons. It's no lie that used products sold by various shops not only have open boxes and scratches, but sometimes don't work at all. I'm not even sure the shops test all returned products, considering the high failure rate when buying such items. I know for sure I won't ever buy "demo" products, or products that a private seller hasn't even tested.

But as you know, the card is one of the early Sapphire R9 290s with Elpida memory, which has a lot of problems with black screens/freezes occurring in different situations.


----------



## Red1776

Quote:


> Originally Posted by *PachAz*
> 
> It's in Swedish, but basically it gives the consumer a three-year right to return an item to the seller, and during that time you can either get the product repaired, a new product, or your money back. The law also says that if you don't change or modify the item in a way that damages it, a refund should be okay. However, many of these Swedish web shops are owned by foreign companies that don't put much emphasis on customer service and constantly try to "avoid" local laws. In practice, this means that customers who don't understand their rights just say "okay" to whatever the shop tells them. The reason is that the shops have a lot of policies, like the right to charge the customer additional fees plus shipping if they don't find the problem described in the letter. Only if they do manage to find the issue do you get money back and/or a similar product. This also means that if the shop lacks knowledge about the item, they can state that the product is "working" and charge the customer for service and shipping. Then they sell the same item at a "discount" because the box is open, even though it may very well be broken.
> 
> The used Sapphire R9 290 I bought was such a product: sold at a discount to a buyer who never tested the card, and then I bought it, only to find out it was faulty. Since the issue only appears when using the power limit in Sapphire's TriXX software, the problems likely never occurred when the shop tested the GPU, because they never used any power limit while testing. I really hope the shop finds the issue, because that particular shop has a policy giving second-hand customers a two-year warranty if they have the receipt for the product.
> 
> What is interesting, though, is why the R9 290 was returned in the first place, and for what reasons. It's no lie that used products sold by various shops not only have open boxes and scratches, but sometimes don't work at all. I'm not even sure the shops test all returned products, considering the high failure rate when buying such items. I know for sure I won't ever buy "demo" products, or products that a private seller hasn't even tested.
> 
> But as you know, the card is one of the early Sapphire R9 290s with Elpida memory, which has a lot of problems with black screens/freezes occurring in different situations.


Interesting... Is the consumer market extremely litigious in Sweden? Or do retail outlets conform to the laws without the threat of legal exposure?


----------



## PCSarge

I'm in; got mine at the LAN yesterday.

It is an AMD "press sample" according to the sticker on the back.

The subvendor is ATI, which means AMD. Nobody else's logos on it.

Runs 937/1250 stock; will run 1000/1350 with no voltage bump or power limit increase.


----------



## Durvelle27

Quote:


> Originally Posted by *heroxoot*
> 
> So some friends bought Borderlands 2 on the steam sale and got me to play. My 290X is constantly dropping its core clock and I'm getting like 30fps in some areas because of it. What's the deal? My 7970 got 120fps solid everywhere and the clock never throttled like this.


It's because the game isn't very taxing and isn't fully using the 290X.


----------



## heroxoot

Quote:


> Originally Posted by *Durvelle27*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> So some friends bought Borderlands 2 on the steam sale and got me to play. My 290X is constantly dropping its core clock and I'm getting like 30fps in some areas because of it. What's the deal? My 7970 got 120fps solid everywhere and the clock never throttled like this.
> 
> 
> 
> It's because the game isn't very taxing and isn't fully using the 290X.

Yeah, it seems disabling PP support didn't actually help, even though the GPU clock is sticking to 1030 MHz. The most load I'm seeing is around 50%.


----------



## PachAz

Quote:


> Originally Posted by *Red1776*
> 
> Interesting..Is the consumer market extremely litigious in Sweden? or do retail outlets conform to the laws without threat of legal exposure?


Good question, actually. I think most people agree in the end because they feel they can't afford the costs a legal procedure would involve. Some shops do give in in the end, though, especially if they are discussed on various computer forums. I think an open debate about this is lacking; for example, 8 out of 10 shops still talk about "manufacturer warranty", even though that is secondary. The first rule the customer and shop should follow is the consumer rights law; the manufacturer warranty comes after that, and is an agreement between the shop and the manufacturer.

I think it's a big gray-zone area, to be honest; sometimes the seller and shop can settle, and sometimes it doesn't work. The guidelines are there, but they are interpreted very differently and sometimes ignored. A couple of years ago many physical stores sold extended warranties, even though such a warranty would give the buyer the same rights as the consumer law already does, i.e. a kind of rip-off. That got attention in the media, though.

To be honest, I don't even know if it's possible to RMA items in Sweden directly to the manufacturer, due to the insane shipping costs and the general rule that a faulty item should be returned to the _shop_.


----------



## davidm71

Quote:


> Originally Posted by *morreale*
> 
> ASUS R9290X-DC2OC-4GD5
> Stock Asus
> username: morreale
> 
> 
> 
> I have had issues with this card since it was installed: black screens and BSODs. I am using a Corsair Professional Series AX860i, so I know power is not an issue.


Me too. On the first day of installing that card with 14.6, I had lockups and black screens within 5 minutes of booting into Windows. I cleaned out the old drivers and installed 14.4, and it seems stable, though I haven't had enough time to vet it. The weird thing is that the latest BIOS on Asus's site for that card changes its hardware name but keeps the BIOS version the same. Still works, though. Thank god.


----------



## anubis1127

Quote:


> Originally Posted by *PCSarge*
> 
> 
> 
> I'm in; got mine at the LAN yesterday.
> 
> It is an AMD "press sample" according to the sticker on the back.
> 
> The subvendor is ATI, which means AMD. Nobody else's logos on it.
> 
> Runs 937/1250 stock; will run 1000/1350 with no voltage bump or power limit increase.


Nice man, congrats! Was that free? Or did you have to destroy a NV card to get it?


----------



## PCSarge

Quote:


> Originally Posted by *anubis1127*
> 
> Nice man, congrats! Was that free? Or did you have to destroy a NV card to get it?


I gave up an old Radeon card for it; not too bad of a trade, if I do say so myself.

Oh, guys, if you have driver issues: I'm running 13.12 with zero issues. I benchmarked the life out of the card last night on different drivers; maybe a 2 FPS difference, but 13.12 was the only driver not to crash out.


----------



## anubis1127

Quote:


> Originally Posted by *PCSarge*
> 
> I gave up an old Radeon card for it; not too bad of a trade, if I do say so myself.


Cool, sounds like a good deal to me. Enjoy the new card!


----------



## PCSarge

Quote:


> Originally Posted by *anubis1127*
> 
> Cool, sounds like a good deal to me. Enjoy the new card!


I am already. Went from sub-50 FPS on a 270 to breaking 90 on this in BF4.


----------



## PureBlackFire

Another Valley run at 1075 MHz/1300 MHz:



VRM1: 67°C
VRM2: 48°C


----------



## yawa

For all two of you still curious how a 290X does with a 7850K APU and more mature drivers...

My latest Overclocked 3D Mark Firestrike 2013 Run.

CPU at 4.5 GHz. iGPU turned off, as HSA acceleration is still not included in 3DMark's suites. 290X sitting at 1213/1327. 14.6 AMD Catalyst beta drivers.

Overall: 7926
Graphics: 12763
Physics: 5245
Combined: 2578



*I really hope they give us the option to use HSA acceleration in their next benching suite. It would go a long way towards giving us APU users more reasonable (and more "real world", considering its potential) Physics and Combined scores.


----------



## Widde

Quote:


> Originally Posted by *PachAz*
> 
> It's in Swedish, but basically it gives the consumer a three-year right to return an item to the seller, and during that time you can either get the product repaired, a new product, or your money back. The law also says that if you don't _change_ or _modify_ the item in a way that damages it, a refund should be okay. However, many of these Swedish web shops are owned by foreign companies that don't put much emphasis on customer service and constantly try to "avoid" local laws. In practice, this means that customers who don't understand their rights just say "okay" to whatever the shop tells them. The reason is that the shops have a lot of policies, like the right to charge the customer additional fees plus shipping if they don't find the problem described in the letter. Only if they do manage to find the issue do you get money back and/or a similar product. This also means that if the shop lacks knowledge about the item, they can state that the product is "working" and charge the customer for service and shipping. Then they sell the same item at a "discount" because the box is open, even though it may very well be broken.
> 
> The used Sapphire R9 290 I bought was such a product: sold at a discount to a buyer who never tested the card, and then I bought it, only to find out it was faulty. Since the issue only appears when using the power limit in Sapphire's TriXX software, the problems likely never occurred when the shop tested the GPU, because they never used any power limit while testing. I really hope the shop finds the issue, because that particular shop has a policy giving second-hand customers a two-year warranty if they have the receipt for the product.
> 
> What is interesting, though, is why the R9 290 was returned in the first place, and for what reasons. It's no lie that used products sold by various shops not only have open boxes and scratches, but sometimes don't work at all. I'm not even sure the shops test all returned products, considering the high failure rate when buying such items. I know for sure I won't ever buy "demo" products, or products that a private seller hasn't even tested.
> 
> But as you know, the card is one of the early Sapphire R9 290s with Elpida memory, which has a lot of problems with black screens/freezes occurring in different situations.


The first part is partially false. You only have 6 months of full return rights; after that it is up to you, the consumer, to prove that the fault was original. But the overall window is 3 years, for some weird reason.







If you return it to the seller before 6 months have passed, the seller has to prove that the fault is not original and that you caused it.


----------



## Arizonian

Quote:


> Originally Posted by *InsideJob*
> 
> I would like to join the club!
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/5p55h/
> Asus reference model with stock cooling.


Congrats - added









Quote:


> Originally Posted by *PCSarge*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> im in, got mine at the LAN yesterday.
> 
> it is an AMD "press sample" according to the sticker on the back.
> 
> it is subvendor of ATI. which means AMD. nobody elses logos on it
> 
> runs 937/1250 stock, will run 1000/1350 with no voltage bump or power limit increase.


Congrats - added









I see some new faces and if you haven't submitted proof yet please do and I'll get you on our roster.


----------



## PCSarge

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I see some new faces and if you haven't submitted proof yet please do and I'll get you on our roster.


I see you've finally made it so "damn" is not a curse word on the site.


----------



## DeadlyDNA

Quote:


> Originally Posted by *PCSarge*
> 
> I see you've finally made it so "damn" is not a curse word on the site.


Oh damn! Oops, I didn't mean to say "damn". Oops, I said damn again.
















Is anyone else using MSI AB 3.0 with voltage unlocked and finding that the voltage isn't showing?


----------



## Mega Man

quote name="PachAz" url="/t/1436497/official-amd-r9-290x-290-owners-club/24800_100#post_22456757"]

add this to the beginning and your quote is fixed

make sure to put "[" in front of it


----------



## PCSarge

Well, this seems to be a very cherry-picked "press sample".

Ran Heaven at 1100/1350 on stock voltage.

Average FPS of 84.6.

Max temp I hit at around 85% fan was 77°C.

Ran it on Ultra with tessellation disabled at 1080p, on a 75 Hz monitor.

That, and the door on my Prodigy is like 1 inch away from the fan... bet it would run in the low 70s with the side panel off.

If only the damn results would save properly... could've done really well on HWBot with that one.


----------



## PachAz

Quote:


> Originally Posted by *Widde*
> 
> The first part is partially false. You only have 6 months of full return rights; after that it is up to you, the consumer, to prove that the fault was original. But the overall window is 3 years, for some weird reason.
> 
> If you return it to the seller before 6 months have passed, the seller has to prove that the fault is not original and that you caused it.

Ermageerrd, I broke the quote.
I know it's up to the buyer to prove the fault was original after 6 months, but you are still allowed to return the item within the 3-year time frame. The shop can't deny you the right to return the item during that time frame, of course. How exactly the shop is going to prove that the item was damaged at the factory, against your proof, is still a hard thing to do. In my case, it will be harder to prove anything, because the person before me didn't test the card, and the very first person who bought the card returned it for reasons I don't know. Also, how does a shop prove a fault if they don't even know how to do it? Obviously the card was sold despite being faulty, which indicates the support people:

1. didn't test the card before selling it at a discount
2. didn't test it with the power limit in benchmarks

One can also argue that an item should work past the 6-month time limit. It can very well be a factory fault making the item break at any time, depending on the item itself. But looking at the context, it's likely the Sapphire R9 290 suffered from black screens/freezes from the factory, because it's a known issue with Elpida cards, not to mention the incompetence of the support in messing things up. For example, when they accepted my RMA, they changed the wording of the description I gave them. As the stated reason they wrote "broken after it was used", but that wasn't the case, since it was already broken when I got it. I still don't know where they got that information from, since I never told them the card became faulty after I used it. And here it gets pretty interesting, because according to the law, if the fault occurs within 2 months, there is even more reason to believe the fault is original. While I didn't buy the card new, this rule should apply from the moment I received the item. It's still up to the support to prove their case, and if they can't show me what tests they did, with in-depth details of the tests they ran before discounting the card in the first place, well, then they don't have squat.

And I think many shops are lacking in documenting these procedures, so they either try to play smart and hard-headed, or they give in and return your money/give you a new item just so they don't get a bad reputation. That is also why, and this is no lie, customers damage parts on purpose, just so the support can find a 100% obvious defect. I'm not saying that is right, but many faults are quite hard to discover, not to mention all the variables that affect the outcome; in my case, it's the power limit that makes the card crash (while it works perfectly on my other R9 290 with Hynix). This whole area is a gray zone, but the shops are making big money; otherwise they wouldn't stay in business and keep investing.


----------



## fateswarm

Hehe. Those return policies only work in countries where people trust each other. Pull that elsewhere, and everyone takes advantage of it.


----------



## PachAz

But the thing is, it doesn't work properly, because many of these online shops are owned by non-Swedish corporations, and therefore they train the workers to _lie_ and avoid the law. And as you know, Swedish people are generally shy and don't want conflict, so they agree out of desperation. These companies know Swedish people generally have decent finances, so they can milk them a little bit more.

This may be a little OT, but why does the Sapphire R9 290 that I returned crash when using the power limit, and not my R9 290 Tri-X? Does this have anything to do with the memory (Elpida vs. Hynix)? I should also mention that I sometimes get artifacts before the screen goes black.


----------



## cephelix

Very interesting read on Swedish laws this morning...
Going back to TriXX, since AB is not displaying things right. What software do you guys use for monitoring the GPU? Is RivaTuner any good?


----------



## PCSarge

Quote:


> Originally Posted by *cephelix*
> 
> Very interesting read on swedish laws this morning...
> Going back to trix since ab is not displaying things right..what software do you guys use for monitoring the gpu? Rivatuner any good??


I use AIDA64 for CPU and GPU. It displays fan speeds and all on my G510's screen; very useful in games.


----------



## Widde

Quote:


> Originally Posted by *PachAz*
> 
> But the thing is, it doesn't work properly, because many of these online shops are owned by non-Swedish corporations, and therefore they train the workers to _lie_ and avoid the law. And as you know, Swedish people are generally shy and don't want conflict, so they agree out of desperation. These companies know Swedish people generally have decent finances, so they can milk them a little bit more.
> 
> This may be a little OT, but why does the Sapphire R9 290 that I returned crash when using the power limit, and not my R9 290 Tri-X? Does this have anything to do with the memory (Elpida vs. Hynix)? I should also mention that I sometimes get artifacts before the screen goes black.


It should be somewhat stricter, because I know people who have turned in both PSUs and GPUs, and the "support" people... I don't even think they tested them (not properly, at least). One mate got back a dead PSU; another got back a GPU with a dead DVI port. These were dead on arrival as well.


----------



## cephelix

Quote:


> Originally Posted by *PCSarge*
> 
> I use AIDA64 for CPU and GPU. It displays fan speeds and all on my G510's screen; very useful in games.


Thanks; I'll give it a try when I get back. It's going to be a long day at work today... I finally got into The Witcher, though I'm experiencing intermittent freezes during gameplay (card at stock, settings on High). Hence the need for an in-game monitor, to see if it's because of downclocking, etc.


----------



## pdasterly

Can you play Xbox/360 games on PC (emulator)?


----------



## Dasboogieman

Quote:


> Originally Posted by *heroxoot*
> 
> So some friends bought Borderlands 2 on the steam sale and got me to play. My 290X is constantly dropping its core clock and I'm getting like 30fps in some areas because of it. What's the deal? My 7970 got 120fps solid everywhere and the clock never throttled like this.


Have you tried force-setting the flip queue to 1 or 2? That fixed most of the dropping for me.
Basically, it's perfectly normal for the clock speed to fluctuate on the 290, but it is problematic if the output is only 30 FPS. I can only guess what the issue is, but I think it's to do with the CPU feeding the GPU in sporadic bursts instead of pre-preparing frames properly.
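To illustrate the queueing idea behind this advice: the flip queue buffers frames the CPU prepares ahead of the display, so a bursty CPU doesn't starve the GPU between bursts. The toy simulation below is just a sketch of that concept (the function name and tick model are made up for illustration; this is not any AMD driver API):

```python
from collections import deque

def displayed_frames(submissions, queue_depth):
    """Toy model: each tick the CPU submits a burst of 0..n frames; the display
    consumes at most one frame per tick from a bounded flip queue. Frames that
    don't fit in the queue are simply dropped here, for simplicity. Returns the
    ticks on which a new frame was actually shown."""
    queue = deque()
    shown = []
    for tick, burst in enumerate(submissions):
        for _ in range(burst):
            if len(queue) < queue_depth:
                queue.append(tick)   # buffer the frame for a later tick
        if queue:
            queue.popleft()
            shown.append(tick)       # a new frame reached the display this tick
    return shown

# Bursty CPU: three frames at once, then two idle ticks, repeated.
bursty = [3, 0, 0] * 3
shallow = displayed_frames(bursty, queue_depth=1)  # starves between bursts
deep = displayed_frames(bursty, queue_depth=3)     # burst is smoothed out
print(shallow)  # [0, 3, 6]  -> a new frame only every third tick
print(deep)     # [0, 1, 2, 3, 4, 5, 6, 7, 8]  -> a frame every tick
```

With depth 1, two of every three submitted frames have nowhere to go, so the display shows a new frame only a third of the time; a deeper queue absorbs the burst and delivers one frame per tick, which is the smoothing effect being described. (Note the real-world trade-off: a deeper queue also adds input latency, which is why values of 1-3 are typically discussed.)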


----------



## heroxoot

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> So some friends bought Borderlands 2 on the steam sale and got me to play. My 290X is constantly dropping its core clock and I'm getting like 30fps in some areas because of it. What's the deal? My 7970 got 120fps solid everywhere and the clock never throttled like this.
> 
> 
> 
> Have you tried force-setting the flip queue to 1 or 2? That fixed most of the dropping for me.
> Basically, it's perfectly normal for the clock speed to fluctuate on the 290, but it is problematic if the output is only 30 FPS. I can only guess what the issue is, but I think it's to do with the CPU feeding the GPU in sporadic bursts instead of pre-preparing frames properly.

Is there a way to do this without a third-party tool? I know it can be done with that ATI tool thing, but I don't really want to run it, because it caused BSODs for me in the past.


----------



## Dasboogieman

Quote:


> Originally Posted by *heroxoot*
> 
> Is there a way to do this without a third-party tool? I know it can be done with that ATI tool thing, but I don't really want to run it, because it caused BSODs for me in the past.


Try RadeonPro; CCC won't let you mess with it.

I have the flip queue set to 1 in the global profile.


----------



## heroxoot

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Is there a way to do this without a 3rd party tool? I know it can be done with that ATI tool thing but I don't really wanna run it because It caused me BSOD in the past.
> 
> 
> 
> Try RadeonPro, CCC won't let you mess with it.
> 
> I have flip queue set to 1 on the global profile

I've tried that in RadeonPro once and it made no difference. Not sure if it even worked.


----------



## Durvelle27

Ok guys, more bad news. It seems that when my 290X died, it took out the HDMI port on my monitor too.


----------



## PCSarge

Quote:


> Originally Posted by *Durvelle27*
> 
> Ok guys more bad news. Seems with the death of my 290X it took the HDMI port out on my monitor


Yikes. Use DVI.

In other news, I'm staring at the sexiness that is an Aquacomputer 290 block... and then looking at the price tag and wanting to kick myself for buying mining gear.


----------



## anubis1127

Quote:


> Originally Posted by *PCSarge*
> 
> Yikes. Use DVI.
> 
> In other news, I'm staring at the sexiness that is an Aquacomputer 290 block... and then looking at the price tag and wanting to kick myself for buying mining gear.


Yeah, those are quite attractive.


----------



## luckyboy

Quote:


> Originally Posted by *Naxxy*
> 
> Thanks for the info, but I'd prefer not to do that. 14.4 had the same issues I have now on 14.6... I tried everything from resetting to stock values to some software that was supposed to fix the sleep-mode issue, but nothing seems to work. It happens 50% of the time I let the computer go into sleep mode.
> 
> Guess the only option left is disabling sleep mode... this thing is bothering me so much I'm starting to think of changing the card...


I have the same problem. Sometimes the monitor won't wake up after sleep. Uninstall the TriXX utility, and the monitor wakes up properly every time. Here and here I describe the issues with my R9 290 Tri-X OC.

P.S. For now, I'm not 100% sure the problem isn't hardware-related, but basically: without the TriXX utility, no problems with wake-up. Without Afterburner, and on the 14.1-14.4 drivers, no BSODs or freezing.


----------



## PCSarge

Quote:


> Originally Posted by *luckyboy*
> 
> I have the same problem. Sometimes the monitor won't wake up after sleep. Uninstall the TriXX utility, and the monitor wakes up properly every time. Here and here I describe the issues with my R9 290 Tri-X OC.
> 
> P.S. For now, I'm not 100% sure the problem isn't hardware-related, but basically: without the TriXX utility, no problems with wake-up. Without Afterburner, and on the 14.1-14.4 drivers, no BSODs or freezing.


Try 13.12; bet it fixes the problem.


----------



## luckyboy

13.12 in combination with the TriXX utility?

P.S. Do you run a YouTube video, the FishBowl benchmark, foobar2000, and a movie at the same time? During playback, run Afterburner 3.0.1.


----------



## PCSarge

Quote:


> Originally Posted by *luckyboy*
> 
> 13.12 in combination with the TriXX utility?
> 
> P.S. Do you run a YouTube video, the FishBowl benchmark, foobar2000, and a movie at the same time? During playback, run Afterburner 3.0.1.


13.12 is so stable that I can run Heaven, a DVD movie, and a YouTube video with no crashes at 1100/1400.

And Afterburner and AIDA64 are always running on my PC.


----------



## luckyboy

I will test with 13.12. Thank you.

Which versions of the TriXX utility and Afterburner are you using?


----------



## PCSarge

Quote:


> Originally Posted by *luckyboy*
> 
> I will test with 13.12. Thank you.
> 
> Which versions of the TriXX utility and Afterburner are you using?


Afterburner is 3.0.1; TriXX doesn't work on my card now, but it worked flawlessly on my 270 before I swapped. I've got the newest version on here.


----------



## Naxxy

Quote:


> Originally Posted by *luckyboy*
> 
> I have the same problem. Sometimes the monitor won't wake up after sleep. Uninstall the TriXX utility, and the monitor wakes up properly every time. Here and here I describe the issues with my R9 290 Tri-X OC.
> 
> P.S. For now, I'm not 100% sure the problem isn't hardware-related, but basically: without the TriXX utility, no problems with wake-up. Without Afterburner, and on the 14.1-14.4 drivers, no BSODs or freezing.


Tried without TriXX, and with or without Afterburner it still happens... I solved the issue by disabling monitor sleep, and all works fine.


----------



## fateswarm

Will an EVGA 1000 G2 handle two Tri-X 290s and an i7-4790K if overclocking is attempted on all of them? Without any extreme/water cooling.


----------



## bond32

Quote:


> Originally Posted by *fateswarm*
> 
> Will an EVGA 1000 G2 handle two Tri-X 290s and an i7-4790K if overclocking is attempted on all of them? Without any extreme cooling.


Yes. I ran that PSU with three 290s and a 4770K, with the 4770K heavily overclocked and the 290s at stock. It ran just fine. Software readings totaled about 897 watts of draw, though I know that isn't the most accurate.

I have since upgraded to the 1300 W version, but the 1000 W unit would be perfect for two cards.
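For a rough sanity check on questions like this, a back-of-envelope budget can be sketched. All figures here are ballpark assumptions, not measurements: ~275 W typical board power per R9 290, 88 W TDP for the 4790K, a flat ~100 W allowance for the motherboard, drives, and fans, and a 20% overclocking margin:

```python
def power_budget(psu_watts, gpu_board_watts, n_gpus, cpu_tdp_watts,
                 rest_of_system_watts=100, oc_margin=1.2):
    """Back-of-envelope load estimate: nominal GPU board power and CPU TDP,
    scaled by an overclocking margin, plus a flat allowance for the rest of
    the system. Returns (estimated load in watts, fraction of PSU capacity)."""
    load = (gpu_board_watts * n_gpus + cpu_tdp_watts) * oc_margin \
           + rest_of_system_watts
    return load, load / psu_watts

# Two R9 290s (~275 W each) and a 4790K (~88 W TDP) on a 1000 W unit:
load, utilization = power_budget(1000, 275, 2, 88)
print(f"{load:.0f} W, {utilization:.0%} of PSU capacity")  # 866 W, 87%
```

That ~87% figure lines up with the experience above (three stock 290s drawing around 897 W on the same unit) and suggests two overclocked cards would fit, though without much headroom, so a larger unit for quieter, cooler operation is a reasonable preference.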


----------



## fateswarm

Quote:


> Originally Posted by *bond32*
> 
> Yes. I ran that PSU with three 290s and a 4770K, with the 4770K heavily overclocked and the 290s at stock. It ran just fine. Software readings totaled about 897 watts of draw, though I know that isn't the most accurate.
> 
> I have since upgraded to the 1300 W version, but the 1000 W unit would be perfect for two cards.


Alright. Thanks! I'd love the headroom, since a lightly loaded PSU runs quieter and lasts longer, but it's good to know I'll probably be fine if I upgrade to 2x in the future.


----------



## bond32

Quote:


> Originally Posted by *fateswarm*
> 
> Alright. Thanks! I'd love the headroom, since a lightly loaded PSU runs quieter and lasts longer, but it's good to know I'll probably be fine if I upgrade to 2x in the future.


Actually, I did a little research. The 1300 W G2 and the 1000 W are very close; however, they do have different fans, and the 1300 W model's is much more powerful. One of the things I noticed when I installed my 1300 W unit after taking the 1000 out was the difference in noise levels. It wasn't absurd by any means, but after almost a year I know what that 1000 W unit sounds like, so I noticed the change.


----------



## fateswarm

Quote:


> Originally Posted by *bond32*
> 
> Actually, I did a little research. The 1300 W G2 and the 1000 W are very close; however, they do have different fans, and the 1300 W model's is much more powerful. One of the things I noticed when I installed my 1300 W unit after taking the 1000 out was the difference in noise levels. It wasn't absurd by any means, but after almost a year I know what that 1000 W unit sounds like, so I noticed the change.


I would imagine, since you were near capacity. I hope two cards won't get that close when overclocked.


----------



## Dasboogieman

Quote:


> Originally Posted by *fateswarm*
> 
> Alright. Thanks! I'd love the headroom, since a lightly loaded PSU runs quieter and lasts longer, but it's good to know I'll probably be fine if I upgrade to 2x in the future.


If you keep those cards on air cooling, your PSU will sound like a church mouse in comparison. However, the EVGA G2 series may be a tad noisy if you intend to go for silent watercooling.

This wouldn't be overclock.net if we didn't cut things close.


----------



## fateswarm

lol... I just noticed TigerDirect had a 290 Tri-X bundled with a free cheap 750 W PSU. So many people might have failed trying them in their flimsy 400 W computers...


----------



## anubis1127

Quote:


> Originally Posted by *fateswarm*
> 
> lol.. I just noticed tiger direct had a 290 Tri-x bundled with a free cheap 750W PSU. So many people might have failed trying them on their flimsy 400W computers..


"I just got dis bomb 290 veedyo card for my Dell XPS, but now I can't start my PC...wot gives???"


----------



## Dasboogieman

Quote:


> Originally Posted by *fateswarm*
> 
> lol.. I just noticed tiger direct had a 290 Tri-x bundled with a free cheap 750W PSU. So many people might have failed trying them on their flimsy 400W computers..


I'd like to see those bundled PSUs benchmarked. A lot of the cheaper units are overrated as hell, the 290 might not even boot in the best case scenario. Worst case we might see fireworks.


----------



## rdr09

Quote:


> Originally Posted by *anubis1127*
> 
> "I just got dis bomb 290 veedyo card for my Dell XPS, but now I can't start my PC...wot gives???"


Does the XPS still have the original PSU? What PSU is it?


----------



## anubis1127

Quote:


> Originally Posted by *rdr09*
> 
> is the XPS still have the original psu? what psu is it?


I was kidding. Lame attempt at humor.


----------



## rdr09

Quote:


> Originally Posted by *anubis1127*
> 
> I was kidding. Lame attempt at humor.


i was going to say . . . not anubis.


----------



## PachAz

Lol at "veedyo"....


----------



## HoneyBadger84

Submission Link (I think this is what you need :-\)
http://www.techpowerup.com/gpuz/26u2v/

Is that good enough for my submission or ? Here's a screenshot (of the wrong tab, I think  oops):


Spoiler: Warning: Spoiler!







I have 4 cards in total, two of which I'm trying to resell atm:

1 Gigabyte WindForce R9 290X (aftermarket cooler from Gigabyte of course)
1 VisionTek 900654 R9 290X with their aftermarket cooler that comes with it stock
2 XFX R9 290X Core Editions with Stock Blower Fan/Heatsinks

If you need somethin' else to add me let me know, sorry if I did that wrong :-D


----------



## Kriant

Grant me your wisdom, my fellow members of this forum:
What is the safe voltage limit for 24/7 use on the R9 290 and R9 290X? (I googled around but didn't see any concrete statements)

Thanks in advance.


----------



## kpoeticg

Really depends on your temps. I'd say under 1.3 as long as ALL your temps are in check. Other people can probly give a more accurate answer tho. I personally don't like pushing any of my hw for normal 24/7 browsing ocn and whatnot. I don't OV my 2D profile at all...


----------



## Kriant

Quote:


> Originally Posted by *kpoeticg*
> 
> Really depends on your temps. I'd say under 1.3 as long as ALL your temps are in check. Other people can probly give a more accurate answer tho. I personally don't like pushing any of my hw for normal 24/7 browsing ocn and whatnot. I don't OV my 2D profile at all...


I have +10mV set right now for stable 1050/1300 on my cards; according to GPU-Z and AB, under load they go around 1.2-1.21v. Temp-wise the only thing I'm worried about is the VRM, the core is taken care of by water, so core temps average around 54-56C under load right now (with 80-84F in the room X_X thanks to summer and the heat output from the rig).

I was just wondering, because I want to push them up to 1100ish, but that would require around 1.25v from my preliminary tests.


----------



## Gualichu04

I wonder if an 8-pin plus 6-pin on a single cable is good enough for powering a single R9 290X. I have been getting the black screen issue with one card, and swapped it out with the other one I have to test for the same issue. If the other card is fine, it seems I'll have to RMA the first.


----------



## kpoeticg

I'd pull back the ram and push the core further if you want better performance. Then raise the ram after the core. I keep my 24/7 3D profile ~1075/1275 +20, +10aux. With my 2D profile at stock

I haven't booted up my system since i attached my blocks and hooked up my loop though cuz i wanna flush everything first. I had my loop running for a few months before i redid everything and attached my 290x blocks.

Edit: @Gualichu04 I run both my cards with 8Pin => 8+6Pin. That shouldn't be the issue, but it's worth trying separate cables before rma'ing


----------



## PCSarge

Quote:


> Originally Posted by *kpoeticg*
> 
> I'd pull back the ram and push the core further if you want better performance. Then raise the ram after the core. I keep my 24/7 3D profile ~1075/1275 +20, +10aux. With my 2D profile at stock
> 
> I haven't booted up my system since i attached my blocks and hooked up my loop though cuz i wanna flush everything first. I had my loop running for a few months before i redid everything and attached my 290x blocks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: @Gualichu04
> I run both my cards with 8Pin => 8+6Pin. That shouldn't be the issue, but it's worth trying separate cables before rma'ing


those blocks are sexy, i have one on order for my 290.


----------



## kpoeticg

Thanx, i'm a huge fan of em too. Those blocks are literally the reason i went with reference 290x's

Edit: Trying to decide if i should paint the caps on the cards white or black tho. Or if i should steal BNEG's idea and try to make a shroud


----------



## PCSarge

Quote:


> Originally Posted by *kpoeticg*
> 
> Thanx, i'm a huge fan of em too. Those blocks are literally the reason i went with reference 290x's
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Trying to decide if i should paint the caps on the cards white or black tho. Or if i should steal BNEG's idea and try to make a shroud


so many 90s man......looks so chaotic lol


----------



## kpoeticg

Yeah i was originally just planning on bending it all. But to have all horizontal and vertical lines with all those blocks in such a small space, i had to go with the F/F 90's









It looked a lot cleaner before i added the GPU run



I'm still considering changing the gpu run. I really wanted it to crawl up and across the backplate like that, and the only way i could make everything match up straight was by swinging it back around in a square like that. I might play with it some more though since it's still empty/dry right now...


----------



## PCSarge

Quote:


> Originally Posted by *kpoeticg*
> 
> Yeah i was originally just planning on bending it all. But to have all horizontal and vertical lines with all those blocks in such a small space, i had to go with the F/F 90's
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It looked alot cleaner before i added the GPU run


im worried about....the pump that will be so pissed off at you for those 90s


----------



## kpoeticg

Still worried?


----------



## bond32

Wow... That's some serious pumpage. I'd say you're perfectly fine with that lol...


----------



## kpoeticg

Lol yeah i haven't updated my sig in a while. I got a bunch of changes i need to make

I originally had 2 BP D-Plugs in the loop. Both of em cracked open when i was leaktesting the original 'non-gpu' loop. The first one was on the pumps outlet, and i was leaktesting with Blood-Red X1









The D-Plug just split right in half and i got firehosed in the face. I still have red stains on my walls lolll. The 2nd one was in the PCH => VRM run and just cracked a little before it felt like the sprinklers were on in my apartment


----------



## VSG

That's weird, I have plenty of D-plugs and all of them still operate like new.


----------



## kpoeticg

They were both brand new. It was just the head pressure from the 35x3. Too much for D-Plugs =)

Since i was leak-testing, i had the duty cycle at 100%.


----------



## VSG

Well if you had them unobstructed with those 3 pumps at full speed, I can't blame Bitspower lol. 90 degree fittings really aren't all that restrictive for a single DDC, let alone 2-3.


----------



## kpoeticg

Yeah i don't blame BP at all. The D-Plugs are perfectly fine. Just too much head-pressure from the pump. I went with the 35x3 cuz of all the blocks, not necessarily the 90's. I'm also gonna be adding a 3rd 290x in the loop sometime soon

Edit: I really wasn't thinking clearly when i put the D-Plug on the pumps outlet loll


----------



## devilhead

Quote:


> Originally Posted by *kpoeticg*
> 
> Yeah i don't blame BP at all. The D-Plugs are perfectly fine. Just too much head-pressure from the pump. I went with the 35x3 cuz of all the blocks, not necessarily the 90's. I'm also gonna be adding a 3rd 290x in the loop sometime soon
> 
> Edit: I really wasn't thinking clearly when i put the D-Plug on the pumps outlet loll


why not parallel thru those 2x290x? you have more than enough pumps for that

those cards get hot, the second one will get hotter

i used parallel for my 2x290X and 2xD5

edit: even aesthetics will be better


----------



## VSG

As a reference, a single 35x got me 1 GPM with 1 CPU block, 1 set of 2 motherboard blocks, 2 GPU blocks, 4 QDs and plenty of angled fittings no problem. Ram blocks do add restriction but with the motherboard blocks already in there, they won't do a whole lot more. I would love to see what the flow rate is with the 3rd block installed/loop is finished.


----------



## kpoeticg

Quote:


> Originally Posted by *devilhead*
> 
> why not parallel thru those 2x290x? you have more than enough pumps for that
> 
> 
> 
> 
> 
> 
> 
> those cards get hot, the second one will get hotter
> 
> 
> 
> 
> 
> 
> 
> i used parallel for mine 2x290X and 2xD5


Doesn't parallel give "less" pressure to the cards? Meaning serial would give better temps?

Quote:


> Originally Posted by *geggeg*
> 
> As a reference, a single 35x got me 1 GPM with 1 CPU block, 1 set of 2 motherboard blocks, 2 GPU blocks, 4 QDs and plenty of angled fittings no problem. Ram blocks do add restriction but with the motherboard blocks already in there, they won't do a whole lot more. I would love to see what the flow rate is with the 3rd block installed/loop is finished.


I got 0.8GPM at full duty cycle with 1 35x and just my CPU Block. I have an AC High Flow Non-USB, so i'll post the GPM when i add the 3rd block

Edit: I was using an Apogee Drive II back then. So it was literally Res => Apogee => Rad => Flowmeter. 0.8GPM was 100% duty cycle


----------



## VSG

Quote:


> Originally Posted by *kpoeticg*
> 
> Doesn't parallel give "less" pressure to the cards? Meaning serial would give better temps?
> 
> I got 0.8GPM at full duty cycle with 1 35x and just my CPU Block. I have an AC High Flow Non-USB, so i'll post the GPM when i add the 3rd block
> 
> Edit: I was using an Apogee Drive II back then. So it was literally Res => Apogee => Rad => Flowmeter. 0.8GPM was 100% duty cycle


Ya, parallel is good for overall flow but reduces flow per card. It does mean similar card temps but the average may well be higher than serial flow. I had my cards in parallel but I will likely go serial in my rebuild. The mcp35x is more powerful than the APG II so those numbers seem spot on. Anyway the discussion is going off topic now.
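The parallel-vs-serial trade-off described above can be sketched with a toy model: treat each section's pressure drop as R·Q² and hold pump head fixed. All head and resistance values below are arbitrary illustration units, not measurements of any real loop:

```python
import math

# Toy hydraulic model of serial vs parallel GPU blocks.
# Assumes pressure drop = R * Q^2 and a fixed pump head.

def loop_flow(head, r_rest_of_loop, r_gpu_section):
    """Total loop flow for the given quadratic resistances."""
    return math.sqrt(head / (r_rest_of_loop + r_gpu_section))

HEAD, R_LOOP, R_BLOCK = 4.0, 1.0, 0.5   # made-up illustration units

# Serial: both blocks in line, resistances add; each card sees full flow.
q_serial = loop_flow(HEAD, R_LOOP, 2 * R_BLOCK)
# Parallel: two equal branches, section resistance drops to R/4,
# but each card only sees half the total flow.
q_parallel = loop_flow(HEAD, R_LOOP, R_BLOCK / 4)
per_card_parallel = q_parallel / 2

print(round(q_serial, 2))           # per-card flow, serial
print(round(per_card_parallel, 2))  # per-card flow, parallel
```

In this crude model parallel raises total loop flow but roughly halves what each card sees, which is why serial can win on per-card flow when you have pump to spare.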


----------



## devilhead

Quote:


> Originally Posted by *kpoeticg*
> 
> Doesn't parallel give "less" pressure to the cards? Meaning serial would give better temps?
> 
> I got 0.8GPM at full duty cycle with 1 35x and just my CPU Block. I have an AC High Flow Non-USB, so i'll post the GPM when i add the 3rd block
> 
> Edit: I was using an Apogee Drive II back then. So it was literally Res => Apogee => Rad => Flowmeter. 0.8GPM was 100% duty cycle


with that pump power i think you just need to go parallel, you will have more than enough pressure

and the cards will have the same temperature

even if you go 3x290X


----------



## kpoeticg

The 35x is the pump used in the Apogee. One of my 35x's came outta my Apogee Drive II

I agree about the topic tho. Sorry for accidentally threadjacking everybody

Quote:


> Originally Posted by *devilhead*
> 
> with that pump power i think just need to go parallel, you will have more than enough pressure
> 
> 
> 
> 
> 
> 
> 
> and the cards will have same temperature
> 
> 
> 
> 
> 
> 
> 
> even if you will go 3x290X


You have it backwards. Less pump benefits from parallel. More pump allows serial


----------



## devilhead

Quote:


> Originally Posted by *kpoeticg*
> 
> The 35x is the pump used in the Apogee. One of my 35x's came outta my Apogee Drive II
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I agree about the topic tho. Sorry for accidentally threadjacking everybody
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You have it backwards. Less pump benefits from parallel. More pump allows serial


i'm just talking about graphics card heat. i have tested parallel and serial: the first card is ok, the second gets warmer by like 3C (vrm 5C), and the third card by 5C (vrm 7C)..... those cards generate a lot of heat, so if you're thinking about some overclock it makes more sense to just go parallel

you can test it by yourself


----------



## kpoeticg

It's because you're getting better flowrate with parallel. Which moves the water to your rads quicker. Having the 35x3 should mean better temps from serial and higher 'un-needed' flowrate from parallel.

If i only had one pump, i'd get better temps with parallel. Usually people do the first 2 in parallel with the 3rd in serial. With the 3 pumps, i should benefit more from serial (in theory anyway). I'll test when i get my 3rd card tho


----------



## fateswarm

Hehe. The more I see convoluted pipes, hearing of dead m/bs because of leaks and thinking of my random clumsiness, the more I'm convinced I did a good job ordering air cooling this time around.


----------



## anubis1127

Quote:


> Originally Posted by *fateswarm*
> 
> Hehe. The more I see convoluted pipes, hearing of dead m/bs because of leaks and thinking of my random clumsiness, the more I'm convinced I did a good job ordering air cooling this time around.


Air-cooling FTW!!!


----------



## Kittencake

I can't wait for the Corsair HG10 cooling bracket.. I have an H60 all lined up for it


----------



## kpoeticg

Quote:


> Originally Posted by *fateswarm*
> 
> Hehe. The more I see convoluted pipes, hearing of dead m/bs because of leaks and thinking of my random clumsiness, the more I'm convinced I did a good job ordering air cooling this time around.


Lollll, this is my first wc build so i'm pretty thorough about leak-testing (from an external power brick). I've had to rma my mobo and 3 290x's along the way, but none of em had water damage. It's just good that i test the hell out of everything before voiding warranties


----------



## cephelix

Quote:


> Originally Posted by *kpoeticg*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I'd pull back the ram and push the core further if you want better performance. Then raise the ram after the core. I keep my 24/7 3D profile ~1075/1275 +20, +10aux. With my 2D profile at stock
> 
> I haven't booted up my system since i attached my blocks and hooked up my loop though cuz i wanna flush everything first. I had my loop running for a few months before i redid everything and attached my 290x blocks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: @Gualichu04
> I run both my cards with 8Pin => 8+6Pin. That shouldn't be the issue, but it's worth trying separate cables before rma'ing


I for one like the look of all the straight lines and 90deg angles....great job man!


----------



## kpoeticg

Thanx =)


----------



## devilhead

Quote:


> Originally Posted by *kpoeticg*
> 
> It's because you're getting better flowrate with parallel. Which moves the water to your rads quicker. Having the 35x3 should mean better temps from serial and higher 'un-needed' flowrate from parallel.
> 
> If i only had one pump, i'd get better temps with parallel. Usually people do the first 2 in parallel with the 3rd in serial. With the 3 pumps, i should benefit more from serial (in theory anyway). I'll test when i get my 3rd card tho


yes, test it by yourself and you will see







and is not so good to have a lot pressure in loop, it can causes some leaks "from that big pressure"







if you will go for 3x290x or even 4X290X so is better to make separate loop


----------



## Beezleybuzz

Hey guys, joined awhile ago, just wanting to get more involved on the site. I've actually got 2 Asus R9 290 DCUII OC's but am currently just using one as I was ready to pull out my hair with driver and heat issues.

One card on its own OCs beautifully and runs quite cool, but with 2 that's an entirely different story.

I read a lot about higher VRM temps with my card however the highest I have seen with hours of gaming or heaven/valley runs is high 70C's.

Currently running 1100/6000 on my card with 1.337v and 150% power target using GPU Tweak. Gaming for hours and hours GPU max temp is around 75-81C.


----------



## bond32

Gpu blocks should be run in parallel in about 90% of the situations. I'd say with 3 ddc pumps in series it won't matter, they will produce enough head to overcome any block setup in parallel or series.

Either way, more pumps do not mean higher pressure. More blocks do. Pumps generate flow not pressure.
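One way to picture the pump discussion: a loop settles at the operating point where the pump curve (head falling with flow) crosses the system curve (drop rising with flow). A sketch with made-up curve numbers loosely shaped like a small DDC, purely for illustration:

```python
import math

# Operating point of a loop: solve n*(H0 - k*Q^2) = R*Q^2 for Q,
# i.e. where the (series-stacked) pump curve meets the system curve.
# H0, k, and R are assumed illustration values, not real pump specs.

def operating_flow(n_pumps, max_head, pump_slope, loop_resistance):
    return math.sqrt(n_pumps * max_head /
                     (n_pumps * pump_slope + loop_resistance))

H0, K = 4.7, 2.0   # assumed single-pump curve: max head H0, slope k

print(round(operating_flow(1, H0, K, 3.0), 2))  # one pump, restrictive loop
print(round(operating_flow(3, H0, K, 3.0), 2))  # three in series: head triples, flow doesn't
```

Stacking pumps in series shifts the curve up with diminishing returns — in this sketch, triple the pumps buys only about 30% more flow in a restrictive loop.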


----------



## kpoeticg

35x's have a tendency to create extra pressure for sure. D5's are better for pure flowrate. I'm building in a small-ish chassis tho, so the 35x3 made more sense for me


----------



## PCSarge

i cant help myself... i must post pictures...........





and just because i like my new wallpaper so much


----------



## Aussiejuggalo

Quote:


> Originally Posted by *PCSarge*


Found the fanboy









Looks cool tho


----------



## darkelixa

Hello,

I have a Gigabyte R9 290, and when I game in Wildstar/Crysis 3 the PC will just shut down for like 5 seconds and then reboot, no blue screen or anything, so I'm not 100% sure what's wrong with the PC; it's driving me nuts to say the least. I have just replaced the mainboard with an H87M-D3H from an ASRock Z87 OC Formula. The CPU temps in stress testing have dropped from 88C down to 68C with the new mainboard, as the vcore is nowhere near as high as it was on the ASRock.

Specs of the pc
i7 4770k
h87m-d3h
8gb ddr3 corsair xmp ram
fractal xl r2
2x 140mm fans at top, 1x 140mm exhaust, 1x 140mm bottom, 1x 140mm front
750W FSP Aurum Gold PSU

Just not sure if I should buy a whole new rig or hope that it's just a GPU issue. I have tried the GPU with my AMD 8350 with the same issue, and have also tried a 750W Corsair PSU with the same issue


----------



## PCSarge

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Found the fanboy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks cool tho


actually i won that at the LAN, that was the backdrop for taking the winning teams pictures
Quote:


> Originally Posted by *darkelixa*
> 
> Hello,
> 
> I have a gigabyte r9 290 and when I game in wildstar/crysis 3 the pc will just shut down for like 5 seconds and then reboots , no blue screen or anything so Im not 100% sure whats wrong with the pc, its driving me nuts to say. I have just replaced the mainboard to a h87m-d3h from an asrock z87 oc formula. The temps from the cpu at stress testing have dropped from 88 c down to 68c with the new mainboard as the vcore is no where near as high as what it was on the asrock.
> 
> Specs of the pc
> i7 4770k
> h87m-d3h
> 8gb ddr3 corsair xmp ram
> fractal xl r2
> 2,140mm fans at top, 1 ,140mm exhast, 1 140mm bottom, 1 140mm front
> 750 fsp aurum psu gold
> 
> Just not sure if i should buy a whole new rig or hope that its just a gpu issue, i have tried the gpu with my amd 8350 with the same issue. Have also tried a 750w corsair psu with the same issue


sounds almost like a lack of power or an unstable overclock to me


----------



## anubis1127

Wow, @PCSarge, that is some dedication!


----------



## anubis1127

Quote:


> Originally Posted by *PCSarge*
> 
> sounds almost like a lack of power or an unstable overclock to me


Almost sounds like a R9 290 black screen issue to me.


----------



## Beezleybuzz

That sounds like a power issue to me...


----------



## darkelixa

I haven't overclocked the GPU at all, and I thought a 750W FSP Aurum Gold PSU would have been more than enough power for the GPU, unless the new house I have moved into has bad power or something


----------



## darkelixa

Power issues as in a bad psu? Or power from the house?


----------



## Beezleybuzz

From the PSU? I dunno dude... Could it be that your new mobo was installed a bit wrong, possibly a short/grounding out somewhere? Since it's happening with both PSU's that would be the place I would start.


----------



## PCSarge

Quote:


> Originally Posted by *anubis1127*
> 
> Wow, @PCSarge
> , that is some dedication!


thanks

i dont want to say it but, i was a long time ATI card user before AMD bought them. i have cards from the ATI 3D Rage Pro PCI 8MB all the way up to the 5770

i got stuck on ATI as a kid. i had tried nvidia stuff, always had issues one way or another with it. im sure many nvidia guys say the same about ATI/AMD

the worst nvidia experience i had cost me a mobo, a power supply, and 2 week-old EVGA 9800GTX+ cards.

had SLI rolling on a core 2 duo E7400 and a 780i chipset board. everything was good for a while. then i started randomly getting blackscreens in games.

pulled my side panel off with the rig shut down to check out my cards and make sure something hadnt fallen off. but it was worse. they had MELTED into the pci-e lanes on the mobo. i found out about 3 days later via OCN that it was a bad driver set, the infamous driver that stuck the fan speed at 40% forever. EVGA refused RMA, so i was out a good chunk of money there. knowing the cards were stuck, i plugged everything back in and figured id use my gimped rig until i could afford a new board and cards. pressed the power button. the PSU made a popping sound, i immediately unplugged it. corsair however RMA'd that HX750. gigabyte told me to shove it when i wanted to RMA the mobo. the E7400 survived somehow. so did my RAM. from then on i never even looked at nvidia at the places i buy parts from.

not that i wanted to share my horror story but, someone called me a fanboy earlier, so i figured id explain


----------



## Beezleybuzz

Similar experience but far less damage. I had a BFG Geforce FX 5800... It literally decided to catch on fire after about a week. I was very new to PC hardware at the time, found an ATI on sale, never a problem, and have stayed with them since.


----------



## anubis1127

Quote:


> Originally Posted by *PCSarge*
> 
> thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i dont want to say it but. i was a long time ATI card user before AMD bought them. i have cards from the ATI 3D Rage Pro PCI 8MB card, all the way up to the 5770
> 
> i got stuck on using ati as a kid..had tried nvidia stuff.always had issues one way or another with it. im sure many nvidia guys say the same about ATI/AMD
> 
> the worst nvidia experience i had, costed me a mobo a power supply and 2 9800GTX+ EVGA cards that were a week old.
> 
> had SLI rolling on a core 2 duo E7400 and 780i chipset board. all was good everything for a while. then i started randomly getting blackscreens randomly in games.
> 
> pulled my side panel off with the rig shut down to check out my cards and make sure something hadnt fallen off. but it was worse. they had MELTED into the pci-e lanes on the mobo. i found out about 3 days later via OCN that it was a bad driver set the infamous driver that stuck the fan speed at 40% forever. EVGA refused RMA, so i was out a good chunk of money there. then knowing the cards were stuck i plugged everything back in and figured ill use my gimped rig until i can afford a new board and cards. press power button. PSU makes a popping sound, i immediately unplug it. corsair however RMA'd that HX750. gigabyte told me to shove it when i wanted to RMA the mobo. the E7400 survived somehow. so did my RAM. from then on. i never even looked at Nvidia at the places i buy parts from.
> 
> not that i wanted to share my horror story but, someone called me a fanboy earlier, so i figured id explain


Cool man, yeah I used to use strictly ATI cards back in the day, 2001-2002, then I got out of PCs for about 5-6 years or so, and have been using a mix of both since. When I came back to PCs I got a 8600GT, it blew up like 2 years later (mainly due to it got filthy dusty), and got a hd4650 after that.

Right now my main gaming GPUs are two R9 270s in CFX, but I also have a GTX 780 hanging out in another rig folding. I have no issues with either, but to each their own.


----------



## PCSarge

Quote:


> Originally Posted by *anubis1127*
> 
> Cool man, yeah I used to use strictly ATI cards back in the day, 2001-2002, then I got out of PCs for about 5-6 years or so, and have been using a mix of both since. When I came back to PCs I got a 8600GT, it blew up like 2 years later (mainly due to it got filthy dusty), and got a hd4650 after that.
> 
> Right now my main gaming GPUs are two R9 270s in CFX, but I also have a GTX 780 hanging out in another rig folding. I have no issues with either, but to each their own.


i went from a 270 that wouldnt OC to my 290, which ive had since saturday...worlds of difference

when i had that problem back then, it pissed me off how a company that big would release a driver that badly made. ATI has never been that bad in my recollection


----------



## darkelixa

I did triple check that all of my standoffs were correct, and if it was shorting out, wouldn't it short out as I was trying to start the PC up?


----------



## anubis1127

Quote:


> Originally Posted by *PCSarge*
> 
> i went from a 270 that wouldnt OC to my 290 which ive had since saturday...6 worlds over a difference
> 
> when i had that problem back then. it pissed me off how a company that big would release a faulty drive that was that badly made. ATI hasnt been that bad ever in my memory recall


I like my little baby R9 270s, they play every PC game I care for well @ 1200p, and they crank out the PPD for folding; I was getting 178k PPD between the two earlier today.

Not to say I wouldn't take an R9 290. I had one, briefly, but it was a POS, so I returned it. Overheating VRMs at stock, and performance lower than other 290s.


----------



## Sgt Bilko

Quote:


> Originally Posted by *PCSarge*
> 
> i went from a 270 that wouldnt OC to my 290 which ive had since saturday...6 worlds over a difference
> 
> when i had that problem back then. it pissed me off how a company that big would release a faulty drive that was that badly made. ATI hasnt been that bad ever in my memory recall


MX 440 agp, 6200 agp, CF 4850's, CF 6970's, single 7970, single 290x, CF 290's


----------



## Dasboogieman

Quote:


> Originally Posted by *Kriant*
> 
> Grant my your wisdom my fellow members of this forum:
> What is the safe voltage limit for 24/7 use on R9 290 and R9 290x? (I googled around by didn't see any concrete statements)
> 
> Thanks in advance.


I'd say proportional to your loading and cooling. Miners are on one end where they have 100% loading but undervolted (i.e. constant 150-175W loading on the core) while on the other hand, you have bench markers who push 1.4V+ for very brief loads. There have been many reports of miners whose cards became total garbage on the used market despite the fact they were undervolted.

I personally find that +50mV provides the best power-to-heat-to-performance ratio yet is also modest on the VRMs. I'd go no more than +125mV continuously (mining, FAH, etc.) unless you can keep the core and auxiliary components under 60 degrees (the lower leakage will offset the power requirements).
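The reason small voltage bumps bite so hard on heat is the rough CMOS scaling P ∝ f·V². A quick sketch — the stock figures below (1000 MHz, 1.20 V, ~250 W) are assumptions for illustration, not AMD specifications:

```python
# Rough CMOS dynamic-power scaling: P is roughly proportional to f * V^2.
# Stock baseline figures are illustrative assumptions only.

def scaled_power(p_stock, f_stock, v_stock, f_new, v_new):
    return p_stock * (f_new / f_stock) * (v_new / v_stock) ** 2

stock = scaled_power(250, 1000, 1.20, 1000, 1.20)  # 250 W baseline
oced  = scaled_power(250, 1000, 1.20, 1100, 1.25)  # +100 MHz at +50 mV

print(round(oced))  # ~298 W: why VRM temps climb fast with voltage
```

Just +100 MHz with +50mV pushes the estimate up by roughly 50 W, all of which lands on the cooler and the VRMs.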
Quote:


> Originally Posted by *Beezleybuzz*
> 
> Similar experience but far less damage. I had a BFG Geforce FX 5800... It literally decided to catch on fire after about a week. I was very new to PC hardware at the time, found an ATI on sale, never a problem, and have stayed with them since.


I was that sucker who bought the infamous Pentium 4 + GeForce 4MX....them framerates, they were so bad, so very bad







. The fiasco continued when I bought a GeForce 6200LE, get this, I thought the LE meant something special


----------



## PCSarge

Quote:


> Originally Posted by *anubis1127*
> 
> I like my little baby R9 270s play every PC game I care for well @ 1200p, and they crank out the PPD for folding, I was getting 178k PPD between the two earlier today.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not to say I wouldn't take a r9 290. I had one, briefly, but it was a POS, so I returned it. Overheating vrms at stock, and crappy performance lower than other 290s.


yeah this one is a press card...mustve been cherry picked. it does heaven @ stock volts at 1150/1400, and the max i hit was 79C on 80% fan


----------



## PCSarge

Quote:


> Originally Posted by *Dasboogieman*
> 
> I'd say proportional to your loading and cooling. Miners are on one end where they have 100% loading but undervolted (i.e. constant 150-175W loading on the core) while on the other hand, you have bench markers who push 1.4V+ for very brief loads. There have been many reports of miners whose cards became total garbage on the used market despite the fact they were undervolted.
> 
> I personally, find that +50mV provides the best power to heat to performance ratio yet is also modest on the VRMs. I'd go no more than +125mV continuously (mining, FAH etc etc) unless you can keep the core and auxilliary components under 60 degrees (the lower leakage will offset the power requirements).
> I was that sucker who bought the infamous Pentium 4 + GeForce 4MX....them framerates, they were so bad, so very bad
> 
> 
> 
> 
> 
> 
> 
> . The fiasco continued when I bought a GeForce 6200LE, get this, I thought the LE meant something special


GEforce Loser Edition?


----------



## anubis1127

xD


----------



## Mega Man

Quote:


> Originally Posted by *devilhead*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kpoeticg*
> 
> Yeah i don't blame BP at all. The D-Plugs are perfectly fine. Just too much head-pressure from the pump. I went with the 35x3 cuz of all the blocks, not necessarily the 90's. I'm also gonna be adding a 3rd 290x in the loop sometime soon
> 
> Edit: I really wasn't thinking clearly when i put the D-Plug on the pumps outlet loll
> 
> 
> 
> why not parallel thru those 2x290x? you have more than enough pumps for that
> 
> 
> 
> 
> 
> 
> 
> those cards get hot, the second one will get hotter
> 
> 
> 
> 
> 
> 
> 
> i used parallel for mine 2x290X and 2xD5
> 
> 
> 
> 
> 
> 
> 
> 
> edit: even aesthetics will be better
Click to expand...

hahaha i push 4 mcp35xs ( 2x mcp35x2 )


----------



## Beezleybuzz

What drivers are you using? I've heard some people having a lot of trouble with 14.6 and sometimes even 14.4. The 13.12 drivers seem to be the most stable (although you won't be able to use Mantle).


----------



## nightfox

Quote:


> Originally Posted by *PCSarge*
> 
> GEforce Loser Edition?


this made me laugh a lot


----------



## darkelixa

I was using the latest drivers with my Gigabyte R9 290. Now using the GTX 770 in the machine; so far, no crashes


----------



## Dasboogieman

Quote:


> Originally Posted by *PCSarge*
> 
> GEforce Loser Edition?


Lol yeah, pretty much . Worst $80 I've ever spent. I installed it and I was like wut!?!!??? This is marginally faster than the GeForce 4MX


----------



## buddatech

I absolutely love my new cards but the stuttering is driving me nuts. It never happens right away, maybe only after two hours of gaming, and only when/after gaming, no matter the game. Currently using 14.4 but same difference with 14.6. When I exit the game soon after the stuttering starts, my desktop/mouse is stuttering too, and the only way to make it stop is to reboot the PC. CPU, MB, and GPU VRM 1 and 2 temps are great: CPU high 50s/low 60s, MB low to mid 40s, top GPU mid 70s with VRMs in the high 60s, bottom GPU low 70s with VRMs in the high 60s, so temp is not the problem. My ambients range from 20/21C up to 25/26C


----------



## Yvese

Hmm so I noticed some rattling coming from my GPU. Did some tests and the rattling occurs when the fan speed is between 38-50%. Anything below or higher than that is completely fine.

Should I ignore this? My GPU fan is set on a fan curve and set to 60% when it hits 60C so I'd never hear it while gaming ( though I already wear a headset anyway ). I'd only hear the rattling when the fan ramps up/down.

Has anyone else experienced this? Does this mean the fan is bad even though it only rattles at certain rpms? RMA time?


----------



## Mega Man

sounds like it is just hitting the right freq to transfer vibrations.

if it only happens at certain speeds, i wouldnt worry about it


----------



## anubis1127

One of my old 7970 coolers would rattle a bit at certain speeds, it was fine though.

I wouldn't worry about it much.


----------



## Gualichu04

What would cause the computer to crash, with both monitors showing moving pixelated images that change colors etc., which go away then come back until I force-restart the PC? I am using an 8-pin to 8+6-pin power cable for one card, and the other card did the same thing. I was doing Folding@home when it crashed, and have had it happen while gaming also.


----------



## Beezleybuzz

Unstable OC?


----------



## Gualichu04

Never overclocked the GPUs yet, just the CPU.


----------



## pdasterly

What psu?


----------



## Gualichu04

Corsair AX1200i, and I am just now trying an 8-pin to 8+6 for the 8-pin and an 8-pin to 8+6 for the 6-pin power to see if that's the issue. But I read elsewhere that an 8-pin to 8+6-pin per card is enough power.


----------



## pdasterly

My EVGA 1300 had the same problem. It's the amps, not the watts; try running each power plug separately (use 4 plugs)
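To put rough numbers on the amps-vs-watts point, here is a back-of-the-envelope sketch. The 300 W per-card figure and the even connector split are assumptions for illustration, not measurements of any specific card:

```python
# Back-of-the-envelope 12 V current check for a dual-290X setup.
# The 300 W per-card draw is an assumed ballpark, not a measurement.

def rail_current(card_watts, n_cards, rail_volts=12.0):
    """Total 12 V current demanded by the GPUs alone."""
    return card_watts * n_cards / rail_volts

def per_connector(card_watts, connectors_per_card=2, rail_volts=12.0):
    """Current through each PCIe power connector, assuming the PCIe
    slot supplies its 75 W share and the cables split the rest evenly."""
    slot_watts = 75.0
    cable_watts = card_watts - slot_watts
    return cable_watts / connectors_per_card / rail_volts

total = rail_current(300, 2)   # both cards together on the 12 V rail
each = per_connector(300)      # one connector's share on a single card
print(f"{total:.1f} A total, {each:.1f} A per connector")
```

The point being made above: a 1200-1300 W unit has the wattage to spare, but a daisy-chained 8-pin to 8+6 adapter sends both connectors' current down one physical cable, roughly doubling the load on that run, so four separate plugs keep each cable's amperage down (nominal connector ratings: 6-pin 75 W, 8-pin 150 W).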


----------



## Gualichu04

Quote:


> Originally Posted by *pdasterly*
> 
> My EVGA 1300 had the same problem. It's the amps, not the watts; try running each power plug separately (use 4 plugs)


So two plugs per GPU. I was really hoping an 8-pin to 8+6 would be enough, since it is not easy to route 2 extra wires per GPU. I am currently only using one GPU since I had the black screen issue with the other; I am testing the system day to day before using both cards and putting them under water cooling.


----------



## pdasterly

290x is power greedy


----------



## Gualichu04

I will test it with two separate plugs and report back if it has any issues. Thanks for the help. Just have to manage more cables now. I just do not understand how it can be fine for hours, then all of a sudden black screen or crash with random pixelated colors. Must just mean it needs more power then.


----------



## pdasterly

Read post 24404
Mine ran fine until I added more ram


----------



## Yvese

Quote:


> Originally Posted by *anubis1127*
> 
> One of my old 7970 cooler's would rattle a bit at certain speeds, it was fine though.
> 
> I wouldn't worry about it much.


Quote:


> Originally Posted by *Mega Man*
> 
> sounds like it is just hitting the right freq to transfer vibrations.
> 
> if it only happens at certain speeds, i wouldnt worry about it


Hmm yea I just did a little test and found that just holding up the card slightly from the tip would stop the rattling significantly. Looks like slight bending is the problem. The card is pretty long and heavy so I suppose it's not surprising. I'll try re-tightening the screws on the pcie slot and hope that works. If not I'll just ignore it.

Still it's weird how it only rattles at certain speeds, and low ones at that. I'm fortunate though since if it rattled at higher speeds then it'd be a big problem


----------



## Mega Man

Quote:


> Originally Posted by *Yvese*
> 
> Quote:
> 
> 
> 
> Originally Posted by *anubis1127*
> 
> One of my old 7970 cooler's would rattle a bit at certain speeds, it was fine though.
> 
> I wouldn't worry about it much.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> sounds like it is just hitting the right freq to transfer vibrations.
> 
> if it only happens at certain speeds, i wouldnt worry about it
> 
> 
> Hmm yea I just did a little test and found that just holding up the card slightly from the tip would stop the rattling significantly. Looks like slight bending is the problem. The card is pretty long and heavy so I suppose it's not surprising. I'll try re-tightening the screws on the pcie slot and hope that works. If not I'll just ignore it.
> 
> Still it's weird how it only rattles at certain speeds, and low ones at that. I'm fortunate though since if it rattled at higher speeds then it'd be a big problem

there is a term for this, my mind is not wanting to remember it though.

another thing you can do is fishing line to hold up the card


----------



## Arizonian

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Submission Link (I think this is what you need :-\)
> http://www.techpowerup.com/gpuz/26u2v/
> 
> Is that good enough for my submission or ? Here's a screenshot (of the wrong tab, I think  oops):
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I have 4 cards in total, two of which I'm trying to resell atm:
> 
> 1 Gigabyte WindForce R9 290X (aftermarket cooler from Gigabyte of course)
> 1 VisionTek 900654 R9 290X with their aftermarket cooler that comes with it stock
> 2 XFX R9 290X Core Editions with Stock Blower Fan/Heatsinks
> 
> If you need somethin' else to add me let me know, sorry if I did that wrong :-D


Congrats - added
















Quote:


> Originally Posted by *Beezleybuzz*
> 
> Hey guys, joined awhile ago, just wanting to get more involved on the site. I've actually got 2 Asus R9 290 DCUII OC's but am currently just using one as I was ready to pull out my hair with driver and heat issues.
> 
> One card on it's own OC's beautifully and runs quite cool, but with 2 that's an entirely different story.
> 
> I read a lot about higher VRM temps with my card however the highest I have seen with hours of gaming or heaven/valley runs is high 70C's.
> 
> Currently running 1100/6000 on my card with 1.337v and 150% power target using GPU Tweak. Gaming for hours and hours GPU max temp is around 75-81C.


Welcome aboard







I didn't see your name on the roster. Post proof submission and I'd be happy to add you.


----------



## Gualichu04

I am having the same pixelated-screen crash using two separate power cables for one GPU. I was running Folding@home and trying to watch a movie. Could the VRM be overheating? I had this same issue with the other card, so I'm not sure what to do. I did try bumping the CPU's vcore up a bit to see if that helps, but I do not get an "overclocking failed" message upon a restart.


----------



## anubis1127

Quote:


> Originally Posted by *Mega Man*
> 
> there is a term for this, my mind is not wanting to remember it though.
> 
> another thing you can do is fishing line to hold up the card


Sagging? That's what I generally call it. Not sure if that's what you were thinking of.


----------



## Mega Man

no, the frequency where vibrations travel through materials. Dealing with fans and VFDs I have had to program fans to skip certain Hz because it would do that. There is a word/phrase, but I can't think of it

edit

Resonance frequency
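This is also why the rattle only shows up in a narrow fan-speed band: the blade-pass excitation frequency sweeps with RPM and only crosses the structure's resonant frequency inside a small window. A sketch of the arithmetic, where the blade count and max RPM are illustrative guesses, not the card's actual specs:

```python
# Why only a narrow fan-speed band rattles: the blade-pass excitation
# frequency sweeps with RPM and only crosses the shroud's resonant
# frequency inside a small window. Blade count and max RPM below are
# illustrative assumptions, not measured values for this cooler.

def pct_to_rpm(pct, max_rpm=3000):
    """Map fan duty % to RPM, assuming a linear response."""
    return max_rpm * pct / 100.0

def blade_pass_hz(rpm, blades=9):
    """Excitation frequency: blades passing a fixed point per second."""
    return rpm / 60.0 * blades

# The reported 38-50% rattle band, expressed as excitation frequencies:
low = blade_pass_hz(pct_to_rpm(38))
high = blade_pass_hz(pct_to_rpm(50))
print(f"rattle band ~{low:.0f}-{high:.0f} Hz")
```

Outside that frequency window the structure isn't driven at resonance, which matches the report that speeds both below and above the band are quiet.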


----------



## anubis1127

Quote:


> Originally Posted by *Mega Man*
> 
> no the frequency where vibrations travel through materials, dealing with fans and VFDs i have had to program fans to skip certain hz as it would do that, there is a word/phrase, but i cant think of it
> 
> edit
> 
> Resonance frequency


Oh, I was referring to how his card was bending down slightly. My bad.


----------



## Gualichu04

Any idea on the random crashing with the random colored pixelated screens on both monitors? Maybe I should remove the 8-pin to 8+6 and just use a regular 8-pin to 6+2 for the card instead of both of the 8-pin to 8+6 cables. I wonder if certain connection slots on the PSU make a difference?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gualichu04*
> 
> Any idea on the random crashing with the random colored pixelated screens on both monitors? Maybe I should remove the 8-pin to 8+6 and just use a regular 8-pin to 6+2 for the card instead of both of the 8-pin to 8+6 cables. I wonder if certain connection slots on the PSU make a difference?


I'm having the same issue; it's only with Folding@home though, for some reason.

I don't have any more PCIe cables to test, but I did lessen the problem by never turning my monitors off


----------



## Gualichu04

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm having the same issue; it's only with Folding@home though, for some reason.
> 
> I don't have any more PCIe cables to test, but I did lessen the problem by never turning my monitors off


The pixelated crash is only with Folding@home, and the monitors are always on. I am in front of the PC when it happens. Maybe the 14.6 beta drivers are the cause.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gualichu04*
> 
> The pixelated crash is only with Folding@home, and the monitors are always on. I am in front of the PC when it happens. Maybe the 14.6 beta drivers are the cause.


I'm using 14.6 as well and it has happened to me whilst using my PC at the same time.


----------



## anubis1127

Hmm, that is odd. I just have little pleb R9 270s, but 14.6 has been working great for me, both folding and gaming.

Are you guys browsing the web when it happens? Maybe turn off GPU acceleration in the browser if its on and if it occurs while browsing.


----------



## Sgt Bilko

Quote:


> Originally Posted by *anubis1127*
> 
> Hmm, that is odd. I just have little pleb r9 270s, but 14.6 have been working great for me, both folding and gaming.
> 
> Are you guys browsing the web when it happens? Maybe turn off GPU acceleration in the browser if its on and if it occurs while browsing.


14.6 works perfectly fine in everything for me; Folding@home works well with it (just passed 2 mil). I'm just having this one problem.

Will turn off hardware acceleration (didn't know it was even on) and report back!


----------



## Gualichu04

Hardware acceleration is off for me.


----------



## 98uk

What sort of clocks are you guys pushing?

I've got a reference Gigabyte R9 290, using the latest BIOS version and a custom fan profile. Luckily the card is voltage unlocked, but so far, at completely stock voltage, I managed to go from 947MHz to 1000MHz... with 75C under load in BF4 (in the middle of a warm summer!). Haven't touched memory yet.

Quite impressed with the reference cooler temps if you ignore the noise it makes (can't hear it with my headphones anyway).

But, what sort of clocks are people with reference cards getting near?


----------



## pdasterly

Sapphire reference card running tri-x oc bios
I can get 1100 without voltage increase using trixx


----------



## Dasboogieman

Quote:


> Originally Posted by *Yvese*
> 
> Hmm so I noticed some rattling coming from my GPU. Did some tests and the rattling occurs when the fan speed is between 38-50%. Anything below or higher than that is completely fine.
> 
> Should I ignore this? My GPU fan is set on a fan curve and set to 60% when it hits 60C so I'd never hear it while gaming ( though I already wear a headset anyway ). I'd only hear the rattling when the fan ramps up/down.
> 
> Has anyone else experienced this? Does this mean the fan is bad even though it only rattles at certain rpms? RMA time?


Is this a Tri-X card?
Yeah, I had the exact same issue; it's caused by resonance between the fan assembly and the heatsink fins. Basically, the fan assembly sits on the heatsink on fairly flexible prongs with no padding.

I posted my solution to this earlier in this thread.

In a nutshell, what you want to do is put some padding in. I cut up an old kitchen glove (those thick rubber ones) into rectangles about 3mm by 50mm; I had about 6 of these rectangles. You first unscrew the fans (there are 3 screws per fan), then slot each rectangle between each fan assembly prong and the metal (be careful not to block too much airflow); you may need a metal pick to lift the prong to slot the rubber in. After you do this, just reattach the fans and voila, resonance gone.

There is a guide on youtube on how to fix this but the youtuber's solution involves dampening the Fan only, not the actual prong that contacts the metal.

If you have any issues, I'll post some photos.

Don't RMA the card, this is a design flaw thus your replacement may be no better.


----------



## fateswarm

aye, resonance, that's the word. It's why some modest earthquakes can completely destroy a certain building but not others. It may reach resonance.


----------



## kizwan

Quote:


> Originally Posted by *kpoeticg*
> 
> Yeah i don't blame BP at all. The D-Plugs are perfectly fine. Just too much head-pressure from the pump. I went with the 35x3 cuz of all the blocks, not necessarily the 90's. I'm also gonna be adding a 3rd 290x in the loop sometime soon
> 
> Edit: I really wasn't thinking clearly when i put the D-Plug on the pumps outlet loll


Little bit drunk maybe?!








Quote:


> Originally Posted by *kpoeticg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *devilhead*
> 
> 
> 
> Doesn't parallel give "less" pressure to the cards? Meaning serial would give better temps?

I read parallel gives better temps.







I think there are a lot of factors in play here.


----------



## PachAz

Hello,

As you all know, my Sapphire R9 290 black-screens/freezes while using 5-50% power limit. Since I bought the card used, the shop can only take into consideration what the manufacturer says about the warranty. The support people experienced the same issues as I did, i.e. black screen/freeze while using the power limit. However, they don't consider this a valid defect, because according to them using the power limit is the same as overclocking, and hence Sapphire won't accept the RMA. The question, though, is whether the power limit is the same as "overclocking". Both I and the support agreed that this was an issue, but they can't give me a new card unless Sapphire specifies what they mean by "overclocking". Personally I find it really absurd, since often the cards would be more stable using the power limit, not the opposite. I still think the GPU is faulty, despite the support managing 22 hours of Valley... with 0% power limit. Anyone have any comments on this?

I'm really upset now with the way the shop handled it. What if I had bought a brand-new card, would they still claim "this is regarded as overclocking" and hence we can't help you? I understand the support people are no geniuses, but this is a really big company in Scandinavia. They were "kind" enough not to charge me postage and additional fees, but I mean come on. Where does Sapphire say that using the power limit voids the warranty? And even if it did, the warranty would be voided as soon as I or the support used that function anyway.

The person I spoke with also made a mistake when he claimed it could be my settings or PSU that was the issue, despite the test computer at the support experiencing the same issue. Lol, sleazy corporate BS. How does one assume the buyer's computer may be the issue if the test computer shows exactly the same thing?

Any sapphire reps here?

Edit: Just talked with sapphire technical support in germany, and they told me that using sapphire tri-x and power limit function in that program, does _not_ void the warranty at all.


----------



## fateswarm

Needless to say, reading the reports on the instability of these cards as a whole makes it obvious any attempt at OC is practically a suicide mission. I'm even starting to consider whether undervolting might be a good idea.


----------



## Sgt Bilko

Quote:


> Originally Posted by *PachAz*
> 
> Hello,
> 
> As you all know, my Sapphire R9 290 black-screens/freezes while using 5-50% power limit. Since I bought the card used, the shop can only take into consideration what the manufacturer says about the warranty. The support people experienced the same issues as I did, i.e. black screen/freeze while using the power limit. However, they don't consider this a valid defect, because according to them using the power limit is the same as overclocking, and hence Sapphire won't accept the RMA. The question, though, is whether the power limit is the same as "overclocking". Both I and the support agreed that this was an issue, but they can't give me a new card unless Sapphire specifies what they mean by "overclocking". Personally I find it really absurd, since often the cards would be more stable using the power limit, not the opposite. I still think the GPU is faulty, despite the support managing 22 hours of Valley... with 0% power limit. Anyone have any comments on this?
> 
> I'm really upset now with the way the shop handled it. What if I had bought a brand-new card, would they still claim "this is regarded as overclocking" and hence we can't help you? I understand the support people are no geniuses, but this is a really big company in Scandinavia. They were "kind" enough not to charge me postage and additional fees, but I mean come on. Where does Sapphire say that using the power limit voids the warranty? And even if it did, the warranty would be voided as soon as I or the support used that function anyway.


You are changing the stock settings......

Didn't you read the little box in CCC?
Quote:


> Originally Posted by *fateswarm*
> 
> Needless to say, reading the reports on the instability of those cards in whole makes it obvious any attempt on OC is marginally a suicide mission. I even start considering if downvolting might be a good idea.


Mine have been pretty solid actually. You should know by now that complaints speak louder than compliments


----------



## cephelix

@Red1776

Just checked the specific model number on mine (V308-022) vs the one found on coolingconfigurator.com (V308-002). Is there any difference? Either way, I've contacted MSI support and hopefully I'll have a reply soon.


----------



## fateswarm

Quote:


> Originally Posted by *Sgt Bilko*
> 
> complaints speak louder than compliments


That's true. But we also have a lot of AMD fans. So, I think I hear both sides in this forum in whole.

More complaints here, more praise in the news section though


----------



## PachAz

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You are changing the stock settings......
> 
> Didn't you read the little box in CCC?
> Mine have been pretty solid actually, You should know by now that complaints speak louder than compliments


I didn't use CCC, I used Sapphire TriXX. Stock settings? Would that also mean that if I changed the fan speed in Sapphire TriXX and the fan didn't work, that would not be a fault, since "Sapphire only guarantees full functionality with stock settings"? This is silly. It's silly as hell if a card doesn't work using 5-10% power limit, and that somehow should void the warranty.


----------



## Sgt Bilko

Quote:


> Originally Posted by *PachAz*
> 
> I didn't use CCC, I used Sapphire TriXX. Stock settings? Would that also mean that if I changed the fan speed in Sapphire TriXX and the fan didn't work, that would not be a fault, since "Sapphire only guarantees full functionality with stock settings"? This is silly. It's silly as hell if a card doesn't work using 5-10% power limit, and that somehow should void the warranty.




If they want to, they can enforce that rule down to the last letter.


----------



## kizwan

Quote:


> Originally Posted by *PCSarge*
> 
> i cant help myself... i must post pictures...........
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and just because i like my new wallaper so much


With that wallpaper you don't need heater in your room anymore.


----------



## kpoeticg

Ba dum bump










Edit: Lovin the build btw, i was asleep when he posted it last night. Love the OCN logo


----------



## PachAz

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 
> 
> If they want to then they can enforce that rule down the the last letter.


Where did I mention that I used CCC? Regardless of that, a card crashing using 5-50% power limit is in my book faulty.


----------



## kpoeticg

Just because you didn't OC with CCC, doesn't mean OC'ing isn't 'voiding' the warranty









When you kill a card tho, i don't think they can really tell how it happened

PS: Any1 else find the whole AMD Poster => Halo Poster => Teddy Bear kinda funny in that pic


----------



## Sgt Bilko

Quote:


> Originally Posted by *PachAz*
> 
> Where did I mention that I used CCC? Regardless of that, a card crashing using 5-50% power limit is in my book faulty.


I used the warning from CCC as it's just a general blanket statement when it comes to overclocking anything, really.

Overclocking = Warranty Void

And if they cannot reproduce the results, then in their book it's not faulty.
Quote:


> Products that become non-functional due to customer improper use.


Quote:


> Originally Posted by *kpoeticg*
> 
> Just because you didn't OC with CCC, doesn't mean OC'ing isn't 'voiding' the warranty
> 
> 
> 
> 
> 
> 
> 
> 
> 
> When you kill a card tho, i don't think they can really tell how it happened
> 
> PS: Any1 else find the whole AMD Poster => Halo Poster => Teddy Bear kinda funny in that pic


^ Yup this

Also, was funny, Teddy Radeon Chief!


----------



## PCSarge

Quote:


> Originally Posted by *kpoeticg*
> 
> Ba dum bump
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Lovin the build btw, i was asleep when he posted it last night. Love the OCN logo


thank you sir. wish i had the rad for my 290 block. im not hooking it up to the single 240 my cpu is on atm

and lol yes. that teddy has been around as long as i have. so its only proper hes on my bed somewhere, even if it isnt made for him to sit on my pillow


----------



## kpoeticg

Yeah. It looks tight


----------



## PCSarge

Quote:


> Originally Posted by *kpoeticg*
> 
> Yeah. It looks tight


had to use 2 45 rotaries to move the tubing high enough at the res for the 290's power connectors to plug in lol

that 290 OCs like a champ on the stock cooler though. 1150/1400 stock volts, ran heaven at that setting got max temp of 78C at 80% fan, and avg fps of 85.7, max of 120.6.

im guessing thats why it was in AMD's lab so long. is le golden card lol


----------



## PachAz

Since when is increasing or decreasing the power limit the same as overclocking? If I'm not wrong, Sapphire only speaks about overclocking, but they don't define what settings in TriXX fall under that category. Also, they could reproduce the fault using the power limit. But first they told me they can't issue a new card because they believed using the power limit voids the Sapphire warranty; then a "female" in the support told me I have to prove that Sapphire regards issues happening while using the power limit as a hardware failure and hence a valid warranty reason. I hate when support change their mind and BS me.

The point was that the card works on stock settings, but not while using the power limit. I sure as hell am not satisfied with a card that can't use 5-50% power limit, and I think the shop is just being silly and stubborn. Then the "female" started to question why I would want to use the power limit anyway. That isn't her business in the first place at all. If Sapphire makes software for their card, sure as hell the cards should work with those functions, even if that would void the warranty. But we still don't know if the power limit voids the warranty. Sapphire support from Germany told me it didn't, but I need to find more people from Sapphire who can agree to that.


----------



## PCSarge

Quote:


> Originally Posted by *PachAz*
> 
> Since when is increasing or decreasing the power limit the same as overclocking? If I'm not wrong, Sapphire only speaks about overclocking, but they don't define what settings in TriXX fall under that category.


since when are we worried about following rules?


----------



## Sgt Bilko

Quote:


> Originally Posted by *PachAz*
> 
> Since when is increasing or decreasing the power limit the same as overclocking? If I'm not wrong, Sapphire only speaks about overclocking, but they don't define what settings in TriXX fall under that category.


That's the point, they are covering their bases.

You should have told them that you never used the software to start with.


----------



## fateswarm

IMO the Tri-X is a pre-overclocked card, and not even mildly pre-overclocked; it's not massive, but it's more than just a touch. I would consider it a card that I have already overclocked decently.


----------



## PachAz

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That's the point, they are covering their bases.
> 
> You should have told them that you never used the software to start with.


And how would they be able to reproduce the problem if I didn't tell them I used TriXX? The problem occurs when using 5-50% power limit.


----------



## kpoeticg

I've only caught a few posts about your issue so i might be missing something, but sounds like maybe you just got some crappy silicon. If the only problem is that your card doesn't OC or OV well, consider selling it and buying another. It's called the silicon lottery for a reason =\


----------



## PachAz

It is a joke when you can't use the power limit; usually the power limit should prevent black screens, not cause them.


----------



## kpoeticg

Ooooh it's a black screen issue =\

There's a million different solutions for that depending on what's really causing it, and sometimes RMA is the only solution. Are you positive it only happens with the power limit raised? Have you spent a good amount of time gaming or watching vids without raising it? Black screens suck, man. Sometimes a fresh Windows install can fix it, depending on the real cause.

Edit: Have you tried using something other than TriXX? Maybe it's not playing nice with your drivers....


----------



## kizwan

Quote:


> Originally Posted by *Beezleybuzz*
> 
> What drivers are you using? I've heard some people having a lot of trouble with 14.6 and sometimes even 14.4. The 13.12 drivers seem to be the most stable (although you won't be able to use Mantle).


What kind of trouble? I use 14.6 without any problem.


----------



## PCSarge

Quote:


> Originally Posted by *kizwan*
> 
> What kind of trouble? I use 14.6 without any problem.


14.6 and 14.4 seem to cause black screen and instability issues for quite a few people, whereas 13.12 is fine, so it must have to do with Mantle in some way.


----------



## PachAz

Quote:


> Originally Posted by *kpoeticg*
> 
> Ooooh it's a black screen issue =\
> 
> There's a million different solutions for that depending what's really causing it. And sometimes RMA is the only solution. Are you positive it only happens with power limit raised? Have you spent a good amount of time gaming or watching vidz without raising it? Black screens suck man. Sometimes a fresh windows install can fix it, depending on the real cause
> 
> Edit: Have you tried using something other than Trixxx? Maybe it's not playing nice with your drivers....


Not really, they only happen when using Valley in combination with the power limit. According to the support they tested 3 hours of FurMark extreme and 22 hours of Valley benchmark with no power limit and it passed. To be honest, I originally got the black screen issues when having no power limit in TriXX, but later on they didn't occur again. It was after I used the power limit that they occurred every time. So in a sense, yes, they did occur using all stock settings, but only a few times, and that is also what makes me worried. But that doesn't matter because support claims the card works "great" on stock settings. I will open a support ticket and I have also mailed some Sapphire tech people about this problem, so I can forward their answers to my shop, which is Komplett.se, a known vendor in Scandinavia.


----------



## Yvese

Quote:


> Originally Posted by *Dasboogieman*
> 
> is this a Tri X card?
> Yeah I had the exact same issue, its caused by resonance between the Fan assembly and the heatsink fins. Basically, the fan assembly sits on the heatsink with fairly flexible prongs with no padding.
> 
> I posted my solution to this earlier in this thread.
> 
> In a nutshell, what you want to do is put some padding in, I cut up an old kitchen glove (those thick rubber ones) in to rectangles about 3mm by 50mm. I had about 6 of these rectangles, you first unscrew the fans (theres 3 screws per fan), I then slotted each rectangle between each fan assembly prong and the metal (be careful not to block too much airflow), you may need a metal pick to elevate the prong to slot the rubber in. After you do this, just reattach the fans and Voila, resonance gone.
> 
> There is a guide on youtube on how to fix this but the youtuber's solution involves dampening the Fan only, not the actual prong that contacts the metal.
> 
> If you have any issues, I'll post some photos.
> 
> Don't RMA the card, this is a design flaw thus your replacement may be no better.


That sounds like a lot of work lol. I think I'll leave it alone though, since thankfully it only happens around 38-50% fan speed, which is easily solved by my fan curve.

Curious though... would a backplate solve it? If so, Sapphire should have added one to this Tri-X model as well and not just the Vapor-X.
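For anyone scripting their own profile, "solved by my fan curve" can be sketched as a curve that simply steps over the rattle band. The curve points and band edges below are made-up illustration, not anyone's actual profile:

```python
# Sketch of a temperature -> fan duty curve that steps over a resonant
# 38-52% band: if interpolation lands inside the band, snap to the
# nearer edge just outside it. All figures are illustrative assumptions.

RATTLE_BAND = (38, 52)   # duty-% range to avoid entirely
CURVE = [(0, 20), (50, 35), (60, 60), (80, 85), (95, 100)]  # (temp C, duty %)

def duty_for_temp(temp_c):
    """Linearly interpolate the curve, then snap out of the rattle band."""
    pts = CURVE
    if temp_c <= pts[0][0]:
        duty = pts[0][1]
    elif temp_c >= pts[-1][0]:
        duty = pts[-1][1]
    else:
        for (t0, d0), (t1, d1) in zip(pts, pts[1:]):
            if t0 <= temp_c <= t1:
                duty = d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
                break
    lo, hi = RATTLE_BAND
    if lo <= duty < hi:
        # Nearer the bottom edge -> drop just below the band, else jump over.
        duty = lo - 1 if duty - lo < hi - duty else hi
    return duty

print(duty_for_temp(55))   # mid-curve value, snapped out of the band
```

The jump between the band edges is abrupt by design: spending any time at an in-band duty cycle is exactly what excites the resonance.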


----------



## wrigleyvillain

So has anyone ever _removed_ an Accelero, more specifically a III from one of these cards? My eBay special came with one but if I keep it I am going to want to get a block and I am wondering/concerned about removing these apparently sturdy heatsinks off the VRAM and VRMs without at best a bunch of gunk left behind and at worst tearing one off.

Hair dryer maybe?


----------



## PCSarge

Quote:


> Originally Posted by *wrigleyvillain*
> 
> So has anyone ever _removed_ an Accelero, more specifically a III from one of these cards? My eBay special came with one but if I keep it I am going to want to get a block and I am wondering/concerned about removing these apparently sturdy heatsinks off the VRAM and VRMs without at best a bunch of gunk left behind and at worst tearing one off.
> 
> Hair dryer maybe?


easier way: run the card in Heaven on ultra for a bit, the heatsinks will heat up, then pull it out and gently twist them off. A hair dryer gives off static, which is bad.


----------



## wrigleyvillain

Thanks and, of course, heating it up through use occurred to me right after I posted. But I also wasn't sure if it would still be 'hot enough' by the time I got my fingers on the sinks.


----------



## PCSarge

Quote:


> Originally Posted by *wrigleyvillain*
> 
> Thanks and, of course, heating it up through use occurred to me right after I posted. But I also wasn't sure if it would still be 'hot enough' by the time I got my fingers on the sinks.


should be, the sinks hold heat for a bit, i had to haul a card out of a WC loop and pull sinks before and they were still warm when i got it out.

run a full heaven session and pull the card. lol


----------



## anubis1127

Quote:


> Originally Posted by *wrigleyvillain*
> 
> Thanks and, of course, heating it up through use occurred to me right after I posted. But I also wasn't sure if it would still be 'hot enough' by the time I got my fingers on the sinks.


Not to mention your fingers might get scorched. Oww.


----------



## wrigleyvillain

Yeah I will touch them lightly and quickly at first…thanks +rep PCSarge.


----------



## PCSarge

Quote:


> Originally Posted by *wrigleyvillain*
> 
> Yeah I will touch them lightly and quickly at first…thanks +rep PCSarge.


no problem bud.


----------



## Beezleybuzz

You don't see that message if you OC in something like GPU tweak. You're being pedantic.


----------



## Hl86

Changed the Tri-X cooler to a Kraken; I reached 1200MHz core, 70-75C core and 80-85C on VRM1. Win/win I say.


----------



## Gualichu04

I still have no idea what to do about the computer crashing with a colored, moving, pixelated screen when trying to use Folding@home and doing other tasks. Only using one GPU for now, with separate power cables for the 6-pin and 8-pin. I am going to try connecting the 4-pin molex on the Asus RIV BE and see if that helps.


----------



## anubis1127

Quote:


> Originally Posted by *Gualichu04*
> 
> I still have no idea what to do about the computer crashing with a colored, moving, pixelated screen when trying to use Folding@home and doing other tasks. Only using one GPU for now, with separate power cables for the 6-pin and 8-pin. I am going to try connecting the 4-pin molex on the Asus RIV BE and see if that helps.


Try starting up a thread in the Folding@home section, see if any of the guys folding on R9 290/Xs can chime in with their experience. I know we have a few. @Krusher33 , @neurotix , @sayaman22 come to mind off the top of my head.

I've never heard of that issue. Can you take a pic of it to help describe it?


----------



## Gualichu04

Quote:


> Originally Posted by *anubis1127*
> 
> Try starting up a thread in the Folding@home section, see if any of the guys folding on R9 290/Xs can chime in with their experience. I know we have a few. @Krusher33 , @neurotix , @sayaman22 come to mind off the top of my head.
> 
> I've never heard of that issue. Can you take a pic of it to help describe it?


When it happens again I will take a picture. It seems quite random; till then I will just try connecting the 4-pin molex on the motherboard and hope that helps.


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *Gualichu04*
> 
> I still have no idea what to do about the computer crashing with a colored, moving, pixelated screen when trying to use Folding@home and doing other tasks. Only using one GPU for now, with separate power cables for the 6-pin and 8-pin. I am going to try connecting the 4-pin molex on the Asus RIV BE and see if that helps.


I don't do Folding@home but that sounds exactly like what happened with my 290 (reference PowerColor) playing World of Tanks (very poorly optimized) and Tropico 5. I could play BF4 all day long, 64-player, etc.; I could run Heaven, Valley, 3DMark, and FurMark with no issue. Just those two games took the PC down exactly how you described. It drove me nuts.

I tried troubleshooting a number of different things, reinstalling drivers, reinstalling windows, increasing voltage, increasing fan to keep it cooler, underclocking, overclocking, anything. The only thing that ultimately worked (or at least I haven't had any issues yet, I haven't had much time to game recently) was flashing the Asus 290x bios on the unlocking guide to my card. It didn't unlock the extra shaders, and it's roughly where my clocks were before, and I run a custom fan profile anyway, but I figured I'd try a few different bioses as a last ditch effort (only factory ones from other MFGs. I wasn't going to try the PT1/2/3 bioses that can be dangerous with voltages).

So that worked for me. It's not an ideal solution, because while it fixed the problem, I'm not sure how it fixed it. Either way, I haven't had a problem since, so I'd recommend trying that. Let us know if it helps!


----------



## anubis1127

Quote:


> Originally Posted by *Gualichu04*
> 
> When it happens again I will take a picture. It seems quite random; till then I will just try connecting the 4-pin molex on the motherboard and hope that helps.


Good luck!


----------



## PureBlackFire

Quote:


> Originally Posted by *Hl86*
> 
> Changed the Tri-X cooler to a Kraken; I reached 1200MHz core with 70-75C on the core and 80-85C on VRM1. Win/win I say.


that's very high temps for an AIO solution. my previous 290 never broke 46C with a Kraken G10/H55 on it. my Tri-X hasn't broken 67C yet, albeit my card was run at 1150MHz and no higher.


----------



## Gualichu04

Quote:


> Originally Posted by *Dr.GumbyM.D.*
> 
> I don't do Folding@home but that sounds exactly like what happened with my 290 (reference PowerColor) playing World of Tanks (very poorly optimized) and Tropico 5. I could play BF4 all day long, 64-player, etc.; I could run Heaven, Valley, 3DMark, and FurMark with no issue. Just those two games took the PC down exactly how you described. It drove me nuts.
> 
> I tried troubleshooting a number of different things, reinstalling drivers, reinstalling windows, increasing voltage, increasing fan to keep it cooler, underclocking, overclocking, anything. The only thing that ultimately worked (or at least I haven't had any issues yet, I haven't had much time to game recently) was flashing the Asus 290x bios on the unlocking guide to my card. It didn't unlock the extra shaders, and it's roughly where my clocks were before, and I run a custom fan profile anyway, but I figured I'd try a few different bioses as a last ditch effort (only factory ones from other MFGs. I wasn't going to try the PT1/2/3 bioses that can be dangerous with voltages).
> 
> So that worked for me. It's not an ideal solution, because while it fixed the problem, I'm not sure how it fixed it. Either way, I haven't had a problem since, so I'd recommend trying that. Let us know if it helps!


What other BIOS could I try for my XFX R9 290X? The voltage is already unlocked.


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *Gualichu04*
> 
> What other BIOS could I try for my XFX R9 290X? The voltage is already unlocked.


All of the bioses are slightly modified/optimized by the OEM (they do more than just change the sticker on the cooler, or replace the cooler, who knew?) so some of the bioses may be slightly different.

the unlock thread I'm referring to is here, and the bios I used is this one listed under "Other R9 290X ROMs" --> ASUS 290X ROM.zip

It shouldn't change much, but again, I believe that all of the card bioses are tweaked slightly by the manufacturers before being sent out. It was a really straightforward and easy process for me, you just follow the directions in that thread to make a bootable USB drive, then run the 1 command line, and reboot, and then my problem stopped occurring.

Again, it's not ideal to be messing with alternative vendor bioses, but ultimately I was losing my mind and was about to return the card, so I was very happy that this fixed it. No more mid-round crashes (until WoT's next patch).


----------



## HoneyBadger84

Hey guys, I'm not really new to BIOS flashing but I have no idea what program to use on AMD cards... can someone point out the best program? Here's my situation: the previous owner of one of my XFX Core Editions flashed the Uber BIOS onto the Quiet BIOS switch setting, and then put some other weird BIOS on the Uber setting. Using my other XFX Core Edition, which has the correct BIOSes in both spots, I want to take them off that card and put them on the other card in the right places, putting the card back to its original specs, so that when I sell it (I'm reselling both Core Editions) it won't have any "issues" for someone to complain about.

So what program would you recommend I use to take the two BIOSes off the card that has them in the correct spots and put them on the other one? I'm assuming I'd have to first take the Uber BIOS off the card without the problem, put it on the card with the problem to replace the weird BIOS, then put both cards on the "Quiet" setting, and do the same thing again, right?


----------



## Red1776

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Hey guys, I'm not really new to BIOS flashing but I have no idea what program to use on AMD cards... can someone point out the best program? Here's my situation: the previous owner of one of my XFX Core Editions flashed the Uber BIOS onto the Quiet BIOS switch setting, and then put some other weird BIOS on the Uber setting. Using my other XFX Core Edition, which has the correct BIOSes in both spots, I want to take them off that card and put them on the other card in the right places, putting the card back to its original specs, so that when I sell it (I'm reselling both Core Editions) it won't have any "issues" for someone to complain about.
> 
> So what program would you recommend I use to take the two BIOSes off the card that has them in the correct spots and put them on the other one? I'm assuming I'd have to first take the Uber BIOS off the card without the problem, put it on the card with the problem to replace the weird BIOS, then put both cards on the "Quiet" setting, and do the same thing again, right?


http://www.techpowerup.com/downloads/1962/techpowerup-radeon-bios-editor-v1-28/


----------



## Dr.GumbyM.D.

You use GPU-Z to save the BIOS off of the card. You have to name the files properly to keep them in order, and you will have to manually flip the switch between saving each one. I would recommend rebooting between manual switches, even though IIRC you can probably switch them live (that's how it used to be done when recovering from a bad BIOS flash: start in the good BIOS position, boot up, switch to the bad BIOS, flash that slot, reboot).

I would use GPU-Z and maybe even Hawaiiinfo for details on each card since they should be the same, but I'd always double check before flashing bioses, especially if your manufacturer was known for changing memory chips mid-process or anything like that. Better safe than sorry.

But yes, in theory you should be fine to download off of one card, upload it to the other, then rinse and repeat for the other bios after flipping the switch manually.

GPU-Z is a quick Google search away. For the flashing tool, use the link in my prior post to the 290 flashing thread; it has a great walkthrough for how to flash the BIOS (you'll only want to duplicate your own, not actually download someone else's BIOS).


----------



## kpoeticg

ATIWinFlash is the best program for flashing 290/x's. Flash from the cmd prompt tho, the GUI can be buggy. The link's in the OP

There's also a link in the op to TPU's BIOS Archive. I'm pretty sure it has every bios that's been released for every version of the 290/x. Should be pretty straight forward to do what you need. If you have a bios that can't boot, boot with the good bios and then flip the switch before you flash. It'll flash to the bios that the switch is positioned to. Just always make a habit of backing up your bios before flashing over it!!!

Edit: You should be fine saving from one and flashing to the other. I've done it with my Powercolor's. Might as well check TPU to see if there's an updated bios for your card tho

Edit x2: If the program gives an error, just keep trying. I've had to do it 5 times in a row to get it flashed before, just gives errors the first cpl times sometimes
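To script the workflow kpoeticg describes, the command-line ATIFlash tool can be driven from a small helper. This is a minimal sketch, assuming `atiflash` is on the PATH and using its documented `-s` (save) and `-p` (program) switches; the adapter indexes and ROM filenames here are illustrative only, not from the thread.

```python
import subprocess

def atiflash_cmd(action, adapter, rom_path):
    """Build an atiflash command line.

    action:  "save" dumps the BIOS the switch currently points at,
             "program" flashes a ROM file onto that slot.
    adapter: GPU index as reported by `atiflash -i` (0, 1, ...).
    """
    flag = {"save": "-s", "program": "-p"}[action]
    return ["atiflash", flag, str(adapter), rom_path]

# Always back up the current BIOS before flashing over anything:
backup = atiflash_cmd("save", 0, "card0_uber_backup.rom")
# subprocess.run(backup, check=True)   # uncomment on a real system

# Then flash the known-good ROM onto the other card's matching slot:
flash = atiflash_cmd("program", 1, "card0_uber_backup.rom")
# subprocess.run(flash, check=True)
```

Flip the physical BIOS switch (and reboot) before touching the second slot, and, as noted above, just retry if the tool throws a transient error.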


----------



## hwoverclkd

Quote:


> Originally Posted by *kpoeticg*
> 
> ATIWinFlash is the best program for flashing 290/x's. Flash from the cmd prompt tho, the GUI can be buggy. The link's in the OP
> 
> There's also a link in the op to TPU's BIOS Archive. I'm pretty sure it has every bios that's been released for every version of the 290/x. Should be pretty straight forward to do what you need. If you have a bios that can't boot, boot with the good bios and then flip the switch before you flash. It'll flash to the bios that the switch is positioned to. Just always make a habit of backing up your bios before flashing over it!!!
> 
> Edit: You should be fine saving from one and flashing to the other. I've done it with my Powercolor's. Might as well check TPU to see if there's an updated bios for your card tho
> 
> *Edit x2: If the program gives an error, just keep trying. I've had to do it 5 times in a row to get it flashed before, just gives errors the first cpl times sometimes*


lol...thought you just said it's the best? But I agree, I prefer ATIWinFlash too.


----------



## kpoeticg

It always works tho. Just sometimes it takes a few tries.

I've flashed a LOT with it when I had my black screen card. Never bricked with it, so I'm happy. Probably shoulda also mentioned that I prefer GPU-Z for SAVING my bios first. I just use ATIWinFlash for the actual flashing =)


----------



## Bipox

Hi all,
You guessed it: I. Need. Help.

(sorry in advance for my internet-learned English)

I've had a lot of BSOD with a non-custom R9 290. RMA'd it, with the refund bought a R9 290 Vapor-x.
It was fine at first, very good scores in 3D Mark.
Summer came.
It is now quite hot in my little student room, even hotter when I'm playing.
At first I could overclock quite well. Then I had quite a few artifacts. Then BSOD's. Today, only on factory OC (none of my own), I had the first BSOD.

Well, you say, it is just too hot...

But it doesn't look like it, temperatures never reach 80°C

*I'm a real noob*, but *I have a gut feeling my PSU is at fault*. It has been acting strange for a long time. Lots of hard reboots, with "asus power surge prevented damage by rebooting your computer and losing all your work/gameplay. You're welcome"
-> disabled asus power surge security option in BIOS ("what's this red blinking dot ? better ignore it")
Shortly after, the computer wouldn't boot. I unplugged everything from the PSU, put it back upside down, and the computer booted again.

I've got an i5 4670k oc'd to 4300. Mobo is an Asus z87-a. Ram is 2x4 1866.
GPU, as said, Sapphire R9 290 Vapor-x (is it possible to unlock those btw?)
PSU: Corsair CX600M

I built my computer in december 2013, it was a first, I had no idea what I was doing, just went for a cheap PSU.

*TL;DR*: lots of BSOD with R9 290 Vapor-x. I want to blame the PSU (CX600M) because it behaved strangely, but really I don't know.
Please tell me what to do. I'm about to buy an Xbox.

(sorry if it's not the best place to post, I didn't feel like making a post just for this)


----------



## kpoeticg

Hrmmm, 600 seems kinda light to me for OC'ing with that card. It could very well be the problem but i'm no expert. Are you OC'ing your CPU too?


----------



## fateswarm

600 is not light. It's inadequate.


----------



## Bipox

@ kpoeticg: yes I'm OC'ing the CPU to 4300 MHz (factory is 3800)

@ fateswarm: what would be adequate for my build ?


----------



## PCSarge

Quote:


> Originally Posted by *fateswarm*
> 
> 600 is not light. It's inadequate.


^this, very inadequate. im running 750W in an ITX build with a 290 lol


----------



## ad hoc

Where's the 270's owners club!?


----------



## anubis1127

Quote:


> Originally Posted by *ad hoc*
> 
> Where's the 270's owners club!?


Right here: http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-270-owners-club/


----------



## Blaise170

Does a CPU-Z validation count, or does it have to be GPU-Z? My card is currently with XFX so I don't have any other way of validating it.


----------



## PCSarge

Quote:


> Originally Posted by *Blaise170*
> 
> Does a CPU-Z validation count, or does it have to be GPU-Z? My card is currently with XFX so I don't have any other way of validating it.


must be GPU-Z


----------



## fateswarm

Quote:


> Originally Posted by *Bipox*
> 
> @ fateswarm: what would be adequate for my build ?


The experts at realhardtechx say 650, though they try not to go near 90% capacity, I suspect because it may be very unstable there. But they also don't assume heavy OCing. Since you overclock both the CPU and GPU (even if it's a factory OC), I'd say 750 or more to be safe.
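To put rough numbers on the sizing advice above, here's a quick back-of-the-envelope sketch. The wattage figures are ballpark assumptions, not measurements: an overclocked Hawaii card can pull roughly 300-350W under load, and an overclocked Haswell i5 around 150W.

```python
def recommended_psu(component_watts, headroom=0.90):
    """Estimate total draw and the smallest common PSU size that keeps
    the supply under ~90% load (its efficient, stable range)."""
    total = sum(component_watts.values())
    needed = total / headroom
    common_sizes = [450, 550, 600, 650, 750, 850, 1000, 1200]
    return total, next(s for s in common_sizes if s >= needed)

# Ballpark figures for a build like Bipox's (assumed, not measured):
draw, psu = recommended_psu({
    "R9 290 (factory OC, incl. transients)": 350,
    "i5 4670K @ 4.3GHz": 150,
    "mobo/RAM/drives/fans": 100,
})
# draw -> 600W estimated, psu -> 750W recommended
```

With those assumptions the math lands on the same 750W figure suggested in the thread, which is why a 600W unit ends up right at the edge.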


----------



## Bipox

@ fateswarm: thanks. I will begin by changing the PSU then. Any brand? I heard Seasonic are good, but they are mighty expensive...


----------



## PCSarge

Quote:


> Originally Posted by *Bipox*
> 
> @ fateswarm : thanks. I will begin by changing the PSU then. Any brand ? I heard Seasonic are good, but they are mighty expensive...


money is worth it for the reliability factor


----------



## kpoeticg

A good PSU is one of those things that people take for granted. Think about all the expensive hardware that's being powered through it. All it takes is a power surge to fry your mobo/cpu/gpu(s)

Seasonic's always a great choice, especially since a lot of the popular PSUs are just rebranded Seasonics with a higher price tag. Superflower Leadex (a.k.a. EVGA G2 & P2) and Antec HCP Platinum are great PSUs also. I love my Antec.


----------



## ad hoc

Quote:


> Originally Posted by *anubis1127*
> 
> Right here: http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-270-owners-club/


Thanks! What are all these clubs for anyway? Is it just for fun, or what?


----------



## kpoeticg

The clubs are where we have our mixers & casino nights

It's just threads people hang out in cuz they have or are interested in the same hardware. Having clubs in your sig is really just representing the threads you hang out in the most IMO


----------



## anubis1127

Quote:


> Originally Posted by *ad hoc*
> 
> Thanks! What are all these clubs for anyway? Is it just for fun, or what?


Yeah, for fun, to ask other users about their experience with the hardware, ask questions if you are experiencing issues, share OC / benchmark results, that sort of thing. Some general chit-chat thrown in.


----------



## ad hoc

That sounds awesome. Is there a list of the main clubs somewhere?


----------



## kpoeticg

Just search for your hardware in the search box. If there's a club, it'll pop up in the search


----------



## fateswarm

Quote:


> Originally Posted by *Bipox*
> 
> @ fateswarm : thanks. I will begin by changing the PSU then. Any brand ? I heard Seasonic are good, but they are mighty expensive...


I went with an EVGA G2. It seems recently released and well received in reviews. But I don't know, there are a lot of choices; look at the performance ratings on jonnyGURU's site.


----------



## kpoeticg

G2 = Superflower Leadex Gold. It's a great PSU from what i know about it. You can usually find more info by searching the original OEM model of it


----------



## Bipox

So, in order to summarize, the BSODs _could_ be caused by the GPU getting unstable when under load, because the power flow is not regular enough (or something like that?)
If so, I'll buy a new one. Too bad you can't SLI those.
Thanks for the help, greatly appreciated


----------



## pdasterly

Thermaltake and EVGA have been good to me.
EVGA offers a 10-year warranty, yeah


----------



## kpoeticg

Quote:


> Originally Posted by *Bipox*
> 
> So, in order to summarize, the BSODs could be caused by the GPU getting unstable when under load, because the power flow is not regular enough (or something like that?)
> If so, I'll buy a new one. Too bad you can't SLI those.
> Thanks for the help, greatly appreciated


From what everybody's said, it sounds like it's definitely causing the issue. It's just not enough watts to support your system

SLI the cards or the PSU? You can SLI both if you wanted. They sell adapters to run dual/triple/quad psu's. Not expensive...

I'd personally ditch the cx600 altogether tho if you care about your hardware. Get a better PSU


----------



## cephelix

Heard great things about the Superflower Platinum series and of course Seasonic.
I personally have a Seasonic X750 Gold. Been running for 4 years without a problem. Recent browsing through the net, though, makes me worried. An i5 760 OC'd to 4GHz + a 290 OC'd to 1150 core / 1400 memory already consumes ~700W. But of course this assumes both are at full load at the same time, and it's a very rough estimate at best.


----------



## ad hoc

Quote:


> Originally Posted by *kpoeticg*
> 
> From what everybody's said, it sounds like it's definitely causing the issue. It's just not enough watts to support your system
> 
> SLI the cards or the PSU? You can SLI both if you wanted. They sell adapters to run dual/triple/quad psu's. Not expensive...
> 
> I'd personally ditch the cx600 altogether tho if you care about your hardware. Get a better PSU


I'm planning on grabbing an i5 4690k and Sapphire 290 Tri-X OC with a 600W PSU for a friend's build. Is that an issue?


----------



## fateswarm

yes. the card is a beast.


----------



## Blaise170

Quote:


> Originally Posted by *ad hoc*
> 
> I'm planning on grabbing an i5 4690k and Sapphire 290 Tri-X OC with a 600W PSU for a friend's build. Is that an issue?


Depends on the PSU. If it is a quality PSU, then you shouldn't have an issue with 600W.


----------



## ad hoc

Quote:


> Originally Posted by *Blaise170*
> 
> Depends on the PSU. If it is a quality PSU, then you shouldn't have an issue with 600W.


http://www.newegg.com/Product/Product.aspx?Item=N82E16817139028

Edit: Whoops, I put the link in the quotes.


----------



## Bipox

Quote:


> Originally Posted by *kpoeticg*
> 
> From what everybody's said, it sounds like it's definitely causing the issue. It's just not enough watts to support your system
> 
> SLI the cards or the PSU? You can SLI both if you wanted. They sell adapters to run dual/triple/quad psu's. Not expensive...
> 
> I'd personally ditch the cx600 altogether tho if you care about your hardware. Get a better PSU


Haha, well I suppose the die is cast, new PSU it is. I'll look for a 750 gold seasonic.

Oh, and learned you can SLI a PSU^^

THANKS !


----------



## Bipox

Quote:


> Originally Posted by *ad hoc*


Well I have almost the same card (vapor-x trixx), an i5 4670k, and exactly this PSU.
I'm here because it's not working well. But I'm oc-ing.

(sorry, should've edited my previous post)


----------



## pdasterly

Which thermal paste for the GPU? Too many choices, I'm confused.


----------



## Red1776

Quote:


> Originally Posted by *pdasterly*
> 
> Which thermal paste for the GPU? Too many choices, I'm confused.


IC Diamond is great, and I've tried a lot of them.

Gelid is terrific as well.


----------



## Dasboogieman

Quote:


> Originally Posted by *Yvese*
> 
> That sounds like a lot of work lol. I think I'll leave it alone though since thankfully it only happens around 38-50% fan speed which is easily solved by my fan curve.
> 
> Curious though.. would a backplate solve it? If so Sapphire should have added it for this tri-x model as well and not just the vapor-x.


Nope, a backplate wouldn't solve it. It's caused by the fan assembly vibrating against the plastic.
Are you sure? It's actually really easy. The hardest part is making sure your rubber inserts don't block too many fins.
Adding a backplate does help the VRM temps heaps, though. However, you won't be able to do it with the stock Sapphire screws. I had to special-order M2.5 8mm screws from HeliPal to use with an EK backplate (which is the easiest since it needs the fewest screws).


----------



## HoneyBadger84

So I got WinATIFlash, and put both XFX cards in... and discovered something even more idiotic the previous owner did.

He put whatever weird-stupid BIOS he put on the one card on the OTHER one on the QUIET setting... so:

Card 1: Quiet BIOS switch setting = weird ATI BIOS, Uber BIOS switch setting = Normal XFX Uber BIOS
Card 2: Quiet BIOS switch setting = Uber XFX BIOS, Uber BIOS switch setting = weird ATI BIOS

So what I had to do was put both on Uber, copy the Uber BIOS from Card 1 to Card 2, then put them both on Quiet, and copy the BIOS from Card 2 to Card 1...

Now one thing I noticed during this, the ATI BIOS in question apparently had HALF the ROPs disabled, it was reading out 32 instead of 64... so naturally I thought "Hey, maybe I'll get a performance gain, somehow, from this, yeah?"

After it was all said & done, I tested all 4 settings to make sure both cards were good to go on both BIOS settings in terms of not artifacting (used 3DMark11 & FireStrike to test on Performance & Extreme settings), I retested the cards with my OCs I know they can run applied to them and lo & behold...

3K performance gain in 3DMark11 P setting, 300pt gain in 3DMark11 X setting, and small gains in FireStrike as well. So I dunno what all this changing around did, but it did something good :-D

Thanks for the help guys!


----------



## Gualichu04

Quote:


> Originally Posted by *Dr.GumbyM.D.*
> 
> All of the bioses are slightly modified/optimized by the OEM (they do more than just change the sticker on the cooler, or replace the cooler, who knew?) so some of the bioses may be slightly different.
> 
> the unlock thread I'm referring to is here, and the bios I used is this one listed under "Other R9 290X ROMs" --> ASUS 290X ROM.zip
> 
> It shouldn't change much, but again, I believe that all of the card bioses are tweaked slightly by the manufacturers before being sent out. It was a really straightforward and easy process for me, you just follow the directions in that thread to make a bootable USB drive, then run the 1 command line, and reboot, and then my problem stopped occurring.
> 
> Again, it's not ideal to be messing with alternative vendor bioses, but ultimately I was losing my mind and was about to return the card, so I was very happy that this fixed it. No more mid-round crashes (until WoT's next patch).


It seems the crash has never come back. I would try a GPU BIOS flash, but I don't see how I could get the same issue with two different cards and have that help. With separate 8-pin and 6-pin cables to the GPU, the black screen issue is gone so far. Thanks for the help. I will post back if the pixelated colored screen crash occurs again.


----------



## anubis1127

Quote:


> Originally Posted by *Gualichu04*
> 
> It seems the crash has never come back. I would try a GPU BIOS flash, but I don't see how I could get the same issue with two different cards and have that help. With separate 8-pin and 6-pin cables to the GPU, the black screen issue is gone so far. Thanks for the help. I will post back if the pixelated colored screen crash occurs again.


Hopefully you got it sorted out.


----------



## Dasboogieman

Quote:


> Originally Posted by *HoneyBadger84*
> 
> So I got WinATIFlash, and put both XFX cards in... and discovered something even more idiotic the previous owner did.
> 
> He put whatever weird-stupid BIOS he put on the one card on the OTHER one on the QUIET setting... so:
> 
> Card 1: Quiet BIOS switch setting = weird ATI BIOS, Uber BIOS switch setting = Normal XFX Uber BIOS
> Card 2: Quiet BIOS switch setting = Uber XFX BIOS, Uber BIOS switch setting = weird ATI BIOS
> 
> SO what I had to do was put both on Uber, copy the Uber from Card 1 to Card 2, then put them both on Quiet, and copy the BIOS from Card 2 to Card 1...
> 
> Now one thing I noticed during this, the ATI BIOS in question apparently had HALF the ROPs disabled, it was reading out 32 instead of 64... so naturally I thought "Hey, maybe I'll get a performance gain, somehow, from this, yeah?"
> 
> After it was all said & done, I tested all 4 settings to make sure both cards were good to go on both BIOS settings in terms of not artifacting (used 3DMark11 & FireStrike to test on Performance & Extreme settings), I retested the cards with my OCs I know they can run applied to them and lo & behold...
> 
> 3K performance gain in 3DMark11 P setting, 300pt gain in 3DMark11 X setting, and small gains in FireStrike as well. So I dunno what all this changing around did, but it did something good :-D
> 
> Thanks for the help guys!


That weird idiotic BIOS the guy put on is a tightened-VRAM-timing, mining-optimised BIOS. Each GCN cluster had one ROP partition disabled to save power.
Out of academic curiosity, would you be able to run some benchmarks for me on that weird BIOS? Specifically all the Vantage fill tests. I'm curious whether the loss of 32 ROPs actually benefited or hindered performance. Does the other BIOS work OK on that card? One worry I had about that mining BIOS was that it would blow the ROP fuses.


----------



## wrigleyvillain

Well...call me incredibly pleased with my $230 shipped XFX 290 with an Accelero. Pretty damn quiet and both core and VRMS maxed at 41C during my first 3DM11 run which scored me a hair over 12K--what I was aiming/hoping for considering my high-clocked GTX 670 SLI gets me a few hundred points under 13K. 290 at stock too.

Going to check if could unlock next...


----------



## HoneyBadger84

Quote:


> Originally Posted by *Dasboogieman*
> 
> That weird idiotic BIOS that guy put is a vram tightened timing, mining optimised BIOS. Each GCN cluster had one ROP partition disabled to save power.
> Out of academic curiosity, would you be able to run some benchmarks for me on that weird BIOS? Specifically all the vantage fill tests. I'm curious if the loss of 32 ROPs has actually benefited or hindered performance. Does the other BIOS work ok on that card? One worry I had about that mining BIOS was that it would blow the ROP fuses.


Eeeeh, saw your message too late man, sorry. I wiped that BIOS he used completely & didn't save it cuz I had no intention of using it again. And the cards did run "fine" when they were both on the regular XFX BIOSes; I just didn't like the fact that those were on there, because I didn't want any potential buyer freaking out if they noticed, hence why I put them back to "stock" BIOSes. I'm planning on reselling them very soon (they're already listed on eBay).


----------



## PachAz

Anyone know if the Sapphire R9 290 is designed and supposed to work with the power limit function in Sapphire Trixx? My shop told me that apparently it isn't, and that seems silly to me.


----------



## anubis1127

Quote:


> Originally Posted by *PachAz*
> 
> Anyone know if the Sapphire R9 290 is designed and supposed to work with the power limit function in Sapphire Trixx? My shop told me that apparently it isn't, and that seems silly to me.


I don't see why it shouldn't, all recent AMD cards should work with the Power Limit function AFAIK.


----------



## Kittencake

OK, I have a Cooler Master 700W Extreme power supply. It's almost 2 years old now, and I'm waiting for the 290X I bought off a member here to arrive. With what I have in my PC, will that be adequate enough till I replace the PSU with this one: http://www.newegg.ca/Product/Product.aspx?Item=N82E16817438018 ?


----------



## anubis1127

Quote:


> Originally Posted by *Kittencake*
> 
> OK, I have a Cooler Master 700W Extreme power supply. It's almost 2 years old now, and I'm waiting for the 290X I bought off a member here to arrive. With what I have in my PC, will that be adequate enough till I replace the PSU with this one: http://www.newegg.ca/Product/Product.aspx?Item=N82E16817438018 ?


Yeah, you'll be fine.


----------



## PachAz

Quote:


> Originally Posted by *anubis1127*
> 
> I don't see why it shouldn't, all recent AMD cards should work with the Power Limit function AFAIK.


I also think that; I'll just have to wait and see what Sapphire Tech thinks about it. Too bad my Swedish shop doesn't think so. They think a GPU that is freezing and showing instability while using power limit is totally fine. Fail.

"Yeah, you'll be fine." - You think she is "fine"?


----------



## cephelix

Guys and gals, any idea on how to word my email to MSI support?
Basically I asked what the differences between the V308-002 (shown on coolingconfig) and V308-022 (my card) are, since Red and Das said the new MSI 290 cards have changes, but the person who replied said he/she couldn't search for anything.


----------



## Dasboogieman

Quote:


> Originally Posted by *cephelix*
> 
> Guys and gals, any idea on how to word my email to MSI support?
> Basically I asked what the differences between the V308-002 (shown on coolingconfig) and V308-022 (my card) are, since Red and Das said the new MSI 290 cards have changes, but the person who replied said he/she couldn't search for anything.


I wouldn't trust MSI support. If you can take some photos of your card I can probably see if I can ID it for you. I'll need a closeup of the PCIe connector, the 8 pin from the top and see if you can catch a glimpse of the chokes under the heatsink. Finally maybe a shot of the backside so I can count the solder points.


----------



## cephelix

thanks for that das....i'll take it over the weekend when i have the time


----------



## heroxoot

Quote:


> Originally Posted by *cephelix*
> 
> Guys and gals, any idea on how to word my email to MSI support?
> Basically, I asked what the differences between the V308-002 (shown on coolingconfig) and V308-022 (my card) are, since Red and Das said the new MSI 290 cards have changes, but the person who replied said he/she couldn't find anything.


It's easier to talk to someone. That's how I got them to look into the BIOS bug my GPU has. They said they had to push the fan issue in my BIOS up to HQ in Taiwan, because the department in California is only for RMA.


----------



## Mega Man

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PCSarge*
> 
> I can't help myself... I must post pictures...........
> 
> 
> Spoiler: Warning: Spoiler!
> 
> (pictures)
> 
> and just because I like my new wallpaper so much
> 
> (wallpaper shot)
> 
> With that wallpaper you don't need a heater in your room anymore.

Haha, thanks, that is funny.
Quote:


> Originally Posted by *cephelix*
> 
> @Red1776
> 
> Just checked the specific model number on mine (V308-022) vs the one found on coolingconfigurator.com (V308-002). Is there any difference? Either way, I've contacted MSI support and hopefully I'll have a reply soon.


Quote:


> Originally Posted by *PachAz*
> 
> Since when is increasing or decreasing the power limit the same as overclocking? If I'm not wrong, Sapphire only talks about overclocking, but they don't define which settings in TriXX fall under that category. Also, they could reproduce the fault using the power limit. But first they told me they can't issue a new card because they believed using the power limit voids Sapphire's warranty; then a "female" in support told me I have to prove that Sapphire regards issues happening while using the power limit as a hardware failure and hence a valid warranty reason. I hate it when support changes their mind and BSes me.
> 
> The point was that the card works at stock settings, but not while using the power limit. I sure as hell am not satisfied with a card that can't use 5-50% power limit, and I think the shop is just being silly and stubborn. Then the "female" started questioning why I would want to use the power limit anyway. That isn't her business in the first place. If Sapphire makes software for their cards, the cards should sure as hell work with those functions, even if that would void the warranty. But we still don't know if the power limit voids the warranty. Sapphire support from Germany told me it doesn't, but I need to find more people from Sapphire who can agree with that.


Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cephelix*
> 
> Guys and gals, any idea on how to word my email to MSI support?
> Basically, I asked what the differences between the V308-002 (shown on coolingconfig) and V308-022 (my card) are, since Red and Das said the new MSI 290 cards have changes, but the person who replied said he/she couldn't find anything.
> 
> 
> 
> I wouldn't trust MSI support. If you can take some photos of your card I can probably see if I can ID it for you. I'll need a closeup of the PCIe connector, the 8 pin from the top and see if you can catch a glimpse of the chokes under the heatsink. Finally maybe a shot of the backside so I can count the solder points.

Agreed, they have changed the PCB now.


----------



## cephelix

Thanks for the help, guys......

I am so touched.......by an angel...lol


----------



## rene mauricio

I am having an odd issue. One I have never encountered before.

My 290 pauses during gameplay and displays a black screen. If left alone it will resume in a few seconds. If this happens more than 3 or so times, it hard locks. A few pages ago I read that this is most common with Elpida memory. On various (other) threads, people with such problems have been asked to run MemtestCL to see if the memory is stable. After running 10 passes it threw up thousands of errors. I then attempted to use the latest beta version of Afterburner to see if I could downclock the memory, but to no avail.

It seems no matter what I do the memory will always run at 1250MHz. Is there anything I could do, or is RMA the only answer? I have never had an issue with underclocking... perhaps I am missing something.


----------



## Kriant

Moment of PURE RANTING RAGE
Got my Swiftech SLI pass-throughs today (x2) -> one of them is defective due to improper o-ring size (can't switch that one myself, because it's the one that's inside the non-moving part) (yey)
Took out my old Koolance pass-throughs that I don't like, installed them for some of that quad-fire action.... turns out that because of the length of the card and the full-cover EK nickel block, I can't mount the 4th card, because it conflicts with connectors (res/power/led/hdd) on my mobo.
OK, thankfully the EK-FC R9-290X copper block isn't as long and doesn't conflict -> oohoo, I thought, now I just need to start filling the loop -> connected my switch-adapter to one of the pumps..... dead silence.... connected it to the second one.... -> dead silence. (Both pumps worked earlier today, and the res has water since I QDed the cards, so I doubt that both died all of a sudden.)
Yey, my switch-adapter is dead.
So now I'm stuck waiting for another switch adapter X_X. I guess it's just not my day.

And no quad-fire R9 290 action for me


----------



## DeadlyDNA

Quote:


> Originally Posted by *Kriant*
> 
> Moment of PURE RANTING RAGE
> Got my Swiftech SLI pass-throughs today (x2) -> one of them is defective due to improper o-ring size (can't switch that one myself, because it's the one that's inside the non-moving part) (yey)
> Took out my old Koolance pass-throughs that I don't like, installed them for some of that quad-fire action.... turns out that because of the length of the card and the full-cover EK nickel block, I can't mount the 4th card, because it conflicts with connectors (res/power/led/hdd) on my mobo.
> OK, thankfully the EK-FC R9-290X copper block isn't as long and doesn't conflict -> oohoo, I thought, now I just need to start filling the loop -> connected my switch-adapter to one of the pumps..... dead silence.... connected it to the second one.... -> dead silence. (Both pumps worked earlier today, and the res has water since I QDed the cards, so I doubt that both died all of a sudden.)
> Yey, my switch-adapter is dead.
> So now I'm stuck waiting for another switch adapter X_X. I guess it's just not my day.
> 
> And no quad-fire R9 290 action for me


No worries, quadfire is overrated; so many say it's just for epeen. Or is it?


----------



## Blaise170

Quote:


> Originally Posted by *DeadlyDNA*
> 
> No worries, quadfire is overrated; so many say it's just for epeen. Or is it?


Depends completely on the application. If you are playing at 4K, it's very useful. Just take a look at this thread (using Titans instead of 290Xs): http://www.overclock.net/t/1481789/baashas-4k-surround-quad-gtx-titan-black-sc-benchmarks-thread


----------



## Sgt Bilko

Quote:


> Originally Posted by *Blaise170*
> 
> Depends completely on the application. If you are playing at 4K, it's very useful. Just take a look at this thread (using Titans instead of 290Xs): http://www.overclock.net/t/1481789/baashas-4k-surround-quad-gtx-titan-black-sc-benchmarks-thread


For a second there I thought you were going to quote his thread


----------



## Kriant

Quote:


> Originally Posted by *DeadlyDNA*
> 
> No worries, quadfire is overrated; so many say it's just for epeen. Or is it?


Well, I am getting ready for DA:I, Witcher 3, Dead Rising 3, EVOLVE. I want it in glorious eyefinity at comfortable framerates.


----------



## cephelix

^ DA:I!!!!WITCHER 3!!!!!YEAH!!!!!


----------



## Blue Dragon

Quote:


> Originally Posted by *rene mauricio*
> 
> I am having an odd issue. One I have never encountered before.
> 
> My 290 pauses during gameplay and displays a black screen. If left alone it will resume in a few seconds. If this happens more than 3 or so times, it hard locks. A few pages ago I read that this is most common with Elpida memory. On various (other) threads, people with such problems have been asked to run MemtestCL to see if the memory is stable. After running 10 passes it threw up thousands of errors. I then attempted to use the latest beta version of Afterburner to see if I could downclock the memory, but to no avail.
> 
> It seems no matter what I do the memory will always run at 1250MHz. Is there anything I could do, or is RMA the only answer? I have never had an issue with underclocking... perhaps I am missing something.


I had a problem with mixing native PSU connectors and modular plug-ins, and I've heard some other guys talk about DVI cords causing problems.


----------



## bond32

Quote:


> Originally Posted by *Sgt Bilko*
> 
> For a second there i thought you were going to quote his thread


Lol me too... Don't think he's seen his thread...


----------



## Dasboogieman

Just got me a P7P55D-E Pro system on the cheap for $250. Added up, the guy effectively gave me his MSI GTX 560 Ti for $50. Judging from the light dust, it hasn't been heavily used, although one of the fan blades is broken, so the cleanliness may be because he was a little aggressive in disassembly.

I don't know if many people know this, but the MSI Twin Frozr II cards actually have an MSI HAWK silkscreen on the PCB's PCIe connector; it's just covered by black electrical tape. Must've been the original name of the design before they decided to use it for the top mainstream card.
Anyway, new fans cost only $25 from China. So basically a GTX 560 Ti for $75. Not too shabby.

On the other hand, I'm having to use old-school Bclk OC for the i5-750; I'd forgotten how much headroom Sandy Bridge/Ivy had.


----------



## PCSarge

Quote:


> Originally Posted by *Dasboogieman*
> 
> Just got me a P7P55D-E Pro system on the cheap for $250. Added up, the guy effectively gave me his MSI GTX 560 Ti for $50. Judging from the light dust, it hasn't been heavily used, although one of the fan blades is broken, so the cleanliness may be because he was a little aggressive in disassembly.
> 
> I don't know if many people know this, but the MSI Twin Frozr II cards actually have an MSI HAWK silkscreen on the PCB's PCIe connector; it's just covered by black electrical tape. Must've been the original name of the design before they decided to use it for the top mainstream card.
> Anyway, new fans cost only $25 from China. So basically a GTX 560 Ti for $75. Not too shabby.
> 
> On the other hand, I'm having to use old-school Bclk OC for the i5-750; I'd forgotten how much headroom Sandy Bridge/Ivy had.


If you're lucky you'll get a whopping 4GHz out of that chip. Mine did it and still does; most hit a wall at 3.8 or below though.


----------



## wrigleyvillain

Damn, I do have the black-screen-on-reboot thing though; I have seen mention of it already but don't know any more details at this point. Do I just deal with it? Reboots are fairly rare compared to cold boots.

A QNIX if that's relevant.


----------



## kpoeticg

I had a black screen on reboot issue when I had to RMA my mobo and used a different one temporarily with the same Windows install. I needed to reinstall Windows to clean out the drivers; never had another issue with it.


----------



## wrigleyvillain

So it's a software issue?


----------



## PCSarge

Quote:


> Originally Posted by *wrigleyvillain*
> 
> So it's a software issue?


It can be, especially with the 14.4/14.6 drivers, if something glitches out.


----------



## kpoeticg

For me it was a driver issue. If it only happens at reboot, that would definitely be my guess. Reinstalling Windows fixed it for me; I can't promise it'll work for you...


----------



## wrigleyvillain

Well, driver = software. Just wanted to make sure it wasn't something with the hardware design, I guess. I am running the 14.6s, which just came out the other day.

Did a pretty thorough manual cleaning of my Nvidia drivers too.


----------



## PachAz

I think we can all agree that these black screens/freezes are serious problems these R9 290/290Xs are suffering from, especially the ones with Elpida memory. Some might be software related, others hardware related, and maybe a combination of both. Either way, we've got to ask ourselves how to define "full functionality" of a product.


----------



## kpoeticg

I've had a black-screen-plagued card that was 100% hardware related, so I know it's impossible to promise someone that a certain solution will fix it.

I've also had various software-related black screen issues with other 290Xs along the way that were easily fixable. It's definitely a 290/X-specific problem; it has many causes, and many solutions. Sometimes there is no solution =\

Edit: I consider myself kind of a 'Black Screen Connoisseur'


----------



## PachAz

The solution is to sing "we shall overcome".


----------



## Sgt Bilko

14.6 RC Driver is out: http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx

*Feature Highlights of The AMD Catalyst™ 14.6 RC Driver for Windows*

Plants vs. Zombies (Direct3D performance improvements):
AMD Radeon R9 290X - 1920x1080 Ultra - improves up to 11%
AMD Radeon R9 290X - 2560x1600 Ultra - improves up to 15%
AMD Radeon R9 290X CrossFire configuration (3840x2160 Ultra) - 92% scaling

3DMark Sky Diver improvements:
AMD A4 6300 - improves up to 4%
Enables AMD Dual Graphics / AMD CrossFire support

Grid Auto Sport:
AMD CrossFire profile

Wildstar:
Power Xpress profile
Performance improvements to improve smoothness of application

Watch Dogs:
AMD CrossFire - Frame pacing improvements

Battlefield Hardline Beta:
AMD CrossFire profile


----------



## Dasboogieman

Quote:


> Originally Posted by *PCSarge*
> 
> If you're lucky you'll get a whopping 4GHz out of that chip. Mine did it and still does; most hit a wall at 3.8 or below though.


My current unit is sitting at 3.8GHz 1.36V using the stock Dell cooler (damn, these things can cool; it's only slightly taller than the stock Intel cooler, but it was designed for a hot Core 2 Quad); temps are about 76 under Prime95 load. I would love to shoot for 4GHz, but I don't think the RAM can take it. These are OEM 1066MHz sticks, so I can either drop the memory multiplier to 2 (and get 800MHz RAM) or keep the 3 multiplier and get 1200MHz RAM at CL7/8. What I wouldn't do for half memory multipliers, but I think those were reserved for the ASUS Gene or Maximus models. If this chip can't take 4GHz, I have a second 750 unit that I haven't binned yet, salvaged from an office Dell machine.

Gonna see how much more this stock Dell cooling unit can take before I slap a Hyper 212 Evo or a 120mm CLC on it. I've only now realized how cool the 750s ran; the active CPU die area was massive compared to anything from the Sandy Bridge era onwards (where the GPU accounts for about 30-40% of the die, making the actual active CPU area a lot smaller).
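The Bclk arithmetic being juggled here can be sketched in a few lines. This is illustrative only: the x20 CPU multiplier, the 200MHz Bclk target, and the DDR3 double-data-rate factor are assumptions about this setup, and real boards use their own ratio tables.

```python
# Rough Lynnfield (i5-750) Bclk overclocking math (assumed numbers).
# Assumptions: x20 CPU multiplier, DDR3 transferring data twice per
# memory clock; actual boards expose fixed ratio tables.

def cpu_clock_mhz(bclk_mhz: float, cpu_mult: int = 20) -> float:
    """Core clock = Bclk x CPU multiplier."""
    return bclk_mhz * cpu_mult

def ram_speed_mhz(bclk_mhz: float, mem_mult: int) -> float:
    """Effective DDR3 speed = Bclk x memory multiplier x 2 (double data rate)."""
    return bclk_mhz * mem_mult * 2

bclk = 200  # pushing Bclk from the stock ~133 MHz up to 200 MHz
print(cpu_clock_mhz(bclk))     # 4000.0 -> the 4 GHz target
print(ram_speed_mhz(bclk, 2))  # 800.0  -> well under the sticks' 1066 rating
print(ram_speed_mhz(bclk, 3))  # 1200.0 -> over the sticks' 1066 rating
```

This is why the choice above is between underclocked 800MHz RAM and overclocked 1200MHz RAM: with only integer memory multipliers, no ratio lands near the sticks' rated 1066MHz once Bclk goes that high.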


----------



## Newbie2009

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 14.6 RC Driver is out: http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
> 
> *Feature Highlights of The AMD Catalyst™ 14.6 RC Driver for Windows*
> 
> Plants vs. Zombies (Direct3D performance improvements):
> AMD Radeon R9 290X - 1920x1080 Ultra - improves up to 11%
> AMD Radeon R9 290X - 2560x1600 Ultra - improves up to 15%
> AMD Radeon R9 290X CrossFire configuration (3840x2160 Ultra) - 92% scaling
> 
> 3DMark Sky Diver improvements:
> AMD A4 6300 - improves up to 4%
> Enables AMD Dual Graphics / AMD CrossFire support
> 
> Grid Auto Sport:
> AMD CrossFire profile
> 
> Wildstar:
> Power Xpress profile
> Performance improvements to improve smoothness of application
> 
> Watch Dogs:
> AMD CrossFire - Frame pacing improvements
> 
> Battlefield Hardline Beta:
> AMD CrossFire profile


I'm guessing the driver team have been away on vacation.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Newbie2009*
> 
> I'm guessing the driver team have been away on vacation.


Nothing really exciting is there?


----------



## Arizonian

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 14.6 RC Driver is out: http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
> 
> *Feature Highlights of The AMD Catalyst™ 14.6 RC Driver for Windows*
> 
> Plants vs. Zombies (Direct3D performance improvements):
> AMD Radeon R9 290X - 1920x1080 Ultra - improves up to 11%
> AMD Radeon R9 290X - 2560x1600 Ultra - improves up to 15%
> AMD Radeon R9 290X CrossFire configuration (3840x2160 Ultra) - 92% scaling
> 
> 3DMark Sky Diver improvements:
> AMD A4 6300 - improves up to 4%
> Enables AMD Dual Graphics / AMD CrossFire support
> 
> Grid Auto Sport:
> AMD CrossFire profile
> 
> Wildstar:
> Power Xpress profile
> Performance improvements to improve smoothness of application
> 
> Watch Dogs:
> AMD CrossFire - Frame pacing improvements
> 
> Battlefield Hardline Beta:
> AMD CrossFire profile


EDIT

Nice - updated.


----------



## rene mauricio

Quote:


> Originally Posted by *Blue Dragon*
> 
> i had a problem with native PSU connectors (don't mix w/ modular plug-ins) ...


Could you elaborate?

I have a Corsair PSU that has a single PCIe power cable that splits into two 6+2-pin adapters. I tried this first, but I was experiencing random game lockups. I then tried connecting a second PCIe power cable, so two different cables were powering the 290's two plugs. This ultimately made no difference, as I still had random lockups. I tried various drivers, making sure I used DDU each time. I tried a fresh install of Windows 8.1. I tried other PowerColor BIOSes. I even tried using Afterburner to create a more aggressive fan profile.

After using MemtestCL, I believe the culprit is the Elpida RAM. Sadly I was unable to underclock the memory to see if that would cause MemtestCL to stop producing errors.

On a related note, what really grinds my gears is that it appears AMD did away with the different power states I enjoyed on the 7950. This thing is either dead idle (with NOTHING open) or going full crank. Example: while watching Windows Media Center, I noticed the room was getting uncomfortably warm. I checked CCC and it said my GPU was at 13% load, but it was running at 1250 on the memory and spewing out heat at 83C. What the hell was so wrong with the previous generation's power states that made them do away with that?


----------



## PCSarge

Quote:


> Originally Posted by *Dasboogieman*
> 
> My current unit is sitting at 3.8GHz 1.36V using the stock Dell cooler (damn, these things can cool; it's only slightly taller than the stock Intel cooler, but it was designed for a hot Core 2 Quad); temps are about 76 under Prime95 load. I would love to shoot for 4GHz, but I don't think the RAM can take it. These are OEM 1066MHz sticks, so I can either drop the memory multiplier to 2 (and get 800MHz RAM) or keep the 3 multiplier and get 1200MHz RAM at CL7/8. What I wouldn't do for half memory multipliers, but I think those were reserved for the ASUS Gene or Maximus models. If this chip can't take 4GHz, I have a second 750 unit that I haven't binned yet, salvaged from an office Dell machine.
> 
> Gonna see how much more this stock Dell cooling unit can take before I slap a Hyper 212 Evo or a 120mm CLC on it. I've only now realized how cool the 750s ran; the active CPU die area was massive compared to anything from the Sandy Bridge era onwards (where the GPU accounts for about 30-40% of the die, making the actual active CPU area a lot smaller).


Yup, mine has a single-fan 212+ on it to do 4GHz. They run cool, just like the Sandy i5s; I did and still do love that series of chips. I also have some OEM unbinned i5/i7s from that series, along with a lot of Core 2 Duos (E7400, E8400) and even a few socket 775 P4s.


----------



## Blue Dragon

Quote:


> Originally Posted by *rene mauricio*
> 
> Could you elaborate?
> 
> I have a Corsair PSU that has a single PCIe power cable that splits into two 6+2-pin adapters. I tried this first, but I was experiencing random game lockups. I then tried connecting a second PCIe power cable, so two different cables were powering the 290's two plugs. This ultimately made no difference, as I still had random lockups. I tried various drivers, making sure I used DDU each time. I tried a fresh install of Windows 8.1. I tried other PowerColor BIOSes. I even tried using Afterburner to create a more aggressive fan profile.
> 
> After using MemtestCL, I believe the culprit is the Elpida RAM. Sadly I was unable to underclock the memory to see if that would cause MemtestCL to stop producing errors.
> 
> On a related note, what really grinds my gears is that it appears AMD did away with the different power states I enjoyed on the 7950. This thing is either dead idle (with NOTHING open) or going full crank. Example: while watching Windows Media Center, I noticed the room was getting uncomfortably warm. I checked CCC and it said my GPU was at 13% load, but it was running at 1250 on the memory and spewing out heat at 83C. What the hell was so wrong with the previous generation's power states that made them do away with that?


There are separate power states; I believe GPU-Z will show them, as will GPU Shark (MSI Kombustor).

The PCIe connector that is "hard wired" to the PSU is considered native. The one you plug into the modular part is not native. I have been told not to mix them. In other words, don't use one hard-wired and one plug-in together; either use two separate native plugs (if you have them) or try two plug-ins. Also check the amps your PSU gives; it's not enough to just have the watts... you'll need the amps too (I'm not that knowledgeable about amperage, so maybe one of these other guys can help give an exact number...)
Also, try running it without Afterburner (uninstall it or uncheck the box that says 'apply OC at startup') and see how it acts. Probably because my Asus 290 was factory OC'd, mine runs at 1260 on the mem and not 1250.


----------



## Blaise170

Quote:


> Originally Posted by *Blue Dragon*
> 
> There are separate power states; I believe GPU-Z will show them, as will GPU Shark (MSI Kombustor).
> 
> The PCIe connector that is "hard wired" to the PSU is considered native. The one you plug into the modular part is not native. I have been told not to mix them. In other words, don't use one hard-wired and one plug-in together; either use two separate native plugs (if you have them) or try two plug-ins. Also check the amps your PSU gives; it's not enough to just have the watts... you'll need the amps too (I'm not that knowledgeable about amperage, so maybe one of these other guys can help give an exact number...)
> Also, try running it without Afterburner (uninstall it or uncheck the box that says 'apply OC at startup') and see how it acts. Probably because my Asus 290 was factory OC'd, mine runs at 1260 on the mem and not 1250.


An R9 290 needs 31A on the +12V rail. The Corsair RM650 he has provides 54A on the +12V rail, so he has more than enough amperage.
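As a back-of-the-envelope version of that check, a sketch using only the figures quoted above; it assumes a single-rail PSU, since multi-rail designs split their amperage between rails.

```python
# +12V rail headroom check with the numbers quoted above.
# Assumption: a single-rail PSU; multi-rail units divide amps per rail.

def rail_headroom_amps(required_amps: float, rail_amps: float) -> float:
    """Spare amperage on the +12V rail (negative means the PSU is undersized)."""
    return rail_amps - required_amps

def rail_watts(rail_amps: float, volts: float = 12.0) -> float:
    """Power the rail can deliver: P = V x I."""
    return rail_amps * volts

print(rail_headroom_amps(31, 54))  # 23 -> 23 A of headroom
print(rail_watts(54))              # 648.0 -> the +12V rail alone can supply 648 W
```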


----------



## KeepWalkinG

For an R9 290 Sapphire Vapor-X or Tri-X, can I replace the cooler with an Arctic Accelero IV or MK-26?


----------



## Blaise170

RMA log for my 290: http://www.overclock.net/t/1498188/xfx-rma-log-r9-290


----------



## Sgt Bilko

Quote:


> Originally Posted by *KeepWalkinG*
> 
> For an R9 290 Sapphire Vapor-X or Tri-X, can I replace the cooler with an Arctic Accelero IV or MK-26?


You could, but I wouldn't.

The Tri-X and Vapor-X are very good coolers, and there isn't much sense in replacing them with something that will be heavier and most likely won't cool the VRMs as well.


----------



## sugarhell

Quote:


> Originally Posted by *KeepWalkinG*
> 
> For an R9 290 Sapphire Vapor-X or Tri-X, can I replace the cooler with an Arctic Accelero IV or MK-26?


The Vapor-X is a custom PCB. You can only change the Tri-X cooler, but it's pointless.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Blaise170*
> 
> RMA log for my 290: http://www.overclock.net/t/1498188/xfx-rma-log-r9-290


Good to hear!

XFX gets a pretty bad rap, but the little experience I've had with them so far has all been positive.


----------



## fateswarm

Does anyone know the model of the mosfets of the tri-x 290? Perhaps someone that took close up photos.

I believe they are the silvery things next to the chokes.


----------



## aaroc

Quote:


> Originally Posted by *PachAz*
> 
> I think we can all agree that these black screens/freezes are serious problems these R9 290/290Xs are suffering from, especially the ones with Elpida memory. Some might be software related, others hardware related, and maybe a combination of both. Either way, we've got to ask ourselves how to define "full functionality" of a product.


I have 3 R9 290s with Elpida and no black screen problem yet. The only problem I got was some reboots, because I swapped 2x 7870s for the R9 290s without cleaning drivers. After reinstalling W7 all was good; now using W8.1.


----------



## Roaches

Quote:


> Originally Posted by *fateswarm*
> 
> Does anyone know the model of the mosfets of the tri-x 290? Perhaps someone that took close up photos.
> 
> I believe they are the silvery things next to the chokes.


(photos)
http://www.itfiles.ro/2014/01/sapphire-ofera-eleganta-solutiei-grafice-radeon-r9-290-prin-implementarea-tri-x/2/


----------



## ZealotKi11er

Guys, so I have a 290 that unlocked to a 290X. I just got another 290 now. Should I switch the 290X back to a 290 or keep it as a 290X?


----------



## cephelix

I'd keep the 290X, unless you can sell the unlocked one for a significant amount.


----------



## ZealotKi11er

Quote:


> Originally Posted by *cephelix*
> 
> I'd keep the 290X, unless you can sell the unlocked one for a significant amount.


I will be running CF. 290 + 290 or 290X + 290?


----------



## PachAz

They are basically the same. If you run CrossFire, revert the 290X back to a 290, unless you can unlock the new 290 to a 290X.


----------



## Beezleybuzz

Quote:


> Originally Posted by *aaroc*
> 
> I have 3 R9 290s with Elpida and no black screen problem yet. The only problem I got was some reboots, because I swapped 2x 7870s for the R9 290s without cleaning drivers. After reinstalling W7 all was good; now using W8.1.


Dude, same here. I have 2 R9 290s (bought separately from different retailers), both with Elpida memory that clocks to 6000MHz, and both with zero freezing or black screens. The only time I had this issue was with the 14.6b drivers when they first came out, and with anything over 13.12 for CrossFire.

Honestly, I think people need to realize it's not as widespread as it seems.


----------



## ZealotKi11er

Quote:


> Originally Posted by *PachAz*
> 
> They are basically the same. If you run CrossFire, revert the 290X back to a 290, unless you can unlock the new 290 to a 290X.


I'll probably just swap it with my cousin and get his 290, but I'm worried about overclocking. My 290X does 1200/1500 at +100mV very easily under water. I have not tried to push it harder though.


----------



## Sgt Bilko

Quote:


> Originally Posted by *PachAz*
> 
> I think we can all agree that these black screens/freezes are serious problems these R9 290/290Xs are suffering from, especially the ones with Elpida memory. Some might be software related, others hardware related, and maybe a combination of both. Either way, we've got to ask ourselves how to define "full functionality" of a product.


My 290X (Day 1 card) was Elpida and I had one black screen with it after clocking the memory too high. When I RMA'd it I got 2 x 290s instead..... both Elpida and no issues at all with them, apart from the [email protected] thing.
Quote:


> Originally Posted by *kpoeticg*
> 
> I've had a black-screen-plagued card that was 100% hardware related, so I know it's impossible to promise someone that a certain solution will fix it.
> 
> I've also had various software-related black screen issues with other 290Xs along the way that were easily fixable. It's definitely a 290/X-specific problem; it has many causes, and many solutions. Sometimes there is no solution =\
> 
> Edit: I consider myself kind of a 'Black Screen Connoisseur'


It was plaguing a lot of people who had the Day 1 cards, and in turn those who are now buying those cards used from miners. Not so much of a problem anymore though.


----------



## pdasterly

When overclocking a GPU, what does the driver failure mean?


----------



## ZealotKi11er

Quote:


> Originally Posted by *pdasterly*
> 
> When overclocking a GPU, what does the driver failure mean?


You have gone too far.


----------



## pdasterly

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You have gone too far.


That's weird, because I've taken GPUs much further than this. Just adjusting GPU clocks without a voltage increase:
1125 GPU, 50 power limit.
I just re-did my VRM1 fans and wanted to get some numbers to compare to, heat-wise.
Also, is there a program that will stress test the GPU while monitoring the VRM1 temp? I can't find a program that will; I have to switch back and forth between the apps and the GPU gets a little rest during that process.


----------



## Beezleybuzz

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You have gone too far.


----------



## pdasterly

Is that OK for stock volts? Is it safe to give it some more volts, or should I stop?


----------



## Beezleybuzz

Quote:


> Originally Posted by *pdasterly*
> 
> That's weird, because I've taken GPUs much further than this. Just adjusting GPU clocks without a voltage increase:
> 1125 GPU, 50 power limit.
> I just re-did my VRM1 fans and wanted to get some numbers to compare to, heat-wise.
> Also, is there a program that will stress test the GPU while monitoring the VRM1 temp? I can't find a program that will; I have to switch back and forth between the apps and the GPU gets a little rest during that process.


To monitor your VRM temps, just run an instance of GPU-Z (or a couple for 2 cards). The "Sensors" tab keeps track of everything, and you can even get a log saved to a file if you want.

I find that Furmark stresses the VRMs the most for me. Heaven and 3DMark Fire Strike Extreme also tax the card really hard.
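If you let GPU-Z write its sensor log to a file, a short script can pull the peak VRM temperature out afterwards, so there's no need to watch two windows during the run. This is a sketch: the log filename and the exact column headers vary by GPU-Z version and card, so it just matches any column whose header contains "VRM".

```python
import csv

def max_vrm_temp(log_path: str) -> float:
    """Return the highest reading from any column whose header mentions 'VRM'.

    GPU-Z's sensor log is a comma-separated file with one header row;
    column names differ between versions, hence the substring match.
    """
    peak = float("-inf")
    with open(log_path, newline="") as f:
        reader = csv.DictReader(f)
        vrm_cols = [c for c in (reader.fieldnames or []) if "VRM" in c]
        for row in reader:
            for col in vrm_cols:
                try:
                    peak = max(peak, float(row[col]))
                except (TypeError, ValueError):
                    continue  # skip blank or malformed cells
    return peak

# usage (hypothetical filename): print(max_vrm_temp("GPU-Z Sensor Log.txt"))
```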


----------



## pdasterly

Thanks for the quick reply; I have two instances of GPU-Z running.
I'm using OCCT to check for errors; I can get 1050 on the GPU without errors so far.


----------



## cephelix

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I will be running CF. 290 + 290 or 290X + 290?


Oh, if that's the case I'll stick with 2x 290.
And with the power limit thing, when do you guys actually increase it? I know: increase core then voltage, memory then voltage, combine the 2 values and retest. So when is the power limit actually increased/tested in this situation?


----------



## Sgt Bilko

Quote:


> Originally Posted by *cephelix*
> 
> Oh, if that's the case I'll stick with 2x 290.
> And with the power limit thing, when do you guys actually increase it? I know: increase core then voltage, memory then voltage, combine the 2 values and retest. So when is the power limit actually increased/tested in this situation?


Power limit can be used to stop the card from sitting at lower clock speeds during load


----------



## cephelix

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Power limit can be used to stop the card from sitting at lower clock speeds during load


So how do you know how much to set? Or is it dragged to 50% right off the bat?


----------



## Sgt Bilko

Quote:


> Originally Posted by *cephelix*
> 
> So how do you know how much to set? Or is it dragged to 50% right off the bat?


I normally have it at 50% and ULPS disabled... makes games nice and smooth for me.


----------



## cephelix

Ok..so it's set after the initial oc values are set...am I correct to assume that?


----------



## fateswarm

Quote:


> Originally Posted by *Roaches*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> http://www.itfiles.ro/2014/01/sapphire-ofera-eleganta-solutiei-grafice-radeon-r9-290-prin-implementarea-tri-x/2/


Thanks! That's even more than I asked for.

Apparently, one of them is IRF6894MPbF and the other one IRF6811SPbF. Specs easily googled. Hm, beefy, International Rectifier, those guys are pros, hrm, it appears the big ones can do up to like 60Amps and the small ones about 25, hm, I wonder if the small ones limit the big ones.


----------



## Sgt Bilko

Quote:


> Originally Posted by *cephelix*
> 
> Ok..so it's set after the initial oc values are set...am I correct to assume that?


Either before or after, doesn't make any difference AFAIK. I normally run my cards at stock with the power limit at 50%.


----------



## Forceman

Quote:


> Originally Posted by *cephelix*
> 
> So how do you know how much to set? Or is it dragged to 50% right off the bat?


Just set it to 50% and then you can forget about it. No downside to it.


----------



## cephelix

Ok.thanks guys....


----------



## Mega Man

Quote:


> Originally Posted by *PachAz*
> 
> They are basically the same. If you run CrossFire, revert the 290X back to a 290. Unless you can unlock the new 290 to a 290X.


Huh, am I missing something? (The other post did not quote.) To the guy who said to downgrade the 290X to a 290: why would you downgrade? I would rather just CFX them as-is (or sell one, but again, why?)


----------



## disintegratorx

I was actually just able to OC my card significantly further, probably thanks to the newest version of MSI Afterburner. My clocks are sitting at a sweet 1180/1530 now.


----------



## kizwan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Guys so i have a 290 that unlocked 290X. I just got a 290 now. Should i switch the 290X to 290 or keep it 290X?


Keep the card @unlocked 290x. 290X + 290 CFX.
Quote:


> Originally Posted by *Beezleybuzz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *aaroc*
> 
> I have 3 R9 290s with Elpida and no black screen problem yet. The only problem that I got was some reboots, because I just swapped 2x 7870 for the R9 290s without cleaning drivers. After reinstalling W7 all good, now using W8.1.
> 
> 
> 
> Dude, same here. I have 2 R9 290's (bought separately from different retailers), both with Elpida memory that clocks to 6000mhz, and both with zero freezing or black screens. The only time I had this issue was with 14.6b drivers when they first came out, and anything over 13.12 for crossfire.
> 
> Honestly, I think people need to realize it's not as widespread as it seems.

Both my cards with Hynix & Elpida memory respectively also don't have black screen problem. Don't have problem with drivers either. Running 14.6 beta.


----------



## cephelix

I've only ever had the black screen issue on my first install, on the 13.11 drivers, but it magically disappeared. No idea what I did other than switch to 13.12 then, and I'm using a DVI cable, if that is any indicator.


----------



## pdasterly

What's black screen? Knock on wood.


----------



## Aussiejuggalo

I don't understand how everyone has black/blue screens; the only problem I've had since getting my card was a BSOD caused by hardware acceleration in Flash.

Y'all must be n00bs or something


----------



## the9quad

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> I dont understand how everyone has black/blue screens, only problem I've had since having my card was a bsod caused by hardware acceleration with flash
> 
> Y'all must be n00bs or something


The first card I got was defective and gave black screens. Replacement worked fine. Sometimes it is just bad hardware.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *the9quad*
> 
> The first card I got was defective and gave black screens. Replacement worked fine. Sometimes it is just bad hardware.


Yeah true, sometimes its software tho... or idiot error


----------



## KeepWalkinG

So earlier in the thread I asked if I could change the cooler, but that is not a good idea, and I can only exchange for a Tri-X.

I want to buy the Sapphire R9 290 Vapor-X, but my previous card, an R9 280X Dual-X, had the same fans, and I think you know a lot of people have problems with them (rattling and vibration).

Will the Tri-X or Vapor-X have the same problem? Fan speed will be lower on these, though, so maybe they will have a longer life.


----------



## fateswarm

Quote:


> Originally Posted by *fateswarm*
> 
> Thanks! That's even more than I asked for.
> 
> Apparently, one of them is IRF6894MPbF and the other one IRF6811SPbF. Specs easily googled. Hm, beefy, International Rectifier, those guys are pros, hrm, it appears the big ones can do up to like 60Amps and the small ones about 25, hm, I wonder if the small ones limit the big ones.


So, I asked an expert, and it appears the reason those mosfets look so weird, one having high current capacity, the other limiting the phase with a lower current ability, is that one of the mosfets is compensating for higher power loss (in the form of heat) for the role it is assigned to play.

Hence, each of the board's phases appears to do up to 25 A, though the bigger MOSFETs that could theoretically do up to 60 A help with lower temps.
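
A back-of-envelope check of that per-phase figure: total VRM output current is roughly core power divided by core voltage, split across the phases. The wattage and voltage below are illustrative guesses, not measurements:

```python
# Illustrative per-phase current estimate: I_phase = (P / V) / phases.
def per_phase_current(power_w, vcore_v, phases):
    return power_w / vcore_v / phases

# e.g. ~200 W of core power at ~1.15 V over 6 phases (assumed numbers)
amps = per_phase_current(200, 1.15, 6)
```

That lands in the neighborhood of the 25 A rating of the smaller FETs, which fits the "small ones set the limit" reading.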


----------



## cephelix

Quote:


> Originally Posted by *fateswarm*
> 
> So, I asked an expert, and it appears the reason those mosfets look so weird, one having high current capacity, the other limiting the phase with a lower current ability, is that one of the mosfets is compensating for higher power loss (in the form of heat) for the role it is assigned to play.
> 
> Hence, the board's each phase appears to do up to 25Amps though the bigger mosfets that could theoretically do up to 60A help for lower temps.


Quote:


> Originally Posted by *pdasterly*
> 
> Whats black screen, knock on wood.


it's when your screen blacks out intermittently when it's not supposed to...there's a more detailed thread somewhere here...quite a few people are experiencing it with the 290/x.

@fateswarm that went right over my head...lol


----------



## wrigleyvillain

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Good to hear!
> 
> XFX gets a pretty bad rap but the little experience I've had with them so far has all been positive


I always had as positive a view of their products as any other manufacturer's until recently… it seemed their 79XX cards had some quality and reliability issues, at least the Double D's. Never owned one but saw quite a lot of complaints on the forums. But I'm quite happy with my XFX 290 so far… though a large part of that is the cheap ex-miner price. However, it also can unlock, woooo
Quote:


> Originally Posted by *Beezleybuzz*
> 
> To monitor your VRM temps just run an instance (or a couple for 2 cards) of GPU-Z. The "Sensors" tab keeps track of everything and you can even get log saved to a file if you want.


Yes and such are nice. I am coming off GTX 670s with no VRM sensors.
Quote:


> Originally Posted by *cephelix*
> 
> it's when your screen blacks out intermittently when it's not supposed to...there's a more detailed thread somewhere here...quite a few people are experiencing it with the 290/x.
> 
> @fateswarm that went right over my head...lol


Oh I thought it was just on reboot if at all. Not cold boot and definitely not, like, mid-game. I have only seen it on reboot so far. I can see my machine booting via HDD LED but never get an image on screen.

Yes there is a thread with a poll here in this section.


----------



## Gobigorgohome

Add me to the club!









Information:
4x Sapphire Radeon R9 290X
Stock air-cooler


----------



## PCSarge

Quote:


> Originally Posted by *Gobigorgohome*
> 
> 
> 
> Add me to the club!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Information:
> 4x Sapphire Radeon R9 290X
> Stock air-cooler


you must watercool. or we shall send chuck norris to your house.


----------



## Gobigorgohome

Quote:


> Originally Posted by *PCSarge*
> 
> you must watercool. or we shall send chuck norris to your house.


Yes, that is the plan. Will do EK-FC R9 290X Acetal+Nickel blocks.


----------



## joeh4384

I would use the stock cooler on all 4 just once to hear the noise.


----------



## Dasboogieman

Quote:


> Originally Posted by *fateswarm*
> 
> Thanks! That's even more than I asked for.
> 
> Apparently, one of them is IRF6894MPbF and the other one IRF6811SPbF. Specs easily googled. Hm, beefy, International Rectifier, those guys are pros, hrm, it appears the big ones can do up to like 60Amps and the small ones about 25, hm, I wonder if the small ones limit the big ones.


The most striking part about these vrms is the packaging. They use rather expensive metal packaging for high heat transfer. Its the same as those on the Tahiti cards. They were expecting hot VRM temps.


----------



## Gobigorgohome

Quote:


> Originally Posted by *joeh4384*
> 
> I would use the stock cooler on all 4 just once to hear the noise.


I will probably do that for testing the cards, I am kind of nervous to be honest with you guys. I thought the 2x MSI Lightning R9 290X was bad, and from what I have been told that is nothing compared to the original coolers. Will probably order the waterblocks within a week.


----------



## kahboom

What's a good waterblock for these? Looking at the new Komodos, but Aquacomputer has some nice blocks, minus the island on the block. Real nice backplate too, which never seems to be in stock. What about EK or Koolance? Are their VRM temps OK when overclocked in CrossFire?


----------



## wrigleyvillain

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I will probably do that for testing the cards, I am kind of nervous to be honest with you guys. I thought the 2x MSI Lightning R9 290X was bad, and from what I have been told that is nothing compared to the original coolers. Will probably order the waterblocks within a week.


You know those headphone-like ear protecters they wear at shooting ranges and what not?


----------



## bond32

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I will probably do that for testing the cards, I am kind of nervous to be honest with you guys. I thought the 2x MSI Lightning R9 290X was bad, and from what I have been told that is nothing compared to the original coolers. Will probably order the waterblocks within a week.


Lol, lightnings will seem like a joke compared to the reference cooler... People hate/knock on it, but remember, it works. Yeah it's loud, but it's also cheap. With that said I don't hate it SO much... not until I get two more gpu blocks at least haha.


----------



## Gobigorgohome

Quote:


> Originally Posted by *wrigleyvillain*
> 
> You know those headphone-like ear protecters they wear at shooting ranges and what not?


With Pioneer S-DJ08 and NAD D 1050 I need ear-protecters anyway ...


----------



## Gobigorgohome

Quote:


> Originally Posted by *bond32*
> 
> Lol, lightnings will seem like a joke compared to the reference cooler... People hate/knock on it, but remember, it works. Yeah it's loud, but it's also cheap. With that said I don't hate it SO much... not until I get two more gpu blocks at least haha.


Yes, you are probably right.

The temperature and noise is not the problem, the plan all along was to water cool the graphic cards and I will do that too as soon as I know they are working properly.


----------



## Arizonian

Quote:


> Originally Posted by *Gobigorgohome*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Add me to the club!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Information:
> 4x Sapphire Radeon R9 290X
> Stock air-cooler


Congrats - added









We are up to 400 members, 579 GPUs total.


----------



## rdr09

Quote:


> Originally Posted by *kahboom*
> 
> What's a good waterblock for these? Looking at the new Komodos, but Aquacomputer has some nice blocks, minus the island on the block. Real nice backplate too, which never seems to be in stock. What about EK or Koolance? Are their VRM temps OK when overclocked in CrossFire?


If you'll install the GPU the normal way in the case, then I'd say go with the one with the best-looking backplate. I think the Aquacomputer has an actively cooled backplate, but it's hard to come by.

post#21896

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#


----------



## heroxoot

Quote:


> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gobigorgohome*
> 
> I will probably do that for testing the cards, I am kind of nervous to be honest with you guys. I thought the 2x MSI Lightning R9 290X was bad, and from what I have been told that is nothing compared to the original coolers. Will probably order the waterblocks within a week.
> 
> 
> 
> Lol, lightnings will seem like a joke compared to the reference cooler... People hate/knock on it, but remember, it works. Yeah it's loud, but it's also cheap. With that said I don't hate it SO much... not until I get two more gpu blocks at least haha.

I find this to be an upset because the 7970 lightning cooler was really quiet for me for a year and a half. After that I kind of messed it up and had to RMA, but otherwise it was quiet below 50% for a long time.


----------



## Gobigorgohome

Quote:


> Originally Posted by *heroxoot*
> 
> I find this to be an upset because the 7970 lightning cooler was really quiet for me for a year and a half. After that I kind of messed it up and had to RMA, but otherwise it was quiet below 50% for a long time.


I can sign a paper that the MSI Lightning R9 290X is not quiet in CrossFire; I hit 95C and averaged around 85C while gaming... it performed well though...

After a gaming session of Hitman: Absolution (ultra, 4K) the temperature would not go under 60C at idle either... It did not actually impress me much.









Anyone know how much juice 4x R9 290X overclocked on water need? I am planning on using one Silver Power 500 W PSU for the 24-pin, two D5 pumps, CPU, HDDs and SSDs. I will use my second power supply for the GPUs alone, an EVGA G2 1300W. Later I will add another EVGA G2 1300W to be on the safe side when I overclock my CPU.


----------



## cephelix

Quote:


> Originally Posted by *wrigleyvillain*
> 
> I always had as positive a view of their products as any other manufacturer's until recently… it seemed their 79XX cards had some quality and reliability issues, at least the Double D's. Never owned one but saw quite a lot of complaints on the forums. But I'm quite happy with my XFX 290 so far… though a large part of that is the cheap ex-miner price. However, it also can unlock, woooo
> Yes and such are nice. I am coming off GTX 670s with no VRM sensors.
> Oh I thought it was just on reboot if at all. Not cold boot and definitely not, like, mid-game. I have only seen it on reboot so far. I can see my machine booting via HDD LED but never get an image on screen.
> 
> Yes there is a thread with a poll here in this section.


Well, when I had the issue, any time I launched a program my screen would black out for about 5 seconds, not including the 2-3 seconds where everything just froze. This happened most frequently with Afterburner. I never had black screens on reboot or cold boot.


----------



## heroxoot

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> I find this to be an upset because the 7970 lightning cooler was really quiet for me for a year and a half. After that I kind of messed it up and had to RMA, but otherwise it was quiet below 50% for a long time.
> 
> 
> 
> I can sign a paper on that the MSI Lightning R9 290X is not quiet in crossfire, I hit 95 degree celsius and averaging around 85 degree celsius while gaming .... it performed good though ...
> 
> After a gaming session on Hitman Absolution (ultra, 4K) the temperature would not go under 60 degree celsius idle either ... It did not actually impress me much.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone know how much juice 4x R9 290X overclocked on water need? I am planning on using one Silver Power 500 wattage PSU for 24-pin, two D5-pumps, CPU, HDD's and SSD's. I will use my second power supply for the GPU's alone, it is an EVGA G2 1300W. Later I will add another EVGA G2 1300W to be on the safe side when I overclock my CPU.

I'm glad I'm not the only one who gets really picky about temps like this. My 7970 wouldn't even get into the 60s for lower-usage games like Street Fighter, and now it's 65C+, 75-80C for heavy games like BF4. I removed my OCs for now and it still performs very well in all games. But yeah, the heat is annoying, especially in the summer.


----------



## PCSarge

Well, I pulled my stock cooler because my card was idling in the mid 50s.

Holy crapton of TIM I found. Threw some Freezing Point on it and it dropped 10C+. That was a crappy TIM application, that's for sure.

It's also apparent my "press sample" is an engineering sample due to this:



It's marked "1333 ES" right on it.

But yeah, idle is now below 42C instead of 56C.

That, and those gobs of TIM from the factory are now removed.


----------



## cephelix

Didn't know an ES could be sold as a retail version


----------



## PCSarge

Quote:


> Originally Posted by *cephelix*
> 
> Didnt know ES can be sold as a retail version


My card was a "press sample" I obtained via trade of an old card at the AMD LAN last weekend. It comes with zero warranty: no box, no nada, just a card in an ESD bag was all I got. Didn't realize what was handed to me until now.


----------



## cephelix

^ Ah, that explains it then. Hoping to disassemble my card tomorrow to have a look-see. Hopefully the layout is the same as what is shown on coolingconfigurator; if not I'll have to get a universal block. And hoping I don't tear the thermal pads, as I have no spares.


----------



## PCSarge

Quote:


> Originally Posted by *cephelix*
> 
> ^ ah, that explains it then.hoping to disassemble my card tomorrow to have a look see.hopefully the layout is the same as what is shown on coolingconfig.if not I'll have to get a universal block.and hoping I dont tear the thermal pad as I have no spares


The thermal pads will be literally stuck to the stock cooler; I had to work mine loose from the PCI-E power side over to get it off.

EDIT: you also have to take every damn screw out, because the plate for the RAM and VRMs is mounted to the GPU cooling block.

You need thermal tape, not pads, to hold ramsinks on.


----------



## cephelix

Yeah... will probably have to use a hair dryer to warm them first. Worst-case scenario I have a GTX 570 with an Accelero Hybrid as backup, though I'd rather not use that.


----------



## bond32

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I can sign a paper on that the MSI Lightning R9 290X is not quiet in crossfire, I hit 95 degree celsius and averaging around 85 degree celsius while gaming .... it performed good though ...
> 
> After a gaming session on Hitman Absolution (ultra, 4K) the temperature would not go under 60 degree celsius idle either ... It did not actually impress me much.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone know how much juice 4x R9 290X overclocked on water need? I am planning on using one Silver Power 500 wattage PSU for 24-pin, two D5-pumps, CPU, HDD's and SSD's. I will use my second power supply for the GPU's alone, it is an EVGA G2 1300W. Later I will add another EVGA G2 1300W to be on the safe side when I overclock my CPU.


You should be fine. I would even run 3 cards (like I do) plus everything else on your EVGA, then run the 4th on the 500 W unit. Also, dual pumps are overrated; a single D5 would have you covered unless you run the cards in series.
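
For a rough sanity check on PSU sizing: add up a per-card draw and the rest of the system, then leave headroom so the supply isn't running flat out. The 300 W per overclocked 290X and 250 W for the rest of the system below are assumptions, not measured values:

```python
# Rough PSU sizing for a multi-GPU loop. All wattages are assumptions:
# ~300 W per overclocked 290X under water, ~250 W for CPU/pumps/drives.
def required_psu_watts(gpu_watts, gpu_count, rest_watts, load_factor=0.8):
    # Size the supply so sustained draw sits at ~80% of its rating.
    return (gpu_watts * gpu_count + rest_watts) / load_factor

total = required_psu_watts(gpu_watts=300, gpu_count=4, rest_watts=250)
```

On those numbers, a single 1300 W unit plus a second supply for the remainder is in the right ballpark.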


----------



## pdasterly

There needs to be a program that will throttle the 290X when VRM1 temps exceed XX.
The card throttles when the GPU temp hits 95, why can't the same be applied to VRM1?
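
A sketch of what such a watchdog could look like. Nothing public exposes a "set core clock on VRM temp" call directly, so the `read_vrm1_temp`/`set_core_clock` hooks on the `gpu` object below are hypothetical; a real tool would have to implement them against the driver (e.g. AMD's ADL SDK):

```python
# Hypothetical VRM1 watchdog: poll the sensor, back the core clock off
# above a limit. The 'gpu' object's hooks are placeholders, not a real API.
import time

def clock_for_vrm_temp(temp_c, limit_c=90, normal_mhz=1000, safe_mhz=800):
    """Pick a core clock from VRM1 temperature: back off above the limit."""
    return safe_mhz if temp_c >= limit_c else normal_mhz

def watchdog(gpu, poll_s=1.0, cycles=None):
    """Poll VRM1 and apply the matching clock; 'gpu' supplies the hooks."""
    n = 0
    while cycles is None or n < cycles:
        gpu.set_core_clock(clock_for_vrm_temp(gpu.read_vrm1_temp()))
        time.sleep(poll_s)
        n += 1
```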


----------



## PCSarge

Quote:


> Originally Posted by *pdasterly*
> 
> There needs to be a program that will throttle the 290x when vrm1 temps exceeds XX
> the card throttles when gpu temp hits 95, why can't the same be applied to vrm1?


Because VRMs naturally run hotter than the GPU, lol.

On another note, I'm using Heaven as a burn-in for TIM. Every run I'm shaving a degree or two off load temps; gone from the stock TIM at 77C on Ultra to Freezing Point after 3 runs with a max temp of 73C.

Custom fan curve via Afterburner, of course. But who would use AMD's shoddy fan curve anyway?
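
A custom fan curve like Afterburner's is just linear interpolation between (temperature, fan %) points; the points below are made-up examples, not anyone's recommended curve:

```python
# Linear-interpolation fan curve; the (temp C, fan %) points are examples.
def fan_speed(temp_c, curve=((40, 30), (60, 50), (75, 80), (90, 100))):
    if temp_c <= curve[0][0]:
        return curve[0][1]          # floor: below the first point
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if temp_c <= t1:            # interpolate inside this segment
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]             # ceiling: past the last point
```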


----------



## pdasterly

Quote:


> Originally Posted by *PCSarge*
> 
> cause VRMS naturally run hotter than the GPU. lol
> 
> on another note im using heaven as a burn in for TIM. every run im shaving off a degree or two on load temps. gone from the stock tim @ 77C on ultra to freezing point after 3 runs max temp at 73C.
> 
> custom fan curve via afterburner ofc. but who would use AMD's shoddy fan curve anyways.


My cards are partially under water so my GPU temp isn't an issue. My fans are externally controlled (Kraken G10 bracket); I run the fan at full speed. It's a Zalman Shark Fin and is super quiet at full tilt.


----------



## Blaise170

Quote:


> Originally Posted by *cephelix*
> 
> Didnt know ES can be sold as a retail version


Engineering Samples (ES) and Mechanical Samples (MS) are allowed to be sold without breaking laws, but you cannot sell any internal CPUs with markings like "Confidential" or "Internal".


----------



## cephelix

Quote:


> Originally Posted by *Blaise170*
> 
> Engineering Samples (ES) and Mechanical Samples (MS) are allowed to be sold without breaking laws, but you cannot sell any internal CPUs with markings like "Confidential" or "Internal".


Cool.thanks for that bit of info..


----------



## kpoeticg

Can anybody confirm that the Powercolor PCS+ is the same reference PCB as the single blower 'reference' Powercolor 290x?

Newegg doesn't seem to carry the original anymore except an open-box special, which is surely a black screen card that was returned. I need to order my 3rd card and the first 2 are the original reference Powercolor's with the single blower cooler


----------



## aaroc

Quote:


> Originally Posted by *kpoeticg*
> 
> Can anybody confirm that the Powercolor PCS+ is the same reference PCB as the single blower 'reference' Powercolor 290x?
> 
> Newegg doesn't seem to carry the original anymore except an open-box special, which is surely a black screen card that was returned. I need to order my 3rd card and the first 2 are the original reference Powercolor's with the single blower cooler


YES! More R9 290 POWA!


----------



## PCSarge

Quote:


> Originally Posted by *kpoeticg*
> 
> Can anybody confirm that the Powercolor PCS+ is the same reference PCB as the single blower 'reference' Powercolor 290x?
> 
> Newegg doesn't seem to carry the original anymore except an open-box special, which is surely a black screen card that was returned. I need to order my 3rd card and the first 2 are the original reference Powercolor's with the single blower cooler


http://coolingconfigurator.com/step1_complist?gpu_gpus=1310

If it has the part number listed there, EK says it's reference. If it's a different part number then it is not ref; avoid.


----------



## kpoeticg

Thanx. It seems the same. PowerColor PCS+ AXR9 290X 4GBD5-PPDHE

I didn't think to check EK's coolingconfigurator since i'm using Aquacomputer blocks.

I also wanna make sure they're the same card with a different cooler since i'm gonna be running all 3 in CFX. It's probly best to flash em all to the same bios.

Edit: This is my other 2 Powercolors


----------



## aLb.Strykr

Add me to the list









http://www.techpowerup.com/gpuz/9uhd7/

R9 - 290 || AMD || Reference Cooler


----------



## Arizonian

Quote:


> Originally Posted by *aLb.Strykr*
> 
> Add me to the list
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/9uhd7/
> 
> R9 - 290 || AMD || Reference Cooler


Congrats - added


----------



## PCSarge

Quote:


> Originally Posted by *kpoeticg*
> 
> Thanx. It seems the same. PowerColor PCS+ AXR9 290X 4GBD5-PPDHE
> 
> I didn't think to check EK's coolingconfigurator since i'm using Aquacomputer blocks.
> 
> I also wanna make sure they're the same card with a different cooler since i'm gonna be running all 3 in CFX. It's probly best to flash em all to the same bios.
> 
> Edit: This is my other 2 Powercolors


If it doesn't say "AMD" down by the PCI-E lane when you get it, return it if it has an LF number. LF is a custom PowerColor PCB.


----------



## kpoeticg

My 2 reference PowerColors both have LF PCBs. Something like LF R290 v1.0,

or something along those lines. Definitely the LF and v1.0.


----------



## Draygonn

Just switched the Gelid Icy Vision from my 680 to the 290x


----------



## Arizonian

Quote:


> Originally Posted by *Draygonn*
> 
> Just switched the Gelid Icy Vision from my 680 to the 290x
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Nice - updated









Curious, what's the difference in temps you're getting now?


----------



## PCSarge

Quote:


> Originally Posted by *kpoeticg*
> 
> My 2 reference powercolors both have LF pcb's. Something like LF R290 v1.0
> 
> Or something along those lines. Definitely the LF and v1.0


better check EK's site for the model number then.


----------



## kpoeticg

I just checked again from the link you gave me for the pcs+. It's the same LF PCB. EK says it's reference


----------



## PCSarge

Quote:


> Originally Posted by *kpoeticg*
> 
> I just checked again from the link you gave me for the pcs+. It's the same LF PCB. EK says it's reference


Interesting... why wouldn't they just leave the AMD silkscreen on there, lol. Would make ID easier; all their old LF cards were non-ref, so I got used to the trend.


----------



## Roaches

I think they appeared on early models and review samples before being removed on the full production model?


This one has it.

Notice the AMD decal missing.


----------



## PCSarge

Quote:


> Originally Posted by *Roaches*
> 
> I think they appeared on early models and review samples before being removed on the full production model?
> 
> 
> This one has it.
> 
> Notice the AMD decal missing.


sneaky buggers over at powercolor arent they? lol


----------



## Roaches

We know the board has the same MOSFETs and chokes as the reference design, though the PCS+ was designed as a quick thermal solution around the reference model.
At least I'm happy it has better board components than a vanilla Titan/780 PCB (i.e. fully digital phases vs. analog on NVIDIA's reference board).

Never mind, the Titan reference uses the digital ON Semiconductor NCP4206. I must've read somewhere long ago that it was analog...


----------



## Sgt Bilko

Quote:


> Originally Posted by *PCSarge*
> 
> Interesting... why wouldn't they just leave the AMD silkscreen on there, lol. Would make ID easier; all their old LF cards were non-ref, so I got used to the trend.


My XFX cards are ref design but still have XFX instead of AMD on them


----------



## kpoeticg

Quote:


> Originally Posted by *Roaches*
> 
> We know the board has the same MOSFETs and chokes as the reference design, *though the PCS+ was designed as a quick thermal solution around the reference model.*
> At least I'm happy it has better board components than a vanilla Titan/780 PCB (i.e. fully digital phases vs. analog on NVIDIA's reference board).
> 
> Never mind, the Titan reference uses the digital ON Semiconductor NCP4206. I must've read somewhere long ago that it was analog...


That's exactly what i was trying to find out. Thanx. Since Newegg doesn't sell the old reference Powercolor anymore, i'm gonna need to grab a PCS+ for my 3rd. Just wanted to make sure it's the same card.

Sounds like i should just flash all 3 to the PCS+ BIOS then since they'll be under water


----------



## Vici0us

So I got my R9 290 for extremely cheap a while back. When I try to overclock to 1400 MHz memory / 1061 MHz core, benchmarks don't show it; they show stock clocks of 947 MHz & 500 MHz. But when I run GPU-Z, it shows the proper overclock (1061 MHz & 1400 MHz). Any input would be appreciated. I do not want to flash the card. Thanks in advance guys. It's an ASUS R9 290 w/ blower.


----------



## Sgt Bilko

Powerlimit to 50%


----------



## Mega Man

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gobigorgohome*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Add me to the club!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Information:
> 4x Sapphire Radeon R9 290X
> Stock air-cooler
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> We are up to 400 members 579 GPU's total.

+1. Actually I have 5 personally, and I don't mine, I am just crazy.


----------



## Gobigorgohome

Quote:


> Originally Posted by *bond32*
> 
> You should be fine, I would even run 3 cards (like I do) plus everything else on your evga, then run the 4th on a 500 Watt. Also dual pumps are overrated, a single d5 would have you covered unless you run the cards in series.


I think the 500 W PSU is just going to be temporary, it is just my "leak-testing PSU"; I think I will get another EVGA G2 1300W or an EVGA P2 1000W.

I have used dual D5 pumps before and it works much better than a single; you actually get a little pressure in the tubing with two pumps (which I find very satisfying). The cards will run in semi-parallel, I guess.

Besides, I already own 3x MCP 655 pumps; I do not want two of those pumps gathering dust.


----------



## fateswarm

What is the right way to pull out a big GPU from a motherboard that is free standing on a motherboard box? It was easy to push in. If I try to pull out it seems to require holding it down properly to avoid bending.


----------



## hwoverclkd

Quote:


> Originally Posted by *fateswarm*
> 
> What is the right way to pull out a big GPU from a motherboard that is free standing on a motherboard box? It was easy to push in. If I try to pull out it seems to require holding it down properly to avoid bending.


no other way buddy. just make sure it isn't getting jammed on the side of the pcie latch.


----------



## Draygonn

Quote:


> Originally Posted by *Arizonian*
> 
> Nice - updated
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Curious what's the difference in temps you getting now?


36 idle. Playing Tomb Raider GPU hit 66 and VRM1 hit 72 (using Gelid 290(x) heatsink kit). This card can dump a lot of heat. Reminds me of the GTX480 days.


----------



## ZealotKi11er

VRM1 runs really hot on this card. Under water it was hitting 75C until I changed the thermal pads to better ones. Now it does not break 55C.


----------



## Gobigorgohome

Quote:


> Originally Posted by *ZealotKi11er*
> 
> VRM1 run really hot in this card. Under water is was hitting 75C until i changed the thermal pads to better ones. Now it does not break 55C.


The thermal pads that come with the EK-FC R9 290X, are they good enough or do I need to buy some others?

I am thinking of the thermal pads over VRM1.


----------



## rdr09

Quote:


> Originally Posted by *Gobigorgohome*
> 
> The thermal pads that come with the EK-FC R9 290X, is they good enough or do I need to buy some others?
> 
> I am thinking of the thermal pads over the VRM1's.


use this . . .

http://www.frozencpu.com/products/16880/thr-165/Fujipoly_ModRight_Extreme_System_Builder_Thermal_Pad_Blister_Pack_-_14_Sheet_-_150_x_100_x_10_-_Thermal_Conductivity_110_WmK.html

There is a more expensive one that goes to 17 W/mK; the one that comes with the kit is only 6 or 7 W/mK.

edit: btw, that is enough for at least 4 GPUs, on both VRM 1 and 2.
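
Why the W/mK rating matters: the temperature drop across a pad scales inversely with its conductivity, dT = Q·t / (k·A). The wattage, pad thickness, and contact area below are illustrative guesses, not measurements of the actual VRM:

```python
# Temperature drop across a thermal pad: dT = Q * thickness / (k * area).
def pad_delta_t(watts, thickness_m, k_w_per_mk, area_m2):
    return watts * thickness_m / (k_w_per_mk * area_m2)

# Assumed figures: ~30 W through a 1.0 mm pad over ~6 cm^2 of VRM area.
stock_pad = pad_delta_t(30, 0.001, 6.0, 6e-4)   # roughly 8.3 C across the pad
fujipoly  = pad_delta_t(30, 0.001, 17.0, 6e-4)  # roughly 2.9 C across the pad
```

Cutting the drop across the pad by a few degrees per watt path is consistent with the ~20C improvements people report from better pads.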


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Are the thermal pads that come with the EK-FC R9 290X good enough, or do I need to buy some others?
> 
> I am thinking of the thermal pads over VRM1.


It's horrible. I was really pushing VRM1 with mining, but even so I got an almost 20C drop by going to the 11 W/mK pads. Stock ones I think are 5 W/mK.


----------



## PachAz

How can an R9 290 throttle the memory with the power limit at 0%? I mean 947 and 500 MHz, isn't that kind of funny? Mine stays at 1250 MHz even at 0% power limit, which is good.


----------



## Sgt Bilko

Quote:


> Originally Posted by *PachAz*
> 
> How can an R9 290 throttle the memory with the power limit at 0%? I mean 947 and 500 MHz, isn't that kind of funny? Mine stays at 1250 MHz even at 0% power limit, which is good.


Because some games won't utilise the card at max clocks (when overclocking) and you need to up the power limit or use a different driver to make it do so.


----------



## kizwan

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> VRM1 run really hot in this card. Under water is was hitting 75C until i changed the thermal pads to better ones. Now it does not break 55C.
> 
> 
> 
> The thermal pads that come with the EK-FC R9 290X, is they good enough or do I need to buy some others?
> 
> I am thinking of the thermal pads over the VRM1's.


----------



## PachAz

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Because some games won't utilise the card at max clocks (when overclocking) and you need to up the power limit or use a different driver to make it do so.


Didn't the guy run stock clocks? 947 seems like a stock core clock to me; maybe there are other cards running lower core clocks, I don't know.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Vici0us*
> 
> So I got my R9 290 extremely cheap a while back. When I try to overclock the memory to 1400 MHz and the core to 1061 MHz, benchmarks don't show it; they show stock clocks, 947 MHz & 500 MHz. But when I run GPU-Z, it shows the proper overclock (1061 MHz & 1400 MHz). Any input would be appreciated; I do not want to flash the card. Thanks in advance, guys. It's an ASUS R9 290 w/ blower.


Quote:


> Originally Posted by *PachAz*
> 
> Didn't the guy run stock clocks? 947 seems like a stock core clock to me; maybe there are other cards running lower core clocks, I don't know.


Ahem......


----------



## wrigleyvillain

Quote:


> Originally Posted by *cephelix*
> 
> Well, when I had the issue, anytime I launched a program my screen would black out for about 5 seconds, not including the 2-3 seconds where everything just froze. This happened most frequently with Afterburner. I never had black screens on reboot or cold boot.


Well, I left it running Heaven last night while I went to the kitchen and returned to a black screen. It may have been frozen up for some reason too; hitting the power button on my case did not invoke a proper shutdown like it does when it black-screens at reboot.


----------



## PachAz

Yes, that Heaven black screen problem is a common issue on these GPUs. If it happens at stock settings, you need to send it back.


----------



## aaroc

Is there a tool that automatically looks for artifacts or other problems, like memtest86+ does for memory; something I can run, leave running, and check the next day to see if it found a problem? I'm buying a used R9 290 with an EK block and want to test it through the weekend. Thanks!


----------



## ZealotKi11er

Quote:


> Originally Posted by *aaroc*
> 
> Is there a tool that automatically looks for artifacts or other problems, like memtest86+ does for memory; something I can run, leave running, and check the next day to see if it found a problem? I'm buying a used R9 290 with an EK block and want to test it through the weekend. Thanks!


No. It's very simple to test the memory: increase it by 25 MHz at a time and test performance. If you start getting lower performance, it means you are getting memory errors. 1500 MHz should be no problem for most cards if you increase core voltage.
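The step-and-check method above can be sketched in a few lines. This is an illustrative sketch only; `run_benchmark` is a hypothetical stand-in for whatever benchmark you actually score each clock with, and the numbers in `fake_bench` are made up:

```python
# Sketch of the "step up 25 MHz and watch the score" method.
# run_benchmark is a hypothetical callable: on real hardware you would set
# the memory clock, run a benchmark loop, and return its score.

def find_max_stable_mem_clock(run_benchmark, start_mhz=1250,
                              step_mhz=25, limit_mhz=1625):
    """Raise the memory clock in steps; stop when the score drops."""
    best_clock = start_mhz
    best_score = run_benchmark(start_mhz)
    clock = start_mhz + step_mhz
    while clock <= limit_mhz:
        score = run_benchmark(clock)
        if score <= best_score:
            # Higher clock but lower score: error correction is retrying
            # transfers, so the previous step was the usable limit.
            break
        best_clock, best_score = clock, score
        clock += step_mhz
    return best_clock

# Simulated score curve: scales with clock until 1500 MHz, then tanks.
def fake_bench(mhz):
    return mhz if mhz <= 1500 else mhz - 200

print(find_max_stable_mem_clock(fake_bench))  # 1500
```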


----------



## Sgt Bilko

Quote:


> Originally Posted by *aaroc*
> 
> Is there a tool that automatically looks for artifacts or other problems, like memtest86+ does for memory; something I can run, leave running, and check the next day to see if it found a problem? I'm buying a used R9 290 with an EK block and want to test it through the weekend. Thanks!


Yup, credit goes to Das for it:








Quote:


> Originally Posted by *Dasboogieman*
> 
> For anyone interested in how stable your memory OC really is, I suggest checking out MemtestCL: https://simtk.org/project/xml/downloads.xml?group_id=385
> You'll have to register an account to download since it's an academic institution.
> 
> As it turns out, my 1500 MHz mem clocks were throwing out loads of silent errors, but everything stabilized once I dialed it back to 1475 MHz.
> 
> Also, I'm curious as to what results the people with chronic black screens get at stock; if it throws out loads of errors, then it pretty much confirms that the black screen is caused by unstable memory.
> 
> This nifty little app may also answer the question of whether Aux Voltage helps memory OCs once and for all, since it is far more sensitive than any benchmark.
> 
> Should probably mention, I'm using the latest AMD driver (the one for Watch Dogs). IIRC 13.12 and 14.4 throw up a lot of false negatives.
> 
> Edit: just also tested with the fans at 100% at stock; extra cooling on the core + VRAM doesn't seem to help the mem stability @ 1500 MHz. I guess Roboyto was correct when he tried to use Fujipoly for the RAM.


----------



## tsm106

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *aaroc*
> 
> Is there a tool that automatically looks for artifacts or other problems like memtest86+ for memory, run and live it running and look the next day if a the program found a problem? Im buying an used R9 290 with EK block and want to test it through the weekend. thanks!
> 
> 
> 
> Yup, Credit goes to Das for it
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> For anyone interested in really how stable your memory OC is. I suggest checking out MemtestCL https://simtk.org/project/xml/downloads.xml?group_id=385
> You'll have to register an account to download since it's an academic institution.
> 
> As it turns out, my 1500mhz mem clocks were throwing out loads of silent errors but everything stabilized once I dialed it back to 1475mhz.
> 
> Also, I'm curious as to what results the people with chronic black screen get at stock, if it throws out loads of errors then it pretty much confirms that blackscreen is caused by unstable memory.
> 
> This nifty little app may also answer the question of whether Aux Voltage helps memory OCs once and for all since it is far more sensitive than any benchmark.
> 
> Should probably mention, I'm using the latest AMD driver (the one for watchdogs). IIRC 13.12 and 14.4 throws up a lot of false negatives.
> 
> *Edit: just also tested with then Fans at 100% at stock, extra cooling on the core + VRAM doesn't seem to help the mem stability @ 1500mhz. I guess Roboyto was correct when he tried to use FujiPloy for the RAM.*
> 

Hmm... some of us were using Fujis a long time before anyone decided to create a thread on the gains of Fujis.


----------



## Sgt Bilko

Quote:


> Originally Posted by *tsm106*
> 
> Hmm... some of us were using Fujis a long time before anyone decided to create a thread on the gains of Fujis.


Things get lost pretty easily in a thread containing over 25k posts









Not to mention of course the threads before this one and the ones before that.

Knowledge gets passed down like normal









Nice Pic btw, I'm sure the Lt approves


----------



## sugarhell

Quote:


> Originally Posted by *tsm106*
> 
> Hmm... some of us were using Fujis a long time before anyone decided to create a thread on the gains of Fujis.


I never mentioned Fuji because I believed it was fundamental for GPU watercooling.

We've talked about Fuji since the start of the 7970 thread.


----------



## Gobigorgohome

I am so happy with the reference cooler on the R9 290X. Yes, it gets hot, but loud? My GTX 660 Ti makes more noise than this does.

Running medium settings on BF4 multiplayer on 4K is totally okay with just one card.

In other words, one card: checked, three more to go!


----------



## anubis1127

Quote:


> Originally Posted by *Gobigorgohome*
> 
> *I am so happy with the reference cooler on the R9 290X*, yes it gets hot, but loud? My GTX 660 Ti makes more sound than this does.
> 
> Running medium settings on BF4 multiplayer on 4K is totally okay with just one card.
> 
> In other words, one card: checked, three more to go!


LOL, I think that is the first time I've read that statement. Glad you're enjoying the card though!!


----------



## wrigleyvillain

Quote:


> Originally Posted by *PachAz*
> 
> Yes, that Heaven black screen problem is a common issue on these GPUs. If it happens at stock settings, you need to send it back.


Yeah, that's not an option. I run Heaven about as often as I reboot, though, so if it doesn't do it mid-game, I'll live. Guess I shall see. Other benches have run fine, but I have not extensively tested this card yet.


----------



## tsm106

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Hmm... some of us were using Fujis a longtime before anyone decided create a thread on the gains of Fujis.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Things get lost pretty easy in a thread containing over 25k posts
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not to mention of course the threads before this one and the ones before that.
> 
> Knowledge gets passed down like normal
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nice Pic btw, I'm sure the Lt approves

Yea, he's all giggly inside. BTW, stability testers are more difficult to use with Hawaii and AMD's type of error correction. By the time any errors, i.e. artifacts, are seen/detected, the card has already passed its limit. Thus you have to keep in mind how far ahead of the "visible" errors you have to be in order for these apps to be meaningful. On the other hand, if you are benching, artifacts are par for the course.
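A toy model of that behaviour (this is not AMD's actual EDC implementation, and every number in it is invented for illustration): once past the memory's real limit, error correction retries bad transfers, so effective bandwidth and benchmark scores drop even though the image stays clean.

```python
# Toy model of error-correcting memory past its stable limit.
# All thresholds and rates below are made up for illustration.

def error_rate(mem_mhz, limit_mhz=1500):
    """Fraction of transfers that fail and must be retried."""
    if mem_mhz <= limit_mhz:
        return 0.0
    # Error rate grows quickly once past the silicon's limit.
    return min(1.0, (mem_mhz - limit_mhz) / 200.0)

def effective_bandwidth(mem_mhz):
    """Useful bandwidth after retries, in arbitrary units."""
    return mem_mhz * (1.0 - error_rate(mem_mhz))

# Below 1500 MHz the score rises with clock; above it, it falls,
# with no visible artifacts needed to spot the limit.
for clk in (1450, 1500, 1550, 1600):
    print(clk, effective_bandwidth(clk))
```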

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Hmm... some of us were using Fujis a longtime before anyone decided create a thread on the gains of Fujis.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I never mention about fuji because i believed that it was fundamental for gpu watercooling
> 
> We talked about fuji from the start of 7970 thread

That was like over two years ago? lol


----------



## Faster_is_better

The red side grows stronger by the day


----------



## joeh4384

Quote:


> Originally Posted by *Faster_is_better*
> 
> The red side grows stronger by the day


There are too many good deals on used and new AMD cards to go Nvidia if you are building ATM.


----------



## anubis1127

Quote:


> Originally Posted by *joeh4384*
> 
> There are too many good deals on used and new AMD cards to go Nvidia if you are building ATM.


Really depends what you want to use the build for. In general I tend to agree though.


----------



## Gobigorgohome

Quote:


> Originally Posted by *anubis1127*
> 
> LOL, I think that is the first time I've read that statement. Glad you're enjoying the card though!!


I mean, all R9 290X coolers are bad, but this is not bad for a reference cooler. The MSI Lightning R9 290X got as hot as this at 80 MHz more, and at $305 more too. I will water cool because I am not happy with 72 degrees Celsius idling temperatures, but aside from that this is a good card.


----------



## heroxoot

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *anubis1127*
> 
> LOL, I think that is the first time I've read that statement. Glad you're enjoying the card though!!
> 
> 
> 
> I mean, all R9 290X coolers are bad, but this is not bad to be reference. The MSI Lightning R9 290X got as hot as this at 80 mhz more and at 305 more USD too. I will water cool because I am not happy with 72 degree celsius idleing temperatures, but aside from that this is a good card.

That's pretty bad. My 290X is an MSI Gaming edition and it idles at 49C with the AC on and 52-ish with it off. At load it stays below 80C on the default clocks. I thought it was literally the same Twin Frozr, but maybe they goofed somewhere.


----------



## cephelix

72 deg on idle? That's terrible... mine does maybe 50 max on idle... 60-odd on load... ambients are around 30.


----------



## Hl86

I have some copper bricks and I was thinking of buying some thermal tape and then putting them on the backside of the VRM. Anyone tried this?


----------



## tsm106

Quote:


> Originally Posted by *Hl86*
> 
> I have some copper bricks and I was thinking of buying some thermal tape and then putting them on the backside of the VRM. Anyone tried this?


It's not worth it IMO when you can more easily rubber-cement/glue a 70mm fan over the backside.


----------



## cephelix

It should work... I think Roboyto has said that it would reduce VRM temps a bit more.


----------



## MTDEW

Quote:


> Originally Posted by *heroxoot*
> 
> That's pretty bad. My 290X is a Gaming edition MSI and it idles 49c with the AC on and 52ish with it off. At load it stays below 80c on the default clocks. I thought it was literally the same twin frozr but maybe they goofed somewhere.


Quote:


> Originally Posted by *cephelix*
> 
> 72 deg on idle?that's terrible.....mine does maybe 50max on idle....60 odd on load...ambients are around 30


Yeah, I think he was just embellishing it a little to get his point across about water cooling them.

But I'm glad he did.

I just got an R9 290 Tri-X and have been testing it, and came on here specifically to see if my idle temps were good, since I cannot compare to reviews, which use an open test bench versus mine being inside my case.
Anandtech shows 30C idle temps for this card in their review. I was hoping that was low and mine was at least close to normal for inside-a-case idle temps, because if it wasn't, something was wrong here; my HAF X has excellent airflow.

I currently idle at 43C with ambient around 23C with the AC on right now.
So I guess that is normal.

I expected to have to search through a lot of pages in this thread to find "real world" idle temps, but luckily you guys just answered the question for me.

Thanks for supplying the info I needed without me having to ask or search.


----------



## Vici0us

So, my stock reference ASUS R9 290 comes at 947 MHz core & 5000 MHz mem. GPU-Z shows 1061 MHz & 1400 MHz (overclocked), but when I run benchmarks they show stock speeds. How can I prevent this? I even replaced my 990FX mobo. Is there anything I can do to fix this? I am very frustrated! Any help would be appreciated!
If it helps, here are my current specs:
AMD FX-8350
ASRock 990FX Fatal1ty Killer
ASUS R9 290
16GB (2x8GB) Crucial Tactical DDR3 @ 1600MHz
Cooler Master Seidon 120M w/ 120mm Noctua
240GB PNY XLR8 SSD
1TB Seagate Barracuda - 64MB cache
Corsair CX750M
Windows 7 Pro 64-bit
Corsair 300R

Please help me out, I've tried just about everything. Thanks in advance, guys.


----------



## Gobigorgohome

Quote:


> Originally Posted by *heroxoot*
> 
> That's pretty bad. My 290X is a Gaming edition MSI and it idles 49c with the AC on and 52ish with it off. At load it stays below 80c on the default clocks. I thought it was literally the same twin frozr but maybe they goofed somewhere.


My bad, right after gaming it was 72 degrees; after 5 minutes it was 50 degrees. I am used to the temperature dropping more quickly.
Quote:


> Originally Posted by *cephelix*
> 
> 72 deg on idle?that's terrible.....mine does maybe 50max on idle....60 odd on load...ambients are around 30


I have yet to see anything lower than 50 degrees on idle, but they are going on water so I do not care how the reference coolers do.


----------



## disintegratorx

I just switched from MSI AB to Sapphire Trixx and my overclocks are now way more stable.


----------



## rdr09

Quote:


> Originally Posted by *Vici0us*
> 
> So since, my stock reference ASUS R9 290 comes 947mhz & 5000mhz mem. GPU-Z show's 1061mhz & 1400mhz (overclock). When I run benchmarks it shows stock speeds.. How can I prevent this? I even replaced a 990FX mobo. Is There anything I can do to fix this? I am very frustrated! Any help would be appreciated!
> if this help here are current Specs:
> AMD FX-8350
> ASRock 990FX Fatal1ty Killer
> ASUS R9 290
> 16GB (2X8GB) Crucial Tactical DDR3 @ 1600mhz
> Cooler Master Seidon 120m w/ 120mm Noctua
> 240GB SSD PNY XLR8
> 1TB Seagate Barraduca - 64mb
> Corsair CX750M
> Windows 7 Pro -64 bit
> Corsair 300R
> 
> Please help me out, I've tried just about everything, thanks in advance guys.


You mean a bench like the Futuremark benches would show 947/1250 even after running OC'd? Some benchmarks will not read OCs correctly depending on the driver you are using. The best way to make sure your card is functioning correctly is to post your results and let others compare theirs. For example, the latest 3DMark bench, called Sky Diver, does not read my OC correctly.

http://www.3dmark.com/3dm/3281686

That's at 1280/1600 (or 6400 effective).

What driver are you using?


----------



## heroxoot

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> That's pretty bad. My 290X is a Gaming edition MSI and it idles 49c with the AC on and 52ish with it off. At load it stays below 80c on the default clocks. I thought it was literally the same twin frozr but maybe they goofed somewhere.
> 
> 
> 
> My bad, right after gaming it was 72 degrees, after 5 minutes it was 50 degrees. I am used to the temperature to lower quicklier.
> Quote:
> 
> 
> 
> Originally Posted by *cephelix*
> 
> 72 deg on idle?that's terrible.....mine does maybe 50max on idle....60 odd on load...ambients are around 30
> 
> 
> I have yet to see anything lower than 50 degrees on idle, but they are going on water so I do not care how the refernce coolers do.

Prior to the 14.6 driver my idle was 1 or 2C lower, and that was with the messy thermal paste. It idles a bit lower since I cleaned it up, almost as low as it did before 14.6.


----------



## Loktar Ogar

*Tri-X / PCS+ / 3rd party air cooler owners:*

Have you guys tried using Fujipoly 11-17 W/mK thermal pads on VRM1? Were there any significant thermal improvements?


----------



## wrigleyvillain

Quote:


> Originally Posted by *disintegratorx*
> 
> I just switched from MSI AB to Sapphire Trixx and my overclocks are now going way more stable.


Yes, I really don't like AB, even when it doesn't refuse to launch due to "missing components". What I miss most about my 670s at this point is Precision X, I have to say.

Fun, not-well-known fact: Trixx was coded by W1zzard of TPU (the GPU-Z dev as well).


----------



## heroxoot

MSI got in touch with me today. They emailed me a new bios that solves my lack of fan control in MSI AB. He said it should solve the problem completely but if not it is 100% for MSI AB.


----------



## PachAz

My reference is like 40-50C at idle depending on ambient, but at 100% load it is like 94-95C and thermal throttles a little. Isn't it fun?


----------



## anubis1127

Quote:


> Originally Posted by *wrigleyvillain*
> 
> Yes I really don't like AB, even when it doesn't refuse to launch *due to "missing components".* What I miss most about my 670s at this point is Precision-X, I have to say.
> 
> Fun, not-well-known fact: Trixx was coded by Wizzard of TPU (GPU-Z dev as well).


You do know that "missing components" message appears when you are trying to use an expired beta version, correct?


----------



## PCSarge

Quote:


> Originally Posted by *PachAz*
> 
> My reference is like 40-50C at idle depending on ambient, but at 100% load it is like 94-95C and thermal throttles a little. Isn't it fun?


Do a TIM change on the card; I shaved over 10C off load temps by changing the TIM.


----------



## bond32

Quote:


> Originally Posted by *PCSarge*
> 
> Do a TIM change on the card; I shaved over 10C off load temps by changing the TIM.


Think I am going to do that on my two newer cards. Can't quite afford the blocks yet so perhaps that would help. Well no perhaps about it of course it would help...


----------



## PCSarge

Quote:


> Originally Posted by *bond32*
> 
> Think I am going to do that on my two newer cards. Can't quite afford the blocks yet so perhaps that would help. Well no perhaps about it of course it would help...


You're gonna need lots of alcohol to get it off; the stock goo is pretty thick on these.


----------



## Tokkan

Well, my card can't keep up with the heat on its own anymore. Summer is hitting my desktop pretty badly, so I had to make a custom fan curve in MSI AB.
I was getting pink screens before crashing. With the fan curve everything runs smooth.
The custom fan curve keeps it under 70°C now. It's a pretty aggressive curve: loads of noise, but I use headphones so I don't notice it until I take them off.
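For what it's worth, a custom fan curve is just piecewise-linear interpolation between (temperature, fan %) points. A minimal sketch, with made-up points rather than the actual curve described above:

```python
# Illustrative fan curve: linear interpolation between user-defined
# (temperature C, fan %) points. These exact values are invented.
CURVE = [(30, 20), (50, 40), (65, 70), (75, 100)]

def fan_percent(temp_c, curve=CURVE):
    """Piecewise-linear fan duty for a given temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the first point: floor speed
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # above the last point: max speed
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # Linear interpolation between the two surrounding points.
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(80))    # 100
print(fan_percent(57.5))  # 55.0
```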


----------



## wrigleyvillain

Quote:


> Originally Posted by *anubis1127*
> 
> You do know that "missing components" message appears when you are trying to use an expired beta version, correct?


Yes, I suppose I do after having the same problem years ago, but why the hell would G3D still be hosting an expired version of 3.0.1? At any rate, I don't like it anyway from past experience with instability, and I guess I only really bothered with it this time for the monitoring graphs. Trixx will suffice and not give me any grief along the way.


----------



## Widde

Quote:


> Originally Posted by *wrigleyvillain*
> 
> Yes I really don't like AB, even when it doesn't refuse to launch due to "missing components". What I miss most about my 670s at this point is Precision-X, I have to say.
> 
> Fun, not-well-known fact: Trixx was coded by Wizzard of TPU (GPU-Z dev as well).


The only thing keeping me away from Trixx is that it doesn't apply my custom fan curve to the 2nd card; I have not been able to make it work.


----------



## pdasterly

Quote:


> Originally Posted by *Widde*
> 
> The only thing keeping me away from Trixx is that it doesn't apply my custom fan curve to the 2nd card; I have not been able to make it work.


Did you check sync tab for multi gpu?


----------



## Widde

Quote:


> Originally Posted by *pdasterly*
> 
> Did you check sync tab for multi gpu?


Yes


----------



## heroxoot

MSI gave me a BIOS that works on my GPU and solves the fan problem where I could not adjust it on 14.1 and higher. Oh great, but it locks my card down to a 290. It was showing 64/160 ROPs/TMUs instead of 64/176, and 25xx shaders instead of 2816. It also was clocking me at the 290 core speed, which is 977 MHz. I switched back to my old BIOS till I contact MSI again, but is there any way to prove this card is in fact a 290X? I feel like I may have a 290 running a 290X BIOS now, but wouldn't it show 64/160 and not 64/176 with the original BIOS?


----------



## rdr09

Quote:


> Originally Posted by *PachAz*
> 
> My reference is liek 40-50C depending on ambient in idle, but at 100% load it is like 94-95C and thermal throttle little. Isnt it fun?


Information gets buried in this thread so fast, but that info is on page 1.


----------



## disintegratorx

Quote:


> Originally Posted by *wrigleyvillain*
> 
> Yes I really don't like AB, even when it doesn't refuse to launch due to "missing components". What I miss most about my 670s at this point is Precision-X, I have to say.
> 
> Fun, not-well-known fact: Trixx was coded by Wizzard of TPU (GPU-Z dev as well).


Yeah, it was making my whole system not even boot well and causing my sound to crash, so that's why I decided to try out Trixx. Glad I did...


----------



## 66racer

Quote:


> Originally Posted by *heroxoot*
> 
> MSI gave me a bios that works on my GPU and solves the fan problem where I could not adjust it on 14.1 and higher. Oh great, but it locks down my card to a 290. It was showing 64/160 ROPS instead of 64/176 and 25xx shaders instead of 2816. It also was clocking me at 290 core speed which is 977mhz. I switched back to my old bios till I contact MSI again but is there anyway to prove this card is infact a 290X? I feel like I may have a 290 running a 290X bios now, but wouldn't the shaders show 64/160 and not 64/176 with the original bios?


I think the only thing to do really is run a few benchmarks with each BIOS, or at least one benchmark. Make sure they are clocked the same so you can see if it truly "unlocks" the card. I don't have a 290 so I can't really comment beyond that... just curious about them now that they are $400 again.








Quote:


> Originally Posted by *joeh4384*
> 
> There are too many good deals on used and new AMD cards to go Nvidia if you are building ATM.


I can't disagree with that, lol. I mean the 770 and 280X are close enough, but the 290 is $400 right now and a 780 is $520... Personally I haven't run ATI/AMD in years, so I wish the 780 was cheaper, but I am doing my reading; might get a 290 sooner rather than later.


----------



## heroxoot

Quote:


> Originally Posted by *66racer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> MSI gave me a bios that works on my GPU and solves the fan problem where I could not adjust it on 14.1 and higher. Oh great, but it locks down my card to a 290. It was showing 64/160 ROPS instead of 64/176 and 25xx shaders instead of 2816. It also was clocking me at 290 core speed which is 977mhz. I switched back to my old bios till I contact MSI again but is there anyway to prove this card is infact a 290X? I feel like I may have a 290 running a 290X bios now, but wouldn't the shaders show 64/160 and not 64/176 with the original bios?
> 
> 
> 
> I think the only thing to do really is run a few benchmarks with each bios, or at least one benchmark. Make sure they are clocked the same so you can see if it truely "unlocks" the card. I dont have a 290 so cant really comment beyond that....just curious about them now that they are $400 again
> 
> 
> 
> 
> 
> 
> 
> 

I've talked to a friend and he can confirm I do have a 290X. He said when he flashed his 7950 to a 7970 BIOS, the shaders and ROPs did not change, thus the original BIOS showing me 290X specs is not a lie. So now the problem remains that this BIOS is for the 290, not the 290X. I have to wait till Monday to talk to the guy who sent me the BIOS, or find a way to edit the BIOS to unlock the shaders and such.


----------



## PCSarge

catzilla benchmark @ 720P 290 @ 1150/1350

someone else wanna run that and compare?


----------



## bond32

Quote:


> Originally Posted by *PCSarge*
> 
> 
> 
> catzilla benchmark @ 720P 290 @ 1150/1350
> 
> someone else wanna run that and compare?


Downloading now... I'll bite.


----------



## PCSarge

Quote:


> Originally Posted by *bond32*
> 
> Downloading now... I'll bite.


To unlock 720p you need to register on the site; they give you a free key if you do. Otherwise it's a 580p benchmark, lol.


----------



## bond32

Quote:


> Originally Posted by *PCSarge*
> 
> To unlock 720p you need to register on the site; they give you a free key if you do. Otherwise it's a 580p benchmark, lol.


Sorry, but that benchmark actually had me until I heard the dubstep garbage. Now it's promptly uninstalled.


----------



## Widde

Thinking of slapping a couple of H55s on my 290s while keeping the shroud, fan and the bottom plate, or is there a cooler that would make a better fit?


----------



## breenemeister

Quote:


> Originally Posted by *Widde*
> 
> Thinking of slapping a couple of H55s on my 290s while keeping the shroud, fan and the bottom plate, or is there a cooler that would make a better fit?


With two fans on them, H55s may just barely be adequate on a 290. Go with H60s at least and use two fans. On my 7970, the H60 was better than the H55 by over 10C.


----------



## PCSarge

Quote:


> Originally Posted by *bond32*
> 
> Sorry, but that benchmark actually had me until I heard the dub-step garbage. Now it's promptly uninstalled.


this is why we mute sound.


----------



## pdasterly

xfire disabled


xfire enabled


Latest beta driver 14.6rc


----------



## PCSarge

Quote:


> Originally Posted by *pdasterly*
> 
> xfire disabled
> 
> 
> xfire enabled
> 
> 
> Latest beta driver 14.6rc


I'm running 13.12 still; had issues with 14.4/14.6.

Was that in 720p?

And did you run my OC numbers or your own?


----------



## Widde

Quote:


> Originally Posted by *breenemeister*
> 
> With two fans on them, H55s may just barely be adequate on a 290. Go with H60s at least and use two fans. On my 7970, the h60 was better than the h55 by over 10C.


Smack on a couple of delta fans







No, but joking aside, I'm thinking about H80i's; they're like 15 bucks more. Or I'm just gonna save up and buy me some $1200 watercooling (includes a new case).







Been wanting to do a custom loop for a couple of years now.


----------



## pdasterly

720p, 1150 GPU, 1350 mem.
Had to bump voltage to keep from crashing.


----------



## fateswarm

Daily reminder that the MOSFETs (in the VRM) can run properly up to around 130C. They are not CPUs. That being said, it might be safer for nearby components not to go above 100C.


----------



## Arizonian

Benchmark Catzilla thread if you guys like to post some scores up.

http://www.overclock.net/t/1489023/catzilla-top-30-thread


----------



## PCSarge

Quote:


> Originally Posted by *pdasterly*
> 
> 720p, 1150 gpu, 1350 mem
> Had to bump voltage to keep from crashing


That's odd... my 290 does it with no voltage or power limit bumps... I must have a strong card, lol.


----------



## kizwan

Quote:


> Originally Posted by *PCSarge*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pdasterly*
> 
> 720p, 1150 gpu, 1350 mem
> Had to bump voltage to keep from crashing
> 
> 
> 
> thats odd... my 290 does it with no voltage or power limit bumps....i must have a strong card lol

No power limit? Are you sure it's not throttling?


----------



## PCSarge

Quote:


> Originally Posted by *kizwan*
> 
> No power limit? Are you sure it's not throttling?


Sure as heck doesn't look like it by the score I got, nor did clocks drop in AB on the monitoring graphs during the benchmark.


----------



## Dasboogieman

Quote:


> Originally Posted by *Hl86*
> 
> I have some copper bricks and i was thinking of buying some thermal tape and the put them on the backside of vrm. Anyone tried this?


I did 2 mods.
Basically I attached an EK backplate to my 290 Tri-X with Phobya thermal pads on the VRM area. I then attached an old NVIDIA GT210 blower to the VRM area on the backplate, with Sekisui tape on the edges of the cooler and thermal paste to bridge the half-mm gap.
Temperatures on the VRM side dropped by 15 degrees.
I couldn't keep the mod though, since my ASUS Xonar STX would no longer fit in between the card and the NH-D14 cooler.
So I instead turned the VRM area into a hedgehog by gluing on about 15 RAM sinks using thermal tape. The temperature improvement is a more modest 7 degrees, but I can use my Xonar card again.
I'll re-attempt the mod once I can watercool, since I'm really space-limited at the moment.


----------



## cephelix

well, update on my card @Dasboogieman and @Red1776.

I checked my card as per your instructions...

1. No AMD silkscreen print near pci-e slot
2. Chokes are gold (viewed through the cooler with a flashlight)








3. Capacitor is moved inwards i.e. not right next to the 8-pin connector

So in conclusion, I cannot fit a full waterblock on my card, which really sucks. Especially since I've bought the rest of the components for a custom loop barring the Fujipoly thermal tape and the EK waterblock/backplate/reinforcing bracket combo.

My only other option would be to get a universal block... maybe the EK Thermosphere. The things I don't know are the size of the heatsinks required for VRMs 1 and 2 and the VRAM modules, and the thickness of the thermal tape: when I was planning for the full-cover block it was 1.0mm for VRAM and 0.5mm for the VRMs. Can I now get 0.5mm or 1mm for everything? Which would be better? And of course, how to secure the heatsinks to the VRMs.

If anyone with experience in this could help, it would be greatly appreciated


----------



## Sgt Bilko

Quote:


> Originally Posted by *cephelix*
> 
> well, update on my card @Dasboogieman and @Red1776.
> 
> I checked my card as per your instructions...
> 
> 1. No AMD silkscreen print near pci-e slot
> 2. Chokes are gold (viewed through the cooler with a flashlight)
> 
> 
> 
> 
> 
> 
> 
> 
> 3. Capacitor is moved inwards i.e. not right next to the 8-pin connector
> 
> So in conclusion, I cannot fit a full waterblock on my card, which really sucks. Especially since I've bought the rest of the components for a custom loop barring the Fujipoly thermal tape and the EK waterblock/backplate/reinforcing bracket combo.
> 
> My only other option would be to get a universal block... maybe the EK Thermosphere. The things I don't know are the size of the heatsinks required for VRMs 1 and 2 and the VRAM modules, and the thickness of the thermal tape: when I was planning for the full-cover block it was 1.0mm for VRAM and 0.5mm for the VRMs. Can I now get 0.5mm or 1mm for everything? Which would be better? And of course, how to secure the heatsinks to the VRMs.
> 
> If anyone with experience in this could help, it would be greatly appreciated


Have you checked out the Rev 2 version of the EK block?

They made it for the MSI gaming and Giga WF3 cards


----------



## Dasboogieman

Quote:


> Originally Posted by *cephelix*
> 
> well, update on my card @Dasboogieman and @Red1776.
> 
> I checked my card as per your instructions...
> 
> 1. No AMD silkscreen print near pci-e slot
> 2. Chokes are gold (viewed through the cooler with a flashlight)
> 
> 
> 
> 
> 
> 
> 
> 
> 3. Capacitor is moved inwards i.e. not right next to the 8-pin connector
> 
> So in conclusion, I cannot fit a full waterblock on my card, which really sucks. Especially since I've bought the rest of the components for a custom loop barring the Fujipoly thermal tape and the EK waterblock/backplate/reinforcing bracket combo.
> 
> My only other option would be to get a universal block... maybe the EK Thermosphere. The things I don't know are the size of the heatsinks required for VRMs 1 and 2 and the VRAM modules, and the thickness of the thermal tape: when I was planning for the full-cover block it was 1.0mm for VRAM and 0.5mm for the VRMs. Can I now get 0.5mm or 1mm for everything? Which would be better? And of course, how to secure the heatsinks to the VRMs.
> 
> If anyone with experience in this could help, it would be greatly appreciated


Yeah, I was really disappointed with mine too. To be fair, it's one hell of an air-cooled card, but watercooling, even with a uni block, is going to be problematic.
Because MSI added an extra VRM1 phase, it wasn't possible to lay out all the MOSFETs in a single row like on the reference card; instead, MSI laid them out in a double row. The VRM2 is the same, albeit with one MOSFET missing compared to the reference.

Uni blocks are also not ideal because the VRM cooling on the card is handled by a heatsink that is soldered to the heatpipes, this is excellent for an aircooling solution (efficiency wise) but it won't help you for WC as it cannot be used independently of the main HS. My GELID enhancement kit heatsink is also useless as half the MOSFETS won't be getting cooled due to the double row layout.

The only real solutions at this stage are to either petition EK for a limited waterblock run, sell the card and get a guaranteed reference board, or send the card to Alphacool for them to manufacture a custom heatsink for uni-block cooling.
http://www.alphacool.com/temp/Alphacool%20Press%20Release.pdf
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Have you checked out the Rev 2 version of the EK block?
> 
> They made it for the MSI gaming and Giga WF3 cards


I checked too; EK's site says it's a no-go. The card is basically an MSI 290X Gaming with a semi-custom PCB layout (i.e. the one featured on Coolingconfigurator); however, Cephelix and I seem to have scored a 290 version with the 290X PCB.
Basically it sucks because it's not a high-profile enough custom PCB like the Lightning or DirectCU to warrant a special waterblock run, but it's custom enough to make it impossible to use waterblocks for the reference board.


----------



## cephelix

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Have you checked out the Rev 2 version of the EK block?
> 
> They made it for the MSI gaming and Giga WF3 cards


Checked it out and no, it's not the same... from what I could see without removing the cooler, the capacitor next to the 8-pin connector is out of place.
Quote:


> Originally Posted by *Dasboogieman*
> 
> Yeah, I was really disappointed with mine too. To be fair, it's one hell of an air-cooled card, but watercooling, even with a uni block, is going to be problematic.
> Because MSI added an extra VRM1 phase, it wasn't possible to lay out all the MOSFETs in a single row like on the reference card; instead, MSI laid them out in a double row. The VRM2 is the same, albeit with one MOSFET missing compared to the reference.
> 
> The VRM cooling on the card is handled by a heatsink that is soldered to the heatpipes, this is excellent for an aircooling solution but it won't help you for WC. My GELID enhancement kit heatsink is also useless as half the MOSFETS won't be getting cooled.
> 
> The only real solutions at this stage are to either: petition to EK to do a limited waterblock run, sell the card and get a guaranteed reference board or send the card to GELID for them to manufacture a custom Heatsink for Uniblock cooling.


Damn... that's a lot of work either way. No way to use the heatplate with a universal block, eh? At any rate, is it possible to at least change the thermal tape?
Seems like such a waste to run a 360mm + 240mm rad for just the CPU... and I already have the parts


----------



## Dasboogieman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Have you checked out the Rev 2 version of the EK block?
> 
> They made it for the MSI gaming and Giga WF3 cards


Quote:


> Originally Posted by *cephelix*
> 
> Checked it out and no, it's not the same... from what I could see without removing the cooler, the capacitor next to the 8-pin connector is out of place.
> Damn... that's a lot of work either way. No way to use the heatplate with a universal block, eh? At any rate, is it possible to at least change the thermal tape?
> Seems like such a waste to run a 360mm + 240mm rad for just the CPU... and I already have the parts


What I wouldn't do with an aluminium CNC milling machine. A custom heatsink for the card would've been trivial to do. Sigh.
Changing the thermal pads should help, though you might void the warranty: mine has a warranty sticker on one of the core heatsink screws (it looks really easy to remove in one piece with a bit of heat). That said, the air cooling for the VRMs is really good; you'll be limited by noise and core heat before the VRMs become a concern on this card. It's the total opposite of the Tri-X, which had crazy core thermal headroom but next to no VRM headroom.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dasboogieman*
> 
> I checked too; EK's site says it's a no-go. The card is basically an MSI 290X Gaming with a semi-custom PCB layout (i.e. the one featured on Coolingconfigurator); however, Cephelix and I seem to have scored a 290 version with the 290X PCB.
> Basically it sucks because it's not a high-profile enough custom PCB like the Lightning or DirectCU to warrant a special waterblock run, but it's custom enough to make it impossible to use waterblocks for the reference board.


Quote:


> Originally Posted by *cephelix*
> 
> Checked it out and no,it's not the same....from what i could see without removing the cooler, the capacitor next to the 8-pin connector is out of place.


Sorry to hear that guys, seems MSI must have changed the design a 3rd time huh?


----------



## Dasboogieman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Sorry to hear that guys, seems MSI must have changed the design a 3rd time huh?


Yeah, don't get me wrong, the change was for the better IF the card was only ever intended to be air cooled. Its cooling and power design is much better balanced than the competition's: the gold SSC inductors are practically immune to coil whine, and the VRM heatsink is directly hooked up to the heatpipes, which means excellent cooling even at low fan RPMs. However, a two-slot, standard-length card is still limited by physics; it still succumbs to Hawaii's heat output when heavily overvolted (the VRM temps stay cool, but the core temps spike rapidly)


----------



## cephelix

Quote:


> Originally Posted by *Dasboogieman*
> 
> What I wouldn't do with an aluminium CNC milling machine. A custom heatsink for the card would've been trivial to do. Sigh.
> Changing the thermal pads should help, though you might void the warranty: mine has a warranty sticker on one of the core heatsink screws (it looks really easy to remove in one piece with a bit of heat). That said, the air cooling for the VRMs is really good; you'll be limited by noise and core heat before the VRMs become a concern on this card. It's the total opposite of the Tri-X, which had crazy core thermal headroom but next to no VRM headroom.


Thermally limited right now... at 1100MHz core, I'm already hitting 83deg on the core and 73deg on VRM1. I have fine-tipped tweezers that should be able to remove the sticker. If you ever do get hold of a CNC milling machine, remember me..... cos I'll want one of those custom heatsinks.. lol
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Sorry to hear that guys, seems MSI must have changed the design a 3rd time huh?


yup....just my luck....or rather, lack thereof....


----------



## passey

Just swapped my GTX 680 for a Sapphire Radeon R9 290 Tri-X, brand new for £199.99 from OCUK. Can't wait to get testing it out now.


----------



## Sgt Bilko

Quote:


> Originally Posted by *passey*
> 
> Just swapped my GTX 680 for a Sapphire Radeon R9 290 Tri-X, brand new for £199.99 from OCUK. Can't wait to get testing it out now.


I dare say Mantle will be awesome for you









Curious to see how the 6300 goes with a 290 actually, good luck


----------



## LtMatt

Welcome to Mantle. The word awesome does not do it justice.


----------



## Sgt Bilko

Quote:


> Originally Posted by *LtMatt*
> 
> Welcome to Mantle. The word awesome does not do it justice.


i knew if i said it loud enough you'd hear it


----------



## Sgt Bilko

Good news for the Vapor-X and Matrix crowd:
Quote:


> ASUS ROG Matrix R9 290X FC EK-FC R9-290X Matrix July 1st
> Sapphire Radeon R9 290X Vapor-X FC EK-FC R9-290X VPX H1 of August


EK is planning on Fullcover blocks










Source: http://www.ekwb.com/news/500/19/New-water-cooling-gear-in-the-works-July-2014/


----------



## ebhsimon

If those water blocks work for the Vapor X 290 then I think I might start watercooling...


----------



## Durvelle27

Anybody running 3x 290s. If so does it benefit running PCIe 3.0


----------



## ZealotKi11er

Quote:


> Originally Posted by *Durvelle27*
> 
> Anybody running 3x 290s. If so does it benefit running PCIe 3.0


Maybe if you are running 4K. I have not seen any recent PCIe tests done with high-resolution monitors.


----------



## Durvelle27

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Maybe if you are running 4K. I have not seen any recent PCIe tests done with high-resolution monitors.


Yes 4K is the plan


----------



## Sgt Bilko

Quote:


> Originally Posted by *ebhsimon*
> 
> If those water blocks work for the Vapor X 290 then I think I might start watercooling...


They will fit; the 290 and 290X are the same PCB, just with shaders disabled.
Quote:


> Originally Posted by *Durvelle27*
> 
> Anybody running 3x 290s. If so does it benefit running PCIe 3.0


Quote:


> Originally Posted by *ZealotKi11er*
> 
> Maybe if you are running 4K. I have not seen any recent PCIe tests done with high-resolution monitors.


@DeadlyDNA is running 4K Eyefinity atm with 4 x R9 290X on an AMD PCIe 2.0 rig and it's working fine for him.

That said, if you want 4K Eyefinity at 60Hz then you need PCIe 3.0, but for a single 4K panel at 60Hz, PCIe 2.0 is perfectly fine.
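For a rough sense of why the PCIe generation matters here: Hawaii's bridgeless (XDMA) CrossFire moves finished frames over the PCIe bus, so frame traffic scales with resolution and refresh rate. A quick back-of-envelope sketch (the bandwidth figures are approximate usable per-direction rates, not spec maximums, and real traffic adds overheads):

```python
# Back-of-envelope: XDMA CrossFire on Hawaii sends composited frames over PCIe,
# so per-second frame traffic is roughly width * height * bytes/pixel * refresh.
BYTES_PER_PIXEL = 4  # 32-bit colour

def frame_traffic_gbs(width, height, hz):
    """Raw framebuffer traffic in GB/s (decimal GB), ignoring overheads."""
    return width * height * BYTES_PER_PIXEL * hz / 1e9

single_4k = frame_traffic_gbs(3840, 2160, 60)         # one 4K panel, ~2 GB/s
eyefinity_4k = 3 * frame_traffic_gbs(3840, 2160, 60)  # three 4K panels, ~6 GB/s

# Approximate usable bandwidth per direction for an x16 link:
pcie2_x16 = 8.0    # GB/s
pcie3_x16 = 15.75  # GB/s

print(f"single 4K@60: {single_4k:.1f} GB/s vs PCIe 2.0 x16 ~{pcie2_x16} GB/s")
print(f"3x 4K@60:     {eyefinity_4k:.1f} GB/s vs PCIe 3.0 x16 ~{pcie3_x16} GB/s")
```

A single 4K@60 panel fits comfortably inside a PCIe 2.0 x16 link, while triple-4K Eyefinity eats most of it, which matches the advice above.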


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> They will fit; the 290 and 290X are the same PCB, just with shaders disabled.
> 
> @DeadlyDNA is running 4K Eyefinity atm with 4 x R9 290X on an AMD PCIe 2.0 rig and it's working fine for him.
> 
> That said, if you want 4K Eyefinity at 60Hz then you need PCIe 3.0, but for a single 4K panel at 60Hz, PCIe 2.0 is perfectly fine.


I ask because I'm trying to decide between an 8350 and a 4820K


----------



## Dasboogieman

OK,
somebody who has successfully compiled the fixed MemtestCL app has finally come forward.
I tested his .exe and the program seems to have lost some of its sensitivity. I can't seem to induce VRAM errors before I blackscreen, so hopefully with more data we can verify the accuracy of the program. It may be because I haven't done enough iterations; iHaque himself did say that the innate ECC correction on GDDR5 may take hundreds of iterations to weed out any errors.

If someone wants to volunteer for science: run your desktop in iGPU mode, purposely dial in an unstable system RAM overclock, then boot and run the program, making sure it's computing on the iGPU. If it throws up errors as usual, then the program is valid.

Grab a copy from here:
https://dl.dropboxusercontent.com/u/105687884/memtestCL.zip
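One way to see why "hundreds of iterations" is plausible: if the error correction masks a marginal fault so that each test pass only catches it with some small probability, the chance of seeing it at least once grows with the number of passes. A quick sketch (the per-pass probability is an illustrative guess, not a measured figure):

```python
# Probability of catching a marginal VRAM error at least once in n test passes,
# assuming each pass independently detects it with probability p.
# The p value here is purely illustrative -- real detection rates depend on the
# fault and on how the GDDR5 EDC/retry mechanism masks it.
def detection_chance(p, n):
    return 1.0 - (1.0 - p) ** n

# If a pass only catches the fault 1% of the time, ten passes are nearly
# useless, while a few hundred passes make detection very likely.
for n in (10, 100, 500):
    print(f"{n:4d} passes: {detection_chance(0.01, n):.1%}")
```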


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> I ask because I'm trying to decide between an 8350 and a 4820K


Well it depends on how much cash you wanna spend and if you are planning to upgrade again soon after.

Long term the 4820k would be better but the 8350 with a decent overclock is a nice CPU and at 4k it's all dictated by your GPU(s) anyway.


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well it depends on how much cash you wanna spend and if you are planning to upgrade again soon after.
> 
> Long term the 4820k would be better but the 8350 with a decent overclock is a nice CPU and at 4k it's all dictated by your GPU(s) anyway.


I don't know if you remember, but I used to have an 8350 that did 5.4GHz. The CPU wasn't what concerned me; it was the PCIe lanes and SATA ports


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> I don't know if you remember, but I used to have an 8350 that did 5.4GHz. The CPU wasn't what concerned me; it was the PCIe lanes and SATA ports


Well, the 990FX chipset has enough lanes for quadfire at x8/x8/x8/x8, so the lanes are there for it. And I do remember now; memory went a bit fuzzy there for a minute









I agree partly on the Sata ports but that's really a Mobo thing more than anything else.


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well, the 990FX chipset has enough lanes for quadfire at x8/x8/x8/x8, so the lanes are there for it. And I do remember now; memory went a bit fuzzy there for a minute
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I agree partly on the Sata ports but that's really a Mobo thing more than anything else.


Yeah, I have about 12+ HDDs I would like to connect. And yeah, I was the one with the M5A97 EVO and a daily 5.1GHz 8350


----------



## Sgt Bilko

Quote:


> Originally Posted by *Durvelle27*
> 
> Yeah, I have about 12+ HDDs I would like to connect. And yeah, I was the one with the M5A97 EVO and a daily 5.1GHz 8350


Yeah, the 970 chipset is weak for multi-GPU configs. As for the HDDs, that's a lot of storage, man..... can't you just get a couple of 4TB drives instead?


----------



## Durvelle27

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah, the 970 chipset is weak for multi-GPU configs. As for the HDDs, that's a lot of storage, man..... can't you just get a couple of 4TB drives instead?


Yea they are. I will eventually but for now this is what i have lol


----------



## bond32

Interesting... Since I have one 290X and two 290s, I noticed that after setting clocks in AB (with one of the 290s in the primary slot), the clocks of the 290X would not change. Seems GPU Tweak sets the clocks the same across all three cards.


----------



## Durvelle27

Quote:


> Originally Posted by *bond32*
> 
> Interesting... Since I have one 290x and 2 290's, I noticed after setting clocks in AB (with one of the 290's in the primary slot), the clocks of the 290x would not change. Seems GPU Tweak sets the clocks the same across all three cards.


Do you have sync clocks checked in settings?


----------



## cephelix

Just played Sleeping Dogs at 1920x1200... just picked the High preset..... buttery smooth....


----------



## aaroc

Quote:


> Originally Posted by *Durvelle27*
> 
> Anybody running 3x 290s. If so does it benefit running PCIe 3.0


I have 3x R9 290 on a CHVFZ + FX-8350, 3x 2560x1440 monitors, all stock, no OC. No problems with the setup. Still not under water. My build log is in my signature (here). Going forward, slow but steady.







If you need something specific just ask.


----------



## Blaise170

My new R9 290 comes back from XFX Monday. Very excite.


----------



## KeepWalkinG

Is the fan on the Vapor-X 290 the same as on the Tri-X?
And does it maybe have the rattling noise problem?


----------



## PureBlackFire

Quote:


> Originally Posted by *KeepWalkinG*
> 
> Is the fan on the Vapor-X 290 the same as on the Tri-X?
> And does it maybe have the rattling noise problem?


The fans themselves are the same; Gigabyte, PowerColor and VisionTek also use the exact same fans on their non-reference cards. The heatsink is thicker. My card has/had a rattling noise issue too. I haven't heard it the past two days, but for a while it would make a hideous sound (which I initially thought was coil whine) whenever the fan was running at 52-56%. Every time I touched the end of the card's shroud with my finger, the sound would stop. Though it hasn't made noise in two days, I'm sure the fan on the end is rattling, causing the noise.


----------



## passey

Is Mantle out now then?

Not looked into all the ATI stuff for ages.


----------



## BradleyW

Quote:


> Originally Posted by *passey*
> 
> Is mantle out now then?
> 
> not looked into all the ati stuff for ages.


Yes, on some games. In the near future we will see more Mantle titles, such as Sniper Elite III, Dragon Age, Plants vs Zombies and Battlefield Hardline, plus many more.


----------



## Blaise170

Quote:


> Originally Posted by *BradleyW*
> 
> Yes, on some games. In the near future we will see more Mantle titles, such as Sniper Elite III, Dragon Age, Plants vs Zombies and Battlefield Hardline, plus many more.


Plants vs Zombies? Why would you even need Mantle for that.


----------



## BradleyW

Quote:


> Originally Posted by *Blaise170*
> 
> Plants vs Zombies? Why would you even need Mantle for that.


CPU overhead I guess? Then again, AMD once made a CFX profile for spider solitaire and minesweeper.


----------



## mikeo01

Hey all, looking for some opinions. I currently have a PowerColor TurboDuo 280X installed, on my Cooler Master V450S (450W) PSU. At full load (+ full CPU load) it's 380W at the wall. I used Prime95 large FFTs and MSI Kombustor to get them to full load.

2x 0.15A fans and a 0.25A fan + a H75. 2x SATAs and single SSD. Working tight here.

My question to you guys is whether or not buying a second hand reference 290 is worth it?

Worth it, as in, second hand + power consumption. My plan was to maybe slap a NZXT G10 and H55 onto it, or the Arctic Accelero Twin Turbo. All under a Corsair 250D mini ITX chassis. So I am relying on that side vent 100%.

Max my 280X has seen is 80c, and that's pretty normal for it to hit that as I leave the fan profile to auto (silence).

Really, how bad is the stock cooler for noise? I can't stand my stock reference HD 7870's fan at 40%; how much worse is the stock 290's fan at this speed?

My PSU provides 36A on single +12V rail, would opting for the G10 - H55 be worth it? (lower temps, lower power consumption).

Thanks all
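A quick sanity check on the numbers above (the board-power figures below are rough typical-gaming values and the efficiency is an assumed figure, not measurements of any particular unit):

```python
# Rough PSU headroom check for swapping a 280X for a reference 290.
# All card power figures are approximate typical-gaming values; real draw
# varies with load, card sample, and overclock.
rail_12v_amps = 36
rail_12v_watts = rail_12v_amps * 12     # 432 W available on the +12V rail

wall_draw = 380      # measured at the wall with the 280X installed
efficiency = 0.90    # assumed PSU efficiency at this load
dc_draw = wall_draw * efficiency        # ~342 W actually delivered by the PSU

r9_280x_board_power = 250   # approximate
r9_290_board_power = 275    # approximate

estimated_dc_with_290 = dc_draw - r9_280x_board_power + r9_290_board_power
print(f"+12V capacity: {rail_12v_watts} W")
print(f"Estimated DC load with a 290: ~{estimated_dc_with_290:.0f} W")
```

On these assumptions the swap stays under the rail's capacity, but with little margin for a heavy overvolt, which fits the "working tight here" description.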


----------



## Blaise170

Quote:


> Originally Posted by *mikeo01*
> 
> Hey all, looking for some opinions. I currently have a PowerColor TurboDuo 280X installed, on my Cooler Master V450S (450W) PSU. At full load (+ full CPU load) it's 380W at the wall. I used Prime95 large FFTs and MSI Kombustor to get them to full load.
> 
> 2x 0.15A fans and a 0.25A fan + a H75. 2x SATAs and single SSD. Working tight here.
> 
> My question to you guys is whether or not buying a second hand reference 290 is worth it?
> 
> Worth it, as in, second hand + power consumption. My plan was to maybe slap a NZXT G10 and H55 onto it, or the Arctic Accelero Twin Turbo. All under a Corsair 250D mini ITX chassis. So I am relying on that side vent 100%.
> 
> Max my 280X has seen is 80c, and that's pretty normal for it to hit that as I leave the fan profile to auto (silence).
> 
> Really, how bad is the stock cooler for noise? I can't stand my stock reference HD 7870's fan at 40%; how much worse is the stock 290's fan at this speed?
> 
> My PSU provides 36A on single +12V rail, would opting for the G10 - H55 be worth it? (lower temps, lower power consumption).
> 
> Thanks all


Buy an XFX 290. XFX will send you a replacement cooler if you ask them, and they will even install watercooling blocks for you (not that you'd need them to).


----------



## Blaise170

Quote:


> Originally Posted by *Beezleybuzz*
> 
> Dude, not trying to be insulting, but really? You're a moderator at overclock.net and you need to do 'more reading' on the 290? Do you pay attention to any of the multitude of benchmarks and reviews that have been done of the past 7 months?


Do you read the benchmarks for every single piece of hardware released? I certainly don't, and being a moderator has no correlation with that.


----------



## ZealotKi11er

Guys, I am having a big problem right now. I had the same problem when I first got my 290X. All I did was move the MSI AB voltage slider to undervolt the card. I clicked apply and got a black screen. I restarted the PC and it's stuck at a black screen. I tried safe mode and uninstalled both the drivers and MSI AB, and it booted fine. I installed the drivers again and, after being asked to restart, I got a black screen again. In this state the card's fan spins faster than at idle. Last time I could not solve this problem until I did a fresh install of Windows. Has anyone experienced this problem?


----------



## disintegratorx

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Guys, I am having a big problem right now. I had the same problem when I first got my 290X. All I did was move the MSI AB voltage slider to undervolt the card. I clicked apply and got a black screen. I restarted the PC and it's stuck at a black screen. I tried safe mode and uninstalled both the drivers and MSI AB, and it booted fine. I installed the drivers again and, after being asked to restart, I got a black screen again. In this state the card's fan spins faster than at idle. Last time I could not solve this problem until I did a fresh install of Windows. Has anyone experienced this problem?


Yeah, you need to apply more voltage then, ZealotKi11er. What's happening is your card is not taking in enough power, so it's crashing your system. PowerPlay can do that too, by the way.


----------



## Durvelle27

Quote:


> Originally Posted by *aaroc*
> 
> I have 3x R9 290 on a CHVFZ + FX 8350, 3x 2560x1440 monitors, all stock no OC. No problems with the setup. Still not under water. My build log is on my signature (here). Going forward slow but steady.
> 
> 
> 
> 
> 
> 
> 
> If you need something specific just ask.


What games do you play


----------



## aaroc

Quote:


> Originally Posted by *Durvelle27*
> 
> What games do you play


BF3, Tomb Raider, BioShock Infinite, F1 2013/2012, Dirt 3, Dirt Showdown, Dota 2, Far Cry 3, StarCraft 2, Sleeping Dogs, etc...


----------



## giygas

I just got a Sapphire Vapor-X R9 290 and it's idling at 52C. That seems absurdly high to me... does anyone have any idea what could be causing it? As I understand it, Vapor-X models are supposed to be cooler, not this much warmer... I thought it might be because I have my monitor at 144Hz, but putting it back to 60Hz didn't change the temperature at all.


----------



## ZealotKi11er

Quote:


> Originally Posted by *disintegratorx*
> 
> Yeah, you need to apply more voltage then, ZealotKi11er. What's happening is your card is not taking in enough power, so it's crashing your system. PowerPlay can do that too, by the way.


How do I add more voltage? MSI AB does not work in safe mode.


----------



## skruppe

Hey, this looks like a cool club, and you're letting a guy like me join you?

GPU1: http://www.techpowerup.com/gpuz/4vdfh/
GPU1: ASUS 290X - Battlefield 4 Edition, replaced the stock cooler and got myself a nice EKWC block (incl. backplate)
GPU2: http://www.techpowerup.com/gpuz/3d74s/
GPU2: PowerColor PCS+ 290X, cleaned up, and the aftermarket fans were replaced by an EKWC block (incl. backplate)

And just my luck, both cards have Elpida... I guess my only alternative is to buy a third random one and hope for the best? I suppose if that were to fail as well, I would have no other choice than to go QuadFire, right?


----------



## DeadlyDNA

Quote:


> Originally Posted by *skruppe*
> 
> Hey this looks like a cool club, and you're letting a guy like me join you?
> 
> GPU1: http://www.techpowerup.com/gpuz/4vdfh/
> GPU1: ASUS, 290X - Battlefield 4 Edition, replaced stock and got myself a nice EKWC block.(incl. backplate)
> GPU2: http://www.techpowerup.com/gpuz/3d74s/
> GPU2: Powercolor PCS+ 290X got cleaned up and aftermarket fans were replaced by EKWC block (incl. backplate)
> 
> And just my luck, both cards have Elpida... I guess my only alternative is to buy a third random one and hope for the best? I suppose IF that were to fail as well I would have no other choice than going QuadFire right?


Welcome to da club! Hopefully it serves you well!


----------



## DeadlyDNA

Quote:


> Originally Posted by *ZealotKi11er*
> 
> How do I add more voltage? MSI AB does not work in safe mode.


I think MSI AB is saving your old settings. Try going to safe mode and removing MSI AB. Then go back to Windows, reinstall MSI AB, and before restarting, reset all cards. If you have any saved profiles, delete them or override them with default settings. Also, uncheck the startup options in MSI AB. See if that helps.


----------



## disintegratorx

Quote:


> Originally Posted by *ZealotKi11er*
> 
> How do I add more voltage? MSI AB does not work in safe mode.


Well then you might want to try uninstalling your graphics card driver and AB, in your device manager, attempting a boot into normal mode and starting over. This should be able to help you: http://windows.microsoft.com/en-us/windows/diagnostic-tools-use-safe-mode#1TC=windows-7

Good luck, I know from experience that these video cards and third party utilities can be troublesome to get the desired effects from.


----------



## ZealotKi11er

I was able to get it working by switching Bios.


----------



## disintegratorx

Ahh.. Good to know.


----------



## Forceman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I was able to get it working by switching Bios.


I think when you changed the BIOS it redetected the card and the AB settings were reset. You should be able to remove/delete AB and then you could switch back to the other BIOS, if you wanted to.


----------



## maynard14

Hi guys, is it safe to leave my 290 (unlocked to a 290X) overclocked with these settings 24/7?

MSI Afterburner: +100mV
GPU core: 1120
Memory: 1450

VRM1 max temp: 77C
VRM2 max temp: 61C

GPU core max load temp: 65C

Using an NZXT G10 and VRM heatsinks from GELID, and lastly an Antec Kühler 920 AIO


----------



## heroxoot

Quote:


> Originally Posted by *maynard14*
> 
> Hi guys, is it safe to leave my 290 (unlocked to a 290X) overclocked with these settings 24/7?
> 
> MSI Afterburner: +100mV
> GPU core: 1120
> Memory: 1450
> 
> VRM1 max temp: 77C
> VRM2 max temp: 61C
> 
> GPU core max load temp: 65C
> 
> Using an NZXT G10 and VRM heatsinks from GELID, and lastly an Antec Kühler 920 AIO


The only way to know is to try. With liquid cooling it should be fine tho.


----------



## pdasterly

Just watch the vrm1 temps


----------



## maynard14

What's a safe temp for VRM1?


----------



## chiknnwatrmln

Under 90°C. They are rated to 120°C I think, but they can start to get unstable above 80°C. For everyday use 77°C is fine; on hotter days, if it goes over 85°C I would personally dial back the voltage a bit.
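Those rules of thumb can be written out explicitly. The thresholds below are the guidelines from this post, not manufacturer specs:

```python
# Encode the VRM1 temperature rules of thumb from the post above.
# Thresholds are forum guidelines, not datasheet limits.
def vrm_advice(temp_c):
    if temp_c > 90:
        return "back off voltage/clock now"
    if temp_c > 85:
        return "dial back voltage a bit"
    if temp_c > 80:
        return "watch for instability"
    return "fine for everyday use"

print(vrm_advice(77))  # the 77C figure asked about above
```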


----------



## uk3k

pls add me:

GPU-Z
uk3k - 2x Sapphire R9 290 Tri-X OC - Water/Arctic Cooling Hybrid II

thx


----------



## pdasterly

Quote:


> Originally Posted by *uk3k*
> 
> pls add me:
> 
> GPU-Z
> uk3k - 2x Sapphire R9 290 Tri-X OC - Water/Arctic Cooling Hybrid II
> 
> thx


How are your VRM1 temps when OC'd?
I've done just about everything to tame my VRM1 temps with no success. Anything over +20mV and the VRM goes over 100°C.


----------



## uk3k

About 107°C after running the FurMark burn-in for a while @ 1120MHz/+44mV.

Should be fine, because as far as I know those VRMs are designed for temperatures up to 130°C.

Using less OC @ stock voltage, my VRMs always stay below 90°C.

Actually, I'm still interested in a simple-to-mount cooler for the VRMs but haven't found any yet :-(


----------



## HoneyBadger84

Quote:


> Originally Posted by *uk3k*
> 
> About 107°C after running the FurMark burn-in for a while @ 1120MHz/+44mV.
> 
> Should be fine, because as far as I know those VRMs are designed for temperatures up to 130°C.
> 
> Using less OC @ stock voltage, my VRMs always stay below 90°C.
> 
> Actually, I'm still interested in a simple-to-mount cooler for the VRMs but haven't found any yet :-(


I would react very badly if I ever saw those temps myself. I'm a bit of an air cool to the max type though. Lol


----------



## Dasboogieman

Quote:


> Originally Posted by *giygas*
> 
> I just got a Sapphire Vapor-X R9 290 and it's idling at 52C. That seems absurdly high to me... does anyone have any idea what could be causing it? As I understand it, Vapor-X models are supposed to be cooler, not this much warmer... I thought it might be because I have my monitor at 144Hz, but putting it back to 60Hz didn't change the temperature at all.


That's expected. The Vapor-X is designed to turn off its fans when idling or at low load. The normal Tri-X just runs its fans at 20% in the same scenario, which is why it idles lower. You can override this with Afterburner or TriXX.


----------



## mAs81

Quote:


> Originally Posted by *Dasboogieman*
> 
> That's expected. The Vapor-X is designed to turn off its fans when idling or at low load. The normal Tri-X just runs its fans at 20% in the same scenario, which is why it idles lower. You can override this with Afterburner or TriXX.


Well, I also have that card and I have turned the "intelligent" fan control off, but I still get those temps at idle.
But to be fair, I have 2 monitors @ 60Hz, plus case airflow plays a big part, and since I have crammed it inside an Obsidian 350D, high idle temps are to be expected. Not to mention the ambient temps are high now (summer in Greece)

I got mostly the same idle temps with my MSI 280X, so I wouldn't worry too much..


----------



## HoneyBadger84

Quote:


> Originally Posted by *mAs81*
> 
> Well, I also have that card and I have turned the "intelligent" fan control off, but I still get those temps at idle.
> To be fair, though, I have two monitors @60Hz, plus case airflow plays a big part, and since I've crammed it inside an Obsidian 350D, high idle temps are to be expected. Not to mention the ambient temps are high now (summer in Greece).
> 
> I got mostly the same idle temps with my MSI 280X, so I wouldn't worry too much.


Indeed, two monitors means your VRAM runs at full speed 24/7, so the card runs about 3-10C warmer than normal. That's part of why I recently stopped running two monitors for 24/7 use; running them made my primary R9 290X idle at 40-44C instead of 32-38C (with a 40-50% minimum fan speed programmed into Afterburner, and those temps are on the Gigabyte WindForce cooler).


----------



## disintegratorx

Well, it turns out I had a lower voltage than I needed, so I switched back to MSI AB. I didn't know how to add extra voltage to the GPU core right away, so I searched around and figured it out. Why don't they just expose this setting in the regular program? lol It took me like a week to figure out!


----------



## Decoman

Please forgive me for not having read all 8,000+ posts in this thread.
I recently ordered a Sapphire 290X with the AMD stock cooler/fan, to be used in a future water-cooling setup; I will be buying parts over the coming months to set it all up:

*Sapphire Radeon R9 290X 4GB DDR5 2xDVI HDMI DisplayPort PCIe BF4 - (21226-00-53G) (1000 MHz core clock)*

I have a few questions about the card and hope someone can give me some feedback. I normally ask around before I buy, but I had been looking for a reference card, saw one at a discount, and decided to buy it. Hopefully it will not be impossible to buy another reference 290X later on; at worst I guess I could try to buy a used one.

Q1: Is there anything I should know about the 21226-00-53G model? The 21226-00-40G seems similar to the version I have.
Q2: How much of an issue is the "black screen" problem all these months after the product launch in 2013?
Q3: What is the safe maximum voltage for a 290X?
Q4: Is there any point in flashing a different BIOS or modding the BIOS if I only want a modest overclock (20-25%)?

Edit: Someone at "Sapphire Technology Club" on Facebook claims that "(...) the 53G is the Battlefield 4 version and it comes with a copy of this game. The 40G version doesn't include it." I guess that explains the difference between 21226-00-53G and 21226-00-40G.

Edit2: The following post in this thread seems to sum up a lot of useful info about the 290X: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781

Edit3: I should mention that some people seem to see a correlation between getting a "black screen" and letting overclocking software load in a random order during Windows startup (as opposed to having it load at the end of the startup process).


----------



## Arizonian

Quote:


> Originally Posted by *skruppe*
> 
> Hey this looks like a cool club, and you're letting a guy like me join you?
> 
> GPU1: http://www.techpowerup.com/gpuz/4vdfh/
> GPU1: ASUS, 290X - Battlefield 4 Edition, replaced stock and got myself a nice EKWC block.(incl. backplate)
> GPU2: http://www.techpowerup.com/gpuz/3d74s/
> GPU2: Powercolor PCS+ 290X got cleaned up and aftermarket fans were replaced by EKWC block (incl. backplate)
> 
> And just my luck, both cards have Elpida... I guess my only alternative is to buy a third random one and hope for the best? I suppose IF that were to fail as well I would have no other choice than going QuadFire right?


Congrats - added

Quote:


> Originally Posted by *uk3k*
> 
> pls add me:
> 
> GPU-Z
> uk3k - 2x Sapphire R9 290 Tri-X OC - Water/Arctic Cooling Hybrid II
> 
> thx


Congrats - added


----------



## Dasboogieman

Quote:


> Originally Posted by *Decoman*
> 
> Please forgive me for not having read all 8,000+ posts in this thread.
> I recently ordered a Sapphire 290X with the AMD stock cooler/fan, to be used in a future water-cooling setup; I will be buying parts over the coming months to set it all up:
> 
> *Sapphire Radeon R9 290X 4GB DDR5 2xDVI HDMI DisplayPort PCIe BF4 - (21226-00-53G) (1000 MHz core clock)*
> 
> I have a few questions about the card and hope someone can give me some feedback. I normally ask around before I buy, but I had been looking for a reference card, saw one at a discount, and decided to buy it. Hopefully it will not be impossible to buy another reference 290X later on; at worst I guess I could try to buy a used one.
> 
> Q1: Is there anything I should know about the 21226-00-53G model? The 21226-00-40G seems similar to the version I have.
> Q2: How much of an issue is the "black screen" problem all these months after the product launch in 2013?
> Q3: What is the safe maximum voltage for a 290X?
> Q4: Is there any point in flashing a different BIOS or modding the BIOS if I only want a modest overclock (20-25%)?


1. No clue about the versions. I do know that for the Tri-X there was an earlier and a later model; the biggest difference was that the late model couldn't use the earlier model's BIOS due to changes in VRAM timings.

2. The black screen issue is most commonly associated with bad memory. The prevailing theory is that earlier batches of cards had derped Elpida timings, resulting in black screens at stock. It's pretty rare nowadays (though I daresay it's more common with Elpida-equipped cards than Hynix); these days you tend to see black screens mostly from unstable VRAM overclocks.

3. There is no definite answer, but I can share a rough guideline (very rough; if anyone can help me clean it up, it would be much appreciated). It's not really voltage that kills components, it's the power draw and heat; voltage alone is just an electrical potential.

The VRM MOSFETs are rated for up to 50A on the large ones and 20A on the smaller auxiliary ones. As a rough estimate (I have limited knowledge of the intricacies of VRM design): with 5 phases, at a Vcore of 1.25V (stock for the worst-binned GPUs) and about 1.18V after vdroop, 50 x 1.18 x 5 = 295W of designed VRM power. The TDP is roughly 250W, with provision for roughly 375W (+50% power limit). To hit 375W you would need roughly 1.5V (very rough, mind you; power consumption scales quadratically with voltage and linearly with clock speed). From this, *I surmise that the "safe" 24/7 voltage on air cooling leans towards the 1.35V range (about +125mV on a decently binned GPU)*, where you are still roughly inside the AMD-mandated maximum PCB power draw.
However, because the 290X VRM assembly is designed for optimal operation around the 250W zone, VRM temperatures spike rapidly at loads above this. Most would not attempt large overvoltages without water cooling: cooler temperatures reduce the power draw of the GPU at a given clock speed, which allows higher voltages, and hence clock speeds, before tripping past the 375W danger zone. Plus, water cooling also deals with the high heat density of the VRMs beautifully.

4. 20-25% is a pretty large overclock for Hawaii. Most GPUs (barring the worst-binned, sub-70% ASIC ones) seem to reach about 1080-1100MHz on stock voltage (around 1.212V; mine actually needs +38mV to do 1100MHz, but I think that's because mine started with a low stock voltage and not much headroom). 1150MHz requires at least +125mV (at least for my two cards, with 78.9 and 79.9 ASIC quality), and my 1200MHz clock is not stable even at a blistering +175mV (perhaps +200mV would do it). You may get better results under water, notably because your ASIC will draw less power overall and thus have more overvoltage headroom (before the VRMs run too far out of spec).
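The arithmetic in point 3 can be sketched as a quick back-of-the-envelope script. This is purely illustrative: the 50A-per-phase, 5-phase, and vdroop figures come from the post above (not from an AMD datasheet), and the quadratic-in-voltage, linear-in-clock scaling is the usual rough model rather than a measured curve:

```python
# Back-of-the-envelope Hawaii VRM/power estimate.
# Assumptions (from the post above, NOT from an AMD spec sheet):
#   - 5 core phases rated 50 A each, ~1.18 V at the FETs after vdroop
#   - power scales with V^2 and linearly with clock speed

def vrm_design_power(amps_per_phase=50, volts_after_droop=1.18, phases=5):
    """Rough upper bound on what the core VRM phases can deliver, in watts."""
    return amps_per_phase * volts_after_droop * phases

def scaled_power(p0, v0, f0, v, f):
    """Scale a known power figure quadratically in voltage, linearly in clock."""
    return p0 * (v / v0) ** 2 * (f / f0)

print(round(vrm_design_power()))  # 295 W, matching the 50 x 1.18 x 5 figure

# Stock ~250 W at ~1.212 V / 1000 MHz; estimate +125 mV at 1100 MHz:
est = scaled_power(250, 1.212, 1000, 1.337, 1100)
print(round(est))  # 335 W: over TDP but inside the ~375 W (+50% limit) zone
```

The same helper also shows why ~1.5V lands near the 375W ceiling in the post's estimate, which is where the "1.35V-ish on air" guideline comes from.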


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dasboogieman*
> 
> 1. No clue about the versions. I do know that for the Tri-X there was an earlier and a later model; the biggest difference was that the late model couldn't use the earlier model's BIOS due to changes in VRAM timings.
> 
> 2. The black screen issue is most commonly associated with bad memory. The prevailing theory is that earlier batches of cards had derped Elpida timings, resulting in black screens at stock. It's pretty rare nowadays (though I daresay it's more common with Elpida-equipped cards than Hynix); these days you tend to see black screens mostly from unstable VRAM overclocks.
> 
> 3. There is no definite answer, but I can share a rough guideline (very rough; if anyone can help me clean it up, it would be much appreciated). It's not really voltage that kills components, it's the power draw and heat; voltage alone is just an electrical potential.
> 
> The VRM MOSFETs are rated for up to 50A on the large ones and 20A on the smaller auxiliary ones. As a rough estimate (I have limited knowledge of the intricacies of VRM design): with 5 phases, at a Vcore of 1.25V (stock for the worst-binned GPUs) and about 1.18V after vdroop, 50 x 1.18 x 5 = 295W of designed VRM power. The TDP is roughly 250W, with provision for roughly 375W (+50% power limit). To hit 375W you would need roughly 1.5V (very rough, mind you; power consumption scales quadratically with voltage and linearly with clock speed). From this, *I surmise that the "safe" 24/7 voltage on air cooling leans towards the 1.35V range (about +125mV on a decently binned GPU)*, where you are still roughly inside the AMD-mandated maximum PCB power draw.
> However, because the 290X VRM assembly is designed for optimal operation around the 250W zone, VRM temperatures spike rapidly at loads above this. Most would not attempt large overvoltages without water cooling: cooler temperatures reduce the power draw of the GPU at a given clock speed, which allows higher voltages, and hence clock speeds, before tripping past the 375W danger zone. Plus, water cooling also deals with the high heat density of the VRMs beautifully.
> 
> 4. 20-25% is a pretty large overclock for Hawaii. Most GPUs (barring the worst-binned, sub-70% ASIC ones) seem to reach about 1080-1100MHz on stock voltage (around 1.212V; mine actually needs +38mV to do 1100MHz, but I think that's because mine started with a low stock voltage and not much headroom). 1150MHz requires at least +125mV (at least for my two cards, with 78.9 and 79.9 ASIC quality), and my 1200MHz clock is not stable even at a blistering +175mV (perhaps +200mV would do it). You may get better results under water, notably because your ASIC will draw less power overall and thus have more overvoltage headroom (before the VRMs run too far out of spec).


My 290X is 77.9% and does 1200MHz with +100mV under water. It did 1125MHz at stock voltage.


----------



## chiknnwatrmln

ASIC quality means nothing.

Having said that, Dasboogieman has offered some good info. Temps are also a huge factor in overclocking: for max stability the core should be under 70C or so, with the VRMs under 80C or so. The cooler the better, though.

Once you go WC, feel free to crank the voltage. I've been running 1200 core with +168mV daily since December and have seen barely any GPU degradation. I run triple monitors, so whenever I game my GPU is pinned at 100%.


----------



## piquadrat

Quote:


> Originally Posted by *Dasboogieman*
> 
> There is no definite answer, but I can share a rough guideline (very rough; if anyone can help me clean it up, it would be much appreciated). It's not really voltage that kills components, it's the power draw and heat; voltage alone is just an electrical potential.


If that were true, service employees all over the world wouldn't wear grounded straps on their wrists.


----------



## pdasterly

Playing Sniper 3. Not the best graphics, but definitely 290X-optimized.
AMD should add an on-screen marker to let the user know that CrossFire is working. NVIDIA has one for SLI; it doesn't have to be the big green bar, maybe a gold star or a cube with each corner representing a GPU.


----------



## BradleyW

Quote:


> Originally Posted by *pdasterly*
> 
> Playing Sniper 3. Not the best graphics, but definitely 290X-optimized.
> AMD should add an on-screen marker to let the user know that CrossFire is working. NVIDIA has one for SLI; it doesn't have to be the big green bar, maybe a gold star or a cube with each corner representing a GPU.


Yep, CFX is working perfectly for this game. I get exactly 2x scaling.


----------



## fateswarm

Is there a good guide on R9 290 overclocking? I have a Tri-X 290.


----------



## ZealotKi11er

Quote:


> Originally Posted by *fateswarm*
> 
> Is there a good guide on R9 290 overclocking? I have a Tri-X 290.


Not really. Don't use more than +100mV and keep the Core and VRM1 temps in check. You can expect an OC of 1100-1200MHz. The lower the temps, the higher the OC.


----------



## sugarhell

Quote:


> Originally Posted by *pdasterly*
> 
> Playing Sniper 3. Not the best graphics, but definitely 290X-optimized.
> AMD should add an on-screen marker to let the user know that CrossFire is working. NVIDIA has one for SLI; it doesn't have to be the big green bar, maybe a gold star or a cube with each corner representing a GPU.


They have one. Also, you could use an OSD?


----------



## pdasterly

How do I enable it? I run Fraps now.


----------



## fateswarm

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Not really. Don't use more than +100mV and keep the Core and VRM1 temps in check. You can expect an OC of 1100-1200MHz. The lower the temps, the higher the OC.


So above +0.1V the chip degrades?


----------



## sugarhell

It's on the CrossFire tab, IIRC, or you can find it from the tray icon; something like "show CrossFire icon when it's working".

@fate: Not really. One friend used 1.6V with an extreme water loop and it's still okay.


----------



## pdasterly

Sorry, I'm blind. Can't find it.


----------



## sugarhell

Quote:


> Originally Posted by *pdasterly*
> 
> Sorry, I'm blind. Can't find it.


http://forums.overclockers.co.uk/showthread.php?t=18544020


----------



## bond32

Even though it's kind of a pain, and I know lots of people don't like it, ASUS GPU Tweak has been working quite well for me. Since I only have one of the three cards on water right now, I have to deal with the absurd heat and fan noise. ULPS actually helps with this. I noticed that GPU Tweak disables ULPS while it runs, which is quite handy: when I want to bench/game/OC I leave it open, and when I just want to browse, all I need to do is close it; ULPS kicks back in and cuts the non-water-cooled cards off.

Also, since I have one 290X and two 290s, I can't seem to get AB to change clocks on all three. GPU Tweak doesn't seem to have any issues with it. And yes, I have the Sync setting checked.

Edit: What exactly does "Enable unified GPU usage monitoring" do?


----------



## pdasterly

Quote:


> Originally Posted by *bond32*
> 
> Even though it's kind of a pain, and I know lots of people don't like it, ASUS GPU Tweak has been working quite well for me. Since I only have one of the three cards on water right now, I have to deal with the absurd heat and fan noise. ULPS actually helps with this. I noticed that GPU Tweak disables ULPS while it runs, which is quite handy: when I want to bench/game/OC I leave it open, and when I just want to browse, all I need to do is close it; ULPS kicks back in and cuts the non-water-cooled cards off.
> 
> Also, since I have one 290X and two 290s, I can't seem to get AB to change clocks on all three. GPU Tweak doesn't seem to have any issues with it. And yes, I have the Sync setting checked.
> 
> Edit: What exactly does "Enable unified GPU usage monitoring" do?


I can sync GPUs using TriXX.


----------



## Fade2Black

I'm stuck between the 280X and the 290; as far as I can tell there is a minor difference between them. Is it worth the extra ££? I'll be buying an XFX or Gigabyte WindForce card and running three 1080p displays.


----------



## Forceman

Quote:


> Originally Posted by *Fade2Black*
> 
> I'm stuck between the 280X and the 290; as far as I can tell there is a minor difference between them. Is it worth the extra ££? I'll be buying an XFX or Gigabyte WindForce card and running three 1080p displays.


There's a minor difference between the 290 and 290X, but a pretty significant difference between the 280X and the 290. I would get a 290 at this point; prices have really come down.


----------



## Fade2Black

Alright, thanks. Could a CX 500W handle a 290? I've been reading many threads and the answers seem very mixed.


----------



## Forceman

Quote:


> Originally Posted by *Fade2Black*
> 
> Alright, thanks. Could a CX 500W handle a 290? I've been reading many threads and the answers seem very mixed.


It would be a stretch. My system draws between 400W and 450W from the wall at full load (that's with +50mV on the GPU), so your system would draw less than that, but you are still probably looking at something in the 400W range. Plus, you might run into problems with the available amps on the 12V rail. In short, I wouldn't risk it.
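The reasoning above can be sanity-checked with a tiny sketch. The 450W wall reading is the figure from this post; the 85% efficiency is an assumed round number for a unit of this class, not a measured spec for the CX500, so treat the result as illustrative:

```python
# Rough PSU headroom check. The 450 W wall figure comes from the post above;
# the 85% efficiency is an assumption, not a measured spec for the CX500.

def dc_load_from_wall(wall_watts, efficiency=0.85):
    """PSU ratings refer to DC output, so convert the AC wall reading."""
    return wall_watts * efficiency

def headroom(psu_rating_watts, dc_load_watts):
    """Nominal spare capacity left on the unit."""
    return psu_rating_watts - dc_load_watts

load = dc_load_from_wall(450)   # ~382 W of actual DC load
spare = headroom(500, load)     # ~118 W of nominal spare on a 500 W unit
print(f"{load:.0f} W load, {spare:.0f} W spare")
```

Roughly 118W of nominal headroom sounds fine on paper, but it leaves little margin for capacitor aging, 12V rail distribution, or transient spikes, which matches the "I wouldn't risk it" conclusion.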


----------



## Fade2Black

600W to be safe, then, and I can give my 500W to a friend who currently has the parts from my sig rig. Thanks for the info.


----------



## fateswarm

It's a bit ironic that the forum spends so much time on CPU overclocking when GPU overclocking is more delicate: the space is smaller and the power draw is bigger, so higher VRM quality is needed. I guess it's because most GPU users are gamers, while many CPU users are also seasoned techies.


----------



## darkelixa

My Sapphire R9 290 gets so damn hot that the card starts to throttle; then if I turn the machine off and back on, I get a red screen for a split second before Windows finishes booting. About to give up on these hot devil cards.


----------



## heroxoot

Quote:


> Originally Posted by *darkelixa*
> 
> My Sapphire R9 290 gets so damn hot that the card starts to throttle; then if I turn the machine off and back on, I get a red screen for a split second before Windows finishes booting. About to give up on these hot devil cards.


I was having a similar issue. Applying new thermal paste, in a non-messy way, dropped my load temps a lot.


----------



## chinmi

Count me in. I just replaced my 6990, *dead from a lightning strike*, with a 290X.

1. GPU-Z link with OCN name: http://www.techpowerup.com/gpuz/k6dkq/
Screenshot of GPU-Z validation tab open with OCN name: http://i.imgur.com/634DreJ.jpg
Pics of the GPU with a piece of paper showing my OCN name: http://imgur.com/KAFdpGA & http://imgur.com/VccqMff

2. Gigabyte R9 290X

3. Cooling - Currently stock. I still haven't decided whether to change it to aftermarket or water, since in my country taking off the GPU cooler voids the warranty.


----------



## darkelixa

Yeah, I could do that, but I don't really want to void my warranty. I just RMA'd a Gigabyte R9 290 that I bought like a month ago; it was worse: it would shut the whole PC down and then 5 seconds later the PC would start up again. The issue goes away when the GPU is replaced.


----------



## Blaise170

Quote:


> Originally Posted by *piquadrat*
> 
> If that was true the service employees all over the world wouldn't use grounded bands over their wrists.


Static is electric potential...


----------



## heroxoot

Quote:


> Originally Posted by *darkelixa*
> 
> Yeah, I could do that, but I don't really want to void my warranty. I just RMA'd a Gigabyte R9 290 that I bought like a month ago; it was worse: it would shut the whole PC down and then 5 seconds later the PC would start up again. The issue goes away when the GPU is replaced.


Sapphire is quite good about it. A friend was water cooling, and they told him that as long as the old cooler works, just slap it back on and send it in. MSI is the same way: as long as it's not physically broken, they'll take it.


----------



## darkelixa

I don't think TIM is the issue if I have the problem with my Gigabyte R9 290 and the same with the Sapphire; I think it's the R9 290 itself.


----------



## Fade2Black

http://uk.pcpartpicker.com/p/2vjdrH
I have my finger over the proceed-to-checkout button... should I do it?


----------



## heroxoot

Quote:


> Originally Posted by *darkelixa*
> 
> I don't think TIM is the issue if I have the problem with my Gigabyte R9 290 and the same with the Sapphire; I think it's the R9 290 itself.


All I know is my 290X was hitting 85C at load in BF4 at the default 1030MHz clock, and changing the thermal paste to the Noctua paste I have dropped my load temps to 77C. I'd say it's worth a shot. They do a terrible job when they put the card together.


----------



## ZealotKi11er

Quote:


> Originally Posted by *heroxoot*
> 
> All I know is my 290X was hitting 85C at load in BF4 at the default 1030MHz clock, and changing the thermal paste to the Noctua paste I have dropped my load temps to 77C. I'd say it's worth a shot. They do a terrible job when they put the card together.


How fast is the fan going? My 290 hits 94C in every game, but depending on how demanding the game is, the fan varies from 30-50%.


----------



## CoolRonZ

Quote:


> Originally Posted by *ZealotKi11er*
> 
> How fast is the fan going? My 290 hits 94C in every game, but depending on how demanding the game is, the fan varies from 30-50%.


Yeah, me too. I got mid-70s @70% fan speed after a TIM change, although fan speeds were set to 100% at 80°C. But now I can add 200mV, crank her up to 1250/1500, and do mid-40s with an XSPC block, and it's way quieter... I just wish the memory voltage wasn't locked...


----------



## piquadrat

Quote:


> Originally Posted by *Blaise170*
> 
> Static is electric potential...


... so it's not always power draw and heat that kills.


----------



## heroxoot

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> All I know is my 290X was hitting 85C at load in BF4 at the default 1030MHz clock, and changing the thermal paste to the Noctua paste I have dropped my load temps to 77C. I'd say it's worth a shot. They do a terrible job when they put the card together.
> 
> 
> 
> How fast is the fan going? My 290 hits 94C in every game, but depending on how demanding the game is, the fan varies from 30-50%.

I think around 70%. When I was getting 85C+ it was pushing 90% fan speed. Right now I'm stuck on the default fan curve. I got MSI to fix the BIOS problem, but they sent me the wrong BIOS: a 290 BIOS, not a 290X one. So I'm waiting for them to get me the 290X BIOS, but I have to wait till Monday as they're closed on weekends. I use GPU-Z to monitor fan speed, and all I can tell is that it's running X/3000 RPM. On the new BIOS they gave me the problem is solved, but again, it isn't a 290X BIOS, and I refuse to hinder my GPU's performance.


----------



## Dasboogieman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> My 290X is 77.9% and does 1200MHz with +100mV under water. It did 1125MHz at stock voltage.


I thought 77.9% was a good quality chip? Considering 290X GPUs tend to be better binned in the first place, compared to the failed 290s, in order to meet board power requirements.
Quote:


> Originally Posted by *PureBlackFire*
> 
> My reference 290 was maxing at 77C with 68% fan speed. I really hope AMD realizes that they aren't just in the business of making chips and that the whole product has to be decent as a package. We'll see if they stick with these cheapo reference coolers on the next series of cards.


Yeah, they didn't want to redesign the cooler, so they simply let it run hotter and louder, exploiting the fact that more energy is transferred as the temperature differential increases. Good grief, I wonder why they didn't just commission Sapphire to design a Vapor-X blower of similar capacity to the Titan's, if the in-house team is incapable or too lazy to do it.


----------



## PureBlackFire

My reference 290 was maxing at 77C with 68% fan speed. I really hope AMD realizes that they aren't just in the business of making chips and that the whole product has to be decent as a package. We'll see if they stick with these cheapo reference coolers on the next series of cards.


----------



## rdr09

Quote:


> Originally Posted by *PureBlackFire*
> 
> My reference 290 was maxing at 77C with 68% fan speed. I really hope AMD realizes that they aren't just in the business of making chips and that the whole product has to be decent as a package. We'll see if they stick with these cheapo reference coolers on the next series of cards.


Pure, I'm not sure about others, but I knew this reference card of mine was a hothead even before this thread was created. My system is water-cooled, and reference cards are the first to get blocks. There are non-reference versions now, but even the ones with two fans are struggling to keep these babies cool. It's water or three fans.

Oh, I forgot - AIOs and some Gelid heatsinks.


----------



## bbond007

I'm having an issue with my MSI R9 290X boards and AMD's ridiculous 14.6 beta drivers.

I can't set the core clock and have it take effect; the driver seems to totally ignore the core clock setting. I've tried setting it with both CCC and Afterburner 3.01, on multiple machines, with both the May and the June betas.

One machine has an ASRock Fatal1ty Z87 Killer motherboard with an MSI 290X Lightning and an MSI 290X Gamer.

The other machine has an ASRock Fatal1ty Z87 Killer motherboard and a single MSI 290X Gamer.

I've tried DDU on both machines and re-installed the drivers and Afterburner. No dice.

Is anyone else having this issue, or is there a fix? This seems like basic functionality that is broken; it feels like AMD's drivers are actually regressing and getting worse. Back to 13.12 for me.

Any help would be greatly appreciated.


----------



## Mega Man

Ironically, one of mine from Sapphire has a chipped die; it came like that, so I doubt they even care about that.


----------



## fateswarm

Hrm. Am I safe with 1100 GPU, 1300 VRAM, and +50mV VDDC on a Tri-X 290? I'm mainly interested in safety plus a boost, not full-on performance regardless of safety.


----------



## Forceman

Quote:


> Originally Posted by *fateswarm*
> 
> Hrm. Am I safe with 1100 GPU, 1300 VRAM, and +50mV VDDC on a Tri-X 290? I'm mainly interested in safety plus a boost, not full-on performance regardless of safety.


I've been running 1100/1350/+50mV for several months now with no problems. That's on water, but the Tri-X should keep things cool enough also.


----------



## Sgt Bilko

Power limit to 50% in CCC and Afterburner fixed it for me.


----------



## fateswarm

Quote:


> Originally Posted by *Forceman*
> 
> I've been running 1100/1350/+50mV for several months now with no problems. That's on water, but the Tri-X should keep things cool enough also.


Hrm, that extra 50 on the RAM gives it an interesting boost indeed. I'm only worried about voltage safety, to be honest. If I understand correctly, temps don't matter to the chips as much as voltage does, as long as temps are below 90C.


----------



## fateswarm

I'm now trying +37mV at 1100/1350. Heaven seems to like it. That's an interesting boost that doesn't look scary voltage-wise at all.


----------



## ZealotKi11er

Quote:


> Originally Posted by *fateswarm*
> 
> I'm now trying +37mV at 1100/1350. Heaven seems to like it. That's an interesting boost that doesn't look scary voltage-wise at all.


If you have Hynix, try 1500MHz memory @ +50mV.


----------



## fateswarm

Hah. I do have Hynix, according to MemoryInfo. BTW, I tried +12mV at 1100/1350 before and it seemed to take it.


----------



## fateswarm

BTW, the VDDC setting appears to control only the GPU core voltage regulator (rather than the memory's), according to HWiNFO. That means it may have a big impact on GPU stability, but only an indirect impact on memory, since it would mainly stabilize the GPU core to work with the faster memory.


----------



## kizwan

Quote:


> Originally Posted by *bond32*
> 
> *Even though it's kind of a pain, and I know lots of people don't like it, ASUS GPU Tweak has been working quite well for me. Since I only have one of the three cards on water right now, I have to deal with the absurd heat and fan noise. ULPS actually helps with this. I noticed that GPU Tweak disables ULPS while it runs, which is quite handy: when I want to bench/game/OC I leave it open, and when I just want to browse, all I need to do is close it; ULPS kicks back in and cuts the non-water-cooled cards off.*
> 
> Also, since I have one 290X and two 290s, I can't seem to get AB to change clocks on all three. GPU Tweak doesn't seem to have any issues with it. And yes, I have the Sync setting checked.
> 
> Edit: What exactly does "Enable unified GPU usage monitoring" do?


Did your 2nd & 3rd cards overvolt properly when overclocked? Meaning, did they get the additional voltage you set in AB or GPU Tweak?


----------



## Forceman

Quote:


> Originally Posted by *fateswarm*
> 
> BTW, the vddc setting appears to be restricted to the gpu chip voltage regulator (rather than the memory's) according to hwinfo which means it may have a big impact on gpu stability but only an indirect impact for memory since it would mainly stabilize the gpu chip to work with the faster memory.


I think the limiting factor for memory overclocks is the memory controller, not the VRAM itself. The VRAM is rated at 1500 (at least that's my understanding), but lots of people can't run that reliably. That, coupled with the fact that increasing VDDC really does help with memory overclocking, is what leads me to that conclusion. It also makes sense given the new memory controller, since it is reported to be less robust than the 384-bit controller used in Tahiti. I should go back and see if there is any correlation between high core clocks and high memory overclocks - whether silicon that's good for raising the core is also good for raising the memory.

Luckily, memory speed isn't all that important for performance.


----------



## Dasboogieman

Quote:


> Originally Posted by *Forceman*
> 
> I think the limiting factor for memory overclocks is the memory controller, not the VRAM itself. The VRAM is rated at 1500 (at least that's my understanding), but lots of people can't run that reliably. That, coupled with the fact that increasing VDDC really does help with memory overclocking, is what leads me to that conclusion. It also makes sense given the new memory controller, since it is reported to be less robust than the 384-bit controller used in Tahiti. I should go back and see if there is any correlation between high core clocks and high memory overclocks - whether silicon that's good for raising the core is also good for raising the memory.
> 
> Luckily, memory speed isn't all that important for performance.


Quote:


> Originally Posted by *PureBlackFire*
> 
> My reference 290 was maxing at 77C with 68% fan speed. I really hope AMD realizes that they aren't just in the business of making chips and that the whole product has to be decent as a package. We'll see if they stick with these cheapo reference coolers on the next series of cards.


Yeah, all my Hynix chips are rated for 6000MHz, yet my MSI card can't do 1500MHz (unlike my Sapphire Tri-X card) unless I push +125mV on the core. The memory controller is definitely the bottleneck, similar to Fermi and Haswell. Unfortunately, unlike on CPUs, we can't control the memory controller voltage or VRAM voltage independently of Vcore.


----------



## fateswarm

How can I make Afterburner do its job? It runs minimized after a reboot, OK; the box to apply settings is checked, OK. But it doesn't apply the settings unless I do it explicitly.


----------



## Razzaa

I have been following this thread a lot for some time now. I bought an MSI R9 290 off eBay today for $240. I know it is a used card, but the seller was top rated and offers a money-back policy. I am coming from a 270, so I am sure I will notice a big difference in gaming lol.

According to this http://forum-en.msi.com/faq/article/power-requirements-for-graphics-cards my XFX 550W will run a 290, but I will upgrade shortly anyway. I am looking forward to a better gaming experience!!!


----------



## Blaise170

Quote:


> Originally Posted by *Razzaa*
> 
> I have been following this thread a lot for some time now. I bought an MSI R9 290 off eBay today for $240. I know it is a used card, but the seller was top rated and offers a money-back policy. I am coming from a 270, so I am sure I will notice a big difference in gaming lol.
> 
> According to this http://forum-en.msi.com/faq/article/power-requirements-for-graphics-cards my XFX 550W will run a 290, but I will upgrade shortly anyway. I am looking forward to a better gaming experience!!!


XFX usually rebrands Seasonic. Unless you have some specific reason to upgrade, you don't need to.


----------



## Razzaa

Quote:


> Originally Posted by *Blaise170*
> 
> XFX usually rebrands Seasonic. Unless you have some specific reason to upgrade, you don't need to.


Great news. Now hopefully this card can be unlocked!!!


----------



## Blaise170

Quote:


> Originally Posted by *Razzaa*
> 
> Great news. Now hopefully this card can be unlocked!!!


Just curious, what PSU do you have?


----------



## fateswarm

Hrm. CCC competes with all overclocking utilities on system startup. How do I prevent it without ugly workarounds?

Overdrive is already unticked.


----------



## PCSarge

Quote:


> Originally Posted by *fateswarm*
> 
> Hrm. CCC competes with all overclocking utilities on system start up. How do I prevent it without ugly workarounds?
> 
> Overdrive is already unticked.


I don't seem to have this problem... my CCC doesn't interfere with Afterburner at all.

In other news: a TIM swap on my card from the stock paste brought me from 54C idle down to 47C idle, and 75C load down to 70C load in Heaven.


----------



## Forceman

Quote:


> Originally Posted by *fateswarm*
> 
> Hrm. CCC competes with all overclocking utilities on system start up. How do I prevent it without ugly workarounds?
> 
> Overdrive is already unticked.


I saw someone recommend setting AB to delayed start, so that it starts after CCC. Supposedly the problem is that AB runs first and sets the overclock, and then CCC runs and resets it, so making AB delay 30 seconds before starting (using Task Scheduler) is supposed to fix it. So you could try that. Or you could try enabling CCC, setting your overclock in AB, and then checking whether it is reflected in CCC. If it is (so CCC shows the same speeds), then CCC won't reset it when you reboot and all will be fine. That's the way I have it set up, with CCC Overdrive enabled, and I've never had any problems - except for the sleep voltage reset bug, which I don't think is fixable.
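For anyone who wants to try the delayed-start route without digging through the Task Scheduler UI, a task can be created from an elevated command prompt. This is only a sketch: the install path and task name are assumptions, so adjust them for your system.

```shell
REM Launch MSI Afterburner 30 seconds after logon, elevated, so its
REM overclock is applied after CCC has already loaded and reset clocks.
REM Path and task name below are examples; change them to match your install.
schtasks /Create /TN "AfterburnerDelayed" ^
    /TR "\"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe\"" ^
    /SC ONLOGON /DELAY 0000:30 /RL HIGHEST /F
```

If you go this way, untick "Start with Windows" inside Afterburner itself so it only launches through the scheduled task.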


----------



## Razzaa

Quote:


> Originally Posted by *Blaise170*
> 
> Just curious, what PSU do you have?


XFX TS550w


----------



## Roboyto

Quote:


> Originally Posted by *bbond007*
> 
> I'm having an issue with my MSI R9 290X boards and AMD's ridiculous 14.6 beta drivers.
> 
> I can't set the core clock and have it take effect; it seems to totally ignore the core clock setting. I've tried setting it with both CCC and Afterburner 3.01, on multiple machines, with both the May and the June betas.
> 
> One machine has an ASRock Fatal1ty Z87 Killer motherboard with an MSI 290X Lightning and an MSI 290X Gamer. The other machine has the same motherboard and a single MSI 290X Gamer.
> 
> I've tried DDU on both machines and re-installing the drivers and Afterburner - no dice.
> 
> Anyone else having this issue, or a fix for it? This seems like basic functionality that is broken; it seems like AMD's drivers are actually regressing. Back to 13.12 for me.
> 
> Any help would be greatly appreciated.

What version of Windows are you on? 14.anything was troublesome for me with Win7, so I stuck with 13.12.

The 14.6 beta has been flawless with 8.1.

If you're using CCC/Overdrive to OC, stop now and disable it; combining it with another utility is an even worse idea.

Quote:


> Originally Posted by *fateswarm*
> 
> Hrm. Am I safe with 1100 GPU, 1300 VRAM and +50mV VDDC on a Tri-X 290? I'm mainly interested in safety plus a boost, not full-on performance regardless of safety. I'm trying +37mV on 1100/1350 now. Heaven seems to like it. That's an interesting boost that doesn't look scary on voltage at all.
> 
> Hrm. CCC competes with all overclocking utilities on system startup. How do I prevent it without ugly workarounds?
> 
> Overdrive is already unticked.

Sure, as long as your temperatures are solid. Core and VRMs are probably best kept in the 80s at most; sure, they're capable of running hotter, but for the life expectancy of your GPU you probably don't want to do that.

My PowerColor reference 290 makes 1075/1375 with +37mV. It takes too much more voltage for higher clocks after that so I just leave it. I wouldn't concern yourself with the RAM clocks all that much if they give you trouble, you have much more to gain from core speeds anyhow. Others have had luck using AUX voltage in AB to assist in RAM clocks if you really want to try and push it.

I would heed @Forceman's advice of delayed start. Personally I let Trixx load on start but leave stock settings, and if I'm going to game I apply the OC profile.

Quote:


> Originally Posted by *Dasboogieman*
> 
> I thought 77.9% is a good quality chip? Considering the 290X GPUs tend to be better binned in the first place, compared to the failed 290s, to meet board power requirements.
> 
> Yeah, they didn't want to redesign the cooler, so they simply let it run hotter and louder to abuse the fact that more energy is transferred as the temperature differential increases. Good grief, I wonder why they didn't just commission Sapphire to design a Vapor-X blower of similar capacity to the Titan one if the in-house team is incapable or too lazy.


From most of what I have read in this thread since ~October the ASIC values for these cards don't mean too much.

HIS had a blower design with heatpipes that actually worked quite well on a 7950. I briefly used one of these cards for BTC mining, and it held a 1200MHz core clock with a little voltage boost without making a bunch of noise, even at fairly high (~70%+) fan speeds.



Spoiler: HIS Blower


Quote:


> Originally Posted by *Fade2Black*
> 
> Alright, thanks. Could a CX 500W handle a 290? I've been reading many threads and the answers seem very mixed.


You can get away with a lower-wattage PSU, but I would get a higher-quality/more efficient one than that. My HTPC runs a mildly OC'd 290 (1075/1375) and a 3770K @ 4GHz with 8GB of 2133 RAM using a Rosewill Capstone 450W modular. It's also powering (1) SSD, (2) HDDs, a BD-RW, (2) AIO pumps, and 3 fans.

Rosewill has been good to me; I would definitely recommend their Capstone line, since my main rig is using the 650W for a pretty high 1300/1700 290 OC and a 4770K @ 4.5+.
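As a rough sanity check on PSU sizing, you can just add up ballpark draw figures per component. The wattages below are assumed estimates for a system like the HTPC above, not measurements; the point is only that a quality 450W unit still leaves some headroom.

```python
# Rough system power budget. All wattages are ballpark assumptions
# (not measurements) for a mild-OC 290 HTPC build.
parts = {
    "R9 290 (mild OC)": 230,
    "i7-3770K @ 4 GHz": 85,
    "motherboard + 8GB RAM": 40,
    "SSD + 2x HDD + BD-RW": 25,
    "AIO pumps + fans": 15,
}

total = sum(parts.values())  # estimated peak draw
psu_watts = 450
headroom = psu_watts - total

print(f"Estimated draw: {total} W")            # 395 W
print(f"Headroom on a {psu_watts} W PSU: {headroom} W")
```

With numbers like these, a good 450W unit is workable but tight, which is why a heavier OC or a hungrier CPU pushes you toward the 550-650W class.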

Quote:


> Originally Posted by *Razzaa*
> 
> XFX TS550w


As long as you're not running any kind of OC on your 8350 you'll probably be alright; just be mindful of that CPU's power-hungry tendencies. If you see some hiccups under load, I would consider upgrading the PSU.

At stock, according to GPU-Z, my reference 290 with an AIO cooler pulls around 200W depending on ambient temperature; in winter it was ~20W less.


----------



## pdasterly

What's wrong with CCC? I just uninstalled Trixx and use CCC only to increase the power limit.


----------



## HoneyBadger84

Quote:


> Originally Posted by *pdasterly*
> 
> What's wrong with CCC? I just uninstalled Trixx and use CCC only to increase the power limit.


As far as I know, CCC doesn't hold settings on the next boot (clocks, fan speed, power limit); that's one of its drawbacks when it comes to OCing if you're going for a 24/7-stable setup with an OC on the card(s).


----------



## pdasterly

What's the best utility for a Sapphire card? Trixx is OK, but it has also changed settings on me a few times.


----------



## HoneyBadger84

IMO MSI Afterburner is the best GPU OCing tool for testing/maintaining settings; just make sure you don't check "apply settings on startup" until you know they're 100% stable.

The different methods by which you can do fan control are a big plus for me as well.


----------



## Roboyto

Quote:


> Originally Posted by *pdasterly*
> 
> What's wrong with CCC? I just uninstalled Trixx and use CCC only to increase the power limit.


Interference with other utilities, and an utter lack of options for OC; the power limit can only get you so far.

Also, and I'm not sure if this has ever been resolved, CCC had a problem where it would automatically re-apply an OC after a crash. So if you were trying to push your card to the limit and crashed because of it, you could end up in a crash loop, since CCC loads on boot.

I haven't used CCC for OC since my 4890, which was a miracle chip and didn't require any extra voltage for a high OC; otherwise it has been Trixx since the HD 6XXX series and up.


----------



## pdasterly

I'll give Trixx one more chance.


----------



## HoneyBadger84

Quote:


> Originally Posted by *pdasterly*
> 
> Ill give trixx one more chance


One of my main issues with Trixx was not being able to set your fan profile to a step pattern instead of just the arc option it uses. I have my fans set up like so:


----------



## Roboyto

Quote:


> Originally Posted by *pdasterly*
> 
> Ill give trixx one more chance


What kind of issues are you having, specifically? I see you have a sufficient PSU, so we can immediately rule that out.

XFire OC can be a challenge. Usually best to bench each card individually, find the lesser performer, and attempt those clocks with both.

With these cards the core is your top priority; don't even touch RAM clocks until you're certain you have reached a limit on the core. The 512-bit bus makes up for lesser RAM clocks.

RAM clocks too high = black screens and lockups, in my experience. A core clock too high will give you artifacting/flickering, and in most cases more voltage will solve it; if you can't apply any more voltage, then back your clocks down a bit.

Try all the utilities for the hell of it: Trixx, AB, and even HIS iTurbo. I had a 270X Devil edition that worked better with HIS iTurbo than anything else; it seems identical to Trixx aside from the skin. If you want to try flashing to an ASUS BIOS, then you could even use GPU Tweak.


----------



## RocketAbyss

Quote:


> Originally Posted by *pdasterly*
> 
> Ill give trixx one more chance


Do you know if Trixx will work on another make of card? I have a reference PowerColor 290X and was wondering if I could use Trixx for OC'ing purposes. I want to avoid Afterburner as it has given me problems before.


----------



## Roboyto

Quote:


> Originally Posted by *RocketAbyss*
> 
> Do you know if Trixx will work on another make card? I have a reference PowerColor 290X and was wondering if i could use Trixx for OC'ing purposes. I want to avoid afterburner as it has given me problems before.


Absolutely, I'm using Trixx for my reference PowerColor 290.


----------



## RocketAbyss

Quote:


> Originally Posted by *Roboyto*
> 
> Absolutely, I'm using Trixx for my reference PowerColor 290.


Sweet! No modding of bios needed amirite?


----------



## Roboyto

Quote:


> Originally Posted by *RocketAbyss*
> 
> Sweet! No modding of bios needed amirite?


Nope.

That's only required for GPU Tweak AFAIK.


----------



## pdasterly

No onboard fans; using a G10 bracket.
Cards are reference with the Tri-X OC BIOS on them.
My VRM1 temps are the real issue; I can't go over 1050 on the GPU without VRM1 going over 90C.


----------



## RocketAbyss

Quote:


> Originally Posted by *Roboyto*
> 
> Nope.
> 
> That's only required for GPU Tweak AFAIK.


Gotcha thanks! Will give it a shot once I'm back home from work.

What kind of voltages should I be trying for? I saw some peeps going for +75mV. I'm hoping to push my memory to at least 1500MHz, because my GPU crashes when setting anything above 1400MHz using CCC.


----------



## HoneyBadger84

Quote:


> Originally Posted by *RocketAbyss*
> 
> Gotcha thanks! Will give it a shot once I'm back home from work.
> 
> What kind of voltages should I be trying for? I saw some peeps going for +75mV. I'm hoping to push my memory to at least 1500Mhz because my GPU crashes when setting anything above 1400Mhz using CCC.


Some cards just won't go much over 1400 on the vRAM without flashing to a BIOS that has more relaxed vRAM settings, in my experience. I have one card that'll do 1450+, and two that get pissed if I go over 1400 at all, so my testing OC for TriFire is nerfed to 1375MHz vRAM as a result. Of course, that's with no voltage being added, just a power limit increase. I haven't really pushed these cards much because 2 are stock blower cards and I'm reselling them.


----------



## RocketAbyss

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Some cards just won't go much over 1400 on the vRAM without flashing to a BIOS that has more relaxed vRAM settings, in my experience. I have one card that'll do 1450+, and two that get pissed if I go over 1400 at all, so my testing OC for TriFire is nerfed to 1375MHz vRAM as a result. Of course, that's with no voltage being added, just a power limit increase. I haven't really pushed these cards much because 2 are stock blower cards and I'm reselling them.


Yeah. When I managed that 1400MHz on the memory at stock voltage, I was like "hell yeah!". Then a few weeks/months later I see people hitting 1500MHz and the works; that's when I felt my card's memory was a little lacking.

Weirdly enough, I remember only being able to do 1400MHz mem for the Valley and Heaven benches. As soon as I tried Battlefield 4, my system would crash, which is partly why my mem is downclocked to about 1300MHz now with almost no (once in a blue moon) crashes. I reckon a slight bump in voltage will help, yes?


----------



## Roboyto

Quote:


> Originally Posted by *RocketAbyss*
> 
> Gotcha thanks! Will give it a shot once I'm back home from work.
> 
> What kind of voltages should I be trying for? I saw some peeps going for +75mV. I'm hoping to push my memory to at least 1500Mhz because my GPU crashes when setting anything above 1400Mhz using CCC.


Voltages depend on how good of a card you have, honestly; the only way to know is to test it. Start bumping your core clock, run a bench until you experience artifacts/flickers, then give it a little voltage. I usually go in ~25MHz jumps and ~10mV increases so I can locate the 'sweet spot' for temps/volts/clocks.

The voltages you should 'go for' depend on A) keeping the core/VRM1 cool and B) how hard you want to push your GPU.

A pretty good card will reach around 1100 core without adjusting the voltage at all. My XFX reference 290 makes 1200/1500 with +87mV, and 1300/1700 with +200mV and a 50% power limit.

RAM is not that important for these cards; I wouldn't touch it until you reach a limit on your core clock. If you've pushed your RAM clocks too high, odds are you will get black screens and lockups; core clocks set too high nearly always cause artifacting/flickering.
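The stepping routine above (raise the core in ~25MHz jumps, add ~10mV when a step artifacts, stop at a personal voltage limit) can be sketched as a small search loop. This is purely illustrative: `is_stable` is a hypothetical stand-in for a real bench run (Heaven/Valley plus a demanding game), and the dummy model inside it is invented for the example.

```python
# Sketch of the incremental OC search: climb the core in ~25 MHz steps,
# and when a step fails, add ~10 mV and retry, up to a comfort limit.
CORE_STEP_MHZ = 25
VOLT_STEP_MV = 10
MAX_OFFSET_MV = 100  # personal comfort limit on the voltage offset

def is_stable(core_mhz: int, offset_mv: int) -> bool:
    # Hypothetical stand-in for an actual bench run. This dummy model
    # assumes every extra 10 mV buys roughly 25 MHz past 1100 MHz.
    return core_mhz <= 1100 + (offset_mv // VOLT_STEP_MV) * 25

def find_max_core(start_mhz: int = 1000) -> tuple[int, int]:
    core, volts = start_mhz, 0
    while True:
        nxt = core + CORE_STEP_MHZ
        if is_stable(nxt, volts):
            core = nxt               # step passed: keep climbing
        elif volts + VOLT_STEP_MV <= MAX_OFFSET_MV:
            volts += VOLT_STEP_MV    # artifacts: add a little voltage, retry
        else:
            return core, volts       # out of voltage budget: settle here

print(find_max_core())  # (1350, 100) under this dummy model
```

In practice each `is_stable` call is an hour of benching rather than a function call, which is why people settle on a "good enough" point instead of exhausting the search.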


----------



## Sgt Bilko

Quote:


> Originally Posted by *RocketAbyss*
> 
> Gotcha thanks! Will give it a shot once I'm back home from work.
> 
> What kind of voltages should I be trying for? I saw some peeps going for +75mV. I'm hoping to push my memory to at least 1500Mhz because my GPU crashes when setting anything above 1400Mhz using CCC.


When you up the core voltage and core clocks it helps with your memory clocks as well.

I can't run 1000/1500 at +0mV, but I can run 1200/1500 at +100mV, for example.


----------



## RocketAbyss

Quote:


> Originally Posted by *Roboyto*
> 
> Voltages depend on how good of a card you have honestly; The only way to know is to test it. Start bumping your core clock, run a bench until you experience artifacts/flickers, then give it a little voltage. I usually go in ~25MHz jumps and ~10mV increases so I can locate the 'sweet spot' for temps/volts/clocks.
> 
> A pretty good card will reach around 1100 core without adjusting the voltage at all. My XFX reference 290 makes 1200/1500 with +87mV and 1300/1700 with +200mV and 50% power limit.
> 
> RAM is not that important for these cards, I wouldn't touch it until you reach a limit on your core clock. If you've pushed your RAM clocks too high odds are you will get black screen and lockups; core clocks nearly always cause artifacting/flickering.


Noted! I forgot to mention, though... if you looked at my old "infinitum" rig in my sig, it showed 1102 for my core. I remember (I only ever tested the OC during the first week of getting the card, which was launch week) being able to get that overclock on stock voltage, using CCC with no adjustment to the power limit at all. It ran Heaven and Valley with no issue, but as in my post above, my PC would crash upon playing Battlefield 4, within maybe the first minute of gameplay.


----------



## RocketAbyss

Quote:


> Originally Posted by *Sgt Bilko*
> 
> When you up the core voltage and core clocks it helps with your memory clocks as well.
> 
> I can't run 1000/1500 +0 mV but i can run 1200/1500 +100 mV for example


I'll have to give it a shot once I'm back from work. I just wish I could do more than 1400 on the mem at stock voltage, even though I know it has the least effect on FPS in games. I just like to set myself a benchmark to hit.







Moreover, there's the fact that 290Xs default to 1250MHz for the mem.


----------



## Roboyto

Quote:


> Originally Posted by *Sgt Bilko*
> 
> When you up the core voltage and core clocks it helps with your memory clocks as well.
> 
> I can't run 1000/1500 at +0mV, but I can run 1200/1500 at +100mV, for example.


I have noticed this phenomenon as well.

Quote:


> Originally Posted by *RocketAbyss*
> 
> Noted! I forgot to mention, though... if you looked at my old "infinitum" rig in my sig, it showed 1102 for my core. I remember (I only ever tested the OC during the first week of getting the card, which was launch week) being able to get that overclock on stock voltage, using CCC with no adjustment to the power limit at all. It ran Heaven and Valley with no issue, but as in my post above, my PC would crash upon playing Battlefield 4, within maybe the first minute of gameplay.
> 
> I'll have to give it a shot once I'm back from work. I just wish I could do more than 1400 on the mem at stock voltage, even though I know it has the least effect on FPS in games. I just like to set myself a benchmark to hit.
> 
> Moreover, there's the fact that 290Xs default to 1250MHz for the mem.


While any benchmark is a decent test for stability, none of them are a guarantee... especially against something as brutal as BF4. One thing you must also consider with BF4 is how hard it is on CPUs compared to any other bench/game, as it will load 4 processing threads up to 90%.

OCing can be a painstaking process...but if you're willing to devote the time it is very rewarding.

If you have questions/problems feel free to fire away. I have tested/tweaked/tortured 4 of these 290s already and have a pretty good understanding of how they act.

Odds are you will be needing voltage of some sort to enhance RAM clocks, but don't be discouraged if you don't hit your goals, as some of these cards simply don't clock the RAM well. They default to 1250MHz because most of the cards come equipped with 5Gbps (effective) RAM modules, and the 512-bit bus makes up for the lesser speeds.

This is a good summary of most of the info I have personally found and have accrued from this thread: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781
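The "512-bit bus makes up for lesser speeds" point is just bandwidth arithmetic, which a few lines can illustrate (standard GDDR5 figures, not anything specific to one card):

```python
# Memory bandwidth = per-pin effective data rate (Gbps) * bus width / 8.
# GDDR5 transfers 4 bits per pin per memory-clock cycle, so a 1250 MHz
# memory clock is 5 Gbps effective.
def bandwidth_gbs(mem_clock_mhz: float, bus_bits: int, pumps: int = 4) -> float:
    effective_gbps = mem_clock_mhz * pumps / 1000  # per-pin data rate
    return effective_gbps * bus_bits / 8

print(bandwidth_gbs(1250, 512))  # 290/290X stock: 320.0 GB/s
print(bandwidth_gbs(1500, 384))  # Tahiti-style 384-bit at 6 Gbps: 288.0 GB/s
```

So even at its stock 1250MHz, Hawaii's 512-bit bus already out-delivers a 384-bit card running 1500MHz, which is why core clocks matter more here.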


----------



## fateswarm

Uhm, I don't seem to be able to adjust voltage in Overdrive, so ignoring other tools doesn't seem to be an option.

Also, the power limit options appear to be of little use in some cases. I guess the system or card handles it anyway.

Quote:


> Originally Posted by *Forceman*
> 
> I saw someone recommended setting AB to delayed start, so that it starts after CCC. Supposedly the problem is that AB runs first and sets the overclock, and then CCC runs and resets it, so making AB delay 30 seconds before starting (using task scheduler) is supposed to fix it. So you could try that. Or you could try enabling CCC and then setting your overclock in AB, then check to see if it reflects in CCC. If it does (so CCC shows the same speeds) then CCC won't reset it when you reboot and all will be fine. That's the way I have it set up, with CCC overdrive enabled, and I've never had any problems - except for the sleep voltage reset bug, which I don't think is fixable.


That's what I thought of doing. It appears it doesn't override the voltage setting to +0, though it's still an ugly workaround, to be honest, since any change would need to be made in both.

You know what. I may just disable CCC for now. It seems most games, if not all, have their own settings enough for adjusting their graphics anyway.


----------



## RocketAbyss

Quote:


> Originally Posted by *Roboyto*
> 
> I have noticed this phenomenon as well.
> 
> While any benchmark is a decent test for stability, they are no guarantee...especially with something as brutal as BF4. One thing you must also consider with BF4 is how hard it is on CPUs compared to any other bench/game as it will utilize 4 processing threads up to 90%.
> 
> OCing can be a painstaking process...but if you're willing to devote the time it is very rewarding.
> 
> If you have questions/problems feel free to fire away. I have tested/tweaked/tortured 4 of these 290s already and have a pretty good understanding of how they act.
> 
> Odds are you will be needing voltage of some sort to enhance RAM clocks, but don't be discouraged if you don't hit your goals as some of these cards simply don't clock the RAM well. They default to 1250 because most of the cards come equipped with 5GHz RAM modules and the 512-bit bus makes up for lesser speeds.
> 
> This is a good summary of most of the info I have personally found and have accrued from this thread: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781


Awesome breakdown! Thanks a bunch! I've downloaded Trixx and done some minor testing with the overclocks before the heat gets out of hand. So far I've managed 1100/1500 +50mV. Anything more I will have to wait to put my GPU under water


----------



## rdr09

@Roboyto, I've been using the 14.6 beta and Win7 without issue, except the higher temps, which don't matter if you are using water. It's been as solid as 13.12 or 13.11.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> @Roboyto, I've been using the 14.6 beta and Win7 without issue, except the higher temps, which don't matter if you are using water. It's been as solid as 13.12 or 13.11.


True this ^


----------



## fateswarm

Is there any big disadvantage to not loading CCC at all, assuming I only alter in-game settings? Does it do any serious optimizations that are real?

I assume those may be part of the driver to begin with.


----------



## rdr09

Quote:


> Originally Posted by *fateswarm*
> 
> Is there any big disadvantage on not loading CCC at all? Assuming I only alter in-game settings. Does it do any serious optimizations that are real?
> 
> I assume those may be part of the driver to begin with.


fate, if you are not using CCC, e.g. for game settings, then treat it like Word: don't even let it be part of the startup programs. It slows the boot process if you are not using an SSD. Disable Overdrive and forget it.


----------



## fateswarm

Quote:


> Originally Posted by *rdr09*
> 
> fate, if you are not using CCC like for game settings, then treat it like Word. Don't even let it be part of the startup programs. It slows the boot process if you are not using SSD. Disable Overdrive and forget it.


Thanks. And lol... I've been always running CCC on a slow PC when I didn't even need it, and now that it doesn't matter (SSD+4790k) I find out.


----------



## Agoniizing

What are you guys' gaming-stable 24/7 clocks?


----------



## koekwau5

Quote:


> Originally Posted by *Agoniizing*
> 
> What are your guys gaming stable 24/7 clocks?


Currently:

Core mV extra: 50
AUX mV extra: 25

Power Limit: +25%

GPU clock: 1075Mhz
MEM clock: 1300Mhz

Card: MSI R9 290X Gaming 4GB.

The following settings are still stable, but the GPU hits 95 degrees in Max Payne 3 and Far Cry 3:

Core mV extra: 75
AUX mV extra: 25

Power Limit: +35%

GPU clock: 1100Mhz
MEM clock: 1350Mhz


----------



## fateswarm

I left it at 1100/1350 +0.012V here. I don't know if I should keep tweaking. I wouldn't want to overvolt by much; I want it to survive for at least 1 or 2 years.

PS. Power limit appears to do little here, if anything at all according to live HWInfo data. It might be covered by the motherboard or the card though.


----------



## MTDEW

Quote:


> Originally Posted by *Agoniizing*
> 
> What are your guys gaming stable 24/7 clocks?


1075 / 1375 simply because that is what mine does at default volts.

I thought 1100/1400 was completely stable since I had tested it for days using Valley / Heaven / Crysis 3 and the Witcher 2 and it seemed 100% stable at stock volts.

But then someone suggested trying Sleeping Dogs with the AA and all settings maxed....and sure enough, I had to back it down a bit.

I could run +25mv and keep the 1200/1400 clocks, but I decided to just stick with default voltage and whatever clocks it does with that for now.


----------



## Arizonian

Quote:


> Originally Posted by *chinmi*
> 
> Count me in, I just replaced my *dead cause of lightning* 6990 with a 290x.
> 
> 1. GPU-Z Link with OCN name http://www.techpowerup.com/gpuz/k6dkq/
> or Screen shot of GPU-Z validation tab open with OCN Name http://i.imgur.com/634DreJ.jpg
> or finally a pic of GPU with piece of paper with OCN name showing.
> 
> 
> http://imgur.com/KAFdpGA
> 
> &
> 
> 
> http://imgur.com/VccqMff
> 
> 
> 2. Gigabyte R9 290x
> 
> 3. Cooling - Currently stock; still haven't decided if I want to change it to aftermarket or water, since in my country taking off the GPU case will void the warranty


Congrats - added


----------



## Widde

Quote:


> Originally Posted by *Agoniizing*
> 
> What are your guys gaming stable 24/7 clocks?


1080/1350 Powerlimit +50% +88mV and +33mV aux. My cards are crap


----------



## battleaxe

Quote:


> Originally Posted by *chinmi*
> 
> Count me in, I just replaced my *dead cause of lightning* 6990 with a 290x.
> 
> 1. GPU-Z Link with OCN name http://www.techpowerup.com/gpuz/k6dkq/
> or Screen shot of GPU-Z validation tab open with OCN Name http://i.imgur.com/634DreJ.jpg
> or finally a pic of GPU with piece of paper with OCN name showing.
> 
> 
> http://imgur.com/KAFdpGA
> 
> &
> 
> 
> http://imgur.com/VccqMff
> 
> 
> 2. Gigabyte R9 290x
> 
> 3. Cooling - Currently stock; still haven't decided if I want to change it to aftermarket or water, since in my country taking off the GPU case will void the warranty


So the ol' girl died on you? Poor thing... she had a split personality anyway though, right?

Edit: Getting 1100/1400 on stock on my Reference MSI. Nowhere near that on the MSI gaming.


----------



## Agoniizing

Thank you to those who posted their 24/7 clocks







I just wanted an idea.


----------



## PureBlackFire

1100/1300MHz at stock voltage 24/7. I need to bump voltage in order to OC the memory even 1MHz without errors lol. I can do 1150/1400 with +60mV.


----------



## fishingfanatic

Hey, if I have a PowerColor Devil 13 Dual Core R9 290X 8GB, does that qualify, as it is a dual but not a 295X2? lol

Just waiting for it to arrive....Someone chew their nails for me...









FF


----------



## VSG

No different than when PowerColor and ASUS made their Devil 13 7990 and ARES II; it is still a dual-GPU board, just not the reference PCB or name.


----------



## koekwau5

Quote:


> Originally Posted by *MTDEW*
> 
> 1075 / 1375 simply because that is what mine does at default volts.
> 
> I thought 1100/1400 was completely stable since I had tested it for days using Valley / Heaven / Crysis 3 and the Witcher 2 and it seemed 100% stable at stock volts.
> 
> But then someone suggested trying Sleeping Dogs with the AA and all settings maxed....and sure enough, I had to back it down a bit.
> 
> I could run +25mv and keep the 1200/1400 clocks, but I decided to just stick with default voltage and whatever clocks it does with that for now.


Same weirdness here.
GPU clock @ 1100MHz is Max Payne 3 and Far Cry 3 stable, but Borderlands 2 will cause a black screen. Resetting the PC causes it to start with a black screen as well; I can hear Windows boot, though.
Shutting the PC off completely and waiting 5 seconds fixed this.


----------



## fateswarm

GPUs have a number of stages and functional blocks supporting the APIs, and it's likely some games do not use all of them. Those parts may be unstable without causing a crash, since they go unused.


----------



## MTDEW

Yeah, I take it as a good thing when stuff like that happens.
It helped ensure my overclock was stable, and I would NEVER have guessed to use Sleeping Dogs to test my GPU's stability.

At least now I know, and when someone asks, like they just did, I can give them accurate info on what my best stable GPU OC at stock voltage is.


----------



## Dan848

I have a reference Sapphire R9 290 with the stock cooler - I bought it several minutes after midnight, as soon as they were released on Newegg. Of course it is not the best overclocker; however, it does enough for me. According to GPU-Z, this video card uses Elpida memory.

I download drivers from AMD, install them, then uninstall CCC. I download and install the latest MSI Afterburner each time I install new video card drivers.

With Afterburner and this [stock] video card I can only increase the power limit for the GPU, by percent; there is no voltage increase for the RAM. Because games require different amounts of video power to turn up the eye candy, I have 5 settings saved:

Stock 947 Core, 1250 Memory
1003 Core +2% Power Limit, 1375 Memory
1039 Core +4% Power Limit, 1390 Memory
1087 Core +6% Power Limit, 1395 Memory
1119 Core +8% Power Limit, 1401 Memory

I can go higher on the core; however, with the stock cooler the temps get high, and I do not need any more speed with a 1080p screen.
Fan speed is set by me in MSI Afterburner. Also, I usually keep the second setting, 1003/1375, for almost every game.


----------



## tsm106

Is it me? Almost a year after release, and everyone has gotten slower in regards to clock frequencies. Jomama and I were rocking 1300/1600 and higher back in October last year. I only see like 5 guys hitting 1300MHz these days...


----------



## VSG

My 290x cards bought the first week (may they have mercy on me for selling them to a miner) were also really good. Never got the chance to put them under water but they did 1150/1450 on air with no overvolting. The stock volts were higher than average though.


----------



## tsm106

Quote:


> Originally Posted by *geggeg*
> 
> My 290x cards bought the first week (may they have mercy on me for selling them to a miner) were also really good. Never got the chance to put them under water but they did *1150/1450 on air with no overvolting*. The stock volts were higher than average though.


Oh dude, those had potential! You cashed out when the values rocketed then?


----------



## VSG

Quote:


> Originally Posted by *tsm106*
> 
> Oh dude, those had potential! You cashed out when the values rocketed then?


Ya, but mostly because I had to get my motherboard exchanged and Newegg was out of stock. So it was going to be a while before Asus could send me one and I decided to go non-reference. Got a good deal combined with the Newegg promo savings and the games so that's when I had to leave this club as an owner









My current cards can go 1500/1900 on water for benching so it's ok. But the 290x was my first ever desktop GPU so it will always have a place in my heart


----------



## sugarhell

Quote:


> Originally Posted by *tsm106*
> 
> Is it me? Almost a year after release and everyone has gotten slower in regards to clock frequencies. Jomama and I were rocking 1300/1600 and higher back in Oct last year. I only see like 5 guys hitting 1300mhz these days...


----------



## PCSarge

Quote:


> Originally Posted by *sugarhell*


^ My sentiments exactly. I can do 1300/1400 all I want on this 290; SADLY the stock cooler goes DERP at that point and can't handle it unless the fan is at 100% constantly, whereas at 1100/1350 the stock cooler sits at 70C load. lol


----------



## sugarhell

Quote:


> Originally Posted by *PCSarge*
> 
> ^ my sentiments exactly i can do 1300/1400 all iwant on this 290 SADLY the stock cooler is going DERP at that point and cant handle it unless 100% fan constantly. whereas at 1100/1350 the stock cooler sits at 70C load. lol


Yeah, he is probably trolling


----------



## tsm106

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PCSarge*
> 
> ^ my sentiments exactly i can do 1300/1400 all iwant on this 290 SADLY the stock cooler is going DERP at that point and cant handle it unless 100% fan constantly. whereas at 1100/1350 the stock cooler sits at 70C load. lol
> 
> 
> 
> Yeah he is trolling probably
Click to expand...

Probably... but.

1300/1400? What?

It's funny some scores are still in the fse top 10, and they're from last year!


----------



## sugarhell

Quote:


> Originally Posted by *tsm106*
> 
> Probably... but.
> 
> 1300/1400? What?


I need to copyright that pic you thief...


----------



## PCSarge

Quote:


> Originally Posted by *tsm106*
> 
> Probably... but.
> 
> 1300/1400? What?
> 
> It's funny some scores are still in the fse top 10, and they're from last year!


Yes, 1300/1400... modified BIOS FTW

Card didn't have warranty to begin with anyways... it's an AMD "press sample" with an ES chip on it. lol

It did 1400 mem on the stock BIOS; needed to add 80MHz to the BIOS to do the core though.


----------



## tsm106

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Probably... but.
> 
> 1300/1400? What?
> 
> 
> 
> I need to copyright that pic you thief...
Click to expand...

Feel proud that The AMD God gave us consent to use it and for me to emblazon it on my avy. The fact that you created it is beside the point lol.


----------



## pdasterly

If I uninstall ccc, do I lose eyefinity?


----------



## ZealotKi11er

Quote:


> Originally Posted by *pdasterly*
> 
> If I uninstall ccc, do I lose eyefinity?


Yes.


----------



## pdasterly

I give up trying to oc my xfire setup. Too many problems, just going to run at factory settings before I f something up


----------



## Fade2Black

you can go ahead and add me to the 290 club







Can't wait for it to arrive


----------



## Blaise170

My 290 finally came back from XFX. So excite!









Add me to the club please. Card: XFX R9 290 Double Dissipation


----------



## Fade2Black

nice card. i bought the gigabyte windforce
Quote:


> Originally Posted by *Blaise170*
> 
> My 290 finally came back from XFX. So excite!


what was wrong with it?


----------



## Blaise170

Quote:


> Originally Posted by *Fade2Black*
> 
> nice card. i bought the gigabyte windforce
> what was wrong with it?


It had the reference cooler and was hitting max temps. I contacted XFX to ask if I could replace it with an aftermarket cooler and they offered to replace it with the DD version for free.


----------



## Fade2Black

Quote:


> Originally Posted by *Blaise170*
> 
> It had the reference cooler and was hitting max temps. I contacted XFX to ask about it and they told me they'd replace it with the DD version for free.


wow, that's pretty awesome. i wonder if my 7850 could be replaced with one


----------



## Sgt Bilko

Quote:


> Originally Posted by *Blaise170*
> 
> It had the reference cooler and was hitting max temps. I contacted XFX to ask if I could replace it with an aftermarket cooler and they offered to replace it with the DD version for free.


XFX have been getting better with their customer service.

Nice looking card isn't it?


----------



## Blaise170

Quote:


> Originally Posted by *Sgt Bilko*
> 
> XFX have been getting better with their customer service.
> 
> Nice looking card isn't it?


Looks sweet, and the LED logo is a nice touch too.


----------



## batman900

The DD is great: silent, and it takes a beating without a hitch. The only downside is that the VRMs get hot if you overvolt. Upping the MHz at stock voltage doesn't bother it, though.


----------



## WhitePrQjser

Do any of you know if you HAVE to use an active adapter for three displays? Not Eyefinity. Just three displays.


----------



## Arizonian

Quote:


> Originally Posted by *fishingfanatic*
> 
> Hey if I have a PowerColor Devil 13 Dual Core R9 290X 8GB does that qualify as it is a dual, but not a 295x2. lol
> 
> Just waiting for it to arrive....Someone chew their nails for me...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> FF


I'm not sure this would be a good place for that GPU. None of the owners here have those cards' temps or volts, and you may not get the type of support you need if you have any questions. I really think the 295X2 Club might be a better place. I'm checking with the 295X2 Club and will PM you to let you know.

_If I do list you here it would be as two PowerColor 290X cards and can list it as Devil 13 cooling I guess._

Either way, congrats on the new Devil13 incoming.








Quote:


> Originally Posted by *Blaise170*
> 
> My 290 finally came back from XFX. So excite!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Add me to the club please. Card: XFX R9 290 Double Dissipation
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Don't forget to subscribe to the *XFX Black/Double Dissipation Club!* . Sgt Bilko would love to have you aboard there.








Quote:


> Originally Posted by *WhitePrQjser*
> 
> Does any of you know if you HAVE to use an active adapter for three displays? Not Eyefinity. Just three displays.


Never ran three, but I don't see why you'd need one. _See 2nd post._ Someone who runs three, correct me if I'm wrong.


----------



## WhitePrQjser

Quote:


> Originally Posted by *Arizonian*
> 
> Never ran three but I don't see why one would. _See 2nd post._ Someone who has three correct me if I'm wrong.


I'm currently running two, and have a projector coming in a few days, so I'd need to be able to run three monitors. But I can plug my two main screens in the DVI ports, and the projector in the HDMI then?


----------



## kyismaster

Spoiler: Warning: Spoiler!







:c

is that low?


Spoiler: Warning: Spoiler!


----------



## Roboyto

Quote:


> Originally Posted by *rdr09*
> 
> @Roboyto, I've been using 14.6 Beta and Win7 without issue only the higher temp, which does not matter if you are using water. It's been as solid as 13.12 or 13.11.


I used up to 14.4 on Win7 and every iteration was problematic for me.

Going to 8.1 with 14.6 has been pleasant thus far. I will update my post for this information, thanks!


----------



## darkelixa

Well an update on my r9 290s,

Have sent my Gigabyte R9 290 back for RMA; it just shuts the PC down and then restarts it, regardless of what mode the card is in or the drivers.

The Sapphire R9 290 doesn't shut the PC down. It runs at 81-91 degrees during games at 27% fan on auto and only likes the 13.12 drivers; it has all sorts of issues with the newer drivers, from driver crashes to red screens right before the Windows logo on Windows 8.1.


----------



## kyismaster

that moment when you overclock and get worse 3dmark scores


----------



## PCSarge

Quote:


> Originally Posted by *kyismaster*
> 
> that moment when you overclock and get worse 3dmark scores


that's the moment you underclock to see if it does better than stock







jk


----------



## kyismaster

Quote:


> Originally Posted by *PCSarge*
> 
> thats the moment you underclock to see if it does better than stock
> 
> 
> 
> 
> 
> 
> 
> jk


brb while i just.... plug 300 vdc straight into it









My card keeps pushing itself lower and lower; it started at 1300MHz core for the first test and now it's at 1175.


----------



## kyismaster

welp, i tried.


----------



## SPLWF

Hello all, I'm a long-time lurker/internet searcher. I was always directed to this forum for AMD and/or PC related mods and such. Anyways, I recently got a Sapphire reference R9 290. I didn't like the stock cooler at all, so I upgraded to an NZXT G10 with a Corsair H55, plus the GELID ICY VRM kit as well as ZALMAN VRAM heatsinks. I did a good amount of testing on this card as far as OC goes. I believe my card is one of the lowest-binned cards.

I followed all the instructions on this:

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781

With MSI Afterburner I was only able to get core to 1150 with +88 on Core Voltage with VRM maxing out at 75c-80c (depends on fan speed, I have it hooked through ASUS AI Suite II on my sabertooth Z77).

Now my questions are:

Is VRM safe at 80c for long periods of gaming? What is the max allowed temp on the VRM?

OC memory worth it? That graph from "ROBOYTO" shows an overall of 5% increase.

I followed this guy:






I cranked up memory to 1500mhz and AUX voltage to +13 and it was stable. I later returned memory to stock clocks because I wasn't sure if that was the right way.

What is AUX voltage?

Thank you for your time


----------



## Roboyto

Quote:


> Originally Posted by *SPLWF*
> 
> Hello all, i'm a long time lurker/internet searcher. I was always directed to this forum for AMD and/or PC related mods and such. Anyways, I recently got a Sapphire Reference R9 290. I didn't like the stock cooler at all. I upgraded to a NZXT G10 with Corsair H55 plus GELID ICY VRM kit as well as ZALMAN VRAM heatsinks. I did a good amount of testing on this card as far as OC goes. I believe my card is one of the lowest BIN cards.
> 
> I followed all the instructions on this:
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781
> 
> With MSI Afterburner I was only able to get core to 1150 with +88 on Core Voltage with VRM maxing out at 75c-80c (depends on fan speed, I have it hooked through ASUS AI Suite II on my sabertooth Z77).
> 
> Now my questions are:
> 
> Is VRM safe at 80c for long periods of gaming? What is the max allowed temp on the VRM?
> 
> OC memory worth it? That graph from "ROBOYTO" shows an overall of 5% increase.
> 
> I followed this guy:
> 
> 
> 
> 
> 
> 
> I cranked up memory to 1500mhz and AUX voltage to +13 and it was stable. I later returned memory to stock clocks because I wasn't sure if that was the right way.
> 
> What is AUX voltage?
> 
> Thank you for your time


80C is plenty good for VRM1 temperature. If you want to be technical, the VRMs are rated somewhere around 120C max operating temperature. It's not good for the health of your GPU as a whole to run it like that though. Most people in this thread are comfortable in the 80C range, but don't like going much higher. Lower is obviously always better, but your current temps are acceptable. You may want to grab some Fujipoly Ultra Extreme for your VRM kit, it will bring temps down about 10-20C from what I've been told in my thermal pad upgrade thread; I've yet to do the switch myself but will be doing so soon.

If you can get high memory clocks without issues then it's not going to hurt anything; the performance gains are just much smaller than from core. You'll know pretty quickly if you push the RAM too far, as black screens and lockups will start occurring.

1150 with 88mV isn't all that bad honestly; you may want to try increasing the power limit and see if it helps at all. You have to figure that 947 is stock, so 1150 is already a 21% clock-speed jump.

My good card makes 1200/1500 with 87mV.

My bad 290 needs 200mV to get to your core speed, so I don't even bother with it. I run it at 1075/1375 +37mV. It is only pushing 1080p on the living room TV for my wife, so the card is overkill any way you look at it.
Quote:


> Originally Posted by *kyismaster*
> 
> that moment when you overclock and get worse 3dmark scores


If your score went down after the OC then it's not a successful/stable OC.

You need to monitor the GPU usage to see if the card is throttling clocks/voltages due to temperature. If it's not throttling then you either need more voltage, if temps will allow, or you need to reduce your OC to find stability.

With these cards core clock is #1 priority. Don't even touch the memory until you've hit a limit on core speed. Trying to do both simultaneously makes things more difficult than it needs to be.
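The core-first approach above can be sketched as a simple search loop. This is illustrative Python only: `is_stable` is a hypothetical stand-in for running a benchmark while watching for throttling and artifacts, stubbed here with made-up limits.

```python
# Sketch of the "core first, then memory" OC search described above.
# `is_stable` is a hypothetical helper (assumption, not a real tool):
# a real version would run a benchmark and monitor GPU usage/temps.

CORE_LIMIT, MEM_LIMIT = 1150, 1450  # pretend limits for this stub

def is_stable(core, mem):
    """Stub: stands in for 'run a benchmark and check for throttling'."""
    return core <= CORE_LIMIT and mem <= MEM_LIMIT

def find_oc(core=947, mem=1250, step=25):
    # 1) Push core alone until the next step fails.
    while is_stable(core + step, mem):
        core += step
    # 2) Only then push memory, with the core already settled.
    while is_stable(core, mem + step):
        mem += step
    return core, mem

print(find_oc())
```

The point of the structure is exactly Roboyto's advice: the two loops never run at the same time, so a failure is always attributable to one variable.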


----------



## kyismaster

Quote:


> Originally Posted by *Roboyto*
> 
> 80C is plenty good for VRM1 temperature. If you want to be technical, the VRMs are rated somewhere around 120C max operating temperature. It's not good for the health of your GPU as a whole to run it like that though. Most people in this thread are comfortable in the 80C range, but don't like going much higher. Lower is obviously always better, but your current temps are acceptable. You may want to grab some Fujipoly Ultra Extreme for your VRM kit, it will bring temps down about 10-20C from what I've been told in my thermal pad upgrade thread; I've yet to do the switch myself but will be doing so soon.
> 
> If you can get high memory clocks without issues than its not going to hurt anything; it's just much smaller performance gains compared to core. You'll know pretty quick if you push RAM too far as black screen and lockups will start occurring.
> 
> 1150 with 88mV isn't all that bad honestly; you may want to try increasing power limit and see if it helps at all. You have to figure that 947 is stock so 1150 is already a 21% performance jump.
> 
> My good card makes 1200/1500 with 87mV.
> 
> My bad 290 needs 200mV to get to your core speed so I don't even bother with it. I run it at 1075/1375 +37mV..It is only pushing 1080 on living room TV for my wife so the card is overkill anyway you look at it.
> If your score went down after OC than it's not a successful/stable OC.
> 
> You need to monitor the GPU usage to see if the card is throttling clocks/voltages due to temperature. If it's not throttling than you either need more voltage if temps will allow, or you need to reduce your OC to find stability.
> 
> With these cards core clock is #1 priority. Don't even touch the memory until you've hit a limit on core speed. Trying to do both simultaneously makes things more difficult than it needs to be.


Yeah, I know; my scores went up.


----------



## SPLWF

Quote:


> Originally Posted by *Roboyto*
> 
> 80C is plenty good for VRM1 temperature. If you want to be technical, the VRMs are rated somewhere around 120C max operating temperature. It's not good for the health of your GPU as a whole to run it like that though. Most people in this thread are comfortable in the 80C range, but don't like going much higher. Lower is obviously always better, but your current temps are acceptable. You may want to grab some Fujipoly Ultra Extreme for your VRM kit, it will bring temps down about 10-20C from what I've been told in my thermal pad upgrade thread; I've yet to do the switch myself but will be doing so soon.
> 
> If you can get high memory clocks without issues than its not going to hurt anything; it's just much smaller performance gains compared to core. You'll know pretty quick if you push RAM too far as black screen and lockups will start occurring.
> 
> 1150 with 88mV isn't all that bad honestly; you may want to try increasing power limit and see if it helps at all. You have to figure that 947 is stock so 1150 is already a 21% performance jump.
> 
> My good card makes 1200/1500 with 87mV.
> 
> My bad 290 needs 200mV to get to your core speed so I don't even bother with it. I run it at 1075/1375 +37mV..It is only pushing 1080 on living room TV for my wife so the card is overkill anyway you look at it.
> If your score went down after OC than it's not a successful/stable OC.
> 
> You need to monitor the GPU usage to see if the card is throttling clocks/voltages due to temperature. If it's not throttling than you either need more voltage if temps will allow, or you need to reduce your OC to find stability.
> 
> With these cards core clock is #1 priority. Don't even touch the memory until you've hit a limit on core speed. Trying to do both simultaneously makes things more difficult than it needs to be.


Thank you for the quick reply. I think I am good with the VRMs being at 80C, although those Fujipoly thermal pads you posted seem unreal. My only gripe is that, in order to replace the pads on VRM1, it would require removing my H55, and it would also require fresh TIM. I had another H55 when I was doing this mod for the first time without the GELID VRM kit. Man, what a huge difference in temps on the GPU: with MX5 TIM and the H55, my idle was around 50C, hitting a max of 75-80C; with the stock H55 TIM it idles around 36C, with a max temp of 63C. I think I will stick with 80C on the VRM1 for now.

What is the max temp for VRM2? What is safe? When upping the aux voltage to +13, my VDDCI jumps from 1.000 to 1.008. It seems to be holding this 1500MHz (6GHz effective) stable.


----------



## Roboyto

Quote:


> Originally Posted by *SPLWF*
> 
> Thank you for the quick reply. I think I am good with VRM's being at 80c, although those thermal pads you posted Fujipoly seems unreal. My only stink about it is, in order to replace the pads on VRM1, it would require the removal of my H55. It will also require me to use low level TIM. I had another H55 when I was doing this mod for the first time without the GELID VRM kit. Man what I huge difference in temps on the GPU. I used MX5 TIM w/ H55, my ilde was around 50c hitting a max of 75-80c, with stock H55 TIM, it idles around 36c, max temp of 63c. I think I will stick with 80c on the VRM1 for now.
> 
> What is the max temp for VRM2? What is safe? When upping aux volt to +13. My VDDCI jumps from 1.000 to 1.008. Seems to be holding this 1500mhz (6ghz) stable.


The Fujipoly results are very real I can assure you of that; there are several other users who can confirm. I will be swapping them out on my Gelid kit soon to post my findings as well.

My VRM2, with zero airflow, literally none, doesn't come close to overheating with the little Gelid sink on it. I am using some Junpus thermal tape, as I felt the included tape was much too thin. The last time I recorded temps was a 3-hour Tomb Raider session, and VRM2 peaked at 64C. VRM2 can run just as hot as VRM1, but you will never get it to that point.


----------



## aaroc

Quote:


> Originally Posted by *WhitePrQjser*
> 
> Does any of you know if you HAVE to use an active adapter for three displays? Not Eyefinity. Just three displays.


It works without active adapters: 3x 2560x1440 monitors connected to an R9 290. Tested DVI-DL + DVI-DL + DP and DVI-DL + DVI-DL + HDMI.

I bought a used R9 290 on Friday; I will return it tomorrow. It black screens: just run Unigine Valley/Heaven or 3DMark for more than 10 minutes and, black screen. My other 3 R9 290s, bought new, don't have this problem.


----------



## kizwan

Quote:


> Originally Posted by *pdasterly*
> 
> I give up trying to oc my xfire setup. Too many problems, just going to run at factory settings before I f something up


What kind of problems?
Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> @Roboyto, I've been using 14.6 Beta and Win7 without issue only the higher temp, which does not matter if you are using water. It's been as solid as 13.12 or 13.11.
> 
> 
> 
> I used up to 14.4 on Win7 and every iteration was problematic for me.
> 
> Going to 8.1 with 14.6 has been pleasant thus far. I will update my post for this information, thanks!
Click to expand...

No problem here too with any beta drivers on Win 7. What kind of problem you have with beta & Win 7?
Quote:


> Originally Posted by *kyismaster*
> 
> 
> 
> welp, i tried.


It's better to post detail scores, especially graphic scores.


----------



## Mega Man

Quote:


> Originally Posted by *aaroc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *WhitePrQjser*
> 
> Does any of you know if you HAVE to use an active adapter for three displays? Not Eyefinity. Just three displays.
> 
> 
> 
> It works without active adapters. 3x 2560x1440 monitors connected to R9 290. tested DVI DL+DVI DL+DP or DVI DL+DVI DL+HDMI
> 
> I bought an used R9 290 on Friday. I will return it tomorrow. It has black screen. Just run Unigine valley/heaven or 3dmark for more than 10 minutes and black screen. My other 3 bought new R9 290 don't have this problem.
Click to expand...

No, you can use passive; that was one of the big upgrades from the 7970 to the 280X, but it's true of all cards this generation: you can run 3 displays with a passive adapter, or no adapter, if needed.


----------



## Blaise170

Quote:


> Originally Posted by *Mega Man*
> 
> no you can use passive, that was one of the big upgrades 7970s to 280x, but it is true with all this gen cards, you can run 3 displays with a passive adapter if needed / no adapter


So you _don't_ need active DisplayPort anymore or do you still need it for Eyefinity? I was unaware of that change.


----------



## Mega Man

For 3 displays you don't; for 4 you do (unless you are using a native DP connection); for 5 and 6 you don't, as the MST hub does not need active adapters (neither would the 4th display if you use an MST hub).
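Mega Man's rule of thumb can be paraphrased as a small decision function. This is a hedged sketch of the rule as stated in the post, not AMD's official spec; the function name and parameters are made up for illustration:

```python
# Paraphrase of the adapter rule stated above (illustrative, not official):
# up to 3 displays need no active adapter; a 4th does, unless it's on a
# native DisplayPort connection or driven through an MST hub.

def needs_active_adapter(display_index, native_dp=False, via_mst_hub=False):
    """display_index is 1-based (1..6 for a typical Hawaii card)."""
    if via_mst_hub or native_dp:
        return False
    return display_index >= 4

# A 3rd monitor on HDMI: no active adapter needed.
assert needs_active_adapter(3) is False
# A 4th monitor on a non-DP output: active adapter needed.
assert needs_active_adapter(4) is True
# A 4th monitor behind an MST hub: not needed.
assert needs_active_adapter(4, via_mst_hub=True) is False
```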


----------



## pdasterly

Heat, blk screens, knowing when to stop


----------



## Roboyto

Quote:


> Originally Posted by *kizwan*
> 
> No problem here too with any beta drivers on Win 7. What kind of problem you have with beta & Win 7?


Every kind of problem possible, really: instability with known-good clocks/volts, freezing, BSODs, black screens, video loss while audio is still playing, etc. All of the aforementioned occurred while doing anything or nothing at all, with stock or OC settings: idling at the desktop, booting, browsing the Web, watching movies, benching, gaming, etc.

I tried 14.1-14.4 with Win7 and all displayed some sort of issue almost immediately after installation. I gave them a couple of hours at most and then always reverted to 13.12, until I upgraded to 8.1 Pro with 14.6 beta, which has been flawless thus far.


----------



## Blaise170

*Sigh* After playing Sleeping Dogs for about 10 minutes, I've black screened. Maybe I should've just kept my reference card.


----------



## aaroc

Quote:


> Originally Posted by *Mega Man*
> 
> for 3 displays you dont, 4 you do ( unless you are using basic DP connection ) , 5 and 6 you dont as the MST hub does not need active displays ( neither would the 4th if you use a mst hub )


The current MST hub supports only 2x 2560x1440 monitors. I wanted to run only one cable and use an MST hub, but it's not possible.


----------



## Mega Man

You are correct; you have to conform to current standards/capabilities.


----------



## Blaise170

Well, I found that the problem was throttling due to overheating. I had set a custom fan profile, but it turns out the fans were only turning at 25% speed. I then turned off the fan profile completely and am running them at 100%, and I'm now sitting comfortably at 60-65C.


----------



## kizwan

Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> No problem here too with any beta drivers on Win 7. What kind of problem you have with beta & Win 7?
> 
> 
> 
> Every kind of problem possible really; Instability with known good clocks/volts, freezing, BSOD, black screen, video loss while audio is still playing, etc. All of the aforementioned occurred while doing anything or nothing at all with stock or OC settings..idling at desktop, booting, browsing Web, watching movies, benching, gaming, etc.
> 
> I tried 14.1-14.4 with Win7 and all displayed some sort of issue almost immediately after installation. I gave them a couple hours at most and then always reverted to 13.12
> Until I upgraded to 8.1 Pro with 14.6 beta, which has been flawless thus far.
Click to expand...

I got BSODs (related to the AMD/ATi driver) too, but nothing serious: two BSODs in January, 4 in February & one in May, across various drivers. One time, a few months ago, I had video loss while audio was still playing. I can't remember which drivers, but I think either 13.11 beta or 13.12. It happened just once, so I figured it was just a fluke. Other than these, mine was working flawlessly (except some power limit issue with certain 14 beta drivers) with 13.11 beta, 13.12, 14.1, 14.2, 14.4 (RC & WHQL) & 14.6 beta.

Most important, yours is working flawlessly now.

*Catzilla 720p 1150/1350 +100mV +0% vs +50% Power Limit*

Note: Single card run. I think I can run lower than +100mV but I was in a hurry though.

*+0% PL*


*+50% PL*


*GPU Clocks & Usage - +0% vs +50% PL*


----------



## kyismaster

Quote:


> Originally Posted by *kizwan*
> 
> What kind of problems?
> No problem here too with any beta drivers on Win 7. What kind of problem you have with beta & Win 7?
> It's better to post detail scores, especially graphic scores.





Sky diver:




hmm...


----------



## pdasterly

I need a dumb dummies guide to overclock xfire setup using sapphire cards


----------



## kizwan

Quote:


> Originally Posted by *kyismaster*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Sky diver:
> 
> 
> 
> 
> 
> 
> hmm...


These are @1175 on the core, right? The Firestrike score looks all right to me. I don't have records for the other benchmarks.
Quote:


> Originally Posted by *pdasterly*
> 
> I need a dumb dummies guide to overclock xfire setup using sapphire cards


If heat is the issue, there's nothing you can do until you improve the cooling system. Heat affects stability.


----------



## pdasterly

I've done everything short of a full water block:
G10 bracket, heatsinks on everything, upgraded fan on the G10, Fujipoly, backplate.
Using a Corsair Air 540 case.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Blaise170*
> 
> Well I found that the problem was throttling due to overheating. I set a custom fan profile, but it turns out that they were only turning at 25% speed. I then just turned off the fan profile completely and am running them at 100% and now sitting comfortably at 60-65C.


Yeah, the stock profiles for most versions suck pretty badly. I have mine set to ramp up pretty early: 60C = 100% fan speed.

I play with headphones so I really don't notice
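A steep profile like that is just a set of temperature/fan-speed points with interpolation between them, which is roughly what Afterburner's custom fan curve does. A sketch, with illustrative points (not Sgt Bilko's exact profile) chosen so the fan hits 100% by 60C:

```python
# Illustrative fan curve: (temp C, fan %) points with linear interpolation,
# ramping aggressively so the card reaches 100% fan by 60C as described above.
CURVE = [(30, 25), (45, 50), (55, 80), (60, 100)]

def fan_speed(temp_c):
    """Return fan % for a given GPU temperature, interpolating the curve."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
```

The flat segments at both ends are what keep the fan quiet at idle but pinned at 100% under load.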









Quote:


> Originally Posted by *pdasterly*
> 
> Ive done everything short of a full water block.
> G10 bracket, heatsinks on everything, upgraded fan on g10, fujipoly, backplate.
> Using corsair air 540 case


What issues are you having?


----------



## pdasterly

VRM1 temps. Is OCCT the only program that detects silent errors?

My VRM1 temps are inconsistent: sometimes I get temps into the 90s and rising, sometimes they stay around 80.
I use FurMark so I can watch the temp in GPU-Z; FurMark gets VRM1 hot fast.


----------



## kyismaster

Quote:


> Originally Posted by *kizwan*
> 
> These @1175 on the core right? Firestrike score look allright to me. I don't have record for other benchmarks.
> If heat is the issue, nothing you can do until you improved the cooling system. Heat affect stability.


1175/1300


----------



## rdr09

Quote:


> Originally Posted by *pdasterly*
> 
> vrm1 temps, is occt the only program that detects silent errors?
> 
> My vrm1 temps are inconsistent, sometimes I can get temps into the 90's and rising, sometimes they remain around 80.
> I use furmark so I can watch the temp in gpu-z. Furmark gets the vrm1 hot fast


I don't think anyone here in this thread will recommend FurMark. I won't. Use the render test in GPU-Z instead.

edit: you used Fuji on the VRMs? How are you cooling it? Intake fan?
Quote:


> Originally Posted by *kyismaster*
> 
> 1175/1300


Kyis, did you mess with the power limit at all? What app are you using to OC? My 290 scores higher than that at a lower OC.


----------



## Talon720

Quick question.. Since the r9 290x Tri-x is a ref pcb with a different rom and cooler can you flash it to pt1/pt3 rom or any other roms like the ref can?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Talon720*
> 
> Quick question.. Since the r9 290x Tri-x is a ref pcb with a different rom and cooler can you flash it to pt1/pt3 rom or any other roms like the ref can?


Yep


----------



## aneutralname

Quote:


> Originally Posted by *Talon720*
> 
> Quick question.. Since the r9 290x Tri-x is a ref pcb with a different rom and cooler can you flash it to pt1/pt3 rom or any other roms like the ref can?


My R9 290 Tri-X only works with its own BIOS; it doesn't even accept BIOSes from the older Tri-X. The same may or may not be true for the 290X.


----------



## pdasterly

Quote:


> Originally Posted by *rdr09*
> 
> i don't think any one here in this thread will recommend furmark. i won't. use the render test in GPUZ instead.
> 
> edit: you used Fuji on the vrms? how are you cooling it? intake fan?
> Kyis, you messed with the power limit at all? what app are you using to oc? my 290 scores higher than that at a lower oc.


I use FurMark to heat up VRM1; no other test that I know of will do this other than BF4. With GPU-Z you can't monitor VRM1 while in the fullscreen render test, and the basic test won't heat up my machine, so I have to run like 4 instances of the app.
Fujipoly Extreme, yes. Zalman Shark's Fin fan. Airflow goes like this


----------



## rdr09

Quote:


> Originally Posted by *pdasterly*
> 
> I use furmark to heat up the vrm1's, no other test that I know will do this other than BF4. With gpu-z you cant monitor vrm1 while in fullscreen render test, basic test wont heat up my machine. I have to run like 4 instances of the app.
> fujipoly extreme, yes. zalman sharks fin fan. Airflow goes like this
> 
> 
> Spoiler: Warning: Spoiler!


It's been mentioned quite a number of times in this thread not to use FurMark. My temps in BF4 MP 64 are close to what I get in the GPU-Z render test.

this way . . .



Those were my temps last winter, and the test was run only for a few minutes.


----------



## kyismaster

Quote:


> Originally Posted by *rdr09*
> 
> i don't think any one here in this thread will recommend furmark. i won't. use the render test in GPUZ instead.
> 
> edit: you used Fuji on the vrms? how are you cooling it? intake fan?
> Kyis, you messed with the power limit at all? what app are you using to oc? my 290 scores higher than that at a lower oc.


trixiie


----------



## rdr09

Quote:


> Originally Posted by *kyismaster*
> 
> trixiie


You maxed your power limit? You should.


----------



## tsm106

Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *SPLWF*
> 
> Thank you for the quick reply. I think I am good with VRM's being at 80c, although those thermal pads you posted Fujipoly seems unreal. My only stink about it is, in order to replace the pads on VRM1, it would require the removal of my H55. It will also require me to use low level TIM. I had another H55 when I was doing this mod for the first time without the GELID VRM kit. Man what I huge difference in temps on the GPU. I used MX5 TIM w/ H55, my ilde was around 50c hitting a max of 75-80c, with stock H55 TIM, it idles around 36c, max temp of 63c. I think I will stick with 80c on the VRM1 for now.
> 
> What is the max temp for VRM2? What is safe? When upping aux volt to +13. My VDDCI jumps from 1.000 to 1.008. Seems to be holding this 1500mhz (6ghz) stable.
> 
> 
> 
> The Fujipoly results are very real, I can assure you of that; there are several other users who can confirm. I will be swapping them onto my Gelid kit soon to post my findings as well.
> 
> My VRM2, with zero airflow, literally none, doesn't come close to overheating with the little Gelid sink on it. I am using some Junpus thermal tape, as I felt the included tape was much too thin. The last time I recorded temps was a 3-hour Tomb Raider session, and VRM2 peaked at 64°C. VRM2 can run just as hot as VRM1, but you will never get it to that point.
Click to expand...

Some of us have been using Fujis for years. Some of us have been leading leaderboards in benches for years. Fujis have always been an extreme luxury, an edge for benchers; however, the increased load on Hawaii's memory modules has made them more attractive to the general user.

Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> No problem here either with any beta drivers on Win 7. What kind of problems do you have with the betas and Win 7?
> 
> 
> 
> *Every kind of problem possible, really*: instability with known-good clocks/volts, freezing, BSODs, black screens, video loss while audio keeps playing, etc. All of the above occurred while doing anything or nothing at all, with stock or OC settings: idling at the desktop, booting, browsing the web, watching movies, benching, gaming, etc.
> 
> I tried 14.1-14.4 with Win 7 and all displayed some sort of issue almost immediately after installation. I gave them a couple of hours at most and then always reverted to 13.12,
> until I upgraded to 8.1 Pro with the 14.6 beta, which has been flawless thus far.
Click to expand...

I've got nothing, other than maybe some fundamental gap in knowledge. It doesn't make any sense that it would only happen on Win 7. I have various machines here, and my main rig dual-boots 8.1 and 7 due to various benches' affinities for either OS.


----------



## fateswarm

Any idea of a good GPU stress test that loads up fast? I tried FurMark, but it appears limited for this card, since it doesn't fully load the GPU at first unless the RAM is downclocked.

Never mind, Kombustor looks good.


----------



## VSG

Unigine Heaven 4.0 or Valley, just loop them on and see the loads fly up.


----------



## fateswarm

I found Kombustor earlier. It seems good for that need. I had Heaven, but it's bulkier to load.


----------



## fateswarm

Hm, actually it's not that great in RAM usage either. I may have to use a full-blown benchmark to get a good feed of textures.


----------



## kyismaster

Quote:


> Originally Posted by *rdr09*
> 
> you maxed your Power Li? you should.


It's at 200.


----------



## Talon720

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yep


Quote:


> Originally Posted by *aneutralname*
> 
> My R9 290 Tri-X only works with its own BIOS. It does not accept BIOSes even from older Tri-X cards. The same may or may not be true for the 290X.


That's odd; I've read about a few others having that happen too. You would think it should work... maybe nobody I read about tried the PT1/PT3 BIOS. I need to replace a defective Sapphire card, but it doesn't seem they will accept it once it gets there: they state that if the serial sticker is damaged or missing they won't accept it, and I spilled Arctic Silver on mine, effectively damaging it. I think I'm kinda screwed; I don't wanna waste time sending it in just to get it back. I just wanted each card to have the same BIOS; it makes me feel good inside. So if anyone has successfully flashed a different ROM to a Tri-X, speak up.


----------



## rdr09

Quote:


> Originally Posted by *kyismaster*
> 
> Its at 200


That's the VDDC. Use the slider to expose the PL. Raise the PL to 50% and you might be able to lower that +200 to +175 at the same OC. Do this at your own risk.


----------



## kyismaster

Quote:


> Originally Posted by *rdr09*
> 
> that's the VDDC. Use the slider to expose PL. Raise the PL to 50% and you might be able to lower that 200 to 175 at same oc. do this at your own risk.


PL 50 + 200 VDDC, still the same results as PL 0.


----------



## Roboyto

Quote:


> Originally Posted by *tsm106*
> 
> I got nothing other than maybe some fundamental break in knowledge. It doesn't make any good sense that it would only be in win 7. I have various machines here and my main rig dual boots 8.1 and 7, due to various bench affinities to either OS.


I've been fairly active in and attentive to this thread since ~October, and I know I was not the only person experiencing issues with the 14.x drivers, regardless of OS.

I did not try any other 14.x driver except 14.6 with 8.1, and since this is working so well I probably won't until the next revision.


----------



## rdr09

Quote:


> Originally Posted by *kyismaster*
> 
> pl 50 + 200 vdc, still same results as pl 0


Ooh, what brand of R9 do you have? What temps do you get at load, core and VRMs?

Edit: it might just be the version of Fire Strike you used.


----------



## kyismaster

Quote:


> Originally Posted by *rdr09*
> 
> oo. what brand of r9 do you have? what temps do you get at load? Core and VRMs.
> 
> edit: might just be the version of Firestrike you used.


Sapphire reference, Hynix memory; 72°C core, 66°C VRM1, 58°C VRM2.


----------



## Gobigorgohome

How badly do four reference-cooler R9 290Xs perform in quad CrossFire on air? No overclocking on the GPUs. I (probably) won't be doing any overclocking after I get them all on water either.


----------



## rdr09

Quote:


> Originally Posted by *kyismaster*
> 
> Sapphire reference, hynix mem, 72c core 66c vrm1 58c vrm 2


I was comparing our graphics scores in Fire Strike. I might be mistaken, or you could be using the old version. I just ran another using the old driver . . .

http://www.3dmark.com/3dm/3439291?

Add about 200 pts in graphics using the 14.6 beta.


----------



## tsm106

Quote:


> Originally Posted by *Gobigorgohome*
> 
> How bad does four reference coolers R9 290X perform in quad crossfire on air? No overclocking on the GPU's. Won't be doing any overclocking (probably) after I get them all on water either.


On air it's tantamount to masochism.


----------



## kyismaster

Quote:


> Originally Posted by *rdr09*
> 
> i was comparing our graphics score in Firestrike. I might be mistaken or you could be using the old version. i just ran another using old driver . . .
> 
> http://www.3dmark.com/3dm/3439291?
> 
> add about 200 pts in graphics using 14.6 Beta.


I got 200 points boosting 200MHz. I updated to 14.6, but it says I'm still on 14.4.


----------



## Gobigorgohome

Quote:


> Originally Posted by *tsm106*
> 
> On air its tantamount to masochism.


Aha, not good then... a "free" heater at least.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gobigorgohome*
> 
> How bad does four reference coolers R9 290X perform in quad crossfire on air? No overclocking on the GPU's. Won't be doing any overclocking (probably) after I get them all on water either.


If on air, don't run more than 3. The heat is insane; I would be surprised if something does not melt. That's 1200W of concentrated heat.


----------



## Gobigorgohome

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If on air don't turn more then 3. The heat is insane. I would be surprised if something does not melt. Thats 1200W of heat concentrated.


Oh, that bad ... I am sweating with just one in my room. At first I will only do waterblocks without backplates, but I read somewhere that the thermal pads were bad over the VRM1s ... is it that bad with the EK-FC R9 290X that I cannot use it at stock clocks? It seems a little weird to me ...


----------



## tsm106

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> If on air don't turn more then 3. The heat is insane. I would be surprised if something does not melt. Thats 1200W of heat concentrated.
> 
> 
> 
> Oh, that bad ... I am sweating with just one in my room, at first I will only do waterblocks without backplates. But I read somewhere that the thermal pads where bad over the VRM1's ... is it that bad with EK-FC R9 290X that I cannot use it with stock clocks? It seems a little weird to me ...
Click to expand...

Ok, now you're just scaring the guy. It's gonna be hot, no doubt, but with BIOS 1 it shouldn't get very noisy, just hot. That said, if blocks are coming you won't be a masochist for too long.


----------



## the9quad

I run 3 on air. It's very warm. I have to turn down the AC far enough that everyone in the house complains, but I just put on my headphones and ignore them lol


----------



## pdasterly

Quote:


> Originally Posted by *rdr09*
> 
> its been mentioned quite a number of times in this thread not to use furmark. my temps in BF4 MP 64 are close to what i get in the GPUZ render test.
> 
> this way . . .
> 
> 
> 
> those were my temps last winter and was run only for a few minutes.


Temps are down a lot, 58°C on VRM1. The render test has been running for 30 minutes.


----------



## tsm106

Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I got nothing other than maybe some fundamental break in knowledge. It doesn't make any good sense that it would only be in win 7. I have various machines here and my main rig dual boots 8.1 and 7, due to various bench affinities to either OS.
> 
> 
> 
> I've been fairly *active and attentive of this thread since ~October*, and I know I was not the only person experiencing issues with 14.X drivers, regardless of the OS.
> 
> I did not try any other 14 driver except 14.6 with 8. 1, and since this is working so well I probably won't until the next revision.
Click to expand...

I'm not sure we have the same definition of active, but your first post in this thread is from Feb.


----------



## sugarhell

Quote:


> Originally Posted by *tsm106*
> 
> I'm not sure we have the same definition of active, but your first post in this thread is from Feb.


Feb, October, same thing.


----------



## battleaxe

Quote:


> Originally Posted by *pdasterly*
> 
> temps are down a lot, 58c on vrm1. render test has been running for 30 min


Looks good. You should be all set at those temps.


----------



## pdasterly

I don't think VRM1 is being pushed hard enough. FurMark and OCCT would have temps in the 90s.


----------



## battleaxe

Quote:


> Originally Posted by *pdasterly*
> 
> I dont think the vrm1 is being pushed hard enough. Furmark and occt would have temp in the 90's


How hot does it run on BF4?

Those tests are only good for burning your card to smithereens and not much else; they don't really represent real-world scenarios. Even mining, my 290s never went anywhere near the temps you're quoting: 75°C was the most I've seen on VRM1. My water-cooled card, which uses an AIO H80 cooler on the core and air sinks on the RAM and VRMs, never goes above 62°C while scrypt mining. Those are the highest temps I have ever seen; BF4 is not as high.

So I don't see the point of going for maximum heat just for the sake of maximum heat. If you're good in BF4, or Valley, Heaven, etc., isn't that enough?


----------



## Talon720

http://www.overclock.net/t/1472202/cant-flash-other-bios-into-sapphire-290-tri-x I see some of these guys had the issue I'm asking about. There was never a follow-up or a solution saying they were able to do it. I wanna buy the 290X Tri-X ASAP, but I don't wanna get screwed by not being able to flash.


----------



## HoneyBadger84

I really don't see the point in burning the cards that way either. If you're not seeing anywhere near those temps in games/benchmarks, why bother with burn tests... unless you're outright trying to kill the card? lol


----------



## tsm106

Quote:


> Originally Posted by *Talon720*
> 
> http://www.overclock.net/t/1472202/cant-flash-other-bios-into-sapphire-290-tri-x I see some of these guys had the issue im asking about. There was never a follow up, or solution saying the were able to do it. I wanna buy the 290x tri-x asap, but dont wanna get screwed by not being able to flash


Isn't the tri-x a custom?


----------



## pdasterly

Good point


----------



## TheGoat Eater

My 290X Lightning is a beast in BF4 and I am loving it, but the thing I can't stand is that the latest beta driver from AMD still has broken fan control. I want to be able to increase the fan speed manually with Afterburner.


----------



## HoneyBadger84

Quote:


> Originally Posted by *TheGoat Eater*
> 
> MY 290X Lightning is a beast in BF4 and I am loving it - but the thing I can't stand is that the latest beta driver from AMD still has broken fan support. I want to be able to increase the fan manually with afterburner.


The last person I heard say their fan control was broken on their R9 290/290X had a FUBARed BIOS on their card; it wasn't the drivers at all. You might wanna look into that: compare the BIOS listed in GPU-Z with someone else who has a Lightning.


----------



## Blaise170

Quote:


> Originally Posted by *TheGoat Eater*
> 
> MY 290X Lightning is a beast in BF4 and I am loving it - but the thing I can't stand is that the latest beta driver from AMD still has broken fan support. I want to be able to increase the fan manually with afterburner.


I'm using Afterburner with the latest drivers and don't have an issue.


----------



## Talon720

Quote:


> Originally Posted by *tsm106*
> 
> Isn't the tri-x a custom?


No, it's a custom cooler on a reference PCB. I haven't heard a firsthand account of someone flashing the BIOS, specifically PT1. Everything I've heard has kinda been inferred: someone says they're going to flash it but never follows up. You'd think a reference PCB shouldn't matter; maybe it's human error, I need to know. I could always put the Tri-X UEFI BIOS on the reference card, I guess.


----------



## tsm106

Quote:


> Originally Posted by *Talon720*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Isn't the tri-x a custom?
> 
> 
> 
> No its a custom cooler on a reference pcb. I havnt heard a first hand account of someone flashing the bios specifically pt1. Everything ive heard is been kinda been inferred someone says they r going to flash it, but never follow up with it. Youd think ref pcb shouldnt matter, maybe its human error I need to know. I could always put the tri-x uefi bios on the ref card i guess.
Click to expand...

It is a custom, it just uses a reference layout.


----------



## sugarhell

Quote:


> Originally Posted by *Talon720*
> 
> No its a custom cooler on a reference pcb. I havnt heard a first hand account of someone flashing the bios specifically pt1. Everything ive heard is been kinda been inferred someone says they r going to flash it, but never follow up with it. Youd think ref pcb shouldnt matter, maybe its human error I need to know. I could always put the tri-x uefi bios on the ref card i guess.


Reference layout, with the same or lower component quality.


----------



## rdr09

Quote:


> Originally Posted by *kyismaster*
> 
> I got 200 points boosting 200 mhz , i updated to 14.6 but says im still on 14.4


That's prolly why your card is performing worse than a 290. I recommend reinstalling.

Edit: I ran Sky Diver using the old driver with my 290 at 1175/1500 and got a 42K graphics score.


----------



## Gobigorgohome

Quote:


> Originally Posted by *tsm106*
> 
> Ok, now you're just scaring the guy. It's gonna be hot no doubt but with bios 1, it shouldn't get very noisy, just hot. That said if blocks are coming you won't be a masochist for too long.


BIOS 1 = quiet, BIOS 2 = uber/performance? I will probably get blocks next week if everything works out with the RIVBE I'm getting tomorrow; the first one was RMA'd ...
Quote:


> Originally Posted by *the9quad*
> 
> I run 3 on air. It's very warm. I have to turn down the AC far enough everyone in the house complains, but I just put on my headphones and ignore them lol


There is no AC in this house, but I could just run one or two cards until I get water cooling. I just have to test the motherboard and everything on air before I mount the waterblocks.


----------



## wrigleyvillain

For BF4 players: the Test Range, or maybe even the CTE if you have Premium, is a great way to test for stability without crashing during a 'real' game. The Test Range is at least somewhat as taxing, especially if you can cause large explosions.

Crashes always seem to come at the worst time for me, too, like when I am driving a boat full of teammates or about to win an engagement.


----------



## Fade2Black

I'll add this club to my sig; there's proof in my build log. Who would've guessed monitors with HDMI and DVI inputs still come with VGA cables only?


----------



## Talon720

Quote:


> Originally Posted by *tsm106*
> 
> It is a custom, it just uses a reference layout.


Quote:


> Originally Posted by *sugarhell*
> 
> Ref layout, lower or the same components quality


Well, thanks for clearing that up, guys.







The "reference layout but not reference" thing is confusing, because it ends up like what MSI did, with larger components not fitting waterblocks. Maybe this is why the Tri-X can't have its BIOS flashed to PT1/PT3.


----------



## fateswarm

Has anyone noticed their 'power limit' doing nothing at all? I've put it at +50%, even when the clock was high enough for artifacts without more voltage, and it did nothing according to the voltage controller. I suspect that's normal, though, since the board or the card might just be getting all the power it requires anyway.

Voltage does wonders compared to it, though.


----------



## sugarhell

What do you expect from the power limit? If the card needs more power, it's going to get it. If you limit the card to 250W, then you will limit your performance based on how hard you push ALU/ACE engine usage.


----------



## pdasterly

Quote:


> Originally Posted by *Talon720*
> 
> Well thanks for clearing that up guys
> 
> 
> 
> 
> 
> 
> 
> The ref layout but not reference is confusing, because It ends up like what msi did with larger components not fitting waterblocks. Maybe this is why tri-x cant have the bios flashed to pt1/pt3.


Weird, 'cause I have reference cards and I run the Tri-X OC BIOS (1040/1300).


----------



## fateswarm

Quote:


> Originally Posted by *sugarhell*
> 
> What do you expect from the power limit.


I expected that the card was being throttled. I'm happy it does nothing, to be honest; it sounds like an unintuitive way to do under/overclocking.

I suspect it was meant to be a way to do auto-voltage, but they eventually decided (or someone in the loop did) to make it do nothing at all.

At least on my setup/system.


----------



## kyismaster

totes awesome 14.6


----------



## sugarhell

Quote:


> Originally Posted by *fateswarm*
> 
> I expected that the card was being throttled. I'm happy it does nothing to be honest, it sounds like an unintuitive way to do under/overclocking.
> 
> I suspect it was meant to be a way to do auto-voltage but they eventually decided (or someone in the loop did) to make it do nothing at all.
> 
> At least on my setup/system.


Eh? Why would the card throttle at stock?

The power limit lets you feed the PCB more power over the stock TDP, nothing else.


----------



## heroxoot

Quote:


> Originally Posted by *fateswarm*
> 
> Has anyone noticed their 'power limit' doing nothing at all? I've put it on +50% even when the clock was high enough for artifacts without more voltage and it did nothing according to the voltage controller. I suspect though it's normal since the board or the card might be just giving all the power it requires anyway.
> 
> Voltage does wonders though compared to it.


Mine does plenty. Remember, GPUs have vdroop just like CPUs. I've monitored my GPU under load at all-stock and at my OC, and the GPU voltage stays higher all around.


----------



## fateswarm

Quote:


> Originally Posted by *sugarhell*
> 
> eh? Why should the card throttle at stock?
> 
> Powerlimit lets you feed the pcb with more power over stock tdp nothing else


As I said, the power limit doesn't do anything at all for me. I closely monitored the output of the Tri-X's digital PWM controller in HWiNFO, and it's clear as day: the power limit, even at +50%, gives absolutely nothing more, even when the clock is high enough to produce artifacts.

A little more manual voltage, though, does wonders.


----------



## fateswarm

Quote:


> Originally Posted by *heroxoot*
> 
> Mine does plenty. Remember GPU have vdroop just like CPU. I've monitored my GPU during load on stock all and my OC and the gpu voltage stays higher all around.


Here it has no vdroop or any power change. I can monitor what is going on in HWiNFO. The Tri-X has a digital PWM controller (from International Rectifier, I don't remember the model) and it can report those things meticulously.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kyismaster*
> 
> 
> 
> totes awesome 14.6


Do a DDU wipe, then install 13.12 first, then install 14.6 over it.

Works well for me.


----------



## heroxoot

Quote:


> Originally Posted by *fateswarm*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> eh? Why should the card throttle at stock?
> 
> Powerlimit lets you feed the pcb with more power over stock tdp nothing else
> 
> 
> 
> As said, powerlimit doesn't let me do anything at all. I monitored closely the output of the digital PWM controller of the Tri-x on HWinfo and it's clear as day: The powerlimit even on +50% gives absolutely nothing more, even when the clock is high enough to give artifacts.
> 
> A little bit of more manual voltage though does wonders.
Click to expand...

If you use MSI AB, check the config file under settings for a line called StartupDelay; next to it should be 0. Change it to 10000. This will make sure MSI AB does not start up before CCC when Windows boots, so no conflicts occur.

Also, if adding voltage works, then the power limit increase is working. The card wouldn't be able to take much more than stock without the limit increase.
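For reference, the setting heroxoot describes lives in Afterburner's main configuration file (typically `MSIAfterburner.cfg` in the install directory). A minimal sketch of the edit, assuming a stock file layout; the exact section name and key casing are from memory and may differ between Afterburner versions:

```ini
; MSIAfterburner.cfg -- section/key names may vary slightly by version
[Settings]
; Milliseconds Afterburner waits before applying its startup settings.
; 0 = apply immediately; 10000 = wait 10 s so CCC finishes loading first.
StartupDelay = 10000
```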


----------



## fateswarm

Quote:


> Originally Posted by *heroxoot*
> 
> If you use MSI AB check in the config file under settings for a line called startupdelay. Next to it should say 0. Change it to 10000. This will make sure MSI AB does not start up before CCC when windows boots. This way no conflicts occur.


I had tried that and it didn't do anything; I may try again.

Though I do like avoiding the lag of CCC loading.


----------



## heroxoot

Quote:


> Originally Posted by *fateswarm*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> If you use MSI AB check in the config file under settings for a line called startupdelay. Next to it should say 0. Change it to 10000. This will make sure MSI AB does not start up before CCC when windows boots. This way no conflicts occur.
> 
> 
> 
> I had tried that and it didn't do anything. I may try again.
> 
> Though, I like that I avoid the lag of CCC loading.
Click to expand...

Yes, one problem I had in the past was that MSI AB would start before CCC, and CCC would screw with voltages and clocks. But yes, if you can increase the voltage then the power limit is definitely working, even if Overdrive in CCC does not reflect it. On 14.4 it would move the slider in CCC when I adjusted it in MSI AB, but I don't see that happening now. I still believe it is working, though, because I put my 290X at 1100 with +35mV just to see, and I saw the difference in voltage in GPU-Z. Without it, the VDDC will be around 1.164V or so at load, but with it increased I see 1.18V or so, I think. So yes, if you see this then it is most definitely working.


----------



## Talon720

Quote:


> Originally Posted by *pdasterly*
> 
> Weird cause I have reference cards and I run tri-x oc bios
> 1040/1300


Well, reference cards can be flashed with most BIOSes, whereas I thought non-reference/reference-layout cards had more limited BIOS compatibility. I only own reference cards myself; I just need to replace one.


----------



## fateswarm

Oh, the voltage setting works fine here; it has a real-life benefit for stability, and it shows up in HWiNFO. It's the power limit setting that appears to do nothing at all.

(I like it that way, and whoever made it work that way knew where it's at.)


----------



## Dasboogieman

Quote:


> Originally Posted by *Talon720*
> 
> No its a custom cooler on a reference pcb. I havnt heard a first hand account of someone flashing the bios specifically pt1. Everything ive heard is been kinda been inferred someone says they r going to flash it, but never follow up with it. Youd think ref pcb shouldnt matter, maybe its human error I need to know. I could always put the tri-x uefi bios on the ref card i guess.


I'm running the PT1T BIOS at the moment on my Tri-X 290. Granted, from what I've gathered, mine is one of the older batches. On the unlock forums, we had a guy with a new Tri-X unsuccessfully flash the PT1 BIOS; the BIOS string on the stock one is supposedly different due to altered memory timings. I suspect it's due to a new supply of Hynix GDDR5 that Sapphire is using now.


----------



## giygas

I just got a fancy Sapphire Vapor-X R9 290 Tri-X OC to go with a 144Hz monitor (ASUS VG248QE), and I'm having problems:







Often there are horizontal lines flickering on screen. I haven't noticed them in games, only in Windows, and they flicker the most when I'm moving my mouse. I haven't overclocked it or touched any of its clock/memory speeds at all. What could be causing the issue? It still happens at 60Hz. Also, I can hear a faint buzzing noise whenever I move my mouse. At first I thought my mouse was grinding along my mousepad or something, but it's coming from the card.


----------



## rdr09

Quote:


> Originally Posted by *giygas*
> 
> I just got me a fancy Sapphire Vapor-X R9 290 Tri-X OC to go with a 144Hz monitor (ASUS VG248QE) and problems
> 
> 
> 
> 
> 
> 
> 
> often there are horizontal lines flickering on screen. Haven't noticed it in games, only while in Windows and they flicker the most when I'm moving my mouse. I haven't overclocked it or touched any of its clock/memory speeds at all. What could be causing the issue? Still happens at 60Hz. Also, I can hear a faint buzzing noise whenever I move my mouse as well. At first I thought my mouse was grinding along my mousepad or something, but it's coming from the card.


similar issue . . .

http://www.overclock.net/t/1498744/sapphire-290-horizontal-flickers-on-screen/60

BTW, we have a Sapphire rep on board; his name is Vapor X.


----------



## heroxoot

Quote:


> Originally Posted by *giygas*
> 
> I just got me a fancy Sapphire Vapor-X R9 290 Tri-X OC to go with a 144Hz monitor (ASUS VG248QE) and problems
> 
> 
> 
> 
> 
> 
> 
> often there are horizontal lines flickering on screen. Haven't noticed it in games, only while in Windows and they flicker the most when I'm moving my mouse. I haven't overclocked it or touched any of its clock/memory speeds at all. What could be causing the issue? Still happens at 60Hz. Also, I can hear a faint buzzing noise whenever I move my mouse as well. At first I thought my mouse was grinding along my mousepad or something, but it's coming from the card.


Are you using MSI AB? If I don't set it to reset the display, I get weird lines and crazy-looking stuff when I move the mouse; setting it to reset the display fixes it for me. Even though I'm currently not overclocking, I have it do it anyway.


----------



## giygas

It's inconsistent, too. Sometimes it's there, sometimes not.
Quote:


> Originally Posted by *heroxoot*
> 
> Are you using MSI AB?


Nope, I don't have any overclocking utilities installed on my computer at all. Unless you count Catalyst, which I also haven't used.


----------



## heroxoot

Quote:


> Originally Posted by *giygas*
> 
> It's inconsistent, too. Sometimes it's there, sometimes not.
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Are you using MSI AB?
> 
> 
> 
> Nope, I don't have any overclocking utilities installed on my computer at all. Unless you count Catalyst, which I also haven't used.
Click to expand...

Not too sure, then. It must be a conflict between the refresh rate and the GPU. Try setting it to 60 or 120 and see if it still happens.


----------



## BradleyW

Quote:


> Originally Posted by *giygas*
> 
> I just got me a fancy Sapphire Vapor-X R9 290 Tri-X OC to go with a 144Hz monitor (ASUS VG248QE) and problems
> 
> 
> 
> 
> 
> 
> 
> often there are horizontal lines flickering on screen. Haven't noticed it in games, only while in Windows and they flicker the most when I'm moving my mouse. I haven't overclocked it or touched any of its clock/memory speeds at all. What could be causing the issue? Still happens at 60Hz. Also, I can hear a faint buzzing noise whenever I move my mouse as well. At first I thought my mouse was grinding along my mousepad or something, but it's coming from the card.


I'd try a different screen to see if the issue is coming from the screen or the PC. As for the noise you hear when moving the mouse, it's normal: it's the capacitors making noise from the GPU using a tiny bit of power to accelerate and output the cursor on screen. If you still get lines on a different screen, I'd suspect a software issue or an outright faulty card, most likely software. We will cross that bridge when we come to it, if we come to it.


----------



## SPLWF

I think I've maxed out my 290. I can't get the core any higher, and I also cannot get the memory beyond 1500MHz. I just wanted to thank everyone for helping me; I will be looking to get the Fujipoly thermal pads as well as GELID GC-Extreme to replace the stock Corsair TIM on my H55 in the future. Thanks again

Here are my mods:

Case: Corsair Carbide 500R White
CPU: Intel i5 3570k @4.2ghz
CPU Cooler: Corsair H100
Motherboard: ASUS Sabertooth Z77
RAM: 2x4GB G.Skill Sniper [email protected]
GPU: AMD Sapphire R9 290 + NZXT Kraken G10 + Corsair H55 + GELID ICY Vision VRM Heatsinks + Zalman ZM-RHS1 VRAM Heatsinks
GPU Core Clock: [email protected] Max Core Temp [email protected] Max on VRM1
GPU Memory Clock: 1500mhz (6ghz) [email protected] Max on VRM2
PSU: OCZ ZX 850
SSD: Samsung 830 128GB w/ Win 7 Pro x64
SSD: OCZ Agility 3 60GB (Steam Secondary for fast gaming cache)
HDD's: 2x Samsung F3 1TB
HDD: WD Caviar Black 3TB
HDD: WD Mainstream 1TB
Mouse: Logitech G400s
Keyboard: Cooler Master Storm Quick Fire XT Cherry MX RED
Fans: x7 Cougar 120mm/x1 Cougar 140mm
Lighting: NZXT 2m Sleeved White LED/x2 Bitfenix Alchemy 6inch White LED

CATZILLA STOCK 290



Core 1150/Mem at stock 1250



Core 1150/Mem 1500mhz (6ghz)



FFXIV STOCK 290



Core 1150/Mem at stock



Core 1150/Mem 1500


----------



## giygas

Quote:


> Originally Posted by *heroxoot*
> 
> Not too sure then. Must be a conflict between the refresh rate and the gpu. Try setting it to 60 or 120 and see if it still happens.


Quote:


> Originally Posted by *BradleyW*
> 
> I'd try a different screen to see if the issue is coming from the screen or the PC. As for the noise you hear when moving the mouse, it's normal. it's the capacitors making noise from the GPU using a tiny bit of power to accelerate and output the cursor on screen. If you still get lines on a different screen, I'd suggest a software issue or a direct faulty card. Most likely a software issue. We will cross that bridge when we come to it. If we come to it.


I don't think it's a faulty card; it just has some quirks, I suppose. I'm fine with it unless the flickering gets worse.


----------



## Forceman

Quote:


> Originally Posted by *wrigleyvillain*
> 
> For BF4 players--the Test Range or maybe even the CTE if you have Premium is a great way to test for stability without crashing during a 'real game'. TR is at least somewhat as taxing, especially if you can cause large explosions.
> 
> It's always at the worst time too it seems for me like when I am driving a boat full of teammates or about to win an engagement.


I like to use spectator mode - it seems more demanding than the Test Range, and you can just set it to first-person view and let it run while you go do something else.
Quote:


> Originally Posted by *fateswarm*
> 
> Has anyone noticed their 'power limit' doing nothing at all? I've put it on +50% even when the clock was high enough for artifacts without more voltage and it did nothing according to the voltage controller. I suspect though it's normal since the board or the card might be just giving all the power it requires anyway.
> 
> Voltage does wonders though compared to it.


Quote:


> Originally Posted by *fateswarm*
> 
> As said, powerlimit doesn't let me do anything at all. I monitored closely the output of the digital PWM controller of the Tri-x on HWinfo and it's clear as day: The powerlimit even on +50% gives absolutely nothing more, even when the clock is high enough to give artifacts.
> 
> A little bit of more manual voltage though does wonders.


I don't think power limit does what you think it does. Power limit doesn't let the card raise its voltage internally (it can only lower it), and it doesn't increase stability. All it does is allow the card to draw additional power if required. At 0% power limit the card will only draw the spec'd 250W (or whatever the actual number is); once your power draw starts to exceed that, the card throttles to stay under the limit. Raising the power limit raises the allowed total draw, so the card won't throttle. It isn't going to increase the voltage or help stability; all it does is keep the card from throttling. So if your card wasn't throttling in the first place (basically, if you didn't increase the voltage), it isn't going to do anything at all for you.
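As a rough illustration (not AMD's actual PowerTune algorithm), the throttling behavior described above can be sketched like this; the 250 W baseline and the linear clock-to-power model are assumptions:

```python
# Hypothetical sketch of power-limit throttling; not AMD's real algorithm.
def effective_clock(requested_mhz, draw_w, base_limit_w=250.0, power_limit_pct=0.0):
    """Scale the core clock down just enough to stay under the power cap."""
    limit = base_limit_w * (1.0 + power_limit_pct / 100.0)
    if draw_w <= limit:
        return requested_mhz              # under the cap: no throttling
    # crude assumption: power scales roughly linearly with core clock
    return requested_mhz * (limit / draw_w)

# At 0% the card throttles once draw would exceed 250 W...
print(effective_clock(1100, 300))                      # ~917 MHz
# ...while +50% raises the cap to 375 W and removes the throttle:
print(effective_clock(1100, 300, power_limit_pct=50))  # 1100
```

Note the slider never adds voltage in this model; it only decides where the clock gets cut.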


----------



## kyismaster

I dunno, but I'm happy with stock, I think.


----------



## Sonikku13

I'm debating whether to downgrade from my 290X and go with the A10-7850K I have until the 390X comes out. Basically, I want to sell my 290X now to fund the 390X purchase, since when the 390X comes out, 290X prices will probably tank. Any thoughts?


----------



## disintegratorx

Alright. I've pushed my card to its limits in an effort to play Far Cry 3 as smoothly as it can be played, and I've reached 1195/1571 with 175 extra mV, but the voltage set to 88 in MSI AB. I'm also only able to turn the power limit (PowerPlay) up to 34%. This will be my setting from now on, because it lets me stay under the voltage limit while still having enough power headroom for smoother gameplay.









Update: Make that 1195/1572. And this is exactly where I wanna be. All these nights of trying to dial this card in to an optimal setting have actually paid off.







This LCS from Powercolor is an excellent card to own. I hope they keep making them in the future.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Sonikku13*
> 
> I'm debating whether to downgrade from my 290X and go with the A10-7850K I have until the 390X comes out. Basically, I want to sell my 290X now to fund the 390X purchase, since when the 390X comes out, 290X prices will probably tank. Any thoughts?


Don't think they're gonna tank that hard. I just sold my well-cared-for 7970 for $215 shipped two weeks ago, and I bought it two years ago, lol.


----------



## Sonikku13

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sonikku13*
> 
> I'm debating whether to downgrade from my 290X and go with the A10-7850K I have until the 390X comes out. Basically, I want to sell my 290X now to fund the 390X purchase, since when the 390X comes out, 290X prices will probably tank. Any thoughts?
> 
> 
> 
> Dont think they're gonna tank that hard. I just sold my well cared for 7970 for $215 shipped 2 weeks ago, bought it 2 yrs ago lol
Click to expand...

I had to dump three 7970s at bargain basement prices, then again, they were abused a lot. My 290X is in better shape, the problem is my eBay account doesn't have enough feedback yet.


----------



## Blaise170

Quote:


> Originally Posted by *Sonikku13*
> 
> I had to dump three 7970s at bargain basement prices, then again, they were abused a lot. My 290X is in better shape, the problem is my eBay account doesn't have enough feedback yet.


Feedback isn't a huge deal. Buyers are protected so people will still bid on your stuff. I sold my Xbox 360 on there when I had 0 feedbacks.


----------



## heroxoot

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sonikku13*
> 
> I'm debating whether to downgrade from my 290X and go with the A10-7850K I have until the 390X comes out. Basically, I want to sell my 290X now to fund the 390X purchase, since when the 390X comes out, 290X prices will probably tank. Any thoughts?
> 
> 
> 
> Dont think they're gonna tank that hard. I just sold my well cared for 7970 for $215 shipped 2 weeks ago, bought it 2 yrs ago lol
Click to expand...

That's a good 50% loss on the original price tho. I'm kind of glad my 7970 turned into a 290X.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Sonikku13*
> 
> I had to dump three 7970s at bargain basement prices, then again, they were abused a lot. My 290X is in better shape, the problem is my eBay account doesn't have enough feedback yet.


Yeah I keep my hardware in pristine condition anymore specifically because of resale values. The Sapphire HD 7970 OC I sold literally looked brand new, other than some dust build up on the fan blades themselves I couldn't get off, other than that you literally couldn't tell it had been used because I use gloves to handle any video cards I'm changing out to avoid fingerprints & other issues.

Feedback doesn't matter too awefully much, it's all about if you're selling at a time when alot of people are buying. Case & point, 2 weeks ago, give or take, I got a "brand new" in box Gigabyte R9 290X WindForce edition for ~$360 shipped on EBay, and that was with about 6 different people bidding on it (sniping OP with 15s left), but I just saw another one recently go for more, and another a week earlier go for less. My VisionTek, who I call the bain of my existence, just sold today for $320 and some change plus shipping, so I'm taking a loss on it, but it's worth it to be rid of the worst video card I've ever owned (never buy that company's cards if you can avoid it, coolers don't work for crap)

EBay is a bit of a diceroll when it comes to selling, sometimes you hit when demand is good... and sometimes you only get one bid. lol
Quote:


> Originally Posted by *heroxoot*
> 
> That's a good 50% loss on the original price tho. I'm kind of glad my 7970 turned into a 290X.


Yeah, but it's a 50% loss in price on something that was new two years ago and has been used for two years. The impact will be smaller for more current, less-used hardware.


----------



## ZealotKi11er

Quote:


> Originally Posted by *fateswarm*
> 
> Oh the voltage setting works fine here and it has a real life benefit on stability and it's printed on HWInfo. It's the power limit settings that appear to do nothing at all.
> 
> (I like it that way, and whoever made it work that way, knew where it's at.)


Let me give you an example of what power limit does. Take mining. When I used to do it, it would really push the card hard. At stock clocks and 0% power limit it could do 1000/1250 no problem. When I OCed the memory to 1500 I had to add +50mV to maintain stability, which made the card draw more power. Because there was a limit to how much power the card could draw, it started lowering the core clock: instead of 1000/1500 @ 0% power limit, I was getting ~950MHz. Then I could slowly add +5% at a time, and had to go to +15% to get the 100MHz back. In games you might not hit the limit as easily, but if you push +100mV or even +200mV you will have to increase the power limit or your card will downclock. Keep in mind this is under water, where temps have no effect at all.
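The arithmetic behind those numbers can be sanity-checked quickly (again assuming power tracks core clock roughly linearly):

```python
# If a 1000 MHz target throttles to ~950 MHz at 0% power limit,
# the draw exceeds the cap by roughly this fraction:
requested, throttled = 1000, 950
overshoot = requested / throttled - 1
print(f"{overshoot:.1%}")  # ~5.3%, so +5% is borderline and +15% has headroom
```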


----------



## dr0thegreatest

I try to avoid selling on eBay as much as possible. It really doesn't favor sellers; I've had some really bad experiences on it.


----------



## Dasboogieman

Quote:


> Originally Posted by *dr0thegreatest*
> 
> I try to avoid ebay to sell as much as possible. It really doesnt favor sellers, have had really bad experiences on it.


I generally avoid electrical items or things with moving parts on eBay. Stuff like OEM fans or heatsinks, which are next to impossible to screw up, is fine. The only electronic thing I've ever bought was an i7-920XM QS; the laptop that housed it died shortly after.


----------



## KeepWalkinG

this is my new card








Can I join the club?



----------



## fateswarm

Quote:


> Originally Posted by *Forceman*
> 
> With 0% power limit the card is only going to draw the spec'd 250W (or whatever the actual number is)


If that part turns out to be true, it's the full explanation on its own. It would mean "safe" voltage settings here can't get anything out of the power-limit setting. It's likely the Tri-X has a limit high enough that anything at safe-ish voltages never needs that setting changed at all.


----------



## SPLWF

I am also in the 290 club now

http://s688.photobucket.com/user/BoostedSupraMK3/media/20140701_202412_zps4b5cbe60.jpg.html


----------



## wrigleyvillain

Quote:


> Originally Posted by *Forceman*
> 
> I like to use spectator mode - seems more demanding than Test Range, and you can just set it in first person view and let it run while you go do something else.


Ah, I'm sure it is, and that's a better idea that hadn't occurred to me, I guess because I've never had any other reason to try spectator mode and never think about it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *fateswarm*
> 
> If that part turns out to be true, it is the full explanation on its own. It makes "safe" voltage settings here not being able to get anything out of the power setting. It's likely that Tri-X has a limit high enough that anything at safe-ish voltages need not to change that setting at all.


Could easily be the case. I think the 290/290X stock limit is 300W, and +50% takes it to 450W. Also, like I said, the game you're playing might not push the card internally to 100%, so you're not actually hitting the power-limit wall. When using MSI AB, does your clock speed stay constant at your set clocks throughout the game?
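Assuming that 300 W stock figure is right, the slider math is straightforward:

```python
# Power-limit slider as a multiplier on an assumed 300 W stock cap.
stock_limit_w = 300
for pct in (0, 20, 50):
    print(f"+{pct}% -> {stock_limit_w * (1 + pct / 100):.0f} W")
# +0% -> 300 W, +20% -> 360 W, +50% -> 450 W
```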


----------



## JeremyFenn

Quote:


> Originally Posted by *KeepWalkinG*
> 
> this is my new card
> 
> 
> 
> 
> 
> 
> 
> 
> Can I join the club?
> 
> 


I just bought that card too !!! Runs great for everything I do, I'll put pics up of mine after work. Sapphire VAPOR-X Tri-X R9 290x OC edition FTW !!!


----------



## fateswarm

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Could easily be the case. I think 290/290X Stock limits is 300W and with 50% goes to 450W. Also like i said the game you are playing might not push the card internally to 100% so you really are not hitting the power limit wall. When using MSI AB do you clock speed stay constant at your set clocks throughout the game?


I've run Valley with nasty settings to drop the fps. It appears I can't easily go above 214W + 35W (VRAM). That's according to the digital IR3567B controller on the card, read by HWiNFO, and I'd consider it accurate.

Since I try to keep the voltage at safe settings, it appears I wouldn't get any advantage out of that setting unless I up the voltage for a higher OC.


----------



## sugarhell

Try a good graphics engine that isn't 100% shader-bound. Try Frostbite, so you can push the ALUs close to 100% usage and measure real power consumption.


----------



## Germanian

Best driver yet: 14.6 RC2.
Running 2x R9 290, overclocked or underclocked, it works out of the box. So far zero problems in all my games, and very nice FPS in Wildstar.
It's CPU-bound with a 4770K; you need to overclock the CPU like mad.









I went from 13.12 > 14.4 beta > 14.4 WHQL > 14.6 RC2

If anyone is interested in undervolting for less power consumption: I am running 860 core / 1200 VRAM at -44mV core voltage (you can go even lower, test stability), power limit +20%, and a custom fan curve that maxes out at 50%, which it almost never hits thanks to the lower power consumption from undervolting.

The reason for undervolting is that I live in California with ambients up to 33°C now that it's summer, and the reference coolers on the R9 290 just get too loud.
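For a rough sense of why a small undervolt helps so much, dynamic power in CMOS scales roughly with f·V². The 1.15 V stock voltage and 947 MHz reference clock below are my assumptions; only the -44 mV and 860 MHz figures come from the settings above:

```python
# Rough f*V^2 dynamic-power scaling; stock voltage/clock are hypothetical.
stock_v, stock_mhz = 1.15, 947          # assumed reference R9 290 values
uv_v, uv_mhz = 1.15 - 0.044, 860        # -44 mV and 860 MHz as posted

ratio = (uv_mhz / stock_mhz) * (uv_v / stock_v) ** 2
print(f"~{(1 - ratio) * 100:.0f}% less dynamic power")  # ~16%
```

Real savings vary (leakage, board losses), but the quadratic voltage term is why modest undervolts tame reference-cooler noise.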


----------



## sugarhell

Quote:


> Originally Posted by *Germanian*
> 
> best driver yet 14.6 RC2
> Running 2x R9 290 overclocked and underclocked it works out of the box. So far all my games 0 problems and Wildstar very nice FPS.
> CPU bound by 4770K you need to overclock CPU like mad
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I went from 13.12 > 14.4 beta > 14.4 WHQL > 14.6 RC2
> 
> if anyone is interested in undervolting for less power consumption i am running 860 core 1200 VRAM -44mv core voltage (you can go even lower, test stability), power limit +20%, custom fan curve which hits maximum 50%, which almost never happens due to the lower power consumption from undervolting.


You have a stock 4770K with CrossFire. Multi-GPU adds a bit of CPU usage. OC to at least 4.5GHz.


----------



## Rainmaker91

Although I will likely wait for the next generation of cards to upgrade, I wanted to ask what you all think of Eyefinity. I'm asking here since it's more commonly run off high-end cards than lower-end ones. How much power is actually needed to run it in practice? I figure I would need CrossFire or a 295X2 to run larger setups without a problem. How much power is really needed for a 3x 1080p setup? And how much for 6x 1080p? Would an 8GB card help at such large resolutions, or is 4GB enough?


----------



## HoneyBadger84

Quote:


> Originally Posted by *Rainmaker91*
> 
> Although I will likely wait for the next generation cards to upgrade I wanted to ask you all what you think of eyefinity. I ask here since it's more common to run it of high end cards rather then lower end ones. How much powr is actually needed to run it in reality? I figure In would need crossfire or a 295x2 to run larger setups without a problem. How much power is really needed for a 3x 1080p setup? and how much is needed for 6x 1080p? would an 8gb card help in such large resolutions or is 4gb enough?


Depends on what games you want to run. Even an R9 295X2 or CrossFire 290Xs will struggle at or above 4K resolution if you want to max out the newest games like Watch_Dogs, Crysis 3, etc., and Eyefinity resolutions are at or near 4K (3x 1080p is pretty much 3/4 of 4K, give or take a few pixels). In my experience, CrossFire is usually enough to run current-gen games at decent FPS on high/maximum settings at 3x 1080p. By decent I don't mean 60; I mean keeping the average over 30 most of the time.
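The "3x 1080p is pretty much 3/4 of 4K" claim checks out exactly on pixel count:

```python
# Pixel counts: triple-1080p Eyefinity vs 3840x2160 UHD "4K".
eyefinity = 3 * 1920 * 1080   # 6,220,800 pixels
uhd_4k = 3840 * 2160          # 8,294,400 pixels
print(eyefinity / uhd_4k)     # 0.75 exactly
```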


----------



## Rainmaker91

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Depends on what games you're wanting to run. Even an R9 295x2 & crossfire 290Xs will struggle if you're going up to and above 4K resolution if you want to max out the newest games like Watch_Dogs, Crysis 3 etc, and Eyefinity resolutions are at or near the reso of 4K (3x 1080p is pretty much 3/4s of 4K, give or take a few pixels). In my experience, crossfire is usually enough to run current-gen games at decent FPS on high/maximum settings at 3x 1080p. By decent I don't mean 60, I mean keeping the average over 30 most of the time.


Hmm, I was thinking of getting a couple of R9 285X cards when they're released, since they'll supposedly be more or less 290s in performance. I just had to replace my monitor, and got a Dell UltraSharp U2414H, so the thought of Eyefinity came to mind with that super-thin bezel (just 6mm). If I could build a dream computer I would have six of them, but that will most likely never be practical to set up, or affordable.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Rainmaker91*
> 
> hmm, I was thinking of getting a couple of r9 285x cards when they get released since they will more or less be 290s in performance. I just had to replace my monitor with a Dell Ultrasharp U2414h so the thought of eyefinity just came to mind with that super thin frame (just 6mm). If I could build a dream computer I would have six of them but that will most likely never be practical to set up or when it comes to price.


Yeah, that's a bit out there :-D Never hurts to think about and dream though. I'm still gaming on my (now getting pretty old) 52" Sony LED LCD HDTV for the most part; it feels pretty immersive with surround headphones and a screen that big when you're walking around a game like Metro Last Light or Watch_Dogs that really makes you feel almost there. I do sometimes wish Watch_Dogs had a first-person option though...

If the R9 285X does come out with that kind of performance, it would most likely mean the R9 290 gets phased out and becomes even cheaper used than it is now, which is pretty freakin' cheap TBH. You just have to avoid mining cards if you hunt on eBay; you can find a pretty good deal, just grill the seller... or buy on the marketplace here and you'll know you're getting a good card, and you can ask whatever you want to know about it and get actual answers from someone who most likely knows what they're doing (on eBay you could be talking to a braindead moron; on here, it's mostly tech-savvy folk).


----------



## Rainmaker91

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Yeah that's a bit out there :-D Never hurts to think about & dream though. I'm still gaming on my (now getting pretty) ol' 52" Sony LED LCD HDTV for the most part, it feels pretty immersive with surround headphones and a screen that big when you're walking around a game like Metro Last Light or Watch_Dogs that really makes you feel almost there. I do sometimes wish Watch_Dogs had a first person option though...
> 
> If the R9 285X does come out and has that performance as you say, it would mean that, most likely the R9 290 will be phased out & be cheaper used than it is now, which is pretty freakin' cheap TBH. You just have to avoid mining cards if you hunt on EBay, you can find a pretty good deal, just grill the seller... or buy on the marketplace here & you'll know you're getting a good card and you can ask whatever you wanna know about it and get actual answers from someone that most likely actually knows what they're doing (vs on EBay you could be talking to a braindead moron, on here, it's mostly tech-savvy folk)


The 285X is just a rumor I heard. I'll most likely just get the next generation of cards when it comes. The thought of expanding with a couple of 280X cards alongside my current 7950 has also crossed my mind, so I'll see when the time comes. I'll be making major changes to my build in about half a year, when I get back from a semester abroad, so I'll see what has hit the market by then. One thing is for sure: I will be running a full open loop in the future. I figure a 360 rad, a 240, and a 140 rad should keep my PC's temps low. If they don't, I'll just modify the case even more to fit a 360 in the front.


----------



## JeremyFenn

Hey can I be in the club too?







Caelus is now sporting a sexy SAPPHIRE VAPOR-X R9 290X 4GB GDDR5 TRI-X OC (UEFI).


----------



## hwoverclkd

Quote:


> Originally Posted by *JeremyFenn*
> 
> I just bought that card too !!! Runs great for everything I do, I'll put pics up of mine after work. Sapphire VAPOR-X Tri-X R9 290x OC edition FTW !!!


Curious, how far could you overclock that badass Sapphire 290X? I assume this is the card whose base clock is 1080MHz, correct?


----------



## heroxoot

Quote:


> Originally Posted by *acupalypse*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JeremyFenn*
> 
> I just bought that card too !!! Runs great for everything I do, I'll put pics up of mine after work. Sapphire VAPOR-X Tri-X R9 290x OC edition FTW !!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> curious, how far could you overclock that badass sapphire 290x? I assume this is the card whose base clock is at 1080, correct?
Click to expand...

Probably as far as most other cards.


----------



## JeremyFenn

I haven't OC'd this card yet. Yes it's the 1080 OC edition.


----------



## Dan848

Quote:


> Originally Posted by *Sonikku13*
> 
> I'm debating whether to downgrade from my 290X and go with the A10-7850K I have until the 390X comes out. Basically, I want to sell my 290X now to fund the 390X purchase, since when the 390X comes out, 290X prices will probably tank. Any thoughts?


Normally, high-end AMD card prices remain high for 9 to 18 months after the release of the next-gen card. This may still be a very poor time to sell your 290X, though, because Bitcoin miners have been dumping thousands of 290 and 290X cards onto the used market. You'd lose a lot of money selling now; might as well wait.

I don't know about you, but I skip a generation. Normally there isn't a big enough jump in performance in a single generation for me to buy a new video card.


----------



## JeremyFenn

I'll give you a hint: the last video cards I owned were ASUS NVIDIA GTS 450 TOP editions (900MHz factory OC). I waited a hot minute, then got the 7770s; then the R series came out, I got a few R7 260Xs, and put an R9 280X in my Caelus. Unfortunately I had tessellation issues with that card and swapped it for a GTX 770, but I really felt I wanted an AMD in my rig, so I swapped that for this sexy R9 290X. I gotta say I'm NOT disappointed switching back from NVIDIA to AMD.


----------



## Arizonian

Quote:


> Originally Posted by *KeepWalkinG*
> 
> this is my new card
> 
> 
> 
> 
> 
> 
> 
> 
> Can I join the club?
> 
> 


Congrats - added







Please go back and add GPU-Z validation with OCN name open to validation tab.

Quote:


> Originally Posted by *SPLWF*
> 
> I am also in the 290 club now
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s688.photobucket.com/user/BoostedSupraMK3/media/20140701_202412_zps4b5cbe60.jpg.html


Congrats - added







I guessed your manufacturer and card. Please go back and add GPU-Z validation with OCN name open to validation tab.
Quote:


> Originally Posted by *JeremyFenn*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hey can I be in the club too?
> 
> 
> 
> 
> 
> 
> 
> Caelus is now sporting a sexy SAPPHIRE VAPOR-X R9 290X 4GB GDDR5 TRI-X OC (UEFI).


Congrats - added







Please go back and add GPU-Z validation with OCN name open to validation tab.

Per OP reminder:

To be added on the member list please submit the following in your post

1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
2. Manufacturer & Brand - Please don't make me guess.
3. Cooling - Stock, Aftermarket or 3rd Party Water

Please note if you do not provide manufacturer, brand or cooling - by default Sapphire Stock Cooling will be chosen for you.


----------



## JeremyFenn

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> Please go back and add GPU-Z validation with OCN name open to validation tab.
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> I guessed your manufacture and card. Please go back and add GPU-Z validation with OCN name open to validation tab.
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> Please go back and add GPU-Z validation with OCN name open to validation tab.
> 
> Per OP reminder:
> 
> To be added on the member list please submit the following in your post
> 
> 1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
> 2. Manufacturer & Brand - Please don't make me guess.
> 3. Cooling - Stock, Aftermarket or 3rd Party Water
> 
> Please note if you do not provide manufacturer, brand or cooling - by default Sapphire Stock Cooling will be chosen for you.


Sorry bout that.

http://www.techpowerup.com/gpuz/2vqxb/


----------



## Arizonian

Quote:


> Originally Posted by *JeremyFenn*
> 
> Sorry bout that.
> 
> http://www.techpowerup.com/gpuz/2vqxb/


Thanks. No worries. Added the X to your 290.









Updated.


----------



## JeremyFenn

Quote:


> Originally Posted by *Arizonian*
> 
> Thanks. No worries. Added the X to your 290.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Updated.


Thanks. BTW, it's a SAPPHIRE VAPOR-X R9 290X 4GB GDDR5 TRI-X OC (UEFI) with Vapor-X cooling. I guess technically that could be considered stock, but it's not AMD's reference design; it's its own monster.


----------



## heroxoot

MSI came through for me. Took a couple of weeks, but bam, working fan control.



Seems it was my batch of 290Xs that had the problem.


----------



## Fade2Black

The CX650 comes with 2 PCIe cables, each of which can be 2x 6-pin or 2x 8-pin. What is the correct way to use them? My 290 has an 8-pin and a 6-pin input, and I don't know whether I should use one cable for both connections or two separate cables.


----------



## Mega Man

use both


----------



## heroxoot

Quote:


> Originally Posted by *Fade2Black*
> 
> the cx650 comes with 2 pcie cables that each can be 2x 6pin or 2x 8pin. what is the correct way to use them? my 290 has an 8pin and a 6pin input and i don't know whether i should use one cable on both connections or two cables for both of the connections


Either works. I have all 8-pin connectors, and I peel the extra 2 pins back on one to get it into my card. Whoever thought it was a good idea to put the 6-pin on the right side was an idiot; it would have been way easier on the left of the connectors. I had to pull the 2 pins back to make a 6-pin a little more than I liked.


----------



## SPLWF

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> Please go back and add GPU-Z validation with OCN name open to validation tab.
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> I guessed your manufacture and card. Please go back and add GPU-Z validation with OCN name open to validation tab.
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> Please go back and add GPU-Z validation with OCN name open to validation tab.
> 
> Per OP reminder:
> 
> To be added on the member list please submit the following in your post
> 
> 1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
> 2. Manufacturer & Brand - Please don't make me guess.
> 3. Cooling - Stock, Aftermarket or 3rd Party Water
> 
> Please note if you do not provide manufacturer, brand or cooling - by default Sapphire Stock Cooling will be chosen for you.


Sorry. here you go

http://www.techpowerup.com/gpuz/4wrcy/

And specs:

GPU: AMD Reference Sapphire R9 290 + NZXT Kraken G10 + Corsair H55 + GELID ICY Vision VRM Heatsinks + Zalman ZM-RHS1 VRAM Heatsinks


----------



## Rainmaker91

Quote:


> Originally Posted by *Fade2Black*
> 
> the cx650 comes with 2 pcie cables that each can be 2x 6pin or 2x 8pin. what is the correct way to use them? my 290 has an 8pin and a 6pin input and i don't know whether i should use one cable on both connections or two cables for both of the connections


It all depends on the PSU. I can't say I'm familiar with the CX650; the only ones I can find with similar names are the Corsair CX600 and TX650. If it's one of those, it won't matter, since they both have a single 12V rail and should deliver all the power needed to the PCIe plugs. If it's another model, I can't say for sure. It's all about rails: if it turns out your PSU has split rails, a single cable could deliver less power to your card, and you should use two dedicated cables to be sure of getting enough.

In theory every 8-pin and 6-pin should supply the required amount of power, but it varies with the quality of the PSU, which is why I suggest splitting across two cables if it has several rails instead of a single one.
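For reference, the PCIe specification's per-connector budgets make the 8-pin + 6-pin layout add up like this:

```python
# PCIe spec power budget for a card with one 8-pin and one 6-pin input.
budget_w = {"x16 slot": 75, "6-pin": 75, "8-pin": 150}
print(sum(budget_w.values()))  # 300 W total by spec
```

Cards can and do pull past these figures when overvolted, which is why cable and rail quality matters.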


----------



## PCSarge

so...bored....must play...games...


----------



## mojobear

Hey guys,

For all those wondering how much power a 290 stock uses...I am in a pretty good position to give u a good estimate. I have 4 290s in my system. 3 290s running from an ax1200 (which also does cpu/everything else) and my other psu cx600M is only running the last 290...thus my watt meter can isolate the cx600m feeding the lone 290. Of course the only power I'm not taking into account is the PCIE from the mobo...but for what I hear and see re: 1x PCIE (non-powered) running 290s fine while mining...the 290s do not need much power from the PCIE bus....please correct me if wrong guys.

With that intro and caveat..this is what I get...

Idle: 20-25W (with ULPS off)...actually might turn it on and see what diff in idle power consumption actually is








For 3DMark Fire Strike Extreme: a stock 290 pulls about 200-210W max from the CX600M under load (with the fan @ 70%, so the 290 doesn't throttle).
For 3DMark Fire Strike Extreme: an OC'd 290 (1160/1375, +87mV) pulls about 260-270W from the CX600M under load (again fan @ 70%, no throttling).

Hope this bit of info helps people get a better idea of the 290s. There's a notion floating around that 290s chew up 300-400W at stock; I really don't think that's the case.


----------



## chiknnwatrmln

Don't forget about PSU efficiency. I take it those measurements were taken at the wall? Assuming 80% efficiency, that's about 200 x 0.8 = 160W and 260 x 0.8 = 208ish watts DC.

Also, I think the PCIe slot may provide more power than you think; I vaguely remember calculating a draw of 180ish watts at stock.

290/290Xs also have a tendency to use a lot of power when you increase the voltage.
Quote:


> Originally Posted by *chiknnwatrmln*
> 
> In regards to power usage I don't really believe that 2 290x's will draw 900w...
> 
> My entire rig with my 290 at full tilt (we're talking more than +200mV here, so some pretty serious OC'ing for water) draws around 560w max at the wall. This is include my 3770k on 1.4v, a water pump, 10 fans, etc. Now with the 92% efficiency of my PSU that means my PSU is outputting ~515w DC current.
> 
> Now GPU-z says that my card is pulling around 350w max, give or take a few watts due to software inaccuracies. But if this is anywhere near right, that means everything else in my rig is using 165 watts. This makes sense because an OC'ed 3770k is around 100w TDP (77w stock), if I'm at 75% usage that around 75w, leaving 90 watts for my fans, HDD, SSD's, pump, lights, RAM, mobo, etc.
> 
> So pretty much the most my ref 290 draws when OC'ed to the max is 350w, keep in mind that's only for spikes so most of the time it's under 330w. Keep in mind this is under water, but I'd expect a similarly OC'ed 290x to draw max of 370ish watts.
> 
> So, going off these estimations and simple math, 2 290x's OC'ed for suicide benchmark runs will draw roughly 750 watts maximum.
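The wall-to-DC arithmetic above is just a multiplication; the 80% efficiency figure is the assumption being made (real units vary with load):

```python
# Convert wall (AC) readings to DC output using an assumed efficiency.
def dc_output_w(wall_w, efficiency=0.80):
    return wall_w * efficiency

print(dc_output_w(200))  # 160.0 W DC
print(dc_output_w(260))  # ~208 W DC
```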


----------



## HeliXpc

Quick question: would a tiny chip in the power-phase coils affect anything? They're on the corner, one on a memory coil and one on a GPU power-delivery coil. I don't know how it happened, but it did. Wondering...


----------



## chiknnwatrmln

Do you mean on the VRMs? If so, I would think not... if it still works, it should be fine.

Even if one of the main VRM phases goes out, the others will pick up the slack. Granted, OCs will be limited and the remaining phases will be more strained, but I remember reading about a guy who had one phase blow out and his card still worked.


----------



## HeliXpc

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Do you mean on the VRM's? If so, I would think not... If it still works, then it should be fine.
> 
> Even if one of the main VRM's goes out, the others will pick up the slack. Granted, OC's will be limited and the others will be more strained, but I remember reading about some guy who had one VRM blow out and his card still worked.


Yeah, it's a small surface crack. I don't know how the electronics of the card work; I was wondering if anyone here is an expert and can let me know if I should bother using the card at all...


----------



## Red1776

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Don't forget about PSU efficiency. I take it those measurements were taken at the wall? Assuming 80% efficiency, then that's about 200 x .8 = 160w and 260 x .8 = 210ish watts DC.
> 
> Also, I think the PCIe may provide more power than you think. I vaguely remember calculating a draw of 180ish watts when stock.
> 
> 290/x's also have a tendency to use a lot of power when you increase the voltage.
> Quote:
> 
> 
> 
> Originally Posted by *chiknnwatrmln*
> 
> In regards to power usage I don't really believe that 2 290x's will draw 900w...
> 
> My entire rig with my 290 at full tilt (we're talking more than +200mV here, so some pretty serious OC'ing for water) draws around 560w max at the wall. This is include my 3770k on 1.4v, a water pump, 10 fans, etc. Now with the 92% efficiency of my PSU that means my PSU is outputting ~515w DC current.
> 
> Now GPU-z says that my card is pulling around 350w max, give or take a few watts due to software inaccuracies. But if this is anywhere near right, that means everything else in my rig is using 165 watts. This makes sense because an OC'ed 3770k is around 100w TDP (77w stock), if I'm at 75% usage that around 75w, leaving 90 watts for my fans, HDD, SSD's, pump, lights, RAM, mobo, etc.
> 
> So pretty much the most my ref 290 draws when OC'ed to the max is 350w, keep in mind that's only for spikes so most of the time it's under 330w. Keep in mind this is under water, but I'd expect a similarly OC'ed 290x to draw max of 370ish watts.
> 
> So, going off these estimations and simple math, 2 290x's OC'ed for suicide benchmark runs will draw roughly 750 watts maximum.

I have 4 x R9 290Xs powered by an AX1200 and 2 x FSP 500W (1000W total), and accounting for the difference between the 290s and the 290Xs, this is in line with my draw (a bit higher, as the AX1200 is around 90% efficient).


----------



## tsm106

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Don't forget about PSU efficiency. I take it those measurements were taken at the wall? Assuming 80% efficiency, then that's about 200 x .8 = 160w and 260 x .8 = 210ish watts DC.
> 
> Also, I think the PCIe may provide more power than you think. I vaguely remember calculating a draw of 180ish watts when stock.
> 
> 290/x's also have a tendency to use a lot of power when you increase the voltage.
> Quote:
> 
> 
> 
> Originally Posted by *chiknnwatrmln*
> 
> *In regards to power usage I don't really believe that 2 290x's will draw 900w...*
> 
> My entire rig with my 290 at full tilt (we're talking more than +200mV here, so some pretty serious OC'ing for water) draws around 560w max at the wall. This is include my 3770k on 1.4v, a water pump, 10 fans, etc. Now with the 92% efficiency of my PSU that means my PSU is outputting ~515w DC current.
> 
> Now GPU-z says that my card is pulling around 350w max, give or take a few watts due to software inaccuracies. But if this is anywhere near right, that means everything else in my rig is using 165 watts. This makes sense because an OC'ed 3770k is around 100w TDP (77w stock), if I'm at 75% usage that around 75w, leaving 90 watts for my fans, HDD, SSD's, pump, lights, RAM, mobo, etc.
> 
> So pretty much the most my ref 290 draws when OC'ed to the max is 350w, keep in mind that's only for spikes so most of the time it's under 330w. Keep in mind this is under water, but I'd expect a similarly OC'ed 290x to draw max of 370ish watts.
> 
> So, going off these estimations and simple math, 2 290x's OC'ed for suicide benchmark runs will draw roughly 750 watts maximum.

From 11/7/2013 in this very thread... I run dual PSUs so I can separate the cards and the rest of the equipment into configurations where I can isolate the power draw for measurement, so I don't have to guess at their draw. Looks like 900w to me.

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *brazilianloser*
> 
> Yeah way back a few pages but thanks for the answers... I have an ax860 as well... I did find a review for crossfire 2 way Crossfire Power Consumption In full stress they only managed to pull 603w out of their system... if that proves to be true a "non problematic" ax860 should handle it just fine... now if you have a lot of extras maybe not.
> 
> 
> 
> Word of advice, never trust consumption numbers measured from the wall from guru. They are always extremely low. They would fit right in with the guys from our psu forum lol. And using furmark to measure draw is silly too since it has been a throttled app for years now. You need real world workload and max fan on the stock cooler. For ex, look below. Consider also that the number I got below is w/o accessories, loop, drives, fans, pumps, etc. Then imagine when it is run on the uber bios or unlock bios and overclocked with anger!
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> 
> 
> *Two cards on psu with my cpu and nothing else connected, ie. no drives, no loop, nothing else.* Running 3dm11 extreme at stock with +50 on air, so it's throttling a bit as expected. 903w x .89 at that wattage = *803w.* This number should blow up with blocks and serious overclocking. Ya need to remember that on the stock bios PT is trying to keep the tdp of the cards very very low.
> 
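Part of the gap between the two estimates is what happens above stock voltage: to a first order, dynamic power scales linearly with clock and with the square of core voltage. A hedged sketch of that model follows; the stock draw, clock, and voltage below are assumed round figures for a reference 290X, not measured values:

```python
# First-order CMOS dynamic power model: P ~ P0 * (f/f0) * (V/V0)^2.
# Shows why a +150mV, +200MHz overclock balloons the card's draw.

def scaled_power(p0: float, f0: float, v0: float, f: float, v: float) -> float:
    """Estimate draw at clock f and voltage v, given stock draw p0 at f0/v0."""
    return p0 * (f / f0) * (v / v0) ** 2

stock_w, stock_mhz, stock_v = 290.0, 1000.0, 1.20  # assumed stock draw/clock/volts

oc = scaled_power(stock_w, stock_mhz, stock_v, f=1200.0, v=1.35)
print(f"estimated OC draw: {oc:.0f} W per card")  # ~440 W
```

Two cards at that kind of overclock land right in the 800-900w range tsm106 measured, even though each card is "only" ~290w at stock.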


----------



## sugarhell

That was fast


----------



## tsm106

Quote:


> Originally Posted by *sugarhell*
> 
> That was fast


Faster than you. Even Red got you.


----------



## sugarhell

Quote:


> Originally Posted by *tsm106*
> 
> Faster than you. Even Red got you.


Just woke up, that's why tsm


----------



## chiknnwatrmln

Quote:


> Originally Posted by *tsm106*
> 
> 
> From 11/7/2013 in this very thread... I run dual psu so I can isolate cards and equipment into configurations where I can isolate the power draw for measurement so I don't have to guess at their power draws. Looks like 900w to me.


With my 290 at stock 290X speeds and +50 PL, my entire rig plus one monitor pulls 410 watts max from the wall. Factoring in the 15-watt monitor and the 92%-efficient PSU, that's not even 370w DC for my entire PC with half a dozen fans, a water pump, etc.

I took this measurement playing a heavily modded Skyrim, as that usually gives me the highest power use, higher than Heaven, 3DMark 11, and even Furmark; all of those gave me ~370ish at the wall total. I understand that I have a 290 and you're talking about 290Xs, but there's only a ~10% difference in cores, so at the same speeds there shouldn't be that big a difference. There's also the whole hotter-chips-draw-more-power thing, but even when my card was on the stock blower the power usage was only 10 or 15 watts more, if memory serves.

I'm not doubting your findings, just trying to figure out why we're getting such different results. 800 watts for two cards throttling at stock speeds seems like a lot.
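The measurement above reduces to the same conversion as before; this just reproduces the poster's arithmetic, assuming the monitor shares the wall meter and the 92% efficiency figure holds at this load:

```python
# Back-of-envelope check of the 410w wall reading quoted above.
wall_total = 410.0   # whole rig plus monitor, at the wall
monitor = 15.0       # monitor's share of the wall reading
efficiency = 0.92    # PSU efficiency, assumed constant at this load

pc_dc = (wall_total - monitor) * efficiency
print(f"PC DC draw: {pc_dc:.1f} W")  # 363.4 W, i.e. "not even 370w DC"
```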


----------



## Red1776

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> That was fast
> 
> 
> 
> Faster than you. Even Red got you.

Heeeey.....

whats "even Red got you" supposed to mean?!


----------



## KeepWalkinG

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> Please go back and add GPU-Z validation with OCN name open to validation tab.
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> I guessed your manufacture and card. Please go back and add GPU-Z validation with OCN name open to validation tab.
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> Please go back and add GPU-Z validation with OCN name open to validation tab.
> 
> Per OP reminder:
> 
> To be added on the member list please submit the following in your post
> 
> 1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
> 2. Manufacturer & Brand - Please don't make me guess.
> 3. Cooling - Stock, Aftermarket or 3rd Party Water
> 
> Please note if you do not provide manufacturer, brand or cooling - by default Sapphire Stock Cooling will be chosen for you.


Sorry I did not read the rules

http://www.techpowerup.com/gpuz/cgbxs/


----------



## Dasboogieman

Quote:


> Originally Posted by *Dan848*
> 
> Normally, high-end AMD card prices remain high for 9 to 18 months after the release of the next-gen card. This may be a very poor time to sell your R9 290, because bitcoin miners have been dumping thousands of 290 and 290X video cards onto the used market. So you will lose a lot of money if you sell now; might as well wait.
> 
> I do not know about you; however, I skip a generation. Normally there is not a great enough jump in performance in one generation for me to purchase a new video card.


Yeah, I agree. I usually skip a generation too, and even then wait about 6-9 months for the initial issues to be ironed out and the AIB cards to be released; then I can judge the GPU's competitive position more objectively. That's part of the reason it took so long for me to jump on the 290: I had to be sure it was going to be mature when I went on board. Reduced prices are a bonus; there's almost no reason to be an early adopter these days.

Just by waiting, so many issues were bypassed: hot and loud reference coolers, black screens, crappy release drivers, infant mortality on early cards. You also get a better idea of the final overclocked performance, reliability, and potential overvoltage degradation to expect from an architecture, since loads of early adopters basically beta test the settings for you. I basically only had to purchase the 290 and punch in the most common setting of 1100MHz +25mV.
I don't know how many people remember the GTX 570, but its initial 4-phase reference design was very prone to blowing up if overvolted. That was a serious issue that didn't get fixed until about a year later (when 6-phase designs became the norm), with thousands of unhappy campers.


----------



## RetiredAssassin

*Hey there folks*

I need a little help. I was surfing Newegg and came across something that got me to reconsider my initial choice of graphics card, which is/was a GTX 780 Ti Classified at *$760*.

I came across an R9 290X at a very exciting price: the PowerColor PCS+ AXR9 290X at *$480*.

So now I'm considering this option: either 2-way PowerColor R9 290X CrossFire OR a *single* GTX 780 Ti Classified.

What would you guys recommend? I'm going to put my hardware order in on July 4, so I need to decide before then.

_NOTE: at present I'll be gaming on a single 1080p monitor_, and I'm not sure if down the road I'll make it a triple-monitor setup. I know this 2-way CrossFire setup would mainly be ideal for a multi-monitor or 4K setup, but the reason I'm still considering it is that I feel I get more performance for my buck with these R9s, as opposed to the 780 Ti, where I'm concerned the VRAM won't be enough as more triple-A titles hit the market. Anyway, what's your take, guys?

Thanks


----------



## heroxoot

Quote:


> Originally Posted by *RetiredAssassin*
> 
> *Hey there folks
> 
> 
> 
> 
> 
> 
> 
> *
> 
> I need a little help, I was surfing on NewEgg and came across something that got me re-consider my initial choice of Graphics Card which is/was GTX 780Ti classified at *$760*
> 
> I've came across an R9 290X at very exciting price tag which is PowerColor PCS+ AXR9 290X at *$480*
> 
> soo... now I'm considering this option, either 2-way PowerColor R9 290X crossfire OR *single* GTX 780Ti Classified.
> 
> what would you recommend guys? I'm gonna put my hardware order on July 4 so I need to decide before then,
> 
> Thanks


Dual 290Xs will beat a single 780 Ti.

http://www.hardocp.com/article/2013/11/11/geforce_gtx_780_ti_vs_radeon_r9_290x_4k_gaming/3

The two are about on par card-for-card, so dual 290Xs would be amazing. Also remember that this benchmark uses a reference 290X, I believe, so the PowerColor would most likely be clocked higher than ref, as most are.


----------



## Dsrt

My R9 290Xs and the Super Flower Leadex 1300W finally came. Gonna fire these up tonight


----------



## HoneyBadger84

Quote:


> Originally Posted by *RetiredAssassin*
> 
> *Hey there folks
> 
> 
> 
> 
> 
> 
> 
> *
> 
> I need a little help, I was surfing on NewEgg and came across something that got me re-consider my initial choice of Graphics Card which is/was GTX 780Ti classified at *$760*
> 
> I've came across an R9 290X at very exciting price tag which is PowerColor PCS+ AXR9 290X at *$480*
> 
> soo... now I'm considering this option, either 2-way PowerColor R9 290X crossfire OR *single* GTX 780Ti Classified.
> 
> what would you recommend guys? I'm gonna put my hardware order on July 4 so I need to decide before then,
> 
> _NOTE: as of present I'll be gaming on single 1080p monitor_, and not sure if down the road I'll make it triple-monitor setup, I know this 2-way crossfire setup mainly would be ideal for multiple monitor or 4K setup, but the reason I'm still considering is because I feel I get more performance for my buck with these R9s as opposed to 780Ti with which I've VRAM concern to be not enough as other triple-a titles hit the market, but anyways, what's your take on this guys?
> 
> Thanks


That's about $50 lower than the last time I looked at it. Buy, and buy quick, is my recommendation lol


----------



## steverebo

Hi guys, quick question: is +150mV a safe 24/7 voltage on a Sapphire R9 290X Tri-X under water, with a kryographics block and one XT45 360 rad plus one XT45 240 rad?

My card seems stable at 1200 core at +150mV, sitting at 45°C under full load in the Valley benchmark.


----------



## HoneyBadger84

Quote:


> Originally Posted by *steverebo*
> 
> Hi guys quick question is +150mv a safe 24/7 voltage on a saphire r9 290x trix under water with kryographics block and 1 x xt45 360 and 1 x xt45 240 rads.
> 
> My card seems stable at 1200 core but at +150mv at 45° full load valley benchmark


As long as gpu & vrm temps are in check I don't see why it wouldn't be safe.


----------



## heroxoot

Quote:


> Originally Posted by *steverebo*
> 
> Hi guys quick question is +150mv a safe 24/7 voltage on a saphire r9 290x trix under water with kryographics block and 1 x xt45 360 and 1 x xt45 240 rads.
> 
> My card seems stable at 1200 core but at +150mv at 45° full load valley benchmark


If Trixx can do multiple profiles, I would use them. In MSI AB I run two profiles, one for 3D and one for 2D; that way the extra voltage isn't applied 24/7.


----------



## Dasboogieman

Quote:


> Originally Posted by *RetiredAssassin*
> 
> *Hey there folks
> 
> 
> 
> 
> 
> 
> 
> *
> 
> I need a little help, I was surfing on NewEgg and came across something that got me re-consider my initial choice of Graphics Card which is/was GTX 780Ti classified at *$760*
> 
> I've came across an R9 290X at very exciting price tag which is PowerColor PCS+ AXR9 290X at *$480*
> 
> soo... now I'm considering this option, either 2-way PowerColor R9 290X crossfire OR *single* GTX 780Ti Classified.
> 
> what would you recommend guys? I'm gonna put my hardware order on July 4 so I need to decide before then,
> 
> _NOTE: as of present I'll be gaming on single 1080p monitor_, and not sure if down the road I'll make it triple-monitor setup, I know this 2-way crossfire setup mainly would be ideal for multiple monitor or 4K setup, but the reason I'm still considering is because I feel I get more performance for my buck with these R9s as opposed to 780Ti with which I've VRAM concern to be not enough as other triple-a titles hit the market, but anyways, what's your take on this guys?
> 
> Thanks


The 3GB memory buffer on the 780/Ti was the exact reason I got a Hawaii card. There's basically no scalability, similar to the older GTX 570: it's adequate for a single-GPU configuration, but totally inadequate for SLI. As long as you are aware of what you are leaving behind and can mitigate the risks, you'll get more mileage per dollar out of dual 290Xs.
A couple of points from my experience:

What you are leaving behind when not going NVIDIA:
1. The excellent NVIDIA Inspector program. This is an absolute must for any NVIDIA user, and I've yet to see an AMD equivalent that allows the same degree of driver-level tweaking.
2. Faster driver release cycle. Simple: NVIDIA has a bigger driver team; they will even optimize the hell out of obscure and beta games which may never see the light of day.
3. TXAA. I've never used it, so I can't comment.
4. PhysX. I've seen it once, in Batman: Arkham City; it lagged too much so I turned it off.
5. Superior texture filtering performance. This means the 780 Ti does better in older games circa 2012-2013 which don't push the ROPs or memory as hard.
6. Superior tessellation. I've yet to see a good implementation where this is a dealbreaker, but the advantage is there.
7. Reasonably hassle-free SLI. I might be too new to Crossfire to be truly objective, but NVIDIA SLI is really easy to use; because the driver team is bigger, they can afford to implement SLI profiles for a wider selection of games, even obscure ones that don't really need SLI.
8. SGSSAA. I've yet to see this employed on AMD; I'm a big fan of SSAA, so this was a big loss.
9. The 780 Ti is reasonably cool and quiet, or conversely reasonably hot and loud (as opposed to "extremely") when using the Skynet BIOS to OC.

What you won't miss from NVIDIA:
1. Expensive
2. Expensive, expensive, more expense
3. Unbalanced GPU configurations. Seriously, nobody likes a flagship GPU that tanks in performance after only 2 years of use (see GTX 570).

What you gain from going AMD:
1. Excellent bang for buck
2. Stronger high-resolution performance. Indirectly, at 1080p this also means that Hawaii theoretically bleeds less performance when using heavy MSAA or SSAA.
3. 4GB of RAM. This is a really big advantage because it is the sweet spot for the offered GPU rendering power; it allows you to play 99% of games out there, even in crossfire, without VRAM issues.
4. Aircooled GPUs are very hot and loud in crossfire, unless you have one of those rare X79 boards with heaps of spacing. Can be partially mitigated by having a powerful (150+ CFM) side panel extraction fan. Can be totally mitigated with watercooling.
5. Superb (though not quite as polished as NVIDIA-favored titles) performance in AMD-favored games.
6. The model you are after, the PCS+, is a _*perfect*_ candidate for NZXT G10 modding with an AIO cooler, since it has independent heatsinks for the VRM1 and VRM2 assemblies plus an integrated backplate.
7. The only weakness of the PCS+ is its 3-slot cooler, which may hamper 2-way air-cooled crossfire.


----------



## Sgt Bilko

^ I like that....good post man


----------



## sugarhell

Quote:


> Originally Posted by *Sgt Bilko*
> 
> ^ I like that....good post man


Not really.

I don't agree with 2, 4, 5, 6, 7, 8.

2: WHQL means nothing. Both have around the same number of driver releases.
5: wut
6: wut, this is not the 6900 series; Hawaii has around the same or slightly lower tessellation performance
7: Almost any mGPU setup is a pain. No difference between the two solutions.
8: AMD has SGSSAA and RGSSAA

He missed ShadowPlay, which is cool and atm better than AMD's solution


----------



## Sgt Bilko

Quote:


> Originally Posted by *sugarhell*
> 
> Not really.
> 
> I dont agree with 2,4,5,6,7,8
> 
> 2: WQHL means nothing. Both have around the same number of drivers
> 5:wut
> 6:wut this is not 6900 series hawaii has around the same performance or a bit lower performance on tessellation
> 7:Almost all the mgpu setup is a pain.No difference between the two solutions
> 8:Amd has SGSSAA and RGSSAA
> 
> He missed shadowplay which is cool and atm better than amd solution


Maybe I should have highlighted the "What you gain from going AMD" part of it.

I don't have a lot of experience with Nvidia personally, so I normally don't comment on it.

Things like ShadowPlay and AMD's DVR function are something I really, really have no interest in whatsoever.
I don't record my gameplay, I only take screencaps (I'm in the minority on this though)


----------



## heroxoot

@dasboogieman I disagree with the heat comment on AMD. My 290X is very quiet below 60% fan speed. No one buys the reference design unless they plan to put it under water.

On another note, is anyone else's GPU clocking up during Flash playback? I have HW acceleration disabled in Firefox and Flash, with protected mode disabled, and my GPU still clocks up to 500 for twitch.tv. Not a big deal, but I'm curious why, when before it stopped clocking past 400 once I disabled all HW acceleration.


----------



## rdr09

Quote:


> Originally Posted by *RetiredAssassin*
> 
> *Hey there folks
> 
> 
> 
> 
> 
> 
> 
> *
> 
> I need a little help, I was surfing on NewEgg and came across something that got me re-consider my initial choice of Graphics Card which is/was GTX 780Ti classified at *$760*
> 
> I've came across an R9 290X at very exciting price tag which is PowerColor PCS+ AXR9 290X at *$480*
> 
> soo... now I'm considering this option, either 2-way PowerColor R9 290X crossfire OR *single* GTX 780Ti Classified.
> 
> what would you recommend guys? I'm gonna put my hardware order on July 4 so I need to decide before then,
> 
> _NOTE: as of present I'll be gaming on single 1080p monitor_, and not sure if down the road I'll make it triple-monitor setup, I know this 2-way crossfire setup mainly would be ideal for multiple monitor or 4K setup, but the reason I'm still considering is because I feel I get more performance for my buck with these R9s as opposed to 780Ti with which I've VRAM concern to be not enough as other triple-a titles hit the market, but anyways, what's your take on this guys?
> 
> Thanks


$400 tops for a 3GB card.


----------



## heroxoot

So guys, what's the opinion on setting texture filtering to Performance rather than Standard? I was looking up tweaks that add performance without degrading quality, and one claim was that Performance mode is fine because when you run a game it will use anisotropic filtering rather than trilinear or bilinear anyway, so it looks the same in-game. It netted me a good extra frame in Heaven too, and I cannot see a difference. Just curious if others do this, or if there is a difference I can't see.

I popped my GPU up from 1030 to 1040, turned the filter down to Performance, and bam, an extra frame and a half. My GPU is made to run 1040 out of the box via the MSI Gaming App, so I figured, why not just set it to run 1040 in MSI AB? It's not like it needs more voltage. Now that I can control the fan speed and curve, my GPU doesn't even see 80C.


----------



## fateswarm

Quote:


> Originally Posted by *Dasboogieman*
> 
> 4. Aircooled GPUs are very hot and loud


Tri-x 290 is surprisingly not loud usually. Also part of temps depends on the quality of the circuit of the VRM the board maker chose to have. If the mosfets are very good, you mainly care about cooling the GPU chip.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dasboogieman*
> 
> 4. Aircooled GPUs are very hot and loud *in crossfire*,


Quote:


> Originally Posted by *fateswarm*
> 
> Tri-x 290 is surprisingly not loud usually. Also part of temps depends on the quality of the circuit of the VRM the board maker chose to have. If the mosfets are very good, you mainly care about cooling the GPU chip.


Quote:


> Originally Posted by *heroxoot*
> 
> @dasboogieman I disagree with the heat comment on AMD. My 290X is very quiet below 60% fan speed. No one buys reference design unless they plan to put it under water.


You both missed that last part......

Single cards have no issue at all with noise or heat (even Ref to an extent) but when you start adding more GPU's in Crossfire or Trifire then the heat and noise (in some cases) will obviously ramp up unless of course you have enough spacing and airflow which Dasboogieman also pointed out.


----------



## heroxoot

Quote:


> Originally Posted by *fateswarm*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> 4. Aircooled GPUs are very hot and loud
> 
> 
> 
> Tri-x 290 is surprisingly not loud usually. Also part of temps depends on the quality of the circuit of the VRM the board maker chose to have. If the mosfets are very good, you mainly care about cooling the GPU chip.

This. My 290X Gaming is pretty silent below 60% speed.


----------



## Sgt Bilko

Quote:


> Originally Posted by *heroxoot*
> 
> This. My 290X Gaming is pretty silent below 60% speed.


As are my DD cards but you are taking that quote out of context


----------



## heroxoot

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> This. My 290X Gaming is pretty silent below 60% speed.
> 
> 
> 
> As are my DD cards but you are taking that quote out of context

Not even a little. I know people with crossfire and 3way and they don't overclock and thus don't have liquid. Their cards are fairly silent even so. Only ref cards are noisy.


----------



## Sgt Bilko

Quote:


> Originally Posted by *heroxoot*
> 
> Not even a little. I know people with crossfire and 3way and they don't overclock and thus don't have liquid. Their cards are fairly silent even so. Only ref cards are noisy.


I covered that above:
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Single cards have no issue at all with noise or heat (even Ref to an extent) but when you start adding more GPU's in Crossfire or Trifire then the heat and noise (in some cases) will obviously ramp up unless of course you have enough spacing and airflow which Dasboogieman also pointed out.


----------



## cennis

How are your Watch Dogs experiences?

With my 295X2 (comparable to 2x 290X) I get like 30-50 fps in a car, and 70-140 fps on foot when not near cars.

1440p, ultra, AA disabled


----------



## heroxoot

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Not even a little. I know people with crossfire and 3way and they don't overclock and thus don't have liquid. Their cards are fairly silent even so. Only ref cards are noisy.
> 
> 
> 
> I covered that above:
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Single cards have no issue at all with noise or heat (even Ref to an extent) but when you start adding more GPU's in Crossfire or Trifire then the heat and noise (in some cases) will obviously ramp up unless of course you have enough spacing and airflow which Dasboogieman also pointed out.
> 

Good to know. Want a cookie? Kidding, but still, the noise to heat ratio has been blown way out of proportion on the new cards. They do get hotter than last gen for sure, but I swear people make it sound like your PC will catch fire.


----------



## Sgt Bilko

Quote:


> Originally Posted by *heroxoot*
> 
> Good to know. Want a cookie? Kidding, but still, the noise to heat ratio has been blown way out of proportion on the new cards. They do get hotter than last gen for sure, but I swear people make it sound like your PC will catch fire.


I agree that the heat thing gets blown way out of proportion, but I was just pointing out that Dasboogieman only said heat and noise are an issue in multi-GPU setups, whereas you and another poster were comparing that comment to single-GPU setups.

I am curious how the 8150 handles your 290X though. I was thinking about getting another 290 while they are cheap, to replace the 7970 in my 8150 rig.


----------



## fateswarm

Quote:


> Originally Posted by *Sgt Bilko*
> 
> As are my DD cards but you are taking that quote out of context


Like you ignored what I said on VRMs?


----------



## Sgt Bilko

Quote:


> Originally Posted by *fateswarm*
> 
> Like you ignored what I said on VRMs?


Single vs Dual again....


----------



## fateswarm

Not necessarily. If the mosfets are good for a good VRD circuit you could avoid even cooling them. They could make total overkills if they wanted to.

e.g. that external VRM Gigabyte sells now appears to be full of IR3550s. It is capable of around 3,000W. Even half of it may need zero cooling.


----------



## heroxoot

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Good to know. Want a cookie? Kidding, but still, the noise to heat ratio has been blown way out of proportion on the new cards. They do get hotter than last gen for sure, but I swear people make it sound like your PC will catch fire.
> 
> 
> 
> I agree that the heat thing gets blown way out of proportion but i was just pointing out that Dasboogieman only said that heat and noise are only an issue in Multi-GPU setups where you and another were only comparing that comment to Single GPU setups.
> 
> I am curious about how the 8150 handles your 290x though, Was thinking about getting another 290 while they are cheap to replace the 7970 in my 8150 rig.

Seems fine. I'm sure my fps would be better on an 8350 tho. I keep 60+ in all games. 64 player games on BF4 are usually 80 - 90 fps with mantle on the default 1030/1250. The only game I have issues with is borderlands 2. I get frame drops in the silliest ways. Facing a wall with nothing in sight I get 30fps. Facing out at 50 enemies shooting me I get 70fps. Sometimes its normal but area depends a lot. In heaven I get 52+ FPS depending. I set texture filter to performance and I almost get 53fps on 1040/1250. This card does 1040 via gaming app so I decided to just set it to run 1040 during 3D. Doesn't improve it a whole lot but my minimum got slightly higher.

My rig had a 7970 lightning till the cooler on it took a dump.


----------



## HoneyBadger84

Quote:


> Originally Posted by *cennis*
> 
> How are your Watch Dogs experiences?
> 
> With my 295X2 (comparable to 2x 290X) I get like 30-50 fps in a car, and 70-140 fps on foot when not near cars.
> 
> 1440p, ultra, AA disabled


Sounds in the ballpark of mine. I last ran it on dual 290Xs on Ultra with SMAA; it held a flatlined 60FPS with VSync on about 90% of the time and only dipped occasionally while driving. I'm playing at 1080p though, so 1.8 million fewer pixels than you.

I recently installed TheWorse mod and that cut my performance a bit. Then I went down to a single card as I'm rotating some hardware in/out, so right now I'm running it on High if I play before my newer second card arrives (which should be today, fingers crossed).


----------



## Sgt Bilko

Quote:


> Originally Posted by *fateswarm*
> 
> Not necessarily. If the mosfets are good for a good VRD circuit you could avoid even cooling them. They could make total overkills if they wanted to.
> 
> e.g. that external VRM Gigabyte sells now appears to be full of IR3550s. It is capable of around 3,000W. Even half of it may need zero cooling.


Even then, keeping the core itself cool is still a bit harder with the bottom GPU feeding the top one pre-heated air; that's where better case ventilation helps.
Quote:


> Originally Posted by *heroxoot*
> 
> Seems fine. I'm sure my fps would be better on an 8350 tho. I keep 60+ in all games. 64 player games on BF4 are usually 80 - 90 fps with mantle on the default 1030/1250. The only game I have issues with is borderlands 2. I get frame drops in the silliest ways. Facing a wall with nothing in sight I get 30fps. Facing out at 50 enemies shooting me I get 70fps. Sometimes its normal but area depends a lot. In heaven I get 52+ FPS depending. I set texture filter to performance and I almost get 53fps on 1040/1250. This card does 1040 via gaming app so I decided to just set it to run 1040 during 3D. Doesn't improve it a whole lot but my minimum got slightly higher.
> 
> My rig had a 7970 lightning till the cooler on it took a dump.


I've no doubt an 8350 would be better, but I like my 8150; it's a fun chip to play with.
I did have issues with BL2 as well from memory, fairly similar to what you're describing.

I'm guessing Mantle makes a nice difference for you, same as it does for me.

I remember MSI replacing your 7970 with a 290X; pretty awesome trade there.


----------



## heroxoot

Yeah, there are about a hundred topics on the Gearbox forum about poor performance on the 780 Ti and similar cards. Not sure if there was a patch after Psycho was released, but my performance used to be amazing. I just wish I could keep it stable. I'm avoiding Watch_Dogs for the same kinds of reasons.


----------



## Sgt Bilko

Quote:


> Originally Posted by *heroxoot*
> 
> yea there is about a hundred topics on the gearbox forum about poor performance on 780ti and other cards alike. Not sure if there was a patch after psycho was released, but my performance used to be amazing. Just wish I could keep it stable. I'm avoiding Watch_Dogs for the same types of reason.


I remember the PhysX debacle (pre-ordered here), and that caused some major performance issues for me; it seems to be going alright atm though.

Watch_Dogs is just Ubi being Ubi imo. The last few games of theirs I've played at release have had crappy performance all round, so I'm only getting them from the bargain bin now.


----------



## heroxoot

Well, I had no PhysX driver installed, period. I recently installed it and made sure it's set to Low to see if that helps any. Just waiting till later to play with friends. I mean, it is playable, but getting less than 60 FPS feels so jagged.

But yeah, other than that I have no issues in any games. Benchmarks aren't too shabby even on this CPU, though I see people with an i5 getting a couple more FPS than me in Heaven.


----------



## Dasboogieman

Quote:


> Originally Posted by *heroxoot*
> 
> Not even a little. I know people with crossfire and 3way and they don't overclock and thus don't have liquid. Their cards are fairly silent even so. Only ref cards are noisy.


My Crossfire Tri-X and MSI Gaming cards shall regale you with the song of their people.

I'll post a sound recording for those interested; it is a very, very loud whine combined with a roar (compared to the whisper-quiet Tri-X alone).
The reason is that if you leave the stock silent fan curves, the top GPU will either overheat or the VRMs will breach 95 degrees. In fact, my MSI Gaming cannot actually be placed in the top slot because it overheats past 90 degrees unless the fan speed is set to 65% (at which point you hear the characteristic Twin Frozr whine). The Tri-X holds up much better in the top slot because of the sheer cooling power at its disposal; it can maintain reasonable temperatures at a barely audible 50%, but games like Metro will force the fan speed into the audible range, and overclocking will send it into the annoying range.

Single cards are absolutely fine; in fact, you will rarely hear them unless you are overclocking. That being said, the OP was intending to go Crossfire.

Though, I accept that what I consider loud differs from others. Basically, the fans on a Tri-X at 60% or a Twin Frozr at 70% are loud to me.
Quote:


> Originally Posted by *heroxoot*
> 
> Well I had no physx driver installed period. I recently installed it and made sure its set to low to see if that helps any. Just waiting till later to play with friends. I mean it is playable but getting less than 60fps feels so jagged.
> 
> But yea, other than that I have no issues in any games. Benchmarks aren't too shabby even on this CPU. Though I see people with i5 getting a couple more fps than me in heaven.


I was equally annoyed when Dragon Age: Origins refused to run without the PhysX driver, with or without an NVIDIA GPU installed.
Quote:


> Originally Posted by *heroxoot*
> 
> Well I had no physx driver installed period. I recently installed it and made sure its set to low to see if that helps any. Just waiting till later to play with friends. I mean it is playable but getting less than 60fps feels so jagged.
> 
> But yea, other than that I have no issues in any games. Benchmarks aren't too shabby even on this CPU. Though I see people with i5 getting a couple more fps than me in heaven.


Quote:


> Originally Posted by *sugarhell*
> 
> Not really.
> 
> I dont agree with 2,4,5,6,7,8
> 
> 2: WQHL means nothing. Both have around the same number of drivers
> 5:wut
> 6:wut this is not 6900 series hawaii has around the same performance or a bit lower performance on tessellation
> 7:Almost all the mgpu setup is a pain.No difference between the two solutions
> 8:Amd has SGSSAA and RGSSAA
> 
> He missed shadowplay which is cool and atm better than amd solution


Oh yes, I did miss Shadowplay; never tried it myself though.
It's not so much the number of drivers as the frequency and promptness following known issues, though AMD has improved enormously in recent times.

Texture filtering: how else is the GTX 780 Ti gaining the upper hand in most older games and some newer ones, sometimes by as much as 25%? Shader arithmetic is similar, VRAM bandwidth is similar, and clock speeds are similar (though a slight edge to the 780 Ti); Hawaii has a roughly 30% ROP advantage, and GK110 has a roughly 27% texture filtering advantage.

If Hawaii has the same tessellation performance as GK110, why do so many people turn off tessellation when doing Firestrike or Heaven benchmarks?

I've only used Crossfire for 3 months, so I can't say much about AMD Crossfire support; I used SLI GTX 570s for 2.5 years, so I've put up with that particular set of dual-GPU issues for longer.

If AMD has SGSSAA, how does one enable it? I've been scouring the interwebs for ages, plus every nook and cranny of RadeonPro. I would love to be able to use it with my older games, since Hawaii does so well with SSAA.
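For the curious, the rough unit-count comparison above can be played with in a short script. The ROP/TMU counts and clocks below are commonly cited specs that I'm assuming for illustration, not figures taken from this thread:

```python
# Illustrative fillrate comparison from assumed (commonly cited) specs.
specs = {
    "R9 290X":    {"rops": 64, "tmus": 176, "clock_mhz": 1000},
    "GTX 780 Ti": {"rops": 48, "tmus": 240, "clock_mhz": 928},  # typical boost
}

def fillrates(card):
    """Peak pixel and texel throughput: unit count x clock."""
    mhz = card["clock_mhz"]
    return {
        "pixel_gps": card["rops"] * mhz / 1000,  # GPixels/s
        "texel_gts": card["tmus"] * mhz / 1000,  # GTexels/s
    }

for name, card in specs.items():
    print(name, fillrates(card))
```

With these assumed numbers the 780 Ti comes out roughly a quarter ahead on texel rate while Hawaii leads comfortably on pixel rate, which is the shape of the argument being made.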
Quote:


> Originally Posted by *heroxoot*
> 
> That thing has to be defective. Mine stays <80c on 1040/1250 even during heaven.


Sadly, I wish that were the case. Even if it can stay under 80 degrees, how much would the temperature rise if it were swallowing 60-70 degree air from an open-air card in the lower slot? I ran some numbers and it didn't look favourable with my setup.
Say the 290 Gaming can optimistically achieve 70 degrees in Heaven in total silence at 25 degrees ambient (it's actually about 12 here in Australia now). That's a delta T of 45 degrees. If it were swallowing a generous 50 degree intake from the lower GPU (assuming excellent side-case extraction mixing a good quantity of ambient air with the exhaust), then with the same delta T it would be expected to reach 95 degrees. In practice, I'd expect thermal throttling at that point unless the fan ramps up (hence louder).
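The back-of-the-envelope estimate above, which assumes the core sits a roughly constant delta T above its intake air at a fixed fan speed and load, can be sketched in a few lines (the function name is mine, for illustration):

```python
# Rough model: at a fixed fan speed and load, the GPU core sits at a
# roughly constant delta T above its intake air temperature.
def expected_core_temp(core_temp_c: float, intake_temp_c: float,
                       new_intake_temp_c: float) -> float:
    """Project the core temperature for a hotter intake, assuming the
    core-to-intake delta T stays the same (fan speed and load unchanged)."""
    delta_t = core_temp_c - intake_temp_c
    return new_intake_temp_c + delta_t

# 70 C core at 25 C ambient -> delta T of 45 C.
# Fed 50 C pre-heated air from the lower card, the same delta T gives 95 C.
print(expected_core_temp(70, 25, 50))  # 95.0
```

It ignores the fan ramping up as the core gets hotter, which is exactly the trade-off being discussed: either the projected temperature is reached or the fans get louder.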


----------



## heroxoot

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Not even a little. I know people with crossfire and 3way and they don't overclock and thus don't have liquid. Their cards are fairly silent even so. Only ref cards are noisy.
> 
> 
> 
> My Crossfire Tri X and MSI Gaming cards shall regale you with the song of its people
> 
> 
> 
> 
> 
> 
> 
> I'l post a sound recording for those interested, it is a very, very loud whine combined with a roar (compared to the whisper quiet Tri X alone).
> The reason is, if you leave stock silent fan curves, the top GPU will either overheat or the VRMs will breach 95 degrees. In fact, my MSI gaming cannot actually be placed in the top slot because it completely overheats past 90 degrees unless the fan speed is set to 65% (at wich point you hear the characteristic Twin Frozr whine). The Tri X holds up much better in the top slot because of the sheer cooling power at its disposal, it can maintain reasonable temperatures with the barely audible 50% but games like Metro will force the fan speed higher to the audible range, overclocking will send it to the annoying range.
> 
> Single cards are absolutely fine, in fact, you will rarely hear them unless you are overclocking. That being said, the OP was intending to go crossfire.
> 
> Though, I accept that what I consider loud is different to others. Basically, the fans on a Tri X at 60% or the Twin Frozr at 70% is loud to me.

That thing has to be defective. Mine stays below 80°C at 1040/1250 even during Heaven.


----------



## Red1776

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *RetiredAssassin*
> 
> *Hey there folks
> 
> 
> 
> 
> 
> 
> 
> *
> 
> I need a little help, I was surfing on NewEgg and came across something that got me re-consider my initial choice of Graphics Card which is/was GTX 780Ti classified at *$760*
> 
> I've came across an R9 290X at very exciting price tag which is PowerColor PCS+ AXR9 290X at *$480*
> 
> soo... now I'm considering this option, either 2-way PowerColor R9 290X crossfire OR *single* GTX 780Ti Classified.
> 
> what would you recommend guys? I'm gonna put my hardware order on July 4 so I need to decide before then,
> 
> NOTE: as of present I'll be gaming on single 1080p monitor, and not sure if down the road I'll make it triple-monitor setup, I know this 2-way crossfire setup mainly would be ideal for multiple monitor or 4K setup, but the reason I'm still considering is because I feel I get more performance for my buck with these R9s as opposed to 780Ti with which I've VRAM concern to be not enough as other triple-a titles hit the market, but anyways, what's your take on this guys?
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The 3Gb memory buffer on the 780/ti was the exact reason why I got a Hawaii card. There's basically no scalability, similar to the older GTX 570. It's adequate for a single GPU configuration but it is totally inadequate for SLI. As long as you are aware of what you are leaving behind and can mitigate the risks, you'll get more mileage per dollar out of dual 290X.
> A couple of points from my experience:
> 
> What you are leaving behind when not going NVIDIA:
> 1. The excellent NVIDIAinspector program, this is an absolute must for any NVIDIA user and I've yet to see an AMD equivalent that allows the same degree of driver-level tweaking
> 2. Faster driver release cycle: simple, NVIDIA has a bigger driver team, they will even optimize the hell out of obscure and beta games which may never see the light of day.
> 3. TXAA, I've never used it so I can't comment
> 4. PhysX, I've seen it once on Batman Arkham City, lagged too much so I turned it off
> 5. Superior Texture filtering performance: means that the 780Ti does better in older games circa 2012-2013 which don't push the ROPS or Memory as hard
> 6. Superior Tessellation: I've yet to see a good implementation where this is a dealbreaker but the advantage is there
> 7. Reasonably hassle free SLI: I might be too new to Crossfire to be truly objective but the NVIDIA SLI is really easy to use, because the driver team is bigger, they can afford to implement SLI profiles for a wider selection of games, even obscure ones that don't really need SLI.
> 8. SGSSAA: I've yet to see this employed on AMD, I'm a big fan of SSAA so this was a big loss
> 9. the 780Ti is reasonably cool and quiet, or conversely, reasonably hot and loud (as opposed to "extremely") when using the Skynet BIOS to OC
> 
> What you won't miss from NVIDIA
> 1. Expensive
> 2. Expensive, expensive, more expense
> 3. Unbalanced GPU configurations, seriously, nobody likes to have a flagship GPU that tanks performance after only 2 years of use (see GTX 570).
> 
> What you gain from going AMD
> 1. Excellent bang for buck
> 2. Stronger high resolution performance, indirectly, at 1080p this also means that Hawaii theoretically bleeds less performance when using heavy MSAA or SSAA
> 3. 4Gb of RAM, this is a really big advantage because this is the sweet spot for the offered GPU rendering power. Allows you to play 99% of games out there, even in crossfire, without VRAM issues.
> 4. Aircooled GPUs are very hot and loud in crossfire, unless you have one of those rare X79 boards with heaps of spacing. Can be partially mitigated by having a powerful (150+ CFM) side panel extraction fan. Can be totally mitigated with watercooling.
> 5. Superb (but not quite as excellent as the NVIDIA games) performance in AMD favored games
> 6. The model you are after, the PCS+, is a *perfect* candidate for NZXT G10 modding with an AIO cooler since it has independent heatsinks for the VRM1 and VRM2 assemblies + integrated backplate.
> 7. only weakness of the PCS+ is it's 3 slots which may hamper 2-way air cooled crossfire.

Some other thoughts to consider:

1) This is a great driver feature:

http://www.guru3d.com/articles_pages/amd_eyefinity_3_panel_mixed_resolution_review,1.html

2) The stutter (FCAT) performance of the AMD cards is now the same as NVIDIA's. I note this because a lot of people do not keep up, or don't want to.

3) Guru3D is a site I trust for reviews (besides my own), and you can see the minor performance difference between a single R9 290X and the 780 Ti:

http://www.guru3d.com/articles_pages/powercolor_radeon_r9_290x_pcs_review,1.html

4) Mantle, although early, is an up-and-comer and is getting a very good reception with devs:

http://www.guru3d.com/articles_pages/amd_mantle_preview,2.html

5) The overwhelming additional performance of 2 x R9 290X over a 780 Ti, which can be seen here:

http://www.guru3d.com/articles_pages/amd_radeon_r9_295x2_review,1.html

6) I have built a quadfire machine with every generation of AMD GPUs since 07-08. The scaling with the 7xxx series and now the 290X has been fantastic across all four cards.

That concludes my 2 cents.


----------



## ZealotKi11er

Has anyone used this CF connector before? https://www.dazmode.com/store/product/sli_vid_connector_-_g_1_4_-_1_2_3_slot_-_silver/


----------



## sugarhell

Quote:


> Originally Posted by *Dasboogieman*
> 
> My Crossfire Tri X and MSI Gaming cards shall regale you with the song of its people
> 
> 
> 
> 
> 
> 
> 
> I'l post a sound recording for those interested, it is a very, very loud whine combined with a roar (compared to the whisper quiet Tri X alone).
> The reason is, if you leave stock silent fan curves, the top GPU will either overheat or the VRMs will breach 95 degrees. In fact, my MSI gaming cannot actually be placed in the top slot because it completely overheats past 90 degrees unless the fan speed is set to 65% (at wich point you hear the characteristic Twin Frozr whine). The Tri X holds up much better in the top slot because of the sheer cooling power at its disposal, it can maintain reasonable temperatures with the barely audible 50% but games like Metro will force the fan speed higher to the audible range, overclocking will send it to the annoying range.
> 
> Single cards are absolutely fine, in fact, you will rarely hear them unless you are overclocking. That being said, the OP was intending to go crossfire.
> 
> Though, I accept that what I consider loud is different to others. Basically, the fans on a Tri X at 60% or the Twin Frozr at 70% is loud to me.
> I was equally annoyed when Dragon Age Origins refused to run without the PhysX driver with or without an NVIDIA GPU
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Oh yes I did miss Shadowplay, never tried it myself though.
> Its not so much the number of drivers as the frequency and promptness following known issues, though AMD has improved enormously in recent times.
> 
> Texture Filtering, how else is the GTX 780ti gaining the upper hand in most older games and some newer ones? sometimes as much as 25%. Shader arithmetic is similar, VRAM bandwidth is similar, Clockspeeds are similar (though slight edge to the 780Ti), Hawaii has a roughly 30% ROP advantage, GK110 has a roughly 27% Texture Filtering advantage.
> 
> If Hawaii has the same tesselation performance as GK110, why do so many people turn off Tessellation when doing Firestrike or Heaven benchmarks?
> 
> I've only used Crossfire for 3 months so I can't say much about AMD crossfire support, I have used SLI GTX 570s for 2.5 years so I've put up with that particular set of dual GPU issues for longer
> 
> If AMD has SGSSAA, how does one enable it? I've been scouring the interwebs for ages, plus every nook and cranny of RadeonPro. I would love to be able to use this with my older games since Hawaii does so well with SSAA.
> Sadly I wish it was the case. If it can stay at <80 degrees, how much would the temperatures rise if it swallowed 60-70 degree air coming from an open air card in the lower slot. I mean, I ran some numbers and it didn't look favourable with my setup.
> Say the 290 Gaming can optimistically achieve 70 degrees in heaven with total silence with 25 degrees ambient temperatures (its actually about 12 here now in Australia). Thats a Delta T of 45 degrees. If it were swallowing a generous 50 degree air intake from the lower GPU (assuming excellent side case extraction with a good quantity of ambient air mixing with exhaust air), with the same Delta T, its expected to reach 95 degrees. In practice, I'm expecting thermal throttling at this stage unless the fan ramps up (hence louder).


First, don't compare the number of shaders; they're completely different architectures. By older games do you mean DX9? NVIDIA is better with DX9.

They disable tessellation because HWBot allows it. No other reason except a higher benchmark score.

SGSSAA:

http://forums.anandtech.com/showpost.php?p=34297941&postcount=12 - use these settings.


----------



## tsm106

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Not even a little. I know people with crossfire and 3way and they don't overclock and thus don't have liquid. Their cards are fairly silent even so. Only ref cards are noisy.
> 
> 
> 
> *My Crossfire Tri X and MSI Gaming cards shall regale you with the song of its people*
> 
> 
> 
> 
> 
> 
> 
> I'l post a sound recording for those interested, it is a very, very loud whine combined with a roar (compared to the whisper quiet Tri X alone).
> The reason is, if you leave stock silent fan curves, the top GPU will either overheat or the VRMs will breach 95 degrees. In fact, my MSI gaming cannot actually be placed in the top slot because it completely overheats past 90 degrees unless the fan speed is set to 65% (at wich point you hear the characteristic Twin Frozr whine). The Tri X holds up much better in the top slot because of the sheer cooling power at its disposal, it can maintain reasonable temperatures with the barely audible 50% but games like Metro will force the fan speed higher to the audible range, overclocking will send it to the annoying range.
> 
> Single cards are absolutely fine, in fact, you will rarely hear them unless you are overclocking. That being said, the OP was intending to go crossfire.
> 
> Though, I accept that what I consider loud is different to others. Basically, the fans on a Tri X at 60% or the Twin Frozr at 70% is loud to me.
You stuck three open-fan-cooler AIB boards together? What did you expect??

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Has anyone used this CF connector before? https://www.dazmode.com/store/product/sli_vid_connector_-_g_1_4_-_1_2_3_slot_-_silver/


Run-of-the-mill basic connectors. There are better-looking alternatives, but flow-wise it's as good as anything out there.


----------



## tsm106

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> From 11/7/2013 in this very thread... I run dual psu so I can isolate cards and equipment into configurations where I can isolate the power draw for measurement so I don't have to guess at their power draws. Looks like 900w to me.
> 
> 
> 
> With my 290 at stock 290x speeds and +50 PL my entire rig and one monitor only pull 410 watts from the wall max, factoring in the 15 watt monitor and the 92% efficient PSU that's not even 370w DC for my entire PC with half a dozen fans, a water pump, etc.
> 
> Now I took this measurement playing a modded out Skyrim as that usually gives me the highest power use, higher than Heaven, 3dm11, and even Furmark. *All those gave me ~370ish at the wall total.* I understand that I have a 290 and you're talking about 290x's, but there is only a 10% difference at cores so given the same speeds there should not be that big of a difference. There's also the whole hotter-chips-draw-more-power thing, but even when my card was on the stock blower the power usage was only 10 or 15 watts more, if my memory serves me correctly.
> 
> Not doubting your findings, just trying to figure out why we are getting such different results. 800 watts for two cards throttling at stock speeds seems like a lot.

You're doing it wrong. Do the math, your numbers don't add up. A D5 is 20w, my monitor is 35w (don't know about yours), etc etc. Btw after you convert for efficiency, it's no longer "at the wall."


----------



## Blaise170

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Has anyone used this CF connector before? https://www.dazmode.com/store/product/sli_vid_connector_-_g_1_4_-_1_2_3_slot_-_silver/


That's SLI, not CFX.


----------



## tsm106

Quote:


> Originally Posted by *Blaise170*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Has anyone used this CF connector before? https://www.dazmode.com/store/product/sli_vid_connector_-_g_1_4_-_1_2_3_slot_-_silver/
> 
> 
> 
> *That's SLI, not CFX.*

It's an industry misnomer; it doesn't matter what they call it. It's simply a GPU connector fitting.


----------



## Blaise170

Quote:


> Originally Posted by *tsm106*
> 
> It's an industry misnomer, it doesn't matter what they call it. It's simply a gpu connector fitting.


I looked at it wrong on my phone, I thought that was an actual bridge.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Blaise170*
> 
> That's SLI, not CFX.


It's the same thing in this case; it's a waterblock connector.

It's a bridge fitting that lets water pass from one GPU block to the next.


----------



## tsm106

Quote:


> Originally Posted by *Blaise170*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> It's an industry misnomer, it doesn't matter what they call it. It's simply a gpu connector fitting.
> 
> 
> 
> I looked at it wrong on my phone, I thought that was an actual bridge.

It is a bridge. It bridges two blocks together.


----------



## Blaise170

Quote:


> Originally Posted by *tsm106*
> 
> It is a bridge. It bridges two blocks together.


Electrical bridge, not water.


----------



## tsm106

Quote:


> Originally Posted by *Blaise170*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> It is a bridge. It bridges two blocks together.
> 
> 
> 
> Electrical bridge, not water.


----------



## PCSarge

Quote:


> Originally Posted by *tsm106*


Go trade those 7970s for 290s.


----------



## Blaise170

Quote:


> Originally Posted by *tsm106*


Sorry for briefly seeing it on my phone and thinking it was an SLI bridge and not a water connector. I'm sure you've never made a mistake either.


----------



## sugarhell

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You stuck three open fan cooler AIB boards together? What did you expect??


When will people learn: no air cooling for more than one GPU...
Quote:


> Originally Posted by *PCSarge*
> 
> go trade those 7970s for 290s


Why?


----------



## ZealotKi11er

Now that I will have 2 x 290s, I don't know what to use them for.


----------



## tsm106

Quote:


> Originally Posted by *Blaise170*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sorry for briefly seeing it on my phone and thinking it was an SLI bridge and not a water connector. I'm sure you've never made a mistake either.

What's with your postings then? Zealot's question was asked and answered succinctly two times. Why did you need to keep posting?

Quote:


> Originally Posted by *tsm106*
> 
> It's an industry misnomer, it doesn't matter what they call it. It's simply a gpu connector fitting.


Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's the same thing in this case, It's a waterblock connector.
> 
> It's a connector bridge for water to pass from one GPU block to the next


----------



## battleaxe

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Now that I will have 2 x 290s i don't know what to use them for.


A new 4K monitor, that's what... I'm waiting for the bugs to get worked out, though. I don't see why they can't do 60 Hz; I imagine they will before long.

I've got my eye on the new ASUS model for about 650 quid. It seems they have a few issues though, so I don't know yet. My wife asked me yesterday what I want for my birthday... hmmm... 4K looks so good it will be hard not to go for a set.


----------



## Faster_is_better

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Now that I will have 2 x 290s i don't know what to use them for.


lol, I was considering another 290 because some are going dirt cheap... but I would be in the same boat as you.

I think 290/290X cards will be really cheap used for a long while to come, though.


----------



## BradleyW

I refuse to go 4K until we have 144 Hz and 1080p compatibility without blurriness (just in case I can't run a game at 4K and need to drop to 1080p).


----------



## Red1776

Quote:


> My Crossfire Tri X and MSI Gaming cards shall regale you with the song of its people
> 
> 
> 
> 
> 
> 
> 
> I'l post a sound recording for those interested, it is a very, very loud whine combined with a roar (compared to the whisper quiet Tri X alone).
> The reason is, if you leave stock silent fan curves, the top GPU will either overheat or the VRMs will breach 95 degrees. In fact, my MSI gaming cannot actually be placed in the top slot because it completely overheats past 90 degrees unless the fan speed is set to 65% (at wich point you hear the characteristic Twin Frozr whine). The Tri X holds up much better in the top slot because of the sheer cooling power at its disposal, it can maintain reasonable temperatures with the barely audible 50% but games like Metro will force the fan speed higher to the audible range, overclocking will send it to the annoying range.
> 
> Single cards are absolutely fine, in fact, you will rarely hear them unless you are overclocking. That being said, the OP was intending to go crossfire.
> 
> Though, I accept that what I consider loud is different to others. Basically, the fans on a Tri X at 60% or the Twin Frozr at 70% is loud to me.


I am curious what case, fan setup, and ambient temps you are working with.

I am running quad MSI R9 290X Twin Frozr Gaming editions for my 'AMD High Performance Project',

first on air and then putting them under water. None of the four hit 90°C on air.


----------



## Widde

Does anyone know why I can't apply changes in Catalyst? The button just stays grayed out no matter what I change; I want to use "Enhance application settings" with the AA and supersampling.

Nvm, closed it in Task Manager and started it again; now it works. Silly me.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *tsm106*
> 
> You're doing it wrong. Do the math, your numbers don't add up. A D5 is 20w, my monitor is 35w (don't know about yours), etc etc. Btw after you convert for efficiency, it's no longer "at the wall."


What exactly am I doing wrong? One of my monitors only uses 15 watts, and the other two are plugged into a different outlet.

The 370 watts is at the wall, hence why I said "at the wall"... I figured you could plug the numbers into a calculator: (370 - 15) x 0.92 = 327 W. Perhaps my wording was a bit off as I posted that late last night, but my point was that those programs (Heaven, FurMark, 3DMark 11; 370 W at the wall each, or 327 W DC) didn't stress my card as much as Skyrim did (410 W at the wall, or 364 W DC), which is why I went with Skyrim's numbers. I'm just confused as to why your findings say one 290X at stock clocks uses more power than my entire rig.


----------



## tsm106

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> You're doing it wrong. Do the math, your numbers don't add up. A D5 is 20w, my monitor is 35w (don't know about yours), etc etc. Btw after you convert for efficiency, it's no longer "at the wall."
> 
> 
> 
> What exactly am I doing wrong? One of my monitors only uses 15 watts, the other two are plugged into a different outlet.
> 
> The 370 watts is at the wall, hence why I said at the wall... I figured you could plug the numbers into a calculator. (370 - 15) x .92 = 327w. Perhaps my wording was a bit off as I posted that late last night, but my point about this was that those programs (Heaven, FurMark, 3dm11 - 370w at the wall each or 327w DC) didn't stress my card as much as Skyrim did (410w at the wall or 364w DC), which is why I went with Skyrim's numbers. I am just confused as to why your findings are saying that one 290x at stock clocks uses more power than my entire rig.

Like I wrote, your numbers don't add up imo. I can't make sense of them.

You originally said:
Quote:


> With my 290 at stock 290x speeds and +50 PL my entire rig and one monitor only *pull 410 watts from the wall max,* factoring in the 15 watt monitor and the 92% efficient PSU *that's not even 370w* DC for my entire PC with half a dozen fans, a water pump, etc.


But now you say it's 370 W at the wall. Regardless, let's go with 370 W. Mathematically, you subtract known power draws after the efficiency conversion, not before: 370 x 0.92 = 340 W. Also, you're not supposed to measure with accessories attached if you can help it; only the computer should be connected to the Kill A Watt, to minimize discrepancies or errors.

340 W for a 290, CPU, motherboard, accessories, water pump, and fans?

340 watts...

Bit-tech did some CPU power draw tests on the 3770K a while back, and even though we don't have the specifics of their at-the-wall numbers, they measured 244 W at the wall; we have to guesstimate the differences between PSU efficiencies and methodologies. In other words, it's safe to assume that at 4.8 GHz your CPU is drawing 200 W on the safe side, 224 W if taken at 0.92.

Off the bat, your 340 W - 200 W = magic pixie dust. A 290 is rated at 275 W IIRC, right up there with the 290X.

200 W CPU
275 W GPU
20 W D5 pump
15 W LCD
20-30 W misc accessories

That adds up to far more than 370 W.

I don't have a single clue how you could achieve the numbers you have.
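As a sanity check, the rough budget above can be tallied in a few lines. These are the estimates from this post, not measurements, and "misc accessories" is taken near the middle of the quoted 20-30 W range:

```python
# Rough DC power budget from the estimates in this post (all watts).
budget = {
    "cpu (3770K at 4.8 GHz, est.)": 200,
    "gpu (290, rated board power)": 275,
    "d5 pump": 20,
    "lcd": 15,
    "misc accessories": 25,  # middle of the 20-30 W range
}

total_dc = sum(budget.values())
print(total_dc)  # 535

# Converting a wall reading to DC draw with a 92% efficient PSU:
wall_watts = 370
print(round(wall_watts * 0.92))  # 340
```

The component total lands well above the converted wall reading, which is the gap being argued about.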


----------



## the9quad

1133 watts at the wall (specs in sig). Platinum 1200 W power supply with 89% efficiency at full load. That's stock clocks, with the PC and monitor plugged into the Kill A Watt.


----------



## aneutralname

To owners of 290X Tri-X cards, what BIOS version did your card come with?


----------



## RetiredAssassin

Quote:


> Originally Posted by *heroxoot*
> 
> Dual 290X will beat a single 780ti.
> 
> http://www.hardocp.com/article/2013/11/11/geforce_gtx_780_ti_vs_radeon_r9_290x_4k_gaming/3
> 
> They are about par on performance, so dual 290X would be amazing. Also remember that this benchmark is a regular 290X I believe so the power color would most likely be clocked higher than ref, as most are.


Quote:


> Originally Posted by *HoneyBadger84*
> 
> Thats about $50 lower than last time I looked at it. Buy and buy quick is my recommendation lol


Quote:


> Originally Posted by *Dasboogieman*
> 
> The 3Gb memory buffer on the 780/ti was the exact reason why I got a Hawaii card. There's basically no scalability, similar to the older GTX 570. It's adequate for a single GPU configuration but it is totally inadequate for SLI. As long as you are aware of what you are leaving behind and can mitigate the risks, you'll get more mileage per dollar out of dual 290X.
> A couple of points from my experience:
> 
> What you are leaving behind when not going NVIDIA:
> 1. The excellent NVIDIAinspector program, this is an absolute must for any NVIDIA user and I've yet to see an AMD equivalent that allows the same degree of driver-level tweaking
> 2. Faster driver release cycle: simple, NVIDIA has a bigger driver team, they will even optimize the hell out of obscure and beta games which may never see the light of day.
> 3. TXAA, I've never used it so I can't comment
> 4. PhysX, I've seen it once on Batman Arkham City, lagged too much so I turned it off
> 5. Superior Texture filtering performance: means that the 780Ti does better in older games circa 2012-2013 which don't push the ROPS or Memory as hard
> 6. Superior Tessellation: I've yet to see a good implementation where this is a dealbreaker but the advantage is there
> 7. Reasonably hassle free SLI: I might be too new to Crossfire to be truly objective but the NVIDIA SLI is really easy to use, because the driver team is bigger, they can afford to implement SLI profiles for a wider selection of games, even obscure ones that don't really need SLI.
> 8. SGSSAA: I've yet to see this employed on AMD, I'm a big fan of SSAA so this was a big loss
> 9. the 780Ti is reasonably cool and quiet, or conversely, reasonably hot and loud (as opposed to "extremely") when using the Skynet BIOS to OC
> 
> What you won't miss from NVIDIA
> 1. Expensive
> 2. Expensive, expensive, more expense
> 3. Unbalanced GPU configurations, seriously, nobody likes to have a flagship GPU that tanks performance after only 2 years of use (see GTX 570).
> 
> What you gain from going AMD
> 1. Excellent bang for buck
> 2. Stronger high resolution performance, indirectly, at 1080p this also means that Hawaii theoretically bleeds less performance when using heavy MSAA or SSAA
> 3. 4Gb of RAM, this is a really big advantage because this is the sweet spot for the offered GPU rendering power. Allows you to play 99% of games out there, even in crossfire, without VRAM issues.
> 4. Aircooled GPUs are very hot and loud in crossfire, unless you have one of those rare X79 boards with heaps of spacing. Can be partially mitigated by having a powerful (150+ CFM) side panel extraction fan. Can be totally mitigated with watercooling.
> 5. Superb (but not quite as excellent as the NVIDIA games) performance in AMD favored games
> 6. The model you are after, the PCS+, is a _*perfect*_ candidate for NZXT G10 modding with an AIO cooler since it has independent heatsinks for the VRM1 and VRM2 assemblies + integrated backplate.
> 7. only weakness of the PCS+ is it's 3 slots which may hamper 2-way air cooled crossfire.


I appreciate your excellent, detailed and well-thought-out answer.

I've a few more things to clarify, even though when I remember the hassle if I go with AMD, it kind of puts me off...

In the case of doing 2-way Crossfire (on an ASRock Z97 OC Formula), do I have to go with reference-design cards (radial blower fans) to prevent other components from overheating, or are axial-design 290Xs (custom coolers) good enough for a 2-way Crossfire setup without harming or overheating the system?

"At 1080p this also means that Hawaii theoretically bleeds less performance when using heavy MSAA or SSAA" - and this is unfortunate as well.







I'll be gaming on a single 1080p monitor, and I guess I also like how Nvidia cares more about their hardware; usually new technologies and innovations come to Nvidia first, before AMD. Well, this aside, all in all, let me bring up another option, as I'm able to go this way as well. How about an

*EVGA GTX 780 6GB* _OR_ *GTX 780Ti classified*?

I understand this is not the Nvidia GPUs thread, but if I ask in the Nvidia thread, since most of them own those cards, they'll blindly recommend them over anything else without really considering the necessary concerns.

There is only one thing I don't wanna see happening with my GPU... and that is it becoming obsolete within a year as games start to require more than 3GB of VRAM (780Ti). At that point I'll be stuck with it, and basically that will mean I've wasted $760. Of course the 780Ti Classified has its advantages: unlocked voltage, more power phases and a custom PCB... but if it hits the VRAM limit, none of those things can help, which isn't the case with the 6GB 780. Although it won't overclock or hit frame rates as high as the 780Ti, at least you know that if games requiring more than 3GB of VRAM show up, you'll be safe and won't struggle playing them. Thanks once again.


----------



## sugarhell

Nothing. Wait for next gen ^

Or, if you can't wait, two used 290s.


----------



## ZealotKi11er

Quote:


> Originally Posted by *tsm106*
> 
> Like I wrote your numbers don't add up imo. I can't make sense of it.
> 
> You originally said:
> But now you say it's 370w at the wall. Regardless, let's go with 370w. Mathematically you subtract known powerdraws after conversion not before. 370x.92=340w. Also, you're not supposed to measure with accessories attached if you can help it. Only the computer should be connected to the kill a watt to minimize discrepancies or errors.
> 
> 340w for a 290, cpu, mb, acessories, water pump and fans?
> 
> 340watts...
> 
> Bitech did some cpu powerdraw tests on the 3770K a while back and even though we don't have specifics of their at the wall numbers regardless they measured 244w at the wall, we have to guestimate to the differences between psu efficiencies and methodologies. In other words it's safe to assume at 4.8 your cpu is drawing 200w on the safe side, 224w if taken at .92.
> 
> Off the bat your 340w - 200w = magic pixie dust. A 290 is rated to 275w iirc, right up there with the 290x.
> 
> 200w cpu
> 275w gpu
> 20w d5 pump
> 15w lcd
> 20-30w misc accessories
> 
> = 370w
> 
> I don't have a single clue how you could achieve the numbers you have.


With my system playing BF4, last time I checked I was hitting ~440W under load with the 290X at stock speed. That's ~400W actual: 150W for the system and 250W for the GPU. Neither the CPU nor the GPU was really pushed to the max, though. I'd probably have to play some other game.


----------



## bond32

Quote:


> Originally Posted by *sugarhell*
> 
> Nothing. Wait for next gen ^
> 
> Or if you cant wait 2 290s used


This. You can easily find 290s for around $250, even here in the marketplace... That's a steal if you ask me.

What's the discussion on the power draw about? I want in. I like talking about it...

I borrowed a Watts Up meter when I was in school and found my system drew about 410-420 watts at full load, with a 4770K at serious overvoltage and a 290X with the PT1 BIOS. I was actually surprised it wasn't any higher.

Strongly considering buying a watt meter especially now that I have tri-fire.

Oh, and a single D5 pump pulls 35 watts max (at least mine does, the 24V version). Just nitpicking lol...


----------



## Talon720

Well, I ended up getting a PowerColor 290X PCS+ sort of accidentally. I was on Amazon and only wanted to order from Amazon, no 3rd party. I ended up hitting the 1-click order without really thinking; I thought there was still a confirmation screen. I was going to take my chances on a reference card with Hynix memory, but didn't want to keep doing returns, as there aren't that many. Oh well, I can just return it if it's Elpida; I've read they can come with either Hynix or Elpida. I got a PCIe extension cable so I can thoroughly test it without removing my water blocks and other GPUs. Anyone have the 290X PCS+? How do you like it? I'll be taking the cooler off eventually and swapping on my other block.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *tsm106*
> 
> Like I wrote your numbers don't add up imo. I can't make sense of it.
> 
> You originally said:
> But now you say it's 370w at the wall. Regardless, let's go with 370w. Mathematically you subtract known powerdraws after conversion not before. 370x.92=340w. Also, you're not supposed to measure with accessories attached if you can help it. Only the computer should be connected to the kill a watt to minimize discrepancies or errors.
> 
> 340w for a 290, cpu, mb, acessories, water pump and fans?
> 
> 340watts...
> 
> Bitech did some cpu powerdraw tests on the 3770K a while back and even though we don't have specifics of their at the wall numbers regardless they measured 244w at the wall, we have to guestimate to the differences between psu efficiencies and methodologies. In other words it's safe to assume at 4.8 your cpu is drawing 200w on the safe side, 224w if taken at .92.
> 
> Off the bat your 340w - 200w = magic pixie dust. A 290 is rated to 275w iirc, right up there with the 290x.
> 
> 200w cpu
> 275w gpu
> 20w d5 pump
> 15w lcd
> 20-30w misc accessories
> 
> = 370w
> 
> I don't have a single clue how you could achieve the numbers you have.


Those were two separate tests using separate programs to demonstrate the difference in power draw between them. I'll state again, the reason I used 410w was because that's the highest value I got with my 290 at stock 290x speeds. It was 410w at the wall, including the monitor, at stock 290x clocks. The 370w at the wall was a completely separate test, for all intents and purposes just ignore it.

My CPU is not under 100% load when playing Skyrim; it's usually hovering around 30-40%. I would guess that it's using around 55-70 watts. Again, this is just a guess based off of total wattage and software readings and is by no means accurate. Also, Bit-tech's stats are not just the CPU, they are the whole PC including mobo, RAM, cooling, etc. 224w would be accurate if I were using the same mobo, cooling and RAM, and my CPU was pegged at 100%.
Quote:


> The power draw is measured via a power meter at the wall, so the numbers below are of total system power draw from the mains, not the power consumption of a CPU itself.


A 3770K's stock TDP is 77 watts. Even when pushing 1.35v (I'm on 1.32v, currently running 4.6GHz; a 100% CPU load yields 205w at the wall), there's no way in hell a 3770K alone is gonna use over 200 watts.

However, I did think of something that might be causing some discrepancy: I have been using my UPS's display readings to determine power draw. After a bit of research, some people say that APC UPSes are relatively accurate, but others say they can be 30w or so off. I will have to see if I can get my hands on a Kill A Watt sometime soon to check.


----------



## sugarhell

My 3770K at 5GHz uses over 200 watts. 77w is the average TDP; Intel uses average TDP while AMD uses maximum TDP for their CPUs.


----------



## tsm106

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> My CPU is not under 100% load when playing Skyrim, it's usually hovering around 30-40%. I would guess that it's using around 55-70 watts, again this is just a guess based off of total wattage and software readings and is by no means accurate. *Also, Bittech's stats are not just the CPU*, they are the whole PC including mobo, RAM, cooling, etc. 224w would be accurate if I was using the same mobo, cooling, RAM, and my CPU was pegged at 100%.
> Quote:
> 
> 
> 
> The power draw is measured via a power meter at the wall, so the numbers below are of total system power draw from the mains, not the power consumption of a CPU itself.
> 
> 
> 
> A 3770k's stock TDP is 77 watts.. Even when pushing 1.35v (I'm on 1.32v - currently running 4.6GHz - a 100% CPU load yields 205w at the wall) there's no way in hell only a 3770k is gonna use over 200 watts.
> 
> However, I did think of something that might be causing some discrepancy - I have been using my UPS's display readings to determine power draw. After a bit of research some people say that APC's UPS's are relatively accurate, but others say they can be 30w or so off. I will have to see if I can get my hands on a Killawatt sometime soon to check.
Click to expand...

It's implied that it's not just the CPU. How are you going to test full-load power consumption on your CPU alone? Not many people I know can separate the CPU from the mobo. Thus we generally consider the CPU together with the mobo, memory, etc. And a simple ballpark is measuring idle vs peak.

As sugar mentioned, Intel's TDP numbers are very low and NOT INDICATIVE of real-world usage. For example, take a 3930K: it has a TDP of 130w, but in real-world isolated CPU peak-consumption tests, a 4.7/4.8GHz overclocked chip can push total system consumption to well north of 500w at the wall. Obviously a 130w-TDP CPU is doing something more than 130w if it pushes a system that idles at 210w to well over 500w peak.

There are people who always post very low numbers. It's their choice to post low numbers. However that doesn't mean that these gpus/systems etc can only draw what they think is possible.

Here's a good summation of my opinions on power draws.

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/6800_40#post_21229798


----------



## fateswarm

You might be able to get accurate power and even current readings from HWInfo if you have a good digital voltage controller that is detected. e.g. Here I can read the Tri-X's IR3567B and the CPU's IR3563B on the m/b.

You get output power/current/voltage to the chip or memory, and the input it got from the PSU.


----------



## sugarhell

Quote:


> Originally Posted by *fateswarm*
> 
> You might be able to get accurate power and even current readings from HWInfo if you have a good digital voltage controller that is detected. e.g. Here I can read the Tri-X's IR3567B and the CPU's IR3563B on the m/b.
> 
> You get output power/current/voltage to the chip or memory, and the input it got from the PSU.


Software readings suck. And the IR3567B is the same controller as on the reference PCB and the Lightning.


----------



## RetiredAssassin

Quote:


> Originally Posted by *sugarhell*
> 
> Nothing. Wait for next gen ^
> 
> Or if you cant wait 2 290s used


Are any of these next-generation GPUs meant to be released within this year? I'm talking about *R9 3xx (preferably 390X) or GTX 8xx (preferably 880)* series cards. There have been rumors that even if these models are released this year, they won't be on the _20nm_ process but rather the old _28nm_ one (if I'm not wrong).


----------



## fateswarm

Quote:


> Originally Posted by *sugarhell*
> 
> Software readings sucks.


Those chips are pure hardware. Also, some sensor chips on boards are effectively multimeters. They literally connect to read points exactly the way one would do with a multimeter.


----------



## HoneyBadger84

Quote:


> Originally Posted by *aneutralname*
> 
> To owners of 290X Tri-X cards, what BIOS version did your card come with?


This is what mine came with:


Quote:


> Originally Posted by *Talon720*
> 
> Well I ended up getting a powercolor 290x pcs+ sorta accidentally. I was on amazon and only wanted to order from
> Amazon no 3rd party. Well I ended up hitting the 1 click order not really thinking i thought there was still a confirmation screen. I was going to just take my chances on a reference Hynix, but didnt wanna keep returning as theres not that many. Oh well i can just return it if Elpida. Ive read they can have Hynix or Elpida. I got a pcie extension cable so i can thoroughly test it w/o removing my water blocks and other gpus. Anyone have the 290x pcs+? And how do you like it? Ill be taking the cooler off eventually and swapping my other block


Best. Accidental. Purchase. Ever. lol


----------



## HoneyBadger84

Question to the floor, specifically directed at Tri-X OC owners of the R9 290X variety:

Here's what I've seen so far in testing:



My question is: A, are those temps normal in your experience? (Yes, I know they're low; my case has great airflow and the card is in there by itself for now because I'm testing it alone since I just got it, making sure it's not a lemon, etc.) And B, once I get around to it, what clocks should I try pushing, keeping in mind I'm on air and those are my current "max load" temps? Ideally I want to pair it with my Gigabyte WindForce, with 2 slots between the cards, and try to run the core at 1100MHz and the vRAM at ~1500MHz or so. What voltage do you think would most likely be sufficient for that, given that both cards are 1040MHz stock on the core?

It seems like even though I "disabled" ULPS this time, that it's not really disabled. The card still dips a bit in speeds in some tests at points where it seems downright odd it would do so. For instance during Metro 2033's benchmark in a few spots, as well as during some of the 3DMark tests (I ran it through 3DMark11 on 2 presets, 3DMark FireStrike on Normal and Extreme, 3DMark SkyDiver, Unigine Valley & Heaven, Metro Last Light & Metro 2033's benchmark, as testing to make sure it didn't have artifact issues). Is that normal for this card? I haven't really experienced that at all once I've disabled ULPS on my other cards.

Lastly, I've seen talk about BIOS flashing this card to a "better" BIOS? I have ATIWinFlash already; which BIOS would you all recommend I try out, if I were going to try one (I'll back up the current BIOS beforehand, of course)?

P.s. Sorry for much text such wall.


----------



## bond32

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Question to the floor, specifically directed at Tri-X OC owners of the R9 290X variety:
> 
> Here's what I've seen so far in testing:
> 
> 
> 
> My question is, A: are those temps normal in your experience (yes, I know they're low, my case has great airflow & It's in there by itself for now cuz I'm testing it alone since I just got it, making sure it's not a lemon etc), & B: once I get around to it, what clocks should I try pushing, keeping in mind I'm on air, and those are my current "max load" temps? Ideally, I want to pair it with my Gigabyte WindForce, 2 slots between the cards, and try to run Core: 1100MHz, vRAM ~1500MHz or so, what voltage do you think would be sufficient, most likely, for that, given that both cards are 1040MHz stock on the core?
> 
> It seems like even though I "disabled" ULPS this time, that it's not really disabled. The card still dips a bit in speeds in some tests at points where it seems downright odd it would do so. For instance during Metro 2033's benchmark in a few spots, as well as during some of the 3DMark tests (I ran it through 3DMark11 on 2 presets, 3DMark FireStrike on Normal and Extreme, 3DMark SkyDiver, Unigine Valley & Heaven, Metro Last Light & Metro 2033's benchmark, as testing to make sure it didn't have artifact issues). Is that normal for this card? I haven't really experienced that at all once I've disabled ULPS on my other cards.
> 
> Lastly, I've seen talk about BIOS flashing this card to a "better" BIOS for it? I have ATIWinFlash already, which BIOS would you all recommend I try out, if I were going to try one (I'll back up the current BIOS before hand of course).
> 
> P.s. Sorry for much text such wall.


If you're having ULPS issues try this: http://www.ulpsconfigurationutility.com/

Temps look good, although I can't tell your voltage for those clocks. Have you considered one of the benching BIOSes (PT1, PT3, Asus)?


----------



## sugarhell

Or disable it through msi ab?


----------



## bond32

Quote:


> Originally Posted by *sugarhell*
> 
> Or disable it through msi ab?


Or that. Although I have had issues with that not actually disabling it before... Just needed to clear the AB settings was all.


----------



## tsm106

Quote:


> Originally Posted by *sugarhell*
> 
> Or disable it through msi ab?


^^That takes literally one click!

Or they could learn the simple string to look for. ^^Lmao someone wrote a website dedicated to something that takes 3 seconds to change. How lazy can we get?

The key is 4D36E968, under CurrentControlSet... taken from the how-to thread:

A- Disabling ULPS


Spoiler: Warning: Spoiler!



Open regedit and go to:

HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}

This key, 4D36E968, under CurrentControlSet, is the only folder you need to access. You can ignore the others, so don't search for it; just traverse directly to said folder. It's time to disable ULPS, or ultra-low power savings. Inside the folder you will find more numbered folders: 0000, 0001, 0002 and so on.

Open each folder and double click on EnableULPS and change to 0. You do not need to change any other key, or any keys that look similar, just change EnableULPS. Close regedit.
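For those who'd rather script the steps above, here is a rough sketch using Python's standard-library `winreg` module (Windows only, run as administrator; the key path is exactly the one quoted above, but treat this as an illustration, not a tested tool):

```python
import sys

# The device-class key quoted in the how-to above.
GPU_CLASS_KEY = (r"SYSTEM\CurrentControlSet\Control\Class"
                 r"\{4D36E968-E325-11CE-BFC1-08002BE10318}")

def adapter_subkeys(names):
    # Only the numbered adapter folders (0000, 0001, ...) hold EnableULPS;
    # skip non-numeric entries such as "Properties".
    return [n for n in names if n.isdigit()]

def disable_ulps():
    import winreg  # Windows-only standard-library module
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, GPU_CLASS_KEY) as cls:
        count = winreg.QueryInfoKey(cls)[0]
        names = [winreg.EnumKey(cls, i) for i in range(count)]
    for name in adapter_subkeys(names):
        try:
            path = GPU_CLASS_KEY + "\\" + name
            with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path, 0,
                                winreg.KEY_SET_VALUE) as key:
                # Same edit as double-clicking EnableULPS and setting it to 0.
                winreg.SetValueEx(key, "EnableULPS", 0, winreg.REG_DWORD, 0)
        except OSError:
            pass  # adapter without the value, or insufficient rights

if __name__ == "__main__" and sys.platform == "win32":
    disable_ulps()
```

As with the manual edit, a reboot (or driver restart) is needed afterwards for the change to take effect.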


----------



## HoneyBadger84

Quote:


> Originally Posted by *sugarhell*
> 
> Or disable it through msi ab?


That's what I did, which is why I found the clock random changing so odd... maybe Afterburner messed up & didn't disable it right the first time?

Also, voltage at those clocks is just stock, it's reading out at like 1.08-1.11 or such in benchmarks/gaming tests so far.


----------



## dr0thegreatest

The temps are fine.


----------



## sugarhell

Quote:


> Originally Posted by *HoneyBadger84*
> 
> That's what I did, which is why I found the clock random changing so odd... maybe Afterburner messed up & didn't disable it right the first time?
> 
> Also, voltage at those clocks is just stock, it's reading out at like 1.08-1.11 or such in benchmarks/gaming tests so far.


It looks like when you change scenes your GPU drops to 2D clocks; it's normal. This is not ULPS, this is PowerPlay.

ULPS is when your second GPU powers off completely, even the fan. When ULPS is on you can't check the second GPU with MSI AB; all its values will be zero.


----------



## HoneyBadger84

Quote:


> Originally Posted by *dr0thegreatest*
> 
> The temps are fine.


X_X I know they're fine, I'm asking if that's normal for this particular cooler, or if I'm running cooler or warmer than users that also have a Tri-X OC. lol

So what do y'all think in reference to voltage I should try for 1100MHz Core? I'm not really stuck on the vRAM getting to 1500MHz, just a nice round number I thought of, 6GHz effective vRAM speed would be nice, even & flat. Mainly I just wanna run that 10% OC from bone-stock R9 290X speeds in crossfire & have it be game-stable for that bit of performance boost it'll provide.

Previously I tested 1150MHz core on the Gigabyte card with the vRAM at 1550MHz; that was at +131mV, but I don't think it was 100% stable, as I never did more than run short benchmarks with it (all of which it passed). So I'm thinking for 1100MHz I could start at something pretty low like +50mV and work my way up if I get any artifacts or crashing? I assume I should start off by just OCing the core and leave the vRAM speed alone until the core clock I want is stable, then fine-tune the vRAM speed afterwards.


----------



## HoneyBadger84

Quote:


> Originally Posted by *sugarhell*
> 
> It looks like when you change scene your gpu drops to 2d clocks. its normal. This is not ulps this is powerplay.
> 
> Ulps is when your second gpu power off completely even the fan. When ulps is on you cant check with msi ab the second gpu all the values will be zero.


Yes that was the case, but the clock changes were also happening during benchmarks being in the middle of running on occasion, like in the Metro tests as well as in some parts of 3DMark SkyDiver for instance. I'm gonna be switching back to Crossfire soon, with this card as the primary, so I'll have a fresh shot of ULPS being disabled & I'll double check again if it still happens after that.

be back in about 10 mins, gotta uninstall, wipe files, pop the Gigabyte card in & reinstall drivers


----------



## tsm106

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> It looks like when you change scene your gpu drops to 2d clocks. its normal. This is not ulps this is powerplay.
> 
> Ulps is when your second gpu power off completely even the fan. When ulps is on you cant check with msi ab the second gpu all the values will be zero.
> 
> 
> 
> Yes that was the case, but the clock changes were also happening during benchmarks being in the middle of running on occasion, like in the Metro tests as well as in some parts of 3DMark SkyDiver for instance. I'm gonna be switching back to Crossfire soon, with this card as the primary, so I'll have a fresh shot of ULPS being disabled & I'll double check again if it still happens after that.
Click to expand...

Quote:


> It seems like even though I "disabled" ULPS this time, that it's not really disabled. The card still dips a bit in speeds in some tests at points where it seems downright odd it would do so.


Clock changes have nothing to do with ULPS just like sugar explained.


----------



## sugarhell

If you want a quick way to find your card's limits:

100% fan speed, and keep the VRMs under 70C.
Increase the core clock without voltage adjustments.
If you see artifacts instantly, it means you need more volts.
If you see artifacts only after a while, it means you need to cool the GPU down more.
Increase volts in +10-20mV steps.

In general, for a 1300MHz core clock you need 1.45 volts before vdroop.
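The trial-and-error loop above can be written down as a single decision step. This is just a sketch of the rule of thumb, and the 10MHz/10mV step sizes are illustrative, not prescribed:

```python
def next_step(core_mhz, mv_offset, artifacts=False, instant=False):
    """One iteration of the manual limit-finding loop:
    stable -> push the core clock; instant artifacts -> add voltage;
    delayed artifacts -> improve cooling before changing anything."""
    if not artifacts:
        return core_mhz + 10, mv_offset      # stable: raise the core clock
    if instant:
        return core_mhz, mv_offset + 10      # instant artifacts: more volts
    return core_mhz, mv_offset               # delayed artifacts: cool the GPU first
```

For example, a run that is stable at 1100MHz suggests trying 1110MHz at the same voltage, while a run that artifacts the moment the load starts suggests staying at 1100MHz and adding ~10mV.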


----------



## HoneyBadger84

Should I just stick Power Limit to +50% and leave it there when testing this OC, or only adjust that if I start getting throttling even though temps are acceptable?


----------



## sugarhell

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Should I just stick Power Limit to +50% and leave it there when testing this OC, or only adjust that if I start getting throttling even though temps are acceptable?


Always 50%.

Set MSI AB to delayed start when you boot the PC, too. Otherwise it sometimes resets the power limit.


----------



## tsm106

**Don't Apply overclock at startup.


----------



## HoneyBadger84

Quote:


> Originally Posted by *sugarhell*
> 
> Always 50%.
> 
> Go delay msi ab too when you start the pc. Otherwise sometimes it reset powerlimit


Yeah, I'm used to that. I've heard many a horror story of endless crash/BSOD/other nasty loops caused by Afterburner acting funny on startup when people have failed OCs.


----------



## Dasboogieman

Quote:


> Originally Posted by *heroxoot*
> 
> That thing has to be defective. Mine stays <80c on 1040/1250 even during heaven.


Quote:


> Originally Posted by *Red1776*
> 
> I am curious what case/fan setup/ambient temps you are working with?
> I am running quad MSI R290X Twin Frozr Game Editions for my 'AMD High Performance Project'
> First on air and then putting them under water, None of the four on air hit 90c




My case is quite modest in terms of multi-GPU support: it's a Cooler Master HAF 932 with 200mm OEM intakes at the front and top, a 140mm OEM intake in the 5.25-inch drive bays, a 140mm Noctua exhaust, and a 200mm BitFenix Spectre Pro side exhaust (I am looking for a more powerful fan, but alas).
This has always been the case with my setup, even back in the SLI 570 days; with GPUs that have horizontal heatsink fins, the air is dumped sideways.
Quote:


> Originally Posted by *RetiredAssassin*
> 
> I appreciate your excellent detailed and well thought out answer
> 
> 
> 
> 
> 
> 
> 
> I've a few more things to clarify even though when I remember the hassle I should go with AMD it kind of puts me off...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> in case of doing 2-way crossfire(on Z97 with ASrock Z97 OC Formula) should I've to go with reference design cards(radian fan design) to prevent overheat of other components in case OR even with axial-design 290Xs(custom gpus) are good enough to make 2-way crossfire setup without harming and overheating the system?
> 
> at 1080p this also means that Hawaii theoretically bleeds less performance when using heavy MSAA or SSAA - and this is unfortunate also
> 
> 
> 
> 
> 
> 
> 
> I'll be gaming on single 1080p monitor and I guess I also like how Nvidia is more caring about their hardware and usually new technologies and innovations come to Nvidia first before AMD... well this aside, all in all let me bring up another option as I'm able to do this way as well, how about
> 
> *EVGA GTX 780 6GB* _OR_ *GTX 780Ti classified*?
> 
> I understand this is not the Nvidia GPUs thread but asking in Nvidia since most of them own it they blindly offer it over anything else without really considering the necessary concerns.
> 
> There is only one thing I don't wanna see happening with my GPU... and it is become obsolete within a year as games require more than 3gb VRAM(780Ti) so at that point I'll be stuck with it and basically that will mean that I've waisted $760, of course 780Ti classy has its advantages... unlocked voltage, more phases and custom PCB... but if it hits the VRAM limit none of these things can come to help which isn't the case with 6GB 780 which although won't overclock or play at as high frames as 780Ti but at least you know in case of these 3GB more ram requiring games show up you'll be safe and won't struggle playing them. thanks once again


Yeah, I know that feeling. I've owned both NVIDIA and AMD, so I'll try to make things as objective as possible; it's too easy to blindly recommend a product just to vindicate one's own purchase choices. The exact thing happened with my GTX 570 SLI: fantastic while it lasted, but the 1.25GB VRAM ceiling was hit really, really hard circa 2012-2013.

Unfortunately, with the GPU selection you are stuck between a rock and a hard place. In actuality, a 6GB GTX 780 Ti would be absolutely ideal in terms of balance between rendering power and VRAM pool (unfortunately, NVIDIA also knows this, so they're flogging the Titan Black like there's no tomorrow). However, 6GB on a 780 is too unbalanced in favor of the VRAM considering the price premium; the GPU is simply too weak to utilize that much VRAM to its full potential. I fear for the longevity of the 780 from a rendering-power perspective, especially at higher resolutions or with heavy MSAA.

The choice of lesser evils may be the 290X, since it's the most balanced in terms of rendering power, VRAM pool and price. If you get an AIB one, then noise also becomes a non-issue.
In terms of Crossfire, blowers do help with heat buildup, but the key thing is your airflow. Even blowers cannot compensate for poor case airflow.
http://www.pugetsystems.com/labs/articles/Side-Panel-Fans-Are-They-Worth-It-102/

There was an article comparing the heat buildup between open and blower coolers in SLI/Crossfire, but I can't find it.
Basically, the gist was: good airflow + open-air coolers beats poor airflow + blower coolers. This is because blower coolers exhaust most but not all of the hot air; some heat buildup always occurs.


----------



## Red1776

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> That thing has to be defective. Mine stays <80c on 1040/1250 even during heaven.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> I am curious what case/fan setup/ambient temps you are working with?
> I am running quad MSI R290X Twin Frozr Game Editions for my 'AMD High Performance Project'
> First on air and then putting them under water, None of the four on air hit 90c
> 
> Click to expand...
> 
> 
> 
> My case is quite modest in terms of multi GPU support, its a coolermaster HAF 932 With 200mm OEM Intake at the front and top. 140mm OEM intake in 5.25inch drive bays, 140mm Noctua exhaust. 200mm Bitfenix Spectre Pro Side exahust (I am looking for a more powerful fan but alas).
> This has always been the case with my setup, even back in the SLI 570 days, with GPUs that have horizontal heatsink fins the air is dumped sideways.
Click to expand...

Ah, I know exactly what you are dealing with. My '09 4x 5870 quadfire was in a 932, and it was the hottest-running quad I ever built. Check out the 'modded' CM V8.


----------



## heroxoot

Your case has way less flow than mine, as I have a HAF XB. But still, that is way too hot; I'd apply new thermal paste. When I got my 290X it was showing 90C with an overclock and 85C without one. Putting on new thermal paste dropped it a good 5-10C under load.


----------



## Dasboogieman

Quote:


> Originally Posted by *heroxoot*
> 
> Your case has way less flow than mine, as I have a HAF XB. But still, that is way too hot; I'd apply new thermal paste. When I got my 290X it was showing 90C with an overclock and 85C without one. Putting on new thermal paste dropped it a good 5-10C under load.


Yeah, I should probably get round to that. Mine has a dinky warranty-void sticker though; I'll have to take it off slowly with a heat gun. My 932 did come with a GPU bracket to mount a 120mm fan near the HDD cages, but the length of the Tri-X prevented me from using it. The length of the 1300W G2 also prevented the use of a 120mm bottom intake.


----------



## heroxoot

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Your case has way less flow than mine, as I have a HAF XB. But still, that is way too hot; I'd apply new thermal paste. When I got my 290X it was showing 90C with an overclock and 85C without one. Putting on new thermal paste dropped it a good 5-10C under load.
> 
> 
> 
> Yeah, I should probably get round to that. Mine has a dinky warranty-void sticker though; I'll have to take it off slowly with a heat gun. My 932 did come with a GPU bracket to mount a 120mm fan near the HDD cages, but the length of the Tri-X prevented me from using it. The length of the 1300W G2 also prevented the use of a 120mm bottom intake.
Click to expand...

Depends on the company. MSI voided my sticker to check it when they sent it as a replacement, but they say as long as the card isn't physically broken they take it anyway. Sapphire tends to be the same.


----------



## PCSarge

Quote:


> Originally Posted by *heroxoot*
> 
> Depends on the company. MSI voided my sticker to check it when they sent it as a replacement, but they say as long as the card isn't physically broken they take it anyway. Sapphire tends to be the same.


I can chime in on this and say XFX is also the same.


----------



## heroxoot

Quote:


> Originally Posted by *PCSarge*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Depends on the company. MSI voided my sticker to check it when they sent it as a replacement, but they say as long as the card isn't physically broken they take it anyway. Sapphire tends to be the same.
> 
> 
> 
> I can chime in on this and say XFX is also the same.
Click to expand...

My only problem here is that XFX has a misleading warranty. It's called a "lifetime" warranty, but it only lasts as long as the GPU is the current model. A friend tried to get a 6870 fixed and they wouldn't do it for free because the "lifetime of the card is gone". I'd rather have a 3-year warranty than a fake lifetime one.


----------



## PCSarge

Quote:


> Originally Posted by *heroxoot*
> 
> My only problem here is that XFX has a misleading warranty. It's called a "lifetime" warranty, but it only lasts as long as the GPU is the current model. A friend tried to get a 6870 fixed and they wouldn't do it for free because the "lifetime of the card is gone". I'd rather have a 3-year warranty than a fake lifetime one.


Eh, I wouldn't say that. I RMA'd a 5770 when the 7000 series hit and they sent a new one. I think it just matters whether you get a dick for a rep or not.


----------



## RetiredAssassin

Quote:


> Originally Posted by *Dasboogieman*
> 
> 
> 
> My case is quite modest in terms of multi-GPU support; it's a Cooler Master HAF 932 with 200mm OEM intakes at the front and top, a 140mm OEM intake in the 5.25-inch drive bays, a 140mm Noctua exhaust, and a 200mm BitFenix Spectre Pro side exhaust (I am looking for a more powerful fan, but alas).
> This has always been the case with my setup; even back in the SLI 570 days, with GPUs that have horizontal heatsink fins, the air is dumped sideways.
> Yeah, I know that feeling. I've owned both NVIDIA and AMD, so I'll try to make things as objective as possible; it's too easy to blindly recommend a product if only because it vindicates one's purchase choices. The exact thing happened with my GTX 570 SLI: fantastic while it lasted, but the 1.25GB VRAM ceiling was hit really, really hard circa 2012-2013.
> 
> Unfortunately, with the GPU selection you are stuck between a rock and a hard place. In actuality, a 6GB GTX 780 Ti would be absolutely ideal in terms of balance between rendering power and VRAM pool (unfortunately, NVIDIA also knows this, so they're flogging the Titan Black like there's no tomorrow). However, 6GB on a 780 is too unbalanced in favor of the VRAM considering the price premium; the GPU is simply too weak to utilize the VRAM to its full potential. I fear for the longevity of the 780 from a rendering-power perspective, especially at higher resolutions or with heavy MSAA.
> 
> The choice of lesser evils may be the 290X, since it's the most perfectly balanced in terms of rendering power, VRAM pool and price. If you get an AIB one, then noise also becomes a non-issue.
> In terms of Crossfire, blowers do help with heat buildup but the key thing is your airflow. Even blowers cannot compensate for poor case airflow.
> http://www.pugetsystems.com/labs/articles/Side-Panel-Fans-Are-They-Worth-It-102/
> 
> There was an article comparing the heat buildup between open and blower coolers in SLI/Crossfire but I can't find it.
> Basically, the gist was, good airflow + open air coolers is better than poor airflow + blower coolers.. This is because blower coolers exhaust most but not all of the hot air, some heat buildup always occurs.


Again, a very nice, objective answer.







I thank you for that.

To be honest, all these things seem too confusing to make a decision on... I truly want an Nvidia card, but the thought that I might spend $800 and the card could be irrelevant in a couple of months really scares me.









I think this is what I'm gonna do: I'm gonna buy everything else to put the computer together except the GPU, and hold off on the graphics card till the end of this year. If they release any next-gen R9 3xx or GTX 8xx cards I'll just grab one; if not, I'll buy a current one, presumably at a lower price. I think that's the only win-win I can chase down for the present.

BTW: do you have any clue whether any of these next-gen GPUs might be released by the end of this year? Either R9 3xx or GTX 8xx. Thanks.


----------



## Dasboogieman

Check this out, guys; an excellent lunch-time read. I generally strongly disagree with NVIDIA's pricing, but you can't argue with the engineering excellence.
http://www.tomshardware.com/reviews/nvidia-history-geforce-gtx-690,3605.html
AMD and most AIB partners have much to learn about designing a good GPU cooler. The key words are passion and elegance. The sad thing is, I used to hold ASUS in the same regard, though lately they've fallen from grace.


----------



## Silent Scone

Random article to post but ok...

I take it you've not used a 290x Lightning on air then.


----------



## PCSarge

Quote:


> Originally Posted by *Silent Scone*
> 
> Random article to post but ok...
> 
> I take it you've not used a 290x Lightning on air then.


why am i just thinking nvidia trolling AMD/ATI thread?


----------



## HoneyBadger84

There are plenty of quality aftermarket solutions on AMD's side; you just gotta know what to look for. People running stock-cooler cards and hitting 90C-plus have poor airflow and need to set up a custom fan profile like I did on my stock blowers. They never passed 75C at stock clocks in my rig.
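Roughly speaking, a custom fan profile is just a temperature-to-duty curve with linear interpolation between breakpoints, which is what tools like Afterburner do internally. A quick sketch (these breakpoints are made up for illustration, not my actual profile):

```python
def fan_duty(temp_c, curve=((40, 20), (60, 45), (75, 70), (85, 100))):
    """Linearly interpolate fan duty (%) from (temp_C, duty_%) breakpoints."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the curve: idle duty
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:            # temp falls in this segment
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]             # above the curve: max duty

print(round(fan_duty(70)))  # a mid-range temp maps to a mid-range duty
```

The trick is putting the steep part of the curve just below the temps you want the card to avoid, so the fan ramps hard before the GPU gets there.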


----------



## Dasboogieman

Quote:


> Originally Posted by *PCSarge*
> 
> why am i just thinking nvidia trolling AMD/ATI thread?


Nothing wrong with keeping an open mind about both sides of the fence.
Nah, I'm just longing for the old ASUS levels of quality and attention to detail in a cooler; the old DCII was dead quiet and high-capacity, albeit three slots wide. MSI and Sapphire have always had solid, performant designs. The whole point of the article was that a premium feel isn't just pure performance; it's the little things too, like a smooth, silent fan ramp, coil-whine control, materials, etc.
However, if I had to give a cookie to an AIB that has really pushed the envelope on the AMD side, it's the XFX DD. Sexy, restrained, minimalist design, and silent; the only issue seems to be the quality control.
We as users must demand better in order to get better. The stock AMD cooler is a disgrace. The AMD 295 was a step in the right direction.


----------



## Silent Scone

The XFX cooler.

Ok, definitely trollin'









Take it you've not witnessed the physics-defying VRM temps on the Lightning then.

Nice talking with you anyway








Quote:


> Originally Posted by *PCSarge*
> 
> why am i just thinking nvidia trolling AMD/ATI thread?


I'm predominantly an Nvidia user at the moment and even I spotted that one a mile off


----------



## HoneyBadger84

Well, the Tri-X left enough of a good impression that I just got a second, month-old Tri-X 290X off eBay for $335 shipped.







The 59-60C max load temps were sexy. Guess I'll be reselling my Gigabyte WindForce 290X.









For those keeping track, that'll be my 6th 290X, lol. The first three are on their way to new buyers, with a slight profit off the resales of the 2 Core Editions.


----------



## Mercy4You

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Well, the TriX left enough of a good impression that I just got a second month old TriX 290X off Ebay for $335 shipped.
> 
> 
> 
> 
> 
> 
> 
> the 59-60C max load temps were sexy. Guess I'll be reselling my Gigabyte WindForce 290X.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For those keeping track, that'll be my 6th 290X lol first three are on their way to new buyers, with a slight profit off the resales of the 2 Core Editions.


I'm very satisfied with my Sapphire 290X Tri-X. I've had it almost 6 months now and I wouldn't trade it for anything.









It's silent (@50% fan speed) and cool, provided I use only a mild OC; otherwise not the core but VRM1 will get hot (around 90-95C).

The only other issue is fan rattle because of the weight of the card. Support it at the end and that's easily solved.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Mercy4You*
> 
> I'm very satisfied with my Sapphire 290X Tri-X. I've had it almost 6 months now and I wouldn't trade it for anything.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's silent (@50% fan speed) and cool, provided I use only a mild OC; otherwise not the core but VRM1 will get hot (around 90-95C).
> 
> The only other issue is fan rattle because of the weight of the card. Support it at the end and that's easily solved.


Don't have that issue with the one I've already got. Not yet, anyway. The VRMs are very cool; they stay 3-10C cooler than the GPU core according to GPU-Z, as I posted earlier:



But again, my case flow is ridiculously good, so my temps are very low compared to most, which is why I'm going to try to find how small a voltage bump I need for the core to run 1100MHz 24/7 stable. The fan isn't any louder than the Gigabyte WindForce I've been using at 100%, which is what I have the GPU fan go up to when it hits 60C; that's why it typically hits 60C then goes right back down to 59. Lol


----------



## rdr09

Quote:


> Originally Posted by *Dasboogieman*
> 
> Nothing wrong with keeping an open mind about both sides of the fence.
> Nah I'm just longing for the old ASUS levels of quality and attention to detail in a cooler, the old DCII was dead quiet and high capacity albeit taking 3 slots. MSI and Sapphire have always had solid performant designs. The whole point of the article was that its not just pure performance to get a premium feeling, its the little things too like smooth silent fan ramp, coil whine control, materials etc.
> However, if I had to give a cookie to an AIB that has really pushed the envelope on the AMD side, its the XFX DD. Sexy restrained minimalist design and silent, only issue seems to be the quality control.
> We as users must demand better in order to get better. The stock AMD cooler is a disgrace. The AMD 295 was a step in the right direction.


The stock AMD cooler was a blessing in my case. If they put a more expensive cooler on it, it would still serve no purpose 'cause my rig is watercooled. Reference AMD cards are the first to get waterblocks. Paid $400 for my 290.

BTW, even Nvidia users have to resort to the G10.


----------



## devilhead

http://www.3dmark.com/3dm/3462225 Testing my 290X with my mining rig, so the CPU is just at 4.7GHz.







It is at 1360/1700 or something like that. I tried 1370/1700 and the score goes lower; maybe my PSU can't handle it







My PSU is an 800W Corsair, and during the combined scene it pulled around 800W from the wall.







Now I just need to connect those cards to my 3930K at 5.2GHz.
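Worth noting on the 800W-from-the-wall reading: the meter measures AC input, while the PSU's 800W rating is DC output, so the actual load on the unit is the wall draw times efficiency. A rough check (the ~90% figure is an assumption for a quality unit at high load, not this PSU's measured value):

```python
wall_draw_w = 800      # AC power measured at the wall
efficiency = 0.90      # assumed efficiency, roughly Gold-rated at high load
dc_load_w = wall_draw_w * efficiency  # actual DC load on the PSU
print(dc_load_w)       # 720.0 -> within the 800 W rating, but little headroom
```

So roughly 720W of DC load on an 800W unit: within spec, but not much headroom left for pushing the clocks higher.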


----------



## rdr09

Quote:


> Originally Posted by *devilhead*
> 
> http://www.3dmark.com/3dm/3462225 testing my 290X with my mining rig, so the cpu is just at 4.7ghz
> 
> 
> 
> 
> 
> 
> 
> it is 1360/1700 something like that, tryed 1370/1700 the score goes lower, maybe my psu can't hadle it
> 
> 
> 
> 
> 
> 
> 
> ) my psu is 800w corsair. And at combined scene pulled around 800w from wall
> 
> 
> 
> 
> 
> 
> 
> now just need to connect those card at my 3930k at 5.2ghz


It's silicon, but... jomama has a point. It depends into whose hands the card falls. Nice job!


----------



## JeremyFenn

Hey, just did some benchmarking the other day. You think these scores are decent for my rig?





Also here:

3DMark Results:
http://www.3dmark.com/3dm/3450420

3DMark 11 Results:
http://www.3dmark.com/3dm11/8484591

PCMark 8 Conventional Home:
http://www.3dmark.com/pcm8/3458046

PCMark 8 Accelerated 3.0:
http://www.3dmark.com/pcm8/3458394

Let me know what you think.


----------



## Dasboogieman

Gosh, why must there be so much polarity and patriotism over something as simple as GPU brands? I would love to see AMD release a premium magnesium/aluminium blower, just once.
I have seen the physics/time-warping qualities of the Lightning VRMs. 'Tis truly a spectacle to behold; monstrous cooling plus a monstrous VRM assembly makes for truly spacetime-altering performance.
Well, I can't exactly lie about being a predominantly NVIDIA user; I've mentioned ad nauseam that I recently came off a set of SLI GTX 570s. As for trolling, why can't both sides learn from each other?

Anyway, on a happier note:
I took delivery of the last set of 16GB G.Skill 2400MHz Tridents (2x8GB). Got them because they were on sale and my spare rig needs more RAM. Plugged them in and I've noticed absolutely no difference; at least they run at 1.575V, anyway. Should I be noticing a difference?

Also, if anybody has a Sapphire Tri-X-cooled 290/X and wants to use a backplate, I can recommend the EK one with these screws: http://www.helipal.com/m2-5-x-8mm-flat-head-cap-screws.html. No need for my earlier duct-tape mod; these slot in perfectly and sit flush with the EK backplate. Dropped VRM temps by about 5 degrees too.


----------



## smoke2

Will an i5-4590 (3.3GHz/3.7GHz Boost) bottleneck a Sapphire Tri-X 290 OC if it's not overclocked?


----------



## rdr09

Quote:


> Originally Posted by *Dasboogieman*
> 
> Gosh, why must there be so much polarity and patriotism over something as simple as GPU brands. I would love to see AMD release a premium magnesium/aluminium blower, just for once.
> I have seen the physics/timewarping qualities of the Lightning VRMs. Tis truly a spectacle to behold, monstrous cooling + monstrous VRM assembly has truly spacetime altering performance.
> Well I can't exactly lie about being a predominantly NVIDIA user, I've been mentioning ad nauseum about recently coming off a set of SLI GTX 570s. As for trolling, why can't both sides learn from each other?
> 
> Anyway, on a happier note:
> I took delivery of the last set of 16gb G.skill 2400Mhz Tridents (2x8). Got them cuz they were on sale and my spare rig needs more RAM. Plugged them in and I've noticed absolutely no difference, anyway, at least they run at 1.575v. Should I be noticing a difference?
> 
> Also, if anybody has a Sapphire Tri X cooler 290/X and wants to use a backplate. I can recommend the EK one with these screws http://www.helipal.com/m2-5-x-8mm-flat-head-cap-screws.html no need for my earlier duct tape mod. These slot it perfectly and flush with the EK backplate. Dropped VRM temps by about 5 degrees too.


Why? 'Cause some of us don't have the budget for it. Now, digest that for a few minutes. Besides, there are other aspects to why we pick a certain brand. I play Battlefield, so I pick AMD regardless of cooler.

You may have stepped on some toes posting Nvidia stuff in an AMD thread. Pls stop.


----------



## shwarz

Quote:


> Originally Posted by *smoke2*
> 
> Will an i5-4590 (3.3GHz/3.7GHz Boost) bottleneck a Sapphire Tri-X 290 OC if it's not overclocked?


Nope, no bottleneck there.


----------



## Silent Scone

Quote:


> Originally Posted by *Dasboogieman*
> 
> Gosh, why must there be so much polarity and patriotism over something as simple as GPU brands. I would love to see AMD release a premium magnesium/aluminium blower, just for once.
> I have seen the physics/timewarping qualities of the Lightning VRMs. Tis truly a spectacle to behold, monstrous cooling + monstrous VRM assembly has truly spacetime altering performance.
> Well I can't exactly lie about being a predominantly NVIDIA user, I've been mentioning ad nauseum about recently coming off a set of SLI GTX 570s. As for trolling, why can't both sides learn from each other?
> 
> Anyway, on a happier note:
> I took delivery of the last set of 16gb G.skill 2400Mhz Tridents (2x8). Got them cuz they were on sale and my spare rig needs more RAM. Plugged them in and I've noticed absolutely no difference, anyway, at least they run at 1.575v. Should I be noticing a difference?
> 
> Also, if anybody has a Sapphire Tri X cooler 290/X and wants to use a backplate. I can recommend the EK one with these screws http://www.helipal.com/m2-5-x-8mm-flat-head-cap-screws.html no need for my earlier duct tape mod. These slot it perfectly and flush with the EK backplate. Dropped VRM temps by about 5 degrees too.


Let me give you a bit of advice: although I own both, through preference I tend to use NV way more, much like yourself.

If you want to post an article from last year about the 690 blower, post it in a thread where it has the slightest relevance. Here it has none.

Secondly, you then praised possibly one of the worst coolers (XFX) I have ever seen on a graphics card as standing above the rest, so you'll have to excuse people if they're a little confused by your completely random post


----------



## hwoverclkd

Quote:


> Originally Posted by *heroxoot*
> 
> My only problem here is XFX has a misleading warranty. It's called a "Life time" warranty but it only exists as long as the GPU is the current model. A friend tried to get a 6870 fixed and they wouldnt do it for free because the "Life time of the card is gone". I'd rather a 3 year warranty over a fake life time one.


Just like the PNY brand.


----------



## Dasboogieman

Quote:


> Originally Posted by *Silent Scone*
> 
> Let me give you a bit of advice: although I own both, through preference I tend to use NV way more, much like yourself.
> 
> If you want to post an article from last year about the 690 blower, post it in a thread that has the slightest relevance. As in here it has none.
> 
> Secondly, you then praised possibly one of the worst (XFX) coolers I have ever seen on a graphics card as standing above the rest, so you'll have to excuse people if they're a little confused at your completely random post


Well, I just thought extra knowledge can never hurt, and it could help us see whether there are ways to make AMD cards better, in light of the weak AMD reference cooler.
My sincere apologies; I won't post any more NVIDIA stuff if it offends everyone's sensibilities. I guess I was too naive to expect people to put brand loyalty aside and appreciate all corners of technology, regardless of brand.

Anyway, I do genuinely think the XFX cooler is a brave move, even if the execution is a bit wonky. They went for a compact build instead of adding more bulk, like the single-slot coolers of old.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Silent Scone*
> 
> Let me give you a bit of advice: although I own both, through preference I tend to use NV way more, much like yourself.
> 
> If you want to post an article from last year about the 690 blower, post it in a thread that has the slightest relevance. As in here it has none.
> 
> Secondly, you then praised possibly *one of the worst (XFX) coolers I have ever seen on a graphics card* as standing above the rest, so you'll have to excuse people if they're a little confused at your completely random post


----------



## sugarhell

Maybe in general it's average, but for Hawaii it's kinda bad...


----------



## Sgt Bilko

There are a grand total of 4 people in this thread with XFX DD cards, and while the VRM temps aren't great compared to others, they are far from bottom of the barrel here.


----------



## sugarhell

Just because it's not the worst doesn't mean it's a good cooler. VRM1 temps are awful, and VRM temps are the most important thing for AMD GPU OC.


----------



## Silent Scone

Quote:


> Originally Posted by *Sgt Bilko*


Depends in what context you're speaking; he criticised _most_ AIB coolers and then proceeded to say that the XFX one stood out, even though the DD cooler has some of the worst VRM temps of the bunch. Not to mention, if you're familiar with the company, they have some quality issues to boot. Sure, there might be a few worse, but I won't be swayed into saying they're any good







.

The Vapor-X cooler, along with the Lightning, is leagues above. So if he was looking to school people on just how good Nvidia's vapour chamber is (which has now been discontinued), he chose a shoddy example


----------



## Sgt Bilko

Quote:


> Originally Posted by *sugarhell*
> 
> Just because it's not the worst doesn't mean it's a good cooler. VRM1 temps are awful, and VRM temps are the most important thing for AMD GPU OC.


Quote:


> one of the worst (XFX) coolers I have ever seen on a graphics card


That was the part of the quote I picked out, and that was the part to which I was referring.
It's pretty obvious by now that if you want to OC these cards to a decent level on air, you want a Lightning or a Vapor-X, or better yet... a waterblock.

Quote:


> Originally Posted by *Silent Scone*
> 
> Depends in what context you are speaking, as he criticised _most_ AIB coolers and then proceeded to say that the XFX one stood out, even though the DD cooler has some of the worst VRM temps out of the bunch. Not to mention, if you're familiar with the company they have some quality issues to boot. Sure there might be a few worse, but I won't be swayed into saying they're any good
> 
> 
> 
> 
> 
> 
> 
> .
> 
> The Vapor-X cooler, along with the Lightning are leagues above. So if he was looking to school people on just how good Nvidias vapour chamber is (which has now been discontinued), he chose a shoddy example


The Vapor-X and the Lightning are in a class of their own when it comes to Hawaii. What XFX did was redesign the cooler, and it works well (compared to the last DDs); it's a great-looking card and performs well enough for its price point.

As for quality issues... I'll admit these are my first XFX cards (Sapphire, HIS and Giga mainly beforehand) and they are great. They hit nice clocks and keep cool enough when I'm gaming, and I've had no issues with customer support when emailing them about replacing the paste or taking off the cooler to add thermal pads to the VRMs (which I haven't needed to do yet).

If you're a bencher, or you're using a case with crappy airflow, then I can't recommend them, but for the everyday person they are great value for money (here, at least).


----------



## HoneyBadger84

From my limited experience with the regular Tri-X OC Sapphire cooler, I'd say it's quite good compared to most as well. I thought my Gigabyte WindForce ran cool, but this thing runs even cooler, and they have about the same noise level at 100% (only semi-audible over the general hum of my computer, thanks to 5 140mm & 3 180mm fans lol)


----------



## Mega Man

Personally, the ref cooler is and always will be the best. Noisy, yes, but I would never buy something without putting it under water, so yeah: reference GPUs for life. More options, better options, and water.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Mega Man*
> 
> Personally, the ref cooler is and always will be the best. Noisy, yes, but I would never buy something without putting it under water, so yeah: reference GPUs for life. More options, better options, and water.


Buying aftermarket-cooled boards, particularly the ones we're talking about, has its advantages even if you plan on liquid cooling. The Gigabyte & Sapphire cards we're talkin' about have modified PCBs with added hardware to increase the efficiency & overclockability of the card. So buying a reference card, while it saves you money if you're planning to put a liquid block on it, isn't necessarily the best plan if you want a good overclocker. PowerColor's PCS+ boards also stray from reference in terms of additions to the board, and they of course sell cards with full-coverage blocks already installed & ready to go.

I don't do liquid cooling on my cards, ever, so the better the aftermarket cooler performs, the more I love it... One of these days I'll go liquid on my cards, but after the recent breakdown of my CPU loop, I don't think I'll be dealing with another custom loop for a bit; gotta get over the near-heart attack that one gave me when it died.


----------



## Mega Man

While non-ref helps Nvidia substantially,

can you please show me any proof that it helps AMD users (for 24/7 usage)? Look at the OP here: there are just as many aftermarket (non-ref) coolers with high clocks as ref.

Unlike Nvidia, AMD designs some of the best cards for normal use (air/water); obviously I'm not talking about LN2 and other extreme cooling.


----------



## sugarhell

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Buying aftermarket coolered boards, particularly the ones we're talking about, have their advantages even if you plan on putting it under liquid cooling. The Gigabyte & Sapphire cards we're talkin' about have modified PCBs with added hardware to increase the efficiency & overclockability of the card. So buying a reference card, while it will save you money if you're planning to put a liquid block on it, isn't really necessarily the best plan if you want a good overclocker. PowerColor's PCS+ boards also stray from reference in terms of additions to the board, and they of course sell cards with full coverage blocks already on them & ready to go.
> 
> I don't do liquid cooling on my cards, ever, so the better the aftermarket cooler performs, the more I love it... one of these days I'll go liquid on my cards, but after the recent breakdown of my CPU loop, I don't think I'll be dealing with another custom made loop for a bit, gotta get over the near-heart attack that one gave me when it died.


I don't know why people believe that a ref PCB is lower quality than a custom one. Yeah, the Lightning has more phases, but you don't need them for air or water. The ref PCB quality is enough for anyone crazy enough to push 1.6V under water. For LN2 it's enough until you hit power limits.

Look at the highest overclocks. They're done on ref cards.

By the way, the PCS+ is a ref PCB...


----------



## Mega Man




----------



## Sgt Bilko

Quote:


> Originally Posted by *sugarhell*
> 
> I dont know why people believe that a ref pcb is lower quality than a custom one. Yeah lightning has more phases but you dont need them for air or water. The ref pcb quality is enough for anyone crazy to push 1.6 volt under water. For ln2 is enough until you hit power limits.
> 
> Look the highest overclocks. Its done by ref cards.
> 
> By the way PCS+ is a ref pcb...


Wasn't Tsm pumping 1.5v through ref cards within the first couple of weeks?


----------



## Mega Man

As well as one of our friends at 1.6V+ (as I have done now).


----------



## sugarhell

Yes and yes.


----------



## HoneyBadger84

I'm speaking of the longevity of the parts when run at higher power levels etc.; not whether the board can do it, but how long it'll last and how well it handles said power... but *shrug*, to each their own. As I said, I don't do liquid on cards, so reference cards aren't really an option... well, they are, I've used them recently, but the amount of heat they put out even with the fan maxed out is kinda silly, not to mention the leaf-blower effect of having the fan on maximum speed lol

What program would be most recommended for stress-testing a graphics OC to determine whether it's 100% stable? Looping Heaven, Valley, or something similar? I haven't tested a GPU OC for 24/7 stability in quite some time, as I've only run OCs for benchmarking numbers til now, but I want to get these things chugging along at 1100MHz core for regular use. Just did a full session of Valley with no artifacts on only a slight voltage bump; temps, considering it's a crossfire of open-air cards, weren't too bad, low to mid 70s.


----------



## sugarhell

Except for the Lightning and Vapor-X (maybe the Matrix too), the rest of the custom ones use lower-quality rebranded components. And the AMD ref layout already uses quality parts. Even the Lightning uses the same voltage controller.

Pls stop spreading misinformation when you have no idea about PCBs and components.


----------



## devilhead

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Wasn't Tsm pumping 1.5v through ref cards within the first couple of weeks?


I used 1.57V today (1.47V after vdroop, PT1) on my reference 290X; just need to cool those cards down. Maybe later I'll try 1.55-1.6V.


----------



## Mega Man

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I'm speaking of longevity of the parts when run at the higher power levels etc, not whether or not the board can do it, but how long it'll last and how well it handles said power... but *shrug* to each their own. As I said, I don't do liquid on cards, so reference cards aren't really an option... well, they are, I've used them, recently, but the amount of heat they put out even with the fan maxed out is kinda silly, not to mention the leaf blower effect of having the fan on maximum speed lol
> 
> What program would be the most recommended for stress-testing a Graphics OC to determine whether or not it's 100% stable? Loop Heaven, Valley, or something similar? I haven't done testing for 24/7 stability of a GPU OC in quite some time as I've only run OCs for benchmarking numbers til now, but I want to get these things chugging along at 1100MHz core for regular use. Just did a full session of Valley with no artifacts with only a slight voltage bump, temps considering it's crossfire of open air cards weren't too bad, low to mid 70s.


You really didn't listen, so I will speak clearly: there is NO advantage.
Quote:


> Originally Posted by *sugarhell*
> 
> Except lightning,vaporx (maybe matrix too) the rest of the custom one use _*lower rebrand quality components*_. And amd ref layout already _*use quality parts*_. Even lightning use the _*same voltage controller*_.
> 
> Pls stop spreading misinformations when you have no idea about pcbs and components.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Mega Man*
> 
> you really didnt listen, so i will speak clearly, there is NO advantage


And how, pray tell, would one know of any advantage in terms of what I'm speaking of, which is longevity (the lifespan of the card, be it reference or modified, and how it may be shortened by overclocking or running at high temps), when the cards haven't even been available for a full 2 years yet?  Just sayin'.

How am I "spreading" misinformation, I just recently started fiddling with R9 290Xs so I'm learning the ins and outs, and which PCBs are straight 100% reference & which have physically added circuitry or are fully customized PCBs, in regards to the PCS+ I don't know for sure if it's reference or not as I don't own one (Edit: And according to Guru3d.com's review, the PCS+ is a customized reference PCB, like I said: http://www.guru3d.com/articles_pages/powercolor_radeon_r9_290x_pcs_review,1.html *shrug* don't own one, can't say for sure how custom it is.), and haven't seen one without the backplate on, so I can't say for sure, I was simply stating the fact that PCS+ cards have been showing to OC pretty well, especially compared to pure reference cards (the ones with the leaf blower). You guys take this crap way too seriously & get worked up/offended it seems like. Guess I'll leave my question that's gone half-answered for about 7 pages now for some other time lol. Nevermind.


----------



## sugarhell

You say something wrong, then you say you're still learning. Most of the cards follow the reference layout and add two more phases, or they go full custom (and fail, like the Asus DCU cards and the Gigafail), or real custom like the Lightning, Matrix, or Vapor-X.

The quality is the same; all of them use Elpida or Hynix. Just because a card has more phases doesn't mean it will OC better. That's just a marketing line that's gone on for years. Get the GPU with the best air cooler if you want to stay on air, or if you want water, get a reference card or a custom one with a block. All the rest are marketing gimmicks that most of the time don't matter for users on air or water. Or do you believe MSI's "military components" quality? Probably that fan oil is tank oil or something


----------



## HoneyBadger84

I never got MSI's "military grade components" reference, to be honest; stuff in the military breaks just the same, it just gets fixed faster, sometimes


----------



## tsm106

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Guess I'll leave my question that's gone half-answered for about 7 pages now for some other time lol. Nevermind.


Next time just ask your question and leave the pseudo-expert opinions out of it? At this point I don't know what your question is, except that you were repeating the same misinformation as people who have had very little experience using AMD reference cards. Here's a tip: AMD reference designs are overbuilt. There are no flaws in the reference design, unlike Nvidia's reference boards, which are cheap and locked down. That's why customs are so popular with Nvidia cards. That same mentality is what drives people to posit that AMD custom cards are also superior to AMD reference cards.

That is SIMPLY NOT TRUE.

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> I don't know why people believe that a ref PCB is lower quality than a custom one. Yeah, the Lightning has more phases, but you don't need them for air or water. The ref PCB quality is enough for anyone crazy enough to push 1.6V under water. For LN2 it's enough until you hit power limits.
> 
> Look at the highest overclocks. They're done with ref cards.
> 
> By the way, the PCS+ is a ref PCB...
> 
> 
> 
> Wasn't Tsm pumping 1.5v through ref cards within the first couple of weeks?

I put 1.6v through my trifire array last Nov. Yeah, in 2013. It seems a lot of knowledge gets lost in this thread over time. Strangeness.


----------



## Red1776

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I never got MSI's "military grade components" reference, to be honest; stuff in the military breaks just the same, it just gets fixed faster, sometimes


The MSI "Military Grade" claim comes from the documentation they provide citing the actual military standards for field conditions. I researched this as far as I was able, mind you, and I believe that technically they do qualify. But again, I am relying on what I was able to find.



Just sharing what I was able to find, don't shoot the messenger here.

Great points on the ref cards BTW, TSM. I have had that argument with countless people who will not touch ref cards because "they are weak"


----------



## pdasterly

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I never got MSI's "military grade components" reference, to be honest; stuff in the military breaks just the same, it just gets fixed faster, sometimes


I have an HP EliteBook that's military grade and I can vouch for that; she has lasted longer than any laptop I've owned and is still going strong. I replaced the HDD with an SSD, so it has no moving parts except for the CPU fan. It's over 5 years old and works great. I do automotive work with it and really abuse it. EliteBook 2730p. Unlike most HP items, this was built to last


----------



## smoke2

Quote:


> Originally Posted by *shwarz*
> 
> nope no bottleneck there


And if I slightly overclock the Tri-X OC, the CPU still won't be bottlenecking?


----------



## shwarz

Quote:


> Originally Posted by *smoke2*
> 
> And if I slightly overclock the Tri-X OC, the CPU still won't be bottlenecking?


Assuming you're not running a gimped screen (if you're spending that much on a CPU and GPU, I would hope you're gaming at at least 1920x1080), still no bottleneck, no. The GPU will still be the limiting factor for fps in games.


----------



## sugarhell

Anyone with a 290X Vapor-X? I want to ask some questions


----------



## Mega Man

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HoneyBadger84*
> 
> Guess I'll leave my question that's gone half-answered for about 7 pages now for some other time lol. Nevermind.
> 
> 
> 
> Next time just ask your question and leave the pseudo-expert opinions out of it? At this point I don't know what your question is, except that you were repeating the same misinformation as people who have had very little experience using AMD reference cards. Here's a tip: AMD reference designs are overbuilt. There are no flaws in the reference design, unlike Nvidia's reference boards, which are cheap and locked down. That's why customs are so popular with Nvidia cards. That same mentality is what drives people to posit that AMD custom cards are also superior to AMD reference cards.
> 
> That is SIMPLY NOT TRUE.

All I would like to add is simply this: how long has AMD been making cards now? We have at least the last two generations to look through, and if they've been any indication, ref cards are just fine. Besides that, this is well put.

(I know they have been making them for longer than the last two gens, but the point is they are still going strong.)


----------



## hwoverclkd

Quote:


> Originally Posted by *Dasboogieman*
> 
> Well I just thought some extra knowledge can never hurt and help us see if there were any ways to make AMD cards better, in light of the weak AMD reference cooler.
> My sincere apologies, I won't be posting any more NVIDIA stuff if it offends everyone's sensibilities. I guess I was too naive to expect people to be able to put aside brand loyalty and appreciate all corners of technology, regardless of brand.


To me, what you posted is just 'incongruous'... there's a perfect place for it, but not here. Although I own both AMD and Nvidia cards, I don't side with either brand


----------



## hwoverclkd

Quote:


> Originally Posted by *sugarhell*
> 
> Anyone with a 290x vapor-x? Want to ask some questions


@JeremyFenn and @KeepWalkinG


----------



## HoneyBadger84

Quote:


> Originally Posted by *tsm106*
> 
> Next time just ask your question and leave the pseudo-expert opinions out of it? At this point I don't know what your question is, except that you were repeating the same misinformation as people who have had very little experience using AMD reference cards.


That was actually exactly my point. I did JUST ask my question, several pages ago, more than once, before this yammering about the boards went on and on, and got no answer. Then I proceeded to get hammered with "no" and "you're wrong" without any real proof, on a subject I hadn't even posed a question about. I've yet to hear much of the "I've personally used this and can vouch for it" that the majority of my statements come from. For instance, the Gigabyte card being a reference PCB with customizations is something I know for a fact, as I personally compared the two side by side: a reference XFX card with the stock AMD cooler on it, and the Gigabyte WindForce. I could easily tell there were a significant number of additions and replacements. My statement about the PCS+ card was similarly based on something I'd read from a trusted review site, which I linked to.

From my perspective, y'all are chattering on about this and I've seen no "proof". Instead, I'm accepting that, most likely, you may have used the things yourselves and know what you're talking about. This isn't particularly directed at you, since you haven't really said much one way or the other, but others have, without a whole lot to back it up, from my point of view.

This conversation started out being about coolers. To be perfectly honest, I was corrected, wrongly, on two occasions early on, and proved said corrections wrong (the PCS+ was a reference board with add-ons, which I stated and was denied). Then I was told I'm "spreading misinformation" about reference PCBs when I was doing nothing of the sort...

That's my two cents on the subject, and from here I'm moving on. I'll just ignore any further comment on the topic, since it's been blown out of proportion and ridiculously dramatized at this point.

I did get a general, but acceptable, basis for an answer to my main question elsewhere, so at least there's that.

Edit: And I just have to point this out because it's bugging me: I never said the custom boards are outright "better". I said they OC better within the limits of the testing done, which is true in the Guru3D reviews I read of the PCS+, WindForce Edition, and Sapphire Tri-X, simple as that. I'm not saying the reference board CAN'T OC as well with adequate cooling. I'm saying that with the stock cooler vs the aftermarket coolers, the aftermarket-cooled cards, especially with their modified PCBs in mind, do seem to handle higher clocks more easily than the completely reference cards tested.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Red1776*
> 
> The MSI "Military Grade" claim comes from the documentation they provide citing the actual military standards for field conditions. I researched this as far as I was able, mind you, and I believe that technically they do qualify. But again, I am relying on what I was able to find.


Quote:


> Originally Posted by *pdasterly*
> 
> I have an HP EliteBook that's military grade and I can vouch for that; she has lasted longer than any laptop I've owned and is still going strong. I replaced the HDD with an SSD, so it has no moving parts except for the CPU fan. It's over 5 years old and works great. I do automotive work with it and really abuse it. EliteBook 2730p. Unlike most HP items, this was built to last


Well, on laptops I can vouch for that as well; I've seen how much they can take myself. I was referring specifically to "military grade" hardware being touted as a thing on video cards that makes any form of meaningful difference. For motherboards, and more significantly laptops, it's certainly a nice "certification" to have, so to speak. Kinda like those crazy ruggedized laptops they make... what's it called... most of them are yellow, the majority are waterproof, drop-damage-resistant, that sort of thing... I dunno, I've been up almost 24hrs. Sleepy time.


----------



## JeremyFenn

Quote:


> Originally Posted by *sugarhell*
> 
> Anyone with a 290x vapor-x? Want to ask some questions


I have a Sapphire Vapor-X Tri-X R9 290X OC edition (UEFI). I could maybe help, although I haven't gotten around to playing with it much yet.


----------



## smoke2

Quote:


> Originally Posted by *shwarz*
> 
> Assuming you're not running a gimped screen (if you're spending that much on a CPU and GPU, I would hope you're gaming at at least 1920x1080), still no bottleneck, no. The GPU will still be the limiting factor for fps in games.


I will probably be buying a new monitor, 24" or 27"; the highest resolution would then be 2560x1440.
Do you think it will still be OK?


----------



## JeremyFenn

OK, so the Sapphire Trixx program is a bunch of crap, tbh. Using MSI AB I've set Core Voltage +100 and Power Limit +50, with the core clock up 100MHz (whoo hoo lol) to 1180; that's due to artifact blocking starting around 1190MHz. I've also got the memory up to 1450MHz, which is 5800MHz effective, a 200MHz bump from what the OC edition gives me. Using a more aggressive fan profile (0% @ 0C, 50% @ 50C, then 100% @ 70C) I haven't seen this thing get past 70C (actually not past 68C). If anyone has a stress tester for the GPU memory, I'd appreciate that. The only things I've been using are the GPU-Z render test, OCCT (which doesn't use much memory either), and just video games (Thief and Watch Dogs on all max settings, which I figured would eat up a lot of VRAM). Anyway, someone was trying to OC theirs and didn't know about voltage, so that's what I've gotten so far.
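A custom fan profile like that one is just piecewise-linear interpolation between (temperature, duty) points. A quick Python sketch of how a tool like Afterburner might evaluate such a curve — the points are the ones from the post above, and the code is purely illustrative, not Afterburner's actual implementation:

```python
def fan_speed(temp_c, curve=((0, 0), (50, 50), (70, 100))):
    """Linearly interpolate fan duty (%) from (temp C, duty %) points."""
    pts = sorted(curve)
    if temp_c <= pts[0][0]:
        return pts[0][1]          # below the curve: clamp to first point
    if temp_c >= pts[-1][0]:
        return pts[-1][1]         # above the curve: clamp to last point
    for (t0, d0), (t1, d1) in zip(pts, pts[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two bracketing points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(60))  # halfway between 50C/50% and 70C/100% -> 75.0
```

So a 68C load temp on this curve lands at 95% duty, which is why the card never gets the chance to pass 70C.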


----------



## Widde

Anyone tried and gotten any OC gains by switching to the Asus BIOS or PT1? Getting fed up with only being able to do 1080 on the core.

Can't get access to the voltage control in GPU Tweak either.


----------



## Dasboogieman

Quote:


> Originally Posted by *Widde*
> 
> Anyone tried and got any oc gains by switching to the asus bios or pt1? Getting fed up with only being able to do 1080 on the core
> 
> 
> 
> 
> 
> 
> 
> Cant get access to the voltage control in gpu tweak either


It's really unusual for your GPU to have a locked-voltage BIOS. Your low OC may be due to a lower stock voltage to begin with. The PT1 BIOS sets a flat 1.25V (which may actually be an overvoltage if your card began with a lower stock Vcore on the stock BIOS) with unlimited overvoltage (which may still not work).


----------



## Forceman

Quote:


> Originally Posted by *Widde*
> 
> Anyone tried and got any oc gains by switching to the asus bios or pt1? Getting fed up with only being able to do 1080 on the core
> 
> 
> 
> 
> 
> 
> 
> Cant get access to the voltage control in gpu tweak either


I set my highest overclocks on the Asus BIOS using GPU Tweak, but that's just because of the extra voltage control. Otherwise the clocks were the same using AB.


----------



## Dasboogieman

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Well, on laptops I can vouch for that as well; I've seen how much they can take myself. I was referring specifically to "military grade" hardware being touted as a thing on video cards that makes any form of meaningful difference. For motherboards, and more significantly laptops, it's certainly a nice "certification" to have, so to speak. Kinda like those crazy ruggedized laptops they make... what's it called... most of them are yellow, the majority are waterproof, drop-damage-resistant, that sort of thing... I dunno, I've been up almost 24hrs. Sleepy time.


You mean the Panasonic Toughbooks? Those are definitely built for harsh conditions; petroleum engineers love 'em for field work.

For GPUs, how much of it is marketing and how much is true is difficult to determine unless we can inspect the components directly. The true military-class stuff, I suppose, must go through some kind of qualification process, which I doubt can be applied to GPUs. Anyhoo, I wouldn't be able to tell the difference between military inductors/VRMs/capacitors and simply over-engineered ones.

I have read about a past Galaxy GTX HOF edition with the white PCB (I don't remember if it was the 680, 780, or 780 Ti) that had military-class components; basically, the stuff on the PCB was fit for aircraft avionics. However, its VRM assembly was also notorious for exploding and general failures under overvoltage conditions. Since then, I've taken the "military class" marketing term with a grain of salt.


----------



## JeremyFenn

I set the core voltage and power limit all the way up in AB and set constant voltage in the settings (I also have a custom fan profile), and was able to get 1180 on the core and 1450 or 1475 on memory. My stock OC was 1080, though, on my Sapphire Vapor-X. Is it just getting too hot, maybe?


----------



## Widde

Quote:


> Originally Posted by *Dasboogieman*
> 
> It's really unusual for your GPU to have a locked-voltage BIOS. Your low OC may be due to a lower stock voltage to begin with. The PT1 BIOS sets a flat 1.25V (which may actually be an overvoltage if your card began with a lower stock Vcore on the stock BIOS) with unlimited overvoltage (which may still not work).


Might be a lower core voltage.

Doing +100mV in AB for 1080. Gonna try the PT1 BIOS with care though, still on ref coolers.

Thinking of mounting an AC at the case intake, though. Have one lying around.


----------



## Dasboogieman

Quote:


> Originally Posted by *Widde*
> 
> Might be a lower core voltage
> 
> 
> 
> 
> 
> 
> 
> Doing 100+mV in AB for 1080, Gonna try the pt1 bios with care though, still on ref coolers
> 
> 
> 
> 
> 
> 
> 
> Thinking of mounting an AC at the intake of the case though. Have one laying around


Out of curiosity, what are your stock voltages and ASIC quality?


----------



## Widde

Quote:


> Originally Posted by *Dasboogieman*
> 
> Out of curiosity, what are your stock voltages and ASIC quality?


1.188v load @Stock, ASIC 69.4%


----------



## Dasboogieman

Quote:


> Originally Posted by *Widde*
> 
> 1.188v load @Stock, ASIC 69.4%


OK, I'm gonna get totally crucified for bringing this up, but I believe there is a moderate correlation between ASIC quality, stock voltage, and overclocking, simply based on what I've observed from my own cards and the reliable data published in this thread.

With 1.188V under load, accounting for Vdroop, your stock voltage bin is around 1.24V; this is identical to my MSI 290 Gaming.
However, my Gaming card can do 1100MHz out of the box because it has an ASIC quality of 79.6%. My Sapphire Tri-X has the 1.18V bin (about 1.14V under load) with 78.9% ASIC, which means it needs about +50mV (i.e. ~1.19V under load) to do 1100MHz.
This is too consistent to be a coincidence.

If you need +100mV to do 1080MHz, then I think it is simply because you have a high-leakage GPU. This is where the data gets foggy and unreliable, as everyone has different standards for stability (plus the expected exaggeration and bragging that tends to accompany overclock result publishing). I'm leaning toward the ultra-conservative, high-stability side since I'm not much of a bencher.

From what I've observed, my Tri-X (the weaker of the two) needs +125mV to do 1150MHz and hits a clockspeed wall at 1200MHz (I've tried +168mV but it's unstable and I'm limited by VRM cooling, though the point of stability might be around +200mV). Based on this limited data set, it points to an almost exponential progression in voltage vs clockspeed.
Thus, I believe that high-ASIC-quality (i.e. low-leakage) chips clock better under light voltage conditions but require much more voltage to operate outside of normal parameters, and often they also hit clockspeed walls. This is consistent with observations from Ivy Bridge (low leakage) overclocking vs Sandy Bridge.

Extrapolating the data:
So what conclusion can I draw about your GPU? I believe high-leakage parts like yours do worse under stock and light conditions but require less of a voltage ramp to reach higher clockspeeds; plus, they are less prone to clockspeed walls. *Basically, your card is a poor candidate for air-cooled operation but an excellent candidate for the PT1 BIOS with watercooling.*

This is somewhat supported by the data from early Hawaii adopters (who mostly had high-leakage parts due to a young manufacturing process), who could easily exceed 1300MHz+ under water.

Feel free to critique my thoughts; I might be generalizing too much, but this is the loose trend I've been seeing.
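For what it's worth, the "almost exponential" claim can be sanity-checked from the two Tri-X data points above (+50mV at 1100MHz, +125mV at 1150MHz). A rough Python sketch; the exponential model `offset = a * exp(b * clock)` is my assumption for illustration, not measured behavior:

```python
import math

# Voltage offsets (mV) vs core clock (MHz) from the Tri-X example above
points = [(1100, 50), (1150, 125)]

# Fit offset = a * exp(b * clock) exactly through the two points
(c0, v0), (c1, v1) = points
b = math.log(v1 / v0) / (c1 - c0)   # growth rate per MHz
a = v0 / math.exp(b * c0)

def predict_offset(clock_mhz):
    """Extrapolated voltage offset (mV) the model expects at a given clock."""
    return a * math.exp(b * clock_mhz)

# Extrapolating to the reported 1200MHz wall gives ~312mV, far beyond the
# +168mV that was actually tried, consistent with hitting a "wall" on air
print(round(predict_offset(1200), 1))  # -> 312.5
```

With only two points this proves nothing, of course; it just shows that if the voltage requirement really grows exponentially, the 1200MHz wall is exactly where you'd expect it.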


----------



## Widde

Quote:


> Originally Posted by *Dasboogieman*
> 
> OK I'm gonna get totally crucified for bringing this up but I believe there is a moderate correlation between ASIC quality, stock voltage and overclocking simply based off what I've observed from my own cards and the reliable data published on this thread.
> 
> With 1.188v under load, accounting for Vdroop, this means that your stock VDDCI is around 1.24 voltage bin, this is identical to my MSI 290 Gaming.
> However, my Gaming card can do 1100mhz out of the box because it has an ASIC quality of 79.6%. My Sapphire Tri X has the 1.18 voltage bin (about 1.14V under load) with 78.9% ASIC which means it needs about +50mV (i.e. ~1.19V under load) to do 1100mhz.
> This is too consistent to be a coincidence.
> 
> If you need +100mv to do 1080Mhz then I think it is simply because you have a high leakage GPU. This is where the data gets foggy and unreliable as everyone has different standards for stability. I'm leaning towards the ultra conservative high stability side since I'm not much of a bencher.
> 
> From what I've observed, my Tri X (the weaker of the two) needs +125mV to do 1150mhz and hits a clockspeed wall at 1200mhz (I've tried +168mV but its unstable and I'm limited by VRM cooling, though the point for stability might be around +200mV). Based on the limited data set, this is pointing to an almost exponential progression in voltage vs clockspeed.
> Thus, I believe that high ASIC quality (i.e. low leakage) chips clock better under light voltage conditions but they require much more voltage to operate outside of normal parameters, often they also possess clockspeed walls. This is consistent with observations from Ivy Bridge (low leakage) overclocking vs Sandy Bridge.
> 
> Extrapolating the data
> So what conclusion can I draw from your GPU? I believe high leakage parts like yours do worse under stock and light conditions but they require less of a voltage ramp to reach higher clockspeeds, plus, they are also less prone to clockspeed walls. Basically, your card is a poor candidate for air cooled operation but is an excellent candidate for the PT1 BIOS with watercooling.
> 
> This is somewhat supported by the data from the early Hawaii adopters (who mostly had high leakage parts due to a young manufacturing process) who can easily exceed 1300mhz + under water.
> 
> Feel free to critique my thoughts, I might be generalizing too much but this is the loose trend I've been seeing.


It is pretty early cards so you might have a point there ^^


----------



## HoneyBadger84

Question: where does one find out their ASIC rating?

I've gotten my Gigabyte WindForce, as well as one of the XFX stock blower cards, to run 1150MHz solidly enough for prolonged benchmarking sessions, so it would be sad indeed if someone was stuck at 1080MHz on a card with a better cooler.

I'm gonna be fiddling with my Tri-X and the WindForce in Crossfire more today to see how low I can run the voltage at 1100MHz. So far I've only tested it at +50mV, and it made it through 2 1/2 loops of Unigine Valley on Extreme HD with no artifacts. Given both cards start out at 1040 core, I think +50mV is more than I actually need, but I also want to be able to run the vRAM at a higher speed than stock.

I should have a second Tri-X in early next week; then I'll run those together and see what they can do, since the WindForce doesn't like running as the top card much, even with how cool the Tri-X runs under it (58-61C max load, VRMs never passed 56C/50C respectively).


----------



## Forceman

The problem with the ASIC theory is that AMD has already said it's meaningless on Hawaii cards. Basically, GPU-Z is reading some random register and making assumptions based on it, but that register doesn't mean what they think it means. There's a quote about it from Dave Baumann floating around somewhere.


----------



## heroxoot

Quote:


> Originally Posted by *Widde*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dasboogieman*
> 
> Out of curiosity, what are your stock voltages and ASIC quality?
> 
> 
> 
> 1.188v load @Stock, ASIC 69.4%

My ASIC is 74.1 but I have the same stock voltage. Kind of weird, because my GPU core clock is higher than the 290's by 30 at the least. I guess they use one voltage to rule them all.

74% ASIC seems good tho. My last GPU was like 79% tho.


----------



## tsm106

Quote:


> Originally Posted by *tsm106*
> 
> http://forum.beyond3d.com/showpost.php?p=1808073&postcount=2130
> 
> Quote:
> 
> 
> 
> ASIC "quality" is a misnomer propobated by GPU-z reading a register and not really knowing what it results in. It is even more meaningless with the binning mechanism of Hawaii.


----------



## HoneyBadger84

Random meaningless numbers op. Lol


----------



## Dasboogieman

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Random meaningless numbers op. Lol


Well, whatever it's reading, the trend is there once you cut out all the crud and false results, even if it only acts as a guideline.

It would be really easy to disprove the theory. You just need to find an example of a <70% ASIC card with 1.25V stock voltage that can do 1100MHz with no extra voltage.
Or, conversely, find an example of an 80% ASIC card that can't do 1100MHz unless you feed it +100mV.

Quote:


> Originally Posted by *heroxoot*
> 
> My asic is 74.1 but I have the same stock voltage. Kind of weird because my gpu core is higher than the 290 by 30 at the least. I guess they use one voltage to rule them all.
> 
> 74% asic seems good tho. My last GPU was like 79% tho.


Yeah, there seems to be no correlation between ASIC quality and stock voltage, which is why the results seem so random. But what matters is what kind of overclocks can be achieved at similar volts.


----------



## tsm106

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HoneyBadger84*
> 
> Random meaningless numbers op. Lol
> 
> 
> 
> Well, whatever it's reading, the trend is there once you cut out all the crud and false results, even if it only acts as a guideline.

There's no trend lol. You guys need to search this thread more.


----------



## Red1776

I surmise that what we have with ASIC quality and 'trends' is a classic example of

*correlation as opposed to causation.*

I go through a lot of GPUs, and the majority of the best OCers have been around 70%, +/-.

From everything I have read about ASIC quality, while the strong-overclocking GPUs have correlated with an ASIC quality of around 70%, there is nothing to demonstrate causation.

It could be luck of the draw, a disproportionate number of 70% ASIC GPUs on the market, or other non-causal factors.
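To make the point concrete: with a handful of cards, the sample correlation between ASIC percentage and stable clock can easily come out moderately positive without implying anything causal. A minimal Pearson-r sketch; the four (ASIC, clock) pairs are invented for illustration, loosely echoing numbers quoted in this thread, not real measurements:

```python
def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical cards: ASIC % vs stable core MHz -- illustrative only
asic = [69.4, 74.1, 78.9, 79.6]
clock = [1080, 1100, 1150, 1100]

# A moderate positive r on four samples demonstrates nothing about cause
print(round(pearson_r(asic, clock), 2))
```

A correlation this size on four data points is well within what random draws produce, which is exactly the "correlation, not causation" caveat.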


----------



## steverebo

Guys, just got my Sapphire 290X Tri-X. When I overclock the card and do a cold boot, I get a black screen and have to hard reset to get it to boot into Windows; but if I reset the clocks, it's fine. The overclock is 100 percent stable and it's under water, so temps never go above 50 degrees at full load for hours. I'm using MSI AB for overclocking and letting it apply the OC at startup.

Any ideas what is causing this and how to fix it?


----------



## HoneyBadger84

Quote:


> Originally Posted by *steverebo*
> 
> Guys, just got my Sapphire 290X Tri-X. When I overclock the card and do a cold boot, I get a black screen and have to hard reset to get it to boot into Windows; but if I reset the clocks, it's fine. The overclock is 100 percent stable and it's under water, so temps never go above 50 degrees at full load for hours. I'm using MSI AB for overclocking and letting it apply the OC at startup.
> 
> Any ideas what is causing this and how to fix it?


Don't apply it at startup. That's very likely the issue.


----------



## steverebo

Is there any way to get the overclock to run 24/7 without this issue? I've tried changing to Trixx and it does the same thing.


----------



## Gobigorgohome

I am in the middle of testing three R9 290X reference cards in trifire right now, and I still do not understand what people are talking about with them being so loud; for reference cards, I think it is very good. Running BF4 at ultra without AA at 4K and getting a stable 60+ fps. I guess the fourth card will take that FPS even a little higher.

It was very easy to set up, and I am noticing some tearing, but WAY less than I expected (I actually saw more tearing with just two cards). The heat, though, is pretty insane, but on air I will only run stock clocks and leave everything stock until I am on water. Very happy with the purchase thus far!


----------



## HoneyBadger84

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I am in the middle of testing three R9 290X reference cards in trifire right now, and I still do not understand what people are talking about with them being so loud; for reference cards, I think it is very good. Running BF4 at ultra without AA at 4K and getting a stable 60+ fps. I guess the fourth card will take that FPS even a little higher.
> 
> It was very easy to set up, and I am noticing some tearing, but WAY less than I expected (I actually saw more tearing with just two cards). The heat, though, is pretty insane, but on air I will only run stock clocks and leave everything stock until I am on water. Very happy with the purchase thus far!


Do me a favor. Load up Afterburner or Trixx, set the fan speed to 100%, click apply, then tell me again they aren't that loud.

Lol, I've used two XFX Core Editions at once, and the leaf blower description I use is accurate at that fan speed.

If you're letting it max out at 55%, like they do by default, then yeah, they aren't too bad; that I agree with.


----------



## Gobigorgohome

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Do me a favor. Load up Afterburner or Trixx, set the fan speed to 100%, click apply, then tell me again they aren't that loud.
> 
> 
> 
> 
> 
> 
> 
> Lol, I've used two XFX Core Editions at once, and the leaf blower description I use is accurate at that fan speed.
> 
> If you're letting it max out at 55%, like they do default, yeah, they aren't too bad, that I agree with.


I have NO intentions of installing any kind of 3rd-party overclocking software before I get these babies on water. Why would I turn the fan speed up to 100% anyway? You do what you want; these cards will be stock until they are on water, and that is the end of it.

By the way, I just ordered 4x EK-FC R9 290X Acetal+Nickel blocks and 4x EK-FC R9 290X black backplates, together with an EK-FC Terminal Quad, Semi-Parallel. I hope I get them by next weekend, because I do not work then!


----------



## bond32

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I have NO intentions of installing any kind of 3rd-party overclocking software before I get these babies on water. Why would I turn the fan speed up to 100% anyway? You do what you want; these cards will be stock until they are on water, and that is the end of it.
> 
> By the way, I just ordered 4x EK-FC R9 290X Acetal+Nickel blocks and 4x EK-FC R9 290X black backplates, together with an EK-FC Terminal Quad, Semi-Parallel. I hope I get them by next weekend, because I do not work then!


Lol, relax man, he was just asking a favor... No need to pull the ole "dad lecturing his 16 year old daughter" tone...


----------



## Google Reader

Hi guys, which one of the R9 290s would you recommend for an NZXT H2 case? I've heard that the XFX one needs really good airflow, but it's also one of the most silent ones. Would the WindForce be better?


----------



## mAs81

Hey guys,just a quick question...
Fortunately I got my black screen problem all sorted out(I think)








Is it normal for the memory clock to be stuck at 1400 constantly?


----------



## JordanTr

Ok, I got my Arctic Xtreme IV installed and decided to play with OC. My Sapphire R9 290 has an ASIC of 75.2% and 1.218V on load. I was able to pass the Valley and Heaven benches at 1150/1400 with no voltage added. With +100mV I was able to do 1200/1500, but decided to keep temps in check and dialed back to 1100/1400 with no extra voltage, just +50% power limit. I also played a few hours of Watch Dogs absolutely fine. Max core temp 63, VRM1 82, VRM2 63. Ambient 25. Used only what Arctic gave me, no extra sinks. Stock clocks max 58 core, VRM1 73, VRM2 60. Not so bad I guess. I got a Corsair AF140 Quiet Edition on my side panel blowing air on the Arctic's backplate.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Google Reader*
> 
> Hi guys, which of the R9 290s would you recommend for an NZXT H2 case? I've heard that the XFX one needs a goooood airflow, but it's also one of the most silent ones. Will the Windforce be better?


If it's a single XFX DD card then you will be fine at stock settings.

If you plan on overclocking it with that case then I'd go for a Sapphire Tri-X or Gigabyte Windforce 3.

The XFX DD cards need good airflow when they are in Crossfire or in a cramped case.......or you could just grab a Ref card and slap a NZXT G10 on there


----------



## JeremyFenn

I wouldn't do that. Cold air meets warm parts = condensation = broken electronics.


----------



## Sgt Bilko

Quote:


> Originally Posted by *mAs81*
> 
> Hey guys,just a quick question...
> Fortunately I got my black screen problem all sorted out(I think)
> 
> 
> 
> 
> 
> 
> 
> 
> Is it natural for the Memory clock to be stuck at 1400 constantly?

Mine is stuck at 1250MHz constant due to my monitor being overclocked, but I think it's also got something to do with having two monitors connected or the way they are connected.....not 100% sure about that though.

How many monitors do you have and what res/hz?


----------



## bond32

Quote:


> Originally Posted by *JeremyFenn*
> 
> I wouldn't do that. Cold air meets warm parts = condensation = broken electronics.


LOL... This made me laugh.


----------



## Sgt Bilko

Quote:


> Originally Posted by *JeremyFenn*
> 
> I wouldn't do that. Cold air meets warm parts = condensation = broken electronics.


You've lost me


----------



## mAs81

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Mine is stuck at 1250Mhz constant due to my monitor being overclocked but i think it's also got something to do with having two monitors connected or the way they are connected.....not 100% sure about that though.
> 
> How many monitors do you have and what res/hz?


You might be right.. I currently have 2 monitors, one @ 60Hz 1680x1050 and another @ 60Hz 1920x1080.
So it's normal I guess? Because I've seen reviews in which the clocks fluctuate..


----------



## Sgt Bilko

Quote:


> Originally Posted by *mAs81*
> 
> You might be right.. I currently have 2 monitors, one @ 60Hz 1680x1050 and another @ 60Hz 1920x1080.
> So it's normal I guess? Because I've seen reviews in which the clocks fluctuate..


When my monitors are at 2560x1440 110hz and 1920x1080 60hz my memory clock will stay at max but if i drop my 1440p monitor back to 60hz the clocks drop.

how are you connecting them?

Curious about this.


----------



## mAs81

Well, one monitor is connected via HDMI and the other via DVI.. My main one (1920x1080) is a TV, and it's the HDMI one...


----------



## Sgt Bilko

Quote:


> Originally Posted by *mAs81*
> 
> Well, one monitor is connected via HDMI and the other via DVI.. My main one (1920x1080) is a TV, and it's the HDMI one...


I'm willing to bet if you disabled or turned off one of the monitors then it would switch to 2d clocks.

Seems this is the way for multi-monitor, they keep the mem clock high to avoid flickering and such.

Any eyefinity users want to chime in and add some experiences here?

Would be nice to know if this is the norm or not


----------



## JeremyFenn

I swear I saw someone say (on my phone) that they were going to put an air conditioner up to their intake fans lol. I actually know a guy at the office who did the exact same thing: had a portable AC unit ducted to the rear of his PC (his rear fan was obviously an intake) and he fried his PC due to the condensation. Anyway, I swear it wasn't just randomness. I know it's early here and I don't have my glasses on, but I SAW IT !!!! (That's my story and I'm sticking to it)


----------



## Sgt Bilko

Quote:


> Originally Posted by *JeremyFenn*
> 
> I swear I saw someone say (on my phone) that they were going to put an air conditioner up to their intake fans lol. I actually know a guy at the office who did the exact same thing: had a portable AC unit ducted to the rear of his PC (his rear fan was obviously an intake) and he fried his PC due to the condensation. Anyway, I swear it wasn't just randomness. I know it's early here and I don't have my glasses on, but I SAW IT !!!! (That's my story and I'm sticking to it)


Might want to ask @HOMECINEMA-PC about that.

Pretty sure he hasn't had a single issue with PC's and AC









And yeah, i remember something about it but where your post was in the midst made no sense to me


----------



## JeremyFenn

lol like I said it's early here and I didn't have my glasses, PLUS I was on my phone writing that so maybe I misread something. Anyway, I read people getting something crazy on their memory like 1600 on their 290(x) and at least 1200 on the core. You think they're using a hardware mod or just a simple BIOS mod?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I am in the middle of testing three R9 290X reference cards in trifire right now, still do not understand what people are talking about it being so loud, for reference cards I think it is very good. Running BF4 at ultra without AA at 4K and getting stable 60 fps +. I guess the fourth card will take that FPS even a little higher again.
> 
> It was very easy to setup and I am noticing some tearing, but WAY less than I thought (actually saw more tearing with just two cards), the heat though is pretty insane, but on air I will only do stock clock and leave everything stock until I am on water. Very happy with the purchase this far!


One problem: if you don't monitor clock speed you are probably not pushing them 100%. If you leave everything stock you are for sure getting the cards to throttle. I got 2 reference 290Xs to throttle in BF4 before I had to increase fan speed.


----------



## Sgt Bilko

Quote:


> Originally Posted by *JeremyFenn*
> 
> lol like I said it's early here and I didn't have my glasses, PLUS I was on my phone writing that so maybe I misread something. Anyway, I read people getting something crazy on their memory like 1600 on their 290(x) and at least 1200 on the core. You think they're using a hardware mod or just a simple BIOS mod?


I can hit 1250/1500 on my 290's on the stock XFX bios.

just some extra voltage through Trixx or MSI Afterburner and clock them up


----------



## Dasboogieman

Quote:


> Originally Posted by *Google Reader*
> 
> Hi guys, which of the R9 290s would you recommend for an NZXT H2 case? I've heard that the XFX one needs a goooood airflow, but it's also one of the most silent ones. Will the Windforce be better?


*Are you planning to crossfire?*

If you are then you can consider below:
Do you perhaps have a side panel that can mount an extraction fan? From the website, I only see a plain side panel, 2x120mm at the front and one 120mm at the rear. Your enemy will be heat buildup, especially from cards that have fins that blast hot air sideways because the top card has to operate with hot air from the bottom card.

Blower fans might help a lot with the heat buildup (i.e the reference one) but I don't think you will be perfectly satisfied with the noise vs performance. That being said, your case is designed to dampen and contain internal noise.

Alternatively, if you want AIB, you might want to look at the HIS R9 290 IceQ. I mention it because it has a vertical heatsink fin pattern which minimises hot air being blasted directly from the bottom card to the top card, with about 30% being exhausted at the rear. However, this card is quite rare from what I've seen, I also don't personally know anyone who has one.

Your case is tricky because its a positive pressure system with a priority on low noise rather than brute airflow.

*If you aren't planning to crossfire, then heat buildup is a non-issue
*
I'd consider the PowerColor PCS+. It takes 3 slots but has excellent cooling + silent operation as a result, great price too. Plus, it can be very readily cooled with the NZXT G10 mod (it has independent VRM1 and VRM2 cooling) or even full scale water cooling in the future should you wish.

Windforce had a bad rep early ish on from poor performance, I don't know if it has been resolved. If it has then its a viable contender though not quite as good as the PCS+. I believe it was a little weak on the core cooling at low fan RPMs compared to the other 3 fan coolers.

Sapphire Tri X, Vapor X, XFX DD and MSI Gaming are all viable silent alternatives too. I'll walk you through each one's characteristics.

The Tri X is 2 slots so you won't lose as many expansion slots, similar pricing to the PCS+ though has weaker VRM cooling and slightly worse core cooling. You can place an EK backplate if you can get your hands on M2.5 8mm screws to improve the VRM cooling. You can get those screws from Helipal. The fans are dead silent up to 50% (in my HAF 932 case with a mesh side panel) you might not be able to hear them up to 60% with your dampened case.

The Vapor X is about 2.5 slots, similar cooling capacity to the PCS+, has much much stronger VRMs which are better cooled. Biggest issue IIRC is cost. I heard EK are releasing a waterblock for this card so getting one doesn't necessarily preclude you from WC.

The XFX DD is extremely silent and compact. Biggest weakness is that the cooler has one of the least cooling capacities so you will have higher core and VRM temps. Actually, these are quite cheap, most of them are on special here in Australia. Bilko can better fill you in on these since he has one.

The MSI Gaming is an interesting card, it has greater cooling capacity than the DD but not as much as the triple fan cards yet is more compact but not as compact as the DD. The fans are really quiet up to about 60% then it introduces a very characteristic high pitched woosh thereafter. It's biggest strength to me is the excellent cooling capacity to volume ratio on this card, however, theres 2 versions available and you pretty much won't be able to tell. Theres a stock standard reference version (which means it can be watercooled) with the Twin Frozr cooler, however, theres also another custom Gaming version which has a slightly upgraded VRM assembly. The one with the upgraded VRM sacrifices the ability to be watercooled but it has much better VRM temperatures, overvoltage headroom and more robust protection against coil whine.

Or you can go balls out and get the Lightning, pretty much has no drawbacks except cost.


----------



## JeremyFenn

Oh I have. On AB I'm running core voltage +100 and power limit +50 (all the way up), but I can only get my core clock around 1180 without blocking or freezing, and memory clock at 1450... When monitoring the VDDC/VDDCI I'm only pulling 1.195V / 1.0V


----------



## Sgt Bilko

Quote:


> Originally Posted by *JeremyFenn*
> 
> Oh I have. On AB I'm running core voltage +100 and power limit +50 (all the way up), but I can only get my core clock around 1180 without blocking or freezing, and memory clock at 1450... When monitoring the VDDC/VDDCI I'm only pulling 1.195V / 1.0V


My better card needs +100mV for 1200/1500 and +160mV for 1250/1500, the other needs a bit more volts for those clocks








Quote:


> Originally Posted by *Dasboogieman*
> 
> Are you planning to crossfire?
> 
> If you are then you can consider below:
> Do you perhaps have a side panel that can mount an extraction fan? From the website, I only see a plain side panel, 2x120mm at the front and one 120mm at the rear. Your enemy will be heat buildup, especially from cards that have fins that blast hot air sideways because the top card has to operate with hot air from the bottom card.
> 
> Blower fans might help a lot with the heat buildup (i.e the reference one) but I don't think you will be perfectly satisfied with the noise vs performance. That being said, your case is designed to dampen and contain internal noise.
> 
> Alternatively, if you want AIB, you might want to look at the HIS R9 290 IceQ. I mention it because it has a vertical heatsink fin pattern which minimises hot air being blasted directly from the bottom card to the top card, with about 30% being exhausted at the rear. However, this card is quite rare from what I've seen, I also don't personally know anyone who has one.


The HIS cards are fairly rare, Probably because of the colour choices









Quote:


> If you aren't planning to crossfire, then heat buildup is a non-issue
> 
> I'd consider the PowerColor PCS+. It takes 3 slots but has excellent cooling + silent operation as a result, great price too. Plus, it can be very readily cooled with the NZXT G10 mod or even full scale water cooling in the future should you wish. Windforce had a bad rep early ish on from poor performance, I don't know if it has been resolved. Sapphire Tri X, Vapor X, XFX DD and MSI Gaming are all viable silent alternatives too.


The PCS+ is a great cooler, the brackets take up 3 slots but the cooler takes up 2.5 slots.
Quote:


> The Tri X is 2 slots so you won't lose as many expansion slots, similar pricing to the PCS+ though has weaker VRM cooling and slightly worse core cooling
> The Vapor X is about 2.5 slots, similar cooling capacity to the PCS+, has much much stronger VRMs which are better cooled. Biggest issue IIRC is cost.
> The XFX DD is extremely silent and compact. Biggest weakness is that the cooler has one of the least cooling capacities so you will have higher core and VRM temps. Actually, these are quite cheap, most of them are on special here in Australia


Tri-X is pretty much a staple in this thread for good reason; it's a great card with good cooling for a decent price

Vapor-X is pretty much the best you can get besides a Lightning (290x only)

The XFX DD cards are quite good for noise (my case fans are louder); the cooling isn't as strong as the Tri-X, especially on the VRM side of things.
I've been thinking about getting a 3rd actually since they are quite cheap atm......I'd need another Monitor (or 3)








Quote:


> The MSI Gaming is an interesting card, it has greater cooling capacity than the DD but not as much as the triple fan cards yet is more compact but not as compact as the DD. The fans are really quiet up to about 60% then it introduces a very characteristic high pitched woosh thereafter. It's biggest strength to me is the excellent cooling capacity to volume ratio on this card, however, theres 2 versions available and you pretty much won't be able to tell. Theres a stock standard reference version with the Twin Frozr cooler, however, theres also another custom Gaming version which has a slightly upgraded VRM assembly. The one with the upgraded VRM sacrifices the ability to be watercooled but it has much better VRM temperatures, overvoltage headroom and more robust protection against coil whine.
> 
> Or you can go balls out and get the Lightning, pretty much has no drawbacks except cost.


i have no experience with the Gaming cards apart from yours, Red and Hexoroot's opinions on them


----------



## Dasboogieman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Might want to ask @HOMECINEMA-PC about that.
> 
> Pretty sure he hasn't had a single issue with PC's and AC
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And yeah, i remember something about it but where your post was in the midst made no sense to me


Yeah, but he also has a deskputer, plenty of aeration for his parts already. Damn, where's that awesome picture he posted up?
Quote:


> Originally Posted by *JeremyFenn*
> 
> lol like I said it's early here and I didn't have my glasses, PLUS I was on my phone writing that so maybe I misread something. Anyway, I read people getting something crazy on their memory like 1600 on their 290(x) and at least 1200 on the core. You think they're using a hardware mod or just a simple BIOS mod?


Probably luck of the draw, voltage and watercooling. I know there's a strong likelihood of excellent VRAM overclocks if your card has Hynix 6000MHz modules. Plus, extra Vcore also seems to juice the IMC to allow 1500MHz+ clocks too. I know some BIOS mods do allow better memory overclocks, though I'm not sure if the exact mechanism is looser VRAM timings, higher stock voltage or both.

My Tri X won't do 1200mhz for love or money. It can do 1600Mhz on the VRAM however. My MSI Gaming card can do 1200mhz +150mV but it can only do 1450Mhz on VRAM with extra Vcore. Both use Hynix VRAM.


----------



## JeremyFenn

So what modded BIOS can I use with my Sapphire Vapor-X Tri-X R9 290x OC edition? I'd like to max this beast out, I'm not even breaking 60c but I think I'm hitting a wall due to V limitations. Also are we still using ATI Winflash to flash modded BIOS on AMD cards?


----------



## Gobigorgohome

Quote:


> Originally Posted by *bond32*
> 
> Lol, relax man, he was just asking a favor... No need to pull the ole "dad lecturing his 16 year old daughter" tone...


I have no idea what you are trying to tell me here. I just do not see the point of his request when the cards are fed cold air by a Corsair AF140 and are running just fine. The volume is fine at the default fan profile, so I will stick to it for now.


----------



## Dasboogieman

Quote:


> Originally Posted by *JeremyFenn*
> 
> So what modded BIOS can I use with my Sapphire Vapor-X Tri-X R9 290x OC edition? I'd like to max this beast out, I'm not even breaking 60c but I think I'm hitting a wall due to V limitations. Also are we still using ATI Winflash to flash modded BIOS on AMD cards?


I'd follow the instructions on the first page of this thread.
http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread

The BIOS you are after is the PT1 BIOS; however, I recommend the PT1T BIOS instead. They are both functionally the same, except the latter has a workaround for potential BIOS black screens, because your motherboard might panic at mismatched hardware IDs. Basically, this BIOS disables PowerTune (thus your cards run at a 1000MHz minimum base clock at all times), adds a constant 1.25V at all times with Vdroop under load, and enables fully unlocked voltage bins up to "your GPU will explode" levels. I wouldn't touch the PT3 BIOS unless you are going for liquid nitrogen.

Be warned though: check with GPU-Z what your BIOS string is first. If it's something like 015.*042.000.000.000000* then you're good to go, but if it's something like 015.*043.000.001.000000* *then the flash won't work;* Catalyst will refuse to install. At the moment, if you fall into the latter position, I don't think there's much that can be done until a BIOS editing tool is released that works with Hawaii.
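That version-string rule can be sketched in a few lines; this is a hedged illustration of the check described above (the function name is mine, and the rule is drawn from this post, not from any official AMD tooling):

```python
# Sketch of the BIOS-string check: 015.042.* strings are reported
# flashable with PT1/PT1T, while 015.043.* and later refuse the flash.

def flashable(bios_string: str) -> bool:
    """True if the GPU-Z BIOS string is in the 015.042 family."""
    parts = bios_string.split(".")
    return len(parts) >= 2 and parts[0] == "015" and parts[1] == "042"

print(flashable("015.042.000.000.000000"))  # True: safe to try PT1/PT1T
print(flashable("015.043.000.001.000000"))  # False: Catalyst will refuse
```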


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dasboogieman*
> 
> yeah but he also has a deskputer, plenty of aeration for his parts already. Damn, wheres that awesome picture he posted up?


He has now

Before that it was in a case.....Cosmos II i think





there are others who run them as well


----------



## HoneyBadger84

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I have NO intentions of installing any kind of 3rd party software for overclocking before I get these babies on water. Why would I turn the fan speed up to 100% anyway? You do what you want; these cards will be stock until they are on water and that is the end of it.


Quote:


> Originally Posted by *bond32*
> 
> Lol, relax man, he was just asking a favor... No need to pull the ole "dad lecturing his 16 year old daughter" tone...


Actually, I was rebutting his "I dunno why people say these cards are loud" statement by trying to SHOW HIM why they're called loud lol. He doesn't know what loud is because he hasn't heard them at full roar; he's only heard them get up to MAYBE 50%, as they don't hit 55% unless they're above 90C on the stock cooler with the card's default profile. I ran my cards at 100% while benchmarking them to make sure they weren't getting hot at all, because they were in a QuadFire (AKA sandwiched on top of each other) configuration, and thanks to the shop-blower fan I had as a side fan, and them being at 100%, they managed to stay under 70C throughout testing.


----------



## JeremyFenn

Quote:


> Originally Posted by *Dasboogieman*
> 
> I'd follow the instructions on the first page of this thread.
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread
> 
> The BIOS you are after is the PT1 BIOS; however, I recommend the PT1T BIOS instead. They are both functionally the same, except the latter has a workaround for potential BIOS black screens, because your motherboard might panic at mismatched hardware IDs. Basically, this BIOS disables PowerTune (thus your cards run at a 1000MHz minimum base clock at all times), adds a constant 1.25V at all times with Vdroop under load, and enables fully unlocked voltage bins up to "your GPU will explode" levels. I wouldn't touch the PT3 BIOS unless you are going for liquid nitrogen.
> 
> Be warned though: check with GPU-Z what your BIOS string is first. If it's something like 015.*042.000.000.000000* then you're good to go, but if it's something like 015.*043.000.001.000000* *then the flash won't work;* Catalyst will refuse to install. At the moment, if you fall into the latter position, I don't think there's much that can be done until a BIOS editing tool is released that works with Hawaii.


What if my BIOS Ver is 015.045.000.000.000000 (113-C6710101-V01) ? LOL


----------



## Dasboogieman

Quote:


> Originally Posted by *JeremyFenn*
> 
> What if my BIOS Ver is 015.045.000.000.000000 (113-C6710101-V01) ? LOL


AHHHHHHHHH Silly me, silly silly me, didn't check









You have a Vapor X, those instructions were for a reference Tri X

No, the flash won't work, since the VRM assembly is programmed differently on your card. I wouldn't try it for risk of damaging your card.









You may have to wait for BIOS editing programs I'm afraid.


----------



## JeremyFenn

Well then I won't even try; this isn't a dual BIOS, it's just regular and UEFI. I'd hate to go through a ton of issues trying to get it a smidge higher than what I'm already running. It's such a shame though: with my fan profile, I'm not even getting much over 60c at full loads.


----------



## HoneyBadger84

So if one's BIOS were to be say, in my instance:



Would that work for flashing to the BIOS you're recommending for overclocking purposes? I have this Tri-X OC already & I'm getting in a second sometime this week, and I'd like to get them stable/happy at 1100 core (small OC I know, but I'm very sensitive about temps lol) when the second one gets here, in Crossfire together (I'll most likely only have the Gigabyte card in for TriFire testing numbers for the leaderboards here, then it'll be resold on eBay or here if I can get my rep up high enough to sell on the marketplace).
Quote:


> Originally Posted by *JeremyFenn*
> 
> Well then I won't even try; this isn't a dual BIOS, it's just regular and UEFI. I'd hate to go through a ton of issues trying to get it a smidge higher than what I'm already running. It's such a shame though: with my fan profile, I'm not even getting much over 60c at full loads.


I feel similarly at the moment, but at I'm at stock, hitting 58-61C max load temps & VRMs are only getting up to 55C/48C respectively. Really hoping I can get the tiny clock improvement I want with minimal voltage... and of course, I plan to push them as high as I can get'em without them throttling for benchmarks only.


----------



## JeremyFenn

Well testing the memory out with stock clock (1080) I can get 1600 (6.4Ghz effective) on the memory. Then I can get 1180 on the core and run the render just fine. I know once I get to 1190 it blocks up and freezes... I guess my core is just stuck until there's a bios editor for the R series. Think I'll run Valley 1.0 for a few hours see how stable it is exactly.
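As a side note on that "6.4Ghz effective" figure: GDDR5 is quad-pumped, so the effective rate tools quote is the memory clock times four. A minimal sketch, with Hawaii's 512-bit bus assumed for the bandwidth figure:

```python
# GDDR5 transfers four bits per pin per clock, so the "effective" rate
# is the memory clock times four; 1600MHz -> 6400 MT/s ("6.4GHz").

def effective_rate_mts(mem_clock_mhz: int) -> int:
    """Effective transfer rate in MT/s for a GDDR5 clock in MHz."""
    return mem_clock_mhz * 4

def bandwidth_gb_s(mem_clock_mhz: int, bus_bits: int = 512) -> float:
    """Peak memory bandwidth in GB/s (bus_bits // 8 bytes per transfer)."""
    return effective_rate_mts(mem_clock_mhz) * 1e6 * (bus_bits // 8) / 1e9

print(effective_rate_mts(1600))     # 6400 MT/s, the "6.4GHz effective"
print(round(bandwidth_gb_s(1250)))  # ~320 GB/s at the stock 290X 1250MHz
```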


----------



## HoneyBadger84

Quote:


> Originally Posted by *JeremyFenn*
> 
> Well testing the memory out with stock clock (1080) I can get 1600 (6.4Ghz effective) on the memory. Then I can get 1180 on the core and run the render just fine. I know once I get to 1190 it blocks up and freezes... I guess my core is just stuck until there's a bios editor for the R series. Think I'll run Valley 1.0 for a few hours see how stable it is exactly.


From what others have said earlier in this thread, a lock up is often indicative of vRAM clocks being less than stable with a given GPU Core setting. Did this happen at 1190 core with the vRAM at stock or with the vRAM up at 1600?


----------



## HoneyBadger84

What do y'all use for stability testing an overclock you intend to use for 24/7 use? I've asked that before but didn't really get a solid answer. I mean normally I run OCs through a benchmarking suite, and if it passes them all, I consider the benchmarking results valid, if it artifacts, I won't submit the scores to a leaderboard... but for 24/7 usage, like regular gaming type, I want to be sure it's more stable so I don't crash in the middle of a game I may be playing online or something.

Valley? Heaven? I'm thinking maybe the 3DMark FireStrike Demo on a loop? That's almost constant 100% load on both GPUs and it's quite long with minimal "scene cuts" where the cards might get the load lightened up a bit.


----------



## Red1776

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Google Reader*
> 
> Hi guys, which one of the r9 290's you'll recommend for NZXT H2 case? I've heard that XFX one needs a goooood airflow, but it also one the most silent ones. Will Windforce be better?
> 
> 
> 
> *Are you planning to crossfire?*
> 
> If you are then you can consider below:
> Do you perhaps have a side panel that can mount an extraction fan? From the website, I only see a plain side panel, 2x120mm at the front and one 120mm at the rear. Your enemy will be heat buildup, especially from cards that have fins that blast hot air sideways because the top card has to operate with hot air from the bottom card.
> 
> Blower fans might help a lot with the heat buildup (i.e the reference one) but I don't think you will be perfectly satisfied with the noise vs performance. That being said, your case is designed to dampen and contain internal noise.
> 
> Alternatively, if you want AIB, you might want to look at the HIS R9 290 IceQ. I mention it because it has a vertical heatsink fin pattern which minimises hot air being blasted directly from the bottom card to the top card, with about 30% being exhausted at the rear. However, this card is quite rare from what I've seen, I also don't personally know anyone who has one.
> 
> Your case is tricky because its a positive pressure system with a priority on low noise rather than brute airflow.
> 
> *If you aren't planning to crossfire, then heat buildup is a non-issue*
> 
> I'd consider the PowerColor PCS+. It takes 3 slots but has excellent cooling + silent operation as a result, great price too. Plus, it can be very readily cooled with the NZXT G10 mod (it has independent VRM1 and VRM2 cooling) or even full scale water cooling in the future should you wish.
> 
> Windforce had a bad rep early ish on from poor performance, I don't know if it has been resolved. If it has then its a viable contender though not quite as good as the PCS+. I believe it was a little weak on the core cooling at low fan RPMs compared to the other 3 fan coolers.
> 
> Sapphire Tri X, Vapor X, XFX DD and MSI Gaming are all viable silent alternatives too. I'll walk you through each one's characteristics.
> 
> The Tri X is 2 slots so you won't lose as many expansion slots, similar pricing to the PCS+ though has weaker VRM cooling and slightly worse core cooling. You can place an EK backplate if you can get your hands on M2.5 8mm screws to improve the VRM cooling. You can get those screws from Helipal. The fans are dead silent up to 50% (in my HAF 932 case with a mesh side panel) you might not be able to hear them up to 60% with your dampened case.
> 
> The Vapor X is about 2.5 slots, similar cooling capacity to the PCS+, has much much stronger VRMs which are better cooled. Biggest issue IIRC is cost. I heard EK are releasing a waterblock for this card so getting one doesn't necessarily preclude you from WC.
> 
> The XFX DD is extremely silent and compact. Biggest weakness is that the cooler has one of the least cooling capacities so you will have higher core and VRM temps. Actually, these are quite cheap, most of them are on special here in Australia. Bilko can better fill you in on these since he has one.
> 
> The MSI Gaming is an interesting card, it has greater cooling capacity than the DD but not as much as the triple fan cards yet is more compact but not as compact as the DD. The fans are really quiet up to about 60% then it introduces a very characteristic high pitched woosh thereafter. It's biggest strength to me is the excellent cooling capacity to volume ratio on this card, however, theres 2 versions available and you pretty much won't be able to tell. Theres a stock standard reference version (which means it can be watercooled) with the Twin Frozr cooler*, however, theres also another custom Gaming version which has a slightly upgraded VRM assembly. The one with the upgraded VRM sacrifices the ability to be watercooled* but it has much better VRM temperatures, overvoltage headroom and more robust protection against coil whine.
> 
> Or you can go balls out and get the Lightning, pretty much has no drawbacks except cost.

The highlighted is not true. I have the MSI R9 290X Twin Frozr Gaming Edition cards with the taller caps, and EK has come out with a rev 2.0 of the 290/290X block for these cards.

Also, I called MSI tech support (I actually did this) and read them the serial numbers of all four of my MSI 4GB R9 290X Gaming Editions, and they confirmed that I had the rev 2.0 (taller capacitors).

BTW, when I ran the Twin Frozr version and the Lightning, the difference was 3C.





http://hellfiretoyz.com/index.php/water-blocks/vga-video-card-water-blocks/amd-full-covervage-water-blocks/ek-waterblocks-ek-fc-r9-290x-nickel-rev-2-0.html



Quote: From EK Website:


> *Block Rev 2.0 - Nickel (EK-FC R9-290X - Nickel (Rev.2.0))*
> 
> Be the first to review this product
> 
> *Quick Overview*
> 
> EK-FC R9-290X is a high performance full-cover water block for AMD reference design Radeon R9 290X and -290 graphics card.
> 
> *Rev.2.0 brings compatibility for previously unsupported MSI® and GIGABYTE® model graphics cards.*








----------



## JeremyFenn

Ok, I usually try to push my core first, find something stable, then work on the memory. I couldn't get past 1180/1450; it would lock up even at 1460. I decided to do it backwards, since my memory is Hynix and should push to 1600, which is what it's set at right now. I'm running Valley 1.0 at 1024x768 windowed, maxed out in my driver and the program (other than res, obviously) to get max FPS, and I'm running 1200 core and 1600 mem. So far so good: I'm only at 63c @ 82% fan speed, 100% GPU load, 1.234V max I've seen on this, with VRMs holding steady at about 51c on VRM1 and 48c on VRM2. Pulling about 260W max that I've seen, and about 860MB of the 4GB being used.





Sorry for the quality but I used my phone as not to interrupt the stress testing.


----------



## JeremyFenn

Quote:


> Originally Posted by *HoneyBadger84*
> 
> What do y'all use for stability testing an overclock you intend to use for 24/7 use? I've asked that before but didn't really get a solid answer. I mean normally I run OCs through a benchmarking suite, and if it passes them all, I consider the benchmarking results valid, if it artifacts, I won't submit the scores to a leaderboard... but for 24/7 usage, like regular gaming type, I want to be sure it's more stable so I don't crash in the middle of a game I may be playing online or something.
> 
> Valley? Heaven? I'm thinking maybe the 3DMark FireStrike Demo on a loop? That's almost constant 100% load on both GPUs and it's quite long with minimal "scene cuts" where the cards might get the load lightened up a bit.


For game stability I max out Valley 1.0 and just let it loop for hours (usually I'll put it on in the morning, then come back after work or after doing some things and check whether it's still working). You don't usually have to stress 100% of the memory; just a good chunk of it should do the trick. I've only got mine windowed for now so I can check on different stats. After I feel confident enough I'll put the res up and leave it while I go paint my son's room.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Red1776*


Much card. Such expense. #Jelly lol

The 4 R9 290Xs I had at once didn't look anywhere near that nice lined up together (course that's cuz it was 3 different brands, 2 different aftermarket coolers & 2 Core Editions lol)


----------



## JeremyFenn

Trying Trixx and getting better numbers, although I'm seeing some lines at 1250. I'll see what 1240 brings, but memory is up to 1650.


----------



## JeremyFenn

Ok so, sweet spot found: core @ 1230 and memory @ 1630. Running Watch Dogs maxed out, windowed @ 1024x768; here are the results:


----------



## heroxoot

Quote:


> Originally Posted by *JeremyFenn*
> 
> Ok so, sweet spot found: core @ 1230 and memory @ 1630. Running Watch Dogs maxed out, windowed @ 1024x768; here are the results:


I wish my 290X stayed this cool. Mine gets up to 80c during heaven/valley and 75c during actual games that push a full load all the time. That's at default clocks too!


----------



## motokill36

Is it just me, or do 290s in Crossfire suck?

Seems to play better on one card.
Using a 13.x driver.
14.4 is a nightmare.
14.6 was kinda ok,
but I'm only getting 76% usage on both GPUs.


----------



## Sgt Bilko

Quote:


> Originally Posted by *motokill36*
> 
> Is it just me, or do 290s in Crossfire suck?
> 
> Seems to play better on one card.
> Using a 13.x driver.
> 14.4 is a nightmare.
> 14.6 was kinda ok,
> but I'm only getting 76% usage on both GPUs.


Mine are going great.

What games/apps are you using?


----------



## motokill36

It's Battlefield 4.

Maybe it's the game?
Using AB to overclock.


----------



## Sgt Bilko

Quote:


> Originally Posted by *motokill36*
> 
> It's Battlefield 4.
> 
> Maybe it's the game?
> Using AB to overclock.


I get pretty much 100% usage on both cards with BF4.

Frame rate is 120+ for me, so I don't care if it's 100% or 20% usage.









What settings and is vsync off? (gotta ask)


----------



## rdr09

Quote:


> Originally Posted by *motokill36*
> 
> It's Battlefield 4.
> 
> Maybe it's the game?
> Using AB to overclock.


what's AB saying about usage? you can leave it running while playing games.


----------



## motokill36

It's around 76% on both GPUs,
and it feels laggy.
If I go back to one card it just feels smoother, and that card runs at 99%.
Just installed 14.6 again to see what that's like.


----------



## tsm106

Make sure to disable the origin in game stuff.


----------



## Sgt Bilko

Quote:


> Originally Posted by *tsm106*
> 
> Make sure to disable the origin in game stuff.


Does it have a negative effect?


----------



## tsm106

Conflicting OSDs can cause problems. Removing that variable is helpful for those with issues.


----------



## Sgt Bilko

Quote:


> Originally Posted by *tsm106*
> 
> Conflicting OSDs can cause problems. Removing that variable is helpful for those with issues.


Ah, I don't use any OSDs apart from Fraps; I have AB displayed on my keyboard.

Good to know though


----------



## JeremyFenn

My Sapphire Vapor-X Tri-X R9 290X OC edition is fully stable @ 1200MHz core and 1600MHz memory just by using the Trixx software (+200 VDDC and +50 Power Limit). Here's a screenshot of how the tower itself is holding up under stress:


----------



## rdr09

Quote:


> Originally Posted by *motokill36*
> 
> It's around 76% on both GPUs,
> and it feels laggy.
> If I go back to one card it just feels smoother, and that card runs at 99%.
> Just installed 14.6 again to see what that's like.


how about cpu usage like this . . .



you may have to use the slider to see the other threads. i got my HT off.


----------



## rdr09

Quote:


> Originally Posted by *JeremyFenn*
> 
> My Sapphire Vapor-X Tri-X R9 290X OC edition is fully stable @ 1200MHz core and 1600MHz memory just by using the Trixx software (+200 VDDC and +50 Power Limit). Here's a screenshot of how the tower itself is holding up under stress:


dat 1300w comes in handy.


----------



## JeremyFenn

Quote:


> Originally Posted by *rdr09*
> 
> dat 1300w comes in handy.


Yes it certainly does!!!


----------



## rdr09

Quote:


> Originally Posted by *JeremyFenn*
> 
> Yes it certainly does!!!


i only use 200 offset for benching. your temps look good, though, at 78% fan. core volts go up to 1.4v.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JeremyFenn*
> 
> Yes it certainly does!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i only use 200 offset for benching. your temps look good, though, at 78% fan. core volts go up to 1.4v.
Click to expand...

I don't get out of bed till it's over 200mv.


----------



## JeremyFenn

Well now that I've hit my peak settings on the entire rig, think I'll do some benchmarks and see what I come up with. I'll be sure to post them here when they're finished.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> I don't get out of bed till it's over 200mv.


lol. i'm such a wuss to try PT1.


----------



## kizwan

Quote:


> Originally Posted by *motokill36*
> 
> It's around 76% on both GPUs,
> and it feels laggy.
> If I go back to one card it just feels smoother, and that card runs at 99%.
> Just installed 14.6 again to see what that's like.


I forget whether this affects GPU usage, but did you set frame pacing to Method 1 in BF4? (RenderDevice.FramePacingMethod 1)

My GPU usage (Crossfire) in BF4 (1080p, 200% res scale) with 14.4 beta. At 1080p 100% res scale, GPU usage will fluctuate a lot, but the game should run smoothly nevertheless.
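
For anyone who hasn't touched the Frostbite console before: the frame-pacing setting mentioned here can be entered in-game via the console (tilde key) or persisted in user.cfg so it survives restarts. A minimal sketch, assuming the default Documents-folder location for the config file (the exact path can vary by install):

Code:

```
// Documents\Battlefield 4\settings\user.cfg
RenderDevice.FramePacingMethod 1
```

Method 1 targets smoother frame delivery on Crossfire/multi-GPU setups; setting it back to 0 turns pacing off again.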


----------



## motokill36

Quote:


> Originally Posted by *rdr09*
> 
> how about cpu usage like this . . .
> 
> 
> 
> you may have to use the slider to see the other threads. i got my HT off.


Ok, all sorted,
it was AB.
Uninstalled it, put 14.6 back on, then installed AB and unticked
Reset Display Mode and Enable Unified GPU Monitoring, plus ticked Disable ULPS.

Now Mega Smooth all max settings

Happy Days


----------



## Gobigorgohome

Quote:


> Originally Posted by *ZealotKi11er*
> 
> One problem. If you dont monitor clock speed you are probably not pushing them 100%. If you level everything stock you are for sure getting the cards to throttle. I got 2 reference 290X to throttle in BF4 before i had to increase fans speed.


Of course it is throttling. I will start monitoring everything when I am on water; the performance before the water cooling does not really matter to me, it just has to work in tri/quadfire. Yes, two cards throttle too, but as I said none of this matters to me on air. On water it is a different story; I will overclock as much as I can then. Probably not the answer you wanted, and I see that you are trying to help me, but I do not care if the cards are throttling on air. I will probably just test quadfire and then remove two cards until I get the second PSU in my case along with the water blocks.


----------



## JeremyFenn

3DMark Results: http://www.3dmark.com/3dm/3475168

3DMark 11 Results http://www.3dmark.com/3dm11/8493629


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *JeremyFenn*
> 
> I swear I saw someone say they were going to put an air conditioner up to their intake fans on my phone lol I actually know a guy at the office who did the exact same thing, had a portable AC unit ducted to the rear of his PC (his rear fan was obviously an intake) and he fried his PC due to the condensation. Anyway, I swear it wasn't just randomness, I know it's early here and I don't have my glasses on but I SAW IT !!!! (That's my story and I'm sticking to it)


Quote:


> Originally Posted by *Sgt Bilko*
> 
> Might want to ask @HOMECINEMA-PC about that.
> 
> Pretty sure he hasn't had a single issue with PC's and AC
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And yeah, i remember something about it but where your post was in the midst made no sense to me


Quote:


> Originally Posted by *Dasboogieman*
> 
> yeah but he also has a deskputer, plenty of aeration for his parts already. Damn, wheres that awesome picture he posted up?
> Probably luck of the draw, voltage and watercooling. I know theres a strong likelihood of excellent VRAM overclocks if your card has Hynix 6000mhz modules. Plus, extra Vcore also seems to juice the IMC to allow 1500mhz+ clocks too. I know some BIOS mods do allow better memory overclocks, though I'm not sure if the exact mechanism is because of looser VRAM timings, higher stock voltage or both.
> 
> My Tri X won't do 1200mhz for love or money. It can do 1600Mhz on the VRAM however. My MSI Gaming card can do 1200mhz +150mV but it can only do 1450Mhz on VRAM with extra Vcore. Both use Hynix VRAM.


V 1

V2

V3

Deskputer with AC and Water chiller


Had 10°C water temps last night


----------



## JeremyFenn

How do you keep the condensation from accumulating in the tower?


----------



## Dasboogieman

Quote:


> Originally Posted by *HoneyBadger84*
> 
> So if one's BIOS were to be say, in my instance:
> 
> 
> 
> Would that work for flashing to the BIOS you're recommending for overclocking purposes? I have this Tri-X OC already & I'm getting a second sometime this week, and I'd like to get them stable/happy at 1100 core (small OC I know, but I'm very sensitive about temps lol) when the second one gets here, in Crossfire together (I'll most likely only have the Gigabyte card in for TriFire testing numbers for the leaderboards here, then it'll be resold on eBay, or here if I can get my rep up high enough to sell on the marketplace).
> I feel similarly at the moment, but I'm at stock, hitting 58-61°C max load temps & VRMs are only getting up to 55°C/48°C respectively. Really hoping I can get the tiny clock improvement I want with minimal voltage... and of course, I plan to push them as high as I can get 'em without them throttling, for benchmarks only.


You should be fine with that card. However, don't expect silence or low temps with the PT1T BIOS unless you are watercooling. The 1000MHz base clock means about 55 degrees at *idle*. You will also need the modded GPUTweak software to use the higher voltage bins beyond +200mV.

Umm, everyone has their own choices for stability. I use the Heaven benchmark with all the settings cranked up, I just leave it looped. Another good one I've been using lately is Metro LL or 2033 benchmark at ultra settings looped. Tomb Raider is also a powerful one too since it has an option for SSAA.
@Roboyto uses Final Fantasy 13 or something since apparently the devs went nuts with the image quality so artifacts are easy to spot.


Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Red1776*
> 
> The highlighted is not true. I have the MSI R290X Twin Frozr Game Edition cards with the taller caps, and EK has come out with a rev 2.0 of the 290/290X block for these cards.
> 
> Also, if you call MSI tech support (I actually did this) and read them the serial numbers of all four MSI 4GB R290X Game Editions, they confirmed that I had the rev 2.0 (taller capacitors).
> 
> BTW, when I ran the Twin Frozr version and the Lightning, the difference was 3°C.
> 
> 
> 
> 
> 
> 
> 
> http://hellfiretoyz.com/index.php/water-blocks/vga-video-card-water-blocks/amd-full-covervage-water-blocks/ek-waterblocks-ek-fc-r9-290x-nickel-rev-2-0.html
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Block Rev 2.0 - Nickel (EK-FC R9-290X - Nickel (Rev.2.0))*






@cephelix and I were discussing this at great length. We were going by the cooling configurator:
http://www.coolingconfigurator.com/step1_complist?gpu_gpus=1401

This is our card, except mine is the 290 version while cephelix has the 290X, with the odd SSC inductors, double-row VRMs and odd capacitors. From that layout it looked like the block wasn't going to fit.

When you blocked your cards, did you notice the same VRM assembly? I do know MSI has a standard Rev 2.0 reference version with the Twin Frozr cooler.
Quote:


> Originally Posted by *JeremyFenn*
> 
> My Sapphire Vapor-X Tri-X R9 290X OC edition is fully stable @ 1200MHz core and 1600MHz memory just by using the Trixx software (+200 VDDC and +50 Power Limit). Here's a screenshot of how the tower itself is holding up under stress:


The Vapor-X is truly amazing; those VRM temps are crazy at +200mV, and the only thing better is probably a waterblock. Yeah, Sapphire caught some flak from enthusiasts over weak VRM cooling on the Tri-X. Sapphire responded that the weakness was in the reference 5+1+1 VRM layout rather than in their cooling, which I agree with. Looks like they've addressed those issues in spades.

My Tri-X already barely avoids thermal throttling at +165mV, let alone +200mV.


----------



## Red1776

Quote:


> @cephelix and I were discussing this at great length. We were going by the cooling configurator
> http://www.coolingconfigurator.com/step1_complist?gpu_gpus=1401
> 
> This is our card, except mine is the 290 version while cephelix has the 290X, with weird SSC inductors, double row VRMs and weird capacitors. From the weird layout it looked like it wasn't going to fit.
> 
> When you blocked your cards, did you notice the same VRM assembly? I do know MSI have a Standard Rev 2.0 reference version with the Twin Frozr Cooler.


That sucks. You got the "gold inductors" SSC.



I know that EK issued a statement saying they would not be manufacturing a block for this one.


----------



## bond32

Quote:


> Originally Posted by *JeremyFenn*
> 
> How do you alieviate the condensation from accumulating in the tower?


The relative humidity of your ambient air needs to be lower if you want to avoid that... The entering cold air mixes with the ambient air, which chills part of it to the dew point, hence the condensate.
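
To put a number on that dew-point threshold: it can be estimated from room temperature and relative humidity with the Magnus approximation. A quick sketch (the coefficients used are one common published fit, an assumption rather than anything exact):

```python
import math

def dew_point_c(temp_c, rel_humidity_pct):
    """Approximate dew point in °C via the Magnus formula.

    Coefficients b=17.62, c=243.12 are one common fit, good to a
    fraction of a degree roughly between -45 °C and 60 °C.
    """
    b, c = 17.62, 243.12
    gamma = math.log(rel_humidity_pct / 100.0) + (b * temp_c) / (c + temp_c)
    return (c * gamma) / (b - gamma)

# A 25 °C room at 60% RH: any surface colder than ~16.7 °C
# (e.g. chilled coolant lines or AC-fed intake air paths) will sweat.
print(round(dew_point_c(25.0, 60.0), 1))
```

So a 10°C water loop like the one above sits well below the dew point of a typical room, which is exactly why the humidity has to come down first.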


----------



## Mega Man

There are methods, e.g. Vaseline


----------



## fateswarm

Quote:


> Originally Posted by *Mega Man*
> 
> There are methods ie vaseline


Those are mainly short-term solutions by LN2 overclockers









(Cleaning them is hell and long-term they accumulate dirt).


----------



## Mega Man

Most would much rather have dirt than a dead $500 board.


----------



## fateswarm

I'm all into dirt, but it's also not good for electronics.


----------



## heroxoot

Has anyone tried MSI AB 3.0.1? I noticed screen flickering that looks similar to frame tearing on the desktop when using multi profile. It's weird because the profile didn't switch during this flicker it just looked like a frame tear on the desktop. It doesn't happen when I don't use profiles with powerplay disabled.


----------



## HoneyBadger84

Quote:


> Originally Posted by *heroxoot*
> 
> Has anyone tried MSI AB 3.0.1? I noticed screen flickering that looks similar to frame tearing on the desktop when using multi profile. It's weird because the profile didn't switch during this flicker it just looked like a frame tear on the desktop. It doesn't happen when I don't use profiles with powerplay disabled.


I had this issue repeatedly with several versions of Afterburner. Wiping the drivers and (completely) wiping Afterburner, then reinstalling both, was the only way I found to fix it.

Edit: from what I could tell it was caused by testing OCs that weren't quite stable; wiping the program completely and redoing it and the drivers always fixed it.


----------



## heroxoot

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Has anyone tried MSI AB 3.0.1? I noticed screen flickering that looks similar to frame tearing on the desktop when using multi profile. It's weird because the profile didn't switch during this flicker it just looked like a frame tear on the desktop. It doesn't happen when I don't use profiles with powerplay disabled.
> 
> 
> 
> I had this issue repeatedly with several versions of Afterburner, wipe drivers and wipe (completely) afterburner then reinstall both, only way I found to fix it.
> 
> Edit: from what I could tell it wad caused by testing OCs that weren't quite stable, wiping the program completely, and redoing it and the drivers always fixed it.
Click to expand...

Well, it was doing a severe underclock for idle during it, but maybe Flash Player really does need 600MHz on the GPU with hardware acceleration off. That's when it was happening.


----------



## HoneyBadger84

Yep, mine would happen only on the desktop, when viewing stuff on the net, simple tasks. Didn't happen when running games or benchmarks at all.


----------



## heroxoot

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Yep, mine would happen only on desktop, when viewing stuff on the net, simple tasks. Didnt happen when running games or benchmarks at all.


Yeah, I'm thinking this has something to do with my driver vs. Flash during my 2D 515/625 profile, because when I fullscreen Flash Player on Twitch it pushes a 600MHz clock. I can't understand why it suddenly needs more GPU clock, but it started when I updated to 14.6 RC2. Hardware acceleration is off in Flash and Firefox. It used to stay under 400MHz this way, but now it clocks up for Flash Player regardless of these being off.


----------



## kizwan

Quote:


> Originally Posted by *heroxoot*
> 
> Has anyone tried MSI AB 3.0.1? I noticed screen flickering that looks similar to frame tearing on the desktop when using multi profile. It's weird because the profile didn't switch during this flicker it just looked like a frame tear on the desktop. It doesn't happen when I don't use profiles with powerplay disabled.


Only when afterburner is running? Did it happen at stock clock too?


----------



## HoneyBadger84

Quote:


> Originally Posted by *kizwan*
> 
> Only when afterburner is running? Did it happen at stock clock too?


Once the issue started for me it didn't matter if Afterburner was loaded or not; I had to uninstall it and redo my drivers to get rid of it. It doesn't occur again until I test an OC and find it unstable, then on the next boot it starts again and I have to redo things once more.

Haven't had this issue in a few weeks now, but I'm not pushing OCs that could be super unstable anymore.


----------



## heroxoot

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Has anyone tried MSI AB 3.0.1? I noticed screen flickering that looks similar to frame tearing on the desktop when using multi profile. It's weird because the profile didn't switch during this flicker it just looked like a frame tear on the desktop. It doesn't happen when I don't use profiles with powerplay disabled.
> 
> 
> 
> Only when afterburner is running? Did it happen at stock clock too?
Click to expand...

Only during the 2D profile. When I just run it with PowerPlay enabled and let it downclock from max it has no problem, because instead of staying at a strict 515/625 it will downclock to 300/1250. As long as it doesn't hinder my performance I have no problem keeping it like this, but when I play games, some don't keep the clock maxed. If they don't push over 50% load I see around a 900MHz clock.


----------



## Dasboogieman

Quote:


> Originally Posted by *Mega Man*
> 
> There are methods ie vaseline


What, as in lubing the board? Sorry couldn't resist








Wouldn't a humidifier partially bypass the issue?


----------



## Mega Man

No, it would make it worse.


----------



## fateswarm

Some people paint motherboards; it may insulate them. But I wouldn't do it. It may be done badly in an irreversible way, and it might cover components that must be cooled.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Dasboogieman*
> 
> What, as in lubing the board? Sorry couldn't resist
> 
> 
> 
> 
> 
> 
> 
> 
> Wouldn't a humidifier partially bypass the issue?


A DEhumidifier would help yeah.

I've had my computer running within 10ft of my window AC unit for years with no issues. But I don't have my computer on 24/7 or anything special.


----------



## Dasboogieman

Quote:


> Originally Posted by *HoneyBadger84*
> 
> A DEhumidifier would help yeah.
> 
> I've had my computer running within 10ft of my window AC unit for years with no issues. But I don't have my computer on 24/7 or anything special.


Oh yeah, that's what I meant. The ones that sit in the corner of the room and filter the air.


----------



## Gobigorgohome

A follow-up question on the "100% fan speed" advice: will that keep four reference R9 290Xs cooled on air without thermal throttling, with an AF140 feeding them fresh air all the time?


----------



## sugarhell

If you want quadfire, get trifire and invest the rest in watercooling.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Gobigorgohome*
> 
> A follow-up question on the "100% fan speed" advice: will that keep four reference R9 290Xs cooled on air without thermal throttling, with an AF140 feeding them fresh air all the time?


Probably. I didn't run the QuadFire for more than a few hours, as I didn't want that stress on my PSU 24/7. With the fans at 100% (leaf blower mode), and my shop-blower fan as a side fan (it moves more air than any set of system fans could ever hope to), the two XFX Core Editions I had, which were sandwiched in the middle between my Gigabyte WindForce and a VisionTek card (so glad that thing's gone, hot-running POS), never got over 75°C load during multiple tests, which included 3DMark11 (Performance & Extreme), 3DMark FireStrike (Normal & Extreme), 3DMark SkyDiver & 3DMark Vantage.

The issue with QuadFire, on top of making sure the cards get lots of air so they stay cool, is that power draw is insane even at stock. I was seeing 1400W+ from the wall, which means ~1200W of actual system draw, and even if you keep the cards "cool" (aka under 80°C), they're still going to be spitting hot fire out the back of your PC, and you'll need good room cooling to deal with that or it'll eventually heat up the room, which will heat up the cards further and... well, the cycle continues. lol

The 100% fan speed thing was merely meant as a joke to show you just how loud these cards really are at full bore, because like I said, if you're running them with the stock fan profile they never EVER go over 55%, which is very quiet compared to 100%. But, yeah, it does keep them a LOT cooler. From what I've heard, removing the "grill teeth" on the card reduces the noise it makes, as the air exits the card more easily, creating less secondary noise, but I never tested that myself since I had every intention of reselling the Core Editions after I finished fiddling with QuadFire.

Edit:
Quote:


> Originally Posted by *sugarhell*
> 
> If you want quadfire get trifire and invest the rest to watercooling


^^^ I 100% agree with that statement if noise/cooling is your concern, go TriFire & liquid instead of Quad.
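
As a sanity check on the wall-vs-system numbers above: the gap between 1400W at the wall and ~1200W of system draw is just PSU conversion loss. A trivial sketch; the efficiency figure here is an assumption (roughly Gold-rated territory at high load), not a measured value:

```python
def dc_draw_watts(wall_watts, efficiency=0.87):
    """DC power actually delivered to the system for a given AC wall draw.

    `efficiency` is assumed, not measured; real units vary with load.
    """
    return wall_watts * efficiency

watts = dc_draw_watts(1400)  # the remaining ~180 W is shed as heat by the PSU
print(round(watts))
```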


----------



## JeremyFenn

Ok, so I OC'd my 290X to 1200/1600 yesterday. I left it running all night (the screen goes black after 30 mins), no sleep mode or anything, and I get here this morning and it's frozen. I guess this means I have to bump it back? It ran for hours and hours yesterday on Watch Dogs (everything maxed) just fine; dunno why it would freeze at idle with just the monitor in power-saving mode.


----------



## HoneyBadger84

For those wondering what I'm referring to, here's a picture from ~2008 when I first started using these things:



It's basically a turbine-type fan (a giant version of what reference video cards have on them) that channels into a single output about 1 1/4 feet tall & ~5 inches wide. It works very well to blow massive amounts of air towards hardware that needs it, like 4 cards sandwiched on top of each other







lol (yes, I know the cooling loop in that picture was a monstrosity)


----------



## Gobigorgohome

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Probably. I didn't run the QuadFire for more than a few hours, as I didn't want that stress on my PSU 24/7. With the fans at 100% (leaf blower mode), and my shop-blower fan as a side fan (it moves more air than any set of system fans could ever hope to), the two XFX Core Editions I had, which were sandwiched in the middle between my Gigabyte WindForce and a VisionTek card (so glad that thing's gone, hot-running POS), never got over 75°C load during multiple tests, which included 3DMark11 (Performance & Extreme), 3DMark FireStrike (Normal & Extreme), 3DMark SkyDiver & 3DMark Vantage.
> 
> The issue with QuadFire, on top of making sure the cards get lots of air so they stay cool, is that power draw is insane even at stock. I was seeing 1400W+ from the wall, which means ~1200W of actual system draw, and even if you keep the cards "cool" (aka under 80°C), they're still going to be spitting hot fire out the back of your PC, and you'll need good room cooling to deal with that or it'll eventually heat up the room, which will heat up the cards further and... well, the cycle continues. lol
> 
> The 100% fan speed thing was merely meant as a joke to show you just how loud these cards really are at full bore, because like I said, if you're running them with the stock fan profile they never EVER go over 55%, which is very quiet compared to 100%. But, yeah, it does keep them a LOT cooler. From what I've heard, removing the "grill teeth" on the card reduces the noise it makes, as the air exits the card more easily, creating less secondary noise, but I never tested that myself since I had every intention of reselling the Core Editions after I finished fiddling with QuadFire.
> 
> Edit:
> ^^^ I 100% agree with that statement if noise/cooling is your concern, go TriFire & liquid instead of Quad.


I already have the four cards, and the watercooling is coming in this week or the next. I am going to run 4x R9 290X 24/7 on water for 4K gaming. I have ordered everything but the water chiller, which I will order next week. I will probably buy the second PSU then too.

I will try quadfire on air, and probably remove a couple of cards just to keep it "cool" and quiet.


----------



## Dasboogieman

Quote:


> Originally Posted by *JeremyFenn*
> 
> Ok, so I OC'd my 290X to 1200/1600 yesterday. I left it running all night (the screen goes black after 30 mins), no sleep mode or anything, and I get here this morning and it's frozen. I guess this means I have to bump it back? It ran for hours and hours yesterday on Watch Dogs (everything maxed) just fine; dunno why it would freeze at idle with just the monitor in power-saving mode.


Chances are your mem clock is too high. I'd dial it back to 1550 or even 1500MHz (this thing has tons of bandwidth already) and see how stable you are. Mine did something similar: the Gaming edition can technically run 1475MHz in stress tests, but I either get artifacts or black screens when idle browsing.


----------



## steverebo

Ok guys, got my 290X set up again after a few issues. What is the best way to overclock these cards for 24/7 use without having to load a profile? I'm new to AMD, this is my first non-Nvidia card, but on my 780 I used to BIOS-flash the clock speed and voltage I wanted. Is that possible with the 290X?


----------



## ZealotKi11er

Quote:


> Originally Posted by *steverebo*
> 
> Ok guys, got my 290X set up again after a few issues. What is the best way to overclock these cards for 24/7 use without having to load a profile? I'm new to AMD, this is my first non-Nvidia card, but on my 780 I used to BIOS-flash the clock speed and voltage I wanted. Is that possible with the 290X?


I just used MSI AB. I don't think you can edit the 290 BIOS right now.


----------



## lasttimei

Just wondering, is this score ok?
I've got the MSI R9 290 x2, stock @ 977/1250

http://www.3dmark.com/3dm/3186112?


----------



## Laur3nTyu

Hey hey.. add me.. I've had it for quite some time.. and decided to join now









1. Screen shot of GPU-Z validation tab open with OCN Name
2. Asus R9 290x 4GB
3. Cooling - Stock


----------



## JeremyFenn

Ok, so after applying 1200/1500 with the Trixx app, I'm noticing that sometimes the card doesn't go back down to idle clocks after use. I wonder if it's from disabling ULPS, or maybe from forcing constant voltage? Ugh, too many issues lol


----------



## Arizonian

Quote:


> Originally Posted by *Laur3nTyu*
> 
> Hey hey.. add me.. I've had it for quite some time.. and decided to join now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1. Screen shot of GPU-Z validation tab open with OCN Name
> 2. Asus R9 290x 4GB
> 3. Cooling - Stock
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









_If anyone else has been a lurker or active poster but never submitted your proof, please feel free to submit it and join the roster.







_


----------



## Dsrt

Hi, I'm running 2x R9 290X DirectCU II OCs. My case is a Fractal Design R4. The problem is that my upper card's VRM #1 goes up to 97°C during Heaven 4.0. I'm running stock settings with a custom fan profile in Afterburner.
Card 1: core ~88°C @ 73% fan speed, 97°C VRM #1
Card 2: core ~80°C @ 73% fan speed, 89°C VRM #1

Here's a Paint picture of my fan setup. 

I'm wondering if I should buy a bigger case with better airflow, or what? I don't really want to put these under water because it voids the warranty and would cost too much.


----------



## ZealotKi11er

What happens with side panel off?


----------



## hwoverclkd

Quote:


> Originally Posted by *JeremyFenn*
> 
> Ok, so after applying 1200/1500 with the Trixx app, I'm noticing that sometimes the card doesn't go back down to idle clocks after use. I wonder if it's from disabling ULPS, or maybe from forcing constant voltage? Ugh, too many issues lol


at least you find AMD isn't boring and constantly keeps you at work


----------



## Dsrt

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What happens with side panel off?


Ah, forgot to try that XD
I'm at work atm, 8.5 hours to go until I can get back home and try it.


----------



## heroxoot

Quote:


> Originally Posted by *JeremyFenn*
> 
> Ok, so after applying 1200/1500 with the Trixx app, I'm noticing that sometimes the card doesn't go back down to idle clocks after use. I wonder if it's from disabling ULPS, or maybe from forcing constant voltage? Ugh, too many issues lol


If you're on dual monitors then the GPU should downclock, but the memory will probably stay at 1500. Make sure PowerPlay is enabled though, or it won't downclock.


----------



## pdasterly

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _If anyone else has been a lurker or active poster but never submitted your proof, please feel free to and join the roster.
> 
> 
> 
> 
> 
> 
> 
> _



http://www.techpowerup.com/gpuz/7n92f/


----------



## JeremyFenn

Quote:


> Originally Posted by *heroxoot*
> 
> If you're on dual monitor then the GPU should downclock but the memory probably will stay 1500. Make sure powerplay is enabled tho or it won't down clock.


Nope, single monitor, and it downclocks fine if I haven't used the card. It's only after I've used it for gaming or testing, where it's gone up to 1200/1600, that it stays that way (and I think eventually freezes). I've since downclocked my CPU to 4.83 since its max is 58c in IBT, and downclocked the GPU VRAM to 1500, hopefully preventing this from happening. I guess time will tell.


----------



## JeremyFenn

Quote:


> Originally Posted by *acupalypse*
> 
> at least you find AMD isn't boring and constantly keeps you at work


It's not as bad as all that; I dunno why I just can't leave it alone lol. I guess it's just the geek in me that keeps trying to squeeze every bit of juice out of my machine. I think I'll leave it the way it is though, if it stays stable. It's fast, temps are good even under full load; I should just be happy with the CPU at 4.83, RAM at 2239, and GPU at 1200/1500. It's IBT stable @ 58c and runs games maxed all day under 70c on the 290X, I can't complain too much!!


----------



## heroxoot

Quote:


> Originally Posted by *JeremyFenn*
> 
> Its
> Quote:
> 
> 
> 
> Originally Posted by *acupalypse*
> 
> at least you find AMD isn't boring and constantly keeps you at work
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's not as bad as all that, I dunno why I just can't leave it alone lol. I guess it's just the geek in me that just keeps trying to squeeze every bit of juice out of my machine. I think I'll leave it the way it is though if it stays on. It's fast, temps are good even under full load, I should just be happy with 4.83cpu 2239ram, 1200/1500 gpu:mem. It's IBT stable @58c and runs games maxed all day under 70c on the 290x, I can't complain too much!!

On air? I wish my 290X was this cool. Mine stays under 80c at the default 1030/1250. Twin Frozr seems like it really lost its touch since the 7970.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dsrt*
> 
> Ah, forgot to try that XD
> Im at work atm, 8.5 hours to go until I can get back home and try that.


That's essentially as good as it's going to get, even with the best case.


----------



## JeremyFenn

Yeah, on air, well, kinda. It's the Sapphire Vapor-X Tri-X R9 290X OC edition. I don't use that "one fan until it gets hot, then all three kick in" feature either; I turned that off and let all three run. I'm just using Trixx with +200/+50, and I set my fan profile 1:1 from 0c-50c, then from 50% @ 50c up to 100% @ 80c. This keeps it under 70c during hours of abuse. I've set it to 100% @ 70c and that keeps it a few degrees cooler, like 64 or 65c, but imo I'd rather have a little less noise and start pushing 70c than have it screaming for just a few degrees less, ya know. Vapor-X is the way to go, plus I think they revamped the VRMs for this card.
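For anyone wanting to replicate that profile outside of Trixx: it's just a piecewise-linear map from temperature to fan duty (1:1 up to 50c, then ramping from 50% at 50c to 100% at 80c). A quick sketch, with the function name purely illustrative:

```python
def fan_speed(temp_c):
    """Map GPU temperature (deg C) to fan duty (%), per the profile above:
    1:1 from 0-50 C, then a linear ramp from 50% @ 50 C to 100% @ 80 C."""
    if temp_c <= 50:
        return max(0, min(50, temp_c))  # 1:1 region, clamped to 0-50%
    if temp_c >= 80:
        return 100                      # pegged at full speed
    # linear ramp: 50% -> 100% across 50-80 C
    return 50 + (temp_c - 50) * (100 - 50) / (80 - 50)
```

So at 65c the card would run roughly 75% fan, which matches staying under 70c with some noise headroom.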


----------



## fateswarm

Hm. Why do most of us get some ludicrously low minimum-FPS results in Valley/Heaven? The gameplay of regular games doesn't look choppy.

NVIDIA-biased engine?


----------



## Arizonian

Quote:


> Originally Posted by *pdasterly*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/7n92f/


Congrats - added


----------



## ZealotKi11er

Quote:


> Originally Posted by *fateswarm*
> 
> Hm. Why do most of us get some ludicrous low minimumFPS results in valley/heaven? The gameplay of regular games does not look choppy.
> 
> NVIDIA biased engine?


Does it matter? They are just benchmarks. One single drop can cause a very low min. You really can't judge a card based on the min value alone; you have to look at FPS throughout the run.


----------



## fateswarm

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Does it matter. They are just benchmarks. One single drop can cause very low min. You really cant judge a card just based on min value unless you look at fps throughout the game.


They're probably caused by some stutters I've noticed. Luckily I don't see them in games. I wonder if NVIDIA cards get them.


----------



## rdr09

Quote:


> Originally Posted by *fateswarm*
> 
> They are probably caused by some stutters I've noticed. Luckily I don't see them in games. I wonder if NVIDIA get them.


if it's oc'ed and it stutters, then it might be unstable. low minimums may be caused by starting the bench early. check out my 7950 minimum (left) . . .


----------



## JeremyFenn

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Does it matter. They are just benchmarks. One single drop can cause very low min. You really cant judge a card just based on min value unless you look at fps throughout the game.


I agree. Although we all want über scores in 3DMark & PCMark, it's usually the startup and transitions that'll get ya. Even I get weird scores in them, stupid low lows like 25fps but crazy highs like 140fps on Extreme. Games are different; once the engine is loaded and you're in the scene, it's usually pretty fluid.


----------



## fateswarm

Quote:


> Originally Posted by *rdr09*
> 
> if oc'ed and it stutters, then it might be unstable. low minimums may be caused by starting the bench early. checkout my 7950 minimum (left) . . .


I've heard of that; then again, I don't see the stutters in games. And in some cases the settings are high enough.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Dsrt*
> 
> Hi, Im running 2x r9-290x DirectCU II OC's. My case is Fractal Design R4. The problem is that my Upper GFX card VRM 1# goes up to 97c during Heaven 4.0. Im running stock settings, with a custom fanprofile in Afterburner.
> Card 1 core ~88'c @ 73% fanspeed, 97c VRM 1#
> Card 2 core ~80'c @ 73% fanspeed, 89c VRM 1#
> 
> Heres a paint picture of my fan setup.
> 
> Im wondering If I should buy a bigger case with better airflow or what? I dont really want to put these under water because it voids warranty and it would cost too much.


Have you tried 100% fan speed? I'd try that, and with the side of the case off. It sounds to me like you're having a similar issue to what I had, which is that the heat is being trapped in the left-most area of the back of the case. Luckily my case had rubber grommet things where tubing can go if you go liquid cooling, and I just removed those for "holes" out the back side of the case, giving the air even more ways to get out.



I also recommend taking out any PCI-slot covers you have on the back of your case; that can let out hot air that would otherwise be trapped, which could also help drop temps.

But yeah, give the side panel off a try, and try 100% fan for two reasons: one, to see if it helps, obviously, and two, to see if you can stand how loud they get (I dunno how loud DirectCU IIs are; I've heard the 290X versions are quite ridiculous though, compared to your average Tri-X or WindForce anyway).


----------



## rdr09

Quote:


> Originally Posted by *fateswarm*
> 
> I've heard of that, then again if I don't see the stutters in games.. And even in some cases their settings are high enough.


only time i had stuttering in valley was with a crossfired 7950 and 7970. the difference in speeds must be to blame, though there were no stutters in games like BF3, C3, and C2. you can tell it stuttered based on the minimum, with the 7950 oc'ed to 1100 and the 7970 at stock . . .


----------



## HoneyBadger84

I get occasional hitching in Valley and Heaven, but only on the first loop through. I usually let them run through once on their own on a loop, THEN run the benchmark after the second run-through starts; that way the hitching doesn't occur. It's usually between certain scenes, or right after they start. On Heaven in particular there are two spots in the dark scenes: the one with the torch in the arches, and the one that starts in one area then arcs up over to a distant view. It hitches toward the beginning of the first, and as it's arcing up on the second. That only happens on the first run; on the second loop through it doesn't occur at all, and those hitches do drop the overall result by a few tenths if you run the benchmark right when the visuals start.

That's been my experience across 7970s, GTX 580s (from 2-3 years ago to now), and now 290Xs; it happens/happened on all of them. Valley I can't say for sure where the hitching is cuz I'm still new to it, but it only happens on 2-3 scenes, if memory serves.


----------



## JeremyFenn

Hey, what's up with the 14.6 beta driver for the 290x? Anything good there or should I stick to my 14.4?


----------



## ZealotKi11er

So, getting ready to add a second 290, I have been testing my 290X vs my 290 to see how much difference there is. I just used 3DMark to test the cards. Here are my results.

290X - 1000/1250 - [10516]
http://www.3dmark.com/fs/2402854
290X - 1200/1500 - [12935]
http://www.3dmark.com/3dm/3483798
290 - 947/1250 - [9983]
http://www.3dmark.com/3dm/3483837
290 - 1000/1250 - [10610]
http://www.3dmark.com/3dm/3483871
290 - 1200/1500 - [12028]

The only thing that puzzled me was that the 290 @ 1000/1250 had the same score as the 290X @ 1000/1250. It could be because of the power limit, since between the OC'ed cards there is a 7.5% difference.
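For what it's worth, the 7.5% figure checks out against the scores listed above; a quick sanity check (numbers copied straight from the list):

```python
# 3DMark scores from the runs listed above
scores = {
    ("290X", "1200/1500"): 12935,
    ("290",  "1200/1500"): 12028,
    ("290X", "1000/1250"): 10516,
    ("290",  "1000/1250"): 10610,
}

def pct_faster(a, b):
    """How much faster score a is than score b, in percent."""
    return (a - b) / b * 100.0

oc_gap = pct_faster(scores[("290X", "1200/1500")], scores[("290", "1200/1500")])
stock_gap = pct_faster(scores[("290X", "1000/1250")], scores[("290", "1000/1250")])
print(f"OC'ed:  290X is {oc_gap:.1f}% ahead")   # ~7.5%, matching the post
print(f"Stock: 290X is {stock_gap:.1f}% ahead")  # slightly negative, the puzzling result
```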


----------



## heroxoot

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fateswarm*
> 
> They are probably caused by some stutters I've noticed. Luckily I don't see them in games. I wonder if NVIDIA get them.
> 
> 
> 
> if oc'ed and it stutters, then it might be unstable. low minimums may be caused by starting the bench early. checkout my 7950 minimum (left) . . .

I've had this stutter too. In Valley it happens often, but in Heaven there are 2 select spots that always, and I mean ALWAYS, have a quick frame stutter. Other than that, Heaven doesn't have it for me. In Valley I've seen stutter at stock, though. Also, how is your 7950 doing almost as well as my stock 290X in Valley? It's this kind of thing that makes me feel like my GPU isn't running properly sometimes.


----------



## fateswarm

Well, their engine might be buggy. Those engines are beasts. Very complicated monsters.


----------



## rdr09

Quote:


> Originally Posted by *JeremyFenn*
> 
> Hey, what's up with the 14.6 beta driver for the 290x? Anything good there or should I stick to my 14.4?


checkout below.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So getting ready to add a second 290 i have been testing my 290X vs 290 to see how much difference there is. I just used 3DMark to test the cards. Here are my results.
> 
> 290X - 1000/1250 - [10516]
> http://www.3dmark.com/fs/2402854
> 290X - 1200/1500 - [12935]
> http://www.3dmark.com/3dm/3483798
> 290 - 947/1250 - [9983]
> http://www.3dmark.com/3dm/3483837
> 290 - 1000/1250 - [10610]
> http://www.3dmark.com/3dm/3483871
> 290 - 1200/1500 - [12028]
> 
> The only problem thing that puzzled me was 290 @ 1000/1250 had same score as 290X @ 1000/1250. It could be because of Power Limit because the OCed cards there is 7.5% difference.


with 14.6 Beta i matched your 290X with the 290 at the same clocks . . .

http://www.3dmark.com/3dm/3147012?
Quote:


> Originally Posted by *JeremyFenn*
> 
> Hey, what's up with the 14.6 beta driver for the 290x? Anything good there or should I stick to my 14.4?


Quote:


> Originally Posted by *heroxoot*
> 
> I'va had this stutter too. In Valley it happens often but in Heaven there are 2 select spots that always and I mean ALWAYS have a quick frame stutter. Heaven doesn't have it for me other than that. In Valley I've seen stutter on stock tho. Also how is your 7950 doing almost as good as my stock 290X in Valley? It's this kind of thing that makes me feel like my GPU isn't running properly sometimes.


cpu influences valley a tiny bit. that 7950 was oc'ed to 1255, with no tweaks, too. plus hawaii doesn't do well in valley and heaven. check out who is at the top of the valley bench thread.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> checkout below.
> with 14.6 Beta i matched your 290X with the the 290 at same clocks . . .
> 
> http://www.3dmark.com/3dm/3147012?
> 
> cpu influences valley a tiny bit. that 7950 was oc'ed to 1255. that was no tweaks, too. plus hawaii don't do well in valley and heaven. checkout who is at the top of the valley bench thread.


That's a nice improvement.


----------



## JeremyFenn

Quote:


> Originally Posted by *rdr09*
> 
> checkout below.
> with 14.6 Beta i matched your 290X with the the 290 at same clocks . . .
> 
> http://www.3dmark.com/3dm/3147012?
> 
> cpu influences valley a tiny bit. that 7950 was oc'ed to 1255. that was no tweaks, too. plus hawaii don't do well in valley and heaven. checkout who is at the top of the valley bench thread.


Sooooo, I'm talking about the 14.6 driver, not the 14.1. And your scores are higher since you have a CPU that's $150 more than mine


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Thats nice improvement.


i think i noticed improvements in games as well. you should go for it, Zeal.

wait, you got water? yah, you're good.
Quote:


> Originally Posted by *JeremyFenn*
> 
> Sooooo, I'm talking about the 14.6 driver not the 14.1. And you're scores are higher since you have a CPU that's $150 more than mine


i am talking about the 14.6 Beta, not the RC. try it.









wait, you're on air? no, 14.6 might make your gpu run warmer.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> i think i notice improvements in games as well. you should go for it Zeal.
> 
> wait, you got water? yah, you are good.
> i am talking about 14.6 Beta not RC. try it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> wait, you are on air? no, 14.6 might make your gpu run warmer.


Would be nice to break 14K with the 290X. How well do 780 Tis score?


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Would be nice to break 14K from with 290X. How well do 780 To score?


3DMark leans toward AMD. a highly oc'ed 780 will compete. my 290 at 1300 core should be matched by a 780 in this bench at around 1450, maybe more.

Switch to 14.6 Beta and run Sky Diver.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Would be nice to break 14K from with 290X. How well do 780 To score?
> 
> 
> 
> *3DMark leans to AMD*. Highly oc'ed 780 will compete. my 290 at 1300 core should be matched by a 780 in this bench at around 1450 maybe. more.
> 
> Switch to 14.6 B and run Sky Diver.

That's not true. Futuremark is heavily NV-biased imo. I've seen NV cards which scored lower in GS and physics, yet got a much higher combined score, leading to a higher P score. That's unpossible lol. Some benches simply highlight the big difference in IPC between the two architectures.


----------



## JeremyFenn

Quote:


> Originally Posted by *tsm106*
> 
> That's not true. Futuremark is heavily NV biased imo. I've seen NV cards which scored lower in GS and physics, yet they scored a much higher combine score leading to a higher P score. That's unpossible lol. Some benches simply highlight the big difference in IPC between the two architectures.


I think an intel/nvidia setup will always trump amd/amd systems purely on the processing power/architecture of the CPUs and the physics of the nvidia chipsets. Frankly, for the price you still can't beat an AMD/AMD system imo.


----------



## Mega Man

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Would be nice to break 14K from with 290X. How well do 780 To score?
> 
> 
> 
> *3DMark leans to AMD*. Highly oc'ed 780 will compete. my 290 at 1300 core should be matched by a 780 in this bench at around 1450 maybe. more.
> 
> Switch to 14.6 B and run Sky Diver.
> 
> 
> That's not true. Futuremark is heavily NV biased imo. I've seen NV cards which scored lower in GS and physics, yet they scored a much higher combine score leading to a higher P score. That's unpossible lol. Some benches simply highlight the big difference in IPC between the two architectures.











agreed
Quote:


> Originally Posted by *JeremyFenn*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> That's not true. Futuremark is heavily NV biased imo. I've seen NV cards which scored lower in GS and physics, yet they scored a much higher combine score leading to a higher P score. That's unpossible lol. Some benches simply highlight the big difference in IPC between the two architectures.
> 
> 
> 
> I think intel/nvidia setup a will always trump amd/amd systems purely on the processing power/architure of the CPUs and the physics of the nvidia chipsets. Frankly for the price you still can't beat an AMD/AMD system imo.
Click to expand...

not true, amd owns nvidia imo esp in large res !


----------



## JeremyFenn

Quote:


> Originally Posted by *Mega Man*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> agreed
> not true, amd owns nvidia imo esp in large res !


That's for sure!!! I'm just saying the PhysX portion of nvidia cards will score better in physics tests since amd doesn't have that.


----------



## Sgt Bilko

Quote:


> Originally Posted by *JeremyFenn*
> 
> Tha
> That's for sure!!! I'm just stating the physX portion of nvidia cards will score better in physics tests since amd doesn't have that.


It's CPU physics, not GPU-based.

PhysX will not boost your score in 3DMark, due to the fact that it uses your CPU to calculate the physics and not your GPU


----------



## Dasboogieman

Quote:


> Originally Posted by *Dsrt*
> 
> Hi, Im running 2x r9-290x DirectCU II OC's. My case is Fractal Design R4. The problem is that my Upper GFX card VRM 1# goes up to 97c during Heaven 4.0. Im running stock settings, with a custom fanprofile in Afterburner.
> Card 1 core ~88'c @ 73% fanspeed, 97c VRM 1#
> Card 2 core ~80'c @ 73% fanspeed, 89c VRM 1#
> 
> Heres a paint picture of my fan setup.
> 
> Im wondering If I should buy a bigger case with better airflow or what? I dont really want to put these under water because it voids warranty and it would cost too much.


Bear in mind that the ASUS DirectCU II is known to be rather inefficient with Hawaii due to the heatpipe design. I'd try HoneyBadger's suggestion first and get rid of those PCIe bracket covers. That increases the cool-air intake a lot.

Another issue you have here is heat buildup. The bottom card is a lot hotter than I expected for 73% fan speed, which tells me there is a lack of cool air getting to the cards. There are generally two approaches to this problem. I, personally, am using a negative-pressure setup with a really powerful side extraction fan, because my 290s exhaust heat sideways and draw in air via the empty PCIe bracket slots.
However, in your case the ASUS DirectCU has lengthwise fins, which means you might get less mileage out of a powerful side extraction unit with respect to pulling out hot air.
I would also try adding more powerful rear and side fans; then you can experiment with the side fan as exhaust or intake and see what your mileage is. I'm leaning towards a more powerful side intake.

Secondly, you can mod the ASUS cards themselves. This isn't technically voiding the warranty, since you're only removing the backplate on each. What you do is use 1.5mm thermal pads (you might need a 1.5mm and a 1mm, depending on the clearance between the plate and the PCB) in the VRM area on the back side. This should drop VRM temps by about 5-10%. You then attach sticky VRAM heatsinks to the flat backplate over the VRM 1 area for another 5-10% drop in temperatures. This won't fix your core temps, but it will improve your VRM temps to the point where you can reduce your fan speed, since the core can be allowed to go to around 94 degrees with no dramas.


If all else fails, get that new case.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Its CPU physics, not GPU based


IIRC NVIDIA got crucified for cheating in 3DMark Vantage when the press found out that GPU PhysX was being used.


----------



## JeremyFenn

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Its CPU physics, not GPU based


See, ya learn something new every day. I thought PhysX helped with that.


----------



## Sgt Bilko

Quote:


> Originally Posted by *JeremyFenn*
> 
> See ya learn something new everyday. I thought the physX helped with that.


If it were, you'd see FX-6300s with 780 Tis outscoring 4770K/290X combos









The only bench I've seen Nvidia outperform AMD on consistently is Catzilla.

I've seen SLI GTX 770s come very close to my 290s at max clocks on that


----------



## hwoverclkd

Quote:


> Originally Posted by *fateswarm*
> 
> They are probably caused by some stutters I've noticed. Luckily I don't see them in games. I wonder if NVIDIA get them.


Yes. You'd see those 'blips' on both nvidia and amd cards (valley/heaven).


----------



## hwoverclkd

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Its CPU physics, not GPU based
> 
> PhysX will not boost your score in 3Dmark due to fact it uses your CPU to calculate the physics and not your GPU


Quote:


> Originally Posted by *JeremyFenn*
> 
> See ya learn something new everyday. I thought the physX helped with that.


wait, even though the physics test is CPU-based, i believe the physics score is still influenced more by using an nvidia card.

http://www.3dmark.com/fs/2185608

http://www.3dmark.com/fs/2128140

You can tell me tomorrow if there's something i missed...going to bed now.


----------



## kizwan

Quote:


> Originally Posted by *JeremyFenn*
> 
> Hey, what's up with the 14.6 beta driver for the 290x? Anything good there or should I stick to my 14.4?


14.4 WHQL vs. 14.6 Beta
http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/23630#post_22346469
Quote:


> Originally Posted by *ZealotKi11er*
> 
> So getting ready to add a second 290 i have been testing my 290X vs 290 to see how much difference there is. I just used 3DMark to test the cards. Here are my results.
> 
> 290X - 1000/1250 - [10516]
> http://www.3dmark.com/fs/2402854
> 290X - 1200/1500 - [12935]
> http://www.3dmark.com/3dm/3483798
> 290 - 947/1250 - [9983]
> http://www.3dmark.com/3dm/3483837
> 290 - 1000/1250 - [10610]
> http://www.3dmark.com/3dm/3483871
> 290 - 1200/1500 - [12028]
> 
> The only problem thing that puzzled me was 290 @ 1000/1250 had same score as 290X @ 1000/1250. It could be because of Power Limit because the OCed cards there is 7.5% difference.


Did you re-run the benchmarks a couple of times? Probably just a fluke. Just set the power limit to +50% when overclocking.


----------



## Dsrt

Quote:


> Originally Posted by *Dsrt*
> 
> Hi, Im running 2x r9-290x DirectCU II OC's. My case is Fractal Design R4. The problem is that my Upper GFX card VRM 1# goes up to 97c during Heaven 4.0. Im running stock settings, with a custom fanprofile in Afterburner.
> Card 1 core ~88'c @ 73% fanspeed, 97c VRM 1#
> Card 2 core ~80'c @ 73% fanspeed, 89c VRM 1#
> 
> Heres a paint picture of my fan setup.
> 
> Im wondering If I should buy a bigger case with better airflow or what? I dont really want to put these under water because it voids warranty and it would cost too much.


Ok, now I've tried a few things.









I keep my case under my desk, which it seems isn't really good with this crossfire setup. So I placed the case in front of the desk, opened the side panel, and had a desktop fan blow directly into it.

And here are the results:


Then I put the side panel back on and turned off the desktop fan, but didn't put the case back under the desk this time.
Results:


Seems that placing the case under the desk makes my temps almost 10c higher. I wonder if I could still drop the temps a bit by investing in better fans, or should I just go for a case like the Corsair Air 540?

And finally, a validation picture for joining the club:










2x Asus R9-290X DirectCU II OC


----------



## Arizonian

Quote:


> Originally Posted by *Dsrt*
> 
> Ok now Ive tried few things
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I keep my case under my desk, which seems that its not really good with this crossfire setup that I have. So I placed my case in front of the desk and opened sidepanel + made a desktop fan blow directly to it.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> And heres the results
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Then I put my sidepanel back on and turned off the desktopfan but didint place my case under the desktop this time.
> Results:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Seems that the case placement under the desktop is making my temps almost +10c higher. Wonder If I still could drop the temps abit if I would invest in better fans or just go for case like Corsair Airflow 540?
> 
> And Finally a validation picture for joining the club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 2x Asus R9-290X DirectCU II OC


Congrats - added


----------



## Infinite Jest

I'll be joining the club soon with a reference 290!









Question for you fine AMDers: Has anyone had experience with the Gelid Icy Vision cooler? Thoughts?


----------



## aaroc

Is it worth going from an R9 290 to an R9 290X for a few bucks? I just saw a Linus video on YouTube showing that the R9 290 was a better overclocker than the R9 290X, and they got the same results on both cards. I don't know if their OC/benchmark methodology is correct or if they got the best R9 290 in the world. Do you have any experience or opinion to share? Thanks!


----------



## Blaise170

Quote:


> Originally Posted by *Infinite Jest*
> 
> I'll be joining the club soon with a reference 290!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Question for you fine AMDers: Has anyone had experience with the Gelid Icy Vision cooler? Thoughts?


I've heard it's okay, but with most companies it will void your warranty since it is a binding heatsink. When I talked to XFX, they told me I could use aftermarket air and water cooling, but not the Gelid, for that reason.
Quote:


> Originally Posted by *aaroc*
> 
> Its worth going from R9 290 to R9 290X for a few bucks? I just saw a Linus video on Utube showing that the R9 290 was a better OC than the R9 290X and they get the same results on both cards. I don't know if their OC/benchmark methodology is correct or if they got the best R9 290 in the world. Do you have some experience or opinion to share? Thanks!


Most 290s will compete with the 290X - in general, you will see less than a 5% difference between the two cards.


----------



## Mega Man

Quote:


> Originally Posted by *Blaise170*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Infinite Jest*
> 
> I'll be joining the club soon with a reference 290!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Question for you fine AMDers: Has anyone had experience with the Gelid Icy Vision cooler? Thoughts?
> 
> 
> 
> I've heard that it's okay, but for most companies it will void your warranty since it is a binding heatsink. When I talked to XFX, they told me I could use aftermarket air and watercooling, but not the Gelid for that reason.
> Quote:
> 
> 
> 
> Originally Posted by *aaroc*
> 
> Its worth going from R9 290 to R9 290X for a few bucks? I just saw a Linus video on Utube showing that the R9 290 was a better OC than the R9 290X and they get the same results on both cards. I don't know if their OC/benchmark methodology is correct or if they got the best R9 290 in the world. Do you have some experience or opinion to share? Thanks!
> 
> 
> Most 290s will compete with the 290x - in general, you will see less than 5% difference in the two cards. _*except in price !*_

fixed !


----------



## HoneyBadger84

Quote:


> Originally Posted by *JeremyFenn*
> 
> Hey, what's up with the 14.6 beta driver for the 290x? Anything good there or should I stick to my 14.4?


I've been running the 14.6 Betas and/or RCs since I got my 290Xs, cuz one of the primary reasons I got them was Watch_Dogs, which needed the 14.6s to be playable in Crossfire.


----------



## Dasboogieman

Quote:


> Originally Posted by *Blaise170*
> 
> I've heard that it's okay, but for most companies it will void your warranty since it is a binding heatsink. When I talked to XFX, they told me I could use aftermarket air and watercooling, but not the Gelid for that reason.
> Most 290s will compete with the 290x - in general, you will see less than 5% difference in the two cards.


Wait, what does XFX mean by binding cooler?


----------



## HoneyBadger84

Quote:


> Originally Posted by *Dasboogieman*
> 
> Wait, what does XFX mean by binding cooler?


Super glue. Lol, j/k (at least I hope it doesn't mean it's glued on with something)


----------



## ZealotKi11er

Quote:


> Originally Posted by *aaroc*
> 
> Its worth going from R9 290 to R9 290X for a few bucks? I just saw a Linus video on Utube showing that the R9 290 was a better OC than the R9 290X and they get the same results on both cards. I don't know if their OC/benchmark methodology is correct or if they got the best R9 290 in the world. Do you have some experience or opinion to share? Thanks!


From my testing, the 290X is about 7.5% faster.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> That's not true. Futuremark is heavily NV biased imo. I've seen NV cards which scored lower in GS and physics, yet they scored a much higher combine score leading to a higher P score. That's unpossible lol. Some benches simply highlight the big difference in IPC between the two architectures.


Noted.
Quote:


> Originally Posted by *aaroc*
> 
> Its worth going from R9 290 to R9 290X for a few bucks? I just saw a Linus video on Utube showing that the R9 290 was a better OC than the R9 290X and they get the same results on both cards. I don't know if their OC/benchmark methodology is correct or if they got the best R9 290 in the world. Do you have some experience or opinion to share? Thanks!


comes down to silicon. if you have both a 290X and a 290 that are very good overclockers, then the 290X is sure to come out on top. a 290X oc'ed at 1250 or higher will be untouched by a good-clocking 290. i read the extra shaders help at higher rez.


----------



## JeremyFenn

Quote:


> Originally Posted by *aaroc*
> 
> Its worth going from R9 290 to R9 290X for a few bucks? I just saw a Linus video on Utube showing that the R9 290 was a better OC than the R9 290X and they get the same results on both cards. I don't know if their OC/benchmark methodology is correct or if they got the best R9 290 in the world. Do you have some experience or opinion to share? Thanks!


I have a 290x that can get up to 1230/1600, but pull it back to 1200/1500 due to *cough cough* issues... I'd say look at the cooling on the card, that's what made all the difference for me. I personally love the Vapor-X by Sapphire (it's around $600) but even at 1200/1600 running full for hours, with my fan profile (1:1 0c-50c then 50c-80c =50%-100%) I barely touch 70c (more like 68c or so) and that's hours of games like watch dogs, Thief, maxed out and using over 3GB of VRAM and using the CPU 100%. I hear people get into the 80's and 90's with the 290x and it's supposed to just take it, I'm just glad mine keeps frosty. Oh, also, someone mentioned that Sapphire re-vamped their VRMs on these cards? Anyway, here's a link:

SAPPHIRE VAPOR-X R9 290X 4GB GDDR5 TRI-X OC (UEFI)

http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&pid=2283&psn=&lid=1&leg=0

http://www.amazon.com/Sapphire-VAPOR-X-PCI-Express-Graphics-11226-09-40G/dp/B00K2OJ38O

I'd post a link to the one on Newegg.com (the one that I bought), but the link from my order history brings me to a 1030/1325 one, and mine was 1080/1410 stock. I'm not sure if they just pulled it back a little on their new ones or what, but that Vapor-X is the real deal unless you're thinking of WC.
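The two-segment fan profile described above (1:1 temp-to-fan% up to 50C, then a linear ramp to 100% at 80C) can be sketched as a small function. This is a hypothetical helper for illustration, not Trixx or Afterburner code; only the breakpoints come from the post:

```python
def fan_percent(temp_c: float) -> float:
    """Fan duty (%) for a given GPU temperature (C), per the profile above."""
    if temp_c <= 50:
        # 1:1 region: 0-50C maps directly to 0-50% fan speed
        return max(0.0, float(temp_c))
    if temp_c >= 80:
        return 100.0
    # linear ramp: 50% at 50C up to 100% at 80C
    return 50.0 + (temp_c - 50.0) * (100.0 - 50.0) / (80.0 - 50.0)

print(fan_percent(40))  # 40.0
print(fan_percent(65))  # 75.0
print(fan_percent(90))  # 100.0
```

The steep ramp past 50C is what keeps the card under ~70C at the cost of fan noise.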


----------



## mojobear

Hey Arizonian,

Could you please update me to quad 290s









3 of the GPUs are sapphire. One is powercolor (with sapphire bios)

Cooling: EK waterblocks

Thanks!


----------



## Talon720

Quote:


> Originally Posted by *heroxoot*
> 
> Depends on the company. MSI voided my sticker to check it when they sent it as a replacement, but they say as long as the card isn't physically broken they take it anyway. Sapphire tends to be the same.


I'm behind a few pages, but I was going to RMA my Sapphire. When I got the RMA number it said "RMA won't be accepted if the S/N or P/N is damaged or missing." I was going to return it in the original box so the numbers could be taken off that, but unfortunately it also specifically states not to return it in the original box, just the bare product wrapped. I really didn't wanna send it in and wait only to be rejected. I removed the sticker where the thermal pads go for the backplate, used the ArctiClean stuff, and some of it got on the other 2 stickers, completely destroying one and mostly destroying the other. At least with the PCS+ the stickers are on the backplate; the only slight negative is it's Elpida.


----------



## Blaise170

Quote:


> Originally Posted by *Dasboogieman*
> 
> Wait, what does XFX mean by binding cooler?


It means the heatsink binds to the PCB. That makes it very difficult, if not impossible, to remove without damaging it.


----------



## Gobigorgohome

It seems like quadfire R9 290X stock with fans at 100% draws more power than my EVGA G2 1300W can produce.


----------



## KeepWalkinG

Quote:


> Originally Posted by *Gobigorgohome*
> 
> It seems like quadfire R9 290X stock with fans at 100% draws more power than my EVGA G2 1300W can produce.


A friend changed his power supply from a Corsair AX1200i to an AX1500i and now there are no problems with quad CrossFire 290Xs.
Even the draw at the wall socket exceeds 1500 watts after overclocking.


----------



## rdr09

Finally got rid of 13.11 on my second HDD. Took less than 15 mins to switch to 14.6 Beta RC. Have yet to try some games. My other HDD has 14.6 Beta.

As usual, all I did was download the latest, ran it, and used it to uninstall the existing driver (Express method). Rebooted and ran RC to install (Express method). After installing RC and prior to rebooting, I went to msconfig and unchecked CCC from startup programs. Rebooted and ran Firestrike as an initial test.

13.11

http://www.3dmark.com/3dm/1900067

14.6 Beta RC

http://www.3dmark.com/3dm/3487968?

Seems as stable as 14.6 Beta and 13.11.


----------



## ZealotKi11er

Quote:


> Originally Posted by *mojobear*
> 
> Hey Arizonian,
> 
> Could you please update me to quad 290s
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 3 of the GPUs are sapphire. One is powercolor (with sapphire bios)
> 
> Cooling: EK waterblocks
> 
> Thanks!


Mother of GOD. Why are you not going X79?
Quote:


> Originally Posted by *rdr09*
> 
> Finally got rid of 13.11 on my second HDD. Took less than 15 mins. to switch to 14.6 Beta RC. Have yet to try some games. My other HDD has 14.6 Beta.
> 
> As usual, all i did was download the latest, ran it and used it to uninstall existing driver (Express method). Rebooted and ran RC to install (Express method). After installing RC and prior to rebooting, i went to msconfig and uncheck CCC from startup programs. Rebooted and ran Firestrike as initial test.
> 
> 13.11
> 
> http://www.3dmark.com/3dm/1900067
> 
> 14.6 Beta RC
> 
> http://www.3dmark.com/3dm/3487968?
> 
> Seems as stable as 14.6 Beta and 13.11.


Not that much difference. What memory does your card have? Also do you do anything special when you run the benchmarks?


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Mother of GOD. Why are you not going X79?
> Not that much difference. What memory does your card have? Also do you do anything special when you run the benchmarks?


Yes, not much difference. About 200 points more in graphics score from 13.11, the same amount I get using 14.6 Beta. Nothing special with my rig, just 1600MHz RAM. Here is 14.6 Beta . . .

http://www.3dmark.com/3dm/3158989?


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> Yes, not much difference. About 200 points more in graphics score from 13.11, the same amount I get using 14.6 Beta. Nothing special with my rig, just 1600MHz RAM. Here is 14.6 Beta . . .
> 
> http://www.3dmark.com/3dm/3158989?


Is it me, or is the Physics score a bit too low?


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Is it me, or is the Physics score a bit too low?


That's my daily setting for the CPU. No need for HT; a 4.5GHz OC is enough to max out C3 at 1080p and BF4 at 130% res scale. Takes 5-7C off my loop.


----------



## tsm106

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Mother of GOD. Why are you not going X79?


I was wondering where the 2nd bank of memory slots was, hehe? Why no backplates? I think it's crazy to go water and not get the backplates. The EK smooth plates not only add stiffness and some cooling, they add a big protective layer against everything from drips to screws to whatever.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> That's my daily setting for the CPU. No need for HT; a 4.5GHz OC is enough to max out C3 at 1080p and BF4 at 130% res scale. Takes 5-7C off my loop.


AHA, forgot about HT. I also have it on. Got to have it on since I came from a 3570 to a 3770 lol.
Quote:


> Originally Posted by *tsm106*
> 
> I was wondering where the 2nd bank of memory slots was, hehe? Why no backplates? I think it's crazy to go water and not get the backplates. The EK smooth plates not only add stiffness and some cooling, they add a big protective layer against everything from drips to screws to whatever.


Believe it or not, backplates are expensive. $40 is too much for one. I paid $130 for my copper EK block; $140 is for the chrome-plated one. Add $40 and then 13% tax + shipping and it goes to $200 to water-cool 1 card that now costs less than $300. Water cooling has slowly gotten more expensive. You used to get similar blocks for $100 2 years ago.


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> AHA, forgot about HT. I also have it on. Got to have it on since I came from a 3570 to a 3770 lol.


Maybe if I add another 290, but with one, a 4.5GHz OC works.


----------



## Dasboogieman

Quote:


> Originally Posted by *Gobigorgohome*
> 
> It seems like quadfire R9 290X stock with fans at 100% draws more power than my EVGA G2 1300W can produce.


Good lord, you brought an EVGA G2 1300W to its knees? I guess for quadfire, anything less than an AX1500i isn't gonna do. Any chance of getting an auxiliary PSU?


----------



## tsm106

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gobigorgohome*
> 
> It seems like quadfire R9 290X stock with fans at 100% draws more power than my EVGA G2 1300W can produce.
> 
> 
> 
> Good lord, you brought an EVGA G2 1300W to its knees? I guess for quadfire, anything less than an AX1500i isn't gonna do. Any chance of getting an auxiliary PSU?
Click to expand...

1600W isn't enough to run a highly overclocked quad 290X setup, assuming all the trimmings. Hell, 1600W is arguably not enough for four golden 7970s either.


----------



## ZealotKi11er

Quote:


> Originally Posted by *tsm106*
> 
> I was wondering where the 2nd bank of memory slots was, hehe? Why no backplates? I think it's crazy to go water and not get the backplates. The EK smooth plates not only add stiffness and some cooling, they add a big protective layer against everything from drips to screws to whatever.


Quote:


> Originally Posted by *tsm106*
> 
> 1600w isn't enough to run a highly overclocked quad 290x setup, assuming all the trimming. Hell, 1600w is arguably not enough for four gold 7970s either.


I would not want to run 4x 290X in my room and game with them. That's more heat than a heater.


----------



## tsm106

External rads for the win. I'm way over sticking 10 rads into every orifice in any given case.


----------



## Infinite Jest

Quote:


> Originally Posted by *Blaise170*
> 
> It means the heatsink binds to the PCB. That makes it very difficult, if not impossible, to remove without damaging it.


Can you define "bind"? Is this because of a thermal glue or some kind of physical clamp?


----------



## Dasboogieman

Quote:


> Originally Posted by *tsm106*
> 
> External rads for the win. I'm way over sticking 10 rads into every orifice in any given case.


I did read a build log of some dude who immersed his external rad into a groundwater reservoir that was piped through copper channels embedded in the concrete foundations of his house. 'Twas truly epic cooling power, and silent. I'll see if I can find it again. I think he was cooling some 580s in tri-SLI.


----------



## ZealotKi11er

Quote:


> Originally Posted by *tsm106*
> 
> External rads for the win. I'm way over sticking 10 rads into every orifice in any given case.


Always wanted to do that. Maybe I'll add another rad in parallel and have the ability to still run the loop without it, if that's possible.


----------



## tsm106

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> External rads for the win. I'm way over sticking 10 rads into every orifice in any given case.
> 
> 
> 
> I did read a build log of some dude who immersed his external rad in to a groundwater reservoir that was piped through some copper channels embedded in the concrete foundations of his house. Twas truly epic cooling power and silent. I'll try and see if I can find it again. I think he was cooling some 580s in Tri SLI.
Click to expand...

Are you referring to Vega's geothermal setup?


----------



## Gobigorgohome

Quote:


> Originally Posted by *Dasboogieman*
> 
> Good lord, you brought an EVGA G2 1300W to its knees? I guess for Quadfire, anything less than an AX1500i isn't gonna do. Any chance of getting an Auxilliary PSU?


It booted up just fine; I started CPU-Z, MSI Afterburner and HWMonitor, opened Steam and pushed "Play" on Hitman Absolution, then it all just froze (more than once). It is not the PCIe port or the card (running in the fourth port now with that exact card). Seems like I need another PSU.









I guess it will be another EVGA G2 1300W.
Quote:


> Originally Posted by *tsm106*
> 
> 1600W isn't enough to run a highly overclocked quad 290X setup, assuming all the trimmings. Hell, 1600W is arguably not enough for four golden 7970s either.


I do not know exactly how much power is needed for quad R9 290X. I plan to buy another EVGA G2 1300W to use with my system, so I will have at least 2000 watts of power after that purchase (the only thing I need to make my build "complete"). It is just a little silly to sit here with four cards and not be able to run more than three; I will be able to use four in a week though.
Quote:


> Originally Posted by *tsm106*
> 
> External rads for the win. I'm way over sticking 10 rads into every orifice in any given case.


I am doing external rads, but I am not finished with my "radiator-barrel" yet.


----------



## mojobear

Haha 32 lanes with PLX + 4770K is fast enough for me.







Instead of sinking an extra 200 dollars into the CPU etc., I used the extra cash for a radiator + GT fans.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> Mother of GOD. Why are you not going X79?


Backplates would cost 200 more (40x4 plus tax). Can't justify the cost given the primarily aesthetic appeal. The EK block provides enough stiffness.








Quote:


> Originally Posted by *tsm106*
> 
> I was wondering where the 2nd bank of memory slots was, hehe? Why no backplates? I think it's crazy to go water and not get the backplates. The EK smooth plates not only add stiffness and some cooling, they add a big protective layer against everything from drips to screws to whatever.


The only thing stupid about the EK terminal, other than the o-rings popping out, is that it blocks the BIOS pin. It looks damn good though.


----------



## Blaise170

Quote:


> Originally Posted by *Infinite Jest*
> 
> Can you define "bind"? Is this because of a thermal glue or some kind of physical clamp?


Thermal glue I believe.


----------



## tsm106

Quote:


> Originally Posted by *mojobear*
> 
> Blackplates would cost 200 more (40x4 plus tax). *Cant justify* the cost given primarily aesthetic appeal. The EK block provides enough stiffness
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I was wondering where the 2nd bank of memory slots was, hehe? Why no backplates? I think it's crazy to go water and not get the backplates. The EK smooth plates not only add stiffness and some cooling, they add a big protective layer against everything from drips to screws to whatever.
Click to expand...











lol right. It's up to you how you spend your money and how you rationalize it. Imo it is stupid to go the distance and not get the plates. They are not just aesthetic. They've saved my arrays from leaks, drips, shorts, and physical damage many a time. Hell, the Fuji pads I used back in Nov 2013 on my 290X cost more than the plates alone per card.


----------



## aaroc

Quote:


> Originally Posted by *mojobear*
> 
> Hey Arizonian,


You have quick disconnects on the in & out tubes from your Quad Fire? Can you please share the pros and cons of doing that?
Quote:


> Originally Posted by *Gobigorgohome*
> 
> It seems like quadfire R9 290X stock with fans at 100% draws more power than my EVGA G2 1300W can produce.


I have a Corsair AX1200i that can report power usage via Corsair Link. CHVFZ + FX8350 + 3x R9 290 uses 900 to 1000W when running benchmarks or games. My APC 1500BG UPS started to scream overload (895W max), showing the same usage on its LCD, so I had to connect the UPS only to my monitors. All this with stock clocks and voltage. I will use 2 PSUs if I start OCing or get another R9 290 for quadfire.


----------



## tsm106

Quote:


> Originally Posted by *aaroc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mojobear*
> 
> Hey Arizonian,
> 
> 
> 
> 
> You have quick disconnects on the in & out tubes from your Quad Fire? Can you please share the pros and cons of doing that?
Click to expand...

If you are watercooling GPUs, especially multiples, you need QDCs. They are not set up in the above pic for quick removal of the GPU array, from what I can tell of the pics. A better example of quick removal of an array through QDCs can be seen below. The array literally takes 3 minutes to remove.

http://cdn.overclock.net/f/fc/fcb4620b_P1080796.jpeg


----------



## Infinite Jest

Quote:


> Originally Posted by *Blaise170*
> 
> Thermal glue I believe.


Ah, okay. That's not a problem as I have tape.


----------



## Blaise170

Quote:


> Originally Posted by *Gobigorgohome*
> 
> It booted up just fine, I started CPU-Z, MSI Afterburner and HWMonitor, opened Steam and pushed "Play" on Hitman Absolution, then it all just froze (more than once). It is not the PCI-port or the card (running in the fourth port now with that exact card). Seems like I need another PSU.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I guess it will be another EVGA G2 1300W.
> I do not know exactly how much power is needed for quad R9 290X. I plan to buy another EVGA G2 1300W to use with my system, so I will have at least 2000 watts of power after that purchase (the only thing I need to make my build "complete"). It is just a little silly to sit here with four cards and not be able to run more than three; I will be able to use four in a week though.
> I am doing external rads, but I am not finished with my "radiator-barrel" yet.


R9 290X Quadfire: 350 x 4 = 1400W
R9 290 Quadfire: 300 x 4 = 1200W

These are max wattages based on a stock reference card.
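Those per-card figures make for a quick sanity check against a PSU rating. A rough sketch of the arithmetic, where the 300W allowance for CPU, board, drives, and fans is an assumed placeholder of mine, not a figure from the thread:

```python
def total_draw_watts(cards: int, watts_per_card: int, rest_of_system: int = 300) -> int:
    """Worst-case draw: GPU max wattage times card count, plus the rest of the rig."""
    return cards * watts_per_card + rest_of_system

# 290X quadfire at the 350W/card figure quoted above
quad_290x = total_draw_watts(4, 350)
print(quad_290x, quad_290x > 1300)  # 1700 True -- past a 1300W unit's rating

# 290 quadfire at 300W/card
quad_290 = total_draw_watts(4, 300)
print(quad_290)  # 1500
```

This matches the thread's experience: a 1300W unit handles three stock cards but freezes with the fourth.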


----------



## mojobear

Hi there,

Actually my quick disconnects are there to isolate my GPUs (you can't see the second one, it's under the 4th GPU), and I also had a second pair of QDs to isolate my stupid external radiator (EX480), which has the inlet and outlet at the bottom and is thus a *%&$ to bleed air out of... therefore I can prefill the EX480 before adding it to the loop. With multi-GPU setups I like QDs because if there are issues it's a lot easier to get the cards out without bleeding the entire loop. Hope that helps!
Quote:


> Originally Posted by *aaroc*
> 
> You have quick disconnects on the in & out tubes from your Quad Fire? Can you please share the pros and cons of doing that?
> I have a Corsair AX1200i that can report power usage via Corsair Link. CHVFZ + FX8350 + 3x R9 290 uses 900 to 1000W when running benchmarks or games. My APC 1500BG UPS started to scream overload (895W max), showing the same usage on its LCD, so I had to connect the UPS only to my monitors. All this with stock clocks and voltage. I will use 2 PSUs if I start OCing or get another R9 290 for quadfire.


----------



## mojobear

Actually I lied, you can see the QDs... they are circled in red.











Quote:


> Originally Posted by *aaroc*
> 
> You have quick disconnects on the in & out tubes from your Quad Fire? Can you please share the pros and cons of doing that?
> I have a Corsair AX1200i that can report power usage via Corsair Link. CHVFZ + FX8350 + 3x R9 290 uses 900 to 1000W when running benchmarks or games. My APC 1500BG UPS started to scream overload (895W max), showing the same usage on its LCD, so I had to connect the UPS only to my monitors. All this with stock clocks and voltage. I will use 2 PSUs if I start OCing or get another R9 290 for quadfire.


----------



## mojobear

Haha, you're right.









To each their own. I don't think I could spend that much on backplates + Fuji pads... that's like the cost of another 290 (used)... I would just build a mini-ITX system with that lol. It's kinda silly having 4 290s and saying this, but I would side with function over form given the same $$. I guess for you they also serve a function.

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> lol right. It's up to you how you spend your money and how you rationalize it. Imo it is stupid to go the distance and not get the plates. They are not just aesthetic. They've saved my arrays from leaks, drips, shorts, physical damage many a times. Hell, the fujis I used back in Nov 2013 on my 290x costs more than the plates alone per card.


----------



## heroxoot

Quote:


> Originally Posted by *Talon720*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Depends on the company. MSI voided my sticker to check it when they sent it as a replacement, but they say as long as the card isn't physically broken they take it anyway. Sapphire tends to be the same.
> 
> 
> 
> I'm behind a few pages, but I was going to RMA my Sapphire. When I got the RMA number it said "RMA won't be accepted if the S/N or P/N is damaged or missing." I was going to return it in the original box so the numbers could be taken off that, but unfortunately it also specifically states not to return it in the original box, just the bare product wrapped. I really didn't wanna send it in and wait only to be rejected. I removed the sticker where the thermal pads go for the backplate, used the ArctiClean stuff, and some of it got on the other 2 stickers, completely destroying one and mostly destroying the other. At least with the PCS+ the stickers are on the backplate; the only slight negative is it's Elpida.
Click to expand...

That is a problem. The sticker is always on mine. I always avoid hurting the sticker.


----------



## ZealotKi11er

Tried the 14.6 RC2 (June 23) drivers. So far so good. Got a 400 point boost in GPU score, and the Combined score saw a nice increase.

http://www.3dmark.com/3dm/3490073


----------



## koekwau5

Quote:


> Originally Posted by *heroxoot*
> 
> That is a problem. The sticker is always on mine. I always avoid hurting the sticker.


For work I've sent a whole bunch of graphics cards in for RMA without stickers. (Edit: some brands we used to sell: Asus, MSI, Gigabyte.)
Sometimes they simply come loose from the heat, and I've seen a couple chopped up, hiding in fans as well.
Never got rejected. If you've got the invoice matching the S/N registered in the card itself there is no problem.
The sticker is just there to scare people away from opening / modifying the graphics card.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> Tried the 14.6 RC2 (June 23) drivers. So far so good. Got a 400 point boost in GPU score, and the Combined score saw a nice increase.
> 
> http://www.3dmark.com/3dm/3490073


Interesting.
Experienced any x101 BSODs while watching YouTube videos, aka ATI's most known issue?
Link: http://www.overclock.net/t/1478456/bluescreen-error-0xa0000001-with-amd-290x


----------



## Gobigorgohome

Quote:


> Originally Posted by *aaroc*
> 
> I have a Corsair AX1200i that can report power usage via Corsair Link. CHVFZ+FX8350+3xR9 290 uses 900 to 1000W when running benchmarks or games. My APC 1500BG UPS started to scream overload (895 max) showing same usage on its LCD, so I had to connect the UPS only to my monitors. All this with stock clocks and voltage. I will use 2 PSU if I start OC or get another R9 290 for Quadfire.


I am using an Asus RIVBE, 3930K @ 1.23v, 4x R9 290X stock, 16 GB RAM, 2x HDDs, 1x D5 pump, 3 fans. As soon as I started a game it exceeded 1300 watts (or whatever my PSU could give).
Quote:


> Originally Posted by *Blaise170*
> 
> R9 290X Quadfire: 350 x 4 = 1400W
> R9 290 Quadfire: 300 x 4 = 1200W
> 
> These are max wattages based on a stock reference card.


Yes, that might be why my system froze all the time with the fourth GPU in; it went fine with just three.


----------



## phallacy

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I am using an Asus RIVBE, 3930K @ 1.23v, 4x R9 290X stock, 16 GB RAM, 2x HDDs, 1x D5 pump, 3 fans. As soon as I started a game it exceeded 1300 watts (or whatever my PSU could give).
> Yes, that might be why my system froze all the time with the fourth GPU in; it went fine with just three.


I'm using a G2 1300W and a G2 1000W to power my whole system. It uses about 1500W after about an hour of any game. This is with the cards overvolted and OC'd though. I think stock would only pull around 250-275W per card.


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Tried the 14.6 RC2 (June 23) drivers. So far so good. Got a 400 point boost in GPU score, and the Combined score saw a nice increase.
> 
> http://www.3dmark.com/3dm/3490073


nice, Zeal.

Quote:


> Originally Posted by *koekwau5*
> 
> For work I've sent a whole bunch of graphics cards in for RMA without stickers. (Edit: some brands we used to sell: Asus, MSI, Gigabyte.)
> Sometimes they simply come loose from the heat, and I've seen a couple chopped up, hiding in fans as well.
> Never got rejected. If you've got the invoice matching the S/N registered in the card itself there is no problem.
> The sticker is just there to scare people away from opening / modifying the graphics card.
> Interesting.
> Experienced any x101 BSODs while watching YouTube videos, aka ATI's most known issue?
> Link: http://www.overclock.net/t/1478456/bluescreen-error-0xa0000001-with-amd-290x


I never experienced any BSODs with my 290. I have 14.6 Beta on one HDD and I just recently switched my other HDD from 13.11 to RC. Zero BSODs in videos. I can even enable Hardware Acceleration and not have issues. One thing, though: I don't use DDU or Driver Sweeper to clean old drivers, just the new driver.

Edit: if you are using Firefox, then I recommend reading this . . .

http://www.overclock.net/t/1265543/the-amd-how-to-thread


----------



## mojobear

Whoa, that is insane power draw lol. I'm running an AX1200 and a CX600M... I don't do as crazy of an OC... about +87mV @ 1160/1375, and usually pull around 1250-1300W from the wall. This is with 3 monitors attached to the same outlet. Can't believe we are living in the days of dual PSUs being more of a norm haha.
Quote:


> Originally Posted by *phallacy*
> 
> I'm using a G2 1300W and a G2 1000W to power my whole system. It uses about 1500W after about an hour of any game. This is with the cards overvolted and OC'd though. I think stock would only pull around 250-275W per card.


----------



## ZealotKi11er

Turned on the 290X BIOS:

290X - 1200/1500

Graphics Score: 13580
Physics Score: 12132
Combined Score: 5099

http://www.3dmark.com/3dm/3490264


----------



## Kittencake

I got my beautiful 290X in the mail... and now I have only one issue... how the hell do I stop my browser from going across all 3 screens when I full-screen it?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Kittencake*
> 
> I got my beautiful 290X in the mail... and now I have only one issue... how the hell do I stop my browser from going across all 3 screens when I full-screen it?


I never really use the full screen button. I just use Windows Snap. Have you tried that?


----------



## Kittencake

Never heard of Windows Snap.


----------



## Blaise170

Quote:


> Originally Posted by *Kittencake*
> 
> never heard of windows snap


It's the neat trick where you drag a window to the side or top of the monitor. You can also use WinKey + left/right/up arrow to do the same.


----------



## devilhead

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Turned on the 290X BIOS:
> 
> 290X - 1200/1500
> 
> Graphics Score: 13580
> Physics Score: 12132
> Combined Score: 5099
> 
> http://www.3dmark.com/3dm/3490264


Hehe, same physics score; my 2600K was at 4600MHz http://www.3dmark.com/fs/2389409


----------



## Kittencake

http://www.techpowerup.com/gpuz/zmqa2/ my validation


----------



## ZealotKi11er

Quote:


> Originally Posted by *devilhead*
> 
> Hehe, same physics score, my 2600k was at 4600mhz http://www.3dmark.com/fs/2389409


What's your card clocked at? Also, what's your RAM speed and timings?


----------



## devilhead

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What's your card clocked at? Also, what's your RAM speed and timings?


1360MHz on the core I think, and I don't remember what kind of RAM, but not the best... this is my mining rig.


----------



## ZealotKi11er

Quote:


> Originally Posted by *devilhead*
> 
> 1360MHz on the core I think, and I don't remember what kind of RAM, but not the best... this is my mining rig.


Is 1360MHz stable for gaming?


----------



## HoneyBadger84

Quote:


> Originally Posted by *Gobigorgohome*
> 
> It seems like quadfire R9 290X stock with fans at 100% draws more power than my EVGA G2 1300W can produce.


Quote:


> Originally Posted by *Gobigorgohome*
> 
> I am using Asus RIVBE, 3930K @ 1,23v, 4x R9 290X stock, 16 GB RAM, 2x HDD's, 1x D5 pump, 3 fans. As soon as I started a game it exceed 1300 watts (or whatever my PSU could give of power)
> Yes, that might be why my system did freeze all the time with the fourth GPU in, went fine with just three.


This makes no sense to me personally. Maybe my PSU is just a beast, but I've run 4x R9 290X on my system (a Corsair AX1200 that's ~3 years old) and it ran benchmarks and a few game-engine benchmarks (Metro 2033, Metro Last Light) just fine with no issues. Saw about 1450W draw from the wall, peak, which with this PSU's efficiency rating equates to about 1200-1230W actual system draw. You shouldn't be having these issues imo.

The cards don't draw that much more power at 100% fan vs. leaving them on the stock curve. I tested that as well; the draw was only an extra 30-40W with all 4 cards at 100% fan speed (2 of them had aftermarket coolers, one of which was a Gigabyte WindForce, so arguably it pulls the same or maybe even more power than a stock blower card at 100% fan speed).

And my CPU was running a higher voltage/clock than yours... Very odd. What's the efficiency rating on that PSU?


----------



## devilhead

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Is 1360MHz stable for gaming?


Didn't try it; I just play Battlefield 4 at HD 120fps, so I use stock settings.


----------



## armartins

Quote:


> Originally Posted by *Kittencake*
> 
> I got my beautiful 290x in the mail .. and now I have only one issue. .. how the hell do i stop my browser from going across all 3 screens when I full screen it ?


HydraGrid FTW!


----------



## King PWNinater

To the people who have water cooled their 290(X)s, what clock speeds were you able to achieve?


----------



## ZealotKi11er

Quote:


> Originally Posted by *King PWNinater*
> 
> To the people who have water cooled their 290(X)s, what clock speeds were you able to achieve?


So far 1200MHz /1500MHz. +100mV. Could go higher but never really tried.


----------



## Gualichu04

Add me, I never posted my two R9 290Xs.


----------



## Dasboogieman

Quote:


> Originally Posted by *Gobigorgohome*
> 
> It booted up just fine, I started CPU-Z, MSI Afterburner and HWMonitor, opened Steam and pushed "Play" on Hitman Absolution, then it all just froze (more than once). It is not the PCI-port or the card (running in the fourth port now with that exact card). Seems like I need another PSU.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I guess it will be another EVGA G2 1300W.
> I do not know exactly how much power is needed for quad R9 290X. I plan to buy another EVGA G2 1300W to use with my system, so I will have at least 2000 watts of power after that purchase (the only thing I need to make my build "complete"). It is just a little silly to sit here with four cards and not be able to run more than three; I will be able to use four in a week though.
> I am doing external rads, but I am not finished with my "radiator-barrel" yet.


Actually, I remember an OCN member here had stability issues with his G2 and the 290 GPU. It was because he used the combined 6+8 pin cables provided. I think the G2 1300W only comes with enough modular VGA sockets for 3 GPUs unless you use those combined cables. Might be an even better case for getting another PSU.


----------



## JeremyFenn

I'm not on water cooling, just using the stock Vapor-X cooler that Sapphire puts on these cards. Running +200 in Trixx (+50 power limit) and getting a 1200/1600 OC. I've switched to the 14.6 Beta (June 23rd) and noticed it does draw a little more power and causes a little more heat. Still getting around 70C (the max I see is 72C, but it usually stays at 70 or 71C).


----------



## HoneyBadger84

Quote:


> Originally Posted by *Dasboogieman*
> 
> Actually, I remember an OCN member here had stability issues with his G2 and the 290 GPU. It was because he used the combined 6+8 pin cables provided. I think the G2 1300W only comes with enough modular VGA sockets for 3 GPUs unless you use those combined cables. Might be an even better case for getting another PSU.


Sounds familiar. My PSU only has enough connections for 3 GPUs; I have to make the last 2 6-pins I need for quadfire out of 2 4-pin-to-single-6-pin adapters... which is why I don't run quadfire for more than benchmarks. I don't trust the adapters for prolonged use.

If I were him, I'd be gettin' a single, solid-quality PSU that's massive in terms of amperage & wattage. Silverstone ST1500, Corsair AX1500i, somethin' like that. Then once he gets the cards liquid cooled, maybe get a secondary PSU to power 1-2 of the cards so that he can OC & not run into power draw issues. That's what most folks do from what I gather... I don't see the point in getting a whole 'nother 1300W PSU myself, especially since, like I said, there's no reason that unit shouldn't be able to handle it at stock (if my PSU can), unless it's simply inadequate for the reasons you stated.


----------



## ZealotKi11er

Quote:


> Originally Posted by *JeremyFenn*
> 
> 
> 
> I'm not on water cooling, just using the stock Vapor-X that Sapphire puts on these cards. Running +200 in Trixx (+50 power limit) and getting 1200/1600 OC. I've switched to the 14.6 Beta (June 23rd) and notice it does draw a little more power and causes a little more heat. Still getting around 70c (max I see is 72c but it usually stays at 70 or 71c).


That's an impressive cooler. Have you tried lower than +200mV? I only need +100mV.


----------



## bond32

Quote:


> Originally Posted by *Dasboogieman*
> 
> Actually I remembered an OCN member here had stability issues with his G2 and the 290 GPU. It was because he used the combined 6+8 pin cables provided. I think the G2 1300w only comes with enough modular VGA slots for 3 GPUs unless you use those combined cables. Might be an even better case to get another PSU.


Yep, someone with electrical knowledge came on and said that cable is capable of around 7 A of juice. After you run some numbers, that comes out to a max of around 220-230 watts you could provide to the card(s), assuming you used the cable with 2 leads on the end. Knowing the 290/290X pulls the majority of its power from those connectors, it would be advisable to use only one lead end.
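As a rough sketch of that arithmetic (the per-conductor current is the ~7 A figure quoted above; the conductor count is an assumption, and derating for connector quality is what brings you down to the 220-230 W range):

```python
# Hypothetical numbers: ~7 A per conductor as quoted above; a PCI-E
# power cable carries the 12 V rail on roughly three conductors.
RAIL_VOLTS = 12.0
AMPS_PER_CONDUCTOR = 7.0
CONDUCTORS = 3

# W = V * A, summed over the 12 V wires shared by both plugs on the pigtail
max_watts = RAIL_VOLTS * AMPS_PER_CONDUCTOR * CONDUCTORS
print(max_watts)  # 252.0 W, before any derating for connector quality
```

Either way, the budget is shared across both plugs on a daisy-chained cable, which is why one card per lead end is the safer practice.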


----------



## Paul17041993

For those with experience with said blocks, which do you think is 'better'? Any pros and cons for each, and potential performance improvement via 3rd-party pads etc.?

- XSPC Razor 290X/290
- Koolance VID-AR290X
- EK EK-FC R9-290X

I prefer full-copper over nickel plating but if one block is significantly better at VRMs or such I'll go for it, doesn't look like an air-mod will be any cheaper than a basic loop for now...


----------



## JeremyFenn

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Thats a impressive cooler. Have you tried lower then +200mV? I only need +100mV.


Yes, I've used AB (Afterburner) since that's what I'm used to, and could only get about 1150/1450 stable. I was thinking of taking the cooler apart and replacing the TIM with some PK-1, but at those temps why bother, ya know.


----------



## Kittencake

I just noticed that the box on my 290x says 750 watt psu needed o.o ... how necessary is that ?


----------



## PCSarge

Quote:


> Originally Posted by *Kittencake*
> 
> I just noticed that the box on my 290x says 750 watt psu needed o.o ... how necessary is that ?


only necessary if oc'd i think.


----------



## PureBlackFire

Quote:


> Originally Posted by *Kittencake*
> 
> I just noticed that the box on my 290x says 750 watt psu needed o.o ... how necessary is that ?


not necessary at all. all you need is a 550 watt for a single 290X setup. the recommended wattage psu is for people who buy junk or peak rated power supplies that either can't sustain their rated wattage on the +12v rail or are outright lying about capacity on the label.


----------



## bond32

Quote:


> Originally Posted by *Paul17041993*
> 
> for those with the experience with said blocks, which do you think is 'better'? any pros and cons for each and potential performance improvement via 3rd party pads etc.
> 
> - XSPC Razor 290X/290
> - Koolance VID-AR290X
> - EK EK-FC R9-290X
> 
> I prefer full-copper over nickel plating but if one block is significantly better at VRMs or such I'll go for it, doesn't look like an air-mod will be any cheaper than a basic loop for now...


They all perform so close it's almost negligible... Maybe someone will find the link, but there is a good comparison of them out there. I believe the EK had the better VRM temps (like 3-5%), but the Koolance had a lower pressure drop. Of course, I'm a little biased as I have the Koolance. But that's simply because it was the only one available way back when this card was released.

Also, the Koolance is on sale for $109 on their website.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Kittencake*
> 
> I just noticed that the box on my 290x says 750 watt psu needed o.o ... how necessary is that ?


Using 3DMark I pulled 430W stock, 530W OC. Add another 100W if you use 200mV. Factor in efficiency and 550W is enough.
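To put those wall readings in PSU-rating terms, here's a minimal sketch (the 90% efficiency figure is an assumption for a Gold-class unit): a PSU's wattage rating describes its DC output, while a wall meter reads input, so the actual load on the supply is lower than the meter shows.

```python
def dc_load(wall_watts, efficiency):
    """Convert a wall-meter reading to the DC load the PSU actually delivers."""
    return wall_watts * efficiency

# Assumed ~90% efficient supply with the 530 W overclocked wall draw above
load = dc_load(530, 0.90)
print(round(load))  # 477 -> comfortably within a quality 550 W unit's rating
```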


----------



## steadly2004

delete......


----------



## Blaise170

Quote:


> Originally Posted by *Kittencake*
> 
> I just noticed that the box on my 290x says 750 watt psu needed o.o ... how necessary is that ?


Take a look at this post:
Quote:


> Originally Posted by *Blaise170*
> 
> Nope. Manufacturers way overshoot what is actually needed since a lot of people buy cheap, overrated power supplies. You can run a single 290 on a 500W PSU.
> 
> MSI Power Requirements sheet: http://forum-en.msi.com/faq/article/power-requirements-for-graphics-cards
> 
> According to MSI, a 290 consumes 31A. Using the electrical equation W = V · A, we get a total of 372W = +12V · 31A.
> 
> The max TDP is only actually 300W though, at least for reference cards running at stock. The only reason you'd need 1000W for Crossfire is if you are overvolting.


Now this is for a 290, but add about 50W for a 290X and you end up with no more than 550W needed.
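A quick sketch of that math, plus the extrapolation above (the ~50 W delta for a 290X and the rest-of-system allowance are this post's rough assumptions, not measured values):

```python
def rail_watts(volts, amps):
    # Electrical power: W = V * A
    return volts * amps

r9_290 = rail_watts(12, 31)   # MSI's 31 A figure on +12 V -> 372 W worst case
r9_290x = r9_290 + 50         # assumed ~50 W more for a 290X
print(r9_290, r9_290x)        # 372 422
# Even leaving ~100-130 W for CPU, drives and fans, a quality
# 550 W unit covers a single stock 290X.
```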


----------



## skruppe

Quote:


> Originally Posted by *PureBlackFire*
> 
> not necessary at all. all you need is a 550 watt for a single 290X setup. the recommended wattage psu is for people who buy junk or peak rated power supplies that either can't sustain their rated wattage on the +12v rail or are outright lying about capacity on the label.


A lot of people don't understand PSUs and settle with crap so they can spend more money on something like a R9 290X instead of a R9 290. If they're lucky the PSU lives for a while, worst case scenario PSU dies and takes a whole bunch of other stuff with it.


----------



## Draygonn

I can vouch for a 290x on a 520W PSU. My HX850 is getting RMA'd so I'm currently using an Antec Neo ECO 520. My i7 950 takes 1.42v to get to 4.2, the 290x runs 1100 on stock voltage. The PSU has had no problems keeping up during heavy sessions of TR and Metro LL.


----------



## PCSarge

gonna say it.

Added an EK CoolStream PE 120 rad to my loop today. Tomorrow or Wednesday my Aquacomputer block should be in... don't know if I can use the dye I ordered or not. Can a 10% antifreeze mix be combined with dye without bad effects? That is the golden question.


----------



## bluedevil

Currently I run my 290 and a i5 3470 on my 550w Capstone. No issues at all.


----------



## Kittencake

well my psu is a coolermaster 700w extreme ... great psu so far I have no complaints against it


----------



## PureBlackFire

Quote:


> Originally Posted by *skruppe*
> 
> A lot of people don't understand PSUs and settle with crap so they can spend more money on something like a R9 290X instead of a R9 290. If they're lucky the PSU lives for a while, worst case scenario PSU dies and takes a whole bunch of other stuff with it.


yea that's a bad practice. most of the time you have people who buy bad power supplies that actually provide way more power than their system requires. they think because it lasted two + years at less than 50% capacity it's good.
Quote:


> Originally Posted by *Kittencake*
> 
> well my psu is a coolermaster 700w extreme ... great psu so far I have no complaints against it


unfortunately this situation kind of fits what has just been said. the 290X may be the tipping point when we find out what the psu is really worth.


----------



## Mega Man

Quote:


> Originally Posted by *Kittencake*
> 
> I got my beautiful 290x in the mail .. and now I have only one issue. .. how the hell do i stop my browser from going across all 3 screens when I full screen it ?


hydravision
iirc
Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gobigorgohome*
> 
> It seems like quadfire R9 290X stock with fans at 100% draws more power than my EVGA G2 1300W can produce.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gobigorgohome*
> 
> I am using Asus RIVBE, 3930K @ 1,23v, 4x R9 290X stock, 16 GB RAM, 2x HDD's, 1x D5 pump, 3 fans. As soon as I started a game it exceed 1300 watts (or whatever my PSU could give of power)
> Yes, that might be why my system did freeze all the time with the fourth GPU in, went fine with just three.
> 
> Click to expand...
> 
> This makes no sense to me personally. Maybe my PSU is just a beast, but I've run 4x R9 290Xs on my system (AX 1200W from Corsair that's ~3 years old) and it ran benchmarks and a few game-engine benchmarks (Metro 2033, Metro Last Light) just fine with no issue. Saw about 1450W draw from the wall, peak, which with this PSU's insane efficiency rating equates to about 1200-1230W actual draw of the system. You shouldn't be having these issues imo.
> 
> The cards don't draw that much more power at 100% fan vs. leaving it on the stock curve. I think I tested that as well, draw was like an extra 30-40W with all 4 cards at 100% fan speed (2 of them were aftermarket coolers, one of which was a Gigabyte WindForce, so arguably it pulls the same or maybe even more power than stock blower card for 100% fan speed).
> 
> And my CPU was running a higher voltage/clock than yours... Very odd. What's the efficiency rating on that PSU?
Click to expand...

Why does everyone ask this? It is irrelevant: 1300W supplied is 1300W to the PC. Besides that, yeah, you really need to look; when OC'd they pull a lot more, not at stock... not at a different fan curve.

Not to mention the G2 is a Super Flower unit (Leadex), which is a very high-end, well-rated PSU!
Quote:


> Originally Posted by *PureBlackFire*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Kittencake*
> 
> I just noticed that the box on my 290x says 750 watt psu needed o.o ... how necessary is that ?
> 
> 
> 
> not necessary at all. all you need is a 550 watt for a single 290X setup. the recommended wattage psu is for people who buy junk or peak rated power supplies that either can't sustain their rated wattage on the +12v rail or are outright lying about capacity on the label.
Click to expand...

this


----------



## Blaise170

Quote:


> Originally Posted by *Kittencake*
> 
> well my psu is a coolermaster 700w extreme ... great psu so far I have no complaints against it


Unfortunately as mentioned, those units were not very good. Cooler Master's newer series such as the V series are very good, but the Extremes were a very weak Seventeam OEM platform.


----------



## Kittencake

Quote:


> Originally Posted by *Blaise170*
> 
> Unfortunately as mentioned, those units were not very good. Cooler Master's newer series such as the V series are very good, but the Extremes were a very weak Seventeam OEM platform.


ok well i plan on upgrading to a evga 850watter next month anyway so hopefully it can hold for another couple weeks


----------



## Blaise170

Quote:


> Originally Posted by *Kittencake*
> 
> ok well i plan on upgrading to a evga 850watter next month anyway so hopefully it can hold for another couple weeks


It won't blow up on you in any case.


----------



## Arizonian

Quote:


> Originally Posted by *Gualichu04*
> 
> Add me i never posted my two r9 290x
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









_Just pop a GPU-Z validation link in that post with your OCN name showing would be appreciated._


----------



## Kittencake

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _Just pop a GPU-Z validation link in that post with your OCN name showing would be appreciated._


you forgot the kitty i already posted my validation


----------



## Paul17041993

Quote:


> Originally Posted by *bond32*
> 
> They all perform so close it's almost negligible... Maybe someone will find the link, but there is a good comparison of them out there. Believe the EK had the better VRM temps (like 3-5%), *but the Koolance had a lower pressure drop.* Of course, i'm a little bias as I have the Koolance. But that's simply because it was the only one available way back when this card was released.
> 
> Also, the Koolance is on sale for $109 on their website.


Guess I'll go with the Koolance then, as I'm only going to be using a single DDC to start with and may undervolt it if it's too loud. VRM-temps-wise I may actually employ a custom backplate heatsink if deemed necessary; cooling both sides of the PCB can bring quite significant advantages.

What was the block that had the optional backplate heatpipe though? forgotten which brand it was now...


----------



## PCSarge

Quote:


> Originally Posted by *Paul17041993*
> 
> guess I'll go with the koolance then, as I'm only going to be using a single DDC to start with and may undervolt it if its too loud. VRM temps wise I may actually employ a custom backplate heatsink if deemed necessary, cooling both sides of the PCB can bring quite significant advantages.
> 
> What was the block that had the optional backplate heatpipe though? forgotten which brand it was now...


Believe that's Aquacomputer or EK. Either way the backplate is very hard to find.


----------



## fateswarm

Quote:


> Originally Posted by *bond32*
> 
> Yep, someone with electrical knowledge came on and said that cable is capable of around 7 A of juice. After you run some numbers, that comes out to a max of around 220-230 watts you could provide to the card(s) assuming you used the cable with 2 leads on the end. Knowing the 290/290x pulls the majority of it's power from those connectors, it would be advised to not use but one lead end.


They probably violate those standards though. I think I heard a regular Titan does recently.. Don't quote me on specifics.


----------



## fateswarm

Your PSU needs also depend on your CPU. E.g. some games may not draw more than 50 or 70W on an overclocked i7-4790K, but run it to its knees somehow and it may go up to around 200W even on air.

This may happen, or come close, in cases where you run video streaming on top of gaming or stuff of that sort.


----------



## Gobigorgohome

Quote:


> Originally Posted by *Dasboogieman*
> 
> Actually I remembered an OCN member here had stability issues with his G2 and the 290 GPU. It was because he used the combined 6+8 pin cables provided. I think the G2 1300w only comes with enough modular VGA slots for 3 GPUs unless you use those combined cables. Might be an even better case to get another PSU.


Yes, I used everything I got. 2x 6+2 pin + 6 pin and 6x 6+2 pin.
Quote:


> Originally Posted by *HoneyBadger84*
> 
> This makes no sense to me personally. Maybe my PSU is just a beast, but I've run 4x R9 290Xs on my system (AX 1200W from Corsair that's ~3 years old) and it ran benchmarks and a few game-engine benchmarks (Metro 2033, Metro Last Light) just fine with no issue. Saw about 1450W draw from the wall, peak, which with this PSU's insane efficiency rating equates to about 1200-1230W actual draw of the system. You shouldn't be having these issues imo.
> 
> The cards don't draw that much more power at 100% fan vs. leaving it on the stock curve. I think I tested that as well, draw was like an extra 30-40W with all 4 cards at 100% fan speed (2 of them were aftermarket coolers, one of which was a Gigabyte WindForce, so arguably it pulls the same or maybe even more power than stock blower card for 100% fan speed).
> 
> And my CPU was running a higher voltage/clock than yours... Very odd. What's the efficiency rating on that PSU?


The PSU is Gold certified so it should not be a problem; will try another cable and see if that works out better.
Quote:


> Originally Posted by *Mega Man*
> 
> why does everyone ask this, it is irreverent 1300w supplied, is 1300w, to the pc. besides that yea you really need to look, when OCED they pull alot more, not at stock.... not at a different fan curve
> 
> not to mention the g2 is a superflower unit ( leadex ) which is a very high end well rated PSU !!
> this


I do not know about all the specifications, but it has been working fine with three cards for days now; it just seems a little weird that it started to freeze with four cards (may be a quadfire problem with that particular game though). I will do some more testing and come back to you guys. I will get another G2 1300W anyway, just to be on the safe side when they are overclocked. I will also overclock the CPU, so I guess the power consumption will blow past 1300 watts.


----------



## Paul17041993

well I can say this with a stock cooled 290X (high fan so no throttle), FX-8150 and many drives, a 1KW platinum PSU will never warm up...

Though that *is* ~90% (or higher) PSU efficiency; unrated, generic or otherwise very cheap-and-nasty PSUs will eat half the power they supply to your hardware, if they don't explode. It's not so much the power rating, but the rail current, regulation and overall efficiency that are the main concerns. A PSU may be rated for 750W but that doesn't mean it will survive the power draw of a 250W card, where an expensive 400W PSU may be able to do it fine while running close to over-current (though with no OC or efficiency headroom, and things like FurMark might trip the protection).


----------



## Dasboogieman

Quote:


> Originally Posted by *Paul17041993*
> 
> well I can say this with a stock cooled 290X (high fan so no throttle), FX-8150 and many drives, a 1KW platinum PSU will never warm up...
> 
> though that *is* ~90% (or higher) PSU efficiency, unrated, generic or otherwise very cheap-and-nasty PSUs will eat half the power they supply to your hardware if they don't explode. Its not so much a power rating, but the rail current, regulation and overall efficiency that are the main concerns, a PSU may be rated for 750W but that doesn't mean it will survive the power draw from a 250W card. where an expensive 400W PSU may be able to do it fine while running close to over-current (though no OC or efficiency headroom, and things like furmark might tip the protection).


Platinum PSUs are fantastic for that reason. Reduced heat output/noise is one of the lesser-talked-about benefits of high-efficiency PSUs. It's basically the reason why the AX1500i can deliver 1500W while still being cooler and quieter than PSUs of half its capacity.
It's always a tough decision when it comes to purchasing a PSU from a noise/heat-output perspective:
1. Is it better to get Platinum or better efficiency and sacrifice wattage?
2. Is it better to greatly overspec on wattage and compensate for lesser efficiency with sheer headroom?

My old Silverstone 1000W silent pro would whine like a reference 290X with 80% fan speed if it was placed in your system simply due to its poorer efficiency.


----------



## Gobigorgohome

Aaaahhh, the problem seems to be solved; I guess I have a bad PCI-E 8-pin cable.

Plugged in the four cards with all the original cables in the G2 1300W and it completed 3D Mark Fire Strike Extreme (above 14000) and Sky Diver (above 37000). I think the CPU bottlenecked a bit, it seemed like that. Will test some overclocking with that too, so I may be able to squeeze out a couple more frames from this system.

I read somewhere that the cards would not do more than 1075 Mhz with 4K screens, anyone going on a higher clock while on 4K?

I will feel safer with 2x G2 1300W PSUs though. I guess I will try a couple of games now.


----------



## bond32

Quote:


> Originally Posted by *Paul17041993*
> 
> guess I'll go with the koolance then, as I'm only going to be using a single DDC to start with and may undervolt it if its too loud. VRM temps wise I may actually employ a custom backplate heatsink if deemed necessary, cooling both sides of the PCB can bring quite significant advantages.
> 
> What was the block that had the optional backplate heatpipe though? forgotten which brand it was now...


The highest VRM temps I have had were around 70 C. This was with the PT1 bios and around 1.46 volts... You should be fine lol.


----------



## Dasboogieman

Quote:


> Originally Posted by *bond32*
> 
> The highest VRM temps I have had were around 70 C. This was with the PT1 bios and around 1.46 volts... You should be fine lol.


I remember now, the block with the active heatpipe backplate is the Aquacomputer kryographics. Those are really, really rare.
Granted, you can achieve a similar effect with an XSPC or EK backplate if you glue some stick-on VRAM heatsinks in the VRM area.


----------



## miraldo

Hey guys.

Here is my Sapphire R9 290 TRI-X + NZXT case and few benchmark results:

















What's your opinion? I know that I should hide all the cables better, but I really don't know where to hide them.

Are the bench results good for my rig? Currently using Catalyst 14.6 RC2 drivers.

PC spec:

Gigabyte Z77M-D3H
Intel i5 2500K 3.3GHZ
Sapphire R9 290 TRI-X 4GB UEFI
SSD Crucial M500 120GB
HDD Samsung Spinpoint F3 1TB /
8Gb Crucial Ballistix Tactical 1866mhz Ram
Corsair 600 CX V2
NZXT Hades

PSU has 45A on 12V and so far no problems (not loud or hot); the system runs stable.


----------



## Mercy4You

After watching YouTube my R9 290X doesn't downclock anymore. It needs a restart for PowerTune to work again. Anyone else got this problem, and is there a solution?


----------



## heroxoot

I'm seeing my 290X constantly have a slight load all the time and I don't know why. It's always a load around 10% and it keeps the core clocking up and down, never idling to 300. Not sure why. It constantly keeps the core doing 325 - 350 with nothing running sitting at the desktop. I even tried reinstalling 14.6 RC2. Anyone else having weird issues on this driver?


----------



## Dasboogieman

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *miraldo*
> 
> Hey guys.
> 
> Here is my Sapphire R9 290 TRI-X + NZXT case and few benchmark results:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What your opinion? I know that I should better hide all the cables but I realy dont know where to hide them..
> 
> Are bench result good for my RIG? Currently using Catalyst 14.6 RC2 drivers.
> 
> PC spec:
> 
> Gigabyte Z77M-D3H
> Intel i5 2500K 3.3GHZ
> Sapphire R9 290 TRI-X 4GB UEFI
> SSD Crucial M500 120GB
> HDD Samsung Spinpoint F3 1TB /
> 8Gb Crucial Ballistix Tactical 1866mhz Ram
> Corsair 600 CX V2
> NZXT Hades
> 
> PSU has 45A on 12V and for now no problems (not laud or hot) system run stable.






Looks OK to me, power shouldn't be an issue, even if you overclock your CPU and GPU. The PSU may get a bit louder but that should be it.
That cabling looks pretty neat already, don't really see how else you can improve it.
Quote:


> Originally Posted by *heroxoot*
> 
> I'm seeing my 290X constantly have a slight load all the time and I don't know why. It's always a load around 10% and it keeps the core clocking up and down, never idling to 300. Not sure why. It constantly keeps the core doing 325 - 350 with nothing running sitting at the desktop. I even tried reinstalling 14.6 RC2. Anyone else having weird issues on this driver?


Are you sure there isn't some rogue background app that's polling the GPU?


----------



## heroxoot

Quote:


> Originally Posted by *Dasboogieman*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *miraldo*
> 
> Hey guys.
> 
> Here is my Sapphire R9 290 TRI-X + NZXT case and few benchmark results:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What your opinion? I know that I should better hide all the cables but I realy dont know where to hide them..
> 
> Are bench result good for my RIG? Currently using Catalyst 14.6 RC2 drivers.
> 
> PC spec:
> 
> Gigabyte Z77M-D3H
> Intel i5 2500K 3.3GHZ
> Sapphire R9 290 TRI-X 4GB UEFI
> SSD Crucial M500 120GB
> HDD Samsung Spinpoint F3 1TB /
> 8Gb Crucial Ballistix Tactical 1866mhz Ram
> Corsair 600 CX V2
> NZXT Hades
> 
> PSU has 45A on 12V and for now no problems (not laud or hot) system run stable.
> 
> 
> 
> 
> 
> 
> 
> Looks OK to me, power shouldn't be an issue, even if you overclock your CPU and GPU. The PSU may get a bit louder but that should be it.
> That cabling looks pretty neat already, don't really see how else you can improve it.
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> I'm seeing my 290X constantly have a slight load all the time and I don't know why. It's always a load around 10% and it keeps the core clocking up and down, never idling to 300. Not sure why. It constantly keeps the core doing 325 - 350 with nothing running sitting at the desktop. I even tried reinstalling 14.6 RC2. Anyone else having weird issues on this driver?
> 
> Click to expand...
> 
> Are you sure there isn't some rogue background app that's polling the GPU?
Click to expand...

I closed everything and it still does it. Not sure what else there is. I did however have hardware polling set to 1000 in MSI AB. Setting it to 500 for whatever reason shows its going down to 300 again. Not sure why that caused it but the usage is now 0ing out.


----------



## rdr09

Quote:


> Originally Posted by *miraldo*
> 
> Hey guys.
> 
> Here is my Sapphire R9 290 TRI-X + NZXT case and few benchmark results:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What your opinion? I know that I should better hide all the cables but I realy dont know where to hide them..
> 
> Are bench result good for my RIG? Currently using Catalyst 14.6 RC2 drivers.
> 
> PC spec:
> 
> Gigabyte Z77M-D3H
> Intel i5 2500K 3.3GHZ
> Sapphire R9 290 TRI-X 4GB UEFI
> SSD Crucial M500 120GB
> HDD Samsung Spinpoint F3 1TB /
> 8Gb Crucial Ballistix Tactical 1866mhz Ram
> Corsair 600 CX V2
> NZXT Hades
> 
> PSU has 45A on 12V and for now no problems (not laud or hot) system run stable.


Like Das said, it looks good. I recommend getting a better CPU cooler, like a Cooler Master 212, and OC'ing the CPU a bit.

Quote:


> Originally Posted by *Mercy4You*
> 
> After watching Youtube my R9 290X doesn't downclock anymore. Needs a restart for Powertune to work again. Anyone else got this problem, and is there a solution?


290 here using 14.6 RC downclocks fine at stock and even when oc'ed (with Trixx). it downclocks even when watching the video in YT.



@heroxoot, I recommend going to msconfig and checking what programs you have running at startup, as well as processes running in the background. You might have Raptr running or something.


----------



## Mega Man

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Paul17041993*
> 
> well I can say this with a stock cooled 290X (high fan so no throttle), FX-8150 and many drives, a 1KW platinum PSU will never warm up...
> 
> though that *is* ~90% (or higher) PSU efficiency, unrated, generic or otherwise very cheap-and-nasty PSUs will eat half the power they supply to your hardware if they don't explode. Its not so much a power rating, but the rail current, regulation and overall efficiency that are the main concerns, a PSU may be rated for 750W but that doesn't mean it will survive the power draw from a 250W card. where an expensive 400W PSU may be able to do it fine while running close to over-current (though no OC or efficiency headroom, and things like furmark might tip the protection).
> 
> 
> 
> Platinum PSUs are fantastic for that reason. Reduced heat output/noise is one of the lesser talked about benefits of high efficiency PSUs. Basically the reason why the AX1500i can deliver 1500W while still being cooler and quieter than PSUs of half it's capacity.
> Its always that tough decision when it comes to purchasing a PSU from a noise/heat output perspective.
> 1. Is it better to get a platinum or better efficiency and sacrifice wattage.
> 2. Is it better to greatly overspecc on wattage and compensate for lesser efficiency with the sheer headroom.
> 
> My old Silverstone 1000W silent pro would whine like a reference 290X with 80% fan speed if it was placed in your system simply due to its poorer efficiency.
Click to expand...

As to number one... why does everyone think a non-80+ 1000W delivers less than an 80+ 1000W? For the love of god (assuming they are QUALITY products), they are both rated to deliver 1000W. Efficiency has NOTHING to do with supplied wattage, ONLY with the WATTAGE _*TAKEN FROM THE WALL*_.

As for two:

Also completely incorrect. Sheer wattage has what to do with inefficiency?!

You guys need to read this AND the further reading!
Quote:


> Originally Posted by *Gobigorgohome*
> 
> Aaaahhh, problems seemed to be solved, I guess I have a bad PCI-E 8-pin cable.
> 
> Plugged in the four cards with all the original cables in the G2 1300W and it completed 3D Mark Fire Strike Extreme (above 14000) and Sky Diver (above 37000). I think the CPU bottlenecked a bit, it seemed like that. Will test some overclocking with that too, so I may be able to squeeze out a couple more frames from this system.
> 
> I read somewhere that the cards would not do more than 1075 Mhz with 4K screens, anyone going on a higher clock while on 4K?
> 
> I will feel more safe with 2x G2 1300W PSU's though, I guess I will try a couple of games now.


glad you got it sorted out !


----------



## Paul17041993

Quote:


> Originally Posted by *Dasboogieman*
> 
> Platinum PSUs are fantastic for that reason. Reduced heat output/noise is one of the lesser talked about benefits of high efficiency PSUs. Basically the reason why the AX1500i can deliver 1500W while still being cooler and quieter than PSUs of half it's capacity.
> Its always that tough decision when it comes to purchasing a PSU from a noise/heat output perspective.
> 1. Is it better to get a platinum or better efficiency and sacrifice wattage.
> 2. Is it better to greatly overspecc on wattage and compensate for lesser efficiency with the sheer headroom.
> 
> My old Silverstone 1000W silent pro would whine like a reference 290X with 80% fan speed if it was placed in your system simply due to its poorer efficiency.


I'd always go for option #1 in that sense, unless I was limited by budget or a certain environment...

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Aaaahhh, problems seemed to be solved, I guess I have a bad PCI-E 8-pin cable.
> 
> Plugged in the four cards with all the original cables in the G2 1300W and it completed 3D Mark Fire Strike Extreme (above 14000) and Sky Diver (above 37000). I think the CPU bottlenecked a bit, it seemed like that. Will test some overclocking with that too, so I may be able to squeeze out a couple more frames from this system.
> 
> I read somewhere that the cards would not do more than 1075 Mhz with 4K screens, anyone going on a higher clock while on 4K?
> 
> I will feel more safe with 2x G2 1300W PSU's though, I guess I will try a couple of games now.


Quite possible that one end of the cable has soft pins and isn't making decent contact with the pins on the card or PSU; I would give the cable a good inspection to make sure none of the pins have slipped out of place. Or it's also possible that the one socket (or rail, if multi-rail) on your PSU that you were using has poor output and simply wasn't supplying the power adequately. My Seasonic 1KW has a similar quirk in that there's one extra 12V socket that they prefer you to use last (if at all), as it's connected via internal wiring rather than direct PCB circuitry, which [slightly] reduces the current and quality it can supply compared to the rest of the sockets.


----------



## Gobigorgohome

Quote:


> Originally Posted by *Mega Man*
> 
> glad you got it sorted out !


Yes, me too.


----------



## heroxoot

@rdr09 not sure why but setting hardware polling to 500ms instead of 1000ms solved it. I thought setting a higher poll number for reduced checking would be better but it seems not. Not sure why but it's fine now.


----------



## rdr09

Quote:


> Originally Posted by *heroxoot*
> 
> @rdr09 not sure why but setting hardware polling to 500ms instead of 1000ms solved it. I thought setting a higher poll number for reduced checking would be better but it seems not. Not sure why but it's fine now.


Wonder if it's the same with Mercy? So, you are using AB? I tried opening AB and it says expired. I don't use it anyway, so I'll leave it. Glad you got it sorted.


----------



## heroxoot

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> @rdr09 not sure why but setting hardware polling to 500ms instead of 1000ms solved it. I thought setting a higher poll number for reduced checking would be better but it seems not. Not sure why but it's fine now.
> 
> 
> 
> Wonder if it's the same with Mercy? So, you are using AB? I tried opening AB and it says expired. I don't use it anyway, so I'll leave it. Glad you got it sorted.
Click to expand...

They don't release beta to the public anymore due to complaints of expiration. I however am not using a public release so I cannot talk about anything like that. Just know turning down polling did help my issue 100%.


----------



## Dasboogieman

Quote:


> Originally Posted by *Mega Man*
> 
> as to number one.... why does everyone think a non 80+ 1000w delivers less then a 80+ 1000w, for the love of god ( assuming they are QUALITY products ) they are both rated to deliver 1000w, effieciancy has NOTHING to do with supply voltage, ONLY from the WATTAGE _*TAKEN FROM THE WALL*_
> 
> as for two.
> 
> also completely incorrect sheer wattage has what to do with inefficiency?? !
> 
> you guys need to read this AND the further reading !


Well as to point 1.
I'm assuming that if the PSU is more efficient, less input wattage is wasted as heat. Therefore, the PSU can run cooler thus needing a slower fan to keep it from burning up.

As to point 2.
Say for some reason the ideal Platinum power supply isn't available. Yet you desire silence and low heat output. Isn't it better to overspec your PSU requirements, as a large capacity PSU won't generate as much heat at partial % draw of its capacity than a smaller PSU of similar efficiency with a higher % draw of its capacity?

I was more refering to heat and noise profiles
Just because multiple PSUs are rated for a given wattage, doesn't mean they all deliver the full capacity or parts thereof at the same noise and heat output profiles.


----------



## fateswarm

If your bills aren't a problem, it's better to look into ripple stats for better hardware operation. The good thing is that there's a slight correlation between the two: chances are a maker that achieved good efficiency also achieved low ripple (though it's not certain; e.g. my G2 is Gold, but it has the ripple of some Platinums).


----------



## Dasboogieman

Quote:


> Originally Posted by *fateswarm*
> 
> If your bills aren't a problem, it's better to look into ripple stats for better hardware operation. The good thing is that there's a slight correlation between the two: chances are a maker that achieved good efficiency also achieved low ripple (though it's not certain; e.g. my G2 is Gold, but it has the ripple of some Platinums).


Oh I absolutely love the ripple performance on my new 1300W G2. Still wonder sometimes if I should've gotten the Plat Seasonic 1200W which was only $15 more but is definitely more silent.


----------



## Blaise170

Quote:


> Originally Posted by *Dasboogieman*
> 
> Well as to point 1.
> I'm assuming that if the PSU is more efficient, less input wattage is wasted as heat. Therefore, the PSU can run cooler thus needing a slower fan to keep it from burning up.
> 
> As to point 2.
> Say for some reason the ideal Platinum power supply isn't available. Yet you desire silence and low heat output. Isn't it better to overspec your PSU requirements, as a large capacity PSU won't generate as much heat at partial % draw of its capacity than a smaller PSU of similar efficiency with a higher % draw of its capacity?
> 
> I was more refering to heat and noise profiles
> Just because multiple PSUs are rated for a given wattage, doesn't mean they all deliver the full capacity or parts thereof at the same noise and heat output profiles.


Point 1: Somewhat, but a 1300W platinum and a 1300W bronze will still both get very hot at 1280W of draw.

Point 2: The only reason to overspec is if you will later upgrade and need it. Your PC will only rarely hit its maximum power consumption. For most people, the PC will be idling for most of the day, and when it isn't it usually isn't being benchmarked. So while an R9 290 and i7 will *need* about 550W, you will generally only be using about 200W unless gaming/rendering/benchmarking/etc.

Also, efficiency is a bad metric to use when purchasing a power supply. Many less-reputable manufacturers (I'm looking at you, CoolMax) will do a bunch of things to get 80+ certified under unrealistic conditions, such as running the PSU at 20C instead of 40C, etc.
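To put point 1 in numbers: waste heat inside the PSU is simply wall draw minus DC output, so higher efficiency directly means less heat for the fan to remove. A quick sketch (the efficiency figures are generic 80 PLUS ballparks around 50% load, not measured values for any particular unit):

```python
# Rough waste heat vs. 80 PLUS tier at a fixed DC load.
# Efficiencies below are assumed ballpark figures, not measurements.
RATINGS = {"Bronze": 0.85, "Gold": 0.90, "Platinum": 0.92}

def wall_draw_and_heat(dc_load_w, efficiency):
    """Return (watts drawn from the wall, watts dissipated as heat in the PSU)."""
    wall = dc_load_w / efficiency
    return wall, wall - dc_load_w

for name, eff in RATINGS.items():
    wall, heat = wall_draw_and_heat(550, eff)
    print(f"{name}: {wall:.0f} W from the wall, {heat:.0f} W of waste heat")
```

At a 550W load the Bronze unit has to dump roughly 35W more heat than the Platinum one, which is where the fan-noise difference comes from.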


----------



## tsm106

Quote:


> Originally Posted by *Blaise170*
> 
> Also, efficiency is a bad metric to use when purchasing a power supply. Many less-reputable manufacturers (I'm looking at you, CoolMax) will do a bunch of things to get 80+ certified under unrealistic conditions, such as running the PSU at 20C instead of 40C, etc.


Efficiency is pretty damn important at kilowatt and higher wattages. An easy way of getting more usable wattage from your PSU is through higher and higher efficiency. This is vitally important because as you go up there are fewer and fewer high-wattage units to choose from, and there is a ceiling to how much wattage you can buy in a single unit. Also, if you're using CoolMax in your argument, you've framed your argument badly.


----------



## Blaise170

Quote:


> Originally Posted by *tsm106*
> 
> Efficiency is pretty damn important at kilowatt and higher wattages. An easy way of getting more usable wattage from your PSU is through higher and higher efficiency. This is vitally important because as you go up there are fewer and fewer high-wattage units to choose from, and there is a ceiling to how much wattage you can buy in a single unit. Also, if you're using CoolMax in your argument, you've framed your argument badly.


Perhaps I wasn't clear: I'm saying you shouldn't buy a Platinum PSU just because it's Platinum when there are units just as capable, rated only Gold or Silver, at half the price. But yes, efficiency is good for saving electricity.


----------



## skruppe

Quote:


> Originally Posted by *miraldo*
> 
> Hey guys.
> 
> Here is my Sapphire R9 290 TRI-X + NZXT case and a few benchmark results:


Why do you use the second PCIe slot? Isn't that one only x8?


----------



## ZealotKi11er

My Block just came. Now going back to more Dota 2.


----------



## rv8000

Quote:


> Originally Posted by *skruppe*
> 
> Why do you use the second PCIe slot? Isn't that one only x8?


Even if it's pci-e 3.0 @ 8x, the card won't saturate the bus, there will be almost 0 difference running it in the other slot if the card is there for placement reasons.


----------



## skruppe

Quote:


> Originally Posted by *rv8000*
> 
> Even if it's pci-e 3.0 @ 8x, the card won't saturate the bus, there will be almost 0 difference running it in the other slot if the card is there for placement reasons.


Even if the card won't saturate the bus there's latency. So your nonsense about the difference being small is irrelevant as you can't answer why he would prefer it that way.


----------



## mojobear

Hey Arizonian,

Could you please update me? You might have not seen my prev post.

Thanks!

Sapphire x3, Powercolor x1 - OC to 1150/1375 + 63 mV
EK waterblocks - EK420 + RX360 + EX480


----------



## Silent Scone

Is that an Elysium? Great case


----------



## Hl86

Suddenly CF stopped working in dota 2. 14,6 drivers. Any ideas? I tried reinstalling driver.


----------



## mojobear

Yes sir it is! Got it before the 900D came out, which I would have gotten instead, but it's a very roomy case... fits an external 480 out the back without it jutting over the top of the case, so it looks pretty decent actually!
Quote:


> Originally Posted by *Silent Scone*
> 
> Is that an Elysium? Great case


----------



## twerk

Cleaned of argumentative posts. Try and keep it friendly please.


----------



## Silent Scone

Quote:


> Originally Posted by *mojobear*
> 
> Yes sir it is! Got it before the 900D came out, which I would have gotten instead, but it's a very roomy case... fits an external 480 out the back without it jutting over the top of the case, so it looks pretty decent actually!


You're not missing much on the 900D front. The build quality isn't much better for what you'd expect from Corsair. Elysium is a fav of mine


----------



## ZealotKi11er

Quote:


> Originally Posted by *Hl86*
> 
> Suddenly CF stopped working in dota 2. 14,6 drivers. Any ideas? I tried reinstalling driver.


1) Why the hell do you need CF in Dota 2
2) Check if the game is running in Fullscreen Windowed mode.


----------



## Hl86

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 1) Why the hell do you need CF in Dota 2
> 2) Check if the game is running in Fullscreen Windowed mode.


1) For downsampling
2) I tried toggling fullscreen/window mode.


----------



## rv8000

Quote:


> Originally Posted by *rv8000*
> 
> Even if it's pci-e 3.0 @ 8x, the card won't saturate the bus, there will be almost 0 difference running it in the other slot if the card *is there for placement reasons*.


Quote:


> Originally Posted by *skruppe*
> 
> Even if the card won't saturate the bus there's latency. So your nonsense about the difference being small is irrelevant as you can't answer why he would prefer it that way.


"placement reasons", i.e. restricted air flow, cpu cooler being in conflict, other cards blocking the gpu, anything really...

Why people are so negative when others are just trying to help is a mystery to me


----------



## Sjp770

I've got 2x 290x and before setting up my system I ran them in pcie3.0 16x-16x and 16x-8x. I didn't see any noticeable difference. The reason I'm using 16x-8x is for placement reasons. The dual EK bridge really decided for me.


----------



## the9quad

Quote:


> Originally Posted by *skruppe*
> 
> Even if the card won't saturate the bus there's latency. So your nonsense about the difference being small is irrelevant as you can't answer why he would prefer it that way.


Geez this kind of crap is why I'm done with this thread.


----------



## mojobear

Can you prove that there is actually more latency using the lower PCIe slot? He doesn't have a PLX; it's divided up x8 and x8 evenly.
What's up with all this latency anyway... is it meaningful? Have you correlated it with anything meaningful that you can prove?








Quote:


> Originally Posted by *skruppe*
> 
> Even if the card won't saturate the bus there's latency. So your nonsense about the difference being small is irrelevant as you can't answer why he would prefer it that way.


----------



## HoneyBadger84

Quote:


> Originally Posted by *rv8000*
> 
> "placement reasons", i.e. restricted air flow, cpu cooler being in conflict, other cards blocking the gpu, anything really...
> 
> Why people are so negative when others are just trying to help is a mystery to me


Indeed, I don't get it either...

The difference between x8 and x16 in my experience, which has been plenty recently, running through 2-3-4 card configs on my motherboard, is minute, really only noticeable in benchmark scores. You'd need to be running a massive resolution like 4K to see a meaningful difference in performance between an x8 and x16 slot, and by then one card is going to be screaming in agony anyway, right? lol


----------



## King PWNinater

During Overclocking, when I hit a voltage wall, how much should I increase the voltage? Also, when do I know when to increase the power limit on my 290 when overclocking? And by how much?


----------



## tsm106

Quote:


> Originally Posted by *mojobear*
> 
> Can you prove that there is actually more latency using the lower PCIE?


Technically there shouldn't be any latency difference between 2.0 and 3.0. The only time latency becomes a factor is if one has to go through a multiplexing chip, i.e. PLX, NF200, etc.

Bandwidth-wise, the Hawaii cards have the "theoretical potential", according to AMD, to surpass PCIe 3.0, but in the wild, as honeybadger alluded to, it's a very different thing. I've found at best single-digit % differences between 2.0 and 3.0.


----------



## HoneyBadger84

Quote:


> Originally Posted by *King PWNinater*
> 
> During Overclocking, when I hit a voltage wall, how much should I increase the voltage? Also, when do I know when to increase the power limit on my 290 when overclocking? And by how much?


For safety & efficiency, you should just put the Power Limit to +50% and leave it there when overclocking. Keep in mind, increasing the power limit just increases how much the card can draw IF it needs it; it's not increasing the card's draw outright or anything. For voltage I suspect a 10mV bump to start with would probably be enough? Not too familiar with that stuff yet, as I haven't forayed into OCing much because I'm still intending on reselling most of the cards I get in, so I haven't pushed them very hard.


----------



## King PWNinater

Why do people overclock VGA memory?


----------



## anubis1127

Quote:


> Originally Posted by *King PWNinater*
> 
> Why do people overclock VGA memory?


Better performance.


----------



## King PWNinater

Quote:


> Originally Posted by *anubis1127*
> 
> Better performance.


What will give you better performance in games, overclocking GPU speed or memory? Also, can you overclock the GPU without overclocking the memory?


----------



## HoneyBadger84

Quote:


> Originally Posted by *King PWNinater*
> 
> What will give you better performance in games, overclocking GPU speed or memory? Also, can you overclock the GPU without overclocking the memory?


Especially with R9 290/290X cards, because of the wide-as-f*** memory bus on them, it's better to OC the GPU core as much as you can rather than pushing the memory too hard, as it's already so fast it's crazy. Granted, if you can push an extra 200-300MHz easily enough on the vRAM, that's an extra 800-1200MHz effective speed you're adding, and the difference can be noticeable. But getting 100-150MHz on the core will give you a very noticeable boost, because that's effectively a 10-15% OC over stock for these GPUs, give or take a bit, and it equates to (around) a 5-7% performance gain in a lot of instances.
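The 10-15% figure is easy to sanity-check against the stock clocks (947 MHz for the R9 290 and 1000 MHz boost for the 290X; assumed reference values here):

```python
# Core OC expressed as a percentage of assumed reference clocks:
# 947 MHz (R9 290) and 1000 MHz (R9 290X boost).
def oc_percent(bump_mhz, stock_mhz):
    """A clock bump as a percentage of the stock clock."""
    return 100.0 * bump_mhz / stock_mhz

for stock in (947, 1000):
    for bump in (100, 150):
        print(f"+{bump} MHz on {stock} MHz stock: {oc_percent(bump, stock):.1f}% OC")
```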


----------



## hwoverclkd

Quote:


> Originally Posted by *King PWNinater*
> 
> What will give you better performance in games, overclocking GPU speed or memory? Also, can you overclock the GPU without overclocking the memory?


Overclocking the GPU generally gives better performance than memory, clock for clock. Sure, you can overclock either of the two, or both. Finding the 'sweet spot' is the challenge.


----------



## kahboom

Isn't PCI-e 3.0 8x equal to PCI-e 2.0 16x?


----------



## Paul17041993

Quote:


> Originally Posted by *heroxoot*
> 
> @rdr09 not sure why but setting hardware polling to 500ms instead of 1000ms solved it. I thought setting a higher poll number for reduced checking would be better but it seems not. Not sure why but it's fine now.


CPU load, input event thread overburdened, sync misses, USB hub and controller load, driver/firmware bug, mousepad (if on a mouse, of course).

A 1000 Hz rate means it checks the sensor surface and button states a thousand times a second, meaning greater movement accuracy and lower delay; however, it of course puts higher stress on the hardware and can cause all sorts of problems. I usually just use 500.

edit note; for Corsair mice it's displayed as the actual period, not the rate, so 1ms == 1000Hz, 2ms == 500Hz, 4ms == 250Hz and 8ms == 125Hz, which is the normal default on basic mice and keyboards.
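To make the period-vs-rate distinction concrete: Afterburner's hardware polling setting is a period (milliseconds between checks), while mouse polling is usually quoted as a rate (checks per second), and the conversion between them is just a reciprocal:

```python
# Convert between a polling period in milliseconds and a polling rate in Hz.
# A 1000 ms period is one check per second; a 1 ms period is 1000 checks/s.
def period_ms_to_hz(period_ms):
    """Polling period in milliseconds -> polling rate in Hz."""
    return 1000.0 / period_ms

def hz_to_period_ms(rate_hz):
    """Polling rate in Hz -> polling period in milliseconds."""
    return 1000.0 / rate_hz

print(period_ms_to_hz(1))   # 1000.0 -> 1 ms period is the typical gaming-mouse rate
print(period_ms_to_hz(8))   # 125.0  -> 8 ms period is the basic USB HID default
```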


----------



## Dasboogieman

Quote:


> Originally Posted by *kahboom*
> 
> Isn't PCI-e 3.0 8x equal to PCI-e 2.0 16x?


Yeah, it is.

There is supposedly a big difference in peak FPS at higher resolutions between PCIe 2.0 x16 and PCIe 2.0 x8, which is why I upgraded to a 3570K. PCIe 3.0 x8 should be fine until the next hardware cycle.


----------



## PCSarge

Well... I've bought the copper version of the very sexy Aquacomputer block, along with a replacement rad. It seems my Swiftech rad currently isn't quite doing its job with just the CPU in the loop; the EK CoolStream 120 is dumping the brunt of the heat, and it's the last item in the loop before the res atm.

So I've ordered a CoolStream PE 240 to match the 120.

Also the block, some nice blue dye so the block looks sexy, and some clear Feser tube so I can see the coolant.

I should have it all in by Thursday... I will then go completely bonkers, drain my loop, rip my rig apart, and then... some assembly required







not for novices.

And of course, as always, props to Dazmode for having what I need when I want it.


----------



## mojobear

Ya, that's exactly my point, which is why I think Skruppe or whatever his name is was just being plain rude + spewing out ill-informed info... two strikes, one post haha
Quote:


> Originally Posted by *tsm106*
> 
> Technically there shouldn't be any latency difference between 2.0 and 3.0. The only time latency becomes a factor is if one has to go through a multiplexing chip, i.e. PLX, NF200, etc.
> 
> Bandwidth-wise, the Hawaii cards have the "theoretical potential", according to AMD, to surpass PCIe 3.0, but in the wild, as honeybadger alluded to, it's a very different thing. I've found at best single-digit % differences between 2.0 and 3.0.


----------



## heroxoot

Quote:


> Originally Posted by *Paul17041993*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> @rdr09 not sure why but setting hardware polling to 500ms instead of 1000ms solved it. I thought setting a higher poll number for reduced checking would be better but it seems not. Not sure why but it's fine now.
> 
> 
> 
> CPU load, input event thread overburdened, sync misses, USB hub and controller load, driver/firmware bug, mousepad (if on a mouse, of course).
> 
> A 1000 Hz rate means it checks the sensor surface and button states a thousand times a second, meaning greater movement accuracy and lower delay; however, it of course puts higher stress on the hardware and can cause all sorts of problems. I usually just use 500.
> 
> edit note; for Corsair mice it's displayed as the actual period, not the rate, so 1ms == 1000Hz, 2ms == 500Hz, 4ms == 250Hz and 8ms == 125Hz, which is the normal default on basic mice and keyboards.
Click to expand...

I don't think that's it. It says polling period in milliseconds. But yea 500 does work wonders.


----------



## Paul17041993

Quote:


> Originally Posted by *kahboom*
> 
> Isn't PCI-e 3.0 8x equal to PCI-e 2.0 16x?


In bandwidth, yes; in performance, no.

x16 2.0 has actually been shown to perform better than x8 3.0. The simple reason is that 16 lanes allow more data to be sent in one buffer at a time, whereas half the lanes at double the speed require two smaller buffers to be processed and scheduled separately, adding a very slight overhead and allowing 2.0 a few more frames.

Kinda why AMD is completely skipping PCIe 3.0 on their higher-end (AM3+/AM4) platform, but in all honesty the difference is negligible; it's just that Intel using an on-die 3.0 controller allows it to compete with 32 2.0 lanes on a decade-old northbridge.
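The bandwidth half of that answer checks out numerically: per-direction throughput is line rate times lane count, less encoding overhead (8b/10b for gens 1-2, 128b/130b for gen 3). A quick sketch:

```python
# Theoretical per-direction PCIe bandwidth per generation:
# (line rate in GT/s, encoding efficiency).
GENS = {1: (2.5, 8 / 10), 2: (5.0, 8 / 10), 3: (8.0, 128 / 130)}

def gbytes_per_s(gen, lanes):
    """Theoretical per-direction bandwidth in GB/s for a PCIe link."""
    gt_s, encoding = GENS[gen]
    return gt_s * encoding * lanes / 8  # 8 bits per byte

print(f"PCIe 2.0 x16: {gbytes_per_s(2, 16):.2f} GB/s")  # 8.00
print(f"PCIe 3.0 x8 : {gbytes_per_s(3, 8):.2f} GB/s")   # 7.88
```

So x8 3.0 is within about 2% of x16 2.0 on paper; any real-world gap comes from link behavior, not raw bandwidth.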


----------



## HoneyBadger84

I think AMD skipped 3.0 outta laziness myself  lol j/k... kinda.


----------



## Red1776

Quote:


> Originally Posted by *skruppe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rv8000*
> 
> Even if it's pci-e 3.0 @ 8x, the card won't saturate the bus, there will be almost 0 difference running it in the other slot if the card is there for placement reasons.
> 
> 
> 
> Even if the card won't saturate the bus there's latency. So your nonsense about the difference being small is irrelevant as you can't answer why he would prefer it that way. Don't reply with another useless post.
Click to expand...

*Taken from an article by Matt Bach of Puget Systems*

*Impact of PCI-E Speed on Gaming Performance*

*( very typical of the entire test/benchmark) ( Including the multi-cards tests)*





Quote:*Puget Systems*


> Starting with Unigine Heaven 4.0, let's first take a closer look at the Z87 test system. On that system, PCI-E 3.0 was very slightly faster than PCI-E 2.0, but oddly we saw higher scores in x8 mode than in x16 mode. The biggest variance was still only 1 FPS, however, which we would call within our margin of error.
> 
> On the X79 system, there is really nothing to discuss. The biggest variance was only .4 FPS which is well within our margin of error.


Quote:*Puget Systems*


> Metro: Last Light - our results are again unremarkable. On the Z87 test system, the results were all essentially identical. The X79 system was slightly faster when running PCI-E 3.0 in both x8 and x16 mode, but only by .3-.7 FPS.


Quote:*Puget Systems*


> Running our benchmarks at 4K resolutions with two video cards should stress the PCI-E bus more than a single card at lower resolutions, but at least for Unigine Heaven there is no notable difference in performance based on either the PCI-E revision or the number of PCI-E lanes


Quote:*Puget Systems*


> Unlike Unigine Heaven 4.0, Hitman: Absolution does give us some interesting data to go over, at least for the X79 test system. On that system, at x16 speeds PCI-E 2.0 is actually 1.2 FPS faster than PCI-E 3.0. At x8 speeds, however, *PCI-E 3.0 is actually much faster than PCI-E 2.0 by 1.5 FPS which is the biggest variance we saw in any of our tests*


Quote:*Puget Systems*


> *Conclusion:*
> 
> Our testing has pretty clearly shown that for gaming, using either PCI-E 2.0 or PCI-E 3.0 will give you nearly identical performance. Oddly, in some benchmarks PCI-E 2.0 was actually faster than PCI-E 3.0. At the same time, x16 was not consistently faster than x8. Again, x8 was actually faster than x16 in many cases. So unless you care about getting up to 1.5 FPS better performance, you might actually want to manually set your video cards to operate at x8 speeds - although we really would not recommend doing so.


*The Balance of the Article can be had here:*

*http://www.pugetsystems.com/labs/articles/Impact-of-PCI-E-Speed-on-Gaming-Performance-518/*

Quote: *Red1776*

Having myself built more multi-GPU systems than most, and having worked with PCI-E 1.1/2.0/3.0 at x4/x8/x16, these results closely match my own experience.


----------



## ZealotKi11er

To see the difference you have to test 4K. 1080p is garbage.


----------



## Red1776

Quote:


> Originally Posted by *ZealotKi11er*
> 
> To see the difference you have to test 4K. 1080p is garbage.


 Read the article, they test at 4K, and with multiple cards.

*Taken from an article by Matt Bach of Puget Systems*

*Impact of PCI-E Speed on Gaming Performance*


----------



## pdasterly

What's the best way to connect 3 monitors to a 290X xfire setup, Sapphire cards?

Currently I have two DVI and one DP hooked up. Every now and then I get a DP error about losing connection and resolution, but the monitors never cut off; all connectors are going to one card. The monitors have DVI and DP, HP Z23i.


----------



## mojobear

Hey, if it's just 1080p x3 you can do DVI, DVI, DVI -> HDMI adapter. You don't need to use the DP for Eyefinity with the R9 290(X).

Quote:


> Originally Posted by *pdasterly*
> 
> What's the best way to connect 3 monitors to a 290X xfire setup, Sapphire cards?
> 
> Currently I have two DVI and one DP hooked up. Every now and then I get a DP error about losing connection and resolution, but the monitors never cut off; all connectors are going to one card. The monitors have DVI and DP, HP Z23i.


----------



## pdasterly

So I need to buy an HDMI-to-DVI adapter?


----------



## mojobear

Well, it could be one solution to avoid the DP connection, which sounds like your issue. I had some issues with a DP adapter for Eyefinity on the 14.6 driver, so I switched to HDMI and the issues are gone. A DVI-HDMI adapter is probably cheaper than a new DP cable, which would be the other quick option to consider.

First thing you can try is to switch to AMD 14.4 or 13.12 drivers to see if that fixes your DP signal issue, if you are on 14.6 RC2 right now.
Quote:


> Originally Posted by *pdasterly*
> 
> So I need to buy an HDMI-to-DVI adapter?


----------



## pdasterly

The DP cable is new, OEM HP. I have three I can try out. The error is random; it happened on 14.4, 14.6 and the RC.


----------



## tsm106

Quote:


> Originally Posted by *pdasterly*
> 
> What's the best way to connect 3 monitors to a 290X xfire setup, Sapphire cards?
> 
> Currently I have two DVI and one DP hooked up. Every now and then I get a DP error about losing connection and resolution, but the monitors never cut off; all connectors are going to one card. The monitors have DVI and DP, HP Z23i.


It's not uncommon to get DP errors; if things are working fine otherwise you might just ignore it. If you're asking what's the best way of doing it with three 1080p 60 Hz panels, I would get a hub and three matching DP cables. That's the ideal method, and the advantage, besides simplicity, is that all the monitors are synced, so you won't have mismatched clocks on the panels, i.e. screen tearing.


----------



## pdasterly

Which hub, thanks for fast reply


----------



## tsm106

Quote:


> Originally Posted by *pdasterly*
> 
> Which hub, thanks for fast reply


I've used both Accell's and EVGA's MST 2.0 hubs. I like the EVGA better for aesthetics and ease of connectivity. As I mentioned, DP connect issues happen every so often, and they're easy to deal with using the EVGA: there's a button on it to refresh the panels, so when it happens you simply press that button to re-sync the panels. Oh also, the EVGA can take any DP cable, because it's not hard-wired like the Accell.

http://www.amazon.com/EVGA-Compliant-Multi-Stream-Displayport-200-DP-1301-L1/dp/B00HK8V5KE/ref=sr_1_1?ie=UTF8&qid=1404876737&sr=8-1&keywords=evga+hub


----------



## pdasterly

Sheesh, 100 bux. I'll deal with it. I will change the cable, but the error is random and unreproducible.


----------



## Arizonian

Quote:


> Originally Posted by *mojobear*
> 
> Hey Arizonian,
> 
> Could you please update me? You might have not seen my prev post.
> 
> Thanks!
> 
> Sapphire x3, Powercolor x1 - OC to 1150/1375 + 63 mV
> EK waterblocks - EK420 + RX360 + EX480
> 
> 
> Spoiler: Warning: Spoiler!


I'm sorry about that and thanks for your patience.







Congrats - updated






















Quote:


> Originally Posted by *miraldo*
> 
> Hey guys.
> 
> Here is my Sapphire R9 290 TRI-X + NZXT case and a few benchmark results:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What's your opinion? I know I should hide the cables better, but I really don't know where to hide them.
> 
> Are the bench results good for my rig? Currently using Catalyst 14.6 RC2 drivers.
> 
> PC spec:
> 
> Gigabyte Z77M-D3H
> Intel i5 2500K 3.3GHZ
> Sapphire R9 290 TRI-X 4GB UEFI
> SSD Crucial M500 120GB
> HDD Samsung Spinpoint F3 1TB /
> 8Gb Crucial Ballistix Tactical 1866mhz Ram
> Corsair 600 CX V2
> NZXT Hades
> 
> The PSU has 45A on 12V and so far no problems (not loud or hot); the system runs stable.


Congrats - added


----------



## Kittencake

Quote:


> Originally Posted by *Arizonian*
> 
> I'm sorry about that and thanks for your patience.
> 
> 
> 
> 
> 
> 
> 
> Congrats - updated
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added


you forgot the kitty again -_-


----------



## Arizonian

Quote:


> Originally Posted by *Kittencake*
> 
> http://www.techpowerup.com/gpuz/zmqa2/ my validation


Quote:


> Originally Posted by *Kittencake*
> 
> you forgot the kitty again -_-


I'm sorry. I found your post.









I'm not sure which manufacturer, but from the validation link it's a single 290X. I'm sure you may have said it later; PM me if it's not a Sapphire and I'll update it. Placed you in order of entry.

Congrats - added


----------



## Kittencake

its a sapphire


----------



## Mega Man

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> as to number one.... why does everyone think a non-80+ 1000W delivers less than an 80+ 1000W? For the love of god (assuming they are QUALITY products), they are both rated to deliver 1000W. Efficiency has NOTHING to do with the wattage supplied, ONLY with the WATTAGE _*TAKEN FROM THE WALL*_
> 
> as for two.
> 
> also completely incorrect; what does sheer wattage have to do with inefficiency?!
> 
> you guys need to read this AND the further reading!
> 
> 
> 
> Well as to point 1.
> I'm assuming that if the PSU is more efficient, less input wattage is wasted as heat. Therefore, the PSU can run cooler thus needing a slower fan to keep it from burning up.
> 
> As to point 2.
> Say for some reason the ideal Platinum power supply isn't available. Yet you desire silence and low heat output. Isn't it better to overspec your PSU requirements, as a large capacity PSU won't generate as much heat at partial % draw of its capacity than a smaller PSU of similar efficiency with a higher % draw of its capacity?
> 
> I was more refering to heat and noise profiles
> Just because multiple PSUs are rated for a given wattage, doesn't mean they all deliver the full capacity or parts thereof at the same noise and heat output profiles.
Click to expand...

Point 2: no. Larger capacity = worse and worse efficiency at lower loads, which, again, the articles would explain if you read them.


----------



## Silent Scone

Anyone having issues with their 290 flickering on the AOC U2868PQU 28 - working EDID fix here

AOCwithTimingFix-Proper.txt 2k .txt file


There will also be a fix included in the next beta drivers


----------



## fateswarm

PCI-E saturation is never an issue in 99% of gaming cases. This is because bulk transfers of assets (mostly textures and geometry) are done while you are on the loading screen, and rendering generates only small transactions on the PCI-E bus. The main benefit of a better PCI-E link is latency in almost all gaming cases (small exceptions include things like changing world area mid-game in an MMORPG).

i.e. that means you mainly want faster PCI-E for latency alone. Even when you only have to transfer 100 kilobytes, it will _still_ be done faster on a faster bus (simply because "what you have to transfer / bandwidth" = lower latency at higher bandwidth).

That's how all modern GPU APIs and engines currently work: they try to keep PCI-E transactions as small as possible while doing bulk loading before rendering. Since PCI-E is multiple times slower than VRAM<->GPU speeds, it explains things like Mantle.

That being said, due to other limitations unrelated to the PCI-E bus itself, its latency stops being an issue after a point currently.

Make no mistake though, if the PCI-E bus had the bandwidth (and latency) of VRAM<->GPU, we would not need VRAM on the card!

The main reason we offload so much onto the graphics card's VRAM is that PCI-E is extremely slow by comparison.
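To put the "transfer size / bandwidth" point in numbers: even a transfer far too small to saturate the bus still completes sooner on a faster link (the bandwidth figures below are rough per-direction approximations, not measurements):

```python
# Minimum wire time for a transfer at a given effective bus bandwidth;
# a faster link finishes even a tiny transfer sooner.
def transfer_us(size_bytes, gb_per_s):
    """Minimum wire time in microseconds for a transfer at a given bandwidth."""
    return size_bytes / (gb_per_s * 1e9) * 1e6

# rough per-direction bandwidths: ~x8 2.0, ~x16 2.0, ~x16 3.0
for bw in (4.0, 8.0, 15.75):
    print(f"{bw:>5} GB/s: 100 KB in {transfer_us(100_000, bw):.2f} us")
```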


----------



## Imprezzion

I just got my R9 290X WindForce 3X in, and I'm trying to figure out whether MSI AB can unlock voltage on this bad boy, but for now it's not working yet.








Anyone got some tips, or is it voltage-locked like most Gigabyte AMD cards?

Edit: Fixed it. Black screened on me once already though... <3 Elpida!


----------



## Decoman

Does anyone know much about so called "double precision" features on 290x cards? What cards has them, active/inactive, how much and how relevant are the numbers for gaming and whatnot? Does AMD's other line (CAD related) of cards have better abilities for working with great precision?

I searched for this on the internet, but didn't really find much useful info on that. I ended up reading some weird things like, 280x cards being better at "double precision" than 290x cards. I have no idea if that is correct though.

Edit: Unless something odd happens in the next 24 hours, I should be the owner of a "Sapphire R9 290X 4GB TRI-X OC" card. Hopefully I can get another one in the next 6 months or so for a CrossFire setup. I plan on water cooling though, with one really big external radiator.







Apparently, these cards have a reference-card PCB layout.


----------



## heroxoot

Quote:


> Originally Posted by *Decoman*
> 
> Does anyone know much about so called "double precision" features on 290x cards? What cards has them, active/inactive, how much and how relevant are the numbers for gaming and whatnot? Does AMD's other line (CAD related) of cards have better abilities for working with great precision?
> 
> I searched for this on the internet, but didn't really find much useful info on that. I ended up reading some weird things like, 280x cards being better at "double precision" than 290x cards. I have no idea if that is correct though.


From what I read it's 1/8th of the single-precision rate or something? If the 280X is better, it was probably a driver thing that's been improved by now.


----------



## sugarhell

Hawaii is 1/8th of its single-precision rate and the 7970 is 1/4. It's not a driver thing; Hawaii is crippled compared to the 7970.


----------



## heroxoot

Quote:


> Originally Posted by *sugarhell*
> 
> Hawaii is 1/8th of the single precision and 7970 is 1/4.Its not a driver thing hawaii is crippled compared to 7970


Is there a reason for this? I'm assuming this precision thing is for CrossFire setups.


----------



## sugarhell

Quote:


> Originally Posted by *heroxoot*
> 
> is there a reason for this? I'm assuming this precision thing is for crossfire setup.


Wut..Pls google it


----------



## Dasboogieman

Quote:


> Originally Posted by *heroxoot*
> 
> is there a reason for this? I'm assuming this precision thing is for crossfire setup.


Double precision processing allows the use of 64-bit floating-point numbers. It allows greater precision (i.e. more decimal places) for various floating-point ops like multiplication, which is extremely useful for scientific work and even some forms of cryptography. FP64-capable cores are more complex, more power hungry, and require more transistors, which is why it's impractical for a general-purpose GPU to be 100% DP.

Tahiti was originally the top of AMD's product stack. Titan-style prosumer branding didn't exist back then, so top-end consumer cards tended to have strong DP performance, and it also showcased GCN's compute strength. That strong DP was part of the reason Tahiti ran so hot and consumed more power than GK104.

Hawaii had such power issues that I'd imagine AMD reduced DP performance to stay within a reasonable TDP and die-area budget. I may be wrong, but I think the FirePro Hawaiis have 1/4 DP at much, much lower clock speeds, maybe even with only 2560 cores a la Hawaii XT.

To be honest, consumers have precious little use for double precision; most graphics work is single precision or integer. I was surprised AMD didn't hobble it even more on the 280X to allow higher core clocks or make it more TDP-competitive with GK104.
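To make the FP32-vs-FP64 distinction above concrete, here is a small CPU-side Python sketch. Python floats are already IEEE-754 double precision, and the `struct` round-trip squeezes a value through single precision. It only illustrates the precision gap, not how a GPU actually schedules FP64 work:

```python
import struct

def to_fp32(x):
    """Round-trip a Python float (FP64) through IEEE-754 single precision."""
    return struct.unpack('f', struct.pack('f', x))[0]

pi64 = 3.141592653589793   # FP64 holds ~15-16 significant decimal digits
pi32 = to_fp32(pi64)       # FP32 holds only ~7 significant decimal digits

print(f"FP64: {pi64:.17f}")
print(f"FP32: {pi32:.17f}")
print(f"error introduced by FP32: {abs(pi64 - pi32):.2e}")
```

For graphics work the FP32 error is far below anything visible on screen, which is why gaming cards can afford a 1/8-rate DP unit.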


----------



## Imprezzion

Hmm... Is it normal for a Gigabyte Windforce 3X to run ~80C under load at +31mV and 1150/1500MHz, with about 80% fan speed? It's doing a LOT worse than my Tri-X was.. and I thought the WF3 was the better cooler..


----------



## joeh4384

Quote:


> Originally Posted by *Imprezzion*
> 
> Hmm... Is it normal for a Gigabyte Windforce3X to run ~80c load at +31mV 1150/1500Mhz clocks, with about 80% fanspeed? It's doing a LOT worse then my Tri-X was.. and I thought the WF3 was a better cooler..


What is your case? I have heard the Tri-x was better than the Windforce.


----------



## fateswarm

Yep. The Tri-X clearly shows its superiority in regular review benchmarks. Good thing I found out early enough, 'cause I was also considering a Windforce.


----------



## Imprezzion

HAF932 Advanced.

360 rad in the top for CPU, push/pull, intake.
5.25" bay 140mm exhaust
backside 140mm exhaust
bottom 140mm intake
front 230mm intake.

Weird setup, but with my Tri-X 290X @ +100mV, GTX 770 JetStream @ 1.325v softmod and GTX 780 ACX @ 1.30v softmod, all three of which exhaust hot air into the case, it works very well and gave the lowest temps for both GPU and CPU.

The Tri-X never went above 80C core at around 75% fan speed. This Gigabyte is hitting 80C core at stock clocks and volts with Auto fan speed (Perf. mode).

It's not like I deliberately swapped to the Gigabyte; someone just sold it at a ridiculously low price with 20 months of warranty still in effect. Couldn't resist buying it. I can flip it for a profit anyway, but still.

It's a shame the cooler disappoints on the AMD cards, because it performs very well on the GTX 780/780 Ti. But maybe that's because there's no Sapphire to compete with there









God I have too many performance GPUs.. I really need to sell some of my stuff.. rofl..


----------



## tonymontana95

I keep my PowerColor R9 290X PCS+ at 1130/1500MHz without touching the voltage; max temp was 82C on a hot summer day in Croatia








What do you think about my clocks? Could it go higher?


----------



## heroxoot

Quote:


> Originally Posted by *tonymontana95*
> 
> I keep mine powercolor r9 290x pcs+ on 1130/1500mhz without touching the voltage, max twmp was 82 C on hot summer day in Croatia
> 
> 
> 
> 
> 
> 
> 
> 
> What do you think about my clock? could it go higher?


Is it really stable? I can't do 1100/1250 on mine with voltage increases so far. I did +30mV and I still see artifacts sometimes; they tend to look like a speckled white checkerboard pattern. What games have you run so far?


----------



## tonymontana95

Quote:


> Originally Posted by *heroxoot*
> 
> Is it really stable? I can't do 1100/1250 on mine with voltage increases so far. I did +30mV and I still see artifacts sometimes. It tends to look like speckled checkerboard pattern of white too. What games have you run so far?


I tested it for 30 min in FurMark and 30 min in Unigine Heaven, and I played 3-4 h of BF4 on all ultra settings at 1080p


----------



## heroxoot

Quote:


> Originally Posted by *tonymontana95*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Is it really stable? I can't do 1100/1250 on mine with voltage increases so far. I did +30mV and I still see artifacts sometimes. It tends to look like speckled checkerboard pattern of white too. What games have you run so far?
> 
> 
> 
> I tested it 30min on furmark,30min on unigine heaven and I played 3-4 h of BF4 on all ultra settings 1080p
Click to expand...

FurMark is useless and only makes heat. Heaven is great, but it's not a sure thing. BF4, on the other hand, great. Thing is, my GPU showed artifacts in games that didn't keep the GPU at full load, while BF4 was fine. Not sure why. I reverted back to stock for the summer anyway.


----------



## Imprezzion

I know the feeling with BF4.. especially if you crank up the resolution scale and run Mantle. That puts an insane load on the GPU and exposes an unstable OC very fast.

I have put the Windforce3X on stock volts now as well and i'm going to see how it handles that. A 60 minute loop of Fire Strike Extreme (continuous demo movie loop, no loading screen pauses) had the temps at about 76c core and 67c VRM at stock volts, max power limit, 1125Mhz core, 1400Mhz VRAM.

Going to start a round or 2 now on BF4 (Ultra 2xAA, 125% res scale, adaptive AA enabled in CCC, running Mantle) and see what it does.

Btw, when I go to 1500MHz VRAM it sometimes seems to work fine and be ''stable'', but it also randomly goes completely bonkers, even at idle: massive striping and artifacting on my screen while sitting on the desktop. Is this yet *another* Elpida thing? My Tri-X had Hynix and had none of it.


----------



## heroxoot

Quote:


> Originally Posted by *Imprezzion*
> 
> I know the feeling about BF4.. Especially if you crank up the resolution scale and run Mantle. That puts insane loads on the GPU and gets a unstable OC very fast.
> 
> I have put the Windforce3X on stock volts now as well and i'm going to see how it handles that. A 60 minute loop of Fire Strike Extreme (continuous demo movie loop, no loading screen pauses) had the temps at about 76c core and 67c VRM at stock volts, max power limit, 1125Mhz core, 1400Mhz VRAM.
> 
> Going to start a round or 2 now on BF4 (Ultra 2xAA, 125% res scale, adaptive AA enabled in CCC, running Mantle) and see what it does.
> 
> Btw, when I go 1500Mhz VRAM sometimes it seems to work fine and is ''stable'' but it also randomly goes completely bonkers. Even idle. Massive striping and artifacting on my screen when idle on desktop and such.. Is this yet *another* Elpida thing? My Tri-X had Hynix and had none of it.


Possibly the lack of quality in Elpida. They are just not as high quality as Hynix.


----------



## Silent Scone

Quote:


> Originally Posted by *heroxoot*
> 
> Possibly the lack of *quality* in *Elpida*. They just are not has high quality as Hynix.


InvalidArgumentException


----------



## HoneyBadger84

Quote:


> Originally Posted by *Imprezzion*
> 
> Hmm... Is it normal for a Gigabyte Windforce3X to run ~80c load at +31mV 1150/1500Mhz clocks, with about 80% fanspeed? It's doing a LOT worse then my Tri-X was.. and I thought the WF3 was a better cooler..


A one-word answer from someone who's run both: yes. The WindForce is a great cooler at stock, but when you start OCing it gets pretty warm pretty fast.

I ran my WindForce card at 100% fan to offset that and it rarely, if ever, got over 70C at stock while gaming (usually stayed under 65C), and even the bit of OCing I did only pushed it to 80C in the harder/longer benchmarks.

Edit: Also, the lines etc. you're seeing on the desktop are the result of a bad overclock pissing off Afterburner, and thereby the drivers. I've had this issue several times, and the only fix I found was to completely wipe Afterburner and the drivers and reinstall, after using a driver-cleaning utility to make sure no driver files were left behind.


----------



## Imprezzion

I'm now getting a flat 77C core and 69C VRM at stock volts with 1110MHz core and 1400MHz VRAM, running an ~80% fan-speed custom curve. That's in BF4 with Mantle, 125% resolution scale, 2xAA and all Ultra settings.

Not that bad..









Still debating whether to keep my 1280Mhz GTX780 or this beast.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Imprezzion*
> 
> I'm now getting 77c flat line core and 69c flat line VRM at stock volts with 1110Mhz core and 1400Mhz VRAM. Running ~80% fanspeed custom curve. That is in BF4 @ Mantle 125% res scale 2xAA and all Ultra settings.
> 
> Not that bad..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Still debating whether to keep my 1280Mhz GTX780 or this beast.


If you're able to push that at stock volts, that's impressive. Every card I've tested has needed additional voltage to go above 1075MHz core... 'course, I haven't OC tested my Sapphire cards yet, so that'll probably change. lol


----------



## joeh4384

Quote:


> Originally Posted by *Imprezzion*
> 
> I'm now getting 77c flat line core and 69c flat line VRM at stock volts with 1110Mhz core and 1400Mhz VRAM. Running ~80% fanspeed custom curve. That is in BF4 @ Mantle 125% res scale 2xAA and all Ultra settings.
> 
> Not that bad..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Still debating whether to keep my 1280Mhz GTX780 or this beast.


I would sell the gtx 780 just because the 290/290x market is so saturated right now.


----------



## Imprezzion

Well yeah, but the only 290Xs for sale here are references, and no one wants those.

For the rest, just 1-2 XFX DDs, 'cause they're so terribad.


----------



## rdr09

Quote:


> Originally Posted by *Imprezzion*
> 
> Well yeah, but the onjly 290X's for sale here are *reference's and no one wants those*.
> 
> For the rest, just 1-2 XFX DD's cause they're so terribad.


Unless you're watercooling. I haven't seen a non-ref 290 beat my ref 290, even watercooled ones.

If you win the silicon lottery on a reference card (290 or 290X), no non-ref 290 will beat it, probably excluding the Lightning.


----------



## PCSarge

Quote:


> Originally Posted by *rdr09*
> 
> unless when watercooling. haven't seen non-ref 290s beat my ref 290. even watercooled ones.
> 
> if you win the silicon lottery on a reference card (290 or 290X) - no non-ref 290s will beat it. prolly excluding the Lightning.


My card is the father of all of your cards, quite literally. It's a "press sample" with an engineering-sample GPU core on the board.

It will pull 1150 core / 1350 mem with no voltage bump.


----------



## rdr09

Quote:


> Originally Posted by *PCSarge*
> 
> my card is the father of all of your cards. quite literally. its a "press sample" with an engineering sample GPU core on the board.
> 
> it will pull 1150 core 1350 mem on no voltage bump


Depends on the synthetic bench. In Futuremark, my reference 290 won't need any voltage adjustment until 1170. I've benched mine as high as 1320 core, and I've seen others do better. All reference.


----------



## PCSarge

Quote:


> Originally Posted by *rdr09*
> 
> depends on the synthetic bench. Futuremark, my 290 reference won't need any voltage adjustment not till 1170. i've benched mine as high as 1320 core. i've seen other do better. all reference.


I know I can do better; I have a full-cover Aquacomputer block inbound, should be in my hands tomorrow afternoon.

Those clocks are stable in Heaven on Ultra.


----------



## HoneyBadger84

Must be nice to have cards that clock so well with no voltage adjustments  lol

Got my lady all prepped for the Reference cards that'll be incoming in the next week:




Once I cycle out all the cards I don't want, I'll be picking up an AX1500i so I can actually run QuadFire (when I want to) without having to use adapters for the last two 6-pin connectors, since I only have six 8-pin connectors on the AX1200.

Can you tell which side fan is the new one and which are the two that have been in there a while?  I need to clean those two out better; the dust on the air channels looks icky in that picture.

I don't know how much good the bottom fans actually do; they're there more for show than anything (the extra red LED light gives the case more of an evil red glow, in combination with the new red-LED Corsair Quiet Edition AF140s I put in).

My H110 seems to run just a bit cooler with the new Corsair AF140s on it as well. We shall see when them reference coolers get here ^_^


----------



## Imprezzion

I said that because they sell for as low as €280 here for a 290X and €200 for a 290 and no one buys them. The ads have been up for weeks..


----------



## PCSarge

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Must be nice to have cards that clock so well with no voltage adjustments  lol


It's about the luck of the draw. I traded in an old card for my 290 at the AMD LAN event; engineering samples tend to be strong cards anyway.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Imprezzion*
> 
> Well yeah, but the onjly 290X's for sale here are *reference's and no one wants those*.
> 
> For the rest, just 1-2 XFX DD's cause they're so terribad.
> 
> 
> 
> unless when watercooling. haven't seen non-ref 290s beat my ref 290. even watercooled ones.
> 
> if you win the silicon lottery on a reference card (290 or 290X) - no non-ref 290s will beat it. prolly excluding the Lightning.
Click to expand...

You can have all the customs. Do not want.


----------



## rdr09

Quote:


> Originally Posted by *PCSarge*
> 
> i know i can do better, i have a fullcover aquacomputer block inbound, should be in my hands tommorow afternoon.
> 
> those clocks are stable in heaven ultra


Heaven and Valley tax my GPU more; the highest I've achieved in those benchmarks is 1250, and I haven't tried higher. 1320 works for Futuremark benches using an older driver. Let me be clear, we're talking about suicide runs. IMO, a higher core OC in these benches will translate into a higher stable OC in games, though not necessarily 1300 or 1320. I play at stock clocks.


----------



## ZealotKi11er

I am more than happy with 1200MHz fully stable on both cards. Both of my cards can do 1150MHz at stock volts.


----------



## hwoverclkd

Quote:


> Originally Posted by *Imprezzion*
> 
> God I have too many performance GPU's.. I really need to sell some of my stuff.. rofl..


Put them to good use, like hanging them on your walls


----------



## Imprezzion

Quote:


> Originally Posted by *acupalypse*
> 
> put them into good use, like hang them on your walls


Ok, my wall inventory:

Gigabyte GA-X58A-UD7 rev 2.0. In fine shape, but has no CPU to go with it, as I sold my Xeon.
ASUS P5N32-E SLI. Working order; buggy BIOS but works. CPU is a Q6600. No RAM.
MSI P45 board. Works fine, has an E8500 in it. No RAM.
Gigabyte 1GB 9500GT. Works fine; noisy fan.
Sapphire HD4870 reference. Works; noisy fan bearings.
ASUS HD4890 TOP. Works; also noisy fan bearings.
Sapphire HD7970 reference. Blew up a memory VRM while benching. Had the Accelero Xtreme III on it, and the tiny aluminium heatsinks couldn't take 1.70v to the RAM...
And there's probably more..









But, to get on topic a tad..

I swapped my HAF932's window panel back to the stock fan panel to draw the hot air from the 290X out of the case. Heat was building up pretty rapidly.

My current 290X Windforce3X temps @ 25c ambient.. [CLICKABLE]

100% fan loud as hell:


70% fan pretty decent:


60% fan, casefans are just as loud:


Now, which one do you guys consider ''safe''?
I mean, the 60% one has some very high VRM temps, to be honest.. or is that still safe?
Another thing I worry about is its very slow descent to idle temps, and the reasonably high idle temps themselves.
Voltage goes down, but only to 0.95v, and it takes a full 4 minutes to reach idle temps, which are still in the 40s Celsius at 25C ambient...


----------



## HoneyBadger84

I'd be runnin' 100% just to keep the card as happy as possible, if you don't mind the noise, which isn't that bad from this card at 100% imo.

I'm running some tests at 3K (3x1080p Eyefinity) with the Sapphire Tri-X OC 290X & the Gigabyte WindForce 290X... so far not too bad, numbers-wise: 85FPS average with my usual settings in Metro 2033 (I get 88FPS with a single card at 1080p, so 85FPS dual-card at 3K is pretty darn good), and Metro Last Light with my usual settings averaged 59FPS, compared to about 80-something at regular 1080p with dual cards... so far so good.

Load temps during the Metro LL bench were 70/74C, so nothing too bad considering how long that benchmark runs, and they seemed to level off there, staying steady for the entire 3rd loop. The 74C is of course the Gigabyte card... which is on the bottom, sadly, and still running that warm.

Wondering what to run next... I'm kinda limited on game-benchmarks I have at the moment.


----------



## Imprezzion

I measured this looping the Fire Strike demo movie on Extreme + max tessellation + max MSAA. Bought 3DMark in the summer sale for like €4, so..









Ehm, good game benches for stressing: the Crysis 3 in-game bench is great.
Far Cry 3 is very intensive on core clock OCs.
Battlefield 4 is intensive on core overclocks, and when running Mantle at high resolution the memory load is insane: easily ~3600MB for 4K on my rig (200% resolution scale @ 1080p = 4K rendering)
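For reference, BF4's resolution scale multiplies each axis, so the render resolutions mentioned above work out as follows (a quick sketch; the game's exact rounding behavior is an assumption):

```python
def scaled_resolution(width, height, scale_pct):
    """Render resolution under a per-axis resolution scale (BF4-style)."""
    s = scale_pct / 100.0
    return round(width * s), round(height * s)

print(scaled_resolution(1920, 1080, 200))  # 200% of 1080p -> (3840, 2160), i.e. 4K
print(scaled_resolution(1920, 1080, 125))  # the 125% setting used earlier -> (2400, 1350)
```

That quadrupling of pixel count at 200% is why the VRAM load climbs so sharply.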


----------



## JeremyFenn

Just ran Heaven 4.0 and Valley 1.0 @ 5.04GHz CPU 1230/1620 GPU, here's the results:




Wish I had a better monitor. I think the 1680x1050 resolution makes the score higher than it should be, tbh. I need 1080p at least. Maybe one of these days I'll get to spend some cash on my puter again and get one.







Hard to splurge on yourself when you've got a wife and 2 kids lol


----------



## Blaise170

Quote:


> Originally Posted by *JeremyFenn*
> 
> Just ran Heaven 4.0 and Valley 1.0 @ 5.04GHz CPU 1230/1620 GPU, here's the results:
> 
> 
> 
> 
> Wish I had a better monitor. I think the 1680x1050 resolution makes the score higher than it should be tbh. I need 1080p at least. Maybe one of these days I'll get to spend some cash on my puter again and get one.
> 
> 
> 
> 
> 
> 
> 
> Hard to splurge on yourself when you've got a wife and 2 kids lol


I have two 1080p monitors I'm trying to sell if you're interested; I'm going to replace them with a TV.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Imprezzion*
> 
> Ok, My wall inventory:
> 
> Gigabyte GA-X58A UD7 Rev 2.0. Is in fine shape but has no CPU to go with it as I sold my Xeon.
> ASUS P5N32-E SLI. Working order. Buggy BIOS but works. CPU is a Q6600. No RAM.
> MSI P45 board. Works fine, has a E8500 in it. No RAM.
> Gigabyte 1GB 9500GT. Works fine. Noisy fan but works.
> Sapphire HD4870 reference. Works, noisy fan bearings.
> ASUS HD4890 TOP. Works, also noisy fan bearings.
> Sapphire HD7970 reference. Blown up memory VRM while benching it. Had the Accelero Xtreme III on it and the tiny ass aluminium heatsinks couldn't take 1.70v to the RAM...
> And there's probably more..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But, to get on topic a tad..
> 
> I replaced my Window panel for my HAF932 again for my stock fan panel to draw the hot air from the 290X out of the case. Heat was building pretty rapidly actually.
> 
> My current 290X Windforce3X temps @ 25c ambient.. [CLICKABLE]
> 
> 100% fan loud as hell:
> 
> 
> 70% fan pretty decent:
> 
> 
> 60% fan, casefans are just as loud:
> 
> 
> Now, which one do you guys consider ''safe''.
> I mean, the 60% one has some very high VRM temps to be honest.. Or is that still safe..
> Another thing I worry about is it's very slow decent to idle temps and reasonably high idle temps.
> Voltage goes down, but only to 0.95v and it takes a full 4 minutes to get to idle temps. Which are still in the 40's Celsius at a 25 ambient...


Yeah, those VRM1 temps are high. I'm not 100% sure, but even the stock cooler has much lower VRM temps with a set fan profile. Check what other people get on a stock card with the stock fan profile for VRM1 under load, and treat that as your max. Also maybe try changing the VRM1 thermal pad; I had to change it on my cards even with waterblocks, because I was getting high 70s.


----------



## Imprezzion

I know. I had Fujipoly Ultra Extreme pads on it; temps dropped a bit, but not much. For some reason the Gigabyte just runs hot.. VRM temps are in line with reviews though, and still not nearly as bad as an XFX DD or ASUS DC2, haha.. (both hit 100C easily)


----------



## rdr09

Quote:


> Originally Posted by *Imprezzion*
> 
> I know. I had Fujipoly Ultra Extreme on it. Dropped a bit but not much. For some reason the Gigabyte runs hot.. VRM temps are accurate with reviews though and still not nearly as bad as a XFX DD or ASUS DC2 haha.. (Both hitting 100c easy)


Imprezzion, do you mind sharing how you clean out GPU drivers? You switch cards as if you never run into any issues.

I'm sure you do.


----------



## Imprezzion

Nope, I never run into any issues, tbh.

My sequence, the same when switching in either direction:

- Uninstall the current driver, with the card still installed, through Control Panel - Programs and Features.
- When it asks for a reboot, don't. Shift-click the ''Restart'' button to boot into safe mode.
- Boot into safe mode (the regular one).
- Use DDU to clean up the driver, but leave the files in C:\AMD and C:\NVIDIA. Those are the extracted installers, so you don't have to re-extract every time and can reinstall very quickly.
- Don't choose DDU's reboot option; just shut down.
- Swap cards.
- Now Windows boots 100% clean, and all you have to do is install the correct driver for the card you swapped to. Runs like a charm.


----------



## droid0mega

Quote:


> Originally Posted by *Silent Scone*
> 
> Anyone having issues with their 290 on the AOC U2868PQU 28 flickering - working EDID fix here
> 
> AOCwithTimingFix-Proper.txt 2k .txt file
> 
> 
> There will also be a fix included in the next beta drivers


Will this work for the ASUS monitor as well?


----------



## ZealotKi11er

There is a green LED on the back of my 290 (the second card). Does anyone know what it is? On the top card it's off.


----------



## HoneyBadger84

Quote:


> Originally Posted by *ZealotKi11er*
> 
> There is this green LED in the back of my 290 (Second Card). Does anyone know what is it? Top card its off.


Some 290s and 290Xs have fancy "temperature indicator" lights on them that change colors the hotter the card gets... Is it a Vapor-X? I'm pretty sure those have them...


----------



## ZealotKi11er

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Some 290s and 290Xs have fancy "temperature indicator" lights on them that change colors the hotter the card gets... Is it a Vapor-X? I'm pretty sure those have them...


It's reference from AMD. The top card has it too, but it's off.


----------



## MTDEW

Quote:


> Originally Posted by *ZealotKi11er*
> 
> There is this green LED in the back of my 290 (Second Card). Does anyone know what is it? Top card its off.


In this case it's probably one of those ULPS (ZeroCore) LEDs that light up to indicate the card still has power but is in a low-power state.


----------



## HoneyBadger84

Quote:


> Originally Posted by *MTDEW*
> 
> In this case its probably just one of those ULPS (zero core) LED lights, that light up when the card is in a low power state to indicate that the card still has power but is in a low power state.


That'd be why I've never seen them. First thing I do after installing drivers & rebooting is disable ULPS & reboot again  lol


----------



## rdr09

Quote:


> Originally Posted by *Imprezzion*
> 
> Nope, I never run into any issues tbh.
> 
> My sequence: Same when switching from either side:
> 
> - Deinstall current driver while card is still in there through control panel - programs and apps.
> - When it asks for a reboot, don't. Shift-click ''restart'' button to enter safe mode boot.
> - Boot into safe mode (regular one).
> - Use DDU to clean the driver up but leave the files in C://AMD/ and C://NVIDIA. These are the installers so you don't have to extract every time and you can install very fast.
> - Don't choose the reboot option but just shut down.
> - Swap cards.
> - Now Windows boots 100% clean and all you have to do is install the correct driver for the card you swapped to and done. Runs like a charm.


Thank you so much. If you don't mind, I'll recommend this method to others needing help with drivers when switching from red to green or vice versa.


----------



## ZealotKi11er

Quote:


> Originally Posted by *MTDEW*
> 
> In this case its probably just one of those ULPS (zero core) LED lights, that light up when the card is in a low power state to indicate that the card still has power but is in a low power state.


Figured as much. Just wanted to make sure.


----------



## Lou HM

Hi I want to join the Club
1. http://www.techpowerup.com/gpuz/by9mx/
2. MSI R9 290X (reference) & XFX R9 290X (reference)
3. Using a Kraken G10 with an Antec Kühler 920 on the XFX card & a Thermaltake Water 3.0 Performance on the MSI card


----------



## pdasterly

Is there a guide to OCing a crossfire setup?


----------



## the9quad

Quote:


> Originally Posted by *pdasterly*
> 
> Is there a guide to oc a crossfire setup?


If you just want a quick and dirty OC, use AB or something similar; it will let you OC both cards with the same settings. Not hard at all. If you want something more in-depth, there are plenty of knowledgeable and friendly people here to ask.


----------



## pdasterly

Quote:


> Originally Posted by *the9quad*
> 
> If you just want a quick dirty oc, then use ab or something similar and it will allow you to oc both cards with the same setting. Not hard at all, if you want something more in depth, plenty of knowledgable and friendly people here to ask.


I have Sapphire cards and have only used TriXX so far. I was recommended the GPU-Z render test, but my games run hotter than that app can reproduce. Unigine works, but you can't monitor VRM temps during the benchmark; same with 3DMark and Catzilla. And OCCT just gives a bunch of errors?

Currently at 1075 GPU, 0 VDDC and 200 power limit. Didn't attempt to OC the RAM.


----------



## HoneyBadger84

Quote:


> Originally Posted by *pdasterly*
> 
> I have sapphire cards, using trixx only so far. I was recommended using gpu-z render but mygames run hotter than what that app will reproduce. Unigine but you cant monitor vrm temp during benchmark, same with 3dmark and catzilla. Then occt gives a bunch of errors?
> 
> Currently at 1075 gpu, 0 vddc and 200 power limit. Didnt attempt to oc ram


TriXX is fine; it's basically Afterburner without the OSD, give or take. Works the same. Just make sure you don't have it set to "apply overclock/settings on startup", as you don't want it trying to apply a failed OC if you happen to blue-screen or otherwise crash.

Stress testing with Crossfire is a bit more cumbersome for OCing if you only have one screen since, like you said, you can't keep track of the VRMs while running a test. What type of Sapphire cards are they (sorry, I forgot which you have), stock blower cards or Tri-X or which? If they're aftermarket (Tri-X or Vapor-X), there's no way in heck the VRMs will get hotter than the core with only two cards in there; I've tested a setup with a Tri-X, and the VRMs stayed a good 5-10C cooler than the GPU core in all tests.

I'd use Valley/Heaven looped a few times and watch for artifacts if you are planning to use this OC for regular use. If you're just doing it for benchmarking, you can just do what I do, run the clocks through all the benchmarks you want scores from, and if it passes them all without any artifacts, good enough  lol


----------



## pdasterly

I have 3 screens. I have reference cards with the Kraken G10 bracket.


----------



## Mega Man

If you use HWiNFO you can check the VRMs on all cards, if they have a sensor.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Mega Man*
> 
> if you use hwinfo you can check the vrms on all cards, if they have a sensor


ORLY :-D I'll haveta download that when I get home & check it out then ^_^ Off to work... got me 3 brand spankin' new HIS R9 290X Reference Cards on the way







Dis Gigabyte WindForce & Sapphire Tri-X are gonna be hitting the road very soon... going back to reference cards cuz I heard scuttlebutt that Corsair is putting out AIO loops for GPUs soon and I don't wanna haveta deal with removing an aftermarket cooler from a card to install them  Also got the 3-card set for a steal, so y'know... there's that. Just gotta resell the Sapphire Tri-X 290 I mistakenly bought & the 290X Gigabyte & Sapphire cards...


----------



## the9quad

Personally, with 290Xs in crossfire I see no reason to overclock other than benchmarks. So I just overclock until I see artifacts in whatever bench I'm interested in running, see what score I can get, pat myself on the back, then go back to stock clocks. I use my PC primarily for gaming, and overclocking really isn't necessary or all that noticeable for practical gaming purposes. Maybe I'm alone in doing this, but is overclocking really useful for anything other than benches? What kind of gains are you guys getting compared to stock, and is it really noticeable when you're gaming?


----------



## HoneyBadger84

Quote:


> Originally Posted by *the9quad*
> 
> Personally with 290xs in crossfire I see no reason other than benchmarks to overclock. So I just overclock until I see artifacts in whatever bench it is I'm interested in running. See what score I can get, pat myself on the back then go back to stock clocks. I use my PC primarily for gaming and overclocking is really not necessary or all that noticeable for me anyway for all practical purposes in gaming. Maybe I am alone in doing this, but is it really useful at all to overclock other than benches? What kind of gains are you guys getting compared to stock and is it really noticeable when your gaming?


I'm the same way. These cards eat games well enough that OCing for 24/7 use isn't necessary. I only OC for benchmark numbers. I'm sure on liquid cooling you can get enough of an OC going for a nice performance gain in games.


----------



## pdasterly

Try playing Total War: Rome 2 @ 5760x1080
System is purely for gaming, I do my other work on my laptop.


----------



## the9quad

Quote:


> Originally Posted by *pdasterly*
> 
> Try playing Total War: Rome 2 @ 5760x1080
> System is purely for gaming, I do my other work on my laptop.


So how much of an increase do you see by overclocking? Do you gain enough fps to make it worthwhile? Curious since I'm at 1440p where any gains I get are superfluous.


----------



## pdasterly

Not worth the time and effort, but it does bring up the minimum fps

Also Wolfenstein which doesn't support xfire


----------



## Arizonian

Quote:


> Originally Posted by *Lou HM*
> 
> Hi I want to join the Club
> 1. http://www.techpowerup.com/gpuz/by9mx/
> 2. MSI R9 290X (reference) & XFX R9 290X (reference)
> 3. Using G10 Kraken With antec kuhler 920 on the XFX Card & thermaltake water 3.0 performance on MSI Card


Congrats - added


----------



## Red1776

Quote:


> Originally Posted by *the9quad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pdasterly*
> 
> Try playing Total War: Rome 2 @ 5760x1080
> System is purely for gaming, I do my other work on my laptop.
> 
> 
> 
> So how much of an increase do you see by overclocking? Do you gain enough fps to make it worthwhile? Curious since I'm at 1440p where any gains I get are superfluous.

Depends on the game, and how far you like to push your settings. I play at 6330 x 1080 with 4 x R9 290Xs, and with games like Metro pushed to the maximum settings, or STALKER with a brutal super-high-res texture mod, I get scaling to all four cards. OC'ing can add 8-14% and, most importantly, raise the minimum frame rate.

It depends on how important exploiting every graphics option is to you.

(Try Witcher 2 on Uber mode some time)
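Red1776's 8-14% figure is easy to sanity-check with a bit of arithmetic. A quick sketch of what that gain means for minimum frame rates (the baseline fps value here is illustrative, not a measured number):

```python
# Projecting the fps gain from an 8-14% core overclock, per the
# range quoted above. The 42 fps baseline is a made-up example.
def oc_gain(base_fps: float, gain_pct: float) -> float:
    """Return projected fps after a given percentage overclock gain."""
    return base_fps * (1 + gain_pct / 100)

# A game dipping to 42 fps minimum at stock:
for pct in (8, 14):
    print(f"+{pct}% OC: min fps ~{oc_gain(42, pct):.1f}")
```

A few fps either way hardly matters at the average, but lifting the worst-case dips is exactly where the overclock is felt.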


----------



## fateswarm

How long does it take for a Tri-X 290 to degrade or die with an overvoltage of +100mV? What about +150mV? +200mV?


----------



## HoneyBadger84

Quote:


> Originally Posted by *Red1776*
> 
> Depends on the game, and how far you like to push your settings. I play at 6330 x 1080 with 4 x R9 290Xs, and with games like Metro pushed to the maximum settings, or STALKER with a brutal super-high-res texture mod, I get scaling to all four cards. OC'ing can add 8-14% and, most importantly, raise the minimum frame rate.
> It depends on how important exploiting every graphics option is to you.
> 
> (Try Witcher 2 on Uber mode some time)


Looking forward to trying that out at 3K (3x1080p) myself :-D I did Metro 2033/LL on very high with Tess on Normal and in LL's case SSAA off, got about 85fps average on 2033, 59fps avg on LL, and that's with 2 290Xs only. Really lookin' forward to 3/4 card numbers at that resolution. I need to find more games with benchmarks built in other than those two. Do Crysis 2 & 3 have built-in benchmarks like 1 did?


----------



## HoneyBadger84

Quote:


> Originally Posted by *fateswarm*
> 
> How long does it take for a Tri-X 290 to degrade or die with an overvoltage of +100mV? What about +150mV? +200mV?


I do believe life expectancy is relative to how hot you're running them more than what voltage is used, unless you're going up to something crazy like over 1.5V.

Edit: this is meant in reference to air and liquid cooled cards. If you can keep the temps in check, generally speaking, voltages under 1.5V are "okay"... I do believe the tsm-guy here said pretty much the exact same thing, and that he had in fact done it on air himself. Not sure how long those cards lasted though.


----------



## fateswarm

If it works anything like CPUs, voltage should be more important, not because temps don't matter, but because they matter a lot mainly after 90 or 100C, and those are points where chips may shut down on their own. Also, the MOSFETs (of the VRMs) on almost all brands are able to withstand up to 130C without a problem. That being said, you may damage the board itself or other components that way!

In general I doubt voltage is safe just because temps are low!

Otherwise LN2 overclockers wouldn't kill chips at subzero.


----------



## HoneyBadger84

Quote:


> Originally Posted by *fateswarm*
> 
> If it works anything like CPUs, voltage should be more important, not because temps don't matter, but because they matter a lot mainly after 90 or 100C, and those are points where chips may shut down on their own. Also, the MOSFETs (of the VRMs) on almost all brands are able to withstand up to 130C without a problem. That being said, you may damage the board itself or other components that way!
> 
> In general I doubt voltage is safe just because temps are low!
> 
> Otherwise LN2 overclockers wouldn't kill chips at subzero.


?... that argument at the end is utterly ridiculous. The stuff under LN2 dies because of moisture exposure. I made that statement in reference to air and liquid cooled cards, obviously. Jeez.

Fixes the post more to your liking I hope 
Quote:


> Originally Posted by *HoneyBadger84*
> 
> I do believe life expectancy is relative to how hot you're running them more than what voltage is used, unless you're going up to something crazy like over 1.5V.
> 
> Edit: this is meant in reference to air and liquid cooled cards. If you can keep the temps in check, generally speaking, voltages under 1.5V are "okay"... I do believe the tsm-guy here said pretty much the exact same thing, and that he had in fact done it on air himself. Not sure how long those cards lasted though.


----------



## fateswarm

Quote:


> Originally Posted by *HoneyBadger84*
> 
> that argument at the end is utterly ridiculous. The stuff under LN2 dies because of moisture exposure.


The irony of calling others "ridiculous" and then posting the most ridiculous claim about what kills chips under LN2 that I've ever heard.

"Voltage kills, not temps" is what you hear from chip Engineers often. Not because temps are safe but because they usually throttle.

In a complete solution like a board or a card, you may damage something neighboring with temps, but you should be fine if you're careful with <90C.


----------



## HoneyBadger84

Pretty sure I said exactly that. Stated what voltages are considered "safe" if your temps are in 'check' which I happen to consider under 80C on these GPUs (on air), but I'm picky.

Also, I think you need to learn to read; I didn't call YOU ridiculous, I called the argument at the end of your statement ridiculous, because it IS ridiculous to bring LN2 cooling and card treatment into a conversation being had between air/liquid-cooled card owners discussing the limits of cards on those methods of cooling. Why do you keep going back to LN2 crap when we're clearly talking about everyday, normally used cards on air and liquid cooling?

Are you trying to say moisture condensation isn't a guaranteed eventual death sentence to any card put under extended LN2 use then? Cuz that's what your statement makes it sound like:
Quote:


> Originally Posted by *fateswarm*
> 
> The irony of calling others "ridiculous" and then posting the most ridiculous claim about what kills chips under LN2 that I've ever heard.


I'm not the crazy type that will tell you "yeah, under 95C and you're fine" on these cards. I don't care if that's the card's operating spec or not; no way I'm letting my cards over 80C on air for regular-use clocks. I said that in previous posts to the very person I was having this conversation with before you came in with your comments about LN2 cards. Lol

Edit: To be honest, I don't like my GPUs going over 70C, but I've had to get over it with running them on air in Crossfire with 2-3-4 card combinations, because at least one of the sandwiched cards is GOING to hit that temp eventually.

Edited cuz typing on a tablet while at work is fun. Lol


----------



## TLM-610

I have a very serious battle going on inside me about what I should pick (still saving up though) between the Gigabyte GTX 780 or a more cost-friendly 2nd-hand R9 290/X; I have yet to make up my mind which one I should go after. I have seen Mantle do wonders for the AMD cards, because the R9 290/X are both real performance cards, in the majority of cases outdoing the GTX 780, yet you'll find the R9s far better priced 2nd-hand in comparison to the GTX 780. My major gripe is the heat the R9 dumps in the case, affecting my CPU overclocks. I saw a really good bargain, about $330, right here on OCN. What are all the other aftermarket cooling options you can adopt for the R9, except water cooling, to keep this beast around 75°C?


----------



## HoneyBadger84

Here's a snippet of what I'm talking about:
Quote:


> Originally Posted by *tsm106*
> 
> I've pushed a lot more voltage than that before and the cards did not die.
> 
> There is no definitive max voltage with these cards because the limits can be removed by using either of the unlocked BIOSes, and then it comes down to how far you are willing to go. I have taken my cards past 1.5v loaded without blowing up lol. However I would not recommend anyone venture that far down the rabbit hole. I was more or less testing the limits of the reference design. It's pretty stout imo. Also, at high voltages the software monitors start to lose their accuracy. Anyways, I would say stay under 1.40v loaded for most situations.
> 
> Here's some things I wrote down from my experience overclocking this silicon.
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/6120_40#post_21205990


Also if you know so much about all this voltage beeswax, why precisely did you post this in the first place:
Quote:


> Originally Posted by *fateswarm*
> 
> How long does it take for a Tri-X 290 to degrade or die with an overvoltage of +100mV? What about +150mV? +200mV?


From the way yer talking you should know the answer  just sayin'


----------



## Red1776

Quote:


> Originally Posted by *fateswarm*
> 
> How long does it take for a Tri-X 290 to degrade or die with an overvoltage of +100mV? What about +150mV? +200mV?


Trying to estimate degradation / killing a card is like trying to predict the temperature next August 12th.

It depends on what temps you run them at, how many hours of use, what voltage you run, the transistors in that batch, and the quality of the voltage you are feeding it, etc.

If you run them at good temps and voltage within reason, they will probably outlive your need for them.

....oh yeah, stay away from Furmark. It's like running over a spike strip to prove your run flat tires work.


----------



## HoneyBadger84

Quote:


> Originally Posted by *TLM-610*
> 
> I have a very serious battle going on inside me about what I should pick (still saving up though) between the Gigabyte GTX 780 or a more cost-friendly 2nd-hand R9 290/X; I have yet to make up my mind which one I should go after. I have seen Mantle do wonders for the AMD cards, because the R9 290/X are both real performance cards, in the majority of cases outdoing the GTX 780, yet you'll find the R9s far better priced 2nd-hand in comparison to the GTX 780. My major gripe is the heat the R9 dumps in the case, affecting my CPU overclocks. I saw a really good bargain, about $330, right here on OCN. What are all the other aftermarket cooling options you can adopt for the R9, except water cooling, to keep this beast around 75°C?


I just picked up 3 R9 290X core editions new in box for ~$880 as a packaged deal, so you can get a darn good deal if you search'em out. The Tri-X & Vapor-X coolers from Sapphire are very, very good. I have a 290 and a 290X Tri-X that I'm looking to offload very soon (once the core editions arrive) if you're interested when they become available.


----------



## HoneyBadger84

Quote:


> ....oh yeah, stay away from Furmark. It's like running over a spike strip to prove your run flat tires work.










that gave me a great giggle, thanks for that, too true.


----------



## chronicfx

Quote:


> Originally Posted by *PCSarge*
> 
> my card is the father of all of your cards. quite literally. it's a "press sample" with an engineering sample GPU core on the board.
> 
> it will pull 1150 core 1350 mem on no voltage bump


Three of your card's children are in my PC, and they say they can team up to kick their father's butt every which way without any voltage adjustment.


----------



## rdr09

Quote:


> Originally Posted by *pdasterly*
> 
> I have Sapphire cards, using TriXX only so far. I was recommended the GPU-Z render test, but my games run hotter than what that app will reproduce. Unigine works, but you can't monitor VRM temps during the benchmark; same with 3DMark and Catzilla. Then OCCT gives a bunch of errors?
> 
> Currently at 1075 GPU, 0 vddc and 200 power limit. Didn't attempt to OC the RAM


200?


----------



## fateswarm

HoneyBadger84, better not to speak at all if you are going to be condescending and arrogant. It's perfectly valid to demolish your argument with an LN2 argument. Voltage kills without needing temps, and LN2 overclockers are the first to tell you that.

"Only moisture kills on LN2". Come again with something more serious. Also learn about insulation under LN2.

I wanted an answer on degradation or death specs; if you don't have numbers, better not to speak.


----------



## fateswarm

Quote:


> Originally Posted by *Red1776*
> 
> Trying to estimate the degradation / killing a card is like trying to predict the temperature next august 12th.
> it depends on what temps you run them at, how many hours use, what voltage you run, the transistors in that batch, And the quality of the voltage you are feeding it, etc.


Do you have any examples? e.g. a Haswell CPU has been found to degrade within a year on 1.35v in a few examples. Above 1.4v it may die.

Numbers/examples like that.


----------



## HoneyBadger84

Quote:


> Originally Posted by *fateswarm*
> 
> HoneyBadger84, better not to speak at all if you are going to be condescending and arrogant. It's perfectly valid to demolish your argument with an LN2 argument. Voltage kills without needing temps, and LN2 overclockers are the first to tell you that.
> 
> "Only moisture kills on LN2". Come again with something more serious. Also learn about insulation under LN2.
> 
> I wanted an answer on degradation or death specs; if you don't have numbers, better not to speak.


I'm sorry, when did I say ONLY?!? You seriously need to learn to read. I'm blocking you before I say somethin' I'll regret or get reported for. It's completely invalid & you are completely full of it and yourself, apparently. I know perfectly well about insulation, I said as much; it only DELAYS the inevitable... I think you're the one that needs to learn your facts before spouting further nonsense. Nothing I've said is condescending or arrogant, I was straight up stating why your entire topic of conversation was 100% invalid to the topic we were discussing previous to you spouting off about some LN2 nonsense.

Like I said, blocked.


----------



## HoneyBadger84

Quote:


> Originally Posted by *rdr09*
> 
> 200?


Pretty sure he meant +50%... but hey I could be wrong ^_^


----------



## pdasterly

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Pretty sure he meant +50%... but hey I could be wrong ^_^


Yes, I meant 50 not 200


----------



## rdr09

Quote:


> Originally Posted by *pdasterly*
> 
> Yes, I meant 50 not 200


also, you can use GPU-Z sensors to monitor the core and vrm temps. leave it in the background while playing your game and check the temps once in a while. you can alt-tab quickly to check. you can set the reading to show current, minimum and maximum. use the maximum reading. AB, though, might conflict with GPU-Z. i use trixx to oc.

same with Hwinfo64. just use one app at a time.
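If alt-tabbing mid-game is a pain, GPU-Z also has a "Log to file" option that writes sensor readings out as CSV, and you can pull the maximum VRM readings afterwards. A hedged sketch (the column names are assumptions; check the header row of your own log):

```python
# Scan a GPU-Z sensor log (CSV) for the maximum of every column whose
# name mentions "VRM", so the peak temps survive even if you never
# alt-tab. Column naming varies between GPU-Z versions.
import csv

def max_vrm_temps(path: str) -> dict:
    """Return {column_name: max_value} for all VRM columns in the log."""
    maxima = {}
    with open(path, newline="", encoding="utf-8", errors="ignore") as f:
        for row in csv.DictReader(f):
            for name, value in row.items():
                if name and "VRM" in name:
                    try:
                        t = float(value.strip())
                    except (ValueError, AttributeError):
                        continue  # skip blank or malformed cells
                    maxima[name] = max(maxima.get(name, t), t)
    return maxima
```

Same idea works for an HWiNFO64 CSV log; only the column names differ.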


----------



## JeremyFenn

Quote:


> Originally Posted by *rdr09*
> 
> also, you can use GPU-Z sensors to monitor the core and vrm temps. leave it in the background while playing your game and check the temps once in a while. you can alt-tab quickly to check. you can set the reading to show current, minimum and maximum. use the maximum reading. AB, though, might conflict with GPU-Z. i use trixx to oc.
> 
> same with Hwinfo64. just use one app at a time.


Yay for Trixx!!! Love that OCer. Everyone else has to settle for +100 and we get +200; I mean, how much of an unfair advantage is that







I also use GPU-Z and HWInfo64 AND AIDA64, I just put a game like Watch Dogs on max settings, window it, then let the GPU draw 100% and suck up about 3.5GBs of VRAM, then you can watch your GPU temp, GPU VRM temps, Load % on the CPU & RAM during gameplay, CPU temp, HDD temp, VRM temp, etc.


----------



## Frrstlthr

Do you have any idea if this works with the ASUS monitor? Also, I have no idea how to apply this fix


----------



## pdasterly

GPUs won't go 100 percent unless it's fullscreen. Switching back 'n' forth is playing with fire; I did that with Catzilla and my VRM temps were way too high. I need to monitor the temp before it overheats, not after


----------



## JeremyFenn

Quote:


> Originally Posted by *pdasterly*
> 
> GPUs won't go 100 percent unless it's fullscreen.


Really?


----------



## rdr09

Quote:


> Originally Posted by *pdasterly*
> 
> GPUs won't go 100 percent unless it's fullscreen. Switching back 'n' forth is playing with fire; I did that with Catzilla and my VRM temps were way too high. I need to monitor the temp before it overheats, not after


if your vrms can't handle catzilla, then try to fix the heat issues first before using your cards for extended gaming or oc'ing.









i forgot you're using 3 monitors. those alone will prevent your card(s) from going idle.


----------



## TLM-610

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I just picked up 3 R9 290X core editions new in box for ~$880 as a packaged deal, so you can get a darn good deal if you search'em out. The Tri-X & Vapor-X coolers from Sapphire are very, very good. I have a 290 and a 290X Tri-X that I'm looking to offload very soon (once the core editions arrive) if you're interested when they become available.


What!!!!! 3 of them at $880, how?????


----------



## Imprezzion

Soooo, I've been trading and swapping around a bit again, and the WF3 OC is going away as well.

I traded one of my 780's for a reference unlocked XFX 290 with Accelero Hybrid (and stock cooler still present), including all invoices and warranty. Oh and + €50







.

Going to do the same as I did with my Xtreme III cooled unlocked Powercolor. Chop up the reference cooler for VRM cooling.

That'll probably run a lot cooler and quieter than that noisebox WF3 OC.


----------



## tsm106

I see a lot of discussion here on voltage. I will just leave this here and say I've been running 1.4v thru these quads since the day I got them for benches.



http://www.3dmark.com/fs/1824905

FS run dated 3/2014.


----------



## HoneyBadger84

Quote:


> Originally Posted by *TLM-610*
> 
> What!!!!! 3 of them at $880, how?????


Imma beast at eBay sniping. I got my first 2 XFX Core Editions for $285 each, shipped, and resold'em for $330 a pop a few weeks later. The 3-card deal was a package, and what's amazaballs is, if the seller didn't lie (which I have no reason to think he did), these 3 particular cards were only opened to ensure they worked (tested briefly); they weren't involved in the project he bought the rest of the ones he's selling from, so they're 3 brand-spankin' new HIS Core Editions. And because it was a bundle, with the way eBay weirdos are, they don't like to scale upwards 100% on pricing (sounds like Crossfire & SLI) when you get past 2 cards for sale at once. I ended up bidding with about 15s left (placed my max bid at what I was willing to pay for them, which was a lot more than $880), so dude bid with like 5s left & raised it to $880...

What irritates me is the guy listed them originally with a $200 starting price, and $875 buy it now, but I didn't want to do the Buy-It-Now just in case I could get them cheaper at auction end, so I bid the minimum to make the BIN go away... that cost me $5







lol Still, that's under $295 a card, and if they really are brand new in box, I'm gonna be happy as a clam ^_^ not that I care too much as long as they work perfectly. I just want Core Editions so I don't have cards venting into the case stacked on top of each other anymore when I'm testing TriFire/QuadFire, and I can't be bothered to go liquid cooling yet.
Quote:


> Originally Posted by *tsm106*
> 
> I see a lot discussion here on voltage. I will just leave this here and say I've been running 1.4v thru these quads since the day I got them for benches.
> 
> 
> 
> http://www.3dmark.com/fs/1824905
> 
> FS run dated 3/2014.


I would go *coughcough* voltage doesn't kill cards temps do *coughcough* to the guy that was arguing with me over it, but I've moved on already







lol That is sick that 7970s last that long with 1.4V though. I chickened out on the 3 I had when I started flirting over the 1.25V mark lol


----------



## wrigleyvillain

Quote:


> Originally Posted by *Imprezzion*
> 
> Soooo, I've been trading and swapping around a bit again, and the WF3 OC is going away as well.
> 
> I traded one of my 780's for a reference unlocked XFX 290 with Accelero Hybrid (and stock cooler still present), including all invoices and warranty. Oh and + €50
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Going to do the same as I did with my Xtreme III cooled unlocked Powercolor. Chop up the reference cooler for VRM cooling.
> 
> That'll probably run a lot cooler and quieter than that noisebox WF3 OC.


Yeah, I have an XFX reference bought used with an Accelero… chronic black screens under load, but I think I have it nailed down to thermal issues. Mainly VRM temps, especially VRM2 for the memory. Though it's possible one of the VRAM chips themselves is overheating too, due to a janky install of that one sink, which I will also fix. Have seen educated theories that load-related black screens can be caused by certain Elpida chips with lower temp thresholds (~80C) and/or the overheating of even just a single one compared to the others.


----------



## debuchan

I have not read all +2.6k pages, but I was wondering if anyone else had this issue.

I got a R9 290X (Sapphire reference) about a month ago and it was running without issues (on 14.4). Last week, my pump dies, so I decide I will get a new waterblock for it (Aquacomputer). I install it last night, do a leak test, and boot up. I now am getting the dreaded black screens that have plagued several other 290 users, sometimes a full shut down. Has anyone had this happen (where the black screens came after it was running fine)?

Other facts:
- There is an air bubble in my 290 waterblock. It is not large, but it also is not going away. Could this be causing the issue? Also, how do I get an air bubble out of a graphics card (aside from running leak test)? Shaking and jostling has not helped.
- I have yet to try the 14.6 beta driver

I think my issue is tied to one (or both) of these facts, but if anyone else can give me other ideas to try out, I would be appreciative.









EDIT: I used Fujipoly (Extreme, 1.0mm) on the memory and whatever Aquacomputer supplied with its water block on the items they marked as green on the installation manual


----------



## tsm106

Quote:


> Originally Posted by *debuchan*
> 
> I have not read all +2.6k pages, but I was wondering if anyone else had this issue.
> 
> I got a R9 290X (Sapphire reference) about a month ago and it was running without issues (on 14.4). Last week, my pump dies, so I decide I will get a new waterblock for it (Aquacomputer). I install it last night, do a leak test, and boot up. I now am getting the dreaded black screens that have plagued several other 290 users, sometimes a full shut down. Has anyone had this happen (where the black screens came after it was running fine)?
> 
> Other facts:
> - There is an air bubble in my 290 waterblock. It is not large, but it also is not going away. Could this be causing the issue? Also, how do I get an air bubble out of a graphics card (aside from running leak test)? Shaking and jostling has not helped.
> - I have yet to try the 14.6 beta driver
> 
> I think my issue is tied to one (or both) of these facts, but if anyone else can give me other ideas to try out, I would be appreciative.


Evidence points to the change in the block. First, if you don't have enough pump to push out bubbles, you NEED more pump. Besides the pump, check the thermal pads. Blackscreens don't just happen randomly, they are usually tied to temp or clocks. Considering the timing of events, you have some big question marks surrounding your block fitment and pump. Btw, Aqua blocks are very restrictive.


----------



## wrigleyvillain

I have seen many reports of black screens only starting after 3rd-party cooler installs, mostly air but also water. See my post above re: thermal issues. I would bet that your VRM(s) and/or VRAM chip(s) are now overheating. Start by monitoring VRMs in GPU-Z under load… but there is no way to know what the VRAM temps are without some kind of high-end laser thermometer or whatever.

It is possible the heat issues are caused by a bubble, but most likely it's just not good enough contact or flow in that area (due to block design… but that is speculation). Shaking is really the only way and should get it out if it's just a bubble.


----------



## debuchan

Quote:


> Originally Posted by *tsm106*
> 
> Evidence points to the change in the block. First, if you don't have enough pump to push out bubbles, you NEED more pump. Besides the pump, check the thermal pads. Blackscreens don't just happen randomly, they are usually tied to temp or clocks. Considering the timing of events, you have some big question marks surrounding your block fitment and pump. Btw, Aqua blocks are very restrictive.


Thanks, I got a D5 PWM pump. I will disconnect the PWM part and just keep running it full-bore until the air bubble is gone. Thanks for the tip!
Quote:


> Originally Posted by *wrigleyvillain*
> 
> I have seen many reports of black screens only starting after 3rd-party cooler installs, mostly air but also water. See my post above re: thermal issues. I would bet that your VRM(s) and/or VRAM chip(s) are now overheating. Start by monitoring VRMs in GPU-Z under load… but there is no way to know what the VRAM temps are without some kind of high-end laser thermometer or whatever.


Thanks, I will do that!


----------



## wrigleyvillain

Cool, please report back. If your black screens are heat-related then that is even more evidence that mine are as well. VRM2 gets to about 95 before it blacks. Though I also think I have VRAM chip overheating too.


----------



## tsm106

PWM shouldn't have a bearing when the psu is jumpered. Essentially, w/o a PWM signal the pump should run 100% like regular pumps. You are jumpering the psu during bleeding right? I had a feeling you had a D5.


----------



## Dasboogieman

Quote:


> Originally Posted by *TLM-610*
> 
> I have a very serious battle going on inside me about what I should pick (still saving up though) between the Gigabyte GTX 780 or a more cost-friendly 2nd-hand R9 290/X; I have yet to make up my mind which one I should go after. I have seen Mantle do wonders for the AMD cards, because the R9 290/X are both real performance cards, in the majority of cases outdoing the GTX 780, yet you'll find the R9s far better priced 2nd-hand in comparison to the GTX 780. My major gripe is the heat the R9 dumps in the case, affecting my CPU overclocks. I saw a really good bargain, about $330, right here on OCN. What are all the other aftermarket cooling options you can adopt for the R9, except water cooling, to keep this beast around 75°C?


I'd say the 780 is slightly worse than Hawaii XT, albeit maybe with a slight edge when fully overclocked (it also depends on your Hawaii chip; some here on OCN have achieved fantastic results), and it's definitely worse than Hawaii Pro when fully OC'ed unless you got one of them golden chips I keep hearing about from the NVIDIA guys. That being said, Hawaii won't bleed as much performance at higher resolutions with heavy MSAA/SSAA usage; GK110 loses whatever edge it has very quickly in those scenarios. The 4GB RAM is gravy, basically much more desirable should you go dual-GPU in future. However, I can't deny that GK110 will consume less power and will operate at a lower core temperature than Hawaii; plus, in the games it does well in, it does phenomenally well (though the same can be said for AMD, and the NVIDIA-favored titles tend to gimp AMD unfairly).

How close to the edge are you with your CPU overclock to be so sensitive to case temperatures?

With respect to aftermarket cooling, you have to strike a balance between silence and cooler bulk, as most aftermarket solutions can easily achieve 75 degrees load with silence. It's actually quite OK for the AMD GPU core to operate at 94 degrees; I'm sure the extra leakage has been taken into account and compensated for. The real issue with Hawaii is the VRM temperatures: while they're rated for 120 degrees plus, their temperatures spike really fast when forced to operate beyond 250W of power draw (i.e. overvoltage conditions) due to the 5+1+1 design. On my Tri-X, the VRM temperatures spike by about 5 degrees for every 50mV of extra juice. This is, unfortunately, a byproduct of the 5+1+1 design.

Regardless of which cooler you go with, your VRM temperatures will usually be equal to (best case) or much worse than the reference cooler's, and this is the real overclock limiter, not core temperatures.
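The "~5 degrees per 50mV" observation above lends itself to a rough back-of-envelope model. A minimal sketch, assuming a simple linear relationship (the baseline temperature is a placeholder and the slope is the Tri-X observation from this post, not a datasheet figure):

```python
# Rough linear estimate of loaded VRM temperature vs. overvoltage,
# using the roughly +5 C per +50 mV slope observed on a Tri-X above.
def vrm_temp_estimate(baseline_c: float, overvolt_mv: float,
                      c_per_50mv: float = 5.0) -> float:
    """Estimate loaded VRM temp at a given overvoltage offset."""
    return baseline_c + (overvolt_mv / 50.0) * c_per_50mv

# e.g. a card holding 80 C on the VRMs at stock, pushed +100 and +200 mV:
for mv in (100, 200):
    print(f"+{mv} mV -> ~{vrm_temp_estimate(80, mv):.0f} C on the VRMs")
```

It only takes +200mV to turn a comfortable 80C into 100C on this model, which is why VRM headroom, not core temperature, ends up being the practical limit.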

1. The best performance outright in all metrics is a full cover water block. You get core and VRM improvements. I don't need to explain the downsides though









2. NZXT G10 + AIO cooler + GELID enhancement pack + VRAM heatsinks. You get comparable core temperatures to full-scale watercooling (most who have used it with a 120mm-class rad have gotten a fairly good delta-T of 25-30 degrees); however, VRM temperatures don't really improve to match the extra core headroom. Reasonably compact and silent, you just need to find a rad mounting point. Quite expensive though: not as much as full WC, but much pricier than most other solutions.

3. Corsair HG10 bracket + AIO cooler: not too much information on this yet, as no owners have published results. Supposedly it has superior VRM cooling to the G10 setup because the bracket recycles the reference heat plate + fan.

4. Accelero Xtreme III: probably the most popular at the moment with most users here. Takes roughly 2.5 slots, is pretty long, and requires a reasonably involved installation process. I've been told temperatures around 60 are possible with great silence. Somewhat weaker VRM cooling, though this is quite technique-sensitive with respect to your VRM cooling arrangements (i.e. thermal epoxy vs. tape). I'd avoid the Xtreme IV for the moment; reports range from wildly better to worse VRM temperatures, plus it's even bulkier due to the massive backplate.

5. GELID Icy Vision: Pretty cheap, reasonably quiet. Performance can range from 70-80s depending on your noise threshold. Probably the most compact of the aftermarket coolers.

6. Raijintek: I don't recommend this; it's a monstrous 4-slot cooler which will undoubtedly give you fantastic sub-60-degree temperatures with great silence (due to the large fans). However, the sheer bulk and weight doesn't really seem worth it, considering the VRM cooling isn't much improved over the smaller coolers. For this effort, it may be better to go with an AIO.

I'm sure a lot of other people will chime in regarding options I may have missed.


----------



## JeremyFenn

I dunno, my Sapphire Vapor-X only gets about 70c max at load with it being 73F in the room. I guess it all depends on how hot it is in the room and what the air flow is like inside your case.


----------



## Dasboogieman

Quote:


> Originally Posted by *JeremyFenn*
> 
> I dunno, my Sapphire Vapor-X only gets about 70c max at load with it being 73F in the room. I guess it all depends on how hot it is in the room and what the air flow is like inside your case.


Your Vapor-X has a much more powerful VRM assembly; more VRM phases means better thermal balancing. It has more efficient VRMs to begin with, heatsinked inductors, a 10-phase design (going by Cooling Configurator), and better VRM positioning compared to the reference design. All of that vastly drops your VRM temperatures, not to mention the vapor chamber, which practically solves the thermal density issue. You're the first user I've seen since the Lightning guys whose card can take +200mV on air cooling without breaking a sweat.

My MSI Gaming 290 has a 6+1+1 design; the VRM heatsink performance is arguably similar to the Tri-X's, yet the temperatures are a full 10-15 degrees lower in most scenarios, simply because the extra phase balances out the load.
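The "more phases run cooler" point can be sketched with some back-of-the-envelope math. This assumes perfectly even current sharing and a made-up per-phase resistance, which real VRM controllers only approximate, so treat every number as illustrative:

```python
def per_phase_current(total_amps: float, phases: int) -> float:
    """Assume the controller balances current evenly across phases."""
    return total_amps / phases

def total_conduction_loss(total_amps: float, phases: int, r_ohms: float = 0.005) -> float:
    """Total I^2*R conduction loss summed over all phases (watts)."""
    i = per_phase_current(total_amps, phases)
    return phases * i * i * r_ohms

# ~250 W at a ~1.2 V core rail is roughly 208 A of current:
amps = 250 / 1.2
print(total_conduction_loss(amps, 5))   # reference-style 5-phase core VRM
print(total_conduction_loss(amps, 10))  # 10-phase Vapor-X-style: half the loss
```

Because loss scales with current squared per phase, doubling the phase count halves the total conduction loss for the same load, and the heat is spread over twice the components on top of that.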


----------



## JeremyFenn

Quote:


> Originally Posted by *Dasboogieman*
> 
> Your Vapor-X has a much more powerful VRM assembly; more VRM phases means better thermal balancing. It has more efficient VRMs to begin with, heatsinked inductors, a 10-phase design (going by Cooling Configurator), and better VRM positioning compared to the reference design. All of that vastly drops your VRM temperatures, not to mention the vapor chamber, which practically solves the thermal density issue.
> 
> My MSI Gaming 290 has a 6+1+1 design; the VRM heatsink performance is arguably similar to the Tri-X's, yet the temperatures are a full 10-15 degrees lower in most scenarios, simply because the extra phase balances out the load.


Yeah I've heard from others that my VRM design and configuration was different in a good way.







I think when I get another 290x it will be a clone of the one I have, even if I'm going to WC them, the VRM difference would be well worth it even @ $620 a pop.


----------



## kckyle

Quote:


> Originally Posted by *Dasboogieman*
> 
> I'd say the 780 is slightly worse than Hawaii XT, albeit maybe with a slight edge when fully overclocked (it also depends on your Hawaii chip; some here on OCN have achieved fantastic results). It's definitely worse than Hawaii Pro when fully OC'd, unless you got one of those golden chips I keep hearing about from the NVIDIA guys. That being said, Hawaii won't bleed as much performance at higher resolutions with heavy MSAA/SSAA usage; GK110 loses whatever edge it has very quickly in those scenarios. The 4GB of RAM is gravy, and much more desirable should you go dual-GPU in the future. However, I can't deny that GK110 will consume less power and operate at a lower core temperature than Hawaii, plus in the games it does well in, it does phenomenally well (though the same can be said for AMD; the NVIDIA-favored titles just tend to gimp AMD unfairly).
> 
> How close to the edge are you with your CPU overclock to be so sensitive to case temperatures?
> 
> With respect to aftermarket cooling: you really have to strike a balance between silence and cooler bulk, as most aftermarket solutions can easily achieve 75 degrees under load in silence. It's actually quite OK for the AMD GPU core to operate at 94 degrees; I'm sure the extra leakage has been taken into account and compensated for. The real issue with Hawaii is the VRM temperatures: while they're rated for 120-plus degrees, they spike really fast when forced to operate beyond 250W of power draw (i.e. overvoltage conditions) due to the 5+1+1 design. On my Tri-X, the VRM temperatures spike by about 5 degrees for every 50mV of extra juice. That is, unfortunately, a byproduct of the 5+1+1 design.
> 
> Regardless of which cooler you go with, your VRM temperatures will usually be equal (best case) or much worse compared to the reference cooler, and that is the real overclock limiter, not core temperatures.
> 
> 1. Full-cover water block: the best outright performance in all metrics. You get both core and VRM improvements. I don't need to explain the downsides though.
> 
> 2. NZXT G10 + AIO cooler + GELID enhancement pack + VRAM heatsinks: you get core temperatures comparable to full custom watercooling (most who have used it with a 120mm-class rad have gotten a fairly good delta-T of 25-30 degrees); however, VRM temperatures don't really improve to match the extra core headroom. Reasonably compact and silent, you just need to find a rad mounting point. Quite expensive though: not quite the same as a full custom loop, but much pricier than most other solutions.
> 
> 3. Corsair HG10 + AIO cooler: not much information on this yet, as no owners have published results. Supposedly it has superior VRM cooling to the G10 setup because the bracket reuses the reference heat plate + fan.
> 
> 4. Accelero Xtreme III: probably the most popular at the moment with users here. Takes roughly 2.5 slots, is pretty long, and requires a reasonably involved installation. I've been told temperatures around 60 are possible with great silence. VRM cooling is somewhat weaker, though this is quite technique-sensitive with respect to how you mount the VRM heatsinks (i.e. thermal epoxy vs. tape). I'd avoid the Xtreme IV for the moment; reports range from wildly better to worse VRM temperatures, plus it's even bulkier due to the massive backplate.
> 
> 5. GELID Icy Vision: pretty cheap and reasonably quiet. Performance can range from the 70s to 80s depending on your noise threshold. Probably the most compact of the aftermarket coolers.
> 
> 6. Raijintek: I don't recommend this. It's a monstrous 4-slot cooler that will undoubtedly give you fantastic sub-60-degree temperatures with great silence (due to the large fans); however, the sheer bulk and weight don't really seem worth it considering the VRM cooling isn't much improved over the smaller coolers. For this much effort, it may be better to go with an AIO.
> 
> I'm sure a lot of other people will chime in regarding options I may have missed.


That was very informative, thank you. I'm currently waiting on the HG10 reviews to see if it's the right choice. Even with two 140mm fans blowing on it, I'm still reaching 90°C.


----------



## tsm106

I look in my profile and see the reply. Otherwise I'd never see posts like the one above, 'cuz that poster is blocked.


----------



## HoneyBadger84

Those are my max temps on my Sapphire Tri-X OC R9 290X running in my case by itself from a run through 3DMark FireStrike... it got about 3C warmer through 2 loops of Valley, as did the VRMs... so the VRM cooling even on the regular Tri-X is really freakin' good... my VRM idle temps were in the 23-26C range... so darn near room temp. Nuts.

Quote:


> Originally Posted by *tsm106*
> 
> I look in my profile and see the reply. Otherwise I'd never see posts like the one above, 'cuz that poster is blocked.


Good to know I'm not the only one with the homeslice blocked. lol


----------



## pdasterly

Quote:


> Originally Posted by *rdr09*
> 
> if your vrms can't handle catzilla, then try to fix the heat issues first before using your cards for extended gaming or oc'ing.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i forgot you're using 3 monitors. those alone will prevent your card (s) from going idle.


I have xfire setup


----------



## debuchan

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PWM shouldn't have a bearing when the psu is jumpered. Essentially, w/o a PWM signal the pump should run 100% like regular pumps. You are jumpering the psu during bleeding right? I had a feeling you had a D5.


Yes, I jumpered the PSU during bleeding. What I meant was, I suspect the PWM signal was keeping my pump from pushing at 100% while it was running (after bleeding), which keeps the air bubble from being forced out, which could be causing one (or a couple) of my memory chips to overheat.

After work I'm going to bleed it again to hopefully force the air bubble out. One thing I forgot to mention: while bleeding last night there was a lot more air-pressure buildup than I normally encounter when filling a loop. It was bad enough that I was having trouble getting water into the reservoir! Anyway, I'll report back after taking a look tonight.


----------



## wrigleyvillain

Quote:


> Originally Posted by *HoneyBadger84*
> 
> 
> so the VRM cooling even on the regular Tri-X is really freakin' good...


Hmm yes I would say that is definitely the case based on that screenshot.


----------



## fateswarm

VRM temps depend predominantly on the quality and quantity of the MOSFETs. Chances are, if you have more phases and higher-quality MOSFETs, you get lower temps with the same cooling. There are also overkill extremes that may not need cooling at all.

e.g. the external VRM Gigabyte sells now has an output capability of around 3,000W (it is full of IR3550s). It can run naked and barely heat up at all.
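As a sanity check on that ~3,000W figure: assuming the commonly quoted 60A rating per IR3550 stage and a ~1.2V output rail (both assumptions here, and the stage count is just a guess chosen to make the arithmetic land), the math is simply stages times amps times volts:

```python
IR3550_RATED_AMPS = 60.0   # commonly quoted per-stage rating (assumption)
VCORE = 1.2                # illustrative output voltage (assumption)

def vrm_output_watts(stages: int) -> float:
    """Aggregate output capability if every stage delivers its full rating."""
    return stages * IR3550_RATED_AMPS * VCORE

print(vrm_output_watts(42))  # ~3024 W, in the ballpark of the quoted 3,000 W
```

At a tiny fraction of that rated load per stage, conduction losses per MOSFET are negligible, which is why such an overkill array can run without a heatsink.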


----------



## KeepWalkinG

I've had a GTX 560 Ti Asus DCII, a GTX 670 WF3, a GTX 680, and an R9 280X Dual-X; all were quiet.

But the R9 290 Vapor-X is ten times quieter than all of them.

It's the quietest card I've ever had!


----------



## MistaBernie

I've deleted a couple of off-topic posts or posts which were quoting other deleted posts. Please keep discussion on topic going forward.. thanks.


----------



## JeremyFenn

I would love to get rid of my H110 and build a custom loop, or a couple of loops, for the CPU, VRMs, RAM, NB, GPUs, etc. I think 4 of those Sapphire Vapor-X 290Xs with the buff VRMs would just wreck shop on the whole world.


----------



## TLM-610

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Imma beast at EBay sniping. I got my first 2 XFX Core Editions for $285 each, shipped, resold'em for $330 a pop a few weeks later. The 3-card deal was a package, and what's amazaballs is, if the seller didn't lie (which I have no reason to think he did) these 3 particular cards were only opened to ensure they worked (tested briefly), they weren't involved in the project he bought the rest of the ones he's selling from, so they're 3 brand-spankin' new HIS Core Editions, and because it was a bundle, with the way EBay weirdos are, they don't like to scale upwards 100% on pricing (sounds like Crossfire & SLi) when you get past 2 cards for sale at once. I ended up bidding with about 15s left (placed my max bid at what i was willing to pay for them, which was a lot more than $880), so dude bid with like 5s left & raised it to $880...


One question: since I'm even seeing deals as low as $200 right now, how do you know for sure that these cards on eBay weren't used heavily for mining and are now just being offloaded?
Quote:


> Originally Posted by *Dasboogieman*
> 
> I'd say the 780 is slightly worse than Hawaii XT, albeit maybe with a slight edge when fully overclocked (it also depends on your Hawaii chip; some here on OCN have achieved fantastic results). It's definitely worse than Hawaii Pro when fully OC'd, unless you got one of those golden chips I keep hearing about from the NVIDIA guys. That being said, Hawaii won't bleed as much performance at higher resolutions with heavy MSAA/SSAA usage; GK110 loses whatever edge it has very quickly in those scenarios. The 4GB of RAM is gravy, and much more desirable should you go dual-GPU in the future. However, I can't deny that GK110 will consume less power and operate at a lower core temperature than Hawaii, plus in the games it does well in, it does phenomenally well (though the same can be said for AMD; the NVIDIA-favored titles just tend to gimp AMD unfairly).
> 
> How close to the edge are you with your CPU overclock to be so sensitive to case temperatures?
> 
> With respect to aftermarket cooling: you really have to strike a balance between silence and cooler bulk, as most aftermarket solutions can easily achieve 75 degrees under load in silence. It's actually quite OK for the AMD GPU core to operate at 94 degrees; I'm sure the extra leakage has been taken into account and compensated for. The real issue with Hawaii is the VRM temperatures: while they're rated for 120-plus degrees, they spike really fast when forced to operate beyond 250W of power draw (i.e. overvoltage conditions) due to the 5+1+1 design. On my Tri-X, the VRM temperatures spike by about 5 degrees for every 50mV of extra juice. That is, unfortunately, a byproduct of the 5+1+1 design.
> 
> Regardless of which cooler you go with, your VRM temperatures will usually be equal (best case) or much worse compared to the reference cooler, and that is the real overclock limiter, not core temperatures.
> 
> 1. Full-cover water block: the best outright performance in all metrics. You get both core and VRM improvements. I don't need to explain the downsides though.
> 
> 2. NZXT G10 + AIO cooler + GELID enhancement pack + VRAM heatsinks: you get core temperatures comparable to full custom watercooling (most who have used it with a 120mm-class rad have gotten a fairly good delta-T of 25-30 degrees); however, VRM temperatures don't really improve to match the extra core headroom. Reasonably compact and silent, you just need to find a rad mounting point. Quite expensive though: not quite the same as a full custom loop, but much pricier than most other solutions.
> 
> 3. Corsair HG10 + AIO cooler: not much information on this yet, as no owners have published results. Supposedly it has superior VRM cooling to the G10 setup because the bracket reuses the reference heat plate + fan.
> 
> 4. Accelero Xtreme III: probably the most popular at the moment with users here. Takes roughly 2.5 slots, is pretty long, and requires a reasonably involved installation. I've been told temperatures around 60 are possible with great silence. VRM cooling is somewhat weaker, though this is quite technique-sensitive with respect to how you mount the VRM heatsinks (i.e. thermal epoxy vs. tape). I'd avoid the Xtreme IV for the moment; reports range from wildly better to worse VRM temperatures, plus it's even bulkier due to the massive backplate.
> 
> 5. GELID Icy Vision: pretty cheap and reasonably quiet. Performance can range from the 70s to 80s depending on your noise threshold. Probably the most compact of the aftermarket coolers.
> 
> 6. Raijintek: I don't recommend this. It's a monstrous 4-slot cooler that will undoubtedly give you fantastic sub-60-degree temperatures with great silence (due to the large fans); however, the sheer bulk and weight don't really seem worth it considering the VRM cooling isn't much improved over the smaller coolers. For this much effort, it may be better to go with an AIO.
> 
> I'm sure a lot of other people will chime in regarding options I may have missed.


Very well executed answer.

Also waiting to hear what others have to say. +rep for you.


----------



## JeremyFenn

http://www.ebay.com/itm/SAPPHIRE-Radeon-R9-290X-4GB-BattleField-4-Video-Card-/261528476755?pt=PCC_Video_TV_Cards&hash=item3ce44fdc53

Is that for 5 cards or he HAS 5 cards and selling them one by one? lol


----------



## Roaches

Quote:


> Originally Posted by *JeremyFenn*
> 
> http://www.ebay.com/itm/SAPPHIRE-Radeon-R9-290X-4GB-BattleField-4-Video-Card-/261528476755?pt=PCC_Video_TV_Cards&hash=item3ce44fdc53
> 
> Is that for 5 cards or he HAS 5 cards and selling them one by one? lol


Quote:


> On Jul-09-14 at 21:14:29 PDT, seller added the following information:
> 
> Listing is for (5)- SAPPHIRE Radeon R9 290X 4GB GDDR5 PCI Express 3.0 BF4 Card, These were used for a few months, no damage, nothing is wrong with the cards, they're in absolute perfect condition and run beautifully. I will be shipping them out in the original box. You will receive what is shown in the picture. Although this is the Battlefield 4 Edition, those cards have been used. Again, I have FIVE of these graphic cards.
> 
> *THIS LISTING IS PER ONE CARD.* Once the card has sold I will list the others. If you're looking to buy more than one card please feel free to reach out to me and we can set that up for you.
> 
> If you have any questions feel free to ask.


----------



## TLM-610

Quote:


> Originally Posted by *JeremyFenn*
> 
> http://www.ebay.com/itm/SAPPHIRE-Radeon-R9-290X-4GB-BattleField-4-Video-Card-/261528476755?pt=PCC_Video_TV_Cards&hash=item3ce44fdc53
> 
> Is that for 5 cards or he HAS 5 cards and selling them one by one? lol


I am 100% certain that this guy has already recovered his costs and profited from those cards (mining) and is now looking to dispose of them. No way on earth can you sell something at an 83% loss after just a few months of use.


----------



## JeremyFenn

Yeah I agree lol. I mean, mining won't kill the cards IF they're properly cooled... I still wouldn't want them though. I'm waiting till I see my 1080/1410 Sapphire Vapor-X 290Xs somewhere again. I saw them on Amazon, but it was only a review that said it was the 1080 version, not the page itself. Not to mention it was like $40 more than what I paid for mine. It's totally worth it IMO, I just wouldn't buy unless the seller confirmed they were 1080/1410 cards.


----------



## rdr09

Quote:


> Originally Posted by *TLM-610*
> 
> I am 100% certain that this guy has already recovered his costs and profited from those cards (mining) and is now looking to dispose of them. No way on earth can you sell something at an 83% loss after just a few months of use.


that's up for bids and there are 6 days to go. i would not be surprised if the final bid will end up somewhere near what others are selling at "buy now".

$300


----------



## TLM-610

Quote:


> Originally Posted by *rdr09*
> 
> that's up for bids and there are 6 days to go. i would not be surprised if the final bid will end up somewhere near what others are selling at "buy now".
> 
> $300


No sooner did you say $300 than this happened - http://www.ebay.com/itm/SAPPHIRE-Radeon-R9-290X-4GB-BattleField-4-Video-Card-/261528913737?pt=PCC_Video_TV_Cards&hash=item3ce4568749


----------



## rdr09

Quote:


> Originally Posted by *TLM-610*
> 
> No sooner did you say 3 than this happened - http://www.ebay.com/itm/SAPPHIRE-Radeon-R9-290X-4GB-BattleField-4-Video-Card-/261528913737?pt=PCC_Video_TV_Cards&hash=item3ce4568749












TLM, there are always risks buying stuff on eBay, but buyers are protected more than the seller, I believe. I sold my 7900 cards there and I knew they were working. You just have to put your faith in the seller's reputation.

If I were to buy another 290 or 290X, it would be a reference card and I would get it there. A Seidon cooler, some Gelid heatsinks, and improved airflow should be enough to cool these beasts.

Seidon

Gelid

see post # 24047

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/24040#post_22387647


----------



## JeremyFenn

I'm thinking of replacing my monitor since the one I currently use is only 1680x1050. I want something like a 144Hz 1ms 1080p so if I wanted to eventually I could get 2 more and use eyefinity. Just another stepping stone in the shaping of my machine. Eventually I'd like to have 3 or 4 of those 1080/1410 Vapor-X 290x's (water cooled would be nice too) running 3 144Hz 1ms 1080p monitors. I could go with the Acer GN246HL since it has the 100,000,000:1 contrast but it doesn't have display port. I'm leaning more towards the ASUS VG248QE since even though it's 80,000,000:1 contrast, it DOES have display port....only feature on that Asus screen though I don't want are the speakers in it. I mean cmon, who's going to get a 144hz 1ms gaming screen and actually want monitor speakers in it lol


----------



## KeepWalkinG

Quote:


> Originally Posted by *JeremyFenn*
> 
> I'm thinking of replacing my monitor since the one I currently use is only 1680x1050. I want something like a 144Hz 1ms 1080p so if I wanted to eventually I could get 2 more and use eyefinity. Just another stepping stone in the shaping of my machine. Eventually I'd like to have 3 or 4 of those 1080/1410 Vapor-X 290x's (water cooled would be nice too) running 3 144Hz 1ms 1080p monitors. I could go with the Acer GN246HL since it has the 100,000,000:1 contrast but it doesn't have display port. I'm leaning more towards the ASUS VG248QE since even though it's 80,000,000:1 contrast, it DOES have display port....only feature on that Asus screen though I don't want are the speakers in it. I mean cmon, who's going to get a 144hz 1ms gaming screen and actually want monitor speakers in it lol


This picture from your profile, is that a 280X Tri-X Vapor-X?

Is the Vapor-X logo only on the 280X?


----------



## JeremyFenn

No, although I did have one of those 280X Vapor-Xs, but it didn't say Sapphire in blue on the side; it said "Vapor-X" in blue LED. It also hated tessellation and would just draw crazy lines everywhere when it was enabled. I RMA'd it for an EVGA GTX 770 SC, which was OK but didn't really impress me, then RMA'd that for a refund and spent a little more on the 290X. My exact model is:

SAPPHIRE VAPOR-X R9 290X 4GB GDDR5 TRI-X OC (UEFI)

and here's the link to the Sapphire website for it:

http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&pid=2283&psn=&lid=1&leg=0


----------



## aaroc

I'm replying from my phone, so the quote didn't work.

For our friend asking how to connect 3 monitors to an R9 290 while having DP problems: I have three Samsung 2560x1440s connected with 2 DVI dual-link cables and one HDMI cable, all at full resolution. I was trying the HDMI because DP cables longer than 1 meter aren't available here, and I needed at least a 3-meter one. I have sound over the HDMI connection. Still no problems.

I detected some strange CFX behavior on 14.4: it only works if the app or game runs fullscreen the first time. The Afterburner OSD reported zero usage on cards beyond the first when switching between windowed and fullscreen. Maybe it's TeamViewer or another installed program. I will check later.


----------



## Sgt Bilko

290X Vapor-X for 1680 x 1050?

Dude... at least 1080p.

I'm running a 1440p Korean panel at 110Hz by comparison.


----------



## HoneyBadger84

Quote:


> Originally Posted by *TLM-610*
> 
> One question, since am even seing deals as low as $200 right now, how do you know for sure these cards from ebay were not used heavily for mining and are now being discarded to let them go?


Simply put, I'm A: asking the seller first & B: having faith they're not lying. If they DO lie and send you a card that isn't in the new/pristine condition it was listed in, you're protected by eBay's buyer policies & will get your money back (it may take a bit, but you will get it back). For me, I ask a lot of questions. In the case of the newest cards I have incoming, the seller simply stated the cards were never used. The VisionTek, which I just got in, was still sealed & obviously never used, as the fan is spotless & the plastic that protects the face of the card was still on:



I actually got 2 mining cards as my first 2 cards off eBay, and they both worked great, much to my surprise & enjoyment. The fans were disgusting, but with some Q-tips & some fancy cleaning I got them a lot cleaner than they were (here's some before & after for ya):

This was before my thorough cleaning, but after I got off some serious gunk with just q-tips:



And this was after a more thorough cleaning, you can see the difference I trust:



I mean really, you just have to take a look at the fan if they have a picture of the actual card posted up. If they don't, try to ask for a picture of the box-seal to be uploaded, or the card itself, if they don't have one posted and they're saying it's "new open box".

Sidenote: never buy VisionTek. Their packaging is horrible, the card can move around inside the box a LOT more than any other card I've ever gotten in that's brand new, and this was the case on the previous VisionTek card I got in that had an aftermarket cooler stock as well, you can kinda see what I mean just by how much open space is near the card:


----------



## HoneyBadger84

Quote:


> Originally Posted by *TLM-610*
> 
> I am 100% certain that this guy has already recovered his costs and profited from those cards (mining) and is now looking to dispose of them. No way on earth can you sell something at an 83% loss after just a few months of use.


They won't sell for anywhere near that low. Look up Completed/Sold items in search. R9 290Xs sell for an average of $275-$350+, depending on the card & its condition.

My Gigabyte R9 290X WindForce just resold for $375 + shipping. Not even kidding. lol


----------



## TLM-610

Quote:


> Originally Posted by *rdr09*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> TLM, there are always risks buying stuff on eBay, but buyers are protected more than the seller, I believe. I sold my 7900 cards there and I knew they were working. You just have to put your faith in the seller's reputation.
> 
> If I were to buy another 290 or 290X, it would be a reference card and I would get it there. A Seidon cooler, some Gelid heatsinks, and improved airflow should be enough to cool these beasts.
> 
> Seidon
> 
> Gelid
> 
> see post # 24047
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/24040#post_22387647


Understood
Quote:


> Originally Posted by *HoneyBadger84*
> 
> Simply put, I'm A: asking the seller first & B: having faith they're not lying. If they DO lie and send you a card that isn't in the new/pristine condition it was listed in, you're protected by eBay's buyer policies & will get your money back (it may take a bit, but you will get it back). For me, I ask a lot of questions. In the case of the newest cards I have incoming, the seller simply stated the cards were never used. The VisionTek, which I just got in, was still sealed & obviously never used, as the fan is spotless & the plastic that protects the face of the card was still on:
> 
> 
> 
> I actually got 2 mining cards as my first 2 cards off eBay, and they both worked great, much to my surprise & enjoyment. The fans were disgusting, but with some Q-tips & some fancy cleaning I got them a lot cleaner than they were (here's some before & after for ya):
> 
> This was before my thorough cleaning, but after I got off some serious gunk with just q-tips:
> 
> 
> 
> And this was after a more thorough cleaning, you can see the difference I trust:
> 
> 
> 
> I mean really, you just have to take a look at the fan if they have a picture of the actual card posted up. If they don't, try to ask for a picture of the box-seal to be uploaded, or the card itself, if they don't have one posted and they're saying it's "new open box".
> 
> Sidenote: never buy VisionTek. Their packaging is horrible, the card can move around inside the box a LOT more than any other card I've ever gotten in that's brand new, and this was the case on the previous VisionTek card I got in that had an aftermarket cooler stock as well, you can kinda see what I mean just by how much open space is near the card:


Alright


----------



## JeremyFenn

Well don't forget, I gave my old system to my kids: a Q6600 with 8GB of stock 1066 and 2x GTS 450s in SLI. That thing rocked for a long time until I just lost interest. Building this rig, I have yet to replace the screen, which is what I'm looking to do lol (kinda why I made the post, I'm just really putting my thoughts down here).







I think 3 of those 1080p 1ms Asus monitors using DP would be a nice setup, although I think I'd then need at least another 290x to power them the right way. Like I said before though, my ultimate setup will have 3 of these monitors and 3 or 4 of those Vapor-X 290x's with the stock 1080/1410 settings.


----------



## wrigleyvillain

Good god, please, yes, upgrade your display. I would urge it even if you didn't have such GPU power. 1080p 144Hz x2 is not a terrible idea, but I would also at least consider 1440p options like the QNIX. The increased detail from the higher pixel density, combined with the increased real estate, will truly blow your mind compared to what you're presently used to. You may find you don't notice a huge, or even any real, difference at higher Hz either; some people don't. And there are the bezels to consider if you want to do Eyefinity. Some people don't mind; others can't stand it.


----------



## JeremyFenn

See I'd want something higher than 60hz. If I could get a 1440p monitor at least 120Hz and 2ms then I'm there.


----------



## HoneyBadger84

Posted up what will be my adventures with the new stock cards, feel free to drop by & chitchat ^_^

http://www.overclock.net/t/1501116/quadfire-r9-290xs-reference-cuz-why-not-for-1250-lots-of-pics-3k-3x1080p-testing-to-be-included

I'm installing the VisionTek card (drivers) right now, then I'll run a few tests... then I gotta nap before work in 8hrs lol


----------



## ZealotKi11er

Quote:


> Originally Posted by *JeremyFenn*
> 
> See I'd want something higher than 60hz. If I could get a 1440p monitor at least 120Hz and 2ms then I'm there.


You can. It will just cost $800 though.


----------



## Sgt Bilko

Quote:


> Originally Posted by *JeremyFenn*
> 
> See I'd want something higher than 60hz. If I could get a 1440p monitor at least 120Hz and 2ms then I'm there.


I haven't seen a QNIX get less than 96Hz (the majority are 110Hz+).

As for the response time... I personally haven't noticed the change from 1080p 60Hz 5ms to 1440p 110Hz 8ms.

I'm assuming some people notice it more than others, but I don't.


----------



## HoneyBadger84

LOL, as fate would have it, it seems I just ran into my first shady seller. He either accidentally or on purpose sent me an R9 290 in an R9 290X box... it is indeed brand new, but it's clearly a 290:



Also, it's showing as AMD in every aspect, i.e. completely reference, and not VisionTek... I can't remember if my other VisionTek card was labeled that way or not...

And before anyone suggests it, I already checked: I can't flash it to a 290X like I immediately hoped. lol Contacting the seller to see what he'd prefer I do, be nice (either take a partial refund or exchange cards with him for the one I actually paid for) or be mean (file an eBay claim & return it for a full refund).


----------



## bond32

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I haven't seen a QNIX get less than 96Hz (the majority are 110Hz+).
> 
> As for the response time... I personally haven't noticed the change from 1080p 60Hz 5ms to 1440p 110Hz 8ms.
> 
> I'm assuming some people notice it more than others, but I don't.


I notice a pretty substantial difference in battlefield 4 going from 60 to 120hz. This is at 1440p, all other settings maxed.

And running higher supersampling actually pushes the 3x290x's to their limits.... Even around 150% my framerate will dip to 70-80fps but otherwise is constant 120.


----------



## Sgt Bilko

Quote:


> Originally Posted by *bond32*
> 
> I notice a pretty substantial difference in battlefield 4 going from 60 to 120hz. This is at 1440p, all other settings maxed.
> 
> And running higher supersampling actually pushes the 3x290x's to their limits.... Even around 150% my framerate will dip to 70-80fps but otherwise is constant 120.


That's refresh rate; I was talking about response time.

From 60 to 110Hz I noticed a massive difference (also TN to PLS).


----------



## bond32

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That's refresh rate; I was talking about response time.
> 
> From 60 to 110Hz I noticed a massive difference (also TN to PLS).


Ah, yeah, I am not that knowledgeable about that....

A friend of mine bought 2 280Xs and a 4K monitor. I saw it; it looks OK. And by "looks OK" I mean the graphics look good, but it's a slideshow running 20-30 fps. Those 280Xs aren't nearly powerful enough for 4K.

If and when cheap 4K monitors start showing up like the QNIX and X-Star, I may look into it, as long as they can run at a decent refresh rate. Until then I might pick up a second X-Star 1440p.

Anyone here using 2 or 3? The bezel is just a tad thick, but I think it would look good. For the price, you could get 2x X-Star 1440p monitors, which gives you roughly 4K's pixel count, and if you have the horsepower you could run them at a high refresh rate.
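The "two 1440p panels ≈ 4K" comparison can be sanity-checked with quick arithmetic. A minimal Python sketch (pixel counts only; the monitor models are just the ones named in the post):

```python
# Compare total pixel counts: two 2560x1440 panels vs one 3840x2160 panel.
dual_1440p = 2 * 2560 * 1440   # two X-Star/QNIX-style 1440p panels side by side
uhd_4k = 3840 * 2160           # a single UHD "4K" panel

print(dual_1440p)              # 7372800
print(uhd_4k)                  # 8294400
print(dual_1440p / uhd_4k)     # ~0.89, so two 1440p panels push ~89% of 4K's pixels
```

So the claim is close but not exact: a dual-1440p setup drives slightly fewer pixels than a single 4K screen.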


----------



## Dsrt

Does anybody know what safe VRM temps are for the R9 290X DirectCU II? The core goes up to 90°C while OCing at +25mV, but *VRM #1 goes up to 106°C* (25 min run in Heaven 4.0). Is that safe for a 24/7 overclock? (Running Heaven 4.0 Ultra, Tessellation Extreme, AA 8x, 1080p.)


----------



## HoneyBadger84

Quote:


> Originally Posted by *Dsrt*
> 
> Does anybody know what are safe VRM temps for R9-290X DirectCU II, core going up to 90c while OCing +25mV but *VRM 1# goes up to 106c* (25min run in heaven 4.0), is that safe for 24/7 overclock? (Running Heaven 4.0 Ultra, Tessellation Extreme, AA 8x, 1080p).


I've heard the DirectCU II cooler on these particular cards is utter trash... I'm afraid those temps are quite common, but I could be wrong; I don't have one to verify personally.


----------



## JeremyFenn

I think I'd rather go with 1080p @ 144Hz 1ms. I play a lot of first-person shooters, and I think the lower latency + higher refresh would be better. I can't find a 1440p monitor that goes any lower than 4ms or any higher than 85Hz refresh.


----------



## bond32

The VRMs on these cards are usually rated to 130°C-ish. But with that said, I would not be comfortable with anything over 90°C, even on air.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dsrt*
> 
> Does anybody know what are safe VRM temps for R9-290X DirectCU II, core going up to 90c while OCing +25mV but *VRM 1# goes up to 106c* (25min run in heaven 4.0), is that safe for 24/7 overclock? (Running Heaven 4.0 Ultra, Tessellation Extreme, AA 8x, 1080p).


Oh GOD, that's horrible. I'd have to check the cooling on that card, but I thought it looked good. Don't they claim 21°C cooler than stock? Personally I would not touch the voltages.


----------



## Sgt Bilko

Quote:


> Originally Posted by *JeremyFenn*
> 
> I think I'd rather go with 1080p @ 144Hz 1ms, I play a lot of first person shooters and I think the lower latency + higher refresh would be better. I can't find a 1440p monitor that goes any lower than 4ms and higher refresh than 85Hz.


There is only one that I know of, and that's the ROG Swift: 1440p TN panel, 120Hz and 1ms.

$800 USD MSRP


----------



## ZealotKi11er

Quote:


> Originally Posted by *Sgt Bilko*
> 
> There is only one that I know of and thats the ROG Swift. 1440p TN panel, 120hz and 1ms.
> 
> $800 USD msrp


TN kills it for me. I am forced to use TN with my gaming laptop and I can't stand it. There are just too many things wrong with TN, unless color shifts don't matter to you.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ZealotKi11er*
> 
> TN kills it for me. I am forced to use TN with my Gaming laptop and i cant stand it. There are just to many things wrong with TN unless you are color shifts dont matter to you.


I'm loving my QNIX; IPS/PLS or no deal for me now..... never knew the difference till I was staring at it.


----------



## Decoman

I have acquired a *Sapphire 290X Tri-X* card (not the OC version) and started to fiddle with overclocking using Trixx.

I find it a little odd that the core clock jumps up and down, even though the card doesn't appear to be that hot.

Core clock is 1010 MHz and mem clock is 1250 MHz. The core clock sometimes dips below 900 MHz and I don't know why. Increasing "power limit" to 25 doesn't seem to help at all.

ASIC: 79.9%
BIOS: 015.042.000.003.000000 (113-C6710100-L01)

Using Catalyst 14.6 beta RC2 drivers.

The button for the two BIOS settings doesn't seem to result in different clocks; same core and memory clock on both settings.

The cooler/fan is nice and quiet at 45% fan speed. I noticed that there is some kind of coil whine when overclocking and running Unigine Heaven.

The image below shows a run with Unigine Heaven at stock speeds (1010 MHz core / 1250 MHz mem).


How do I get the card to run at a steady clock? Ticking "force constant voltage" in Trixx doesn't do anything from what I can see (edit: I mean it doesn't make the core clock stay at the same value).

Edit: Turning off Vsync when running Unigine Heaven seems to make the core clock a lot more steady for me.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Decoman*
> 
> I have aquired a *Sapphire 290X Tri-X* card (not the OC version) and started to fiddle with overclocking using Trixx.
> 
> I find it a little odd that the core clock jumps up and down, even though the card doesn't appear to be that hot.
> 
> Core clock is 1010 MHz and mem clock is 1250 MHz. Core clock sometimes dip below 900MHz and I don't know why. Increasing "power limit" to 25 doesn't seem to help at all.


Try disabling ULPS. I know you're running a single card, but that usually works for me to fix the issue where the card unnecessarily adjusts clocks at times when it shouldn't (i.e. during a benchmark or game, not during a scene change).


----------



## pdasterly

Took your idea; works OK with GRID in benchmark mode. Should I try to overclock from here?
The benchmark has been running for about an hour.


----------



## Decoman

@HoneyBadger84

Hm, ticking "disable ULPS" in Trixx doesn't do anything for me, even after a restart. The core clock still varies greatly. *shrugs*


----------



## ZealotKi11er

Quote:


> Originally Posted by *Decoman*
> 
> I have aquired a *Sapphire 290X Tri-X* card (not the OC version) and started to fiddle with overclocking using Trixx.
> 
> I find it a little odd that the core clock jumps up and down, even though the card doesn't appear to be that hot.
> 
> Core clock is 1010 MHz and mem clock is 1250 MHz. Core clock sometimes dip below 900MHz and I don't know why. Increasing "power limit" to 25 doesn't seem to help at all.
> 
> Asic: 79.9%
> Bios: 015.042.000.003.000000 (113-C6710100-L01)
> 
> Using Catalyst 14.6 beta rc2 drivers.
> 
> The button for the two bios setting doesn't seem to result in different clocks, same core and memory clock on both settings.
> 
> The cooler/fan is nice and quiet at 45% fan speed. I noticed that there is some kind of coil whine when overlclocking and running Unigine Heaven.
> 
> The image below show a run with Unigine Heaven at stock speeds (1010 MHzcore/1250MHzmem).
> 
> 
> How do I get the card to run at a steady clock? Ticking "force constant voltage" in Trixx doesn't do anything from what I can see (edit: I mean it doesn't make the core clock stay at the same value).


Is your card downclocking? Unless you have set some temp limit on the card, you should be running a constant core speed. Voltage will not matter.


----------



## heroxoot

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Decoman*
> 
> I have aquired a *Sapphire 290X Tri-X* card (not the OC version) and started to fiddle with overclocking using Trixx.
> 
> I find it a little odd that the core clock jumps up and down, even though the card doesn't appear to be that hot.
> 
> Core clock is 1010 MHz and mem clock is 1250 MHz. Core clock sometimes dip below 900MHz and I don't know why. Increasing "power limit" to 25 doesn't seem to help at all.
> 
> Asic: 79.9%
> Bios: 015.042.000.003.000000 (113-C6710100-L01)
> 
> Using Catalyst 14.6 beta rc2 drivers.
> 
> The button for the two bios setting doesn't seem to result in different clocks, same core and memory clock on both settings.
> 
> The cooler/fan is nice and quiet at 45% fan speed. I noticed that there is some kind of coil whine when overlclocking and running Unigine Heaven.
> 
> The image below show a run with Unigine Heaven at stock speeds (1010 MHzcore/1250MHzmem).
> 
> 
> How do I get the card to run at a steady clock? Ticking "force constant voltage" in Trixx doesn't do anything from what I can see (edit: I mean it doesn't make the core clock stay at the same value).
> 
> 
> 
> Your card downclocking? Unless you have set some Temp limit on the card you should be running constant Core speed. Voltage will not matter.
Click to expand...

I think what he means is, the card doesn't ever run at a steady clock in games. I have this happen too. When I play a game, even BF4, the clock might dip a little. The way I fixed it was to disable PowerPlay and set a 3D and a 2D profile. However, doing this did not result in better performance. In Heaven and Valley the clock always stays pegged at full clocks, as it is pushing 90-100% load the entire time. The way I see it, if the game does not push it hard enough, the card sees no reason to stay clocked up, which I think is fine. The only game I have any issues with is Borderlands 2. All other games run great regardless of whether the clock goes down.


----------



## ZealotKi11er

Just testing my 290s in CF. Not sure what's going on with my new 290, but it's getting 74°C on VRM1 under load; the 290X on top gets 54°C. My 290X did go that high, but after changing to Fujipoly my VRM1 temps dropped. The thing is, I am using Fujipoly on my 290 too. I did not use any thermal paste, though; not sure if I did with my 290X.


----------



## Kittencake

OK, is it normal that a simple YouTube video brought the core clock to max?


----------



## PureBlackFire

Quote:


> Originally Posted by *Kittencake*
> 
> Ok is it normal that a simple youtube video brought the core clock to max ?


nope.


----------



## Kittencake

What would cause this? Also, I have a 9500GT; would hybrid PhysX be worth looking at and making use of?


----------



## PureBlackFire

Quote:


> Originally Posted by *Kittencake*
> 
> what would cause this ... and I have a 9500gt would hybrid physX be worth looking at and make use of it ?


Maybe your browser is using hardware acceleration? I haven't used hybrid PhysX in a few years, so I don't know how it would work with current drivers.


----------



## rdr09

Quote:


> Originally Posted by *pdasterly*
> 
> took your idea, works ok with grid in benchmark mode. should I try to overclock from here.
> benchmark has been running for about an hour


got anything more demanding like Minecraft?


----------



## pdasterly

Quote:


> Originally Posted by *rdr09*
> 
> got anything more demanding like Minecraft?


Don't have Minecraft, but I switched to Total War: Rome II and the VRM temp went up a little.
I'm running the benchmark in a loop, but the VRMs are getting a chance to cool off in between rounds of the loop.
The dips in the graph show where each test ended and began.


----------



## rdr09

Quote:


> Originally Posted by *pdasterly*
> 
> dont have minecraft but switched to total war rome 2 and vrm temp went up a little
> running benchmark in loop, but vrms are getting chance too cool off in between rounds of loop.
> The dips in the graph show where the test ended and begun.


What's your ambient temp? Your temps are fine. Either that or those games are not taxing the GPUs, especially Rome; your other GPU was asleep.

I was just kidding with Minecraft. How about some BF4 or BF3? Nonetheless, your temps are fine.


----------



## Blaise170

Quote:


> Originally Posted by *rdr09*
> 
> got anything more demanding like Minecraft?


How about Mario? That seems like a good stress test.


----------



## pdasterly

Quote:


> Originally Posted by *rdr09*
> 
> what's your ambient temp? your temps are fine. either that or those games are not taxing the gpus, especially Rome. your other gpu was asleep.
> 
> i was kid kid with minecraft. how about some BF4 or BF3? nonetheless, your temps are fine.


70°F today, but it's warmer in my cave. Can't get BF4 to run right now. Mario?

Seems like the CrossFire doesn't kick in until you go into fullscreen mode.


----------



## Dasboogieman

Quote:


> Originally Posted by *Dsrt*
> 
> Does anybody know what are safe VRM temps for R9-290X DirectCU II, core going up to 90c while OCing +25mV but *VRM 1# goes up to 106c* (25min run in heaven 4.0), is that safe for 24/7 overclock? (Running Heaven 4.0 Ultra, Tessellation Extreme, AA 8x, 1080p).


Holy cow, that card was supposed to have a greater quantity and quality of VRM phases as well.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> TN kills it for me. I am forced to use TN with my Gaming laptop and i cant stand it. There are just to many things wrong with TN unless you are color shifts dont matter to you.


Yeah, I find it hard to use TN screens. The high frequency ones are undoubtedly more responsive but the lack of color accuracy or stability really detracts from the beauty of a scene.


----------



## rdr09

Quote:


> Originally Posted by *Blaise170*
> 
> How about Mario? That seems like a good stress test.


yah. lol
Quote:


> Originally Posted by *pdasterly*
> 
> 70f today, but its warmer in my cave. cant get bf4 to run right now. mario?
> 
> seems like the xfire dosent kick in until you go fullscreen mode


Even if your ambient goes 10° higher... your temps will still be good. I'd say forget it and just enjoy your cards. You've got to fix the BF4 issue and use Mantle; you are missing out.


----------



## pdasterly

Is BF4 on Steam? I'll get it today if so. If ambient goes 10° higher I'm turning on my A/C.
This machine forced me to buy a portable A/C; otherwise my cave turns into a sauna.


----------



## Blaise170

Quote:


> Originally Posted by *pdasterly*
> 
> Is bf4 on steam? Ill get today if so. If ambient goes 10 higher im turning on my a/c.
> This machine forced me to buy portable a/c otherwise my cave turns into a sauna


Origin.


----------



## pdasterly

Looks like a hard copy of BF4 is cheaper than the digital download.


----------



## heroxoot

Quote:


> Originally Posted by *pdasterly*
> 
> looks like a hard copy of bf4 is cheaper than digital download


And by hard copy you mean a Frisbee with a serial code to put into Origin? Yeah, looks like it. It did go on sale a few weeks back for like 25 on Amazon for digital.


----------



## Sgt Bilko

Quote:


> Originally Posted by *heroxoot*
> 
> And by hardcopy you mean a Frisbee with a serial code to put into origin? Yea looks like it. It did go on sale a few weeks back for like 25 on amazon for digital.


And with my net, that "Frisbee" can install faster than I can download it.

Retail copies are still useful for some people.


----------



## heroxoot

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> And by hardcopy you mean a Frisbee with a serial code to put into origin? Yea looks like it. It did go on sale a few weeks back for like 25 on amazon for digital.
> 
> 
> 
> And with my net that "Frisbee" can install faster than me downloading it.
> 
> Retail copies are still useful for some people.
Click to expand...

If that works for you. BF3 had you install from the disc and then redownload the whole thing from Origin; I'd be shocked if they actually fixed it.

http://www.abc.net.au/technology/articles/2011/10/28/3350832.htm


----------



## wrigleyvillain

While I too am a big proponent of BF3 or BF4 as one of the best true stress tests, if you don't have them, just run Heaven. I am so tired of looking at it after all these years, but it really heats up my card (and black-screens it; working on fixing the cooling). Not all benches do; 3DMark 11 being one it can pass no problem.


----------



## heroxoot

Quote:


> Originally Posted by *wrigleyvillain*
> 
> While I too am a big proponent of BF3 or BF4 as one of the best true stress tests if you don't have it just run Heaven. I am so tried of looking at it after all these years but it really heats my card (and black screens; working on fixing the cooling). Not all benches do; 3DM11 being one it can pass no problem.


Heaven shows me no problems and then BF4 has artifacts. It's fairly hit and miss.


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *wrigleyvillain*
> 
> While I too am a big proponent of BF3 or BF4 as one of the best true stress tests if you don't have it just run Heaven. I am so tried of looking at it after all these years but it really heats my card (and black screens; working on fixing the cooling). Not all benches do; 3DM11 being one it can pass no problem.


That's one of the things that drives me nuts about "benchmarks". They're good for comparing apples to apples with PC performance (sorta; drivers and other things weigh on this), but ultimately my experience with my new 290 was that I could play BF4 for hours no problem, and I could bench Heaven or Valley, 3DMark 11, and FurMark, but I still had issues with the video card/drivers crashing in World of Tanks and Tropico 5. Frustrating. Ended up having to flash to an Asus 290X BIOS. No free shaders, but all I wanted was stability!


----------



## pdasterly

Quote:


> Originally Posted by *Sgt Bilko*
> 
> And with my net that "Frisbee" can install faster than me downloading it.
> 
> Retail copies are still useful for some people.


You forgot shipping time


----------



## rdr09

Installed 14.7 rc1. Took just 10 mins to uninstall 14.6 and install this. No change in scores using 3DMark11. RC1 on the right with 290 @ 1260 core and i7 4.5 HT off:


----------



## pdasterly

I noticed when I went from 14.4 to 14.6 I wasn't able to hook up my soundbar through the HDMI port anymore; CCC now thinks it's a monitor and it's throwing my Eyefinity off.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> Installed 14.7 rc1. Took just 10 mins to uninstall 14.6 and install this. No change in scores using 3DMark11. RC1 on the right with 290 @ 1260 core and i7 4.5 HT off:


Looks more like a driver to fix problems. I am going to stay with 14.6 RC for now.


----------



## debuchan

Quote:


> Originally Posted by *wrigleyvillain*
> 
> Cool, please report back. If your black screens are heat-related then that is even more evidence that mine are as well. VRM2 gets to about 95 before it blacks. Though I also think I have VRAM chip overheating too.


Hi!

So I bled my system out some more, and as long as I do not try to load any game, it will be okay. If I do try to load a game (e.g. Guild Wars 2, Injustice), the GPU core shoots up to over 90°C (my VRMs are relatively frosty at around 38°C). The main offending bubble is gone, but the fact that my core goes that high is troubling. Any thoughts? I have gone back to bleeding for now. Thanks in advance!

EDIT: Looking at it, running videos on YouTube causes the memory clock to jump up to 1250, which causes the temperature to ramp up as well. If it is not jumping to that number, temps stay around and below 60°C.


----------



## pdasterly

Quote:


> Originally Posted by *pdasterly*
> 
> took your idea, works ok with grid in benchmark mode. should I try to overclock from here.
> benchmark has been running for about an hour


Upgraded to 14.7 RC; AMD added a CrossFire profile for GRID Autosport.


----------



## HoneyBadger84

Quote:


> Originally Posted by *pdasterly*
> 
> upgraded to 14.7rc. amd added xfire profile for grid motorsport


Is that game fairly complex, or pretty simple in the sense that you just drive? I'm looking for a not-too-involved racing game to play when I get that racing-game itch, but I don't want one that gets overly complex to continue progression (à la DiRT 3). I don't mind customization, etc.


----------



## Gualichu04

How exciting: one of my GPUs still gets the black screen issue in some games; the last game it did it in was Tomb Raider. I have since changed the power cables again, and it has two separate 6+2 pins with only one GPU in use. With Folding@home, while trying to, say, watch a movie, both monitors will look pixelated with different colors and the computer has to be force-rebooted. If the different power cables do not help, I sense an RMA is afoot. Sadly, the 2nd GPU already has the water block on it and is waiting for the other one to be stable at all times. Using 14.6 RC drivers. Maybe the crappy cooler on the XFX DD is to blame; the VRMs do get up to 90°C, since all that is cooling them and the RAM is just a metal plate,
shown here:


----------



## pdasterly

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Is that game fairly complex, or pretty simple in terms of you just drive? I'm looking for a not-too-involved racing game to play when I get that racing game itch, but I don't want one that gets overly complex to continue progression (ala Dirt 3). I don't mind customization etc.


A little of both. I just game casually. NFS Rivals has good graphics.


----------



## kizwan

Quote:


> Originally Posted by *debuchan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wrigleyvillain*
> 
> Cool, please report back. If your black screens are heat-related then that is even more evidence that mine are as well. VRM2 gets to about 95 before it blacks. Though I also think I have VRAM chip overheating too.
> 
> 
> 
> Hi!
> 
> So I bled my system out some more and as long as I do not try to load any game, it will be okay. If I do try to load a game (i.e. Guild Wars 2, Injustice), the GPU core shoots up to over 90. (my VRMs are relatively frosty around 38C). The main offending bubble is gone, but the fact that my core goes that high is troubling. Any thoughts? I have gone back to bleeding for now. Thanks in advance!
> 
> EDIT: Looking at it, running videos on youtube causes the memory clocks to jump up to 1250, which causes the temperature to ramp up as well. If it is not jumping to that number, temps stay around and below 60C.
Click to expand...

You really need to re-seat the block. It shouldn't get that high; that is too high.
Quote:


> Originally Posted by *debuchan*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I have not read all +2.6k pages, but I was wondering if anyone else had this issue.
> 
> I got a R9 290X (Sapphire reference) about a month ago and it was running without issues (on 14.4). Last week, my pump dies, so I decide I will get a new waterblock for it (Aquacomputer). I install it last night, do a leak test, and boot up. I now am getting the dreaded black screens that have plagued several other 290 users, sometimes a full shut down. Has anyone had this happen (where the black screens came after it was running fine)?
> 
> Other facts:
> - There is an air bubble in my 290 waterblock. It is not large, but it also is not going away. Could this be causing the issue? Also, how do I get an air bubble out of a graphics card (aside from running leak test)? Shaking and jostling has not helped.
> - I have yet to try the 14.6 beta driver
> 
> I think my issue is tied to one (or both) of these facts, but if anyone else can give me other ideas to try out, I would be appreciative.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: I used Fujipoly (Extreme, 1.0mm) on the memory and whatever Aquacomputer supplied with its water block on the items they marked as green on the installation manual


If I remember correctly, with the Aquacomputer block you need to use TIM on the memory. Using thermal pads instead may prevent the block from sitting properly on the GPU die/chip.


----------



## Decoman

I noticed something odd with my Sapphire 290X Tri-X (non-OC) card. The PCB layout matches a 290X card, so I know I have a 290X; the sticker on the back also states that it is a 290X. The serial number on the card doesn't match the one on the box, though it's only a minor difference: the box's serial number (SN) ends in xxxxxxxxxx183, while the card's SN ends in xxxxxxxxxx182.

Edit: So I heard just now that the guy I bought the used card from (with receipt) had an acquaintance, and that they bought two cards together and supposedly must have mixed up the boxes. Guess that explains it.


----------



## debuchan

Quote:


> Originally Posted by *kizwan*
> 
> You really need to re-seat the block. You shouldn't get that high. That is too high.


Yeah, I took off the block and saw that the TIM didn't spread. I don't know what I was thinking. Anyways, I cleaned it up, reapplied, and made sure it spread out.
Quote:


> Originally Posted by *kizwan*
> 
> If I remember correctly, with aquacomputer block, you need to use TIM on the memory. Using thermal pad instead may prevent the block from sitting properly on the GPU die/chip.


I read somewhere that folks were using Fujipoly to get better temps. (Of course, I think I used the wrong size; found the info here: http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures)

Well, I guess I know what to fix. Thanks!

EDIT: I just realized where I made my mistake... I saw the table in the link and used the Heatkiller info (because my brain, for whatever reason, mixes up Heatkiller and AC). Ugh...


----------



## PCSarge

So, who wants to see pictures of the sexy Aquacomputer 290/X block when I get off work at 3 PM?


----------



## Arizonian

Quote:


> Originally Posted by *PCSarge*
> 
> so. who wants to see pictures of the sexy aquacomputer 290/X block when i get off work at 3PM?


Me! Call me weird, but I enjoy pics of members' computer parts. There's 'people watching' and then there's us who like 'people's computer parts watching'. Click away and share.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Decoman*
> 
> I noticed something odd with my Sapphire 290X tri-x (non-oc) card. The pcb layout matches a 290X card so I know I have a 290X card, the sticker at the back also states that it is a 290X card. The serial number on the card doesn't match the one on the box, there just is minor difference there though. The box has the serial number (SN) ending in xxxxxxxxxx183 however the card's SN ends with xxxxxxxxxx182.


If you got it used from someone selling multiple cards, they may have mixed up the boxes. Had that happen with one of my first 2 XFX 290X Core Editions: one serial matched the box, the other didn't.


----------



## pdasterly

Quote:


> Originally Posted by *rdr09*
> 
> what's your ambient temp? Your temps are fine. either that or those games are not taxing the gpus, especially Rome. your other gpu was asleep.
> 
> i was kid kid with minecraft. how about some BF4 or BF3? nonetheless, your temps are fine.


Tried with Metro: Last Light; very demanding, about as much as BF4. VRMs went up to 80°C, but I can't run the game in windowed mode to monitor the VRMs while I benchmark.


----------



## tarobbt

Anyone else besides me having crashing issues with the R9 290X?

It ran perfectly fine for 2 weeks, but it started crashing when the GPU is under load.

Tried using 2 separate PCIe cables; that fixed it for a day before it went back to crashing.

My PSU is the Corsair AX750, which I bought last year. It works fine during idle.


----------



## pdasterly

Quote:


> Originally Posted by *tarobbt*
> 
> Anyone else besides me having crashing issues with the R9 290x?
> 
> It ran perfectly fine for 2 weeks, but it started crashing when GPU is under load.
> 
> Tried using 2 separate PCI cables, fixed it for a day before it went back crashing.
> 
> My PSU is the Corsair Ax750 which I bought last year. Works fine during idle.


Is your card crashing, or your system?


----------



## the9quad

Quote:


> Originally Posted by *tarobbt*
> 
> Anyone else besides me having crashing issues with the R9 290x?
> 
> It ran perfectly fine for 2 weeks, but it started crashing when GPU is under load.
> 
> Tried using 2 separate PCI cables, fixed it for a day before it went back crashing.
> 
> My PSU is the Corsair Ax750 which I bought last year. Works fine during idle.


It all depends for me; it is weird. I can play BF4 and other games for days and really push the cards and I won't get a single crash. Then I can play something like Divinity: Original Sin or PlanetSide 2 and get a hard crash, to the point that Windows won't recognize my cards being installed or having drivers upon reboot. That's happened twice now. So now I play Divinity: Original Sin with CrossFire disabled and get the same frame rate with zero crashes. Haven't tried PlanetSide 2 again since it locked up twice in a row.


----------



## Decoman

Q: What software can I use to just have a look at the 290X BIOS? (For the sake of making a comparison.)


----------



## Red1776

Quote:


> Originally Posted by *Decoman*
> 
> Q: What software can I use to just have a look at 290X bios? (For sake of making a comparison.)


CCC will give you the BIOS number and date.


----------



## HoneyBadger84

GPU-Z, or CCC as he said.


----------



## Decoman

Wow, so I am testing out some overclocking settings for my Sapphire 290X Tri-X (non-OC) card with Trixx, and suddenly I get a 12 fps increase. *confused*

The core voltage seems to be a lot more stable, and so is my core clock, which probably explains the improved fps and score in Unigine Heaven. The only thing I did was raise the mem clock from 1300 to 1350, leaving the core at 1010, the power limit at +50, and the voltage at +100mV. I did set the fan to 70%, but I also had it at 70% when trying the memory at 1300 MHz.

I run Unigine Heaven at "custom", "ultra", "extreme", 8x AA, fullscreen, and 1920 x 1200 resolution.

Weird, but I like it.


*Edit:* Apparently, if I change my Trixx settings while my Heaven benchmark is already running in the background, I get 12 fps more than normal. If I close the Heaven benchmark and start it up again, I get the regular lower fps. I wonder what is not working with the drivers, Trixx, or whatnot. It doesn't really seem to have anything to do with the overclocking values I set.

*Edit2:* Apparently, running Unigine Heaven with Vsync off improves average fps and makes the core clock a lot more steady.


----------



## Decoman

No guys, what I meant was: what software can I use to inspect the settings in the 290X BIOS?

I modded my 7950's BIOS some time ago, but I understand things are quite different with the Hawaii cards.


----------



## Red1776

Quote:


> Originally Posted by *Decoman*
> 
> 
> 
> 
> 
> 
> 
> 
> No guys, what I meant was: what software can I use to inspect the settings for the 290X bios?
> 
> I modded my 7950's bios some time ago, but I understand things are quite different with the Hawaii cards.


http://www.techpowerup.com/vgabios/

In the rightmost column you can download the BIOS details, after you look up yours in the database.
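Once you have a BIOS image saved to disk (GPU-Z's BIOS-save button or a TechPowerUp database download both give you a ROM file), a quick generic way to peek at its embedded identifiers is to pull out the printable ASCII strings, like the Unix `strings` tool does. This is a hedged Python sketch for inspection only, not a BIOS editor; the sample bytes just reuse the part number and version string quoted earlier in the thread:

```python
import re

def ascii_strings(data: bytes, min_len: int = 8):
    """Return runs of printable ASCII at least min_len bytes long (like `strings`)."""
    return [m.group().decode("ascii")
            for m in re.finditer(rb"[\x20-\x7e]{%d,}" % min_len, data)]

# For a real dump you would do:  ascii_strings(open("bios.rom", "rb").read())
# where "bios.rom" is a hypothetical filename for a saved ROM.
sample = b"\x00\xff113-C6710100-L01\x00garbage\x01015.042.000.003.000000\xfe"
print(ascii_strings(sample))   # ['113-C6710100-L01', '015.042.000.003.000000']
```

Handy for confirming the part number on a card matches the BIOS listed in the database before you start comparing anything else.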


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Looks more like a driver to fix problems. I am going to stay with 14.6 RC for now.


I think it added CrossFire support in some games, and I read somewhere it adds the ability to use 60Hz at higher resolutions.

Quote:


> Originally Posted by *pdasterly*
> 
> Tried with metro last light, very demanding, about as much as bf4. vrms went up to 80 but cant run game in windowed mode to monitor vrms while I bbenchmark.


p, you can always just check the temps after the benchmarks; leave the app running in the background. If you want to see the max temps reached on the core and VRMs in GPU-Z, set the reading on each to "max" using the drop-downs.

HWiNFO will show min/max/average and current.
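For anyone curious what the min/max/average readouts mentioned above actually compute, here is an illustrative Python sketch. It is not a real sensor reader (on Windows you would still use GPU-Z or HWiNFO for that); the readings and the 100°C alert threshold are made-up numbers for the example:

```python
# Illustration of the min/max/average/current bookkeeping a sensor logger does.
# The readings list stands in for a hypothetical once-per-second VRM1 temp log.
VRM_ALERT_C = 100  # assumed threshold; posts above treat ~100C+ as the worry zone

def summarize(samples):
    """Return (min, max, average, current), like one HWiNFO sensor row."""
    return min(samples), max(samples), sum(samples) / len(samples), samples[-1]

def over_threshold(samples, limit=VRM_ALERT_C):
    """True if any sample crossed the alert threshold during the run."""
    return max(samples) >= limit

log = [62, 71, 80, 84, 83, 79]   # made-up VRM1 readings during a benchmark loop
print(summarize(log))            # (62, 84, 76.5, 79)
print(over_threshold(log))       # False
```

The point is that the "max" column already catches a spike even if you only look after the run, which is why checking afterwards works.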


----------



## pdasterly

Quote:


> Originally Posted by *rdr09*
> 
> i thnk it added crossfire support in some games and i read somewhere the ability to use 60Hz in higher rez.
> p, you can always just check the temps after the benchmarks. leave the app running in the background. if you want to see the max temps reached on the core and vrms in GPUZ, set the reading on each to max using the drop downs.
> 
> 
> 
> Hwinfo will show min/max/average and current.


Checking after benchmarks: what if the VRMs overheat in the meantime? I want to catch it before it's too late.


----------



## rdr09

Quote:


> Originally Posted by *pdasterly*
> 
> checking after benchmarks, what if vrms overheat. i want to catch before too late


In short benchmarks it does not really matter. The VRMs, for example, are designed to withstand higher than 100°C, and the core at 95°C will make the GPU throttle, so the temp will lower before the GPU breaks. Seeing your temps, there's nothing really to worry about. In games, I suggest playing for at least half an hour during the warmest part of the day and checking the temps. If GPU-Z shows the max temp reached is 85°C, then that is the max; all others will be lower.


----------



## pdasterly

Quote:


> Originally Posted by *rdr09*
> 
> in short benchmarks it does not really matter. the vrms, for example, are designed to withstand higher than 100C. the core at 95C will make the gpu throttle, thus, temp will lower before the gpu breaks. seeing your temps, nothing really to worry about. in games, i suggest playing for at least half an hour during the warmest part of the day and check the temps. if GPUZ shows max temp reached is 85C, then that is the max. all others will be lower.


i have g10 bracket, gpu never goes above 60c


----------



## HoneyBadger84

Quote:


> Originally Posted by *pdasterly*
> 
> checking after the benchmark won't help if the vrms overheat mid-run. i want to catch it before it's too late


Why not just run afterburner with its in-game OSD enabled to at least watch the GPU core temp? That's the main thing I use AfterBurner for.


----------



## rdr09

Quote:


> Originally Posted by *pdasterly*
> 
> i have g10 bracket, gpu never goes above 60c


p, your temp is almost as good as watercooled gpus. the maximum reading in Hwinfo, for example, is a reading taken during the most heated or demanding part of the bench or game.

it is a snapshot of that moment. it is as if you can see it live.


----------



## pdasterly

Couldn't get it to work properly, seemed overly confusing. Trixx is simple: gpu clock, mem clock, vddc and power limit. I will give AB another shot but I couldn't get the OSD to display. Not worried about the gpu, it's my vrms


----------



## Decoman

As I am fiddling with Unigine Heaven I notice that my fps improves by 12 fps if I multitask out of the running full screen Unigine Heaven benchmark and then go back to it in Windows. Simply starting Unigine Heaven gives me lower fps, and the core clock is a lot more uneven.

Edit: Eeeh, is it normal to get lower scores in Unigine Heaven with vsync on? With vsync off, I get higher average fps and a higher score. The core clock also runs more steadily with vsync off, apparently, regardless of having restarted Unigine Heaven and regardless of having multitasked out and back in.


----------



## pdasterly

As a last resort I was gonna oc each card individually and run the weaker card as my main card. That's a lot of work, but it's seeming like I have no choice


----------



## Kittencake

I think xfx has a don't ask don't tell policy


----------



## HoneyBadger84

The only way I can see to effectively watch thermals in benchmarks and games is a second screen with the GPUz sensor tab up. I've never had the issue of VRMs running hotter than the GPU core though, and that's on 7 different R9 290/290Xs I've tested so far.


----------



## pdasterly

Quote:


> Originally Posted by *HoneyBadger84*
> 
> The only way I can see to effectively watch thermals in benchmarks and games is a second screen with the GPUz sensor tab up. I've never had the issue of VRMs running hotter than the GPU core though, and that's on 7 different R9 290/290Xs I've tested so far.


I'm partially water cooled, gpu temp isn't an issue


----------



## pdasterly

Quote:


> Originally Posted by *Decoman*
> 
> As I am fiddling with Unigine Heaven I notice that my fps improves by 12 fps if I multitask out of the running full screen Unigine Heaven benchmark and then go back to it in Windows. Simply starting Unigine Heaven gives me lower fps, and the core clock is a lot more uneven.
> 
> Edit: Eeeh, is it normal to get lower scores in Unigine Heaven with vsync on? With vsync off, I get higher average fps and a higher score. The core clock also runs more steadily with vsync off, apparently, regardless of having restarted Unigine Heaven and regardless of having multitasked out and back in.


V-sync caps fps at 60. When I switch out from the benchmark to windows my fps drops, as does my gpu load, until I click back on the benchmark


----------



## Tokkan

Okay then I have a few questions.
My r9 290 is crashing my computer; the heat coming out of it is heating up everything and making my CPU unstable








My case is a NZXT Lexa S with two 240mm fans up top, I was thinking of getting a dual 140mm AIO cooler with a G10 to put up top, but I would like to know if it would be overkill to go with a 280mm AIO cooler? I'm thinking of the Corsair H110 coupled with the NZXT G10.
I have no idea where to buy the other small heatsinks inside the EU; my country (Portugal) has no store with them, so I'm probably going to order both the Corsair H110 and the G10 from overclockers.co.uk, but where to get the VRM/VRAM heatsinks?

Another question I have is, Arctic Accelero Xtreme IV. How would it compare to the Corsair H110+G10?

Of note I will obviously be overclocking but my prime goal is getting the heat down and the silence up.

Thanks in advance to the replies.


----------



## Mega Man

Quote:


> Originally Posted by *Kittencake*
> 
> I think xfx has a don't ask don't tell policy


no, xfx openly says it is ok on their website! but if you damage the card then the warranty is invalid


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *Tokkan*
> 
> Okay then I have a few questions.
> My r9 290 is crashing my computer, the heat coming out from it is heating up everything and making my CPU unstable
> 
> 
> 
> 
> 
> 
> 
> 
> My case is a NZXT Lexa S with two 240mm fans up top, I was thinking of getting a dual 140mm AIO cooler with a G10 to put up top, but I would like to know if it would be overkill to go with a 280mm AIO cooler? I'm thinking of the Corsair H110 coupled with the NZXT G10.
> I have no idea where to buy the other small heatsinks inside the EU; my country (Portugal) has no store with them, so I'm probably going to order both the Corsair H110 and the G10 from overclockers.co.uk, but where to get the VRM/VRAM heatsinks?
> 
> Another question I have is, Arctic Accelero Xtreme IV. How would it compare to the Corsair H110+G10?
> 
> Of note I will obviously be overclocking but my prime goal is getting the heat down and the silence up.
> 
> Thanks in advance to the replies.


An H110 is definitely overkill, especially since it's only cooling the GPU, not the ram or VRMs. The 295x2 uses a single 120mm radiator to cool two 290 GPUs. Totally up to you, but if you feel the hot air from the 290 is causing instability on your CPU, maybe you should get either 2 120/140mm CLCs (H55/H90) and use one for the CPU and one for the GPU, or just go with a basic custom loop (start with a Swiftech H220 kit or XSPC Rasa kit and add in a GPU block).

For the heatsinks, you can get generic ones from any online site; Enzotech makes nice copper ones. Get some of those and some thermal adhesive pads.

Honestly, if you have the room, the Arctic will be a much simpler, out-of-the-box solution, but either way you can't go wrong.


----------



## Kittencake

Quote:


> Originally Posted by *Mega Man*
> 
> no, xfx openly says it is ok on their website! but if you damage the card then the warranty is invalid


oh ok .. I remember one having that policy, I just couldn't remember which one. XFX is really good with warranties; I've dealt with their rma dept once and it was quick and painless


----------



## pdasterly

tried to torture test it, found my limit. can't add vddc or the vrms will overheat.
Only could get 1075 on gpu and 1350 on mem. Power limit 50. 0 vddc.
no errors in occt


any vddc and the vrms will overheat
any more gpu clock and i will get errors in occt
any more mem and the system will blackscreen


----------



## Enzarch

Quote:


> Originally Posted by *Dasboogieman*
> 
> I'd say the 780 is slightly worse than Hawaii XT, albeit maybe with a slight edge when fully overclocked (it also depends on your Hawaii chip; some here on OCN have achieved fantastic results), and it's definitely worse than a fully OCed Hawaii Pro unless you got one of those golden chips I keep hearing about from the NVIDIA guys. That being said, Hawaii won't bleed as much performance at higher resolutions with heavy MSAA/SSAA usage; GK110 loses whatever edge it has very quickly in these scenarios. The 4GB RAM is gravy, basically much more desirable should you go dual GPU in future. However, I can't deny that GK110 will consume less power and operate at a lower core temperature than Hawaii, plus, in the games it does well in, it does phenomenally well (though the same can also be said for AMD; the NVIDIA-favored titles tend to gimp AMD unfairly).
> 
> How close to the edge are you with your CPU overclock to be so sensitive to case temperatures?
> 
> With respect to aftermarket cooling. You have to really strike a balance between silence and cooler bulk as most aftermarket solutions can easily achieve 75 degrees load with silence. Its actually quite OK for the AMD GPU core to operate at 94 degrees, I'm sure the extra leakage has been taken in to account and compensated for. The real issue with Hawaii is the VRM temperatures, while they're rated for 120 degrees plus, their temperatures spike really fast when forced to operate beyond 250W of power draw (i.e. overvoltage conditions) due to the 5+1+1 design. On my Tri X, the VRM temperatures spike by about 5 degrees for every 50mV of extra juice. This is unfortunately, a byproduct of the 5+1+1 design.
> 
> Regardless of cooler you go, your VRM temperatures will usually be equal (best case scenario) or much worse compared to the reference cooler, and this is the real overclock limiter, not core temperatures.
> 
> 1. The best performance outright in all metrics is a full cover water block. You get core and VRM improvements. I don't need to explain the downsides though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2. NZXT G10 + AIO cooler + GELID Enhancement pack + VRAM heatsinks. You get comparable core temperatures to full scale watercooling (most who have used it with a 120mm class rad have gotten a fairly good delta T of 25-30 degrees), however, VRM temperatures don't really improve to match the extra core headroom. Reasonably compact and silent, you just need to find a rad mounting point. Quite expensive though, not quite the same as full WC but much pricier than most other solutions.
> 
> 3. Corsair HG10 + AIO cooler: not much information on this yet as no owners have published results. Supposedly has superior VRM cooling to the G10 setup because the bracket recycles the reference heat plate + fan.
> 
> 4. Accelero Xtreme III: probably the most popular at the moment with most users here. Takes roughly 2.5 slots, is pretty long, and requires a reasonably involved installation process. I've been told temperatures around 60 are possible with great silence. Somewhat weaker VRM cooling, though this is quite technique-sensitive with respect to your VRM cooling arrangements (i.e. thermal epoxy vs tape). I'd avoid the Xtreme IV for the moment; reports range from wildly better to worse VRM temperatures, plus it is even bulkier due to the massive backplate.
> 
> 5. GELID Icy Vision: Pretty cheap, reasonably quiet. Performance can range from 70-80s depending on your noise threshold. Probably the most compact of the aftermarket coolers.
> 
> 6. Raijintek: I don't recommend this; it's a monstrous 4-slot cooler which will undoubtedly give you fantastic sub-60 degree temperatures with great silence (due to large fans). However, the sheer bulk and weight doesn't really seem worth it considering the VRM cooling isn't much improved over the smaller coolers. For this effort, it may be better to go with an AIO.
> 
> I'm sure a lot of other people will chime in regarding options I may have missed.


Aye, provided you have a reference cooler, you can use my method to get good vrm temps with an AIO or universal block


----------



## rdr09

Quote:


> Originally Posted by *Tokkan*
> 
> Okay then I have a few questions.
> My r9 290 is crashing my computer, the heat coming out from it is heating up everything and making my CPU unstable
> 
> 
> 
> 
> 
> 
> 
> 
> My case is a NZXT Lexa S with two 240mm fans up top, I was thinking of getting a dual 140mm AIO cooler with a G10 to put up top, but I would like to know if it would be overkill to go with a 280mm AIO cooler? I'm thinking of the Corsair H110 coupled with the NZXT G10.
> I have no idea where to buy the other small heatsinks inside the EU; my country (Portugal) has no store with them, so I'm probably going to order both the Corsair H110 and the G10 from overclockers.co.uk, but where to get the VRM/VRAM heatsinks?
> 
> Another question I have is, Arctic Accelero Xtreme IV. How would it compare to the Corsair H110+G10?
> 
> Of note I will obviously be overclocking but my prime goal is getting the heat down and the silence up.
> 
> Thanks in advance to the replies.


if you can find this stuff locally, then i recommend them . . .

Seidon

Gelid

see post # 24047

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/24040#post_22387647


----------



## Tokkan

Quote:


> Originally Posted by *Dr.GumbyM.D.*
> 
> An H110 is definitely overkill, especially since it's only cooling the GPU, not the ram or VRMs. The 295x2 is using 1 120mm radiator to cool 2x290 GPUs. totally up to you, but if you feel that the hot air from the 290 is causing instability on your CPU because of heat, maybe you should get either 2 120/140mm CLCs (H55/H90) and use one for CPU and one for GPU, or just go with a basic custom loop (start with a swiftech H220 kit or XSPC rasa kit and add in a GPU block).
> 
> For the heatsinks, you can get the generic ones from any online site, enzotech makes nice copper ones. Get some of those and then some thermal adhesive pads.
> 
> Honestly, if you have the room, the Arctic will be a much more simple, out of the box solution, but either way you can't go wrong.


I doubt an H90 would be capable of cooling it as well as my Noctua, so I'm not downgrading/sidegrading.
The temperature on my Phenom II with my Noctua NH-D14 is 55 degrees at load. I can't say for sure that it is instability on the CPU because I don't monitor temps ingame, and it only happens in some games. It doesn't happen in Watch Dogs or Battlefield 3/4.
I know the heat is causing it, because as soon as I set a really aggressive fan curve (going up to 75% fan speed atm ingame) it stops crashing. GPU temps with this fan speed are 70 degrees on the GPU and around 60 for the VRMs.
With my headset on and the volume turned up it doesn't bother me, but I can't hear anyone around me, and I had to change the mic settings on teamspeak because everyone was hearing the GPU fan









A custom loop kit was on my mind too: getting a 240/280 kit, whichever is in stock at the better price / with better components, plus a 120 rad and a gpu block. Should stay below 300€ and improve cooling on everything. Just a tad too pricey for my taste since I'm trying to save up.

Any experience on the Accelero Xtreme IV? How well does it keep temps down/noise?

Okay then, bad experiences with the Accelero it seems...
Quote:


> Originally Posted by *rdr09*
> 
> if you can find this stuff locally, then i recommend them . . .
> 
> Seidon
> 
> Gelid
> 
> see post # 24047
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/24040#post_22387647


Oh this seems like the solution I am looking for! I had the seidon in my sights; I can find it new for 40 euros lol
And I already have an army of fans zip tied together around 10cm away from the GPU blowing air at it...


----------



## ZealotKi11er

So I've been testing the 290s since yesterday and even with Mantle I am getting ~70% GPU usage with the cards @ stock in BF4. I tried 200% resolution scaling and it made my system feel like it's running an integrated GPU. For the first time I get to experience how games feel when you run out of vRAM. Even 150% is not the smoothest thing.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Kittencake*
> 
> oh ok .. I remember one having that policy, I just couldn't remember which one. XFX is really good with warranties; I've dealt with their rma dept once and it was quick and painless


Sapphire is "Don't ask, Don't tell"


----------



## wrigleyvillain

Quote:


> Originally Posted by *heroxoot*
> 
> Heaven shows me no problems and then BF4 has artifacts. It's fairly hit and miss.


Well BF4 is even more stressful. Heaven is def "enough" to properly test my cooling adequacy. I first ran 3DM11 on this card as it was the only bench installed at that time and it passed easily while staying cool so I thought my card was a-ok. NOPE!
Quote:


> Originally Posted by *Kittencake*
> 
> oh ok .. I remember one having that policy, I just couldn't remember which one. XFX is really good with warranties; I've dealt with their rma dept once and it was quick and painless


Hmm would XFX allow me to RMA a card not being original owner and not physically having the stock cooler?


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So I've been testing the 290s since yesterday and even with Mantle I am getting ~70% GPU usage with the cards @ stock in BF4. I tried 200% resolution scaling and it made my system feel like it's running an integrated GPU. For the first time I get to experience how games feel when you run out of vRAM. Even 150% is not the smoothest thing.


i just used GPUZ to see how much vram was used in BF4 MP 64 at only 1080 and 130% rez scale - 3300MB. one 290.


----------



## JeremyFenn

Quote:


> Originally Posted by *wrigleyvillain*
> 
> Hmm would XFX allow me to RMA a card not being original owner and not physically having the stock cooler?


Not being the original owner: possibly; I'd think so, since it's the card itself that should be under warranty (unless the previous owner registered it).

Not having the stock cooler: That's probably a no, any serious alterations to their cards is usually a "do not pass go, do not collect $200" type of deal-breaker and warranty voider.

Hey, do you have any 3rd party software like AB or Trixx installed? Maybe setting the fan curve yourself would help with those temps. I know on my Sapphire card I use a 1:1 curve from 0-50c, then 50-100% from 50c-70c.



I use Watch Dogs personally to test out my OCs and temps on the GPU; even windowed it uses quite a bit of VRAM and taxes the GPU at 100%, so I can monitor what's going on with the entire machine while in game. I had this fan curve set 1:1 from 0-50c, then 50%-100% from 50-80c. Temps are ok for me; I've seen a lot worse.
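A fan curve like that is just a piecewise-linear mapping from temperature to fan duty. A rough sketch of the 1:1-then-ramp shape described above (the breakpoints are the poster's settings, not a recommendation; tools like Afterburner or Trixx apply the equivalent curve in the driver):

```python
def fan_speed(temp_c, ramp_start=50.0, ramp_end=80.0):
    """Fan duty in percent: 1:1 with temperature up to ramp_start,
    then a linear ramp from 50% to 100% between ramp_start and ramp_end."""
    if temp_c <= 0:
        return 0.0
    if temp_c <= ramp_start:
        return float(temp_c)  # 1:1 region: 40 C -> 40% duty
    if temp_c >= ramp_end:
        return 100.0
    # linear interpolation from 50% at ramp_start to 100% at ramp_end
    return 50.0 + (temp_c - ramp_start) * 50.0 / (ramp_end - ramp_start)


for t in (40, 50, 65, 80):
    print(t, "C ->", fan_speed(t), "%")
```

Steepening the ramp (lower `ramp_end`) trades noise for headroom, which is exactly the lever being discussed here.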


----------



## wrigleyvillain

Yes, I suppose I could try raising the fan curve, but it seems my real problem areas are VRM2 and the rear VRAM chips (closest to the mobo), and I am not sure any of the 3 Accelero fans really hit those areas enough. I am in the process of getting better, properly-fitting heatsinks on those now, as it turns out the person who installed this cooler used some mismatched ones. The VRM2 ones were loose as well. Oh, and there was way too much TIM on the core too (but that is an easy mistake to make, I know…).


----------



## PCSarge

Quote:


> Originally Posted by *Arizonian*
> 
> Me! Call me weird but I enjoy pics of members computer parts. There's 'people watching' and then there's us who like 'people's computer parts watching'. Click away and share.


haha i'm just waiting for the box from dazmode to say "delivered" so i can jump out of my skin and run home at 4PM. he said 3PM was too early. boss thinks i'm going to a professional gaming tournament ...he also thinks i'm sponsored cause i wore the extravalanza shirt to work. lol

in this case stupidity will not be corrected. but taken advantage of. muhahaha

burned a total of $475 on parts to upgrade my loop. and i have no regrets.


----------



## JeremyFenn

Yeah, I was debating removing the cooling unit on my card and replacing the TIM with some PK-1. I mean, 70-71c max on these cards is like nothing; they're designed to hit the 90's (95c I think is the ceiling for the GPU itself). I don't know if it'd be worth it in the long run.


----------



## Dasboogieman

Quote:


> Originally Posted by *Dr.GumbyM.D.*
> 
> An H110 is definitely overkill, especially since it's only cooling the GPU, not the ram or VRMs. The 295x2 is using 1 120mm radiator to cool 2x290 GPUs. totally up to you, but if you feel that the hot air from the 290 is causing instability on your CPU because of heat, maybe you should get either 2 120/140mm CLCs (H55/H90) and use one for CPU and one for GPU, or just go with a basic custom loop (start with a swiftech H220 kit or XSPC rasa kit and add in a GPU block).
> 
> For the heatsinks, you can get the generic ones from any online site, enzotech makes nice copper ones. Get some of those and then some thermal adhesive pads.
> 
> Honestly, if you have the room, the Arctic will be a much more simple, out of the box solution, but either way you can't go wrong.


I agree; I used the H90 for my card + the NZXT G10 for a while. It achieves roughly 40-45 degrees stock (about 50 when overclocked at +125mV) with 25 degrees ambient, a healthy delta T of 15-20 degrees. It was a PITA to find space in my case for the H90; basically, it won't fit in many cases smaller than the HAF 932, and I can only imagine 240mm rads would compound the problem.
A 120mm is more than sufficient for AIO based cooling, and most cases can easily accommodate 2, though VRM temps become an issue.
If AIO cooling is the route taken, I would very strongly recommend the Kraken over the Corsair units as they have the same AIO OEM but the hoses are much much longer.

Quote:


> Originally Posted by *Enzarch*
> 
> Aye, provided you have a reference cooler, you can use my method you can get good vrm temps with an AIO or universal block


That would be far superior to the GELID + G10 setup. The reference cooler fan and fin stack are really bad but the heatplate is definitely efficient.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> i just used GPUZ to see how much vram was used in BF4 MP 64 at only 1080 and 130% rez scale - 3300MB. one 290.


Do you know how to check the actual resolution? I have 1440p @ 150%. vRAM usage is something like 7.6GB/2. What's weird is that the core clock is not a steady 1000MHz. Feels like there is a memory bottleneck somewhere. I tried to increase the power limit but that did not change it either.


----------



## JeremyFenn

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Do you know how to check the actual resolution? I have 1440p @ 150%. vRAM usage is something like 7.6GB/2. What's weird is that the core clock is not a steady 1000MHz. Feels like there is a memory bottleneck somewhere. I tried to increase the power limit but that did not change it either.


7.62GB / 2 Cards? Looking at your rig are you using a 290 and a 290x in CFX?


----------



## ZealotKi11er

Quote:


> Originally Posted by *JeremyFenn*
> 
> 7.62GB / 2 Cards? Looking at your rig are you using a 290 and a 290x in CFX?


I think you have to divide by 2 in CFX. Reports double.
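If a monitoring tool sums the counters across CrossFire cards, the per-card figure is just the reported number divided by the GPU count (assuming AFR, where each card mirrors the same frame data; whether a given tool actually sums is an assumption here). A trivial sketch:

```python
def per_gpu_vram_mb(reported_mb, num_gpus):
    """In AFR CrossFire each GPU holds a copy of the same data, so a tool
    that sums counters across cards overstates usage by the GPU count."""
    return reported_mb / num_gpus


# ~7.6 GB reported across two 290s -> ~3.8 GB per card, near the 4 GB ceiling
print(per_gpu_vram_mb(7600, 2))
```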


----------



## Razzaa

http://www.techpowerup.com/gpuz/9c9x6/

MSI R9 290 Stock


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Do you know how to check the actual resolution? I have 1440p @ 150%. vRAM usage is something like 7.6GB/2. What's weird is that the core clock is not a steady 1000MHz. Feels like there is a memory bottleneck somewhere. I tried to increase the power limit but that did not change it either.


the way i understand it you just multiply the two: 1440 x 1.5 = 2160. do you really need 150%? you are using close to 4GB, but some say these apps read allocated usage, not actual. imo, it does not have to read exactly 4000MB 'cause other processes use the VRAM too.
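The arithmetic above generalizes: resolution scaling multiplies each axis by the scale factor, so the pixel load grows with the square of it. A quick sketch:

```python
def scaled_resolution(width, height, scale_pct):
    """Each axis is multiplied by the scale factor; e.g. 1440p @ 150%
    renders internally at 3840x2160, i.e. 2.25x the pixels."""
    f = scale_pct / 100.0
    return round(width * f), round(height * f)


w, h = scaled_resolution(2560, 1440, 150)
print(w, h)                     # the internal render resolution
print((w * h) / (2560 * 1440))  # relative pixel load vs native
```

That squared growth is why 150% already pushes a 4 GB card toward its vRAM ceiling while 100% sits comfortably.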


----------



## heroxoot

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> i just used GPUZ to see how much vram was used in BF4 MP 64 at only 1080 and 130% rez scale - 3300MB. one 290.
> 
> 
> 
> Do you know who to check the actually resolution? I have 1440p @ 150%. vRAMs usage is something like 7.6GB/2. What's weird is that Core clock is not steady 1000MHz. Feel like there is memory bottleneck somewhere. I tried to increase the power limit but that did not change it either.

My card never stays exactly at max clock in BF4. The load is fairly high in BF4, 80-100% mostly, but never max clock. The clock always dips to within 10mhz of the 3D clock. Forcing the clock to stay at max by disabling PowerPlay does not help, so I assume this isn't an issue and it's just doing what it does. In Heaven and Valley, however, the clock never dips even a little. I can only assume it's the game at fault.


----------



## ZealotKi11er

Quote:


> Originally Posted by *heroxoot*
> 
> My card never stays exactly max clock in BF4. The load is fairly high in BF4, 80 - 100% mostly, but never max clock. The clock always dips around 10mhz of the 3D clock. Forcing the clock to stay at max by disabling PowerPlay does not help, so I assume this isn't an issue and it's just doing what it does. In Heaven and Valley however, the clock never dips even a little. I can only assume it's the game at fault.


I don't remember the clock dipping when I was running a single GPU. I was getting 99% GPU usage though. They do drop in less demanding games.


----------



## heroxoot

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> My card never stays exactly max clock in BF4. The load is fairly high in BF4, 80 - 100% mostly, but never max clock. The clock always dips around 10mhz of the 3D clock. Forcing the clock to stay at max by disabling PowerPlay does not help, so I assume this isn't an issue and it's just doing what it does. In Heaven and Valley however, the clock never dips even a little. I can only assume it's the game at fault.
> 
> 
> 
> I dont remember clock dipping when i was running single GPU. I was getting 99% GPU usage though. They do drop in less demanding games.

Yea, the majority of the time it's like 1mhz lower than my 3D clock. I see it run, say, 1029 instead of 1030, which really does not hurt anything. In less demanding games I see half clocks with no problems too. Borderlands 2 seems to be the only problematic game for me still, but it is not just me or my GPU; even 780Tis have poor usage problems. But yea, it happens regardless.


----------



## Arizonian

Quote:


> Originally Posted by *Razzaa*
> 
> http://www.techpowerup.com/gpuz/9c9x6/
> 
> MSI R9 290 Stock


Congrats - added


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I don't remember the clock dipping when I was running a single GPU. I was getting 99% GPU usage though. They do drop in less demanding games.


i play at stock and my core clock was pegged it seems . . .


----------



## heroxoot

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> I dont remember clock dipping when i was running single GPU. I was getting 99% GPU usage though. They do drop in less demanding games.
> 
> 
> 
> i play at stock and my core clock was pegged it seems . . .

Well, I'm not even pushing temp limits; it just likes to go down a mhz or 2. Doesn't seem to be a problem; I'm getting 80-90fps at the default 1030/1250.


----------



## rdr09

Quote:


> Originally Posted by *heroxoot*
> 
> Well I'm not even pushing temp limits, it just likes to go down a mhz or 2. Doesn't seem to be a problem, getting 80 - 90fps on the default 1030/1250.


could be 'cause you have boost. mine is reference.


----------



## heroxoot

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Well I'm not even pushing temp limits, it just likes to go down a mhz or 2. Doesn't seem to be a problem, getting 80 - 90fps on the default 1030/1250.
> 
> 
> 
> could be 'cause you have boost. mine is reference.

It does not self-boost though. You use the gaming app and manually tell it to upclock. I just decided to set 1040 in MSI AB. I've been letting it clock down with powerplay and so far no issues. I have been fine with this card on the default clocks for the most part, but since I feel an extra app is silly, I just have it clocking up that whole 10mhz.

Either way, it's not a problem from what I can tell, and forcing my GPU to stay at a stable clock does not improve performance at all in Borderlands 2, so letting it downclock in games that don't keep it pegged isn't an issue, I guess.


----------



## rdr09

Quote:


> Originally Posted by *heroxoot*
> 
> It does not self boost tho. You use the gaming app and you manually tell it to upclock. I just decided to set 1040 in MSI AB. I been letting it clock down with powerplay and so far no issues. I have been fine with this card on the default clocks for the most part, but since I feel an extra app is silly, I just have it clocking up that whole 10mhz.
> 
> Either way, it's not a problem from what I can tell, and forcing my GPU to stay at a stable clock does not improve performance at all in borderlands 2, so letting it down clock for games that don't keep it pegged isn't an issue I guess.


i see.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> i play at stock and my core clock was pegged it seems . . .


Yeah, that's how it used to be for me with 1 card, even OCed @ 1200MHz. I don't mind 3-10MHz less but i like solid numbers.


----------



## Gualichu04

Does no one have any suggestions for my issue? It's in post #26285


----------



## PCSarge

3 more hours until the sexy aquacomputer block arrives. as long as ups doesn't make some lame excuse not to deliver, that is.


----------



## wrigleyvillain

Gee Sarge why aren't you excited?


----------



## yudodisamd

http://www.techpowerup.com/gpuz/z4neh/

PowerColor R9 290 with stock cooling and some extra fans strapped to it, overclocked to 1050 core / 1300 memory


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gualichu04*
> 
> Does no one have any suggestions for my issue? It's in post #26285


The VRM cooling actually is not bad there. Just buy some better thermal pads. Not much else you can do.


----------



## PCSarge

Quote:


> Originally Posted by *wrigleyvillain*
> 
> Gee Sarge why aren't you excited?


how can i not be? you all will be when you see the end results


----------



## Gualichu04

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The VRM cooling actually is not bad there. Just buy some better thermal pads. Not much else you can do.


What about the black screen crash happening randomly in games? I have changed the power cables for the card twice now and have yet to test the 2nd set of cables.


----------



## Gualichu04

Hooray, I still get the black screen crash, even in the Heaven benchmark now. The 2nd gpu never did it, and now I see no other option but to rma it and use only one gpu under water while I wait for that long process. Would an unstable cpu overclock make this issue happen at all?


----------



## PCSarge

Quote:


> Originally Posted by *Gualichu04*
> 
> Hooray, I still get the black screen crash, even in the Heaven benchmark now. The 2nd gpu never did it, and now I see no other option but to rma it and use only one gpu under water while I wait for that long process.


and HOPE you get another reference card. a lot of companies have revised their designs by now.


----------



## Gualichu04

Quote:


> Originally Posted by *PCSarge*
> 
> and HOPE you get another reference card. a lot of companies have revised their designs by now.


I have the XFX DD r9 290x, which is non-reference. I find it sad that a brand new card has the black screen issue, and on a non-reference edition at that.


----------



## Decoman

I am new to the 290X, but I believe I have already experienced the "black screen". It might be something else, though: at one point I simply set the memory clock all the way up to 1500, and the screen went black. Because Trixx was set to restore clocks even on a reboot, I got a black screen the moment I logged into windows. The solution was to uninstall Trixx and reinstall it. I suspect the memory didn't have enough voltage, but I am no expert.


----------



## PCSarge

Quote:


> Originally Posted by *Decoman*
> 
> I am new to the 290X, but I believe I have already experienced the "black screen". It might be something else, though: at one point I simply set the memory clock all the way up to 1500, and the screen went black. Because Trixx was set to restore clocks even after a reboot, I got a black screen the moment I logged into Windows. The solution was to uninstall and reinstall Trixx. I suspect the memory didn't have enough voltage, but I am no expert.


Sounds like an unstable memory clock. Increase it in increments of 25-50MHz; don't just crank it. These cards are very "sansetave", as Bugs Bunny would say.

Trixx has never cooperated for me. I've always used AB and had no issues with it.

In other news: my box from Dazmode has been delivered. It's party time in 1 1/2 hrs!


----------



## Gualichu04

Here is a pic of an R9 290X with an Aquacomputer block and active backplate. At least I now have a reason to finish the water loop, albeit with only one GPU.


----------



## PCSarge

Quote:


> Originally Posted by *Gualichu04*
> 
> Here is a pic of an R9 290X with an Aquacomputer block and active backplate. At least I now have a reason to finish the water loop, albeit with only one GPU.


I got the clear/copper version of that block. Waiting on the backplate; it's on backorder. Apparently Aquacomputer is very bad at keeping sellers stocked.

Gonna run ocean blue coolant through mine. Should look pretty amazing.


----------



## ZealotKi11er

Both my cards are from AMD and they never had the black screen problem. Does your card have Elpida or Hynix vRAM?


----------



## PCSarge

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Both my cards are from AMD and they never had the black screen problem. Does your card have Elpida or Hynix vRAM?


Mine has no issues either, and it's got Hynix. I believe the issue is linked to Elpida RAM.


----------



## rdr09

Quote:


> Originally Posted by *PCSarge*
> 
> Sounds like an unstable memory clock. Increase it in increments of 25-50MHz; don't just crank it. These cards are very "sansetave", as Bugs Bunny would say.
> 
> Trixx has never cooperated for me. I've always used AB and had no issues with it.
> 
> In other news: my box from Dazmode has been delivered. It's party time in 1 1/2 hrs!


pics already. who needs to eat and sleep. put that thing together. what paste and thermal pads are you using?


----------



## PCSarge

Quote:


> Originally Posted by *rdr09*
> 
> pics already. who needs to eat and sleep. put that thing together. what paste and thermal pads are you using?


I've got some nice thermal pads left over from EK. We'll see which are better, those or the ones that come with the Aquacomputer block. I'll be using Xigmatek Freezing Point G4718; it's really great paste for the price.

You'll get pics when I get home. I leave work in an hour, so I'll probably be home in an hour and 20 minutes. Trust me, I'm as anxious as you. There's more than just the block in that box.


----------



## Gualichu04

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Both my cards are from AMD and they never had the black screen problem. Does your card have Elpida or Hynix vRAM?


Sadly Elpida on both cards.


----------



## PCSarge

Quote:


> Originally Posted by *Gualichu04*
> 
> Sadly Elpida on both cards.


Then you've sadly found the root cause. I don't know why they switched RAM suppliers, but it hurts the cards. Hynix seems to have a higher tolerance for punishment.


----------



## cephelix

Quote:


> Originally Posted by *Gualichu04*
> 
> Here is a pic of an R9 290X with an Aquacomputer block and active backplate. At least I now have a reason to finish the water loop, albeit with only one GPU.


that is just super smexy!


----------



## ZealotKi11er

Quote:


> Originally Posted by *PCSarge*
> 
> I've got some nice thermal pads left over from EK. We'll see which are better, those or the ones that come with the Aquacomputer block. I'll be using Xigmatek Freezing Point G4718; it's really great paste for the price.
> 
> You'll get pics when I get home. I leave work in an hour, so I'll probably be home in an hour and 20 minutes. Trust me, I'm as anxious as you. There's more than just the block in that box.


EK pads are horrible. For VRM1 and VRM2, before laying down the thermal pads, put some (non-conductive) thermal paste on each VRM chip; that should help a lot. I am hitting ~50-55C stock and 60-65C on VRM1 with +100mV. These VRM suckers do like to run hot, even with active VRM cooling.


----------



## Arizonian

Quote:


> Originally Posted by *yudodisamd*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/z4neh/
> 
> PowerColor R9 290 with stock cooling and some extra fans strapped to it and overclocked to Core1050 Memory1300


Congrats - added







Welcome to OCN as well.








Quote:


> Originally Posted by *Gualichu04*
> 
> Here is a pic of an R9 290X with an Aquacomputer block and active backplate. At least I now have a reason to finish the water loop, albeit with only one GPU.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - looking good. Updated


----------



## Gualichu04

Quote:


> Originally Posted by *PCSarge*
> 
> Then you've sadly found the root cause. I don't know why they switched RAM suppliers, but it hurts the cards. Hynix seems to have a higher tolerance for punishment.


What am I supposed to do? XFX will just send me another card with Elpida memory, and the second GPU is fine with the Elpida it has. Must just be a random issue.


----------



## tsm106

lol

I'm still 9th on the HoF for Firestrike Extreme 3x gpu with two of the three gpus using Elpida chips.


----------



## kizwan

One of my cards has Elpida memory. It can do 1210/1620 and is currently ranked #60 for 3DMark 11 (Performance), beating a few 780 Tis, Titans & Titan Blacks in the process.


----------



## jumperfly

Guys, is there a way to pump 1.3V+ through my R9 290? It's on a full custom loop and can do 1250MHz with +100mV in Afterburner, but the chip feels like it has so much more to give.


----------



## ZealotKi11er

Quote:


> Originally Posted by *jumperfly*
> 
> Guys, is there a way to pump 1.3V+ through my R9 290? It's on a full custom loop and can do 1250MHz with +100mV in Afterburner, but the chip feels like it has so much more to give.


TRIXX


----------



## JeremyFenn

Quote:


> Originally Posted by *ZealotKi11er*
> 
> TRIXX


I thought Trixx +200 was only on Sapphire cards that could use that feature?


----------



## jumperfly

Quote:


> Originally Posted by *ZealotKi11er*
> 
> TRIXX


Doesn't apply the clocks upon restart, tried everything


----------



## pdasterly

Both my cards have Hynix mem. Mine won't go past 1350.


----------



## amptechnow

I have a reference 290 on a custom loop and can hit 1250/1625 with Elpida!


----------



## PureBlackFire

Elpida clocks fine; it's just more prone to errors than Hynix. Elpida chips also appear to have a greater chance of dying from bad memory overclocks.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gualichu04*
> 
> Sadly Elpida on both cards.


Try to get in contact with XFX; I think there is a BIOS specific to Elpida cards that fixes the issue.
Quote:


> Originally Posted by *tsm106*
> 
> lol
> 
> I'm still 9th on the HoF for Firestrike Extreme 3x gpu with two of the three gpus using Elpida chips.


Some of us are still mortal, ya know








Quote:


> Originally Posted by *kizwan*
> 
> One of my cards has Elpida memory. It can do 1210/1620 and is currently ranked #60 for 3DMark 11 (Performance), beating a few 780 Tis, Titans & Titan Blacks in the process.


Yeah, my 290s clock pretty well; they're both Elpida, and no black screens on my end








Quote:


> Originally Posted by *JeremyFenn*
> 
> I thought Trixx +200 was only on Sapphire cards that could use that feature?


It's not locked to Sapphire branded cards, I think GPU Tweak is locked to Asus only though


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> One of my cards has Elpida memory. It can do 1210/1620 and is currently ranked #60 for 3DMark 11 (Performance), beating a few 780 Tis, Titans & Titan Blacks in the process.


My MSI 290 has Elpida, too. It can only do 1600, 1620 in a few benches. Had it since launch and no black screens except past 1620.


----------



## PCSarge

Well, I was up and running for 3 hours. The rig randomly shut off while playing BF4; I narrowed it down to a dead ITX board, and I'm broke. The other bad news: no Sandy-compatible Z77 ITX boards left that I can find, so I'm forced to buy a new board and chip.


----------



## Roaches

Cause of death? Burnt SMDs, blown VRM, track damage?


----------



## Scorpion49

Hey guys, is anyone else experiencing problems with Vapor-X models? My first one died within a few days of being new, and now my replacement started artifacting after only two days of use in a totally different machine. I'm not overclocking or anything, and my temps stayed below 60°C the whole time. I'm getting really irritated with this, as I had a great time with the regular Tri-X.


----------



## PCSarge

Quote:


> Originally Posted by *Roaches*
> 
> Cause of death? burnt SMDs, VRM blew, track damage?


I believe the cause of motherboard death was old age. Nothing visually wrong with it


----------



## Roaches

Sorry for your loss. Hopefully there's some warranty left, but knowing ASUS....


----------



## PCSarge

Quote:


> Originally Posted by *Roaches*
> 
> Sorry for your loss, hopefully theres some warranty left but knowing ASUS....


Not gonna bother with Asus. I'm setting up my old ATX until I can afford a new board + chip.


----------



## ZealotKi11er

Quote:


> Originally Posted by *PCSarge*
> 
> Not gonna bother with Asus. I'm setting up my old ATX until I can afford a new board + chip.


Why would you not RMA it?


----------



## Velict

asus RMA is not bad. Just annoy the hell out of them until they bend to your will. They usually come around.


----------



## yawa

So, 1245MHz seems to be all my card will do with +200mV. I have to ask, not that I'm intending to push it too far, but what are my options for getting it a little higher?

My temperatures are quite reasonable (at the above voltage/clock rate I peak around 55C), so I'd like to at least attempt 1300MHz. The problem, of course, is that I can't find any option to go above +200mV.

So yeah, any ideas guys?


----------



## bond32

Quote:


> Originally Posted by *yawa*
> 
> So, 1245MHz seems to be all my card will do with +200mV. I have to ask, not that I'm intending to push it too far, but what are my options for getting it a little higher?
> 
> My temperatures are quite reasonable (at the above voltage/clock rate I peak around 55C), so I'd like to at least attempt 1300MHz. The problem, of course, is that I can't find any option to go above +200mV.
> 
> So yeah, any ideas guys?


Credit goes to sugarhell:
Quote:


> Just use /wi4,30,8d,10 for 100mV. The offset step is 6.25 mV, and the value is given in hexadecimal. So in decimal: 16 * 6.25 = 100 mV. For 50mV you need 8. For 200mV you need 20 (hex 20 = 32 decimal, so 32 * 6.25 = 200mV).
> 
> The easy way to make changes:
> 
> Create a txt file on the desktop. Write
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi4,30,8d,10
> 
> and then save it as a .bat file. Every time you start this bat file, MSI Afterburner will start with +100mV.
> 
> For 50mV: 8
> For 100mV: 10
> For 125mV: 14
> For 150mV: 18
> For 175mV: 1C
> For 200mV: 20
> 
> I wouldn't go over this point because:
> 1) You are close to leaving the sweet spot of the reference PCB's VRM efficiency.
> 2) These commands add 200mV on top of the 100mV offset available through the AB GUI. That means 300mV total.
> 
> By default the /wi command applies to the current GPU only. So if you have 2 or more GPUs you must use the /sg command. The command line then looks something like this:
> ex: MsiAfterburner.exe /sg0 /wi6,30,8d,10 /sg1 /wi6,30,8d,10
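
The hex values in that table all fall out of the same 6.25 mV step size, so they're easy to double-check. Here's a quick sketch in plain Python (the helper name is mine, purely for illustration; Afterburner itself only takes the raw hex value on its command line):

```python
def ab_voltage_arg(offset_mv):
    """Hex digits for the last argument of Afterburner's /wi command.

    The voltage controller steps in 6.25 mV increments, so e.g.
    +100 mV -> 100 / 6.25 = 16 decimal = hex 10.
    (Function name is illustrative, not part of any Afterburner API.)
    """
    steps = offset_mv / 6.25
    if steps != int(steps) or not 0 < steps <= 32:
        raise ValueError("offset must be a multiple of 6.25 mV, at most +200 mV")
    return format(int(steps), "X")

# Reproduces the table above: 50->8, 100->10, 125->14, 150->18, 175->1C, 200->20
for mv in (50, 100, 125, 150, 175, 200):
    print(f"+{mv}mV -> MSIAfterburner.exe /wi4,30,8d,{ab_voltage_arg(mv)}")
```

This is also why 125mV maps to the non-obvious-looking "14": 125 / 6.25 = 20 decimal, which is 14 in hex.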


----------



## HoneyBadger84

The only way to go higher than +200mV would be to find a modified BIOS for your card that allows it, I think.

Edit: or do that ^^^ lol


----------



## bond32

Be careful with that though... You are talking a lot of voltage/heat...


----------



## fateswarm

Does anyone recall any reputable data on the point at which voltage kills a chip before temperature does? We know a lot about that for CPUs from LN2 overclocking (high voltage, low temps), but I'm not aware of much info for GPUs.


----------



## Arizonian

Quote:


> Originally Posted by *bond32*
> 
> Credit goes to Sugarhill:


sugarhell's guide has been in the OP's info section for reference.









Edit: the other important guide is The R9 290(X) "Need to Know" by OCN member Roboyto, which everyone should read.


----------



## Dire Squirrel

Just realized that for reasons unknown, I am listed as having a Sapphire 290 with stock cooling.

It is in fact a watercooled Club3D 290.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Just realized that for reasons unknown, I am listed as having a Sapphire 290 with stock cooling.
> 
> It is in fact a watercooled Club3D 290.


You're automatically listed as a Sapphire on air if you don't specify what brand etc of card you have when posting for membership


----------



## Arizonian

Quote:


> Originally Posted by *Dire Squirrel*
> 
> Just realized that for reasons unknown, I am listed as having a Sapphire 290 with stock cooling.
> 
> It is in fact a watercooled Club3D 290.


Thank you.

If it's ever listed like that, it's because an entry was missing the info requested in the OP for submission. I do remember asking but never got a reply, so it defaulted to that. With how fast this thread moves, it's easy to miss things. Sorry about that.

Updated.


----------



## PCPanamaCrew

http://www.techpowerup.com/gpuz/b8b6k/

AMD Sapphire Radeon R9 290X 4GB GDDR5 DUAL DVI-D/HDMI/DP PCI-Express BF4 Edition Graphics Card

Stock Cooler.

P.D: HYNIX MEMORY!!!


----------



## HoneyBadger84

Quote:


> Originally Posted by *PCPanamaCrew*
> 
> http://www.techpowerup.com/gpuz/b8b6k/
> 
> AMD Sapphire Radeon R9 290X 4GB GDDR5 DUAL DVI-D/HDMI/DP PCI-Express BF4 Edition Graphics Card
> 
> Stock Cooler.


Welcome to the club









I'm gonna post up when I get the HIS cards in. Didn't bother posting up with the Sapphire Tri-X 290 & 290X because I quickly determined, despite their awesomeness, that I was going back to reference blower cards for Tri/QuadFire.


----------



## Arizonian

Quote:


> Originally Posted by *PCPanamaCrew*
> 
> http://www.techpowerup.com/gpuz/b8b6k/
> 
> AMD Sapphire Radeon R9 290X 4GB GDDR5 DUAL DVI-D/HDMI/DP PCI-Express BF4 Edition Graphics Card
> 
> Stock Cooler.


Congrats - added









Welcome to OCN as well.









_(418 members with 604 GPUs listed now)_


----------



## Vici0us

Reference ASUS R9 290 overclocked a bit.
http://www.techpowerup.com/gpuz/7hcez/


----------



## JordanTr

Quote:


> Originally Posted by *Tokkan*
> 
> I doubt an H90 would be capable of cooling it as well as my Noctua. So I'm not downgrading/sidegrading.
> The temperatures on my Phenom II with my Noctua NH-D14 are 55 degrees at load. I can't say for sure that it's instability on the CPU, because I don't monitor temps in-game and it only happens in some games. Doesn't happen in Watch Dogs or Battlefield 3/4.
> I know the heat is causing it, because as soon as I set a really aggressive fan curve (going up to 75% fan speed in-game atm) it stops crashing. GPU temps at this fan speed are 70 degrees on the GPU and around 60 for the VRMs.
> With my headset on and the volume turned up it doesn't bother me, but I can't hear anyone around me, and I had to change the mic settings on TeamSpeak because everyone was hearing the GPU fan.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> A custom loop kit was on my mind too, getting a 240/280 kit, whichever is in stock at the better price/better components getting a 120 rad and a gpu block. Should stay below 300e and improve cooling on everything. Just a tad too pricey for my taste since I'm trying to save up.
> 
> Any experience on the Accelero Xtreme IV? How well does it keep temps down/noise?
> 
> Okay then, bad experiences with the Accelero it seems...
> Oh, this seems like the solution I am looking for! I had the Seidon in my sights; I can find it new for 40 euros lol.
> And I already have an army of fans zip-tied together around 10cm away from the GPU blowing air at it...


OK, maybe I'm a bit late, but I've got the Arctic Xtreme IV on my R9 290. Used only the stuff Arctic supplied, no other TIM, heatsinks, etc. I keep my card running 1100/1400 with no added voltage, just +50% power limit. I get a max core of 63, VRM1 82, VRM2 64. Just make sure you've got a side fan blowing cool air on the VGA backplate, because it does all the work cooling the VRMs. A Corsair AF140 Quiet Edition does the job quite decently; I don't even think it spins at full speed, since it's connected through a splitter together with the rear exhaust fan. If you get the Performance Edition you could see even better VRM temps.


----------



## Nirvana91

Hi guys,

Yesterday, while running Heaven 4.0, I took this screenshot of GPU-Z:



The card's a Sapphire R9 290 Tri-X OC, which is at 1140/1600 with 1.32V... Isn't this too much voltage?


----------



## Arizonian

Quote:


> Originally Posted by *Vici0us*
> 
> Reference ASUS R9 290 overclocked a bit.
> http://www.techpowerup.com/gpuz/7hcez/


Congrats - added


----------



## Sgt Bilko

Quote:


> Originally Posted by *Nirvana91*
> 
> Hi guys,
> 
> Yesterday, while running Heaven 4.0, I took this screenshot of GPU-Z:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> The card's a Sapphire R9 290 Tri-X OC, which is at 1140/1600 with 1.32V... Isn't this too much voltage?


Temps are in check so voltage is ok


----------



## Nirvana91

The temperatures are going to get better; here in Portugal the weather's hot right now (30+ Celsius).


----------



## HoneyBadger84

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Temps are in check so voltage is ok


Agreed.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Nirvana91*
> 
> The temperatures are going to get better; here in Portugal the weather's hot right now (30+ Celsius).


Yeah, I'm in Denmark atm and it's a bit warm here for my liking








Quote:


> Originally Posted by *HoneyBadger84*
> 
> Agreed.


Voltage isn't that much of an issue for Hawaii, it seems; so long as the temps are in check, you're good to go.

These cards are still new, though, so we haven't seen what voltage does in the long term. I haven't heard any horror stories about volts burning these cards out yet, and I doubt anything under +100mV will hurt them long term.


----------



## 033Y5

Don't think I have joined yet.
http://www.3dmark.com/3dm11/8513165
http://www.techpowerup.com/gpuz/3vyy8/


Spoiler: Warning: Spoiler!











using ek rev 2 copper/plexi + backplate


----------



## Arizonian

Quote:


> Originally Posted by *033Y5*
> 
> dont think i have joined yet
> http://www.3dmark.com/3dm11/8513165
> http://www.techpowerup.com/gpuz/3vyy8/
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## HoneyBadger84

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah, I'm in Denmark atm and it's a bit warm here for my liking
> 
> 
> 
> 
> 
> 
> 
> 
> Voltage isn't that much of an issue for Hawaii, it seems; so long as the temps are in check, you're good to go.
> 
> These cards are still new, though, so we haven't seen what voltage does in the long term. I haven't heard any horror stories about volts burning these cards out yet, and I doubt anything under +100mV will hurt them long term.


I really wanna see where the clock sweet spot is for my new cards at +100mV once I get them in, since that's probably as high as I'll push them, seeing as they'll be in TriFire to start with and QuadFire later, once I get a dual-PSU setup and don't have to worry about powering the overclocks properly.


----------



## rdr09

Quote:


> Originally Posted by *PCSarge*
> 
> Well, I was up and running for 3 hours. The rig randomly shut off while playing BF4; I narrowed it down to a dead ITX board, and I'm broke. The other bad news: no Sandy-compatible Z77 ITX boards left that I can find, so I'm forced to buy a new board and chip.


Just a bump in the road. Expensive, but... we gotta do what we gotta do. GL!


----------



## Imprezzion

Now THIS is what I like to see!

Scored an XFX R9 290 that unlocks to a 290X, with an Accelero Hybrid.
Cut up the stock cooler for VRM cooling, like so many before me.

Check these temps and OC. (Clickable ofcourse)



That is just sick. It actually clocks the core higher at +100mV than any of my real 290Xs... and 1500MHz for Elpida ain't bad either, with no black screens!
VRM temps at +100mV are really good as well.


----------



## HoneyBadger84

Quote:


> Originally Posted by *rdr09*
> 
> lol


Regarding air/liquid cooling: as I was saying, tsm106 has tested up to 1.5V without issues, so if you're not seeing high temps on the cards and you're staying under that, nothing bad should happen, unless your card's quality is utter trash anyway.

My Sapphire R9 290X Tri-X just resold for $360 + shipping. Woot woot, only a slight loss after eBay fees & it was fun to play with... plus I needed it gone with the HIS cards coming in Monday.


----------



## 033Y5

Quote:


> Originally Posted by *Imprezzion*
> 
> Now THIS is what I like to see!
> 
> Scored an XFX R9 290 that unlocks to a 290X, with an Accelero Hybrid.
> Cut up the stock cooler for VRM cooling, like so many before me.
> 
> Check these temps and OC. (Clickable, of course)
> 
> 
> 
> That is just sick. It actually clocks the core higher at +100mV than any of my real 290Xs... and 1500MHz for Elpida ain't bad either, with no black screens!
> VRM temps at +100mV are really good as well.


Not too far from my clocks, with a reference XFX 290 unlocked to a 290X, also with Elpida.


----------



## bond32

This thread has been swarm'ed by fate...


----------



## HoneyBadger84

Quote:


> Originally Posted by *bond32*
> 
> This thread has been swarm'ed by fate...


Chihuahua or Jack Russell. That's my only question/response to that. lol Get it?


----------



## sugarhell

Are you talking about LN2 volts or what? Because tsm used 1.5V like the LN2 guys, but on water. The GPU just scales better with cold.


----------



## HoneyBadger84

Quote:


> Originally Posted by *sugarhell*
> 
> Are you talking about LN2 volts or what? Because tsm used 1.5V like the LN2 guys, but on water. The GPU just scales better with cold.


I'm not talkin' about LN2 volts, period. I was telling someone who asked whether +200mV is safe that, generally speaking, if temperatures are in check (GPU & VRMs), aka under about 80C (for me; I know they can go higher, but I don't like it), then any voltage in that range is perfectly safe for use.


----------



## Ramzinho

YAY my 290X is here











Validation

One question, though: GPU-Z doesn't seem to detect my fan RPM, although I could hear it go WAY WAY up during the Valley benchmark. (Correction: it shows what % it's running at, but not how fast in RPM.)

Now give me tips to run this thing right. I know there are tons of options; just throw me your best ones. What software should I use? I know AB is a good one, but I heard it conflicts with CCC. So throw your experience at me and help me get a fairly cooler GPU.


----------



## piquadrat

Quote:


> Originally Posted by *Ramzinho*
> 
> YAY my 290X is here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One question, though: GPU-Z doesn't seem to detect my fan RPM, although I could hear it go WAY WAY up during the Valley benchmark. (Correction: it shows what % it's running at, but not how fast in RPM.)


The only program which can give you RPM is HWiNFO. Afterburner - NO, GPU-Z - NO, Trixx - NO, SpeedFan - NO, AIDA - NO. Don't know why, don't ask me. Ask the developers.


----------



## Ramzinho

here are my readings after playing some FC3



any thoughts?

think it's time to OC my monitor to 96hz


----------



## alancsalt

Cleaned.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Ramzinho*
> 
> YAY my 290X is here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One question, though: GPU-Z doesn't seem to detect my fan RPM, although I could hear it go WAY WAY up during the Valley benchmark. (Correction: it shows what % it's running at, but not how fast in RPM.)
> 
> Now give me tips to run this thing right. I know there are tons of options; just throw me your best ones. What software should I use? I know AB is a good one, but I heard it conflicts with CCC. So throw your experience at me and help me get a fairly cooler GPU.


Congrats man! Temps look good so far. That a reference or aftermarket card?







*highfive*


----------



## Ramzinho

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Congrats man! Temps look good so far. That a reference or aftermarket card?
> 
> 
> 
> 
> 
> 
> 
> *highfive*


reference









Looking forward to OCing my monitor. I'll probably get another 290X once I sell my old 7970, and plan for the ultimate finish with a full custom loop. 92C at full load isn't nice at all; my 7970 barely hit 60C over a 6-hour gaming session. I need to cool this beast ASAP.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Ramzinho*
> 
> reference
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looking forward to OCing my monitor. I'll probably get another 290X once I sell my old 7970, and plan for the ultimate finish with a full custom loop. 92C at full load isn't nice at all; my 7970 barely hit 60C over a 6-hour gaming session. I need to cool this beast ASAP.


100% fan, man. Doooooooo it. You know you want to. Just once. Hear the Leaf Blower. Learn it. Love it. lol. I just got through testing a reference R9 290 I got from an idiot who was supposed to send me an R9 290X (having fun on eBay getting that returned & refunded), and I cranked the fan during benchmarks to remind myself of what I'm getting into with the HIS cards I have coming... it wasn't as bad as I remembered, and it kept the card below 70C during all testing.







67C was the peak during 2 loops of Valley.

Also, for reference, this is with a nice uber-blower fan directing air at the cards' intakes in QuadFire; the middle 2 cards are reference XFX R9 290Xs at 100% fan speed:


----------



## Infinite Jest

I just got my 290 in yesterday, and I've run into a problem that I want to make sure isn't specific to this individual card (bought used). When I boot my PC on 14.4, I usually have to unplug the DVI cable attached to my monitor to get the card to output to the display. I first noticed this when installing the driver: the monitor flashes a few times, but on the 3rd or so flash it didn't come back on. I wound up wiping everything AMD-related and installing again, but the boot issue persists. Does anyone else have this, or is my card borked?

EDIT: BTW, the BIOS splash screen loads without an issue, as does the Windows loading animation. The output dies as soon as the login screen pops up in Win 8.1. Driver-related? I may try the beta drivers and see if they do anything.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Infinite Jest*
> 
> I just got my used 290 in yesterday, and I've run into a problem that I want to make sure isn't specific to this individual card (bought used). When I boot my PC on 14.4, I usually have to unplug the DVI cable attached to my monitor to get the card to output to the display. I first noticed this when installing the driver: the monitor flashes a few times, but on the 3rd or so flash it didn't come back on. I wound up wiping everything AMD-related and installing again, but the boot issue persists. Does anyone else have this, or is my card borked?
> 
> EDIT: BTW, the BIOS splash screen loads without an issue, as does the Windows loading animation. The output dies as soon as the login screen pops up in Win 8.1. Driver-related? I may try the beta drivers and see if they do anything.


I had that happen with the R9 290 (wrong card) card I just got in, the first time after driver install, and what I did was I plugged another monitor in to another port, it switched to that one, then I switched it back, and the issue hasn't occured since... might help ya, might not.

I'm on 14.6 RCs though.


----------



## Infinite Jest

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I had that happen with the R9 290 (wrong card) card I just got in, the first time after driver install, and what I did was I plugged another monitor in to another port, it switched to that one, then I switched it back, and the issue hasn't occured since... might help ya, might not.
> 
> I'm on 14.6 RCs though.


I just installed 14.6 as well (literally just now, I'll report back if it fixes it). What exactly do you mean by switching to/back?


----------



## HoneyBadger84

Quote:


> Originally Posted by *Infinite Jest*
> 
> I just installed 14.6 as well (literally just now, I'll report back if it fixes it). What exactly do you mean by switching to/back?


The card for some reason switched output to a plug with no monitor connected. Connecting a monitor to it, then using CCC to switch the output back to my main monitor fixed it & it hasn't occured again since. Check to see if it's outputing to one of your other plugs if you have a second monitor to do this with.


----------



## Infinite Jest

Quote:


> Originally Posted by *HoneyBadger84*
> 
> The card for some reason switched output to a plug with no monitor connected. Connecting a monitor to it, then using CCC to switch the output back to my main monitor fixed it & it hasn't occured again since. Check to see if it's outputing to one of your other plugs if you have a second monitor to do this with.


Will do, thanks.

EDIT: It does appear to be outputting to the second plug, or at least completely ignoring the initial monitor on boot. Can't seem to get it to work properly. Looks like others have complained about it online as well as happening with 14.1+. This is a bad start for my first AMD card...


----------



## yawa

Ugh. After finally stabilizing 4.7GHz on my i7 4790K (it passed 10 hours of the Intel Extreme test and the four hours I left it on Prime) and pushing the best clocks I've ever managed on this card, 1247MHz/1500MHz (don't worry, I decided not to pursue going over +200mV), I decided to switch over to the new Catalyst 14.7 RC.

To make a long story short: after I DDU'd the old drivers and installed 14.7, everything ran fine last night through a few benches. When I woke up this morning and booted, not one minute after I got into Windows I got hit with the dreaded black screen. Now it keeps happening within 45 seconds to 1 minute 30 after booting.

So, as you can imagine, I'm a little stuck.

A little advice is needed here, as I've never had one of these before. Is it driver-related? Memory-related? A combination? Did I bork myself with those clocks?

A little help would be appreciated.


----------



## Dr.GumbyM.D.

Quote:


> Originally Posted by *yawa*
> 
> To make a long story short: after I DDU'd the old drivers and installed 14.7, everything ran fine last night through a few benches. When I woke up this morning and booted, not one minute after I got into Windows I got hit with the dreaded black screen. Now it keeps happening within 45 seconds to 1 minute 30 after booting.
> 
> So, as you can imagine, I'm a little stuck.
> 
> A little advice is needed here, as I've never had one of these before. Is it driver-related? Memory-related? A combination? Did I bork myself with those clocks?
> 
> A little help would be appreciated.


Yep, I was hosed after installing the new drivers. I submitted a ticket to AMD about it.

I flashed my card back to its stock BIOS and it worked again (947MHz core, PowerColor 290 OC). I had been on an ASUS 290X BIOS.

If you have another computer, I would try flashing a low-clock/stock BIOS and see if that helps.

I did a full clean-out, DDU, etc., and had this same issue. Definitely frustrating. I also had to use a backup monitor, as it wouldn't output over DisplayPort or HDMI to my 4K monitor, so I had to hook up an old 1080p HDMI monitor just to get a signal out and flash it.

Good luck!


----------



## SLOWION

Just picked this R9 290 up on eBay for a little over $300 shipped









It was used for about two months according to the seller



http://www.techpowerup.com/gpuz/f2sw2/

Also made a video


----------



## Razzaa

I would reinstall drivers. Go back to 14.4


----------



## Infinite Jest

Quote:


> Originally Posted by *Infinite Jest*
> 
> Will do, thanks.
> 
> EDIT: It does appear to be outputting to the second plug, or at least completely ignoring the initial monitor on boot. Can't seem to get it to work properly. Looks like others have complained about it online as well as happening with 14.1+. This is a bad start for my first AMD card...


Found something strange. Firstly, the upper DVI port is more likely to detect a GPU and secondly, of my two monitors, my Dell U2312HM refuses to be detected in either port upon boot, even in a dual monitor configuration while my Samsung T220 is detected and works in either port (gets detected faster in the upper port). Any ideas as to why this happens?

EDIT2: Okay, now I'm confuzzled. Swapped DVI cables and rolled back to 14.4 and now both monitors work together, individually, or in either port.







Hopefully it stays this way,


----------



## PCSarge

Well, I got pics before my ITX board bit the bullet of its own accord last night:

the complete loop:



the CPU block:



the sexy aquacomputer block:


----------



## aaroc

Quote:


> Originally Posted by *Infinite Jest*
> 
> Found something strange. Firstly, the upper DVI port is more likely to detect a GPU and secondly, of my two monitors, my Dell U2312HM refuses to be detected in either port upon boot, even in a dual monitor configuration while my Samsung T220 is detected and works in either port (gets detected faster in the upper port). Any ideas as to why this happens?
> 
> EDIT2: Okay, now I'm confuzzled. Swapped DVI cables and rolled back to 14.4 and now both monitors work together, individually, or in either port.
> 
> 
> 
> 
> 
> 
> 
> Hopefully it stays this way,


This happened to me when I had a 7870 and was using an HDMI cable. The cable was bad quality and didn't work well with my LG monitor. Changed the cable and all monitors detected at boot without problems. It was the cable, because I connected a DVD player with HDMI output to the monitor and had the same problem. If you have another cable, try that. Maybe the combination of GPU + cable + monitor doesn't work.


----------



## Kittencake

Wow, this card gets almighty hot playing Old Republic.


----------



## the9quad

So my power supply is 99.5 amps on a single 12v rail. Is that enough for 3 290x's? Anyone know? Because I occasionally have some really funky crashes when pushing these cards even at stock. Stays under 1200 watts, but I am not sure if the current draw on these is more than this power supply can handle.
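The rail math above can be sanity-checked quickly. This is a rough sketch with assumed numbers (a heavily loaded 290X can transiently pull somewhere around 300 W; the CPU/board figure is a guess too), not measured draw for that system:

```python
# Rough single-rail headroom check for 3x R9 290X (assumed per-device figures).
RAIL_AMPS = 99.5          # the PSU's 12 V rail rating from the post
RAIL_VOLTS = 12.0

rail_watts = RAIL_AMPS * RAIL_VOLTS          # max continuous 12 V power
gpu_peak = 3 * 300                           # assumed ~300 W transient per 290X
cpu_peak = 150                               # assumed CPU + board draw on 12 V

headroom = rail_watts - (gpu_peak + cpu_peak)
print(f"Rail capacity: {rail_watts:.0f} W, "
      f"peak estimate: {gpu_peak + cpu_peak} W, headroom: {headroom:.0f} W")
```

With those guesses the rail tops out around 1194 W, so "stays under 1200 W" at the wall leaves very little margin for transient spikes, which would be consistent with occasional crashes under load.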


----------



## Infinite Jest

Quote:


> Originally Posted by *Kittencake*
> 
> wow this card gets almighty hot playing old republic .


Yeah, it's definitely a toasty reference card. Looking for a cooling solution for mine at the moment. Has anyone seen any change in load temps after repasting the stock cooler? May be a temporary solution for me until I find a cooler.


----------



## Kittencake

Quote:


> Originally Posted by *Infinite Jest*
> 
> Yeah, it's definitely a toasty reference card. Looking for a cooling solution for mine at the moment. Has anyone had any change in load Temps after repasting the stock cooler? May be a temporary solution for me before I find a cooler.


I'm waiting for the Corsair bracket to come out so I can use that paired with maybe an H80, and get an H100 for my CPU.


----------



## King4x4

Any news on whether a 290X will come out with more than one DisplayPort?


----------



## rdr09

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Regarding air/liquid cooling, as I was saying tsm106 has tested up to 1.5V and not had issues, so if you're not seeing high temps on the cards, and you're staying under that, nothing bad will happen, unless your card's quality is utter trash anyway.
> 
> My Sapphire R9 290X Tri-X just resold for $360+shipping. Woot woot, only a slight loss after EBay fees & it was fun to play with... plus I need it gone with the HIS cards coming in Monday.


I was LOLing at fate's way with words. You guys can pump as high a voltage as you want into your cards. They are your cards.


----------



## tsm106

Guys, when I say 1.5v, I mean a loaded 1.5v. Guess what that means for input voltage?


----------



## VSG

1.35-1.4v based on my experience. Of course this depends on LLC or lack thereof.
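For anyone new to the set-vs-loaded distinction being discussed, here's a toy vdroop/LLC model. The droop figure and the LLC scale are made up purely for illustration; real behavior depends on the specific VRM and LLC implementation:

```python
def loaded_voltage(set_v, droop_v=0.10, llc=1.0):
    """Toy model: with llc < 1 the rail sags under load (vdroop);
    with llc > 1 the controller overcompensates and the loaded
    voltage overshoots the set value. All numbers are illustrative."""
    return set_v - droop_v * (1.0 - llc)

# llc = 2.0 here is a made-up 'aggressive' setting: a 1.4 V set point
# landing at 1.5 V under load, the kind of gap described in the thread.
print(round(loaded_voltage(1.4, droop_v=0.10, llc=2.0), 3))
# With neutral LLC the set and loaded voltages simply match.
print(round(loaded_voltage(1.5, droop_v=0.10, llc=1.0), 3))
```

The point being: quoting a voltage without saying whether it's the set value or the value measured under load (and what the LLC is doing) makes numbers hard to compare between cards.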


----------



## Nirvana91

I do believe there's a certain point where it's pointless to go any further.

My card does 1140/1600 at 1.32 V... For me to reach 1200+ on the core I would have to go to 1.4 V...

But still, I feel a bit nervous at 1.32 V...

We do know that the HD 7900s were not recommended to go above 1.25 V... But I cannot find any data on the R9 290s.


----------



## Ramzinho

Quote:


> Originally Posted by *HoneyBadger84*
> 
> 100% fan man. Doooooooo it. You know you want to. Just once. Hear the Leaf Blower. Learn it. Love it. lol I just got through testing an R9 290 reference I got from an idiot that was supposed to send me an R9 290X (having fun on EBay getting that returned & refunded) and I cranked the fan during benchmarks to remind myself of what I'm getting in to with the HIS cards I have coming... wasn't as bad as I thought I remembered... and it kept the card below 70C during all testing
> 
> 
> 
> 
> 
> 
> 
> 67C was the peak during 2 loops of Valley.
> 
> Also, for reference, this is with a nice uber-blower-fan directing air at the card's intakes in QuadFire, the middle 2 cards are Reference XFX R9 290Xs, with them at 100% fan speed:


You know what, I was always skeptical about people bragging about "silent" computers, and I can't stress how beautiful it is. I've changed my fans to SP120s and AF120s and I love the silence. I can't even put into words how much I love them.

This thing begs to be water cooled. Time to list my old GPU on the local market and see how it goes.

I can't really run the fan at 100%... it's just absurd.


----------



## tsm106

Quote:


> Originally Posted by *Nirvana91*
> 
> I do believe there's a certain point where it's pointless to go any further.
> 
> My card does 1140/1600 at 1.32 V... For me to reach 1200+ on the core I would have to go to 1.4 V...
> 
> But still, I feel a bit nervous at 1.32 V...
> 
> *We do know that the HD 7900s were not recommended to go above 1.25 V...* But I cannot find any data on the R9 290s.


Says who???????


----------



## King PWNinater

Who has better customer support, Gigabyte or XFX?


----------



## ZealotKi11er

Quote:


> Originally Posted by *tsm106*
> 
> Says who???????


I said. Well, I had 3 x 7970, and past 1.2 V, even under water, the OC did not get any better.


----------



## tsm106

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Says who???????
> 
> 
> 
> I said. Well I had 3 x 7970 and after 1.2v even under water the OC did not get any better.
Click to expand...

Oh well since "you" said so it must be true?


----------



## mcg75

Quote:


> Originally Posted by *Nirvana91*
> 
> We do know that the HD 7900s were not recommended to go above 1.25 V... But I cannot find any data on the R9 290s.


My Sapphire 7970 ghz Vapor-X was 1.256v right out of the box.

I could do 1285 mhz in some benches and like a lot of others, increasing voltage past that didn't help.


----------



## Ramzinho

Quote:


> Originally Posted by *King PWNinater*
> 
> Who has better customer support, Gigabyte or XFX?


I've heard horror stories on this forum about XFX, and Gigabyte is a brand I won't recommend; I had loads of issues with my 7970.

AMD = Sapphire, PowerColor, or Asus.


----------



## Kittencake

Quote:


> Originally Posted by *Ramzinho*
> 
> I've heard horror stories on this forum about XFX, and Gigabyte is a brand I won't recommend; I had loads of issues with my 7970.
> 
> AMD = Sapphire, PowerColor, or Asus.


I wouldn't recommend powercolor either


----------



## Sgt Bilko

Quote:


> Originally Posted by *Ramzinho*
> 
> I've heard horror stories on this forum about XFX, and Gigabyte is a brand I won't recommend; I had loads of issues with my 7970.
> 
> AMD = Sapphire, PowerColor, or Asus.


XFX have been getting better, Asus have been getting worse, Sapphire has been good to me, Gigabyte I've heard good things about, and I know nothing of PowerColor.


----------



## rv8000

Quote:


> Originally Posted by *Sgt Bilko*
> 
> XFX have been getting better, Asus have been getting worse, Sapphire has been good to me, Gigabyte I've heard good things about and i know nothing of Powercolor


Aftermarket PowerColor cards this gen have had a lot of issues from what I gather; some personal experience with that as well. Sapphire has never done me wrong, RMAs are easy, they even gave me an upgrade to a 7970 from a 7950 for coil whine issues, and so on.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Infinite Jest*
> 
> Found something strange. Firstly, the upper DVI port is more likely to detect a GPU and secondly, of my two monitors, my Dell U2312HM refuses to be detected in either port upon boot, even in a dual monitor configuration while my Samsung T220 is detected and works in either port (gets detected faster in the upper port). Any ideas as to why this happens?
> 
> EDIT2: Okay, now I'm confuzzled. Swapped DVI cables and rolled back to 14.4 and now both monitors work together, individually, or in either port.
> 
> 
> 
> 
> 
> 
> 
> Hopefully it stays this way,


Sounds like typical occasional driver insanity to me. Mayhaps it just needed a good reinstall with both monitors plugged in? Who knows.
Quote:


> Originally Posted by *King PWNinater*
> 
> Who has better customer support, Gigabyte or XFX?


I have heard bad things about Gigabyte (they definitely 100% have quality control issues, with the number of duds/bad cards they have getting out), but I haven't had any issues with the Gigabyte WindForce card I've had for over a month. And I've heard that XFX used to have issues, but that nowadays they're actually quite great. Both XFX reference R9 290Xs I had for about a month ran great, and they were used for mining of all things for 6 months before I got them.


----------



## Nirvana91

Quote:


> Originally Posted by *tsm106*
> 
> Says who???????


André Santos, Marketing Manager at PowerColor.

He sold me 2x HD 7950 PCS+ that were press samples and warned me not to go above 1.25 V. And before that, I had already read in forums that it wasn't recommended to go above 1.25 V.

As for Hawaii, I don't think the safe limits are known.


----------



## LazarusIV

Hey guys, I've got a bit of an issue here... Let me lay it out:

I recently got home after a couple of months and my fixed R9 290 was waiting for me. I installed it with the 14.4 drivers after using DDU to clean up the old ones. I went to play Battlefield 4 and it kept giving me these Direct X errors. I did a little bit of reading and saw that MSI Afterburner could be causing the issues. I uninstalled the latest one (3.0.1 I think) and installed an older version, 3.0.0 and it seemed to help with the issue. I was happily playing BF4 again with my R9 290!

Today I decided to install the second R9 290 I had acquired recently. Once again with DDU I uninstalled the drivers, installed the 2nd GPU, then started back up and installed the drivers again with both cards in there. Might've been unnecessary but I figured better safe than sorry. The install went without a hitch but when I went to run BF4 it was giving me all sorts of stupid low memory and Direct X issues. Low memory!? Are you kidding me!?!? I did a little research and couldn't find anything in particular that was helpful. I increased my pagefile size, which barely stalled the crash. I got in game and picked a class and then it crashed with the same Direct X or low memory issue. I got rid of my pagefile and I've still got the same issue. I'm goin' nuts here, can someone help me figure this out? I feel like this is a fairly common problem with BF4 but I can't for the life of me figure it out!

Thanks everyone, I really appreciate any input!


----------



## jumperfly

Guys, is a reference ASUS R9 290 BIOS compatible with a Sapphire R9 290 Tri-X? I know the Tri-X uses the reference PCB, so I assume it should work?


----------



## Blue Dragon

Quote:


> Originally Posted by *LazarusIV*
> 
> Hey guys, I've got a bit of an issue here... Let me lay it out:
> 
> I recently got home after a couple of months and my fixed R9 290 was waiting for me. I installed it with the 14.4 drivers after using DDU to clean up the old ones. I went to play Battlefield 4 and it kept giving me these Direct X errors. I did a little bit of reading and saw that MSI Afterburner could be causing the issues. I uninstalled the latest one (3.0.1 I think) and installed an older version, 3.0.0 and it seemed to help with the issue. I was happily playing BF4 again with my R9 290!
> 
> Today I decided to install the second R9 290 I had acquired recently. Once again with DDU I uninstalled the drivers, installed the 2nd GPU, then started back up and installed the drivers again with both cards in there. Might've been unnecessary but I figured better safe than sorry. The install went without a hitch but when I went to run BF4 it was giving me all sorts of stupid low memory and Direct X issues. Low memory!? Are you kidding me!?!? I did a little research and couldn't find anything in particular that was helpful. I increased my pagefile size, which barely stalled the crash. I got in game and picked a class and then it crashed with the same Direct X or low memory issue. I got rid of my pagefile and I've still got the same issue. I'm goin' nuts here, can someone help me figure this out? I feel like this is a fairly common problem with BF4 but I can't for the life of me figure it out!
> 
> Thanks everyone, I really appreciate any input!


I had to install drivers with only one GPU in, then power down and put in the second GPU, then reboot to get my 290s going. But I think that just affected my Asus board; I didn't have to do that with my 7870s and MSI board.


----------



## HoneyBadger84

For Crossfire & SLI both, usually, after cleaning out previous drivers & putting the new cards in together, you have to let Windows boot up; it'll then ask for a reboot after installing a Standard VGA Adapter for the second card. Then you can reboot, install drivers & everything works fine. That's been my experience at least; with these cards I think it can happen with Crossfire, and it definitely happens with TriFire every time.


----------



## sugarhell

Quote:


> Originally Posted by *Nirvana91*
> 
> André Santos, Marketing Manager at PowerColor.
> 
> He sold me 2x HD 7950 PCS+ that were press samples and warned me not to go above 1.25 V. And before that, I had already read in forums that it wasn't recommended to go above 1.25 V.
> 
> As for Hawaii, I don't think the safe limits are known.


Go buy Asus stuff then, they have better PR.


----------



## Kittencake

OK, I'm playing Star Wars: The Old Republic and my card gets a tad warm. What fan speed should I set it at in Afterburner? When I set it to auto, the fan speed tends to bounce between 89% and 100% a lot.


----------



## Nirvana91

Quote:


> Originally Posted by *jumperfly*
> 
> Guys is a reference R290 ASUS BIOS compatible with a Sapphire R290 Tri-X? I know the Tri-X uses the reference PCB so I assume it should work?


Why would you want to do that? Just curious.
Quote:


> Originally Posted by *sugarhell*
> 
> Go buy asus stuff then they have better pr


Since I'm no pro, I prefer to stay well under the limits. And if a guy who happens to work in the field tells me not to go above 1.25 V, well... I'll just follow his advice.


----------



## sugarhell

They foolproof their cards for people like you. You're a newbie? Don't go over stock. You're a pro? You set the limits.


----------



## tsm106

They told me if I went over 1.25v the time space continuum would shatter. Omg!


----------



## Ramzinho

Quote:


> Originally Posted by *tsm106*
> 
> They told me if I went over 1.25v the time space continuum would shatter. Omg!


and we will be stuck in the loop. stewie.. HELP US PLEASE


----------



## LazarusIV

Quote:


> Originally Posted by *Blue Dragon*
> 
> I had to install drivers with only one GPU in, then power down and put in second GPU, then reboot to get my 290's going. but think that just affected my asus board, didn't have to do that with my 7870's and msi board.


I had read an account where a guy uninstalled the drivers, then installed both cards and installed the drivers fresh, and that was the best way he could do it, so I figured I'd just do that too. The problem occurred with one card and with two, so I'm not sure where that points...

Quote:


> Originally Posted by *HoneyBadger84*
> 
> For Crossfire & SLi both, usually, after cleaning out previous drivers & putting the new cards in together, you have to let Windows Boot up, then it'll ask for a reboot after installing some Standard VGA Adapter for the second card... then you can reboot & install drivers & everything works fine. That's been my experience at least, with these cards I think it can happen with Crossfire, definitely happens with TriFire every time.


I did wait for Windows to install its generic driver (which didn't require a reboot), then I installed the Catalyst 14.4 drivers after that. Everything else works just fine except for BF4 so I'm wondering if it's just an issue with BF4. If that's the case I'm hoping others have had this issue so there's a fix somewhere. Is it MSI Afterburner causing the issue, should I roll back to Afterburner Beta 18 or 19 or maybe even Catalyst Driver 13.12? Is it HWInfo64 causing an issue? Is it a VRAM or RAM memory leak issue with BF4? Anyone else had these issues?


----------



## Nirvana91

Quote:


> Originally Posted by *sugarhell*
> 
> They foolproof their cards for people like you. You are a newbie? Dont go over stock. You are a pro? You put the limits


Quote:


> Originally Posted by *tsm106*
> 
> They told me if I went over 1.25v the time space continuum would shatter. Omg!












If I lived in a country where this kind of graphics card didn't cost as much as the minimum salary, I wouldn't mind all that much, if you ask me. I'd just put it at 1.4 V with core/memory set to the moon. The R9 290 costs the same as the minimum salary, the 290X costs $136 more than the minimum salary, and a 780 Ti costs almost twice that.









So, like many people here, I'm just trying to find out if there's any official info about safe voltages/degradation on Hawaii. I've already gathered that there's no info on that.

To be honest, I don't recall seeing a graphics card blow up or whatever because of voltage. So if programs like TriXX let you do 1.4 V, it's probably safe if you watch the temperatures. But still, for me, 1.4 V is just too much.


----------



## sugarhell

If you have 30-40 °C max load temps, then you can use 1.6 V input. Just use MSI AB with the +100 mV limit; that's enough, if you're asking about the safe limit.


----------



## tsm106

I like how these marketing managers greenlight these custom cards with bajillion mosfets made of unobtanium and they are locked to one brand bios so you can only put +100mv or +200mv thru it. Yea, that's how they do overkill!


----------



## Sgt Bilko

Quote:


> Originally Posted by *tsm106*
> 
> I like how these marketing managers greenlight these custom cards with bajillion mosfets made of unobtanium and they are locked to one brand bios so you can only put +100mv or +200mv thru it. Yea, that's how they do overkill!


Better than a brand that locks down the voltage


----------



## tsm106

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I like how these marketing managers greenlight these custom cards with bajillion mosfets made of unobtanium and they are locked to one brand bios so you can only put +100mv or +200mv thru it. Yea, that's how they do overkill!
> 
> 
> 
> Better than a brand that locks down the voltage
Click to expand...

That's a whole other ball of jerk considering the reference design is unlocked.


----------



## HoneyBadger84

So guess what? I just showed this Seller I am in fact 100% correct about this card being a 290 in a 290X box.

Check it. Card matches serial # on the box:



Card says 290. Box says 290X. He got screwed by where-ever he bought this from. NOT ME. lol


----------



## JordanTr

Quote:


> Originally Posted by *HoneyBadger84*
> 
> So guess what? I just showed this Seller I am in fact 100% correct about this card being a 290 in a 290X box.
> 
> Check it. Card matches serial # on the box:
> 
> 
> 
> Card says 290. Box says 290X. He got screwed by where-ever he bought this from. NOT ME. lol


It's possible that by mistake he flashed that card using an R9 290 BIOS.


----------



## yawa

So memory is weird on these cards, eh? I start losing points in benches any time I clock over 5950 MHz.

Oh well.


----------



## sugarhell

It's not 'weird'. It's called ECC.
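For context: GDDR5 has built-in error detection (EDC), and when the memory clock is pushed past its stable point the controller retransmits corrupted transfers, so effective bandwidth (and bench scores) can fall even as the clock rises. Here's a toy model of that shape; the error coefficient and "stable clock" are invented for illustration, not measured from any card:

```python
def effective_bandwidth(clock_mhz, stable_mhz=1500.0, err_per_mhz=0.002):
    """Toy model: raw bandwidth scales with clock, but every MHz past
    the stable point adds retransmit overhead (made-up coefficient),
    eating into the gain and eventually reversing it."""
    over = max(0.0, clock_mhz - stable_mhz)
    retry_fraction = min(1.0, err_per_mhz * over)
    return clock_mhz * (1.0 - retry_fraction)

for mhz in (1450, 1500, 1550, 1600):
    print(mhz, round(effective_bandwidth(mhz), 1))
```

The exact curve varies per card, but the takeaway matches the post above: past some memory clock, "higher" stops meaning "faster".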


----------



## HoneyBadger84

Quote:


> Originally Posted by *JordanTr*
> 
> It's possible that by mistake he flashed that card using an R9 290 BIOS.


No no. Read the sticker on the card again. It says it's a 290  lol. Both BIOSes read out as a 290, and it's unflashable to a 290X (that's the first thing I checked when I got it, whether I could fix it myself).


----------



## Arizonian

Quote:


> Originally Posted by *Ramzinho*
> 
> YAY my 290X is here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Validation
> 
> one question though. GPU-Z doesn't seem to detect my Fan RPM. although i could hear it go WAY WAY Up during valley benchmark. "correction, it shows how much % it's running but not how fast (RPM) It's at"
> 
> now give me tips to run this thing right. i know there are tons of options. just throw me your best ones. what software to use. i know AB is a good one. but i heard it conflicts with CCC. so just throw your experience at me and let me get a fairly cooler gpu.


Congrats - added









Sapphire?
Quote:


> Originally Posted by *SLOWION*
> 
> Just picked this R9 290 up on eBay for a little over $300 shipped
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It was used for about two months according to the seller
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/f2sw2/


Congrats - added









Your Tri-X in pic, thanks.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Ramzinho*
> 
> You know what. i was always skeptical about people bragging about "silent" computers. and i can't stress how beautiful it's. i've changed my fans to sp120 and af120s and i love the silence. i can't even put words to how much i love them.
> 
> This thing begs to be water cooled. time to list my Old GPU on the local market and see how it goes.
> 
> i can't really run the fan at 100%.. it's just absurd


I missed this reply til just now when searching for another one :-D LOL!

I love it when people actually do put the stock blower fan up to 100% and immediately have that "OMG!" moment







Too funny









I game with headphones on, and I can just BARELY hear them because of how good the noise cancelling is on the ear cups.
Quote:


> Originally Posted by *SLOWION*
> 
> Just picked this R9 290 up on eBay for a little over $300 shipped
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It was used for about two months according to the seller


Curious, can anyone who's gotten a brand-new-in-box Sapphire card tell me for sure: does Sapphire actually send them out like that, no anti-static bag? Cuz the Sapphire R9 290X Tri-X OC card I got came like that too, just foam, no bag. I had to put one on it (I keep a stock of anti-static bags around for cases just like this) before I sent it off after reselling it, cuz I'm paranoid.


----------



## Imprezzion

Aaaw man, what was the fix again for that annoying-as-hell PCI-E 1.1 bug...

My reference XFX 290 @ 290X has it again. It'll say in GPU-Z and the like, under load, PCI-E 3.0 x16 @ x16 1.1.

Not sure if it actually runs at 1.1 though. It might actually be running at 3.0, but just to be sure...


----------



## Ramzinho

I will wait till the new series is out, then get another one and blocks... they are too expensive now, $120 for the cheapest.


----------



## Dasboogieman

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I missed this reply til just now when searching for another one :-D LOL!
> 
> I love it when people actually do put the stock blower fan up to 100% and immediately have that "Holy $*%!" moment
> 
> 
> 
> 
> 
> 
> 
> Too funny
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I game with headphones on, and I can just BARELY hear them because of how good the noise cancelling is on the ear cups.
> Curious, can anyone that's gotten a brand new in box Sapphire Card tell me for sure, does Sapphire actually send them out like that, no static proof bag? Cuz the Sapphire R9 290X Tri-X OC card I got in came like that too, just foam, no bag. I had to put one on it (I keep a stock of static proof bags around for cases just like this) before I sent it off after reselling it, cuz I'm paranoid.


No, mine came in a static proof bag.

Lol, I love how we're so voltage-happy compared to the CPU overclocking guys. I guess GPUs aren't really kept for longer than 2 years here, so we don't feel the effects of degradation or voltage death.

I know that according to Intel (who arguably have the most enviable manufacturing process at the moment), continued usage of their CPUs beyond a specified voltage spec (I think it was 1.5 V for Sandy, 1.45 V for Ivy and 1.4 V for Haswell) results in permanent instability at stock clock speeds (something about the CPU gradually getting leakier and shifting voltage planes). I can imagine the same fate awaits GPUs run at very high voltages regardless of cooling. Obviously cooling helps mitigate the damage, but I imagine the high energy degrades things like the interconnects on the die.

That being said, it would be cool if GPUs one day allowed control over voltages like the IMC. Sure, there will always be people who shoot for the moon, but as long as the documentation is accurate I don't see an issue, though it seems NVIDIA/AMD are loath to divulge the engineering specifications of their GPUs, unlike Intel.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Imprezzion*
> 
> Aaaw man what was the fix again for that annoying as hell PCI-E 1.1 bug...
> 
> My reference XFX 290 @ 290X has it again.. It'll say in GPU-Z and such under load PCI-E 3.0 x16 @ x16 1.1.
> 
> Not sure if it actually runs on 1.1 though. Might just be running 3.0 actual but just to be sure...


That's just power saving, when the GPU goes under load it changes.

It's normal


----------



## HoneyBadger84

Quote:


> Originally Posted by *Ramzinho*
> 
> i will wait till the new series is out .. get another one and blocks.. they are too expensive now. 120 on the cheapest


Well, the way I tell people to do it over at the NV forums, in reference to fan speed vs. noise, is: crank the fan up slowly using Afterburner or Trixx till you can't stand it anymore, turn it down a bit so it's "bearable", set that as your maximum fan speed, and have it kick in to that speed at around 55-60C. That should keep the card a lot cooler than the stock fan profile, especially on 290/290Xs. I mean, they don't even go up to the supposed max the stock profile has, which is 55%, at 90C. What's up with that?
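That "find your bearable max, then kick in around 55-60 °C" recipe maps to a simple custom curve you'd build in Afterburner or Trixx. A sketch of the idea; the break points and idle speed below are assumptions to tune to taste, not values from any stock profile:

```python
def fan_percent(temp_c, max_bearable=70, kick_in_c=57, idle_pct=25):
    """Curve per the advice above: idle speed below the kick-in
    temperature, your personal bearable maximum at/above it, with a
    short linear ramp so the transition isn't a sudden leaf-blower."""
    ramp_start = kick_in_c - 10          # begin ramping 10 C early
    if temp_c <= ramp_start:
        return idle_pct
    if temp_c >= kick_in_c:
        return max_bearable
    frac = (temp_c - ramp_start) / (kick_in_c - ramp_start)
    return idle_pct + frac * (max_bearable - idle_pct)

for t in (40, 50, 55, 60, 75):
    print(t, round(fan_percent(t)))
```

Compare that with the stock profile described above, which reportedly tops out at 55% even at 90 °C: a curve like this reaches your chosen maximum well before the card gets anywhere near throttling temperatures.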


----------



## Imprezzion

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That's just power saving, when the GPU goes under load it changes.
> 
> It's normal


Read again. It stays at 1.1 under load. A lot of my 290-series cards did this, especially with non-stock BIOSes.


----------



## Ramzinho

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Well, the way I tell people to do it over at NV Forums in reference to fan speed vs noise is, crank the fan up slowly using Afterburner or Trixx til you can't stand it anymore, turn it down a bit so it's "bearable", set that as your maximum fan speed, and have it kick in to that speed at around 55-60C. That should keep the card a lot cooler than the stock fan profile, especially on 290/290Xs, I mean they don't even go up to the supposed-max the stock profile has, which is 55%, at 90C, what's up with that


I'm using a linear fan profile. It goes up to 55% and my GPU was at 75 °C max during my last 3-hour session.

Next time I'll see how the VRMs do..


----------



## Sgt Bilko

Quote:


> Originally Posted by *Imprezzion*
> 
> Read again. It stays at 1.1 under load. A lot of my 290 series cards did this. Especially with non-stock BIOS's.


Tried re-installing GPU-Z?

If it was running at 1.1 then you'd know it.


----------



## rene mauricio

If I wanted to buy Fujipoly for my WindForce OC card, what size sheet would I need? Oh and I guess I should also ask (for the sake of curiosity) if anyone knows GIGABYTE's stance on the removal of coolers.

Might as well add this to the mix:


----------



## HoneyBadger84

Quote:


> Originally Posted by *Ramzinho*
> 
> i'm using a linear fan profile. it goes up to 55% and my gpu was at 75c max during my last 3 hours session
> 
> next time i'll see how the vrm goes..


55% is still barely audible, at least to me... it's noticeable but by no means loud. What brand of 290X did you get? The VisionTek/AMD/whatever-it-is R9 290 I got & the 2 XFX 290Xs I had were about the same at 55%. I'd give 65% or 70% a try, see if it's too loud for ya. 70-75% is usually the point at which people start going "okay, that's getting ridiculous"... and of course 85-100% is leaf-blower mode.


----------



## Kittencake

Quote:


> Originally Posted by *HoneyBadger84*
> 
> 55% is still barely audible, at least to me... it's noticable but by no means loud. What brand of 290X did you get? The VisionTek/AMD/whateveritis R9 290 I got & the 2 XFX 290Xs I had were about the same at 55%. I'd give 65% or 70% a try, see if it's too loud for ya. 70-75% is usually the point at which people start going "okay that's getting ridiculous"... and of course 85-100% is leaf-blower mode.


yeah at 85% my desk actually vibrates


----------



## Ramzinho

Quote:


> Originally Posted by *HoneyBadger84*
> 
> 55% is still barely audible, at least to me... it's noticable but by no means loud. What brand of 290X did you get? The VisionTek/AMD/whateveritis R9 290 I got & the 2 XFX 290Xs I had were about the same at 55%. I'd give 65% or 70% a try, see if it's too loud for ya. 70-75% is usually the point at which people start going "okay that's getting ridiculous"... and of course 85-100% is leaf-blower mode.


It's a reference Sapphire, Elpida memory.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Kittencake*
> 
> yeah at 85% my desk actually vibrates


I have my computer on a completely separate desk from my keyboard/mouse specifically because the vibration from 3-4 video cards running full throttle on the fans, even with them being extra-secured by the motherboard's secure-clips + bracket screws + the slot itself, is noticeable.

Do they sell like... rubber vibration reduction kits like they do for fans, but for video card brackets? lol That'd be awesome.


----------



## Kittencake

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I have my computer on a completely separate desk from my keyboard/mouse specifically because the vibration from 3-4 video cards running full throttle on the fans, even with them being extra-secured by the motherboard's secure-clips + bracket screws + the slot itself, is noticeable.
> 
> Do they sell like... rubber vibration reduction kits like they do for fans, but for video card brackets? lol That'd be awesome.


I have my PC on my desk cause I will not put it on the carpet, and the power button is out of munchkin range


----------



## HoneyBadger84

Quote:


> Originally Posted by *Ramzinho*
> 
> It's a reference Sapphire, Elpida memory.


Is that on the "quiet (left)" or "uber (right)" BIOS setting? I've seen Sapphires read out as just ATI & some read out like this one I have in at the moment does (Tri-X OC R9 290):



Referring specifically to the Subvendor ID, of course.


----------



## Ramzinho

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Is that on the "quiet (left)" or "uber (right)" BIOS setting? I've seen Sapphires read out as just ATI & some read out like this one I have in at the moment does (Tri-X OC R9 290):
> 
> 
> 
> Referring specifically to the Subvendor ID, of course.


it's on the setting far from the end.. closer to the case front.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Ramzinho*
> 
> it's on the setting far from the end.. closer to the case front.


So it's on the Uber BIOS already. That's the one with a slightly more aggressive fan curve if left on auto (it still caps out at 55% at ~92C, though), if memory serves... I forget what other advantages the Uber BIOS has.

I'm off to sleep more, gotta be up to get ready for work in 2 hrs lol
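For illustration, the capped "auto" fan behavior described above can be sketched roughly like this. This is a hypothetical model, not Hawaii's actual firmware logic; the 55% cap and ~92C throttle point come from the post, the idle duty and ramp start are assumptions:

```python
# Hypothetical sketch of a capped "auto" fan curve: duty rises with core
# temperature but never exceeds the BIOS cap (55%) before the throttle point.

def auto_fan_percent(temp_c, idle_pct=20, cap_pct=55,
                     ramp_start=50, throttle_temp=92):
    """Return an approximate fan duty (%) for a given core temperature."""
    if temp_c <= ramp_start:
        return idle_pct
    if temp_c >= throttle_temp:
        # beyond this the card drops clocks instead of raising the fan
        return cap_pct
    # linear ramp between ramp_start and throttle_temp
    span = throttle_temp - ramp_start
    pct = idle_pct + (cap_pct - idle_pct) * (temp_c - ramp_start) / span
    return min(int(round(pct)), cap_pct)

print(auto_fan_percent(40))   # idle -> 20
print(auto_fan_percent(92))   # at throttle temp -> capped at 55
```

This is why a reference card can sit at 92C while the fan never passes 55%: the cap, not the temperature, is the binding limit until you take manual control.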


----------



## King PWNinater

I need an inf file of the 14.4 amd driver. My cards won't work without it. Anyone able to send/post the file?


----------



## Radmanhs

Has anyone had the problem of lines running vertically on the screen while gaming when certain colors flash up? It's not OCed.


----------



## yawa

Quote:


> Originally Posted by *Radmanhs*
> 
> Has anyone had the problem of lines running vertically on the screen while gaming when certain colors flash up? It's not OCed.


Dark Souls II has an issue on AMD cards where the textures have wavy pink lines going through them. It's weird because it's only on a few specific levels (Heide's Tower being the most noticeable), but otherwise it runs fine.

The developer claims it's a driver issue, but who knows?


----------



## Radmanhs

Well, I just changed monitors to see if it was my 8-year-old TV that was the problem; now I got this


----------



## ZealotKi11er

Quote:


> Originally Posted by *Radmanhs*
> 
> Well, I just changed monitors to see if it was my 8-year-old TV that was the problem; now I got this


To me it looks like a cable problem. If it's the GPU, then it's probably dead. I personally don't think modern GPUs even work in that state anymore; they would BSOD out of that image.


----------



## Infinite Jest

Definitely some funk with Dayz and amd drivers. ATOC makes for some funk grass and trees. Hope they get that fixed.


----------



## Jos8coreHead

All shot with my Nokia 909 (1020)

Gcard:













Can't wait to get my other one in a few weeks with waterblocks and XDMA the feck out of them. I need to clean my loop anyway; it's tinted yellow in my acrylic res :X

User info:



So what? I send these here and wait for a response?








*ASUS Crosshair V Formula Club*


----------



## Razzaa

I have a weird issue. When I run my card at stock clocks it black screens. If I run it at 1100/1500 it runs flawlessly. I have been looping Valley and even scored 22768 in Sky Diver, but as soon as I reset back to stock... boom, black screen. Is that weird or what?


----------



## Gualichu04

I think I may have solved the black screen issue, but I do not like how I did it. In MSI Afterburner I set the core voltage to +50 and power limit to +50, and it did not black screen through the Heaven benchmark, while before it would at scene 20-22. But the VRM got up to 100C, which seems too hot. Core temp was 80C max. I am using the stock XFX DD cooler since I won't water cool this GPU till I know I don't have to RMA it.


----------



## Razzaa

Quote:


> Originally Posted by *Gualichu04*
> 
> I think I may have solved the black screen issue, but I do not like how I did it. In MSI Afterburner I set the core voltage to +50 and power limit to +50, and it did not black screen through the Heaven benchmark, while before it would at scene 20-22. But the VRM got up to 100C, which seems too hot. Core temp was 80C max. I am using the stock XFX DD cooler since I won't water cool this GPU till I know I don't have to RMA it.


I have to do the same thing to be stable. I am running at +50% and +90mV. My clocks are 1100 core / 1500 memory, but my VRMs never get above 65C. Yours are unusually high in my opinion.
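For anyone comparing these numbers: the +50/+90 offsets are millivolts added on top of the card's stock voltage, so the arithmetic is simple. A quick sketch, where the 1.25 V stock VDDC is an assumption (actual stock voltage varies per card and ASIC):

```python
# Software voltage offsets (Afterburner/TriXX) are millivolts added on top
# of stock VDDC. The 1.25 V stock figure here is an assumed ballpark.

STOCK_VDDC = 1.25  # volts (assumed; varies per card)

def effective_vddc(offset_mv):
    """Approximate load voltage for a given software offset in mV."""
    return STOCK_VDDC + offset_mv / 1000.0

print(round(effective_vddc(50), 3))  # +50 mV -> 1.3
print(round(effective_vddc(90), 3))  # +90 mV -> 1.34
```

The VRM temperature difference in these two posts comes from cooling, not voltage: a +50 mV card on a weak VRM plate can run far hotter than a +90 mV card with good VRM contact.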


----------



## pdasterly

Quote:


> Originally Posted by *yawa*
> 
> Ugh. After finally stabilizing 4.7 GHZ on my i7 4790k (passed 10 hours of intel extreme test and the four hours I left it on Prime) pushing the best clocks I've managed to get ever on this card at 1247Mhz/1500MHZ ( don't worry I decided to not pursue going over +200mv) I decided to switch over to the new Catalyst 14.7 RC.
> 
> So to make a long story short, after DDU'd the old drivers and installed 14.7 everything ran fine last night through a few benches. When I woke up this morning and tried to boot not one minute after I got into Windows and got hit with the dreaded black screen. Now it keeps happening within 45 seconds to a 1 minute 30 after booting.
> 
> So as you can imagine I'm a little stuck.
> 
> So a little advice is needed here as I've never gotten one before. Is it driver related? Memory related? A combination? Did I Bork myself with those clocks?
> 
> A little help would be appreciated.


Same thing happened to me with 14.7.
Had to boot into Safe Mode, run DDU, reboot, go into Device Manager and delete the display adapter, reboot, then reinstall the AMD driver.


----------



## Gualichu04

Quote:


> Originally Posted by *Razzaa*
> 
> I have to do the same thing to be stable. I am running at +50% and +90mV. My clocks are 1100 core / 1500 memory, but my VRMs never get above 65C. Yours are unusually high in my opinion.


It's gotta be the crappy XFX DD cooler; only a small metal plate is used to cool the VRM and memory. As long as it doesn't black screen I will just build the water loop for both GPUs.


----------



## pdasterly

Quote:


> Originally Posted by *LazarusIV*
> 
> Hey guys, I've got a bit of an issue here... Let me lay it out:
> 
> I recently got home after a couple of months and my fixed R9 290 was waiting for me. I installed it with the 14.4 drivers after using DDU to clean up the old ones. I went to play Battlefield 4 and it kept giving me these Direct X errors. I did a little bit of reading and saw that MSI Afterburner could be causing the issues. I uninstalled the latest one (3.0.1 I think) and installed an older version, 3.0.0 and it seemed to help with the issue. I was happily playing BF4 again with my R9 290!
> 
> Today I decided to install the second R9 290 I had acquired recently. Once again with DDU I uninstalled the drivers, installed the 2nd GPU, then started back up and installed the drivers again with both cards in there. Might've been unnecessary but I figured better safe than sorry. The install went without a hitch but when I went to run BF4 it was giving me all sorts of stupid low memory and Direct X issues. Low memory!? Are you kidding me!?!? I did a little research and couldn't find anything in particular that was helpful. I increased my pagefile size, which barely stalled the crash. I got in game and picked a class and then it crashed with the same Direct X or low memory issue. I got rid of my pagefile and I've still got the same issue. I'm goin' nuts here, can someone help me figure this out? I feel like this is a fairly common problem with BF4 but I can't for the life of me figure it out!
> 
> Thanks everyone, I really appreciate any input!


Set page file to auto; I had the same issue.


----------



## Mega Man

Quote:


> Originally Posted by *LazarusIV*
> 
> Hey guys, I've got a bit of an issue here... Let me lay it out:
> 
> I recently got home after a couple of months and my fixed R9 290 was waiting for me. I installed it with the 14.4 drivers after using DDU to clean up the old ones. I went to play Battlefield 4 and it kept giving me these Direct X errors. I did a little bit of reading and saw that MSI Afterburner could be causing the issues. I uninstalled the latest one (3.0.1 I think) and installed an older version, 3.0.0 and it seemed to help with the issue. I was happily playing BF4 again with my R9 290!
> 
> Today I decided to install the second R9 290 I had acquired recently. Once again with DDU I uninstalled the drivers, installed the 2nd GPU, then started back up and installed the drivers again with both cards in there. Might've been unnecessary but I figured better safe than sorry. The install went without a hitch but when I went to run BF4 it was giving me all sorts of stupid low memory and Direct X issues. Low memory!? Are you kidding me!?!? I did a little research and couldn't find anything in particular that was helpful. I increased my pagefile size, which barely stalled the crash. I got in game and picked a class and then it crashed with the same Direct X or low memory issue. I got rid of my pagefile and I've still got the same issue. I'm goin' nuts here, can someone help me figure this out? I feel like this is a fairly common problem with BF4 but I can't for the life of me figure it out!
> 
> Thanks everyone, I really appreciate any input!


I have had this issue since I got BF4, on both 290Xs and 7970s; it's a memory leak. Besides that, what res are you pushing?

Quote:


> Originally Posted by *sugarhell*
> 
> They foolproof their cards for people like you. You are a newbie? Dont go over stock. You are a pro? You put the limits


Quote:


> Originally Posted by *tsm106*
> 
> They told me if I went over 1.25v the time space continuum would shatter. Omg!


hahaha both so true !
Quote:


> Originally Posted by *LazarusIV*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Blue Dragon*
> 
> I had to install drivers with only one GPU in, then power down and put in second GPU, then reboot to get my 290's going. but think that just affected my asus board, didn't have to do that with my 7870's and msi board.
> 
> 
> 
> I had read an account where a guy uninstalled the drivers then installed both cards and installed the drivers like new and that was the best way he could do it so I figured I'd just do that too. The problem occurred with one card and with two so I'm not sure where that points...
> 
> Quote:
> 
> 
> 
> Originally Posted by *HoneyBadger84*
> 
> For Crossfire & SLi both, usually, after cleaning out previous drivers & putting the new cards in together, you have to let Windows Boot up, then it'll ask for a reboot after installing some Standard VGA Adapter for the second card... then you can reboot & install drivers & everything works fine. That's been my experience at least, with these cards I think it can happen with Crossfire, definitely happens with TriFire every time.
> 
> 
> I did wait for Windows to install its generic driver (which didn't require a reboot), then I installed the Catalyst 14.4 drivers after that. Everything else works just fine except for BF4 so I'm wondering if it's just an issue with BF4. If that's the case I'm hoping others have had this issue so there's a fix somewhere. Is it MSI Afterburner causing the issue, should I roll back to Afterburner Beta 18 or 19 or maybe even Catalyst Driver 13.12? Is it HWInfo64 causing an issue? Is it a VRAM or RAM memory leak issue with BF4? Anyone else had these issues?

The paging file will not help VRAM at all.
Quote:


> Originally Posted by *HoneyBadger84*
> 
> So guess what? I just showed this Seller I am in fact 100% correct about this card being a 290 in a 290X box.
> 
> Check it. Card matches serial # on the box:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Card says 290. Box says 290X. He got screwed by where-ever he bought this from. NOT ME. lol


Wow, nice find, and GL with it (the return)! Hope it goes quick!
Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ramzinho*
> 
> You know what, I was always skeptical about people bragging about "silent" computers, and I can't stress how beautiful it is. I've changed my fans to SP120s and AF120s and I love the silence; I can't even put into words how much I love them.
> 
> This thing begs to be water cooled. Time to list my old GPU on the local market and see how it goes.
> 
> I can't really run the fan at 100%.. it's just absurd
> 
> 
> 
> I missed this reply til just now when searching for another one :-D LOL!
> 
> I love it when people actually do put the stock blower fan up to 100% and immediately have that "OMG!" moment
> 
> 
> 
> 
> 
> 
> 
> Too funny
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I game with headphones on, and I can just BARELY hear them because of how good the noise cancelling is on the ear cups.
> Quote:
> 
> 
> 
> Originally Posted by *SLOWION*
> 
> Just picked this R9 290 up on eBay for a little over $300 shipped
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It was used for about two months according to the seller
> 
> 
> 
> 
> Curious, can anyone that's gotten a brand new in box Sapphire Card tell me for sure, does Sapphire actually send them out like that, no static proof bag? Cuz the Sapphire R9 290X Tri-X OC card I got in came like that too, just foam, no bag. I had to put one on it (I keep a stock of static proof bags around for cases just like this) before I sent it off after reselling it, cuz I'm paranoid.

IIRC that foam is supposed to be anti-static, but yeah, most still come in bags.
Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HoneyBadger84*
> 
> I missed this reply til just now when searching for another one :-D LOL!
> 
> I love it when people actually do put the stock blower fan up to 100% and immediately have that "Holy $*%!" moment
> 
> 
> 
> 
> 
> 
> 
> Too funny
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I game with headphones on, and I can just BARELY hear them because of how good the noise cancelling is on the ear cups.
> Curious, can anyone that's gotten a brand new in box Sapphire Card tell me for sure, does Sapphire actually send them out like that, no static proof bag? Cuz the Sapphire R9 290X Tri-X OC card I got in came like that too, just foam, no bag. I had to put one on it (I keep a stock of static proof bags around for cases just like this) before I sent it off after reselling it, cuz I'm paranoid.
> 
> 
> 
> No, mine came in a static proof bag.
> 
> Lol, I love it how we're so voltage happy compared to the CPU overclock guys. I guess GPUs aren't really kept for longer than 2 years here so we don't feel the effects of degradation or voltage death.
> 
> I know that according to Intel (who arguably has the most enviable manufacturing process at the moment), continued usage of their CPUs beyond a specified voltage spec, (I think it was 1.5V for Sandy, 1.45 for Ivy and 1.4 for haswell) results in permanent instability at stock clockspeeds (something about the CPU gradually getting leakier and shifting voltage planes). I can imagine the same fate awaits GPUs which are run at very high voltages regardless of cooling. Obviously cooling helps mitigate the damage but I imagine the high energy degrades stuff like the interconnects on the die.
> 
> That being said, it would be cool if GPUs, one day, allowed control over voltages like the IMC. Sure there will always be people who shoot for the moon but as long as the documentation is accurate then I don't see an issue, though it seems NVIDIA/AMD are loathe to divulge the engineering specifications of their GPUs unlike Intel.

Nah, AMD does not have that issue. I have pushed 1.7V on my FX (the BIOS limit at the time), and others, since they fixed that BIOS limit, have pushed 1.8V. This does not include the LN2 users, only water; a few have gone above 2V on LN2 and had only mild degradation.


----------



## HoneyBadger84

Jos8, question, why purple tubing? Or is the coolant opaque purple?


----------



## Arizonian

Quote:


> Originally Posted by *Jos8coreHead*
> 
> All shot with my Nokia 909 (1020)
> 
> Gcard:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can't wait to get my other one in a few weeks with waterblocks and XDMA the feck out of them, needing to clean my loop anyway, tinted yellow in my acrylic res :X
> 
> User info:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> So what? I send these here and wait for a response?
> 
> 
> 
> 
> 
> 
> 
> 
> *ASUS Crosshair V Formula Club*


Congrats - added









Let me know when you get your other one and water blocks. I'll update the list.


----------



## Imprezzion

Ok, now I'm really getting angry lol.

First, my card unlocks just fine to a 290X, but it seems (at least in GPU-Z and such) to hang at PCI-E 1.1 x16 *under load*. It will not go to 3.0 or even 2.0, while my board + CPU should do it just fine and do so with a GTX 780.
Never mind, I fixed it. It seems that when my mainboard's NB PCI-E config is set to Auto mode it will hang at PCI-E 1.1. When I force it to run 3.0 it actually goes up to x16 3.0 and idles at x1 1.1. A bit strange that Auto doesn't work on a 3.0 combo..

Still can't flash another BIOS though..

ATiWinFlash just stops working when I press ''Program''.
ATiFlash on a Win98-boot USB just tosses up a page fault error with every command I enter.
Tried 3 different USB sticks. Tried 4 different versions of ATiFlash; nothing works. Just page fault errors.

So basically I can't flash an AMD GPU on my system. End of story. I can flash Nvidia just fine from the same Win98-boot USB, and even in Windows with nvflash 5.127, which I always do, but AMD is unflashable...

What now?!
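For reference, the usual ATIFlash sequence from a DOS boot looks like the following. This is a sketch from memory; the adapter index (0) and filenames are placeholders, and `-f` bypasses the subsystem/device ID checks, so use it with care:

```
atiflash -i
:: lists detected adapters; confirms ATIFlash can see the card at all
atiflash -s 0 backup.rom
:: saves adapter 0's current BIOS before touching anything
atiflash -p 0 new.rom
:: programs new.rom to adapter 0
atiflash -f -p 0 new.rom
:: -f forces past subsystem/device ID mismatch checks
```

If even `atiflash -i` page-faults, the problem is the DOS environment (HIMEM/EMM386 config or the USB boot image) rather than the card, which would fit the "every command fails" symptom above.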


----------



## HoneyBadger84

Quote:


> Originally Posted by *Imprezzion*
> 
> Ok, now i'm really getting angry lol.
> 
> First, my card unlocks just fine to 290X but it seems (at least in GPU-Z and such) to hang on PCI-E 1.1 x16 *under load*. It will not go to 3.0 or even 2.0 while my board + CPU should do it just fine and do it just fine with a GTX780.
> 
> Now, am trying to flash some different BIOS's to the card to see if newer or older or different branded ones fix it but I can't flash any BIOS to it either..
> 
> ATiWinFlash just stops working when I press ''Program''.
> ATiFlash on a win98boot USB just tosses up a page fault error with every command I enter..
> Tried 3 different USB's. Tried 4 different versions of AtiFlash, nothing works. Just page fault error.
> 
> So basically I can't flash a AMD GPU on my system. End of story. I can flash nvidia just fine from the same win98boot USB and even in windows with nvflash 5.127 which I always do but AMD is unflashable...
> 
> What now?!


I assume you tried completely uninstalling, yanking the card, installing another card, then uninstalling it and putting the problem child back in with another clean driver install? That's cray. This is why I never hunted down unlockable 290s when I came back to things 2 months ago.


----------



## Imprezzion

Well, I fixed the PCI-E issue. It seems to be an issue with my board. Whenever my PCI-E mode is set to ''Auto'' it won't detect it as 3.0, but when I force it to run 3.0 in the BIOS it works fine.

Still can't flash it though. Read my edited post above









It has a stock reference XFX 290X BIOS on it, but I want the ASUS one for more volts to bench it, as it runs pretty decent core clocks on just +100mV (like 1175MHz stable, easy 1200+ bench).

And by installing another card, do you mean another AMD or an Nvidia? I can either install my other 290X, a WindForce 3X, or my golden GTX 770..


----------



## KeepWalkinG

I'm thinking of getting a water block for my R9 290 Vapor-X, but I only have +200mV available in TriXX...

Is there a way to apply more voltage?
With +200mV my card does not go above 80°C.


----------



## RednBlack

Is the fan speed on the R9 290 Tri-X supposed to be locked? I can't control it manually at all. If I set it to 100% in Afterburner, it reports a fan speed of 100%, but the actual fans don't change speed at all.


----------



## Imprezzion

No, it's not supposed to be locked at all mate.

@KeepWalkinG, the Vapor-X uses a non-reference PCB. No blocks are available for it unless you wanna go universal.


----------



## RednBlack

I tried using TriXX and Catalyst, but the fan speed still stays the same. I have the latest driver installed, so I don't know what the problem could be. I just finally solved the frustrating black screen issue and now I've run into another problem.


----------



## Imprezzion

Does the fan speed change when you leave it on Auto?
Try forcing it in CCC as well, just to rule out software.


----------



## RednBlack

Quote:


> Originally Posted by *Imprezzion*
> 
> Does the fanspeed change when you leave it on Auto?
> Try forcing it in CCC as well? Just to rule out software.


Even if it shows different percentages on auto, the fans don't actually change. I tried using AMD Overdrive in CCC and set it to 100%, but it didn't change.


----------



## devilhead

soon it will be available http://www.ekwb.com/news/500/19/New-water-cooling-gear-in-the-works-July-2014/


----------



## Imprezzion

Quote:


> Originally Posted by *devilhead*
> 
> soon it will be available http://www.ekwb.com/news/500/19/New-water-cooling-gear-in-the-works-July-2014/


He's talkin' about the Vapor-X block btw.









And @RednBlack, can you check the fan connector and wiring without disassembling the card? I think so, cause on my Tri-X I could reach the fan connector pretty easily..
Also, if it doesn't have warranty stickers on the screws, just open her up and see if there's any bad wiring, particularly on the PWM wiring.

It really sounds like a hardware issue. Either a busted PWM controller on the card or faulty wiring..


----------



## Blue Dragon

Quote:


> Originally Posted by *RednBlack*
> 
> Is the fan speed on the R9 290 Tri-X supposed to be locked? I can't control it manually at all. If I set it to 100% in Afterburner, it reports a fan speed of 100%, but the actual fans don't change speed at all.


I had the same problem with my 7870s and Afterburner (I forget which version). I tried other things, but in the end the only thing I could get to work was to open AB and hit the reset (to default) button several times (once wasn't enough for some reason), and I'd get control of my fans back. But I had to repeat that every time I booted. That's on my second rig, so I haven't gotten back to it to find a better solution.


----------



## devilhead

Quote:


> Originally Posted by *Imprezzion*
> 
> He's talkin' about the Vapor-X block btw.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And @ RednBlack, can you check the fan connector and wiring without disassembling the card? I think so cause on my Tri-X I could reach the fan connector pretty easily..
> Also, if it doesn't have the warranty stickers on the screws just open her up and see if there's any bad wiring or something on particularly the PWM wiring.
> 
> It really sounds like a hardware issue. Either busted PWM controller on the card or faulty wiring..


Sapphire Radeon R9 290X Vapor-X FC EK-FC R9-290X VPX H1 of August


----------



## HoneyBadger84

RednBlack, the last time I saw someone having this issue they had a wonky BIOS on their card causing the fan speed to be "stuck". It was a rather simple fix. Which BIOS switch position are you using? Towards the slot, or towards the power connectors on the card?

What is the BIOS version listed on your R9 290 Tri-X? Is it a regular or an OC? Post a shot of GPU-Z's main screen if you can, and compare it to mine for my Sapphire R9 290 Tri-X OC:


----------



## LazarusIV

Quote:


> Originally Posted by *pdasterly*
> 
> Set page file to auto, had same issue


That actually worked, fantastic! Thanks for the help! Now for the next problem...

Quote:


> Originally Posted by *Mega Man*
> 
> i have had this issues since i got BF4 on both 290xs and 7970s, memory leak besides that what res are you pushing ?
> vram paging file will not help this at all.


I'm on my sig monitor, a QNIX 1440p overclocked to 96Hz. The auto page file setting does allow me to play BF4 just fine now, but I've got another problem: a random stutter/lag issue, and I'm not sure why. It happens in BF4, Kingdoms of Amalur: Reckoning (not the most graphically intensive game), and the UFO monitor refresh rate tester page. I'm going to test other games as well, but so far it seems to happen with everything. It will lag (as in slow down, like it hit a snag) for just less than a second, then run just fine for about 2 or 3 seconds, then lag again. It's like a pattern; it just keeps doing it. I'll try to catch some video of it and post it; it's driving me crazy! What's that about!? So far my crossfire experience isn't the awesomest, any help would be greatly appreciated! Crossfire n00b here...


----------



## Mega Man

Quote:


> Originally Posted by *LazarusIV*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pdasterly*
> 
> Set page file to auto, had same issue
> 
> 
> 
> That actually worked, fantastic! Thanks for the help! Now for the next problem...
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> i have had this issues since i got BF4 on both 290xs and 7970s, memory leak besides that what res are you pushing ?
> vram paging file will not help this at all.
> 
> 
> I'm on my sig monitor, a QNIX 1440p overclocked to 96Hz. The auto page file setting does allow me to play BF4 now just fine but I've got another problem. Now I get a random stutter / lag issue and I'm not sure why. It happens in BF4, Kingdoms of Amalur: Reckoning (not the most graphically intensive game), and the UFO Monitor Refresh Rate tester page. I'm going to test it on other games as well but so far it seems to happen with everything. It will lag (as in slow down, like it hit a snag) for just less than a second then it'll run just fine for about 2 or 3 seconds then lag again. It's like a pattern, it just keeps doing it. I'll try to catch some video of it and post it, it's driving me crazy! What's that about!? So far my crossfire experience isn't the awesomest, any help would be greatly appreciated! Crossfire n00b here...

That lag is probably your PC using the pagefile.

As for 1440p, what's your AA set to?


----------



## RednBlack

@HoneyBadger84

I have the BIOS switch set to "Uber", farthest from the connections. I updated the BIOS to the latest in an attempt to solve the black screen issue. (It didn't work.)


----------



## ZealotKi11er

Quote:


> Originally Posted by *RednBlack*
> 
> @HoneyBadger84
> 
> I have the bios switch set to "Uber", farthest from the connections. I updated the bios to the latest in an attempt to solve the black screen issue. (It didn't work)


Flash a BIOS with lower memory overclock.


----------



## HoneyBadger84

Quote:


> Originally Posted by *RednBlack*
> 
> @HoneyBadger84
> 
> I have the bios switch set to "Uber", farthest from the connections. I updated the bios to the latest in an attempt to solve the black screen issue. (It didn't work)


Any word from anyone else that's updated to that BIOS having this issue? I can't do it to mine cuz it's gonna be shipping off to a new owner in 2 days. Did the fan control work previously, or were you not able to even test it out previously? Cuz my fan control... *double checks* based on the audible







change that occurred when I set it to 100%, it is working.

Mem clock ain't the problem I don't think...

Based on the number in your BIOS, it's almost as if it's half different, half the same as mine. Only a few letters/numbers are changed.


----------



## RednBlack

I didn't try using the fan control before flashing it. The black screen kind of made it hard to do anything.
Should I try a non-oc bios?


----------



## LazarusIV

So it seems like MSI Afterburner is the culprit... I uninstalled it and BF4 works perfectly fine now with page file disabled on my computer. Freakin' AB... I think I'll roll back to Beta 19, I had no issues with that whatsoever. Thanks everyone for your help!
Quote:


> Originally Posted by *Mega Man*
> 
> that lag is probably your pc using pagefile.
> 
> as for 1440p whats your aa set to ?


I disabled page file and got rid of my AB install and voila! Thanks for the info man, I appreciate it! My AA is set to none, the most I ever put it on is 2X but usually I turn that off.


----------



## heroxoot

Quote:


> Originally Posted by *RednBlack*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Imprezzion*
> 
> Does the fanspeed change when you leave it on Auto?
> Try forcing it in CCC as well? Just to rule out software.
> 
> 
> 
> Even if it shows different percentages on auto, the fans don't actually change. I tried using AMD Overdrive in CCC and set it to 100%, but it didn't change.

I had a similar problem. My 290X couldn't have the fan speed changed, period, nor did the % show. MSI had to make me a new BIOS, which I now have. The problem is completely solved.


----------



## HoneyBadger84

Well, since the seller finally caved on giving me a partial refund & letting me keep the reference 290, I'm gonna swap to it, since this Tri-X OC 290 has bids & will sell when the auction ends on Tuesday. I'll be back  Time to revive da leaf blower...

Another con for VisionTek: the inner box is harder to get out of the outer box than any other I've ever dealt with...


----------



## Imprezzion

Quote:


> Originally Posted by *HoneyBadger84*
> 
> another con to VisionTek: the inner box is harder than any other I've ever dealt with to get out of the outer box...


Hahaha. Try getting a Palit JetStream out of its box..









Well, it seems that even though 1175MHz is ''stable'' in benchmarks and stress tests, it gives slight checkerboard-like artifacts in games like Payday 2 and BF4.. So I'm debating whether I should just go with the stable 1160MHz or ramp up the volts a slight bit more. I'm already on +100mV, as this is after all an unlocked 290 @ 290X shaders, but temps for both VRM and core leave some room to play with voltages...


----------



## heroxoot

Quote:


> Originally Posted by *Imprezzion*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HoneyBadger84*
> 
> another con to VisionTek: the inner box is harder than any other I've ever dealt with to get out of the outer box...
> 
> 
> 
> Hahaha. Try getting a Palit JetStream out of its box..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well, it seems that even though 1175MHz is ''stable'' in benchmarks and stress tests, it gives slight checkerboard-like artifacts in games like Payday 2 and BF4.. So I'm debating whether I should just go with the stable 1160MHz or ramp up the volts a slight bit more. I'm already on +100mV, as this is after all an unlocked 290 @ 290X shaders, but temps for both VRM and core leave some room to play with voltages...

Glad to see I'm not the only one getting the checkerboard artifacts. I really prefer the old-style jetting polygons, but whatcha gonna do? Polygon artifacts are so much easier to notice.

Of course, I only get these checkerboards during overclocks if they are unstable; defaults are fine.


----------



## Mega Man

Quote:


> Originally Posted by *LazarusIV*
> 
> So it seems like MSI Afterburner is the culprit... I uninstalled it and BF4 works perfectly fine now with page file disabled on my computer. Freakin' AB... I think I'll roll back to Beta 19, I had no issues with that whatsoever. Thanks everyone for your help!
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> that lag is probably your pc using pagefile.
> 
> as for 1440p whats your aa set to ?
> 
> 
> 
> I disabled page file and got rid of my AB install and voila! Thanks for the info man, I appreciate it! My AA is set to none, the most I ever put it on is 2X but usually I turn that off.

FYI, the lagging was probably due to the system RAM (not VRAM) being paged out to the page file and back in.


----------



## LazarusIV

Quote:


> Originally Posted by *Mega Man*
> 
> fyi the lagging was probably due to the system ram ( not vram ) being switched to the page file and back out


It may have been, but I just reinstalled MSI Afterburner, and while it would let me play BF4, it caused that same lagging pattern as before... I played BF4 right before I installed AB and it wasn't lagging in the slightest; then, playing with AB, it was lagging. Not sure what's going on there, but I suppose I'll give ASUS' tweaker a shot. Maybe it'll jack up my gameplay less. Good lord.


----------



## tsm106

You're doing something wrong... No idea what at a glance, but pointing toward the apps as the culprit is heading in the wrong direction, IMO.


----------



## BradleyW

Quote:


> Originally Posted by *LazarusIV*
> 
> It may have been but I just reinstalled MSI Afterburner and it would let me play BF4 but it caused that same lagging pattern as before... I played BF4 right before I installed AB and it wasn't lagging in the slightest, then playing with AB it was lagging. Not sure what's going on there but I suppose I'll give ASUS' tweaker a shot. Maybe it'll jack up my gameplay less. Good lord.


Yeah, MSI AB caused a bit of bother for me too in BF4. In fact, a couple of games have not enjoyed MSI AB running in the background. Most games work fine with AB.


----------



## LazarusIV

Quote:


> Originally Posted by *tsm106*
> 
> You're doing something wrong... No idea what at a glance but pointing towards the apps as the culprit is heading in the wrong direction imo.


Looks like GPU Tweak is also causing the lag issue. What would you suggest? Remove drivers, take out one card, reinstall drivers, then add the second card?


----------



## tsm106

Quote:


> Originally Posted by *LazarusIV*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> You're doing something wrong... No idea what at a glance but pointing towards the apps as the culprit is heading in the wrong direction imo.
> 
> 
> 
> Looks like GPU tweak is also causing the lag issue. What would you suggest? Remove drivers, take out 1 card, reinstall drivers then add second card?

Did you already post about your config/setup? If so, what post # is it? If not, what is your setup: volts, clocks, CPU, motherboard (which slots the cards reside in, etc.)? How are you overclocking the GPUs, or how do you intend to, if at all?


----------



## King PWNinater

Hello. Whenever I run BF4 or Valley for a minute, the programs crash and I get a black screen, forcing me to restart. What does this mean?

2 r9 290s
FX 8350


----------



## Germanian

Quote:


> Originally Posted by *King PWNinater*
> 
> Hello. When ever I run bf4 or valley for a minute, the programs crash and I get the black screen, thus forcing me to restart. What does this mean?
> 
> 2 r9 290s
> FX 8350


Is this with default settings, no overclock?
If yes, then maybe your PSU is too weak. What is your power supply unit?


----------



## King PWNinater

They are default with a decrease of 50MHz on the core clock. I have a 1300 watt power supply, the EVGA one.


----------



## Nihsnek

Question for everyone:

Just got a new Sapphire 290X BF4 edition. Been using an HD 6950 for years. The 290X is causing my computer to completely freeze (no BSOD or black screen, just a freeze) when it's at minimal load (internet browsing). Temperatures are being monitored and are under 70 for browsing. No GPU overclocking being done.

*Would my PSU 12v rails be causing this?* I have this PSU: http://www.newegg.com/Product/Product.aspx?Item=N82E16817341017

It shows 25A on each 12V rail. Doesn't the 290x require more than that on the 12V?

I'm unsure if I should RMA the card..or get a new PSU. Any advice would be helpful.


----------



## aaroc

Quote:


> Originally Posted by *LazarusIV*
> 
> Looks like GPU tweak is also causing the lag issue. What would you suggest? Remove drivers, take out 1 card, reinstall drivers then add second card?


When I had an ASUS 6950 CU II, I had serious lag problems in GTA IV (unplayable) and other games, all caused by the ASUS fan/temp/OC utility for that card. Uninstalled it and could finally play GTA IV.


----------



## LazarusIV

Quote:


> Originally Posted by *tsm106*
> 
> Did you already post about your config/setup? What post # is it if so? If not, what is your setup volts, clocks, cpu, mb (which slots cards reside in etc)? How are you overclocking gpus, or how you intend to or not at all?


All the stuff in my sig rig is what I'm using right now. GPUs are at stock settings, as is the CPU, except that the power-saving features and turbo are turned off. Cards are in the first and third slots of my 990FXA-UD5; both are PCIe 2.0 x16. I will probably put a very mild OC on the GPUs once I get this performance issue sorted out. Later, when I put these GPUs under water, I'll get more ambitious, but for now the most I'd OC is 1000MHz. My PSU is the sig rig one, only 750W, so I won't push the CPU and GPUs very hard. I plan on eventually getting a 1000W unit just to be safe; I ordered some WC stuff today and I'd rather have too much power than not enough.

I originally posted a description of the problems I've had here.


----------



## tsm106

Quote:


> Originally Posted by *LazarusIV*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Did you already post about your config/setup? What post # is it if so? If not, what is your setup volts, clocks, cpu, mb (which slots cards reside in etc)? How are you overclocking gpus, or how you intend to or not at all?
> 
> 
> 
> All the stuff in my sig rig is what I'm using right now, GPUs are stock settings as is the CPU except that the power saving features and turbo are turned off. Cards are in the first and third slot of my 990FXA-UD5, both are PCIe 2.0 X16. I will probably put a very mild OC on the GPUs once I get this performance issue sorted out. Later when I put these GPUs under water I'll get more ambitious but for now the most I'd OC is 1000MHz, my PSU is the sig rig one, only 750W so I won't push the CPU and GPUs very hard. I plan on eventually getting a 1000W unit just to be safe, I ordered some WC stuff today and I'd rather have too much power than not enough.
> 
> I originally posted a description of the problems I've had here.

750W is obviously too little PSU. Depending on the level of overclock, your CPU alone can suck down over 400W. Your GPUs can get up there as well, and then there are your accessory bits. I'm listing things out in maximum situations so you have a frame of reference for the power draws of the components you own. You won't necessarily hit those maximums, and few do, but it's vitally important to know that the hardware can achieve them. At stock you are already near the max on the PSU just going by the component TDP specs.

Quote:


> Originally Posted by *LazarusIV*
> 
> Hey guys, I've got a bit of an issue here... Let me lay it out:
> 
> I recently got home after a couple of months and my fixed R9 290 was waiting for me. I installed it with the 14.4 drivers after using DDU to clean up the old ones. I went to play Battlefield 4 and it kept giving me these Direct X errors. I did a little bit of reading and saw that MSI Afterburner could be causing the issues. I uninstalled the latest one (3.0.1 I think) and installed an older version, 3.0.0 and it seemed to help with the issue. I was happily playing BF4 again with my R9 290!
> 
> Today I decided to install the second R9 290 I had acquired recently. Once again with DDU I uninstalled the drivers, installed the 2nd GPU, then started back up and installed the drivers again with both cards in there. Might've been unnecessary but I figured better safe than sorry. The install went without a hitch but when I went to run BF4 it was giving me all sorts of stupid low memory and Direct X issues. Low memory!? Are you kidding me!?!? I did a little research and couldn't find anything in particular that was helpful. I increased my pagefile size, which barely stalled the crash. I got in game and picked a class and then it crashed with the same Direct X or low memory issue. I got rid of my pagefile and I've still got the same issue. I'm goin' nuts here, can someone help me figure this out? I feel like this is a fairly common problem with BF4 but I can't for the life of me figure it out!
> 
> Thanks everyone, I really appreciate any input!


First, the low memory is related to your pagefile; BF4 wants more pagefile, and a good starting amount is over 8GB. DirectX crashes equal instability. I would start by proving the hardware works first. You can do that by confirming the GPUs work: test the scaling using GPU-Z's scaling test, and run benchmarks to confirm the system is working as it should by comparing scores. If your system cannot handle benches in stock form, you might want to make the PSU upgrade sooner rather than later.


----------



## RednBlack

Quote:


> I had a similar problem. My 290X couldn't have its fan speed changed at all, nor did the % show. MSI had to make me a new BIOS, which I now have. The problem is completely solved.


I tried all the BIOSes from TechPowerUp. How did you get MSI to send you a BIOS?


----------



## pdasterly

Interested in that new BIOS also


----------



## Widde

Quote:


> Originally Posted by *Nihsnek*
> 
> Question for everyone:
> 
> Just got a new Sapphire 290x BF4 edition. Been using a HD6950 for years. The 290x is causing my computer to completely freeze (no BSOD or black screen, just freeze) when it's at a minimal load (internet browsing). Temperatures are being monitored and are under 70 for browsing. No GPU overclocking being done.
> 
> *Would my PSU 12v rails be causing this?* I have this PSU: http://www.newegg.com/Product/Product.aspx?Item=N82E16817341017
> 
> It shows the [email protected] Doesn't the 290x require [email protected]?
> 
> I'm unsure if I should RMA the card..or get a new PSU. Any advice would be helpful.


I'm not sure, so correct me if this is inaccurate, but could it be that 25 amps on the 12V isn't enough? If for some reason it isn't even getting that, then that could be the problem. Just my two cents. How old is it?


----------



## Nihsnek

Quote:


> Originally Posted by *Widde*
> 
> I'm not sure so correct me if it's inaccurate but could it be that 25amps on the 12v isnt enough? Say if it for some reason isnt even getting that then that could be the problem just my 2 cents, How old is it?


It's about 3-4 years old. I'm using both 12v rails to power it. Any more ideas?


----------



## LazarusIV

Quote:


> Originally Posted by *tsm106*
> 
> 750w is obviously too little psu. Depending on level of overclock your cpu alone can suck down over 400w. Your gpus can get up there as well, and then there are your accessory bits. I'm listing things out in maximum situations so you have a frame of reference on power draws of the components you own. You'll not necessarily hit those maximums, and few do but it's vitally important to know that the hardware can achieve that. At stock you are already near max on the psu just going by the component TDP specs.
> 
> First the low memory is related to your pagefile.BF4 wants more pagefile. A good starting amount is over 8gb. Directx crashes equal instability. I would start by proving the hardware works first. You can do that by confirming the gpus work, test the scaling using gpuz's scaling test. Run benchmarks to confirm the system is working as it should by comparing scores. If your system cannot handle benches in stock form, you might want to facilitate the psu upgrade sooner than later.


I know the 750W is too little PSU to do any overclocking with either component, let alone adding in WC gear. That's why I'll be buying the 1000W unit as soon as I can. I'm not interested in extreme overclocks at all but I would like to get some moderate overclocks on the GPUs and I'd like to get 4.8GHz with the CPU. Until I get the new PSU though I won't be doing any overclocking at all.

The page file problem is gone now. I don't use a page file on my OS SSD at all, but even when I set the page file to auto it had the problem. I think I'll take out my reference GPU and test the XFX one just to make sure it isn't causing issues. I need to swap spots for them anyways so I might as well do a little testing with the newer card.


----------



## PureBlackFire

Quote:


> Originally Posted by *Nihsnek*
> 
> Question for everyone:
> 
> Just got a new Sapphire 290x BF4 edition. Been using a HD6950 for years. The 290x is causing my computer to completely freeze (no BSOD or black screen, just freeze) when it's at a minimal load (internet browsing). Temperatures are being monitored and are under 70 for browsing. No GPU overclocking being done.
> 
> *Would my PSU 12v rails be causing this?* I have this PSU: http://www.newegg.com/Product/Product.aspx?Item=N82E16817341017
> 
> It shows the [email protected] Doesn't the 290x require [email protected]?
> 
> I'm unsure if I should RMA the card..or get a new PSU. Any advice would be helpful.


It's possible, but what do you mean by under 70? Is that to say temps are over 60C while web browsing? That is not normal at all, and points toward a heat issue of some kind. The PSU has been used by you for a few years and was a mediocre thing brand new; it could be dying. I've seen dozens of ModXStream and StealthXStream power supplies die early, badly, and under light load. You don't have another PSU you can test?

Edit: AFAIK this unit is no longer covered under any warranty, so keep that in mind. You need to acquire another power supply to test so you can eliminate the current one as a source of trouble. In any event, it's old and out of warranty, so you'd be wise to look at getting a new one for that reason alone. Hope you get it sorted out.


----------



## Roboyto

Quote:



> Originally Posted by *LazarusIV*
> 
> I know the 750W is too little PSU to do any overclocking with either component, let alone adding in WC gear. That's why I'll be buying the 1000W unit as soon as I can. I'm not interested in extreme overclocks at all but I would like to get some moderate overclocks on the GPUs and I'd like to get 4.8GHz with the CPU. Until I get the new PSU though I won't be doing any overclocking at all.


@tsm106 is spot on.

I have a reference 290 with a Kraken G10; at stock clocks it pulls <200W. Air-cooled you may be a little higher.

If you get good cards and push them, you can get into the 350W range per GPU. With a mild 1100MHz core OC, you can probably expect 250-275W each.



Spoiler: 359 watts

@shilka has a nice 1000W PSU comparison: http://www.overclock.net/t/1438987/1000-1050-watts-comparison-thread#post_21108368


----------



## Ramzinho

Quote:


> Originally Posted by *devilhead*
> 
> soon it will be available http://www.ekwb.com/news/500/19/New-water-cooling-gear-in-the-works-July-2014/


Isn't that a bit too late? The products are at end of line and in their final stages.

Quote:


> Originally Posted by *LazarusIV*
> 
> I know the 750W is too little PSU to do any overclocking with either component, let alone adding in WC gear. That's why I'll be buying the 1000W unit as soon as I can. I'm not interested in extreme overclocks at all but I would like to get some moderate overclocks on the GPUs and I'd like to get 4.8GHz with the CPU. Until I get the new PSU though I won't be doing any overclocking at all.
> 
> The page file problem is gone now. I don't use a page file on my OS SSD at all, but even when I set the page file to auto it had the problem. I think I'll take out my reference GPU and test the XFX one just to make sure it isn't causing issues. I need to swap spots for them anyways so I might as well do a little testing with the newer card.


Says who? A system with an OCed CPU + a stock 290X, for example, won't take more than around 400-450W max. Add in another GPU and a full loop and you will be using around 600-650W, which a quality 750W PSU will be sufficient for. For more accurate calculations, refer to @twerk or @TwoCables.


----------



## sugarhell

Just the 290X alone is 250-270 watts. Add a loop, an OCed CPU, fans, HDDs, etc. and you still use only 400 watts? Nah.


----------



## fateswarm

The CPU can fluctuate a lot too; e.g., I'm now at ~40W on low load, but it can go up to 180-200W on very high load.


----------



## Ramzinho

Quote:


> Originally Posted by *sugarhell*
> 
> Just the 290x is 250-270 watt. Add a loop an oced cpu,fans,hdd etc etc etc and you still use only 400 watts? Nah


An OCed CPU depends on the voltage you put on it; I guess 200W is a decent estimate. Load fans and pump are around 100W AFAIK, so 300W. Again, I think the guy with the best math to ask is twerk.

But I just asked him yesterday: an OCed 3770K @ 1.25V + 2x 290X and a full loop on a 750W PSU, and he said it's gonna be fine.


----------



## tsm106

Quote:


> Originally Posted by *Ramzinho*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> Just the 290x is 250-270 watt. Add a loop an oced cpu,fans,hdd etc etc etc and you still use only 400 watts? Nah
> 
> 
> 
> an OCed CPU depends on the voltage you put on it.. i guess 200W is a decent amount.. load fans and pump around 100W afaik so 300W.. again i think the guy with the best math to be asked is twerk.
> 
> *but i just asked him yesterday*. Oced 3770K @ 1.25V + 2 X 290X and a full loop for a 750W PSU.. he said it's gonna be fine.

Yea about that, he's not a source I could care about.


----------



## Roboyto

Quote:


> Originally Posted by *Ramzinho*
> 
> Says who? a system with an OCed CPU + stock GPU 290X for example won't take more than around 400-450W max add in another GPU and A full loop and you will be using around 600-650W which a 750W quality PSU will be sufficient for. for more accurate calculations. refer to @twerk or @TwoCables


Maybe with an OC'd Intel CPU, but that AMD FX is a hog; especially if he wants 4.8GHz.

I will agree that a 4770k @ 4.3GHz and a 290 @ 1075/1375 will run on a quality 450W PSU, because I'm doing it. But an OC'd FX with dual 290s is going to require more for what he has planned.


----------



## fateswarm

No need to have sources. If you have a digital controller (for the CPU VRM) you can look at HWInfo. e.g. here I see the IR3563B controller detected and it reports CPU power usage directly.

It goes from something like 80W on low gaming load to 200W if you put it to its knees.

It separates usage of the CPU from PSU input. The latter is probably what you need.


----------



## Widde

Quote:


> Originally Posted by *Ramzinho*
> 
> an OCed CPU depends on the voltage you put on it.. i guess 200W is a decent amount.. load fans and pump around 100W afaik so 300W.. again i think the guy with the best math to be asked is twerk.
> 
> but i just asked him yesterday. Oced 3770K @ 1.25V + 2 X 290X and a full loop for a 750W PSU.. he said it's gonna be fine.


My 3570K at 4.5GHz / 1.35V and two 290s OC'd with +100mV and 50%+ power limit make my 1000 watt Seasonic sweat, so I personally think that 750 watter is gonna have it rough, and a reduced lifespan, if anything gets overclocked. I don't even have a WC loop, just 4 fans and 3 HDDs.


----------



## Ramzinho

Quote:


> Originally Posted by *Roboyto*
> 
> Maybe with an OC'd Intel CPU, but that AMD FX is a hog; especially if he wants 4.8GHz.
> 
> I will agree that a 4770k @ 4.3GHz and a 290 @ 1075/1375 will run on a quality 450W PSU, because I'm doing it. But an OC'd FX with dual 290s is going to require more for what he has planned.


I didn't know he was talking about an FX, my bad. This is right; he will need more power then.
Quote:


> Originally Posted by *Widde*
> 
> My 3570k at 4.5 1.35v and 2 290s ocd with 100mV 50%+ power limit makes my 1000 watts seasonic sweat so I personally think that 750 watter is gonna have it rough and a reduced life span if anything gets overclocked. Dont even have a wc loop just 4 fans and 3 hdds.


I didn't say that the GPUs are OCed.

OCed GPUs = need more power, 100%. At least an 850W with a mild OC; 1000W is a must if you are going crazy.


----------



## the9quad

The question, though, is: can the current draw through the rail be more than what the PSU pushes out? I mean, if you have a single-rail 1200 watt power supply with a max of 99.5 amps on the rail, is it possible to stay under the 1200 watts and still draw more current than what the rail is rated at, and therefore have issues? I never reach 1200 watts, but I think this is the issue I am having. Three cards at 33 amps each sure is close to my 99.5 amps on the rail. And every once in a while when my cards are being pushed really hard (even though I am still only at like 1150 watts at the wall) I will get some slight artifacting, and then BAM! A crash so hard that the PC won't recognize the drivers being installed anymore. My power supply is rated pretty decent, so I don't imagine it is anything other than the single rail being undersized for the current draw? The cards are at stock clocks as well.
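For anyone wanting to check that math, the difference between a PSU's total wattage rating and a single rail's amp rating is just Ohm's-law arithmetic. A minimal sketch, using hypothetical round numbers in the spirit of the post above (three cards on a 1200W single-rail unit rated 99.5A):

```python
# Sanity check for a single 12 V rail: a system can sit near the PSU's
# total wattage rating while the combined 12 V loads approach or exceed
# the rail's rated amperage.
def rail_current(watts_on_rail: float, rail_voltage: float = 12.0) -> float:
    """Current in amps drawn from a rail for a given load (I = P / V)."""
    return watts_on_rail / rail_voltage


if __name__ == "__main__":
    # Hypothetical: three cards pulling ~400 W each from the 12 V rail
    # of a single-rail PSU rated 1200 W / 99.5 A.
    amps = rail_current(3 * 400)
    print(f"{amps:.1f} A drawn vs. a 99.5 A rail rating")  # 100.0 A
```

Note that at 12V, 99.5A works out to about 1194W, so on a unit like that the rail's amp limit and the total wattage rating sit almost on top of each other, and transient spikes can hit the amp limit first.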


----------



## Mega Man

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *LazarusIV*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Did you already post about your config/setup? What post # is it if so? If not, what is your setup volts, clocks, cpu, mb (which slots cards reside in etc)? How are you overclocking gpus, or how you intend to or not at all?
> 
> 
> 
> All the stuff in my sig rig is what I'm using right now, GPUs are stock settings as is the CPU except that the power saving features and turbo are turned off. Cards are in the first and third slot of my 990FXA-UD5, both are PCIe 2.0 X16. I will probably put a very mild OC on the GPUs once I get this performance issue sorted out. Later when I put these GPUs under water I'll get more ambitious but for now the most I'd OC is 1000MHz, my PSU is the sig rig one, only 750W so I won't push the CPU and GPUs very hard. I plan on eventually getting a 1000W unit just to be safe, I ordered some WC stuff today and I'd rather have too much power than not enough.
> 
> I originally posted a description of the problems I've had here.
> 
> 
> 750w is obviously too little psu. Depending on level of overclock your cpu alone can suck down over 400w. Your gpus can get up there as well, and then there are your accessory bits. I'm listing things out in maximum situations so you have a frame of reference on power draws of the components you own. You'll not necessarily hit those maximums, and few do but it's vitally important to know that the hardware can achieve that. At stock you are already near max on the psu just going by the component TDP specs.
> 
> Quote:
> 
> 
> 
> Originally Posted by *LazarusIV*
> 
> Hey guys, I've got a bit of an issue here... Let me lay it out:
> 
> I recently got home after a couple of months and my fixed R9 290 was waiting for me. I installed it with the 14.4 drivers after using DDU to clean up the old ones. I went to play Battlefield 4 and it kept giving me these Direct X errors. I did a little bit of reading and saw that MSI Afterburner could be causing the issues. I uninstalled the latest one (3.0.1 I think) and installed an older version, 3.0.0 and it seemed to help with the issue. I was happily playing BF4 again with my R9 290!
> 
> Today I decided to install the second R9 290 I had acquired recently. Once again with DDU I uninstalled the drivers, installed the 2nd GPU, then started back up and installed the drivers again with both cards in there. Might've been unnecessary but I figured better safe than sorry. The install went without a hitch but when I went to run BF4 it was giving me all sorts of stupid low memory and Direct X issues. Low memory!? Are you kidding me!?!? I did a little research and couldn't find anything in particular that was helpful. I increased my pagefile size, which barely stalled the crash. I got in game and picked a class and then it crashed with the same Direct X or low memory issue. I got rid of my pagefile and I've still got the same issue. I'm goin' nuts here, can someone help me figure this out? I feel like this is a fairly common problem with BF4 but I can't for the life of me figure it out!
> 
> Thanks everyone, I really appreciate any input!
> 
> 
> First the low memory is related to your pagefile.BF4 wants more pagefile. A good starting amount is over 8gb. Directx crashes equal instability. I would start by proving the hardware works first. You can do that by confirming the gpus work, test the scaling using gpuz's scaling test. Run benchmarks to confirm the system is working as it should by comparing scores. If your system cannot handle benches in stock form, you might want to facilitate the psu upgrade sooner than later.

Really? My question then becomes: why does a game made in today's age even use the pagefile by default...... I hate the pagefile!
Quote:


> Originally Posted by *LazarusIV*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> 750w is obviously too little psu. Depending on level of overclock your cpu alone can suck down over 400w. Your gpus can get up there as well, and then there are your accessory bits. I'm listing things out in maximum situations so you have a frame of reference on power draws of the components you own. You'll not necessarily hit those maximums, and few do but it's vitally important to know that the hardware can achieve that. At stock you are already near max on the psu just going by the component TDP specs.
> 
> First the low memory is related to your pagefile.BF4 wants more pagefile. A good starting amount is over 8gb. Directx crashes equal instability. I would start by proving the hardware works first. You can do that by confirming the gpus work, test the scaling using gpuz's scaling test. Run benchmarks to confirm the system is working as it should by comparing scores. If your system cannot handle benches in stock form, you might want to facilitate the psu upgrade sooner than later.
> 
> 
> 
> I know the 750W is too little PSU to do any overclocking with either component, let alone adding in WC gear. That's why I'll be buying the 1000W unit as soon as I can. I'm not interested in extreme overclocks at all but I would like to get some moderate overclocks on the GPUs and I'd like to get 4.8GHz with the CPU. Until I get the new PSU though I won't be doing any overclocking at all.
> 
> The page file problem is gone now. I don't use a page file on my OS SSD at all, but even when I set the page file to auto it had the problem. I think I'll take out my reference GPU and test the XFX one just to make sure it isn't causing issues. I need to swap spots for them anyways so I might as well do a little testing with the newer card.

Why do you keep limiting yourself on power? 1000W is low for this, assuming AMD is correct on stock values: 350W x2 (GPUs) = 700W, leaving less than 300W for the CPU and everything else. You don't have to oversize by much, but by more than that, otherwise you will be buying a new PSU again rather quickly. Stick with a 1200W at the least.
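That budget can be written out in a few lines. A rough sketch only: the per-component figures are the worst-case numbers quoted in this thread, not measurements, and the 90% loading margin is just a common rule of thumb:

```python
# Rough PSU headroom estimate from worst-case component power draws.
def psu_headroom(psu_watts: float, loads: dict, margin: float = 0.9) -> float:
    """Watts left over if the PSU is loaded only to `margin` of its rating
    (running a PSU flat out continuously shortens its life)."""
    return psu_watts * margin - sum(loads.values())


if __name__ == "__main__":
    rig = {
        "r9_290_gpu_1": 350,        # worst-case overclocked draw cited above
        "r9_290_gpu_2": 350,
        "overclocked_fx_cpu": 250,  # a heavily OC'd FX can pull this or more
        "board_ram_drives_pump": 75,
    }
    print(psu_headroom(1000, rig))  # negative: a 1000 W unit has no headroom
    print(psu_headroom(1200, rig))  # positive: consistent with 1200 W minimum
```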
Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ramzinho*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> Just the 290x is 250-270 watt. Add a loop an oced cpu,fans,hdd etc etc etc and you still use only 400 watts? Nah
> 
> 
> 
> an OCed CPU depends on the voltage you put on it.. i guess 200W is a decent amount.. load fans and pump around 100W afaik so 300W.. again i think the guy with the best math to be asked is twerk.
> 
> *but i just asked him yesterday*. Oced 3770K @ 1.25V + 2 X 290X and a full loop for a 750W PSU.. he said it's gonna be fine.
> 
> 
> Yea about that, he's not a source I could care about.

With that answer, yeah, I would not trust him ever. A minimum of 1000W, and I do mean minimum, unless you plan on everything @ stock.
Quote:


> Originally Posted by *fateswarm*
> 
> No need to have sources. If you have a digital controller (for the CPU VRM) you can look at HWInfo. e.g. here I see the IR3563B controller detected and it reports CPU power usage directly.
> 
> It goes from something like 80W on low gaming load to 200W if you put it to its knees.
> 
> It separates usage of the CPU from PSU input. The latter is probably what you need.


Please never say this again. DO NOT TRUST software voltage monitors...... ever.

They give you an idea, but they are usually wrong.


----------



## Ramzinho

Yes man, my question was everything at stock, no crazy stuff.

And twerk seriously knows what he's talking about, mate.


----------



## fateswarm

Quote:


> Originally Posted by *Mega Man*
> 
> 
> please never say this again. DO NOT TRUST software voltage monitors...... ever


Calm down. The IR3563B is *pure hardware*. It's a chip that actually regulates the voltage and the supply.

Something similar is on my Tri-X, by the way, the IR3567B, and HWInfo detects it.


----------



## spenceaj

1. 
2. ASUS R9 290
3. Stock Cooling


----------



## sugarhell

Quote:


> Originally Posted by *fateswarm*
> 
> Calm down. The IR3563B is *pure hardware*. It's a chip that actually regulates the voltage and the supply.
> 
> Something similar is on my Tri-x by the way, IR3567B, and HWInfo detects it.


You know that "pure hardware" means nothing? Every single info app reads through the I2C bus. If the voltage regulator supports voltage control, then you can change the volts; otherwise you just read through the I2C bus via software. It's not accurate; that's why LN2 overclockers solder reading points directly onto the voltage regulator. Please stop spamming the same misinformation.

The IR3563B does the same as the CHiL8228G (7970); it's not something special.


----------



## fateswarm

Quote:


> Originally Posted by *sugarhell*
> 
> You know that pure hardware means nothing? Every single info app reads through i2c bus. If the voltage regulator support voltage control then you can change the volts.Otherwise you read through an i2c bus via the software. Its not accurate thats why ln2 ocers solder readings points directly to the voltage regulator. Pls stop spamming the same misinformations


I don't know if you understand what the PWM controller does. It has full control of the VRM. It's not just a bystander, everything goes through it.

Anyway, you can keep arguing. It seems you are not interested in what I have to say.


----------



## sugarhell

Quote:


> Originally Posted by *fateswarm*
> 
> I don't know if you understand what the PWM controller does. It has full control of the VRM. It's not just a bystander, everything goes through it.


Please, do you understand that you read the volts through an I2C bus? Do you understand what I am talking about? You read through SOFTWARE, not directly from the voltage regulator.


----------



## Mega Man

Quote:


> Originally Posted by *fateswarm*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> You know that pure hardware means nothing? Every single info app reads through i2c bus. If the voltage regulator support voltage control then you can change the volts.Otherwise you read through an i2c bus via the software. Its not accurate thats why ln2 ocers solder readings points directly to the voltage regulator. Pls stop spamming the same misinformations
> 
> 
> 
> I don't know if you understand what the PWM controller does. It has full control of the VRM. It's not just a bystander, everything goes through it.
> 
> Anyway, you can keep arguing. It seems you are not interested in what I have to say.
Click to expand...











please, again see the point about LN2 and why everyone tells people to check with a DMM if they want a real readout

edit: i will also add, do you know the margin of error? because my DMM does not need one....


----------



## fateswarm

Quote:


> Originally Posted by *sugarhell*
> 
> Please, do you understand that you read the volts through an i2c bus? Do you understand what I'm talking about?


As I said, the IR3563B controller is detected by HWInfo as a chip and it reports its output. There is no other software involved.

It seems you are not interested in what I have to say though.


----------



## kizwan

Quote:


> Originally Posted by *Nihsnek*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Widde*
> 
> I'm not sure, so correct me if this is inaccurate, but could it be that 25 amps on the 12V isn't enough? If for some reason it isn't even getting that, then that could be the problem. Just my 2 cents. How old is it?
> 
> 
> 
> It's about 3-4 years old. I'm using both 12v rails to power it. Any more ideas?
Click to expand...

How do you use both rails? If I understand correctly, you're saying you use both rails for the GPU.
Quote:


> Originally Posted by *LazarusIV*
> 
> The page file problem is gone now. I don't use a page file on my OS SSD at all, but even when I set the page file to auto it had the problem. I think I'll take out my reference GPU and test the XFX one just to make sure it isn't causing issues. I need to swap spots for them anyways so I might as well do a little testing with the newer card.


Did you try a fixed size or letting the system manage it?
Quote:


> Originally Posted by *Ramzinho*
> 
> Quote:
> 
> 
> 
> Originally Posted by *devilhead*
> 
> soon it will be available http://www.ekwb.com/news/500/19/New-water-cooling-gear-in-the-works-July-2014/
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Isn't that a bit too late? The products are at the end of their run and in their final stages.
Click to expand...

For watercooling, it's never too late.


----------



## Mega Man

Quote:


> Originally Posted by *fateswarm*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> Please, do you understand that you read the volts through an i2c bus? Do you understand what I'm talking about?
> 
> 
> 
> As I said, the IR3563B controller is detected by HWInfo as a chip and it reports its output. There is no other software involved.
> 
> It seems you are not interested in what I have to say though.
Click to expand...

here, let me edit your statement for you. ready?
Quote:


> Originally Posted by *fateswarm*
> 
> As I said, the IR3563B controller is detected by HWInfo SOFTWARE as a chip and it reports its output. There is no other software involved.


now see
Quote:


> Originally Posted by *Mega Man*
> 
> 
> 
> 
> 
> 
> 
> 
> please never say this again. DO NOT TRUST software voltage monitors...... ever
> 
> they give you an idea, but are usually wrong


----------



## fateswarm

Quote:


> Originally Posted by *Mega Man*
> 
> now see


HWInfo directly reading the hardware solely responsible for the VRM operation is "interfering" now?

Explain how.


----------



## sugarhell

Quote:


> Originally Posted by *fateswarm*
> 
> HWInfo directly reading the hardware solely responsible for the VRM operation is "interfering" now?
> 
> Explain how.


HWInfo does the same thing for every voltage regulator: it reads the voltage controller through a specific i2c bus. Go LN2 with software readings; I will laugh.

For proof:


----------



## fateswarm

Quote:


> Originally Posted by *sugarhell*
> 
> Hwinfo does the same thing for all the voltage regulator.


You cannot read an analog PWM controller with HWInfo. Digital controllers like this one are not the norm.


----------



## Nihsnek

Quote:


> Originally Posted by *kizwan*
> 
> How do you use both rails? If I understand correctly, you're saying you use both rails for GPU.


Yea. The PSU has 2 outputs for PCI-E power connectors (red colored). I am using 2 separate power connector cables for the card, one for each of the PSU outputs. I'm assuming each one is a 12v rail since my PSU has 2.


----------



## ZealotKi11er

My 290X @ 1000/1250 Stock + 3770K @ 4.6GHz w/ 1.365v hit MAX Load ~ 430W. This was using 3DMark and was the peak reading.
290X @ 1200/1500 +100mV +50% ~ 520W.

290X + 290 Stock hitting ~ 670W.


----------



## shwarz

Quote:


> Originally Posted by *Nihsnek*
> 
> Yea. The PSU has 2 outputs for PCI-E power connectors (red colored). I am using 2 separate power connector cables for the card, one for each of the PSU outputs. I'm assuming each one is a 12v rail since my PSU has 2.


Not necessarily. Each PSU's rail setup can be different; you may find that the CPU is on one rail and you're running the GPU entirely off the other.


----------



## devilhead

In this test (http://www.3dmark.com/fs/2389409), with a 2600K @ 4.7GHz 1.38v and the 290X at 1.47v, on a power supply that's not the best (Corsair GS800), the max draw I saw from the wall during the combined test was ~760-770W,







so real power is maybe 650W.


----------



## kahboom

http://www.3dmark.com/3dm/3539546 - i7 4790K @ 4.8GHz 1.35v with power-saving features on, Crossfire XFX 290s running the PowerColor R9 290 BIOS. I adjusted the BIOS with a hex tool, changed the memory timings from the debug Elpida set to non-debug ones, and bumped the core from 1040 to 1050; voltage is raised +50mV by default. Using the latest 14.7 beta driver. Here's the BIOS if someone wants to try it.

Powercolor1050-1350bios.zip 99k .zip file


----------



## ZealotKi11er

With a 360 rad + 240 rad running 2 stock 290s I am getting a 15C delta. The cards hit 55C/59C in BF4 (DX11).


----------



## Ramzinho

Quote:


> Originally Posted by *ZealotKi11er*
> 
> With 360 RAD + 240 RAD running 2 stock 290s i am getting 15C delta. Cards hit 55C/59C in BF4 (DX11).


that's pretty decent imo.. any ideas about how much power you draw at stock CPU?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ramzinho*
> 
> that's pretty decent imo.. any ideas about how much power you draw at stock CPU?


No idea about stock CPU but with CPU OCed i am hitting ~ 700W.


----------



## Hl86

I'm still getting audio stutter and vertical video flicker with vsync on 14.4 and 14.7 in WoW and Dota 2. Anyone know a fix?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Hl86*
> 
> I'm still getting audio stutter and vertical video flicker with vsync on 14.4 and 14.7 in WoW and Dota 2. Anyone know a fix?


Never had any problem in Dota 2 with single card.


----------



## Arizonian

Quote:


> Originally Posted by *spenceaj*
> 
> 1.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 2. ASUS R9 290
> 3. Stock Cooling


Congrats - added


----------



## Spectre-

managed to fit my h100i on the R9 290


----------



## Kittencake

I didn't realize Star Wars: The Old Republic could push this card this hard lol


----------



## jumperfly

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Curious, can anyone that's gotten a brand new in box Sapphire Card tell me for sure, does Sapphire actually send them out like that, no static proof bag? Cuz the Sapphire R9 290X Tri-X OC card I got in came like that too, just foam, no bag. I had to put one on it (I keep a stock of static proof bags around for cases just like this) before I sent it off after reselling it, cuz I'm paranoid.


I've had 3 of the Tri-X cards and they all came in a thick anti-static bag with protective film over the card's cooler.


----------



## Arizonian

Quote:


> Originally Posted by *Spectre-*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> managed to fit my h100i on the R9 290


Nice - updated


----------



## HoneyBadger84

Quote:


> Originally Posted by *jumperfly*
> 
> I've had 3 of the Tri-X cards and they all came in a thick anti-static bag with protective film over the card's cooler.


I thought so, it's just weird that two used cards I've recently seen, mine and one other, didn't come in static-proof bags.

I like the thicker bags, they actually give a bit of cushion compared to the thin ones most people use these days.

My 3 HIS R9 290Xs will be here in less than 12 hrs







such excited, much ready!


----------



## kizwan

Quote:


> Originally Posted by *Nihsnek*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> How do you use both rails? If I understand correctly, you're saying you use both rails for GPU.
> 
> 
> 
> Yea. The PSU has 2 outputs for PCI-E power connectors (red colored). I am using 2 separate power connector cables for the card, one for each of the PSU outputs. I'm assuming each one is a 12v rail since my PSU has 2.
Click to expand...

That is not true. Your PSU follows the ATX12V v2.2 specification. Basically, 12V2 (the "ATX12V" or "P4" cable) is for the CPU and 12V1 is for everything else, including the GPU. I recommend upgrading to a bigger-capacity PSU, or if you're an electrician, maybe you can bridge both rails.


----------



## RocketAbyss

Hi Arizonian, kindly help me update my Powercolor 290X entry. Am running a NZXT G10 + Thermaltake Water 3.0 Performer for cooling now











Cheers!


----------



## HoneyBadger84

I can't wait to see these full coverage closed loop coolers Corsair are supposedly releasing soon. Should be spiffy for my reference cards


----------



## heroxoot

So I like to keep my GPU constantly monitored and I noticed something. No single program shows an honest GPU load. MSI AB, CCC, and GPUZ all show different load at the same time. Which is the best to trust right now? I'm thinking MSI AB with unified GPU usage monitoring enabled. Looking for an opinion about this tho.


----------



## HoneyBadger84

That has to do with your polling settings, among other things. GPU-Z and Afterburner both let you adjust how often the numbers are refreshed. Perhaps that's why you're seeing different numbers; if they aren't refreshing at the exact same time, they of course shouldn't match.

I don't trust CCC for anything, I never even go to the OverDrive tab.


----------



## heroxoot

Quote:


> Originally Posted by *HoneyBadger84*
> 
> That has to do with your update poll settings, among other things. GPUz and AfterBurner both have the ability to adjust how often the numbers are refreshed. Perhaps thats why you're seeing different numbers, if they aren't refreshing at the exact same time, they of course shouldn't match.
> 
> I don't trust CCC for anything, I never even go to the OverDrive tab.


I only used it to see what was going on. MSI AB is polling 500ms. I don't know what the others run. GPUZ seems within reason of MSI AB but not exact.

I changed polling rates to match between MSI AB and GPUZ and they are still not showing the same numbers. I guess I'll just have to stick with one as both probably won't ever sync up properly.
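A toy sketch of why two monitors sampling the same fluctuating value at different instants report different numbers — nothing here is GPU-specific, and the load series is made up:

```python
# Toy illustration: two monitors polling a rapidly changing value
# rarely agree, even at the same poll rate, if their poll ticks are
# offset from each other. The load series below is invented.
load = [90, 40, 95, 30, 88, 45, 92, 35]  # pretend per-tick GPU load

def sample(series, period, offset=0):
    """Read every `period` ticks starting at `offset`."""
    return [series[i] for i in range(offset, len(series), period)]

monitor_a = sample(load, 2)      # polls on even ticks
monitor_b = sample(load, 2, 1)   # same rate, shifted half a period
print(monitor_a)  # [90, 95, 88, 92]
print(monitor_b)  # [40, 30, 45, 35]
```

Matching the poll interval in two tools doesn't align their tick phase, so their displayed numbers still drift apart on a bursty workload.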


----------



## BradleyW

I asked to be a member 2000 pages ago.


----------



## bond32

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I can't wait to see these full coverage closed loop coolers Corsair are supposedly releasing soon. Should be spiffy for my reference cards


Have any links? There's some concern about patent disputes and nonsense with Asetek, so yeah...


----------



## PureBlackFire

Quote:


> Originally Posted by *Kittencake*
> 
> I didn't realize the star wars old republic could push this card hard lol


that game has always been hard on hardware. badly optimized from day 1. that's all it is.


----------



## fateswarm

Ah, I did find a way to affect my power usage with the power limit setting after all. At minus 50%, the voltage rises a bit while the wattage, current and fps drop, and that significantly reduces the load on the VRM and its temperature. Raising the limit still does nothing at my particular settings, since I guess it's already above the max the card could use.

I'm surprised the GPU doesn't shut down, to be honest. It just treats it as a reason to nerf its fps.


----------



## heroxoot

Quote:


> Originally Posted by *BradleyW*
> 
> I asked to be a member 2000 pages ago.


You most likely lacked information that was requested in the OP to join the list.


----------



## HoneyBadger84

Quote:


> Originally Posted by *heroxoot*
> 
> I only used it to see what was going on. MSI AB is polling 500ms. I don't know what the others run. GPUZ seems within reason of MSI AB but not exact.
> 
> I changed polling rates to match between MSI AB and GPUZ and they are still not showing the same numbers. I guess I'll just have to stick with one as both probably won't ever sync up properly.


From what I understand, the two can actually interfere with each other when it comes to polling/monitoring, so that might be part of your problem... but I could be completely wrong on that; I usually only use GPU-Z to monitor VRMs and I have MSI running with the OSD in bench/game


----------



## heroxoot

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> I only used it to see what was going on. MSI AB is polling 500ms. I don't know what the others run. GPUZ seems within reason of MSI AB but not exact.
> 
> I changed polling rates to match between MSI AB and GPUZ and they are still not showing the same numbers. I guess I'll just have to stick with one as both probably won't ever sync up properly.
> 
> 
> 
> From what I understand, the two can actually interfere with eachother in reference to polling/monitoring, so that might be part of your problem... but I could be completely wrong on that, I usually only use GPUz to monitor VRMs & I have MSI running with the OSD in bench/game
Click to expand...

This is pretty much what I do. I just never noticed a desync between them before. I thought it was because I have MSI AB using Unified GPU monitoring and GPUZ uses AMD defaults, but that made no difference as well. It's ok tho, I more or less use it the same as you.


----------



## kizwan

Quote:


> Originally Posted by *fateswarm*
> 
> Ah, I did find a way to affect my power usage with the power limit setting after all. If it's on minus 50%, the voltage raises a bit, the wattage drops, the current drops, the fps drops and that significantly drops the load on the VRM and its temperature. Raising the limit still does nothing on the particular settings since I guess it's already above the max it could use.
> 
> I'm surprised the GPU doesn't shutdown to be honest. It just considers it a case to nerf its fps.


Power limit is power limit, pretty much similar to CPU Current Capability but without OCP as far as I know. It dictates the max wattage the card can consume before it throttles. It doesn't automatically increase the voltage or amperage. Whenever the card tries to draw more power than the pre-set limit, it throttles to keep the power draw within the limit.
Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> I only used it to see what was going on. MSI AB is polling 500ms. I don't know what the others run. GPUZ seems within reason of MSI AB but not exact.
> 
> I changed polling rates to match between MSI AB and GPUZ and they are still not showing the same numbers. I guess I'll just have to stick with one as both probably won't ever sync up properly.
> 
> 
> 
> From what I understand, *the two can actually interfere with eachother in reference to polling/monitoring*, so that might be part of your problem... but I could be completely wrong on that, I usually only use GPUz to monitor VRMs & I have MSI running with the OSD in bench/game
Click to expand...

As far as I know, there is some kind of synchronization between these monitoring programs to prevent them from fighting with each other. This is why two or more monitoring programs may not show the same value at the same time.


----------



## heroxoot

Makes sense Kiz. Thanks.


----------



## Kittencake

Quote:


> Originally Posted by *PureBlackFire*
> 
> that game has always been hard on hardware. badly optimized from day 1. that's all it is.


Still an addictive game. I dropped everything to low settings and my temps are around 60C rather than 73.


----------



## fateswarm

Quote:


> Originally Posted by *kizwan*
> 
> Power limit is power limit, pretty much similar to CPU Current Capability but without OCP as far as I know. It dictates the max wattage the card can consume before it throttles. It doesn't automatically increase the voltage or amperage. Whenever the card tries to draw more power than the pre-set limit, it throttles to keep the power draw within the limit.


Yes, it's a side effect. Power is voltage * current, so the current must drop. I'm not sure why the voltage rises, but it's a very slight rise, probably a side effect of some sort while the card drops power and current.
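The arithmetic behind that point, as a sketch — the 250 W / 150 W / 1.2 V figures below are illustrative assumptions, not measurements from any card in this thread:

```python
def current_draw(power_w: float, voltage_v: float) -> float:
    """I = P / V: at a roughly fixed voltage, cutting power cuts current."""
    return power_w / voltage_v

# Illustrative numbers only: if a power limit forces the card from
# 250 W down to 150 W at ~1.2 V on the core rail, the total current
# across all VRM phases has to fall in proportion.
i_full = current_draw(250, 1.2)    # ~208 A
i_capped = current_draw(150, 1.2)  # 125 A
print(round(i_full, 1), round(i_capped, 1))
```

Less current through the phases means less I²R loss in the VRM, which is why the VRM temperature drops so sharply with the limit.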


----------



## AK-47

Icy vision vs Accelero Xtreme 4
which is better for this card?


----------



## heroxoot

Quote:


> Originally Posted by *AK-47*
> 
> Icy vision vs Accelero Xtreme 4
> which is better for this card?


I hear really good things about Accelero Xtreme. Pretty sure I watched a comparison video earlier today and it was so silent too.


----------



## ZealotKi11er

Quote:


> Originally Posted by *AK-47*
> 
> Icy vision vs Accelero Xtreme 4
> which is better for this card?


ARCTIC Accelero Xtreme IV looks like a beast.


----------



## AK-47

Links to articles comparing the 2 with actual numbers?
I did a google search and didn't really find anything
price difference between the 2 is like $10 after you buy the $15 R9 290 kit for the Gelid


----------



## Jflisk

Quote:


> Originally Posted by *AK-47*
> 
> Icy vision vs Accelero Xtreme 4
> which is better for this card?


Full-blown water cooling with a D5 pump and 2x 240 radiators, or if not, the Accelero Xtreme. Had them on my old 6970s.
















Just buy Arctic Silver thermal adhesive for the heatsinks; with the stuff that comes bundled with any of them you'll be pulling your hair out.

As seen here
http://www.arcticsilver.com/ta.htm


----------



## Imprezzion

Shameless crosspost as the other topic I posted in aint very active:









I can't get any flash tool to work.

USB sticks with a Win98 boot image and AtiFlash just give "page faults" with every command I enter. Even atiflash -i fails to do anything other than throw a page fault. Tried 3 USB sticks and 4 different software versions (3.67, 3.99, 4.07 and 4.17). Nothing but the same page-fault mess.

ATiWinFlash just stops working when I try to flash a BIOS via the GUI.
The command line seems to do something, but after about a minute of the "progress bar" running it also just crashes with some error.

So the BIOS that someone else put on here is the only thing I'm going to be using, it seems... unless someone can cook up a solution. I can't be the only one having this nonsense..


----------



## hwoverclkd

Quote:


> Originally Posted by *Imprezzion*
> 
> Shameless crosspost as the other topic I posted in aint very active:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can't get any flash tool to work.
> 
> USB sticks with win98boot and AtiFlash just gives ''page faults'' with every command i enter. Even atiflash -i fails to do anything else then throw up a page fault. Tried 3 USB sticks and 4 different software versions (3.67, 3.99, 4.07 and 4.17). Nothing but the same page fault mess.
> 
> *ATiWinFlash just stops working when I try to flash a BIOS via the GUI.
> Command line seems to do something but then after about a minute of the ''progress bar'' running it also just crashes with some error.*
> 
> So, the BIOS that someone else put on here is the only thing i'm going to be using so it seems... Unless someone can cook up a solution.. I can't be the only one having this nonsense..


what specific error did you get? you were running your command prompt with elevated privileges, yeah?


----------



## Roboyto

Quote:


> Originally Posted by *AK-47*
> 
> Links to articles comparing the 2 with actual numbers?
> I did a google search and didn't really find anything
> price difference between the 2 is like $10 after you buy the $15 R9 290 kit for the Gelid


The Arctic will likely blow the Gelid out of the water. Their new design, with the huge heatsink on the back of the card, makes attaching sinks to the front much less of a hassle.

Another user suggested Arctic thermal epoxy, which is permanent; it says so explicitly in the instructions. I wouldn't suggest using it; a friend of mine ripped VRM mosfets off trying to take heatsinks off to RMA a card.


----------



## Jflisk

Quote:


> Originally Posted by *Roboyto*
> 
> The Arctic will likely blow the gelid out of the water. Their new design with the huge heatsink on the back of the card makes much less hassle of attaching sinks to the front.
> 
> Another user suggested artic thermal epoxy, which is permanent; it says so explicitly in the instructions. I wouldn't suggest using it, friend of mine ripped VRM mosfets off trying to take heatsinks off to RMA a card.


To safely remove Arctic Silver thermal epoxy, use 90% alcohol around the chip first; it loosens up the bond.


----------



## cennis

Quote:


> Originally Posted by *Roboyto*
> 
> The Arctic will likely blow the gelid out of the water. Their new design with the huge heatsink on the back of the card makes much less hassle of attaching sinks to the front.
> 
> Another user suggested artic thermal epoxy, which is permanent; it says so explicitly in the instructions. I wouldn't suggest using it, friend of mine ripped VRM mosfets off trying to take heatsinks off to RMA a card.


Arctic IV reviews are showing that it has bad VRM cooling.

They omitted the VRM heatsinks on the front surface, hoping the backplate is good enough.

VRM temps seemed worse than the Arctic III.

If you add the Gelid VRM kit then the backplate will probably help.


----------



## Roboyto

Quote:


> Originally Posted by *Jflisk*
> 
> To safely remove Arctic silver thermal apoxy use 90% alcohol around the chip first losens up the bond.


Quote:


> Originally Posted by *cennis*
> 
> Arctic IV reviews are showing that it has bad VRM cooling.
> 
> They omitted the VRM heatsinks on the front surface, hoping the backplate is good enough.
> 
> VRM temps seemed worse than the Arctic III.
> 
> If you add the Gelid VRM kit then the backplate will probably help.


I stand corrected on both accounts.

The gelid VRM kit does do a pretty darn good job, especially if you use some Fujipoly Ultra Extreme pads with it. I'm getting 50C load temps at stock clocks/volts on my reference 290; in a tiny 250D with no airflow.


----------



## cennis

Quote:


> Originally Posted by *Roboyto*
> 
> I stand corrected on both accounts.
> 
> The gelid VRM kit does do a pretty darn good job, especially if you use some Fujipoly Ultra Extreme pads with it. I'm getting 50C load temps at stock clocks/volts on my reference 290; in a tiny 250D with no airflow.


the beefy backplate is a great addition if you have the space, but you should still put heatsinks on the front surface of the VRM; Arctic should have included them..


----------



## Imprezzion

Quote:


> Originally Posted by *acupalypse*
> 
> what specific error did you get? you were running your command prompt with elevated privileges, yeah?


Yeah I was. It was something like "ATIWinflash MCP stopped working".
Could be Windows 8.1 tho...

But what's with the page faults when using AtiFlash from a Win98 boot USB? Everyone uses it... I have used it numerous times, but not in this particular build.

Point is, I wanna flash it with the ASUS or PT1 BIOS because I have plenty of headroom on my temps for both the VRM and core to go higher than +100mV... but I can't like this. MSI AB 3.0.1 also doesn't start with the commands for +100mV.


----------



## HoneyBadger84

:-D I'll be testin' these for the next 3 hrs... must re-read rules on page 1 to see how I get these babies listed since they're my semi-permanent playthings for now *clicks fingernails together evilly*










Testing card #1 meow:


----------



## HoneyBadger84

So here's my validation stuffuh for the member list for these 3 children:

Link: http://www.techpowerup.com/gpuz/53wwk/

Screenshot (merged them together so it's not so wide):



And lastly a picture of'em in the computer (it don't have my name in it cuz I figure those two should be enough right?):












3x HIS R9 290Xs (stock air blower coolers)








off to benchmarking I go!


----------



## Ramzinho

Quote:


> Originally Posted by *HoneyBadger84*
> 
> So here's my validation stuffuh for the member list for these 3 children:
> 
> Link: http://www.techpowerup.com/gpuz/53wwk/
> 
> Screenshot (merged them together so it's not so wide):
> 
> 
> 
> And lastly a picture of'em in the computer (it don't have my name in it cuz I figure those two should be enough right?):
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 3x HIS R9 290Xs (stock air blower coolers)
> 
> 
> 
> 
> 
> 
> 
> 
> off to benchmarking I go!


you got your self a nice heater for the winter









are you going water?


----------



## HoneyBadger84

Quote:


> Originally Posted by *Ramzinho*
> 
> you got your self a nice heater for the winter
> 
> 
> 
> 
> 
> 
> 
> 
> 
> are you going water?


Eventually I might go liquid. Gotta get new PSU, then a second PSU for QuadFire, then I'll probably go liquid when the Corsair AIO full coverage blocks come out... Will have to figure out where to put all those radiators, assuming they have 1 fan rads each, the 4th will be a challenge to place. lol


----------



## Ramzinho

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Eventually I might go liquid. Gotta get new PSU, then a second PSU for QuadFire, then I'll probably go liquid when the Corsair AIO full coverage blocks come out... Will have to figure out where to put all those radiators, assuming they have 1 fan rads each, the 4th will be a challenge to place. lol


With that amount of money I would go for a full custom loop, though it's gonna cost you like $900.


----------



## Kittencake

I dunno something about that just makes it so sexy


----------



## HoneyBadger84

Quote:


> Originally Posted by *Kittencake*
> 
> I dunno something about that just makes it so sexy


I know right, simplicity.

They actually didn't get too warm during first test, course I have all 3 Fans kicking up to 100% at 55C, peaks were 65/63/60 in 3DMark FireStrike Extreme... running it through my full suite just to see how crazy temps get & determine if I need to open up my case or if I can leave the side flow with the standard case fans I have.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Ramzinho*
> 
> what that amount of money. i would go for a full custom loop. though it's gonna cost you like 900$


Gotta keep in mind, I got all 3 of these for $880 + $27 shipping, and they're in brand-new condition; the guy only tested them to make sure they work (he had these sitting there because he couldn't fit them into a mining farm he had going, and sold these as a set of 3 when he was selling the rest off). The fans have almost no dust at all, and they're spotless.









Beat me some 780s in 3-Way SLi (over at NV forums' leaderboard) despite him having a CPU speed advantage:



3DMark FireStrike (Extreme): 11964
Graphics: 14322
Physics: 16393
Combined: 4533

Validation Link: http://www.3dmark.com/fs/2441676 (if anyone wants to run comparisons using the ORB's interface)

Also, peak usage from the wall was 1124W during the Combined Test. Figuring in my PSU's efficiency of ~90% at near-full load, that's about 1011W of power being actually used by the system... not bad eh?
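The efficiency arithmetic above, sketched out — the ~90% figure is the poster's own estimate, not a measured curve:

```python
def system_draw(wall_w: float, efficiency: float) -> float:
    """Power actually delivered to the components.

    The PSU pulls wall_w from the outlet but loses (1 - efficiency)
    of that as heat, so the system itself gets wall_w * efficiency.
    """
    return wall_w * efficiency

# The figures from the post above: 1124 W at the wall, ~90% efficient.
print(f"~{system_draw(1124, 0.90):.0f} W actually powering the system")
```

The same function also answers the reverse question when sizing a PSU: divide the expected component load by the efficiency to estimate wall draw.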


----------



## Kittencake

nice .. thats a pretty damn good deal


----------



## HoneyBadger84

LOL! :-D

House Refrigerated air kicked in at the same time as FireStrike's Combined Test started, my UPS insta-gibbed... guess I need a higher wattage one for sure now ^_^

Gotta rerun that now on normal now X_X lol


----------



## Kittencake

I just upgraded my psu readying for a second 290x


----------



## HoneyBadger84

Quote:


> Originally Posted by *Kittencake*
> 
> I just upgraded my psu readying for a second 290x


Well, from the figures I'm getting, 1200W is definitely more than enough for TriFire, at least so far... I'm assuming if I actually ran something that could fully load the cards and the CPU (which I'm assuming FireStrike's Combined test is close to, but no cigar) I'd pull over 1200W from the wall, which is still only about 1050W actually powering the system.

So for 2-way Crossfire, seems like recommendations of 1000W are plenty.

My battery backup, which is what tripped, is a small unit; it ain't made for this craziness... the last one I had that could actually handle 1200W died back when I was testing 3 7970s. That was a while ago now X_X


----------



## HoneyBadger84

Minor sad face: they all have Elpida memory. What should I expect out of it, lower clocks or...? I'm not familiar with it, as I haven't seen many folks with it who are OCing much.


----------



## the9quad

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Well from the figures I'm getting, a 1200W is definitely more than enough for TriFire, at least so far... I'm assuming if I actually got something that could fully load the cards & the CPU (which I'm assuming FireStrike's Combined test is close, but no cigar to) I'd get over 1200W from the wall, which is still only about 1050W actually powering the system.
> 
> So for 2-way Crossfire, seems like recommendations of 1000W are plenty.
> 
> My Battery Backup, which is what tripped, is a small unit, it ain't made for this craziness... the last one I had that was actually capable of handling 1200W died a back when I was testing 3 7970s. That was a while ago now X_X


Well, I have a fairly decent, highly rated 1200-watt PSU, and even though my rig only pulls 1150 watts at the wall, I'm afraid it still pulls more amps than what my single rail supplies. Most single 12V rails are rated at roughly 100-105 amps, which is fairly large actually, but since the 290Xs are 33 amps each, that doesn't leave much room for anything else. Keep that in mind.

Yours has 0.9 more amps than mine, so maybe you will do better?
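A back-of-the-envelope sketch of the rail budgeting described above. The 100 A rail and 33 A per card come from the post; the 10 A allowance for the CPU and everything else is an assumption for illustration:

```python
def rail_headroom(rail_amps: float, gpus: int, amps_per_gpu: float,
                  other_amps: float) -> float:
    """Remaining 12V rail capacity after the GPUs and everything else.

    A negative result means the configuration is over the rail's
    rating even if the total wattage looks fine on paper.
    """
    return rail_amps - gpus * amps_per_gpu - other_amps

# 100 A single rail, three 290Xs at ~33 A each, ~10 A assumed for
# CPU/drives/fans (that last figure is a guess, not from the post).
print(rail_headroom(100, 3, 33, 10))
```

This is why a multi-GPU build can trip a PSU's rail limit before it hits the wattage limit: the per-rail amp rating, not the label wattage, is the binding constraint.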


----------



## HoneyBadger84

Quote:


> Originally Posted by *the9quad*
> 
> Well, I have a fairly decent, highly rated 1200-watt PSU, and even though my rig only pulls 1150 watts at the wall, I'm afraid it still pulls more amps than what my single rail supplies. Most single 12V rails are rated at roughly 100-105 amps, which is fairly large actually, but since the 290Xs are 33 amps each, that doesn't leave much room for anything else. Keep that in mind.
> 
> Yours has 0.9 more amps than mine, so maybe you will do better?


lol, yeah, that's one of the reasons I'm going to be upgrading to an Antec HCP-1300W & getting a secondary PSU before I go QuadFire, I don't wanna haveta worry about stuff like that.

Maybe I should be looking at the max amp pull in GPUz during benching too? I know it's not dead-on accurate, but it'll be in the ballpark...


----------



## the9quad

I am waiting for the CORSAIR AX1500i to go on sale, to at least in the $350 range.


----------



## VSG

The cheapest I saw the ax1200i go for was $260 or so after MIR, a Newegg promo code and a Google Wallet promo code. It will take similar scenarios to see the ax1500i anywhere near $350 soon.


----------



## HoneyBadger84

Quote:


> Originally Posted by *geggeg*
> 
> The cheapest I saw the ax1200i go for was $260 or so after MIR, a Newegg promo code and a Google Wallet promo code. It will take similar scenarios to see the ax1500i anywhere near $350 soon.


Yeah when I bought my PSU (original AX1200W Pro Series Gold+) it was $299. It's the ancestor of the 1200i, I guess you could say.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Kittencake*
> 
> I just upgraded my psu readying for a second 290x


Does that mobo support 2 cards at x16 or x8?

I just had a look; it's an x16 slot but it only operates at x4, and you don't want that.

You'd need a 990FX board for Crossfire.

Sorry


----------



## bond32

Highly recommend holding out for the EVGA 1600 watt g2. It will likely cost just as much if not less, plus you get a higher rated psu. Should, in theory, run everything off 1 psu (evga 1600 watt)...


----------



## VSG

Agreed.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Kittencake*
> 
> I just upgraded my psu readying for a second 290x


Quote:


> Originally Posted by *HoneyBadger84*
> 
> Minor sad face: They all have Elpida memory. What should I expect out of it, less clocks or ? Not familiar with it as I haven't seen many folks with it that're OCing much.


My cards will do 1500MHz on memory, but I've seen them go as high as 1620MHz. I wouldn't worry too much.


----------



## JordanTr

Quote:


> Originally Posted by *cennis*
> 
> Arctic IV reviews are showing that it has bad VRM cooling.
> 
> They omitted the VRM heatsinks on the surface, hoping the backplate is good enough.
> 
> VRM temps seemed worse than on the Arctic III.
> 
> If you add the Gelid VRM kit then the backplate will probably help


What are you talking about? I've had the Arctic Xtreme IV on my Sapphire R9 290 for two weeks now. No extra heatsinks, just the stuff Arctic provided. Peak temps at 1100/1400 are 63C on the core, 82C on VRM1 and 63C on VRM2. Just make sure a side-panel fan blows cool air on the Arctic's backplate and it's fine. I'm using a Corsair AF140 Quiet Edition; with the Performance Edition you could get even lower VRM1 temps. My AF140 isn't even running full speed. On stock clocks I got 58C core, 74C VRM1, 60C VRM2. It's also very quiet.

Just make a custom fan profile in MSI AB/TriXX, like starting at 30%, then 50% from 50 celsius and 100% at 60C on the core. If you don't, your fans probably won't go above 40% and you'll get like 77-82C on the core, because it uses percentages meant for the stock cooler. Even at 100% the fans are barely audible; it's like the stock cooler at 35%.
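Roughly the step profile I mean, as a sketch (illustrative only; the real curve is set graphically in MSI Afterburner / TriXX, and these thresholds are just my example):

```python
# Sketch of the custom fan profile described above. The thresholds are
# example values, not anything Arctic or MSI ships by default.
def fan_pct(core_temp_c: float) -> int:
    """Map core temperature (C) to fan duty cycle (%)."""
    if core_temp_c >= 60:
        return 100    # flat out once the core hits 60C
    if core_temp_c >= 50:
        return 50     # ramp up from 50C
    return 30         # quiet floor below that

for t in (40, 55, 63):
    print(t, fan_pct(t))
```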


----------



## HoneyBadger84

Quote:


> Originally Posted by *bond32*
> 
> Highly recommend holding out for the EVGA 1600 watt g2. It will likely cost just as much if not less, plus you get a higher rated psu. Should, in theory, run everything off 1 psu (evga 1600 watt)...


According to Twerk (and seconded and thirded by others) the 1500W won't be enough either:
Quote:


> Originally Posted by *twerk*
> 
> I'm not entirely convinced the HCP-1300 would provide enough wattage, OCP kicks in at just over 1400W from my testing so you may end up triggering it with a decent overclock on your cards and CPU.
> 
> 290X @ 1.35V/1150MHz = ~422W
> 3930k @ 1.29V/4.6GHz = ~201W
> 
> (422x4)+201=1889W
> 
> Of course, you may not be overclocking the GPU's that high but it just shows how quickly power draw goes up, and that's not even a particularly extreme overclock.


1150MHz is my target for when I OC for benchmarking, soooo I'm gonna need dual PSUs either way.

The plan, thanks to Bhav pointing it out, is to get an 850 or 1000W for the CPU, everything else but the cards, and one GPU, then run the 1300W for the other 3 GPUs.
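For reference, twerk's budget math above, spelled out (his figures from the quote, not measurements of my own rig):

```python
# twerk's quad-fire power budget, written out. Figures are from his
# testing as quoted above, not my own measurements.
GPU_W = 422        # one 290X @ 1.35V / 1150MHz
CPU_W = 201        # 3930K @ 1.29V / 4.6GHz
GPUS = 4

total = GPUS * GPU_W + CPU_W
print(total)       # 1889 - well past what one 1300-1500W unit can feed
```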


----------



## Kittencake

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Does that Mobo support 2 cards at x16 or x8?
> 
> I just had a look and it's a x16 slot but only operates at x4 and you don't want that.
> 
> You'd need a 990FX board for Crossfire
> 
> Sorry


I'm well aware of that , thats my next goal in a week or two


----------



## Sgt Bilko

Quote:


> Originally Posted by *Kittencake*
> 
> I'm well aware of that , thats my next goal in a week or two


Good to hear!









Wasn't sure is all.....What board you planning on?


----------



## bond32

The 290X by itself won't draw 422 watts at that voltage. I know because I have pulled less than that, at more voltage, and that was my entire system measured with a meter.


----------



## HoneyBadger84

Quote:


> Originally Posted by *bond32*
> 
> The 290X by itself won't draw 422 watts at that voltage. I know because I have pulled less than that, at more voltage. And that was my entire system using a meter.


But still, I think it's safe to say that on 4 cards I'd get over 1500W draw with the cards overclocked, when I've seen 1200-1225W (depending on what my efficiency was) at stock with 4 cards previously, or am I overestimating how much extra each card will draw when OCed? Keep in mind that's ~1450W at the wall; I did the math to get 1200-1225W depending on whether my PSU was hitting the 90% or 92% efficiency marks it's been tested at.
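The efficiency math I mean, sketched out (the efficiency percentages are from my PSU's reviews; the example wall figure is just illustrative, the real number is whatever the meter reads):

```python
# Wall draw vs. DC load: the PSU's efficiency relates the two, so you
# can convert either way. Example numbers only.
def dc_from_wall(wall_w: float, eff: float) -> float:
    """Power actually delivered to the system."""
    return wall_w * eff

def wall_from_dc(dc_w: float, eff: float) -> float:
    """What the meter at the wall would read."""
    return dc_w / eff

# e.g. ~1200W of DC load at 90% efficiency reads ~1333W at the wall:
print(round(wall_from_dc(1200, 0.90)))
```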

Sidenote: my battery got irritated again, so I guess I'll play it safe & not benchmark TriFire further until I get the newer PSU & a new battery backup. lol Did finally get through all testing though. It only happens during the combined test startup in 3DMark11. That's weird; maybe it is an amperage pull issue like someone pointed out...

Or it could be my power is sketchy because of a thunderstorm outside. Who knows. On that thunder note, I'm done testing til the storm passes, surge protection or not, don't wanna fry my brand new stuff. lol


----------



## fateswarm

Add or subtract up to ±200W depending on what the CPU load for the game is; I guess that's about right for a 6-core. Come to think of it, the CPU's share might gradually increase as you add GPUs on the same game: with fewer GPUs and lower FPS it had to supply fewer draw calls, so the more GPUs, the bigger the CPU contribution.


----------



## Learath2

I just downgraded back to 14.4 from 14.6 beta, as it caused problems with Warface. I now seem to get a 0x1000007E BSOD while playing YouTube videos in the HTML5 player. I am starting to think the card is the problem, since I've already switched out my mobo and CPU. I am on Windows 7.


----------



## bond32

Quote:


> Originally Posted by *HoneyBadger84*
> 
> But still, I think it's safe to say that on 4 cards I'd get over 1500W draw with the cards overclocked, when I've seen 1200-1225W (depending on what my efficiency was) at stock with 4 cards previously, or am I overestimating how much extra each card will draw when OCed? Keep in mind that's ~1450W at the wall; I did the math to get 1200-1225W depending on whether my PSU was hitting the 90% or 92% efficiency marks it's been tested at.
> 
> Sidenote: my battery got irritated again, so I guess I'll play it safe & not benchmark TriFire further until I get the newer PSU & a new battery backup. lol Did finally get through all testing though. It only happens during the combined test startup in 3DMark11. That's weird; maybe it is an amperage pull issue like someone pointed out...
> 
> Or it could be my power is sketchy because of a thunderstorm outside. Who knows. On that thunder note, I'm done testing til the storm passes, surge protection or not, don't wanna fry my brand new stuff. lol


Right, but you yourself pointed out how much over its rating your AX1200 was able to provide... With the requirements of quadfire, if it can be done with one PSU then I say do that over 2 PSUs...


----------



## Kittencake

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Good to hear!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wasn't sure is all.....What board you planning on?


Gigabyte 990fx-ud3


----------



## rdr09

Quote:


> Originally Posted by *Learath2*
> 
> I just downgraded back to 14.4 from 14.6 beta as it caused problems with Warface. I now seem to get 0x1000007E BSOD while playing youtube videos on the HTML5 player. I am starting to think that the card is the problem as I switched out my mobo and CPU. I am on Windows 7.


What browser are you using?

I am not sure how you uninstalled and installed the driver, but I used the new driver (14.7) to uninstall the current driver using Express, rebooted, then ran 14.7 again and installed it using Express again. Before rebooting I went to msconfig and unchecked CCC and Raptr. Rebooted and played BF4.

If you are using Chrome you can go to Settings > Show advanced settings > System > uncheck hardware acceleration. If Firefox, I think there is a way to do the same.


----------



## Learath2

Quote:


> Originally Posted by *rdr09*
> 
> what browser are you using?
> 
> i am not sure how you uninstalled and installed driver but i use the new driver (14.7) to uninstall current driver using Express, rebooted, ran 14.7 again and installed it using Express again. Before rebooting i went to msconfig and unchecked CCC and raptr. Reboot and played BF4.
> 
> If you are using Chrome you can go to settings>show advance settings>system>uncheck hardware acceleration. if Firefox, i think there is a way to do the same.


I already disabled HW acceleration, and I uninstalled using the setup, rebooted, and installed 14.4.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Kittencake*
> 
> Gigabyte 990fx-ud3


The UD3 is a good board; Rev 1.1 and 3.0 are the best ones, I think.

Looking forward to seeing how it goes


----------



## Kittencake

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The UD3 is a good board, Rev 1.1 and 3.0 are the best ones i think.
> 
> looking forward to see it goes


you can check my build log

http://www.overclock.net/t/1304532/kitty-rig-reborn


----------



## HoneyBadger84

Yeah the power glitches I had in testing were the storm. Waiting for it to pass....
Quote:


> Originally Posted by *bond32*
> 
> Right, but even you pointed out how much over your AX1200 was able to provide... With the requirements of quadfire, if it can be done with one psu then I say do that over 2 psu's...


What I may do is buy one of those old adapters to run 2 PSUs on one system, and just get the Antec 1300W then run it in tandem with the AX1200; better to have way more power than I need than not enough, right? Lol


----------



## bond32

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Yeah the power glitches I had in testing were the storm. Waiting for it to pass....
> What I may do is buy one of those old adapters to run 2 PSUs on one system, and just get the Antec 1300W then run it in tandem with the AX1200; better to have way more power than I need than not enough, right? Lol


Not really; if you need X watts, there's no need to buy an X+Y wattage PSU. A 1600 watt should, in theory, provide enough to power 4 heavily overclocked cards plus the rest of the full system.

The 1300 watt G2 I have has pulled over 1500 watts before. Lots of people with the same PSU have similar results. My 1000 watt G2 also pulled much higher.

Not trying to sell the G2 or anything. Shilka here on the forums is the PSU go-to guy; he knows a lot about them. Check out some of his threads if you're curious.


----------



## HoneyBadger84

Yeah I know, I got the Antec HCP-1300 recommendations from shilka and nleksan among others









Wish this storm would get the heck outta here, I wanna test some games I haven't gotten to test in TriFire yet...


----------



## the9quad

Quote:


> Originally Posted by *bond32*
> 
> Not really, if you need X watts, no need to buy X+Y wattage psu. A 1600 watt should, in theory, provide enough to power 4 heavily overclocked cards plus the rest of the full system.
> 
> The 1300 watt g2 I have has pulled over 1500 watts before. Lots of people with the same psu have similar results. My 1000 watt g2 also pulled much higher.
> 
> Not trying to sell the G2 or anything. Shilka here on the forums is the psu go to guy, he knows a lot about them. Check out some of his threads if you're curious.


Any idea how amperage figures into that? All I ever hear is wattage this, wattage that, but if you can only draw so much amperage on a single rail, that has to figure in somewhere, and I've never seen a straight answer on it. I know Ohm's law, and power = voltage × current, so 100 amps × 12V = 1200 watts, but does it all being on one rail have any effect? Does it cause problems when you have high current draw from 3 different sources, or are high-end power supplies fine with that?
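The arithmetic side at least is simple; a sketch of the single-rail budget using illustrative numbers from earlier in the thread (this says nothing about any specific PSU's OCP behavior):

```python
# P = V * I on the 12V rail. Numbers are illustrative, taken from
# figures quoted earlier in the thread.
RAIL_V = 12.0
rail_amps = 100        # a typical big single-rail rating
card_amps = 33         # rough per-290X draw quoted earlier
cards = 3

rail_watts = RAIL_V * rail_amps    # 1200.0 W available on the rail
gpu_amps = cards * card_amps       # 99 A for the cards alone
headroom = rail_amps - gpu_amps    # 1 A left for everything else
print(rail_watts, gpu_amps, headroom)
```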


----------



## Mega Man

Quote:


> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HoneyBadger84*
> 
> I can't wait to see these full coverage closed loop coolers Corsair are supposedly releasing soon. Should be spiffy for my reference cards
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Have any links? There's some concern about patents and disputes nonsense with Asetek so yeah...
Click to expand...

Pretty sure they've used the patent-troll company (Asetek) for the ones they've done so far. Even if they don't, you don't have to worry; Asetek is just suing everyone who doesn't license.
Quote:


> Originally Posted by *geggeg*
> 
> The cheapest I saw the ax1200i go for was $260 or so after MIR, a Newegg promo code and a Google Wallet promo code. It will take similar scenarios to see the ax1500i anywhere near $350 soon.


Sorry, but I gotta say that is the same cookie-cutter ugly I would expect: solid product, ugly packaging!
Quote:


> Originally Posted by *bond32*
> 
> Highly recommend holding out for the EVGA 1600 watt g2. It will likely cost just as much if not less, plus you get a higher rated psu. Should, in theory, run everything off 1 psu (evga 1600 watt)...


Quote:


> Originally Posted by *geggeg*
> 
> Agreed.


+12... yes 12 when 1-11 are not enough.
Quote:


> Originally Posted by *Kittencake*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Does that Mobo support 2 cards at x16 or x8?
> 
> I just had a look and it's a x16 slot but only operates at x4 and you don't want that.
> 
> You'd need a 990FX board for Crossfire
> 
> Sorry
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm well aware of that , thats my next goal in a week or two
Click to expand...

let me know if you need any help !
Quote:


> Originally Posted by *Kittencake*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Good to hear!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wasn't sure is all.....What board you planning on?
> 
> 
> 
> Gigabyte 990fx-ud3
Click to expand...

Dear god no... please save yourself the trouble. Go UD5 at minimum, if not saberkitty/CFVz/UD7
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Kittencake*
> 
> Gigabyte 990fx-ud3
> 
> 
> 
> The UD3 is a good board, Rev 1.1 and 3.0 are the best ones i think.
> 
> looking forward to see it goes
Click to expand...

Dear god no, rev 3-4 were nothing but issues


----------



## Kittencake

Quote:


> Originally Posted by *Mega Man*
> 
> pretty sure they use the patent troll company to do it with they have so far. even though if they dont you dont have to asetek is just suing everyone who doesnt
> sorry, but i gotta say that is the same cookie cutter uggry i would expect, solid product, uggry packaging !
> 
> +12... yes 12 when 1-11 are not enough.
> let me know if you need any help !
> dear god no... please save your self the issue. go at min ud5 if not saberkitty/CFVz/UD7
> dear god no, rev 3-4 nothing but issues


I thought rev 2 was the best?


----------



## Mega Man

Rev 1.1, maybe 1.2, but idk, I never see posts about 1.2


----------



## Kittencake

Quote:


> Originally Posted by *Mega Man*
> 
> pretty sure they use the patent troll company to do it with they have so far. even though if they dont you dont have to asetek is just suing everyone who doesnt
> sorry, but i gotta say that is the same cookie cutter uggry i would expect, solid product, uggry packaging !
> 
> +12... yes 12 when 1-11 are not enough.
> let me know if you need any help !
> dear god no... please save your self the issue. go at min ud5 if not saberkitty/CFVz/UD7
> dear god no, rev 3-4 nothing but issues


ty hun i will







You're always a great help


----------



## yawa

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Gotta keep in mind, I got all 3 of these for $880+$27 shipping  and they're in brand new condition, guy only tested them to make sure they work (he had these sitting there cuz he couldn't fit them in a mining-field he had going, sold these as a set of 3 when he was selling the rest off). Fans have almost no dust at all, and they're spotless
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Beat me some 780s in 3-Way SLi (over at NV forums' leaderboard) despite him having a CPU speed advantage:
> 
> 
> 
> 3DMark FireStrike (Extreme): 11964
> Graphics: 14322
> Physics: 16393
> Combined: 4533
> 
> Validation Link: http://www.3dmark.com/fs/2441676 (if anyone wants to run comparisons using the ORB's interface)
> 
> Also, peak usage from the wall was 1124W during the Combined Test. Figuring in my PSU's efficiency of ~90% at near-full load, that's about 1011W of power being actually used by the system... not bad eh?


That is a great deal.

With the first stage of my 4K rebuild finished way ahead of schedule (CPU/motherboard upgrade), I'm hoping to do the same thing as you for a 2nd card, and/or punish a bitcoin miner on Craigslist for making me pay $600 for my first 290X by getting his off him for under $250. Then it's on to a case capable of fitting 420mm and 360mm rads, and finally a 28-32 inch 60Hz 4K monitor by the end of the year.

Still, what a steal man. Gratz. It's amazing out there how many people still don't know how much their computer stuff is worth.
Quote:


> Originally Posted by *Kittencake*
> 
> I thought rev 2 was the best?


I kinda wanna chime in here, since I think you guys are talking GA boards (I think): I ran pretty much the exact same setup as you, KC, for a year and some change, on a GA970-UD3 revision 1.1 and never had any issues, even with super high clocks. The board was a trooper, which is even crazier to think of since I was using the much-maligned 970 chipset and only paid $89.99 for it.

I could bench and game at 5.2GHz (though not stress stable) and had a super stable 4.9GHz overclock. Though a lot of that had to do with the quality of my chip, the fact that my board could handle the heat and voltage as a pretty cheap budget build goes a long way towards making an argument for it.

Anyway, my point is those boards are pretty good, at least in my experience. Plus, at this point they are cheap enough that the 990FX versions could probably be had for well under $100. Either way, in my experience I wouldn't by any means be afraid of any of those GA boards (minus the D3 of course, UGH, GARBAGE).

_P.S. Actually, if you're interested (and not too far away, like in Siberia), my GA970-UD3 is collecting dust in its box. It's yours if you want it (or know someone who does), as the last two people who tried to buy it basically tried to not pay me on eBay, so I withheld it; at this point I don't really care what I get for it as long as I know it'll be used._


----------



## Kittencake

Quote:


> Originally Posted by *yawa*
> 
> That is a great deal.
> 
> With the first stage of my 4K rebuild finished way ahead of schedule (CPU/Motherboard upgrade) I'm hoping to do the same thing as you for a 2nd card, and/or punish a bitcoin miner on Craig's list for making me pay $600 for my first 290X by getting his one off of him for under $250. Then it's on to a case capable of fitting 420mm and 360mm rads, and finally, a 28 inch-32 inch 60Hz 4K monitor by the end of the year.
> 
> Still, what a steal man. Gratz. It's amazing out there how many people still don't know how much their computer stuff is worth.
> I kinda wanna chime in here to let you know, since I think you guys are talking GA boards here (I think), I ran pretty much the exact same setup as you KC for a year and some change, on a GA970-UD3 revision 1.1 and never had any issues. Even with super high clocks. The board was a trooper, which is even more crazy to think of since I was using the much maligned 970 chipset and I only paid $89.99 for it.
> 
> I could bench and game at 5.2Ghz (though not stress stable) and had a super stable 4.9Ghz Overclock. Though a lot of that had to do with the quality of my chip, the fact that my board could handle the heat and voltage as a pretty cheap budget build, goes a long way towards making an argument for it.
> 
> Anyway my point is, those boards are pretty good. At least in my experience. Plus, at this point they are cheap enough that the 990FX versions could probably be had for well under $100, but either way, in my experience, I wouldn't by any means be afraid of any of those GA boards (minus the D3 of course, UGH GARBAGE).
> 
> _P.S. Actually, if you're interested (and not too far away, like in Siberia) my GA970-UD3 is collecting dust in it's box. It's yours if you want it (or know someone who does) as the last two people who tried to buy it basically tried to not pay me on Ebay so I withheld it and now don't really care what I get for it at this point as long as I know it'll be used._


Sure, I'd be willing to trade up boards. I'm in Canada, and good timing too, since I moved from Oslo, Norway not too long ago lol


----------



## yawa

KK good, KC, we can definitely set something up then. I'll shoot you a PM before I crash tonight. Definitely going to be a lot cheaper than sending it to Oslo, Norway. That's quite a move though, I hope you are enjoying it!


----------



## Mega Man

Quote:


> Originally Posted by *yawa*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HoneyBadger84*
> 
> Gotta keep in mind, I got all 3 of these for $880+$27 shipping  and they're in brand new condition, guy only tested them to make sure they work (he had these sitting there cuz he couldn't fit them in a mining-field he had going, sold these as a set of 3 when he was selling the rest off). Fans have almost no dust at all, and they're spotless
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Beat me some 780s in 3-Way SLi (over at NV forums' leaderboard) despite him having a CPU speed advantage:
> 
> 
> 
> 3DMark FireStrike (Extreme): 11964
> Graphics: 14322
> Physics: 16393
> Combined: 4533
> 
> Validation Link: http://www.3dmark.com/fs/2441676 (if anyone wants to run comparisons using the ORB's interface)
> 
> Also, peak usage from the wall was 1124W during the Combined Test. Figuring in my PSU's efficiency of ~90% at near-full load, that's about 1011W of power being actually used by the system... not bad eh?
> 
> 
> 
> That is a great deal.
> 
> With the first stage of my 4K rebuild finished way ahead of schedule (CPU/Motherboard upgrade) I'm hoping to do the same thing as you for a 2nd card, and/or punish a bitcoin miner on Craig's list for making me pay $600 for my first 290X by getting his one off of him for under $250. Then it's on to a case capable of fitting 420mm and 360mm rads, and finally, a 28 inch-32 inch 60Hz 4K monitor by the end of the year.
> 
> Still, what a steal man. Gratz. It's amazing out there how many people still don't know how much their computer stuff is worth.
> Quote:
> 
> 
> 
> Originally Posted by *Kittencake*
> 
> I thought rev 2 was the best?
> 
> Click to expand...
> 
> I kinda wanna chime in here to let you know, since I think you guys are talking GA boards here (I think), I ran pretty much the exact same setup as you KC for a year and some change, on a GA970-UD3 revision 1.1 and never had any issues. Even with super high clocks. The board was a trooper, which is even more crazy to think of since I was using the much maligned 970 chipset and I only paid $89.99 for it.
> 
> I could bench and game at 5.2Ghz (though not stress stable) and had a super stable 4.9Ghz Overclock. Though a lot of that had to do with the quality of my chip, the fact that my board could handle the heat and voltage as a pretty cheap budget build, goes a long way towards making an argument for it.
> 
> Anyway my point is, those boards are pretty good. At least in my experience. Plus, at this point they are cheap enough that the 990FX versions could probably be had for well under $100, but either way, in my experience, I wouldn't by any means be afraid of any of those GA boards (minus the D3 of course, UGH GARBAGE).
> 
> _P.S. Actually, if you're interested (and not too far away, like in Siberia) my GA970-UD3 is collecting dust in it's box. It's yours if you want it (or know someone who does) as the last two people who tried to buy it basically tried to not pay me on Ebay so I withheld it and now don't really care what I get for it at this point as long as I know it'll be used._
Click to expand...

Quote:


> Originally Posted by *yawa*
> 
> KK good,KC, we can definitely set something up then. I'll shoot you a PM before I crash tonight. Definitely going to be a lot cheaper than sending it to Oslo Norway. That's quite a move though, I hope you are enjoying it!.


Bad idea.

970 does not = 990

On the 970 the max you can get is x16 + x4 CFX.

The VRMs are very, very different on the 990FXA-UD3 and they suck; Giga had to hard-code the BIOS to throttle at a certain power draw/temp to prevent them from blowing.

Again, this is on the 990, not the 970.

If you want to go CFX, get the UD5/7, Sabertooth or CVFz.


----------



## Kittencake

Actually, I looked up the board; it's 2x x16, so better than what I have now. And with my current board... overclocking... no go lol. I just drew the short end of the stick when it comes to this board.


----------



## Mega Man

I own 2.

There are two x16 slots, but the second only runs at x4.

http://www.gigabyte.com/fileupload/product/2/3907/4462_big.jpg

x16/x16 is impossible; the chipset does not have enough lanes.

That's why you need a 990FXA board.


----------



## Kittencake

Aww, well it will do till I get my mitts on a better board. I can't OC worth beans with my current one.


----------



## invincible20xx

How long do you guys think a Crossfire R9 290 system will serve me @ 1080p maxed out at 60fps? Look at my sig rig, thanks!


----------



## pdasterly

Quote:


> Originally Posted by *invincible20xx*
> 
> how long do you guys think a crossfire r9 290 system will serve me @ 1080p maxed out 60fps , look at sig rig , thanks !


Depends on monitor/monitors


----------



## Arizonian

Quote:


> Originally Posted by *RocketAbyss*
> 
> Hi Arizonian, kindly help me update my Powercolor 290X entry. Am running a NZXT G10 + Thermaltake Water 3.0 Performer for cooling now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Cheers!


Congrats - updated









Quote:


> Originally Posted by *HoneyBadger84*
> 
> So here's my validation stuffuh for the member list for these 3 children:
> 
> Link: http://www.techpowerup.com/gpuz/53wwk/
> 
> Screenshot (merged them together so it's not so wide):
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> And lastly a picture of'em in the computer (it don't have my name in it cuz I figure those two should be enough right?):
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 3x HIS R9 290Xs (stock air blower coolers)
> 
> 
> 
> 
> 
> 
> 
> 
> off to benchmarking I go!


Congrats - added


----------



## invincible20xx

Quote:


> Originally Posted by *pdasterly*
> 
> Depends on monitor/monitors


Single 1080p panel, and in the future I'm getting a 1080p projector


----------



## rdr09

Quote:


> Originally Posted by *invincible20xx*
> 
> how long do you guys think a crossfire r9 290 system will serve me @ 1080p maxed out 60fps , look at sig rig , thanks !


Another year? I don't know about you; I also play at 1080 and my single 290 seems overkill atm, but I am thinking of getting the 390 when it comes out. Scratches the itch.


----------



## HoneyBadger84

Storm has passed, I got a nap in, it's nighttime, time for some more benchmarking ^_^


----------



## fateswarm

Quote:


> Originally Posted by *rdr09*
> 
> another year? i don't know about you, i also play at 1080 and my single 290 seems overkill atm but i am thinking of getting the 390 when it comes out. scratches itch.


I may get a 2nd 290 in a year. I suspect prices will be floored once new cards come out. I don't expect more than +20% from the new cards, so a 2nd 290 would be a ~80%+ boost for something like 200 euros.


----------



## chronicfx

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Storm has passed, I got a nap in, it's nighttime, time for some more benchmarking ^_^


You on the east coast? That lightning shut my computer down last night. I was hitting the power button for like 10 minutes before it started back up again. Scared me; the only hope I had that it wasn't dead was that some of the mobo LEDs and the ethernet LED were still glowing. The power button was unresponsive, then all of a sudden it started up like nothing happened. Must have tripped the PSU protection and needed to reset.


----------



## Gobigorgohome

I am looking for 10 W/mK (or better) thermal pads for the EK-FC R9 290X Acetal+Nickel blocks for my GPUs, any advice?

I know I have asked before, but looking for the answer in this thread is like looking for a needle in a haystack.


----------



## Imprezzion

Fujipoly Ultra Extreme. 17 W/mK and worldwide shipping from FrozenCPU.

I bought mine there as well; shipping cost a bit, but they sent it to Holland in like a week?

Btw, i'm going to ask again as I didn't really fix it yet and didn't get a lot of useful responses:

I can't flash my card.

AtiWinFlash gives an ''ATIwinflash MCP stopped working'' error.
Could be Windows 8.1 tho...

A Win98-boot DOS USB gives page faults when using AtiFlash... Everyone uses it, and I have used it numerous times, but not in this particular build. I tried all my USB ports, 4 different AtiFlash versions, and 3 different USB sticks; it does the same thing every time. Not even atiflash -i works. Just page faults.

Point is, I wanna flash it with the ASUS or PT1 BIOS, cause I have plenty of room on my temps for both VRM and core to go higher than +100mV.. But I can't like this. MSI AB 3.0.1 also doesn't start with the commands for +100mV.


----------



## HoneyBadger84

Quote:


> Originally Posted by *chronicfx*
> 
> You on the east coast? That lightning shut my computer down last night.. I was hitting the power button for like 10 minutes before it started back up again. Scared me, the only hope I had that it wasn't dead was that some of the mobo leds and the ethernet led were still glowing, the power button was unresponsive, then all the sudden it starts up like nothing happened.. Must have tripped the psu protection and needed to reset.


Naw I'm in h... New Mexico lol

We had some pretty severe storms this evening; it knocked out the plug my computer was on twice, so I shut down and took a nap. Irritating cuz both times it happened in the middle of a 3DMark11 run that was almost finished (once on a loading screen, the other during the Combined Test's start).

I need to get a new, bigger battery for sure, cuz anytime we get brownouts, which happens out here often, if I'm pulling more than the battery is rated for it automatically shuts down everything plugged into it to avoid killing all the things.







I think it's only 640W or so, so it definitely needs upgrading.


----------



## Imprezzion

Lol, I managed to flash my card









What an elaborate way to flash a card though... Had to flash from a Windows 7 virtual machine








AtiWinFlash worked there lol. Compatibility mode didn't work in 8.1 cause it said something about a critical file missing..


----------



## Dasboogieman

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Minor sad face: They all have Elpida memory. What should I expect out of it, less clocks or ? Not familiar with it as I haven't seen many folks with it that're OCing much.


Not very much. Chances are you're bottlenecked by your GPU's IMC, as most decent AIBs use 6000MHz-binned GDDR5 ICs.


----------



## yawa

Well, keep in mind the offer for my 970 at this point is free, Mega Man. If CFX is her deal, I'll definitely let her know about the limitations. I was only thinking in terms of overclocking, which I can vouch isn't an issue with this thing.


----------



## rdr09

Quote:


> Originally Posted by *fateswarm*
> 
> I may get a 2nd 290 in a year. I suspect the prices will be floored if new cards come out. I don't expect more than +20% from new cards so a 2nd would be a ~80%+ boost with something like 200 euros.


Two 290s will be faster than a single next-gen high-end card for sure. I may have to upgrade my i7 SB and other stuff in my Intel rig, so to stay within a reasonable budget a single-GPU solution will be a better fit.
Quote:


> Originally Posted by *Gobigorgohome*
> 
> I am looking for 10w/mk (or more) thermal pads for the EK-FC R9 290X Acetal+Nickel blocks for my GPU's, any advice?
> 
> I know I have asked before, but looking for the answer in this thread is like looking for a needle in a haystack.


You can get the more expensive stuff like what Imprezzion suggested, or something cheaper that's still better than the stock pads companies like EK provide. This is what I have on my 290 for the VRMs; the pads for the memory ICs that come with the kit will suffice. Temps in BF4 MP using three 120mm rads for the CPU and GPU . . .



Ambient is about 25C.


----------



## Imprezzion

Ok, this is new for me. I am now using the ASUS unlock BIOS on my 290 @ 290X shaders, running Sapphire TriXX @ +200mV with a load voltage of ~1.30v, which is very high.

My Accelero Hybrid keeps the core stable at 70c, and with my modified reference-cooler heatsink the VRMs are holding at VRM1 = 79c, VRM2 = 59c.
Currently just trying some random frequencies; I'm running 1200Mhz core, 1500Mhz VRAM (Elpida) and looping Fire Strike Extreme.

I am going to see how high the core can actually go with +200mV







Temps are fine..
EDIT: Running 1225Mhz now. Going to loop for 10-15 minutes scanning for artifacts. If there are none, I'll push on to 1240-1250Mhz.


----------



## rdr09

Quote:


> Originally Posted by *Imprezzion*
> 
> Ok, this is new for me. I am now using the ASUS unlock BIOS on my 290 @ 290X shaders, running Sapphire TriXX @ +200mV with a load voltage of ~1.30v, which is very high.
> 
> My Accelero Hybrid keeps the core stable at 70c, and with my modified reference-cooler heatsink the VRMs are holding at VRM1 = 79c, VRM2 = 59c.
> Currently just trying some random frequencies; I'm running 1200Mhz core, 1500Mhz VRAM (Elpida) and looping Fire Strike Extreme.
> 
> I am going to see how high the core can actually go with +200mV
> 
> 
> 
> 
> 
> 
> 
> Temps are fine..
> EDIT: Running 1225Mhz now. Going to loop for 10-15 minutes scanning for artifacts. If there are none, I'll push on to 1240-1250Mhz.


I've seen 1.4v using +200. My Elpida can do 1600 easy. Blackscreens past 1620 in some benchmarks.


----------



## Imprezzion

Quote:


> Originally Posted by *rdr09*
> 
> I've seen 1.4v using +200. My Elpida can do 1600 easy. Blackscreens past 1620 in some benchmarks.


It "idles" at 1.412v but loads around 1.30-1.32v on average.
Too bad 1250Mhz artifacts like mad lol. 1225Mhz seemed quite decent though.


----------



## rdr09

Quote:


> Originally Posted by *Imprezzion*
> 
> It "idles" at 1.412v but loads around 1.30-1.32v on average.
> Too bad 1250Mhz artifacts like mad lol. 1225Mhz seemed quite decent though.


Need more volts and lower ambient, maybe. Have you touched the power limit at all?

Are you aiming for a stable daily OC? 1225 is like 1275 on a 290, so you're still better off.


----------



## Imprezzion

Quote:


> Originally Posted by *rdr09*
> 
> Need more volts and lower ambient, maybe.


Ambient is at about 23c now. We have super hot days incoming, aka 30c+, so I will just fall back to 1150Mhz @ +100mV since that barely hits 65c on either VRM or Core. Just wanna see what the card can do with normal ambients









1230Mhz doesn't artifact in Fire Strike but still.. Something like BF4 @ Mantle @ 125% res scale artifacts and finds instability faster.

Even more volts? Nah, can't do that on this cooling. Maybe on full water, but still, I'm looking for 24/7 clocks, not bench clocks. I kinda dare to use +200mV 24/7 if temps stay like this, but any higher?


----------



## rdr09

Quote:


> Originally Posted by *Imprezzion*
> 
> Ambient is at about 23c now. We have super hot days incoming, aka 30c+, so I will just fall back to 1150Mhz @ +100mV since that barely hits 65c on either VRM or Core. Just wanna see what the card can do with normal ambients
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1230Mhz doesn't artifact in Fire Strike but still.. Something like BF4 @ Mantle @ 125% res scale artifacts and finds instability faster.


The best time to set OCs is when ambient is high or the weather is warmer. Yeah, BF4 is sensitive to OCs.


----------



## fateswarm

It was scary how high my Hynix could go without even overvolting. I left it at 1350 though; going higher doesn't seem to give a significant benefit with the GPU at 1100.


----------



## Dasboogieman

Quote:


> Originally Posted by *fateswarm*
> 
> It was scary how high my Hynix could go without even overvolting. I left it at 1350 though; going higher doesn't seem to give a significant benefit with the GPU at 1100.


Yeah, the 512bit Bus is amazing. Core speed is king on Hawaii.

I was talking with the Litecoin mining guys who were using the VRAM-timing-optimized BIOS, and some interesting observations came up. Apparently all Hawaii cards carry a preset list of timings for each memory SKU, and the IMC switches timings depending on the VRAM clock; the threshold points were something like 1250, 1375 and 1500. On top of that, the IMC on most Hawaii cards apparently throws out heaps of silent memory errors beyond 1375mhz that then get ECC-corrected, which explains why not all cards can reach 6000mhz effective memory speed. The conclusion was that the Hynix chips had the best clockspeed-to-timing ratio, which explains their excellent GPGPU performance, though there was disagreement over whether the generally better OC results came from the IMC or the VRAM itself.
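A minimal sketch of the threshold behavior described above, assuming the IMC simply picks a timing set by comparing the VRAM clock against those points. The strap names are invented for illustration; the real timing sets live in the BIOS and vary per memory SKU.

```python
# Illustrative only: pick a memory timing set from the VRAM clock (MHz)
# using the threshold points mentioned above (1250 / 1375 / 1500).
# Strap labels here are made up; real straps are defined per memory SKU.
THRESHOLDS = [(1500, "strap_1500+"), (1375, "strap_1375"), (1250, "strap_1250")]

def timing_strap(vram_clock_mhz):
    """Return the timing set the IMC would select for a given VRAM clock."""
    for threshold, strap in THRESHOLDS:
        if vram_clock_mhz >= threshold:
            return strap
    return "strap_base"

print(timing_strap(1300))  # strap_1250
print(timing_strap(1400))  # strap_1375 (and past 1375, silent ECC retries start)
```

This is also why backing off from just above a threshold (e.g. 1400 down to 1350) can help: you land on a tighter timing set and avoid the ECC-corrected error overhead described above.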


----------



## MrGaZZaDaG

Hey Guys,

Not too sure how many of you live in Australia, but I noticed you can pick up an XFX R9 290 reference-cooled graphics card for $359.
That's so cheap, considering I ordered my second R9 290 just over a month ago and got it for $400 + delivery.

http://www.shoppingexpress.com.au/buy/xfx-amd-r9-290-980-mhz-4gb-graphics-card-r9-290a-enb-512-bit-gddr5-pci-e-3.0-dual-link-dvi-i-displayport-hdmi-dvi-d/R9-290A-ENB

Also, does anyone have problems with two cards in Crossfire? Random lockups on the desktop??
I was experiencing black screens with the second card in, but I've overclocked it since and it seems to be okay.. just getting the lockups now and then...

**posted originally on The R9 290 -> 290X Unlock Thread **


----------



## alancsalt

Quote:


> Originally Posted by *MrGaZZaDaG*
> 
> Hey Guys,
> 
> Not too sure how many of you live in Australia, but I noticed you can pick up an XFX R9 290 reference-cooled graphics card for $359.
> That's so cheap, considering I ordered my second R9 290 just over a month ago and got it for $400 + delivery.
> 
> http://www.shoppingexpress.com.au/buy/xfx-amd-r9-290-980-mhz-4gb-graphics-card-r9-290a-enb-512-bit-gddr5-pci-e-3.0-dual-link-dvi-i-displayport-hdmi-dvi-d/R9-290A-ENB
> 
> Also, does anyone have problems with two cards in Crossfire? Random lockups on the desktop??
> I was experiencing black screens with the second card in, but I've overclocked it since and it seems to be okay.. just getting the lockups now and then...
> 
> **posted originally on The R9 290 -> 290X Unlock Thread **


Quote:


> WAS $428.00
> NOW $359.00
> Sale Ends 23:59 pm, 15 July 2014


You have twenty minutes.....


----------



## Aximous

Could anyone with the Aquacomputer backplate tell me how thick the thermal pads it uses are?


----------



## cennis

Quote:


> Originally Posted by *sugarhell*
> 
> Guys, it's easy to give more volts with MSI.
> 
> Just use /wi4,30,8d,10 for 100mv. The offset step is 6.25mv and the value is given in hexadecimal, so hex 10 = 16 in decimal: 16 * 6.25 = 100mv. For 50mv you need 8. For 200mv you need 20 (hex 20 = 32 in decimal, so 32 * 6.25 = 200mv).
> 
> The easy way to make changes:
> 
> Create a txt file on the desktop. Write
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi4,30,8d,10
> 
> and then save it as a .bat file. Every time you start this bat file, MSI AB will start with +100mv.
> 
> For 50mv: 8
> For 100mv: 10
> For 125mv: 14
> For 150mv: 18
> For 175mv: 1C
> For 200mv: 20
> 
> I wouldn't go over this point because
> 1) You are close to leaving the sweet spot of the reference PCB VRMs' efficiency
> 2) These commands add 200mv on top of the 100mv offset from the AB GUI. That means 300mv total.
> 
> By default the /wi command applies to the current GPU only, so if you have 2 or more GPUs you must use the /sg command. The command line then looks something like:
> ex: MsiAfterburner.exe /sg0 /wi4,30,8d,10 /sg1 /wi4,30,8d,10


I tried applying volts with this method (with /wi6 instead of /wi4) on the 295X2, but it does not work. It has always worked for me on the 290s.
The main benefit of this method is that it allows more than the +100mv the slider is limited to. Anyone know how it works on the 295X2?
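For reference, sugarhell's table quoted above is just the voltage offset in 6.25mv steps, written out in hexadecimal. A quick sketch of the conversion (`mv_to_code` is my own illustrative helper, not an Afterburner command):

```python
# Convert a requested voltage offset (in mV) to the hex value used in
# Afterburner's /wi switch, given the 6.25 mV step sugarhell describes.
# mv_to_code is an illustrative helper, not part of any Afterburner API.
STEP_MV = 6.25

def mv_to_code(mv):
    steps = round(mv / STEP_MV)   # e.g. 100 / 6.25 = 16 steps
    return format(steps, "X")     # 16 -> "10" in hex

for mv in (50, 100, 125, 150, 175, 200):
    print(f"For {mv}mv: {mv_to_code(mv)}")
```

Running it reproduces the table above (8, 10, 14, 18, 1C, 20), which is a handy sanity check before pasting a value into a .bat file.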


----------



## Imprezzion

Quote:


> Originally Posted by *cennis*
> 
> I tried applying volts with this method (with /wi6 instead of /wi4) on the 295X2, but it does not work. It has always worked for me on the 290s.
> The main benefit of this method is that it allows more than the +100mv the slider is limited to. Anyone know how it works on the 295X2?


It doesn't work on my 290 either with MSI AB 3.0.1. The program just doesn't start when I enter those codes.

Btw, I can't run BF4 at anything over 1200Mhz or so core. Even with +200mV it gives checkerboards when running Mantle @ 125% 1080p scale, 2x Adaptive AA, all Ultra settings.

And on the memory talk: if the threshold is 1375Mhz, then I'll put my VRAM back to 1350Mhz, as I was running 1400Mhz, which was max stable.


----------



## cennis

Quote:


> Originally Posted by *sugarhell*
> 
> Guys, it's easy to give more volts with MSI.
> 
> Just use /wi4,30,8d,10 for 100mv. The offset step is 6.25mv and the value is given in hexadecimal, so hex 10 = 16 in decimal: 16 * 6.25 = 100mv. For 50mv you need 8. For 200mv you need 20 (hex 20 = 32 in decimal, so 32 * 6.25 = 200mv).
> 
> The easy way to make changes:
> 
> Create a txt file on the desktop. Write
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi4,30,8d,10
> 
> and then save it as a .bat file. Every time you start this bat file, MSI AB will start with +100mv.
> 
> For 50mv: 8
> For 100mv: 10
> For 125mv: 14
> For 150mv: 18
> For 175mv: 1C
> For 200mv: 20
> 
> I wouldn't go over this point because
> 1) You are close to leaving the sweet spot of the reference PCB VRMs' efficiency
> 2) These commands add 200mv on top of the 100mv offset from the AB GUI. That means 300mv total.
> 
> By default the /wi command applies to the current GPU only, so if you have 2 or more GPUs you must use the /sg command. The command line then looks something like:
> ex: MsiAfterburner.exe /sg0 /wi4,30,8d,10 /sg1 /wi4,30,8d,10


Quote:


> Originally Posted by *Imprezzion*
> 
> It doesn't work on my 290 either with MSI AB 3.0.1. The program just doesn't start when I enter those codes.
> 
> Btw, I can't run BF4 at anything over 1200Mhz or so core. Even with +200mV it gives checkerboards when running Mantle @ 125% 1080p scale, 2x Adaptive AA, all Ultra settings.
> 
> And on the memory talk: if the threshold is 1375Mhz, then I'll put my VRAM back to 1350Mhz, as I was running 1400Mhz, which was max stable.


Just to confirm, you got it to work with the 3.0 versions? With /wi6?
Maybe they changed the code again..


----------



## bond32

Quote:


> Originally Posted by *cennis*
> 
> I tried applying volts with this method (with /wi6 instead of /wi4 to the 295x2 but it does not work. It has always worked for me on the 290s.
> The main benefit of this method is it allows more than +100mw that the slider is limited to. Anyone know how it works on the 295x2?


Did you try "/sg0" and "/sg1"? I would think that would be it for a 295...


----------



## fateswarm

Has anyone noticed Afterburner's massive amount of registry activity all the time? It shows up in Procmon.



I wonder if it can be alleviated, though I have no non-essential settings enabled. TriXX has worse CPU usage, btw.

Hopefully the impact is approximately nothing. Process Explorer gives a delta of ~50 million cycles, which isn't zero.


----------



## heroxoot

MSI AB uses less than 1% CPU for me, but I have no idea about this registry activity. Is it a big deal?


----------



## Imprezzion

Well, I found my best balanced clock / voltage profile.

Artifacts are finally gone in BF4 and if one does pop up I can raise voltage a tiny bit still.

I am now running 1180Mhz core, 1350Mhz VRAM with +150mV core. That is on an unlocked XFX reference 290, so I'm not disappointed with that at all. Some people say unlocked cards are worse than actual 290Xs, though I really, really doubt that. Still, 1180Mhz fully stable ain't bad









Temps are fine: core high 60s, VRM1 low-to-mid 70s, VRM2 high 50s (Celsius). Even with the full +200mV, the highest I've seen the core go is 71c and VRM1 84c. So even that is pretty safe to run, as a card like the XFX DD will let the VRMs go as high as 101c at stock lol..
Problem is, 1200Mhz still isn't stable / free of checkerboard artifacts even with +200mV, so yeah..


----------



## fateswarm

Quote:


> Originally Posted by *heroxoot*
> 
> MSI AB uses less than 1% CPU for me, but I have no idea about this registry activity. Is it a big deal?


*Success*. By turning off all the graphs in the settings, CPU usage dropped from a constant ~0.16% to about 0.03%.

Nope, it's not a huge deal, as evidenced by the numbers.









That cleared most of the procmon mess too apparently.


----------



## Kittencake

Lol, very little impact, but it's all good. I've been playing with Raptr, which installed when I installed the drivers.. I love it


----------



## Imprezzion

Quote:


> Originally Posted by *fateswarm*
> 
> *Success*. By turning off all the graphs in the settings, CPU usage dropped from a constant ~0.16% to about 0.03%.
> 
> Nope, it's not a huge deal, as evidenced by the numbers.


Try turning off only the CPU and RAM graphs but leave the GPU graphs enabled.
That should bring CPU usage down properly as well.

Also, on the memory discussion: I've been reading on litecointalk for a bit, and the Stilt himself (the creator of the memory-timing-tweaked BIOSes) says the ideal RAM frequencies are anywhere from 1375-1450Mhz on Elpida and 1375-1475Mhz on Hynix. That's where the latency/bandwidth trade-off is most favorable. More speed is of course always better, but Elpida, and the memory controller, can't really do much more than 1450Mhz on most cards.

So 1400Mhz seems like a good point for my card to be at, and I know it can do it stable


----------



## HoneyBadger84

Just saw a lil' over 10000MB of vRAM usage for the first time... of course that's spread across 3 GPUs, so it's actually in the 3.3-3.4GB range per card... but still, it was funny to see a 5-digit number when I looked at the OSD. Watch_Dogs too hard man.


----------



## Imprezzion

Try BF4 on Mantle with a high res or res scale. I can hit as high as 3.8GB on a single card.
I'm in-game now and GPU-Z says 3626MB VRAM usage.


----------



## meshal300

Quote:


> Originally Posted by *Aximous*
> 
> Anyone with the Aquacomputer backplate could tell me how thick thermal pads it uses?


It seems like half a mm (0.5mm).
I don't have anything to measure it with.. but it seems so.

https://www.dropbox.com/s/qbnf4ihfj5rhuxa/2014-07-15%2020.13.38.jpg

https://www.dropbox.com/s/o58se0wt3rh2xbf/2014-07-15%2020.34.50.jpg


----------



## HoneyBadger84

Quote:


> Originally Posted by *Imprezzion*
> 
> Try BF4 on Mantle with a high res or res scale. I can hit as high as 3.8GB on a single card.
> I'm in-game now and GPU-Z says 3626MB VRAM usage.


Don't have the game or I would. I missed out on the free BF4 train... I have a code that came with one of my cards, but of course the offer has expired.


----------



## Aximous

Quote:


> Originally Posted by *meshal300*
> 
> It seems like half a mm (0.5mm).
> I don't have anything to measure it with.. but it seems so.
> 
> https://www.dropbox.com/s/qbnf4ihfj5rhuxa/2014-07-15%2020.13.38.jpg
> 
> https://www.dropbox.com/s/o58se0wt3rh2xbf/2014-07-15%2020.34.50.jpg


Looks like that indeed, thanks!


----------



## 8cr13mov

Is this a good deal/card? http://www.ebay.com/itm/171329028748?_trksid=p2059216.m2763.l2649&ssPageName=STRK%3AMEBIDX%3AIT


----------



## Kittencake

that is a good deal i'd scoop it up


----------



## HoneyBadger84

Quote:


> Originally Posted by *8cr13mov*
> 
> Is this a good deal/card? http://www.ebay.com/itm/171329028748?_trksid=p2059216.m2763.l2649&ssPageName=STRK%3AMEBIDX%3AIT


I'd wait and find one that wasn't used for mining (message the sellers to ask) and get one of those. $349 is steep for a single 290X even new in box on EBay.


----------



## Redeemer

How does the 290X compete so well with GK110 with a 30% smaller die? It seems pretty amazing to me. Anyway, any news on what will replace the 290X?


----------



## HoneyBadger84

So on a hunch, I've reinstalled drivers with the cards repositioned in spread-out slots (using the black one and the bottom-most). Going to see how it affects my performance in 2 game benchmarks, rather than wasting time on the 3DMarks etc... Should be interesting. I really think that black slot gimps performance.


----------



## 8cr13mov

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I'd wait and find one that wasn't used for mining (message the sellers to ask) and get one of those. $349 is steep for a single 290X even new in box on EBay.


The seller listed it as "New Open Factory Box - Purchased for a Mining Rig Project that was Never Started". The statement seems a bit sketchy: why would you open the box if you "never started"? The seller seems to have good feedback, but it's all listed as private. $349 is still a best-offer listing, so I will try a bit lower. Anyone know if this Sapphire BF4 version has Elpida or Hynix memory? I am upgrading from a 7850.


----------



## bond32

Unless you are air cooling (and this is of course my opinion), the fact that a card was used for mining shouldn't matter. If you're putting a full-cover GPU block on it, it wouldn't make any difference. If you were air cooling, however, I would at least replace the stock heatsink and fan, as it likely ran at 100 percent for days on end.

The 2 290s I bought came from a miner. They clock fantastically and both are Hynix. Blocks should be in by this weekend...


----------



## pkrexer

I'm not sure why people are so turned off from buying cards used for mining. I picked up an XFX 290X off eBay that was used for mining. It does 1120mhz on stock volts and has Hynix memory. It also doesn't have any coil whine, which was my main concern. The guy claimed it was only used for a few weeks, but the amount of dust said otherwise.

Regardless, as long as they weren't pushing crazy volts through it on a 24/7 basis, I don't see a reason not to buy them.


----------



## Gobigorgohome

Quote:


> Originally Posted by *Imprezzion*
> 
> Fujipoly Ultra Extreme. 17w/mk and worldwide shipping from FrozenCPU.
> 
> I bought mine there as well; it cost a bit in shipping, tho they sent it to Holland in like a week?
> 
> Btw, I'm going to ask again as I didn't really fix it yet and didn't get a lot of useful responses:
> 
> I can't flash my card.
> 
> AtiWinFlash gives an ''ATIWinFlash MCP stopped working''.
> Could be Windows 8.1 tho...
> 
> A Win98-boot DOS USB gives page faults when using AtiFlash... Everyone uses it.. I have used it numerous times.. but not in this particular build. I tried all my USB ports, 4 different AtiFlash versions and 3 different USB sticks; it just does the same every time. Not even atiflash -i works. Just page faults.
> 
> Point is, I wanna flash it with the ASUS or PT1 BIOS because I have plenty of room on my temps for both VRM and core to go higher than +100mV.. But I can't like this.. MSI AB 3.0.1 also doesn't start with the commands for +100mV.


Oh my ... four of each of those pads are almost as much as one waterblock







.... it is actually more


----------



## 8cr13mov

Should I be worried about the Elpida memory? I'm not that worried that they're used, but Elpida does yield more black screens, as shown in the polls.


----------



## Gobigorgohome

Quote:


> Originally Posted by *rdr09*
> 
> You can get the more expensive stuff like what Imprezzion suggested, or something cheaper that's still better than the stock pads companies like EK provide. This is what I have on my 290 for the VRMs; the pads for the memory ICs that come with the kit will suffice. Temps in BF4 MP using three 120mm rads for the CPU and GPU . . .
> 
> 
> 
> Ambient is about 25C.


I have 2940 mm^2 radiator surface so I think I am good when it comes to cooling.

I will not overclock the GPUs like crazy though; I am happy with a +100Mhz core clock and that's it. I just do not want the cards to be hotter than necessary.


----------



## Imprezzion

Well... My card is a lot hotter than necessary now, but I got it stable at 1200Mhz core








An unlocked 290 stable at 1200Mhz.. not bad if I may say so. Problem is the kinda hot VRM1 haha.

Cooling on VRM1 is very, very good, but still, at +200mV it hit as high as 86c peak and 82c average over 2-3 hours of heavy gaming.
The core barely touched 70c, with a peak of 71c and an average of 68c. VRM2 is way cooler, at a 61c peak and 59c average.

These temps are still well below what coolers like the XFX DD or ASUS DCII let the VRMs reach even at stock clocks, but does the extra load from the higher voltage damage them sooner?


----------



## Gobigorgohome

Quote:


> Originally Posted by *Imprezzion*
> 
> Well... My card is a lot hotter than necessary now, but I got it stable at 1200Mhz core
> 
> 
> 
> 
> 
> 
> 
> 
> An unlocked 290 stable at 1200Mhz.. not bad if I may say so. Problem is the kinda hot VRM1 haha.
> 
> Cooling on VRM1 is very, very good, but still, at +200mV it hit as high as 86c peak and 82c average over 2-3 hours of heavy gaming.
> The core barely touched 70c, with a peak of 71c and an average of 68c. VRM2 is way cooler, at a 61c peak and 59c average.
> 
> These temps are still well below what coolers like the XFX DD or ASUS DCII let the VRMs reach even at stock clocks, but does the extra load from the higher voltage damage them sooner?


The VRM1 temperatures should be lower than 70 degrees Celsius on water, even with the thermal pads from EK, right?

I do not think the core will pass 50 degrees with my setup; I would be shocked if it did.


----------



## Imprezzion

With my voltage at +100mV, VRM temps barely hit 70c here on air, and that's with nothing more than a cut-out piece of the stock cooler and the Accelero Hybrid fan cooling them.


----------



## Gobigorgohome

Quote:


> Originally Posted by *Imprezzion*
> 
> With my voltage at +100mV, VRM temps barely hit 70c here on air, and that's with nothing more than a cut-out piece of the stock cooler and the Accelero Hybrid fan cooling them.


So you think I should be good with those EK thermal pads even though they are just 3-5 w/mk?


----------



## Imprezzion

Quote:


> Originally Posted by *Gobigorgohome*
> 
> So you think I should be good with those EK thermal pads even though they are just 3-5 w/mk?


Basically, yeah. However, the pads EK ships with their blocks are a little on the thin side and don't give good contact; that's why everyone switches them out.

Hell, I use the AMD stock TIM strip that comes with the reference cooler now lol.
I ran out of Ultra Extreme, and like you said, it's very expensive.. A 100mm x 15mm strip is enough for all the VRMs on a 290/290X or 780/780Ti if you cut it lengthwise into 2 strips 7.5mm wide.


----------



## Gualichu04

Is there any way to get the max voltage higher than +100mv in MSI Afterburner?


----------



## HoneyBadger84

My main question for eBay sellers of former mining cards is what temps they were running at. That's the key, I think. You don't wanna buy a card that was run at 80+C constantly for weeks/months. Most miners that resell on eBay don't manually set the fan higher than 55%, which is why the cards run so hot. The two mining cards I got previously both ran great, just needed a good cleaning... and I had to fix their BIOSes because one on each card was fubared. Other than that they ran like champs til I resold them.

The three HIS cards I have now were also listed as purchased for a project but never used for it, and they came from a miner as well. I just tested them at 1150/1400 for some benchmarks with +125mV; they barely broke 70C, but that was with the fans maxed and my shop-blower fan on side intake duty.


----------



## fateswarm

I'd like to try a good Mantle game to test the card, but I haven't heard great things about BF4, and I'm not into raw FPS gameplay lately. I did try the Star Swarm demo recently and was impressed: multiple times higher fps in some instances.


----------



## pdasterly

To water or not to water: I'm considering going full custom water; would that push my cards further? Currently the cards can handle up to +81mv at 1125 on the GPU. Any higher on the GPU produces errors in OCCT even with increased VDDC; +81mv is the lowest VDDC I can run without errors in OCCT.


----------



## Ramzinho

Quote:


> Originally Posted by *pdasterly*
> 
> To water or not to water: I'm considering going full custom water; would that push my cards further? Currently the cards can handle up to +81mv at 1125 on the GPU. Any higher on the GPU produces errors in OCCT even with increased VDDC; +81mv is the lowest VDDC I can run without errors in OCCT.


what are your current temps... VRM and Core?


----------



## Dasboogieman

Quote:


> Originally Posted by *Redeemer*
> 
> How does the 290X compete so well with GK110 with a 30% smaller die? It seems pretty amazing to me. Anyway, any news on what will replace the 290X?


Similar shading power, similar clock speeds, similar VRAM bandwidth. The key difference is that Hawaii has twice the ROPs, while GK110 has 25% more texture processing power. Additionally, GK110 is capable of boosting to higher clockspeeds than Hawaii, albeit in very limited scenarios and for limited times. Ignoring driver differences, this basically means GK110 ranges from slightly to moderately more powerful in older games and at resolutions below 4K, but it bleeds a lot of performance at high resolutions and with heavy MSAA/SSAA use, whereas Hawaii is less affected.
So basically, unless the game uses heavy amounts of texture filtering/high-resolution textures, Hawaii and GK110 should be equal, or Hawaii may even have a slight edge at the same clock speed.

As for the die size difference, Hawaii uses a much simpler IMC, which accounts for about 20% of the space savings. Additionally, GK110 has more texture units and FP64 hardware, which make up the remainder.


----------



## pdasterly

Quote:


> Originally Posted by *Ramzinho*
> 
> what are your current temps... VRM and Core?


62c gpu max under heavy load, vrm1 up to 98c under heavy load(occt, furmark)


----------



## sugarhell

Quote:


> Originally Posted by *Dasboogieman*
> 
> Similar shading power, similar clock speeds, similar VRAM bandwidth. The key difference is that Hawaii has twice the ROPs, while GK110 has 25% more texture processing power. Ignoring driver differences, this basically means GK110 ranges from slightly to moderately more powerful in older games and at resolutions below 4K, but it bleeds a lot of performance at high resolutions and with heavy MSAA/SSAA use, whereas Hawaii is less affected.
> 
> As for the die size difference, Hawaii uses a much simpler IMC, which accounts for about 20% of the space savings. Additionally, GK110 has more texture units and FP64 hardware, which make up the remainder.


Why? Doesn't Hawaii have FP64?

The only reason the Hawaii die is small is HDL.


----------



## Ramzinho

Quote:


> Originally Posted by *pdasterly*
> 
> 62c gpu max under heavy load, vrm1 up to 98c under heavy load(occt, furmark)


I think these are very good temps.. you will not see huge improvements going water.. but I'll wait for the experts to answer. I just got my 290X and I can't push it because it gets toasty


----------



## pdasterly

Quote:


> Originally Posted by *Ramzinho*
> 
> I think these are very good temps.. you will not see huge improvements going water.. but I'll wait for the experts to answer. I just got my 290X and I can't push it because it gets toasty


I have an NZXT G10; I'm just trying to justify getting a real water loop to lower my VRM temps, but it seems like my cards won't go any faster anyway, due to the errors in OCCT?


----------



## HoneyBadger84

Quote:


> Originally Posted by *pdasterly*
> 
> 62c gpu max under heavy load, vrm1 up to 98c under heavy load(occt, furmark)


FurMark bad. Someone on here referred to it as like running your run-flat tires over spike strips to make sure they work.

I'd use 3-4 loops of Unigine Valley to get more realistic temps out of your card. 98C is nuts, at least to me. I've never seen my VRMs go past 65C even when OCed on these new cards. At stock speeds they don't even get near 60; they max out around 48-56c depending on the stress level of what I'm running.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Ramzinho*
> 
> I think these are very good temps.. you will not see huge improvements going water.. but I'll wait for the experts to answer. I just got my 290X and I can't push it because it gets toasty


Tellin ya man: 100% fan, +125 or +131mV, 1150/1400. If it doesn't at least pass benchmarks, I'll be amazed. And it should stay under 70C unless your case doesn't have adequate airflow... I recommend earplugs to offset the fan noise if you're sensitive to it. Lol

I did a nice Watch_Dogs session earlier and, between Vsync and the load splitting TriFire does, all 3 cards stayed under 60C most of the time, though I could hear the roar of their fans over the game.

I need to try adjusting my fan speed settings down so they cap out at around 70C, and see if it's quiet enough in comparison to warrant the heat increase.

To me, between 50-75% is where they go from barely there, to noticeable, to loud.


----------



## ZealotKi11er

Can you update me? I now have 2 x 290; one is unlocked to a 290X. (Water-cooled with EK blocks)


----------



## Dasboogieman

Quote:


> Originally Posted by *sugarhell*
> 
> Why? Hawaii doesnt have FP64?
> 
> The only reason that hawaii die is small:HDL


That too; HDL would account for a lot of it. Hawaii has FP64 hardware, just less of it, though how much of that is down to artificial limits and how much to hardware is unknown.


----------



## AK-47

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Can you update me? I now have 2 x 290; one is unlocked to a 290X. (Water-cooled with EK blocks)


How much performance are you missing out on by CFing a 290X with a 290?


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Can you update me? I now have 2 x 290; one is unlocked to a 290X. (Water-cooled with EK blocks)


Very nicely done, Zeal. Just beautiful.

@pdasterly, it is not too late to watercool.


----------



## heroxoot

Quote:


> Originally Posted by *AK-47*
> 
> > Originally Posted by *ZealotKi11er*
> > 
> > Can you update me? I now have 2 x 290; one is unlocked to a 290X. (Water-cooled with EK blocks)
> 
> How much performance are you missing out on by CFing a 290X with a 290?

Probably not much. I hear the difference is 10% at most between a 290 and a 290X at the same clocks; it really isn't huge. Plus he bought 290s and got lucky that one unlocked, so it's no skin off his nose, right? Lucky man.


----------



## pdasterly

Which waterblocks fit a Sapphire reference 290X?


----------



## iiNTEL

I bought an HIS IceQ X2 (non-Turbo) since the website says it has a reference PCB, and I ordered an Aquacomputer 290X waterblock, but the fricken thing doesn't fit. The capacitors are taller than on the reference PCBs I just Google image searched. Anyone have any ideas? It's only off by like 1mm.... just ever so slightly.....

BTW, CoolingConfigurator said this was a reference PCB, and they did a visual check. That's why I assumed it would fit. I'm extremely annoyed.

Any ideas or thoughts on what I should do?

Let me know if you guys want pictures


----------



## ZealotKi11er

Quote:


> Originally Posted by *AK-47*
> 
> How much performance are you missing out on by CFing a 290X with a 290?


At the same clocks the 290X is ~7.5% faster. Given I have two cards, I am losing 3-4% performance. In reality it's nothing much. I would only recommend the 290X if you want the very best single GPU from AMD. In CF I say get 290s. The reason I am saying this is because in DX11, even at 1440p, apart from 1-2 games it's very hard to push the cards before the CPU holds them back. I don't even have to OC. In BF4 right now I get ~100-120 fps at 1440p Ultra, 0xAA, 125% scaling, which is technically 1800p. There is a huge performance increase from Mantle over DX11 with CF in BF4, something in the 30-40% range. Unless I wanted to go 4K I personally would not use my own money to buy 2 x 290s for 1440p. DX11 really kills the scaling of these cards in CPU-limited games.
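As a sanity check on that render-scale math (a sketch, assuming BF4's resolution-scale slider multiplies each axis, with a 2560x1440 base):

```python
# Hypothetical check: 125% resolution scale applied per axis to 1440p.
base_w, base_h = 2560, 1440
scale = 1.25  # the 125% slider setting

render_w = int(base_w * scale)
render_h = int(base_h * scale)
print(f"{render_w}x{render_h}")  # 3200x1800 -- hence "technically 1800p"

# Pixel count relative to native 1440p:
print(round((render_w * render_h) / (base_w * base_h), 2))  # 1.56
```

So 125% scaling pushes about 56% more pixels than native 1440p, which is why the cards see something closer to an 1800p load than a 1440p one.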


----------



## aaroc

Quote:


> Originally Posted by *iiNTEL*
> 
> I bought an HIS Iceq x2 (non-turbo) as the website says it has a reference PCB. But i ordered an Aquacomputer 290x waterblock, the fricken thing doesnt fit. The capacitors are larger than the reference PCBs i just google image searched. anyone have any ideas? its only by like 1mm.... just ever so slightly.....
> 
> BTW coolingconfigurator said that this was a reference PCB and they did a visual check. So thats why i assumed that it would fit. Im extremely annoyed.
> 
> Any ideas or thoughts on what i should do?
> 
> Let me know if you guys want pictures


EK released a V2 water block because of the taller capacitors.
Quote:


> Rev.2.0 brings compatibility for previously unsupported MSI® and GIGABYTE® model graphics cards.


----------



## iiNTEL

Quote:


> Originally Posted by *aaroc*
> 
> EK released a V2 water block, because of the taller capacitors.


Poo. The EK blocks weren't as nice looking as the Aquacomputer block and performed a few degrees worse. Guess the degrees don't really matter... still sucks though. I saw the revision to the EK blocks, but since I saw no news of Aquacomputer releasing a new revision, I didn't realize it applied to their block as well. And I guess I missed people complaining the block didn't fit; a quick Google search doesn't yield me any results either. Guess I should get the EK block...

Any other advice or knowledge on what other block I can get?


----------



## Kittencake

K, I had my GPU temp shoot up to like 62°C when I had a YouTube vid fullscreened; it doesn't even get that hot when I play The Old Republic on low settings.


----------



## oDizz82

Anyone have an idea of what wattage PSU is recommended for running two r9-290x's in crossfire with an FX-8350 OC'ed to 4.6GHz?
Currently I'm running two GTX 770's on a gold rated Corsair RM750 without any issues. Will possibly be trading my two 770's for two r9-290x's and trying to figure out if I will need a beefier PSU.


----------



## Kittencake

safe bet 1000 watt min


----------



## boot318

Quote:


> Originally Posted by *oDizz82*
> 
> Anyone have an idea of what wattage PSU is recommended for running two r9-290x's in crossfire with an FX-8350 OC'ed to 4.6GHz?
> Currently I'm running two GTX 770's on a gold rated Corsair RM750 without any issues. Will possibly be trading my two 770's for two r9-290x's and trying to figure out if I will need a beefier PSU.


You're going to need to upgrade your PSU. I wouldn't dare stress test all the components at the same time if you didn't.

Kinda hard seeing someone doing that trade in your favor without cash.


----------



## tsm106

Quote:


> Originally Posted by *iiNTEL*
> 
> Quote:
> 
> 
> 
> Originally Posted by *aaroc*
> 
> EK released a V2 water block, because of the taller capacitors.
> 
> 
> 
> poo. *The EK blocks weren't as nice looking as the aquacomputer block and performed a few degrees worse.* Guess the degrees dont really matter..... Still sucks though. I saw the revision to the EK blocks but since i saw no news of aquacomputer releasing a new revision i didnt realize it applied to their block aswell. and i guess i missed people complaining the block didnt fit. A quick google search doesnt yield me any results either. Guess i should get the EK block....
> 
> any other advice or knowledge on what other block i can get?

That's up to preference because I think the Aqua block is horrendously tacky looking. Also regarding performance, the review that has the Aqua block in the lead is ugh... does not compute. That said, I don't trust block reviews unless it's Martin running the tests.

Quote:


> Originally Posted by *Kittencake*
> 
> safe bet 1000 watt min


I would even recommend more since the jump from 1kw to 1.3kw is small compared to replacing a whole psu later. Besides, it's not good to run your psu sustained at close to max load.


----------



## oDizz82

Quote:


> Originally Posted by *boot318*
> 
> Your going to need to upgrade your PSU. I wouldn't dare stress test all the components at the same time if you didn't.
> 
> Kinda hard seeing someone doing that trade in your favor without cash.


Not a straight trade... but if I work it right, I'll only have to shell out a hundred bucks or so **fingers crossed**


----------



## tsm106

Quote:


> Originally Posted by *oDizz82*
> 
> Quote:
> 
> 
> 
> Originally Posted by *boot318*
> 
> Your going to need to upgrade your PSU. I wouldn't dare stress test all the components at the same time if you didn't.
> 
> Kinda hard seeing someone doing that trade in your favor without cash.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not a straight trade... but if I work it right, I'll only have to shell out a hundred bucks or so **fingers crossed**

Totally worth it man!


----------



## iiNTEL

Quote:


> Originally Posted by *tsm106*
> 
> That's up to preference because I think the Aqua block is horrendously tacky looking. Also regarding performance, the review that has the Aqua block in the lead is ugh... does not compute. That said, I don't trust block reviews unless it's Martin running the tests.


Fair enough. But besides EK, should I just look for blocks that are guaranteed to fit the MSI and Gigabyte versions?


----------



## tsm106

Those revised boards are a pain in the ass, so your only choice is to find specific-fitment blocks. And in those cases you don't have much choice in availability. Can you not find any true reference cards?


----------



## iiNTEL

Quote:


> Originally Posted by *tsm106*
> 
> Those revised boards are a pain in the ass, so your only choice is to find specific-fitment blocks. And in those cases you don't have much choice in availability. Can you not find any true reference cards?


At the time I didn't realize there were revised reference boards. Seems stupid to still call them reference. I bought the HIS one because they sold it as a "reference PCB" and the Aquacomputer block was made to fit "reference PCBs". I didn't realize there was a marketing gimmick in the word "reference".


----------



## Kittencake

Quote:


> Originally Posted by *iiNTEL*
> 
> At the time i didnt realize there were revised reference boards. Seems stupid to still call them reference. I bought the HIS one because they sold it as a "reference PCB" and the aquacomputer block was made to fit "reference PCBs". i didnt realize there was a marketing gimmick in the word reference.


isn't that false advertising?


----------



## iiNTEL

Quote:


> Originally Posted by *Kittencake*
> 
> isn't that false advertising?


I don't care about that, but even Cooling Configurator called it a reference PCB. They didn't offer a warning or anything; that's what pisses me off. Screw HIS and screw Cooling Configurator. Aquacomputer should also offer a warning, though I guess it isn't necessarily their fault... but where is their revised waterblock? Someone needs to start a thread on this, because I did my research and didn't find anything specific enough to guide my choice.


----------



## Gobigorgohome

Quote:


> Originally Posted by *Imprezzion*
> 
> Basically, yeah, however, the pads EK ships with their blocks are a little on the thin side and don't give good contact. That's why everyone switches them out.
> 
> Hell, I use the AMD stock TIM strip that comes with the reference cooler now lol.
> I ran out of Ultra Extreme and like you said, it's very expensive.. A strip of 100mmx15mm is enough for all VRM's on a 290/290X or 780/780Ti if you cut it lengthwise in 2 strips of 7.5mm wide.


Just ordered some Fujipoly Ultra Extreme 1mm and 0,5mm, I calculated it out to work with both GPU-blocks and the RIVBE Monoblock I am getting. Thanks for clearing things up!


----------



## rdr09

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Just ordered some Fujipoly Ultra Extreme 1mm and 0,5mm, I calculated it out to work with both GPU-blocks and the RIVBE Monoblock I am getting. Thanks for clearing things up!


Went all out for the VRMs... might as well go all out for the core. What paste are you planning to use? I recommend Gelid Extreme. Go to the Extreme. lol

You won't be disappointed with the Fujis. Hard to apply, though; prolly just my huge fingers. Don't cut your nails until after you've applied them.


----------



## Gobigorgohome

Quote:


> Originally Posted by *rdr09*
> 
> went all out for the vrms . . . might as well go all out for the core. what paste are you planning to use? i recommend Gelid Extreme. Go to the Extreme. lol
> 
> you won't be disappointed with the Fujis. hard to apply, though. prolly just my huge fingers. don't cut your nails not till after you applied them.


Will order some Gelid Extreme too. Seems like my water cooling seller has them in stock, awesome!

Usually I use Noctua's thermal paste and it has worked well for me.


----------



## Dasboogieman

Quote:


> Originally Posted by *rdr09*
> 
> went all out for the vrms . . . might as well go all out for the core. what paste are you planning to use? i recommend Gelid Extreme. Go to the Extreme. lol
> 
> you won't be disappointed with the Fujis. hard to apply, though. prolly just my huge fingers. don't cut your nails not till after you applied them.


Gelid Extreme is incredible. Likewise, you can also consider Phobya HeGrease Extreme which uses the exact same OEM as the Gelid. Whichever is more available or cheaper.


----------



## chronicfx

Quote:


> Originally Posted by *8cr13mov*
> 
> The seller listed as "New Open Factory Box - Purchased for a Mining Rig Project that was Never Started" The statement seems a bit sketchy, but why would you open box if you "never started" The seller seems to have good feedback but all listed as private. $349 is still a best offer so i will try a bit lower. Any one know if this Sapphire BF4 version has Elpida or Hynix memory? I am upgrading a from 7850.


Mine is hynix


----------



## Brian18741

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Just ordered some Fujipoly Ultra Extreme 1mm and 0,5mm, I calculated it out to work with both GPU-blocks and the RIVBE Monoblock I am getting. Thanks for clearing things up!


Do you need both sizes or are you just covering yourself?

Also, anyone know a supplier in Europe? I can get it from Frozen CPU, but it takes up to a month to get here unless I want to pay silly shipping rates, on top of the up to 18 days before it even ships.

I need to order some for my two Asus 290 DCIIs. How much and what thickness will I need? Wanna get it right the first time since it takes so long to order and deliver.


----------



## invincible20xx

Quote:


> Originally Posted by *rdr09*
> 
> another year? i don't know about you, i also play at 1080 and my single 290 seems overkill atm but i am thinking of getting the 390 when it comes out. scratches itch.


only one more year @ 1080p , i have 2 x r9 290's in crossfire , man that is sad lol


----------



## heroxoot

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *invincible20xx*
> 
> how long do you guys think a crossfire r9 290 system will serve me @ 1080p maxed out 60fps , look at sig rig , thanks !
> 
> 
> 
> another year? i don't know about you, i also play at 1080 and my single 290 seems overkill atm but i am thinking of getting the 390 when it comes out. scratches itch.

It's overkill for the most part. Still, it's a pain that some games like Borderlands 2 just don't run as well as they should. Such games should not be leaving the GPU at a 30% load.


----------



## Gobigorgohome

Quote:


> Originally Posted by *Brian18741*
> 
> Do you need both sizes or are you just covering yourself?
> 
> Also anyone know a supplier in Europe? I can get it from Frozen CPU but it takes up to one month to get here unless I want to pay silly shipping rates, on top of the up to 18 days to ship it.
> 
> I need to order some for my two Asus 290 DCII. How much and what thickness will I need? Wanna get it right the first time as it takes so long to order and deliver


From the installation manual for the EK-FC R9 290X, I need both 0.5 and 1.0 mm. Just bought what I need.

Just order from Frozen CPU; express shipping for only 40 bucks.


----------



## invincible20xx

Quote:


> Originally Posted by *heroxoot*
> 
> It's overkill for the most part. Still a pain some games like Borderlands 2 just don't run as good as they should. Such games should not be showing a 30% load.


What do you mean? I never played Borderlands 2 on my rig so far.


----------



## kizwan

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Imprezzion*
> 
> With my voltage on +100mV VRM temps barely hit 70c here on air. It's nothing more then a cut out piece of stock cooler and the Accelero Hybrid fan cooling it.
> 
> 
> 
> So you think I should be good with those EK-thermal pads even though they are just from 3-5 w/mk?

I wouldn't call them good, but you can use them. They did provide excellent cooling performance for the VRM on my 5870, though. If you're overclocked 24/7, depending on your ambient, and you don't use a water chiller and/or A/C, I recommend a better thermal pad like Phobya 7W/mK or Fujipoly Extreme 11W/mK. These two are cheaper than Fujipoly Ultra Extreme but should be good enough for a 290/290X.

This is a comparison I did between the EK & Fujipoly Extreme thermal pads:
http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/20020#post_22044493
Quote:


> Originally Posted by *Imprezzion*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gobigorgohome*
> 
> So you think I should be good with those EK-thermal pads even though they are just from 3-5 w/mk?
> 
> 
> 
> *Basically, yeah, however, the pads EK ships with their blocks are a little on the thin side and don't give good contact. That's why everyone switches them out.*
> 
> Hell, I use the AMD stock TIM strip that comes with the reference cooler now lol.
> I ran out of Ultra Extreme and like you said, it's very expensive.. A strip of 100mmx15mm is enough for all VRM's on a 290/290X or 780/780Ti if you cut it lengthwise in 2 strips of 7.5mm wide.

That is not true. The EK thermal pads fit just fine and have good contact with the VRMs. People changed them to Fujipoly for better cooling performance.


----------



## heroxoot

Quote:


> Originally Posted by *invincible20xx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> It's overkill for the most part. Still a pain some games like Borderlands 2 just don't run as good as they should. Such games should not be showing a 30% load.
> 
> 
> 
> what do you mean ? i never played borderlands 2 on my rig so far

I mean on my 7970 I had 100+ FPS maxed in Borderlands 2, and on my 290X the frame rate is crazy, 30-80 fps, and it never puts my GPU under a proper load.


----------



## invincible20xx

Quote:


> Originally Posted by *heroxoot*
> 
> I mean on my 7970 I had 100+ FPS maxed on borderlands 2 and on my 290X the frame rate is crazy 30 - 80fps and never puts my GPU at a proper load for such a strong game.


Must be some driver issue.

I'm wondering how long my system will tide me over maxed out @ 1080p, because I'm looking to improve my gaming experience by getting a better sound system + a 1080p projector, so there will be no money to upgrade the rig itself again for the next 2 years at least.

So I'm really hoping my sig rig will hold its own @ 1080p for a long time to come.


----------



## AK-47

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Same clocks 290X ~ 7.5% faster. Given i have to cards i am losing 3-4% performance. In reality is nothing much. I would only recommend 290X if you want the very best single GPU from AMD. In CF i say get 290s. The reason i am saying this is because in DX11 even at 1440p apart from 1-2 games its very hard to push the card before CPU holds them back. I dont even have to OC. In BF4 right now i get ~ 100-120 fps 1440p Ultra 0XAA 125% Scaling which is technically 1800p. There is huge performance increase from Mantle and DX11 with CF in BF4. Something 30-40% range. Unless i want to go 4K i personally would not use my own money to buy 2 x 290s for 1440p really. DX11 really kills the scaling of these cards in CPU limited games.


But doesn't your best card scale down to match the performance of the slower card?
So both cards are essentially running as CF 290s and you're not gaining anything from the X?
Which would bring the performance loss closer to 14-20%?
Or do both cards just perform together at individual specs?
I only ask because I ordered a 290 in hopes of unlocking it to a 290X and eventually get a second 290 to CF.
So trying to figure out if I'll need a 290X if it unlocks or if I can get by with a 290


----------



## kizwan

Quote:


> Originally Posted by *Kittencake*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iiNTEL*
> 
> At the time i didnt realize there were revised reference boards. Seems stupid to still call them reference. I bought the HIS one because they sold it as a "reference PCB" and the aquacomputer block was made to fit "reference PCBs". i didnt realize there was a marketing gimmick in the word reference.
> 
> 
> 
> isn't that false advertising?

I don't think so. They're correct to advertise it as a reference PCB if they did use the reference PCB layout; they just didn't say anything about the components they use.
Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Brian18741*
> 
> Do you need both sizes or are you just covering yourself?
> 
> Also anyone know a supplier in Europe? I can get it from Frozen CPU but it takes up to one month to get here unless I want to pay silly shipping rates, on top of the up to 18 days to ship it.
> 
> I need to order some for my two Asus 290 DCII. How much and what thickness will I need? Wanna get it right the first time as it takes so long to order and deliver
> 
> 
> 
> From the installation manual for the ek-fc r9 290x i need both 0,5 and 1,0 mm. Just bought what i need.
> 
> Just order from frozencpu, express shipping for only 40 bucks.

Yeah, *only* 40 bucks.


----------



## ZealotKi11er

Quote:


> Originally Posted by *AK-47*
> 
> But doesn't your best card scale down to match the performance of the slower card?
> So both cards are essentially running as CF 290s and you're not gaining anything from the X?
> Which would bring the performance loss closer to 14-20%?
> Or do both cards just perform together at individual specs?
> I only ask because I ordered a 290 in hopes of unlocking it to a 290X and eventually get a second 290 to CF.
> So trying to figure out if I'll need a 290X if it unlocks or if I can get by with a 290


It should not. If the 290X is 10% faster than the 290, then 290X CF is 10% faster than 290 CF, not 20%.
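A quick sketch of why the gap stays at ~10% rather than doubling (the 0.9 CF-scaling factor below is an arbitrary assumption for illustration):

```python
# Normalized single-card performance; 290X assumed ~10% faster.
r290, r290x = 1.00, 1.10
cf_scaling = 0.9  # hypothetical crossfire scaling efficiency

cf_290 = 2 * r290 * cf_scaling
cf_290x = 2 * r290x * cf_scaling

# The relative gap between the two pairs is unchanged at 10%:
print(round(cf_290x / cf_290 - 1, 2))  # 0.1
```

The per-card gap is a multiplicative factor, so it carries through crossfire unchanged instead of compounding.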


----------



## Gobigorgohome

Quote:


> Originally Posted by *kizwan*
> 
> I don't think so. They're correct to advertised as reference PCB if they did use reference PCB but they didn't say anything about the components they use.
> Yeah, *only* 40 bucks.


I have four R9 290Xs (and water cooling coming), so 40 bucks in shipping doesn't hurt much. Heck, shipping within Norway costs 40 bucks.


----------



## AK-47

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It should not. If 290X is 10% faster then 290 then 290X CF is 10% faster then 290 CF not 20%.


yeah good point.
But your 2 cards are essentially running as 2 290s and not a 290x and 290 correct?


----------



## sugarhell

Quote:


> Originally Posted by *AK-47*
> 
> yeah good point.
> But your 2 cards are essentially running as 2 290s and not a 290x and 290 correct?


No. It's a 290X and a 290.


----------



## HoneyBadger84

Quote:


> Originally Posted by *oDizz82*
> 
> Anyone have an idea of what wattage PSU is recommended for running two r9-290x's in crossfire with an FX-8350 OC'ed to 4.6GHz?
> Currently I'm running two GTX 770's on a gold rated Corsair RM750 without any issues. Will possibly be trading my two 770's for two r9-290x's and trying to figure out if I will need a beefier PSU.


Having done testing with a watt-o-meter and 2 290Xs OC'd on a 4.6GHz 3930K, I can tell you 1000W is a good idea with that power hog of a CPU you have. I saw just over 1100W from the wall, which means I was actually outputting in the 850-920W range. Not much headroom if you only had a 1kW PSU.
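For reference, the wall-versus-output numbers above imply a plausible efficiency figure for this class of unit (a sketch; the 1100W wall reading and 850-920W output estimate are from the post above):

```python
# Wall draw is AC input; the PSU delivers less on the DC side,
# with the difference lost as heat (the efficiency rating).
wall_watts = 1100
dc_low, dc_high = 850, 920  # estimated DC-side output range

implied_eff = (dc_low / wall_watts, dc_high / wall_watts)
print(tuple(round(e, 2) for e in implied_eff))  # (0.77, 0.84)
```

That implied 77-84% efficiency is in the right ballpark for a heavily loaded unit, which is why a wall reading always overstates the load actually placed on the PSU's rating.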


----------



## heroxoot

Quote:


> Originally Posted by *invincible20xx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> I mean on my 7970 I had 100+ FPS maxed on borderlands 2 and on my 290X the frame rate is crazy 30 - 80fps and never puts my GPU at a proper load for such a strong game.
> 
> 
> 
> must be some driver issue
> 
> i'm wondering how long my system will tie me maxed out @ 1080p because i'm looking to improve my gaming experience by getting a better sound system + a 1080p projector so there will be no money to upgrade the rig itself again in the next 2 years at least
> 
> so really hoping my sig rig will hold it's own @ 1080p for a long time to come

It's not just the 290X; I see complaints from both sides on the Gearbox forums. It's playable but not very smooth, to say the least. I don't think it's a driver issue since it happens to both teams.


----------



## ZealotKi11er

Quote:


> Originally Posted by *AK-47*
> 
> yeah good point.
> But your 2 cards are essentially running as 2 290s and not a 290x and 290 correct?


I can test for you and see if I drop performance by going from 290X + 290 to 290 + 290. Either way I have my 290X as the primary card, so if a game does not support CF, or I'm running windowed mode, I get 290X performance.


----------



## HoneyBadger84

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I can test for you and see if i drop performance by going 290X + 290 to 290 + 290. Either way i have my 290X as primary card so if a game does not support CF or running windows mode i get 290X performance.


I ran a 290X & 290 in crossfire briefly, and I can tell you the performance difference between it & crossfired 290Xs is pretty small. Somewhat noticeable in benchmarks, but in games it's no big difference, and it does seem like they both perform to their own specs rather than trying to meet one another in the middle.

I have a reference 290 that I haven't gotten rid of yet, but I can't be bothered to uninstall TriFire just to check that out.


----------



## fateswarm

Quote:


> Originally Posted by *fateswarm*
> 
> *Success*. By turning off all the graphs from the settings, cpu usage dropped from ~0.16% constantly to about 0.03%.
> 
> Nope, it's not a huge deal as it's evidenced by the numbers.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That cleared most of the procmon mess too apparently.


I took it one step further and eliminated it completely. I made a scheduled task to kill it 10 seconds after user log on.


----------



## heroxoot

Quote:


> Originally Posted by *fateswarm*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fateswarm*
> 
> *Success*. By turning off all the graphs from the settings, cpu usage dropped from ~0.16% constantly to about 0.03%.
> 
> Nope, it's not a huge deal as it's evidenced by the numbers.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That cleared most of the procmon mess too apparently.
> 
> 
> 
> I took it one step further and eliminated it completely. I made a scheduled task to kill it 10 seconds after user log on.

I don't get that. I have all the graphs on and it uses .05%.


----------



## Gobigorgohome

Anyone know if BF4 uses more power at ultra settings at 4K vs low settings at 4K?
My system crashes and I get an error saying something about PSU failure when I run it with quadfire; on low everything is just fine (no problems).
I think the GPUs thermal throttle a lot more easily on ultra anyway. I guess I will know for sure tomorrow when I get the second EVGA G2 1300W.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Anyone know if bf4 use more power at ultra settings at 4k vs low settings on 4k?
> My system crash and i get an error sayin something about psu-failure when i did it with quadfire, on low everything is just fine (no problems).
> I think the gpu's thermal throttle alot easier on ultra anyways. I guess i will know for sure tomorrow when i get the second EVGA G2 1300W.


For sure they use more power.


----------



## fateswarm

Quote:


> Originally Posted by *heroxoot*
> 
> I don't get that. I have all the graphs on and it uses .05%.


Yep. It's very minimal. Borderline OCD treatment.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Anyone know if bf4 use more power at ultra settings at 4k vs low settings on 4k?
> My system crash and i get an error sayin something about psu-failure when i did it with quadfire, on low everything is just fine (no problems).
> I think the gpu's thermal throttle alot easier on ultra anyways. I guess i will know for sure tomorrow when i get the second EVGA G2 1300W.


What's your other hardware again, and what clocks are you running the GPUs at? I've run QuadFire on my PSU, but that was with all 4 cards at stock, and I saw just over 1200W actual PSU power (~1450W from the wall). I've heard the EVGA PSU is pretty good, so I can't imagine why it'd be having problems unless you're hitting the PSU limit, which you very well could be in BF4 at 4K; I know that game can use every bit of GPU it has access to.

So it would stand to reason that it's POSSIBLE you're actually hitting the limit of that PSU. This is why I was planning on going dual PSUs in my system when I get QuadFire up & running. Just on TriFire I've seen draws of almost 1200W actual, with the cards OC'd a bit.


----------



## Gobigorgohome

Quote:


> Originally Posted by *HoneyBadger84*
> 
> What's your other hardware again, and what clocks are you running the GPUs at? I've run QuadFire on my PSU but that was with all 4 cards at stock, and I saw just over 1200W actual PSU power (~1450W from the wall). I've heard the eVGA PSU is pretty good, so I can't imagine why it'd be having problems, unless you're hitting the PSU limit, which you very well could be in BF4 at 4K, I know that can use every bit of GPU it has access to:
> 
> 
> 
> 
> 
> 
> So it would stand to reason that it's POSSIBLE you're actually hitting the limit of that PSU. This is why I was planning on going dual-PSUs in my system when I get QuadFire up & running. Just on TriFire I've seen draws almost up to 1200W actual, with the cards OCed a bit.


RIVBE, 3930K, 16GB 1866MHz, Seagate Barracuda 3TB, 4x reference R9 290X and an EVGA G2 1300W. Hitman: Absolution maxed out (with AA) and Tomb Raider at Ultimate at 4K run just fine. I only have the problem at ultra settings in BF4; it seemed to struggle a bit before it shut down, so I figured it was the PSU.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Gobigorgohome*
> 
> RIVBE, 3930k, 16 GB 1866mhz, seagate barracuda 3 TB, 4x ref r9 290x and EVGA G2 1300W. Hitman absolution maxed out (with AA) and Tomb Raider at ultimate at 4k runs just fine. I only have the problem at ultra settings in bf4, it seemed to struggle a bit before it shut down so i figured it was the psu.


BF4 is going to stress your system more than either of those games, I would imagine. I don't have the game or I'd do a load test myself on my system for some direct comparison as far as draw on a TriFire system goes.

Newegg's testing of QuadFire 290Xs, with a system about the same as mine in terms of clocks etc., saw about 1225W actual power draw (more at the wall of course), and that was a pretty basic system with no extra drives and the like. Battlefield 4 is supposed to be one of the most stressful titles available at the moment, so I wouldn't doubt that you're hitting the limit of that PSU.


----------



## PCSarge

The EVGA Stinger mini-ITX board was just dropped off at my house. Now I want work to end early.


----------



## Beezleybuzz

Hey guys, been lurking for a bit, don't think I'm actually officially 'in the club' yet.

http://www.techpowerup.com/gpuz/vcgx/

There is my GPU-Z validation. I actually have 2 Asus R9 290 DCUII OCs but was really tired of crossfire not working in certain games. My overclock is also substantially higher with a single card, and heat is essentially halved (with 1 of 2 cards).


----------



## Gobigorgohome

Quote:


> Originally Posted by *HoneyBadger84*
> 
> BF4 is going to stress your system more than either of those games, I would imagine. I don't have the game or I'd do a load test myself on my system for some direct comparison as far as draw on a TriFire system goes.
> 
> Newegg's testing of QuadFire 290Xs with a system around about the same as mine in terms of clocks etc, saw about 1225W actual power draw (more at the wall of course), and that was a pretty basic system, no extra drives and the like. Battlefield 4 is supposed to be one of the most stressful titles available at the moment, so I wouldn't doubt it if you're hitting your limit on that PSU.


Yes, that might be it. I still think it is a little weird, though; as I said, I will probably get the answer when all my parts get here.


----------



## fateswarm

What's the best Mantle game to test these cards?


----------



## Kittencake

Quote:


> Originally Posted by *fateswarm*
> 
> What's the best Mantle game to test these cards?


BF4, Thief, and the dreaded Sniper Elite 3.

the following will have upcoming support

Dragon Age: Inquisition

Mass Effect 4

Star Wars: Battlefront

Mirror's Edge 2

Civilization: Beyond Earth


----------



## BradleyW

Quote:


> Originally Posted by *Kittencake*
> 
> Bf4 and Theif and the dreaded Sniper Elite 3
> 
> the following will have upcoming support
> 
> Dragon Age: Inquisition
> 
> Mass Effect 4
> 
> Star Wars: Battlefront
> 
> Mirror's Edge 2
> 
> Civilization: Beyond Earth


I don't believe Sniper Elite III supports Mantle yet.
Also, Star Swarm and Plants vs. Zombies support Mantle.


----------



## Sgt Bilko

Quote:


> Originally Posted by *BradleyW*
> 
> Sniper Elite III does not support Mantle yet I don't believe.
> Also, Star Swarm and Plants vs Zombies support Mantle.


Yup, still waiting on Mantle support before I start Sniper Elite 3 again.

Star Swarm is a benchmark rather than a game, and so far PvZ has been running great under Mantle for me.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Kittencake*
> 
> Bf4 and Theif and the dreaded Sniper Elite 3
> 
> the following will have upcoming support
> 
> Dragon Age: Inquisition
> 
> Mass Effect 4
> 
> Star Wars: Battlefront
> 
> Mirror's Edge 2
> 
> Civilization: Beyond Earth


I'll almost definitely be getting Civ:BE. Dragon Age, I haven't even played 2 yet since I heard so many meh things about it. Thief is 100% on my list, can't wait to play that. Mirror's Edge 2 will probably garner my interest after watching Day[9] play through the original and laughing continuously at how bad he was at it.


----------



## The Storm

My third 290 along with another EK block is on its way. My first 2 sapphire 290's both unlocked, I got them back at launch last year. Here's crossing my fingers that this will unlock as well. I hope my corsair AX1200 will keep up. Currently have a 4930k on a RIVE, with 2 290's unlocked.


----------



## Sgt Bilko

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I'll almost definitely be getting Civ:BE, Dragon Age I haven't even played 2 yet since I heard so many meh-things about it. Thief is 100% on my list, can't wait to play that. Mirror's Edge 2 will probably garner my interest after watching Day[9] play through the original and laughing continuously at how bad he was at it.


Really looking forward to what Mantle does for Civ. DA:I is on pre-order for me, I'll actually finish Thief now that it has Crossfire support under Mantle, and Mirror's Edge 2 I'm undecided on; I'll most likely get it, but maybe not at release.


----------



## fateswarm

Darn, both BF4 and Thief have bad user reviews for gameplay.


----------



## BradleyW

Quote:


> Originally Posted by *fateswarm*
> 
> Darn, both bf4 and thief have bad reviews by the people, in gameplay.


Ignore the reviews, both games are great!


----------



## TTheuns

Quote:


> Originally Posted by *fateswarm*
> 
> Darn, both bf4 and thief have bad reviews by the people, in gameplay.


Linus' (LinusTechTips) review mentioned that Thief screwed up the sound on Hawaii cards.
It could be fixed by now, though.


----------



## fateswarm

Quote:


> Originally Posted by *BradleyW*
> 
> Ignore the reviews, both games are great!


I do ignore "pro reviewers". I'm looking at aggregate votes by common players. Thief is tanked, BF4 is so-so.


----------



## Ramzinho

Quote:


> Originally Posted by *The Storm*
> 
> My third 290 along with another EK block is on its way. My first 2 sapphire 290's both unlocked, I got them back at launch last year. Here's crossing my fingers that this will unlock as well. I hope my corsair AX1200 will keep up. Currently have a 4930k on a RIVE, with 2 290's unlocked.


----------



## Gobigorgohome

Four reference R9 290X's screaming for more power and water cooling, poor quality too.


----------



## TTheuns

Quote:


> Originally Posted by *Gobigorgohome*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Four reference R9 290X's screaming for more power and water cooling, poor quality too.


What PSU do you use to power those magma chambers? And what are the temps in the current setup?


----------



## Gualichu04

Why can I only do +100mV in MSI Afterburner 3.0.1, but in Sapphire TriXX I can do +200mV?


----------



## heroxoot

Quote:


> Originally Posted by *Gualichu04*
> 
> Why can i only do +100mv in msi afterburner 3.0.1 but, in saphire trixx i can do +200mv.


Because it has not been updated for the 290 series probably.


----------



## sugarhell

Quote:


> Originally Posted by *heroxoot*
> 
> Because it has not been updated for the 290 series probably.










Because AB allows 100mV and TriXX allows 200mV. If it wasn't updated for the 290 series you shouldn't have voltage control at all.


----------



## heroxoot

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Because it has not been updated for the 290 series probably.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Because ab allows 100mv and trixx allows 200mv. If it wasnt updated for 290 series you shouldnt have control over voltage at all

That doesn't mean anything. It may not have been updated to allow more.


----------



## sugarhell

3.0.1 is the latest version of MSI AB and it supports the 290 series. It's MSI policy not to allow over 100mV. If MSI AB didn't support the 290 series, you wouldn't have voltage control at all.


----------



## Gualichu04

Thanks, it seems I will have to use Sapphire TriXX for overclocking my 290X's then.


----------



## hwoverclkd

If anyone is looking: MSI 290X Lightning for $599 @ TigerDirect.


----------



## The Storm

Current setup before the 3rd 290 comes in.


----------



## bond32

Quote:


> Originally Posted by *The Storm*
> 
> Current setup before the 3rd 290 comes in.


Oh man, that RX240 looks so sexy


----------



## The Storm

Quote:


> Originally Posted by *bond32*
> 
> Oh man, that RX240 looks so sexy


Thanks, stuffed an RX360, RX240, RX120, and an EX120 in a switch 810


----------



## FuriousPop

Storm, can you link a site for your case?

That thing looks bigger than a 30" screen!!!

And I thought I was bad with 2x RX480's for my 3x R9 290 setup (still in the process of putting it all together - waiting on additional screws to arrive for my push+pull config)...


----------



## The Storm

http://www.newegg.com/Product/Product.aspx?Item=N82E16811146088&cm_re=nzxt_switch_810-_-11-146-088-_-Product

But mine is the matte black version.


----------



## mojobear

Hey all,

Updated my quadfire 290 rig with some lights...taken with iphone...does not do it justice haha.

Hope you guys enjoy!


----------



## HoneyBadger84

Quote:


> Originally Posted by *The Storm*
> 
> My third 290 along with another EK block is on its way. My first 2 sapphire 290's both unlocked, I got them back at launch last year. Here's crossing my fingers that this will unlock as well. I hope my corsair AX1200 will keep up. Currently have a 4930k on a RIVE, with 2 290's unlocked.


PSU shouldn't have any issues, it's an original Pro Series Gold like mine right? If so it can handle all that no problem, unless you push the cards to something insane like over 1200MHz. Even then, it can go over its rated wattage & keep chugging. At least mine did when tested with QuadFire & OCed TriFire.
Quote:


> Originally Posted by *The Storm*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16811146088&cm_re=nzxt_switch_810-_-11-146-088-_-Product
> 
> But mine is the matte black version.


Finally a case almost as big as my current one... Almost. lol

Switch 810: 9.25" x 23.43" x 23.03" (W x H x D)
My case: 26.54" x 9.26" x 25.20"

In the same ballpark at least.


----------



## The Storm

Quote:


> Originally Posted by *HoneyBadger84*
> 
> PSU shouldn't have any issues, it's an original Pro Series Gold like mine right? If so it can handle all that no problem, unless you push the cards to something insane like over 1200MHz. Even then, it can go over it's rated wattage & keep chugging. At least mine did when tested with QuadFire & OCed TriFire.
> Finally a case almost as big as my current one... Almost. lol
> 
> Switch 810: 9.25" x 23.43" x 23.03" (W x H x D)
> My case: 26.54" x 9.26" x 25.20"
> 
> In the same ballpark at least.


Thanks, yeah it's the Gold series PSU. I guess it will put my cooling to the test.


----------



## HoneyBadger84

Quote:


> Originally Posted by *The Storm*
> 
> Thanks yeah its the gold series psu. I guess it will put my cooling to the test.


Well your system compared to mine has some power differences, but where I'm running my GPU fans at 100% during benchmarking etc, which draws a clip more power per GPU compared to stock fan speeds (15-20W if I remember what my watt-o-meter said), you're instead running liquid pumps. So I would assume that's pretty much offsetting... and from the looks of it, we have about the same amount of "Fan power" give or take a few, since the majority of my fans are high-flow 140mm/180mm (all of them except 1 in fact, which is a 120mm I recently installed behind my motherboard in the space for it on my case at Nleksan's suggestion to help cool the back of the board & the VRMs butts, so to speak).

I don't see liquid having an issue with these if it's got good quality flow & heat dissipation. I mean I'm maxing out between 60C(lesser games)-68C(highest I've seen, which was in Valley during a double-loop) on my GPUs with air, depending on the application. I can keep my load temps under 65C on anything, at stock, even lower maybe, if I'm using the blower fan I use for side-flow during OCing sessions... but I'd rather run the normal side fans I have & deal with a bit of extra heat.

Luckily most games don't actually max out my GPUs like benchmarks & the like do, partly because I run VSync in most games for the simple fact that I don't see the point in running more than 60FPS in a single player game. Lessens heat, and in my case, thereby noise, since the GPU fans don't have to work as hard if the cards aren't hot.

Wish I could go back to liquid & finally do that on my GPUs, but after the issues I've had with it recently, I'll take the bit of noise. I play with headphones anyway, can't really hear anything until they get up to 100% fan speed, and even then it's barely noticeable over in-game sound.


----------



## ZealotKi11er

Quote:


> Originally Posted by *The Storm*
> 
> Thanks, stuffed an RX360, RX240, RX120, and an EX120 in a switch 810


How is the cooling? I have 360 + 240 Push/Pull and Deltas are ~ 15C with 800 RPM fans.


----------



## The Storm

Not sure what the delta is, but room temp is 70F and they hover about 52C playing BF4. They take a bit to climb up to that temp. Corsair SP120's set on silent mode on my RIVE and doubled-up resistors on the fans not plugged into the MB. I like it quiet.
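Since "delta" here just means load temp minus room temp, here's a quick sketch converting the mixed units in the post above (the 70F / 52C figures come from that post; purely illustrative):

```python
def f_to_c(f):
    """Convert degrees Fahrenheit to Celsius."""
    return (f - 32) * 5.0 / 9.0

ambient_c = f_to_c(70)       # room temp from the post, ~21.1C
gpu_c = 52                   # reported core temp playing BF4
delta_c = gpu_c - ambient_c  # loop delta over ambient, ~30.9C
```

So that works out to roughly a 31C rise over ambient, which is the delta ZealotKi11er was asking about.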


----------



## The Storm

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Well your system compared to mine has some power differences, but where I'm running my GPU fans at 100% during benchmarking etc, which draws a clip more power per GPU compared to stock fan speeds (15-20W if I remember what my watt-o-meter said), you're instead running liquid pumps. So I would assume that's pretty much offsetting... and from the looks of it, we have about the same amount of "Fan power" give or take a few, since the majority of my fans are high-flow 140mm/180mm (all of them except 1 in fact, which is a 120mm I recently installed behind my motherboard in the space for it on my case at Nleksan's suggestion to help cool the back of the board & the VRMs butts, so to speak).
> 
> I don't see liquid having an issue with these if it's got good quality flow & heat dissipation. I mean I'm maxing out between 60C(lesser games)-68C(highest I've seen, which was in Valley during a double-loop) on my GPUs with air, depending on the application. I can keep my load temps under 65C on anything, at stock, even lower maybe, if I'm using the blower fan I use for side-flow during OCing sessions... but I'd rather run the normal side fans I have & deal with a bit of extra heat.
> 
> Luckily most games don't actually max out my GPUs like benchmarks & the like do, partly because I run VSync in most games for the simple fact that I don't see the point in running more than 60FPS in a single player game. Lessens heat, and in my case, thereby noise, since the GPU fans don't have to work as hard if the cards aren't hot.
> 
> Wish I could go back to liquid & finally do that on my GPUs, but after the issues I've had with it recently, I'll take the bit of noise. I play with headphones anyway, can't really hear anything until they get up to 100% fan speed, even then it's barely noticable over in-game sound.


Those are pretty amazing temps for quad-fire air-cooled cards man!!!


----------



## ZealotKi11er

Quote:


> Originally Posted by *The Storm*
> 
> Not sure what the delta is but room temp is 70f and they hover about 52c playing BF4. They take a bit to climb up to that temp. Corsair sp120's set on silent mode on my rive and double up resistors on the fans not plugged into MB. I like it quiet.


That's with 2 cards right? I was getting ~ 45C with one card. Now get ~ 55C with 2 cards. You probably will hit 60s with 3 cards.


----------



## tsm106

Vsync plus 100% fan speed... You can't figure out why it didn't get that hot? What's interesting is quad gpu and vsync, kind of defeats the point imo.


----------



## tokoam

So I got my 290X block from FrozenCPU today, and guess what: you tell me what's wrong with this picture. I don't get how this snuck past QC. I called FrozenCPU and they gave me 2 options: either send the block back, or they can just send me the screw. I opted for them to just send the screw; I hate delays!! Have any of you had bad luck with EK?


----------



## The Storm

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *The Storm*
> 
> Not sure what the delta is but room temp is 70f and they hover about 52c playing BF4. They take a bit to climb up to that temp. Corsair sp120's set on silent mode on my rive and double up resistors on the fans not plugged into MB. I like it quiet.
> 
> 
> 
> Thats with 2 cards right? I was getting ~ 45C with one card. Now get ~ 55C with 2 cards. You probably will hit 60s with 3 cards.


Yeah, that's 2 unlocked 290's at stock clocks with a 4930k running @ 4.5. If it stays below 65C I will be pleased.


----------



## The Storm

Quote:


> Originally Posted by *tokoam*
> 
> So I got my 290X block from FrozenCPU today, and guess what: you tell me what's wrong with this picture. I don't get how this snuck past QC. I called FrozenCPU and they gave me 2 options: either send the block back, or they can just send me the screw. I opted for them to just send the screw; I hate delays!! Have any of you had bad luck with EK?


Ouch man, that sux. I am not very patient, I hate waiting, I would not be pleased. Not Frozen's fault, but still, I would not be happy.


----------



## tsm106

Someone was drunk on the QC line! That's not actually that bad, I've seen blocks sold missing key milling fundamental to the function of the block.


----------



## fateswarm

Look into the box.


----------



## tokoam

Quote:


> Originally Posted by *fateswarm*
> 
> Look into the box.


I did; nothing found other than the thermal pads and mounting hardware... checked the baggy, not there either.


----------



## ZealotKi11er

So testing my system temps I am getting 58C on GPU1 and 61C on GPU2. Intake air is 28C, but after playing for 30 min it's ~32C. Water hit ~48C. This is with 360 + 240 thick rads in push/pull, GT15s @ 7V ~800 RPM. With fans @ 12V ~1800 RPM I got ~40.5C water temp and the GPUs dropped to 53C and 54C. Keep in mind this is with cards @ stock. They are being pushed hard though: BF4 Mantle 1440p 125% Ultra. Probably need to upgrade my case and cooling setup to add another 360 rad. Good news is that I don't have to OC.


----------



## invincible20xx

quadfire people , what res are you playing on ?


----------



## ZealotKi11er

Quote:


> Originally Posted by *invincible20xx*
> 
> quadfire people , what res are you playing on ?


If they are not playing 4K or 3x1080p, I must ask what they are doing.


----------



## Mega Man

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *The Storm*
> 
> My third 290 along with another EK block is on its way. My first 2 sapphire 290's both unlocked, I got them back at launch last year. Here's crossing my fingers that this will unlock as well. I hope my corsair AX1200 will keep up. Currently have a 4930k on a RIVE, with 2 290's unlocked.
> 
> 
> 
> PSU shouldn't have any issues, it's an original Pro Series Gold like mine right? If so it can handle all that no problem, unless you push the cards to something insane like over 1200MHz. Even then, it can go over it's rated wattage & keep chugging. At least mine did when tested with QuadFire & OCed TriFire.
> Quote:
> 
> 
> 
> Originally Posted by *The Storm*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16811146088&cm_re=nzxt_switch_810-_-11-146-088-_-Product
> 
> But mine is the matte black version.
> 
> 
> Finally a case almost as big as my current one... Almost. lol
> 
> Switch 810: 9.25" x 23.43" x 23.03" (W x H x D)
> My case: 26.54" x 9.26" x 25.20"
> 
> In the same ballpark at least.

Heh, TH10: yours is ~.5 in taller/longer, but mine is about 2x as wide.
I may be getting a ped too.... I am beginning to like the idea of putting my pumps at the bottom, more space!

When I get my TX10-D, yours won't have a chance. It will be as tall as me!
Quote:


> Originally Posted by *tsm106*
> 
> Vsync plus 100% fan speed... You can't figure out why it didn't get that hot? What's interesting is quad gpu and vsync, kind of defeats the point imo.


agreed
Quote:


> Originally Posted by *tokoam*
> 
> So I get my 290x block from from FrozenCPU today and guess what you tell me what's wrong with this picture. ] I don't get how this snuck past QOS check I called Frozencpu and they gave me 2 options.Either send the block back or they can just send me the screw I opted for them to just send the screw I hate delays !! any of you had any bad luck with EK ?


Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *invincible20xx*
> 
> quadfire people , what res are you playing on ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If they are not playing 4K or 3x1080p i must ask what are they doing.

That one. Wanna go 2x 4K but not liking the options so far, waiting for monitors to get to where I want to play.

I will be going for 6x1080p though (maybe only 5, have not decided yet)


----------



## HoneyBadger84

Quote:


> Originally Posted by *tsm106*
> 
> Vsync plus 100% fan speed... You can't figure out why it didn't get that hot? What's interesting is quad gpu and vsync, kind of defeats the point imo.


Where did I say I can't figure out why it's not running hot? Pretty sure I said I'm not running hot BECAUSE of my case airflow and the high fan speed.

Also, I did say I'm not running QuadFire, yet, and that it's only for Single Player games, in 1080p, that I run VSync on. Obviously if/when I'm running Eyefinity 3x1080p, I'll run with VSync off, that's an er-dur thing.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> If they are not playing 4K or 3x1080p i must ask what are they doing.


Indeed, my system being TriFire now & QuadFire later is for 3x1080p & 4K eventually, when I upgrade my TV to a 4K unit. Right now I'm running 2 side monitors of normal size with a TV in the middle, in the position I normally game in, it gives a nice fish-eye type Eyefinity setup with good peripheral vision. I can imagine 3 screens of the same size would be "neater", but I like it just fine


----------



## PCSarge

Hmm, and I was told this board was crap. I literally OC'd to a stable 5GHz on this Z77 Stinger in under 5 minutes.

In other news, the waterblock on the 290 got me to 1250/1400 in Heaven.


----------



## FuriousPop

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If they are not playing 4K or 3x1080p i must ask what are they doing.


playing on 2560 x 1600 can take a toll on the GPU with all the eye candy on for certain games...


----------



## kizwan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So testing my system temp i am getting 58C GPU1 and 61C GPU2. Intake air is 28C but after playing for 30 min its ~ 32C. Water hit ~ 48C. This is with 360 + 240 Thick RADs Push/Pull GT15s @ 7V ~ 800 RPM. With Fans @ 12v ~ 1800 RPM i got ~ 40.5C Water temp and GPUs dropped to 53C and 54C. Keep in mind this is with cards @ stock. They are being pushed hard though. BF4 Mantle 1440p 125% Ultra. Probably need to upgrade my case and cooling setup to add anther 360 RAD. Good news is that i dont have to OC.


I wouldn't run the fans at anything less than 1200 RPM when gaming or whenever the CPU and/or GPU are under load. With the amount of heat produced by two 290's, I recommend adding at least another 120mm rad. When recording deltas, I ran my fans @ 100%, so I don't have data that can be used to compare with yours. When benching, delta is between 3-5C (overclocked), but when gaming it can be around ~10C (stock).


----------



## Arizonian

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Can you update me. I now have 2 x 290. One is unlocked to 290X. (Water Cooled with EK Blocks)
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added and updated















Quote:


> Originally Posted by *Beezleybuzz*
> 
> Hey guys, been lurking for a bit, don't think I'm actually officially 'in the club' yet.
> 
> http://www.techpowerup.com/gpuz/vcgx/
> 
> There is my GPU-Z validation. I actually have 2 Asus R9 290 DCUII OC but was really tired of the crossfire issues not working in certain games. My overclock is also substantially higher with a single card and heat is essentially half (with 1 of 2 cards).


Congrats - added finally








Quote:


> Originally Posted by *The Storm*
> 
> Current setup before the 3rd 290 comes in.
> 
> 
> Spoiler: Warning: Spoiler!


Sweet, let us know when that third one comes in.









Quote:


> Originally Posted by *mojobear*
> 
> Hey all,
> 
> Updated my quadfire 290 rig with some lights...taken with iphone...does not do it justice haha.
> 
> Hope you guys enjoy!
> 
> 
> Spoiler: Warning: Spoiler!


Nice. I'm a sucker for aesthetics.









Quote:


> Originally Posted by *tokoam*
> 
> So I get my 290x block from from FrozenCPU today and guess what you tell me what's wrong with this picture. ] I don't get how this snuck past QOS check I called Frozencpu and they gave me 2 options.Either send the block back or they can just send me the screw I opted for them to just send the screw I hate delays !! any of you had any bad luck with EK ?
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Just throw a GPU-Z validation in that post for me.

Again, if anyone has been part of the club but hasn't submitted your proof of validation, please join us on that list too.


----------



## invincible20xx

Quote:


> Originally Posted by *FuriousPop*
> 
> playing on 2560 x 1600 can take a toll on the GPU with all the eye candy on for certain games...


We are still talking 4x god-tier GPUs under water; I don't think 2K res can hold a candle to that.


----------



## invincible20xx

Don't think I added myself to the club.

Here is my humble system:





GPU-Z Validation for OCN :

http://www.techpowerup.com/gpuz/mem9f/

Tried my best on the cable management, took me a whole day


----------



## Arizonian

Quote:


> Originally Posted by *invincible20xx*
> 
> don't think i added myself to the club
> 
> here is my humble system :
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> tried my best on the cable managment took me a whole day


No you hadn't until now. Congrats - added









Put a GPU-Z validation link in that post would be great.


----------



## invincible20xx

Quote:


> Originally Posted by *Arizonian*
> 
> No you hadn't until now. Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Put a GPU-Z validation link in that post would be great.


Edited, here is the link too

http://www.techpowerup.com/gpuz/mem9f/

But why is there nothing on the validation page that says CrossFire?

This is actually the first time I've used the validation thing lol


----------



## TwistedDonut

I definitely need to register myself when I get home I have been lurking for about a month now


----------



## RocketAbyss

Seeking the knowledge of you peeps, grant unto me thy divine wisdom. (lol)

Just like to clarify, does anyone know for the VRM1 and VRM2 readings on the reference cards, where are the respective locations of the VRMs? VRM1 is the triangle shaped one near the back of the card, while VRM2 is the long line of mosfets and chokes in the middle toward the front of the card, amirite?

Recently put my 290X under partial water with the NZXT G10 and a Thermaltake Water 3.0 Performer for the core. Bought the GELID VRM kit for the 290X and so far I've only installed the heatsink for the triangle-shaped VRM, which I presumed is VRM1. However, under load VRM1 still hits the same temperature as it does without the heatsink, which is leading me to question if that is VRM2 instead.

Thanks in advance!


----------



## HoneyBadger84

Quote:


> Originally Posted by *RocketAbyss*
> 
> Seeking the knowledge of you peeps, grant unto me thy divine wisdom. (lol)
> 
> Just like to clarify, does anyone know for the VRM1 and VRM2 readings on the reference cards, where are the respective locations of the VRMs? VRM1 is the triangle shaped one near the back of the card, while VRM2 is the long line of mosfets and chokes in the middle toward the front of the card, amirite?
> 
> Recently put my 290X under partial water with the NZXT G10 and a Thermaltake Water 3.0 Performer for the core. Bought the GELID VRM kit for the 290X and so far I've only put the heatsink for the triangle shaped VRM, which I presumed is VRM1. However, under load VRM1s still hit the same temperature as they do without the heatsink, which is leading me to question if that is VRM2 instead.
> 
> Thanks in advance!


http://www.mckeemaker.com/2014/01/diy-vrm-heat-sinks-for-amd-r9-290.html

Would assume locations are the same for the 290X, check that out, has locations of both.


----------



## TwistedDonut

this was posted in this thread a while ago
Quote:


> Originally Posted by *evensen007*
> 
> Also, don't overlook the 2nd VRM strip. In my manual that came with mine, it was very hard to see in black and white.


----------



## Gobigorgohome

Quote:


> Originally Posted by *TTheuns*
> 
> What PSU do you use to power those magma chambers? And what are the temps in the current setup?


Only one EVGA G2 1300W, but soon two EVGA G2 1300Ws. Just need the second PSU start cable, and to mod my case a bit.


----------



## Gobigorgohome

Quote:


> Originally Posted by *invincible20xx*
> 
> quadfire people , what res are you playing on ?


4K all the way; trying to play every game at 60+ fps with the highest possible settings.


----------



## invincible20xx

Quote:


> Originally Posted by *Gobigorgohome*
> 
> 4k all the way, trying to play every game with 60 fps + with highest possible settings.


And did you achieve your goal with 4 R9 290X's @ 4K?

I kind of have the same goal, but I play @ 1k only







I only have 2 R9 290's though...


----------



## kayan

Hey all, I need an opinion on how much raddage I need for my current build.

I currently have an AMD FX-9370, and two ref 290x (one is watercooled with the 9370 at the moment), but I'd like to add the 2nd 290x to my loop and I don't think that I have enough rad space. Currently in a 760t case, with an AX240 in front and a ST30 280 top, and for the CPU + GPU it works great.

I have the option of swapping the 280 into a 360, or adding a 120 in back/bottom. Is this enough? And if so, which way should I go?


----------



## fateswarm

I suspect VRM1 is the low- or high-side MOSFETs of the GPU power supply and VRM2 the other side of the same supply (usually located next to the line of chokes to the right of the GPU). There is also the VRAM supply, but I doubt that's usually measured. I'm not sure what's going on with VRAM, but on RAM I know that supply draws insignificant power.


----------



## Gobigorgohome

Quote:


> Originally Posted by *invincible20xx*
> 
> and did you achieve your goal with 4 R9 290x's @ 4k
> 
> i kind of have your same goal but i play @ 1k only
> 
> 
> 
> 
> 
> 
> 
> i only have 2 r9 290's though ...


From what I have tested so far, it looks good, but I doubt it will in Thief, Metro 2033, Metro Last Light and Crysis 3. I will overclock and see what I can squeeze out of this system though, and hope for the best!


----------



## Imprezzion

Quote:


> Originally Posted by *TwistedDonut*
> 
> this was posted in this thread a while ago


Well, that ain't right for sure. The chips labeled ''1'' are the VRAM: something completely different and unmonitored on 290 series cards.

The long strip is VRM1, the little triangle block is VRM2. Simple as that.

I should update my entry in the OP as well.. if I'm even listed.. and if I am, it's probably under a card I no longer own lol. Like my old unlocked PowerColor..

http://www.techpowerup.com/gpuz/d9yrp/

Brand: XFX R9 290 flashed to 290x (XFX BIOS as well).
Cooling: Accelero Hybrid + stock cooler VRM parts sawed out of it.
Temps @ 1200/1400 +200mV: ~70c core, ~85c VRM1, ~60c VRM2.


----------



## kizwan

Quote:


> Originally Posted by *kayan*
> 
> Hey all, I need an opinion on how much raddage I need for a my current build.
> 
> I currently have an AMD FX-9370, and two ref 290x (one is watercooled with the 9370 at the moment), but I'd like to add the 2nd 290x to my loop and I don't think that I have enough rad space. Currently in a 760t case, with an AX240 in front and a ST30 280 top, and for the CPU + GPU it works great.
> 
> I have the option of swapping the 280 into a 360, or adding a 120 in back/bottom. Is this enough? And if so, which way should I go?


Add 120mm at the back or bottom.

Swap 280 to 360: 240 + 360 = 600mm (28800 + 43200 = 72000 mm2)
Add 120: 240 + 280 + 120 = 640mm (28800 + 39200 + 14400 = 82400 mm2)

600 to 640mm rads should be ok but I recommend minimum 720mm rads space for two 290's.
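kizwan's face-area math above can be sketched quickly; the widths are the standard fan sizes each radiator class takes (280-class rads use 140mm fans), matching the figures in the post:

```python
# Radiator face area = length x fan-row width; values match the post above.
RAD_WIDTH_MM = {120: 120, 240: 120, 280: 140, 360: 120}

def rad_area_mm2(size_mm):
    """Approximate radiator face area in mm^2 for a given rad class."""
    return size_mm * RAD_WIDTH_MM[size_mm]

# Option 1: swap the 280 for a 360 -> 240 + 360
swap_280_for_360 = rad_area_mm2(240) + rad_area_mm2(360)                   # 72000 mm^2
# Option 2: keep the 280 and add a 120 -> 240 + 280 + 120
add_extra_120 = rad_area_mm2(240) + rad_area_mm2(280) + rad_area_mm2(120)  # 82400 mm^2
```

Adding the 120 wins on raw face area, which is why it's the recommended option.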


----------



## invincible20xx

Quote:


> Originally Posted by *Gobigorgohome*
> 
> From what i have tested this far, it looks good, but i doubt it will in thief, metro 2033, metro last light and crysis 3. I will overclock and see what i can squeeze out of this system though and hope for the best!


Good luck. The way I see it, the current hardware is just not ready for a 60fps-minimum maxed-out experience @ 4K yet; maybe in the future more badass GPUs will cut it in crossfire configs.

But the system you got now is NO slouch. Been thinking of a display upgrade myself, but I settled on upgrading to a 1080p projector instead to avoid having to upgrade my rig to keep the minimum 60fps.


----------



## fateswarm

If they measure the vram's supply vrm temp, in some models it might be situated right above the other phases for the gpu (e.g. 6 + 2 for the vram). But it's often at the top left of the gpu (with 1 or 2 phases). I'm surprised they care to measure the vram vrm to be honest, but it may be important in gpus or in some models.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Gobigorgohome*
> 
> From what i have tested this far, it looks good, but i doubt it will in thief, metro 2033, metro last light and crysis 3. I will overclock and see what i can squeeze out of this system though and hope for the best!


Metro 2033 you shouldn't have problems. Metro Last Light, I dunno, that game runs kinda bad on AMD hardware; I noticed it a while ago. In 2-way Crossfire at 1080p I was having to turn it down to High or I got these weird instances where FPS would drop to 29.9 & sit there for a long while, then go back to normal. Never had that happen with any other game, so I assume it's something to do with that one in particular. And what made it really annoying is I could TELL the FPS had dropped, because it became extremely laggy even though at 29.9FPS it should still be "playable" like most games are. I haven't replayed it since re-adding a third card, so I'll find out soon I guess.

Crysis 3, if you turn off AA or lower it, cuz you honestly don't need much if any at 4K resolution, you should be getting 45-60FPS I'd imagine, most of the time. Thief I don't have experience with yet, very much looking forward to that.


----------



## Dasboogieman

Quote:


> Originally Posted by *fateswarm*
> 
> I suspect vrm1 is low or high mosfets of the gpu power supply and vrm2 low or high (whatever the other was, the opposite) of the same supply (usually located next to the line of chokes at the right of the gpu). There is also supply of the VRAM but I doubt that's measured usually. I'm not sure what's going on with VRAM but on RAM I know that supply is insignificant in volume.


HWiNFO has a tab which estimates the VRM2 power. It's a good 29-40W worth of power actually (and that's conservative, given the nature of software monitoring).


----------



## fateswarm

Quote:


> Originally Posted by *Dasboogieman*
> 
> Hwinfo has a tab which estimates the VRM2 power supply. Its a good 29-40W worth of power actually (this is conservative due to the nature of software monitoring).


Yeah, it seems VRAM draws more power than I thought (I was comparing it with system RAM), according to what someone said.

Yep, my HWiNFO splits them into two sensor chips, but they both report the same pair of temps, just in reverse order to each other.

It's pretty obvious one of them is for the GPU supply and the other for the VRAM supply (possibly the lower temp for the VRAM).


----------



## cennis

From my experience with 290s, I've gathered that:

black screen = pushing memory too high
artifact = pushing core too high
throttling = core/VRM temps too high

However, I have encountered a new issue: freezing.

I moved to a 295X2 a month ago (I know it's not the right thread, but it's similar, so hopefully I can get some input).
It was always good and stable at +100mV, a +50 power limit, 1150MHz core and 1625MHz memory.

No crashes, artifacts or black screens; perfectly stable with temps under control. A couple of times I had VRM throttling down to 300MHz, but I have since put several fans blowing at it on my test bench and no longer get any throttling.

However, last night, running Valley, my card froze at those clocks. Temperatures are definitely under control; no core or VRM throttling.

I reduced my CPU to stock and it still happens; I have replicated the issue 5 times.
The freeze does not seem to occur at stock settings.
The freeze does not lead to a BSOD or restart; it just freezes on a frame and stays there.

Also, when it freezes, the PSU makes a clicky noise similar to when you turn the system on (certain Seasonic designs make this noise at power-on), yet the fan keeps spinning.

*TL;DR: the freezing started last night; it was completely stable before.*
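The symptom-to-cause list at the top of this post, written out as a quick lookup table (an illustrative Python sketch; the names and the "freeze" entry are my own rough guesses, not established rules):

```python
# Quick triage lookup for 290/290X overclock symptoms, based on the
# symptom list above. Purely illustrative, not a diagnostic tool.
SYMPTOM_CAUSES = {
    "black screen": "memory clock pushed too high",
    "artifacts": "core clock pushed too high",
    "throttling": "core/VRM temps too high",
    "freeze": "unclear -- check PSU behaviour under load",
}

def triage(symptom: str) -> str:
    """Return the most likely cause for an observed symptom."""
    return SYMPTOM_CAUSES.get(symptom.lower(), "unknown -- retest at stock")

print(triage("Black screen"))  # memory clock pushed too high
```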


----------



## Jflisk

Quote:


> Originally Posted by *cennis*
> 
> From my experience with 290s
> 
> I've gathered that,
> black screen = pushing memory too high
> artifact = pushing core too high
> throttling = core/vrm temps too high
> 
> However, I have encountered a new issue of freezing.
> 
> I moved to a 295x2( i know its not the right thread, but its similar hopefully I get some input) for a month now,
> it was always good and stable at 100mw,50 power limit, 1150mhz, 1625mhz.
> 
> no crash/ artifact/black screen, perfectly stable and temps under control, a couple times i had "vrm throttling" to 300mhz but I have put many fans blowing at it on my testbench and do not have any throttling.
> 
> However last night, running valley my card freezes at those clocks. Temperatures are definitely under control, no core or vrm throttling.
> 
> Reduced my cpu to stock and it happens. replicated issue for 5 times.
> freeze does not seem to occur under stock settings.
> The freeze does not go to bsod or restart, just freeze at a picture and stays there
> 
> Also, when it freeze, the psu makes a clicky noise similar to when you turn on the system (certain seasonic designs make this noise when you turn on), however the fan is still spinning.


I could be wrong here, but your clocks seem awfully high for that card. I have 2x 290X, and the highest clocks I can get under water are 1060/1100 core and 1360/1400 memory (min/max), stable, without extra voltage and without smoking the cards. A power limit of +50 is way too high; even people who mine set their cards at +20 max.


----------



## cennis

Quote:


> Originally Posted by *Jflisk*
> 
> I could be wrong here but your clocks seem awfully high for that card. I have 2X 290x and the highest clocks I can get under water are 1060/1100 - 1360/1400 (min)-(max) Stable.Without voltage and without smoking the card. Power tune set to 50 is way to high. Even people that mine set there cards at 20 max.


My overclock uses +100mV (the software limit), which is why I can stabilize 1150MHz.

People here run +150mV, and I have run +200mV on my 290s before; as long as the cooling is sufficient, those voltages are not a problem.

If you give it a 100-150mV bump, you should be able to hit 1200MHz core.

What troubles me is that it never showed any signs of instability; the freezing issue just came out of nowhere.


----------



## tsm106

Quote:


> Originally Posted by *cennis*
> 
> What troubles me is just that it never had any signs of instability but the freezing issue just came out of no where


Sounds like something is dying... check your PSU? How large is it? You wrote that it clicks and restarts; that's indicative of a power issue, maybe an overload. You should probably fill out your rig specs too.


----------



## cennis

Quote:


> Originally Posted by *tsm106*
> 
> Sounds like something is dieing... check your psu? How large is your psu? You wrote it clicks, restarts. That's indicative of a power issue, maybe overload. You probably should fill out your rig specs too.


It's a 1000W Platinum XFX; it should be fine for a 4790K and an overclocked 295X2.


----------



## tsm106

Quote:


> Originally Posted by *cennis*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Sounds like something is dieing... check your psu? How large is your psu? You wrote it clicks, restarts. That's indicative of a power issue, maybe overload. You probably should fill out your rig specs too.
> 
> 
> 
> its a 1000w platinum XFX, *should be fine* for a 4970k and 295x2 overclocked

Yet your PSU is shutting down? It clicks, then restarts.


----------



## cennis

Quote:


> Originally Posted by *tsm106*
> 
> Yet your psu is shutting down? Clicks, to restart.


Well, I'm not sure whether the PSU is shutting down first, causing the freeze,
or the GPU is freezing first and dropping the power load, causing the PSU to go into "standby mode"/reset.


----------



## Arizonian

Quote:


> Originally Posted by *Imprezzion*
> 
> Well, that aint right for sure. The chips labeled as ''1'' are the VRAM. Something completely different and unmonitored on 290 series cards.
> 
> The long strip is VRM1, the little triangle block is VRM2. Simple as that.
> 
> I should update my entry to the OP as well.. If i'm even listed.. and if I am it's probably on a card i no longer own lol. Like my old unlocked Powercolor..
> 
> http://www.techpowerup.com/gpuz/d9yrp/
> 
> Brand: XFX R9 290 flashed to 290x (XFX BIOS as well).
> Cooling: Accelero Hybrid + stock cooler VRM parts sawed out of it.
> Temps @ 1200/1400 +200mV: ~70c core, ~85c VRM1, ~60c VRM2.


Nope, you were not listed; I checked. You never made an official entry. However, you are now.









Congrats - added


----------



## Gobigorgohome

Quote:


> Originally Posted by *invincible20xx*
> 
> good luck, the way i see it is the current hardware is just not ready for 60fps minimum maxed out gaming experience @ 4k yet maybe in the future more bad ass gpus will cut it @ crossfire configs
> 
> but the system you got now is NO slouch, been thinking of a display upgrade myself but i settled on upgrading to a 1080p projector instead to avoid having to upgrade my rig to keep the minimum 60fps


It depends on which games you are basing your "conclusion" on. I am very surprised by, and happy with, the performance of four of these cards in quadfire, even on air.

I had been looking at the Samsung U28D590 for months before it got released, and when the first shop had it available I hit the "purchase" button; I am so happy with the monitor. I came from a crappy BenQ G2450HM and the U28D590 is SO much better in every way, I really like that upgrade. 1080p is okay, but it is boring. I have been through Nvidia Surround at 5760x1080 and 3240x1920 and I missed that resolution at 1080p, so that is the main reason I upgraded to 4K.
Quote:


> Originally Posted by *HoneyBadger84*
> 
> Metro 2033 you shouldn't have problems, Metro Last Light, I dunno, that game runs kinda bad on AMD hardware, I noticed it a while ago, on 2-way Crossfire 1080p I was having to turn it down to High or I got these weird instances where FPS would drop to 29.9 & sit there for a long while, then go back to normal. Never had that happen with any other game, so I assume it's something to do with that one in particular. And what made it really annoy is I could TELL the FPS had dropped because it became extremely laggy even though at 29.9FPS it should still be "playable" like most games are. I haven't replayed it since readding a third card, so I'll find out soon I guess.
> 
> Crysis 3, if you turn off AA or lower it, cuz you honestly don't need much if any at 4K resolution, you should be getting 45-60FPS I'd imagine, most of the time. Thief I don't have experience with yet, very much looking forward to that.


I think I tried a little Metro Last Light at max and it was lagging CRAZY much; I turned it down to LOW and it was still very bad, I guess 25-30 fps average, and it lagged a lot.
I will try Metro 2033 when I have the chance, together with Thief (even if I do not expect much). The "review" DeadlyDNA made of quadfire R9 290X at 4K in Thief showed that adding more cards did nothing for the fps; it stayed pretty much at 40 fps. It may have been a CPU bottleneck or something, but I will test it myself when I get the time.

From the Digital Storm benchmarks on YouTube of quadfire R9 290X, they get 52 fps, I think, with 2x MSAA; I will turn that off completely to achieve good frame rates. I would love to play through Crysis 3 at 4K too. I have done it at 1080p, but I think the 4K experience will be a little better.

I actually did a few benchmarks a few days ago, and these were my results:

Tomb Raider 4K, quadfire R9 290X, 3930K at stock:
Ultimate: 104 fps average
Everything maxed out (AA on full and so on): 46.7 fps average

My conclusion from these results is that AA makes a BIG hit at 4K; it absolutely crushes the fps.

Hitman Absolution 4K, quadfire R9 290X, 3930K at stock:
Ultra: 81.9 fps average

I did a run in both Sky Diver and Fire Strike Extreme, but the "physics" score took a big hit because of the CPU and probably the RAM. I will post the results later.
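To put a number on that AA hit, using the Tomb Raider figures above (a quick illustrative sketch; nothing here beyond the two averages from my runs):

```python
# Fraction of frame rate lost when the heavier settings (full AA)
# are enabled, using the Tomb Raider 4K numbers above.
def fps_hit(before: float, after: float) -> float:
    """Percentage of fps lost going from 'before' to 'after'."""
    return (before - after) / before * 100

print(f"{fps_hit(104, 46.7):.1f}% of the frame rate lost")  # 55.1%
```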


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gobigorgohome*
> 
> It depends on which games you are basing your "conclusion" on. I am very surprised and happy about the performance of four of these cards in quadfire, even at air.
> 
> I was looking at the Samsung U28D590 for months before it got released and when the first shop had it available I hit the "purchase" button and I am so happy with the monitor. I came from a crappy BenQ G2450HM monitor and the U28D590 is SO much better in any way, I really like that upgrade. 1080P is okay, but it is boring, I have been through Nvidia Surround with 5760x1080 and 3240x1920 and I missed that resolution at 1080P, so that is the main reason I upgraded to 4K.
> I think I tried a little Metro Last Light at max and it was lagging CRAZY much, I turned it down to LOW and it was still very bad, I guess 25-30 fps average, it lagged a lot.
> I will try Metro 2033 when I have the chance, together with Thief (even I do not expect much). From the "review" DeadlyDNA made of quadfire R9 290X at 4K on Thief it showed that adding more cards did nothing with the fps, it stayed pretty much at 40 fps, it may have been a CPU bottleneck or something, but I will test it out myself when I got the time.
> 
> From the Digital Storm benchmarks on youtube of quadfire r9 290x they get 52 fps I think with 2x MSAA, I will turn that completely off to achieve good frame rates. I would love to play through Crysis 3 on 4K too, I have done it at 1080P, but I think the 4K experience will be a little better though.
> 
> I actually did a few benchmarks a few days ago, and this was my results:
> 
> Tomb Raider 4K, quadfire R9 290X, 3930K: stock
> Ultimate: 104 fps average
> Everything maxed out (AA on full and so on): 46,7 fps average
> 
> My conclusion over these results is that AA make a BIG hit at 4K, it absolutely crushes the fps.
> 
> Hitman Absolution 4K, quadfire R9 290X, 3930K: stock
> Ultra: 81,9 fps average
> 
> I did a run in both Skydiver and Fire Strike Extreme, but it took a big hit in "physics" because of the CPU and probably ram. I will post the results later.


I think quadfire is not working properly for you. Also, AA at 4K will eat your memory; I have tested BF4 at 4K, and even with no AA the memory usage is very high.


----------



## Gobigorgohome

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I think Quad Fire is not working properly. Also AA @ 4K will eat your memory. I have tested BF4 @ 4K and even with no AA memory usage is very high.


What do you mean by "quad fire is not working properly"? Am I getting bad performance? It seems fine in Windows and CCC; every card is recognized, and CCC shows 4 GPUs with Crossfire enabled.

Yes, I do not really game that much with AA on, it is just for benchmarking; even if it eats my memory, I should be good with 4GB?

I looked at this: http://www.eteknix.com/4k-gaming-showdown-amd-r9-290x-crossfire-vs-nvidia-gtx-780-ti-sli/8/ and from that it does not seem likely that my setup is not working properly; the fourth card is probably mostly for aesthetics, I guess (more than performance, at least). Averaging about 10 fps ahead, I have some performance lead over those results, and my CPU is at pretty much the same clock.

There may be something I do not understand, so please enlighten me on what my problem is.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gobigorgohome*
> 
> What do you mean by "quad fire is not working properly"? Am I getting bad performance? It seems fine in Windows and CCC, every card is recognized and showing 4 GPU's in CCC with crossfire enabled
> 
> Yes, I do not really game that much with AA on, it is just for benchmarking, even if it eat my memory I should be good with 4GB's?
> 
> I looked at this: http://www.eteknix.com/4k-gaming-showdown-amd-r9-290x-crossfire-vs-nvidia-gtx-780-ti-sli/8/ and by that it do not seem likely my setup is not working properly, the fourth card is probably mostly for aesthetics I guess (more than performance at least). Averaging out 10 fps ahead I have some performance boost over those results and the CPU is pretty much at the same clock.
> 
> There may be something I do not understand, so please enlighten me on what my problem is.


By not working, I mean the scaling.


----------



## Gobigorgohome

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Not working i mean by scaling.


I will check that out when I get the rest of the cards back in; I'm just running one until the water cooling gets here, it's so darn hot with four of those beasts on air...

How do I get better scaling? From what I could find on Google, that is just the way quadfire works, but hey, I am happy with it!


----------



## Gualichu04

Should I send my XFX DD R9 290X in for RMA testing? It has the black screen issue with everything at stock, but with the power limit at +50 and +19-50mV added it does not black screen. I plan to overclock it with the 2nd one anyway.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I will check that out when I get the rest of the cards back in, just running one until the water cooling is getting here, so darn hot with four of those beasts on air ...
> 
> How do I get better scaling? From what I could find on google it is just that way quadfire works, but hey, I am happy with it!


Scaling is dependent on the game and drivers.

Tomb Raider, for example, has great scaling, while something like Metro: LL only really scales to 3 GPUs, and the 4th will actually bring fps down.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gualichu04*
> 
> Should i send my XFX DD r9 290x in for an rma testing. It has the black screen issue at stock everything. But, upping the power limit to +50 and adding +19-50mv it does not black screen. I plan to overclock it with the 2nd one anyways.


If it's doing that at stock and you can repeat the error, then I'd say RMA it; just make sure you tell them what's happening and how it happens, so they can try to replicate it.

By chance, are you using a linked power cable?

As in, a 6-pin lead off the 6+2/8-pin?


----------



## Gualichu04

Quote:


> Originally Posted by *Sgt Bilko*
> 
> If it's doing at stock and you can repeat the error then i'd say RMA, just make sure you tell them whats happening and how it happens so they can try and replicate it.
> 
> By chance, are you using a linked Power cable?
> 
> As in a 6 pin lead off the 6+2/8 pin?


Using two separate 8-pin cables from the PSU. I really did not want to RMA, since it delays watercooling both cards and finishing the build. I guess I can deal with one card water cooled, since it is stable and already has the block on it.


----------



## MasterGamma12

Sapphire brand
Stock cooling


----------



## Gobigorgohome

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Scaling is dependent on the game and drivers.
> 
> Tomb Raider for example has great scaling while something like Metro:LL only really scales to 3 GPU's and the 4th will actually bring fps down


Okay, then I understand; pretty much what I thought. I mostly play BF4 and singleplayer in other games, so I can live with slightly bad scaling.


----------



## smoke2

Please, could users of the Sapphire 290 Tri-X check VDDC Current In/Out and Power In/Out and post a screenshot of GPU-Z while playing YouTube?
Please check after a few seconds, to let the readings stabilize, in two conditions:
1. With YouTube playing on your screen
2. With YouTube playing in the background with another 3 or 4 tabs open, like surfing while listening to music in the background.
Thanks.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gualichu04*
> 
> Using two separate 8 pin cables from the psu. Really did not want to rma since it delays watercooling both cards and finishing the build. I guess i can deal with one card water cooled since it is stable and already has the block on it.


It does suck, but better to have a stable card than one that acts up on you a bit.

Let us know how you get on with XFX; I've heard they have been getting better with customer support.


----------



## Beezleybuzz

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added and updated
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added finally
> 
> 
> 
> 
> 
> 
> 
> 
> Sweet, let us know when that third one comes in.
> 
> 
> 
> 
> 
> 
> 
> 
> Nice. I'm a sucker for aesthetics.
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just throw a GPU-Z validation in that post for me.
> 
> Again, if anyone has been part of the club but hasn't submitted your proof of validation, please join us on that list too.


Thanks for adding me to the list! Just one correction: I have 2 R9 290s, not 290Xs (I WISH!).

Thanks again!


----------



## MasterGamma12

CAN I JOIN!!!!!!!!!









Sapphire brand
Stock cooling


----------



## heroxoot

Something weird I noticed with Flash Player on the newer 14.6 driver: if I fall asleep with Flash Player running, the heat just goes up and up even after the stream stops. Let's say Twitch.tv, because it is. Usually I fall asleep at night watching something. When the stream ended, Flash Player used to drop out of full screen and the temps would fall with the usage. Now the clocks just stay pumped up after the stream ends. Before, I'd wake up and the card would have downclocked because the stream was over and offline; now it stays running at half clocks and my room heats up overnight. What makes it weird is that it gets hotter after the stream ends, rather than while it is active.

Kind of weird.


----------



## Imprezzion

I've seen this happen as well. Plus, the heat is significantly higher now than it was before, and the clocks go up way more.

Battlelog's animated background, for example: as long as the browser in question is active and maximized, the clocks stay around 550-600MHz... for just an animated background, which would take maybe 1% of the power at 300MHz to render...


----------



## heroxoot

Quote:


> Originally Posted by *Imprezzion*
> 
> I've seen this happen as well. Plus, the heat is significantly higher now then it was as well.
> Clocks go up way more.
> 
> Battlelogs animated background for example. As long as the browser in question is active and maximized the clocks stay around 550-600Mhz.. For just a animated background which takes maybe 1% of power @ 300Mhz to render...


Well, for this specifically there is nothing but a background image. Not sure why it happens, but it wasn't a thing until 14.6. My GPU clocks up to around 800MHz, seemingly for nothing, while it only clocks to 400-500MHz for the actual 1080p stream. The temp is usually around 55C tops with my AC off, but when this happens it goes up to 57-59C. It's just weird, but not life threatening.


----------



## Imprezzion

My core temps don't really rise; could be the watercooling. But the VRM temps go into the 50s, which is a shame, as they idle in the low 30s.


----------



## heroxoot

Quote:


> Originally Posted by *Imprezzion*
> 
> My core temps don't really rise. Could be the watercooling.. But VRM temps go into the 50's which is a shame as the idle low 30's..


I idle in the low 40s/high 30s on VRM1, so yeah, it's just using power and generating heat it doesn't need. I am on air though, so it's not a huge shock that it causes more heat for me than for you.


----------



## kizwan

Quote:


> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Imprezzion*
> 
> I've seen this happen as well. Plus, the heat is significantly higher now then it was as well.
> Clocks go up way more.
> 
> Battlelogs animated background for example. As long as the browser in question is active and maximized the clocks stay around 550-600Mhz.. For just a animated background which takes maybe 1% of power @ 300Mhz to render...
> 
> 
> 
> Well for this specifically there is nothing but a background image. Not sure why it happens but it wasn't a thing till 14.6. My GPU will clock up around 800MHZ it seems for nothing while it only clocks 400 - 500mhz for the actual stream doing 1080p. The temp is usually around 55c tops with my AC off, but when this happens it goes up to 57 - 59. It's just weird but not life threatening.

With Chrome, my GPU clocks up to ~350MHz when watching YouTube (HTML5); with Firefox, the clock fluctuates between the ~300s, ~400s, ~500s, ~600s, ~800s and ~900s MHz when watching YouTube (Flash). Both with hardware acceleration enabled. The clock returns to 300MHz after the video stops.

14.6 beta.


----------



## heroxoot

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Imprezzion*
> 
> I've seen this happen as well. Plus, the heat is significantly higher now then it was as well.
> Clocks go up way more.
> 
> Battlelogs animated background for example. As long as the browser in question is active and maximized the clocks stay around 550-600Mhz.. For just a animated background which takes maybe 1% of power @ 300Mhz to render...
> 
> 
> 
> Well for this specifically there is nothing but a background image. Not sure why it happens but it wasn't a thing till 14.6. My GPU will clock up around 800MHZ it seems for nothing while it only clocks 400 - 500mhz for the actual stream doing 1080p. The temp is usually around 55c tops with my AC off, but when this happens it goes up to 57 - 59. It's just weird but not life threatening.
> 
> 
> With Chrome, my GPU clock up to ~350MHz when watching youtube (HTML5) & with Firefox, clock fluctuate between ~300s, ~400s, ~500s, ~600s, ~800s & ~900s MHz when watching youtube (Flash). Both with hardware accelerated enabled. Clock return to 300MHz after the video stop.
> 
> 14.6 beta.

Lucky you. For 1080p Flash on Twitch I clock up between 400 and 600MHz, and when the video stops the clocks seem to rise slightly more, causing higher temps. Only in full screen; if the video stops outside of fullscreen, it does not occur.


----------



## MasterGamma12

Quote:


> Originally Posted by *MasterGamma12*
> 
> CAN I JOIN!!!!!!!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sapphire brand
> Stock cooling


Will I ever get added??


----------



## heroxoot

Quote:


> Originally Posted by *MasterGamma12*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MasterGamma12*
> 
> CAN I JOIN!!!!!!!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sapphire brand
> Stock cooling
> 
> 
> 
> Will I ever get added??

Maybe if you read the first post? It helps a bit, bro.


----------



## RocketAbyss

Quote:


> Originally Posted by *HoneyBadger84*
> 
> http://www.mckeemaker.com/2014/01/diy-vrm-heat-sinks-for-amd-r9-290.html
> 
> Would assume locations are the same for the 290X, check that out, has locations of both.


Quote:


> Originally Posted by *fateswarm*
> 
> I suspect vrm1 is low or high mosfets of the gpu power supply and vrm2 low or high (whatever the other was, the opposite) of the same supply (usually located next to the line of chokes at the right of the gpu). There is also supply of the VRAM but I doubt that's measured usually. I'm not sure what's going on with VRAM but on RAM I know that supply is insignificant in volume.


Quote:


> Originally Posted by *Imprezzion*
> 
> Well, that aint right for sure. The chips labeled as ''1'' are the VRAM. Something completely different and unmonitored on 290 series cards.
> 
> The long strip is VRM1, the little triangle block is VRM2. Simple as that.


Thanks for the help, peeps. I finally decided to take off the G10 adapter, put on the VRM1 heatsink (aka the long one) and reattach the G10. Now VRM1 maxes out at about 70C and VRM2 at about 65C during a BF4 gameplay session. Cheers!


----------



## HoneyBadger84

One thing to note for QuadFire: the 4th card is only going to help in some titles, and only at higher resolutions. Most of the time it'll have zero effect at 1080p or even 1440/1600. It really only starts to show up at 3K (3x1080p) or 4K resolutions. You should take a look at Newegg's performance review of QuadFire R9 290Xs: they ran them with three 2560x1600 monitors, more pixels than 4K in actuality, and the performance scaling at that resolution was downright impressive... granted, it was going from 11 FPS on a single card to just over 30 FPS in Crysis 3 maxed out, but keep in mind that's maxed out, AKA with AA, which is really unnecessary at 4K.

The other thing you're going to be dealing with is CPU-limited titles, as well as titles that just plain don't scale well with Crossfire, especially past 2 cards. There are a few titles that scale incredibly well (almost 100%) with 2 cards, Metro 2033 being a good example, but going to 3-4 cards has a smaller effect unless you're running very high resolutions like 3K or 4K.
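As a rough sketch of what that Newegg result means in scaling terms (illustrative Python; "30 fps" is my reading of "just over 30", so treat the output as approximate):

```python
# Multi-GPU scaling efficiency: measured fps vs perfect linear scaling
# (single-card fps times the card count).
def scaling_efficiency(single_fps: float, multi_fps: float, n_cards: int) -> float:
    """Fraction of ideal n-card scaling actually achieved."""
    return multi_fps / (single_fps * n_cards)

# Crysis 3 maxed out on 3x 2560x1600: 11 fps single, ~30 fps QuadFire.
print(f"{scaling_efficiency(11, 30, 4):.0%} of perfect 4-way scaling")  # 68%
```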

As far as vRAM goes, no, 4GB is not going to be enough for 4K in certain titles if you're looking to run maxed-out settings, even without anti-aliasing; I've seen reviews saying several games go over the 4GB mark. I believe Hitman Absolution was measured at up to 5900MB of usage in a Titan SLI review at 4K, but I don't know if that was run with or without anti-aliasing...

I'd love to go 4K sooner rather than later, but I ain't replacing my TV til the warranty on it (which replaces it if it breaks or dies) runs out.


----------



## kahboom

What's considered a safe voltage for the RAM on cards with Elpida memory? Is 50mV more OK for 24/7 use?


----------



## HoneyBadger84

Quote:


> Originally Posted by *kahboom*
> 
> What's considered a safe voltage for the ram on cards with elpida memory? .50mv more OK for 24/7 use?


I thought vRAM voltage wasn't adjustable? And what exactly is the Aux voltage for on these cards? I don't even know; haven't messed with it myself.


----------



## Gobigorgohome

Quote:


> Originally Posted by *HoneyBadger84*
> 
> One thing to note for QuadFire: The 4th Card is only going to help in some titles, and only at higher resolutions. Most of the time it'll have zero effect at 1080p or even 1440/1600 resolutions. It really only starts to show up in 3K(3x1080p) or 4K resolutions. You should take a look at Newegg's performance review on QuadFire R9 290Xs, they ran them with 3 2560x1600 monitors, more pixels than 4K in actuality, and the performance scaling at that resolution was downright impressive... granted, it was going from 11FPS in single card, to just over 30FPS in Crysis 3 maxed out, but keep in mind, that's maxed out, AKA with AA which is really unnecessary in 4K.
> 
> The other thing you're going to be dealing with is CPU-limited titles, as well as titles that just plain don't scale well with Crossfire, especially past 2 cards. There are a few titles that scaling incredibly well (almost 100%) with 2 cards, Metro2033 being a good example, but going to 3-4 cards has a smaller effect, unless you're going to very high resolutions like 3K or 4K.
> 
> As far as vRAM goes, no 4GB is not going to be enough for 4K in certain titles if you're looking to run maxed out settings, even without Anti-Aliasing, I've seen reviews saying several games go over the 4GB mark. I believe Hitman Absolution was tested as getting up to 5900MB of usage on a Titan SLi setup review at 4K, but I don't know if that was ran with or without Antialiasing on...
> 
> I'd love to go 4K sooner rather than later, but I ain't replacing my TV til the warranty ends on it that replaces it if it breaks or dies.


In most of the reviews of quadfire R9 290X out there, it seems effective to add the 3rd and 4th card at 4K; some games scale a little better, while others scale much better, BF4 being one of the games with very good scaling with quadfire R9 290X at 4K. The sweet spot for 4K gaming is 3x R9 290X in tri-fire, but for the price I paid for my cards I am happy to go with four, and now I have 2x EVGA G2 1300W in my case too, so there is no power bottleneck either.

4GB of vRAM may be a little short with everything at max, but a game such as Hitman Absolution is still beautiful and very playable at max settings with quadfire R9 290X (averaging 81.9 fps in the benchmark); I have no microstuttering/tearing or lag in that game whatsoever. And I still have the opportunity to push both the cards and the CPU some more when it comes to overclocking.


----------



## Dasboogieman

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I thought vRAM voltage wasn't adjustable? What exactly is the Aux Voltage for on these cards? I don't even know, haven't messed with it myself.


I've read reports that it determines the amount of power drawn in via the PCIe slot.
Some cards have adjustable VRAM voltage, but those are really rare. I've only ever seen it on MSI cards, and even then it was on the 270X Hawk edition and the GTX 570 Power Edition.


----------



## heroxoot

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kahboom*
> 
> What's considered a safe voltage for the ram on cards with elpida memory? .50mv more OK for 24/7 use?
> 
> 
> 
> I thought vRAM voltage wasn't adjustable? What exactly is the Aux Voltage for on these cards? I don't even know, haven't messed with it myself.

From prior reading I believe AUX is the PCIe power draw. This is not always the same for every GPU though; I've read that it's sometimes different.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Gobigorgohome*
> 
> In most of the reviews of Quadfire R9 290X out there, adding the 3rd and 4th card seems to be effective at 4K; some games scale a little better while others scale much better, BF4 being one of the games with very good scaling. The sweet spot for 4K gaming is 3x R9 290X in Tri-Fire, but for the price I paid for my cards I am happy to go with four, and now I have 2x EVGA G2 1300W in my case too, so there is no power bottleneck either.
> 
> 4GB of vRAM may be a little short with everything at max, but games such as Hitman: Absolution are still beautiful and very playable at max settings with Quadfire R9 290X (averaging 81.9 fps in the benchmark). I have no microstuttering, tearing or lag in that game whatsoever, and I still have room to push both the cards and the CPU further on the overclock.


Indeed, I'm in the same boat as far as price paid goes: $880 for my first 3, and the 4th, when I get it, shouldn't run me more than a lil' over $300 max. And BF4 is definitely one of the best showcase pieces for what QuadFire is capable of when the game properly supports it.


----------



## kahboom

I'm asking because in the BIOS I changed the default vRAM voltage from 1.50v to 1.55v for the Elpida memory. It got me stable running up to 1425-1450MHz, whereas before it would not go past 1360MHz.


----------



## Dasboogieman

Quote:


> Originally Posted by *kahboom*
> 
> I'm asking because in the BIOS I changed the default vRAM voltage from 1.50v to 1.55v for the Elpida memory. It got me stable running up to 1425-1450MHz, whereas before it would not go past 1360MHz.


Have you added core voltage? If you are overclocking the core anyway, extra Vcore can help stabilize some Mem clocks.


----------



## Blue Dragon

Haven't figured out what happened. I only used my rig yesterday for streaming low-res video for about 4 hours and noticed my core clock-

I don't have any OC software installed on a fresh Win 7 besides the built-in CCC stuff, and I haven't even activated (clicked on) Overdrive. The core should be sitting at 1000MHz for an Asus R9-290 OC.


----------



## heroxoot

Quote:


> Originally Posted by *Blue Dragon*
> 
> haven't figured out what happened, only used my rig yesterday for streaming low res video for about 4 hours and noticed my core clock-
> 
> I don't have any OC software installed on fresh win 7 besides built-in CCC crap and I haven't even activated (clicked on) the overdrive. the core should be sitting at 1000MHz for Asus R9-290 OC.


This is an obvious false positive. Grab GPU-Z and put your mind at ease.


----------



## rdr09

Quote:


> Originally Posted by *Blue Dragon*
> 
> haven't figured out what happened, only used my rig yesterday for streaming low res video for about 4 hours and noticed my core clock-
> 
> I don't have any OC software installed on fresh win 7 besides built-in CCC crap and I haven't even activated (clicked on) the overdrive. the core should be sitting at 1000MHz for Asus R9-290 OC.


That is a very nice OC combined with a decent temp. I would not complain about that, if that's what you are doing.

Seriously, it is a bugged reading for sure.

Don't use Overdrive. I don't even use CCC; I only open it when I need it, like Excel or Word.


----------



## kahboom

Quote:


> Originally Posted by *Dasboogieman*
> 
> Have you added core voltage? If you are overclocking the core anyway, extra Vcore can help stabilize some Mem clocks.


No, I haven't added core voltage; it's just set at stock. And adding core voltage alone did not help my memory clocking.


----------



## Kittencake

I was flipping through windows, and as I minimized Waterfox I noticed my temps were at 68c with the fan @ 59%... all from using YouTube. I've turned off hardware acceleration and it still likes to spike my temps.


----------



## kahboom

Bad case airflow, perhaps. How long have you noticed the temps on the rise?


----------



## Kittencake

Only when I use YouTube; it sits fine at 42c at idle when I'm not doing anything else.


----------



## heroxoot

Quote:


> Originally Posted by *Kittencake*
> 
> only when i use youtube , sits fine at 42c on idle when i'm not doing anything else


Did you perhaps leave a window with Flash open? I have been noticing that when I leave Flash running, the clocks kick up when the Flash player ends. I watch twitch.tv a lot, and when I fall asleep I wake up to find my GPU at 58c even though the live stream has ended. The funny thing is that even with my room sweltering hot, the GPU usually only goes to 55c while Flash is actually playing. It's an odd problem, really. It usually happens when it was full screen, though; once I hit escape my temps drop. Again, the stream has ended, so it makes no sense that Flash is still this active.

Beyond that, it could be something running that makes the GPU think it needs to clock up. Depending on your setup, you can set the RivaTuner Statistics application detection to NONE for that program.


----------



## Kittencake

Quote:


> Originally Posted by *heroxoot*
> 
> did you perhaps leave a window with flash opened? I have been noticing when I leave flash running the clocks kick up when the flash player ends. I watch twitch.tv a lot, and when I fall asleep I wake up to find my GPU at 58c even tho the live stream has ended. The funny thing is even with my room sweating hot, the GPU only goes to 55c usually during flash player. It's an odd problem really. This usually happens when it was full screen tho. Once I hit escape my temps drop. Again, the stream has ended so it makes no sense flash is still this active.
> 
> Beyond that, it could be something running that makes the GPU think it needs to clock up. Depending on your setup, you can tell the riva tuner statistics to NONE for that program.


All I was doing was listening to a mix on YouTube; as soon as it ended it cooled down. I'm wondering why something like Flash would use so much.


----------



## sugarhell

Quote:


> Originally Posted by *Kittencake*
> 
> only when i use youtube , sits fine at 42c on idle when i'm not doing anything else


Hey, IIRC even the browser has hardware acceleration, so you need to disable both Flash and browser hardware acceleration.

If you use MSI AB you can press the 'i' button and monitor which apps are on 3D clocks at the bottom.

As a final solution, disable PowerPlay and manually control the switch between 2D and 3D clocks.


----------



## heroxoot

Quote:


> Originally Posted by *Kittencake*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> did you perhaps leave a window with flash opened? I have been noticing when I leave flash running the clocks kick up when the flash player ends. I watch twitch.tv a lot, and when I fall asleep I wake up to find my GPU at 58c even tho the live stream has ended. The funny thing is even with my room sweating hot, the GPU only goes to 55c usually during flash player. It's an odd problem really. This usually happens when it was full screen tho. Once I hit escape my temps drop. Again, the stream has ended so it makes no sense flash is still this active.
> 
> Beyond that, it could be something running that makes the GPU think it needs to clock up. Depending on your setup, you can tell the riva tuner statistics to NONE for that program.
> 
> 
> 
> all I was doing was listening to a mix on youtube soon as it ended it cooled down , wondering why something like flash would use so much
Click to expand...

No idea. Flash makes my GPU heat up to around 55c during use, and then when the stream ends and Flash should close, it just keeps eating up my GPU, causing more heat. I have all hardware acceleration disabled and yet it clocks up to 600MHz on the core at most. On 14.4 this didn't happen; the GPU stayed at 300 with acceleration off.

You got me!


----------



## Kittencake

Quote:


> Originally Posted by *heroxoot*
> 
> No idea. Flash makes my GPU heat up around 55c during use, and then when the stream ends and flash should close, it just starts eating up my GPU causing more heat. I have all hardware acceleration disabled and yet it clock sup to 600mhz on the core at most. On 14.4 this didn't happen. GPU stayed 300 with acceleration off.
> 
> You got me!


maybe i should roll back to 14.4


----------



## heroxoot

Quote:


> Originally Posted by *Kittencake*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> No idea. Flash makes my GPU heat up around 55c during use, and then when the stream ends and flash should close, it just starts eating up my GPU causing more heat. I have all hardware acceleration disabled and yet it clock sup to 600mhz on the core at most. On 14.4 this didn't happen. GPU stayed 300 with acceleration off.
> 
> You got me!
> 
> 
> 
> maybe i should roll back to 14.4
Click to expand...

Quite a few people, including myself, see more heat on 14.6, but generally the performance increase is worth it. I saw a huge performance buff for my card. It all depends on what you need.


----------



## tsm106

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Kittencake*
> 
> only when i use youtube , sits fine at 42c on idle when i'm not doing anything else
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Hey.iirc even browser has hardware acceleration.So you need to disable both flash and browser hardware acceleration.
> 
> If you use msi AB you can press the i button and monitor the 3d clocks apps at the bottom.
> 
> Final solution disable powerplay and manual control 2d to 3d clocks and back
Click to expand...

I like how the answers get posted and ignored while misinfo is eaten up.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> I like how the answers get posted, are ignored while misinfo is eaten up.


tsm, I just opened AB and it showed PowerPlay disabled. Is that permanent even if AB is closed, or only until AB is uninstalled?

I watch YouTube, Twitch.tv, etc., and my temps do go up like 2C-3C from idle using 14.6, which I think is normal for a watercooled 290. Hardware acceleration is enabled in the browser and in the video.


----------



## tsm106

PowerPlay is never disabled. PowerPlay is fundamental to AMD's tech. You can only set AB to ignore it if you use unofficial mode.

Temps go up because AB and RTSS are reacting to the powerstate change from whatever app.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> Powerplay is never disabled.Powerplay is fundamental to AMD's tech. You can only set AB to ignore it if you use unofficial mode.
> 
> Temps go up because AB and RTSS are reacting to the powerstate change from whatever app.


+rep.


----------



## Arizonian

Quote:


> Originally Posted by *MasterGamma12*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Sapphire brand
> Stock cooling


Quote:


> Originally Posted by *MasterGamma12*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> CAN I JOIN!!!!!!!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Sapphire brand
> Stock cooling


Quote:


> Originally Posted by *MasterGamma12*
> 
> Will I ever get added??


Congrats - added









Your re-post was one hour after originally submitting proof. Sorry, I was at work yesterday for 11 hours, got home and had to build my dad an i5 4970S tower, and did some moderating that couldn't wait. Some hot topics the last couple of days. Usually I take care of the club first.









Just for reference for others who may submit proof, give me at least 24hrs and PM me your link if you feel I missed you. For the most part I'm on top of it much faster than this.
Quote:


> Originally Posted by *Beezleybuzz*
> 
> Thanks for adding me to the list! Just one correction, I have 2 R9 290's, not 290x (I WISH!).
> 
> Thanks again!


Heh, I must have read your validation wrong. Corrected.


----------



## kizwan

Quote:


> Originally Posted by *heroxoot*
> 
> Lucky you. For twitch I clock up between 400 and 600 for 1080p flash on twitch and when the video stops the clocks seem to slightly rise more causing more temps. Only in full screen. If the video stops regularly without fullscreen it does not occur.


It's kinda weird that you got this problem. I'm on the 14.6 beta; are you using the beta or the RC? Both Chrome & Firefox have hardware acceleration enabled, but Chrome uses HTML5 while Firefox uses Flash. With the latter, the core clock goes up & down more aggressively than with the former, but with both, when a video ends (in full screen) the clock always returns to 300MHz. Can you try YouTube? Maybe this happens on certain websites only.
Quote:


> Originally Posted by *kahboom*
> 
> I'm asking since in bios I changed to default vram voltage to 1.55v up from 1.50v for elpida memory it got me stable running up too 1425 to 1450mhz which before it would not go past 1360mhz


How did you add VRAM voltage in the BIOS? And really, it can't go above 1360MHz with stock VRAM voltage?


----------



## HoneyBadger84

Quote:


> Originally Posted by *kizwan*
> 
> How did you add VRAM voltage in BIOS? Really? Can't go above 1360MHz with stock VRAM voltage?


That seems nuts to me as well. My HIS cards have Elpida and I ran 1400MHz for a full bench sweep without a single issue. I'm pretty sure they can do 1450MHz or higher, but I haven't tested it yet.


----------



## FrankoNL

Guys, I am looking for a simple-to-install aftermarket cooling system for my reference Gigabyte 290. The sound is driving me bonkers. What cooler do you guys recommend (it has to be available in Europe)?


----------



## Kittencake

Well, you can get the Kraken G10 with an H60, or you can always wait for the Corsair HG10 like I am.


----------



## heroxoot

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Lucky you. For twitch I clock up between 400 and 600 for 1080p flash on twitch and when the video stops the clocks seem to slightly rise more causing more temps. Only in full screen. If the video stops regularly without fullscreen it does not occur.
> 
> 
> 
> It's kinda weird you got this problem. I'm on 14.6 beta. Are you using beta or RC? Both Chrome & Firefox have hardware acceleration enabled but Chrome use HTML5 while Firefox use Flash. The latter, core clock will up & down aggressively than the former but with both, when video ended (in full screen), clock always return to 300MHz. Can you try YouTube? Maybe this happen on certain website only.
Click to expand...

RC2 for me right now. I don't 100% remember if it happened on 14.6 beta 1, but I'm pretty sure it did. It definitely did not happen on 14.4, though. I have YouTube running HTML5 and the problem doesn't show up. It's merely when I watch video on Twitch and the stream ends in fullscreen.


----------



## VSG

Any limitations with number of PCI-E slots, budget etc? There are some very nice options but they can take up to 3-4 PCI-E slots with fans or require an AIO and heatsinks.


----------



## EdForce1

Hi, I have the Vapor-X edition of the Sapphire R9 290 and I'm bothered by the noise the power-load activity LEDs make, especially when I move my mouse. Any thoughts on how to remedy it? Thanks.


----------



## Gobigorgohome

Will a 3930K at stock speeds bottleneck Quadfire R9 290X's much at 4K?


----------



## heroxoot

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Will a 3930K at stock speeds bottleneck Quadfire R9 290X's much at 4K?


Doubt it. Until the game requires more CPU than what you have offers, you won't bottleneck.


----------



## Gobigorgohome

Quote:


> Originally Posted by *heroxoot*
> 
> Doubt it. Until the game requires more CPU than what you have offers, you won't bottleneck.


That was a relief; I was worried I had to buy another CPU. It seems like my chip is unable to do 4.2GHz at 1.4 volts (water cooled with an EK-Supremacy): great temperatures, but BSODs all the time in Prime95.


----------



## heroxoot

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Doubt it. Until the game requires more CPU than what you have offers, you won't bottleneck.
> 
> 
> 
> That was a relief; I was worried I had to buy another CPU. It seems like my chip is unable to do 4.2GHz at 1.4 volts (water cooled with an EK-Supremacy): great temperatures, but BSODs all the time in Prime95.
Click to expand...

That sucks. I doubt it though. Games should be pretty GPU-heavy at 4K res, right? I'd expect more load on the Quadfire than anything else, really. I mean, games like BF4 are so GPU-heavy that the difference between the highest-end i7 and my crappy 8150 is like 2fps. There is no way a 3930K could cause a bottleneck, and even if it did, I bet it wouldn't be by a lot. It's a good CPU regardless.


----------



## Gobigorgohome

Quote:


> Originally Posted by *heroxoot*
> 
> That sucks. I doubt it tho. Games should be pretty GPU heavy at 4k res right? I'd expect more on the quadfire than anything else really. I mean, games like BF4 are so GPU heavy the difference between the highest end i7 and my crappy 8150 is like, 2fps. But there is no way a 3930k could cause a bottleneck. Even if it did, I bet it wouldnt be a lot. It's a good CPU regardless.


I guess it should be fine. I will try to tune the CPU again when I get all my water cooling in (a lot more cooling area and probably lower ambient water temperature). In the big titles I do not think the CPU overclock has too much to say at 4K.


----------



## King PWNinater

If I'm artifacting at 1050-1100mhz on the core clock, how much should I increase the voltage by?


----------



## Sgt Bilko

Quote:


> Originally Posted by *King PWNinater*
> 
> If I'm artifacting at 1050-1100mhz on the core clock, how much should I increase the voltage by?


Bump it by 5-10mV at a time until it stops artifacting.
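The bump-and-test procedure above can be sketched in a few lines; `find_stable_offset` and `stable_at` are hypothetical names, and `stable_at` stands in for your own artifact check (e.g. a Heaven or Valley run at the target clocks):

```python
# Sketch of the bump-and-test loop: raise the voltage offset in small steps
# until the artifact check passes. stable_at() is a hypothetical stand-in for
# a manual test run; the +100mV ceiling matches Afterburner's stock limit.
def find_stable_offset(stable_at, step_mv=10, max_mv=100):
    """Return the smallest voltage offset (in mV) at which the card is stable."""
    offset = 0
    while not stable_at(offset):
        offset += step_mv
        if offset > max_mv:
            raise RuntimeError("hit the +100mV cap; back the clocks off instead")
    return offset
```

In practice each `stable_at` "call" is a manual benchmark pass, but the logic is the same: small steps, stop at the first stable offset, and give up rather than exceed the slider's cap.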


----------



## rv8000

Quote:


> Originally Posted by *EdForce1*
> 
> Hi, i have the Vapor-X edition of Sapphire R9 290 and i'm bothered by the noise the power load activity LEDS make, especially when i move my mouse. Any thoughts on how to remedy? Thnx


That is actually coil whine and has nothing to do with the LEDs on the back side of the card. The only real remedies are using frame limiters in games, possibly swapping power supplies, or RMAing the card if it really bothers you.


----------



## yawa

Just a personal-experience heads-up here: Trixx was holding my 290X back massively in terms of overclocking potential. So if anyone is having issues pulling off a high overclock on their 290/290X while using Trixx, trust me, it's worth giving another program a try.

I went from 58.7/mid-1300s in Heaven to 66.7/1684. Everything else gained a ton of points as well. Trixx was definitely not holding a constant voltage for me, even when forcing it. Afterburner fixed this and made a huge difference.


----------



## PureBlackFire

Quote:


> Originally Posted by *yawa*
> 
> Just a personal experience heads up here. Trixx was holding my 290X back massively as far as Overclocking potential. So if anyone is having issues pulling off a high level Overclock on their 290/290X while using Trixx trust me it's worth it, give another program a try.
> 
> I went from 58.7/mid 1300 Heaven score to 66.7/1684. Everything else gained a ton of points as well. Trixx was definitely not holding a constant voltage for me, even when forcing it. Afterburner fixed this and made a huge difference.


If what I see on the OSD is correct, AB doesn't come close to holding a constant voltage for me either, even with it forced.


----------



## rdr09

Quote:


> Originally Posted by *yawa*
> 
> Just a personal experience heads up here. Trixx was holding my 290X back massively as far as Overclocking potential. So if anyone is having issues pulling off a high level Overclock on their 290/290X while using Trixx trust me it's worth it, give another program a try.
> 
> I went from 58.7/mid 1300 Heaven score to 66.7/1684. Everything else gained a ton of points as well. Trixx was definitely not holding a constant voltage for me, even when forcing it. Afterburner fixed this and made a huge difference.


Did you use the power limit in Trixx? It allowed me to OC my 290 to a 1320 core.

Edit: actually, all my OCs were done using Trixx.


----------



## yawa

Yeah, no answers on that one, guys. I just know that even at +200mv with everything unlocked, she dropped voltages massively on me. Afterburner certainly fluctuates for me too, but when I bench, the clocks remain constant: 1243 max core in Trixx, 1295 max core in AB.

Disregarding the potential though, the weirdest thing is that benchmark scores at the same clocks are very different as well. I gained a ton of points using AB.

I guess my card just doesn't like Trixx.


----------



## PureBlackFire

I looped Valley for 30 minutes and then benchmarked. The front panel on my case is off.


----------



## King PWNinater

I need more voltage. Should I use Afterburner or Trixx?


----------



## King PWNinater

Btw, I'm a little f'ed over at the moment. My voltage is maxed out, but my memory and core clocks won't go any higher (1215/1250), and I'm also starting to get small artifacts. Any way to increase voltage further in Afterburner?


----------



## heroxoot

Quote:


> Originally Posted by *King PWNinater*
> 
> Btw. I'm a little f'ed over at the moment. My voltage is maxed out, but my memory and core clock won't go any lower (1215/1250) . I'm alot starting to get small artifacts. Any way to increase voltage in afterburner?


Unlock voltage in the options.


----------



## King PWNinater

I did.


----------



## heroxoot

Quote:


> Originally Posted by *King PWNinater*
> 
> I did.


If voltage doesn't unlock on a 290 or 290X card, then something is wrong. Some info on the card's make would help.


----------



## Germanian

I think he means he maxed it out to +100mv and now he wants to unlock more. To do this you need to edit the .ini file if you are using MSI Afterburner.
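For reference, in recent Afterburner builds the settings actually live in MSIAfterburner.cfg in the install folder rather than a separate .ini. A sketch of the commonly cited entries follows; back the file up first, and treat the exact key names as version-dependent:

```ini
; MSIAfterburner.cfg (in the Afterburner install folder) - back this up first.
; Key names are the ones commonly posted in guides and may vary by version.
[Settings]
UnlockVoltageControl    = 1
UnlockVoltageMonitoring = 1

[ATIADLHAL]
; enables "unofficial overclocking" mode for AMD cards
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 1
```

Even with these set, how far the voltage slider extends past +100mV ultimately depends on what the card's voltage controller supports.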


----------



## King PWNinater

What is the ini file and what does it need to be changed to?

Btw, it is unlocked and I'm at +100.


----------



## Beezleybuzz

Quote:


> Originally Posted by *HoneyBadger84*
> 
> That seems nuts to me as well. My HIS cards have Elpida and I ran 1400MHz for a full bench sweep without a single issue. I'm pretty sure they can do 1450MHz or higher, but I haven't tested it yet.


My Asus R9 290 DCUII OC has Elpida memory and I run it at 1500mhz (6000mhz effective) with no problem; now, that is.

There was a BIOS update not long ago for my cards and it seems to have made overclocking the memory more stable. Could it possibly be due to an increase in RAM voltage? Hmmm. I had been musing on the reasons for my increased overclock following the new BIOS.


----------



## Petet1990

What's up everyone, I just ordered 3 MSI 290s. Can anyone tell me how much power those 3 cards will draw? Thanks.


----------



## pdasterly

Where can I buy waterblocks for a decent price? I'd prefer Heatkiller.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Gobigorgohome*
> 
> That was a relief, was worried I had to buy another CPU. It seems like my chip is unable to do 4,2 Ghz at 1,4 volts (water cooled with EK-Supremacy), great temperatures but BSOD's all the time in Prime95.


Don't use Prime95. Use Intel Burn Test on High and let it do at least 10-30 loops. 1.4v is insane IMO; no way it should need that much when mine does 4.2GHz at 1.21-1.22v.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Petet1990*
> 
> Whats up everyone i just ordered 3 msi 290s..can anyone tell me how much power will those 3 cards draw? Thanks.


Less than 1200W but more than 900W, if you're going full bore on them.

That's based off my 3 290Xs pulling about 1150W from the wall at stock, which converts to around 1000W or a bit less of actual draw; OCed they pull up to 1400W at the wall, which comes out to a bit under 1200W actual system power given my PSU's efficiency at those particular draws. Given that 290s pull slightly less, I'd say 850-1100W actual draw is the range you'll see with mild OCs.
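The wall-to-actual conversion in those numbers is just wall draw multiplied by PSU efficiency. A minimal sketch, assuming roughly 85% efficiency at those loads (an assumption; check your own unit's 80 Plus efficiency curve):

```python
# Minimal sketch of the wall-draw estimate above. The 0.85 efficiency figure
# is an assumed value for a PSU at high load, not a measured one.
def dc_draw(wall_watts, efficiency=0.85):
    """Estimate the DC power the system draws from AC power measured at the wall."""
    return wall_watts * efficiency

print(dc_draw(1150))  # stock: ~978W actual
print(dc_draw(1400))  # overclocked: ~1190W actual
```

Note the conversion goes wall-to-system here: the wall meter reads higher than the system's real draw because the PSU wastes the difference as heat.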


----------



## cephelix

Alright guys,

Finally had the time today to take apart my MSI R9 290 Gaming 4G to take some pictures and measurements of the various components. As was posted before in this thread, with a lot of help from @Dasboogieman, it sadly turns out that my particular card cannot fit the EK full-cover water block (yes, even rev 2.0). But since I still want to include my 290 in my custom loop, I'll have to go for a universal block, namely the recently released EK Thermosphere. And as we all know, VRM1 on the 290 runs hot as hell, especially when extra volts are applied, so I will have to apply heatsinks to it; that is the purpose of today's disassembly. Along the way, I took a few photos as a visual record of those measurements. Hope they can be of use to others as well.


First up is the obligatory full PCB shot of my particular card. As you can see, the chokes are gold SSC chokes, the capacitor in the top right corner is not immediately next to the 8-pin connector but recessed, and there is no AMD silkscreen print near the PCIe connector.


PCB revision 2.2 has the same layout as PCB revision 1.4 (V308-014R) whose picture can be found on coolingconfigurator.com


Picture and measurements of VRM1. The dimensions of VRM1 are 12mm x 67mm (W x H); including the screw holes on the top and bottom, the height extends to 88mm.


Hynix memory. 14mm x 12mm (W x H)


VRM2: 12mm x 9mm (W x H); diagonal length including screw holes: 37mm






Now, I see small squares of thermal pad on the stock heatplate that correspond to components on the PCB, as indicated by the green and red squares. Does anyone know what they are?

Now that I have all the measurements down, I'll be sourcing heatsinks of various sizes for the VRMs and VRAMs.


----------



## fateswarm

I suspect your VRM1 is up to the 6th MOSFET; the 7th is probably for the vRAM. I wonder how they place the sensor, though (just on one of them for kicks?).


----------



## miraldo

Hey guys.

Can you please tell me which are currently the best AMD drivers for the R9 290?

I play only Battlefield 4.


----------



## Dasboogieman

Quote:


> Originally Posted by *cephelix*
> 
> Alright guys,
> 
> Finally had the time today to take apart my MSI R9 290 Gaming 4G to take some pictures and measurements of the various components. As was posted before in this thread, with alot of help from @Dasboogieman, it sadly turns out that my particular card cannot fit the EK full cover water block(yes, even the rev 2.0). But since I still would like to include my 290 in my custom loop, I'll have to go for a universal block, namely the recently released EK Thermosphere. And as we all know, VRM1 on the 290 runs hot as hell, especially when extra volts are applied and so I would have to start applying heatsinks to them and that is the purpose of today's disassembly. Along the way, I took a few photos as a visual record of those measurements. Hope it could be of use to others as well.
> 
> 
> First up, is the obligatory full pcb shot of my particular card. As you can see, the chokes are gold SSC chokes, the capacitor on the top right corner is not immediately next to the 8-pin connector, instead it is recessed and there is no AMD silkscreen print near the pci-e connector.
> 
> 
> PCB revision 2.2 has the same layout as PCB revision 1.4 (V308-014R) whose picture can be found on coolingconfigurator.com
> 
> 
> Pictue and measurements of VRM1. Dimensions of VRM1 is 12mm x 67mm (W x H). If including the screw holes on the top and bottom then the height is extended to 88mm.
> 
> 
> Hynix memory. 14mm x 12mm (W x H)
> 
> 
> VRM2 12mm x 9mm(W x H), Diagonal length including screw holes: 37mm
> 
> 
> 
> 
> 
> 
> Now, i see small squares of thermal pad on the stock heatplate that corresponds to components on the PCB as indicated by the green and red squares. Anyone knows what they are?
> 
> Now that I got all the measurements down, I'll be sourcing for heatsinks of various sizes for the VRMs and VRAMS


Damn, auxiliary cooling is gonna be even harder than I imagined. Those units between the capacitors are probably the controller units for the PWM. To be honest, MSI is overkilling the cooling of the PCB ICs a little; hell, they slapped thermal pads on the inductors, which don't really get hot.


----------



## EdForce1

Quote:


> Originally Posted by *rv8000*
> 
> That is actually coil whine and has nothing to do with the leds on the back side of the card. The only real remedies are using frame limiters in games, possibly swapping power supplies, or rma the card if it really bothers you.


Thanks. I haven't tried it in games yet, but so far it's only at idle that I'm experiencing this noise, and only 9 times out of 10 when I move my mouse.
Would changing to a different PCIe power cable help? I have a semi-modular XFX 750W PSU.


----------



## cephelix

Quote:


> Originally Posted by *Dasboogieman*
> 
> Damn, auxiliary cooling is gonna be even harder than I imagined. Those units between the capacitors is probably the controller units for the PWM. To be honest, MSI are overkilling the cooling of the PCB ICs a little. Hell, they slapped thermal pads on the inductors which don't really get hot.


Forgive me for being a bit dense, but why would auxiliary cooling be hard?


----------



## kizwan

Auxiliary cooling


----------



## Dasboogieman

Quote:


> Originally Posted by *cephelix*
> 
> Forgive me for being a bit dense, but why would auxiliary cooling be hard?


Cooling for the subcomponents of the Gaming edition, because you'd need mostly non-standard heatsinks etc. Efficiency would also be poor since you would have to use tape; epoxy might help, but removal is an issue with so many ICs.


----------



## Kittencake

Quote:


> Originally Posted by *miraldo*
> 
> Hey guys.
> 
> Can you please tell me which are currently the best AMD drivers for R9 290?
> 
> I playing only Battlefield 4.


14.6. I found a boost of 5-10 fps in BF4 over 14.4.


----------



## Enilder

I need help with BSOD problem with R9 290 Tri-X OC

I generally have about 10+ tabs open in Firefox, and while watching YouTube, streams, etc., it just crashes. My CPU is overclocked and was stress-tested with Prime95 for over a day. I had the system running for quite some time before I installed the GPU, so I am quite confident it's the GPU causing the crash.

I have 14.1 Catalyst and wondering if there's a better/more stable version of the driver I should look into. Does anyone have this problem with this Catalyst version and found a solution to it? Thanks.


----------



## Kittencake

Quote:


> Originally Posted by *Enilder*
> 
> I need help with BSOD problem with R9 290 Tri-X OC
> 
> I generally have about 10+ tabs open under Firefox and while watching youtube, stream, etc., it just crashes. My CPU is overclocked and stress-tested with Prime95 over a day or so. I had the system running for quite some time before I installed the GPU so I am quite confident that it's my GPU that's causing this crash.
> 
> I have 14.1 Catalyst and wondering if there's a better/more stable version of the driver I should look into. Does anyone have this problem with this Catalyst version and found a solution to it? Thanks.


Try 14.4 or the 14.6 beta.

Did you get the error code?


----------



## BradleyW

Quote:


> Originally Posted by *Enilder*
> 
> I need help with BSOD problem with R9 290 Tri-X OC
> 
> I generally have about 10+ tabs open under Firefox and while watching youtube, stream, etc., it just crashes. My CPU is overclocked and stress-tested with Prime95 over a day or so. I had the system running for quite some time before I installed the GPU so I am quite confident that it's my GPU that's causing this crash.
> 
> I have 14.1 Catalyst and wondering if there's a better/more stable version of the driver I should look into. Does anyone have this problem with this Catalyst version and found a solution to it? Thanks.


Turn off browser and adobe flash hardware acceleration features.


----------



## heroxoot

Quote:


> Originally Posted by *Enilder*
> 
> I need help with BSOD problem with R9 290 Tri-X OC
> 
> I generally have about 10+ tabs open under Firefox and while watching youtube, stream, etc., it just crashes. My CPU is overclocked and stress-tested with Prime95 over a day or so. I had the system running for quite some time before I installed the GPU so I am quite confident that it's my GPU that's causing this crash.
> 
> I have 14.1 Catalyst and wondering if there's a better/more stable version of the driver I should look into. Does anyone have this problem with this Catalyst version and found a solution to it? Thanks.


Disable all accelerations, disable protected mode in flash. I'd also try 14.7 drivers. If not, 14.6. I have heard some complaining of similar problems on 14.7 but not 14.6.


----------



## Petet1990

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Less than 1200W but more than 900W, if you're going full bore on them.
> 
> That's based off my 3 290Xs pulling about 1150W from the wall at stock, which converts to around 1000W or a bit less, at stock, then up to 1400W at the wall OCed, which comes out to a bit under 1200W actual system power given my PSU's ratings at those particular draws. Given that 290s pull slightly less, I'd say 850-1100W actual draw is in the range you'll see with mild OCs.


well that's a relief lol..thanks
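The wall-draw arithmetic quoted above is just PSU efficiency applied to the meter reading; a minimal sketch, with the efficiency figures assumed (roughly what a Gold-class unit does at those loads), not measured:

```python
def dc_load(wall_watts, efficiency):
    """DC power actually delivered to the components = wall draw x PSU efficiency."""
    return wall_watts * efficiency

# Wall figures are the ones quoted above; the efficiencies are assumptions.
stock = dc_load(1150, 0.87)  # triple-290X rig at stock: roughly 1000 W DC
oced = dc_load(1400, 0.85)   # overclocked: roughly 1190 W DC
print(round(stock), round(oced))
```

Which puts the quoted 850-1100W estimate for a trio of 290s in the right ballpark.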


----------



## rv8000

Quote:


> Originally Posted by *EdForce1*
> 
> Thanks, i haven't tried in games yet, but so far it's only when it's idle i'm experiencing this noise, and only 9/10 times when i move my mouse.
> Would changing to a different pci power cable help? I have a semi-modular XFX 750W psu.


I've had this happen before. I can't remember which card it was (I think my 680), but when I would scroll on web pages it would coil whine, either due to hardware acceleration or a program called Pando Media Booster messing with GPU clocks. If you haven't already, download MSI Afterburner or GPU-Z and monitor your GPU clocks at idle and while browsing the web. Changing the cable isn't going to help; some cards just whine more than others, and some power supplies will just add to the issue.

My vapor-x has some pretty bad coil whine in games, swapping psus currently to a 750w g2 in a last ditch effort, coil whine drives me insane. I either have terrible luck or I'm putting another breaker in and running dedicated wire for a new outlet just for my pc.
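On the monitoring suggestion: GPU-Z can write its sensor readings to a log file, and a few lines of Python can flag idle-clock spikes automatically. This is a sketch only; the column names and sample rows below are made up for illustration, and real GPU-Z logs vary by version:

```python
import csv
import io

# Fabricated sample in the rough shape of a GPU-Z sensor log (CSV).
LOG = """Date,GPU Core Clock [MHz],GPU Load [%]
2014-07-20 12:00:01,300,1
2014-07-20 12:00:02,727,2
2014-07-20 12:00:03,300,0
"""

IDLE_CLOCK = 300  # typical 2D clock for a 290/290X


def idle_spikes(log_text):
    """Rows where the core clock left the 2D state despite near-zero load."""
    rows = csv.DictReader(io.StringIO(log_text))
    return [r for r in rows
            if int(r["GPU Load [%]"]) < 5
            and int(r["GPU Core Clock [MHz]"]) > IDLE_CLOCK]


for row in idle_spikes(LOG):
    print(row["Date"], row["GPU Core Clock [MHz]"])
```

If spikes like these line up with the whine, it's the clock/voltage transitions rather than the cable.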


----------



## HoneyBadger84

Have any of you not heard a 290X Reference Card's fan, or never run it at full bore? Wonder why I call them Leaf Blowers? Imagine 3-4 of these together (note: this video is kinda louder cuz it's not secured to a case's PCI slot bracket, which does dampen noise just a tad, as does a case):




:-D This is why I find it comical to hear people's reactions the first time they turn their card up to 100% or hear one doing it. Much noise. Such leaf blower... but look at the air movement! O_O


----------



## heroxoot

Yea but who buys ref coolers unless they are going to go liquid?


----------



## Sgt Bilko

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Have any of you not heard a 290X Reference Card's fan, or never run it at full bore? Wonder why I call them Leaf Blowers? Imagine 3-4 of these together (note: this video is kinda louder cuz it's not secured to a case's PCI slot bracket, which does dampen noise just a tad, as does a case):
> 
> 
> 
> 
> :-D This is why I find it comical to hear people's reactions the first time they turn their card up to 100% or hear one doing it. Much noise. Such leaf blower... but look at the air movement! O_O


Did you ever own a Ref HD 6970?

It's actually louder


----------



## HoneyBadger84

Quote:


> Originally Posted by *heroxoot*
> 
> Yea but who buys ref coolers unless they are going to go liquid?


Uuuuuuh *points to self*


----------



## HoneyBadger84

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Did you ever own a Ref HD 6970?
> 
> It's actually louder


Owned 2 6990s, TBH they're about the same in terms of loudness. But of course, having 3-4 290Xs means they're louder cuz there's more of them. lol


----------



## Kittencake

Quote:


> Originally Posted by *heroxoot*
> 
> Yea but who buys ref coolers unless they are going to go liquid?


ME ... least till the corsair hg10 comes out XD


----------



## heroxoot

That being said you have plans to move to liquid while I do not, and thus bought a 2nd party card.


----------



## Kittencake

yes but i also find blower cards sexy ..


----------



## HoneyBadger84

Quote:


> Originally Posted by *heroxoot*
> 
> That being said you have plans to move to liquid while I do not, and thus bought a 2nd party card.


Not really... I don't plan to have these cards more than 6-12 months, which is why I got them as cheap as possible to minimize loss on resale value. I won't go liquid on video cards because I swap them out far too often to mess with that crap. lol


----------



## heroxoot

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> That being said you have plans to move to liquid while I do not, and thus bought a 2nd party card.
> 
> 
> 
> Not really... I don't plan to have these cards more than 6-12 months, which is why I got them as cheap as possible to minimize loss on resale value. I won't go liquid on video cards because I swap them out far too often to mess with that crap. lol

I sort of envy your ability to change cards every generation. I wouldn't even have this 290X if MSI didn't screw up so badly. Ahh, the life of a poor college student. It's a rough yet promising one.


----------



## sugarhell

Quote:


> Originally Posted by *Kittencake*
> 
> yes but i also find blower cards sexy ..


Quadfire ref 290x is way too hot


----------



## HoneyBadger84

Quote:


> Originally Posted by *heroxoot*
> 
> I sort of envy your ability to change cards every generation. I wouldn't even have this 290X if MSI didn't screw up so badly. Ahh, the life of a poor college student. It's a rough yet promising one.


Well, it's mostly cuz I do this now, I buy used, so the price of updating isn't that bad counting in selling the old cards. I don't exactly make "bank" or anything, I'm in the poor bracket far as most states are concerned, lol I'm just above the poverty line here in NM.

Also I only upgrade if it's worth it to upgrade, which from 1 7970 that I still had to multiple 290Xs for fairly cheap, it was worth.
Quote:


> Originally Posted by *sugarhell*
> 
> Quadfire ref 290x is way too hot


If you're not willing to run your fans on blast-mode, yeah, it gets hot as heck, period. If you can stand 70% fan speed or higher, it actually doesn't get too bad, but it is a space heater; if you don't have additional cooling in your room, even a crossfire of 290Xs with Reference Cards can be quite the heat source. Granted, mine run cooler cuz they're spread out (I have my TriFire spread out right now so it's in a 1-4-7 configuration on my board, one space between each card). Stacked up, you need REALLY good airflow RIGHT at the cards' fan intakes or they'll get hot pretty fast.


----------



## HoneyBadger84

Question for y'all, one of the folks that got one of my two XFX R9 290X Core Editions is having the following issues:

Black screen on Skyrim startup, and also crashes on desktop.

I've recommended he try different drivers, check temperatures, and the like, but it seems like none of that is helping. The cards both worked perfectly fine for me for several weeks after I got them, so it can't possibly be the card (no way it was damaged in shipping with the way I packed them). What would you recommend I suggest he try next?


----------



## HoneyBadger84

Meh, it's sounding like his issue is overheating due to lack of airflow now. He said the GPU was getting up in the 90C+ range & black screening, but Skyrim specifically black screens on startup. Does anyone know how to fix that issue so I can just troubleshoot the temperature problems with him?

I'm picturing this guy has a case with very bad airflow, as he's saying even with the fan "ramped up" it's getting up to 89C before freezing. That's insane to me since I was running that very card and only seeing low-to-mid 60s, 71C max, and that wasn't even running the fan full bore (at 100% it never saw over 65C in anything).


----------



## yawa

Just figured I'd share some results since switching Overclocking and offer a challenge to anyone here to top me.

I currently have the best single 290X score in Heaven 4.0, as well as the best single 290X scores (with "Tess on") in Firestrike Extreme and 3DMark 11 Extreme.

Someone has gotta be able to beat me here (all of these were benched at 1283/1479, +300/400mV if I remember correctly), as I'm personally dying to see what a 1300MHz+ GPU core can do in these benches.

Remember guys, real men bench with Tess on!


----------



## Euda

Hey guys, I recently scanned my R9 290X for errors via OCCT and noticed that even at stock clocks, in fact independent of clock rates (meaning even at 600 MHz core and 900 MHz VRAM), my 290X produces a hell of a lot of errors in a short time. Within the first seconds of testing @ shader complexity x7, the error counter rises by 4-5 errors per ~5-10 seconds. After ~20 minutes of benchmarking, it showed a total of 563 errors. Like I said, these errors occur independent of both VRAM & GPU clock rates, whereas when I raise the voltage by 100mV, they disappear.
Does that mean that my card is probably defective? I've never seen any artifacts at stock clocks, not once since October '14 when I first installed the card.
Hoping to receive some answers, thanks in advance.
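As a quick sanity check, the rate and the total reported above are at least self-consistent; taking the midpoints of the quoted ranges:

```python
# Midpoints of the quoted figures: "4-5 errors per ~5-10 seconds", "~20 minutes"
errors_per_burst = (4 + 5) / 2
burst_seconds = (5 + 10) / 2
run_seconds = 20 * 60

projected = run_seconds / burst_seconds * errors_per_burst
print(round(projected))  # ~720, the same order of magnitude as the 563 observed
```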


----------



## sugarhell

Are you serious? We had 1350 mhz 290x here...


----------



## yawa

Quote:


> Originally Posted by *sugarhell*
> 
> Are you serious? We had 1350 mhz 290x here...


I'll have to dig. Suffice to say, he never posted in the benching threads! I wanna see what he scored!


----------



## sugarhell

Quote:


> Originally Posted by *yawa*
> 
> I'll have to dig. Suffice to say, he never posted in the benching threads! I wanna see what he scored!


http://www.3dmark.com/fs/1881364


----------



## yawa

Quote:


> Originally Posted by *sugarhell*
> 
> http://www.3dmark.com/fs/1881364


Awesome. Thank you. We really should all go mobilize into those threads. Seems no one with high clocks is representing in them right now.

I was shocked at my ranking to say the least knowing people have far better cards than me.
Quote:


> Originally Posted by *Euda*
> 
> Hey guys, I recently scanned my R9 290X for errors via OCCT and noticed that even at stock clocks, in fact independent of clock rates (meaning even at 600 MHz core and 900 MHz VRAM), my 290X produces a hell of a lot of errors in a short time. Within the first seconds of testing @ shader complexity x7, the error counter rises by 4-5 errors per ~5-10 seconds. After ~20 minutes of benchmarking, it showed a total of 563 errors. Like I said, these errors occur independent of both VRAM & GPU clock rates, whereas when I raise the voltage by 100mV, they disappear.
> Does that mean that my card is probably defective? I've never seen any artifacts at stock clocks, not once since October '14 when I first installed the card.
> Hoping to receive some answers, thanks in advance.


I don't think I've ever run OCCT at any clock (CPU-GPU) and not gotten errors. Knowing my CPU is both 24 hour Prime(small FFT's)/IBT stable and my card has zero issues/artifacts at stock and day to day Overclock, I would tend to think maybe the issue is with OCCT not liking our setups. I would try something else personally before even thinking it was defective.

I mean, have you had any issues outside of it? Black screens? Artifacts? Crashes?


----------



## HoneyBadger84

Quote:


> Originally Posted by *yawa*
> 
> Awesome. Thank you. We really should all go mobilize into those threads. Seems no one with high clocks is representing in them right now.
> 
> I was shocked at my ranking to say the least knowing people have far better cards than me.


I would try to beat you, but I'm on air, and I ain't crazy like tsm & some others runnin' 1.35V+ on Air. lol


----------



## miraldo

Quote:


> Originally Posted by *Enilder*
> 
> I need help with BSOD problem with R9 290 Tri-X OC
> 
> I generally have about 10+ tabs open under Firefox and while watching youtube, stream, etc., it just crashes. My CPU is overclocked and stress-tested with Prime95 over a day or so. I had the system running for quite some time before I installed the GPU so I am quite confident that it's my GPU that's causing this crash.
> 
> I have 14.1 Catalyst and wondering if there's a better/more stable version of the driver I should look into. Does anyone have this problem with this Catalyst version and found a solution to it? Thanks.


I have the same problem.

I get random crashes when I'm on the internet.

Error is: STOP: 0xA0000001 and atikmdag.sys+0x28ECE.

- I tried a fresh, clean installation of Windows 7 x64
- tried many different AMD drivers
- I even turned off hardware acceleration, but I still can't get rid of the error


----------



## HoneyBadger84

So update on the guy's issue: he's tried different drivers etc, but he just stated he still had Intel's display drivers installed (Z87-gd65 motherboard from MSI), so I advised him to uninstall those (he's using DDU so that's good at least) and report back with a fresh install of the 14.4 drivers... Hope that fixes it cuz I don't wanna haveta be the one to tell him that CoolMax 1000W ZU PSU may well be the problem if he rules temps & such out.


----------



## yawa

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I would try to beat you, but I'm on air, and I ain't crazy like tsm & some others runnin' 1.35V+ on Air. lol


Well you should still give what you're comfortable with a shot. My point certainly isn't to brag about it, it's more to express my shock that no one who can do 1300Mhz+ seems to be benching their 290X's in those threads.

Basically I'm saying, knowing what I know now about what clocks people have achieved in here, I certainly shouldn't be the one to currently own the ranking spots I do.


----------



## HoneyBadger84

Quote:


> Originally Posted by *yawa*
> 
> Well you should still give what you're comfortable with a shot. My point certainly isn't to brag about it, it's more to express my shock that no one who can do 1300Mhz+ seems to be benching their 290X's in those threads.
> 
> Basically I'm saying, I shouldn't own those ranking spots I currently do.


Max I've pushed is 1150MHz core, and I'm mostly on Reference/Core Edition cards. I haven't posted my most up to date scores on the leaderboards yet myself, most of my submissions are at stock or very mild OCs, before I found out Trixx could give me more voltage (I posted for the leaderboards when I first got 290Xs and I ran a Gigabyte WindForce & Sapphire Tri-X OC card in crossfire, ran'em both at like 1150/1550 & saw a good amount of improvement in scores).

I think I could push farther, but can't be bothered to benchmark as of late, busy with other things... I'm wondering what kinda voltages folks push for core clocks in the 1200MHz range, wanna try to run that myself, just to see if I can keep it cool on a single or multi-card configuration.


----------



## tsm106

Quote:


> Originally Posted by *sugarhell*
> 
> Are you serious? We had 1350 mhz 290x here...


Is it me or have ppl lost knowledge? It's like no one remembers anything anymore.


----------



## Enilder

Quote:


> Originally Posted by *Kittencake*
> 
> try 14.4 or the 14.6 beta
> 
> did you get the error number?


Nope. I just reinstalled 14.4. I am going to check and see if the problem comes back again.
Quote:


> Originally Posted by *BradleyW*
> 
> Turn off browser and adobe flash hardware acceleration features.


Turn off browser...? Don't use Firefox?







Is there a way to turn off adobe flash hardware acceleration feature? For example, I can turn it off in Youtube but do I need to manually do this for any other websites?
Quote:


> Originally Posted by *heroxoot*
> 
> Disable all accelerations, disable protected mode in flash. I'd also try 14.7 drivers. If not, 14.6. I have heard some complaining of similar problems on 14.7 but not 14.6.


Since those are beta (I think), I decided to try 14.4 first. Is there a way to disable all accelerations? Not sure how to...
Quote:


> Originally Posted by *miraldo*
> 
> I have the same problem.
> 
> I get random crashes when I'm on the internet.
> 
> Error is: STOP: 0xA0000001 and atikmdag.sys+0x28ECE.
> 
> - I tried a fresh, clean installation of Windows 7 x64
> - tried many different AMD drivers
> - I even turned off hardware acceleration, but I still can't get rid of the error


Hm... I have Windows 8.1. I am going to try 14.4 to see how it goes. If I don't come back here, you know my problem is gone! (Crossing fingers)


----------



## heroxoot

Disable acceleration in your browser in its settings. Do the same for Flash by right clicking on a video and going to settings for flash.
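For the Flash protected-mode part, there is also a system-wide switch: Flash Player reads an mms.cfg file at startup. A sketch, assuming the usual path for the 32-bit plugin on 64-bit Windows (C:\Windows\SysWOW64\Macromed\Flash\mms.cfg; on 32-bit Windows it lives under System32 instead):

```
ProtectedMode = 0
```

Hardware acceleration itself still gets toggled through the right-click settings dialog (or the browser's own settings), as described above.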


----------



## Beezleybuzz

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I would try to beat you, but I'm on air, and I ain't crazy like tsm & some others runnin' 1.35V+ on Air. lol


Well, according to GPU Tweak my voltage is set to 1.337v (not on purpose I swear!) yet with GPU-Z running it shows my voltage actually in the 1.2~ range. I've got the DCUII btw.


----------



## ZealotKi11er

I have no problems with the card clocking up when on YouTube. I think it's mostly browser related. I use old Opera.


----------



## heroxoot

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I have no problems with the card clocking up when on YouTube. I think it's mostly browser related. I use old Opera.


Thing is I personally don't want it to clock up. I disabled all hardware accel and it still clocks up. It didn't do that on 14.4. Clock stayed 300 for flash.


----------



## yawa

Quote:


> Originally Posted by *tsm106*
> 
> Is it me or have ppl lost knowledge? It's like no one remembers anything anymore.


Well keep in mind, I'm going by what I saw in the benchmarking threads in the other forum, not by what was posted in here.

My point was, I'm surprised at the lack of high clocking 290X's over there, and even more surprised at how high my pedestrian 1284Mhz 290X finished. That's all.

I get some people don't care to participate over there, I just figured at least some of the 1300+MHz 290X owning peeps would.

That's all.


----------



## pdasterly

Do you have errors running at that speed?
Quote:


> Originally Posted by *yawa*
> 
> Well keep in mind, I'm going by what I saw in the benchmarking threads in the other forum, not by what was posted in here.
> 
> My point was, I'm surprised at the lack of high clocking 290X's over there, and even more surprised at how high my pedestrian 1284Mhz 290X finished. That's all.
> 
> I get some people don't care to participate over there, I just figured at least some of the 1300+MHz 290X owning peeps would.
> 
> That's all.


----------



## fateswarm

Quote:


> Originally Posted by *yawa*
> 
> I was shocked at my ranking to say the least knowing people have far better cards than me.


Do you have to pay to run the extreme benchmark?


----------



## VSG

Quote:


> Originally Posted by *fateswarm*
> 
> Do you have to pay to run the extreme benchmark?


If you are referring to Firestrike Extreme, then yes, you need a full 3DMark license. It goes on a big discount on Steam during the big Steam sales.


----------



## Aussiejuggalo

Any reason why Minecraft hates AMD? My 670 used to get 200+ fps average, but with my 290 I get around 120ish

Really annoying


----------



## fateswarm

Quote:


> Originally Posted by *geggeg*
> 
> If you are referring to Firestrike Extreme, then yes, you need a full 3DMark license. It goes on a big discount on Steam during the big Steam sales.


Pass. I wonder if it's the reason he was surprised others don't beat his benchmark results.


----------



## Sgt Bilko

Quote:


> Originally Posted by *fateswarm*
> 
> Pass. I wonder if it's the reason he was surprised others don't beat his benchmark results.


There are actually a lot of people that have the full version.

I picked it up for $2.50 during a Steam sale about 8 months ago.


----------



## Dasboogieman

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Any reason why Minecraft hates AMD? My 670 used to get 200+ fps average, but with my 290 I get around 120ish
> 
> Really annoying


Gimped OpenGL performance, I'm guessing. NVIDIA did have an advantage when it comes to this, though they tended to cripple it in favour of the Quadros. That being said, AMD may have also been a little too aggressive in gimping their own OGL implementation for Hawaii.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Dasboogieman*
> 
> Gimped OpenGL performance, I'm guessing. NVIDIA did have an advantage when it comes to this, though they tended to cripple it in favour of the Quadros. That being said, AMD may have also been a little too aggressive in gimping their own OGL implementation for Hawaii.


Ah ok, thanks


----------



## yawa

Quote:


> Originally Posted by *fateswarm*
> 
> Pass. I wonder if it's the reason he was surprised others don't beat his benchmark results.


Haha it probably is.

Though seriously I bought my Firestrike for like $4.99 on a steam sale, one day. 3D Mark 11 as well.

Also, in regards to other more prominent (i.e. "Free") benches like Heaven and Valley, I still don't see many people posting regular results with 290X's either. Especially in regards to high clocking 290X's.
Quote:


> Originally Posted by *pdasterly*
> 
> Do you have errors running at that speed?


At 1295 I get artifacts like crazy, at 1287 minor artifacts once in a while, and 1283 is just fine. Takes a lot of voltage though. Luckily, my cooling is really good.

For reference, my day-to-day rock solid clock is 1220MHz/1450MHz at +200mV. Not exactly fantastic, hence why I wanna see people who can break 1300 post more benches.
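For anyone puzzling over the offsets in this thread: they add to the stock VDDC, so assuming a ~1.25 V stock voltage (an assumption; the real value varies per card and ASIC), the resulting core voltage is simple addition:

```python
STOCK_VDDC = 1.25  # assumed 290X stock core voltage; varies per card/ASIC


def effective_vddc(offset_mv):
    """Core voltage after a software voltage offset given in millivolts."""
    return STOCK_VDDC + offset_mv / 1000.0


print(round(effective_vddc(200), 3))  # day-to-day +200 mV
print(round(effective_vddc(300), 3))  # bench-only +300 mV
```

Under that assumed baseline, +300 mV lands around 1.55 V, which is why the bench clocks above are water-cooling territory.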


----------



## pdasterly

Quote:


> Originally Posted by *yawa*
> 
> Haha it probably is.
> 
> Though seriously I bought my Firestrike for like $4.99 on a steam sale, one day. 3D Mark 11 as well.
> 
> Also, in regards to other more prominent (i.e. "Free") benches like Heaven and Valley, I still don't see many people posting regular results with 290X's either. Especially in regards to high clocking 290X's.
> 1295 I get artifacts like crazy. 1287 minor artifacts once in awhile. 1283 just fine. Takes a lot of voltage though. Luckily, my cooling is really good.


No, not artifacts, but errors like OCCT detects?


----------



## yawa

Quote:


> Originally Posted by *pdasterly*
> 
> no not artifacts but errors like occt detects?


Nope. Doesn't crash on me in anything if that's what you mean. I don't use OCCT though. I had issues with it on my 7850K and GTX670 at stock clocks, so I tend not to trust it any more. For GPU's (Only because I'm watercooled) I'll run Furmark briefly to check for artifacts, and obviously benches and games for stability.

Since I got it though, I've only hard crashed once, and that was only because Trixx (which I used before Afterburner) set itself to the highest possible overclock without bumping the voltage after the reboot.


----------



## pdasterly

Quote:


> Originally Posted by *yawa*
> 
> Nope.


1125 GPU / 1600 mem, 81 VDDC, 50 power limit before errors in OCCT, 290X.
OCCT detects silent errors.


----------



## yawa

Quote:


> Originally Posted by *pdasterly*
> 
> 1125 gpu/1600 mem 81 vddc 50 power limit before errors in occt, 290X


Not that I put much stock in OCCT because of reasons stated above, but what are you clocking with?

I had huge problems with Overdrive and Trixx after flashing to the ASUS bios (which they say is the best voltage combination for clocks), especially in regards to throttling.

After switching to Afterburner all my clocks got way more stable, and the potential shot up. Before, with Trixx at +200mV, I would soft crash at 1245/1475 AND throttle down like crazy even though my temps were absolutely fine; with Afterburner I could clock all the way up to 1287/1479 at +300mV, and while the voltage would jump all over the place, it never decreased performance.

So my advice is, whatever you're using, try something else.


----------



## Beezleybuzz

Quote:


> Originally Posted by *Sgt Bilko*
> 
> There are actually alot of people that have the full version.
> 
> i picked it up for $2.50 during a steam sale about 8 months ago.


Yeah, I've got 3dmark and 3dmark11. Got both for like 3-5 bucks a piece. Worth it in my opinion when changing hardware and being able to bench and test OC's.


----------



## pdasterly

using trixx only, what do you suggest


----------



## yawa

I don't know if your situation is similar to mine, but I'd give Afterburner a shot just for the sake of eliminating all possible reasons.

Like I said, even at the same clocks and voltages, everything got wayyyyyy more stable and performance went up quite a bit after I switched. Maybe it will help you.

Worth a shot I think.


----------



## pdasterly

I tried AB, but I think the max VDDC allowed was +100 even though I'm below that in Trixx.
My system is stable. I'll give it another try.


----------



## pdasterly

AB made no difference; anything above 1125 GPU and I get errors in OCCT.


----------



## fateswarm

Didn't someone say at 1135 there is an internal threshold that things start to happen?


----------



## Kittencake

I'm almost tempted to start ocing my 290X


----------



## Jflisk

Quote:


> Originally Posted by *fateswarm*
> 
> Didn't someone say at 1135 there is an internal threshold that things start to happen?


Actually documented in many places: 1125 before weird things start happening, but to each their own. Add the voltage and wonder why their cards start smoking.

----------



## fateswarm

Quote:


> Originally Posted by *Jflisk*
> 
> Actually documented in many places: 1125 before weird things start happening, but to each their own. Add the voltage and wonder why their cards start smoking.


Could be 1125. Is there a good source about it?


----------



## Jflisk

Quote:


> Originally Posted by *fateswarm*
> 
> Could be 1125. Is there a good source about it?


There are sites that have overclocked the cards. Just put "overclocked R9 290X reviews" into any search engine; it should give you a couple of returns. The one on my XFX's said when they hit 1125/1400 they started having problems. Thanks


----------



## rv8000

Quote:


> Originally Posted by *Kittencake*
> 
> I'm almost tempted to start ocing my 290X


You haven't even fiddled with it? I bought my reference 290, and the first thing I did was see how far it could go


----------



## tsm106

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fateswarm*
> 
> Didn't someone say at 1135 there is an internal threshold that things start to happen?
> 
> 
> 
> Actually documented in many places: 1125 before weird things start happening, but to each their own. Add the voltage and wonder why their cards start smoking.

If you or the review you're reading don't know what to do or how to do it, that doesn't make your experience, or theirs, the truth.


----------



## kizwan

OMG! The latest info in this thread is mind blowing. What is going on in this thread?


----------



## Kittencake

Quote:


> Originally Posted by *rv8000*
> 
> You haven't even fiddled with it? I bought my reference 290, first thing I did was see how far It could go


No, not yet... it's been a hot summer, been a little afraid to do it


----------



## tsm106

Quote:


> Originally Posted by *kizwan*
> 
> OMG! The latest info in this thread is mind blowing. What is going on in this thread?


That's what I keep asking too?


----------



## ZealotKi11er

Quote:


> Originally Posted by *tsm106*
> 
> That's what I keep asking too?


Same here. What are some people smoking here lately. They need to delete these posts. Making confused people more confused.


----------



## Descadent

I just switched to AMD after 10+ years of being loyal to NVIDIA. Switched from 2x 780 Tis to 2x 290X Vapor-X. The 3GB limit on the 780 Ti was brutal at 7680x1440, and already with the 2x 290Xs I have seen up to a 20fps increase in some games... minimum fps improved and NO CHOKING!!!! That bigger bus and larger RAM realllly helps. It pissed me off that I had to turn down settings to get under 3GB with the 780 Tis after spending that much money on cards!

just some pictures and some comparison pictures... 290x vapor-x sure is a beauty.


----------



## pdasterly

http://www.techpowerup.com/mobile/reviews/MSI/R9_290X_Lightning/27.html


----------



## tsm106

Those Vapors really are quite attractive looks wise. That blue reminds me of a world rally blue.


----------



## HoneyBadger84

Quote:


> Originally Posted by *yawa*
> 
> Haha it probably is.
> 
> Though seriously I bought my Firestrike for like $4.99 on a steam sale, one day. 3D Mark 11 as well.
> 
> Also, in regards to other more prominent (i.e. "Free") benches like Heaven and Valley, I still don't see many people posting regular results with 290X's either. Especially in regards to high clocking 290X's.
> 1295 I get artifacts like crazy. 1287 minor artifacts once in awhile. 1283 just fine. Takes a lot of voltage though. Luckily, my cooling is really good.
> 
> For reference, my Day to Day rock solid clock is 1220Mhz/1450Mhz at +200mv. Not exactly fantastic, hence why I wanna see people who can break 1300 post more benches.


You are on liquid, right? Full coverage or no? Just curious about your VRM temps. I'll have to try 1200/+200mV on one card, see if I can keep it cool on air. Doubt it, but who knows. I imagine since 1150 is bench-stable at +125/131mV that 1200 isn't a far reach, and I may not even need the full +200.


----------



## HoneyBadger84

Quote:


> Originally Posted by *kizwan*
> 
> OMG! The latest info in this thread is mind blowing. What is going on in this thread?


Quote:


> Originally Posted by *tsm106*
> 
> That's what I keep asking too?


Quote:


> Originally Posted by *ZealotKi11er*
> 
> Same here. What are some people smoking here lately. They need to delete these posts. Making confused people more confused.


Well it'd help if knowledgeable members would share said knowledge and help others learn instead of making posts that are, in all honesty, as bad as talking about someone badly behind their back, except everyone can see it


----------



## HoneyBadger84

Christmas in July mane!

Just landed 3 290s for $600 shipped, no mining done (used for a video project and supposedly in immaculate condition). Guess I'll be seeing whether they clock well enough to beat my 290Xs, then I'll resell whichever threesome performs worse... or whichever I can make more off of, more likely.


----------



## VladimirT

Hi. Is my card dead? If I put only my Lightning in the PC = black screen entirely. If I put 2 cards in = 1 card works fine, the second can't install the drivers and gives me code 43. How can I fix it? I can't RMA it.

[Attachment: dead.jpg]


----------



## fateswarm

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Same here. What are some people smoking here lately. They need to delete these posts. Making confused people more confused.


What do you mean? Is it about my question on the 1135 or 1125 thing? I still wonder by the way.


----------



## ZealotKi11er

Quote:


> Originally Posted by *fateswarm*
> 
> What do you mean? Is it about my question on the 1135 or 1125 thing? I still wonder by the way.


Yeah. There is no such thing as an 1125MHz threshold. The higher the clock the better. There is no limit. On stock air cooling 1150MHz was my limit because of cooling problems. On water all these limits are gone.


----------



## HoneyBadger84

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah. There is no such thing as an 1125MHz threshold. The higher the clock the better. There is no limit. On stock air cooling 1150MHz was my limit because of cooling problems. On water all these limits are gone.


You happen to remember the voltage you pushed for that speed being stable on stock air? I've run it with 125/131mV, but I didn't test it stable, just ran some benchmarks with it.


----------



## pdasterly

Did it have errors in occt?
Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah. There is no such thing as 1125Mhz. The higher the clock the better. There is no limit. In Stock air cooling 1150MHz was my limit because of cooling problems. In water all these limits are gone.


----------



## sugarhell

Quote:


> Originally Posted by *pdasterly*
> 
> Did it have errors in occt?


All GPUs produce errors. OCCT is a joke.


----------



## naved777

Is the 14.6 driver good?
I'm on 14.4 currently.


----------



## HoneyBadger84

Quote:


> Originally Posted by *sugarhell*
> 
> All the gpus produce errors. Occt is a joke


Agreed. I stopped trusting OCCT when its PSU test stopped being able to load my system fully, and that was a looooooooooong time ago. Not to mention it broke when hexa-cores came out, not being able to read/use them properly for LinPack stability testing.


----------



## pdasterly

Mine doesn't produce errors under 1125MHz; faster than that, errors start.


----------



## sugarhell

Quote:


> Originally Posted by *pdasterly*
> 
> Mine doesn't produce errors under 1125, faster than that errors start


What don't you understand? By default GPUs produce errors. It's a lie that OCCT can't find any errors.

Memtest your GPU VRAM at stock.


----------



## PureBlackFire

My card produces errors in OCCT at stock.


----------



## pdasterly

What program?


----------



## ZealotKi11er

I could go 1150MHz with stock volts.


----------



## heroxoot

Quote:


> Originally Posted by *PureBlackFire*
> 
> my card produces errors in occt at stock.


Could be something, could be nothing. If you play games and don't see artifacts, you're fine. I hate programs like OCCT; I get false positives all around with it. Intel Burn Test for CPU, actual usage for GPU.

These are the only good ways to test stability IMO.


----------



## Descadent

Quote:


> Originally Posted by *naved777*
> 
> Is 14.6 driver good ?
> I am with 14.4 currently


14.7 is out


----------



## yawa

Quote:


> Originally Posted by *Descadent*
> 
> 14.7 is out


Ugh, I wanted to use 14.7 so badly (with it and my day-to-day OC I was getting a consistent 60FPS in Dark Souls 2 downsampled from 4K; normally I get 52-56FPS at the same settings with 14.4), but it just kept black screening on me like crazy. Each install got worse and worse, even at stock.

I tried three meticulous uninstalls and reinstalls, no dice.

So 14.4 for me. At least until 14.8 hits.


----------



## fateswarm

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah.


Then chill out with the aggression earlier. It was a question. Nobody told you it was a fact.


----------



## ZealotKi11er

Quote:


> Originally Posted by *fateswarm*
> 
> Then chill out with the aggression earlier. It was a question. Nobody told you it was a fact.


People have been taking them as facts. Not just you.


----------



## jumperfly

Guys, is there a way to pump more than 1.4V through my 290? Got a full-cover block on her and she also has her own radiator, so temps should be fine.


----------



## fateswarm

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Not just you.


I didn't at all, so not "just me" at all. If you get so annoyed by questions, why do you visit?


----------



## ZealotKi11er

Quote:


> Originally Posted by *jumperfly*
> 
> Guys is there a way to pump more then 1.4v through my 290? Got a full cover block on her and she also has her own radiator so temps should be fine.


I think you can, but once you go past 50°C on the core, extra voltage does not have much effect. 1.4V should get you a good OC.


----------



## 2tired

Is there anything I can do to optimize this game with an R9 290? Like, am I missing anything other than updating the drivers? I've been lagging a whole lot recently and wondering if I can do anything about it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *2tired*
> 
> is there anything I can do to optimize this game with an r9 290? like am i missing anything other than updateing the drivers? Ive been lagging a whole lot recently and wondering if I can do anything about it


What game?


----------



## the9quad

Quote:


> Originally Posted by *Sgt Bilko*
> 
> What game?


http://www.overclock.net/t/1375478/official-battlefield-4-information-discussion-thread/24720#post_22593940

lol he posted in that thread too so I guess BF4


----------



## Jflisk

Quote:


> Originally Posted by *Descadent*
> 
> i just switched to amd after 10+years of being loyal to nvidia. switched from 2x 780 ti's to 2x 290x vapor-x. The 3gb limit on the 780ti was brutal at 7680x1440 and already with the 2x 290x's i have seen up to 20fps increase in some games...minimum fps improved and NO CHOKING!!!! that bigger bus and larger ram realllly helps. It pissed me off that I had to turn down settings to get under 3gb with the 780 ti's after spending that much money on cards!
> 
> just some pictures and some comparison pictures... 290x vapor-x sure is a beauty.


Wow those are nice.


----------



## pdasterly

What does aux voltage (mV) control in Afterburner?


----------



## Sgt Bilko

Quote:


> Originally Posted by *pdasterly*
> 
> what does aux voltage(mV) control in afterburner


I think it's PCIe slot power, but I'm not 100% sure.


----------



## ZealotKi11er

Quote:


> Originally Posted by *pdasterly*
> 
> what does aux voltage(mV) control in afterburner


It shouldn't affect overclocking, so there is not much point changing the values. You could always try playing around with it.


----------



## heroxoot

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pdasterly*
> 
> what does aux voltage(mV) control in afterburner
> 
> 
> 
> I should net effect overclocking so there is not much point to change the values. You could always try to play around with it.

A good number of GPUs use AUX for PCIe voltage. It depends on the card, but it can in fact improve an overclock.


----------



## HoneyBadger84

Also, according to two different PSU gurus on these forums, the 290X already regularly draws at or above what it should max from the slot (75W is the max a card should pull from the slot), so trying to increase its draw through the slot further is a bit risky. If your motherboard doesn't have supplemental power plugs (specifically for the PCI-E slots) like mine does, you could be putting undue stress on your 24-pin connector, and that can lead to... well, fire.

Quote from my own PSU-questions thread below:
Quote:


> Originally Posted by *nleksan*
> 
> A random thought popped into my mind (as they are wont to do), regarding the power draw of the incredibly hungry 290 series...
> 
> (note: one reason behind selling the 290X Lightnings I had besides the lackluster performance, was that I found via a few of my Fluke DMM's how much they were pulling from the PCIe slot)
> 
> Basically, either by accident or not, stock and overclocked, my 290X LTG's were way out of the PCI-E spec for power draw from the board slots. Having a RIVE, the aux 6-pin power input provided a measure of relief as I was only running two, but it was one of the reasons why two was the most I ever ran.
> Considering that the 3930K @ 5GHz+ pulls upward of 220W on its own (and is still way more efficient than Piledriver), and while the board has 8+4-pin CPU PWR connectors for that reason, there's still a fair amount of power needed by everything else on the board or connected, such as sound cards or hardware RAID controllers (or onboard USB 3.0, SATA, etc.)... and they all look to the 24-pin to feed them.
> 
> Stock, each card was pulling over 100W from the slot, and that number grew faster than the cards' clock speeds, to around DOUBLE the maximum amount allowed per the PCI-SIG.
> The 6-pin does provide an extra 75W, but I worry that for 3+ cards it won't be quite enough, to the point that the 24-pin starts to be overdrawn.
> EVGA makes/made that little power adder thing that plugs into a spare slot and provides an extra dose of wattage, and I thought it silly for quite some time until I started running multiple heavily OC'ed cards and realized the amount of electricity required to do so.
> 
> I would strongly recommend looking into solutions like that. I recall someone on XS running a lot of them for benching (with 4-5 total PSUs!), but I can't remember if he made them himself... Regardless, I have seen homemade versions of the device somewhere; I apologize for not being able to recall where, but I would think it would be extremely worthwhile to look into.
> I have never before seen cards pull so much more than they should from the slot itself. I think the previous "worst" were the 580 Classy Ultras I had, which in 4-way SLI each maxed at around 85W, and that was enough for me to start using aux power beyond what was provided by the board as it came. Seeing the 290X's draw so much more than that, I just would truly be very sad should you lose a slot (or burn out connectors) and thus the board.
> 
> Anyway, just wanted to make sure that you stay safe!
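
Putting nleksan's numbers together, here's a quick sanity-check of the slot-power budget (just a sketch in Python; the ~100W measured slot draw, the 75W PCI-SIG slot limit, and the 75W supplemental 6-pin are the figures from the quote above, the rest is back-of-the-envelope):

```python
# Sketch: sanity-check motherboard slot-power budget for multi-GPU setups.
# Wattage figures come from the quoted post above; treat them as rough.

PCIE_SLOT_SPEC_W = 75        # max a card should draw from the slot (PCI-SIG)
MEASURED_SLOT_DRAW_W = 100   # per-card slot draw reported for a stock 290X
AUX_6PIN_W = 75              # extra budget from a board's supplemental 6-pin

def slot_power_deficit(num_cards, has_aux_6pin):
    """Watts drawn through the 24-pin beyond what the spec budget covers."""
    demand = num_cards * MEASURED_SLOT_DRAW_W
    budget = num_cards * PCIE_SLOT_SPEC_W + (AUX_6PIN_W if has_aux_6pin else 0)
    return max(0, demand - budget)

# Two cards with the aux 6-pin: 200 W demand vs 225 W budget -> covered.
print(slot_power_deficit(2, True))   # 0
# Four cards without it: 400 W demand vs 300 W budget -> 100 W overdraw.
print(slot_power_deficit(4, False))  # 100
```

Which lines up with why nleksan only ever ran two cards on that board.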


----------



## pdasterly

Just dumped AB in favor of TriXX; no difference in OC, except that TriXX is easier to use and supports more voltage.


----------



## HoneyBadger84

Landed a new (from Antec via RMA) HCP-850W for $99. Secondary PSU, check! That's $114 less than retail because the guy listed it weird; I grilled him on it in the questions and found out his original unit was dead, he RMA'd it, couldn't wait on that so bought another unit, then Antec sent him back a brand-new-in-box unit. ^_^

Getting the breakers that'll be dedicated to my computer and window AC unit in on Thursday. Happy times.

Also working a deal on 2 290Xs that will secure me my 4th card and give me one to resell, for cheap (under $600 for both), so the 3 290s thing is out the window, thanks to the casino being so kind as to give me $1k for my $300.


----------



## PCSarge

So, I just ran Fire Strike.



How about that score for a single card, with the CPU at 4.6GHz?


----------



## HoneyBadger84

Not bad. But not great either; this is from when I first got my first 290X, which was a while ago now:



Note specifically the graphics score, which is what's comparable since our CPUs are so different. I find it odd you've only got 300 pts on me there, with 1000MHz for mine (stock) vs 1175MHz for yours.

Is that a 290 or 290X?

Edit: that's a screenshot from the tablet I'm on, btw, taken from a run that's about 2 months old meow.


----------



## PCSarge

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Not bad. But not great either, this is from when I first got my first 290X, which was a while ago now:
> 
> 
> 
> Note specifically the graphics score which is what's comparable since our CPUs are so different. Find it odd you've only got 300pts on me there with 1000MHz for mine (stock) vs 1175MHz for yours.
> 
> Is that a 290 or 290X?
> 
> Edit: thats a screen shot from the tablet I'm on btw, taken from a run that's about 2 months old meow.


It's a 290. The core is maxed; even a 5MHz bump artifacts. The mem will do 1550, but framerate and score drop out.

Also don't forget, you have the i7 giving better scores on the physics portion.


----------



## HoneyBadger84

Quote:


> Originally Posted by *PCSarge*
> 
> its a 290. core is maxed. even a 5mhz bump artifacts. mem will do 1550 but framerate and score drops out.
> 
> also dont forget. you have the i7 giving better scores on the physics portion.


Yeah, that's why I said to look specifically at the graphics score. Given it's a 290, your performance is right where it should be, beating a stock 290X's graphics score with a decent OC on your card.









The closest thing I have to your OC (GPU clocks wise) on a single card is this run (again, the graphics score is what we can compare):


----------



## ace1ndahole

Can someone tell me how well the R9 290 scales in CrossFire? I currently have a Vapor-X and am thinking of maybe getting another 290 in the future.
But I want to know if it's worth it; otherwise I'll probably get a PS4 instead.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *ace1ndahole*
> 
> Can someone tell me how well does the R9 290 scale in crossfire? I currently have a Vapor X thinking of maybe getting another 290 in the future.
> But I want to know if it's worth it. otherwise I'll probably get a PS4 instead.


They scale real good esse


----------



## Dasboogieman

Quote:


> Originally Posted by *sugarhell*
> 
> All the gpus produce errors. Occt is a joke


That makes me nervous. I, for one, am not willing to pay for a processor that doesn't output computationally accurate results. I mean, Hawaii cards are used for general-purpose computation and applications like FAH.

Though I have realised everybody has different standards for stability. The absolute mathematically stable clock speed for Hawaii at a given voltage is actually quite a bit lower than most would like. That's why I like stress tests like FAH (not sure how sensitive the OCCT algorithm is), because they not only check that your GPU is stable but also that it is outputting mathematically accurate results.

Sadly, memtestCL is no longer a valid application; it either throws up false negatives or the soft errors get caught by the GDDR5 error detection/retry. I guess the best way to test memory now is to use a memory-sensitive test and see at what clock speed the VRAM performance tanks.
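
If anyone wants to try that "watch where performance tanks" approach, here's a rough sketch of the idea in Python. The logic is that GDDR5 error detection/retry masks bit errors but costs bandwidth, so effective throughput stops scaling past the last truly stable memory clock. `find_knee` and all the numbers are made up for illustration; you'd feed it readings from whatever memory-bound benchmark you trust.

```python
# Sketch: locate the memory clock where measured bandwidth stops scaling.
# Past the last stable clock, GDDR5 retries eat bandwidth, so throughput
# flattens or drops even though the clock keeps going up.

def find_knee(clocks_mhz, bandwidths_gbs, tolerance=0.02):
    """Return the last clock before measured bandwidth falls more than
    `tolerance` (fractional) below the best value seen so far."""
    best = 0.0
    last_good = clocks_mhz[0]
    for clk, bw in zip(clocks_mhz, bandwidths_gbs):
        if bw < best * (1 - tolerance):
            break  # throughput tanked: retries are likely masking errors
        best = max(best, bw)
        last_good = clk
    return last_good

# Example (made-up) sweep: throughput tanks past 1500 MHz.
clocks = [1250, 1300, 1350, 1400, 1450, 1500, 1550, 1600]
bw     = [280,  291,  302,  312,  322,  331,  305,  280]
print(find_knee(clocks, bw))  # 1500
```

The tolerance just keeps run-to-run noise from triggering a false knee.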


----------



## Imprezzion

I'm curious what my unlocked 290X can do in Fire Strike.
It's running at 1200MHz core and 1400MHz memory 24/7. Any higher on the core and the checkerboards appear; any higher on the memory and it randomly black screens under load.

My CPU should compare decently as well once I push bench clocks on it (5.29GHz with HT, on 1.612V).

I'll bench when I get home from work in a bit...


----------



## recitio

I found this chart earlier in this thread: http://www.overclock.net/content/type/61/id/1747928/width/500/height/1000/flags/LL

Is that an accurate depiction of VRM1 and VRM2 locations as reported in GPU-Z? Do we know where each sensor is, exactly?


----------



## heroxoot

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> All the gpus produce errors. Occt is a joke
> 
> 
> 
> That makes me nervous, I for one, am not willing to pay for a processor that doesn't output computationally accurate results. I mean, Hawaii cards are used for GP computation and applications like FAH.
> 
> Though, I have realised everybody has different standards for stability. The absolute mathematically stable clock speed for Hawaii at a given voltage is actually quite a bit lower than most would like. That's why I like stress tests like FAH (not sure how sensitive the occt algorithm is) because they not only check that your GPU is stable but also that it is outputting mathematically accurate results.
> 
> Sadly, memtestcl is no longer a valid application, it either throws up false negatives or the soft errors get through via the GDDR5 ECC, I guess the best way to test for memory now is to use a memory sensitive test to see at what clock speed the VRAM performance tanks.

Or just play some games? Seriously, there is no better answer. If your memory is messy, you will probably black screen.


----------



## fateswarm

Quote:


> Originally Posted by *recitio*
> 
> I found this chart earlier in this thread: http://www.overclock.net/content/type/61/id/1747928/width/500/height/1000/flags/LL
> 
> Is that an accurate depiction of VRM1 and VRM2 locations as reported in GPU-Z? Do we know where each sensor is, exactly?


It appears very accurate for some cards. I'm a bit uncertain what's going on with others, because they appear to have the GPU power phases to the right of the GPU as normal, but also one or two more phases on top of them (easy to see if the chokes are different, or if the number of phases in the area is odd), most probably for the VRAM supply. Then again, we may be seeing both that and an extra choke to the left of the GPU, but that one may mainly be filtering the 12V supply along with a capacitor.

In any case, it may not matter much beyond the GPU VRM, since that will likely be the one at the highest temps.

Unless some makers made the VRAM supply flimsy (on my card only the GPU VRM appears to run hot).

All cards may have a second sensor on the second VRM above the GPU VRM, but it's hard to tell from that pic.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Imprezzion*
> 
> I'm curious what my unlocked 290x can do at fire strike.
> It's running at 1200Mhz core and 1400Mhz memory 24/7. Any higher on the core and the checkerboards appear, any higher on the memory and it randomly black screens under load.
> 
> My CPU should compare decently as well when I push benchclocks on it. (5.29Ghz HT on 1.612v)
> 
> I'll bench when I get home from work in a bit..


That CPU voltage almost made me throw up a little, but y'know, I'm sensitive about such things. Check out the score I posted a few posts above you if you'd like a simple R9 290X lemons-to-lemons comparison; the graphics subscore specifically is what's directly comparable without the CPU causing too much variance.


----------



## kizwan

Quote:


> Originally Posted by *recitio*
> 
> I found this chart earlier in this thread: http://www.overclock.net/content/type/61/id/1747928/width/500/height/1000/flags/LL
> 
> Is that an accurate depiction of VRM1 and VRM2 locations as reported in GPU-Z? Do we know where each sensor is, exactly?


That is very accurate for reference cards & any cards that use the reference PCB. VRM1 on the right & VRM2 on the left (labelled *2* in the picture).


----------



## mojobear

How did they prove this in the thread? I'm curious, especially since miners have been able to run a 290(X) on an unpowered (no Molex) 1x PCIe extension (which only supplies 25W).

Also, forgot to mention: I ran 4 x 290 without the 6-pin aux power to the PCIe for about 2-3 weeks with no issues... at stock or mildly overclocked (+87mV).
Quote:


> Originally Posted by *HoneyBadger84*
> 
> Also, according to two different PSU gurus on these forums, the 290X already regularly draws at or above what it should max from the slot (75W is the max a card should pull from the slot), so trying to increase its draw through the slot further is a bit risky. If your motherboard doesn't have supplemental power plugs (specifically for the PCI-E slots) like mine does, you could be putting undue stress on your 24-pin connector, and that can lead to... well, fire.
> 
> Quote from my own PSU-questions thread below:


----------



## bond32

The supplemental 6-pin connection on the motherboard is not needed. It is only for extreme situations (like 4 Kingpin 780 Tis pulling over 450 watts each).

Also, it's been stated by various people here with electrical knowledge that the 290X pulls almost all its power from the 6-pin and 8-pin connections, which is different from what mojobear observed... I don't recall where, otherwise I would find it.


----------



## HoneyBadger84

Quote:


> Originally Posted by *mojobear*
> 
> How did they prove this on the thread? Im curious. Especially since miners have been able to run 290(x) with an unpowered (No molex) 1x PCIE extension (which only supplies 25W).
> 
> Also...forgot to mention I ran 4 x 290 without the 6 pin aux power to the PCIE for about 2-3 weeks with no issues...at stock or mildly overclocked (+87mv).


Mining doesn't draw as much power as gaming in most cases from what I understand (I have no idea, I don't mine); it just runs the cards hot because of the way the GPU is used, and the fact that miners refuse to turn up the fan in most cases.

The thing is, you'd have to constant-draw your system to put the 24-pin under stress, according to him, for it to really be risky. If you're only doing gaming, chances are yer not going to be running all 4 GPUs at 100% load for hours, get what I mean?

I'm just going off what nleksan informed me of, which was his own personal experience with 290Xs pulling more than 75W from the slot, via a setup he had that could actually read the draw from the slot itself directly. You'd have to ask him.
Quote:


> Originally Posted by *bond32*
> 
> The supplemental 6 pin connection on the motherboard is not needed. It is only for extreme situations (like 4 kingpin 780ti's pulling over 450 watts each).
> 
> Also, it's been stated by various people here with electrical knowledge that the 290x pulls almost all its power from the 6 pin and 8 pin connections which is different from what mojobear observed... Don't recall where otherwise I would find it.


Yeah, 4-way configurations when OC'ed, that's pretty much what I meant in terms of the 6-pin being REQUIRED to avoid potential 24-pin socket damage... I doubt it'll make a huge difference, but it might help with 3-way as well... I have no way of testing this myself, as I don't have any internal power-draw readout other than GPU-Z, which I know probably isn't that accurate.


----------



## ZealotKi11er

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Mining doesn't draw as much power as gaming in most cases from what I understand (I have no idea, I don't mine), it just runs them hot because of the way the GPU is used and the fact that they refuse to turn up the fan in most cases.
> 
> The thing is, you'd have to constant-draw your system to put the 24-pin under stress, according to him, to really be risky. If you're only doing gaming, chances are yer not going to be running all 4 GPUs at 100% load for hours, get what I mean?
> 
> I'm just going off what Nleksan informed me on, which was his own personal experience with 290Xs pulling more than 75W from the slot via a setup he had that could actually read the draw from the slot itself directly. You'd have to ask him.
> Yeah, 4-way configurations, when OCed, that's pretty much what I meant, in terms of the 6-pin being REQUIRED to avoid 24-pin socket damage potentially... I doubt it'll make a huge diff but it might help with 3-way as well... I have no way of testing the stuff myself as I don't have any internal power draw readout stuff other than the readouts on GPUz which I know aren't that accurate probably.


Mining pushed my 290X hard. It generates higher temps than any game. One reason is that the card is running at 99%, where most games aren't always that. VRM1 also gets pushed really hard.


----------



## HoneyBadger84

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Mining pushed my 290X hard. It generates higher temps than any game. One reason is that the card is running at 99%, where most games aren't always that. VRM1 also gets pushed really hard.


Just because it's at 99% load doesn't necessarily mean it's drawing a ton of power, though. It's like how some Folding@Home work units heat your CPU up more than others; it depends on the speed the work is being done at, based on how large whatever it's crunching is, in most cases. Fairly certain mining follows the same principle, i.e. there are different types, and some of them are much more temperature intensive than others.


----------



## ZealotKi11er

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Just because it's at 99% load doesn't necessarily mean it's drawing a ton of power, though. It's like how some Folding@Home work units heat your CPU up more than others; it depends on the speed the work is being done at, based on how large whatever it's crunching is, in most cases. Fairly certain mining follows the same principle, i.e. there are different types, and some of them are much more temperature intensive than others.


It is drawing a lot of power. More than any game I have played.


----------



## fateswarm

Chances are some kinds of mining make little use of the VRAM and heavy use of the GPU. Kinda like the most extreme Prime95 tests: they are that extreme because they use almost no RAM.


----------



## Jflisk

When mining, your card or cards are pushed to the max: wattage goes up, heat goes up, everything from the core to the VRAM. From the wall, using a Kill-A-Watt: 600W not mining, 860W+ while mining. Like running FurMark+, basically. Temps gaming: 54°C max; temps mining: 62°C max on the GPU. I did mine on these cards when it was worth it. Now I use an ASIC to mine, saving my cards for gaming. Thanks
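
For anyone curious what those wall numbers cost, quick back-of-the-envelope math (the 600W/860W figures are from the post above; the $/kWh rate is just an assumption, plug in your own):

```python
# Rough energy math from the wall readings above: 600 W not mining vs 860 W mining.
IDLE_W, MINING_W = 600, 860
RATE_USD_PER_KWH = 0.12                 # assumed electricity rate, adjust for your area

extra_kw = (MINING_W - IDLE_W) / 1000   # 0.26 kW extra draw while mining
extra_kwh_per_day = extra_kw * 24       # extra energy per day of 24/7 mining
extra_cost_per_month = extra_kwh_per_day * 30 * RATE_USD_PER_KWH

print(round(extra_kwh_per_day, 2))      # 6.24 kWh/day
print(round(extra_cost_per_month, 2))   # ~$22/month extra, before PSU losses
```

That's just the mining delta; the 600W baseline costs you either way.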


----------



## Decoman

I have noticed that my 290X card won't enter a power-saving state, and I am wondering if that is because the drivers are buggy, or if something in Windows broke.

Sometimes my TriXX application would start twice on logging into Windows, and I suspect something broke in Windows because of that.
Somehow I had two instances of TriXX in my Windows task scheduler.

The card had managed to enter the power-saving state by itself up until today. *shrugs*

I guess I'll try to uninstall TriXX, maybe clean up some folders, and try again. It wouldn't surprise me if there was something wrong with the latest Catalyst beta drivers I have installed.

Edit: Another day, and now the card enters a power-saving state, even though I fiddled with the core and memory clocks. I guess TriXX just messed things up when I accidentally ended up with two instances running. Didn't need to uninstall TriXX.

Edit 2: Later on, the card again won't enter power-saving mode. It seems one of my web browsers makes the card run at full clock speeds constantly (I have lots of tabs open). Mystery solved, I think.

Edit 3: Still later the same day: opening my other internet browser again, and this time the card is in a power-saving state, even though the very same tabs are loaded in the browser.


----------



## kayan

So, I have an odd question. I have three XFX 290Xs: two Elpida and one Hynix. I'm adding a second card to my loop today (just got everything in the mail). Should I stick in the second Elpida, since my first block is on an Elpida, or should I throw the Hynix into the loop, making it one of each? The (second) Elpida core OCs higher than the Hynix, with less voltage in AB.


----------



## PCSarge

Quote:


> Originally Posted by *kayan*
> 
> So, I have an odd question. I have 3 @ XFX 290x, 2 @ Elpida, and 1 @ Hynix. I'm adding a 2nd card to my loop today (just got everything in the mail. Should I stick in the 2nd Elpida, since my 1st block is on an Elpida, or should I throw the Hynix into the loop, making it one of each? The (2nd) Elpida core OCs higher than the Hynix, with less voltage in AB.


id put the matching card in


----------



## kayan

Quote:


> Originally Posted by *PCSarge*
> 
> id put the matching card in


That's what I was thinking, but thanks for clarifying


----------



## kizwan

I'd say put in the second Elpida card, because you found it overclocks higher than the Hynix card & with less voltage. Otherwise, mixing Elpida & Hynix cards will not cause any trouble. One of my cards is Elpida & the other is Hynix. No problems whatsoever, even when overclocking.


----------



## kayan

Ok, I have done so, but, I have a possibly stupid question:

I now have 2 290x with blocks on both, how the heck do I use this bridge? I mean, I know how, but what is the proper way to have the water coming into the first card, the bridge to the 2nd card, and water coming out of the 2nd card? In other words, I need help with flow and connector placement. I can't seem to find a good diagram. :/


----------



## ZealotKi11er

What bridge do you have?


----------



## kayan

http://www.frozencpu.com/products/19540/ex-blc-1439/XSPC_Razor_SLI_High_Flow_Bridge_4_Slot.html


----------



## VSG

If you see this picture for example:



Have the flow coming into one of the ports on the right of the first card; then the bridge takes it from the bottom left of the 1st card to the top left of the 2nd card, allowing you to exit from any of the ports on the right. Make sure to have all other ports plugged up.

In other news, guess what's now available for purchase: http://www.newegg.com/Product/Product.aspx?Item=N82E16814121883&cm_re=asus_matrix-_-14-121-883-_-Product


----------



## ZealotKi11er

OK, so since each block has 8 ports, what you want to do is close the top two ports on both cards. Of the two inner ports, connect the bridge to one and close the other, on both cards. After that, close both ports that are next to the bridge. You basically want to leave open the top port on the side opposite the bridge for the intake, and the bottom or side port opposite the bridge for the outlet.


----------



## Kittencake

Having a minor bout of issues with 14.7. I install it, then Windows refuses to finish booting: it gets past the start screen, the stock desktop loads, then nothing else... no taskbar, nothing >.< It's making for a very frustrated kitty, and apparently I can't remove CCC in Safe Mode.


----------



## kayan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Ok. so since each block has 8 ports what you want to do is close the top 2 ports for both cards. The 2 inner ports, 1 connect the bridge and the other close for both cards. After that close both ports that are next to the bridge. You basically want to leave open the top port opposite side from the bridge open for intake and the bottom or side opposite to the bridge open for out.


That picture is exactly what I needed, and a great explanation. Thanks. +rep for you.


----------



## sugarhell

Quote:


> Originally Posted by *Kittencake*
> 
> Having a minor bout of issues wiht 14.7 I install it then windows refuses to finnish booting it gets past the start screen the stock desktop loads then nothing else .. no task bar .. nothing >.< , its making a very fustrated kitty and aparently i can't remove ccc in safe mode


It's a registry bug. Use DDU in Safe Mode to clean out 14.7, then reinstall.


----------



## Kittencake

Yeah, but every time I do install 14.7, nothing loads, lol. I restored, so hopefully 14.4 will work... never had any issues with 14.6.


----------



## sugarhell

Quote:


> Originally Posted by *Kittencake*
> 
> yeah but every time i do install 14.7 nothing loads lol .. I restored so hopefully 14.4 will work .. never had any issues with 14.6


Just install 14.6. Same branch drivers


----------



## devilhead

Quote:


> Originally Posted by *Kittencake*
> 
> Having a minor bout of issues wiht 14.7 I install it then windows refuses to finnish booting it gets past the start screen the stock desktop loads then nothing else .. no task bar .. nothing >.< , its making a very fustrated kitty and aparently i can't remove ccc in safe mode


Had the same problem with 14.7, but after changing cards (7970 to 290X) I needed to reinstall the drivers anyway.


----------



## Kittencake

I just installed 14.6 and it still does it. Kitty is not amused.


----------



## sugarhell

Quote:


> Originally Posted by *Kittencake*
> 
> I just installed 14.6 and it still does it kitty is not amused


Remove the drivers. Remove MSI AB if you have it. Reseat the card. Install new drivers.


----------



## Kittencake

It's a fresh install of windows


----------



## sugarhell

Quote:


> Originally Posted by *Kittencake*
> 
> It's a fresh install of windows


Oh, you have an AMD CPU. Try reseating your card so you can reset your registry.

Also, did you let Windows Update finish? You need the latest .NET Framework for the drivers to install correctly.


----------



## pdasterly

Found a good deal on EK waterblocks; will any backplate work with them?
I have Heatkiller backplates already installed.


----------



## Kittencake

Yes I
Quote:


> Originally Posted by *sugarhell*
> 
> Oh you have amd cpu. Try to reseat your card so you can reset your registry
> 
> Also did you let windows update to finish? You need the latest framework.net to install correctly the drivers


did. I've gone from 14.7 to 14.6 to 14.4; short of the driver CD, I've tried everything.


----------



## Kittencake

Did some googling, unplugged the other DVI cable so I'm running on one monitor, and shabang... it works.


----------



## sugarhell

Quote:


> Originally Posted by *Kittencake*
> 
> Did some googling, unplugged the other DVI cable so I'm running on one monitor, and shabang... it works.


^^


----------



## Kittencake

Now if only I can remember how I made the font on everything bigger.

whooooo 1000th post


----------



## HOMECINEMA-PC

Set up a tri-monitor 7680x1440 display.


----------



## mojobear

Wait, isn't what you are saying a repeat of my post? I said I didn't need to use the 6-pin to run my quadfire system.








Quote:


> Originally Posted by *bond32*
> 
> The supplemental 6 pin connection on the motherboard is not needed. It is only for extreme situations (like 4 kingpin 780ti's pulling over 450 watts each).
> 
> Also, it's been stated by various people here with electrical knowledge that the 290x pulls almost all its power from the 6 pin and 8 pin connections which is different from what mojobear observed... Don't recall where otherwise I would find it.


----------






## VSG

Only if the screw holes match up. Even then you may have to get different length screws. But since you already have the backplates, go ahead and try them out!


----------



## chronicfx

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Just because it's at 99% load doesn't necessarily mean it's drawing a crapton of power though, like the fact that some [email protected] units heat your CPU up more than others, depends on the speed the work is being done at based on how large whatever it's crunching is, in most cases. Fairly certain Mining follows the same principle AKA there are different types etc & some of them are much more temperature intensive than others.


As a former Litecoin miner who now games on the same cards I mined with: mining draws more power. I used to pull a lot more from the wall on three stock 290Xs once they were heated up. Gaming is definitely lower. Go ahead and download a mining program and try it. Let us know the results.


----------



## HoneyBadger84

Got 2 more 290Xs for $550 shipped on the way, just gotta test both when they get here, see which performs more in line with the HIS cards & resell the other one... Secondary PSU should be here in a few days too, and I'll have the room wired with a 20A breaker specifically for the computer on Thursday. Everything is coming together nicely


----------



## Infinite Jest

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Got 2 more 290Xs for $550 shipped on the way, just gotta test both when they get here, see which performs more in line with the HIS cards & resell the other one... Secondary PSU should be here in a few days too, and I'll have the room wired with a 20A breaker specifically for the computer on Thursday. Everything is coming together nicely


You've been picking up some sweet deals!


----------



## PCSarge

speaking of more cards:

quadfire anyone?



Paid $200 apiece for them, just because my benching board looked so empty.

My friend wanted to be rid of them because he went back to Nvidia for some odd reason.


----------



## VSG

Great deal!


----------



## Kittencake

awesome deal .. I'm jelly


----------



## HoneyBadger84

Quote:


> Originally Posted by *Infinite Jest*
> 
> You've been picking up some sweet deals!


The guy I was going to get the 290Xs off of before backed out by raising the price too much; quite irritated about that one... and the 290s I just decided not to bother with.
Quote:


> Originally Posted by *PCSarge*
> 
> speaking of more cards:
> 
> quadfire anyone?
> 
> 
> 
> paid $200 a piece for them. just because my benching board looked so empty.
> 
> my friend wanted rid of them because he went back to nvidia for some odd reason.


290s or 290Xs? I've got a guy that's selling 290s at $600 for 3; not gonna do it now that I'll have 5 total 290Xs once I get the 4th & 5th in.


----------



## Ironsmack

Quote:


> Originally Posted by *PCSarge*
> 
> speaking of more cards:
> 
> quadfire anyone?
> 
> 
> 
> paid $200 a piece for them. just because my benching board looked so empty.
> 
> my friend wanted rid of them because he went back to nvidia for some odd reason.


That's a good deal. Wish i could find a deal like that.


----------



## PCSarge

Quote:


> Originally Posted by *HoneyBadger84*
> 
> The guy that I was going to get the 290Xs off of before backed out by raising the price too much, quite irritated about that one... and the 290s I just decided not to bother with.
> 290s or 290Xs? I got a guy that's selling 290s for $600 for 3, not gonna do it now that I've got 5 total 290Xs once I get the 4th & 5th in.


290Xs. The 290 in my sig rig is beastly though, so it's gonna stay in there. That, and I know I don't have enough rad for a 290X; only 360mm total.

Buddy took what he got out of me and went 780 Tis, so everyone is happy.


----------



## ZealotKi11er

Quote:


> Originally Posted by *PCSarge*
> 
> 290Xs. the 290 in my sig rig is beastly though so its gonna stay in there. that and i know i dont have enough rad for a 290x. only 360mm total.
> 
> buddy took what he got out of me and went 780Tis so everyone is happy.


What are you going to do with them? I have 2 x 290s now, and even @ 1440p they are overkill. Hopefully they'll be put to use with new games.


----------



## PCSarge

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What are you going to do with them? I have now 2 x 290s and even @ 1440p they are overkill. Hopefully they come to use with new games.


HWBot is going to get a nice kick in the nuts with those on overclocks; the only issue will be heat... which I have waterblocks to solve, à la the deal I made with my friend, and a 4.8GHz i7.

In other news: 5,888 posts, 225 rep. The rep system doesn't seem plausible.


----------



## Arizonian

Quote:


> Originally Posted by *PCSarge*
> 
> HWBot is going to get a nice kick in the nuts with those on overclocks the only issue will be heat....which i have waterblocks to solve ala the deal i made with my friend.. and a 4.8GHZ i7.
> 
> in other news. 5,888 posts. 225 rep. the rep system doesnt seem plausible


REP discussion here please

http://www.overclock.net/t/1479403/rep-change-suggestion

OT in here.


----------



## PCSarge

Quote:


> Originally Posted by *Arizonian*
> 
> REP discussion here please
> 
> http://www.overclock.net/t/1479403/rep-change-suggestion
> 
> OT in here.


Was just making fun of myself, man. Not looking to start discussions about it.


----------



## Arizonian

Discussion that followed regarding REP was removed. Indirectly it was not you, but it opened the door for response.

Sorry for the confusion.


----------



## PCSarge

Quote:


> Originally Posted by *Arizonian*
> 
> Discussion that followed regarding REP was removed. Indirectly it was not you, but it opened the door for response.
> 
> Sorry for the confusion.


It's all good.


----------



## ethanhunt

Finally got my Sapphire R9 290 Tri-X today.

GPU-Z Link - http://www.techpowerup.com/gpuz/kr5sr

Sapphire R9 290 Tri-X OC

Upgrading from MSi GTX760 TF OC.


----------



## fateswarm

Quote:


> Originally Posted by *ethanhunt*
> 
> Finally got my Sapphire R9 290 Tri-X today.
> 
> GPU-Z Link - http://www.techpowerup.com/gpuz/kr5sr
> 
> Sapphire R9 290 Tri-X OC
> 
> Upgrading from MSi GTX760 TF OC.


Gratz on getting the best card in the universe in terms of value/money on the high end.


----------



## PureBlackFire

Quote:


> Originally Posted by *fateswarm*
> 
> Gratz on getting the best card in the universe in terms of value/money on the high end.


+1. no bias, just facts.


----------



## Petet1990

Need help here... just got my 3 290s in today. When I finally set them up and go to load up BF4, it BSODs.


----------



## Ramzinho

Quote:


> Originally Posted by *Petet1990*
> 
> Need help here... just got my 3 290s in today. When I finally set them up and go to load up BF4, it BSODs.


may i ask what PSU you have?


----------



## PCSarge

If it's not over 1000W he's probably short on draw wattage, especially if overclocked.


----------



## Jflisk

Quote:


> Originally Posted by *Petet1990*
> 
> Need help here... just got my 3 290s in today. When I finally set them up and go to load up BF4, it BSODs.


You can have a look here, but if you're not up in the 1200W power supply range you may have problems. It looks like roughly 300W to 350W a pop. Remember, at 1000W you still need power for the CPU and other devices. You might want to pull one card and see if the other two work, process of elimination, then determine if it's all three together causing the problem.

http://www.techspot.com/review/736-amd-radeon-r9-290/page8.html
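For a rough sanity check, those per-card figures can be turned into a quick budget. A minimal sketch; the per-card draw is the midpoint of the ~300-350W estimate above, and the CPU-and-rest-of-system figure is an assumed ballpark, not a measurement:

```python
# Rough PSU budget for a tri-fire R9 290 setup.
# GPU_DRAW_W is the midpoint of the ~300-350 W per-card estimate;
# CPU_AND_REST_W is an assumed ballpark for CPU, board, drives, fans.

GPU_DRAW_W = 325
NUM_GPUS = 3
CPU_AND_REST_W = 250

total_draw = GPU_DRAW_W * NUM_GPUS + CPU_AND_REST_W  # estimated system draw

def headroom(psu_watts, draw):
    """Spare capacity as a fraction of the PSU rating (negative = overloaded)."""
    return (psu_watts - draw) / psu_watts

print(total_draw)                      # 1225
print(headroom(1000, total_draw) > 0)  # False: a 1000 W unit falls short
print(headroom(1300, total_draw) > 0)  # True: a 1300 W unit clears the estimate
```

Which is why the 1200W+ suggestion above is reasonable for three cards plus the rest of the system.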


----------



## Petet1990

I have the 1300 G2.


----------



## Jflisk

Quote:


> Originally Posted by *Petet1990*
> 
> i have the 1300g2


You should be golden then. If you want, you can try process of elimination to see if it's the cards, or which card; it's a good start. Before you get into all that: did you use a driver uninstall utility to totally remove all the drivers from the previous card and reinstall the drivers? Thanks

DDU (Display Driver Uninstaller)

http://www.wagnardmobile.com/DDU/


----------



## Petet1990

Quote:


> Originally Posted by *Jflisk*
> 
> You should be golden then. If you want, you can try process of elimination to see if it's the cards, or which card; it's a good start. Before you get into all that: did you use a driver uninstall utility to totally remove all the drivers from the previous card and reinstall the drivers? Thanks
> 
> DDU (Display Driver Uninstaller)
> 
> http://www.wagnardmobile.com/DDU/


Was thinking of a clean Windows install, but I could try that too.


----------



## Gobigorgohome

Quote:


> Originally Posted by *Petet1990*
> 
> i have the 1300g2


I also have the EVGA G2 1300W and it is good for 4x R9 290X (I can't run BF4 at ultra settings because then the whole thing shuts down); I get some weird message in the BIOS about power supply failure or something. I guess it just drew a little too much power.

I only have this "power usage problem" in BF4; I can run Sky Diver/Fire Strike Extreme and a lot of other heavy games and benchmarks just fine at max settings, but BF4 is a no-go for me. Have you tried other games, and are those games fine?


----------



## Petet1990

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I also have the EVGA G2 1300W and it is good for 4x R9 290X (I can't run BF4 at ultra settings because then the whole thing shuts down); I get some weird message in the BIOS about power supply failure or something. I guess it just drew a little too much power.
> 
> I only have this "power usage problem" in BF4; I can run Sky Diver/Fire Strike Extreme and a lot of other heavy games and benchmarks just fine at max settings, but BF4 is a no-go for me. Have you tried other games, and are those games fine?


I tried Crysis and same thing... I just did a fresh install of Windows, and as I'm installing the AMD drivers it BSODs again... I'm really at a loss for words now...


----------



## ducknukem86

Hey guys, I was thinking of building a PC with some of these cheap mining R9 290s in Crossfire. I heard MSI honors their warranty with just the serial number. I found a couple of MSI Gaming Editions, which is pretty cool since they are not reference. I was wondering if I could max everything out with a configuration like this, or if I would be dropping frames like crazy and getting stuttering. I ask this because I used to have a single R9 280X and ended up selling it because I wasn't happy with the performance (I'm a whore for eye candy). Games I really wanted to play, mostly Ubisoft, had terrible performance (looks like they don't like AMD, or Nvidia for that matter, lol). Would a setup like this be really good? What's your opinion of Crossfire?









I mostly game at 1080p and would like to reach 120 fps, or close to it, most of the time with a lot of eye candy (AA and stuff).


----------



## bluedevil

Yeah I know. Here's what I was doing: repositioning my waterblock on my GPU, I tightened it a little too hard. Noticed a little bend, backed off the bolts a bit. Then POST had blue lines; backed off the bolts a little more. Now no more blue in POST, but it crashes when I try to install the 14.4s.

Fresh install of Win 8.1

Any ideas?

Also boots into Safe mode fine


----------



## Jflisk

Quote:


> Originally Posted by *bluedevil*
> 
> Yeah I know. Here's what I was doing: repositioning my waterblock on my GPU, I tightened it a little too hard. Noticed a little bend, backed off the bolts a bit. Then POST had blue lines; backed off the bolts a little more. Now no more blue in POST, but it crashes when I try to install the 14.4s.
> 
> Fresh install of Win 8.1
> 
> Any ideas?
> 
> Also boots into Safe mode fine


Sounds like you may have accidentally messed up the board. I've done it before without realizing it, until I looked at the waterblock and it had a bend in it (long story short, the instructions had one pad thickness in one place and a different pad thickness in another part of the instructions for the same location, DOH). You can try pulling the block off and reseating it. No guarantees on that one. Usually drivers installing to a blue screen, or black screen in some situations, means real problems.


----------



## Blue Dragon

Quote:


> Originally Posted by *Petet1990*
> 
> I tried Crysis and same thing... I just did a fresh install of Windows, and as I'm installing the AMD drivers it BSODs again... I'm really at a loss for words now...


Don't know if it makes a difference on Intel, but on my AMD Asus board I had to install a single card, then shut down and add the other card before I could get it running in Crossfire. Also, you might want to uninstall any GPU OC program like Afterburner, etc...

Quote:


> Originally Posted by *ducknukem86*
> 
> Hey guys, i was thinking of building a pc with some of these cheap mining r9 290s in crossfire. I heard MSI honors their warranty just with the serial number. I found a couple of Msi Gaming Editions, which is pretty cool, since they are not reference. I was wondering if i could max everything out with a configuration like this. Or if i would be dropping frames like crazy and get stuttering. I ask this because i used to have a single r9 280x and ended up selling it cause i wasn't happy with the performance (i'm a whore for eyecandy). Games i really wanted to play, mostly Ubisoft had terrible performance (looks like they don't like amd or nvidia for that matter lol). Would a setup like these be really good? What's your opinion with Crossfire?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I mostly game 1080p and would like to reach 120 fps or close to it most of the times with a lot of eyecandy (AA and stuff).


IMO you'd be happier getting another 780 Classified for SLI rather than going for CFX R9 290s. I had lots of headaches with these cards... but I have heard of others that seem to do OK with theirs.


----------



## Dasboogieman

Quote:


> Originally Posted by *ducknukem86*
> 
> Hey guys, i was thinking of building a pc with some of these cheap mining r9 290s in crossfire. I heard MSI honors their warranty just with the serial number. I found a couple of Msi Gaming Editions, which is pretty cool, since they are not reference. I was wondering if i could max everything out with a configuration like this. Or if i would be dropping frames like crazy and get stuttering. I ask this because i used to have a single r9 280x and ended up selling it cause i wasn't happy with the performance (i'm a whore for eyecandy). Games i really wanted to play, mostly Ubisoft had terrible performance (looks like they don't like amd or nvidia for that matter lol). Would a setup like these be really good? What's your opinion with Crossfire?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I mostly game 1080p and would like to reach 120 fps or close to it most of the times with a lot of eyecandy (AA and stuff).


Your biggest issue with Crossfire isn't the stuttering; that issue seems to have pretty much been resolved. Your problem would be getting Crossfire enabled, since AMD has a smaller driver team to patch these things into games.
So when it works, the scaling is incredible and smooth.

UbiSoft titles have traditionally favoured NVIDIA due to the close partnership with the studio. NVIDIA cards will have an FPS edge (i.e. more likely to reach your 120fps target), plus access to special features like TXAA or enhanced god rays in AC4 with minimal lag.

That being said, the AMD 290 is much better equipped to handle heavy MSAA and SSAA usage. The average bleed in performance is quite small compared to Kepler. Basically NVIDIA leads through optimisation, AMD keeps up through brute hardware power.

I can't give you a truly impartial decision but I would advise crossfire 290s simply due to being more futureproof. Optimisations come and go, you don't want to hobble yourself in other areas just for the sake of UbiSoft.

Just be sure your case and PSU can handle the Crossfire 290s, i.e. powerful side extraction fans and at least a 1000W PSU.


----------



## bluedevil

Quote:


> Originally Posted by *Jflisk*
> 
> Sounds like you may have accidentally messed up the board. I've done it before without realizing it, until I looked at the waterblock and it had a bend in it (long story short, the instructions had one pad thickness in one place and a different pad thickness in another part of the instructions for the same location, DOH). You can try pulling the block off and reseating it. No guarantees on that one. Usually drivers installing to a blue screen, or black screen in some situations, means real problems.


Think baking it will revive it?


----------



## Arizonian

Quote:


> Originally Posted by *ethanhunt*
> 
> Finally got my Sapphire R9 290 Tri-X today.
> 
> GPU-Z Link - http://www.techpowerup.com/gpuz/kr5sr
> 
> Sapphire R9 290 Tri-X OC
> 
> Upgrading from MSi GTX760 TF OC.


Congrats - added









Quote:


> Originally Posted by *Petet1990*
> 
> i have the 1300g2


Hope you iron out your issues with your tri-fire. Afterward make sure to join us.

Tip: List your rig when you get a chance. When asking questions it helps other members help you to know your system. - *How to put your Rig in your Sig*


----------



## Jflisk

Quote:


> Originally Posted by *bluedevil*
> 
> Think baking it will revive it?


I've baked one before. I tried it once; it made it a little better but didn't fully fix it. No RMA was available for that card.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Petet1990*
> 
> I tried Crysis and same thing... I just did a fresh install of Windows, and as I'm installing the AMD drivers it BSODs again... I'm really at a loss for words now...


Did you test each card individually before setting them all up together? You should always do that with new hardware, IMO, to make sure you don't have one card that's gimped or doesn't perform the same as the others. I do this with all new hardware I get in; it's a bit time-consuming, but worth it:

Run the following benchmarks, score is not important as long as they pass with no artifacts (and all matching cards score about the same):

3DMark FireStrike on Normal & Extreme
3DMark11 on Performance & Extreme
Unigine Valley (one full benchmark loop)

Then at least 1 game benchmark (be that Metro 2033, Metro Last Light, whichever game you have with a built in benchmark, pretty sure Hitman Absolution has one)

Running all of those while watching for artifacts, and checking that the scores match or are close on each card, ensures that each card is working properly by itself, and that eliminates the cards as being the issue.
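That "scores should roughly match" step can be sketched like this; the card names and scores below are made-up placeholders, and the 5% tolerance is an arbitrary choice, not a rule from the post:

```python
# Flag any card whose benchmark score strays more than `tolerance`
# (as a fraction) below the best card's score.

def outliers(scores, tolerance=0.05):
    """Return names of cards scoring more than `tolerance` below the best."""
    best = max(scores.values())
    return [name for name, score in scores.items()
            if (best - score) / best > tolerance]

# Hypothetical Fire Strike results from three "identical" cards:
results = {"card_a": 9450, "card_b": 9510, "card_c": 8200}
print(outliers(results))  # ['card_c'] -- well off the pace, worth a closer look
```

A card flagged this way is the one to re-test alone before blaming drivers or the PSU.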


----------



## Petet1990

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Hope you iron out your issues with your tri-fire. Afterward make sure to join us.
> 
> Tip: List your rig when you get a chance. When asking questions it helps other members help you to know your system. - *How to put your Rig in your Sig*


Will do, thanks


----------



## Petet1990

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Did you test each card individually before setting them all up together? You should always do that with new hardware IMO, to make sure you don't have one card that's gimped or doesn't perform the same as the others. I do this with all new hardware I get in, it's a bit time consuming, but worth it:
> 
> Run the following benchmarks, score is not important as long as they pass with no artifacts (and all matching cards score about the same):
> 
> 3DMark FireStrike on Normal & Extreme
> 3DMark11 on Performance & Extreme
> Unigine Valley (one full benchmark loop)
> 
> Then at least 1 game benchmark (be that Metro 2033, Metro Last Light, whichever game you have with a built in benchmark, pretty sure Hitman Absolution has one)
> 
> Running all of those & watching for artifacts, and check to see that the scores match or are close on each card ensures that each card is working by itself properly, and that eliminates them as being the issue.


I'm testing them out now, thanks


----------



## Enilder

Just checking back to let others know about my BSOD issue. It's been a couple of days since I updated the driver from 14.1 to 14.4. It is rock solid. I hope I don't get another BSOD; if it comes back, I will return to this thread. Thanks!


----------



## bluedevil

Just submitted for RMA. Hopefully it won't take too long. .


----------



## Petet1990

Played BF4 with one card for a bit and everything was fine... second card in and it crashes while loading... don't know anymore.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Petet1990*
> 
> Played BF4 with one card for a bit and everything was fine... second card in and it crashes while loading... don't know anymore.


Did you try each card individually? When i first tried BF4 with my 2 cards i had similar crash too. After restart it was all fine.


----------



## gobblebox

Looking to join! I've actually had this card for a while, finally, here is my proof:

Stock Core Clock: 1000MHz
Stock Memory Clock: 5000MHz

OC Core: 1135MHz
OC Mem: 6475MHz

*Sapphire Tri-X OC r9 290*
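For anyone curious how those clocks translate into percentage gains, a trivial calculation using the figures from the post above:

```python
# Percent overclock from the stock and OC clocks quoted above.

def oc_percent(stock_mhz, oc_mhz):
    """Overclock gain as a percentage of the stock clock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

print(round(oc_percent(1000, 1135), 1))  # core: 13.5
print(round(oc_percent(5000, 6475), 1))  # memory: 29.5
```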


----------



## Arizonian

Quote:


> Originally Posted by *gobblebox*
> 
> Looking to join! I've actually had this card for a while, finally, here is my proof:
> 
> Stock Core Clock: 1000Mhz
> Stock Memory Clock: 5000Mhz
> 
> OC Core: 1135Mhz
> OC Mem: 6475Mhz
> 
> *Sapphire Tri-X OC r9 290*
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Descadent

Just chiming in on my two 290X Vapor-Xs. Haven't started overclocking yet, but LOVE THEM... oh, it's so nice not having to turn down my games because of some small memory bus and 3GB so it doesn't choke at 7680x1440!

I just need amd to put out a driver that supports Assetto Corsa in crossfire!


----------



## zyndro

Hello there guys,

I have a Crossfire of reference 290s, but they make a lot of noise and heat,
so I decided to mod them, trying to keep the reference look.

What I did was cut (yes, cut) the heatsink and the metal plate that holds the VGA case; instead of that chunk of metal I wanted to put a water pump, so I took my Dremel drill and started working on it.

I used a Cooler Master Seidon 120, with some tweaks to the Intel 1155 socket mount to make it fit the VGA PCB holes, and voila, I was able to adapt the water pump.

I still have to put heatsinks on the VRAM; I'm waiting for them to arrive, and those should be cooled by the stock fan.









Finally, I drilled some holes in the VGA case to make space for the water pump tubes, and that's pretty much it.

Before, I was getting insane noise and 89°C on full load; now I'm getting 54°C on full load (with the radiator fan @ 100%).

Also, the 290 has a sexy look similar to a 295! Hahaha

I still have to do the same thing on the other card; I may do a how-to video for YouTube.









Cheers!
-Brian.


----------



## Arizonian

Quote:


> Originally Posted by *zyndro*
> 
> Hello there guys,
> 
> I have a crossfire of reference 290's, but they make a lot of noise and heat.
> so i decided to mod them, trying to keep the reference look.
> 
> what i did was cut ( yes cut ) the heatsink and the metal plate that holds the VGA case, instead of that chunk of metal i wanted to put a water pump, so i took my dremel drill and star working on it.
> 
> i used a cooler master seidon 120, some tweaks to the intel 1155 socket in order to make it fit on the VGA PCB holes and voila, i was able to adapt the water pump.
> 
> I still have to put heatsinks in the vrams, im waiting for them to arrive, and those should be cooled by the stock fan
> 
> 
> 
> 
> 
> 
> 
> 
> 
> finally did some holes in the VGA case to make space for the waterpump tubes and that's pretty much it.
> 
> before i was getting insane noise and 89° celsius on full load, now im getting 54° celsius on full load ( with the radiator fan @ 100% )
> 
> also the 290 has a sexy look similar to a 295! hahaha
> 
> still have to do the same thing on the another one VGA, i may do some video for youtube on the how to process.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers!
> -Brian.


Nice work, and welcome to OCN with your first post. Nice temps. Read the OP and submit proof and I'll add you to the club roster. Love to have you aboard.


----------



## bond32

Finally got the other blocks in:


----------



## zyndro

Thank you Arizonian!

I'm sorry, I'm kinda lost: the OP? Is that like the user rules, I guess?
And also "submit proof"; I'll do a quick search about it!


----------



## Vici0us

Ordered a Gigabyte Windforce R9 290 a few days ago, and it came in today. Now running Crossfire with a reference Asus R9 290. So far everything is good! My Asus card is Elpida and the Gigabyte is Hynix; they work great together. Windforce temps don't go past 73C (when running Fire Strike) while the reference card is @ 86C.
Here's GPU-Z proof for both cards just in case. (I believe Asus card is already added).
Gigabyte - http://www.techpowerup.com/gpuz/enzfn/
Asus - http://www.techpowerup.com/gpuz/ezegb/


----------



## Arizonian

Quote:


> Originally Posted by *zyndro*
> 
> Thank you Arizonian!
> 
> I'm sorry, kinda lost with the OP ? its like an user rules i guess?
> and also submit proof. ill do a quick search bout it !


Heh, sorry about that, I'm using lingo you're not familiar with, being new. OP = the original post of this thread.

A GPU-Z validation link with your OCN name showing, or a pic of the GPU with your OCN name on a paper or something.








Quote:


> Originally Posted by *Vici0us*
> 
> Ordered a Gigabyte Windforce R9 290 a few days ago, and it came in today. Now running Crossfire with a reference Asus R9 290. So far everything is good! My Asus card is Elpida and the Gigabyte is Hynix; they work great together. Windforce temps don't go past 73C (when running Fire Strike) while the reference card is @ 86C.
> Here's GPU-Z proof for both cards just in case. (I believe Asus card is already added).
> Gigabyte - http://www.techpowerup.com/gpuz/enzfn/
> Asus - http://www.techpowerup.com/gpuz/ezegb/
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated


----------



## kizwan

Quote:


> Originally Posted by *bluedevil*
> 
> Just submitted for RMA. Hopefully it won't take too long. .


Next time don't overtighten. Hopefully they'll fix or replace the card for you.
Quote:


> Originally Posted by *Petet1990*
> 
> played bf4 with one card for a bit and everything fine..second card in and it crashes while uploading..dont know anymore


Test the cards one by one; one of the cards may be faulty.


----------



## zyndro

Let me know if this works !!

http://www.techpowerup.com/gpuz/mekg/


----------



## Arizonian

Quote:


> Originally Posted by *bond32*
> 
> Finally got the other blocks in:
> 
> 
> Spoiler: Warning: Spoiler!


I don't see you on the list bond32. I'll have to apologize if I missed your submission. I can't find it. Link me to it, or by all means submit validation, I'll get you added.








Quote:


> Originally Posted by *zyndro*
> 
> Let me know if this works !!
> 
> http://www.techpowerup.com/gpuz/mekg/
> 
> 
> Spoiler: Warning: Spoiler!


Sure did. Congrats again - added


----------



## HoneyBadger84

Quote:


> Originally Posted by *Vici0us*
> 
> Ordered a Gigabyte Windforce R9 290 a few days ago, and it came in today. Now running Crossfire with a reference Asus R9 290. So far everything is good! My Asus card is Elpida and the Gigabyte is Hynix; they work great together. Windforce temps don't go past 73C (when running Fire Strike) while the reference card is @ 86C.
> Here's GPU-Z proof for both cards just in case. (I believe Asus card is already added).


Try capping the Gigabyte at 100% fan (it should still be quiet) and the reference card at 70%, and see how much your temps go down. 70% should help the stock blower a lot compared to the default profile, which tops out at ~50%. And the noise isn't too bad... at least not compared to 85-100%.


----------



## ebhsimon

Quote:


> Originally Posted by *Vici0us*
> 
> Ordered a Gigabyte Windforce R9 290 a few days ago, and it came in today. Now running Crossfire with a reference Asus R9 290. So far everything is good! My Asus card is Elpida and the Gigabyte is Hynix; they work great together. Windforce temps don't go past 73C (when running Fire Strike) while the reference card is @ 86C.
> Here's GPU-Z proof for both cards just in case. (I believe Asus card is already added).


Bro, you have some really decent temps. I was going to say your Windforce is good, but then I saw your reference card. 86C under full load? That's awesome. I always thought most reference cards throttle under full load since they reach 94C.


----------



## HoneyBadger84

Quote:


> Originally Posted by *ebhsimon*
> 
> Bro you have some really decent temps. I was going to say your Windforce is good, but then I saw your reference card. 86C under full load? That's awesome. I always thought most reference cards throttle under full load since they reach 94C




The 2 middle cards in that chart are reference cards under full load; the bottom card is a WindForce with fresh airflow.







Ignore the top card; I got rid of that VisionTek "aftermarket" POS a while ago.







That's with all fans at 100% and a shop-blower fan for side flow, though... but yeah, nice temps can be had on stock blower cards, just at the cost of noise.

I'm gonna be running 4 all-stock blower cards soon enough







looking forward to it.


----------



## Vici0us

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vici0us*
> 
> Ordered a Gigabyte Windforce R9 290 few days ago, it came in today. Now running Crossfire with Reference Asus R9 290. So far everything is good! My Asus card is Elpida and Gigabyte is Henyx. They work great together. Windforce temps don't go past 73C (when running FireStrike) while Reference card is @ 86C.
> Here's GPU-Z proof for both cards just in case. (I believe Asus card is already added).
> 
> 
> 
> Try capping the Gigabyte at 100% fan (it should still be quiet) and the reference card at 70%, see how much your temps go down. 70% should help the stock blower a lot compared to the default profile which tops out at ~50%. And the noise isn't too bad... at least not compared to 85-100%

I'll try 100% fan on the Gigabyte; my reference card is set to 65%, and that's the reason I'm not hitting the 90s.
Quote:


> Originally Posted by *ebhsimon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vici0us*
> 
> Ordered a Gigabyte Windforce R9 290 few days ago, it came in today. Now running Crossfire with Reference Asus R9 290. So far everything is good! My Asus card is Elpida and Gigabyte is Henyx. They work great together. Windforce temps don't go past 73C (when running FireStrike) while Reference card is @ 86C.
> Here's GPU-Z proof for both cards just in case. (I believe Asus card is already added).
> 
> 
> 
> Bro you have some really decent temps. I was going to say your Windforce is good, but then I saw your reference card. 86C under full load? That's awesome. I always thought most reference cards throttle under full load since they reach 94C
Click to expand...

Yeah, the reference card is set to 65% max speed; I don't want to make it too noisy. But I'm very happy with the Windforce temps and it's very quiet.


----------



## SgtMunky

I have the Sapphire 290 Tri-X, which I've been massively happy with since buying it, but my build is red-and-black themed. Any ideas on a) a backplate to tidy the card up and b) changing the look of the stock cooler?


----------



## ebhsimon

Quote:


> Originally Posted by *HoneyBadger84*
> 
> 
> 
> 2 middle cards in that chart are reference cards under full load, bottom card is a WindForce with fresh airflow
> 
> 
> 
> 
> 
> 
> 
> ignore the top card, I got rid of that VisionTek "aftermarket" POS a while ago
> 
> 
> 
> 
> 
> 
> 
> thats with all fans at 100% and a shop-blower fan for sideflow though... but yeah, nice temps can be had on stock blower cards, just at the cost of noise.
> 
> I'm gonna be running 4 all-stock blower cards soon enough
> 
> 
> 
> 
> 
> 
> 
> looking forward to it.


Those are really good temps.. But unlike many I'm very sensitive to noise. Partly why I only have one Vapor-x 290. Barely audible inside my modded case filled with acoustic foam.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Vici0us*
> 
> I'll try 100% fan on the Gigabyte; my reference card is set to 65%, which is why I'm not hitting the 90s.
> Yeah, the reference card is set to 65% max speed; I don't want to make it too noisy. But I'm very happy with the Windforce temps and it's very quiet.


Have you tried 70%? The noise change from 65 to 70 is small, but the extra flow helps a lot.

Yeah, the WindForce is a very good cooler as long as it's not breathing off another aftermarket card. I've had mine run as low as 58C max load with the side of my case off, using a shop-blower fan as side flow (bottom card is the Gigabyte)


----------



## HoneyBadger84

Quote:


> Originally Posted by *ebhsimon*
> 
> Those are really good temps.. But unlike many I'm very sensitive to noise. Partly why I only have one Vapor-x 290. Barely audible inside my modded case filled with acoustic foam.


I'm sure if I didn't game on a Logitech G930 headset it'd bug me more, but that setup was just for benchmarks. I'm getting my room upgraded to a 20A breaker and going dual-PSU before I run QuadFire of all reference cards for regular use. Got an Antec HCP 850W on the way for $99, and the 4th (and 5th) card should be here the same day as the PSU.

Then I'll have to figure out an ideal fan setup for regular use of 4 cards.


----------



## ebhsimon

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I'm sure if I didn't game on a Logitech G930 headset it'd bug me more, but that setup was just for benchmarks. I'm getting my room upgraded to a 20A breaker and going dual-PSU before I run QuadFire of all reference cards for regular use. Got an Antec HCP 850W on the way for $99, and the 4th (and 5th) card should be here the same day as the PSU.
> 
> Then I'll have to figure out an ideal fan setup for regular use of 4 cards.


If you're going to have the side panel open you may as well just run them in open air on a test bench. Both would accumulate similar amounts of dust, but the test bench would allow for way easier cleaning.


----------



## ethanhunt

Quote:


> Originally Posted by *SgtMunky*
> 
> I have the Sapphire 290 Tri-X which I am massively happy with since buying, but my build is red and black themed. Any ideas on a) a backplate to tidy the card up and b) changing the look of the stock cooler?


Is there a backplate available for the Tri-X?


----------



## HoneyBadger84

Quote:


> Originally Posted by *ebhsimon*
> 
> If you're going to have the side panel open you may as well just run them in open air on a test bench. Both would accumulate similar amounts of dust, but the test bench would allow for way easier cleaning.


For my daily-use setup I won't be running with the side of the case off, which is why I need to configure optimal airflow with a 4-card, dual-PSU setup in mind. It'll be my first foray into dual-PSU land.


----------



## ozzy1925

Quote:


> Originally Posted by *ethanhunt*
> 
> Is there a backplate available for Tri-X ?


Yes, but you can use it with waterblocks.


----------



## ebhsimon

Quote:


> Originally Posted by *HoneyBadger84*
> 
> For my daily-use setup I won't be running with the side of the case off, which is why I need to configure optimal airflow with a 4-card, dual-PSU setup in mind. It'll be my first foray into dual-PSU land.


You're going to need some damn high-airflow fans if they're open-air coolers. But luckily most are reference, so basically just make every fan an intake, I think? Or at least have a strong bias toward intake fans. Make your intakes much stronger than your exhaust if you must have an exhaust fan.


----------



## HoneyBadger84

Quote:


> Originally Posted by *ebhsimon*
> 
> You're going to need some damn high airflow fans if they're open air. But luckily most are reference so basically just make every fan an intake I think? Or at least a strong bias to intake fans. Make your intake fans much stronger than your exhaust if you must have an exhaust fan.


Yep, they're all reference cards with stock blowers. The only exhaust my case has is the fan on the back and the CPU radiator (H110), which is up in the 5.25" drive bay area and doesn't really mess with the case flow. Two front 140mm intakes, three side 180mm intakes (I'll be adding a fourth at the upper right for better direct air toward the CPU radiator):




The bottom fans, at least one of them, will have to come out when the secondary PSU goes in.

And I'll be tidying up the wiring, of course, once it's more permanently set up...


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *HoneyBadger84*
> 
> For my daily-use setup I won't be running with the side of the case off, which is why I need to configure optimal airflow with a 4-card, dual-PSU setup in mind. It'll be my first foray into dual-PSU land.


Dual 1200W PSUs enabled me to get 1290/1398 on tri-fire. So yeah, a test bench or deskputer is the go for this.


----------



## Dasboogieman

Quote:


> Originally Posted by *ethanhunt*
> 
> Is there a backplate available for Tri-X ?


Yes there is, you just need the right screws. @SgtMunky





This is mine, I got the EK Backplate and screws from Helipal

They are these ones:
http://www.helipal.com/m2-5-x-10mm-flat-head-cap-screws.html

You may also need one of these for the screw hole under the EK label on the backplate (which blocks the hole for some reason):
http://www.helipal.com/m2-5-x-10mm-button-head-cap-screws.html

You can also use thermal pads under the backplate in the VRM area to improve temperatures. I went for broke and installed sticky heatsinks to the backplate. Dropped my VRM temperatures by about 5-10 degrees.


----------



## SgtMunky

Quote:


> Originally Posted by *Dasboogieman*
> 
> Yes there is, you just need the right screws. @SgtMunky
> 
> 
> 
> 
> 
> This is mine, I got the EK Backplate and screws from Helipal
> 
> They are these ones:
> http://www.helipal.com/m2-5-x-10mm-flat-head-cap-screws.html
> 
> You may also need one of these for the screw hole under the EK label on the backplate (which blocks the hole for some reason):
> http://www.helipal.com/m2-5-x-10mm-button-head-cap-screws.html
> 
> You can also use thermal pads under the backplate in the VRM area to improve temperatures. I went for broke and installed sticky heatsinks to the backplate. Dropped my VRM temperatures by about 5-10 degrees.


You legend, I'll look at this properly when I'm at home with a proper monitor, but that looks like what I'm after.


----------



## HoneyBadger84

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Dual 1200W PSUs enabled me to get 1290/1398 on tri-fire. So yeah, a test bench or deskputer is the go for this.


No way I'll be able to push those clocks on air anyway lol. I'm hoping I can get the cores to 1150, which the 3 HIS cards I already have can run easily at +125~131mV, and 1400-1550 on the vRAM. That'll be for benchmarks only. For regular 3K (3x1080p) gaming I'll be running them at stock... then sometime between now and next June I'll pick up a 4K TV. Gotta pay off the last bit of the newer 14" Alienware I got & one other bill before I can get the newer TV.


----------



## Kuro1n

Guessing this will be my best bet at getting a Sapphire 290X Tri-X 043 ROM, so please upload it if you have one. I'm trying to unlock my 290 Tri-X, but it's not working with 042 ROMs since the drivers won't run with them.


----------



## Dasboogieman

Quote:


> Originally Posted by *Kuro1n*
> 
> Guessing this will be my best bet at getting a Sapphire 290X Tri-X 043 ROM, so please upload it if you have one. I'm trying to unlock my 290 Tri-X, but it's not working with 042 ROMs since the drivers won't run with them.


This is the ROM you are after
http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/3160

@zackbummente had the same issue earlier on and he managed to get a copy

Also @Arizonian
I forgot to validate my second card



MSI 290 4G Gaming Edition
Standard cooling, no mods


----------



## Kuro1n

Quote:


> Originally Posted by *Dasboogieman*
> 
> This is the ROM you are after
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/3160
> 
> @zackbummente had the same issue earlier on and he managed to get a copy


@Dasboogieman
Yeah, I saw that one, but I wasn't sure I trusted a guy who had only made 7 posts on the forum in total, with a BIOS that was named rather strangely; that's why I'm asking. I don't want to mess up my card, as I'll need it for work as well.


----------



## fateswarm

Quote:


> Originally Posted by *Kuro1n*
> 
> @Dasboogieman
> Yeah, I saw that one, but I wasn't sure I trusted a guy who had only made 7 posts on the forum in total, with a BIOS that was named rather strangely; that's why I'm asking. I don't want to mess up my card, as I'll need it for work as well.


I tried the ROM and it bricked, temporarily; I reverted it. I guess it's because I didn't have the magic hawaiinfo12 output, but I'd hoped mine might be a special case.

Oh well, at least I get lower heat


----------



## Gobigorgohome

Quote:


> Originally Posted by *Petet1990*
> 
> I tried Crysis and the same thing happened... I just did a fresh install of Windows, and as I'm installing the AMD drivers it BSODs again... I'm really at a loss for words now...


Hmm... First off, try just one card, then add the other one and so on, and try the card(s) in different slots too. Check whether it BSODs with all the PCI-E cables (and that they are plugged in correctly - take them out and put them back in), and also check the CPU 8-pin and motherboard 24-pin cables. Then try again. If you use additional cables for more than two GPUs, check those too.

Also, are your cards the same brand, and do they use the same BIOS? Is anything running hot or at the wrong voltage?


----------



## gobblebox

Can anyone help out with the following issue on a Sapphire Tri-X OC R9 290? When I boot up my PC, the GPU is always idling at a pretty high temperature, ~60C... the only way I can get it back down (even after tinkering with fan curves) after starting up my PC is to manually open MSI AB, select a profile, then click "reset". It appears that the culprit is ULPS being disabled, but why would it be disabled at startup before I even open AB? And why would clicking "reset" lower the temps and appear to re-enable ULPS (I can see the core & memory speeds drop at idle) when, even in the settings tab, "Disable ULPS" is clearly still ticked? I don't have AB set up to run at startup either. Any help would be greatly appreciated!

[Edit] Typo & OCD


----------



## HOMECINEMA-PC

I just finished a 3-hour session of ol' Crysis 3 on very high settings @ 7680x1440, running tri monitors. Turned off tessellation and was averaging 90fps on tri 290s @ 1100 with stock volts and mem. Peak GPU load temp of 30C-33C. Very pleased


----------



## ebhsimon

Quote:


> Originally Posted by *gobblebox*
> 
> Can anyone help out with the following issue on a Sapphire Tri-X OC r9 290? When I boot up my PC, the GPU is always idling at a pretty high temperature ~60 C ... the only way I can get it back down (even after tinkering with Fan Curves) after starting up my PC is to manually open MSI AB, select a profile, then click "reset'. It appears that the culprit is ULPS being disabled, but why would it be disabled at startup before I even open AB? And why would clicking "reset" lower the temps/appear to re-enable ULPS (I can see the core & memory speeds drop at idle) when, even in the settings tab, "Disable ULPS" is clearly still ticked? I do not have AB set up to automatically run at startup either. Any help would be greatly appreciated!
> 
> [Edit] Typo & OCD


Do you have overclocks set to run at desktop start up?
Do you have adequate airflow?
Has the card had this problem since you bought it?
What speed are the fans running at?

My 290 card idles at ~44-45C with triple monitors with ULPS disabled.
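For reference, the switch Afterburner's "Disable ULPS" box flips is just the `EnableUlps` DWORD under Windows' display-adapter class registry keys, so you can check its real state outside AB. A hedged dry-run sketch - the class GUID below is the standard display class, but the `0000` sub-key is a placeholder (find your adapter's sub-key with the query first), and the real `reg.exe` commands need an elevated Windows prompt:

```shell
# Dry-run sketch: prints the reg.exe commands instead of executing them.
# KEY is the standard display-adapter class key; the "0000" sub-key below
# is an example - your AMD adapter may live under 0001, 0002, etc.
KEY='HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}'
run() { printf 'would run: %s\n' "$*"; }   # drop the printf wrapper to execute for real

run reg query "$KEY" /s /f EnableUlps                        # locate every adapter key carrying the flag
run reg add "$KEY\0000" /v EnableUlps /t REG_DWORD /d 0 /f   # 0 = ULPS off, 1 = back on
```

Afterburner writes this same value, which is why its state can differ from what the checkbox appears to show until the driver reloads.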


----------



## PCSarge

Quote:


> Originally Posted by *ebhsimon*
> 
> Bro you have some really decent temps. I was going to say your Windforce is good, but then I saw your reference card. 86C under full load? That's awesome. I always thought most reference cards throttle under full load since they reach 94C


Nah, my reference card on the blower (before my waterblock), OC'd to 1100/1300, was around 85C full load in Fire Strike. If I cranked the fan to 100% it sat around 77C. That was just after a TIM swap, of course.

Now that it's waterblocked it barely breaks 50C on an overclock


----------



## gobblebox

Quote:


> Originally Posted by *ebhsimon*
> 
> Do you have overclocks set to run at desktop start up?
> Do you have adequate airflow?
> Has the card had this problem since you bought it?
> What speed are the fans running at?
> 
> My 290 card idles at ~44-45C with triple monitors with ULPS disabled.


The end of my post says no OCs run at startup. My airflow is excellent: 2x 140mm Noctua front intakes, 1x 140mm Corsair Quiet rear exhaust, and a top H100i in push/pull with 4x SP120 Performance as exhaust. But like I said, if I reset the profiles in AB, the temps drop to 40C... I just don't know why, at startup, without any OC on, it acts like there is one. Fans are around 40-50% because they're chugging to cool this beast off.


----------



## ebhsimon

Quote:


> Originally Posted by *gobblebox*
> 
> The end of my post says no OCs run at startup. My airflow is excellent: 2x 140mm Noctua front intakes, 1x 140mm Corsair Quiet rear exhaust, and a top H100i in push/pull with 4x SP120 Performance as exhaust. But like I said, if I reset the profiles in AB, the temps drop to 40C... I just don't know why, at startup, without any OC on, it acts like there is one. Fans are around 40-50% because they're chugging to cool this beast off.


So it's definitely a software problem... I don't know; try reinstalling AB, and if that doesn't fix it, reinstall your 14.6 AMD drivers.


----------



## PCSarge

Quote:


> Originally Posted by *ebhsimon*
> 
> So it's definitely a software problem... I don't know; try reinstalling AB, and if that doesn't fix it, reinstall your 14.6 AMD drivers.


I'd almost say go from 14.6 back to 14.4; I had the same issue with 14.6 on my card.


----------



## gobblebox

Quote:


> Originally Posted by *ebhsimon*
> 
> So it's definitely a software problem... I don't know; try reinstalling AB, and if that doesn't fix it, reinstall your 14.6 AMD drivers.


Now that you mention it, I forgot that I updated to 14.7 last night; prior to that, I wasn't having this issue... I'll be downgrading tonight. Thanks for the help!


----------



## bluedevil

Well VisionTek emailed me for the invoice. Hopefully it goes through, as I am not the "original" purchaser.


----------



## tsm106

Doesn't VisionTek have a standard warranty? Those are locked to original owners...? Only Asus, MSI, and maybe Giga are serial-based and thus owner-agnostic, IIRC.


----------



## Petet1990

OK, looks like I got all three cards going... I went and set the CPU to stock and upped the power limit on each card, and so far everything is good.


----------



## fateswarm

Hm, is it normal that when I try to install the latest beta it says the driver is already the latest version, but not CCC? Unless they released a version that only updated the non-driver parts.


----------



## bluedevil

Quote:


> Originally Posted by *tsm106*
> 
> Doesn't visiontek have a standard warranty? They are locked to original owners...? Only Asus, MSI, and maybe Giga are serial based thus owner agnostic iirc.


Dunno, we shall see!


----------



## Petet1990

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Hmm... First off, try just one card, then add the other one and so on, and try the card(s) in different slots too. Check whether it BSODs with all the PCI-E cables (and that they are plugged in correctly - take them out and put them back in), and also check the CPU 8-pin and motherboard 24-pin cables. Then try again. If you use additional cables for more than two GPUs, check those too.
> 
> Also, are your cards the same brand, and do they use the same BIOS? Is anything running hot or at the wrong voltage?


I think it's good now; I put the CPU to stock clocks and upped the power limit on the GPUs... so far so good... thanks


----------



## bond32

Koolance gpu blocks are installed! Finally!


----------



## HoneyBadger84

Quote:


> Originally Posted by *bond32*
> 
> Koolance gpu blocks are installed! Finally!


Dunno how y'all can do all that without sweating bullets about it. I guess you get used to it... I never did, but then I only fiddled with my loop every so often. Am I the only one who loves compression fittings?


----------



## bond32

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Dunno how y'all can do all that without sweating bullets about it. I guess you get used to it and the like... I never did but I only fiddled with my loop every so often. Am I the only one that loves compression fittings?


Those are all compression fittings. I don't like them, but I use them anyway. I do have some barbs that don't need a hose clamp - it's actually quite difficult to get the tubing off them.


----------



## Jflisk

Quote:


> Originally Posted by *bond32*
> 
> Koolance gpu blocks are installed! Finally!


Bond nice ^^^^^^^


----------



## bond32

Thanks, I'm pretty pleased with them. I need a bigger case; I have lots of radiators I'm not even using now. Looking at the Bitfenix Atlas.


----------



## Jflisk

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Dunno how y'all can do all that without sweating bullets about it. I guess you get used to it and the like... I never did but I only fiddled with my loop every so often. Am I the only one that loves compression fittings?


I use compression fittings and have never had any problems with them; once the tube is properly seated over the fitting and the collar is screwed down, they're not going anywhere.


----------



## Jflisk

Quote:


> Originally Posted by *bond32*
> 
> Thanks, I'm pretty pleased with them. Need a bigger case, I have lots of radiators I am not even using now. Looking at the Bitfenix Atlas


How long did it take you to sheathe all that cable?


----------



## bond32

Quote:


> Originally Posted by *Jflisk*
> 
> How long did it take you to sheathe all that cable?


However long it took to go to Amazon and order the full kit from EVGA... I cheated, lol. I have little patience; I figured sleeving would be too much.


----------



## Jflisk

Quote:


> Originally Posted by *bond32*
> 
> However long it took to go to amazon and order the full kit from EVGA... I cheated lol. I have little patience, figured sleeving would be too much.


Hah, the easy way out. I was thinking about sheathing my cables, but I have 14 Noctua fans, so what's the sense? Hurle brown


----------



## zackbummente

@Kuro1n

I received your PM - my card works flawlessly.
As far as I can see, there are 2 revisions of the 290 Tri-X out there.
The early ones could be unlocked with the custom BIOSes in the first post - the later revision got some important changes in its layout, so another BIOS was needed.

I've flashed several BIOSes before, and I would go so far as to say... you cannot screw up your card. In the Tri-X case, the worst case would be that the Tri-X will not install the AMD driver - then you've got the other revision and have to try another BIOS - or you flash back your original one.

I hope I could dispel your concerns.

By the way... I've only got a few posts because I'm from Germany and this forum was the only chance to get my card unlocked - my card is only a few weeks old... I guess you see that point.


----------



## cennis

Quote:


> Originally Posted by *zackbummente*
> 
> @Kuro1n
> 
> I received your PM - my card works flawlessly.
> As far as I can see, there are 2 revisions of the 290 Tri-X out there.
> The early ones could be unlocked with the custom BIOSes in the first post - the later revision got some important changes in its layout, so another BIOS was needed.
> 
> I've flashed several BIOSes before, and I would go so far as to say... you cannot screw up your card. In the Tri-X case, the worst case would be that the Tri-X will not install the AMD driver - then you've got the other revision and have to try another BIOS - or you flash back your original one.
> 
> I hope I could dispel your concerns.
> 
> By the way... I've only got a few posts because I'm from Germany and this forum was the only chance to get my card unlocked - my card is only a few weeks old... I guess you see that point.


The worst that can happen is that, if you flash the wrong BIOS, the card may not be able to display anything.

The solution is to boot up using an integrated GPU, or a different GPU, and put the flashed 290 in a secondary PCI-E slot and flash it back.

If you do not have an integrated GPU or another GPU/PCI-E slot, you can boot the card up on the working BIOS switch position, toggle the switch once you've booted, and then flash the non-active BIOS slot.

*TL;DR: Make sure you save a copy of the stock BIOS on the bootable USB you are using, and try not to flash both BIOS positions, as the stock one can be used to restore the bricked one.*
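The backup-then-flash routine above can be sketched as a dry run. The `-i`/`-s`/`-p`/`-f` switches follow the common ATIFlash CLI, but the adapter index `0` and the ROM file names here are placeholders - verify both against your own card before flashing anything:

```shell
# Dry-run sketch of the backup-then-flash sequence: prints each atiflash
# command instead of executing it. Adapter index and file names are examples.
run() { printf 'would run: %s\n' "$*"; }   # drop the printf wrapper on the real flash USB

run atiflash -i                      # 1. list adapters and note the 290's index
run atiflash -s 0 stock290.rom       # 2. save the current (stock) BIOS first
run atiflash -p 0 unlock.rom -f      # 3. program the unlock BIOS (-f forces past ID checks)
run atiflash -p 0 stock290.rom       # 4. recovery: reflash stock if the card stops displaying
```

On a dual-BIOS card, flash with the switch in one position only, so the other position stays stock for the toggle-and-reflash recovery described above.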


----------



## ZealotKi11er

Quote:


> Originally Posted by *cennis*
> 
> The worst that can happen is that, if you flash the wrong BIOS, the card may not be able to display anything.
> 
> The solution is to boot up using an integrated GPU, or a different GPU, and put the flashed 290 in a secondary PCI-E slot and flash it back.
> 
> If you do not have an integrated GPU or another GPU/PCI-E slot, you can boot the card up on the working BIOS switch position, toggle the switch once you've booted, and then flash the non-active BIOS slot.
> 
> *TL;DR: Make sure you save a copy of the stock BIOS on the bootable USB you are using, and try not to flash both BIOS positions, as the stock one can be used to restore the bricked one.*


I think one of the BIOS chips can't be flashed, so you're always safe. It's helped me fix MSI AB problems so many times.


----------



## devilhead

http://hwbot.org/submission/2589320_8_pack_3dmark11___performance_radeon_r9_290x_24759_marks
1600/1700 not bad


----------



## ozzy1925

Quote:


> Originally Posted by *devilhead*
> 
> http://hwbot.org/submission/2589320_8_pack_3dmark11___performance_radeon_r9_290x_24759_marks
> 1600/1700 not bad


Also:
http://hwbot.org/submission/2590063_
A good message to Nvidia fanboys


----------



## Gobigorgohome

Hello, I have a problem: I ordered 5x Fujipoly Ultra Extreme 0.5mm thick, and that is just enough for about two cards... I thought it was enough for all four, and shipping from FrozenCPU takes at least one week from the date I order (which I don't have time for now). How bad is the memory thermal pad from EK? I have enough of the 1mm thick thermal pad (which will cover the VRMs), but I don't have enough for the memory - can I just use the EK thermal pad there?


----------



## ZealotKi11er

You don't need to change the memory pads, so you're fine using the ones EK provides. Only VRM1 needs the good stuff.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Jflisk*
> 
> I use compression fittings never had any problems with them once the tube is properly placed over them and screwed down on they are not going anywhere.


Yeah, they're definitely my favorite type of fitting. They're all I used in the last loop I built, and the only issue it had was a stupid thread on the radiator being messed up and leaking a bit.


----------



## bond32

Well, after all this time having a totally overkill water loop, I can finally push it to the limits. My loop is cooling well with the ST30 under a single set of AP-15s in push and a Monsta 240 in push/pull with AP-15s, but I can really feel the heat being dumped now with all the 290s in the loop running the PT1 BIOS and a 4770K at 1.49 VCORE...

Awesome.

Let the benching begin...


----------



## ZealotKi11er

Is there any way to OC with CrossFireX while keeping ULPS ON? I'm asking because with any OC I do in MSI AB, BF4 hangs and the system locks. I want to keep ULPS ON because it's summer, 90% of the time the system is at idle, and it uses ~50W less power: 120W vs 170W.


----------



## Arizonian

Quote:


> Originally Posted by *Dasboogieman*
> 
> This is the ROM you are after
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/3160
> 
> @zackbummente had the same issue earlier on and he managed to get a copy
> 
> Also @Arizonian
> I forgot to validate my second card
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> MSI 290 4G Gaming Edition
> Standard cooling, no mods


Congrats - updated








Quote:


> Originally Posted by *bond32*
> 
> Koolance gpu blocks are installed! Finally!
> 
> 
> Spoiler: Warning: Spoiler!


Congrats looks great - added
















_If you could add a GPU-Z validation link to the original post, it would be appreciated._


----------



## haravovadia

Hi all,

I think I've run into an extreme case here, and it's driving me nuts... I need someone with 2 (preferably 3) monitors to please try to reproduce what I'm seeing, to verify whether this is a driver issue or a hardware issue:

setup:
-Single Sapphire R9 290 Tri-X (750W Seasonic PSU, if relevant)
-Driver version is irrelevant, as I have had this happen on all 14.x and 13.x drivers.
-3 monitors (mine are 1920x1200) connected with *Eyefinity disabled* (so 3 independent 1920x1200 monitors)
-Windows 7 with Aero fully enabled (specifically "transparent glass")

Issue:

1-Open up any window (folder/web browser etc.), set it to not be maximized, and move it across monitors quickly. On my setup this generates heavy lagging of the window animation, so much that even my sister's Atom notebook does a better job of keeping the Aero experience smooth.

2-In Photoshop (again, version irrelevant), select the selection tool and attempt to select objects. This generates heavy lag, and the selection-tool animation is really slow.

Issue 1 occurs only if I use more than 1 monitor with Eyefinity disabled and "transparent glass" enabled.
Issue 2 always occurs, no matter what my setup is.

*I have tried setting the GPU to run at full clock speeds with Afterburner while in the 2D (Windows) environment, but this made no change.
**I have seen the 2D slow-down in other software such as the Counter-Strike: GO menu etc., but I'd rather keep the repro as posted, since Photoshop is more widespread than specific games.
***With my 5850 everything works perfectly smoothly on the same rig.

Thank you!


----------



## Gualichu04

Finally was able to put the good XFX R9 290X under water, and it idles at 35-37C with a room temp of 25.5C. Under load the GPU does not go over 44C and the VRM temp is 35-37C; the active backplate really helps.


----------



## FuriousPop

Quote:


> Originally Posted by *haravovadia*
> 
> Hi all,
> 
> I think I ran into an extreme case here, but its driving me nuts.. I need someone with 2 (preferably 3) monitors to please try and reproduce what I am having, to verify if this is a driver issue or a hardware issue:
> 
> setup:
> -Single Sapphire R9 290 TRIX (750W seasonic PSU if relevant)
> -Driver version is irrelevant as I have had this happening on all 14.x and 13.x drivers.
> -have 3 monitors (mine are 1920X1200) connected with *Eyefinity disabled* (so 3 independent 1920X1200 monitors)
> -Windows 7 with Aero fully enabled (specifically "transparent glass")
> 
> Issue:
> 
> 1-Open up any window (folder/web browser etc), set to not be Maximized and move it across monitors quickly, at my setup this generates heavy lagging of the window animation, so much that even my sister's Atom notebook does a better job at keeping the Aero experience smooth.
> 
> 2-At Photoshop (again, version irrelevant) select the "selection tool" and attempt to select objects, this generates heavy lag and the selection tool animation will be really slow.
> 
> Issue 1 will take place only if i use more than 1 monitor with Eyefinity disabled, and "Transparent glass" enabled.
> Issue 2 will always occur no matter what my setup is.
> 
> *I have tried to set the GPU to run at full clock speeds with afterburner while in 2D (windows) environment but this made no change.
> **I have seen the 2D slow-down issue at other various software such as Counter strike GO menu etc, but prefer to keep it as I have posted as photoshop is more widespread than specific games are.
> ***With my 5850 everything works perfectly smooth with the same rig.
> 
> Thank you!


Can I ask why you have Eyefinity disabled when running 3 monitors?

You can still use each monitor independently, provided you don't hit the full-screen (maximize) button on the application - it can be done manually, i.e. drag the app to the monitor you want, then simply move it to the top of that particular monitor, and it should resize the app to fill the whole monitor.


----------



## HOMECINEMA-PC

THIS ^^^^^^^


----------



## haravovadia

Hi guys,

To be honest, I like to have the option to game on a single screen (Heroes 3 and various other games that like to run fullscreen yet hate EF come to mind).
In addition, while running Eyefinity I have not found an option (yet) to maximize a window on a single screen without a third-party application such as Hydra running in the background, and even then the start menu spans all 3 screens, and having the start button on the edge of the left screen feels odd.

I am currently running a macro for swapping between EF and normal "standalone" modes: I activate the EF profile before booting up an EF game, and outside of it I use the standalone profile.

Back on track, though: even if this is not a "recommended/logical" configuration, I still see no reason for the 290 to perform much worse than the 5850 or even integrated CPU graphics. Can anyone please confirm whether this happens only on my side of things?


----------



## Mega Man

Within CCC you can easily fix the start menu


----------



## Gualichu04

My single R9 290X does 1150MHz on the core and 50MHz extra on the memory at +19mV and +20 power limit; Heaven had no artifacts.
Edit: it seems it needs between +31 and +38mV on the core and power limit +35 or more. I was getting artifacts in Watch Dogs.


----------



## ebhsimon

Quote:


> Originally Posted by *Gualichu04*
> 
> My single R9 290X does 1150MHz on the core and 50MHz extra on the memory at +19mV and +20 power limit; Heaven had no artifacts.


That's really good. I haven't overclocked my memory yet, but at +25mV I only get 1090MHz on the core on my 290.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Gualichu04*
> 
> MY single r9 290x does 1150mhz on the core and 50mhz extra on the memory +19mv and +20 power limit heaven had no artifacts.


Lucky dog, I have to crank my voltage up to ~+125-131mV to get 1150MHz core stable on my reference cards. What type of cooler are you running, and what load temps do you see at those clocks?


----------



## Gualichu04

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Lucky dog, I have to crank my voltage up to ~125-131mV to get 1150MHz core stable on my reference core edition cards, what type of cooler are you running and what load temps do you see at those clocks?


Water cooled with an Aqua Computer kryographics water block and active backplate. Load temps are 45C max on the core and 35-37C on the VRM. It seems it starts to artifact after gaming for over 30 minutes. Seems to be stable again at +31-38mV and adding more to the power limit. Not sure if the power limit affects the overclock if ramped up to +50. I tried to get it to 1300MHz on the core, but at +200mV and +50 power limit it black-screen crashes.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Gualichu04*
> 
> Water cooled with aquacomputer kryographics water block and active back plate. Load temps are 45C max on the core and 35-37C on the vrm. It seems it starts to artifact after gaming for over 30 minutes. seems to be stable again at +31-38mv and adding more to the power limit. Not sure if the power limit affects the overclock if ramped up to +50. I tried to get it to 1300mhz on the core but at 200mv and +50 power limit it black screen crashes.


I think most folks set a +50% power limit when OCing because it allows the card to draw up to 50% more power if it needs it, so it's better to be safe than sorry. That's what I do when I'm testing overclocks.
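The +50% headroom logic is easy to put numbers on. A minimal sketch, assuming a ~250W board power purely for illustration (this figure is an assumption, not from the thread; check your own card's spec):

```python
# How a power-limit offset scales the card's power ceiling.
# The 250W board power below is an assumed example, not an official figure.
def power_cap(board_power_w: float, limit_percent: float) -> float:
    """Effective power ceiling for a given power-limit offset in percent."""
    return board_power_w * (1 + limit_percent / 100)

print(power_cap(250, 0))   # stock cap: 250.0 W
print(power_cap(250, 50))  # +50% limit: 375.0 W
```

The raised cap is only a ceiling; the card still draws what the load demands, which is why +50% is "safe rather than sorry" instead of a forced 50% extra draw.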

I'm stuck on air cooling so overclocking is a bit more challenging especially in 2-3-4 card setups.


----------



## Gualichu04

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I think most folks set to +50% power limit when OCing because it allows the card to draw up to 50% more power, if it needs it, so its better to be safe with that than sorry. That's what I do when I'm testing overclocks.
> 
> I'm stuck on air cooling so overclocking is a bit more challenging especially in 2-3-4 card setups.


I am keeping it at +50 power limit then. Seems to be stable at +38mV, 1150MHz on the core and 1300MHz on the memory. Did a run of Heaven flawlessly. Max temp was 49C with the room at 26.6C.


----------



## HoneyBadger84

Do a loop of Valley with at least 2 run-throughs. Or run 3DMark Fire Strike's demo on a loop (not on Extreme); that would keep you pegged at 100% most of the time and show artifacts, if it's going to, by the second or third loop.


----------



## tsm106

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Is there any way to OC with CrosfireX while having ULPS ON? I am asking because any OC i do with MSI, BF4 hangs and system locks. I want to keep ULPS ON because its summer and 90% of the time the system is at idle and its uses ~ 50W less power. 120W vs 170W.


Using ULPS on + unofficial mode = instant BSOD as soon as clocks are engaged.

If you must do it with ULPS on, use official mode. Make sure the regedit entries AB inserts are replicated to your other cards. You know which entry it is, right?


----------



## ZealotKi11er

Quote:


> Originally Posted by *tsm106*
> 
> Using ulps on + unofficial mode = instant BSOD as soon as clocks are engaged.
> 
> If you must do it with ulps on use official mode. Make sure the regedit entries AB inserts are replicated to your other cards. You know which entry it is right?


I have not used unofficial mode since the HD 7970. Why do I need that mode? Not sure about the regedit.


----------



## tsm106

Read it again. ULPS on needs official, not unofficial.

If you search the GPU folders in the registry, you will find that AB doesn't set ULPS correctly, if at all, on crossfire cards.


----------



## Jflisk

Quote:


> Originally Posted by *tsm106*
> 
> Read it again. Ulps on need official not unofficial.
> 
> If you search the gpu folders in registry, you will find that AB doesn't set ulps correctly if at all on crossfire cards.


TSM, can't they just use this utility that turns ULPS off (oops, almost said on)? I use it for my xfire setup instead of digging through the registry.

http://www.ulpsconfigurationutility.com/

Thanks


----------



## tsm106

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Read it again. Ulps on need official not unofficial.
> 
> If you search the gpu folders in registry, you will find that AB doesn't set ulps correctly if at all on crossfire cards.
> 
> 
> 
> TSM cant they just use this utility turns ULPS off (OPSS almost said on). I use it for my xfire set up. Instead of digging registry.
> 
> http://www.ulpsconfigurationutility.com/
> 
> Thanks

I don't need an app to do it, since it's memorized. You should learn which folder it is by memory too: 4D36E968, under CurrentControlSet\Class. Really, memorize it, it's easy. Y'all should.

Under the unofficial method I walk you through how to traverse to the folder. It's really a non-issue to remember.

http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40
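For reference, the manual tweak described above boils down to a registry value like the sketch below. This is a hedged example: the `0000` instance subkey is hypothetical (each GPU gets its own numbered subkey, so check which ones hold your cards before touching anything), while the class GUID is the standard display-adapter class and `EnableUlps` is the value Afterburner toggles.

```
Windows Registry Editor Version 5.00

; Example only: the numbered instance subkeys (0000, 0001, ...) vary per system.
; On crossfire setups, EnableUlps must be set under each card's subkey.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4D36E968-E325-11CE-BFC1-08002BE10318}\0000]
"EnableUlps"=dword:00000000
```

As always with the registry, export a backup of the key before importing changes.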


----------



## ZealotKi11er

I want to keep ULPS on and still OC. I don't have unofficial mode enabled in MSI AB. If there is no way to OC with ULPS on, then I will give up.


----------



## tsm106

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I want to keep ULPS ON and still OC. I dont have Unofficial Mode enabled in MSI AB. If there is no way to OC with ULPS ON then i will give up.


The pragmatic way is to use Overdrive. AB doesn't always work the way it's advertised, and currently it's the only way to overclock past the Overdrive clock limits using the official method. Since your setup is crashing, I suspect AB. If you are not clocking past the Overdrive limits, then simply use Overdrive?


----------



## Jflisk

Quote:


> Originally Posted by *tsm106*
> 
> The pragmatic way is to use Overdrive. AB doesn't always work the way it's advertised and currently it's the only way to overclock past the Overdrive clock limits using official method. Since your setup is crashing, I suspect it to be AB. If you are not clocking past Overdrive limits, then simply use Overdrive?


I have to give you a +1 on the forum link. Did you ever get the pictures of your new machine in? Thanks


----------



## tsm106

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> The pragmatic way is to use Overdrive. AB doesn't always work the way it's advertised and currently it's the only way to overclock past the Overdrive clock limits using official method. Since your setup is crashing, I suspect it to be AB. If you are not clocking past Overdrive limits, then simply use Overdrive?
> 
> 
> 
> I have to give you a +1 on the forum link. Did you ever get the pictures of your new machine in. Thanks

Thanks. A lot of the things written in the guide have been implemented in AB. However, with all the automation people soon forget how or even why things work the way they do. Also, AB is not perfect and doesn't always do what the check boxes say. But that is all part of overclocking.

As for the H Frame Mini, I need to take some pics. I was in a hurry to build it for my daughter, then we plugged it in and she won't let me take it apart lol. But yeah, I guess I owe the cfx club some pics.


----------



## ZealotKi11er

Quote:


> Originally Posted by *tsm106*
> 
> The pragmatic way is to use Overdrive. AB doesn't always work the way it's advertised and currently it's the only way to overclock past the Overdrive clock limits using official method. Since your setup is crashing, I suspect it to be AB. If you are not clocking past Overdrive limits, then simply use Overdrive?


Overdrive has no voltage control, or does it?


----------



## HoneyBadger84

The last of my children will be here today... maybe... two more, one stays, one goes... whichever one gets along with the HIS cards better, aka whichever one can match them in clocks easily. If both can, then I'll prolly just keep the Asus and ditch the Sapphire back to eBay or sell it locally. *clicks fingernails together evilly*


----------



## tsm106

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> The pragmatic way is to use Overdrive. AB doesn't always work the way it's advertised and currently it's the only way to overclock past the Overdrive clock limits using official method. Since your setup is crashing, I suspect it to be AB. If you are not clocking past Overdrive limits, then simply use Overdrive?
> 
> 
> 
> Overdrive has no voltage control or does it?

No, it doesn't have voltage control. But you're on water; you should be able to max the Overdrive clock limits without adding voltage.


----------



## fateswarm

Quote:


> Originally Posted by *tsm106*
> 
> you're on water, you should be able to max Overdrive clock limits without adding voltage.


I don't think that's right.


----------



## ebhsimon

Quote:


> Originally Posted by *fateswarm*
> 
> I don't think that's right.


LOL that is definitely not right.


----------



## heroxoot

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> The pragmatic way is to use Overdrive. AB doesn't always work the way it's advertised and currently it's the only way to overclock past the Overdrive clock limits using official method. Since your setup is crashing, I suspect it to be AB. If you are not clocking past Overdrive limits, then simply use Overdrive?
> 
> 
> 
> Overdrive has no voltage control or does it?
> 
> 
> No it doesn't have volt control. But you're on water, you should be able to max Overdrive clock limits without adding voltage.

Why would you think this?


----------



## tsm106

I guess that means you guys couldn't do it eh?


----------



## bluedevil

Well my RMA is underway. Hopefully I get it back soon.


----------



## HoneyBadger84

Well hello ladies, aren't you lookin' mighty fine...

The Asus card looks almost like it's scarcely been used. The Sapphire, even after the guy obviously cleaned it (see picture below for how it showed up for auction), is still kinda ick on the inside, but I got most of it off with a compressed air can. I'll save testing them for tomorrow I think; need to get some shut-eye... and my CPU is busy folding, which I don't wanna interrupt just to test these cards.

I'm hoping one of the two has Elpida so it'll match the HIS cards (not that it matters in reality since they'll be running at stock for normal 3K gaming use)


----------



## tsm106

Quote:


> Originally Posted by *bluedevil*
> 
> Well my RMA is underway. Hopefully I get it back soon.


Grats, that is good news, especially given the non-original-owner receipt.


----------



## HoneyBadger84

Quote:


> Originally Posted by *heroxoot*
> 
> Why would you think this?


I'm guessing cuz he's done it. Repeatedly. That's where most of TSM's opinions come from: experience.

The way CCC works is just silly though. I'd rather know what clock I'm setting on the core directly, not some silly percentage I have to click a graph to see what it actually equals... it's harder to calculate on a 290 than a 290X (as 290Xs come at 1000MHz stock if you have a reference card that wasn't factory overclocked, meaning a 1% OC is 10MHz on the core, and so on).
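The percent-to-MHz conversion described above is trivial to script. A quick sketch, using the reference stock clocks (1000MHz for the 290X, 947MHz for the 290; factory-overclocked cards will differ):

```python
# Translate CCC Overdrive's percentage slider into an absolute core clock.
def oc_clock(stock_mhz: float, percent: float) -> float:
    """Effective core clock (MHz) for a given Overdrive % offset, rounded to 2 dp."""
    return round(stock_mhz * (1 + percent / 100), 2)

print(oc_clock(1000, 1))  # reference 290X: 1% -> 1010.0 MHz
print(oc_clock(947, 5))   # reference 290:  5% -> 994.35 MHz
```

Which shows why the 290X math is easy (1% = 10MHz exactly) and the 290's is not.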


----------



## tsm106

First test I do is see how far any card will go on stock, on air then water. That gives me a baseline and an inkling of how this particular silicon will scale with cooling.


----------



## bluedevil

Quote:


> Originally Posted by *tsm106*
> 
> Grats, that is good news especially regarding the non-original owner receipt.


They really didn't seem to care...


----------



## HoneyBadger84

Quote:


> Originally Posted by *bluedevil*
> 
> They really didn't seem to care...


Was this through VisionTek or...? Can't remember what brand you said it was... either way, that's pretty awesome. I like that most card makers are becoming more relaxed; as long as the card isn't chewed up or something ridiculous, they'll generally RMA it and send you a working replacement.


----------



## bluedevil

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Was this through VisionTek or...? Can't remember what brand you said it was... either way, that's pretty awesome. I like that most card makers are becoming more relaxed, as long as the card isn't chewed up or something ridiculous, they'll generally RMA it and send you a working replacement.


Nope, card looks great. Just dead as a doornail.


----------



## pdasterly

I have a three-monitor setup and I want to switch out the center screen for a gaming monitor. Can my Sapphire 290X support it, and will Eyefinity work?


----------



## PCSarge

Quote:


> Originally Posted by *pdasterly*
> 
> I have a three monitor setup and I want to switch out the center screen for a gaming monitor, can my sapphire 290x support and will eyefinity work?


It'll be just fine. I run 3 1080p 30" screens on my single OC'd 290.


----------



## TTheuns

Quote:


> Originally Posted by *PCSarge*
> 
> itll be just fine. i run 3 1080P 30" screens on my single OC'd 290


I think he is referring to the fact that he will be running a different screen in the middle.


----------



## PCSarge

Quote:


> Originally Posted by *TTheuns*
> 
> I think he is referring to the fact that he will be running a different screen in the middle.


Shouldn't matter as long as the resolutions match up; if not, Eyefinity will downscale on the bigger-res screen to make it match.


----------



## ZealotKi11er

Quote:


> Originally Posted by *PCSarge*
> 
> shouldnt matter as long as the resolution matches up. if not eyefinity will downscale on the bigger res screen to make it match up


You can run all monitors at different resolutions. Not sure if you can game on the middle monitor only while you have an Eyefinity configuration set up.


----------



## smartdroid

@Arizonian

Can you please update my entry for 4 Sapphire R9 290 on stock cooling.

Thanks


----------



## Arizonian

Quote:


> Originally Posted by *smartdroid*
> 
> @Arizonian
> 
> Can you please update my entry for 4 Sapphire R9 290 on stock cooling.
> 
> Thanks
> 
> 
> Spoiler: Warning: Spoiler!


Sure can. Congrats - updated


----------



## Gobigorgohome

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You can run all monitors at different res. Not sure if you can game in middle monitor only while you have Eyefinity configuration setup.


You can do that with Nvidia Surround; weird if it is not working with Eyefinity then.


----------



## SmackHisFace

Hi guys, I just got my MSI R9 290 today and was wondering if there is a list of ASIC scores anywhere. I got a score of 80% and was just curious how my card stacks up to other R9 290s. TY


----------



## tsm106

^^ ASIC is not very relevant on Hawaii.

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> You can run all monitors at different res. Not sure if you can game in middle monitor only while you have Eyefinity configuration setup.
> 
> 
> 
> That you can do with Nvidia Surround, weird if it is not working with Eyefinity then.

'course you can, just set the game res to single-screen size.


----------



## LA_Kings_Fan

Hey there Arizonian ... I'm back to try this again,

RMA'd the defective PowerColor PCS+ R9 290X and got my money back in full from NewEgg ... I will always give NewEgg a thumbs-up for how they treat me and handle returns.

Anyway, ended up getting a USED Sapphire R9 290X BF4 edition reference card off Amazon for $315.00 ... so saved a few $$ for the troubles, and picked up a new EVGA SuperNova 1000 P2 PSU as well.

Now need to wait and see some reviews for the new Swiftech H220X AIO unit and debate about water cooling some more.


----------



## Arizonian

Quote:


> Originally Posted by *LA_Kings_Fan*
> 
> Hey there Arizonian ... I'm back to try this again,
> 
> RMA'd the defective PowerColor PCS+ R9 290X and got my money back in full from NewEgg ... I will always give NewEgg a thumbs-up for how they treat me and handle returns.
> 
> Anyway, ended up getting a USED Sapphire R9 290X BF4 edition reference card off Amazon for $315.00 ... so saved a few $$ for the troubles, and picked up a new EVGA SuperNova 1000 P2 PSU as well.
> 
> Now need to wait and see some reviews for the new Swiftech H220X AIO unit and debate about water cooling some more.


Sweet - congrats - added.

Saw your 1000W P2 in the EVGA club. Looks like you're set. Welcome back.


----------



## tsm106

Goes to search amazon...


----------



## Cobra Khan

Can I join the club?!?!

Got all three for $650


----------



## Ukkooh

Quote:


> Originally Posted by *Cobra Khan*
> 
> Can I join the club?!?!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Got all three for $650


I'm kind of jelly. I paid more for my 290x near launch. It cost me $740.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Cobra Khan*
> 
> Can I join the club?!?!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Got all three for $650


Good deal








Quote:


> Originally Posted by *Ukkooh*
> 
> I'm kind of jelly. I paid more for my 290x near launch. It cost me $740.


Yeah, I stopped buying on release specifically because prices are always stupid. I got all 5 of the 290Xs currently in my possession for under $1450, and 4 of the 5 are in new/like-new condition, so I'm perdy happy.


----------



## Gualichu04

Why would the GPU core speed fluctuate at load between a bit under 1000MHz and 1150MHz? I disabled ULPS in Afterburner and the registry. Power limit is at +50 and I have +44mV; core is at 1150MHz and RAM is at 1300MHz.
Also: my entry for the club says I have 2 R9 290s when I have the 290X.


----------



## Arizonian

Quote:


> Originally Posted by *Cobra Khan*
> 
> Can I join the club?!?!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Got all three for $650
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added.

Nicely priced.

_If you would pop a GPU-Z validation link in there with your OCN name showing, it would be appreciated._


----------



## HoneyBadger84

Quote:


> Originally Posted by *Gualichu04*
> 
> Why would the gpu core speed fluctuate at load a bit under 1000mhz and 1150mhz i disabled upls in afterbuern and the registry. Power limit is at +50 and i have +44mv. corei s at 1150mhz and ram is at 1300.
> Also: for my entry to the club it says i have 2 r9 290s when i have the 290x


Insufficient voltage mayhaps. Disable PowerPlay?


----------



## Gualichu04

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Insufficient voltage mayhaps. Disable PowerPlay?


Where are the PowerPlay settings? It even fluctuates at stock settings. It seems to only do it in certain games; FurMark and Heaven both hold max clock and 100% usage.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Gualichu04*
> 
> Where is the power play settings? It even fluctuates at stock settings. It seems to only do it in certain games. furmark and Heaven are both max clock and 100% usage.


In Afterburner, under the main settings tab, towards the bottom near the ULPS setting, change "with PowerPlay support" to without. I believe that's supposed to disable the card's ability to adjust clocks unless it's throttling them because of temperature issues.

Not totally sure on that though.


----------



## Gualichu04

Quote:


> Originally Posted by *HoneyBadger84*
> 
> In Afterburner under the main tab of settings, towards the bottom, near the ULPS setting, change "with Powerplay support" to without. I believe that's supposed to disable the ability of the card to adjust clocks unless its throttling them because of temperature issues.
> 
> Not totally sure on that though.


No way it's a temp issue; it's under 50C at load and idles between 33 and 38C. Just disabled PowerPlay and will see if that makes the difference.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Gualichu04*
> 
> No way its a temp issue its under 50C at load and idles between 33and 38c. Just disabled power play and will see if that makes the difference.


In my experience, varying GPU core clocks most often means the clock is either A: not stable, B: not getting quite enough voltage or power (+50% power limit recommended), C: thermally throttling, or D: ULPS or PowerPlay is making your card go ****** mode (sometimes this requires a driver reinstall to fix). Your temps rule out C. Progress. Lol


----------



## heroxoot

I'm on the default clock and it varies unless the game is pushing a 90%+ load. The best way to do it is to disable powerplay and set a 2D and 3D profile. My clocks never dip this way unless I'm pushing the thermal limits.


----------



## bluedevil

Well, since I am out of a GPU ATM, I may as well post some progress on my RMA process with VisionTek.

7/24/14 Submitted RMA form
7/25/14 Received email for invoice
7/25/14 Received email with RMA #
7/26/14 Shipped GPU
7/28/14 Set for delivery

Hopefully this will be a fast process, maybe under 2 weeks. If so, VisionTek has a customer for life!


----------



## Descadent

I forgot to do this. Can I be a part of the club?!

2x Sapphire 290x Vapor-x's powering 7680x1440


----------



## Cobra Khan

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nicely priced.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _If you would pop a GPU-Z validation link in there with OCN name in validation link showing would be appreciated._


Will do! Build should be completed in a couple months!

[Build Log]-Blue Dream


----------



## HoneyBadger84

Quote:


> Originally Posted by *bluedevil*
> 
> Well since I am out of a GPU ATM, I may as well post up some progress of my RMA process with VisionTek.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 7/24/14 Submitted RMA form
> 7/25/14 Received email for invoice
> 7/25/14 Received email with RMA #
> 7/26/14 Shipped GPU
> 7/28/14 Set for delivery
> 
> Hopefully this will be a fast process, maybe under 2 weeks. If so VisionTek has a customer for life!


Very cool, hope it goes fast & smooth for ya.

It still baffles me that I got a 290 in a 290X box from them, and the serials on the card & the box match lol


----------



## bluedevil

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Very cool, hope it goes fast & smooth for ya.
> 
> It still baffles me I got a 290 in a 290X box from them and the serial on the card & the box match
> 
> 
> 
> 
> 
> 
> 
> lol


Well, I have to say I am overall impressed with the speediness of the process. If this is any indication of what's to follow, count me in.


----------



## HoneyBadger84

Quote:


> Originally Posted by *bluedevil*
> 
> Well I have to say I am overall impressed with the speedyness of the process. If this is any indication of what to follow, count me in.


Reminds me of EVGA back when I used Nvidia, but then again I paid the $18 or whatever it was at the time for advanced RMA service with overnight/next-day shipping... so basically, I filed an RMA & got a new card within 2 days, before I even sent the old one in.

That's one of the only things Nvidia has over AMD; EVGA's RMA service is STUPID good... if only EVGA made AMD cards X_X I kid, I can't stand EVGA's forums, full of fanbois/elitists somethin' fierce. They did have great customer care when I used them, though it has gone downhill a tad from what I hear... haven't been on that side of the fence since the 7970s came out.


----------



## heroxoot

Getting a little sick of this heat. Tried 14.7: still extra heat, and BSOD. Went back to 14.4 for tests; performance is literally crap. Going to 14.6v1 now. I need to find that sweet spot. 14.6v2 seemed like it made my GPU jump up clocks for Flash, which it didn't on 14.6v1 as far as I remember; performance otherwise the same. Hopefully this is the sweet spot.

/blog

For anyone else seeing some extra heat on the beta drivers, have you put in a ticket? I have once and I think I will again.


----------



## HoneyBadger84

Quote:


> Originally Posted by *heroxoot*
> 
> Getting a little sick of this heat. Tried 14.7, still extra heat, and BSOD. Went back to 14.4 for tests. Performance is literally crap. Going to 14.6v1 now. I need to find that sweet spot. 14.6v2 seemed like it made my GPU jump up clocks for flash which it didnt on 14.6v1 as far as I remember. Performance otherwise the same. Hopefully this is the sweet spot.
> 
> /blog
> 
> For anyone else seeing some extra heat on the beta drivers, have you put in a ticket? I have once and I think I will again.


I actually haven't run the 14.4s at all since getting my 290Xs, other than very briefly to get valid 3DMark11 & 3DMark Fire Strike entries in QuadFire, as I wanted to see if I could hit the top 100 global scores... I'll have to look into that myself and see how my reference cards do temps-wise with one set of drivers vs the other (14.4s).


----------



## heroxoot

Yea, 14.6v1 is my sweet spot. My GPU isn't clocking up small numbers randomly for no reason. Just watching the desktop, it clocks up to like 325MHz and fluctuates at 0% GPU usage. Not sure why that is.


----------



## fateswarm

If heat is higher on a driver, chances are performance is higher. Though it's not a guarantee. Sometimes there are bugs or inefficiencies.


----------



## heroxoot

Quote:


> Originally Posted by *fateswarm*
> 
> If heat is higher on a driver, chances are performance is higher. Though it's not a guarantee. Sometimes there are bugs or inefficiencies.


Yes, the performance is much higher, but load heat is also almost 10C higher. It's worth it for sure, but I remember something like this happening on the 7970: they wound up fixing it so we kept most of the performance but the heat was reduced a bit. I'm more worried about load heat than idle for sure. Mostly, when I watch videos using Flash player the temp goes up to 55-60C if I turn my AC off, 53C with the AC on. 80C during games isn't a big deal IMO. My old 7970 went up to 65C tops, but that's a different beast. On 14.4 it just doesn't get that hot even if I turn my AC off.


----------



## HoneyBadger84

I'm gonna try out the 14.4 drivers on the Crossfire setup I'm currently installing (new Asus card + one of my HIS cards) & see if I notice any temperature differences.

Did you see them just in games or in benchmarks as well?


----------



## heroxoot

I got my idle temp down a little bit. Like 1c cooler than before. Every little bit counts.
Quote:


> Originally Posted by *HoneyBadger84*
> 
> I'm gonna try out the 14.4 drivers on the Crossfire setup currently installing (new Asus card + one of my HIS cards) & see if I notice any temperature differences.
> 
> Did you see them just in games or in benchmarks as well?


BF4 is a game that in fact runs better on 14.6. Keep in mind not everyone had the performance problem I did. People who were OK on 14.4 didn't see a huge gain, while people like me with problems saw a mighty increase in performance. On 14.6 I get 52.7FPS in Heaven and on 14.4 I get like 46; that's with 1040/1225 clocks. To get 48FPS on 14.4 I had to push my clocks to 1100/1225. So to see default clocks running 4.5FPS better than 1100/1225? That's incredible. The difference is huge.


----------



## shwarz

Managed to get my R9 290 up to 1200MHz on the core and scored a really decent score in 3DMark Fire Strike:
http://www.3dmark.com/3dm/3641541?

Won't run it at 1200MHz 24/7 though; too much coil whine at +200 VDDC.


----------



## ZealotKi11er

Does Trixx not support individual GPU overclocking? I am not able to control the power limit for the second card.


----------



## heroxoot

Just had the weirdest crash. Was watching Netflix on my main monitor when the second monitor went to sleep as if it had been unplugged. A few seconds passed and my main monitor went black but didn't go to sleep; I could hear Netflix playing. Then it went silent. Then the sound came back. Lastly, the main monitor went to sleep and I could hear Netflix again. Assuming that because I installed the beta over the old 14.4 WHQL, this was just a fluke. DDU'd and reinstalled it.

Anyone else ever seen that before? I mean, the GPU wasn't even clocking high enough to be under stress.


----------



## kayan

I have the same question as Zealot. Finally got both cards in my loop (will post a pic for proof later), but what is the best way to OC 2x 290X? I wanna see what these babies can do on water!


----------



## ZealotKi11er

Here are the temps for my setup after playing BF4 with Mantle. Air intake temp starts ~28C and, as the area around the computer heats up, hits ~32C. Water delta ~15C.

290: +125mV
290X: +100mV

They both end up with about the same voltage in the end.
I used ASUS GPU Tweak to OC. Going to see if it's possible to apply more than +100mV with MSI AB, as it's a lot cleaner to work with.


----------



## Raephen

Quote:


> Originally Posted by *heroxoot*
> 
> Just had the weirdest crash. Was watching netflix on my main monitor when the second monitor went to sleep as if it was unplugged. After that a few seconds passed and my main monitor went black but didnt go to sleep. Could hear netflix playing. Then it went silent. Then the sound came back. Lastly the main monitor went to sleep and I could hear netflix again. Assuming because I installed the beta over the old 14.4 WHQL, this was just a fluke. DDU'd and reinstalled it.
> 
> Anyone else ever seen that before?I mean the GPU wasn't even clocking high enough to put stress.


Actually sounds familiar, though I rarely ever do anything but game on my main rig with a 290 in it.

No, this sounds like issues I've been having with my Kabini build, so R3 graphics.

I had it yesterday while playing a video file, though MPC-HC finally recovered after a while of freezing, then black screen -> no screen, then still frame + sound, over and over.

I don't think the issue is related, but I'm pretty sure my issues are software/codec/driver related, because I CBA to reinstall Windows 8.1 when I got my new HTPC mobo + Athlon 5350; I just tried to clean my old install (FM2+, btw) of all driver-related stuff, switched mobo + APU and fired her up again.

Completely different hardware, I guess, but it sounds very like problems I've encountered on other AMD hardware...


----------



## heroxoot

Quote:


> Originally Posted by *Raephen*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Just had the weirdest crash. Was watching netflix on my main monitor when the second monitor went to sleep as if it was unplugged. After that a few seconds passed and my main monitor went black but didnt go to sleep. Could hear netflix playing. Then it went silent. Then the sound came back. Lastly the main monitor went to sleep and I could hear netflix again. Assuming because I installed the beta over the old 14.4 WHQL, this was just a fluke. DDU'd and reinstalled it.
> 
> Anyone else ever seen that before?I mean the GPU wasn't even clocking high enough to put stress.
> 
> 
> 
> Actually sounds familiar, though I rarely ever do anything but game on my main rig with a 290 in it.
> 
> No, this sounds like issues I've been having with my Kabini build - so R3 graphics.
> 
> I had it yesterday while playing a video file, though MPC-HC finally recovered after a while of freezing, then black screen -> no screen, then still frame + sound, over and over.
> 
> Don't think the issue is related, but I'm pretty sure my issues are software / codec / driver related, because I CBA to reinstall Windows 8.1 when I got my new HTPC mobo + Athlon 5350, I just tried to clean my old install (FM2+, btw) of all driver related stuff, switched mobo + APU and fired her up again.
> 
> Completely different, I guess, but it sounds very much like problems I've encountered on different AMD hardware...

No doubt this is a driver issue. I'm 100% sure it was related to my installing 14.6 over 14.4. Possibly something did not get overwritten properly.


----------



## bond32

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Does Trixx not support individual GPU overclocking? I am not able to control power limit for second card.


The only one I have had any luck with for controlling all the cards at once, without going into the settings and selecting my other cards, was GPU Tweak. I could be wrong... I have 1 290X and 2 290s - to get them all to the same clocks, even when on the same BIOS (PT1), I have to set the clocks of the 290X, then go into the AB settings, select either card 2 or 3, then set those clocks. Just a pain, that's all.

GPU Tweak, I think, set them all the same the last time I used it.


----------



## Raephen

Quote:


> Originally Posted by *heroxoot*
> 
> No doubt this is a driver issue. I'm 100% sure it was related to my installing 14.6 over 14.4. Possibly something did not get overwritten properly.


Oh aye, indubitably.

I just installed the latest CCC package for my Athlon and that was 14.x...

... Now that's odd: I wanted to check my version number, but I can't seem to find Catalyst anywhere...

I know I installed the drivers and left it be, so CCC _should_ be running.

Odd, that.


----------



## Arizonian

Quote:


> Originally Posted by *Descadent*
> 
> I forgot to do this. Can I be apart of the club ?!
> 
> 2x Sapphire 290x Vapor-x's powering 7680x1440
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *bluedevil*
> 
> Well, I have to say I am overall impressed with the speediness of the process. If this is any indication of what's to follow, count me in.


Thanks for sharing your RMA experience with VisionTek; keep us posted when you receive it. We just don't hear the good things enough - it seems people speak up louder when something goes wrong or turns out bad.


----------



## heroxoot

Quote:


> Originally Posted by *Raephen*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> No doubt this is a driver issue. I'm 100% sure it was related to my installing 14.6 over 14.4. Possibly something did not get overwritten properly.
> 
> 
> 
> Oh aye, indubitably.
> 
> I just installed the latest CCC package for my Athlon and that was 14.x...
> 
> ... Now that's odd: I wanted to check my version number, but I can't seem to find Catalyst anywhere...
> 
> I know I installed the drivers and left it be, so CCC _should_ be running.
> 
> Odd, that.

Did you install the correct version? I have accidentally installed 32bit on 64bit and that is what happens. It's an honest mistake.


----------



## naved777

Will there be any improvement in temps if I replace the stock TIM on the reference card with MX-4?


----------



## heroxoot

Quote:


> Originally Posted by *naved777*
> 
> Will there be any improvement in temps if I replace the stock TIM on the reference card with MX-4?


There was for me. It was a total mess under my heatsink. After some hardcore cleaning and case-fan adjustment I can actually hit 47C idle now, but alas, running Flash still causes a lot of heat. I've seen it get up to 57C watching a 1080p stream on Twitch.


----------



## tokoam

After finally getting things sorted with EK over the missing screw on the front plexi panel of my water block, I decided to get things tested. One top VRM seems kinda high and I would like to hear some input: I have seen VRM temp 1 hit as high as 70C during FurMark, while VRM temp 2 does not clear 45C. What? I am on the PT1 BIOS due to NF200 issues; GPU temp does not hit anything higher than 50C. Is this normal? What are you guys on water getting?


----------



## heroxoot

It happened again. During Netflix my monitors freaked out and eventually everything went to sleep. No idea why this is happening now; it only occurs while watching Netflix.


----------



## HoneyBadger84

Quote:


> Originally Posted by *heroxoot*
> 
> There was for me. It was a total mess under my heatsink. I can actually acquire 47c idle now after some hardcore cleaning and case fan adjustment, but alas, running flash causes a lot of heat still. I've seen it get up to 57c watching a stream 1080p on twitch.


I idle at 31-39C depending on ambient temps and card positioning (and how many I have installed), but my idle fan speed is set to 45%/50%, adjusting up to 50% at 36C and to 60% once they pass 40C, since if they get that warm it typically means they're being used... but then again, they are breathing off of this:



So they have good airflow to intake off of. (Note: that was taken before my CPU loop died, obviously)
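The staged fan profile described above can be sketched as a simple threshold lookup - a hypothetical illustration, with the breakpoints (45% idle floor, 50% at 36C, 60% past 40C) taken from the post and the function name invented for the example:

```python
def fan_speed_pct(temp_c: float) -> int:
    """Staged fan curve: idle floor, then step up as the card warms (hypothetical breakpoints)."""
    if temp_c < 36:
        return 45   # idle floor (the post uses 45%/50% depending on the card)
    if temp_c <= 40:
        return 50   # warming up
    return 60       # past 40C the card is probably under load
```

In practice a tool like MSI Afterburner interpolates between curve points rather than stepping, so this is only the shape of the profile, not a drop-in config.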


----------



## tonymontana95

I have had a PowerColor R9 290X PCS+ for a few months now. I overclocked it to 1130/1500 and the card is stable in every game on ultra and in every stress test... Then I decided to do some benchmarking and found out that my R9 290X gives lower results than an R9 290 at default clocks. At the default clock my card scored 1488 in Unigine Heaven, and a friend's R9 290 scored 1550.
Can you guys tell me what to do?
And btw, I have installed the latest drivers.


----------



## HoneyBadger84

Quote:


> Originally Posted by *tonymontana95*
> 
> I have had a PowerColor R9 290X PCS+ for a few months now. I overclocked it to 1130/1500 and the card is stable in every game on ultra and in every stress test... Then I decided to do some benchmarking and found out that my R9 290X gives lower results than an R9 290 at default clocks. At the default clock my card scored 1488 in Unigine Heaven, and a friend's R9 290 scored 1550.
> Can you guys tell me what to do?
> And btw, I have installed the latest drivers.


Would need full system specs on both systems to even begin to understand why that's happening, including what drivers are being used (on both systems)


----------



## Raephen

Quote:


> Originally Posted by *heroxoot*
> 
> Did you install the correct version? I have accidentally installed 32bit on 64bit and that is what happens. It's an honest mistake.


Aye, 64-bit. For good measure I just did a re-install. Funny thing: almost every package was already installed, except Microsoft Visual C++.
Quote:


> Originally Posted by *tonymontana95*
> 
> I have had a PowerColor R9 290X PCS+ for a few months now. I overclocked it to 1130/1500 and the card is stable in every game on ultra and in every stress test... Then I decided to do some benchmarking and found out that my R9 290X gives lower results than an R9 290 at default clocks. At the default clock my card scored 1488 in Unigine Heaven, and a friend's R9 290 scored 1550.
> Can you guys tell me what to do?
> And btw, I have installed the latest drivers.


I have no experience benchmarking, but I recall reading in this thread that when OCing and getting lower results than expected, it's usually due to an unstable VRAM OC. Try lowering the VRAM to stock and bench again.


----------



## tonymontana95

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Would need full system specs on both systems to even begin to understand why that's happening, including what drivers are being used.


Me: i7 4770K @ stock, R9 290X, Maximus VI Hero, Corsair RM750, Patriot Viper 1600MHz 8GB, WD Caviar Blue 1TB, Samsung 120GB EVO SSD, Noctua NH-D14, Corsair 750D

Him: i7 2600K @ 4.4, Asus P8Z77-V, Gigabyte R9 290 with Accelero Hybrid, Corsair XMS3 2x4GB, 120GB OCZ Vertex 3 SSD, WD 1TB HDD, LC Power 650W

I didn't compare my results just with him. I compared with 3-4 more people, and they all say it is too low and that they get better results with their R9 290s.


----------



## HoneyBadger84

Quote:


> Originally Posted by *tonymontana95*
> 
> Me: i7 4770k @stock,r9 290x, maximus vi hero, corsair rm750,patriot viper 1600mhz 8gb,wd caviar blue 1tb,samsung ssd 120gb evo,noctua nh-d14,corsair 750d
> 
> He: i7 2600k @4,4,asus p8z77-v,gigabyte r9 290 with accelero hybrid,corsair xms3 2x4 gb,120gb ssd ocz vertex 3, wd hdd 1tb,lc power 650w
> 
> I didnt compare my results just with him..I compared it with 3-4 more people and they all say that it is too low and that they have better result with theirs r9 290


Running the 14.7 drivers, or which? Also, what preset on the benchmark? Use Extreme, not Normal, as Normal is much more CPU-dependent. Also, do as Rae suggested: try testing with just your core OCed and the vRAM at stock.

Edit: also, does your card have Hynix or Elpida memory? Use GPU-Z to check.


----------



## tonymontana95

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Running the 14.7 drivers or which? Also what preset on the benchmark? Use Extreme, not normal, as normal is much more CPU dependent. Also do as rae suggested, try testing with just your Core OCed and the vRAM at stock.


14.4 drivers.
How do I test with just the core or the vRAM changed?
Presets:

u.png 967k .png file


----------



## HoneyBadger84

Quote:


> Originally Posted by *tonymontana95*
> 
> 14.4 drivers
> how to test just Core or vRAM?
> presets:
> 
> u.png 967k .png file


He's using the same drivers as well? There's no way he should be beating you at those settings. What program are you using to OC the card currently? Can you show a screenshot of your current GPU settings for core/vRAM?

Examples:




Ignore the folding stuff in the first one.


----------



## tonymontana95

Quote:


> Originally Posted by *HoneyBadger84*
> 
> He's using the same drivers as well? No way he should be beating you at those settings, what program are you using to OC the card currently? Can you show a screenshot of your current GPU settings for core/vRAM?
> 
> Examples:
> 
> 
> 
> 
> Ignore the folding stuff in the first one.


He has the 14.7 beta drivers and I don't want beta drivers.

For this overclock I didn't raise the voltage, and it passes every stress test and games on ultra.


----------



## kizwan

Quote:


> Originally Posted by *tokoam*
> 
> After finally getting things sorted with EK over the missing screw on the front plexi panel of my water block, I decided to get things tested. One top VRM seems kinda high and I would like to hear some input: I have seen VRM temp 1 hit as high as 70C during FurMark, while VRM temp 2 does not clear 45C. What? I am on the PT1 BIOS due to NF200 issues; GPU temp does not hit anything higher than 50C. Is this normal? What are you guys on water getting?
> 
> 
> Spoiler: Warning: Spoiler!


Don't use FurMark. Otherwise, 70C is considered OK for VRM1. Temps will be a bit high if you use the EK thermal pads; you can replace them with Phobya 7 W/mK or Fujipoly Extreme 11 W/mK pads if you want better cooling.
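For a rough sense of why the pad's W/mK rating matters: the steady-state temperature drop across a pad follows 1-D conduction, ΔT = q·t / (k·A), so it scales inversely with conductivity. A back-of-envelope sketch with entirely hypothetical numbers (30 W through a 1 mm pad over 1 cm² of VRM contact area):

```python
def pad_delta_t(power_w: float, thickness_m: float, k_w_per_mk: float, area_m2: float) -> float:
    """Steady-state temperature drop across a thermal pad (1-D conduction: dT = q*t / (k*A))."""
    return power_w * thickness_m / (k_w_per_mk * area_m2)

# Hypothetical VRM numbers: 30 W, 1 mm pad, 1 cm^2 contact area
dt_7wmk  = pad_delta_t(30, 0.001, 7,  1e-4)   # ~42.9 C drop across a 7 W/mK pad
dt_11wmk = pad_delta_t(30, 0.001, 11, 1e-4)   # ~27.3 C drop across an 11 W/mK pad
```

The absolute numbers are made up; the point is that going from 7 to 11 W/mK cuts the drop across the pad by roughly a third, which is why the pad swap shows up in VRM readings.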


----------



## HoneyBadger84

Quote:


> Originally Posted by *tonymontana95*
> 
> He has 14.7 beta drivers and I dont want beta drivers
> 
> for this overclock I didnt raise voltage and it passes every stress test and games on ultra


Try running again with memory at 1350 (stock) and see if your score goes up or down.

Also, the 14.7s include stuff from the 14.6s, which gave benchmark improvements in some applications; Heaven may have been affected.

Edit: woot woot 600th post!


----------



## tonymontana95

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Try running again with memory at 1350 (stock), see if your score goes up or down.
> 
> Also the 14.7s include stuff from the 14.6s which gave benchmark improvements in some applications, Heaven may have been effected.
> 
> Edit: woot woot 600th post!


at 1130/1350 the score is 1575
at 1130/1500 the score was 1588, and the second time it was 1612
at default clocks (1050/1350) the score was 1488

This screenshot was taken a second after the benchmark and...


----------



## HoneyBadger84

Quote:


> Originally Posted by *tonymontana95*
> 
> at 1130/1350 the score is 1575
> at 1130/1500 the score was 1588 and the second time was 1612
> at default clock (1050/1350) the score was 1488
> 
> This screenshot was taken a second after the benchmark and...


Sounds like it's the mem clock and drivers making the difference. I'll do a run when I get home (~2 hrs) on a single card with 14.4, then 14.6 (I need to update anyway) for comparison.

The fact that you can run 1130 core with no additional voltage is nice though.

Edit: but the real question is what the stock voltage on that card is. It looks like it's in the 1.2V range, which is a bit higher than mine.
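The sub-linear scaling is easy to see from the numbers in this exchange - a quick sketch (clocks and Heaven scores taken from the posts above, memory at stock 1350; `pct_gain` is just a helper name for the example):

```python
def pct_gain(new: float, old: float) -> float:
    """Percentage gain of new over old."""
    return (new / old - 1) * 100

core_oc    = pct_gain(1130, 1050)   # ~7.6% core overclock
score_gain = pct_gain(1575, 1488)   # ~5.8% Heaven score gain at stock memory
```

Core clock rarely translates 1:1 into benchmark score, so a ~6% gain from a ~7.6% overclock is in the expected range rather than a sign of a defective card.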


----------



## tonymontana95

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Sounds like it's the mem clock and drivers making the difference. I'll do a run when I get home (~2hrs)on single card 14.4, then 14.6 (I need to update anyways) for comparison.
> 
> The fact that you can run 1130 core with no additional voltage is nice though.
> 
> Edit: but the real question is what is the stock voltage on that card. Looks like it's in the 1.2 range which is higher than mine by a bit.


Post your results here.
I have contacted PowerColor and told them about the problem; now I am waiting for a response.


----------



## HoneyBadger84

Quote:


> Originally Posted by *tonymontana95*
> 
> post your results here
> I have contacted powercolor and told them the problem and now I am waiting the response


It's not really a problem tbh, I think it's your vRAM needing more voltage to run 1450+. Being able to run 1130 core at stock voltage is impressive. I would try running 1130/1350 and see if your performance is better or worse than 1130/1500. If it's better, your card memory isn't getting enough voltage to run the higher clock.

Did it turn out to be Hynix or Elpida memory?


----------



## heroxoot

This crashing problem is now happening on every driver when running Netflix. The driver just crashes randomly and causes my monitors to freak out. I woke up to find the driver had restarted, but after a few minutes of using Windows it crapped out again. I don't know what to do, but this time it was the 14.7 driver at fault.


----------



## rdr09

Quote:


> Originally Posted by *tonymontana95*
> 
> at 1130/1350 the score is 1575
> at 1130/1500 the score was 1588 and the second time was 1612
> at default clock (1050/1350) the score was 1488
> 
> This screenshot was taken a second after the benchmark and...


You might have one of those underperforming PowerColors. There is a thread here in this forum about it, and owners are being emailed a BIOS fix; some were successful. Make it easy on all of us: download the benchmark below and run it at stock to compare with other 290X owners . . .

http://www.techpowerup.com/downloads/2336/futuremark-3dmark-11-v1-0-132/

Use stretched mode.

edit: here . . .

http://www.overclock.net/t/1462592/powercolor-pcs-r9-290/910


----------



## heroxoot

Quote:


> Originally Posted by *heroxoot*
> 
> This crashing problem is now happening on every driver when running netflix. The driver just crashes randomly and causes my monitors to freak out. It happens on all drivers. I woke up to find the driver had restarted but after a few minutes using windows it crapped out again. I don't know what to do but this was the 14.7 driver at fault.


Reinstalled Silverlight and it seems fine now.
Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> There was for me. It was a total mess under my heatsink. I can actually acquire 47c idle now after some hardcore cleaning and case fan adjustment, but alas, running flash causes a lot of heat still. I've seen it get up to 57c watching a stream 1080p on twitch.
> 
> 
> 
> I idle at 31-39C depending on ambient temps and card positioning (and how many I have installed), but my idle fan speed is set to 45%/50%, adjusting up ato 50% at 36C, goes to 60% once they pass 40C since that typically means they're being used if they get that warm... but then again, they are breathing off of this:

On a single monitor, right? I'm using dual, so my clocks idle at 300/1250. When I remove my second monitor it drops down to 40C idle.


----------



## tonymontana95

Quote:


> Originally Posted by *rdr09*
> 
> you might have one of those underperforming powercolors. there is thread here in this forum about it and owners are being emailed a bios fix. some were successful. make it easy to all of us and download the benchmark below and run it at stock to compare with other 290X owners . . .
> 
> http://www.techpowerup.com/downloads/2336/futuremark-3dmark-11-v1-0-132/
> 
> use stretched mode.
> 
> edit: here . . .
> 
> http://www.overclock.net/t/1462592/powercolor-pcs-r9-290/910


I ran the benchmark only in stretched mode on the Performance preset (1280x720 - it doesn't allow me 1080p) and this is the score:
http://www.3dmark.com/3dm11/8558902
This is at 1130/1350.


----------



## HoneyBadger84

Quote:


> Originally Posted by *heroxoot*
> 
> Reinstalled the silver light driver and it seems fine now.
> On a single monitor right? I'm using dual so my clocks idle 300/1250. When I remove my second monitor it drops down to 40c for idle.


Yeah, with dual monitors I idle in the low 40s (41-45C), I think. I've been swappin' drivers & cards so often of late that I haven't plugged my second monitor back in in a bit.

Gonna run Heaven real quick in single-card mode as a comparison for this guy...


----------



## ebhsimon

Quote:


> Originally Posted by *HoneyBadger84*
> 
> It's not really a problem tbh, I think it's your vRAM needing more voltage to run 1450+. Being able to run 1130 core at stock voltage is impressive. I would try running 1130/1350 and see if your performance is better or worse than 1130/1500. If it's better, your card memory isn't getting enough voltage to run the higher clock.
> 
> Did it turn out to be Hynix or Elpida memory?


Word. I need to push in +100mV just for a 1145 core clock. With 1145/1600 I get 1841 on a Sapphire Vapor-X R9 290 in Heaven Extreme.


----------



## HoneyBadger84

Seems like your score may be normal because your CPU is stock. Here's mine (Notes below):



FPS: 61.0
Score: 1538

CPU @ 4.2GHz (3930k)
GPU @ stock (Core: 1000MHz, vRAM: 1250MHz as you can see in the image)
Drivers: 14.4 WHQL

Now, lemme swap drivers & rerun it, and we'll see if I score better (betting I will), be back in a bit...


----------



## bluedevil

Is it me, or is VisionTek using a non-reference HSF and sinks on a reference PCB? Mine on RMA is part # 900653 with an AMD reference HSF.

http://www.amazon.com/VisionTek-Products-Express-Graphics-900653/dp/B00GSQ1C5C




Maybe I will get this model back?


----------



## ebhsimon

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Seems like your score may be normal because your CPU is stock. Here's mine (Notes below):
> 
> 
> 
> FPS: 61.0
> Score: 1538
> 
> CPU @ 4.2GHz (3930k)
> GPU @ stock (Core: 1000MHz, vRAM: 1250MHz as you can see in the image)
> Drivers: 14.4 WHQL
> 
> Now, lemme swap drivers & rerun it, and we'll see if I score better (betting I will), be back in a bit...


I'll help out too. Except I'm on 14.6.


----------



## ebhsimon

CPU: i5 4670K stock (but thinking of getting the Z97 Sabertooth Mark 2, huehuehue)
GPU: R9 290 Vapor-X, core: 1000MHz, mem: 1350MHz
Driver: 14.6
Benchmark: Unigine Heaven, Extreme preset



I just noticed... HoneyBadger, you're using custom settings. Was tonymontana using a preset or 1080p 4xAA? Sigh... time to run it again...


----------



## HoneyBadger84

Quote:


> Originally Posted by *bluedevil*
> 
> Is it me or VisionTek is using a non reference HSF and sinks on a reference pcb? Mine on RMA is part # 900653 with a AMD reference HSF.
> 
> http://www.amazon.com/VisionTek-Products-Express-Graphics-900653/dp/B00GSQ1C5C
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Maybe I will get this model back?


Trust me when I say you DO NOT WANT that model. In this picture, the top card is THAT MODEL. Its cooler is HORRIBLE; you would definitely have to redo the TIM if you get one of those:



And it ran that hot in ANY configuration, even when in the case by itself.







It was the single worst experience I've ever had with an aftermarket card. The backplate on it is solid, so the back of the card not being able to breathe and only having one thermal strip on it MIGHT be part of its problem...


----------



## HoneyBadger84

Quote:


> Originally Posted by *ebhsimon*
> 
> CPU: i5 4670k stock (but thinking of getting z79 sabertooth mark 2 huehuehue)
> gpu: r9 290 vapor-x core: 1000mhz, mem: 1350mhz
> driver: 14.6
> benchmark: uningine heaven extreme preset
> 
> 
> 
> I just noticed... honey badger you're using custom settings. Was tonymontana using a preset or 1080p 4*AA? Sigh.. time to run it again...


Yep, 1080p, 4xAA is what he ran it at so that's what I ran it at, with Extreme Tess & the highest quality setting.

Here's 14.6 RC2 Drivers:



FPS: 61.7
Score: 1554

Same CPU/GPU settings as before, only in the 14.6 RC2 Drivers

Basically, it's the drivers giving him at least part of his edge. I gained 0.7 FPS JUST from drivers.
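The two runs above also show that Heaven's reported score is essentially a fixed multiple of the average FPS for a given settings combo, which makes cross-checking other people's results easy - a quick sketch using the score/FPS pairs from this post:

```python
# Heaven reports both a score and an average FPS; for one settings combo
# the ratio between them is effectively constant.
runs = [(1538, 61.0),   # 14.4 WHQL
        (1554, 61.7)]   # 14.6 RC2

ratios = [score / fps for score, fps in runs]   # both come out around 25.2
```

So if someone's posted score and FPS don't sit near the same ratio as everyone else's, their run was probably done at different settings.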


----------



## bluedevil

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Trust me when I say you DO NOT WANT that model. This picture, the top card is THAT MODEL. It's cooler is HORRIBLE, you would definitely have to redo the TIM if you get one of those:
> 
> 
> 
> And it ran that hot in ANY configuration, even when in the case by itself
> 
> 
> 
> 
> 
> 
> 
> was the single worst experience I've ever had with an aftermarket card. The backplate on it is solid, so the back of the card not being able to breathe and only having one thermal-strip on it MIGHT be part of it's problem...


I sent in a reference board/HSF, so anything is an upgrade.


----------



## ebhsimon

Exact same setup as before; I only got 1406, but with 1080p 4xAA. I'll see if it improves when I overclock within the next week or two... Another thing I noticed: did you put your fan at 100%? I put mine on auto and the max it reached was 36%. Maybe it'll go up if I put the fan speed down, but my GPU only reached like 68 degrees.


----------



## HoneyBadger84

Quote:


> Originally Posted by *bluedevil*
> 
> I sent in a reference board/hsf. So anything is a upgrade.


The two middle cards in that temperature readout... are stock reference heatsink/fan cards







The aftermarket cooler from VisionTek suuuuuuuuuuuuuuuuuuuucks, lol. It might've just been the TIM, but I didn't wanna mess with it cuz I got it new & it ran THAT hot.

Also, because of how stupidly thick the backplate makes the card, you can't stack it underneath another card, so I couldn't run it on the bottom in a QuadFire configuration to get it more airflow either.







That card... bad memories man. Bad memories.

It seriously got into the low 80s or very high 70s regardless of being in there by itself or CrossFired or what, in anything I ran on it.


----------



## tonymontana95

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Yep, 1080p, 4xAA is what he ran it at so that's what I ran it at, with Extreme Tess & the highest quality setting.
> 
> Here's 14.6 RC2 Drivers:
> 
> 
> 
> FPS: 61.7
> Score: 1554
> 
> Same CPU/GPU settings as before, only in the 14.6 RC2 Drivers
> 
> Basically, it's the drivers giving him at least part of his edge. I gained .7FPS JUST from drivers.


thanks!! I will install the latest drivers


----------



## heroxoot

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Yep, 1080p, 4xAA is what he ran it at so that's what I ran it at, with Extreme Tess & the highest quality setting.
> 
> Here's 14.6 RC2 Drivers:
> 
> 
> 
> FPS: 61.7
> Score: 1554
> 
> Same CPU/GPU settings as before, only in the 14.6 RC2 Drivers
> 
> Basically, it's the drivers giving him at least part of his edge. I gained .7FPS JUST from drivers.


Even the better GPU heatsinks have a mess of TIM under them. Cleaning mine up literally dropped me 10C at load.


----------



## HoneyBadger84

Quote:


> Originally Posted by *tonymontana95*
> 
> thanks!! I will install the latest drivers


Another thing to keep in mind is that his CPU being OCed is helping him out. It may not be a big difference, but it definitely is one. Even at 1080p with only 4xAA, the CPU does still affect the score in this benchmark, as demonstrated by how different my score is from yours: with my CPU at 4.2GHz vs yours at stock, both GPUs at stock and on the same 14.4 drivers, I got 1538 and you got 1488... that difference could well be entirely CPU (I think that equates to like, 1 FPS? lol)
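The CPU's share of that gap can be put in rough numbers - a sketch using the two stock-GPU runs on 14.4 drivers quoted in this post, and the ~25.2 score-per-FPS ratio implied by the 1538-point / 61.0 FPS run earlier:

```python
# Same GPU clocks, same 14.4 drivers; only the CPU clock differs (4.2 GHz vs stock).
score_oc_cpu    = 1538
score_stock_cpu = 1488

gap_pct = (score_oc_cpu / score_stock_cpu - 1) * 100   # ~3.4% score difference
gap_fps = (score_oc_cpu - score_stock_cpu) / 25.2      # ~2 FPS, using Heaven's score/FPS ratio
```

So even in a GPU-bound preset, a CPU overclock is worth a couple of FPS here - small, but enough to muddy card-vs-card comparisons.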


----------



## HoneyBadger84

Quote:


> Originally Posted by *heroxoot*
> 
> Even the better gpu heatsinks have a mess of TIM under them. Cleaning up mine literally dropped me 10c at load.


I want to do that to mine, but I'm paranoid I'd break something, so I haven't yet... I have a brand-new tube of MX-4 just waiting for it, too.


----------



## heroxoot

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I want to do that to mine but I'm paranoid I'd break something so I haven't yet... I have a brand new tube of MX-4 just waiting to do it too


It's just 4 screws, bro. You don't even have to unplug the fan. I just wiped it up with a dry napkin, real gentle, and applied new thermal paste. Just be careful what you use; non-conductive TIM is best.


----------



## HoneyBadger84

Quote:


> Originally Posted by *heroxoot*
> 
> It's just 4 screws bro. You don't even have to unplug the fan. I just wiped it up with a dry napkin, real gentle, and just applied new thermal paste. Just be careful what you use. Non conductive TIM is best.


Wait wait, yer telling me I only have to unscrew the four on the back of the card that are on the main GPU bracket to take it off? Are you serious? 

Pretty sure MX-4 is one of the best you can use on video cards (and it's non-conductive)


----------



## bluedevil

Quote:


> Originally Posted by *HoneyBadger84*
> 
> The two middle cards in that temperature readout... are stock reference heatsink/fan cards
> 
> 
> 
> 
> 
> 
> 
> The aftermarket from VisionTek suuuuuuuuuuuuuuuuuuuucks lol It might've just been the TIM but I didn't wanna mess with it cuz I got it new & it ran THAT hot.
> 
> Also, because of how stupid thick the backplate makes the card, you can't stack it underneath another card, so I couldn't run it on bottom in a QuadFire configuration to get it more airflow either
> 
> 
> 
> 
> 
> 
> 
> That card... bad memories man. Bad memories.
> 
> It seriously got in to the low 80s or very high 70s regardless of being in there by itself or crossfired or what, in any thing I ran on it.


Maybe, but I will Red Mod it anyways.


----------



## ebhsimon

Quote:


> Originally Posted by *ebhsimon*
> 
> Exact same setup as before, I only got 1406 but with 1080p 4xAA. I'll see if it improved when I overclock within the next week or two...


Quote:


> Originally Posted by *HoneyBadger84*
> 
> Wait wait, yer telling me I only have to unscrew the four on the back of the card that are on the main GPU bracket to take it off? Are you serious?
> 
> Pretty sure MX-4 is one of the best you can use on video cards (and it's non-conductive)


Get that Gelid GC-Extreme on there.


----------



## Raephen

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I want to do that to mine but I'm paranoid I'd break something so I haven't yet... I have a brand new tube of MX-4 just waiting to do it too


Quote:


> Originally Posted by *heroxoot*
> 
> It's just 4 screws bro. You don't even have to unplug the fan. I just wiped it up with a dry napkin, real gentle, and just applied new thermal paste. Just be careful what you use. Non conductive TIM is best.


My Sapphire ref card had 14 screws, not 4. And keep in mind there are 2 screws in the I/O shield on reference-design cards.

And MX-4 would be a good choice - it's non-conductive, not too expensive, and spreads very well.

Replacing TIM shouldn't be too difficult; just take your time, don't rush things, and you will do fine.

LOL! I'm all thumbs, but I've successfully replaced multiple stock VGA coolers in my life - my 290 is even waterblocked now. So if I can do that, I'd wager almost anyone who can hold a screwdriver could replace the TIM on a GPU.

Good luck!









EDIT: or does the Visiontek one just have 4 screws? That would save a lot of hassle...


----------



## heroxoot

Quote:


> Originally Posted by *Raephen*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HoneyBadger84*
> 
> I want to do that to mine but I'm paranoid I'd break something so I haven't yet... I have a brand new tube of MX-4 just waiting to do it too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> It's just 4 screws bro. You don't even have to unplug the fan. I just wiped it up with a dry napkin, real gentle, and just applied new thermal paste. Just be careful what you use. Non conductive TIM is best.
> 
> 
> My Sapphire ref card had 14, not 4 screws. And keep in mind there are 2 screws in the I/O shield on reference design cards.
> 
> And MX-4 would be a good choice - it's non-conductive, not too expensive and spreads very good.
> 
> Replacing TIM shouldn't be too difficult, just take your time and don't try to rush things and you will do fine.
> 
> LOL! I'm all thumbs, but I've successfully replaced multiple stock VGA coolers in my life - my 290 is even waterblocked now, so if I can do that, I'd wager almost anyone who can hold a screw driver could replace TIM on a GPU.
> 
> Good luck!

I think you're looking at it wrong broski.










You see the 4 screws in the square? That is all you need to remove to detach the heatsink. The rest are for the backplate or faceplate.


----------



## ebhsimon

Update: graphics card coolers probably fall under medium clamping force (out of light, medium and heavy): http://www.tomshardware.com/reviews/thermal-paste-performance-benchmark,3616-20.html

Gelid GC-Extreme is the best, and actually wins in the light, medium AND heavy tests. It does lose to Coollaboratory, but you're probably not going to buy that since it's conductive.

EDIT: another update: here's the list that they keep updated: http://www.tomshardware.com/charts/thermal-compound-charts/-6-GPU-Cooling,3366.html


----------



## HoneyBadger84

Yeah, that's what I mean. The 4 around the back of the GPU core area - that's it?

Does someone have a how-to guide posted on here that uses a reference card as an example? I have 3 HIS cards, 1 Asus & 1 Sapphire (the Sapphire probably needs it the most, and I can't do a proper cleaning cuz I dunno how to take the stock shroud off these things; I've heard it's complicated-ish).


----------



## JordanTr

Quote:


> Originally Posted by *ebhsimon*
> 
> Word. I need to push in +100mV just for 1145 core clock. With 1145/1600 I get 1841 on a sapphire vapor-x r9 290 on heaven extreme.


I don't need any added voltage to get 1150/1400; however, even adding +100mV it doesn't go above 1200/1500, so I keep them at stock voltage. I don't think it's worth +100mV to get an extra 50 on the core and an extra 100 on the memory.


----------



## Raephen

Quote:


> Originally Posted by *heroxoot*
> 
> I think you're looking at it wrong broski.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You see the 4 screws in the square? That is all you need to remove to dislocate the heatsink. The rest is for the back plate or faceplate.


Aye, I realized the VisionTek wasn't a ref cooler when I hit the submit button.


----------



## rdr09

Quote:


> Originally Posted by *tonymontana95*
> 
> I ran the benchmark test only in stretched mode on the performance preset (1280x720 - it doesn't allow me 1080p) and this is the score:
> http://www.3dmark.com/3dm11/8558902
> this is 1130/1350


tony, go to that thread i linked and contact Raymond the powercolor rep thru email.


----------



## tonymontana95

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Another thing to keep in mind is that his CPU being OCed is helping him out. It may not be a big difference, but it definitely is one. Even at 1080p, with only 4xAA, the CPU does still affect the score in this benchmark, as demonstrated by how different my score is from yours with my CPU at 4.2GHz vs yours at stock. Comparing numbers on the 14.4 drivers, I got 1538, you got 1488 with us both at stock & on the same drivers... that difference could well entirely be CPU (I think that equates to like, 1FPS? lol)


A couple of people told me that lack of voltage could cause it... what do you think? Should I bump the voltage, and by how much?
I don't know why, but MSI Afterburner doesn't allow me to increase or decrease the voltage. Should I try AMD OverDrive or some other software?


----------



## HoneyBadger84

Also to Tony: rdr09 is correct about PowerColor having some issues that a BIOS update can fix; you should seriously visit that thread & see if your card is one of them. It's an easy fix and does improve the performance of the card, if yours is one that's affected.
Quote:


> Originally Posted by *rdr09*
> 
> tony, go to that thread i linked and contact Raymond the powercolor rep thru email.


----------



## heroxoot

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Yeah that's what I mean. The 4 around the back of the GPU core area. That's it?
> 
> Does someone have a how-to guide posted on here that uses a reference card as an example? I have 3 HIS cards, 1 Asus & 1 Sapphire (the Sapphire probably needs it the most, and a proper cleaning I can't do cuz I dunno how to take the stock shroud off these things, I've heard it's complicatedish)




It looks the same to me. You remove the X-clamp there on the back. If that doesn't do it, then you will in fact have to remove the surrounding screws. Don't be afraid; as long as you don't lose a screw, it isn't hard. It's just like applying paste to your CPU: do it nice and even, and clean up the residue.

I don't know about you guys, but I prefer the credit-card spreading method. I get lower temps on my rig making sure it's even like that. I know many like to just do a dab and let the pressure spread it around.


----------



## HoneyBadger84

Quote:


> Originally Posted by *tonymontana95*
> 
> A couple of people told me that lack of voltage could cause it... what do you think? Should I bump the voltage, and by how much?
> I don't know why, but MSI Afterburner doesn't allow me to increase or decrease the voltage. Should I try AMD OverDrive or some other software?


You have to enable voltage control in Afterburner or it won't let you mess with it. Go to Settings; it's on the first tab under "Compatibility settings". I recommend setting it to "extended MSI" mode. Just cuz you can.


----------



## HoneyBadger84

Quote:


> Originally Posted by *heroxoot*
> 
> It looks the same to me. You remove the X clamp there on the back. If that doesn't do it then you will in fact have to remove the surrounding screws. Don't be afraid, as long as you don't lose a screw it isn't hard. It's just like applying it to your CPU. You do it nice and even, clean up the remainder.
> 
> I don't know about you guys but I prefer spreading with the credit card method. I get lower temps on my rig making sure it's even like that. I know many like to just do a dab and let the pressure spread it around.


Is there a guide I can eyeball for this? I'm really paranoid about breaking stuff I don't normally mess with







Aren't there thermal pads & stuff that I'd be messing with? Should I order some stuff to redo those as well before I do anything?


----------



## Descadent

my corsair hx1050 power supply just went POP!!! UGH!!!

any fast recommendations for a new power supply that can power two 290x vapor-x's and overclocked cpu?


----------



## HoneyBadger84

Quote:


> Originally Posted by *Descadent*
> 
> my corsair hx1050 power supply just went POP!!! UGH!!!
> 
> any fast recommendations for a new power supply that can power two 290x vapor-x's and overclocked cpu?


Impressive. The Antec HCP-1300 immediately comes to mind - one of the best PSUs ever made (according to JonnyGuru among others), and it's 80+ Platinum.


----------



## heroxoot

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> It looks the same to me. You remove the X clamp there on the back. If that doesn't do it then you will in fact have to remove the surrounding screws. Don't be afraid, as long as you don't lose a screw it isn't hard. It's just like applying it to your CPU. You do it nice and even, clean up the remainder.
> 
> I don't know about you guys but I prefer spreading with the credit card method. I get lower temps on my rig making sure it's even like that. I know many like to just do a dab and let the pressure spread it around.
> 
> 
> 
> Is there a guide I can eyeball for this? I'm really paranoid about breaking stuff I don't normally mess with
> 
> 
> 
> 
> 
> 
> 
> Aren't there thermal pads & stuff that I'd be messing with? Should I order some stuff to redo those as well before I do anything?

I'd love to find you a video guide, but my ISP is currently having issues with YouTube. It's really simple though, like I said: you remove the four screws, change the thermal paste, and reapply the heatsink. Once the screws come out, nothing will be holding it on and it will be free of the PCB. Simple as that.


----------



## Raephen

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Yeah that's what I mean. The 4 around the back of the GPU core area. That's it?
> 
> Does someone have a how-to guide posted on here that uses a reference card as an example? I have 3 HIS cards, 1 Asus & 1 Sapphire (the Sapphire probably needs it the most, and a proper cleaning I can't do cuz I dunno how to take the stock shroud off these things, I've heard it's complicatedish)


Removing the plastic shroud is easy. Just 6 small screws - 3 on either side of the shroud.




The tricky part, I imagine, would be the aluminium heat-sink array and the copper heat-plate. Modding those for a red mod would take more than just a screw driver. A dremel might do the trick, but you'd have to cut a lot of metal.


----------



## tonymontana95

Quote:


> Originally Posted by *HoneyBadger84*
> 
> You have to enable voltage control in Afterburner or it won't let you mess with it. Go to Settings; it's on the first tab under "Compatibility settings". I recommend setting it to "extended MSI" mode. Just cuz you can.


I did it and still I cant change the voltage


----------



## HoneyBadger84

Quote:


> Originally Posted by *tonymontana95*
> 
> I did it and still I cant change the voltage


Try Sapphire TRIXX (google it). If you install it, make sure that Afterburner & Trixx aren't running at the same time if they're both set to enable overclocking, as they can get each other quite agitated.


----------



## heroxoot

Quote:


> Originally Posted by *tonymontana95*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HoneyBadger84*
> 
> You have to enable voltage control in Afterburner or it won't let you mess with it. Go to Settings; it's on the first tab under "Compatibility settings". I recommend setting it to "extended MSI" mode. Just cuz you can.
> 
> 
> 
> I did it and still I cant change the voltage

Give this a shot.

Code:


Add the following below [Settings]:

VDDC_IR3567B_Detection = 32h
VDDC_IR3567B_Output = 0
MVDDC_IR3567B_Detection = 32h
MVDDC_IR3567B_Output = 1

This unlocked voltage on a 280X I had for a day.


----------



## Raephen

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Is there a guide I can eyeball for this? I'm really paranoid about breaking stuff I don't normally mess with
> 
> 
> 
> 
> 
> 
> 
> Aren't there thermal pads & stuff that I'd be messing with? Should I order some stuff to redo those as well before I do anything?


I don't know about the Visiontek, but for ref coolers, it might be a good idea to have some extra padding as back-up in case a pad rips. You don't have to go over the top on those - the cheaper Phobya 7 or 11 W/mK pads should do just nicely. Eyeballing the stock heat-pads, I'd say 1 mm thick would suffice.


----------



## Dasboogieman

Quote:


> Originally Posted by *Descadent*
> 
> my corsair hx1050 power supply just went POP!!! UGH!!!
> 
> any fast recommendations for a new power supply that can power two 290x vapor-x's and overclocked cpu?


Really? That PSU is supposed to be fairly decent.

Anyway, a couple come to mind depending on your budget. The EVGA 1300W G2 is definitely one of the most cost-effective, if you can live with a fan that's a tad audible. The Antec HCP 1300W is also really solid since it's a Delta design. The Seasonic X-1250 is also fairly good in efficiency and voltage regulation. Basically, of these three, get the one that's most affordable; they'll all do well.

If you have a taste for luxury and want the very best then get the Corsair AX1500i Titanium lol.
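For rough sizing, a back-of-envelope sum of worst-case draws makes those 1250-1300W recommendations concrete. The wattage figures below are ballpark assumptions for illustration (an overclocked 290X is often estimated around 300W), not measured values:

```python
# Back-of-envelope PSU sizing for two R9 290X cards plus an overclocked CPU.
# All wattages are rough assumptions, not measurements.
GPU_LOAD_W = 300   # assumed worst-case draw per overclocked 290X
CPU_LOAD_W = 250   # assumed heavily overclocked CPU under load
REST_W = 100       # assumed motherboard, RAM, drives, fans

total_w = 2 * GPU_LOAD_W + CPU_LOAD_W + REST_W   # 950 W estimated peak
for psu_w in (1050, 1250, 1300):
    headroom = 1 - total_w / psu_w
    print(f"{psu_w} W PSU -> {headroom:.0%} headroom")
```

By this estimate the old HX1050 was running near its limit under full load, which is one reason the 1250-1300W units get suggested for a CrossFire 290X rig.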


----------



## tonymontana95

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Try Sapphire TRIXX (google it). If you install it, make sure that Afterburner & Trixx aren't running at the same time if they're both set to enable overclocking, as they can get each other quite agitated.


These are the default settings... by how much should I bump the voltage for 1130/1500?


----------



## heroxoot

Quote:


> Originally Posted by *tonymontana95*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HoneyBadger84*
> 
> Try Sapphire TRIXX (google it). If you install it, make sure that Afterburner & Trixx aren't running at the same time if they're both set to enable overclocking, as they can get each other quite agitated.
> 
> 
> 
> these are default settings..by how much should I bump the voltage for 1130/1500

The offset works just like MSI AB. Every bump is 1 mV, and you have to mess with it to find what your card can do. Start by bumping the core clock up 10 MHz. Test it. Is it stable? Bump it more. Do this until it's unstable and artifacts/crashes, then start adding some voltage. You can't just pick a specific clock speed and expect it to work.
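The bump-test-repeat routine above can be sketched as a loop. Everything here is illustrative: `is_stable` stands in for actually running a benchmark and watching for artifacts or crashes, and the toy stability model inside it is invented, not a property of any real card:

```python
# Sketch of the incremental overclocking routine described above.
CORE_STEP_MHZ = 10     # bump the core in small steps
VOLT_STEP_MV = 25      # add a little voltage when unstable
MAX_OFFSET_MV = 100    # stop when out of voltage headroom

def is_stable(core_mhz, offset_mv):
    # Placeholder for "run a benchmark, watch for artifacts/crashes".
    # Toy model: each +25 mV buys roughly +15 MHz past a 1000 MHz baseline.
    return core_mhz <= 1000 + 0.6 * offset_mv

def find_max_clock(start_mhz=1000):
    core, offset = start_mhz, 0
    while True:
        if is_stable(core + CORE_STEP_MHZ, offset):
            core += CORE_STEP_MHZ      # stable: keep pushing the clock
        elif offset + VOLT_STEP_MV <= MAX_OFFSET_MV:
            offset += VOLT_STEP_MV     # unstable: add a little voltage
        else:
            return core, offset        # no headroom left: settle here
```

In practice you are the `is_stable` function: run Valley or a game for a while at each step and only treat the setting as stable if nothing artifacts or crashes.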


----------



## HoneyBadger84

Quote:


> Originally Posted by *tonymontana95*
> 
> these are default settings..by how much should I bump the voltage for 1130/1500


That really depends on the card. Some of mine I had to push quite a bit to get the core to 1150, others not so much. If you're not getting any artifacts in games at those speeds like you said, I'd say you don't need to add that much...


----------



## HoneyBadger84

Quote:


> Originally Posted by *Raephen*
> 
> I don't know about the Visiontek, but for ref coolers, it might be a good idea to have some extra padding in back-up in case it rips. Don't have to go over the top on those - cheaper Phobya 7 or 11 W/Mk should do just nicely. Eyeballing the stock heat-pads, I'd say 1mm thick would suffice.


But do I HAVE to redo the thermal pads or can I just do the GPU core? Is it optional, a good idea, required, unnecessary? I've never done this before


----------



## heroxoot

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Raephen*
> 
> I don't know about the Visiontek, but for ref coolers, it might be a good idea to have some extra padding in back-up in case it rips. Don't have to go over the top on those - cheaper Phobya 7 or 11 W/Mk should do just nicely. Eyeballing the stock heat-pads, I'd say 1mm thick would suffice.
> 
> 
> 
> But do I HAVE to redo the thermal pads or can I just do the GPU core? Is it optional, a good idea, required, unnecessary? I've never done this before

No. Usually there are pads for the RAM and VRMs. Those are a special material; leave them be. You are only changing the TIM between the GPU and the heatsink, nothing more.


----------



## Raephen

Quote:


> Originally Posted by *HoneyBadger84*
> 
> But do I HAVE to redo the thermal pads or can I just do the GPU core? Is it optional, a good idea, required, unnecessary? I've never done this before


If you gently remove the ref cooler and take care not to damage the stock padding, then I'd say no.

Would I recommend having some on hand in case the VRM1 strip gets messed up? Yes, I would. The Phobya stuff is pretty pliable (more so than others I've seen); I'd wager 1.5 mm would be great too and would mold itself around the VRM1 better than the stock padding.


----------



## HoneyBadger84

Quote:


> Originally Posted by *heroxoot*
> 
> No. Usually there are pads for the ram and VRM. Those are a special material leave them be. You are only changing the TIM between the GPU and the heatsink. Nothing more.


Okay... I'll prolly try it out on the Sapphire since if I DO break it at least I'm only out $275







lol


----------



## HoneyBadger84

Sooo I removed the four screws we're talkin' about and this thing won't budge... I'm assuming I have to take off all the black screws to make the entire shroud let go?


----------



## bluedevil

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Sooo I removed the four screws we're talkin' about and this thing won't budge... I'm assuming I have to take off all the black screws to make the entire shroud let go?


On your VisionTek?


----------



## HoneyBadger84

Quote:


> Originally Posted by *bluedevil*
> 
> On your VisionTek?


This is escalating quickly. The card is now naked... and still attached to its heatsink. What. The. CRAP. lol





HALP! LOL


----------



## tsm106

Why did you remove the shroud with the sink still on it?


----------



## bluedevil

Quote:


> Originally Posted by *HoneyBadger84*
> 
> This is escalating quickly. The card is now naked... and still attached to it's heatsink. What. The. CRAP. lol
> 
> 
> 
> 
> 
> HALP! LOL


Are you trying to take off the entire heatsink and fan? If so, all the tiny screws and the 4-screw GPU hold-down need to come off. Also don't forget the 4 screws that bolt the PCI bracket to the HSF. Then a slight wiggle gets the whole assembly off.


----------



## HoneyBadger84

Quote:


> Originally Posted by *tsm106*
> 
> Why you remove the shroud with the sink still on it?


I thought it would help... should I put it back on?


----------



## HoneyBadger84

Quote:


> Originally Posted by *bluedevil*
> 
> Are you trying to take off the entire heatsink and fan? If so all the tiny screws and the 4 screw GPU holddown need to come off. Then slight wiggle to get the whole assembly off.


I'm trying to redo the thermal paste on the GPU core







Obviously this is a lot more complicated than others alleged X_X


----------



## bluedevil

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I'm trying to redo the thermal paste on the GPU core
> 
> 
> 
> 
> 
> 
> 
> Obviously this is a lot more complicated than others alleged X_X


Yeah its all the screws on the back + the 4 on the PCI bracket.


----------



## tsm106

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Why you remove the shroud with the sink still on it?
> 
> 
> 
> I thought it would help... should I put it back on?

Yea, put it back on, but don't put any of the screws on the backside of the PCB back in. You also need to remove the screws on the vent plate. Then, holding the shroud, twist just a hair and separate the cooler from the PCB.


----------



## HoneyBadger84

Quote:


> Originally Posted by *bluedevil*
> 
> Yeah its all the screws on the back + the 4 on the PCI bracket.


I pictured the screws & the back so y'all could see how much I've removed... there are zero screws left. Am I not wiggling it enough?
Quote:


> Originally Posted by *tsm106*
> 
> Yea put it back on. Don't put any screws on the backside of the pcb back on though. You also need to remove the screws on the vent plate. Then holding the shroud, twist just a hair and the separate the cooler from the pcb.


doin' that now.


----------



## kayan

w00tz! So, I got my 2nd card working in my loop, after a leak earlier this week (boo). Here's a screenie of Heaven Valley:



And here's a screenie of CPU-Z, CoreTemp, GPU-Z x2....The left GPU-Z is my top (1st card), right is bottom card (2nd):



Good temps, I think on everything. That was after 2 Benchmark runs of Valley. The cards at idle are as follows:

Top = 35C, VRM1 = 33, VRM2 = 32.
Bottom = 35C, VRM1 = 32, VRM2 = 31.

GPUs are at 1000/1250. No OC changes at all.


----------



## tsm106

Just crack it open. Then screw the cooler screws back into said cooler so u dont lose them!


----------



## skupples

Quote:


> Originally Posted by *Raephen*
> 
> If you gently remove the ref cooler, take care not to damage the stock padding, then I'd say no.
> 
> Would I recommend having some in case the VRM1 strip gets messed up? Yes, I would, and I know the Phobya stuff is pretty pliable (more so than others I've seen), I'd wager 1,5 mm would be awesome too and mold itself better around the VRM1 better than the stock padding.


Phobya thermal pads are quite overpriced compared to even mid-range Fujipoly, especially if you buy said Fujipoly from Performance-PCs, where you can get a whole sheet for only 2x the price of a tiny square of Phobya. Phobya is one of the worst companies in watercooling & they should be avoided like the plague. Don't believe me?



The reservoir's metal caps were made out of pewter, or some other garbage metal, and it destroyed everything in the loop.


----------



## HoneyBadger84

Quote:


> Originally Posted by *tsm106*
> 
> Just crack it open. Then screw the cooler screws back into said cooler so u dont lose them!


I won't lose 'em; despite the pile-esque look in this picture, they're organized:



This thing won't come apart, guess I'll apply a bit more pressure.


----------



## HoneyBadger84

SUCCESS... but some of the pads came off... I can just put them back, right?


----------



## tsm106

If it won't come off, put it back together and stick it in your rig. Now heat it up, then remove and crack it open.

lol I'm so slow posting on my phone.


----------



## bluedevil

Thinking of getting a different 290 now that I am realizing how much crap the Red mod actually is. How's the Gigabyte Windforce R9 290?
http://www.newegg.com/Product/Product.aspx?Item=N82E16814125505


----------



## HoneyBadger84

Quote:


> Originally Posted by *bluedevil*
> 
> Thinking of getting a different 290 now that I am realizing how much crap the Red mod actually is. Hows the Gigabyte Windforce R9 290?
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814125505


Decent. I had one. High 60s low 70s load temps...

How do I clean off all the paste around the edge of the GPU cap? Or should I bother with that?


----------



## motokill36

Anyone still getting the memory leak in BF4? I get a "memory low" warning after 20 mins or so.


----------



## tsm106

Alcohol + swabs, or electro spray cleaner.


----------



## HoneyBadger84

Quote:


> Originally Posted by *tsm106*
> 
> Alcohol + swabs, or electro spray cleaner.


That's what I'm using. I finally used a flat-tip screwdriver to gently get it off. Should I be worried about getting the GPU lid spotless, or just make sure there's no Q-tip fuzz left and re-paste it?


----------



## tsm106

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Alcohol + swabs, or electro spray cleaner.
> 
> 
> 
> That's what I'm using. I finally used a flat tip computer screw driver to gently get it off. Should I be worried about getting the GPU lid spotless or just make sure there's no Q-tip fuzz left and re-paste it?

Depends on your level of OCD. Personally I use a spray cleaner to remove everything. The less material sitting around, the less residual heat there is on the die, I say.


----------



## Sgt Bilko

Quote:


> Originally Posted by *tsm106*
> 
> Depends on your level of ocd. Personally I use a spray cleaner to remove everything. The less material sitting around the less latent heat there is on the die, I say.


I used some alcohol and a toothbrush then some compressed air afterwards to blow it all off


----------



## Raephen

Quote:


> Originally Posted by *skupples*
> 
> Phobya thermal pads are quite overpriced compared to even mid-range Fujipoly, especially if you buy said Fujipoly from Performance-PCs, where you can get a whole sheet for only 2x the price of a tiny square of Phobya. Phobya is one of the worst companies in watercooling & they should be avoided like the plague. Don't believe me?
> 
> 
> 
> The reservoir's metal caps were made out of pewter, or some other garbage metal, and it destroyed everything in the loop.


Too bad about your Phobya res.

And I'm loving my Phobya padding. Ordering from Performance-PCs would have added to the cost for me, seeing as I'm in Europe, and here in the NL where I bought my stuff it was cheaper.

It works fine for me.
Quote:


> Originally Posted by *HoneyBadger84*
> 
> SUCCESS... but some of the pads came off... I can just put them back right?


Good job!


----------



## Roboyto

Quote:


> Originally Posted by *bluedevil*
> 
> Thinking of getting a different 290 now that I am realizing how much crap the Red mod actually is. Hows the Gigabyte Windforce R9 290?
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814125505


It may require a little work, but a Kraken G10, a cheap AIO, and the Gelid heatsink kit will likely cool better than any manufacturer's air-cooled card. Even if temperatures were similar, you at least guarantee yourself quiet operation, and you're exhausting all the core heat out of the case rather than spewing it all over the inside.

Good info here:
http://www.overclock.net/t/1503301/msi-r9-290-kraken-g10-questions/0_20

My red mod configuration with upgraded vrm1 fan and thermal pad:
http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures/60#post_22599798


----------



## HoneyBadger84

Well that was terrifying. She's back together, pretty sure I put way too much but it kinda came out in a goop because it was the first time I've used the tube... Imma shut down & plug'er in and hope it doesn't go "hiiiisssss pop" lol


----------



## bluedevil

Quote:


> Originally Posted by *Roboyto*
> 
> It may require a little work, but a Kraken G10, a cheap AIO, and the Gelid heatsink kit will likely cool better than any manufacturers air cooled card. Even if temperatures were similar you can at least guarantee yourself quiet operation and the fact you're exhausting all the core heat out rather than spewing it all over inside your case.
> 
> Good info here:
> http://www.overclock.net/t/1503301/msi-r9-290-kraken-g10-questions/0_20
> 
> My red mod configuration with upgraded vrm1 fan and thermal pad:
> http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures/60#post_22599798


I don't need all that extra stuff. The CM 120M/240M/120XL all bolt up without the use of "adapter" plates. Then take an Antec Spot Cool for the VRMs and some small heatsinks, and I am set. I think I am just a little gun-shy in that I don't want to over-tighten and bend the card.


----------



## Roboyto

Quote:


> Originally Posted by *bluedevil*
> 
> I don't need all that extra stuff. The CM 120M/240M/120XL all bolt up without the use of "adapter" plates. Then take an Antec Spot cool for VRMs and some small heatsinks and I am set. I think I am just a little gunshy is that I dont want to over tighten and bend the card.


Those CM coolers are also still using that horrible hard corrugated hose. You're asking for a leak with that junk. If you're going to use one of those Cooler Master coolers, I'd definitely suggest replacing the hose. I took a broken H100 and an H55 pump, used cheap hose with Koolance spring clamps, and it works like a champ.

The Gelid heatsink kit works pretty darn well for VRM1 with the pad supplied, and someone is selling them on eBay now for $15 with free shipping, so you don't have to wait for shipping from China.


----------



## Germanian

Quote:


> Originally Posted by *motokill36*
> 
> Any one still getting mem leak on BF4 get mem low warning after 20 mins or so f


Yes, I do, after around 1 hour. I've got 16GB of RAM; after an hour I see almost 12GB of RAM usage and then BF4 crashes. It only started happening after the latest Dragon patch.

I am using Mantle, btw. 4770K with 2x R9 290.


----------



## HoneyBadger84

Forgot to plug the GPU fan back in so I had to take a few screws back off & get that done... installin' drivers now then I'll pop off a benchmark & check temps... it booted & it's not artifacting, that's good right?







lol


----------



## bluedevil

Quote:


> Originally Posted by *Roboyto*
> 
> Those CM coolers are also still using that horrible hard corrugated hose. You're asking for a leak with that junk. If you're going to use one of those cooler master coolers I'd definitely suggest replacing the hose. I took broken H100, H55 pump,used cheap hose with koolance spring clamps and it works like a champ.
> 
> The Gelid heatsink kit works pretty darn good for VRM1 with the pad supplied, and someone is selling them on eBay now for $15 free ship so you don't have to wait for shipping from China.


With my rig being down due to the 290 RMA, I might just replace the tubes now since I am on vacation this week.









Also here is the VisionTek model that I might get back.


----------



## Roboyto

Quote:


> Originally Posted by *bluedevil*
> 
> With my rig being down due to the 290 RMA, I might just replace the tubes now since I am on vacation this week.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also here is the VisionTek model that I might get back.


I'm pretty sure it's the same as the H100 tubing; may want to double check. If that's the case Performance-PCs had the hose/clamps for cheap. See here:

http://www.overclock.net/t/1478544/the-build-formerly-known-as-the-devil-inside-a-gaming-htpc-for-my-wife-4770k-corsair-250d-powercolor-r9-290/20_20#post_22182993

Can't comment on the 290, but my buddy just got a 280X Visiontek and the stock cooler surprised me. Very quiet and efficient even with a large OC and some additional voltage.


----------



## HoneyBadger84

Quote:


> Originally Posted by *bluedevil*
> 
> With my rig being down due to the 290 RMA, I might just replace the tubes now since I am on vacation this week.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also here is the VisionTek model that I might get back.


Yeah, VRM temps are still ridiculous on it. Those heatsinks don't do a good job... I think I have a screenshot of it somewhere...

Nope.

Anyway, it has a thermal limit of 85C set in the BIOS that you can't raise in CCC, and it loads up to about 82C... which is enough for it to start throttling. Maybe you'll have better luck after redoing the TIM on it. If you do get one, I hope it works better for you than it did for me.









Well, it looks like this redo of TIM I did sucked... or it could just be cuz I'm on OCN. I'mma let it idle for a bit, then run a benchmark & see if it gets hotter or not. If it does, I'm going to assume I have another date with screwdrivers tomorrow to redo it AGAIN and use less goop this time.


----------



## HoneyBadger84

Well the good and bad news is, temps are the same... pretty sure I used too much. Oh well, I'll take it apart again tomorrow... maybe.

At least the VRMs are still running cool. I was a bit concerned - it looked like some of the strip had come off on the main line, on the VRMs themselves - but I guess that didn't matter too much. Idling down now & they're sitting at 28/34 where they were 26/32 before running the bench test, so it should be fine... Load temp was still 64C, same as before (again, I run the fan at 100%). Ambient is a bit warmer than yesterday, but not by much, so at most I gained 1C or so out of it... basically a waste of two hours, other than the learning experience.


----------



## bluedevil

Quote:


> Originally Posted by *Roboyto*
> 
> I'm pretty sure it's the same as the H100 tubing; may want to double check. If that's the case Performance-PCs had the hose/clamps for cheap. See here:
> 
> http://www.overclock.net/t/1478544/the-build-formerly-known-as-the-devil-inside-a-gaming-htpc-for-my-wife-4770k-corsair-250d-powercolor-r9-290/20_20#post_22182993
> 
> Can't comment on the 290, but my buddy just got a 280X Visiontek and the stock cooler surprised me. Very quiet and efficient even with a large OC and some additional voltage.


Interesting replacement of the tubes. How did you refill it?


----------



## tsm106

^^if you open a CLC, you might as well run a real loop then?
Quote:


> Originally Posted by *bluedevil*
> 
> Thinking of getting a different 290 now that I am realizing *how much crap the Red mod actually is*. Hows the Gigabyte Windforce R9 290?
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814125505


Concur. lol, I think I posted the first GPU-only setup on here. It was a temporary setup: an MCW82 and tape-on sinks. Benched Valley at 1250/1500 and it got so hot that the mem sinks started falling off because the thermal tape got gooey. I keep the MCW82 for rainy days, especially for those release-day cards. Now an MCW82 sure as heck beats a CLC, but it still is a pain all around. You're still aircooling the VRMs and mem at the end of the day.


----------



## HoneyBadger84

Okay, apparently I'm full of it on temps; it's running cooler by a bit. The ASUS card ran at 64C max during 3DMark11 and the Sapphire was 2-3C hotter, so it's 2-3C cooler now, doing 64C max during that, and it only hit 60C during the tail end of FireStrike's graphics tests... gonna run Valley on it to see if it stays under 70C, since I'm pretty sure that's what it hit before.


----------



## bluedevil

Quote:


> Originally Posted by *tsm106*
> 
> ^^if you open a CLC, you might as well run a real loop then?
> Concur. lol, I think I posted the first gpu only setup on here. It was a temporary setup mcw82 and tape on sinks. Benched Valley at 1250/1500 and it got so hot that the mem sinks started falling off because the thermal tape got gooey. I keep the mcw82 for rainy days, especially for those release day cards. Now a mcw82 sure as heck beats a clc, but it still is a pain all around. You're still aircooling the vrms and mem at the end of the day.


Funny you should mention that. I think the reason most people get a CLC over custom is the cost. Let's face it, an MCW82 is $55 on a good day. Now factor in a res, rad, pump... there you have it... too expensive.


----------



## HoneyBadger84

69C vs 71-72C before. Not bad eh? Still think I used too much goop. Oh well


----------



## the9quad

Watercooling is really expensive. I keep looking into it, but I can't afford to drop another grand-plus into this just for watercooling.


----------



## Roboyto

Quote:


> Originally Posted by *bluedevil*
> 
> Interesting replacement of the tubes. How did you refill it?


Filled the radiator with the tubes attached, right to the tip-top. Put some fluid in the pump and attached the hoses. It's as full as any of my AIOs that haven't been opened up.
Quote:


> Originally Posted by *tsm106*
> 
> ^^if you open a CLC, you might as well run a real loop then?
> Concur. lol, I think I posted the first gpu only setup on here. It was a temporary setup mcw82 and tape on sinks. Benched Valley at 1250/1500 and it got so hot that the mem sinks started falling off because the thermal tape got gooey. I keep the mcw82 for rainy days, especially for those release day cards. Now a mcw82 sure as heck beats a clc, but it still is a pain all around. You're still aircooling the vrms and mem at the end of the day.


Last time I checked, thermal tape has a small fraction of the thermal conductance of a good pad. I searched for quite a bit, and the best thermal tape I could find was JunPus T180, rated at 1.8 W/mK. Compare this to Fujipoly Ultra Extreme at 17 W/mK... nearly 10X the thermal conductance rating. I wonder why the VRMs overheated with thermal tape on them?
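For anyone who wants to put numbers on that gap, Fourier's law (Q = k·A·ΔT/t) gives a rough sketch. The contact area, layer thickness, and temperature drop below are illustrative assumptions, not measurements from any card:

```python
# Rough comparison of heat conducted through a thermal interface,
# using Fourier's law: Q = k * A * dT / t.
# The pad area, thickness, and temperature drop below are assumed
# illustrative values, not measured from any specific card.

def heat_flow_w(k_w_per_mk: float, area_m2: float,
                delta_t_c: float, thickness_m: float) -> float:
    """Steady-state conduction through a flat interface layer, in watts."""
    return k_w_per_mk * area_m2 * delta_t_c / thickness_m

AREA = 10e-3 * 10e-3   # assume a 10 mm x 10 mm VRM contact patch
THICKNESS = 1e-3       # assume a 1 mm thick interface layer
DELTA_T = 20.0         # assume a 20 C drop across the interface

tape = heat_flow_w(1.8, AREA, DELTA_T, THICKNESS)   # JunPus T180 tape
pad = heat_flow_w(17.0, AREA, DELTA_T, THICKNESS)   # Fujipoly Ultra Extreme pad

print(f"tape: {tape:.1f} W, pad: {pad:.1f} W, ratio: {pad / tape:.1f}x")
```

At the same geometry the heat moved scales linearly with k, so the pad carries roughly 9.4x what the tape does; the absolute watt figures depend entirely on the assumed dimensions.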









The cost of a CM Seidon 120, some heatsinks, a fan for VRM1, thermal pad, and tape will be a fraction of the cost of running a full loop, especially if you're starting the full loop from scratch.

Quote:


> Originally Posted by *the9quad*
> 
> watercooling is really expensive, I keep looking into it, but cant afford to drop another grand plus into this just for watercooling.


Yes, I am aircooling the VRMs. And with the way I have it set up, VRM1 doesn't exceed 50C after running the FFXIV benchmark looped for an hour. Yes, this is at stock clocks, but show me any air-cooled card that is capable of keeping VRM1 at, or even near, this temperature under that amount of stress... oh, and it's in a mini-ITX 250D and is absolutely silent. I still need to test with an overclock/overvolt, but I'll have that information sooner or later.


----------



## skupples

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Well that was terrifying. She's back together, pretty sure I put way too much but it kinda came out in a goop because it was the first time I've used the tube... Imma shut down & plug'er in and hope it doesn't go "hiiiisssss pop" lol


Eh, you'll get used to it. You'll have it torn down & back together in half the time, next time.


----------



## tsm106

Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> ^^if you open a CLC, you might as well run a real loop then?
> Concur. lol, I think I posted the first gpu only setup on here. It was a temporary setup mcw82 and tape on sinks. Benched Valley at 1250/1500 and it got so hot that the mem sinks started falling off because the thermal tape got gooey. I keep the mcw82 for rainy days, especially for those release day cards. Now a mcw82 sure as heck beats a clc, but it still is a pain all around. You're still aircooling the vrms and mem at the end of the day.
> 
> 
> 
> Last time I checked, thermal tape has a small fraction of the thermal conductance of a good pad. I searched for quite a bit, and the best thermal tape I could find was JunPus T180, rated at 1.8 W/mK. Compare this to Fujipoly Ultra Extreme at 17 W/mK... nearly 10X the thermal conductance rating. *I wonder why the VRMs overheated with thermal tape on them?*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The cost of a CM Seidon 120, some heatsinks, a fan for VRM1, thermal pad and tape will be a fraction of the cost of running full loop; especially if you're starting the full loop from scratch.
Click to expand...

Stop replying to me if you are going to imply stupid crap you just made up.


----------



## Decoman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Does Trixx not support individual GPU overclocking? I am not able to control power limit for second card.


I vaguely recall that being buggy when I had my 2x 7950s. (Or I lack the know-how to explain how Trixx works for crossfire setups.)
I guess you have already noticed that there is a drop-down option in Trixx for selecting the other card, to tweak the values.


----------



## HoneyBadger84

Quote:


> Originally Posted by *skupples*
> 
> Eh, you'll get used to it. You'll have it torn down & back together in half the time, next time.


Mayhaps. I may try it again tomorrow while I'm testing out QuadFire once I get it installed (the HCP 850W will be here tomorrow, which means it's GO TIME, as I'll have access to TWELVE PCI-E cords & 2050W of powa) on one of the cards I won't be using for it.

Plan is 3 HIS + the Asus card. Since I won't be taking the Sapphire apart again, cuz I don't wanna clean up whatever mess I made with too much goop, that leaves the brand new VisionTek R9 290 I got in a 290X box (thanks again, VisionTek & the stupid seller that didn't check his product before listing it on eBay)... doubt I'll do it to that one simply because it's new, but maybe.

Where would one go to get good-quality thermal pads to replace the stock ones, if I wanted to do that next time I take one apart? It seemed like the stock thermal padding was pretty shady... it was like mint green, what's up with that?


----------



## tsm106

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *skupples*
> 
> eh, you will get use to it. You will have it torn down & back together in half the time, next tyime.
> 
> 
> 
> Mayhaps. I may try it again tomorrow while I'm testing out QuadFire once I get it installed (the HCP 850W will be here tomorrow, which means it's GO TIME, as I'll have access to TWELVE PCI-E cords & 2050W of powa) on one of the cards I won't be using for it.
> 
> Plan is 3 HIS + the Asus card. Since I won't be taking the Sapphire apart again cuz I don't wanna cleanup whatever mess I made with too much goop, that leaves the brand new VisionTek R9 290 I got in a 290X box (thanks again VisionTek & stupid seller that didn't check his product before listing it on EBay)... doubt I'll do it to that one simply because it's new, but maybe.
> 
> Where would one go about getting good quality Thermal Pads to replace the stock ones if I wanted to do that next time I take one apart? it seemed like that stock thermal padding was pretty shady... it was like mint green, what's up with that?
Click to expand...

Why do you want to replace the stock ones? Wait, are you going to run quad on air? That's sort of insane, man.


----------



## HoneyBadger84

Quote:


> Originally Posted by *tsm106*
> 
> Why do you want to replace the stock ones? Wait, are you going to run quad on air? That's sort of insane, man.


Do the stock ones not really need replacing, per se? I figure I'll need replacement pads eventually either way, since someday I might actually go liquid on the cards... but yeah, for starters at least, I'll be running QuadFire on stock air blower cards. I did it before, briefly, for benchmarking, then found out the power draw was insane & took out the 4th card til I could get more power... that'll be here tomorrow, so the journey resumes.

Now I can actually OC a bit, mayhaps, with 2050W of power at my disposal.


----------



## tsm106

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Why do you want to replace the stock ones? Wait, are you going to run quad on air? That's sort of insane, man.
> 
> 
> 
> Do the stock ones not really need replacing, per se? I figure I'll need replacement pads eventually either way, since someday I might actually go liquid on the cards... but yeah, for starters at least, I'll be running QuadFire on stock air blower cards. I did it before, briefly, for benchmarking, then found out the power draw was insane & took out the 4th card til I could get more power... that'll be here tomorrow, so the journey resumes.
> 
> Now I can actually OC a bit, mayhaps, with 2050W of power at my disposal.
Click to expand...

Memory chips don't get that hot. The only reason to replace would be because the stock ones get torn up pretty easily. However they are an odd size and they are thick yet squishy foam, which I think has to do with the vast deviation in production. Aftermarket pads are much more uniform, so there's the possibility you might fall into the over/under with regards to the thickness. Anyways, I would just put up with it till you got waterblocks. Unless you flash to the modded pt1 bios it's gonna throttle on uber anyways. Gah, that is going to be a cauldron, hehe.


----------



## HoneyBadger84

Quote:


> Originally Posted by *tsm106*
> 
> Memory chips don't get that hot. The only reason to replace would be because the stock ones get torn up pretty easily. However they are an odd size and they are thick yet squishy foam, which I think has to do with the vast deviation in production. Aftermarket pads are much more uniform, so there's the possibility you might fall into the over/under with regards to the thickness. Anyways, I would just put up with it till you got waterblocks. Unless you flash to the modded pt1 bios it's gonna throttle on uber anyways. Gah, that is going to be a cauldron, hehe.


No way man

My system doesn't know what throttle means since I got rid of that POS VisionTek aftermarket cooler card. lol I haven't seen load temps higher than 72C in TriFire testing, even with the cards sandwiched in a 1-3-5 configuration. But of course, that's with the fans blasting at 100%.

If I use my shop blower fan for side flow, which I will during QuadFire OC testing, they won't get much hotter even when OCed.

This is from the last time I ran QuadFire. The top card is the VisionTek POS, the middle 2 are stock reference blower XFX Core Editions, & the bottom card is a Gigabyte WindForce (which is why it's so stupidly cool). I took screenshots after each benchmark's run; this was the hottest I saw (you'll notice the listed temps are a bit higher, as I didn't get the cursor dead on the peak, but it's 83C/63C/69C/57C max load temps):



If one can simply handle the noise of 100% fans, throttling is never an issue... of course, on aftermarket cards, 100% is a lot quieter... although one thing I discovered: for some reason, the Asus R9 290X I got that's new (or at least looks it) is quieter than the rest of them at 100%... not by a lot, but enough that I noticed it the first time the fan ran at full speed. Bit odd, but *shrug*

On the topic of the repasted Sapphire R9 290X, here be her idle tempies with an ambient of ~72.3F:



That's in single monitor, dual monitor it idles at ~40C, so 2-4C lower than the last one I ran dual monitors on. Not bad... still think I used too much paste though.


----------



## Sketchus

Hi all, looking for some advice regarding possible upgrading to Crossfire 290's.

I currently have just one 7970, since my other card is broken.

I could possibly get two R9 290's for about 200 pounds (350 dollars-ish) each, if not under. I think they're standard models, nothing like the Tri-X versions. This is certainly a tempting deal for me. That said, would I be better off waiting to see what Nvidia does with the 800 series cards? I can't imagine they'd be able to compete on price/performance, but it might be worth a shot, right?


----------



## heroxoot

@honeybadger84 40C is a pretty low idle, in my opinion. I'm at 46C now, down from the 50C it was idling at, with my AC on. All I did was kick up my case fans a tad. One of them is dying, sadly. Good stuff tho. Told you changing the TIM wasn't hard.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Sketchus*
> 
> Hi all, looking for some advice regarding possible upgrading to Crossfire 290's.
> 
> I currently have just one 7970, since my other card is broken.
> 
> I could possibly get two R9 290's for about 200 pounds (350 dollarsish) each, if not under. I think they're standard models, nothing like the TRI X versions. This is certainly a tempting deal for me. That said, would I be better off waiting to see what Nvidia do with the 800 series cards? I can't imagine they'd be able to compete with the price/performance, but it might be worth a shot, right?


R9 290s are a great deal right now, but $350 each is high. They're going for sub-$300 here on eBay, and that's in good-to-great condition. I'd say pass unless they're aftermarket-cooler versions.


----------



## HoneyBadger84

Quote:


> Originally Posted by *heroxoot*
> 
> @honeybadger84 40c is a pretty low idle in my opinion. I got 46c now with my AC on from the 50c it was idling with my AC on. All I did was kick up my case fans a tad. One of them is dying sadly. Good stuff tho. Told you changing the TIM wasn't hard.


It was the worst experience I've had in... well, a few days since driving in circles in Albuquerque cuz I couldn't find this stupid shop that had the AC repair parts we needed... X_X

BUT it was a great learning experience, I'm very happy about that.

46C ain't bad at all, depending on your fan speed, with dual monitors enabled. That vRAM being stuck at 5GHz is a real heat-adder. Even the Tri-X OC 290X I used to have still ran in the 38-39C range, if I remember right, whenever I was running dual or Eyefinity resolutions.


----------



## heroxoot

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> @honeybadger84 40c is a pretty low idle in my opinion. I got 46c now with my AC on from the 50c it was idling with my AC on. All I did was kick up my case fans a tad. One of them is dying sadly. Good stuff tho. Told you changing the TIM wasn't hard.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It was the worst experience I've had in... well, a few days since driving in circles in Albuquerque cuz I couldn't find this stupid shop that had the AC repair parts we needed... X_X
> 
> BUT it was a great learning experience, I'm very happy about that.
> 
> 46C ain't bad depending on your fan speed, at all, with dual monitors enabled. That vRAM being stuck at 5GHz is a real heat-adder. Even on the Tri-X OC 290X I used to have it still ran in the 38-39C range if I remember right whenever I was running dual or eyefinity resolutions.
Click to expand...

Trust me, you will enjoy it more and more. Cleaning my PC is my favorite thing to do with it, for real. As for temps, that's 46c with a fan speed of like 15%. It's 47c right now with 20% speed. I let it slow down really low when it's cool enough just to save some fan bearing life.


----------



## Sketchus

Quote:


> Originally Posted by *HoneyBadger84*
> 
> R9 290s are a great deal right now, but $350 is high each. They're going for sub $300 here, and that's in good-to-great condition, on EBay. I'd say pass unless they're aftermarket cooler versions.


Thanks for the response. FWIW, the Tri-X version of the R9 290 sells here new for 300 pounds, so around 550 dollars. I'm wondering if the expense is because we generally get shafted on prices here. Looking at Amazon.com, it seems it's about 450 dollars for the same card, which is more like 270 pounds here. I suppose I should hold off for now then.


----------



## [email protected]

Hello guys,

Anyone have a rattling / crackling noise on Sapphire 290 tri-x card? (Because of the fans)

http://forums.overclockers.co.uk/showpost.php?p=26473335&postcount=51


----------



## HoneyBadger84

Quote:


> Originally Posted by *[email protected]*
> 
> Hello guys,
> 
> Anyone have a rattling / crackling noise on Sapphire 290 tri-x card? (Because of the fans)


Didn't have that issue on the Tri-X OC 290 or 290X I had for a few weeks. I have heard some of them have fan noise issues, but neither of mine did the entire time they were in use. You may want to check if perhaps there's some dust buildup, or if the fans are actually a bit loose? Unlikely, but possible.


----------



## heroxoot

Quote:


> Originally Posted by *[email protected]*
> 
> Hello guys,
> 
> Anyone have a rattling / crackling noise on Sapphire 290 tri-x card? (Because of the fans)
> 
> http://forums.overclockers.co.uk/showpost.php?p=26473335&postcount=51


I had a 280X for a day that did something similar. Fans were trash and had to be sent right back to MSI. It did it at high speeds too.


----------



## heroxoot

Holy crap guys I just noticed I have RPM again! Not sure if this has something to do with 14.7 ever being installed, but it's back for me.



Well, I'll be damned, it was not there before I tried 14.7; I'm currently on 14.6 RC2.


----------



## HoneyBadger84

Huh... pretty sure I'm on RC2 & fan RPM doesn't display for me... cool either way though ^_^

I'm having to resist the urge to pull the trigger on a set of cards someone has listed, then resell them individually. They have the better coolers on them and I've seen them go for $280+ each... listed for ~$480 as a bundle... wouldn't be much profit, but it's also about the thrill of the test. lol


----------



## heroxoot

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Huh... pretty sure I'm on RC2 & fan RPM doesn't display for me... cool either way though ^_^


That's what I was getting at. It was not displaying RPM at all previously. I have not seen RPM since 13.12.


----------



## PureBlackFire

Quote:


> Originally Posted by *[email protected]*
> 
> Hello guys,
> 
> Anyone have a rattling / crackling noise on Sapphire 290 tri-x card? (Because of the fans)
> 
> http://forums.overclockers.co.uk/showpost.php?p=26473335&postcount=51


----------



## shwarz

Just wondering if anyone has major issues with overclocking the vRAM on their R9 290/290X.

Running Elpida memory.

I cannot raise my RAM above stock without artifacting, and yet it will do 1200 on the core at +200mV.


----------



## HoneyBadger84

Quote:


> Originally Posted by *shwarz*
> 
> Just wondering if anyone has major issues with overclocking the vRAM on their R9 290/290X.
> 
> Running Elpida memory.
> 
> I cannot raise my RAM above stock without artifacting, and yet it will do 1200 on the core at +200mV.


Most folks with Elpida memory have been saying it doesn't OC for crap compared to Hynix. I haven't pushed any of mine past 1400MHz, so I can't really comment too much on OCing potential of the chips, and that was with a 1150/1400 clock combination and some pretty hefty voltage (TriFire testing) @ +125~131mV.


----------



## ZealotKi11er

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Most folks with Elpida memory have been saying it doesn't OC for crap compared to Hynix. I haven't pushed any of mine past 1400MHz, so I can't really comment too much on OCing potential of the chips, and that was with a 1150/1400 clock combination and some pretty hefty voltage (TriFire testing) @ +125~131mV.


Both my Hynix cards do 1500MHz, but they need voltage or I crash at idle.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Both my Hynix cards do 1500MHz, but they need voltage or I crash at idle.


And both my Elpida cards do 1500MHz with an 1100+ core clock.

You get hit and miss either way; memory OC doesn't mean that much on these cards though, so don't stress about it.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Sgt Bilko*
> 
> And both my Elpida cards do 1500MHz with an 1100+ core clock.
> 
> You get hit and miss either way; memory OC doesn't mean that much on these cards though, so don't stress about it.


Indeed, core is what's important; this vRAM with its 512-bit bus is stupid fast, no need to really push it much other than for slightly flashier benchmark scores.


----------



## davidm71

Hi,

I was wondering at what point overclocking an Asus 290X OC II will pull more than 75 watts through the PCI Express slot, and, for just one card, is augmenting the PCI-E bus with a molex adapter necessary?

Thanks


----------



## shwarz

Quote:


> Originally Posted by *davidm71*
> 
> Hi,
> 
> I was wondering at what point will overclocking an Asus 290X OC II pull more than 75 watts through the pci express slot and for just one card is augmenting the pci-e bus with a molex adapter necessary?
> 
> Thanks


I can bench my 290 at +200mV on the core on my Asus Z97M-Plus motherboard with no extra power plugs, no issues.


----------



## heroxoot

Just like to report Borderlands 2 is suddenly playable for me. Not sure what I did, but even tho it only pushes like 30-40% load, my fps stays between 50 and 60 most of the time, rarely dropping below. I changed airflow a bit, installed a PhysX driver (it remains on low regardless), switched my GPU BIOS, and went to 14.6 RC2 from RC1. Could be anything, but it works a bit better now.


----------



## davidm71

Quote:


> Originally Posted by *shwarz*
> 
> I can bench my 290 at +200mV on the core on my Asus Z97M-Plus motherboard with no extra power plugs, no issues.


Ok thanks. Guess I'm not gonna do it..


----------



## Velict

Quote:


> Originally Posted by *the9quad*
> 
> watercooling is really expensive, I keep looking into it, but cant afford to drop another grand plus into this just for watercooling.


Well, once you get started you don't really need to touch your system again for a long time. My buddy has been using GTX 480s (SLI) and an i7 975 since release, and hasn't spent another dime on his computer. I think in the long run you might save money. He runs at 2560x1600 and his PC doesn't skip a beat. He maintains it, cleans his loop with one of those 10-micron filters once in a while, dusts it, and that's it. He isn't even worried about 4K (and no one should be) because it's so buggy, useless, and not optimized for almost anything at the moment. 1440p or bust, IMHO. Once you start dipping under 50 FPS @ 1600p, then I'll think about upgrading. But other than that, there isn't a point unless you need to be on the bleeding edge of technology just to show off on the internet.

Moral of the story: If you put in enough now, you won't need to upgrade for 5 years. Fact. Don't listen to the BS.


----------



## the9quad

Quote:


> Originally Posted by *Velict*
> 
> well once you get started you don't really need to touch your system again for a long time. my buddy has been using gtx 480's (sli) and an i7 975 since release, and hasn't spent another dime on his computer. I think in the long run you might save money. He runs at 2560x1600 and his PC doesn't skip a beat. He maintains it, cleans his loop with one of those 10 micron filters once in a while, dusts it and that's it. He isn't even worried about 4k (and no one should be) because it's so buggy, useless, and not optimized for almost anything at the moment. 1440p or bust IMHO. Once you start dipping under 50 FPS @ 1600p, then i'll think about upgrading. But other than that, there isn't a point unless you need to be on the bleeding edge of technology just to show off on the internet.
> 
> Moral of the story: If you put in enough now, you won't need to upgrade for 5 years. Fact. Don't listen to the BS.


Doesn't change the fact that it is >$1000 to get started: 3 waterblocks, res, rads, pumps, tubing, fittings, and probably a new case. I realize you can re-use most of it when you upgrade again, but that doesn't change the initial investment. Whether I watercool or not, I won't upgrade most of my PC for 5 years anyway. And watercooling really isn't going to prolong the life/performance that much anyway.


----------



## ZealotKi11er

Quote:


> Originally Posted by *the9quad*
> 
> Doesn't change the fact that it is >$1000 to get started. 3 waterblocks, res,rads,pumps,tubing,fittings, and probably a new case. I realize you can re-use most when you upgrade again, but that doesn't change the initial investment. whether I water cool or not I wont upgrade most of my pc for 5 years anyway. And water cooling really isn't really going to prolong the life/performance that much anyway.


For GPU blocks, not upgrading is a good thing. If you are spending $2K on parts already, going water is a must. GPU blocks are expensive, but the rest not so much. Rads, pump, fans, res, tubing, and fittings will cost ~$400 for something good. Then add ~$120 for each GPU. Not really $1000; at that price you are overspending. You can also go even cheaper. There are kits that go for ~$250 and come with enough rad for 1 GPU. The thing is that going water you don't really go all out. You upgrade slowly. Maybe even buy used. My first loop was ~$200 for CPU only. I only have the pump from that loop left.


----------



## the9quad

Quote:


> Originally Posted by *ZealotKi11er*
> 
> For GPU blocks, not upgrading is a good thing. If you are spending $2K on parts already, going water is a must. GPU blocks are expensive, but the rest not so much. Rads, pump, fans, res, tubing, and fittings will cost ~$400 for something good. Then add ~$120 for each GPU. Not really $1000; at that price you are overspending. You can also go even cheaper. There are kits that go for ~$250 and come with enough rad for 1 GPU. The thing is that going water you don't really go all out. You upgrade slowly. Maybe even buy used. My first loop was ~$200 for CPU only. I only have the pump from that loop left.


I've priced it out several times, and cooling 3 290x's and a 4930k is really close to a grand. Add in the fact I'd need a new case and it's well over a grand. I appreciate the help, but it really is that expensive.


----------



## SLOWION

0mg I put an R9 290 into my budget gaming rig and benchmarked it! Tee hee


----------



## tsm106

Quote:


> Originally Posted by *the9quad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> For GPU blocks, not upgrading is a good thing. If you are spending $2K on parts already, going water is a must. GPU blocks are expensive, but the rest not so much. Rads, pump, fans, res, tubing, and fittings will cost ~$400 for something good. Then add ~$120 for each GPU. Not really $1000; at that price you are overspending. You can also go even cheaper. There are kits that go for ~$250 and come with enough rad for 1 GPU. The thing is that going water you don't really go all out. You upgrade slowly. Maybe even buy used. My first loop was ~$200 for CPU only. I only have the pump from that loop left.
> 
> 
> 
> I've priced it out several times, and cooling 3 290x's and a 4930k is really close to a grand. Add in the fact I'd need a new case and it's well over a grand. I appreciate the help, but it really is that expensive.
Click to expand...

It costs a lot to build a full-tilt loop. That's obvious; however, if you feel it's too much, that is fine too. I'd just point out that your rig is pretty expensive as well, but you spec'd it out the way you did, shrugs. People spend money however they see fit; for me the cost of a good loop pays for itself because I can run whatever clocks my cards and CPU can handle with impunity. I can bench or play, whatever I feel like, and not be punished by heat and noise. That's worth a lot to me.


----------



## pdasterly

Quote:


> Originally Posted by *the9quad*
> 
> I've priced it out several times, and cooling 3 290x's and a 4930k is really close to a grand. Add in the fact I'd need a new case and it's well over a grand. I appreciate the help, but it really is that expensive.


I'm at $700 so far for the watercooling system, still waiting for my FrozenCPU box to arrive; shopped around first. Bought a kit off Craigslist, waterblocks from the forum, and the rest from FrozenCPU: 290X crossfire blocks, a motherboard waterblock, and a CPU block.


----------



## FuriousPop

hi All,

building my first water loop and wanted to check with you all how to go about which items to put at what position.

the products i have are:
Top rad RX480
CPU Block
3x R9 290 XSPC GPU blocks
1x Bitspower flow metre (requires tubing to and fro)
Bottom rad RX480
1x res270/D5 pump combo

i was thinking of this type of way to do the tubing:
Res/pump --> CPU --> top rad --> 3x GPU's --> bottom rad --> flow metre --> res/pump

I was going about it this way more so due to the practicality of the parts and their position in my case (900D), but if anyone has any other suggestions please let me know.
I was also under the impression/advisement that doing it other ways will probably give a 1-3 degree Celsius change in temps if lucky - but I'm not too fussed.

Thanks in advance.


----------



## falcon26

I just returned my new 780 and picked up a used MSI gaming 290x for $350. I hope I made the right choice. This will be for bf4...


----------



## ZealotKi11er

Quote:


> Originally Posted by *falcon26*
> 
> I just returned my new 780 and picked up a used MSI gaming 290x for $350. I hope I made the right choice. This will be for bf4...


How much was the GTX780?


----------



## falcon26

With tax $530


----------



## tsm106

Quote:


> Originally Posted by *falcon26*
> 
> With tax $530


You saved $180. Is that not a win already??


----------



## ZealotKi11er

Quote:


> Originally Posted by *falcon26*
> 
> With tax $530


That's an amazing deal. Not only that, but the 290X is a better card.


----------



## pdasterly

Add about $150 for fittings and tubing.
Quote:


> Originally Posted by *FuriousPop*
> 
> hi All,
> 
> building my first water loop and wanted to check with you all how to go about which items to put at what position.
> 
> the products i have are:
> Top rad RX480
> CPU Block
> 3x R9 290 XSPC GPU blocks
> 1x Bitspower flow metre (requires tubing to and fro)
> Bottom rad RX480
> 1x res270/D5 pump combo
> 
> i was thinking of this type of way to do the tubing:
> Res/pump --> CPU --> top rad --> 3x GPU's --> bottom rad --> flow metre --> res/pump
> 
> I was going about it this way more so due to the practicality of the parts and their position in my case (900D), but if anyone has any other suggestions please let me know.
> I was also under the impression/advisement that doing it other ways will probably give a 1-3 degree Celsius change in temps if lucky - but I'm not too fussed.
> 
> Thanks in advance.


----------



## falcon26

Good. I hope it plays BF4 as well as the 780 did. I hope it's quiet and not a jet engine...


----------



## tsm106

Quote:


> Originally Posted by *pdasterly*
> 
> add about $150 for fittings and tubing
> Quote:
> 
> 
> 
> Originally Posted by *FuriousPop*
> 
> hi All,
> 
> building my first water loop and wanted to check with you all how to go about which items to put at what position.
> 
> the products i have are:
> Top rad RX480
> CPU Block
> 3x R9 290 XSPC GPU blocks
> 1x Bitspower flow metre (requires tubing to and fro)
> Bottom rad RX480
> 1x res270/D5 pump combo
> 
> i was thinking of this type of way to do the tubing:
> Res/pump --> CPU --> top rad --> 3x GPU's --> bottom rad --> flow metre --> res/pump
> 
> I was going about it this way more so due to the practicality of the parts and their position in my case (900D), but if anyone has any other suggestions please let me know.
> I was also under the impression/advisement that doing it other ways will probably give a 1-3 degree Celsius change in temps if lucky - but I'm not too fussed.
> 
> Thanks in advance.
Click to expand...

Are you going to OC like mad? Btw, a single D5 is way too weak; it's gonna be a trickle with that much restriction. Also, each 480 only cools maybe 300W at a 15C delta at 1500 RPM...
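As a rough sanity check of that rule of thumb (about 300W per 480 rad at a 15C delta), here's a quick tally for a triple-290 loop. The per-component heat figures are assumptions for illustration, not measurements:

```python
# Sanity check of radiator capacity against the rule of thumb quoted
# above: each 480 mm rad dissipates roughly 300 W at a ~15 C
# water-to-air delta around 1500 RPM fan speed.
# The per-component heat loads are assumed illustrative figures.

RAD_CAPACITY_W = 300          # per RX480 rad, at ~15 C delta / ~1500 RPM
HEAT_LOAD_W = {
    "R9 290 #1": 275,         # assumed per-card heat under load
    "R9 290 #2": 275,
    "R9 290 #3": 275,
    "CPU": 150,               # assumed overclocked CPU heat
}

total = sum(HEAT_LOAD_W.values())
rads_needed = -(-total // RAD_CAPACITY_W)   # ceiling division
print(f"total ~{total} W -> at least {rads_needed} rads at this delta")
```

By this estimate, two 480s are a bit short for three cards plus a CPU at that delta, which matches the caution above; in practice the loop still works, just at a larger water-to-air delta.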


----------



## Velict

Quote:


> Originally Posted by *the9quad*
> 
> I've priced it out several times, and cooling 3 290x's and a 4930k is really close to a grand. Add in the fact I'd need a new case and it's well over a grand. I appreciate the help, but it really is that expensive.


You bought some of the most high-end components, are not watercooling, and are complaining about pricing (when we're giving you options; and yes, there are some sweet fricking air coolers out there).

But if you're choosing this route, what are you doing on overclock.net?


----------



## FuriousPop

Quote:


> Originally Posted by *tsm106*
> 
> Are you going to oc like mad? Btw, a single D5 is way too weak, it's gonna be a trickle with that much restriction. Btw, ea. 480 only cools 300w maybe at a 15c delta at 1500rpm...


At the moment I'll be running them at stock for the time being, then will eventually OC to reasonable levels, but nothing too crazy - this box needs to last me 2-3 years.

I thought a D5 would be fine through all that (so I was told, anyway)...

But is the way I have lined up the blocks/items, one after the other, OK?


----------



## Velict

Quote:


> Originally Posted by *tsm106*
> 
> Are you going to oc like mad? Btw, a single D5 is way too weak, it's gonna be a trickle with that much restriction. Btw, ea. 480 only cools 300w maybe at a 15c delta at 1500rpm...


A lot of fallacies. Anyway, lemme clear this up:

A single D5 is fine, don't worry.
A single 480 radiator is completely fine @1500 RPM



If you have the room, go with a monsta. If not, then this or UT60

My next build is going to be xfire + cpu 480 monsta push / pull with a 240mm ut60 no push pull

You'll have tons of overclocking headroom with dual 480's up in dat beast.


----------



## tsm106

Quote:


> Originally Posted by *FuriousPop*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Are you going to oc like mad? Btw, a single D5 is way too weak, it's gonna be a trickle with that much restriction. Btw, ea. 480 only cools 300w maybe at a 15c delta at 1500rpm...
> 
> 
> 
> At the moment i'll be running them at stock for time being, then will eventually OC to reasonable levels but nothing too crazy - this box needs to last me 2 -3 years worth..
> 
> I thought a D5 would be fine through all that (so i was told anyway)...
> 
> But is the way i have lined up the blocks/items one after the other ok?

Loop order doesn't really matter in the scope of things. Some rules are there as general guidelines, like res before pump, but block-to-block order doesn't matter once the loop gets up to temp and an equilibrium is established. Just getting the order to run efficiently through the case is hard enough sometimes. Regarding the pump, you want to maintain a 1 GPM minimum when you are loaded, i.e. running the system doing what you want to do. A D5 has very little head pressure - in fact the D5 is the weakest of the readily available pumps - but it makes up for this with high flow and very low noise. It's a great pump for simple loops. However, your loop is entering complex-loop territory with that many blocks. I assume you're cooling your CPU too? With multiple blocks and rads, you will want at least two D5s. Personally I prefer a 35X2, which is equivalent to four D5s.
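The 1 GPM guideline can be sanity-checked with the basic heat-capacity relation ΔT = P / (ṁ·c_p). A rough sketch - the component wattages below are illustrative assumptions, not measurements:

```python
# Bulk coolant temperature rise across a loop for a given heat load and flow rate.
# The wattage figures below are illustrative assumptions, not measurements.

WATER_HEAT_CAPACITY = 4186.0  # J/(kg*K)
WATER_DENSITY = 0.998         # kg/L at typical loop temps
GPM_TO_LPS = 3.785 / 60.0     # US gal/min -> L/s

def coolant_delta_t(heat_watts: float, flow_gpm: float) -> float:
    """Bulk coolant temperature rise (K) across the loop at a given flow rate."""
    mass_flow = flow_gpm * GPM_TO_LPS * WATER_DENSITY  # kg/s
    return heat_watts / (mass_flow * WATER_HEAT_CAPACITY)

# Assumed load: three GPU blocks (~300 W each) plus a CPU block (~200 W).
load = 3 * 300 + 200
for gpm in (0.5, 1.0, 1.5):
    print(f"{gpm:.1f} GPM -> coolant rises {coolant_delta_t(load, gpm):.1f} K")
```

Note that even at 0.5 GPM the bulk coolant only rises a few degrees across the loop; the 1 GPM target matters mostly for keeping flow through restrictive blocks effective, not for bulk water temperature.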

Quote:


> Originally Posted by *Velict*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Are you going to oc like mad? Btw, a single D5 is way too weak, it's gonna be a trickle with that much restriction. Btw, ea. 480 only cools 300w maybe at a 15c delta at 1500rpm...
> 
> 
> 
> Alot of fallacies. Anyway, lemme clear this
> A single D5 is fine, don't worry.
> A single 480 radiator is completely fine @1500 RPM
> 
> 
> 
> If you have the room, go with a monsta. If not, then this or UT60
> 
> My next build is going to be xfire + cpu 480 monsta push / pull with a 240mm ut60 no push pull
> 
> You'll have tons of overclocking headroom with dual 480's up in dat beast.

Seriously, you reply like you know what the hell is going on and you use a GTX480 as proof? Fallacies? Are you high?


----------



## HoneyBadger84

Quote:


> Originally Posted by *falcon26*
> 
> I just returned my new 780 and picked up a used MSI gaming 290x for $350. I hope I made the right choice. This will be for bf4...


Quote:


> Originally Posted by *tsm106*
> 
> You saved $180. Is that not a win already??


Saved $180 and the R9 290X beats the regular GTX 780 in most of the higher end games. Definite win.
Quote:


> Originally Posted by *falcon26*
> 
> Good. I hope it plays bf4 as good as the 780 did. I hope its quiet and not a jet engine ...


Only the stock blower cards are "loud", and they're only actually loud when the fan gets up over 50% (which is where most stock fan profiles max out).


----------



## Velict

"300 watts @ 1500 rpm"

"you act like you know what you're talking about"

> Posts actual tests....

Lol. Go take a seat.


----------



## Silent Scone

Quote:


> Originally Posted by *Velict*
> 
> "300 watts @ 1500 rpm"
> 
> "you act like you know what you're talking about"
> 
> > Posts actual tests....
> 
> Lol. Go take a seat.


He's actually right, though. It will be OK at best with just the two rads, but it will be on the line... twin D5s, or go Swiftech, for a 3-GPU loop.

Also, 2x 480s isn't beastly.


----------



## Velict

I promise you he'll be okay with a single D5. I'll friggin buy him a new one if it dies or he doesn't have "adequate" temps. Also, you don't see people with dual 480s every day.


----------



## Decoman

Q: Is there a trick or something for running Trixx with custom clocks?

I have set up Trixx so that Windows' Task Scheduler starts Trixx on login into Windows 7; however, Trixx does not restore the clocks. It does not matter if the clocks were set, saved, or even if I ticked the "restore clocks" option. I use a batch script to delay Trixx startup on Windows login.

I guess something might be wrong here, but I have no idea what it might be.
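For reference, a delayed launcher like the batch script described can also be sketched in Python. The Trixx path below is a placeholder assumption - point it at your actual install:

```python
# Minimal delayed-launcher sketch, standing in for the batch script described above.
# TRIXX_PATH is a placeholder assumption -- adjust it for your actual install location.
import subprocess
import time

TRIXX_PATH = r"C:\Program Files (x86)\Sapphire TRIXX\TRIXX.exe"  # placeholder path
DELAY_SECONDS = 60  # give drivers/services time to settle after login

def delayed_launch(path, delay_s, launcher=subprocess.Popen, sleep=time.sleep):
    """Sleep for delay_s seconds, then launch the executable at path."""
    sleep(delay_s)
    return launcher([path])

# What the scheduled task would run:
# delayed_launch(TRIXX_PATH, DELAY_SECONDS)
```

This only handles the startup delay; whether Trixx actually reapplies saved clocks once running is up to the tool itself.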


----------



## HoneyBadger84

Quote:


> Originally Posted by *Decoman*
> 
> Q: Is there a trick or something for running Trixx with custom clocks?
> 
> I have set up Trixx so that windows' task scheduler starts Trixx on login into Windows 7, however Trixx does not restore the clocks. It does not matter if the clocks were set, saved, or even if I ticked the "restore clocks" option. I use a batch script for delaying Trixx startup on windows login.
> 
> I guess something might be wrong here, but I have no idea what it might be.


I wouldn't set it to resume clocks even if you figure out how; it can lead to booting issues. At least, that's what I've been told by about six different people in this thread.


----------



## Mega Man

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bluedevil*
> 
> Yeah its all the screws on the back + the 4 on the PCI bracket.
> 
> 
> 
> I pictured the screws & the back so y'all could see how much I've removed... there are zero screws left. Am I not wiggling it enough?
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Yea put it back on. Don't put any screws on the backside of the pcb back on though. You also need to remove the screws on the vent plate. Then holding the shroud, twist just a hair and the separate the cooler from the pcb.
> 
> 
> doin' that now.

I always use a blow dryer to warm up the copper - helps a lot!
Quote:


> Originally Posted by *Velict*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Are you going to oc like mad? Btw, a single D5 is way too weak, it's gonna be a trickle with that much restriction. Btw, ea. 480 only cools 300w maybe at a 15c delta at 1500rpm...
> 
> 
> 
> Alot of fallacies. Anyway, lemme clear this
> 
> A single D5 is fine, don't worry.
> A single 480 radiator is completely fine @1500 RPM
> 
> 
> 
> If you have the room, go with a monsta. If not, then this or UT60
> 
> My next build is going to be xfire + cpu 480 monsta push / pull with a 240mm ut60 no push pull
> 
> You'll have tons of overclocking headroom with dual 480's up in dat beast.

And... no, don't believe me? Check my rig.
Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *FuriousPop*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Are you going to oc like mad? Btw, a single D5 is way too weak, it's gonna be a trickle with that much restriction. Btw, ea. 480 only cools 300w maybe at a 15c delta at 1500rpm...
> 
> 
> 
> At the moment i'll be running them at stock for time being, then will eventually OC to reasonable levels but nothing too crazy - this box needs to last me 2 -3 years worth..
> 
> I thought a D5 would be fine through all that (so i was told anyway)...
> 
> But is the way i have lined up the blocks/items one after the other ok?
> 
> 
> Loop order doesn't really matter in the scope of things. Some rules are there for general guidelines like res before pump but block to block order doesn't matter once the loop gets up to temp and an equilibrium is established. Regarding pump, you want to maintain a 1gpm minimum when you are loaded, ie. running the system doing what you want to do. Just getting the order to work in an efficient order is hard enough sometimes. A D5 has very little head pressure. In fact the D5 is the weakest pump of the readily available pumps but it makes up for this with high flow and very low noise. It's a great pump for simple loops. However your loop is entering complex loop status with many blocks. I assume you're cooling your cpu too? With multi blocks and rads, you will want at least two D5. Personally I prefer a 35X2 which is equivalent to four D5.
> 
> Quote:
> 
> 
> 
> Originally Posted by *Velict*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Are you going to oc like mad? Btw, a single D5 is way too weak, it's gonna be a trickle with that much restriction. Btw, ea. 480 only cools 300w maybe at a 15c delta at 1500rpm...
> 
> 
> Alot of fallacies. Anyway, lemme clear this
> A single D5 is fine, don't worry.
> A single 480 radiator is completely fine @1500 RPM
> 
> 
> 
> If you have the room, go with a monsta. If not, then this or UT60
> 
> My next build is going to be xfire + cpu 480 monsta push / pull with a 240mm ut60 no push pull
> 
> You'll have tons of overclocking headroom with dual 480's up in dat beast.
> 
> 
> Seriously, you reply like you know what the hell is going on and you use a GTX480 as proof? Fallacies? Are you high?

Let's get this right: you don't even own a full block, yet you think you can just assume a D5 is enough for CFX?

I won't get into the fact that you are arguing with one of the most knowledgeable people on quadfire. The only person I know running D5s + quadfire (not saying there aren't more) pushes four of them to get good pressure/flow through his loop.
Quote:


> Originally Posted by *Velict*
> 
> "300 watts @ 1500 rpm"
> 
> "you act like you know what you're talking about"
> 
> > Posts actual tests....
> 
> Lol. Go take a seat.


Quote:


> Originally Posted by *Velict*
> 
> I promise you he'll be okay with a single d5. I'll friggin buy him a new one if it dies or he doesn't have "adequate" temps. Also, you don't always see people everyday with dual 480;s


Umm... I run 5 rads (3x 80mm thick, 1x 60mm thick and 1x 45mm thick); my other rig has a 45mm 240 and 3x 360s - I think they're 3x 80mm and 1x 60mm, if not 3x 60mm and 1x 45mm, I haven't measured that closely. Both run quadfire (the first 290s, the second 7970s). I haven't rebuilt the 7970 build, but it absorbs a lot more heat since I push ~1.65v 24/7 on my 8350, and the 2x 45mm 360s, 2x 45mm 240s, a 60mm 120 and a 45mm 120 ran way too hot for my tastes.

As to the pump: no, one D5 will not come close to enough flow for 4 GPUs, let alone with a CPU in the loop!

As for tests, I didn't document it, but even with an MCP35X, just by adding a second pump my (at the time) CFX 7970 + CPU temps dropped by ~15-20 degrees.


----------



## Klocek001

Hi guys

New to the R9 290 - just got myself a Tri-X version. It's 1000 MHz on the core, 1300 on the mem. How much can I still squeeze out of it (safely, that is - changing vcore is NOT an option)?
And if I was planning to buy another one for CF, is my power supply going to handle it? It's an XFX Pro 850W, single rail, 70A.

Finally, can I be in the club now?


----------



## fateswarm

It's probably safe to go even +0.2v if the temps are good. With just the Tri-X cooler, realistically, that won't be needed. Try 1100/1350.

Maybe +0.020v or so may be needed.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Klocek001*
> 
> Hi guys
> 
> New to R9 290, just got myself a Tri-X version. It's 1000 MHz on core, 1300 mem - how much can I still squeeze out of it (safely, that is. changing vcore is NOT an option) ?
> And if I was planning to buy another one for CF, is my power supply going to handle it ? It's XFX Pro 850W, single rail 70A.
> 
> Finally, can I be in the club now


Hmmm, did you happen to buy that off eBay? I sold one not too long ago, and that GPU BIOS version and such looks very much like the one I sold. That'd be funny if so, lol. Nice card - welcome to the club!

For OC, increasing core voltage with the Tri-X cooler is perfectly safe, just do it in small increments. Shoot for 1100/1400 as a starting point. With Hynix chips you should be able to push 1500+ on the vRAM if you give it some extra voltage, and maybe 1400 at stock volts. 1100MHz core should be very easy.
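The small-increments approach can be written out as a simple test ladder - the clock values are just the starting points suggested in this thread, and the step sizes are arbitrary assumptions:

```python
# Build a ladder of clock values to stability-test one at a time, as suggested above.
# Step sizes are arbitrary assumptions; start/target values come from the posts.

def oc_ladder(start_mhz: int, target_mhz: int, step_mhz: int = 25) -> list:
    """Clock values to stability-test in order, from start up to target."""
    return list(range(start_mhz, target_mhz + 1, step_mhz))

print(oc_ladder(1100, 1200))      # core steps -> [1100, 1125, 1150, 1175, 1200]
print(oc_ladder(1400, 1500, 50))  # memory steps -> [1400, 1450, 1500]
```

Test each step under load before moving to the next, and back off one step at the first sign of artifacts or a driver crash.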


----------



## Klocek001

I'm Polish, I don't do my shopping on eBay







Thanks for the advice. What about my power supply? Can it handle 2 cards like that?

These cards are becoming really freaking cheap on Polish auction sites - mine is 2 months old and I got it for $320 (1000 PLN). I guess people were buying them like crazy for mining, and now they're selling them frantically because it's not worth it anymore. You can get two of them and a power supply for the price of a new 780 Ti. Good, good...


----------



## fateswarm

Quote:


> Originally Posted by *Klocek001*
> 
> I'm Polish, I don't do my shopping on ebay


You haven't discovered the Hong Kong sellers yet. Zero shipping costs and ridiculously low prices. Though those mainly apply to low-weight/low-cost products.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Klocek001*
> 
> I'm Polish, I don't do my shopping on ebay
> 
> 
> 
> 
> 
> 
> 
> Thx for advice, what about my power supply ? can it handle 2 cards like that ?


Borderline. You have the wattage, but not really the amperage. 290s can use 25-30A each, especially when OCed; that leaves less than 10-20A for the entire rest of your system. I would upgrade the PSU before going crossfire, to be on the safe side.

It's the same reason I'm getting a secondary power supply. Three 290Xs can pull 99A or more... my PSU is only about 110A. And I'm going quadfire, and OCing them, at least for benchmarks, so I need moar power. I've also seen ~1400W draw at the wall with mildly OCed trifire, which means I was pushing my 1200W PSU close to its output capacity.

For perspective, I've seen up to around 1050W draw at the wall with OCed crossfire, which comes out to around 870W, give or take, of actual power from the PSU. Something to think about. Your CPU is a bit more power-efficient than mine, though.
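This kind of amperage budgeting can be written out as a quick 12V-rail check. A sketch with ballpark per-component draws (assumptions for illustration, not measured values):

```python
# Rough 12V-rail budget for the proposed 290 crossfire build.
# All amp figures below are ballpark assumptions for illustration only.

PSU_12V_AMPS = 70.0  # XFX Pro 850W single-rail rating from the post above

def headroom_amps(psu_amps, draws):
    """Remaining 12V amperage after summing the estimated draws."""
    return psu_amps - sum(draws.values())

draws = {
    "R9 290 #1 (OCed)": 28.0,
    "R9 290 #2 (OCed)": 28.0,
    "CPU (OCed)": 12.0,
    "fans/pump/drives": 4.0,
}

left = headroom_amps(PSU_12V_AMPS, draws)
print(f"estimated headroom: {left:.0f} A")
if left < 0.2 * PSU_12V_AMPS:
    print("less than 20% headroom -> borderline; consider a bigger PSU")
```

With these assumed draws the budget comes out slightly negative, which matches the "borderline" verdict.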


----------



## Klocek001

Quote:


> Originally Posted by *fateswarm*
> 
> You haven't discovered the Hong Kong sellers yet. Zero shipping costs and ludicrous prices. Though they mainly apply to low weight/low cost products.


There'll probably be DX13 out by the time a package from Hong Kong reaches me...


----------



## fateswarm

Quote:


> Originally Posted by *Klocek001*
> 
> There's probably going to be DX13 when a package from hong kong reaches me..


It's a bit annoying that they usually take up to 3 or 4 weeks to reach me.


----------



## Klocek001

I just want to know: if I decide to put a fan on the side of my case to help deal with the heat, should it be intake or exhaust? The diagram I found on the net says intake, but my mind tells me it should be exhaust. The card runs hot - 40-45 degrees idle, 75 in games. Is that normal?


----------



## pdasterly

Intake; exhaust has minimal effect.
Quote:


> Originally Posted by *Klocek001*
> 
> I just want to know if I decide to put a fan on the side of my case to help deal with the heat should it be intake or exhaust ? The diagram I found on the net says intake but my mind tells me it should be exhaust. The card runs hot, 40-45 degrees idle, 75 in games, is that normal ?


----------



## joeh4384

75 is a very good temp for an R9 290 or R9 290X.


----------



## Klocek001

But if the fans on the card are forcing the hot air out of it and out of the case, why should I put an intake fan there? It would only reduce the flow of hot air out of the case - sort of push it back inside. I based my assumption on the fact that there's a lot of really hot air coming out of the side of the card.


----------



## Dasboogieman

Quote:


> Originally Posted by *Klocek001*
> 
> But if the fans on the card are forcing the hot air out of it and out of the case, why should I put an intake fan, it will only reduce the flow of hot air out of the case...sort of push it back inside. I based my assumption on the fact that there's a lot of really hot ir coming out of the side of the card.


In theory, yes; however, some setups have inadequate air intake/circulation to begin with. That starves the cards of cold air, creating a bigger thermal issue than recycling some of the exhaust air through an intake would.

I personally find that my setup needs to extract hot air more than intake it, but that's because I already have a good amount of air cycling through the system.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Klocek001*
> 
> But if the fans on the card are forcing the hot air out of it and out of the case, why should I put an intake fan, it will only reduce the flow of hot air out of the case...sort of push it back inside. I based my assumption on the fact that there's a lot of really hot ir coming out of the side of the card.


Side intake is best intake man:



My temps are low with all that side airflow...

.... I swear I'm not trying to rhyme.


----------



## Dasboogieman

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Side intake is best intake man:
> 
> 
> 
> My temps are low with all that side airflow...
> 
> .... I swear I'm not trying to rhyme.


dat rap, dem rhymes are dope









In all seriousness, that is a ridiculous amount of Penetrator airflow going to the GPUs


----------



## HoneyBadger84

Quote:


> Originally Posted by *Dasboogieman*
> 
> dat rap
> 
> In all seriousness, that is a ridiculous amount of Penetrator airflow going to the GPUs


The fan in the background on the right is the same kind I use as sideflow when the side of the case comes off for dem OCs









HVB = High Velocity Blower

Awwwwyisssss.


----------



## The Storm

3 290s, a 4930K, 4 rads, all on a single D5 on setting 3. Works well, if I say so myself. Crappy cell phone pic, as I'm not home to upload the good photos, but you get the gist.


----------



## PCSarge

Quote:


> Originally Posted by *The Storm*
> 
> 3 290's, 4930k, 4 rads, all on a single d5 on setting 3. Works well if I say so myself. Crappy cell phone pic as I am not home to upload the good photos, but you get the jist.


And I thought I had too much money?


----------



## The Storm

Quote:


> Originally Posted by *PCSarge*
> 
> and i thought i had too much money?


Haha!!
There was some talk earlier that a D5 wouldn't handle three cards, so I just wanted to show my setup, as it works just fine.


----------



## Jflisk

Quote:


> Originally Posted by *The Storm*
> 
> Haha!!
> There was some talk earlier that a d5 wouldn't handle three cards so I just wanted to show my setup as it works just fine.


A D5 pump will handle anything you can throw at it. I have mine upconverted to 24V - makes filling the loop a lot quicker.


----------



## tsm106

Quote:


> Originally Posted by *The Storm*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PCSarge*
> 
> and i thought i had too much money?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Haha!!
> There was some talk earlier that a d5 wouldn't handle three cards so I just wanted to show my setup as it works just fine.

Technically, you only showed your rig, not proof that it works well. With that many blocks your flow is at a trickle, and with that small amount of radiator surface area, I know first-hand that temps are not as good as they could be.

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *The Storm*
> 
> Haha!!
> There was some talk earlier that a d5 wouldn't handle three cards so I just wanted to show my setup as it works just fine.
> 
> 
> 
> D5 pump will handle anything you can throw at it. I have mine upconverted to *24V*. Makes the filling time a lot quicker.

That's one way to boost a D5, albeit a bit expensive as an add-on option.


----------



## The Storm

Quote:


> Originally Posted by *tsm106*
> 
> Technically you only showed your rig, not proof it works well. With that many blocks your flow is at a trickle. And with that small amount of surface area for rads, I know first hand temps are not as good as they could be.
> That's one way to boost a D5, albeit a bit expensive as an add on option.


As I stated, I'm not at home to show anything other than what's on my phone. All I can tell you at the moment is that when playing BF4 (which generates the most heat of anything I do), the temps climb to 58c; when I had 2 cards they climbed to 52c, and these are several-hour sessions. Could it be better? Sure, I bet it can be, but for me it works. This is an overclocked 4930K @ 4.5 with 1.25v and stock clocks on the cards. I never said it couldn't be better - just stating that it works. For me, with fans on the silent setting on my RIVE and the other fans doubled up on resistors, it's quiet and I can live with 58c.


----------



## tsm106

Quote:


> Originally Posted by *The Storm*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Technically you only showed your rig, not proof it works well. With that many blocks your flow is at a trickle. And with that small amount of surface area for rads, I know first hand temps are not as good as they could be.
> That's one way to boost a D5, albeit a bit expensive as an add on option.
> 
> 
> 
> As I stated, I am not at home show anything else other than what's on my phone. All I can do at the moment is tell you that when playing BF4 (which generates the most heat of anything I do) the Temps climb to 58c. When I had 2 cards it climbed to 52c, these are several hour sessions. Could it be better...sure I bet it can be but for me it works. This is an overclocked 4930k @ 4.5 with 1.25v and stock on the cards. I never said it couldn't be better just stating that it works. For me with fans on silent setting with my RIVE, and the other fans doubled up on the resistors it's quiet and I can live with 58c.

Don't take offense, but I've already done what you are doing now. For me, 58c is nowhere near good enough if you want to OC; in fact, I could barely run benches, because over 50c you lose the ability to hold high clocks. Imo, just stating that it works isn't telling the whole picture, so others will assume it works well when in fact it's a compromise that you "can live with."


----------



## The Storm

Quote:


> Originally Posted by *tsm106*
> 
> Don't take offense, but I've already done what you are doing now. For me 58c is no where good enough if you want to oc. In fact, I could barely run benches because over 50c and you lose the ability to get high clocks. Imo, just stating that it works isn't telling the whole picture so others will just assume it works well when in fact its a compromise that you "can live with."


Definitely not offended, and I'm not trying to imply that you're wrong and I'm right - just showing that I have a setup similar to the one the question was asked about. I don't believe my 1200-watt power supply would have enough power to overclock all of this, so I don't even try. I have been looking into getting an MCP35X, to be honest, but for now this works for me. I don't mind 58c, but again, that's me. The temps are consistent across all three cards according to Afterburner, so I believe water is making it through all of them OK. If I drop the pump to setting 2, I see temp differences between the three.


----------



## tsm106

Quote:


> Originally Posted by *The Storm*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Don't take offense, but I've already done what you are doing now. For me 58c is no where good enough if you want to oc. In fact, I could barely run benches because over 50c and you lose the ability to get high clocks. Imo, just stating that it works isn't telling the whole picture so others will just assume it works well when in fact its a compromise that you "can live with."
> 
> 
> 
> Definitely not offended, and I'm not trying to imply that your wrong and I'm right. Just showing that I have a similar setup that the question was asked about. I don't believe my 1200 watt power supply would have enough power to overclock all of this so I don't even try. I have been looking into getting an mcp35x to be honest but for now this works for me. I don't mind 58c but again this is me. The temps are consistent across all three according to afterburner so I believe water is making it through all them ok. If I drop the speed to setting 2 I see temp differences between the three.

For kicks, look at my 3930 rig pics. Start from the beginning and work your way across - you'll see some similarities. A year or so ago I said screw it and went external; best thing I ever did, moving to a setup with plenty of rad. You can see my rig's evolution from being able to "live" with it to where it is now. Two of these rads and accoutrements, some QDCs... I won't ever go back to just barely scraping by on rads. And the ironic part? I could have saved sooo much more money starting with the rads below instead of blowing so much trying to stuff silly small rads inside my case and living with it.

http://www.performance-pcs.com/catalog/index.php?main_page=product_info&products_id=21622


----------



## smartdroid

You're running 2 of those external rads for your cards alone?


----------



## The Storm

Quote:


> Originally Posted by *tsm106*
> 
> For kicks, look at my 3930 rig pics. Start from the beginning and work your way across. You'll see some similarities. A year ago or so I said screw it and went external. Best thing I ever did... moving to a setup of mass over rads. You can see my rigs evolution from being able to "live" with it to where it is now. Two of these rads and accoutrements, some QDCs... I won't ever go back to just barely scraping by on rads. And the ironic part? I could have saved sooo much more money starting with the rads below instead blowing so much money trying to stuff silly small rads inside my case and living with it.
> 
> http://www.performance-pcs.com/catalog/index.php?main_page=product_info&products_id=21622


Wow, that's a lot of radiator, haha. Yeah, I can see it being cheaper just doing that vs. buying a bunch of rads at 50 to 100 bucks a pop. This build has been a slow evolution as well. That may just have to be my next step to suit my building "bug" - I don't think I will ever be able to say I'm finished. As soon as I think I'm done, I think of what I can do next... vicious cycle.


----------



## Talon720

So, does anyone have issues with crashing/black screens when overvolting? If I use 1.4v or higher my GPUs start to crash and black screen on and off, then it usually ends in lost signal. My cards aren't the best, but I wanna push them since I'm on water. It's happened with AB and the 2v-mod GPU Tweak on the PT1 bios.


----------



## tsm106

Quote:


> Originally Posted by *smartdroid*
> 
> You're running 2 of those external rads for your cards alone?


No - single loop. I went through the dual-loop phase long ago.

Quote:


> Originally Posted by *Talon720*
> 
> So does anyone have the issue of crashing/blackscreen when overvolting. If i use 1.4 or higher my gpus start to crash blackscreen on, and off then usually ends in lost signal. My cards arent the best, but i wanna push them since in on water. Its happened with AB and 2v mod gpu tweak on pt1 bios.


Here are some things I worked out for myself overclocking Hawaii that I wrote up a while back. Maybe they can help you.

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/6120_40#post_21205990


----------



## ZealotKi11er

I am trying to get +200 mV with MSI AB, as I hate the other overclocking tools. I tried to read the guide in the OP but don't quite get it. Has anyone here done it? I want my first card to run +100 mV and the second card +125 mV.


----------



## krakin

How do the stock cards perform compared to aftermarket coolers?


----------



## heroxoot

Quote:


> Originally Posted by *krakin*
> 
> How do the stock cards perform compared to aftermarket coolers?


Aftermarket coolers or the reference cooler? The Tri-X and Twin Frozr cards perform admirably, in my opinion.


----------



## Sgt Bilko

Quote:


> Originally Posted by *krakin*
> 
> How do the stock cards perform compared to aftermarket coolers?


Stock cards are clocked at 947/1250 with 2560 shaders for the 290, and 1000/1250 with 2816 shaders for the 290X.

Aftermarket cards are almost always slightly overclocked and have quieter, better coolers on them.

Basically: ref cooler = cool at the expense of noise.

Aftermarket cooler = cool and quieter, but they blow a bit of hot air back into the case.


----------



## krakin

Quote:


> Originally Posted by *heroxoot*
> 
> Second party coolers or reference cooler? The Tri-X and twin frozr cards perform admirably in my opinion.


Reference cards.

I also am not against water cooling.


----------



## bluedevil

RMA update with VisionTek

7/24/14 Submitted RMA form
7/25/14 Received email for invoice
7/25/14 Received email with RMA #
7/26/14 Shipped GPU
7/28/14 Set for delivery
7/28/14 Delivered and signed for


----------



## Sgt Bilko

Quote:


> Originally Posted by *krakin*
> 
> Reference cards.
> 
> I also am not against water cooling.


If you want to water cool then go for ref cards - it makes finding blocks and taking the cooler off easier as well.


----------



## heroxoot

Quote:


> Originally Posted by *krakin*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Second party coolers or reference cooler? The Tri-X and twin frozr cards perform admirably in my opinion.
> 
> 
> 
> Reference cards.
> 
> I also am not against water cooling.

Reference cards are very noisy; aftermarket ones are quieter and keep the card cooler, but dump heat back into the case. Reference coolers don't cool that well in my opinion. They are rated for a 94c max temp, but my MSI Gaming edition tops out at 80c. In fact, after some case fan adjustments and an extra-thorough cleaning, I get 76c at load during benches.


----------



## krakin

Quote:


> Originally Posted by *heroxoot*
> 
> Reference are very noisy, and 2nd party are quieter and keep the card cooler, but dump heat back into the case. Reference don't cool that well in my opinion. They are rated for a 94c max temp, but on my MSI Gaming edition it gets to 80c tops. In fact after some case fan adjustments and extra thorough cleaning, I get 76c at load during benches.


Thanks


----------



## Gualichu04

I am sad I can't get 1300MHz core with 1300MHz memory even at +100mV. Temps are under 50c at load and ULPS is disabled. I guess I'll keep the 1150MHz core and 1300MHz mem at +44mV.
It seems the Elpida memory the card has is not that good of an overclocker either.


----------



## Sgt Bilko

Quote:


> Originally Posted by *heroxoot*
> 
> Reference are very noisy, and 2nd party are quieter and keep the card cooler, but dump heat back into the case. Reference don't cool that well in my opinion. They are rated for a 94c max temp, but on my MSI Gaming edition it gets to 80c tops. In fact after some case fan adjustments and extra thorough cleaning, I get 76c at load during benches.


Ref cards are quite good at cooling, but at the expense of noise - and if he is planning on water cooling then ref would be the better option.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gualichu04*
> 
> I am sad i can't get 1300mhz core with 1300mhz memory even at +100mv. Temps are under 50C at load ulps is disabled. I guess ill keep the 1150mhz core and 1300mhz mem at +44mv.
> It seems the eplida memory the card has is not that good of a overclocker either.


1300MHz?

Have you tried using Trixx? You might get 1300MHz on the core with that.

Also, you should sign up for the XFX DD club.


----------



## heroxoot

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Reference are very noisy, and 2nd party are quieter and keep the card cooler, but dump heat back into the case. Reference don't cool that well in my opinion. They are rated for a 94c max temp, but on my MSI Gaming edition it gets to 80c tops. In fact after some case fan adjustments and extra thorough cleaning, I get 76c at load during benches.
> 
> 
> 
> Ref cards are quite good at cooling but at the expense of noise, and if he is planning on Water cooling then Ref would be the better option

But the way I hear it, they're locked to a specific fan speed. I've read that in "Uber" mode they're locked to 55% fan speed. That's not cool in my book. I let my fans kick up to 100% before the card even thinks about 90C; at 85C my fan hits 100% speed. Noise be damned, they're that loud at only 50%? From what I've read, my Gaming edition card is quieter at 100% and stays cooler than reference.

I mean look at this.



If this is what you call cool at the expense of noise, I don't want it. My card is clocked higher and stays 15C cooler? Yeah, that is not cool at the expense of noise.
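The custom fan behavior described here (ramping to 100% well before the card nears 90C) is essentially a set of temperature/speed points with interpolation in between, which is how Afterburner and TriXX fan curves work. A minimal sketch; the curve points are illustrative, not anyone's actual profile:

```python
# Sketch of a custom fan curve: linear interpolation between
# (temperature C, fan speed %) points. Points are illustrative only.
CURVE = [(40, 30), (60, 45), (75, 70), (85, 100)]

def fan_speed(temp_c):
    """Return the fan speed (%) for a given GPU temperature (C)."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]          # pinned at 100% from 85C upward
    for (t0, s0), (t1, s1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            # interpolate between the two surrounding points
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(85))  # -> 100, full speed before the card hits 90C
```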


----------



## Sgt Bilko

Quote:


> Originally Posted by *heroxoot*
> 
> But the way I hear it, they're locked to a specific fan speed. I've read that in "Uber" mode they're locked to 55% fan speed. That's not cool in my book. I let my fans kick up to 100% before the card even thinks about 90C; at 85C my fan hits 100% speed. Noise be damned, they're that loud at only 50%? From what I've read, my Gaming edition card is quieter at 100% and stays cooler than reference.
> 
> I mean look at this.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> If this is what you call cool at the expense of noise, I don't want it. My card is clocked higher and stays 15C cooler? Yeah, that is not cool at the expense of noise.


They aren't "locked" to 55%; they'll only spin up to that if you don't set a fan profile. When I had my 290X I set it to top out at 75% fan speed, and that led to it never going over 80C in 35C ambients. Not to mention the reference design has some of the best VRM cooling around outside of the Lightning.

And noise is only an issue if you don't plan on water cooling or wearing headphones; I wear headphones, so I couldn't hear a thing.


----------



## Gualichu04

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 1300Mhz?
> 
> have you tried using Trixx? You might get 1300Mhz on the core with that
> 
> Also, you should sign up for the XFX DD club


Tried Trixx, and even at +150mV it's very unstable with the memory at stock speed (1250MHz). Seems the silicon lottery doesn't like me. I close MSI Afterburner while using Trixx. Would I need to set the power limit in CCC even if Overdrive isn't enabled?


----------



## heroxoot

Yeah, non-reference coolers stay cooler without adjusting the fan curve, though. I don't get why they even cap it like that by default.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gualichu04*
> 
> Tried Trixx, and even at +150mV it's very unstable with the memory at stock speed (1250MHz). Seems the silicon lottery doesn't like me. I close MSI Afterburner while using Trixx. Would I need to set the power limit in CCC even if Overdrive isn't enabled?


I know that sometimes I had to manually set the power limit in CCC to +50% while overclocking with Trixx, but that was only with specific drivers; not sure if it's been fixed yet or not.

You can give it a shot and see how you go


----------



## PureBlackFire

Quote:


> Originally Posted by *heroxoot*
> 
> Yea, non reference coolers stay cooler without adjusting the fan curve tho. I don't get why they even cap it like that on default


Fan speed is capped because of the noise it makes, that's all really. They set the cards to run at the highest acceptable temps under the max while running at the highest acceptable fan speed for those who have to hear the cards. In the end it's just proof that the reference cooler isn't up to par for Hawaii chips. Anyway, if someone is planning to water cool, reference is the best way to go.


----------



## HoneyBadger84

Quote:


> Originally Posted by *PCSarge*
> 
> and i thought i had too much money?


I should post a picture that's just a stack of all the freakin' "money" I have sitting around in the form of video cards... I have an addiction to eBay that needs treatment, man. lol


----------



## tsm106

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PCSarge*
> 
> and i thought i had too much money?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I should post a picture that's just a stack of all the freakin' "money" i have sitting around in the form of video cards... I have an addiction to EBay that needs treatment man. lol

Stop talking about it and go do it. And btw, at this point it's all been done already, and you guys are lucky that you don't have to pay 600 per card. So go get blocks mounted already and go break some records.


----------



## motokill36

Anyone still getting the memory leak on Crossfire 290s in BF4?

It hits 10GB of system memory, then crashes.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Gualichu04*
> 
> I'm sad I can't get 1300MHz core with 1300MHz memory even at +100mV. Temps are under 50C at load and ULPS is disabled. I guess I'll keep the 1150MHz core and 1300MHz mem at +44mV.
> It seems the Elpida memory the card has isn't that great an overclocker either.


1300MHz core on +100mV is insane, at least from my perspective. I've yet to see someone get much north of 1200MHz without pushing more than that. 1150/1300 at +44mV is pretty darn good, I'd be happy ^_^


----------



## Gualichu04

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I know that sometimes I had to manually set the power limit in CCC to +50% while overclocking with Trixx, but that was only with specific drivers; not sure if it's been fixed yet or not.
> 
> You can give it a shot and see how you go


Seems setting the power limit to +50% in CCC helps. Yay! So far it's stable at +100mV; hopefully I can bring that down and at least get 1350MHz on the memory.


----------



## HoneyBadger84

Quote:


> Originally Posted by *heroxoot*
> 
> Yea, non reference coolers stay cooler without adjusting the fan curve tho. I don't get why they even cap it like that on default


SOME non-reference coolers stay cooler without adjusting the fan curve... remember the VisionTek Aftermarket Cooler







(top card, 100% fan, it ran at that temperature in there by itself, 2-way, TriFire, QuadFire, didn't matter):


Card readouts above match the layout below (top is VisionTek, middle 2 are reference XFX, bottom is a Gigabyte WindForce), do believe temps were from a run of FireStrike Extreme:


If you're going to get an aftermarket cooler card, I can tell you from personal experience that the Tri-X from Sapphire & WindForce from Gigabyte both perform very well & neither has even 60% of the noise level of a Reference Card at 75%+ fan speed.

I'm swappin' to the 14.7 drivers & reinstalling... at least 2 cards... maybe. Be back in a bit


----------



## Gualichu04

Quote:


> Originally Posted by *HoneyBadger84*
> 
> 1300MHz core on +100mV is insane, at least from my perspective. I've yet to see someone get much north of 1200MHz without pushing more than that. 1150/1300 at +44mV is pretty darn good, I'd be happy ^_^


Seems I'll keep 1150 on the core and 1300 on the memory at +44mV. It won't go to 1300 on the core at all.







I am happy with the clock so far.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gualichu04*
> 
> Seems i will keep 1150 on the core and 1300 on the memory at 44mv. It wont go to 1300 on the core at all.
> 
> 
> 
> 
> 
> 
> 
> I am happy with the clock so far.


1150 to 1300? Some people didn't learn anything about overclocking. Maybe try 1200MHz first, then 1250MHz, before you go for 1300? And you also know 1300 is next to impossible for these cards.


----------



## heroxoot

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Yea, non reference coolers stay cooler without adjusting the fan curve tho. I don't get why they even cap it like that on default
> 
> 
> 
> SOME non-reference coolers stay cooler without adjusting the fan curve... remember the VisionTek Aftermarket Cooler
> 
> 
> 
> 
> 
> 
> 
> (top card, 100% fan, it ran at that temperature in there by itself, 2-way, TriFire, QuadFire, didn't matter):
> 
> 
> Card readouts above match the layout below (top is VisionTek, middle 2 are reference XFX, bottom is a Gigabyte WindForce), do believe temps were from a run of FireStrike Extreme:
> 
> 
> If you're going to get an aftermarket cooler card, I can tell you from personal experience that the Tri-X from Sapphire & WindForce from Gigabyte both perform very well & neither has even 60% of the noise level of a Reference Card at 75%+ fan speed.
> 
> I'm swappin' to the 14.7 drivers & reinstalling... at least 2 cards... maybe. Be back in a bit

VisionTek is a bad brand, though. It's a cheaper brand by far as well.


----------



## tsm106

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gualichu04*
> 
> Seems i will keep 1150 on the core and 1300 on the memory at 44mv. It wont go to 1300 on the core at all.
> 
> 
> 
> 
> 
> 
> 
> I am happy with the clock so far.
> 
> 
> 
> 1150 to 1300. Some did not learn anything about overclocking. Maybe try 1200MHz first then 1250MHz before you go to 1300? And also you know 1300 is next to impossible for *mere mortals*.

Fixed... heh jk.


----------



## HoneyBadger84

so I'm in the middle of rewiring and this shows up... unexpected but welcome ^_^



Oh and yeah... I have one more on the way but:


----------



## FuriousPop

I wanna thank you all for your inputs and discussions (without it getting heated).

All of you have given me valid points from different areas of my issue and i appreciate it.

So what i have gathered is this:

1x single D5 will be able to push through 1x CPU block + 3x R9 290 blocks + 2x 480 rads.
Will it give the BEST temps? Probably not, or it could be better with other alterations (e.g. an additional pump).

Wanted to ask The Storm: in your pics, just how many rads are in there? I can see a 240 + 120 + 120 + either a 360 or 480 at the top? And at setting 3 on the pump?
I'm assuming running the pump at high settings (e.g. 4 or 5) will no doubt shorten its lifespan?


----------



## tsm106

Quote:


> Originally Posted by *FuriousPop*
> 
> So what i have gathered is this:
> 
> 1x single D5 will be able to push through 1x CPU Block + 3x R9 290 blocks + 2x 480 rads.
> Will it give the BEST temps - probably not or could be better with other alterations (eg; additional pump)
> 
> wanted to ask The Storm - in your pics - just how much rads are in there? i can see a 240 +120+120+either a 360 or 480 at top??? and at setting 3 on the pump?
> i am assuming running the pump at high settings eg 4 or 5 will no doubt shorten its life span?


1 D5 will give you the equivalent of a trickle. I'm talking on the order of 0.3 GPM with that many blocks.

Running a D5 at max will dump heat into your loop, and if you're already starved for rad surface that's not helping matters.

I don't understand the point of spending half to three quarters of a grand only to get slightly better than top-of-the-line air-cooling temps.


----------



## FuriousPop

Quote:


> Originally Posted by *tsm106*
> 
> 1 D5 will give you the equivalent of a trickle. I'm talking on the order of .3 gpm with that many blocks.
> 
> Running a D5 at max will dump heat into your loop, and if you're already starved for rad surface that's not helping matters.
> 
> I don't understand the point of spending half to three quarters of a grand only to get slightly better than top of the line aircooling temps?


Totally understand where you're coming from...

I've honestly been thinking about an additional pump; however, when I look at my case and everything in it, it's kind of hard to picture where the 2nd pump would physically go! It's all a little daunting.

If I were to get a 2nd pump, would you say just another D5 or something different? Let me know so I can research it. Still aiming for a very, very quiet system..

Thanks again for your help, much appreciated.


----------



## heroxoot

Quote:


> Originally Posted by *FuriousPop*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> 1 D5 will give you the equivalent of a trickle. I'm talking on the order of .3 gpm with that many blocks.
> 
> Running a D5 at max will dump heat into your loop, and if you're already starved for rad surface that's not helping matters.
> 
> I don't understand the point of spending half to three quarters of a grand only to get slightly better than top of the line aircooling temps?
> 
> 
> 
> Totally understand where your coming from...
> 
> I have honestly been thinking an additional pump - however when i look at my case and everything in it - kind of hard to picture where the 2nd pump is going to be placed physically!! its all a little daunting.
> 
> If i was to get a 2nd pump - would you say just another D5 or something different? let me know so i can research it - still aiming for a very very quiet system..
> 
> Thanks again for your help, much appreciated.

External mounting is always an option!


----------



## tsm106

Quote:


> Originally Posted by *FuriousPop*
> 
> Totally understand where your coming from...
> 
> I have honestly been thinking an additional pump - however when i look at my case and everything in it - kind of hard to picture where the 2nd pump is going to be placed physically!! its all a little daunting.
> 
> If i was to get a 2nd pump - would you say just another D5 or something different? let me know so i can research it - still aiming for a very very quiet system..
> 
> Thanks again for your help, much appreciated.


Yeah, a good first step is getting your flow as close as possible to the ideal of 1.5 GPM under load. Getting there from your starting point is going to mean making some hard decisions. D5 pumps are very expensive once you figure in all the accessories you need to make them work, like a top or a stand, and then there are the performance mods like going 24V.

If you choose to go the dual D5 route, that's another 100 for a pump, you need a dual top (another 100 bucks), and then, since D5s are freaking huge, buy/fashion a mount or try to get it into your rig somehow. Looking at it from the other side of the pump fence, you could just buy a 35X2 and add in the heatsink, which is also a pump mount.

There's also the 24V option, which requires a D5 Strong and the controller, which is like 50 bucks. However, if you have a Vario, there is hardly any gain from testing. If you had a D5 S, the quickest route is going 24V, and it would also be the cheapest.


----------



## Descadent

Just for the record, my two 290X Vapor-Xs have been pretty cool... anywhere from 68-74C on card one and 58-64C on card 2 with the custom fan profile I have set. I would say it's about 75F in my man cave where they are.


----------



## Gualichu04

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 1150 to 1300. Some did not learn anything about overclocking. Maybe try 1200MHz first then 1250MHz before you go to 1300? And also you know 1300 is next to impossible for these cards.


Was trying to see the max clock it would reach before it decided to crash all the time. Also, now it needs +50mV for 1150MHz so far; it was stable at +44, and now it decides not to be.


----------



## HoneyBadger84

It's alive... installing drivers







for the last 5 mins... lol





Edit:

X_X Drivers get done, reboot, thunderstorm outside... shutting down til it passes *sigh* only delaying the inevitable awesome...

I'll post up Idle temps once I reboot for ULPS disable...


----------



## joeh4384

I am sure it sounds alive if you push those.


----------



## HoneyBadger84

& this is with the side of the case on and the regular fan flow setup I use for 2-3 cards... and with the fans at idle speed (45-50%), inaudible over my room fans & window AC unit (which is on its own breaker now, woot woot) :Wheee:



I tell ya what, getting the back panel for the case back on was not easy; I had to rearrange how the 2 24-pin cords are routed, cuz it was impossible to get it on with them both routed behind the motherboard...


----------



## bluedevil

Absolutely crazy..... that's a lot of freakin' GPUs.....


----------



## HoneyBadger84

Ran this at 4.2GHz CPU so the score is a bit lower than it would otherwise be, but, temps aren't too bad at all so far (displayed in Afterburner on the right), FireStrike Extreme run:



Gonna reboot to 4.6GHz & run some tests, compare the numbers with my last shots at QuadFire... running the 14.7 drivers.


----------



## Jflisk

D5 12v to 24V up converter. I have one in mine never had a problem.

http://www.performance-pcs.com/catalog/index.php?main_page=product_info&products_id=36906


----------



## HoneyBadger84

Seems the 14.7 drivers are gimped compared to the 14.6 RC2s in terms of benchmark performance. I'm getting a couple hundred points less across the board so far in the Extreme runs of 3DMark 11 & FireStrike. Guess I'll revert drivers.









I ran Eyefinity (3K reso) Valley out of curiosity about temperatures; the highest they got was 79-75-74-69, and again that's with the side of the case on and normal case flow. So not too bad at all, especially running 3x1080p at over 90FPS throughout the entire loop.

I'mma go hop over to the older drivers then run a full bench suite at stock... then it's time for some games







Must enjoy the fruits of my work.

Edit: Gotta wait for these storms to pass then update my signature and rig builder thing.

Now running an Antec HCP-850W + Corsair AX1200W, with the Antec powering the CPU, motherboard, PCI-E supplemental power plugs, radiator fans for the H110, and the top R9 290X, while the AX1200W is running the front & side case fans, HDD, SSDs, & the 3 other R9 290Xs. Running the Asus card on top, with the 3 HIS as the others. Haven't gotten to run much yet cuz of thunderstorms in the area; I don't have a second battery backup, so I can't risk running the computer when the power could cut out x_x ah, the trials of running on 2 plugs/surge protectors.

Once the storms are done I'll revert drivers and post up some numbers







no OCing tonight though, too tired.


----------



## tsm106

You're running at 100% fan speed right?


----------



## disintegratorx

The other thing about getting a good OC is that your motherboard has to be in good shape too. The extra power needed to make the card perform better has to be there and attainable.


----------



## Rankre

So I've been running my new setup for a few days now, and I've noticed that my Vapor-X R9 290 idles at ~40-45C. That's a little high, I think? I'm running 1440p with an Enthoo Pro case that has 140 and 200mm fans for exhaust and intake. I do have my 212 EVO spitting air out the top, though, so its fan sits right next to my R9 290.

Load temps never went higher than 65C, but even with MSI Afterburner and fans at 40% idle, it's 45 degrees right now with ~25 degree ambient lol


----------



## Descadent

My 2 290X Vapor-Xs idle about the same.


----------



## FuriousPop

Quote:


> Originally Posted by *tsm106*
> 
> Yea a good first start is getting your flow as close as possible to the ideal of 1.5gpm under load. Getting there from your starting point is going to mean making some hard decisions. D5 pumps are very expensive when you figure in all the accessories you need to make them work, like a top or a stand, and then there are the performance mods like going 24v.
> 
> If you choose to go the dual D5 route, thats another 100 for a pump, you need a dual top, another 100 bucks, and then since D5 are freaking huge, buy/fashion a mount or try to get it into your rig somehow. Looking at it from the other side of the pump fence, you could just buy a 35X2, and add in the heatsink which is also a pump mount.
> 
> There's also the 24v option, which requires a D5 strong and the controller whichis like 50 bucks. However if you have a vario, there is hardly any gain from testing. If you had a D5 S, the quickest route is a going 24V and it would also be the cheapest.


Thanks for that. After spending the amount that I have on my system, honestly another 100 bucks in the scheme of things is a drop in the ocean!

Stuff it, for the sake of 100 bucks I'll just get another D5 Vario and be done with it..... I'm in Aus, so making another order from the States is a little difficult since the postage costs more than the product!


----------



## aaroc

Quote:


> Originally Posted by *tsm106*
> 
> For kicks, look at my 3930 rig pics. Start from the beginning and work your way across. You'll see some similarities. A year ago or so I said screw it and went external. Best thing I ever did... moving to a setup of mass over rads. You can see my rigs evolution from being able to "live" with it to where it is now. Two of these rads and accoutrements, some QDCs... I won't ever go back to just barely scraping by on rads. And the ironic part? I could have saved sooo much more money starting with the rads below instead blowing so much money trying to stuff silly small rads inside my case and living with it.
> 
> http://www.performance-pcs.com/catalog/index.php?main_page=product_info&products_id=21622


Why did you choose external radiators with 120mm fans instead of one with 140mm fans?


----------



## mAs81

Quote:


> Originally Posted by *Rankre*
> 
> So I've been running my new set up for a few days now and I've noticed that my Vapor-x r9 290 idles at ~40-45C. Thats a little high I think? I'm running 1440p with an enthoo pro case that has 140 and 200mm fans for exhaust and intake. I do have my 212 evo spitting air out the top tho, so the fan for it sits right next to my r9 290.
> 
> Load temps never went higher than 65C, but even with msi afterburner and fans at 40% idle, its 45 degrees right now with ~25 degree ambient lol
> Quote:
> 
> 
> 
> Originally Posted by *Descadent*
> 
> my 2 290x vapor's idle about the same

I've never seen my Vapor-X 290 idle below 50c








But since it's summer in Greece now and the ambient is 30+, I believe that's normal?
To be fair, my 290 is cramped in a 350D and the airflow is not the best it could be.. Even though I have 2 120mm as intake, due to the HDD cage only one is fully pushing air in unobstructed..


Spoiler: case in point:






Load temps with "new" games/benchmarks are in the high 70s with custom fan control in TriXX or AB..
I know these cards run hot, but I'm a little worried too.. I have a dual monitor setup @1080p/60Hz


----------



## Mega Man

Watercooling: life's better / quieter here.


----------



## fateswarm

Is there a more active club in OCN?


----------



## Klocek001

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Borderline. You have the wattage but not really the amperage. 290s can use 25-30A each especially when OCed, that leaves less than 10-20A for the entire rest of your system. I would upgrade PSU before getting crossfire, to be on the safe side.
> Your CPU is more power efficient than mine by a bit though.


What if I want to leave them at stock speeds? It turns out I can't really squeeze that much out of my card on stock voltage, and I'm afraid of the temps as well. Is 840W of "real power" and 70A going to fail at CFX? The rest of the system: i5 2500K @ 4.8GHz, 4GB DDR3, a 1TB Seagate, and that's pretty much it. Just to remind you, I am wondering whether or not to add another 290 Tri-X to my setup without changing the power supply (an XFX Pro 850W).


----------



## the9quad

Pretty sure the power is the amp draw times the voltage. So if you aren't exceeding the wattage, you aren't exceeding the amp draw, since the voltage is constant. I.e., a 1200 watt power supply at 12 volts is going to give you roughly 100 amps, assuming the single rail is rated that high.

http://www.rapidtables.com/convert/electric/Volt_to_Watt.htm
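The conversion described above is just P = V × I rearranged. A quick sketch using the thread's own numbers (the 70A single-rail PSU and the ~30A-per-card worst case come from earlier posts; treat them as illustrative, not a spec):

```python
# Watts on the 12V rail -> available amps (P = V * I, so I = P / V).
def rail_amps(watts, volts=12.0):
    """Amps available on a rail of the given voltage."""
    return watts / volts

print(rail_amps(1200))  # -> 100.0, the 1200W example above

# Rough Crossfire headroom check with the thread's figures:
# a 70A single 12V rail minus two 290s at ~30A worst case each.
headroom = 70 - 2 * 30
print(headroom)         # -> 10 amps left for everything else
```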


----------



## Arizonian

Quote:


> Originally Posted by *Klocek001*
> 
> Hi guys
> 
> New to R9 290, just got myself a Tri-X version. It's 1000 MHz on core, 1300 mem - how much can I still squeeze out of it (safely, that is. changing vcore is NOT an option) ?
> And if I was planning to buy another one for CF, is my power supply going to handle it ? It's XFX Pro 850W, single rail 70A.
> 
> Finally, can I be in the club now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *The Storm*
> 
> 3 290's, 4930k, 4 rads, all on a single d5 on setting 3. Works well if I say so myself. Crappy cell phone pic as I am not home to upload the good photos, but you get the jist.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated









Quote:


> Originally Posted by *bluedevil*
> 
> RMA update with VisionTek
> 
> 7/24/14 Submitted RMA form
> 7/25/14 Received email for invoice
> 7/25/14 Received email with RMA #
> 7/26/14 Shipped GPU
> 7/28/14 Set for delivery
> 7/28/14 Delivered and signed for


Really nice and quick RMA. Way to go VisionTek.

Quote:


> Originally Posted by *HoneyBadger84*
> 
> It's alive... installing drivers
> 
> 
> 
> 
> 
> 
> 
> for the last 5 mins... lol
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit:
> 
> X_X Drivers get done, reboot, thunderstorm outside... shutting down til it passes *sigh* only delaying the inevitable awesome...
> 
> I'll post up Idle temps once I reboot for ULPS disable...


Congrats - updated









Quote:


> Originally Posted by *fateswarm*
> 
> Is there a more active club in OCN?


Not right now. To give you an idea of how quickly this thread moves:

The Mechanical Keyboard Club, started in 2009, has 26,400 posts and 1,477,358 views.

The OCN Headphones and Earphones Club, started in 2007, has 26,508 posts and 1,352,902 views.

The AMD R9 290X / 290 Owners Club, started in 2013, has *27,581* posts with *1,330,354* views.

The one club that beats this is another AMD GPU thread:

The AMD Radeon HD 7950/7970/7990 Owners Thread, started in 2012, has *36,665* posts with *1,840,642* views, but has *three* GPUs (_one, though, is the rare 7990_) in the club roster where we only have two. I have a feeling we may just match that before the next series comes out and things start to cool down here.









Don't blink because you may have just missed my post.









EDIT: 437 Members right now on the roster.


----------



## danno29

New R9 290 owner here

hope to see how it performs with my new build


----------



## Arizonian

Quote:


> Originally Posted by *danno29*
> 
> New R9 290 owner here
> 
> hope to see how it performs with my new build
> 
> 
> Spoiler: Warning: Spoiler!


Nice blue aesthetics in your rig all the way down to the Vapor-X









Congrats - added


----------



## Klocek001

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Borderline. You have the wattage but not really the amperage.


Yeah, about that. I wrote in the official XFX PSU thread, and they say that doesn't even make sense. Now I'm confused...


----------



## the9quad

Quote:


> Originally Posted by *Klocek001*
> 
> Yeah, about that. I wrote on the official XFX PS thread and they say that's doesn't even make sense. Now I'm confused...


It doesn't make sense because wattage equals current times voltage. You have the wattage therefore you have the amperage.


----------



## bluedevil

Quote:


> Originally Posted by *Arizonian*
> 
> Really nice and quick RMA. Way to go VisionTek.


They just got it, lol. I didn't get it back yet.


----------



## Dasboogieman

Quote:


> Originally Posted by *the9quad*
> 
> It doesn't make sense because wattage equals current times voltage. You have the wattage therefore you have the amperage.


What he means is that you have enough wattage, but the rails individually may not be able to cope with the amperage draw. That being said, most decent multi-rail designs in the 800W range can easily deliver the 150W per rail required to power a 290X.

I also googled the XFX unit; they are all single-rail designs, so I doubt amperage per rail is an issue.


----------



## the9quad

Quote:


> Originally Posted by *Dasboogieman*
> 
> What he means is that you have enough Wattage but the rails individually may not be able to cope with the amperage draw. That being said, most decent multirail designs in the 800W range can easily deliver 150W per rail required to power a 290X.
> 
> I also googled the XFX unit, they are all single rail designs so I doubt amperage per rail is an issue.


Yeah, in my earlier post above I said assuming a single rail.








http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/27580#post_22634093


----------



## Talon720

Quote:


> Originally Posted by *tsm106*
> 
> Here are some things I worked out for myself overclocking on Hawaii that I wrote a while back. Maybe it can help you.
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/6120_40#post_21205990


Hey, I read through that post last night. In my situation, let's say for example I'm trying for 1200 core at 1.4V. I'll get some artifacts, so I up the voltage, and it blacks out... even with memory at stock. If I just lower the core clock while keeping the higher voltage, same deal. ASIC is 69% on the display/main card; the other two are 77-ish, which I dunno if that makes any difference. I've only read about one other person having this issue, but that was over 1.55V. Could this come down to what you were saying about the core/mem/volts ratio, and how some combinations work better together than others? The main card from the beginning seemed to get to 1200 core and no matter what wouldn't budge over. Are you saying that if I tried 1250 core it might work better?
I also read someone was using a hex editor to edit the BIOS; do you need to know how to read code to effectively make changes? Also, I know the HWiNFO64 voltage reading for the VRM isn't the most accurate, but can the comparison of Elpida cards using more wattage than Hynix cards still be drawn? Just because it's inaccurate in absolute terms doesn't mean it isn't consistent from card to card.


----------



## Arizonian

Quote:


> Originally Posted by *bluedevil*
> 
> They just got it lol, I didn't get it back yet. Lol










myself. LOL

Keep us posted. Somehow I thought it would be a cross-ship. Still, you're on track with a good RMA.


----------



## bluedevil

Quote:


> Originally Posted by *Arizonian*
> 
> 
> 
> 
> 
> 
> 
> 
> myself. LOL
> 
> Keep us posted. Somehow I thought it to be cross shipping. Still your on track with a good RMA.


Emailed them just now asking about an ETA on when I should expect a new card.







Hopefully they get one shipped out today or tomorrow and I get it before Friday.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Klocek001*
> 
> Yeah, about that. I wrote on the official XFX PS thread and they say that's doesn't even make sense. Now I'm confused...


XFX forums are going to be as full of fanboiism as the eVGA forums are for people with eVGA products. They're not going to believe their PSU couldnt possible be enough. I'm simply trying to inform you that while it may or may not technically be "enough" it is "inadequate" if your cards actually reach their rated draw, that your system would only have 10-20A at most for the entire rest of your system left. Take it or leave it, I'm not going to argue with people, especially when they haven't done the testing themselves, vs the advice I got stating the amperage, wattage etc requirements for these cards at stock & when pushed from someone that's done the testing with the equipment necessary to give factual information.

I'll just say it again, is your PSU enough? Maybe. Is it safe? Maybe not. Would I run it with the amperage being that borderline? No. But that may be more of a "what if" than an actual issue, because your system may or may not actually need more amperage than what's left IF those cards hit their rated max-draw (at stock). *shrug* Simply trying to get people to lean to the side of "safe" rather than not.

I doubt nleksan will personally show up in this thread to give his two cents, but he's where I got most of the information on how many amps/watts these cards can draw. It's pretty crazy.
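For anyone wanting to sanity-check their own setup, the arithmetic behind this is simple: watts drawn at 12V divided by 12 gives amps. A minimal sketch with made-up numbers (the 62A rail rating and 300W per-card draw are assumptions - read your own PSU label and card reviews):

```python
# Rough 12V-rail budget check. Numbers here are illustrative assumptions:
# a PSU rated for 62 A combined on its 12V rail(s), and two cards that
# can each pull ~300 W under a heavy load.
PSU_12V_AMPS = 62.0        # assumed: read this off your PSU's label
CARD_WATTS = 300.0         # assumed: worst-case draw per card
NUM_CARDS = 2

gpu_amps = NUM_CARDS * CARD_WATTS / 12.0   # I = P / V at 12 V
headroom = PSU_12V_AMPS - gpu_amps         # left for CPU, drives, fans
print(f"GPUs draw {gpu_amps:.0f} A; {headroom:.0f} A left for the rest")
```

With those example numbers the GPUs alone take 50A, leaving about 12A for the whole rest of the system, which is exactly the kind of borderline margin being described above.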


----------



## HoneyBadger84

X_X Just got X16987 in 3DMark11







WHY NO 17K!!! lol


----------



## Klocek001

Quote:


> Originally Posted by *HoneyBadger84*
> 
> XFX forums are going to be as full of fanboiism as the eVGA forums are for people with eVGA products.


I don't know whether or not it's what you meant, but I wasn't talking about XFX forum but the XFX PS thread on OCN.

C'mon, the guy who wrote this had 1500+ rep (!)


----------



## The Storm

Quote:


> Originally Posted by *FuriousPop*
> 
> wanted to ask The Storm - in your pics - just how much rads are in there? i can see a 240 +120+120+either a 360 or 480 at top??? and at setting 3 on the pump?
> i am assuming running the pump at high settings eg 4 or 5 will no doubt shorten its life span?


Yes, it's a 360 at 60mm thickness, a 240 at 60mm, a 120 at 60mm, and a thin-style 120.
And no, setting 3 isn't about lifespan, since these pumps are technically rated for 24v; it's about the vibrations. For some reason mine shakes and reverberates throughout the case at anything higher than 3, and I can't stand the sound of the shaking.


----------



## PCSarge

Quote:


> Originally Posted by *tsm106*
> 
> Technically you only showed your rig, not proof it works well. With that many blocks your flow is at a trickle. And with that small amount of surface area for rads, I know first hand temps are not as good as they could be.


Actually, it is... the 35X still has plenty of flow with the 2 blocks and 2 rads in play. It puts out so much flow that if the res isn't full, it's pressure-spraying the other end of it.


----------



## tsm106

Quote:


> Originally Posted by *FuriousPop*
> 
> thanks for that, after spending the amount that i have done so for my system, honestly another 100 bucks in the scheme of things is a drop in the ocean!
> 
> Stuff it, for the sake of 100 bucks, i'll just get another D5 Vario and be done with it..... i am in Aus so making another order from the states is a little difficult since the postage costs more than the product!


The dual D5 mounts are typically another hundred. You can avoid that if you can manage to plumb both pumps in separately, like one at one end of the loop and one at the other. Your pump is huge, so that may be an even bigger hurdle. GL.

Quote:


> Originally Posted by *aaroc*
> 
> Why did you choose external radiators with 120mm fans instead of one with 140mm fans?


120mm fans have higher static pressure, whereas 140mm fans have higher flow. For radiator applications, static pressure is what we want; higher flow is better for case fans. When I went this route, there were no 140mm fans that could compete with GTs. I'm not sure there are any now, since the AP15s have been relegated to EOL.

Quote:


> Originally Posted by *Talon720*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Here are some things I worked out for myself overclocking on Hawaii that I wrote a while back. Maybe it can help you.
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/6120_40#post_21205990
> 
> 
> 
> Hey, read through that post last night. In my situation, let's say for example I'm trying for 1200 core at 1.4. I'll get some artifacts, so I up the voltage, and then it blacks out... even with memory at stock. If I just lower the core clock while keeping the higher voltage, same deal. ASIC is 69% on the display/main card and the other two are 77-ish, which I dunno if that makes any difference. I only read about one other person having an issue, but that was over 1.55v. Could this come down to what you were saying about the core/mem/volts ratio, and how some work better together than others? The main card from the beginning seemed to get to 1200 core and no matter what wouldn't budge over. Are you saying if I tried 1250 core it may work better?
> I also read someone was using a hex editor to edit the BIOS - do you need to know how to read code to effectively make changes? Also, I know the HWiNFO64 voltage reading for the VRM isn't the most accurate, but can the comparison of Elpida cards using more wattage than Hynix cards be drawn? Just because it's inaccurate doesn't mean it's inconsistent from card to card.

Are you using the same software and BIOS I was? ASIC quality is meaningless; I never bothered to look at my cards' ASIC values. There is no differentiation between Hynix and Elpida as far as power consumption goes. Software readings are very inaccurate when it comes to power draw; that's why high-end cards often come with DMM points built in.

A 1200MHz wall... sounds like you're running out of voltage. What is your OC method, and what app are you running? I described the app and BIOS setup I was using: the method is for GPU Tweak + the PT1 BIOS under water. My results were from 3 random cards, which all did at least 1300/1600. My friend has had 3 cards too, a 290/290X/290X Lightning, and all did at least 1300 using the same methodology. However, to get there you need a good loop and the brass to push 1.45v loaded sometimes.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Klocek001*
> 
> I don't know whether or not it's what you meant, but I wasn't talking about XFX forum but the XFX PS thread on OCN.
> 
> C'mon, the guy who wrote this had 1500+ rep (!)


Oh, I assumed you went to some XFX support forum, not here on OCN. lol


----------



## HoneyBadger84

Well, I posted these in my system stuff thread, but I figure I'll give y'all some eye candy as well. I'm still disappointed in the 3DMark11 P & E numbers, but CPU-dependent benchmarks are OP:


----------



## tsm106

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Well I posted these in my system stuff thread but I figure I'll give y'all some eye candy as well, I'm still disappointed in the 3DMark11 P & E numbers, but CPU-dependent benchmarks OP:
> 
> 
> Spoiler: Warning: Spoiler!


Wouldn't it be easier to just post the url? And fyi, with quads you need a lot of clock on cpu to feed them.

Here's something to chew on.









http://www.3dmark.com/3dm11/7303785


----------



## anubis1127

Quote:


> Originally Posted by *tsm106*
> 
> Wouldn't it be easier to just post the url? And fyi, with quads you need a lot of clock on cpu to feed them.
> 
> Here's something to chew on.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/7303785


What clock speed was that at on the 3930k? 5Ghz?


----------



## tsm106

Yea, 5ghz. I don't bother disabling any power savings.


----------



## anubis1127

Quote:


> Originally Posted by *tsm106*
> 
> Yea, 5ghz.


Nice, seemed about right for that physics score.


----------



## HoneyBadger84

Quote:


> Originally Posted by *tsm106*
> 
> Wouldn't it be easier to just post the url? And fyi, with quads you need a lot of clock on cpu to feed them.
> 
> Here's something to chew on.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/7303785


Exactly, I'm getting beaten by inferior GPUs in all but the Extreme preset, because P & E are so CPU-dependent. The only reason I ran P & E at all was for HWBot & a leaderboard submission.







FireStrike is the score I actually care about, and I beat the same GPUs that beat me in 11, both on Normal & Extreme, so I'm happy









And meh, didn't bother posting URLs cuz I was in a hurry.


----------



## tsm106

Not sure I would use inferior. You're still slower than my 7970 quads in FS.


----------



## HoneyBadger84

Quote:


> Originally Posted by *tsm106*
> 
> Not sure I would use inferior. You're still slower than my 7970 quads in FS.


They're clocked at 1300MHz and also on a CPU that has 400MHz on mine. It's perfectly proper to use "inferior" stock vs stock, or if my cards would run 1300MHz (which they no doubt would if I went liquid) I'd win. So meh. Regardless, it's a personal best, and I beat the 280X x4 system I was wanting to beat; he's on liquid vs me on air, so I'm happy.

If this were an actual contest I'd probably have one of the highest scores on air, especially if you factor in reference coolers... I think I'd be the only one there with 4







Or one of the few to be crazy enough to do it.


----------



## tsm106

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Not sure I would use inferior. You're still slower than my 7970 quads in FS.
> 
> 
> 
> That are clocked at 1300MHz and also on a CPU that has 400MHz on mine. Perfectly proper to use "inferior" as stock vs stock, or if my cards would run 1300MHz (which they would if I got liquid no doubt) I'd win. So meh. Regardless of that, personal best and I beat the 280x x4 system I was wanting to beat in it, he's on liquid vs I'm on air, so I'm happy.
> 
> If this were an actual contest I'd probably be one of the highest scores on air, especially if you factor in Reference coolers... think I'd be the only one there with 4
> 
> 
> 
> 
> 
> 
> 
> Or one of the few to be crazy enough to do it.

You want a head start too?? You new guys talk too much, might as well just phone the win in huh?

That's why I keep telling you: stop yapping, get your blocks on, and overclock it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *HoneyBadger84*
> 
> That are clocked at 1300MHz and also on a CPU that has 400MHz on mine. Perfectly proper to use "inferior" as stock vs stock, or if my cards would run 1300MHz (which they would if I got liquid no doubt) I'd win. So meh. Regardless of that, personal best and I beat the 280x x4 system I was wanting to beat in it, he's on liquid vs I'm on air, so I'm happy.
> 
> *If this were an actual contest I'd probably be one of the highest scores on air, especially if you factor in Reference coolers... think I'd be the only one there with 4
> 
> 
> 
> 
> 
> 
> 
> Or one of the few to be crazy enough to do it*.


Nope

http://hwbot.org/submission/2579580_lucky_n00b_3dmark___fire_strike_extreme_4x_radeon_r9_290x_19491_marks


----------



## HoneyBadger84

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Nope
> 
> http://hwbot.org/submission/2579580_lucky_n00b_3dmark___fire_strike_extreme_4x_radeon_r9_290x_19491_marks


No GPU clocks listed... nice.









They're actually not that much more OC'd than mine, but the CPU is (a) a better performer and (b) clocked higher than mine. He smashed me in Physics & Combined because of that... Graphics is closer, but there's still quite an advantage with those clocks & the extra CPU throughput for the 4 cards to feed off of. That guy's CPU/RAM throughput probably kicks mine pretty hard; his RAM is also clocked higher with about the same timings.

I wonder if those were ES chips & what voltage he was running... I won't go past 131mV because temps get into the mid 70s at max load, even with my shop blower fan as side flow.
Quote:


> Originally Posted by *tsm106*
> 
> You want a head start too?? You new guys talk too much, might as well just phone the win in huh?
> 
> That's why I keep telling stop yapping and get your blocks on and overclock it.


Not goin' liquid; when are people going to stop assuming otherwise? This rig is for gamin', not for showin' off e-P. Ain't keeping these things long enough to justify the cost of a liquid loop, period. 6-12 months tops. Also this system is dual purpose, folding & gaming... so yeah...


----------



## anubis1127

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Nope
> 
> http://hwbot.org/submission/2579580_lucky_n00b_3dmark___fire_strike_extreme_4x_radeon_r9_290x_19491_marks


Does that carry over into gaming too, or is it just synthetic benches? If so, does Tahiti CFX scale better than Hawaii?

Thanks.


----------



## HoneyBadger84

Quote:


> Originally Posted by *anubis1127*
> 
> Does that carry over into gaming too, or is it just synthetic benches? If so, does Tahiti CFX scale better than Hawaii?
> 
> Thanks.


If you're gaming at anything under 1440p, it probably doesn't carry over at all (going from 3 to 4 GPUs). I can tell you most games at 1080p don't show much improvement going from 2 to 3 290Xs, but I think that might be partly game engine, as games like Metro 2033 & Crysis 3 (and BF4 according to reviews, though I don't have it) scale pretty well across increasing GPU counts.

Mainly I'll be using it for 3x 1080p & later 4K, but I'm so-so on actually keeping the 4th GPU, mostly because I don't like how this thing behaves with the dual PSU setup on boot up. I know the dual-PSU cable connector is "safe" but I don't like the idea of it compared to using OC Link like I could if I had another Antec HCP PSU.


----------



## Sgt Bilko

Quote:


> Originally Posted by *HoneyBadger84*
> 
> No GPU clocks listed... nice.
> 
> 
> 
> 
> 
> 
> 
> 
> .


1175/1500, they are in the picture.
Quote:


> Originally Posted by *anubis1127*
> 
> Does that carry over into gaming too, or is it just synthetic benches? If so, does Tahiti CFX scale better than Hawaii?
> 
> Thanks.


Hawaii seems to scale better than Tahiti did, but 3 cards is still the sweet spot for most games; the 4th card can bring the fps down in some titles.
Quote:


> Originally Posted by *HoneyBadger84*
> 
> If you're gaming at anything under 1440p, it probably doesn't carry over at all (going from 3 to 4 GPUs). I can tell you most games at 1080p don't show much improvement going from 2 to 3 290Xs, but I think that might be partly game engine, as games like Metro 2033 & Crysis 3 (and BF4 according to reviews, though I don't have it) scale pretty well across increasing GPU counts.
> 
> Mainly I'll be using it for 3x 1080p & later 4K, but I'm so-so on actually keeping the 4th GPU, mostly because I don't like how this thing behaves with the dual PSU setup on boot up. I know the dual-PSU cable connector is "safe" but I don't like the idea of it compared to using OC Link like I could if I had another Antec HCP PSU.


Metro scales to 3 cards and the 4th doesn't really do anything; same for Crysis 3 AFAIK. Tomb Raider scales very well, though.

Mind you, most of my info came from DeadlyDNA's testing before he left OCN, so there isn't really any proof of it atm... unless you feel like running some scaling tests?


----------



## HoneyBadger84

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Mind you most of my info came from DeadlyDNA's testing before he left OCN so not really any proof of it atm.....unless you feel like running some scaling tests?


At 1080p or 3x1080p? Metro 2033 @ 3x1080p (at the settings I play on, which is DOF off because I find it annoying in that game) gets 106FPS with 3 cards and 85FPS with 2 cards, so not great scaling, but it's there. I haven't run it on 4 cards with the CPU @ 4.6GHz, but at 4.2GHz, 4 cards @ 3K got ~114FPS... not much extra, but the minimum FPS was around 75, so 100% playable by anyone's standards, in Eyefinity.
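Those FPS numbers translate directly into per-card scaling efficiency (actual FPS divided by ideal linear FPS). A quick sketch using the Metro 2033 figures from this post:

```python
# Scaling efficiency from the Metro 2033 @ 3x1080p numbers above:
# 85 FPS on 2 cards, 106 on 3, ~114 on 4 (the 4-card run was at a
# lower CPU clock, so treat that one loosely).
fps = {2: 85.0, 3: 106.0, 4: 114.0}
per_card_base = fps[2] / 2          # FPS one card contributes at 2 cards

for n in sorted(fps):
    efficiency = fps[n] / (per_card_base * n)   # actual / ideal linear
    print(f"{n} cards: {fps[n]:.0f} FPS -> {efficiency:.0%} of linear")
```

By this measure the 3rd card runs at roughly 83% of linear scaling and the 4th drops to about 67%, which matches the "not great, but it's there" description.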


----------



## Sgt Bilko

Quote:


> Originally Posted by *HoneyBadger84*
> 
> at 1080p or 3x1080p? Metro 2033 @ 3x1080p (and the settings I play on which is DOF off because I find it annoying in that game) gets 106FPS with 3 cards, 85FPS with 2 cards, so not great scaling but it's there. I haven't run it on 4 cards with the CPU @ 4.6GHz, but at 4.2GHz 4 cards @ 3K got ~114FPS... so not much extra, but the minimum FPS was around 75, so 100% playable by anyone's standards, in Eyefinity.


Well, the only graph I have saved is Tomb Raider at 1080p


----------



## tsm106

AMD cards scale fine in quad. However, people don't realize the CPU is always a bottleneck, especially with Hawaii. That thread with the scaling tests left a lot to be desired imo.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well the only graph i have saved is Tomb Raider at 1080p


Looks pretty in line with the numbers I've seen in Metro 2033 in terms of scaling. And that engine is one that I've seen has pretty good scaling in general, at least in 3K... it scaled great in 1080p with 7970s, but it seems like after the 2nd GPU at 1080p the numbers stop going up... of course with just 2 at 1080p you're talkin' about over 100FPS average so meh.


----------



## anubis1127

Quote:


> Originally Posted by *HoneyBadger84*
> 
> If you're gaming at anything under 1440p, it probably doesn't carry over at all (going from 3 to 4 GPUs). I can tell you most games at 1080p don't show much improvement going from 2 to 3 290Xs, but I think that might be partly game engine, as games like Metro 2033 & Crysis 3 (and BF4 according to reviews, though I don't have it) scale pretty well across increasing GPU counts.
> 
> Mainly I'll be using it for 3x 1080p & later 4K, but I'm so-so on actually keeping the 4th GPU, mostly because I don't like how this thing behaves with the dual PSU setup on boot up. I know the dual-PSU cable connector is "safe" but I don't like the idea of it compared to using OC Link like I could if I had another Antec HCP PSU.


Cool. Yeah, the 4th GPU always seemed like it only really benefited benches from most of the results I have looked at.

Oh, these work for multi PSU:

http://www.add2psu.com

I used one for about a year and it worked flawlessly with two x750 PSUs.
Quote:


> Originally Posted by *tsm106*
> 
> AMD cards scale fine in quad. However, ppl don't realize the cpu is always a bottleneck especially with Hawaii. That thread with the scaling tests left a lot to be desired imo.


Cool, thanks. I know CFX scaling is pretty stellar these days, especially vs SLI.


----------



## ZealotKi11er

Quote:


> Originally Posted by *tsm106*
> 
> AMD cards scale fine in quad. However, ppl don't realize the cpu is always a bottleneck especially with Hawaii. That thread with the scaling tests left a lot to be desired imo.


Yeah, that CPU again. It's been killing GPU performance for years and people still blame scaling in the drivers. I'm sick of CPU bottlenecks even with 2 x 290X @ 1440p; I have to apply heavy AA to increase GPU usage while the fps stays the same.


----------



## kayan

So, I have a quick update for all the temp folks and performance stuff:

After 30ish minutes of playing BF4 @ 2560x1440, 100% scale, and 120POV, avg between 53-63 FPS.

My FX-9370 maxed out at 50C with a steady 4.8GHz clock, turbo off. My CrossFire 290Xs at stock clocks maxed out at 53C and 56C on the core respectively, with VRM 1 at 61C on each and VRM 2 at 47C and 46C respectively. Open case at the moment, and it's 75F here today.

Methinks that I should overclock them some.


----------



## tsm106

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> AMD cards scale fine in quad. However, ppl don't realize the cpu is always a bottleneck especially with Hawaii. That thread with the scaling tests left a lot to be desired imo.
> 
> 
> 
> Yeah, That CPU again. Been killing GPU performance for years and people still blame scaling in drivers. I am sick of CPU bottlenecks even with 2 x 290X @ 1440p. I have to apply heavy AA to increase GPU usage while fps stays the same.

It will be nice when Mantle matures. By then we'll have cut off that albatross and finally get to see what these cards could really do. Although, one could load up a Codemasters game and ogle the stupendous scaling thanks to their use of compute, and one could do that right now and not have to wait till devs get off their asses.

Also, you guys keep mentioning Metro. It's a graphics-heavy game that is surprisingly poorly threaded; it would not be my choice of test to prove scaling. And there's a ton of games just like it, yuck.


----------



## anubis1127

Quote:


> Originally Posted by *kayan*
> 
> So, I have a quick update for all the temp folks and performance stuff:
> 
> After 30ish minutes of playing BF4 @ 2560x1440, 100% scale, and 120POV, avg between 53-63 FPS.
> 
> My FX-9370 maxed out at 50C, with a 4.8 clock steady, turbo off. My xfire 290x @ Stock clocks are maxed at the core as 53 and 56C respectively, with VRM 1 = 61C each, and VRM 2 = 47 and 46C respectively. Open case at the moment, and it's 75F degrees today.
> 
> Methinks that I should overclock them some.


I think so too, temps look fine, needs more OC. Moar volts.
Quote:


> Originally Posted by *tsm106*
> 
> It will be nice when Mantle matures. By then we'll have cut off that albatross and finally get to see what one could really do. Although, one could load up a Codemasters game and oggle at the stupendous scaling thanks to their use of compute. And one could do that right now and not have to wait till devs get off their asses.
> 
> Also, you guys keep mentioning Metro. It's a terrible gfx heavy game that yet is surprisingly poorly threaded. It would not be my choice to prove scaling as a test. And there's a ton of games just like it, yuck.


Yes, that will be nice. This 4930k I have isn't a very fast one, I've been using little baby GCN 1.0 cards, and the only Mantle-supported game I have is BF4, so I don't get the best Mantle experience atm.

I wonder if it would be any better in some of the newer games that support Mantle... I kind of want to try that Plants vs. Zombies game, but it uses the same engine as BF4, so I'm guessing the experience won't be much better for me.

I think I may have to try Hawaii again, but last time I did I was so unimpressed with it that I took the card back the next day. That was before Mantle launched, though.


----------



## tsm106

Quote:


> Originally Posted by *anubis1127*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> It will be nice when Mantle matures. By then we'll have cut off that albatross and finally get to see what one could really do. Although, one could load up a Codemasters game and oggle at the stupendous scaling thanks to their use of compute. And one could do that right now and not have to wait till devs get off their asses.
> 
> Also, you guys keep mentioning Metro. It's a terrible gfx heavy game that yet is surprisingly poorly threaded. It would not be my choice to prove scaling as a test. And there's a ton of games just like it, yuck.
> 
> 
> 
> Yes, that will be nice, this 4930k I have isn't a very fast one. I have been using little baby GCN 1.0 cards, and the only mantle supported game I have is bf4, so I don't get the best mantle experience atm.
> 
> I wonder if it would be any better on some of the newer games that support mantle.. I kind of want to try that Plants vs Zombies game, but it uses the same engine as bf4 so I'm guessing the experience won't be much better for me.

Fire up any recent Codemasters game, like GRID Autosport, GRID 2 or F1 2013/2014. Set it to max, enable advanced lighting and global illumination, then bathe in the glorious scaling! It's actually possible to have good scaling now. It is sad, however, that only a handful of games use the EGO 3 engine; Codies should really look into licensing it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *kayan*
> 
> So, I have a quick update for all the temp folks and performance stuff:
> 
> After 30ish minutes of playing BF4 @ 2560x1440, 100% scale, and 120POV, avg between 53-63 FPS.
> 
> My FX-9370 maxed out at 50C, with a 4.8 clock steady, turbo off. My xfire 290x @ Stock clocks are maxed at the core as 53 and 56C respectively, with VRM 1 = 61C each, and VRM 2 = 47 and 46C respectively. Open case at the moment, and it's 75F degrees today.
> 
> Methinks that I should overclock them some.


FPS looks a bit low. I get ~100fps with 125% scaling at 1440p.


----------



## the9quad

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Fps looks a bit low. I get ~ 100fps with 125% scaling and 1440p.


Next time you play a full mp round post your frame time graphs, please.


----------



## ZealotKi11er

Quote:


> Originally Posted by *the9quad*
> 
> Next time you play a full mp round post your frame time graphs, please.


How do I check that?


----------



## Themisseble

Quote:


> Originally Posted by *tsm106*
> 
> It will be nice when Mantle matures. By then we'll have cut off that albatross and finally get to see what one could really do. Although, one could load up a Codemasters game and oggle at the stupendous scaling thanks to their use of compute. And one could do that right now and not have to wait till devs get off their asses.
> 
> Also, you guys keep mentioning Metro. It's a terrible gfx heavy game that yet is surprisingly poorly threaded. It would not be my choice to prove scaling as a test. And there's a ton of games just like it, yuck.


Mantle is not the problem. The problem is that games are designed for DX11, not for Mantle.


----------



## Themisseble

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Fps looks a bit low. I get ~ 100fps with 125% scaling and 1440p.


Maybe he doesn't use Mantle. AMD's DX11 sucks.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Themisseble*
> 
> mantle is not the problem. Problems is that Games are designed for Dx11 not for Mantle.


That is going to change in a big way. Both the PS4 & Xbox One GPUs were designed around GCN architecture, which means the PC versions of games that are "console ports" will likely natively support Mantle as new titles come out. Great things are coming, and they're mostly on AMD's side of the fence thanks to that.
Quote:


> Originally Posted by *Themisseble*
> 
> maybe he dont use mantle. AMD DX11 sucks


If so, then why do R9 290s meet or beat GTX 780 3GBs that cost more than they do in DX11 titles?


----------



## anubis1127

Quote:


> Originally Posted by *HoneyBadger84*
> 
> That will vastly be changing. Both PS4 & Xbox One GPUs were designed around GCN architecture, which means the PC versions of games that are "console ports" will likely natively support Mantle as new titles come out. Great things are coming, and they're mostly on AMD's side of the fence thanks to that.
> If so then why do R9 290s meet or beat GTX 780 3GBs that cost more than they do, in DX11 titles


I think that is a common misconception; the consoles being based on GCN has nothing to do with PC "console ports" using Mantle. It's not like the consoles use Mantle.


----------



## heroxoot

Quote:


> Originally Posted by *anubis1127*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HoneyBadger84*
> 
> 
> 
> I think that is a common misconception, the consoles being based around GCN has nothing to do with PC "console ports" using Mantle. Its not like the consoles use Mantle.

I like to believe that since consoles are now using standard PC parts, ports will be a little better soon. Maybe optimized a little better at the least, since they're building the game on what is basically an HTPC.


----------



## kayan

I can't quote on my phone for some reason, but ....

I haven't tried Mantle yet, but I will Thurs and report back.

Also, my frames/benchies are generally lower because I have an AMD CPU too, and I'm ok with that. Computing/gaming is smooth on this CPU (except America's Army Proving Ground), whereas it wasn't on Intel (this is not to start a flame war).

I'll check all my settings in game, and possibly see if I can get everything overclocked by Thursday (doubt I'll have time tomorrow).

I will update then


----------



## HoneyBadger84

Quote:


> Originally Posted by *heroxoot*
> 
> I like to believe that since consoles are using normalized PC parts that ports will be a little better soon. Maybe optimized a little better at the least since they are building the game on what is a HTPC.


The amount of time & extensive work Rockstar has had to put into moving GTA V from the 360/PS3 to the One/PS4/PC speaks volumes... HOPEFULLY it won't run like GTA IV did on release (I didn't play it, but I hear it was HORRIBLE). I'm really looking forward to being able to play GTA V on PC; I can't stand playing shooters (other than the Halo series) on a console controller.

Also, that game is one of the better games I've played in recent times, very fun, lots of ability to do your own thing as much as you want to, and the characters are fun to me. Stealing a 747 was hard work, I did that for fun ^_^ Getting it off the ground without hitting anything that would make it explode was the hardest part. lol


----------



## ZealotKi11er

Quote:


> Originally Posted by *kayan*
> 
> I can't quote on my phone for some reason, but ....
> 
> I haven't tried Mantle yet, but I will Thurs and report back.
> 
> Also, my frames/benchies are generally lower because I have an AMD CPU too, and I'm ok with that. Computing/gaming is smooth on this CPU (except America's Army Proving Ground), whereas it wasn't on Intel (this is not to start a flame war).
> 
> I'll check all my settings in game, and possibly see if I can get everything overclocked by Thursday (doubt I'll have time tomorrow).
> 
> I will update then


Have you tried the game with 1 card and then 2 cards? You should not be seeing much difference in your case.


----------



## HoneyBadger84

Quote:


> Originally Posted by *kayan*
> 
> I can't quote on my phone for some reason, but ....
> 
> I haven't tried Mantle yet, but I will Thurs and report back.


Every time someone mentions Mantle I'm reminded I need to freakin' buy Thief already... I need a new game to play that isn't Crysis 3 (I still haven't finished Crysis 2, which I want to do before playing 3).


----------



## anubis1127

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Everytime someone mentions Mantle I'm reminded I need to freakin' buy Thief already... I need a new game to play that isn't Crysis 3 (I still haven't finished Crysis 2 which I want to do before playing 3).


It was on sale during the Steam sale for like $15 IIRC; I have such a backlog of games right now, though, that I passed on it. Maybe next time I see it on sale.


----------



## HoneyBadger84

Quote:


> Originally Posted by *anubis1127*
> 
> It was on sale during the steam sale for like $15 iirc, I have such a backlog of games right now though I passed on it. Maybe next time I see it on sale though.


X_X You know how long it'd take me to download it doe? lol X_X Stupid 1.5d/.7u DSL


----------



## the9quad

Quote:


> Originally Posted by *ZealotKi11er*
> 
> How do i check that?


Battlefield 4 Frame Time Analyzer
http://www.overclock.net/t/1469627/bf4fta-battlefield-4-frame-time-analyzer-version-4-2-released-major-release

and also

FLA Calculator
http://www.webwalkers.cz/Windows/FLAcalculator/FLAcalculator.aspx

Those should give you a nice way to benchmark BF4 properly.
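Under the hood, tools like these mostly just turn the per-frame timestamps in a frametimes log into frame times and then summarize them. A minimal sketch of that idea (the timestamp list is made-up sample data, not a real BF4 capture):

```python
# Frame timestamps in milliseconds, FRAPS-style sample data (made up).
timestamps = [0.0, 16.7, 33.4, 55.0, 71.7, 88.4, 120.0, 136.7]

# Frame time = gap between consecutive timestamps.
frame_times = [b - a for a, b in zip(timestamps, timestamps[1:])]
avg = sum(frame_times) / len(frame_times)

# High percentiles expose stutter that the average hides.
ordered = sorted(frame_times)
p99 = ordered[min(len(ordered) - 1, int(0.99 * len(ordered)))]
print(f"avg {avg:.1f} ms (~{1000 / avg:.0f} FPS), 99th pct {p99:.1f} ms")
```

A spike like the 31.6 ms gap in this sample is a visible hitch even when the average FPS looks fine, which is exactly why frame-time graphs tell you more than an FPS counter.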


----------



## kayan

I played on 1 card and 2 cards. And you are correct, not a lot of difference, but I haven't tried turning up the scale either, or AA for that matter. I'll try more on Thursday.


----------



## heroxoot

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> I like to believe that since consoles are using normalized PC parts that ports will be a little better soon. Maybe optimized a little better at the least since they are building the game on what is a HTPC.
> 
> 
> 
> The amount of time & extensive work Rockstar has had to put in to moving GTA V from 360/PS3 to the One/PS4/PC speaks volumes... HOPEFULLY it won't run like GTA IV did on release (I did not play it but I hear it was HORRIBLE). I am really looking forward to being able to play GTA V on PC, I can't stand playing shooters (other than the Halo series) on a console controller.
> 
> Also, that game is one of the better games I've played in recent times, very fun, lots of ability to do your own thing as much as you want to, and the characters are fun to me. Stealing a 747 was hard work, I did that for fun ^_^ Getting it off the ground without hitting anything that would make it explode was the hardest part. lol
Click to expand...

To this day I can hardly get 60+ FPS in GTA IV. On my 7970 it topped out at 60fps, but it was pretty consistent, and using realism mods didn't change that at all. Open-world games like that tend to just be poorly optimized. Saints Row 4 runs pretty badly in a lot of areas too, only getting 40fps. At least that's what it's like for me and my friends.


----------



## FuriousPop

Quote:


> Originally Posted by *The Storm*
> 
> Yes, it's a 360 60mm thickness, a 240 60mm, a 120 60mm, and a thin style 120.
> And no setting 3 isn't for life span since these pumps technically are rated for 24v, it's for the vibrations. For some reason my shakes and reverberates throughout the case at anything higher than 3 and I can't stand the sound of the shaking.


So you have 840 rad space as opposed to my 960 and 3 GPUs......hmmmmmmmmmm, wonder if the higher setting would fix my issue... problem is I don't wanna put this together, then drain it and put another pump in - ahhh crap, I might just have to do that!
Quote:


> Originally Posted by *tsm106*
> 
> The dual D5 mounts are typically another hundred. You can avoid that if you can manage to plumb both pumps in separately, like one at one end of the loop and one at the other. Your pump is huge so that maybe be an even bigger hurdle. GL.


Guess I thought ahead - slightly - my XSPC Res/Pump combo is the perfect piece - the pump is already attached to the res out of the box and I've already factored in its large size. The problem will be finding a home for the 2nd D5.... also, the only position I can think of in the loop would result in having Res -> 1st Pump -> 2nd Pump -> CPU -> rad etc etc etc... I'd have to put them right after each other to avoid a tube going right across the whole case and then another back again to the next piece........ LOVELY!


----------



## gobblebox

So right now I'm OCing my Sapphire Tri-X OC r9 290, but the heat on VRM 1 is limiting me. After my last application of ICD7 on the heatsink, I noticed that some of the smaller surrounding thermal pads had some creases, tears, and/or "wrinkles" in them, not to mention lint on a few, as if the last owner had already opened it up once before, but was careless in doing so. I didn't take note of the long strip for VRM 1, but I'm guessing that it is probably in the same shape as the smaller thermal pads.

*Would replacing all of these pads lower my VRM 1 temps?* (I'm very familiar with CPU OCing, but I would consider myself a novice GPU OCer) VRM 2 is sitting at a nice 55 C, and the GPU itself idles at 40 with higher than normal ambients while I'm pushing 130 on VDDC offset just to hit 1150 Core Clock. *Does anyone have any suggestions for replacement pads for this specific card?* Any help would be greatly appreciated!

One more thing, does anyone know if this card can be flashed to 290x? I've looked around & have seen mixed information...


----------



## HoneyBadger84

Quote:


> Originally Posted by *gobblebox*
> 
> So right now I'm OCing my Sapphire Tri-X OC r9 290, but the heat on VRM 1 is limiting me. After my last application of ICD7 on the heatsink, I noticed that some of the smaller surrounding thermal pads had some creases, tears, and/or "wrinkles" in them, not to mention lint on a few, as if the last owner had already opened it up once before, but was careless in doing so. I didn't take note of the long strip for VRM 1, but I'm guessing that it is probably in the same shape as the smaller thermal pads.
> 
> *Would replacing all of these pads lower my VRM 1 temps?* (I'm very familiar with CPU OCing, but I would consider myself a novice GPU OCer) VRM 2 is sitting at a nice 55 C, and the GPU itself idles at 40 with higher than normal ambients while I'm pushing 130 on VDDC offset just to hit 1150 Core Clock. *Does anyone have any suggestions for replacement pads for this specific card?* Any help would be greatly appreciated!
> 
> One more thing, does anyone know if this card can be flashed to 290x? I've looked around & have seen mixed information...


I wouldn't even try to flash it to a 290X, I had a Tri-X OC R9 290 and it wouldn't let me do it.

For the VRM thermal pads, replacing them is not the best idea unless/until you go liquid, they are surprisingly effective for being "stock" pads from what I've been told by a few people on here.


----------



## gobblebox

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I wouldn't even try to flash it to a 290X, I had a Tri-X OC R9 290 and it wouldn't let me do it.
> 
> For the VRM thermal pads, replacing them is not the best idea unless/until you go liquid, they are surprisingly effective for being "stock" pads from what I've been told by a few people on here.


So, is there a way to reduce VRM 1 temp?


----------



## bluedevil

Well to update, VisionTek seems to have not "received" my package as of 2pm. Had me send the tracking number. I really hope it's not "lost". Sent yet another email asking for an update.


----------



## ForNever

What mid tower cases can fit an R9 290? I bought one for my nephew and he's going nuts because it won't fit in his case, so looks like we are going case shopping for him this weekend.


----------



## gobblebox

Quote:


> Originally Posted by *ForNever*
> 
> What mid tower cases can fit an R9 290? I bought one for my nephew and he's going nuts because it won't fit in his case, so looks like we are going case shopping for him this weekend.


Go with the Arc Midi R2... if you remove the middle HDD cage, you can easily fit even the largest 290s with aftermarket coolers & it has excellent airflow. Just read the reviews, I'm loving mine


----------



## ForNever

Quote:


> Originally Posted by *gobblebox*
> 
> Go with the arc midi r2... if you remove the middle HDD cage, you can easily fit even the largest 290s with after market coolers & it has excellent air flow. Just read the reviews, i'm loving mine


Thanks a million! Hoping microcenter carries them.


----------



## Klocek001

I'd say get the Corsair 500R for the same price

Guys, you were right about my fans - I needed to turn them around so they're intake, not exhaust. Just cut a hole in my side panel to fit a 140mm Blue Vortex last night; went down from 75 to 70 degrees in Far Cry 3, and from 43-45 to 38 at idle. Planning to add another one.


----------



## Raephen

The Fractal Design Arc Midi r2 is an excellent suggestion.

I've owned the first version and loved it. I've built a system for a friend in the revision 2 (the R2) and loved that version even more.

It has a nice, slightly smoked window, an elegant, minimalist design, an excellent layout, and good build quality for the price.

Depending on the budget there is for the case, however, might I suggest a Phanteks Enthoo Pro?

I have no personal experience with any case from the Phanteks Enthoo line, but from what I've seen and read, they're almost a new benchmark when it comes to cost/quality ratio.

I myself am seriously considering an Enthoo Luxe, or maybe the Mini XL when it comes out.


----------



## The Storm

Do you guys think that I should be using the extra 6 pin power connector by the PCIE slots on my RIVE now that I am using 3 290's?


----------



## bluedevil

Quote:


> Originally Posted by *The Storm*
> 
> Do you guys think that I should be using the extra 4 pin power connector by the PCIE slots on my RIVE now that I am using 3 290's?


It wouldn't hurt anything.


----------



## gobblebox

Quote:


> Originally Posted by *ForNever*
> 
> Thanks a million! Hoping microcenter carries them.


That's exactly where I got mine. If he likes a sleek, professional look, this one is perfect

[Edit] Grammar


----------



## bluedevil

VisionTek got it. On test bench today, replacement should be sent out within a few days.


----------



## Gobigorgohome

How high of an overclock should I be looking at with my 3930K with 4x R9 290X on water?

I think I can do 4.5 GHz or probably more on the CPU; the cards I will take as far as they can go (without making too much trouble at 4K). Main purpose is gaming and benchmarking.

I have also been on the lookout for a 3960X/3970X/4960X to pair my four cards with. What do you guys think?


----------



## anubis1127

Quote:


> Originally Posted by *Gobigorgohome*
> 
> How high of an overclock should I be looking at with my 3930K with 4x R9 290X on water?
> 
> I think I can do 4,5 Ghz or probably more on the CPU, the cards I will take as far as they can go (without making too much trouble at 4K). Main propose is gaming and benchmarking.
> 
> I have also been on the looking for a 3960X/3970X/4960X to pair my four cards with, what are you guys thinking?


The faster the better on the OC; if you can do 4.8GHz+, then do that.


----------



## rdr09

Quote:


> Originally Posted by *Gobigorgohome*
> 
> How high of an overclock should I be looking at with my 3930K with 4x R9 290X on water?
> 
> I think I can do 4,5 Ghz or probably more on the CPU, the cards I will take as far as they can go (without making too much trouble at 4K). Main propose is gaming and benchmarking.
> 
> I have also been on the looking for a 3960X/3970X/4960X to pair my four cards with, what are you guys thinking?


depends on the games you play. i've seen a member with that cpu at 4.5 struggle even with just 2 290Xs.


----------



## BradleyW

Quote:


> Originally Posted by *rdr09*
> 
> depends on the games you play. i've seen a member with that cpu at 4.5 struggle even with just 2 290Xs.


Here I am!
And the reason for the struggle is the lack of a low-level API!


----------



## HoneyBadger84

Quote:


> Originally Posted by *The Storm*
> 
> Do you guys think that I should be using the extra 4 pin power connector by the PCIE slots on my RIVE now that I am using 3 290's?


Yes. I was advised to do so by someone that knows what he's talkin' about in regards to PSU/video card draw stuff. You have the 6-pin PCI-E EZ Plug already plugged in right? The lil' 4-pin FDD plug doesn't give much extra power, but it's enough to help.


----------



## The Storm

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Yes. I was advised to do so by someone that knows what he's talkin' about in regards to PSU/video card draw stuff. You have the 6-pin PCI-E EZ Plug already plugged in right? The lil' 4-pin FDD plug doesn't give much extra power, but it's enough to help.


Sorry I meant the 6 pin plug by the pcie slots, and no I don't have it connected. The manual stated to plug it in when you are using 4 PCIe 3.0 x16 cards so I didn't since I am just using 3. I just wondered if it would be best if I do so.


----------



## rdr09

Quote:


> Originally Posted by *BradleyW*
> 
> Here I am!
> And the reason for the struggle is due to the lack of low level API!


Lack of a low-level API meaning games that are not optimized to balance the use of system components? So it is mainly a software issue rather than a hardware issue?

thanks.


----------



## HoneyBadger84

Quote:


> Originally Posted by *The Storm*
> 
> Sorry I meant the 6 pin plug by the pcie slots, and no I don't have it connected. The manual stated to plug it in when you are using 4 PCIe 3.0 x16 cards so I didn't since I am just using 3. I just wondered if it would be best if I do so.


TBH, when it comes to 290s/290Xs the manual is full of poop; it was written in the days of the GTX 580, after all. I would have the PCI-E 6-pin EZ Plug plugged in for sure. Better safe than a dead 24-pin


----------



## BradleyW

Quote:


> Originally Posted by *rdr09*
> 
> lack of low level API meaning games that are not optimized into balancing the use of system components. so, it is mainly a software issue rather than a hardware issue?
> 
> thanks.


Correct. So we compensate by CPU overclocking. If programs utilized CPUs efficiently, we would not need to overclock the CPU. Not to be confused with CPU multi-core usage: a CPU can be fully loaded across all cores, but that does not mean the CPU is being used efficiently. A low-level API addresses everything wrong with gaming.


----------



## gobblebox

Quote:


> Originally Posted by *BradleyW*
> 
> Correct. So we compensate by CPU overclocking. If programs utilized CPU's efficiently, we would not need to overclock the CPU. Not to be confused with CPU multi core usage. A CPU can be fully used across all cores, but that does not mean the CPU is used efficiently. Low level API addresses everything wrong with gaming.


So is non-HT still the recommended way to go for 99% of today's games, or should we have HT enabled on our 4790Ks? I currently have mine OCed to 4.9GHz with HT enabled. I wouldn't mind dropping HT for the additional temp decrease, but not if it's going to hurt my gaming performance.


----------



## BradleyW

Quote:


> Originally Posted by *gobblebox*
> 
> So is non-HT still the recommended way to go with 99% of today's games? Or should we have HT enabled on our 4790K's - I currently have mine OCed to 4.9Ghz with HT enabled, but I wouldn't mind dropping HT for the additional temp decrease, but I wouldn't do it if it is going to decrease my gaming performance.


Check the link in my sig. HT tested.


----------



## Gobigorgohome

Quote:


> Originally Posted by *anubis1127*
> 
> The faster the better on the OC, if you can do 4.8Ghz+ then do that.


Yes, I know that, but my CPU is kind of struggling at 4.5 GHz with 1.37 volts even with the right settings in the BIOS on my RIVBE. I am doing full custom water cooling, and if that is not good enough I will buy a water chiller, but I thought it might be easier just to get another CPU.

To put it another way, I highly doubt I will be able to do 4.8 GHz under 1.5 volts with the chip I've got, but sure, I can try.
Quote:


> Originally Posted by *rdr09*
> 
> depends on the games you play. i've seen a member with that cpu at 4.5 struggle even with just 2 290Xs.


I don't really have a big problem with 4 cards in Hitman Absolution, Tomb Raider or BF4 at 4K with high settings and the CPU and cards at stock. BF4 quit on me due to the PSU being too weak at ultra settings at 4K, but now I have 2x EVGA G2 1300W so that should not be a problem anymore. Sitting here waiting for some water blocks, since EKWB didn't do the work right with the Monoblock for the RIVBE.

But you mean it bottlenecked for him? Would I be better off with a 3970X/4960X?


----------



## HoneyBadger84

The CPU is always going to be the problem regardless of speed, because of how much raw power these video cards have and how well Crossfire can scale if you give it the CPU throughput to do so. I'd be willing to bet CPUs 2 years from now will still have issues handling 3-4 of these things. It shows in games coded well for 3-4 card setups, which actually scale well (BF4, Hitman Absolution etc.) at higher resolutions, compared to games that don't scale at all because of poor coding & CPU dependency.


----------



## Dasboogieman

Quote:


> Originally Posted by *gobblebox*
> 
> So right now I'm OCing my Sapphire Tri-X OC r9 290, but the heat on VRM 1 is limiting me. After my last application of ICD7 on the heatsink, I noticed that some of the smaller surrounding thermal pads had some creases, tears, and/or "wrinkles" in them, not to mention lint on a few, as if the last owner had already opened it up once before, but was careless in doing so. I didn't take note of the long strip for VRM 1, but I'm guessing that it is probably in the same shape as the smaller thermal pads.
> 
> *Would replacing all of these pads lower my VRM 1 temps?* (I'm very familiar with CPU OCing, but I would consider myself a novice GPU OCer) VRM 2 is sitting at a nice 55 C, and the GPU itself idles at 40 with higher than normal ambients while I'm pushing 130 on VDDC offset just to hit 1150 Core Clock. *Does anyone have any suggestions for replacement pads for this specific card?* Any help would be greatly appreciated!
> 
> One more thing, does anyone know if this card can be flashed to 290x? I've looked around & have seen mixed information...


Most can't be unlocked. I was lucky in the sense that mine had the 42 revision BIOS which meant it was compatible with the PT1T BIOS but I was unlucky in the sense it did not become a 290X. Most late model Tri X cards are 43 revision thus making it impossible to use the PT1T BIOS.

The Tri-X was known to have fairly weak VRM cooling. That's not entirely because the cooler is bad; it's more that a 5+1+1 VRM design is being tasked with drawing 250-300W under overvoltage conditions, and there is no stopping that getting hot. Most of the other cards with 6+1+1 designs or better shave at least 10-15 degrees under similar conditions due to more even load balancing.

An upgrade of the thermal pads would help drop the temperatures by about 5 degrees, assuming you use Fujipoly Ultra Extreme for VRM1. There's not much point in upgrading the pads on the VRAM or VRM2, since it's expensive and yields zero extra headroom; just use the cheaper Phobya thermal pads there.

The measurements you need are:
1mm for VRM 1 and VRM2 thickness
1mm for VRAM thickness.

You can also attach an EK backplate with the help of some Helipal M2.5 Flat-Head 10mm screws and 1.5mm thermal pads over the VRM1 area backside to drop the temperatures by another 5 degrees. Like I've done here.


I also stuck a whole bunch of sticky RAM sinks on the VRM1 area, dropped temperatures by a further 5 degrees.



Beyond this, you have to consider watercooling as that is the only solution that can deal with the huge thermal density that the VRM1 area produces.

Don't worry my card also needs +125mV to hit 1150mhz, these mods allowed me to hit this OC with only 75 -80 degrees on the VRM1 area at 55% fan speed.


----------



## gobblebox

Quote:


> Originally Posted by *BradleyW*
> 
> Check the link in my sig. HT tested.


So what is your thought on the user who suggested turning HT off with Win7, but on with Win8? I don't see how the OS would make a difference since the efficiency is dependent on the game's coding - resource usage efficiency.


----------



## gobblebox

Quote:


> Originally Posted by *Dasboogieman*
> 
> Most can't be unlocked
> 
> The Tri X was known to have fairly weak VRM cooling. Though not fully because the cooler is bad, more because we are dealing with a 5+1+1 VRM design that is being tasked with drawing 250-300W under overvoltage conditions. Most of the other cards that have 6+1+1 designs or better shave at least 10-15 degrees under similar conditions.
> 
> An upgrade of the thermal pads would help drop the temperatures by about 5 degrees assuming you use FujiPloy for the VRMs. Not much point in upgrading the pads on the VRAM or VRM2 since it's expensive and yields zero extra headroom, just use the cheaper Phobya Thermal Pads.
> 
> The measurements you need are:
> 1mm for VRM 1 and VRM2 thickness
> 1mm for VRAM thickness.
> 
> You can also attach an EK backplate with the help of some Helipal M2.5 Flat-Head 10mm screws and 1.5mm thermal pads over the VRM1 area backside to drop the temperatures by another 5 degrees. Like I've done here.
> 
> 
> I also stuck a whole bunch of sticky RAM sinks on the VRM1 area, dropped temperatures by a further 5 degrees.
> 
> 
> 
> Beyond this, you have to consider watercooling as that is the only solution that can deal with the huge thermal density that the VRM1 area produces.
> 
> Don't worry my card also needs +125mV to hit 1150mhz, these mods allowed me to hit this OC with only 75 -80 degrees on the VRM1 area at 60% fan speed.


This is some excellent information, thanks! My VRM 1 is around 82-85 in-game with the fan at 75% on the aforementioned OC, so should I even worry?


----------



## HoneyBadger84

Quote:


> Originally Posted by *gobblebox*
> 
> This is some excellent information, thanks! My VRM 1 is around 82-85 with the game at 75% on the aforementioned OC, so should I even worry?


If I saw those temps I'd be screamin', but I run my fans at 100% with 0 care for noise, so it'd be a lot scarier for me than you.

As a for instance, right now I have 3 cards running Folding@Home units, which is slightly less stressful in terms of heat than gaming, but I'm sitting at 63C core, 39C VRM 1, 51C VRM 2, full load temps, and that's on a card that's sandwiched between 2 others.

VRM 1, if I remember right, is rated to run up into the 100C range; that doesn't mean you should call 80-something acceptable, though.


----------



## Dasboogieman

Quote:


> Originally Posted by *gobblebox*
> 
> So what is your thought on the user who suggested turning HT off with Win7, but on with Win8? I don't see how the OS would make a difference since the efficiency is dependent on the game's coding - resource usage efficiency.



Windows 8 is supposedly better at scheduling threads. I'm not certain of the exact mechanics, but it can optimize according to whether a thread should be hyperthreaded alongside another or go to an untapped core. Thus, theoretically, Windows 8 helps in a CPU-limited scenario.
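As a toy illustration of that scheduling preference (a simplification for intuition only, not the actual Windows kernel algorithm): fill idle physical cores first, and only double threads up on a core via Hyper-Threading once every core has work.

```python
# Toy model: threads land on untapped physical cores before sharing a
# core as an HT sibling, since two threads on one core contend for its
# execution resources.

def assign_threads(num_threads, num_physical_cores):
    """Return (core, smt_slot) pairs, filling idle physical cores first."""
    assignments = []
    for t in range(num_threads):
        if t < num_physical_cores:
            assignments.append((t, 0))                       # untapped core
        else:
            assignments.append((t - num_physical_cores, 1))  # HT sibling
    return assignments
```

On a 4-core/8-thread chip, 6 game threads would occupy all four physical cores before any core hosts a second thread — the behavior a smarter scheduler buys you in a CPU-limited game.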
Quote:


> Originally Posted by *gobblebox*
> 
> This is some excellent information, thanks! My VRM 1 is around 82-85 with the game at 75% on the aforementioned OC, so should I even worry?


The improvements may allow you to run a lower fan speed (like 55%-60% on mine) to reduce noise or to keep the same fan speed and perhaps reach the next voltage bin at the same temperatures for a higher OC.

Quote:


> Originally Posted by *HoneyBadger84*
> 
> If I saw those temps I'd be screamin', but I run my fans at 100% with 0 care for noise, so it'd be a lot scarier for me than you.
> 
> As a for instance right now I have 3 cards running [email protected] units, which is slightly less stressful in terms of heat than gaming, but I'm sitting at 63C core, 39C VRM 1, 51C VRM 2, full load temps, and that's on a card that's sandwiched between 2 others.
> 
> VRM#1 if I remember right are rated to run up in to the 100C range, that doesn't mean you should call 80-something acceptable though.


82-85 degrees is perfectly in line with what the Tri X can achieve at 75% fan speed with a +125mV overclock. The 5+1+1 design means that temperatures ramp up extremely fast with overvoltage.

The reference cooler is probably the best at keeping the reference VRM1 area cool (other than waterblocks), since the blower sits right on top of the assembly with a corrugated metal surface acting as the heatsink. Shame the core temperatures are so bad; otherwise I'd wager it would allow higher OCs than the aftermarket coolers do.


----------



## vieuxchnock

I have a problem running Unigine Heaven and Valley in Crossfire: only one GPU is working. The second sometimes gives me 5%, but not more. All the other benchmarks (3DMark...) run in Crossfire except Unigine.
Any ideas?


----------



## tsm106

Quote:


> Originally Posted by *vieuxchnock*
> 
> I have a problem to run Unigine Heaven and Valley in Crossfire. There is only one GPU running. The second give me sometime 5% but not more. All the other bench ( 3DMark...) runs in Xfire except Unigine.
> Any idea?


You need to create a cfx profile using 1x1 optimize.


----------



## vieuxchnock

Sorry for my ignorance but how can I do that?


----------



## ZealotKi11er

Quote:


> Originally Posted by *vieuxchnock*
> 
> Sorry for my ignorance but how can I do that?


I think you go into CCC and there is a place there where you can create game profiles. In that menu I think you can select different CFX modes.


----------



## Sgt Bilko

Quote:


> Originally Posted by *tsm106*
> 
> You need to create a cfx profile using 1x1 optimize.


I should do that actually
Quote:


> Originally Posted by *vieuxchnock*
> 
> Sorry for my ignorance but how can I do that?


It's under 3D Application Settings in the Gaming tab: click Add on the left (find the valley.exe file), then set Tessellation to override application settings and turn it off; down the bottom is the Crossfire profile setting, change it to 1x1 optimization

I think that's right, correct me if I'm wrong


----------



## Cysquatch

I just purchased a used R9 290 reference card. Am I going to be able to run this thing at stock clocks without worrying about burning the card up? I will eventually be getting Corsair's HG10 bracket with a cooler, whenever they decide to release it.


----------



## Roboyto

Quote:


> Originally Posted by *gobblebox*
> 
> This is some excellent information, thanks! My VRM 1 is around 82-85 with the game at 75% on the aforementioned OC, so should I even worry?


@Dasboogieman gave you solid advice. Fujipoly Ultra Extreme is the best bet for a thermal pad. The smallest quantity you can order is 15mm x 100mm x 1mm, and that is easily enough to upgrade the VRM pads on 2, if not 3, cards.

Your temperatures at 82-85 are still fairly safe. From what I have seen in this thread, 90C is the point where you want to be concerned about reducing temperatures.

If you don't want the backplate, you can attach sinks directly to the back of the card with some thermal tape. Just be certain that you use enough tape so the sink doesn't touch the PCB!

@piquadrat sent me this photo after he located the hottest spots on the back of the card.


----------



## fateswarm

The heat comes from the MOSFETs - the little chips next to the chokes. A subtle point: since there are often two per phase (rather than integrated into one chip), you may get more heat from one of the pair, usually the low-side one (though it may be uncertain which one that is).

So you may be able to pinpoint the exact location on any card.

If the pair is integrated into one chip, it's easier.


----------



## Roboyto

Quote:


> Originally Posted by *Cysquatch*
> 
> I just purchased a used r9 290 reference card, am i going to be able to run this thing at stock clocks without worry of burning the card up? I will eventually be getting corsairs hg10 block with a cooler whenver they decide to release that.


Absolutely, you can expect a bit of noise from the blower though. If the core reaches 95C the card will throttle clocks/voltages until the temperature comes back down.

Kraken G10 is a pretty solid alternative if you use it in combination with the Gelid 290 VRM heatsink kit.

See here for a list of items I am using with the Kraken G10:

http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures/80_20#post_22599798

You don't need to purchase everything I list, since I went a little overboard... but I wanted the best performance I could get without full-blown water cooling in the tiny 250D case. 49C for VRM1 after a 1-hour benchmark loop is pretty solid for a small heatsink and 92mm fan


----------



## kayan

Quote:


> Originally Posted by *anubis1127*
> 
> I think so too, temps look fine, needs more OC. Moar volts.


I overclocked my CPU first, up to 5GHz now. Next I'll try the GPUs... what program should I use to overclock 2 cards and keep their clocks the same, though?
Quote:


> Originally Posted by *ZealotKi11er*
> 
> Fps looks a bit low. I get ~ 100fps with 125% scaling and 1440p.


You're right, FPS were low; it was my CPU.

Got it up to 5GHz, from 4.8, and frames increased to between 85-95ish. And no, I didn't check variance.

I got out of one job early today, so I was able to test before I go to job #2:

At 4.9GHz, after about 30 minutes of BF4 64-player MP, temps settled around 52C according to Core Temp.

At 5GHz, here's my CPU-Z validation link:

http://valid.x86.fr/bij3qb

Up next, the BF4 test....

It was running great and then the server crashed. Of course the server crashed - it only crashes when I do well. Heh. Temps said they were at 44C, but I only played for about 20 mins, so that's probably a tad on the low side.

I ran Prime and Heaven with the CPU OCed, and temps definitely went up a little bit. Core on the GPU went up by about 2-3C; VRMs went up between 1-5C.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Cysquatch*
> 
> I just purchased a used r9 290 reference card, am i going to be able to run this thing at stock clocks without worry of burning the card up? I will eventually be getting corsairs hg10 block with a cooler whenver they decide to release that.


Should be fine as long as you run the fan up.


----------



## the9quad

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Should be fine as long as you run the fan up.


It's perfectly fine even if he doesn't. They are designed to throttle at 95C and to handle that temp. He is in no danger of damaging his card at all.


----------



## Cysquatch

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Should be fine as long as you run the fan up.


Quote:


> Originally Posted by *the9quad*
> 
> It's perfectly fine even if he doesn't. They are designed to throttle at 95C and to handle that temp. He is in no danger of damaging his card at all.


Thanks a bunch, that puts my mind at ease. I guess I panicked a bit when I started reading about all the temperature problems etc. I do understand the benefits of aftermarket cooling; I just want to explore my options a bit more, but would like to use the card in the meantime. I'm going from a 7770 to the 290, so I'm very excited.


----------



## vieuxchnock

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I should do that actually
> It's under 3D Application Settings in the Gaming tab: click Add on the left (find the valley.exe file), then set Tessellation to override application settings and turn it off; down the bottom is the Crossfire profile setting, change it to 1x1 optimization
> 
> I think that's right, correct me if i'm wrong


I don't know if I did it right, but everything stays the same. The second GPU's usage is only around 3-5% in both benchmarks.
I tried what you told me to do and it doesn't work for me.


----------



## the9quad

Quote:


> Originally Posted by *vieuxchnock*
> 
> I don't know if I did OK but everything stays the same. The secong GPU use is only around 3-5 % in both benchmarks.
> I tryed what you tell me to do and it doesn't work for me.


Are you in full screen?


----------



## vieuxchnock

1920x1080, but the full screen box is not checked


----------



## ZealotKi11er

Quote:


> Originally Posted by *vieuxchnock*
> 
> 1920X1080 but the full screen box in not checked


CFX does not work unless it's full screen.


----------



## vieuxchnock

Got it. Thanks guys:thumb:


----------



## tsm106

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> You need to create a cfx profile using 1x1 optimize.
> 
> 
> 
> I should do that actually
> Quote:
> 
> 
> 
> Originally Posted by *vieuxchnock*
> 
> Sorry for my ignorance but how can I do that?
> 
> Click to expand...
> 
> It's under 3D Application settings in the Gaming tab, click add (find the valley.exe file) on the left (find the valley.exe file) then set Tessellation to override settings and turn it off then down the bottom is the Crossfire profile setting, change it to 1x1 optimization
> 
> I think that's right, correct me if i'm wrong
Click to expand...

It's about right, but you don't need to disable tess for Valley because there's no tess in Valley.

If you're submitting this to the valley thread, read the OP on AMD tweaks. Before the thread was ripped, Karl wrote instructions on the profile. It's a must read.


----------



## Sgt Bilko

Quote:


> Originally Posted by *tsm106*
> 
> It's about right, but you dont need to disable tess for valley because there's no tess in valley.
> 
> If you're submitting this to the valley thread, read the OP on AMD tweaks. Before the thread was ripped, Karl wrote instructions on the profile. It's a must read.


I just run Tess off in general, and yeah, those tweaks improved my score a bit


----------



## HoneyBadger84

Quote:


> Originally Posted by *the9quad*
> 
> It's perfectly fine even if he doesn't. They are designed to throttle at 95C and to handle that temp. He is in no danger of damaging his card at all.


I think you're one of the only people who seriously believes it's "acceptable" for them to actually run at that temperature for normal use. That's insane. If you run the fan at ~70% they never get anywhere near that hot, and it's not THAT noisy. I run mine at 100% max starting at 60C, and they never go over 65C in most games/testing, even though I'm running QuadFire, which means I have a giant sandwich going on. They're running at 37-63-65-60 (top card is idle) right now doing some Folding@Home units.
Quote:


> Originally Posted by *Cysquatch*
> 
> thanks a bunch, that leaves my mind at ease. I guess I panicked a bit when I started reading all the temperature problems etc. Also I do understand the benefits of aftermarket cooling, I just want to explore my options a bit more, but would like to use the card in the meantime. I'm going from a 7770 to the 290, so I'm very excited.


Getting a reference card and planning on liquid when/if you go Crossfire is the smart choice. Reference cards aren't as bad as everyone makes them out to be if you're willing to run the fan a little bit higher than the card's profile dictates (it maxes out at like 55% at 95C, kinda crazy IMO).

What I tell people to do is, open up Afterburner or your program you want to use for fan control. Turn the fan up slowly 5% at a time until it gets to a level you call "too loud". Then set a level a little under that as your maximum fan speed, and have it kick in at ~60C, as you're not going to hit 60C unless you're gaming. That should keep the card a lot cooler than the stock fan profile does.
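In rough Python terms, the fan curve I'm describing looks something like this (purely illustrative; Afterburner does this with a graph in its UI, and the exact numbers are whatever your noise tolerance allows):

```python
# Sketch of the custom fan curve described above: idle speed below ~60C,
# then a linear ramp up to a max set just under "too loud".
# All thresholds here are illustrative, not Afterburner defaults.
def fan_speed(temp_c, idle_pct=30, max_pct=70, ramp_start=60, ramp_end=85):
    """Return fan duty cycle (%) for a given core temperature (C)."""
    if temp_c <= ramp_start:
        return idle_pct
    if temp_c >= ramp_end:
        return max_pct
    # Linear interpolation between ramp_start and ramp_end
    frac = (temp_c - ramp_start) / (ramp_end - ramp_start)
    return round(idle_pct + frac * (max_pct - idle_pct))
```

So at 70C this curve would already be at 46%, well ahead of the stock profile.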


----------



## HoneyBadger84

eBay has gone completely bonkers. I guess the people that have been racking up all these 290Xs are done buying for now? lol I saw one go for ~$250 earlier... that's nuts. Still glad I got mine as cheap as I did & in brand-new condition, vs most of the ones going for that much being used & dusty as all get out...


----------



## Sgt Bilko

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I think you're one of the only people who seriously believes it's "acceptable" for them to actually run at that temperature for normal use.


That is an acceptable temp for them on air cooling, let's be honest... they wouldn't give them a 2-3 year warranty if the cards couldn't last that long on the default (55%) fan profile, would they?


----------



## Cysquatch

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I think you're one of the only people who seriously believes it's "acceptable" for them to actually run at that temperature for normal use. That's insane. If you run the fan at ~70% they never get anywhere near that hot, and it's not THAT noisy. I run mine at 100% max starting at 60C, and they never go over 65C in most games/testing, even though I'm running QuadFire, which means I have a giant sandwich going on. They're running at 37-63-65-60 (top card is idle) right now doing some Folding@Home units.
> Getting a reference card and planning on liquid when/if you go Crossfire is the smart choice. Reference cards aren't as bad as everyone makes them out to be if you're willing to run the fan a little bit higher than the card's profile dictates (it maxes out at like 55% at 95C, kinda crazy IMO).
> 
> What I tell people to do is, open up Afterburner or your program you want to use for fan control. Turn the fan up slowly 5% at a time until it gets to a level you call "too loud". Then set a level a little under that as your maximum fan speed, and have it kick in at ~60C, as you're not going to hit 60C unless you're gaming. That should keep the card a lot cooler than the stock fan profile does.


Noise is no issue for me, so I will be doing something similar to what you're doing. I'd rather have a little bit of noise than a ton of heat.


----------



## Cysquatch

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That is an acceptable temp for them on air cooling, let's be honest... they wouldn't give them a 2-3 year warranty if the cards couldn't last that long on the default (55%) fan profile, would they?


I agree, but I think we can all agree 95C is a pretty insane temp to call stable.


----------



## fateswarm

The chips don't die at 95C, but there are hints from Intel specs that something like 75C might be better for longevity. The VRM MOSFETs are usually safe up to 130C, by the way, but that's probably unsafe for the surrounding components.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Cysquatch*
> 
> I agree, but I think we can all agree 95C is a pretty insane temp to call stable.


"Stable" "safe for regular use" "acceptable" ~ I wouldn't call it any of those things. AMD might. But they're a bit nuts.

Like I said, I can keep mine under 65C slightly OCed, or under 62C at stock, for regular gaming use, and that's sandwiched in QuadFire, by running them with higher fan speed and good sideflow. The card that has free air (bottom one) is only at 61C atm and it's running at 1050MHz. Fixing to turn them down to stock though, heating up the room a bit more than I'd like while folding. lol


----------



## Zipperly

Quote:


> Originally Posted by *Cysquatch*
> 
> I agree, but I think we can all agree 95C is a pretty insane temp to call stable.


I don't like that temp either, but if it's made to withstand those temps then it is stable. When I had my 290X I had an Accelero 3 cooler, and even with an overvolt of 1.328V and a 1200MHz core clock my load temps never exceeded 65C and VRM temps were 79C. The GPU just needs a beefy cooler to tame it.


----------



## HoneyBadger84

Even with the stock reference cooler, if you're brave enough to repaste it (which I found out is easy, relatively speaking, though a bit complex in terms of disassembly/reassembly; I did it earlier in this thread), you can drop the temps a few degrees if you use higher quality paste than it comes with and do a proper job, not using too much. That would also help. The card I did it on saw a 2-5C drop in temps, and that's with the temps already being down in the 60-70C range before the repaste, with the fan at 100%. So one would guess it'd probably save you upwards of 5C if you were up in the 80C range before repasting.


----------



## kayan

I've got a question about crossfire, if it only works in full screen, does that mean it does or doesn't work in borderless full screen?


----------



## HoneyBadger84

Quote:


> Originally Posted by *kayan*
> 
> I've got a question about crossfire, if it only works in full screen, does that mean it does or doesn't work in borderless full screen?


Uuuuuh I'd check that for you personally but um... my other 3 are busy folding so I can't  Pretty sure the answer to that is "no it does not work" though.


----------



## Gualichu04

Come to find out, the R9 290X with the Aquacomputer backplate is too thick to fit in the top slot of an Asus RIV BE; the RAM slots are in the way. Now I am stuck using the middle two slots, and the sound card has to go in the bottom slot.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Gualichu04*
> 
> Come to find out, the R9 290X with the Aquacomputer backplate is too thick to fit in the top slot of an Asus RIV BE; the RAM slots are in the way. Now I am stuck using the middle two slots, and the sound card has to go in the bottom slot.


That's nuts. Even the stupid backplate the aftermarket VisionTek R9 290X comes with didn't do that (it was close to messing with them, was a tight fit).


----------



## ChaosAD

I use an old gtx275 atm. I wanted to buy a used 780ti tbh, but while searching the forums i found a SAPPHIRE RADEON R9 290 that i can buy, still with 2 years warranty, at 250 euros. Do you think the price is good for a reference 290 at this point of time?


----------



## Gualichu04

Quote:


> Originally Posted by *HoneyBadger84*
> 
> That's nuts. Even the stupid backplate the aftermarket VisionTek R9 290X comes with didn't do that (it was close to messing with them, was a tight fit).


Yup, it sucks; so now the GPU is at PCIe 3.0 x8, and it seems once I get the 2nd one back from RMA it will be x8 too.


----------



## HoneyBadger84

Quote:


> Originally Posted by *ChaosAD*
> 
> I use an old gtx275 atm. I wanted to buy a used 780ti tbh, but while searching the forums i found a SAPPHIRE RADEON R9 290 that i can buy, still with 2 years warranty, at 250 euros. Do you think the price is good for a reference 290 at this point of time?


Not too bad. Is it in really good condition? If so, I'd say it's worth it.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kayan*
> 
> I've got a question about crossfire, if it only works in full screen, does that mean it does or doesn't work in borderless full screen?


No. Crossfire only works in windowed or borderless windowed when using Mantle; if you are using DX and want Crossfire, then you need fullscreen.
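As a quick mental model (just restating the rule above as toy code, nothing to do with how the driver is actually implemented):

```python
# Toy restatement of the Crossfire rule described above -- purely
# illustrative, not real driver logic or any actual API.
def crossfire_active(api: str, display_mode: str) -> bool:
    """api: 'mantle' or 'dx'; display_mode: 'fullscreen', 'borderless', or 'windowed'."""
    if api == "mantle":
        return True                      # Mantle runs CFX in any display mode
    return display_mode == "fullscreen"  # DX needs exclusive fullscreen
```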


----------



## Sgt Bilko

Quote:


> Originally Posted by *Cysquatch*
> 
> I agree, but i think we can all agree 95 deg is a pretty insane temp to call stable


I'm not saying it's stable, I'm saying anything under 90C on the core is acceptable for air cooling.
Quote:


> Originally Posted by *HoneyBadger84*
> 
> "Stable" "safe for regular use" "acceptable" ~ I wouldn't call it any of those things. AMD might. But they're a bit nuts.
> 
> like I said, I can keep mine under 65C slightly OCed, or under 62C at stock, for regular gaming use, and that's sandwiched in QuadFire, by running them with higher fan speed and good sideflow. The card that has free air (bottom one) is only at 61C atm and it's running at 1050MHz. Fixing to turn them down to stock though, heating up the room a bit more than I'd like while folding. lol


And that's your definition, everyone has a different point of view.

Don't you have the side off your case to feed enough air to those cards?


----------



## Raephen

Quote:


> Originally Posted by *ChaosAD*
> 
> I use an old gtx275 atm. I wanted to buy a used 780ti tbh, but while searching the forums i found a SAPPHIRE RADEON R9 290 that i can buy, still with 2 years warranty, at 250 euros. Do you think the price is good for a reference 290 at this point of time?


Aye, it looks like a good buy at a mere €250. Will the seller have a copy of the original receipt for you? Do you know if he/she has already registered the card with Sapphire? If not, you can simply register it there and, in case of trouble, get technical support or maybe RMA directly from them.

Though, I'm not sure about the direct RMA bit... never had to do that before. The one card I ever had to RMA was via the shop that sold it to me...


----------



## HoneyBadger84

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm not saying it's stable, i'm saying anything under 90c on the core is acceptable for air cooling.
> And that's your definition, everyone has a different point of view.
> 
> Don't you have the side off your case to feed enough air to those cards?


Right now? Yes. For regular gaming, no, I'd have it on with the 3 180mm side Silverstone Air Penetrator fans I run feeding them air. The main reason I have the side of the case off with a blower fan running as the side flow is that I'm currently running Folding@Home units, which means near-constant 100% load for 5-10hrs at a time, and I want them running as cool as possible. At the moment they're at 62-63-59 respectively, and that's at 1020MHz, zero extra voltage, 100% fan...

I'm seriously wanting to take these gals apart & redo their thermal paste, but I think I'm gonna pace myself on that & do one card at a time when I have the rig down for something or other... tomorrow I gotta test the card that just came in and make sure it's good to go, then I gotta get the Sapphire I already redid the paste on, the card that just came in, and the 290 from VisionTek up for sale, since I'm keeping the 3 HIS & 1 Asus cards I have in the machine now because they're all in brand-new condition & clock well enough (they ran all the benchmarks I threw at them at 1150MHz +131mV core & never had any issues).

I'm also debating hacking the brackets up so they have less teeth & more room to exhaust air out, as I've heard that can help drop temps a bit and lower the noise... dunno if I'll do that for sure. Whenever I'm gaming with headphones on, because of how good the earcup-to-head seal is on the Logitech G930 headset, I can't really hear them at all, even at 100%, and I'm pretty near my case's rear-end with how I have it situated now.

But for a single reference card by itself, I would think a single side fan for intake directly near the card's fan port, and 70% fan would keep it well under the 85C mark even under heavy load... depends on ambient of course.


----------



## Sgt Bilko

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Right now? Yes. For regular gaming, no, I'd have it on with the 3 180mm side Silverstone Air Penetrator fans I run feeding them air. The main reason I have the side of the case off with a blower fan running as the side flow is that I'm currently running Folding@Home units, which means near-constant 100% load for 5-10hrs at a time, and I want them running as cool as possible. At the moment they're at 62-63-59 respectively, and that's at 1020MHz, zero extra voltage, 100% fan...
> 
> I'm seriously wanting to take these gals apart & redo their thermal paste, but I think I'm gonna pace myself on that & do one card at a time when I have the rig down for something or other... tomorrow I gotta test the card that just came in, make sure it's good to go, then I gotta get the Sapphire I already redid the paste on, the card that just came in, and the 290 from VisionTek up for sale, since I'm keeping the 3 HIS & 1 Asus card I have in the machine now because of the fact that they're all in brand-new condition & clock well enough (they ran all the benchmarks I threw at them at 1150MHz +131mV core & never had any issues).
> 
> I'm also debating hacking the brackets up so they have less teeth & more room to exhaust air out, as I've heard that can help drop temps a bit and lower the noise... dunno if I'll do that for sure. Whenever I'm gaming with headphones on, because of how good the earcup-to-head seal is on the Logitech G930 headset, I can't really hear them at all, even at 100%, and I'm pretty near my case's rear-end with how I have it situated now.
> 
> But for a single reference card by itself, I would think a single side fan for intake directly near the card's fan port, and 70% fan would keep it well under the 85C mark even under heavy load... depends on ambient of course.


But you still have great airflow for them, plus running them at 100% fan speed means they are going to stay fairly cool.

Taking them apart is fairly simple if you just want to change the paste; the first time takes a while though.

I have the same headset and I used to have a ref 290X as well; 70% fan speed does keep the card under 85C in 35-37C ambients while not being too loud.

The main reason I didn't want more ref cards is the noise. Me personally, I don't care........the people on the receiving end of my mic do though


----------



## HoneyBadger84

Quote:


> Originally Posted by *Sgt Bilko*
> 
> But you still have great airflow for them plus running them at 100% fan speed means they are going to stay fairly cool.
> 
> Taking them apart is fairly simple if you just want to change the paste, first time takes a while though.
> 
> I have the same headset and i used to have a ref 290x as well, 70% fan speed does keep the card under 85c in 35-37c ambients as well as not being too loud.
> 
> main reason i didn't want more Ref cards is because of the noise, me personally i don't care........the people on the receiving end of my mic do though


lol yeah it's quite odd, I dunno if it's the noise-cancelling tech on my headset's mic or what, but the people I was playing League with earlier couldn't hear the 290Xs while folding at all, and that was 3 of them howling at 100% fan speed roughly 4-5ft from where I sit. Got all 4 of them running a unit each now to get me over the 1M point mark by tomorrow morning. Then when I get home I'll shut it all down after uninstalling drivers and yank all 4 to test the card that just came in & make sure it works perfectly before I resell it, assuming it's not some dynamite overclocker, which I doubt. It's also dusty, vs the 4 in my system now having zero dust build-up on the fans, and I hate the process of cleaning the fan blades completely, since canned air won't get it all off most of the time; I have to use one of those things normally used for between-teeth cleaning (basically floss attached to a soft stick, a tiny brush that can get in between the fan fins & clean them properly).


----------



## the9quad

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I think you're one of the only people who seriously believes it's "acceptable" for them to actually run at that temperature for normal use. That's insane. If you run the fan at ~70% they never get anywhere near that hot, and it's not THAT noisy. I run mine at 100% max starting at 60C, and they never go over 65C in most games/testing, even though I'm running QuadFire, which means I have a giant sandwich going on. They're running at 37-63-65-60 (top card is idle) right now doing some Folding@Home units.
> Getting a reference card and planning on liquid when/if you go Crossfire is the smart choice. Reference cards aren't as bad as everyone makes them out to be if you're willing to run the fan a little bit higher than the card's profile dictates (it maxes out at like 55% at 95C, kinda crazy IMO).
> 
> What I tell people to do is, open up Afterburner or your program you want to use for fan control. Turn the fan up slowly 5% at a time until it gets to a level you call "too loud". Then set a level a little under that as your maximum fan speed, and have it kick in at ~60C, as you're not going to hit 60C unless you're gaming. That should keep the card a lot cooler than the stock fan profile does.


What he asked was whether running two of them in crossfire would break his cards due to heat; what you told him was yes, it will unless he changes his fan profile. Sorry, that is wrong, and that is what I am pointing out. It will not break his cards; they are designed to run that way even if you aren't comfortable with it.

I've had 3 of them sandwiched together _*since release day*_, and have used a custom curve since then as well. So it's not like I am saying it isn't a good idea. I am saying that if you choose not to use a custom curve (which a lot of people do, since they can't stand that much noise) then he will not break his cards.


----------



## tsm106

100% fan speed... with multiple gpus lol. I'm not a masochist, I can't handle that.


----------



## the9quad

If anything is going to break, it will be the fans from running at 100% 24/7, because that is not what they were designed to do.


----------



## Dasboogieman

Quote:


> Originally Posted by *fateswarm*
> 
> The chips don't die at 95C, but there are hints from Intel specs that something like 75C might be better for longevity. The VRM MOSFETs are usually safe up to 130C, by the way, but that's probably unsafe for the surrounding components.


Insane is relative. If the ASIC was designed properly for the temperature, then it is perfectly fine to run at said temperature for increased heat-dissipation efficiency. IBM server CPUs run at extended temperatures as well in order to facilitate thermal transfer. It is perfectly fine if the design accounted for the extra leakage and the additional strain on the interconnects. The 28nm fab process is extremely mature at this point; AMD probably have a better idea of what the envelope is than any of us.

I do agree about the VRMs though. A good clue regarding longevity is how the reference cooler strives to keep the VRM1 area below 75-80 degrees despite the core temperatures in normal operation. Excess heat in that area can damage the relatively fragile capacitors which are probably only rated for 105 degrees (120 degree ones are hideously expensive).

The reason the NVIDIA GK110 cards throttle at 80-83 degrees is because they didn't design it to operate at high temperatures. All the components on the PCB are rated for lower temperatures, hell just look at the VRMs, they don't use the expensive metallic packaging the AMD ones use. Basically, the extra surface area of GK110 allows it to have a lower target temperature from a thermal dissipation point of view.


----------



## HoneyBadger84

So you're saying the card can run at 95C and be fine... but that the fan, which can run at 100% or it wouldn't be allowed to, would die first from regular usage at that speed, rather than the GPU dying from running at 94-95C regularly... okie dokie then.

The person I was helping specifically asked about a single R9 290 card. He didn't mention Crossfire; you even replied to it as a single-card discussion, so uh... yeah.

My main point was, running the fans at ~70% or so isn't too bad on noise and it keeps it a lot cooler, which is why I said:
Quote:


> Originally Posted by *HoneyBadger84*
> 
> Should be fine as long as you run the fan up.


Then you posted:
Quote:


> Originally Posted by *the9quad*
> 
> It's perfectly fine even if he doesn't. They are designed to throttle at 95C and to handle that temp. He is in no danger of damaging his card at all.


The problem with this is, not everyone's case is going to be able to handle something that hot being in there; even with reference cards blowing most, if not all, of the heat they create out the back, the top of the card is still going to get hot, and it'll be right near other components. Saying he's in no danger of damaging the card running it at that temperature, when you yourself clearly stated you set up a custom fan profile on day 1 and never let yours run that hot, means that you, and anyone else here for that matter, actually have no clue whether running the cards at that temp for regular gaming use will kill them sooner rather than later.

The guy's already said he plans to adjust the fan profile because, like me, A: he doesn't care about noise and B: he'd rather it run cool than quiet... so how about we move on instead of having a pointless discussion about something that, in reality, neither of us knows much about, namely what happens to the card if it's actually allowed to run at 94-95C for long gaming sessions continuously for months.








Quote:


> Originally Posted by *tsm106*
> 
> 100% fan speed... with multiple gpus lol. I'm not a masochist, I can't handle that.


Meh. I don't HAVE to run them at 100%, but I do when I'm gaming because I can't hear it anyway... if I'm going to be taking a nap, like I'm about to shortly, I might turn 'em down a tad. Honestly, if they're properly secured with the motherboard retention clip & screwed in, the extra noise they put out due to vibration is minimal. I did a test a bit ago & running at 80% fan is significantly quieter while only increasing temperatures by 3-4C... so it may be that 70-80% is the sweet spot most people should test their tolerance at.


----------



## failwheeldrive

Hey guys... I'm sorry for the noob question and I know you get VRM temp questions like this all the time, but I could use some help.

I'm running a ref 290x with an Aquacomputer block and EK backplate. Temps on the core are great (55C at load while overclocked to 1150 on the core and fans alternating between 0-800rpm) but not so great on VRM 1. VRM 2 consistently runs at 40-45C when overclocked, but VRM 1 regularly hits 80C in the same conditions. I know it's common for there to be a temp variance between them, but 80C seems extremely high for a watercooled 290x.

So I'm guessing it's a problem with uneven contact on the VRM section. It could either be due to improperly installed thermal pads, or possibly due to the screws for the EK backplate being too long. The backplate seems to snug down fine when I installed it, but I'm not totally sure. Which bank of VRMs is VRM1? To the left of the GPU die or the right? And what is the "acceptable" range of temp variance between VRM 1 and VRM 2?

I'm planning on tearing down my loop soon to add a few parts, so I'll try to get this sorted out then. I'm hoping it's just a problem with thermal pads, because I have no idea where I can get a matching Aquacomputer backplate for my block. They're out of stock everywhere and Aquacomputer is saying it'll be at least three weeks before they get more made.

If you guys have any other suggestions for what may be going wrong I'm all ears. Thanks!


----------



## failwheeldrive

Oh, and in case anyone wants to see some AC smexyness:



The rig in question


----------



## the9quad

I am saying it is designed to run at 95C, as stated by the people who designed it. On the other hand, they set the fan profile with a max at 60%. So running at 100% is not how it was designed, but running at 95C is.


----------



## morreale

Quote:


> Originally Posted by *morreale*
> 
> ASUS R9290X-DC2OC-4GD5
> Stock Asus
> username: morreale
> 
> 
> 
> I have had issues with this card since it has been installed with black screens and BSODs. I am using a Corsair Professional Series AX860i so I know power is not an issue.


I sent the card back and now have 2x Asus GTX 780 Ti DCIIs, so you can remove me from the owner list if you'd like.


----------



## falcon26

Just played around with my new well-used 290X I got for $350. One awesome card.


----------



## rdr09

Quote:


> Originally Posted by *failwheeldrive*
> 
> Oh, and in case anyone wants to see some AC smexyness:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> The rig in question


Wow.

Anyways, that VRM1 temp is certainly not normal for a watercooled 290X. VRM1 is the row of MOSFETs near the power plug. You should not even see 60C, IMO.

some guesses . . .

1. you forgot to remove the plastic covers of the pads

2. wrong size (but i doubt it)

3. air in the system, and

4. what you think it might be.


----------



## failwheeldrive

Quote:


> Originally Posted by *rdr09*
> 
> Wow.
> 
> Anyways, that VRM1 temp is certainly not normal for a watercooled 290X. VRM1 is the row of MOSFETs near the power plug. You should not even see 60C, IMO.
> 
> some guesses . . .
> 
> 1. you forgot to remove the plastic covers of the pads
> 
> 2. wrong size (but i doubt it)
> 
> 3. air in the system, and
> 
> 4. what you think might be.


Thanks for the quick reply and for clarifying which VRM is which. That's what I thought, since VRM2 never breaks 40C at stock clocks, even with the fans turned off lol. I can see the coolant over that section of VRM underneath the Hawaii logo plate, so I doubt it's air stuck in the block. I hadn't thought about the plastic still being on the thermal pad, but it's certainly possible. That'll be the first thing I check, along with making sure the pad is the correct thickness and is covering the whole row. If it's not that, my guess is the backplate screws are a bit too long, causing the block to have inadequate pressure on the right side of the PCB. Hopefully it won't be too difficult to troubleshoot. Thanks again!


----------



## aaroc

Quote:


> Originally Posted by *kayan*
> 
> I've got a question about crossfire, if it only works in full screen, does that mean it does or doesn't work in borderless full screen?


You need real fullscreen to have CFX working. Some programs like TeamViewer gave me problems making CFX work; that program tries to avoid fullscreen programs for some reason. Just close it before gaming and there are no problems with CFX, or add the command to force fullscreen to every game.
Quote:


> Originally Posted by *Gualichu04*
> 
> Come to find out the r9 290x with aquacomputer backplate it is to thick to put in the top slot of an asus riv BE. the ram slots are in the way. Now i am stuck using the middle two slots and the sound card has to go in the bottom slot.


I think two users earlier in this same thread reported the same problem. They cut the backplate on the part that touched the RAM and/or removed the retention tab on that side of the memory slot (the part that clicks to keep the RAM from popping out).


----------



## rdr09

Quote:


> Originally Posted by *failwheeldrive*
> 
> Thanks for the quick reply and for clarifying which VRM is which. That's what I thought, since VRM2 never breaks 40C at stock clocks, even with the fans turned off lol. I can see the coolant over that section of VRM underneath the Hawaii logo plate, so I doubt it's air stuck in the block. I hadn't thought about the plastic still being on the thermal pad, but it's certainly possible. That'll be the first thing I check, along with making sure the pad is the correct thickness and is covering the whole row. If it's not that, my guess is the backplate screws are a bit too long, causing the block to have inadequate pressure on the right side of the PCB. Hopefully it won't be too difficult to troubleshoot. Thanks again!


go over post# 21896

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781


----------



## Gualichu04

Quote:


> Originally Posted by *aaroc*
> 
> You need real fullscreen to have CFX working. Some programs like Teamviewer gave me problems to make CFX work. That program tries to avoid fullscreen programs for some reason. Just close the program before gaming and no problems with CFX or add the command to force fullscreen to every game.
> I think in this same thread two users reported the same problem. They cut the backplate on the offending part that touched the rams and/or took off the part on that side of the memory slot that make the click to avoid the ram going out.


I will just deal with it, because no matter what the GPUs will run at x8 when in Crossfire, since the active backplate only comes in certain sizes for the Crossfire fitting. I did not want to use SLI/Crossfire tubes either.


----------



## failwheeldrive

Quote:


> Originally Posted by *rdr09*
> 
> go over post# 21896
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781


Thanks man, at least I know it won't hurt anything to be running at 70-80C until I can figure out what the issue is.

I filed down the backplate screws over the vrm1 section a bit to see if that was the cause, and it didn't help at all unfortunately







I suppose the screws on the back of the card itself may not be completely tightened, so I can try that too... pretty unlikely though. Signs seem to be pointing to the thermal pads being the problem, so I will probably end up having to remove the block to get this sorted out. Sucks since I'm running acrylic tubing lol. Makes getting the card out more of a pain.


----------



## tsm106

Quote:


> Originally Posted by *failwheeldrive*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> go over post# 21896
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781
> 
> 
> 
> Thanks man, at least I know it won't hurt anything to be running at 70-80C until I can figure out what the issue is.
> 
> I filed down the backplate screws over the vrm1 section a bit to see if that was the cause, and it didn't help at all unfortunately
> 
> 
> 
> 
> 
> 
> 
> I suppose the screws on the back of the card itself may not be completely tightened, so I can try that too... pretty unlikely though. Signs seem to be pointing to the thermal pads being the problem, so I will probably end up having to remove the block to get this sorted out. Sucks since I'm running acrylic tubing lol. Makes getting the card out more of a pain.
Click to expand...

Your temps ideally could be a lil better, but they're ok, not too bad. Aqua tend to market that they're awesome on vrm temps, but it's really doubtful due to the smallish vrm channel. You can, as linked, overcome the design with some fujis. What are you using, a 360, in that bench?

Btw if you left the plastic on, your core would shoot up to 95c in a flash with a load.


----------



## failwheeldrive

Quote:


> Originally Posted by *tsm106*
> 
> Your temps ideally could be a lil better but they are ok, not too bad. Aqua tend to market that they are awesome on vrm temps, but it's really doubtful due to the smallish vrm channel. You can as linked overcome the design with some fujis. What are you using a 360 in that bench?
> 
> Btw if you left the plastic on, your core would shoot up to 95c in a flash with a load.


Thanks bud, yeah I'm just running a single black ice nemesis 360gtx with eloop b12-4 in push/pull at the moment. Temps on the core and vrm2 are really good when I turn up the fans a bit (40C or less on the core at load on stock clocks) but I have my aquaero running the fans at 0 rpm with occasional jumps to 800rpm just to keep things quiet. I'll be adding a 2nd 360gtx and probably a thin 240 up top when I tear down the loop soon. Also looking around for a 2nd 290x.

I know Aquacomputer's vrm performance has been kinda iffy in the past, but Stren's testing showed really good performance on the 290x vrm1, especially with the active backplate: http://www.xtremesystems.org/forums/showthread.php?288109-Stren-s-R9-290-290x-Water-Block-Testing So yeah, something definitely seems to be off with my VRM1 temps if Stren's results were any indication of the norm. And thanks for clearing that up about leaving the plastic on the thermal pad, at least I can probably cross that off the list of possible issues now.

I will go ahead and grab some fuji pads when I make the order for the rads soon. Hopefully that'll improve things. If it's not a seating or thermal pad problem, I really don't know what it could be lol.


----------



## Arizonian

Quote:


> Originally Posted by *morreale*
> 
> i sent the card back and now have 2x Asus GTX 780ti DCIIs. so you can remove me from the owner list if you'd like.


Updated


----------



## FuriousPop

Quote:


> Originally Posted by *failwheeldrive*
> 
> Thanks bud, yeah I'm just running a single black ice nemesis 360gtx with eloop b12-4 in push/pull at the moment. Temps on the core and vrm2 are really good when I turn up the fans a bit (40C or less on the core at load on stock clocks) but I have my aquaero running the fans at 0 rpm with occasional jumps to 800rpm just to keep things quiet. I'll be adding a 2nd 360gtx and probably a thin 240 up top when I tear down the loop soon. Also looking around for a 2nd 290x.
> 
> I know Aquacomputer's vrm performance has been kinda iffy in the past, but Stren's testing showed really good performance on the 290x vrm1, especially with the active backplate: http://www.xtremesystems.org/forums/showthread.php?288109-Stren-s-R9-290-290x-Water-Block-Testing So yeah, something definitely seems to be off with my VRM1 temps if Stren's results were any indication of the norm. And thanks for clearing that up about leaving the plastic on the thermal pad, at least I can probably cross that off the list of possible issues now.
> 
> I will go ahead and grab some fuji pads when I make the order for the rads soon. Hopefully that'll improve things. If it's not a seating or thermal pad problem, I really don't know what it could be lol.


do you really need a backplate on these?

almost every review i have seen in relation to the 290/290x waterblocks run hotter with the backplate ON...

I didn't bother with getting any backplates on my XSPC 290 ones.... but im sure i will no doubt find out in a very short time if i have made the right decision or not!


----------



## keikei

backplates=looks.


----------



## heroxoot

Quote:


> Originally Posted by *keikei*
> 
> backplates=looks.


They tend to be pretty good to keep the card from bending too.


----------



## tsm106

I'll always buy the backplate if it's an option. I've been doing this too long and have seen too much damage from bare exposed PCBs. Backplates serve different roles, from pcb armor, to deflecting leaks, to stiffening saggy card syndrome, and then the traditional assist cooling on vrms and core. To each his own, but I'll always buy a plate if I'm going fullcover.


----------



## FuriousPop

Quote:


> Originally Posted by *tsm106*
> 
> I'll always buy the backplate if it's an option. I've been doing this too long and have seen too much damage from bare exposed PCBs. Backplates serve different roles, from pcb armor, to deflecting leaks, to stiffening saggy card syndrome, and then the traditional assist cooling on vrms and core. To each his own, but I'll always buy a plate if I'm going fullcover.


guess i gotta take some more notes from the experts in my next class!

doing my first ever WC build now. i did notice that 1 of my 3 blocks i screwed in bent just a little (maybe 1 degree or so, i can just barely tell) but maybe thats because i screwed the dam thing in too hard! lol...

i also got 2x little SLI bridges holding these things together so hopefully "saggyness" should not be happening at all. i have screwed in these cards into the case pretty tight - cut myself getting my fat fingers in the tiny space doing so!!!


----------



## tsm106

Quote:


> Originally Posted by *FuriousPop*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I'll always buy the backplate if it's an option. I've been doing this too long and have seen too much damage from bare exposed PCBs. Backplates serve different roles, from pcb armor, to deflecting leaks, to stiffening saggy card syndrome, and then the traditional assist cooling on vrms and core. To each his own, but I'll always buy a plate if I'm going fullcover.
> 
> 
> 
> guess i gotta take some more notes from the experts in my next class!
> 
> doing my first ever WC build now. i did notice that 1 of my 3 blocks i screwed in bent just a little (maybe 1 degree or so, i can just barely tell) but maybe thats because i screwed the dam thing in too hard! lol...
> 
> i also got 2x little SLI bridges holding these things together so hopefully "saggyness" should not be happening at all. i have screwed in these cards into the case pretty tight - cut myself getting my fat fingers in the tiny space doing so!!!
Click to expand...

Relax on the tightening. Waterblocks don't need to be tightened down till you see white knuckles. Get it snug, then a 1/4 to a 1/2 turn should be good. There are also nuanced differences between the block makers. EK is typically a direct fit, meaning there is hardly any flex once the block is mounted. With Aqua (at least with my 7970 block) there is a lot of distance between the standoff and the actual pcb, so it almost always flexes the pcb. I do my damndest to balance tightening force vs pcb bending but it's still annoying.


----------



## FuriousPop

Quote:


> Originally Posted by *tsm106*
> 
> Relax on the tightening. Waterblocks don't need to be tightened down till you see white knuckles. Get it snug, then a 1/4 to a 1/2 turn should be good. There are also nuanced differences between the block makers. EK is typically a direct fit, meaning there is hardly any flex once the block is mounted. With Aqua (at least with my 7970 block) there is a lot of distance between the standoff and the actual pcb, so it almost always flexes the pcb. I do my damndest to balance tightening force vs pcb bending but it's still annoying.


i agree, when doing mine i only used my thumb and forefinger to turn the screwdriver. once it got tight i did another 1/4 turn and left it at that - it's a snug fit with the XSPC ones, but my concern is that the edge of the block is still loose against the PCB around the area where the power pins/cables go in. there's holes there in the PCB, but XSPC in all their wisdom didn't think to put in another 3 holes for additional screws!!!!!

i could snap off the edge of it if i wasn't soooo perfectionist careful!!!


----------



## Mega Man

i tighten the heck out of the crossbracket but leave the rest a normal tight

( komodos use it, and have stop points, so you are only tightening the spring )


----------



## failwheeldrive

Stren's testing showed pretty significant improvements in performance with the backplate, at least with the Aquacomputer block. Granted, I'm running an EK backplate lol.

It's mostly for looks, since I can't stand to look at a bare PCB if I can avoid it. It also does provide a degree of protection and increased strength, especially since I'm running an open case.


----------



## FuriousPop

Quote:


> Originally Posted by *failwheeldrive*
> 
> Stren's testing showed pretty significant improvements in performance with the backplate, at least with the Aquacomputer block. Granted, I'm running an EK backplate lol.
> 
> It's mostly for looks, since I can't stand to look at a bare PCB if I can avoid it. It also does provide a degree of protection and increased strength, especially since I'm running an open case.


lol, by the weight of my XSPC blocks i could smack them over someone's head and they would still be in 1 piece!!

but jokes aside - i can't wait to run some water through these puppies - hopefully i have put them on correctly and covered everything with no plastic left on any of the covers etc etc....

since i never done this before can someone give me a range of what temps i should be looking for:
GPU Core:
VRM 1:
VRM 2:

i have seen many people giving the temps of the builds they run, however i just wanted to know simply whether <50 degrees cel on the core is good, or >65 on VRM 1 is bad - along those lines if possible


----------



## tsm106

Quote:


> Originally Posted by *failwheeldrive*
> 
> Stren's testing showed pretty significant improvements in performance with the backplate, at least with the Aquacomputer block. Granted, I'm running an EK backplate lol.
> 
> It's mostly for looks, since I can't stand to look at a bare PCB if I can avoid it. It also does provide a degree of protection and increased strength, especially since I'm running an open case.


I don't know what to make of Stren's testing. My 290x EK block only went up to 45c at most. His charts show a delta larger than my temps lol. Hell, my current 7970 blocks that are not using fujis max out at 40c. My water temp earlier today was 31c in the bios and loaded vrm temp reaches 45c. That's a 14c delta, yet Stren has 290 block deltas at +45c. It doesn't make sense to me.


----------



## FuriousPop

Quote:


> Originally Posted by *tsm106*
> 
> I don't know what to make of strens testing. My 290x ek block only went up to 45c at most. His charts show a delta larger than my temps lol. Hell, my current 7970 blocks that are not using fujis max out at 40c. It doesn't make sense to me. My water temp earlier today was 31c in the bios and loaded vrm temp reaches 45c. That's a 14c delta, yet Stren has 290 block deltas at +45c. It doesn't make sense to me.


could it be anything to do with the ambient temps outside the case at the time of testing? summer time here i get 45 degrees cel outside for several days, and inside can be just as bad....


----------



## failwheeldrive

Quote:


> Originally Posted by *tsm106*
> 
> I don't know what to make of strens testing. My 290x ek block only went up to 45c at most. His charts show a delta larger than my temps lol. Hell, my current 7970 blocks that are not using fujis max out at 40c. It doesn't make sense to me. My water temp earlier today was 31c in the bios and loaded vrm temp reaches 45c. That's a 14c delta, yet Stren has 290 block deltas at +45c. It doesn't make sense to me.


That does seem off lol. There are a ton of variables involved, but none that I know of would cause that large of a discrepancy.

Of course if I always knew the reason why VRM1 temps were high, I wouldn't be in my current situation


----------



## Talon720

Quote:


> Originally Posted by *tsm106*
> 
> The dual D5 mounts are typically another hundred. You can avoid that if you can manage to plumb both pumps in separately, like one at one end of the loop and one at the other. Your pump is huge so that may be an even bigger hurdle. GL.
> 120mm fans have higher static pressure whereas 140mm fans have higher flow. With respect to radiator applications, static pressure is what we want where higher flow is better for case fans. When I went this route, there were not any 140mm fans that could compete with GTs. I'm not sure there are any now since the AP15s have been resigned to eol.
> Are you using the same software and bios I was? Asic is meaningless. I never bothered to look at my cards asics. There is no differentiation between Hynix and Elpida as far as power consumption. Software readings are very inaccurate when it comes to power draws. That's why high end cards often come with DMM points built in.
> 
> 1200mhz wall... sounds like you are running out of voltage. What is your oc method, what app are you running? I described what app and bios setup I was using. The method I described is for using gputweak + pt1 bios under water. My results were 3 random cards which all did at least 1300/1600. My friend has had 3 cards too, 290/290X/290X Lightning and all did 1300 at least using same methodology. However to get there you need a good loop and the brass to push 1.45v loaded sometimes.


Yea, I used the pt1 bios with gpu tweak. I have a full loop with a cpu block and vrm block and 2 D5 pumps, and temps are fine. Oh, in support for you tsm, and to anyone saying 1 pump is enough for 3 cards or more with a cpu block: no way. I couldn't push out stuck air bubbles until the second pump. I was bringing other stuff up just in case, cause it seems odd that it's happening. If I use the pt3 bios things hit the fan even faster since there's no vdrop. Maybe it's me, maybe it's the card, I'll keep trying. Btw you're a wealth of info


----------



## failwheeldrive

Quote:


> Originally Posted by *FuriousPop*
> 
> could it be anything to do with the ambient temps outside the case at the time of testing? summer time here i get 45 degree's cel outside for several days to which inside can be just as bad....


Not really, since an increase in ambient would also increase water temp. The vrm/water delta should remain the same regardless of ambient temps.
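That reasoning can be sketched with some toy numbers (the temps below are illustrative, not anyone's actual measurements):

```python
def vrm_delta(vrm_temp_c: float, water_temp_c: float) -> float:
    """Delta between the VRM sensor reading and the loop water temperature."""
    return vrm_temp_c - water_temp_c

# To a first approximation, a warmer room shifts water and component
# temps together, so the VRM/water delta is unchanged.
baseline = vrm_delta(45.0, 31.0)                # normal ambient
hot_room = vrm_delta(45.0 + 10.0, 31.0 + 10.0)  # same loop, 10C hotter room
print(baseline, hot_room)  # 14.0 14.0
```

This is why delta over water temp, not absolute VRM temp, is the fair way to compare results taken in different rooms.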


----------



## kizwan

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *failwheeldrive*
> 
> Stren's testing showed pretty significant improvements in performance with the backplate, at least with the Aquacomputer block. Granted, I'm running an EK backplate lol.
> 
> It's mostly for looks, since I can't stand to look at a bare PCB if I can avoid it. It also does provide a degree of protection and increased strength, especially since I'm running an open case.
> 
> 
> 
> I don't know what to make of strens testing. My 290x ek block only went up to 45c at most. His charts show a delta larger than my temps lol. Hell, my current 7970 blocks that are not using fujis max out at 40c. It doesn't make sense to me. My water temp earlier today was 31c in the bios and loaded vrm temp reaches 45c. That's a 14c delta, yet Stren has 290 block deltas at +45c. It doesn't make sense to me.
Click to expand...

Stren used Furmark with a 40 minute warm up & 20 minutes of data logging, & the 290 overclocked to 1175/1465MHz +100mV. I didn't use Furmark, but with Valley I got a 35C delta with the 290 @1150/1600 +100mV. With Fujipoly Extreme I got it down to a 15C delta. I'd like to know how you got a 14C delta, because whatever I did, I couldn't get that kind of result with the stock EK thermal pad. I do not upgrade GPUs regularly & my previous 5870 ran cool with an EK block + stock thermal pad.


----------



## HoneyBadger84

Quote:


> Originally Posted by *kizwan*
> 
> Stren use Furmark with 40 minute warm up & 20 minute for data logging & the 290 overclocked to 1175/1465MHz +100mV. I didn't use Furmark but with Valley I got 35C delta with 290 @1150/1600 +100mV. With Fujipoly Etreme I got it down to 15C delta. I like to know how did you get 14C delta because whatever I did, I can't get that kind of result with stock EK thermal pad. I do not upgrade GPU regularly & my previous 5870 running cool with EK block + stock thermal pad.


Or you could just not use Furmark on a 290/290X, ever. Several folks liken running Furmark on a modern video card to driving run-flat tires over a spike strip to prove they work. It's unnecessary and excessive. Run Valley, or BF4 for a while.


----------



## kizwan

You should read my post carefully to see what exactly I'm trying to discuss.


----------



## HoneyBadger84

Quote:


> Originally Posted by *kizwan*
> 
> You should read my post carefully, what exactly I'm trying to discuss.


Did read it. I'm simply saying Furmark is bad, as a newer member was asking questions about temps. Temps in Furmark are not realistic for any actual gaming or stress testing, at stock or otherwise. The fact that some benchmark sites still use it for temperature testing, despite AMD saying not to because it's unrealistic, makes the entire review void imo.

Yes, I saw you said you didn't use it. I'm simply advising that it not be used at all. Valley or BF4 is sufficient temperature testing, Valley in particular if you leave it running for 3+ loops you should level off in temps unless something about your cooling is inadequate.

Nice clocks on the 290 @ +100mV btw


----------



## failwheeldrive

Quote:


> Originally Posted by *tsm106*
> 
> I don't know what to make of strens testing. My 290x ek block only went up to 45c at most. His charts show a delta larger than my temps lol. Hell, my current 7970 blocks that are not using fujis max out at 40c. It doesn't make sense to me. My water temp earlier today was 31c in the bios and loaded vrm temp reaches 45c. That's a 14c delta, yet Stren has 290 block deltas at +45c. It doesn't make sense to me.


Yeah, I can't remember the last time I heard someone recommend furmark for benchmarking/stress testing. There have been better options for years.

Slight update for the VRM1 temps, I took off the backplate and made sure the screws holding the block down were snug. Didn't help at all, VRM1 still hovers in the high 60s.

I guess the next step is fuji thermal pads. In the unlikely chance it's fitment related it might help to reseat the block. And if not at least the new pads should lower temps some.


----------



## kizwan

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> You should read my post carefully, what exactly I'm trying to discuss.
> 
> 
> 
> Did read it. I'm saying simply furmark bad as a newer member was asking questions about temps. Temps in Furmark are not realistic for any actual gaming or stress at stock or otherwise. The fact that some benchmark sites still use it for temperature testing despite AMD saying not to because it's unrealistic makes the entire review void imo.
> 
> Yes, I saw you said you didn't use it. I'm simply advising that it not be used at all. Valley or BF4 is sufficient temperature testing, Valley in particular if you leave it running for 3+ loops you should level off in temps unless something about your cooling is inadequate.
> 
> Nice clocks on the 290 @ +100mV btw
Click to expand...

You're missing the point there. The main subject of the latest discussion is Stren's VRM1 *delta temps*. Since I got an almost similar result with the stock EK thermal pad, i.e. a high delta temp (Stren's VRM1 delta temp in the 40s Celsius with Furmark, while my VRM1 delta temp was in the 30s Celsius with Valley), I tend to see Stren's result as not unrealistic at all. Because tsm106 managed to get a 14C delta, I'm really interested to learn how he managed that, because I can't for the life of me get a low delta temp with the stock EK thermal pad. That's what my previous post was all about.
Quote:


> Originally Posted by *failwheeldrive*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I don't know what to make of strens testing. My 290x ek block only went up to 45c at most. His charts show a delta larger than my temps lol. Hell, my current 7970 blocks that are not using fujis max out at 40c. It doesn't make sense to me. My water temp earlier today was 31c in the bios and loaded vrm temp reaches 45c. That's a 14c delta, yet Stren has 290 block deltas at +45c. It doesn't make sense to me.
> 
> 
> 
> 
> 
> 
> 
> Yeah, I can't remember the last time I heard someone recommend furmark for benchmarking/stress testing. There have been better options for years.
> 
> Slight update for the VRM1 temps, I took off the backplate and made sure the screws holding the block down were snug. Didn't help at all, VRM1 still hovers in the high 60s.
> 
> I guess the next step is fuji thermal pads. In the unlikely chance it's fitment related it might help to reseat the block. And if not at least the new pads should lower temps some.
Click to expand...

The fact that he is using Furmark is not important, because we are talking about delta temps. If you use Furmark, overclocked to the same clocks and ideally the same voltage, and you get a lower delta, then yes, you can say something is wrong with the review. Obviously, whether you use Valley or any other tool, you should be able to estimate the delta temp you should get, e.g. with Valley the delta temp might be ~10C less than with Furmark.

Before trying Fujipoly, I recommend re-seating the block first.
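Comparing runs by delta rather than absolute temps is easy to do from logged sensor data. A minimal sketch with made-up sample logs (the numbers are hypothetical, not Stren's data):

```python
# Hypothetical (vrm1_temp_c, water_temp_c) samples from two stress runs.
furmark_log = [(76.0, 31.0), (77.0, 31.0), (78.0, 32.0)]
valley_log = [(66.0, 31.0), (67.0, 31.0), (66.0, 32.0)]

def max_delta(log):
    """Worst-case VRM-over-water delta observed in a run."""
    return max(vrm - water for vrm, water in log)

# A heavier workload raises the delta, so only compare like-for-like
# runs (same tool, same clocks/voltage) across blocks or pads.
print(max_delta(furmark_log), max_delta(valley_log))  # 46.0 36.0
```

With logs like these, the tool-to-tool offset (here 10C) is exactly the kind of correction described above.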


----------



## failwheeldrive

Quote:


> Originally Posted by *kizwan*
> 
> The fact that he is using Furmark is not important because we are talking about delta temps. If you use furmark & overclocked to same clocks and ideally same voltage, you get lower delta, then yes you can say something wrong with the review. Obviously, if you use Valley or any tools at all, you should be able to calculate delta temp you should get, e.g. with Valley the delta temp might be -10C less than Furmark.
> 
> Before trying Fujipoly, I recommend try re-seat the block first.


I wasn't trying to criticize Stren's testing methodology, I was just agreeing that there are better stress testing options available for people to use that aren't so hard on the card.

I figured I'd go ahead and try the fuji pads since I'd have to disassemble anyway. From what I've read they seem like a solid upgrade over the stock thermal pads. I could reseat it using the stock pads first, but I'd have to drain the loop again if I wanted to go with fuji. I'm lazy


----------



## kizwan

Quote:


> Originally Posted by *failwheeldrive*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> The fact that he is using Furmark is not important because we are talking about delta temps. If you use furmark & overclocked to same clocks and ideally same voltage, you get lower delta, then yes you can say something wrong with the review. Obviously, if you use Valley or any tools at all, you should be able to calculate delta temp you should get, e.g. with Valley the delta temp might be -10C less than Furmark.
> 
> Before trying Fujipoly, I recommend try re-seat the block first.
> 
> 
> 
> 
> 
> 
> 
> I wasn't trying to criticize Stren's testing methodology, I was just agreeing that there are better stress testing options available for people to use that aren't so hard on the card.
Click to expand...

I don't think that's what tsm106 is arguing about. tsm106 is arguing about the large difference in delta temp between the Kryographics and other 290/290X water blocks, not about Stren using Furmark. If Stren had used a different tool, like Valley for example, it would still show a huge difference in delta temp between the Kryographics and other 290/290X water blocks. The "problem" here is the huge difference, not the tool used in the review.


----------



## Mercy4You

Quote:


> Originally Posted by *Mercy4You*
> 
> After watching Youtube my R9 290X doesn't downclock anymore. Needs a restart for Powertune to work again. Anyone else got this problem, and is there a solution?


Quote:


> Originally Posted by *rdr09*
> 
> wonder if its same with Mercy? so, you are using AB? tried opening AB it says expired. i don't use it anyway i'll leave it. glad you got it sorted.


Found the solution, it's because of GPU rendering (hardware acceleration) in the browsers. After disabling this, GPU downclocks fine now...


----------



## The Storm

Quote:


> Originally Posted by *Talon720*
> 
> Yea, I used the pt1 bios with gpu tweak. I have a full loop with a cpu block and vrm block and 2 D5 pumps, and temps are fine. Oh, in support for you tsm, and to anyone saying 1 pump is enough for 3 cards or more with a cpu block: no way. I couldn't push out stuck air bubbles until the second pump. I was bringing other stuff up just in case, cause it seems odd that it's happening. If I use the pt3 bios things hit the fan even faster since there's no vdrop. Maybe it's me, maybe it's the card, I'll keep trying. Btw you're a wealth of info


I must be lucky then with my single D5 and 3 gpus and cpu block, no air bubbles in my system.


----------



## bond32

Quote:


> Originally Posted by *The Storm*
> 
> I must be lucky then with my single D5 and 3 gpus and cpu block, no air bubbles in my system.


One D5 is plenty for that, and if you have air trapped in your system it doesn't necessarily mean it's because you don't have enough pumping power.


----------



## devilhead

Done some more tests with the 290 and 290X under water. This is the 290 score http://www.3dmark.com/3dm/3679094, and I tested my 290X again http://www.3dmark.com/fs/2523594 , I can even run the 290X at 1390 on the core but the score gets worse... and that's with my daily cpu overclock















and i'm happy with 290, which i will throw to my 4770k rig, it can run 1230/1500 +100mv battlefield 4 all day long at 120hz


----------



## heroxoot

Quote:


> Originally Posted by *devilhead*
> 
> Done some more test with 290 and 290X under water, so this is 290 score http://www.3dmark.com/3dm/3679094, and tested my 290X again http://www.3dmark.com/fs/2523594 , i can run those 290X at 1390 on core even but the score get' worse... and daily cpu overclock
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and i'm happy with 290, which i will throw to my 4770k rig, it can run 1230/1500 +100mv battlefield 4 all day long at 120hz


Score getting worse usually entails a bad OC. Either way good stuff man. 120fps BF4 on a single card sounds great.


----------



## Klocek001

OK, I found the box I need to uncheck in Opera settings to disable hardware acceleration; now the GPU is sitting around 370MHz but I'm still getting 950MHz spikes. Memory is still at 1300 MHz though, won't downclock.
Is there any browser which will let me watch YT at 350/150?
I don't remember having this issue with the 7870, it was very cool and power efficient in normal desktop use.


----------



## heroxoot

Quote:


> Originally Posted by *Klocek001*
> 
> Is there any way I can watch youtube and not need a freakin 1000 mhz on the core ?
> 
> I'm using Opera


Disable hardware acceleration both in Flash and in the browser. My GPU still clocks up to 400 - 500mhz with these off, however. Less heat this way tho.


----------



## Klocek001

Quote:


> Originally Posted by *heroxoot*
> 
> Disable hardware acceleration both in flash and the browser. My GPU still clocks up to 400 - 500mhz however with these off. Less heat this way tho.


Ok, I got it. Still 1300 memory (doesn't seem to affect the temperature though). thx a lot.


----------



## wrigleyvillain

The memory stays clocked up if you are running more than one monitor, as well. I hate it but it's to prevent 2D flickering apparently. Uses more power and my VRM2 temps are 5-7C higher on the desktop.


----------



## tsm106

Quote:


> Originally Posted by *wrigleyvillain*
> 
> The memory stays clocked up if you are running more than one monitor, as well. I hate it but it's to prevent 2D flickering apparently. Uses more power and my VRM2 temps are 5-7C higher on the desktop.


Generally that's true, but it really depends on each individual driver, and how you overclock plays a big role. For example, 14.6 and 14.7 idle at 500/max mem speed with multi-monitors, while 14.4 WHQL doesn't enforce clocks. Thus you can run 300/150 with 14.4 WHQL.


----------



## tsm106

Quote:


> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *The Storm*
> 
> I must be lucky then with my single D5 and 3 gpus and cpu block, no air bubbles in my system.
> 
> 
> 
> One D5 is plenty for that, and if you have air trapped in your system it doesn't necessarily mean it's because you don't have enough pumping power.
Click to expand...











That's exactly what it means lol. If your pump cannot push the air out of your loop, it means it's too weak, and thus it's a compromise that you are willing to live with.


----------



## heroxoot

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wrigleyvillain*
> 
> The memory stays clocked up if you are running more than one monitor, as well. I hate it but it's to prevent 2D flickering apparently. Uses more power and my VRM2 temps are 5-7C higher on the desktop.
> 
> 
> 
> Generally that's true but it really depends on each individual driver and how you overclock plays a big role. For example 14.6 and 14.7 idle at 500/max mem speed in multi-monitors while 14.4 whql doesn't enforce clocks. Thus you can run 300/150 with 14.4 whql.
Click to expand...

14.6RC2 idles at 300/1250 for me. I have not done any overclocking recently since the 14.6 driver alone gave me a huge performance gain. Staying below 50c watching flash based streams, 52c with the AC off. It's nice actually. Games are much cooler too.


----------



## kayan

I have another dumb-ish question:

http://www.3dmark.com/fs/2525236

Why, almost every time I overclock my GPUs, does 3DMark report stock speeds instead of what they are overclocked to? Sometimes it shows the OC'd core values, and sometimes it doesn't.








/scratching head


----------



## heroxoot

Quote:


> Originally Posted by *kayan*
> 
> I have another dumb-ish question:
> 
> http://www.3dmark.com/fs/2525236
> 
> Why, almost every time I overclock my GPU's does 3dMark report stock speeds instead of what they are overclocked to? Sometimes it shows the OC'd core values, and sometimes it doesn't.
> 
> 
> 
> 
> 
> 
> 
> 
> /scratching head


Bios reading most likely.


----------



## tsm106

Quote:


> Originally Posted by *kayan*
> 
> I have another dumb-ish question:
> 
> http://www.3dmark.com/fs/2525236
> 
> Why, almost every time I overclock my GPU's does 3dMark report stock speeds instead of what they are overclocked to? Sometimes it shows the OC'd core values, and sometimes it doesn't.
> 
> 
> 
> 
> 
> 
> 
> 
> /scratching head


Futuremark's sysinfo has trouble reading clocks. I notice it happens often with overclocks that aren't necessarily unstable, but that could use more stability. Or another way to think of it: you're really pushing it.


----------



## kayan

Quote:


> Originally Posted by *tsm106*
> 
> Futuremarks sysinfo has trouble reading clocks. I notice that it happens often with not necessarily unstable overclocks, but overclocks that could use more stability. Or another way to think of it, you're really pushing it.


So I should just try increasing the voltages?

For example, I can overclock both of my 290X's to 1125 without any increase in voltage, and that will display the clocks properly. However, as soon as I start having to increase voltages, the clocks show at 1000 even when the card is really at 1200; the benchmark score goes up, but the clocks still show 1000.


----------



## bluedevil

Just emailed VisionTek about my replacement RMA GPU; I was told it would be a total of 3 - 5 business days to test (after today it has been 2 full days). So hopefully I will see my new GPU on my doorstep sometime next week!


----------



## ZealotKi11er

Quote:


> Originally Posted by *bluedevil*
> 
> Just emailed VisonTek about my replacement RMA GPU, was told it would be a total of 3 - 5 business days to test (after today it has been 2 full days). So hopefully I will see my new GPU on my doorstep sometime next week!


What the hell are they testing it for 3 days for? It either works or it doesn't.


----------



## bluedevil

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What the hell are they testing it for 3 days for? It wither works or does not.


That's what I am trying to figure out... I don't want to sound rude or ungrateful, but IMO it shouldn't take long to figure out whether it works or not. Funny thing is, it was delivered on Monday, 7/28/14 at 3:19 PM, and it didn't get into the "Testing" process until Wednesday. Talk about someone sitting on their hands...


----------



## Talon720

Quote:


> Originally Posted by *The Storm*
> 
> I must be lucky then with my single D5 and 3 gpus and cpu block, no air bubbles in my system.


Quote:


> Originally Posted by *bond32*
> 
> One D5 is plenty for that, and if you have air trapped in your system it doesn't necessarily mean it's because you don't have enough pumping power.


Well, it's also true everyone's system is different, and some people learn to live with or ignore the issue. My system just wouldn't bleed like it did before the 3rd GPU block. I had horrible temps; they skyrocketed the minute the GPUs were stressed, and it was an air bubble. I know you weren't having this issue, but I've read about others who had similar systems, and no matter how they rocked the case or flipped it upside down it wouldn't bleed. I just think adding a second pump isn't out of the realm of possibility. It was a night and day difference after adding a second pump for me. Maybe watercooling parts' restrictiveness varies more than I would have thought.


----------



## battleaxe

Quote:


> Originally Posted by *bluedevil*
> 
> That's what I am trying to figure out....I don't want to sound rude or ungrateful but IMO it shouldn't take long to figure out if doesn't work or not. Funny thing is, it was delivered on Monday, 7/28/14 3:19PM. And it didn't get into the "Testing" process until Wednesday. Talk about someone sitting on there hands....


Most likely it's not that they test it for three days, but that it sits in the department for 3 days. There's probably a waiting list of parts to test: they come in and enter the back of the line, and as the techs work through the queue it's about 3 days before they get to yours. That's the likely scenario, not that they'd test yours for 3 days straight. Or whatever the time frame is, that's kinda how they do it.


----------



## bluedevil

Quote:


> Originally Posted by *battleaxe*
> 
> Most likely not because they are testing it for three days but because it would be in the department for 3 days. There's probably a waiting list of parts to test. They come in and enter the back of the line. As they work through the parts its about 3 days before they get to yours. That's the likely scenario. Not that they'd test for 3 days. Or whatever the time frame is, that's kinda how they do it.


Well you'd hope that they really wouldn't want to "waste" a day like they did with my card. I just hope that they can tell me something tomorrow.


----------



## FuriousPop

Quote:


> Originally Posted by *Talon720*
> 
> Well its also true everyones system is different, and some people learn to live with or ignore the issue . My system just wouldnt bleed like it did before the 3rd gpu block. I had horrible temps they sky rocketed the minute the gpus were stressed and it was an air bubble. I know you weren't having this issue, but ive read about others who had similar system, and no matter how they rocked the case and flipped it upside down it wouldn't bleed. I just think adding a second pump isnt out of the realm of possibilities. It was a night and day difference after adding a second pump for me. Maybe watercooling parts restrictiveness varies more than i woulda thought.


Sounds like it IS coming down to HOW restrictive your loops are... hence different people having different results...

I have been told that 90-degree fittings can increase the restrictiveness of the loop dramatically and should be avoided. I've got 2x 45-degree ones on my rads; with the way they are positioned, hopefully they don't cause THAT much restriction. Kinks in the loop, etc. Unless we see photos of your build and compare it exactly to others, that's where we may be able to advise better on 1x vs 2x pumps.

Me, for example: I have no idea which one to go for, so I'm going to pick up my 2nd XSPC D5 Vario pump today, put just 1 in the loop and see how it goes. If required I'll add the 2nd pump; otherwise it's going to be collecting dust on a bench! (Exaggeration, I normally keep my parts in sealed plastic cases = no dust.)

I can also recommend this link, which helps to get a rough idea at least: http://www.overclock.net/t/1108918/what-can-my-pump-handle-a-guide


----------



## ozzy1925

I know this question is more PSU-related, but I need 290 owners.
I have 3x 290 Sapphire Tri-X OC and a Corsair AX1500i (125A). As I read it, a stock 290 needs 31A, so 3 of them need 93A. What will happen if I OC these cards to around 1200/1500 together with a high-end CPU like the incoming 5960X, plus 18 fans and 2 D5 pumps? Will my PSU fail to power these cards?


----------



## Talon720

Quote:


> Originally Posted by *FuriousPop*
> 
> sounds like it IS coming down to HOW restrictive your loops are.... hence why different people having different results...
> 
> i have been told that 90 degree angle fittings can increase the restrictiveness of the loop dramatically and is seen as something that should be avoided. i got 2x 45 degree ones on my rads with the way they are positioned hopefully doesn't cause THAT much restriction. kinks in the loop etc etc i mean unless we see photo's of your build and compare exactly to others thats where we may be able to advise better in relation to 1x vs 2x pumps..
> 
> Me for example has no idea which one to go for so am going to pick up my 2nd XSPC d5 Vario pump today and just put 1 in the loop and see how it goes - if required will add the 2nd pump otherwise its going to be collecting dust on a bench! (exaggeration, i normally keep my parts in sealed plastic cases = no dust)
> 
> i can also recomm this link - http://www.overclock.net/t/1108918/what-can-my-pump-handle-a-guide kinda helps to get a rough idea at least.


Well, I was kinda in the same situation. I had a second pump just lying around from an XSPC Photon res (I just got an EK D5 block for it; the Photon didn't really fit in my case). I figured it's not gonna hurt to have 2, only help, especially when expanding the loop. I still have my 3rd rad, a 45XT 240, to fit in the 540 Air.


----------



## FuriousPop

Quote:


> Originally Posted by *Talon720*
> 
> .
> 
> Well I was kinda in the same situation. I had a second pump just laying around from a xspc photon res I just got a ek d5 block for it, the photon didn't really fit in my case. I figured its not gonna hurt to have 2, only help especially when expanding the loop. I still have my 3rd rad 45xt 240 to fit in the 540 air


Well, my EK D5 Vario is ready for pickup, so I'm leaving in 1 hour to go get it.


----------



## melodystyle2003

On a reference R9 290, please tell me a sufficient way to cool it down, both the core and the VRMs (1 and 2), without using a waterblock.


----------



## ozzy1925

Quote:


> Originally Posted by *melodystyle2003*
> 
> On a reference r9 290 tell me please a sufficient way to cool it down, both core and VRMs (1,2) without using waterblock.


You can consider adding a better thermal pad, the Fujipoly Extreme, covered here:
http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures/100_100#post_22647942


----------



## Dasboogieman

Quote:


> Originally Posted by *ozzy1925*
> 
> i know this question is more psu related but need 290 owners.
> i have 3x290 sapphire trix oc and corsair ax 1500i(125A) as i read stock 290 needs 31A so 3 of them needs 93A.What will happen if i oc these cards like 1200/1500 together with high end cpu like incoming 5960x,plus 18 fans and 2 d5 pumps ?Will my psu fail to power these cards?


I can assure you that the AX1500i is more than sufficient. Even assuming each GPU draws a ridiculous 350W, which is virtually impossible without severe risk of damaging the Tri-X VRM assembly (remember, it's not only heat that kills VRMs; many reference GTX 570 owners can attest to that).
You still have a massive 450W of 12V headroom for the CPU, RAM and other components. I seriously doubt you can get the CPU to draw more than 300W (again, virtually impossible to attain without risking killing the CPU with voltage) unless it's an AMD unit. The HDDs are on the 5V rail, so they're unlikely to contribute to the overall 12V load.

The AX1500i is the best PSU on the market at the moment; it will easily handle 1500W of continuous power draw for many years. The sheer efficiency also means it virtually won't get hot either.
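The 12V budget described above can be sanity-checked with a quick sketch. All numbers are the deliberately pessimistic figures from the post, not measurements:

```python
# Rough +12V power budget for the AX1500i scenario described above.
# Figures are the worst-case assumptions from the post, not measured values.
PSU_CAPACITY_W = 1500      # AX1500i continuous rating
GPU_WORST_CASE_W = 350     # deliberately pessimistic per-card draw
NUM_GPUS = 3

gpu_total = GPU_WORST_CASE_W * NUM_GPUS       # total GPU draw, worst case
headroom = PSU_CAPACITY_W - gpu_total         # left for CPU, RAM, fans, pumps

print(f"GPU total: {gpu_total} W, headroom: {headroom} W")
```

Even a heavily overclocked CPU at ~300W still fits comfortably inside the remaining 450W.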


----------



## ds84

Has anyone encountered a problem with the MSI R9 290 Gaming and AquaComputer GPU blocks? My 2nd GPU came with inverted PSU pins. Not a biggie, but the layout of the components has changed, and now the block can't fit. The old one can fit, but I cannot tighten it very much.


----------



## rdr09

Quote:


> Originally Posted by *melodystyle2003*
> 
> On a reference r9 290 tell me please a sufficient way to cool it down, both core and VRMs (1,2) without using waterblock.


check this out . . .

http://www.feppaspot.com/servlet/the-897/GELID-Solutions-Icy-Vision/Detail

also, go over post # 21896

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781


----------



## melodystyle2003

Quote:


> Originally Posted by *rdr09*
> 
> check this out . . .
> 
> http://www.feppaspot.com/servlet/the-897/GELID-Solutions-Icy-Vision/Detail
> 
> also, go over post # 21896
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781


Thanks, I ordered the VRM kit. I've read that the Cooler Master 120M or 240M mounting holes align directly with R9 290/X boards.
Should I go with that AIO cooler, or take the Arctic Accelero 4 / Gelid Icy Vision plus the copper passive heatsinks I've got for the VRAM?

+rep


----------



## amptechnow

i have some questions for the crossfire users...

So my second R9 290 will be here today! Can't wait. I do not have a matching water block or back plate for it yet, though; hoping they play nice together. The 290 I have now is very picky at times but overclocks pretty well: for benches, up to 1200 core and 1625 mem on Elpida using the PT1 BIOS and modded GPU Tweak. It runs 10-15 degrees hotter than the stock BIOS under load, and VRM1 hits 75-80 at times. I game with it at 1115-1125 core and 1625 mem, either with the PT1 BIOS or the stock XFX BIOS, using Afterburner with voltage and power limit maxed; it stays cooler than PT1, and VRM1 only hits 40-47 degrees under load. My XFX is a reference design and was new when I got it, and I added a Heatkiller full water block. The card I have coming today is also new: a PowerColor PCS with the 3 fans. Hoping it's not one of the defective ones.


----------



## kizwan

Quote:


> Originally Posted by *amptechnow*
> 
> i have some questions for the crossfire users...
> 
> so my second r9 290 will be here today! can't wait. i do not have a matching water block or back plate for it yet though. hoping they play nice together. the 290 i have now is very very picky at times. overclocks pretty well... for benches up to 1200 core and 1625 mem on elpida using pt1 bios and modded gpu tweaks. runs 10-15 degrees hotter then stock bios under load and vrm1 hits 75-80 at times. i game with it at 1115-1125 core and 1625 mem either with pt1 bios or stock xfx bios using afterburner with voltage and power limit maxed. stays cooler then pt1 and vrm1 only hits 40-47 degrees under load. my xfx is a reference design and was new when i got it and i added a heatkiller full water block and and the card i have coming today is new, a powercolor pcs with the 3 fans. hoping its not one of the defective ones.


Ermmm... What is the question?


----------



## Knight26

Quote:


> Originally Posted by *ozzy1925*
> 
> i know this question is more psu related but need 290 owners.
> i have 3x290 sapphire trix oc and corsair ax 1500i(125A) as i read stock 290 needs 31A so 3 of them needs 93A.What will happen if i oc these cards like 1200/1500 together with high end cpu like incoming 5960x,plus 18 fans and 2 d5 pumps ?Will my psu fail to power these cards?


31A is the minimum recommended amperage for a whole system with an R9 290X: a 550W PSU with a single +12V rail @ 31A. So the actual amperage needed per card will be somewhat lower than that. Of course, overclocking raises that number. You can find several calculators online that will give the estimated amps based on voltage and watts. More than likely your AX1500i will be more than sufficient for 3 290s.
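The point above, that 31A is a whole-system recommendation rather than per-card draw, comes down to the basic conversion amps = watts / volts. A minimal sketch (the ~250W per-card figure below is an illustrative assumption, not a measured value):

```python
def amps(watts: float, volts: float = 12.0) -> float:
    """Convert power draw on a rail to current: I = P / V."""
    return watts / volts

# AMD's 31A figure corresponds to a whole-system +12V budget:
system_budget_w = 31 * 12          # watts implied by 31A @ 12V

# A single stock 290 pulling ~250W (illustrative assumption)
# needs far less than 31A by itself:
per_card_a = amps(250)

print(f"System budget: {system_budget_w} W, per-card: {per_card_a:.1f} A")
```

The same function gives a quick estimate for overclocked draw; just feed it a higher wattage.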


----------



## mojobear

Hey guys,

Is there a PT1 bios for the R9 290, non-x version? Wanted to have some fun OCing the cards when the temperatures come down in Aug-Sept









Thanks!


----------



## tsm106

Quote:


> Originally Posted by *mojobear*
> 
> Hey guys,
> 
> Is there a PT1 bios for the R9 290, non-x version? Wanted to have some fun OCing the cards when the temperatures come down in Aug-Sept
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks!


The PTx BIOSes are compatible with all reference-based 290X or 290 cards, so no need to seek out a specific BIOS.


----------



## Roboyto

Quote:


> Originally Posted by *melodystyle2003*
> 
> Thanks i ordered the vrm kit. I ve read that the coolermaster 120M or 240M holes align directly to r9 290/x boards.
> Should i go with this aio cooler or take the arctic accelero 4 / gelid icy vision plus i ve got copper ram passive heatsinks for the vrams?
> 
> +rep


If you're willing to spend a couple extra bucks, you can really get some killer VRM temps with the Kraken G10, that Gelid VRM kit, VRM pads upgraded to Fujipoly Ultra Extreme, an upgraded 92mm fan on the Kraken, and some good thermal tape for VRM2... See here:

http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures/80_20#post_22599798

VRM1 maxing at 49C after a 1-hour FFXIV benchmark loop. Pretty solid temperature for air cooling at stock settings.


----------



## Gandroid100

I want to be a member. This is the GPU-Z screenshot.
The brand is Sapphire, and it uses the Tri-X aftermarket cooler.


----------



## Descadent

You need your forum name in the screenshot.


----------



## Willi

I forgot to leave my validation here, please include me in the list XD

1. Validation link: http://www.techpowerup.com/gpuz/3mqp5/
2. Gigabyte R9 290X (reference) - using MSI BIOS due to stability
3. Cooling: water - Aquacomputer full cover block.

I need to get a decent camera ASAP. I had to redo some bends on the acrylic tubing, but it works OK, and temps are perfectly under control too. The block is absolutely awesome.


----------



## spikezone2004

Getting an XFX R9 290 reference card and planning out my cooling, but I just can't decide which route to go.

My preferred route would be to use my EK VGA Supremacy block with the Gelid VRM heatsink, but I'm worried about high VRM temps.

I looked at the Aquacomputer block, but I don't really want to spend that much when I just bought the card. I do want cool temperatures, though. What are y'all's thoughts?


----------



## Talon720

Well, I've tried PT1/GPU Tweak at 1.4V and over: black screens. PT1 and the stock BIOS with AB and the overvoltage .bat file still black screen at 1.4V. It's odd, because it will be stable at the same clocks at lower volts. The only variable I change is the voltage, to try to isolate the cause. Could overvolting cause the memory to black screen? I've even lowered the mem clocks below stock and the same thing happens. Maybe I'll switch the display card, or OC each card at its own voltage with the same clocks. Has anyone heard of vivi's 290X 0.95V rail mod, and what is it? It looks like it's for Lightning cards; just curious.


----------



## ZealotKi11er

Quote:


> Originally Posted by *spikezone2004*
> 
> Getting a XFX R9 290 reference card and planning out my cooling but I just can't decide on what route to go.
> 
> My desired route would be to use my ek vga supremacy block with the gelid vrm heatsink but I'm worried bout high vrm temps
> 
> Looked at the aqua computers block but don't really want to spend that much when I just bought the card but I want to have cool temperatures what are yalls thoughts?


How long do you plan to keep the card?

If at least 1 year, then get a full block. I think EK blocks are cheaper than Aqua.


----------



## Arizonian

Quote:


> Originally Posted by *Gandroid100*
> 
> I want to be a member this is the GPU-Z screenshot
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> the brand is Sapphire and it use the tri-x aftermarket cooler


Congrats - added









Welcome to OCN as well.









_If you can redo the GPU-Z validation shot with the 'validation tab' open and your OCN name showing in that proof link, it would be appreciated._

Quote:


> Originally Posted by *Willi*
> 
> I forgot to leave my validation here, please include me in the list XD
> 
> 1. Validation link: http://www.techpowerup.com/gpuz/3mqp5/
> 2. Gigabyte R9 290X (reference) - using MSI BIOS due to stability
> 3. Cooling: water - Aquacomputer full cover block.
> 
> I need to get a decent camera ASAP. had to redo some bends on the acrylic tubing, but it works ok, temps are perfectly under control too. The block is absolutely awesome.


Congrats - added


----------



## amptechnow

Sorry, never finished typing and don't know why I hit post... I feel really ******ed... at least my 3-year-old didn't get on here and type gibberish again...

So... would it be best in a CrossFire setup to run each card with its stock BIOS, or flash them both to the same BIOS since they are XFX and PowerColor? And which program do you suggest for overclocking? My XFX seems to like AB best on the stock BIOS, and modded ASUS GPU Tweak on the PT1 BIOS...


----------



## Arizonian

Quote:


> Originally Posted by *amptechnow*
> 
> i have some questions for the crossfire users...
> 
> so my second r9 290 will be here today! can't wait. i do not have a matching water block or back plate for it yet though. hoping they play nice together. the 290 i have now is very very picky at times. overclocks pretty well... for benches up to 1200 core and 1625 mem on elpida using pt1 bios and modded gpu tweaks. runs 10-15 degrees hotter then stock bios under load and vrm1 hits 75-80 at times. i game with it at 1115-1125 core and 1625 mem either with pt1 bios or stock xfx bios using afterburner with voltage and power limit maxed. stays cooler then pt1 and vrm1 only hits 40-47 degrees under load. my xfx is a reference design and was new when i got it and i added a heatkiller full water block and and the card i have coming today is new, a powercolor pcs with the 3 fans. hoping its not one of the defective ones.


Quote:


> Originally Posted by *amptechnow*
> 
> -snip-
> 
> so...would it be best in a crossfire setup to run each card with stock bios or flash them both to same bios since they are xfx and powercolor? and which program do you suggest for overclocking. my xfx seems to like AB best on stock bios and modded asus gpu tweaks on pt1 bios....


You mean these two together, right?

Just helping bring it together.


----------



## spikezone2004

Quote:


> Originally Posted by *ZealotKi11er*
> 
> How long you plan to keep the card?
> 
> If at least 1 year then get a full block. I think EK Blocks are cheaper then Aqua.


I definitely plan on keeping the card for a while, over a year (my last card, a 6850, I had for 4 years). They are cheaper, but only by a bit; the Aqua isn't much more expensive and seems to offer better VRM cooling. I'm just not sure, because none of the blocks seem to have water physically running over the VRMs of the card.


----------



## amptechnow

Yes... just got the card in, actually... PowerColor just sent me a PCS+ 290X instead of a 290!!!


----------



## keikei

^Win.


----------



## Arizonian

Quote:


> Originally Posted by *amptechnow*
> 
> yes...just got card in actually....power color just sent me a pcs+ 290X instead of 290!!!!!!


Score for you.









Don't forget to submit proof and join the roster. Would love pics of that one.


----------



## amptechnow

Just ran Valley with both cards on stock BIOS, with no overclocking, overvolting, or even power limit...

How does that score fare for 290s in CrossFire?

I will post pics of the cards in a few...


----------



## ZealotKi11er

Quote:


> Originally Posted by *spikezone2004*
> 
> I definitely plan on keeping the card for a while over a year (my last card i had for 4 years was a 6850). They are cheaper but only by a bit, the aqua isnt much more expensive and seems to support better vrm cooling but just not sure, cause none of the blocks seem to have water physically running over the vrm of the card


The EK block I have does run water over VRM1. The problem is the cheap thermal pads; I suggest you buy some aftermarket pads. Considering you used an HD 6850 for 4 years, I would definitely go full block.


----------



## amptechnow

Almost just had a heart attack. Opened the case to get some pics, and the fans on the PowerColor were not spinning at all. Put some load on and they turned on. Not using any OC programs at the moment. I have extremely good airflow in my case and my house stays around 21 degrees. Is this normal?


----------



## ozzy1925

Quote:


> Originally Posted by *Dasboogieman*
> 
> I can assure you that the AX1500i is more than sufficient. Even assuming each GPU draws a ridiculous 350W ,which is virtually impossible without severe risks of damaging the Tri X VRM assembly, remember, it's not only heat that kills VRMs, many reference GTX 570 owners can attest to that.
> You still have a massive 450W of 12V headroom for the CPU, RAM and other stuff, I seriously doubt you can get the CPU to draw more than 300W (again virtually impossible to attain without risks of killing the CPU with voltage) unless it's an AMD unit. The HDDs are on the 5V rail so its unlikely to contribute to the overall result.
> 
> The AX1500i is the best PSU on the market at the moment, it will easily handle 1500W of continuous power draw for many years. The sheer efficiency also means that it would virtually not get hot either.


Quote:


> Originally Posted by *Knight26*
> 
> 31A is the minimum recommended system amperage for a system with a R9 290x, 550w w/ a single +12v rail @ 31A. So the actual amperage needed per card will be somewhat lower than that. Of course, overclocking raises that number. You can find several calculators online that will give the estimated amps based on voltage and watts. More than likely your ax1500i will be more than sufficient for 3 290's.


Ty both, +rep. I was thinking the same: the AX1500i is more than enough for 3x 290.


----------



## amptechnow

here are the pics.

top card: xfx r9 290 reference card with heatkiller xgpu3 waterblock and backplate

bottom card: powercolor pcs+ r9 290x with stock cooling


----------



## Red1776

Quote:


> Originally Posted by *amptechnow*
> 
> almost just hard a heart attack. opened case to get some pics and fans on powercolor were not spinning at all. put some load on and they turned on. not using any oc programs at the moment. i have extremely good airflow in my case and my house stays around 21 degrees. is this normal?


That's ZeroCore. The GPU ramps down to about 2W, low enough for the fans to stop and the card to dissipate heat passively. Three of my four R9 290Xs (before waterblocking) would spin all the way down until load was applied.

ZeroCore freaks a lot of people out at first.


----------



## spikezone2004

Quote:


> Originally Posted by *ZealotKi11er*
> 
> EK i have does run water on the VRM1. The problem is cheap thermal pads. I suggest you buy some after-market thermal pads. Considering you used HD 6850 for 4 year i would definitely goo full block.


OK, this may sound completely noob, but I can't find a diagram anywhere: which parts are VRM1 and VRM2 on the GPU?

I tried finding a picture of the card without a block, with each part labeled. It's been a long time since I've upgraded my GPU, but I want to make sure I cool it right rather than have anything overheat.


----------



## amptechnow

OK, thanks. Thought I was going to have to RMA the card and would lose the free 290X upgrade I got...


----------



## Roboyto

Quote:


> Originally Posted by *spikezone2004*
> 
> ok this may sound completely noob, but i cant find a diagram anywhere which part is the vrm1 and vrm2 on the gpu?
> 
> Tried finding a picture of the card without a block on naming each part. Been along time since iv upgraded my gpu, but i want to make sure i cool it right than have anything overheat,


See here for my "290X Need to Know" post: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781



VRM1 in yellow

VRM2 in Orange


----------



## Sgt Bilko

Quote:


> Originally Posted by *spikezone2004*
> 
> ok this may sound completely noob, but i cant find a diagram anywhere which part is the vrm1 and vrm2 on the gpu?
> 
> Tried finding a picture of the card without a block on naming each part. Been along time since iv upgraded my gpu, but i want to make sure i cool it right than have anything overheat,


Quote:


> Originally Posted by *Roboyto*
> 
> [/LIST]
> 
> *Keep an eye on VRM1 temps, <90C is where you want to be for VRM1*
> VRM1 controls core voltage and is the *long vertical strip* closest to the PCIe power connectors.
> 
> *VRM Locations*
> 
> 
> 
> *VRM2 controls RAM* voltage and is nearest the video output connections.
> You won't likely need to be as concerned with keeping them cool since they don't run nearly as hot.
> From my experience with a Kraken G10 on a 290, all you need is a good heatsink(s) on VRM2 too keep it cool; no airflow is really even necessary.
> 
> Kraken & Gelid Heatsink Cooling - http://www.overclock.net/t/1478544/the-devil-inside-a-gaming-htpc-for-my-wife-3770k-corsair-250d-powercolor-r9-270x-devil-edition/20#post_22214255


----------



## Sgt Bilko

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> See here for my "290X Need to Know" post: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781
> 
> 
> 
> VRM1 in yellow
> VRM2 in Orange


You beat me to quoting your post


----------



## tsm106

Quote:


> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *amptechnow*
> 
> almost just hard a heart attack. opened case to get some pics and fans on powercolor were not spinning at all. put some load on and they turned on. not using any oc programs at the moment. i have extremely good airflow in my case and my house stays around 21 degrees. is this normal?
> 
> 
> 
> That's zero-core. The GPU ramps down to about 2W, low enough for the fans to stop and dissipate passively. Three of my four R290X's ( before WB'ing) would spin all the way down until load was applied.
> Zero-core freaks a lot people out at first.
Click to expand...

Disable ULPS if ya don't want your cards sleeping on the job.


----------



## spikezone2004

Quote:


> Originally Posted by *Roboyto*
> 
> See here for my "290X Need to Know" post: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781
> 
> 
> 
> VRM1 in yellow
> VRM2 in Orange


Well, that makes it easier, thanks!

I have a good universal GPU block that I'd like to use with the Gelid heatsink, because it's only $10 for the heatsink compared to $130+ for a full block.

I just don't know if the Gelid heatsink will be enough. I'm not going to be OCing like crazy, but I will a little, and my room's ambient temp is rather high sometimes.


----------



## kizwan

Quote:


> Originally Posted by *amptechnow*
> 
> just ran valley with both cards stock bios and no overclocking or volting or even power limit....
> 
> 
> 
> how does that score fair for 290's in crossfire?
> 
> i will post pics of cards in a few...


Yeah, the score looks alright for Crossfire at stock clocks (947/1250).


----------



## Roboyto

Quote:


> Originally Posted by *spikezone2004*
> 
> well that makes it easier thanks!
> 
> I have a good universal gpu block that id like to use with the gelid heatsink cause its only 10$ for the heatsink compared to 130+ for a full block.
> 
> i just dont know if the gelid heatsink will be enough, im not going to be ocing like crazy but i will a little and my rooms ambient temp is rather high sometimes


http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures/80_20#post_22599798

Gelid kit actually works quite well if you upgrade the thermal pad to Fujipoly Ultra Extreme and have good airflow on the heatsink. See the link above that has my configuration with a Kraken cooled 290. With the Gelid kit, Fuji UE, and a Zalman 92mm fan, I have VRM1 temps under control. After 1 hour loop of FFXIV benchmark, GPU stock settings, VRM1 peaked at 49C; and this is in a Corsair 250D with no additional airflow over the card.


----------



## spikezone2004

Quote:


> Originally Posted by *Roboyto*
> 
> http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures/80_20#post_22599798
> 
> Gelid kit actually works quite well if you upgrade the thermal pad to Fujipoly Ultra Extreme and have good airflow on the heatsink. See the link above that has my configuration with a Kraken cooled 290. With the Gelid kit, Fuji UE, and a Zalman 92mm fan, I have VRM1 temps under control. After 1 hour loop of FFXIV benchmark, GPU stock settings, VRM1 peaked at 49C; and this is in a Corsair 250D with no additional airflow over the card.


How did you mount the Zalman fan to the card? Saw the pic with all the heatsinks on; curious how you mounted the fan.


----------



## Roboyto

Quote:


> Originally Posted by *spikezone2004*
> 
> how did u mount zalman fan to card? saw the pic with all the heatsinks on, curious as to how you mounted the fan


NZXT Kraken G10 mounting kit for AIO water coolers.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Roboyto*
> 
> http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures/80_20#post_22599798
> 
> Gelid kit actually works quite well if you upgrade the thermal pad to Fujipoly Ultra Extreme and have good airflow on the heatsink. See the link above that has my configuration with a Kraken cooled 290. With the Gelid kit, Fuji UE, and a Zalman 92mm fan, I have VRM1 temps under control. After 1 hour loop of FFXIV benchmark, GPU stock settings, VRM1 peaked at 49C; and this is in a Corsair 250D with no additional airflow over the card.


Don't be fooled by those temps. You have to test games that fully use the GPU. I get 55C under water, so on air it's impossible to get 49C. Test some BF4 and watch VRM1 go to 60C. If you look at GPU-Z you'll see the GPU usage and GPU clocks go much lower than stock.


----------



## BradleyW

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Dont be fooled by those temps. You have to test games that fully use the GPU. I get 55C under water so on air it impossible to get 49C. Test some BF4 and see the VRM1 go 60C. If you look at GPU-Z u see the GPU usage and GPU clocks go much lower then stock,


I get around 50C in many games, but games using SSAA such as Sniper Elite III often put both my GPUs at exactly 55C! 49C on air would require a very cold ambient, or an idle system at moderate ambient.


----------



## Arizonian

Quote:


> Originally Posted by *amptechnow*
> 
> here are the pics.
> 
> top card: xfx r9 290 reference card with heatkiller xgpu3 waterblock and backplate
> 
> bottom card: powercolor pcs+ r9 290x with stock cooling
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added
















Very nice work.


----------



## Roboyto

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Don't be fooled by those temps. You have to test games that fully load the GPU. I get 55C under water, so on air it's impossible to get 49C. Test some BF4 and watch VRM1 go past 60C. If you look at GPU-Z you'll see the GPU usage and GPU clocks go much lower than stock.


I don't think it's impossible at all, since that temp is at stock clocks/volts and only driving 1080p. My VRM1 is also around 55C under water, with +87mV and a 1200 core clock, while driving 5760x1080.

That 49C is stock clocks, and FFXIV benchmark loads the card the same as playing the game, which is quite intense. GPU-Z logged 94% max load, this is because this card is only driving 1080 resolution.

When I first installed the Kraken, the load temperature on VRM1 for a single run of the benchmark was 56C; this number would undoubtedly have been higher if I had let it run for an extended period. Now the temperature is down 7C even after a 1-hour loop, and the ambient temp in my condo is 5C higher than when I first installed everything.

I will record again when my wife sits down for an extended Tomb Raider session to see what the temperatures are now after the pad upgrade. Before I upgraded the pads, after 3 hours of Tomb Raider, VRM1 peaked at 71C.


----------



## bluedevil

Talk about making me mad... this is what I get from VisionTek about my RMA.

Quote:


> Originally Posted by *VisionTek*
> 
> We are currently out of stock of this unit. Once we get more in stock we will be shipping out the unit.
> 
> Thank you


And my response.
Quote:


> Originally Posted by *Me*
> 
> Since you are out of stock of the R9 290, an upgrade is feasible. If you could ship out a R9 290X in place of the R9 290, I would be more than happy.
> 
> Thanks


You'd think the upgrade would be automatic?

And I got this just as I was typing:
Quote:


> Originally Posted by *VisionTek*
> We will be looking into this option next week as we will have information on when our next shipment is coming in then. Once we have this, we will contact you on the options available and let you decide which route you would like to go. It should be noted that we are also out of stock of the R9 290x. We do have a shipment placed for both these units.
> 
> Thank you


So I guess my RMA is in limbo as I wait in anticipation for what I will get....Maybe I should ask for a CryoVenom? or a R9 295X2?







I can't help it that they are out of stock!


----------



## keikei

^I like how they don't give a date for the restock, or a secondary option of another product of equal or slightly better value. Good customer service there.


----------



## bluedevil

Quote:


> Originally Posted by *keikei*
> 
> ^I like how they dont give a date for the restock or a secondary option for another product of equal or slightly better value. Good customer service there.


Oh don't worry I will be emailing them daily.


----------



## devilhead

And talking about VRM1/VRM2 with the EK block: you always need to apply some non-electrically-conductive thermal paste, more so for VRM2 (I always use MX-4). It helps; even with the stock EK thermal pads + MX-4 I get pretty decent temperatures.
Now I did some tests with Fujipoly 11 W/mK, and at 1.47v VRM2 temperatures reached around 70C; when I applied MX-4 on VRM2, it fell to 60C.
Maybe with 17 W/mK you don't need to apply it.


----------



## Roboyto

Quote:


> Originally Posted by *devilhead*
> 
> And talking about VRM1/VRM2 with the EK block: you always need to apply some non-electrically-conductive thermal paste, more so for VRM2 (I always use MX-4). It helps; even with the stock EK thermal pads + MX-4 I get pretty decent temperatures.
> Now I did some tests with Fujipoly 11 W/mK, and at 1.47v VRM2 temperatures reached around 70C; when I applied MX-4 on VRM2, it fell to 60C.
> Maybe with 17 W/mK you don't need to apply it.


Definitely don't need thermal paste with the Ultra Extreme 17 w/mk pads.

Did you mean VRM1 fell to 60C? If your VRM2 is at 60C under water then something is probably wrong; mine typically doesn't surpass 50C even with RAM clocked as high as 1700 MHz.
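For intuition on what those W/mK ratings buy you, here is a one-dimensional conduction sketch. The pad thickness, contact area, and VRM power below are illustrative assumptions, not measurements from these cards:

```python
# 1-D conduction through a thermal pad: delta_T = P * t / (k * A).
# Thickness, area, and power are illustrative guesses only.

def pad_delta_t(power_w, thickness_m, k_w_per_mk, area_m2):
    """Temperature rise across the pad for a given heat flow."""
    return power_w * thickness_m / (k_w_per_mk * area_m2)

P = 30.0    # watts dissipated through the VRM pad (assumed)
t = 0.0015  # 1.5 mm pad thickness (assumed)
A = 2e-4    # roughly a 20 x 10 mm contact patch (assumed)

print(round(pad_delta_t(P, t, 11.0, A), 1))  # ~20.5 C drop across an 11 W/mK pad
print(round(pad_delta_t(P, t, 17.0, A), 1))  # ~13.2 C drop across a 17 W/mK pad
```

Under these assumptions the 17 W/mK pad shaves several degrees off the VRM-to-block interface on its own, which lines up with paste no longer being worth the hassle.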


----------



## devilhead

Quote:


> Originally Posted by *Roboyto*
> 
> Definitely don't need thermal paste with the Ultra Extreme 17 w/mk pads.
> 
> Did you mean VRM1 fell to 60C? If your VRM2 is at 60C under water then something is probably wrong; mine typically doesn't surpass 50C even with RAM clocked as high as 1700 MHz.


hehe, did you try applying 1.5v after vdrop (almost 1.6v in Asus GPUTweak) to your card? Or just +200mV? With +200mV on the stock BIOS, memory at 1700 never reaches more than 45c







talking about vrm2. vrm1 never goes more than 35C


----------



## Jflisk

Quote:


> Originally Posted by *Roboyto*
> 
> Definitely don't need thermal paste with the Ultra Extreme 17 w/mk pads.
> 
> Did you mean VRM1 fell to 60C? If your VRM2 is at 60C under water then something is probably wrong; mine typically doesn't surpass 50C even with RAM clocked as high as 1700 MHz.


Unless you're mining on the card (I forget which VRM is which), under water you should never go over 58C max, and that's mining. Under normal games 55C or under is usual. Thanks


----------



## Roboyto

Quote:


> Originally Posted by *devilhead*
> 
> hehe, did you try applying 1.5v after vdrop (almost 1.6v in Asus GPUTweak) to your card? Or just +200mV? With +200mV on the stock BIOS, memory at 1700 never reaches more than 45c
> 
> 
> 
> 
> 
> 
> 
> talking about vrm2. vrm1 never goes more than 35C


No, I haven't.

VRM1 never goes more than 35C? With what cooling?







Please tell me how your VRM1 temperatures are lower than VRM2?

My VRM1 idles at 32C, so I find it hard to believe yours peaks at 35C unless you are running something more extravagant than a water block.


----------



## devilhead

Quote:


> Originally Posted by *Roboyto*
> 
> No, I haven't.
> 
> VRM1 never goes more than 35C? With what cooling?
> 
> 
> 
> 
> 
> 
> 
> Please tell me how your VRM1 temperatures are lower than VRM2?
> 
> My VRM1 idles at 32C, so I find it hard to believe yours peaks at 35C unless you are running something more extravagant than a water block.


Oh sorry, I talked about vrm1 all the time







)) vrm2 never goes more than 35c







when I'm benching, I use AC blowing air through the rad







that helps to handle 1.5v


----------



## Jflisk

Quote:


> Originally Posted by *devilhead*
> 
> hehe, did you try applying 1.5v after vdrop (almost 1.6v in Asus GPUTweak) to your card? Or just +200mV? With +200mV on the stock BIOS, memory at 1700 never reaches more than 45c
> 
> 
> 
> 
> 
> 
> 
> talking about vrm2. vrm1 never goes more than 35C


How many radiators and pumps are you running? Mine idles at VRM1 36C and VRM2 39C on both my cards.


----------



## Jflisk

Quote:


> Originally Posted by *devilhead*
> 
> Oh sorry, I talked about vrm1 all the time
> 
> 
> 
> 
> 
> 
> 
> )) vrm2 never goes more than 35c
> 
> 
> 
> 
> 
> 
> 
> when I'm benching, I use AC blowing air through the rad
> 
> 
> 
> 
> 
> 
> 
> that helps to handle 1.5v


You don't have a condensation problem running AC across the rads?


----------



## Roboyto

Quote:


> Originally Posted by *devilhead*
> 
> Oh sorry, I talked about vrm1 all the time
> 
> 
> 
> 
> 
> 
> 
> )) vrm2 never goes more than 35c
> 
> 
> 
> 
> 
> 
> 
> when I'm benching, I use AC blowing air through the rad
> 
> 
> 
> 
> 
> 
> 
> that helps to handle 1.5v


Details, my friend, details. This makes much more sense now... but you can't compare your temps when you're pumping cold A/C through your radiators, since that definitely isn't the norm.


----------



## devilhead

My funny card testing with portable AC unit







))


----------



## Jflisk

Quote:


> Originally Posted by *devilhead*
> 
> 
> My funny card testing with portable AC unit
> 
> 
> 
> 
> 
> 
> 
> ))



That's... WOW. Lost for words. Nice


----------



## joeh4384

Lol, that's one way to cool the inferno known as the Hawaii GPU.


----------



## Brandyn

I got the Asus 290X OC version and I am getting constant crashing. Even when watching YouTube it will freeze with a sound loop and restart. Has anyone solved this problem yet? I can't find a fix anywhere.


----------



## heroxoot

Quote:


> Originally Posted by *Brandyn*
> 
> I got the Asus 290X OC version and I am getting constant crashing. Even when watching YouTube it will freeze with a sound loop and restart. Has anyone solved this problem yet? I can't find a fix anywhere.


What driver? I'd disable hardware acceleration both in Flash and the browser. Also disable protected mode in Flash. If you are on 14.7, I have heard of BSOD issues in Flash on guru.


----------



## Brandyn

Quote:


> Originally Posted by *heroxoot*
> 
> What driver? I'd disable hardware acceleration both in Flash and the browser. Also disable protected mode in Flash. If you are on 14.7, I have heard of BSOD issues in Flash on guru.

14.4 is my driver, and I disabled hardware accel, but I don't know how to disable protected mode in Flash. It also sometimes crashes when logging into my user account.


----------



## amptechnow

Does anyone use supersampling? I was thinking that with CrossFire I would be able to run it, but I'm still seeing huge performance hits and lag. If I can run a game at 120-144+ fps with multisampling, shouldn't I be able to get 60 fps with SSAA? Does it make things noticeably better, or is it just not needed?
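For a rough sense of why supersampling hits so hard: SSAA shades every pixel of an internally enlarged frame, so a purely fill-rate-bound frame rate falls roughly in proportion to the pixel multiplier. A naive sketch (illustrative only; real games also hit vRAM and CPU limits):

```python
# Back-of-the-envelope cost of supersampling (SSAA): the GPU shades
# every pixel of the internally upscaled frame, so a fill-rate-bound
# frame rate drops roughly with the pixel multiplier.

def ssaa_pixels(width, height, factor):
    """Internal pixel count per frame at `factor`x supersampling."""
    return width * height * factor

def estimated_fps(base_fps, factor):
    """Naive fill-rate-bound estimate: fps scales as 1 / factor."""
    return base_fps / factor

base = 144.0                       # fps with plain multisampling at 1080p
print(ssaa_pixels(1920, 1080, 4))  # 8294400 pixels per frame at 4x SSAA
print(estimated_fps(base, 4))      # 36.0 fps, well short of a 60 fps target
```

So even a 144 fps multisampled baseline only leaves about 36 fps at 4x SSAA by pixel count alone, before any vRAM pressure or CrossFire scaling losses.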


----------



## ZealotKi11er

Quote:


> Originally Posted by *amptechnow*
> 
> Does anyone use supersampling? I was thinking that with CrossFire I would be able to run it, but I'm still seeing huge performance hits and lag. If I can run a game at 120-144+ fps with multisampling, shouldn't I be able to get 60 fps with SSAA? Does it make things noticeably better, or is it just not needed?


It's probably the vRAM.


----------



## tsm106

I just received a Lightning today. Finally was able to plumb it in; thank the lord for QDCs. It's a seriously big-ass card lol, and that row of lights is annoying too. Haven't done much; ran 3DMark 11 at 1100/1400 stock volts, no problem. It apparently was assembled with MX-4 and EK pads. VRM1 never exceeded 41c. Not really sure why I have a Lightning lol.


----------



## PuffinMyLye

I just picked up my first AMD card (well, first for gaming purposes; I have an HD6450 laying around just for triple-monitor support) and I'm having all kinds of problems with it. No matter what I do, I can't get the card to work properly in Windows without the following error message coming up at login:



I've installed the 14.4 and 14.7 drivers multiple times (always uninstalling with Display Driver Uninstaller in safe mode). I've flipped the switch on the card to use the other BIOS. I've flashed the latest BIOS found on the Asus website, but I have no experience flashing video card BIOSes, so I don't know if there is anything else to do beyond that.

The exact same system (sig rig below) works great with my HD6450 on the exact same 14.4 drivers, but the moment I put my 290X in, it will not work properly in Windows.

Have I exhausted all my options and it's just a bad card at this point?


----------



## ZealotKi11er

Quote:


> Originally Posted by *tsm106*
> 
> I just received a Lightning today. Finally was able to plumb it in; thank the lord for QDCs. It's a seriously big-ass card lol, and that row of lights is annoying too. Haven't done much; ran 3DMark 11 at 1100/1400 stock volts, no problem. It apparently was assembled with MX-4 and EK pads. VRM1 never exceeded 41c. Not really sure why I have a Lightning lol.


VRM1 runs cooler because the card has better VRMs?
Quote:


> Originally Posted by *PuffinMyLye*
> 
> I just picked up my first AMD card (well first for gaming purposes I have an HD6450 laying around just for triple monitor support) and I'm having all kinds of problems with it. No matter what I do I can't get the card to work properly in Windows without the following error message coming up upon login:
> 
> 
> 
> I've installed the 14.4 and 14.7 drivers multiple times (always uninstalling with Display Driver Uninstaller in safe mode). I've flipped the switch on the card to use the other BIOS. I've flashed the latest BIOS found on the Asus website, but I have no experience flashing video card BIOSes, so I don't know if there is anything else to do beyond that.
> 
> The exact same system (sig rig below) works great with my HD6450 on the exact same 14.4 drivers, but the moment I put my 290X in, it will not work properly in Windows.
> 
> Have I exhausted all my options and it's just a bad card at this point?


Leave the card on the stock BIOS. If you have not done a fresh install, do that. I had that problem once; nothing I did helped, and Windows would not see the card until a fresh install of Windows 7.


----------



## tsm106

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I just received a Lightning today. Finally was able to plumb it in; thank the lord for QDCs. It's a seriously big-ass card lol, and that row of lights is annoying too. Haven't done much; ran 3DMark 11 at 1100/1400 stock volts, no problem. It apparently was assembled with MX-4 and EK pads. VRM1 never exceeded 41c. Not really sure why I have a Lightning lol.
> 
> 
> 
> VRM1 runs cooler because the card has better VRMs?
Click to expand...

Hmm, maybe... maybe not. It's not like the reference VRMs are knuckle-draggers; they're top shelf to start. I will open the block up later to inspect it; it's idling a little high for my taste at 38c.


----------



## ZealotKi11er

Quote:


> Originally Posted by *PuffinMyLye*
> 
> I just picked up my first AMD card (well first for gaming purposes I have an HD6450 laying around just for triple monitor support) and I'm having all kinds of problems with it. No matter what I do I can't get the card to work properly in Windows without the following error message coming up upon login:
> 
> 
> 
> I've installed the 14.4 and 14.7 drivers multiple times (always uninstalling with Display Driver Uninstaller in safe mode). I've flipped the switch on the card to use the other BIOS. I've flashed the latest BIOS found on the Asus website, but I have no experience flashing video card BIOSes, so I don't know if there is anything else to do beyond that.
> 
> The exact same system (sig rig below) works great with my HD6450 on the exact same 14.4 drivers, but the moment I put my 290X in, it will not work properly in Windows.
> 
> Have I exhausted all my options and it's just a bad card at this point?


Quote:


> Originally Posted by *tsm106*
> 
> Hmm, maybe... maybe not. It's not like the reference VRMs are knuckle-draggers; they're top shelf to start. I will open the block up later to inspect it; it's idling a little high for my taste at 38c.


I think VRMs built to handle a higher load end up running cooler.


----------



## PuffinMyLye

Quote:


> Originally Posted by *ZealotKi11er*
> 
> VRM1 runs cooler because the card has better VRMs?
> Leave the card on the stock BIOS. If you have not done a fresh install, do that. I had that problem once; nothing I did helped, and Windows would not see the card until a fresh install of Windows 7.


Unfortunately I've re-installed twice already and it didn't fix the issue.


----------



## tsm106

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Hmm, maybe... maybe not. It's not like the reference VRMs are knuckle-draggers; they're top shelf to start. I will open the block up later to inspect it; it's idling a little high for my taste at 38c.
> 
> 
> 
> I think VRMs built to handle a higher load end up running cooler.
Click to expand...

No, having more phases reduces the load on each phase, so they run cooler. But IMO it's not very perceptible. It could simply be that since this PCB is freaking huge, the VRM channel is that much bigger. I dunno, but it'll be cracked open tomorrow if I can find time.


----------



## the9quad

Quote:


> Originally Posted by *PuffinMyLye*
> 
> I just picked up my first AMD card (well first for gaming purposes I have an HD6450 laying around just for triple monitor support) and I'm having all kinds of problems with it. No matter what I do I can't get the card to work properly in Windows without the following error message coming up upon login:
> 
> 
> 
> I've installed the 14.4 and 14.7 drivers multiple times (always uninstalling with Display Driver Uninstaller in safe mode). I've flipped the switch on the card to use the other BIOS. I've flashed the latest BIOS found on the Asus website, but I have no experience flashing video card BIOSes, so I don't know if there is anything else to do beyond that.
> 
> The exact same system (sig rig below) works great with my HD6450 on the exact same 14.4 drivers, but the moment I put my 290X in, it will not work properly in Windows.
> 
> Have I exhausted all my options and it's just a bad card at this point?


I know it sounds stupid, but make sure your BIOS settings for PCIe are correct and make sure your card is seated all the way.


----------



## PuffinMyLye

Quote:


> Originally Posted by *the9quad*
> 
> I know it sounds stupid, but make sure your BIOS settings for PCIe are correct and make sure your card is seated all the way.


I've re-seated it at least 4-5 times. BIOS is set to PCIe graphics. I mean, I can get into Windows fine with the card, so the computer is detecting it fine. But it's like Windows (or AMD's drivers) detects some issue with it and therefore won't load the drivers properly.


----------



## amptechnow

Did you try completely removing the card, booting up the system, then shutting down and re-installing the card? I had a similar issue when flashing different BIOSes; Windows and my mobo BIOS got all screwy. Maybe even take the card out, clear your CMOS, start up, shut down, and then put the card back in.


----------



## tsm106

I miss my old tri array.


----------



## PuffinMyLye

Quote:


> Originally Posted by *amptechnow*
> 
> Did you try completely removing the card, booting up the system, then shutting down and re-installing the card? I had a similar issue when flashing different BIOSes; Windows and my mobo BIOS got all screwy. Maybe even take the card out, clear your CMOS, start up, shut down, and then put the card back in.


I just tried that, with reinstalling Windows again (it only takes me like 8 minutes, that's why I keep doing it), and still no good. I also noticed while flashing the BIOS that it takes a very long time, well over a minute, as if there's a problem with the BIOS chip. As much as I want to get this card working, I think I'm resigned to the fact that it's the card.


----------



## sugarhell

Quote:


> Originally Posted by *tsm106*
> 
> I miss my old tri array.


show off


----------



## spikezone2004

I feel like I'm back to square one on what I want to do for cooling my card. I wanted to install the block while I upgraded my loop with my new rad and pump, but I don't want to spend another 150+ on a block after everything else.

Sucks how there are no used blocks for sale anywhere, not even on eBay. Guess I might use the stock cooler for a bit until I decide; I just hate having to redo my loop again when I decide to get a full block.


----------



## Klocek001

Do you think the launch of the R9 3xx series will have a noticeable impact on the prices of 290s? I'm thinking of waiting till they drop to get my second 290... btw, has anyone using two 290s noticed any improvements in AC IV: BF as far as CF scaling? I heard the 290 has the best scaling of all AMD cards so far. Is that true?


----------



## melodystyle2003

@Arizonian fill me in pls:

http://www.techpowerup.com/gpuz/k3h9s/
Sapphire r9-290, stock reference cooling


----------



## ZealotKi11er

Quote:


> Originally Posted by *Klocek001*
> 
> Do you think the launch of the R9 3xx series will have a noticeable impact on the prices of 290s? I'm thinking of waiting till they drop to get my second 290... btw, has anyone using two 290s noticed any improvements in AC IV: BF as far as CF scaling? I heard the 290 has the best scaling of all AMD cards so far. Is that true?


I don't think so. 290 prices are already pretty low.


----------



## heroxoot

Quote:


> Originally Posted by *Klocek001*
> 
> Do you think the launch of the R9 3xx series will have a noticeable impact on the prices of 290s? I'm thinking of waiting till they drop to get my second 290... btw, has anyone using two 290s noticed any improvements in AC IV: BF as far as CF scaling? I heard the 290 has the best scaling of all AMD cards so far. Is that true?


I would not expect them to go much lower. You can already find them around 300? That's pretty low for a currently high-end card. Even if not new, miners' cards are good enough to buy and then RMA for a proper one and save some cash. A lot of companies give refurbs or new cards as replacements. MSI gave me a new 290X for my replacement.


----------



## The Storm

Quote:


> Originally Posted by *tsm106*
> 
> I miss my old tri array.


I remember when you sold those blocks. I tried to buy 2 of them, but you only had 1 left by the time I saw the post.


----------



## Arizonian

Quote:


> Originally Posted by *melodystyle2003*
> 
> @Arizonian fill me in pls:
> 
> http://www.techpowerup.com/gpuz/k3h9s/
> Sapphire r9-290, stock reference cooling
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## fateswarm

Why would you not expect 290s to be lower when 390s are out? But it may be only ~$50 lower.


----------



## Klocek001

Hey, I just found out that my mobo won't support PCI-E 3.0 8x/8x with a 2nd-gen Core i5; I'll only get PCI-E 2.0 8x/8x. Is that a big deal? Is it going to bottleneck two 290s if I get the second one?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Klocek001*
> 
> Hey, I just found out that my mobo won't support PCI-E 3.0 8x/8x with a 2nd-gen Core i5; I'll only get PCI-E 2.0 8x/8x. Is that a big deal? Is it going to bottleneck two 290s if I get the second one?


Not 100% sure, but it might. You can always test it and compare with others' results. If you are playing at 1080p, don't worry about it.
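For reference, the raw one-direction link bandwidths being compared work out like this, using the published per-lane rates and encoding overheads:

```python
# Usable PCIe link bandwidth: per-lane transfer rate (GT/s) times
# encoding efficiency times lane count, divided by 8 bits per byte.
# PCIe 2.0 uses 8b/10b encoding; PCIe 3.0 uses 128b/130b.

def pcie_gbs(gt_per_s, enc_eff, lanes):
    """Approximate one-direction bandwidth in GB/s."""
    return gt_per_s * enc_eff * lanes / 8

print(round(pcie_gbs(5.0, 8 / 10, 8), 2))     # PCIe 2.0 x8: 4.0 GB/s
print(round(pcie_gbs(8.0, 128 / 130, 8), 2))  # PCIe 3.0 x8: ~7.88 GB/s
```

So PCIe 2.0 x8 offers about half the bandwidth of 3.0 x8, which is why the difference only starts to matter at very high resolutions or heavy CrossFire traffic.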


----------



## Gabkicks

one of my sapphire r9 290's stopped working so i RMA'd it.







Hopefully it gets back soon. Not sure if I will stay R9 290 CrossFire, switch to GTX 780 SLI, or sell the GTX 780 and run a single R9 290... dunnoooo


----------



## BradleyW

What is the average voltage required for 1200MHz on the GPU core clock with an AMD 290X?
I've currently tried 1100MHz and it did not require any voltage adjustment.


----------



## ZealotKi11er

Quote:


> Originally Posted by *BradleyW*
> 
> What is the average voltage required for 1200MHz on the GPU core clock with an AMD 290X?
> I've currently tried 1100MHz and it did not require any voltage adjustment.


For me, one card needs +100, the other needs +125. They hit the same max voltage though, so the one with +125 starts at a lower voltage. ~1.3v


----------



## BradleyW

Quote:


> Originally Posted by *ZealotKi11er*
> 
> For me, one card needs +100, the other needs +125. They hit the same max voltage though, so the one with +125 starts at a lower voltage. ~1.3v


I see,
What does AUX voltage do?
Thanks for the information so far. +1


----------



## ZealotKi11er

Quote:


> Originally Posted by *BradleyW*
> 
> I see,
> What does AUX voltage do?
> Thanks for the information so far. +1


Never tried it. Also, for overclocking I used ASUS Tweaker. I don't know how to use MSI AB with more than +100 mV. At the end of the day I don't need to OC, since most games I play are CPU-bound. Overclocking GPUs is just for benchmarking.


----------



## BradleyW

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Never tried it. Also, for overclocking I used ASUS Tweaker. I don't know how to use MSI AB with more than +100 mV. At the end of the day I don't need to OC, since most games I play are CPU-bound. Overclocking GPUs is just for benchmarking.


Yeah, most games do tend to drop the fps because of CPU limitation / poor programming.


----------



## Descadent

Not a lot of reviews out there on the card I have, the Sapphire 290X Vapor-X, but Tek Syndicate just posted an overview.


----------



## heroxoot

Quote:


> Originally Posted by *BradleyW*
> 
> What is the average voltage required for 1200MHz on the GPU core clock with an AMD 290X?
> I've currently tried 1100MHz and it did not require any voltage adjustment.


That's kind of lucky. I couldn't do 1100/1250 on stock. I think 1080 was the most I could.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Descadent*
> 
> Not a lot of reviews out there on the card I have, the Sapphire 290X Vapor-X, but Tek Syndicate just posted an overview.


Thats a crazy card.


----------



## pnoozi

^ So it's pretty much a must that you combine that card with an SSD.


----------



## ebhsimon

Quote:


> Originally Posted by *heroxoot*
> 
> That's kind of lucky. I couldn't do 1100/1250 on stock. I think 1080 was the most I could.


Exactly the same as me, except I have +25mV stock since I have a Vapor-X model.


----------



## ebhsimon

Quote:


> Originally Posted by *Descadent*
> 
> Not a lot of reviews out there on the card I have, the Sapphire 290X Vapor-X, but Tek Syndicate just posted an overview.


230W idle and 430W load? Doesn't that seem a little high? Even overclocked, GPU-Z shows me like 30-40W at idle and 370W under load on my Vapor-X 290, and I only get like 230-250W max at stock. Does the X variant really make it pull 200W more?


----------



## Vici0us

Does Cooler Master Seidon 120M fit on R9 290? Or only 120XL?


----------



## gobblebox

Quote:


> Originally Posted by *ebhsimon*
> 
> Exactly the same as me, except I have +25mV stock since I have a Vapor-X model.


Aww, I was hoping you hadn't deleted the comment about a lack of good airflow in the Arc Midi R2, since it is arguably the _best_ mid-tower for a high-airflow design... 420 rad top mount, 140 rear exhaust, 240 front rad, 240 bottom rad... (not saying I'd put rads in every spot, but you can get some excellent airflow with 140 fans all over the place). Where is the lack of airflow?


----------



## ebhsimon

Quote:


> Originally Posted by *gobblebox*
> 
> Aww, I was hoping you hadn't deleted the comment about a lack of good airflow in the Arc Midi R2, since it is arguably the _best_ mid-tower for a high-airflow design... 420 rad top mount, 140 rear exhaust, 240 front rad, 240 bottom rad... (not saying I'd put rads in every spot, but you can get some excellent airflow with 140 fans all over the place). Where is the lack of airflow?


That was a comment to someone else earlier. What I meant was that in stock form it has bad airflow; it really does, especially with its foam dust filters. The fans it comes with aren't well suited to the thick foam filters.


----------



## melodystyle2003

I have a few questions; I'll try to be as clear and laconic as possible. The GPU is an R9 290, reference cooler, Hynix RAM chips.

*Is there any way to set a static core voltage, i.e. 1V, 1.15V, or 1.25V, without LLC, and if possible only for 3D clocks?*
I play at 1050/1250MHz with the fan set to 54%, and on the reference cooler it reaches a stable 88C while gaming wearing headphones.








I see voltage fluctuations between 1.12-1.2V but cannot lower it. If I set the core voltage offset to minus something, even the idle voltage is lowered, and if the vcore reduction is big enough it eventually crashes at idle too. I am on Catalyst 14.4 / Win 8.1.

*For aftermarket cooling I tend toward an AIO solution with VRM/VRAM heatsinks plus a fan blowing air onto them. Apart from lowering dBA, will it help overclocking potential?*
So far with the fan @ 100% I can run it at 1150/1500MHz artifact-free, and 1200/1500MHz with some artifacts, with the temps (core, VRM1 & 2) staying under 90C in a 3DMark Fire Strike run. Should I expect 1200/1500MHz to stabilize for 24/7 use with aftermarket cooling?

*Last but not least, is there any BIOS that is proven to work better than others? I am on 015.041.000.000.000000 (113-E285P47-U001).*
TIA


----------



## spikezone2004

I think, after looking at all the tests that have been done on full blocks for the 290, the Aquacomputer seems to be the best. I can't figure out why its backplate makes such a positive difference on the VRM temps while the XSPC backplate negatively affects them.

http://www.xtremerigs.net/2014/05/03/r9-290x-waterblock-testing-results-so-far/

Problem is, getting a backplate for the Aquacomputer block seems to be near impossible; it's not in stock anywhere.


----------



## BradleyW

Quote:


> Originally Posted by *ebhsimon*
> 
> 230W idle and 430W load? Doesn't that seem a little high? Even overclocked, GPU-Z shows me like 30-40W at idle and 370W under load on my Vapor-X 290, and I only get like 230-250W max at stock. Does the X variant really make it pull 200W more?


They must have been talking about system load. I checked the power usage of this card in other reviews; full system load was under 400 watts.


----------



## BradleyW

Quote:


> Originally Posted by *heroxoot*
> 
> That's kind of lucky. I couldn't do 1100/1250 on stock. I think 1080 was the most I could.


I thought 1100MHz was normal without voltage changes? I wonder how far these cards can go without needing extra voltage. Time to find out, I guess.

Edit: It seems I can do 1130MHz on each core. I start to get a slight fps drop past that point, but no artifacts.


----------



## bluedevil

Quote:


> Originally Posted by *Vici0us*
> 
> Does Cooler Master Seidon 120M fit on R9 290? Or only 120XL?


They both fit. Same bolt pattern.


----------



## Vici0us

Quote:


> Originally Posted by *bluedevil*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vici0us*
> 
> Does Cooler Master Seidon 120M fit on R9 290? Or only 120XL?
> 
> 
> 
> They both fit. Same bolt pattern.
Click to expand...

Thanks mang, I got some work to do then.


----------



## bluedevil

Quote:


> Originally Posted by *Vici0us*
> 
> Thanks mang, I got some work to do then.


Word. 20mm M3 bolts + M3 nuts + nylon washers.


----------



## heroxoot

Quote:


> Originally Posted by *ebhsimon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> That's kind of lucky. I couldn't do 1100/1250 on stock. I think 1080 was the most I could.
> 
> 
> 
> Exactly the same as me.. Except I have +25mV stock since I have a Vapor-X model.
Click to expand...

Mine is +13mV default voltage because I have the Gaming edition. It still showed checkered artifacts at 1100MHz.


----------



## Roboyto

Quote:


> Originally Posted by *BradleyW*
> 
> I see,
> What does AUX voltage do?
> Thanks for the information so far. +1


1200/1500 for me with +87mV on one card.

My other card won't even make 1200 with +200mV. Anything past 1075/1375 with +37mV isn't worth the abuse on that card. The silicon lottery plays its part as usual.

AUX voltage is for the RAM, as far as I know from reading this thread. Some get additional OC out of the RAM with it, others don't. I've never had much luck with AB personally; Trixx is always my go-to.
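A rough sketch of why a big voltage bump for a small clock gain is "abuse": dynamic power grows with frequency and roughly with voltage squared. The 1.25 V baseline voltage and 947 MHz stock clock below are assumptions for illustration, not measured values for these cards:

```python
# Dynamic GPU power scales roughly as P ~ f * V^2, so the relative
# power of an overclock is (f2/f1) * (v2/v1)**2.
# Baseline voltage (1.25 V) and stock clock (947 MHz) are assumed.

def relative_power(f1, v1, f2, v2):
    """Estimated power of (f2, v2) relative to baseline (f1, v1)."""
    return (f2 / f1) * (v2 / v1) ** 2

base_v = 1.25  # assumed stock voltage in volts

# 1075 MHz at +37 mV vs 1175 MHz at +137 mV, from an assumed 947 MHz stock:
print(round(relative_power(947, base_v, 1075, base_v + 0.037), 2))  # ~1.2x
print(round(relative_power(947, base_v, 1175, base_v + 0.137), 2))  # ~1.53x
```

Under these assumptions, the extra 100 MHz costs roughly another third again in heat, which is why the modest offset ends up being the sensible 24/7 setting.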


----------



## BradleyW

Quote:


> Originally Posted by *Roboyto*
> 
> 1200/1500 for me with +87mV on one card.
> 
> My other card won't even make 1200 with +200mV. Anything after 1075/1375 with +37mV isn't worth the abuse on this card. Silicon lottery plays it's part as usual.
> 
> AUX voltage is for RAM as far as I know from reading in this thread. Some get additional OC out of RAM with it, others don't. I've never had much luck with AB personally, Trixx is always my go to.


Thanks for the info.


----------



## Zipperly

Memory voltage on Hawaii is tied to core voltage. When I had my 290X, increasing core voltage helped me achieve a higher memory overclock. This is odd to say the least, but it's documented around the net.

http://www.overclock.net/t/1484990/overclocking-the-memory-on-an-r9-290-290x


----------



## Roboyto

Quote:


> Originally Posted by *BradleyW*
> 
> Thanks for the info.


No problem.

Did some scrounging and I found my bench spreadsheet for 2nd 290. Only able to get to 1025 with stock voltage and it maxed at 1175 core with +137mV; additional voltage after that made no difference in clocks.

+100mV for +100MHz, compared to 1075 at +37mV, is not really worth it, especially when this card is hooked up to my living room TV (1080p) so my wife can play Tomb Raider and Batman.

Your best bet with these cards is still to find how far the core can go with RAM at stock. Core gives you exponentially higher performance gains compared to RAM speeds. See spreadsheet tab "VRAM CLK Performace" in this post: http://www.overclock.net/t/1456279/honey-i-shrunk-the-ultra-tower-beast-my-journey-to-creating-a-more-compact-pc-with-an-r9-290/20_20#post_21847939

The 512-bit bus makes up for the seemingly slow RAM speeds. I say seemingly slow because the stock RAM clock nets 320 GB/s of bandwidth. Compare that to a GTX 780 with a mammoth RAM OC of 1865MHz, which yields 358.1 GB/s.
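Those bandwidth figures fall out of a simple formula: bytes per transfer (bus width / 8) times the effective data rate, where GDDR5 moves data at 4x the clock GPU-Z reports. A quick sketch to check the numbers quoted (the function name is mine, just for illustration):

```python
def gddr5_bandwidth_gbps(bus_width_bits: int, mem_clock_mhz: float) -> float:
    """Theoretical GDDR5 bandwidth in GB/s.

    GDDR5 transfers data at 4x the command clock that GPU-Z reports,
    and (bus_width_bits / 8) bytes move per transfer.
    """
    bytes_per_transfer = bus_width_bits / 8
    effective_rate_mtps = mem_clock_mhz * 4  # quad data rate
    return bytes_per_transfer * effective_rate_mtps / 1000  # MB/s -> GB/s

# R9 290(X): 512-bit bus at the stock 1250 MHz -> 320.0 GB/s
print(gddr5_bandwidth_gbps(512, 1250))
# GTX 780: 384-bit bus at an 1865 MHz OC -> ~358.1 GB/s
print(gddr5_bandwidth_gbps(384, 1865))
# R9 290 with 1700 MHz chips -> 435.2 GB/s
print(gddr5_bandwidth_gbps(512, 1700))
```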



Spoiler: GTX 780 1865 RAM bandwidth















If you are fortunate enough to get some BA memory chips capable of, say, 1700MHz like I did, then the bandwidth measured in GPU-Z could be 435.2 GB/s.











Spoiler: R9 290 1700 RAM bandwidth







Quote:


> Originally Posted by *Zipperly*
> 
> Memory voltage on Hawaii is tied in with core voltage. When I had my 290X increasing core voltage helped me achieve a higher memory overclock, this is odd to say the least but its documented around the net.
> 
> http://www.overclock.net/t/1484990/overclocking-the-memory-on-an-r9-290-290x


I did notice this phenomenon when I did extensive benching with my first pair of 290s.


----------



## falcon26

Well, I tried to get used to the heat, but it just bothers me too much. Gaming, it's at 80-90C :-( And of course I can't return the card. So I think I will just post it for sale around $350 and see if I have any takers. Or a swap on a 780 somewhere...


----------



## Vici0us

Quote:


> Originally Posted by *bluedevil*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vici0us*
> 
> Thanks mang, I got some work to do then.
> 
> 
> 
> Word. 20mm M3 bolts + M3 nuts + nylon washers.

Yup, I've got a 120M sitting on my CPU. My Windforce stays cool enough, but my Reference is hot as usual; the noise is the most annoying part. I think I'm gonna run push-pull Corsair SP fans or Noctuas.


----------



## 2tired

Am I losing performance by letting my card get to 90-93 degrees? This happens in BF4 every time. FPS seems OK, but sometimes it feels slow with the bullet registration. I know the netcode is bad, but sometimes shots won't register for me. One guy on the forums switched to a 780 and never got temps above 70 (that's what he said). I think that's a significant jump from 90. I also read here that heat can cause strange issues. Would changing the heatsink make gameplay better?

Anyone have a recommendation for a good low-profile GPU cooler? Something easy to install that takes up 2 slots?

thanks


----------



## Roboyto

Quote:


> Originally Posted by *falcon26*
> 
> Well I tried to get used to the heat but it just bothers me too much. Gaming its at 80-90 :-( And of course I can't return the card. So I think I will just post it for sale around $350 and see if I have any takers. Or a swap on a 780 somewhere...


It is unfortunate that you want to sell the card. I feel your temperatures are likely due to the poor airflow from the front, and lack of airflow from the side, of your case. Those Fractal cases are nice for keeping noise down, but not so great for feeding much needed fresh air to these 290(X) cards.

Have you tried running with the side panel off? Have some pics of your rig, maybe we can make some suggestions to get temps in check?

If you're not looking to set any OC records, then a Kraken G10 and an AIO should be able to bring temps well within normal operating range and allow for a modest OC. Taking a look at what's going on under the proverbial hood of your card on CoolingConfigurator.com, you would need to add a heatsink for the primary VRM1 once you remove the MSI cooler.

The Gelid 290 VRM kit does a great job of cooling VRM1 when paired with the Kraken: http://www.overclock.net/t/1478544/the-build-formerly-known-as-the-devil-inside-a-gaming-htpc-for-my-wife-4770k-corsair-250d-powercolor-r9-290/20_20#post_22214255

Total Investment from Newegg:


NZXT Kraken G10 $30
Corsair H50 $60 ($10 MIR available)
Gelid 290 VRM Kit $14
Total: $104


Yes, this is a $104 additional expense, but it will still be cheaper than selling your card at a loss and then buying a 780.

Quote:


> Originally Posted by *2tired*
> 
> am i losing performance by allowing my card to get top 90-93 degrees? this happens in bf4 every time. FPS seems ok, but its sometimes it feels slow with the bullet registration. Like I know the net code is bad, but sometimes shots wont register for me. this one guy one on the forums switched to a 780 and never got temps above 70 (thats what he said). I think thats a significant jump from 90. I also read here that heat can cause strange issues. Would changing the heatsink make game play better?
> 
> anyone have a recommendation for a good low profile gpu cooler? like something easy to install and one that takes up 2 slots?
> 
> thanks


If the card is running that hot, then it could be causing issues even though they are designed to run at those temperatures. Have you tried forcing the fan to high speeds, to keep temps down, to see if it smooths out gameplay? I would do this first to ensure the GPU temps are your problem.

I don't know if any low profile cooler is going to be compatible/capable of cooling one of these cards.

The Gelid Icy Vision 2, I believe, is your thinnest option and would also benefit from the Gelid 290 VRM kit.

The Arctic Accelero Extreme VI does a fair job, but I believe it falls short on VRM1 cooling.

I would suggest the Kraken/AIO/Gelid heatsink combination as I mention above.

If you really want to go the extra mile for some outstanding VRM1 cooling performance with the Kraken, then you can make 2 other changes. A new 92mm fan, and Fujipoly Ultra Extreme thermal pads; ~$40 additional. http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures/80_20#post_22599798


----------



## Zipperly

Quote:


> Originally Posted by *falcon26*
> 
> Well I tried to get used to the heat but it just bothers me too much. Gaming its at 80-90 :-( And of course I can't return the card. So I think I will just post it for sale around $350 and see if I have any takers. Or a swap on a 780 somewhere...


When I owned my 290X I took care of the heat problem with an Accelero 3 cooler. With an overvolt to 1.328V and clocks of 1200MHz core and 1500MHz memory, my absolute highest temps under hours of heavy gaming in Crysis 3 and BF4 were only 64C, and the hottest VRM section was 79C. At stock settings my temps would have been even better. As you can see, though, I no longer own the 290X; I got a little tired of the coil whine along with some spotty driver issues, sold it for $450.00, and snagged a GTX 780 for $350.00 off eBay.

I installed an Accelero 4 onto my GTX 780 and I'm loading at 48C with an overclock of 1254MHz core and voltage set to 1.238V. I did of course flash the BIOS to a modded one to eliminate throttling, and this card is a real beast after overclocking; it benefits quite a lot from increased core frequency.


----------



## Zipperly

Quote:


> Originally Posted by *BradleyW*
> 
> I thought 1100MHz was normal without voltages changes?


That is about right. Mine needed a slight voltage bump at 1150MHz core, and after that needed a very big voltage increase to remain stable.


----------



## rdr09

Quote:


> Originally Posted by *Zipperly*
> 
> When I owned my 290X I took care of the heat problem with an Accelero 3 cooler. With an overvolt too 1.328vc and a core clock of 1200mhz core and 1500mhz mem my absolute highest temps under extreme amounts of heavy gaming in Crysis 3 and BF4 were only 64c and the hottest VRM section was 79c. If I ran the card at stock settings my temps would have even been better. As you can see though I no longer own the 290X, I got a little tired of the coil wine along with some spotty driver issues and sold it for $450.00 while snagging a GTX 780 for $350.00 off of ebay.
> 
> I installed an accelero 4 onto my GTX 780 and im loading at 48c with an overclock of 1254mhz core and voltage set to 1.238vc, I did of course flash the bios to a modded one to eliminate throttling and this card is a real beast after overclocking, it really benefits quiet a lot with increased core frequency.


A used 780 3GB is a good alternative; that 3GB is more than enough, even for upcoming games. A 1254MHz OC is about equal to a 290 at 1150, and not many 290s will do that without voltage tweaks. IMO, watercooling is the best way to tame Hawaiis.


----------



## the9quad

Quote:


> Originally Posted by *rdr09*
> 
> the 780 3GB used is a good alternative. that 3Gb is more than enough for upcoming games even. a 1254 MHz oc is about equal to a 290 at 1150 and not many 290 will do that without voltage tweaks. IMO, watercooling is the best way to tame Hawaiis.


Watchdogs and Ubisoft would like to have a word with you about 3GB being enough. I fear it is only going to get worse with their titles as well.


----------



## rdr09

Quote:


> Originally Posted by *the9quad*
> 
> Watchdogs and Ubisoft would like to have a word with you about 3GB being enough. I fear it is only going to get worse with their titles as well.


They can always lower some settings like AA, but the other alternative, the 6GB version, is just expensive if you think about it, since the next gen will be coming soon.


----------



## BradleyW

Quote:


> Originally Posted by *the9quad*
> 
> Watchdogs and Ubisoft would like to have a word with you about 3GB being enough. I fear it is only going to get worse with their titles as well.


GameWorks resources tend to use a lot of VRAM because the games import pre-created objects into the game world. It sucks, man!


----------



## Zipperly

Quote:


> Originally Posted by *rdr09*
> 
> the 780 3GB used is a good alternative. that 3Gb is more than enough for upcoming games even. a 1254 MHz oc is about equal to a 290 at 1150 and not many 290 will do that without voltage tweaks. IMO, watercooling is the best way to tame Hawaiis.


Disagree with the 290 comment, unless you are talking about one particular canned benchmark which favors ATI. For gaming, my 1254MHz GTX 780 is giving me better performance than my 1200MHz core 290X gave me. Some games are obviously closer than others, but the majority run better on my 780.


----------



## Zipperly

Quote:


> Originally Posted by *the9quad*
> 
> Watchdogs and Ubisoft would like to have a word with you about 3GB being enough. I fear it is only going to get worse with their titles as well.


Watch Dogs is a crap game and a poor example, to be honest. I ran Skyrim with over 100 mods at 1920x1080 with 4xMSAA and had no issues with the 3GB on my 780.


----------



## falcon26

The other odd thing is that in BF3 my 290X runs like a dream, butter smooth. But in BF4 it's like it pauses for a split second every 5 to 10 seconds or so. Like microstutter, but I'm not running CrossFire. It's really annoying..


----------



## the9quad

Quote:


> Originally Posted by *Zipperly*
> 
> Watchdogs is a crap game and a poor example to be honest. I ran Skyrim with over 100 mods at 1920X1080 4XMSAA and had no issues with the 3gb's on my 780.


Just saying it is an example, and it is utilizing RAM the way the new consoles do. So if other engines end up doing the same thing, or porting as badly as Ubi, we could be in for a generation of crud, or get used to turning down textures on $4000 PCs to play a game from a $400 console.


----------



## Zipperly

Quote:


> Originally Posted by *the9quad*
> 
> Just saying it is an example and it is utilizing ram the way the new consoles do. So if other engines end up doing the same thing or porting as bad as ubi, we could be in for a generation of crud or get used to turning down textures on $4000 dollar pc's to play a game from a $400 console.


Guess time will tell, either way I wont worry much about it. I'll just grab a card with more vram if it becomes an issue anytime soon.


----------



## BradleyW

Quote:


> Originally Posted by *Zipperly*
> 
> Watchdogs is a crap game and a poor example to be honest. I ran Skyrim with over 100 mods at 1920X1080 4XMSAA and had no issues with the 3gb's on my 780.


Watch Dogs is not coded well, but after the E3 mod, I saw substantial performance increases in both fps and visuals. The game is excellent for me, especially with the mod. The only issue I have is that I get negative CFX scaling. However, I really loved the game a lot. Huge fan of Watch Dogs. I just wish they did a better job technically.








Quote:


> Originally Posted by *the9quad*
> 
> Just saying it is an example and it is utilizing ram the way the new consoles do. So if other engines end up doing the same thing or porting as bad as ubi, we could be in for a generation of crud or get used to turning down textures on $4000 dollar pc's to play a game from a $400 console.


I think we just have to be wary when it comes to Ubisoft. Everything else should be absolutely fine on the PC.


----------



## Zipperly

Quote:


> Originally Posted by *BradleyW*
> 
> Watch Dogs is not coded well, but after the E3 mod, I saw substantial performance increases in both fps and visuals. The game is excellent for me, especially with the mod. The only issue I have is that I get negative CFX scaling. However, I really loved the game a lot. Huge fan of Watch Dogs. I just wish they did a better job technically.


Thanks for that info, perhaps when I can get it on a good sale I will give it a shot and use the mod.


----------



## the9quad

Is it better than the worst mod right before version 1? Because I was seriously getting 17 fps with that, and I'm not sure what I was doing wrong, because it was just really bad. If only we had someone whose name rhymes with Schmadley W, who would be kind enough to PM me their settings, life would be grand...please


----------



## BradleyW

Quote:


> Originally Posted by *Zipperly*
> 
> Thanks for that info, perhaps when I can get it on a good sale I will give it a shot and use the mod.


The mod is excellent and it certainly makes the game 95% smooth. It's just a shame that the fps suffers and we can't use Ultra textures without massive stuttering.
Quote:


> Originally Posted by *the9quad*
> 
> Is it better than the worse mod right before version 1? Because I seriously was getting 17 fps with that, I am not sure what I am doing wrong, because that was just really bad. If only we had someone whose name rhymes with Schmadley W, who would be kind enough to PM me their settings, life would be grand...please


PM'ed









Also, enjoy:


----------



## falcon26

OK, disabling "Origin In Game" fixed the stutter issue. Next issue is choppiness. My card plays fine in BF4 for about 10 minutes, then it starts to get choppy. I check my temp and it's 95 :-( so I assume my card is now throttling, correct? How the hell do people keep these cards cool? If I put fan speed to 100%, it's like a 747 in here.


----------



## BradleyW

Quote:


> Originally Posted by *falcon26*
> 
> OK disabling "Origin in Game" fixed the stutter issue. Next issue is choppiness. My card plays fine in BF4 for about 10 minutes, then it starts to get choppy. I check my temp and its 95 :-( so I assume my card is now throttling correct? How the hell do people keep these cards cool. If I put fan speed to 100% then its like a 747 in here.


Yes, that has to be throttling. What's your airflow setup like?


----------



## falcon26

1 rear exhaust fan for my H55, 2 front fans. All 120mm.


----------



## kayan

Quote:


> Originally Posted by *falcon26*
> 
> OK disabling "Origin in Game" fixed the stutter issue. Next issue is choppiness. My card plays fine in BF4 for about 10 minutes, then it starts to get choppy. I check my temp and its 95 :-( so I assume my card is now throttling correct? How the hell do people keep these cards cool. If I put fan speed to 100% then its like a 747 in here.


Definitely throttling. What version of the 290(x) do you have, and also fan setup on your case?


----------



## Zipperly

Quote:


> Originally Posted by *falcon26*
> 
> OK disabling "Origin in Game" fixed the stutter issue. Next issue is choppiness. My card plays fine in BF4 for about 10 minutes, then it starts to get choppy. I check my temp and its 95 :-( so I assume my card is now throttling correct? How the hell do people keep these cards cool. If I put fan speed to 100% then its like a 747 in here.


I told you what I did to keep mine cool, and it was more than enough; plus, with that cooler it's nearly silent. It just comes down to whether you wanna do aftermarket cooling or not.


----------



## falcon26

I have the Twin Frozr. I'd rather not do anything to the card. My cooling setup is fine; my old 780 never went over 70 while gaming. I'd rather just sell the card...


----------



## kayan

Quote:


> Originally Posted by *falcon26*
> 
> I have the twin froze. I'd rather not do anything to the card. My cooling setup is fine. My old 780 never went over 70 whole gaming. I'd rather just sell the card...


See, the issue is though, these cards are meant to run that hot. Comparing them to a 780(ti) isn't really something that can be done, viably, they are apples and oranges when it comes to temps. There isn't much you can do about that if you don't want to mod your card by adding an aftermarket cooler. I did for both of mine, and am happy with it now, but before, I was not happy with the noise.


----------



## Roboyto

Quote:


> Originally Posted by *falcon26*
> 
> 1rear exhaust fan for my h55 2 front fans. All 120mm


Quote:


> Originally Posted by *falcon26*
> 
> I have the twin froze. I'd rather not do anything to the card. My cooling setup is fine. My old 780 never went over 70 whole gaming. I'd rather just sell the card...


You may think your cooling setup is fine, but in actuality it's lackluster and inadequate for your 290X. 290(X) run hotter than 780 because the transistor density is higher.


AMD has crammed 6.2 Billion transistors into a 438 mm² die
Compared to GTX 780 which has 7.1 Billion transistors in a 561 mm² die
Nvidia has 14% more transistors but uses 28% more space to do so
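A quick back-of-the-envelope check on those percentages and the resulting densities (figures as quoted above; the labels and layout are mine):

```python
# Die figures quoted above: transistor count (billions) and die area (mm^2)
chips = {
    "Hawaii (290X)": (6.2, 438),
    "GK110 (GTX 780)": (7.1, 561),
}

# GK110's transistor count and die area relative to Hawaii
t_ratio = chips["GK110 (GTX 780)"][0] / chips["Hawaii (290X)"][0]  # ~1.145
a_ratio = chips["GK110 (GTX 780)"][1] / chips["Hawaii (290X)"][1]  # ~1.281
print(f"GK110: {(t_ratio - 1) * 100:.1f}% more transistors "
      f"in {(a_ratio - 1) * 100:.1f}% more area")

# Density in millions of transistors per mm^2 -- Hawaii is the denser die
for name, (billions, area) in chips.items():
    print(f"{name}: {billions * 1000 / area:.1f} M transistors/mm^2")
```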



Have you tried gaming with the front door open? The tiny little vents on the side of the front panel suffocate the dual 120mm front fans when the door is closed. 
Do you have all the hard drive cages installed? If so, then the front 120mm fans are really not doing much of anything to assist in cooling the GPU.
You could add dual 120mm fans to the top of the case to help evacuate the heat.


----------



## gobblebox

So my VRM 1 temps are getting ridiculous... I have the Sapphire Tri-X OC r9 290 card, and though the fans are pretty good at keeping temps down, it is awfully loud (dropping the fan speed results in crazy high temps) & I need to do something to drop the VRM 1 temp down a bit. I have decided to replace the Tri-X cooler, and am stuck between the Arctic Accelero 4 and the Gelid Icy Vision Rev 2; which one is more efficient? Are there any other recommendations that don't exceed $65? I've got to get this thing swapped out.


----------



## Zipperly

Quote:


> Originally Posted by *gobblebox*
> 
> So my VRM 1 temps are getting ridiculous... I have the Sapphire Tri-X OC r9 290 card, and though the fans are pretty good at keeping temps down, it is awfully loud (dropping the fan speed results in crazy high temps) & I need to do something to drop the VRM 1 temp down a bit. I have decided to replace the Tri-X cooler, and am stuck between the Arctic Accelero 4 and the Gelid Icy Vision Rev 2; which one is more efficient? Are there any other recommendations that don't exceed $65? I've got to get this thing swapped out.


Between the Gelid and the Arctic, go with the Arctic if you overclock/overvolt. The Gelid is a decent cooler, but paired with the 290 it can't withstand overclocking, and especially overvolting, nearly as well as the Accelero. The one good thing from Gelid is the VRM1 upgrade kit, which you can purchase separately, so I would recommend the Accelero 4 along with the Gelid VRM upgrade kit. The Accelero also has a backplate which acts as a heatsink and helps cool the VRM and memory sections; I have one on my GTX 780 and it does a tremendous job with the core and VRMs.

When I owned my 290X I cooled it with the Accelero 3, and at 1.328V with 1200MHz on the core I was peaking at 64C after hours of heavy gaming in titles such as Crysis 3 and BF4. VRM1 temps in my rig were hitting 79C under load with the above overclock and overvolt. Keep in mind good case airflow is a plus and can really help with temps.


----------



## gobblebox

Quote:


> Originally Posted by *Zipperly*
> 
> Between the Gelid and Arctic go with the arctic if you overclock/overvolt. The Gelid is a decent cooler but when paired with the 290 it cant withstand overclocking and especially overvolting nearly as much as the accelero. The one good thing from Gelid is the VRM1 upgrade kit which you can purchase separately so I would recommend the Accelero 4 along with the Gelid VRM upgrade kit, the Accelero also has a backplate which acts as a heatsink and helps cool the vrm and mem section, I have one on my GTX 780 and it does a tremendous job with the core and vrms.
> 
> When I owned my 290X I cooled it with the Accelero 3 and with 1.328vc with 1200mhz on the core I was peaking at 64c after hours of heavy gaming with titles such as crysis 3 and bf4. VRM1 temps in my rig were hitting 79c under load with the above overclock and overvolt. Keep in mind good case airflow is a plus and can really help with temps.


Yeah my rig has excellent airflow, 2x 140mm Noctua PWM intake, 1x AF140 Quiet rear exhaust, then H100i w/ 2x AF120 pull / 2x SP120 push as top exhaust in my Arc Midi R2. Thanks for the recommendation! Is there any huge difference between the Accelero 3 and the Accelero 4? These have PWM fans, right?

[Edit] Typo/OCD


----------



## Zipperly

Quote:


> Originally Posted by *gobblebox*
> 
> Yeah my rig has excellent airflow, 2x 140mm Noctua PWM intake, 1x AF140 Quiet rear exhaust, then H100i w/ 2x AF120 pull / 2x SP 120 push as top exhaust in my Arc Midi R2. Thanks for the recommendation! Is there any huge difference between the Accelero 3 and the Accelero 4? These have PWM fans, right?


Yes, PWM fans, but run them at 100 percent; it's necessary for proper VRM cooling and it's near silent even at full blast. The difference between the Accelero 3 and 4 is the way they chose to cool the VRMs: with the Accelero 4 it's the backplate, which acts as a heatsink cooling the VRMs and memory, and trust me, it does a good job. But for even better cooling, as I recommended before, grab the Gelid VRM upgrade kit and install that in addition to the backplate; this will really help with the VRM temps.

I have additional VRM sinks on my GTX 780 as well as the backplate. This is what it looks like inside my system; note that if the card appears to be leaning slightly, it is only because I had not yet installed the support bracket which keeps the card straight. The bracket is included with the cooler.











This is what I have installed onto my side door...


----------



## pnoozi

Quote:


> Originally Posted by *falcon26*
> 
> I have the twin froze. I'd rather not do anything to the card. My cooling setup is fine. My old 780 never went over 70 whole gaming. I'd rather just sell the card...


Which card do you have exactly? This one? That is ridiculous. A Twin Frozr card should not be reaching those temperatures. You need to try replacing the thermal paste on the GPU.


----------



## rdr09

Quote:


> Originally Posted by *pnoozi*
> 
> Which card do you have exactly? This one? That is ridiculous. A Twin Frozr card should not be reaching those temperatures. You need to try replacing the thermal paste on the GPU.


That Twin Frozr may do a good job on a 7950, but not on Hawaiis. These things get hot, and even the coolers with 3 fans are struggling. Watercooled, you won't see 60C on the core, and the VRMs will be even cooler than that.

edit: What I don't get is... these GPUs have been out for almost a year and there are tons of reviews saying they get hot. Despite that, some are still surprised by how much heat they generate.


----------



## pnoozi

Quote:


> Originally Posted by *rdr09*
> 
> that twin frozer may do a good job on a 7950 but not hawaiis. these things get hot and even those with 3 fans are struggling. watercooled and you won't see 60C on the core, then the vrms will be even cooler than it.


Please tell me if I'm missing something but according to reviews, it appears they run at roughly the same temps as my 7950 TFIII.

http://www.tweaktown.com/reviews/6179/msi-radeon-r9-290x-twin-frozr-gaming-oc-overclocked-video-card-review/index21.html

71C load

http://www.hardocp.com/article/2014/03/31/msi_radeon_r9_290x_gaming_4g_video_card_review/9#.U97q8fldU58

70C load, 75C load ("overclocked")

http://www.guru3d.com/articles_pages/msi_radeon_r9_290x_gaming_oc_review,10.html

73C load

falcon26 is reporting a temperature of 95C under load.


----------



## rdr09

Quote:


> Originally Posted by *pnoozi*
> 
> Please tell me if I'm missing something but according to reviews, it appears they run at roughly the same temps as my 7950 TFIII.
> 
> http://www.tweaktown.com/reviews/6179/msi-radeon-r9-290x-twin-frozr-gaming-oc-overclocked-video-card-review/index21.html
> 
> 71C load
> 
> http://www.hardocp.com/article/2014/03/31/msi_radeon_r9_290x_gaming_4g_video_card_review/9#.U97q8fldU58
> 
> 70C load, 75C load ("overclocked")
> 
> http://www.guru3d.com/articles_pages/msi_radeon_r9_290x_gaming_oc_review,10.html
> 
> 73C load
> 
> falcon26 is reporting a temperature of 95C under load.


What you are missing are the VRM temps. Also, there are cases where QC fails, especially in the application of the paste. I've read this same cooler failed to cool the VRMs on the Ti; one member measured his Ti's VRM area at the back of the PCB and got 100C using an infrared thermometer.

if i read correctly, hardocp used a test bench.

edit: post# 5744

http://www.overclock.net/t/1438886/official-nvidia-gtx-780-ti-owners-club/5740


----------



## falcon26

Well, all I can tell you is I'm getting almost 95 load in BF4. I did notice that my 2 front fans are being used as exhaust rather than intake; would that matter much?


----------



## rdr09

Quote:


> Originally Posted by *falcon26*
> 
> Well all I can tell you is I'm getting almost 95 load in bf4. I did notice that my 2 front fans are being used as an exhaust rather then an intake would that matter much?


Yes. Normally, case airflow should go from front to rear. That GPU dumps hot air into the case, so you want negative pressure, meaning you need to get the hot air out of the case ASAP: more exhaust than intake.


----------



## pnoozi

Quote:


> Originally Posted by *rdr09*
> 
> what you are missing are the vrm temps. also, there are cases where QC fails, especially in the application of the paste. i've read this same cooler failed to cool the vrms on the Ti. one member measured his Ti's vrm area at the back of the pcb and got 100C using an infrared thermometer.
> 
> if i read correctly, hardocp used a test bench.


I don't know what normal VRM temps should be, but falcon26 is reporting _core_ temps 20-25 degrees higher than normal for an MSI R290X Gaming (Twin Frozr). The fact that BF4 is throttling supports this. He needs to try replacing the thermal paste. Adjusting air flow in the case will only have a minor effect on GPU temperatures.


----------



## rdr09

Quote:


> Originally Posted by *pnoozi*
> 
> I don't know what normal VRM temps should be, but falcon26 is reporting _core_ temps 20-25 degrees higher than normal for an MSI R290X Gaming (Twin Frozr). The fact that BF4 is throttling supports this. He needs to try replacing the thermal paste.


I was gonna suggest replacing the paste, but falcon wants to leave it as is since he'll be selling it. I think he is referring to the core temp.


----------



## falcon26

So say I get 2 140mm fans in the front as intakes, would that help?


----------



## Roboyto

Quote:


> Originally Posted by *gobblebox*
> 
> Yeah my rig has excellent airflow, 2x 140mm Noctua PWM intake, 1x AF140 Quiet rear exhaust, then H100i w/ 2x AF120 pull / 2x SP120 push as top exhaust in my Arc Midi R2. Thanks for the recommendation! Is there any huge difference between the Accelero 3 and the Accelero 4? These have PWM fans, right?
> 
> [Edit] Typo/OCD


It may be hard to find an Accelero Extreme 3 as they are discontinued and have been replaced by the 4. The Accelero 3 relied on using thermal epoxy to attach sinks to RAM/VRMs, with the large cooler for the core. The epoxied sinks can be a pain to remove later on down the line.

The Accelero 4 uses a very large passive heatsink that attaches to the backside of the card, leaving the components on the topside of the card bare, but no glue is necessary. This makes the card very large, and it will occupy a slot above the card as well as taking all that space below. From the very few things I can find about this cooler with these cards, it doesn't do a very good job with VRM1, essentially leaving you in the same boat you're in now after spending about $90. If you were able to use the Gelid VRM kit with the Accelero 4, you might have something...but with the way the Accelero rear heatsink attaches to the back of the card, I don't think this is possible. The Gelid VRM kit screws to the card, and the small thumbscrews will interfere with the rear Accelero heatsink making contact.



Spoiler: Gelid VRM Kit - Rear Thumbscrews















$65 is going to be hard to do unless you go with the Gelid Icy Vision plus the Gelid VRM enhancement kit; $68 through Newegg for both items.

Here's a thread for the Gelid on a 290: http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x/0_20

I quickly skimmed through, and it seems VRM1 is still problematic. However, at the end of the thread, one user, @fishyfish777, has commented about the VRM enhancement kit. He says it keeps good temps on VRM1, 65C-80C, unless he runs Furmark. Furmark is terrible, so this sounds promising to me. I would send fishyfish a PM.

I think your best bet is going to be to get the Icy Vision with the VRM kit, and get better thermal pads; Fujipoly Ultra Extreme to be exact. The pads will run you an extra $21 + ship from FrozenCPU, but they will undoubtedly increase performance even more. I have the Gelid VRM kit, with FUE pads, on a 290, where core is cooled by a NZXT Kraken, and VRM1 stays extremely cool.

Quote:


> Originally Posted by *falcon26*
> 
> So say I get 2 140mm fans in the front as intakes would that help


Turn your front fans around to intake and see what happens. As I mentioned before, if you have HDD cages installed they will obstruct the airflow so this may not help much.

But... if you're going to buy 2 more fans, then you should use the ones you replace in front as 2 exhaust fans on top. Like @rdr09 said, more exhaust than intake to force the air through.

Quote:


> Originally Posted by *pnoozi*
> 
> I don't know what normal VRM temps should be, but falcon26 is reporting core temps 20-25 degrees higher than normal for an MSI R290X Gaming (Twin Frozr). The fact that BF4 is throttling supports this. He needs to try replacing the thermal paste. *Adjusting air flow in the case will only have a minor effect on GPU temperatures.*


In most cases I would agree. However, the card is getting absolutely no cool air with the current configuration.


----------



## pnoozi

Quote:


> Originally Posted by *falcon26*
> 
> So say I get 2 140mm fans in the front as intakes would that help


I don't think it would make much of a difference (if anything, the added noise would make it worse). I would replace the thermal paste. Assuming your card isn't overheating for other reasons (let's hope not!), it is a quick, cheap and easy fix for your problem.


----------



## kizwan

Quote:


> Originally Posted by *falcon26*
> 
> So say I get 2 140mm fans in the front as intakes would that help


What is the temp inside the case? What is your room temperature? If the temp inside the case is only a couple of degrees higher (under load) than room temp, I doubt adding fans will help much. What temps do you get if you set the fan speed to 60%?


----------



## Zipperly

Quote:


> Originally Posted by *Roboyto*
> 
> It may be hard to find an Accelero Extreme 3 as they are discontinued and have been replaced by the 4. The Accelero 3 relied on using thermal epoxy to attach sinks to RAM/VRMs, with the large cooler for the core. The epoxied sinks can be a pain to remove later on down the line.
> 
> The Accelero 4 uses a very large passive heatsink that attaches to the backside of the card, leaving the components on the topside of the card bare, but no glue is necessary. This makes the card very large and it will occupy a slot above the card as well as taking all that space below. From the very few things I can find about this cooler with these cards, they don't do a very good job with VRM1. Essentially leaving you in the same boat you are now after spending about $90. If you were able to use the Gelid VRM kit with the Accelero 4, you might have something...but with the way the Accelero rear heatsink attaches to the back of the card I don't think this is possible. The Gelid VRM kit screws to the card, and the small thumbscrews will interfere with the rear Accelero heatsink making contact.
> 
> 
> Spoiler: Gelid VRM Kit - Rear Thumbscrews
> 
> $65 is going to be a hard to do unless you go with the Gelid Icy Vision with the Gelid VRM enhancement kit; $68 through NewEgg for both items.
> Here's a thread for the Gelid on a 290: http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x/0_20
> I quickly skimmed through, and it seems VRM1 is still problematic. However, at the end of the thread, one user, @fishyfish777
> , has commented about the VRM enhancement kit. He says it keeps good temps on VRM1, 65C-80C, unless he runs Furmark. Furmark is terrible, so this sounds promising to me. I would send fishyfish a PM.
> 
> I think your best bet is going to be to get the Icy Vision with the VRM kit, and get better thermal pads; Fujipoly Ultra Extreme to be exact. The pads will run you an extra $21 + ship from FrozenCPU, but they will undoubtedly increase performance even more. I have the Gelid VRM kit, with FUE pads, on a 290, where core is cooled by a NZXT Kraken, and VRM1 stays extremely cool.


The backplate that comes with the Accelero 4 cools the VRM area better than the Accelero 3 kit; it may not seem like it would work very well, but trust me, it certainly does. Good job pointing out the Gelid VRM upgrade's thumbscrew issue; I was unaware of that, but there are other options he can use for additional VRM cooling. With my GTX 780, the backplate/heatsink from the Accelero 4 gets pretty darn hot to the touch around the VRM area, which tells me it's doing a good job pulling heat from that area.

The Gelid heatsink itself is not bad as long as he does not expect to overclock or overvolt much; if he does, then the Accelero 4 will be the better heatsink, as it handles high voltages and overclocks better than the Gelid. When I had my Accelero 3 on my 290X, with a core overvolt to 1.328V and an overclock of 1200MHz, my load temps were roughly the same as guys using the Gelid cooler but running their 290X cards at completely stock clocks and voltage.

Here are one guy's results after installing only the backplate for the VRMs and memory onto his 290X.
http://forums.overclockers.co.uk/showpost.php?p=26177381&postcount=72

Quote:


> Hello everybody, just mounted the Xtreme IV on my 290X without any heatsinks (only with the pieces provided by Arctic) and ran many Valley benchmarks back-to-back for about 20 minutes at the Extreme HD preset. Temps are great. I don't know the temps from the AX III, but here, with fans at 100% (which is basically not audible, especially with the game's sound/music), I got after 15-20 minutes 55°C on the GPU core, 60°C on VRM1 and 50°C on VRM2.
> 
> It's clear that in this case the fans are the most important part in cooling the VRMs: lowering them from 100% to 50%, the GPU core sits at about 60°C while VRM1 goes to about 70°C; VRM2 stays about the same. Lowering the fans to 20%, the GPU core goes up to 70-75°C, VRM1 to 82°C and VRM2 to about 60°C.
> 
> So, after 20 minutes of the Valley benchmark, Extreme HD preset:
> Fans 20%:
> GPU, VRM1, VRM2: 73°C, 82°C, 60°C
> 
> Fans 50%:
> GPU, VRM1, VRM2: 60°C, 70°C, 58°C
> 
> Fans 100%:
> GPU, VRM1, VRM2: 55°C, 60°C, 50°C
> 
> So, if you want a cold card using an Acc Xtreme IV, just put your fans at 100% and you'll be great even without heatsinks on the VRAM and VRMs. Or just use AB to create a personalized fan curve where, if the temp reaches 55°C, the fans go up to 100% and keep the core and VRMs at great temps (for those who do not like the almost inaudible sound of fans spinning at 100%).
> 
> Hoping this is helpful for someone


As I said before, running those fans at 100 percent is crucial for properly cooling the VRMs.
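The Afterburner fan-curve idea quoted above can be sketched as a simple temperature-to-duty mapping. This is a toy illustration, not Afterburner's actual config format, and the breakpoints are hypothetical rather than measured values:

```python
def fan_duty(core_temp_c):
    """Toy fan curve mirroring the advice above: stay quiet at idle,
    ramp through the middle, and jump to 100% duty once the core hits
    55C so the cooler's airflow keeps VRM1/VRM2 in check.
    All breakpoints here are illustrative assumptions."""
    if core_temp_c >= 55:
        return 100  # full blast; the quoted results show ~60C on VRM1 here
    if core_temp_c >= 45:
        # ramp linearly from 50% duty at 45C to 100% at 55C
        return 50 + (core_temp_c - 45) * 5
    return 30       # quiet floor for desktop/idle use

print(fan_duty(40), fan_duty(50), fan_duty(60))
```

The key design point from the quoted results: the jump to 100% at 55°C matters far more for the VRMs than for the core, since the Accelero relies on airflow rather than dedicated VRM sinks.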


----------



## Zipperly

Quote:


> Originally Posted by *pnoozi*
> 
> I don't think it would make much of a difference (if anything, the added noise would make it worse). I would replace the thermal paste. Assuming your card isn't overheating for other reasons (let's hope not!), it is a quick, cheap and easy fix for your problem.


Right, and adding 2 140mm fans as intake isn't going to do you much good if your exhaust is weaker than the intake; if anything, you will be trapping hot air inside the case.


----------



## FuriousPop

Quote:


> Originally Posted by *gobblebox*
> 
> So my VRM 1 temps are getting ridiculous... I have the Sapphire Tri-X OC r9 290 card, and though the fans are pretty good at keeping temps down, it is awfully loud (dropping the fan speed results in crazy high temps) & I need to do something to drop the VRM 1 temp down a bit. I have decided to replace the Tri-X cooler, and am stuck between the Arctic Accelero 4 and the Gelid Icy Vision Rev 2; which one is more efficient? Are there any other recommendations that don't exceed $65? I've got to get this thing swapped out.


That's interesting - I have 3x of these cards and didn't have any issues with mine. However, when I was removing the fan/heatsink to put on my water block, I did notice that there was a touch of spacing between VRM1 and 2 and the stock cooler/heatsink - maybe a clean and new thermal paste, as well as some slight tightening of the screws on the PCB, might help?

When I was running mine I would test each one with MSI AB (custom fan curve) on BF4 at 7560x1600, medium settings, and the card would never go above 65C core - the VRMs were slightly higher.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Zipperly*
> 
> Right, and adding 2 140mm fans as an intake isnt going to do you much good if your exhaust is weaker than the intake, if anything you will be trapping hot air inside of the case.


It creates positive pressure. The case has a lot of vent holes for air to escape.


----------



## Zipperly

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It creates positive pressure. The case has a lot of vent holes for air to escape.


As I said, as long as the exhaust setup is good enough, the air can circulate out of the case.


----------



## Mega Man

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *falcon26*
> 
> Well all I can tell you is I'm getting almost 95 load in bf4. I did notice that my 2 front fans are being used as exhaust rather than intake - would that matter much?
> 
> 
> 
> yes, normally, case air flow should go from front to rear. that gpu dumps hot air in the case, thus you want negative pressure - meaning - you need to get the air inside out of the case asap. more exhaust than intake.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Zipperly*
> 
> Right, and adding 2 140mm fans as an intake isnt going to do you much good if your exhaust is weaker than the intake, if anything you will be trapping hot air inside of the case.
> 
> 
> 
> It creates positive pressure. The case has a lot of vent holes for air to escape.

This, and you never want negative pressure - you will do nothing but suck dirt into your PC!!

Also, front to back... no. Well designed, yes. I have no issues running back to front if it is done right. This is like saying heat rises: it does, except in PCs, where the air, heated or not, will go where the fans push it (obviously excluding a fully passive system).
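The intake-vs-exhaust balance being debated here comes down to simple arithmetic on rated airflow. A minimal sketch, with made-up CFM figures purely for illustration (real fans list their free-air CFM on the spec sheet, and filters or HDD cages reduce the effective number):

```python
# Rough positive/negative pressure estimate from rated fan flow.
# All CFM values below are hypothetical placeholders.
intake_cfm = [60, 60]   # e.g. two 140mm front intakes
exhaust_cfm = [50]      # e.g. one 120mm rear exhaust

net = sum(intake_cfm) - sum(exhaust_cfm)
if net > 0:
    verdict = "positive pressure: excess air leaks out through vents, less dust"
else:
    verdict = "negative pressure: air gets pulled in through every gap, more dust"
print(f"net flow: {net} CFM -> {verdict}")
```

This matches both sides of the argument above: with positive pressure the vent holes carry the excess out, while negative pressure clears GPU heat faster at the cost of pulling unfiltered air (and dirt) in through gaps.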


----------



## Zipperly

Quote:


> Originally Posted by *Mega Man*
> 
> This, and you never want negative pressure - you will do nothing but suck dirt into your PC!!
> 
> Also, front to back... no. Well designed, yes. I have no issues running back to front if it is done right. This is like saying heat rises: it does, except in PCs, where the air, heated or not, will go where the fans push it (obviously excluding a fully passive system).


I agree with this; I just wanted to make sure he had it set up so some air can escape instead of building up inside the case and circulating hot air around. With coolers such as the Accelero you want good case airflow.


----------



## falcon26

Well then if my front fans are exhaust then aren't they expelling the hot air from the GPU?


----------



## Zipperly

Quote:


> Originally Posted by *falcon26*
> 
> Well then if my front fans are exhaust then aren't they expelling the hot air from the GPU?


Yes, thanks for clarifying that the front fans are exhaust. Earlier I could have sworn you said they were intake fans, unless you rearranged them.

Edit - just went back and looked; you did say they were intake fans the first time.


----------



## falcon26

My rear fan is an exhaust, which is attached to my H55 water cooler. My 2 front fans are being used as exhaust.


----------



## Zipperly

Quote:


> Originally Posted by *falcon26*
> 
> My rear fan is an exhaust which is attached to my h55 water cooler. My 2 front fans are being used as an exhaust..


So what was this about?
Quote:


> Originally Posted by *falcon26*
> 
> So say I get 2 140mm fans in the front as intakes would that help


----------



## Roboyto

Quote:


> Originally Posted by *Zipperly*
> 
> Here are one guy's results after installing only the backplate for the VRMs and memory onto his 290X.
> http://forums.overclockers.co.uk/showpost.php?p=26177381&postcount=72
> 
> > Hello everybody, just mounted the Xtreme IV on my 290X without any heatsinks (only with the pieces provided by Arctic) and ran many Valley benchmarks back-to-back for about 20 minutes at the Extreme HD preset. Temps are great. I don't know the temps from the AX III, but here, with fans at 100% (which is basically not audible, especially with the game's sound/music), I got after 15-20 minutes 55°C on the GPU core, 60°C on VRM1 and 50°C on VRM2.
> > 
> > It's clear that in this case the fans are the most important part in cooling the VRMs: lowering them from 100% to 50%, the GPU core sits at about 60°C while VRM1 goes to about 70°C; VRM2 stays about the same. Lowering the fans to 20%, the GPU core goes up to 70-75°C, VRM1 to 82°C and VRM2 to about 60°C.
> > 
> > So, after 20 minutes of the Valley benchmark, Extreme HD preset:
> > Fans 20%: GPU, VRM1, VRM2: 73°C, 82°C, 60°C
> > Fans 50%: GPU, VRM1, VRM2: 60°C, 70°C, 58°C
> > Fans 100%: GPU, VRM1, VRM2: 55°C, 60°C, 50°C
> > 
> > So, if you want a cold card using an Acc Xtreme IV, just put your fans at 100% and you'll be great even without heatsinks on the VRAM and VRMs. Or just use AB to create a personalized fan curve where, if the temp reaches 55°C, the fans go up to 100% and keep the core and VRMs at great temps (for those who do not like the almost inaudible sound of fans spinning at 100%).
> > 
> > Hoping this is helpful for someone
> 
> As I said before, running that fan at 100 percent is crucial for properly cooling the VRMs.


Glad to see there is some positive information for the new Accelero with these cards. However, he doesn't mention whether he is overclocking the card as you were with the AX3. The only thing I came across was a German tech site which reported VRM1 at 117C under load:

http://translate.google.com/translate?sl=de&tl=en&u=http://ht4u.net/reviews/2014/arctic_accelero_iv_xtreme_im_test_auf_r9_290/index9.php&sandbox=0&usg=ALkJrhiu_oZNQDReoiZccP7HfL7b8xENUA

These temperatures look very similar to my results with the Kraken/Gelid Kit after a short load time; he says 15-20 min. Initially I was seeing 64C peak after a few benchmark runs, and after 3 hours of Tomb Raider VRM1 peaked at 71C.

I have yet to test for an extended period of time since I swapped out the thermal pads. Think I'm going to load up Tomb Raider and see where the VRM1 temps end up.


----------



## Zipperly

Quote:


> Originally Posted by *Roboyto*
> 
> Glad to see there is some positive information for the new Accelero with these cards. However, he doesn't mention whether he is overclocking the card as you were with the AX3. The only thing I came across was a German tech site which reported VRM1 at 117C under load:
> 
> http://translate.google.com/translate?sl=de&tl=en&u=http://ht4u.net/reviews/2014/arctic_accelero_iv_xtreme_im_test_auf_r9_290/index9.php&sandbox=0&usg=ALkJrhiu_oZNQDReoiZccP7HfL7b8xENUA
> 
> These temperatures look very similar to my results with the Kraken/Gelid Kit after a short load time; he says 15-20 min. Initially I was seeing 64C peak after a few benchmark runs, and after 3 hours of Tomb Raider VRM1 peaked at 71C.
> 
> I have yet to test for an extended period of time since I swapped out the thermal pads. Think I'm going to load up Tomb Raider and see where the VRM1 temps end up at.


Right, he does not mention whether he was overclocked, but even so, 60C is a respectable VRM temp and is in line with what I was getting with the Accelero 3 while running stock settings, so I'm betting it will do a similar job to what I was getting once some voltage and overclocking is applied.

Where a lot of people mess up is neglecting to run the cooler at 100 percent fan speed; as you can see from his results, it makes a tremendous difference in VRM temps. As for that German site, who knows what they may have gotten wrong during the installation - 117C is way too high, and much higher than I have heard even from others who neglect to run the fans at 100 percent.


----------



## Zipperly

Taken from that German site you linked:
Quote:


> The question is quickly answered. With only 590 RPM on the Xtreme IV, 117°C is reached on the voltage converters, according to the internal diode measurement. This is consistent with the behavior of the Xtreme III at a significantly higher GPU temperature.
> 
> At about 1,450 RPM, however, the converter temperatures drop significantly. Viewed this way, the two cooling variants are now on par, which certifies that Arctic's new approach is no less capable. At this point, though, we must note that we gave the Xtreme III a better passive cooling solution than the product actually ships with.


This is why the cooler must be run full tilt to achieve good VRM temps.


----------



## Roboyto

Quote:


> Originally Posted by *Zipperly*
> 
> Taken from that german site you linked.
> This is why the cooler must be run full tilt to achieve good vrm temps.


Yes, it dropped from 117C to 92C with 1450 RPM; fans max at 2k so 60C is plausible.

I love Arctic, but the size of the AX4 is a bit ridiculous and may not be possible for some. I would like to see some detailed results of what the Icy Vision can do with the enhancement kit, given its more compact size.

I'm running TR now to see what kind of a difference the FUE makes with the Gelid heatsink.


----------



## gobblebox

Quote:


> Originally Posted by *FuriousPop*
> 
> thats interesting - i have 3x of these cards and didnt have any issues with mine. however when i was removing the fan/heat sink to put on my water block - i did notice that there was a touch of spacing between the VRM1 and 2 from the stock cooler/heat sink and fans - maybe a clean and new thermal paste as well as some slightly more tightening of the screws on the PCB might help?
> 
> when i was running mine i would test each one with MSI AB (custom fan curve) on BF4 7560x1600 - medium settings and the car would never go above 65c core - VRM's were slightly more..


IIRC, before I replaced the stock thermal paste with ICD7, VRM1 temps were _much_ lower, I think around 60 instead of the current 85... perhaps I just didn't tighten the PCB enough (I was worried about tightening the screws too much)? I didn't even touch anything around VRM1, but didn't take a look at it either, so who knows what's actually going on under the Tri-X cooler... I'll tighten the screws and see if there's any difference.

If I don't see a drop in temps, I'll probably grab an Accelero IV - are FujiPoly pads or the Gelid VRM enhancement kit necessary if I were to go this route (I'm well learned in CPU overclocking/cooling, but hardly even a novice when it comes to GPUs)?

I _am_ overvolting, but only around 125+mv right now... what is considered safe (ruling out thermal walls) for the r9 290?

By the way, y'all are some badasses, thanks for all of the useful information!


----------



## Roboyto

Quote:


> Originally Posted by *gobblebox*
> 
> IIRC, before I replaced the stock thermal paste with ICD7, VRM1 temps were much lower, I think, around 60 instead of the current 85... *perhaps I just didn't tighten the PCB enough* ( was worried about tightening the screws too much)? I didn't even touch anything around VRM1, but didn't take a look at it either, so who knows what's actually going on under the Tri-X cooler... I'll tighten the screws and see if there's any difference.
> 
> If I don't see a drop in temps, I'll probably grab an Accelero IV - are *FujiPoly pads or the Gelid VRM enhancement kit necessary* if I were to go this route (I'm well learned in CPU overclocking/cooling, but hardly even a novice when it comes to GPUs)?
> 
> I am overvolting, but only around 125+mv right now... *what is considered safe* (ruling out thermal walls) for the r9 290?
> 
> By the way, y'all are some badasses, *thanks for all of the useful information*!


It is possible that, if you didn't tighten things enough, poor contact could be made with the VRMs.



Spoiler: Tri-X Cooler

The Fuji's aren't really necessary, but you will likely get better results with them.

If you go with the Arctic, then the Gelid Kit isn't going to be compatible due to how it attaches; unless you glue it on, which I wouldn't suggest.

"Safe" has a loose definition. If you're keeping temperatures in check, then odds are you can apply more voltage. Anything beyond stock voltage is likely going to accelerate degradation of the hardware though. You are best testing/benching the card to find the best blend of volts/clocks/temperatures/performance. With these cards core is always first, any RAM overclock is just an added bonus.

You're very welcome


----------



## HoneyBadger84

Quote:


> Originally Posted by *the9quad*
> 
> Watchdogs and Ubisoft would like to have a word with you about 3GB being enough. I fear it is only going to get worse with their titles as well.


Quote:


> Originally Posted by *rdr09*
> 
> they can always lower some settings like AA but the other alternative - 6GB version is just expensive if you think about it since the next gen will be coming soon.


Even with only SMAA on I've seen 3.6GB of vRAM usage (per GPU) in Watch_Dogs on Ultra, so turning AA down doesn't make that big a difference - and that's only at 1080p, single screen.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Zipperly*
> 
> Yes PWM fans but run them at 100 percent, its necessary for proper VRM cooling and is near silent at full blast. The difference between accelero 3 and 4 is the way they chose to cool the VRM's, with the accelero 4 its the backplate which acts as a heatsink cooling the vrms and memory and trust me it does do a good job but for even better cooling as I recommended before grab the Gelid VRM upgrade kit and install that in addition to the backplate, this will really help with the vrm temps.
> 
> I have additional vrm sinks on my GTX 780 as well as the backplate, this is what it looks like inside of my system but note that if it appears my card is slightly leaning it is only because I had not yet installed the support bracket which keeps the card straight, the support bracket is included with the cooler.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This is what I have installed onto my side door...


Woooooow, you have as much intake as I do, lol. Your temps should be like mine, 55-70C load max, if you had a 290X in that.


----------



## Jflisk

Quote:


> Originally Posted by *gobblebox*
> 
> IIRC, before I replaced the stock thermal paste with ICD7, VRM1 temps were _much_ lower, I think, around 60 instead of the current 85... perhaps I just didn't tighten the PCB enough ( was worried about tightening the screws too much)? I didn't even touch anything around VRM1, but didn't take a look at it either, so who knows what's actually going on under the Tri-X cooler... I'll tighten the screws and see if there's any difference.
> 
> If I don't see a drop in temps, I'll probably grab an Accelero IV - are FujiPoly pads or the Gelid VRM enhancement kit necessary if I were to go this route (I'm well learned in CPU overclocking/cooling, but hardly even a novice when it comes to GPUs)?
> 
> I _am_ overvolting, but only around 125+mv right now... what is considered safe (ruling out thermal walls) for the r9 290?
> 
> By the way, y'all are some badasses, thanks for all of the useful information!


Oh, by the way, the IC Diamond is going to scratch your GPU die. Not that it's going to hurt anything - just a heads up. I used it before; it does drop temps.


----------



## Zipperly

Quote:


> Originally Posted by *Roboyto*
> 
> Yes, it dropped from 117C to 92C with 1450 RPM; fans max at 2k so 60C is plausible.


Not really; that's what it was with my Accelero 3 at stock, and the guy on the UK forums reported the same. If you are not careful, it is possible to screw up the installation of the backplate that comes with the Accelero 4: get the thermal pads positioned in the wrong spot on the PCB and there goes your VRM cooling.
Quote:


> I love Arctic, but the size of the AX4 is a bit ridiculous and may not be possible for some.


Yes, for some configs it could be too big; as you can see in my rig, I have a ton of room with it installed.


----------



## Zipperly

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Woooooow you have as much intake as I do lol your temps should be like mine, 55-70C load max, if you had a 290X in that.


It was 64C max when I had my 290X in here with the Accelero 3 installed, and that was with a whopping 1.328V and 1200MHz on the core. If I ran the card at stock, the core temps were even lower.

Also this helps......


----------



## rdr09

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Even with only SMAA on I've seen 3.6GB of vRAM usage (per GPU) in Watch_Dogs on Ultra, so turning AA down don't make that big a diff, and that's only at 1080p, single screen.


smaa? i use smaa with my HD7770. lol. i've seen 3.3GB in BF4 MP using just 1080. 130% scale, though.


----------



## HoneyBadger84

Quote:


> Originally Posted by *rdr09*
> 
> smaa? i use smaa with my HD7770. lol. i've seen 3.3GB in BF4 MP using just 1080. 130% scale, though.


SMAA is one of the lowest AA settings in Watch_Dogs besides off. On my screen, because of its size (52" LED LCD HDTV), I don't really need to run high AA in that particular game, because it makes no visual difference on most edges. Pretty sure in Watch_Dogs it goes off - something - SMAA - Temporal SMAA - FXAA - 2x MSAA - 4x MSAA - 8x MSAA, or something to that effect. I think the game defaults to Temporal, but the difference between Temp & non-Temp SMAA is like... nothing, as far as I can tell, and Temporal adds a bit of vRAM usage & lowers FPS a tiny bit, so I turned it off.

Keep in mind I'm speaking from when I was only running the game on 2 290Xs, about 3 driver releases ago now. Haven't played it much since I started Folding etc, so 2 weeks or so, and I hadn't played it more than sparsely for about 2 weeks before that...


----------



## Romanion

Hi guys, I just built my rig in June with an ex-mining 290X. It hasn't been running up to my expectations, so I decided to use Unigine Valley scores to compare to other cards. This was my score.

Only 50fps compared to others with 70. Did I buy a defective card? It's a reference Sapphire card. I will be adding a custom water loop in a few weeks' time.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Romanion*
> 
> Hi guys, I just built my rig in june with a ex-mining 290x. It hasn't been running up to my expectations so I decided to use Unigine Valley scores to compare to other cards. This was my score.
> 
> Only 50fps as compared to others with 70. Did I buy a defective card? Its a reference Sapphire card. Will be adding a custom waterloop in a few weeks time.


BIOS check time. Post a screenshot of the primary tab in GPUz so we can have a look?


----------



## Romanion

Quote:


> Originally Posted by *HoneyBadger84*
> 
> BIOS check time. Post a screenshot of the primary tab in GPUz so we can have a look?


----------



## Meatdohx

Hi everyone.

I have an issue with my current R9 290 CrossFire setup. In two different games (Aliens: Colonial Marines and XCOM: Enemy Unknown) I had a system lockup that completely destroyed my video card drivers, to the point that I needed to either restore or reinstall Windows.

Every time I would try to either uninstall or install/repair, I would get a blue screen stating "system thread not handled".

The first card is on a water block and the second one is on the horrible reference cooler.

I have tried 3 things so far:

1- I had an 850W PSU. I changed it to a 1000W to be sure. I am sure it's not a power delivery issue.

2- I ran the fan of the second card at 100% and made sure I did not go over 75C, and I still reproduced the problem.

3- I updated the BIOS of my MSI Gaming Z87-G45 to the latest version.

I suspect my motherboard sucks at this point.

I also tried with my 4670K CPU both overclocked to 4.5 and at stock, and I was able to reproduce the problem with both settings.

The issue does not happen in Wildstar or BF4, and I play those games a lot more.

I am open to suggestions. Please help!

English is not my native language! Sorry for the bad syntax.


----------



## the9quad

That's happened to me as well in other games, twice now. Did you get a checkerboard-pattern flash periodically prior to it happening? That's what mine did, and that is with absolutely no overclock and the cards not even stressed. My guess is it has something to do with CrossFire and the 2D clocks remaining in effect when it should be swapping to 3D. I think some games screw it up and it results in this phenomenon.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Romanion*


Specifically what type of Sapphire card is this? A stock reference card, Tri-X, Tri-X OC, Vapor-X? Regardless, it should be showing Sapphire/PCPartner in SubVendor. Which BIOS setting do you currently have the card on? Try flipping the switch to the other BIOS & see if it's different under SubVendor (note: you should reinstall drivers if you shut down & switch the BIOS, as it will make the drivers act funny otherwise - or at least it did for me). Your BIOS ID also doesn't look familiar to me, & I've had 3 different Sapphire cards, so it may be that the previous owner flashed a different BIOS. If so, that's an easy fix, as you can flash it to a proper Sapphire-card BIOS pretty easily.

Everything else looks fine; the main thing I wanted to check is whether it had a specific mining BIOS that locks down half the ROPs, which can result in a pretty noticeable performance difference between it and other 290Xs.


----------



## Romanion

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Specifically what type of Sapphire card is this? A stock reference card, Tri-X, Tri-X OC, Vapor-X? Regardless, it should be showing Sapphire/PCPartners in SubVendor. Which BIOS setting do you currently have the card on? Try switching the Switch to the other BIOS & see if it's different under SubVendor (note: you should reinstall drivers if you shut down & switch the BIOS as it will make the drivers act funny otherwise, or at least it did for me). Your BIOS ID also doesn't look familiar to me & I've had 3 different Sapphire cards, so it may be that the previous owner switched the BIOS. If so, that's an easy fix as you can just flash it to a proper Sapphire-card BIOS pretty easily.
> 
> Everything else looks fine, main thing I wanted to look at is if it had a specific mining BIOS that locks down half the ROPs which can result in a pretty noticable performance difference between it and other 290Xs.


It's a stock reference card. I have the BIOS switch set to "UBER" and not quiet.
I've tried swapping the BIOS via the switch previously, but nothing happened. I think I'll try to flash the BIOS. How should I go about doing that?

Do I flash to this BIOS? http://www.techpowerup.com/vgabios/150748/sapphire-r9290x-4096-131121.html


----------



## HoneyBadger84

Quote:


> Originally Posted by *Romanion*
> 
> Its a stock reference card. I have the bios setting to "UBER" and not quiet.
> I've tried to swap the bios settings via the switch previously but nothing happened. I think I'll try to flash the bios. How should I go about doing that?


So both BIOSes perform the same? Meh, yeah you'll have to find a default BIOS & try that then. There's a thread on BIOS flashing, find that, I used ATIFlash to fix one of the cards I got with that issue.
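For anyone following along, a typical ATIFlash session from an elevated command prompt looks roughly like this. The adapter index (0) and ROM filenames are placeholders - always save a backup of your current BIOS before programming anything:

Code:


atiflash -i
atiflash -s 0 backup.rom
atiflash -p 0 sapphire290x.rom

-i lists the installed adapters and their BIOS info, -s saves the existing BIOS from the given adapter to a file, and -p programs the new ROM onto it. Reboot and reinstall drivers afterwards.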


----------



## heroxoot

Depends on what driver you guys are on. I've heard of multiple problems on 14.7. On 14.6 V2 I only get checker artifacts if my OC is bad. I have not tried 14.7 because I BSOD'd on stock.


----------



## Meatdohx

Quote:


> Originally Posted by *the9quad*
> 
> That's happened to me as well in other games twice now. Did you get a checkerboard pattern flash periodically prior to it happening? That's what mine did, and that is with absolutley no overclock and the cards not even stressed. My guess is it has something to do with crossfire and the 2d portion remaining in effect when it should be swapping to 3d. I think some games screw it up and it results in this phenomena.


So... it happens even on an X79 board.

Maybe we should disable ULPS in MSI Afterburner...

I really don't like having Catalyst Control Center and MSI Afterburner installed at the same time, though.
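For what it's worth, Afterburner's ULPS toggle just flips a registry value, so it can be done by hand without keeping Afterburner installed. A rough sketch under assumptions: the display-class GUID below is the standard one, but the `0000`/`0001` subkey indices vary per system, and `EnableUlps` only exists under the AMD driver's subkeys, so treat the exact path as an example. The dry-run wrapper means nothing is written when you just run it as-is.

```shell
# Hedged sketch: disabling ULPS by hand via the registry (Windows admin prompt).
# Only the subkeys that already contain an EnableUlps value belong to the AMD driver;
# search the 0000/0001/... subkeys under the display-class GUID first.
DRY_RUN=1
run() { if [ "$DRY_RUN" = "1" ]; then printf '+ %s\n' "$*"; else "$@"; fi; }

KEY='HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0001'
run reg add "$KEY" /v EnableUlps /t REG_DWORD /d 0 /f   # 0 = ULPS off for the secondary card
```

Reboot afterward; setting the value back to 1 re-enables ULPS.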


----------



## gobblebox

Quote:


> Originally Posted by *Jflisk*
> 
> Oh, by the way, the IC Diamond is going to scratch your GPU die. Not that it's going to hurt anything, just a heads up. I used it before; it does drop temps.


Yeah, I'm aware of it scratching, but I don't plan on changing it out very often at all; the next best thing I have is some Shin-Etsu X23-7783D. I'll tighten the screws tonight to see if I get better contact.


----------



## kizwan

Quote:


> Originally Posted by *HoneyBadger84*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Romanion*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Specifically what type of Sapphire card is this? A stock reference card, Tri-X, Tri-X OC, Vapor-X? Regardless, it should be showing Sapphire/PCPartners in SubVendor. Which BIOS setting do you currently have the card on? Try switching the Switch to the other BIOS & see if it's different under SubVendor (note: you should reinstall drivers if you shut down & switch the BIOS as it will make the drivers act funny otherwise, or at least it did for me). Your BIOS ID also doesn't look familiar to me & I've had 3 different Sapphire cards, so it may be that the previous owner switched the BIOS. If so, that's an easy fix as you can just flash it to a proper Sapphire-card BIOS pretty easily.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Everything else looks fine, main thing I wanted to look at is if it had a specific mining BIOS that locks down half the ROPs which can result in a pretty noticeable performance difference between it and other 290Xs.
Click to expand...

The subvendor on my reference Sapphire 290 cards is also ATI (with the stock VBIOS). If you check, VBIOS versions 015.039 to 015.041 will have ATI as the subvendor, & VBIOS 015.042 to 015.044 will have Sapphire/PCPartner as the subvendor. I think 015.039 is the stock VBIOS for the reference Sapphire 290/290X, & 015.041 is a VBIOS update.
Quote:


> Originally Posted by *gobblebox*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jflisk*
> 
> Oh, by the way, the IC Diamond is going to scratch your GPU die. Not that it's going to hurt anything, just a heads up. I used it before; it does drop temps.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, I'm aware of it scratching, but I don't plan on changing it out very often at all; the next best thing I have is some Shin-Etsu X23-7783D. I'll tighten the screws tonight to see if I get better contact.
Click to expand...

You can prevent that from happening if you use a proper solvent when cleaning it off.


----------



## keikei

Quote:


> Originally Posted by *Meatdohx*
> 
> Hi everyone.
> 
> I have an issue with my current R9 290 Crossfire setup. In 2 different games (Aliens: Colonial Marines and XCOM: Enemy Unknown) I had a system lockup that completely destroyed my video card drivers, to the point I needed to either restore or reinstall Windows.
> 
> Every time I would try to either uninstall or install/repair, I would get a blue screen stating "system thread exception not handled".
> 
> The first one is on a water block and the second one is with the horrible reference cooler.
> 
> I tried 3 things so far.
> 
> 1- I had an 850W PSU. I changed it to a 1000W to be sure. I am sure it's not a power delivery issue.
> 
> 2- I ran the fan of the second card at 100% and made sure it did not go over 75°C, and I still reproduced the problem.
> 
> 3- I updated the BIOS of my MSI Z87-G45 Gaming to the latest update.
> 
> I suspect my motherboard sucks at this point.
> 
> I also tried with my 4670K CPU OC'd to 4.5GHz and at stock, and I was able to reproduce the problem with both settings.
> 
> The issue does not happen in Wildstar or BF4, and I play those games a lot more.
> 
> I am open to suggestions.
> 
> Please help me
> 
> English is not my native language! Sorry for the bad syntax.


Have you tried older AMD drivers?


----------



## falcon26

Well, I already sold my 290X. I'll pick up a 780 sometime this week. The 290X is a great card, no doubt, but in my case it was just too hot and loud.


----------



## HoneyBadger84

Quote:


> Originally Posted by *falcon26*
> 
> Well I already got my 290X sold. I'll pick up a 780 sometime this week. The 290X is a great card no doubt, but in my case it was just too hot and loud.


Downgrade much? Why not get a 290X with an aftermarket cooler on a reference board, for future liquid cooling?


----------



## Meatdohx

Quote:


> Originally Posted by *keikei*
> 
> Have you tried older amd drivers?


That I did not. These tests were all with 14.6 and 14.7 (RC) because of the Wildstar optimization. I don't think I want to live in a world where my graphics drivers are not optimized for this game, haha.

I could try it for fun. Aliens: Colonial Marines is an instant fatal crash. Such a crappy game, but I love ALIENS and the story...


----------



## pnoozi

Quote:


> Originally Posted by *falcon26*
> 
> Well I already got my 290X sold. I'll pick up a 780 sometime this week. The 290X is a great card no doubt, but in my case it was just too hot and loud.


Oh well. Better luck with the next card!


----------



## the9quad

Quote:


> Originally Posted by *falcon26*
> 
> Well I already got my 290X sold. I'll pick up a 780 sometime this week. The 290X is a great card no doubt, but in my case it was just too hot and loud.


I can understand your frustration and see why you would make that choice. These cards and drivers have been problematic for some.


----------



## Gobigorgohome

Hello guys, when I was removing the reference coolers from my cards I noticed that I had two cards with Hynix memory and two with Elpida. Will these cards hold the same overclock when I overclock them? I am just looking for a 1080 to 1100MHz core overclock and a memory clock that works okay (I have never tried to overclock these cards yet). My rig will probably be up and running in a few days (waiting for CPU and MB water blocks). I will be running them water cooled with Fujipoly Ultra Extreme thermal pads and the EK-Terminal for quad Crossfire. Which clocks are best for 24/7 stability and at 4K?

Thanks in advance.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Hallo guys, when I was removing the reference coolers from my cards I noticed that I had two cards with Hynix and two cards with Elpida, will these cards stay at the same overclock when I overclock them? I am just looking for 1080 to 1100 core overclock and a memory clock that works okay. (I have never tried to overclock these cards yet), my rig will probably be up and running in a few days (waiting for CPU and MB water blocks). Will be running them water cooled with Fujipoly Ultra Extreme thermal pads and the EK-Terminal for quad crossfire. Which clocks is best for stability 24/7 and at 4K?
> 
> Thanks in advance.


Different vRAM types won't matter that much. Because you have Elpida chips, it's likely your stable vRAM OC won't be as high as some others'; from what I've heard, Elpida chips can't hit the OCs Hynix can. It's not a HUGE difference, mind you, but it's there. My current Elpida cards can only run 1450MHz vRAM for benchmarking, whereas the Hynix cards I had in the past hit 1550MHz at the same voltage with zero stability issues. There were even 1-2 benchmarks I had to run at 1425MHz vRAM on the Elpida cards because they showed instability at 1450MHz with the same voltage that previously handled 1550MHz on the Hynix cards.

Now, having said that, as others have said & results have shown, OCing the vRAM really doesn't do a whole lot on these cards, because the bus & overall speed is already so stupid fast.

But overall, having mixed cards won't cause any direct issues of any kind.


----------



## Maracus

Quote:


> Originally Posted by *the9quad*
> 
> That's happened to me as well in other games, twice now. Did you get a checkerboard pattern flash periodically prior to it happening? That's what mine did, and that is with absolutely no overclock and the cards not even stressed. My guess is it has something to do with Crossfire and the 2D portion remaining in effect when it should be swapping to 3D. I think some games screw it up and it results in this phenomenon.


That's odd; I only had this when overclocking, and on a single-card setup. I thought it was due to high clock frequencies combined with vdroop/voltage. Anyhow, I'll test the theory when I get home next week.


----------



## falcon26

Actually, I picked up a used MSI Gaming 780 Ti.


----------



## the9quad

Quote:


> Originally Posted by *falcon26*
> 
> Actually I picked up a used msi gaming 780 TI


That is a baller card, hope you're happy with it.


----------



## Zipperly

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Downgrade much? Why not get a 290X with an aftermarket cooler that's a reference board, for future liquid cooling?


It isn't a downgrade if he is into overclocking.

Edit: NVM, just saw he picked up a Ti. Nice!


----------



## schoolofmonkey

Hey guys.
I'm thinking about picking up an ASUS Radeon R9 290X DirectCU II OC 4GB; they are on special here for $629AU.
After having a really crappy run with Nvidia cards, I'm thinking of switching.

Monitor setup is BenQ [email protected]
I do have an EVGA GTX 780 Ti, but it's going back for RMA again, so I was going to get the AMD instead.

Do you think I would notice any visible frame or performance loss?
How are the real-world temps on these cards (not on a test bench..lol)? I will be putting it in an Enthoo Primo.
Any other issues, like coil whine, etc.?

Thanks


----------



## Germanian

http://www.3dmark.com/3dm/3716046?
Graphics Score 19456

Core 1061MHz / VRAM 1350MHz
+56mV core voltage, +6mV AUX voltage
+50 power limit
73% manual fan speed

Had some throttling in between, but nothing major; the score should be comparable.

PS: The 4770K is at practically stock clocks (3.9GHz), hence the bad CPU/Physics score (need money for watercooling gear, hopefully by next month).


----------



## the9quad

Quote:


> Originally Posted by *schoolofmonkey*
> 
> Hey guys.
> I'm thinking about picking a ASUS Radeon R9 290X DirectCU II OC 4GB up, they are on special here for $629AU.
> After having a really crappy run with Nvidia card I'm thinking of switching.
> 
> Monitor setup is Benq [email protected]
> I do have a evga GTX780ti, but its going back for RMA again so was going to get the AMD instead.
> 
> You think I would notice any visible frame or performance loss?
> How are the real would temps on these card (not on a test bench..lol), I will be putting it in a Enthoo Primo.
> Any other issues like coil whine etc.
> 
> Thanks


The Ti is probably marginally (a few fps) faster at that res in some games, and the 290X is faster in fewer. Increase the res and it reverses, with the 290X being marginally faster in more games. Nothing game-changing either way.


----------



## Sgt Bilko

Quote:


> Originally Posted by *schoolofmonkey*
> 
> Hey guys.
> I'm thinking about picking a ASUS Radeon R9 290X DirectCU II OC 4GB up, they are on special here for $629AU.
> After having a really crappy run with Nvidia card I'm thinking of switching.
> 
> Monitor setup is Benq [email protected]
> I do have a evga GTX780ti, but its going back for RMA again so was going to get the AMD instead.
> 
> You think I would notice any visible frame or performance loss?
> How are the real would temps on these card (not on a test bench..lol), I will be putting it in a Enthoo Primo.
> Any other issues like coil whine etc.
> 
> Thanks


Don't get the DCU II unless you are planning on watercooling; its air cooling is underwhelming due to the fact that it's a 780 cooler slapped on the 290X die.

I assume you are talking about the ones from PCCG? Strange they don't have any other stock in... Scorptec might, though.

You could either just get the 780 Ti back, or grab a couple of R9 290s for some really high fps.

EDIT: Scorptec has the Gigabyte R9 290x OC for $649 if that takes your fancy instead?


----------



## HoneyBadger84

Quote:


> Originally Posted by *schoolofmonkey*
> 
> Hey guys.
> I'm thinking about picking a ASUS Radeon R9 290X DirectCU II OC 4GB up, they are on special here for $629AU.
> After having a really crappy run with Nvidia card I'm thinking of switching.
> 
> Monitor setup is Benq [email protected]
> I do have a evga GTX780ti, but its going back for RMA again so was going to get the AMD instead.
> 
> You think I would notice any visible frame or performance loss?
> How are the real would temps on these card (not on a test bench..lol), I will be putting it in a Enthoo Primo.
> Any other issues like coil whine etc.
> 
> Thanks


I would caution against getting the DCU II-cooled R9 290X. From what I've heard from user feedback, as well as some reviews, that cooler is absolute trash on that card compared to some others (Sapphire's coolers, Gigabyte's WindForce). The DCU II brand went from being one of the better coolers out there to being pretty meh over the past few iterations, not to mention I think it's simply not capable of handling the R9 290X's heat output very well without a redesign.

A couple of reviews state that card runs into the mid-80s °C with the stock fan profile... that's pretty crazy IMO for such a supposedly uber aftermarket cooler. When I left the Gigabyte WindForce card I had on its default fan profile, it didn't pass 81°C even in Unigine Valley loops, and it capped itself out in the 50% fan range, which was inaudible.

That's the other bad aspect the DCU II cooler supposedly has: it's noisy.

I haven't owned one myself, so all of that is based off what I've read in reviews & heard from other people... but it was enough to make me not bother getting one when I had a few opportunities to. If you can, try to find a Sapphire Tri-X or a Gigabyte WindForce card. I think you'll be much happier with either one, if you can find them at relatively the same price.


----------



## schoolofmonkey

@Sgt Bilko @HoneyBadger84

Thanks guys.

I made the mistake of ordering right before I got your posts, but I've sent a cancellation request.

I've just had so many issues with GTX 780 Tis since Jan. I bought the big boy on the block (Classy), and now it's got fan issues on 2 coolers; someone at the EVGA forums sent me theirs and it doesn't click, but the whine is horrible.

I've spent more time RMAing GTX 780 Ti cards than actually using them.


----------



## Sgt Bilko

Quote:


> Originally Posted by *schoolofmonkey*
> 
> @Sgt Bilko @HoneyBadger84
> 
> Thanks guys.
> 
> I made the mistake of ordering right before I got your posts, but I've sent a cancellation request.
> 
> I've just has so many issues with the GTX780ti since Jan, so I bought the big boy on the block (Classy) and now its got fan issues, on 2 coolers, someone at the eVGA forums sent the theirs and it doesn't click, but the whine is horrible.
> 
> I've spent more time RMA GTX780ti cards than actually using them.


No worries; in every other GPU line the DCU II is a good cooler, but not so for Hawaii, unfortunately.

The best ones are the Vapor-X and the Lightning, but they carry a price premium.

Other than those, the Gigabyte WF3, Sapphire Tri-X, XFX DD and the MSI Gaming are all good cards, but it seems only the Giga and Asus are available through PCCG and Scorptec... plenty of R9 290s around, but you'll be happy with the Classy, I hear nothing but good things about them.


----------



## rdr09

Quote:


> Originally Posted by *schoolofmonkey*
> 
> @Sgt Bilko @HoneyBadger84
> 
> Thanks guys.
> 
> I made the mistake of ordering right before I got your posts, but I've sent a cancellation request.
> 
> I've just has so many issues with the GTX780ti since Jan, so I bought the big boy on the block (Classy) and now its got fan issues, on 2 coolers, someone at the eVGA forums sent the theirs and it doesn't click, but the whine is horrible.
> 
> I've spent more time RMA GTX780ti cards than actually using them.


Not sure what games you are going to play or what your budget is, but for 144Hz you may need more than one high-end GPU. With 2, though, your CPU might hold them back.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> not sure what games you are going to play and your budget but for 144Hz you may need more than one highend gpu. but with 2, your cpu might hold them back.


2x R9 290s would be hitting 144Hz pretty easily; I ran mine at 1080p with my CPU and was getting good frames in almost all games.


----------



## schoolofmonkey

Quote:


> Originally Posted by *Sgt Bilko*
> 
> but you'll be happy with the Classy, i hear nothing but good things about them them


I have the Classy now; good performer, but 2 crappy coolers in a row with hideously loud fan whine and clicking.
On the first one the fan was physically off-center and sitting higher on one side; the second whines so loudly, and at a frequency that hurts your ears.

Not to mention the 2 artifacting Gigabyte cards, 2 overheating Asus cards, and 2 Galaxys that buzzed loudly; I haven't had a good run at all.

So I was looking to go AMD; it might be better, though it sounds like the Asus one has crap cooling like the GTX 780 Ti DCU II...
Quote:


> Originally Posted by *rdr09*
> 
> not sure what games you are going to play and your budget but for 144Hz you may need more than one highend gpu. but with 2, your cpu might hold them back.


I have to update my rig.
I'm running a 4790K and 16GB of RAM on a Gigabyte Z97X Gaming G1 Black.


----------



## HoneyBadger84

Quote:


> Originally Posted by *schoolofmonkey*
> 
> @Sgt Bilko @HoneyBadger84
> 
> Thanks guys.
> 
> I made the mistake of ordering right before I got your posts, but I've sent a cancellation request.
> 
> I've just has so many issues with the GTX780ti since Jan, so I bought the big boy on the block (Classy) and now its got fan issues, on 2 coolers, someone at the eVGA forums sent the theirs and it doesn't click, but the whine is horrible.
> 
> I've spent more time RMA GTX780ti cards than actually using them.


I'd ask if you can switch the order to a Gigabyte R9 290X, if they'll let you. I've heard that card has some quality control issues like the 780 Ti Classy does, BUT less so since the initial batches sold out. And like I said, the one I had ran cool and quiet & was a decent OCer without anything but the fan being set to 100%; it ran 1150/1550 without breaking a sweat.


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 2 x R9 290's would be hitting 144hz pretty easy, i ran mine on 1080p with my cpu and was getting good frames in almost all games


That's what I was going to suggest, but these cards get hot on air, especially in Crossfire.

Quote:


> Originally Posted by *schoolofmonkey*
> 
> I have the Classy now, good performer, 2 crappy coolers in a row with hideously loud fan whine and clicking.
> The first one the fan was physically off center and sitting higher on one side, the second whines so loud and at a frequency that hurts your ears.
> 
> Not to mention the 2 artifacting Gigabyte cards, 2 Overheating Asus card, and 2 Galaxy's that buzzed loudly, haven't had a good run at all.
> 
> So I was looking to go AMD, might be better, sounds like the Asus one has crap cooling like the GTX780ti DCUII...
> I have to update my rig.
> I'm running a 4790k and 16GB ram on a Gigabyte z97x Gaming G1 Black.


Either 2x 290s (Tri-X) or a single MSI 290X Lightning (best aftermarket cooler, if you are really concerned about heat and noise).


----------



## Sgt Bilko

Quote:


> Originally Posted by *HoneyBadger84*
> 
> I'd ask if you can switch the order to a Gigabyte R9 290X if they'll let you. I've heard that card has some quality control issues like the 780 Ti Classy does, BUT less so since the initial batches sold out, and like I said, the one I had ran cool, quiet & was a decent OCer without anything but the fan being set to 100%, it ran 1150/1550 without breaking a sweat.


The Giga card is at a different store. I still think he'd be better off with CF 290s over a single 290X though, especially at 1080p.


----------



## HoneyBadger84

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The Giga card is a different store, I still think he'd be better off with CF 290's over a single 290x though, especially at 1080p


Agreed, depending on his budget. Heat would be a bit of an issue in CF, but the performance of Crossfired 290s at 1080p is pretty sexy in any game that scales with Crossfire worth a crap, which is the majority these days.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> that's what i was going to suggest but these cards get hot on air, especially crossfire.
> either 2 290 (Tri X) or a single MSI 290X Lightning (best aftermarket cooler if you are really concerned about heat and noise).


The case looks pretty good for airflow, and the board has heaps of room between the slots for clean air, so I'd say Crossfire should be fine with a couple of Tri-Xs or WF3s.


----------



## the9quad

The mobo also looks like it has plenty of space for 2 cards in Crossfire with aftermarket coolers to breathe; it should be cool enough without going water. It's not like he is going to sammich 3 or 4 of 'em on there.

And then I look above me and see you said the same thing, lol...


----------



## schoolofmonkey

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Agreed depending on his budget.


Well, the Classified was $970, so that should nearly cover 2 290s..

Thought I could pick up the Asus one as it was on special, but I'm not wasting money if it's that bad.
Already been through that..lol


----------



## Sgt Bilko

Quote:


> Originally Posted by *schoolofmonkey*
> 
> Well the Classified was $970, so that should nearly cover 2 290's..
> 
> Thought I could pick up the Asus one as it was on special, but not wasting money if its that bad.
> Already been through that..lol


Yeah, Classy are expensive here

Here are your choices for 290s from PCCG: the Giga WF3 ($450), Sapphire Tri-X OC ($500), and the XFX DD ($450).

I can vouch for the XFX cards working well for temps in Xfire; they get a tad warm during summer though, so the triple-fan designs of the Tri-X and WF3 might be more to your liking.


----------



## schoolofmonkey

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah, Classy are expensive here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Giga WF3


They have put a 1-per-customer limit on it at the moment..


----------



## Sgt Bilko

Quote:


> Originally Posted by *schoolofmonkey*
> 
> They have put at 1 per customer limit on it at the moment..


Quote:


> Bargain price - $80 off! Normally $529, limit *two* per customer, strictly while stocks last!


Nope....


----------



## HoneyBadger84

Quote:


> Originally Posted by *schoolofmonkey*
> 
> They have put at 1 per customer limit on it at the moment..


You could do what I did when I first started getting 290Xs: get one 290 Tri-X (for the top slot, as they run cooler) and one 290 WindForce (run it on the bottom). They should be about level in temps, with the Tri-X getting a bit of air off the WindForce's exhaust; both will run cool & quiet, and they should be clocked around the same, so they'll get along just fine.


----------



## keikei

Quote:


> Originally Posted by *schoolofmonkey*
> 
> They have put at 1 per customer limit on it at the moment..


It may just refer to the order, like one per order. Try ordering another on a separate order, of course. Worked for me at superbiz; they had a one-per-customer-order rule as well.


----------



## Sgt Bilko

Quote:


> Originally Posted by *HoneyBadger84*
> 
> You could do what I did at first when I first started getting 290Xs. Get one 290 Tri-X (for the top card as they run cooler) and one 290 WindForce (run it on bottom). They should be about level in temps from the Tri-X getting a bit of air off the WindForce's exhaust, both will run cool & quiet, and they should be clocked around the same so they'll get along just fine.


Quote:


> Originally Posted by *keikei*
> 
> It may just refer to the order. Like one per order. Try ordering another on a separate order of course. Worked for me at superbiz. They had a one per customer order rule as well.


Guys, it's a 2 per customer limit : http://www.pccasegear.com/index.php?main_page=product_info&cPath=193_1575&products_id=26600


----------



## schoolofmonkey

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Guys, it's a 2 per customer limit : http://www.pccasegear.com/index.php?main_page=product_info&cPath=193_1575&products_id=26600


Nah, that's cool; I just saw an image with 1 per customer on it:

http://www.pccasegear.com/UserFiles/GV-R929WF3-4GD-banner.jpg


----------



## Sgt Bilko

Quote:


> Originally Posted by *schoolofmonkey*
> 
> Nah thats cool, I just saw a image with 1 per customer on it.,
> 
> http://www.pccasegear.com/UserFiles/GV-R929WF3-4GD-banner.jpg


I didn't actually notice that, lol.

They did have a sale a while ago for the OC editions... maybe they just re-used it?


----------



## schoolofmonkey

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I didn't actually notice that lol
> 
> They did have a sale a while ago for the OC editions....maybe they just re-used it?


I got caught out by PCCG once before.
I bought a Tegra Note 7 for my son's bday; didn't go into the product page, just bought it from the list. Turned out I could only pay with PayPal, stuffed around for days, and it arrived after his birthday.
I make a habit of reading everything on the product page now, though it did let me add 2 to the cart and I made it all the way to paying for it.
Guess they take the money, then refund it after...lol

Just have to wait for the RMA before going ahead with it now, but the Xfire sounds like a good option.


----------



## Sgt Bilko

Quote:


> Originally Posted by *schoolofmonkey*
> 
> I got caught with PCCG once before.
> Bought a Tegra Note 7 for my sons bday, didn't go into the product page, just bought it from the list, turned out I could only pay with paypal, stuffed around for days, it arrived after his birthday.
> I make a habit of reading everything on the product page now, though it did let me ad 2 to the cart and I made it all the way to pay for it.
> Guess they take the money, then refund it after...lol
> 
> Just have to wait for the RMA before going ahead with it now, but the xfire sounds like a good option.


Only with PayPal? That's kinda weird...

Either way, hope you're happy with the 290s. Just set yourself a custom fan profile in MSI Afterburner or OC Guru etc. to ramp the fans up if they get warm, and some side airflow never goes unwanted with multi-GPU setups.

Enjoy them man!


----------



## kizwan

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Hallo guys, when I was removing the reference coolers from my cards I noticed that I had two cards with Hynix and two cards with Elpida, will these cards stay at the same overclock when I overclock them? I am just looking for 1080 to 1100 core overclock and a memory clock that works okay. (I have never tried to overclock these cards yet), my rig will probably be up and running in a few days (waiting for CPU and MB water blocks). Will be running them water cooled with Fujipoly Ultra Extreme thermal pads and the EK-Terminal for quad crossfire. Which clocks is best for stability 24/7 and at 4K?
> 
> Thanks in advance.


It all depends on your cards' overclockability, which pretty much comes down to the silicon lottery. Your target of a moderate overclock should be easily achievable.


----------



## Mega Man

Anyone here use an MST hub?


----------



## flamin9_t00l

I bought my 3rd 290 a couple of weeks back, in the form of the PowerColor PCS+ R9 290. It's a great card, but the fan profile is very aggressive and causes quite a bit of noise; it needs to be, though, at the stock clocks and voltages (1040MHz core / 1350MHz mem / +50mV). The core is fine, but VRM1 gets a little toasty. The fans reach over 85%, which is getting into reference-cooler levels of noise. It's summer here now with ambient temps over 25°C, so I'm sure the noise will be much less as winter gets closer. I'm not having a go at the card, just pointing out the flaws at the stock clocks/volts.

I'm currently running a downclock of 1000MHz core and 1300MHz memory with a custom AB fan profile to tame the noise and temps (especially VRM1) until cooler ambients. Noise levels are much more respectable with these clocks and a 1:1 temp/fan-speed setup... it barely gets past 70°C core, and of course the same for the fan, 70%. I had seen VRM1 as high as 91°C at stock, which is partly why I'm downclocked/downvolted; it was a really hot day though. VRM1 now hovers around 75°C and I haven't seen it hit 80°C yet.

Overall I'm very pleased with the card. I had to bump the power limit a little to prevent some minor downclocking at stock, but otherwise it's in great shape and performing brilliantly in my new Z97 rig. I did notice that with +50mV on the PCS+, the actual voltage reported during load was similar to or less than both of my other 2 cards at stock volts, which I found a little odd.

Anyway, those are my thoughts on the PCS+ 290... starting to feel like a review, lol.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah, Classy are expensive here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here are your choice for 290's from PCCG, the Giga WF3 ($450), Sapphire Tri-X OC ($500), and the XFX DD ($450)
> 
> i can vouch for the XFX cards working well for temps in Xfire, they get a tad warm during summer though so ther Triple fan designs of the Tri-X and a WF3 might be more to your liking


IIRC the XFX DD 290s have poor VRM temps, which is why I avoided one when buying my 3rd card, but please correct me if I'm wrong.


----------



## Mega Man

Well, if anyone does use an MST hub: I can only activate 2 of my 1080p monitors,

using the newest drivers, directly connected by DP (all 3).

I can use them with 2 via the MST hub and one via DL-DVI.

However, I should be able to use them solely on the MST hub. I know the MST hub works;

it will load all 3 screens at boot-up.

Any ideas / any info I can give to help?


----------



## spikezone2004

I have finally narrowed the choices for my R9 290 block; I have decided on either the Aqua Computer block or the EK-FC R9-290X Waterblock - Copper Acetal (Rev. 2.0).

The main thing is that in stren's tests the AC block gets good VRM temps with the backplate, but it seems to be hard to get. As for the EK block, people say the thermal pads suck and to upgrade them, and I don't want to buy a block and then have to pay more to upgrade the pads on it.

Which of the two do you guys think I should get? I'm not worried about looks, just the best temperatures you can get with those two blocks.


----------



## renzkuken1

Hey guys,

I'm a total noob when it comes to AMD. This is my first card, and I have a Gigabyte R9 290X under water.
What do I need to do? I bought the card secondhand with the block already installed.
Do I need to update the card's BIOS? Should I use OC Guru II as the program to overclock it? (It's Gigabyte's one, apparently.)

Cheers in advance for the help, and for saving me browsing through the forum for hours on end.


Chris.


----------



## kizwan

Quote:


> Originally Posted by *spikezone2004*
> 
> I have finally narrowed my choices for my R 9 290 block, I have decided either the aqua computers block or the EK-FC R9-290X Waterblock - Copper Acetal (Rev.2.0).
> 
> Main thing is on stens tests the ac block gets good vrm temps with backplate, but it seems to be hard to get. But the ek block people say the thermal pads suck and to upgrade them. I don't want to buy a block to have to pay more to upgrade the pads on it.
> 
> What do you guys think I should get out of those two? I'm not worried bout looks just the best temperatures you can get with those two blocks


The Aquacomputer Kryographics block is strictly for reference cards, while the EK Rev. 2.0 block covers both reference and non-reference cards. What card do you have? If you have the MSI Gaming card, remember there are two revisions of that card, and the latest is not compatible with the EK Rev. 2.0 block. The Sapphire Tri-X also has two revisions; the first rev. is a reference card, while the later rev. is not compatible with the EK Rev. 2.0 block, if I'm not mistaken.
Quote:


> Originally Posted by *renzkuken1*
> 
> Hey guys,
> 
> I'm a total noob when it comes to AMD. This is my first card, and I have a Gigabyte R9 290X under water.
> What do I need to do? I bought the card secondhand with the block already installed.
> Do I need to update the card BIOS? Should I use OC Guru II as the program to overclock it? (It's Gigabyte's, apparently.)
> 
> Cheers in advance for the help and for saving me browsing through the forum for hours on end.
> 
> Chris.


Try running the card as it is. Play some games and see whether the card works as it should. Then you can decide whether to flash the BIOS or make any other changes.

For overclocking, you can use either MSI Afterburner v3 or Sapphire TriXX.


----------



## rdr09

Quote:


> Originally Posted by *schoolofmonkey*
> 
> I have the Classy now; good performer, but 2 crappy coolers in a row with hideously loud fan whine and clicking.
> On the first one the fan was physically off-center and sitting higher on one side; the second whines so loudly, and at such a frequency, that it hurts your ears.
> 
> Not to mention the 2 artifacting Gigabyte cards, 2 overheating Asus cards, and 2 Galaxys that buzzed loudly. I haven't had a good run at all.
> 
> So I was looking to go AMD. Might be better, though it sounds like the Asus one has crap cooling like the GTX 780 Ti DCUII...
> I have to update my rig.
> I'm running a 4790k and 16GB ram on a Gigabyte z97x Gaming G1 Black.


That CPU turbos to 4.4GHz, so there's pretty much no need to OC, but this is OCN.


----------



## alancsalt

Quote:


> Originally Posted by *Mega Man*
> 
> Well, if anyone here uses an MST hub: I can only activate 2 of my 1080p monitors
> 
> using the newest drivers with all 3 connected directly by DP.
> 
> I can run them with 2 via the MST hub and one via DL-DVI.
> 
> However, I should be able to run them solely on the MST hub. I know the MST hub works;
> 
> it will load all 3 screens at boot up.
> 
> Any ideas? Any info I can give to help?


Wermad made a thread about the MST hub.


----------



## Romanion

Quote:


> Originally Posted by *HoneyBadger84*
> 
> So both BIOSes perform the same? Meh, yeah you'll have to find a default BIOS & try that then. There's a thread on BIOS flashing, find that, I used ATIFlash to fix one of the cards I got with that issue.


Wow that fixed it! Thanks!


----------



## HoneyBadger84

Quote:


> Originally Posted by *Romanion*
> 
> Wow that fixed it! Thanks!


It's odd that so many Sapphire cards seem to have that issue. The first card I saw it on (a weird/faulty BIOS) was an XFX, but I also had a Sapphire with the same problem. I fixed it pretty easily thanks to that thread having BIOSes available for download, along with ATIFlash making the flash very simple to do. Glad you got it taken care of
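For anyone else hitting the same issue, the ATIFlash routine usually looks something like the sketch below, run from an elevated command prompt (or a DOS boot). This is a generic example, not the exact steps from that thread; the adapter index `0` and the ROM filenames are placeholders, and you should always back up your original BIOS before flashing anything.

```
:: List installed adapters and their index numbers
atiflash -i

:: Save the current BIOS of adapter 0 before touching anything
atiflash -s 0 original.rom

:: Flash the replacement BIOS to adapter 0, then reboot
atiflash -p 0 fixed.rom
```

If the flash is refused because the new ROM's IDs don't match, ATIFlash has a force switch, but use it only when you're certain the ROM is correct for your exact card.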


----------



## spikezone2004

Quote:


> Originally Posted by *kizwan*
> 
> The Aquacomputer Kryographics block is strictly for reference cards, while the EK Rev. 2.0 block covers both reference and non-reference cards. What card do you have? If you have the MSI Gaming card, remember there are two revisions of that card, and the latest is not compatible with the EK Rev. 2.0 block. The Sapphire Tri-X also has two revisions; the first rev. is a reference card, while the later rev. is not compatible with the EK Rev. 2.0 block, if I'm not mistaken.


Well, thanks for that info! Out of everything I have read, I had not seen that the Aquacomputer block is only for reference cards! I am getting the XFX R9 290 non-reference card, so I guess that leaves me with the EK block. Will the EK block work with the XFX without problems?

I am looking at this block:
http://www.performance-pcs.com/catalog/index.php?main_page=product_info&products_id=40685

And this backplate:
http://www.performance-pcs.com/catalog/index.php?main_page=product_info&products_id=39426

Will these two fit together? If so, I think those are the winners.

EDIT: Is a reinforcement bracket recommended to take pressure off the PCB?


----------



## HoneyBadger84

So I log in to eBay and I have a second-chance offer from someone for 2 Tri-X 290s for pretty cheap... waiting to hear back from the seller on whether they were mined or not. If they weren't, much profit will be made, as the price is very cheap and I know I can resell them for more, assuming I don't keep them for Folding@Home.


----------



## kizwan

Quote:


> Originally Posted by *spikezone2004*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> The Aquacomputer Kryographics block is strictly for reference cards, while the EK Rev. 2.0 block covers both reference and non-reference cards. What card do you have? If you have the MSI Gaming card, remember there are two revisions of that card, and the latest is not compatible with the EK Rev. 2.0 block. The Sapphire Tri-X also has two revisions; the first rev. is a reference card, while the later rev. is not compatible with the EK Rev. 2.0 block, if I'm not mistaken.
> 
> 
> 
> 
> 
> 
> 
> Well, thanks for that info! Out of everything I have read, I had not seen that the Aquacomputer block is only for reference cards! I am getting the XFX R9 290 non-reference card, so I guess that leaves me with the EK block. Will the EK block work with the XFX without problems?
> 
> I am looking at this block:
> http://www.performance-pcs.com/catalog/index.php?main_page=product_info&products_id=40685
> 
> And this backplate:
> http://www.performance-pcs.com/catalog/index.php?main_page=product_info&products_id=39426
> 
> Will these two fit together? If so, I think those are the winners.
> 
> EDIT: Is a reinforcement bracket recommended to take pressure off the PCB?

Yeah, those two will fit together. If your card is in this compatibility list, then it should be compatible. Make sure to do a visual comparison before purchasing the block, just in case.

Regarding the reinforcement bracket, these two posts might be able to answer your question.


Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Just got and installed my EK Reinforcement Bracket...
> 
> Not a huge help as my card still bends a bit, but it looks nice. Not bad for $8. Thanks to whoever suggested it.
> 
> I need to clean my case window, there's a thin layer of dust on it.








Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *ledzepp3*
> 
> Alright so reporting back on those FC R9-290X Reinforcement Brackets which I posted about earlier.
> 
> The installation was remarkably easy. If you already have the block (and backplate if you've got one) installed, there's _no_ need to rip the block off and waste TIM or thermal pads. There are three bolts required (two allen heads which attach at the PCI bracket), and one at the very end of the card right by the PCI 6 and 8 pin connections.
> 
> 
> 
> 
> There is very little of the beautiful acrylic (on my blocks at least) covered up by these, and no nickel covered whatsoever. There is clearance from the brackets to the actual block (~11mm), and clearance from the bracket to the inlet/outlet ports. More than enough to be able to fit rotary or angled adapters. Ample space is had for any aftermarket/upgraded PCI power connectors to make their way into the sockets on the card, so no worries if anyone has sleeved cables with larger connectors.
> 
> 
> 
> 
> 
> Overall, these adapters are phenomenal. The stiffness and rigidity of the cards is noticeably higher- plus I think they look even classier now that the _hideous_ half-pink caps are somewhat covered by the brackets when viewed from a head-on angle (as if they were installed on a board in a case). Here are some other shots I have of the perfect finishes which are flat black, but will catch some light if shown directly on them due to the mildly textured look. The brackets themselves span the whole length of the PCB for those who are concerned about overhanging components or accessories.
> 
> 
> 
> Mountain Dew can for scale
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Cheers all and hope this helps!
> 
> 
> 
> 
> 
> 
> 
> 
> -Zepp


----------



## spikezone2004

Quote:


> Originally Posted by *kizwan*
> 
> Yeah, those two will fit together. If your card is in this compatibility list, then it should be compatible. Make sure to do a visual comparison before purchasing the block, just in case.
> 
> Regarding the reinforcement bracket, these two posts might be able to answer your question.


The card is on the list, and as far as the visual comparison goes, everything I have looked at online seems to match up. (I haven't got the card yet; it's on its way. I want the block to arrive around the same time so I can redo my loop after running stock-cooler tests to compare.)

Those posts were helpful; I hadn't seen them. One says the card still bowed, the other says the bracket made it better. I don't want to spend $8 if I don't have to, but I don't want the card to break after sagging over time.


----------



## Jflisk

Quote:


> Originally Posted by *spikezone2004*
> 
> The card is on the list, and as far as the visual comparison goes, everything I have looked at online seems to match up. (I haven't got the card yet; it's on its way. I want the block to arrive around the same time so I can redo my loop after running stock-cooler tests to compare.)
> 
> Those posts were helpful; I hadn't seen them. One says the card still bowed, the other says the bracket made it better. I don't want to spend $8 if I don't have to, but I don't want the card to break after sagging over time.


I have 2 R9 290Xs in my case without the helper bracket and don't have any bending in mine. I do have the EK bridge between my 2 cards, though, and also the backplates, which might help. Thanks


----------



## spikezone2004

Quote:


> Originally Posted by *Jflisk*
> 
> I have 2 R9 290Xs in my case without the helper bracket. Dont have any bending in mine. I have the EK bridge though between my 2 cards. Also have the back plate that might help. Thanks


I read that the bridge between the two helps them support each other. I am getting the backplate for them, since it's supposed to help VRM1 temps; it has a thermal pad that goes on the back of the VRM1 to transfer heat to the backplate.

Out of curiosity, which EK block do you have, and what are your core/VRM1 temps?


----------



## Jflisk

I have 2 of these water blocks:
http://www.frozencpu.com/products/24126/ex-blc-1714/EK_MSI_Gigabyte_Radeon_R9-290X_VGA_Liquid_Cooling_Block_Rev_20_-_Acetal_EK-FC_R9-290X_-_Acetal_Rev20.html?tl=g30c309s2073&id=6pRZB4ze&mv_pc=1179

2x backplates:
http://www.frozencpu.com/products/21680/ex-blc-1569/EK_R9-290X_VGA_Liquid_Cooling_RAM_Backplate_-_Black_EK-FC_R9-290X_Backplate_-_Black.html?id=6pRZB4ze&mv_pc=1600

These are reference backplates. You can check your compatibility from that page; it will take you to the EK configurator.


----------



## spikezone2004

The block and backplate I am looking at look just like that, and when I check compatibility for them it says Visual. I assume they will fit, from everything I have seen.


----------



## BlackWS6

Anyone have some insight on troubleshooting a pair of reference 290s I have been mining with since they were first released? I swapped the reference 290X out of my gaming machine for these two standard R9 290s in CrossFire. Performance is absolutely terrible in BF4 until I disable CrossFire, as if I'm CPU-bottlenecked (high CPU usage). It's a freaking 4770K @ 4.5GHz, 16GB of RAM, twin 128GB SATA3 SSDs in RAID 0, and these two reference 290s driving a 144Hz Asus monitor at 1080p. I removed 14.4, cleaned the drivers, manually disabled ULPS, tried disabling it in Afterburner, and have a manual fan profile. Both cards sit around 75C when gaming, but I am seeing very, very low GPU usage out of both cards in BF4 with FPS locked to 200 (for testing). I'm seeing as low as 40 FPS, for crying out loud, with everything on low! I tried the 14.7 betas to no avail, cleaned and reinstalled 14.4, and can't get any performance unless I go into CCC and disable CrossFire!

Help? =)

PS - I am now running a Z87M Extreme4 board with a 4770K. The cards are in PCIe 3.0 slots (1 and 3) at 8x each, powered by an EVGA 1kW PSU.
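For anyone curious how the manual ULPS disable is usually done: it's a registry edit, setting every EnableUlps value to 0 under the display-adapter class key. A rough .reg sketch is below; note the 0000/0001 instance subkeys are examples and vary per system (one per GPU), so search the class key for every EnableUlps value rather than trusting these exact paths.

```
Windows Registry Editor Version 5.00

; Display-adapter class key. The 0000/0001 instance subkeys below are
; examples only; your GPUs may sit under different instance numbers,
; so search the registry for "EnableUlps" and set each hit to 0.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0001]
"EnableUlps"=dword:00000000
```

Reboot after importing; drivers reinstalls will typically reset the values, so re-check after any driver change.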


----------



## gobblebox

Does anyone know the FujiPoly Ultra Extreme thermal pad dimensions (thickness) required to replace the stock pads on the Sapphire Tri-X OC R9 290? I want to upgrade the pads under the cooler. Also, would adding heatsink fins to the back of the PCB, along with Fujipoly pads (behind the VRMs, of course), help reduce VRM temps? Or would y'all instead suggest that I just get a backplate to use with these pads? If a backplate, what pad thickness do I need, and what backplate do you recommend (without replacing the Tri-X cooler)?


----------



## ZealotKi11er

Quote:


> Originally Posted by *BlackWS6*
> 
> Anyone have some insight on troubleshooting a pair of reference 290s I have been mining with since they were first released? I swapped the reference 290X out of my gaming machine for these two standard R9 290s in CrossFire. Performance is absolutely terrible in BF4 until I disable CrossFire, as if I'm CPU-bottlenecked (high CPU usage). It's a freaking 4770K @ 4.5GHz, 16GB of RAM, twin 128GB SATA3 SSDs in RAID 0, and these two reference 290s driving a 144Hz Asus monitor at 1080p. I removed 14.4, cleaned the drivers, manually disabled ULPS, tried disabling it in Afterburner, and have a manual fan profile. Both cards sit around 75C when gaming, but I am seeing very, very low GPU usage out of both cards in BF4 with FPS locked to 200 (for testing). I'm seeing as low as 40 FPS, for crying out loud, with everything on low! I tried the 14.7 betas to no avail, cleaned and reinstalled 14.4, and can't get any performance unless I go into CCC and disable CrossFire!
> 
> Help? =)
> 
> PS - I am now running a Z87M Extreme4 board with a 4770K. The cards are in PCIe 3.0 slots (1 and 3) at 8x each, powered by an EVGA 1kW PSU.


Are you using Mantle or DX11? I would bench them with 3DMark and post the score here. If you did not do a fresh install, I suggest you do. ULPS does nothing for you unless you OC.


----------



## BlackWS6

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Are you using Mantle or DX11? I would bench them with some 3DMark and post the score here. If you did not do a fresh install i suggest you do. ULPS does nothing for you unless you OC.


Mantle was 'better' at best vs DX11. I'll do more troubleshooting tonight, and I do overclock. =)


----------



## spikezone2004

Quote:


> Originally Posted by *gobblebox*
> 
> Does anyone know the required FujiPoly Ultra Extreme thermal pad dimensions (thickness) required to replace the stock pads on the Sapphire Tri-X OC r9 290? I want to upgrade the pads under the cooler. Also, would adding heatsink fins to the back of the PCB, along with Fujipoly pads (behind the VRMs of course), help reduce VRM temps? Or would y'all, instead, suggest that I just get a back plate to use with these pads? If backplate, what thickness in pads do I need & what backplate do you recommend (without replacing Tri-X cooler)?


Does this help?
http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures/0_20

If not, try looking in the installation manual for your block; it should list the thermal pad thickness. At least some of the manuals do.


----------



## gobblebox

Quote:


> Originally Posted by *spikezone2004*
> 
> Does this help
> http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures/0_20
> 
> If not, try looking in installation manual for your block you have it should say thermal pad thickness. Atleast some of the manuals do.


Check the Sapphire Tri-X OC aftermarket cooler's manual for the thermal pad thickness? Just making sure, since you are referring to a "block" and I do not have a (I assume water) block. I guess I could always pull the cooler back off and just measure the thickness of the pads underneath. That guide definitely helps with the pad application, but I do not want to go the water-cooling route because of $$. I'm just looking for alternative methods to reduce VRM1 temps; one earlier post suggested a backplate with thermal pads in between, but I'm not sure which backplate I should go with.


----------



## ZealotKi11er

Quote:


> Originally Posted by *BlackWS6*
> 
> Mantle was 'better' at best vs DX11. I'll do more troubleshooting tonight, and I do overclock. =)


At 1080p I don't think you are pushing the cards. I am at 1440p and still need to increase resolution scaling to 125% to increase GPU usage.


----------



## spikezone2004

Quote:


> Originally Posted by *gobblebox*
> 
> Check the Sapphire Tri-X OC aftermarket cooler's manual for the thermal pad thickness? Just making sure, since you are referring to a "block" and I do not have a (I assume water) block. I guess I could always pull the cooler back off and just measure the thickness of the pads underneath. That guide definitely helps with the pad application, but I do not want to go the water-cooling route because of $$. I'm just looking for alternative methods to reduce VRM1 temps; one earlier post suggested a backplate with thermal pads in between, but I'm not sure which backplate I should go with.


Yes, sorry, I meant the Tri-X OC manual.

That thread does give good pointers on how to apply the thermal pads, making sure you have good contact, etc. You can use it with any heatsink you put on, and it could possibly help with your temps.

You could also try upgrading your thermal pads to what they mention in that thread, the Fujipoly Extreme. The post you mentioned earlier might have been one of mine, as I am buying a backplate which has a thermal pad on the rear of the VRM1 as well. If you purchase a backplate, you could always add your own thermal pads to the rear of your VRM1 and see if that gives you better temps.


----------



## BlackWS6

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 1080p i dont think you are pushing the cards. I am @ 1440p and still need to increase resolution scaling 125% to increase GPU usage.


But I'm setting gametime.maxvariablefps to 200+, and even going to Ultra, GPU usage is low and CPU usage is crazy high. It _shouldn't_ be taxing the cards at 1080p, but it is. I play on 64-player 24/7 Lockers servers for the most part, and the snow-blizzard stuff bogs it down big time with CrossFire enabled. It holds almost 160 FPS (locked there) with a single 290X. I may have to try a fresh install.
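For anyone wanting to reproduce this test setup, the same console commands can be dropped into a user.cfg in the BF4 install folder so they apply on every launch. A minimal sketch; the 200 cap matches the test above, and the PerfOverlay lines just make the GPU/CPU frame-time split visible (the values are examples, not recommendations):

```
gametime.maxvariablefps 200
perfoverlay.drawfps 1
perfoverlay.drawgraph 1
```

The frame-time graph makes it obvious at a glance whether the CPU line or the GPU line is the one pinning the frame rate.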


----------



## bluedevil

Still no response from VisionTek about my RMA. It looks like they are out of stock of these and the 290X. Would it be totally out of line to ask for a 295X2?


----------



## Sgt Bilko

Quote:


> Originally Posted by *bluedevil*
> 
> Still no response from VisionTek about my RMA. Looks like they are out of stock of these and 290x. Would it be totally out of line to ask for a 295x2?


How long have they had it?


----------



## bluedevil

Quote:


> Originally Posted by *Sgt Bilko*
> 
> How long have they had it?


They received it last Monday.


----------



## Sgt Bilko

Quote:


> Originally Posted by *bluedevil*
> 
> They received last Monday.


Hmm, I'm not sure how long VisionTek takes with RMAs, but if they are out of stock then sure... you can try to squeeze a 295X2 out of them


----------



## JordanTr

Unless the general RMA manager is drunk, they won't give you that beast


----------



## bluedevil

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Hmm, I'm not sure how long VisionTek takes with RMAs, but if they are out of stock then sure... you can try to squeeze a 295X2 out of them


Well I will give it till 10am to respond. Then I will be asking for a 295x2.
Quote:


> Originally Posted by *JordanTr*
> 
> Unless general rma manager is drunk, they wont give you that beast


One can hope.


----------



## Sgt Bilko

Quote:


> Originally Posted by *bluedevil*
> 
> Well I will give it till 10am to respond. Then I will be asking for a 295x2.
> One can hope.


Worst case is they say no and give you the first 290/X they get their hands on anyway


----------



## bluedevil

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Worst case is they say no and give you the first 290/X they get their hands on anyway


Funny thing is that the 295x2 is in stock.


----------



## HoneyBadger84

Quote:


> Originally Posted by *bluedevil*
> 
> Funny thing is that the 295x2 is in stock.


Cuz people aren't buying VisionTek's model; they're an unproven brand in terms of quality/RMA service to most people. I'd never heard of them before about 3 months ago.


----------



## bluedevil

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Cuz people aren't buying VisionTek's model because they're an unproven brand in terms of quality/RMA service to most people. I'd never heard of them before about 3 months ago now.


Yep pretty much making me think I'm going to sell it as soon as I get it.


----------



## BradleyW

I've not seen any negative cases with VisionTek. I've known them for around 3 years.


----------



## Sgt Bilko

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Cuz people aren't buying VisionTek's model because they're an unproven brand in terms of quality/RMA service to most people. I'd never heard of them before about 3 months ago now.


They are a bit like Diamond and Club3D to me: I know of them, but I have no experience with them, nor do I know anyone that's used them.


----------



## HoneyBadger84

Quote:


> Originally Posted by *bluedevil*
> 
> Yep pretty much making me think I'm going to sell it as soon as I get it.


It doesn't necessarily mean they're bad, or good; I've heard nothing about them either way, which makes me iffy on them. My only experience was that crappy aftermarket-cooled R9 290X I got from them, and I'm pretty sure it was equal parts bad cooler and bad TIM.

The R9 290 VisionTek card I got in a 290X box, directly from them (shrink-wrapped, and it was a 290 in a 290X box, with matching serial numbers), is another reason I'm iffy on ever actually buying from them new. If that happens when you're buying directly from them, what else could? *shrug* I mean, their RMA service may be good, but yeah... they have 2 strikes with me already.


----------



## Zipperly

Quote:


> Originally Posted by *BradleyW*
> 
> I've not seen any negative cases with VisionTek. I've known them for around 3 years.


I have bought them before; never a moment's issue with any of their cards in my experience.


----------



## Jflisk

I could tell you if it were an XFX. XFX honors all warranties in the US, and I have to admit they are nothing short of awesome in the RMA dept.


----------



## Jflisk

Quote:


> Originally Posted by *spikezone2004*
> 
> The block and backplate I am looking at looks just like that when I check compatibility for them says Visual. I assume they will fit from everything I have seen.


They should, as long as it is a reference card. About 75% of the cards out there are reference.


----------



## Jflisk

Quote:


> Originally Posted by *bluedevil*
> 
> Funny thing is that the 295x2 is in stock.


I can tell you about an XFX RMA: it takes a week for the card to get there, they check it 2 or 3 days after they get it, then about a day before it ships, and a week for it to get returned. So about 3 weeks total. Hope that helps.


----------



## chronicfx

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Cuz people aren't buying VisionTek's model because they're an unproven brand in terms of quality/RMA service to most people. I'd never heard of them before about 3 months ago now.


I don't have recent experience with them, because I've only just come back to AMD since the 7970, but my HD 2600 XT from god knows when still works in the computer I gave to my uncle. I think it may even be an AGP card, IIRC.

But they have been around longer than three months... probably longer than 15 years.


----------



## Jflisk

Quote:


> Originally Posted by *chronicfx*
> 
> I don't have experience with them lately because I am just using AMD again since the 7970 but my HD2600xt from god knows when still works in the computer I gave to my uncle. I think it may even be an AGP card iirc.
> 
> 
> 
> 
> 
> 
> 
> But they have been around longer than three months.. probably longer than 15 years.


AGP - You can play some mean doom on the HD2600XT.


----------



## chronicfx

Quote:


> Originally Posted by *Jflisk*
> 
> AGP - You can play some mean doom on the HD2600XT.


Haha, Doom and Dark Forces were favorites. That was back when I thought you could buy a "high end" GPU at Micro Center, and I hadn't discovered that the enthusiast level existed or bothered to learn how to overclock. I actually relegated that computer to my uncle pretty fast and got the HD 3850, which is now in my father-in-law's computer. Anyway, that thing was a beast, and I remember beating Call of Duty thinking no game with better graphics could ever exist!!


----------



## The Storm

Quote:


> Originally Posted by *BlackWS6*
> 
> But I'm setting gametime.maxvariablefps to 200+, and even going to Ultra, GPU usage is low and CPU usage is crazy high. It _shouldn't_ be taxing the cards at 1080p, but it is. I play on 64-player 24/7 Lockers servers for the most part, and the snow-blizzard stuff bogs it down big time with CrossFire enabled. It holds almost 160 FPS (locked there) with a single 290X. I may have to try a fresh install.


I have 3 290s on a single 1440p monitor, and I wasn't getting much load across the three cards on Ultra either until I cranked up the AA. Once I did that, they all loaded up nicely. IMO, at 1080p they just aren't getting enough load. I would also run a few benches and compare against similar systems to see if they are on par.


----------



## gobblebox

Quote:


> Originally Posted by *spikezone2004*
> 
> Yes, sorry, I meant the Tri-X OC manual.
> 
> That thread does give good pointers on how to apply the thermal pads, making sure you have good contact, etc. You can use it with any heatsink you put on, and it could possibly help with your temps.
> 
> You could also try upgrading your thermal pads to what they mention in that thread, the Fujipoly Extreme. The post you mentioned earlier might have been one of mine, as I am buying a backplate which has a thermal pad on the rear of the VRM1 as well. If you purchase a backplate, you could always add your own thermal pads to the rear of your VRM1 and see if that gives you better temps.


Which backplate are you getting? How are you going to attach it to the PCB? I thought those backplates required a water block to mount to...

I was thinking about getting Fujipoly Ultra Extreme and these *Aluminum* or these *Copper* heatsinks (most likely the former, considering the price of the latter) and just applying them to the back of the PCB with the thermal pads. Do you think that would work?


----------



## th3illusiveman

Man... I really wish AMD had released these things with non-reference coolers at launch; then the miners could have snatched them up and they would be on eBay for cheap right now. The horrible ref-cooler 290Xs are so cheap it's ridiculous, yet the ones with custom cooling carry a steep premium in comparison.


----------



## Mega Man

ref > non ref~


----------



## HoneyBadger84

Quote:


> Originally Posted by *th3illusiveman*
> 
> Man... I really wish AMD had released these things with non-reference coolers at launch; then the miners could have snatched them up and they would be on eBay for cheap right now. The horrible ref-cooler 290Xs are so cheap it's ridiculous, yet the ones with custom cooling carry a steep premium in comparison.


There is a regular flow of mined and non-mined 290s/290Xs as well; you just gotta keep an eye out for them. I had to delete my watch list cuz I'm buying more than I can use, even for folding purposes







Reference cooler cards are far better candidates for liquid cooling anyway, both because of cost & because they're guaranteed to match any of the blocks available for reference cards.


----------



## spikezone2004

Quote:


> Originally Posted by *gobblebox*
> 
> Which backplate are you getting? How are you going to attach it to the PCB? I thought those backplates required a water block to mount to...
> 
> I was thinking about getting Fujipoly Ultra Extreme and these *Aluminum* or these *Copper* heatsinks (most likely the former, considering the price of the latter) & just applying them to the back of the PCB with the thermal pads, do you think that would work?


I am going to go with an EK waterblock over the Aquacomputer block, since it seems impossible to get their backplate. If you want to try putting a heatsink on the back of the VRM, I would try the Gelid Rev. 2 R9 290 heatsink; I have seen people attach it with string and thermal pads in between.

For those heatsinks, though, I would go with the copper ones.


----------



## bluedevil

Quote:


> Originally Posted by *VisionTek*
> We are reviewing options to fulfill this claim, however there is no status update at the moment.
> 
> Thank you,
> 
> VisionTek Support


And my response.
Quote:


> Originally Posted by *Me*
> Any clue on to when? Also as I have been patient on the process, would a R9 295x2 be an option if in stock? I would rather this not be a month long ordeal waiting for a R9 290/X to come into stock.
> 
> Thank you,


Maybe?


----------



## tsm106

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HoneyBadger84*
> 
> Cuz people aren't buying VisionTek's model because they're an unproven brand in terms of quality/RMA service to most people. *I'd never heard of them before about 3 months ago now*.
> 
> 
> 
> They are a bit like Diamond and Club3D to me, i know about them but have no experience with them or know anyone that's used them.

And that means what exactly? 3 months...?

As for the bolded part, "unproven"... I don't get statements like these. The first round of cards were reference, and that means they are all identical, so whether the brand is proven or not does not matter. It's not until later that AIBs are cleared for custom cards, and most, if they are smart, do not deviate far from AMD's spec.

That said, I'll take a custom CryoVenom any day, to be honest... as long as I don't have to pay for it, lol. Also, RMA-wise, regardless of the vendor, it's always like getting a tooth pulled, IMO.


----------



## HoneyBadger84

Quote:


> Originally Posted by *tsm106*
> 
> And that means what exactly? 3 months...?


Exactly what it says. Before 3 months ago I had not heard of them, because I was out of the computer market for about 2 years and thereby unaware of any "newer" brands that had become regular producers, i.e. the likes of VisionTek, a brand you never heard about 2 years ago when I was last in the market for video cards.

So, let me break that down for you in simple English:

Before 3 months ago, I had NEVER heard of VisionTek.

You're welcome









Unproven brand in terms of quality = the one aftermarket cooler I've used from them performed like utter crap, as I explained a few posts later. I didn't say "they're a crappy brand" because they only have 2 strikes with me. If a third strike happens, then yes, they'll be classified as crappy. (For those that didn't read or missed the posts: Strike 1 was a crappy aftermarket cooler that performed about as badly as, or indeed quite a bit worse than, a stock reference-cooler card in similar conditions. Strike 2 was getting a 290 in a 290X box that came directly from VisionTek, still in shrink wrap, where the serial number on the box stating it was a 290X somehow matched the R9 290 card inside...)


----------



## tsm106

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> And that means what exactly? 3 months...?
> 
> 
> 
> Exactly what it says. Before 3 months ago, I had not heard of them because I was not in the computer market for about 2 years and thereby unaware of any "newer" brands that had come to be regular producers, i.e. the likes of VisionTek, which was someone you never heard about 2 years ago when I was last in the market of buying video cards.
> 
> So, let me break that down for you in simple english:
> 
> Before 3 months ago, I had NEVER heard of VisionTek.
> 
> You're welcome
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Unproven brand in terms of quality = the one aftermarket cooler I've used from them performed like utter crap, as I explained a few posts later. I didn't say "they're a crappy brand" because they only have 2 strikes with me. If a third strike should happen, then yes, they'll be classified as crappy (for those that didn't read or missed the posts on those, Strike 1: Crappy aftermarket cooler that performed almost as bad, or indeed quite a bit worse, in similar conditions, compared to a Stock Reference cooler card; Strike 2: Getting a 290 in a 290X box that was directly from VisionTek still in shrink wrap & somehow mystically the serial number on the box stating it was a 290X matched the R9 290 card inside...)

And the point is that just because you've never heard of them... lol. They've been around for a long time, like since the '90s. You write it like it means something. Shrink wrap... didn't you buy that card used?


----------



## chronicfx

Quote:


> Originally Posted by *Mega Man*
> 
> ref > non ref~


Agreed







My VRMs are 50C, cores at 70C with the same fan %, and I wear headphones.. #stopcrying


----------



## HoneyBadger84

Quote:


> Originally Posted by *tsm106*
> 
> And the point is that just because you've never heard of them... lol. They've been around for a long time like since the 90s. You write it like it means something. Shrink wrap... didn't you buy that card used?


The VisionTek R9 290 I have was new, sealed in box (as I've said about 3 times now). The VisionTek aftermarket card I got was also new, sealed in box... so no. Wrong again.

And I stated it because I'm one of many people I talked to about this card who went "who is VisionTek?"; they're not that well-known a brand. Being around since the '90s doesn't mean much. Gainward has been too, and they're not well known in the US at all because their US branch went under quite a while ago, yet outside the US they're still one of the better brands in terms of aftermarket cooler quality etc.

Once again I'm done arguing with you, since it seems like you get your jollies out of arguing with people in this thread continually of late.


----------



## tsm106

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> And the point is that just because you've never heard of them... lol. They've been around for a long time like since the 90s. You write it like it means something. Shrink wrap... didn't you buy that card used?
> 
> 
> 
> VisionTek R9 290 I have was new sealed in box (as I've said about 3 times now). The VisionTek aftermarket card I got was also new sealed in box... so no. Wrong again.
> 
> And I stated it because I'm one of many people I talked to about this card that went 'who is VisionTek", they're not that well known of a brand.
> 
> Once again I'm done arguing with you since it seems like you get your jollys out of arguing with people in this thread continually of late.

Lmao, answer the question or I'll just check your post. You got that card used, right? I.e., you bought it from someone, not from a store. Because if you got it from a store, you'd just take it back to the store. You sure you aren't a victim of fraud?


----------



## HoneyBadger84

Quote:


> Originally Posted by *tsm106*
> 
> Lmao, answer the question or I'll just check your post. You got that card used right? IE. you bought it from someone not from a store. Because if you got it from a store, you'd just take it back to the store. You sure you aren't a victim of fraud?


I did answer the question. Let me say it a fourth time in 2 pages: both cards were new in box, sealed, in shrink wrap, when I got them, never used before. That'll be my last reply to you, blocked, tired of it.


----------



## tsm106

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Lmao, answer the question or I'll just check your post. You got that card used right? IE. you bought it from someone not from a store. Because if you got it from a store, you'd just take it back to the store. You sure you aren't a victim of fraud?
> 
> 
> 
> I did answer the question. Let me say it a fourth time in 2 pages: both cards were new in box, sealed, in shrink wrap, when I got them, never used before. That'll be my last reply to you, blocked, tired of it.

Seriously, where'd you get the cards then? You blast VisionTek for the mystically altered serial numbers, but if it was legit and you bought it from a store, you could just take it back. But you can't do that, because you bought it from a person. LOL. The fact that you won't answer the question and will just bad-mouth VisionTek instead speaks volumes.


----------



## tsm106

Lmao, found it right here.

Quote:


> Originally Posted by *HoneyBadger84*
> 
> Quote:
> 
> 
> 
> Originally Posted by *skupples*
> 
> eh, you will get use to it. You will have it torn down & back together in half the time, next tyime.
> 
> 
> 
> Mayhaps. I may try it again tomorrow while I'm testing out QuadFire once I get it installed (the HCP 850W will be here tomorrow, which means it's GO TIME as I'll have access to TWELVE PCI-E cords & 2050W of powa
> 
> 
> 
> 
> 
> 
> 
> ) on one of the cards I won't be using for it.
> 
> Plan is 3 HIS + the Asus card. Since I won't be taking the Sapphire apart again cuz I don't wanna cleanup whatever mess I made with too much goop, that leaves the brand new VisionTek R9 290 I got in a 290X box (thanks again VisionTek & *stupid seller that didn't check his product before listing it on EBay*)... doubt I'll do it to that one simply because it's new, but maybe.
> 
> Where would one go about getting good quality Thermal Pads to replace the stock ones if I wanted to do that next time I take one apart? it seemed like that stock thermal padding was pretty shady... it was like mint green, what's up with that?

He bought it off eBay, and never thought he was being lied to? As if being shrink-wrapped is like a seal of authenticity from VisionTek? Anyone in retail has access to a shrink wrap machine...


----------



## HoneyBadger84

I never said the serial numbers were altered... I also never bad-mouthed the brand. Saying they're unproven or unknown is not bad-mouthing. You are clearly not going to act like an adult in this argument or read anything properly, so yep, 100% done. Have a nice day.


----------



## tsm106




----------



## ZealotKi11er

Nothing wrong with VisionTek. They are just not as well known.

They fall under:

VisionTek
Diamond
Club3D
HIS

Then you have

MSI
ASUS
Gigabyte

XFX
Sapphire
PowerColor


----------



## HoneyBadger84

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Nothing wrong with VisionTek. They are just not as well known.
> 
> They fall under:
> 
> VisionTek
> Diamond
> Club3D
> HIS
> 
> Then you have
> 
> MSI
> ASUS
> Gigabyte
> 
> XFX
> Sapphire
> PowerColor


Which is what I meant. *shrug* he can rage at himself all he likes, I can't read it anymore. Off to run some errands so I can forget all about this guy's issues.


----------



## tsm106

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Nothing wrong with VisionTek. They are just not as well known.
> 
> They fall under:
> 
> VisionTek
> Diamond
> Club3D
> HIS
> 
> Then you have
> 
> MSI
> ASUS
> Gigabyte
> 
> XFX
> Sapphire
> PowerColor


Man, how times have changed. Diamond used to be one of the biggest players, and now they are just a rebrander of reference cards. Though it's fine by me; three of my 7970s are Diamonds. It's also interesting to see you class PowerColor with XFX/Sapphire when they used to be considered more of a 3rd-tier brand, like Diamond/VisionTek are now. Personally, I couldn't care less what brand my cards are as long as they are true reference.


----------



## The Storm

On a side note to all of this, I did order an MCP35x with the res, heatsink and fan. It was supposed to arrive today, but the mail has already been here and still no pump. Maybe tomorrow. It might be several days before I get to try it out; I am working the next 7 nights in a row, and damn 12hr shifts with 2hrs of drive time kill any extra time to do anything.


----------



## tsm106

If that's the 35x optional res, use the bottom of the two ports for inlet. The bottom port does not create vortexes.


----------



## The Storm

Quote:


> Originally Posted by *tsm106*
> 
> If that's the 35x optional res, use the bottom of the two ports for inlet. The bottom port does not create vortexes.


Ok, thanks, will do; yes, it's the Swiftech optional res. I may end up getting another pump and the dual pump top as well.

It's this setup here, minus the barbs.


----------



## spikezone2004

Quote:


> Originally Posted by *tsm106*
> 
> If that's the 35x optional res, use the bottom of the two ports for inlet. The bottom port does not create vortexes.


Do the vortexes affect the performance of the pump? I just ordered this res with the MCP50X pump; I planned on using the top inlet and having the fluid level a little lower so you can see water flowing into the res.


----------



## tsm106

Quote:


> Originally Posted by *spikezone2004*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> If that's the 35x optional res, use the bottom of the two ports for inlet. The bottom port does not create vortexes.
> 
> 
> 
> does the vortexes affect performance of the pump? I just ordered this res with the MCP50X pump planned on using the top inlet and having the fluid level a little lower so you can see water flowing into the res

Use the bottom port and you will be fine. The vortex is generated because of the small size of the res and because the DDC type pumps can generate a lot of head pressure. That much head pressure in a teeny tiny res equals lots of bubbles. Martin wrote a whole bit about it but he used the top port which is actually the fill port. So don't use the top ports and you will be fine.


----------



## spikezone2004

Quote:


> Originally Posted by *tsm106*
> 
> Use the bottom port and you will be fine. The vortex is generated because of the small size of the res and because the DDC type pumps can generate a lot of head pressure. That much head pressure in a teeny tiny res equals lots of bubbles. Martin wrote a whole bit about it but he used the top port which is actually the fill port. So don't use the top ports and you will be fine.


Not sure if we are talking about the same inlet. The one on the very top of the res, on the cap, I will be using as the fill port. I thought you meant the ones on the side; there are 2 of them, and I was going to use the top one of those two for the inlet.


----------



## tsm106

When I say don't use the top ports, I mean both the top cap/fill port and the upper of the two side inlet ports. You will see for yourself anyways if you are curious.


----------



## spikezone2004

Quote:


> Originally Posted by *tsm106*
> 
> When I say don't use the top ports, I mean both the top cap/fill port and the upper of the two side inlet ports. You will see for yourself anyways if you are curious.


Ok, wasn't sure; I'll probably test it out to see what you mean lol, plus I'll be leak testing parts outside of the case before I put them in.

Does the vortex create any performance issues, though? Hurt the pump?


----------



## tsm106

Well... you don't want a vortex of bubbles at all anywhere near a loop. If there was no air involved it wouldn't be much of a problem unless it got so bad that it started to cavitate. It sounds scary but it's not; it's just what happens when the water moves in a circle really fast. It is a pain in the ass, though, when filling and bleeding, to have a vortex going on in your res, and using the bottom port reduces this to almost nil. Btw, since this res is so small volume-wise, you will want a barb and a funnel so you can feed it an ample water supply. The 35x will suck down water fast, so use a PSU jumper for stop-and-go filling. The funnel makes it go much faster. When I do it I don't try to get all the bubbles out, just 95% of them. If you have enough pump, the bubbles will work their way out in a few minutes/hours. Then I come by later and top off the res.


----------



## The Storm

It has arrived. I wonder if the uni holder from EK that I have for my D5 pump setup will work for this... hmmm. It looks like the mounting holes are close.


----------



## spikezone2004

Quote:


> Originally Posted by *tsm106*
> 
> Well... you don't want a vortex of bubbles at all anywhere near a loop. If there was no air involved it wouldn't be much of a problem unless it got so bad that it started to cavitate. It sounds scary but its not, it's just what happens when the water moves in a circle really fast. Though it is a pain in the ass when filling and bleeding to have a vortex going on in your res. Thus using the bottom port reduces this to almost nil. Btw, since this res is so small volume wise, you will want a barb and a funnel so you can feed it an ample water supply. The 35x will suck down water fast so use a psu jumper for stop and go filling. The funnel makes it go much faster. When I do it I don't try to get all teh bubbles out, just 95% of them. If you have enough pump, the bubbles will work their way out in a few minutes/hours. THen I come by later and top off the res.


True, I'll most likely end up using the bottom port, but I will mess around in my testing phase. I have a barb and some tubing to run a fill port with a funnel for filling it; my last res drained quick when filling, and this pump is a lot more powerful, so I can expect it'll drain even faster with the res being smaller. Can't wait to get my whole loop redone and my R9 290 in my system with my new rad, for 3x 240mm rads total.

Anyone have Scythe Grand Flex 120mm fans? Wondering if the blades pop off easily to paint; I don't want to try and end up breaking one. My color scheme is black and white; all my fans are white along with some case accents. I was thinking of possibly painting the rad white and leaving the fans black, or keeping it the same and painting the fans white like my others.


----------



## tsm106

Freaking EK, using hidden but weak screws.



*Edit: on closer inspection, the threads were not tapped well. The two hidden, now-stripped screws show serious thread wear.


----------



## The Storm

Quote:


> Originally Posted by *tsm106*
> 
> Freaking EK, using hidden but weak screws.
> 
> 
> 
> *Edit on closer inspection, the threads were not tapped well. The two hidden but stripped screws show serious thread wear.


What is this on?


----------



## tsm106

It's a 290x Lightning.


----------



## kckyle

Even my used R9 290 came in an anti-static bag lol


----------



## HeliXpc

Hey guys, just got an Asus 290X DC2 OC, and I'm getting red screen of death under high load in BF4; tried Crysis 2 too. Driver issue or hardware? Using the latest July 9th beta drivers.


----------



## The Storm

Quote:


> Originally Posted by *HeliXpc*
> 
> Hey guys, just got an asus 290x dc2 oc, getting red screen of death under high load levels, in BF4 and tried crysis 2. Driver issue? or hardware, using the latest july 9th beta drivers.


Stock clocks or is that with an overclock?


----------



## Tokuzi

If anyone could toss me a stock bios for the Powercolor LCS 1060/1350 OC dip switch BIOS (or both switches) that would be awesome.

This is the exact GPU: http://www.techpowerup.com/gpudb/b2700/powercolor-lcs-r9-290x.html

I looked pretty hard online and can't find a copy of it anywhere... Thanks.


----------



## aaroc

Quote:


> Originally Posted by *Mega Man*
> 
> well if anyone does use a mst hub, i can only activate 2 of my 1080p monitors,
> 
> using newest drivers directly connected by DP ( all 3 )
> 
> i can use them with 2 via mst hub and one via DLDVI
> 
> however i should be able to use them soley on the mst hub. i know the mst hub works,
> 
> it will load all 3 screens at boot up
> 
> any ideas/ any infos i can give for helps !~


What brand model?


----------



## Loktar Ogar

Quote:


> Originally Posted by *Tokuzi*
> 
> If anyone could toss me a stock bios for the Powercolor LCS 1060/1350 OC dip switch BIOS (or both switches) that would be awesome.
> 
> This is the exact GPU: http://www.techpowerup.com/gpudb/b2700/powercolor-lcs-r9-290x.html
> 
> I looked pretty hard online and can't find a copy of it anywhere... Thanks.


Nice pic, bro!


----------



## flamin9_t00l

Quote:


> Originally Posted by *HeliXpc*
> 
> Hey guys, just got an asus 290x dc2 oc, getting red screen of death under high load levels, in BF4 and tried crysis 2. Driver issue? or hardware, using the latest july 9th beta drivers.


I had RSOD during gaming with my new Z97 build (Fortress rig) and the problem was CPU vcore. Check to see if you have memory dump from the crash and BSOD code (mine was 124). Not sure if it's the same problem but bumping CPU vcore on mine sorted it.


----------



## Decade

Low 3DMark Firestrike score on a Sapphire Tri-X OC R9 290 @ 1000/1300: getting mid-to-high 7000s.

For comparison, I have an Asus R9 290 that I can OC to 1089/1350, getting mid 9000s. There is NO WAY an 89MHz and 50MHz difference can result in a 2K-point split!

Something can't be right here. I'm about to uninstall Cat 14.4 and try out 14.7.

Fixed it... had 3D Mark's gpu count on "2" from playing with crossfire.


----------



## gobblebox

If I was looking into upgrading to a Crossfire setup with a Reference r9 290 upgraded with the Gelid Rev2 Icy Vision cooler to run alongside my Sapphire Tri-X OC r9 290, what would be the minimum PSU that I should shoot for with an 1150/1475 OC on both cards while running my 4790K @ a 4.9Ghz 1.38v (EIST/C1E/C7)? I'm 99% sure my EVGA 750W G2 Supernova won't cut it, but what would be a safe minimum PSU to shoot for? Will it require a 1000w PSU or can I get by safely with 850w? I'm trying to save some $$ and would prefer semi/fully modular...

[Edit] Embarrassing grammar


----------



## Decade

Quote:


> Originally Posted by *gobblebox*
> 
> If I was looking into upgrading to a Crossfire setup with a Reference r9 290 upgraded with the Gelid Rev2 Icy Vision cooler to run alongside my Sapphire Tri-X OC r9 290, what would be the minimum PSU that I should shoot for with an 1150/1475 OC on both cards while running my 4790K @ a 4.9Ghz 1.38v (EIST/C1E/C7)? I'm 99% sure my EVGA 750W G2 Supernova won't cut it, but what would be a safe minimum PSU to shoot for? Will it require a 1000w PSU or can I get by safely with 850w? I'm trying to save some $$ and would prefer semi/fully modular...
> 
> [Edit] Embarrassing grammar


I can't speak 100% for your setup, but doing a Sapphire Tri-X OC 290 @ 1000/1300 and a reference Asus 290 w/ Gelid @ 1000/1300 myself, I had no issues with my SuperNova 750 G2 and my 4670K playing around with crossfire for a few hours. Gaming in Crysis 3, Tomb Raider, and 3D Mark benches.

This tool can give you a really good baseline of where to start from: http://www.extreme.outervision.com/psucalculatorlite.jsp
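As a sanity check on what any calculator spits out, the back-of-the-envelope arithmetic can be sketched directly. Every wattage figure below is an assumed ballpark (typical numbers for an overclocked 290 and the ~150 W CPU figure quoted in this exchange), not a measurement, so treat the output as a starting point only:

```python
# Rough PSU sizing sketch for a CrossFire 290 build. All wattages here are
# assumptions, not measured values; adjust them for your own parts.

R9_290_OC_WATTS = 250      # assumed draw per overclocked R9 290
CPU_OC_WATTS = 150         # the ~150 W figure quoted above for the 4790K @ 4.9 GHz
REST_OF_SYSTEM_WATTS = 75  # assumed: drives, fans, pump, motherboard, etc.

def recommended_psu(gpu_count, headroom=0.75):
    """Return (estimated load, smallest common PSU size that keeps the load
    at or below `headroom` of its rating), or (load, None) if none fits."""
    load = gpu_count * R9_290_OC_WATTS + CPU_OC_WATTS + REST_OF_SYSTEM_WATTS
    for size in (550, 650, 750, 850, 1000, 1200):
        if load <= size * headroom:
            return load, size
    return load, None

load, size = recommended_psu(gpu_count=2)
print(f"estimated load: {load} W -> suggested PSU: {size} W")
```

With these assumptions, two overclocked 290s land around 725 W total, which is why an 850 W unit starts to look tight and a 1000 W unit leaves comfortable headroom.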


----------



## gobblebox

Quote:


> Originally Posted by *Decade*
> 
> I can't speak 100% for your setup, but doing a Sapphire Tri-X OC 290 @ 1000/1300 and a reference Asus 290 w/ Gelid @ 1000/1300 myself, I had no issues with my SuperNova 750 G2 and my 4670K playing around with crossfire for a few hours. Gaming in Crysis 3, Tomb Raider, and 3D Mark benches.
> 
> This tool can give you a really good baseline of where to start from: http://www.extreme.outervision.com/psucalculatorlite.jsp


The similarity of that setup and my intended plan almost scares (and delights) me. Did you OC the GPUs at all, or should I not even touch the clocks? Do you recall what your VID was on your 4670K?


----------



## Decade

Quote:


> Originally Posted by *gobblebox*
> 
> The similarity of that setup and my intended plan almost scares (and delights) me. Did you OC the GPUs at all, or should I not even touch the clocks? Do you recall what your VID was on your 4670K?


I OC'd the reference to 1000/1300 to match the Sapphire. I couldn't push the Sapphire at all, and my Asus reference only clocks up to 1089 on the core. I'm still at 1080p before I upgrade later this year, so combined FPS of the two in crossfire was more than enough at 1000/1300. If you OC the 290s, I would certainly feel more comfortable at a higher wattage than 750. But stock or factory OC? It's plenty for my setup.

4670K is running 1.280 in bios, it gets reported as 1.282 in gpu-z or 1.29 by the bios itself.

To give you an idea of what else is running; 2 ssds, 1 2.5" hdd, 3x 120mm fans, 3x 140mm fans, and an h80i.


----------



## gobblebox

Quote:


> Originally Posted by *Decade*
> 
> I OC'd the reference to 1000/1300 to match the Sapphire. I couldn't push the Sapphire at all, and my Asus reference only clocks up to 1089 on the core. I'm still at 1080p before I upgrade later this year, so combined FPS of the two in crossfire was more than enough at 1000/1300. If you OC the 290s, I would certainly feel more comfortable at a higher wattage than 750. But stock or factory OC? It's plenty for my setup.
> 
> 4670K is running 1.280 in bios, it gets reported as 1.282 in gpu-z or 1.29 by the bios itself.
> 
> To give you an idea of what else is running; 2 ssds, 1 2.5" hdd, 3x 120mm fans, 3x 140mm fans, and an h80i.


I was already leaning towards upgrading to a 1000w PSU and just wanted to confirm that it would be necessary. I'm looking to OC both of the cards to support my QNIX 2710 Evolution II at 120hz (Lucky with the OC on that bad boy) so that I can actually hit those frame rates in games, and with my 4790K already at 4.9Ghz w/ 1.38v (pulling 150w alone), I'd rather be safe than sorry - I'm looking at Corsair and EVGA since both are high quality and the 80+ Gold Cert have some pretty nice deals running on Newegg right now.

2x Noctua 140mm PWM, 1x Corsair AF140, h100i with 2x AF120 Pull/ 2x SP120 Push, 1x SSD, 2x 2.5" HDDs, and an Asus CD/DVD RW combo drive. I think I'm going to need the extra juice lol


----------



## kizwan

Quote:


> Originally Posted by *kckyle*
> 
> even my used r9 290 came in anti static bag lol


I think she meant the box is shrink-wrapped, not the card. Still, that doesn't mean the box is unopened/new. As for the "mystical" serial number: if anyone can remove the warranty sticker & put it back without damaging it, the same thing can be done with the serial number sticker. Also, what I don't understand is how the card can come directly from VisionTek when it was bought from eBay.


----------



## tsm106

^^It's mysticalness!


----------



## Mega Man

Quote:


> Originally Posted by *The Storm*
> 
> On a side note to all of this, I did order an MCP35x with the res, heatsink and fan. It was supposed to arrive today but the mail has already been here and still no pump. Maybe tomorrow, it might be several days before I get to try it out, I am working the next 7 nights in a row, damn 12hr shifts with 2hrs of drive time kills any extra time to do anything.


i know how you feel :/ i am waiting to buy wire, but i have everything else for sleeving/making my own cables
Quote:


> Originally Posted by *Tokuzi*
> 
> If anyone could toss me a stock bios for the Powercolor LCS 1060/1350 OC dip switch BIOS (or both switches) that would be awesome.
> 
> This is the exact GPU: http://www.techpowerup.com/gpudb/b2700/powercolor-lcs-r9-290x.html
> 
> I looked pretty hard online and can't find a copy of it anywhere... Thanks.


 NewCompressedzippedFolder.zip 99k .zip file


I don't have switch one (closest to the I/O) on my PC, and the card has yet to be reinstalled, sorry, or I would get you that as well!
(90% sure this is the right one; either way, if it falls in the 10%, all my cards are ref.)

But yeah, that card was in slot one.
Quote:


> Originally Posted by *aaroc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> well if anyone does use a mst hub, i can only activate 2 of my 1080p monitors,
> 
> using newest drivers directly connected by DP ( all 3 )
> 
> i can use them with 2 via mst hub and one via DLDVI
> 
> however i should be able to use them soley on the mst hub. i know the mst hub works,
> 
> it will load all 3 screens at boot up
> 
> any ideas/ any infos i can give for helps !~
> 
> 
> 
> What brand model?

Haha, found it, and thanks! (Club 3D)

Even if I am not using 144Hz, it does not matter!


Quote:


> Originally Posted by *Decade*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gobblebox*
> 
> If I was looking into upgrading to a Crossfire setup with a Reference r9 290 upgraded with the Gelid Rev2 Icy Vision cooler to run alongside my Sapphire Tri-X OC r9 290, what would be the minimum PSU that I should shoot for with an 1150/1475 OC on both cards while running my 4790K @ a 4.9Ghz 1.38v (EIST/C1E/C7)? I'm 99% sure my EVGA 750W G2 Supernova won't cut it, but what would be a safe minimum PSU to shoot for? Will it require a 1000w PSU or can I get by safely with 850w? I'm trying to save some $$ and would prefer semi/fully modular...
> 
> [Edit] Embarrassing grammar
> 
> 
> 
> I can't speak 100% for your setup, but doing a Sapphire Tri-X OC 290 @ 1000/1300 and a reference Asus 290 w/ Gelid @ 1000/1300 myself, I had no issues with my SuperNova 750 G2 and my 4670K playing around with crossfire for a few hours. Gaming in Crysis 3, Tomb Raider, and 3D Mark benches.
> 
> This tool can give you a really good baseline of where to start from: http://www.extreme.outervision.com/psucalculatorlite.jsp

Please, for the love of whatever you worship: NEVER TRUST ONLINE PSU CALCULATORS!!!!
(Shilka rule #1)
Quote:


> Originally Posted by *tsm106*
> 
> Also rma wise, regardless of what vendor used it's always like getting a tooth pulled imo.


this should go into my sig


----------



## spikezone2004

When I worked at Circuit City during high school, half the stuff returned was just shrink-wrapped and put back on the shelf, especially the DVDs and computer parts like routers.

The manager would look at the item first, then go shrink-wrap and stock!


----------



## renzkuken1

Hey guys, so I downloaded MSI Afterburner like I was told, did a stock run using Unigine Valley, and it ran OK, so I tweaked a little bit until it started to artifact and then rolled it back 25MHz on the core, which looked fine.

Anyway i will post screenshots...can you tell me if i did it right?

Thanks in advance.


----------



## rdr09

Quote:


> Originally Posted by *renzkuken1*
> 
> Hey guys so i downloaded MSI Afterburner like i was told and i did a stock run using unigine valley and it ran ok so i tweaked a little bit till it started to artifact and rolled it back 25mhz on core which looked fine.
> 
> Anyway i will post screenshots...can you tell me if i did it right?
> 
> Thanks in advance.


Not sure about that Heaven score, but that OC is certainly above average; must be under water. Do you mind running 3DMark11?

http://www.techpowerup.com/downloads/2336/futuremark-3dmark-11-v1-0-132/


----------



## renzkuken1

Quote:


> Originally Posted by *rdr09*
> 
> not sure about that heaven score but that oc is certainly above average. must be watered. do you mind running 3DMark11?
> 
> http://www.techpowerup.com/downloads/2336/futuremark-3dmark-11-v1-0-132/


Yeah, it is watercooled. (My first build!) Haven't played around with the clocks for long... maybe 20 mins.
Downloading it now; any specific settings I need to run?


----------



## rdr09

Quote:


> Originally Posted by *renzkuken1*
> 
> Yeah it is watercooled. (My first build
> 
> 
> 
> 
> 
> 
> 
> ) Haven't played around with clocks for long...maybe 20 mins.
> Downloading it now, any specific setting i need to run?


It's the free version; just make sure you use stretched.

Congrats on your first build. I am certain it won't be your last.


----------



## renzkuken1

Quote:


> Originally Posted by *rdr09*
> 
> its the free version just make sure you use stretched.
> 
> congrats on your first build. i am certain it won't be your last.


Haha, I meant first watercooled build, but cool nonetheless.

Testing now.


----------



## renzkuken1

SCORE
P14223 with AMD Radeon R9 290X(1x) and Intel Core i5-3570K Processor
Graphics Score 18345
Physics Score 8517
Combined Score 8465


----------



## rdr09

Quote:


> Originally Posted by *renzkuken1*
> 
> SCORE
> P14223 with AMD Radeon R9 290X(1x) and Intel Core i5-3570K Processor
> Graphics Score 18345
> Physics Score 8517
> Combined Score 8465


Very nice. You are using 14.4?

congrats on your first watercooled build.


----------



## renzkuken1

Quote:


> Originally Posted by *rdr09*
> 
> Very nice. You are using 14.4?
> 
> congrats on your first watercooled build.


Nah, was using the beta, but then realised I need 14.4 for it to register my overclocks... thought it didn't overclock.
So, is that a good overclock on the card, then?

I'll post a screenshot once I have the 14.4 drivers installed, and I'll boost the CPU a bit more.


----------



## rdr09

Quote:


> Originally Posted by *renzkuken1*
> 
> Nah was using the beta but then realised i need 14.4 for it to register my overclocks...thought it didn't overclock.
> So is that a good overclock on the card is it?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ill post a screenshot once i have 14.4 drivers installed and i'll boost the cpu a bit more.


The 14.6 and 14.7 betas give like a 400-point boost in graphics score; don't be surprised if your score goes down on 14.4. I am using 14.4 so I can play Warface.


----------



## Klocek001

Would anyone kindly help me get my R9 290 to 1150 with a sane vcore bump (offset preferably)? Using the Tri-X version, Trixx OC software.


----------



## renzkuken1

Quote:


> Originally Posted by *rdr09*
> 
> 14.6 and 14.7 Betas gives like 400 points boost in graphics score. don't be surprised if your score go down in 14.4. i am using 14.4, so i can play Warface.


Have 14.4 now, trying 3DMark11 again.


----------



## rdr09

Quote:


> Originally Posted by *Klocek001*
> 
> Would anyone kindly help me get my r9 290 to 1150 with a sane vcore bump (offset preferably). Using Tri-X version, Tri-X OC software.


Have you tried a +100 offset?

Make sure AMD Overdrive is disabled in CCC.

I opened 2 screens so you can see the Power Limit . . .

Not sure what you hit in the silicon lottery, but you might need a lower or a higher offset. It is safe to start at +100, then test with a synthetic bench like 3DMark11. I always set the Power Limit to the highest setting at any OC. You've got to keep your temps below 80C (imo).

edit: you might not even need an offset at 1150 . . . just max the Power Limit.


----------



## Klocek001

VDDC +100 is 0.1 V?


----------



## rdr09

Quote:


> Originally Posted by *Klocek001*
> 
> VDDC 100 is 0,1 v ?


I saw 1.4v benching with a +200 offset. Some use it 24/7.

edit: wait, I don't recommend +200. Maybe +150 max, depending on your temps (core and VRMs). Don't be mixing AB, Trixx and GPU-Z; I use HWiNFO64 to monitor temps.

The goal is to find the lowest offset. Start at +100; if it passes, go lower (+80). If it fails, go a bit higher (+110). Again, depending on your temps. 3DMark11 is a short bench to test stability and temps, but a good start. Games like BF4 will be the true test.
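The search described here (start at +100, step down while the bench passes, step back up on a fail) can be sketched as a small loop. `is_stable` below is a hypothetical stand-in for actually running 3DMark11 or a game and watching for artifacts and crashes, which you would still do by hand:

```python
# Sketch of the "find the lowest stable offset" routine described above:
# start at +100 mV, walk down while the bench passes, walk up on a fail,
# and respect the suggested +150 cap.

def find_lowest_offset(is_stable, start=100, step=10, max_offset=150):
    """Return the lowest offset (mV) that still passes `is_stable`,
    or None if even `max_offset` fails."""
    offset = start
    if not is_stable(offset):
        # Failed at the starting point: walk upward toward the cap.
        while offset < max_offset:
            offset += step
            if is_stable(offset):
                return offset
        return None
    # Passed: walk downward until the next step would fail.
    while offset - step >= 0 and is_stable(offset - step):
        offset -= step
    return offset

# Toy stand-in: pretend the card happens to be stable at +80 and above.
print(find_lowest_offset(lambda mv: mv >= 80))  # -> 80
```

A plain step search like this is enough here; offsets only come in coarse increments anyway, so a fancier binary search buys almost nothing.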


----------



## Klocek001

Installed Trixx.
Set 1150 on core, +100 vddc, +50 power. No memory changes (1300 default).
Ran Far Cry 3. 10 fps less.
Uninstalled Trixx.
Reinstalled Trixx.
Set 1150 on core, +100 vddc, +50 power. No memory changes (1300 default).
Ran Far Cry 3. Seems the same as @1000 core.
Hit default in Trixx.
The default button set my mem to 2200MHz (!!!). PC froze.
Uninstalled Trixx.

WT* is this crap ?!

Gonna try AB, I remember it was great, used it with an 8600GTS back in the day....
Are you suggesting uninstalling GPU-Z first? It's not OC software, it just reads...


----------



## renzkuken1

Quote:


> Originally Posted by *renzkuken1*
> 
> SCORE
> P14223 with AMD Radeon R9 290X(1x) and Intel Core i5-3570K Processor
> Graphics Score 18345
> Physics Score 8517
> Combined Score 8465


SCORE
P13765 with AMD Radeon R9 290X(1x) and Intel Core i5-3570K Processor
Graphics Score 17822
Physics Score 8207
Combined Score 8140

New score with the 14.4 drivers. Why isn't it showing my overclocks in the details, and why is it showing my memory as 667MHz when it's 1600MHz? :/


----------



## rdr09

Quote:


> Originally Posted by *Klocek001*
> 
> Installed Trixx.
> Set 1150 on core, +100 vddc, +50 power. No memory changes (1300 default).
> Ran Far Cry 3. 10 fps less.
> Uninstalled Trixx.
> Reinstalled Trixx.
> Set 1150 on core, +100 vddc, +50 power. No memory changes (1300 default).
> Ran Far Cry 3. Seems the same as @1000 core.
> Hit default in Trixx.
> The default button set my mem to 2200MHz (!!!). PC froze.
> Uninstalled Trixx.
> 
> WT* is this crap ?!
> 
> Gonna try AB, I remember it was great, used it with an 8600GTS back in the day....
> Are you suggesting uninstalling GPU-Z first? It's not OC software, it just reads...


yah, use AB. Shut down Trixx and, again, make sure AMD Overdrive is disabled.

Quote:


> Originally Posted by *renzkuken1*
> 
> SCORE
> P13765 with AMD Radeon R9 290X(1x) and Intel Core i5-3570K Processor
> Graphics Score 17822
> Physics Score 8207
> Combined Score 8140
> 
> New score with the 14.4 drivers. Why isn't it showing my overclocks in the details, and why is it showing my memory as 667MHz when it's 1600MHz? :/


don't sweat those readings. your gpu is working 100% fine.

Now my issue . . . I love Mantle and I am not going back to D3D in BF4, but it seems that I still have to use HT, which raises my temps a bit. My other concern is that if and when I add another 290, my i7 might struggle . . .

HT on



HT off



this is just 1080 at 130% scale. My temps are still nice, though . . .


----------



## tsm106

Quote:


> Originally Posted by *Mega Man*
> 
> well if anyone does use a mst hub, i can only activate 2 of my 1080p monitors,
> 
> using newest drivers directly connected by DP ( all 3 )
> 
> i can use them with 2 via mst hub and one via DLDVI
> 
> however i should be able to use them soley on the mst hub. i know the mst hub works,
> 
> it will load all 3 screens at boot up
> 
> any ideas/ any infos i can give for helps !~


You can't run three 1080p 144Hz panels off one DP. There's barely enough bandwidth for two 120Hz/144Hz panels. This is why the ref Hawaii port arrangement is really gimped. I guess all them Korean panel owners complained really loudly last generation, and AMD only sort of listened lol, by adding another DL-DVI but at the cost of a DP. Go figure. Now Korean panels are out and DP is in because of Nvidia's adoption of DP for G-Sync. All your ports belong to DP. Hawaii would have been so much better with the old ports lol. Shakes head.
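For what it's worth, the bandwidth claim above roughly checks out on paper. A back-of-the-envelope calculation (my own numbers: a DP 1.2 HBR2 payload of about 17.28 Gbit/s and an approximate 12% blanking overhead, neither taken from the post):

```python
# Rough DisplayPort bandwidth arithmetic for three 1080p 144 Hz panels.

link_capacity_gbps = 17.28              # DP 1.2 HBR2, 4 lanes, after 8b/10b coding
pixels_per_second = 1920 * 1080 * 144   # active pixels only
blanking_overhead = 1.12                # approx. CVT reduced-blanking factor
bits_per_pixel = 24                     # 8-bit RGB

per_panel_gbps = pixels_per_second * blanking_overhead * bits_per_pixel / 1e9
print(f"one panel:    {per_panel_gbps:.1f} Gbit/s")
print(f"two panels:   {2 * per_panel_gbps:.1f} Gbit/s (just fits in 17.28)")
print(f"three panels: {3 * per_panel_gbps:.1f} Gbit/s (exceeds the link)")
```

Two such streams land just under the HBR2 payload; a third pushes well past it, which is why one connector (MST hub or not) can't drive all three at 144Hz.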


----------



## HighTemplar

Just picked up two Sapphire 290X reference cards for $275 each used, and I'm grabbing two more this weekend.

I'll be comparing them in 2-way, 3-way, and 4-way setups vs my 780 Ti Classified 4-way setup.

I needed the extra VRAM for 4K gaming, and the fact that they perform pretty close to 780 Tis at 4K makes it worth trying, especially at 40% of the price of a 780 Ti.









Has anyone else been grabbing a bunch of these to use for crossfire setups?

And as far as buying used from mining setups, mine were used in a dust-free datacenter environment, and still have a warranty via Sapphire.

I was stuck between buying a Titan Black with 6GB VRAM or downgrading to 6GB 780s, and what's funny is, even the 6GB 780s were more expensive than the 290Xs.


----------



## spikezone2004

What are the best benchmarks and tests to run on an R9 290? Before I install mine I want to run the benchmarks and tests on my current card, then install the R9 290 and run them all again to compare. I was planning on using the 14.4 drivers for an older game I play that works with them, but I see the 14.7 beta gives better scores? What about FPS: does it improve that or just the scores?


----------



## rdr09

Quote:


> Originally Posted by *HighTemplar*
> 
> Just picked up Two Sapphire 290X Reference cards for $275 each used, and I'm grabbing 2 more this weekend.
> 
> I'll be comparing them in 2 way, 3 way, and 4 way setups vs my 780 Ti Classified 4 Way setup.
> 
> I needed the extra VRAM for 4K gaming, and the fact that they perform pretty close to 780 Ti's @ 4K makes it worth trying especially at 40% the price of a 780 Ti
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Has anyone else been grabbing a bunch of these to use for crossfire setups?
> 
> And as far as buying used from mining setups, mine were used in a dust-free datacenter environment, and still have a warranty via Sapphire.
> 
> I was stuck between buying a Titan black with 6GB VRAM, or downgrading to 6GB 780s, and what's funny is, even the 6GB 780s were more expensive than the 290X's.


Four? You're gonna need a datacenter environment.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> 4? you gonna need a datacenter environment.


lol, with the bottom dropping out of the used 290 market, we're gonna see everyone and their moms running quads. Though there's a huge difference between sticking four in a box and smashing it into the top 10 on the HOF.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> lol with the bottom dropping out from the used 290 market, we're gonna see everyone and their moms running quads. Though there's a huge difference between sticking four in a box and smashing it into the top 10 on the HOF.


64F ambient for gaming and less for benching.

Bench that Lightning already.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> lol with the bottom dropping out from the used 290 market, we're gonna see everyone and their moms running quads. Though there's a huge difference between sticking four in a box and smashing it into the top 10 on the HOF.
> 
> 
> 
> 64F ambient for gaming and less for benching.
> 
> Bench that Lightning already.
Click to expand...

It's #2 on hwbot already. It gets 7300 on FSE.

*And for the record I freaking hate benching with Afterburner + Hawaii. Half the time I want to punch something lol. I hate Lightnings.


----------



## VSG

Quote:


> Originally Posted by *tsm106*
> 
> It's #2 on hwbot already. It gets 7300 on FSE.
> 
> *And for the record I freaking hate benching with Afterburner + Hawaii. Half the time I want to punch something lol. I hate Lightnings.


I only see one 7300 FSE 290X Lightning on HWBot, and that's from a guy in Indonesia with LN2 cooling. Is that you?


----------



## tsm106

Quote:


> Originally Posted by *geggeg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> It's #2 on hwbot already. It gets 7300 on FSE.
> 
> *And for the record I freaking hate benching with Afterburner + Hawaii. Half the time I want to punch something lol. I hate Lightnings.
> 
> 
> 
> I only see 1 7300 FSE 290x Lightning on HWBot and that's from a guy in Indonesia with LN2 cooling. Is that you?
Click to expand...

Oh sorry, memory fails me. It's #2 in novice with 7134. Some crazy guy from Norway sent it to me, but I had to send him my legendary XFX, shrugs. I like the Lightning but I also hate it to death. LOL.


----------



## VSG

Quote:


> Originally Posted by *tsm106*
> 
> Oh sorry, memory fails me. It's #2 on novice with 7134. Some crazy guy from norway sent it to me, but I had to send him my legendary xfx, shrugs. I like the lightning but I also hate it to death. LOL.


lol I know Ranger. He goes nuts with voltages, but then again he often has ambients of -5 to -20C. That's a pretty good card you have, if he hasn't degraded it already.


----------



## Jflisk

I have to find another R9 290X for my rig. The new case is here: a Fractal Design ARC XL. Guess I'll be ripping apart my old case tonight. After I get this all done, I guess I'll order another water block and a 3-way EK bridge. I need some color scheme ideas. Ugly Noctuas + ASUS Sabertooth board, so basically blaaa: green, brown, tan.


----------



## rdr09

Quote:


> Originally Posted by *geggeg*
> 
> lol I know Ranger. He goes nuts with voltages but then again he has ambients of -5 to -20 C often. That's a pretty good card you have if he hasn't degraded it already


He went nuts with a Lightning. Look at tsm's avatar. lol


----------



## VSG

Quote:


> Originally Posted by *rdr09*
> 
> He went nuts with a Lightning. Look at tsm's avatar. lol


Given how buggy the 7970 Matrix was, I would have likely done that as well if I had one that infuriated me.


----------



## tsm106

Haha. Actually that's Ranger's Matrix.


----------



## VSG

Why am I not surprised?


----------



## tsm106




----------



## Klocek001

Like I said a few hours ago, using AB instead of triXXX was a much more pleasant experience for me. R9 290 @1150MHz core with +81mV never hit 70 degrees with a 66% fixed fan speed.
However, here are a few things I'd like to ask you about:

AB shows that the GPU never actually hit 1150MHz; it was between 1060-1130. I figure that's fine. On the FPS graph I noticed the game was (almost) perfectly smooth at 60fps, but there were 3 short 30fps drops, which correspond with 3 GPU core speed drops (700-800MHz) on the timeline. Do you think locking it to 1150 constantly (I know RadeonPro can do that) can prevent the occasional FPS drops? Or was that just "bad game programming"? sorry for the terrible term

Now, the most important part I'd like to ask:
CPU Core 3 was running at 90-100% for the whole testing time
Core 4 - 40-60%
Core 2 - 45-70%
*Core 1 - 25-40%*

Is that something to be concerned about? Is it throttling? I didn't notice anything over 59 degrees....


----------



## Gualichu04

XFX received my faulty R9 290X DD and tested it. It didn't show the random crashing I experience in nearly any game that stresses the GPU, such as BF4, Tomb Raider and Watch Dogs, while my current GPU works flawlessly. They are replacing it anyway to rule out all issues. Soon I will harness the power of two R9 290Xs under water.


----------



## tsm106

Quote:


> Originally Posted by *Gualichu04*
> 
> XFX received my faulty R9 290X DD and tested it. It didn't show the random crashing I experience in nearly any game that stresses the GPU, such as BF4, Tomb Raider and Watch Dogs, while my current GPU works flawlessly. They are replacing it anyway to rule out all issues. Soon I will harness the power of two R9 290Xs under water.


The cards are reference I hope. Ref cards under water + asus bios + gputweak = benching done right. Anything else makes me want to punch the screen lol.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gualichu04*
> 
> XFX received my faulty R9 290X DD and tested it. It didn't show the random crashing I experience in nearly any game that stresses the GPU, such as BF4, Tomb Raider and Watch Dogs, while my current GPU works flawlessly. They are replacing it anyway to rule out all issues. Soon I will harness the power of two R9 290Xs under water.


So even though they can't replicate the results, they are replacing it for you anyway?


----------



## heroxoot

Quote:


> Originally Posted by *Klocek001*
> 
> Like I said a few hours ago, using AB instead of triXXX was a much more pleasant experience for me. R290 @1150MHz core with +81mV never hit 70 degrees with 66% fixed fan speed.
> However, here's a few things I'd like to ask you about:
> 
> AB shows that the GPU never actually hit 1150MHz, it was between 1060-1130. I figure that's fine. On the FPS graph I noticed the game was (almost) perfectly smooth at 60fps, however there were 3 short 30fps drops, which correspond with 3 GPU core speed drops (700-800MHz) on the time line. Do you think locking it to 1150 constantly (I know RadeonPro can do that) can prevent the occasional FPS drops ? Or was that just "bad game programming" ? sorry for the terrible term
> 
> Now, the most important part I'd like to ask -
> CPU Core 3 was running at 90 - 100 % for the whole testing time
> Core 4 - 40-60%
> Core 2 - 45-70%
> *Core 1 - 25-40%*
> 
> Is that sth to be concerned about ? Is it throttling ? I didn't notice anything over 59 degrees ....


When this happens it's usually the game not pushing hard enough. When I play BF4 the core stays within 10MHz of the set clock, with GPU usage staying 90%+. In Heaven it stays pegged at full clocks and it's at 100% load. When I play games that push less than 60% load, however, I only see my GPU clocking around 900MHz.

So yeah, it might just be that the program isn't pushing your GPU to enough load. And that can be fine, depending on the game.


----------



## bluedevil

Supposedly a shipment of R9 290s will be coming in at VisionTek today. They told me my replacement would be en route.


----------



## Sgt Bilko

Quote:


> Originally Posted by *bluedevil*
> 
> Supposedly a shipment of R9 290s will be coming in at VisionTek today. They told me my replacement would be en route.


Good to hear!

So... I'm guessing they didn't go for the 295X2 suggestion?


----------



## Gualichu04

Quote:


> Originally Posted by *Sgt Bilko*
> 
> So even thought they can't replicate the results they are replacing it for you anyway?


Yes they are, which I find amazing. I've never had a bad experience with their RMA process. The last XFX card I bought was a Radeon HD 5770, and it still works, 2-3 RMAs later.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gualichu04*
> 
> Yes they are, which I find amazing. I've never had a bad experience with their RMA process. The last XFX card I bought was a Radeon HD 5770, and it still works, 2-3 RMAs later.


That's good to hear. I heard a lot of bad things about XFX and their RMA in particular, but since owning my cards I'm only hearing good things.

Guess they are on the up and up.


----------



## Arizonian

Quote:


> Originally Posted by *bluedevil*
> 
> Supposedly a shipment of R9 290s will be coming in at VisionTek today. They told me my replacement would be en route.


Good to hear. The VisionTek RMA wasn't too bad. The 'testing' phase made me wonder, but in the end it looks like they came through. Good luck scoring some good clockers.


----------



## Gualichu04

Quote:


> Originally Posted by *tsm106*
> 
> The cards are reference I hope. Ref cards under water + asus bios + gputweak = benching done right. Anything else makes me want to punch the screen lol.


They are an XFX design but can fit reference water blocks. They are the Double Dissipation ones XFX makes, which isn't much better than stock cooling. Would I be able to put an ASUS BIOS on it for better performance or stability? If so, which one?


----------



## Klocek001

Anyone here using MSI Afterburner with 14.7 finding that the 2D/3D profiles don't switch automatically? I'm coming to think this might be a beta driver issue, but I don't wanna guess.


----------



## bluedevil

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Good to hear!
> 
> So......I'm guessing they didn't go much on the 295x2 suggestion?


not hardly....not even a 290x


----------



## spikezone2004

Any good overclocking guides for the R9 290s? I haven't got much experience overclocking GPUs, since my previous GPU was voltage locked and could only change clocks a tiny bit before instability became an issue.

I see people seem to be overclocking to 1100 core and 1300 mem clock, but I'm not sure if they are raising voltages or what.


----------



## Raephen

Quote:


> Originally Posted by *spikezone2004*
> 
> Any good overclocking guides for the R9 290s? I haven't got much experience overclocking GPUs, since my previous GPU was voltage locked and could only change clocks a tiny bit before instability became an issue.
> 
> I see people seem to be overclocking to 1100 core and 1300 mem clock, but I'm not sure if they are raising voltages or what.


The consensus in this thread seems to be that an average, or maybe slightly above average, card shouldn't need any extra voltage for 1100 core. The 1300 VRAM clock wouldn't make that much of a difference outside benchmarks, due to the 512-bit bus.

Oh, it's true: I run my 290 at 1150/1350 with +20ish mV, but that is just because I can, not because I will see a difference in ESO between 1250 and 1350; my bottleneck seems to be server side.


----------



## ZealotKi11er

I am having problems with MSI AB going over +100mV. Does the method posted in the OP only work for single GPUs? There must be some other way to OC while using MSI AB; I can't stand the ASUS and Sapphire garbage.


----------



## tsm106

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am having problems with MSI AB going over + 100 mV. Does the method posted in OP only work for single GPUs? There must be some other way to OC while using MSI AB. I cant stand the ASUS and Sapphire garbage.


Did you use the multi-GPU switch? It's in the OP. You have to switch to using sgX, i.e. sg0 for GPU 1, etc.


----------



## bluedevil

Oh and I am set to get the package (R9 290) tomorrow! Looks like some gaming action soon! Can't wait!


----------



## Decade

TL;DR: I nerdgasm'd over a bug.

So, I was playing with my crossfire setup of what's supposed to be an Asus R9 290 reference and a Sapphire Tri-X OC R9 290.

I noticed something weird in GPU-Z.



*THE ASUS IS AN R9 290X, JUDGING BY THE SHADER COUNT!*

The best part? I bought the Asus as a refurb 290 from Newegg!

And I have no idea why GPU-Z says the Tri-X is an Asus, so don't ask.


----------



## Sgt Bilko

It's a known GPU-Z bug; it's really a 290, it's just reading it wrong.


----------



## Decade

Damn it! You're right, it's back to normal now.

Well, at least I still have two bad ass cards.


----------



## The Storm

Quote:


> Originally Posted by *Decade*
> 
> Damn it! You're right, it's back to normal now.
> 
> Well, at least I still have two bad ass cards.


Agreed, have fun with them.


----------



## kizwan

Yup! See "autodetect" in the memory type field. ULPS is enabled and is causing GPU-Z to mistakenly report the wrong shader count. Disable ULPS using MSI AB, then try running GPU-Z again.


----------



## Decade

Quote:


> Originally Posted by *kizwan*
> 
> Yup! See "autodetect" in the memory type field. ULPS is enabled and is causing GPU-Z to mistakenly report the wrong shader count. Disable ULPS using MSI AB, then try running GPU-Z again.


Thanks for reminding me about ULPS.

I take it Afterburner is easier than plowing through the registry and sticks through Catalyst updates?


----------



## JourneymanMike

Quote:


> Originally Posted by *Decade*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Yup! See "autodetect" in the memory type field. ULPS is enabled and is causing GPU-Z to mistakenly report the wrong shader count. Disable ULPS using MSI AB, then try running GPU-Z again.
> 
> 
> 
> Thanks for reminding me about ULPS.
> 
> I take it Afterburner is easier than plowing through the registry and sticks through Catalyst updates?
Click to expand...

I found this http://www.ulpsconfigurationutility.com/ and it made an easy job of turning ULPS on/off...


----------



## Decade

Any thoughts on re-arranging my cables? They're purposely like this to keep the Asus from sagging. Doesn't seem to interfere much with the middle 120mm fan.

It does cover my 24 pin though, so I can use the stock one on my SuperNova 750.


----------



## Pierce

Hi guys, I have a reference R9 290.

I'm sort of new to this. I was getting lag in BF4, so I decided to check my GPU usage. I always knew the temps were high, but my GPU usage is inconsistent. It constantly goes up and down and won't stay at, like, 55% (or whatever).

Same with the GPU clock speed. Earlier today it was at like 946, but now it's going up and down. I saw it at 929, then like 945, etc. Is this throttling? Will I get better gameplay by getting a new heatsink?

I didn't know ref cards ran like this. Now I know a bit more.


----------



## Knight26

Quote:


> Originally Posted by *Pierce*
> 
> Hi guys I have a reference r9 290
> 
> Im sort of new to this. I was getting lag in bf4 so I decided to check my gpu usage. I always knew temps were high but my gpu usage is inconsistent. It constantly goes up and down and wont stay like at 55% (or whatever)
> 
> same with gpu clock speed. Earlier today it was at like 946, but now its going up and down. I saw it at 929, then like 945, etc. Is this throttling? Will I get better gameplay by getting a new heat sink?
> 
> I didnt know ref cards ran this. Now i know a bit more


If you are using the stock fan profile, then it's probably being throttled because you are hitting the temperature ceiling at 94C. You can change the maximum fan speed in Catalyst Control Center; setting it to about 65% kept my cards at about 85C without any throttling before I put water blocks on them.
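The behaviour described above can be shown with a toy model (the 94C ceiling is the reference card's real stock target; the cooling coefficients below are invented purely for illustration, not measured):

```python
# Simplified thermal-throttle model: with the fan capped too low, the card
# pins itself at the 94C ceiling by shedding clocks; a higher fan cap lets
# it hold full clocks below the ceiling.

TEMP_CEILING_C = 94

def steady_state(fan_cap_pct, clock_mhz):
    """Return (temperature, effective clock) using made-up cooling coefficients."""
    temp = 60 + clock_mhz * 0.05 - fan_cap_pct * 0.3
    if temp <= TEMP_CEILING_C:
        return temp, clock_mhz                         # fan headroom: full clocks
    # Otherwise shed clocks until the ceiling is met (thermal throttle).
    throttled = clock_mhz - (temp - TEMP_CEILING_C) / 0.05
    return TEMP_CEILING_C, throttled

print(steady_state(40, 1000))   # low fan cap: pinned at 94C, clocks drop
print(steady_state(65, 1000))   # higher fan cap: below ceiling, full 1000 MHz
```

The point of the sketch is only the shape of the trade-off: more fan means the same clock at a lower steady temperature, which is exactly why raising the max fan speed stops the clock bouncing around.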


----------



## kizwan

Quote:


> Originally Posted by *Decade*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Yup! See "autodetect" in the memory type field. ULPS is enabled and is causing GPU-Z to mistakenly report the wrong shader count. Disable ULPS using MSI AB, then try running GPU-Z again.
> 
> 
> 
> Thanks for reminding me about ULPS.
> 
> I take it Afterburner is easier than plowing through the registry and sticks through Catalyst updates?
Click to expand...

The registry is easy too, if you remember what to edit/delete/add; for me Afterburner is easier. Neither method sticks through a Catalyst update, though. You'll need to redo it every time you update.
Quote:


> Originally Posted by *Decade*
> 
> Any thoughts on re-arranging my cables? They're purposely like this to keep the Asus from sagging. Doesn't seem to interfere much with the middle 120mm fan.
> 
> It does cover my 24 pin though, so I can use the stock one on my SuperNova 750.
> 
> 
> Spoiler: Warning: Spoiler!


That looks OK to me. Looks like a cocoon. That's a unique way to mount a radiator. I'd put fan/rad grills over the radiator fins. Just my personal taste; I don't like exposed radiator fins.


----------



## Pierce

Quote:


> Originally Posted by *Knight26*
> 
> If you are using the stock fan profile, then it's probably being throttled b/c you are hitting the temperature ceiling at 94C. You can change the maximum fan speed in catalyst control center. Setting it to about 65% would keep my cards at about 85C without any throttling before I put water blocks on them.


Hey, I tried that and I'm getting the same problem. I increased the fan speed and decreased the target temp to 75 degrees. GPU utilization won't go really low, like 8%, but yeah, both continue to fluctuate. The GPU clock will be at 929 one sec, then 940, and it won't stay at a certain speed.


----------



## kizwan

Quote:


> Originally Posted by *Pierce*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Knight26*
> 
> If you are using the stock fan profile, then it's probably being throttled b/c you are hitting the temperature ceiling at 94C. You can change the maximum fan speed in catalyst control center. Setting it to about 65% would keep my cards at about 85C without any throttling before I put water blocks on them.
> 
> 
> 
> hey I tried that and Im getting the same problem. I increased the fan speed and decreased target temp. to 75 degrees. GPU utilization wont go really low like 8% but yea both continue to fluctuate. GPU clock will be at 929 one sec, then 940, and it wont stay at a certain speed.
Click to expand...

You don't want to set the target temp to 75C. It will make your card start throttling at 75C.


----------



## Pierce

Quote:


> Originally Posted by *kizwan*
> 
> You don't want to set the target temp to 75C. It will make your card start throttling at 75C.


Should I just increase the fan speed?

Also, would a good heatsink take care of this? A 3rd-party solution.


----------



## kizwan

Quote:


> Originally Posted by *Pierce*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> You don't want to set the target temp to 75C. It will make your card start throttling at 75C.
> 
> 
> 
> should i just increase the fan speed?
> 
> also would a good heatsink take care of this? a 3rd party solution
Click to expand...

Just increase the fan speed. It's normal for GPU usage to fluctuate with the 290/290X. Are you playing @1080p? It's also normal for the GPU frequency to fluctuate a little with the 290/290X, depending on the game. Are you experiencing poor performance in games?

GPU usage with BF4 @1080p 100% resolution scale


GPU usage with BF4 @1080p 200% resolution scale (rendering @4K resolution)


----------



## naved777

Is +200mV safe for a 290X with the stock cooler?
AB is locked at +100mV,
but I just found that Trixx allows +200mV.


----------



## kizwan

Quote:


> Originally Posted by *naved777*
> 
> is +200mV safe for a 290x with stock cooler ?
> AB is locked at 100mV
> but just found Trixx allows 200mV


I always benched my cards with +200mV (1.38V and 1.4V before Vdroop, respectively), but I'm using water. The key is cooling. With the stock cooler, I think you'll need an ambient of 10C or lower to run +200mV.


----------



## Pierce

Quote:


> Originally Posted by *kizwan*
> 
> Just increase the fan speed. It's normal for GPU usage to fluctuate with the 290/290X. Are you playing @1080p? It's also normal for the GPU frequency to fluctuate a little with the 290/290X, depending on the game. Are you experiencing poor performance in games?
> 
> GPU usage with BF4 @1080p 100% resolution scale
> 
> 
> GPU usage with BF4 @1080p 200% resolution scale (rendering @4K resolution)


I appreciate the reply. I was playing at 1600x900.

What is "a little"? Because this will go to 40% one sec, then 66, etc. There's no consistency.


----------



## kizwan

Quote:


> Originally Posted by *Pierce*
> 
> i appreciate the reply. I was playing at 1600 x 900
> 
> What is little? because this will go to 40% one sec, then 66, etc. theres no consistency.


The "little" comment was about the clock, not usage. Did you see the first pic? GPU usage @1080p with BF4 was running fine. Like I said, it's normal.


----------



## Mega Man

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> well if anyone does use a mst hub, i can only activate 2 of my 1080p monitors,
> 
> using newest drivers directly connected by DP ( all 3 )
> 
> i can use them with 2 via mst hub and one via DLDVI
> 
> however i should be able to use them soley on the mst hub. i know the mst hub works,
> 
> it will load all 3 screens at boot up
> 
> any ideas/ any infos i can give for helps !~
> 
> 
> 
> You can't run three 1080 144hz panels off one DP. There's barely enough bandwidth for two 120hz/144hz panels. This is why the ref Hawaii port arrangement is really gimped. I guess all them korean panel owners complained really loudly last generation out. And AMD only sort of listened lol, by adding another DL DVI but at the cost of a DP. Go figure. Now koreans are out and DP is in because of Nvidia adoption of DP due to gstring. All your ports belong to DP. Hawaii would have been so much better with the old ports lol. Shakes head.
Click to expand...

Yea, I 100% agree. I hate dual-slot cards; that is why 7970s are my faves!


----------



## tsm106

Quote:


> Originally Posted by *Gualichu04*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> The cards are reference I hope. Ref cards under water + asus bios + gputweak = benching done right. Anything else makes me want to punch the screen lol.
> 
> 
> 
> They are a xfx design but can fit reference water blocks. They are the double dissipation ones they make which is not much better than stock cooling. Would i be able to put a asus bios on it for better performance or stability? if so which one?
Click to expand...

I haven't been keeping up with Hawaii and the evolution of its many PCB permutations. With regard to the ASUS BIOS: if XFX did not change the VRM controller, then you could feasibly flash one of your BIOS switches to the ASUS BIOS or one of the unlocked "PT" BIOSes, though I would not recommend anything other than PT1 on air. And about air cooling: it's generally a bad idea to use the unlocked BIOS with the unlocked GPU Tweak while on air. Unlocked 290Xs can be monsters in benchmarks, but that comes at the cost of serious wattage draw, which means you would need serious cooling to match. That's not possible on air.

However, you could use the PT1 BIOS if, for example, you dislike the stock throttle personality of the Hawaii cards. The unlocked PT1 and PT3 have PowerTune removed, so they do not downclock at all. But again, using your GPU at 100% all the time will require comparable cooling. If you were on water and had ample radiator surface area to match, then PT1 is pretty safe when used within reason.


----------



## flamin9_t00l

Hey all, I've been playing around with (or wrestling with) the 14.7 beta drivers on my Crossfire rig, and I've noticed that the stable OC I was using with the 14.4 WHQL drivers is no good with 14.7. I can reach the same core OC, but the memory OC that used to work now immediately black-screens the system when applied. I'm totally fine with this, as performance at the same clocks is improved and stock clocks are perfectly stable, but I was wondering if any of you have the same thing happening, i.e. have had to find a stable mem OC again, since in my case only the mem is affected.

It was the same with 14.6 when I tried it a while ago (installed it, applied my OC, instant black screen), so at that point I just removed it and reinstalled the 14.4s; only now have I decided to track down the problem, as the 14.7s have their benefits.

So, any of you guys have black screens/unstable mem on your OC after the 14.4s?

EDIT: I think I just asked a stupid question, didn't I? I should have asked if anyone is *not* having this happen.

There's lots of people having black screens, but is it always down to unstable mem (some people even at stock clocks) above 14.4?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Pierce*
> 
> should i just increase the fan speed?
> 
> also would a good heatsink take care of this? a 3rd party solution


Can you update your sig rig so we know what parts you have, like CPU and RAM? I sure hope you are not running a 290 with an E8400.


----------



## flamin9_t00l

Quote:


> Originally Posted by *Pierce*
> 
> Hi guys I have a reference r9 290
> 
> Im sort of new to this. I was getting lag in bf4 so I decided to check my gpu usage. I always knew temps were high but my gpu usage is inconsistent. It constantly goes up and down and wont stay like at 55% (or whatever)
> 
> same with gpu clock speed. Earlier today it was at like 946, but now its going up and down. I saw it at 929, then like 945, etc. Is this throttling? Will I get better gameplay by getting a new heat sink?
> 
> I didnt know ref cards ran this. Now i know a bit more


Increase fan speed and power limit, but don't decrease the temp target, and you should see more stable clocks with no downclocking.

With the stock cooler I used a manual static fan speed, which kept the card below about 75-80C in whatever games/benchies I ran. Combine this with a power limit increase of +10% or +20%, just in case the card requires the extra power, and the clocks were always solid.

If the card draws more power than the default limit allows, it will throttle the clocks, so a bump in power limit prevents this. It may not be required, but it doesn't hurt to increase it.
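The power-limit behaviour described above can be sketched as a toy model. To be clear, this is an illustration, not AMD's actual PowerTune algorithm; the function name and the wattage numbers are made up for the example:

```python
# Illustrative sketch of power-limit throttling (NOT AMD's real
# PowerTune logic): if the card's draw would exceed the configured
# limit, the core clock is scaled back proportionally.

def effective_clock(requested_mhz, power_draw_w, power_limit_w):
    """Return the clock the card would settle at under a power cap."""
    if power_draw_w <= power_limit_w:
        return requested_mhz          # within budget: no throttling
    # over budget: scale the clock down by the overshoot ratio
    return requested_mhz * power_limit_w / power_draw_w

# Stock limit vs. a raised limit at the same (over-budget) draw:
print(effective_clock(1000, 250, 208))   # throttles to 832.0 MHz
print(effective_clock(1000, 250, 250))   # raised limit: holds 1000 MHz
```

This is why bumping the power limit "doesn't hurt": if the card never exceeds the budget the limit simply never engages.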


----------



## Klocek001

What sort of core clock can be achieved at +100mV ? If it's like past 1200 (1250 possibly) I'd like to try, but if it's like 1170-1180 max I'd rather not go through benching and be disappointed.
I'm currently at +80mV @1150MHz. Haven't tried anything else, I just set +80mV-1150-+50% and it's working, no problem. Temps - 66-68 degrees on Tri-X air cooling with 68% fan (fixed).


----------



## flamin9_t00l

Quote:


> Originally Posted by *Klocek001*
> 
> What sort of core clock can be achieved at +100mV ? If it's like past 1200 (1250 possibly) I'd like to try, but if it's like 1170-1180 max I'd rather not go through benching and be disappointed.
> I'm currently at +80mV @1150MHz. Haven't tried anything else, I just set +80mV-1150-+50% and it's working, no problem. Temps - 66-68 degrees on Tri-X air cooling with 68% fan (fixed).


The only way to know is to try it and find out. You won't know if you're at the limit for +80mV yet, since you just applied those settings once and they worked.


----------



## rdr09

Quote:


> Originally Posted by *flamin9_t00l*
> 
> Only way to know is try it and find out as you won't know if you're at the limit for +80mv yet as you just applied them and tried it.


^this. it's the silicon lottery. i need +80 for 1200 and +200 for 1300 for my 290. cooling matters as well, and a bit of skillzzz.

also, i need 0 for 1150.


----------



## gobblebox

Are the 14.7 or any of the 14.x drivers compatible with the extended voltage method in AB, so that I can use more than 100mV when OCing? I've followed the guide to a T in an attempt to get more than 100mV applied in AB, but nothing seems to work. It is my understanding that it doesn't actually _show_ 100+mV in AB after the method is applied, but after extending the voltage to match (and even exceed) my settings in Trixx, it is clear that the voltage isn't actually going past 100mV. Does anyone know how to resolve this? Trixx is great and all, but IMO, AB is lightyears ahead...


----------



## Arizonian

For our recent new owners please read OP and submit proof. Like to add you to the roster with us.









Quote:


> Originally Posted by *Klocek001*
> 
> What sort of core clock can be achieved at +100mV ? If it's like past 1200 (1250 possibly) I'd like to try, but if it's like 1170-1180 max I'd rather not go through benching and be disappointed.
> I'm currently at +80mV @1150MHz. Haven't tried anything else, I just set +80mV-1150-+50% and it's working, no problem. Temps - 66-68 degrees on Tri-X air cooling with 68% fan (fixed).


Nice clocks / temp you got IMO.









If I were you, I'd still try adding a bit more just to see what your card might be capable of, for fun. +20 on the core doesn't equate to more than 2-3 FPS in games, but if temps are still in check, why not overclock it? It will bring temps up a few degrees but will still be more than acceptable if you don't throttle.
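As a rough sanity check on the 2-3 FPS figure: FPS scales at most linearly with core clock when fully GPU-bound, so a quick back-of-the-envelope sketch (the function name and numbers here are mine, not from any tool) gives an upper bound:

```python
def best_case_fps_gain(fps, clock_mhz, delta_mhz):
    """Upper bound on FPS gained from a core clock bump.

    Assumes perfectly GPU-bound, linear scaling -- real games gain less.
    """
    return fps * delta_mhz / clock_mhz

# +20 MHz on a 1150 MHz core at 100 FPS buys at most ~1.7 FPS
print(round(best_case_fps_gain(100, 1150, 20), 1))
```

So even in the best case, a +20 MHz bump at these clocks is worth only a couple of frames, consistent with the estimate above.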


----------



## Klocek001

Quote:


> Originally Posted by *Arizonian*
> 
> Nice clocks / temp you got IMO.


That's 6 case fans and weeks of trial-and-error attempts to get the best airflow in my case, with just some average, cheap fans and a pair of plate shears.

I got +5 min fps going from 1100->1150. Now it's just pulling out that little bit more to get rid of like 57-58 fps drops and be at 60 ALL the time.

edit: all my chip can do at +100mV is 1180MHz; at 1200 I get artifacts, but not any crazy ones. It'd probably be fine with just a little more mV, but I can't do it in AB. I've already checked the "extend limits" box and rebooted; I still can't add a single mV.


----------



## Romanion

Hi, just like to get added to the list

Validation link: http://www.techpowerup.com/gpuz/mxdfx/
Brand: Sapphire 290x reference
Cooling: Stock now, EK waterblock next week.


----------



## gobblebox

Quote:


> Originally Posted by *gobblebox*
> 
> Are the 14.7 or any of the 14.x drivers compatible with the extended voltage method in AB, so that I can use more than 100mv when OCing? I've followed the guide to the T in an attempt to get more than 100mv applied in AB, but nothing seems to work. It is my understanding that it doesn't actually _show_ 100+mv in AB after the method is applied, but after extending the voltage to match (and even exceed) my settings in Trixx, it is clear that the voltage isn't actually going past 100mv. Does anyone know how to resolve this? Trixx is great and all, but IMO, AB is lightyears ahead...


Anyone?


----------



## devilhead

Quote:


> Originally Posted by *rdr09*
> 
> ^this. its silicon lot. i need +80 for 1200 and +200 for 1300 for my 290. cooling matters as well and a bit of skillzzz.
> 
> also, i need 0 for 1150.


heh, i have 2x sapphire 290, both can do 1200 with +80 and 1300 with +200.

one i have tested under water: http://www.3dmark.com/fs/2523508


----------



## rdr09

Quote:


> Originally Posted by *devilhead*
> 
> heh, i have 2x sapphire 290, both can do 1200 with +80
> 
> 
> 
> 
> 
> 
> 
> and 1300 +200
> 
> 
> 
> 
> 
> 
> 
> one have tested under water http://www.3dmark.com/fs/2523508


you've got better skillzz than i do.

PT1?


----------



## bluedevil

Got my RMA back. BNIB with the upgraded cooler. Makes me happy!


----------



## Arizonian

Quote:


> Originally Posted by *Romanion*
> 
> Hi, just like to get added to the list
> 
> Validation link: http://www.techpowerup.com/gpuz/mxdfx/
> Brand: Sapphire 290x reference
> Cooling: Stock now, EK waterblock next week.


Congrats - added







Pics of blocks, I'll update you to water. Nice.









Quote:


> Originally Posted by *gobblebox*
> 
> Are the 14.7 or any of the 14.x drivers compatible with the extended voltage method in AB, so that I can use more than 100mv when OCing? I've followed the guide to the T in an attempt to get more than 100mv applied in AB, but nothing seems to work. It is my understanding that it doesn't actually _show_ 100+mv in AB after the method is applied, but after extending the voltage to match (and even exceed) my settings in Trixx, it is clear that the voltage isn't actually going past 100mv. Does anyone know how to resolve this? Trixx is great and all, but IMO, AB is lightyears ahead...


Free bump; this question is already two pages back.








Quote:


> Originally Posted by *bluedevil*
> 
> Got my RMA back. BNIB with the upgraded cooler. Makes me happy!


Congrats.







When you get a chance, I'd like to know your best clocks and temps when you OC.


----------



## devilhead

Quote:


> Originally Posted by *rdr09*
> 
> you've got better skillzz than i do.
> 
> PT1?


yes, and 1.48v







but the memory is Elpida


----------



## tsm106

It sort of sucks to have a decent core on a 290. Benching wise it will always lose to 290x.


----------



## fishingfanatic

Ok, is this good enough?








FF


----------



## devilhead

Quote:


> Originally Posted by *tsm106*
> 
> It sort of sucks to have a decent core on a 290. Benching wise it will always lose to 290x.


yes, here you can see, almost the same clocks, but the score...
290 ----> http://www.3dmark.com/fs/2523508
290X ----> http://www.3dmark.com/fs/2523594
290X had 10 more on core and 20 more on memory, all other stuff the same







but score gap is huge


----------



## tsm106

Quote:


> Originally Posted by *devilhead*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> It sort of sucks to have a decent core on a 290. Benching wise it will always lose to 290x.
> 
> 
> 
> yes, here you can see, almost the same clocks, but score... 290 ----> http://www.3dmark.com/fs/2523508
> 290X
> 
> > http://www.3dmark.com/fs/2523594
> 290X had 10 more on core and 20more on memory, all other stuff same
> 
> 
> 
> 
> 
> 
> 
> but score gap is huge
Click to expand...

Yeap. Last year I had a lil bench-off vs Ranger. Let him run whatever clocks he wanted; he clocked his super golden 290 up to 1345/1775, but it still got creamed by a 1300/1600 290x. Hehe. I don't think most people realize how big the actual gaps are, lol.


----------



## rv8000

Let's play a game, it's called rate this coil whine... seriously, on a scale from 1-10 (10 being bad). I'm sick of this issue; I've literally swapped every component bar sending the card for RMA. I had high hopes the issue would be resolved when I picked up my 750W SuperNOVA G2, but the amount of whine didn't change, only the pitch in certain cases. I don't mind hearing my PC, but hearing this stupid electrical garbage above everything drives me insane, and this is only at 40-70 FPS; I play all of my games at 150+. Link below.


----------



## BradleyW

Quote:


> Originally Posted by *rv8000*
> 
> Lets play a game, its called rate this coil whine... seriously on a scale from 1-10 (10 being bad). Im sick of this issue, ive literally swapped every component bar sending the card for rma. I had high hopes the issue would be resolved when i picked up my 750w supernova G2 but the amount of whine didnt change, only the pitch in certain cases. I don't mind hearing my pc, but hearing this stupid electrical garbage above everything drives me insane, and this is only 40-70fps, i play all of my games 150+
> 
> 
> 
> 
> 
> 
> 
> . Link below.


Maybe you should have just switched the card instead of components which did not require changing? It's like having an engine issue, yet you swap everything other than the engine.


----------



## rv8000

Quote:


> Originally Posted by *BradleyW*
> 
> Maybe you should have just switched the card instead of components which did not require changing? It's like having an engine issue, yet you swap everything other than the engine.


The other parts were upgrades regardless of the card having whine or not; I wanted a better PSU anyway. I'm just not looking forward to the fact that even if they do reproduce the results in their RMA dept., they'll send a refurbed card and most likely not check it for the same issue, putting me right back at square one. I wanted to get a general opinion on how bad people thought it was; I'm still on the edge between dealing with it and RMA'ing the card.


----------



## shwarz

That coil whine's not too bad; my 290 sounds like that at +200mV.


----------



## the9quad

If I have coil whine it is masked by the fans, I have never heard any on mine.


----------



## Spectre-

I get coil whine at anything higher than 1.4 volts while benching


----------



## Mulm

Hi there. I have the famous atikmdag.sys BSOD (stop code 0x0000007E) with a Powercolor R9 290 and have the feeling the cause is the idle/2D mode, i.e. the GPU running at 300 MHz core and 150 MHz memory clock. So I want to test the card running at standard clocks all the time.

How do I achieve this? I tried to disable ULPS, but I'm not sure that's the right thing, and it doesn't work either. OC software like AB or Trixx just changes the clock speeds for 3D mode.


----------



## ABADY

are these tests normal?

3dmark

@stock month ago

http://www.3dmark.com/3dm/3577094?

and i tried to overclock my card ( its my first time ._. ) but i noticed some low scores in a few tests, so i ran a couple of them @stock and oc'ed

@1090/1430MHz

http://www.3dmark.com/3dm/3745900

@stock

http://www.3dmark.com/3dm/3749267?

@1100/1430MHz

http://www.3dmark.com/3dm/3749563?

as far as i know my card is stable; i played a couple of games (metro last light + arma 3) and didn't notice anything weird.

why did the first run get the highest score, especially on the ice storm test?

and can i oc the card more than 1100/1430?


----------



## schoolofmonkey

Quote:


> Originally Posted by *fishingfanatic*
> 
> Ok, is this good enough?


How do you find the temps on that beast?
GURU3D in their review said they were through the roof and Powercolor should have gone with an LCS:
http://www.guru3d.com/articles_pages/powercolor_radeon_290x_295x2_devil13_review,12.html

Thing looks awesome though..


----------



## lucifeil

Hello all,

I have finally joined up to this club! I traded in my GTX 770 for a R9 290!

Here's the link: http://www.techpowerup.com/gpuz/vqa4f/

XFX DD Black Edition; on the stock cooler (for now).


----------



## Arizonian

Quote:


> Originally Posted by *fishingfanatic*
> 
> Ok, is this good enough?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> [/URL].
> 
> FF


Sweet card man congrats









It's just not going to qualify here. I've already discussed this with NavDigitalStorm and they've got a spot for you at the *Official AMD R9 295X2 Owners Club*. Enjoy that beast.









Quote:


> Originally Posted by *lucifeil*
> 
> Hello all,
> 
> I have finally joined up to this club! I traded in my GTX 770 for a R9 290!
> 
> Here's the link: http://www.techpowerup.com/gpuz/vqa4f/
> 
> XFX DD Black Edition; on the stock cooler (for now).


Congrats - added









You also may want to check out *XFX Black/Double Dissipation Club!* with Sgt Bilko.


----------



## sun100

Quote:


> Originally Posted by *BradleyW*
> 
> Maybe you should have just switched the card instead of components which did not require changing? It's like having an engine issue, yet you swap everything other than the engine.


I replaced my reference Sapphire R9 290 two times cuz of coil whine which was so goddamn irritating; it didn't help. It gets worse in games like FIFA 14 where I get 300 FPS, it's insane xD I have to play basically every game with vsync on, and when you play on a 60Hz TV that's not really what you call a joy of framerate









We just gotta live with it I'd guess; coil whine on some nVidias is far, far worse. Interestingly though, my card never whined until I did a bit of wire rearrangement in my case to allow for better airflow with 2 new chassis fans, then it started whining like mad.
As it is with coils, anything can affect it really, not just the card itself. It can be pretty much any component in the chassis: bending cables, fan flow, power supply, anything really.


----------



## Spectre-

guys it seems the 14.7 beta is complete poop for benching
is 14.6 beta any good?


----------



## Klocek001

Why would you take a benchmark improvement over a game improvement?


----------



## kizwan

Quote:


> Originally Posted by *Spectre-*
> 
> guys it seems the 14.7 beta is complete poop for benching
> is 14.6 beta any good?


What bench?


----------



## BradleyW

Quote:


> Originally Posted by *rv8000*
> 
> The other parts were upgrades regardless of the card having whine or not, I wanted a better psu anyways as well. Just not looking forward to the fact that even if they do reproduce results in their rma dept. they'll send a refurbed card and most likely not check it for the same issue putting me right back at square one. I wanted to get a general opinion on how bad people thought it was, still on the edge between dealing with it and rma'ing the card.


Well, I tell you what, my 290X's were terrible for coil whine. After a month of frequent gaming, they lost most of their coil whine. Now the coil is masked under the slow-spinning quiet fans attached to my radiators. The whine has gone from high-pitched to a low sizzling noise. Most of the time it cannot be heard. I game at 100+ fps in most titles.
Quote:


> Originally Posted by *sun100*
> 
> I replaced my reference Sapphire R9 290 2 times cuz of coil whine which was so goddamn irritating, didn't help. It gets worst in games like FIFA 14 where i get 300FPS, its insane xD Have to play basically every game with vsync on, and when one plays on 60Hz TV that's not really what you call a joy of framerate
> 
> 
> 
> 
> 
> 
> 
> 
> 
> We just gotta live with it i'd guess, coil whine on some nVidias is far far worse. Interestingly tho, my card never whined untill i did a bit of wire rearrangement in my case to allow for better airflow with 2 new chassis fans, then it started whining like mad.
> As it is with coils, anything can affect it really, not just card itself. It can be pretty much any component in the chassis, bending cables, fan flow, power supply, anything really..


Bending a cable or messing with fan flow should have no effect on the coil whine.


----------



## flamin9_t00l

I think I have the worst OCer I've ever seen on a 290 (Powercolor PCS+ 290).

Stock clocks are:
1040MHz / 1350MHz / +50mv

Highest core OC at stock volts (and remember, stock here is already +50mV):
1060MHz... LMAO









Any more and I get artifacts. Has anyone seen worse? If so, feel free to let us know.


----------



## Klocek001

I like how 8 out of 10 posts here are a guy describing his RMA process step by step and then other guys commenting on that process.
As for artifacts past 1060 at +50mV stock, that is wickedly funny, indeed.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> It sort of sucks to have a decent core on a 290. Benching wise it will always lose to *some* 290x.


Fixed. 290X for benching or better yet 780 Ti. ha!

Quote:


> Originally Posted by *flamin9_t00l*
> 
> I think I have the worst OCer I've ever seen on a 290 (Powercolor PCS+ 290).
> 
> Stock clocks are:
> 1040MHz / 1350MHz / +50mv
> 
> Highest core OC at stock volts and remember this is with +50mv.
> 1060MHz... LMAO
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any more and I get artifacts. Has anyone seen worse? then feel free to let us know.


I think these non-references get gimped BIOSes to prevent owners from borking them and increasing the chance of RMA. Indeed, that is low.


----------



## flamin9_t00l

Quote:


> Originally Posted by *rdr09*
> 
> I think these non-references get gimped bioses to prevent owners from borking them increasing the chance of rma. indeed that is low.


Really? Am I likely to get a much better OC if I flash a different BIOS? This is a shame really, as I personally would never let a card get into borking territory by pushing it too far.

I was wondering is it possible to flash any 290 BIOS to any other 290 and it will work ok?

eg: Sapphire ref ---> Powercolor non-ref

or

HIS ref ---> Powercolor non-ref

I have both Sapphire reference and HIS reference cards. Not really interested in PT1 & PT3 with the current VRM temps on the Powercolor.

Is the Powercolor PCS+ a reference PCB? Actually, looking at it, it looks reference to me.

Nvm, looks like it's a reference PCB but with some different capacitors and some glue around the chokes.

Sorry for all the questions, last one I promise... Is the Powercolor likely to be flashable with a ref BIOS? Thanks.


----------



## rdr09

Quote:


> Originally Posted by *flamin9_t00l*
> 
> Really? Am I likely to get much better OC if I flash a different BIOS? This is a shame really as I personally would never let a card get into borking territory due to pushing it too far.


Reference cards have issues (they were the first to come out), but your best chance of oc'ing higher is with them, with proper cooling.


----------



## rdr09

Quote:


> Originally Posted by *flamin9_t00l*
> 
> Really? Am I likely to get much better OC if I flash a different BIOS? This is a shame really as I personally would never let a card get into borking territory due to pushing it too far.
> 
> I was wondering is it possible to flash any 290 BIOS to any other 290 and it will work ok?
> 
> eg: Sapphire ref ---> Powercolor non-ref
> 
> or
> 
> HIS ref ---> Powercolor non-ref
> 
> I have both Sapphire reference and HIS reference cards. Not really interested in PT1 & PT3 with the current VRM temps on the Powercolor.
> 
> Is the Powercolor PCS+ a reference PCB.. Actually looking at it, it looks reference to me.


Sorry, flaming . . . i am not into flashing, such a wuzz, but that's what the 2 bios switches are for. just make sure you don't bork both. save the original bios to a flash drive or something. try a reference bios for a 290 of any brand.


----------



## flamin9_t00l

Quote:


> Originally Posted by *rdr09*
> 
> Sorry, flaming . . . i am not into flashing. such a wuzz but that's what the 2 bios switches are for. just make sure you don't bork both. save the original bios to flash drive or something. try a reference bios for a 290 of any brand.


I'm not 100% sure things will go well... I can't find much info on the changes this PCB has, so I think I will leave it for now. Not too fussed about a couple of extra MHz anyway... I guess if it ain't broke, don't fix it.


----------



## ozzy1925

I know it's not the place, but the AMD mining section looks dead. Anyways, I have 3x R9 290 cards and one of them always seems to get sick and die after some time. I will run the Heaven benchmark to see whether that card has a problem or not. Can you tell me what the Heaven settings should be and what's a normal score? ty


----------



## ABADY

Quote:


> Originally Posted by *ABADY*
> 
> are these tests normal?
> 
> 3dmark
> 
> @stock month ago
> 
> http://www.3dmark.com/3dm/3577094?
> 
> and i tried to overclock my card ( its my first time ._. ) but i notice some low score in some tests so i ran couple of them @stock and oc'ed
> 
> @1090/1430MHz
> 
> http://www.3dmark.com/3dm/3745900
> 
> @stock
> 
> http://www.3dmark.com/3dm/3749267?
> 
> @1100/1430MHz
> 
> http://www.3dmark.com/3dm/3749563?
> 
> as far as i know my card is stable i played couple of games metro last light + arma 3 didnt notice any weird action.
> 
> why specially on the ic storm test the first test got the highest score ?
> 
> and can i oc the card more than 1100/1430?


----------



## fishingfanatic

It has 1 little glitch that's annoying as heck, other than that it's a beast alright.

I have a pretty big case with a 140mm gpu fan and I added extras. Even then, if the case is closed it will eventually only run as a single bcuz of the heat.

If I leave the case open there's enough dissipation that it runs fine.

If it didn't dump so much of the heat into the case it wouldn't be so bad imho.

here's a quick result @ 4.3ghz.

http://www.3dmark.com/fs/2508976

FF


----------



## DeadLink

I gotta ask: why does the MSI Gaming 290X not get good frames in BF4? With everything on Ultra it just runs like my GTX 680, but my 780 Ti Classified completely destroys it.


----------



## rdr09

Quote:


> Originally Posted by *DeadLink*
> 
> I gotta ask why does the MSI Gaming 290X not get good frames in BF4? Everything on Ultra it just runs like my GTX 680. But my 780 Ti Classified completely destroys it.


check the vrms' temps.


----------



## fishingfanatic

In my limited experience I have found that the MSi cards don't oc much bcuz they're usually pretty well oced from the factory.

That said, in my experience benching gpus, much as I hate ASUS now, they make the best cards, though Gigabyte's new WF series are impressive.

EVGA stands alone imho, great products, A++ support,... and the ACX cooling has come a long way too!

These observations are based on air cooling, not wcing. I'm just getting into wcing now, so we'll see if that makes any difference.

Just mho for what it's worth.

FF


----------



## spikezone2004

I am getting ahead of myself; I added the club to my sig already.









My card is sitting at my house in the delivery box still, get home next week and going to do some tests out of the box then put it under water the next day


----------



## FuriousPop

Finally i have my build up and running......

Could someone please help me with these temps and advise whether they are ok, good, bad, etc.?

All cards Sapphire R9 290 Tri-x OC on water

1st card - idle at 32 degree cel
VRM1 - 29 deg cel
VRM2 - 28 deg cel

2nd card - idle at 20 deg cel
VRM1 -
VRM2 -

3rd card - idle at 20 deg cel
VRM1 -
VRM2 -

ok that's weird - GPU-Z with the 2nd and 3rd cards - the temps were there a second ago but now they're gone! ***??? I'm running 0.7.8..... hmmmmmm


----------



## DeadLink

Quote:


> Originally Posted by *rdr09*
> 
> check the vrms' temps.


55 Max


----------



## Mega Man

Quote:


> Originally Posted by *FuriousPop*
> 
> Finally i have my build up and running......
> 
> if someone can help me pls with these temps to advise if they are ok, good, bad etc etc
> 
> All cards Sapphire R9 290 Tri-x OC on water
> 
> 1st card - idle at 32 degree cel
> VRM1 - 29 deg cel
> VRM2 - 28 deg cel
> 
> 2nd card - idle at 20 deg cel
> VRM1 -
> VRM2 -
> 
> 3rd card - idle at 20 deg cel
> VRM1 -
> VRM2 -
> 
> ok thats weird - GPUZ with 2nd and 3rd card - temps were there a second ago but now they gone! ***??? im running 0.7.8..... hmmmmmm


ulps, or rather ulps at work at the moment, but yea, turn it off
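For reference, the commonly described way to turn ULPS off (so secondary CrossFire cards stop disappearing from monitoring tools) is a registry edit. This is a hedged sketch: the four-digit subkey varies per system and there may be several, so check every numbered subkey under the display-adapter class key, and back up the registry first:

```
Windows Registry Editor Version 5.00

; Hedged sketch: repeat for each 0000/0001/... subkey that contains an
; EnableUlps value, then reboot for the change to take effect.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Tools like Afterburner and Trixx also expose a "disable ULPS" toggle that does the equivalent edit for you.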


----------



## rdr09

Quote:


> Originally Posted by *DeadLink*
> 
> 55 Max


must be water. i use HWinfo64 to monitor temps in games like this . . .



here are my fps for the 290 (130%) vs 7950 all stock with the i7 @ 4.5 using 1080 . . .


----------



## DeadLink

No water, Stock. I am using Hw64 and get the same results.


----------



## rdr09

Quote:


> Originally Posted by *DeadLink*
> 
> No water, Stock. I am using Hw64 and get the same results.


you might be mistaken 'cause the gaming's vrms get pretty hot unless your ambient is really low. check them again. play BF4 for about 15mins and check like i did there. mine is watercooled and my ambient is 23C.

what driver are you using? also, do you mind running 3DMark11?

edit: the gaming stock cooler is the same as the one used with the Ti. the Ti's vrms reached 100C measured at the back of the pcb.


----------



## devilhead

Quote:


> Originally Posted by *DeadLink*
> 
> I gotta ask why does the MSI Gaming 290X not get good frames in BF4? Everything on Ultra it just runs like my GTX 680. But my 780 Ti Classified completely destroys it.


use Mantle with 290X


----------



## ozzy1925

I get a better score when my card's clocks are stock (1000/1300) than at 1150/1300 with +100mV and 50% power limit on my 290. What's wrong?
http://www.3dmark.com/3dm11/8599335
http://www.3dmark.com/3dm11/8599308

Temps are around 70-75C.

edit: 1 more test, this time 1050/1300 with +0mV, and this time it's better:
http://www.3dmark.com/compare/3dm11/8599363/3dm11/8599308
I think my card throttles when benching at 1150 on the GPU.


----------



## kizwan

Quote:


> Originally Posted by *ozzy1925*
> 
> i get better score when my cards clocks are stock (1000-1300)then 1150-1300 +100mv and %50 power limit with my 290.Whats wrong?
> http://www.3dmark.com/3dm11/8599335
> http://www.3dmark.com/3dm11/8599308
> 
> temps are around 70-75c
> 
> edit: 1 more test this time 1050-1300 +0mv this time better:
> http://www.3dmark.com/compare/3dm11/8599363/3dm11/8599308
> i think my card throttles when benching at 1150gpu


The 3dmark 11 graphics tests ran in centred or stretched?

Run GPU-Z in the background & enable log to file. Then you'll know whether the card is throttling or not.
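This log-file check can be automated. A minimal sketch, assuming a GPU-Z style CSV log with a "GPU Clock [MHz]" column; the exact header name varies between GPU-Z versions, so adjust it to match your log:

```python
import csv
import io

# Minimal sketch of scanning a GPU-Z style sensor log for throttling.
# Assumes a CSV log with a "GPU Clock [MHz]" column; adjust the column
# name to whatever your GPU-Z version actually writes.

def count_throttle_samples(log_text, target_mhz, column="GPU Clock [MHz]"):
    """Count samples where the core clock fell below the target clock."""
    reader = csv.DictReader(io.StringIO(log_text), skipinitialspace=True)
    clocks = [float(row[column]) for row in reader if row.get(column)]
    below = [c for c in clocks if c < target_mhz]
    return len(below), len(clocks)

# Hypothetical three-sample log for illustration:
log = """Date, GPU Clock [MHz], Temperature [C]
2014-08-01 12:00:00, 1150.0, 74
2014-08-01 12:00:01, 1149.2, 77
2014-08-01 12:00:02, 1023.0, 78
"""
below, total = count_throttle_samples(log, 1150)
print(f"{below}/{total} samples below target")   # prints "2/3 samples below target"
```

Any nonzero count against your target clock means the card dropped below it at some point during the run.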


----------



## AlR25

Quote:


> Originally Posted by *Mulm*
> 
> Hi there, I have the famous atikmdag.sys BSOD (Stopcode 0x0000007E) with a Powercolor R9 290 and have the feeling the cause of it is the Idle/2D/Whatever Mode aka the GPU running at 300 Mhz Core and 150 Mhz Memory Clock. So I want to test the Card running always at standard clock.
> 
> So how do I achieve this? I tried to disable ULPS, but I'm not sure if this is the right thing and it does not work either. OC Software like AB or Trixx just change the clock speeds for the 3D mode.




+ Change the frequency (a bump of even 1 MHz is enough)


----------



## ozzy1925

Quote:


> Originally Posted by *kizwan*
> 
> The 3dmark 11 graphics tests ran in centred or stretched?
> 
> Run GPU-Z in the background & enable log to file. Then you'll know whether the card is throttling or not.


I ran them stretched; will try with GPU-Z and report.
edit: yes, it throttles at 1150. What should I do, increase the voltage?


----------



## rdr09

Quote:


> Originally Posted by *ozzy1925*
> 
> i run them stretched will try with gpuz and report
> edit:yes it throttles at 1150 .What should i do increase voltage ?


your igpu is enabled or lucid. it can screw up results at times. here is 1150 using 13 driver which is about 300 short in graphics score . . .

http://www.3dmark.com/3dm11/8215207


----------



## ozzy1925

Quote:


> Originally Posted by *rdr09*
> 
> your igpu is enabled or lucid. it can screw up results at times. here is 1150 using 13 driver which is about 300 short in graphics score . . .
> 
> http://www.3dmark.com/3dm11/8215207


I selected primary display PCIe in the BIOS. As I found out, it even throttles at 1125 with +100mV. I have the Sapphire Tri-X OC; btw, driver 14.4.


----------



## rdr09

Quote:


> Originally Posted by *ozzy1925*
> 
> i select primary display pcie from the bios as i found out, it even throttles at 1125 with+100mv i have sapphire trix oc. btw driver 14.4


are you leaving the fan in auto? if so, don't. always max your power limit at any oc. 14.4 will give similar results to the 13 drivers; 14.6 and up will give higher points.

if you see secondary cards in futuremark detailed info, then you still have the igpu enabled somehow.

play with the offset. you might need lower than 100mv for such and such oc.

also, you only have a stick of ram?


----------



## ozzy1925

Quote:


> Originally Posted by *rdr09*
> 
> are you leaving the fan in auto? if so, don't. always max your power limit at any oc. 14.4 will give similar results as 13 drivers. 14.6 ^ will give higher points.
> 
> if you see secondary cards in futuremark detailed info, then you still have the igpu enabled somehow.
> 
> play with the offset. you might need lower than 100mv for such and such oc.
> 
> also, you only have a stick of ram?


I only have 1x 4GB RAM. I switched to Trixx and set +125 VDDC, 1125 clock; now it seems stable:
http://www.3dmark.com/3dm11/8599518
Max temp was 77C, fans were at 67%. Does that mean I have a ****ty card?


----------



## rdr09

Quote:


> Originally Posted by *ozzy1925*
> 
> i only have 1x 4gb ram, i switch to TRIXX and set to +125 vddc ,1125 clock .now it seems stable
> http://www.3dmark.com/3dm11/8599518
> Max temp was 77c fans were %67 .*That means i have ****ty card?*


Hell no. check your vrm temps. you can either use GPUZ or Hwinfo64. Trixx only shows the core temp. Keep your temps below 80C for all. you may have to crank the fans up. The 4GB single stick does not help. Power Limit settings?

i was going to say you'd be better off oc'ing your cpu but your board is not up for it. compare physics scores. at least turn off all power saving features in bios and windows. i think that missing ram stick is pulling your physics down.


----------



## ozzy1925

Quote:


> Originally Posted by *rdr09*
> 
> Hell no. check your vrm temps. you can either use GPUZ or Hwinfo64. Trixx only shows the core temp. Keep your temps below 80C for all. you may have to crank the fans up. The 4GB single stick does not help. Power Limit settings?
> 
> i was going to say you be better off oc'ing your cpu but your board is not up for it. compare physics scores. at least turn off all power saving features in bios and windows. i think that missing ram stick is pulling your physics down.


Yes, my board sux, and as I checked, my max VRM temp was 85C. I will try +150mV for 1150 core. This is my mining rig; mining is not profitable anymore, so I wanted to try my cards. I will switch to a better board and CPU as soon as X99 hits the market. I will bench when I get back home. Thanks for the help, +rep


----------



## rdr09

Quote:


> Originally Posted by *ozzy1925*
> 
> yes my board sux and as i checked my max vrm temp was 85c.I will try to +150mv for 1150 core and this is my mining rig
> 
> 
> 
> 
> 
> 
> 
> . Mining is not profitable anymore so i wanted try my cards. I will switch to a better board and cpu as soon as x99 hits the market.I will bench when i come back to home thanks for the help +rep


switch cpu?! oh noes! that thing is faster than my sandy at 4.5GHz. check its temp - it might be throttling. just need a board.


----------



## DeadLink

Quote:


> Originally Posted by *rdr09*
> 
> You might be mistaken, 'cause the Gaming's VRMs get pretty hot unless your ambient is really low. Check them again - play BF4 for about 15 mins and check like I did there. Mine is watercooled and my ambient is 23C.
> 
> What driver are you using? Also, do you mind running 3DMark11?
> 
> edit: the Gaming's stock cooler is the same as the one used with the Ti. The Ti's VRMs reached 100C measured at the back of the PCB.


Played for an hour, the most I was getting with a 74 F room temp was 56.


----------



## ozzy1925

Quote:


> Originally Posted by *rdr09*
> 
> Switch CPU?! Oh noes! That thing is faster than my Sandy at 4.5GHz. Check its temp - it might be throttling. You just need a board.


I tried 1150 with +200mV and 50% power limit; it still seems to throttle down to 1149.2. I checked, and the VRM temp was around 90-95C. Is my card throttling because of air cooling, or is it at its limit?
edit: I think I found the problem - AB makes my card throttle. I did a clean install and switched to Trixx. Looks like I can even get stable clocks @1170.
driver 13.12, graphics score: 16774


----------



## Germanian

Quote:


> Originally Posted by *ozzy1925*
> 
> I tried 1150 with +200mV and 50% power limit; it still seems to throttle down to 1149.2. I checked, and the VRM temp was around 90-95C. Is my card throttling because of air cooling, or is it at its limit?
> edit: I think I found the problem - AB makes my card throttle. I did a clean install and switched to Trixx. Looks like I can even get stable clocks @1170.
> driver 13.12, graphics score: 16774


In AB settings, switch Unofficial overclocking mode to WITHOUT POWERPLAY SUPPORT and try again.

That setting makes the card always stay at full speed, even at idle, but it should prevent the downclocking.
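For anyone who prefers flipping that switch in the config file rather than the GUI: in AB 3.x the same option lives in MSIAfterburner.cfg. The key names and mode values below are assumptions from memory of that version, not verified against the official documentation - compare against your own file before editing, and keep a backup:

```ini
; MSIAfterburner.cfg fragment (assumed key names/values - verify in your own file).
; Unofficial overclocking is reportedly gated behind an EULA line, and the mode
; value selects "without PowerPlay support" so clocks stay pinned at full speed.
[Settings]
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 2
```

Close Afterburner before editing the file; it rewrites the config on exit and will discard changes made while it is running.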


----------



## FuriousPop

Thanks MM.

ok so this morning i got the following:

Ambient is 18 deg cel

GPU1 - 28
GPU2 - 25
GPU3 - 29

interesting - 3rd card hottest at idle - hmmm. 1st card seems to just touch 29 and sits on 28.

GPU1 - VRM1 - 25, VRM2 25
GPU2 - VRM1 - 21, VRM2 20
GPU3 - VRM1 - 21, VRM2 20

got a chance to run Heaven last night... this looks, ahh, OK-ish?

Firsttestingrun.jpg 111k .jpg file


everything running stock atm - want to make sure I've put this thing together properly since it's my first water build.


----------



## ozzy1925

Quote:


> Originally Posted by *Germanian*
> 
> In AB settings, switch Unofficial overclocking mode to WITHOUT POWERPLAY SUPPORT and try again.
> 
> That setting makes the card always stay at full speed, even at idle, but it should prevent the downclocking.


I did try that, but the engine clock still throttles down when benchmarking.


----------



## gobblebox

I guess since nobody responded the first time, I'll ask one more time, does anyone know how to get the "/wi4,30,8d,15" (to add 150mv) modifier to actually apply? With my Sapphire Tri-X OC r9 290, I'm running AB 3.0.1 and AMD 14.7 Beta Drivers, but the voltage increase doesn't show up in AB.

I've tried both methods: creating a batch file & just adding it to the end of the target location of a shortcut to MSI AB. I've also tried the suggested "/wi6" modifier...

Has anyone had success with this on the versions I am running? Do I need to downgrade drivers to 14.4 or another? Also, when running either the batch file or the shortcut, is it supposed to open MSI AB? Because mine does not...


----------



## heroxoot

Quote:


> Originally Posted by *ozzy1925*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Germanian*
> 
> In AB settings, switch Unofficial overclocking mode to WITHOUT POWERPLAY SUPPORT and try again.
> 
> That setting makes the card always stay at full speed, even at idle, but it should prevent the downclocking.
> 
> 
> 
> I did try that, but the engine clock still throttles down when benchmarking.
Click to expand...

Increase the GPU clock by 1MHz. Just be aware this will make the GPU never clock down. Best to set a 3D profile with your OC and a 2D profile with the minimum underclock.


----------



## falcon26

OK, just food for thought. After selling my 290X because of heat and noise, I got my MSI 780 Ti in. Idle temp is about 41 and load is about 68; my fan speed never goes over 54% and it's dead quiet. It's also nice to have adaptive Vsync back, and when watching video my temps do not go up at all, unlike the 290X, where they shot up to like 70. The guy I sold my 290X to loves it, so everything worked out great...


----------



## the9quad

Quote:


> Originally Posted by *falcon26*
> 
> OK just food for thought. After selling my 290X because of heat and noise. I got my 780 TI MSI in. Idle temp is about 41 and load is about 68 my fan speed never goes over 54% and its dead quiet. Its also nice to have adaptive Vsync back as well and when watching video my temps do not go up at all unlike the 290X where it shot up to like 70. The guy I sold my 290X too loves it so everything worked out great...


Glad you're happy - it sucks spending $$ and not being happy.


----------



## rdr09

Quote:


> Originally Posted by *DeadLink*
> 
> Played for an hour, the most I was getting with a 74 F room temp was 56.


Are you serious? That 74 is pretty much my room temp. My bad - you mean 56C. What driver?

run this pls . . .

http://www.techpowerup.com/downloads/2336/futuremark-3dmark-11-v1-0-132/

Stretched mode.

Quote:


> Originally Posted by *ozzy1925*
> 
> i tried 1150 +200mv and %50 power limit it still seems to throttle down to 1149.2 .I checked vram temp was around 90-95c.My card throttle because of air cooling or its on its limit?
> edit:i think i found the problem AB makes my card throttle i do a clean install and trixx .Looks like I can even get stable clocks @1170
> driver 13.12 graphics score :16774


I would not be too concerned about that 1MHz difference.

Quote:


> Originally Posted by *falcon26*
> 
> OK just food for thought. After selling my 290X because of heat and noise. I got my 780 TI MSI in. Idle temp is about 41 and load is about 68 my fan speed never goes over 54% and its dead quiet. Its also nice to have adaptive Vsync back as well and when watching video my temps do not go up at all unlike the 290X where it shot up to like 70. The guy I sold my 290X too loves it so everything worked out great...


Watch your VRM temps - see post #5744:

http://www.overclock.net/t/1438886/official-nvidia-gtx-780-ti-owners-club/5740

That's a powerful card.


----------



## ByteHacker

Hi there! Could someone here with a Sapphire Vapor-X R9 290X dump me their BIOS file? Thanks!

I am currently trying to unlock my card, but it seems like all the other BIOSes don't work because this card has a custom VRM circuit.

Help would be appreciated. Thanks again!


----------



## Mulm

Quote:


> Originally Posted by *AlR25*
> 
> 
> 
> + Change the frequency (enough for 1 Mhz)


Thank you. I already had AB installed, but I hadn't noticed this option.


----------



## piquadrat

Quote:


> Originally Posted by *gobblebox*
> 
> I guess since nobody responded the first time, I'll ask one more time, does anyone know how to get the "/wi4,30,8d,15" (to add 150mv) modifier to actually apply? With my Sapphire Tri-X OC r9 290, I'm running AB 3.0.1 and AMD 14.7 Beta Drivers, but the voltage increase doesn't show up in AB.
> 
> I've tried both methods: creating a batch file & just adding it to the end of the target location of a shortcut to MSI AB. I've also tried the suggested "/wi6" modifier...
> 
> Has anyone had success with this on the versions I am running? Do I need to downgrade drivers to 14.4 or another? Also, when running either the batch file or the shortcut, is it supposed to open MSI AB? Because mine does not...


That is not the way it's supposed to work (for me, at least). Do not expect some kind of offset applied to AB's voltage settings. For instance, sending /wi6,30,8d,18 sets the voltage to +150mV, but when you touch the voltage slider in AB, this will be reset to wherever the slider points. I think the /wi6 command uses the AB engine to send the setting directly to the voltage regulator, without messing with AB's behavior in any way. Set clocks in AB and after that apply voltage using /wi6. BTW, /wi4 never worked for me.
Proceed with caution.
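For reference, the last argument in that /wi6 command is the raw byte written to the voltage controller's offset register - 0x8D here, which on the reference 290/290X's IR3567B is commonly reported to step in 6.25 mV increments. Treat both the register and the step size as assumptions and double-check against your card before writing anything. A quick sketch of the conversion:

```python
# Hypothetical helper: convert a desired core-voltage offset in mV into the
# hex byte used in Afterburner's /wi6,30,8d,<value> command.
# Assumes the offset register (0x8D) steps in 6.25 mV increments, as commonly
# reported for the IR3567B controller on reference 290/290X boards.

STEP_MV = 6.25  # one register step, in millivolts (assumed)

def offset_byte(offset_mv):
    """Return the /wi6 value argument (hex string) for a +offset_mv bump."""
    steps = round(offset_mv / STEP_MV)
    if not 0 <= steps <= 0xFF:
        raise ValueError("offset out of range for a single register byte")
    return format(steps, "x")

# +150 mV -> 24 steps -> "18", matching the /wi6,30,8d,18 example above
print(offset_byte(150))  # -> 18
print(offset_byte(100))  # -> 10
```

This also explains why the earlier /wi4,30,8d,15 attempt reads oddly: under the same assumed step size, 0x15 (21 steps) would land on about +131 mV rather than a round +150.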


----------



## Gobigorgohome

A little "upgrade" to my rig: 4x Sapphire Radeon R9 290X with 4x EK-FC R9 290X Acetal+Nickel







I am registered as "stock" cooling I think.


----------



## leetmode

Quote:


> Originally Posted by *Gobigorgohome*
> 
> 
> 
> A little "upgrade" to my rig: 4x Sapphire Radeon R9 290X with 4x EK-FC R9 290X Acetal+Nickel
> 
> 
> 
> 
> 
> 
> 
> I am registered as "stock" cooling I think.


I hate you. Very clean pipe work!


----------



## kizwan

Quote:


> Originally Posted by *FuriousPop*
> 
> Thanks MM.
> 
> ok so this morning i got the following:
> 
> Ambient is 18 deg cel
> 
> GPU1 - 28
> GPU2 - 25
> GPU3 - 29
> 
> interesting - 3rd card hottest at idle - hmmm. 1st card seems to just touch 29 and sits on 28.
> 
> GPU1 - VRM1 - 25, VRM2 25
> GPU2 - VRM1 - 21, VRM2 20
> GPU3 - VRM1 - 21, VRM2 20
> 
> got a chance to run heaven last night.... this looks aahhhh ok ish??????
> 
> Firsttestingrun.jpg 111k .jpg file
> 
> 
> everything running stock atm - want to make sure i've put this thing together properly since its my first water build..


Better to use the image button to add pictures.


----------



## FuriousPop

Quote:


> Originally Posted by *kizwan*
> 
> Better to use the image button to add pictures.


ahh yes, thanks


----------



## kizwan

Quote:


> Originally Posted by *FuriousPop*
> 
> ahh yes, thanks
> 
> 
> Spoiler: Warning: Spoiler!


That's better. Better to use Unigine Heaven Benchmark 4.0 & Unigine Valley Benchmark 1.0.


----------



## piquadrat

Is there a way in AMD's drivers to adjust the gamma value, not only for the desktop but for all applications, games included?


----------



## bluedevil

Well, did the Red Mod over on my new RMA'd 290.









Wow....

Alright, new stock cooler dual fan design. I will bite.











Side


Back - New Backplate


Dual-fan HSF off - wow, that's a ton of thermal paste. No wonder my temps sucked. Notice how crappily they (VisionTek) applied the ramsinks?


Seidon 120XL block installed


Final


Now off to OCing!


----------



## HeliXpc

Quote:


> Originally Posted by *flamin9_t00l*
> 
> I had RSOD during gaming with my new Z97 build (Fortress rig) and the problem was CPU vcore. Check to see if you have memory dump from the crash and BSOD code (mine was 124). Not sure if it's the same problem but bumping CPU vcore on mine sorted it.


Thank you, this helped


----------



## Klocek001

Just did a GPU stress test in MSI Kombustor, and the temps scared the s*** out of me - I started getting artifacts at 82 degrees on my 290 (1150/1500), which I assume is related to heat building up in the case. I thought the OC values I've been using for a few days were safe, but I never did any stress testing, only games. Would you suggest downclocking, going back to stock, or keeping the OC as it is? (Like I said, the break point is 82 degrees in Kombustor; the games I play never pushed it over 70 once.)


----------



## spikezone2004

Quote:


> Originally Posted by *Klocek001*
> 
> Just did a GPU stress test in MSI Kombustor, and the temps scared the s*** out of me - I started getting artifacts at 82 degrees on my 290 (1150/1500), which I assume is related to heat building up in the case. I thought the OC values I've been using for a few days were safe, but I never did any stress testing, only games. Would you suggest downclocking, going back to stock, or keeping the OC as it is? (Like I said, the break point is 82 degrees in Kombustor; the games I play never pushed it over 70 once.)


What are your VRM temps looking like when you game? You want to keep a close eye on the VRM temps as well as the core temps.


----------



## HOMECINEMA-PC

Clockin the 290 now on BE [email protected]









http://www.3dmark.com/3dm11/8601470


----------



## Klocek001

Quote:


> Originally Posted by *spikezone2004*
> 
> What are your VRM temps looking like when you game? You want to keep a close eye on the VRM temps as well as the core temps.


vrm1 max 68, vrm2 max 47


----------



## CroAryan

Knock knock! Can I come in?









It's an XFX R9 290 Double Dissipation, air cooler; link to GPU-Z validation: http://www.techpowerup.com/gpuz/details.php?id=8z4e7


----------



## lucifeil

Gotta love that XFX DD cooler!


----------



## spikezone2004

Quote:


> Originally Posted by *Klocek001*
> 
> vrm1 max 68, vrm2 max 47


Your VRM temps aren't bad for while you're gaming. I believe you don't want to be over 90C on the core, and I think 100C for VRM1 (someone will correct me if I'm wrong); your VRM2 shouldn't be a problem.


----------



## Klocek001

Is not being able to pass Kombustor over 82 degrees on the core that much of a worry?


----------



## ZealotKi11er

Quote:


> Originally Posted by *bluedevil*
> 
> Well, did the Red Mod over on my new RMA'd 290.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wow....
> 
> Alright, new stock cooler dual fan design. I will bite.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Side
> 
> 
> Back - New Backplate
> 
> 
> Dual Fan HSF off - Wow that's a ton of thermal paste. No wonder why my temps sucked. Notice how crappy they (VisionTek) applied the ramsinks?
> 
> 
> Seidon 120XL block installed
> 
> 
> Final
> 
> 
> Now off to OCIng!


Did you test the stock cooler at all? Also, does VRM1 get cooling from the backplate?


----------



## bluedevil

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Did you test the stock cooler at all? Also, does VRM1 get cooling from the backplate?


Yeah, I let it go for about 2 days. It's pretty bad, but it comes with passive cooling for the VRMs and memory. Topped out at 94C at a 1GHz clock; now, at 1.175GHz, it tops out at 82C. Much better.

And yes, the VRMs do get cooling from the backplate. Never seen temps over 60C on either VRM.

Also, I screwed the M3 20mm screws in from the back of the PCB with nylon washers. This way it screwed right into the Seidon like it was made for it. Perfection. I will never suggest another cooling solution like a G10 or zip ties.


----------



## hwoverclkd

Quote:


> Originally Posted by *bluedevil*
> 
> Yeah l let it go for about 2 days. Its pretty bad, but comes with passive cooling for the VRMs and Memory. Topped out at 94C at 1Ghz clock. Now currently at 1.175 Ghz topped out at 82C. Much better.
> 
> And yes the VRM does get cooling from the backplate. Never seen temps over 60C on either VRM.
> 
> Also I screwed the M3 20mm screws in from the back of the PCB with nylon washers. This way it screwed right into the Seidon like it was made for it. Perfection. I will never suggest another cooling solution like a G10 or zip tie.


Nice work! How's the temp so far?


----------



## bluedevil

Quote:


> Originally Posted by *acupalypse*
> 
> Nice work! how's the temp so far?


About 82C in FurMark, 10 min @ 1.175GHz.


----------



## Arizonian

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Clockin the 290 now on BE [email protected]
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8601470











Quote:


> Originally Posted by *Gobigorgohome*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> A little "upgrade" to my rig: 4x Sapphire Radeon R9 290X with 4x EK-FC R9 290X Acetal+Nickel
> 
> 
> 
> 
> 
> 
> 
> I am registered as "stock" cooling I think.


Nice indeed.








Quote:


> Originally Posted by *CroAryan*
> 
> Knock knock! Can I come in?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> It's an XFX R9 290 Double Dissipation, air cooler; link to GPU-Z validation: http://www.techpowerup.com/gpuz/details.php?id=8z4e7


Congrats - added









And welcome to OCN with your first post. You started in the right place.


----------



## ZealotKi11er

Quote:


> Originally Posted by *bluedevil*
> 
> About 82C in FurMark, 10 min. @ 1.175ghz


What's with the AIO?


----------



## flamin9_t00l

Quote:


> Originally Posted by *HeliXpc*
> 
> Thank you, this helped


No probs, glad it helped.


----------



## Gobigorgohome

Just did a little BF4 gaming at 4K with ultra settings (without AA); it seems to be good - not a great experience, but it works. This is everything at stock, though; I need to overclock my CPU a little, as I think it is bottlenecking my quad R9 290X setup a bit.


----------



## gobblebox

I'm using Trixx, since I can't get AB to go past +100mV. Does anyone know why Disable ULPS isn't working for me, at least on the desktop? I have "Disable ULPS" selected, CCC Overdrive is not enabled, and CCC isn't even running. Does anyone know how to force the OC on the desktop? Right now, my clocks are sitting at 300/150.


----------



## bluedevil

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Whats with AIO?


Da


----------



## Nicky73

Nicky73

2014-08-10_19-23-49_1.jpg 89k .jpg file

Sapphire AMD R9 290X Tri-X | 1040MHz Core Clock | 4GB 5200MHz GDDR5 | 512-bit | PCI-e 3.0 16x | DX11.2 | DVI/HDMI/DP
I bought 2 R9 290Xs when they first came out; kinda wish I had waited till the 295X2 came out, hehe. I have the 3 fans on each GPU and 5 case fans, plus a Corsair H100i Extreme Performance Cooler | Dual Corsair AF 120mm Fans | Dual 120mm Rad.
I'm currently looking for a CPU, a motherboard, and liquid cooling for both GPUs, but I'm just a beginner and not sure what to get. I have a Cooler Master Storm Stryker, and all their full tower cases are HUGE, so I can put a nice-sized cooling system in.
This is a pic of my PC with the panel off. Remember, I am a beginner builder and I build from what I'm told, but I do like a neat PC.

pcpic.jpg 1381k .jpg file


----------



## kizwan

Quote:


> Originally Posted by *gobblebox*
> 
> I'm using Trixx, since I can't get AB to go past +100mV. Does anyone know why Disable ULPS isn't working for me, at least on the desktop? I have "Disable ULPS" selected, CCC Overdrive is not enabled, and CCC isn't even running. Does anyone know how to force the OC on the desktop? Right now, my clocks are sitting at 300/150.


ULPS only applies in Crossfire. I think you meant PowerPlay - try disabling PowerPlay. You will need to use AB.
Quote:


> Originally Posted by *Nicky73*
> 
> Nicky73
> 
> 2014-08-10_19-23-49_1.jpg 89k .jpg file
> 
> Sapphire AMD R9 290X Tri-X | 1040MHz Core Clock | 4GB 5200MHz GDDR5 | 512 Bit | PCI-e 3.0 16x | DX11.2 | DVI/HDMI/DP
> I bought 2 R9 290X`s when they first come out , kinda wish I waited till 295X2 came out hehe I have just the 3 fans on each my gpu and 5 case fans + Corsair H100i Extreme Performance Cooler | Dual Corsair AF 120mm Fans | Dual 120mm RAD
> I`m currently looking for a CPU , Motherboard and both gpu Liquid cooling rig but I`m just a beginner and I not sure what to get , I have a cooler master storm stryker and all their full tower cases are HUGE so I can put a nice size cooling system in.
> this a pic of my pc with panel off and remember I am a beginner builder and I build from what im told but I do like a neat pc
> 
> pcpic.jpg 1381k .jpg file


Instead of attachments, use the image button to add pictures.


----------



## BradleyW

I've noticed a lot of people are disabling PowerPlay. What are the benefits of doing so?
Thank you.


----------



## heroxoot

Quote:


> Originally Posted by *BradleyW*
> 
> I've noticed a lot of people are removing powerplay. What are the benefits of doing so?
> Thank you.


Disabling PowerPlay is like disabling Cool'n'Quiet and C1E on a CPU: PowerPlay clocks the GPU down and lowers the voltage at idle. You can accomplish the same thing in MSI AB by setting 2D and 3D profiles with the voltage how you want. This way your OC has a better handle on its own voltage.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *gobblebox*
> 
> I'm using Trixx, since I can't get AB to go past +100mV. Does anyone know why Disable ULPS isn't working for me, at least on the desktop? I have "Disable ULPS" selected, CCC Overdrive is not enabled, and CCC isn't even running. Does anyone know how to force the OC on the desktop? Right now, my clocks are sitting at 300/150.


If you want, flash your card to the PT1T Asus 290X BIOS and then download Asus GPU Tweak; your voltage will be unlocked and your clock speed will run at whatever you set it to, but the min clock speed is [email protected]@1.25v.... if that helps ya


----------



## deezdrama

I got a reference 290.

Will this fan profile be OK? Anything else I need to set for my reference 290 to stay cool?
Noise doesn't bother me until it's at 75%.









Sent from my SCH-I605 using Tapatalk


----------



## ZealotKi11er

I am having a small problem. Sometimes when I play a game, the second card does not want to start; I have to Alt-Tab to get it working. I am using the 14.7 RC drivers.


----------



## kizwan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am having a small problem. Sometimes when I play a game, the second card does not want to start; I have to Alt-Tab to get it working. I am using the 14.7 RC drivers.


I have experienced the same thing, but with Catzilla - I need to restart Catzilla a couple of times before Crossfire properly engages. I'm on the 14.6 beta, though. I have not played games for a couple of months now, so I don't know whether this happens in games too.


----------



## ZealotKi11er

Quote:


> Originally Posted by *kizwan*
> 
> I have experienced the same thing but with Catzilla. I need to restart Catzilla a couple of times before Crossfire is properly engaged. I'm with 14.6 beta though. I have not play games for a couple of months now, I don't know whether this happen to games too.


Valley is another one that has this problem. I don't play many games, so I don't know if anything else is affected.


----------



## FuriousPop

Can someone give me advice? Good, bad, ugly?

Hopefully I am on the right track.


----------



## gobblebox

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> If you want to flash your card to PT1T asus 290x bios and then D/load Asus GPU tweek and your voltage will be unlocked and your clock speed will run at whatever you set it at but min Clock speed is [email protected]@1.25v .... if that helps ya


The problem isn't that the voltage is locked... the problem is that MSI AB doesn't let me go past +100mV, even with the "/w..." modifiers. So I can't push my OC any farther than 1135/1475, simply because MSI AB caps the added voltage at 100mV. I have no trouble with this in Trixx, but that's only because Trixx, by default, allows you to add up to 200mV to the core voltage.


----------



## Klocek001

What software do you use for benching ?
Quote:


> Originally Posted by *gobblebox*
> 
> The problem isn't that voltage is locked... the problem is that MSI AB doesn't let me go past 100mv, even with the "/w ...." modifiers. So I can't push my OC any farther than 1135/1475, simply because MSI AB caps the added voltage at 100mv. I have no trouble with this w/ Trixx, but that's only because Trixx, by default, allows you to add up to 200mv to the core voltage.


I see we've got the same card and the same problem. What I've been able to achieve is 1100/1500; 1111 on the GPU gives me OCCT errors in just 60-90 seconds. Do you use any error-checking benchmarks?

What's your max vcore at +100mV? Mine usually hits 1.25-1.26V.

The Tri-X software has more OC room, but it generally sucks for everyday use.


----------



## Spectre-

Quote:


> Originally Posted by *Klocek001*
> 
> What software do you use for benching ?
> I see we've got the same card and the same problem, what I've been able to achieve is 1100/1500, 1111 on GPU gives me OCCT errors in just 60-90 seconds. Do you use any software error-check benchmarks?
> 
> What's your max vcore at +100mV ? Mine usually hits 1,25-1,26v
> 
> Tri-X software has more OC room, but it generally sucks at everyday use


My max vcore is 1.43V while benching the R9 290 @ 1280/1600.


----------



## Klocek001

I suppose it's possible with water cooling.


----------



## ozzy1925

Quote:


> Originally Posted by *gobblebox*
> 
> The problem isn't that voltage is locked... the problem is that MSI AB doesn't let me go past 100mv, even with the "/w ...." modifiers. So I can't push my OC any farther than 1135/1475, simply because MSI AB caps the added voltage at 100mv. I have no trouble with this w/ Trixx, but that's only because Trixx, by default, allows you to add up to 200mv to the core voltage.


I have the same issue: Trixx gives more voltage and stable core speeds when benchmarking, but it won't allow me to set PowerPlay. AB allows PowerPlay, but my card throttles when benchmarking.


----------



## piquadrat

Quote:


> Originally Posted by *gobblebox*
> 
> The problem isn't that voltage is locked... the problem is that MSI AB doesn't let me go past 100mv, even with the "/w ...." modifiers. So I can't push my OC any farther than 1135/1475, simply because MSI AB caps the added voltage at 100mv. I have no trouble with this w/ Trixx, but that's only because Trixx, by default, allows you to add up to 200mv to the core voltage.


Didn't my advice work for you?
I have no problem going past +100mV in AB with a 290 and 14.7 drivers.
Quote:


> Originally Posted by *piquadrat*
> 
> That is not the way it's supposed to work (for me, at least). Do not expect some kind of offset applied to AB's voltage settings. For instance, sending /wi6,30,8d,18 sets the voltage to +150mV, but when you touch the voltage slider in AB, this will be reset to wherever the slider points. I think the /wi6 command uses the AB engine to send the setting directly to the voltage regulator, without messing with AB's behavior in any way. Set clocks in AB and after that apply voltage using /wi6. BTW, /wi4 never worked for me.
> Proceed with caution.


Just use a bat file to load a clocks profile and after that set the voltage offset. For instance, to load profile 2 and get +150mV:

Code:


CD <path_to_afterburner_install_folder>
MSIAfterburner.exe -profile2
MSIAfterburner.exe /wi6,30,8d,18


----------



## ABADY

http://www.techpowerup.com/gpuz/details.php?id=ar558


----------



## paradoxum

Which drivers should I be using with my Sapphire R9 290 Tri-X OC - the 14.4 from 4/25/2014 or the 14.7 Beta from 7/10/2014?
I read something about 14.7 not coming out for Win 8.1; is that true? What are 8.1 users to do?

I'm just checking out Trixx, and the "VDDC Offset" goes up to 200 - is that a 200mV core voltage overclock? In MSI Afterburner it only lets me go up to 100mV, so I'm not sure if it's the same thing or different. If it is, what's the safest I can set it to without killing the card? I've been benching with MSI AB and 3DMark/Valley at +100mV and +50% power limit and it's been fine, but I don't wanna go crazy, try 200, and kill my new card - anyone?

Oh yeah, the card also has a little switch on the side of it (BIOS, I presume?). My previous HD6970 had the same one, and flicking it unlocked more shaders. Will doing the same on this card yield the same results? What's the difference? It's hard to find info on what that switch actually does on this card.

Thanks and rep in advance.


----------



## Klocek001

Don't go 200mV on air. I personally hate the Trixx software so much I'm shaking. I also use a 290 Tri-X; just use 100mV - you'll probably achieve a 1100-1150 OC with nice temps.


----------



## paradoxum

Quote:


> Originally Posted by *Klocek001*
> 
> Don't go 200mV on air. I personally hate TriXXX software so much I'm shaking. I also use 290 tri-x, just use 100mV, you'll probably achieve 1100-1150 OC with nice temps.


I've got quite a hefty water setup; these were my temps when benching with +100mV, +50% power limit, 1160 core, and 1600 memory clock: https://i.imgur.com/AfceNda.png

Is it too risky to go higher than 100mV, or with those temps, what do you reckon?

Edit: I only installed Trixx just now because MSI AB wasn't letting me go above 100mV, but Trixx lets me take it to 200. Is there a way to unlock MSI AB to allow higher?

Also, a minor question: since I have no fan connected to the card due to it being water cooled, is there any voltage being sent to the fan header? I always set the fan speed manually to the lowest it will let me anyway (20%), but Trixx lets me set 1%. I figured it might help a tiny bit, but I was just wondering whether the card knows there's no cooler attached and whether it sends any volts to the fan header or not.


----------



## ozzy1925

Quote:


> Originally Posted by *paradoxum*
> 
> I've got quite a hefty water setup, these were my temps when benching with 100mv +50% power limit, 1160 core and 1600 mem clock: https://i.imgur.com/AfceNda.png
> 
> Is it too risky to go higher than 100mv or with those temps what do you reckon?
> 
> Edit: I only installed TRIXX just now because MSI AB wasn't letting me go above 100mv, but TRIXX lets me take it to 200. Is there a way to unlock MSI AB to allow higher?
> 
> Also, minor question, but since I have no fan connected to the card due to being water cooled, is there any voltage being sent to the fan header? I always try and set the fan speed manually to the lowest it will let me anyway (20%), but TRIXX lets me set 1%. I figured it might help a tiny bit, but it was just a small thing I was wondering about if the card knows there's no cooler attached whether it sends any volts to the fan header or not.


What's the core speed you are planning? For me, +200mV is not necessary if I don't go over 1170 core.


----------



## rdr09

Quote:


> Originally Posted by *piquadrat*
> 
> Is there a way in AMD's drivers to adjust the gamma value, not only for the desktop but for all applications, games included?


Have you tried Desktop Management in CCC?


----------



## piquadrat

Quote:


> Originally Posted by *rdr09*
> 
> Have you tried Desktop Management in CCC?


Yes, but that changes only applications which do not take exclusive use of the whole display, so not games running in fullscreen.
I wish I had a monitor with more gamma settings.


----------



## rdr09

Quote:


> Originally Posted by *piquadrat*
> 
> Yes, but that changes only applications which do not take exclusive use of the whole display, so not games running in fullscreen.
> I wish I had a monitor with more gamma settings.


sorry, can't help.


----------



## KeepWalkinG

Have you noticed that after sleep mode, the voltage on the R9 290 goes back to stock?


----------



## spikezone2004

Quote:


> Originally Posted by *paradoxum*
> 
> I've got quite a hefty water setup, these were my temps when benching with 100mv +50% power limit, 1160 core and 1600 mem clock: https://i.imgur.com/AfceNda.png
> 
> Is it too risky to go higher than 100mv or with those temps what do you reckon?
> 
> Edit: I only installed TRIXX just now because MSI AB wasn't letting me go above 100mv, but TRIXX lets me take it to 200. Is there a way to unlock MSI AB to allow higher?
> 
> Also, minor question, but since I have no fan connected to the card due to being water cooled, is there any voltage being sent to the fan header? I always try and set the fan speed manually to the lowest it will let me anyway (20%), but TRIXX lets me set 1%. I figured it might help a tiny bit, but it was just a small thing I was wondering about if the card knows there's no cooler attached whether it sends any volts to the fan header or not.


What setup do you have for those temps, and what block is on your GPU?

I'm trying to compare it to mine to see if I can get near those temps with that overclock.


----------



## gobblebox

Quote:


> Originally Posted by *piquadrat*
> 
> Didn't my advice work for you?
> I have no problem with passing +100mV in AB using 290 and 14.7 drivers.
> Just use bat file to load clocks profile and after that set voltage offset. For instance to load profile 2 and get +150mV:
> 
> Code:
> 
> 
> CD <path_to_afterburner_install_folder>
> MSIAfterburner.exe -profile2
> MSIAfterburner.exe /wi6,30,8d,18


I saw your suggestion and I think I misread it the first time, not realizing that wi6 wouldn't show anything in AB. I'll try that method when I get off work and monitor the voltage with hwinfo during a bench. Thanks for the tip!

One more question, is the second .exe command supposed to launch AB at all or do I need to launch it after running the batch file?


----------



## ZealotKi11er

Guys, I think down-sampling may finally be coming to AMD drivers.


----------



## piquadrat

Quote:


> Originally Posted by *gobblebox*
> 
> I saw your suggestion and I think I misread it the first time, not realizing that wi6 wouldn't show anything in AB. I'll try that method when I get off work and monitor the voltage with hwinfo during a bench. Thanks for the tip!
> 
> One more question, is the second .exe command supposed to launch AB at all or do I need to launch it after running the batch file?


First command with exe will start the AB instance if not launched already.
Second command does not.


----------



## gobblebox

Quote:


> Originally Posted by *piquadrat*
> 
> First command with exe will start the AB instance if not launched already.
> Second command does not.


So, I'll create a profile with the voltage at 0 by default, clocks at 1150/1475, then save it and close AB. Then I'll run the suggested batch commands to load the appropriate profile & apply a +150mv offset (Which should show the appropriate voltage change in HWiNFO while benching). Just letting you know exactly what I'll do just in case I'm somehow still missing something.

Thanks, I'll give this a shot & report back when I get to test it out.


----------



## piquadrat

Quote:


> Originally Posted by *gobblebox*
> 
> So, I'll create a profile with the voltage at 0 by default, clocks at 1150/1475, then save it and close AB. Then I'll run the suggested batch commands to load the appropriate profile & apply a +150mv offset (Which should show the appropriate voltage change in HWiNFO while benching). Just letting you know exactly what I'll do just in case I'm somehow still missing something.
> 
> Thanks, I'll give this a shot & report back when I get to test it out.


You don't need to worry whether an AB instance is already launched or not - it won't be duplicated.
Start with a conservative voltage offset (like +100mV) in case it works differently on your system for some unknown reason. But I doubt it will.

And you will see the appropriate voltage change even at idle, no need to bench. That is how the offset works.
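For anyone curious what the `/wi6,30,8d,18` switch in the quoted batch file actually encodes: the last byte is the offset register value, and it works out to +150mV if the voltage controller steps in 6.25mV increments (0x18 = 24 steps, 24 x 6.25 = 150). That step size is an assumption inferred from piquadrat's example, and the bus/register bytes (`30`, `8d`) are copied from his post; a small sketch to build the switch for other offsets:

```python
# Sketch: build Afterburner's /wi switch for a desired voltage offset.
# ASSUMPTION: the controller uses 6.25 mV per register step (this matches
# the quoted example, where value 0x18 corresponds to +150 mV). The
# gpu/bus/register defaults below are taken from piquadrat's batch file.
STEP_MV = 6.25

def offset_to_hex(offset_mv):
    """Convert a desired core-voltage offset in mV to the register byte (hex)."""
    steps = round(offset_mv / STEP_MV)
    if not 0 <= steps <= 0xFF:
        raise ValueError("offset out of one-byte range")
    return format(steps, "x")

def wi_command(offset_mv, gpu=6, bus="30", reg="8d"):
    """Build the full Afterburner command-line switch string."""
    return "/wi%d,%s,%s,%s" % (gpu, bus, reg, offset_to_hex(offset_mv))

print(wi_command(150))  # /wi6,30,8d,18 - matches the quoted batch file
print(wi_command(100))  # /wi6,30,8d,10
```

This is only arithmetic over the posted example, not a statement about every card's VRM controller; verify the actual voltage with HWiNFO as suggested above.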


----------



## Gobigorgohome

Anyone here having problems with three/four R9 290Xs at 4K in BF4? I play at ultra (without AA) with my 4.5 GHz 3930K and all the cards at stock. The counter reads well above 100 fps most of the time and never below 75 fps, yet it feels VERY BAD - as if I were getting about 25 fps ... I will try with AA too, to see whether I'm simply not loading the GPUs enough. Anybody getting the same feeling?

Never mind, BF4 must be poorly optimized for 4K; I get a better experience (less lagging) at ultra with 4x MSAA than without MSAA. Good work EA.


----------



## duhasttas

Greetings everybody, long time no post. I recently purchased an R9 290 and have been fooling around with OCing the little beast, but I've come across what looks like an anomaly in the VDDC read-out from GPU-Z (I tried my Google-Fu but failed). Running Heaven maxed out, my card is shown using only 1.000 V VDDC, and from my findings that is extremely low compared to a lot of other cards even at stock speeds. Currently my card is OC'd to 1035/1350 on these stock volts, as I haven't upped the voltage or power limit in TriXX. Any insight as to why my card is consuming, or is shown to be consuming, such low voltage?


----------



## BradleyW

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Guys, I think down-sampling may finally be coming to AMD drivers.


???


----------



## smoke2

I would like to ask owners of the Sapphire Tri-X 290: isn't the card too noisy at idle? It has a relatively high minimum RPM.
I have a Fractal R4 case with every fan on 7V, including the original Fractal fans.
Will this card at idle be noisier than the case fans?


----------



## Loktar Ogar

Quote:


> Originally Posted by *smoke2*
> 
> I would like to ask owners of the Sapphire Tri-X 290: isn't the card too noisy at idle? It has a relatively high minimum RPM.
> I have a Fractal R4 case with every fan on 7V, including the original Fractal fans.
> Will this card at idle be noisier than the case fans?


The Tri-X fans are quiet in operation at 25% to 30% at idle below 40c, but noticeable when the temps go above 50c - 75c and the fans ramp up to 50% to 70%.

I believe there are videos available online for this as well if you like to check.









P.S.
I have a fan profile that holds 35% up to 50c, so the card stays quiet while not gaming; the temps go past 50c automatically in any game I've played.


----------



## invincible20xx

Quote:


> Originally Posted by *Loktar Ogar*
> 
> The Tri-X fans are quiet in operation at 25% to 30% at idle below 40c, but noticeable when the temps go above 50c - 75c and the fans ramp up to 50% to 70%.
> 
> I believe there are videos available online for this as well if you like to check.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P.S.
> I have a fan profile that holds 35% up to 50c, so the card stays quiet while not gaming; the temps go past 50c automatically in any game I've played.


yea i concur, the tri-x fans are really quiet !!


----------



## ZealotKi11er

Quote:


> Originally Posted by *BradleyW*
> 
> ???


You know, the feature Nvidia has had for ages. It might be coming to AMD too. I have not tried it myself but will be doing so soon and will let you guys know. I personally can't wait to run my games @ 4K.


----------



## invincible20xx

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You know, the feature Nvidia has had for ages. It might be coming to AMD too. I have not tried it myself but will be doing so soon and will let you guys know. I personally can't wait to run my games @ 4K.


Think I will have to stick to 1080p till I move from this country. No more money will go into upgrading my rig because I'm moving next year, so there's no point fiddling with my sig rig anymore. I wonder how great games could look @ 4K, though.


----------



## ZealotKi11er

Quote:


> Originally Posted by *invincible20xx*
> 
> Think I will have to stick to 1080p till I move from this country. No more money will go into upgrading my rig because I'm moving next year, so there's no point fiddling with my sig rig anymore. I wonder how great games could look @ 4K, though.


You can run 4K on a 1080p monitor. It will eliminate the need for AA and make games look sharper. Good for older games and non-demanding games, or games that have bad AA support or no AA at all.


----------



## invincible20xx

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You can run 4K on a 1080p monitor. It will eliminate the need for AA and make games look sharper. Good for older games and non-demanding games, or games that have bad AA support or no AA at all.


Not sure my gaming rig can pull off 4K even in older titles lol

Edit: also noticed you are from Canada - can you check the thread in my signature? Thanks, take care.


----------



## Arizonian

Quote:


> Originally Posted by *ABADY*
> 
> http://www.techpowerup.com/gpuz/details.php?id=ar558


Congrats - Gigabyte Windforce 290 - added


----------



## Mega Man

Quote:


> Originally Posted by *KeepWalkinG*
> 
> Have you noticed that after sleep mode the voltage on the R9 290 goes back to stock?


simple solution, dont sleep !


----------



## FuriousPop

Can anyone advise on the best drivers to use at the moment for 2/3-way configs?

Silly me - just checked and I'm still running 13.12!!!!!!

Been busy with my water build.. temps look OK except for the 3rd card - think I put too much thermal paste on it - it reads an extra 3C over the other 2 cards and under load maxes out at 59C while the other 2 max out at 48/49C.

Been seeing a lot of mentions of 14.4 - any link to the old ones?


----------



## Spectre-

Quote:


> Originally Posted by *FuriousPop*
> 
> Can anyone advise on the best drivers to use at the moment for 2/3-way configs?
> 
> Silly me - just checked and I'm still running 13.12!!!!!!
> 
> Been busy with my water build.. temps look OK except for the 3rd card - think I put too much thermal paste on it - it reads an extra 3C over the other 2 cards and under load maxes out at 59C while the other 2 max out at 48/49C.
> 
> Been seeing a lot of mentions of 14.4 - any link to the old ones?


14.6 for benching

14.4 for stability


----------



## Vici0us

Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KeepWalkinG*
> 
> Have you noticed that after sleep mode the voltage on the R9 290 goes back to stock?
> 
> 
> 
> simple solution, dont sleep !
Click to expand...

I wouldn't know; thanks to my SSD I never put my rig to sleep.


----------



## Germanian

14.6 RC2 is the best driver yet. 0 problems for me.
Crossfire works in almost all my games except for some older ones.

Playing Witcher 2 atm and it's awesome. Also heroes of the storm atm.


----------



## paradoxum

Quote:


> Originally Posted by *spikezone2004*
> 
> What setup do you have for those temps? and block on your gpu?
> 
> Trying to compare it to mine see if i can get near those temps with that overclock


My rig is fully up to date. I've been posting my settings/results with pics in this thread starting from page 9 or 10; it has all the details: www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures


----------



## Spongeworthy

Does anyone's 290X go completely nuts if you try to go 1.5v+ using PT1? Screen starts flickering like crazy.

Using 14.7


----------



## Zipperly

Quote:


> Originally Posted by *Spongeworthy*
> 
> Does anyone's 290X go completely nuts if you try to go 1.5v+ using PT1? Screen starts flickering like crazy.
> 
> Using 14.7


It's possible that you are pushing too much voltage and the card simply cannot cope. I would never try 1.5v+ unless you are on some serious cooling, and even then it would make me nervous.


----------



## BradleyW

Quote:


> Originally Posted by *Spongeworthy*
> 
> Does anyone's 290X go completely nuts if you try to go 1.5v+ using PT1? Screen starts flickering like crazy.
> 
> Using 14.7


I sure hope you are kidding. Don't use that kind of voltage.


----------



## Talon720

Quote:


> Originally Posted by *Zipperly*
> 
> It's possible that you are pushing too much voltage and the card simply cannot cope. I would never try 1.5v+ unless you are on some serious cooling, and even then it would make me nervous.


Yea, I sure hope you're on water cooling at the least. My cards start flickering/black-screening at anything over 1.4v. I've read about people having this issue on other forums. If you have more than one card you could swap cards around; the display card might get messed up whereas another card won't. I know for me the screen goes black, but frames are still being rendered.

I saw Ranger had posted a guide to overclocking the 290/290X on another forum. I've never seen him in this thread, or maybe he can't come in here.


----------



## Klocek001

Quote:


> Originally Posted by *Spongeworthy*
> 
> Does anyone's 290X go completely nuts if you try to go 1.5v+ using PT1? Screen starts flickering like crazy.
> 
> Using 14.7


There's maybe a few percent of 290 cards, the absolute cream of the crop, that will give you no issues at 1.5v freaking vcore.


----------



## kizwan

All you guys need is good cooling, at least sub-ambient. If you search this thread, I'm pretty sure you can find a few people running at 1.5V at least (both before & after vdroop). Not 24/7 of course, but for benching.


----------



## jomama22

Quote:


> Originally Posted by *kizwan*
> 
> All you guys need is good cooling, at least sub-ambient cooling. If you search in this thread, I'm pretty sure you can find few people running at 1.5V at least (both before & after Vdroop). Not 24/7 of course but for benching.


Just a fair warning: the actual silicon is not going to be the first thing to go; it will be a cap/VRM right in the middle of the stack on reference cards. 1.45v was enough to pop one under LN2. Not saying they will all do this, but because of how these chips are designed, it is very difficult even under LN2 to find one that will keep running through an entire bench above 1.45v. Went through 11 290Xs and found that out.

Remember, with these chips, because the IMC is weaker than that of Tahiti, it is much more sensitive to the voltage you are throwing at the chip. Thus, even if the shaders, ROPs, rasterizers etc. are all working properly and allowing higher clocks, a weak or very sensitive IMC will ruin it.

It's not all about Hynix vs Elpida, it's about how well the IMC works in conjunction with the rest of the chip. Sandy Bridge-E is a very good parallel in this regard, where a higher core clock would make it much more difficult to attain higher (and stable) memory clocks.

I had a launch-day 290X hit 1357/1749 (6996 effective) with Hynix at a DMM-checked voltage of 1.43v on water. Anything above that voltage would end up with either black screens or lock-ups with visual artifacts. I had 2 other 290Xs hit 1340+ as well under the same voltage requirements, though those had Elpida and were restricted to 1625 (6500) on the memory clock.
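The paired memory figures above (1749 alongside 6996, 1625 alongside 6500) are the command clock and the effective data rate: GDDR5 transfers four bits per pin per clock, so the effective rate is simply the clock times four. A quick helper to sanity-check such numbers, with the 512-bit Hawaii bus width used for the bandwidth figure:

```python
# GDDR5 effective rate and bandwidth arithmetic, matching the clock pairs
# quoted in the post (1749 -> 6996, 1625 -> 6500).
def gddr5_effective(clock_mhz):
    # GDDR5 is quad-pumped: four transfers per clock per pin
    return clock_mhz * 4

def bandwidth_gbs(clock_mhz, bus_bits=512):
    # Hawaii (290/290X) uses a 512-bit memory bus;
    # bytes/s = effective MT/s x bus width in bytes, reported in GB/s
    return gddr5_effective(clock_mhz) * (bus_bits // 8) / 1000

print(gddr5_effective(1749))          # 6996
print(gddr5_effective(1625))          # 6500
print(round(bandwidth_gbs(1250), 1))  # 320.0 - stock 290X bandwidth
```

Handy when comparing overclocks quoted in different conventions (some posters list the command clock, others the effective rate).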


----------



## KeepWalkinG

My card has a big vdroop: at +200mV in Unigine Valley 1.0 the voltage is 1.180v, and at +300mV it's 1.215v.
For a moment I saw a high voltage of 1.4v in GPU-Z, but it was only for a second...

Does this mean I can add more voltage since I have a large vdroop?


----------



## Zipperly

Quote:


> Originally Posted by *KeepWalkinG*
> 
> My card has a big vdroop: at +200mV in Unigine Valley 1.0 the voltage is 1.180v, and at +300mV it's 1.215v.
> For a moment I saw a high voltage of 1.4v in GPU-Z, but it was only for a second...
> 
> Does this mean I can add more voltage since I have a large vdroop?


No, what you need is a modded BIOS that deals with vdroop. It's what I did when I had my 290X, but you have to be careful and watch your temps. I had good aftermarket cooling, so it was not a problem for me.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Zipperly*
> 
> No, what you need is a modded BIOS that deals with vdroop. It's what I did when I had my 290X, but you have to be careful and watch your temps. I had good aftermarket cooling, so it was not a problem for me.


So with a modded BIOS, +100mV is a lot more effective?


----------



## KeepWalkinG

But it is hard to find a modded BIOS for the Vapor-X 290.
There was an option on the 7970 Asus Matrix to remove vdroop.


----------



## deezdrama

Playing Tomb Raider today, and after 20 minutes of playing the system freezes, turns black, and then I get this









I'm getting highly frustrated.
I'm running this on a fresh hard drive with a fresh install of Win 7 64-bit.

I did have a 650 Ti installed prior and uninstalled all its drivers and software before installing the 290.
I'm running the newest beta drivers.

I'm starting to wish I'd dumped a little more cash and got an Nvidia card.

Anyone have any ideas or suggestions what I can do?

Sent from my SCH-I605 using Tapatalk


----------



## brazilianloser

Quote:


> Originally Posted by *deezdrama*
> 
> Playing Tomb Raider today, and after 20 minutes of playing the system freezes, turns black, and then I get this
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm getting highly frustrated.
> I'm running this on a fresh hard drive with a fresh install of Win 7 64-bit.
> 
> I did have a 650 Ti installed prior and uninstalled all its drivers and software before installing the 290.
> I'm running the newest beta drivers.
> 
> I'm starting to wish I'd dumped a little more cash and got an Nvidia card.
> 
> Anyone have any ideas or suggestions what I can do?
> 
> Sent from my SCH-I605 using Tapatalk


Are you overclocking the card? Going to assume you have proper power going to it... If you're not overclocking and the power is fine, just increase the power limit by a touch using something like MSI Afterburner. At least that's what did it for me when I first got my cards and something similar was happening.


----------



## deezdrama

Not overclocking; I have a brand new Corsair CX750 PSU.

I've read to enable "constant voltage" and disable ULPS within Afterburner...

I think I'm just going to reinstall Windows.

When I'm done reinstalling, should I download the regular CCC drivers or the beta?

And should I set the above-mentioned settings in Afterburner?
Thanks

Sent from my SCH-I605 using Tapatalk


----------



## deezdrama

And does anyone know how to avoid installing that Raptr program?

I always choose custom install and uncheck the baggage, but didn't see an option for it.

Sent from my SCH-I605 using Tapatalk


----------



## brazilianloser

Just give it some time and someone with more detailed help will appear. But as far as I'm aware, constant voltage is not an option you'd want enabled. On the other hand, disabling ULPS is ideal, since if you are gaming you really don't want your cards going into energy-saving mode for no reason. The current beta is good for benchmarking and such, but if you want a stable system I would stick with the last non-beta release. Make sure to properly uninstall the drivers if changing. I personally don't use the Raptr app, but some folks do like it... You can always uninstall it if you come to dislike it or find no real use for it.


----------



## rdr09

Quote:


> Originally Posted by *deezdrama*
> 
> And does anyone know how to avoid installing that Raptr program?
> 
> I always choose custom install and uncheck the baggage, but didn't see an option for it.
> 
> Sent from my SCH-I605 using Tapatalk


As for Raptr, if it came installed, I just go to msconfig and uncheck it from the startup programs. I do the same with CCC. I treat them like Word or Excel: open when needed.


----------



## brazilianloser

Quote:


> Originally Posted by *deezdrama*
> 
> And does anyone know how to avoid installing that Raptr program?
> 
> I always choose custom install and uncheck the baggage, but didn't see an option for it.
> 
> Sent from my SCH-I605 using Tapatalk


On the custom installation process it is usually the last option and its labeled "AMD Gaming Evolved App"


----------



## bluedevil

Played BF4 this morning for about an hour @ 1.175Ghz. I'd call that stable. Gonna try for 1.2Ghz tonite.


----------



## ZealotKi11er

Quote:


> Originally Posted by *bluedevil*
> 
> Played BF4 this morning for about an hour @ 1.175Ghz. I'd call that stable. Gonna try for 1.2Ghz tonite.


I think that's where the wall is with +100mV: 1175 to 1200.


----------



## KeepWalkinG

Quote:


> Originally Posted by *bluedevil*
> 
> Played BF4 this morning for about an hour @ 1.175Ghz. I'd call that stable. Gonna try for 1.2Ghz tonite.


With scaling at 150% on a 1080p monitor?


----------



## bluedevil

Quote:


> Originally Posted by *KeepWalkinG*
> 
> With scaling at 150% on a 1080p monitor?


Nope. 1440p.


----------



## smoke2

Quote:


> Originally Posted by *Loktar Ogar*
> 
> The Tri-X fans are quiet in operation at 25% to 30% at idle below 40c, but noticeable when the temps go above 50c - 75c and the fans ramp up to 50% to 70%.
> 
> I believe there are videos available online for this as well if you like to check.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P.S.
> I have a fan profile that holds 35% up to 50c, so the card stays quiet while not gaming; the temps go past 50c automatically in any game I've played.


How high does the temperature get while watching a Full HD YouTube video?
And at what RPM or % do the fans run?
I watch videos like that often.
And what about power consumption?
A GPU-Z screenshot would probably help.


----------



## heroxoot

Quote:


> Originally Posted by *smoke2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Loktar Ogar*
> 
> The Tri-X fans are quiet in operation at 25% to 30% at idle below 40c, but noticeable when the temps go above 50c - 75c and the fans ramp up to 50% to 70%.
> 
> I believe there are videos available online for this as well if you like to check.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P.S.
> I have a fan profile that holds 35% up to 50c, so the card stays quiet while not gaming; the temps go past 50c automatically in any game I've played.
> 
> 
> 
> How high does the temperature get while watching a Full HD YouTube video?
> And at what RPM or % do the fans run?
> I watch videos like that often.
> And what about power consumption?
> A GPU-Z screenshot would probably help.
Click to expand...

Personal experience: YouTube causes a lot less heat in 1080p HD compared to twitch.tv. I see my GPU get to 52c tops on Twitch, but on YouTube it really does not get much hotter - maybe 48c at the top.


----------



## smoke2

Quote:


> Originally Posted by *heroxoot*
> 
> Personal experience: YouTube causes a lot less heat in 1080p HD compared to twitch.tv. I see my GPU get to 52c tops on Twitch, but on YouTube it really does not get much hotter - maybe 48c at the top.


But you have a reference card, which has less than half the power consumption of the non-reference cards.
It can be checked in GPU-Z via the VDDC Power In and VDDC Power Out items.
There can be big differences between reference and non-reference cards.
Try YouTube with hardware acceleration on and play it in the background with some other browser tabs open.
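Since the GPU-Z "VDDC Power In" and "VDDC Power Out" items keep coming up here: the gap between the two readings is loss in the VRM, so the pair can be turned into an efficiency figure. The 100W/70A numbers below are the ones mentioned later in this exchange, used purely as an illustration (and the 1.2V VDDC is an assumed load voltage, not a measured one):

```python
# Turn GPU-Z's VDDC Power In / Power Out pair into a VRM efficiency figure.
# ASSUMPTION: Power Out is approximately core current x core voltage, and
# the 100 W / 70 A / 1.2 V inputs are illustrative, taken from the thread.
def vrm_efficiency(power_in_w, power_out_w):
    """Percentage of input power that reaches the GPU core."""
    return 100 * power_out_w / power_in_w

def power_out(current_a, vddc):
    """Rough Power Out estimate from current and core voltage."""
    return current_a * vddc

p_out = power_out(70, 1.2)                   # 84.0 W at 70 A / 1.2 V
print(round(p_out, 1))
print(round(vrm_efficiency(100, p_out), 1))  # 84.0 - i.e. ~84% efficient
```

The ~16W difference is dissipated in the VRM itself, which is why VRM temperatures track these readings so closely.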


----------



## HeatM1ser

XFX R9 290, stock cooling

I think this is the right number: 102C6711100 000001 (correct me if Im wrong)


----------



## rdr09

Quote:


> Originally Posted by *smoke2*
> 
> But you have a reference card, which has less than half the power consumption of the non-reference cards.
> It can be checked in GPU-Z via the VDDC Power In and VDDC Power Out items.
> There can be big differences between reference and non-reference cards.
> Try YouTube with hardware acceleration on and play it in the background with some other browser tabs open.


Hardware acceleration creates issues for some. Mine is fine with it enabled. This is what you're looking for . . .



edit: mine is watercooled, btw.


----------



## Spongeworthy

Quote:


> Originally Posted by *Klocek001*
> 
> There's maybe like a few percent, the absolute cream of the crop, of 290 cards that will give you no issues at 1,5v freaking vcore


Even after vdroop? It doesn't even breach 1.4v before I start getting flickering; even at 1.45v, after vdroop it's like ~1.35v, and I get a little static.

To everyone: yes, I'm under water, temps don't exceed 50.


----------



## smoke2

Quote:


> Originally Posted by *rdr09*
> 
> Hardware acceleration creates issues for some. Mine is fine with it enabled. This is what you're looking for . . .
> 
> 
> 
> edit: mine is watercooled, btw.


You have it good, but someone with a custom cooler can have VDDC Power In at almost 100W and VDDC Current Out over 70A.
You can disable HW acceleration, but then the CPU does the work, as far as I know.
Do you have a reference card?


----------



## rdr09

Quote:


> Originally Posted by *smoke2*
> 
> You have it good, but someone with a custom cooler can have VDDC Power In at almost 100W and VDDC Current Out over 70A.
> You can disable HW acceleration, but then the CPU does the work, as far as I know.
> Do you have a reference card?


If I were having issues with bluescreens and such, then I'd disable HA, but I leave it on. Yes, it is reference. I don't use Firefox; I read it is problematic.


----------



## Gobigorgohome

Just did a run of 3DMark Fire Strike Extreme (at the Extreme preset) and got a score of 14680. This is with 4x Sapphire Radeon R9 290X with EK-FC R9 290X Acetal+Nickel blocks at 1100/1300 with no mV added.

Is this an okay score for my system? 3930K running at 4.5 GHz.


----------



## Spongeworthy

Is vdroop this substantial normal? 1.412v in GPU Tweak gets me 1.273v.
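The readings quoted here make the droop easy to quantify: it is just the set voltage minus the voltage measured under load, and it helps to express it as a percentage when comparing cards. A tiny helper using Spongeworthy's numbers as the worked example:

```python
# Quantify vdroop from a set voltage and the voltage measured under load.
# Inputs below are the numbers from the post (1.412 V set, 1.273 V loaded).
def vdroop(v_set, v_load):
    """Return (droop in volts, droop as % of the set voltage)."""
    droop = v_set - v_load
    return droop, 100 * droop / v_set

droop_v, droop_pct = vdroop(1.412, 1.273)
print("droop: %.3f V (%.1f%%)" % (droop_v, droop_pct))  # droop: 0.139 V (9.8%)
```

Roughly 10% droop under load is why the thread keeps distinguishing "before vdroop" and "after vdroop" voltages when people compare overclocks.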


----------



## smoke2

Quote:


> Originally Posted by *rdr09*
> 
> If I were having issues with bluescreens and such, then I'd disable HA, but I leave it on. Yes, it is reference. I don't use Firefox; I read it is problematic.


Reference cards don't have these power consumption issues while playing YouTube.
But someone with a Tri-X could also test it for comparison.








And how noisy does it get compared to idle?


----------



## rv8000

AMD has been awfully quiet on the driver front, I wonder what gives D: ?


----------



## rdr09

Quote:


> Originally Posted by *Spongeworthy*
> 
> Is vdroop this substantial normal? 1.412v in GPU Tweak gets me 1.273v.


There is a drop-down window; set it to show the max.

@rv8000, I read the new WHQL is coming out soon. Very soon.


----------



## Spongeworthy

Quote:


> Originally Posted by *rdr09*
> 
> There is a drop-down window; set it to show the max.
> 
> @rv8000, I read the new WHQL is coming out soon. Very soon.




Not a whole lot of difference.


----------



## KeepWalkinG

Is this 370W reading real? +100mV, 1150/1625.
After Unigine Valley 1.0:


It is very hot here - 29C room temperature.
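A 370W-class reading at +100mV and 1150MHz is at least in the right ballpark, because dynamic power scales roughly with frequency times voltage squared. The baseline figures below (250W board power at 947MHz/1.20V for a stock 290) are assumed round numbers for the sake of the estimate, not measured values:

```python
# Rough sanity check on an overvolted/overclocked power reading using the
# approximation P ~ f * V^2.
# ASSUMPTION: 250 W at 947 MHz / 1.20 V is a ballpark stock-290 baseline.
def scaled_power(p_base, f_base, v_base, f_new, v_new):
    """Estimate power after a clock and voltage change."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

est = scaled_power(250, 947, 1.20, 1150, 1.30)
print(round(est))  # ~356 W - same ballpark as the screenshot
```

So the readout is plausible, even if (as Mega Man notes below) software sensors are only approximately accurate.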


----------



## Gobigorgohome

I use Fujipoly Ultra Extreme thermal pads on my R9 290Xs with water cooling. The VRM1 temperature is 47C while the VRM2 temperature is 57C at 1100/1300 - is this okay?

Will I be able to push to, let's say, 1150/1350 with that?

I am using MSI AB, power limit at +30%, no tweaks on the mV.


----------



## Mega Man

Quote:


> Originally Posted by *KeepWalkinG*
> 
> Is this 370W reading real? +100mV, 1150/1625.
> After Unigine Valley 1.0:
> 
> It is very hot here - 29C room temperature.


Please don't trust software readouts; they are only semi-close.


Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Zipperly*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Spongeworthy*
> 
> Does anyone's 290X go completely nuts if you try to go 1.5v+ using PT1? Screen starts flickering like crazy.
> 
> Using 14.7
> 
> 
> 
> Its possible that you are pushing too much voltage and the card simply cannot cope. I would never try to do 1.5+vc unless you are on some serious cooling and even then it would make me nervous.
Click to expand...

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Spongeworthy*
> 
> Does anyone's 290X go completely nuts if you try to go 1.5v+ using PT1? Screen starts flickering like crazy.
> 
> Using 14.7
> 
> 
> 
> I sure hope you are kidding. Don't use that kind of voltage.
Click to expand...

Quote:


> Originally Posted by *kizwan*
> 
> All you guys need is good cooling, at least sub-ambient cooling. If you search in this thread, I'm pretty sure you can find few people running at 1.5V at least (both before & after Vdroop). Not 24/7 of course but for benching.


Quote:


> Originally Posted by *jomama22*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> All you guys need is good cooling, at least sub-ambient cooling. If you search in this thread, I'm pretty sure you can find few people running at 1.5V at least (both before & after Vdroop). Not 24/7 of course but for benching.
> 
> 
> 
> Just a fair warning: the actual silicon is not going to be the first thing to go; it will be a cap/VRM right in the middle of the stack on reference cards. 1.45v was enough to pop one under LN2. Not saying they will all do this, but because of how these chips are designed, it is very difficult even under LN2 to find one that will keep running through an entire bench above 1.45v. Went through 11 290Xs and found that out.
> 
> Remember, with these chips, because the IMC is weaker than that of Tahiti, it is much more sensitive to the voltage you are throwing at the chip. Thus, even if the shaders, ROPs, rasterizers etc. are all working properly and allowing higher clocks, a weak or very sensitive IMC will ruin it.
> 
> It's not all about Hynix vs Elpida, it's about how well the IMC works in conjunction with the rest of the chip. Sandy Bridge-E is a very good parallel in this regard, where a higher core clock would make it much more difficult to attain higher (and stable) memory clocks.
> 
> I had a launch-day 290X hit 1357/1749 (6996 effective) with Hynix at a DMM-checked voltage of 1.43v on water. Anything above that voltage would end up with either black screens or lock-ups with visual artifacts. I had 2 other 290Xs hit 1340+ as well under the same voltage requirements, though those had Elpida and were restricted to 1625 (6500) on the memory clock.
Click to expand...





Meh, even I did 1.65v for several hours without issues on water (granted, with 5x 480s), and I know several others who have with no ill effects. Granted, not 24/7, but the "others" I am talking about have done it for far longer than I have.


----------



## HighTemplar

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Just did a little BF4 gaming at 4K with ultra settings (without AA). It seems good - not a great experience, but it works. This is everything at stock though; I need to overclock my CPU a little, as I think it is bottlenecking my quad R9 290X a bit.


That doesn't make any sense, because my 2x 780 Ti setup runs 4K without MSAA @ 55-100fps depending on the map and scenario, and I consider that 'good'. I had planned to add my other two 780 Tis, but they were reference cards and didn't match up with the tall PCBs of the Classifieds.

So what I did was order 4 Sapphire reference 290Xs, just because I got them cheap (between 275-310 each), and I plan on adding full-cover EK blocks. But if you're only achieving a 'good' 4K experience with them, it doesn't make any sense to put the effort in.

The whole idea was the Mantle benefits, albeit minimal, and that the extra VRAM would be a little more future-proof than the 3GB VRAM limit on the 780 Ti Classifieds.

However, if you're not seeing good scaling, idk if it's a good idea or not.


----------



## FuriousPop

Can someone help me out with temps to aim for (water build) with 2+ R9 290s?

When playing BF4 last night my first 2 cards hung around the 45C mark and my 3rd card was in the 50s....

Or if you can provide a link to screenshots or something else I can compare with... kind of stumped at the moment.


----------



## heroxoot

Quote:


> Originally Posted by *smoke2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Personal experience: YouTube causes a lot less heat in 1080p HD compared to twitch.tv. I see my GPU get to 52c tops on Twitch, but on YouTube it really does not get much hotter - maybe 48c at the top.
> 
> 
> 
> But you have a reference card, which has less than half the power consumption of the non-reference cards.
> It can be checked in GPU-Z via the VDDC Power In and VDDC Power Out items.
> There can be big differences between reference and non-reference cards.
> Try YouTube with hardware acceleration on and play it in the background with some other browser tabs open.
Click to expand...

I see 1.1v when watching either one. My GPU only clocks up about halfway at most for fullscreen Twitch, and to about 400MHz for YouTube. I think something about it being live on Twitch hammers Flash more.


----------



## fishingfanatic

Ok, in anticipation of the new wcing beast when it gets here, I decided to go to a proper wcing system.

I probably should've gone 1/2x3/4, but I got a good deal on some BNIB 3/8x5/8 fittings, so I went with those.

A pair of 480s and a D5 pump: EK-CoolStream PE 480,
EK-D5 X-RES 140 CSQ - Acetal D5 Top-Reservoir Combo,
DazMode STORM D5 STRONG 8-24V pump.

The best-reviewed CPU block I could find, the EK Supremacy Clean CS Gold.

A decent res combo, a bunch of swivel fittings, hoses and quick connects as well. Some protector and a killcoil,...

That should keep the Devil from overheating!!!




Meanwhile I think I'll try out the Titans and 670s I already have set up for wcing on the Exos V2 755. Just needed a few more fittings for it and some more tubing. I'll bet there's always a shortage of hosing at home for some enthusiasts. lol

I didn't bother to get a mobo block set. With all of the other stuff being kept cooler, plus high-CFM case fans, it shouldn't get too warm in that case. I changed the exhaust to 90 CFM fans and added an extra intake of 60+ CFM.

Just that dropped my case temps down 6C! The hardest part is putting the nylon filters on the fans, as I have a pair of hairy dogs, and it's a constant battle of vigilance that keeps my stuff virtually dust and hair free, but I swear by those nylons.

FF


----------



## Jflisk

Quote:


> Originally Posted by *bluedevil*
> 
> Played BF4 this morning for about an hour @ 1.175Ghz. I'd call that stable. Gonna try for 1.2Ghz tonite.


You got your card back - congrats!


----------



## kizwan

Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KeepWalkinG*
> 
> This 370wattage is real? +100mV 1150/1625
> After Unigine Valley 1.0:
> 
> Here is very hot 29C room temperature.
> 
> 
> 
> please dont trust software readouts they are semi close
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Zipperly*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Spongeworthy*
> 
> Does anyone's 290X go completely nuts if you try to go 1.5v+ using PT1? Screen starts flickering like crazy.
> 
> Using 14.7
> 
> Click to expand...
> 
> Its possible that you are pushing too much voltage and the card simply cannot cope. I would never try to do 1.5+vc unless you are on some serious cooling and even then it would make me nervous.
> 
> Click to expand...
> 
> Quote:
> 
> 
> 
> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Spongeworthy*
> 
> Does anyone's 290X go completely nuts if you try to go 1.5v+ using PT1? Screen starts flickering like crazy.
> 
> Using 14.7
> 
> Click to expand...
> 
> I sure hope you are kidding. Don't use that kind of voltage.
> 
> Click to expand...
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> All you guys need is good cooling, at least sub-ambient cooling. If you search in this thread, I'm pretty sure you can find few people running at 1.5V at least (both before & after Vdroop). Not 24/7 of course but for benching.
> 
> Click to expand...
> 
> Quote:
> 
> 
> 
> Originally Posted by *jomama22*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> All you guys need is good cooling, at least sub-ambient cooling. If you search in this thread, I'm pretty sure you can find few people running at 1.5V at least (both before & after Vdroop). Not 24/7 of course but for benching.
> 
> Click to expand...
> 
> Just a fair warning: the actual silicon is not going to be the first thing to go; it will be a cap/VRM right in the middle of the stack on reference cards. 1.45V was enough to pop it under LN2. Not saying they are all going to do this, but because of how these chips are designed, it is very difficult even under LN2 to find one that will keep running for you through an entire bench above 1.45V. Went through 11 290Xs and found that out.
> 
> Remember, with these chips, because the IMC is weaker than that of Tahiti, it is much more sensitive to the voltage you are throwing at the chip. Thus, even if the shaders, ROPs, rasterizers etc. are all working properly and allowing higher clocks, a weak or very sensitive IMC will ruin it.
> 
> It's not all about Hynix vs Elpida; it's about how well the IMC works in conjunction with the rest of the chip. Sandy Bridge-E is a very good parallel in this regard, where a higher core clock would make it much more difficult to attain higher (and stable) memory clocks.
> 
> I had a launch-day 290X hit 1357/1749 (6996) with Hynix at a DMM-checked voltage of 1.43V on water. Anything above that voltage would end up with either black screens or lock-ups with visual artifacts. I had 2 other 290Xs hit 1340+ as well under the same voltage requirements, though those had Elpida and were restricted to 1625 (6500) on the memory clock.
> 
> Click to expand...
> 
> 
> 
> 
> 
> meh even i did 1.65v several hours without issues on water, granted 5x480s but i know several that have with no ill effects granted not 24/7 but the "others" i am talking about has done it for far longer then i
Click to expand...


----------



## fishingfanatic

kizwan, what r u running at with 5 480s? R u in a sauna??? lol

Seriously, ur temps must be phenomenal! How many pumps?

FF


----------



## kizwan

Quote:


> Originally Posted by *fishingfanatic*
> 
> kizwan what r u running at with 5 480s. R u in a sauna??? lol
> 
> Seriously, ur temps must be phenominal! How many pumps?
> 
> FF


You're mistaken. Mega Man is the one that has 5x480s. I only have 2x360s.







Temps are not phenomenal (when overclocked & overvolted, +200mV = 1.37-1.39V after vdroop) without A/C, because ambient is always 30+ °C. I'll need a chiller if I want to push higher.


----------



## fishingfanatic

Oh, my bad. Still, 5 480s sounds insane! That thing must be a full corner of cooling...
lol


----------



## kizwan

Quote:


> Originally Posted by *fishingfanatic*
> 
> oh my bad. Still 5 480s sounds insane! That thing must be a full corner of cooling...
> lol


Yeah, 5 480s... that's the dream!


----------



## gobblebox

Quote:


> Originally Posted by *fishingfanatic*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Ok, in anticipation of the new wcing beast when it gets here, I decided to go to a proper wcing system.
> 
> I probably should've gone 1/2x3/4 but got a good deal on some bnib 3/8x5/8 fittings so went with those.
> 
> A pr of 480s, a D5 pump EK-CoolStream PE 480
> EK-D5 X-RES 140 CSQ - Acetal D5 Top-Reservoir Combo
> DazMode STORM D5 STRONG 8-24V Pump.
> 
> The best reviewed cpu block I could find, EK Supremacy Clean CS Gold.
> 
> a decent res. combo, a bunch of swivel fittings, hoses and quick connects as well. Some protector and killcoil,...
> 
> That should keep the Devil from overheating!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Meanwhile I think I'll try out the titans and 670s I already have set up for wcing on the exos v2 755. Just needed a few more fittings for it
> 
> and some more tubing. I'll bet there's always a shortage of hosing at home for some enthusiasts.lol
> 
> I didn't bother to get a mobo block set. With all of the other stuff being kept cooler and high cfm case fans it shouldn't get too warm in that
> 
> case. Changed the exhaust to 90 cfms and added an extra intake of 60+ cfm
> 
> 
> Just that dropped my case temps down 6 C! The hardest part is putting the nylon filters on the fans as I have a pr of hairy dogs and it's a
> 
> constant battle of vigilance that keeps my stuff virtually dust and hair free but I swear by those nylons.
> 
> FF


Which nylon filters are you using? The stock filters in my Arc Midi R2 are choking the hell out of my intake, but with negative airflow I don't want to remove them... do you have a link? I am very interested in what you speak of...


----------



## fishingfanatic

Women's nylons over the fan to eliminate hair getting in. Buy the cheapies, but large enough to cover the whole shroud. Double-sided tape around the screw holes before making a hole in the nylon, so it doesn't tear right up. Gently make a hole for each screw as necessary.

Works great!

Home made hair filters...









Actually found a dead ant once in my case but not with those.


----------



## Arizonian

Quote:


> Originally Posted by *HeatM1ser*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> XFX R9 290, stock cooling
> 
> I think this is the right number: 102C6711100 000001 (correct me if Im wrong)


Congrats - added


----------



## Mega Man

Quote:


> Originally Posted by *HighTemplar*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gobigorgohome*
> 
> Just did a little BF4 gaming at 4K with ultra settings (without AA), it seems to be good, not a great experience but it seems to work. This is everything at stock though, need to overclock my CPU a little, I think it is bottlenecking my quad R9 290X a bit.
> 
> 
> 
> That doesn't make any sense because my 2x 780 Ti setup runs 4K without MSAA @ 55-100fps depending on the map and scenario, and I consider that 'good'. I had planned to add my other 2 780 Ti's but they were reference cards and didn't match up with the tall PCBs of the Classifieds.
> 
> So what I did was I ordered 4 290X Sapphire Reference cards, just because I got them cheap (between 275-310 each), and I plan on adding full cover EK blocks. But if you're only achieiving a 'good' 4K experience with them, it doesn't make any sense to put the effort in.
> 
> The whole idea was the mantle benefits albeit minimal, and the extra VRAM would be a little more futureproof than the 3GB VRAM limit on the 780 Ti Classifieds.
> 
> However if you're not seeing good scaling, idk if it's a good idea or not.
Click to expand...

Of course it does; if you think 4K drivers will not mature, you are kidding yourself.
Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fishingfanatic*
> 
> kizwan what r u running at with 5 480s. R u in a sauna??? lol
> 
> Seriously, ur temps must be phenominal! How many pumps?
> 
> FF
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You're mistaken. Mega Man is the one that have 5x480. I only have 2x360.
> 
> 
> 
> 
> 
> 
> 
> Temps not phenomenal (when overclocked & overvolted, +200mV = 1.37 - 1.39V after Vdroop) without A/C because ambient is always 30++ Celsius. I'll need a chiller if I want to push higher.
Click to expand...

3x Monstas, 1x UT60, 1x XT45, all packed in a TH10, soon to actually look good (going full acrylic and custom cables). I just need to buy wire, and need winter so I have time (summer is my busy season).

Just bought 100 AP30s; I've been trying (even contacted Nidec Servo, still waiting for one more response) to get AP31s, but no such luck. So instead of ~35-46 @ 5400 RPM I have to live with 4250 RPM. All will be PWM-modded with a min speed of ~1k-1.2k. So excited, can't wait to post pics of my GTs - they look so sexay!

I use 2x MCP35X2s (4x 35Xs); if Swiftech makes an MCP350X2 I will probably switch.

Sneak peek at it while it is uggry!


Spoiler: Warning: Spoiler!


----------



## gobblebox

Quote:


> Originally Posted by *fishingfanatic*
> 
> Women's nylons over the fan to eliminate hair getting in. Buy the cheapys but large to cover the whole shroud. Dbl sided tape around the screw holes b4 making a hole in the nylon, so it doesn't tear right up. Gently make a hole for each screw as necessary.
> 
> Works great!
> 
> Home made hair filters...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Actually found a dead ant once in my case but not with those.


Nice! Very resourceful... but risky... I can just see it now... my wife walking in on me playing with a pair of her nylons...


----------



## jomama22

Quote:


> Originally Posted by *Mega Man*
> 
> please dont trust software readouts they are semi close
> 
> meh even i did 1.65v several hours without issues on water, granted 5x480s but i know several that have with no ill effects granted not 24/7 but the "others" i am talking about has done it for far longer then i


I find it hard to believe you or anyone else was running a reference card @ 1.65V actual, measured with a DMM, for any extended time. If you're going merely by GPU Tweak/GPU-Z/AB then you know the issues there. Also, what BIOS were you using?

Not trying to knock you or anything; I'm running 4x 360s with 24 Delta fans and did a few LN2 runs - I had/have no issues with temps. I'm also the 2nd-highest 3x 290X setup on the FSE HoF, and that was on water @ 1340/1625. So I do know what I am talking about.


----------



## Mega Man

Yet TSM himself said he has run it as well.

Did I take a DMM? No. Did I go off of what I put into GPU Tweak? Yes.

If I am running 1.65V and you have to ask what BIOS I am running, then I...

You, sir,
all the more "since you know what you are talking about"...

Especially when you add that I SAID I was on water.


----------



## Gobigorgohome

Anybody have any suggestions on good settings for 4K gaming? Running 1100/1300 on my four cards now, wish to push them a little at least. Using Fujipoly Ultra Extreme, Gelid GC Extreme and EK-FC R9 290X A+N, I have 2x Hynix and 2x Elpida.


----------



## Gobigorgohome

Quote:


> Originally Posted by *HighTemplar*
> 
> That doesn't make any sense because my 2x 780 Ti setup runs 4K without MSAA @ 55-100fps depending on the map and scenario, and I consider that 'good'. I had planned to add my other 2 780 Ti's but they were reference cards and didn't match up with the tall PCBs of the Classifieds.
> 
> So what I did was I ordered 4 290X Sapphire Reference cards, just because I got them cheap (between 275-310 each), and I plan on adding full cover EK blocks. But if you're only achieiving a 'good' 4K experience with them, it doesn't make any sense to put the effort in.
> 
> The whole idea was the mantle benefits albeit minimal, and the extra VRAM would be a little more futureproof than the 3GB VRAM limit on the 780 Ti Classifieds.
> 
> However if you're not seeing good scaling, idk if it's a good idea or not.


You kind of misunderstood me, but whatever. I got 60+ fps at ultra with 4x MSAA added at 4K in BF4, but I got something that looks like lag even at 120 fps in-game; that is why I said my experience was okay - I like things like this to run smooth. With MSAA off I get more lagging than with MSAA on - probably GPU usage - but it got a bit better after overclocking the cards. Still, what I was talking about was a CPU bottleneck; I was running 3.8GHz at the time, now I have 4.5GHz, so I guess it is a little better.

I have never tried the GTX 780 Ti, but I have tried the GTX 780 and R9 290s, and the R9 290 seems like a better buy than the GTX 780.

Besides, that BF4 runs worse at ultra without MSAA than with MSAA enabled is not my machine's fault, but poor optimizing from EA. It might be the pitfalls of using four cards too, I do not know.

So, basically you want to know if it is worth it? Well, I have paid almost 3000 USD for my cards with waterblocks and everything and I can live with it, but for bang for the buck and all that you will be better off with three cards.







Besides, tri-fire may use 1000W of power; I have 2x 1300W PSUs, because BF4 crashed at ultra with just one 1300W PSU. Be sure you have enough juice.


----------



## bluedevil

Just played BF4 for about 30 min and was stable at 1.2GHz with +75mV and +50% on the power limiter. Tried 1.225GHz... no dice.


----------



## bond32

Quote:


> Originally Posted by *Gobigorgohome*
> 
> You kind of misunderstood me, but whatever. I got 60 fps +++ at ultra with 4x MSAA added at 4K in BF4, I got something that looks like lag even with 120 fps in-game, that is why I said my experience was okay, I like things like this to run smooth. With MSAA off, I get more laging than with the MSAA on, probably GPU usage, but it got a bit better after overclocking the cards. Still what I was talking about was CPU-bottleneck, I was running 3,8 Ghz at the time, now I have 4,5 Ghz so I guess it is a little better.
> 
> I have never tried GTX 780 Ti, but I have tried GTX 780 and R9 290's and the R9 290 seems like a better buy than the GTX 780.
> 
> Beside, that BF4 is running worse at ultra without MSAA than with MSAA enabled, that is not my machines fault, but poor optimizing from EA. It might be the pit-falls with using four cards too, I do not know.
> 
> So, basically you want to know if it is worth it? Well, I have paid almost 3000 USD for my cards with waterblocks and everything and I can live with it, but bang for the buck and all that you will be better off with three cards.
> 
> 
> 
> 
> 
> 
> 
> Beside, tri-fire maybe use 1000 wattage of power, I have 2x 1300 wattage PSU's, because BF4 crashed at ultra with just one 1300 wattage PSU. Be sure you have enough juice.


Just curious here - have you run 3 cards just to see what your framerate is at identical settings to when you run 4?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Anybody have any suggestions on good settings for 4K gaming? Running 1100/1300 on my four cards now, wish to push them a little at least. Using Fujipoly Ultra Extreme, Gelid GC Extreme and EK-FC R9 290X A+N, I have 2x Hynix and 2x Elpida.


@ 4K you want to make sure you are not hitting the vRAM limit. You don't really need to OC the GPUs. Try stock and see how it goes. Don't use AA or resolution scaling.
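For anyone wondering why 4K alone rarely blows past 4GB: the raw render targets are only a slice of total VRAM use (textures dominate). A very simplified back-of-envelope sketch - the byte-per-pixel figures and the colour+depth model are illustrative assumptions, not measured BF4 numbers:

```python
# Rough model: 4 bytes/pixel per sample for an RGBA8 colour target,
# plus the same again for a D24S8 depth/stencil target. MSAA multiplies
# the sample count. Real engines add G-buffers, textures, etc. on top.

def render_target_mib(width: int, height: int, msaa: int = 1) -> float:
    """Approximate colour + depth render-target footprint in MiB."""
    bytes_per_pixel = 4
    colour = width * height * bytes_per_pixel * msaa
    depth = width * height * bytes_per_pixel * msaa
    return (colour + depth) / 2**20

print(round(render_target_mib(3840, 2160), 1))          # ~63.3 MiB, no AA
print(round(render_target_mib(3840, 2160, msaa=4), 1))  # ~253.1 MiB, 4x MSAA
```

Even with 4x MSAA that's roughly a quarter of a GiB for the targets themselves, which is why MSAA level, not resolution, is usually what tips a 3-4GB card over.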


----------



## jomama22

Quote:


> Originally Posted by *Mega Man*
> 
> yet tsm himself said he has run it as well.
> 
> did i take a dmm no.. did i go off of what i put into GPU tweak yes
> 
> if i am running 1.65v and you have to ask what bios i am running then i
> 
> 
> 
> 
> 
> 
> 
> you sir,
> all the more "since you know what you are talking about "
> 
> esp when you add it i SAID i was on water.


I'm merely asking you what BIOS to tell you whether your 1.65V in GPU Tweak is more like 1.5V after vdroop. That's what you get with PT1. If you were to set 1.65V with PT3 you would be hitting 1.72V+ actual.

So I was putting you on the spot. Clearly you were using PT1, as PT3 would have blown up in your face. Thus, you never hit anywhere near 1.65V. You may have never even hit 1.5V, for that matter.

If you aren't even willing to measure with a DMM then you should not be telling people "1.65V is OK and others have done it!", because you haven't.

And considering you were touting your radiator setup, I would assume you were on water, as you never said anything about using LN2.

You are misinforming people and are trying to get smart with me about it? Please go ask TSM about me and see what he has to say; I'm sure you will quickly learn that I'm not just pulling **** out of my ass to prove you wrong.
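To make the set-vs-actual gap concrete, here's a tiny sketch of the arithmetic. The offsets are NOT official figures - they're inferred from the numbers quoted in this thread (PT1: 1.65V set droops to ~1.5V under load; PT3's load line pushes 1.65V set to ~1.72V actual) and will vary card to card:

```python
# Illustrative only: software "set" voltage vs. estimated load voltage,
# using droop/boost offsets inferred from this thread (assumptions).

def effective_vcore(set_voltage: float, bios: str) -> float:
    """Estimate DMM-measured load voltage from the GPU Tweak set value."""
    offsets = {
        "PT1": -0.15,  # heavy vdroop under load (assumed from thread)
        "PT3": +0.07,  # load-line boost above the set point (assumed)
    }
    return round(set_voltage + offsets[bios], 3)

print(effective_vcore(1.65, "PT1"))  # ~1.5 V actual
print(effective_vcore(1.65, "PT3"))  # ~1.72 V actual
```

The point stands regardless of the exact offsets: the only trustworthy number is one read off a DMM at the measurement points, not the slider value.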


----------



## devilhead

I have done some tests with the PT1 BIOS on my 290X at 1.63V (1.51V after vdroop); water temperature was around 10°C, the core heats up to 44°C max, and the VRMs to 60°C.
To reach 1.65V after vdroop - that's crazy on water.


----------



## jomama22

Quote:


> Originally Posted by *devilhead*
> 
> i have done some test with pt1 bios with my 290X at 1.63v (1.51v after vdrop), water temperature was around 10C, core heats up till 44C max, and vrm to 60C.
> to reach 1.65v after vdrop thats crazy on water


The highest I hit on a DMM was 1.56V during an FSE run using the PT3 BIOS @ a 1.49V GPU Tweak setting.

Vdroop is very much uncontrollable and sways heavily during benching; that's why I never even used PT1 for benching, as it was more unpredictable what voltage I would be throwing at the card.

Under water, these cards don't have any issue with temperature. But I will say, when benching on water, a 10°C difference under load can net you some more points/clock. Big radiator setups will help, but honestly, it's all about ambient temps. Most of my benching on water, which was done at the end of Nov/beginning of Dec, was done with ambient temps below 10°F.

The way you report your voltage is the correct way - you reported the voltage after vdroop and posted the BIOS you used - not telling people "Oh, I hit 1.63V!!! Must be safe!" That's all I was trying to get. It all started with a fair warning that once you get over 1.45V actual (DMM read) you are exponentially increasing your chance of blowing a MOSFET/cap on the reference boards, specifically the one 4th from the top of the row.

If you have more interest in when they can easily blow up, wander over to kingpincooling.com and head to Shamino's lair. There is an old 290X thread (where the asus.rom/PT1/PT3 BIOSes all came from) posted by the master Shamino himself just before retiring. Quite a few people blew up their cards well below even 1.5V actual.


----------



## Gualichu04

How do I get the 2nd GPU to show its temp and usage in MSI Afterburner and GPU-Z? I used DDU to remove the drivers and reinstalled them. Also, does anyone know of any multi-GPU monitor that uses Rainmeter?


----------



## Red1776

Quote:


> Originally Posted by *Gualichu04*
> 
> How do i get the 2nd gpu to show its temp and usage in msi and gpu-z I used DDU to remove drivers and reinstalled them. Also, does anyone know of any multi gpu monitor that uses rain meter?


You can drop down to the second GPU in GPU-Z and open the Sensors tab, or open a second instance of GPU-Z.

In MSI AB the graph will show usage/temp/voltage etc. - whatever you select it to show in the monitoring tab.


----------



## spikezone2004

On the EK 290X waterblock does it matter which way the inlet and outlet is?


----------



## Gualichu04

Quote:


> Originally Posted by *Red1776*
> 
> You can drop down the second GPU in GPU-Z and open the sensor tab, or open a second instance of GPU-z
> 
> In MSI AB the graph will show Usage/temp/voltage etc whatever you select it to show in the monitoring tab


It won't show temps for the Rainmeter skin I use - it just shows 0°C - and neither MSI Afterburner nor GPU-Z shows temp or usage for the 2nd GPU until it has a load on it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gualichu04*
> 
> It wont show my temps for the rain meter i use it jsut shows 0C and msi nor gpu-z show temp or suage for the 2nd gpu till it has a load on it.


Make sure ULPS is off.
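For anyone who hasn't done this before: ULPS (Ultra Low Power State) parks the secondary card so deeply that monitoring tools read nothing from it. Some Afterburner versions expose "Disable ULPS" as a settings checkbox; otherwise it's a registry value. A commonly circulated .reg sketch - note the instance key (`0000`, `0001`, ...) varies per adapter on your system, so check each one rather than assuming `0000`:

```
Windows Registry Editor Version 5.00

; Set EnableUlps to 0 under every display-class instance key that has it.
; The 0000 index below is an example -- your secondary GPU may be 0001/0002.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Reboot after applying; the second card should then report idle temps/usage.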


----------



## Gualichu04

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Make sure ULPS is off.


It shows the GPU temp without ULPS at idle now, but not on the Rainmeter skin I have. I can only find two skins that support multi-GPU so far, and I'm still searching.


----------



## PCSarge

Quote:


> Originally Posted by *Gualichu04*
> 
> It shows the gpu temp without upls at idle now but, not on the rain meter skin i have. Can only find two skins that support multi gpu so far and still searching.


Rainmeter is a bit glitchy in Windows 8 - I had it myself until it didn't want to do the things I wanted.
Quote:


> Originally Posted by *spikezone2004*
> 
> On the EK 290X waterblock does it matter which way the inlet and outlet is?


It shouldn't really matter unless it's marked IN and OUT; flow direction shouldn't matter at all for the inlet/outlet.


----------



## spikezone2004

Quote:


> Originally Posted by *PCSarge*
> 
> it shouldnt really matter, unless its marked IN and OUT. direction shouldnt matter at all for inlet/outlet


That's what I was thinking, but I know most blocks have a direction the water should flow to cool properly, and I haven't seen anything saying in or out. I wasn't sure if it was just supposed to be known that left should be in and right out.


----------



## PCSarge

Quote:


> Originally Posted by *spikezone2004*
> 
> Thats what I was thinking but I know most blocks have a way the water should flow to cool it properly but havent seen anything saying in or out, wasn't sure if it was just suppose to be known than left should be in and right out


Try a right-side inlet and see if it works. I know my Aquacomputer block has to go left-to-right for it to work, because of the way the VRM-area water flow works.


----------



## spikezone2004

Quote:


> Originally Posted by *PCSarge*
> 
> try it a right side inlet and see if it works? i know my aquacomputer block has to go left-right for it to work because of the way the vrm area water flow works


Baah, somehow I missed this in the tons of times I read the manual: "You can use any opening as an inlet/outlet port."


----------



## Gobigorgohome

Quote:


> Originally Posted by *bond32*
> 
> Just curious here, have you ran 3 cards just to see what your framerate is at identical settings to when you run 4?


I have not done that, no. BF4 would not run on anything but low with just one PSU, and I got the second PSU and water cooling installed at the same time, so I do not know. I have good framerates in every game I have tried yet, but it is lagging a bit with stock cards at BF4 ultra 4K (worse lagging without AA than with it enabled). I think it is GPU usage that does this. I looked at the log file from GPU-Z and it showed very unstable GPU usage, so I guess that is the thing. I can't really just drag out one card (I have it hooked up on water with copper tubing and everything, so that is not really an option).
Quote:


> Originally Posted by *ZealotKi11er*
> 
> @ 4K you want to make sure you are not hitting vRAM limit. You don't need to OC the GPUs really. Try stock and see how ti goes. Don't uses AA or Resolution Scaling.


I do not think I am hitting the vRAM limit; it seems like the cards are not totally synced... it got better with more AA... therefore I used it. Normally I won't use AA or resolution scaling (never used RS), but when the cards need more to work with they get AA too. I could take a look at the log file in GPU-Z after doing some gaming. First off, I'll try to push my CPU a little further.


----------



## Gualichu04

Hooray - the R9 290X I received from the XFX RMA is doing great. It was obviously a refurb, since the fan blades were still dusty. Had to tear the DD stock cooler down to just the heatsink to make it fit for testing before it goes under water with the other card. Next step is to water-cool it after a day of testing, and then the build is finally complete.


----------



## ozzy1925

Is there a way to increase the power limit by more than 50%?


----------



## disintegratorx

I just found out about the stuttering problem with the memory, since it takes power from the GPU if it's clocked too high. So far my best setting is 1562 on the memory; that's the highest I can set it before incurring the stuttering effect.


----------



## jomama22

Quote:


> Originally Posted by *ozzy1925*
> 
> is there a way to increase power limit more than %50?


The PT1 and PT3 BIOSes have relaxed power limits. Setting +50% on them is much higher than +50% on a stock BIOS.
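To put a rough number on what the slider means: the cap is a percentage over the BIOS's PowerTune baseline, so the same +50% maps to very different watts on different BIOSes. The 208W baseline below is a figure often cited in reviews for a reference 290X, not something from this thread - treat it as an illustrative assumption:

```python
# Back-of-envelope: board power allowed at a given PowerTune slider setting.
STOCK_POWERTUNE_W = 208  # assumed reference 290X baseline (illustrative)

def power_cap(limit_pct: int, baseline_w: float = STOCK_POWERTUNE_W) -> float:
    """Watts the card may draw at a given power-limit percentage."""
    return baseline_w * (1 + limit_pct / 100)

print(power_cap(0))   # 208.0 W at stock
print(power_cap(50))  # 312.0 W at the +50% slider cap
```

A modded BIOS with a higher baseline shifts the whole range up, which is why +50% on PT1/PT3 is "much higher" than +50% on stock.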


----------



## FuriousPop

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I have not done that, no. BF4 would not go on anything but low with just one PSU, I got the second PSU and water cooling installed at once so I do not know. I have good framerates in every game I have tried yet, but it is laging a bit with stock cards at BF4 Ultra 4K (worse laging without AA than it enabled). I think it is GPU usage that does this. I looked at the file from GPU-Z and it showed very unstable GPU usage so I guess that is the thing. I can't really just drag out one card (have it hooked up on water with copper tubing and everything so that is not a option really).
> I do not think I am hitting the vRAM limit, it seems like the cards is not totally synced ... it got better with more AA .. therefor I used it. Normally I won't use AA or Resolution Scaling (never used RS), but when the cards need more to work with they get AA too. I could take a look at the file in GPU-Z after doing some gaming. First off, try to push my CPU a little further.


Curious - why even the need for AA when you're running 4K?
I'm running 7560x1600 and have AA off, running medium settings, and my 3 cards very, very rarely touch 100% usage when playing a multiplayer map. However, lag does occur from time to time. LOL, I get more lag when I run Mantle than DirectX 11 on 14.4... but I can still maintain 60fps, I would say, 90% of the time, with reasonable temps.


----------



## Spongeworthy

Ugh, my 290X is unstable past 1.3V, what a bummer.


----------



## FuriousPop

Quote:


> Originally Posted by *kizwan*
> 
> Yeah, 5 480s... that's the dream!


I just have to know... what case can you possibly fit that into??? Or are you using more than one case?

I struggled to get my 2x RX480s in my 900D case in push/pull... and I still think it's not enough for 4 blocks...


----------



## kizwan

Quote:


> Originally Posted by *ozzy1925*
> 
> is there a way to increase power limit more than %50?


Why do you want more than +50%? I would worry components would start to pop if they were allowed to draw more power than they were designed for.
Quote:


> Originally Posted by *FuriousPop*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Yeah, 5 480s... that's the dream!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I just have to know..... what case can you possible fit that into??? or are you using more than 1x case?
> 
> I struggled to get my 2xRX480's in my 900D case in push+pull.... and i still think its not enough for 4 blocks....
Click to expand...

It's not me that has the 5 480s; Mega Man has them. A CaseLabs with a pedestal, or any case with a pedestal, will be able to handle 4 480s. With that many rads, you'd probably need to use some slim 480s for all the rads to fit.

I only have 2 blocks (or 3 if you include the CPU) & I feel 720mm of rad space is not enough, but that's all my case can swallow.
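A common community rule of thumb for sizing a loop (not from this thread, and no substitute for actual ambient temps) is roughly one 120mm of radiator per water-cooled component, plus an extra 120mm per overclocked GPU. A quick sketch of that heuristic:

```python
# Rule-of-thumb loop sizing (community heuristic, illustrative only):
# 120 mm of radiator per block, plus 120 mm extra per overclocked GPU.

def suggested_rad_mm(gpu_blocks: int, cpu_blocks: int = 1,
                     overclocked: bool = True) -> int:
    """Suggested total radiator length in mm for a loop."""
    base = 120 * (gpu_blocks + cpu_blocks)
    extra = 120 * gpu_blocks if overclocked else 0
    return base + extra

print(suggested_rad_mm(3))                     # 840 mm: 3 GPUs + CPU, OC'd
print(suggested_rad_mm(2, overclocked=False))  # 360 mm: 2 GPUs + CPU, stock
```

By that yardstick, 720mm for 2-3 blocks is in the right ballpark at stock clocks; high ambient temperature is what really erodes the headroom, as noted above.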


----------



## HighTemplar

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I have not done that, no. BF4 would not go on anything but low with just one PSU, I got the second PSU and water cooling installed at once so I do not know. I have good framerates in every game I have tried yet, but it is laging a bit with stock cards at BF4 Ultra 4K (worse laging without AA than it enabled). I think it is GPU usage that does this. I looked at the file from GPU-Z and it showed very unstable GPU usage so I guess that is the thing. I can't really just drag out one card (have it hooked up on water with copper tubing and everything so that is not a option really).
> I do not think I am hitting the vRAM limit, it seems like the cards is not totally synced ... it got better with more AA .. therefor I used it. Normally I won't use AA or Resolution Scaling (never used RS), but when the cards need more to work with they get AA too. I could take a look at the file in GPU-Z after doing some gaming. First off, try to push my CPU a little further.


You won't hit a VRAM limit @ 4K in BF4 with 4GB of VRAM. Even with 3GB with my 780 Ti's, I don't have a problem with those settings. 4x MSAA becomes an issue with 3GB, but 2x and below are just fine.


----------



## Vici0us

I heard that a higher ASIC score lets you overclock the GPU better, which turned out to be false, at least for me. My Gigabyte Windforce R9 290 is @ 79% while my reference ASUS R9 290 is @ 75.4%. My reference card overclocks better.
I know it's not much of an overclock, but I don't need one (at least not at the moment) since I'm running CrossFire.


----------



## FuriousPop

Quote:


> Originally Posted by *kizwan*
> 
> Why do you want more than +50%? I would worry components start to pop if they allowed to draw more power than what they designed for.
> Not me that have 5 480s. Mega Man that have them. Caselabs with pedestal or any case with pedestal will be able to handle 4 480s. With that many rads, you probably need to use some slim 480s for all the rads to fit.
> 
> I only have 2 blocks (or 3 if you include CPU) & I feels 720 rads space is not enough but that's all my case can swallow.


What are your temps like?

I am running 3 cards + 1 CPU block = 4 blocks on 960mm of rad space (haven't OC'd anything), and I feel this is still not enough, since my temps are still pretty high (mid to high 40s). One card is at 50-ish (I think I've done something wrong with that), and that's 240mm of rad space per block.


----------



## Vici0us

Does anyone know if it voids the warranty on Gigabyte and ASUS cards if I run closed loops on them? Getting prepared to run Seidon 120Ms on them.


----------



## ozzy1925

Quote:


> Originally Posted by *jomama22*
> 
> The pt1 and pt3 bios have lax'd power limits. Setting them at 50% is much higher then that of a stock biod


I currently run my 290 Tri-X OCs with stock cooling. Are the PT1 and PT3 BIOSes OK for daily usage?


----------



## kizwan

Quote:


> Originally Posted by *FuriousPop*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Why do you want more than +50%? I would worry components start to pop if they allowed to draw more power than what they designed for.
> Not me that have 5 480s. Mega Man that have them. Caselabs with pedestal or any case with pedestal will be able to handle 4 480s. With that many rads, you probably need to use some slim 480s for all the rads to fit.
> 
> I only have 2 blocks (or 3 if you include CPU) & I feels 720 rads space is not enough but that's all my case can swallow.
> 
> 
> 
> 
> 
> 
> 
> whats your temps like?
> 
> I am running 3 cards + 1 CPU block = 4 blocks on 960 rad space (haven't OC'd anything) and i feel this is still not enough since my temps still pretty high (mid to high 40's) 1x Card is at 50ish (think i've done something wrong with that) and thats 240 rad space per block
Click to expand...

At stock Tri-X clocks (1000/1300), temps are in the 50s without A/C because ambient here is always 30+ Celsius. Water temp is in the 40s, I think; I don't monitor water temp frequently though. I only have one exhaust fan at the back, so I need to play games with the side panel off, even when the A/C is on. Because of the high ambient temp I always play with the A/C on, so temps aren't an issue, but the heat dumped into the room is. I think with more radiator space the heat would be distributed a lot better.

What is your ambient/room temp like?


----------



## Kukielka

Count me in!








3x Sapphire R9-290 with EK-WB Fullcover + Backplate
Proof:


http://imgur.com/yS8D1


----------



## Arizonian

Quote:


> Originally Posted by *Kukielka*
> 
> Count me in!
> 
> 
> 
> 
> 
> 
> 
> 
> 3x Sapphire R9-290 with EK-WB Fullcover + Backplate
> Proof:
> 
> 
> http://imgur.com/yS8D1


Congrats - added
















Looks amazing - compliments on your work.

Just do me a favor and add a GPU-Z validation link with the validation tab open and your OCN name showing to your *post* - appreciated.

Welcome aboard OCN too.


----------



## Kukielka

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks amazing. Compliments to your work.
> 
> Just do me a favor and add a GPU-Z validation link with validation tab open and OCN name showing to your *post*, appreciated.
> 
> Welcome aboard OCN too.


Will do once I'm back home. Thanks!


----------



## FuriousPop

Quote:


> Originally Posted by *kizwan*
> 
> At stock Tri-X clocks 1000/1300, in 50s without A/C because ambient always at 30++ Celsius. Water temp in the 40s I think. I didn't monitor water temp frequently though. I only have one exhaust fan at the back so I'll need to play games with side panel off, even when A/C is on. Because high ambient temp I always play with A/C ON though, so temps not an issue but heat does. I think with more radiator space the heat will be distributed a lot better.
> 
> What is your ambient/room temp like?


Hmmm, so my 2 cards seem to be running normal then, except for the 3rd that I need to re-do.

My ambient is 17-18 Celsius. However, once the wife puts on the heater (since it's winter here), ambient jumps to the low-to-mid 30s, at which point the cards jump up to the high 40s, and I've seen the 3rd card hit 62C when I'm on BF4....

I'm in the same boat with the case: 1x exhaust and 2x front intake, which are the stock case fans (900D case).

I could squeeze another 240 rad behind the 480 in the bottom, but then I'd need to place my 2nd pump on the same level as the res (not sure if that will have a negative result for a 2nd pump in the loop). Even then, for all that trouble, I'm not sure another 240 rad would help much more than swapping my intake and exhaust fans for something a little better, like some Noctuas or SP120 Quiets....

Hence my question to whoever is running 4x 480 rads in one case with everything - I'm assuming it's either 2 cases or the rads are external to the case...

I didn't check my water temp while gaming, but I'll keep an eye out for it next time. I think it was in the high 30s when the cards were in the mid-to-high 40s - will triple-check this.


----------



## kizwan

Quote:


> Originally Posted by *FuriousPop*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> At stock Tri-X clocks 1000/1300, in 50s without A/C because ambient always at 30++ Celsius. Water temp in the 40s I think. I didn't monitor water temp frequently though. I only have one exhaust fan at the back so I'll need to play games with side panel off, even when A/C is on. Because high ambient temp I always play with A/C ON though, so temps not an issue but heat does. I think with more radiator space the heat will be distributed a lot better.
> 
> What is your ambient/room temp like?
> 
> 
> 
> hhhmmmmm so my 2 cards seem to be running normal then except for the 3rd that i need to re-do.
> 
> My Ambient is at 17-18 cel. however once the wife puts on the heater (since its winter here) ambient jumps to low to mid 30's. to which then the cards jump up to high 40's and 3rd card ive seen hit 62c when im on BF4....
> 
> i am the same with the case, 1x exhaust (stock case face) and 2x front intake with are the stock case fans also (900D Case).
> 
> i could squeeze another 240 rad behind the 480 in the bottom but then i will need to place my 2nd pump on the same level as the res (which for a 2nd pump in the loop, not sure if this will have a negative result), but still another 240 rad with all that trouble, not sure if thats going to help much rather than changing my intake and exhaust fans to something a little better like some noctuas or sp120 quiets....
> 
> 
> 
> Hence why my question to whoever is running 4x 480 rads in 1x case with everything - i am assuming its either 2x cases or the rads are external from the case...
> 
> i didn't check my water temp when i have been gaming, but i'll keep an eye out for it next.. think it was in high 30's when the cards were in the mid to high 40's.. - will triple check this
Click to expand...

In his sig he's using a Caselabs case, & I remember he mentioned the TH10 (Caselabs). Caselabs cases are huge, & with a pedestal I believe you can fit 5x 480 rads no problem.


----------



## FuriousPop

Quote:


> Originally Posted by *kizwan*
> 
> In the sig he is using Caselabs case & I remember he mentioned TH10 case (Caselabs). Caselabs look huge & with pedestal, I believe you can put 5 480 rads no problem.


Like this one?
MAGNUM TH10A
http://www.caselabs-store.com/magnum-th10a/

Can't see the TH10 - the above must be the updated version.

BUT YES - definitely getting that one for my next build, no ifs or buts about it


----------



## Mega Man

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *FuriousPop*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> At stock Tri-X clocks 1000/1300, in 50s without A/C because ambient always at 30++ Celsius. Water temp in the 40s I think. I didn't monitor water temp frequently though. I only have one exhaust fan at the back so I'll need to play games with side panel off, even when A/C is on. Because high ambient temp I always play with A/C ON though, so temps not an issue but heat does. I think with more radiator space the heat will be distributed a lot better.
> 
> What is your ambient/room temp like?
> 
> 
> 
> hhhmmmmm so my 2 cards seem to be running normal then except for the 3rd that i need to re-do.
> 
> My Ambient is at 17-18 cel. however once the wife puts on the heater (since its winter here) ambient jumps to low to mid 30's. to which then the cards jump up to high 40's and 3rd card ive seen hit 62c when im on BF4....
> 
> i am the same with the case, 1x exhaust (stock case face) and 2x front intake with are the stock case fans also (900D Case).
> 
> i could squeeze another 240 rad behind the 480 in the bottom but then i will need to place my 2nd pump on the same level as the res (which for a 2nd pump in the loop, not sure if this will have a negative result), but still another 240 rad with all that trouble, not sure if thats going to help much rather than changing my intake and exhaust fans to something a little better like some noctuas or sp120 quiets....
> 
> 
> 
> Hence why my question to whoever is running 4x 480 rads in 1x case with everything - i am assuming its either 2x cases or the rads are external from the case...
> 
> i didn't check my water temp when i have been gaming, but i'll keep an eye out for it next.. think it was in high 30's when the cards were in the mid to high 40's.. - will triple check this
> 
> Click to expand...
> 
> In the sig he is using Caselabs case & I remember he mentioned TH10 case (Caselabs). Caselabs look huge & with pedestal, I believe you can put 5 480 rads no problem.
Click to expand...

No pedestal.
Quote:


> Originally Posted by *FuriousPop*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> In the sig he is using Caselabs case & I remember he mentioned TH10 case (Caselabs). Caselabs look huge & with pedestal, I believe you can put 5 480 rads no problem.
> 
> 
> 
> like this one?
> MAGNUM TH10A
> http://www.caselabs-store.com/magnum-th10a/
> 
> cant see the TH10 - above must be the updated version.
> 
> BUT YES - definetly getting that one for my next build, no ifs/buts about it
Click to expand...

Yes, that's just the updated version, same basic layout.

And 5x 480s: 1 in the front, the others in the tops and bottoms.


----------



## JordanTr

Quote:


> Originally Posted by *FuriousPop*
> 
> hhhmmmmm so my 2 cards seem to be running normal then except for the 3rd that i need to re-do.
> 
> My Ambient is at 17-18 cel. however once the wife puts on the heater (since its winter here) ambient jumps to low to mid 30's. to which then the cards jump up to high 40's and 3rd card ive seen hit 62c when im on BF4....
> 
> i am the same with the case, 1x exhaust (stock case face) and 2x front intake with are the stock case fans also (900D Case).
> 
> i could squeeze another 240 rad behind the 480 in the bottom but then i will need to place my 2nd pump on the same level as the res (which for a 2nd pump in the loop, not sure if this will have a negative result), but still another 240 rad with all that trouble, not sure if thats going to help much rather than changing my intake and exhaust fans to something a little better like some noctuas or sp120 quiets....
> 
> Hence why my question to whoever is running 4x 480 rads in 1x case with everything - i am assuming its either 2x cases or the rads are external from the case...
> 
> i didn't check my water temp when i have been gaming, but i'll keep an eye out for it next.. think it was in high 30's when the cards were in the mid to high 40's.. - will triple check this


That's totally off subject, but who would want to turn on the heater until it's mid-30s at home? Take your wife to a doctor, mate, I bet something is wrong with her thermal receptors


----------



## FuriousPop

Lmao. Luckily, as I said, it's winter here, hence ambient (in the study room) is 17 degrees Celsius. Outside, the house drops to about 5 at night.

Perfect environment for OCing. However, I wanted to make sure the rig and blocks + VRMs were all running at decent temps.


----------



## Infinite Jest

Anyone have experience with temperature differences between the AX III vs. AX IV vs. Gelid Icy Vision with a RAM heatsink kit? Also, anyone have real experience with repasting the reference cooler and the temp reductions it brings?


----------



## rdr09

Quote:


> Originally Posted by *FuriousPop*
> 
> Lmao. Lucky I said it's winter here hence ambient (in study room) is 17 degrees Celsius. Outside house drops to about 5 at night.
> 
> Perfect environment for Ocing. However wanted to make sure rig and blocks + vrms all running at decent temps.


240mm per block I think is enough, if not plenty. Since only the third GPU is exhibiting a higher temp compared to the other two, it might be the mount. Play with the fans - push/pull maybe. It would be best to judge the temps when ambient is high, though, like summer or when the wife is home and adjusts the thermostat. lol


----------



## FuriousPop

Quote:


> Originally Posted by *rdr09*
> 
> 240 per rad i think is enough. if not plenty. since only the third gpu that is exhibiting higher temp compared to the other two, then it might be the mount. play with the fans. push/pull maybe. it would be best to judge the temps when ambient is high, though, like summer or when the wife is home and adjusts the thermostat. lol


How high, roughly?

Like I said, the most it's gotten in the room is mid-30s ambient, at which point the cards are at mid-to-high 40s, with the exception of the 3rd at low-to-mid 50s - will eventually pull the block off that one and redo the paste and screws just to be safe.

Will play around with the fans as well - thanks heaps!

The real test is going to be summer when it hits 45C outside!!!

Thanks again all, much appreciated.


----------



## Gobigorgohome

Hi guys, I need some overclocking help. I just got my 3930K stable at 4.6 GHz, and now it is time for the GPUs. I have one EVGA G2 1300W dedicated to my four GPUs alone, so PSU wattage should not be a problem. I want to game at 4K with the Samsung U28D590D, so that is the real target: something stable that draws less than 1300 watts and is good at 4K.









Running 4x R9 290Xs on water. Today I am doing 1100/1300 with a 30% power limit. I want to go as high as possible without going with a custom BIOS or anything. Do you guys have any recommendations?

I am using MSI AB, by the way.


----------



## smoke2

I need quick help!
I can buy a brand new Gigabyte 290 Windforce from a user with a 2.5-year warranty for 190€.
Or I can buy a brand new Sapphire Tri-X 290 for 230€ with a 2-year warranty.
The Sapphire probably looks better, but idles at 1500 RPM.
I have a Fractal R4 case in the living room, all fans on 5V.
Which one would you buy in my case?
Thanks.


----------



## ozzy1925

I would choose the Tri-X.


----------



## Spectre-

Sapphire hands down


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Hi guys, I need some overclocking help, just got my 3930K stable at 4,6 Ghz, now it is time for the GPU's. I have one EVGA G2 1300W dedicated to my four GPU's alone so PSU wattage should not be a problem, I want to game at 4K with the Samsung U28D590D so that is the real target, just to get something stable that draws less than 1300 wattage and are good at 4K.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Running 4x R9 290X's on water, today I am doing 1100/1300 with 30 % power limit, I want to go as high as possible without going with custom bios or anything. Do you guys have any recommendations?
> 
> I am using MSI AB by the way.


Don't bother unless you want to benchmark them.


----------



## tsm106

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Hi guys, I need some overclocking help, just got my 3930K stable at 4,6 Ghz, now it is time for the GPU's. I have one EVGA G2 1300W dedicated to my four GPU's alone so PSU wattage should not be a problem, I want to game at 4K with the Samsung U28D590D so that is the real target, just to get something stable that draws less than 1300 wattage and are good at 4K.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Running 4x R9 290X's on water, today I am doing 1100/1300 with 30 % power limit, *I want to go as high as possible without going with custom bios or anything*. Do you guys have any recommendations?
> 
> I am using MSI AB by the way.


I'm not sure why I would bother if you are not willing to change the BIOS. Btw, why are you starving the GPUs? Dedicating one 1300W PSU to four GPUs kind of defeats the point of using dual PSUs. What are you doing with the second PSU in your sig?


----------



## Gobigorgohome

Quote:


> Originally Posted by *tsm106*
> 
> I'm not sure why I bother if you are not willing to change the bios. Btw, why are you starving the gpus? Dedicating one 1300w to four gpus is kind of defeating the point of using dual gpu. What are you doing with the second psu in your sig?


First off, I do not understand what you mean by starving the GPUs.... Second, what does a 1300-watt power supply have to do with dual GPUs? Or do you mean that if I "unlock" the BIOS and flash some modified version, they will sing much better than today? Heck, if I have enough power and good enough cooling, I'm in all the way.

One 1300W PSU runs the 24-pin, the 8-pin, and the molex to the bottom plug on the RIVBE, plus 2x D5 pumps and 21 fans, together with 3x 3TB HDDs; I may add more disks after a while.
The other 1300W PSU is just for my GPUs. Why do you ask?


----------



## tsm106

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I'm not sure why I would bother if you are not willing to change the bios. Btw, why are you starving the gpus? Dedicating one 1300w to four gpus is kind of defeating the point of using dual gpu psu. What are you doing with the second psu in your sig?
> 
> 
> 
> First off I do not understand what do you mean by starving the GPU's .... Second, what does 1300 wattage power supply have to do with dual gpu's? Or do you mean that if I "unlock" the bios and get some modified-thingy they will sing much better than today? Heck, if I have enough power and good enough cooling I will be in all the way.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One 1300 wattage to 24-pin, 8-pin, molex to bottom plug on RIVBE, 2x D5 pumps and 21 fans, together with 3x 3TB HDD's, I may add more disks after a while.
> The other 1300 wattage psu just for my GPU's. Why do you ask?
Click to expand...

Sorry that post came out weird.

You are starving your GPUs. Each GPU at stock can draw up to 300W. That makes a total of 1200W, leaving you with only 100W spare. If you start overclocking and raising the power target, you will eclipse your 1300W maximum easily. In other words, you are starving your GPUs. On the other hand, you are using a WHOLE 1300W PSU for just your CPU and loop! That's a huge waste. You should be running two GPUs on each PSU and moving the loop off the PSU with the CPU. You want to load-balance your gear better.

*gah, edited more key typos on my part.
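The budget arithmetic above can be sanity-checked with a quick sketch. The 300W-per-card figure is from the post; the ~220W CPU number is an illustrative assumption for the suggested split, not a measurement.

```python
# Quick PSU load-balance check for a multi-GPU rig.
# 300 W per stock R9 290X is the figure from the post above;
# the ~220 W CPU draw in the suggested split is an assumption.

GPU_WATTS = 300   # worst-case stock draw per R9 290X
PSU_WATTS = 1300  # one EVGA G2 1300W

def headroom(psu_watts, loads):
    """Watts left on a PSU after subtracting the listed loads."""
    return psu_watts - sum(loads)

# Original split: all four GPUs on one PSU.
print(headroom(PSU_WATTS, [GPU_WATTS] * 4))           # 100 W spare -> "starved"

# Suggested split: two GPUs per PSU, CPU (~220 W assumed) on one of them.
print(headroom(PSU_WATTS, [GPU_WATTS] * 2))           # 700 W spare
print(headroom(PSU_WATTS, [GPU_WATTS] * 2 + [220]))   # 480 W spare
```

With only 100W of slack, any power-target increase on four cards overruns the PSU; the 2/2 split leaves hundreds of watts for overclocking on each supply.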


----------



## Mega Man

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gobigorgohome*
> 
> Hi guys, I need some overclocking help, just got my 3930K stable at 4,6 Ghz, now it is time for the GPU's. I have one EVGA G2 1300W dedicated to my four GPU's alone so PSU wattage should not be a problem, I want to game at 4K with the Samsung U28D590D so that is the real target, just to get something stable that draws less than 1300 wattage and are good at 4K.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Running 4x R9 290X's on water, today I am doing 1100/1300 with 30 % power limit, *I want to go as high as possible without going with custom bios or anything*. Do you guys have any recommendations?
> 
> I am using MSI AB by the way.
> 
> 
> 
> I'm not sure why I bother if you are not willing to change the bios. Btw, why are you starving the gpus? Dedicating one 1300w to four gpus is kind of defeating the point of using dual gpu. What are you doing with the second psu in your sig?
Click to expand...

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I'm not sure why I would bother if you are not willing to change the bios. Btw, why are you starving the gpus? Dedicating one 1300w to four gpus is kind of defeating the point of using dual gpu. What are you doing with the second psu in your sig?
> 
> 
> 
> First off I do not understand what do you mean by starving the GPU's .... Second, what does 1300 wattage power supply have to do with dual gpu's? Or do you mean that if I "unlock" the bios and get some modified-thingy they will sing much better than today? Heck, if I have enough power and good enough cooling I will be in all the way.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One 1300 wattage to 24-pin, 8-pin, molex to bottom plug on RIVBE, 2x D5 pumps and 21 fans, together with 3x 3TB HDD's, I may add more disks after a while.
> The other 1300 wattage psu just for my GPU's. Why do you ask?
> 
> Click to expand...
> 
> Sorry that post came out weird.
> 
> You are starving your gpus. Each gpu at stock can draw up to 300w. That makes it a total of 1200w, leaving you with only 100w spare. Now if you start overclocking and raising PT, you will eclipse your maximum wattage of 1300w easily. In other words you are starving your gpus. On the other hand you are using a WHOLE 1300w psu for your cpu and loop! That's a huge waste. You should be running two gpu on each psu and moving the loop off the psu with the cpu. You want to load balance your gear better.
Click to expand...

Agreed, at minimum 3/1 (3 on one, 1 on the other).


----------



## fishingfanatic

Sounds like it makes sense to me. Not enough PCI-E plugs for multiple GPUs until you get higher-powered ones.

Personally, now that I have Add2Psu adapters (actually about 40...), I use my 1200 and a 1050 for my GPUs.

That's on air, and I'm building my watercooling system as we speak. Gonna get some 2x2x1/8" aluminum angle to mount the rads outside of the case, even with a monster case like the Lian Li P V2120 (not the black interior, unfortunately) - 2 480s will make things crowded. $46 for an 8' length; might need another small piece.

Bend a 90 into a couple of lengths and attach through the fan thread holes with L-shaped brackets. Quick-connects to move everything a lot easier and to remove a rad when not benching. 107 CFM fans... should be fun!!! My only drawback atm: the pump barb is too big for the hoses, so a quick home-made reducer should be fine unless there's something at the store that'll do the trick.

I'll try to remember to post some pics when done.

Off to Princess Auto shortly to make my reducer and grab the angle.









FF


----------



## paradoxum

Can I be added to the list?

http://www.techpowerup.com/gpuz/details.php?id=dpsxe

http://www.3dmark.com/fs/2581923


----------



## Gobigorgohome

Quote:


> Originally Posted by *tsm106*
> 
> Sorry that post came out weird.
> 
> You are starving your gpus. Each gpu at stock can draw up to 300w. That makes it a total of 1200w, leaving you with only 100w spare. Now if you start overclocking and raising PT, you will eclipse your maximum wattage of 1300w easily. In other words you are starving your gpus. On the other hand you are using a WHOLE 1300w psu for your cpu and loop! That's a huge waste. You should be running two gpu on each psu and moving the loop off the psu with the cpu. You want to load balance your gear better.
> 
> *gah edited more key typos on my part.


Sure, that sounds right. But wouldn't 3 GPUs on one 1300W PSU and the last one on the other work? I guess I could do two cards on each PSU. What are you recommending for the highest clocks? If I need new BIOSes, so be it, but I want everything to work properly when it is overclocked. And I need some kind of a guide on how to do it.
Quote:


> Originally Posted by *Mega Man*
> 
> agreed at min 3/1 ( 3 on one, 1 on other )


Guess that makes sense, but two GPUs on each PSU may be better. If they draw 300 watts each at stock, then I guess they will draw some more when overclocked, and my CPU draws around 220 watts already.


----------



## dpl2007

Hi there, any guys with two R9 290s in Crossfire playing Original Sin? Just wondering what FPS you're getting at top settings in 1440p. I'm hitting about 80 max and less in some situations. I thought I could hit the 107 FPS my QNIX can handle no problem, but I have to cut settings a lot for that.


----------



## battleaxe

My GPU score on an R9 290 was 16020 in 3DMark 11. Is this about right for an 1100MHz core and 1350MHz memory? http://www.3dmark.com/3dm11/8614911
http://www.3dmark.com/3dm11/8615226 Correction: running 16600 now at 1150MHz core. Crashed at 1175MHz.


----------



## ZealotKi11er

Quote:


> Originally Posted by *battleaxe*
> 
> My GPU score on an R9 290 was 16020 on 3d Mark 11. Is this about right for 1100mhz core and 1350mhz memory? http://www.3dmark.com/3dm11/8614911
> http://www.3dmark.com/3dm11/8615226 Correction: Running 16600 now at 1150mhz core. Crashed at 1175mhz.


Try the latest 3DMark. You should get about 10.5K at stock.


----------



## PJFT808

I'd think so. Here's one of mine at 1200MHz. Seems I never saved any of my 1100MHz runs, but: http://www.3dmark.com/3dm11/8358679


----------



## battleaxe

Quote:


> Originally Posted by *PJFT808*
> 
> I'd think so. Here's one of mine at 1200mhz. Seems I never saved any of my 1100mhz runs but http://www.3dmark.com/3dm11/8358679


Yeah, I just did a run at 1150 and got 16620-ish. These are pretty crazy scores - better than a stock 290X or 780 Ti by quite a bit... http://www.3dmark.com/3dm11/8615226

Pretty happy with this card. What voltage is your 1200MHz run at? Mine was at +100mV in MSI Afterburner. The on-screen voltage was only saying 1.125V though... not sure if that means much.


----------



## spikezone2004

Feels like my R9 290 is idling rather high - I am idling at 41C under water. I feel like it's not downclocking properly on the desktop; the clocks keep jumping up and down.


----------



## rdr09

Quote:


> Originally Posted by *spikezone2004*
> 
> Feel like my R9 290 is idling rather high, I am idling at 41C under water. I feel like its not downclocking properly on desktop. Clocks keep jumping up and down


how many monitors are hooked up?
Quote:


> Originally Posted by *PJFT808*
> 
> I'd think so. Here's one of mine at 1200mhz. Seems I never saved any of my 1100mhz runs but http://www.3dmark.com/3dm11/8358679


1200? Kinda low. Here is mine, but using the 14.6 Beta:

http://www.3dmark.com/3dm11/8369902

The graphics score is only supposed to be about 400 points apart between these drivers.

Here is 1260 using the latest RC . . .

http://www.3dmark.com/3dm11/8615010


----------



## spikezone2004

Quote:


> Originally Posted by *rdr09*
> 
> how many monitors are hooked up?
> 1200? kinda low. here is mine but using 14.6 Beta
> 
> http://www.3dmark.com/3dm11/8369902
> 
> it is only suppose to be about 400 points higher in graphics scores between these drivers.


I have 2 monitors hooked up.

I literally just got it in my PC a couple hours ago, along with a new pump and radiator, so I am new to the R9 290 world. I have an EK block and backplate on it.


----------



## rdr09

Quote:


> Originally Posted by *spikezone2004*
> 
> I have 2 monitors hooked up.
> 
> literally just got it in my pc couple hours ago along with new pump and radiator, so I am new to the R9 290 world. I have a EK block and backplate on it


that's why. unplug one monitor and see what happens.


----------



## spikezone2004

Quote:


> Originally Posted by *rdr09*
> 
> that's why. unplug one monitor and see what happens.


I just unplugged one and the temp is the same.

The clock speeds keep jumping up and down on the desktop. They won't stay downclocked - they keep jumping up for a second, then back down.


----------



## rdr09

Quote:


> Originally Posted by *spikezone2004*
> 
> I just unplugged one and temp is the same.
> 
> The clock speeds keep jumping up and down on desktop they wont stay downclock keep jumping up for a second then back down


Well, the temp is not supposed to go down that fast, but the clocks fluctuating might be because you have Afterburner or other apps running. Chrome and its add-ons tend to do that. Check your background services.

Actually, 41 is not too bad if your ambient is high. What matters, though, are the load temps. They should all be below 60C in games like BF4.


----------



## kizwan

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Sorry that post came out weird.
> 
> You are starving your gpus. Each gpu at stock can draw up to 300w. That makes it a total of 1200w, leaving you with only 100w spare. Now if you start overclocking and raising PT, you will eclipse your maximum wattage of 1300w easily. In other words you are starving your gpus. On the other hand you are using a WHOLE 1300w psu for your cpu and loop! That's a huge waste. You should be running two gpu on each psu and moving the loop off the psu with the cpu. You want to load balance your gear better.
> 
> *gah edited more key typos on my part.
> 
> 
> 
> Sure, that sounds right. But would not 3 GPU's on one 1300 wattage PSU and the last one on the other 1300 wattage PSU. I guess I could do two cards on each PSU. What are you recommending for highest clocks? If I need new BIOS's so be it, but I want everything to work properly when it is overclocked. And I need some kind of a guide on how to do it.
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> agreed at min 3/1 ( 3 on one, 1 on other )
> 
> Click to expand...
> 
> Guess that makes sense, but two GPU's on each PSU may be better. IF they draw 300 wattage each alone at stock then I guess they will draw some more when they are overclocked and my CPU draw around 220 wattage already.
Click to expand...

I would do that too: two GPUs on each PSU. Then you can either connect all other devices, including the CPU, to one of the PSUs, because there's plenty of room (wattage-wise) for them, or you can use both PSUs. When gaming or graphics benching, I believe your CPU will draw less than ~220W.
Quote:


> Originally Posted by *battleaxe*
> 
> My GPU score on an R9 290 was 16020 on 3d Mark 11. Is this about right for 1100mhz core and 1350mhz memory? http://www.3dmark.com/3dm11/8614911
> http://www.3dmark.com/3dm11/8615226 Correction: Running 16600 now at 1150mhz core. Crashed at 1175mhz.


Quote:


> Originally Posted by *PJFT808*
> 
> I'd think so. Here's one of mine at 1200mhz. Seems I never saved any of my 1100mhz runs but http://www.3dmark.com/3dm11/8358679


I concur. The scores look alright to me.

Mine at 1200/1600 or 1200/1500 with three different drivers. I wish there were a search function I could use to find (my) results for certain clocks, though.
http://www.3dmark.com/3dm11/8269128
http://www.3dmark.com/3dm11/8371754
http://www.3dmark.com/3dm11/8382247
Quote:


> Originally Posted by *spikezone2004*
> 
> Feel like my R9 290 is idling rather high, I am idling at 41C under water. I feel like its not downclocking properly on desktop. Clocks keep jumping up and down


Mine idles at that temp in 30+ Celsius ambient.


----------



## battleaxe

Quote:


> Originally Posted by *kizwan*
> 
> I would do that too, two GPU's on each PSU then you can either connect all other devices including CPU to one of the PSU because there's plenty room (wattage wise) for them or you can use both PSU. When gaming or graphics benching, I believe your CPU will draw less than ~220W.
> 
> I concur. The score look alright to me.
> 
> This is mine at 1200/1600 & 1200/1500 with three different drivers.
> http://www.3dmark.com/3dm11/8269128
> http://www.3dmark.com/3dm11/8371754
> http://www.3dmark.com/3dm11/8382247


What voltage are you running at 1200MHz core? Using Afterburner or Trixx? +100mV, or what?


----------



## spikezone2004

Is 39-41C a high idle temp for an R9 290 under a full block? I thought it should be around 33C.


----------



## Arizonian

Quote:


> Originally Posted by *paradoxum*
> 
> Can I be added to the list?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/details.php?id=dpsxe
> 
> http://www.3dmark.com/fs/2581923
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## kizwan

Quote:


> Originally Posted by *battleaxe*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I would do that too, two GPU's on each PSU then you can either connect all other devices including CPU to one of the PSU because there's plenty room (wattage wise) for them or you can use both PSU. When gaming or graphics benching, I believe your CPU will draw less than ~220W.
> 
> I concur. The score look alright to me.
> 
> This is mine at 1200/1600 & 1200/1500 with three different drivers.
> http://www.3dmark.com/3dm11/8269128
> http://www.3dmark.com/3dm11/8371754
> http://www.3dmark.com/3dm11/8382247
> 
> 
> 
> 
> 
> 
> 
> What voltage are you running at 1200mhz core? Using Afterburner or Trixx? 100mv or what?

+200mV, which equals 1.39V after Vdroop according to GPU-Z. I'm too lazy to measure the actual voltage with a DMM; the actual voltage might be slightly lower. I use Trixx for overclocking. I have both AB & Trixx installed; for monitoring and the ULPS-disable function I use AB.

I'm in the top 10 for single-290/290X setups and #1 for single-290 setups on the 3DMark 11 (P) HOF.
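The "+200mV equals 1.39V after Vdroop" arithmetic above can be sketched out. Note the stock VID and droop figures below are illustrative assumptions, not measurements from any specific card:

```python
# Illustrative only: the stock VID and droop numbers are assumptions,
# not values measured on any particular 290/290X.
def loaded_voltage(stock_vid, offset_mv, vdroop_mv):
    """Rough effective core voltage under load: stock VID plus the
    software offset, minus voltage droop under load (all in volts)."""
    return stock_vid + offset_mv / 1000.0 - vdroop_mv / 1000.0

# A hypothetical 1.25 V stock VID with a +200 mV offset and ~60 mV of
# droop lands near the 1.39 V reported by GPU-Z:
print(round(loaded_voltage(1.25, 200, 60), 2))  # 1.39
```

This is why a software offset alone doesn't tell you the real voltage: droop under load eats part of it, which is also why a DMM reading can come in slightly lower still.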








Quote:


> Originally Posted by *spikezone2004*
> 
> Is 39-41C a high idle speed for a R9 290 under full block? I thought it should be around 33C


Mine idles at 39-41°C in 30+°C ambient. Right now it's idling at 38-39°C in 29°C ambient. What is your ambient (room) temp?


----------



## rdr09

Quote:


> Originally Posted by *spikezone2004*
> 
> Is 39-41C a high idle speed for a R9 290 under full block? I thought it should be around 33C


Here is mine idling with an ambient of 24°C . . .

The latest drivers, starting from 14.6 I think, have raised temps a few degrees. Some people on air are finding it hard to cope.


----------



## spikezone2004

I guess mine is about right then; my ambient is 30°C right now.

Just did some temp tests in BF3, wanted to see what I got: 48°C core, and 45 and 50 on the VRMs.


----------



## bluedevil

I idle about 34C with my 120XL (21C ambient). I tend to never get above 60C on load.


----------



## FuriousPop

Quote:


> Originally Posted by *spikezone2004*
> 
> I guess mine is about right then, my ambient is 30C rightnow.
> 
> Just did some temp tests in BF3 wanted to see what I got, was getting 48C core, and 45 and 50 on VRMs


What about BF4, if you have it? I think BF4 seems to tax the blocks more than BF3 does.

My 3 idle at 28-29°C with 3 monitors connected at the same time (Eyefinity), with an ambient of 20°C, if that helps...


----------



## spikezone2004

Quote:


> Originally Posted by *FuriousPop*
> 
> what about BF4? if you have it?
> 
> I think BF4 seems to tax the blocks more so than BF3.
> 
> my 3 idle at 28-29 with 3 monitors connected at same time (eyefinity) with an ambient of 20..
> 
> if that helps...


I don't have BF4. My ambient is always 27-32°C; in the summer my office gets very hot. I'm debating redoing the thermal paste, but it's a lot of work to drain the loop and take off the block.


----------



## FuriousPop

Quote:


> Originally Posted by *spikezone2004*
> 
> I don't have BF4. My ambient is always 27-32C in the summer my office gets very hot, debating whether re doing thermal paste but a lot of work to drain loop and take off block.


I have BF3 as well; I will test it tonight and give temp results for it.... haven't even tried it, to be honest, since I did my WC build.

Not looking forward to my summer - it can reach 45+°C on any given day... wonder what the PC will get to!!!


----------



## Cyber Locc

Add me Please

Okay, here is the link: http://www.techpowerup.com/gpuz/details.php?id=6mmky

Here is a pic (don't judge my build; it's going through a major overhaul and I broke my top rad. I swapped my 680s for this, and when this eBay buyer pays I will have another on the way; then they will be under water.)

Ooh, and it's a Sapphire 290 reference with stock cooling; however, it will be under water soon.

And here is the fun I have been having tonight. I worked the mem first, then did the core clock a little. I will most likely leave it here for now, until I get my other card and get them under water. What am I saying, I'll be doing this all night.

Also, I think the proc may be lowering that score, as it is hitting 100% a lot during the tests. I gave my 3570K to my step brother, as I'm swapping to X79 soon. Now I think I should have waited lol


----------



## 2tired

Anyone use the Gelid VRM kit with this? One of the pads isn't sticky on both sides. Can I use the thermal pad from under the stock R9 290 heatsink with the Gelid VRM heatsinks? They seem a lot thicker, so I'm not sure it will cool properly. Thanks.


----------



## pkrexer

So the newest 14.7 RC3 drivers seem to have fixed the BF4 Mantle issue I've had since going Crossfire. Originally my GPU usage would be all over the place and my fps would be well below DX11, so I gave up on it and just settled for DX.

Now, with Mantle working, both GPUs are pegged almost the entire time and everything just seems more fluid. I play a lot of Hurt Locker, and with DX11 my fps would usually stick in the 80-100fps range; now I'm seeing 100-130fps.

I'm running 1080p - Ultra - 4xAA - 150% scale.

Feels good to actually be utilizing my cards' full potential.


----------



## fateswarm

It's high summer in some places; 5-10°C ambient differences are common. Compare ambients before declaring what is or isn't normal.
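Put another way, idle temps only compare meaningfully as a delta over ambient. A quick sketch using readings from this thread (these are the posters' numbers, not benchmarks):

```python
# Idle temps only compare fairly as a delta over room temperature.
def delta_over_ambient(gpu_temp_c, ambient_c):
    """Degrees the GPU idles above the room it sits in."""
    return gpu_temp_c - ambient_c

# The "high" 41C idle in a 30C room is actually a smaller delta than
# the "low" 34C idle in a 21C room:
print(delta_over_ambient(41, 30))  # 11
print(delta_over_ambient(34, 21))  # 13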


----------



## Vici0us

I'm running Crossfire 290s (8x/8x); is it worth adding a 3rd 290 at x4 (all PCIe 3.0)? What performance would I gain?


----------



## Gobigorgohome

tsm106: Now I have connected two cards to each PSU, so what's next? A new BIOS, and which one?


----------



## leetmode

Finally replaced my 5970 with a Tri-X 290X!!! I'm SO happy!!! The cooler on this thing is incredible; at full load it basically sounds like my old card did at idle.


----------



## paradoxum

Quote:


> Originally Posted by *paradoxum*
> 
> Can I be added to the list?
> 
> http://www.techpowerup.com/gpuz/details.php?id=dpsxe
> 
> http://www.3dmark.com/fs/2581923


In the list you could change me from Stock to Watercooling (EK-FC R9-290X Block) and it's not the R9 290*X*, just the 290 - my high overclock might have made it look like one though?


----------



## Wezzor

Hello guys!
Well, I decided to write about my problem here, since if it's going to be noticed and maybe even fixed, it's here.

Anyway, I bought a Sapphire R9 290 Tri-X about 1 month ago but have only used it for a week, since I've been on vacation. My first impression was wow, this card is really a beast, and I really enjoyed gaming with it. But a couple of days ago I noticed I was getting random BSODs, and when I checked what was causing them it was atikmdag.sys ( atikmdag+28ece ). So I updated once again to the newest driver (which I already had), thinking it might somehow fix my problem, but I was wrong. Then I started checking when the BSOD actually occurs and found out it only happens when watching videos on YouTube (sadly I do that a lot). So I started to google my problem and was shocked by how many people seemed to have exactly the same problem as me. I thought that since so many have had it there would probably be a solution, but again I was wrong; it seems to be a known problem that hasn't been fixed for years. I mean, AMD must know about this problem and see all those posts they receive on their forum all the time, but they still haven't fixed it? No wonder so many people try to avoid AMD and there are so many Nvidia fanboys. I'm sorry, I'm just a bit frustrated; I'm not trying to hate on AMD even if it looks like it. I'm actually really happy with my card except for the YouTube part. So my question is: has anyone else had this problem and maybe even fixed it? If so, I would really appreciate you telling me how you did it.

I've read that people have RMA'ed their card and received a new card with exactly the same problem. I've tried things like removing hardware acceleration in Firefox and installing only the GPU driver, but nothing has helped. It feels like it should be something driver-related, since everything else works perfectly except for YouTube.


----------



## tsm106

Quote:


> Originally Posted by *Gobigorgohome*
> 
> tsm106: Now I have connected two cards to each PSU, so what then? New bios, and which?





Spoiler: Warning: Spoiler!



http://kingpincooling.com/forum/showthread.php?t=2473



Then read this post and the next three posts or so.


----------



## Cyber Locc

Quote:


> Originally Posted by *paradoxum*
> 
> In the list you could change me from Stock to Watercooling (EK-FC R9-290X Block) and it's not the R9 290*X*, just the 290 - my high overclock might have made it look like one though?


Why does the high overclock make it look like a 290X? From all the info I have seen, 290s clock higher than 290Xs; have you seen differently?

Vici0us - I'm pretty sure that would cause your performance to go down. If you want to go with 3, get a board that has a PLX chip, or an X79 board.
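For context on the x4 question: PCIe 3.0 bandwidth scales linearly with lane count. A rough sketch (the per-lane rate comes from the PCIe 3.0 spec's 8 GT/s link with 128b/130b encoding; the rest is arithmetic):

```python
# PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding, giving
# roughly 0.985 GB/s of usable bandwidth per lane, per direction.
GBPS_PER_LANE_GEN3 = 8 * (128 / 130) / 8  # ~0.985 GB/s

def slot_bandwidth_gbps(lanes):
    """Approximate one-way bandwidth of a Gen3 slot with this many lanes."""
    return lanes * GBPS_PER_LANE_GEN3

print(round(slot_bandwidth_gbps(8), 1))  # ~7.9 GB/s for an x8 slot
print(round(slot_bandwidth_gbps(4), 1))  # ~3.9 GB/s for an x4 slot
```

So the x4 slot has half the bandwidth of the x8 slots the other two cards sit in; whether that actually bottlenecks a third 290 depends on the game and resolution.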


----------



## spikezone2004

Day 2 of my R9 290 in my system and I love it; it's a big upgrade from my HD 6850. I got the EK 290X full block and backplate. I had planned on upgrading the thermal pads, but I don't think I'm going to bother; I seem to be getting good temps when gaming (the max I've reached is 52°C on VRM1 and 49°C core). It seems like it wouldn't be worth the time taking the block off again and buying more pads. What do you guys think?


----------



## paradoxum

Quote:


> Originally Posted by *cyberlocc*
> 
> Why does the high overclock make it look like a 290x? From all the info I have seen 290s clock higher than 290xs have you seen different?
> 
> Vici0us - Im pretty sure that would cause your performance to go down if you want to go 3 get a board that has plx chip or a x79 board


No I haven't; I'm relatively ignorant of all the different cards personally. I just assumed that because they were more expensive they had higher clocks (among other things like more RAM, shaders(?), etc.).

Could you give the name of what my overclocked Tri-X OC 290 would be the equivalent of, had I bought something more expensive?


----------



## dpl2007

Quote:


> Originally Posted by *the9quad*
> 
> It all depends for me, it is weird. I can play BF4 and other games for days and really push the cards and I wont get a single crash. Then I can play something like divinity original sin or planetside 2 and get a hard crash to the point that windows wont recognize my cards being installed or having drivers upon reboot. That's happened twice now. So now I play divinity original sin with crossfire disabled and get the same frame rate with zero crashes. Havent tried planetside 2 again since it locked up twice in a row.


So I think I saw you in the Qnix thread; have you tried three 290Xs with Original Sin? I have two 290s (no X), and on my Qnix I'm a bit disappointed, only getting 80-odd fps maximum and sometimes as low as the 50s in some parts. How's your fps? This is at maximum settings with the monitor at 107Hz; it's a bit jerky as well...


----------



## kizwan

Quote:


> Originally Posted by *Wezzor*
> 
> Hello guys!
> Well, I decided to write about my problem here since if this is going to noticed its here and maybe even fixed
> 
> 
> 
> 
> 
> 
> 
> Anyway, I bought an Sapphire R9 290 Tri-X about 1 month ago but have just used it for a week since I've been on vacation. My first impression was wow this card is really a beast and really enjoyed gaming with it. But I also noticed a couple days ago that I get BSOD randomly and when I checked out what's causing the BSOD it's atikmdag.sys ( atikmdag+28ece ). So I updated once again to the newest driver (which I already had) but I thought somehow it may fix my problem but I was wrong. So, I started to check when my BSOD actually occurs and found out it'll only occur when watching videos on youtube (sadly I do it alot
> 
> 
> 
> 
> 
> 
> 
> ) So I started to google my problem and was shocked by how many people seemed to have exactly the same problem as I. I thought since so many have had the same problem there should probably be a solution to it but I was wrong. It seemed to be a problem that was known and haven't been fixed for years. I mean AMD must know about this problem and see all those post they receive on their forum all the time but they still haven't fixed it? No, wonder so many people try to avoid AMD and why there is so many nvidia fanboys. I'm sorry, I'm just a bit frustrated I'm not trying to hate on AMD even if it looks like it. I'm actually really happy with my card except for the youtube part. So my question is have anyone else had this problem and maybe even fixed it? If so I would really appreciate telling me how you did it.
> 
> 
> 
> 
> 
> 
> 
> I read that people have RMA'ed their card and received a new card with exactly the same problem. I've tried stuff like remove hardware acceleration in firefox, only installed the gpu driver and other stuff but nothing have helped. I feels like it should be something driver related since everything else works perfectly except for youtube.


No problem with YouTube here, using either Chrome or Firefox, and with hardware acceleration enabled.


----------



## Jflisk

Okay, so I have another R9 290X coming. My other two are under water and I am going to put this one under eventually too. Since I have never run an R9 290X on air, what are the temps like until I get the water block and bridge? I know it's not going to be the 55°C max under load I see now. Just want an idea. Thanks.


----------



## Wezzor

Quote:


> Originally Posted by *kizwan*
> 
> No problem here with youtube, either using chrome or firefox. Hardware acceleration also enabled.


http://forums.amd.com/game/messageview.cfm?catid=440&threadid=169383 - they all have exactly the same problem as me. I guess I need to RMA it then. I've never done one before. Should I go directly through Sapphire, or through where I bought the card?


----------



## tsm106

Quote:


> Originally Posted by *Wezzor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> No problem here with youtube, either using chrome or firefox. Hardware acceleration also enabled.
> 
> 
> 
> http://forums.amd.com/game/messageview.cfm?catid=440&threadid=169383 all have exactly the same problem as me. I guess I need to RMA it then. I've never done it before. Should I do it directly to Sapphire or where I bought the card?

I'd assume you will still have the same problem as before after an RMA. But if you do go the RMA route, I'd rather return it to the seller than buy into a refurbed card at new-card price, which is essentially what you get for RMA'ing a new card.

When watching YouTube videos, YouTube makes calls to the GPU, and this initiates a powerstate change, i.e. the GPU enters whatever core/mem clocks are appropriate for watching videos. That in and of itself isn't a problem. The problem is how all the other apps relate to it, and this is where the trouble comes in. Let me guess, are you using MSI Afterburner too?

Here's how to control how AB detects powerstate/profile changes.


Spoiler: Warning: Spoiler!



http://www.overclock.net/t/1265543/the-amd-how-to-thread/380#post_21832882


----------



## Wezzor

Quote:


> Originally Posted by *tsm106*
> 
> I'd assume you will still have the same problem as before after rma. But if you did go rma, I'd rather return to seller than buy into a refurbed card for new price which is essentially what you get for rma'ing a new card.
> 
> When watching utube videos, utube makes calls to the gpu this initiates a powerstate change, ie. gpu enters whatever mode appropriate for watching videos core/mem clocks. That in and of itself isn't a problem. The problem is how all other apps relate to it and this is where the problems come in. Let me guess are you using msi afterburner too?
> 
> Here's how to control how AB detects powerstate/profile changes.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread/380#post_21832882


Nope, normally I don't; I just downloaded it because of this thread: http://www.overclock.net/t/1478456/bluescreen-error-0xa0000001-with-amd-290x/40 where they say I should disable ULPS and increase the power limit.


----------



## tsm106

Quote:


> Originally Posted by *Wezzor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I'd assume you will still have the same problem as before after rma. But if you did go rma, I'd rather return to seller than buy into a refurbed card for new price which is essentially what you get for rma'ing a new card.
> 
> When watching utube videos, utube makes calls to the gpu this initiates a powerstate change, ie. gpu enters whatever mode appropriate for watching videos core/mem clocks. That in and of itself isn't a problem. The problem is how all other apps relate to it and this is where the problems come in. Let me guess are you using msi afterburner too?
> 
> Here's how to control how AB detects powerstate/profile changes.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread/380#post_21832882
> 
> 
> 
> 
> 
> 
> Nope normally I don't I just downloaded it now because of this thread: http://www.overclock.net/t/1478456/bluescreen-error-0xa0000001-with-amd-290x/40 where they say I should disable ULPS and increase power limit.

ULPS has nothing to do with powerstate changes, and neither does raising the power limit (aka PT).

Your card is jumping in and out of different powerstates because different apps are making calls to the GPU. Then, when a real load is presented, the card is caught off guard in the wrong powerstate; I would assume one that doesn't give it enough voltage to run at X clocks. You can use AB to control which apps are allowed to initiate a powerstate change. Using the method I linked, with a set of 2D and 3D profiles, will let you bypass this annoyance.
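The fix described above boils down to a whitelist: only specific executables may trigger the 3D profile, and everything else stays pinned to the 2D state. A rough illustration of that idea (the app names and clock values here are invented examples, not Afterburner's actual internals):

```python
# Hypothetical sketch of per-app 2D/3D profile selection, loosely
# modeled on the whitelist idea above; names and clocks are made up.
PROFILES = {
    "2d": {"core_mhz": 300, "mem_mhz": 150},    # idle/desktop state
    "3d": {"core_mhz": 1000, "mem_mhz": 1250},  # full 3D state
}

# Only apps on this whitelist may trigger the 3D powerstate.
ALLOWED_3D_APPS = {"bf4.exe", "3dmark11.exe"}

def profile_for(app_name):
    """Browsers, video players, etc. fall through to the 2D profile,
    so a YouTube video can't bounce the card between powerstates."""
    return PROFILES["3d" if app_name.lower() in ALLOWED_3D_APPS else "2d"]

print(profile_for("firefox.exe")["core_mhz"])  # 300
print(profile_for("BF4.exe")["core_mhz"])      # 1000
```

The point is that the powerstate decision becomes deterministic per application, instead of every GPU-accelerated app being able to yank the card into (or out of) its 3D clocks.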


----------



## Arizonian

Quote:


> Originally Posted by *paradoxum*
> 
> In the list you could change me from Stock to Watercooling (EK-FC R9-290X Block) and it's not the R9 290*X*, just the 290 - my high overclock might have made it look like one though?


Sorry if I missed that; corrected & updated.








Quote:


> Originally Posted by *cyberlocc*
> 
> Add me Please
> 
> Okay here is link. http://www.techpowerup.com/gpuz/details.php?id=6mmky
> 
> Here is pic (dont judge my build going through major overhaul and broke my top rad swapped my 680s for this and when this ebay buyer pays I will have another on the way then they will be under water
> 
> 
> 
> 
> 
> 
> 
> )
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Ooh and Sapphire 290 reference, Stock Cooling, however it will be under water soon.
> 
> And here is the fun I have been having tonight I worked the mem first then did the clock a little until now I will most likely leave it here for now till I get my other card and get them under water. What am I saying ill be doing this all night
> 
> 
> 
> 
> 
> 
> 
> .
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Also I think the proc may be lowering that score as it is hitting 100 alot during the tests I gave my 3570k to my step brother as I'm swapping to x79 soon. Now I think i should have waited lol


Congrats - added.

.....and a fellow Arizonian, I see.

You sounded committed to getting it under water, so I'll list you as such and save myself a step. But that doesn't mean you get out of posting pics.


----------



## Wezzor

Quote:


> Originally Posted by *tsm106*
> 
> ULPS has nothing to do with powerstate changes and neither does raising powerlimit aka PT.
> 
> Your card is jumping in and out of different powerstates because differing apps are making calls to the gpu. Then when a real load is presented it is caught off guard and in the wrong powerstate. I would assume it is in a powerstate that is not giving it enough voltage to run at X clocks. You can use AB to control what app is allowed to initiate a powerstate change. Using the method I linked with a set of 2D and 3D profiles will allow you to bypass this annoyance.


lol, I just noticed AB is the only program that makes my whole computer freeze, but with no BSOD. Anyway, I clicked on that "i" for information, but it just shows system information; nothing about apps running in 3D mode.


----------



## Darius Silver

Well, I got a nice little gift of a PCS+ R9 290X from my brother, which is a hell of a step up from the MSI 7770 I was using. I've got a little issue though: I can't seem to control the fan speed with either MSI Afterburner or CCC. Does anyone have any insight on what I could be doing wrong? I'm using 14.7 RC3 and AB 4.0.0 Beta 9. In fact, it doesn't seem like the fan speed changes at all; the card will go up to 94°C and start to throttle.


----------



## tsm106

Quote:


> Originally Posted by *Wezzor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> ULPS has nothing to do with powerstate changes and neither does raising powerlimit aka PT.
> 
> Your card is jumping in and out of different powerstates because differing apps are making calls to the gpu. Then when a real load is presented it is caught off guard and in the wrong powerstate. I would assume it is in a powerstate that is not giving it enough voltage to run at X clocks. You can use AB to control what app is allowed to initiate a powerstate change. Using the method I linked with a set of 2D and 3D profiles will allow you to bypass this annoyance.
> 
> 
> 
> lol, I just noticed AB is the only progran that makes my whole computer freeze but no BSOD. Well, anyway I clicked on that I for information but it just shows system information nothing about apps that running in 3D mode.

C'mon ppl. Scroll down.


----------



## Wezzor

Quote:


> Originally Posted by *tsm106*
> 
> C'mon ppl. Scroll down.


Dude, I'm not that dumb.

As you can see, there is nothing to scroll down to.


----------



## tsm106

Quote:


> Originally Posted by *Wezzor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> C'mon ppl. Scroll down.
> 
> 
> 
> Dude, I'm not that dumb
> 
> 
> 
> 
> 
> 
> 
> 
> As you can see there is nothing to scroll down.

Did you do the full install? Let me see... did you choose to install the RTSS server along with AB? I'm guessing not... if you cannot scroll down.


----------



## Cyber Locc

Quote:


> Originally Posted by *Arizonian*
> 
> Sorry if I missed that
> 
> 
> 
> 
> 
> 
> 
> corrected & updated
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> .....and fellow Arizonian I see.
> 
> 
> 
> 
> 
> 
> 
> 
> You sounded committed to getting it under water so I'll list you as such,
> 
> 
> 
> 
> 
> 
> 
> and get to save myself a step.
> 
> 
> 
> 
> 
> 
> 
> but that doesn't mean you get out of posting pics.


Yep, I was born and raised in Phoenix; I live up north now.

Yeah, they will definitely be under water. Is there really any other option where we live, lol? Even up here it easily hits 100 in the summer, and it's humid, unlike Phoenix.


----------



## Wezzor

Quote:


> Originally Posted by *tsm106*
> 
> Did you do the full install? Let me see... did you choose to install the RTSS server along with AB? I'm guessing not... if you cannot scroll down.


Alright, then I know the problem. Nope, I didn't choose it; I didn't think I'd need it.


----------



## tsm106

Quote:


> Originally Posted by *Wezzor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Did you do the full install? Let me see... did you choose to install the RTSS server along with AB? I'm guessing not... if you cannot scroll down.
> 
> 
> 
> Alright, then I know the problem nope I didn't choice it didn't think I'll need it.

Eek. The big selling point of AB is to use the OSD from the RTSS server man! You're missing out on all the fun of AB.


----------



## Wezzor

Quote:


> Originally Posted by *tsm106*
> 
> Eek. The big selling point of AB is to use the OSD from the RTSS server man! You're missing out on all the fun of AB.


Well, I'm really new when it comes to computers. I just bought my first computer in June and was really happy just being able to build it myself and overclock the CPU and RAM, even though that's something really easy and normal for you. I also wanted to overclock my GPU, but because of those BSODs I've been afraid to.


----------



## Wezzor

Hmm, okay, I got all those new settings where I can screen capture and video capture (looks fun, I'll try it out), etc., but once I click on Information it still looks like the screens I posted.

I'm sorry if I'm being annoying on a Friday night, but I just want this fixed because it's really frustrating, and I'm really excited to see if it'll solve my problem.


----------



## tsm106

Quote:


> Originally Posted by *Wezzor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Eek. The big selling point of AB is to use the OSD from the RTSS server man! You're missing out on all the fun of AB.
> 
> 
> 
> Well, I'm really new when it comes to computers. Just bought my first computer in June was really happy just being able to build it myself and overclock CPU and RAM while it's something really easy and normal for you.
> 
> 
> 
> 
> 
> 
> 
> Wanted also overclock my GPU but because of those BSOD I've been afraid of doing it.











Read the first page here. It will give you a fundamental understanding of how to OC AMD GPUs. The current AB betas have automated many of the steps, but it is still very important to understand why and how AMD GPUs work.

http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40


----------



## Cyber Locc

Quote:


> Originally Posted by *paradoxum*
> 
> No I haven't, I'm relatively ignorant of all the different cards personally - I just assumed because they were more expensive they had higher clocks (among other things like more RAM, shaders(?) etc)
> 
> Could you give the name of what my overclocked Tri-X OC 290 would be the equivelant of had I bought something more expensive?


Yeah, I'd agree that it used to be "pay more, get more", but that line is getting more blurred every day. Another reason is that a lot of this is application-specific. For instance, if you're just gaming, a 4770K will beat a 4960X, so a $300 chip beats a $1k one in gaming; in other applications, like rendering, the 4770K gets chewed up.

The 290X does have higher stock clocks, but that's partly a marketing gimmick used by all the companies, just like EVGA's FTW line; I had FTWs and they clocked lower than others. At the end of the day it's the silicon lottery. The gap between the flagship and the card right under it has become a very minimal gain; it's become more about e-peen, wanting the absolute top, and in some applications those extra shaders may help more. That's also why the difference between them scales from about 3% to 10% depending on the game or usage, at stock. TechPowerUp has a Heaven chart running, and granted there aren't many results, but my score last night, after two hours of messing with clocks and barely touching voltage, was 1590 at 1115/1425. That beat all the 290s on the list, the 290Xs, and half of the 780s, and that's on air without really pushing it: I raised voltage by only +20, the card was hot (around 85°C, but it's rated to 95°C), and the fan was loud at 80%, but water can change all of this. With the current mining-inflated prices I'm seeing 290s for $400-450 and 290Xs for $650. That's a $200 difference for 300 shaders; IMO it's not worth it. That's enough to buy a good block and backplate, or a closed-loop cooler, which would in turn make the 290 perform better than the same 290X (unless you bought a block for that one too). So basically it's a $200 difference for 3-10%.

As for the other things, more RAM etc.: the only difference between them is the extra shaders, enabled on the X and locked on the 290; the PCB, RAM, and GPU are otherwise the same. They enabled some shaders, factory overclocked a tiny bit, and added a $100 price difference, which the miners drove up to $100-200. The 290X is the superior card, no doubt, but IMO not by enough to justify $100, much less $200; to each their own. I'm playing the unlock game, buying previously mined cards. Plus, I don't have to worry about a mined card failing two weeks after I put it in my loop; if it were going to die, it would have already.
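The value argument above can be reduced to a back-of-the-envelope number. Using the street prices and the 3-10% stock-performance gap quoted in the post (these are the poster's figures, not benchmarks):

```python
# Dollars paid per percent of extra performance when stepping up from
# a 290 to a 290X, using the mining-era prices quoted above.
def dollars_per_percent(price_290, price_290x, perf_gain_pct):
    return (price_290x - price_290) / perf_gain_pct

# $450 for a 290 vs $650 for a 290X, at the best-case 10% gain:
print(dollars_per_percent(450, 650, 10))  # 20.0 dollars per percent
```

At the worst-case 3% gap the same $200 works out to nearly $67 per percent, which is the heart of the "buy the 290 plus a water block" argument.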

TSM, I must say I love your avatar; every time I see it I crack up as I read the story. Keep that avatar forever...


----------



## Wezzor

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Read the first page here. It will give you a fundamental understanding of how to oc amd gpus. The current AB betas have automated many of the steps but it is still very important to understand the why and how amd gpus work.
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40


I'll definitely check it out once I've dealt with this problem.


----------



## Draygonn

Thanks for the Fuji knowledge, guys. I wanted to keep fan noise low, so I installed Fuji on VRM1 and GC Extreme on the GPU. After running some BF4 rounds in 78° ambient with the fans turned down to quiet levels, VRM1 stayed in the 80s. Success.


----------



## Infinite Jest

Is the Prolimatech MK-26 compatible with a reference 290, or just the MK-26 Black? They don't list compatibility for the vanilla version, and the dimensions are slightly different.


----------



## bluedevil

Quote:


> Originally Posted by *Infinite Jest*
> 
> Is the prolimatech mk-26 compatible with a reference 290 or just the mk-26 black? They don't list compatibility for the vanilla version and the dimensions are slightly different..


Just the Black, it looks like, per their website.

But I'd be willing to bet that the non-Black would work fine. AMD's mounting holes rarely ever change; e.g. a 7970 is the same as my 290, as was my 7870.


----------



## Infinite Jest

Quote:


> Originally Posted by *bluedevil*
> 
> Just looks like the Black, per their website.
> 
> But I'd be willing to bet that the non black would work fine. AMD's mounting holes rarely ever change, ie a 7970 is the same as my 290, as was my 7870.


Hmm, just wanted to try to confirm before I bought anything. The schematics are slightly different.


----------



## bluedevil

Quote:


> Originally Posted by *Infinite Jest*
> 
> Hmm, just wanted to try to confirm before I bought anything. The schematics are slightly different.


I would buy the Black if you are unsure. Me, I'd Red mod it. Lower temps and much quieter.


----------



## CrossfireManiac

Need help with 4 monitor setup

SAPPHIRE TRI-X OC 100361-2SR

I have had my 290 since late May. I put it in, hooked up all of my monitors (2x DVI, 1x DisplayPort-to-DVI, 1x HDMI), and it has worked well in Windows 7, but I have had issues with video performance in games in Linux.
Yesterday I decided to put my old 680 back in to test game performance with the Nvidia card under Linux, and games played smooth and nice.
Upon putting my 290 back in, Windows absolutely refused to recognize the HDMI output; I can easily get 3 monitors running via DVI and DisplayPort.

When Windows boots from the BIOS to the login screen, the HDMI monitor is active and I log in from it, but as soon as I press Enter to log into Windows, the HDMI monitor shuts down and is no longer detectable by Windows.

I have removed and reinstalled all drivers and software, upgraded to the latest beta drivers, mixed and matched monitors, and tried running only 2 monitors; the HDMI output is never discovered after logging into Windows.

I need some suggestions, as I had operated in this configuration for several months without issue.
I am posting this from Linux with all 4 monitors currently active.


----------



## chronicfx

Quote:


> Originally Posted by *CrossfireManiac*
> 
> Need help with 4 monitor setup
> 
> SAPPHIRE TRI-X OC 100361-2SR
> 
> I have had my 290 since late May, put it in , hooked up all mof my monitors, 2x DVI ,1 x Display port-DVI,1xHDMI
> has worked well in Windows 7, but I have had issues with video performance in games in Linux.
> Yesterday I decided to put my old 680 back in to test the performance of games with the Nvidia card while running Linux, and games played smooth and nice.
> upon putting my 290 back in windows had absolutely refused to recognize the HDMI output, I can easily get 3 monitors running via DVI and Display Port.
> 
> When windows boots from bios to login screen the HDMI monitor is active, I login from my HDMI monitor
> as soon as I press enter to log into windows the HDMI monitor shuts down , and is not detectable by windows.
> 
> I have removed and replaced all drivers and software, I have upgraded to the latest beta drivers, I have mixed and matched monitors , and only running 2 monitors
> the HDMI is not discovered on logging into windows.
> 
> need some suggestions, as I have operated in this mode for several months without issue.
> I am posting this from Linux with all 4 monitors currently active.


Did you use a software uninstaller like DDU? Sometimes switching from Nvidia to AMD needs a little more than just the uninstall button.


----------



## airisom2

Hey guys, I'm in the process of finding some 290X BIOSes for the Unlock thread I made, and I was wondering if some of you could help me out by dumping your 290X Uber BIOSes. The reason is that, according to Hawaii Info's results, there are some non-reference 290s that can be unlocked but aren't operable when flashed with any BIOS except the one made for them, and the ones on TechPowerUp's VGA BIOS collection page are too old.

I need:

Sapphire Vapor-X
XFX DD
Gigabyte WF3X
Asus DCII
MSI Gaming
HIS IceQ (and the Diamond equivalent)

Thanks!


----------



## mojobear

Hey guys,

The 14.7 RC3 drivers (I haven't tried anything other than 14.4 WHQL and 14.6 beta) work pretty well for my quad 290 setup. I'm getting higher scores on 3DMark etc. compared to 14.4 WHQL, but I need about 15-20 mV more to get things stable without artifacts for one loop of 3DMark Firestrike Extreme scene 1. Just FYI in case people are having stability problems with 14.7 RC3.

Have funnnnn.


----------



## heroxoot

Anyone having BSODs on the new 14.7 RC3 driver? 14.7 was giving me random crashes at stock so I left it behind.


----------



## Gobigorgohome

I have started to get random BSODs and driver crashes in the last few days on 14.4, which had been rock solid for my quad 290X setup until last week. I undid the overclocking and uninstalled MSI AB and it seems to be gone; anybody noticed the same? I also have a better gaming experience with no overclock vs overclocked cards ....

Another thing: in BF4 @ 4K I have the same FPS at medium/high/ultra (even with and without AA), and the picture hangs every third second even at LOW settings ... I do not understand this; my machine should have reduced BF4 to rubble.
In Metro Last Light it is pretty much the same thing as in BF4, only the FPS is REALLY low sometimes and very high at other times (like sometimes it is 25 FPS and other times 60+ FPS), all at the same settings. It is equally bad at LOW and at VERY HIGH even though my setup should be doing well there too. Thief is pretty much the same thing, but that game is limited to one card as far as I know. Anybody have a possible solution?

CPU temp at 100% load is 60 degrees Celsius; the GPUs are at about 50 degrees Celsius (all four of them).

The rest of the system is:

Asus RIVBE
3930K @ 4,5 Ghz
16 GB 1866 Mhz Quad channel
4x R9 290X
2x EVGA G2 1300W
3x 3TB Seagate Barracuda

Anybody know what it could be?


----------



## keikei

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Another thing: BF4 @ 4K, I have the same FPS at medium/high/ultra (even with and without AA), I have this hang of the picture every third second even at LOW settings ... I do not understand this, my machine should have reduced BF4 to rubble with my system.
> In Metro Last Light it is pretty much the same thing as in BF4, only that the FPS is REALLY low sometimes and other times it is very high (like sometimes it is 25 FPS and other times it is 60 FPS ++), all at the same settings. It is equally bad at LOW and at VERY HIGH even though my setup should be doing good there too. Thief is pretty much the same thing, but that game is limited to one card as far as I know. Anybody have a possible solution?


I had/have the same issue and, for all the stuff I've tried, I've come to the conclusion it's the game. If I notice some insane lag, I just switch servers.


----------



## phallacy

Hey everyone,

It's been some time since I made a post on these forums due to work and all. I was hoping someone could help me with an issue I've been having with The Witcher 2 and multi-GPU crossfire.

Recently I downloaded The Witcher 2 Enhanced Edition from Steam after reading some rave reviews both online and here on OC.net. However, I'm running into a serious problem when playing.

I'm using a 4x 290X computer, overclocked, with the 14.7 drivers (not RC2 but the original), and within a few minutes the game freezes and so does my computer. This is with ubersampling disabled, by the way, at 4K res. The first time it corrupted the AMD drivers and I had to do a reinstall, but ever since it just keeps freezing/crashing/whatever you want to call it. All other games, including Crysis 3, Tomb Raider, Watch Dogs modded, Far Cry 3, Battlefield 4 (basically any other game that is demanding on GPUs) are working fine with the overclocks.

I tested this problem with all my cards overclocked as I do in any game vs. all stock, and am experiencing the same problem either way. I did a Google search about crossfire / multi AMD GPU and Witcher 2, and it seems quite a few other people have the same problem.

I tried an uninstall and reinstall of Witcher 2 but the same freezing/crashing happens.

Any advice for troubleshooting or a fix I'm unaware of? Thanks!


----------



## The Storm

Quote:


> Originally Posted by *phallacy*
> 
> Hey everyone,
> 
> been some time since I made a post on these forums due to work and all. I was hoping someone could help me with an issue I've been having with the Witcher 2 and multi gpu crossfire.
> 
> Recently I downloaded The Witcher 2 enhanced edition from steam after reading some rave reviews both online and here on OC.net. However, I'm running into a serious problem when playing.
> 
> I'm using a 4x 290x computer overclocked with the 14.7 drivers (not RC2 but the original) and within a few minutes the game freezes and so does my computer. This is with ubersampling disabled by the way on 4k res. The first time it corrupted the AMD drivers and had to do a reinstall but ever since it just keeps freezing/crashing/whatever you want to call it. All other games including Crysis 3, Tomb Raider, Watch Dogs modded, Far Cry 3, Battlefield 4; basically any other game that is demanding on GPUs is working fine with the overclocks.
> 
> I tested this problem with all my cards overclocked as I do in any game vs. all stock and am experiencing the same problem. I did a google search about crossfire / multi AMD gpu and witcher 2 and it seems quite a few other people have the same problem.
> 
> I tried an uninstall and reinstall of Witcher 2 but the same freezing/crashing happens.
> 
> Any advice for troubleshooting or a fix I'm unaware of? Thanks!


When you find out let me know haha. I have had Witcher 2 for a very long time and have never been able to run the game due to hangs and freezes.


----------



## rdr09

Quote:


> Originally Posted by *heroxoot*
> 
> Anyone having BSOD on the new 14.7RC3 driver? 14.7 was giving me random crashes on stock so I left it behind.


No BSODs nor crashes, just bad artifacts or glitches in Warface. The easy fix was 14.4. Lower benchmark scores, though.


----------



## battleaxe

14.7 seems to need a bit more voltage; artifacts if not. +20mV for me, anyway. Got a higher 3DMark 11 score though: 17k on my 290. Pretty nice.


----------



## Vici0us

Reapplied Arctic 5 thermal paste on my reference 290 and temps dropped by 5-7°C; I'm amazed.


----------



## disintegratorx

Quote:


> Originally Posted by *battleaxe*
> 
> 14.7 seems to need a bit more voltage. Artifacts if not. Add 20mv for me anyway. Got higher 3dMark 11 score though. 17k on my 290. Pretty nice.


Yeah, that's what I've noticed as well. The newest drivers are awesome though. They make BF4 a totally different game if you ask me: nice and smooth, and along with the Razer mouse and some very carefully set controls, it's finally worth buying a mechanical keyboard for some truly competitive gaming.


----------



## CrossfireManiac

Quote:


> Originally Posted by *chronicfx*
> 
> Did you use a software uninstaller like ddu? Sometimes switching nvidia to amd needs a little more than just the uninstall button.


I did use DDU, and uninstalled all traces of Nvidia and ATI/AMD as well, rebooted, then installed the latest beta drivers clean, and the same issue persists.

I boot up and choose the Windows loader in GRUB; the HDMI monitor is active in GRUB and stays active when the Windows boot loader is chosen.
The HDMI monitor stays active through the Windows startup animation/splash screen and is the monitor I enter the Win7 password on. As soon as Win7 logs in, the HDMI monitor goes black (no signal) and the other 3 monitors are active.


----------



## Jflisk

Okay guys, here goes one for you. I just installed another R9 290X. It is seen by Windows and I can see it in GPU-Z, but I cannot crossfire it. I am looking for non-standard answers here; I did the usual driver uninstall/install. The card works standalone. It's like there's no crossfire bridge. Thanks in advance for the help.


----------



## keikei

Quote:


> Originally Posted by *Jflisk*
> 
> Okay guys here goes one for you. I just installed another R9 290X. Its is seen by windows and I can see it in gpu-z. But I cannot crossfire it. I am looking for non standard answers here. I did the usual install uninstall drivers. Card works stand alone. Like there's no crossfire bridge. Thanks in advance for the help.


Did you enable crossfire in CCC?


----------



## spikezone2004

Just got a DirectX crash in BF3 and then got this weird-looking screen:


Then I went to play GRID 2 and I got a BSOD: atikmpag.sys

Think it's all driver related? I am running 14.4.

EDIT: I heard a high-pitched noise coming from the GPU; is that coil whine?


----------



## kizwan

Quote:


> Originally Posted by *Jflisk*
> 
> Okay guys here goes one for you. I just installed another R9 290X. Its is seen by windows and I can see it in gpu-z. But I cannot crossfire it. I am looking for non standard answers here. I did the usual install uninstall drivers. Card works stand alone. Like there's no crossfire bridge. Thanks in advance for the help.


Did you get any error message when trying to enable Crossfire? Try removing both cards, putting them back in again, and trying again.


----------



## rdr09

Quote:


> Originally Posted by *Jflisk*
> 
> Okay guys here goes one for you. I just installed another R9 290X. Its is seen by windows and I can see it in gpu-z. But I cannot crossfire it. I am looking for non standard answers here. *I did the usual install uninstall drivers*. Card works stand alone. Like there's no crossfire bridge. Thanks in advance for the help.


When? Before or after installing the second card?


----------



## Paopawdecarabao

Hello,

I'm new here to the boards. I have a question regarding my Sapphire R9 290 Tri-X crossfire setup: I OC'ed it to 1150/1550 with a VDDC offset of +131.

I've run Unigine Valley and it is stable, but the score is lower than at stock speeds. Is the top card throttling, and is that why the score is lower?

And also, is this an acceptable temp?



I'm worried about my top card's temps. VRM max temp is 97°C and GPU max temp is 91°C.


----------



## Jflisk

Quote:


> Originally Posted by *kizwan*
> 
> Did you get any error message when trying to enable Crossfire? Try remove both cards & put them back again, and try again.


No error messages at all. When I disable the trifire it shows 3 cards, but I am unable to manage the card in Catalyst. 3 cards exist in Windows and GPU-Z.


----------



## Jflisk

Managed to get the trifire running. Spent 5 hrs on it last night, no worky; spent 3 minutes on it today and it starts working, go figure. It's one of two things: either there's something wrong with my motherboard (I pulled and reinstalled the card a couple of times last night), or it's a driver problem from loading the drivers on my bottom card first and then working my way up for the trifire. It is now showing 3 cards available for trifire and xfiring.


----------



## CrossfireManiac

Quote:


> Originally Posted by *CrossfireManiac*
> 
> I did use DDU , and uninstalled all traces of Nvidia, and ATI/AMD as well, rebooted , then installed the latest beta drivers clean , and the same issue persists
> 
> I boot up, choose windows loader in grub, HDMI monitor is active in grub, and stays active when the Windows boot loader is chosen,
> HDMI monitor stays active though the windows start ani splash screen , and is active and is the monitor that i enter the password for Win7 on, as soon as Win7 logs in , the HDMI monitor goes black, no signal, and other 3 monitors are active.


So this forum was a truly desperate attempt at finding some help after a somewhat easy experiment went awry~
The only help I got was a request to remove and reinstall drivers, which I had described in my first post~
Perhaps posts get buried too fast when everything goes into a thread that is 2854 pages long.
Nonetheless, this is not a viable venue for support; I understand that now, and will not bring problems here again~

BTW, I did figure out the solution


----------



## Paopawdecarabao

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> Hello,
> 
> I'm new here to the boards, I have a question regarding my Sapphire R9 290 Tri X Crossfire. I Oc'ed it to 1150 / 1550 VDDC Offset 131.
> 
> I've ran Unigine Valley and it is stable but the score is lower than the stock speed. Is the top card throttling that's why the score is lower?
> 
> And also is this an acceptable temp?
> 
> 
> 
> I'm worried on my top card temp. VRM mas temp is 97c and GPU max temp is 91c.


Could anyone help me? Thanks.


----------



## mAs81

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> Anyone could help me? thanks


I only own one card and don't have a lot of Xfire knowledge, but I believe that your temps are really high..
Indeed, the top card is throttling and that's why your benchmarks are lower..
You should really try to keep the VRM temps under 80°C at least.

Edit: just looked properly at your post about the first card


----------



## Sgt Bilko

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> Anyone could help me? thanks


Increase your power limit and disable ULPS.

That should help


----------



## Paopawdecarabao

Quote:


> Originally Posted by *mAs81*
> 
> I only own one card and don't have a lot Xfire knowledge but I believe that your temps are really high..
> Indeed the top card is throttling and that's why your benchmarks are lower..
> You should really try to keep the vrm temps at least under 80c
> 
> Edit:just looked at your post properly about the first card


It is really high; that is why the top card is throttling. I don't know why, but the Tri-X coolers couldn't keep the temps low.

My case is an NZXT Phantom Full Tower with one front intake, two side intakes, two top exhausts and one back exhaust.

Or maybe I will just put my system under water so I can overclock.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> It is really high that is why top the top card is throttling. I don't know why but the Tri X coolers couldn't keep the temp low.
> 
> My case is a NZXT Phantom Full Tower with one front intake, two side intake, two top exhaust and one back exhaust.
> 
> Or I will just put my system underwater so I can overclock.


You aren't temp throttling; disable ULPS and increase the power limit.


----------



## rdr09

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> It is really high that is why top the top card is throttling. I don't know why but the Tri X coolers couldn't keep the temp low.
> 
> My case is a NZXT Phantom Full Tower with one front intake, two side intake, two top exhaust and one back exhaust.
> 
> Or I will just put my system underwater so I can overclock.


Experiment with the fans on the side. I know the coolers of the GPUs are exhausting towards the top (blowing against the side fans if they are right across from one another). Turn them off, change airflow direction, turn off one side fan, etc., but keep the other ones exhausting. Do you have any fan(s) in the front?

Edit: yeah, watercool at least one. Both would be better, and then you can forget about them.

BTW, in the OP of the Valley bench thread you'll see steps to tweak multi-GPU setups to increase performance. The VRAM OC will help in benchmarks like that one, but in games . . . you'd be better off setting them back to stock and maybe 1100 on the core. 1100 might just need stock vcore, lowering your temps significantly.

Also, when benching on air, crank the fans to 100%.


----------



## Paopawdecarabao

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You aren't temp throttling, disable ULPS and increase the power limit.


ULPS is disabled. Where do I set the power limit in TriXX? I do my OC in TriXX.

But when I benchmarked my OC'ed crossfire, the score was lower than my stock-speed benchmark.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> ULPS is disabled . Where do I set the power limit on TriXX? I do my OC at TriXX though.
> 
> But when I benchmarked my oc'ed crossfire, the score was lower than my stock speed benchmark.


Because the core clock is fluctuating.

TriXX is funny for me sometimes. I need to enable Overdrive in CCC and raise the power limit there, and then set the clocks/voltage in TriXX.

I use Afterburner for 24/7 stuff and TriXX for big overclocks.

Are you sure you need that much voltage for 1150? Seems like a lot..... you could try +100mV in Afterburner and 1150 there, because it's easier to set the power limit slider straight to +50%.
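The fluctuating-clock symptom described here can be pictured with a toy model: when the card's power draw exceeds the board power cap, the core clock gets pulled back, so an overclock without a raised power limit can score *lower* than stock. This is purely illustrative; the `sustained_clock` function and its constants are made up for this sketch and are not AMD's actual PowerTune algorithm.

```python
# Toy model of power-limit throttling (illustrative only -- NOT AMD's
# real PowerTune algorithm). Power draw is modelled as growing with
# clock * voltage^2; once the board cap is hit, the clock is pulled
# back until the modelled draw fits under the cap.

def sustained_clock(target_mhz, vcore, power_cap_w, k=0.16):
    """Return the core clock (MHz) the modelled card can hold."""
    draw = k * target_mhz * vcore ** 2       # modelled power draw (W)
    if draw <= power_cap_w:
        return target_mhz                    # cap not hit: full clock held
    return power_cap_w / (k * vcore ** 2)    # clamped to fit under the cap

# 1150 MHz with extra voltage but a stock power cap: the clock sags.
stock_cap = sustained_clock(1150, 1.25, 250)    # -> 1000.0 (throttled)
# Raising the power limit removes the sag and holds the full 1150 MHz.
raised_cap = sustained_clock(1150, 1.25, 375)   # -> 1150
```

In the model, the raised cap (analogous to a +50% power limit slider) lets the same 1150 MHz target hold steady, while the stock cap drags the sustained clock down, which is exactly why the power-limited OC benchmarks below stock.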


----------



## Paopawdecarabao

Quote:


> Originally Posted by *rdr09*
> 
> experiement with the fans on the side. I know the coolers of the gpus are exhausting to the top (blowing against the side fans if they are right across another). Turn them off, change air flow direction, turn off one side fan, etc. keep the other ones exhausting. you have any fan(s) in the front?
> 
> edit: yah, watercool at least one. both would be better and forget about them.
> 
> BTW, in the op of the valley bench thread you'd see steps to tweak multi-gpu setups to increase performance. the vram oc will help in benchmarks like that one but in games . . . you'd be better off setting them back to stock and maybe 1100 on the core. 1100 might just need stock vcore, thus lowering your temps significantly.
> 
> also, when benching on air, crank the fans to 100%.


The side fans are not directly across from the GPUs; they're more on the side of the drive bays.

I do have one 140mm front intake fan.

You mean there is no significant increase when overclocking cards in games, just in benchmarks?

Yeah, most likely. I just want to see the maximum performance of my GPUs.


----------



## Paopawdecarabao

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Because the core clock is fluctuating.
> 
> Trixx is funny for me sometimes. I need to enable overdrive in CCC and raise the power limit there and then set the clocks/voltage in Trixx.
> 
> I use afterburner for 24/7 stuff and Trixx for big overclocks.
> 
> Are you sure you need that much voltage for 1150? Seems like alot.....you could try +100mV in afterburner and 1150 there because its easier to set the powerlimit slider straight to 50%


I will try that setting. How about the memory clock, could I still do 1550?

I am overclocking in crossfire mode, btw. Or do I need to overclock the cards individually?

Not really an expert at OC'ing, I'm just playing around with the settings.

Thanks for the input


----------



## Sgt Bilko

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> I would try that setting. How about the memory clock could I still do 1550?
> 
> I am overclocking on crossfire mode btw. Or I need to overclock the cards individually?
> 
> Not really an expert on oci'ng I'm just playing around the settings.
> 
> Thanks for the input


Do them both at once, but mem overclocks don't mean much with these cards.

Focus on the core clocks; 1100 like rdr said is a good everyday OC, and 1200+ for benching.

Memory around 1400-1500 is good enough.

Get Afterburner and pump the fan speed straight to 100%, then start with your clocks. Keep going higher till you are at your highest stable clocks, then add voltage and keep going till you hit the point where you are comfortable with the temps.

For Xfire that's around +100mV for a daily overclock, all depending on airflow, ambients etc. of course
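The step-up procedure outlined above can be sketched as a simple search loop. The `is_stable` callback here is a hypothetical stand-in for you running Valley/Firestrike at each step and watching for artifacts; nothing in this sketch talks to real hardware.

```python
# Sketch of the step-up overclocking routine: raise the clock one step
# at a time while the stability test passes, then settle on the last
# step that passed. `is_stable` is a hypothetical stand-in for a manual
# benchmark-and-inspect pass, not a real API.

def find_max_clock(start_mhz, step_mhz, limit_mhz, is_stable):
    """Step the core clock up until the next step fails the stability
    test (or hits the hard limit), and return the last stable clock."""
    clock = start_mhz
    while clock + step_mhz <= limit_mhz and is_stable(clock + step_mhz):
        clock += step_mhz
    return clock

# Example: pretend this card artifacts above 1180 MHz at the current voltage.
best = find_max_clock(start_mhz=1000, step_mhz=10, limit_mhz=1300,
                      is_stable=lambda mhz: mhz <= 1180)
# best -> 1180
```

The same loop applies to the voltage stage afterwards: fix the clock, step the voltage instead, and stop at the point where the temps are still comfortable.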


----------



## Paopawdecarabao

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Do them both at once but mem overclocks don't mean much with these cards.
> 
> Focus on the core clocks 1100 like rdr said is a good everyday oc and 1200+ for benching.
> 
> Memory maybe around 1400-1500 is good enough.
> 
> Get afterburner and pump the fan speed straight to 100% and start with your clocks. Keep going higher till you are at your highest stable clocks then add voltage and keep going till you hit the point where you are comfortable with the temps.
> 
> For Xfire thats around +100mV for a daily overclock, all depending on airflow, ambients etc of course


I've tried the settings you told me. How's this?



And I ran Valley. At stock clocks I get a 4,250-4,300 score; overclocked it goes to 4,500. Not much of a difference.


----------



## rdr09

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> I've tried the settings you told me. How's this?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> and I ran Valley. Stock clocks I get 4,250 to 4,300 score. When I benchmarked it it goes 4,500. not so much difference.


Your temps are looking better. 80°C or lower on air when benchmarking (IMO) would be ideal; the lower the better. It takes a lot of OC to see significant increases in Valley, so don't expect significant jumps in scores.

Have you tried the tweaks sugerhell provided? Scroll down . . .

http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0

Follow what Sgt said on OC'ing. Try other benchmarks from Futuremark, such as Firestrike. Synthetic benchmarks and games will rely on the CPU and its OC, 2 290s especially.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> I've tried the settings you told me. How's this?
> 
> 
> 
> and I ran Valley. Stock clocks I get 4,250 to 4,300 score. When I benchmarked it it goes 4,500. not so much difference.


Temps are looking better, and your core clock didn't fluctuate, so it was the lack of power limit that was the problem.
Do what rdr said: download Firestrike and run it at stock, then overclocked. It will show more gains than Valley will.
Quote:


> Originally Posted by *rdr09*
> 
> your temps are looking better. 80C or lower on air when benchmarking (imo) would be ideal. the lower the better. it takes a lot of oc to see significant increases in Valley. don't expect significant jumps in scores.
> 
> you tried the tweaks sugerhell provided? scroll down . . .
> 
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
> 
> Follow what Sgt said in oc'ing. Try other benchmarks like Futuremark such as Firestrike. Synthetic benchmarks and games will rely on the cpu and its oc. 2 290s especially.


Yep, those tweaks are awesome; I might do some runs when I finish work to compare them.


----------



## Paopawdecarabao

Quote:


> Originally Posted by *rdr09*
> 
> your temps are looking better. 80C or lower on air when benchmarking (imo) would be ideal. the lower the better. it takes a lot of oc to see significant increases in Valley. don't expect significant jumps in scores.
> 
> you tried the tweaks sugerhell provided? scroll down . . .
> 
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
> 
> Follow what Sgt said in oc'ing. Try other benchmarks like Futuremark such as Firestrike. Synthetic benchmarks and games will rely on the cpu and its oc. 2 290s especially.


Quote:


> Originally Posted by *Sgt Bilko*
> 
> Temps are looking better, your core clock didnt fluctuate so it was the lack of powerlimit that was the problem
> Do what rdr said and download firestrike and run it at stock then overclocked, it will some more gains than valley will.
> Yep, those tweaks are awesome, I might do some runs when I finish work to compare them.


I did the tweaks on the first page as per the instructions, but it made my score worse. In crossfire I was getting 2,900 with the tweaks compared to 4,500 without them.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> I did the tweaks on the first page as per the instructions but it made my score worst. In crossfire I was getting 2900 with the tweaks compared to 4500 without the tweaks


Did you add the changes to the crossfire profile?


----------



## Paopawdecarabao

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Did you add the crossfire profile?


What do you mean? I've tried both the 1x1 and default mode in CCC for Valley, but it went really low.

I haven't saved the benchmark, but it shows x2 on the cards.


----------



## chronicfx

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> I've tried the settings you told me. How's this?
> 
> 
> 
> and I ran Valley. Stock clocks I get 4,250 to 4,300 score. When I benchmarked it it goes 4,500. not so much difference.


Just out of curiosity for the group: when I run crossfire or trifire with my 290Xs, my usage graphs always look like his above, going up and down like a child scribbling, even when running with VSYNC off. This is while gaming on a single monitor at 1440p, as I don't really do any benchmarking. The most recent games played have been the 8 AMD bundled games that came with my previous 7990 GPU. Is this a normal occurrence, or is it due to having an i5 CPU with 4 threads and a 3.0 x8/x4/x4 PCIe configuration? Perhaps someone with a mainstream i7 and a Maximus Extreme or other PLX mobo with several 290Xs can show me their usage graph? Someone using X79 and a six-core with 290/X trifire would be great too. Obviously I would wait a bit to upgrade, as I can't really see anything I would want to play being demanding until at least Far Cry 4 and Dragon Age drop, but this is just for me to think about whether a platform upgrade to X99 or Z97 is at all justified. I am not experiencing bad gameplay, especially when running "bundled" titles; just wondering if people with i7s don't have this issue. Thanks


----------



## Paopawdecarabao

Quote:


> Originally Posted by *rdr09*
> 
> your temps are looking better. 80C or lower on air when benchmarking (imo) would be ideal. the lower the better. it takes a lot of oc to see significant increases in Valley. don't expect significant jumps in scores.
> 
> you tried the tweaks sugerhell provided? scroll down . . .
> 
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
> 
> Follow what Sgt said in oc'ing. Try other benchmarks like Futuremark such as Firestrike. Synthetic benchmarks and games will rely on the cpu and its oc. 2 290s especially.


Quote:


> Originally Posted by *Sgt Bilko*
> 
> Did you add the changes to the crossfire profile?


Comparisons of my benchmark before and after I did the tweak.

Before I did the tweak



After I did the tweak


----------



## rdr09

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> Comparisons of my benchmark before and after I did the tweak.
> 
> Before I did the tweak
> 
> 
> 
> After I did the tweak
> 
> 
> Spoiler: Warning: Spoiler!


Try hitting enable crossfire again just before running the bench. Your previous score is not really that bad for the given OC; the highest I see for the same config is around 135, and those are highly clocked. Hawaiis do not really do well in this bench, though they do better the more GPUs you add. Look at the top three spots.


----------



## kizwan

Quote:


> Originally Posted by *chronicfx*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Paopawdecarabao*
> 
> I've tried the settings you told me. How's this?
> 
> 
> 
> and I ran Valley. Stock clocks I get 4,250 to 4,300 score. When I benchmarked it it goes 4,500. not so much difference.
> 
> 
> 
> 
> 
> 
> 
> Just out of curiosity for the group when I run crossfire or trifire with my 290x's my usage graphs always look like his above going up and down like a child scribbling, even when running with VSYNC off. This is while gaming single monitor at 1440p as I don't really do any benchmarking. Most recent games played have been the AMD bundled 8 games that came with my previous 7990 GPU. Is this a normal occurrence or is it due to having an i5 CPU with 4 threads and an 3.0x8/x4/x4 pcie configuration? Perhaps someone with a mainstream i7 and a maximus extreme or other PLX mobo with several 290x can show me their usage graph? Also someone using X79 and a six core with 290/x trifire would be great too. Obviously I would wait a bit to upgrade as I can't really see anything I would want to play being demanding until at least Far Cry 4 and Dragon age drops but just for me to think about whether or not a platform upgrade to X99 or Z97 is at all justified. I am not experiencing bad gameplay especially when running "bundled" titles. Just wondering if people with i7's don't have this issue. Thanks

That is perfectly normal. Also, when your GPUs run at a higher resolution, GPU usage increases and fluctuates less.

Valley


BF4 1080p 150% resolution scale (2880 x 1620)


BF4 1080p 200% resolution scale (3840 x 2160 = 4k)


----------



## Paopawdecarabao

Quote:


> Originally Posted by *rdr09*
> 
> 
> try hitting enable crossfire again just before running the bench. your previous score is not really that bad for the given oc. the highest i see for the same config is around 135 and those are highly clocked. hawaiis do not really do well in this bench. well, it does the more you add gpus. look at the top three spots.


I've monitored the second GPU in MSI AB; max load was only 26%. It seems it doesn't utilize the second card. How would I fix that?


----------



## rdr09

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> I've monitored the second GPU on MSI AB max load was only 26. It seems that it doesn't utilize the second card. How would I fix that?


Have you re-enabled crossfire? Sometimes AB and CCC oppose each other. Check if all your settings are still the same; the change happened after following the tweaks.

It's been a while since I ran crossfire, but I remember one time running Valley I had to disable and re-enable crossfire just prior to the bench. Same thing with Heaven.


----------



## Paopawdecarabao

Quote:


> Originally Posted by *rdr09*
> 
> have you re-enabled crossfire? sometimes AB and CCC oppose each other. check if all your settings are still the same. the change happened after following the tweaks.
> 
> its been awhile since i ran crossfire but i remember running Valley that i had to disable and re-enable crossfire just prior the bench one time. same thing with Heaven.


Yes I did, but it's weird: I enabled my Eyefinity, ran Valley, and it worked.

But I didn't change the priority to realtime, because I'm running the benchmark on each of my three monitors at 1920x1080 and I'd have to start the task manager.

It's kinda weird that I have to run Eyefinity. But does crossfire work when I have my displays separate?


----------



## rdr09

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> Yes I did, but it's weird: I enabled eyefinity, ran Valley, and it worked.
> 
> But I didn't change the priority to realtime because I'm running the benchmark across my three monitors at 1920x1080 each, and I'd have to start the task manager.
> 
> It's kinda weird that I have to run eyefinity. Does crossfire work when I run my displays separately?
> 
> 
> Spoiler: Warning: Spoiler!


i did not know you have three monitors, and i have no idea how multi-monitor works in this bench. i think it should have shown 5760x1080, and your score would be a lot lower.


----------



## LesPaulLover

Well I'll tell ya:

STEER CLEAR OF THE MSI GAMING 290/290X .... THEY'RE COMPLETE GARBAGE

My "MSI Gaming 290 4gb" card hits 94c within 10-15 minutes running any game that maxes it out. Witcher 2, Arkham City, BF4 etc etc. Keep in mind this is with the fan manually set to 99% BEFORE I even start playing the game.

MSI refuses to RMA it citing "those temps are within the normal operating range of the card." Which is funny because the back of the box shows those temps are the reference temps, and they advertise THEIR cards as 10-20c below that.

I'm using a Corsair Carbide Air 540 "high airflow" cube case: 2 140mm intake fans on the front, 2 140mm EXHAUST fans on the back, and 2 140mm exhaust fans on the top, so it's not my case airflow!!!


----------



## hteng

Need some advice, I already own a r9 290x Tri X and I'm looking to get another one in the near future to crossfire.

I'm also looking to upgrade my mobo to either the Asus Maximus VI Gene or the Gryphon (I'm limited to the mATX form factor). I want to know if the gap between the PCIe x16 slots is enough to accommodate two R9 290X Tri-Xs. Would appreciate it if anyone here with a similar setup could maybe snap a photo for me? thank you.


----------



## keikei

Quote:


> Originally Posted by *LesPaulLover*
> 
> Well I'll tell ya:
> 
> STEER CLEAR OF THE MSI GAMING 290/290X .... THEY'RE COMPLETE GARBAGE
> 
> My "MSI Gaming 290 4gb" card hits 94c within 10-15 minutes running any game that maxes it out. Witcher 2, Arkham City, BF4 etc etc. Keep in mind this is with the fan manually set to 99% BEFORE I even start playing the game.
> 
> MSI refuses to RMA it citing "those temps are within the normal operating range of the card." Which is funny because the back of the box shows those temps are the reference temps, and they advertise THEIR cards as 10-20c below that.
> 
> I'm using a Corsair Carbide Air 540 "high airflow" cube case: 2 140mm intake fans on the front, 2 140mm EXHAUST fans on the back, and 2 140mm exhaust fans on the top, so it's not my case airflow!!!


May I ask, do you have vsync on while gaming? With it on, your card works less and won't get as hot.


----------



## Paopawdecarabao

Quote:


> Originally Posted by *rdr09*
> 
> have you re-enabled crossfire? sometimes AB and CCC oppose each other. check if all your settings are still the same. the change happened after following the tweaks.
> 
> its been awhile since i ran crossfire but i remember running Valley that i had to disable and re-enable crossfire just prior the bench one time. same thing with Heaven.


Yeah, I've done that. I'll probably just straight trade my two Sapphire R9 290 Tri-Xs for two Sapphire R9 290X reference cards. Would that be fair?

'Cause I do plan on putting my system under water, and the Tri-X cooler would go to waste.


----------



## chronicfx

Quote:


> Originally Posted by *phallacy*
> 
> Hey everyone,
> 
> been some time since I made a post on these forums due to work and all. I was hoping someone could help me with an issue I've been having with the Witcher 2 and multi gpu crossfire.
> 
> Recently I downloaded The Witcher 2 enhanced edition from steam after reading some rave reviews both online and here on OC.net. However, I'm running into a serious problem when playing.
> 
> I'm using a 4x 290x computer overclocked with the 14.7 drivers (not RC2 but the original) and within a few minutes the game freezes and so does my computer. This is with ubersampling disabled by the way on 4k res. The first time it corrupted the AMD drivers and had to do a reinstall but ever since it just keeps freezing/crashing/whatever you want to call it. All other games including Crysis 3, Tomb Raider, Watch Dogs modded, Far Cry 3, Battlefield 4; basically any other game that is demanding on GPUs is working fine with the overclocks.
> 
> I tested this problem with all my cards overclocked as I do in any game vs. all stock and am experiencing the same problem. I did a google search about crossfire / multi AMD gpu and witcher 2 and it seems quite a few other people have the same problem.
> 
> I tried an uninstall and reinstall of Witcher 2 but the same freezing/crashing happens.
> 
> Any advice for troubleshooting or a fix I'm unaware of? Thanks!


Happened to me too. Use lower settings to get away from the area, then save again; that's what fixed it for me. My prob was that I would load into a scene and could stand there and look around, but as soon as I walked it froze. Is that your prob?


----------



## rdr09

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> Yeah, I've done that. I'll probably just straight trade my two Sapphire R9 290 Tri-Xs for two Sapphire R9 290X reference cards. Would that be fair?
> 
> 'Cause I do plan on putting my system under water, and the Tri-X cooler would go to waste.


yes, that is a good plan. Reference for watercooling. check them individually first on air prior to watercooling.


----------



## Paopawdecarabao

Quote:


> Originally Posted by *rdr09*
> 
> yes, that is a good plan. Reference for watercooling. check them individually first on air prior to watercooling.


If not Sapphire, would an AMD Radeon reference R9 290X be okay? I heard that other brands have the core voltage locked. I'm not really familiar with the amd brands; the Sapphire Tri-X was my first amd card, btw.


----------



## Cyber Locc

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> If not Sapphire, would an AMD Radeon reference R9 290X be okay? I heard that other brands have the core voltage locked. I'm not really familiar with the amd brands; the Sapphire Tri-X was my first amd card, btw.


Why do you say "if not Sapphire"? Sapphires are fine, just get reference, not Tri-X. If you trade your Tri-Xs for 290Xs you will most likely have to include cash; I'm not sure many people would even take that trade unless they're having heat issues, and getting someone to downgrade ain't easy lol. Your best bet would be to sell the Tri-Xs and buy 290Xs (maybe that's what you meant).

Is it fair? Whether a trade is fair is up to the person, but going by the numbers, no, not at all: used Sapphire Tri-Xs are like 280-300 while ref 290Xs are 350-400, so no, not a fair trade.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> If not Sapphire, would an AMD Radeon reference R9 290X be okay? I heard that other brands have the core voltage locked. I'm not really familiar with the amd brands; the Sapphire Tri-X was my first amd card, btw.


Not one reference 290/X has the voltage locked, and they are all the same card; it's just the sticker, warranty and bios that change.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> Comparisons of my benchmark before and after I did the tweak.
> 
> Before I did the tweak
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> After I did the tweak
> 
> 
> Spoiler: Warning: Spoiler!


I figured out what went wrong for you: when you alt+tab out of Valley to set the priority to realtime, Crossfire doesn't start again; it goes back to single GPU.

The same thing just happened to me, so I did all the tweaks apart from that one.

Stock settings, 980/1250



And again after doing the tweaks:



Bear in mind I run 2 monitors and I didn't feel like disabling one of them to do this.

And sorry about the second one; Valley froze right at the end before I could take a screencap.


----------



## joeh4384

Quote:


> Originally Posted by *LesPaulLover*
> 
> Well I'll tell ya:
> 
> STEER CLEAR OF THE MSI GAMING 290/290X .... THEY'RE COMPLETE GARBAGE
> 
> My "MSI Gaming 290 4gb" card hits 94c within 10-15 minutes running any game that maxes it out. Witcher 2, Arkham City, BF4 etc etc. Keep in mind this is with the fan manually set to 99% BEFORE I even start playing the game.
> 
> MSI refuses to RMA it citing "those temps are within the normal operating range of the card." Which is funny because the back of the box shows those temps are the reference temps, and they advertise THEIR cards as 10-20c below that.
> 
> I'm using a Corsair Carbide Air 540 "high airflow" cube case: 2 140mm intake fans on the front, 2 140mm EXHAUST fans on the back, and 2 140mm exhaust fans on the top, so it's not my case airflow!!!


That is strange; MSI accepted my RMA on my 290X Gaming for the same reason. It ran at 90 in my Fractal Design Arc Midi 2 case. At first I thought it was just my ITX case, then I tried it in my main rig and it was still super hot. They sent me back a hotter card, but I am hoping RMA #2 works out. I am not in a rush, but I would like to have it back.


----------



## VonDutch

Hi,

can i ask if anyone here has a problem running 2x R9 290 with Battlefield 3?
I sold my GTX 690 and bought these cards a few days ago, but bf3 somehow plays better with 1 card.
when i play cf the fps is almost the same as with 1 card,
and the gpus don't seem to go to full load; it behaves very erratically..
i tried both cards singly, and both load fully in bf3..

then i downloaded bf4, and it works like a charm, in game or on the test range: ultra settings,
supersampling at 150%, fps is 100 on average, most of the time higher than that..


this good enough for my entry


----------



## battleaxe

Quote:


> Originally Posted by *LesPaulLover*
> 
> Well I'll tell ya:
> 
> STEER CLEAR OF THE MSI GAMING 290/290X .... THEY'RE COMPLETE GARBAGE
> 
> My "MSI Gaming 290 4gb" card hits 94c within 10-15 minutes running any game that maxes it out. Witcher 2, Arkham City, BF4 etc etc. Keep in mind this is with the fan manually set to 99% BEFORE I even start playing the game.
> 
> MSI refuses to RMA it citing "those temps are within the normal operating range of the card." Which is funny because the back of the box shows those temps are the reference temps, and they advertise THEIR cards as 10-20c below that.
> 
> I'm using a Corsair Carbide Air 540 "high airflow" cube case: 2 140mm intake fans on the front, 2 140mm EXHAUST fans on the back, and 2 140mm exhaust fans on the top, so it's not my case airflow!!!


I have an MSI Gaming also. Have you tried reseating the cooler with new thermal paste? I did that and got over 5C better results. I don't see temps over 80C unless I add 100mV of additional voltage and run it at 1150MHz on the core. Otherwise it benches Valley and runs BF4 at around 73C, so I'm not sure what's going on with your card. Have you tried undervolting it? That can help a lot to get temps down. I personally like my MSI Gaming 290 a lot. I also have a reference MSI 290. Both good cards, and both going on water very soon; the reference MSI 290 is already under water. I've just been too lazy to put the Gaming one under water as well 'cause it's so quiet already.


----------



## Paopawdecarabao

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I figured out what went wrong for you: when you alt+tab out of Valley to set the priority to realtime, Crossfire doesn't start again; it goes back to single GPU.
> 
> The same thing just happened to me, so I did all the tweaks apart from that one.
> 
> Stock settings, 980/1250
> 
> 
> 
> And again after doing the tweaks:
> 
> 
> 
> Bear in mind I run 2 monitors and I didn't feel like disabling one of them to do this.
> 
> And sorry about the second one; Valley froze right at the end before I could take a screencap.


That's a good score. Now it makes sense: it was benching like a single card, that's why the score was low.

How do you set realtime then? Unplug all the monitors?

You still need to alt-tab to get to task manager to set realtime, though.
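One way around the alt-tab problem (not from the thread, just a sketch) is to launch the benchmark already at the priority class you want, so task manager never enters the picture. A minimal Windows-only Python sketch; the exe path is hypothetical, and the flag values are the standard Win32 priority-class constants:

```python
import subprocess
import sys

# Standard Win32 priority-class flags. These are the same values the
# subprocess module exposes on Windows as HIGH_PRIORITY_CLASS and
# REALTIME_PRIORITY_CLASS.
HIGH_PRIORITY_CLASS = 0x00000080
REALTIME_PRIORITY_CLASS = 0x00000100

def launch_bench(exe_path, priority=HIGH_PRIORITY_CLASS):
    """Start a benchmark already at the chosen priority class, so there
    is no need to alt-tab out afterwards (which drops Crossfire back to
    a single GPU). exe_path is whatever your Valley launcher actually
    is; this is Windows-only."""
    if sys.platform != "win32":
        raise RuntimeError("priority classes are a Windows concept")
    return subprocess.Popen([exe_path], creationflags=priority)
```

cmd.exe can do the same without any code via `start "" /high valley.exe` (or `/realtime`), which also avoids touching task manager.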


----------



## Klocek001

My 290 Tri-X was running fine until I decided to update the driver on Sunday. Now both 14.7 RC3 and 14.4 see it as an 8700M. plz help.


----------



## spikezone2004

I crashed again with this weird looking screen, anyone know what it means? also got a bsod with atikmpag.sys never figured it out why I got it, I am thinking its all driver issues.


----------



## battleaxe

Quote:


> Originally Posted by *Klocek001*
> 
> My 290 tri-x was running fine until I decided to update the driver on Sunday. Now both 14.7 rc3 and 14.4 see it as 8700M. plz help.


Use a driver sweeper as mentioned in the OP. Do that a few times. Then install the new ones a few times also. Works flawlessly most of the time.


----------



## Klocek001

just used the clear CMOS button - it helped. I've recently had a series of random problems, which all seem to have been resolved by clearing the CMOS - what should I do? is the mobo acting up?


----------



## battleaxe

Quote:


> Originally Posted by *Klocek001*
> 
> just used clear cmos button - it helped. I've recently had a series of random problems, which all seemed to have been resolved with clearing cmos - what should I do ? is this the mobo acting up ?


Any electronics can go stupid every now and then. See what it does over a few weeks; probably fine though, if I had to guess... when adding new hardware sometimes the mobo will freak out. I've seen it happen more times than I can count. The OS can get confused too, which is often the real issue.


----------



## heroxoot

Quote:


> Originally Posted by *Klocek001*
> 
> My 290 tri-x was running fine until I decided to update the driver on Sunday. Now both 14.7 rc3 and 14.4 see it as 8700M. plz help.


I've had this happen on a laptop; the damn thing wound up dying. I'd do a DDU sweep and see if it helps.


----------



## FuriousPop

Quote:


> Originally Posted by *keikei*
> 
> May I ask do you have vsync on during gaming? With it on, your card will work less, and wont be as hot.


^^ this.. also, you can always have vsync off and use something like Razor to put a frame limit on. no point pumping out 100 FPS on a 60Hz monitor....

i've done this for my setup since it's a little out of the ordinary... the cards tend to go up and down and all around depending on what game + settings are running.
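The frame-cap idea boils down to simple arithmetic: at a 60 FPS cap each frame gets about 16.7 ms, and a limiter just sleeps off whatever the render doesn't use, so the card loafs instead of running flat out. A minimal sketch with a hypothetical `render_one_frame` callback standing in for the real draw call:

```python
import time

def frame_budget_s(target_fps):
    """Seconds available per frame under an FPS cap."""
    return 1.0 / target_fps

def run_capped(render_one_frame, target_fps=60, frames=3):
    """Minimal frame-limiter sketch: render, then sleep off the rest of
    the frame budget, so the GPU sits idle instead of producing frames
    a 60Hz monitor can never display."""
    budget = frame_budget_s(target_fps)
    for _ in range(frames):
        start = time.perf_counter()
        render_one_frame()
        leftover = budget - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)
```

Real limiters are fancier about sleep granularity and frame pacing, but the load (and heat) reduction comes entirely from that sleep.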


----------



## Beezleybuzz

OK... So I've got a predicament for you guys.

I have 2 Asus R9 290 DCUII OC's, bought about 2 months apart at 2 different retailers. My first one performs great, with custom fan profile I can have it OC'd to 1100 core and 6000 memory with peak temps of 81-83. At stock clocks and regular fan it peaks at 79-80.

Now, my second card is where things get weird. My second card reaches max temp even at stock clocks, within NO TIME. What makes matters worse is that I didn't notice a broken fan blade until after the 30 day purchase and Asus says I broke the fan (which I didn't) and won't cover it. I figure one broken fan blade might cause temps to go up by 5 maybe 6 degrees tops, but 94C in no time? Something has to be wrong.

So I don't know what to do. Should I see if Asus will still RMA it even though it has a broken fan? Or just tear off the heat sink and see if there is a plastic wrapper on the bottom or some other tomfoolery at work?

Advice?


----------



## bbond007

Quote:


> Originally Posted by *Beezleybuzz*
> 
> OK... So I've got a predicament for you guys.
> 
> I have 2 Asus R9 290 DCUII OC's, bought about 2 months apart at 2 different retailers. My first one performs great, with custom fan profile I can have it OC'd to 1100 core and 6000 memory with peak temps of 81-83. At stock clocks and regular fan it peaks at 79-80.
> 
> Now, my second card is where things get weird. My second card reaches max temp even at stock clocks, within NO TIME. What makes matters worse is that I didn't notice a broken fan blade until after the 30 day purchase and Asus says I broke the fan (which I didn't) and won't cover it. I figure one broken fan blade might cause temps to go up by 5 maybe 6 degrees tops, but 94C in no time? Something has to be wrong.
> 
> So I don't know what to do. Should I see if Asus will still RMA it even though it has a broken fan? Or just tear off the heat sink and see if there is a plastic wrapper on the bottom or some other tomfoolery at work?
> 
> Advice?


Yes. If you saved the blade you can fix it. Carefully remove the fan.

Sand the blade a little on both sides where it broke, over the whole area.

Get a rubber band ready that fits around all the blades. You may have to cut it to the right size and tie it if you don't have the exact size available.

Use 2-part epoxy, and you probably want the kind that sets in 5 min. Goop it on there as much as possible, but not so much that it runs.

Hold the fan blade in place with the rubber band.

I have fixed several broken fans that way over the years.


----------



## Beezleybuzz

See, here is the thing: I didn't break the fan; there was no piece of it even in the box it came in. I was just dumb when I got the card and didn't go over it with a fine-tooth comb to make sure everything was kosher. Live and learn, eh?
Quote:


> Originally Posted by *bbond007*
> 
> Yes. If you saved the blade you can fix it. Carefully remove the fan.
> 
> Sand the blade a little on both sides where it broke, over the whole area.
> 
> Get a rubber band ready that fits around all the blades. You may have to cut it to the right size and tie it if you don't have the exact size available.
> 
> Use 2-part epoxy, and you probably want the kind that sets in 5 min. Goop it on there as much as possible, but not so much that it runs.
> 
> Hold the fan blade in place with the rubber band.
> 
> I have fixed several broken fans that way over the years.


----------



## Beezleybuzz

dammit... I'm messing up my posts, sorry, slightly inebriated here.


----------



## Dasboogieman

Quote:


> Originally Posted by *Beezleybuzz*
> 
> OK... So I've got a predicament for you guys.
> 
> I have 2 Asus R9 290 DCUII OC's, bought about 2 months apart at 2 different retailers. My first one performs great, with custom fan profile I can have it OC'd to 1100 core and 6000 memory with peak temps of 81-83. At stock clocks and regular fan it peaks at 79-80.
> 
> Now, my second card is where things get weird. My second card reaches max temp even at stock clocks, within NO TIME. What makes matters worse is that I didn't notice a broken fan blade until after the 30 day purchase and Asus says I broke the fan (which I didn't) and won't cover it. I figure one broken fan blade might cause temps to go up by 5 maybe 6 degrees tops, but 94C in no time? Something has to be wrong.
> 
> So I don't know what to do. Should I see if Asus with still RMA it even though has broken fan? Or just tear off the heat sink and see if there is a plastic wrapper on the bottom or some other tom foolery at work?
> 
> Advice?


They also have replacement fans on ebay; I recently had to replace a set on an old MSI GTX 560 Ti Twin Frozr II.

A little expensive though:
http://www.ebay.com/itm/95mm-Video-Card-Single-Fan-Replacement-for-ASUS-GTX780-R9-280X-T129215SU-0-50A-/171415290180?pt=US_Video_Card_GPU_Cooling&hash=item27e925c144


----------



## Beezleybuzz

Quote:


> Originally Posted by *Dasboogieman*
> 
> They also have replacement fans on ebay, I've had to recently replace a set on an old MSI GTX 560Ti Twin Frozr ii.
> 
> A little expensive though:
> http://www.ebay.com/itm/95mm-Video-Card-Single-Fan-Replacement-for-ASUS-GTX780-R9-280X-T129215SU-0-50A-/171415290180?pt=US_Video_Card_GPU_Cooling&hash=item27e925c144


80 bucks? I can buy a whole new heatsink for that!

I really don't think it's the fan that's making the difference, because even with the fans cranked it still throttles at 94C. I'm going to get a little set of screwdrivers tomorrow and take it apart, warranty be damned!


----------



## Arizonian

Quote:


> Originally Posted by *VonDutch*
> 
> Hi,
> 
> can i ask if anyone here has a problem running 2x R9 290 with battlefield 3?
> I sold my GTX690 and bought these cards a few days ago, but bf3 plays better with 1 card somehow,
> when i play cf the fps is almost the same as playing with 1 card,
> the gpu's don't seem to go full load, it behaves very erratic..
> i tried both cards single, and both load full in bf3..
> 
> then i dl bf4, and it works like a charm, in game, or on testrange, ultra settings,
> supersampling on 150% , fps is 100 on avarage, most of the time higher then that..
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> this good enough for my entry


Congrats - added


----------



## shinayasaki

Hi there. Not sure if I can ask about temps in this topic or if I need to go to the ATI cooling section.
I'm running my R9 290 at stock speeds. Benchmarked with Furmark, the temps are:
GPU : 31/45
VRM 2 : 31/50
VRM 1 : 27/*77*

Is it normal for VRM1's temp to be that high?
I already installed heatsinks on the VRMs with Arctic Silver thermal adhesive.

Regards.
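For context on whether 77C on VRM1 is worrying, here is a tiny checker against rule-of-thumb Hawaii ceilings. The limits below are assumptions based on commonly quoted figures (the core throttles at about 94C, and VRMs are usually tolerated to well over 100C), not an official spec sheet:

```python
# Rule-of-thumb ceilings for a Hawaii card, in Celsius. These numbers
# are assumptions, not a spec sheet; adjust to taste.
LIMITS_C = {"GPU": 94, "VRM1": 115, "VRM2": 115}

def over_limit(load_temps_c):
    """Return the sensors whose load temperature meets or exceeds the
    assumed ceiling."""
    return [name for name, t in load_temps_c.items()
            if t >= LIMITS_C.get(name, float("inf"))]

# The Furmark load readings from the post above: nothing is over.
print(over_limit({"GPU": 45, "VRM2": 50, "VRM1": 77}))
```

By these rough numbers a 77C VRM1 under Furmark has plenty of headroom; it only looks high next to the unusually cool core.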


----------



## VonDutch

Quote:


> Originally Posted by *shinayasaki*
> 
> Hi there. Not sure if i can ask about temp in this topic or need to go to the ATI cooling.
> I'm running my R9 290 at stock speed. Benchmarked by Furmark and the temp are :
> GPU : 31/45
> VRM 2 : 31/50
> VRM 1 : 27/*77*
> 
> Is it normal that VRM1's temp being that high ?
> I already installed VRM heatsinks on the VRAMs with Arctic silver thermal adhesive.
> 
> Regards.


Should be no problem, they can go higher than that. maybe try redoing vrm 1?
mine are around 60-70C..
just noticed your gpu temps are low, is your fan spinning fast enough?
if it's too low, your vrms don't get enough cool air.
Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added


TY


----------



## sugarhell

Quote:


> Originally Posted by *shinayasaki*
> 
> Hi there. Not sure if i can ask about temp in this topic or need to go to the ATI cooling.
> I'm running my R9 290 at stock speed. Benchmarked by Furmark and the temp are :
> GPU : 31/45
> VRM 2 : 31/50
> VRM 1 : 27/*77*
> 
> Is it normal that VRM1's temp being that high ?
> I already installed VRM heatsinks on the VRAMs with Arctic silver thermal adhesive.
> 
> Regards.


Don't use Furmark, except if you want to test your loop.


----------



## rdr09

Quote:


> Originally Posted by *Beezleybuzz*
> 
> 80 bucks? I can buy a whole new heatsink for that!
> 
> I really don't think it's the fan that's making the difference because even with the fans cranked it still throttles at 94C. I'm going to get a little set of screw drivers tomorrow and take it apart, warranty be damned!


with one broken fan blade, cooling will not be optimal. even with a three-fan design these gpus struggle to keep cool. that fan replacement is a rip-off, almost the cost of a full waterblock.

not sure exactly when you discovered the fan blade was broken, but i would have missed that myself. i understand the company's stand on the matter, though. could you at least ask for a replacement cooling assembly? if not . . . just the fan?


----------



## Yuriewitsch

Hi,
I think, this should be enough for my entry to member's list.









I'm using Accelero Extreme IV + homemade aluminium VRM and memory heatsinks.


----------



## Sgt Bilko

Those are some sweet heatsinks, nice work


----------



## rdr09

Quote:


> Originally Posted by *Yuriewitsch*
> 
> Hi,
> I think, this should be enough for my entry to member's list.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm using Accelero Extreme IV + homemade aluminium VRM and memory heatsinks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


great skillz. where did you get that version of AB? me likes the grid-style readings.

edit: i must say . . . nice clean rig. looks elegant. i know that case is a Cooler Master, but which one?


----------



## Yuriewitsch

Thx guys,

That version of AB is new beta from:

http://www.guru3d.com/files-details/msi-afterburner-beta-download.html

edit:







Wrong.
Fractal Design Arc Midi


----------



## rdr09

Quote:


> Originally Posted by *Yuriewitsch*
> 
> Thx guys,
> 
> That version of AB is new beta from:
> 
> http://www.guru3d.com/files-details/msi-afterburner-beta-download.html
> 
> edit:
> 
> 
> 
> 
> 
> 
> 
> Wrong.
> Fractal Design Arc Midi


my bad. looks like my cheap 912 'cause of the vertical bracket. +rep.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Yuriewitsch*
> 
> Thx guys,
> 
> That version of AB is new beta from:
> 
> http://www.guru3d.com/files-details/msi-afterburner-beta-download.html
> 
> edit:
> 
> 
> 
> 
> 
> 
> 
> Wrong.
> Fractal Design Arc Midi


Nice......

Those graphs look nice, but this is what got me:
Quote:


> Improved Logitech keyboard LCD monitoring module:
> Ported to new Logitech API to provide support for newer Logitech LCD displays
> Added support for color LCD display of Logitech G19/G19s keyboards Guru3D
> Added graph mode support for color LCD display of Logitech G19/G19s keyboards. Now in addition to previously available text mode you can optionally select graph mode and see exact copy of MSI Afterburner's monitoring graphs displayed directly inside the keyboard LCD. You can also press "Menu" soft button on your Logitech G19/G19S keyboard to toggle between text and graph modes dynamically in realtime
> Added acceleration support to LCD scrolling implementation
> Added larger 8x12, 10x12, 12x12 and 12x16 fonts support for text mode


That has made me very happy


----------



## mojobear

Hey guys,

This may be a bit off topic, or on topic for those with tri/quad fire 290(x).

JohnnyGuru reviewed the EVGA G2 1600W PSU....got a very solid score in case anyone is interested.

Super Flower is the OEM and damn its going for $350 on newegg for 1600W gold...pretty sweet.

http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story&reid=391


----------



## Cyber Locc

Quote:


> Originally Posted by *mojobear*
> 
> Hey guys,
> 
> This may be a bit off topic, or on topic for those with tri/quad fire 290(x).
> 
> JohnnyGuru reviewed the EVGA G2 1600W PSU....got a very solid score in case anyone is interested.
> 
> Super Flower is the OEM and damn its going for $350 on newegg for 1600W gold...pretty sweet.
> 
> http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story&reid=391


Lol, I made that same recommendation to someone last night; that's what I'm going to use for my quadfire.

The Lepa 1600 has crashed and burned there: they're claiming that 4 290s aren't supported and pull more than 1600W, their psus are failing, and they're saying it's not their fault, and that if you run those cards you lose your warranty.

personally I think it's because someone else is now making them and doing a poor job, as people are saying that at anything over 1200/1300W the psu will die in a week or less.

also mojo, jonny says it's not gold, it's platinum:




the name says gold (which is understandable if you read about how those ratings work and realize it costs more to certify for a plat rating), but jonny's tests show it as plat.
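On the gold-vs-platinum point: 80 PLUS badges are efficiency floors measured at 20/50/100% load, so a unit sold as Gold can legitimately test at Platinum levels. A small sketch of what the rating means for draw at the wall, using the published 115V thresholds (the quadfire load figure in the example is an assumption):

```python
# 80 PLUS efficiency floors at 115V input, keyed by load fraction
# (20% / 50% / 100% of rated capacity).
EFFICIENCY_FLOOR = {
    "gold":     {0.20: 0.87, 0.50: 0.90, 1.00: 0.87},
    "platinum": {0.20: 0.90, 0.50: 0.92, 1.00: 0.89},
}

def wall_draw_w(dc_load_w, psu_capacity_w, rating):
    """Estimate AC draw at the wall from a DC load, using the
    efficiency floor at the nearest published load point. Real units
    usually beat their floor, so treat this as a worst-case bound."""
    frac = dc_load_w / psu_capacity_w
    curve = EFFICIENCY_FLOOR[rating]
    nearest = min(curve, key=lambda p: abs(p - frac))
    return dc_load_w / curve[nearest]

# Hypothetical quadfire 290 load of ~1300 W DC on a 1600 W unit:
for rating in ("gold", "platinum"):
    print(rating, round(wall_draw_w(1300, 1600, rating)))
```

The couple of percent between Gold and Platinum at full load translates to a few tens of watts at the wall, which matters more for heat and noise than for the electric bill.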


----------



## heroxoot

Quote:


> Originally Posted by *Yuriewitsch*
> 
> Thx guys,
> 
> That version of AB is new beta from:
> 
> http://www.guru3d.com/files-details/msi-afterburner-beta-download.html
> 
> edit:
> 
> 
> 
> 
> 
> 
> 
> Wrong.
> Fractal Design Arc Midi


I guess it went public? These betas are supposed to be private; there is even a private forum on guru3d that requires a password (I got in as a tester). Guess it's public now. That, or he just doesn't want 200 people crying that something isn't working, which is why the forum is private.


----------



## Cyber Locc

Scratch that figured it out


----------



## Paopawdecarabao

Yay! Got a good deal on two EK FC R9 nickel/acetal blocks and two backplates, brand new, box never opened, for $200. A $140 savings; can't wait to put them on water.


----------



## FuriousPop

Quote:


> Originally Posted by *mojobear*
> 
> Hey guys,
> 
> This may be a bit off topic, or on topic for those with tri/quad fire 290(x).
> 
> JohnnyGuru reviewed the EVGA G2 1600W PSU....got a very solid score in case anyone is interested.
> 
> Super Flower is the OEM and damn its going for $350 on newegg for 1600W gold...pretty sweet.
> 
> http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story&reid=391


Think i'll stick to what i got now for my trifire:

http://www.corsair.com/en/ax1500i-digital-atx-power-supply-1500-watt-fully-modular-psu


----------



## kizwan

Quote:


> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Yuriewitsch*
> 
> Thx guys,
> 
> That version of AB is new beta from:
> 
> http://www.guru3d.com/files-details/msi-afterburner-beta-download.html
> 
> edit:
> 
> 
> 
> 
> 
> 
> 
> Wrong.
> Fractal Design Arc Midi
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I guess it went public? These beta are supposed to be private. There is even a private forum on guru3d that requires a password. I got in as a tester. Guess it's public now. That or he just doesn't want 200 people crying that something isn't working, so the forum is private.

It was the beta version that properly supported the 290/290X, and the beta is also the one I used when I first overclocked my 290's. I remember exactly: MSI AB 3.0 beta 17 was downloaded from Guru3D, and MSI AB 3.0 betas 18 & 19 were downloaded directly from the MSI AB official website.


----------



## VSG

So it looks like I may be making my entry back into this club. My girlfriend works on some fancy MRI research projects for tumors and brain disorders, and the result-manipulation software seems to prefer AMD's OpenGL over CUDA, which surprised me. Seeing how the 290/290X cards are still selling cheap, I am going to look for a reference PCB with the reference cooler (the cooler itself is for another little project, plus it seems to be the cheapest option) and watercool it. If any of you see one for sale in the classifieds, please alert me to it, thanks a lot. Kinda makes me feel bad about using 2 power-hungry cards for just gaming/benching when this one will be doing some real good.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *spikezone2004*
> 
> I crashed again with this weird looking screen, anyone know what it means? also got a bsod with atikmpag.sys never figured it out why I got it, I am thinking its all driver issues.


Ahh yes , Technicolour Yawn Syndrome ............

I suggest hitting the Off button , having a good power down and thinking about lowering your mem overclock and /or re-install driver and get back to me in the morning ...........
Quote:


> Originally Posted by *Yuriewitsch*
> 
> Hi,
> I think, this should be enough for my entry to member's list.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm using Accelero Extreme IV + homemade aluminium VRM and memory heatsinks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Those are some sweet heatsinks, nice work

Yep, that's different with the large passive ones on the top








Quote:


> Originally Posted by *kizwan*
> 
> It was the beta version that properly support 290/290X & the beta version also is the one I use when I first overclocked my 290's. I remember exactly the MSI AB 3.0 beta 17 was downloaded from Guru3D & MSI AB 3.0 beta 18 & 19 was downloaded directly from MSI AB official website.


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> It was the beta version that properly support 290/290X & the beta version also is the one I use when I first overclocked my 290's. I remember exactly the MSI AB 3.0 beta 17 was downloaded from Guru3D & MSI AB 3.0 beta 18 & 19 was downloaded directly from MSI AB official website.
Click to expand...


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Ahh yes , Technicolour Yawn Syndrome ............
> 
> 
> I suggest hitting the Off button , having a good power down and thinking about lowering your mem overclock and /or re-install driver and get back to me in the morning ...........
> Yep that's different the large passive ones on the top


Quote:


> Originally Posted by *kizwan*


ME

+

+

=

+

----------



## maynard14

Hey guys. After 7 months of running my XFX reference 290 unlocked to a 290X using the XFX reference 290X BIOS, while playing Crysis 3 or any graphics-intensive game I get a black screen... sometimes it restarts my PC.

Tried:

re-installing AMD driver 14.7 RC3
re-installing the 14.4 driver

and fully uninstalling AMD driver 14.7 RC3 with dd

Can anyone help me?

Is my card faulty?


----------



## shwarz

Quote:


> Originally Posted by *maynard14*
> 
> hey guys,. after 7 months of my unlock xfx reference 290 to 290x using xfx reference 290x bios, while playing crysis 3 or any graphics extensive games i got black screen,.. sometimes it restarts my pc
> 
> tried:
> 
> re installing amd driver 14.7 rc3
> re installed 14.4 driver
> 
> and fully uninstall amd driver 14.7 rc3 with dd
> 
> can any one help me?
> 
> is it my card being faulty?


Might be worthwhile trying a 290 BIOS again, as the extra part you unlocked may be faulting.


----------



## Arizonian

Quote:


> Originally Posted by *Yuriewitsch*
> 
> Hi,
> I think, this should be enough for my entry to member's list.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm using Accelero Extreme IV + homemade aluminium VRM and memory heatsinks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Nice. Congrats - added


----------



## maynard14

Quote:


> Originally Posted by *shwarz*
> 
> might be worthwhile trying a 290 bios again
> as the extra part u unlocked may be faulting


ok sir thank you

will try later









I hope my card still works and wasn't damaged by the unlock.

What if I still get black screens even on the default 290 BIOS?


----------



## Blue Dragon

Quote:


> Originally Posted by *maynard14*
> 
> ok sir thank you
> 
> will try later
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i hope my card still works,,,, or not damage by the unlock
> 
> what if still black screen even in default 290 bios?


My black screens turned out to be power related... try looking here: http://www.overclock.net/t/1441349/290-290x-black-screen-poll
A lot of guys just add a little bit of voltage to fix the black screens, but there are surely other methods worth trying buried in those pages...
Don't know if it helps, but I'm currently running 14.6 with mine.


----------



## maynard14

Quote:


> Originally Posted by *Blue Dragon*
> 
> my black screens turned out to be power related... try looking here- http://www.overclock.net/t/1441349/290-290x-black-screen-poll
> alot of guys just add little bit of voltage to fix the black screens but sure there are other methods worth trying buried in those pages...
> don't know if it helps, but i'm currently running 14.6 with mine.


Thank you again, sir. At first I also thought my power supply was giving up, because I kept getting ASUS power-supply surge warnings. I tried turning that off in the BIOS, but still got black screens; I also tried the original 290 BIOS, still black screens.









So I thought either my card or my PSU was faulty.

I searched the forums for 290/290X black screens and found one solution: it's GPU-Z that's causing my black screens! I don't know why, but it's the cause. I tried the 14.7 RC3 drivers, went back to the 290X BIOS, uninstalled GPU-Z, and played Crysis for 15 minutes with no black screen; also ran the Metro: Last Light built-in benchmark and the Tomb Raider benchmark, still no black screen.

It's kinda weird; why GPU-Z? I'm also using MSI Afterburner, RivaTuner, and HWiNFO to monitor my temps while playing.

Also, is it safe to put VRM heatsinks with thermal pads on top of VRM2, the VRM close to the power socket?


----------



## rdr09

Quote:


> Originally Posted by *maynard14*
> 
> thank you again sir, at first i thought also my power supply is giving up coz i keep getting asus power supply surges. i tried to turn it off on the bios, but still blck screen, also tried original 290 bios still black screen
> 
> 
> 
> 
> 
> 
> 
> 
> 
> so i thought my card is faulty or my psu is faulty,,
> 
> i search the forums for 290/290x black screen and i found one solution, its gpuz thats causing my black screen! i dont know why but it is the one causing my black scree, tried 14.7 rc3 drivers and back to 290x bios un installed gpu z, and played crysis for 15 min no black screen, also tried metro last light build in benchmark also tomb raider benchmark and still no black screen,..
> 
> its kinda weird why *gpu z? im also using msi after burner and riva tuner and hwinfo to monitor* my temps while playing
> 
> also is it safe to put vrm heatsinks with thermal pads on the top of vrm2 , vrm close to the power socket?


combining these is no bueno.


----------



## maynard14

Quote:


> Originally Posted by *rdr09*
> 
> combining these is no bueno.


I see... so that's why... I won't use them all together now. How about the VRM heatsinks on top of VRM2? Eheh


----------



## rdr09

Quote:


> Originally Posted by *maynard14*
> 
> I see...so that why.... I wont use them all together now. How about the vrm heat sinks on top of the vrm 2 ? Eheh


It is safe to use heatsinks on VRMs, so long as you keep them away from exposed circuits to avoid shorts. Check out post #28600, page 2860.

VRM2 is the one close to the bracket and VRM1 is near the power source.

Edit: did you also see that massive heatsink on the backside?

The thermal pads will lift the heatsinks off the PCB, thereby avoiding shorting out circuits. Installation can be tricky; got to be careful or you'll end up with a paperweight. And, yes, this applies to aluminum, too.


----------



## maynard14

Quote:


> Originally Posted by *rdr09*
> 
> it is safe to use heatsinks on vrms, so long as you keep them away from exposed circuits to avoid shorts. check out post #28600; page 2860.
> 
> vrm2 is the one close to the bracket and vrm1 is near the power source.
> 
> edit: did you also see that massive heatsink on the backside?
> 
> the thermal pads will lift the heatsinks off the pcb and thereby avoiding shorting out circuits. installation can be tricky. got to bo careful or you'll end up with a paperweight. and, yes, this applies to aluminum, too.


I see. I highlighted VRM1, where I'm planning to put some thermal pads and aluminum VRM heatsinks,



because at the bottom of VRM2 I already put Gelid VRM2 heatsinks for the 290X.


----------



## rdr09

Quote:


> Originally Posted by *maynard14*
> 
> i see, i highlight the vrm 1 that im planning to put some thermal pad and vrm alluminum heatsinks
> 
> 
> 
> coz at the bottom of vrm 2 i already put gelid vrm2 heatsinks for 290x


Yeah, that's a good idea. You can make the thermal pads wider/longer than the shape of the heatsink, just to be safe.


----------



## Jflisk

Quote:


> Originally Posted by *maynard14*
> 
> ok sir thank you
> 
> will try later
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i hope my card still works,,,, or not damage by the unlock
> 
> what if still black screen even in default 290 bios?


If you're in the US, XFX has excellent RMA service as long as the card is registered to you and you can get the R9 290 BIOS back on; heck, they may even RMA it as-is. Their process takes about 3 weeks between sending and getting a return. Good luck.


----------



## Jflisk

Quote:


> Originally Posted by *maynard14*
> 
> i see, i highlight the vrm 1 that im planning to put some thermal pad and vrm alluminum heatsinks
> 
> 
> 
> coz at the bottom of vrm 2 i already put gelid vrm2 heatsinks for 290x


Don't know what your access to water-cooling parts is like where you're located, but you can try installing a water-cooling backplate on the card. You will probably have to change the screws, but it should go on; costs about $30 US. Link below.

They're out of stock right now, but it gives you the general idea:
http://www.frozencpu.com/products/21680/ex-blc-1569/EK_R9-290X_VGA_Liquid_Cooling_RAM_Backplate_-_Black_EK-FC_R9-290X_Backplate_-_Black.html?tl=g30c309s2073


----------



## bluedevil

Thinking I might ditch my WC'd VisionTek 290 for a Gigabyte Windforce 290. Smart?


----------



## VSG

Quote:


> Originally Posted by *bluedevil*
> 
> Thinking I might ditch my WC'd VisionTek 290 for a Gigabyte Windforce 290. Smart?


If you are going to, please create a classifieds ad in the marketplace for the VisionTek, as I would be interested.


----------



## maynard14

Quote:


> Originally Posted by *rdr09*
> 
> yah, that's a good idea. you can make the thermal pads wider/longer than the shape of the heatsink just to be safe.


Thank you, yes, I will use a long heatsink:



though they are aluminum.

Here's my setup:





Quote:


> Originally Posted by *Jflisk*
> 
> If your in the US XFX has excellent RMA service as long as the card is registered to you and you can get the R9 290 bios back on heck they may even RMA it then too. There process takes about 3 weeks between sending and getting a return. Good luck


No sir, unfortunately I live here in the Philippines so my warranty is void, but I got lucky that I didn't get any more black screens after I uninstalled GPU-Z.
Quote:


> Originally Posted by *Jflisk*
> 
> Don't know what you access to water cooling parts are where your located but you can try to install a water cooling back plate on the card. You will probably have to change the screws but should be able to be installed cost about 30.00 US. Link below
> 
> There out of stock right now but gives you the general Idea
> http://www.frozencpu.com/products/21680/ex-blc-1569/EK_R9-290X_VGA_Liquid_Cooling_RAM_Backplate_-_Black_EK-FC_R9-290X_Backplate_-_Black.html?tl=g30c309s2073


Hi sir, is this the same item?

http://www.tipidpc.com/viewitem.php?iid=30052630

I live here in the Philippines and it's available here, but will it be compatible with the NZXT G10? I'm also planning to upgrade to the Corsair HG10 when Corsair releases it.
I'm also using a Corsair H105 for the GPU core.


----------



## Jflisk

Quote:


> Originally Posted by *maynard14*
> 
> thank you yes i will use a long heatsink:
> 
> 
> 
> though they are alluminum
> 
> heres my set up
> 
> 
> 
> 
> no sir, unfortunately i live here in the Philippines so my warranty is void, but i got lucky that i dint get anymore black screen after i uninstalled gpu z
> hi sir, is this the same item?
> 
> http://www.tipidpc.com/viewitem.php?iid=30052630
> 
> i live here in the philippines, and its available here, but will it be compatible with nzxt g10? im planning also to upgrade to corsair hg10 when corsair release it
> im also using corsair h105 for the gpu core


That would be the backplate









One other thing: I don't know how you attached the heatsinks, but Arctic Silver makes a thermal epoxy (the Arctic Alumina adhesive linked below). The sinks are still removable with 90% alcohol, and they will not fall off.

http://www.frozencpu.com/products/3771/thr-07/Arctic_Alumina_Adhesive_Premium_Ceramic_Thermal_Epoxy_-_5_Gram_set_AATA-5G.html

The card looks good though.


----------



## bobbybluz

http://www.techpowerup.com/gpuz/details.php?id=5a3eq


----------



## maynard14

Quote:


> Originally Posted by *Jflisk*
> 
> That would be the backplate
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One other thing don't know how you attached the heat sinks. But Arctic silver makes thermal epoxy. The sinks are still removable with 90% alcohol and they will not fall off.
> 
> http://www.frozencpu.com/products/3771/thr-07/Arctic_Alumina_Adhesive_Premium_Ceramic_Thermal_Epoxy_-_5_Gram_set_AATA-5G.html
> 
> The card looks good though.


Thank you again, sir. Yeah, I got lucky that it unlocked to a 290X; I really thought I'd messed up the card because of the black screens. Now I'm relaxed, haha.

Oh, I didn't know you could still remove the heatsinks after using Arctic Alumina with 90% alcohol; thanks for the tip.

Yes sir, I will buy the backplate soon, when Corsair releases the HG10 GPU bracket.









Thanks again, OCN, for the help.


----------



## Talon720

I read that stren used the Aquacomputer active backplate and it was compatible with the EK blocks; anyone else tried this? I have EK blocks and backplates with Fujipoly pads, which work fine, though my second card needs redoing. For those who didn't see it, in the extreme.net roundup the active backplate on the EK block definitely helps cool it more. Whether that translates into a more efficient OC, I don't know. I do think the active backplates look sexy though.


----------



## Talon720

I've been thinking a lot about monitors lately. I have an Asus VS-something 23.5" 60 Hz TN, nothing fancy. Also, given my GPU setup, I'm way overpowered. It seems like we're in a very hard time for buying monitors, with the release of G-Sync and then DP 1.3 coming. I wanted to jump on an Overlord Tempest (I really want a 1440p 120/144 Hz) and it sounded good. In the end, I'd feel like I threw my money away when DP 1.3 finally arrives. Switching to Nvidia just to make use of the ROG Swift/G-Sync seems like a huge waste of money. A 1080p 120/144 Hz monitor doesn't seem worth it either. Are any of you in this position?


----------



## Meatdohx

Quote:


> Originally Posted by *Talon720*
> 
> I've been thinking a lot about monitors lately. I have an asus vs something 23.5" 60hz tn nothing fancy. Also given my gpu setup im way overpowerd. It seems like we are in a very hard time for buying monitors with the release of gsync and then dp 1.3. I wanted to jump on an overlord tempest i really want a 1440p 120/144hz and it sounded good. In the end Id feel like i threw my money away when dp 1.3 finally arrives. Switching to nividia just to make use of the rog swift/gsync seems like a huge waste of money. A 1080p 120/144hz monitor dosnt seem worth it either. Are any of you in this postion?


Go Korean Qnix!


----------



## tsm106

Quote:


> Originally Posted by *Meatdohx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Talon720*
> 
> I've been thinking a lot about monitors lately. I have an asus vs something 23.5" 60hz tn nothing fancy. Also given my gpu setup im way overpowerd. It seems like we are in a very hard time for buying monitors with the release of gsync and then dp 1.3. I wanted to jump on an overlord tempest i really want a 1440p 120/144hz and it sounded good. In the end Id feel like i threw my money away when dp 1.3 finally arrives. Switching to nividia just to make use of the rog swift/gsync seems like a huge waste of money. A 1080p 120/144hz monitor dosnt seem worth it either. Are any of you in this postion?
> 
> 
> 
> Go Korean Qnix!
Click to expand...

Wait it out for the next-gen cards? The 1.3 standard will be crystallized by Q4, I assume, since that is the ETA on the next-gen cards. I would not buy any big panels until then. Panels before that time will be known as legacy panels; their value will not hold as the new standard proliferates. Btw, I could never run 60 Hz anymore. Have you tried 120 Hz and up yet?


----------



## Meatdohx

Quote:


> Originally Posted by *tsm106*
> 
> Wait it out for the next gen cards? The 1.3 standard will be crystalized by Q4 I assume since that is the eta on the next gen cards. I would not buy any big panels until then. Panels before that time will be known as legacy panels. Their value will not hold as the new standard proliferates. Btw, I could never run 60hz anymore. Have you tried 120hz and up yet?


I have a QNIX [email protected] atm. 120 Hz gives me green lines.

I love that monitor so much.

Only $279 of pure IPS greatness.

Your R9 290(X) is begging for it. I can feel it.


----------



## ebhsimon

What temps can I expect with two open-air-cooled 290s in CrossFire? I'm already getting high 60s in BF3 with just one Vapor-X 290.


----------



## Paopawdecarabao

Quote:


> Originally Posted by *ebhsimon*
> 
> What temps can I expect with having two open air cooled 290s in crossfire? I'm already getting high 60s in BF3 with just one Vapor-X 290.


I have Sapphire Tri-X R9 290s in CrossFire; the top card usually gets hot, max 76°C playing BF4.

When I benchmark, it usually goes to 83°C.


----------



## spikezone2004

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Ahh yes , Technicolour Yawn Syndrome ............
> 
> I suggest hitting the Off button , having a good power down and thinking about lowering your mem overclock and /or re-install driver and get back to me in the morning ...........


I am not overclocked at all; I've only had it in my system a week and haven't messed with overclocking yet. I think it might be driver related. Going to uninstall, run the driver uninstaller a couple of times, then re-install.


----------



## tsm106

Quote:


> Originally Posted by *cyberlocc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mojobear*
> 
> Hey guys,
> 
> This may be a bit off topic, or on topic for those with tri/quad fire 290(x).
> 
> JohnnyGuru reviewed the EVGA G2 1600W PSU....got a very solid score in case anyone is interested.
> 
> Super Flower is the OEM and damn its going for $350 on newegg for 1600W gold...pretty sweet.
> 
> http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story&reid=391
> 
> 
> 
> Lol, I made that same recommendation last night to someone; that's what I'm going to use for my quadfire.
> 
> *Lepa 1600 has crashed and burned there, claiming that 4 290s aren't supported and pull more than 1600 W*, and their PSUs are failing; they're saying it's not their fault, and if you run those cards you lose your warranty.
> 
> Personally, I think it's the fact that someone else is now making them and doing a poor job, as people are saying anything over 1200/1300 W and the PSU will die in a week or less.
> 
> Also, mojo, Johnny says it's not gold, it's platinum.
> 
> The name says gold (which is understandable if you read about how those ratings work and realize it costs more to get a plat rating), but the tests by Johnny show it as plat.
Click to expand...

If you read the fine print on the rail amperage, there's only enough power for 360 or so odd watts per rail. And then, with its multi-rail setup, you have to share one or two of those rails with the rest of the system. Add in overclocking and bam, you have a problem.

I've written about the Lepa's shortcomings in this thread, like last year.

Btw, I'm not sure about using a 1600 watt single-rail PSU though. It's a double-edged sword: yeah, you have the power, but you also have the power, in a worst-case scenario, to fry everything in your rig.
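A back-of-the-envelope sketch of the per-rail math being described. The 30 A rail rating is an illustrative assumption (read the actual amperage off your own PSU's label), and ~300 W is a rough figure for an overclocked 290X:

```python
# Rough per-rail power budget for a multi-rail PSU (illustrative numbers only).
RAIL_VOLTAGE = 12.0   # volts on a PCIe 12 V rail
RAIL_AMPS = 30.0      # assumed amps per rail; check your PSU label

watts_per_rail = RAIL_VOLTAGE * RAIL_AMPS
print(f"Per-rail budget: {watts_per_rail:.0f} W")  # 360 W

# Two overclocked cards sharing a single rail blow past that budget.
CARD_DRAW_OC = 300.0  # rough draw of one OC'd 290X, watts
two_cards = 2 * CARD_DRAW_OC
print(f"Two OC'd cards on one rail: {two_cards:.0f} W ->",
      "over budget" if two_cards > watts_per_rail else "ok")
```

Which is why a card per rail (or a single-rail unit with enough headroom) matters more than the big number on the box.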


----------



## Cyber Locc

Quote:


> Originally Posted by *tsm106*
> 
> If you read the fine print on the rail amperage, there's only enough power for 360 or so odd watts per rail. ANd then with its multi rail setup, you have to share the system with one or two of those rails. Add in overclocking and bam you have a problem.
> 
> I've written about the Lepa's shortcoming sin this thread like last year.
> 
> Btw, I'm not sure about using a 1600 watt single rail psu though. It's a double edged sword. Yea you have the power but you also have the power in a worst case scenario to fry everything in your rig.


Ahh, that would make sense. Yes, very true about frying everything; that makes me want to spend the extra for the Corsair. I had one of their PSUs blow some drives and they replaced them, so they definitely stand up with RMAs. Took a while, but got it done.


----------



## Talon720

Quote:


> Originally Posted by *tsm106*
> 
> Wait it out for the next gen cards? The 1.3 standard will be crystalized by Q4 I assume since that is the eta on the next gen cards. I would not buy any big panels until then. Panels before that time will be known as legacy panels. Their value will not hold as the new standard proliferates. Btw, I could never run 60hz anymore. Have you tried 120hz and up yet?


Yea, I was kinda thinking I should wait, but no, I have never run 120 Hz or up. The highest I was able to run was 65 Hz, and that doesn't count.


----------



## Darklyric

Quote:


> Originally Posted by *tsm106*
> 
> Wait it out for the next gen cards? The 1.3 standard will be crystalized by Q4 I assume since that is the eta on the next gen cards. I would not buy any big panels until then. Panels before that time will be known as legacy panels. Their value will not hold as the new standard proliferates. Btw, I could never run 60hz anymore. Have you tried 120hz and up yet?


I am in virtually the same boat and was almost going to pull the trigger... damn, must resist 120+ Hz... Well, guess I'll focus on watercooling for the moment.

Thought I'd post an idea of how I would be water-cooling a pair of reference XFX 290s, for you guys to pick apart. Any constructive criticism would be great. And yes, I'll be overclocking the CPU and GPUs. Also, this is my first custom loop.









First, I'm going to ditch my current case and pick up an NZXT H440, since it's still on sale for $99 @ Newegg.

2x 360 mm rads, one in front and one on top; unknown thickness atm. Have to wait until the case arrives and I get it torn apart.

Pump and res are undecided.

Blocks/backplates: I'm highly considering Aquacomputer and some passive backplates, but are the backplates readily available?

CPU block is undecided, but it'll be for a 4770K.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Talon720*
> 
> I've been thinking a lot about monitors lately. I have an asus vs something 23.5" 60hz tn nothing fancy. Also given my gpu setup im way overpowerd. It seems like we are in a very hard time for buying monitors with the release of gsync and then dp 1.3. I wanted to jump on an overlord tempest i really want a 1440p 120/144hz and it sounded good. In the end Id feel like i threw my money away when dp 1.3 finally arrives. Switching to nividia just to make use of the rog swift/gsync seems like a huge waste of money. A 1080p 120/144hz monitor dosnt seem worth it either. Are any of you in this postion?
> Quote:
> 
> 
> 
> Originally Posted by *Meatdohx*
> 
> Go Korean Qnix!
Click to expand...

Yeah I scored a 30" 1600p 310 Yamakasi Sparta .......... they all have funny names LoooooooooL
for $400AU sent only has dvi







DP would have been nice but looks and runs quite well
Quote:


> Originally Posted by *tsm106*
> 
> If you read the fine print on the rail amperage, there's only enough power for 360 or so odd watts per rail. ANd then with its multi rail setup, you have to share the system with one or two of those rails. Add in overclocking and bam you have a problem.
> 
> I've written about the Lepa's shortcoming sin this thread like last year.
> 
> Btw, I'm not sure about using a 1600 watt single rail psu though. It's a double edged sword. Yea you have the power but you also have the power in a worst case scenario to fry everything in your rig.


Glad I stuck with what I gots








Quote:


> Originally Posted by *spikezone2004*
> 
> I am not overclocked at all, only had it in my system a week havent messed with the overclock yet, I think it might be driver related going to uninstall and run driver uninstaller couple times then re install


Damn drivers cause me more grief than anything else with these red things


----------



## chronicfx

Thoughts, guys? Will I see any improvement with my three 290Xs going from my current setup of an i5 [email protected] and a Gigabyte Z77-UD5H motherboard, which runs at [email protected]/x4/x4, to a 4790K @ who-knows GHz and a Gigabyte Gaming GT motherboard, which has a PLX chip and runs [email protected]/x8/x8? I am thinking Dragon Age, Far Cry 4, Witcher 3, GTA V as my next endeavors. I am debating the 5820K if it can actually properly support three cards at x8 instead of that x16/x8/x4 garbage they are spouting, but to be honest, having to re-buy RAM is a turn-off and will end up being at least $150 more than the Z97 route. FYI: I am gaming at 1440p, already own the cards and am not selling one; three is the number and the number three it shall stay.


----------



## kizwan

Quote:


> Originally Posted by *chronicfx*
> 
> Thoughts guys? Will I see any improvement from my three 290x going from my current setup of i5 [email protected] and gigabyte z77-UD5h motherboard which runs at [email protected]/x4/x4 to a 4790k @ who knows ghz and a gigabyte gaming gt motherboard which has a PLX chip that runs [email protected]/x8/x8. I am thinking dragon age, far cry4, witcher 3, gtaV as my next endeavors. I am debating the 5820k if it can actually properly support three cards in x8 instead of that x16/x8/x4 garbage they are spouting but to be honest having to rebuy ram is a turn-off and will end up being at least $150 more than the z97 route. FYI: I am gaming at 1440p, already own the cards and not selling one, three is the number and the number three it shall stay


The 5820K is Haswell-E; you need an X99 chipset motherboard. You can run x8/x8/x8 with the 5820K. I read somewhere that the 5820K's on-die PCIe 3.0 root complex has fewer lanes though, so you can't do x16/x8/x8 like SB-E & IVB-E (3820 & 4820K).

I read in a forum somewhere that some people have problems when running multi-GPU Hawaii @ PCIe 3.0 x4 in BF4. If you experience this problem, getting the cards to run at x8 should fix it. I don't know whether the problem is user error or what, because I didn't investigate further. It can depend on the program you're going to use, though.
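For anyone counting lanes, the allocation question boils down to simple addition: a slot layout fits a CPU without a PLX switch only if the requested slot widths sum to no more than the CPU's lane count (28 on the 5820K, 40 on the 40-lane HEDT chips, 16 on mainstream quads). A toy sketch:

```python
# Toy PCIe lane budgeting: a slot layout fits without a PLX switch
# only if the requested slot widths sum to <= the CPU's lane count.
def fits(cpu_lanes, slot_widths):
    """Return True if the CPU can feed every slot at the asked width."""
    return sum(slot_widths) <= cpu_lanes

print(fits(28, [8, 8, 8]))    # True:  5820K can do trifire x8/x8/x8
print(fits(28, [16, 8, 8]))   # False: x16/x8/x8 wants 32 of 28 lanes
print(fits(40, [16, 8, 8]))   # True:  40-lane SB-E/IVB-E chips manage it
print(fits(16, [8, 8, 8]))    # False: mainstream needs a PLX chip
```

Real boards also constrain which widths each physical slot can negotiate, so this is only the upper-bound arithmetic, not a compatibility guarantee.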


----------



## chronicfx

Quote:


> Originally Posted by *kizwan*
> 
> 5820k is Haswell-E. You need x99 chipset motherboard. You can run x8/x8/x8 with 5820k. I read somewhere that 5820k on-die PCIe 3.0 root complex will have fewer lanes though, so you can't do x16/x8/x8 like SB-E & IVY-E; 3820 & 4820k.
> 
> I read in a forum somewhere that some people having problem when running (multi-GPU) Hawaii @PCIe 3.0 x4 in BF4. If you experiencing this problem, getting them to run at x8 should fixed the problem. I don't know whether the problem is user error or what because I didn't investigate further. I can say depends on the program you're going to use though.


Yes, the 5820K will have 28 PCIe lanes instead of the usual 40 for the HEDT processors. If the motherboard manufacturers can make the 5820K run trifire at x8 for all three slots and the last slot at x4, then it will be a very nice gaming chip for the next several years. What I was saying about the setups: if I went Z97, I would get a 4790K for $339 (I am paying in bitcoin, so I will have to take my lumps at Newegg even though Micro Center is down the street) and a Gigabyte Gaming GT for $223 with a $30 rebate. Final total is $603 right now, or 1.2 bitcoin, whichever. My other choice would be to go for X99 and the 5820K, which will be about $400, and a motherboard would be $250 again, but I would also need new RAM, and 16 GB is about $270, so we are looking at near $1000 all said and done. That is pretty hefty, and there had better be some damned good gains if I do that. Anyway, is there any reason to ditch my i5 for an i7 to push a trifire setup? Talk me out of being impulsive.


----------



## Cyber Locc

Quote:


> Originally Posted by *chronicfx*
> 
> Thoughts guys? Will I see any improvement from my three 290x going from my current setup of i5 [email protected] and gigabyte z77-UD5h motherboard which runs at [email protected]/x4/x4 to a 4790k @ who knows ghz and a gigabyte gaming gt motherboard which has a PLX chip that runs [email protected]/x8/x8. I am thinking dragon age, far cry4, witcher 3, gtaV as my next endeavors. I am debating the 5820k if it can actually properly support three cards in x8 instead of that x16/x8/x4 garbage they are spouting but to be honest having to rebuy ram is a turn-off and will end up being at least $150 more than the z97 route. FYI: I am gaming at 1440p, already own the cards and not selling one, three is the number and the number three it shall stay


I'm in the same boat, except I plan on quadfire. I am leaning toward X79; depending on the numbers, the 5820K is a 6-core, so it won't perform as well for gaming; a 4790 will beat it for gaming. Ivy-E has the 4820K though, so you can get 40 lanes for cheaper; then it would be x16/x16/x8 on a quad core, which isn't too far off a 4790K, and you wouldn't have to buy new RAM.

That said, if the single-core performance is strong enough on the 5930K, it may be worth going to X99, just IMO.


----------



## aberrero

I want to be in the club. Reference MSI 290x.

Bought this recently for $320 off eBay, hoping to use it in a CF setup in a few months once they are even cheaper. Hopefully by then I will have a 4k monitor to take advantage of it too.


----------



## kizwan

Quote:


> Originally Posted by *chronicfx*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> 5820k is Haswell-E. You need x99 chipset motherboard. You can run x8/x8/x8 with 5820k. I read somewhere that 5820k on-die PCIe 3.0 root complex will have fewer lanes though, so you can't do x16/x8/x8 like SB-E & IVY-E; 3820 & 4820k.
> 
> I read in a forum somewhere that some people having problem when running (multi-GPU) Hawaii @PCIe 3.0 x4 in BF4. If you experiencing this problem, getting them to run at x8 should fixed the problem. I don't know whether the problem is user error or what because I didn't investigate further. I can say depends on the program you're going to use though.
> 
> 
> 
> 
> 
> 
> 
> Yes the 5820k will have 28 pcie lanes instead of the usual 40 for the HEDT processors. If the motherboard manufacturers can make the 5820k run trifire in x8 for all three slots and the last slot x4 then it will be a very nice gaming chip for the next several years. What I was saying about the setups would be if I went z97 I would get a 4790k for $339 (I am paying in bitcoin so I will have to take my lumps at newegg even though microcenter is down the street) and a gigabyte gaming gt for $223 with a $30 rebate. Final total is $603 right now or 1.2 bitcoin whichever. My other choice would be to go for x99 and the 5820K which will be about $400 and a motherboard would be $250 again but also I would need new ram, and 16gb is about $270 so we are looking at near $1000 all said and done. That is pretty hefty and there better be some damned good gains if I do that. Anyways is there any reason to ditch my i5 for an i7 to push a trifire setup, talk me out of being impulsive
Click to expand...

Me i7 any day!







4790K vs. 5820K: the 5820K with gimped PCIe lanes is a rip-off. However, I'd choose the 5820K because it can run tri-GPU @ x8/x8/x8 without using a PLX.


----------



## Arizonian

Quote:


> Originally Posted by *aberrero*
> 
> I want to be in the club. Reference MSI 290x.
> 
> Bought this recently for $320 off eBay, hoping to use it in a CF setup in a few months once they are even cheaper. Hopefully by then I will have a 4k monitor to take advantage of it too.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## aberrero

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added


Thanks.

Not to nitpick or anything, but it's a 290X and you listed it as a 290.

Anyway, I've been surprised at how quiet the reference cooler is. I was definitely planning on watercooling it, but now I'm reconsidering whether it's really worth it.


----------



## Arizonian

Quote:


> Originally Posted by *aberrero*
> 
> Thanks.
> 
> Not to nitpick or anything, but it's a 290x, but you listed it as a 290.
> 
> Anyway, ive been surprised at how quiet the reference cooler was. I was definitely planning on watercooling it but now I'm reconsidering whether it is really worth it.


Think it was auto-filled. Fixed.


----------



## mus1mus

I just have mine.. And another thread to visit everyday..

http://www.techpowerup.com/gpuz/details.php?id=29wes



Ref MSI On a Gaming Bios


----------



## Sgt Bilko

Quote:


> Originally Posted by *mus1mus*
> 
> I just have mine.. And another thread to visit everyday..
> 
> http://www.techpowerup.com/gpuz/details.php?id=29wes
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Ref MSI On a Gaming Bios


Welcome Mus!

Nice upgrade from the 650, huh?

(better change your sig now)


----------



## mus1mus

Quote:


> Originally Posted by *mus1mus*
> 
> I just have mine.. And another thread to visit everyday..
> 
> http://www.techpowerup.com/gpuz/details.php?id=29wes
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Ref MSI On a Gaming Bios
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Welcome Mus!
> 
> Nice upgrade to the 650 huh?
> 
> 
> 
> 
> 
> 
> 
> (better change your sig now
> 
> 
> 
> 
> 
> 
> 
> )
Click to expand...

Uh huh..

Huge leap Sarge!!
But then, arghh.. Can't enter the BIOS to tweak my system.. So the 650TI-Boost will stay if I needed to OC the FX up until I pick a proper 1440p monitor(s).


----------



## Red1776

Hi gang,

testing/benching the MSI R290X's with the Twin Frozr coolers before getting them wet.


----------



## bond32

Quote:


> Originally Posted by *Red1776*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Hi gang,
> 
> testing/benching the MSI R290X's with the Twin Frozr coolers before getting them wet.


Bet you just think you're SOOOOOOO hot, with your quadfires here and there...

Looks sexy though. Curious to see that performance... Still sporting the UD7, I see?


----------



## Red1776

Quote:


> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Hi gang,
> 
> testing/benching the MSI R290X's with the Twin Frozr coolers before getting them wet.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bet you just think you're SOOOOOOO hot, with your quadfire's here and there...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks sexy though. Curious to see that performance... Still sporting the UD7 I see?
Click to expand...

 hehe,

Yeah, the UD7 is still the only one to support quad natively. I played with a CVF-Z, but you have to deal with a passive riser and a bit of performance loss on the x4 PCIe slot. The revised UD7 has been really great to me though, so I am not apt to complain.

Quote:


> Bet you just think you're SOOOOOOO hot


was that a double entendre?


----------



## joeh4384

Just need to quadfire 290x's instead of getting a space heater.


----------



## joeh4384

Quote:


> Originally Posted by *bond32*
> 
> Bet you just think you're SOOOOOOO hot, with your quadfire's here and there...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks sexy though. Curious to see that performance... Still sporting the UD7 I see?


I can imagine they would be very hot and throttle before going under water.


----------



## Descadent

anybody getting the asus swift pg278q even knowing amd obviously can't use gsync or ulmb?


----------



## Cyber Locc

descadent, that monitor is severely overpriced for what it is; even if I had Nvidia I wouldn't buy it lol. Plus FreeSync is coming for all screens, yes please


----------



## Red1776

Quote:


> Originally Posted by *joeh4384*
> 
> Just need to quadfire 290x's instead of getting a space heater.


Quote:


> Originally Posted by *joeh4384*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Bet you just think you're SOOOOOOO hot, with your quadfire's here and there...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks sexy though. Curious to see that performance... Still sporting the UD7 I see?
> 
> 
> 
> I can imagine they would be very hot and throttle before going under water.
Click to expand...

Have been running them and they're not bad at all. Warmest card under game load is 84C, no throttling.


----------



## heroxoot

Quote:


> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *joeh4384*
> 
> Just need to quadfire 290x's instead of getting a space heater.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *joeh4384*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Bet you just think you're SOOOOOOO hot, with your quadfire's here and there...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks sexy though. Curious to see that performance... Still sporting the UD7 I see?
> 
> Click to expand...
> 
> I can imagine they would be very hot and throttle before going under water.
> 
> Click to expand...
> 
> have been running them and not bad at all. warmest card under game load 84C no throttle.
Click to expand...

I bet they have messy thermal paste. Mine was doing 85-90C load until I cleaned up the thermal paste. Now it stays under 80C at full load in benchmarks.


----------



## ForNever

what TIM did you use?


----------



## failwheeldrive

Quote:


> Originally Posted by *cyberlocc*
> 
> descadent that monitor is severly overpriced for what it is even if I had nvidia I wouldnt buy it lol. plus freesync is coming for all screens yes please


FreeSync won't be coming to all monitors, or even all monitors with DP 1.2. Not to mention it won't be out for ages lol.


----------



## afokke

is a 1000W 80 plus gold power supply enough to power three R9 290's, or do I need to step it up to 1200W


----------



## pkrexer

Quote:


> Originally Posted by *afokke*
> 
> is a 1000W 80 plus gold power supply enough to power three R9 290's, or do I need to step it up to 1200W


1200w. You're asking for issues if you try to run it on a 1000w.
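For a rough sanity check on why 1000W is cutting it close, here is a back-of-the-envelope power budget. The wattage figures below are ballpark assumptions I've chosen for illustration, not measured values for any specific build:

```python
# Hypothetical power-budget sketch for a triple R9 290 setup.
# All wattage figures are rough assumptions, not measurements.
GPU_W = 275        # approx. peak board power per reference R9 290
CPU_W = 140        # overclocked quad-core estimate
REST_W = 75        # motherboard, drives, fans, pump
HEADROOM = 0.80    # keep sustained load at/below ~80% of PSU rating

load = 3 * GPU_W + CPU_W + REST_W
print(f"Estimated sustained load: {load} W")            # 1040 W
print(f"Suggested PSU rating: {load / HEADROOM:.0f} W")  # 1300 W
```

With those assumptions the estimated load alone already exceeds 1000W once you want comfortable headroom, which matches the 1200W recommendation above.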


----------



## Cyber Locc

Quote:


> Originally Posted by *failwheeldrive*
> 
> Freesync won't be coming out for all monitors, or even all monitors with DP1.3. Not to mention it won't be out for ages lol.


Link to proof of this? And yes, I meant all DP 1.2a monitors; it's not a monitor-specific thing. They said it will work on all monitors that have DP 1.2a, so if you have proof otherwise, I would love to read it.

Here's a link proving you're wrong: http://support.amd.com/en-us/search/faq/216 This shows the requirements as an Adaptive-Sync display and a compatible AMD card.

This link shows VESA has made Adaptive-Sync a standard. So yes, not all monitors, that was my bad, I meant all with DP 1.2a and above, and this is fact: http://www.anandtech.com/show/8008/vesa-adds-adaptivesync-to-displayport-12a-standard-variable-refresh-monitors-move-forward

Regardless, most monitors or all, it's still more than one, and that one is severely overpriced.

And IMO I wouldn't call Q4/14 or Q1/15 "for ages", but I guess that's a relative term.


----------



## Red1776

Quote:


> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *joeh4384*
> 
> Just need to quadfire 290x's instead of getting a space heater.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *joeh4384*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Bet you just think you're SOOOOOOO hot, with your quadfire's here and there...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks sexy though. Curious to see that performance... Still sporting the UD7 I see?
> 
> Click to expand...
> 
> I can imagine they would be very hot and throttle before going under water.
> 
> Click to expand...
> 
> have been running them and not bad at all. warmest card under game load 84C no throttle.
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> I bet they have messy thermal paste. Mine was doing 85 - 90c load and I cleaned up the thermal. Now they go under 80c in full loads for benchmarks.
Click to expand...

Were you running quads when you got those temps, Hex? Cuz that's pretty good.

Nope, this is bone stock. I am replacing the TIM (which I am sure is a mess, as you pointed out) with IC Diamond.

Been using it since it came out on CPUs/GPUs; best I have used, and I have tried all the big names.


----------



## tsm106

Quote:


> Originally Posted by *cyberlocc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *failwheeldrive*
> 
> Freesync won't be coming out for all monitors, or even all monitors with DP1.3. Not to mention it won't be out for ages lol.
> 
> 
> 
> Link to proof of this and yes i meant all DP 1.2a's its not a monitor specific thing they said it will work on all monitors that have dp 1.2a so you have proof of different i would love to read it.
> 
> Heres a link for you proving your wrong .http://support.amd.com/en-us/search/faq/216 This link shows the requirements as adaptive sync and a amd compatible card.
> 
> This link shows vesa has made adaptive sync a standard so yes not all monitors that was my bad I mena ll with dp1.2a and above and this is fact http://www.anandtech.com/show/8008/vesa-adds-adaptivesync-to-displayport-12a-standard-variable-refresh-monitors-move-forward
> 
> Regardless most monitors or all its still more than 1 that is severally overpriced.
Click to expand...

AMD/ATI tend to push open standards. Nvidia charges you for them.

There's quite a list of standards pushed forth by ATI back in the day. Off the top of my head... video card related the biggest one is GDDR memory, which ATI designed and pushed JEDEC to make a standard. And so too will freesync become a norm in time.


----------



## Cyber Locc

Quote:


> Originally Posted by *tsm106*
> 
> AMD/ATI tend to push open standards. Nvidia, charges you for them.
> 
> There's quite a list of standards pushed forth by ATI back in the day. Off the top of my head... video card related the biggest one is GDDR memory, which ATI designed and pushed JEDEC to make a standard. And so too will freesync become a norm in time.


This is so true, and I personally am far from biased; I have always bought Nvidia GPUs. This 290 is my first AMD since, I don't even remember when (looks down at the Quadro 4 lying on my desk). But Nvidia has gotten too greedy and too proprietary; they're the Apple of PCs. They need to be taught some humility, knock it off, and start helping the community as a whole, not just their own pockets. They can still make a profit without these shady business practices.

But hey, they made G-Sync for the people who will buy into their gimmick and pretend that no one else can do it or make it realistically priced. Until someone does, like AMD is doing, Nvidia is laughing all the way to the bank with your money.

I do wish that EVGA would swap to AMD as well as Nvidia, as they're my favorite for GPUs, but such is life.


----------



## failwheeldrive

Quote:


> Originally Posted by *cyberlocc*
> 
> Link to proof of this and yes i meant all DP 1.2a's its not a monitor specific thing they said it will work on all monitors that have dp 1.2a so you have proof of different i would love to read it.
> 
> Heres a link for you proving your wrong .http://support.amd.com/en-us/search/faq/216 This link shows the requirements as adaptive sync and a amd compatible card.
> 
> This link shows vesa has made adaptive sync a standard so yes not all monitors that was my bad I mena ll with dp1.2a and above and this is fact http://www.anandtech.com/show/8008/vesa-adds-adaptivesync-to-displayport-12a-standard-variable-refresh-monitors-move-forward
> 
> Regardless most monitors or all its still more than 1 that is severally overpriced.
> 
> And IMO I wouldnt call Q4/14 or Q1/15 "For Ages" But i guess that's a relative term


It says in that FAQ that AMD can't guarantee FreeSync support on all DP monitors. And not all monitors will support DP either, so you can't say it will be a free feature on every future monitor... that just won't be the case. I'm looking for a link to something I watched a month or two ago about the VESA standard and why it won't be guaranteed that every DP monitor will support FreeSync... I'll edit this post with a link when I find it.
Quote:


> Originally Posted by *tsm106*
> 
> AMD/ATI tend to push open standards. Nvidia, charges you for them.
> 
> There's quite a list of standards pushed forth by ATI back in the day. Off the top of my head... video card related the biggest one is GDDR memory, which ATI designed and pushed JEDEC to make a standard. And so too will freesync become a norm in time.


Nvidia is the one pushing the standard in this case; without G-Sync, AMD wouldn't be developing FreeSync. I do agree it's great that AMD is pushing for free, industry-wide support for this technology, but Nvidia was still the one to bring the issue to light and introduce a solution for it. G-Sync is available now; FreeSync is at the very least months away.


----------



## tsm106

Quote:


> Originally Posted by *Meatdohx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Wait it out for the next gen cards? The 1.3 standard will be crystalized by Q4 I assume since that is the eta on the next gen cards. I would not buy any big panels until then. Panels before that time will be known as legacy panels. Their value will not hold as the new standard proliferates. Btw, I could never run 60hz anymore. Have you tried 120hz and up yet?
> 
> 
> 
> I have a QNIX [email protected] atm. 120hz give me green lines.
> 
> I love that monitor soo much,
> 
> Only 279$ of pure IPS greatness.
> 
> Your r9 290(x) is begging for it. I can feel it
Click to expand...

Been there. However, running three OC'd panels in Eyefinity is not for the faint of heart, i.e. it's a big PITA. That said, I am rather copacetic at the moment with triple 1080p at 144Hz, all on DP, no screen tearing, no syncing issues, all goodness.









Quote:


> Originally Posted by *failwheeldrive*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> AMD/ATI tend to push open standards. Nvidia, charges you for them.
> 
> There's quite a list of standards pushed forth by ATI back in the day. Off the top of my head... video card related the biggest one is GDDR memory, which ATI designed and pushed JEDEC to make a standard. And so too will freesync become a norm in time.
> 
> 
> 
> *Nvidia is the one that's pushing the standard in this case*, without gsync AMD wouldn't be developing freesync. I do agree that it's great that AMD is working on pushing for free industry wide support for this technology, but Nvidia was still the one to bring the issue to light and introduce a solution for it. Gsync is available now, freesync is at the very least months away.
Click to expand...

You seem to not know the difference between a standard and something proprietary. Nvidia had to make a box because their cards cannot enable the integrated tech, i.e. what was already in there. And as written before, by the time Q4 rolls around we will maybe see some next-gen cards; I would suspect, as written before, some panels coming to market to match on the AMD side.


----------



## heroxoot

Quote:


> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *joeh4384*
> 
> Just need to quadfire 290x's instead of getting a space heater.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *joeh4384*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Bet you just think you're SOOOOOOO hot, with your quadfire's here and there...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks sexy though. Curious to see that performance... Still sporting the UD7 I see?
> 
> Click to expand...
> 
> I can imagine they would be very hot and throttle before going under water.
> 
> Click to expand...
> 
> have been running them and not bad at all. warmest card under game load 84C no throttle.
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> I bet they have messy thermal paste. Mine was doing 85 - 90c load and I cleaned up the thermal. Now they go under 80c in full loads for benchmarks.
> 
> Click to expand...
> 
> Were you running quads when you got those temps Hex? cuz that's pretty good.
> Nope this bone stock. I am replacing the TIM (which I am sure is a mess as you pointed out) with IC Diamond
> Been using it since it came out on CPU/GPU's. best I have used and I have tried all the big names.
Click to expand...

Just a single card. In my experience cards don't run as hot when they are in CFX. Maybe four just balls up the heat though. I'd still take a look under the hood; MSI is bad at thermal paste jobs.


----------



## Red1776

Quote:


> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *joeh4384*
> 
> Just need to quadfire 290x's instead of getting a space heater.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *joeh4384*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Bet you just think you're SOOOOOOO hot, with your quadfire's here and there...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks sexy though. Curious to see that performance... Still sporting the UD7 I see?
> 
> Click to expand...
> 
> I can imagine they would be very hot and throttle before going under water.
> 
> Click to expand...
> 
> have been running them and not bad at all. warmest card under game load 84C no throttle.
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> I bet they have messy thermal paste. Mine was doing 85 - 90c load and I cleaned up the thermal. Now they go under 80c in full loads for benchmarks.
> 
> Click to expand...
> 
> Were you running quads when you got those temps Hex? cuz that's pretty good.
> Nope this bone stock. I am replacing the TIM (which I am sure is a mess as you pointed out) with IC Diamond
> Been using it since it came out on CPU/GPU's. best I have used and I have tried all the big names.
> 
> Click to expand...
> 
> Just a single card. In my experience cards don't run as hot when they are in CFX. Maybe 4 just balls up the heat tho. I'd still take a look under the hood. MSI is bad with doing a thermal job.
Click to expand...

I am taking pics of the stock cooler after removal; I always replace the stock TIM. Running a single MSI Twin Frozr was a nice cool 67C @ 22C ambient. So I like the Twin Frozrs, they do a great job, but I will never understand the way any GPU vendor applies TIM. It's a heat dam waiting to happen.

I will post the Twin Frozrs removed here and in my AMD HPP thread. I think, like you, that it will warrant replacement.


----------



## Cyber Locc

Quote:


> Originally Posted by *Red1776*
> 
> I am taking pics of the stock cooler after removal, I always replace the stock TIM, but running a single MSI Twin Frozr was a nice cool 67C @ 22C ambient. So I like the Twin Frozr's they do a great job, but I will never understand the way and GPU vendor applies TIM. Its a heat dam waiting to happen.
> I will post the Twin Frozrs removed here and in my AMD HPP thread. I think as you that it will warrant replacement.


They give you TIM with the coolers? That's cool of them. I have been meaning to try MSI cards; I'm in a whole new world now without EVGA, so maybe I'll give MSI a shot.


----------



## Talon720

Quote:


> Originally Posted by *kizwan*
> 
> 5820k is Haswell-E. You need x99 chipset motherboard. You can run x8/x8/x8 with 5820k. I read somewhere that 5820k on-die PCIe 3.0 root complex will have fewer lanes though, so you can't do x16/x8/x8 like SB-E & IVY-E; 3820 & 4820k.
> 
> I read in a forum somewhere that some people having problem when running (multi-GPU) Hawaii @PCIe 3.0 x4 in BF4. If you experiencing this problem, getting them to run at x8 should fixed the problem. I don't know whether the problem is user error or what because I didn't investigate further. I can say depends on the program you're going to use though.


I'm currently running 8x/4x/4x and have had no issues with BF4 running DX11 or Mantle on Win 7. Also, this is with a 1080p monitor, so maybe that makes the difference. The only issues I've had have been with BF4; the new patch started some issues, a couple of random FPS drops and a memory leak in Mantle. Other than those two issues it runs great. It would make me feel all warm and fuzzy to have Haswell-E with more bandwidth, just because.


----------



## heroxoot

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *joeh4384*
> 
> Just need to quadfire 290x's instead of getting a space heater.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *joeh4384*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Bet you just think you're SOOOOOOO hot, with your quadfire's here and there...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks sexy though. Curious to see that performance... Still sporting the UD7 I see?
> 
> Click to expand...
> 
> I can imagine they would be very hot and throttle before going under water.
> 
> Click to expand...
> 
> have been running them and not bad at all. warmest card under game load 84C no throttle.
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> I bet they have messy thermal paste. Mine was doing 85 - 90c load and I cleaned up the thermal. Now they go under 80c in full loads for benchmarks.
> 
> Click to expand...
> 
> Were you running quads when you got those temps Hex? cuz that's pretty good.
> 
> Nope this bone stock. I am replacing the TIM (which I am sure is a mess as you pointed out) with IC Diamond
> 
> Been using it since it came out on CPU/GPU's. best I have used and I have tried all the big names.
> 
> Click to expand...
> 
> Just a single card. In my experience cards don't run as hot when they are in CFX. Maybe 4 just balls up the heat tho. I'd still take a look under the hood. MSI is bad with doing a thermal job.
> 
> Click to expand...
> 
> I am taking pics of the stock cooler after removal, I always replace the stock TIM, but running a single MSI Twin Frozr was a nice cool 67C @ 22C ambient. So I like the Twin Frozr's they do a great job, but I will never understand the way and GPU vendor applies TIM. Its a heat dam waiting to happen.
> I will post the Twin Frozrs removed here and in my AMD HPP thread. I think as you that it will warrant replacement.
Click to expand...





Single monitor? I get 76C on mine in Heaven, 73ish in games that actually push the card. Every other game can't even get it to 60C.


----------



## Cyber Locc

Quote:


> Originally Posted by *Talon720*
> 
> Im currently running 8x/4x/4x and have had no issues with bf4 running dx11 or mantle on win 7. Also this is with a 1080p monitor so maybe that makes the difference. The only issues ive had have veen with bf4 and the new patch started some issues a couple random fps drops and in mantle mem leak. Other than those 2 issues it runs great. It would make me feel all warm and fuzzy to have haswell-e with more bandwidth just because.


Have you seen any issues with Mantle and Crossfire? I had read there would be issues, or that it didn't support it, but I haven't bothered looking lately. That is probably not true, as I think I read it in a comment on AnandTech, so not a very reliable source; but since you're here, what's your take on it?

No worries, I'm also done. However, regarding the other attack about my PPC rage thread: they updated the site, which should have been done previously, and another person contacted me, so that whole thing is resolved. In essence my "rage thread" achieved its goals to the fullest. Just saying.

Yes, I don't have the greatest grammar; unreadable it is not, however.

Apologies for derailing the thread and for my horrid grammar.


----------



## Red1776

Quote:


> Originally Posted by *heroxoot*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *joeh4384*
> 
> Just need to quadfire 290x's instead of getting a space heater.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *joeh4384*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Bet you just think you're SOOOOOOO hot, with your quadfire's here and there...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks sexy though. Curious to see that performance... Still sporting the UD7 I see?
> 
> Click to expand...
> 
> I can imagine they would be very hot and throttle before going under water.
> 
> Click to expand...
> 
> have been running them and not bad at all. warmest card under game load 84C no throttle.
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> I bet they have messy thermal paste. Mine was doing 85 - 90c load and I cleaned up the thermal. Now they go under 80c in full loads for benchmarks.
> 
> Click to expand...
> 
> Were you running quads when you got those temps Hex? cuz that's pretty good.
> 
> Nope this bone stock. I am replacing the TIM (which I am sure is a mess as you pointed out) with IC Diamond
> 
> Been using it since it came out on CPU/GPU's. best I have used and I have tried all the big names.
> 
> Click to expand...
> 
> Just a single card. In my experience cards don't run as hot when they are in CFX. Maybe 4 just balls up the heat tho. I'd still take a look under the hood. MSI is bad with doing a thermal job.
> 
> Click to expand...
> 
> I am taking pics of the stock cooler after removal, I always replace the stock TIM, but running a single MSI Twin Frozr was a nice cool 67C @ 22C ambient. So I like the Twin Frozr's they do a great job, but I will never understand the way and GPU vendor applies TIM. Its a heat dam waiting to happen.
> I will post the Twin Frozrs removed here and in my AMD HPP thread. I think as you that it will warrant replacement.
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> 
> 
> 
> 
> Single monitor? I get 76c on mine in heaven. 73ish in games that actually push the card. Every other game it can't even get 60c.
Click to expand...

In Heaven 4.0 and most benchmarks, yes, single monitor. In games, 5760x1080.

Could be ambient temps or the card I tried in single-GPU mode. Removing the cooler may tell the tale.


----------



## Lou HM

Hi Everyone,
Is anyone else's R9 290X Crossfire not using the 2nd GPU under Furmark and/or MSI Kombustor?
Can you offer any tips to solve the issue?
Thanks in advance.

EDIT: I've been getting freezes/crashes with sound still playing in the background since upgrading to 4K resolution...


----------



## chronicfx

Quote:


> Originally Posted by *Lou HM*
> 
> Hi Everyone,
> Is anyone else R9 290X Crossfire not using 2nd GPU under Furmark and/or MSI Kumbustor ?
> and can you offer any tips to solved it issue ?
> Thanks in Advance
> 
> EDIT: been getting Freeze/Crash with sound on the background since upgrade to 4K resolution...


Yeah mine did that too. Not sure why


----------



## Sgt Bilko

Quote:


> Originally Posted by *Lou HM*
> 
> Hi Everyone,
> Is anyone else R9 290X Crossfire not using 2nd GPU under Furmark and/or MSI Kumbustor ?
> and can you offer any tips to solved it issue ?
> Thanks in Advance
> 
> EDIT: been getting Freeze/Crash with sound on the background since upgrade to 4K resolution...


Are you running them in fullscreen?


----------



## Lou HM

I am running full screen; I also tried renaming the .exe file.


----------



## chronicfx

Quote:


> Originally Posted by *kizwan*
> 
> 5820k is Haswell-E. You need x99 chipset motherboard. You can run x8/x8/x8 with 5820k. I read somewhere that 5820k on-die PCIe 3.0 root complex will have fewer lanes though, so you can't do x16/x8/x8 like SB-E & IVY-E; 3820 & 4820k.
> 
> I read in a forum somewhere that some people having problem when running (multi-GPU) Hawaii @PCIe 3.0 x4 in BF4. If you experiencing this problem, getting them to run at x8 should fixed the problem. I don't know whether the problem is user error or what because I didn't investigate further. I can say depends on the program you're going to use though.


Check this out.

http://www.tomshardware.com/news/haswell-e-i7-5820k-pci-express,27507.html

Have you found anything that says x8/x8/x8 will be a reality, or were you just adding it up?

If I need to shell out $600 just for a CPU (5930K), then I will have to go Z97 with a 4790K and PLX, no question... Between spending $1000+ on a 5930K/mobo/RAM and getting a 4790K with a Gaming GT plus a PS4, I would choose the latter. Either way, I'm not spending $1000 on a CPU upgrade.


----------



## Cyber Locc

Quote:


> Originally Posted by *chronicfx*
> 
> Check this out.
> 
> http://www.tomshardware.com/news/haswell-e-i7-5820k-pci-express,27507.html
> 
> Have you found anything that says that x8/x8/x8 will be a reality or were you just adding it up?
> 
> If I need to shell out $600 just for a cpu (5930k) then I will have to go Z97 4970k and PLX there is no question... Between spending $1000+ on 5930k/mobo/ram and getting a 4790k with a gaming gt plus a ps4 I would choose the latter. Either way not spending a $1000 on a cpu upgrade.


I'm not surprised at all; if you want multiple GPUs, you're paying for it.

It is most likely because the 4820Ks sold very well, as people would go X79 (I just did) just for the lanes; they aren't making that mistake again.


----------



## kizwan

Quote:


> Originally Posted by *chronicfx*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> 5820k is Haswell-E. You need x99 chipset motherboard. You can run x8/x8/x8 with 5820k. I read somewhere that 5820k on-die PCIe 3.0 root complex will have fewer lanes though, so you can't do x16/x8/x8 like SB-E & IVY-E; 3820 & 4820k.
> 
> I read in a forum somewhere that some people having problem when running (multi-GPU) Hawaii @PCIe 3.0 x4 in BF4. If you experiencing this problem, getting them to run at x8 should fixed the problem. I don't know whether the problem is user error or what because I didn't investigate further. I can say depends on the program you're going to use though.
> 
> 
> 
> 
> 
> 
> 
> Check this out.
> 
> http://www.tomshardware.com/news/haswell-e-i7-5820k-pci-express,27507.html
> 
> Have you found anything that says that x8/x8/x8 will be a reality or were you just adding it up?
> 
> If I need to shell out $600 just for a cpu (5930k) then I will have to go Z97 4970k and PLX there is no question... Between spending $1000+ on 5930k/mobo/ram and getting a 4790k with a gaming gt plus a ps4 I would choose the latter. Either way not spending a $1000 on a cpu upgrade.
Click to expand...

The 5820k PCIe configuration of 1x16 1x8 1x4 is not what tri-GPU will run at. It just means only 1x16 + 1x8 + 1x4 lanes (28 lanes) are available with the 5820k. The x16 lanes can be shared between two PCIe slots, so when both are populated they will run in x8/x8 mode. So with tri-GPU, you can run them at x8/x8/x8. Of course this is not confirmed yet. According to the TechPowerUp news, only the x16 and x8 lanes are for discrete graphics cards.

http://www.techpowerup.com/201243/intel-core-i7-haswell-e-processor-lineup-detailed.html
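The splitting logic above can be sketched as a small lookup: a 28-lane CPU feeding three slots, assuming the board can bifurcate the x16 link into x8/x8 (which, as noted, was unconfirmed and board-dependent at the time). The function and its slot-width table are hypothetical, just to show the arithmetic works out:

```python
# Hypothetical slot-width allocation for a 28-lane CPU (e.g. 5820K),
# assuming the x16 link can bifurcate into x8/x8. Board-dependent.
TOTAL_LANES = 28

def slot_widths(n_gpus):
    """Assumed per-slot link widths for 1-3 GPUs on a 28-lane CPU."""
    return {1: [16], 2: [16, 8], 3: [8, 8, 8]}[n_gpus]

for n in (1, 2, 3):
    widths = slot_widths(n)
    assert sum(widths) <= TOTAL_LANES  # never exceed the lane budget
    print(f"{n} GPU(s): " + "/".join(f"x{w}" for w in widths))
```

Note the three-GPU case only uses 24 of the 28 lanes; the leftover x4 is what board vendors could route to a fourth slot or an M.2 socket.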


----------



## chronicfx

Quote:


> Originally Posted by *kizwan*
> 
> 5820k PCIe configuration 1x16 1x8 1x4 is not what tri-GPU will run at. It just means only 1x16 1x8 1x4 lanes (28 lanes) available with 5820k. The x16 lanes can be shared with two PCIe slots, making them when working at the same time will be at x8/x8 mode. So with tri-GPU, you can run them at x8/x8/x8. Of course this is not confirmed yet. According to techpowerup news, only x16 x8 lanes are for discrete graphics cards.
> 
> http://www.techpowerup.com/201243/intel-core-i7-haswell-e-processor-lineup-detailed.html


That is what I am hoping, that the x16 can be split between the two top slots.

Now to find free DDR4 RAM...


----------



## Cyber Locc

Quote:


> Originally Posted by *kizwan*
> 
> The 5820K's 1x16/1x8/1x4 PCIe configuration is not what tri-GPU will run at. It just means only 28 lanes (16+8+4) are available with the 5820K. The x16 lanes can be shared between two PCIe slots, so when both are in use at the same time they will run in x8/x8 mode. With tri-GPU, you can therefore run them at x8/x8/x8. Of course, this is not confirmed yet; according to the TechPowerUp news, only the x16 and x8 lanes are for discrete graphics cards.
> 
> http://www.techpowerup.com/201243/intel-core-i7-haswell-e-processor-lineup-detailed.html


This would be the ideal situation, having all three cards run at x8/x8/x8. However, I have a strong feeling this goes against their point: they have to make a tangible difference between the chips to make you want to buy the 5930K.

Locking you to two cards is definitely a good way to do that, but this is just an opinion; no one will know for sure until the 29th.

However, I have seen some leaked motherboard manuals stating x16/x8/x4 with no mention of x8/x8/x8.

Also, according to that article you linked, the 5930K and 5960X will run x16/x16/x8, which would mean they're removing quad SLI altogether; I doubt that will happen, so maybe there is hope.

However, it's entirely possible this platform will remove quad SLI/CF in favor of the M.2 slots that seem to be on all the boards; those require PCIe lanes.


----------



## kizwan

Quote:


> Originally Posted by *cyberlocc*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> The 5820K's 1x16/1x8/1x4 PCIe configuration is not what tri-GPU will run at. It just means only 28 lanes (16+8+4) are available with the 5820K. The x16 lanes can be shared between two PCIe slots, so when both are in use at the same time they will run in x8/x8 mode. With tri-GPU, you can therefore run them at x8/x8/x8. Of course, this is not confirmed yet; according to the TechPowerUp news, only the x16 and x8 lanes are for discrete graphics cards.
> 
> http://www.techpowerup.com/201243/intel-core-i7-haswell-e-processor-lineup-detailed.html
> 
> 
> 
> 
> 
> 
> This would be the ideal situation, having all three cards run at x8/x8/x8. However, I have a strong feeling this goes against their point: they have to make a tangible difference between the chips to make you want to buy the 5930K.
> 
> Locking you to two cards is definitely a good way to do that, but this is just an opinion; no one will know for sure until the 29th.
> 
> However, I have seen some leaked motherboard manuals stating x16/x8/x4 with no mention of x8/x8/x8.
> 
> Also, according to that article you linked, the 5930K and 5960X will run x16/x16/x8, which would mean they're removing quad SLI altogether; I doubt that will happen, so maybe there is hope.
> 
> However, it's entirely possible this platform will remove quad SLI/CF in favor of the M.2 slots that seem to be on all the boards; those require PCIe lanes.
Click to expand...

Like I said, the 1x16/1x16/1x8 with the 5930K/5960X and the 1x16/1x8/1x4 with the 5820K doesn't mean they only support three-way GPU at x16/x16/x8 and x16/x8/x4 respectively. Those just show the available PCIe lane configurations. The x16 lanes can be shared between two PCIe x16 slots, which means each runs at x16 individually, but when both PCIe x16 slots are populated, both run in x8/x8 mode. Again, of course, this is not confirmed yet; even the leaked X99 motherboard manual can be considered just a draft, not a final document.

Here is a leaked Asus X99 Deluxe motherboard manual showing that with the 5820K's 28 PCIe lanes, three-way GPU will run at x8/x8/x8:
http://linustechtips.com/main/topic/195718-asus-x99-deluxe/
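The lane-sharing arithmetic above can be sketched as a toy allocation model. This is purely illustrative (the function name and the allocation table are my own, and real slot behavior depends on the board's PCIe switch wiring, which is exactly what is unconfirmed here):

```python
# Toy model of the lane-sharing logic discussed above (hypothetical:
# actual behavior depends on the motherboard's PCIe switch layout).

def slot_widths(total_lanes, populated):
    """Return per-slot link widths for `populated` GPU slots.

    Mirrors the scheme described in the thread: a 5820K-style CPU
    exposes 28 lanes, and one x16 group can be bifurcated to x8/x8
    when a second slot sharing it is populated.
    """
    if populated == 1:
        return [16]              # a single card gets the full x16
    if populated == 2:
        return [8, 8]            # the shared x16 splits to x8/x8
    if populated == 3:
        if total_lanes >= 28:
            return [8, 8, 8]     # 24 of the 28 lanes in use
        return [8, 4, 4]         # e.g. a 16-lane mainstream CPU
    raise ValueError("unsupported slot count")

print(slot_widths(28, 3))  # [8, 8, 8]
```

With 28 lanes, x8/x8/x8 only needs 24, which is why the configuration is at least plausible even though the leaked manuals disagree.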


----------



## Arizonian

Quote:


> Originally Posted by *mus1mus*
> 
> I just have mine.. And another thread to visit everyday..
> 
> http://www.techpowerup.com/gpuz/details.php?id=29wes
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Ref MSI On a Gaming Bios


Congrats - added









Quote:


> Originally Posted by *Red1776*
> 
> Hi gang,
> 
> testing/benching the MSI R290X's with the Twin Frozr coolers before getting them wet.
> 
> 
> Spoiler: Warning: Spoiler!


That's beautiful









Quote:


> Originally Posted by *Descadent*
> 
> anybody getting the asus swift pg278q even knowing amd obviously can't use gsync or ulmb?


I'd imagine if you have enough to push 2560x1440 @ 120Hz, you don't need the other features. Maintaining 100-120 FPS during gaming means movement should be fluid without stutter. It would be nice to see someone with a PG278Q and a dual AMD GPU setup give us some feedback.

Side note:

/thread cleaned

Please be respectful to each other when posting.








_If you have any questions please feel free to PM me._


----------



## spikezone2004

What exactly is coil whine? Is it bad, or just something that happens? I seem to get it in a couple of games, and I can hear it quite a bit since my side panel is off at the moment.


----------



## BradleyW

Quote:


> Originally Posted by *spikezone2004*
> 
> What exactly is coil whine? is it bad or just something that happens? I seem to get it in a couple games I can hear it quite a bit since my side panel is off atm


It causes no harm to the hardware. In simple terms, it is just the copper coils (inductors) vibrating very fast.


----------



## spikezone2004

Quote:


> Originally Posted by *BradleyW*
> 
> It causes no harm to the hardware. It is just the copper coils vibrating really fast in simple terms.


Ah OK, I wasn't sure if it was a bad thing or just something that happens. I've been noticing it in a couple of games, but not all of them.

Thinking of trying different driver versions; I keep getting atikmpag.sys BSODs. I've already tried uninstalling and reinstalling, and ran DDU multiple times, but I'm still getting them.


----------



## BradleyW

Quote:


> Originally Posted by *spikezone2004*
> 
> ahh ok, I wasn't sure if it was a bad thing or just something that happens. Been noticing it in a couple games but not all of them.
> 
> Thinking of trying different version drivers, i keep getting BSODs atikmpag.sys already tried uninstalling and reinstalling ran DDU multiple times and still getting them


A reformat followed by a fresh driver install would be the way to sort that issue out. However, frequent AMD driver BSODs can also mean a faulty GPU, though that is not commonly the case.


----------



## spikezone2004

Quote:


> Originally Posted by *BradleyW*
> 
> A reformat followed by a fresh driver install would be the way to sort that issue out. However, frequent AMD driver BSOD's can also mean a faulty GPU, but not commonly the case.


Trying to hold out on a reformat until I get an SSD. Might try the 13.12 drivers if I get a BSOD again; I haven't tried 14.7b, I am running 14.4.


----------



## BradleyW

Quote:


> Originally Posted by *spikezone2004*
> 
> Trying to hold out on a reformat until I get a SSD. Might try 13.12 drivers if I get BSOD again, haven't tried 14.7b I am running 14.4


You should be OK with 14.4 WHQL.
When did the BSODs start?
Did you have a previous version of CCC installed before 14.4 WHQL?


----------



## spikezone2004

Quote:


> Originally Posted by *BradleyW*
> 
> You should be OK with 14.4 WHQL.
> When did the BSOD's start?
> Did you have a previous version of CCC installed before 14.4 WHQL?


I had 13.8 before 14.4. I updated them last week and that's when it started, which was also the same time I put my R9 290 in.


----------



## BradleyW

Quote:


> Originally Posted by *spikezone2004*
> 
> I had 13.8 before 14.4, I updated them last week and thats when it started also same time I put my R9 290 in


How did you remove 13.8 CCC? Did you use DDU and all that kind of stuff?


----------



## Beezleybuzz

Quote:


> Originally Posted by *failwheeldrive*
> 
> Freesync won't be coming out for all monitors, or even all monitors with DP1.2. Not to mention it won't be out for ages lol.


Not from what I've read. The adaptive refresh rate technology "FreeSync" was not created by AMD; it is based on a VESA standard agreed upon quite some time ago. Any monitor that is DP 1.3 will support variable refresh rate technology natively.


----------



## BradleyW

Quote:


> Originally Posted by *Beezleybuzz*
> 
> Not from what I've read. The adaptive refresh rate technology "free sync" is not created by AMD but was a VESA standard agreed upon quite sometime ago. Any monitor that will be DP 1.3 will support variable refresh rate technology natively.


This is exactly true. This information came out months ago; the tech existed long before G-Sync. It has also existed in our mobile phones. Ever wondered why you get no tearing in a game on your phone or tablet?


----------



## spikezone2004

Quote:


> Originally Posted by *BradleyW*
> 
> How did you remove 13.8 CCC? Did you use DDU and all that kind of stuff?


Yeah, I uninstalled 13.8, ran DDU in safe mode, and installed 14.4. The other day I uninstalled 14.4, ran DDU twice to make sure, then reinstalled 14.4. About to go try the 13.12 drivers and see if it's better.


----------



## Mega Man

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Beezleybuzz*
> 
> Not from what I've read. The adaptive refresh rate technology "free sync" is not created by AMD but was a VESA standard agreed upon quite sometime ago. Any monitor that will be DP 1.3 will support variable refresh rate technology natively.
> 
> 
> 
> This is exactly true. This information came out months ago. The tech existed long before G-Sync. It has also existed in our mobile phones. Ever wondered why you get no tearing on a game on your phone or tablet?
Click to expand...

I can't find the video, but Tek showed, when it first came out, that Nvidia supported it. That was several generations ago now.


----------



## BradleyW

Quote:


> Originally Posted by *spikezone2004*
> 
> Yea, uninstalled 13.8 ran DDU in safe mode and installed 14.4. other day I uninstalled 14.4 ran DDU 2 times to make sure then reinstalled 14.4, bout go try 13.12 drivers see if its better.


There is the problem. You should have just uninstalled the drivers through Add/Remove Programs, then rebooted the system and installed the new drivers. DDU has probably made a mess of your system. A reformat is most likely required.


----------



## chronicfx

Quote:


> Originally Posted by *kizwan*
> 
> Like I said, the 1x16/1x16/1x8 with the 5930K/5960X and the 1x16/1x8/1x4 with the 5820K doesn't mean they only support three-way GPU at x16/x16/x8 and x16/x8/x4 respectively. Those just show the available PCIe lane configurations. The x16 lanes can be shared between two PCIe x16 slots, which means each runs at x16 individually, but when both PCIe x16 slots are populated, both run in x8/x8 mode. Again, of course, this is not confirmed yet; even the leaked X99 motherboard manual can be considered just a draft, not a final document.
> 
> Here is a leaked Asus X99 Deluxe motherboard manual showing that with the 5820K's 28 PCIe lanes, three-way GPU will run at x8/x8/x8:
> http://linustechtips.com/main/topic/195718-asus-x99-deluxe/


Good to know. It stinks that the only configuration on that board, and possibly all X99 boards, sandwiches all three cards, and on that one leaves no room for a sound card. I find on my current board (PCIe x16, then two x1 slots before the next x16) that the triple-slot spacing for the first card really alleviates a lot of temperature issues on air for my reference cards, because it's one card, then a slot of space, then the other two in CrossFire. I only hit the mid-70s using the default Afterburner profile. If they were sandwiched, I bet that middle card would be a lot hotter.


----------



## Talon720

Quote:


> Originally Posted by *cyberlocc*
> 
> Have you seen any issues with Mantle and CrossFire? I had read there would be issues, or that it didn't support it, but I hadn't bothered looking lately. That is probably not true, as I think I read it in a comment on AnandTech, so not a very reliable source; but since you're here, what's your take on it?
> 
> No worries, I'm also done. However, I do want to point out, regarding the other attack about my PPC rage thread: they updated the site, which should have been done previously, and another person contacted me, so that whole thing is resolved. In essence, my "rage thread" achieved its goals to the fullest. Just saying.
> 
> Yes, I don't have the greatest grammar; unreadable it is not, however.
> 
> Apologies for derailing the thread and for my horrid grammar.


Ah, no worries, your grammar is fine; I hear way worse English where I work. At first, Mantle and tri-fire in BF4 worked great, until the new early-access BF4 expansion patch. With Mantle, a memory leak was introduced: all 16 GB used within 10 minutes, which never happened before. 14.4 WHQL is hands down the best for me, and a lot of the issues are on DICE/EA's end; fix one thing, break another. I do get random FPS drops with DX11 that I wasn't getting before, but I was also trying different drivers, uninstalling and installing; it would run badly, then I'd use DDU. So I may need to format and start over again, through no fault of tri-fire. I've never had any stuttering or performance issues related to x8/x4/x4 like people would lead you to believe, which is why I tried it for myself. Basically, Mantle worked great until EA/DICE broke it, but DX11 performance seems to have improved through drivers.


----------



## rdr09

Quote:


> Originally Posted by *Talon720*
> 
> Ah, no worries, your grammar is fine; I hear way worse English where I work. At first, Mantle and tri-fire in BF4 worked great, until the new early-access BF4 expansion patch. With Mantle, a memory leak was introduced: all 16 GB used within 10 minutes, which never happened before. 14.4 WHQL is hands down the best for me, and a lot of the issues are on DICE/EA's end; fix one thing, break another. I do get random FPS drops with DX11 that I wasn't getting before, but I was also trying different drivers, uninstalling and installing; it would run badly, then I'd use DDU. So I may need to format and start over again, through no fault of tri-fire. I've never had any stuttering or performance issues related to x8/x4/x4 like people would lead you to believe, which is why I tried it for myself. Basically, Mantle worked great until EA/DICE broke it, but DX11 performance seems to have improved through drivers.


There shouldn't be any stuttering even when running at x8/x4/x4, but I think performance is lowered a bit. You may not notice it depending on the game, settings, resolution, etc.; maybe only in synthetic benchmarks. Benchers will go for the most lanes for sure. It's like buying tires rated for 180 MPH and installing them on a car that can only go 80 MPH.


----------



## Lou HM

Quote:


> Originally Posted by *BradleyW*
> 
> There is the problem. You should have just uninstalled the drivers through add/remove programs. Then you should reboot the system and install the new drivers. DDU has probably made a mess with your system. A reformat is most likely required.


It's also some of the issues I am experiencing, along with FurMark and MSI Kombustor not using CrossFire...
I also tried many of the things you guys suggested. It crashes only with Crysis, Crysis 2, and BF4, and twice when opening GPU-Z...


----------



## aaroc

I need to test four R9 290X GPUs; what driver version do you recommend, 13.12? Three are used and one is new. This is not for overclocking, just to know if they work correctly. Thanks!


----------



## Talon720

Quote:


> Originally Posted by *aaroc*
> 
> I need to test four R9 290X GPUs; what driver version do you recommend, 13.12? Three are used and one is new. This is not for overclocking, just to know if they work correctly. Thanks!


I've been using 14.4 WHQL for tri-fire pretty successfully.
Quote:


> Originally Posted by *rdr09*
> 
> There shouldn't be any stuttering even when running at x8/x4/x4, but I think performance is lowered a bit. You may not notice it depending on the game, settings, resolution, etc.; maybe only in synthetic benchmarks. Benchers will go for the most lanes for sure. It's like buying tires rated for 180 MPH and installing them on a car that can only go 80 MPH.


Yeah, I agree with you completely; I'm probably missing out on a few FPS. After some people said x4 wouldn't be enough and wasn't going to work, I just wanted to be able to speak from experience. Also, since XDMA on Hawaii, it seemed no one really knew how much bandwidth they'd need.


----------



## Meatdohx

Quote:


> Originally Posted by *BradleyW*
> 
> There is the problem. You should have just uninstalled the drivers through add/remove programs. Then you should reboot the system and install the new drivers. DDU has probably made a mess with your system. A reformat is most likely required.


Not true. DDU does NOT make a mess.

DDU is pretty good and has succeeded where the AMD express uninstall failed miserably.

In fact, the express uninstall does not remove everything from your PC; your AMD folder will still be there. It's a bug.

Look at the release notes from the latest beta drivers.
Quote:


> Known Issues
> ■Running Watch Dogs with a R9 280X CrossFire configuration may result in the application running in CrossFire software compositing mode
> ■Enabling Temporal SMAA in a CrossFire configuration when playing Watch Dogs will result in flickering
> ■AMD CrossFire™ configurations with AMD Eyefinity enabled will see instability with BattleField 4 or Thief when running Mantle
> ■Catalyst™ Install Manager text is covered by Express/Custom radio button text
> ■Express Uninstall does not remove C:\Program Files\(AMD or ATI) folder


http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx

And if you look at your registry after an AMD express uninstall, you will see a lot of things still there.


----------



## rdr09

Quote:


> Originally Posted by *Meatdohx*
> 
> Not. DDU does NOT make a mess.
> 
> DDU is pretty good and have succeeded where the express AMD uninstall failed miserably,
> 
> In fact express uninstall does not remove everything from your PC. Your AMD folder will still be there it's a bug.
> 
> Look at the release note from the last beta drivers.
> http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
> 
> And if you look at your registry after an AMD express uninstall you will see there alot of thing still there.


I use express all the time, never an issue. I use the new driver to uninstall the existing one: no safe mode, no DDU, no Driver Sweeper, no hardware acceleration issues, etc.

I've been doing it this way since my HD 5830 days. I guess to each his own.


----------



## TopicClocker

Hi everyone, what's everyone's thoughts on the XFX Radeon R9 290 DD? I'm currently looking for a new card as mine died sadly and I'm getting a refund on it due to it being discontinued. (First card I've ever had die on me after owning 4+ cards over 5 years).

I've been looking into them and I've heard about VRM overheating problems and also a couple of them having Elpida memory chips.
In fact, I had trouble with an XFX R9 280 for someone I know, and we had to RMA it; not sure what the problem was, but it would restart the entire computer.


----------



## spikezone2004

Quote:


> Originally Posted by *TopicClocker*
> 
> Hi everyone, what's everyone's thoughts on the XFX Radeon R9 290 DD? I'm currently looking for a new card as mine died sadly and I'm getting a refund on it due to it being discontinued. (First card I've ever had die on me after owning 4+ cards over 5 years).
> 
> I've been looking into them and I've heard about VRM overheating problems and also a couple of them having Elpida memory chips.
> In-fact I had trouble with a XFX R9 280 for someone I know and we had to RMA it, not sure what the problem was but it would restart the entire computer.


I have an XFX R9 290 which I just got, and it works great. The VRM overheating isn't just XFX; it can happen on any of the R9 290(X)s, as they just produce a lot of heat. However, I put mine straight under water when I got it, and my temps haven't gone past 55°C.


----------



## TopicClocker

Quote:


> Originally Posted by *spikezone2004*
> 
> I have a XFX R9 290 which I just got, it works great. The VRM overheating isn't just XFX it can happen on all of the R9 290(x)s they just produce a lot of heat. However I put mine straight under water when I got mine my temps haven't gone past 55C


Oh okay I hadn't known it was a problem affecting all the 290s.
If that's the case I'm less hesitant, thanks.

How's the performance and what memory chips does your card have if you don't mind me asking?


----------



## spikezone2004

I have Elpida memory on mine, and the performance is good. My CPU seems to bottleneck me in certain games, but the card performs well. I haven't gotten around to overclocking it yet; I'm still debating whether to redo my TIM, which I might.


----------



## Descadent

+1 for ddu never causing an issue and being amazing.


----------



## BradleyW

Quote:


> Originally Posted by *Meatdohx*
> 
> Not. DDU does NOT make a mess.
> 
> DDU is pretty good and have succeeded where the express AMD uninstall failed miserably,
> 
> In fact express uninstall does not remove everything from your PC. Your AMD folder will still be there it's a bug.
> 
> Look at the release note from the last beta drivers.
> http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
> 
> And if you look at your registry after an AMD express uninstall you will see there alot of thing still there.


Actually, DDU can cause stability issues and software corruption. It's not common, but it sure isn't impossible. I've seen DDU cause issues for various people. I also found errors during my internal testing several months ago on 10 test systems: 3 of the systems ended up with corruption via DDU, and a reformat was required. I warned people about DDU a long time ago, and there is a large following of people who are now aware of its potential dangers. You are far less likely to run into issues if you are using an AMD GPU with an Intel-based chipset. This is my verdict based on my own testing and other people's reports. And remember: just because you've not had issues with DDU, it does not mean DDU is flawless.


----------



## TopicClocker

Quote:


> Originally Posted by *spikezone2004*
> 
> I have Elpdia memory on mine, the performance is good. Now my cpu seems to bottleneck me in certain games but the card performs good. Haven't gotten round to overclocking it yet still debating whether to redo my tim which i might


Okay, thanks, I hope you enjoy it!


----------



## amptechnow

Quote:


> Originally Posted by *TopicClocker*
> 
> Oh okay I hadn't known it was a problem affecting all the 290s.
> If that's the case I'm less hesitant, thanks.
> 
> How's the performance and what memory chips does your card have if you don't mind me asking?


I have a reference XFX R9 290 with Elpida memory. I put it under water after testing it. It does 1150/1625 for 24/7 gaming and up to 1200 core for benches; very happy with it. And like spikezone, mine stays below 55, usually around 45. Great performance, great benches, and able to pretty much max any game at 1080p with lots of AA.

I recently bought a PowerColor PCS+ R9 290 to go with it, and they ended up sending me a 290X by mistake. Also another great card; with the stock cooler it stays below 75 degrees.

No VRM issues on either card. Also, with such good prices right now, I say you can't go wrong with a 290(X)...


----------



## th3illusiveman

Add me up, please.

What's a nice safe voltage to plug into this thing for some quick testing?

It's quite noisy, IMO. Compared to the Accelero attached to my 7970, this thing at 60% sounds a LOT louder than my Accelero at 100%.


----------



## Arizonian

Quote:


> Originally Posted by *th3illusiveman*
> 
> Add me up pls
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Whats a nice safe voltage to plug into this thing for some quick testing?
> 
> its quite noisy imo. Compared to the Accelero attached to my 7970 this thing at 60% sounds ALOT louder than my accelero at 100%.


Congrats - added


----------



## Paopawdecarabao

Nice!


----------



## Sgt Bilko

Quote:


> Originally Posted by *TopicClocker*
> 
> Hi everyone, what's everyone's thoughts on the XFX Radeon R9 290 DD? I'm currently looking for a new card as mine died sadly and I'm getting a refund on it due to it being discontinued. (First card I've ever had die on me after owning 4+ cards over 5 years).
> 
> I've been looking into them and I've heard about VRM overheating problems and also a couple of them having Elpida memory chips.
> In-fact I had trouble with a XFX R9 280 for someone I know and we had to RMA it, not sure what the problem was but it would restart the entire computer.


So far as I know, all the DD cards use Elpida memory. The VRM cooling isn't great on them, but it's good enough provided you have decent airflow in your case.

I've only had temp issues when my cards are overvolted and in CrossFire, and even then the top card doesn't get all that warm. The bottom card never breaks 70°C on the core or 70°C on the VRMs.


----------



## Performer81

Looks like my new PowerColor PCS+ is unlockable; what do you think?
With the memory info I read this:

RA1: F8000005 RA2: F8010000
RB1: F8000005 RB2: F8010000
RC1: F8000005 RC2: F8010000
RD1: F8000005 RD2: F8010000

That means I have a 290X chip.
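For anyone comparing their own readout against these values, the check is plain string comparison. A minimal sketch; the helper name is hypothetical, the F8000005/F8010000 values are the ones quoted in this thread, and whether they really indicate an unlockable chip is community observation rather than anything AMD has documented:

```python
# Hypothetical helper: compare a memory-controller readout against the
# values the community associates with unlockable 290 cards.
# The pattern (R?1 == F8000005, R?2 == F8010000 on all four channels)
# comes from user reports, not from any official AMD documentation.

UNLOCKABLE = {"1": "F8000005", "2": "F8010000"}

def looks_unlockable(readout):
    """readout: dict like {'RA1': 'F8000005', 'RA2': 'F8010000', ...}"""
    for ch in "ABCD":                      # channels A through D
        for idx, expected in UNLOCKABLE.items():
            if readout.get(f"R{ch}{idx}") != expected:
                return False
    return True

sample = {
    "RA1": "F8000005", "RA2": "F8010000",
    "RB1": "F8000005", "RB2": "F8010000",
    "RC1": "F8000005", "RC2": "F8010000",
    "RD1": "F8000005", "RD2": "F8010000",
}
print(looks_unlockable(sample))  # True for the readout quoted above
```

Any channel reporting different values would fail the check, which matches the thread's "all four rows must show these codes" rule of thumb.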


----------



## fateswarm

Quote:


> Originally Posted by *Performer81*
> 
> Looks like my new powercolor pcs+ is unlockable, what do you think?
> With memory info i read this:
> 
> RA1: F8000005 RA2: F8010000
> RB1: F8000005 RB2: F8010000
> RC1: F8000005 RC2: F8010000
> RD1: F8000005 RD2: F8010000
> 
> That means i have a 290X chip


Does anyone have a URL to the technical rationale that makes those codes the only ones valid for an unlock? I don't mean the "we know from experience it's true" rationale, but the actual technical analysis that confirms it.


----------



## Widde

My cards might end up getting a new house soon







Thinking of a Corsair air 540 as I wont be able to go watercooling for awhile..


----------



## battleaxe

Quote:


> Originally Posted by *Performer81*
> 
> Looks like my new powercolor pcs+ is unlockable, what do you think?
> With memory info i read this:
> 
> RA1: F8000005 RA2: F8010000
> RB1: F8000005 RB2: F8010000
> RC1: F8000005 RC2: F8010000
> RD1: F8000005 RD2: F8010000
> 
> That means i have a 290X chip


Unlockable. Yup


----------



## Cyber Locc

Quote:


> Originally Posted by *Performer81*
> 
> Looks like my new powercolor pcs+ is unlockable, what do you think?
> With memory info i read this:
> 
> RA1: F8000005 RA2: F8010000
> RB1: F8000005 RB2: F8010000
> RC1: F8000005 RC2: F8010000
> RD1: F8000005 RD2: F8010000
> 
> That means i have a 290X chip


Yep, that's unlockable; grats.

AFAIK they're all the same chip; some can be unlocked, some can't. The only difference is the extra shaders. Give unlocking a shot; I would.









----------



## Cyber Locc

Quote:


> Originally Posted by *fateswarm*
> 
> Does anyone have a URL to the technical rationale that makes those codes the only valid for upgrade? I don't mean the "we know from experience it's true' rationale. The actual technical analysis that confirms it.


I don't really understand what you're saying, but if you're saying what I think you're saying, then no. What we do have is a ton of unlocked cards with one of two results; every other set tried that has different numbers hasn't worked.

I really don't know where a technical rationale would come from; the only party I think would be able to provide such an explanation is AMD, and I doubt they're going to talk about this issue in any way at all.


----------



## fateswarm

Quote:


> Originally Posted by *cyberlocc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fateswarm*
> 
> Does anyone have a URL to the technical rationale that makes those codes the only valid for upgrade? I don't mean the "we know from experience it's true' rationale. The actual technical analysis that confirms it.
> 
> 
> 
> I don't really understand what you're saying, but if you're saying what I think you're saying, then no. What we do have is a ton of unlocked cards with one of two results; every other set tried that has different numbers hasn't worked.
> 
> I really don't know where a technical rationale would come from; the only party I think would be able to provide such an explanation is AMD, and I doubt they're going to talk about this issue in any way at all.
Click to expand...

Do you have a list of that ton of cards? I've stumbled upon that information, but it was only in the form of "trust us, no other codes work". OK, where is your data that confirms it?

I don't say I don't believe them, by the way. I think they are most probably right; I just like confirming things like that.


----------



## Cyber Locc

Quote:


> Originally Posted by *fateswarm*
> 
> Do you have a list of that ton of cards? I've stumbled upon that information but it was only in the form of "trust us, no other codes work". OK, where is your data that confirms it.
> 
> I don't say I don't believe them by the way. I think they are most probably right. I just like confirming things like that.


http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread 329 pages of successful and unsuccessful unlocks








Not one user has reported those codes not unlocking, nor have they reported other codes unlocking, so going by the data, those are the only ones that unlock. However, this doesn't mean you shouldn't try yours, as a card without those numbers may very well unlock.

From what I have read, it has to do with the original cards only, and here's the reason: AMD released the 290X first and planned a later release of the 290s, and Nvidia did something that made them push up the 290's release date.

So, this is entirely my opinion, but I think what happened was that the card makers, in an effort to get 290s out quicker than expected, used actual 290X chips that didn't bin well. They just added a software lock to speed up the process, i.e., put a 290 BIOS on 290Xs. I believe this is what happened, as only the very first batch of 290s is unlockable; from what I have read on different sites, the first three weeks of production are the only ones. This also plays into why most of the unlocked Sapphires are BF4 editions.

The technical reason those memory codes unlock, IMO, is that those memory configurations are from the first batch of cards, which were actually 290Xs with a 290 BIOS. After those few weeks of production, once they had some real 290s to sell, they took more time and started cutting off the bad shaders, thus physically locking them to 290s.


----------



## fateswarm

Quote:


> Originally Posted by *cyberlocc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fateswarm*
> 
> Do you have a list of that ton of cards? I've stumbled upon that information but it was only in the form of "trust us, no other codes work". OK, where is your data that confirms it.
> 
> I don't say I don't believe them by the way. I think they are most probably right. I just like confirming things like that.
> 
> 
> 
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread 329 pages of successful and unsuccessful unlocks
Click to expand...

Yeah, I know the thread. So you're saying it's based on each post, and the replies that gave them the code data of RAM results? I won't go through all that.









I hope they didn't just see 4 results and reach a verdict, though; I guess not.


----------



## Cyber Locc

Quote:


> Originally Posted by *fateswarm*
> 
> Yeah I know the thread. So you say it's based on each post and then the replies that gave them the code data of ram results? I won't through all that.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I hope they didn't just see 4 results and had a verdict though, but I guess not.


Yeah, no, it was way more than 4 cards, lol. The two guys that made the thread tested somewhere around 30 cards by themselves, so yes, there is a lot of data. Check out the thread; there are lists of cards that have and haven't unlocked in the first post, which also link to the users' posts with their results. Some posts there contain results for 10+ cards; some people bought a lot for mining and tested them all.


----------



## fateswarm

Quote:


> Originally Posted by *cyberlocc*
> 
> ' SNIP '
> 
> Yeah, no, it was way more than 4 cards, lol. The two guys who made the thread tested somewhere around 30 cards by themselves, so there is a lot of data. Check out the thread; the first post has lists of cards that have and haven't unlocked, with links to each user's post and results. Some posts contain results for 10+ cards; some people bought a lot for mining and tested them all.

Yeah OK. It is believable. I know the OP from other threads and he's generally trustworthy anyway.


----------



## rv8000

Anyone with a Tri-X or Vapor-X, could you chime in on this?

I've been monitoring my 290 Vapor-X temps (at factory clocks) the past few days, mostly in WildStar and GW2, with a custom fan curve that's somewhat tolerable, and I'm still hitting upwards of 81°C after about an hour of play. Max fan speed during this time was 68%, averaging around 60%. Room temp is approx. 25-27°C. It seems awfully weird for the card to be getting this warm; I'd expect mid 70s, but 80+ is surprising.


----------



## Raul-7

Can I be added? MSI R9 290, stock soon to be water.


----------



## Performer81

Now I just need a PowerColor 290X PCS+ BIOS, because my 290 PCS+ has an updated PCB and I don't think an old BIOS would work on my card.
My BIOS version is 015.044.000.007.000000. Where can I find the 290X counterpart?


----------



## smoke2

Is there any owner of a Sapphire 290 Tri-X with a Fractal R4 case here? I would like to compare temps.

I have the two original Fractal intake fans on the front and two Enermax UCTB14s, one on the bottom and one on the back as exhaust, all regulated to 5 V.
My max temps while playing Far Cry 3 are 75°C core and 79°C VRM.

Are these temps OK?


----------



## th3illusiveman

Far Cry 3 and Sleeping Dogs are VERY intense on core and VRM temps for AMD cards. I don't think any other games run that hot.


----------



## th3illusiveman

What are your 24/7 overclocks? After some brief testing, mine seems to sit comfy at 1100/1500 with +75 mV in AB. I think I got a greedy core.


----------



## mojobear

Hey guys,

Is anyone running an R9 290 with the PT1 BIOS? I just want to be sure that it works on the 290 non-X version. Switching BIOSes is a bit of an issue for me because the EK bridge blocks the BIOS switch, lol.

Also, does anyone have experience with the PT1 BIOS getting a better OC?

Thanks!


----------



## bond32

I can legitimately say I am surprised now that I finally had time to play and test my tri-fire setup. My 1300 W EVGA G2 is the only thing holding me back now.

I didn't believe it at first; of course, that was when I had only one card. But now, running the tri-fire 290Xs with the PT1 BIOS and a 4770K at 4.8 GHz (stupid-high voltage), I can get the cards to around 1.4 V each. When I attempt 1.43 V, I get an immediate shutdown during the benchmark. I have even lowered the CPU overclock/voltage to try to get a solid run at 1250 core, but can't. This was the highest I could get:

http://www.3dmark.com/3dm/3870093?


----------



## mojobear

Hey guys,

Not sure what happened to my post... lol... might have posted it somewhere totally invalid by accident, haha.

Was wondering if anyone has experience with the PT1 BIOS and can confirm that it works on the reference 290 non-X. My BIOS switch is hidden by the EK bridge, so a bad flash would be a problem to fix.

Also, has anyone used the PT1 BIOS and can comment on their experience: voltage, better OC, etc.?

Many thanks


----------



## Sgt Bilko

Quote:


> Originally Posted by *mojobear*
> 
> Hey guys,
> 
> Not sure what happened to my post... lol... might have posted it somewhere totally invalid by accident, haha.
> 
> Was wondering if anyone has experience with the PT1 BIOS and can confirm that it works on the reference 290 non-X. My BIOS switch is hidden by the EK bridge, so a bad flash would be a problem to fix.
> 
> Also, has anyone used the PT1 BIOS and can comment on their experience: voltage, better OC, etc.?
> 
> Many thanks


I think @HOMECINEMA-PC has used the PT1 bios when benching, either way he's a great source of info for this.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I think @HOMECINEMA-PC has used the PT1 bios when benching, either way he's a great source of info for this.


Yep, that's me, the primary overvolter.

The PT1 BIOS has no vdroop, so I don't use it 24/7. PT1T is what I use 24/7 on my TRI setup. Both BIOSes give you unlocked voltages, but PT1T keeps vdroop, and they both bench well. As usual, though, the drivers are what dictate stability, and they leave a lot to be desired.

http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/0_20
Have a read of the first page.

PS: both BIOSes do NOT downclock; idle clocks stay fixed at 1000/1250 and Vcore at 1.25 V. You must use GPU Tweak.
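For anyone unfamiliar with the vdroop distinction above, a toy loadline model sketches it: under load, the VRM lets core voltage sag below the set point in proportion to current draw, while a "no vdroop" BIOS effectively holds the set point. This is only an illustration; the 1.25 V set point is from the post, but the loadline resistance and current figures are made up.

```python
# Toy model of vdroop (loadline droop): delivered voltage sags below the
# set point in proportion to load current. A no-vdroop BIOS behaves as if
# the loadline resistance were zero.

def core_voltage(v_set, i_load_amps, loadline_mohm=0.6):
    """Voltage actually delivered at a given load current."""
    return v_set - i_load_amps * loadline_mohm / 1000.0

print(core_voltage(1.25, 0))              # idle: full 1.25 V set point
print(round(core_voltage(1.25, 200), 3))  # heavy load sags to 1.13 V
```

The practical upshot is that a no-vdroop BIOS delivers the full set voltage even under benchmark load, which is why it suits benching better than 24/7 use.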


----------



## mojobear

Thanks Homecinema-pc.

I actually just sent you a PM before seeing your post, lol. I googled through the 290 flash thread and saw that you posted about the PT1T BIOS. My confusion is about the origins of PT1(T): aren't they from an Asus 290X BIOS? I know that my cards are LOCKED, so if I OC using PT1, won't I end up with the same issues as locked 290s flashed to a 290X BIOS?

Many thanks.


----------



## mojobear

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I think @HOMECINEMA-PC has used the PT1 bios when benching, either way he's a great source of info for this.


Many thanks Sgt Bilko


----------



## Sgt Bilko

Quote:


> Originally Posted by *mojobear*
> 
> Many thanks Sgt Bilko


No worries, I just pointed you in the right direction.








Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yep that's me the primary overvolter


that you are good sir, cheers for the help


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *mojobear*
> 
> Thanks Homecinema-pc.
> 
> I actually just sent you a PM before seeing your post lol...I googled through the 290 flash forum and saw that you posted on the PT1T bios. My confusion was regarding my understanding of the origins of the PT1(T)....arent they from an asus 290x bios? I know that my cards are LOCKED thus if I OC using the PT1 won't I end up with issues like those with locked 290s trying to flash to a 290x bios?
> 
> Many thanks.


No problem, man.

I answered your PM in a bit more detail, but read the first page of that thread and it will tell you why there is a PT1T BIOS, apart from the vdroop.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> ' SNIP '
> that you are good sir, cheers for the help


No dramas, man; with these cards I've got it sorted these days.
But those damn ATI drivers really suck, man.


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> No dramas man with these cards ive got it sorted these days .
> But those damn ati drivers really suck man


The newest beta is showing some promise, isn't it?

I saw someone actually beat your HoF score with lower GPU clocks thanks to the newer betas; then again, that was before you got the 3970X.









http://www.3dmark.com/3dm/3866863

How is it treating you anyways?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The newest beta is showing some promise, isn't it?
> 
> I saw someone actually beat your HoF score with lower GPU clocks thanks to the newer betas; then again, that was before you got the 3970X.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/3866863
> 
> How is it treating you anyways?


Not much better, man. I think my Sandy Bridge Win 7 install is corrupted. On the DC rig ATM; gonna bench on that tonight.

Just thought I would put this here, LoooooL


----------



## ebhsimon

Quote:


> Originally Posted by *rv8000*
> 
> Anyone with a Tri-X or Vapor-X, could you chime in on this?
> 
> I've been monitoring my 290 Vapor-X temps (at factory clocks) the past few days, mostly in WildStar and GW2, with a custom fan curve that's somewhat tolerable, and I'm still hitting upwards of 81°C after about an hour of play. Max fan speed during this time was 68%, averaging around 60%. Room temp is approx. 25-27°C. It seems awfully weird for the card to be getting this warm; I'd expect mid 70s, but 80+ is surprising.


There's definitely something wrong with your card. When it gets to 80°C, does it shoot up or increase gradually? If it shoots up, there's likely a problem with the cooler. Mine doesn't even break 70°C while playing BF3 @ 1440p with a slight overclock.


----------



## Spectre-

Can someone here help me out? I would love to catch up to HCPC in the R9 290 rankings, but my PC is being a poop.

So this is my problem: I install GPU drivers (doesn't matter which ones), restart, and everything goes fine up to the Windows logo; after that it's just a black screen. I have tried reseating the card and even moved it to a different PCIe slot, and I've tried booting with just one RAM stick. I don't think my PSU is dying; the card runs fine in safe mode and I don't have any problems in the BIOS. But there was one time when the side of my screen had blue artifacts.

Edit: the card just died on me... the RAM modules have gone bad.


----------



## smoke2

Quote:


> Originally Posted by *th3illusiveman*
> 
> What are your 24/7 overclocks? After some brief testing it seems mine sits comfy at 1100/1500 with +75mv in AB. I think i got a greedy core.


I didn't test it yet; it's new.
But it has a relatively high ASIC quality, 82.8%.
Does that have any meaning on a 290?


----------



## Red1776

Finished the tests on air



Now everyone in the pool 

R290X EK RC Rev 2.0


----------



## Sgt Bilko

Lookin good!


----------



## Paopawdecarabao

Would a Seasonic X-1050 power tri-fire? And what would be a good mobo for 3-way CrossFire? Current mobo is an ASRock Extreme4 with an i7-3770K.


----------



## chronicfx

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> Would a Seasonic x1050 power tri fire? And what would be a good mobo for a 3 way xfire? Current mobo is Asrock extreme4 i7 3770k.


I wouldn't do it with less than 1200 W, Seasonic or not.


----------



## Cyber Locc

Quote:


> Originally Posted by *Red1776*
> 
> Finished the tests on air
> 
> 
> 
> 
> Now everyone in the pool
> 
> 
> 
> 
> 
> 
> 
> 
> R290X EK RC Rev 2.0


What PSU are you rocking, and can it bench and OC without failing, lol? I'm about to start the same adventure and am going to hold out for the new 1600s coming out; EVGA's SuperNOVA 1600 W Titanium, yes please.

Darn, you're using duals. I was trying to avoid going that route, but it would be cheaper, and I do have a 900D, so there's room.


----------



## Red1776

Quote:


> Originally Posted by *cyberlocc*
> 
> ' SNIP '
> 
> What PSU are you rocking, and can it bench and OC without failing, lol? I'm about to start the same adventure and am going to hold out for the new 1600s coming out; EVGA's SuperNOVA 1600 W Titanium, yes please.
> 
> Darn, you're using duals. I was trying to avoid going that route, but it would be cheaper, and I do have a 900D, so there's room.

I am running three power supplies in the Holodeck:

1 x 1200 W

2 x 500 W

for 2200 W / 185 A total.


----------



## tsm106

People always say they will overclock, but really, how much are you going to OC? Anyway, do the math: 300 W x 1.5 (+50% power target) = 450 W, and 450 W x 4 = 1800 W just for the GPUs' theoretical maximum. Add in the CPU, the loop, and the other supporting systems...

There is no magic. That said, I doubt 99% of you are going to push the GPUs to their maximum. Outside of a handful of guys, maybe 4, I have not seen anyone else run 3-4 290Xs at 1300 MHz core or higher.
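tsm106's arithmetic can be written out as a quick sanity check. The 300 W board power and +50% power-target cap are the figures from the post; the CPU and pump/fan numbers are placeholder guesses, not measurements.

```python
# Worst-case DC power budget for a multi-GPU rig, per the math above.

def worst_case_watts(gpu_tdp=300, power_target=0.50, n_gpus=4,
                     cpu=250, pump_fans_etc=100):
    per_gpu = gpu_tdp * (1 + power_target)  # 300 W * 1.5 = 450 W per card
    return per_gpu * n_gpus + cpu + pump_fans_etc

print(worst_case_watts())  # 450 * 4 = 1800 W for the GPUs, ~2150 W total
```

The point stands regardless of the exact placeholder values: four cards at full power target alone exceed most single PSUs.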


----------



## Arizonian

Quote:


> Originally Posted by *Raul-7*
> 
> Can I be added? MSI R9 290, stock soon to be water.
> 


Congrats - added


----------



## Red1776

Quote:


> Originally Posted by *tsm106*
> 
> People always say they will overclock, but really, how much are you going to OC? Anyway, do the math: 300 W x 1.5 (+50% power target) = 450 W, and 450 W x 4 = 1800 W just for the GPUs' theoretical maximum. Add in the CPU, the loop, and the other supporting systems...
> 
> There is no magic. That said, I doubt 99% of you are going to push the GPUs to their maximum. Outside of a handful of guys, maybe 4, I have not seen anyone else run 3-4 290Xs at 1300 MHz core or higher.


Which is why I run at 2200 W+. These 290Xs are going to be pushed. If the air-cooled benching was any indication, I have some pretty good ones here.

anyway...


----------



## heroxoot

They look so sick with each other. I need to buy a bigger case someday and do some liquid. Can't fit a damn thing inside mine.


----------



## Widde

Damn, this thread has gained some pages since the start. Remember when it was around 20 pages? ^^ This is bad; I'm getting urges to add a third card, probably around the time I switch sockets. Gonna see where X99 is going first before I switch.


----------



## tsm106

Quote:


> Originally Posted by *Red1776*
> 
> ' SNIP '
> 
> Which is why I run at 2200 W+. These 290Xs are going to be pushed. If the air-cooled benching was any indication, I have some pretty good ones here.
> 
> anyway...

Yea, you're in the ballpark power-wise. Is that the serial/parallel bridge? Your setup is very shiny!


----------



## Red1776

Quote:


> Originally Posted by *tsm106*
> 
> ' SNIP '
> 
> Yea, you're in the ballpark power-wise. Is that the serial/parallel bridge? Your setup is very shiny!

Yeah, I can really push it with the 2.2 kW so far.

It's not really that bad; it just photographs that way. The custom sleeving will balance things out.

I got the EK semi-parallel bridge. I have been going full parallel until now, so it will be interesting to see what this does to pressure/flow and where my pump setting will need to be.


----------



## rv8000

Quote:


> Originally Posted by *ebhsimon*
> 
> There's definitely something wrong with your card. When it gets to 80c, does it shoot up to 80c or does it gradually increase? If it shoots up it's likely there's a problem with the cooler. Mine doesn't even break 70c while playing BF3 @ 1440p with a slight overclock.


It's a gradual increase; I'd say after about 15-20 minutes the card is up in the 76+°C range and continues upwards, 82°C is the hottest I've seen. I haven't repasted the card yet, but something seems wrong; there are no components near the card and I have a 140 mm 1200 RPM fan blowing across it in my 540.


----------



## kizwan

I'm not going to push my GPUs to maximum because I don't have a proper cooling system for them. This is why I'm holding off on flashing my cards to the PT1/PT1T VBIOS. Even with A/C, the max I can do is 1200/1600 with +200 mV (1.38-1.4 V before vdroop).

P/S: Something is wrong with this thread. When I click the last page, it doesn't open; basically the whole last page is missing. If I go to my subscription list and click the link, it will open the last page/post.


----------



## mus1mus

Quote:


> Originally Posted by *kizwan*
> 
> I'm not going to push my GPUs to maximum because I don't have a proper cooling system for them. This is why I'm holding off on flashing my cards to the PT1/PT1T VBIOS. Even with A/C, the max I can do is 1200/1600 with +200 mV (1.38-1.4 V before vdroop).
> 
> P/S: Something is wrong with this thread. When I click the last page, it doesn't open; basically the whole last page is missing. If I go to my subscription list and click the link, it will open the last page/post.


You need to click the button circled below.

Something must be going nuts on OCN's servers; I'm seeing the same on my subscribed pages. The most recent post doesn't show.


----------



## Germanian

Quote:


> Originally Posted by *tsm106*
> 
> That said, I doubt 99% of you are going to push the GPUs to their maximum. Outside of a handful of guys, maybe 4, I have not seen anyone else run 3-4 290Xs at 1300 MHz core or higher.


3x 1300 clocks? I'm jelly.


----------



## Cyber Locc

Quote:


> Originally Posted by *Red1776*
> 
> I am running three power supplies in the Holodeck.
> 1 x1200w
> 2 x 500w
> 
> For 2200w/185A


Dude, I don't think you're going to get 2200 W from that; I wouldn't assume so if I were you, so be careful with it. Dual PSUs lower the efficiency of both units by 20-30% (of the combined total), and with three that number goes up. I would drop the two 500s and run another 1200 W; then you can count on at least 2 kW. Also, three PSUs means three points of failure; IMO that's a very bad idea.

Also, I hope you live in the UK; a normal house breaker won't hold that load.

So TSM, 4 on a 1600 W with no voltage added, or 3 on a 1600 W overclocked with no more than +40% each, would work. Good to know; don't know why I didn't think of that. Derpa derpa.
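The breaker comment is easy to check with back-of-envelope math: wall draw is DC load divided by PSU efficiency, and breaker current is wall watts divided by mains voltage. The 2200 W figure is from the posts above; the 90% efficiency and the breaker ratings in the comments are typical assumptions, not measurements.

```python
# Mains current drawn by a rig, given DC load, PSU efficiency and voltage.

def wall_amps(dc_watts, mains_volts, efficiency=0.90):
    wall_watts = dc_watts / efficiency  # PSU losses show up as extra wall draw
    return wall_watts / mains_volts

print(round(wall_amps(2200, 120), 1))  # ~20.4 A: over a common 15 A US circuit
print(round(wall_amps(2200, 230), 1))  # ~10.6 A: comfortable on 230 V mains
```

This is why UK/EU 230 V mains (or a dedicated 20 A circuit, as Red1776 describes later) makes a 2.2 kW rig far less of a problem than a standard US 15 A branch circuit.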


----------



## Red1776

Quote:


> Originally Posted by *cyberlocc*
> 
> ' SNIP '
> 
> Dude, I don't think you're going to get 2200 W from that; I wouldn't assume so if I were you, so be careful with it. Dual PSUs lower the efficiency of both units by 20-30% (of the combined total), and with three that number goes up. I would drop the two 500s and run another 1200 W; then you can count on at least 2 kW. Also, three PSUs means three points of failure; IMO that's a very bad idea.
> 
> Also, I hope you live in the UK; a normal house breaker won't hold that load.
> 
> So TSM, 4 on a 1600 W with no voltage added, or 3 on a 1600 W overclocked with no more than +40% each, would work. Good to know; don't know why I didn't think of that. Derpa derpa.

You know, that has actually come up before, and I talked to an electrician about it. As I understand it from him, that would be the case if I were running them in series; my setup is a bit different, however. The AX1200 is completely separate from the dual 500 W / 40 A PSUs, and each 500 W PSU is completely separate from the other. The 500 W PSUs each have a separate BIOS-wake molex and a separate wall plug, and they do not go into the 24-pin ATX at all. They are dedicated to one GPU each and run off a separate circuit entirely.


----------



## Mega Man

Quote:


> Originally Posted by *cyberlocc*
> 
> Dude I don't think your going to get 2200w from that I wouldnt assume such if i was you be careful with it. Dual psus lower the efficiency of both psus by 20-30%(20-30% of the total of both) your running 3 that number goes up I would drop the 2 500s and run another 1200w then you should be able to make sure that you will get at least 2k. Also 3 psus is 3 points of failure just saying IMO thats a very bad idea.


Wait, how do you figure that?


----------



## kizwan

How do dual or triple PSUs lower efficiency?

Also, if a PSU is rated at 1200 W or 500 W, it will be able to supply that rated output regardless of its efficiency.
Quote:


> Originally Posted by *Red1776*
> 
> ' SNIP '
> 
> *The AX1200 is completely separate from the dual 500 W / 40 A PSUs, and each 500 W PSU is completely separate from the other. The 500 W PSUs each have a separate BIOS-wake molex and a separate wall plug, and they do not go into the 24-pin ATX at all.* They are dedicated to one GPU each and run off a separate circuit entirely.

That's pretty much how everyone runs their dual or triple PSUs.
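kizwan's distinction is worth spelling out: efficiency governs how much a PSU pulls from the wall for a given DC load, not how much DC it can deliver, and splitting the same load across units at the same efficiency changes nothing. A sketch with illustrative numbers (the 88% efficiency and loads are made up):

```python
# Wall draw for a given DC load; rated DC output is unaffected by efficiency.

def wall_draw(dc_load_watts, efficiency):
    return dc_load_watts / efficiency

# The same 1000 W DC load on one PSU vs split across two, both at 88%:
single = wall_draw(1000, 0.88)
split = wall_draw(500, 0.88) + wall_draw(500, 0.88)
print(round(single, 1), round(split, 1))  # identical wall draw either way
```

In practice the split setup can even come out slightly ahead, since each unit runs closer to the middle of its load curve, where efficiency usually peaks; it certainly doesn't cost 20-30%.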


----------



## Red1776

Quote:


> Originally Posted by *kizwan*
> 
> How do dual or triple PSUs lower efficiency? Also, if a PSU is rated at 1200 W or 500 W, it will be able to supply that rated output regardless of its efficiency.
> 
> ' SNIP '
> 
> That's pretty much how everyone runs their dual or triple PSUs.

Well, there is an add-on 24-pin fitting that you can plug into the ATX 24-pin, I guess, that feeds both supplies into the board at the ATX connector (I have not used one). But I was also pointing out that I do not have all 2.2 kW running from the same circuit.


----------



## tsm106

You mean the add2psu module? That is just a relay for the OCD; for the rest of us, you only need two pins to jump a PSU.

Hmm, someone thinks we lose 20-30% just by using multiple PSUs? How is it even possible to make that statement?


----------



## kizwan

Quote:


> Originally Posted by *Red1776*
> 
> ' SNIP '
> 
> Well, there is an add-on 24-pin fitting that you can plug into the ATX 24-pin, I guess, that feeds both supplies into the board at the ATX connector (I have not used one). But I was also pointing out that I do not have all 2.2 kW running from the same circuit.

Yeah, the dual-PSU cable connector is one way to power all three PSUs on and off at the same time; I guess you use a different device, then. And yes, I know you're explaining that all three PSUs are connected to different circuits (a dedicated circuit breaker, if I remember correctly).


----------



## Raul-7

Man, these cards really do run hot and loud. I just got the 290 and it hits 90°C in games without a flinch. Can't wait to get it under water.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Raul-7*
> 
> Man, these cards really do run hot and loud. I just got the 290 and it hits 90°C in games without a flinch. Can't wait to get it under water.


Have you tried a manual fan curve?


----------



## Red1776

Quote:


> Originally Posted by *kizwan*
> 
> ' SNIP '
> 
> Yeah, the dual-PSU cable connector is one way to power all three PSUs on and off at the same time; I guess you use a different device, then. And yes, I know you're explaining that all three PSUs are connected to different circuits (a dedicated circuit breaker, if I remember correctly).

Wow, good memory 

I had my guy wire a dedicated 20 A circuit.

The only way my two 500 W PSUs are connected is a 4-pin molex that receives the BIOS wake-up relayed via the AX1200; otherwise they share no source or paths. When I shut down, all three power down.


----------



## Widde

What's up with the forums? Anyone else having trouble viewing the latest page? I can only see 2879, but if I click "10 unread posts" I get to page 2880 :S And I can't see my own posts until an hour or so after I post, or until enough people have responded that it gets a new page.


----------



## keikei

Quote:


> Originally Posted by *Widde*
> 
> What's up with the forums? Anyone else having trouble viewing the latest page? I can only see 2879, but if I click "10 unread posts" I get to page 2880 :S And I can't see my own posts until an hour or so after I post, or until enough people have responded that it gets a new page.


You're not the only one. Basically, for now go to *forums > my posts* to see your latest posts.


----------



## bluedevil

Why in the world would I want a Sapphire Vapor-X 290 over my AIO'd VisionTek 290? Call me crazy, but I really want a blue build, and that HSF on the Vapor-X is dead sexy.


----------



## digitally

Did AMD end production of the R9 290? Almost no retailer in my entire country has any stock anymore. Ugh...


----------



## Descadent

Quote:


> Originally Posted by *bluedevil*
> 
> Why in the world would I want a Sapphire Vapor-X 290 over my AIO'd VisionTek 290? Call me crazy, but I really want a blue build, and that HSF on the Vapor-X is dead sexy.


it's even more sexy on 290x


----------



## bluedevil

Quote:


> Originally Posted by *Descadent*
> 
> it's even more sexy on 290x


Hotness.....prices suck on them though....


----------



## Descadent

I got them for $550 apiece on a sale that Amazon price-matched from Newegg... sold off my two 780 Tis and at least made $100 back.


----------



## bluedevil

Quote:


> Originally Posted by *Descadent*
> 
> i got them for $550 a piece on sale that amazon price matched newegg...sold off my two 780 ti's and at least made $100 back


True though....but I would only get about $250 back on my GPU and about $50 back on my AIO, so about $300 total back. That's still about $250 short.....not gonna happen, too expensive.


----------



## Talon720

Hey all, just wanted to give a simple solution to what seems like a common problem. The EK bridge blocks the BIOS switch, which can cause a headache if the BIOS you flash doesn't work. This happened to me. I ended up using a zip tie strong enough to push the button but thin enough to fit behind the bridge. I tried all sorts of things, and a medium-ish zip tie worked great.


----------



## Cyber Locc

Quote:


> Originally Posted by *Red1776*
> 
> Wow, good memory
> 
> 
> 
> 
> 
> 
> 
> 
> I had my guy wire a dedicated circuits 20A.
> The only way my two 500w PSU's are connected is a 4 pin molex that receives the wake up from BIOS via the AX1200w relaying it. otherwise they share no source or paths. When I shut down all three power down.


For everyone, I'm trying to find it. The 20%-30% loss figure came from a member here who studies PSUs and is an electrician. Red seems to know exactly what I'm talking about.







I am going to look for him; I can't remember his name atm. I'll be back with the info.

It had something to do with the ATX pin, though; I do remember that much.

There are other issues as well that I found in my search; I will post them. (I think this might be the guy I was thinking of, but it was last week and my memory is bad, so check this out while I search:) http://www.overclock.net/t/1177714/faq-dual-power-supplies

Red, your FSP PSUs are group regulated, so that's a bad idea in and of itself, as shown.


----------



## Jflisk

I have a question: how much wattage are 3x 290X really using at full throttle? I have a WeMo, basically a Kill A Watt device; its max reading is 850W for the 3x 290X and FX-9590 CPU. I have a 1350W Thermaltake power supply (750 watts per rail). When I overclock the CPU I get intermittent freezes; I just noticed this after installing the 3rd GPU. It looks like at full tilt I am wearing out the power supply. I have the 1350's dual rails balanced between the 3 cards. Debating motherboard or power supply. All under water, so temps are not an issue.

Thanks


----------



## Ramzinho

Next Gen sapphire should start On having Water-X with a block included


----------



## fateswarm

Quote:


> Originally Posted by *Jflisk*
> 
> I have a question how much wattage are 3x 290x really using at full throttle. I have a wemo basically a kilowatt device its max read is 850 W for 3x290x and FX9590 CPU. I have a 1350W thermaltake Power supply (750 watts per rail). When I set the CPU on overclock I get intermittent freezes. Just noticed this after installing the 3rd GPU. It looks like at full tilt I am wearing out the power supply. I have the 1350 dual rails balanced between the 3 cards. Debating motherboard or Power supply. All underwater so temps are not an issue.
> 
> Thanks


To be safe, the PSU experts suggest at least 1300W. But what most people don't know is that the usage of the CPU can vary a lot (e.g. some Intel CPUs can use 50W in some games and 150W in others), and AMD is worse at that. The overclocks of the GPUs also play a role, of course.


----------



## Red1776

Quote:


> Originally Posted by *cyberlocc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Wow, good memory
> 
> 
> 
> 
> 
> 
> 
> 
> I had my guy wire a dedicated circuits 20A.
> The only way my two 500w PSU's are connected is a 4 pin molex that receives the wake up from BIOS via the AX1200w relaying it. otherwise they share no source or paths. When I shut down all three power down.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For everyone Im trying to find it. The 20%-30% loss came from a member here that studys psus and is an electrician. Red seems to know exactly what im talking about
> 
> 
> 
> 
> 
> 
> 
> I am going to look for him I cant remember his name atm. Ill be back with the info.
> 
> It had something to do with the atx pin though I do remember that much.
> 
> There is other issues as well in my search I will post them. (i think this might be the guy I was thinking of but it was last week and my memory is bad but check this out while I search) http://www.overclock.net/t/1177714/faq-dual-power-supplies
> 
> Red your fsp psus are group regulated so thats a bad idea in and of itself as shown.

 The FSP units are high quality PFC units that receive a wake up signal from the BIOS



Quote: from the [H] review


> *Auxiliary Power Supply (FSP Booster X5 450W)*
> 
> Whereas the Add2Psu and Lian Li Secondary Power Supply Starter Kit were only pieces to the dual power supply puzzle, an auxiliary power supply such as the FSP Booster X5 450W is the complete package.


They do not use the ATX pin adapter shown in the link, and do not route power through the board at all.

Besides the wealth of very good reviews and testing of these PFC PSUs, I had the first one I purchased cracked open by an IT guy who worked at Cisco Systems, because I had wondered about potential issues of adding separate PSUs to a single system.

I have owned 8 of them, and they have been in my last 4 quadfire builds operating flawlessly.


----------



## kizwan

Quote:


> Originally Posted by *cyberlocc*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Wow, good memory
> 
> 
> 
> 
> 
> 
> 
> 
> I had my guy wire a dedicated circuits 20A.
> The only way my two 500w PSU's are connected is a 4 pin molex that receives the wake up from BIOS via the AX1200w relaying it. otherwise they share no source or paths. When I shut down all three power down.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For everyone Im trying to find it. The 20%-30% loss came from a member here that studys psus and is an electrician. Red seems to know exactly what im talking about
> 
> 
> 
> 
> 
> 
> 
> I am going to look for him I cant remember his name atm. Ill be back with the info.
> 
> It had something to do with the atx pin though I do remember that much.
> 
> There is other issues as well in my search I will post them. (i think this might be the guy I was thinking of but it was last week and my memory is bad but check this out while I search) http://www.overclock.net/t/1177714/faq-dual-power-supplies
> 
> Red your fsp psus are group regulated so thats a bad idea in and of itself as shown.

The FSP PSU that Red is using is the FSP BoosterX5. It's a PSU designed to support multi-GPU setups, especially when the main PSU is unable to handle all the GPUs. It only provides a +12V rail as far as I know, so there will be no unbalanced-load issue.


----------



## Red1776

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cyberlocc*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Wow, good memory
> 
> 
> 
> 
> 
> 
> 
> 
> I had my guy wire a dedicated circuits 20A.
> The only way my two 500w PSU's are connected is a 4 pin molex that receives the wake up from BIOS via the AX1200w relaying it. otherwise they share no source or paths. When I shut down all three power down.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For everyone Im trying to find it. The 20%-30% loss came from a member here that studys psus and is an electrician. Red seems to know exactly what im talking about
> 
> 
> 
> 
> 
> 
> 
> I am going to look for him I cant remember his name atm. Ill be back with the info.
> 
> It had something to do with the atx pin though I do remember that much.
> 
> There is other issues as well in my search I will post them. (i think this might be the guy I was thinking of but it was last week and my memory is bad but check this out while I search) http://www.overclock.net/t/1177714/faq-dual-power-supplies
> 
> Red your fsp psus are group regulated so thats a bad idea in and of itself as shown.
> 
> 
> The FSP PSU that red is using is FSP BoosterX5. It's a PSU designed for supporting multi-GPU especially when the main PSU unable to handle all GPU's. It only provide +12V rail as far as I know. There will be no unbalanced loads issue.

More eloquent than I put it.


----------



## Cyber Locc

Quote:


> Originally Posted by *Red1776*
> 
> The FSP units are high quality PFC units that receive a wake up signal from the BIOS
> 
> 
> 
> They do not use the ATX pin adapter shown in the link , and do not enter the power through the board at all.
> besides the wealth of very good reviews and testing of these PFC PSU's, I had the first one I purchased cracked open by an IT guy who worked at Cisco Systems because I had wondered about potential issues of adding separate PSU's to a single system.
> I have owned 8 of them and they have been in my last 4 quadfire builds operating flawlessly.


The link doesn't say anything about adapters.

Quote: "Dual PSUs should only be used when both power supplies are high quality and use independent regulation." Yours, while they may be good quality, are group regulated, not independent; them not going through the board is the problem.

Anyway, this is OT, so let's wrap it up. If it's been working well, that's what matters; it's just not something I would do.

When I find the 20% loss post I will edit this and add it here.

EDIT: Okay, that is different; that is an independently regulated PSU. However, that's not the PSU your rig specs led me to believe, as first off it's 450w, not 500. That changes everything; carry on. Does anyone still make something like that? I need one, or 3 actually.


----------



## Jflisk

Quote:


> Originally Posted by *fateswarm*
> 
> To be sure the PSU experts suggest at least 1300W. But, what most people don't know is that the usage of the CPU can vary a lot (e.g. some intel CPUs can be used on 50W in some games and 150W on other games) and AMD is worse at that. The overclocks of the GPUs also play a role of course.


My processor is 220W, so let's keep that in mind. Thanks.


----------



## Cyber Locc

Quote:


> Originally Posted by *Jflisk*
> 
> My processor is 220W so lets keep that in mind. Thanks


At stock they pull ~300w each, so 900w, plus your CPU at 220w; that's 1,120w. As long as you haven't OC'd, you're fine. If you're overclocking the cards they can pull up to 450w each (+50% PT).

Plus that's just your CPU and cards; you also said you're under water, so you have other components pulling power too. More than 200w? I don't know, but it's possible. Need more info.

If your Kill A Watt is reading 850, that's idle or it's lying (joking, I know it won't idle at 850w).
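The arithmetic above can be sketched as a quick budget. This is a rough estimate only; the per-component wattages are the thread's ballpark figures (not measurements), and the allowance for pumps, fans, and drives is a guess:

```python
# Rough DC load estimate using the ballpark figures quoted above:
# ~300 W per 290X at stock, ~450 W each at +50% power target,
# a 220 W FX-9590, and a guessed allowance for pumps/fans/drives.
def system_draw(n_gpus, gpu_w=300, cpu_w=220, other_w=100):
    """Estimated DC load in watts for the whole system."""
    return n_gpus * gpu_w + cpu_w + other_w

stock = system_draw(3)                    # 3x 290X at stock clocks
overclocked = system_draw(3, gpu_w=450)   # 3x 290X at +50% power target
print(stock, overclocked)
```

On those assumptions a 1350W unit is already marginal once the cards are overclocked, which would fit the intermittent freezes described above.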


----------



## Jflisk

Quote:


> Originally Posted by *cyberlocc*
> 
> At stock they pull 300w each so 900ws plus your cpu at 220w thats 1100 as long as you havent oced your fine if your overclocking the cards they can pull up to 450 (+50% PT)
> 
> Plus thats just your cpu and cards you also said under water so you have other components also pulling power more than 200w I don't know but its possible. Need more info.
> 
> If your killawatt is reading 850 thats idle or its lying.


So essentially I have a power problem. I overclock everything: video cards at 1060/1350, CPU at 5.0GHz.


----------



## Red1776

Quote:


> Originally Posted by *cyberlocc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> The FSP units are high quality PFC units that receive a wake up signal from the BIOS
> 
> 
> 
> They do not use the ATX pin adapter shown in the link , and do not enter the power through the board at all.
> besides the wealth of very good reviews and testing of these PFC PSU's, I had the first one I purchased cracked open by an IT guy who worked at Cisco Systems because I had wondered about potential issues of adding separate PSU's to a single system.
> I have owned 8 of them and they have been in my last 4 quadfire builds operating flawlessly.
> 
> 
> 
> The link doesnt say anything about adapters.
> 
> quote - "Dual PSUs should only be used when both power supplies are high quality and use independent regulation" Yours while may be good quality are group regulated not independent them not going through the board is the problem.
> 
> Anyway this is OT so lets wrap it up if its been working good that is what matters just not something I would do.
> 
> When I find the 20% loss post I will edit this and add it here.
> 
> EDIT: Okay that is different that is an independent psu. However thats not the psu your rig specs led me to believe as first off that's 450w not 500 that changes everything carry on. Does anyone still make something like that I need one or 3 actually

They do; you can get them on eBay, Amazon, and here for starters. They are 450w continuous and 500w peak; however, I've read load testing where they held well over 500W indefinitely.

Some more info:

Quote:


> - 450 Watt Supplemental Power Supply
> - Great for adding power for additional Graphic Card
> - Supports SLI and Crossfire
> - Active Power Factor Correction
> - Over Clock Running to 500 Watt Peak
> - Supports 4 PCI-Express Power Connectors
> - Improves PC Stability
> - Dual Intake Smart Cooling Fan through front of Case
> - 7 Changeable Colors LED Front Panel
> - Fits into your 5.25" Bay
> - 85+ Efficiency
> - Installation Kit
> - Auto Recovery Program
> - Designed by FSP Group


Ripple on these units was very low (7-25mv), and they stay very cool.

If you get them, let me know what you think.









And a link:

http://www.cputopia.com/fsp-group-booster-x5.html

Greg


----------



## battleaxe

Man, I've been having a nightmare with my 290s. Both work fine. I've switched slots, and both cards will work in slots 1 and 2. Problem is, no matter what I do I cannot get the crossfire tab to show in CCC...? I've tried different drivers, used the DDU uninstall tool, run Windows Update, switched the cards, and still nothing.

Has anyone run into this? Both cards show up in GPUz but crossfire shows as disabled. So weird. Any help appreciated. And yes, I've been researching Google quite a bit over this. Asking here as a last resort.


----------



## Cyber Locc

Quote:


> Originally Posted by *Red1776*
> 
> They do, you can get them on ebay, Amazon and here for starters. they are 450w continuous and 500w peak, however I read load testing and they held well over 500W indefinitely.
> 
> Some more info:
> 
> Ripple on these units was very low 7-25mv and they stay very cool
> If you get get them, let me know what you think.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And a link:
> http://- 450 Watt Supplemental Power Supply\
> 
> Greg


I would love to get them; however, they have been discontinued in the last couple of months and you can't get them anywhere. I checked Newegg, Amazon, TigerDirect and eBay; not 1 available. There is an X-Power one that is still in production and is 20 bucks, but it has split rails, among other issues. If you come across any, be sure to let me know; I need one, or actually 4.









I too have a 20-amp circuit with 2 outlets, so the PC can be run solo on it, but if I have to put in a second real PSU I lose my brand-new 480 rad in my 900D, and I'm not doing that.


----------



## Red1776

Quote:


> Originally Posted by *cyberlocc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> They do, you can get them on ebay, Amazon and here for starters. they are 450w continuous and 500w peak, however I read load testing and they held well over 500W indefinitely.
> 
> Some more info:
> 
> Ripple on these units was very low 7-25mv and they stay very cool
> If you get get them, let me know what you think.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And a link:
> http://- 450 Watt Supplemental Power Supply\
> 
> Greg
> 
> 
> 
> I would love to get them they however have been discounted in the last couple months and you cant get them anywhere I checked new egg amazon tiger direct and ebay not 1 available. There is a xpower one that still in production and is 20 bucks but it has split rails among other issues. If you come across any be sure to let me know I need one or actually 4
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I too have a 20amp circuit for 2 outlets so the pc can be ran solo on it but if i have to put in a second real psu i lose my brand new 480 rad in my 900d and not doing that

Sorry cyber, I C&P'd the wrong link. Try this:

http://www.cputopia.com/fsp-group-booster-x5.html

Greg


----------



## Cyber Locc

Quote:


> Originally Posted by *Red1776*
> 
> Sorry cyber, I c & P the wrong link. try this
> http://www.cputopia.com/fsp-group-booster-x5.html
> 
> Greg


Out of stock









They're unicorns, bro.









What I may do is have 3 overclocked and the 4th at stock clocks; then in quadfire they will revert to stock volts, and if I switch that card off for whatever reason, the others' overclocks will kick in, matching the performance of the quadfire anyway.

I can still OC anyway, just not for benching, and if I stay close to stock voltage I may be good.

What do you think: a 4930K at 1.4v and 4 R9 290Xs, with 2 SSDs, 3 HDDs, an NZXT LED strip, 2 MCP35Xs, and 26 fans. Is 1600w enough? I'm thinking not.


----------



## Jflisk

Quote:


> Originally Posted by *battleaxe*
> 
> Man I've been having a nightmare with my 290s. Both work fine. I've switched slots and both cards will work in slot 1 and 2. Problem is no matter what I do I cannot get the crossfire tab to show in CCC...? I"ve tried different drivers, used the DDU uninstall tool, updated Windows update, switched the cards and still nothing.
> 
> Has anyone run into this? Both cards show up in GPUz but crossfire shows as disabled. So weird. Any help appreciated. And yes, I've been researching Google quite a bit over this. Asking here as a last resort.


Went through this last week when I added my third GPU (1 day trying to install the 3rd GPU; 30 seconds the next day and it works like nothing was wrong). Remove the driver (DDU - totally wreck out the driver), stick the card in the bottom slot. Run the card in the bottom slot, bring up the display. Shut down. Put a card in the top slot, start the computer. Shut down, connect the display cable to the top card, bring up the display. Should work. Make sure it shows 2 GPUs in the hardware tab. In other words, the process is: remove drivers, then bring up one card at a time, from the bottom up, with the display attached. Good luck.


----------



## battleaxe

Quote:


> Originally Posted by *Jflisk*
> 
> Went thru this last week when I added my third GPU(1 day to try and install 3rd GPU -30 Sec next day works like nothing wrong). Remove driver (DDU-Total wreck out driver) stick card in bottom slot. Run card in bottom slot bring up display. Shut down - Put card in top start computer. Shut down install cable in top card bring up display. Should work. Make sure shows 2 gpus in hardware tab. In other words the process is remove drivers bring up one card at a time from the bottom up with display. Good luck.


Man, I hope this works. Great idea; the one thing I had not tried. I'm at a total loss if this doesn't work. Both cards show in Device Manager, and both show up in Afterburner, but the bottom card is always shown as disabled in CCC. So let's hope this does the trick. Thank you for the idea. +1!! Even if it doesn't work.


----------



## Jflisk

Quote:


> Originally Posted by *battleaxe*
> 
> Man I hope this works. Great idea. The one thing I had not tried. I'm at a total loss if this doesn't work. Both cards show in device manager, and both show up in Afterburner. But the bottom card is always shown as disabled on CCC. So lets hope this does the trick. Thank you for the idea. +1 !! Even if it doesn't work.


Mine did the exact same thing. Like I said, I sat there and banged my head against the desk for 1 whole night. Woke up the next day, tried the suggestion I gave you, and it worked within 30 seconds. I tried to get help on here and got the usual answers that I had already tried, so I had to go way outside the box on this one. Also try reseating the cards. CCC is supposed to see the cards; sometimes it doesn't.


----------



## Red1776

Quote:


> Originally Posted by *cyberlocc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Sorry cyber, I c & P the wrong link. try this
> http://www.cputopia.com/fsp-group-booster-x5.html
> 
> Greg
> 
> 
> 
> Out of stock
> 
> 
> 
> 
> 
> 
> 
> 
> 
> There unicorns bro
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What I may do is have 3 overclocked and the 4th stock clocks then in quadfire they will result to stock volts and if I swicth that card off for whatever reason there overclocks will throw in macthing the performance of the quadfire anyway.
> 
> I can still oc anyway just not for benching and if i stay close to stock voltage i may be good.
> 
> What you think 4930k at 1.4v and 4 r9 290xs with 2 ssds and 3 hdds nzxt led strip and 2 mcp35xs and 26 fans 1600w enough im thinking not

A quality 1600w will run it, but without OC, and (this is my opinion) I like ~25% overhead.

There are others running that configuration on a 1600w, but with even a very modest OC I have exceeded that.

I will call a few of my contacts and find some X5's.









Greg
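The ~25% headroom rule above can be put into numbers. A minimal sketch, assuming ~450 W per overclocked 290X and a guessed ~300 W for the overclocked CPU plus pumps, fans, and drives (none of these are measured values):

```python
# Sketch of the ~25% headroom sizing rule from the post above.
def recommended_psu(load_w, headroom=0.25):
    """PSU rating giving the requested fractional headroom over load."""
    return load_w * (1 + headroom)

# Hypothetical quadfire build: 4x 290X at ~450 W each overclocked,
# plus ~300 W for an overclocked CPU, pumps, fans, and drives.
load = 4 * 450 + 300
print(recommended_psu(load))
```

On those assumptions the recommended rating comes out well above 1600w, which is why a single 1600w unit comes up short once all four cards are overclocked.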


----------



## Jflisk

Quote:


> Originally Posted by *Red1776*
> 
> a quality 1600w will run it, but without OC and (this is my opinion) I like 25% +/- overhead
> There are others running that configuration on a 1600w, but with even a very modest OC I have exceeded that.
> I will call a few of my contacts and
> I will find some X5's
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Greg


This looks like a good idea. There are none available on eBay either. I could use one at this point.


----------



## Cyber Locc

Quote:


> Originally Posted by *Jflisk*
> 
> This looks like a good idea. There are none available on the bay either. I could use one at this point.


Right, lol. They wait until the 290Xs come out, then they stop making 'em, now that people actually need them, lol.


----------



## Jflisk

Quote:


> Originally Posted by *cyberlocc*
> 
> Right lol they wait until 290xs come out then they stop making em now that people actually need them lol


I am looking at the LEPA 1600W as a power supply right now. I am looking for 1500W or up to run my system at this point. I think 1500W might be the max; I think the 1600W will be tripping breakers left and right.


----------



## Cyber Locc

Quote:


> Originally Posted by *Jflisk*
> 
> I am looking at the LEPA 1600W as a power supply right now. I am looking for 1500W or up to run my system at this point. I think 1500W might be max. Think the 1600W will be killing breakers left and right.


Look elsewhere; LEPAs can't run quadfire, and I wouldn't even push trifire on them. They're voiding warranties if you run 290Xs on them, as they fail within days. TSM said it's the amps, if I remember correctly; there are some posts about it in this thread. LEPA will ask if you're running 290s and won't honor the warranty if you are; they say the cards pull more than the PSU is rated for.

Look at the new SuperNOVA or a Hercules.


----------



## bond32

Quote:


> Originally Posted by *Jflisk*
> 
> I am looking at the LEPA 1600W as a power supply right now. I am looking for 1500W or up to run my system at this point. I think 1500W might be max. Think the 1600W will be killing breakers left and right.


Yeah, I hear the LEPA is no good. Like the other guy said, check out EVGA, which is scheduled to release a 1600-watt G2 soon. Especially considering that the 9590 probably pulls more power than most of our i7's do, you will be better off using 2 PSUs, or a top-of-the-line EVGA 1600 watt or Corsair AX1500i.

I have the EVGA 1300-watt G2. Running trifire and a 4770K with a heavy OC, pushing the cards to 1.42 volts caused the system to shut down every time.


----------



## Cyber Locc

Quote:


> Originally Posted by *bond32*
> 
> Yeah I hear the LEPA is no good. Like the other guy said, check out the EVGA which they are scheduled to release a 1600 watt G2 soon. Especially considering that 9590 probably pulls more power than most of our i7's do, you will be better off using 2 psu's or either a top of the line EVGA 1600 watt or Corsair AX1500i.
> 
> I have the EVGA 1300 watt G2. Running trifire and a 4770k with a heavy OC, pushing the cards to 1.42 volts caused the system to shut down every time.


It's out; it's $350 and it got a really good review from jonnyGURU:

http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story&reid=391

http://www.newegg.com/Product/Product.aspx?Item=N82E16817438033&cm_re=super_nova_1600w-_-17-438-033-_-Product

Go to Newegg; for some reason Amazon is trying to charge $410. They're nuts.


----------



## bond32

Quote:


> Originally Posted by *cyberlocc*
> 
> Its out its 350 and got a really good review from jonny guru
> 
> http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story&reid=391
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817438033&cm_re=super_nova_1600w-_-17-438-033-_-Product
> 
> Go to newegg for some reason amazon is trying to charge 410 there nuts


Holy cow. What a beauty... Looks slightly bigger than the 1300 watt I think, taller that is.


----------



## VSG

Quote:


> Originally Posted by *bond32*
> 
> Holy cow. What a beauty... Looks slightly bigger than the 1300 watt I think, taller that is.


It is 225mm long; the 1300 G2 is 200mm, I think. If I weren't so lazy, I would get off my backside and actually measure it in person.


----------



## failwheeldrive

Cable pinout should be the same between the 1600 and 1300, right?


----------



## VSG

Yup!


----------



## rv8000

Would any Vapor-X 290 owners be kind enough to post some temperature information for comparison? Stock settings and stock fan curve, 10-15 minutes of an intensive game. My card is pushing 83C at factory settings in a superbly ventilated Air 540 after just 10-15 minutes of gaming in a 27-28C ambient room.


----------



## kizwan

Quote:


> Originally Posted by *rv8000*
> 
> Would any Vapor-X 290 owners be kind enough to post some temperature information for comparison. Stock settings and stock fan curve, 10-15 minutes of an intensive game. My card is pushing 83c at factory settings in a superbly ventilated Air 540 after just 10-15 minutes of gaming in a 27-28c ambient room.


Can you measure the temp inside the case?


----------



## Strileckifunk

Trying to decide on picking up another 7970 ghz, or selling mine to go up to a 290x w/ 4gb VRAM. Since I game @ 1440p I figured the extra VRAM would help. Any suggestions?


----------



## rv8000

Quote:


> Originally Posted by *kizwan*
> 
> Can you measure the temp inside the case?


Ambient via digital thermometer after 7 mins of being inside the closed case is reading 28.3c.

Would you like case temp during gaming as well?


----------



## th3illusiveman

is +150mv safe? I think my Tri-X has the reference PCB with ref VRMs.
Quote:


> Originally Posted by *Strileckifunk*
> 
> Trying to decide on picking up another 7970 ghz, or selling mine to go up to a 290x w/ 4gb VRAM. Since I game @ 1440p I figured the extra VRAM would help. Any suggestions?


I went from a 7970 to a 290X, and there is a NICE difference. Everything is much smoother, and this is at 1080p! You should notice it more at 1440p. Though my 7970 ran at like 955MHz, so if you're running one at 1200MHz+ then maybe it won't be as noticeable.


----------



## kizwan

Quote:


> Originally Posted by *rv8000*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Can you measure the temp inside the case?
> 
> 
> 
> Ambient via digital thermometer after 7 mins of being inside the closed case is reading 28.3c.
> 
> Would you like case temp during gaming as well?

Yes please.


----------



## battleaxe

Quote:


> Originally Posted by *Jflisk*
> 
> Mine did the exact same thing. Like I said I sat there and banged my head against the desk for 1 whole night. Woke up the next day and tried the suggestion I gave you and it worked within 30 seconds.I tried to get help on here and got the usual answers that I already tried. So I had to go way outside the box on this one. Also try reseating the cards. CCC is supposed to see the cards sometimes it don't.


Well, no dice. I think my motherboard is toast. Crud... Pretty sure I've tried everything. I just did a fresh OS install and same thing.


----------



## Mega Man

Quote:


> Originally Posted by *cyberlocc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jflisk*
> 
> My processor is 220W so lets keep that in mind. Thanks
> 
> 
> 
> At stock they pull 300w each so 900ws plus your cpu at 220w thats 1100 as long as you havent oced your fine if your overclocking the cards they can pull up to 450 (+50% PT)
> 
> Plus thats just your cpu and cards you also said under water so you have other components also pulling power more than 200w I don't know but its possible. Need more info.
> 
> If your killawatt is reading 850 thats idle or its lying.

Huh? Please tell me you are joking. At idle, maybe 200-300w, and I am well overestimating....

Idle is idle: little power use. You could pump in as much voltage as you want.... idle is idle.

Amps are pulled, not pushed, i.e. you only use what you need, nothing more.


----------



## rv8000

Quote:


> Originally Posted by *kizwan*
> 
> Yes please.


After 20 minutes of gameplay the case temp was 33.3c


----------



## Cyber Locc

Quote:


> Originally Posted by *Mega Man*
> 
> huh? please tell me your are joking, at idle maybe 200-300w, but i am well overestimating....
> 
> idle is idle. little power use, you could pump as much volts as you want.... idle is idle.
> 
> amps are pulled not pushed, ie you only use what you need. nothing more


I was joking. I updated it to show that, lol. Sorry, I know it's hard to tell on forums; yes, it was a joke.

Also, I think you mean watts. I've seen you say amps are what we pay for; I don't know where you live, but electric companies charge in kWh (kilowatt-hours), not in amps, lol.


----------



## kizwan

Quote:


> Originally Posted by *rv8000*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Yes please.
> 
> 
> 
> After 20 minutes of gameplay the case temp was 33.3c

With that case temp, 83C is in the range of what you should get, even at stock. Your ambient is already high at around 27-28C. Since you mentioned the card was at factory settings, you might want to try a custom fan profile. You may get it down into the high 70s Celsius at stock clocks.


----------



## Mega Man

Quote:


> Originally Posted by *cyberlocc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> huh? please tell me your are joking, at idle maybe 200-300w, but i am well overestimating....
> 
> idle is idle. little power use, you could pump as much volts as you want.... idle is idle.
> 
> amps are pulled not pushed, ie you only use what you need. nothing more
> 
> 
> 
> I was joking. I updated it to show that lol. Sorry I know its hard to tell on forums yes it was a joke.
> 
> Also I think you mean watts I seen you say amps is what we pay for I don't know where you live but electric companies charge in KWH or kilowatt hour not in amps lol

Watts are derived from amps: the fewer amps you use, the fewer watts.

You pay for how many amps you use.

I.e.

One is like saying you owe $1.

The other is saying you owe 100 cents. It is the same thing.


----------



## Cyber Locc

Quote:


> Originally Posted by *Mega Man*
> 
> Watts is derived from amps. The less amps you use. The less watts.
> 
> You pay for how many amps you use.
> 
> Ie
> 
> One is like saying you owe $1.
> 
> The other is 100 cents. It is the same thing


That's true only if you fix the voltage: at 100V, an amp is 100W, and since most places use 110V, an amp would actually be 110W by your equation. But electric companies don't measure in amps, lol.

Dude, I own an RV park where I have to pay the electric company and charge my residents, and both bills are in kWh. Electric companies don't speak in amps, lol; look at your electric bill. Also, if you're paying $1 a kilowatt-hour, you're getting ripped off.

Electrical power is measured in watts. In an electrical system, power (P) is equal to the voltage multiplied by the current.

I.e. amperage x volts = watts, which is how power is measured, not amps (a.k.a. current).
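The arithmetic above can be checked in a couple of lines (an illustrative sketch; the `watts` helper is made up for this post, not from any tool):

```python
# P = V * I: power in watts is voltage times current.
def watts(volts, amps):
    return volts * amps

# The same 1 A of current gives very different wattage depending on voltage:
print(watts(12, 1))   # 12
print(watts(110, 1))  # 110
print(watts(240, 1))  # 240
```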

Anyway, this is OT, so let's change the subject.


----------



## rv8000

Quote:


> Originally Posted by *kizwan*
> 
> With that case temp, 83C is in the range what you should get even at stock. Your ambient already high at around 27 - 28C. Since you mentioned the card was at factory settings, you might want to try custom fan profile. You may get it down in high 70s Celsius at stock clock.


True, this is pretty close to the 49C delta some reviews show. Some people in the Vapor-X thread in this same forum subsection seem to have considerably lower deltas though, 40-45C. Why did I give up on that loop?


----------



## Anthrax234

Here ya go
XFX 290X Double Dissipation, stock cooling.
http://www.techpowerup.com/gpuz/details.php?id=e983r


----------



## th3illusiveman

So it looks like once VRM1 goes over 80C, your overclocks get rendered void with this chip. Same as Tahiti.


----------



## Cyber Locc

Quote:


> Originally Posted by *th3illusiveman*
> 
> so it looks like once VRM1 goes over 80c your Overclocks get rendered void with this chip. Same as Tahiti.


Hmm, link to this info? Or how did you find this out?

Also, how did you monitor the VRM temps? I got my core to 90C under Heaven, highly clocked, and my clocks didn't drop. Note I have very few fans going over the card, as the rest of the system is under water, and I watched the Afterburner readout with my clocks and temps the entire time. The clocks never went down, usage stayed at 100%, and I'm pretty sure my VRMs were over 80C.

Also, seeing how AMD claims the VRM is safe to 105 or 115 (don't remember which; it was one of them though), I highly doubt this.

Hmm, my VRM isn't going over 65C even with the GPU at 90C; guess I underestimated those rad fans blowing down, lol.


----------



## Ironsmack

Quote:


> Originally Posted by *th3illusiveman*
> 
> so it looks like once VRM1 goes over 80c your Overclocks get rendered void with this chip. Same as Tahiti.


If I'm understanding this correctly, if it goes over 80C, it throttles down?


----------



## Mega Man

Quote:


> Originally Posted by *cyberlocc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> Watts is derived from amps. The less amps you use. The less watts.
> 
> You pay for how many amps you use.
> 
> Ie
> 
> One is like saying you owe $1.
> 
> The other is 100 cents. It is the same thing
> 
> 
> 
> That's true a kwh is an amp (if we are talking about 100v, Being most places use 110v an amp is actually 100ws more than a kilowatt so that would be 110 by your equation) but electric companies don't measure in amps lol.
> 
> Dude I own a rv park that I have to pay the electric company and charge my residents and both of which are in KWH Electric companies don't speak in amps lol look at your electric bill. Also if your paying 1 dollar a kilowatt your getting ripped off.
> 
> Electrical power is measured in watts. In an electrical system power (P) is equal to the voltage multiplied by the current.
> 
> IE - Amperage X Volts = Watts. which is how power is measured not by amps (AKA Current)
> 
> Anyway this is OT so Lets change the subject.

I think you need to look up Ohm's law if you don't know the relationship between amps and watts..

It isn't OT; it is a common misconception, and clearing it up helps people when deciding what to purchase.

Also. . The $1 was a metaphor....


----------



## Cyber Locc

Quote:


> Originally Posted by *Mega Man*
> 
> I think you need to look up ohms law if you don't know the relationship of amps and watts..
> 
> It isn't ot it is a common misconception that does help people when deciding what to purchase
> 
> Also. . The $1 was a metaphor....


Dude, I know Ohm's law; I also own a vape shop and deal with Ohm's law a lot. I also think you're the one who needs to read up on Ohm's law. As I stated in my last post, Ohm's law has nothing to do with this, since resistance isn't a factor in our convo; now you're just grasping at straws. Ohms measure resistance, which is another part of the equation I linked above and has nothing to do with our convo.

The watts are a direct result of both the voltage and the amperage: running 1 amp at 110V is a lot different than running it at 12V, 240V, or 220V. That's why electricity is measured in watts, not amps.

I.e. 1 amp x 12V = 12 watts, 1 amp x 110V = 110W, 1 amp x 240V = 240W. Do you see what is happening here and why you are so wrong? It's seriously insane. By your logic 2+2 doesn't equal 4, because the 2 is what's important. Amps and volts are parts of an equation where we solve for watts.

I seriously think you are trolling me at this point. I know it was a metaphor; just teasing. Anyway, if you want to continue this insanity, make a thread about it and PM me the link, as this is totally off topic. Mega Man, I made a thread for it; stop derailing this one: http://www.overclock.net/t/1509499/amps-vs-watts-crazy-convo-continuing

My apologies, guys, for derailing.


----------



## Mega Man

So you pay for the amps you use.

Period.

It is not charged in amps.

Never said it was.

More amps is more watts. More watts is more kWh ....


----------



## mus1mus

Quote:


> Originally Posted by *cyberlocc*
> 
> Dude I know ohms law I also own a vape shop as well and deal with ohms law alot. Also I think your the one that needs to read about ohms law as I stated it to you in the last post ohms law has nothing to do with this as *Resistance isn't a factor in our convo now your just grabbing at straws. Ohms means Resistance which would be another part the equation that I linked above and ahs nothing to do with our convo*.
> 
> The watts given are a direct result of both the voltage and the amperage if you run 1 amp at 110vs that's alot different than running it at 12v or 240v or 220v that's why electric is measured in watts not amps.
> 
> IE -1amp x 12v = 12watts, 1amp x 110v = 110w, 1amp x 240v = 240w. do you see what is happening here and why you are so wrong its seriously insane. by your logic 2+2 doesn't equal 4 as 2 is whats important. amps and volts are part of an equation where we solve for watts.
> 
> I seriously think that you are trolling me at this point. I know it was a metaphor just teasing. Anyway if you want to continue this isanity make a thread about it and pm me the link as this is totally off topic.
> 
> My apologizes guys for derailing.


This made me laugh a bit.

Hearing Ohm's and you relate the topic to resistance..

Vape Shop reference?

Pfft

Edit:

You both know where this goes, right? You're both going to keep spinning around a topic where neither of you is wrong.


----------



## Cyber Locc

Quote:


> Originally Posted by *mus1mus*
> 
> This made me laugh a bit.
> 
> Hearing Ohm's and you relate the topic to resistance..
> 
> Vape Shop reference?
> 
> Pfft


Ohms measure resistance, so what made you laugh exactly? It is a measurement of resistance. Yes, a vape shop reference: there I have to teach people Ohm's law on a daily basis. E-cigs use a coil with a certain resistance, so the wattage changes, as do the amps needed for that resistance (as do the volts, in some cases). So to make sure the load is safe on the battery, knowing Ohm's law is crucial.

Definition of Ohm's law: I is the current through the conductor in units of amperes, V is the potential difference measured across the conductor in units of volts, and R is the resistance of the conductor in units of ohms. More specifically, Ohm's law states that the R in this relation is constant, independent of the current.[3]

"More amps is more watts. More watts is more kwh ...."
Yes, more amps or more volts. Amps and volts are variables, not solutions; the solution is watts. We solve for watts, not for amps. You have some backwards math, man.

The basic electrical equation does not take resistance into account; that comes in with Ohm's law and is irrelevant here. Y'all need to take a physics class, and reply in the other thread.


----------



## Mega Man

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cyberlocc*
> 
> Dude I know ohms law I also own a vape shop as well and deal with ohms law alot. Also I think your the one that needs to read about ohms law as I stated it to you in the last post ohms law has nothing to do with this as *Resistance isn't a factor in our convo now your just grabbing at straws. Ohms means Resistance which would be another part the equation that I linked above and ahs nothing to do with our convo*.
> 
> The watts given are a direct result of both the voltage and the amperage if you run 1 amp at 110vs that's alot different than running it at 12v or 240v or 220v that's why electric is measured in watts not amps.
> 
> IE -1amp x 12v = 12watts, 1amp x 110v = 110w, 1amp x 240v = 240w. do you see what is happening here and why you are so wrong its seriously insane. by your logic 2+2 doesn't equal 4 as 2 is whats important. amps and volts are part of an equation where we solve for watts.
> 
> I seriously think that you are trolling me at this point. I know it was a metaphor just teasing. Anyway if you want to continue this isanity make a thread about it and pm me the link as this is totally off topic.
> 
> My apologizes guys for derailing.
> 
> 
> 
> This made me laugh a bit.
> 
> Hearing Ohm's and you relate the topic to resistance..
> 
> Vape Shop reference?
> 
> Pfft
> 
> Edit:
> 
> You both know where this goes, right? You're both going to keep spinning around a topic where neither of you is wrong.

I know I am not wrong. My point is, although he is trying to troll me in another thread, he is not correct in saying I am wrong.


----------



## Spectre-

@arizonian

My R9 290 died

heres my 290X validation

http://www.techpowerup.com/gpuz/details.php?id=a4xyk

cooled by H55


----------



## bond32

Not to split hairs/fuel fires, but ohm's law is not the same as the power equation









Ohm's law : V=I*R

Power Eq : P=VI, P=I^2*R

There's a lot that goes on that is way more complicated than a simple power equation. Also, that comment about knowing how much you charge your residents: again, it's way more complicated. Ever heard of the power triangle? Power factor? kVAR? Yeah.

P.S. I'm an engineer, although mechanical, not electrical. So perhaps an electrical engineer can chime in on the details of how the cards draw their current...
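A quick sketch of the distinction above (illustrative only; the helper names are made up for this post):

```python
# Ohm's law relates voltage, current, and resistance: V = I * R.
# The power equations come from combining it with P = V * I.
def power_vi(v, i):
    return v * i          # P = V * I

def power_ir(i, r):
    return i * i * r      # P = I^2 * R

def power_vr(v, r):
    return v * v / r      # P = V^2 / R

# For any consistent triple where V = I * R holds (here 12 = 2 * 6),
# all three forms agree:
v, i, r = 12.0, 2.0, 6.0
print(power_vi(v, i), power_ir(i, r), power_vr(v, r))  # 24.0 24.0 24.0
```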


----------



## mus1mus

Quote:


> Originally Posted by *bond32*
> 
> Not to split hairs/fuel fires, but ohm's law is not the same as the power equation
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ohm's law : V=I*R
> 
> Power Eq : P=VI, P=I^2*R
> 
> There's a lot of things that go on that are way more complicated than just a simple power equation. Also that comment about knowing how much you charge your residents, again its way more complicated. Ever heard of the power triangle? Power factor? kVAR? Yeah.
> 
> P.S. I'm an engineer. Although mechanical
> 
> 
> 
> 
> 
> 
> 
> , not electrical. So perhaps an electrical engineer can chime in on the details of how the card's draw their current...


Correct.

So, to keep within the spirit of the thread, here's my little ding ding..

Quote:


> Originally Posted by *cyberlocc*
> 
> Ohms means Resistance so what made you laugh exactly it is a measurement of Resistance. Yes vape shop reference where I have to teach people ohms law on a daily basis. Ecigs use a coil that has a certain resistance thus there wattage changes as do the amps needed to that Resistance (as do the volts in some cases). so to make sure the load is safe of the battery knowing ohms law is crucial.
> 
> Definition of ohms law- where I is the current through the conductor in units of amperes, V is the potential difference measured across the conductor in units of volts, and R is the resistance of the conductor in units of ohms. More specifically, Ohm's law states that the R in this relation is constant, independent of the current.[3]
> 
> "More amps is more watts. More watts is more kwh ...."
> Yes more amps or more volts, amps and volts are variables not solutions the solution is watts we solve for watts not for amps. You have some backwards math man.
> 
> The basic electrical equation does not take into account Resistance that comes in with ohms law and is irrelevant to the basic electrical equation ya'll need to take a physics class. and reply to the other thread.


Assuming your 290X draws 350W of power from the wall bone stock (stock clocks, voltages), overclocking it by 10% on stock voltage will equate to how much of an increase in power pulled from the wall at full load?

(I won't mention frequency, as that is not constant on every make and model, nor absolute values, since I indicated it by percentage.)

I bet you can get this quickly, as this is as simple as Ohm's law gets.


----------



## bond32

Awesome reply. But why not consider that the absolute max TDP of the card is 375 watts? That is the amount the 8-pin, 6-pin, and slot can physically provide at stock. Now figure if you run the PT BIOS with the limits removed and add +50% power limit, your max draw is now over 400 watts. I say max 400 because I don't know how that PT BIOS actually operates or how the power slider actually operates.


----------



## th3illusiveman

Quote:


> Originally Posted by *cyberlocc*
> 
> Hmm link to this info or how did you find this out.
> 
> Also how did you monitor temps of the vrm. I got my core to 90c under a heaven highly clocked and my clock didn't drop. Seeing how I have very few fans going over that as the rest of the system is underwater and I watched afterburner readout with my clocks and temps the entire time. The clocks never went down usage stayed at 100% and I'm pretty sure my vrms were over 80c.
> 
> Also seeing how amd claim the vrm is safe to 105 or 115 don't remember which it was one of them though I highly doubt this.
> 
> Hmm my vrm aint going over 65c even with the gpu at 90 guess i underestimated those rads fans blowing down lol.


Quote:


> Originally Posted by *Ironsmack*
> 
> If I'm understanding this correctly, if it goes over 80c - it throttles down?


No throttling, but it does make your overclocks extremely unstable and increases the chance of artifacts. I can't find a link because it's something I noticed on both my GCN GPUs.


----------



## mus1mus

Quote:


> Originally Posted by *bond32*
> 
> Awesome reply. But why not consider the absolute max tdp of the card is 375 watts. That is the amount the 8 pin, 6pin, and slot can physically provide at stock. Now figure if you run pt bios with the limits removed and add +50% power limit, you're max draw is now over 400 watts. I say max 400 because I don't know how that pt bios actually operates or how the power slider actually operates.


You cannot simply put it that way. Regarding the maximum an 8-pin + 6-pin + PCIe slot can provide, that was already debunked by the R9 295X2. Doesn't that card pull more than 600 watts (not from the wall) from the PSU using two 8-pins?

Also note, these chips vary a lot due to many things: binning, leakage, and the voltages they require to max out their clocks. And I myself don't have an idea how much they pull in general at stock..

But that question will lead to a series of questions once he figures out the first one. We can only get mathematical insight about power at stock, OC'd, and overvolted, as percentage increases. But then again, those are not going to relate exactly to real life. Still, better to know it theoretically as a guide.


----------



## Sgt Bilko

Quote:


> Originally Posted by *th3illusiveman*
> 
> No throttling but it does make your overclocks extremely unstable and increases the chances of artifacts. I can't find a link because it's something i noticed on both my GCN GPUs.


Yeah, I can back this up to some extent. I've noticed that when my VRM1 temp goes over 80C, I start to artifact under heavy clocks and voltage.


----------



## Dasboogieman

Quote:


> Originally Posted by *bond32*
> 
> Awesome reply. But why not consider the absolute max tdp of the card is 375 watts. That is the amount the 8 pin, 6pin, and slot can physically provide at stock. Now figure if you run pt bios with the limits removed and add +50% power limit, you're max draw is now over 400 watts. I say max 400 because I don't know how that pt bios actually operates or how the power slider actually operates.


I will quote a certain somebody from Pirates of the Caribbean regarding the cables:

"those are more of...guidelines, than actual rules"

Anyway, the power slider basically limits the TDP of the board; +50% means your card is allowed to draw an additional 50% more power. I assume the power slider is totally irrelevant with the PT BIOS, since PowerTune is completely disabled.

As for the earlier conversation, as I understand it:
It is true that electricity consumption is measured in kWh simply because that is a unit of energy (i.e. 3.6 MJ).
Watts alone (and by extension, amps) do not account for the time factor; a watt is simply a rate at which energy can potentially be consumed/produced.


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Awesome reply. But why not consider the absolute max tdp of the card is 375 watts. That is the amount the 8 pin, 6pin, and slot can physically provide at stock. Now figure if you run pt bios with the limits removed and add +50% power limit, you're max draw is now over 400 watts. I say max 400 because I don't know how that pt bios actually operates or how the power slider actually operates.
> 
> 
> 
> 
> 
> 
> 
> You cannot simply put it that way. Regarding the maximum an 8pin + 6pin + pcie slot can provide, *it was already debunked by the R9 295X2*. That card can pull more than 600 Watts (not from the wall) from the PSU using 2 8pins?
> 
> Also note, these chips vary a lot due to many things. Binning, leakage, and Voltages they require to max out their clocks. And I myself don't have an idea how much they pull in general at stock..
> 
> But that question will lead to a series of questions once he figures out the first one. We can only get mathematical insight about power at stock, OC'd, and overvolted, as percentage increases. But then again, those are not going to relate exactly to real life. Still, better to know it theoretically as a guide.

Actually, it was already debunked a long time ago that the max power draw can exceed the max TDP of a card, and can also exceed the theoretical max power draw that the combination of the 6-pin and/or 8-pin PCIe power connectors and the PCIe slot can provide. This is at stock clocks, if I'm not mistaken. I can provide the links if anyone is interested (an article from 2010).


----------



## Cyber Locc

Quote:


> Originally Posted by *th3illusiveman*
> 
> No throttling but it does make your overclocks extremely unstable and increases the chances of artifacts. I can't find a link because it's something i noticed on both my GCN GPUs.


That makes sense. How hot was your core when your VRM was so high, if you don't mind me asking?
I'm guessing this is caused by unstable voltage to your overclock. BTW, HWiNFO64 can show VRM temps now, guys.


----------



## Raul-7

Just overclocked my card to test how far it can go; just need a waterblock to run it 24/7.



That's 1100 core and memory at stock.


----------



## th3illusiveman

Quote:


> Originally Posted by *cyberlocc*
> 
> That makes sense. how hot was your core when your vrm was so high if you don't mind me asking?
> I'm guessing this is caused by the unstable voltage to your overclock. BTW hw64 can show vrm temps now guys


My core is usually a few degrees lower (2-3), but the core is fine up until the mid 90s. I think if VRM1 stayed at around 70C, it wouldn't matter if the core was at 90C; the OC would be more stable than if it were the other way around. Sadly, most air coolers do a terrible job of cooling VRMs.


----------



## chronicfx

It is a happy day for me! Just placed an order for a 4790K, a new Raystorm block, new tubing for my loop, and a Gigabyte Z97X-Gaming GT motherboard. So I will finally have the extra grunt of an i7 if future games push towards more threading, and the extra bandwidth of PLX for my three R9 290Xs! Life is complete









edit: I was waiting for the 5820K, but after looking at RAM prices I feel Devil's Canyon will be just fine.


----------



## Sgt Bilko

Quote:


> Originally Posted by *cyberlocc*
> 
> That makes sense. how hot was your core when your vrm was so high if you don't mind me asking?
> I'm guessing this is caused by the unstable voltage to your overclock. BTW hw64 can show vrm temps now guys


HWiNFO64 has actually been able to show VRM temps for a few months now.

When my VRM1 hits 80C+, my core temp is usually around the same. I'm going to take the coolers off my cards this weekend, replace the TIM, and see what I can do about the VRM cooling.

I was thinking about a loop, but there are very few places in Aus selling watercooling parts now, and ordering from the EU or US is murder for shipping.


----------



## mus1mus

Quote:


> Originally Posted by *chronicfx*
> 
> It is a happy day for me! Just placed an order on a 4790k, a new raystorm block, new tubing for my loop, and a gigabyte Z97x-gaming gt motherboard. So I will finally have the extra grunt of an i7 if future games push towards more threading and the extra bandwidth of PLX for my three R9 290x's! Life is complete
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit: I was waiting for the 5820k *but after looking at ram prices* I feel devil's canyon will be just fine.


240 US$ for a 16 GB kit ain't too bad.

Anyway, the set-up can pull off good numbers IMO..


----------



## chronicfx

Quote:


> Originally Posted by *mus1mus*
> 
> 240 US$ for a 16 GB kit ain't too bad.
> 
> Anyway, the set-up can pull off good numbers IMO..


Not at all, same price as good DDR3, but this way I can transfer my 16GB instead of buying a new set.


----------



## kizwan

Quote:


> Originally Posted by *chronicfx*
> 
> It is a happy day for me! Just placed an order on a 4790k, a new raystorm block, new tubing for my loop, and a gigabyte Z97x-gaming gt motherboard. So I will finally have the extra grunt of an i7 if future games push towards more threading and the extra bandwidth of PLX for my three R9 290x's! Life is complete
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit: I was waiting for the 5820k but after looking at ram prices I feel devil's canyon will be just fine.


Congrats! That i7 should do well.


----------



## mus1mus

Quote:


> Originally Posted by *Dasboogieman*
> 
> I will quote a certain somebody from Pirates of the Carribbean regarding the cables
> 
> "those are more of...guidelines, than actual rules"
> 
> Anyway, the power slider basically limits the TDP of the board, +50% means your card is allowed to draw an additional 50% more power. I assume the power silder is totally irrelevant with the PT BIOS since powertune is completely disabled.
> 
> *As for the earlier conversation, as I understand it
> It is true that electricity consumption is measured in KWh simply because that is a unit of energy (i.e. 3.6MJ).
> Watts alone (and by extension, Amps) does not account for the time factor, it is simply a rate which energy is can potentially be consumed/produced.*


Yes. You are right.

Although, current, measured in amperes or amps, is the flow of electric charge across a surface at the rate of one coulomb per second.

And, in physics, power is the rate of doing work. It is equivalent to an amount of energy consumed per unit time.

So to make it short, a kWh is just a measurement of how much total energy you have consumed over time. In the case of your electric bill, for a month:

kW * 24 h/day * 30 days; the days cancel, giving you kWh.
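As a quick illustrative sketch of that billing math (the `kwh` helper and the 0.35 kW figure are just examples, not from any bill):

```python
# Energy billed (kWh) = average power (kW) * time (hours).
def kwh(avg_kw, hours):
    return avg_kw * hours

# A rig averaging 0.35 kW, running 24 h/day over a 30-day billing month:
print(kwh(0.35, 24 * 30))  # ~252 kWh for the month
```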









As for @cyberlocc, who has not summoned his usual guts to answer my question,


Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *mus1mus*
> 
> Assuming your 290X draws 350W of power from the wall bone stock, (stock clocks, voltages) overclocking it by 10% on stock voltages will equate to how much increase in power pulled from the wall at full power?
> 
> (I won't mention frequency as that is not constant on every make and model nor values as I indicated it by percentage)
> 
> I bet you can get this quick. As this is as simple as Ohms Law gets.


Holding voltage constant, the card will consume 10% more power simply because the number of switching events per second was raised by 10%, so it pulls 10% more current (amps) from the PSU. P = V*I. In terms of 350 watts stock, the OC causes your card to draw 385 watts without changing the voltage.

Now for the tricky one: let's say your card was overclocked by 30% and overvolted by 30%. How does that relate to power consumption?

Take note of the Ohm's law basic principles:
*P = VI; P = I²R; P = V²/R; R = V/I; I = V/R; V = IR*

You might think you could simply get the power by
*P = VI = (1.3 * 1.3) = 1.69*, or a 69% increase in power required due to the OC. But no.



As you can see from the illustration, if we increase *V*, it will pull *R* down and open up enough space for *I* to move to the other side. Thus, to calculate the change, we need a double take.

At a 30% OC with no OV, we'll take how much the resistance changes:

R = V / I, or R = 1 / 1.3;
R (at 30% OC) = 0.769.., i.e. it dropped to 77% of its stock value.

Now at 30% OV,

P = V²/R = 1.3² / 0.769 = 2.198
P (at 30% OV + 30% OC) = 2.2, or more than double the power requirement from stock.

So if a stock card consumes 300 watts, the OC and OV will raise its consumption to about 660 watts.
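For anyone who wants to check those numbers, here is the same simplified model in a few lines (a sketch of the calculation above; `power_scale` is a made-up name, and as noted below, real cards won't follow this exactly):

```python
# Simplified model: clocks scale current, voltage scales on top of that,
# so relative power = OV^2 / R with effective R = 1 / OC.
def power_scale(oc, ov):
    r = 1.0 / oc       # effective resistance drops as clocks rise
    return ov * ov / r  # P = V^2 / R

print(power_scale(1.3, 1.0))        # +30% clocks alone -> ~1.3x power
print(power_scale(1.3, 1.3))        # +30% clocks and volts -> ~2.2x power
print(300 * power_scale(1.3, 1.3))  # a 300 W stock card -> ~660 W
```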

Is that achievable? Yes, in extreme cases. Normally, no: power limiters kick in, and leakage and other factors will force your card not to reach that.

But what it shows is the effect of overvolting on the power requirement, which can guide us in picking PSU ratings for specific needs.

*Q.* _Wait, if I buy a 290X that consumes 300 watts at stock, and I need to overclock, pushing it to the limits, I need to consider a 600-watt PSU for it alone?_









*A.* No. Like I said, too many things will limit its power draw, so it will not max out the calculated power requirement. But consider much lower values, like 25%, and that requirement drops significantly.









I do hope this is not straying away from the thread..


Quote:


> Originally Posted by *kizwan*
> 
> Actually it already debunked long time ago that max power draw can exceeds max TDP of that card & also can exceeds the theoretically max power draw the combination of the 6-pin and/or 8-pin PCIe power connector and the PCIe slot can provide. This is at stock clock if I'm not mistaken. I can provide the links if anyone interested (an article from 2010).


Yes, you are right. I haven't been here long enough to know of or have read those articles.









But on a note: TDP, or Thermal Design Power, is not power draw.


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Actually it already debunked long time ago that max power draw can exceeds max TDP of that card & also can exceeds the theoretically max power draw the combination of the 6-pin and/or 8-pin PCIe power connector and the PCIe slot can provide. This is at stock clock if I'm not mistaken. I can provide the links if anyone interested (an article from 2010).
> 
> 
> 
> Yes you are right. I haven't been here long enough to know or read those articles.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *But on a note, TDP or Thermal Design Power is not Power Draw.*

I know; I never said it was.









If we look a few pages back, it's kind of funny how an idle power draw discussion led to a (total) power consumption discussion.







I don't understand why it's so hard to stick to the topic being discussed. When a discussion derails, it creates confusion. Whenever I see this kind of discussion, I know it's a trap.


----------



## mus1mus

I know; I edited that to add spoilers..
















Someone just asked what's the max power the cards can pull.







Just sayin..

What's really funny is that when one person explains something, someone will pop up to tell him he's wrong or whatever, when really they simply misunderstood one another.


----------



## Arizonian

Quote:


> Originally Posted by *Spectre-*
> 
> @arizonian
> 
> My R9 290 died
> 
> heres my 290X validation
> 
> http://www.techpowerup.com/gpuz/details.php?id=a4xyk
> 
> cooled by H55


Glad to hear it turned out for the better.







Congrats - updated









Quote:


> Originally Posted by *Anthrax234*
> 
> Here ya go
> XFX 290X Double Disspation Stock cooling.
> http://www.techpowerup.com/gpuz/details.php?id=e983r


Congrats - added


----------



## redklu3

Hi all, I have a question about thermal pads; this is somewhat related to controlling R9 290X VRM temps. Are, for example, these thermal pads non-electrically conductive: http://www.ebay.com.au/itm/151380785188 ? I'm planning to use them as a sandwich at the back, in between the PCB and back plate. I'll also be modding the back plate to fit a copper block there, then connect it to the XFX DD air cooler, while the front of the card is cooled by a full XSPC water block. I really think the VRM temps are what's holding back my card.

P.S. I've visited a 290X thread with a thermal pad update, but I wasn't able to read whether it is non-electrically conductive.


----------



## Raul-7

Best $220 GPU I've bought in a long time. Worth every penny. Can't wait to watercool it and push it to the max.


----------



## smoke2

I may have an issue with my Sapphire Tri-X 290.
My PC has started buzzing and I don't know what is causing it.
I know the Tri-X 290 had a fan rattling issue, but I don't know whether the sound in this video is similar to that issue and coming from the graphics card.
It can only be heard sometimes, and sometimes it appears when I tip the case on its side.
The case is a Fractal R4; my fans are 4 case fans, the CPU fan, and the PSU fan (Cooler Master V700).

http://tinypic.com/player.php?v=2zyvg4w%3E&s=8#.U_2OhmMqJ6Y
What could be causing it?


----------



## redklu3

I can't hear a buzz; all I can hear is muffled air. It's probably coil whine from how you described it, but I can't hear it in your vid. They say it's normal for high-end GPUs, R9s and 780s, but a lucky few don't have it. If it annoys you too much, you could probably RMA it.


----------



## Wezzor

I had something similar to you, smoke2, and it disappeared after a while for me; by "a while" I mean around 1.5 months.

EDIT: I'd also like to be added to the member list.









Sapphire R9 290 Tri-X
http://www.techpowerup.com/gpuz/details.php?id=r6e2


----------



## rdr09

Quote:


> Originally Posted by *redklu3*
> 
> Hi all I have a question about thermal pads, this is somewhat related on controlling r9 290x vrm temps. Are for example this: http://www.ebay.com.au/itm/151380785188 thermal pads non-electrically conductive? As I'm planning to use it as sandwich at the back, in between the PCB and back plate. Also I'll be modding the back plate and fit a copperblock there then connect it to the XFX DD air cooler while the front of the card is cooled by a full XSPC water block. I really think the vrm temps is the one holding back my card.
> 
> P.S. I've visited a 290x thread with thermalpad update but I wasn't bale to read whether it is non electrically conductive.


I don't think you need to go that far. With a full waterblock, the VRMs run cooler than the core.


----------



## smoke2

Quote:


> Originally Posted by *redklu3*
> 
> can't hear a buzz, all i can hear is muffled air, probably a coil whine from how you described it, but i can't hear it from your vid. they say its normal for high end GPUs r9s and 780s but few lucky ones don't have it. if it annoys you to much, you could probably RMA it.


Please turn up the volume; the recording is quieter than it sounds in person.
It's a buzzing or rattling sound.
Quote:


> Originally Posted by *Wezzor*
> 
> I had something similiar to you smoke2 and it disappeared after a while for me and with a while I mean around 1,5 month.
> 
> EDIT: I'd also like to be added to the member list.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sapphire R9 290 Tri-X
> http://www.techpowerup.com/gpuz/details.php?id=r6e2


It took around 1.5 months to disappear?
Do you think it's coil whine?


----------



## machomen

Hi everyone, I think my R9 290 runs way too hot.

I am using the Asus R9 290 DCUII.

Here is a screenshot after about 4 minutes of FurMark.



If you look at the VRM temps in GPU-Z, that's what I'm really concerned about.

Do you guys see the same temps?


----------



## ozzy1925

Quote:


> Originally Posted by *machomen*
> 
> Hi everyone I think my r9 290 is way to high on temperature.
> 
> I am using the Asus R9 290 DCUII
> 
> here is a screenshot after about 4 minutes of furmark.
> 
> 
> 
> If you look in GPU-z at the vrm temps that what i am really concerned about.
> 
> Do you guys have the same temps ?


I wouldn't leave the fans on auto when benching; try at least 60%.


----------



## machomen

So, after 10 minutes of FurMark at 60% fan speed (which is actually really loud on the Asus DCUII):



the temps are as follows:

GPU - 89 degrees Celsius
VRM 1 - 122 degrees Celsius
VRM 2 - 86 degrees Celsius

BTW, my GPU is not overclocked!


----------



## sugarhell

This is terrible! What case do you have?

Try benching 3DMark instead. FurMark is useless.


----------



## ozzy1925

Quote:


> Originally Posted by *machomen*
> 
> So after 10 min furmark on 60 % fanspeed which is actually really loud on the ASUS DCUII
> 
> 
> 
> So the temps are as followed
> 
> GPU - 89 Degrees celcius
> VRM 1 - 122 Degrees celcius
> VRM 2 - 86 Degrees celcius
> 
> BTW my GPU is not overclocked !


122C is way too high. Which case do you use, and what's your setup?


----------



## kizwan

Quote:


> Originally Posted by *redklu3*
> 
> Hi all I have a question about thermal pads, this is somewhat related on controlling r9 290x vrm temps. Are for example this: http://www.ebay.com.au/itm/151380785188 thermal pads non-electrically conductive? As I'm planning to use it as sandwich at the back, in between the PCB and back plate. Also I'll be modding the back plate and fit a copperblock there then connect it to the XFX DD air cooler while the front of the card is cooled by a full XSPC water block. I really think the vrm temps is the one holding back my card.
> 
> P.S. I've visited a 290x thread with thermalpad update but I wasn't bale to read whether it is non electrically conductive.


You don't need to go to that extreme. All you need is a full waterblock like the XSPC plus thermal pads with a thermal conductivity between 7 and 17 W/mK: Phobya 7 W/mK, Fujipoly Extreme 11 W/mK, or Fujipoly Ultra Extreme 17 W/mK. The latter is expensive, though.

Thermal pads are non-electrically conductive.
Quote:


> Originally Posted by *machomen*
> 
> So after 10 min furmark on 60 % fanspeed which is actually really loud on the ASUS DCUII
> 
> 
> 
> So the temps are as followed
> 
> GPU - 89 Degrees celcius
> VRM 1 - 122 Degrees celcius
> VRM 2 - 86 Degrees celcius
> 
> BTW my GPU is not overclocked !


Just a word of advice: stop using FurMark.


----------



## Imprezzion

Nothing special for a DCUII.

The badly designed, undersized VRM heatsink can't keep up with the heat the VRMs produce. Stock DCUII cards at stock speeds bench 3DMark in the upper-90C region.
I'm not surprised at all that it hit 120C+ in FurMark.


----------



## machomen

I am using an Intel 2500K cooled by a Corsair H80i, all inside a Corsair 250D, so quite a tiny case actually.


----------



## Imprezzion

Here, this (clickable) review page says it all.
Open test bench.

91C at stock using normal gaming benchmarks.
105C with an overclock.
And that's on an open test bench.

If they had run FurMark there, it would be the same as yours: ~115C stock, 130C+ overclocked.


----------



## Sgt Bilko

Quote:


> Originally Posted by *sugarhell*
> 
> This is terrible! What case you have?
> 
> Try to bench 3dmark. Furmark is useless


Quote:


> Originally Posted by *ozzy1925*
> 
> 122C is way too high , which case do you use and set up?


It's not really the case; Asus just did a crappy job with the DCU II for Hawaii. The best thing he can do is put a block/AIO on it or sell it off, imo.


----------



## Jflisk

Quote:


> Originally Posted by *chronicfx*
> 
> It is a happy day for me! Just placed an order on a 4790k, a new raystorm block, new tubing for my loop, and a gigabyte Z97x-gaming gt motherboard. So I will finally have the extra grunt of an i7 if future games push towards more threading and the extra bandwidth of PLX for my three R9 290x's! Life is complete
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit: I was waiting for the 5820k but after looking at ram prices I feel devil's canyon will be just fine.


Congrats on the new add-ins.


----------



## chronicfx

Quote:


> Originally Posted by *Jflisk*
> 
> Congrats on the new add ins.


Thanks! Super excited, but confused as hell by the Haswell overclocking guide for the moment. Gotta get my terminology straight: uncore/core/Vring... what the heck! Can't I just set my multi, LLC at turbo, turn PLL overvoltage on, and go? Haswell looks a bit more complicated.


----------



## Jflisk

Quote:


> Originally Posted by *machomen*
> 
> Hi everyone I think my r9 290 is way to high on temperature.
> 
> I am using the Asus R9 290 DCUII
> 
> here is a screenshot after about 4 minutes of furmark.
> 
> 
> 
> If you look in GPU-z at the vrm temps that what i am really concerned about.
> 
> Do you guys have the same temps ?


Those temps are normal running FurMark under air. You are pushing the card harder than it will ever be pushed in real-life use; FurMark was created to push the card to its limits and keep it there. I have seen 122C on the VRMs under air in FurMark. You might want to consider water cooling, or you can replace the thermal pads on the card and give that a shot. I would use GPU-Z to check the VRM temps and run Valley instead, which is closer to playing a high-end game.









Try Valley benchmark here
https://unigine.com/products/valley/


----------



## Talon720

If you all are right and 290Xs with PT1 are pulling 400W each without an OC, in trifire that's 1200W alone. I know that doesn't translate directly into which PSU you should use, but that's crazy. I guess it was my HX1050 that wasn't up to the task of overclocking (it shut down on a high OC); glad I went for the G2 1300W. I can't help but think the G2 1600W would have been a better choice for a little extra headroom, though it wasn't out at the time. Has anyone tried the P2 1200W? I was going to trade the G2 1300W for it; the P2's fanless mode appealed to me, along with the better efficiency. The HX1050's fan was so loud when it kicked in that, based on reviews, I was afraid the G2 1300W's would be too, but it ended up being way quieter than I expected. I'm sure all the soundproofing material I used helped. I also assumed the 1300W had a higher peak output than the P2 1200W, which is why I just kept it. That G2 1600W does seem like a good deal, though.
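The arithmetic here can be sketched out as a back-of-the-envelope headroom check. All the wattage figures below are rough assumptions pulled from this discussion (400W per card is a worst-case estimate, and the rest-of-system number is a guess), not measurements:

```python
# Back-of-the-envelope PSU headroom check for a trifire 290X rig.
# All wattage figures are ballpark assumptions from this discussion.
GPU_DRAW_W = 400        # per R9 290X with raised power limit (assumed)
NUM_GPUS = 3
CPU_DRAW_W = 220        # FX-9590-class CPU (assumed)
REST_OF_SYSTEM_W = 100  # board, drives, fans, pump (rough guess)

total = GPU_DRAW_W * NUM_GPUS + CPU_DRAW_W + REST_OF_SYSTEM_W  # 1520 W

def headroom(psu_watts, load_watts):
    """Remaining capacity as a fraction of the PSU rating (negative = overloaded)."""
    return (psu_watts - load_watts) / psu_watts

for psu in (1050, 1300, 1600):
    print(f"{psu} W PSU: {headroom(psu, total):+.0%} headroom at {total} W worst-case load")
```

On these assumed numbers even a 1300W unit comes up short at the worst case, which squares with the HX1050 shutting down under OC; real sustained gaming load is usually well below a FurMark-style worst case, which is why the 1300W works in practice.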


----------



## Wezzor

Quote:


> Originally Posted by *smoke2*
> 
> For around 1,5 month it disappeared?
> Do you think it's a coil whine?


I've no idea, actually. Yeah, I had the annoying sound coming from my GPU for about 1.5 months, and now I haven't heard it for months.


----------



## smoke2

Quote:


> Originally Posted by *Wezzor*
> 
> I've no idea actually. Yeh I had the annoying sound coming from my GPU for like 1,5 month and now I haven't heard it for months.


Does yours sound exactly the same as mine?
I've watched videos of the rattling issue and I think mine sounds different.
Maybe the bearing on one fan is dying?
Should I RMA it?


----------



## JordanTr

Hello guys, I just tried The Crew beta, and while I'm playing, every few minutes the game freezes for about 1 second. I have MSI monitoring always on, so I can see that when it unfreezes the core clock jumps up from 300MHz, and I guess that's the problem. I even tried removing my CPU overclock and tested the RAM; everything is fine. How should I force constant clocks? I tried making a RadeonPro profile for the game, but if I start it while RadeonPro is running, the game gives me a "not responding" dialog. Constant voltage and disable ULPS are also checked in MSI AB. Don't know what else to do.


----------



## heroxoot

Quote:


> Originally Posted by *JordanTr*
> 
> Hello guys i just tried The Crew beta and while im playing every few minutes game stucks for about 1s. MSI monitorint always on so i see, then it unstuck clock jumps from 300 core and i guess thats is tthe problem. I tired even take off oc of my cpu and tested ram everything is fine. How should i force constant clocks? I tried Make RadeonPro profile for the game, but if i try to start it while radeonpro is working game gives me "not responding" table. Also constant voltage and disable ulps are checked in MSI AB. Dont know what else to do


You can disable PowerPlay by enabling unofficial overclocking mode (without PowerPlay) and then raising your clock by at least 1MHz. This will force your clock to always stay at max. Then just set profiles for 2D and 3D so it can still clock down to save power.


----------



## JordanTr

Quote:


> Originally Posted by *heroxoot*
> 
> You can disable powerplay by enabling unofficial OC mode Without powerplay, and then raise your clock by 1mhz minimum. This will force your clock to always be max. Then just set a profile for 2d and 3d so that it can clock down to save power.


Quote:


> Originally Posted by *heroxoot*
> 
> You can disable powerplay by enabling unofficial OC mode Without powerplay, and then raise your clock by 1mhz minimum. This will force your clock to always be max. Then just set a profile for 2d and 3d so that it can clock down to save power.


Thanks for the tip. It looks like the stutter appears when I hit 60fps for a few seconds (yeah, I took off the 30fps cap). If I tune settings to keep the fps between 30-50, I get no problems.


----------



## machomen

So these are my temps after the Valley benchmark (Asus R9 290 DCUII):




GPU - 77 degrees Celsius
VRM 1 - 86 degrees Celsius
VRM 2 - 72 degrees Celsius

Are those temps reasonable?
It's only 1 run of the Valley benchmark, not OC'ed btw.


----------



## Wezzor

Quote:


> Originally Posted by *smoke2*
> 
> On yours it sounds exactly the same as on my?
> Because I've watched videos with rattling issue and I think my sounds different.
> Maybe bearing on one fan is dying?
> Should I RMA it?


Well, I had more of the rattle sound. You do as you wish, but remember it might take some time if you decide to RMA.


----------



## smoke2

Quote:


> Originally Posted by *Wezzor*
> 
> Well, I had more of the rattle sound. You do as you wish, but remember it might take some time if you decide to RMA.


And do you know what it could be? A defective fan bearing, or...?


----------



## fateswarm

Hm. Did someone say Sleeping Dogs and Far Cry 3 are very unforgiving on overclocks? I think I'm seeing that.


----------



## Jflisk

Now for something completely different. Under water, how much difference would it make if I swapped my 2x 240 radiators for 2x 280 radiators? After adding the third R9 290X I don't really like the temps. Loop breakdown: 2x 240 and 1x 120 for 3x R9 290X (estimate 400W apiece) and 1x FX 9590 at 220W. I was under 55C with 2x R9 290X; now I am at 63C (highest temp) with the third R9 290X. Any ideas? I am trying to keep it all in the case. Thanks.


----------



## pkrexer

That is still pretty high for one loop of Valley. I would loop it for at least a half-hour to get an idea of what your temps really are.

Quote:


> Originally Posted by *machomen*
> 
> So these are my Temps after Valley benchmark (Asus R9 290 DCUII)
> 
> 
> 
> 
> GPU - 77 Degrees celcius
> VRM 1 - 86 Degrees celcius
> VRM 2 - 72 Degrees celcius
> 
> Are those temps reasonable ?
> Its only 1 run of valley benchmark not OC'ed btw


----------



## rdr09

Quote:


> Originally Posted by *machomen*
> 
> So these are my Temps after Valley benchmark (Asus R9 290 DCUII)
> 
> 
> 
> 
> GPU - 77 Degrees celcius
> VRM 1 - 86 Degrees celcius
> VRM 2 - 72 Degrees celcius
> 
> Are those temps reasonable ?
> Its only 1 run of valley benchmark not OC'ed btw


Are the fans still on auto? The fan is reacting to the core temp and isn't strong enough to cool the VRMs. Set a fan curve, or a manual speed based on the VRM temps. 60% maybe?
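A fan curve like this is just a piecewise-linear map from a temperature reading to a fan duty cycle, along the lines of what you would set in MSI Afterburner. A sketch, with breakpoints that are illustrative only and not tuned for any particular card:

```python
# Sketch of a manual fan curve keyed to VRM temperature.
# Breakpoints are illustrative, not tuned values: (temp C, fan %).
CURVE = [(0, 30), (60, 45), (75, 60), (85, 80), (95, 100)]

def fan_percent(vrm_temp_c):
    """Linearly interpolate fan speed between curve breakpoints."""
    if vrm_temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if vrm_temp_c <= t1:
            return f0 + (f1 - f0) * (vrm_temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # past the last breakpoint, pin at max
```

The point of ramping hard between 75C and 95C is to get ahead of the VRMs before they reach the range where OC stability suffers, rather than waiting for the core temp to drag the fan up.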


----------



## Jflisk

Quote:


> Originally Posted by *machomen*
> 
> So these are my Temps after Valley benchmark (Asus R9 290 DCUII)
> 
> 
> 
> 
> GPU - 77 Degrees celcius
> VRM 1 - 86 Degrees celcius
> VRM 2 - 72 Degrees celcius
> 
> Are those temps reasonable ?
> Its only 1 run of valley benchmark not OC'ed btw


Per someone else's suggestion, I would let it do a couple of loops, but it looks right under air. The card will only reach a certain temperature before it downclocks; I think it's around 94C on VRM1 under air. I had those temps before I went water. VRM1 is usually 10C higher than the core, but it is what it is with air cooling. If you have space on the side of your case, you can try an intake fan to draw more air across the card.


----------



## th3illusiveman

That Valley score is only 1 FPS down from my 1100/1450MHz 290X...


----------



## Raul-7

Quote:


> Originally Posted by *th3illusiveman*
> 
> that valley score is only 1 FPS down from my 1100/1450Mhz 290X....


The problem (I suspect) is he's running the Custom preset, not Extreme HD like everyone else.


----------



## Descadent

anyone looking for a 290x Vapor-x should see my sig


----------



## th3illusiveman

Quote:


> Originally Posted by *smoke2*
> 
> And do you know what it can be? Defected bearing on fan or...?


My card also made a weird noise. It happened when the fans were changing speeds, but I found it was because the Tri-X was sagging. It's a very long card, so I fixed my issue by using some rubber to hold the tip of the card against my HDD bays, eliminating the sag, and the noise has been gone since.


----------



## mus1mus

Quote:


> Originally Posted by *Jflisk*
> 
> Now for something completely different. Under water how much difference would it be if I change my 2x 240 radiators out for 2 x 280 radiators. After adding the 3rd R9 290X don't really like the temps. Loop breakdown 2x 240 and 1 x 120 for 3 x R9 290x (estimate 400W a piece) and 1x FX 9590 220W. I was under 55C with 2x R9 290x now I am at 63C (highest temp) with the third R9 290x. Any ideas and I am trying to keep it all in the case. Thanks


You lack rad space, IMO, and switching from 240s to 280s won't make a big difference.

The rule of thumb says 1x 120mm of rad space per component will be fine; if you want lower temps, make it 2 per component (i.e. 2x 120mm of rad per component).

If space is an issue, start looking at higher-revving fans.
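That rule of thumb is easy to sanity-check against the loop described above. The sketch below just restates the rule as section counting, not a thermal model; the "sections" unit is one 120mm of radiator:

```python
# Rule of thumb from this thread: one 120 mm rad section per watercooled
# component is "fine"; two per component for low temps.
def recommended_120mm_sections(num_components, aggressive=False):
    """Sections of 120 mm radiator for a given number of blocks in the loop."""
    per_component = 2 if aggressive else 1
    return num_components * per_component

# The loop in question: 3x 290X + 1 CPU = 4 components.
# Installed rads: 2x 240 (2 sections each) + 1x 120 = 5 sections.
components = 4
current_sections = 2 * 2 + 1

print("baseline need:", recommended_120mm_sections(components))
print("low-temp need:", recommended_120mm_sections(components, aggressive=True))
print("currently installed:", current_sections)
```

By this counting the loop clears the baseline (5 vs 4 sections) but is well short of the 8 you would want for noticeably lower temps, which matches the "you lack rad space" verdict.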


----------



## Infinite Jest

I'm a bit confused by the Catalyst software package/driver bundle. I noticed that I wasn't getting the Mantle API option to show up in BF4 with Catalyst 14.4, and found in Device Manager that the driver version was actually 13.25. I browsed for another version, found 14.1 there as well, and switched over to it. Why are there two drivers installed by default? I had done a DDU clean install to eliminate an issue I was having and this still happened. I really wish AMD driver installation was more straightforward, like nVidia's, unless I'm missing something.


----------



## Mega Man

On mobile can't edit quotes down sorry
Quote:


> Originally Posted by *Talon720*
> 
> If you all are right and 290x w/pt1 are pulling 400w w/o oc in trifire thats 1200w alone. I mean i know that dosnt equal exactly what psu you should use but thats crazy. I guess it was my hx1050 that wasnt up to the task of overclocking (shutdown on high oc) glad i went for the g2 1300w. I cant help but think the g2 1600w woulda been a better choice giving a little extra headroom though it wasnt out at the time. Has anyone tried the p2 1200w I was gonna trade the g2 1300w for it. The P2 fanless mode appealed to me along with better effeciency . The hx1050 fan was so loud when it turned on i was afraid based on reviews the g2 1300w would too but the g2 1300w ended up being way more quiet than i expected. Im sure all the sound proofing stuff i used helped. I also assumed the 1300w had higher peak than than the p2 1200w which is why I just kept it. That g2 1600w does seems like a good deal though.


Just an FYI: you will probably spend more swapping your power supply than you will save in efficiency.

On that note, the LEPA G1600 has been getting a bad rap recently. As I own a few, I asked LEPA about it. So far I have confirmation that the 290X does not void your warranty. I am waiting for a response on tri/quadfire, and when I have it (either way, void or no void) I'll post the results.

Quote:


> Originally Posted by *Jflisk*
> 
> Now for something completely different. Under water how much difference would it be if I change my 2x 240 radiators out for 2 x 280 radiators. After adding the 3rd R9 290X don't really like the temps. Loop breakdown 2x 240 and 1 x 120 for 3 x R9 290x (estimate 400W a piece) and 1x FX 9590 220W. I was under 55C with 2x R9 290x now I am at 63C (highest temp) with the third R9 290x. Any ideas and I am trying to keep it all in the case. Thanks


Not really, assuming you have decent rad fans.
140s with high static pressure (i.e. rad fans) are hard to find.
Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jflisk*
> 
> Now for something completely different. Under water how much difference would it be if I change my 2x 240 radiators out for 2 x 280 radiators. After adding the 3rd R9 290X don't really like the temps. Loop breakdown 2x 240 and 1 x 120 for 3 x R9 290x (estimate 400W a piece) and 1x FX 9590 220W. I was under 55C with 2x R9 290x now I am at 63C (highest temp) with the third R9 290x. Any ideas and I am trying to keep it all in the case. Thanks
> 
> 
> 
> You lack rad space IMO. And switching from 240 to 280 won't make big differences.
> 
> Rule always say 1 rad space per component will be fine.. if you want lower temps, make it 2 per component. Or 2 x 120mm rad per component.
> 
> If space is an issue, start looking at higher revving fans..
Click to expand...

1 x 120 + 1 per component is recommended


----------



## Descadent

Quote:


> Originally Posted by *th3illusiveman*
> 
> My card also made a weird noise. It happened when the fans were changing speeds but i found it was it because the Tri-X was sagging. It's a very long card so i fixed my issue but using some rubber to hold the tip of the card next to my HDD bays eliminating the sagging and the noise has been gone since.


Good idea; it's definitely a heavy card.


----------



## rdr09

Quote:


> Originally Posted by *Infinite Jest*
> 
> I'm a biut confused by the Catalyst software package/driver bundle. I noticed that I wasn't getting the Mantle API option to show up in BF4 with Catalyst 14.4 and found in device manager that the driver version was actually 13.25. I browsed for another version and found 14.1 also there, and switched over to it. Why are there two drivers installed by default? I had done a DDU clean install to eliminate an issue I was having and this till happened. I really wish AMD driver installation was more straight-forward like nVidia, unless I'm missing something.


Use 14.4 WHQL. Run it and use it to *uninstall* whatever driver you have in your system, using the *Express method*.

Reboot, then *run 14.4 again* using the Express method to *install*. Before rebooting, *go to msconfig* and uncheck CCC and Raptr (if present). Reboot and test. It takes less than 15 minutes. On the second reboot it might take a long time to log on to Windows; just wait till you get to the logon page.


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> use 14.4 WHQL. Run it and use it to *uninstall* whatever driver you have in your system. Use *Express method*.
> 
> Reboot, then *Run 14.4 again* using Express method to *Install*. Before Rebooting, *go to msconfig* and uncheck CCC and raptr (if present). Reboot and test. Takes less than 15 minutes. On the Second reboot, *it might take a long time to log on to Windows*. Just wait till you get to the log on page.


I'm having this black screen at logon issue with my 290 after a fresh OS and Catalyst install. Does that relate to the bolded items?

What do you guys know about this, and what do you suggest?


----------



## Spectre-

Quote:


> Originally Posted by *mus1mus*
> 
> I'm having this Black Screen at log-on issue with my 290 after a fresh OS and Catalyst install. Does that relate to bold items?
> 
> What do you guys knew and suggest of this??


Are you getting artifacts?

I recently had my R9 290 do the black screen issue.

Turned out the VRAM had gone bad.


----------



## mus1mus

Nope, no OC either.

I bought it used last week. I ran benches that seemed fine, with temps within 80C on the core and less than 70C on the VRMs. It happened after a Windows update.

So I reinstalled my OS. It works fine without Catalyst, but after the Catalyst install it reboots to a black screen at logon.


----------



## th3illusiveman

I get the black screen logon thing too. The only times I get black screens are then, or with a memory OC with no added voltage.


----------



## Arizonian

Quote:


> Originally Posted by *Wezzor*
> 
> I had something similiar to you smoke2 and it disappeared after a while for me and with a while I mean around 1,5 month.
> 
> EDIT: I'd also like to be added to the member list.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sapphire R9 290 Tri-X
> http://www.techpowerup.com/gpuz/details.php?id=r6e2


Congrats - added


----------



## Dasboogieman

Quote:


> Originally Posted by *mus1mus*
> 
> Nope.. No OC either..
> 
> Bought it used last week.. Ran benches that seemed fine, with temps within 80 on the cores. Less than 70 on the VRMs. Happened after a Windows update.
> 
> So I reinstalled my OS.. It will work without Catalyst..After the install, reboot to a black screen at log on..


I get it very occasionally. I assumed in my case it was because Sapphire TriXX was auto-applying the OC; when I stopped the auto-apply, the black screens went away.


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> I'm having this Black Screen at log-on issue with my 290 after a fresh OS and Catalyst install. Does that relate to bold items?
> 
> What do you guys knew and suggest of this??


Follow the steps. Like I said, you have to wait after the second reboot to get to the Windows logon. If it's still a problem, then (if I were in your shoes) return it. Do not hesitate; let the former owner deal with it.


----------



## fateswarm

So why do games like Far Cry 3 and Sleeping Dogs crash on some overclocks when others don't?


----------



## th3illusiveman

Quote:


> Originally Posted by *fateswarm*
> 
> So why do games like Far Cry 3 and Sleeping Dogs crash on some overclocks when others don't?


They are the FurMark of games, lol. Seriously though, they heat up the VRMs significantly, and that can make clocks that are otherwise relatively stable go unstable. Once those VRMs pass 80C you're in trouble stability-wise.


----------



## fateswarm

Quote:


> Originally Posted by *th3illusiveman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fateswarm*
> 
> So why do games like Far Cry 3 and Sleeping Dogs crash on some overclocks when others don't?
> 
> 
> 
> they are the furmark of games lol. Seriously though, they heat up the VRMs significantly and that can make clocks are relatively stable unstable.Once those VRMs pass 80c you're in trouble stability wise.
Click to expand...

I don't think it's the mosfets. My card is at low temps and the game IMMEDIATELY starts showing "red dots". It's definitely something about the "base overclock" that's being hit.

It's probably using "more of the GPU" somehow.

Kinda like how, on CPUs, small-FFT Prime95 is the heavier load.


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> use 14.4 WHQL. Run it and use it to *uninstall* whatever driver you have in your system. Use *Express method*.
> 
> Reboot, then *Run 14.4 again* using Express method to *Install*. Before Rebooting, *go to msconfig* and uncheck CCC and raptr (if present). Reboot and test. Takes less than 15 minutes. On the Second reboot, it might take a long time to log on to Windows. Just wait till you get to the log on page.


Quote:


> Originally Posted by *rdr09*
> 
> Follow the steps. Like i said, you got to wait after the second reboot to get to Windows log on. if it still a problem (if i'm in your case), return it. Do not hesitate. Let the former owner deal with it.


How can I thank you enough?

Fresh install, installed CCC. Didn't reboot; went to msconfig > Startup > unticked CCC.
Rebooted, and here I am in Windows.

Thanks a lot, mate!


----------



## sugarhell

Quote:


> Originally Posted by *fateswarm*
> 
> So why do games like Far Cry 3 and Sleeping Dogs crash on some overclocks when others don't?


They just use 100% of the shaders. Far Cry 3 is terribly optimized, and Sleeping Dogs is a 100% shader-based game.


----------



## BranField

I think I can be added from this (http://www.overclock.net/t/1506751/sapphire-290x-vaporx-thermal-pad-quality-control-issues/0_50); I'm at work at the moment so I can't get a GPU-Z screenshot with my name. If needed I will add one to my post when I get back home.

Edit: validation added

Sapphire R9 290X OC Vapor-X Tri-X


----------



## Zipperly

Quote:


> Originally Posted by *machomen*
> 
> So after 10 min furmark on 60 % fanspeed which is actually really loud on the ASUS DCUII
> 
> 
> 
> So the temps are as followed
> 
> GPU - 89 Degrees celcius
> VRM 1 - 122 Degrees celcius
> VRM 2 - 86 Degrees celcius
> 
> BTW my GPU is not overclocked !


It is a known issue that the Asus DCU II is a crappy cooler for the hot-running Hawaii. Asus is using the exact same cooler on your card as they used with GK110 (which was okay there, because GK110 runs cooler).

Also, your VRM temps are very dependent on airflow from the cooler, so you need to keep the fan cranked up a bit and off auto.


----------



## dantoddd

Hey, just got a reference 290X card for cheap. What kind of noise issues am I going to experience? Will headphones be enough to deal with the extra noise? Also, I've heard that the card will throttle; practically speaking, if I'm gaming at 1080p, how much of an issue is this?


----------



## Jflisk

Quote:


> Originally Posted by *mus1mus*
> 
> Nope.. No OC either..
> 
> Bought it used last week.. Ran benches that seemed fine, with temps within 80 on the cores. Less than 70 on the VRMs. Happened after a Windows update.
> 
> So I reinstalled my OS.. It will work without Catalyst..After the install, reboot to a black screen at log on..


It sounds like the card is bad and is not taking the drivers or Catalyst. I had a 7990 that would do the same thing: fine with the Windows driver, but it would not take the AMD driver (RMA, of course). So if a return is an option, I would take it.


----------



## mus1mus

Quote:


> Originally Posted by *Jflisk*
> 
> The card sounds like its bad and is not taking the drivers or catalyst. I had a 7990 that would do the same thing fine with windows drive would not take the AMD driver (RMA of course). So if return is an option I would take it.


Got it working, bro. Thanks for the tip though.

CCC was just messing up the startup.

See post 28927.


----------



## bluedevil

Quote:


> Originally Posted by *dantoddd*
> 
> hey just got a reference 290x card for cheap. what kind of noise issues am i going to experience. Will headphones be enough to deal with the extra noise. Also i've heard that the card will throttle. Practically, if i'm gaming 1080p how much of an issue is this.


When I had my reference 290, it was loud even with headphones. Plus you are going to battle thermal throttling once it gets to 95C (and it will get to 95C). IMO, get an AIO and slap it on there, and be sure to get some cooling on the VRMs.

On a side note, a 290X on a single 1080p screen at 60Hz is complete overkill. I sure hope you have it cranked to 120Hz, or are looking to get multiple screens or 1440p.


----------



## Spectre-

Quote:


> Originally Posted by *bluedevil*
> 
> When I had my 290 reference, it was loud, even with headphones. Plus you are going to battle thermal throttling once it gets to 95C (and it will get to 95C). IMO, get a AIO and slap it on there. Be sure to get some cooling on the VRMs.
> 
> On a side note, a 290X on a single 1080P screen is complete overkill at 60hz. I sure hope you have it cranked to 120hz or looking to get multiple screens or 1440P.


WHHATTT

I thought this was Overkill.net.

On another note, my reason for getting the 290X was benching.


----------



## Jflisk

Quote:


> Originally Posted by *mus1mus*
> 
> Got it working bro.. Thanks for the tip though.
> 
> CCC is just messing the startup..
> 
> See post 28927.


Okay, cool. I guess with so many posts I missed that one.


----------



## Zipperly

Quote:


> Originally Posted by *bluedevil*
> 
> When I had my 290 reference, it was loud, even with headphones. Plus you are going to battle thermal throttling once it gets to 95C (and it will get to 95C). IMO, get a AIO and slap it on there. Be sure to get some cooling on the VRMs.
> 
> On a side note, a 290X on a single 1080P screen is complete overkill at 60hz. I sure hope you have it cranked to 120hz or looking to get multiple screens or 1440P.


I replaced the stock cooler on my old 290X with the Arctic Accelero 3 and it worked well for me. I ran a 1200MHz core overclock with zero throttling at 1.328V, with max gaming load temps of 64C and VRM1 hitting 79C, as long as I had the fans on the cooler set to 100 percent; VRM cooling with this style of cooler is dependent on fan speed, and even at 100 percent this cooler is nearly silent.

I now have a GTX 780 with the Accelero 4 mounted on it. I love these coolers.


----------



## fateswarm

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fateswarm*
> 
> So why do games like Far Cry 3 and Sleeping Dogs crash on some overclocks when others don't?
> 
> 
> 
> They just use 100% shaders. Far cry 3 is terrible optimized and sleeping dogs is 100% shader based game
Click to expand...

That makes sense. I suspected they do something nasty to the GPU. Like not using the VRAM as much and 'burning' the chip..


----------



## Zipperly

Quote:


> Originally Posted by *fateswarm*
> 
> That makes sense. I suspected they do something nasty to the GPU. Like not using the VRAM as much and 'burning' the chip..


Far Cry 3 isn't horribly optimized, but enabling 4xMSAA will eat up your frame rates. My old HD 7970 ran FC3 great with everything maxed at 1920x1080 (mostly 60fps with dips into the 40's) as long as I had MSAA disabled; the game uses a deferred AA technique anyway, and adding MSAA to it provides very little improvement in image quality while greatly reducing your FPS. It is true that both Watch Dogs and Far Cry 3 are pretty demanding, and as a result this will obviously affect the stability of your GPU overclock if you are already teetering on the edge of instability. BTW - neither game is going to burn your chip due to low VRAM usage, it doesn't work that way.


----------



## sugarhell

Quote:


> Originally Posted by *Zipperly*
> 
> Far Cry 3 isn't horribly optimized, but enabling 4xMSAA will eat up your frame rates. My old HD 7970 ran FC3 great with everything maxed at 1920x1080 (mostly 60fps with dips into the 40's) as long as I had MSAA disabled; the game uses a deferred AA technique anyway, and adding MSAA to it provides very little improvement in image quality while greatly reducing your FPS. It is true that both Watch Dogs and Far Cry 3 are pretty demanding, and as a result this will obviously affect the stability of your GPU overclock if you are already teetering on the edge of instability. BTW - neither game is going to burn your chip due to low VRAM usage, it doesn't work that way.


Probably you never had the awful frame latency from Far Cry 3, no matter what the GPU. Or the difference between DX9 vs DX11 was minimal. Or I can go on for 10 mins.

Also inaccurate memory management, poor shader port from consoles and in general bad experience with dx11 can crash your gpu easily. (like far cry 3)


----------



## Zipperly

Quote:


> Originally Posted by *sugarhell*
> 
> Probably you never had the awful frame latency from Far Cry 3, no matter what the GPU. Or the difference between DX9 vs DX11 was minimal. Or I can go on for 10 mins.
> 
> Also inaccurate memory management, poor shader port from consoles and in general bad experience with dx11 can crash your gpu easily. (like far cry 3)


The latency issue was remedied by simply capping your FPS to 60 using RivaTuner. Also, it's true there was minimal difference between DX9 and DX11, but FC3 never once crashed my overclocked GPU. If it is crashing yours and you are overclocked, then try backing the overclock off a bit.


----------



## dantoddd

When the 290X throttles, is there a discernible performance drop?


----------



## Talon720

Quote:


> Originally Posted by *Mega Man*
> 
> On mobile can't edit quotes down sorry
> Just an FYI, you will probably spend more to swap your power supply than you will save in efficiency


Good point on the psu just a thought that was brewing.
Quote:


> Originally Posted by *Mega Man*
> 
> 1 x 120 + 1 per component is recommended
So for my trifire and CPU+VRM block I have a 360 and a 240 - is this considered enough? I know there are other factors such as how fast the fans are going and such. I'm working on a 3rd rad, a 240, so my fans don't have to be cranked up.


----------



## heroxoot

Quote:


> Originally Posted by *dantoddd*
> 
> When the 290X throttles, is there a discernible performance drop?


Yes. When it throttles, the clocks drop to prevent damage and to decrease heat. I have only throttled once and it made me drop a lot of FPS in a benchmark. Otherwise, if the clock is not maxing out, the card probably has no reason to do so - or so it thinks. I still don't max out clocks on Borderlands 2, but forcing max clocks does not give me any more performance, so I guess it does not need to? FPS could be better sadly, but the game won't force more load no matter what I do.


----------



## kizwan

Quote:


> Originally Posted by *Talon720*
> 
> So for my trifire and CPU+VRM block I have a 360 and a 240 - is this considered enough? I know there are other factors such as how fast the fans are going and such. I'm working on a 3rd rad, a 240, so my fans don't have to be cranked up.


Cooling depends on the ambient. If your ambient is below 25C, then I think it should be fine. Mine was a 360 + 240 too, but it couldn't keep up, especially when overclocked, because my ambient is always above 29C. After I added another 120 it's much better, but I still need to remove the side panel when benching or gaming because I only have one exhaust fan at the back.

For your setup, using the rule of thumb, you'll need 720mm of rad space (360 + 360, or 360 + 240 + 120) at least. However, for 290's I would add another 120 per (GPU) block, if I had the space.
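kizwan's numbers follow directly from the "1 x 120 + 1 per component" rule of thumb quoted earlier in the thread. As a rough sketch of that arithmetic (illustrative only - real requirements also depend on fan speed, ambient, and rad thickness):

```python
def recommended_rad_mm(water_blocks):
    """Rule-of-thumb loop sizing: one 120mm radiator section as a base,
    plus one 120mm section per water block (CPU, GPU, VRM) in the loop."""
    return 120 * (1 + water_blocks)

# Trifire 290s plus CPU and VRM blocks = 5 blocks -> 720mm of rad space,
# e.g. a 360 + 360, or a 360 + 240 + 120.
print(recommended_rad_mm(5))  # 720
```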


----------



## devilhead

now played BF4 for an hour at 1250/1600 with +137mV; per GPU-Z it stays around 1.25V (some spikes to 1.28V) and it was stable with no glitches or problems







max GPU temperature 48C / VRM2 46C.


----------



## mojobear

Hey I would agree with Kizwan....those 290s put out a lot of heat. I have quadfire 290s, and when I OC them with +118 mV 1165/1350 temps go up to high 50s. This is with a 420 pull, 360 push/pull, and a 480 pull.








Quote:


> Originally Posted by *kizwan*
> 
> Cooling depends on the ambient. If your ambient is below 25C, then I think it should be fine. Mine was a 360 + 240 too, but it couldn't keep up, especially when overclocked, because my ambient is always above 29C. After I added another 120 it's much better, but I still need to remove the side panel when benching or gaming because I only have one exhaust fan at the back.
> 
> For your setup, using the rule of thumb, you'll need 720mm of rad space (360 + 360, or 360 + 240 + 120) at least. However, for 290's I would add another 120 per (GPU) block, if I had the space.


----------



## tdbone1

let me start by saying BF series is my favorite game.

i just sold my gtx 770 on ebay.
was going to get a pair of 290's or 290x's if the price is right
anyhow i see the new haswell-e cpu's just dropped so im thinking about getting the 5820k as its 6c/12thread

anyhow the only problem i had in the 1080P Ultra Preset (less no AA) was that i would get a cpu/gpu spikes in that "perfoverlay.drawgraph 1" during heavy scenes.
95% of the time it was above 80FPS
only during those heavy action would i "feel" it and see it on the graph

so i decided to sell the gtx 770.
paypal releases my pending payment tomorrow so i'm shopping now.

so confused
do i get dual 290's to fix this problem or 4790K or 5820K (6core)

ive looked at cpu usage for my 8320 @ 4.8ghz and the 2nd core is always riding close to maxing out in the windows performance tab when i click on cpu and show all logical cores.

the gtx 770 was maxed out most of the time.

i read a few pages in this thread about the 290 series in crossfire (gpu usage)
the hwinfo64 graphs ive seen show each card spiking off and on.

can i "feel" this when im playing bf4 multiplayer 64 player maps on 1080P "ultra preset" with 4x AA and everything turned on if i get those 290's for crossfire with the rig in my sig?

really want to make the right decision here as i wont be able to do it again for a long time.

i know ive seen nvidia cards in sli and the graphs show no spikes up and down for gtx 780 in sli etc..

thanks and please respond asap as im ready to pull the trigger on something (cpu/mb or dual gpu's)

thanks again
would love to see some data if someone can post some. (graphs are great for cpu and both gpu's)

i know its overkill for 1080P but i can increase the scaling resolution if i have room to play with.
most importantly is 1080P Ultra Preset.... no custom settings
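For anyone curious about the "perfoverlay.drawgraph 1" overlay mentioned above: Frostbite console variables can also be placed in a user.cfg file in the BF4 install folder so they apply on every launch. The variable names below are from memory and may not match every game version exactly, so treat them as an assumption:

```
PerfOverlay.DrawGraph 1
PerfOverlay.DrawFps 1
```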


----------



## mus1mus

you should be adding that you have mismatched 4GB RAM - 1066 to be exact, OC'd to 1333MHz - and that your system doesn't use a page file.









Sorry mate, before you flood every thread involved with your rig, fix your system.


----------



## Mega Man

so.... you know you will have to buy new ram right ? ( DDR4 )

you atm need ram ( you mix mismatched sets + the fact you only have 6gb )

and even if you bought a new CPU/mobo/ram

what gpu would you use then ??


----------



## tdbone1

for the 1000th time
my ram is fine
i have proved this over and over and over with hwinfo64 open and windows task manager set to the memory tab.

6GB with no pagefile and i am getting NO out of memory errors.

im getting the video cards or cpu or mb first.

i have also said i will get more ram later, either 8GB or 16GB, but for NOW i am fine on ram - you are going to have to trust me.
i run win8.1 64bit fresh install and i have very very little extra
at full bootup and wait 5mins im only using 630MB ram TOTAL
that leaves over 5GB for BF4

ok now that we are past that part (as this i know for a fact), what about the 290's questions i had above
thanks and please no more about the ram. im not new.
bf has been my fav game since bf1
i know all about the series
what i dont know is 290's in cfx and also this cpu vs intel cpu (with this game)
i know about the ram


----------



## Mega Man

first, i dont trust you, because i know you're wrong.

but all that is besides the point

x99 uses DDR4

you _*WILL*_ need new ram, you _*CAN NOT*_ use ddr3 in it

yet again you side step the questions and play the "trust me, i'm an engineer" card - please note that is a metaphor, i am not saying you think you are an engineer

we have covered this in MULTIPLE threads, same posts, same questions, you refuse to listen, please dont pollute this one too, only to not listen to what is suggested to fix your problem

and again i ask, what GPU would you use assuming you got x99- ( new cpu/mb/ram )

lastly i note, you have not said you will get new ram; you said you will get another set of, as you put it, "matching" ram ( not matching, just another kit ) and mix it once again, even though it is slow and outdated


----------



## kizwan

I thought the page file is important for BF4. Mine is set to a static value of 16GB (instead of system managed).

BTW, this is my minimum RAM usage.


----------



## KyadCK

Quote:


> Originally Posted by *kizwan*
> 
> I thought the page file is important for BF4. Mine is set to a static value of 16GB (instead of system managed).
> 
> BTW, this is my minimum RAM usage.


Idle usage depends on many things. One of those things is how much RAM you have. In a 16-32GB system it'll use about 3-4GB idle. In a 4GB one it'll use about 1-1.5GB idle. Newer Windows always likes to use more, but it won't do so at the cost of not being able to run other things.
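KyadCK's figures amount to idle usage scaling with installed RAM. A toy interpolation of just the numbers quoted in this post (illustrative only - nothing here is documented Windows behaviour):

```python
def rough_idle_gb(total_gb):
    """Very rough idle RAM usage, interpolated from the figures above:
    ~1.25GB on a 4GB system, ~3.5GB at 16GB and beyond."""
    if total_gb <= 4:
        return 1.25
    if total_gb >= 16:
        return 3.5
    # linear interpolation between the two quoted points
    return 1.25 + (total_gb - 4) * (3.5 - 1.25) / (16 - 4)

print(rough_idle_gb(8))  # midway between the quoted ranges: 2.0
```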


----------



## tdbone1

my question is not about ram.


----------



## Sgt Bilko

Quote:


> Originally Posted by *tdbone1*
> 
> let me start by saying BF series is my favorite game.
> 
> i just sold my gtx 770 on ebay.
> was going to get a pair of 290's or 290x's if price is right)
> anyhow i see the new haswell-e cpu's just dropped so im thinking about getting the 5820k as its 6c/12thread
> 
> anyhow the only problem i had in the 1080P Ultra Preset (less no AA) was that i would get a cpu/gpu spikes in that "perfoverlay.drawgraph 1" during heavy scenes.
> 95% of the time it was above 80FPS
> only during those heavy action would i "feel" it and see it on the graph
> 
> so i decided to sell the gtx 770.
> paypal release my pending payment tomorrow so i shopping now.
> 
> so confused
> do i get dual 290's to fix this problem or 4790K or 5820K (6core)
> 
> ive looked at cpu usage for my 8320 @ 4.8ghz and the 2nd core is always riding close to maxing out in window performace tab when i click on cpu and show all logical cores.
> 
> the gtx 770 was maxed out most of the time.
> 
> i read a few pages in this thread about the 290 series in crossfire (gpu usage)
> the graphs in hwinfo64 show each card spiking off and on in the graphs i seen.
> 
> can i "feel" this when im playing bf4 multiplayer 64 player maps on 1080P "ultra preset" with 4x AA and everything turned on if i get those 290's for crossfire with the rig in my sig?
> 
> really want to make the right decision here as i wont be able to do it again for a long time.
> 
> i know ive seen nvidia series in sli and the graphs show now spikes up and down for gtx 780 in sli etc..
> 
> thanks and please respond asap as im ready to pull the trigger on something (cpu/mb or dual gpu's)
> 
> thanks again
> would love to see some data if someone can post some. (graphs are great for cpu and both gpu's)
> 
> i know its overkill for 1080P but i can increase the scaling resolution if i have room to play with.
> most importantly is 1080P Ultra Preset.... no custom settings


First problem, you'd be hard pressed to get 100% usage across both cards at 1080p no matter the CPU you are using. (unless you are doing Mantle/150%+ res scale and Ultra)

And no, my usage spikes in Crossfire at 1080p but I don't feel it happening. I do with DX but not with Mantle.

That's my thoughts on it.

EDIT: At 1080p Ultra with Mantle i get anywhere between 140 - 200+ fps, most of the time it hangs around 180fps or so.


----------



## Arizonian

Quote:


> Originally Posted by *BranField*
> 
> Think i can be added from this (http://www.overclock.net/t/1506751/sapphire-290x-vaporx-thermal-pad-quality-control-issues/0_50), im at work at the moment so cant get a gpu-z screenshot with name. if needed i will add one to my post when i get back home.
> 
> Edit: validation added
> 
> sapphire r9 290x oc vaporx trix
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Will be away for the weekend. Good place for me to set as a marker to start reading from when I return and update if needed.


----------



## BranField

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Will be away for the weekend. Good place for me to set as a marker to start reading from when I return and update if needed.


awesome, thanks. enjoy your weekend away

now just to wait for that vapor-x block


----------



## mikep577

Sadly I have swapped both my water-cooled R9 290X's for EK water-cooled Nvidia EVGA 780 Ti Kingpin Editions, as I was fed up with the poor drivers and disappointing performance in games like Grid Autosport etc. I have to say that my new 780 Ti's are faster and much smoother in game.

Good luck guys.


----------



## Vici0us

What's the max temp for vrm1?


----------



## sugarhell

Quote:


> Originally Posted by *mikep577*
> 
> Sadly I have swapped both my water-cooled R9 290X's for EK water-cooled Nvidia EVGA 780 Ti Kingpin Editions, as I was fed up with the poor drivers and disappointing performance in games like Grid Autosport etc. I have to say that my new 780 Ti's are faster and much smoother in game.
> 
> Good luck guys.


ok


----------



## tsm106

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mikep577*
> 
> Sadly I have swapped both my water-cooled R9 290X's for EK water-cooled Nvidia EVGA 780 Ti Kingpin Editions, as I was fed up with the poor drivers and disappointing *performance in games like Grid Autosport* etc. I have to say that my new 780 Ti's are faster and much smoother in game.
> 
> Good luck guys.
> 
> 
> 
> ok
Click to expand...











Codies games are specifically coded to capitalize on AMD's compute capability. Can only relate to my own experience, but even my trifire 7970s could average 120fps in Grid AS maxed out (8x16) and obviously with advanced lighting and global illum on at a res of 5900x1080.


----------



## th3illusiveman

Quote:


> Originally Posted by *mikep577*
> 
> Sadly I have swapped both my water-cooled R9 290X's for EK water-cooled Nvidia EVGA 780 Ti Kingpin Editions, as I was fed up with the poor drivers and disappointing performance in games like Grid Autosport etc. I have to say that my new 780 Ti's are faster and much smoother in game.
> 
> Good luck guys.


shoulda waited for the 880/980....


----------



## sugarhell

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Codies games are specifically coded to capitalize on AMD's compute capability. Can only relate to my own experience, but even my trifire 7970s could average 120fps in Grid AS maxed out (8x16) and obviously with advanced lighting and global illum on at a res of 5900x1080.


What's the point of coming here and saying "I changed my 290X to a 780 Ti and I have faster performance"? Good for you, but it's just showing off.


----------



## VSG

Maybe he wanted Arizonian to update the member list? I did the same earlier on (although I am likely coming back







)


----------



## redklu3

Quote:


> Originally Posted by *kizwan*
> 
> You don't need to go to that extreme. All you need is a full waterblock like XSPC & a thermal pad with thermal conductivity between 7 and 17.0 W/mK - Phobya 7W/mK, Fujipoly Extreme 11W/mK, or Fujipoly Ultra Extreme 17W/mK. The latter is expensive though.
> 
> Thermal pads are non electrically conductive.


thanks man, I might not buy a full waterblock now, I've just finished building the system, but I might still add a small rear cooler for the GPU. It's currently on a Kraken G10. It's great that XFX put a flat copper VRM heatsink on the PCB covering all the VRMs and RAM modules, and my Corsair Dominator fins fit perfectly on it! I dropped VRM temps by 10-15 degrees but would like to drop a little bit more if possible. Also, I can now increase memory without getting runt frames after running Valley longer.

I'll post pictures tomorrow, and hopefully join this r9 290x owners club


----------



## redklu3

hello all! would like to join this club with my XFX r9 290x DD.









It's tuned up with a Kraken G10 (modded), Thermaltake Water 2.0, Noctua fans, Corsair Dominator fins (modded), and an XSPC back plate (modded), though it still runs HOT


----------



## tdbone1

just got a 4770k off ebay for $250 shipped and ordered the asus maximus vii gene off newegg http://www.newegg.com/Product/Product.aspx?Item=N82E16813132136&cm_re=asus_maximus_vii_gene-_-13-132-136-_-Product

think i'm going to go with a pair of r9 290's when i get some money, so i hope that will complete my rig besides some ram and a case

i have an h100i that i want to use, so i'll prob go back to the mitx thread and ask for some ideas.


----------



## Mega Man

1 three letter word.

RAM


----------



## hwoverclkd

Quote:


> Originally Posted by *tdbone1*
> 
> just got a 4770k off ebay for $250 shipped and ordered the asus maximus vii gene off newegg http://www.newegg.com/Product/Product.aspx?Item=N82E16813132136&cm_re=asus_maximus_vii_gene-_-13-132-136-_-Product
> 
> think i'm going to go with a pair of r9 290's when i get some money, so i hope that will complete my rig besides some ram and a case
> 
> i have an h100i that i want to use, so i'll prob go back to the mitx thread and ask for some ideas.


if you're not on a budget, for $49 more you could have bought a 5820k - it sells for $299 at microcenter. Of course the motherboard will be a little more expensive.


----------



## tdbone1

i priced the 5820 and it was going for 369 on pre-order the night before they were released. i think its 389 on the egg right now, or 369. anywhoot, i researched the ddr4 ram and the new motherboards.....lol....that 49 more at mc is funny cause thats only a drop in the bucket


----------



## tdbone1

ah, then you are agreeing with me, but i'm getting ram LAST


----------



## Mega Man

have fun playing with your mixed 1066 ram OC'd to 1333


----------



## davidm71

Can someone with an Asus DirectCU II OC card post your GPU-Z screen pic? I want to confirm that my bios version is correct.
Mine says 015.041.000.002.000000 (113-AD62900-103). I want to make sure I didn't flash the wrong version of the vbios the
first day I installed the card, because now if I go over 14.4 (like 14.8) I get lockups..

Thanks


----------



## uaedroid

Is the new AMD Catalyst 14.8 WHQL working great?


----------



## Aussiejuggalo

Quote:


> Originally Posted by *uaedroid*
> 
> Is the new AMD Catalyst 14.8 WHQL working great?


14.8 WHQL?







do you mean 14.4?


----------



## uaedroid

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> 14.8 WHQL?
> 
> 
> 
> 
> 
> 
> 
> do you mean 14.4?


No Sir, i really mean 14.8 WHQL. Here it is...http://www.guru3d.com/files-details/amd-catalyst-14-8-whql-%2814-201-1008-august-12%29-download.html


----------



## Aussiejuggalo

Quote:


> Originally Posted by *uaedroid*
> 
> No Sir, i really mean 14.8 WHQL. Here it is...http://www.guru3d.com/files-details/amd-catalyst-14-8-whql-%2814-201-1008-august-12%29-download.html










they're not even on AMD's site yet









Once AMD uploads them to their site I'll download them


----------



## Mega Man

they leak them before they put them up on their site


----------



## sugarhell

Quote:


> Originally Posted by *Mega Man*
> 
> they leak them before they put them up on their site


It's not a leak. It's an APU driver but it supports desktop GPUs too


----------



## Mega Man

i meant they put them out on another site before they put them on their main site


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> It's not a leak. It's an APU driver but it supports desktop GPUs too


Any release notes floating around?


----------



## rv8000

Quote:


> Originally Posted by *BradleyW*
> 
> Any release notes floating around?


None so far.


----------



## BradleyW

OK. If anyone finds them, post them here.


----------



## sugarhell

Quote:


> Originally Posted by *Mega Man*
> 
> i meant they put them out on another site before they put them on their main site


i already told you, it is an apu driver in the apu section of the site...


----------



## BranField

Soo happy right now

EK sapphire 290x waterblock facebook picture


----------



## rdr09

Quote:


> Originally Posted by *BranField*
> 
> Soo happy right now
> 
> EK sapphire 290x waterblock facebook picture


make sure you remove the protective films on the pads.


----------



## rv8000

To anyone willing to re-tim their cards, do it! Just got a 4c drop from swapping stock paste on my Vapor-X (which was way too much as usual) to gelid gc extreme.


----------



## BranField

Quote:


> Originally Posted by *rdr09*
> 
> make sure you remove the protective films on the pads.


ohh that will be the first thing i do lol

will also be the first thing i check in the future as well


----------



## th3illusiveman

Hey guys, I'm playing Skyrim with the RealVision ENB but my overclock fluctuates a lot when running that game - between 1040-1100MHz. I can stop it by disabling PowerPlay support but that's not optimal. Is there something I'm missing? This is the only game that does this as far as I know.


----------



## schoolofmonkey

Ok guys I'm finally over the green team, with the new Matrix having issues I'm going RED.

What is the Sapphire R9 290X Tri-X OC like? I can pick one up for $600.
But I've read about black screens - is that still a problem?

I just want something I can put in my machine and use, no overclocking, just something I can finally play a game with.


----------



## th3illusiveman

wait for maxwell, at the very least you will get lower AMD prices.


----------



## hwoverclkd

Quote:


> Originally Posted by *schoolofmonkey*
> 
> Ok guys I'm finally over the green team, with the new Matrix having issues I'm going RED.
> 
> What is the Sapphire R9 290X Tri-X OC like, can pick one up for $600.
> But I've read about black screens, is it a problem still?
> 
> I just want something I can put in my machine and use, no overclocking, just something I can finally play a game with.


lol i thought u got two matrix there, one red and one green







some still have the black screen; many managed to fix it via a small tweak. I'd say there's about a 30-40% chance you'll get one.


----------



## jagdtigger

Greetings.

I bought this card (Sapphire R9 290X) in January and it has a lot of power, but the reference cooling was too rowdy so I replaced it with a Koolance VID-AR290X water block:
http://www.techpowerup.com/gpuz/details.php?id=6wcuf

And I ran into a little problem just recently. I used two monitors without problems - one on DVI-D and one on HDMI. But when I try to connect a third monitor (a TV to be precise) with a DVI->HDMI adapter I get a BSOD... I tested the adapter on both DVI ports with all the monitors and it works without problems. Does anyone have any idea what's going on?









/EDIT
I just got curious and fired up Kubuntu to test whether it's software related, and bingo - Kubuntu boots up without a kernel panic with three screens. Downloaded 14.7 RC and after install I can use both DVI-D ports







.


----------



## chronicfx

Got my 4790k up and running.

First Firestrike score. I just overclocked it tonight too, so "stable"? Not so sure yet.









http://www.3dmark.com/3dm/3934887


----------



## schoolofmonkey

Quote:


> Originally Posted by *th3illusiveman*
> 
> wait for maxwell, at the very least you will get lower AMD prices.


I didn't. I'm getting a refund on the GTX 780 Ti Matrix - no more nvidia for a while, I've had a string of dodgy GTX 780 Ti's and I'm over it.

Currently in my machine, VRM temps hit 80c and the GPU hasn't gone over 77c, which from what I understand is really good for the R9 290X.

I did notice a few FPS drops using the same settings in game, but nothing to complain about - it still runs Watch_Dogs smoothly at the same settings as the GTX 780 Ti.
A little bit of coil whine, but I've come to accept that in all high end cards.

The one thing I am happy with as well is that this card matches every review I read - temp/performance/noise are exactly the same as in the multiple reviews I read.
It's the first and only new card that has done that; the Matrix was 20c hotter than reviews, and the VRMs were 100c, a full 30c hotter than in Guru3d's review.


----------



## th3illusiveman

Quote:


> Originally Posted by *schoolofmonkey*
> 
> I didn't, getting a refund on the GTX780ti Matrix, no more nvidia for a while, I have had a string of dodgy gtx780ti's and I'm over it.
> 
> Currently in my machine, VRM temps hit 80c, GPU hasn't gone over 77c, which from what I understand is really good for the R290x
> 
> I did notice a few FPS drops using the same settings in game, but nothing to complain about, still runs Watch_Dogs smoothly at the same setting as the GTX780ti.
> Little bit of coil whine, but I've came to accept that in all high end cards.


Congrats then. Set up a custom fan profile and try to keep those VRMs under 80c and report back with OC results


----------



## schoolofmonkey

Quote:


> Originally Posted by *th3illusiveman*
> 
> Congrats then. Set up a custom fan profile and try to keep those VRMs under 80c and report back with OC results


One strange thing though, the GPU temp won't display in Heaven - it shows 4353266c, umm, don't think that's right.
Idle temps are high - GPU 45-50c, VRMs 43c - but that's ok, fan speed is 20%, so that's understandable.

These are the temps after 15 minutes of Heaven, ambient temps 26c:


----------



## kizwan

Quote:


> Originally Posted by *schoolofmonkey*
> 
> Quote:
> 
> 
> 
> Originally Posted by *th3illusiveman*
> 
> Congrats then. Set up a custom fan profile and try to keep those VRMs under 80c and report back with OC results
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One strange thing though, the GPU temp won't display in Heaven, it shows 4353266c, umm don't think that's right.
Click to expand...

Unigine doesn't report the correct temperature on 290's.


----------



## schoolofmonkey

Quote:


> Originally Posted by *kizwan*
> 
> Unigine doesn't report the correct temperature on 290's.


Are these idle temps normal?
GPU 45 - 50c, VRM's 43c, fan speed is 20%, but even at 40% they are the same.

I did create a custom fan profile so now the VRM's are sitting at about 66c under load, GPU is around the same, no louder than any other card I've used.


----------



## rv8000

Quote:


> Originally Posted by *schoolofmonkey*
> 
> Are these idle temps normal?
> GPU 45 - 50c, VRM's 43c, fan speed is 20%, but even at 40% they are the same.
> 
> I did create a custom fan profile so now the VRM's are sitting at about 66c under load, GPU is around the same, no louder than any other card I've used.


Great temps for a relatively warm ambient. If you want to get the core down a few degrees you can repaste it, dropped my vapor-x 4c by changing out the TIM.


----------



## kizwan

Quote:


> Originally Posted by *schoolofmonkey*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Unigine doesn't report the correct temperature on 290's.
> 
> 
> 
> 
> 
> 
> 
> Are these idle temps normal?
> GPU 45 - 50c, VRM's 43c, fan speed is 20%, but even at 40% they are the same.
> 
> I did create a custom fan profile so now the VRM's are sitting at about 66c under load, GPU is around the same, no louder than any other card I've used.
Click to expand...

Looks ok to me. Depends on the ambient temp though. I remember with my reference cards and the reference cooler, I'd need to turn the fans up to around 40 - 50% for the cards to idle in the same temp range. Your load temps look great. Mine are higher than that, around 70s to 80s Celsius with the fan at 80%, because my ambient is usually 31+ Celsius without A/C.


----------



## schoolofmonkey

Quote:


> Originally Posted by *rv8000*
> 
> Great temps for a relatively warm ambient. If you want to get the core down a few degrees you can repaste it, dropped my vapor-x 4c by changing out the TIM.


Quote:


> Originally Posted by *kizwan*
> 
> Look ok to me. Depends on the ambient temp though. I remember my referenced cards with referenced cooler, I'll need to turn up the fans around 40 - 50% for the cards to idle at the same temp range. Your load temps look great. Mine higher than that, around 70s to 80s Celsius with fan at 80% because my ambient usually 31++ Celsius without A/C.


I just ran the Adrenaline Action Benchmark Tool (Bioshock Infinite, Sleeping Dogs, Tomb Raider) on xtreme settings twice; these are the temps I'm getting with the stock fan profile, no overclock, and an ambient temp of 26c (not even a fan on in the house.)

http://s1294.photobucket.com/user/schoolofmonkey2/media/newtemps_zpsfd60149b.gif.html

This look about right?


----------



## Mega Man

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *schoolofmonkey*
> 
> Quote:
> 
> 
> 
> Originally Posted by *th3illusiveman*
> 
> Congrats then. Set up a custom fan profile and try to keep those VRMs under 80c and report back with OC results
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One strange thing though, the GPU temp won't display in Heaven, it shows 4353266c, umm don't think that's right.
> 
> Click to expand...
> 
> Unigine doesn't report the correct temperature on 290's.
Click to expand...

sure they do! mine is only 27 million c

how can you not believe my gpu is approx 5 times the temp of the sun!


----------



## rv8000

Quote:


> Originally Posted by *schoolofmonkey*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I just ran Adrenaline Action Benchmark Tool (Bioshock Inf, Sleeping Dogs, Tomb Raider) an xtreme settings twice, these are the temps I'm getting with Stock fan profile, no overclock and an ambient temp of 26c (not even a fan on in the house.)
> 
> http://s1294.photobucket.com/user/schoolofmonkey2/media/newtemps_zpsfd60149b.gif.html
> 
> 
> 
> This look about right?


If your case ambient is around 30c, then those load temps are normal; the delta for the Tri-X based coolers is roughly 49c above ambient under load. A more aggressive fan profile will help - the fan profile I currently have set maxes around 55-58% depending on ambient, and I haven't seen anything warmer than 74c today with a 29c room ambient (after repasting earlier today).
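The delta figure above makes load temps easy to predict with simple arithmetic. A quick sketch (the 49c number is just the one quoted in this post; actual temps also depend on fan profile, paste, and airflow):

```python
TRI_X_LOAD_DELTA_C = 49  # rough Tri-X load delta above case ambient, per the post above

def expected_load_temp_c(case_ambient_c, delta_c=TRI_X_LOAD_DELTA_C):
    """Expected core temp under load = case ambient + the cooler's typical delta."""
    return case_ambient_c + delta_c

print(expected_load_temp_c(30))  # 79
```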


----------



## schoolofmonkey

Quote:


> Originally Posted by *rv8000*
> 
> If your case ambient is around 30c, then those load temps are normal, delta for the Tri-X based coolers is roughly 49c under load, above ambient. More aggressive fan profile will help, current fan profile I have set maxes around 55-58% depending on ambient and I haven't seen any warmer than 74c today with a 29c room ambient (after repasting earlier today).


You have a Sapphire Vapor-X; guessing temps are a little better on that..
If it's not too much trouble, can I check out your fan profile so I can get an idea?


----------



## rv8000

Quote:


> Originally Posted by *schoolofmonkey*
> 
> You have a Sapphire Vapor-X; guessing temps are a little better on that..
> If it's not too much trouble, can I check out your fan profile so I can get an idea?


Not a huge improvement in core temps on the Vapor-X (I don't think reviews show much more than a 1-2c difference), but the VRM cooling is vastly superior, thanks to the better overall VRM design.



My fan profile is normally 5% higher than the stock profile, but I have it set to basically match idle temp to fan % (I normally see 38c-41c depending on how warm the day is).
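
A custom fan profile like the ones discussed here is just a piecewise-linear map from core temperature to fan speed. A minimal sketch — the breakpoints below are made-up illustrative values, not anyone's actual profile:

```python
# Piecewise-linear fan curve: interpolate fan % between (temp, fan%) breakpoints.
# Breakpoints are illustrative only, not a recommended profile.

CURVE = [(30, 20), (50, 35), (70, 55), (80, 75), (90, 100)]  # (temp c, fan %)

def fan_percent(temp_c: float, curve=CURVE) -> float:
    """Return fan % for a given core temp by linear interpolation."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(60))  # 45.0 (halfway between 35% at 50c and 55% at 70c)
```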


----------



## pdasterly

is my card dead?


----------



## schoolofmonkey

Quote:


> Originally Posted by *rv8000*
> 
> My fan profile is normally 5% higher than the stock profile, but I have it set to basically match idle temp to fan % (I normally see 38c-41c depending on how warm the day is).


This is my temps with the same run of Adrenaline Action Benchmark Tool.
Looking better, I'll probably keep that.

http://s1294.photobucket.com/user/schoolofmonkey2/media/newfan_zps30486050.gif.html


----------



## maynard14

Wohhh. What happened bro? You have two 290Xs crossfired; maybe try each one to find out which card is faulty... I hope you can still fix it and that it's just a driver issue.


----------



## pdasterly

It's my primary card; the 2nd card works fine.


----------



## Sgt Bilko

Quote:


> Originally Posted by *pdasterly*
> 
> is my card dead?
> 
> 
> Spoiler: Warning: Spoiler!


Have you got a spare monitor you can use as well?

If it's still doing it after that, then do a sweep of your drivers; if that's not it, then I'd say it's the GPU, most likely memory related.


----------



## pdasterly

Tried all three monitors, tried cables (DVI & DP). Sucks, just got my watercooled system up and running and now I have to tear it down.


----------



## kizwan

Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *schoolofmonkey*
> 
> Quote:
> 
> 
> 
> Originally Posted by *th3illusiveman*
> 
> Congrats then. Set up a custom fan profile and try to keep those VRMs under 80c and report back with OC results
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> One strange thing though, the GPU temp won't display in Heaven; it shows 4353266c, umm, don't think that's right.
> 
> 
> Unigine doesn't report correct temperatures on 290s.
> 
> 
> sure they do ! mine is only 27million c
> 
> how can you not believe my gpu is approx 5times the temp of the sun !

Quote:


> Originally Posted by *pdasterly*
> 
> tried all three monitors, tried cables( dvi & dp). Sucks just got my watercooled system up and running, now I have to tear down


A couple of things I can think of: a leak, or the water block mounted too tight.


----------



## sugarhell

Quote:


> Originally Posted by *pdasterly*
> 
> is my card dead?


Change cables?


----------



## mikep577

You're 100% right; it's just that at the time I had the opportunity to swap, the info about the new cards wasn't available yet.


----------



## Spectre-

Quote:


> Originally Posted by *sugarhell*
> 
> Change cables?


actually I had the same problem just 2 weeks ago,

and my card was gone


----------



## MojoW

Quote:


> Originally Posted by *Spectre-*
> 
> actually i had the same problem just 2 weeks ago
> 
> and my card was gone


This....
I've got one in RMA as we speak, so brace yourself.
It just happened like that with the stock cooler, and it had been in that system for six months.


----------



## schoolofmonkey

So I set out to replace the TIM on this Tri-X, thinking that might have been the reason for the high idle temps (48c idle).
What I found was completely dried-up paste that flaked off, and a scratch on the die. Replacing the TIM didn't lower idle temps, but my load temps are about 73-75c.
So I guess the high idle temps are normal...

Here's the die:


----------



## Imprezzion

That's not big enough to cause high idle temps.

48c idle indicates to me that it's not downclocking to "2D" clocks. Are you running multiple monitors, or did you disable ULPS / power saving (a.k.a. set "High Performance" in Windows)?

With 20c ambient and a normally downclocking card, idle should be about 30-35c max.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Imprezzion*
> 
> That's not big enough to cause high idle temps.
> 
> 48c idle indicates to me that it's not downclocking to "2D" clocks. Are you running multiple monitors, or did you disable ULPS / power saving (a.k.a. set "High Performance" in Windows)?
> 
> With 20c ambient and a normally downclocking card idle should be about 30-35c max.


I'm guessing Multi-monitor, my top card idles around 40c because of this, i don't mind tbh

They never get over 85c when gaming anyway


----------



## schoolofmonkey

Quote:


> Originally Posted by *Imprezzion*
> 
> That's not big enough to cause high idle temps.
> 
> 48c idle indicates to me that it's not downclocking to "2D" clocks. Are you running multiple monitors, or did you disable ULPS / power saving (a.k.a. set "High Performance" in Windows)?
> 
> With 20c ambient and a normally downclocking card idle should be about 30-35c max.


Wish that were the case; I'm getting OK load temps of around 76c, but even now it's idling at 49c.

http://s1294.photobucket.com/user/schoolofmonkey2/media/downclock_zps0f9b8852.gif.html


----------



## Sgt Bilko

Quote:


> Originally Posted by *schoolofmonkey*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Imprezzion*
> 
> That's not big enough to cause high idle temps.
> 
> 48c idle indicates to me that it's not downclocking to ''2D'' clocks. Are you running multiple monitors or did you disable ULPS / power saving (a.k.a. High Performance'' in Windows?
> 
> With 20c ambient and a normally downclocking card idle should be about 30-35c max.
> 
> 
> 
> Wish that was the case, I'm getting ok load temps of around 76c, but even now its idling at 49c.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s1294.photobucket.com/user/schoolofmonkey2/media/downclock_zps0f9b8852.gif.html

It's because your memory isn't downclocking; it's stuck at the 3D clock of 1300MHz.

For me this happens either when I run my Qnix at a high refresh rate or when I run more than one monitor.


----------



## schoolofmonkey

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It's because your memory isn't downclocking; it's stuck at the 3D clock of 1300MHz.
> 
> For me this happens either when I run my Qnix at a high refresh rate or when I run more than one monitor.


I have a 144hz monitor, would that do it?

edit: Yep, that did it. I set it to 60Hz and the temps dropped like a stone.
Crap, and I've got no money left for a new monitor.

Would running at 144hz push gaming temps up too?


----------



## Sgt Bilko

Quote:


> Originally Posted by *schoolofmonkey*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> It's because your memory isn't downclocking, it's stuck at the 3D clock of 1300Mhz
> 
> For me this is either when i run my Qnix at a high refresh rate or when i run more than one monitor.
> 
> 
> 
> I have a 144hz monitor, would that do it?
> 
> edit: Yep that did it, I set it to 60Hz and well the temps dropped like a stone.
> Crap, I ran out of money for a new monitor.
> 
> Would running at 144hz push gaming temps up too?

That would be the reason, I'd say. You can try a different connection and see if that helps; otherwise just make your fan curve a bit more aggressive than normal.

Running at that refresh rate won't push gaming temps any higher than they would be normally; it's just a higher idle temp you get.


----------



## schoolofmonkey

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That would be the reason, I'd say. You can try a different connection and see if that helps; otherwise just make your fan curve a bit more aggressive than normal.
> 
> Running at that refresh rate won't push gaming temps any higher than they would be normally; it's just a higher idle temp you get.


I'll try the DisplayPort; otherwise I can set the desktop refresh to 120Hz, which allows the memory to downclock and doesn't cause any 2D issues.


----------



## Sgt Bilko

Quote:


> Originally Posted by *schoolofmonkey*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> That would be the reason i'd say, you can try a different connection with it and see it that helps otherwise just have your fan curve a bit more aggressive than normal.
> 
> Running at that refresh rate wouldn't push gaming temps any higher than they would be normally, it's just a higher idle temp you get.
> 
> 
> 
> I'll try the DisplayPort; otherwise I can set the desktop refresh to 120Hz, which allows the memory to downclock and doesn't cause any 2D issues.

Yeah, I'm not sure how the connections might affect the downclocking; someone else might have a better idea, but 120Hz is still a decent refresh rate if you're worried about the idle temps.

Glad to see you figured out the problem at any rate


----------



## schoolofmonkey

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah i'm not sure how the connections might affect the downclocking, someone else might have a better idea but 120hz is still a decent refresh rate if you are worried about the idle temps.
> 
> Glad to see you figured out the problem at any rate


Yeah, it didn't matter what connection, so I just set it to 120Hz on the desktop; temps are around 38c-42c depending on what I'm doing.
Thanks for pointing that out.


----------



## fateswarm

Is it sane to use CLP (Coollaboratory Liquid Pro) on a Tri-X 290?


----------



## pdasterly

Quote:


> Originally Posted by *kizwan*
> 
> A couple of things I can think of: a leak, or the water block mounted too tight.


I will pull it apart to inspect soon; I have no leaks.
Quote:


> Originally Posted by *sugarhell*
> 
> Change cables?


Works fine with the 2nd card, ruling out cables.


----------



## battleaxe

Just hit 17,137 on 3DMark11... this is pretty good, isn't it? http://www.3dmark.com/3dm11/8667823

I know there's better, but this seems better than average to me. Core was at 1165MHz, mem at stock, CPU at stock too.


----------



## Mega Man

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pdasterly*
> 
> is my card dead?
> 
> 
> 
> Change cables?

This.
Mine will do that with one of my cables.
Quote:


> Originally Posted by *schoolofmonkey*
> 
> So I set out to replace the TIM on this Tri-X, thinking that might have been the reason for the high idle temps (48c idle).
> What I found was completely dried-up paste that flaked off, and a scratch on the die. Replacing the TIM didn't lower idle temps, but my load temps are about 73-75c.
> So I guess the high idle temps are normal...
> 
> Here's the die:


One of mine came like that, but much worse; still works great though!


----------



## battleaxe

Quote:


> Originally Posted by *Mega Man*
> 
> This.
> Mine will do that with one of my cables.
> One of mine came like that, but much worse; still works great though!


Just an imperfection in the surface glass. If she works great don't worry about it.


----------



## pdasterly

Quote:


> Originally Posted by *Mega Man*
> 
> this
> mine will do that with one of my cables
> one of mine came like that, but much worse still works great though !


Any suggestions? I'm so mad I can't think clearly.


----------



## Gualichu04

3DMark score. I hope that's good for dual GPUs at 1150 core and 1300 memory.


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> Just hit 17,137 on 3dMark11... this is pretty good isn't it? http://www.3dmark.com/3dm11/8667823
> 
> I know there's better, but this seems better than average to me. Core was at 1165mhz. Mem at stock. CPU at stock too.


Looking good, battle. i get the same score at those clocks.
Quote:


> Originally Posted by *Gualichu04*
> 
> 3Dmark score I hope thats good for dual gpus at 1150 core and 1300 memory.


i am jelly. can't comment on the score except it looks normal.

btw, how's that crucial treating you? saw one for $220 today.


----------



## Gualichu04

Quote:


> Originally Posted by *rdr09*
> 
> Looking good, battle. i get the same score at those clocks.
> i am jelly. can't comment on the score except it looks normal.
> 
> btw, how's that crucial treating you? saw one for $220 today.


I paid $340 for mine, and a month after, the new MX100 came out. It's doing great. Loving the 15-second boot-up, and putting all the programs and games I use most on it.


----------



## rdr09

Quote:


> Originally Posted by *Gualichu04*
> 
> I paid 340 for mine and a month after the new mx100 came out. It is doing great. Loving the 15 sec boot up and putting all the programs and games i want on it that i use most.


+rep. thinking of using it for a laptop. give it life a bit. thanks.


----------



## Mega Man

Quote:


> Originally Posted by *pdasterly*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> this
> mine will do that with one of my cables
> one of mine came like that, but much worse still works great though !
> 
> 
> 
> any suggestions? im so mad i can't think clearly

Again, change cables; if it's still happening, change LCDs; if it still does it, you have to RMA.


----------



## Gualichu04

Quote:


> Originally Posted by *rdr09*
> 
> +rep. thinking of using it for a laptop. give it life a bit. thanks.


No problem, I hope you enjoy it. I had a 128GB Crucial, but it wasn't big enough.


----------



## pkrexer

Quote:


> Originally Posted by *Gualichu04*
> 
> 3Dmark score I hope thats good for dual gpus at 1150 core and 1300 memory.


Looks good, though I'd maybe expect a little more given your CPU. Here is my score with the same GPU clock although my memory is running at 1500 on both.

The latest leaked 14.8 drivers seem to give a nice boost in firestrike too.

http://www.3dmark.com/fs/2680726


----------



## drnilly007

Just looking for a quick reference: roughly what wattage does a 290 draw with a high overclock?


----------



## Red1776

Quote:


> Originally Posted by *drnilly007*
> 
> Just looking for some quick reference on wattage draw for a 290 with a high overclock?


Around 350W; could be a bit more if you're really pushing it.


----------



## schoolofmonkey

Hey guys.
I don't want to be a pain, but with the string of dud cards since January I don't want to be stuck with another; it's kinda made the OCD kick in.
Also, I haven't personally had an AMD card in 5 years...lol

This is the model I have:
SAPPHIRE TRI-X R9 290X
Base Clock 1040Mhz
Memory: 1300 MHz
http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&pid=2090&psn=&lid=1&leg=0

So I was just checking these temps. Are they OK for a 24c ambient, after 30 minutes of Watch_Dogs and a 10-pass run of the Metro Last Light benchmark maxed out?

http://s1294.photobucket.com/user/schoolofmonkey2/media/stockfan_zps191aee2d.gif.html


----------



## supermiguel

I have 3 MSI 290X Gaming edition cards with the new board, V308-014 v2.2. Does anyone make a full-cover water block for this card? Seems like rev 2.0 of the EK block doesn't work on them. Any ideas?

http://www.coolingconfigurator.com/step1_complist?gpu_gpus=1401

I don't want to use the universal blocks.


----------



## mus1mus

I'm getting about the same temps, albeit with a much higher ambient and a much more aggressive fan curve; it needs 80-85% fan speed to keep temps within 75c at 30c+ ambient.

Mine is a reference MSI 290.

And yes, I flashed the PCS BIOS, which increased my card's performance: 1040 on core and 1350 on memory. A throttle-free BIOS, it seems to me..

Quote:


> Originally Posted by *supermiguel*
> 
> I have 3 msi 290x gaming edition with the new board V308-014 v2.2.. does any one make a full water block for this card? seems like rev 2.0 of ek doesnt work on them.. Any ideas?
> 
> http://www.coolingconfigurator.com/step1_complist?gpu_gpus=1401
> 
> I dont want to use the universal blocks


I think @Red1776 is using the Gaming on his latest build, and he mentions water block compatibility on later MSI cards; it has to do with the taller caps, IIRC.

Check with him, or call MSI to verify.


----------



## supermiguel

Quote:


> Originally Posted by *mus1mus*
> 
> I'm getting about the same temps albeit with a much higher ambient and much aggressive Fan Curve. Needs 80-85% fan speed to keep temps within 75 on 30C + ambient.
> 
> Mine is a reference MSI 290 .
> 
> And yes, flashed PCS Bios that increased my cards performance. 1040 on Core and 1350 on Memory. Throttle free Bios it seems to me..
> 
> 
> 
> 
> 
> 
> 
> 
> I think @Red1776 is using the Gaming on his latest build and he mentions Waterblock compatibility on later MSI cards.. It has to do with the taller caps IIRC..
> 
> Check him out or Call MSI to verify.


Yes, version 1.0 had the taller-caps issue; EK released a 2.0 version of their water blocks and that works, but now they've released a new PCB where many things are moved around, so the 2.0 block doesn't work anymore.


----------



## mus1mus

Ouchie..

Hey, I just saw your build log! And I was like,

I bet you have already talked to AquaComputer about this as well.. Can you link or have a picture of the board?


----------



## supermiguel

Quote:


> Originally Posted by *mus1mus*
> 
> Ouchie..
> 
> Hey, I just saw your build log! And I was like,
> 
> I bet you have already talked to AquaComputer about this as well.. Can you link or have a picture of the board?


http://www.techpowerup.com/forums/threads/msi-radeon-r9-290-x-gaming-lightning-card-club.197912/page-14#post-3103363



Yup, AquaComputer doesn't support it either.. I'm pissed.

I contacted MSI to see if there was anything they could do to help... but I think these boards may end up on eBay =(


----------



## mus1mus

I think Alphacool.. But it is FUGLY..


----------



## maynard14

I hope you still have warranty, sir; if not:
Quote:


> Originally Posted by *fateswarm*
> 
> Is it sane to use CLP on tri-x 290?


For me it's a bit risky; I only use CLP on my 4770K die. If the cooler of your GPU is aluminum, sir, CLP will eat it.

Just use Gelid Extreme, sir; much safer, and it still performs much better than the stock TIM.


----------



## schoolofmonkey

Quote:


> Originally Posted by *mus1mus*
> 
> I'm getting about the same temps albeit with a much higher ambient and much aggressive Fan Curve. Needs 80-85% fan speed to keep temps within 75 on 30C + ambient.
> 
> Mine is a reference MSI 290 .


Thanks for replying.
So the stock fan curve is working as it should.
The most I've seen the VRMs hit is 85c, but that's well within spec, isn't it?

Honestly I don't get a lot of time for gaming; the longest stretch I've done is about 2 hours, with multiple game pauses in between (thanks to having 5 kids, 2 of them toddlers lol).
Waiting for GTA V; will hire a babysitter then..


----------



## th3illusiveman

Quote:


> Originally Posted by *schoolofmonkey*
> 
> Hey guys.
> I don't want to be a pain, with the string of dud cards since January I don't want to be stuck with another, kinda made the OCD kick in.
> Also haven't personally had a AMD card in 5 years...lol
> 
> This is the model I have:
> SAPPHIRE TRI-X R9 290X
> Base Clock 1040Mhz
> Memory: 1300 MHz
> http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&pid=2090&psn=&lid=1&leg=0
> 
> So I was just checking these temps, are they ok for 24c ambient, 30 minutes of Watch_Dogs and a 10 pass run of Metro Last Light Benchmark maxed out:
> 
> http://s1294.photobucket.com/user/schoolofmonkey2/media/stockfan_zps191aee2d.gif.html


I have a Tri-X and I ran Valley at stock (30 min) to compare with you. My ambient is about 23 degrees and temps are as follows:

Core: 76c
Fan: 65%
VRM1: 85
VRM2: 61
Average voltage of 1.120



if anything your card does a better job lol.


----------



## schoolofmonkey

Quote:


> Originally Posted by *th3illusiveman*
> 
> I have a Tri-X and i ran Valley stock (30min) to compare with you. My Ambient is about 23 degrees and temps are as follows
> 
> Core: 76c
> Fan: 65%
> VRM1: 85
> VRM2: 61
> Average voltage of 1.120
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> if anything your card does a better job lol.


I just did a 6x run of the Metro 2033 benchmark and got 78c GPU, 81c VRM, but it's gotten warmer inside (25c ambient).
The GTX 780 Ti Matrix I returned would hit 83c GPU and 100c VRM (I've had a string of bad Nvidia GTX 780 Tis since January, hence the switch to the red team).


----------



## mus1mus

Quote:


> Originally Posted by *schoolofmonkey*
> 
> Thanks for replying.
> So the stock lower fan curve is working as it should.
> The most I've seen the VRM's hit is 85c, but that's well within spec isn't it?
> 
> Honestly I don't get a lot of time for gaming, longest stretch I've done is about 2 hours, with multiple game pauses in between (thanks to having 5 kids with 2 toddlers lol).
> Waiting for GTA V, will hire a baby sitter then..


I haven't really had time to push my card yet, but from user reports VRMs should be kept within 80c, so yours could still take a slightly more aggressive fan curve.

Quote:


> Originally Posted by *supermiguel*
> 
> oh ****: http://www.alphacool.com/product_info.php/info/p1382_Alphacool-NexXxoS-GPX---ATI-R9-290X-und-290-M02---mit-Backplate---Schwarz-.html i think it works... at least the caps are in the same configuration..
> 
> I dont care if it is FUGLY !!!!kudos to you


http://www.xtremerigs.net/2014/08/11/r9-290x-gpu-waterblock-roundup/

http://www.xtremerigs.net/2014/08/11/alphacool-gpx-a290-review/


----------



## TMatzelle60

Looking to do a black and red themed build; who makes the best 290/290X? Also, I'll be using it in a CaseLabs S3 with a side window, so no direct airflow; only the front and top fans as intakes and a single 120 in the back as exhaust.


----------



## supermiguel

Quote:


> Originally Posted by *mus1mus*
> 
> http://www.xtremerigs.net/2014/08/11/r9-290x-gpu-waterblock-roundup/
> 
> http://www.xtremerigs.net/2014/08/11/alphacool-gpx-a290-review/


Ya, I saw the reviews; they suck, but what other choice do I have?

I guess either sell them, or use universal GPU blocks with passive heatsinks for the VRMs.


----------



## rv8000

Quote:


> Originally Posted by *TMatzelle60*
> 
> Looking to do a black and red theme build who has the best 290/290x. Also i will be using this in a Caselabs S3 with a side window so no direct airflow only front fans and top fans are intakes and single 120 in the back as exhaust


Options are the 290X Matrix, the DCII 290/290X, and the MSI Gaming 290/290X, and you could go with a Lightning 290X if you don't mind a splash of yellow. The Matrix and Lightning will have the best quality components; for cooling I'm not quite sure how the DCII and Gaming coolers do.

If you wanted to watercool, you could always pick up a cheap reference card and buy a black/red NZXT G10 bracket and a lower-end AIO too.


----------



## mus1mus

Quote:


> Originally Posted by *supermiguel*
> 
> ya i saw the reviews they suck, but what other choice do i have
> 
> 
> 
> 
> 
> 
> 
> i guess either sell them or use vga blocks, with passive heatsink for vrm


You can also use universal VRM blocks. I forgot who does them, but there's a kit that comes with a shapeable copper sheet that the actual block is screwed into.. Have to look for it.. Hang on.

Koolance:

MVR

Transfer plate


----------



## schoolofmonkey

Quote:


> Originally Posted by *mus1mus*
> 
> I haven't really have time myself to push my card yet. But reports from users VRMs should be kept within 80.. So yours could still go for a little more aggressive fan curve..


I have a Kraken G10 and a Kraken X60 here on the shelf.
Just nothing to cool the VRMs with, and stupid Sapphire didn't make a two-part cooler, so you can't use the stock VRM/VRAM heatsink...


----------



## bak3donh1gh

So I bought a G10 and a Zalman LQ-310, but it seems the 310 isn't big enough for the 290, since I'm maxing out at 94c. Any recommendations on a liquid cooler for the 290? I'm running an 800D that already has the top filled with a triple rad for my CPU (H320).


----------



## th3illusiveman

Quote:


> Originally Posted by *schoolofmonkey*
> 
> I have a Kraken G10 and a Kraken x60 here on the shelf.
> Just nothing to cool the VRM's with, and stupid Sapphire didn't make a 2 part cooler, so you can't use the stock vrm/vram heatsink...


I know the Tri-X has a reference board, but idk about the Vapor-X. If it's ref then you have many options.


----------



## schoolofmonkey

Quote:


> Originally Posted by *th3illusiveman*
> 
> i know the Tri-X has a reference board but idk about the vapor-X. If it's ref then you have many options.


I have the Tri-X not the Vapor X..lol

I usually don't go modding stuff for at least 2 weeks after purchase, just to make sure it doesn't die...lol


----------



## RocketAbyss

Quote:


> Originally Posted by *bak3donh1gh*
> 
> So I bought a g10 and a Zalman LQ-310. But it seems the 310 isnt big enough for the 290 since I am maxing my temps at 94 C. Any recommendations on a liquid cooler for the 290. Im running 800D that already has the top full with a triple rad for my Cpu(h320)


Use any one of the Asetek-based coolers, like the Corsair H55, H90, H110 I think? I'm using a Thermaltake Water Performer 3.0, which in essence is like an H55, and my max temps with 30c ambients are about 63c.

Either that, or you need to tighten the block further. On my first mount I was maxing 95c as well on my 290X; after a reapplication of TIM and redoing the screws, I never saw above 65c on the core.


----------



## bak3donh1gh

The Thermaltake doesn't seem like it would be much better than the LQ-310, since the Water Performer 3.0 is actually 1mm thinner on the radiator.

Also, does anyone know where I can get an adapter so I can run the pump and the fans off the GPU board connector?


----------



## RocketAbyss

Quote:


> Originally Posted by *bak3donh1gh*
> 
> The thermaltake doesnt seem like it would be much better than the lq-310. Since the thermaltake water performer 3.0 is actually 1mm thinner on the radiator.
> 
> Also does anyone know where I can get an adaptor so I can have the pump and the fans running off the gpu board connector?


Why would it matter if it was better? I'm still getting 63 to 65c max, and I'm overclocked.


----------



## Radmanhs

Hey, I was just wondering: if you wanted to do triple 1440p monitors, would you need 2 or 3 R9 290s to run games smoothly? Same with 290Xs, considering there isn't too big a difference between the two.


----------



## keikei

Quote:


> Originally Posted by *Radmanhs*
> 
> hey, I was just wondering, if you wanted to do triple 1440p monitors, would you need 2 or 3 r9 290's to run games smoothly? Same with 290x's, considering there isnt too big of a difference between the 2.


You can get away with 2 cards, but I would go with 3. With 3x 1440p you're pushing 11,059,200 pixels; 4K is 8,294,400 pixels, so the counts aren't far apart. I would also go 'X'; you'll need every ounce of GPU power.
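
Those pixel counts are easy to verify (assuming standard 2560x1440 panels and 3840x2160 UHD):

```python
# Pixel-count comparison for the multi-monitor question above.
def pixels(width: int, height: int, monitors: int = 1) -> int:
    """Total pixels pushed per frame across identical monitors."""
    return width * height * monitors

triple_1440p = pixels(2560, 1440, monitors=3)
uhd_4k = pixels(3840, 2160)

print(triple_1440p)           # 11059200
print(uhd_4k)                 # 8294400
print(triple_1440p / uhd_4k)  # ~1.33, a third more pixels than 4K
```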


----------



## bak3donh1gh

Quote:


> Originally Posted by *RocketAbyss*
> 
> Why would it matter if it was better? Im still getting 63 to 65c max and im overclocked.


Because the 310 is technically a larger rad, yet I'm getting worse temps than you. I have some spare fans, so maybe I should just add some more on?


----------



## Radmanhs

That's a lot of pixels 0.0, considering triple 1080p screens add up to 6,220,800. If you had the choice between a triple 27" 1080p setup and an equivalent 1440p setup on 2 290s, which would you go with? I know 27" is kinda stretching it, but I bet it would still look a lot better than my 40" TV XD

I was looking around a little and found an Asus 1080p monitor that looks perfect for Eyefinity, considering (as far as I can tell) its thin bezels.

http://www.amazon.com/dp/B00B17C5KO/?tag=pcpapi-20


----------



## keikei

Quote:


> Originally Posted by *Radmanhs*
> 
> Thats a lot of pixels 0.0, considering triple 1080 screens add up to 6220800. If you had a choice to go with a 27" 1080p triple screen display of an equivalent 1440 display with 2 290's, which would you go with? I know 27" is kinda stretching it, but I bet it would still look a lot better than my 40" tv XD
> 
> I was looking around a little, I found an asus 1080p monitor that looks perfect for eyefinity considering (as far as I can tell) its thin bezels.
> http://www.amazon.com/dp/B00B17C5KO/?tag=pcpapi-20


Eyefinity is a nice setup. 1080p x 3 would be easier to push on 2 cards.


----------



## Sgt Bilko

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *schoolofmonkey*
> 
> Thanks for replying.
> So the stock lower fan curve is working as it should.
> The most I've seen the VRM's hit is 85c, but that's well within spec isn't it?
> 
> Honestly I don't get a lot of time for gaming, longest stretch I've done is about 2 hours, with multiple game pauses in between (thanks to having 5 kids with 2 toddlers lol).
> Waiting for GTA V, will hire a baby sitter then..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I haven't really have time myself to push my card yet. But reports from users VRMs should be kept within 80.. So yours could still go for a little more aggressive fan curve..
> 
> Quote:
> 
> 
> 
> Originally Posted by *supermiguel*
> 
> oh ****: http://www.alphacool.com/product_info.php/info/p1382_Alphacool-NexXxoS-GPX---ATI-R9-290X-und-290-M02---mit-Backplate---Schwarz-.html i think it works... at least the caps are in the same configuration..
> 
> I dont care if it is FUGLY !!!!kudos to you
> 
> 
> http://www.xtremerigs.net/2014/08/11/r9-290x-gpu-waterblock-roundup/
> 
> http://www.xtremerigs.net/2014/08/11/alphacool-gpx-a290-review/

VRM temps are optimal under 80c, but it's fine if they go up to 95c.

Bear in mind the reference cooler does a damn fine job of keeping VRM temps in line.


----------



## schoolofmonkey

Quote:


> Originally Posted by *Sgt Bilko*
> 
> VRM temps are optimal under 80c, but it's fine if they go up to 95c.
> 
> Bear in mind the reference cooler does a damn fine job of keeping VRM temps in line.


Thanks for that.
On the stock fan profile I haven't seen the VRMs go over 85c, but that's the max; they range between 81c and 85c.
Increasing the fan profile a little keeps them around 76c-78c, but all those temps depend on ambient.
26c inside at the moment.

The VRMs idle a lot cooler than the GPU though; the GPU idles from 37c-40c at 120Hz, while running 144Hz (the monitor's native res) it idles at 48c because the RAM isn't downclocking like the core.

Maybe if I grab some of the GELID Solutions CL-R9290-01-A R9 290 Series VRM cooling and stick the Kraken X60 on the card, things might be better..

This is something else I found interesting: the temps go up depending on the refresh rate (I have a 144Hz monitor).
Ignore the fan %; it was actually 55%, GPU-Z remembered the setting from when I manually set the fan speed to 100%.
I ran the same scene in Risen 3 for 15 minutes each time.

120hz
http://s1294.photobucket.com/user/schoolofmonkey2/media/120hz_zps5cff108b.gif.html

144hz
http://s1294.photobucket.com/user/schoolofmonkey2/media/144hz_zps99b0ec42.gif.html


----------



## fateswarm

What happens to the die and/or the cooler when you use Coollaboratory Liquid Pro on the Tri-X R9 290?

Can it be used on the VRM MOSFETs?


----------



## ozzy1925

Quote:


> Originally Posted by *fateswarm*
> 
> What happens to the die and/or the cooler when you use Coollaboratory Liquid Pro on the Tri-X R9 290?
> 
> Can it be used on the VRM's mosfets?


i don't think it's a good idea, or worth the ~5C, to use CLU or CLP on the VRMs. Even on the die you need to be very careful, because it's electrically conductive and it eats aluminum.


----------



## fateswarm

Quote:


> Originally Posted by *ozzy1925*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fateswarm*
> 
> What happens to the die and/or the cooler when you use Coollaboratory Liquid Pro on the Tri-X R9 290?
> 
> Can it be used on the VRM's mosfets?
> 
> 
> 
> i dont think its a good idea and worth 5C to use clu or clp on vrm even on the die you need to be very careful because its electrically conductive and it eats aluminum

Is there aluminum?


----------



## Imprezzion

Not directly, no. The die is nickel plated afaik.

It's safe to use on GPUs as long as you are VERY careful applying it. Spilling ANY bit over the edge of the die can be quite a problem, as the surrounding area is mostly exposed.








I've used it on direct-die / delidded CPUs and GPUs and never ran into any problems.

VRMs are just too dangerous, and the distance it has to bridge is too far. There's a pad there for a reason.









Buy some Fujipoly Ultra Extreme thermal pads if you want better VRM cooling performance.


----------



## fateswarm

Quote:


> Originally Posted by *Imprezzion*
> 
> Not directly no. The die is nickel plated afaik.
> 
> It's safe to use on GPU's as long as you are VERY careful applying it. Spilling ANY bit over the edge of the die can be quite a problem as it's mostly exposed
> 
> 
> 
> 
> 
> 
> 
> 
> I used it on direct-die / delidded CPU's and GPU's and never ran into any problems.
> 
> VRM's are just too dangerous and the distance it has to bridge is too far. There's a pad there for a reason
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Buy some Fujipoly Ultra Extreme thermal pads if you want better VRM cooling performance.


Thanks. Yeah, I was very nervous about it when I delidded my CPU (it has capacitors or resistors right next to the die). If I do it, I'm thinking of temporarily covering everything else with electrical tape or some other cover first.


----------



## ozzy1925

Quote:


> Originally Posted by *fateswarm*
> 
> Is there aluminum?


No, unless you buy a heatsink made of aluminum.


----------



## hwoverclkd

Quote:


> Originally Posted by *Radmanhs*
> 
> hey, I was just wondering, if you wanted to do triple 1440p monitors, would you need 2 or 3 r9 290's to run games smoothly? Same with 290x's, considering there isnt too big of a difference between the 2.


Depends on the games you play, but both the 290 and 290X might still struggle to give smooth gameplay across 3 x 1440p. Remember you won't get 3x performance, and you might be hitting the 4GB limit often. Still, it all depends on the games you play and how low a setting you're willing to go.

I'd go with triple 1080p


----------



## Newbie2009

Quote:


> Originally Posted by *Radmanhs*
> 
> hey, I was just wondering, if you wanted to do triple 1440p monitors, would you need 2 or 3 r9 290's to run games smoothly? Same with 290x's, considering there isnt too big of a difference between the 2.


With 3 cards you would really need to go water cooling; that's a serious amount of heat on air. The PSU is also something to consider. I've considered tri-fire many times, but never gone there.


----------



## tsm106

Quote:


> Originally Posted by *Imprezzion*
> 
> Not directly no. *The die is nickel plated afaik.*
> 
> It's safe to use on GPU's as long as you are VERY careful applying it. Spilling ANY bit over the edge of the die can be quite a problem as it's mostly exposed
> 
> 
> 
> 
> 
> 
> 
> 
> I used it on direct-die / delidded CPU's and GPU's and never ran into any problems.
> 
> VRM's are just too dangerous and the distance it has to bridge is too far. There's a pad there for a reason
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Buy some Fujipoly Ultra Extreme thermal pads if you want better VRM cooling performance.


Dude, seriously. Why do you think it's called the die? Die = uncovered, i.e. no heatspreader. If there were a heatspreader, that would be nickel plated.


----------



## sugarhell

Quote:


> Originally Posted by *tsm106*
> 
> Dude, seriously. Why do you think it's called the die? Die = uncovered, ie. no heatspreader. If there was a heatspreader that would be nickel plated.


lol


----------



## nightfox

Quote:


> Originally Posted by *Radmanhs*
> 
> hey, I was just wondering, if you wanted to do triple 1440p monitors, would you need 2 or 3 r9 290's to run games smoothly? Same with 290x's, considering there isnt too big of a difference between the 2.


ask this guy:

p33k http://www.overclock.net/u/255223/p33k

He's got Eyefinity 1440p and 2x 290. You can ask him about performance, but I'm sure it should be sufficient with today's games.







(assuming you water cool and OC them)


----------



## Jflisk

Quote:


> Originally Posted by *fateswarm*
> 
> What happens to the die and/or the cooler when you use Coollaboratory Liquid Pro on the Tri-X R9 290?
> 
> Can it be used on the VRM's mosfets?


You don't want to use it on the VRMs. The die, on the other hand... I hear it's good stuff but hard to work with. I've never tried it myself, but others have, and they've posted guides to using it.


----------



## chronicfx

I used to think Afterburner was broken because the GPU usage would bounce so badly. I've just been playing about an hour of Crysis 3 at Very High settings with 2xSMAA, averaging 90-100 FPS at 1440p. This is the first game test of my 4790K overclock (4.7/4.5 uncore seems stable enough for Crysis 3, with good temps ranging 50-55C), and all three 290Xs are being pushed to 100% GPU usage; not the whole time, but a lot of the time all three are pinned. My old usage graphs with my i5 looked like a two-year-old scribbling on a page. So happy I made this upgrade: butter-smooth play, and I didn't notice any of the PLX chip latency nonsense. I think this setup will conquer The Witcher 3! (Unless it has suberduberubersampling)


----------



## Imprezzion

Quote:


> Originally Posted by *tsm106*
> 
> Dude, seriously. Why do you think it's called the die? Die = uncovered, ie. no heatspreader. If there was a heatspreader that would be nickel plated.


Dude, seriously. You think the die itself isn't covered by some form of metal?


----------



## tsm106

Quote:


> Originally Posted by *Imprezzion*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Dude, seriously. Why do you think it's called the die? Die = uncovered, ie. no heatspreader. If there was a heatspreader that would be nickel plated.
> 
> 
> 
> Dude, seriously. You think the die itself isn't covered by some form of metal?

That's not even an AMD GPU.









A few posts back someone else posted a pic of their nicked GPU die. What should that tell you? Metal chips and flakes off, right?
Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *schoolofmonkey*
> 
> So I set out to replace the TIM on this Tri-X, thinking that may have been the reason for the high idle temps (48c idle).
> What I found what completely dried up paste that flaked off, and a scratch on the die. Replacing the TIM didn't lower idle temps, but my load temps are about 73-75c.
> So I guess that's normal for high idle temps...
> 
> Here's the die:
> 
> 
> 
> 
> one of mine came like that, but much worse still works great though !


----------



## battleaxe

Quote:


> Originally Posted by *tsm106*
> 
> That's not even an amd gpu.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> A few posts back someone else posted a pic of their nicked gpu die. That should tell you what? Metal chips and flakes off right?


I believe the die is silicon, the kind grown in a lab. I could be wrong though.


----------



## hwoverclkd

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> That's not even an amd gpu.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> A few posts back someone else posted a pic of their nicked gpu die. That should tell you what? Metal chips and flakes off right?
> 
> 
> 
> I believe the die is silicon. The kind of silicon being the kind grown in a lab. I could be wrong though.

perhaps he meant the molding compound on top where they laser-etch the text/logo?


----------



## battleaxe

Quote:


> Originally Posted by *acupalypse*
> 
> perhaps he meant the molding compound on top where they laser-etch the text/logo?


Typically they're covered with a very thin layer of glass laid over the silicon. Someone correct me if I'm wrong. Maybe some of the older ones used metal? I suppose it's possible.


----------



## jagdtigger

Greetings.

I solved the 3-monitor issue, but now my card (Sapphire R9 290X) is stuck at PCIe 3.0 x4, even though both my card and motherboard (MSI Z87-G55) are PCIe 3.0 x16. I noticed it before going to work, so I didn't have time to try anything. I hope it's not a hardware problem... Does anyone have any idea what went wrong?

Thanks in advance for the help.


----------



## bak3donh1gh

Can anyone tell me what temps I should be getting with a G10 and a Zalman LQ-310? I'm getting 94C; should I just stick another fan on and run them at max?


----------



## davidm71

I noticed that my 290X has two different BIOSes, depending on which way the switch on the card is set. Does that mean if you have to flash a new BIOS, you have to flash the card twice? I also noticed that according to GPU-Z the device name for each setting is slightly different: one ends in '102' and the other in '103', depending on the switch position on the Asus 290X OC DirectCU II. Can someone confirm this on their card for comparison?

Thanks.


----------



## chronicfx

Omg.. Why have I never played grid auto-sport before now. This is addiction.


----------



## rdr09

Quote:


> Originally Posted by *bak3donh1gh*
> 
> Can anyone tell me what temps I should be getting with a g10 and a zalman lq-310? Im gettting 94C should I just stick another fan and run them on max?


something ain't right. not sure how that AIO performs, but this one cools the core much like custom water cooling: below 60C under load, i believe.

Quote:


> Originally Posted by *davidm71*
> 
> I noticed that my 290X has two different bios's depending on which way a switch is set on the card. Does that mean if you have to flash a new bios you have to flash the card twice? Also noticed according to GPUZ the device name for each setting is slightly different. One ends in '102' and the other in '103' depending on the switch of the Asus 290X OC Directcu II. Can someone confirm this on their card please for comparison?
> 
> Thanks.


just flash one. i think you should not mess with the one closer to the power connectors; it's a backup.


----------



## kizwan

Quote:


> Originally Posted by *jagdtigger*
> 
> Greetings.
> 
> I solved the 3 monitor issue but now my card(sapphire r9 290x) stuck at pcie 3 x4, but both my card and mb(msi z87 g55) is pcie3 x16. I noticed before going to work so i dont had time to try anything. I hope its not hardware problem... Have someone any idea what gone wrong?
> 
> Thans in advance for the help.


If it's still stuck at PCIe 3.0 x4, try re-seating the GPU.


----------



## hwoverclkd

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *acupalypse*
> 
> perhaps he meant the molding compound on top where they laser-etch the text/logo?
> 
> 
> 
> Typically they are covered with a very thin layer of glass. Laid over the silicon. Someone correct me if I'm wrong. Maybe some of the older ones used metal? I suppose its possible.

Not sure if glass was ever used, but in the die attach/encapsulation process a non-electrically-conductive resin is typically used. If it were a metal, then it would be a sort of heat spreader put on top of the resin.


----------



## bak3donh1gh

Quote:


> Originally Posted by *rdr09*
> 
> something ain't right. not sure how that aio performs but this cools the core much like custom cooling. below 60C, i believe, under load.
> .


ok, i'll add another fan and run them both off adapters so they run at max.


----------



## kizwan

Quote:


> Originally Posted by *davidm71*
> 
> I noticed that my 290X has two different bios's depending on which way a switch is set on the card. Does that mean if you have to flash a new bios you have to flash the card twice? Also noticed according to GPUZ the device name for each setting is slightly different. One ends in '102' and the other in '103' depending on the switch of the Asus 290X OC Directcu II. Can someone confirm this on their card please for comparison?
> 
> Thanks.


Both the 290 & 290X have dual BIOS. The 290X has Uber & Quiet (mode) BIOSes; usually the difference is the fan profile. You just need to flash one, no need to flash both. The other BIOS can also act as a backup if you screw up when flashing.


----------



## davidm71

That's good to know. I flashed it the first day, but recently I tried to flash it again using the Asus flash utility and it refused, saying it can't find the ROM file. Can these cards only be flashed once? Also, how would the flash utility know whether to flash performance or quiet mode?


----------



## bak3donh1gh

They can be flashed more than once; this I know for sure, since I flashed once to try a higher-performance BIOS and again to fix the BIOS that Powercolor shipped with my card. Perhaps you had the wrong filename?

Performance or quiet depends on which position the BIOS switch on the card is in; it has nothing to do with the flash utility.


----------



## mus1mus

The die, in the olden days, was just bare; you could slap a heatsink on and not worry about electrical conduction.

Not sure what coating they use on top of the actual nano-circuits, but don't forget the one thing that makes silicon (and the rest of the semiconductor elements) unique: they're semiconductors.

Pure silicon will not conduct electrical current; the conduction is the effect of adding dopants. So if the outermost layer of the die is pure silicon wafer, you are safe.

I think the reason most of the guys who have used CLP advise against the TIM for direct-die application is that they scratch the die. That's the issue: we don't know how thick the non-conducting layer of the die is, so the danger is there, as is CLP overflowing onto the circuit traces surrounding the actual die.

But no, die-to-heatsink with CLP is not much of an issue; they wouldn't be using a copper lid if it were that serious. CLP on the traces and surrounding components is.


----------



## bak3donh1gh

Update on my Powercolor 290 + g10 + lq-310.

So I ditched the fan the LQ-310 came with and pulled out two old Cooler Master A12025s from back when I thought LEDs would be a good idea; they're blowing the air out of the case. Also, as another user in this thread suggested, I grabbed the sticky thermal tape from the stock cooler and put it between the VRM and the backplate Powercolor includes with the card (which also has VRM coolers already installed). I've got the pump and both fans running full speed through Molex adapters. I might buy a fan controller, or put the fans on the controller included with my H320.

Temps in-game while overclocked (+13mV, 1150/1500) max out at 68C, with VRM1 at 65C and VRM2 only going from 59C to 65C.


----------



## davidm71

Quote:


> Originally Posted by *bak3donh1gh*
> 
> They can be flashed more than once this I know for sure since I flashed once to try a higher performance bios and again to fix the bios that Powercolor shipped with my card. Perhaps you had the wrong name?
> 
> Performance or quiet depends on which bios switch you have the card on, it has nothing to do with the flash utility.


Perhaps, but the device name GPU-Z identifies doesn't match any of the ROM files I unpacked with WinRAR from the Asus flash utility executable that Asus posted for my card. Powercolor may be different, but I believe the utility changed my device ID name after flashing. That's why I'd appreciate it if anyone with a virgin, unflashed card like mine could post their device ID name (the string GPU-Z shows in the driver version field). Then run the flash utility and compare device names afterward. Thanks.


----------



## heroxoot

Quote:


> Originally Posted by *davidm71*
> 
> That's good to know. I tried flashing the first day but recently I tried to flash it again using the Asus flash utility and it refused saying can't find Rom file. Can these cards be flashed only once? Also how would the flash utility know wether to flash for performance vs quiet mode?


if it says it cannot find the ROM, it usually means the name or path of the ROM file you're trying to flash is wrong. I suggest using the ATIFLASH utility and doing it in DOS. Safest, most effective way.
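
For anyone who hasn't done a DOS flash before, a typical ATIFLASH session looks roughly like this. Flag spellings vary a bit between ATIFLASH versions, and `new.rom` / adapter index `0` are just placeholders, so run `atiflash` with no arguments first and check its own help text:

```
REM boot a DOS USB stick that carries atiflash.exe and your ROM file
atiflash -i               REM list adapters; note your card's index (0 below)
atiflash -s 0 backup.rom  REM always save the current BIOS before anything else
atiflash -p 0 new.rom     REM program new.rom to adapter 0, then reboot
REM some versions accept -f to force past a device/SSID mismatch - use with care
```

Saving `backup.rom` first means that even a bad flash can be undone from the second BIOS position.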


----------



## davidm71

Quote:


> Originally Posted by *heroxoot*
> 
> if it says it cannot find the rom it usually means the rom you are trying to flash on. I suggest using the ATIFLASH utility and doing it in DOS. Safest most effective way.


I would if I had to. First I need to verify I have the latest one installed, and then wait for Asus to issue a new one.


----------



## hwoverclkd

Quote:


> Originally Posted by *davidm71*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> if it says it cannot find the rom it usually means the rom you are trying to flash on. I suggest using the ATIFLASH utility and doing it in DOS. Safest most effective way.
> 
> 
> 
> I would if I had to. Have to verify I have the latest one installed and then wait for Asus to issue a new one.

just in case you need to get back to your original BIOS, see if you can find it on TechPowerUp:

http://www.techpowerup.com/vgabios/


----------



## Mega Man

Quote:


> Originally Posted by *fateswarm*
> 
> What happens to the die and/or the cooler when you use Coollaboratory Liquid Pro on the Tri-X R9 290?
> 
> Can it be used on the VRM's mosfets?


no on the VRMs, very very very bad idea

Quote:


> Originally Posted by *fateswarm*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ozzy1925*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fateswarm*
> 
> What happens to the die and/or the cooler when you use Coollaboratory Liquid Pro on the Tri-X R9 290?
> 
> Can it be used on the VRM's mosfets?
> 
> 
> 
> i dont think its a good idea and worth 5C to use clu or clp on vrm even on the die you need to be very careful because its electrically conductive and it eats aluminum
> 
> 
> Is there aluminum?

no
Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Imprezzion*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Dude, seriously. Why do you think it's called the die? Die = uncovered, ie. no heatspreader. If there was a heatspreader that would be nickel plated.
> 
> 
> 
> Dude, seriously. You think the die itself isn't covered by some form of metal?
> 
> 
> 
> 
> That's not even an amd gpu.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> A few posts back someone else posted a pic of their nicked gpu die. That should tell you what? Metal chips and flakes off right?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *schoolofmonkey*
> 
> So I set out to replace the TIM on this Tri-X, thinking that may have been the reason for the high idle temps (48c idle).
> What I found what completely dried up paste that flaked off, and a scratch on the die. Replacing the TIM didn't lower idle temps, but my load temps are about 73-75c.
> So I guess that's normal for high idle temps...
> 
> Here's the die:
> 
> 
> 
> one of mine came like that, but much worse still works great though !
> 

correct me if i am wrong, but isn't that an Intel C2D?
Quote:


> Originally Posted by *jagdtigger*
> 
> Greetings.
> 
> I solved the 3 monitor issue but now my card(sapphire r9 290x) stuck at pcie 3 x4, but both my card and mb(msi z87 g55) is pcie3 x16. I noticed before going to work so i dont had time to try anything. I hope its not hardware problem... Have someone any idea what gone wrong?
> 
> Thans in advance for the help.


i would try reseating it in the PCIe slot (remove the GPU, then physically reinstall it in the mobo)
Quote:


> Originally Posted by *mus1mus*
> 
> The Die, in the olden days, are just bare where you could just slap a heatsink on and not worry about electrical conduction.
> 
> Not sure how or what coating they used on top of the actual nano-circuits but don't forget the one thing that makes Silicon (and the rest of the semiconductor elements) unique. They are all Semiconductors.
> 
> Pure silicon will not conduct electrical current. The conduction property is the effect of adding Dopants (So if the outermost layer of the die is of pure silicon wafer, you are safe.
> 
> I think, the reason that most of the guys who have used CLP advise not to use the TIM on Direct-Die application is that they scratch the Die. There's the issue. We don't know how thick is the non-conducting layer of the Die is. So the danger is there. As well as CLP overflowing to circuit traces surrounding the actual Die.
> 
> But no, Die to Heatsink using CLP is not much an issue. They wouldn't be using a copper Lid if it is that serious. CLP to traces and surrounding component is.


the top layer is pretty deep. some time i'll post pics of mine, but my sd card is somewhere.... hehe

the 2 on my die (open box from either sapphire or power color, idr which, wanna say sapphire) make the one i quoted look puny


----------



## jagdtigger

@kizwan, Mega Man
Thanks for the tip







I did that and it's better, but still only x8 instead of x16. I will try a hard BIOS reset this afternoon; i have a hunch that the IGP messed things up in there (i enabled it accidentally while trying to solve the "third monitor problem", but forgot about it until now -.-' )...


----------



## Arizonian

Quote:


> Originally Posted by *redklu3*
> 
> hello all! would like to join this club with my XFX r9 290x DD.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Its tuned up with kraken G10 (moded), thermaltake water 2.0, noctua fans, corsair dominator fins (moded), xspc back plate (moded). though still runs HOT
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added
















Quote:


> Originally Posted by *BranField*
> 
> Soo happy right now
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> EK sapphire 290x waterblock
> 
> 
> facebook picture


Congrats - updated :









Quote:


> Originally Posted by *jagdtigger*
> 
> Greetings.
> 
> I bought this card(Sapphire R9 290X) in January and it has a lot of power. But the reference cooling was too rowdy so i replaced it with a Koolance VID-AR290X Water Block:
> http://www.techpowerup.com/gpuz/details.php?id=6wcuf
> 
> And i run into a little problem just recently. I used two monitors without problems. One in DVI-D, and one on HDMI. But when i try to connect a third monitor(a TV to be precize) with a DVI->HDMI adapter i get BSOD... I tested the adapter on both DVI ports with all the monitors and its working without problems. Has someone any idea whats going on?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> /EDIT
> I just got curious and fired up kubuntu to test out if its software related and bingo. Kubuntu boots up without kernel panic with three screens. Downloaded 14,7RC and after install i can use both DVI-D ports
> 
> 
> 
> 
> 
> 
> 
> .


Congrats - added


----------



## kizwan

Quote:


> Originally Posted by *jagdtigger*
> 
> @kizwan, Mega Man
> Thanks for the tip
> 
> 
> 
> 
> 
> 
> 
> . I done that and its better, but still only x8 instead of x16. I will try a hard bios reset afternoon, i have a hunch that the IGP messed up things in there(i enabled it accidentally while tried to solve the "third monitor problem", but forget about it until now -.-' )...


Try re-seating it again. A couple of people, including me, experienced a similar issue (in the Asus Rampage X79 thread, if I remember correctly, and it involved different GPUs). I don't think resetting the BIOS will help (but it's worth a try anyway). It's as if the PCIe link between the slot and the card is somehow electrically incomplete. Re-seating the GPU should fix it.


----------



## Mega Man

blow out the slot, check for debris etc.


----------



## chronicfx

Nintendo


----------



## Mega Man

i was thinking the same thing, except:
1. the younger gen may not know what we were talking about, and
2. blowing never really fixed the problem back then anyway; it was the multiple re-inserts that did. whereas here it could actually help


----------



## ebhsimon

What's the consensus on the Asus DCU2 cooler now? Did they fix the heatsink problem where only 3 of the 5 heatpipes touch the GPU? I'm thinking of selling my Vapor-X 290 to replace it with a DCU2 version, because the mismatched colour scheme going on in my build is driving me nuts.


----------



## Imprezzion

That's fixed, yeah. GPU temps are very good, but the VRM cooling is still very bad on the DCU2s.
Don't expect any proper overvolting with the stock VRM cooling on it.

If you want a matching colored card, get the MSI Gaming. Overall a better card, and usually cheaper as well.


----------



## Raul-7

Finally got her under water.


----------



## Raul-7

Here's a comparison of the temps, air vs water. Both after 2 runs of Valley.



With the water, I overclocked it 1100 core and gave it a 50mV bump.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ebhsimon*
> 
> What's the consensus on the Asus DCU2 cooler now? Did they fix the heatsink problem of only 3 of the 5 heatpipes touching the GPU? Thinking of selling my Vapor-X 290 to replace it with a DCU2 version because the mismatched colour scheme going on in my build is driving my nuts.


afaik it's still the same as when they launched it, unfortunately.


----------



## HOMECINEMA-PC

HOMECINEMA-PC [email protected]@2388 [email protected]@1400 *P19567* tess off of course, and a single-card PB










http://www.3dmark.com/3dm11/8678401


----------



## hotrod717

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> HOMECINEMA-PC [email protected]@2388 [email protected]@1400 *P19567* tess off of cause and single card PB
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8678401


Nice Run! That one of the cards you've had or a new addition to the family?


----------



## rdr09

Quote:


> Originally Posted by *hotrod717*
> 
> Nice Run! That one of the cards you've had or a new addition to the family?


i saw he beat that already, hot. good job, Homes.


----------



## Spectre-

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> HOMECINEMA-PC [email protected]@2388 [email protected]@1400 *P19567* tess off of cause and single card PB
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8678401


he wouldn't have bothered with the 290X till i challenged him on hwbot


----------



## ebhsimon

Quote:


> Originally Posted by *Sgt Bilko*
> 
> afaik it's still the same as when they launched it unfortunately.


Quote:


> Originally Posted by *Imprezzion*
> 
> That's fixed yeah, GPU temps are very good, but the VRM cooling is still very bad on the DCU2's.
> Don't expect any proper overvolting with the stock VRM cooling on it.
> 
> If youwant a matching colored card, take the MSI Gaming. Overall a beter card and usually cheaper as well.


Uh.. I'm a little conflicted.

Anyway, why the MSI Gaming? I've read that it has noisier fans and higher temps. Unless the VRM temps are way better? I might get a 780 instead for lower temps and comparable fps, since 780 prices are plummeting in Australia at the moment.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ebhsimon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> afaik it's still the same as when they launched it unfortunately.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Imprezzion*
> 
> That's fixed yeah, GPU temps are very good, but the VRM cooling is still very bad on the DCU2's.
> Don't expect any proper overvolting with the stock VRM cooling on it.
> 
> If youwant a matching colored card, take the MSI Gaming. Overall a beter card and usually cheaper as well.
> 
> 
> Uh.. I'm a little conflicted.
> 
> Anyway, why the MSI Gaming? I've read that it has a noisier fan and has higher temps. Unless VRM temps are way better? I might get a 780 for lower temps and comparable fps since the price of 780s is plummeting in Australia at the moment.

Well, if you're talking about the heatpipe thing, then as before i think they're still the same; you'd need someone with a DCU II to chime in.

And 780 prices are going down? Here?

last i saw they still had a higher price tag than R9 290s.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *hotrod717*
> 
> Nice Run! That one of the cards you've had or a new addition to the family?


Thanks man, ive had this one since May......... it's an unlocked XFX 290








Starting to work out the DRAM better. These are downclocked CL11 2666 Dom Plats @ CL9 [email protected]









Quote:


> Originally Posted by *rdr09*
> 
> i saw he beat that already, hot. good job, Homes.


Thanks man








Now im having driver issues and my DisplayPort is unstable, so that's all for now









Quote:


> Originally Posted by *Spectre-*
> 
> he wouldnt have bothered with the 290X till i challenged him on hwbot


Yep motivation








But ive been on DC, as you know, messing with DRAM. So i applied a bit of what I learned with those sticks here








Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well if you are talking about the heatpipe thing then as before i think they are still the same, You'd need someone with a DCU II to chime in.
> 
> And 780's prices are going down? Here?
> 
> last i seen they were still higher had a higher price tag than R9 290's.


X99 and 780 Ti prices are mind-boggling here


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Well if you are talking about the heatpipe thing then as before i think they are still the same, You'd need someone with a DCU II to chime in.
> 
> And 780's prices are going down? Here?
> 
> last i seen they were still higher had a higher price tag than R9 290's.
> 
> 
> 
> X99 and 780ti's $$$ are mind boggling here

What isn't expensive here, though?

just saw the 780Ti Kingpin back in stock for $1100









Well, the 295X2 is getting a price cut now; do you think you might have a play around with one of them?


----------



## ebhsimon

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well if you are talking about the heatpipe thing then as before i think they are still the same, You'd need someone with a DCU II to chime in.
> 
> And 780's prices are going down? Here?
> 
> last i seen they were still higher had a higher price tag than R9 290's.


Where do you normally buy your stuff?
I normally get mine from MSY and PCCG, and looking at MSY's 780s, they're a LOT cheaper than the same 780s at PCCG. 780 Ti prices are still high though.


----------



## Spectre-

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> X99 and 780ti's $$$ are mind boggling here


X99 mobos are similarly priced to X79, DDR4 is SUPER EXPENSIVE, and the 5960X is $1200

if i sell my 3930K, ram and mobo

i'll get $750-850 if i am lucky

and that's covering only a third of the stuff

also, the cheapest 780 Ti is $700 and the R9 290 is $400


----------



## ebhsimon

Quote:


> Originally Posted by *Spectre-*
> 
> X99 mobos are similarly priced to X79 , DDR4 is SUPER EXPENSIVE and the 5960x is $1200
> 
> if i sell my 3930K and ram and mobo
> 
> ill get $750-850 if i am lucky
> 
> and thats covering only a third of the stuff
> 
> also the cheapest 780ti is $700 and the R9 290 is $400


We can't even get 290s for $400 here... Cheapest I've seen (just checked) is $440. But you can't really compare 780tis to 290s... because a 780ti is quite a bit faster than a 290. I mean the price to performance ratio for the 290 is super great, but when you need that little extra bit (e.g. 1440p 120hz gaming) the 780ti is the only thing that'll work in some rare scenarios.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ebhsimon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Well if you are talking about the heatpipe thing then as before i think they are still the same, You'd need someone with a DCU II to chime in.
> 
> And 780's prices are going down? Here?
> 
> Last I saw they still had a higher price tag than R9 290's.
> 
> 
> 
> Where do you normally buy your stuff?
> I normally get my stuff from MSY and PCCG. And looking at MSY's 780s, it's a LOT cheaper than PCCG's same 780s. 780ti prices still high though.

PCCG for most hardware and Mwave for the smaller things lately.

that's good to see that 780's are getting cheaper but i'd still prefer my 290's over them


----------



## Spectre-

Quote:


> Originally Posted by *ebhsimon*
> 
> We can't even get 290s for $400 here... Cheapest I've (just checked) seen is $440. But you can't really compare 780tis to 290s... because a 780ti is quite a bit faster than a 290. I mean the price to performance ratio for the 290 is super great, but when you need that little extra bit (Eg. 1440p 120hz gaming) the 780ti is the only thing that'll work in some rare scenarios.


MSY Sydney is getting rid of ref. models - $400 for R9 290 and $469 for 290X

also if your budget is $700 (same price as the cheapest 780ti)

grab a 290(or 290X)

spend $120 for a g10+h55 combo ( cheapest )

and you are looking at a liquid cooled R9 290 for $550ish

now if it's like my old R9 290 (that died)

it will overclock to 1200/1400 on +80mV (40% power)

i am pretty sure you are on par with 780ti perf.

and you still have $150ish to spend on other stuff


----------



## ebhsimon

Quote:


> Originally Posted by *Spectre-*
> 
> Msy sydney is getting rid of ref. models - $400 for r9 290 and $469 for 290X
> 
> also if your budget is $700 (same price as the cheapest 780ti)
> 
> grab a 290(or 290X)
> 
> spend $120 for a g10+h55 combo ( cheapest )
> 
> and you are looking at liquid cooled R9 290 for $ 550ish dollars
> 
> now if its like my old R9 290 (that died)
> 
> it will overclock to 1200/1400 on +80mV (40% power)
> 
> i am pretty sure you are on par with 780ti perf.
> 
> and you still have $150ish dollars to spend on other stuff


I guess that's an option, but unfortunately Melbourne MSY doesn't have any reference 290s for sale







They do have a 290x ref for $470 though which isn't bad. I could chuck a g10 and watercooler on that.
However my budget isn't $700, since I want to replace my 290 with a 780. I'm looking at the Gigabyte 780 windforce which is ~$530. A much cheaper alternative.

If I get a 290x though, I could add a g10 and an X41 cooler which would make it ~$630. I guess it's not too bad, I could always go that route. Side note; I'd need to get the X41 because of the extended tubing to put the rad as a front intake as every other spot has no space (I already have a 280mm rad up top which blocks room for the rear exhaust).

By the way, 1200/1400 on +80mV and 40% power is pretty rare. That's an extremely good chip, sad that it's dead now.


----------



## Imprezzion

Quote:


> Originally Posted by *Spectre-*
> 
> Msy sydney is getting rid of ref. models - $400 for r9 290 and $469 for 290X
> 
> also if your budget is $700 (same price as the cheapest 780ti)
> 
> grab a 290(or 290X)
> 
> spend $120 for a g10+h55 combo ( cheapest )
> 
> and you are looking at liquid cooled R9 290 for $ 550ish dollars
> 
> now if its like my old R9 290 (that died)
> 
> it will overclock to 1200/1400 on +80mV (40% power)
> 
> i am pretty sure you are on par with 780ti perf.
> 
> and you still have $150ish dollars to spend on other stuff


1200Mhz on +80mV is very very rare even for a 290X. Let alone a 290...


----------



## Spectre-

Quote:


> Originally Posted by *ebhsimon*
> 
> By the way, 1200/1400 on +80mV and 40% power is pretty rare. That's an extremely good chip, sad that it's dead now.


i blame HCPC for killing my R9 290

his RIVE is too OP for my card

... also 780 is a sidegrade to the 290


----------



## Sgt Bilko

Quote:


> Originally Posted by *Imprezzion*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Spectre-*
> 
> Msy sydney is getting rid of ref. models - $400 for r9 290 and $469 for 290X
> 
> also if your budget is $700 (same price as the cheapest 780ti)
> 
> grab a 290(or 290X)
> 
> spend $120 for a g10+h55 combo ( cheapest )
> 
> and you are looking at liquid cooled R9 290 for $ 550ish dollars
> 
> now if its like my old R9 290 (that died)
> 
> it will overclock to 1200/1400 on +80mV (40% power)
> 
> i am pretty sure you are on par with 780ti perf.
> 
> and you still have $150ish dollars to spend on other stuff
> 
> 
> 
> 1200Mhz on +80mV is very very rare even for a 290X. Let alone a 290...

290's will naturally clock higher than 290x's due to fewer shaders needing the power, but agreed, 1200Mhz @ +80mV is quite rare indeed; mine will only do 1140 @ +100mV


----------



## Spectre-

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 290's will naturally clock higher than 290x's due to less shaders needing the power but agreed though 1200Mhz @ +80mV is quite rare indeed, mine will only do 1140 @ +100mV


my current 290X is doing 1160/1350 on +75mv 50% power

silly thing doesn't want to go above 1250/1600 while benching

might open my H320 to do a proper loop and add a 240mm rad


----------



## Sgt Bilko

Quote:


> Originally Posted by *Spectre-*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> 290's will naturally clock higher than 290x's due to less shaders needing the power but agreed though 1200Mhz @ +80mV is quite rare indeed, mine will only do 1140 @ +100mV
> 
> 
> 
> my current 290X is doing 1160/1350 on +75mv 50% power
> 
> silly thing doesnt want to go above 1250/1600 while benching
> 
> might open my H320 to do a proper loop and add a 240mm rad

Best i've gotten out of my 290's is 1250/1550 +200mV with the stock BIOS.

for what i'm running i just leave my cards at stock settings


----------



## Spectre-

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Best i've gotten out of my 290's is 1250/1550 +200mV with the stock BIOS.
> 
> for what i'm running i just leave my cards at stock settings


pt1 bios was unstable for me

the Gigabyte bios is pretty boss

i flashed to the newest one

it does 1.43 volts (with drop-off)


----------



## Imprezzion

Best mine'll bench is 1230Mhz on +200mV but it will only stabilize in games at 1180Mhz +200mV. Anything over 1180Mhz gives random checkerboard artifacts.. It's an unlocked 290 ref. with Accelero Hybrid and a cut up stock cooler for VRM cooling.. Temps are no issue...


----------



## Spectre-

Quote:


> Originally Posted by *Imprezzion*
> 
> Best mine'll bench is 1230Mhz on +200mV but it will only stabilize in games at 1180Mhz +200mV. Anything over 1180Mhz gives random checkerboard artifacts.. It's an unlocked 290 ref. with Accelero Hybrid and a cut up stock cooler for VRM cooling.. Temps are no issue...


that is a lot of volts for 1180

have you given it extra power?

i basically just loop Metro LL like 20 - 30 times for gpu stability
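Looping a heavy game benchmark like this can be scripted so the runs are hands-off. A minimal sketch; `BENCH` here is a harmless placeholder command, not a real benchmark — swap in the actual benchmark executable and its arguments:

```python
import subprocess
import sys

# Loop a GPU benchmark 20-30 times for stability testing, as described above.
# BENCH is a placeholder that always succeeds; replace it with the real
# benchmark command (e.g. the game's built-in benchmark executable).
BENCH = [sys.executable, "-c", "pass"]
RUNS = 20

passed = 0
for run in range(1, RUNS + 1):
    print(f"Stability run {run} of {RUNS}")
    if subprocess.run(BENCH).returncode != 0:
        print(f"Run {run} failed - back the clocks off")
        break
    passed += 1

print(f"{passed} of {RUNS} runs passed")
```

A crash, driver reset or non-zero exit on any run means the overclock isn't stable yet; artifacts you still have to catch by eye.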


----------



## ebhsimon

Quote:


> Originally Posted by *Spectre-*
> 
> i blame HCPC for killing my R9 290
> 
> his RIVE is too OP for my card
> 
> ... also 780 is a sidegrade to the 290


I blame HCPC for making me feel like my rig is too slow









But yeah, I know the 780 is a sidegrade. I'm not looking to get anything much faster. I'm not selling my Vapor-X 290 because it's too slow, I'm selling it because it doesn't match my colour scheme.
I might just get a 290x ref and slap a G10 on that sucker.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ebhsimon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Spectre-*
> 
> i blame HCPC for killing my R9 290
> 
> *his RIVE is too OP for my card*
> 
> ... also 780 is a sidegrade to the 290
> 
> 
> 
> *I blame HCPC for making me feel like my rig is too slow*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But yeah, I know the 780 is a sidegrade. I'm not looking to get anything much faster. I'm not selling my Vapor-X 290 because it's too slow, I'm selling it because it doesn't match my colour scheme.
> I might just get a 290x ref and slap a G10 on that sucker.

Yeah.....he does that


----------



## rdr09

Quote:


> Originally Posted by *ebhsimon*
> 
> I guess that's an option, but unfortunately Melbourne MSY doesn't have any reference 290s for sale
> 
> 
> 
> 
> 
> 
> 
> They do have a 290x ref for $470 though which isn't bad. I could chuck a g10 and watercooler on that.
> However my budget isn't $700, since I want to replace my 290 with a 780. I'm looking at the Gigabyte 780 windforce which is ~$530. A much cheaper alternative.
> 
> If I get a 290x though, I could add a g10 and an X41 cooler which would make it ~$630. I guess it's not too bad, I could always go that route. Side note; I'd need to get the X41 because of the extended tubing to put the rad as a front intake as every other spot has no space (I already have a 280mm rad up top which blocks room for the rear exhaust).
> 
> By the way, 1200/1400 on +80mV and 40% power is pretty rare. That's an extremely good chip, sad that it's dead now.


What's the theme? Red, right? Gigabyte, if I'm not mistaken, is blue. There is an MSI Gaming version of the 780. That's what you need.


----------



## Mega Man

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Well if you are talking about the heatpipe thing then as before i think they are still the same, You'd need someone with a DCU II to chime in.
> 
> And 780's prices are going down? Here?
> 
> Last I saw they still had a higher price tag than R9 290's.
> 
> 
> 
> X99 and 780ti's $$$ are mind boggling here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What isn't expensive here though?
> 
> just seen the 780Ti Kingpin back in stock for $1100
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well the 295x2 is getting a price cut now, you think you might have a play around with one of them?

but no voltage control :/ that i have seen
Quote:


> Originally Posted by *ebhsimon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Spectre-*
> 
> X99 mobos are similarly priced to X79 , DDR4 is SUPER EXPENSIVE and the 5960x is $1200
> 
> if i sell my 3930K and ram and mobo
> 
> ill get $750-850 if i am lucky
> 
> and thats covering only a third of the stuff
> 
> also the cheapest 780ti is $700 and the R9 290 is $400
> 
> 
> 
> We can't even get 290s for $400 here... Cheapest I've (just checked) seen is $440. But you can't really compare 780tis to 290s... because a 780ti is quite a bit faster than a 290. I mean the price to performance ratio for the 290 is super great, but when you need that little extra bit (Eg. 1440p 120hz gaming) the 780ti is the only thing that'll work in some rare scenarios.

sorry but this made me laugh


----------



## ebhsimon

Quote:


> Originally Posted by *rdr09*
> 
> what's the theme? Red, right? Gigabyte, if i'm not mistaken, is blue. there is a msi gaming version of the 780. that's what you need.


The theme is more of an orange/black. Having trouble finding anything that would fit, which is why I'm tempted to just find black cards, DCU2 cooler with no stickers, ACX coolers, black g10s etc.



As you can see, my Vapor-X sticks out like a sore thumb.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ebhsimon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> what's the theme? Red, right? Gigabyte, if i'm not mistaken, is blue. there is a msi gaming version of the 780. that's what you need.
> 
> 
> 
> The theme is more of an orange/black. Having trouble finding anything that would fit, which is why I'm tempted to just find black cards, DCU2 cooler with no stickers, ACX coolers, black g10s etc.
> 
> 
> 
> As you can see, my Vapor-X sticks out like a sore thumb.

XFX?


----------



## Mega Man

watercooling ?


----------



## rdr09

Quote:


> Originally Posted by *ebhsimon*
> 
> The theme is more of an orange/black. Having trouble finding anything that would fit, which is why I'm tempted to just find black cards, DCU2 cooler with no stickers, ACX coolers, black g10s etc.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> As you can see, my Vapor-X sticks out like a sore thumb.


it does stick out. with regard to the difference in performance between the 780Ti and the 290, i agree, sort of. i know you have an i5 (i might get flamed for this) but here is the difference in my experience with a 290 stock with HT off and on in BF4 MP . . .



check out the minimums.


----------



## Spectre-

Quote:


> Originally Posted by *Mega Man*
> 
> but no voltage control :/ that i have seen
> sorry but this made me laugh


laugh at our silly taxes

> wants that 5960X


----------



## Mega Man

meh i was talking about the 780/ti, amd has stomped nvidia at higher res for how long?


----------



## ebhsimon

@Sgt Bilko I wouldn't mind getting XFX as it looks pretty neat, but I heard it gets a little toasty. Especially with summer coming up I don't want to be throttling or even close to throttling (makes me feel iffy seeing my GPU over 70C).

@Mega Man As for watercooling, I'm not prepared to put in the money into making a loop for my GPU and CPU for this rig since it's just a single gpu + 4670k, they don't really need it. A g10 + x41 could work, though.

@rdr09 I may be reading this wrong, but are you getting higher minimum frames with HT off?


----------



## Spectre-

Quote:


> Originally Posted by *Mega Man*
> 
> meh i was talking about the 780/ti, amd has stomped nvidia at higher res for how long ?


oh woops


----------



## ebhsimon

Quote:


> Originally Posted by *Mega Man*
> 
> meh i was talking about the 780/ti, amd has stomped nvidia at higher res for how long ?


I wouldn't exactly call it stomping. But I'm at 1440p which isn't exactly 'high res', it's the same resolution as just under 2 1080p screens. I haven't looked at benchmarks regarding this, but does the 290/x beat a 780/ti at 1440p that badly?


----------



## rdr09

Quote:


> Originally Posted by *ebhsimon*
> 
> @Sgt Bilko I wouldn't mind getting XFX as it looks pretty neat, but I heard it gets a little toasty. Especially with summer coming up I don't want to be throttling or even close to throttling (makes me feel iffy seeing my GPU over 70C).
> 
> @Mega Man As for watercooling, I'm not prepared to put in the money into making a loop for my GPU and CPU for this rig since it's just a single gpu + 4670k, they don't really need it. A g10 + x41 could work, though.
> 
> @rdr09 I may be reading this wrong, but are you getting higher minimum frames with HT off?


i wish. my loop is about 5C cooler with HT off. yes, you read it wrong. same with BF3 MP, same with C3.


----------



## ebhsimon

Quote:


> Originally Posted by *Spectre-*
> 
> > wants that 5960X


Haha, if you're a Youtuber maybe Intel will give you a cherry-picked 5960X. Paul's Hardware got his 5960X to 4.76Ghz on all 8 cores at just 1.29v. Meanwhile I'm pushing 1.34v to keep my 4670k stable at 4.4Ghz.
Quote:


> Originally Posted by *rdr09*
> 
> i wish. my loop is about 5C cooler with HT off. yes, you read it wrong. same with BF3 MP, same with C3.


That's a pretty huge difference. I guess I should probably stay with one GPU since my i5 is definitely going to hold things back. After I get my gpu sorted out I'll most likely build another rig when Skylake comes out (if I'm still into games, because by that time I'll have finished uni and moved on to full time work hopefully).


----------



## Sgt Bilko

Quote:


> Originally Posted by *ebhsimon*
> 
> @Sgt Bilko I wouldn't mind getting XFX as it looks pretty neat, but I heard it gets a little toasty. Especially with summer coming up I don't want to be throttling or even close to throttling (makes me feel iffy seeing my GPU over 70C).
> 
> @Mega Man As for watercooling, I'm not prepared to put in the money into making a loop for my GPU and CPU for this rig since it's just a single gpu + 4670k, they don't really need it. A g10 + x41 could work, though.
> 
> @rdr09 I may be reading this wrong, but are you getting higher minimum frames with HT off?


Coming from a Vapor-X you would definitely notice a difference, but they aren't any different from the temp results I've been seeing from MSI, Giga and some Tri-X owners.

A single GPU wouldn't have any issue with temps; it's only in Crossfire, playing BF4 for hours on end in 35c ambients, that I ever noticed them getting toasty and had to take the side off my case to help airflow


----------



## Imprezzion

Quote:


> Originally Posted by *Spectre-*
> 
> that is a lot of volts for 1180
> 
> have you given it extra power ?
> 
> i basically just loop Metro LL like 20 - 30 times for gpu stability


Yeah it runs +50%. No throttling.

It's more or less suicide style as it will run 1150Mhz on +118mV, so for 30Mhz more I need +80mV extra. Also, 1125Mhz is perfectly stable on just +37mV. The card just scales very, very badly at any frequency over 1125Mhz.

But then again, temps are fine (Core 65-70c load and VRM1 80-85c load, VRM2 60-65c load) and i don't care about load power consumption so might as well just run 1180 @ +200mV then.. haha.


----------



## ebhsimon

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Coming from a Vapor-X you would definitely notice a difference, but they aren't any different from the temp results I've been seeing from MSI, Giga and some Tri-X owners.
> 
> A single GPU would have any issue with temps and it's only in Crossfire when playing BF4 for hours on end in 35c ambients did i ever notice them getting toasty and i had to take the side off my case to help airflow


That's true I shouldn't run into any problems running a single gpu set up. It's just a little unsettling seeing a gpu above 80c. The hardest part of any build is choosing the parts.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ebhsimon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Coming from a Vapor-X you would definitely notice a difference but they aren't any different from what i've been seeing by MSI, Giga and some Tri-X owners temp results.
> 
> A single GPU wouldn't have any issue with temps and it's only in Crossfire when playing BF4 for hours on end in 35c ambients did i ever notice them getting toasty and i had to take the side off my case to help airflow
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's true I shouldn't run into any problems running a single gpu set up. It's just a little unsettling seeing a gpu above 80c. The hardest part of any build is choosing the parts.

This is an older review: LINK but it's a decent look at how a single card would perform temp-wise.

highest i've seen my bottom card hit is 75c on the core and 80c on vrm1 and as before that was in the middle of summer here in a room with no air con, top card was obviously a bit warmer but nothing i've stressed over


----------



## rdr09

Quote:


> Originally Posted by *ebhsimon*
> 
> Haha, if you're a Youtuber maybe Intel will give you a cherry-picked 5960X. Paul's Hardware got his 5960X to 4.76Ghz to all 8 cores at just 1.29v. Meanwhile I'm pushing 1.34v to keep my 4670k stable at 4.4Ghz.
> That's a pretty huge difference. I guess I should probably stay with one GPU since my i5 is definitely going to hold things back. After I get my gpu sorted out I'll most likely build another rig when Skylake comes out (if I'm still into games, because by that time I'll have finished uni and moved on to full time work hopefully).


at your rez, you have no choice.


----------



## ebhsimon

@Sgt Bilko Oh wow it looks like the XFX DD 290 is not too shabby at all. I'll look into finding one for myself after I've sold my Vapor-X 290.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ebhsimon*
> 
> @Sgt Bilko Oh wow it looks like the XFX DD 290 is not too shabby at all. I'll look into finding one for myself after I've sold my Vapor-X 290.


It goes alright; not as good as the Tri-X or the Vapor-X obviously, but core cooling isn't an issue. The vrm's can get somewhat warm when overclocking though, so fair warning in advance


----------



## PearlJammzz

So I have some overclocking questions for any of you in the know. I recently put my r9 290 under water and started overclocking it. I use Sapphire Trixx and I have a Sapphire reference r9 290 (the BF4 edition if that matters).

Question 1: I am able to get 1225 core clock just fine without touching the voltage at all. It's 100% solid. If I give the card more voltage, however, I can't get it to be stable any higher. My temps never get above 50c on my card. The VRM temps (can't remember, but I think it's VRM1) will get 60c but that's it. Am I doing something wrong or is this card just not capable of anything past 1225? I just find it weird that the voltage changes seem to do nothing in terms of stability.

Question 2: When I overclock the memory (it hits 1500 easily and is 100% stable. It went at 1600 no issues but didn't have it there long enough to say stable for sure) my benchmark scores actually get lower. Am I missing something here as well or is overclocking the memory on these cards counterproductive for some reason?

Thank you all for any help! It's much appreciated


----------



## sinnedone

Quote:


> Originally Posted by *ebhsimon*
> 
> @Sgt Bilko Oh wow it looks like the XFX DD 290 is not too shabby at all. I'll look into finding one for myself after I've sold my Vapor-X 290.


you could always paint the blue bits orange to match your build.


----------



## Spectre-

Quote:


> Originally Posted by *Imprezzion*
> 
> Yeah it runs +50%. No throttling.
> 
> It's more or less suicide style as it will run 1150Mhz on +118mV, so for 30Mhz more I need +80mV extra. Also, 1125Mhz is perfectly stable on just +37mV. The card just scales very, very badly at any frequency over 1125Mhz.
> 
> But then again, temps are fine (Core 65-70c load and VRM1 80-85c load, VRM2 60-65c load) and i don't care about load power consumption so might as well just run 1180 @ +200mV then.. haha.


damn man

my 290X does 1100/1325 on +0.12

+20%

i just wanted a few more fps


----------



## Widde

Quote:


> Originally Posted by *PearlJammzz*
> 
> Question 2: When I overclock the memory (it hits 1500 easily and is 100% stable. It went at 1600 no issues but didn't have it there long enough to say stable for sure) my benchmark scores actually get lower. Am I missing something here as well or is overclocking the memory on these cards counterproductive for some reason?
> 
> Thank you all for any help! It's much appreciated


Think it's the memory that is getting errors and correcting them. Try backing off a little on the OC


----------



## PearlJammzz

Quote:


> Originally Posted by *Widde*
> 
> Think it's the memory that is getting errors and correcting them. Try backing off a little on the OC


Any way to log/visually see this? Or just OC and test benchmarks till they start going down?


----------



## wrigleyvillain

Yes, while it does seem a little curious that more volts aren't getting you *anything*, that is also a pretty insane OC for stock voltage and I would be extremely pleased. My reference can't seem to do much better than 1070 on stock voltage. But it also unlocked stable, so it has a nice place in my heart nonetheless.








Quote:


> Originally Posted by *PearlJammzz*
> 
> Any way to log/visually see this?


I don't think so..? Too bad no Memtest for VRAM (that I know of).
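For what it's worth, the write/read-back pattern test such a tool would run is easy to sketch. Below is a toy version against an ordinary host-memory buffer (where it trivially reports zero errors); a real VRAM tester would allocate the buffer on the GPU, e.g. via OpenCL, where mismatches past the stable memory clock would expose the errors the card otherwise hides:

```python
import array

def pattern_test(buf_len, patterns=(0x00, 0xFF, 0xAA, 0x55)):
    """Write each bit pattern across a buffer, read it back, count mismatches.

    On host RAM this always returns 0; pointed at a GPU buffer it would
    flag the flipped bits an unstable memory overclock produces.
    """
    errors = 0
    for p in patterns:
        buf = array.array("B", [p] * buf_len)          # "write" phase
        errors += sum(1 for byte in buf if byte != p)  # "read-back" phase
    return errors

print(pattern_test(1024 * 1024))  # 0 on a healthy buffer
```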


----------



## jagdtigger

Quote:


> Originally Posted by *kizwan*
> 
> Try re-seating again. A couple of people including me experienced a similar issue (in the Asus Rampage x79 thread if I remember correctly & it involved different GPU's). I don't think resetting the BIOS can help (but worth a try anyway). It's like the PCIe link between the slot & the card is electrically incomplete somehow. Re-seating the GPU should fix it.


Re-seated it multiple times but it went back to x4. The BIOS reset solved it though; after Windows was up and running I checked with GPU-Z and it went up to x16 3.0 as it should...


----------



## kizwan

Quote:


> Originally Posted by *jagdtigger*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Try re-seating again. A couple of people including me experienced a similar issue (in the Asus Rampage x79 thread if I remember correctly & it involved different GPU's). I don't think resetting the BIOS can help (but worth a try anyway). It's like the PCIe link between the slot & the card is electrically incomplete somehow. Re-seating the GPU should fix it.
> 
> 
> 
> 
> 
> 
> Re-seated it multiple times but it went back to x4. The BIOS reset solved it though; after Windows was up and running I checked with GPU-Z and it went up to x16 3.0 as it should...

I'm glad you sorted that out.









P/S: Are you a tank fan? Specifically RC tanks?


----------



## PearlJammzz

Quote:


> Originally Posted by *wrigleyvillain*
> 
> Yes while it does seem a little curious that more volts isn't getting you *anything* that is also a pretty insane OC for stock and I would be extremely pleased. My reference can't seem to do much better than 1070 on stock voltage. But it also unlocked stable so it has a nice place in my heart nonetheless.
> 
> 
> 
> 
> 
> 
> 
> 
> I don't think so..? Too bad no Memtest for VRAM (that I know of).


Ya, it's weird as hell. On air and no voltage I could get ~1150. Seems cooler temps help the card run at higher speeds even at the same power. I almost wonder if the power limit slider in Sapphire Trixx isn't working somehow? Just really weird!


----------



## jagdtigger

Quote:


> Originally Posted by *kizwan*
> 
> I'm glad you sorted that out.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P/S: Are you a tank fan? Specifically RC tanks?


Not yet; I re-applied the OC (manually, deleted the old profiles) and it's again only x8. It's strange...









As for tanks, the real thing is better...







But for us normals only the RC is available.


----------



## battleaxe

My MSI Gaming 290 will do 1165 while benching. Hoping it will reach 1200mhz when the temps come down about 25c from adding a block to it. That's at +100mv... My GTX670's were much happier on water than on air. Even at the same clocks the scores were better and I'm not sure how or why that's possible, but it was. Same drivers, same everything. Just did a lot better on water.


----------



## Mega Man

Quote:


> Originally Posted by *ebhsimon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> meh i was talking about the 780/ti, amd has stomped nvidia at higher res for how long ?
> 
> 
> 
> I wouldn't exactly call it stomping. But I'm at 1440p which isn't exactly 'high res', it's the same resolution as just under 2 1080p screens. I haven't looked at benchmarks regarding this, but does the 290/x beat a 780/ti at 1440p that badly?

My bad, "higher" is the wording I should have used.

Usually due to the higher memory bandwidth
Quote:


> Originally Posted by *PearlJammzz*
> 
> So I have some overclocking questions for any of you in the know. I recently put my r9 290 under water and started overclocking it. I use Sapphire Trixx and I have a Sapphire reference r9 290 (the BF4 edition if that matters).
> 
> Question 1: I am able to get 1225 core clock just fine without touching the voltage at all. It's 100% solid. If I give the card more voltage, however, I can't get it to be stable any higher. My temps never get above 50c on my card. The VRM temps (can't remember, but I think it's VRM1) will get 60c but that's it. Am I doing something wrong or is this card just not capable of anything past 1225? I just find it weird that the voltage changes seem to do nothing in terms of stability.
> 
> Question 2: When I overclock the memory (it hits 1500 easily and is 100% stable. It went at 1600 no issues but didn't have it there long enough to say stable for sure) my benchmark scores actually get lower. Am I missing something here as well or is overclocking the memory on these cards counterproductive for some reason?
> 
> Thank you all for any help! It's much appreciated


Memory has ECC, which means it is causing errors but correcting them, so you are not really stable
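Mega Man's point is worth unpacking: when the memory detects a bad transfer it replays it, so past the stable clock the replays can cost more bandwidth than the higher clock adds, which is why benchmark scores drop. A toy model of the effect (the constants are invented for illustration, not AMD's actual replay behaviour):

```python
def effective_bandwidth(clock_mhz, stable_mhz=1500, err_per_mhz=0.003):
    """Toy model: throughput scales with clock, minus replay overhead.

    err_per_mhz is a made-up constant; the real error rate depends on the
    memory chips, voltage and temperature.
    """
    over = max(0, clock_mhz - stable_mhz)           # MHz past the error-free point
    replay_fraction = min(1.0, over * err_per_mhz)  # share of transfers replayed
    return clock_mhz * (1 - replay_fraction)

# Past 1500MHz the replay overhead outweighs the clock gain:
for clk in (1500, 1550, 1600):
    print(clk, effective_bandwidth(clk))
```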


----------



## devilhead

Quote:


> Originally Posted by *Imprezzion*
> 
> 1200Mhz on +80mV is very very rare even for a 290X. Let alone a 290...


Now I have 2x 290X; both can do 1200MHz at +80mV in Battlefield 4 with no problems (tested; 1250MHz already needs +137mV to be gaming stable with no artifacts, and 1280MHz needs +200mV, where I can sometimes see some small artifacts but am still able to play for a couple of hours). I've also had many 290's (6x), and almost all of them under water could do 1200MHz at ~100mV and play BF4 without any artifacts


----------



## Wezzor

Is 35c idle on an R9 290 Tri-X good? (not overclocked yet)


----------



## Mongoose135

Quote:


> Originally Posted by *Wezzor*
> 
> Is 35c idle on a R9 290 Tri-X good? (not overclocked yet)


Yep, pretty good. It's hard to get much below that in my experience (once windows has been left running a while)


----------



## thrgk

Are people still selling their 290x's? I remember a month or so ago people were selling like 45 at a time. I think I would go 290x, 3 or 4 of em, and dump my 7970s, instead of going with the Nvidia 980, but I cannot find em anywhere.


----------



## Wezzor

Quote:


> Originally Posted by *Mongoose135*
> 
> Yep, pretty good. It's hard to get much below that in my experience (once windows has been left running a while)


Ah okay! But is it normal that the card goes up to 50-55c when watching streams on twitch? It feels like kind of a lot somehow; I mean when I max out heavy games I get like 70-75c.


----------



## hotrod717

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Thanks man ive had this one since may......... its a unlocked XFX 290
> 
> 
> 
> 
> 
> 
> 
> 
> Starting to workout the dram better . These are downclocked CL11 2666 Dom Plats @CL9 [email protected]


Yes, i'm starting to see cas latency and certain dividers in a whole new light since I got my PIS 2200CL7. 2666CL9 anyone?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Wezzor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mongoose135*
> 
> Yep, pretty good. It's hard to get much below that in my experience (once windows has been left running a while)
> 
> 
> 
> Ah okay! But is it normal that the card goes up to 50-55c when watching streams on twitch? It feels kinda much somehow, I mean when I max out heavy games I get like 70-75c.

Yup. That's hardware acceleration


----------



## Mongoose135

Quote:


> Originally Posted by *Wezzor*
> 
> Ah okay! But is it normal that the card goes up to 50-55c when watching streams on twitch? It feels kinda much somehow, I mean when I max out heavy games I get like 70-75c.


Your gaming temps are perfectly fine, and going by that I'd say the 50-55 is also just fine. The fan profile is designed to stay quiet and only really spin up when you're putting heavy loads onto the card. Watching videos can load up the card between 10-30% on and off - check your GPU load as you watch if you want confirmation.


----------



## th3illusiveman

why is it my PC blackscreens on the login window after it's been turned off for afew hours? If i turn it off now, then turn it on in 10 mins it will boot up fine. If i turn it off now and turn it on again in 6 hours it blackscreens on the logon window and i have to restart it. So far this has been a very consistent pattern with both 14.7 RC3 and the latest driver release. OC or stock doesn't matter, i even tried turning CCC off on startup and that didnt fix it.








AMD pls. 7970 had no issues like this.


----------



## Sgt Bilko

Quote:


> Originally Posted by *th3illusiveman*
> 
> why is it my PC blackscreens on the login window after it's been turned off for afew hours? If i turn it off now, then turn it on in 10 mins it will boot up fine. If i turn it off now and turn it on again in 6 hours it blackscreens on the logon window and i have to restart it. So far this has been a very consistent pattern with both 14.7 RC3 and the latest driver release. OC or stock doesn't matter, i even tried turning CCC off on startup and that didnt fix it.
> 
> 
> 
> 
> 
> 
> 
> 
> AMD pls. 7970 had no issues like this.


Out of curiosity, do you have a leader type power cable plugged in or are they seperate 8 & 6 pin connectors?


----------



## th3illusiveman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Out of curiosity, do you have a leader type power cable plugged in or are they seperate 8 & 6 pin connectors?


uh... idk? Whats that? I have the PSU in my sig and it's got the 6pin and 8pin connectors. enough for 2 570s in Sli.


----------



## Sgt Bilko

Quote:


> Originally Posted by *th3illusiveman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Out of curiosity, do you have a leader type power cable plugged in or are they seperate 8 & 6 pin connectors?
> 
> 
> 
> uh... idk? Whats that? I have the PSU in my sig and it's got the 6pin and 8pin connectors. enough for 2 570s in Sli.
Click to expand...

I probably worded it badly.

Do you have seperate power cables for each of the PCIe power connecters or does the 6 pin come off the 8 pin?

Blackscreens seem to happen when the memory isn't getting enough voltage and im just wondering if separate cables might be an answer.

Also on mobile, cant see sigs sorry


----------



## th3illusiveman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I probably worded it badly.
> 
> Do you have seperate power cables for each of the PCIe power connecters or does the 6 pin come off the 8 pin?
> 
> Blackscreens seem to happen when the memory isn't getting enough voltage and im just wondering if separate cables might be an answer.
> 
> Also on mobile, cant see sigs sorry


they are both separate. They each come from the PSU.


----------



## Arizonian

Quote:


> Originally Posted by *Raul-7*
> 
> Finally got her under water.
> 
> 
> Spoiler: Warning: Spoiler!


Sweet - I'm glad in hindsight I already had listed you under water.


----------



## samoth777

hi there,

Do R9 cards have something similar to Nvidia's adaptive vsync?


----------



## Spectre-

Quote:


> Originally Posted by *samoth777*
> 
> hi there,
> 
> Do R9 cards have something similar to Nvidia's adaptive vsync?


most likely free-sync

and thats only for selective gpu's


----------



## Mega Man

Quote:


> Originally Posted by *Spectre-*
> 
> Quote:
> 
> 
> 
> Originally Posted by *samoth777*
> 
> hi there,
> 
> Do R9 cards have something similar to Nvidia's adaptive vsync?
> 
> 
> 
> most likely free-sync
> 
> and thats only for selective gpu's
Click to expand...

all gpus with dp 1.2a


----------



## ebhsimon

Quote:


> Originally Posted by *sinnedone*
> 
> you could always paint the blue bits orange to match your build.


I'm not capable of that. But I am fond of an orange anodized look. I have no clue where I'd get it coated though. It would be damn sexy if it came in orange.


----------



## Wezzor

Quote:


> Originally Posted by *Mongoose135*
> 
> Your gaming temps are perfectly fine, and going by that I'd say the 50-55 is also just fine. The fan profile is designed to stay quiet and only really spin up when you're putting heavy loads onto the card. Watching videos can load up the card between 10-30% on and off - check your GPU load as you watch if you want confirmation.


That's nice to hear. Thank you for your help!


----------



## ozzy1925

today i noticed 2 out of my 3 trix gpus make very loud rattling sound when i set the fans at %20.Is there a fix without voiding warranty?I dont want to deal with RMA


----------



## EdWeisz

Hi.

I'm still clinging on my old i7 960 set up. And playing on a 1080p/ 60hz monitor.

When I play Wathcdog or Sniper Elite III, I usually see my GPU load at 100%. Is this normal or does it mean that I'm having bottleneck issue/s with the CPU that I have.

Thanks.


----------



## Sgt Bilko

Quote:


> Originally Posted by *EdWeisz*
> 
> Hi.
> 
> I'm still clinging on my old i7 960 set up. And playing on a 1080p/ 60hz monitor.
> 
> When I play Wathcdog or Sniper Elite III, I usually see my GPU load at 100%. Is this normal or does it mean that I'm having bottleneck issue/s with the CPU that I have.
> 
> Thanks.


100% load on the GPU means that your GPU is the bottleneck and your CPU is doing fine.


----------



## EdWeisz

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 100% load on the GPU means that your GPU is the bottleneck and your CPU is doing fine.


I see. Thanks.


----------



## rdr09

gained a few points using the leaked driver 14.3 (aka 14.6 RC). stock 947MHz . . .

http://www.3dmark.com/3dm11/8682642

this was 13 driver . . .

http://www.3dmark.com/3dm11/7897245


----------



## imadorkx

Thats a common problem for r9 290 tri x unfortunately. But you can try to re screw the fan screw.

there also a fix for that in youtube.


----------



## ozzy1925

Quote:


> Originally Posted by *rdr09*
> 
> gained a few points using the leaked driver 14.3 (aka 14.6 RC). stock 947MHz . . .
> 
> http://www.3dmark.com/3dm11/8682642
> 
> this was 13 driver . . .
> 
> http://www.3dmark.com/3dm11/7897245


is your gpu underwater?Also where can i download thisdriver?

edit :i tought 1250 mem speed was core speed








very impressive score
i just tried with latest 14.8 .What do you think?
http://www.3dmark.com/3dm11/8682688
core speed 1150 +125mv is good ?Andt is not stable after 1150 even i tried +200mv.This card worh going underwater or just send rma because of fan rattling


----------



## rdr09

Quote:


> Originally Posted by *ozzy1925*
> 
> is your gpu underwater?Also where can i download thisdriver?
> 
> edit :i tought 1250 mem speed was core speed
> 
> 
> 
> 
> 
> 
> 
> 
> very impressive score
> i just tried with latest 14.8 .What do you think?
> http://www.3dmark.com/3dm11/8682688
> core speed 1150 +125mv is good ?Andt is not stable after 1150 even i tried +200mv.This card worh going underwater or just send rma because of fan rattling


yes, it's watered. same score as 14.8. yours a little lower 'cause of your cpu at stock and . . . you only have a stick of ram?

http://www.3dmark.com/3dm11/8682785


----------



## ozzy1925

Quote:


> Originally Posted by *rdr09*
> 
> yes, it's watered. same score as 14.8. yours a little lower 'cause of your cpu at stock and . . . you only have a stick of ram?
> 
> http://www.3dmark.com/3dm11/8682785


yea currenlty only 1 4gb ram waiting for my friend to pick 5960x for me from Usa.I am thinking of sending this card for rma because middle fan looks unbalanced and makes some noise but not sure do you think its worth going underwater with this?


----------



## rdr09

Quote:


> Originally Posted by *ozzy1925*
> 
> yea currenlty only 1 4gb ram waiting for my friend to pick 5960x for me from Usa.I am thinking of sending this card for rma because middle fan looks unbalanced and makes some noise but not sure do you think its worth going underwater with this?


you really need X99? that 4770K can handle at least 2 of these cards easy. going water is certainly worth it with these cards, especially reference.

test the replacement card if it will oc better. i would suggest asking for a reference card (290X) if they have any. test it on air, if it is good, then watercool it.

almost broke 18K in graphics at only 1200 . . .

http://www.3dmark.com/3dm11/8682835

a 290X will be scoring mcuh higher.


----------



## ozzy1925

Quote:


> Originally Posted by *rdr09*
> 
> you really need X99? that 4770K can handle at least 2 of these cards easy. going water is certainly worth it with these cards, especially reference.
> 
> test the replacement card if it will oc better. i would suggest asking for a reference card (290X) if they have any. test it on air, if it is good, then watercool it.
> 
> almost broke 18K in graphics at only 1200 . . .
> 
> http://www.3dmark.com/3dm11/8682835
> 
> a 290X will be scoring mcuh higher.


I was mining with these but gpu mining is dead so i purchased ek blocks for them .i have 3 of these 1 of them stable at 1150 +200mv .this one is stable 1150 +125mv stock trix coling.Havent test the 3rd card .yet.


----------



## rdr09

Quote:


> Originally Posted by *ozzy1925*
> 
> I was mining with these but gpu mining is dead so i purchased ek blocks for them .i have 3 of these 1 of them stable at 1150 +200mv .this one is stable 1150 +125mv stock trix coling.Havent test the 3rd card .yet.


i see. if you want to go trifire, then no choice but X99. those ek blocks are for reference, too. i really believe reference are better all around when watercooling. you get a better chance of getting higher clocks.

btw, BF4 works with the leaked driver so far. will test further.


----------



## Newbie2009

Quote:


> Originally Posted by *rdr09*
> 
> i see. if you want to go trifire, then no choice but X99. those ek blocks are for reference, too. i really believe reference are better all around when watercooling. you get a better chance of getting higher clocks.


You don't think a normal quad for ivy, sandy or haswell i7 is enough?


----------



## rdr09

Quote:


> Originally Posted by *Newbie2009*
> 
> You don't think a normal quad for ivy, sandy or haswell i7 is enough?


not sure. prolly tri at the most. they need to reach higher clocks. seeing the physics score of that haswell at stock in 3DMark11 . . . i am not sure if that is a good enuf gauge. it's like my thuban at 4GHz.

i know Bradley's 290x's are being bottlenecked by his X79 at 4.5GHz. he blames it on poor coding of games.


----------



## mus1mus

Without unlocking, what bios are you guys suggest for the 290?

Mine would be swimming soon. But I want to still be safe..

I have a ref msi..


----------



## rv8000

Quote:


> Originally Posted by *ozzy1925*
> 
> yea currenlty only 1 4gb ram waiting for my friend to pick 5960x for me from Usa.I am thinking of sending this card for rma because middle fan looks unbalanced and makes some noise but not sure do you think its worth going underwater with this?


How hot are your vrms getting at +125mG, if you're exceeding 85c and up you should see at least a small benefit under water.


----------



## rdr09

Quote:


> Originally Posted by *th3illusiveman*
> 
> why is it my PC blackscreens on the login window after it's been turned off for afew hours? If i turn it off now, then turn it on in 10 mins it will boot up fine. If i turn it off now and turn it on again in 6 hours it blackscreens on the logon window and i have to restart it. So far this has been a very consistent pattern with both 14.7 RC3 and the latest driver release. OC or stock doesn't matter, i even tried turning CCC off on startup and that didnt fix it.
> 
> 
> 
> 
> 
> 
> 
> 
> AMD pls. 7970 had no issues like this.


not sure if it is the driver that fixed this same issue on my amd rig with a 7950 but you can try . . .

Go to Control Panel>Power options>(Mine is set to High)Change plan settings>Advance settings

Under Advance settings go to the following:

-Sleep set to Never
-PCI Express>Link State Power Management set to Off
-Display>Turn off display after set it to Never

Apply.

Quote:


> Originally Posted by *mus1mus*
> 
> Without unlocking, what bios are you guys suggest for the 290?
> 
> Mine would be swimming soon. But I want to still be safe..
> 
> I have a ref msi..


What are you trying to accomplish? i have a reference msi 290 and i use the orig bios. it works fine.


----------



## ozzy1925

Quote:


> Originally Posted by *rv8000*
> 
> How hot are your vrms getting at +125mG, if you're exceeding 85c and up you should see at least a small benefit under water.


I cant remember atm i will check when back from work


----------



## KeepWalkinG

What you think about that, on my Vapor-X 290 voltage on +200mV in Unigine Valley 1.0 voltage is 1.160v, 1.159v with msi afterburner + rivatuner
This is real or software don't show real values?


----------



## th3illusiveman

Quote:


> Originally Posted by *rdr09*
> 
> not sure if it is the driver that fixed this same issue on my amd rig with a 7950 but you can try . . .
> 
> Go to Control Panel>Power options>(Mine is set to High)Change plan settings>Advance settings
> 
> Under Advance settings go to the following:
> 
> -Sleep set to Never
> -PCI Express>Link State Power Management set to Off
> -Display>Turn off display after set it to Never
> 
> Apply.
> What are you trying to accomplish? i have a reference msi 290 and i use the orig bios. it works fine.


it was set at that already but thanks for trying. Kinda mitigates the value of an SSD if i have to turn the PC on twice lol. I will try a Bios flash with something with more juice soon.


----------



## ozzy1925

i get this score @1180 core +200mv vrm 94C gpu 80C
http://www.3dmark.com/3dm11/8684257
Graphics Score 17316

.Is it low for 1180?


----------



## pdasterly

what are my screen options, currently have three 23" 1080 monitors(z23i). What are my options for adding another monitor. I know if i add more 1080 my fps will drop too far.
sapphire 290x crossfire


----------



## hwoverclkd

Quote:


> Originally Posted by *KeepWalkinG*
> 
> What you think about that, on my Vapor-X 290 voltage on +200mV in Unigine Valley 1.0 voltage is 1.160v, 1.159v with msi afterburner + rivatuner
> This is real or software don't show real values?


KeepBenching







sw reading is usually unreliable. DMM is better.


----------



## rdr09

Quote:


> Originally Posted by *th3illusiveman*
> 
> it was set at that already but thanks for trying. Kinda mitigates the value of an SSD if i have to turn the PC on twice lol. I will try a Bios flash with something with more juice soon.


oh, noes. the only times i turn off my sigs are when i go on vacation of two weeks or more and during a thunder storm or something. i only have hdds.

edit: btw, i don't think it has to do with the bios on these cards. i think it has to do with a driver and how it interacts with certain monitors. i have a monitor hooked to my 290 that does not do it. hooked up my 7950 and my 7770 to the same one and sleep issue was gone on both.

it's just this one monitors of mine that i use with the 7950 and i will test tonight when i put it to sleep as usual. tomorrow i will hit the enter key first before turning the monitor on.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *ozzy1925*
> 
> i get this score @1180 core +200mv vrm 94C gpu 80C
> http://www.3dmark.com/3dm11/8684257
> Graphics Score 17316
> 
> .Is it low for 1180?


No, here is an old bench of my card @ 1200MHz with tess on.

http://www.3dmark.com/3dm11/7540268


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> What are you trying to accomplish? i have a reference msi 290 and i use the orig bios. it works fine.


Some good numbers..


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> Some good numbers..


i see. i think what you're looking for is the PT1 BIOS. Sorry, not too familiar with it. do you mind comparing our bioses?



have no guts flashing PT1 but this bios of mine seems fine for oc'ing. i know it comes down to silicon. curious to see what bios version you have.


----------



## mus1mus

I haven't had a chance to note.. and take a copy of the stock BIOS since it's really jumping the core clocks all over the place..

Tried the Asus 290 BIOS for +400 Voltage control, No effect. Sapphires BIOS didn't work. I'm now using the PCS+ at 1040/1350 with a fairly aggressive fan profile that keeps me with 75 at 30s Ambient.

But PCS is only up to +100mV and it doesn't really matter coz even at Max, the fan profile is able to keep the temps in control..


----------



## chiknnwatrmln

Anyone using the 14.7 Beta drivers?

I just installed them, have yet to play any games but 3dm11 is showing a good 400 graphic score increase at the same clocks.

Kinda bummed the F.lux doesn't work properly with Eyefinity on these driver though.

Oh and for some reason whenever I hover my mouse over a text box, the cursor disappears.


----------



## hwoverclkd

Quote:


> Originally Posted by *mus1mus*
> 
> I haven't had a chance to note.. and take a copy of the stock BIOS since it's really jumping the core clocks all over the place..
> 
> Tried the Asus 290 BIOS for +400 Voltage control, No effect. Sapphires BIOS didn't work. I'm now using the PCS+ at 1040/1350 with a fairly aggressive fan profile that keeps me with 75 at 30s Ambient.
> 
> But PCS is only up to +100mV and it doesn't really matter coz even at Max, the fan profile is able to keep the temps in control..


any luck finding the stock bios from techpowerup? http://www.techpowerup.com/vgabios/


----------



## mus1mus

I can. But nope, 1040 vs 947 MHz, PCS+ vs MSI BIOS is a no-brainer.

It also keeps the clocks stable.


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> I haven't had a chance to note.. and take a copy of the stock BIOS since it's really jumping the core clocks all over the place..
> 
> Tried the Asus 290 BIOS for +400 Voltage control, No effect. Sapphires BIOS didn't work. I'm now using the PCS+ at 1040/1350 with a fairly aggressive fan profile that keeps me with 75 at 30s Ambient.
> 
> But PCS is only up to +100mV and it doesn't really matter coz even at Max, the fan profile is able to keep the temps in control..


you're good, then.


----------



## mus1mus

Well, better is BETTER..


----------



## Spectre-

Quote:


> Originally Posted by *mus1mus*
> 
> I haven't had a chance to note.. and take a copy of the stock BIOS since it's really jumping the core clocks all over the place..
> 
> Tried the Asus 290 BIOS for +400 Voltage control, No effect. Sapphires BIOS didn't work. I'm now using the PCS+ at 1040/1350 with a fairly aggressive fan profile that keeps me with 75 at 30s Ambient.
> 
> But PCS is only up to +100mV and it doesn't really matter coz even at Max, the fan profile is able to keep the temps in control..


you should try the gigabyte bios

they turnt out to be stable for me and i can bench upto 1.43 volts( with drop -off)


----------



## mus1mus

Quote:


> Originally Posted by *mus1mus*
> 
> I haven't had a chance to note.. and take a copy of the stock BIOS since it's really jumping the core clocks all over the place..
> 
> Tried the Asus 290 BIOS for +400 Voltage control, No effect. Sapphires BIOS didn't work. I'm now using the PCS+ at 1040/1350 with a fairly aggressive fan profile that keeps me with 75 at 30s Ambient.
> 
> But PCS is only up to +100mV and it doesn't really matter coz even at Max, the fan profile is able to keep the temps in control..
> Quote:
> 
> 
> 
> Originally Posted by *Spectre-*
> 
> you should try the gigabyte bios
> 
> they turned out to be stable for me and i can bench upto 1.43 volts( with drop -off)
Click to expand...

With Voltage Control up to ??

Thanks for the tip. Will try that one later.


----------



## mus1mus

Anybody has experience issues with XFX Double Dissipation 290??


----------



## Sgt Bilko

Quote:


> Originally Posted by *mus1mus*
> 
> Anybody has experience issues with XFX Double Dissipation 290??


What info are you after?


----------



## th3illusiveman

What bios's can i use on a Sapphire 290X TriX-OC? Is there anyone on the forums who mods bioses? I remember a dude who did it for my GTX 570s. He could raise clocks and voltage.


----------



## samjitsu

Hello guys I'm new here. Just got a *Gigabyte R9 290 WINDFORCE OC*. And I was wondering if it's the same as the *REFERENCE PCB*? I'm planning to hook it up with a *KRAKEN G10* Bracket. NZXT said that _"Compatibility list is limited to reference designs, but non-reference designs will most likely work, provided the board partner did not change the location of the die, die height, or screw spacing."_.

PS: planning to water cool this because I bought this abroad and brought it here to my country. And Temps get so high even for a non ref cooler. Which means It's really that bad of a cooling system. RMA'ing it would cost me the same as getting it liquid cooled.


----------



## mus1mus

Quote:


> Originally Posted by *Sgt Bilko*
> 
> What info are you after?


Somebody is selling me a 290 with black screens with the driver ON. I know I have this issue before but the owner gathered it may be a Memory issue.

Is that common for the card?

How is their RMA btw?

Edit: card was purchased from Hong Kong so reseller warranty seems nill on posibilities.

Should I buy it hoping I can fix the stuff for half the price and hope for an RMA?


----------



## Spectre-

Quote:


> Originally Posted by *mus1mus*
> 
> With Voltage Control up to ??
> 
> Thanks for the tip. Will try that one later.


+200Mv


----------



## mus1mus

Quote:


> Originally Posted by *Spectre-*
> 
> +200Mv


I hope it's not Mega Volts.









Thanks +1


----------



## Sgt Bilko

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> What info are you after?
> 
> 
> 
> Somebody is selling me a 290 with black screens with the driver ON. I know I have this issue before but the owner gathered it may be a Memory issue.
> 
> Is that common for the card?
> 
> How is their RMA btw?
Click to expand...

It's something to do with the memory and voltage supplied afaik and it varies from rig to rig.

The 290's i own have been great, no black screens (apart from pushing the memory too high) they clock fairly well and temps aren't too bad either.

As for the RMA, tbh i have no idea (these are my first XFX products) but people in this thread have reported that thier support is very good now and no complaints.

You can also ask in here: XFX DD/Black Club


----------



## Spectre-

Quote:


> Originally Posted by *mus1mus*
> 
> I hope it's not Mega Volts.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks +1


only if


----------



## oicwutudidthar

what is currently the best bios to run with 290x for overclocking? I am running two on water and I want to overclock the bejesus out of them (400mv+ voltage)


----------



## Spectre-

Quote:


> Originally Posted by *oicwutudidthar*
> 
> what is currently the best bios to run with 290x for overclocking? I am running two on water and I want to overclock the bejesus out of them (400mv+ voltage)


asus pt1


----------



## kizwan

Quote:


> Originally Posted by *ozzy1925*
> 
> i get this score @1180 core +200mv vrm 94C gpu 80C
> http://www.3dmark.com/3dm11/8684257
> Graphics Score 17316
> 
> .Is it low for 1180?


No problem with Graphics Score. The Physics & Combined score is surprisingly low but you're running CPU at stock clock. Why running "K" CPU at stock clock?


----------



## Spectre-

Quote:


> Originally Posted by *kizwan*
> 
> No problem with Graphics Score. The Physics & Combined score is surprisingly low but you're running CPU at stock clock. Why running "K" CPU at stock clock?


i dont get that either

if you dont overclock just get the Xeon


----------



## octiny

Picked up a couple PCS+ 290's with Kraken G10's to replace my GTX 780's I got a couple days ago (which originally replaced super hot reference 290's)









Ran it at my 24/7 clocks 1150/1550



http://www.3dmark.com/fs/2702107

Landed in the top 2 for similar systems









Edit:

Here's @ 1175/1550



http://www.3dmark.com/fs/2702177


----------



## Mega Man

Quote:


> Originally Posted by *ozzy1925*
> 
> today i noticed 2 out of my 3 trix gpus make very loud rattling sound when i set the fans at %20.Is there a fix without voiding warranty?I dont want to deal with RMA


iirc that sapphire after market coolers are known for this it is just you found the freq that the shroud will vibrate with the fan at


----------



## ozzy1925

Quote:


> Originally Posted by *kizwan*
> 
> No problem with Graphics Score. The Physics & Combined score is surprisingly low but you're running CPU at stock clock. Why running "K" CPU at stock clock?


Quote:


> Originally Posted by *Spectre-*
> 
> i dont get that either
> 
> if you dont overclock just get the Xeon


sorry for the confusion this is my mining rig ,i was just checking the graphics score .I am building my project you can check from my sig.

Quote:


> Originally Posted by *Mega Man*
> 
> iirc that sapphire after market coolers are known for this it is just you found the freq that the shroud will vibrate with the fan at


I tried the shroud fix from youtube but didnt help me i will add a video for you

edit:here




as you see the noise comes from the middle fan


----------



## schoolofmonkey

Quote:


> Originally Posted by *ozzy1925*
> 
> today i noticed 2 out of my 3 trix gpus make very loud rattling sound when i set the fans at %20.Is there a fix without voiding warranty?I dont want to deal with RMA


I didn't see anyone else that posted this fix:

Sapphire R9 290 TRI X fixing the rattle sound


----------



## ozzy1925

Quote:


> Originally Posted by *schoolofmonkey*
> 
> I didn't see anyone else that posted this fix:
> 
> Sapphire R9 290 TRI X fixing the rattle sound


they did but if you read my above post it didnt help me


----------



## Jflisk

I have a system with 120.2 >140.2> 120 radiators in it. I have 3x R9 290x and 1x FX9590. The pump is a D5. Will adding a second D5 pump bring the temps down by increasing the flow or split the loop 120.2>140.2>GPU - 120>CPU. Temps are not horrendous but I am hitting 67C max after a couple of hours of Crysis 3 or Bioshock infinite. Trying to keep it all in the case and bring down the temps a little. Thanks in advance.


----------



## mus1mus

Quote:


> Originally Posted by *Jflisk*
> 
> I have a system with 120.2 >140.2> 120 radiators in it. I have 3x R9 290x and 1x FX9590. The pump is a D5. Will adding a second D5 pump bring the temps down by increasing the flow or split the loop 120.2>140.2>GPU - 120>CPU. Temps are not horrendous but I am hitting 67C max after a couple of hours of Crysis 3 or Bioshock infinite. Trying to keep it all in the case and bring down the temps a little. Thanks in advance.


Sorry to spoil your preference but a pump will not improve your temps. Another rad will. Or sime good sets of fans..


----------



## Jflisk

Quote:


> Originally Posted by *mus1mus*
> 
> Sorry to spoil your preference but a pump will not improve your temps. Another rad will. Or sime good sets of fans..


I read the disclaimer I want to complain.


----------



## mAs81

Quote:


> Originally Posted by *samjitsu*
> 
> Hello guys I'm new here. Just got a *Gigabyte R9 290 WINDFORCE OC*. And I was wondering if it's the same as the *REFERENCE PCB*? I'm planning to hook it up with a *KRAKEN G10* Bracket. NZXT said that _"Compatibility list is limited to reference designs, but non-reference designs will most likely work, provided the board partner did not change the location of the die, die height, or screw spacing."_.
> 
> PS: planning to water cool this because I bought this abroad and brought it here to my country. And Temps get so high even for a non ref cooler. Which means It's really that bad of a cooling system. RMA'ing it would cost me the same as getting it liquid cooled.


You can always ask in the official Kraken club
http://www.overclock.net/t/1487012/official-nzxt-kraken-g10-owners-club
I'm not sure but I think it will fit..
Good luck


----------



## oicwutudidthar

linkerino plz?


----------



## mus1mus

Quote:


> Originally Posted by *Jflisk*
> 
> I read the disclaimer I want to complain.












If you want to, shoot it away..


----------



## DMills

howdy all, just ordered my first sapphire tri-x 290x

its been a long time coming to replace my cf 7850s, but a day of reckoning will soon be at hand


----------



## hwoverclkd

Quote:


> Originally Posted by *DMills*
> 
> howdy all, just ordered my first sapphire tri-x 290x
> 
> its been a long time coming to replace my cf 7850s, but a day of reckoning will soon be at hand


i'd say it's a well-deserved upgrade









have fun!


----------



## samjitsu

Quote:


> Originally Posted by *mAs81*
> 
> You can always ask in the official Kraken club
> http://www.overclock.net/t/1487012/official-nzxt-kraken-g10-owners-club
> I'm not sure but I think it will fit..
> Good luck


I also have a slight feeling it would fit. Gonna decide a week from now whether to pull the trigger on it. If anyone else has the same mod as I wanted kindly inform me. It would be nice of you.


----------



## ace101

Hi! Is it ok to keep my Gigabyte R9 290 Windforce always on performance mode even if I'm not playing?


----------



## Blue Dragon

Quote:


> Originally Posted by *ace101*
> 
> Hi! Is it ok to keep my Gigabyte R9 290 Windforce always on performance mode even if I'm not playing?


if you mean the switch on the card set to performance mode, than yes it's ok.


----------



## Vici0us

Quote:


> Originally Posted by *mAs81*
> 
> Quote:
> 
> 
> 
> Originally Posted by *samjitsu*
> 
> Hello guys I'm new here. Just got a *Gigabyte R9 290 WINDFORCE OC*. And I was wondering if it's the same as the *REFERENCE PCB*? I'm planning to hook it up with a *KRAKEN G10* Bracket. NZXT said that _"Compatibility list is limited to reference designs, but non-reference designs will most likely work, provided the board partner did not change the location of the die, die height, or screw spacing."_.
> 
> PS: planning to water cool this because I bought this abroad and brought it here to my country. And Temps get so high even for a non ref cooler. Which means It's really that bad of a cooling system. RMA'ing it would cost me the same as getting it liquid cooled.
> 
> 
> 
> You can always ask in the official Kraken club
> http://www.overclock.net/t/1487012/official-nzxt-kraken-g10-owners-club
> I'm not sure but I think it will fit..
> Good luck
Click to expand...

I am running Windforce in Crossfire and my temps don't go past 86C. You sure there's nothing wrong with it?


----------



## Mongoose135

Quote:


> Originally Posted by *samjitsu*
> 
> Hello guys I'm new here. Just got a *Gigabyte R9 290 WINDFORCE OC*. And I was wondering if it's the same as the *REFERENCE PCB*? I'm planning to hook it up with a *KRAKEN G10* Bracket. NZXT said that _"Compatibility list is limited to reference designs, but non-reference designs will most likely work, provided the board partner did not change the location of the die, die height, or screw spacing."_.
> 
> PS: planning to water cool this because I bought this abroad and brought it here to my country. And Temps get so high even for a non ref cooler. Which means It's really that bad of a cooling system. RMA'ing it would cost me the same as getting it liquid cooled.


Hi there, the PCB and components look very similar to the ref board - I'd say there are no problems there. It's just the side bracket that's the problem. It would probably get in the way of the G10, so you should check if the bracket can be screwed off.
http://www.coolingconfigurator.com/step1_complist?gpu_gpus=1299


----------



## samjitsu

Quote:


> Originally Posted by *Vici0us*
> 
> I am running Windforce in Crossfire and my temps don't go past 86C. You sure there's nothing wrong with it?


Pretty sure mine's not working like it's supposed to. It gets as loud as a jet plane when I reach that temp range.


----------



## samjitsu

Quote:


> Originally Posted by *Mongoose135*
> 
> Hi there, the PCB and components look very similar to the ref board - I'd say there are no problems there. It's just the side bracket that's the problem. It would probably get in the way of the G10, so you should check if the bracket can be screwed off.
> http://www.coolingconfigurator.com/step1_complist?gpu_gpus=1299


Hey, thanks! I also noticed a slight difference, but most of it is pretty much the same. The side bracket can be screwed off, as you can see here:


----------



## Mega Man

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jflisk*
> 
> I have a system with 120.2 >140.2> 120 radiators in it. I have 3x R9 290x and 1x FX9590. The pump is a D5. Will adding a second D5 pump bring the temps down by increasing the flow or split the loop 120.2>140.2>GPU - 120>CPU. Temps are not horrendous but I am hitting 67C max after a couple of hours of Crysis 3 or Bioshock infinite. Trying to keep it all in the case and bring down the temps a little. Thanks in advance.
> 
> 
> 
> Sorry to spoil your preference but a pump will not improve your temps. Another rad will. Or some good sets of fans..

sorry but you are wrong

it helped me a lot with 3 gpus in serial; gpu blocks are nearly as restrictive as (or more restrictive than) cpu blocks.

they need help to get proper water flow


----------



## Valicy

I'm planning to pick up one of these; I've seen some sweet benchmarks on here. Cool.


----------



## mus1mus

Quote:


> Originally Posted by *Jflisk*
> 
> I have a system with 120.2 >140.2> 120 radiators in it. I have 3x R9 290x and 1x FX9590. The pump is a D5. Will adding a second D5 pump bring the temps down by increasing the flow or split the loop 120.2>140.2>GPU - 120>CPU. Temps are not horrendous but I am hitting 67C max after a couple of hours of Crysis 3 or Bioshock infinite. Trying to keep it all in the case and bring down the temps a little. Thanks in advance.
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Sorry to spoil your preference but a pump will not improve your temps. Another rad will. Or some good sets of fans..
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> sorry but you are wrong
> 
> it helped me alot with 3 gpus in serial gpu blocks are near/just as/more restrictive then cpu blocks.
> 
> they need help to get proper water flow
> 
> 
> 
> 
> 

Neglected to ask about his GPU block config. It was indeed in serial.

Yes, you are right. He needs another pump.

But if you look at his components, he still needs more rads.

Unless he's always at a sub-20C ambient. IMO


----------



## Wezzor

Guys, how much do you gain from overclocking your GPU? Does the extra performance compensate for the shorter lifespan of the card? Say I have an average card, how much of an FPS increase would I see in games?


----------



## Mega Man

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jflisk*
> 
> I have a system with 120.2 >140.2> 120 radiators in it. I have 3x R9 290x and 1x FX9590. The pump is a D5. Will adding a second D5 pump bring the temps down by increasing the flow or split the loop 120.2>140.2>GPU - 120>CPU. Temps are not horrendous but I am hitting 67C max after a couple of hours of Crysis 3 or Bioshock infinite. Trying to keep it all in the case and bring down the temps a little. Thanks in advance.
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Sorry to spoil your preference but a pump will not improve your temps. Another rad will. Or some good sets of fans..
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> sorry but you are wrong
> 
> it helped me alot with 3 gpus in serial gpu blocks are near/just as/more restrictive then cpu blocks.
> 
> they need help to get proper water flow
> 
> 
> 
> 
> 
> 
> Neglected to ask about his GPU block config. It was indeed in serial.
> 
> Yes, you are right. He needs another pump.
> 
> But if you look at his components, he still needs more rads.
> 
> Unless he's always at a sub-20C ambient. IMO

never said more rads wouldn't help, just that more pump would too


----------



## jagdtigger

I think this is the max for my card, 1240 MHz +200mV:
http://www.3dmark.com/3dm/3977189?

GPU: 48 °C
VRM Temp 1: 60 °C
VRM Temp 2: 37 °C


----------



## rdr09

Quote:


> Originally Posted by *jagdtigger*
> 
> I think this is the max for my card, 1240 MHz +200mV:
> http://www.3dmark.com/3dm/3977189?
> 
> GPU: 48 °C
> VRM Temp 1: 60 °C
> VRM Temp 2: 37 °C


something is off. i get that graphics score at 1200 core with a 290.

http://www.3dmark.com/3dm/3147012?


----------



## jagdtigger

Quote:


> Originally Posted by *rdr09*
> 
> something is off. i get that graphics score at 1200 core with a 290.
> 
> http://www.3dmark.com/3dm/3147012?


My CPU is only an i5 4670K at 4.5 GHz... And I still haven't figured out why my card runs at PCIe 3.0 x8 max...


----------



## rdr09

Quote:


> Originally Posted by *jagdtigger*
> 
> My CPU is only an i5 4670k at 4,5 GHz... And i didnt solved why my card runs in 3.0 x8 max...


it's not the cpu. the graphics score is not influenced much by the cpu in FS - at least not between an i7 and an i5.

it's the gpu. lower your oc a bit to 1200/1300. did you raise the power limit? if not, set it to the max (+50%).

don't use CCC. make sure you disable Overdrive in CCC. use the latest AB or Trixx - just one, and do not mix with other apps like GPU-Z.


----------



## jagdtigger

Quote:


> Originally Posted by *rdr09*
> 
> it's not the cpu. graphics score is not too influenced by the cpu in fs - at least not between an i7 and an i5.
> 
> its the gpu. lower your oc a bit to 1200/1300. you raised the power limit? if not, set it to max - 50%.
> 
> don't use CCC. make sure you disable Overdrive in CCC. use the latest AB or Trixx. just one and do not mix with other apps like GPUZ.


It's a tricky thing: if I want to raise the power limit, then I only get +100 mV (AB doesn't let me set higher than 100). On the other hand, in Trixx I can set +200 mV but there's no power limit option...


----------



## rdr09

Quote:


> Originally Posted by *jagdtigger*
> 
> Its a tricky thing, if i want to raise the power limit then i have only +100 mV(AB dont let me set higher than 100). On the other hand in trixx i can set +200 mV but no power limit option...


in Trixx, use the slider on the side. you should see the Power Limit there. shoot for 1200 core and 1300 mem.

compare the graphics scores, not the overall.


----------



## ZealotKi11er

His problem is the power limit. Nothing to see here. +200 mV will need the raised power limit or the core will downclock.


----------



## jagdtigger

+200 mV
+50 % power limit
GPU Core 1200 http://www.3dmark.com/3dm/3978256?
1230 http://www.3dmark.com/3dm/3978296?
1240 http://www.3dmark.com/3dm/3978336?


----------



## devilhead

Quote:


> Originally Posted by *jagdtigger*
> 
> +200 mV
> +50 % power limit
> GPU Core 1200 http://www.3dmark.com/3dm/3978256?
> 1230 http://www.3dmark.com/3dm/3978296?
> 1240 http://www.3dmark.com/3dm/3978336?


here is mine at 1260/1700







http://www.3dmark.com/fs/2705050


----------



## jagdtigger

Quote:


> Originally Posted by *devilhead*
> 
> here is mine at 1260/1700
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/2705050


Nice. I'm curious how high mine can go...


----------



## mus1mus

So, is Trixx better than AB?


----------



## chronicfx

Anyone here able to use Afterburner with Battlefield 4? For some reason, since I reloaded my OS (8.1) I am not able to use BF4 with Afterburner to monitor FPS and control GPU fans. I used to not have this issue, but now it crashes on initial load with AB active.

Nevermind, edit: enable Origin in-game.


----------



## jagdtigger

Quote:


> Originally Posted by *mus1mus*
> 
> So, is Trixxx better than AB?


For some reason AB won't let me set a +200 mV offset....

As for the GPU clock, the highest stable is 1240 :/ .


----------



## mus1mus

Quote:


> Originally Posted by *jagdtigger*
> 
> For some reason AB wont let me set +200 mV offset....
> 
> As for the GPU clock the highest stable is 1240 :/ .


I have tried Trixx but it seems to lock my clocks.

Will try it again then..

What about GPU Tweak, which allows up to 1.4?

Nice score btw..


----------



## rdr09

Quote:


> Originally Posted by *jagdtigger*
> 
> For some reason *AB wont let me set +200 mV offset*....
> 
> As for the GPU clock the highest stable is 1240 :/ .


see post #1. Arizonian included that info and many more.

btw, at 1200 or even a bit higher might not need +200. try +175, +180, etc. keep the memory at stock while trying to figure out core oc.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> see post #1. Arizonian included that info and many more.
> 
> btw, at 1200 or even a bit higher might not need +200. try +175, +180, etc. keep the memory at stock while trying to figure out core oc.


That method is very confusing. I only need +125 mV for one card, and I have to go to all the trouble of doing that MSI AB hack, which is not really permanent or easy.


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> That method is very confusing. I only need +125 mV for one card and i have to go to all the trouble to do that MSI AB hack is is not really permanent and easy.


i agree, that's why i use Trixx myself. +200 out of the box. just showed jag that there is a way with AB. it's up to him/her to use it.


----------



## rdr09

Quote:


> Originally Posted by *rdr09*
> 
> i agree that's why i use Trixx myself. 200 out the box. just showed jag that there is a way with AB. it's up to him/her to use.


i think the author of AB does not want to be responsible for borked gpus. lol

wrong button again.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> i agree that's why i use Trixx myself. 200 out the box. just showed jag that there is a way with AB. it's up to him/her to use.


Problem is Trixx does not work with CF because the power limit slider gets removed for the cards.

ASUS Tweak is a mess.


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Problem is Trixx does not work with CF because the power limit slider will be removed for the cards.
> 
> ASUS Tweak is a mess.


no choice for crossfire users that are benching. AB hack is the only answer.


----------



## Raul-7

Why is it that when I run 1100 [in Valley] with +10mV I get a higher score than when I run it at +50mV?


----------



## BradleyW

Quote:


> Originally Posted by *Raul-7*
> 
> Why is it when I run 1100 [in Valley] with +10mV I get a higher score than when I run it at +50mV?


Too much voltage can cause a degree of instability or reduced performance due to temperature increases or voltage tolerances.


----------



## Jflisk

Quote:


> Originally Posted by *mus1mus*
> 
> Neglected to ask about his GPU block config. It was indeed in serial.
> 
> Yes, you are right. He needs another pump.
> 
> But if you look at his components, he still needs more rads.
> 
> Unless he's always at a sub-20C ambient. IMO


No sub-20C here. So another 120 rad and a second pump should fix the temps. Thanks guys.

@Mega - How much of a drop will I see with the second pump? Thanks


----------



## mojobear

Hey, I didn't know that! I'm running quadfire and I see the power limit slider for my first card, which I think applies to all cards.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> Problem is Trixx does not work with CF because the power limit slider will be removed for the cards.
> 
> ASUS Tweak is a mess.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Problem is Trixx does not work with CF because the power limit slider will be removed for the cards.
> 
> ASUS Tweak is a mess.
> 
> 
> 
> no choice for crossfire users that are benching. AB hack is the only answer.

I'd have never placed in any top 10 using AB, lol. Or Trixx for that matter. That leaves only... the dreaded GPU Tweak, but then again it works for the purpose of benching.


----------



## kizwan

Quote:


> Originally Posted by *mojobear*
> 
> Hey I didn't know that! I'm quadfire and I see the power limit slider for my first card which I think applies to all cards.
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Problem is Trixx does not work with CF because the power limit slider will be removed for the cards.
> 
> ASUS Tweak is a mess.

The power limit is also available here in Trixx with crossfire.


----------



## Han Bao Quan

Can anyone here help me answer this question?
I have 2 290x DirectCU II cards in Crossfire. After several cooling tweaks (reapplied TIM, replaced case fans...) I got my cards to top out at 78 degrees (top) and 72 degrees (bottom) in Unigine Valley. My case is the Fractal Design XL R2, and I'm thinking of getting the Blackhawk Ultra to help with the temps a little more. My question is: will the Blackhawk give me lower GPU temps compared to what I have now?

Thanks.


----------



## th3illusiveman

i noticed that the Asus BIOS people used in the 290->290X unlock thread has bigger Vdroop than the Sapphire Tri-X BIOS i use. So if anyone is using that Asus BIOS and wants to try to keep their clocks stable with lower voltage, you could try that. Maybe it differs from card to card. I'm just wondering if there is a better reference PCB BIOS out there with even smaller Vdroop that can downclock in 2D mode.


----------



## Naeem

I have 2 X Asus R9 290X DCU2 OC

1135 @ 1425

http://www.3dmark.com/3dm/3979975


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> I'd have never placed in any top 10 using AB, lol. Or trixx for that matter. That leaves only... the dreaded gputweak, but then again it works for the purpose of benching.


oh yah, that's in the op, too. do you need PT bios for that?


----------



## Han Bao Quan

Quote:


> Originally Posted by *Naeem*
> 
> I have 2 X Asus R9 290X DCU2 OC
> 
> 1135 @ 1425
> 
> http://www.3dmark.com/3dm/3979975


Can you please tell me what the temps are for your cards?


----------



## Mega Man

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Neglected to ask about his GPU block config. It was indeed in serial.
> 
> Yes, you are right. He needs another pump.
> 
> But if you look at his components, he still needs more rads.
> 
> Unless he's always at a sub-20C ambient. IMO
> 
> 
> 
> No sub 20C here - So another 120 and a double pump a fix for the temps. Thanks guys
> 
> @Mega - How much of a drop will I see with the second pump. Thanks

unfortunately my crystal ball just broke. sorry :/


----------



## Jflisk

Quote:


> Originally Posted by *Mega Man*
> 
> unfortunately my crystal ball just broke. sorry :/


If your crystal ball is broken, we are all in trouble. I was actually looking for a guesstimate.


----------



## Mega Man

really hard to say. depends on how many you add (i added 3), what speed you run them, what speed fans you run.

i dropped well over 20 deg, but i have the rad space needed (on my 7970s).

but mine were full parallel.

i ran serial before and tbh hated it, which is what caused me to switch to full parallel + add 3 pumps.


----------



## thrgk

I need to do a wattage check to make sure I have enough power.

If I upgrade to a 5960X, Asus Deluxe X99, and 16GB of DDR4 RAM, and have 4x 290X, 2 MCP35X pumps, 26 120mm fans, and 2 fan controllers, will 1950 watts be enough? Or will I need more? Planning to overclock everything as well.
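A rough back-of-envelope check can be sketched in a few lines of Python. Every per-component wattage below is a ballpark assumption for heavily overclocked parts, not a measured figure from this thread:

```python
# Rough PSU load estimate for the proposed build.
# All wattages are ballpark assumptions for overclocked parts.
components = {
    "4x R9 290X (overvolted)": 4 * 350,  # an overvolted 290X can approach ~350 W
    "i7-5960X (overclocked)": 300,
    "motherboard + 16GB DDR4": 75,
    "2x MCP35X pumps": 2 * 18,
    "26x 120mm fans": 26 * 3,
    "fan controllers + drives": 50,
}

total_watts = sum(components.values())
headroom = 1950 - total_watts

print(f"Estimated peak draw: {total_watts} W")
print(f"Headroom on a 1950 W PSU: {headroom} W")
```

Under these assumptions the estimate lands right around the 1950 W mark, which is why the wall outlet matters as much as the PSU itself.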


----------



## Jflisk

Quote:


> Originally Posted by *thrgk*
> 
> I need to do a wattage check to make sure i have enough wattage.
> 
> If I upgrade to 5960x, Asus Deluxe x99, ddr4 16gb ram, have 4 290x, 2 mcp35x pumps, 26 120mm fans, 2 fan controllers, will 1950watts be enough? Or will I need more? Planning to overclock everything as well


Are both your power supplies plugged into the same outlet? If so, you might want to separate them. A standard US 120V outlet tops out around 1800W.
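The 1800 W figure is just outlet math: volts times breaker amps. A small sketch (the 0.8 continuous-load derating factor is a general electrical guideline added here, not something stated in the thread):

```python
# US branch-circuit capacity: watts = volts x breaker amps.
# The 0.8 factor is the common continuous-load derating guideline (an assumption here).
def outlet_capacity_watts(volts=120, amps=15, continuous=True):
    capacity = volts * amps
    return capacity * 0.8 if continuous else capacity

print(outlet_capacity_watts(amps=15, continuous=False))  # peak on a 15 A circuit
print(outlet_capacity_watts(amps=15))                    # sustained load, 15 A circuit
print(outlet_capacity_watts(amps=20))                    # sustained load, 20 A circuit
```

So a 15 A circuit peaks at 1800 W but should only carry about 1440 W sustained, which is why splitting two big PSUs across separate circuits is the safer setup.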


----------



## Jflisk

Quote:


> Originally Posted by *Mega Man*
> 
> really hard to do. depends on how many you add ( i added 3 ) what speed you run, whatr speed fans you run
> 
> i dropped well over 20deg but i have the rad space needed ( on my 7970s )
> 
> but mine were full para
> 
> i ran serial before and tbh hated it , which is what caused me to switch to full para +add 3pumps


You have the external 360.3 that's just crazy. I can understand needing 3 pumps for that thing.


----------



## thrgk

Quote:


> Originally Posted by *Jflisk*
> 
> Are both your power supplies plugged into the same outlet. If so you might want to separate them. The standard 120 US outlet is around 1800W.


No, one is on a 20 amp breaker, the other on a 15 amp


----------



## mus1mus

Quote:


> Originally Posted by *Jflisk*
> 
> You have the external 360.3 that's just crazy. I can understand needing 3 pumps for that thing.


You really can't do much without enough rad space IMO.

Plus, those are EX rads, am I right?

If you can switch to a parallel config on the GPUs, replace those Noctuas with higher-speed fans (high-speed GTs), all intake, 20C ambient, another pump, you might get better temps.

For your set-up, you'd need at least a couple of thick 360s as a minimum. More if you are overclocking and overvolting the CPU and the GPUs.

And you'll be left wanting a new case.

Boy, this hobby breaks every guy's wallet. lol


----------



## Gualichu04

I wonder if I should try to push my crossfire R9 290X's past 1150 core / 1300 memory. They are at +69mV and +50 power.


----------



## Mega Man

the external 360s are extremely old build no longer in use.


----------



## pdasterly

price drop on the 295x2, should i sell off my 290x cards?


----------



## joeh4384

Quote:


> Originally Posted by *pdasterly*
> 
> price drop on the 295x2 should i sell off my 290x cards


How many 290x cards do you have?


----------



## Jflisk

Quote:


> Originally Posted by *mus1mus*
> 
> You really can't do much without enough rad space IMO.
> 
> Plus, those are EX rads am I right?
> 
> If you can switch to a parallel config in the GPU, replace those Noctuas with higher speed fans (high speed GTs), all intake, 20C ambient, another pump, you might get better temps.
> 
> For your set-up, you'd need at least a couple thick 360s as minimum. More if you are Overclocking and Overvolting the CPU and the GPUs.
> 
> And you'll be left with wanting a new case.
> 
> Boy, this hobby breaks every guy's wallet. lol


WA - Watercoolers Anonymous. It all started with a D5 pump. Then it got out of hand.


----------



## petedread

Am I right in thinking that when the 290x first went on sale there was no AMD game bundle thingy?


----------



## schoolofmonkey

Quote:


> Originally Posted by *petedread*
> 
> Am I right in thinking that when the 290x first went on sale there was no AMD game bundle thingy?


I just bought a 290x last week and I never got a code for the game bundle; no one (the store or AMD) can tell me why, or if I'm entitled to one..


----------



## ZealotKi11er

Quote:


> Originally Posted by *schoolofmonkey*
> 
> I just bought a 290x last week and I never got a code for the game bundle, no one can tell me (Store or AMD) why or if I'm entitled to one..


If it's not listed at the place you bought the GPU, then you don't get one.


----------



## rdr09

Quote:


> Originally Posted by *schoolofmonkey*
> 
> I just bought a 290x last week and I never got a code for the game bundle, no one can tell me (Store or AMD) why or if I'm entitled to one..


did you try?

http://www.amd4u.com/radeonrewards/


----------



## schoolofmonkey

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If its not listed on the place you bought the GPU then you dont get one.


Ah ok, so it's not available in Australia; none of the stores I buy from list it.
No biggy, just couldn't work it out, and when you have the store and AMD each telling you to ask the other party it makes it frustrating.. lol.

On another note, is it normal for the Sapphire 290x Tri-X to have fairly loud coil whine? I was playing Risen 3 and you can hear it change frequency when you turn.


----------



## skupples

Hello 290/290X Hawaii beach party club!

I have a question to ask all of you wonderful people!

I have been giving serious thought to joining the Red Team, now that the 295x2 has dropped down to $999... Specifically, I have been thinking about setting up tri-fire with a 295x2 and a 290x... I would be selling my 3x Titans, for what would be a near-flat trade after buying the waterblocks...

My question is this... Has anyone seen any benchmarks where the reviewer specifically tests this configuration of 295x2 + 290x? Also, say I only grab the 295x2, how would you run Eyefinity with monitors that don't have DisplayPort plugs?

Yes, I'm aware that Nvidia is about to release a new line of cards; yes, I'm aware that AMD is only a few months from also releasing a new line of products.

Any help would be greatly appreciated, thank you!

Also, to all of you looking to use your Never Settle bundle for the Mustang Racing ship and Star Citizen Arena Commander / Murray Cup alpha access... You won't regret it if a skill-driven space sim is your thing... Those of you that played Braben & Chris Roberts games back in the 90s will definitely love it.


----------



## tsm106

Quote:


> Originally Posted by *skupples*
> 
> Hello 290/290/ Hawaii beach party club!
> 
> I have a question to ask all of you wonderful people!
> 
> I have been putting some serious debate into joining the Red Team, now that 295x2 has dropped down to $999... Specifically, I have been thinking about setting up a tri-fire w/ 295x2 and 290x... I would be selling my 3x titans, for what would be a near flat trade, after buying the waterblocks...
> 
> My question is this... Has anyone seen any benchmarks where the reviewer specifically tests this configuration of 295x2 + 290x? Also, say that I only grab the 295x2, how would you run eyefinity with monitors that don't have Display Port plugs?
> 
> Yes, i'm aware that Nvidia is about to release a new line of cards, yes i'm aware that AMD is only a few months from also releasing a new line of products.
> 
> Any help would be greatly appreciated, thank you!
> 
> also, to all of you looking to use your Never Settle Bundle for the Mustang Racing ship, and Star Citizen Arena Commander / Murray Cup alpha access... You won't regret it if skill driven space sim is your thing... Those of you that played Braben & Chris Roberts games back in the 90s will definitely love it.


The 295X2 is slightly slower, but nothing to be alarmed at unless you really want to bench. This is really no different from the 7990 that came before, which was again slower than two discrete cards. This generally holds true for all the dual-GPU cards. Adding a 290x or more to the party is no problemo, and the drivers and frame-pacing hardware will keep the trio synched.

Now for the tough news: there is really no point in getting a 295X2 unless you are running panels with DP natively, otherwise the cost of the adapters starts to defeat the point of getting the 295 in the first place. For example, running triple 1440p over DVI = adding two hundred on top for two DL adapters, and you've just defeated the point of buying a GPU with 4 DP outputs. Using mixed-port setups can, and most likely will, create sync/tearing issues due to the variations between the ports. One of the key points of the 295 is to get rid of the mixed-port tearing and sync issues that plague so many setups.

For example, watch the video from this post.

http://www.overclock.net/t/1511486/amd-needs-your-help-catalyst-driver-special-submission-form/0_40#post_22805749

This user has panels that support DP natively, yet he is using DVI lol. All he needs to do is swap to DP and get a DP 1.2 hub; voila, no more tearing.


----------



## skupples

Quote:


> Originally Posted by *tsm106*
> 
> 295X2 is slightly slower but nothing to alarmed at unless you really want to bench. This is really no different than the 7990 that came before, which was again slower than two discrete cards. This line generally holds true for all the dual gpu cards. Adding a 290x or more to the party is no problemo and the drivers and frame pacing hardware will keep the trio synched.
> 
> Now for the tough news, there is really no point in getting a 295X2 unless you are running panels with DP natively, otherwise the cost of getting adapters start to defeat the point of getting the 295 in the first place. For ex, running triple 1440P with DVI = adding two hundred on top for two DL Adapters, and you've just defeated the point of buying a gpu with 4 DP. Using mixed port setups can though mostly likely will create sync/tearing due to the variations from using mixed ports. One of the key points to the 295 is to get rid of the mixed port tearing and synch issues that plague so many setups.
> 
> For ex. watch the video from this post.
> 
> http://www.overclock.net/t/1511486/amd-needs-your-help-catalyst-driver-special-submission-form/0_40#post_22805749
> 
> This user has panels that support DP nativbely, yet he is using DVI lol. All he needs to do is swap to DP and get a DP 1.2 hub, voila no more tearing.


I have seen the tearing you speak of first hand; it is hideous and game-breaking... I believe it was caused by using DP, mini HDMI, and DVI... on a 7970 setup...

Someone just made me an offer of $1621 for all three Titans... Oh, what have I gotten myself into.

The tearing I witnessed was between screens... like there was a few ms of latency between left/right and center. Looked TERRIBLE...

Seems the Gigabyte 290x is at $499... Hmm...


----------



## sugarhell

Just get 3 290s for cheap, skup...


----------



## HOMECINEMA-PC

Yeah I luv me 290's but I wish I had 4 DP ports.


----------



## tsm106

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yeah I luv me 290's but I wish I had 4 DP ports.


That's the gist of it. And if you are using triple panels with a 295 they better be DP otherwise what's the point?


----------



## skupples

Quote:


> Originally Posted by *sugarhell*
> 
> Just get 3 290 for cheap skup...


I have made peace with myself, for now. I have gone back to selling only one unit; with the proceeds I'll build out my second system with the $350 290s Amazon is always listing.

Win win!


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *tsm106*
> 
> That's the gist of it. And if you are using triple panels with a 295 they better be DP otherwise what's the point?


Good to hear from ya man.

I find with just one DP that when I start to really push things along I get severe instability and it automatically enables 'black screen benchmarking' mode.


----------



## Dasboogieman

Quote:


> Originally Posted by *schoolofmonkey*
> 
> Ah ok, so its not available to Australia, none of the store I buy from list it.
> No biggy, just couldn't work it out, and when you have the store and AMD telling you to ask the other party make it frustrating..lol.
> 
> On another note, is it normal for the Sapphire 290x Tri-X to have fairly loud coil whine, I was playing Risen 3 and you can hear it change frequencies when you turn.


Yeah, the Tri-X cards have 2 annoying sources of noise.
1. The infamous fan vibration against the metal frame. This often occurs at around 50% fan speed; I fixed it by padding the heatsink and fan frame with bits of an old kitchen rubber glove.

2. The coil whine is extremely noticeable due to how quiet the Tri-X design is. I fixed it by padding the chokes with Sekisui thermal tape; hot electronics silicone glue works great as well.


----------



## heroxoot

Has anyone had good experience with force smooth playback/enable for internet video? Turning on smooth playback makes my flash videos look less laggy but then the video freezes in spots. I tend to have to watch twitch in full screen to get a good smooth playback. Can't use the chat like that.


----------



## schoolofmonkey

Quote:


> Originally Posted by *heroxoot*
> 
> Has anyone had good experience with force smooth playback/enable for internet video? Turning on smooth playback makes my flash videos look less laggy but then the video freezes in spots. I tend to have to watch twitch in full screen to get a good smooth playback. Can't use the chat like that.


I've had that on a GTX 780 Ti and the 290x; I just disabled hardware acceleration in Flash and Firefox.
All good now.

I know its not the cards as both benchmark and game perfectly.


----------



## heroxoot

Quote:


> Originally Posted by *schoolofmonkey*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Has anyone had good experience with force smooth playback/enable for internet video? Turning on smooth playback makes my flash videos look less laggy but then the video freezes in spots. I tend to have to watch twitch in full screen to get a good smooth playback. Can't use the chat like that.
> 
> 
> 
> I've had that on a GTX780ti and the 290x, I've just disabled hardware acceleration on Flash and Firefox
> All good now.
> 
> I know its not the cards as both benchmark and game perfectly.

It's all disabled already. Problem is my GPU still upclocks a bit for Flash. When I don't fullscreen, it clocks up less and sometimes not enough. So I have to fullscreen and it fixes it, but not for long. It's weird that it plays smoother in fullscreen than in a regular window.


----------



## schoolofmonkey

Quote:


> Originally Posted by *heroxoot*
> 
> It's all disabled already. Problem is my GPU still upclocks a bit for flash. When I don't fullscreen it clocks up less and sometimes not enough. So I have to fullscreen and it fixes it, but not for long. It's weird that it plays smoother in fullscreen than regular windowed.


That's strange, I don't see any of my GPU usage go up when browsing on either card.
Are you using Firefox or another browser?


----------



## mus1mus

Not an issue here myself. Even without looking for those acceleration options..


----------



## heroxoot

Quote:


> Originally Posted by *schoolofmonkey*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> It's all disabled already. Problem is my GPU still upclocks a bit for flash. When I don't fullscreen it clocks up less and sometimes not enough. So I have to fullscreen and it fixes it, but not for long. It's weird that it plays smoother in fullscreen than regular windowed.
> 
> 
> 
> That's strange, I don't see any of my GPU usage go up when browsing on either card.
> Are you using Firefox or another browser?

Firefox. For Flash it clocks up to 500-ish MHz. It's just Flash; it's been doing this for quite some time. At one point my card didn't clock up at all for Flash with hardware accel disabled.


----------



## Yuriewitsch

If anyone's interested, I finally made a little review of VRM modding for my reference Sapphire R9 290 + installation of the Arctic Accelero Xtreme IV.

You can check it here:
Sapphire R9 290 + Arctic Accelero Xtreme IV + VRM mod


----------



## Devildog83

Looking to switch from my current set-up (7870/270x) to a single 290/290x under water. Could anyone tell me which would be the best card to buy that would be compatible with most or all full-cover blocks? Thanks.

My old set-up until the 7870 died.


----------



## rdr09

Quote:


> Originally Posted by *Devildog83*
> 
> Looking to switch from my current set-up (7870/270x) to a single 290/290x under water. Could anyone tell me which would be the best card to buy that would be compatible to most or all full cover blocks? Thanks.
> 
> My old set-up until the 7870 died.


Man! that's a very nice rig.

I recommend a 290X REFERENCE if you are into benching. If you don't mind used (say in the bay) just make sure you ask the seller about any issues. Preferably from MSI or XFX. I can't recommend non-reference 'cause you are watercooling with an "EK" block. hint

also, i would test them first on air (i know you know this). check out the ocn market. i saw some decent prices recently, comparable to the bay.

i read about your card.


----------



## Devildog83

Quote:


> Originally Posted by *rdr09*
> 
> Man! that's a very nice rig.
> 
> I recommend a 290X REFERENCE if you are into benching. If you don't mind used (say in the bay) just make sure you ask the seller about any issues. Preferably from MSI or XFX. I can't recommend non-reference 'cause you are watercooling with an "EK" block. hint
> 
> also, i would test them first on air (i know you know this). checkout the ocn market. i saw some decent prices recently comparable to the bay.
> 
> i read about your card.


Thank you for all of that.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Devildog83*
> 
> Looking to switch from my current set-up (7870/270x) to a single 290/290x under water. Could anyone tell me which would be the best card to buy that would be compatible to most or all full cover blocks? Thanks.
> 
> My old set-up until the 7870 died.


Reference is always best for making sure the blocks will fit.

(This is all "as far as i know")

MSI has changed their design again so the EK Rev 2.0 block won't fit, Gigabyte WF3 cards have a block made for them by EK (Rev 2.0), XFX has a reference design, and the DCU II has its own block made by EK.
The Tri-X is also a reference design, the Vapor-X doesn't have a block (and I'm not sure if it will), and I really don't know about PowerColor's PCS+, but I think it's a ref PCB as well


----------



## rdr09

Quote:


> Originally Posted by *Devildog83*
> 
> Thank you for all of that.


i recommended those brands for their warranty. msi has 3yrs, while xfx has lifetime limited. these cards are less than a year old.

so, ask the seller about time of purchase, bios flashing, stock cooler mod, etc.

btw, powercolor's warranty is not transferable. avoid it unless the seller is willing to help. go for reference.


----------



## heroxoot

MSI warranty is the best also because they don't ask for proof. All they need is your GPU/Mobo serial number and for you to send it in with the RMA slip. No questions asked. They just won't do anything if it is physically damaged.


----------



## VSG

Isn't Gigabyte similar, in that they go by serial number only? Or is that just something I read wrong on a few forums?


----------



## bluedevil

Quote:


> Originally Posted by *geggeg*
> 
> Isn't Gigabyte similar, in that they go by serial number only? Or is that just something I read wrong on a few forums?


Yep.

http://rma.gigabyte.us/


----------



## VSG

Quote:


> Originally Posted by *bluedevil*
> 
> Yep.
> 
> http://rma.gigabyte.us/


Yup, that's what I figured. Thanks a lot, +1 (it's the thought that counts, seeing how I can't rep you anyway).

If everything goes as planned, I will soon rejoin with a Gigabyte 290


----------



## bluedevil

Quote:


> Originally Posted by *geggeg*
> 
> Yup, that's what I figured. Thanks a lot, +1 (it's the thought that counts, seeing how I can't rep you anyway).
> 
> If everything goes as planned, I will soon rejoin with a Gigabyte 290


Been looking at replacing my VisionTek with a Gigabyte too... dunno if it's worth it.


----------



## Jflisk

Quote:


> Originally Posted by *mus1mus*
> 
> You really can't do much without enough rad space IMO.
> 
> Plus, those are EX rads am I right?
> 
> If you can switch to a parallel config in the GPU, replace those Noctuas with higher speed fans (high speed GTs), all intake, 20C ambient, another pump, you might get better temps.
> 
> For your set-up, you'd need at least a couple thick 360s as minimum. More if you are Overclocking and Overvolting the CPU and the GPUs.
> 
> And you'll be left with wanting a new case.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Boy, this hobby breaks every guy's wallet. lol


So I just pulled the trigger on a Monsta 240 dual rad. Hopefully that will bring the temps down a little. I will make the investment in a D5 dual res and an extra pump on the next run. I'll wind up with 1x 120 > 280.2 > Monsta 240.2. The flow in my system looks good by the flow sensor. What do you guys think?


----------



## Devildog83

The Tri-X 290 and Komodo look like a great combo. I think about $450 could get that plus fittings, more acrylic tube and another rad.


----------



## rdr09

Quote:


> Originally Posted by *Devildog83*
> 
> The Tri-X 290 and Komodo look like a great combo. I think about $450 could get that plus fittings, more acrylic tube and another rad.


With Sapphire, the warranty is void when you modify the cooling system.


----------



## Devildog83

Quote:


> Originally Posted by *rdr09*
> 
> With Sapphire, the warranty is void when you modify the cooling system.


That is true, but I can get one for $275 used and they are good clockers. Still in the 1st phase of figuring out what to do here. I might sell my 270X Devil and get something very soon though, and this would stay cool and perform well until I get it wet.


----------



## ozzy1925

today, i sent the 2 cards in to service. i hope they replace the faulty fans


----------



## rdr09

Quote:


> Originally Posted by *Devildog83*
> 
> That is true, but I can get one for $275 used and they are good clockers. Still in the 1st phase of figuring out what to do here. I might sell my 270X Devil and get something very soon though, and this would stay cool and perform well until I get it wet.


the advantage of the Tri-X is it has a reference design. Nice price for a good clocker. Sell the 270X.


----------



## ozzy1925

@rdr09
whats the best driver for 3dmark bench ?


----------



## Devildog83

Quote:


> Originally Posted by *rdr09*
> 
> the advantage of the Tri-X is it has a reference design. Nice price for a good clocker. Sell the 270X.


Yep, I will have to make a deal with my Devil.


----------



## Gobigorgohome

Hello, I need some advice. I have just purchased a highly overclockable 4930K for my build; I will run it at the overclock which gives me the best framerates at 4K (but not above 70C and 1.4V). I would also like to overclock my RAM at 2133/2400/2666 and use the PT1/PT3 BIOS on all four of my cards. Has anyone done anything similar who can share how big the difference is in-game using stock clocks vs. overclocked CPU/RAM/GPUs? Everything is water cooled, not using a chiller.

Tsm, you have four cards right? From your benchmark-runs of FireStrike Extreme I know you have some nice clocks.


----------



## DMills

woot just got mine in the mail today


Spoiler: :-D






validation
gotta grab a can of air and do some cleaning now


----------



## amptechnow

Quote:


> Originally Posted by *DMills*
> 
> woot just got mine in the mail today
> 
> 
> Spoiler: :-D
> 
> 
> 
> 
> 
> 
> gotta grab a can of air and do some cleaning now


welcome to the party!


----------



## pdasterly

sent 290x in today to sapphire for rma, fingers crossed


----------



## mus1mus

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Reference is always best for making sure the blocks will fit.
> 
> (This is all "as far as i know")
> 
> MSI has changed their design again so the EK Rev 2.0 block won't fit, Gigabyte WF3 cards have a block made for them by EK (Rev 2.0), XFX has a reference design, and the DCU II has its own block made by EK.
> The Tri-X is also a reference design, *the Vapor-X doesn't have a block and I'm not sure if it will*, and I really don't know about PowerColor's PCS+, but I think it's a ref PCB as well


It has now..

I think this has been swarming EK's FB page for a week or two now..


----------



## th3illusiveman

BIOS flash didn't fix my black-screen-at-login-window issues :/ The stock card is 1040/1350, the Asus runs at 1000/1250, and the issue persists.


----------



## th3illusiveman

Quote:


> Originally Posted by *th3illusiveman*
> 
> Bios flash didn't fix my black screen login window issues :/ stock card is 1040/1350 asus runs at 1000/1250 and the issue persists.


Might swap out my card for the 7970 when I have time, to see if it's the card or the drivers at fault.


----------



## mus1mus

what issue is that exactly? root?


----------



## Arizonian

Quote:


> Originally Posted by *DMills*
> 
> woot just got mine in the mail today
> 
> 
> Spoiler: :-D
> 
> 
> 
> 
> 
> 
> gotta grab a can of air and do some cleaning now


Congrats - added









Just drop a GPU-Z validation link with your OCN name showing in your original post, linked underneath.
http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/29350#post_22817220

Quote:


> Originally Posted by *Yuriewitsch*
> 
> If anyone's interested, I finally made a little review of VRM modding for my reference Sapphire R9 290 + installation of Arctic Accelero Xtreme IV.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You can check it here:
> Sapphire R9 290 + Arctic Accelero Xtreme IV + VRM mod


Nice illustration on what you did. I've added this to OP Helpful Info section as an informational source if anyone else is interested.


----------



## Yuriewitsch

Quote:


> Originally Posted by *Arizonian*
> 
> Nice illustration on what you did. I've added this to OP Helpful Info section as an informational source if anyone else is interested.


Thx Arizonian.


----------



## Sgt Bilko

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Reference is always best for making sure the blocks will fit.
> 
> (This is all "as far as i know")
> 
> MSI has changed their design again so the EK Rev 2.0 block won't fit, Gigabyte WF3 cards have a block made for them by EK (Rev 2.0), XFX has a reference design, and the DCU II has its own block made by EK.
> The Tri-X is also a reference design, *the Vapor-X doesn't have a block and I'm not sure if it will*, and I really don't know about PowerColor's PCS+, but I think it's a ref PCB as well
> 
> 
> 
> It has now..
> 
> I think this has been swarming EK's FB page for a week or two now..

Sweet, got a mate who will be stoked to hear about this


----------



## bluedevil

Trying to figure out if my 290 from VisionTek is a reference model or not. Might go full-on water with EK blocks.









https://www.visiontek.com/graphics-cards/visiontek-radeon-r9-290-detail.html

http://www.amazon.com/VisionTek-Products-Express-Graphics-900653/dp/B00GSQ1C5C/ref=sr_1_1?s=electronics&ie=UTF8&qid=1410245558&sr=1-1&keywords=visiontek+r9+290


----------



## mus1mus

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Sweet, got a mate who will be stoked to hear about this












Clean version for the win!


----------



## Mega Man

i used to say that, but i have to admit the crop circles look epic, with the right tlc
http://www.overclock.net/t/1434852/ek-waterblock-polishing-guide/0_100

with the right skill and time they really shine with an LED. i have to admit when i see that i actually admire the blocks.... a lot, esp with LEDs


----------



## egemenbor

Hi,

I want to get an aftermarket cooler for my reference MSI 290, and from what I understand I have two choices: the Accelero Xtreme IV or the Prolimatech MK-26. The problem is neither offers decent cooling for the VRMs.

I'm getting mixed reviews. Some say I'll be fine, others don't recommend it, and clearly someone on here has gone to the extent of making his own heatsinks for the VRMs, so I'm going to go ahead and assume that without sufficient cooling neither will be a viable option.

My question is: which heatsinks do I get?


----------



## mus1mus

Quote:


> Originally Posted by *Mega Man*
> 
> i used to say that, but i have to admit the crop circles look epic, with the right tlc
> http://www.overclock.net/t/1434852/ek-waterblock-polishing-guide/0_100
> 
> with the right skill and time they really shine with a led, i have to admit when i see that i actually admire the blocks.... alot, esp with LEDS


They're not too bad. But with polishing, they really become gorgeous.

Performance ain't that bad either..



Yeah, not as good as AC with the Backplate, but still good..


----------



## Yuriewitsch

Quote:


> Originally Posted by *egemenbor*
> 
> Hi,
> 
> I want to get an aftermarket cooler for my reference MSI 290 and from what I understand I have two choice the accelero iv or prolimatech mk26. The problem is neither offer decent cooling for the VRMs
> 
> Im getting mixed reviews some say Ill be fine other dont recommend it and clearly someone on here has gone the extent of making his own heatsink for the vrm's so Im going to go ahead and assume that without sufficient cooling neither will be a viable option.
> 
> My question is which heatsinks do I get?


I made some VRM heatsinks for my R9 290 with AC Xtreme IV:
Sapphire R9 290 + Arctic Accelero Xtreme IV + VRM mod

But if you don't feel like making your own VRM heatsinks, you could also consider a combo:

a CPU AiO cooler (such as the Corsair H55) + NZXT G10 bracket + GELID Enhancement kit for AMD R9 290/290X


----------



## egemenbor

Yeah, kudos to you for being able to manage that, but it's definitely not something I can manage.

Would the GELID enhancement kit + AC IV/Prolimatech be enough? Because at this point I might as well sell the card and buy one with an aftermarket cooler instead of spending that much money on cooling. I'm not a hardcore overclocker, so I don't need amazing temperatures. I'm just after lower noise for the time being.


----------



## Yuriewitsch

Yes, it would be OK, but I'm not 100% sure if that GELID VRM heatsink will fit between the card and the Accelero or Prolimatech (i.e. if the GELID isn't too tall).
Maybe someone else can confirm that.









If you choose the Accelero, make sure you can fit the AC Xtreme IV in your PC; bigger CPU coolers can collide with its black backplate. Installation is also a little tricky, but that backplate helps the temps a little.
The Prolimatech requires purchasing at least two 120mm fans, so that would be the more expensive choice.
Or, if you can sell your card for a nice price, something like the Sapphire Tri-X would also be a decent choice.


----------



## mus1mus

Anybody tried slapping an H220 block onto a video card?


----------



## Devildog83

Quote:


> Originally Posted by *Mega Man*
> 
> i used to say that, but i have to admit the crop circles look epic, with the right tlc
> http://www.overclock.net/t/1434852/ek-waterblock-polishing-guide/0_100
> 
> with the right skill and time they really shine with a led, i have to admit when i see that i actually admire the blocks.... alot, esp with LEDS


They look sweet, but since EK is supposed to be premier, how come they don't come like that to begin with? Just playing Devil's advocate.


----------



## b0z0

I'm thinking about going with a reference R9 290. I'm worried about the noise and heat while using my FT03B micro-ATX case. Has anyone had any experience with an mATX build running the reference card? I'm going to try skimming through all the pages, but almost 3000, jeeze.


----------



## Zipperly

Quote:


> Originally Posted by *b0z0*
> 
> I'm thinking about going with a reference R9 290. I'm worried about the noise and heat while using my FT03B micro-ATX case. Has anyone had any experience with an mATX build running the reference card? I'm going to try skimming through all the pages, but almost 3000, jeeze.


If you are worried about noise and heat using a micro atx case then you should probably consider something else.


----------



## mus1mus

Quote:


> Originally Posted by *Zipperly*
> 
> If you are worried about noise and heat using a micro atx case then you should probably consider something else.


This /\

or watercooling.


----------



## tsm106

Quote:


> Originally Posted by *mus1mus*
> 
> Anybody tried slapping an h220 block to a video card?


That sort of defeats the point, no? It's made to hook up to other blocks, so why not add a real GPU block?


----------



## mus1mus

Cause I've seen guys selling the pump/block for cheap.







fast getaway.

We can see red mods here and there (AiO on a GPU) but no one has used the said block.

Plus, it has its own pump that could help with my flow.

Anyway, a full cover always makes things cooler and nicer...


----------



## tsm106

Quote:


> Originally Posted by *mus1mus*
> 
> Cause I've seen guys selling the pump/block for cheap.
> 
> 
> 
> 
> 
> 
> 
> fast getaway.
> 
> We can see red mods here and there (AiO on a GPU) but no one has used the said block.
> 
> Plus, it has its own pump that could help with my flow.
> 
> Anyway, a full cover always makes things cooler and nicer...


AIOs usually have their own pump... you know, otherwise it wouldn't work so well.

You missed the point. The H220 is expandable; it's made to allow more blocks, rads, whatever, to enlarge the loop, so get that cheap H220 and add a GPU block to it. However, the pump they include in the H220 isn't very strong, just keep that in mind, but I think it can support one more block.

*Oh, you're thinking of adding it to boost flow? That's not the way I would go about it, especially considering how weak closed-loop pumps are.


----------



## b0z0

Well it will be in a Silverstone FT03B case with a intake 120mm fan blowing air straight into it.


----------



## hwoverclkd

Quote:


> Originally Posted by *mus1mus*
> 
> Cause I've seen guys selling the pump/block for cheap.
> 
> 
> 
> 
> 
> 
> 
> fast getaway.
> 
> We can see red mods here and there (AiO on a GPU) but no one has used the said block.
> 
> Plus, it has its own pump that could help with my flow.
> 
> Anyway, a full cover always makes things cooler and nicer...


On some GPU cards, you can do nicely with just a Kraken G10 or a universal block (like the EK VGA Supremacy), which is cheaper than a full cover -- if you're really after a 'cost-effective' way. You might need to stick on some heatsinks + a little fan. You can still make it look good if you're creative









Of course, full cover definitely looks better and, most of the time, works more efficiently.


----------



## rdr09

Hey wake up!

13 driver at 1320 core

http://www.3dmark.com/3dm/2072325

14.3 driver at 1250 core

http://www.3dmark.com/3dm/4004448


----------



## Jflisk

Check the EK configurator here. By model number, yours is reference.

http://www.coolingconfigurator.com/


----------



## Jflisk

Quote:


> Originally Posted by *bluedevil*
> 
> Trying to figure out if my 290 from VisionTek is a reference model or not. Might go full on water with EK blocks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.visiontek.com/graphics-cards/visiontek-radeon-r9-290-detail.html
> 
> http://www.amazon.com/VisionTek-Products-Express-Graphics-900653/dp/B00GSQ1C5C/ref=sr_1_1?s=electronics&ie=UTF8&qid=1410245558&sr=1-1&keywords=visiontek+r9+290


Check the EK configurator here:
http://www.coolingconfigurator.com/

Yours looks to be reference. Thanks


----------



## KeepWalkinG

Quote:


> Originally Posted by *rdr09*
> 
> Hey wake up!
> 
> 13 driver at 1320 core
> 
> http://www.3dmark.com/3dm/2072325
> 
> 14.3 driver at 1250 core
> 
> http://www.3dmark.com/3dm/4004448


Is this only for this test, or for all of them?

Have you tried Unigine Valley 1.0 or Heaven 4.0?


----------



## mus1mus

Quote:


> Originally Posted by *tsm106*
> 
> AIO usually have their own pump... you know otherwise it wouldn't work so good.
> 
> You miss the point. The H220 is expandable, it's made to allow more blocks, rads whatever to enlarge the loop, so get that cheap h220 and add a gpu block too it. However, the pump they include in the h220 isn't very strong, just keep that in mind but I think it can support one more block.
> 
> *Oh you're thinking of adding it to boost flow? That's not the way I would go about it especially considering how weak closed loop pumps are.


Quote:


> Originally Posted by *acupalypse*
> 
> On some gpu cards, you can do nicely with just a kraken g10 or universal block (like EK vga supremacy) which is cheaper than full cover -- if you're really after a 'cost-effective' way. Might need to stick some heatsink + a little fan. You can still make it look good if you're creative
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Of course, full cover definitely looks better and, most of the time, works more efficiently.


Full cover then







.

Thanks guys. My card being reference won't be an issue with full-cover blocks, so looking elsewhere isn't a good idea, as you guys pointed out.
Quote:


> Originally Posted by *Jflisk*
> 
> Check the EK configurator here. By model number, yours is reference.
> 
> http://www.coolingconfigurator.com/


----------



## rdr09

Quote:


> Originally Posted by *KeepWalkinG*
> 
> This is only for this test or for all?
> 
> Have you tried unigine valley 1.0 or heaven 4.0?


not much change in Heaven. i don't have valley.

13 Driver at 1300 core

http://www.3dmark.com/3dm11/7748464

14.X Driver at 1300 core

http://www.3dmark.com/3dm11/8701258


----------



## heroxoot

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KeepWalkinG*
> 
> This is only for this test or for all?
> 
> Have you tried unigine valley 1.0 or heaven 4.0?
> 
> 
> 
> not much change in Heaven. i don't have valley.
> 
> 13 Driver at 1300 core
> 
> http://www.3dmark.com/3dm11/7748464
> 
> 14.X Driver at 1300 core
> 
> http://www.3dmark.com/3dm11/8701258

Is this the latest 14.x? Your driver version is higher than mine. I'm on 14.6v2. It shows 13.251.


----------



## rdr09

Quote:


> Originally Posted by *heroxoot*
> 
> Is this the latest 14.x? Your driver version is higher than mine. I'm on 14.6v2. It shows 13.251.


yes, the one posted by DiceAir coming from Asder.


----------



## heroxoot

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Is this the latest 14.x? Your driver version is higher than mine. I'm on 14.6v2. It shows 13.251.
> 
> 
> 
> yes, the one posted by DiceAir coming from Asder.

The difference is negligible, but that driver might not work well with our cards for a reason: it's the 285 driver modded for other cards. Then again, unlike me, you didn't get the giant performance boost from nothing on 14.6 like I did. That boost was god tier for me.


----------



## ozzy1925

Guys, what's the best gaming 27" or 28" single monitor for a 290 Tri-X CrossFire setup?


----------



## MojoW

Quote:


> Originally Posted by *ozzy1925*
> 
> guys,whats the best gaming 27"or 28" single monitor for 290 trixfire set up?


It really depends on your needs.
If you need graphic fluidity, I would go for one of the new 1440p 120/144Hz screens.
If you need graphic fidelity, you should look at the new 4K screens.


----------



## Dasboogieman

Quote:


> Originally Posted by *ozzy1925*
> 
> guys,whats the best gaming 27"or 28" single monitor for 290 trixfire set up?


Triple screen surround or a single 4K unit lol with that kind of horsepower. lol....jks

That being said, personally, I'm a stickler for image fidelity so I'd take slower but extremely color accurate IPS screens over any other gaming unit any day.

Give the Overlord Tempest a shot. I've heard good things; apparently the panel is a fairly color-accurate IPS with custom board logic so it can run at 120Hz, enough to satisfy most refresh-rate junkies.


----------



## rdr09

Quote:


> Originally Posted by *heroxoot*
> 
> The difference is negligible but that driver might not work well with our cards for a reason. That being, it's the 285 driver modded for other cards. Then again unlike me you didn't get the giant boost from nothing performance on 14.6 like i did. That boost was god tier for me.


about 400 points increase in graphics score. it's like a 50 MHz oc. don't assume stuff. just use it. been using it for days without issues. it's a keeper till WHQL comes out.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> The difference is negligible but that driver might not work well with our cards for a reason. That being, it's the 285 driver modded for other cards. Then again unlike me you didn't get the giant boost from nothing performance on 14.6 like i did. That boost was god tier for me.
> 
> 
> 
> about 400 points increase in graphics score. it's like a 50 MHz oc. don't assume stuff. just use it. been using it for days without issues. it's a keeper till WHQL comes out.

I really should install it at some point, heard it's a very good driver


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I really should install it at some point, heard it's a very good driver


http://www.3dmark.com/3dm11/7748882

http://www.3dmark.com/3dm11/8702421

compare the graphics scores and the clocks.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I really should install it at some point, heard it's a very good driver
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/7748882
> 
> http://www.3dmark.com/3dm11/8702421
> 
> compare the graphics scores and the clocks.

That's....interesting









I'm definitely going to be jumping on the WHQL version of that; someone with a couple of SLI 780s overtook me for my CPU/GPU combo a little while ago.


----------



## heroxoot

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> The difference is negligible but that driver might not work well with our cards for a reason. That being, it's the 285 driver modded for other cards. Then again unlike me you didn't get the giant boost from nothing performance on 14.6 like i did. That boost was god tier for me.
> 
> 
> 
> about 400 points increase in graphics score. it's like a 50 MHz oc. don't assume stuff. just use it. been using it for days without issues. it's a keeper till WHQL comes out.

What now? You scored higher on the 13 driver than the 14 with the same clock? What is being assumed now? You showed the scores yourself.


----------



## rdr09

Quote:


> Originally Posted by *heroxoot*
> 
> What now? You scored higher on 13 driver than 14 with the same clock? What is being assumed now? You showed the scores yourself.


what numbers are you comparing? overall or graphics?


----------



## Sgt Bilko

Quote:


> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> The difference is negligible but that driver might not work well with our cards for a reason. That being, it's the 285 driver modded for other cards. Then again unlike me you didn't get the giant boost from nothing performance on 14.6 like i did. That boost was god tier for me.
> 
> 
> 
> about 400 points increase in graphics score. it's like a 50 MHz oc. don't assume stuff. just use it. been using it for days without issues. it's a keeper till WHQL comes out.
> 
> 
> What now? You scored higher on 13 driver than 14 with the same clock? What is being assumed now? You showed the scores yourself.

Ummm, that's not what I saw.

Unless you were looking at the overall instead of the graphics score.


----------



## ozzy1925

Quote:


> Originally Posted by *MojoW*
> 
> It really depends on your needs.
> If you need graphic fluidity i would go for one of the new 1440p 120/144Hz screens.
> If you need graphic fidelity you should look at the new 4k screens.


Quote:


> Originally Posted by *Dasboogieman*
> 
> Triple screen surround or a single 4K unit lol with that kind of horsepower. lol....jks
> 
> That being said, personally, I'm a stickler for image fidelity so I'd take slower but extremely color accurate IPS screens over any other gaming unit any day.
> 
> Give the Overlord Tempest a shot, I've heard good things, apparently the panel is a fairly color accurate IPS with custom board logic so it can run at 120hz, enough to satisfy most refresh rate junkies.


To be honest, I need a flicker-free 4K gaming monitor, and after reading some posts I've decided to go with either the Samsung U28D590D or the Asus PB287 for my setup. The Samsung seems to work better @ 60Hz.
http://www.overclock.net/t/1472671/samsung-4k-1ms-monitor-754-it-has-begun/1200_100#post_22522838
What do you think?


----------



## ace101

Good day everyone! How far can I adjust my GPU clock and memory clock with the stock cooler? I'm using a Gigabyte R9 290 Windforce 4GB (GV-R929OC-GD).


----------



## Blue Dragon

Quote:


> Originally Posted by *rdr09*
> 
> what numbers are you comparing? overall or graphics?


Using an average of five runs (each) would be a lot better for comparison than a single run.
Based on single-run benching logic, the SystemInfo version is a lot more important than the driver version or the clocks:
http://www.3dmark.com/compare/fs/2699964/fs/2377097
Notice everything is the same except the SystemInfo version. Why such a big difference in graphics score? How does 3DMark see everything in my system correctly in one test, and then in another it doesn't even report the correct RAM?
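That averaging idea can be sketched in a few lines of Python. The run scores below are made-up placeholder numbers (not taken from these 3DMark links), just to show how averages and run-to-run spread make a driver comparison more trustworthy than a single run:

```python
# Compare two drivers by averaging several benchmark runs instead of
# trusting a single run. Scores are hypothetical placeholder values.
from statistics import mean, stdev

driver_a_runs = [12480, 12510, 12395, 12450, 12465]  # graphics scores, driver A
driver_b_runs = [12890, 12845, 12910, 12860, 12875]  # graphics scores, driver B

def summarize(name, runs):
    """Print and return the average score, with run-to-run spread."""
    avg = mean(runs)
    print(f"{name}: avg={avg:.0f}, stdev={stdev(runs):.0f} over {len(runs)} runs")
    return avg

avg_a = summarize("Driver A", driver_a_runs)
avg_b = summarize("Driver B", driver_b_runs)

# Only call it a real gain if the averages differ by more than the
# run-to-run noise (the stdev printed above).
gain = avg_b - avg_a
print(f"Average gain: {gain:.0f} points ({100 * gain / avg_a:.1f}%)")
```

With these placeholder numbers the average gain (~416 points, ~3%) is well above the per-driver spread, which is the kind of check a single run can't give you.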


----------



## Talon720

Quote:


> Originally Posted by *b0z0*
> 
> I'm thinking about going with a reference R9 290. I'm worried about the noise and heat while using my FT03B micro-ATX case. Has anyone had any experience with an mATX build running the reference card? I'm going to try skimming through all the pages, but almost 3000, jeeze.


Quote:


> Originally Posted by *b0z0*
> 
> Well it will be in a Silverstone FT03B case with a intake 120mm fan blowing air straight into it.


Hey, I think the reference card would work fine in that case since it exhausts the heat out. You are also feeding the inside with the 120mm fan; as long as that fan moves a lot of air, I think it would keep the card pretty quiet. You can just manually set the fan to 55% or use your own fan curve, and that will keep it quiet except under the most extreme conditions. All the custom cards will be throwing heat in there. Just a thought.


----------



## rdr09

Quote:


> Originally Posted by *Blue Dragon*
> 
> Using average of five runs(each) would be a lot better to compare than a single run.
> based on a single-run benching logic- the the systeminfo version is a lot more important than the driver version or the clocks
> http://www.3dmark.com/compare/fs/2699964/fs/2377097
> notice everything is the same except the sysinfo, why such big difference in graphics score? how does 3dmark see everything in my system correctly one test and then another it doesn't even report correct ram?


5X? but i play BF4.

i got to go back to class.


----------



## KeepWalkinG

Has anyone tried the Asus BIOS on a Sapphire R9 290?
I wanna go with the Asus BIOS for ASUS GPU Tweak, with the opportunity to use 1.4V.


----------



## heroxoot

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> What now? You scored higher on 13 driver than 14 with the same clock? What is being assumed now? You showed the scores yourself.
> 
> 
> 
> what numbers are you comparing? overall or graphics?

I was comparing the overall, because the system did not change, just the driver. I don't expect the overall to decrease in a situation like that. I see the GPU score is 400 higher tho. So why did the overall decline?


----------



## rdr09

Quote:


> Originally Posted by *heroxoot*
> 
> I was because the system did not change, just the driver. I don't expect the overall to decrease in a situation like that. I see the GPU score is 400 high tho. So why did it overall decline?


Different oc settings on the cpu. one at 4.5, the other 4.9GHz. my bad.

@ Blue Dragon, i'm only a regular user. not a hardcore bencher or reviewer, so i only bench a few runs when updating driver to see any difference. most importantly, if they run my games.


----------



## heroxoot

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> I was because the system did not change, just the driver. I don't expect the overall to decrease in a situation like that. I see the GPU score is 400 high tho. So why did it overall decline?
> 
> 
> 
> Different oc settings on the cpu. one at 4.5, the other 4.9GHz. my bad.
> 
> @ Blue Dragon, i'm only a regular user. not a hardcore bencher or reviewer, so i only bench a few runs when updating driver to see any difference. most importantly, if they run my games.
Click to expand...

I see. It shows the OC as turbo for some reason so I just didn't think anything of it.


----------



## ace101

Hi! I tried rendering at Revit Architecture and my CPU Load reached 100% at 93 Degrees Max.

If I were to OC my GPU, would it affect the processor? I mean, would it make the process easier for the CPU?

I'm running on default configuration on my Gigabyte R9 290.

Thanks!


----------



## SovereigN7

In the market for an upgrade to a 290/290x and I've kept my eye on a few non-reference models so far.

- Sapphire R9 290x Tri-X (ebay), was likely mined with for a few months, includes receipt for warranty purposes. Around the $400 mark
- Gigabyte R9 290x Windforce (local), $375
- XFX R9 290 DD (local), $300
- XFX R9 290x DD (ebay), $400

Is the 290x worth the extra $75-100? I've also looked at a few articles and found not much difference between the Sapphire Tri-X cooler and the XFX DD, although many people still favor the Tri-X for some reason.

Thanks for the help!


----------



## pdasterly

Quote:


> Originally Posted by *SovereigN7*
> 
> In the market for an upgrade to a 290/290x and kept my eye on few models on the non-references so far.
> 
> - Sapphire R9 290x tri-x (ebay), was likely mined with for few months, includes receipt for warranty purposes. Around the $400 mark
> - Gigabyte R9 290x Windforce (local), $375
> - XFX R9 290 DD (local) $300
> - XFX R9 290x DD (ebay) $400,
> 
> Is the 290x worth about $75-100 more? I've also looked at few articles and found not much difference between the sapphire tri-x cooler and the xfx DD, although many people still favor the tri-x for some reason.
> 
> Thanks for the help!


might as well throw in the 285x in your research


----------



## Samurai Batgirl

I would just like to say I'm happy my 290 is powerful, but Lord, the drivers make me crash so much.


----------



## MojoW

Quote:


> Originally Posted by *ace101*
> 
> Hi! I tried rendering at Revit Architecture and my CPU Load reached 100% at 93 Degrees Max.
> 
> If I were to OC my GPU, would it affect the processor or i mean will it make the process easier for the CPU?
> 
> I'm running on default configuration on my Gigabyte R9 290.
> 
> Thanks!


No, overclocking your GPU won't help here.
Quote:


> Revit uses a version of the Mental Ray render engine, which only uses the CPU for rendering.
> 
> This is because it is a raytrace rendering app.


Source


----------



## MojoW

Quote:


> Originally Posted by *Samurai Batgirl*
> 
> I would just like to say I'm happy my 290 is powerful, but Lord, the drivers make me crash so much.


What drivers are you using, and which games or apps are crashing?


----------



## Samurai Batgirl

14.4, but this has happened with every driver I've used. Flash and HTML5 are the culprits.

Also, I've completely uninstalled and reinstalled drivers multiple times.

The only other thing it could be is the hardware, but I wouldn't know how to test that.


----------



## MojoW

Quote:


> Originally Posted by *Samurai Batgirl*
> 
> 14.4, but this has happened with every driver I've used. Flash and HTML5 are the culprits.
> 
> Also, I've completely uninstalled and reinstalled drivers multiple times.
> 
> The only other thing it could be is the hardware, but I wouldn't know how to test that.


What browser and are we talking BSOD or just crashes?


----------



## Samurai Batgirl

Quote:


> Originally Posted by *MojoW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Samurai Batgirl*
> 
> 14.4, but this has happened with every driver I've used. Flash and HTML5 are the culprits.
> 
> Also, I've completely uninstalled and reinstalled drivers multiple times.
> 
> The only other thing it could be is the hardware, but I wouldn't know how to test that.
> 
> 
> 
> What browser and are we talking BSOD or just crashes?
Click to expand...

Switched between Firefox, Waterfox, Chrome, and Chrome Canary. BSoD.

Thank you for taking your time to help me, by the way.


----------



## MojoW

Quote:


> Originally Posted by *Samurai Batgirl*
> 
> Switched between Firefox, Waterfox, Chrome, and Chrome Canary. BSoD.
> 
> Thank you for taking your time to help me, by the way.


Disable HWA(Hardware acceleration)
Right click on a flash video for the options and uncheck it.

No problem that's what OCN is all about!


----------



## Samurai Batgirl

Quote:


> Originally Posted by *MojoW*
> 
> Disable HWA(Hardware acceleration)
> Right click on a flash video for the options and uncheck it.
> 
> No problem that's what OCN is all about!


I've done that before D:


----------



## MojoW

Quote:


> Originally Posted by *Samurai Batgirl*
> 
> I've done that before D:


You should check your BSOD log; that can help locate the problem,
so that you're certain where it's coming from.
If it's really AMD drivers causing this, then I would say it's a faulty card.

Edit: I would suggest cleaning your system of driver remnants, but with more posts than me, I'm thinking you've already done that.


----------



## Sgt Bilko

Need some info peoples.

Going to replace the paste and vrm/vram pads on my XFX DD 290's and just wondering what thickness would be recommended?


----------



## Blue Dragon

Quote:


> Originally Posted by *Samurai Batgirl*
> 
> 14.4, but this has happened with every driver I've used. Flash and HTML5 are the culprits.
> 
> Also, I've completely uninstalled and reinstalled drivers multiple times.
> 
> The only other thing it could be is the hardware, but I wouldn't know how to test that.


i use catalyst 14.6 (only seen driver go to 14.2 so far) try uninstalling any oc program like afterburner, gpu tweak, etc... and don't use gpuz while testing. these are all things that caused problems with my cards. i do use hwinfo64 to watch my temps


----------



## Blue Dragon

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Need some info peoples.
> 
> Going to replace the paste and vrm/vram pads on my XFX DD 290's and just wondering what thickness would be recommended?


http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures


----------



## Sgt Bilko

Quote:


> Originally Posted by *Blue Dragon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Need some info peoples.
> 
> Going to replace the paste and vrm/vram pads on my XFX DD 290's and just wondering what thickness would be recommended?
> 
> 
> 
> http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures
Click to expand...

That's more for waterblocks, i'm just replacing the stock pads on my air coolers


----------



## ace101

Quote:


> Originally Posted by *MojoW*
> 
> No that's not the reason.
> Source


Does this mean I cannot do anything to make my rendering faster aside from overclocking my CPU?


----------



## MojoW

Quote:


> Originally Posted by *ace101*
> 
> Does this mean I cannot do anything to make my rendering faster aside from overclocking my CPU?


Yes or use a different rendering program that supports GPU rendering.


----------



## b0z0

Well I've found a few that are used for sale for roughly $200ish. Curious if one brand was better than another for warranty purposes since I'm purchasing a used card. The 3 brands are Asus, Sapphire, and PowerCooler.

Thanks


----------



## ace101

Quote:


> Originally Posted by *MojoW*
> 
> Yes or use a different rendering program that supports GPU rendering.


Thank You very much Sir! Anyway this will be an additional question, do you have any idea how much I can overclock my Gigabyte R9 290 with stock cooling?


----------



## ace101

Hi! Anyone here who can tell me how much I can overclock my Gigabyte R9 290 with stock coolers? Thanks!


----------



## MojoW

Quote:


> Originally Posted by *ace101*
> 
> Hi! Anyone here who can tell me how much I can overclock my Gigabyte R9 290 with stock coolers? Thanks!


If you use a custom fan curve and watch your temps (the max rated is 95°C) you should be able to do 1100 on the core.
Watch your temps with GPU-Z or any other program that shows VRM1 and VRM2 temps.
Start off with the core and test stability with 3DMark.
When you reach your final OC, start on the memory if you want, but the gains are minimal.
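For the curious, a custom fan curve is nothing magical: it's a piecewise-linear map from core temperature to fan duty, and tools like Afterburner or Trixx just let you drag the breakpoints on a graph. A quick sketch of the idea (these breakpoints are made up for illustration, not a recommended curve):

```python
# Illustrative fan curve: (core temperature in C, fan duty in %) breakpoints.
# The numbers are examples only, not anyone's actual recommended settings.
FAN_CURVE = [(40, 30), (60, 45), (75, 55), (95, 100)]

def fan_speed(temp_c):
    """Linearly interpolate fan duty (%) for a given core temperature."""
    # Clamp below the first and above the last breakpoint.
    if temp_c <= FAN_CURVE[0][0]:
        return float(FAN_CURVE[0][1])
    if temp_c >= FAN_CURVE[-1][0]:
        return float(FAN_CURVE[-1][1])
    # Find the segment containing temp_c and interpolate along it.
    for (t0, f0), (t1, f1) in zip(FAN_CURVE, FAN_CURVE[1:]):
        if t0 <= temp_c <= t1:
            frac = (temp_c - t0) / (t1 - t0)
            return f0 + frac * (f1 - f0)
```

Whatever tool you use, the math underneath is the same; the curve just trades noise for temperature headroom.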


----------



## ace101

Quote:


> Originally Posted by *MojoW*
> 
> If you use a custom fan curve and watch your temps (max rated is 95c) you should be able to do 1100 core.
> Watch your temps with GPU-Z or any other program that shows vrm1 and 2 temps.
> Start off with your core and test stability with 3dmark.
> When you reach you final OC then start on the memory if you want but the gains are minimal.


Upon adjusting the GPU Clock, should i also adjust the Power Limit equally?


----------



## MojoW

Quote:


> Originally Posted by *ace101*
> 
> Upon adjusting the GPU Clock, should i also adjust the Power Limit equally?


Yes, when overclocking raise the PL as needed; you can see the effect when your 290 stops downclocking in games and bench apps.

Edit: And when you're done, play a few games to be sure you're stable.
My cards do 1100/1500 with +63mV on air with MSI AB (one Elpida and one Hynix).
That's a reference point, but remember every card is different.


----------



## dartuil

Worth it to buy a r9 290 to play in 1080p or stick with 7950?


----------



## rdr09

Quote:


> Originally Posted by *dartuil*
> 
> Worth it to buy a r9 290 to play in 1080p or stick with 7950?


No, imo. if you want to stick to AMD - wait for the next gen.


----------



## marsey99

i made that switch for 1440p and was made up with the upgrade.

that being said, is 1080p really pushing the 7900?


----------



## KeepWalkinG

I want to flash my card with an Asus BIOS. Which is the best BIOS from this list:
http://www.techpowerup.com/vgabios/index.php?architecture=&manufacturer=Asus&model=R9+290&interface=&memType=&memSize= ?


----------



## ZealotKi11er

Quote:


> Originally Posted by *dartuil*
> 
> Worth it to buy a r9 290 to play in 1080p or stick with 7950?


If you need to power yes. 290 is quite a bit faster.


----------



## Gabkicks

Anyone with these cards have an Oculus Rift DK1 or DK2? Does AMD give good Oculus Rift support? I know I'm gonna end up buying one eventually and mainly using it to game, so I dunno if I should sell the 780 that's just been sitting on my desk, or sell the 290's and buy another GTX 780. I was thinking maybe the 1GB extra framebuffer on the 290's would come in handy with the retail Oculus Rift if it ends up being somewhere between 1600p and 4K.


----------



## Decade

Tri-X woes.

So, my Sapphire R9 290 Tri-X OC has the famous rattling issue. Drove me nuts. I had it temporarily fixed by hoisting the card up to put pressure on it.

Looked terrible.

Found a solution on YouTube where some man with a Russian accent placed bits of rubber under the fans where the screws poked through.
I used bits of rubber bands...

Just ran the Valley benchmark for the past 20 minutes, got the card up to 74°C and 45% fan speed... no rattles. None.









Now to find someone to buy my Asus R9 290 w/ Gelid heatsink and downsize to ITX....


----------



## pdasterly

nvidia is about to drop their new cards soon. I wanted to wait for their release, but became impatient so i settled for the 290x. After seeing the benchmarks im glad i went with the 290x.


----------



## Zipperly

Quote:


> Originally Posted by *pdasterly*
> 
> nvidia is about to drop their new cards soon. I wanted to wait for their release, but became impatient so i settled for the 290x. After seeing the benchmarks im glad i went with the 290x.


Yes because there are so many gaming benchmarks out right now for nvidias new card............ NOT.


----------



## pdasterly

Quote:


> Originally Posted by *Zipperly*
> 
> Yes because there are so many gaming benchmarks out right now for nvidias new card............ NOT.


simple google search


----------



## Zipperly

Quote:


> Originally Posted by *pdasterly*
> 
> simple google search


Let me help you with something, there are no gaming benchmarks for nvidias new cards out yet. You jumped the gun based off of a 3dmark test "which itself is not even 100 percent confirmed" and even if it was it would be foolish to use this as a standard to how you think the new cards will perform.


----------



## pdasterly

Quote:


> Originally Posted by *Zipperly*
> 
> Let me help you with something, there are no gaming benchmarks for nvidias new cards out yet. You jumped the gun based off of a 3dmark test "which itself is not even 100 percent confirmed" and even if it was it would be foolish to use this as a standard to how you think the new cards will perform.


No, im happy i purchased my 290x's. the wait isn't worth the performance gain to me; throw in price (nvidia) and the 290x looks like a steal. everyone was hoping maxwell would crush everything but thats not the case


----------



## Zipperly

Quote:


> Originally Posted by *pdasterly*
> 
> No im happy i purchased my 290x's


They are great cards, nothing against them... I had one myself.

Quote:


> . the wait isn't worth the performance gain to me


But we dont know anything about gaming performance yet.

Quote:


> , throw in price(nvidia)


Again, nothing set in stone at the moment here either, although I agree nvidia is usually pricey, but they could pull another GTX 670 and 680 and surprise us.

Quote:


> Originally Posted by *pdasterly*
> 
> everyone was hoping maxwell would crush everything but thats not the case


You cant say its not the case until we have some actual gaming numbers, a leaked "questionable" 3dmark isnt enough to make that call.


----------



## pdasterly

its nvidia so you know it will be high, makes me wonder what ati will counter punch with.


----------



## Zipperly

Quote:


> Originally Posted by *pdasterly*
> 
> its nvidia so you know it will be high, makes me wonder what ati will counter punch with.


Well actually I dont know that, as I said they could throw a curve ball with another GTX 670 or 680. Those cards launched at $399.99 and $499.99 respectively. However as I said before I agree that Nvidia can be expensive. Im also anxious to see what ATI comes out with now.


----------



## pdasterly

time will tell but dont think they will be that low. That would drop the price of the 780ti too low which is looking close to what the gtx 980 is going to be. i expect $999 price tag


----------



## Zipperly

Quote:


> Originally Posted by *pdasterly*
> 
> time will tell but dont think they will be that low. That would drop the price of the 780ti too low which is looking close to what the gtx 980 is going to be. i expect $999 price tag


780TI will simply be EOL and maxwell is cheaper to manufacture. Also when the 680 launched there were some 580 models "particularly 3gb ones" which were still priced higher than the GTX 680 but it didnt matter because GTX 580's were EOL.

GTX 980 will not be $999 either, lol. It might be more than $499.99 at launch but it will not be twice that.


----------



## Zipperly

The big maxwell chip comes out early next year, 980TI or whatever they decide to call it and yes that one might be $800-900 but not the GTX 980 which is getting ready to come out.


----------



## dartuil

Ill wait for 970 and see whats happening.


----------



## th3illusiveman

thinking about changing the TIM on my Tri-X. I don't have much time to do this stuff anymore so i wanna open it up, scub the paste off and slap some fresh one on there. Is there anything i should watch out for?


----------



## sugarhell

Quote:


> Originally Posted by *th3illusiveman*
> 
> thinking about changing the TIM on my Tri-X. I don't have much time to do this stuff anymore so i wanna open it up, scub the paste off and slap some fresh one on there. Is there anything i should watch out for?


Clean the die well. Dont scratch it. I use clu but you can use whatever you want. Also use the antistatic bag


----------



## Unknownm

Quote:


> Originally Posted by *th3illusiveman*
> 
> thinking about changing the TIM on my Tri-X. I don't have much time to do this stuff anymore so i wanna open it up, scub the paste off and slap some fresh one on there. Is there anything i should watch out for?


I take it by "I don't have much time to do this stuff anymore" means the teacher strike is keeping the kids at home










O Canada, we don't care about our kids education!


----------



## Offler

OK... the R9 290X hit a really low price around here. About 280 euro.

It is this version:

http://www.gigabyte.eu/products/product-page.aspx?pid=4810#ov

The piece being sold was most probably used in a review by some tech portal; it's sold as "Used".

Can you provide any feedback on this card? I expect that it's revision 1.0.


----------



## Imprezzion

It's just a fully standard reference card. Nothing really special. €280 is, in my opinion, a bit too expensive for it.
Here in Holland we can get used 290X's for €200-225 in reference form with warranty intact. Some, if not most, are ex mining cards but with still valid 1.5 years of warranty that wouldn't personally bother me.


----------



## Offler

One year warranty and its sold from shop, not from individual user. But here is HW quite expensive...


----------



## rdr09

Quote:


> Originally Posted by *Offler*
> 
> One year warranty and its sold from shop, not from individual user. But here is HW quite expensive...


Giggy normally carries 3YR warranty. These cards are less than a year old. all reference are the same no matter what manufacturer. they also come from the same silicon lot. some are good clockers, others not. these are meant for watercooling.


----------



## Blue Dragon

Quote:


> Originally Posted by *KeepWalkinG*
> 
> I want to flash my card with Asus bios, who is the best bios from this:
> http://www.techpowerup.com/vgabios/index.php?architecture=&manufacturer=Asus&model=R9+290&interface=&memType=&memSize= ?


this one- http://www.techpowerup.com/vgabios/148880/asus-r9290-4096-130930.html
would be 4th down on list in ur link.


----------



## Offler

Quote:


> Originally Posted by *rdr09*
> 
> Giggy normally carries 3YR warranty. These cards are less than a year old. all reference are the same no matter what manufacturer. they also come from the same silicon lot. some are good clockers, others not. these are meant for watercooling.


Laws in my country require two years of warranty, so ... when the two years expire, you have to send it to Gigabyte/the manufacturer at your own expense.


----------



## Tchernobyl

Hey guys. Got myself one of these bad boys in my recent computer upgrade!








Asus Radeon R9 290 4GB DirectCU II, specifically. It's so, so nice to just set everything to ultra and play like that. Bwehehe.

Although I just had the thought when waking up today (literally, i woke up, looked at the time, and thought this)... would it be worth having some nvidia card in here for the sake of PhysX?


----------



## reedy777

Quote:


> Originally Posted by *Tchernobyl*
> 
> Hey guys. Got myself one of these bad boys in my recent computer upgrade!
> 
> 
> 
> 
> 
> 
> 
> 
> Asus Radeon R9 290 4GB DirectCU II, specifically. It's so, so nice to just set everything to ultra and play like that. Bwehehe.
> 
> Although I just had the thought when waking up today (literally, i woke up, looked at the time, and thought this)... would it be worth having some nvidia card in here for the sake of PhysX?


PhysX no. g sync maybe yes


----------



## Sgt Bilko

Quote:


> Originally Posted by *reedy777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Tchernobyl*
> 
> Hey guys. Got myself one of these bad boys in my recent computer upgrade!
> 
> 
> 
> 
> 
> 
> 
> 
> Asus Radeon R9 290 4GB DirectCU II, specifically. It's so, so nice to just set everything to ultra and play like that. Bwehehe.
> 
> Although I just had the thought when waking up today (literally, i woke up, looked at the time, and thought this)... would it be worth having some nvidia card in here for the sake of PhysX?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> PhysX no. g sync maybe yes
Click to expand...

I'm 99% sure that wouldn't work


----------



## Broken-Heart

Hi,

I just joined OCN. I happen to own a Sapphire R9 290X. I also bought an Arctic Accelero Xtreme III and had it installed on my card by the local distributor. I wanted to do the cooler installation myself, but my local distributor said that they don't void the card's warranty if the cooler installation is done by them, so I left it to them. Now, even with the cooler fans set to 100%, I can't hear the fans at all and the temps are incredible. The maximum GPU temperature is 78°C.

*Concerning the requirements* to join this group, Here we go:

1) The GPU *Manufacturer* is Sapphire. The *Model* is R9 290X Battlefield 4 Limited Edition. The *Cooling Solution* is Arctic Accelero Xtreme III *(Aftermarket)*.

2) GPU-Z validation link : http://www.techpowerup.com/gpuz/details.php?id=w52u5

3) Card and Rig Pictures:


----------



## th3illusiveman

^ watch out for the VRM temps when using an accelero. It will keep your core cool but your VRMs might be 20c higher in temps. Use GPU-Z and scroll to the bottom to see the temps.
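If you log those GPU-Z readings, the watch-out check itself is trivial; a sketch (the 90°C alarm threshold here is an arbitrary comfort margin I picked, not a datasheet figure):

```python
def vrm_alarm(vrm1_c, vrm2_c, limit_c=90):
    """Flag when either VRM sensor crosses a chosen alarm threshold.

    limit_c is an arbitrary comfort threshold for illustration,
    not a manufacturer-rated maximum.
    """
    hottest = max(vrm1_c, vrm2_c)  # the hotter phase is the one that matters
    return hottest > limit_c
```

Point being: watch the hotter of the two VRM readings, not just the core, since an Accelero-style cooler can leave the VRMs much warmer than the die.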


----------



## rdr09

Quote:


> Originally Posted by *dartuil*
> 
> Ill wait for 970 and see whats happening.


yah, wait for more info. preferably from ocn members.

the 290 is about 30% faster than the 7950. the main difference between the two at 1080 is that one can't max all games and the other can, even at stock.


----------



## Wezzor

Quote:


> Originally Posted by *rdr09*
> 
> yah, wait for more info. preferably from ocn members.
> 
> the 290 is about 30% faster than the 7950. the main difference between the two in 1080 is one can't max all games the other can. even at stock.


I can't max all games in 1080 with a 290 at stock clocks ^^


----------



## rdr09

Quote:


> Originally Posted by *Wezzor*
> 
> I can't max all games in 1080 with a 290 at stock clocks ^^


not sure what games you have but i can max C3 even at stock. some games i need HT. my cpu is oc'ed to 4.5GHz.


----------



## Wezzor

Quote:


> Originally Posted by *rdr09*
> 
> not sure what games you have but i can max C3 even at stock. some games i need HT. my cpu is oc'ed to 4.5GHz.


Hitman: Absolution, Far Cry 3 and Tomb Raider are the games I haven't been able to max out yet. GTA IV and Watch Dogs too, but they're just badly optimized. Maybe the fact that you've got an i7 makes the difference. Mine is also overclocked to 4.5.


----------



## rdr09

Quote:


> Originally Posted by *Wezzor*
> 
> Hitman: Absolution, Far Cry 3 and Tomb Raider are the games I haven't been able to max out yet. GTA IV and Watch Dogs too but they're just bad optimized. Maybe that you've an i7 makes the difference. Mine is also overclocked to 4,5.
> EDIT: With max out I mean constant 60 FPS.


yah, check your usage using AB. i was thinking of adding another 290 but after looking at my usage . . . looks like my i7 SB will finally show its age.


----------



## Wezzor

Quote:


> Originally Posted by *rdr09*
> 
> yah, check your usage using AB. i was thinking of adding another 290 but after looking at my usage . . . looks like my i7 SB will finally show its age.


Will do it! How much should it be?


----------



## rdr09

Quote:


> Originally Posted by *Wezzor*
> 
> Will do it! How much should it be?


what do you mean how much? i have FC3 but have not played on my intel rig, so not sure if any of those games are more demanding than BF4 MP 64 and if they take advantage of cores/threads. use AB and get rid of the other readings like CPU temps (unless your card is not cooled well, then you need to address that first), RAM usage, etc. make sure CPU usage is showing like so . . .



that's with HT off.


----------



## Wezzor

Quote:


> Originally Posted by *rdr09*
> 
> what do you mean how much? i have FC3 but have not played on my intel rig, so not sure if any of those games are more demanding than BF4 MP 64 and if they take advantage of cores/threads. use AB and get rid of the other readings like CPU temps (unless your card is not cooled well, then you need to address that first), RAM usage, etc. make sure CPU usage is showing like so . . .
> 
> 
> 
> that's with HT off.


I meant how much should my GPU usage be. Should it be 95% or 80% when I'm gaming. I've actually tried BF4 MP 64 and it played really smooth. I guess the games that I haven't been able to max out weren't that well optimised.


----------



## rdr09

Quote:


> Originally Posted by *Wezzor*
> 
> I meant how much should my GPU usage be. Should it be 95% or 80% when I'm gaming. I've actually tried BF4 MP 64 and it played really smooth. I guess the games that I haven't been able to max out weren't that well optimised.


when usage starts hitting the 90s, then that's not very good. yah, BF4 is easy. i can leave HT off in BF4, which i did in the middle of summer. fps is a little low but not enuf to make it unplayable. C3 is another game you can leave HT off even maxed. HT helps smooth things out in some maps.

BF4 has Mantle, too, that helps a lot in my experience.

edit: i am not an expert in these things but i think even 80s is not a good sign 'cause when a lot of things happen like during huge explosions and stuff, then it would spike and you'll see minimum fps drop.
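The explosion-spike point is easy to show with numbers: average fps hides slow frames, minimum fps doesn't. A toy sketch with invented frame times:

```python
def fps_stats(frame_times_ms):
    """Average and minimum fps from a list of per-frame render times (ms)."""
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s
    min_fps = 1000.0 / max(frame_times_ms)  # the slowest frame sets the minimum
    return avg_fps, min_fps

# Two made-up one-second samples of the same scene:
smooth = [16.7] * 60                  # steady ~60 fps
spiky = [16.7] * 58 + [50.0, 50.0]    # same scene with two explosion spikes
```

The averages differ by only a few fps, but the minimum in the spiky sample drops from ~60 to 20, and that drop is the stutter you actually feel.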


----------



## rcoolb2002

Quote:


> Originally Posted by *rdr09*
> 
> when usage starts hitting the 90s, then that's not very good. yah, BF4 is easy. i can leave HT off in BF4, which i did in the middle of summer. fps is a little low but not enuf to make it unplayable. C3 is another game you can leave HT off even maxed. HT helps smooth things out in some maps.
> 
> BF4 has Mantle, too, that helps a lot in my experience.
> 
> edit: i am not an expert in these things but i think even 80s is not a good sign 'cause when a lot of things happen like during huge explosions and stuff, then it would spike and you'll see minimum fps drop.


Are you referring to GPU or CPU usage?

My gpu usage is always 99% with vsync and capped fps off.


----------



## rdr09

Quote:


> Originally Posted by *rcoolb2002*
> 
> Are you referring to GPU or CPU usage?
> 
> My gpu usage is always 99% with vsync and capped fps off.


cpu. just like the one in the screenie. gpu should be above 90 ideally. wait, in BF4? even with HT off my gpu usage is at 99%. a few drops to the 95% mark, i think.


----------



## sugarhell

Quote:


> Originally Posted by *Wezzor*
> 
> I meant how much should my GPU usage be. Should it be 95% or 80% when I'm gaming. I've actually tried BF4 MP 64 and it played really smooth. I guess the games that I haven't been able to max out weren't that well optimised.


Depends if you cap your fps. If you dont cap your fps you should be 99% all the times with a single gpu


----------



## rcoolb2002

Quote:


> Originally Posted by *rdr09*
> 
> cpu. just like the one in the screenie. gpu should be above 90 ideally. wait, in BF4? even with HT off my gpu usage is at 99%. a few drops to the 95% mark, i think.


The guy you quoted asked about GPU usage.








Quote:


> Originally Posted by *Wezzor*
> 
> I meant how much should my GPU usage be. Should it be 95% or 80% when I'm gaming. I've actually tried BF4 MP 64 and it played really smooth. I guess the games that I haven't been able to max out weren't that well optimised.


----------



## rdr09

Quote:


> Originally Posted by *rcoolb2002*
> 
> The guy you quoted asked about GPU usage.


i see, has to be in the 90s. forgot about the fps caps sugar pointed out.


----------



## bluedevil

Played some BF4 last nite with my 290 cranked up to 1200mhz with 25% more power limit/core voltage. Totally stable, never exceeded 60C on my 120XL.


----------



## Performer81

Can someone post a 015.045 or newer branch BIOS for the PCS+ 290X please?


----------



## KeepWalkinG

Quote:


> Originally Posted by *bluedevil*
> 
> Played some BF4 last nite with my 290 cranked up to 1200mhz with 25% more power limit/core voltage. Totally stable, never exceeded 60C on my 120XL.


But vrm is hot ?


----------



## amptechnow

Quote:


> Originally Posted by *Performer81*
> 
> CAn someone post a 015.045 or newer branch bios for PCS+ 290X please?


its on the first page here: http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread


----------



## Performer81

No, thats a 044 and i have some issues with it.


----------



## bluedevil

Quote:


> Originally Posted by *KeepWalkinG*
> 
> But vrm is hot ?


nope temps on the vrm never get over 60C.


----------



## ZealotKi11er

At 1080p a 290 should max every game. I can play any game with a 290X, no AA, @ 1440p. In Crysis 3 I get ~35 fps but it's very playable. OCed to 1200/1500 it goes to the low 40s.


----------



## EdWeisz

Quote:


> Originally Posted by *ZealotKi11er*
> 
> At 1080p a 290 should max every game. I can play any game with a 290X, no AA, @ 1440p. In Crysis 3 I get ~35 fps but it's very playable. OCed to 1200/1500 it goes to the low 40s.


Since you've mentioned it. What is your usual frame rate when playing at 1440p?
I'm still playing on my old 1080P/ 60 Hz monitor with an (unlocked) 290x.


----------



## Mr357

I've been away from this thread for quite some time. Did I miss anything big? Namely, a good, custom BIOS for 24/7 OC's?


----------



## b0z0

Quote:


> Originally Posted by *bluedevil*
> 
> Played some BF4 last nite with my 290 cranked up to 1200mhz with 25% more power limit/core voltage. Totally stable, never exceeded 60C on my 120XL.


On stock reference cooler? @ what fan speed?

Thanks


----------



## davidm71

Quote:


> Originally Posted by *Tchernobyl*
> 
> Hey guys. Got myself one of these bad boys in my recent computer upgrade!
> 
> 
> 
> 
> 
> 
> 
> 
> Asus Radeon R9 290 4GB DirectCU II, specifically. It's so, so nice to just set everything to ultra and play like that. Bwehehe.
> 
> Although I just had the thought when waking up today (literally, i woke up, looked at the time, and thought this)... would it be worth having some nvidia card in here for the sake of PhysX?


I did the physx mod on my 290X and 450GTS and it helped out Metro Last Light dramatically improving fps.


----------



## Sgt Bilko

Quote:


> Originally Posted by *b0z0*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bluedevil*
> 
> Played some BF4 last nite with my 290 cranked up to 1200mhz with 25% more power limit/core voltage. Totally stable, never exceeded 60C on my 120XL.
> 
> 
> 
> On stock reference cooler? @ what fan speed?
> 
> Thanks
Click to expand...

he has a AIO on his card


----------



## ZealotKi11er

Quote:


> Originally Posted by *EdWeisz*
> 
> Since you've mentioned it: what is your usual frame rate when playing at 1440p?
> I'm still playing on my old 1080p/60 Hz monitor with an (unlocked) 290X.


Apart from Crysis 3, most other games run at 60+ fps.


----------



## King PWNinater

Any fixes for low GPU usage in crossfire with an FX-8350?


----------



## Sgt Bilko

Quote:


> Originally Posted by *King PWNinater*
> 
> Any fixes for low GPU usage in crossfire with an FX-8350?


Res, game, and CPU clock speeds, please?


----------



## King PWNinater

GPU 1 1150/1350
GPU 2 980/1250

1440p @ 120hz

Battlefield 4.

70-85FPS on the ground


----------



## Sgt Bilko

Quote:


> Originally Posted by *King PWNinater*
> 
> GPU 1 1150/1350
> GPU 2 980/1250
> 
> 1440p @ 120hz
> 
> Battlefield 4.
> 
> 70-85FPS on the ground


What's your CPU's clock speed, and is that Mantle or DX?

It would also be better if your GPUs' clock speeds matched.


----------



## King PWNinater

4.8 GHz. Mantle. 14.8 driver. DX is way lower.
My second GPU throttles because the max target GPU temperature in CCC is way too low (85C).

Ok, just changed/synchronized their clocks.
Both at 947/1300


----------



## Sgt Bilko

Quote:


> Originally Posted by *King PWNinater*
> 
> 4.8 GHz. Mantle. 14.8 driver. DX is way lower.
> My second GPU throttles because the max target GPU temperature in CCC is way too low (85C).
> 
> Ok, just changed/synchronized their clocks.
> Both at 947/1300


That doesn't sound right. Is this at Ultra or High?

I can believe it for Ultra, but High settings net me good frames (120 fps avg).


----------



## King PWNinater

Ultra. On High, I have even less GPU usage and around the same frames. The only reason I'm getting this high usage at 1440p Ultra is that my resolution scale is 120%. At 100% it's like 65-ish usage with terrible FPS.


----------



## Sgt Bilko

Quote:


> Originally Posted by *King PWNinater*
> 
> Ultra. On high, I have even less GPU usage and around the same frames. The only reason I am getting this high of usage on 1440p Ultra, is because my resolution scale is 120%. At 100% it's like 65-ish Usage with terrible FPS.


That's really strange. I'm on the 14.7 RC3 driver atm; I'll run a quick test for you.


----------



## King PWNinater

Thanks.


----------



## ZealotKi11er

Set resolution scaling to 125%. That would run the game at 3K. Not much you can do; 2 x 290 is too much for an AMD CPU. Even one is too much for some games.


----------



## Sgt Bilko

Quote:


> Originally Posted by *King PWNinater*
> 
> Thanks.




Quote:


> Originally Posted by *ZealotKi11er*
> 
> Set resolution scaling to 125%. That would run the game at 3K. Not much you can do; 2 x 290 is too much for an AMD CPU. Even one is too much for some games.


Some games, not all, and Battlefield 4 is a good exception; he isn't running at 1080p, you know.


----------



## King PWNinater

125% is too much for the system to bear for some reason. My frames drop hard at that point.


----------



## Zipperly

Quote:


> Originally Posted by *King PWNinater*
> 
> 125% is too much for the system to bear for some reason. My frames drop hard at that point.


That does not sound right. I'm at 1920x1080 and I run 150% res scale, all Ultra, MSAA disabled, 80 FOV, and I stay at 60 fps most of the time, with rare dips into the high 40s only in heavy action on 64-man servers.


----------



## th3illusiveman

Changed my TIM; the core went down ~3C, VRMs went up ~4C.







I kinda had a feeling that not having fresh VRM pads was a bad idea.


----------



## heroxoot

Quote:


> Originally Posted by *th3illusiveman*
> 
> Changed my TIM, Core went down ~3c VRMs went up ~4c
> 
> 
> 
> 
> 
> 
> 
> I kinda had a feeling that not having fresh VRM pads was a bad idea.


I'd check the positioning. I changed the thermal paste on my GPU and temps dropped 3 - 4c on average as well. In a warm room I went from 52c to 47-48c. VRM temps didn't change.


----------



## th3illusiveman

Quote:


> Originally Posted by *heroxoot*
> 
> I'd check the positioning. I changed the thermal paste on my GPU and temps dropped 3 - 4c on average as well. In a warm room I went from 52c to 47-48c. VRM temps didn't change.


If I try moving the pad it falls apart. To avoid risking a short when I put it back together, I don't think I'll toy around with this until I can buy VRM pads in December.


----------



## Sgt Bilko

Quote:


> Originally Posted by *King PWNinater*
> 
> 125% is too much for the system to bear for some reason. My frames drop hard at that point.


Are you experiencing a system memory leak?

Because at 1440p High I'm pulling 100+ fps avg.


----------



## Arizonian

Quote:


> Originally Posted by *Broken-Heart*
> 
> Hi,
> 
> I just joined OCN. I happen to own a Sapphire R9 290X. I also bought an Arctic Accelero Xtreme III and had it installed on my card by the local distributor. I wanted to do the cooler installation myself but my local distributor said that they don't void the card's warranty if the cooler installation is done by them so, I left it to them. Now, even with the cooler fans set to 100%, I can't hear the fans at all and the temps are incredible. The maximum GPU temperature is 78C
> 
> *Concerning the requirements* to join this group, Here we go:
> 
> 1) The GPU *Manufacturer* is Sapphire. The *Model* is R9 290X Battlefield 4 Limited Edition. The *Cooling Solution* is Arctic Accelero Xtreme III *(Aftermarket)*.
> 
> 2)GPU-Z validation link : http://www.techpowerup.com/gpuz/details.php?id=w52u5
> 
> 3) Card and Rig Pictures:
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Welcome to OCN with your first post.









*How to list your rig in Rig Builder*


----------



## King PWNinater

I think I'm leaking everything except RAM, which is weird. My GPU usage just goes down as I play, and so does my FPS.


----------



## Sgt Bilko

Quote:


> Originally Posted by *King PWNinater*
> 
> I think I'm leaking everything except RAM, which is weird. My GPU usage just goes down as I play, and so does my FPS.


Do you know what your CPU Usage is when those dips occur?


----------



## Dasboogieman

Quote:


> Originally Posted by *th3illusiveman*
> 
> thinking about changing the TIM on my Tri-X. I don't have much time to do this stuff anymore so i wanna open it up, scub the paste off and slap some fresh one on there. Is there anything i should watch out for?


The stock TIM is quite good; whatever you are using has to equal or surpass at least Phobya HeGrease Extreme (aka GELID Extreme), otherwise it isn't worth it. Secondly, make sure you have decent thermal pads of the correct thickness handy; IIRC, it's 1mm for the VRAM and 1mm for the VRMs. When I did mine, I also upgraded the stock VRM thermal pads to uber-expensive Fujipoly ones while I was in there, which dropped VRM temps by 4-5 degrees.
Be careful when you are peeling the Tri-X cooler from the PCB: detach the fan header, then gently peel it from the power connector end. It takes a few goes because the whole assembly is very tightly fitted. If it feels like it isn't giving anymore (and you are sure all the screws are removed), just keep persevering; the thermal pads Sapphire uses are quite sticky.
Finally, protip (I learnt the hard way): insert the fan header before reassembling. That thing is impossible to insert after it is fully assembled.


----------



## th3illusiveman

Quote:


> Originally Posted by *Dasboogieman*
> 
> The stock TIM is quite good, whatever you are using has to equal or surpass at least Phobya HeGrease Extreme (aka GELID Extreme) otherwise it isn't worth it. Secondly, make sure you have decent thermal pads of the correct thickness handy, IIRC, it's 1mm for the VRAM and 1mm for the VRMs. When I did mine, I also upgraded the stock VRM thermal pads to uber expensive Fujiploy ones while I was in there, dropped VRM temps by 4-5 degrees.
> Be careful when you are peeling the Tri X cooler from the PCB, detach the fan header, then gently peel it from the power connector end, it takes a few goes because the whole assembly is very tightly fitted. If it feels like it isn't giving anymore (and you are sure all the screws are removed) then just keep persevering, the thermal pads that Sapphire use are quite sticky.
> Finally, protip (I learnt the hard way), insert the fan header before re-assembling, that thing is impossible to insert after it is fully assembled.


Yep, you're right about everything. Getting the cooler off the card was quite a challenge, but once you do it, it's easier to redo. Well, the TIM I used (MX-4) actually did something; temps are down by ~3C and could get a bit better with time, or not. What sucks is that VRM temps went up by ~4C too. I don't know why; I've taken GPUs apart and reassembled them more times than I can remember, but if I had known this would be the tradeoff I wouldn't have wasted my time. I had no issues keeping the core under 75C with my OC (adding +81mV in AB for 1100 core) before the TIM change, but the VRMs were already touching 80. Now they are in the mid 80s with the fan at ~75-80%, while the core rests easy at ~72C in a Valley loop. I figured for now I'll run it at stock, since I can cap the fan at 65% while keeping the core under 70C and the VRMs in check. When I buy those thermal pads I'll raise the clocks to something better; -60 MHz will only hurt in Skyrim at the Morthal swamps with an ENB.

I just wish one GPU manufacturer would take VRM cooling seriously on a mainstream card. I regret not water cooling when I had the chance a few years ago.


----------



## ZealotKi11er

Quote:


> Originally Posted by *King PWNinater*
> 
> 125% is too much for the system to bear for some reason. My frames drop hard at that point.


Turn AA off.


----------



## King PWNinater

Ok, for some reason I'm only allowed to change my resolution scale once or twice, and then I get crappy frames, pretty much making me restart the application. I have changed it to 135% and that gets me a constant 95-99% usage. The bad part is that I struggle to reach 60 FPS at that point. 120% is the sweet spot.

I think I may have found a solution to my problems, and many other problems as well. I was looking around on this forum and found a guy who had pretty much the same problems I've been having. He fixed it by messing with the page file settings on his drives. So I did this with my hard drive, and it fixed all of my stability issues and my memory leak issues.

Another thing: I tried Valley on my hard drive last night, and my usage struggled to reach 85%. I installed another copy on my SSD and got 97%+ pretty much all of the time. I think if I reinstall Battlefield 4 to my SSD, I may be able to fix my performance issues. Originally, when I had good usage and good FPS, I THINK Battlefield 4 was on my SSD, so hopefully this fixes my issues.

Wish me luck.


----------



## Broken-Heart

Quote:


> Originally Posted by *th3illusiveman*
> 
> ^ watch out for the VRM temps when using an accelero. It will keep your core cool but your VRMs might be 20c higher in temps. Use GPU-Z and scroll to the bottom to see the temps.


Yeah, I noticed that and told that to the guy who installed the cooler but he told me not to worry about it. Since the card is still under warranty, I know that if something happened, they'll replace it.

THANKS a lot for the warning, though.

Do you have any solution to cool the VRMs?


----------



## Talon720

Quote:


> Originally Posted by *King PWNinater*
> 
> Ok, for some reason, I'm only allowed to change my resolution scale once or twice, and then I get crappy frames, thus pretty much making me restart the Application. I have changed it to 135% and that get's me a constant 95%-99% usage. The bad part about that is I struggle to achieve 60FPS at that point. 120% is the sweet spot.
> 
> I think I may have found a solution to my problems and many other problems aswell. I was looking around on this forum and found a guy who had pretty much the same problems that I've been having. He fixed it by messing with the page filing on his drives. So I did this this my Hard Drive and it fixed all of my Stability issues and my Memory leak issues.
> 
> Another thing. I tried Valley on my Hard Drive last night, and my usage struggled to obtain 85%. I installed another version on my SSD and I got +97% pretty much all of the time. I think if I Re-Install Battlefield 4 to my SSD, I may be able to fix my performance issues. Originally, when I had good usage and good FPS, I THINK Battlefield 4 was on my SSD, so hopefully this fixes my issues.
> 
> Wish me luck.


I've had very similar issues to yours with single and crossfire: the switching of res a couple of times and then horrible performance. FPS drops even though my CPU usage is 50-75% (guesstimated from Task Manager + HWiNFO), I'd say primarily with Mantle. Certain drivers fixed the issues up until the latest BF4 DT patch. I've had the memory leak use all 16 GB of RAM with Mantle. I get random FPS drops, and it seems to happen the first time something occurs: the first explosion, FPS drop; the second, it's fine. Or use the scope, FPS drop; use it again, it's fine. Or turn really fast, but after that it's fine. The game isn't installed on the same SSD as the OS. I am replacing my OS SSD with an 850 Pro 512GB, though, then moving the HyperX 3K to the HTPC. So the OS will be on the 850 Pro 512GB, the gaming SSD is an 840 Pro 256GB, plus a 1 TB HDD. Also, I feel like I had fewer issues with BF4 on the OS drive, but I have no data or logs to prove it. I've tried reconfiguring my page files, from letting Windows manage everything on whatever drive (current) to using a page file on just the 840 Pro, which is more than half full. I've read about a lot of issues with this latest patch, though.

Oh, and on a side note: with that Fry's 5960X sale I'm really tempted. 40 CPU lanes for my trifire to breathe, since I'm not sold on a PLX Z97 board, or PLX in general. The price is so good, though, that if I didn't use the chip I could easily sell it for regular price. Not that I need all that power lol, but if I did need it I'd have it.


----------



## King PWNinater

Nope, didn't work.


----------



## trihy

Today GPU-Z showed a peak of 146C on the core.

Hope it's some wacky driver thing...


----------



## Gobigorgohome

My experience with BF4 on these cards is not too good. I have 4x R9 290X water cooled running quadfire at 1100/1300 with MSI AB, and I get better FPS at Ultra (with 4x AA at 4K) than at LOW at 4K. Usage at Low is 50-60% on each card (never above, and a lot of the time a whole lot less); with 4x AA I get 80% and above. Running scale at 100% at 4K. Why do you guys use scale at more than 100%?

My opinion is that BF4 is a bad game with these cards (or at least with my system); I have no problem in any other game, but BF4 struggles like heck. I have not tried Mantle yet (I think), because my system is pretty much wrecking everything at 4K with high/ultra settings on DX11 or whatever it is using.

Oh, and by the way, all the guys here with air cooling: do yourself a favour and water cool your card(s). The heat and temperatures are crazy. I am coming from 100% fan speed, stock clocks and 94C to an EK-FC R9 290X with Fujipoly Ultra Extreme thermal pads, 1100/1300 and 30C idle/51C load, with VRM1/VRM2 well below the critical point.

These cards should only have been sold with waterblocks, but that is my opinion.

Edit: I have also had the MSI Lightning R9 290X and they are loud too.


----------



## Zipperly

Quote:


> Originally Posted by *Gobigorgohome*
> 
> My experience with BF4 with these cards is not too good, I have 4x R9 290X water cooled running quadfire with 1100/1300 with MSI AB and I get better FPS at ultra (with 4x AA on 4K), than at LOW at 4K. Usage at low is 50-60% on each card (never above and a lot of the time a whole lot less), with 4x AA I get 80% and above. Running scale at 100% at 4K. Why do you guys use scale at more than 100%?
> 
> My opinion is that BF4 is a bad game with these cards (or at least with my system), I have no problem in any other game, but BF4 struggle like heck. I have not tried Mantle yet (I think), because my system is pretty much wrecking everything at 4K with high/ultra settings on DX11 or whatever it is using.
> 
> Oh and by the way, all the guys here with aircooling, do yourself a favour and water cool your card(s). The heat and temperature is crazy, I am coming from 100% fan speed, stock clocks and 94C to EK-FC R9 290X with Fujipoly Ultra Extreme thermal pads, 1100/1300 and 30C idle/51C load, VRM1/VRM2 is well below the critical point.
> 
> These cards should only been sold with waterblocks, but that is my opinion.
> 
> Edit: I have also had the MSI Lightning R9 290X and they are loud too.


I agree, had a single 290X at a solid non throttling 1200mhz core and with or without mantle my GTX 780 with nvidias optimized dx11 drivers is giving me a much more stable framerate with this game.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gobigorgohome*
> 
> My experience with BF4 with these cards is not too good, I have 4x R9 290X water cooled running quadfire with 1100/1300 with MSI AB and I get better FPS at ultra (with 4x AA on 4K), than at LOW at 4K. Usage at low is 50-60% on each card (never above and a lot of the time a whole lot less), with 4x AA I get 80% and above. Running scale at 100% at 4K. Why do you guys use scale at more than 100%?
> 
> My opinion is that BF4 is a bad game with these cards (or at least with my system), I have no problem in any other game, but BF4 struggle like heck. I have not tried Mantle yet (I think), because my system is pretty much wrecking everything at 4K with high/ultra settings on DX11 or whatever it is using.
> 
> Oh and by the way, all the guys here with aircooling, do yourself a favour and water cool your card(s). The heat and temperature is crazy, I am coming from 100% fan speed, stock clocks and 94C to EK-FC R9 290X with Fujipoly Ultra Extreme thermal pads, 1100/1300 and 30C idle/51C load, VRM1/VRM2 is well below the critical point.
> 
> These cards should only been sold with waterblocks, but that is my opinion.
> 
> Edit: I have also had the MSI Lightning R9 290X and they are loud too.


If you have 4 cards and are not getting high GPU usage, then just leave the cards at stock; overclocking them does nothing. You definitely need to try Mantle, but there is a problem with both DX11 and Mantle right now. In BF4, DX11 is not really good with AMD because they have spent all their time on Mantle. The problem with Mantle is high vRAM usage. I don't think you can run 4K 0xAA with Mantle on 4GB right now. I know because I have tried to run 1440p with 150% resolution scaling, which = 4K. 125% is 3K, which is my limit: ~3.9GB of vRAM used.
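The "150% of 1440p = 4K, 125% = 3K" claim checks out, since BF4's resolution scale multiplies both axes. A quick sketch (my own helper, not from the thread):

```python
# Effective render resolution under BF4-style resolution scaling,
# which multiplies BOTH the width and the height by the scale factor.
def effective_resolution(width, height, scale_percent):
    s = scale_percent / 100.0
    return round(width * s), round(height * s)

# 1440p at 150% scale renders the same pixel grid as native 4K UHD:
print(effective_resolution(2560, 1440, 150))  # (3840, 2160)
# 125% lands at 3200x1800, i.e. "3K":
print(effective_resolution(2560, 1440, 125))  # (3200, 1800)
```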


----------



## Gobigorgohome

Quote:


> Originally Posted by *Zipperly*
> 
> I agree, had a single 290X at a solid non throttling 1200mhz core and with or without mantle my GTX 780 with nvidias optimized dx11 drivers is giving me a much more stable framerate with this game.


Unfortunately I have only tried the GTX 780 at 1080p in BF4; that was a good experience, but it was a long time ago. I did game some with one Asus DCUII R9 290 at 1080p in BF4 and it worked great, but 4K with my Sapphire R9 290X cards is a totally different story. One card at 1080p might be good, but 4K with more than two cards is not a good experience. I have actually not played the game much because of the poor optimization with AMD cards. I do not know if it is the quad-crossfire, the resolution or the settings that are causing my instability.

To compare with other games: it does Tomb Raider in 4K at Ultimate (with AA) at about 50 FPS average, and Hitman: Absolution on max settings in 4K at a stable 70-80 FPS with NO ISSUES.

For those of you that do not game at 40 FPS (such as me), BF4 is equal to Thief (which is capped at one GPU, ~40 FPS at 4K).


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Unfortunately I have only tried GTX 780 on 1080P in BF4, that was a good experience, but it is a long time ago. I did game some with one Asus DCUII R9 290 at 1080P BF4 and it was working great, but at 4K with my Sapphire Radeon R9 290X there is a total different story. One card at 1080P might be good, but at 4K with more than two cards is not a good experience. I have actually not played that game too much because of the poor optimization with AMD cards. I do not know if it is the quad-crossfire, resolution or the settings that are causing my instability.
> 
> Just to take it to another game it is doing Tomb Raider in 4K at ultimate (with AA) at about 50 FPS average and Hitman Absolution on max settings in 4K and I get stable 70-80 FPS and NO ISSUES.
> 
> For those of you that do not game at 40 FPS (such as me), BF4 is equal to Thief (that is capped at one GPU ~ 40 FPS at 4K).


These cards are not really 4K ready. You really need more than 4GB once you go 4K. Also, 4 cards is asking a bit much for stability. Have you tried with just 2 cards? 2 cards should be able to get 60 fps in BF4 with 0xAA.


----------



## Gobigorgohome

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If you have 4 cards and are not getting high GPU usage, then just leave the cards at stock; overclocking them does nothing. You definitely need to try Mantle, but there is a problem with both DX11 and Mantle right now. In BF4, DX11 is not really good with AMD because they have spent all their time on Mantle. The problem with Mantle is high vRAM usage. I don't think you can run 4K 0xAA with Mantle on 4GB right now. I know because I have tried to run 1440p with 150% resolution scaling, which = 4K. 125% is 3K, which is my limit: ~3.9GB of vRAM used.


Okay, I see. My biggest problem is a CPU bottleneck, I think. I benched my 3930K in Tomb Raider at 4.6 GHz @ 1.44 volts and got an increase of 2 FPS average (two runs) vs. the stock CPU clock. I have a pre-owned CPU on its way that will get my FPS a little higher, I think, because I need 4.6 GHz+ for my setup to get higher FPS in games. I will try out Mantle when my machine is up and running again and give you guys feedback on how it is.








Quote:


> Originally Posted by *ZealotKi11er*
> 
> These cards are not really 4K ready. You really need more then 4GB once you go 4K. Also 4 cards is asking a bit much for stability. Have you tried with just 2 cards? 2 cards should be able to get 60 fps in BF4 with 0AA.


Seems pretty 4K ready to me.

I am quite surprised with four cards: I have used 3-way GTX 660 Ti SLI before, and this is WAY, WAAAAAAAAAAAAAAAAAAAAY more stable than that was. Still, I have had a pretty bad experience with BF4 with 1, 2, 3 and/or 4 cards. 0xAA vs 4xAA does nothing to the FPS; FPS at Low and Ultra is the same.

I do not think two of these cards are able to maintain 60 FPS average in BF4 without AA, but I will try it out later. Probably late next week.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Okay, I see. My biggest problem is CPU-bottleneck I think, I benched my 3930K in Tomb Raider at 4,6 Ghz @ 1,44 volts and got an increase of 2 FPS average (two runs) vs. stock CPU clock. I have a pre-owned CPU on it's way that will get my FPS a little higher I think, because I need 4,6 Ghz + for my setup to get higher FPS in games. I will try out Mantle when my machine is up and running again and give you guys feedback on how it is.
> 
> 
> 
> 
> 
> 
> 
> 
> Seems pretty 4K ready to me.
> 
> I am quite surprised with using four cards, I have used 3-way GTX 660 Ti SLI before and this WAY, WAAAAAAAAAAAAAAAAAAAAY more stable than that was. I have a pretty bad experience with BF4 with 1, 2, 3 and/or 4 cards. 0xAA vs 4xAA does nothing to the FPS. FPS at low and ultra is the same.
> 
> I do not think two of these cards are able to maintain 60 FPS average in BF4 without AA, I do not believe so, but I will try it out later. Probably late next week.


Just tried BF4 at 1440p Ultra, 0x AA, 0x FXAA, 150% resolution scaling. I had the game running in DX11. The game looks beautiful. I was getting 60-90 fps with 2 x 290X @ stock. That's effectively rendering at 4K. You should give 2 cards a go; it was butter smooth for me. If I OCed to 1200/1500 it would probably never drop below 60 fps.


----------



## kizwan

Quote:


> Originally Posted by *Gobigorgohome*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> If you have 4 cards and not getting high GPU usage then just leave cards at stock. Overclocking them does nothing. You definitely need to try Mantle but there is a problem with both DX11 and Mantle right now. In BF4 DX11 is not really good with AMD because they have used all their time in Mantle. The problem with Mantle is high vRAM usage. I dont think you can run 4K 0AA with Mantle with 4GB right now. I know because i have tried to run 1440p with 150% resolution scaling which = 4K. 125% is 3K which is my limit. ~ 3.9GB of vRAM used.
> 
> 
> 
> 
> 
> 
> 
> Okay, I see. My biggest problem is CPU-bottleneck I think, I benched my 3930K in Tomb Raider at 4,6 Ghz @ 1,44 volts and got an increase of 2 FPS average (two runs) vs. stock CPU clock. I have a pre-owned CPU on it's way that will get my FPS a little higher I think, because I need 4,6 Ghz + for my setup to get higher FPS in games. I will try out Mantle when my machine is up and running again and give you guys feedback on how it is.
Click to expand...

What was your CPU clocked at while you were trying BF4? What was your CPU usage?
Quote:


> Originally Posted by *Gobigorgohome*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> These cards are not really 4K ready. You really need more than 4GB once you go 4K. Also, 4 cards is asking a bit much for stability. Have you tried with just 2 cards? 2 cards should be able to get 60 fps in BF4 with 0xAA.
> 
> 
> 
> 
> 
> 
> 
> Seems pretty 4K ready to me.
> 
> I am quite surprised with using four cards, I have used 3-way GTX 660 Ti SLI before and this WAY, WAAAAAAAAAAAAAAAAAAAAY more stable than that was. I have a pretty bad experience with BF4 with 1, 2, 3 and/or 4 cards. 0xAA vs 4xAA does nothing to the FPS. FPS at low and ultra is the same.
> 
> I do not think two of these cards are able to maintain 60 FPS average in BF4 without AA, I do not believe so, but I will try it out later. Probably late next week.
Click to expand...

I have not played BF4 for months now, but I did record frame times before. I can get 70s & 80s FPS average in BF4 with DirectX 11 & Mantle respectively, at Ultra & without AA @1080p 200% res scale. It does go as low as 60 FPS from time to time, though.


----------



## ace101

Hi! I'm currently using a Gigabyte R9 290. Can I add another one with my Cooler Master V750?


----------



## Broken-Heart

4GB per GPU is enough, even at 4K.

Check out this article:

http://www.tomshardware.com/reviews/graphics-card-myths,3694-6.html

You would only need more than 4 GB if you use 8x MSAA, and I don't think you need that much AA @4K.
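A rough back-of-envelope (my own numbers, not from the linked article) shows why MSAA at 4K is what blows the VRAM budget: render-target memory scales linearly with the sample count, before textures or G-buffers are even counted.

```python
# Raw render-target memory for a 32-bit color + 32-bit depth buffer.
# With MSAA, the hardware stores one color and one depth sample per
# pixel per sample count, so memory scales with msaa_samples.
def render_target_mib(width, height, msaa_samples=1):
    bytes_per_sample = 4 + 4  # RGBA8 color + D24S8 depth, per sample
    total = width * height * bytes_per_sample * max(msaa_samples, 1)
    return total / 2**20

print(f"4K no AA:   {render_target_mib(3840, 2160, 1):.0f} MiB")  # ~63 MiB
print(f"4K 8x MSAA: {render_target_mib(3840, 2160, 8):.0f} MiB")  # ~506 MiB
```

That half-gigabyte is just one set of buffers; a deferred renderer keeps several, which is how 8x MSAA pushes a 4GB card over the edge.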


----------



## Broken-Heart

Quote:


> Originally Posted by *ace101*
> 
> Hi! I'm currently using Gigabyte R9 290. Can I add another one with my Cooler Master V750?


I don't think 750W is enough for two 290s.

Check out this link

http://www.tomshardware.com/reviews/build-your-own-r9-290-crossfire,3711-18.html
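A quick power-budget sketch backs this up (the wattages below are my own approximations, not figures from the linked review; reference 290s can draw around 250-300W each under load, more when overclocked):

```python
# Hypothetical PSU headroom check for a two-card R9 290 crossfire rig.
# All load figures are approximate, illustrative values.
def psu_headroom(psu_watts, loads):
    """Watts left over after summing the estimated component loads."""
    return psu_watts - sum(loads.values())

loads = {
    "R9 290 #1": 275,          # ~stock board power, higher when OCed
    "R9 290 #2": 275,
    "CPU": 125,
    "board/drives/fans": 75,
}
print(psu_headroom(750, loads))  # watts of headroom on a 750W unit
```

With these estimates the 750W unit has essentially zero headroom, so any overclock or capacitor aging pushes it past its rating; that is why reviews recommend 850W+ for this pairing.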


----------



## heroxoot

Quote:


> Originally Posted by *Broken-Heart*
> 
> 4GB per GPU is enough, even at 4K
> 
> Check out this article:
> 
> http://www.tomshardware.com/reviews/graphics-card-myths,3694-6.html
> 
> You would only need more than 4 GB if you use 8xMSAA and I don't think that you need that much AA @4K


You're crazy to run 8x MSAA at 4K res. The pixel density should make it kind of a waste.


----------



## rdr09

Quote:


> Originally Posted by *heroxoot*
> 
> You're crazy to put 8xMSAA with a 4k res. The pixel density should make it kind of a waste.


In some games you don't have a choice. BF4, for example, only goes up to 4x MSAA. And I agree, 4K probably needs at most 2x MSAA, or something else like SMAA or FXAA. Better let an owner chime in.


----------



## Broken-Heart

^ Yeah, that's what I'm saying.


----------



## Gobigorgohome

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Just tried BF4 at 1440p Ultra, 0x AA, 0x FXAA, 150% resolution scaling. I had the game running in DX11. The game looks beautiful. I was getting 60-90 fps with 2 x 290X @ stock. That's effectively rendering at 4K. You should give 2 cards a go; it was butter smooth for me. If I OCed to 1200/1500 it would probably never drop below 60 fps.


I am getting 65-120 FPS in game on both LOW and ULTRA (same map, 64-player MP); 4x R9 290X @ stock/overclocked is just as bad. Lagging A LOT even with 100+ FPS in-game. Running a Samsung U28D590D with DP on card #1.

I will try crossfire and tri-fire and check if that is the problem. I have similar problems in Metro 2033 and Metro: Last Light too; I just have very, very low FPS even with PhysX turned off. I will have to look at this when my computer is up and running again. I will switch those GPU cables to be on the safe side. Using 13.6, I think.
Quote:


> Originally Posted by *kizwan*
> 
> What was your CPU clocked at while you're trying BF4? What was your CPU usage?
> I have not play BF4 for months now but I did record frame time before. I can get 70s & 80s FPS average in BF4 with DirectX 11 & Mantle respectively at ultra & without AA @1080p 200% res scale. It does go as low as 60 FPS from time to time though.


I have run the CPU at:

4.4 GHz, 4.5 GHz and 4.6 GHz, with the cards at both stock and 1100/1300, using dual PSUs. FPS is not my problem; it is just a bad experience. I may have 100 FPS but it is still lagging. It occurred to me it might be the PCI-E cables to card #2 and card #3; I am using the "two in one" PCI-E cables. I will change these to PCI-E cables with a single 8-pin when I get my CPU. The CPU does not bottleneck at 4.6 GHz, I think; I have not noticed it at least.
Quote:


> Originally Posted by *Broken-Heart*
> 
> 4GB per GPU is enough, even at 4K
> 
> Check out this article:
> 
> http://www.tomshardware.com/reviews/graphics-card-myths,3694-6.html
> 
> You would only need more than 4 GB if you use 8xMSAA and I don't think that you need that much AA @4K


Yes, 4GB should be enough; I have not noticed any bottlenecking really.

Reviews are as useful as boobs on a man; there are too many different factors, such as CPU, RAM, MB, overclocking and such.


----------



## devilhead

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I am getting 65-120 FPS in game with both LOW and ULTRA (same map, 64P MP), 4x R9 290X @ stock/overclocked is just as bad. Laging A LOT even with 100 FPS ++ in-game. Running Samsung U28D590D with DP on card #1.
> 
> I will try crossfire and tri-fire and check if that is the problem. I have kind of problems in Metro 2033 and Metro Last Light too, but I just have very, very low FPS even with phys X turned off. I will have to look at this when my computer is up and running again. I will switch those GPU-cables to be on the safe side. Using 13.6 I think.
> I have done CPU at:
> 
> 4,4 Ghz, 4,5 Ghz and 4,6 Ghz with cards at both stock and 1100/1300, using dual PSU's. FPS is not my problem, it is just a bad experience, I may have 100 FPS but it is still laging. I came to think of it might be the PCI-E cables to card #2 and card #3, I am using the "two in one" PCI-E. I will change this to PCI-E with single 8-pin, when I get my CPU. CPU does not bottleneck at 4,6 Ghz I think, I have not noticed it at least.
> Yes, 4GB should be enough, I have not noticed any bottlenecking really.
> 
> Reviews is as useful as boobs on a man, different factors such as CPU, RAM, MB, overclocking and such.


Just use Mantle. Even with one 290X it is hard to play at 120+ fps with DirectX (it looks like you are playing at 30 fps); with Mantle those games rock.


----------



## Gobigorgohome

Quote:


> Originally Posted by *devilhead*
> 
> Just use Mantle. Even with one 290X it is hard to play at 120+ FPS with DirectX (it looks like you are playing at 30 FPS); with Mantle those games rock.


Then I guess that is my problem; it looks horrible now at least. Thanks bro!


----------



## kizwan

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I have done CPU at:
> 
> 4.4 GHz, 4.5 GHz and 4.6 GHz with the cards at both stock and 1100/1300, using dual PSUs. FPS is not my problem; I may have 100 FPS but it still lags. I came to think it might be the PCI-E cables to cards #2 and #3, since I am using the "two in one" PCI-E cables. I will change these to single 8-pin PCI-E cables when I get my CPU. The CPU does not seem to bottleneck at 4.6 GHz; at least I have not noticed it.
> 
> Yes, 4 GB should be enough; I have not really noticed any bottlenecking.
> 
> Reviews are about as useful as boobs on a man; there are too many different factors such as CPU, RAM, motherboard and overclocking.


The only occasions I experience lagging in BF4 are when I join a bad server or my connection is acting up. Otherwise it runs smooth with two 290's rendering at 4K. You should try Mantle though. Lagging can be caused by a couple of things. In my case, if I remember correctly, it happened because I turned off the page file; I did experience lagging initially and fixed it with a couple of tweaks, one of them turning the page file back on. Do you think BF4 is unable to utilize four 290's properly? Maybe you can try running BF4 with three cards and see whether you still experience the same problem.

Yeah, I think 4 GB should be enough too. I also didn't notice any problem running BF4 with Mantle, rendering at 4K. I don't know whether the VRAM usage reported by BF4 or GPU-Z is correct though; it always shows more than 4 GB of usage, but I didn't notice any problem whatsoever.


----------



## heroxoot

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> You're crazy to use 8xMSAA at 4K res. The pixel density should make it kind of a waste.
> 
> 
> 
> In some games you don't have a choice; BF4, for example, only goes up to 4xMSAA. And I agree, 4K probably needs at most 2xMSAA, or something else like SMAA or FXAA. Better to have an owner chime in.

Exactly. A lot of the games I play max out at 4x MSAA at 1080p. Very few offer 8x for me; if they do I usually just use 4x anyway. It's not incredibly painful and it looks OK.


----------



## Gobigorgohome

Quote:


> Originally Posted by *kizwan*
> 
> The only occasions I experience lagging in BF4 are when I join a bad server or my connection is acting up. Otherwise it runs smooth with two 290's rendering at 4K. You should try Mantle though. Lagging can be caused by a couple of things. In my case, if I remember correctly, it happened because I turned off the page file; I did experience lagging initially and fixed it with a couple of tweaks, one of them turning the page file back on. Do you think BF4 is unable to utilize four 290's properly? Maybe you can try running BF4 with three cards and see whether you still experience the same problem.
> 
> Yeah, I think 4 GB should be enough too. I also didn't notice any problem running BF4 with Mantle, rendering at 4K. I don't know whether the VRAM usage reported by BF4 or GPU-Z is correct though; it always shows more than 4 GB of usage, but I didn't notice any problem whatsoever.


I have a stable 30-40 ping on all the servers I play on, so that is not the problem; it is just as bad wired as wireless. I just think BF4 is poorly optimized for quad crossfire; I mean, it is the only game where I have that problem ... In Metro Last Light it does not seem to run on all four cards even though it should; I have very low FPS, but the game is stunning at 4K. It seems capped at 35-40 FPS even though it is not. I do not know what happened with that game; I had a friend at my home and he looked at it and understood about as much as I did. Same poor FPS at the lowest and the highest settings, so that might be the "DX11" problem too; I will check it out. On DX11 it is so bad that I would rather not play those games at all.

About the "page file" you mentioned: I have never changed anything there. What is it?


----------



## kizwan

Quote:


> Originally Posted by *Gobigorgohome*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> The only occasions I experience lagging in BF4 are when I join a bad server or my connection is acting up. Otherwise it runs smooth with two 290's rendering at 4K. You should try Mantle though. Lagging can be caused by a couple of things. In my case, if I remember correctly, it happened because I turned off the page file; I did experience lagging initially and fixed it with a couple of tweaks, one of them turning the page file back on. Do you think BF4 is unable to utilize four 290's properly? Maybe you can try running BF4 with three cards and see whether you still experience the same problem.
> 
> Yeah, I think 4 GB should be enough too. I also didn't notice any problem running BF4 with Mantle, rendering at 4K. I don't know whether the VRAM usage reported by BF4 or GPU-Z is correct though; it always shows more than 4 GB of usage, but I didn't notice any problem whatsoever.
> 
> 
> 
> 
> 
> 
> 
> I have a stable 30-40 ping on all the servers I play on, so that is not the problem; it is just as bad wired as wireless. I just think BF4 is poorly optimized for quad crossfire; I mean, it is the only game where I have that problem ... In Metro Last Light it does not seem to run on all four cards even though it should; I have very low FPS, but the game is stunning at 4K. It seems capped at 35-40 FPS even though it is not. I do not know what happened with that game; I had a friend at my home and he looked at it and understood about as much as I did. Same poor FPS at the lowest and the highest settings, so that might be the "DX11" problem too; I will check it out. On DX11 it is so bad that I would rather not play those games at all.
> 
> About the "page file" you mentioned: I have never changed anything there. What is it?

It's the Windows page file, disk space that Windows uses as if it were RAM. In your case I can rule out the page file, because stutter/lag caused by the page file is usually accompanied by sudden FPS drops. I can rule out a CPU bottleneck too for the same reason. Did you use frame pacing method 1 in BF4? Try monitoring for CPU spikes in BF4.

DX11 being bad should not be a reason to avoid playing BF4, because you can use Mantle. Initially I went back and forth between DX11 and Mantle, but I have always played BF4 with Mantle since then.


----------



## Vici0us

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> The only occasions I experience lagging in BF4 are when I join a bad server or my connection is acting up. Otherwise it runs smooth with two 290's rendering at 4K. You should try Mantle though. Lagging can be caused by a couple of things. In my case, if I remember correctly, it happened because I turned off the page file; I did experience lagging initially and fixed it with a couple of tweaks, one of them turning the page file back on. Do you think BF4 is unable to utilize four 290's properly? Maybe you can try running BF4 with three cards and see whether you still experience the same problem.
> 
> Yeah, I think 4 GB should be enough too. I also didn't notice any problem running BF4 with Mantle, rendering at 4K. I don't know whether the VRAM usage reported by BF4 or GPU-Z is correct though; it always shows more than 4 GB of usage, but I didn't notice any problem whatsoever.
> 
> 
> 
> I have a stable 30-40 ping on all the servers I play on, so that is not the problem; it is just as bad wired as wireless. I just think BF4 is poorly optimized for quad crossfire; I mean, it is the only game where I have that problem ... In Metro Last Light it does not seem to run on all four cards even though it should; I have very low FPS, but the game is stunning at 4K. It seems capped at 35-40 FPS even though it is not. I do not know what happened with that game; I had a friend at my home and he looked at it and understood about as much as I did. Same poor FPS at the lowest and the highest settings, so that might be the "DX11" problem too; I will check it out. On DX11 it is so bad that I would rather not play those games at all.
> 
> About the "page file" you mentioned: I have never changed anything there. What is it?

I've never had problems with BF4, but I have the same issue with Metro Last Light. When I was running it on 2-way SLI GTX 780 Ti's it was very smooth, but with crossfired 290's I get very low FPS and it's extremely laggy. Could be the overclocking. I'm gonna bring my CPU and GPUs back to stock and see what happens.


----------



## Broken-Heart

^ Possibly driver issues


----------



## rdr09

Quote:


> Originally Posted by *devilhead*
> 
> Just use Mantle. Even with one 290X it is hard to play at 120+ FPS with DirectX (it looks like you are playing at 30 FPS); with Mantle those games rock.


^this and . . .

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Then I guess that is my problem, looks horrible now at least. Thanks bro!


OC the CPU more. And more.


----------



## Imprezzion

I notice that with Mantle, lower average FPS and FPS drops don't really impact visual performance. The game feels butter smooth even at 60-70 FPS, whereas on DX11 the game only feels smooth above 100 FPS.

I play with a single 290X @ 1180 MHz core and 1400 MHz VRAM at 1080p, 125% resolution scale, all Ultra settings with 2x Adaptive AA enabled in CCC. Average FPS is about 80-90, with drops to 65-70 FPS when someone is spamming smoke and flares and whatnot, but the game never ''slows down'' like it would on DX11..

Also, the new drivers seem to have decreased my GPU's temps quite a lot, to my surprise. I used to play BF4 with the above settings at ~70c core and ~85c VRM.
Now, after installing the ''leaked R9 285'' drivers (newer build date than the 14.8's), it runs at ~65c core and ~75c VRM, but performance is even slightly better on Mantle. Weird..


----------



## rdr09

Quote:


> Originally Posted by *Imprezzion*
> 
> I notice that with Mantle, lower average FPS and FPS drops don't really impact visual performance. The game feels butter smooth even at 60-70 FPS, whereas on DX11 the game only feels smooth above 100 FPS.
> 
> I play with a single 290X @ 1180 MHz core and 1400 MHz VRAM at 1080p, 125% resolution scale, all Ultra settings with 2x Adaptive AA enabled in CCC. Average FPS is about 80-90, with drops to 65-70 FPS when someone is spamming smoke and flares and whatnot, but the game never ''slows down'' like it would on DX11..
> 
> Also, the new drivers seem to have decreased my GPU's temps quite a lot, to my surprise. I used to play BF4 with the above settings at ~70c core and ~85c VRM.
> Now, after installing the ''leaked R9 285'' drivers (newer build date than the 14.8's), it runs at ~65c core and ~75c VRM, but performance is even slightly better on Mantle. Weird..


That's about what I get. The temp change could be the changing weather. I use AB to monitor my CPU usage like this . . .



Not sure what will happen if I add another 290. It's using at least 5 of the 8 cores/threads with the CPU OC'd to 4.5 GHz.


----------



## mus1mus

Does anyone know how to make CCC settings stay static after log-on?


----------



## Gobigorgohome

Quote:


> Originally Posted by *rdr09*
> 
> ^this and . . .
> oc the cpu more. and more.


I am trying to get my hands on a few good overclockers as we speak.


----------



## bond32

Looking into an additional power supply... The Corsair is way too expensive right now; my current PSU is the EVGA 1300 G2. Any recommendations? I was thinking of another EVGA, in the 750 W range if it's decent quality. The plan would be to run the three 290X's off the 1300 W G2. Thoughts?


----------



## rdr09

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I am trying to get my hands on a few good overclockers as we speak.


We have a member who found he needed to swap his SB-E for an IVY-E to get quad-fire going for heavy benching.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gobigorgohome*
> 
> I am trying to get my hands on a few good overclockers as we speak.
> 
> 
> 
> We have a member who found he needed to swap his SB-E for an IVY-E to get quad-fire going for heavy benching.

Did he get some big result? Doesn't seem like a great reason to make a jump to me just for a boost in physics in one or two benches lol.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> Did he get some big result? Doesn't seem like a great reason to make a jump to me just for a boost in physics in one or two benches lol.


You know him, tsm: Kaapstad. I saw him make a comment about benching Heaven.

Maybe not Ivy but Haswell-E. Probably has to do with the motherboard as well; you know those things better than most of us.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Did he get some big result? Doesn't seem like a great reason to make a jump to me just for a boost in physics in one or two benches lol.
> 
> 
> 
> You know him, tsm: Kaapstad. I saw him make a comment about benching Heaven.

He still has problems with CFX scaling, it seems, judging by the latest scaling thread he made.


----------



## BradleyW

Most of the time, people mistake poor CFX scaling for a CPU bottleneck. (Yes, even an 8-core can bottleneck if a game does not utilize the CPU efficiently, which is the case for most games.)


----------



## 95139ieaci

I have to rant for a little bit about my Asus R9 290 DC2OC. I got the card two days ago. No overclocking at all in my rig.

The card performs well with demanding games such as Metro: Last Light, but with low-demand games such as The Sims 4 and Fable Anniversary it crashes (BSOD, atikmpag.sys) constantly, sometimes as soon as loading ends, usually five minutes into a game.

I can't believe that after nine months (since its release), the card (and/or its driver or BIOS) is still faulty out of the box.

The card doesn't crash in 3DMark or FurMark (1-hour burn-in), and its core temperature is always below 80C, leading me to believe the problem is neither the PSU nor heat.

I am out of ideas.

Some posts around the internet suggest that the problem is PowerPlay (power management) constantly changing the core clock, and that I should lock the core clock with MSI Afterburner. I haven't tried that method yet, but it has been a really frustrating process getting the card to work.

Current rig:
i5 4590
ASRock B85-ITX
win 8.1 64bit
Asus R9 290 DC2OC
Silverstone SST-ST45SF (450w bronze ITX)


----------



## skline00

Just moved from a single GTX780 Classified under water to 2 Sapphire R9 290 Tri-X gpus under water with EK copper acetal blocks.


----------



## DMT94

Joining








http://www.techpowerup.com/gpuz/details.php?id=5c3wf
Sapphire Vapor-X 290X
Stock Cooling


Spoiler: Picture!


----------



## MrWhiteRX7

Quote:


> Originally Posted by *bond32*
> 
> Looking into an additional power supply... Seeing how the Corsair is way too expensive right now, my current psu is the EVGA 1300 G2. Any recommendations? I was thinking of another EVGA, was going to look into the 750 watt range if it's decent quality. Plan would be to run the 3 290x's off the 1300 watt G2. Thoughts?


I'm running 3 x 290's and an overclocked IVY-E on my 1300g2. No issues even with my 3 cards OC'd.


----------



## Blue Dragon

Quote:


> Originally Posted by *95139ieaci*
> 
> I have to rant for a little bit about my Asus R9 290 DC2OC. I got the card two days ago. No overclocking at all in my rig.
> 
> The card performs well with demanding games such as Metro: Last Light, but with low-demand games such as The Sims 4 and Fable Anniversary it crashes (BSOD, atikmpag.sys) constantly, sometimes as soon as loading ends, usually five minutes into a game.
> 
> I can't believe that after nine months (since its release), the card (and/or its driver or BIOS) is still faulty out of the box.
> 
> The card doesn't crash in 3DMark or FurMark (1-hour burn-in), and its core temperature is always below 80C, leading me to believe the problem is neither the PSU nor heat.
> 
> I am out of ideas.
> 
> Some posts around the internet suggest that the problem is PowerPlay (power management) constantly changing the core clock, and that I should lock the core clock with MSI Afterburner. I haven't tried that method yet, but it has been a really frustrating process getting the card to work.
> 
> Current rig:
> i5 4590
> ASRock B85-ITX
> win 8.1 64bit
> Asus R9 290 DC2OC
> Silverstone SST-ST45SF (450w bronze ITX)


I had fun playing with drivers and whatnot; they did release a flash-

I think I ended up on Catalyst 14.6 with 14.2 drivers,

but I'm running Win 7...


----------



## tsm106

Quote:


> Originally Posted by *95139ieaci*
> 
> I have to rant for a little bit about my Asus R9 290 DC2OC. I got the card two days ago. No overclocking at all in my rig.
> 
> The card performs well with demanding games such as Metro: Last Light, but with low-demand games such as The Sims 4 and Fable Anniversary it crashes (BSOD, atikmpag.sys) constantly, sometimes as soon as loading ends, usually five minutes into a game.
> 
> I can't believe that after nine months (since its release), the card (and/or its driver or BIOS) is still faulty out of the box.
> 
> The card doesn't crash in 3DMark or FurMark (1-hour burn-in), and its core temperature is always below 80C, leading me to believe the problem is neither the PSU nor heat.
> 
> I am out of ideas.
> 
> Some posts around the internet suggest that the problem is PowerPlay (power management) constantly changing the core clock, and that I should lock the core clock with MSI Afterburner. I haven't tried that method yet, but it has been a really frustrating process getting the card to work.
> 
> Current rig:
> i5 4590
> ASRock B85-ITX
> win 8.1 64bit
> Asus R9 290 DC2OC
> Silverstone SST-ST45SF (450w bronze ITX)


They're probably right; it was my first hunch too, reading the symptoms. The fix is easy. Add the offending *.exe to the RTSS profile list and set its detection to none, so that when RTSS detects the app it will ignore it. Btw, when adding the *.exe you don't need to browse to the file; simply type in the filename.exe, since RTSS isn't concerned with the file location. You'll have to play with it and test to make sure you catch all the offending apps. This is really helpful if you like to run apps whilst gaming.

Further explanation.

http://www.overclock.net/t/1265543/the-amd-how-to-thread/360_40#post_21832882


----------



## amptechnow

Anyone know why, in CCC, whatever I have set as the system settings gets cancelled out if I make a profile for a game or program? This is new and was not the case on the last driver install. I've reinstalled, used DDU, and manually removed registry entries and files.

This started when I reflashed the second BIOS on both my cards. My 2nd card had a bad flash and everything went screwy. I flashed back to the BIOS I was using, even tried the 1st BIOS, and I'm still having the same problem. Also, at times CF will not work and I have to restart. Any ideas? I would love to have all my profiles back for different games, because I like to run each one differently.


----------



## heroxoot

I decided to play BF4 for the first time in ages, only to find out I have no Mantle on 14.6 RC2. What gives? Maybe I should update and try the newest 14.8, or whichever it is. My framerate is fairly playable though; it just seems a bit inconsistent. Seeing 50-120 FPS, and on average 60-70.


----------



## naved777

Is SLI scaling better than Crossfire scaling?


----------



## Sgt Bilko

Quote:


> Originally Posted by *naved777*
> 
> is SLI scaling better than Crossfire scaling ?


Not at the moment it's not, from what I've seen.


----------



## Vici0us

Quote:


> Originally Posted by *naved777*
> 
> is SLI scaling better than Crossfire scaling ?


Right now Crossfire scales better.


----------



## BradleyW

Quote:


> Originally Posted by *naved777*
> 
> is SLI scaling better than Crossfire scaling ?


CFX scales better, but SLI deals with CPU overhead better (due to Nvidia's DX driver optimizations), resulting in higher performance despite lower scaling (in some games).


----------



## Gobigorgohome

Quote:


> Originally Posted by *bond32*
> 
> Looking into an additional power supply... Seeing how the Corsair is way too expensive right now, my current psu is the EVGA 1300 G2. Any recommendations? I was thinking of another EVGA, was going to look into the 750 watt range if it's decent quality. Plan would be to run the 3 290x's off the 1300 watt G2. Thoughts?


The EVGA G2 1300W is fine for tri-fire 290X with the rest of the system on the same PSU. I have done it; the only reason I have a second PSU is that I have four cards and some other stuff.
Quote:


> Originally Posted by *rdr09*
> 
> We have a member who found he needed to swap his SB-E for an IVY-E to get quad-fire going for heavy benching.


I am switching from a 3930K (SB-E) @ 4.6 GHz at 1.44 Vcore to a 4930K (IVY-E) @ 4.5 GHz+ at lower voltage than my 3930K. I will not bench much, but I want better performance out of my machine; it is not great with the poor overclocker I have now. Until I get a stable CPU/RAM overclock I will not touch the cards; maybe I'll do 1100/1300 after that, and if that is not satisfying I'll go with PT1/PT3 and see if I can push them some more. I do not know what this will do to my 4K gaming FPS though.
Quote:


> Originally Posted by *tsm106*
> 
> Did he get some big result? Doesn't seem like a great reason to make a jump to me just for a boost in physics in one or two benches lol.


My understanding is that IVY-E is faster than SB-E at the same frequencies. I have a good pre-owned 4930K coming.


----------



## Gobigorgohome

Quote:


> Originally Posted by *naved777*
> 
> is SLI scaling better than Crossfire scaling ?


In my experience crossfire scales better. I had 2x GTX 660 Ti (yes I know, bad cards); I used them in both 2-way and 3-way and saw a very small increase in FPS even at 1080p, and it was not working too well, though there were many factors that could have caused this. When I tried a single R9 290 it worked well, but not great; after adding another card for crossfire it was great, with fewer problems than with one card (yes I know, unbelievable). Now, 4x R9 290X feels like 2-way SLI to me, only better. I have no stuttering, no tearing or anything else, besides small problems with other things here and there.

I am an Nvidia fan and only bought AMD because it was cheap, with 4 GB of memory and a 512-bit bus. I could not be any happier with my purchase.


----------



## bond32

Quote:


> Originally Posted by *Gobigorgohome*
> 
> The EVGA G2 1300W is fine for tri-fire 290X with the rest of the system on the same PSU. I have done it; the only reason I have a second PSU is that I have four cards and some other stuff.
> I am switching from a 3930K (SB-E) @ 4.6 GHz at 1.44 Vcore to a 4930K (IVY-E) @ 4.5 GHz+ at lower voltage than my 3930K. I will not bench much, but I want better performance out of my machine; it is not great with the poor overclocker I have now. Until I get a stable CPU/RAM overclock I will not touch the cards; maybe I'll do 1100/1300 after that, and if that is not satisfying I'll go with PT1/PT3 and see if I can push them some more. I do not know what this will do to my 4K gaming FPS though.
> My understanding is that IVY-E is faster than SB-E at the same frequencies. I have a good pre-owned 4930K coming.


My single EVGA 1300 G2 is not enough. It can run the cards at up to around 1.4 V, but anything higher and it shuts down right away.


----------



## tsm106

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Did he get some big result? Doesn't seem like a great reason to make a jump to me just for a boost in physics in one or two benches lol.
> 
> 
> 
> My understanding is that IVY-E is faster than SB-E at the same frequencies. I have a good pre-owned 4930K coming.
> 

It's a wash. Ivy-E has slightly higher IPC, but it cannot clock as high; in the end the scores are the same. FWIW, Haswell-E gets you a solid boost over SB-E because it nets double the IPC gain of Ivy-E, but it carries a huge price premium and growing pains.
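The "wash" can be sanity-checked with a toy model: relative single-thread throughput is roughly IPC times clock. The IPC and clock figures below are assumptions for illustration, not measured values:

```python
# Rough model: relative single-thread throughput ~ IPC x clock.
# The IPC ratios and clocks are assumed numbers for this sketch only.
def rel_perf(ipc: float, ghz: float) -> float:
    return ipc * ghz

sbe = rel_perf(1.00, 4.9)   # SB-E: baseline IPC, clocks a bit higher
ivye = rel_perf(1.05, 4.7)  # Ivy-E: ~5% more IPC, clocks a bit lower

# The two land within a couple of percent of each other: a wash.
print(f"SB-E: {sbe:.2f}, Ivy-E: {ivye:.2f}")
```

The Physics scores further down (18294 vs 18332) tell the same story in practice.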

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Looking into an additional power supply... Seeing how the Corsair is way too expensive right now, my current psu is the EVGA 1300 G2. Any recommendations? I was thinking of another EVGA, was going to look into the 750 watt range if it's decent quality. Plan would be to run the 3 290x's off the 1300 watt G2. Thoughts?
> 
> 
> 
> I'm running 3 x 290's and an overclocked IVY-E on my 1300g2. No issues even with my 3 cards OC'd.

Running your PSU at peak will kill it sooner rather than later. On paper there's only 400 W spare for your CPU/motherboard, etc. That's not much padding, and it pretty much means you are running near peak whenever you push it.
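That 400 W figure is just the PSU rating minus a ballpark GPU draw; a quick sketch, assuming roughly 300 W per overclocked 290/290X (an assumed round number, not a measurement):

```python
# Back-of-envelope PSU headroom: rated output minus estimated GPU draw.
PSU_WATTS = 1300    # EVGA 1300 G2 rated output
CARD_WATTS = 300    # assumed draw per overclocked R9 290/290X (ballpark)
NUM_CARDS = 3

gpu_load = CARD_WATTS * NUM_CARDS   # 900 W across the three cards
headroom = PSU_WATTS - gpu_load     # left for CPU, board, drives, fans
print(f"GPU load: {gpu_load} W, headroom: {headroom} W")
```

Push the cards' voltage up and that per-card number climbs fast, which is why overvolted trifire can trip even a 1300 W unit.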

Quote:


> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gobigorgohome*
> 
> The EVGA G2 1300W is fine for tri-fire 290X with the rest of the system on the same PSU. I have done it; the only reason I have a second PSU is that I have four cards and some other stuff.
> I am switching from a 3930K (SB-E) @ 4.6 GHz at 1.44 Vcore to a 4930K (IVY-E) @ 4.5 GHz+ at lower voltage than my 3930K. I will not bench much, but I want better performance out of my machine; it is not great with the poor overclocker I have now. Until I get a stable CPU/RAM overclock I will not touch the cards; maybe I'll do 1100/1300 after that, and if that is not satisfying I'll go with PT1/PT3 and see if I can push them some more. I do not know what this will do to my 4K gaming FPS though.
> My understanding is that IVY-E is faster than SB-E at the same frequencies. I have a good pre-owned 4930K coming.
> 
> 
> 
> My single EVGA 1300 G2 is not enough. It can run the cards at up to around 1.4 V, but anything higher and it shuts down right away.

I concur.


----------



## Jflisk

Quote:


> Originally Posted by *bond32*
> 
> My single EVGA 1300 G2 is not enough. It can run the cards at up to around 1.4 V, but anything higher and it shuts down right away.


Bond
I have a 1350 W Thermaltake with 3x R9 290X and an FX-9590 processor. I cannot overclock the processor to the full 5.0 GHz without the system freezing at times. So to say the R9 290X's are power hogs is an understatement. It never froze when I just had 2x R9 290X's. Hope this helps you. Thanks


----------



## Widde

Quote:


> Originally Posted by *Jflisk*
> 
> Bond
> I have a 1350 W Thermaltake with 3x R9 290X and an FX-9590 processor. I cannot overclock the processor to the full 5.0 GHz without the system freezing at times. So to say the R9 290X's are power hogs is an understatement. It never froze when I just had 2x R9 290X's. Hope this helps you. Thanks


My Seasonic 1000 W Platinum gets hot from two 290's, a 3570K at 4.5 GHz 1.35 V, three HDDs and some other stuff, everything OC'd ^^ Power hogs indeed. Gonna drill some holes under it though


----------



## hwoverclkd

Quote:


> Originally Posted by *Widde*
> 
> My Seasonic 1000 W Platinum gets hot from two 290's, a 3570K at 4.5 GHz 1.35 V, three HDDs and some other stuff, everything OC'd ^^ Power hogs indeed. Gonna drill some holes under it though


hot air goes up, so you might want to drill holes at the top


----------



## Widde

Quote:


> Originally Posted by *acupalypse*
> 
> hot air goes up, so you might want to drill holes at the top


Bottom mounted psu







And intake fan facing bottom








Genius


----------



## Gobigorgohome

Quote:


> Originally Posted by *tsm106*
> 
> It's a wash. Ivy-E has slightly higher IPC, but it cannot clock as high; in the end the scores are the same. FWIW, Haswell-E gets you a solid boost over SB-E because it nets double the IPC gain of Ivy-E, but it carries a huge price premium and growing pains.
> I concur.


Haswell-E might be better than both SB-E and IVY-E, but I know for sure that the 4930K I have coming is a very good clocker (from an OCN member). My 3930K is pretty much stuck at 4.6 GHz while the 4930K does 4.9 GHz, so it should help a little. If that does not do the trick I might go for Haswell-E in a while, but I have other priorities before doing that upgrade, so I might use what I have.

I base my recommendations on stock clocks or mild overclocks, not on heavy overclocks.


----------



## tsm106

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> It's a wash. Ivy-E has slightly higher IPC, but it cannot clock as high; in the end the scores are the same. FWIW, Haswell-E gets you a solid boost over SB-E because it nets double the IPC gain of Ivy-E, but it carries a huge price premium and growing pains.
> I concur.
> 
> 
> 
> Haswell-E might be better than both SB-E and IVY-E, but I know for sure that the 4930K I have coming is a very good clocker (from an OCN member). My 3930K is pretty much stuck at 4.6 GHz while the 4930K does 4.9 GHz, so it should help a little. If that does not do the trick I might go for Haswell-E in a while, but I have other priorities before doing that upgrade, so I might use what I have.
> 
> I base my recommendations on stock clocks or mild overclocks, not on heavy overclocks.

Going from a low-to-average SB-E to a golden Ivy-E will do the trick. But really, I think the fact that you had a poor-clocking SB-E is more the problem.

http://www.3dmark.com/compare/fs/2129370/fs/1179138

Look... huge difference right?

Processor
*Intel Core i7-4960X - Intel Core i7-3930K Processor*
Reported stock core clock
3,600 MHz - 3,200 MHz
Maximum turbo core clock
4,934 MHz - 5,110 MHz

Physics Score
*18294.0 - 18332.0*

Physics Test
*58.1 - 58.2 fps*


----------



## ace101

Please help me decide. Which is better for GPU Cooling for my Gigabyte R9 290 Windforce?



OR


----------



## chronicfx

Quote:


> Originally Posted by *Widde*
> 
> Bottom mounted psu
> 
> 
> 
> 
> 
> 
> 
> And intake fan facing bottom
> 
> 
> 
> 
> 
> 
> 
> 
> Genius


Lol


----------



## chronicfx

Quote:


> Originally Posted by *ace101*
> 
> Please help me decide. Which is better for GPU Cooling for my Gigabyte R9 290 Windforce?
> 
> 
> 
> OR


Neither. Snag a second 290 on eBay, run both at stock, and call that your overclock. I guarantee that NZXT junk will never get you even close in FPS.


----------



## kizwan

Quote:


> Originally Posted by *Widde*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *acupalypse*
> 
> hot air goes up, so you might want to drill holes at the top
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bottom mounted psu
> 
> 
> 
> 
> 
> 
> 
> And intake fan facing bottom
> 
> 
> 
> 
> 
> 
> 
> 
> Genius

I don't see anything wrong with that, unless there are no holes for the fan to pull air into the PSU. Mine faces the bottom too; it does warm up a bit under load. You have fans inside the case, right? Then you can control where the hot air goes. I totally agree with the power-hog comment though. Even with two 290's, for cool & quiet operation, I think you want at least a 1200-1300 W PSU.

BTW, I set the switch to normal instead of hybrid, so the fan is always spinning, and it still runs quiet even under load; at least I don't hear it over my rad fans. Also, I have the side panel off when gaming because my case has bad airflow with all the rads (fans in intake config).


----------



## heroxoot

So is everyone using that August 27th driver? I heard it has an issue with ULPS. Not an issue for a single card user like me. The performance is great tho. Extra fps in benchmarks so far and no more heat. Went from 53 (give or take .1fps) to 55.6fps in heaven.

I never thought I'd see the day we could get 60fps on heaven with a single card doing 1080p. We have come a long way.


----------



## BradleyW

Quote:


> Originally Posted by *heroxoot*
> 
> So is everyone using that August 27th driver? I heard it has an issue with ULPS. Not an issue for a single card user like me. The performance is great tho. Extra fps in benchmarks so far and no more heat. Went from 53 (give or take .1fps) to 55.6fps in heaven.
> 
> I never thought I'd see the day we could get 60fps on heaven with a single card doing 1080p. We have come a long way.


I've been using 14.7 RC3 for a while. It has slightly better DirectX software-level performance compared to 14.4. For example, in a CPU-limited area of Metro 2033 Redux, I gained 5 fps with RC3.

RC3 has ULPS issues in certain games.


----------



## Widde

Quote:


> Originally Posted by *kizwan*
> 
> I don't see anything wrong with that, unless there's no holes for the fan to pull air into the PSU. Mine facing bottom too. It does warm up a bit when under load. You have fans inside the case, right? Then you can control where hot air goes. I totally agree with powerhogs comment though. Even with two 290's, for cool & quiet operation, I think at least 1200 - 1300W PSU.
> 
> BTW, I set the switch to normal instead of hybrid. So, the fan always spinning & it still running quiet even under load. At least I don't hear it over my rads fans. Also I have the side panel off when gaming because my case have bad airflow with all the rads (fans intake config).


Old Antec Three Hundred, so no ventilation at the bottom








Gonna drill some holes there tomorrow


----------



## heroxoot

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> So is everyone using that August 27th driver? I heard it has an issue with ULPS. Not an issue for a single card user like me. The performance is great tho. Extra fps in benchmarks so far and no more heat. Went from 53 (give or take .1fps) to 55.6fps in heaven.
> 
> I never thought I'd see the day we could get 60fps on heaven with a single card doing 1080p. We have come a long way.
> 
> 
> 
> I've been using 14.7 RC3 for a while. It has slightly better Direct-X software level performance compared to 14.4. For example, during a CPU limited area of Metro 2033 Redux, I gained 5 fps with RC3.
> 
> RC3 has ULPS issues in certain games.

I tried 14.8 and the 14.xx aug 27th and the difference between the two was .6fps more. I skipped 14.7, but I think it would have been close with that 14.8.


----------



## VSG

The prodigal son returns:



Ok, that isn't the 2x R9 290x I had before, but this is good enough to re-join I would hope









Way better pictures tomorrow, this is going in Side 2 of the TX10-D build going on (in sig) for the girlfriend with a custom loop of course. Now I may try my hand at benching this, but longevity is more important here so don't expect much. @HOMECINEMA-PC, I may hit you up for BIOS/tips though!


----------



## Arizonian

Quote:


> Originally Posted by *geggeg*
> 
> The prodigal son returns:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Ok, that isn't the 2x R9 290x I had before, but this is good enough to re-join I would hope
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Way better pictures tomorrow, this is going in Side 2 of the TX10-D build going on (in sig) for the girlfriend with a custom loop of course. Now I may try my hand at benching this, but longevity is more important here so don't expect much. @HOMECINEMA-PC, I may hit you up for BIOS/tips though!


Good enough for me. Congrats - added


----------



## VSG

I will do a stock cooler vs Corsair HG10 vs Swiftech R9-LE/custom loop test soon before some benching and then it's off to do some medical research.


----------



## ace101

Quote:


> Originally Posted by *chronicfx*
> 
> Neither.. Snag a second 290 on ebay run both stock and call that your overclock. Guarantee that nzxt junk will never get you even close in fps.


How about cooling? I tried 3DMark11 with everything stock and it reached 85°C and made a lot of noise with the stock cooler. Aside from overclocking, I really want to reduce the heat from the GPU.


----------



## ILLmatik94

Is fan rattle pretty much unavoidable without the rubber mod, even with newer Tri-X cards? 3 months ago a Sapphire rep claimed it only happened with older cards and has been FIXED but it still seems like a common issue? http://forums.overclockers.co.uk/showthread.php?s=dcb9082a59e64a33009780c59cb10137&t=18602589&page=2


----------



## ozzy1925

Quote:


> Originally Posted by *ILLmatik94*
> 
> Is fan rattle pretty much unavoidable without the rubber mod, even with newer Tri-X cards? 3 months ago a Sapphire rep claimed it only happened with older cards and has been FIXED but it still seems like a common issue? http://forums.overclockers.co.uk/showthread.php?s=dcb9082a59e64a33009780c59cb10137&t=18602589&page=2


I tried this method on 2 of my cards but it didn't help me.


----------



## pdasterly

RMA approved on the Sapphire 290X. I had an NZXT cooler on it, then a full EK waterblock. Funny part: I lost a few thermal pads, so I put some new ones on (2) and they were a different color; also, my BIOS was flashed to Tri-X OC. I feel better about my purchase now. Took 10 days after they received the RMA.


----------



## Wezzor

Would anyone mind linking a good overclocking guide?


----------



## KnownDragon

Quote:


> Originally Posted by *Wezzor*
> 
> Would anyone mind linking a good overclocking guide?


Pretty much any guide on overclocking the 290 is going to be the same. Core is king, so start with it. I would go in increments of 5 MHz and stress with Unigine or Firestrike loops. Do this until you see no more fps gain or start to see artifacts, then bump voltage. Rinse and repeat until max overclock, then start on your RAM. I always bump my RAM by 100 MHz, test until it artifacts, then back it off by 10 MHz until the artifacts are gone. A lot of guys mod the BIOS if possible once they achieve their max overclock. Have good cooling though.
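The rinse-and-repeat stepping above can be sketched as a simple search loop. This is only a toy sketch: `is_stable` stands in for a real stress test (a Unigine or Firestrike loop), and the stability model at the bottom is made up purely for illustration.

```python
# Sketch of the incremental overclock search described above.
# is_stable(core, vstep) is a hypothetical stand-in for a real stress test.

def find_max_core(base_mhz, max_voltage_steps, is_stable, step=5):
    """Raise the core clock in small steps; on instability, bump voltage and retry."""
    core = base_mhz
    vstep = 0
    while True:
        if is_stable(core + step, vstep):
            core += step            # stable: keep the higher clock
        elif vstep < max_voltage_steps:
            vstep += 1              # artifacts: add one voltage notch and retry
        else:
            return core, vstep      # out of voltage headroom: done

# Toy stability model for demonstration: each voltage notch buys ~25 MHz of headroom
limit = lambda v: 1000 + 25 * v
core, v = find_max_core(947, 3, lambda c, v: c <= limit(v))
print(core, v)  # 1072 3
```

The same loop applies to memory, just with the coarser 100 MHz up / 10 MHz back steps described above.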


----------



## Gobigorgohome

Quote:


> Originally Posted by *tsm106*
> 
> Going from a low to average sbe to a gold ivye will do the trick. But really, I think the fact that you had a poor clocking sbe is more the problem.
> 
> http://www.3dmark.com/compare/fs/2129370/fs/1179138
> 
> Look... huge difference right?
> 
> Processor
> *Intel Core i7-4960X - Intel Core i7-3930K Processor*
> Reported stock core clock
> 3,600 MHz - 3,200 MHz
> Maximum turbo core clock
> 4,934 MHz -5,110 MHz
> 
> Physics Score
> *18294.0 - 18332.0*
> 
> Physics Test
> *58.1 - 58.2 fps*


The difference from my poor 3930K to a good 4930K will be noticeable, I think, at least if I decide to push my cards.

I do not really think the 3930K is too bad for the price I paid (250 USD), but the last owner said he had it at 4.4 GHz with an AIO (so I knew it was not a silver/golden sample), and the fact that I had to use 1.408 V for 4.5 GHz says a lot. I guess I will see the difference of poor SB-E vs good IVY-E late this week or at least some time soon.


----------



## mojobear

Hey guys,

Wanted to show you all this. I know this is the 290(X) club, but this offer is just too damn good for Canadians. TigerDirect has the R9 295X2 for 1120, which is the US promotional price of 999 + exchange. Usually we get things for a bit more than exchange. Newegg.ca has the R9 295X2 for 1300ish.

Found this out because a friend just went out and bought one today









http://www.tigerdirect.ca/applications/SearchTools/item-details.asp?EdpNo=8971203&CatId=7387


----------



## Wezzor

Quote:


> Originally Posted by *KnownDragon*
> 
> Pretty much any guide on overclocking the 290 is going to be the same. Core is King start with it. I would say go in increments of 5mhz and stress with Uniengine or firestrike loops. Do this until you see no more fps gain or start to see artifacts. Then bump voltage. Repeat and wash until max overclock then start on your ram. I always bump my ram by 100mhz test until it artifacts then back it off by 10mhz until artifacts are gone. A lot of guys are modding the bios if possible once they achieve their max overclock. Have good cooling though


Thank you! Anyway, when should I start worrying about temps? Right now, when I max out a heavy game, my card gets to 75°C. And when I increase voltage, by how much should I increase it each time?


----------



## KnownDragon

Quote:


> Originally Posted by *Wezzor*
> 
> Thank you! Anyway, when should start worrying about temps? Right now when I max out heavy game my card gets to 75 c. And when I increase voltage by how much should I increase each time?


I would say you should get GPU-Z and monitor the GPU while testing. I believe at 90°C the card will throttle. I am lucky and can squeeze 150 MHz out of the core with no voltage boost; on RAM I only get 100 MHz. I would start off with small voltage boosts, 0.001 V, but that is me; some may go 0.01 or so. When you have to start adding voltage for clocks, the temps will skyrocket pretty fast if you're not custom cooled. I find it more aggravating to overclock a GPU because of the hard freezes. If you stay with the stock cooler, replacing the TIM with a good one will generally shave a few degrees off; do it at your own risk. So let's say your card is at 947 core and you can get 100 MHz with no voltage boost: you should see more fps but higher temps. Make notes of your fps each time; when you stop seeing gains, bump the voltage. 0.005 V wouldn't hurt and might let you find the maximum core with cooler temps. If you bump it 0.005 and gain 20 fps, give it another 0.005 bump; if only a few more fps show up, bump the core up again.


----------



## Wezzor

Quote:


> Originally Posted by *KnownDragon*
> 
> I would say that you should get gpu-z and monitor the gpu while testing. I believe at 90 the card will throttle. I am lucky and can squeeze 150 MHz with no voltage boost on core. Ram I only get 100 MHz. I would start off with small voltage boosts .001 but that is me. Some may go .01 or etc. when you have to start adding voltage for clock the temps will skyrocket pretty fast if not custom cooled. I find it more aggravating to overclock a gpu because of the hard freezes. If you stay with stock cooler replacing the Tim with a good one will generally shave a few degrees off. Do at your own risk. So let's say if your card is at 947 core and you can get a 100 MHz with no voltage boost you should see more fps but higher temps. Make notes each time of your fps when you stop seeing gains bump the voltage. .005 wouldn't hurt and might allow you to find maximum core with cooler temps. If you bump it .005 and gain 20 fps give it another .005 bump if only a few more fps show up bump the core up again.


I went from 1000/1300 to 1100/1400 with just a 5 FPS increase (temps were fine). It just feels like a complete waste of time overclocking your GPU if the FPS increase is so low. Idle temps went from 35°C to 55°C, but under load the temps were the same. I didn't even touch the voltage, just the power limit. Am I doing something wrong, or should I expect that kind of fps increase with these clocks? I could probably push more, but since I didn't see any FPS increase it felt meaningless.


----------



## Zipperly

Quote:


> Originally Posted by *Wezzor*
> 
> I went from 1000/1300 to 1100/1400 with just 5 FPS increase. (temps were fine) It just feels like a completly waste of time overclocking your GPU if the FPS increase is so low. Idle temps went from 35 c to 55 c but during load the temp were the same. I didn't even touch the voltage just the power limit. Am I doing something wrong or should I expect that kind of fps increase with these clocks? I could probably pump more but since I didn't see any FPS increase it felt just meaningless.


This is what I noticed when I had my 290X clocked to 1200 MHz core, "no throttle". It was a little faster than stock but not by a whole lot; it just doesn't seem to scale that great with an overclock. My GTX 780 with an overclock is an entirely different story: huge difference between overclocked and stock clocks.
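A quick back-of-the-envelope check helps judge that scaling: in a fully GPU-bound scene, the best case is fps rising linearly with core clock, so anything well below that suggests a CPU, memory, or throttling limit. The numbers below are hypothetical, just to illustrate the upper bound.

```python
# Rough sanity check for clock-vs-fps scaling (hypothetical numbers).

def ideal_fps(stock_fps, stock_mhz, oc_mhz):
    """Upper bound on fps after a core overclock, assuming perfect linear scaling."""
    return stock_fps * oc_mhz / stock_mhz

# Example: a scene running 50 fps at 1000 MHz core, overclocked to 1200 MHz
best_case = ideal_fps(50, 1000, 1200)
print(best_case)  # 60.0 fps at most; CPU- or memory-bound scenes land well below this
```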


----------



## Wezzor

Quote:


> Originally Posted by *Zipperly*
> 
> This is what I noticed when I had my 290X clocked to 1200mhz core "no throttle". It was a little faster than stock but not by a whole lot, just doesnt seem to scale that great with an overclock. My GTX 780 with an overclock is an entirely different story, huge difference between overclock and stock clocks.


Ahh, thought I was doing something wrong.







Anyway, do you know why my idle temps got higher but not my load temps? When using stock clocks, which I'm using now, my card idles around 30-35°C, but when I overclocked it was idling at 50-55°C.


----------



## Zipperly

Quote:


> Originally Posted by *Wezzor*
> 
> Ahh, thought I was doing something wrong.
> 
> 
> 
> 
> 
> 
> 
> Anyway, do you know why my idle temps did get higher but not load temps? When using stock clock which I'm using now my card is idling around 30-35 c but when I overclocked it was idling on 50-55 c.


Not sure why your idle temps would be doing that unless the card is not switching back to 2d voltages after you exit a game.


----------



## Wezzor

Quote:


> Originally Posted by *Zipperly*
> 
> Not sure why your idle temps would be doing that unless the card is not switching back to 2d voltages after you exit a game.


How do I check that? RTSS?


----------



## Jflisk

Quote:


> Originally Posted by *heroxoot*
> 
> So is everyone using that August 27th driver? I heard it has an issue with ULPS. Not an issue for a single card user like me. The performance is great tho. Extra fps in benchmarks so far and no more heat. Went from 53 (give or take .1fps) to 55.6fps in heaven.
> 
> I never thought I'd see the day we could get 60fps on heaven with a single card doing 1080p. We have come a long way.


Where is this Aug 27th driver you speak of? Do you mean 14.8? Thanks.


----------



## BradleyW

Quote:


> Originally Posted by *Jflisk*
> 
> where is this AUG 27th driver you speak of. Do you mean 14.8. Thanks


14.8 Leak.


----------



## Jflisk

Okay, looking for temps on 3x or 4x R9 290X watercooled, after about an hour of gaming. Looking to compare my delta to others' to make sure I am not losing it. Thanks all.


----------



## sugarhell

Quote:


> Originally Posted by *BradleyW*
> 
> 14.8 Leak.


It's not a leak. It's the 14.8 for APUs, but it works for desktop GPUs too.


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> Its not a leak. Its the 14.8 for apus but is working for desktop gpus too.


Ahh yes I remember.


----------



## bond32

Quote:


> Originally Posted by *Jflisk*
> 
> Okay looking for temps on the R9 290X x 3 or 4 watercooled after About an hour of gaming. Looking to compare my delta to others make sure I am not loosing it. Thanks all


Considering I have just an absolutely absurd cooling setup now: my three, after benching, topped out at 49°C, which with ambient at 23°C gives a delta of 26 degrees. Mind you, I have 1320mm of rad space, some 80mm thick, with 2 MCP50X pumps in series...

This is with +100 mV.

The pumps help drastically.
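For anyone comparing loops, the delta quoted here is just load temperature minus ambient, which is the fair way to compare temps across different rooms:

```python
# Delta-T as used above: GPU load temperature minus ambient temperature.

def delta_t(load_c, ambient_c):
    """Loop performance metric: degrees above ambient under load."""
    return load_c - ambient_c

print(delta_t(49, 23))  # 26
```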


----------



## bluedevil

Trying to really set my 290 free - should I sell my Z77/3470 for a Z97/4690K? Or just get a 3770K? Only thing is, my mobo has no voltage control.


----------



## Jflisk

Quote:


> Originally Posted by *bond32*
> 
> Considering I have just an absolutely absurd cooling setup now, my three after benching topped at 49 C which with ambient at 23 C, gives a delta of 26 degrees. Now mind you I have 1320mm of rad space, some 80mm thick with 2 MCP50x pumps in series...
> 
> This is with +100 mv.
> 
> The pumps helps drastically.


Okay, then I am not that far off. My 3x R9 290 (overclocked 1060/1350), FX-9590, with a Monsta 80mm 120.2, a 120, and a 240.2, dual D5, is running 43°C idle to 61°C full load max (Crysis 3, 2 hrs). I guess it could always be better if I want to go crazy with it. What I am saying is, and I know each loop is different, my temps are not that bad. My temps start at CPU 38°C and GPUs 43°C, then go up to CPU 61°C and 3x GPU 58°C. +1 Rep, thanks.


----------



## bond32

Quote:


> Originally Posted by *Jflisk*
> 
> Okay then I am not that far off my 3x r9 290(overclocked 1060-1350) FX9590 with a monsta 80mm 120.2 -120 - 240.2 Dual D5 Is running 43C Idel to 61C full load max. Crysis3 2 hrs. I guess it could always be better if I want to go crazy with it. I guess what I am saying is and I know each loop is different. My temps are not that bad.My temps start CPU 38C and GPUs 43C then go up to CPU 61C 3x GPU 58C.+1 Rep Thanks


Are you using the PT BIOS, or did you set it in AB to disable 2D clocks? Your idle temps, while really not that important, seem high. Mine idles around 29-31.

I assume your blocks are set up in parallel? Dual D5's could push series no problem; that would give some better temps.


----------



## DampMonkey

Quote:


> Originally Posted by *Jflisk*
> 
> Okay then I am not that far off my 3x r9 290(overclocked 1060-1350) FX9590 with a monsta 80mm 120.2 -120 - 240.2 Dual D5 Is running 43C Idel to 61C full load max. Crysis3 2 hrs. I guess it could always be better if I want to go crazy with it. I guess what I am saying is and I know each loop is different. My temps are not that bad.My temps start CPU 38C and GPUs 43C then go up to CPU 61C 3x GPU 58C.+1 Rep Thanks


What's your scaling like with those cards? It seems like you should be bottlenecked by your CPU or PCIe lanes with that much GPU muscle.


----------



## heroxoot

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BradleyW*
> 
> 14.8 Leak.
> 
> 
> 
> Its not a leak. Its the 14.8 for apus but is working for desktop gpus too.

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jflisk*
> 
> where is this AUG 27th driver you speak of. Do you mean 14.8. Thanks
> 
> 
> 
> 14.8 Leak.

You're both wrong. It's the R9 285 driver fixed to work on all cards. 14.8 APU driver is different and came out 2 weeks earlier than this one.
http://forums.guru3d.com/showthread.php?t=392555


----------



## BradleyW

Quote:


> Originally Posted by *heroxoot*
> 
> You're both wrong. It's the R9 285 driver fixed to work on all cards. 14.8 APU driver is different and came out 2 weeks earlier than this one.
> http://forums.guru3d.com/showthread.php?t=392555


The R9 285 driver is what I had in mind, but I thought it was leaked before it got released properly. Anyway, thanks for the clarification. I'm still going to wait for the next drivers to show up on AMD's site under the R9 290X section for Win 8.1 64-bit.


----------



## Jflisk

Quote:


> Originally Posted by *DampMonkey*
> 
> Whats your scaling like with those cards? It seems like you should be bottlenecking from your CPU or PCIe lanes with that much GPU muscle


I don't ever bottleneck. I play games like Crysis 3 at 150+ frame rates. The FX-9590 is an 8-core 4.7 GHz processor that tops out at 5.0 GHz. As far as scaling, I can do 3D 1080i x 2 with no problem maintaining a constant 60 FPS in any game.


----------



## Jflisk

Quote:


> Originally Posted by *bond32*
> 
> Are you using the PT bios or did you set it in AB to disable 2D clocks? Your idle temps, while really not that important, seem high. Mind idles around 29-31.
> 
> I assume your blocks are setup in parallel? Dual D5's could push series no problem, that would give some better temps.


I have ULPS disabled, that's about it, and the overclock at 1060/1350. Looks like it's sitting at 300/150 at idle. Yes, the GPUs are in parallel. Thanks.


----------



## BradleyW

Quote:


> Originally Posted by *Jflisk*
> 
> I don't ever bottle neck. I play games like crisis 3 at 150+ Frame rates. The FX9590 is a 8 core 4.7GHZ That tops out at 5.0GHZ processor. As far as scaling I can do 3D 1080I x 2 with no problems maintaining 60 FPS constant in any game.


What settings do you play at in Crysis 3 to get 150 fps, despite the game being heavily CPU limited? My 3930K is quicker, yet I can drop as low as 70 in a CPU-constrained area. I ain't saying you're telling lies, I just want to know what settings you played at or whether you made any useful tweaks that might help me.


----------



## Jflisk

Quote:


> Originally Posted by *BradleyW*
> 
> What settings do you play at on Crysis 3 to get 150 fps despite the game being heavily CPU limited. My 3930K is quicker, yet I can drop as low as 70 in a CPU constraint area.


I'll run a game and let you know. I am pretty sure I set it to Ultra. Thanks.


----------



## heroxoot

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> You're both wrong. It's the R9 285 driver fixed to work on all cards. 14.8 APU driver is different and came out 2 weeks earlier than this one.
> http://forums.guru3d.com/showthread.php?t=392555
> 
> 
> 
> R9 285 driver is what I had in mind, but I thought it was leaked before it got released properly.Anyway thanks for clarification. I'm still going to wait for next drivers to show up on AMD under the R9 290X section for Win 8.1 64 bit.

Yeah, it's a great driver but it is a little buggy. So far I hear ULPS has to be disabled. Other than that, I had CCC crash once, but it didn't cause an issue.


----------



## battleaxe

Quote:


> Originally Posted by *Vici0us*
> 
> I've never had problems with BF4 but I have the same issue with Metro Last Light. When I was running it on 2-Way SLI GTX 780 Ti's it was very smooth! But with Crossfire 290's I get very low FPS, extremely laggy. Could be overclocking. I'm gonna bring my CPU and GPU's back to stock and see what happens.


Turn off PhysX in Metro LL... I had the same thing.


----------



## Jflisk

Quote:


> Originally Posted by *BradleyW*
> 
> What settings do you play at on Crysis 3 to get 150 fps despite the game being heavily CPU limited. My 3930K is quicker, yet I can drop as low as 70 in a CPU constraint area. I ain't saying your telling lies, I just want to know what settings you played at or if you made any useful tweaks which might help me.


You would be correct. I just got done playing for a few. I saw 50 FPS at the bottom and 100 FPS at the top.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BradleyW*
> 
> What settings do you play at on Crysis 3 to get 150 fps despite the game being heavily CPU limited. My 3930K is quicker, yet I can drop as low as 70 in a CPU constraint area. I ain't saying your telling lies, I just want to know what settings you played at or if you made any useful tweaks which might help me.
> 
> 
> 
> You would be correct. I just got done playing for a few. I saw 50FPS as the bottom and 100FPS at the top.

Yeah I didn't think that sounded right :/


----------



## VSG

Better pictures, as promised:









and thus the circle of (GPU) life is complete


----------



## valgusepoiss

Hi everyone!

I was planning to buy a new Gigabyte R9 290 for 390 euros, but instead I got a good deal on a used 290X for 255 euros. It was used for mining for 3 weeks, but other than that it was not used much.

The thing is... maybe I expected too much. But hell, it's a 500+ euro card when bought online in our country, and it's described as an enthusiast-tier card. So I was expecting to play everything maxed at 1080p above 60 fps (my monitor is 60 Hz), but I was a little disappointed when, even without much AA and only 8x AF, the latest games (like Assassin's Creed Black Flag, which stutters when looking around, or Metro 2033 Redux, Tomb Raider, and Watch Dogs at around 40 fps at the start of the game) were often going under 60 FPS.

Oh, and I always use vsync. The tearing is a major immersion killer for me.

Well, of course I am enjoying A LOT the good graphics and the often above-60 FPS the 290X is delivering, and I'm generally very happy with the purchase, but some part of me can't help feeling a little bit disappointed.

How are your cards performing?

Do you have any advice on graphics settings I should turn down to get a constant 60 fps? I bet I can't even tell the difference at some settings in a side-by-side comparison.


----------



## nightfox

Quote:


> Originally Posted by *valgusepoiss*
> 
> Hi everyone!
> 
> I was planning to buy new Gigabye R9 290 for 390 euros but instead i got good deal of used 290x for 255 euros. It was used for mining for 3 weeks but other than that, it was not used much.
> 
> The thing is... i maybe expected too much. But hell, its a 500+ euros card when bought online in our country and its described as enthusiast tier card. SO i was expecting to play everything maxed at 1080p above 60 fps (my monitor is 60hz) but was a little disappointed when, not even using much aa and only 8x AF, the latest games (like Assasins Creed Black Flag - stutters when looking around or Metro 2033 Redux or Tomb Raider or Watch Dogs - around 40 fps at the start of game) were often going under 60 FPS.
> 
> Oh, and i always use vsync. The tearing is a major immersion killer for me.
> 
> Well, of course i am enjoying A LOT the good graphics and often above 60FPS the 290x is delivering me and im generally very happy with the purchase but some part of me cant help to feel a little bit disappointed.
> 
> How are your cards performing?
> 
> DO you have any advice on graphic settings i should turn down to get constant 60fps? I bet i cant even tell the difference on side by side comparison at some settings


The games you mentioned, except Tomb Raider, are all NVIDIA games. Turn PhysX off. PhysX is making your game crawl, as PhysX is driven by your CPU, and given that your CPU is not OC'd (I assume your system is the one in your signature), it is not enough.

On the other hand, for Tomb Raider you might want to reinstall your driver. Tomb Raider works flawlessly for me (single 290 and a 1600p monitor). Try setting it to High first.


----------



## valgusepoiss

Quote:


> Originally Posted by *nightfox*
> 
> The games you mentioned except Tomb raider are all NVIDIA games. Turn physx off. your physx is making your game crawl as physx is driver by your CPU and given the fact that your CPU is not OC'd, (I assume your system is at your signature), it is not enough.
> 
> On the hand, tomb raider. You might want to reinstall your driver. Tomb raider is working flawless for me. (single 290 and 1600p monitor). Try setting it to high first.


Thanks for the advice. Is PhysX disabled in the game menu or somewhere else? And yeah, Tomb Raider runs best out of these games.

EDIT:
I also installed the beta driver. Will see how that works out. It has improvements for some of the games I mentioned.


----------



## nightfox

Quote:


> Originally Posted by *valgusepoiss*
> 
> Thanks for the advice. Is physx disabled in game menu or somewhere else? And yeah Tomb Raider runs best out of these games.
> 
> EDIT:
> I also installed beta driver. Will see how that works out. It has improvements on some games i mentioned.


Yes, PhysX is disabled in the game menu; mostly it's in the graphics options.

I just checked AC Black Flag: it is in graphics options, very last part (PhysX particles).


----------



## valgusepoiss

Quote:


> Originally Posted by *nightfox*
> 
> I just check AC Black Flag it is in Graphic options very last part (physx particles)


I don't even have the option; the last one is v-sync. But thanks for the advice... I'll check other games for particle settings too.


----------



## joeh4384

Quote:


> Originally Posted by *valgusepoiss*
> 
> Hi everyone!
> 
> I was planning to buy new Gigabye R9 290 for 390 euros but instead i got good deal of used 290x for 255 euros. It was used for mining for 3 weeks but other than that, it was not used much.
> 
> The thing is... i maybe expected too much. But hell, its a 500+ euros card when bought online in our country and its described as enthusiast tier card. SO i was expecting to play everything maxed at 1080p above 60 fps (my monitor is 60hz) but was a little disappointed when, not even using much aa and only 8x AF, the latest games (like Assasins Creed Black Flag - stutters when looking around or Metro 2033 Redux or Tomb Raider or Watch Dogs - around 40 fps at the start of game) were often going under 60 FPS.
> 
> Oh, and i always use vsync. The tearing is a major immersion killer for me.
> 
> Well, of course i am enjoying A LOT the good graphics and often above 60FPS the 290x is delivering me and im generally very happy with the purchase but some part of me cant help to feel a little bit disappointed.
> 
> How are your cards performing?
> 
> DO you have any advice on graphic settings i should turn down to get constant 60fps? I bet i cant even tell the difference on side by side comparison at some settings


Do you have a reference model? They are known to overheat and throttle.


----------



## Offler

OK. Just ordered an R9 290X, Sapphire reference model.

Reason: my current HD 7970 OC Rev 2.1 by Gigabyte has a very poor ASIC - 61% - and overclocking beyond 1050 MHz is simply not possible.

Bought from a shop as used, with a 1-year warranty; it had been used by reviewers when the R9 290X was new on the market. Price: 250 euros.


----------



## valgusepoiss

Quote:


> Originally Posted by *joeh4384*
> 
> Do you have a reference model? They are known to overheat and throttle.


I have this one:
http://www.gigabyte.us/products/product-page.aspx?pid=4885#ov

CPUID HWMonitor showed GPU max temps of 83°C after a short game session.


----------



## rdr09

Quote:


> Originally Posted by *Offler*
> 
> OK. Just ordered R9-290x, Sapphire reference model.
> 
> Reason: My current HD 7970oc Rev 2.1 by Gigabyte has very poor ASIC - 61%, and overclocking beyond 1050Mhz is simply not possible.
> 
> Bought from shop as used, with 1 year warranty, and was used for reviewers when R9-290x was new on the market. Price: 250 euro.


I had two 7900-series cards that were below 60% ASIC; all did 1200 core. One 7950 was 58% and benched at 1255 core.

No tweaks...

All watered.


----------



## Offler

Nice.

I am not much of a fan of water cooling. I am a big metal fan









Most water cooling stuff available here is made from plastics or rubber, which may "dry" over time and start to crack/break up. Also, I have a Thermaltake case with a big 25cm fan on the side, so this may help cool down that hot Hawaiian volcano...


----------



## rdr09

Quote:


> Originally Posted by *Offler*
> 
> Nice.
> 
> I am not much fan of water cooling. I am big metal fan
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Most water cooling stuff available here is made from plastics or rubber which may "dry" over time and start to crack/break up. Also I have Thermaltake case with big 25cm fan on the side so this may help to cool down that hot Hawaiian volcano...


Got one used for $100



1115 core

http://www.3dmark.com/3dm/4025300?

Can't push higher 'cause of the PSU and being on air; no voltage tweaks. Very much like Hawaiis - they love the cold.


----------



## sugarhell

Quote:


> Originally Posted by *rdr09*
> 
> got one used for a $100
> 
> 
> 
> 1115 core
> 
> http://www.3dmark.com/3dm/4025300?
> 
> can't push higher 'cause of psu and on air. no voltage tweaks. very much like Hawaiis. they love the cold.


This is a boost card?


----------



## Offler

The Gigabyte HD 7970 OC had 2 major revisions. I had rev 2.1 - a much worse VRM (hot as hell). Even the Windforce cooler wasn't able to cool it down.

Also, I checked the cooler and thermal compound and replaced it immediately after purchase. The cooler was already slightly corroded in a spot where there was an air bubble. No change...

Tests for flickering were always positive when the frequency was over 1060 MHz...

Edit: Also, there was no official way to increase voltage for the chip. I modded it with another Gigabyte BIOS (for Tahiti XT2), and it was able to run up to 1100 MHz without flickering, but it was hot as hell and throttled under load...


----------



## rdr09

Quote:


> Originally Posted by *sugarhell*
> 
> This is a boost card?


correct, so no voltage needed for just this oc. maybe at 1200.

@Offler, same thing with hawaii. if not cooled it won't be cool to oc.


----------



## sugarhell

Quote:


> Originally Posted by *rdr09*
> 
> correct, so no voltage needed for just this oc. maybe at 1200.
> 
> @offler, same thing with hawaii. if not cooled it won't be cool to oc.


That's why the ASIC reads so low. The BIOS sets 1.25 V stock regardless of the ASIC quality, and GPU-Z infers a low ASIC from the voltage. So you don't really know what it actually is


----------



## rdr09

Quote:


> Originally Posted by *sugarhell*
> 
> Thats why the asic is so low. The bios has 1.25v stock regardless the asic and the gpuz understands that this is a low asic based on the volts. But you dont really know what is it


others may be mistaken by not taking the boost voltage into account, adding even more voltage at a given oc when it actually isn't needed. that may be confusing, but in short - boost equates to a higher-than-normal stock voltage.

@Offler, if i am to buy another hawaii, i'll do what you did - it will be a reference card. just have to cool it.


----------



## TTheuns

I am considering jumping over to 'The Red Team':

I can get €530 for my MSI GeForce GTX 780 Ti.

And I can buy two R9 290X's for €520, or I could buy three R9 290's for €520.

Which would you guys go for?


----------



## joeh4384

Quote:


> Originally Posted by *TTheuns*
> 
> I am considering to jump over to 'The Red Team':
> 
> I can get €530 for my MSI GeForce GTX 780 Ti.
> 
> And I can buy two R9 290X's for €520, or I could buy three R9 290's for €520.
> 
> Which would you guys go for?


Are you putting them under water? Hawaii GPUs have a lot of heat to manage.


----------



## sugarhell

Quote:


> Originally Posted by *TTheuns*
> 
> I am considering to jump over to 'The Red Team':
> 
> I can get €530 for my MSI GeForce GTX 780 Ti.
> 
> And I can buy two R9 290X's for €520, or I could buy three R9 290's for €520.
> 
> Which would you guys go for?


3 290s ofc


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *bluedevil*
> 
> Trying to really set my 290 free, should I sell my z77/3470 for a z97/4690k? Or just get a 3770k? Only thing is my mobo has no voltage control.


z97 / 4690k. I've run 3 290s at stock on a Z97 SOC and got good frames in Crysis 3 with a 4790k......








Quote:


> Originally Posted by *geggeg*
> 
> The prodigal son returns:
> 
> 
> 
> Ok, that isn't the 2x R9 290x I had before, but this is good enough to re-join I would hope
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Way better pictures tomorrow, this is going in Side 2 of the TX10-D build going on (in sig) for the girlfriend with a custom loop of course. Now I may try my hand at benching this, but longevity is more important here so don't expect much. @HOMECINEMA-PC, I may hit you up for BIOS/tips though!


You know it bro









Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah I didn't think that sounded right :/


That's prolly cause it wasn't . LooooooL









Quote:


> Originally Posted by *rdr09*
> 
> got one used for a $100
> 
> 
> 
> 1115 core
> 
> http://www.3dmark.com/3dm/4025300?
> 
> can't push higher 'cause of psu and on air. no voltage tweaks. very much like Hawaiis. they love the cold.


$100 bucks man that's a bargain









Quote:


> Originally Posted by *sugarhell*
> 
> 3 290s ofc


Hell yes









Messin with Firestrike: a new 3930k on me RIVE with the 4102 bios and ol' faithful 290 Sapphy....... 1300/1400

http://www.3dmark.com/3dm/4074915


----------



## MEC-777

Hey all, I'm a fairly new member on the site and just have a few question about reference 290(X)'s.

My beloved Gigabyte HD 7950 just died the other day







and thus, I'm looking for a replacement. I had the kraken G10 + Corsair H55 installed on that card and even with OCing, the temps were nice and cool.







I'm not sure why it died, perhaps my OC wasn't properly stable? The card was voltage locked, so maybe I just pushed it too far...?

Stock clocks were:

Core - Base 900, boost 1000
Memory - 1250

My max OC was:

Core - 1150 (+150 over boost)
memory - 1450 (+200)

But I pulled it back to 1100 core and 1400 memory. It ran fine like this for a couple months with no issues - until recently. Was using MSI Afterburner BTW.

Anyways, I'm looking for a used reference 290(X) because they're relatively cheap (around $250) and I can just use the kraken G10 to keep it nice and cool and quiet (plus VRM/Vram heatsink kit). I will not be OCing this since it's really not necessary at 1080p gaming and if I do it'll only be a very modest/safe OC, nothing crazy.

I have a few concerns with buying a used 290(X) that I'm hoping some of you could help clarify/shed some light on since I've not yet been able to find any definite answers:

1) Is there a way to tell if a card has been used for mining? The seller could easily lie.

2) Is mining "hard" on these cards and does it significantly reduce their gaming life span?

3) Is there any particular brand or brands of reference cards that are known to be a little better than the others in terms of performance and or reliability, or are they all pretty much equal?

Thanks in advance and you can find my full system specs in my sig (Black Mamba).


----------



## ace101

Hi! Anyone here been able to run four displays with this card? I tried but it didn't work. It's stated on the box that it can drive four displays.


----------



## TTheuns

Quote:


> Originally Posted by *joeh4384*
> 
> Are you putting them under water? Hawaii GPUs have a lot of heat to manage.


Hell yeah they're going under water! (In a while







)
Quote:


> Originally Posted by *sugarhell*
> 
> 3 290s ofc


That's what I thought.


----------



## PearlJammzz

I am thinking of adding a 2nd r9 290 to my setup....Do they still have the crossfire microstutter issue in games? Or is that fixed completely? I know they had some drivers awhile back that helped but it was still there. That's the last I heard.


----------



## Jflisk

Quote:


> Originally Posted by *PearlJammzz*
> 
> I am thinking of adding a 2nd r9 290 to my setup....Do they still have the crossfire microstutter issue in games? Or is that fixed completely? I know they had some drivers awhile back that helped but it was still there. That's the last I heard.


Nope no problems I have 3 of them no stuttering here.


----------



## Gobigorgohome

Quote:


> Originally Posted by *PearlJammzz*
> 
> I am thinking of adding a 2nd r9 290 to my setup....Do they still have the crossfire microstutter issue in games? Or is that fixed completely? I know they had some drivers awhile back that helped but it was still there. That's the last I heard.


I have not noticed anything with 13.6 at least, coming from GTX 660 Ti SLI it is a big difference. Running quad-crossfire too.


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> z97 / 4690k and ive run 3 290s at stock on a z97 SOC and received good frames on crisis 3 with 4790k......
> 
> 
> 
> 
> 
> 
> 
> 
> You know it bro
> 
> 
> 
> 
> 
> 
> 
> 
> That's prolly cause it wasn't . LooooooL
> 
> 
> 
> 
> 
> 
> 
> 
> $100 bucks man that's a bargain
> 
> 
> 
> 
> 
> 
> 
> 
> Hell yes
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Messin with Firestrike a new 3930k on me RIVE with 4102 bios and ol faithfull 290 Sapphy....... 1300/1400
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/4074915


i'm sure you will land both the 970 and the 980. maybe 2 or more of each.



1200 out the box. here is 290 @ 1200

http://www.3dmark.com/3dm/4079609?

not sure if it's your X79 or the memory OC that's holding the score back. using an old driver @ 1300/1600

http://www.3dmark.com/3dm/3154240?


----------



## Klocek001

Help needed with my r9 290 Tri-X OC ....

The system detects it as 8600/8700M

Do you think flashing the BIOS (or UEFI) could help ? If so, how do I do that ?


----------



## valgusepoiss

Quote:


> Originally Posted by *valgusepoiss*
> 
> Hi everyone!
> 
> I was planning to buy new Gigabye R9 290 for 390 euros but instead i got good deal of used 290x for 255 euros. It was used for mining for 3 weeks but other than that, it was not used much.
> 
> The thing is... i maybe expected too much. But hell, its a 500+ euros card when bought online in our country and its described as enthusiast tier card. SO i was expecting to play everything maxed at 1080p above 60 fps (my monitor is 60hz) but was a little disappointed when, not even using much aa and only 8x AF, the latest games (like Assasins Creed Black Flag - stutters when looking around or Metro 2033 Redux or Tomb Raider or Watch Dogs - around 40 fps at the start of game) were often going under 60 FPS.
> 
> Oh, and i always use vsync. The tearing is a major immersion killer for me.
> 
> Well, of course i am enjoying A LOT the good graphics and often above 60FPS the 290x is delivering me and im generally very happy with the purchase but some part of me cant help to feel a little bit disappointed.
> 
> How are your cards performing?
> 
> DO you have any advice on graphic settings i should turn down to get constant 60fps? I bet i cant even tell the difference on side by side comparison at some settings


I just found out that the card was running in silent mode. I put it in performance mode, and the temps and performance are better. Lol, I also found out that my case has insufficient airflow. Had to open the case to get good temps.

...I was also wondering: until I get better case fans, I could direct my desk fan's airflow right into the case







From safe distance of course. What you guys think? A good idea?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *valgusepoiss*
> 
> I just found out that the card ran on silent mode, put it on perfomance mode and the temps and perfomance are better. Lol, I also found out that my case has insufficient airflow. Had to open the case to get good temps.
> 
> ...i was also wondering. Until i get better case fans, i could direct my desk fan airflow right into the case
> 
> 
> 
> 
> 
> 
> 
> From safe distance of course. What you guys think? A good idea?


Yes and put it real close esse`


----------



## MEC-777

Just bumping my post from yesterday... Have a few questions...

My beloved Gigabyte HD 7950 just died the other day







and thus, I'm looking for a used reference 290(X) because they're relatively cheap (around $250) and I can just use the kraken G10 to keep it nice and cool and quiet (plus VRM/Vram heatsink kit).

I have a few concerns with buying a used 290(X) that I'm hoping some of you could help clarify/shed some light on since I've not yet been able to find any definite answers:

1) Is there a way to tell if a card has been used for mining? The seller could easily lie.

2) Is mining "hard" on these cards and does it significantly reduce their gaming life span?

3) Is there any particular brand or brands of reference cards that are known to be a little better than the others in terms of performance and or reliability, or are they all pretty much equal?

Thanks in advance and you can find my full system specs in my sig (Black Mamba).


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *MEC-777*
> 
> Just bumping my post from yesterday... Have a few questions...
> 
> My beloved Gigabyte HD 7950 just died the other day
> 
> 
> 
> 
> 
> 
> 
> and thus, I'm looking for a used reference 290(X) because they're relatively cheap (around $250) and I can just use the kraken G10 to keep it nice and cool and quiet (plus VRM/Vram heatsink kit).
> 
> I have a few concerns with buying a used 290(X) that I'm hoping some of you could help clarify/shed some light on since I've not yet been able to find any definite answers:
> 
> 1) Is there a way to tell if a card has been used for mining? The seller could easily lie.
> 
> 2) Is mining "hard" on these cards and does it significantly reduce their gaming life span?
> 
> 3) Is there any particular brand or brands of reference cards that are known to be a little better than the others in terms of performance and or reliability, or are they all pretty much equal?
> 
> Thanks in advance and you can find my full system specs in my sig (Black Mamba).


7950 died? *AIO cooler?* Burnt VRMs??

1. No, not really




2. Depends on what BIOS it ran, what it did, and for how long
3. No. Get a reference card and block it. VRMs get too warm with an AIO cooler
4. BIOS / drivers dictate performance. E.g. unlocked volts and safeties off.


----------



## Red1776

4xR290X MSI almost wet


----------



## MEC-777

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 7950 died ? *AIO cooler ?* burnt VRM's ??
> 
> 1 . No not really
> 
> 
> 
> 
> 
> 
> 
> 
> 2 . Depends on what bios and what it did and for how long ??
> 3. No . Get a reference card and block it . Vrms get too warm with aio cooler
> 4. Bios / Drivers dictates performance . EG: Unlocked volts and safteys off .


Yes, my 7950 died, but I suspect it was already on its way out before I installed the AIO (there were various hints). I was using the Kraken G10 with a Corsair H55. I did my research and made sure it had adequate VRM cooling.







My particular Gigabyte Windforce 7950 had a nice beefy heatsink already installed from the factory (see pic below). That, in combination with the 92mm fan on the G10 blowing directly on the VRMs, should have been more than enough to cool it. It certainly would have done a better job than the stock cooler, with its 80mm fans and a forest of hot heat pipes and fins in the way.









I've also done my homework on installing the G10 on a reference 290/X. I know that removing the reference cooler leaves the VRMs bare and that the 92mm fan on the G10 alone is insufficient to cool them, so I plan on adding a heatsink kit (like the one suggested in the first post of this thread) to help cool the VRMs and VRAM modules (as I mentioned in my previous post). This solution does work, because there are many in the official Kraken G10 section of this forum who have performed these modifications on various reference 290/X's with success.

Anyways, thanks for the input/suggestions. Much appreciated.









Pic of my 7950's factory VRM heatsink:


----------



## HOMECINEMA-PC

Idle temps at stock clocks Tri 290


----------



## Jflisk

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Idle temps at stock clocks Tri 290


Can't see the pictures, too small. Thanks


----------



## rdr09

Quote:


> Originally Posted by *Jflisk*
> 
> Cant see the pictures to small. Thanks


scroll down and hit original (bottom right). it will be too big.


----------



## Jflisk

Quote:


> Originally Posted by *rdr09*
> 
> scroll down and hit original (bottom right). it will be too big.


Got it. Thanks


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> scroll down and hit original (bottom right). it will be too big.


Like ??


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jflisk*
> 
> Cant see the pictures to small. Thanks
> 
> 
> 
> 
> 
> 
> 
> scroll down and hit original (bottom right). it will be *too big*.

That's what she said!


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> scroll down and hit original (bottom right). it will be *too big.*
> Quote:
> 
> 
> 
> Originally Posted by *HOMECINEMA-PC*
> 
> Like ??

Do you like the temps , I meant









Quote:


> Originally Posted by *kizwan*
> 
> That's what she said!


And you sir have a crude mind ..... LoooooL


----------



## lock2701

hey guys,

I got a Gigabyte R9 290 (GV-R929OC-4GD-GA), and I'm running it at 1200 MHz / 1525 MHz, 1.375 V (1.281 V after vdroop), under 70°C on the core, VRMs under 75°C.
What do you all think of my voltage and temps? How's this for gaming?


----------



## Devildog83

Going to snag a reference 290 today. Is there a particular brand that works better than the others? Just about all brands are selling used for $240, so I'm wondering if they're pretty much the same, or if there's a really good one, or one to avoid?


----------



## bluedevil

Should I bother with a GTX 970 over a 290? Kinda thinking so with the TDP.


----------



## BradleyW

Quote:


> Originally Posted by *bluedevil*
> 
> Should I bother with a GTX 970 over a 290? Kinda thinking so with the TDP.


The TDP is very attractive. However, at high resolution gaming, the 970 only beats the 290 by 7-8 fps or so.


----------



## Offler

Well, some results after the upgrade:

Gigabyte HD 7970 OC rev 2.1 @ 1050 MHz
OCCT DX10 stress test: 606-620 fps
ASIC 61.3%

Sapphire R9-290X (reference design)
764-771 fps @ 1025 MHz
ASIC 71%

3DMark: http://www.3dmark.com/3dm/4091039

And I haven't even started to OC much. Reference card with stock cooler.


----------



## Gobigorgohome

Quote:


> Originally Posted by *Red1776*
> 
> 4xR290X MSI almost wet


Nice setup, I can guarantee that you will like the results! At least I do!

How much radiator space do you have for those four cards? And what are your plans with that setup?


----------



## Red1776

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> 4xR290X MSI almost wet
> 
> 
> 
> 
> 
> Nice setup, I can guarantee that you will like the results! At least I do!
> 
> How much radiator space do you have for those four cards? And what are your plans with that setup?

Thanks 

Cooling them I have:

1x 45mm x 360mm

2x 45mm x 240mm

1x 45mm x 140mm

1x 45mm x 120mm

In Heaven 4.0 they run @ 39-41°C at a 20-22°C ambient

as far as plans for it

http://www.overclock.net/t/1473361/amd-high-performance-project-by-red1776

This is one of two builds for the project

This is the other

7850K/R7 250 dual graphics


----------



## Gualichu04

I have issues enabling and disabling crossfire with the 14.8 driver. I need to disable it to play The Witcher 2 or it instantly crashes.


----------



## BradleyW

Quote:


> Originally Posted by *Gualichu04*
> 
> I have issues enabling and disable crossfire with the 14.8 driver. Need to disable it to play witcher 2 or it instant crashes.


Disabled ULPS? Witcher 2 hates ULPS.
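For reference, the usual way to disable ULPS outside of Afterburner is to zero the `EnableUlps` DWORD under each AMD display adapter's registry class key. A sketch of the .reg fragment (the `0000` instance number varies per system and there is one numbered subkey per adapter, so check each one before importing):

```
Windows Registry Editor Version 5.00

; One entry per AMD display adapter instance (0000, 0001, ...).
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

A reboot is needed for the change to take effect, and driver reinstalls tend to reset it.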


----------



## kizwan

Quote:


> Originally Posted by *Gualichu04*
> 
> I have issues enabling and disable crossfire with the 14.8 driver. Need to disable it to play witcher 2 or it instant crashes.


An issue like what? BSOD/crash when enabling/disabling CFX?


----------



## Jflisk

Quote:


> Originally Posted by *Gualichu04*
> 
> I have issues enabling and disable crossfire with the 14.8 driver. Need to disable it to play witcher 2 or it instant crashes.


Just checked it on 14.8. Turned it on and off and on again, no problems.


----------



## Forceman

Quote:


> Originally Posted by *BradleyW*
> 
> The TDP is very attractive. However, at high resolution gaming, the 970 only beats the 290 by 7-8 fps or so.


Only? At high resolutions (and so presumably at relatively low FPS) 7-8 FPS is huge. That's 10% or more, most likely.

But if you already have a 290 I'd keep it.
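To put Forceman's "10% or more" in perspective, the relative gain is easy to compute (the 60 fps baseline below is a hypothetical high-resolution figure for illustration, not from a specific review):

```python
def relative_gain(base_fps: float, new_fps: float) -> float:
    """Return the percentage improvement of new_fps over base_fps."""
    return (new_fps - base_fps) / base_fps * 100

# Hypothetical numbers: a 290 at 60 fps, with the 970 7.5 fps ahead.
base = 60.0
gain = relative_gain(base, base + 7.5)
print(f"{gain:.1f}%")  # 7.5 fps on a 60 fps baseline is a 12.5% lead
```

The same absolute fps gap shrinks as a percentage at higher frame rates, which is why it matters more at high resolutions.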


----------



## BradleyW

Quote:


> Originally Posted by *Forceman*
> 
> Only? At high resolutions (and so presumably at relatively low FPS) 7-8 FPS is huge. That's 10% or more, most likely.
> 
> But if you already have a 290 I'd keep it.


In fact, in one of the reviews the 290X beat the 980 in two games. With the amount they charge for GPUs, I expect much larger fps increases in ALL titles. After all, £429 is very expensive for an additional 8 fps.


----------



## Forceman

He was asking 290/970, not 290X/980. £429 is very expensive, but that isn't the price of a 970, is it? If so, that's outrageous.


----------



## Gualichu04

Quote:


> Originally Posted by *kizwan*
> 
> Issue like what? BSOD/crash when enable/disable CFX?


When I try to turn it on and hit apply, it doesn't just turn on or off.


----------



## bond32

The 970 is priced extremely competitively... Thought it was $349, correct? Considering it trades blows with the 290, yet draws about 60% of the power the 290 does, I'd say Nvidia hit it on the head with these...


----------



## tsm106

970 is $329. And ya it's a killer price. I moved or am moving to a 970 in the htpc, but will wait for 20nm for my main rig.


----------



## ZealotKi11er

Quote:


> Originally Posted by *bond32*
> 
> 970 is priced extremely competitive... Thought it was $349 correct? Considering it trades blows with the 290, yet draws about 60% of what the 290 does, I say Nvidia hit it on the head with these...


You would expect a card with a new architecture, three years in the making and arriving a year later, to beat the 290. Even so, both the GTX 970/980 are not that big an upgrade for 290/290X owners. Benchmarks are also a bit misleading: the GTX 970 boosts to ~1300 MHz out of the box compared to 947 MHz on the 290. Both OC, but the 290 has a higher % OC and better scaling.
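The headroom argument can be sketched numerically. Using the clocks mentioned in this thread (947 MHz stock / ~1200 MHz OC for a 290, ~1300 MHz boost / ~1500 MHz OC for a 970 - the OC figures are anecdotes from these posts, not measured data):

```python
def oc_headroom(stock_mhz: float, oc_mhz: float) -> float:
    """Percentage clock increase from stock to the overclocked frequency."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

# Clocks quoted in this thread; the OC values are anecdotal, not guaranteed.
r9_290 = oc_headroom(947, 1200)
gtx_970 = oc_headroom(1300, 1500)
print(f"290: {r9_290:.1f}% headroom, 970: {gtx_970:.1f}% headroom")
```

On these assumed numbers the 290 gains roughly 27% over stock versus roughly 15% for the 970, which is the scaling point being made above.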


----------



## Gualichu04

Quote:


> Originally Posted by *BradleyW*
> 
> Disabled ULPS? Witcher 2 hates ULPS.


I hate ULPS too, so it has been disabled for every driver update, in MSI Afterburner and the registry.


----------



## bond32

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You would expect a card with new architecture 3 years in working and coming 1 year later to beat 290. Even so both GTX970/980 not that big upgrade for 290/290X owners. Benchamrks are also bit misleading. GTX970 Boots ~ 1300 MHz out of the box compare to 947Mhz 290. Both OC but 290 has higher % OC and scaling.


Actually no, I wouldn't expect that. Why? All of these cards play everything fine. Sure, the select few gamers who game at 4K see around 40-60 fps, but the majority who play at 1080p, or even 1440p, will be impressed. What's the major difference between this and the 290, aside from price? It's the power draw. They were able to provide competitive performance while having the card pull SIGNIFICANTLY less power, and costing less. Think of all the people upgrading from some 7770 or similar with a 400-watt power supply; now they won't need to spend $500, but rather $329.

Not trying to argue or persuade, but the fact is these cards are already crazy powerful. It's time to direct attention at making that power efficient so we don't have to swap out house breakers to play Battlefield 4.


----------



## rv8000

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You would expect a card with new architecture 3 years in working and coming 1 year later to beat 290. Even so both GTX970/980 not that big upgrade for 290/290X owners. Benchamrks are also bit misleading. GTX970 Boots ~ 1300 MHz out of the box compare to 947Mhz 290. Both OC but 290 has higher % OC and scaling.


970 would be a sidegrade for anyone with a 290/290x.

On average though, I have a good feeling 970 clocks are going to sit at ~1500 on air; you'd be hard pressed to find a 290 on air that far above 1120. Some reviews have OC'd 970 models keeping up with reference 780 Tis without modded BIOSes, which is a bit scary at a $329 price tag.

AMD work some mojo magic so I can buy another card in the next 3 months


----------



## th3illusiveman

Quote:


> Originally Posted by *bond32*
> 
> 970 is priced extremely competitive... Thought it was $349 correct? Considering it trades blows with the 290, yet draws about 60% of what the 290 does, I say Nvidia hit it on the head with these...


They did a good job, but it's not as amazing as it seems at first sight. The 980 beats the GTX 780 Ti, sure, but it does so at ~150-200 MHz higher frequencies, and the 780 Ti catches up *very* quickly when overclocked; even at 1500 MHz the 980 has trouble with a 1300 MHz 780 Ti. They don't scale anywhere near as well as Kepler did with frequency increases. Still, the fact that a mid-range chip is competing with a fully unlocked big die, without a node shrink, while consuming significantly less power is impressive. It's not a 580 > 680 leap in performance, but it's scary how close they are. I like these new cards, but they won't make me lose sleep over buying this 290X a few weeks ago.


----------



## kizwan

Quote:


> Originally Posted by *Gualichu04*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Issue like what? BSOD/crash when enable/disable CFX?
> 
> 
> 
> When i try to turn it on and hit apply it doesnt just turn on or off.

Does restarting your computer enable/disable CFX? I mean, for example, you want to disable CFX, but setting it disabled in CCC doesn't apply immediately, and only restarting the computer disables it. If yes, it's probably just a bug. Try reinstalling the driver if you haven't tried that. Just uninstall it from Control Panel.


----------



## heroxoot

The 970 is a sidegrade to a 290/290X. The difference between the 290 and 970 is very small, if it even exists. The 290X outright beats it. For the price, though? If you have less than a 770 or 290, I say it's a great deal. Best part? AMD already prices their cards like this and doesn't have to compete hard on prices.


----------



## Zipperly

Quote:


> Originally Posted by *bond32*
> 
> 970 is priced extremely competitive... Thought it was $349 correct? Considering it trades blows with the 290, yet draws about 60% of what the 290 does, I say Nvidia hit it on the head with these...


It's $329.99, and it doesn't trade blows with a 290; it beats a 290X fairly often.


----------



## Zipperly

Quote:


> Originally Posted by *th3illusiveman*
> 
> They did a good job but it's not as amazing as it seems at first sight. the 980 beats out the GTX 780 Ti sure, but it does so at ~150-200Mhz higher frequencies and the 780 Ti catches up *very* quickly to it when overclocked and even when the 980 is running at 1500Mhz it has trouble with 1300Mhz 780 Ti. They don't scale anywhere near as well as Kepler did with frequency increases. Still the fact that a mid range chip is competing with a full unlocked Big Die without a node shrink while consuming significantly less power is impressive. Its not a 580 > 680 leap in performance but it's scary how close they are. I like these new cards, but they won't make me lose sleep over buying this 290x afew weeks ago.


LMBO, I love all the downplay talk going on around here between 290, 290X and 780TI owners. Just classic!







The card is on launch drivers and it's already beating the other cards, which are on extremely mature drivers.


----------



## th3illusiveman

Quote:


> Originally Posted by *Zipperly*
> 
> LMBO, I love all the downplay talk going on around here between 290, 290X and 780TI owners. Just classic!
> 
> 
> 
> 
> 
> 
> 
> The card is on launch drivers and its already beating the other cards which are on extremely mature drivers.


Shows just how advanced your reading comprehension is







I really don't think I'll waste my time on you anymore. You still haven't delivered those imaginary benchmarks of yours that show an overclocked 780 beating an overclocked 290X.


----------



## Zipperly

Quote:


> Originally Posted by *th3illusiveman*
> 
> Shows just how advanced your reading comprehension is
> 
> 
> 
> 
> 
> 
> 
> I really don't think i'll waste my time on you anymore. You still haven't delivered those imaginary benchmarks of yours that show an overclocked 780 beating an overclocked 290x.


I never made benchmarks, but I *"unlike yourself"* owned them both at the same time and am fully aware of what each was capable of doing. So you can continue to stubbornly argue with the guy who owned both cards *"I would never do that myself"* while I continue to get a good chuckle out of your denial.

Besides, even if I had gone through the trouble of making benchmarks, you are the EXACT type of guy who still wouldn't believe them anyway; in all honesty, you are not even worthy of my time in the first place.

By the way, my reading comprehension and observation skills are just fine, but I'm willing to bet you won't be satisfied without getting the last word, so I'll just go ahead and give it to you.


----------



## heroxoot

Sorry Zipperly, but talk is cheap. If you want to win an argument on performance here, you've got to back up your claims. Saying you owned both and know what they can do doesn't mean anything. You talk about reading comprehension skills, but you lack the basic argument skills taught in ENG 100.


----------



## Sgt Bilko

Settle down guys, no point in arguing about it when these cards are still within spitting distance of each other.

wait until some OCNers get ahold of some then we can talk.


----------



## Zipperly

Quote:


> Originally Posted by *heroxoot*
> 
> Sorry Zipperly, but talk is cheap. If you want to win an argument here on performance you gotta back up your claims.


LMBO, I'm not trying to win anything......
Quote:


> Saying you owned both and know what they can do doesn't mean anything


Actually owning both cards and having first-hand experience with each means everything.


----------



## heroxoot

Quote:


> Originally Posted by *Zipperly*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Sorry Zipperly, but talk is cheap. If you want to win an argument here on performance you gotta back up your claims.
> 
> 
> 
> LMBO, im not trying to win anything......
> Quote:
> 
> 
> 
> Saying you owned both and know what they can do doesn't mean anything
> 
> 
> Actually owning both cards and having first hand experience with them both means everything.

I can say I've owned every graphics card of last gen, and know which one runs better. That proves nothing to anyone. Sorry bro, but you're going to have to back up your claims. Otherwise they have no meaning.


----------



## rv8000

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Settle down guys, no point in arguing about it when these cards are still within spitting distance of each other.
> 
> wait until some OCNers get ahold of some then we can talk.


Makes me wish I hadn't sold/shipped my 290 Vapor-X today without doing some serious benchmarking, got too jumpy for the new tech







. Excited for the possible headroom on the 970/980 with a modded bios, shame I never got any of my 290's under water.


----------



## Gualichu04

Quote:


> Originally Posted by *kizwan*
> 
> Did restarting your computer enabled/disabled CFX? I meant, for example, you want to disable CFX but setting it disabled in CCC doesn't applied immediately & only restarting the computer disabled it. If yes, probably just a bug. Try reinstalling the driver if you haven't try that. Just uninstall it from Control Panel.


It re-enables or disables upon a restart every time. I will just reinstall 14.8 or try 14.7 RC2.


----------



## gobblebox

Quote:


> Originally Posted by *heroxoot*
> 
> I can say I've owned every graphics card of last gen, and know which one runs better. That proves nothing to anyone. Sorry bro, but you're going to have to back up your claims. Otherwise they have no meaning.


+1

Claiming something without providing proof, _especially_ on the internets, and even more so on OCN, is generally accepted as nothing more than a fairy tale.

Pics or it didn't happen. Zipperly has to be trolling... there's no way he doesn't understand that he'll lose any argument against the 290X until he can provide hard evidence for his claims.


----------



## Paopawdecarabao

Will the EK-FC R9 290X waterblock (not the rev 2.0) fit the Asus R9 290X reference model? That's the one I have. I'm trading my R9 290 Tri-X crossfire setup for two Asus R9 290X reference cards.


----------



## Cyber Locc

Hey, anyone using 14.7 with mixed resolutions? My Eyefinity is making my resolution 1094 now instead of 1080p. My main screen is 1080 and the wings are 1024, so where it gets 1094 I have not a clue; the horizontal resolution is fine, just the vertical is jacked.

Fixed: it was my fourth monitor causing the issue. I made the group with it unplugged and it was fine.

However, I have another Eyefinity question: is there a way to disable and enable Eyefinity with hotkeys? Also, when I disable it, it disables 3 of my monitors and moves them around in Windows, and I then have to remove them. This is a pain; is there a way to make this not happen?


----------



## th3illusiveman

Opened up my PC, cleaned out all that dust (it's been ~6 months, lol), removed the empty HDD bays blocking my intake fan, and added a spare Noctua fan to my side panel, and voila: a ~5c drop in load temps. A more efficient space heater is born; bring on winter.


----------



## Offler

Quote:


> Originally Posted by *Offler*
> 
> Well. Some results after upgrade
> 
> Gigabyte HD 7970 OC rev 2.1 @ 1050Mhz
> OCCT DX 10 stress test : 606-620 fps
> Asic 61.3%
> Thief Benchmark: 62fps average
> 
> Sapphire R9-290x (reference design)
> 764-771 FPS @ 1025MHz
> Asic 71%
> Thief benchmark: 82 fps avg
> 
> 3d Mark: http://www.3dmark.com/3dm/4091039
> 
> And I didn't even start to OC much. Reference card with stock cooler.


Clocked to 1080MHz
http://www.3dmark.com/3dm/4096865

OCCT DX 10 stress test : 764-771 FPS (no change)

Thief benchmark 83 fps avg (slight improvement)


----------



## Zipperly

Quote:


> Originally Posted by *heroxoot*
> 
> I can say I've owned every graphics card of last gen, and know which one runs better.


Sure, you could make that false claim, but since you haven't actually "owned all those cards" your claim would bear no validity at all. That is the difference between you and me: I have actually owned both cards at the same time and have first-hand experience with each one, and I don't have to prove anything to anyone when I am fully aware of what the two are capable of once overclocked.

I do, however, make it a practice not to argue with someone who has actually owned both cards and tell that person they are lying when they have no reason to have an allegiance to either side. I have been purchasing each high-end card from both camps for the past few years; I do this because it is a hobby, and I typically keep whichever of the two cards performs best when both are overclocked. As such I stuck with the GTX 780, so what you choose to believe or not to believe has no bearing on this fact at all. Good day.


----------



## Ized

I am searching for the newest PowerColor PCS+ BIOSes that have surfaced in the last couple of months, which can be obtained by emailing Raymond over at PowerColor.

I found the updated 290 BIOS, and it fixed a lot of jittering problems I was having while also giving me a 500-point boost in Firestrike Extreme.

Now I would like to try the 290X version, but I can't track it down.

Since I have a regular reference 290 (unlockable), there is no use in me emailing the guy, since he wants serial numbers plus pictures of the hardware.

Any leads would be welcome.

Cheers


----------



## bond32

I'm pretty committed to the 290x myself already, and will probably still get a fourth card and an extra power supply, but I am excited about the new cards being released. Competition is always great for the market. Look at the past few Nvidia releases: while good cards, they are still much more expensive than AMD's. Now it seems the prices are lower, but the main benefit is the reduced power consumption. I feel like many of you miss that point often.

A card that provides pretty darn close performance to a 290x yet costs less and draws WAY less power sounds like a win to me... Already seeing single 980's breaking 14k in Firestrike... Very impressive.


----------



## Jflisk

Quote:


> Originally Posted by *bond32*
> 
> I'm pretty committed already to the 290x myself, will probably still get a fourth card and an extra power supply, but I am excited about the new cards being released. The competition is always great for the market. Lets look at the past few nvidia releases, while good cards, are still much more expensive over AMD. Seems like now, the prices are lower but the main benefit is the reduced power consumption. I feel like many of you seem to miss that point often.
> 
> A card that provides pretty darn close performance to a 290x yet costs less and draws WAY less power sounds like a win to me... Already seeing single 980's breaking 14k in firestrike... Very impressive.


@Bond do you have a couple of links on the 980? I have owned some Nvidias over the years and always wind up on the red side. As far as Firestrike, I think it's a little one-sided; it seems to favor Nvidia and Intel. I usually pull high 13000s in Firestrike with an FX-9590 and 3x R9 290X. If I had an Intel I would probably be pulling high 17s if not more. I don't think their calculations take 4.7 as opposed to 4.2 into consideration; it's like it sees my FX-9590 as an FX-8350. I know they're created out of the same die, and I have owned both, but it looks like Firestrike handles them as one and the same. Just my 2 cents. Thanks


----------



## rdr09

Quote:


> Originally Posted by *bond32*
> 
> I'm pretty committed already to the 290x myself, will probably still get a fourth card and an extra power supply, but I am excited about the new cards being released. The competition is always great for the market. Lets look at the past few nvidia releases, while good cards, are still much more expensive over AMD. Seems like now, the prices are lower but the main benefit is the reduced power consumption. I feel like many of you seem to miss that point often.
> 
> A card that provides pretty darn close performance to a 290x yet costs less and draws WAY less power sounds like a win to me... Already seeing single 980's breaking 14k in firestrike... Very impressive.
> 
> 
> Spoiler: Warning: Spoiler!


no wonder some TBlack owners feel . . .


----------



## bond32

Quote:


> Originally Posted by *Jflisk*
> 
> @Bond do you have a couple of links on the 980. I have owned some Nvdias over the years and always wind up on the red side. As far as firestrike I think its a little one sided. Seems to favor NVidia and Intel. I usually pull high 13000 in fire strike with FX9590 and 3x R9 290X. If I had an intel I would probably be pulling high 17's if not more. I don't think there calculations take into consideration 4.7 as opposed to 4.2. Like it sees my FX 9590 as a FX8350. I know there created out of the same Die and I have owned both but it looks like fire strike handles it as one in the same. Just my 2 cents. Thanks


Oh, I am just coming across a few reviews as they come out. I know everyone has their opinions about who's legit where, but http://www.guru3d.com/articles-pages/gigabyte-geforce-gtx-980-g1-gaming-review.html is a good read...

On an unrelated note, those beta drivers seem pretty nice... My trifire at 1160/1350 scored a bit higher: http://www.3dmark.com/3dm/4100276?


----------



## Devildog83

I will be joining soon as I have purchased a Sapphire 290 reference card and it should be here the middle of next week.


----------



## kizwan

Quote:


> Originally Posted by *bond32*
> 
> Oh, I am just coming across a few reviews as they come out. I know everyone has their opinions about who's legit where, but http://www.guru3d.com/articles-pages/gigabyte-geforce-gtx-980-g1-gaming-review.html is a good read...
> 
> On an unrelated note, *those beta drivers seem pretty nice*... My trifire at 1160/1350 scored a bit higher: http://www.3dmark.com/3dm/4100276?

Which beta drivers? I want to try it too.


----------



## bond32

Quote:


> Originally Posted by *kizwan*
> 
> Which beta drivers? I want to try it too.


I think I am just late to the party... 14.x from aug 27: http://www.guru3d.com/files-details/amd-catalyst-14-x-beta-(14-300-1005-august-27)-download.html


----------



## bond32

New high on firestrike for me: http://www.3dmark.com/3dm/4101056? 23259


----------



## Gabkicks

i managed a little over 17k with some 14.8 beta drivers and an overclock: http://www.3dmark.com/fs/2774002


----------



## jamaican voodoo

I'm keeping my trifire 290s until AMD delivers another set of beast GPUs. I don't post a lot because I really don't have issues with my cards. Honestly, AMD have their ups and downs as a company, but they still deliver stellar GPUs, especially when it comes to multi-GPU scaling. To each their own; Nvidia makes great products too, but I like AMD cards better.

Just ranting


----------



## heroxoot

Quote:


> Originally Posted by *bond32*
> 
> I think I am just late to the party... 14.x from aug 27: http://www.guru3d.com/files-details/amd-catalyst-14-x-beta-(14-300-1005-august-27)-download.html

Yes these drivers are top notch. Just make sure to disable ULPS. I hear it causes issues. Other than that the gameplay has never been smoother.


----------



## thrgk

I am trying to use a secondary PSU to power my 3rd and 4th 290s. I only have the two GPUs connected to it, and have it shorted using the paperclip method. Do I need to do anything to power it on so the computer finds the two GPUs?

Does the PSU turn off along with the computer?


----------



## heroxoot

Quote:


> Originally Posted by *thrgk*
> 
> I am trying to use a secondary psu to power my 3 and 4th 290s. I only have the 2 gpus connected, and have it shorted using the paperclip method. Do i needto do anything to power it on so the computer finds the 2 gpus?
> 
> Does the psu go off along with the computer?


With this method, just flip the PSU switch when you shut off the PC. Otherwise, I think you can get an adapter to connect the 2 PSUs to the mobo so it can regulate their on/off state.


----------



## thrgk

Hmm, got a link or name for the adapter?

I wonder if I should just sell both of mine and get a 1500W Corsair; it can do 5 GPUs, so plenty of plugs and enough power


----------



## Mega Man

no, you will have to do it by hand; this is a temp method only. otherwise you will need a relay. you have to turn on the PC and the second PSU together (within a couple seconds of each other); long term it can cause danger to components

you can either manually use a relay (i can help with wiring if you need) or something like the add2psu or the bitspower equivalent

there are also various cables you can do the same thing with

i have an add2psu that i am using, however i bought relays that i will be making my own with. i am adding a safety as well, so that if one does not power on the 12v, the other will not either


----------



## thrgk

OK, so just for a few days I can do the paperclip method, turning the second PSU on along with the first, and be OK?

Does anyone sell the safety ones you are talking about? Those do sound best.


----------



## Mega Man

no, but they won't be hard to make. when i am done i'll post a wiring diagram in my build log

basically it just uses 12v to power 2 relays; one is powered by psu1, the other by psu2, and the relays control PS_ON (when either psu shuts off, BOTH psus do)
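The interlock described above amounts to ANDing the two 12 V rails into a shared PS_ON signal; a minimal sketch of that logic (a hypothetical model for illustration, not a wiring guide):

```python
# Hypothetical model of the dual-PSU relay safety described above.
# Each relay coil is energized by one supply's 12 V rail; the relay
# contacts are wired in series, so the shared PS_ON line is only held
# asserted while BOTH rails are up.
def ps_on_held(psu1_12v_up: bool, psu2_12v_up: bool) -> bool:
    relay1_closed = psu1_12v_up
    relay2_closed = psu2_12v_up
    return relay1_closed and relay2_closed

if __name__ == "__main__":
    # If either supply drops its 12 V rail, both supplies lose PS_ON.
    print(ps_on_held(True, True))   # both rails up -> system stays on
    print(ps_on_held(True, False))  # one rail down -> both shut off
```

This is the same behavior Mega Man describes: neither supply can keep running alone, which is the whole point of the safety.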


----------



## kizwan

Quote:


> Originally Posted by *heroxoot*
> 
> Yes these drivers are top notch. Just make sure to disable ULPS. I hear it causes issues. Other than that the gameplay has never been smoother.

Hmmm...I don't see any improvement from 14.6 beta driver though. Only tested with Firestrike & BF4 so far.


----------



## thrgk

Cool, maybe I could purchase one, or make my own if I got idea ok


----------



## heroxoot

Quote:


> Originally Posted by *kizwan*
> 
> Hmmm...I don't see any improvement from the 14.6 beta driver, though. Only tested with Firestrike & BF4 so far.

BF4 didn't feel much better, but it was a lot more consistent. I didn't try Firestrike. I did get improvements in Heaven.

Personally, I am one of those guys who had terrible performance prior to 14.6. If you did not get the huge boost I got with 14.6, then you might already be getting optimized performance. For me it feels like it's still improving with every driver.


----------



## the9quad

You know how when you push memory too hard you get the occasional checkerboard flicker? Well here is my issue:

I do not run these cards overclocked at all.
I can play any DX11 game all day long and never crash or have any issues whatsoever.
If I play DX9 games I will periodically get that checkerboard flicker, and once in a while it will crash. About every 5th crash, it crashes so hard that the PC no longer recognizes the drivers as installed. If I run the DX9 games with crossfire off, I never get the flicker or the crashes.

Any ideas? I've seriously had it up to here with these cards, and am thinking it is time to go green. If I have to run games in a one-card configuration anyway, I might as well just get one 980 and call it a day. This whole crossfire thing has been nothing but a hassle, and the same goes for these cards.


----------



## ozzy1925

when i run the Valley benchmark with the Extreme HD preset, my 290 Tri-X OC throttles at scene 6, even at stock core. Is that normal? Temp maxes at 75c


----------



## th3illusiveman

Quote:


> Originally Posted by *Offler*
> 
> Clocked to 1080MHz
> http://www.3dmark.com/3dm/4096865
> 
> OCCT DX 10 stress test : 764-771 FPS (no change)
> 
> Thief benchmark 83 fps avg (slight improvement)


i get ~12400 GPU score in firestrike using 1080Mhz core and 1400Mhz mem


----------



## Offler

Quote:


> Originally Posted by *th3illusiveman*
> 
> i get ~12400 GPU score in firestrike using 1080Mhz core and 1400Mhz mem


That was with 14.7 RC3, power limit +25%, stock cooler, fan limit 45%, with RAM at 1250MHz. The GPU was set to Quiet mode for ALL tests. I guess the GPU was throttling a bit due to temps.


----------



## th3illusiveman

Quote:


> Originally Posted by *Offler*
> 
> That was with 14.7 RC3, Power limit 25%, stock cooler, and Fan limit 45%, while ram was at 1250MHz. GPU is set to Quiet mode and after ALL tests. I guess the GPU was throttling due temps a bit.


i was using practically the same driver. And you are probably throttling, which makes overclocking rather counterintuitive; imo it's better to reduce your voltage and try to maintain a stable clock. Or, better, try getting an aftermarket cooler for that thing. If you're doing 1080MHz with no added voltage, then you could hit 1150-1200MHz with a good cooler and some mV.


----------



## Offler

In my case, R9-290x GPU frequencies higher than 1025MHz gain little to no performance, with the exception of 3DMark Firestrike. The reason is a CPU limit.

Thief, Witcher 2 and one particular OCCT testing method gave me exactly the same results regardless of GPU OC. Anyway, I did not increase the GPU voltage.

I could put the CPU under water to gain an additional 300-600MHz, and then I could gain more performance from the GPU OC, but I would rather use air cooling instead. I think thermal compound can fix the problem a bit. Now I have Arctic MX-2 on the CPU, and tomorrow I am going to buy Arctic Silver 5 for both CPU and GPU.


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> Hmmm...I don't see any improvement from 14.6 beta driver though. Only tested with Firestrike & BF4 so far.


you should see some in FS at least.

14.X 290 @ 1280 vs 14.6 290 @ 1300

http://www.3dmark.com/compare/fs/2746418/fs/2205652

my cpu was not oc'ed the same, so compare graphics scores.


----------



## Offler

Quote:


> Originally Posted by *th3illusiveman*
> 
> i get ~12400 GPU score in firestrike using 1080Mhz core and 1400Mhz mem


http://www.3dmark.com/3dm/4109269

1080MHz core, 1400MHz mem. Still slightly lower...

But the GPU didn't crash, and no artifacts. Quite good for a reference card with the stock cooler.


----------



## battleaxe

Quote:


> Originally Posted by *thrgk*
> 
> Cool, maybe I could purchase one, or make my own if I got idea ok


I ran a jumper from the green wire of the cable plugged into the board to the green wire coming out of the second PSU. It turns itself on just like the other one when I push the power button.


----------



## thrgk

So as long as I get one of those 24-pin multi-PSU adapters, I should be fine? That way both will turn on and off when the main one does?


----------



## kizwan

Quote:


> Originally Posted by *heroxoot*
> 
> BF4 didn't feel much better, but it was a lot more consistent. I didn't try Firestrike. I did get improvements in Heaven.
> 
> Personally, I am one of those guys who had terrible performance prior to 14.6. If you did not get the huge boost I got with 14.6, then you might already be getting optimized performance. For me it feels like it's still improving with every driver.

Quote:


> Originally Posted by *rdr09*
> 
> you should see some in FS at least.
> 
> 14.X 290 @ 1280 vs 14.6 290 @ 1300
> 
> http://www.3dmark.com/compare/fs/2746418/fs/2205652
> 
> my cpu was not oc'ed the same, so compare graphics scores.

14.6 beta gave me a huge boost in performance. For me, the August 27 driver is pretty much similar in performance to 14.6 beta.


----------



## Jflisk

I tried some form of the 14.8 drivers and I would crash in Firestrike. Went back to 14.7 RC and had no problems at all, except the heat issue.


----------



## Forceman

Quote:


> Originally Posted by *the9quad*
> 
> You know how when you push memory too hard you get the occasional checkerboard flicker? Well here is my issue:
> 
> I do not run these cards overclocked at all.
> I can play any DX11 game all day long and never crash or have any issues whatsoever.
> If i play DX9 games I will periodically have that checkerboard flicker, and once in a while it will crash. About every 5th crash, it crashes so hard that the pc no longer recognizes the drivers being installed. If I run the DX9 games with crossfire off, I never get the flicker or crash either.
> 
> Any ideas? seriously had it up to here with these cards, and am thinking it is time to go green. If i have to run games in one card configuration anyway, I might as well just get one 980 and call it a day. This whole crossfire thing has been nothing but a hassle, and same goes for these cards.


Maybe the load with DX9 games isn't high enough and the cards aren't bumping into full 3D voltage? You see that a lot with guys overclocking the memory having trouble on the desktop when the card is in 2D volts.

Can you bump the voltage with AB and see if it helps?


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> 14.6 beta gave me huge boost in performance. For me, the August 27 driver pretty much similar to 14.6 beta in performance.


14.X adds about 200 on the graphics score.

Quote:


> Originally Posted by *Jflisk*
> 
> I tried some form of the 14.8 drivers and I would crash in firestrike. Went back to the 14.7RC and no problems at all except the heat issue.


the one for apu or 14.X?


----------



## pdasterly

having a problem with a 290x; I recently RMA'd the card. They replaced the card, and now I can't install it. The system blackscreens during the install process, forcing a hard reset. The other card works, so it's not my system


----------



## Jflisk

Quote:


> Originally Posted by *rdr09*
> 
> 14.X adds about 200 on the graphics score.
> the one for apu or 14.X?


@RD I used the 14.8 drivers that were created on or near 8/12. What drivers are you using? I'll give them a try. Thanks


----------



## bond32

At least I can say I held a top spot for a little bit!! http://www.overclock.net/t/872945/top-30-3d-mark-13-fire-strike-scores-in-crossfire-sli

Bet every single one of us in the top 30 gets bumped down in the coming months... those new 980's are going to school us.


----------



## rdr09

Quote:


> Originally Posted by *Jflisk*
> 
> @RD I used the drivers 14.8 that were created on or near 8/12. What drivers are you using ill give them a try. Thanks


the one i linked, labeled 14.X. it makes my gpu cooler, i think by a few degrees. created Aug 27.


----------



## Offler

Quote:


> Originally Posted by *bond32*
> 
> At least I can say I held a top spot for a little bit!! http://www.overclock.net/t/872945/top-30-3d-mark-13-fire-strike-scores-in-crossfire-sli
> 
> Bet every single one of us in the top 30 gets bumped down in the coming months... those new 980's are going to school us.


I don't think so.
http://www.3dmark.com/fs/1881409

The R9-290x, GTX 980 and GTX 970 all have 64 ROPs, which is crucial for raw graphics performance, but the Nvidia cards have higher stock clocks.

Paired with, let's say, an i7-3930K at high clocks, you can get a very high fillrate. Part of the trick is also that the GTX 980 has memory running at 7000MHz.

Clock the 290x like that and it will have higher performance due to its higher number of compute units and wider memory bus (512-bit compared to the 980's 256-bit). Of course, high clocks for chips near/above 500 mm² are a bit... troublesome.


----------



## bond32

Quote:


> Originally Posted by *Offler*
> 
> I dont think so.
> http://www.3dmark.com/fs/1881409
> 
> R9-290x GTX980 and GTX970 have 64 ROPs which is crucial for raw graphic performance, but Nvidia cards have higher stock clocks.
> 
> In cooperation with lets say I7-3930, and high clocks you can get very high fillrate. Part of the trick is also that GTX980 has memory running on 7000MHz.
> 
> Clock it like that and you will have higher performance due higher amount of compute units and wider bus (512 compared to 384). Of course, high clocks for chips near/above 500mm squared is bit... troublesome.


Not really sure what you're getting at here. There are already users here approaching 15k on a single GPU. When the Classified comes out, it will hit 15k in Firestrike, no problem.


----------



## Offler

I was simply explaining how the higher scores on the GTX 9x0 are being achieved...

R9-290x = 64 ROPs x 1000MHz = 64 Gpixel/s fillrate
GTX 980 = 64 ROPs x 1127MHz = 72 Gpixel/s

And since the R9-290x has more compute units, more TMUs and a wider memory bus, the trick in 3DMark is just a matter of frequency.

But I admit that smaller dies are easier to overclock.
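The fillrate figures above follow from multiplying ROP count by core clock; a quick sketch of that arithmetic (the clocks and ROP counts are taken from the post, purely illustrative):

```python
# Peak pixel fillrate = ROPs * core clock; figures from the post above.
def fillrate_gpixel_s(rops: int, clock_mhz: int) -> float:
    """Peak pixel fillrate in Gpixel/s."""
    return rops * clock_mhz / 1000.0

if __name__ == "__main__":
    print(fillrate_gpixel_s(64, 1000))  # R9 290X at stock: 64.0
    print(fillrate_gpixel_s(64, 1127))  # GTX 980 at stock: ~72.1
```

It also shows why overclocking the 290x closes the gap: at the same 64 ROPs, matching the 980's clock matches its pixel fillrate.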


----------



## pdasterly

kinda stuck in a bad situation. I have two 290Xs and they worked perfectly; I put G10 brackets on both of them and, again, they worked perfectly. I upgraded to a fully water-cooled system and one card went bad. I was able to RMA the card and get everything back together, and now the other card is having problems. I'm really considering moving on from ATI to Nvidia. The computer seems like it's been down forever


----------



## kizwan

Quote:


> Originally Posted by *pdasterly*
> 
> kinda stuck in bad situation. I have two 290x and they worked perfect, ended up with g10 brackets on both of them, again, worked perfect. Upgraded to full water cooled system and 1 card went bad. I was able to rma card and get everything back together now the other card is having problems. I'm really considering moving on from ati to nvidia. Computer seems like its been down forever


The card you got back from RMA, or the other card you were having problems with? Basically, something is wrong with the card if you get a blackscreen when installing the drivers. As far as I know, the memory on the card may be faulty. It sucks, but you'll need to RMA the card. Sorry to hear you're having problem after problem, but if you think going to Nvidia is the solution, then go ahead.


----------



## rdr09

Quote:


> Originally Posted by *pdasterly*
> 
> kinda stuck in bad situation. I have two 290x and they worked perfect, ended up with g10 brackets on both of them, again, worked perfect. Upgraded to full water cooled system and 1 card went bad. I was able to rma card and get everything back together now the other card is having problems. I'm really considering moving on from ati to nvidia. Computer seems like its been down forever


could it be the motherboard, specifically the PCIe slot? kinda hard to test if your system is watercooled.


----------



## alancsalt

Things have gone wrong for me fitting/changing waterblocks on Nvidia cards too...sometimes...


----------



## pdasterly

Quote:


> Originally Posted by *rdr09*
> 
> could be the motherboard, specifically the PCIe slot? kinda hard to test if your system is watercooled.


The other card works fine; I tested both slots. I'm not ready to give up yet, but that was a stab in the back. I thought AMD would have gotten their stuff together after all these years. This is bringing back memories of my ATI All-in-Wonder; that one left a nasty taste in my mouth. Going to RMA the other card (more downtime). I put in a ticket through Sapphire, so I guess I'll have an answer tomorrow


----------



## rdr09

Quote:


> Originally Posted by *pdasterly*
> 
> Other card works fine, tested both slots. Im not ready to give up yet but that was a stab in the back. I thought amd would have gotten their stuff together after all these years. This is bringing back memories of my ati all in wonder, that left a nasty taste in my mouth. Going to rma other card(more down time). Put in ticket thru sapphire, I guess ill have answer tomorrow


i know someone who switched from hawaii to a 780. he is on his 4th 780, rma one after the other. it can happen on both sides. it seems to happen more with these cards, though. the 970 will be a fine card as an alternative.


----------



## pdasterly

a gtx 980 at minimum, but waterblocks are sold out everywhere. I'm going to rma the card and hope for the best


----------



## rdr09

Quote:


> Originally Posted by *pdasterly*
> 
> gtx 980 at minimum but waterblocks are sold out everywhere. Im going to try to rma card and hope for the best


i don't think you'd need to water the 980 right away; surely you can wait. the thing is, before you know it . . . the 980 Ti will be out.


----------



## Wezzor

Hello guys!
I finally decided to OC my GPU, so I read some guides on how to overclock a GPU, and once I felt ready I went for it. I downloaded MSI Afterburner, since it was the software most guides recommend. I started by turning the power limit up to 20% because I didn't want to touch the core voltage (btw, is it fine to have the power limit at 50%?). I then softly started increasing the core clock, raising it by a little at a time and running the Valley test to check for artifacts. Anyway, I didn't notice any temperature difference during the Valley test itself, not so weird maybe since I didn't touch the core voltage, but I did notice a huge difference in idle temps. Once I hit a 1100 core clock, my GPU was already idling at 55c. I could probably push the core clock more, but since my card was almost hitting its 75c load temperature while idle, I didn't want to continue. So that is why I'm here, desperately seeking help from you guys. If anyone knows what I should do to get rid of this problem, I would really appreciate it.


----------



## the9quad

Quote:


> Originally Posted by *Wezzor*
> 
> Hello guys!
> I finally decided to OC my GPU so I read on some guides how to overclock a GPU and once I felt ready for it I went for it. I downloaded MSI Afterburner since it was the software most guides recommend. I then started with turning the power limit to 20% because I didn't wanted to touch the core voltage. (btw is it fine to have the power limit on 50%?) I then softly started with increasing the core clock up and kept increasing my core clock by a little and then ran valley test to check for artifacts. Anyway, I didn't notice any temperature difference during the valley test (during it), not so weird maybe since I didn't touch the core voltage but I did notice a huge difference in idle temps. Once I hit core clock 1100 my gpu was already idling on 55c. I could probably push the core clock more but since my card was almost hitting the load temperature which is 75c in idle I didn't wanted to continue. So that is why I'm here now desperately seeking out for help from you guys that can probably help me out with this problem. So if anyone of you know what I should to to get rid of this problem I would really appreciate your help.


Use a custom fan profile in afterburner. Will be louder but will keep it cool.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Wezzor*
> 
> Hello guys!
> I finally decided to OC my GPU so I read on some guides how to overclock a GPU and once I felt ready for it I went for it. I downloaded MSI Afterburner since it was the software most guides recommend. I then started with turning the power limit to 20% because I didn't wanted to touch the core voltage. (btw is it fine to have the power limit on 50%?) I then softly started with increasing the core clock up and kept increasing my core clock by a little and then ran valley test to check for artifacts. Anyway, I didn't notice any temperature difference during the valley test (during it), not so weird maybe since I didn't touch the core voltage but I did notice a huge difference in idle temps. Once I hit core clock 1100 my gpu was already idling on 55c. I could probably push the core clock more but since my card was almost hitting the load temperature which is 75c in idle I didn't wanted to continue. So that is why I'm here now desperately seeking out for help from you guys that can probably help me out with this problem. So if anyone of you know what I should to to get rid of this problem I would really appreciate your help.


Your idle temps should not be affected by overclocking.


----------



## Wezzor

Quote:


> Originally Posted by *the9quad*
> 
> Use a custom fan profile in afterburner. Will be louder but will keep it cool.


But that makes no sense, mate. I did not increase the core voltage, and my load temps stay the same 75°C at either 1000 or 1100 MHz core clock, but my idle temps went from 35°C at 1000 to 55°C at 1100. If both had changed I would find it logical, but since only the idle temperature is changing, it feels like something is pushing it. Maybe some built-in feature of the GPU? I was thinking that maybe I should change something in RTSS?


----------



## Wezzor

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Your idle temps should not be affected by overclocking.


Exactly! That's why I'm so confused and frustrated.


----------



## Ebefren

First OC: 1100 MHz GPU, 1500 MHz vRAM, +47 mV using Sapphire TriXX (I don't know the precise voltage; TriXX only displays the offset). Stable, 80°C with the fan at 53% (auto).



Pretty cool









----------



## th3illusiveman

1150Mhz core, 1525Mhz mem 3DM11
http://www.3dmark.com/3dm11/8744053


----------



## ozzy1925

You don't need to increase the memory clock for 3DMark.


----------



## nightfox

Quote:


> Originally Posted by *th3illusiveman*
> 
> 1150Mhz core, 1525Mhz mem 3DM11
> http://www.3dmark.com/3dm11/8744053


According to 3DMark, your GPU ran at 1040/1300. Throttling, or a 3DMark bug?


----------



## Yuriewitsch

Quote:


> Originally Posted by *nightfox*
> 
> According to 3DMark, your GPU ran at 1040/1300. Throttling, or a 3DMark bug?


I think it's just showing the default clocks.


----------



## ozzy1925

If you're using Sapphire TriXX, 3DMark shows the stock clocks no matter what the core speed is.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ozzy1925*
> 
> If you're using Sapphire TriXX, 3DMark shows the stock clocks no matter what the core speed is.


It showed them fine for me:









http://www.3dmark.com/fs/1805629


http://www.3dmark.com/fs/1805770


----------



## mus1mus

Hey Sarge, how much can you score with a single 290 in Firestrike?


----------



## Vici0us

I just flashed my reference ASUS 290 to the same clocks as my Windforce 290 (1040 MHz). What power limit and voltage should I start with to get at least 1100-1150 MHz? Any help would be appreciated. I need both cards stable. I'm using TriXX, by the way.


----------



## Sgt Bilko

Quote:


> Originally Posted by *mus1mus*
> 
> Hey sarge, how much can you score with a single 290? firestrike


Highest so far: http://www.3dmark.com/fs/2002510

I don't think I have a screencap of it anywhere displaying the particulars, but I'll have a look.


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Highest so far: http://www.3dmark.com/fs/2002510
> 
> I don't think I have a screencap of it anywhere displaying the particulars, but I'll have a look.


That's off by a mile. Here is my 1260 run on the 13.x driver:

http://www.3dmark.com/3dm/2713716

Here is 1200 on the 14.x driver:

http://www.3dmark.com/3dm/4079609?

Yes, 1200. Did you raise your power limit to max?


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Highest so far: http://www.3dmark.com/fs/2002510
> 
> I don't think i have a screencap around for it anywhere displaying the particulars but i'll have a look
> 
> 
> 
> that's off by a mile. here is my 1260 using 13 driver . . .
> 
> http://www.3dmark.com/3dm/2713716
> 
> here is 1200 using 14.X driver . . .
> 
> http://www.3dmark.com/3dm/4079609?
Click to expand...

That score was on the old 13.12 driver, and you also get a higher graphics score by using an Intel CPU (weird, I know).


----------



## mus1mus

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Highest so far: http://www.3dmark.com/fs/2002510
> 
> I don't think i have a screencap around for it anywhere displaying the particulars but i'll have a look


Mine's way off then, sitting at 8500-ish at 1150/1400.


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That score was on the old 13.12 driver and you also get a higher graphics score by using an Intel CPU (weird i know)


The CPU does not matter much for those runs; it should make a difference of 100 points at most in graphics. Power limit?


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> That score was on the old 13.12 driver and you also get a higher graphics score by using an Intel CPU (weird i know)
> 
> 
> 
> cpu does not matter much for those runs. should be a difference of 100 points at most in graphics. power limit?
Click to expand...

That was TriXX at +180 mV IIRC and +50% power limit (the card never throttled).

I'm running 14.7 RC3 atm.


----------



## mus1mus

Does Catalyst still control the power limit when you adjust it via AB or TriXX?


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That was Trixx at +180mV iirc and +50% powerlimit (card never throttled)
> 
> I'm running 14.7RC3 atm


Try the 14.x driver and do 1200 on the core first. It could be error correction kicking in.
Quote:


> Originally Posted by *mus1mus*
> 
> Does Catalyst still control Power Limit when you adjust via AB or Trixxx?


In my case, CCC is out of the picture, just like Word or Excel. Make sure Overdrive is disabled and CCC is closed, then use TriXX or AB.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> That was Trixx at +180mV iirc and +50% powerlimit (card never throttled)
> 
> I'm running 14.7RC3 atm
> 
> 
> 
> Try 14.X driver and do 1200 core first. could be error correction kicking in.
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Does Catalyst still control Power Limit when you adjust via AB or Trixxx?
> 
> Click to expand...
> 
> In my case, CCC is out of the picture like WORD or Excel. Make sure Overdrive is disabled and close it, then use Trixx or AB.
Click to expand...

Downloading 14.x atm; I will test it out tomorrow and report back.

And Mus, if you still want CCC, just disable Overdrive and then run with AB. If you're using TriXX it can be a pain, depending on the driver.


----------



## mus1mus

Alrighty.

Thanks fellas.


----------



## ozzy1925

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It showed these up fine for me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://www.3dmark.com/fs/1805629
> 
> 
> http://www.3dmark.com/fs/1805770


Weird, because it only shows the correct speeds when I use Afterburner and TriXX at the same time.
http://www.3dmark.com/fs/2703168


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> That was Trixx at +180mV iirc and +50% powerlimit (card never throttled)
> 
> I'm running 14.7RC3 atm
> 
> 
> 
> Try 14.X driver and do 1200 core first. could be error correction kicking in.
> .
Click to expand...

14.X 1200/1500

http://www.3dmark.com/3dm/4119913

Graphics Score 12318 vs your 13115


----------



## Offler

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 14.X 1200/1500
> 
> http://www.3dmark.com/3dm/4119913
> 
> Graphics Score 12318 vs your 13115


What is your NB and HT frequency?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Offler*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> 14.X 1200/1500
> 
> http://www.3dmark.com/3dm/4119913
> 
> Graphics Score 12318 vs your 13115
> 
> 
> 
> What is your NB and HT frequency?
Click to expand...

2200/2600

NB doesn't have that big of an impact on Vishera and HT only really affects Multi-GPU setups


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 14.X 1200/1500
> 
> http://www.3dmark.com/3dm/4119913
> 
> Graphics Score 12318 vs your 13115


Not sure what to tell you. I know that when I use AB I get close to 1K fewer points in graphics vs. TriXX, but you are using TriXX. I compared my scores to Home's and we are pretty close.


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> 14.X 1200/1500
> 
> http://www.3dmark.com/3dm/4119913
> 
> Graphics Score 12318 vs your 13115
> 
> 
> 
> not sure what to tell you. i know that when i use AB i get less than 1K points in graphics vs Trixx. but you are using Trixx. i compared my scores to Home's and we are pretty close.
Click to expand...

That was with AB; I'll re-run with TriXX.


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That was with AB, i'll re-run with Trixx


You don't want to see Offler's score.


----------



## Offler

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 2200/2600
> 
> NB doesn't have that big of an impact on Vishera and HT only really affects Multi-GPU setups


I would never run HT at a frequency lower than 2400 MHz. A 16-bit bus width × 2400 MHz unidirectional is 4800 MB/s, which is about half of PCIe 2.0 bandwidth. According to measurements with pciespeedtest on my HD 7970, the typical rate for data transferred from CPU to GPU and back was 5500 MB/s.

http://www.3dmark.com/fs/2792880

I focused on NB/HT frequency - 2800/2600 - and very low RAM latencies (1600 MHz, CAS 7-7-7-18, CR1). The CPU isn't the best piece, so 4 GHz is not possible atm... That result was achieved without AB or TriXX, only with CCC. The result with AB is worse by a few points.

My graphics score is only 11800, but the total score is higher since I squeezed every bit of performance out of the CPU and RAM. Today I am going to use TriXX and tweak the fan. Once GPU throttling is removed I will see some real results. I expect a graphics score of about 12400.
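As a footnote, the HT bandwidth figure above is easy to sanity-check. A quick sketch (the 16-bit link width and 2400 MHz transfer rate are the numbers from this post; the PCIe 2.0 comparison assumes a x16 slot at roughly 500 MB/s per lane):

```python
# Unidirectional HyperTransport bandwidth: link width in bits / 8 gives bytes
# per transfer; multiplying by the transfer rate in MHz yields MB/s.
def ht_bandwidth_mb_s(link_width_bits: int, rate_mhz: float) -> float:
    return link_width_bits / 8 * rate_mhz

pcie2_x16_mb_s = 16 * 500  # PCIe 2.0: ~500 MB/s per lane, 16 lanes

print(ht_bandwidth_mb_s(16, 2400))                    # 4800.0 MB/s
print(ht_bandwidth_mb_s(16, 2400) / pcie2_x16_mb_s)   # 0.6, i.e. roughly half
```

That 4800 MB/s ceiling sits below the ~5500 MB/s measured over PCIe in the post, which is the argument for not dropping HT below 2400 MHz.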


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> That was with AB, i'll re-run with Trixx
> 
> 
> 
> you don't want to see offler's score.
Click to expand...

Eh, it throttled with TriXX; not really sure what to say.
I have my 9590 at stock because, tbh, it's fast enough for me there.
RAM is a modest 2133 MHz at 10-11-11-31, and that's it; everything else is stock.
Quote:


> Originally Posted by *Offler*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> 2200/2600
> 
> NB doesn't have that big of an impact on Vishera and HT only really affects Multi-GPU setups
> 
> 
> 
> I would never run HT on lower frequency than 2400MHz. 16 bit bus width x 2400Mhz unidirectional is 4800mb/s which is about half of PCI-E 2.0 bandwidth. According to measures by pciespeedtest with my HD 7970, the typical values transferred from CPU to GPU and back was 5500mb/s.
> 
> http://www.3dmark.com/fs/2792880
> 
> I focused on NB/HT frequency - 2800/2600 - and very low RAM latencies (1600 MHz, CAS 7-7-7-18). The CPU isn't the best piece, so 4 GHz is not possible atm... That result was achieved without AB or TriXX, only with CCC. The result with AB is worse by a few points.
> 
> My graphics score is only 11800, but total score is higher since I squeezed from CPU and RAM every bit of performance. Today I am going to use TriXX and tweak the fan. Once GPU throttling is removed I will see some real results. I expect about 12400 of graphics score.
Click to expand...

My best runs are always Crossfire; for some reason I seem to get a better Physics score with Crossfire than with a single GPU at the same clocks/settings.
Combined counts for a good amount of the score in Firestrike because it only uses "physical" cores, so your Phenom would have all 6 loaded up while my FX only has 4.
It took me a while to figure out how they calculate it, but there you go.

I've done my best on Firestrike so far. I may tweak around again in the future, but for now I'm happy with the way it performs.


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Eh, it throttled with Trixx, not really sure what to say.
> I have my 9590 at stock because tbh it's fast enough for me there.
> Ram is a modest 2133Mhz 10-11-11-31 and that's it, everything else is stock.
> My best runs are always Crossfire, for some reason i seem to get a better Physics score with Crossfire over Single GPU at the same clocks/settings.
> Combined counts for a good amount of score in Firestrike due to the fact that it only uses "Physical" cores, thus your Phenom would have all 6 loaded up and my FX only has 4.
> took me a while to figure out how they calculated it but there you go
> 
> I've done my best on Firestrike so far, i may tweak around again in the future but for now i'm happy with the way it performs


Sgt, could it be your monitor(s)?


----------



## Sgt Bilko

Quote:


> Originally Posted by *rdr09*
> 
> Sgt, could it be your monitor(s)?


Good point, actually. It could be that, but I can't be bothered re-testing atm; it's heading towards midnight and I'm afraid I have to sleep at some point.


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Good point actually, could be that but i can't be bothered re-testing atm, heading towards midnight and i'm afraid i have to sleep at some point


sweet dreams.


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> sweet dreams.


are made of these....









In all seriousness... if my 290 can clock up to 1.2 GHz, should I look at a 970 for power efficiency? I have seen a lot of reviewers get them north of 1.3 GHz.


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> are made of these....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All seriousness...if my 290 can clock up to 1.2ghz...should I look at a 970 for power efficiency? I have seen alot of reviewers get them north of 1.3ghz.


I'd say yes if the power bill is high in your neck of the woods. Not sure how much you'll save, though.


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> i'd say yes if power bill is high in your neck of the woods. not sure how much you'll save.


It's actually not too bad here. However, I really don't like the added stress on my PSU when I OC both my CPU and the 290; it pretty much tops it out, IMO.


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> It's actually not too bad here. However I really don't like the added stress on my PSU when I OC both my CPU and 290, pretty much tops it out....imo.


You're going by those reviews, huh? Matching cards at stock against ones with boost. It seems to me that a 970 at 1300 is about equal to a 290 at 1100.

Those cores at 1500-1600 are mind-blowing.

Don't get the EVGA; it seems they have issues with the cooler.


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> you are going by those reviews, huh? matching cards at stock with those with boost. it seems to me that a 970 at 1300 is about equal to a 290 at 1100.
> 
> those core at 1500 - 1600 are mind blowing.
> 
> don't get the evga. seems like they have issues with the cooler.


No. The only ones I take seriously are LinusTechTips (runs all GPUs OC'd) and JayzTwoCents - namely this one. I'm just a little gun-shy about Gigabyte; I don't ever want another bum mobo from them.

1524 MHz is just insane.


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> No. Only ones I take seriously is LinusTechTips (runs all GPUs OC'd) and JayzTwoCents. Namely this one. Just a little gunshy about Gigabyte. I don't really want to have another bum mobo by them ever.
> 
> 1524mhz is just insane.


Get the cheapest one and OC it. Just avoid the ones having issues at launch, like the EVGA.


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> get the cheapest and oc it. just avoid the ones having issues at launch like evga.


I am looking for the quietest as well. I don't really want to do "The Mod" on one any more; too much of a PITA.


----------



## Offler

A 30 percent increase in frequency means a 30 percent increase in power consumption. Just don't be disappointed.


----------



## bluedevil

Quote:


> Originally Posted by *Offler*
> 
> A 30 percent increase in frequency means a 30 percent increase in power consumption. Just don't be disappointed.


Um, no. That's not how power consumption relates to frequency. Power consumption only increases as the voltage goes up, not the frequency.


----------



## mus1mus

It's a little complicated and tricky, really.

But if a component is on, it consumes power.

The higher the frequency, the more times it switches per second, and the more power it consumes.

Just not as big an impact as overvolting, where power consumption climbs much more steeply.

But the thing is, people start to overvolt as soon as they overclock. So yeah.


----------



## Zipperly

Quote:


> Originally Posted by *Offler*
> 
> A 30 percent increase in frequency means a 30 percent increase in power consumption. Just don't be disappointed.


lol


----------



## Zipperly

Quote:


> Originally Posted by *bluedevil*
> 
> I am looking for the quietest as well. Not really wanting to do "The Mod" on it any more. Too much pita.


You also want the best built PCB possible for better VRM phases and such.


----------



## Forceman

Quote:


> Originally Posted by *bluedevil*
> 
> Um no. That's not how power consumption on frequency works. Power consumption only increases as the voltage goes up, not frequency.


Actually, frequency is a factor in load power consumption for integrated circuits. Voltage enters as a square, so it plays a bigger part, but increased performance isn't free.
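For reference, the first-order model behind that statement is the standard CMOS dynamic-power relation P ≈ C·V²·f. A tiny sketch of how the two knobs compare (the 30% figures are just the numbers being debated here, not measurements):

```python
# First-order CMOS dynamic power: P ~ C * V^2 * f. In a ratio the constant
# capacitance C cancels, so only the frequency and voltage scale factors matter.
def power_scale(freq_scale: float, volt_scale: float) -> float:
    return freq_scale * volt_scale ** 2

print(f"{power_scale(1.30, 1.00):.2f}x")  # +30% clock alone   -> 1.30x power
print(f"{power_scale(1.00, 1.30):.2f}x")  # +30% voltage alone -> 1.69x power
```

So a clock bump alone scales power linearly, while a voltage bump of the same percentage costs noticeably more.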


----------



## Zipperly

Quote:


> Originally Posted by *Forceman*
> 
> Actually frequency is a factor in the load power consumption for integrated circuits. It's the square of voltage so that plays a bigger part, but increased performance isn't free.


That's true; I think the other guy just exaggerated it a bit, is all.


----------



## bluedevil

Quote:


> Originally Posted by *Forceman*
> 
> Actually frequency is a factor in the load power consumption for integrated circuits. It's the square of voltage so that plays a bigger part, but increased performance isn't free.


OK, I get it. But it's definitely not this:
Quote:


> Originally Posted by *Offler*
> 
> A 30 percent increase in frequency means a 30 percent increase in power consumption. Just don't be disappointed.


----------



## mus1mus

Actually, it's not an exaggeration. It's true.









A 30% OC = 30% more power consumed.

Reasons why you can't get there, or why chips don't scale that way:

Power limit.
The voltage required to switch them faster.
Circuitry.
Etc.

That is why certain clocks can't be achieved without an increase in voltage and power limit, and a proper power section to support them.


----------



## Zipperly

Quote:


> Originally Posted by *mus1mus*
> 
> Actually, it's not exaggeration. It's true.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> A 30% OC = 30% more power consumed.


Okay then I stand corrected. Thank you.


----------



## bluedevil

Quote:


> Originally Posted by *Zipperly*
> 
> Okay then I stand corrected. Thank you.


I as well.









Just didn't want to believe it.









So if my 290 pulls 254 W @ 975 MHz, then at 1200 MHz it pulls about 23% more watts. This also explains why I have to set my core voltage in MSI Afterburner to +25.
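Under a frequency-only scaling assumption (no voltage change and no throttling; a simplification), that 23% figure checks out:

```python
# Scale the measured 254 W @ 975 MHz linearly with clock to estimate the draw
# at 1200 MHz. This ignores the extra core voltage, so real draw would be higher.
base_watts, base_mhz, target_mhz = 254.0, 975.0, 1200.0

scale = target_mhz / base_mhz
print(f"{scale:.3f}x clock -> about {base_watts * scale:.0f} W")  # 1.231x -> about 313 W
```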


----------



## Offler

I forgot to mention that a 30% OC means 30% more power draw only under full load, of course.

I'm just used to it from the old days, when dynamic underclocking for power saving was not common, but generally it's no exaggeration.

Edit:

Done testing with TriXX. The result is even a bit worse than with AB.
http://www.3dmark.com/3dm/4121750

Will try again after some tuning.


----------



## mus1mus

Quote:


> Originally Posted by *Zipperly*
> 
> Okay then I stand corrected. Thank you.


Quote:


> Originally Posted by *bluedevil*
> 
> I as well.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just didn't want to believe it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So if my 290 pulls 254w @ 975mhz, then at 1200mhz, it pulls 23% more watts. This also explains why I have to put my core voltage in MSI afterburner to +25.


Seriously, if there were no limits on a card's power draw, we'd be seeing about 2x the stock power draw when we OC by 30% and overvolt by 30% - at least theoretically.

Manufacturers know that and are keen on designing the cards not to do that, as it's very dangerous for them.









Quote:


> Originally Posted by *Offler*
> 
> I forgot to mention that 30% OC means 30% more power draw only under full load of course.
> 
> I'm just used to it from the old days, when dynamic underclocking for power saving was not common, but generally it's no exaggeration.
> 
> Edit:
> 
> Done testing with TriXX. Result even a bit worse than with AB.
> http://www.3dmark.com/3dm/4121750
> 
> Will try again after some tuning.


With the same tests, the power-draw numbers should track each other, IMO. It doesn't need to be at full load to see the increase in power draw at stock vs. OC'ed.

But then again, theoretical values do not reflect real-world usage most of the time.







But theories can be our guide.
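The "about 2x" figure quoted above falls straight out of the same P ∝ f·V² rule of thumb (again a theoretical ceiling, not a measurement):

```python
# Combined effect of +30% clock and +30% voltage under P ~ f * V^2.
oc, ov = 1.30, 1.30
print(f"{oc * ov ** 2:.2f}x stock power")  # 2.20x, i.e. roughly double
```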


----------



## Meatdohx

Quote:


> Originally Posted by *95139ieaci*
> 
> I have to rant for a little bit about my Asus R9 290 DC2OC. I got the card 2 days ago. No overclocking at all in my rig.
> 
> The card performs well with high-demand games such as Metro: Last Light, but with low-demand games such as The Sims 4 and Fable Anniversary it crashes (BSOD, atikmpag.sys) constantly - sometimes as soon as loading ends, usually 5 minutes into a game.
> 
> I can't believe that after nine months (since its release), the card (and/or its driver/BIOS) is still faulty out of the box.
> 
> The card doesn't crash in 3dmark or furmark(1 hour burn-in), and its core temperature is always below 80C, leading me to believe the problem is not with PSU nor heat.
> 
> I am out of ideas.
> 
> Some posts around the internet suggest that the problem is about power play (power management) keeps changing core clock and that I should lock the core clock with MSI Afterburner. I haven't tried the method yet, but it has been a really frustrating process getting the card to work.
> 
> Current gig:
> i5 4590
> ASRock B85-ITX
> win 8.1 64bit
> Asus R9 290 DC2OC
> Silverstone SST-ST45SF (450w bronze ITX)


I know I'm late to the party, but I had this issue. Only DX9 games crash, right?

DX11 and Mantle are OK?

If that is the case, you got a defective card. ASUS DC2 cards have been horrible lately.


----------



## the9quad

Quote:


> Originally Posted by *Meatdohx*
> 
> I know im late to the party but i had this issue. Only dx9 game crash right?
> 
> DX11 and mantle are ok?
> 
> If that is the case you got a defective card. Asus DC 2 card have been horrible lately.


This is my problem as well. How can it be a defective card if it works in DX11 but crashes in DX9? Sounds like a driver issue to me.

More information please. Appreciate it.


----------



## Roy360

Just installed 14.4, coming from 13.2.

BSOD.

Used DDU to remove the old drivers, then installed 14.4.

Now I'm missing HydraVision... and installing it separately doesn't work. Does anyone know how to make it work?


----------



## Sgt Bilko

Quote:


> Originally Posted by *the9quad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Meatdohx*
> 
> I know im late to the party but i had this issue. Only dx9 game crash right?
> 
> DX11 and mantle are ok?
> 
> If that is the case you got a defective card. Asus DC 2 card have been horrible lately.
> 
> 
> 
> this is my problem as well, how can it be a defective card, if it works in DX11, but crashes in DX9? Sounds like a driver issue to me.
> 
> more information please. appreciate it.
Click to expand...

I get this problem in Fallout: New Vegas. It doesn't crash, but I get a random multi-colour screen every so often when playing.

That was on 14.7 RC3; I'm on 14.x now but haven't tried it yet.

Out of curiosity, have you tried it with Crossfire disabled?


----------



## Offler

OK... My PC is a mystery machine...

Yesterday my tests with OCCT were doing 778 FPS; today, only 757 FPS (certainly CPU-limited). And you guessed right that the 3DMark scores went down.









To Sgt. Bilko:
This happened when I decreased HT to 2400 MHz:
http://www.3dmark.com/fs/2803515

The strange thing is that TriXX was set to 1080/1400 MHz.


----------



## maynard14

Is it worth selling my 290 (unlocked to a 290X) for an MSI Gaming GTX 970? I can't afford a GTX 980 even if I sell my unlocked 290X.


----------



## sugarhell

Quote:


> Originally Posted by *Offler*
> 
> Ok... My PC is a Mystery machine...
> 
> Yesterday my tests with OCCT were doing 778FPS, today only 757FPS. (Certainly CPU limited) And you guess right that 3D mark scores went down
> 
> 
> 
> 
> 
> 
> 
> 
> 
> to Sgt. Bilko
> This happened when I decreased HT to 2400 MHz:
> http://www.3dmark.com/fs/2803515
> 
> Strange is that TriXX was set to 1080/1400MHz.


Why do you care about OCCT?


----------



## Offler

Short answer = for reference, long answer in PM.


----------



## the9quad

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I get this problem in Fallout: New Vegas, it doesnt crash but I get a random multi color screen every so often when playing.
> 
> That was on 14.7 RC3 im on 14.x now but haven't tried it yet.
> 
> Out of curiosity, have you tried it with Crossfire disabled?


Works fine in single card


----------



## BradleyW

Quote:


> Originally Posted by *the9quad*
> 
> this is my problem as well, how can it be a defective card, if it works in DX11, but crashes in DX9? Sounds like a driver issue to me.
> 
> more information please. appreciate it.


Have you tried running the DirectX online update tool? It should add or fix vital files required for DX9 and above.
http://www.microsoft.com/en-us/download/details.aspx?id=35
Quote:


> Originally Posted by *the9quad*
> 
> Works fine in single card


Try the other card on its own as well, just to make sure neither of the cards is defective.


----------



## Sgt Bilko

Quote:


> Originally Posted by *the9quad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I get this problem in Fallout: New Vegas, it doesnt crash but I get a random multi color screen every so often when playing.
> 
> That was on 14.7 RC3 im on 14.x now but haven't tried it yet.
> 
> Out of curiosity, have you tried it with Crossfire disabled?
> 
> 
> 
> Works fine in single card
Click to expand...

I tested single-card on both of mine and it was fine; sounds like a driver issue.


----------



## BradleyW

I take it you've all disabled ULPS? That fixed my DX9 crashes.


----------



## Sgt Bilko

Quote:


> Originally Posted by *BradleyW*
> 
> I take it you've all disabled ULPS? That fixed my DX9 crashes.


Yep. I'm not getting crashes, though, like I said -

just a random multi-coloured screen occasionally, but only when Crossfire is enabled (pointless, considering).


----------



## Offler

Anyone here still using the stock cooler? Well, I do...

I have checked its construction, and I was a bit amazed by two things.

a) The copper plate on the cooler is in fact quite a big vapour chamber, with really good parameters.
When you examine the whole copper plate, you realize that it's not plain copper but a vapour chamber. Some people who wanted to remove it to reuse the base had to desolder the metal holding it in place. Somehow it ended up like this (check the screenshot):

http://www.overclock.net/t/1453555/quick-guide-for-the-disassembly-and-re-use-of-amd-290-reference-cooler-as-vrm-ram-heatsink#post_21469120

The bloated copper confirms that it's a vapour chamber.

So why is a vapour chamber like this unable to transport heat away from the GPU?

b) The embossed rectangle that should make contact with the GPU is not finished.
The rectangle that should have perfect contact with the GPU has the same rough surface as the copper around it, and it is usually bent slightly inward, so contact with the GPU is poor.





Has anybody here tried lapping this cooler?

Edit:
I found some people who did it. They claim a decrease in temperatures of 10 to 15°C (both idle and load); they also used a decent thermal compound.


----------



## ozzy1925

Guys, I know this is not the place, but I have a question: I sent 2 of my 3 290 cards in for service because of fan rattling, and today they told me I am going to get my money back. I already bought EK waterblocks and EK backplates for them. What do you suggest I do? Sell the other card and get 2x GTX 980 with all the money?


----------



## mus1mus

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I tested out Single card on both of mine and it was fine, sounds like a driver issue.


Which driver are you using now, Sarge?

Also, TriXX has no power target option?

Edit: scratch that. So dumb of me... lol


----------



## BradleyW

Quote:


> Originally Posted by *ozzy1925*
> 
> guys,i know this is not the place but have a question : i have sent my 2 out of 3x 290 cards to service and because of fan rattling and today they tell me i am going to get my money back.I already bought ek waterblock and ek back plates for them.What do you suggest me to do ?Sell the other card and get 2x980gtx with all the money?


I'd go with 2 980's or wait to see what AMD bring out.


----------



## Sgt Bilko

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I tested out Single card on both of mine and it was fine, sounds like a driver issue.
> 
> 
> 
> Which driver are you using now Sarge?
> 
> Also Trixx has no Power Target Option?
> 
> Edit, scratch that. So dumb of me.. lol
Click to expand...

I'm on 14.x atm; haven't done too much with it yet, though.


----------



## ozzy1925

Quote:


> Originally Posted by *BradleyW*
> 
> I'd go with 2 980's or wait to see what AMD bring out.


Thanks. TBH, I really liked my cards, but after seeing the performance vs. power usage of the 980s, I think I will go with EVGA 980 SC SLI.


----------



## bluedevil

Just realized that Mantle wasn't in the 14.4s - or at least I couldn't find it. Installed the 14.7s that went live on Aug 16th. All is good.


----------



## BradleyW

Quote:


> Originally Posted by *ozzy1925*
> 
> thanks, tbh i really liked my cards but after seeing performance vs power usage of the 980s i think i will go with evga 980 sc sli.


Go with the cheapest reference 980's and watercool them.


----------



## Talon720

Quote:


> Originally Posted by *bluedevil*
> 
> Just realized that Mantle wasn't in the 14.4s, or at least I couldn't find it. Installed the 14.7s that went live on Aug 16th. All is good.


I think - well, thought - that Mantle was first added in a 14.1 driver version; it's definitely in 14.4. I've had good luck with 14.4 and currently the 14.8 WHQL, which I read on AMD's site went live Aug 12th. So wait, is that 14.7 driver a newer beta release? If it's just 14.7, I would have thought it would be 14.8.
Quote:


> Originally Posted by *ozzy1925*
> 
> thanks, tbh i really liked my cards but after seeing performance vs power usage of the 980s i think i will go with evga 980 sc sli.


Yeah, personally, in your position I'd go with the Nvidia cards; they look nice performance- and feature-wise. You could go 970 to hold you over till the 20nm cards start coming out. Also, I know the EVGA 970 had the heatsink flaw; you might want to make sure the 980s don't have the same flaw.

I also wonder if Nvidia is going to support the new VESA 1.3 adaptive sync; that would be a good sign of things to come.


----------



## Forceman

Quote:


> Originally Posted by *maynard14*
> 
> is it worth it selling my unlock 290 to 290x for a msi gaming gtx 970? i cant afford buying gtx 980 even i sell my 290x unlock


Not for a 970, unless you have other considerations (like you're on a reference card and the noise is bothering you, something like that).


----------



## bluedevil

Quote:


> Originally Posted by *Talon720*
> 
> I think (well, thought) Mantle was first added in a 14.1 driver version; it's definitely in 14.4. I've had good luck with 14.4 and currently the 14.8 WHQL, which I read on AMD's site went live Aug 12th. So wait, is that 14.7 driver a newer beta release? Just 14.7? I would have thought it would be 14.8.
> Yeah, personally in your position I'd go with the Nvidia cards; they look nice performance- and feature-wise. You could go 970 to hold you over till the 20nm cards start coming out. Also, I know the EVGA 970 had the heatsink flaw; might wanna make sure the 980s don't have the same flaw.
> 
> I also wonder if nvidia is going to be supporting the new vesa 1.3 adaptive sync if that's a good sign of things to come.


Like I said, I couldn't find it in BF4 like I have with other Catalysts. Installed the 14.7s and bam, auto selected on startup. Also the 14.8 were very buggy for me in FC3 (was playing, now back to BF4) so I installed the 14.4s.


----------



## Kriant

re-building my loop with quad r9 290x.


----------



## th3illusiveman

+30% frequency =/= +30% voltage


----------



## kizwan

Quote:


> Originally Posted by *bluedevil*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Talon720*
> 
> I think (well, thought) Mantle was first added in a 14.1 driver version; it's definitely in 14.4. I've had good luck with 14.4 and currently the 14.8 WHQL, which I read on AMD's site went live Aug 12th. So wait, is that 14.7 driver a newer beta release? Just 14.7? I would have thought it would be 14.8.
> Yeah, personally in your position I'd go with the Nvidia cards; they look nice performance- and feature-wise. You could go 970 to hold you over till the 20nm cards start coming out. Also, I know the EVGA 970 had the heatsink flaw; might wanna make sure the 980s don't have the same flaw.
> 
> I also wonder if nvidia is going to be supporting the new vesa 1.3 adaptive sync if that's a good sign of things to come.
> 
> 
> 
> 
> 
> 
> 
> Like I said, I couldn't find it in BF4 like I have with other Catalysts. Installed the 14.7s and bam, auto selected on startup. Also the 14.8 were very buggy for me in FC3 (was playing, now back to BF4) so I installed the 14.4s.
Click to expand...

Probably bad driver installation. That's all I can think of though.
Quote:


> Originally Posted by *th3illusiveman*
> 
> +30% frequency =/= +30% voltage


Correction. They're talking about power consumption, not voltage.
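Worth spelling out why that correction matters: CMOS dynamic power scales roughly with frequency times voltage squared (P ≈ C·f·V²), so a +30% core clock at fixed voltage costs roughly +30% power, while any voltage bump on top compounds quadratically. A rough sketch of the rule of thumb (the 1.20 V stock figure is an assumed placeholder, not a measured 290X value):

```python
# Relative dynamic power from the P ~ f * V^2 rule of thumb.
# The capacitance term cancels when comparing against stock.
def relative_power(f_ratio, v_stock, v_offset_mv):
    v_ratio = (v_stock + v_offset_mv / 1000.0) / v_stock
    return f_ratio * v_ratio ** 2

# +30% core clock at stock voltage: ~1.3x power
print(round(relative_power(1.30, 1.20, 0), 2))     # 1.3
# +30% core clock plus a +100 mV bump: ~1.53x power
print(round(relative_power(1.30, 1.20, 100), 2))   # 1.53
```

Leakage grows with voltage and temperature too, so real-world numbers land a bit above this idealized ratio.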


----------



## Meatdohx

Quote:


> Originally Posted by *the9quad*
> 
> this is my problem as well, how can it be a defective card, if it works in DX11, but crashes in DX9? Sounds like a driver issue to me.
> 
> more information please. appreciate it.


I troubleshot one of my cards for weeks, trying a fresh Windows install and every single version of the drivers. Nothing would fix it: I would BSOD in every DX9 game I played, but DX11 would never crash.

It took me a while to realize that. I tested my second card alone in DX9, and I could reproduce the BSOD in less than 5 minutes every time in DX9, yet I could play for hours in DX11.

I sent that card in for RMA, and when I got the card back I could run everything without any problems.

Correct me if I am wrong, but isn't there something hardware-related in a card that is used in DX9 that wouldn't be used in DX11?

Anyway, I strongly suspect one of your cards is faulty.

Also, my card had Elpida memory; the one I got back had Hynix memory.


----------



## th3illusiveman

What type of power draw am I looking at with a 2500K @ 4.4GHz and a 290X with +100mV in Afterburner? What difference in watts would there be using +81mV instead of +100mV? Is +100mV safe for 24/7 use if the core is kept under 80°C and the VRMs under 85°C?


----------



## Gobigorgohome

I got my golden 4930K today and did a quick and dirty overclock on it and the GPUs.

Running the 4930K @ 4.6 GHz with 1.296 Vcore, 16GB 1866MHz @ 1.5V, 4x Sapphire Radeon R9 290X (stock BIOS), MSI AB 1100/1300, +30 power limit, no extra voltage. I think I am on 13.6; if not that one, I have no idea.

In 3DMark FireStrike I only get 20860 points, which seems a little low to me. Max core clock shows 1800 MHz ... even though the CPU is overclocked to 4.6 GHz.


----------



## ZealotKi11er

Quote:


> Originally Posted by *th3illusiveman*
> 
> What type of power draw am I looking at with a 2500K @ 4.4GHz and a 290X with +100mV in Afterburner? What difference in watts would there be using +81mV instead of +100mV? Is +100mV safe for 24/7 use if the core is kept under 80°C and the VRMs under 85°C?


Yes, +100mV is safe. I think with +100mV you are looking at ~75W more if you set the power limit to max. These cards pull ~300W stock and ~450W at max OC with +50% power limit.
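Those ballpark wattages line up with how the PowerTune slider works: it scales the board power cap multiplicatively, so +50% on a roughly 300 W card allows roughly 450 W. A quick sketch (the 300 W baseline is the rough stock draw quoted above, not a spec-sheet value):

```python
# Board power cap after applying the PowerTune slider (percent offset).
def powertune_cap(stock_watts, limit_percent):
    return stock_watts * (1 + limit_percent / 100.0)

print(powertune_cap(300, 50))   # 450.0 -- matches the ~450 W max-OC figure
print(powertune_cap(300, -20))  # 240.0 -- the slider also goes negative for quieter running
```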


----------



## Cyber Locc

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I got my golden 4930K today and did a quick and dirty overclock on it and the GPU's.
> 
> Running 4930K @ 4,6 with 1,296vc, 16GB 1866Mhz @ 1,5v, 4x Sapphire Radeon R9 290X (stock bios) MSI AB 1100/1300 +30 power limit, no voltage. I think I am on 13.6, if not that one I have no idea.
> 
> In 3DMark FireStrike I only get 20860 points which seems a little low to me. Max core clock show 1800 Mhz ... even though the CPU is overclocked to 4,6 Ghz.


Dual PSUs? If not, what PSU is powering that powerhouse?


----------



## kizwan

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I got my golden 4930K today and did a quick and dirty overclock on it and the GPU's.
> 
> Running 4930K @ 4,6 with 1,296vc, 16GB 1866Mhz @ 1,5v, 4x Sapphire Radeon R9 290X (stock bios) MSI AB 1100/1300 +30 power limit, no voltage. I think I am on 13.6, if not that one I have no idea.
> 
> In 3DMark FireStrike I only get 20860 points which seems a little low to me. Max core clock show 1800 Mhz ... even though the CPU is overclocked to 4,6 Ghz.


Futuremark SystemInfo doesn't always get the info (CPU/GPU clocks, etc.) right, but it doesn't affect the scores.


----------



## Gobigorgohome

Quote:


> Originally Posted by *cyberlocc*
> 
> dual puss if not what psu is powering that power house


I have 2x EVGA G2 1300W (sig rig).
Quote:


> Originally Posted by *kizwan*
> 
> Futuremark SystemInfo doesn't always get the info (CPU/GPU clocks, etc.) right, but it doesn't affect the scores.


Okay, then no worries there. But the result still seems a little low; I am going to push the CPU all the way to 5 GHz and see where it performs best (voltage and temperature).


----------



## Offler

Quote:


> Originally Posted by *Offler*
> 
> Anyone here still using Stock cooler? Well I do...
> 
> Blah blah blah, bad contact, lapping.
> 
> 
> 
> 
> 
> Has anybody here tried lapping this cooler?
> 
> Edit:
> Found some people who did it. They claim a decrease in temperatures of 10 to 15°C (both idle and load). They also used a decent thermal compound.


Just did a "quick lapping", which means I did it in 2 hours, not 4... The shape of the cooler wasn't as concave as I feared. Near the center were two holes, and one bump near the bottom, so I brushed them away with sandpaper. Then I applied Arctic Silver 5. The paste needs some time to settle in, as expected, but here are the first numbers:

OCCT:
Before: >94°C at 78% fan speed
After: 89°C at 70% fan speed, and stable.

Air temperature near the exhaust during the stress test:
Before: 45°C
After: 50°C

Idle:
Before: 46°C
After: 38°C

Browsing with the "clock lock" bug*:
Before: 80°C
After: 66°C

Mortal Kombat 9:
Before: 72°C
After: 64°C

Idle values are close to the numbers achieved by other people. What the hell were the guys at Tom's Hardware doing?

* A YouTube video in a non-visible tab, or a minimized browser, caused the GPU frequency to lock at 954 or 977MHz, depending on driver version.


----------



## rdr09

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I have 2x EVGA G2 1300W (sig rig).
> Okay, then there is no worries there. But the result still seems a little low, I am going to push the CPU all the way to 5 Ghz and see where it performs best (voltage and temperature).


maybe you have power-saving features turned on in both the BIOS and Windows. in Windows, you need to set the power plan to High performance. your physics score should be around 17K at 4.6GHz.

for reference: http://www.3dmark.com/3dm/3223548

also . . . why 13.X drivers? use 14.X. i got like 500 points extra in graphics score in FS.

wait, look at Home's Tri: http://www.3dmark.com/fs/2314573

highly oc'ed, though.
Quote:


> Originally Posted by *Offler*
> 
> Just did a "quick lapping". Which means I did it in 2 hours, not 4... Shape of the the cooler wasnt as concave as I feared. Near the center were two holes and one bump near the bottom. So i brushed them away with sandpaper. The I applied Arctic Silver 5. The paste needs some time to act as expected but first numbers:
> 
> OCCT
> Before: >94°C at 78% fan speed
> After: 89 at 70% fan speed and stable.
> 
> Air temperature near exhaust during stress test
> Before 45
> After 50
> 
> Idle:
> Before: 46
> After: 38
> 
> Browsing with "clock lock"bug*:
> Before: 80°C
> After: 66°C
> 
> Mortal Kombat 9
> Before 72
> After 64
> 
> Idle values are close to the numbers achieved by other people. What the hell were the guys at Tom's Hardware doing?
> 
> * Youtube video on non-visible tab, or minimized browser caused GPU frequency to lock on 954 or 977MHz - depending on driver version.


nice temps. btw, WEI > tom's.


----------



## Gualichu04

I disabled CrossFire to try a new overclock on a single GPU, and this is my new 3DMark score for a single GPU:
http://www.3dmark.com/3dm/4136432? A 12090 graphics score is nice.


----------



## rdr09

Quote:


> Originally Posted by *Gualichu04*
> 
> I Disabled crossfire to try a new overclock on single gpu and this is my new 3dMark score for single gpu.
> http://www.3dmark.com/3dm/4136432? 12090 graphics score is nice


you raised the power limit? what do you use to oc? is overdrive in CCC disabled?

here is my 290 at same exact clocks . . .

http://www.3dmark.com/3dm/4136556?


----------



## Gualichu04

Quote:


> Originally Posted by *rdr09*
> 
> you raised the power limit? what do you use to oc? is overdrive in CCC disabled?
> 
> here is my 290 at same exact clocks . . .
> 
> http://www.3dmark.com/3dm/4136556?


MSI Afterburner, +69mV, 50% power limit. It's sad your 290 beats mine at the same clocks. Overdrive is disabled, and so is ULPS. I left PowerPlay enabled so they downclock at idle. The core frequency was fluctuating in 3DMark between 1050 and 1175.


----------



## rdr09

Quote:


> Originally Posted by *Gualichu04*
> 
> MSI Afterburner, +69mV, 50% power limit. It's sad your 290 beats mine at the same clocks. Overdrive is disabled, and so is ULPS. I left PowerPlay enabled so they downclock at idle. The core frequency was fluctuating in 3DMark between 1050 and 1175.


that's not good. it can't be temps 'cause yours is water, right? here is what AB shows on mine. i use Trixx.



edit: ok, this came from the bench thread. a 290X at 1150 . . .

http://www.3dmark.com/3dm/1484478

i don't trust AB. lol. that's the op's old 290X, which was using 13 driver.


----------



## Gualichu04

Quote:


> Originally Posted by *rdr09*
> 
> that's not good. it can't be temps 'cause yours water, right? here is what AB shows on mine. i use Trixx.


It only fluctuates the clocks in certain games or benchmarks; the max temp is 53°C on a single GPU. Maybe I should try it without PowerPlay. I just use it to save energy.


----------



## rdr09

Quote:


> Originally Posted by *Gualichu04*
> 
> It only fluctuates the clocks in certain games or benchmarks. the max temp is 53C on single gpu. Maybe i should try it without powerplay. I just use it to save energy.


yah, disable it.


----------



## Gualichu04

Quote:


> Originally Posted by *rdr09*
> 
> yah, disable it.


MSI still shows the core MHz going down, and GPU-Z shows the same thing. I read somewhere that even if you have Overdrive disabled in CCC, the power limit still won't be at 50% like you set it in MSI Afterburner; you must also set it to 50% in CCC so it does not downclock at load. I wonder if that's the issue. And in 3DMark the GPU doesn't stay at 100% load; it drops a few percent sometimes. Quite odd that it downclocks.


----------



## rdr09

Quote:


> Originally Posted by *Gualichu04*
> 
> MSI still shows the core MHz going down, and GPU-Z shows the same thing. I read somewhere that even if you have Overdrive disabled in CCC, the power limit still won't be at 50% like you set it in MSI Afterburner; you must also set it to 50% in CCC so it does not downclock at load. I wonder if that's the issue. And in 3DMark the GPU doesn't stay at 100% load; it drops a few percent sometimes. Quite odd that it downclocks.


i read that, too. you can try and use CCC's power limit along with AB. Turn off GPUZ.


----------



## Gualichu04

Adjusted the power limit in CCC and no more downclocking. I just have to do that every time MSI Afterburner is opened. New 3DMark score: 12839 graphics, 11173 overall.


----------



## ZealotKi11er

rdr09, what are you getting with a 290 @ 1200/1500? GPU score? The 970 seems to be getting a 13.5K GPU score with a 1550/8000 OC.


----------



## Devildog83

Mild disappointment - my 290 and back-plate will not be here for my birthday tomorrow (thanks, United States Postal Service), but they will be here Thursday. OK, I'll quit whining.


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> rdr09 what are you getting 290 @ 1200/1500? GPU score? 970 seems to be getting 13.5K GPU score with 1550/8000 OC.


Zeal, this is 1200/1500 . . .

http://www.3dmark.com/3dm/4079609?

here is 1280/1500

http://www.3dmark.com/3dm/4037848?


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> Zeal, this is 1200/1500 . . .
> 
> http://www.3dmark.com/3dm/4079609?
> 
> here is 1280/1500
> 
> http://www.3dmark.com/3dm/4037848?


Your 1.28GHz is beating a 1.6GHz/8000 970. Not bad.


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Your 1.28GHz beating 1.6GHz/8000 970. Not bad.


Hawaii just can't match the OC these new cards can achieve. if it could, then clock for clock - especially the 290X - Hawaii wins.

at least in FS.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> Hawaii just can't match the oc these new cards can achieve. if they can, especially the 290X, clock for clock - Hawaii wins.
> 
> at least in fs.


Really? I just might hang on to mine a bit longer then.


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Really , I just might hang on to mine a bit longer then


not even the Tis can beat your tri-290 FS graphics scores on OCN. you know why?

'cause you are MAD.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> not even the Ti's can beat your Tri 290s FS graphics scores in OCN. you know why?
> 
> 'cause you are MAD.


No, it's 'cause I'm not afraid to VOLTUP, bro


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> No , its cause im not afraid to VOLTUP bro


can't wait to see what 980s can do in your hands.


----------



## tsm106

Mad cojones.

Speaking of cojones, my ole XFX 7970 in Iceland.

http://www.3dmark.com/3dm/4125968


----------



## sinnedone

So what are you guys pushing your voltages (and power limit) up to for benchmarking, and on the daily?

I'd like to know how far I can push them voltage-wise for 24/7 clocks. This would be watercooled, of course.


----------



## bluedevil

Just broke 9k.

1200/1300 @ 1.31v and +50% on Power Limit

http://www.3dmark.com/3dm/4137932

How much do you think I can get with a higher clocked i7?


----------



## sinnedone

Quote:


> Originally Posted by *bluedevil*
> 
> Just broke 9k.
> 
> 1200/1300 @ 1.31v and +50% on Power Limit
> 
> http://www.3dmark.com/3dm/4137932
> 
> How much do you think I can get with a higher clocked i7?


Right now with an R9 290 at stock volts, +50 power limit, at 1050/1250, I hit 9150 or so total score. This was with a 3770K at stock clocks. The physics test score was about 31fps, and I know at 4.6GHz this chip gets around a 40fps score in the physics test.


----------



## bluedevil

Quote:


> Originally Posted by *sinnedone*
> 
> Right now with an r9 290 at stock volts 50+ power limit at 1050/1250 I hit 9150 or so total score. This was with a 3770k at stock clocks. The physics test score was about 31fps, and I know at 4.6ghz this chip gets around a 40fps score in physics test.


So it's safe to say you will break 10k with a 290 and a 3770K?


----------



## shfwn

hey guys, newbie here.









OCN name: shfwn
Manufacturer & Brand: Gigabyte WindForce R9 290 4GB OC
Cooler: aftermarket

cheers!


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> can't wait to see what 980s can do in your hands.


Hmmmmm , YES I would like that very much









@tsm106

That's goddamn 1400 on the core









Found this sub I didn't BOT it









http://www.3dmark.com/3dm11/8695978 @ 1325 / 1400 ??


----------



## maynard14

Quote:


> Originally Posted by *Forceman*
> 
> Not for a 970, unless you have other considerations (like you are on a reference card and the noise is bothering you, something like that).


My only concern is that my VRM temps skyrocket to 83°C while gaming. I'm already using VRM heatsinks with Gelid Extreme, but still no difference. My GPU core is at 61°C with an NZXT G10 and Corsair H105, so I know I don't have room for an overclock, sadly.


----------



## bluedevil

Got a lower score with 100MHz more on the CPU....









http://www.3dmark.com/3dm/4138079


----------



## tsm106

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> @tsm106
> 
> Thats goddamn 1400 on the core
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Found this sub I didn't BOT it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8695978 @ 1325 / 1400 ??


That was 1400 on water, though he had very good ambient to work with. The highest I could get on SoCal ambient was 1380. I bet there is still more there if he could tighten the memory timings.

I benched the 970 I have on water. It's OK, I guess. In Valley it was 1fps faster than my single 7970. And you thought Unigine was hard on AMD, lol. I need to finish tweaking it, but so far it seems like a one- or two-trick pony. The reviewers were really doing a swell marketing job, lol. But I needed it anyway cuz of the HDMI 2.0 port, thank goodness, hehe.


----------



## th3illusiveman

Quote:


> Originally Posted by *bluedevil*
> 
> Got a lower score with 100mhz more on the CPU....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/4138079


i get ~12600 GPU score with 1125/1500.


----------



## bluedevil

Quote:


> Originally Posted by *th3illusiveman*
> 
> i get ~12600 GPU score with 1125/1500.


With a 290x and a 2600k @ 4.4ghz?


----------



## sinnedone

Quote:


> Originally Posted by *bluedevil*
> 
> So its safe to say you will break 10k with a 290 and a 3770K?


Honestly I dont know.


----------



## Arizonian

Quote:


> Originally Posted by *shfwn*
> 
> hey guys, newbie here.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> OCN name: shfwn
> Manufacturer & Brand: Gigabyte WIndForce R9 290 4GB OC
> Cooler: aftermarket
> 
> cheers!
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Welcome to OCN with your first post.









Feel free to *list your rig.*


----------



## th3illusiveman

Quote:


> Originally Posted by *bluedevil*
> 
> With a 290x and a 2600k @ 4.4ghz?


290X - core 1125 / mem 1500MHz, and yes, the CPU (an i5) at 4.4GHz: http://www.3dmark.com/3dm/4138425?


----------



## pdasterly

AMD has an announcement on the 25th; hope it's the 390X. That should put the GTX 880 on sale fast.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *tsm106*
> 
> That was 1400 on water, though he had very great ambient to work with. Highest I could get on socal ambient was 1380. I bet there is still more there if he could tighten the memory timings.
> 
> I benched the 970 I have on water. It's ok I guess. In valley it was 1fps faster than my single 7970. And you thought Unigine was hard on AMD lol. I need to finish tweaking it but so far it seems like a one or two trick pony. The reviewers were really doing a swell marketing job lol. But I needed it anyways cuz of the hdmi 2 port, thank goodness hehe.


I noticed the 256-bit bus on that new green thing; that's less than the Ti.








I wants a R9 card with at least 4 DP's


----------



## mus1mus

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I noticed 256 bit bus on that new green thing , that's less than the TI .
> 
> 
> 
> 
> 
> 
> 
> 
> I wants a R9 card with at least 4 DP's


LOL.

My Aussie buddy believes it's a downgrade going from a 384-bit bus to 256 bits. But boy, if those reviews tell the whole truth, the 970s are good bang-for-the-buck.


----------



## ozzy1925

Quote:


> Originally Posted by *bluedevil*
> 
> Got a lower score with 100mhz more on the CPU....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/4138079


that GPU score is very low for a 1200 core; your card must be throttling. You can use the AB OSD to see the core clocks.


----------



## tsm106

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HOMECINEMA-PC*
> 
> I noticed 256 bit bus on that new green thing , that's less than the TI .
> 
> 
> 
> 
> 
> 
> 
> 
> I wants a R9 card with at least 4 DP's
> 
> 
> 
> LOL.
> 
> My Aussie buddy believes it a downgrade having a 256-bit BUS from 384 bits. But boy, if those reviews tell all the truth, the 970s are good bang-for-the-buck.
Click to expand...

I could actually run a one to one test. I've got all x79 rigs, one with 290x Lightning, Zotac 970, and my quads. I've done 1:1 tests before in the past. Hmm...

http://www.overclock.net/t/1322119/12-11-vs-310-33


----------



## sinnedone

Quote:


> Originally Posted by *tsm106*
> 
> I could actually run a one to one test. I've got all x79 rigs, one with 290x Lightning, Zotac 970, and my quads. I've done 1:1 tests before in the past. Hmm...
> 
> http://www.overclock.net/t/1322119/12-11-vs-310-33


Interesting thread you made there, definitely do that again.


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gualichu04*
> 
> I Disabled crossfire to try a new overclock on single gpu and this is my new 3dMark score for single gpu.
> http://www.3dmark.com/3dm/4136432? 12090 graphics score is nice
> 
> 
> 
> you raised the power limit? what do you use to oc? is overdrive in CCC disabled?
> 
> here is my 290 at same exact clocks . . .
> 
> http://www.3dmark.com/3dm/4136556?
Click to expand...

Are you guys running with Tess ON or OFF?
Quote:


> Originally Posted by *bluedevil*
> 
> Just broke 9k.
> 
> 1200/1300 @ 1.31v and +50% on Power Limit
> 
> http://www.3dmark.com/3dm/4137932
> 
> How much do you think I can get with a higher clocked i7?


At the same clocks, at least high-10k to 11k.


----------



## sugarhell

Quote:


> Originally Posted by *kizwan*
> 
> Are you guys running with Tess ON or OFF?


You can't see any warning on the 3DMark link...


----------



## Sammyboy83

Here is my FireStrike score with 14.8. I believe that with a clean Win7 install I may get past a 12000 total score. 290 clocked at 1280/1700, tess on.

http://www.3dmark.com/fs/2809974


----------






## kizwan

Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Are you guys running with Tess ON or OFF?
> 
> 
> 
> You cant see any warning on 3dmark link...
Click to expand...

Yes, but I can't rely on that. I had results with tess OFF, but it didn't display any warning on the 3DMark link. That has been going on with the 14.4 & 14.6 beta drivers for me.

For example, this is with tess OFF:
http://www.3dmark.com/fs/2215231

I installed the August 27th drivers, and now I get the warning back.


----------



## Mega Man

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> That was 1400 on water, though he had very great ambient to work with. Highest I could get on socal ambient was 1380. I bet there is still more there if he could tighten the memory timings.
> 
> I benched the 970 I have on water. It's ok I guess. In valley it was 1fps faster than my single 7970. And you thought Unigine was hard on AMD lol. I need to finish tweaking it but so far it seems like a one or two trick pony. The reviewers were really doing a swell marketing job lol. But I needed it anyways cuz of the hdmi 2 port, thank goodness hehe.
> 
> 
> 
> I noticed 256 bit bus on that new green thing , that's less than the TI .
> 
> 
> 
> 
> 
> 
> 
> 
> I wants a R9 card with at least 4 DP's
Click to expand...

I know, right? They just need to standardize all cards with 6 mini DPs, nothing else.


----------



## th3illusiveman

Quote:


> Originally Posted by *tsm106*
> 
> I could actually run a one to one test. I've got all x79 rigs, one with 290x Lightning, Zotac 970, and my quads. I've done 1:1 tests before in the past. Hmm...
> 
> http://www.overclock.net/t/1322119/12-11-vs-310-33


That would be nice. A well-overclocked 290X vs a well-overclocked 970.


----------



## sinnedone

Does anyone here update the BIOS on their cards?

I have 2 Sapphire reference R9 290 (Battlefield 4 edition) cards with BIOS *015.039.000.007.003523*.

According to TechPowerUp there are several newer versions.

Would updating the BIOS on any of these cards offer better overclockability, or is the rule the same as with motherboards - basically, if it ain't broke, don't fix it?

I take it a BIOS from the overclocked versions would not work?


----------



## pdasterly

Quote:


> Originally Posted by *sinnedone*
> 
> Does anyone here update the BIOS on their cards?
> 
> I have 2 Sapphire reference R9 290 (Battlefield 4 edition) cards with BIOS *015.039.000.007.003523*.
> 
> According to TechPowerUp there are several newer versions.
> 
> Would updating the BIOS on any of these cards offer better overclockability, or is the rule the same as with motherboards - basically, if it ain't broke, don't fix it?
> 
> I take it a BIOS from the overclocked versions would not work?


I used the Tri-X OC BIOS, 1040/1300.


----------



## sinnedone

Quote:


> Originally Posted by *pdasterly*
> 
> I used tri-x oc bios 1040/1300


On a reference Sapphire card?

Did you see any improvements or have issues?


----------



## pdasterly

On a reference 290X, no problems. I wasn't looking for improvements, just wanted to sync the BIOS between both my cards.


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> Are you guys running with Tess ON or OFF?
> At the same clocks, at least between high 10k to 11k.


here is the 290 @ 1280, tess off



bluedevil's run at 1200 is off. here was my 1200 using a 13.X driver:

http://www.3dmark.com/3dm/2859470?

14.X adds about 400-500 pts.


----------



## Gobigorgohome

Okay, I have a problem. I installed the 14.7 beta driver, switched from "Balanced" to "High performance" in Windows 7, and running BF4 @ 4K with Mantle gave me around 45-50 FPS with 4-way CrossFire enabled. Running slots: 0, 6, 12, 18.

4930K @ 4.6 GHz
Cards @ stock
Used 8x 6+2-pins for power to the cards (should be no bottleneck with two cards on each PSU)

Flickering lines all over the map I played (Wave Breaker, 64-player MP), running Ultra with MSAA off.

Pretty sure power saving is turned off in the BIOS.

It's a different story with 3-way CrossFire enabled, though.

BF4 @ 4K, Mantle, Ultra, no MSAA: FPS NEVER below 80, a lot of the time 100-120, and NO ISSUES, running slots 0, 6, 12. Everything else the same.

Do I have a bad card, or is the CPU a bottleneck with four cards?


----------



## Offler

CrossFire without bridges with so many cards... Whoa... That's something. The traffic over the PCI-E bus has to be quite big.

In 3-way CrossFire, two cards run at PCI-E 3.0 x16 and one at x8. On 4-way it's one at x16 and three at x8.

Have you tested any other 3D engine?
Are there any signs of graphical bugs or something?
What about the leaked driver from August 27th?
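To put rough numbers on that PCI-E traffic: bridgeless (XDMA) CrossFire copies each slave card's finished frames over the bus, and a 4K frame is large. A back-of-the-envelope sketch, counting only raw framebuffer copies (real traffic adds command and sync overhead, and the ~985 MB/s-per-lane figure is the usual effective PCI-E 3.0 rate, not a measured one):

```python
# How many 4K frame copies per second a PCIe 3.0 link can carry, in theory.
FRAME_BYTES = 3840 * 2160 * 4          # 32-bit 4K framebuffer, ~33.2 MB
LANE_BYTES_PER_S = 0.985e9             # ~985 MB/s effective per PCIe 3.0 lane

def frame_copies_per_second(lanes):
    return lanes * LANE_BYTES_PER_S / FRAME_BYTES

print(round(frame_copies_per_second(8)))    # ~238 frames/s over an x8 slot
print(round(frame_copies_per_second(16)))   # ~475 frames/s over an x16 slot
```

Even an x8 slot has headroom for 4K frame copies at game framerates, which suggests the quadfire trouble here is more likely a scaling or driver issue than raw bus bandwidth.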


----------



## kizwan

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Okay, I have a problem. I installed 14.7 Beta driver, switched from "balanced" to "high performance" in Windows 7, running BF4 @ 4K with Mantle gave me around 45-50 FPS with 4-way crossfire enabled. Running: 0 6 12 18
> 
> 4930K @ 4,6 Ghz
> Cards @ stock
> Used 8x 6+2-pins for power to the cards (should be no bottleneck with two cards on each PSU)
> 
> Flickering lines all over the map I played at (wave breaker, MP 64-player), running Ultra with MSAA off.
> 
> Pretty sure power saving is turned off in bios.
> 
> Different story with 3-way crossfire enabled though.
> 
> BF4 @ 4K, Mantle, Ultra no MSAA, FPS NEVER below 80 FPS, a lot of the time 100-120 and NO ISSUES, running 0 6 12. Everything else the same.
> 
> *Do I have a bad card or is four cards bottlenecking my CPU?*


You should be able to know that. What is the CPU usage?


----------



## thrgk

I am new to overclocking 290Xs. In MSI AB, since I am on water, is it safe to up the power limit by the +50% max and the voltage by +100mV, and after doing that just try to find a stable clock? On my 7970s it was only +20%.


----------



## Gobigorgohome

Quote:


> Originally Posted by *Offler*
> 
> Crossfire without bridges with so many cards... Whoa... Thats something. Traffic over the PCIE has to be quite big.
> 
> In 3 way crossfire 2 cards run on PCI-E 3.0 16x, one on 8x. On 4 way its 16x and three on 8x.
> 
> Have you tested any other 3D engine?
> Are there any signs of graphical bugs or something?
> What about leaked driver from 27th august?


I used both Direct3D 11 and Mantle in BF4, and they feel the same with both 3-fire and 4-fire. I have been running Hitman: Absolution fine all the time with 4 GPUs; Metro: Last Light has some problems with low FPS, even with PhysX turned off. Maybe just poor optimization.
Tomb Raider is working great. I have not tried Thief more than once (capped at 40 fps @ 4K as far as I know); it was horrible with four cards.
I have had some issues in FireStrike Extreme with overclocked cards only (small flickering with anything above 1100/1300, driver crashes, etc.).
Quote:


> Originally Posted by *kizwan*
> 
> You should be able to know that. What is the CPU usage?


I highly doubt it is the CPU, but I will check.

Tri-fire, BF4 @ 4K, Ultra, no MSAA, 4.6 GHz 4930K, R9 290Xs stock: CPU usage 50%.
Quad-fire, BF4 @ 4K, Ultra, no MSAA, 4.6 GHz 4930K, R9 290Xs stock: CPU usage 50% (actually a little lower, 44-48%). It does not seem to be bottlenecking, but it is lagging and I have poor FPS ... mid 40s to mid 50s at Ultra without MSAA.


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> here is 290 @ 1280 Tess off
> 
> 
> 
> bluedevil's run at 1200 is off. here was my 1200 using 13 driver
> 
> http://www.3dmark.com/3dm/2859470?
> 
> 14.X adds about 400 -500pts.


Just increased my memory to 1400.

http://www.3dmark.com/3dm/4142136

Now gonna do a run @ 1500.


----------



## bluedevil

Almost 9.5k.

http://www.3dmark.com/3dm/4142210


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> Just increased my memory to 1400.
> 
> http://www.3dmark.com/3dm/4142136
> 
> Now gonna do a run @ 1500.


blue follow post # 29853.

this is 14.6 Driver . . .

http://www.3dmark.com/3dm/3147012?

here is 14.X

http://www.3dmark.com/3dm/4079609?

slight bump but it could be just a fluke.

edit: thinking about it, your card may be performing similarly in your games. performing like a 780 (jk). try running stock in FS. i get 10.5K at the stock 947MHz.


----------



## ZealotKi11er

Quote:


> Originally Posted by *thrgk*
> 
> I am new to overclocking 290x's. In MSI AB, since I am on water is it safe to up the power limit by the +50% max and the voltage by +100mv? And after doing that just try to find stable clock? On my 7970s it was only +20%.


I run mine at stock and set them to 1175MHz/1500MHz with +100mV and +50% if I need more performance. They can do 1200MHz, but one card needs +125mV, so I'd have to mess with another tool and I don't bother. If you're under water you are safe. Make sure to check the temps. I hit the 60s under water with these cards no problem once I OC them.


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> blue, follow post #29853.
> 
> this is 14.6 Driver . . .
> 
> http://www.3dmark.com/3dm/3147012?
> 
> here is 14.X
> 
> http://www.3dmark.com/3dm/4079609?
> 
> slight bump but it could be just a fluke.
> 
> edit: thinking about it, your card may be performing similarly in your games. performing like a 780 (jk). try running stock in FS. i get 10.5K at the stock 947MHz.


OK will try tonite when I get home from work.


----------



## Offler

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I used both Direct3D 11 and Mantle in BF4 and they feel the same with both 3-fire and 4-fire. Hitman Absolution has been running fine all along with 4 GPUs; Metro Last Light has some problems with low FPS, even with PhysX turned off. Maybe just poor optimization.
> Tomb Raider is working great. I have not tried Thief more than once (capped at 40 fps @ 4K as far as I know); it was horrible with four cards.
> Have had some issues in Fire Strike Extreme with overclocked cards only (small flickering with anything above 1100/1300, driver crashes, etc.)


I would expect trouble, since quadfire isn't really a common configuration. At this point it's hard to determine whether it's PCI-E bus overload or one GPU being faulty.

a) Test each graphics card separately with some stress test, Fire Strike or similar. This will tell you if one GPU is failing.
b) Check that each GPU has the same BIOS version.
c) Check that each GPU has the same memory chips.
d) Test all graphics cards in Tri-fire. If only one of them is faulty or not matching, you will find it this way.

It's just good practice for all devices in CFX, SLI or RAID to have exactly the same parameters and firmware.
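Point (b) is easy to check systematically once you've written down each card's VBIOS string (GPU-Z shows it in the "BIOS Version" field; newer Linux driver stacks expose it under `/sys/class/drm/card*/device/vbios_version`). A minimal sketch in Python, with made-up version strings purely for illustration:

```python
# Hypothetical helper: flag cards whose VBIOS differs from card 0.
# The version strings themselves must be collected manually, e.g. from
# GPU-Z's "BIOS Version" field for each adapter.

def find_bios_mismatches(versions):
    """Return the indices of cards whose VBIOS differs from card 0."""
    if not versions:
        return []
    reference = versions[0]
    return [i for i, v in enumerate(versions) if v != reference]

# Example: two cards on one BIOS, two on another (made-up strings).
cards = [
    "015.039.000.006.003507",  # card 0
    "015.039.000.006.003507",  # card 1
    "015.040.000.000.003570",  # card 2 (different BIOS)
    "015.040.000.000.003570",  # card 3 (different BIOS)
]
print(find_bios_mismatches(cards))  # -> [2, 3]
```

Any non-empty result means the set is mixed and worth flashing to a common BIOS before chasing other causes.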


----------



## Gobigorgohome

Quote:


> Originally Posted by *Offler*
> 
> I would expect trouble, since quadfire isn't really a common configuration. At this point it's hard to determine whether it's PCI-E bus overload or one GPU being faulty.
> 
> a) Test each graphics card separately with some stress test, Fire Strike or similar. This will tell you if one GPU is failing.
> b) Check that each GPU has the same BIOS version.
> c) Check that each GPU has the same memory chips.
> d) Test all graphics cards in Tri-fire. If only one of them is faulty or not matching, you will find it this way.
> 
> It's just good practice for all devices in CFX, SLI or RAID to have exactly the same parameters and firmware.


I have a quick answer for you at least.

a) Done before putting them under water, and everything was working well (each card individually, 2-way, 3-way and 4-way), running BF4, Tomb Raider, Hitman Absolution, not Fire Strike. Everything was good. Running everything at max @ 4K, but not BF4, because it would crash due to my PSU being only 1300 W.
b) Not the same BIOS versions; the two on top and the two on the bottom run different BIOSes (BF4 editions and "normal" editions). Will be doing PT1/PT3 soon.
c) Not the same memory chips; two of the cards are running Elpida and the other two Hynix.
d) I was strict when I tested the cards before even buying waterblocks, tested in every way possible; they are just used as they are now on water.

I know it is good practice to run the exact same cards; I did not think different versions of the same card would matter ... it is 4x Sapphire Radeon R9 290X ...


----------



## bond32

I have one Sapphire 290X and 2 XFX 290s and don't have a single issue. Of course, I got lucky with all-Hynix-memory cards. I can tell you from experience: my first 290X had Elpida memory, and the lines you described would occur if the memory clocks were unstable.

As for running different cards, you shouldn't have issues with that, especially if you run the same BIOS on all; I think the BF4 edition cards have the exact same BIOS as the reference 290X (even if it is labeled differently).

However, when setting clocks in AB, I do still have to set the clocks for my 290X, then enter the settings, select the other two cards in the pull-down, click OK, then set the clocks for them separately, even if I am setting them to the same clock speeds/voltage.


----------



## Paopawdecarabao

What FPS or performance do you guys get with reference ASUS 290X CrossFire at 5760x1080, BF4 Ultra settings, no AA? 32/32 map.

Seems like I don't get above 60 FPS, but from my Sapphire R9 290 Tri-X I do get above 60 FPS. I don't know if the map matters though.


----------



## nightfox

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> What FPS or performance do you guys get with reference ASUS 290X CrossFire at 5760x1080, BF4 Ultra settings, no AA? 32/32 map.
> 
> Seems like I don't get above 60 FPS, but from my Sapphire R9 290 Tri-X I do get above 60 FPS. I don't know if the map matters though.


you mean single card can give you above 60 fps on same eyefinity and ultra settings?


----------



## Paopawdecarabao

Quote:


> Originally Posted by *nightfox*
> 
> you mean single card can give you above 60 fps on same eyefinity and ultra settings?


In a crossfire set-up


----------



## jagdtigger

Finally finished the OC on the GPU:
http://www.3dmark.com/3dm/3978296
(Core 1230 MHz, +200mV)


----------



## th3illusiveman

Quote:


> Originally Posted by *jagdtigger*
> 
> Finally finished the OC on the GPU:
> http://www.3dmark.com/3dm/3978296
> (Core 1230 MHz, +200mV)


you should be getting over 13500 GPU score at those clocks. Closer to 14000 really.


----------



## TheSoldiet

Guys!
http://wccftech.com/amd-radeon-r9-390x-arrives-1h-2015-feature-hydra-liquid-cooling

My next GPU if true


----------



## ZealotKi11er

Quote:


> Originally Posted by *TheSoldiet*
> 
> Guys!
> http://wccftech.com/amd-radeon-r9-390x-arrives-1h-2015-feature-hydra-liquid-cooling
> 
> My next GPU if true


Really want AMD to be the first to push the new memory standard.


----------



## tsm106

They pushed GDDR, and now look at where we are.


----------



## BradleyW

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Really want AMD to be the first to push the new memory standard.


GDDR?


----------



## thrgk

how do people get 1300 on the core? I upped it to +100 on core mem and 50% on power limit and i crash at 1175


----------



## ZealotKi11er

Quote:


> Originally Posted by *thrgk*
> 
> how do people get 1300 on the core? I upped it to +100 on core mem and 50% on power limit and i crash at 1175


You need +200 mV and very low temps really.


----------



## DampMonkey

Quote:


> Originally Posted by *thrgk*
> 
> how do people get 1300 on the core? I upped it to +100 on core mem and 50% on power limit and i crash at 1175


Are you using the stock bios?


----------



## thrgk

Yeah, I think so. IDK, I can't see the BIOS switch due to the EK water block bridge.


----------



## thrgk

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You need +200 mV and very low temps really.


msi ab only does +100


----------



## ZealotKi11er

Quote:


> Originally Posted by *thrgk*
> 
> msi ab only does +100


Download Sapphire TRIXX.


----------



## Gabkicks

For me, Afterburner doesn't actually increase the power limit; Trixx does though. Trixx works much better with my cards, though I don't see an aux power setting in Trixx.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gabkicks*
> 
> For me, Afterburner doesn't actually increase the power limit; Trixx does though. Trixx works much better with my cards, though I don't see an aux power setting in Trixx.


What does AUX voltage even do? I know you need to increase vCore for 290/290X when you OC the memory.


----------



## Gobigorgohome

Quote:


> Originally Posted by *bond32*
> 
> I have one sapphire 290x and 2 xfx 290's and don't have a single issue. Of course I got lucky with all hynix memory cards. I can tell you from experience, my first 290x had elipda memory and the lines you described would occur if the memory clocks were unstable.
> 
> As for running different cards, you shouldn't have issues with that especially if you run the same bios on all which I think the BF4 edition cards have the exact same bios as reference 290x (even if it is labeled differently).
> 
> However, when setting clocks in AB, I do still have to set the clocks for my 290x then enter the settings, in the pull down select the other two cards, click ok, then set the clocks for them seperatly even if I am setting them to the same clock speeds/voltage.


The problem is that it would have to be "unstable" at stock .... these results are with stock clocks on all the cards. 3-fire works like a charm and I am tired of trying to figure out what causes these issues; might as well just sell one card and use three of them: less power draw and better FPS ...


----------



## Blue Dragon

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What does AUX voltage even do? I know you need to increase vCore for 290/290X when you OC the memory.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *thrgk*
> 
> how do people get 1300 on the core? I upped it to +100 on core mem and 50% on power limit and i crash at 1175


PT1T BIOS and GPU Tweak will set it up for that ......... rest is up to you








P.S Chilled loop helps


----------



## ZealotKi11er

Quote:


> Originally Posted by *Blue Dragon*


Ok i see the image but does it improve overclocking?


----------



## Gualichu04

Has anyone played The Elder Scrolls: Skyrim using their R9 290 or 290X in CrossFire and had any issues? I get the sound cutting out when I try to use CrossFire. It is only Skyrim that does this.


----------



## tsm106

Quote:

> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> > Originally Posted by *thrgk*
> > 
> > how do people get 1300 on the core? I upped it to +100 on core mem and 50% on power limit and i crash at 1175
> 
> PT1T BIOS and GPU Tweak will set it up for that ......... rest is up to you
> 
> P.S Chilled loop helps

You can get by without a chiller. I did fine.


----------



## KnownDragon

Quote:


> Originally Posted by *TTheuns*
> 
> I am considering jumping over to 'The Red Team':
> 
> I can get €530 for my MSI GeForce GTX 780 Ti.
> 
> And I can buy two R9 290X's for €520, or I could buy three R9 290's for €520.
> 
> Which would you guys go for?


They also need a lot of power. Three 290s will outperform two 290Xs. Would love to see a Fire Strike score on those.


----------



## mus1mus

Will water push the cards up to 1300 on the core?

How much delta (ambient to GPU temp) will these cards show at those clocks and voltages?


----------



## alancsalt

Quote:

> Originally Posted by *tsm106*
> 
> Quote:
> 
> > Originally Posted by *HOMECINEMA-PC*
> > 
> > Quote:
> > 
> > > Originally Posted by *thrgk*
> > > 
> > > how do people get 1300 on the core? I upped it to +100 on core mem and 50% on power limit and i crash at 1175
> > 
> > PT1T BIOS and GPU Tweak will set it up for that ......... rest is up to you
> > 
> > P.S Chilled loop helps
> 
> You can get by without a chiller. I did fine.

But what are your ambients?


----------



## jagdtigger

Quote:


> Originally Posted by *th3illusiveman*
> 
> you should be getting over 13500 GPU score at those clocks. Closer to 14000 really.


Interesting; as far as I know it didn't downclock during the tests. I'll rerun the test as soon as my Windows is up and running; the damn EFI boot loader doesn't like extended partitions...


----------



## kizwan

Quote:


> Originally Posted by *jagdtigger*
> 
> Quote:
> 
> 
> 
> Originally Posted by *th3illusiveman*
> 
> you should be getting over 13500 GPU score at those clocks. Closer to 14000 really.
> 
> 
> 
> Interesting; as far as I know it didn't downclock during the tests. I'll rerun the test as soon as my Windows is up and running; the damn EFI boot loader doesn't like extended partitions...
Click to expand...

Use proper tools that support GPT disks when partitioning. You can still recover without reinstalling Windows.


----------



## jagdtigger

Quote:


> Originally Posted by *kizwan*
> 
> Use proper tools that support GPT disks when partitioning. You can still recover without reinstalling Windows.


It boots up fine when I pull out the HDD which has an extended partition full of data... I just finished moving things. Next step is to restore Windows from backup.


----------



## tsm106

Quote:

> Originally Posted by *alancsalt*
> 
> Quote:
> 
> > Originally Posted by *tsm106*
> > 
> > Quote:
> > 
> > > Originally Posted by *HOMECINEMA-PC*
> > > 
> > > Quote:
> > > 
> > > > Originally Posted by *thrgk*
> > > > 
> > > > how do people get 1300 on the core? I upped it to +100 on core mem and 50% on power limit and i crash at 1175
> > > 
> > > PT1T BIOS and GPU Tweak will set it up for that ......... rest is up to you
> > > 
> > > P.S Chilled loop helps
> > 
> > You can get by without a chiller. I did fine.
> 
> But what are your ambients?

I'm in socal, we don't get -10 ambient here lol.

It's basically 75f all year round here.


----------



## alancsalt

Neither do HOMECINEMA-PC or myself. About our lowest mid-winter temp would be 6°C and mid-summer can be 40°C. He's a couple hundred km from me. You done good then....


----------



## Buehlar

Hey guys, I'll be joining the club pretty soon.









Just pulled the trigger on a pair of 290x DC II


----------



## Ized

Quote:


> Originally Posted by *jagdtigger*
> 
> Finally finished the OC on the GPU:
> http://www.3dmark.com/3dm/3978296
> (Core 1230 MHz, +200mV)


Today I scored 1 point less than you on a 290 with 1111 MHz core / 1445 MHz mem.

http://www.3dmark.com/fs/2821483

Kind of interesting how inconsistent my card has been.

Little higher here but I don't recall the settings http://www.3dmark.com/fs/2792559


----------



## Phenomanator53

Hey guys, Just got a good deal on ebay for a 290:

pic of card is in sig rig
Brand is Gigabyte,
Cooler is Reference/Stock

please add me to the club









Is this score normal for this card?
http://www.3dmark.com/3dm11/8744321


----------



## boot318

Quote:


> Originally Posted by *Buehlar*
> 
> Hey guys, I'll be joining the club pretty soon.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just pulled the trigger on a pair of 290x DC II


You're crazy for buying 290Xs! There is a 99% chance AMD is going to drop the price later today. $300-350 is my guess. You should've waited. Retailers prey on the misinformed.....


----------



## Phenomanator53

Hey guys, I've noticed there is a green LED on the back of the PCB that lights up when the screen is off. Anyone know what it means? I know on my Quadro a green LED means the power supply is OK, but on the 290 I have no idea. Can anyone explain what is going on?


----------



## Buehlar

Quote:


> Originally Posted by *boot318*
> 
> You're crazy for buying 290Xs! There is a 99% chance AMD is going to drop the price later today. $300-350 is my guess. You should've waited. Retailers prey on the misinformed.....










Seriously?
Where did you get this info?


----------



## HOMECINEMA-PC

Quote:

> Originally Posted by *tsm106*
> 
> Quote:
> 
> > Originally Posted by *alancsalt*
> > 
> > But what are your ambients?
> 
> You can get by without a chiller. I did fine. I'm in socal, we don't get -10 ambient here lol. It's basically 75f all year round here.

... and because you're an overclocking demigod.

Yep, I would need to run A/C 24/7 @ 24C to get his ambients.

Quote:


> Originally Posted by *mus1mus*
> 
> Will water push the cards up to 1300 on core?
> 
> How much Delta ( ambient to GPU temps ) will these cards at those clocks and voltages push?


Yes, and up to +400mV. PT1T BIOS and GPU Tweak.
When? At idle, full load, or gaming load?

Quote:


> Originally Posted by *alancsalt*
> 
> Neither do HOMECINEMA-PC or myself. About our lowest mid-winter temp would be 6°C and mid summer can be 40°C. He's a couple hundred K from me. You done good then....


My chiller / water temp on the CPU is running at 12C below ambient 24/7. I have it well insulated now.
The cards are on a separate loop, 13C water temp all round. I normally disconnect the rads when benching the 290s.


----------



## Blue Dragon

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Ok i see the image but does it improve overclocking?


Not that I've personally seen... I think it is supposed to stabilize vdroop, most likely aimed at LN2 overclockers.


----------



## bluedevil

I got up early to do some benching.....









Now this I don't get. 1st is mine, 2nd is the highest score with my CPU/GPU combo.

http://www.3dmark.com/compare/fs/2822286/fs/2451662


----------



## mus1mus

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yep, I would need to run A/C 24/7 @ 24C to get his ambients.
> Yes, and up to +400mV. PT1T BIOS and GPU Tweak.
> When? At idle, full load, or gaming load?


Gaming. and Full.


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> I got up early to do some benching.....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now this I don't get. 1st is mine, 2nd is the highest score with my CPU/GPU combo.
> 
> http://www.3dmark.com/compare/fs/2822286/fs/2451662


you used both CCC and AB to set the power limit? see Ized's graphics score (a few posts back) at 1111 core matching that 2nd card, also beating your score at 1200 core.

not sure if we talked about it, but i use Trixx without any other app like CCC or AB. you saw it: 13K graphics score at 1200 core using 14.X.

http://www.3dmark.com/3dm/4079609

how about run 3DMark11 and lets see. it could be just FS. here at 1200 core.

http://www.3dmark.com/3dm11/8723023


----------



## obababoy

I have a couple of questions about artifacting. I have the Sapphire R9 290 Vapor-X, and when overclocking to 1150 core or so, artifacts start to show up on screen around 72C and above. If I keep it in the 60s it works "fine" and I don't see any artifacts. Is this a sign of an unstable overclock or a messed-up GPU? I have the voltage offset in Trixx at +50mV. Should I maybe run that higher? My last question is whether I should be using Trixx or something like MSI Afterburner. Both are installed.


----------



## Offler

If the card starts to artifact at a certain temperature, it can mean that the cooler is not properly fitted, or that there is a small air bubble somewhere in the thermal compound, causing local overheating.

The other possibility is that the GPU becomes "thermally sensitive" at some point of overclocking. At that point you can either lower the frequency or increase the voltage. Higher voltage may cause even higher temperatures, and the problem may become even worse.

I am using TriXX since MSI AB causes freezes on my system...

I would check the thermal compound used on your card, but only if your warranty period is over, or if a change of thermal paste will not void it. Issues related to OC (even an air bubble under the cooler) are not grounds for RMA if the device works perfectly at stock values.


----------



## rdr09

Quote:


> Originally Posted by *obababoy*
> 
> I have a couple of questions about artifacting. I have the Sapphire R9 290 Vapor-X, and when overclocking to 1150 core or so, artifacts start to show up on screen around 72C and above. If I keep it in the 60s it works "fine" and I don't see any artifacts. Is this a sign of an unstable overclock or a messed-up GPU? I have the voltage offset in Trixx at +50mV. Should I maybe run that higher? My last question is whether I should be using Trixx or something like MSI Afterburner. Both are installed.


you are using CCC? if you are and it works, that's fine. what fan speed keeps it in the 60's? is it bearable? i suggest Trixx for single cards and use the fan curve. use one app at a time: if Trixx, then just that, and disable Overdrive in CCC.

you have to check the temps of the VRMs, especially VRM 1. you can use either GPU-Z or HWiNFO64 (free version) to see those temps. those may be going higher than the core and can cause artifacts. i don't recommend upping the voltage. i do recommend raising the power limit to 50%. if you have not done so, try it and you might be able to lower the volts further.
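For what it's worth, a custom fan curve like the one Trixx offers is just linear interpolation between (temperature, duty) breakpoints. A minimal sketch of the idea, with made-up breakpoints rather than any card's defaults:

```python
def fan_duty(temp_c, curve):
    """Linearly interpolate fan duty (%) from (temp_C, duty_%) breakpoints.

    `curve` must be sorted by temperature, e.g. [(40, 30), (70, 60), (85, 100)].
    Below the first point the fan holds the minimum duty; above the last,
    the maximum.
    """
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

# Illustrative curve, not Sapphire's stock profile.
curve = [(40, 30), (70, 60), (85, 100)]
print(fan_duty(60, curve))  # -> 50.0
```

The trade-off obababoy describes is exactly this: dragging the 60-70C segment of the curve upward trades noise for headroom before artifacts appear.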


----------



## sebmeikle

Could someone tell me if the red parts of the R9 290X are easily removable? I am planning a new X99 build, but with a black and white theme. I love the look and performance of this card, but it would look totally out of place in my build.

I found this image: http://semiaccurate.com/assets/uploads/2013/11/Custom-290X-4-of-5.jpg

Could you tell me if those red parts would pry/pop out so I can paint them?


----------



## ZealotKi11er

Quote:


> Originally Posted by *sebmeikle*
> 
> Could someone tell me if the red parts of the R9 290X are easily removable? I am planning a new X99 build, but with a black and white theme. I love the look of the performance of this card, but it would look totally out of place in my build.
> 
> I found this image: http://semiaccurate.com/assets/uploads/2013/11/Custom-290X-4-of-5.jpg
> 
> Could you tell me if those red parts would pry/pop out so I can paint them?


I don't think so, but you can take off the black part and paint the parts white, or paint everything white.


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> you used both CCC and AB to set power limit? see ized's graphics score ( a few posts back) at 1110 core matching that 2nd card? also beating your score at 1200 core.
> 
> not sure if we talked about it but i use Trixx without any other app like CCC or AB. you saw it. 13K graphics score at 1200 core using 14.X.
> 
> http://www.3dmark.com/3dm/4079609
> 
> how about run 3DMark11 and lets see. it could be just FS. here at 1200 core.
> 
> http://www.3dmark.com/3dm11/8723023


Everywhere I look it shows my 1200 MHz core; CCC and AB all say the same. OCing is done in AB. Trying 3DMark11 now.


----------



## sebmeikle

Yeah, that was the original idea. But I saw this image and thought maybe it would come apart. I guess if I choose this card, it will be a case of careful masking.


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> Everywhere I look it shows my 1200 MHz core; CCC and AB all say the same. OCing is done in AB. Trying 3DMark11 now.


no, i read that with cards behaving like yours you have to use a combination of AB and CCC: set the power limit to 50% in both those apps.


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> no, i read that with cards behaving like yours you have to use a combination of AB and CCC: set the power limit to 50% in both those apps.


Ok can do. Gotta do it later.









Here is a 3dMark11 run.
http://www.3dmark.com/3dm11/8757253


----------



## zealord

sorry to jump in like that guys, but what is considered to be the best/most stable AMD driver version currently?

Thank you in advance !


----------



## obababoy

Quote:


> Originally Posted by *Offler*
> 
> If the card starts to artifact at a certain temperature, it can mean that the cooler is not properly fitted, or that there is a small air bubble somewhere in the thermal compound, causing local overheating.
> 
> The other possibility is that the GPU becomes "thermally sensitive" at some point of overclocking. At that point you can either lower the frequency or increase the voltage. Higher voltage may cause even higher temperatures, and the problem may become even worse.
> 
> I am using TriXX since MSI AB causes freezes on my system...
> 
> I would check the thermal compound used on your card, but only if your warranty period is over, or if a change of thermal paste will not void it. Issues related to OC (even an air bubble under the cooler) are not grounds for RMA if the device works perfectly at stock values.


Warranty is not over yet; I got this card a few months ago. I would like to check the thermal paste on this Vapor-X card. What would a bubble look like if I remove the paste? Would there be a hole, or is there no way to tell?
Quote:


> Originally Posted by *rdr09*
> 
> you are using CCC? if you are and it works, that's fine. what fan speed keeps it in the 60's? is it bearable? i suggest Trixx for single cards and use the fan curve. use one app at a time: if Trixx, then just that, and disable Overdrive in CCC.
> 
> you have to check the temps of the VRMs, especially VRM 1. you can use either GPU-Z or HWiNFO64 (free version) to see those temps. those may be going higher than the core and can cause artifacts. i don't recommend upping the voltage. i do recommend raising the power limit to 50%. if you have not done so, try it and you might be able to lower the volts further.


I use CCC for monitor settings and whatnot, but I think I disabled Overdrive. Fan speed for the 60s is like 70%, which gets a bit loud in my H440 case playing Watch Dogs with everything maxed. VRMs are great; I forget the temps, but this Vapor-X 290 has a nice heatsink on them.

I guess I am confused. Could you give a quick rundown on power limit? Does 50% just mean the card is allowed to use 150% of its normal required power? The parameter Trixx lets me change is VDDC Offset, I believe.
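For what it's worth, the slider is generally read exactly the way you guessed: +50% raises the PowerTune board-power budget to 150% of its default. A quick sanity check of the arithmetic, using an illustrative 250 W default rather than a measured figure for any particular card:

```python
# Illustrative numbers only -- not measured values for any specific card.
default_budget_w = 250      # hypothetical default board-power budget
power_limit_pct = 50        # PowerTune slider at +50%

new_budget_w = default_budget_w * (1 + power_limit_pct / 100)
print(new_budget_w)  # -> 375.0
```

Raising the limit doesn't force the card to draw more; it just stops PowerTune from throttling clocks when the load would otherwise exceed the default budget.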


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> Ok can do. Gotta do it later.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here is a 3dMark11 run.
> http://www.3dmark.com/3dm11/8757253


Quote:


> Originally Posted by *obababoy*
> 
> Warranty is not over yet; I got this card a few months ago. I would like to check the thermal paste on this Vapor-X card. What would a bubble look like if I remove the paste? Would there be a hole, or is there no way to tell?
> I use CCC for monitor settings and whatnot, but I think I disabled Overdrive. Fan speed for the 60s is like 70%, which gets a bit loud in my H440 case playing Watch Dogs with everything maxed. VRMs are great; I forget the temps, but this Vapor-X 290 has a nice heatsink on them.
> 
> I guess I am confused. Could you give a quick rundown on power limit? Does 50% just mean the card is allowed to use 150% of its normal required power? The parameter Trixx lets me change is VDDC Offset, I believe.


you two need to read this . . .

http://www.overclock.net/t/1265543/the-amd-how-to-thread

blue, you are missing 2K points in graphics. read the thread above.
Quote:


> Originally Posted by *zealord*
> 
> sorry to jump in like that guys, but what is considered to be the best/most stable AMD driver version currently?
> 
> Thank you in advance !


go to amd driver site and download 14.4 WHQL or you can try this too . . .

http://www.overclock.net/t/1511189/amd-catalyst-14-x-beta-14-300-1005-0-august-27

that's what i use.


----------



## JeremyFenn

I love my Sapphire Vapor-X Tri-X R9 290X!!! It runs 1200/1600 all day long and never goes above 70C.







All I use is Trixx with +50 and +200 up, no hiccups.


----------



## kizwan

Quote:


> Originally Posted by *JeremyFenn*
> 
> I love my Sapphire Vapor-X Tri-X R9 290X!!! It runs 1200/1600 all day long and never goes above 70C.
> 
> 
> 
> 
> 
> 
> 
> All I use is Trixx with +50 and +200 up, no hiccups.


VRM temp?


----------



## JeremyFenn

Quote:


> Originally Posted by *kizwan*
> 
> VRM temp?


Don't know, I'm at the office not my house. I'm sure I have pictures somewhere with temps during benching.


----------



## JeremyFenn

VRMs look around 55c or so.


----------



## kizwan

Quote:


> Originally Posted by *JeremyFenn*
> 
> 
> 
> VRMs look around 55c or so.


Nice low temps.







At +200mV (same clocks), mine is 1.38V after Vdroop. What is your ambient/room temp usually?


----------



## JeremyFenn

Quote:


> Originally Posted by *kizwan*
> 
> Nice low temps.
> 
> 
> 
> 
> 
> 
> 
> At +200mV (same clocks), mine is 1.38V after Vdroop. What is your ambient/room temp usually?


We usually keep the AC set to 74F. I'd say ambients are around there somewhere.


----------



## Gualichu04

Since no one answered my question I will repost it. Has anyone playing The Elder Scrolls: Skyrim with CrossFire enabled experienced any sound issues with the 14.x driver? I can't use CrossFire; the sound just keeps cutting out.


----------



## sinnedone

Might have better luck in the gaming section of the forum with that question.


----------



## Chita Gonza

Ugh, couldn't resist and ended up buying the Sapphire BF4 edition using the mobile discount code. I need to talk to a rep and see if I can get my free shoprunner shipping for this.


----------



## Gobigorgohome

Okay, I am happy to say that the CPU overclock was my problem. Running my 4930K @ 4.6 GHz and quadfire, I get 45-50 FPS in BF4 @ 4K @ Ultra without MSAA.
4930K @ 4.7 GHz and quadfire, and I get well above 100 FPS in BF4 @ 4K @ Ultra without MSAA. Running stock cards on water with no problems; maximum temperature is 43C on the core of each card.


----------



## bond32

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Okay, I am happy to say that the CPU overclock was my problem. Running my 4930K @ 4.6 GHz and quadfire, I get 45-50 FPS in BF4 @ 4K @ Ultra without MSAA.
> 4930K @ 4.7 GHz and quadfire, and I get well above 100 FPS in BF4 @ 4K @ Ultra without MSAA. Running stock cards on water with no problems; maximum temperature is 43C on the core of each card.


There's no way 100 MHz on your CPU gave you a 50+ FPS increase...


----------



## Gobigorgohome

Quote:


> Originally Posted by *bond32*
> 
> There's no way 100 MHz on your CPU gave you a 50+ FPS increase...


Something was wrong with the four cards yesterday and today it works great; the only difference is the overclock on the CPU. Every setting in the BIOS is the same as yesterday. Today playing with four cards was like playing with three cards yesterday, only with slightly better FPS. Hit 200 FPS in-game and not that big of a drop to the lowest framerates.

Perhaps it was not only the CPU overclock, maybe something else too, but now it is smooth!







Even played some Metro Last Light @ 4K @ Very High and it was smooth. I have no idea what it is, if it is not the CPU overclock, because everything else is the same.


----------



## sinnedone

Could be something else, but I've read over the years that certain Intel CPUs just don't like a specific frequency. Say you go fine to 4.5 GHz, then take the voltage up high and you still can't get 4.6 stable. Then you bump it up a step and can get the higher frequency at less voltage. I've read this on several occasions but never personally experienced it. It could be what happened to you, unless you find another reason.


----------



## rdr09

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Something was wrong with the four cards yesterday and today it works great; the only difference is the overclock on the CPU. Every setting in the BIOS is the same as yesterday. Today playing with four cards was like playing with three cards yesterday, only with slightly better FPS. Hit 200 FPS in-game and not that big of a drop to the lowest framerates.
> 
> Perhaps it was not only the CPU overclock, maybe something else too, but now it is smooth!
> 
> Even played some Metro Last Light @ 4K @ Very High and it was smooth. I have no idea what it is, if it is not the CPU overclock, because everything else is the same.


could it be crossfire being bridgeless that makes it smoother? hmmm. i saw a 290 on sale in the ocn market.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> could it be crossfire being bridgeless that makes it smoother? hmmm. i saw a 290 on sale in the ocn market.


Overkill. The only game where I notice a difference is Crysis 3. In BF4 I have to upscale to 4K to make use of the second card.


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Overkill. The only game where I notice a difference is Crysis 3. In BF4 I have to upscale to 4K to make use of the second card.


i think my SB will struggle anyways.


----------



## Devildog83

Finally I would like to join.









It will be going under water after I trade up to Haswell-e very soon.


----------



## th3illusiveman

Quote:


> Originally Posted by *JeremyFenn*
> 
> 
> 
> VRMs look around 55c or so.


1.328V? Isn't that going to kill your card? How are you getting 55°C on the VRMs? Mine runs at 1100 MHz with 1.203 V (+81 mV), but the VRMs get to 80°C in Valley after ~30 min, and 85°C with 1.223 V (+100 mV), fans at about 70%.
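For reference, those "+mV" figures are just software offsets added on top of the card's stock VID. A minimal sketch of the arithmetic, with the ~1.122 V stock VID inferred from the numbers quoted above (not an official spec, and it varies per card with ASIC quality):

```python
# Sketch: how an Afterburner-style "+mV" offset maps to the core voltage
# a monitoring tool reports. The 1.122 V stock VID below is inferred from
# the post (1.203 V at +81 mV); it is an assumption, not a datasheet value.

def effective_voltage(stock_vid_v: float, offset_mv: int) -> float:
    """Return the resulting core voltage in volts."""
    return round(stock_vid_v + offset_mv / 1000.0, 3)

STOCK_VID = 1.122  # inferred; every card's VID differs

print(effective_voltage(STOCK_VID, 81))   # 1.203
print(effective_voltage(STOCK_VID, 100))  # 1.222 (the quoted 1.223 V
                                          # suggests per-step rounding
                                          # in the monitoring tool)
```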


----------



## Buehlar

Quote:


> Originally Posted by *Devildog83*
> 
> Finally I would like to join.
> 
> 
> 
> It will be going under water after I trade up to Haswell-e very soon.


Hey DD, glad to see you're still addicted!







I'm also retiring my HD 7870s. It's time; they just don't have enough oomph for Eyefinity gaming anymore.

I'll finally get to play through all my games at max settings!


----------



## Gobigorgohome

Quote:


> Originally Posted by *sinnedone*
> 
> Could be something else, but I've read over the years that certain intel cpu's just don't like a specific frequency. Say you'll go fine to 4.5ghz, then take voltage up high and you still cant get 4.6 stable. Then you bump it up a step and can get the higher frequency at less voltage. I've read this on several occasions but never personally experienced it. It could be what happened to you unless you find another reason.


This CPU is gold; I have it at 4.8 GHz @ 1.328 V P95 stable. This chip is capable of 5 GHz.

I have no other answer than the CPU overclock, and it doesn't run hot either. I just hope it will last.


----------



## JeremyFenn

Quote:


> Originally Posted by *th3illusiveman*
> 
> 1.328V? Isn't that going to kill your card? How are you getting 55°C on the VRMs? Mine runs at 1100 MHz with 1.203 V (+81 mV), but the VRMs get to 80°C in Valley after ~30 min, and 85°C with 1.223 V (+100 mV), fans at about 70%.


I have a Sapphire Vapor-X Tri-X R9-290x card. They have awesome VRMs


----------



## rdr09

hello.

i was hoping to be the first in the 3000th page.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> hello.
> 
> i was hoping to be the first in the 3000th page.


Maybe me?


----------



## Paopawdecarabao

Me too


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *mus1mus*
> 
> Gaming. and Full.


I game at 1100 and bench at 1300 + ..... when the damn driver plays nice .....

Quote:


> Originally Posted by *rdr09*
> 
> no, i read with cards behaving like yours, you have to use a combination of AB and CCC. set the power limit to 50% on both those apps.


I will have to have a read of that


----------



## mus1mus

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> *I game at 1100 and bench at 1300 +* ..... when the damn driver plays nice .....
> I will have to have a read of that


Thanks buddy.

Can someone point me to 14.X? Is that the one from G3D?

It seems my card doesn't react well to that driver. I'm back to 14.7 RC3. And broke 9K in FS.


----------



## Devildog83

I am 3000, this showed up today. My system along with the 290 is getting a big makeover.


----------



## rdr09

Quote:


> Originally Posted by *Devildog83*
> 
> I am 3000, this showed up today. My system along with the 290 is getting a big makeover.


yes, sir, you are. congratulations!


----------



## Noufel

Quote:


> Originally Posted by *Devildog83*
> 
> I am 3000, this showed up today. My system along with the 290 is getting a big makeover.










DAT MAKEOVER !!!


----------



## Buehlar

Quote:


> Originally Posted by *Devildog83*
> 
> I am 3000, this showed up today. My system along with the 290 is getting a big makeover.
> 
> 
> Spoiler: Warning: Spoiler!


Nice looking board...I'm jealous.









Are you going with 8 core?


----------



## Noufel

quad cfx ???


----------



## th3illusiveman

can i be 3000?


----------



## bluedevil

Did a run at 975/1250.

http://www.3dmark.com/3dm/4161229

Now doing a run at 1200/1400.


----------



## mus1mus

Now everyone hates to post and keeps on just waiting to be the lucky 3000th...


----------



## Gualichu04

My Card is stable at 1160mhz core and 1400mhz memory at +50 power limit and +75mv and even in crossfire both cards work fine. Really need to run 3dmark fire strike to see how they do. Can't get either card to do 1200mhz core. I tried in trixx with +200mv and they won't do it.


----------



## Gualichu04

Then there will be 30000 and many more posts to come.


----------



## VSG

30,000th reply! Is this the biggest owner's club thread on OCN?


----------



## Arizonian

Quote:


> Originally Posted by *geggeg*
> 
> 30,000th reply! Is this the biggest owner's club thread on OCN?


We are really close









We're still beat by the *[Official] AMD Radeon HD 7950/7970/7990 Owners Thread* currently up to page 3695 with post #36947 of 36947.

With new GPU's coming out, we may not exceed the 7970 / 7950 Club.


----------



## VSG

To be fair that generation lasted over 2 years!


----------



## chronicfx

Quote:


> Originally Posted by *Devildog83*
> 
> I am 3000, this showed up today. My system along with the 290 is getting a big makeover.


Just did mine a few weeks ago with the gaming gt. Couldn't be happier with the upgrade!


----------



## tsm106

Quote:


> Originally Posted by *geggeg*
> 
> To be fair that generation lasted over 2 years!


True. Tahiti probably has the longest legs of any generation in recent memory. And it's still going strong, albeit in multi-GPU setups.


----------



## gulba

Hey, since I installed the 290X the system has been missing 4 GB of RAM. I didn't mind for a long time, but now I need the extra RAM. Is there a chance to get it "back"?


----------



## ahmedmo1

Just picked up an ASUS Radeon R9 290X DirectCU II to replace my SLI MSI GTX 670 Power Edition OC cards.


----------



## Buehlar

Quote:


> Originally Posted by *boot318*
> 
> You're crazy for buying 290X's! There is a 99% chance AMD is going to drop the price later today. $300-350 is my guess. You should've waited. Retailers prey on the misinformed.....


Well so much for your prediction.








Prices are higher now without $40 in rebates and the 5% promo code I used, which expired on the 24th @ midnight. Got 6 free games too.
The prices for R9 2xx had already dropped pre-GTX 980 launch anyway, and after waiting out the whole bitcoin-mining craze that inflated prices, I wasn't waiting around a minute longer.









My luck they'll probably be half price by the time I receive mine.









Well I'll be set for ~2 yrs now anyway


----------



## Sgt Bilko

Quote:


> Originally Posted by *Buehlar*
> 
> Quote:
> 
> 
> 
> Originally Posted by *boot318*
> 
> You're crazy for buying 290X's! There is a 99% chance AMD is going to drop the price later today. $300-350 is my guess. You should've waited. Retailers prey on the misinformed.....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well so much for your prediction.
> 
> 
> 
> 
> 
> 
> 
> 
> Prices are higher now without $40 in rebates and the 5% promo code I used, which expired on the 24th @ midnight. Got 6 free games too.
> The prices for R9 2xx had already dropped pre-GTX 980 launch anyway, and after waiting out the whole bitcoin-mining craze that inflated prices, I wasn't waiting around a minute longer.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My luck they'll probably be half price by the time I receive mine.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well I'll be set for ~2 yrs now anyway

I'm curious to see what kind of temps you get with the DCU II's in Crossfire myself









We've really only had a few single-card users in here, and even they were complaining that the cards ran hot.


----------



## boot318

Quote:


> Originally Posted by *Buehlar*
> 
> Well so much for your prediction.
> 
> 
> 
> 
> 
> 
> 
> 
> Prices are higher now without $40 in rebates and the 5% promo code I used, which expired on the 24th @ midnight. Got 6 free games too.
> The prices for R9 2xx had already dropped pre-GTX 980 launch anyway, and after waiting out the whole bitcoin-mining craze that inflated prices, I wasn't waiting around a minute longer.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My luck they'll probably be half price by the time I receive mine.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well I'll be set for ~2 yrs now anyway


AMD made me eat crow! LOL. They planned the 'big' 25th announcement and it was only intended for India (an old product launch that hadn't happened there yet). I don't know why they didn't cut the prices. I mean, the 970 beats the R9 290X at $330.... and AMD still wants to charge $500+ for the cards?

You could've gotten three 970s or two 980s for the same amount. Two 970s beat two R9 290Xs. You could've spent your money a little better. Personally, I would refuse the package and get my money back. They'll have to raise their prices down sooner or later. They are just fishing for chumps at the current price point. I love AMD, but at the current prices they aren't worth it.


----------



## Buehlar

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm curious to see what kind of temps you get with the DCU II's in Crossfire myself
> 
> 
> 
> 
> 
> 
> 
> 
> 
> We've really only had a few single-card users in here, and even they were complaining that the cards ran hot.


Yea, I read that the heat spreaders were actually designed for the 780 DCII








I'm water cooling these anyway but before I do I'll be testing them extensively over the weekend!








They'll be here tomorrow.
Quote:


> Originally Posted by *boot318*
> 
> AMD made me eat crow! LOL. They planned the 'big' 25th announcement and it was only intended for India (an old product launch that hadn't happened there yet). I don't know why they didn't cut the prices. I mean, the 970 beats the R9 290X at $330.... and AMD still wants to charge $500+ for the cards?
> 
> You could've gotten three 970s or two 980s for the same amount. Two 970s beat two R9 290Xs. You could've spent your money a little better. Personally, I would refuse the package and get my money back. They'll have to raise their prices down sooner or later. They are just fishing for chumps at the current price point. I love AMD, but at the current prices they aren't worth it.


Yes I know this but thanks for the concern








It's such a shame too, because I'd love to grab a pair of 980s...the problem is that I use a 3x1 display setup, and frankly Nvidia fumbled the ball with its drivers by not supporting "proper" display profiles.
With AMD I can game in 3-way Eyefinity, then drop out and use "real" extended desktop mode for my productivity tasks with the touch of a hotkey.

Once you configure Nvidia for Surround, the hotkey uses a "virtual" extended desktop mode. It doesn't actually drop you out of Surround mode, and the function is not the same. It uses a "windowed" mode for each display...not all programs are supported, windows resize all over the place, etc...been there, done that.
It's a PITA to have to disable Surround every time I need "true" extended desktop mode, and then reconfigure it every time I want to game in Surround. Re-setting up bezel compensation and arranging displays over and over is a headache.

When it comes to supporting multiple displays properly, AMD got it right, and until Nvidia ponies up and supports proper profiles, I "need" an AMD card.









Unless of course I don't intend on using more than one monitor, then Nvidia may be a viable option.


----------



## Colin_MC

Hello
Sorry, but I didn't have the time to review the whole thread.
Which model of the R9 290 is the best (fan noise/temperature, including VRMs):
- Asus DirectCU II
- GB WF3X
- Sapphire Tri-X?
I'm wondering if I should change from an Asus GTX 770 2GB DCU II to an R9 290 (a small $ addition needed) - will it bring a significant performance improvement? (I've seen various tests, like on TechPowerUp.) I'm a little worried by the high failure rate (but that's mainly on non-reference cards with high VRM temperatures, right?)...


----------



## Wezzor

Why wouldn't you want to have 970 instead?


----------



## rdr09

Quote:


> Originally Posted by *Wezzor*
> 
> Why wouldn't you want to have 970 instead?


might still be very expensive in Poland. i would suggest that, too, but other countries charge a lot more for hardware.

@Colin, check the prices of the GTX 970 in your area. it runs cooler and uses less power. unless you play Battlefield, i highly recommend the 970.


----------



## Colin_MC

Ok. Let's say that:
- an R9 290 (used) would cost me at most about $50 more than I sold the GTX 770 for
- a GTX 970 would be $180 more.
Quite a big difference for me, especially since I don't even have a Full HD display







(20" 1600x900, due to lack of space)

EDIT - Wezzor, as I see, you have a Sapphire. Would you recommend it? Is it possible to change the idle fan speed? (It's one of the most important things for me.)


----------



## rdr09

Quote:


> Originally Posted by *Colin_MC*
> 
> Ok. Let's say that:
> - an R9 290 (used) would cost me at most about $50 more than I sold the GTX 770 for
> - a GTX 970 would be $180 more.
> Quite a big difference for me, especially since I don't even have a Full HD display
> 
> 
> 
> 
> 
> 
> 
> 
> (20" 1600x900, due to lack of space)
> 
> EDIT - Wezzor, as I see, you have a Sapphire. Would you recommend it? Is it possible to change the idle fan speed? (It's one of the most important things for me.)


makes sense. i recommend the sapphire among those.

edit: got a question . . . why does 3DMark11 read my oc but not 3DMark?

http://www.3dmark.com/3dm11/8702421

http://www.3dmark.com/3dm/4163936?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Colin_MC*
> 
> Hello
> Sorry, but I didn't have the time to review the whole thread.
> Which model of the R9 290 is the best (fan noise/temperature, including VRMs):
> - Asus DirectCU II
> - GB WF3X
> - Sapphire Tri-X?
> I'm wondering if I should change from an Asus GTX 770 2GB DCU II to an R9 290 (a small $ addition needed) - will it bring a significant performance improvement? (I've seen various tests, like on TechPowerUp.) I'm a little worried by the high failure rate (but that's mainly on non-reference cards with high VRM temperatures, right?)...


Out of those three the Tri-X is the better card.


----------



## Wezzor

Quote:


> Originally Posted by *Colin_MC*
> 
> EDIT - Wezzor, as I see, you have a Sapphire. Would you recommend it? Is it possible to change the idle fan speed? (It's one of the most important things for me.)


I'd highly recommend the Sapphire Radeon R9 290 Tri-X. Yes, you can change the idle fan speed, but there's no need since it's already extremely quiet.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> Now everyone hates to post and keeps on just waiting to be the lucky 3000th...


I love your disclaimer. But I plan to ignore it just the same.


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> Did a run at 975/1250.
> 
> http://www.3dmark.com/3dm/4161229
> 
> Now doing a run at 1200/1400.


blue, my bad. your gpu is actually outscoring mine at same clocks. seems to be working fine at lower clocks. could be your psu at higher loads.

is it still the rosewill 550W?


----------



## Devildog83

Quote:


> Originally Posted by *Buehlar*
> 
> Nice looking board...I'm jealous.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Are you going with 8 core?


I got the board for a review, and I should make enough selling the CHVFZ, 8350 and Trident X 2400 to get the 5820K, so all I will have left to buy is DDR4 RAM, which is pretty expensive, but with the 290 it should be a huge upgrade for my rig. Maybe the 8-core down the road.


----------



## Devildog83

Quote:


> Originally Posted by *Arizonian*
> 
> We are really close
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> We're still beat by the *[Official] AMD Radeon HD 7950/7970/7990 Owners Thread* currently up to page 3695 with post #36947 of 36947.
> 
> With new GPU's coming out, we may not exceed the 7970 / 7950 Club.


The FX 8350 thread has over 40,000 posts. Does it count?


----------



## zealord

what do you guys think is a safe +mV voltage (aux and core) for a Sapphire R9 290X Tri-X for 24/7 use?

temps at default are around 70°C for my card.


----------



## tsm106

Quote:


> Originally Posted by *boot318*
> 
> AMD made me eat crow! LOL. They planned the 'big' 25th announcement and it was only intended for India (an old product launch that hadn't happened there yet). I don't know why they didn't cut the prices. I mean, the 970 beats the R9 290X at $330.... and AMD still wants to charge $500+ for the cards?
> 
> You could've gotten three 970s or two 980s for the same amount. Two 970s beat two R9 290Xs. You could've spent your money a little better. Personally, I would refuse the package and get my money back. They'll have to raise their prices down sooner or later. They are just fishing for chumps at the current price point. I love AMD, but at the current prices they aren't worth it.


Is this guy for real? 970 is not faster.


----------



## joeh4384

Quote:


> Originally Posted by *tsm106*
> 
> Is this guy for real? 970 is not faster.


It is pretty much right on the 290x's ass though.


----------



## tsm106

Quote:


> Originally Posted by *joeh4384*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Is this guy for real? 970 is not faster.
> 
> 
> 
> It is pretty much right on the 290x's ass though.

I haven't seen one 970, not even mine which does over 1500, come close to my 290x scores.


----------



## Devildog83

Quote:


> Originally Posted by *tsm106*
> 
> Is this guy for real? 970 is not faster.


I am still trying to figure out how you "raise the prices down".


----------



## sugarhell

Quote:


> Originally Posted by *tsm106*
> 
> Is this guy for real? 970 is not faster.


The big 25th announcement was for India only, but VideoCardz just posted it as a worldwide announcement. That's what you get if you spend too much time in the WCCF comment section: you believe people with zero idea.

A bit ironic that you have both a 290X and a 970.


----------



## sTOrM41

hey guys
i need a Club3D 290X Royal King or PowerColor 290X TurboDuo bios.

can somebody please upload it for me?


----------



## th3illusiveman

Quote:


> Originally Posted by *tsm106*
> 
> I haven't seen one 970, not even mine which does over 1500, come close to my 290x scores.


well why don't you show us the difference?


----------



## tsm106

Quote:


> Originally Posted by *th3illusiveman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I haven't seen one 970, not even mine which does over 1500, come close to my 290x scores.
> 
> 
> 
> well why don't you show us the difference?

What's wrong? Are you incapable of checking the Firestrike HOF for the existence of a 970? That's what these trolls are basing their "970 pwns all" rhetoric on, isn't it?


----------



## boot318

Quote:


> Originally Posted by *tsm106*
> 
> Is this guy for real? 970 is not faster.


Almost every review has it faster. At 1080p and 1440p it wins; at 4K it might lose every now and again. You can almost buy two 970s for the price of one 290X. Are you talking about benching or actual gaming?
Quote:


> Originally Posted by *Devildog83*
> 
> I am still trying to figure out how you "raise the prices down".


The 780 was $750 before the 290X came out. The 290X launched, the performance numbers came out, and Nvidia had to drop their prices. It is kinda crazy that a company wants you to spend $200-ish more for the same performance plus higher energy consumption and heat. Not many people are going to pass up a $330 970 for a $520 290X. Not hard to understand. $400 would make the card a little more compelling. AMD always marketed like this: "You can spend more, but you can't get more!" Well, Nvidia just pulled an AMD with the prices, and they do nothing.
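The price/performance argument above boils down to simple division. A minimal sketch using only the prices quoted in this post ($330 for the 970, $520 for the 290X); the performance index is a placeholder assumption, not benchmark data, so plug in your own numbers from reviews:

```python
# Sketch: dollars paid per unit of relative performance.
# Prices come from the post above; the perf index of 100 for both
# cards is an assumption reflecting the "same performance" claim.

def dollars_per_perf(price_usd: float, perf_index: float) -> float:
    """Lower is better value."""
    return round(price_usd / perf_index, 2)

cards = {
    "GTX 970": (330, 100),   # baseline index (assumed)
    "R9 290X": (520, 100),   # "same performance" per the post
}

for name, (price, perf) in cards.items():
    print(name, dollars_per_perf(price, perf))
```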


----------



## tsm106

Reviews lol, it's like the GTX 770 all over again.


----------



## boot318

^970 owners in this very forum are saying the same thing. Dismiss them also?

I don't think 2-4 frames is worth $200 more. Too close for the price gap to be that big.


----------



## bluedevil

I think I just figured out my 290 is benching lower than everyone else's........Windows 8.1.

I guess that's why this would show up?
Quote:


> TIME MEASURING INACCURATE. RESULTS ... ( + 1 MORE )


http://www.3dmark.com/3dm11/8762034


----------



## battleaxe

Quote:


> Originally Posted by *bluedevil*
> 
> I think I just figured out my 290 is benching lower than everyone else's........Windows 8.1.
> 
> I guess that's why this would show up?
> http://www.3dmark.com/3dm11/8762034


Looks like Elpida RAM on that card. If so, then no, that's about right. Hynix RAM cards will do much better, yes.


----------



## bluedevil

Quote:


> Originally Posted by *battleaxe*
> 
> Looks like Elpida RAM on that card. If so, then no, that's about right. Hynix RAM cards will do much better, yes.


That it is.


----------



## battleaxe

Quote:


> Originally Posted by *bluedevil*
> 
> That it is.


yeah. It's a bummer. I have one of both. The card with Hynix kills the other one by a long shot.


----------



## thrgk

How can you tell which you have, Hynix or Elpida? Where can I check my 290Xs to see what they have?


----------



## bluedevil

Quote:


> Originally Posted by *thrgk*
> 
> How can you tell which you have, Hynix or Elpida? Where can I check my 290Xs to see what they have?


----------



## Devildog83

Quote:


> Originally Posted by *boot318*
> 
> Almost every review has it faster. At 1080p and 1440p it wins; at 4K it might lose every now and again. You can almost buy two 970s for the price of one 290X. Are you talking about benching or actual gaming?
> The 780 was $750 before the 290X came out. The 290X launched, the performance numbers came out, and Nvidia had to drop their prices. It is kinda crazy that a company wants you to spend $200-ish more for the same performance plus higher energy consumption and heat. Not many people are going to pass up a $330 970 for a $520 290X. Not hard to understand. $400 would make the card a little more compelling. AMD always marketed like this: "You can spend more, but you can't get more!" Well, Nvidia just pulled an AMD with the prices, and they do nothing.


Whatever, you still have to lower the prices down. If you raise them they go up.

For everyone else, why does my 290 show more shaders than bluedevil's 290?


----------



## bluedevil

Quote:


> Originally Posted by *Devildog83*
> 
> Whatever, you still have to lower the prices down. If you raise them they go up.
> 
> For everyone else, why does my 290 show more shaders than bluedevil's 290?


Because yours is a 290 unlocked to a 290X. Same shader count.


----------



## Devildog83

Quote:


> Originally Posted by *bluedevil*
> 
> Because yours is a 290 unlocked to a 290X. Same shader count.


That's a bonus. I just got it.


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> Looks like Elpida RAM on that card. If so, then no, that's about right. Hynix RAM cards will do much better, yes.


not so fast . . . mine's got elpida.

@blue, i think it's your psu but instead of getting a new psu . . . you might as well go with plan B.

yah, your plan B.









970


----------



## Offler

Finally got graphics score over 12 000

http://www.3dmark.com/3dm/4170397

GPU 1100MHz/1400MHz

CPU on 4GHz still not stable, so I went back to 3,9...


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> not so fast . . . mine's got elpida.
> 
> @blue, i think it's your psu but instead of getting a new psu . . . you might as well go with plan B.
> 
> yah, your plan B.


I wonder what causes the lower scores then?


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> I wonder what causes the lower scores then?


like i said, i suspect the psu. at 1200 core his card underperforms. at 975 MHz it looks normal.


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> not so fast . . . mine's got elpida.
> 
> @blue, i think it's your psu but instead of getting a new psu . . . you might as well go with plan B.
> 
> yah, your plan B.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 970


So you think my Capstone 550M is getting tapped out huh? I should do a run and see if the GPU or CPU downclocks during the run.....


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> So you think my Capstone 550M is getting tapped out huh? I should do a run and see if the GPU or CPU downclocks during the run.....


for a while there were hawaiis underperforming but i think they were all 290X and mostly powercolors. run at stock, then add a 10 - 20 MHz oc, then test each time.
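The incremental plan suggested above can be sketched as a simple step generator: start at stock and validate each clock before moving up. The 1000 MHz stock and 1100 MHz target below are example values for a reference 290X, not settings from the post:

```python
# Sketch: generate the list of core clocks to validate when stepping
# up from stock in 10-20 MHz increments, per the advice above.
# Clock values are illustrative assumptions.

def oc_test_plan(stock_mhz: int, target_mhz: int, step_mhz: int = 20):
    """Yield each clock to test, from stock up to the target inclusive."""
    clock = stock_mhz
    while clock <= target_mhz:
        yield clock
        clock += step_mhz

print(list(oc_test_plan(1000, 1100)))  # [1000, 1020, 1040, 1060, 1080, 1100]
```

At each step you would run your benchmark of choice (Valley, Fire Strike) and only proceed if it passes without artifacts.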


----------



## Devildog83

My 1st 3DMark with the new card - Man the 8350 just don't cut it anymore.

http://www.3dmark.com/3dm/4170896


----------



## rdr09

Quote:


> Originally Posted by *Devildog83*
> 
> My 1st 3DMark with the new card - Man the 8350 just don't cut it anymore.
> 
> http://www.3dmark.com/3dm/4170896


your physics and graphics scores are fine. something went wrong with the combined score. not another psu?! idk, just speculating. put your cpu to stock and run again. no oc on the gpu as well. or, at least lower your oc on the cpu.

here is mine with an i7 ht off.

http://www.3dmark.com/3dm/3158989?

your physics score is higher than mine.

edit: i highly recommend this.


----------



## tsm106

Wow, that is low. My 7970 is almost 3k higher in physics score. That said, you have some tweaking to do! It looks CPU/mobo related, as the graphics score is not terrible.


----------



## thrgk

looks like 2 of my 290x's are hynix and 2 are elpida.

Should I OC the hynix pair separately, since they can reach much higher memory clocks?


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> your physics and graphics scores are fine. something went wrong with the combined score. not another psu?! idk, just speculating. put your cpu to stock and run again. no oc on the gpu as well. or, at least lower your oc on the cpu.
> 
> here is mine with an i7 ht off.
> 
> http://www.3dmark.com/3dm/3158989?
> 
> your physics score is higher than mine.
> 
> edit: i highly recommend this.


This helps big time. It pretty much says I need a higher-end CPU to go with my high-end GPU.


----------



## Devildog83

Quote:


> Originally Posted by *rdr09*
> 
> your physics and graphics scores are fine. something went wrong with the combined score. not another psu?! idk, just speculating. put your cpu to stock and run again. no oc on the gpu as well. or, at least lower your oc on the cpu.
> 
> here is mine with an i7 ht off.
> 
> http://www.3dmark.com/3dm/3158989?
> 
> your physics score is higher than mine.
> 
> edit: i highly recommend this.


No, the PSU is fine; I have always had low combined scores, in Fire Strike especially. I have tried many different settings but no help. I wonder if it has to do with the Crosshair V mobo or something. Nobody has been able to answer that one.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Devildog83*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> your physics and graphics scores are fine. something went wrong with the combined score. not another psu?! idk, just speculating. put your cpu to stock and run again. no oc on the gpu as well. or, at least lower your oc on the cpu.
> 
> here is mine with an i7 ht off.
> 
> http://www.3dmark.com/3dm/3158989?
> 
> your physics score is higher than mine.
> 
> edit: i highly recommend this.
> 
> 
> 
> No, the PSU is fine; I have always had low combined scores, in Fire Strike especially. I have tried many different settings but no help. I wonder if it has to do with the Crosshair V mobo or something. Nobody has been able to answer that one.

http://www.3dmark.com/fs/2002510


----------



## Devildog83

Here is a Fire Strike score with a 7870/270X crossfire; same low combined score. I did get 2300+ once with just the 7870. The only thing I haven't tried is moving the card down to the second 16x slot to see if it's the top slot that is going bad.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Devildog83*
> 
> Here is a Fire Strike score with a 7870/270X crossfire; same low combined score. I did get 2300+ once with just the 7870. The only thing I haven't tried is moving the card down to the second 16x slot to see if it's the top slot that is going bad.


What's the CPU/NB and HT running at there man?


----------



## Devildog83

Quote:


> Originally Posted by *Sgt Bilko*
> 
> What's the CPU/NB and HT running at there man?


I think 2600 MHz HT and 2200 MHz CPU/NB.

with tessellation set to 2x it's a better score, but the combined is still low.

http://www.3dmark.com/3dm/4171812


----------



## Sgt Bilko

Quote:


> Originally Posted by *Devildog83*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> What's the CPU/NB and HT running at there man?
> 
> 
> 
> I think 2600ht and 2200CPU/NB
> 
> with tessy set to 2x it's a better score but still combined is low.
> 
> http://www.3dmark.com/3dm/4171812

I was running 2700Mhz CPU/NB and 3000Mhz HT Link

I can do a quick run now at my stock settings to compare?


----------



## Wezzor

I just did a Fire Strike run and these are the scores. Are they good or not?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Devildog83*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> What's the CPU/NB and HT running at there man?
> 
> 
> 
> I think 2600ht and 2200CPU/NB
> 
> with tessy set to 2x it's a better score but still combined is low.
> 
> http://www.3dmark.com/3dm/4171812
> 
> 
> 
> I was running 2700Mhz CPU/NB and 3000Mhz HT Link
> 
> I can do a quick run now at my stock settings to compare?

http://www.3dmark.com/3dm/4171911

Graphics Score: 10365
Physics Score: 7957
Combined Score: 3158

CPU @ 4.7, 2133Mhz 10-11-11-31, CPU/NB 2200Mhz, HT 2600Mhz, 290 @ 1000/1250

Also running other crap in the background but yeah.
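For anyone comparing runs like this, 3DMark folds the three subscores into one overall Fire Strike number using a weighted harmonic mean. A sketch below; the 0.75/0.15/0.10 weights are an assumption for illustration, so check Futuremark's technical guide for the exact values:

```python
# Sketch: overall Fire Strike score as a weighted harmonic mean of the
# graphics, physics, and combined subscores. The weights here are an
# assumed example, not Futuremark's published constants.

def weighted_harmonic_mean(scores, weights):
    """Combine subscores into one overall score."""
    total_w = sum(weights)
    return total_w / sum(w / s for s, w in zip(scores, weights))

# The run quoted above: graphics 10365, physics 7957, combined 3158
overall = weighted_harmonic_mean([10365, 7957, 3158], [0.75, 0.15, 0.10])
print(round(overall))
```

The harmonic mean explains why a weak combined score drags the overall number down much harder than a strong graphics score pulls it up.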


----------



## zealord

Quote:


> Originally Posted by *Wezzor*
> 
> 
> I just did a Fire Strike run and these are the scores. Are they good or not?


yep that is good/appropriate for your hardware


----------



## Wezzor

Quote:


> Originally Posted by *zealord*
> 
> yep that is good/appropriate for your hardware


Nice to hear! Thanks for the fast answer.


----------



## Devildog83

Quote:


> Originally Posted by *Sgt Bilko*
> 
> http://www.3dmark.com/3dm/4171911
> 
> Graphics Score: 10365
> Physics Score: 7957
> Combined Score: 3158
> 
> CPU @ 4.7, 2133Mhz 10-11-11-31, CPU/NB 2200Mhz, HT 2600Mhz, 290 @ 1000/1250
> 
> Also running other crap in the background but yeah.


At some point I will move the card down to the other 16x slot, but it's very weird and doesn't seem correct.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Devildog83*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> http://www.3dmark.com/3dm/4171911
> 
> Graphics Score: 10365
> Physics Score: 7957
> Combined Score: 3158
> 
> CPU @ 4.7, 2133Mhz 10-11-11-31, CPU/NB 2200Mhz, HT 2600Mhz, 290 @ 1000/1250
> 
> Also running other crap in the background but yeah.
> 
> 
> 
> At some point I will move the card down to the other 16x slot, but it's very weird and doesn't seem correct.
Click to expand...

It is weird, not sure what's going on there, but something isn't right.


----------



## Devildog83

Quote:


> Originally Posted by *Sgt Bilko*
> 
> It is weird, not sure what's going on there, but something isn't right.


Bumped the CPU/NB to 2400 and got a bit better. Not much.


----------



## rdr09

Quote:


> Originally Posted by *Devildog83*
> 
> At some point I will move the card down to the other 16x slot, but it's very weird and doesn't seem correct.


if you are 100% sure it's not the psu . . . i would not sweat it since you are going X99. i am sure it's not your gpu.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Devildog83*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> It is weird, not sure what's going on there, but something isn't right.
> 
> Bumped the CPU/NB to 2400 and got a bit better. Not much.
Click to expand...

Yeah it's still not right, I'm really not sure man, maybe someone in the FX thread might have a better idea?


----------



## Devildog83

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah it's still not right, I'm really not sure man, maybe someone in the FX thread might have a better idea?


Thanks, I have tried everywhere, so now I am going to try Intel Haswell-E.


----------



## devilhead

winter is coming soon, so i can make a better score, and i need to test the new drivers, so i should get a better graphics score

http://www.3dmark.com/fs/2523594


----------



## th3illusiveman

Quote:


> Originally Posted by *tsm106*
> 
> What's wrong? You incapable of checking the firestrike hof for the existence of a 970? That's what these trolls are basing their 970 pwns all rhetoric on isn't it?


What's with you jumping to conclusions?







You talk big but don't post any proof. Without it your posts are worthless.


----------



## Devildog83

Quote:


> Originally Posted by *rdr09*
> 
> if you are 100% sure it's not the psu . . . i would not sweat it since you are going X99. i am sure it's not your gpu.


Yep, it's a Seasonic Platinum and all of the voltages are spot on.


----------



## ahmedmo1

73% ASIC quality ASUS Radeon R9 290X DirectCU II

1100/1500 1.3V

auto fan profile - 86C Core and 97C Memory
Good OC?


----------



## th3illusiveman

Quote:


> Originally Posted by *ahmedmo1*
> 
> 73% ASIC quality ASUS Radeon R9 290X DirectCU II
> 
> 1100/1500 1.3V
> 
> auto fan profile- 86C Core and 97C Memory
> 
> Good OC?


that's kinda warm, but the DirectCU II has a reputation for doing that. What are your VRMs running at? You can check their temps with a program called GPU-Z: click the Sensors tab and scroll down to the bottom. Make sure they aren't *over 100C*, because they typically run about 10C higher than the core temp on custom-cooled cards. How do you know what temps your memory is running at?
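Instead of watching the Sensors tab live, GPU-Z can also log sensors to a CSV file, and a tiny script can pull out the peak reading after a benchmark run. This is just a sketch: the exact column header (e.g. "VRM Temperature [C]") varies by card and GPU-Z version, so it matches headers by substring, and the sample log below is made up for illustration.

```python
import csv
import io

def max_sensor_reading(log_text, column_keyword="VRM"):
    """Return the highest reading from any column whose header
    contains `column_keyword` in a GPU-Z-style CSV sensor log.
    Header names vary by card, so we match by substring."""
    reader = csv.reader(io.StringIO(log_text))
    header = [h.strip() for h in next(reader)]
    cols = [i for i, h in enumerate(header) if column_keyword.lower() in h.lower()]
    peak = None
    for row in reader:
        for i in cols:
            try:
                value = float(row[i])
            except (ValueError, IndexError):
                continue  # skip timestamps and malformed cells
            peak = value if peak is None else max(peak, value)
    return peak

# Illustrative log snippet (headers and values are made up):
sample = """Date,GPU Temperature [C],VRM Temperature [C]
2014-09-27 20:00:01,86.0,95.0
2014-09-27 20:00:02,86.0,97.0
"""
print(max_sensor_reading(sample))  # → 97.0
```

Point it at a real "GPU-Z Sensor Log.txt" with `open(...).read()` and adjust the keyword to whatever your card's VRM column is called.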


----------



## ahmedmo1

Quote:


> Originally Posted by *th3illusiveman*
> 
> that's kinda warm, but the DirectCU II has a reputation for doing that. What are your VRMs running at? You can check their temps with a program called GPU-Z: click the Sensors tab and scroll down to the bottom. Make sure they aren't *over 100C*, because they typically run about 10C higher than the core temp on custom-cooled cards. How do you know what temps your memory is running at?


Sorry- meant my VRMs run at 97 max.


----------



## th3illusiveman

Quote:


> Originally Posted by *ahmedmo1*
> 
> Sorry- meant my VRMs run at 97 max.


That's not very good. They technically CAN run at those temps, but I would highly suggest you drop your voltage or increase your fan speed until they are at most in the mid-80s.


----------



## mus1mus

Quote:


> Originally Posted by *Devildog83*
> 
> Did bump the CPB/NB to 2400 and got a bit better. Not much.


I assume you are using 14.X driver.

I have the same issue. What I suggest is to do an uninstall using 14.7. Then reinstall 14.7 using Express.

Test.

I am using the PCS+ BIOS, which seems to be creating the conflict, since everyone else is getting good scores with 14.X. But I still need to do some tests.


----------



## th3illusiveman

Wow, this AMD 14.X driver is pretty sweet! Upgraded from the modded 14.8 drivers to the latest 14.X (September 2nd) and scores are up across the board.

R9-290X 1100/1500 24/7 OC

Valley Up from 65.8 to 67.9

3DM11 Up from 17085 to 17290 (gpu score)
http://www.3dmark.com/fs/2836403

3DMFS Up from 12748 to 12904 (gpu score)
http://www.3dmark.com/fs/2836563


----------



## Devildog83

Quote:


> Originally Posted by *mus1mus*
> 
> I assume you are using 14.X driver.
> 
> I have the same issue. What I suggest is to do an uninstall using 14.7. Then reinstall 14.7 using Express.
> 
> Test.
> 
> I am using the PCS+ bios that seems to create the conflict as everyone's getting good scores with 14.X. But will still need to do some tests.


I am on 14.4, but it's been this way for a long time, no matter the card or cards or what BIOS I have. TBH, just one 7870 gave me a better combined score. It actually got over 2400, which is still low but better than with more GPUs. Every time I get to the combined test it drops to 10 FPS or so, no matter what.


----------



## mus1mus

Quote:


> Originally Posted by *Devildog83*
> 
> I am on 14.4, but it's been this way for a long time, no matter the card or cards or what BIOS I have. TBH, just one 7870 gave me a better combined score. It actually got over 2400, which is still low but better than with more GPUs. Every time I get to the combined test it drops to 10 FPS or so, no matter what.


I almost gave up on that scenario a couple of nights back. Combined goes to 7fps or so.

Fresh OS, trying out 3 drivers, 14.7 RC3 solved it. Combined sits at 13ish


----------



## DMT94

Sapphire Vapor-X R9 290X
Stock Cooling


Spoiler: System image!


----------



## Devildog83

Quote:


> Originally Posted by *mus1mus*
> 
> I almost gave up on that scenario a couple of nights back. Combined goes to 7fps or so.
> 
> Fresh OS, trying out 3 drivers, 14.7 RC3 solved it. Combined sits at 13ish


With the hardware we have, it should be 15 FPS or better.


----------



## Buehlar

Quote:


> Originally Posted by *Devildog83*
> 
> I am on 14.4, but it's been this way for a long time, no matter the card or cards or what BIOS I have. TBH, just one 7870 gave me a better combined score. It actually got over 2400, which is still low but better than with more GPUs. Every time I get to the combined test it drops to 10 FPS or so, no matter what.


Is the CPU bottlenecking during the run?


----------



## mus1mus

Quote:


> Originally Posted by *Devildog83*
> 
> With the hardware we have, it should be 15 FPS or better.


I have a 290. Still fiddling with little things on it. Your 290 that unlocked to a 290X will be better.


----------



## Klocek001

Anyone here who's been running 290 or 290X for more than a year without having to RMA once ?


----------



## pdasterly

Two RMAs in the past month.


----------



## Klocek001

Quote:


> Originally Posted by *pdasterly*
> 
> Two RMAs in the past month.


Keep it at stock much?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Klocek001*
> 
> Anyone here who's been running 290 or 290X for more than a year without having to RMA once ?


R9 290 Crossfire since 31st December last year, no issues here


----------



## pdasterly

Quote:


> Originally Posted by *Klocek001*
> 
> keep it at stock much ?


Sapphire reference cards, NZXT G10 brackets, then went full water. One card died when going full water, then the other card died when I got the first one back from RMA.
Ran the Tri-X OC BIOS on both cards, plus a slight overclock on top of that.


----------



## Klocek001

What I'm trying to establish is whether I need to sell the 290 and get something more reliable. I'm totally amazed by its performance. The card was bought in Feb, sold to me in Jul, and started having issues in Sept. It was probably used for bitcoin mining (or whatever it's called) for 4 months before I got it off an auction site. It was running at 1150/1500 +80mV for a month before it died.

Should I have said "had been running for a month before it died" ? Is Past Perfect Continuous ever used anymore ?


----------



## Arizonian

Quote:


> Originally Posted by *DMT94*
> 
> 
> 
> Sapphire Vapor-X R9 290X
> Stock Cooling
> 
> 
> Spoiler: System image!


Congrats - added


----------



## pdasterly

Quote:


> Originally Posted by *Klocek001*
> 
> What I'm trying to establish is whether I need to sell the 290 and get something more reliable. I'm totally amazed by its performance. The card was bought in Feb, sold to me in Jul, and started having issues in Sept. It was probably used for bitcoin mining (or whatever it's called) for 4 months before I got it off an auction site. It was running at 1150/1500 +80mV for a month before it died.
> 
> Should I have said "had been running for a month before it died" ? Is Past Perfect Continuous ever used anymore ?


Sounds like we have the same problem. Unfortunately, you won't find a more powerful card for a better price, e.g. a 780 Ti or GTX 880. On price/performance value, the 290X wins hands down.


----------



## Klocek001

I'm going to keep it at stock when I get it back; its performance is overkill for 1080p anyway (I own a 290 Tri-X OC, and I have no issues with thermal throttling). We'll see how that plays out: if I have to RMA again within a year I'll ask for a refund or an R9 3xx card, or sell it if they refuse.


----------



## pdasterly

Good luck with that. I need every bit or byte of performance for my triple monitors. Thinking of selling one 290X for a 295X2; just need a buyer for one of my cards. If I can sell both, then the GTX 880 looks good. AMD is lagging on the R9 3xx; it's time to strike back at the Maxwell GPU series, but their announcement on the 25th was for miners.


----------



## Klocek001

What was the 25th Sept announcement ? I missed that one...


----------



## zealord

Quote:


> Originally Posted by *Klocek001*
> 
> What was the 25th Sept announcement ? I missed that one...


285 for India


----------



## pdasterly

https://www.cryptocoinsnews.com/amd-hbm-scrypt-mining/
and this http://www.thehindubusinessline.com/features/smartbuy/tech-news/amd-eyes-indian-film-industry-to-engage-with-makers/article6445578.ece
yes and the 285 for india


----------



## Klocek001

haha that's hilarious.


----------



## pdasterly

Side note, to make you feel better: the 295X2 is the most powerful gaming GPU on the planet, and two 290Xs in crossfire offer more performance than the 295X2. Sure, you can match that performance with other GPUs, but again, price/performance-wise it can't be beat. YET.


----------



## th3illusiveman

They need price cuts ASAP. Nvidia really went all out with their 970 pricing, and even if the 290X is at least even with it in performance, the 290X consumes quite a bit more power while doing so (granted, it is a year older), and reviewers pretending that's the most important thing in the world all of a sudden doesn't help them either. Fair pricing for the 290X would be $325 with a game bundle, and $280 for the 290 with a bundle as well, I think. It's still a lot of card for that money.


----------



## Klocek001

Yeah, when I heard of the 970 beating the 780 Ti for a reasonable price, I was gonna grab it and sell the 290. Then I realized that by the end of the year I'd probably be able to snatch a second used 290 Tri-X for a sum of money equal to a 15-year-old's monthly pocket money. Shame about the reliability of those cards, though.


----------



## boot318

Quote:


> Originally Posted by *pdasterly*
> 
> Sounds like we have the same problem. Unfortunately, you won't find a more powerful card for a better price, e.g. a 780 Ti or GTX 880. *On price/performance value, the 290X wins hands down*


? I must be really tired. Price/performance and the R9 290X don't belong in the same sentence. For cards over $300, Nvidia is killing AMD. The 780 is $330, and a 780 Ti can be had for $440. The 970 offers similar performance for $330. A lot of 290X models are the same price as the 980. AMD needs to adjust their prices.
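The price/performance argument is easy to make concrete. A minimal sketch, using the prices quoted above ($330 for a 970 and 780, $440 for a 780 Ti); the fps figures are purely illustrative placeholders, not benchmark results:

```python
def perf_per_dollar(avg_fps, price_usd):
    """Average fps per dollar -- higher means better value."""
    return avg_fps / price_usd

# Prices from the post above; fps numbers are made-up placeholders
# to show the calculation, not real benchmark data.
cards = {
    "GTX 970":    (60, 330),
    "GTX 780":    (55, 330),
    "GTX 780 Ti": (62, 440),
}
for name, (fps, price) in sorted(cards.items(),
                                 key=lambda kv: -perf_per_dollar(*kv[1])):
    print(f"{name}: {perf_per_dollar(fps, price):.3f} fps/$")
```

Swap in real average-fps numbers from reviews of the games you actually play before drawing conclusions.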


----------



## pdasterly

A 970 won't beat a 780 Ti; the 980 does. The game will change when the R9 3xx drops. Actually, the GTX 980 officially killed (discontinued) the 780 Ti.

I'm not up to date on pricing, but if the 780 is $330 and the 780 Ti is $440, AMD's pricing is correct. I purchased my card brand new for $400 almost 6 months ago. Purchased a used 290X for $300, but that's another story, because the seller I got it from gave me a nice discount for being a return customer.


----------



## Sgt Bilko

Quote:


> Originally Posted by *boot318*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pdasterly*
> 
> Sounds like we have same problem, unfortunately you wont't find a more powerful card for a better price aka 780ti or gtx 880. *price/ performance value, the 290x wins hands down*
> 
> 
> 
> ? I must be really tired. Price/performance and the R9 290X don't belong in the same sentence. For cards over $300, Nvidia is killing AMD. The 780 is $330, and a 780 Ti can be had for $440. The 970 offers similar performance for $330. A lot of 290X models are the same price as the 980. AMD needs to adjust their prices.
Click to expand...

A decent 970 here is $500+, while the most expensive R9 290 is at most $500... US prices aren't indicative of everywhere.

Not even gonna mention what a 780 Ti or 980 costs here


----------



## pdasterly

I know, crazy prices down under; you would think shipping from China would be cheaper for you. I recently gave away some free brackets to a gentleman in Australia, and even though I was giving the parts away for free, the shipping was murder.


----------



## Klocek001

Just checked: the 290X and GTX 970 are the same price in Poland (around 1400 PLN for a new one). If there is any difference, the 290X is 50 PLN cheaper (I can find new ones for 1350, also a bunch of new VTX ones for 1300, but I wouldn't buy those). The cheapest used R9 290 Tri-X cards (mind you, not the reference ones) are 980 PLN, and the invoice says they're from June 2014! If that ain't a bargain....


----------



## Sgt Bilko

Quote:


> Originally Posted by *pdasterly*
> 
> I know, crazy prices down under; you would think shipping from China would be cheaper for you. I recently gave away some free brackets to a gentleman in Australia, and even though I was giving the parts away for free, the shipping was murder.


Yeah, prices here go up a bit, but oh well. In case you were wondering, it's $700 for a 780 Ti (custom cooler) and $700 for a 980 (reference).

AMD is traditionally cheaper here, so it's usually the way I go with my cards.


----------



## th3illusiveman

Quote:


> Originally Posted by *Sgt Bilko*
> 
> yeah, prices here go up a bit but oh well, in case you were wondering it's $700 for a 780Ti (custom cooler) and $700 for a 980 (reference)
> 
> AMD is traditionally cheaper here so it's usually the way i go with my cards


Here prices are also higher than in the States (usually by $30-50), but I see a 780 Ti for $440 brand new... and that's a great deal. AMD's 290Xs are still $450+.


----------



## pdasterly

A 780 Ti for $440 is a great deal, hands down. That's also the used price; new they are still $550-ish, $575. Having had duds on both sides (AMD & Nvidia), I would purchase new from an authorized retailer so the warranty would be in effect. Both my cards blew out, and without warranty I would have been SOL. Knock on some wood.


----------



## Noufel

Here in Algeria I can get an MSI 290 Gaming for 300€, but a reference 780 is 400€ and a reference 780 Ti around 580€, and the 9xx series isn't even here; Nvidia is really expensive here. My 290 Tri-X crossfire is roughly equal to a 970 SLI (sometimes a 5-10 fps difference), so I have no urge to upgrade. I'll wait for the R9 390X or 980 Ti and see.


----------



## amptechnow

Hey guys, some help would be greatly appreciated.

I have an XFX reference card I've put under water, and I have been running the PT1 BIOS on it for months with great results. I got a PowerColor PCS+ and tried to flash the PT1 BIOS to it, and after restarting, the computer was shutting down when logging on to Windows. I reflashed, and now every time I try to install the drivers, no matter what version, the computer BSODs halfway through. I want to say the error is page fault not handled (atikmag) or something along those lines. If I put the PowerColor on the other BIOS position I can install drivers fine. Could this mean the other BIOS is toast and unusable? I've reflashed in DOS from a boot disk and it still BSODs.


----------



## boot318

Quote:


> Originally Posted by *pdasterly*
> 
> A 780 Ti for $440 is a great deal, hands down. That's also the used price; new they are still $550-ish, $575. Having had duds on both sides (AMD & Nvidia), I would purchase new from an authorized retailer so the warranty would be in effect. Both my cards blew out, and without warranty I would have been SOL. Knock on some wood.


http://www.newegg.com/Product/Product.aspx?Item=N82E16814125493&cm_re=gtx_780_ti-_-14-125-493-_-Product

I probably would skip that model, but that is just to show they are around that price. Some 290Xs are about the same price after MIR also.


----------



## pdasterly

That's the best price I've seen; it even comes with a game.


----------



## pdasterly

Quote:


> Originally Posted by *amptechnow*
> 
> Hey guys, some help would be greatly appreciated.
> 
> I have an XFX reference card I've put under water, and I have been running the PT1 BIOS on it for months with great results. I got a PowerColor PCS+ and tried to flash the PT1 BIOS to it, and after restarting, the computer was shutting down when logging on to Windows. I reflashed, and now every time I try to install the drivers, no matter what version, the computer BSODs halfway through. I want to say the error is page fault not handled (atikmag) or something along those lines. If I put the PowerColor on the other BIOS position I can install drivers fine. Could this mean the other BIOS is toast and unusable? I've reflashed in DOS from a boot disk and it still BSODs.


Run DDU, then boot into safe mode and delete the video adapters in Device Manager, then install one card at a time.


----------



## rdr09

Quote:


> Originally Posted by *boot318*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814125493&cm_re=gtx_780_ti-_-14-125-493-_-Product
> 
> I probably would skip that model but that is just to show they are around that price. Some 290x's are about the same price after MIR also.


for the longest time i was made to believe these Ti's all run at 1300 core and above. now that the 900 series is out, they are saying these things can only go as far as 1200 core for daily use. believe it or not, a Ti at 1200 core is not even any faster than a 290 at the same clocks. it can easily get beat by a 290X at that clock. then, you have that 3GB of VRAM.

at this stage, i don't know why anyone would consider buying last-gen new from either camp. unless one needs a 4-or-more-monitor setup or plays games optimized for amd, the 900 series is the way to go. i play BF4, so i have to stick to AMD. besides, my 290 at 1200 is roughly as fast as a 970 at 1500.


----------



## pdasterly

Same clocks mean nothing if it's not stable. I run 5760x1080 on three monitors, so I need all the hp I can get. Actually considering getting a 295X2 (my mobo only has two PCI-E slots). New tech will always (most of the time) beat old tech. I didn't even consider the 780 Ti due to its price; when I purchased my 290X the 780 Ti was twice as much, and I ended up with two 290Xs for the price of one 780 Ti ($800). This GPU war is favoring the consumer this go-around. Waiting for the R9 3xx series; hopefully it will use the same waterblock as the 290X, but that's doubtful.

edit: How about the 8GB 290X for the win?


----------



## rdr09

Quote:


> Originally Posted by *pdasterly*
> 
> same clocks mean nothing if its not stable. I run 5780x1080p on three monitors so I need all the hp i can get. actually considering getting a 295x2(my mobo only has two pci-e slots. New tech will always(most of the time) beats old tech. I didn't even consider the 780ti due to its price, when I purchased my 290x the 780ti was twice as much and I ended up with 2 290x for the price of 1 780ti($800). This gpu war is favoring the consumer this go around, waiting for the r9 3xx series, hopefully it will use the same waterblock as the 290x but that's doubtful
> 
> edit: How about the 8gb 290X for the win?


even if the Ti is cheaper . . .


----------



## Raephen

Quote:


> Originally Posted by *Klocek001*
> 
> Anyone here who's been running 290 or 290X for more than a year without having to RMA once ?


Me!


----------



## th3illusiveman

Quote:


> Originally Posted by *rdr09*
> 
> for the longest time i was made to believe these Ti's all run at 1300 core and above. now, that the 900 series are out, they are saying these things can only go as far as 1200 core for daily use. believe it or not, a Ti at 1200 core is not even any faster than a 290 at same clocks. it can easily get beat by a 290X at that clock. then, you have that 3GB VRAM.
> 
> at this stage, i don't know why anyone would consider buying old gen new at both camps. unless one needs 4 or more monitors setup or plays games optimized for amd, the 900 series are the way to go. i play BF4, so have to stick to AMD. besides, my 290 at 1200 is roughly as fast as a 970 at 1500.


I still believe the 780 Ti is faster than the 980 when it's overclocked. A 1250MHz Ti will give a 1500MHz 980 a run for its money. At $440 it's a great deal.


----------



## rdr09

Quote:


> Originally Posted by *th3illusiveman*
> 
> I still believe the 780 Ti is faster than the 980 when it's overclocked. A 1250MHz Ti will give a 1500MHz 980 a run for its money. At $440 it's a great deal.


th, you've been around ocn as long as i have. you know fps is not everything.



i agree with you, though, especially the King. wait till the 980 Ti comes out if ever.


----------



## pdasterly

The 980 will be faster when drivers mature.
For my setup, minimum fps is most important.


----------



## Gobigorgohome

I am really excited for the R9 3xx GPUs. I got a good deal on my four cards brand new, and they deliver good performance too. I will not consider buying GPUs for more than 550 USD while I own these cards. For 4K gaming my cards will probably be good for a while.


----------



## Offler

http://www.3dmark.com/3dm/4177541

GPU 1125MHz
vRAM 1500MHz

CPU 3900MHz
RAM 1600MHz 7-7-7-18 CR1
NB 2800
HT 2600

It seems that clocking the video RAM to 1500MHz added a lot to the score. Still on that rocket engine of a stock cooler, but it's getting cold here...


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Klocek001*
> 
> Anyone here who's been running 290 or 290X for more than a year without having to RMA once ?
> 
> 
> 
> R9 290 Crossfire since 31st December last year, no issues here
Click to expand...

Mine since 13th December last year. No issue. Overclocked.


----------



## Widde

Quote:


> Originally Posted by *kizwan*
> 
> Mine since 13th December last year. No issue. Overclocked.


Running steady since November last year ^^ (crossfire)


----------



## pkrexer

Ugh, getting close to making the top 100 list on FS Extreme. I know my GPUs can individually do 1280 core with +200mV, but I don't think my power supply could handle it.

http://www.3dmark.com/fs/2841277


----------



## Devildog83

Quote:


> Originally Posted by *Buehlar*
> 
> Is the CPU bottle necking during the run?


It must be; the cores don't all hit 100% during Fire Strike, only 5 do, and the other 3 only use about 50%.


----------



## Noufel

Quote:


> Originally Posted by *rdr09*
> 
> for the longest time i was made to believe these Ti's all run at 1300 core and above. now, that the 900 series are out, they are saying these things can only go as far as 1200 core for daily use. believe it or not, a Ti at 1200 core is not even any faster than a 290 at same clocks. it can easily get beat by a 290X at that clock. then, you have that 3GB VRAM.
> 
> at this stage, i don't know why anyone would consider buying old gen new at both camps. unless one needs 4 or more monitors setup or plays games optimized for amd, the 900 series are the way to go. i play BF4, so have to stick to AMD. besides, my 290 at 1200 is roughly as fast as a 970 at 1500.


What voltage do you add for 1200, and is it your daily overclock?


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> for the longest time i was made to believe these Ti's all run at 1300 core and above. now, that the 900 series are out, they are saying these things can only go as far as 1200 core for daily use. believe it or not, a Ti at 1200 core is not even any faster than a 290 at same clocks. it can easily get beat by a 290X at that clock. then, you have that 3GB VRAM.
> 
> at this stage, i don't know why anyone would consider buying old gen new at both camps. unless one needs 4 or more monitors setup or plays games optimized for amd, the 900 series are the way to go. i play BF4, so have to stick to AMD. besides, my 290 at 1200 is roughly as fast as a 970 at 1500.


Yeah, that's OCN for you. They take someone else's OC and just use it as a reference. My 290X does 1200/1500 under water with +100mV, while the 290 does 1200/1500 with +125mV. You can't really keep these overclocks stable on air because they are temp-sensitive overclocks, but under water they are 100% stable. I have my cards at stock for daily use. It's true a GTX 780 Ti @ 1200MHz will be slower than a 290X @ 1200MHz. I also don't really care about the power these cards use; 95% of the day they are idle. Even if I played 2 hours a day, it's like 10% of the day being used fully.


----------



## subyman

Quote:


> Originally Posted by *rdr09*
> 
> for the longest time i was made to believe these Ti's all run at 1300 core and above. now, that the 900 series are out, they are saying these things can only go as far as 1200 core for daily use. believe it or not, a Ti at 1200 core is not even any faster than a 290 at same clocks. it can easily get beat by a 290X at that clock. then, you have that 3GB VRAM.
> 
> at this stage, i don't know why anyone would consider buying old gen new at both camps. unless one needs 4 or more monitors setup or plays games optimized for amd, the 900 series are the way to go. i play BF4, so have to stick to AMD. besides, my 290 at 1200 is roughly as fast as a 970 at 1500.


It's called cognitive dissonance. When the Ti was top end, people said 1300MHz all day long to justify their purchase. When the 9xx cards come out and they are looking for something new, then that "old crappy" Ti can only do 1200MHz daily, so they can justify the 9xx.


----------



## ZealotKi11er

Quote:


> Originally Posted by *subyman*
> 
> It's called cognitive dissonance. When the TI are top end, people say 1300mhz all day long to justify their purchase. When 9XX cards come out and they are looking for something new, then that "old crappy" TI can only do 1200mhz daily so they can justify the 9XX.


These cards are so fast that overclocking is pointless if you have 2 or more cards. Yes, they will score high in benchmarks and get more fps in a game that needs the power, but who plays Crysis 3 all day? In BF4 I get way too much fps @ 1440p with 2 cards. I see no reason why I have to OC my cards, really.


----------



## rdr09

Quote:


> Originally Posted by *Noufel*
> 
> What voltage do you add for 1200 and is it your daily overclock ?


+75 for 1200 core. i play at stock.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah thats OCN for you. They take someones else OC and just use it as reference. My 290X does 1200/1500 under water with +100mV while 290 does 1200/1500 +125mV. You cant really keep these overclocks stable in air because they are temp sensitive overclocks but under water they are 100% stable. I have my cards @ Stock for daily use. Its true a GTX780 Ti @ 1200MHz will be slower then 290X @ 1200MHz. I also dont really care about power these cards use. 95% of the day they are idle. Even is i played 2 hours a day its like 10% of the day being used fully.


just basing that off some benchmarks that might not even be a true representation of gaming.

Quote:


> Originally Posted by *subyman*
> 
> It's called cognitive dissonance. When the TI are top end, people say 1300mhz all day long to justify their purchase. When 9XX cards come out and they are looking for something new, then that "old crappy" TI can only do 1200mhz daily so they can justify the 9XX.


could very well be 'cause i still can't believe it.


----------



## Buehlar

Quote:


> Originally Posted by *Devildog83*
> 
> It must be; the cores don't all hit 100% during Fire Strike, only 5 do, and the other 3 only use about 50%.


They should all be maxed at 100% if it's bottlenecking, unless it's throttling for some reason. Is it the same 3 cores every run? What are the temps on those 3 cores?

If everything is OK, then the only thing I can think of is power fluctuating during peak system draw. Most systems will run just fine with your specs, clocks and PSU for everyday tasks, gaming, etc., but if you're really into constantly running synthetic benchmarks on a daily basis, then you should really look into getting a PSU with a little more headroom, IMO.
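The PSU-headroom point can be sanity-checked with some rough arithmetic: add up worst-case component draw and compare it against the fraction of the PSU's rating you want to stay under. A minimal sketch; every wattage below is an illustrative guess, not a measurement, so substitute your own parts' figures:

```python
def psu_headroom(psu_watts, component_watts, target_load=0.8):
    """Compare estimated total draw against the budget you want to
    stay under (sustained loads near 100% of the PSU rating leave
    no margin for benchmark transients)."""
    draw = sum(component_watts.values())
    budget = psu_watts * target_load
    return draw, budget, draw <= budget

# Illustrative guesses only -- check your own hardware's specs.
parts = {"overclocked CPU": 220, "overclocked 290": 300, "rest of system": 100}
draw, budget, ok = psu_headroom(750, parts)
print(draw, budget, ok)  # → 620 600.0 False: over budget at benchmark peaks
```

A `False` here doesn't mean the PSU will fail, only that synthetic benchmark peaks would push it past the chosen comfort margin.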


----------



## battleaxe

I actually run mine at only 800MHz on the core. But I'm crossfiring a pair of 290s at only 1080p, which is kinda stupid, but it's where I am right now. I use -50 on the voltage too; it does help bring down the temps. At some point I will get another 1080p monitor, and then I will have surround. But right now I figure running this low on the core is better long term, uses less energy, and I still get ridiculous frame rates.


----------



## bond32

Quote:


> Originally Posted by *ZealotKi11er*
> 
> These cards are so fast that overclocking is pointless if you have 2 or more cards. Yes they will score high in benchmarks and get more fps in a game that needs the power but who plays Crysis 3 all day. In BF4 i get way too much fps @ 1440p with 2 cards. I see no reason why i have to OC my cards really.


What sorts of settings do you run with? At all maxed ultra, 1440p with 3 cards, my framerate bounces around 90-160 at 100% SS. I can crank the SS to 150% (75% of 4K res) and the framerate will drop under 100 fps. At 200%, I thought mine was around 70-80 fps. This is at 1440p 120Hz.


----------



## Noufel

I also use a 1080p screen, but a 120Hz one, so I can justify my 290 crossfire; I can't justify overclocking them for gaming, though, only for benching.


----------



## ZealotKi11er

Quote:


> Originally Posted by *bond32*
> 
> What sorts of settings do you run with? At all maxed ultra, 1440p with 3 cards my framerate bounces around 90-160 at 100% SS. I can crank that SS to 150% (75% of 4k res) and the framerate will drop under 100 fps. At 200%, I thought mine was aroun 70-80 fps. This is at 1440p 120hz.


1440p Ultra 0x MSAA, 0x FXAA, 150% Resolution Scaling = 4K.

I get ~60-80 fps. With an OC I get an extra 15-20% better performance, but only with these settings. I use DX11 because Mantle is not so good @ 4K.

1080p @ 200% = 4K
1440p @ 150% = 4K
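Those two equivalences follow from the scale applying to each axis, so the pixel count grows with the square of the scale. A quick sketch of the arithmetic:

```python
def scaled_resolution(width, height, scale_pct):
    """Resolution scaling as described above: the percentage is
    applied per axis, so 150% means 1.5x width and 1.5x height
    (2.25x the pixels)."""
    factor = scale_pct / 100.0
    return round(width * factor), round(height * factor)

print(scaled_resolution(1920, 1080, 200))  # → (3840, 2160): 1080p @ 200% = 4K
print(scaled_resolution(2560, 1440, 150))  # → (3840, 2160): 1440p @ 150% = 4K
```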


----------



## amptechnow

Quote:


> Originally Posted by *pdasterly*
> 
> Run DDU, then boot into safe mode and delete the video adapters in Device Manager, then install one card at a time.


Tried this and was able to install drivers with one card. Shut down and installed the other card, and as soon as I log on to Windows the desktop pops up and then it blue screens, but I can't read the text because it's all white lines across the screen, like the text is all distorted.


----------



## zealord

What do you guys think is a safe memory overclock for a default voltage 290X? (Hynix memory)

1500MHz memory clock should be fine, shouldn't it?


----------



## Offler

Just yesterday I started to OC the memory clock.

I stopped at 1625MHz, but I didn't check which type of memory is on the card. I believe this result is also due to the fact that the back of the card gets extra cooling: the rear exhaust fan is reversed, and its airflow collides with the flow coming from the CPU cooler. This cools down the CPU VRM, and the card itself definitely benefits from this airflow.

I increased VDDC up to +100, but later I had trouble booting properly.

1500MHz was fine even at stock, but only after I adjusted the cooling.


----------



## ZealotKi11er

Quote:


> Originally Posted by *zealord*
> 
> What do you guys think is a safe memory overclock for a default voltage 290X? (Hynix memory)
> 
> A 1500MHz memory clock should be fine, shouldn't it?


The memory on all reference 290 and 290X cards is rated for 6GHz effective (1500MHz). The problem is that the memory controller, which is tied directly to vCore, is under-volted. To get 1500MHz on both of my Hynix cards I need +25mV.
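The "6GHz" figure is the effective GDDR5 data rate, which is 4x the memory clock shown in overclocking tools. A quick sketch of the conversion, assuming the 512-bit bus of the reference 290/290X:

```python
# GDDR5 transfers 4 bits per pin per memory clock, so the
# effective data rate is 4x the clock reported by OC tools.
def gddr5_bandwidth(mem_clock_mhz, bus_width_bits=512):
    effective_gbps = mem_clock_mhz * 4 / 1000            # per-pin Gbps
    bandwidth_gbs = effective_gbps * bus_width_bits / 8  # total GB/s
    return effective_gbps, bandwidth_gbs

print(gddr5_bandwidth(1500))  # stock rating: 6.0 Gbps, 384 GB/s
print(gddr5_bandwidth(1625))  # a 1625 MHz OC: 6.5 Gbps, 416 GB/s
```

So a 1500 → 1625MHz memory OC is only about an 8% bandwidth bump, which helps explain why memory OCs show small gains in many benchmarks.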


----------



## zealord

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The memory on all reference 290 and 290X cards is rated for 6GHz effective (1500MHz). The problem is that the memory controller, which is tied directly to vCore, is under-volted. To get 1500MHz on both of my Hynix cards I need +25mV.


Oh, forgot to mention that it is a Sapphire Tri-X 290X; don't know if that changes much. I haven't had a Radeon card in a while.


----------



## BradleyW

I was lucky enough to play FC4, AC 5, Alien, Witcher 3, Borderlands 3, Dying Light, BF Hardline and The Evil Within at EuroGamer Expo London today.

Without disclosing information, what I will say is this: on some of the titles listed above, AMD GPUs are going to struggle. (I'm sure you can guess which from that list.) I can't say how and why I think this, because that would mean referring to specific details which are indeed private. This is just an opinion; take it or leave it.


----------



## zealord

I love green tea and gonna find a good overclock for my 290X


----------



## BradleyW

Quote:


> Originally Posted by *zealord*
> 
> you are having me worried man
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well, the games are not released yet and we have no drivers for them. The games you mentioned are too big to run super poorly on AMD cards
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It probably will be all sorted when they release


I hope you're right, but like I said, I have a bucket full of points I can't express which would convince anyone towards a certain point of view. Again, I can't discuss that point of view in detail, nor can I refer to any technical information about the game and specific hardware. I'm very concerned. It might be worthwhile jumping to GTX 980 SLI; maybe that suggestion hints towards things. AMD might pull through on the titles! I am confident in all but 2 of the titles. Let's hope these game samples are very early builds. After all, we are not far from release dates.

If AMD lets me down, I'm going to save up and do a rebuild after Christmas using Haswell-E 8 + a rethink on the GPU situation. Please AMD, with all your might and power, please pull through!!!!


----------



## zealord

Quote:


> Originally Posted by *BradleyW*
> 
> I hope you're right, but like I said, I have a bucket full of points I can't express which would convince anyone towards a certain point of view. Again, I can't discuss that point of view in detail, nor can I refer to any technical information about the game and specific hardware. I'm very concerned. It might be worthwhile jumping to GTX 980 SLI; maybe that suggestion hints towards things. AMD might pull through on the titles! I am confident in all but 2 of the titles.


I hope not. I just got a 290X like 5 days ago









Well, Far Cry 4 and AC Unity have some PC/Nvidia-only features that we already know of. Borderlands has PhysX, and Witcher 3 as well? Fur PhysX or something.
I could deal with things missing like Nvidia GameWorks or whatever it's called, as long as Radeon cards do not perform poorly or stutter lol


----------



## BradleyW

Quote:


> Originally Posted by *zealord*
> 
> I hope not. I just got a 290X like 5 days ago
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well, Far Cry 4 and AC Unity have some PC/Nvidia-only features that we already know of. Borderlands has PhysX, and Witcher 3 as well? Fur PhysX or something.
> I could deal with things missing like Nvidia GameWorks or whatever it's called, as long as Radeon cards do not perform poorly or stutter lol


The following comments are not related to my experience at EuroGamer; they are opinions I've expressed beforehand on this forum:

My concern with AC is poor CPU optimization. If that is done correctly, AMD CFX will have a chance to shine!
Also, remember FC3's DX11 issues on AMD? Well, FC4 will most certainly be using DX11. So...... (can't go into details)
Yup, PhysX will be a no-go for Borderlands. We can all guess that based on BL2!
Witcher 3 might be OK on AMD with certain unnoticeable, useless features turned off. However, if they are using Nvidia's new lighting technologies on top of PhysX and other stuff, that could all be tricky.


----------



## sinnedone

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The memory on all reference 290 and 290X cards is rated for 6GHz effective (1500MHz). The problem is that the memory controller, which is tied directly to vCore, is under-volted. To get 1500MHz on both of my Hynix cards I need +25mV.


Thank you for this, I did not know that. I have 2 reference R9 290s, one with Hynix memory and one with Elpida, so that will definitely help me when it comes to figuring out overclocks.


----------



## ZealotKi11er

Quote:


> Originally Posted by *BradleyW*
> 
> I was lucky enough to play FC4, AC 5, Alien, Witcher 3, Borderlands 3, Dying Light, BF Hardline and The Evil Within at EuroGamer Expo London today.
> 
> Without disclosing information, what I will say is this: on some of the titles listed above, AMD GPUs are going to struggle. (I'm sure you can guess which from that list.) I can't say how and why I think this, because that would mean referring to specific details which are indeed private. This is just an opinion; take it or leave it.


From those games:

FC4 - FC3 ran garbage on AMD cards in CF.
AC 5 - Garbage on AMD cards for sure.
Alien - Don't really care.
Witcher 3 - This better run, but I think it's an Nvidia title.
BL3 - Nvidia game.
BF - We have an idea of how this game runs already.
Dying Light - I'd like to play this game.


----------



## bbond007

Quote:


> Originally Posted by *battleaxe*
> 
> I actually run mine at only 800MHz on core. But I'm crossfiring a pair of 290s at only 1080p, which is kinda stupid, but it's where I am right now. I use -50 on the voltage too; it does help bring down the temps. At some point I will get another 1080p monitor, then I will have surround. But right now I figure running this low on core is better long term, uses less energy, and I still get ridiculous frame rates.


I have 2 machines with R9 290X crossfire.

I do the same thing with under-clocking and the voltage, to bring down the noise.

The machine with the Lightning as the primary GPU does a little better, and I go 900MHz.

I have noticed that to underclock I have had to enable unofficial overclocking mode (with PowerPlay) in Afterburner. This happened after the 14.4 driver.

One is an X-Star 1440p and the other is 5760x1080.

Even under-clocked these things handle these resolutions fine, and I like the quiet.
Quote:


> Originally Posted by *Klocek001*
> 
> Anyone here who's been running 290 or 290X for more than a year without having to RMA once ?


I had been mining on most of these cards for close to that.

I've had to replace a lot of MSI Windforce R9 290Xs and just one MSI R9 290X Gamer, and that was for a non-working DisplayPort; it otherwise worked reliably. I'd say it varies by brand. The ones that developed problems seemed to do so within the first hours to first month of operation.

I know, mining is bad, blah blah...

All I can say to that is that I weeded out the bad boards that would have died anyway. I mined enough on each of those boards to get a GTX 970 or 980 if I decide the grass is greener.


----------



## Devildog83

Quote:


> Originally Posted by *Buehlar*
> 
> They all should be maxed 100% if it's bottlenecking, unless it's throttling for some reason. Is it the same 3 cores for every run? What are the temps on those 3 cores?
> 
> If everything is ok then the only thing I can think of is power fluctuating during peak system power draw. Most systems will run just fine with your specs, clocks and PSU for everyday tasks and gaming etc. BUT if you're really into constantly running synthetic benchmarks on a daily basis, then you should really look into getting a PSU with a little more headroom IMO.


I don't have the AMD stuff in anymore, I pulled it and it's going up for sale. I was just seeing if I could figure it out before I pulled it. I did raise the HT volts up and gained over 100 points but still not good. I have my 3258k and Z97 board in until I get a 5820k and DDR4 to go in my X99 board.

By the way, the card only pulled about 180W at 100%, and with the CPU maybe 250W max, so I think 660W is enough. Max, my full system only drew 490W from the wall at full load. With X-Fire I could see it, but I am good now that I have only 1 card. When I get 2 I will be upgrading.


----------



## Red1776

A herd of MSI R290X's


----------



## amptechnow

My XFX reference under water has been fine, no RMA since these cards were released. My PCS+ has been going good since they came out, and just now BIOS #2 seems to be acting up, but the other BIOS still runs fine; this could be my fault though and not the card. So 2 cards here, no RMAs. Unlike my MSI GTX 660s, which I've been RMA'ing for 2 years now and still can't get 2 working cards from them.


----------



## rdr09

Quote:


> Originally Posted by *amptechnow*
> 
> My XFX reference under water has been fine, no RMA since these cards were released. My PCS+ has been going good since they came out, and just now BIOS #2 seems to be acting up, but the other BIOS still runs fine; this could be my fault though and not the card. So 2 cards here, no RMAs. Unlike my MSI GTX 660s, which I've been RMA'ing for 2 years now and still can't get 2 working cards from them.


How's the i7 handling those cards? You still have it OC'ed to 4.9GHz?

Anyway, I've had my 290 close to a year now and no RMA either. Drivers are slow to develop, but what can you do? Single card, single monitor, not a single issue playing games, though.

http://www.3dmark.com/compare/3dm11/7740014/3dm11/8701258


----------



## chronicfx

Quote:


> Originally Posted by *Red1776*
> 
> 
> 
> 
> A herd of MSI R290X's


That is gorgeous red!


----------



## Gobigorgohome

Quote:


> Originally Posted by *Red1776*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> A herd of MSI R290X's


Nice setup Red1776, white and black (two tone) CM Cosmos 2 and 4x R9 290X.







Maybe I'll do something similar to this with my system.









How do those MSI Gaming R9 290Xs compare to the MSI Lightning R9 290X?


----------



## Nevk

Quote:


> Originally Posted by *Red1776*
> 
> 
> 
> 
> A herd of MSI R290X's


So beautiful


----------



## Buehlar

Quote:


> Originally Posted by *Devildog83*
> 
> I don't have the AMD stuff in anymore, I pulled it and it's going up for sale. I was just seeing if I could figure it out before I pulled it. I did raise the HT volts up and gained over 100 points but still not good. I have my 3258k and Z97 board in until I get a 5820k and DDR4 to go in my X99 board.
> 
> By the way, the card only pulled about 180w at 100% and with the CPU maybe 250 max I think 660w is enough. Max my full system only drew 490w from the wall at full load. With X-Fire I could see it but I am good now that I have only 1 card. When I get 2 I will be upgrading.


My bad...thought you had 290's x-fired


----------



## ZealotKi11er

With MSI AB you can only get a 1625MHz memory OC. Is there any way to increase the limit? Do I have to flash a different BIOS or use a different tool?


----------



## thrgk

How can you get AB to allow +200mV voltage like Trixx? Trixx has no memory voltage control, otherwise I'd be fine. How can I get +200mV and memory voltage control all in one program?


----------



## sugarhell

Quote:


> Originally Posted by *thrgk*
> 
> How can you get AB to allow +200mV voltage like Trixx? Trixx has no memory voltage control, otherwise I'd be fine. How can I get +200mV and memory voltage control all in one program?


Nothing has memory voltage...Except 290x lightning


----------



## thrgk

Quote:


> Originally Posted by *sugarhell*
> 
> Nothing has memory voltage...Except 290x lightning


My 290X in AB has aux voltage. Can I get AB to allow +200mV on core voltage, or is that only with Trixx, forfeiting aux voltage?


----------



## sugarhell

Quote:


> Originally Posted by *thrgk*
> 
> My 290X in AB has aux voltage. Can I get AB to allow +200mV on core voltage, or is that only with Trixx, forfeiting aux voltage?


aux voltage != memory voltage


----------



## Wezzor

Hey guys!
My card is currently running 1100/1400 with a +20 power limit. If I increase either the core clock or the memory clock I start seeing artifacts running Valley, so I guess to push the card harder I need to increase the core voltage. My question is: should I be happy with the clocks I have now, or should I try increasing the core voltage and pushing the card even more? I currently get around 68-75C in heavy games maxed. I could also add that I'm not watercooling my card.


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> aux voltage != memory voltage


True!
AUX relates to the power distributed from the PCI-E slot. (90% certain that's true.)
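For context on what each power source is rated to supply, the PCI-E spec numbers are: 75W from the slot, 75W per 6-pin connector, 150W per 8-pin. A rough sketch of the nominal limit for a reference 290/290X (6+8 pin layout); these are spec ratings, not measured draw, and heavily overclocked cards can exceed them:

```python
# Nominal PCI-E power budget (spec ratings, not actual draw).
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def spec_power_limit(six_pins, eight_pins):
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(spec_power_limit(1, 1))  # reference 290/290X, 6+8 pin: 300 W
```

Which is why upping AUX mostly matters when you're pushing well past stock power targets.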


----------



## sugarhell

Quote:


> Originally Posted by *BradleyW*
> 
> True!
> AUX relates to the power distributed from the PCI-E slot. (90% certain that's true.)


Yeah its true


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> Yeah its true


What's the benefit of upping that voltage? Maybe if you exceed the wattage from the PCI-E power cables?


----------



## kizwan

Quote:


> Originally Posted by *Wezzor*
> 
> Hey guys!
> My card is currently running 1100/1400 with a +20 power limit. If I increase either the core clock or the memory clock I start seeing artifacts running Valley, so I guess to push the card harder I need to increase the core voltage. My question is: should I be happy with the clocks I have now, or should I try increasing the core voltage and pushing the card even more? I currently get around 68-75C in heavy games maxed. I could also add that I'm not watercooling my card.


Are you happy at those clocks? Can you play games at those clocks?


----------



## Wezzor

Quote:


> Originally Posted by *kizwan*
> 
> Are you happy at those clocks? Can you play games at those clocks?


Well, it's my first time overclocking, so I guess I'm pretty happy; I don't really know what these cards are capable of on air. I haven't struggled with any game that I've played so far except for Watch Dogs, but I guess that's just very badly optimized.


----------



## thrgk

Anyone else feel like BF4 is laggy with 4 290Xs? Like, my fps is great, but it's just weird, idk, kinda laggy.


----------



## BradleyW

Quote:


> Originally Posted by *thrgk*
> 
> Anyone else feel like BF4 is laggy with 4 290Xs? Like, my fps is great, but it's just weird, idk, kinda laggy.


Looks like a frame latency issue. Cap your fps and you'll be fine.


----------



## ZealotKi11er

Quote:


> Originally Posted by *thrgk*
> 
> Anyone else feel like BF4 is laggy with 4 290Xs? Like, my fps is great, but it's just weird, idk, kinda laggy.


Why do you have 4? BF4 with 290X + 290 is butter smooth for me. Try DX11 if you're using Mantle.


----------



## thrgk

Quote:


> Originally Posted by *BradleyW*
> 
> Looks like a frame latency issue. Cap your fps and you'll be fine.


Hmm, how can I do that? Just turn on vsync, or?


----------



## BradleyW

Quote:


> Originally Posted by *thrgk*
> 
> Hmm, how can I do that? Just turn on vsync, or?


Use an external FPS limiter, or cap the frame rate with an in-game console command (in BF4 that's gametime.maxvariablefps).


----------



## thrgk

Quote:


> Originally Posted by *BradleyW*
> 
> Use an external FPS limiter, or cap the frame rate with an in-game console command (in BF4 that's gametime.maxvariablefps).


Thanks, I'll try that.

+rep!


----------



## amptechnow

Quote:


> Originally Posted by *rdr09*
> 
> How's the i7 handling those cards? You still have it OC'ed to 4.9GHz?
> 
> Anyway, I've had my 290 close to a year now and no RMA either. Drivers are slow to develop, but what can you do? Single card, single monitor, not a single issue playing games, though.
> 
> http://www.3dmark.com/compare/3dm11/7740014/3dm11/8701258


Yeah, I have the i7 at 4.8 for 24/7 use and gaming, and it does OK. It bottlenecks slightly for a second or two in certain games, but that's very rare, and not to an unplayable point or anything. Depending on the game I try to use supersampling to work the cards more, but I always run on ultra, max everything quality-wise, and try to keep it at 144 fps, or 120 if I use the LightBoost strobe. I'd like to switch to 1440p or 4K soon, and I think the cards will get a better workout. Overall I am very happy with the 3770K and the CF R9 290s. I'll do a run here in a sec and post it for you. Yeah, I wish drivers would hurry up and get a little better; I was getting great scores, then tried the 14.7 RC3, which seems to drop them a little, but games run good.


----------



## Ironsmack

Like to add myself to the owners list as well. Thanks!

(2) XFX 290 w/ XSPC WB & backplate.


----------



## Wezzor

Why can't I increase my core voltage? This is how it looks. It seems locked somehow.








EDIT: Nvm, I found out the option to change it.


----------



## amptechnow

Quote:


> Originally Posted by *rdr09*
> 
> How's the i7 handling those cards? You still have it OC'ed to 4.9GHz?
> 
> Anyway, I've had my 290 close to a year now and no RMA either. Drivers are slow to develop, but what can you do? Single card, single monitor, not a single issue playing games, though.
> 
> http://www.3dmark.com/compare/3dm11/7740014/3dm11/8701258


Here's the score: http://www.3dmark.com/3dm11/8770715


----------



## rdr09

Quote:


> Originally Posted by *Red1776*
> 
> 
> 
> 
> A herd of MSI R290X's


you have it running already, Red? nice!

Quote:


> Originally Posted by *amptechnow*
> 
> Here's the score: http://www.3dmark.com/3dm11/8770715


I see you've got the iGPU enabled. I did it too, recently, and hooked up another monitor as an extension. Mine is only an HD3000 but good enough. I recommend this driver.


----------



## amptechnow

Quote:


> Originally Posted by *rdr09*
> 
> you have it running already, Red? nice!
> I see you've got the iGPU enabled. I did it too, recently, and hooked up another monitor as an extension. Mine is only an HD3000 but good enough. I recommend this driver.


I'll give it a try. I use the iGPU for 2 side monitors; really they are just for work and/or monitoring while gaming.


----------



## ZealotKi11er

My cards will not do 1250MHz even with +200mV. They do score in 3DMark, but I can see artifacts popping in and out sometimes during the run.

Still experimenting with memory speeds. Going to test something other than 3DMark, because going from 1500 to 1600 made no real difference.


----------



## BradleyW

@Red1776
What tubing are you using on that setup?


----------



## amptechnow

14.7 RC3 gives me a little higher scores in both 3DMarks, and in Heaven and Valley. Both new modded drivers had good scores too, just a bit less. Going to try them in games now and see how they do.


----------



## battleaxe

I'm still having an annoying crash to desktop in crossfire. Running a single card it's fine, but in Xfire it crashes to desktop in both BF3 and BF4... any ideas?

I've Googled about it and it seems inconclusive. Lots of issues, but pretty hit or miss really.


----------



## rdr09

Quote:


> Originally Posted by *amptechnow*
> 
> 14.7 RC3 gives me a little higher scores in both 3DMarks, and in Heaven and Valley. Both new modded drivers had good scores too, just a bit less. Going to try them in games now and see how they do.


14.X gives about 100 pts more in Graphics score over 14.7 based on my runs.

Stock . . .

http://www.3dmark.com/compare/fs/2845374/fs/2856567
Quote:


> Originally Posted by *battleaxe*
> 
> I'm still having an annoying crash to desktop in crossfire. Running a single card it's fine, but in Xfire it crashes to desktop in both BF3 and BF4... any ideas?
> 
> I've Googled about it and it seems inconclusive. Lots of issues, but pretty hit or miss really.


What driver are you using, and are your GPUs OC'ed? Wish I could help, but I only have a single 290. I read BF4 is sensitive to OC.


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> 14.X gives about 100 pts more in Graphics score over 14.7 based on my runs.
> 
> Stock . . .
> 
> http://www.3dmark.com/compare/fs/2845374/fs/2856567
> What driver are you using, and are your GPUs OC'ed? Wish I could help, but I only have a single 290. I read BF4 is sensitive to OC.


The issue occurs whether they are OC'd, underclocked, or running stock clocks. It doesn't seem to matter at all what the core or memory is set at. It seems to be a driver issue, or something else is conflicting. I am running the 14.7 drivers; I have tried 14.4 also. Same thing.

I can pass firestrike, 3dMark11 or whatever no problems. Single card runs fine with no issues.


----------



## kizwan

I have two, running in Crossfire. Even I can't help him. I don't know what's wrong with his setup.

@battleaxe, can you share what is the BIOS version on both cards? Did you get any error when it crashed?


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> The issue occurs whether they are OC'd, underclocked, or running stock clocks. It doesn't seem to matter at all what the core or memory is set at. It seems to be a driver issue, or something else is conflicting. I am running the 14.7 drivers; I have tried 14.4 also. Same thing.
> 
> I can pass firestrike, 3dMark11 or whatever no problems. Single card runs fine with no issues.


Try the 14.X I posted in post 30181: disable crossfire, run 14.X to uninstall the existing driver (use the Express method), reboot, run it again to install (Express), reboot, then enable crossfire. At your own risk.









also, update BF4.


----------



## battleaxe

Quote:


> Originally Posted by *kizwan*
> 
> I have two, running in Crossfire. Even I can't help him. I don't know what's wrong with his setup.
> 
> @battleaxe, can you share what is the BIOS version on both cards? Did you get any error when it crashed?


No, it just says BF3 or BF4 has failed to respond and the game crashes to desktop. That's it.

This is a new install of Windows too, only about 2 weeks old. It's doing the same thing as on the old Windows install.

... kinda weird...

Here's the BIOS versions. One is MSI Gaming. Other MSI reference. I haven't touched the BIOS on either.


----------



## BradleyW

@battleaxe

Try this :
http://www.microsoft.com/en-gb/download/details.aspx?id=35


----------



## battleaxe

Quote:


> Originally Posted by *BradleyW*
> 
> @battleaxe
> 
> Try this :
> http://www.microsoft.com/en-gb/download/details.aspx?id=35


It didn't allow it; it said a newer version of DX was detected. Should I uninstall the one I have?

I'm wondering if I should delete the games and completely re-install them... I have used DDU and re-installed the drivers about 100 times, doing it the right way in safe mode and everything.


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> It didn't allow it; it said a newer version of DX was detected. Should I uninstall the one I have?
> 
> I'm wondering if I should delete the games and completely re-install them... I have used DDU and re-installed the drivers about 100 times, doing it the right way in safe mode and everything.


Have you tried repairing the game? Right-click the icon in Origin and choose Repair. BF4 first.


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> Have you tried repairing the game? Right-click the icon in Origin and choose Repair. BF4 first.


Yup, done that too. Several times actually... you see why I'm stumped? lol

Maybe my HDD is going bad or something...?


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> Yup, done that too. Several times actually... you see why I'm stumped? lol
> 
> Maybe my HDD is going bad or something...?


Last suggestion, just for BF4: go to Documents > Battlefield 4.

Move the cache and settings folders to the desktop, then try to play BF4.


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> Last suggestion, just for BF4: go to Documents > Battlefield 4.
> 
> Move the cache and settings folders to the desktop, then try to play BF4.


Well, I thought I was home free and that had fixed it... then it just did it again. Sigh...


----------



## battleaxe

Quote:


> Originally Posted by *battleaxe*
> 
> Well, I thought I was home free and that had fixed it... then it just did it again. Sigh...


The interesting thing is that it happens on BF3 and BF4 only. Maybe I'll try a different browser...


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> The interesting thing is that it happens on BF3 and BF4 only. Maybe I'll try a different browser...


mine defaults to IE.


----------



## kizwan

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I have two, running in Crossfire. Even I can't help him. I don't know what's wrong with his setup.
> 
> @battleaxe, can you share what is the BIOS version on both cards? Did you get any error when it crashed?
> 
> 
> 
> No it just says BF3 or BF4 has failed to respond and the game crashes to desktop. That's it.
> 
> This is a new install of windows too. Only about 2 weeks old. Its doing the same thing as on the old windows install.
> 
> ... kinda weird...
> 
> Here's the BIOS versions. One is MSI Gaming. Other MSI reference. I haven't touched the BIOS on either.

In this situation, when troubleshooting, I would try to get both cards flashed with the same BIOS version: update the "015.039" card to "015.042", i.e. the same BIOS version the other card has.


----------



## Bertovzki

I'm curious. I have done a lot of research online, used a calculator, and know the answer well enough, but I would like to hear some feedback from R9 290 owners specifically.

PSU: by my calculations and research, 1000W is really the safest minimum if you are running 2 290s, but it can be done with 750W no problem.
I have seen a test where 2 cards maxed out pushed an AX860i to near 1000W at the wall without failing, and tests by Jayztwocents and the NCIX guy showing you need a lot less than people think.

My system will be:
1 x R9 290 Tri-X OC (2x GPU later)
CPU: i7 4790K
GA-Z97X Gaming 3 mobo
8GB 1800MHz RAM (16GB later)
XSPC Photon D5 270
2-4 SSDs
1TB HDD
8 x fans
2x Darkside UV 12" strips
Lamptron FCR5 V3 fan controller
NZXT Hue RGB colour controller + its light strip

I would have thought 1200W is best.

----------



## battleaxe

Quote:


> Originally Posted by *kizwan*
> 
> In this situation, when troubleshooting, I would try get both card flashed with same BIOS version. Try update the "015.039" card to "015.042", basically the same BIOS version the "015.042" card have.


Good idea. I hadn't thought of doing that. Could be the solution.


----------



## Ironsmack

Have you updated punkbuster?


----------



## battleaxe

Quote:


> Originally Posted by *Ironsmack*
> 
> Have you updated punkbuster?


Good idea. Will try before flashing the BIOS to match.

Well, I just updated PB manually for both BF3 and BF4. Hopefully, this fixes it. I'll have a chance to test later today. Fingers crossed.

Update:

Updated punkbuster and started using Mozilla firefox for browser. Went through an entire round and not a single crash in Crossfire. Hopefully this fixed it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Bertovzki*
> 
> I'm curious. I have done a lot of research online, used a calculator, and know the answer well enough, but I would like to hear some feedback from R9 290 owners specifically.
> 
> PSU: by my calculations and research, 1000W is really the safest minimum if you are running 2 290s, but it can be done with 750W no problem.
> I have seen a test where 2 cards maxed out pushed an AX860i to near 1000W at the wall without failing, and tests by Jayztwocents and the NCIX guy showing you need a lot less than people think.
> 
> My system will be:
> 1 x R9 290 Tri-X OC (2x GPU later)
> CPU: i7 4790K
> GA-Z97X Gaming 3 mobo
> 8GB 1800MHz RAM (16GB later)
> XSPC Photon D5 270
> 2-4 SSDs
> 1TB HDD
> 8 x fans
> 2x Darkside UV 12" strips
> Lamptron FCR5 V3 fan controller
> NZXT Hue RGB colour controller + its light strip
> 
> I would have thought 1200W is best.


I was benching my 2 x 290s @ +200mV and was hitting 1050W from the wall in 3DMark. +100mV was hitting 870W; stock, ~700W.
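Worth remembering those are wall figures; the PSU rating applies to the DC side, so you scale the wall draw by the unit's efficiency. A sketch assuming ~90% efficiency (a typical Gold-unit ballpark, not a measured value):

```python
# Wall draw includes PSU conversion losses;
# DC load = wall draw * efficiency.
def dc_load(wall_watts, efficiency=0.90):
    return wall_watts * efficiency

for wall in (700, 870, 1050):   # stock, +100mV, +200mV from the post
    print(wall, "W wall ->", round(dc_load(wall)), "W DC")
```

At +200mV, 1050W at the wall is roughly 945W of DC load: well past a 750W unit's rating and uncomfortably close to 1000W, which matches the "1000W safest minimum" estimate above.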


----------



## bond32

If it means anything to you, with a 1300W EVGA G2 and 3 cards, when I push them to over 1.4V it shuts the system down (OCP protection). The power supply cannot go further at that point, meaning it is likely drawing ~1500W from the wall.

The 1300W EVGA G2 would be a perfect PSU for 2 cards if you plan to OC the heck out of them. It really all depends on which PSU you get... make sure it's quality. Just because it claims 1200W doesn't mean it isn't garbage.


----------



## pkrexer

My 860W Seasonic has no issues running my 2x 290Xs @ +100mV. I haven't even tried anything higher than that, though.


----------



## bond32

Quote:


> Originally Posted by *pkrexer*
> 
> My 860W Seasonic has no issues running my 2x 290Xs @ +100mV. I haven't even tried anything higher than that, though.


Of course it does. But when you start pulling 370+ watts per card, I highly doubt that 860W will cut it.


----------



## piquadrat

14.9 is out:
http://support.amd.com/en-us/kb-articles/Pages/AMDCatalyst14-9WINReleaseNotes.aspx
Is it safe to use DDU to uninstall the 14.X drivers? 14.X seems a little different from the previous ones I used on my system (APU origin). I have observed "amdacpusrsvc" errors and warnings in the Event Log of my Windows 7 installation, although the driver itself was working very well. I wonder if DDU will remove all the stuff.


----------



## BradleyW

Quote:


> Originally Posted by *piquadrat*
> 
> 14.9 is out:
> http://support.amd.com/en-us/kb-articles/Pages/AMDCatalyst14-9WINReleaseNotes.aspx
> Is it safe to use DDU to uninstall the 14.X drivers? 14.X seems a little different from the previous ones I used on my system (APU origin). I have observed "amdacpusrsvc" errors and warnings in the Event Log of my Windows 7 installation, although the driver itself was working very well. I wonder if DDU will remove all the stuff.


These drivers are extremely disappointing. Where are the profiles for all the new up and coming games?


----------



## sugarhell

Quote:


> Originally Posted by *BradleyW*
> 
> These drivers are extremely disappointing. Where are the profiles for all the new up and coming games?


Just do a quick search on the 3d profiles...


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> Just do a quick search on the 3d profiles...


Will do!

edit: Found Alien Isolation, but nothing else yet.


----------



## Gobigorgohome

Quote:


> Originally Posted by *bond32*
> 
> Of course it does. But when you start pulling 370+ watts per card, I highly doubt that 860W will cut it.


The only reason to feed that much power to these cards is benchmarking. I'm running stock cards and an overclocked CPU (for 4K gaming), and it seems to be a good combination.


----------



## bond32

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Only reason to feed that much power to these cards is benchmarking, myself running stock cards and overclocked CPU (for 4K gaming) and it seems to be a good combination.


Not saying you shouldn't - because, well, this is OCN - but I highly doubt you needed that fourth card if that's what you're doing. I play BF4 at fully maxed-out settings, supersampled to 4K (at 1440p), and I get a solid 100-140 fps with my 3 cards. Your rig is much more high-end than mine, too.

Even playing Battlefield at a moderate OC with 1.3 V, my three cards will pull 200-250 watts each. Back to the 860 W PSU discussion: that would leave 360 W for everything else. It should work fine, but if you ever wanted to do serious benching you would need a larger PSU.
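For anyone doing the same napkin math, the headroom estimate is just subtraction. Here's a throwaway sketch - the per-card and rest-of-system wattages are rough guesses from the posts above, not measurements:

```python
# Rough PSU headroom estimate for a multi-GPU rig.
# All wattages are ballpark figures, not measurements.

def psu_headroom(psu_watts, num_gpus, watts_per_gpu, rest_of_system=250):
    """Return watts left over after the GPUs and an assumed
    CPU/board/drives budget (default 250 W)."""
    gpu_load = num_gpus * watts_per_gpu
    return psu_watts - gpu_load - rest_of_system

# Three 290s at ~250 W each on an 860 W unit:
print(psu_headroom(860, 3, 250))   # 860 - 750 - 250 = -140 -> undersized
# Two cards at +100 mV (~300 W each) on the same unit:
print(psu_headroom(860, 2, 300))   # 860 - 600 - 250 = 10 -> cutting it close
```

Swap in your own numbers. Keep in mind the wall figures people quote include PSU efficiency losses, so the DC load the PSU actually has to deliver is lower than what the kill-a-watt shows.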


----------



## BradleyW

Regarding drivers, we might get a beta driver in a few weeks. Just a guess, since 5 or more AAA titles are being released during the next 1 to 2 months.


----------



## dsmwookie

Should I be looking for any particular card as a better performer under water? I don't see it being worth $70 more for a Lightning model, etc...


----------



## battleaxe

I seem to have solved my crashing issues in BF4 and BF3 thanks to several of you posters. I will go back and +1 those who helped me. Thanks, guys!

I gave a +1 for each good idea. I'm not sure which one worked, but something did.







Several of you got several points. Thanks a lot! Sigh of relief.


----------



## Devildog83

Quote:


> Originally Posted by *dsmwookie*
> 
> Should I be looking for any particular card for as a better performer under water? I don't tend to see it being worth $70 for a lightning model etc...


I just got a reference card for that exact reason. I don't think you need to spend a bunch of money on a card with elaborate cooling if you're going under water.


----------



## Buehlar

Quote:


> Originally Posted by *dsmwookie*
> 
> Should I be looking for any particular card for as a better performer under water? I don't tend to see it being worth $70 for a lightning model etc...


In general, no, but keep in mind that some 3rd-party GPU manufacturers simply install their own cooling solution on top of a reference-PCB card, while other manufacturers go the extra mile to design their own PCB and, in some cases, use higher-quality components (capacitors, VRMs, chokes, etc.).
Usually a $50~$75 premium for the latter.
Quote:


> Originally Posted by *Devildog83*
> 
> I just got a reference card for that exact thing. I don't think you need to spend a bunch of money on a card with elaborate cooling if your going under water.


----------



## Bertovzki

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I was benching my 2 x 290s @ 200mV and was hitting 1050W from the wall. This was in 3DMark. 100mV was hitting 870W. Stock ~ 700W.


Quote:


> Originally Posted by *bond32*
> 
> If it means anything to ya, with a 1300 watt EVGA G2 and 3 cards, when I push them to over 1.4 volts it shuts the system down (ocp protection). Power supply cannot go further at that point, meaning it is likely drawing ~1500 watts from the wall.
> 
> The 1300 watt EVGA G2 would be a perfect psu for 2 cards if you plan to OC the heck out of them. Really all depends on which psu you get... Make sure it's quality. Just because it claims 1200 watts it could be garbage.


Quote:


> Originally Posted by *pkrexer*
> 
> My 860watt Seasonic has no issues running my 2x 290x's @ +100mv. I haven't even tried anything higher than that though.


OK, cool, thanks guys - that confirms everything I thought already. 1200W+ gives me the headroom and future-proofing I need, and yes, I will be buying quality; HardOCP and JonnyGuru have the best PSU reviews I have seen. It will be one of a Seasonic X-1250, Antec HCP-1300, or Enermax Revolution 1350. I wanted to have my cake and eat it too - a PSU shorter than 190 mm - but that's not possible at this wattage. At 180 mm or less it would free up the bottom of my 750D for another rad; you can do it with the Cooler Master V1000 (170 mm), but it's a bit small in wattage.


----------



## Bertovzki

Quote:


> Originally Posted by *Devildog83*
> 
> I just got a reference card for that exact thing. I don't think you need to spend a bunch of money on a card with elaborate cooling if your going under water.


Quote:


> Originally Posted by *dsmwookie*
> 
> Should I be looking for any particular card for as a better performer under water? I don't tend to see it being worth $70 for a lightning model etc...


Quote:


> Originally Posted by *Buehlar*
> 
> In general, no, but keep in mind that some 3rd party GPU manufactures simply install their on cooling solution on top of a reference PCB card design while other manufactures go an extra mile to design their own PCB and in some cases they'll use higher quality components (capacitors, VRMs chokes, etc.
> Usually a $50~$75 premium for the latter.


Yeah, I was wondering the same thing, but for not much more cost I will be getting the Sapphire R9 290 Tri-X OC, the reason being that it has more stream processors and performs very slightly better at 4K. Basically I'll be paying for a more expensive cooler that will never be used.


----------



## mus1mus

Quote:


> Originally Posted by *Bertovzki*
> 
> Yeah I was wondering the same thing, but for not much more cost *I will be getting the Sapphire R9 290 Tri-X OC, the reason being that it has more processor streams* and does perform very slightly better at 4K settings, basicaly i will be paying for a more expensive cooler that will never be used


What? Compared to what?


----------



## Devildog83

Quote:


> Originally Posted by *Bertovzki*
> 
> Yeah I was wondering the same thing, but for not much more cost I will be getting the Sapphire R9 290 Tri-X OC, the reason being that it has more processor streams and does perform very slightly better at 4K settings, basicaly i will be paying for a more expensive cooler that will never be used


I got a sapphire 290 ref and it was flashed to a 290x for $245 used. It works awesome. BF4 runs smooth as silk at max settings with Mantle @ stock clocks.


----------



## Aussiejuggalo

Anyone using the new 14.9 WHQL drivers?


----------



## battleaxe

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Anyone using the new 14.9 WHQL drivers?


Yes


----------



## Aussiejuggalo

Quote:


> Originally Posted by *battleaxe*
> 
> Yes


Any black screens or crashes?

I'm downloading them atm, still on 14.4


----------



## battleaxe

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Any black screens or crashes?
> 
> I'm downloading them atm, still on 14.4


No, seems fine so far. Feels exactly like 14.7, TBH... maybe a hair stronger, but not much.


----------



## Dasboogieman

Quote:


> Originally Posted by *piquadrat*
> 
> 14.9 is out;
> http://support.amd.com/en-us/kb-articles/Pages/AMDCatalyst14-9WINReleaseNotes.aspx
> Is it safe to use DDU to uninstall 14.X drivers? 14.X seems to be a little different to the previous ones I used in my system (apu origin). I have observed "amdacpusrsvc" errors and warnings in the Event Log of my Windows 7 installation albeit the driver itself was working very good. And I wonder if DDU will remove all the stuff.


Yayyy, they finally fixed Watch Dogs and AC4. About time. I was hoping for preliminary Crossfire profiles for the upcoming Evil Within and Shadow of Mordor, but alas.


----------



## Ironsmack

Quote:


> Originally Posted by *battleaxe*
> 
> I seem to have solved my crashing issues on BF4 and BF3 thanks to several of you posters. I will go back and +1 those that helped me. Thanks guys!
> 
> I give a +1 for each good idea. I'm not sure what worked, but something did.
> 
> 
> 
> 
> 
> 
> 
> Several of you got several points. Thanks a lot man! Sigh of relief.


Glad you sorted that out!


----------



## Arizonian

Quote:


> Originally Posted by *piquadrat*
> 
> 14.9 is out;
> http://support.amd.com/en-us/kb-articles/Pages/AMDCatalyst14-9WINReleaseNotes.aspx
> Is it safe to use DDU to uninstall 14.X drivers? 14.X seems to be a little different to the previous ones I used in my system (apu origin). I have observed "amdacpusrsvc" errors and warnings in the Event Log of my Windows 7 installation albeit the driver itself was working very good. And I wonder if DDU will remove all the stuff.


Updated OP with driver info.

Quote:


> Originally Posted by *Ironsmack*
> 
> Like to add myself to the owners list as well. Thanks!
> 
> (2) XFX 290 w/ XSPC WB & backplate.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## JourneymanMike

Add me to the owners list please

VisionTek r9 290x

Stock cooling soon to be under water!

http://www.techpowerup.com/gpuz/details.php?id=hbhu6


----------



## Arizonian

Quote:


> Originally Posted by *JourneymanMike*
> 
> Add me to the owners list please
> 
> VisionTek r9 290x
> 
> Stock cooling soon to be under water!
> 
> http://www.techpowerup.com/gpuz/details.php?id=hbhu6


Congrats - added


----------



## Bertovzki

Quote:


> Originally Posted by *mus1mus*
> 
> What? Compared to what?


http://www.anandtech.com/show/7601/sapphire-radeon-r9-290-review-our-first-custom-cooled-290/3

R9 290X Tri-X OC - 2816 stream processors, 1040 MHz / 1300 MHz

R9 290 Tri-X OC - 2560 stream processors, 1000 MHz / 1300 MHz

R9 290 Tri-X - 2560 stream processors, 957 MHz / 1250 MHz

http://www.techpowerup.com/reviews/Sapphire/R9_290X_Tri-X_OC/

Probably such a small difference that it's not worth considering, and the tests are measured at stock clocks with stock coolers, which probably means little under water and at higher overclocks.

But more stream processors means slightly more horsepower.

Correct me if I'm wrong - maybe I'd be wasting my time getting a card with higher factory clocks and a few more stream processors. Also, in my original post I thought the three Tri-X cards each had more stream processors than the next, but it's only the expensive 290X Tri-X OC.


----------



## rdr09

Quote:


> Originally Posted by *Bertovzki*
> 
> http://www.anandtech.com/show/7601/sapphire-radeon-r9-290-review-our-first-custom-cooled-290/3
> 
> R9 290X Tri-X OC _ 2816 stream processors 1040mhz/1300mhz
> 
> R9 290 Tri-X OC _ 2560 stream processors 1000mhz/1300mhz
> 
> R9 290 Tri-X _ 2560 Stream processors 957mhz/1250mhz
> 
> http://www.techpowerup.com/reviews/Sapphire/R9_290X_Tri-X_OC/
> 
> Probably such small difference that its not worth considering, and the tests are measured at stock clocks, stock coolers, which probably means nothing when underwater and under higher over clocks.
> 
> But more stream processors slightly more horse power.
> 
> correct me if im wrong, and maybe id be wasting my time to get a card with higher factory clocks and a few more stream processors, also I thought in original post i made, that the 3 tri x cards all had more stream processors than the next, but its only the expencive 290x tri x oc


Correct - the difference in stream processors is only between the 290X and the 290 Pro. If you are watercooling, I recommend a used reference 290 (AKA 290 Pro). These cards are almost a year old.


----------



## Dasboogieman

Just benchmarked Middle Earth: Shadow of Mordor, highest settings

Crossfire AMD 290s (Tri-X and Gaming edition) at 1100 MHz / 1375 MHz, +87 mV

Can't seem to get a frame grab for some reason; anyway:
Max FPS: 120
Min FPS: 35
Average FPS: 90


Spoiler: Warning: Spoiler!













Basically, the game holds a solid 90-100 FPS all round; the dips, I assume, were simply frame variance due to Crossfire. Extremely smooth gameplay.



Anyway, the GPU-Z shot - notice the VRAM usage:









After about 2 hours of gameplay, VRM temps on the Tri-X sit at 80 degrees, the core sits at 75 degrees, fan speed is at 65%, and power draw spikes to 300 W during fights with 30+ dudes.
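If anyone wants to quantify dips like those instead of eyeballing the overlay, you can log frame times (Fraps and Afterburner can both dump a frametime log) and summarize them. A rough sketch - the sample numbers are invented for illustration, not an actual log:

```python
# Summarize a frame-time log (milliseconds per frame) into FPS stats.
# The sample data below is made up for illustration.

def fps_stats(frametimes_ms):
    fps = [1000.0 / t for t in frametimes_ms]
    ordered = sorted(frametimes_ms)
    # 99th-percentile frame time ~ the worst 1% of frames (the "dips")
    p99 = ordered[min(len(ordered) - 1, int(len(ordered) * 0.99))]
    return {
        "min_fps": min(fps),
        "max_fps": max(fps),
        # average FPS = total frames / total elapsed time
        "avg_fps": len(frametimes_ms) * 1000.0 / sum(frametimes_ms),
        "p99_frametime_ms": p99,
    }

sample = [11.1, 10.5, 9.8, 12.0, 28.5, 10.2]  # one nasty dip at 28.5 ms
print(fps_stats(sample))
```

The 99th-percentile frame time tells you more about perceived smoothness in Crossfire than the average does, since alternate-frame rendering can pump the average while individual frames still stutter.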


----------



## BradleyW

The GTX 980 seems to be sitting at 100 fps. Looks like 290X CFX is underperforming significantly. Maybe they did not run SSAA?


----------



## Devildog83

Is it normal for fraps to not show up in BF4?


----------



## bond32

Quote:


> Originally Posted by *Devildog83*
> 
> Is it normal for fraps to not show up in BF4?


Are you using Mantle? I doubt it will show up if you are, as the same is true for the Afterburner OSD. Another reason to use DX11.


----------



## Dasboogieman

Quote:


> Originally Posted by *BradleyW*
> 
> The GTX 980 seems to be sitting at 100 fps. Looks like 290X CFX is under performing significantly. Maybe they did not run SSAA?


Dunno - the GPU usage graphs show 100% for both GPUs. My guess is as good as yours: either I have SSAA enabled or Crossfire scaling is non-existent at the moment. Anyway, mine is on the highest settings, including that texture setting that requires a minimum of 6 GB of VRAM.

Just benchmarked it again with disabled Crossfire

Max FPS: 90
Min FPS: 25
Average FPS: 75

If it means anything, the absolute maximum FPS recorded by the game is no longer 1200 FPS (when crossfired) and instead now sits at 600 FPS.

Either I'm CPU bottlenecked or Crossfire Scaling is really inconsistent at the moment.


----------



## Devildog83

Quote:


> Originally Posted by *bond32*
> 
> Are you using Mantle? Doubt it would if you are as the same is true for the OSD with AB. Another reason to use DX11


How do you tell FPS in Mantle if Fraps doesn't show up?


----------



## Fickle Pickle

Quote:


> Originally Posted by *Devildog83*
> 
> How do you tell FPS in Mantle if fraps don't show up?


Open the console and type: Perfoverlay.drawfps 1


----------



## BradleyW

Quote:


> Originally Posted by *Dasboogieman*
> 
> Dunno, the gpu usage graphs are functional 100% for both GPUs. My guess is as good as yours, maybe either I have SSAA enabled or Crossfire scaling is non-existent at the moment. Anyway, mine are using the highest settings, including that Texture setting that requires a minimum of 6Gb of VRAM to run.
> 
> Just benchmarked it again with disabled Crossfire
> 
> Max FPS: 90
> Min FPS: 25
> Average FPS: 75
> 
> If it means anything, the absolute maximum FPS recorded by the game is no longer 1200FPS (when crossfired) and instead now sits on 600FPS.
> 
> Either I'm CPU bottlenecked or Crossfire Scaling is really inconsistent at the moment.


Did you try forcing AFR?
Cheers.


----------



## Dasboogieman

Quote:


> Originally Posted by *BradleyW*
> 
> Did you try forcing AFR?
> Cheers.


Derp, Crossfire wasn't enabled

much much better now


----------



## BradleyW

Quote:


> Originally Posted by *Dasboogieman*
> 
> Derp, Crossfire wasn't enabled
> 
> much much better now
> 
> 
> Spoiler: Warning: Spoiler!


That's better, you silly billy! Just out of interest, what driver version are you using? 14.9 is unusable for me - constant system lock-ups.


----------



## Dasboogieman

Quote:


> Originally Posted by *BradleyW*
> 
> That's better you silly billy! Just out of interest, what driver version are you using? 14.9 are unusable for me. Constant system lock ups.


yup 14.9

Also, this game seems to luuuuuurve AMD at the moment. Despite being an NVIDIA title, it runs much better on Hawaii - this thing basically matches the GTX 980 - and the SLI guys are also reporting poor scaling at this time.
Methinks this game heavily utilizes texture streaming, so Hawaii's monstrous memory bandwidth gives it a hard edge over the 970.


----------



## Agent Smith1984

You know, that is one thing I have considered about the 970 in general.....

Where will the bandwidth limitations of that 256-bit bus rear their ugly head? Sure, everything looks insane at 1080p and 1440p, and maybe 256-bit at near 8 GHz is all you need, but since overclocking the VRAM on those cards seems to give dramatic performance increases (and scales in line with the core), I'm guessing it will hit some sort of limitation somewhere. On the flip side of that coin, 290 owners get to enjoy VRAM that scales with their core..... or at least it appears that way.....

My 280X just can't make use of all of its memory bandwidth at 1080p, and anything past 1440p will crush the core, so it's kind of unbalanced.
I am guessing that going Crossfire would help make use of that bandwidth though.... It's one of the things that allow these cards to scale 75-100%, versus the 25-40% we used to see in Crossfire's early days.... well, that and a good CPU.....


----------



## Dasboogieman

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You know, that is one thing I have considered about the 970 in general.....
> 
> Where will we see the bandwidth limitations on that 256bit bus rear it's ugly head? Sure everything looks insane on 1080 and 1440, and maybe 256bit at near 8GHz is all you need, but I am guessing since overclocking the VRAM on those cards seems to present dramatic performance increases (and scales in line with the core), that it will hit some sort of limitation somewhere. On the flip side of that coin, 290 owners get to enjoy VRAM that scales with their core..... or at least it appears that way.....
> 
> My 280x just can't make use of all of the memory bandwidth at 1080P, and going anything past 1440P will crush the core, so it's kind of unbalanced.
> I am guessing that going crossfire would help make use of that bandwidth though.... It's one of the things that allow these cards to scale 75-100%, versus the 25-40% we used to see in it's early days.... well, that and a good CPU.....


Apparently, according to the Litecoin guys, 320 GB/s of memory bandwidth perfectly saturates the AMD 290 core at 1000 MHz, and 352 GB/s saturates it at 1100 MHz; beyond that the ratio shifts toward the core because of the number of errors the VRAM throws out past 5.5 GHz effective. AMD basically got the ratio really nice on the Hawaii cards.
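Those figures line up with the raw math: GDDR5 moves 4 bits per pin per clock, so peak bandwidth is just bus width times memory clock times the data rate. A quick sanity check - the clocks below are the usual reference specs, not measured values:

```python
# Peak memory bandwidth: bus width (bits) x memory clock (MHz) x data rate,
# converted to gigabytes per second. GDDR5 transfers 4 bits per pin per clock.

def bandwidth_gbs(bus_bits, mem_clock_mhz, data_rate=4):
    return bus_bits * mem_clock_mhz * data_rate / 8 / 1000.0

print(bandwidth_gbs(512, 1250))  # 290/290X stock (5 GHz effective): 320.0 GB/s
print(bandwidth_gbs(512, 1375))  # 1375 MHz (5.5 GHz effective): 352.0 GB/s
print(bandwidth_gbs(256, 1750))  # a 256-bit card at 7 GHz effective: 224.0 GB/s
```

By that math a stock 290/290X lands exactly on the 320 GB/s the Litecoin guys quote, while a 256-bit bus needs 7 GHz effective memory just to reach 224 GB/s.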


----------



## Wezzor

How's 14.9 working for everyone?


----------



## BradleyW

Quote:


> Originally Posted by *Wezzor*
> 
> How's 14.9 working for everyone?


Constant lock ups for me. Had to revert.


----------



## Offler

OK, did some serious overclocking with the new drivers:

GPU: 1135 MHz
VRam: 1500 MHz
VDDC: +100
Power: +50
Cooler: Lapped Stock
Fan: Manually set to 82%

CPU: 3900
RAM 1600 7-7-7-18 CR1
NB 2800
HT 2600

http://www.3dmark.com/3dm/4213566

and screen to results + gpu z + cpu z:

http://slayershrine.wz.cz/r9-290x.jpg

Since I increased the Vcore to a stable, verified value, everything is OK.


----------



## piquadrat

Quote:


> Originally Posted by *Wezzor*
> 
> How's 14.9 working for everyone?


I second that question. Share your thoughts, guys.
Quote:


> Originally Posted by *BradleyW*
> 
> Constant lock ups for me. Had to revert.


Freezes, blackscreens, BSODs?
CF or single? Overclocked or not?


----------



## BradleyW

Quote:


> Originally Posted by *piquadrat*
> 
> I'm second that question. Share your thoughts, guys.
> Freezes, blackscreens, BSODs?
> CF or single? Overclocked or not?


CFX, Single, Overclocked, Stock....
Soon as I hit Windows, I get system lock up + boot failures.
Revert to 14.7 fixes everything.


----------



## piquadrat

Quote:


> Originally Posted by *BradleyW*
> 
> Constant lock ups for me. Had to revert.


Quote:


> Originally Posted by *BradleyW*
> 
> CFX, Single, Overclocked, Stock....
> Soon as I hit Windows, I get system lock up + boot failures.
> Revert to 14.7 fixes everything.


Win 8.1? I've heard about similar problems with latest amd drivers and windows 8.

Anyone switched from 14.X to 14.9?


----------



## heroxoot

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *piquadrat*
> 
> I'm second that question. Share your thoughts, guys.
> Freezes, blackscreens, BSODs?
> CF or single? Overclocked or not?
> 
> 
> 
> CFX, Single, Overclocked, Stock....
> Soon as I hit Windows, I get system lock up + boot failures.
> Revert to 14.7 fixes everything.

Lockups are a known issue on 14.9. I would try the August 27th or September 2nd driver. They both work really well.


----------



## Offler

Quote:


> Originally Posted by *heroxoot*
> 
> Lockups is known on 14.9. I would try the August 27th or September 2nd driver. They both work really good.


For me it was easy to resolve...

Anyway:
http://www.3dmark.com/3dm/4214268

I doubt I will get much higher score...

Edit: I was wrong

http://www.3dmark.com/3dm/4214924

GPU 1165Mhz
RAm 1615
VDDC +156


----------



## JourneymanMike

Quote:


> Originally Posted by *Wezzor*
> 
> How's 14.9 working for everyone?


Lots of lock-ups! Went back to 14.7!


----------



## pkrexer

No issues here with the 14.9 drivers, though my gaming currently only consists of BF4 and Max Payne 3.

They do seem to perform slightly slower than the September 2nd driver for me.


----------



## Vici0us

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *piquadrat*
> 
> I'm second that question. Share your thoughts, guys.
> Freezes, blackscreens, BSODs?
> CF or single? Overclocked or not?
> 
> 
> 
> CFX, Single, Overclocked, Stock....
> Soon as I hit Windows, I get system lock up + boot failures.
> Revert to 14.7 fixes everything.

Thanks for sharing, that's really good to know. 14.8 works the best for me.


----------



## Jflisk

With the new 14.9 drivers the temperatures are where they should be, as opposed to 14.7 RC3. I have also gained some points in Fire Strike. I have them installed on Windows 8.1 with no problems as of yet, but if I run into something I will mention it. I have played a full gamut of games, but I guess I should give Crysis 3 and BioShock Infinite a try.


----------



## Wezzor

Well, I guess I'll skip the update then. Thanks for all the answers!


----------



## BradleyW

Which version is the latest if you ignore the release of 14.9?


----------



## rdr09

Quote:


> Originally Posted by *BradleyW*
> 
> Which version is the latest if you ignore the release of 14.9?


this.


----------



## JourneymanMike

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BradleyW*
> 
> Which version is the latest if you ignore the release of 14.9?
> 
> 
> 
> this.

That would be 14.7 - this is a beta driver. It works well for me, so does 14.4...


----------



## JourneymanMike

Quote:


> Originally Posted by *Vici0us*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *piquadrat*
> 
> I'm second that question. Share your thoughts, guys.
> Freezes, blackscreens, BSODs?
> CF or single? Overclocked or not?
> 
> 
> 
> CFX, Single, Overclocked, Stock....
> Soon as I hit Windows, I get system lock up + boot failures.
> Revert to 14.7 fixes everything.
> 
> 
> Thanks for sharing, that's really good to know. 14.8 works the best for me.

I keep on seeing 14.8 in some posts - Is there a 14.8? I've never seen or heard of it before now...


----------



## rdr09

Quote:


> Originally Posted by *JourneymanMike*
> 
> That would be 14.7 - this is a beta driver. It works well for me, so does 14.4...


It's 14.X. When you run a bench like 3DMark it will be labeled 14.300.xxx, while 14.9 will be labeled 14.301.xxx.

14.X is as stable as 14.4 in my experience. Warface is unplayable on both 14.6 and 14.7. Also, 14.X gives the same gain over 14.4 in benchmarks as 14.9 does - at least from my reading.

290 @ 1260 14.9 vs 14.6

http://www.3dmark.com/compare/fs/2863776/fs/2201336

edit: Warface works flawlessly using 14.9 using my 7950. Good sign.


----------



## JourneymanMike

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JourneymanMike*
> 
> That would be 14.7 - this is a beta driver. It works well for me, so does 14.4...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> it's 14.X. when you run bench like 3DMark it willl be labeled 14.300.XXX, while 14.9 will be labeled 14.301.X X X.
> 
> 14.X IS AS STABLE AS 14.4 IN MY EXPERIENCE. Warface is unplayable in both 14.6 AND 14.7. also, 14.X gives the same gain as 14.9 in benchmarks over 14.4. at least from my readings.
> 
> 290 @ 1260 14.9 vs 14.6
> 
> http://www.3dmark.com/compare/fs/2863776/fs/2201336
> 
> edit: Warface works flawlessly using 14.9 using my 7950. Good sign.

Thanks for the detailed info!


----------



## Arizonian

Quote:


> Originally Posted by *Devildog83*
> 
> What ever, you still have to lower the prices down. If you raise them they go up.
> 
> For everyone else, why does my c 290 show more shaders than bluedevils 290?
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## thrgk

anyone having terrible weird lag/gameplay in bf4 with 14.7 beta?


----------



## maynard14

Guys, can someone help me? I have a reference XFX R9 290 unlocked to a 290X. For some months I could overclock the card to 1150 MHz core with +100 mV using Afterburner, but after a while it started showing graphical glitches - bricks or little squares - when benching 3DMark 11 or Tomb Raider. I only use the card at stock while gaming, never overclocked, and I have an NZXT G10 plus Gelid VRM heatsinks.

I'm wondering if my card is broken, because I used to be able to bench without artifacts, but now I just can't - even after changing drivers (everything from 14.4 to 14.9) and trying different Afterburner versions.

If I use the card at stock with no OC, it works 100 percent fine.


----------



## Gualichu04

@Arizonian I am listed with two R9 290s in the OP when I actually have 290Xs.


----------



## bluedevil

Looks like I have been doing a little benching.....7,6,5,4 slots...

http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/fs/P/1540/906/500000?minScore=0&cpuName=Intel Core i5-3470 Processor&gpuName=AMD Radeon R9 290


----------



## zealord

Quote:


> Originally Posted by *bluedevil*
> 
> Broke 9.4k finally.
> 
> http://www.3dmark.com/3dm/4218560


nice









how much +mV did you need for 1200mhz?


----------



## bluedevil

Quote:


> Originally Posted by *zealord*
> 
> nice
> 
> 
> 
> 
> 
> 
> 
> 
> 
> how much +mV did you need for 1200mhz?


Max 1.336


----------



## ZealotKi11er

I think you should still be getting higher than 12k for the GPU score.


----------



## kizwan

Quote:


> Originally Posted by *zealord*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bluedevil*
> 
> Broke 9.4k finally.
> 
> http://www.3dmark.com/3dm/4218560
> 
> 
> 
> nice
> 
> 
> 
> 
> 
> 
> 
> 
> 
> how much +mV did you need for 1200mhz?

@bluedevil, go to BIOS, disable Speedstep, C1E, C3, C6 & C7. Then try again.


----------



## Arizonian

Quote:


> Originally Posted by *Gualichu04*
> 
> @Arizonian I am listed as two r9 290 when i have 290x's in the OP.


ooops







fixed


----------



## Gobigorgohome

Quote:


> Originally Posted by *thrgk*
> 
> anyone having terrible weird lag/gameplay in bf4 with 14.7 beta?


Nope, running four cards too, so there should not be any problem. Same FPS with Direct3D as Mantle too. 14.7 beta running as good as 14.4 for me.


----------



## Vici0us

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *thrgk*
> 
> anyone having terrible weird lag/gameplay in bf4 with 14.7 beta?
> 
> 
> 
> Nope, running four cards too, so there should not be any problem. Same FPS with Direct3D as Mantle too. 14.7 beta running as good as 14.4 for me.

Running Crossfire on 14.7 without any problems. Probably gonna skip 14.9 from what I am hearing about it.


----------



## bluedevil

Quote:


> Originally Posted by *kizwan*
> 
> @bluedevil, go to BIOS, disable Speedstep, C1E, C3, C6 & C7. Then try again.


Thanks, this helped a ton! 9.5k (close enough) - good enough for slot #4.

http://www.3dmark.com/3dm/4221530

Edit///

Now up to 9.6K!

http://www.3dmark.com/3dm/4221632


----------



## Noufel

Got 10700 in Fire Strike Performance with one 290 Tri-X at 1150/1500, +75 Vcore, and a 4790K at 4.5 GHz. Temps are good with 14.9 - good work, AMD.








It seems I can't go higher than 1100/1500 when Crossfire is enabled - don't know why - but I got 16670 in Fire Strike, so I think it's OK.


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> Thanks, this helped a ton! 9.5k (close enough) Good enough for slot #4.
> 
> http://www.3dmark.com/3dm/4221530
> 
> Edit///
> 
> Now up to 9.6K!
> 
> http://www.3dmark.com/3dm/4221632


1220 core? If it is, you are missing about 700 pts in graphics score. It's not the CPU - could be a bad BIOS or a bad PSU.

1220 with i7 stock HT off

http://www.3dmark.com/3dm/4221842?

1220 with i7 4.5 HT on

http://www.3dmark.com/3dm/4222057?


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> 1220 core? if it is, you are missing about 700 pts in graphics score. its not the cpu. could be a bad bios or bad psu.
> 
> 1220 with i7 stock HT off
> 
> http://www.3dmark.com/3dm/4221842?
> 
> 1220 with i7 4.5 HT on
> 
> http://www.3dmark.com/3dm/4222057?


I really think my PSU is holding me back - this 290 seems voltage-starved. Also, you have to admit your CPU is clocked 600 MHz higher in that comparison. Might just get a 4690K/Z97 and a CM V750 and see what they can do.









If you could do a run with HT off and @ 3.9ghz. That would be great!


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> I really think my PSU is holding me back. Thinking this 290 is voltage starved. Also you have to admit that you have a 600mhz more clock when comparing CPUs. Might just get a 4690k/Z97 and a CM V750 can see what they can do.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you could do a run with HT off and @ 3.9ghz. That would be great!


Look again - the first run used a slower setting than your CPU. Compare the physics scores.


----------



## battleaxe

Kinda meh.... but hey, not too bad either. I know I can get higher. Got a water cooler coming - the top card is getting way too hot. Need to work on airflow in my case; it's not venting properly at all.


----------



## mus1mus

http://www.3dmark.com/3dm/4222797

I'm getting better results with the Leaked APU Driver.

1150/1500

I guess I'll keep this.

No go for 14.X


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> http://www.3dmark.com/3dm/4222797
> 
> I'm getting better results with the Leaked APU Driver.
> 
> 1150/1500
> 
> I guess I'll keep this.
> 
> No go for 14.X


Mind sharing? What leaked APU driver?


----------



## MojoW

Quote:


> Originally Posted by *battleaxe*
> 
> Mind sharing? What leaked APU driver?


I think he's talking about the 14.8 APU drivers, which didn't only support APUs.
Here is a GURU3D link.


----------



## mus1mus

Quote:


> Originally Posted by *MojoW*
> 
> I think he's talking about the 14.8 APU drivers that didn't only support APU's.
> Here is a GURU3D Link.


Yez.. this ^

Plus 1


----------



## tsm106

Quote:


> Originally Posted by *mus1mus*
> 
> http://www.3dmark.com/3dm/4222797
> 
> I'm getting better results with the Leaked APU Driver.
> 
> 1150/1500
> 
> I guess I'll keep this.
> 
> No go for 14.X


Here's one from 14.8. It's a pretty fast FS driver.

http://www.3dmark.com/fs/2715773


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> look again, the first run was slower setting than your cpu. compare physics.


I see. Still think my PSU might be the culprit. Gots some thinking to do.


----------



## tsm106

Quote:


> Originally Posted by *bluedevil*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> look again, the first run was slower setting than your cpu. compare physics.
> 
> 
> 
> I see. Still *think my PSU might be the culprit*. Gots some thinking to do.

Are you hitting OCP/OPP or something?


----------



## bluedevil

Quote:


> Originally Posted by *tsm106*
> 
> Are you hitting OCP/OPP or something?


No. I am not experiencing this at all. Just thinking that the voltage supplied by the PCIe cable (one cable with two 6+2 plugs) is falling a bit short.


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> No. I am not experiencing this at all. Just thinking that the voltage supplied by the PCI cable (1 cable 2 6x2 plugs) is falling a bit short.


oh noes, that's your issue. you need two separate cables.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bluedevil*
> 
> No. I am not experiencing this at all. Just thinking that the voltage supplied by the PCI cable (1 cable 2 6x2 plugs) is falling a bit short.
> 
> 
> 
> oh noes, that's your issue. *you need two separate cables*.
Click to expand...

I don't think it's a need but a want issue, ie. I don't think it makes a difference.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> I don't think it's a need but a want issue, ie. I don't think it makes a difference.


i read not long ago one member solved the blackscreen issue by switching to 2 cables. iirc, blue's stock run is normal but oc'ed it falls.

@blue, your gpu should have come with an adapter. try using it to fill the 6-pin or 8-pin.


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> i read not long ago one member solved the blackscreen issue by switching to 2 cables. iirc, blue's stock run is normal but oc'ed it falls,


I am gonna put this to rest. I gotta find a converter cable.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I don't think it's a need but a want issue, ie. I don't think it makes a difference.
> 
> 
> 
> i read not long ago one member solved the blackscreen issue by switching to 2 cables. iirc, blue's stock run is normal but oc'ed it falls,
Click to expand...

It was probably a placebo effect, since blackscreens are not a power issue unless you hit the card's OCP. All the power is coming from the same place, and the cables can handle a lot more power than the "spec" lists. I've been overclocking for years on split cables w/o issue btw.
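To put rough numbers on that: the connector ratings below are the PCIe CEM spec figures, while the single-cable estimate assumes 18 AWG wire at a conservative ~8 A per 12 V conductor, which is my own assumption, not anything from the spec:

```python
# Rough PCIe power-budget arithmetic for a 6+8-pin card like the 290/290X.
# Connector wattages are the CEM spec ratings; the "real cable" estimate
# assumes ~8 A per 12 V conductor, which is an assumption, not a spec value.

PCIE_SLOT_W = 75     # power the x16 slot itself may supply
SIX_PIN_W = 75       # spec rating of a 6-pin connector
EIGHT_PIN_W = 150    # spec rating of an 8-pin connector

spec_budget = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(f"Spec power budget: {spec_budget} W")  # 300 W

# A daisy-chained cable with two plugs still has three 12 V conductors
# back to the PSU; at ~8 A each that is roughly:
cable_estimate = 3 * 8 * 12
print(f"Rough single-cable capability: {cable_estimate} W")  # 288 W
```

So even one split cable sits near the full spec budget of the connectors it feeds, which is why it usually works in practice.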


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> I am gonna put this to rest. I gotta find a converter cable.


Quote:


> Originally Posted by *tsm106*
> 
> It was probably a placebo effect since blackscreens are not a power issue unless you hit the cards OCP. All the power is coming from the same place and the cables can handle a lot more power than the "spec" is listed at. I've been overclocking for years on split cables w/o issue btw.


it's worth a try. adapters should be included unless bought used and left out.


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> it's worth a try. adapters should be included unless bought used and left out.


Found and hooked up. Now for a run.


----------



## bluedevil

9440

GPU score up.

http://www.3dmark.com/3dm/4223925


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> 9440
> 
> GPU score up.
> 
> http://www.3dmark.com/3dm/4223925


sorry, i don't see any change if at 1200 core. try 1150 core and 1500 memory. don't change your voltages from 1200.

edit: BTW, if you are using AB to oc, you need to use CCC, too, to set the Power limit to 50%.


----------



## Offler

Just finished with OC of my Sapphire R9-290x with lapped reference stock cooler:

http://www.3dmark.com/3dm/4223472


This is the limit of this card with its stock cooler; a higher OC causes artifacts.

If there is somebody with a Phenom II X6 1090T or 1100T, please show what your system can do with water, LN2 or similar.

(I want to see the CPU at 4000MHz or more, and the GPU over 1200MHz)

*Edit: Want in the club*
(Hope the info in the screenshot is enough)

GPU: R9-290x
Manufacturer: Sapphire
Cooler: Modded Reference Stock (Lapped, with Arctic Silver 5 thermal paste)
Core: 1180MHz
VRam: 1615Mhz
VDDC: +200
Power: +50%
Fan speed: Locked on 90%

Artifacts: No


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> sorry, i don't see any change if at 1200 core. try 1150 core and 1500 memory. don't change your voltages from 1200.
> 
> edit: BTW, if you are using AB to oc, you need to use CCC, too, to set the Power limit to 50%.


Ya know, it very well could be my modded drivers to run my Qnix 2710 @ 92Hz. I will try again with my 290 @ 1200/1500 and my 3470 @ 3.9GHz.

No patched drivers.

Edit///

http://www.3dmark.com/3dm/4224676

No clue what is holding back my score. Maybe Windows 8.1?


----------



## zealord

Quote:


> Originally Posted by *bluedevil*
> 
> Ya know, it very well could my modded drivers to run my Qnix 2710 @ 92hz. I will try again with my 290 @ 1200/1500 and my 3470 @ 3.9ghz.
> 
> No patched drivers.
> 
> Edit///
> 
> http://www.3dmark.com/3dm/4224676
> 
> No clue what is holding back my score. Maybe Windows 8.1?


I found out what your problem is.

Disable Anisotropic Filtering in Catalyst and RadeonPro (or set it to application-controlled).


----------



## Agent Smith1984

Quote:


> Originally Posted by *Offler*
> 
> (I want to see CPU on 4000MHz or more, and GPU over 1200Mhz)


You are going to want to see 4.2GHz+ for the thuban with a 290x...

I have overclocked my 280x, and the graphics and overall scores no longer improve in FireStrike past 4160MHz. The physics score increased accordingly up to 4.2GHz (the highest clock speed at which I can complete FireStrike without reducing the RAM multi and fudging the comparison), but it is at the 4160MHz point that my CPU pushes my 280x to its max potential on my system. I guess it just likes that clock speed.

With the 290x you are going to need even more, but there won't be many Thubans running over 4.2 right now without more extreme cooling, and currently anybody running extreme cooling with a 290x isn't using a Thuban anymore anyway.

This is just for reference, to show what the CPU score looks like as the Thuban's core clock increases by a half multi while the NB and RAM stay at 2700/1800 respectively. It should also give you some idea of how much the CPU affects the graphics score in a benchmark like Firestrike, where the goal is to stay as GPU-bound as possible.

Firestrike @ 1090T @ 4050MHz
http://www.3dmark.com/fs/2860502

FireStrike @ 1090T @ 4162MHz
http://www.3dmark.com/fs/2860668


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> Ya know, it very well could my modded drivers to run my Qnix 2710 @ 92hz. I will try again with my 290 @ 1200/1500 and my 3470 @ 3.9ghz.
> 
> No patched drivers.
> 
> Edit///
> 
> http://www.3dmark.com/3dm/4224676
> 
> No clue what is holding back my score. Maybe Windows 8.1?


Quote:


> Originally Posted by *zealord*
> 
> I found out what your problem is .
> 
> Disable Anisotropic Filtering in Catalyst and RadeonPro. ( or set it to application controlled)


^this or try the other bios. shutdown, switch, boot, and try.


----------



## Offler

Just for reference my score with HD 7970:
http://www.3dmark.com/fs/2711174
(Gigabyte HD 7970 OC, Rev 2.1, Asic 61%, 1050 was highest stable frequency)

I was interested in comparing the physics scores of two similar Phenom II CPUs, and then comparing that result with something newer.

This is a score with an Intel 5960X clocked to 4000MHz (quite close to my current CPU frequency), with an R9-290x clocked to 1050MHz (almost stock):
http://www.3dmark.com/fs/2755310

I am interested in the FPS in the tests. It's quite interesting to see such values when total performance in 3D rendering is comparable. Of course, my system is overclocked to the limit while the other system is at stock, but I still expected higher performance from the newer system... In the physics test, of course, the newer CPU generated more than twice as many frames per second (8 cores, 16 threads, about the same frequency, 2x the total performance compared to the 6 cores/threads on our CPUs).


----------



## Devildog83

OK, I know something was wrong with my AMD set up because this -

AMD


Intel G3258k dual core @ 4.1 Ghz
http://www.3dmark.com/3dm/4225740


By the way, the FX 8350 was at 4.8Ghz


----------



## bluedevil

Holy crap I actually got a Valid Result! Not the highest, but Valid nonetheless!
9262 1200/1250 3.6Ghz
http://www.3dmark.com/3dm/4226669

9455 1200/1500 3.6Ghz
http://www.3dmark.com/3dm/4226759


----------



## chronicfx

Actually a valid result for me too. So I will post it.

http://www.3dmark.com/3dm11/8782449

P24827 3dmark11

and

http://www.3dmark.com/3dm/4227170

P20779 Firestrike


----------



## JordanTr

Hello everybody. I just wanted to inform Arizonian that he can remove me from the owners list, because I sold my R9 290 and just got a Gigabyte G1 GTX 970 delivered.


----------



## mus1mus

Quote:


> Originally Posted by *Devildog83*
> 
> OK, I know something was wrong with my AMD set up because this -
> 
> AMD
> 
> 
> Intel G3258k dual core @ 4.1 Ghz
> http://www.3dmark.com/3dm/4225740
> 
> 
> By the way, the FX 8350 was at 4.8Ghz


Something is wrong with your combined score. prolly a driver issue.

I'm pretty sure it would be easy for you to break 9500+ total score at 4.8. Try out the available versions of CCC on the main page. It's a bummer but you'll figure it out.


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> Holy crap I actually got a Valid Result! Not the highest, but Valid nonetheless!
> 9262 1200/1250 3.6Ghz
> http://www.3dmark.com/3dm/4226669
> 
> 9455 1200/1500 3.6Ghz
> http://www.3dmark.com/3dm/4226759


are you using the second bios here?


----------



## Devildog83

Quote:


> Originally Posted by *mus1mus*
> 
> Something is wrong with your combined score. prolly a driver issue.
> 
> I'm pretty sure it would be easy for you to break 9500+ total score at 4.8. Try out the available versions of CCC on the main page. It's a bummer but you'll figure it out.


It's gone now. Bids are piling up on eBay for the motherboard and CPU as we speak. I am getting an Intel 5820k as soon as I get paid for these. I was never able to figure it out, but it's probably my overclock settings or something. I am just using the 3258k and Z97 until that happens.


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> are you using the second bios here?


OH MY GOD! I completely forgot about the 2nd BIOS!

Just ran another run with Anisotropic filtering off.
http://www.3dmark.com/3dm/4227640


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> OH MY GOD! I completely forgot about the 2nd BIOS!
> 
> Just ran another run with Anisotropic filtering off.
> http://www.3dmark.com/3dm/4227640


that's with 2nd bios? i know 14.9 shows valid results now. wonder why? what oc settings?


----------



## bluedevil

This is with the 2nd BIOS. Not really a difference.

http://www.3dmark.com/3dm/4227841

1175/1500 @ 3.9Ghz

I guess it is what it is.


----------



## Devildog83

Quote:


> Originally Posted by *JordanTr*
> 
> Hello everybody. I just wanted to inform Arizonian, that he can remove me from owners list cause i sold my r9 290 and just got delivered gigabyte G1 GTX970


TRAITOR !!!! Just kidding







Good luck, I heard they overclock like mad.


----------



## bluedevil

Quote:


> Originally Posted by *Devildog83*
> 
> TRAITOR !!!! Just kidding
> 
> 
> 
> 
> 
> 
> 
> Good luck, I heard they overclock like mad.


I am half tempted to do the same right about now.


----------



## tsm106

Quote:


> Originally Posted by *bluedevil*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Devildog83*
> 
> TRAITOR !!!! Just kidding
> 
> 
> 
> 
> 
> 
> 
> Good luck, I heard they overclock like mad.
> 
> 
> 
> I am half tempted to do the same right about now.
Click to expand...

Why??

Here's my 7970 vs my Zotac. This is 1340/1700 vs 1500/2000 btw.

http://www.3dmark.com/compare/fs/2715773/fs/2865843


----------



## mus1mus

Quote:


> Originally Posted by *tsm106*
> 
> Why??
> 
> Here's my 7970 vs my Zotac. This is 1340/1700 vs 1500/2000 btw.
> 
> http://www.3dmark.com/compare/fs/2715773/fs/2865843


WOW..

I wonder how a 290 at 1200 would compare to these.

wonderful!


----------



## bluedevil

Quote:


> Originally Posted by *tsm106*
> 
> Why??
> 
> Here's my 7970 vs my Zotac. This is 1340/1700 vs 1500/2000 btw.
> 
> http://www.3dmark.com/compare/fs/2715773/fs/2865843


Guess that tells me that my 290 is being choked by my 3470. Time to get a 4790k.


----------



## kizwan

The 970 can overclock like mad, but my 290 @1200 can catch up with a 970 @1500 in 3DMark 11 & Firestrike.


----------



## bluedevil

Quote:


> Originally Posted by *kizwan*
> 
> 970 can overclock like mad but my 290 @1200 can catch up 970 @1500 in 3DMark 11 & Firestrike.


I guess the reviews don't show that.


----------



## kizwan

Quote:


> Originally Posted by *bluedevil*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> 970 can overclock like mad but my 290 @1200 can catch up 970 @1500 in 3DMark 11 & Firestrike.
> 
> 
> 
> 
> 
> 
> 
> I guess the reviews don't show that.
Click to expand...

The truth is out there...







Don't need to rely on reviews, you can check out 970 owner club thread.


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> I am half tempted to do the same right about now.


can't really blame you, since you have to oc 50 - 60MHz more to match other 290s. even with the 2nd bios you are missing 700 pts in graphics. it is not your cpu. maybe the motherboard. not sure.


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> can't really blame you, since you have to oc 50 - 60MHz more to match other 290s. even with the 2nd bios you are missing 700 pts in graphics. it is not your cpu. maybe the motherboard. not sure.


I really should replace my motherboard first, though. It's quite dated; I just can't decide if a 4790k or a 4690k is right for me.


----------



## DividebyZERO

Quote:


> Originally Posted by *mus1mus*
> 
> WOW..
> 
> I wonder how a 290 at 1200 would compare to these.
> 
> wonderful!


I have a 290x @ 1200/1600 on an x5650 (old-school X58 board); score here, with 13.12 drivers. I haven't tried the new ones yet for FS. It's similar in some ways and different in others, but it's some data from a quick run:

http://www.3dmark.com/3dm/4184514?


----------



## Devildog83

Quote:


> Originally Posted by *kizwan*
> 
> The truth is out there...
> 
> 
> 
> 
> 
> 
> 
> Don't need to rely on reviews, you can check out the 970 owners club thread.


Yep,

It has to be highly overclocked to beat a 290 because of the 256-bit memory bus and only 1600+ CUDA cores, and from what I have seen, once you do that the power savings go out the window. But they do overclock well.


----------



## rdr09

Quote:


> Originally Posted by *DividebyZERO*
> 
> I have a [email protected] 1200/1600, on x5650(old school x58 boards) score here, and 13.12 drivers i haven't tried the new ones yet for FS - its similar in some ways and different in others but it's some data from a quick run
> 
> http://www.3dmark.com/3dm/4184514?


14.X or 14.9 and add 400 - 500 pts in graphics score.


----------



## DividebyZERO

Quote:


> Originally Posted by *rdr09*
> 
> 14.X or 14.9 and add 400 - 500 pts in graphics score.


A newer-platform CPU would probably give me way more; it was just a test run at the time. I'll try 14.9 and see.


----------



## mus1mus

Does it give some boost in gaming?


----------



## DividebyZERO

Mistake earlier: I just realized the old score I posted was actually 1225/1600. The proper score for 1200/1600 for me was actually:
http://www.3dmark.com/3dm/4184477?
Quote:


> Originally Posted by *rdr09*
> 
> 14.X or 14.9 and add 400 - 500 pts in graphics score.


Looks like it did give a boost,

http://www.3dmark.com/3dm/4229000?

x5650 @ 4.2GHz - R9 290x @ 1200/1600, 14.9


----------



## DividebyZERO

Quote:


> Originally Posted by *mus1mus*
> 
> Does it give some boost in gaming?


Not sure if this is valid since I'm going from 13.12 to 14.9 in these two titles, but there appears to be a slight boost.

14.9 - 1200/1600
*Metro LL redux - 3 runs*
Average Framerate: 92.67
Max. Framerate: 280.50
Min. Framerate: 4.09

Tomb raider


Spoiler: Warning: Spoiler!







13.12 1200/1600
*Metro LL redux - 3 runs*
Average Framerate: 89.00
Max. Framerate: 271.22
Min. Framerate: 4.05

Tomb Raider


Spoiler: Warning: Spoiler!







Settings used:


Spoiler: Warning: Spoiler!



Win7
1080p
1200/1600 GPU R9 290x
CPU 4.2ghz x5650
1600 DDR3 6gb



Metro LL Redux
Options: Resolution: 1920 x 1080; Quality: Very High; SSAA: Off; Texture filtering: AF 16X; Motion Blur: Off; Tesselation: Normal; VSync: Off; Advanced PhysX: Off;

Tomb Raider


Spoiler: Warning: Spoiler!


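To put a number on that "slight boost", the Metro LL Redux averages quoted above work out to roughly a 4% gain (a quick sketch using only those two figures; the game and driver versions are as stated in the post):

```python
# Percent-delta check on the Metro LL Redux averages quoted above
# (13.12 vs 14.9 driver, same 1200/1600 clocks).
old_avg, new_avg = 89.00, 92.67
gain = (new_avg - old_avg) / old_avg * 100
print(f"Average-FPS gain: {gain:.1f}%")  # about 4.1%
```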
----------



## rdr09

Quote:


> Originally Posted by *DividebyZERO*
> 
> Mistake earlier, i just realized the old score i posted earlier was actually 1225/1600 - The proper score for 1200/1600 for me was actually
> http://www.3dmark.com/3dm/4184477?
> Looks like it did give a boost,
> 
> http://www.3dmark.com/3dm/4229000?
> 
> x5650 @ 4.2GHz - R9 290x @ 1200/1600, 14.9


nice. i was hoping you will beat my 290 . . . and you did. 1200 core . . .

http://www.3dmark.com/3dm/4205303

14.X gives a little more boost but pretty close.


----------



## Noufel

14.9 or 14.X, which one is better for Crossfire 290s?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Noufel*
> 
> 14.9 or 14.x wich one is beter for crossfire 290 ???


They both work well for me









I'm sticking with 14.9 though


----------



## Noufel

Quote:


> Originally Posted by *Sgt Bilko*
> 
> They both work well for me
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm sticking with 14.9 though


Thnx for the reply. I'll keep 14.9; I got 10700 in Firestrike with one 290 at 1150/1500, compared to 10200 with 14.8.


----------



## Wezzor

It seems that most people use TriXX for overclocking and MSI Afterburner/RivaTuner for monitoring, etc. Why is that? Is TriXX somehow better when it comes to overclocking?


----------



## steadly2004

Yes: TriXX allows +200mV instead of +100mV with AB, and it handles CF better in terms of applying profiles to more than one card. But it does not show graphs as nicely as AB does.

Sent from my Nexus 5 using Tapatalk


----------



## Offler

Quote:


> Originally Posted by *Wezzor*
> 
> It feels somehow that most people use Trixx for overclocking and MSI Afterburner/Rivaturner for monitoring etc why is that? Is Trixx better somehow when it comes to overclocking?


I also use TriXX. In my case 3rd-party tools used to cause trouble: since I upgraded to the R9-290x, the system would freeze when I started MSI AB while any video stream was running. TriXX apparently had this trouble too (according to its changelog), but it was resolved.

It's not the best tool and has some bugs, but overall it works well and is simple.


----------



## Wezzor

Quote:


> Originally Posted by *Offler*
> 
> I also use TriXX. In my case 3rd party tools used to cause trouble. Since I upgraded to R9-290x, the system will freeze when I start MSI AB while there is any video stream running. TriXX apparently (according to its changelog) had this trouble, but it was resolved.
> 
> Its not the best tool, has some bugs, but overall works well, and is simple.


Okay, guess I'll give it a try then.









Anyway guys, I noticed something that is really weird, at least for me. I've overclocked my GPU to 1100/1300 and haven't increased the voltage, only the power limit, which is set to +20. I get a much better score and FPS in both Firestrike and Valley, and I don't get any artifacts either. But when I play games, for example BF4, I get FPS drops all the time, whereas when I go back to stock clocks I get a constant 60FPS without any drops (V-Sync enabled). What can be causing this? Can it be that my GPU actually needs some voltage? I appreciate any help you guys can provide.


----------



## Offler

Quote:


> Originally Posted by *Wezzor*
> 
> Okay, guess I'll give it a try then.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway guys. I noticed something that is really weird atleast for me. I've overclocked my GPU to 1100/1300 and haven't increased the voltage only the power limit that is set to 20+. I get much better score and FPS in both Firestrike and Valley and I don't get any artifacts either. But when I play games for example BF4 I get FPS drops all the time but when I go back to stock clocks I get constant 60FPS without any FPS drops (Got V-sync enabled). What can be causing this? Can it be that my GPU actually needs some voltage? I appreciate any help you guys can provide.


If the card goes up to 94°C it automatically downclocks; otherwise it would overheat. If you overclock, you may encounter this earlier.

I used TriXX to customize the fan curve. The minimum is 35-38%; over 65°C it ramps up, and at 87°C it reaches 78%. I also always set power to +50%. Then the card runs at max frequency during gaming. Everything above 40 percent is quite noisy; above 50 it's extremely noisy....

Just keep in mind I have a lapped stock cooler. A normal stock cooler will overheat earlier, while most other coolers will be fine.

You have to learn your own balance between temperature, noise and frequency, and set up the card accordingly. A change of cooler, or at least a cooler mod, is strongly recommended (depending on the cooler on your card).
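The fan curve described above can be sketched as a simple piecewise-linear function. The anchor points (38% floor, ramp from 65°C, 78% at 87°C) come from the post; the linear interpolation between them is my assumption, since TriXX lets you shape the curve freely:

```python
# Illustrative fan curve: flat floor up to 65 degC, then a linear ramp
# that reaches 78% duty at 87 degC and stays there.

def fan_percent(temp_c: float, floor: float = 38.0,
                ramp_start: float = 65.0, ramp_end: float = 87.0,
                ramp_top: float = 78.0) -> float:
    """Return fan duty (%) for a given GPU temperature."""
    if temp_c <= ramp_start:
        return floor
    if temp_c >= ramp_end:
        return ramp_top
    frac = (temp_c - ramp_start) / (ramp_end - ramp_start)
    return floor + frac * (ramp_top - floor)

for t in (50, 65, 76, 87, 94):
    print(f"{t:>3} degC -> {fan_percent(t):.0f}% fan")
```

The point of the ramp is exactly the balance the post describes: quiet below 65°C, then trading noise for clocks before the 94°C throttle point is reached.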


----------



## mus1mus

14.9 indeed. Valid and highest I have ever gotten.









1150/1500 http://www.3dmark.com/3dm/4233229


----------



## bluedevil

I just had a realization. Why do we bench our hardware with benchmarking software when games are where we actually want the performance? Furthermore, why bother chasing higher FPS when our monitors can't display the FPS the GPU/CPU is putting out?

My point entirely is (for example, on my system) that the only games I play ATM are Titanfall and BF4. That's it. We all know Titanfall is not too difficult to run. BF4, on the other hand, is. My settings are pretty much Ultra (no AA/AF/MSAA) @ 1440p @ 92Hz. With that said, I tend to get about 92 FPS (refresh-rate cap) depending on the map/players, and that's running completely stock frequencies on both my CPU and GPU.

I guess that's my current hangup. I have been thinking of moving to a 4690K/Z97 but fail to see the value in anything but benchmarks.


----------



## kizwan

Set power limit to max.


----------



## Devildog83

Quote:


> Originally Posted by *mus1mus*
> 
> 14.9 indeed. Valid and highest I have ever gotten.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1150/1500 http://www.3dmark.com/3dm/4233229


Very nice. I can't wait to see what mine will do with a 5820k and X99.


----------



## Wezzor

Quote:


> Originally Posted by *Offler*
> 
> If the card goes up to 94°C it automatically downclocks, else it may overheat. If you overclock, you may encounter this earlier.
> 
> I used TriXX to customize fan curve. Minimum is 35-38%, over 65°C it goes up and on 87°C it reaches 78%. Also I always set power +50%. Then the card runs on Max frequency during gaming. Everything above 40 percent is quite noisy, above 50 its extremely noisy....
> 
> Just keep in mind I have lapped stock cooler. Normal stock will overheat earlier, while most other coolers will be ok.
> 
> You have to learn your own balance between temperature, noise and frequency, and setup the card accordingly. Change of cooler, or at least cooler mod is strongly recommended (depending on cooler on your card).


Well, my VRMs go to 68°C and the same goes for the GPU temperature, so that cannot be the reason. I did increase the power limit to +50% now; perhaps that will help







I'll come back with answers.


----------



## zealord

Something is definitely wrong with 3DMark Firestrike. Look at the graphics score.

Look at these 2 results:

http://www.3dmark.com/3dm/4233229 (1150MHz R9 290)

http://www.3dmark.com/3dm/4184514? (1200MHz R9 290X)

These are not my results, but two I have taken out of this thread. The weird thing is that I have the same "problem" with my 290X: my graphics score at 1100MHz is 11700, which is a little too low.
But it is just 3DMark Firestrike. I tested Valley, Heaven, 3DMark11, Shadow of Mordor, etc., and the scores were spot on compared with similar systems; only Firestrike behaves like this. I tried a different OS, disabling Aero, AF settings in Catalyst, a clean install of the new driver, etc. Nothing helps.
I have noticed this a few times over the last couple of days and have no idea why. It has nothing to do with the operating system, RAM, CPU, etc. Remember: just the graphics score.


----------



## LandonAaron

There is nothing wrong with those two scores: the higher 11,270 score was from a $1000 Xeon processor paired with an R9 290x, and the 9713 score was from a $140 AMD FX processor paired with an R9 290x. While graphics are chiefly the department of the GPU, a good CPU helps too. My i7-965 with an R9 290x scores 9441, FYI. Edit: NM, I see what you're saying; you're just talking about the graphics scores. Yeah, I don't know. My graphics score is 11254 running all stock speeds, with just the power % and a custom fan profile adjusted.


----------



## zealord

Quote:


> Originally Posted by *LandonAaron*
> 
> Ther is nothing wrong with those two scores the higher 11,270 score was from a $1000 Xeon processor paired with a R9 290x, and the 9713 score was from $140 AMD FX processor paired with a r9 290x. While graphics are chiefly the department of the GPU a good CPU helps too.


I am not talking about the overall score (like I mentioned about 3 times in my post). I am talking about the _*[ GRAPHICS SCORE ]*_.
The funny thing is that the one where the graphics score is lower than it should be is actually the one with the better CPU, so the CPU cannot be the problem/bottleneck here.

do people even read what others write


----------



## rdr09

Quote:


> Originally Posted by *zealord*
> 
> something is definitely wrong with 3dmark firestrike. Look at the graphics score.
> 
> look at these 2 results :
> 
> http://www.3dmark.com/3dm/4233229 (1150mhz R9 290)
> 
> http://www.3dmark.com/3dm/4184514? (1200mhz R9 290X)
> 
> These are not my results, but 2 I have taken out of this thread. Weird thing though I have the same "problem" with my 290X. My 1100mhz graphics score is 11700. It is a little bit too low.
> But it is just 3dmark firestrike. I tested valley, heaven, 3dmark11, shadow of mordor etc. and the score was spot on with similar system, only 3dmark FS is behaving like this. I tried different OS, disable Aero, AF settings in catalyst, clean install new driver, etc. etc. etc. Nothing helps.
> I have noticed this a few times the last couple of days. I have no idea why this is. It has nothing to do with Operating System, RAM, CPU etc. Remember : just the graphics score.


Drivers. Going from 13.x to 14.x gives an extra 500 pts.

see post #30333


----------



## zealord

Quote:


> Originally Posted by *rdr09*
> 
> Drivers. 13 - 14 gives extra 500 pts.


Yeah, that is true, but it should be much more than that.
One card is an R9 290 and the other is an R9 290X.
The R9 290 is even clocked lower and can keep up. The 1200MHz R9 290X should be at least 1000 points above the 1150MHz R9 290, no matter the drivers.
Moreover, I am running 14.9 and I have a graphics score of 11700 with my R9 290X at 1100MHz, which is also too low.
The driver added a few hundred points in 3DMark and 3DMark11, but that is not the problem here. I have no idea what it is.


----------



## rdr09

Quote:


> Originally Posted by *zealord*
> 
> Yeah that is true, but it should be much more than that.
> The one card is a R9 290 and the other one is a R9 290X.
> The R9 290 is even clocked lower and can keep up. The 1200mhz R9 290X should be atleast 1000 points above the 1150mhz R9 290 no matter what drivers.
> Moreover I am running 14.9 and I have a graphics score of 11700 with my R9 290X at 1100mhz. Which is aswell too low.
> The driver added a few hundred points in 3dmark and 3dmark11, but this is not the problem here. I have no idea what it is.


with everything almost equal except the cards, the difference between the 290 and 290X at the same clocks is approx. 300 pts at 1200 core.

290X

http://www.3dmark.com/3dm/4229000?

290

http://www.3dmark.com/3dm/4205303


----------



## zealord

Quote:


> Originally Posted by *rdr09*
> 
> with everything almost equal except the cards . . . the difference between the 290 and 290X at same clocks is 300 pts. approx. 1200 core
> 
> 290X
> 
> http://www.3dmark.com/3dm/4229000?
> 
> 290
> 
> http://www.3dmark.com/3dm/4205303


Exactly, and it should be more than that. By the way, how do you know what clocks the cards were running at? Are these results from this thread?


----------



## LandonAaron

Just got an R9 290x used off eBay. Apparently some guy has like 68 of these he was using for bit mining and is now selling them all on eBay. He was letting them go for $220 + $15 shipping, but now it looks like he is putting them on auction 10 at a time. I got mine 2 days ago and performance seems good; temps and benchmarks all seem normal: GPU temp 81°C, VRM1 51°C, VRM2 67°C, with the fan at 57% while doing the simple GPU-Z PCI-E render test. I had to set up a custom fan profile though, as the default auto fan control let temps go up into the 90s while only running the fan at like 38-40%, even with the Uber mode switched on. I'm in the process of installing my Arctic Accelero Hybrid water cooler on it, and got one of those Gelid Icy Vision VRM heatsinks for it, so hopefully I will be able to get some good overclocks. Also the card has Hynix memory, so that's cool.

Question though: I noticed something weird last night while playing Skyrim. MSI Afterburner kept reporting my GPU usage as either 100 or 0% and rarely anything in between. I know this can't be accurate, because my GTX 770 only ran at like 60% utilization most of the time in Skyrim, so the 290x shouldn't be getting maxed out. Also there is no way it could be 0% while running around in a 3D game, no matter how simple the area may be (not paused or with the inventory open or anything like that; just running around with 0% GPU utilization). I tried checking the GPU utilization with MSI Afterburner, GPU-Z, and OverDrive, and they all reported the same thing: either 0% or 100%. This is on a Sapphire R9 290x reference card.

One more question: I am coming from Nvidia cards, which only ever had one adjustable voltage, and I noticed in MSI Afterburner this card has a core voltage and an auxiliary voltage I can adjust. Is it safe to max both of these out, and if not, what is the max value I should toy with for both? Thanks.


----------



## zealord

Maybe someone here with a normal 290 running 1100 core / 1350 mem could run 3DMark Firestrike for me and post the result?

I would also be interested in an R9 290X 1100/1350 run.

OS, CPU etc. don't matter. Thanks


----------



## Talon720

With an OC'd 4770k and 3 OC'd 290x's I got a Firestrike (Performance) score of 21000 (well, it was 21xxx; I don't have the exact score in front of me). Does anyone else with a similar setup have scores? I wanted to see if I'm performing normal/over/under.


----------



## zealord

Quote:


> Originally Posted by *Talon720*
> 
> With a 4770k oc and 3 290x oc I got a firestrike (performance) score of 21000 well it was 21xxx don't have the exact score infront of me. Any of you else have a similar setup have any scores wanted to see if I'm normal/over/under performing.


It is way easier if you just post the link to the Firestrike result and tell us your GPU and CPU clocks.


----------



## chronicfx

Quote:


> Originally Posted by *Talon720*
> 
> With a 4770k oc and 3 290x oc I got a firestrike (performance) score of 21000 well it was 21xxx don't have the exact score infront of me. Any of you else have a similar setup have any scores wanted to see if I'm normal/over/under performing.


I get just under 21,000 with my 4790K at 4.7 GHz and 3x 290X at stock on air:
http://www.3dmark.com/3dm/4227170

Can you link yours as well? I want to see


----------



## bond32

This was my best score: http://www.3dmark.com/fs/2787366

4770K at 4.9 GHz, one 290X and two 290s, 1250/1500


----------



## JourneymanMike

Quote:


> Originally Posted by *LandonAaron*
> 
> Just got a r9 290x used off ebay. Apparently some guy has like 68 of these he was using for bit mining and is now selling them all on Ebay. He was letting them go for $220 + 15 shipping, but now it looks like he is putting them on auction 10 at a time. I got mine 2 days ago and performance seems good, temps and benchmarks all seem normal. GPU temp 81 C, VRM1 51C, VRM2 67C, with fan at 57% while doing simple GPU-Z PCI-E render test. Had to set up custom fan profile though as default auto fan control let temps go up into the 90's while only running the fan at like 38-40%, even with the UBER mode switched on. I'm in the process of installing my Arctic Accelero Hybrid water cooler on it, and got one of those Gelid ICY vision VRM heatsinks for it, so hopefully will be able to get some good overclocks. Also the card has Hynix memory so that's cool.
> 
> Question though, I noticed something weird last night while playing skyrim. MSI afterburner kept reporting my GPU usage as either 100 or 0% and rarely anything in between. I know this cant be accurate cause my GTX 770 only ran at like 60% utilization most of the time in Skyrim so the 290x shouldn't be getting maxed out. Also there is no way it could be 0% running around in a 3d game no matter how simple of an area it may be (not paused or inventory open or anything like that, running around with 0% GPU utilization). I tried checking the GPU utilization with MSI Afterburner, GUP-Z, and Overdrive and they all reported the same thing either 0% or 100%. This is on a Saphire R9 290x reference card.
> 
> One more question I am coming from Nvidia cards which only ever had one adjustable voltage and I noticed in MSI afterburner this card has a core voltage and an auxillary voltage I ca adjust. Is it safe to max both of these out, and if not what is the max value I should toy with for both? Thanks.


Hey, are these reference cards? And could you post a link to them?

Thanks,

Mike


----------



## rdr09

Quote:


> Originally Posted by *zealord*
> 
> exactly and it should be more than that. by the way how do you know what clocks the cards were running at? Are these results from this thread?


The 290 is mine and the other is DividebyZERO's 290X on page 3034.


----------



## StrongForce

Hey guys, I've been wondering about, and hesitating over, getting a second-hand R9 290. First, do you think the price will drop steadily over the next month, with retail prices potentially falling as a side effect of the GTX 970's aggressive pricing?

It might cost around 200 euros or a bit less, hopefully. I was also thinking of getting a Gelid Icy Vision cooler for it, which seems to cost around 50 euros, which is pretty pricey. In the end I wonder if I should just get an overclocked GTX 970 instead: for nearly the same price I'd get something new with even better performance.

I was wondering if you think the Gelid Icy Vision is a worthwhile investment (adaptable to future cards, maybe?) and whether it performs really well. According to a review I saw it seemed to do fine; only the VRM and RAM cooling was a bit weak, but that can be modded too, or I could simply put a fan near it.

Hopefully it's worth the upgrade from my HD 7950 OC, which currently scores around 7200 in Fire Strike with my CPU. It probably is, since I may be able to sell the HD 7950 for 80-90 on eBay.


----------



## ZealotKi11er

Quote:


> Originally Posted by *StrongForce*
> 
> Hey guys I was wondering and hesitating about getting a second hand r9 290, first do you think the price will drop consistently during next month, due to the retail prices potentially dropping as a side effect of the GTX 970 aggressive pricing.
> 
> It might cost arround 200 euros or a bit less hopefully.., also was thinking of getting a Gelid Icy cooler for it, it seem to cost arround 50 euros also which is pretty pricy, in the end I wonder if maybe I should just get a GTX 970 OC version for nearly the same price I get something new and even better performance.
> 
> Was wondering if you think the Gelid Icy is a valuable investment (adaptable to future cards maybe ?) and if it perform really well, according to some review I saw it seemed to do fine, just the VRM and RAM cooling was a bit weak.. but that can be modded too or simply put some fan near it.
> 
> Also, hopefully it's worth the investment from my hd 7950 OC that score arround 7200 on firestrike currently with my CPU also, potentially it sounds cause I may be able to sell my hd 7950 for 80-90 on ebay..


I wouldn't bother with aftermarket air coolers. I would just get the 970, unless you can buy a used 290 that already has an aftermarket cooler.


----------



## StrongForce

Yeah, since the second-hand option might be hard to find, I might just go with a 970 then. I'll wait a couple of weeks first; I heard the price might drop. And who knows, if the R9 290's price drops enough I might even be tempted by a stock-cooler one, even though that cooler is really bad. I mean, I just won't overclock; it's not like I really need to for 1080p anyway!

But I've heard of the throttling issues too...


----------



## battleaxe

Can black screens be caused by a bad PSU?

I put a water cooler on my second 290 and installed it. I had a water cooler on the first 290 already, without a single issue. I fired up 3DMark 11 and it now black-screens at stock within 3 to 5 seconds. It even does so at 800 MHz core / 1200 MHz RAM. I thought the card was dead.

So I put the other 290 in the top slot and used only one card instead. Same thing: black screen. On two different cards. Both were working just fine before and can clock up to about 1165 on the core. The second card has had its water cooler on for almost a year with no problems at all. Now both of them black-screen suddenly? Seems fishy to me... I've swapped both cards, run only one card, run both cards, and it's the same thing every time. Black screen. No BSOD, just black, and I have to hard-reboot every time or it just sits there indefinitely. My only thought is that it's not getting enough power: the PSU can't supply the voltage the cards need, so the screen goes black.

Ideas?


----------



## Aaron_Henderson

Got an ASUS 290X DirectCU II OC about a week ago, and just now got around to testing it. I haven't had an AMD card since an X1950 GT, and got used to Nvidia's Control Panel. I'm having an issue getting my HDTV to work. I can only connect through HDMI or VGA, and I would prefer HDMI. The problem is, I can't get it to display 1080p from the 290X, and everything I can get it to display is scaled strangely. The TV works fine with other cards I have, so I'm not sure what to do about it. I tried forcing the AMD driver software to add 1080p to the display manager, but it never shows up, even after a reboot. Am I looking at editing EDID info or something now?


----------



## kizwan

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> Got an ASUS 290X DirectCU II OC about a week ago, and just now got around to testing it. I haven't had an AMD card since an x1950 GT, and got used to Nvidia's Control Panel. I am having an issue getting my HDTV to work...I can only connect through HDMI or VGA...I would prefer HDMI. Problem is, I can't get it to display 1080p from the 290X, and everything I can get it to display is scaled strangely. The TV works fine with other cards I have...not sure what to about it. I tried forcing through the AMD driver software to add 1080p to the display manager, but it doesn't ever show up. Even after a reboot. Am I looking at editing EDID info or something now?


What resolution can you set it to at the moment?


----------



## Aaron_Henderson

Quote:


> Originally Posted by *kizwan*
> 
> At what resolution you can set to at the moment?


1680x1050 at 60 Hz or 1920x1080 at 30 Hz; both are scaled weirdly as well. The weirder thing is that I hooked up a 1440x900 monitor to it for now, and THAT lets me set 1080p. Obviously downscaled, but still weird.


----------



## LandonAaron

Quote:


> Originally Posted by *JourneymanMike*
> 
> Hey, are these reference cards? And could you post a link to them?
> 
> Thanks,
> 
> Mike


http://www.ebay.com/itm/Huge-Lot-of-Sapphire-Radeon-R9-290x-4GB-GDDR5-GPUs-7-of-10-/301337177706?pt=PCC_Video_TV_Cards&hash=item4629186e6a

This is the only one I see right now from the same seller, but he has been doing this for the last several days: he puts 10 up for sale, and once they all sell out he puts up another 10. I have been keeping my eye on them, because I was toying with the idea of buying a second one for CrossFire. He said he had 68 though, so he might be about to sell out. You might message him to see if he has more, or just get that one.


----------



## tsm106

Quote:


> Originally Posted by *mus1mus*
> 
> WOW..
> 
> I wonder how a 290 at 1200 would compare to these.
> 
> wonderful!


Quote:


> Originally Posted by *bluedevil*
> 
> Guess that tells me that my 290 is being choked by my 3470. Time to get a 4790k.


Quote:


> Originally Posted by *kizwan*
> 
> 970 can overclock like mad but my 290 @1200 can catch up 970 @1500 in 3DMark 11 & Firestrike.


Quote:


> Originally Posted by *bluedevil*
> 
> I guess the reviews don't show that.


Quote:


> Originally Posted by *kizwan*
> 
> The truth is out there...
> 
> 
> 
> 
> 
> 
> 
> Don't need to rely on reviews, you can check out 970 owner club thread.


Oh, you guys don't know the half of it. I'm still roughly a third of the way through game benching, and I was just shocked by the 290X in Grid AutoSport. At 4K, it just pummels the [email protected]/2000 while being clocked at 1250/1650. Grid AS outputs its results in text format, by the way. I have a few more games to go, and I wanted to compare it against two 7970s. I find that the [email protected] is fast when the load is low, but when you crank up the resolution and load, the card just does not scale with frequency. If you have 4K or higher and you have good cooling, I cannot think of any good reason to switch to a 970.

Quote:


> Originally Posted by *[email protected]/2000*
> fps_total av_fps_ms="19.184238" max_fps_ms="15.351959" min_fps_ms="25.295305" *av_fps="52.126125" max_fps="65.138268"* min_fps="39.533028"
> 
> *resolution multisampling="off"* vsync="0" fullscreen="true" aspect="auto" height="2160" width="3840"


Quote:


> Originally Posted by *[email protected]/1650*
> fps_total *av_fps_ms="20.739557"* max_fps_ms="16.782841" min_fps_ms="26.259253" av_fps="48.217037" max_fps="59.584671" min_fps="38.081814"
> 
> *resolution multisampling="8f16xeqaa"* vsync="0" fullscreen="true" aspect="auto" height="2160" width="3840"


----------



## ZealotKi11er

Quote:


> Originally Posted by *tsm106*
> 
> Oh you guys don't know the half of it. I'm still roughly a 1/3rd of the way thru game benching and I was just shocked by the 290x in Grid AutoSport. At 4k, it just pummels the [email protected]/2000 whilst being clocked at 1250/1650. Grid AS outputs in text format btw. I have a few more games to go and I wanted to compare it versus two 7970s. I find that the [email protected] is fast when there is low load but when you crank up the resolution and load, the card just does not scale with frequency. If you have 4K or higher and you have good cooling, I cannot think of any good reason to switch to a 970.


So basically if you want to use SSAA.


----------



## DividebyZERO

Is 3DMark (2013) Fire Strike validation working for anyone else right now? I keep uploading my results and it just sits there saying "validating"... and I can't view the results online.

Is this a good score for a single 290X? I am trying so hard, and it seems like the points are barely moving...


----------



## VSG

That is exceptionally good, but without the score having been validated I can't tell whether it bugged out somewhere. For reference, my golden 780 Ti KPE had a graphics score of 15k, and I haven't seen any Hawaii card hit that on above-ambient cooling.


----------



## zealord

Quote:


> Originally Posted by *DividebyZERO*
> 
> Is 3dmark 13 firestrike validation working for anyone else right now? I keep loading my results and it just sits there saying "validating"...... and i cant view the results online
> 
> is this a good score for a singl 290x? i am trying so hard and it seems like the points are barely moving...


That is pretty good for a single 290X. Did you change anything since your last run?


----------



## DividebyZERO

Quote:


> Originally Posted by *geggeg*
> 
> That is exceptionally good, but without the score having been validated I can't tell you if it bugged out somewhere or not. As a reference, my golden 780 Ti KPE had a graphics score of 15k and I haven't seen any Hawaii card hit that in above ambient cooling.


For the last 3 validations I had to reload 3DMark. That one above just sat there validating even after reloading; I rebooted, ran it again, and it went right through.

http://www.3dmark.com/3dm/4238542?
GPU is at 1250/1775 [email protected], PT1 BIOS, +200 mV
Quote:


> Originally Posted by *zealord*
> 
> That is pretty good for a single 290X. Did you change anything since your last run?


Yes, it's a different 290X; my other one was not liking anything past 1200/1600.


----------



## Aaron_Henderson

Despite it not liking my one HDTV, this card is better than expected, coming from GTX 570 SLI. I've barely had time to play with it so far, but it's certainly much faster and, best of all, much, much smoother. I was considering a 970, but the card I got was significantly cheaper, and I don't think I'll regret it. Nice and quiet too, the DirectCU II. I might do some light overclocking eventually, or I might just baby the card since I plan to get another one at some point. I'm still waiting on a BitFenix 8-pin extension before I post a picture; right now it has cables straight from the PSU, because I couldn't wait any longer to test out my 290X. The 570 SLI setup used 4x 6-pin connectors, so that's all I have for now.


----------



## VSG

Ah, that memory clock! That is a really good card you have there.


----------



## zealord

Quote:


> Originally Posted by *DividebyZERO*
> 
> the last 3 validations I had to reload 3dmark. That one above just sat there validating even after reloading, i rebooted and ran again and it went right through.
> 
> http://www.3dmark.com/3dm/4238542?
> Gpu is at 1250/1775 [email protected]
> Yes its a different 290x, my other one was not liking anything past 1200/1600


What voltage does the 1250 MHz one run at?
And what voltage did your old card run at for 1200 MHz?


----------



## DividebyZERO

Quote:


> Originally Posted by *geggeg*
> 
> Ah that memory clock! That is a really good card you have there


I am on water and using the PT1 BIOS; would it be suicidal to use the PT3 BIOS?
Quote:


> Originally Posted by *zealord*
> 
> what voltage does the 1250mhz one run at?
> and what voltage did your old run at 1200mhz run at?


1.359v


----------



## ZealotKi11er

What makes the PT1 BIOS good? Also, I don't think memory makes much difference: I gained just 100 points going from 1500 MHz to 1625 MHz.


----------



## VSG

Quote:


> Originally Posted by *DividebyZERO*
> 
> I am on water and using pt1 bios, would it be suicidal to use pt3 bios?
> 1.359v


Honestly, I am not the right person to answer this yet. I sold my two 290X cards when the mining craze hit, and only got back in with a 290, which I haven't even tested yet.









I do have a water block and loop in mind, but in the meantime there are more experienced guys who can answer that.


----------



## JourneymanMike

Quote:


> Originally Posted by *LandonAaron*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JourneymanMike*
> 
> Hey, are these reference cards? And could you post a link to them?
> 
> Thanks,
> 
> Mike
> 
> 
> 
> http://www.ebay.com/itm/Huge-Lot-of-Sapphire-Radeon-R9-290x-4GB-GDDR5-GPUs-7-of-10-/301337177706?pt=PCC_Video_TV_Cards&hash=item4629186e6a
> 
> This is the only one I see right now from the same seller but he has been doing this for the last several days, where he puts 10 up for sale and then once they all sale out he puts up another 10. I have been keeping my eye on them, cause I was toying with the idea of buying a second one for crossfire. He said he had 68 though so he might be about to sell out. You might message him to see if he has more or just get that one.

Thanks for the info. He doesn't have any listed right now; I'll keep an eye on it...


----------



## rdr09

Quote:


> Originally Posted by *DividebyZERO*
> 
> the last 3 validations I had to reload 3dmark. That one above just sat there validating even after reloading, i rebooted and ran again and it went right through.
> 
> http://www.3dmark.com/3dm/4238542?
> Gpu is at 1250/1775 [email protected] - pt1 bios +200mv
> Yes its a different 290x, my other one was not liking anything past 1200/1600


Nice, that's 14.9 at work. AMD is not done yet, for sure. Here is Devilhead's 290X; I think he got this one OC'd to 1350 or something.

http://www.3dmark.com/fs/2523594

found here . . .

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/30070

and, of course, we have outliers like this . . .

http://www.3dmark.com/fs/1310610


----------



## DividebyZERO

Quote:


> Originally Posted by *rdr09*
> 
> nice. that 14.9 at work. AMD is not done yet for sure. here is Devilhead's 290X. i think he got this oc'ed to 1350 or something.
> 
> http://www.3dmark.com/fs/2523594
> 
> found here . . .
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/30070
> 
> and, of course, we have outliers like this . . .
> 
> http://www.3dmark.com/fs/1310610


How big a role does the CPU play here? Also, was that last score done on air or water? It seems like it would have to have been done on ice or LN2.


----------



## chronicfx

Quote:


> Originally Posted by *bond32*
> 
> This was my best score: http://www.3dmark.com/fs/2787366
> 
> 4770k at 4.9 ghz, 1 290x and 2 290's, 1250/1500


So the G2 1300W handles 1200 MHz on all three cards? Very impressive! We're on one salary right now while my wife finds work, but water cooling these cards would be nice. The one thing I like about the reference cards is that everyone complains about VRM temps, yet mine never crack 55°C, although sub-50 on the core would be sweet.


----------



## rdr09

Quote:


> Originally Posted by *DividebyZERO*
> 
> how much role does cpu have here, also that last score was done on air/water? It seems like it would be done on ice or ln2??


The CPU does affect the graphics score, but not by much. For example, with a 7970, comparing my Phenom against an i7, the latter scores about 200 points more in graphics. I think Devil's run was done on water with really low ambient temps. The last one with a 20K graphics score... I am not sure that's legit. If it is, it was LN2 for sure.

I may try PT1 this winter. Here is my 290 @ 1280 using 14.x:

http://www.3dmark.com/3dm/4037848?


----------



## Nevk

14.9 WHQL

3770(Default)+R9 290 DC2OC (1000/1250)

http://www.3dmark.com/3dm/4232805?

FireStrike:

Score:9255


----------



## pdasterly

I need a guide to overclocking a water-cooled card; EK block on a 290X.


----------



## ZealotKi11er

Quote:


> Originally Posted by *pdasterly*
> 
> I need a guide to overclocking a watercooled card, ek block on 290x


There is no guide. Increase the voltage to +100 mV and see how high you can get. Start at 1150 MHz and go up from there.
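That trial-and-error approach amounts to a simple step-up search: raise the core clock in small increments and keep the last setting that passes a stability test. A minimal sketch of that loop, where `is_stable` is a hypothetical stand-in for actually running Heaven or Fire Strike at each clock:

```python
def find_max_stable_clock(is_stable, start=1150, step=25, limit=1400):
    """Step the core clock up until the stability test fails;
    return the last clock that passed (None if even `start` fails)."""
    best = None
    clock = start
    while clock <= limit:
        if not is_stable(clock):  # e.g. a Heaven loop at this clock
            break
        best = clock
        clock += step
    return best

# Example with a mock stability test: card happens to be stable up to 1200 MHz.
print(find_max_stable_clock(lambda mhz: mhz <= 1200))  # 1200
```

In practice each `is_stable` check is a long benchmark run with an eye on artifacts and VRM temps, which is why nobody bothers writing it down as a guide.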


----------



## pdasterly

I can do 1200 MHz on the GPU at +100 mV.

Drivers start to crash at 1225. Using Heaven.

GPU temp: 45°C
VDDC: 1.328 V
VRM1: 37°C
VRM2: 33°C


----------



## th3illusiveman

Benchies for you guys

i5 2500K - 4.4GHz
Cat 14.9 drivers
R9 290X

Quote:


> Run at *1165 core /1525 mem*


*3DM11 - Graphics score 18220*


Spoiler: Warning: Spoiler!






http://www.3dmark.com/3dm11/8779443

Quote:


> run at *1100/1500MHz*


3D Firestrike - graphics score 12825



Spoiler: Warning: Spoiler!






http://www.3dmark.com/3dm/4240350

Tomb Raider - Ultimate Preset - 91.4 Fps Average



Spoiler: Warning: Spoiler!








Heaven Extreme - 8XMSAA - 59.5 Fps average



Spoiler: Warning: Spoiler!








Batman Arkham City - Extreme 8XMSAA - 140 Fps average



Spoiler: Warning: Spoiler!








Metro Last Light (d6) - Very High SSAA - 51.47 average



Spoiler: Warning: Spoiler!








Valley - Extreme HD 8XMSAA - 67.9 Fps



Spoiler: Warning: Spoiler!


----------



## Buehlar

Quote:


> Originally Posted by *th3illusiveman*
> 
> Benchies for you guys
> 
> i5 2500K - 4.4GHz
> Cat 14.9 drivers
> R9 290X
> [/U][/B]
> 
> *3DM11 - Graphics score 18220
> *
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8779443
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 3D Firestrike - graphics score 12825
> 
> [/U][/B]
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/4240350
> 
> Tomb Raider - Ultimate Preset - 91.4 Fps Average
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Heaven Extreme - 8XMSAA - 59.5 Fps average
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Batman Arkham City - Extreme 8XMSAA - 140 Fps average
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Metro Last Light (d6) - Very High SSAA - 51.47 average
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Valley - Extreme HD 8XMSAA - 67.9 Fps
> 
> 
> 
> Spoiler: Warning: Spoiler!


Nice...









How about your temps?


----------



## th3illusiveman

Quote:


> Originally Posted by *Buehlar*
> 
> Nice...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How about your temps?


With the 1165 clock it was really just an extreme run: the core went to ~80°C and the VRMs were ~92°C, so it's not ideal.

My 24/7 1100/1500 profile has the core sitting at 70°C and the VRMs at 80°C; max fan speed is 70%.
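A custom fan profile like the one behind these numbers is just a piecewise-linear curve mapping temperature to fan speed, capped at a maximum. A sketch of how such a curve evaluates (the points below are illustrative, not anyone's actual Afterburner profile):

```python
def fan_speed(temp_c, points):
    """Piecewise-linear fan curve: `points` is a sorted list of (temp, fan%)."""
    if temp_c <= points[0][0]:
        return points[0][1]
    for (t0, f0), (t1, f1) in zip(points, points[1:]):
        if temp_c <= t1:
            # linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return points[-1][1]  # clamp at the last point (70% max here)

curve = [(40, 30), (60, 50), (70, 70)]  # caps at a 70% max fan speed
print(fan_speed(50, curve), fan_speed(75, curve))  # 40.0 70
```

The steeper the segment between two points, the faster the fan ramps through that temperature band, which is the knob people turn when the stock auto control lets temps climb into the 90s.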


----------



## Arizonian

Quote:


> Originally Posted by *Offler*
> 
> Just finished with OC of my Sapphire R9-290x with lapped reference stock cooler:
> 
> http://www.3dmark.com/3dm/4223472
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> This is the limit of this card, with its stock cooler, higher OC will cause artifacts.
> 
> If there is somebody with Phenom II x6 1090t or 1100t, please show what can your system do with water, LN2 or similar.
> 
> (I want to see CPU on 4000MHz or more, and GPU over 1200Mhz)
> 
> *Edit: Want in the club*
> (Hope the info in the screenshot is enough)
> 
> GPU: R9-290x
> Manufacturer: Sapphire
> Cooler: Modded Reference Stock (Lapped, with Arctic Silver 5 thermal paste)
> Core: 1180MHz
> VRam: 1615Mhz
> VDDC: +200
> Power: +50%
> Fan speed: Locked on 90%
> 
> Artifacts: No


Congrats - added









Quote:


> Originally Posted by *JordanTr*
> 
> Hello everybody. I just wanted to inform Arizonian, that he can remove me from owners list cause i sold my r9 290 and just got delivered gigabyte G1 GTX970


No problem. Enjoy.


----------



## Buehlar

@Arizonian can I join?









@th3illusiveman
My temps are similar to yours at almost the same clocks. I don't want to push these GPUs any further until they're on water... they run HOT!

*Cat 14.9
i7 3770K @ 4.5GHz @ ~55° (on water)
2x ASUS 290X DCII
core @ 1165MHz @ 81°
mem @ 1500MHz, VRMs @ 84°
Fans @ 100%

Single card run http://www.3dmark.com/fs/2885547*









*Crossfire run @ same clocks http://www.3dmark.com/fs/2885783*

The 2nd GPU will always run a bit cooler in general but you can tell the 1st GPU is really starving for air.


----------



## Offler

I have been checking 3DMark results for the R9 290(X) with various AMD CPUs...

The graphics and physics scores are OK, but the combined score is very bad...
http://www.3dmark.com/fs/2423294

Why is this happening?

Also, I don't have the highest score for the combination of an AMD CPU and a single R9 290X, but... I have the highest valid score...

Code:



Code:


CPU          Valid   Highest
Phenom x6 1090t 10119   10362
Phenom x6 1100T 9438    9451
FX-8350         9973    11287
FX-9370         9570    10251
FX-8150         8196    8670

Come on guys... It's certain that there can be many more valid high scores in 3DMark FS.


----------



## Arizonian

Quote:


> Originally Posted by *Buehlar*
> 
> @Arizonian can I join?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @th3illusiveman
> My temps are similar to yours at almost the same clocks. I don't want to push these GPUs any further until they're on water...they run HOT!
> 
> *Cat 14.9
> i7 3770K @ 4.5GHz @ ~55° (on water)
> 2x ASUS 290x DCII
> core @ 1165mhz @ 81°
> vrm @ 1500mhz @ 84°
> Fans @ 100%
> 
> Single card run http://www.3dmark.com/fs/2885547*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Crossfire run @ same clocks http://www.3dmark.com/fs/2885783*
> 
> The 2nd GPU will always run a bit cooler in general but you can tell the 1st GPU is really starving for air.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## bond32

Quote:


> Originally Posted by *chronicfx*
> 
> So the g2 1300w handles 1200 on all three cards? Very impressive! I am one salary right now while my wife finds work but water cooling these cards would be nice. The only thing I like about the reference cards is every complains about vrm temp and mine never crack 55c although sub 50 on the core would be sweet.


Yeah, the 1300W still holds at around 1.4 V on the cards, but then as soon as the bench starts it shuts down. The cutoff is when each card pulls around 380-400 watts.
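That cutoff is easy to sanity-check: three cards at 380-400 W each already approach the PSU's rating before the CPU and the rest of the system are counted. A rough budget sketch (the 250 W figure for an overclocked 4770K plus the rest of the system is an assumption for illustration, not a measurement):

```python
def psu_headroom(psu_watts, gpu_watts_each, gpu_count, system_watts):
    """Remaining wattage after the GPUs and the rest of the system."""
    return psu_watts - (gpu_watts_each * gpu_count + system_watts)

# 3 Hawaii cards near 390 W each, plus ~250 W assumed for CPU + rest of system:
print(psu_headroom(1300, 390, 3, 250))  # -120: over budget, so the PSU trips
```

A negative result means the load exceeds the rating once the bench ramps all three cards, which matches the shutdown behavior described above.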


----------



## chronicfx

Quote:


> Originally Posted by *bond32*
> 
> Yeah the 1300 holds still around 1.4V on the cards then as soon as the bench starts it shuts down. When each card pulls around 380-400 watts is the cut off.


Good to know


----------



## Sonic_AFB

Hi! Good morning, everyone. I can finally introduce myself, with a reference ASUS R9 290X, water cooled, with Elpida memory, and so far no black screens. I still have to try more OC, but for now I'm just at 1100/1350 +25 mV. I'm a little scared because I currently haven't put heatsinks on the VRMs; I've just pointed fans at them and they stay at more or less 70°C. I hope to learn a lot here.

Have a nice day!


----------



## Klocek001

I just got a message from the shop where I bought my 290. They said they accept my RMA *conditionally* (I don't know the right expression; I'm translating word for word from Polish), but there are some "damaged capacitors and resistors". They sent it to Sapphire.
Does that mean I'll get no refund? Will Sapphire decline my warranty claim?


----------



## LandonAaron

When overclocking these cards, should I max out the auxiliary voltage as well? It looks like I can add 100 mV to both the core and aux voltages. I have always had Nvidia cards before, which only had a core voltage option, so I'm not sure in which scenarios I should increase the aux voltage.


----------



## BradleyW

Quote:


> Originally Posted by *LandonAaron*
> 
> When overclocking these cards should I max out the Auxiliary voltage as well? Looks like I can add 100mV to both the core and aux voltages. I have always had Nvidia cards before which only had a core voltage option, so not sure in which scenarios I should increase the Aux voltage.


No, leave AUX alone. It is the power given to the card via the PCI-E slot. I doubt you are maxing out the PCI-E power cables or the predefined PCI-E voltages. AUX might only help in extreme power-draw scenarios.


----------



## battleaxe

Well, I have to say I found something really weird. I had been getting black screens on my cards ever since installing the second water block. If I ran a test such as Valley, it would run for about 3 seconds and then black-screen. It did this even at 800 core / 1000 memory. Finally, sick of the crashing, I pulled the water block off the core. On the corners there were bare spots on the die where the paste had not compressed all the way down to the block. Not sure what happened. I'm guessing the temp sensors are somewhere in the middle of the die and not at the corners, as the temps looked totally fine: high 40s at load before the black screen.

I put it back together and the problem disappeared. Temps are 52°C max after 20 minutes of Valley, and no more crashing, even up to 1150 on both cards. Thank goodness that soap opera is over.


----------



## tsm106

Quote:


> Originally Posted by *geggeg*
> 
> That is exceptionally good, but without the score having been validated I can't tell you if it bugged out somewhere or not. As a reference, my golden 780 Ti KPE had a graphics score of 15k and I haven't seen any Hawaii card hit that in above ambient cooling.


Hmm, I ran a bench to see if I could hit a 15k graphics score. I haven't benched on Hawaii in what seems like ages; November '13, lol. Anyway... I wish it were cooler here; the forecast says 91°F. Here's the run on water only. I hope to better it when the heat wave ends.

http://www.3dmark.com/fs/2889373


----------



## ZealotKi11er

Quote:


> Originally Posted by *tsm106*
> 
> Hmm, I ran a bench to see if I could hit 15k g score. I haven't benched on Hawaii in what seems like ages, nov 13 lol. Anyways... I wish it was cooler here, forecast says 91f. Oh here's the run on water only. I hope to better that when the heat wave ends.
> 
> http://www.3dmark.com/fs/2889373


Hard to break 15K. I get 13.5K with a 1200 MHz core, and probably close to 14K with 1250 MHz. Once I hit a 1300 MHz core it artifacts like mad. I feel like any run where the card is not stable means nothing.


----------



## LandonAaron

Well, my first attempt at overclocking this puppy went well. I put +100 on both the core and memory frequencies to run at 1100 and 1350 respectively, upped the power limit to +50, and maxed the core and aux voltages. (Thanks, Bradley, for letting me know about the aux voltage; next time I will try without it.) I ran Unigine Valley at max settings at 1920x1080 for about 30 minutes with max temps of GPU 76°C, VRM1 70°C, and VRM2 57°C, using an Arctic Accelero Hybrid water cooler (1st gen) and Gelid Icy Vision VRM heatsinks. The only thing that seems weird to me is that before, with the stock cooler, it was VRM2 that would go up to about 70°C while VRM1 hovered around 50°C. It seems those temperatures are now reversed.


----------



## DividebyZERO

Nice score. I'm going to test 2 more 290Xs tonight. I've never tested mine individually before; I just went quadfire right out of the gate.

What were the clocks on that run, tsm106?


----------



## battleaxe

Quote:


> Originally Posted by *tsm106*
> 
> Hmm, I ran a bench to see if I could hit 15k g score. I haven't benched on Hawaii in what seems like ages, nov 13 lol. Anyways... I wish it was cooler here, forecast says 91f. Oh here's the run on water only. I hope to better that when the heat wave ends.
> 
> http://www.3dmark.com/fs/2889373


Dang... that's insane...


----------



## LandonAaron

Quote:


> Originally Posted by *battleaxe*
> 
> Well, I have to say I found something really weird. I was getting some weird black screens on my cards since installing the second water block. If I ran a test such as valley it would run for about 3 seconds then black screen. Even at 800core 1000memory it did this. Finally sick of the crashing I tore the water cooler off the core. On the corners there were bare spots on the die where the paste did not compress all the way down to the block. Not sure what happened. I'm guessing the temp sensors are in the middle of the die somewhere and not on the corner as the temps looked totally fine. High 40's at load before the black screen.
> 
> I put it back together and the problem dissapeared. Temps are 52 max after 20 minutes of Valley and no more crashing. Even up to 1150 on both cores. Thank goodness that soap opera is over.


I recently had to put a stock cooler back on a GTX 770, and I had a very similar issue to yours. I applied the thermal paste by just placing a small, rice-grain-sized blob in the middle of the die, but when I installed the card in my machine I was getting 90°C+ GPU temps after just a minute of Unigine Valley. So I took the cooler off the card and saw that the thermal paste hadn't spread all the way to the corners. I then applied the thermal paste in a uniform layer over the entire die, reinstalled it, and temperatures returned to normal. Lesson: maybe the generally recommended method of applying thermal paste, placing a very small amount in the middle of the die, isn't actually the best method. I think on video cards especially it might be OK to use a little extra thermal paste, since there's a groove right at the edge of the die and any excess can just squeeze out and get trapped there without ending up somewhere problematic. Apparently too little thermal paste is more dangerous than too much.


----------



## battleaxe

Quote:


> Originally Posted by *LandonAaron*
> 
> I recently had to put a stock cooler back on a GTX 770, and I had a very similar issue to yours. I applied the thermal paste by just placing a small rice-grain-sized blob in the middle of the die, but when I installed the card in my machine I was getting 90+ temps on the GPU after just a minute of Unigine Valley. So I took the cooler off the card and saw that the thermal paste hadn't spread all the way to the corners. I then applied the thermal paste in a uniform layer over the entire die, reinstalled, and temperatures returned to normal. Lesson: maybe the generally recommended method of applying thermal paste, placing a very small amount in the middle of the die, isn't actually the best method. I think on video cards especially it might be okay to use a little extra thermal paste, since there's that groove right at the edge of the die where any excess can squeeze out and get trapped without ending up somewhere problematic. Apparently too little thermal paste is more dangerous than too much.


Yeah, it definitely made me think twice about just dropping a dollop on there and calling it a day. Like you said, on a GPU it's gonna squeeze out the extra anyway. Kinda makes me wonder if the black screen issue everyone was having when these cards came out wasn't from having too little paste on the die from the factory, because I suddenly had a faulty card that had previously worked just fine. Really had me scratching my head for two days, I can tell you.


----------



## tsm106

You need to push down hard on the back of the assembled card/block, right on the back of the pcb using your thumb. Push hard to spread the tim out.


----------



## battleaxe

Quote:


> Originally Posted by *tsm106*
> 
> You need to push down hard on the back of the assembled card/block, right on the back of the pcb using your thumb. Push hard to spread the tim out.


Good tip. Lesson learned.


----------



## sugarhell

Quote:


> Originally Posted by *battleaxe*
> 
> Good tip. Lesson learned.


Just take care not to end up like tsm's avatar


----------



## Sonic_AFB

Hi guys! Is 60° at 1000/1250 with -0.019 V a little high for a water-cooled 290X plus a 4820K on a 360 rad? And is 64° at 1100/1350 with +0.025 V high? If these temps are high, I suspect my CPU water block may be dirty and restricting water flow (wouldn't be the first time...).


----------



## Widde

Does anyone know MSI's stance on replacing the TIM on the GPU? My Sapphire doesn't have the stickers on the screws ^^


----------



## BradleyW

Quote:


> Originally Posted by *Widde*
> 
> Does anyone know MSI's stance on replacing the TIM on the GPU? My Sapphire doesn't have the stickers on the screws ^^


They won't check unless they have reason to believe you've changed the TIM, e.g. stripped screws and so on.


----------



## Widde

Quote:


> Originally Posted by *BradleyW*
> 
> They won't check unless they have reason to believe you've changed TIM. E.g, stripped screws and so on.


Okay, so I'll be fine and gentle then.

Thanks ^^


----------



## VSG

Quote:


> Originally Posted by *tsm106*
> 
> Hmm, I ran a bench to see if I could hit 15k g score. I haven't benched on Hawaii in what seems like ages, nov 13 lol. Anyways... I wish it was cooler here, forecast says 91f. Oh here's the run on water only. I hope to better that when the heat wave ends.
> 
> http://www.3dmark.com/fs/2889373


Great job


----------



## amptechnow

Yeah, that score is nice. This is what I hit with 2 290s: http://www.3dmark.com/3dm/4248548? Only 4000 points more; thought I would do better.

Went back to 14.7 RC3, and in all the other benches I've gotten higher scores than on any other driver. 14.9 was horrible: CrossFire wouldn't even work right, plus a bunch of other issues. I've been waiting so long for an official driver and they released garbage.


----------



## kizwan

I got a strange issue with 14.9: Firestrike flickering with multiple colours, which only happened when overclocked. No crash, though. I went back and forth between the 14.6 beta and 14.9 drivers; no issue with 14.6 beta. When I re-installed 14.9 for the second time, it was still happening. An hour later I tried 3DMark 11, no issue. Then I tried Firestrike again, and it ran without any issue at all this time. Strange indeed.

The only difference is the A/C was on when the issue occurred. Then again, there was no problem with 14.6 beta or 14.X or any previous drivers.

Scores are pretty much similar to 14.X. Since it's WHQL, I'll keep it.


----------



## zealord

Anyone tested the Win 10 technical preview yet?

I got a ~600 point increase in 3DMark FS with it.
Could be a faulty 3DMark installation on Win7 though; my scores there are quite low.


----------



## rdr09

Quote:


> Originally Posted by *zealord*
> 
> anyone tested win 10 technical preview yet?
> 
> I got a ~600 point increase in 3dmark FS with it.
> Could be faulty 3dmark installation on Win7 though, my points there are quite low


Just did. Added 200 pts to the graphics score using 14.9.

http://www.3dmark.com/3dm/4250058?

1260 core

http://www.3dmark.com/3dm/4250131?


----------



## HOMECINEMA-PC

Okay, so I need Win 10, a new driver (holds breath) and some luck


----------



## zealord

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Okay so I need win 10 , new driver ( holds breath ) and some luck


14.9 is win 10 ready


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Okay so I need win 10 , new driver ( holds breath ) and some luck


Quote:


> Originally Posted by *zealord*
> 
> 14.9 is win 10 ready


+rep. In my estimation, a 290X just needs 1240 to hit 14K in graphics.
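rdr09's 14K estimate can be sanity-checked with simple linear scaling. A rough sketch only: Fire Strike graphics scores don't scale perfectly with core clock, and the baseline figures below are just the numbers quoted in this thread.

```python
# Rough sanity check of the "1240 MHz for ~14K graphics" estimate.
# Assumption: graphics score scales roughly linearly with core clock.
# That's only approximately true, so treat the output as ballpark numbers.

def estimate_score(known_clock_mhz, known_score, target_clock_mhz):
    """Linearly extrapolate a graphics score to a different core clock."""
    return known_score * target_clock_mhz / known_clock_mhz

# Baseline from the post above: ~14,000 graphics at 1240 MHz on a 290X
baseline_clock, baseline_score = 1240, 14000

for clock in (1150, 1200, 1275, 1305):
    print(clock, "MHz ->", round(estimate_score(baseline_clock, baseline_score, clock)))
```

At 1275 MHz this predicts roughly 14.4K graphics, which is in the same ballpark as the Fire Strike runs linked above.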


----------



## HOMECINEMA-PC

here are all of my Firestrike subs .....


----------



## ZealotKi11er

http://www.3dmark.com/3dm/4250264?

1275/1500 290X

Windows 7 X64 14.9


----------



## amptechnow

Using Win 10 on my living room HTPC. I'll check some scores; it only has an R7 and a G3258 though...


----------



## Petet1990

What's up guys. So I'm still having trouble getting my cards to run at 100 percent: 3 290s with a 5820K and a Deluxe board. Is the x8/x8/x8 bottlenecking them?


----------



## DividebyZERO

Quote:


> Originally Posted by *Petet1990*
> 
> What's up guys. So I'm still having trouble getting my cards to run at 100 percent: 3 290s with a 5820K and a Deluxe board. Is the x8/x8/x8 bottlenecking them?


Nope, it has to be something else. What driver version are you using, and what are you using to load them with?


----------



## Petet1990

I'm using 14.9 now, but it was the same thing with 14.7, plus I did a fresh Windows install. Maybe the PSU isn't enough? G2 1300


----------



## kizwan

Quote:


> Originally Posted by *Petet1990*
> 
> wats up guys..so im still having trouble gettin my cards to run at 100 percent..3 290s w a 5820k and a deluxe board...is the 8x8x8 bottle necking them?..


Resolution? My cards only run near 100% if I play at 4K.


----------



## Petet1990

Quote:


> Originally Posted by *kizwan*
> 
> Resolution? My cards only run near 100% if I play at 4K.


4320x2560


----------



## ebhsimon

Are 2 290s too much for my 4670k @ 4.4Ghz? Main games are BF3 and soon to be BF4.


----------



## Sgt Bilko

Quote:


> Originally Posted by *ebhsimon*
> 
> Are 2 290s too much for my 4670k @ 4.4Ghz? Main games are BF3 and soon to be BF4.


At 1080p, yes, but there's no reason for CrossFire at 1080p with these cards.

1440p and higher you'll be fine.


----------



## ebhsimon

Quote:


> Originally Posted by *Sgt Bilko*
> 
> At 1080p yes but no reason for Crossfire at 1080 with these cards
> 
> 1440p and higher youll be fine


At 1440p 120 Hz it should be alright, right?


----------



## Sgt Bilko

Quote:


> Originally Posted by *ebhsimon*
> 
> Quote:
> 
> Originally Posted by *Sgt Bilko*
> 
> At 1080p yes but no reason for Crossfire at 1080 with these cards
> 
> 1440p and higher youll be fine
> 
> 1440p 120hz it should be alright, right?

That's what I run at, you'll be fine.


----------



## Klocek001

Hey, I've got a question for those familiar with GPU RMA and its conditions. I bought my 290 Tri-X in June (second hand, probably used for mining). It was running fine till September (moderate OC with MSI AB, 1150 + 80 mV). Now they're telling me there are some "damaged capacitors and resistors", but the vendor sent it to the manufacturer nevertheless. Do you think Sapphire can refuse to repair it for free because of either of those two factors I mentioned? I never even touched the card apart from installing it when it arrived and taking it out for the RMA process.


----------



## DividebyZERO

Quote:


> Originally Posted by *Klocek001*
> 
> Hey I've got a question for those familiar with GPU RMA. I bought my 290 Tri-X in June (second hand,prolly used for mining). Was running fine till September (moderate OC with MSI AB, 1150 +80mV). Now they're telling me there are some "damaged capacitors and resistors" but the vendor sent it to the manufacturer nevertheless. Do you think Sapphire can refuse to repair it for free because of any of those two factors I mentioned ? I never even touched the card apart from the installation when it arrived and taking it out for the RMA process.


It's hard to say, but "damaged capacitors and resistors" doesn't mean you mishandled the card. People have had faulty VRMs blow out on their motherboards and that causes physical damage. It would be almost impossible for them to prove it was caused by you and you alone. If you dropped it, or spilled a soda on it then maybe yeah. I have never RMA'd to sapphire so maybe someone else will know. I just hope the vendor doesn't sabotage you with any suggestions or contact with sapphire.

Good luck man, I hope they treat you right.


----------



## Klocek001

I too hope they weren't playing True Detective, 'cause I really did handle my card very carefully. I'm gonna ask them about it on Monday.


----------



## kizwan

Quote:


> Originally Posted by *Petet1990*
> 
> Quote:
> 
> Originally Posted by *kizwan*
> 
> Resolution? My cards only run near 100% if I play at 4K.
> 
> 4320x2560

Other than low usage, are you experiencing low performance/FPS? Take a screenshot of the CPU & GPU usage while playing games. You can use MSI AB or Open Hardware Monitor (not HWMonitor). Post it here.
Quote:


> Originally Posted by *Klocek001*
> 
> Hey I've got a question for those familiar with GPU RMA and its conditions. I bought my 290 Tri-X in June (second hand,prolly used for mining). Was running fine till September (moderate OC with MSI AB, 1150 +80mV). Now they're telling me there are some "damaged capacitors and resistors" but the vendor sent it to the manufacturer nevertheless. Do you think Sapphire can refuse to repair it for free because of any of those two factors I mentioned ? I never even touched the card apart from the installation when it arrived and taking it out for the RMA process.


Overclocking is not covered under warranty. If you told them you overclocked the card, they can use the "damaged capacitors and resistors" excuse to deny your RMA.
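For anyone who'd rather post numbers than screenshots: MSI Afterburner and Open Hardware Monitor can both log sensors to a file, and a few lines of Python will summarize a log. A minimal sketch; the column names below are hypothetical, real log headers differ between tools, so rename them to match your file.

```python
# Sketch: summarize a hardware-monitor CSV log instead of eyeballing screenshots.
# The header names here are made up for illustration; adjust to your log file.
import csv
import io
from statistics import mean

sample_log = """\
time,gpu1_usage,gpu2_usage,gpu3_usage,cpu_usage
0,97,45,40,62
1,95,42,38,65
2,96,44,41,60
"""

def summarize(log_text, column):
    """Return min/max/avg of one numeric column from a CSV sensor log."""
    rows = list(csv.DictReader(io.StringIO(log_text)))
    values = [float(r[column]) for r in rows]
    return {"min": min(values), "max": max(values), "avg": round(mean(values), 1)}

print(summarize(sample_log, "gpu2_usage"))
```

In a healthy CrossFire/TriFire load every GPU usage column should average near 100%; a card stuck in the 40s like the sample's second GPU points at a scaling or bottleneck problem.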


----------



## Klocek001

Quote:


> Originally Posted by *kizwan*
> 
> If you told them you overclocked the card, they can use "damaged capacitors and resistors" excuse to deny your RMA.


C'mon man....


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> http://www.3dmark.com/3dm/4250264?
> 
> 1275/1500 290X
> 
> Windows 7 X64 14.9


lol. Zeal, I matched your 290X with just 30 MHz more on the core using Win 10. Exact same score . . .

http://www.3dmark.com/3dm/4253714?

At your OC using Win 10 you'll probably see 200 more pts.


----------



## Petet1990

Quote:


> Originally Posted by *kizwan*
> 
> Other than low usage, did you experiencing low performance/FPS? Take screenshot of the CPU & GPU usage when playing games. You can use MSI AB or Open Hardware Monitor (not HWMonitor). Post here.


I'll try to upload a little later, but yes, FPS is low.


----------



## WinterActual

Guys, the last few days I've been reading about the GTX 970 and I'm wondering: is it worth it to sell my R9 290 and get a GTX 970 instead? I believe the performance gain is not significant enough to make it a worthy "upgrade", don't you think?


----------



## Sgt Bilko

Quote:


> Originally Posted by *WinterActual*
> 
> Guys, the last few days I am reading about GTX 970 and I am wondering. Is it worth it to sell my R9 290 and get GTX970 instead? I believe the power gain is not that significant to make it a worthy "upgrade", don't you think?


It's a sidegrade tbh


----------



## Dasboogieman

Quote:


> Originally Posted by *WinterActual*
> 
> Guys, the last few days I am reading about GTX 970 and I am wondering. Is it worth it to sell my R9 290 and get GTX970 instead? I believe the power gain is not that significant to make it a worthy "upgrade", don't you think?


Not worth the opportunity cost from a performance or power perspective. It's only worth it if you are downsizing to a lower-powered, smaller-case build where the 145 W TDP would matter.


----------



## Sonic_AFB

Quote:


> Originally Posted by *Sonic_AFB*
> 
> hi guys! 60º at 1000/1250 -0.019v for a watercooled 290x and 4820k with a 360 rad is a little high? and 64º at 1100/1350 +0.025v? i suspect if this temps are high, the posibility of my cpu waterblock can be dirty and restrict water flow (not be the first time..)


Can nobody tell me anything?


----------



## sugarhell

Quote:


> Originally Posted by *Sonic_AFB*
> 
> nobody can tell me nothing?


Your rad is too small for both the CPU and GPU. You need at least 240 mm of radiator per component; I prefer 360 mm per Hawaii GPU.
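sugarhell's numbers are a community rule of thumb, not a thermal calculation, but they're easy to tally. A minimal sketch, with the per-component figures taken straight from the post above:

```python
# Tally of the rule of thumb quoted above: at least 240 mm of radiator per
# cooled component, 360 mm per Hawaii GPU. Rough community numbers only.

RULE_MM = {"cpu": 240, "gpu": 240, "hawaii_gpu": 360}

def radiator_needed(components):
    """Sum the rule-of-thumb radiator length (mm) for a list of components."""
    return sum(RULE_MM[c] for c in components)

# Sonic_AFB's loop: one 4820K plus one 290X (Hawaii) on a single 360 mm rad
needed = radiator_needed(["cpu", "hawaii_gpu"])
print(needed, "mm needed vs 360 mm installed")
```

By that rule the single 360 mm rad is about 240 mm short for a 4820K plus a Hawaii GPU, which would be consistent with the temps reported above.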


----------



## ebhsimon

Quote:


> Originally Posted by *Dasboogieman*
> 
> Not worth the opportunity cost from a performance or power perspective. Its only worth it if you are downsizing to a lower powered, smaller case build where the 145W TDP would matter.


Correct. The only scenario where replacing a 290(x) with a 970 makes sense is in cases with suffocating airflow. Many small cases actually have quite good airflow, good enough to vent the hot air from a 290x out sufficiently.


----------



## bluedevil

Quote:


> Originally Posted by *ebhsimon*
> 
> Correct. The only scenario where replacing a 290(x) with a 970 makes sense is in cases with suffocating airflow. Many small cases actually have quite good airflow, good enough to vent the hot air from a 290x out sufficiently.


Would it be dumb to get rid of my VisionTek R9 290 that's Red Modded for a Gigabyte R9 290 Windforce? The Windforce cards are getting really cheap now....


----------



## ebhsimon

Quote:


> Originally Posted by *bluedevil*
> 
> Would it be dumb to get rid of my VisionTek R9 290 that's Red Modded for a Gigabyte R9 290 Windforce? The Windforces are getting really cheap now....


Unless you feel you need an open-air-cooled card over a water-cooled card, I would think replacing it with essentially the same card is not the best choice.
You might just be spending money for no reason if you replace it. Why not go CrossFire?


----------



## bluedevil

Quote:


> Originally Posted by *ebhsimon*
> 
> Unless you feel you need an open air cooled card over a watercooled card, I would think replacing it with the exact same card is not the best choice.
> You might just be spending money for no reason if you replace it. Why not go crossfire?


Meh, multi-GPU configs never really did anything for me. I think I'll just keep the ole' VisionTek a while longer.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> lol. Zeal, I matched your 290X with just 30Mhz more on the core using Win10. exact same score . . .
> 
> http://www.3dmark.com/3dm/4253714?
> 
> at your oc using Win10 you'll see prolly 200 more pts.


Going to stay with Windows 7 for a bit longer. Yeah Windows 10 seems to help scores.


----------



## BradleyW

Shadow of Mordor:
For those trying to get CFX working, be careful with AFR. I've noticed AFR removes many in-game effects but still performs the calculations in this game! Using the Tomb Raider profile offers the best FPS and allows all effects to be shown. So in other words, better graphics and higher FPS.


----------



## ZealotKi11er

Quote:


> Originally Posted by *BradleyW*
> 
> Shadow of Mordor:
> For those trying to get CFX working, be careful with AFR. I've noticed AFR removes many in game effects but still renders the calculations on this game! Using Tomb Raider profile offers best fps and allows all effects to be shown! So in other words, better graphics and higher fps.


Or just wait for new Beta drivers.


----------



## BradleyW

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Or just wait for new Beta drivers.


If you really want to play the game now like I do, I see no harm in enabling the 1.8x CrossFire scaling fix.


----------



## Sgt Bilko

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Or just wait for new Beta drivers.
> 
> If you really want to play the game like I do, I see no harm in enabling 1.8x crossfire scaling fix.

^ That


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Going to stay with Windows 7 for a bit longer. Yeah Windows 10 seems to help scores.


I'm just testing Win 10 using a spare laptop drive. You should try it, 'cause it is, IMO, very responsive and a lot smoother than any windoze. But for sure the final version will be bloated.


----------



## smoke2

I'm playing Tomb Raider on my stock-OC Sapphire Tri-X 290, but I cannot use SSAA 4x because the game starts to stutter.
I can only use SSAA 2x.
Resolution is 1600x1200, CPU is an i5-4590, 16 GB RAM, drivers are the latest 14.9.
What could be causing the bottleneck, or is it normal?


----------



## ZealotKi11er

Quote:


> Originally Posted by *smoke2*
> 
> I'm playing Tomb Raider on my stock OC Sapphire Tri-X 290, but I cannot use SSAA 4x because then game is beginning to cut.
> I can only use SSAA 2x.
> Resolution is 1600x1200, CPU is i5-4590, 16GB RAM, drivers are the latest 14.9.
> What can cause the bottleneck? Or is it normal?


Do you even know what SSAA 4x is? It renders the scene at four times the sample count, so it's far heavier than it sounds. Do some reading and you will understand.
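For context on why SSAA is so heavy: it supersamples, i.e. shades the scene at a multiple of the output pixel count. A quick back-of-the-envelope sketch (nothing card-specific is assumed):

```python
# SSAA shades the scene at N samples per pixel, so SSAA 4x at 1600x1200
# does roughly the shading work of a frame 4x that size.

def effective_pixels(width, height, ssaa_factor):
    """Approximate shaded samples per frame under SSAA."""
    return width * height * ssaa_factor

base = effective_pixels(1600, 1200, 1)   # 1,920,000 samples
ssaa2 = effective_pixels(1600, 1200, 2)  # 3,840,000 samples
ssaa4 = effective_pixels(1600, 1200, 4)  # 7,680,000 samples, near a 3840x2160 frame

print(base, ssaa2, ssaa4)
```

So 1600x1200 with SSAA 4x is close to rendering a 4K-class frame, which is a lot to ask of one 290 in a game as heavy as Tomb Raider.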


----------



## BradleyW

Quote:


> Originally Posted by *smoke2*
> 
> I'm playing Tomb Raider on my stock OC Sapphire Tri-X 290, but I cannot use SSAA 4x because then game is beginning to cut.
> I can only use SSAA 2x.
> Resolution is 1600x1200, CPU is i5-4590, 16GB RAM, drivers are the latest 14.9.
> What can cause the bottleneck? Or is it normal?


With your hardware, I would strongly advise against SSAA, otherwise you'll experience very low frame rates. FXAA will suffice in your case.

You might want to run the medium or high preset with TressFX disabled.


----------



## zealord

__ https://twitter.com/i/web/status/518443317600587776
390X information inc ?


----------



## gobblebox

Quote:


> Originally Posted by *BradleyW*
> 
> With your hardware, I would strongly advise not to use SSAA otherwise you'll experience very low frame rates. FXAA will suffice in your case.
> 
> You might want to run medium or high preset with Tress-FX disabled.


Where are y'all adjusting these settings generally: in each game, or in CCC?

Are there any specific settings in CCC that I would want to make sure I have enabled/disabled or set to a specific setting to maximize framerate w/o sacrificing too much graphically?

I'm running a Sapphire Tri-X OC R9 290 (standard use is OC'd to 1135/1450 with +100 mV) on a QNIX Evolution II. I got very lucky with this monitor: it overclocks to 120 Hz, but I normally run it at 96-110 Hz.

Any suggestions, anyone?


----------



## BradleyW

Quote:


> Originally Posted by *gobblebox*
> 
> Where are y'all adjusting these settings generally: in each game, or in CCC?
> 
> Are there any specific settings in CCC that I would want to make sure I have enabled/disabled or set to a specific setting to maximize framerate w/o sacrificing too much graphically?
> 
> I'm running a Sapphire Tri-X OC r9 290 (Standard use is OCed to 1135/1450 with +100mv) on a Qnix Evolution II - I got very lucky with this monitor - it overclocks to 120hz, but I normally run it at 96hz - 110hz.
> 
> Any suggestions, anyone?


Adjust settings in-game and leave CCC alone.
If you need help with the Tomb Raider graphics options, let me know.


----------



## ZealotKi11er

Quote:


> Originally Posted by *gobblebox*
> 
> Where are y'all adjusting these settings generally: in each game, or in CCC?
> 
> Are there any specific settings in CCC that I would want to make sure I have enabled/disabled or set to a specific setting to maximize framerate w/o sacrificing too much graphically?
> 
> I'm running a Sapphire Tri-X OC r9 290 (Standard use is OCed to 1135/1450 with +100mv) on a Qnix Evolution II - I got very lucky with this monitor - it overclocks to 120hz, but I normally run it at 96hz - 110hz.
> 
> Any suggestions, anyone?


There is no such thing. To increase fps you have to decrease quality and you do that in-game. There is nothing for you to play with in CCC.


----------



## smoke2

Quote:


> Originally Posted by *BradleyW*
> 
> With your hardware, I would strongly advise not to use SSAA otherwise you'll experience very low frame rates. FXAA will suffice in your case.
> 
> You might want to run medium or high preset with Tress-FX disabled.


Is the reason that the GPU is not strong enough, or is the CPU bottlenecking the GPU?


----------



## BradleyW

Quote:


> Originally Posted by *smoke2*
> 
> The reason is the GPU is not enough strong or CPU is bottlenecking the GPU?


Tomb Raider is a very demanding game, that's all! It requires a lot of power to run all features at a high fps.


----------



## bbond007

Quote:


> Originally Posted by *BradleyW*
> 
> Tomb Raider is a very demanding game, that's all! It requires a lot of power to run all features at a high fps.


Yep, hitting 100% CPU on my i5 @ 4.4 GHz from time to time, causing some jerkiness @ 5760x1080.

I find that the game runs better on my 1230v3 or FX-8320.


----------



## Kittencake

I just had a weird experience with the fan on my Sapphire 290X reference: the fan speed shot up to 100% and I couldn't slow it down with MSI Afterburner, even though the card's temp was 40C. I had to reboot to get it to slow down.


----------



## chronicfx

Question: I'm running TriFire and I have not disabled the iGPU on my 4790K. Is there any reason to do that?


----------



## Dasboogieman

Quote:


> Originally Posted by *chronicfx*
> 
> Question, running trifire and i have not disabled my igpu in my 4790k. Is there any reason to do that?


Leave the iGPU enabled; you can use it for Quick Sync.


----------



## the9quad

Quote:


> Originally Posted by *Kittencake*
> 
> I just had a weird experience with my fan of my sapphire 290x Ref, the fan speed shot up to 100% and i couldn't slow it down with msi afterburner and the cards temp was 40c , I had to reboot to get it to slow down


I get that from time to time.


----------



## ebhsimon

Quote:


> Originally Posted by *smoke2*
> 
> The reason is the GPU is not enough strong or CPU is bottlenecking the GPU?


In Tomb Raider? The GPU.
EDIT: Just saw you have a 4590. I think it might be both. Check Afterburner to see if your CPU or GPU is running at 100%.


----------



## HOMECINEMA-PC

Okay, after an hour of DDU and uninstalling and reinstalling 14.7 at least 3 times, plus a tri-card removal, I finally got it to recognize TriFire.

Now to set up a bench and get it to run before I go to this 14.9 thing-a-ma-jig driver.


----------



## heroxoot

Quote:


> Originally Posted by *Kittencake*
> 
> I just had a weird experience with my fan of my sapphire 290x Ref, the fan speed shot up to 100% and i couldn't slow it down with msi afterburner and the cards temp was 40c , I had to reboot to get it to slow down


Probably a BIOS/driver conflict, or the BIOS is just buggy. The fact that you couldn't control it means the driver lost its readings.


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Okay after a hour of ddu and un and reinstalling 14.7 at least 3 times and a tri card removal , I finally got it to recognize tri fire .
> 
> Now to set a bench and get it to run before I go this 14.9 thing-a-ma-jig driver .


You had to remove the cards to install the driver? Why not unplug the other two after disabling CrossFire and then install the driver? I used DDU once and BF4 crashed; never again. I really never used it except that one try. I use the installer itself to uninstall the old driver like I normally do; it takes about 15 minutes, and that's with a HDD. But then again, I only have a single 290.


----------



## Vici0us

Are the 14.9 drivers worth installing? I'm still running 14.7 smooth.


----------



## Wezzor

Quote:


> Originally Posted by *Vici0us*
> 
> Are 14.9 drivers worth installing? Still running 14.7 smooth.


They work perfectly fine for me, at least.


----------



## Vici0us

Quote:


> Originally Posted by *Wezzor*
> 
> Quote:
> 
> Originally Posted by *Vici0us*
> 
> Are 14.9 drivers worth installing? Still running 14.7 smooth.
> 
> They work perfectly fine for me atleast.

Hmm... Well, I guess that's a sign that I should try them out. I've got some time to kill.


----------



## Ramzinho

I don't remember if I posted here before or not.

Verification Link

Card Maker: Sapphire R9 290X Reference, Cooling: Koolance VID-AR290 V1.0


----------



## Ramzinho

Quote:


> Originally Posted by *Vici0us*
> 
> Hmm.. Well I guess that's a sign that I should try them out. I got sometime to kill.


Lots of people have been praising them; you should give them a go.


----------



## jagdtigger

@Ramzinho
Nice rig. We have similar coolers, but mine is smaller:

(VID-AR290X)


----------



## Ramzinho

Quote:


> Originally Posted by *jagdtigger*
> 
> @Ramzinho
> Nice rig. We have similar coolers, but mine is smaller:
> 
> (VID-AR290X)


This is Rev 1.1; when I contacted Koolance about the difference, here is the reply:

Quote:


> Hello Ramzy,
> 
> We realized the cost to produce the large acetal piece on the Rev 1.0
> was too much and in order to prevent having to raise the price of the
> block, we cut the acetal section down to be the Rev 1.1. There was no
> issues with the original rev. The ports on both cards should line up.


----------



## bond32

I have both the early release and new revision koolance blocks - they both cool the same and cool very well.


----------



## Red1776

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> Originally Posted by *Red1776*
> 
> A herd of MSI R290X's
> 
> Nice setup Red1776, white and black (two tone) CM Cosmos 2 and 4x R9 290X.
> 
> Maybe I do something similar to this with my system.
> 
> How does those MSI Gaming R9 290X compare to the MSI Lightning R9 290X?
Thanks GoBig 

I have not noticed any difference in OC'ing at all. The Twin Frozr cooling is within a couple of degrees, and they seem to take about the same voltage to get similar OCs.


----------



## Arizonian

Quote:


> Originally Posted by *Ramzinho*
> 
> i don't remember if i posted here before or not
> 
> Verification Link
> 
> Card Maker: Sapphire R9 290X Reference, Cooling: Koolance VID-AR290 V1.0


You did. #*422* of 473


----------



## Wezzor

I just bought Metro Last Light: Redux and WOW, the game sure looks good. Anyway, what I want to ask you guys now is what settings I should run. I'm going to play with VSync enabled, so I'd gladly avoid FPS drops below 60.

My GPU is currently OC'd to 1100/1400. The only thing I know is that I'm sure not going to touch the SSAA option.


----------



## Necrocis85

GPU-Z Link http://www.techpowerup.com/gpuz/details.php?id=b5f6c

Gigabyte Windforce R9 290, stock cooling.

Long time lurker here, decided to go ahead and submit.


----------



## BLOWNCO

Has anyone with the Vapor-X 290 tried the new EK 290X water blocks? I've been comparing pics of the PCBs and they look identical. I just want to make sure before I drop 300 on 2 blocks to water-cool them.


----------



## Vici0us

Quote:


> Originally Posted by *Ramzinho*
> 
> Quote:
> 
> Originally Posted by *Vici0us*
> 
> Hmm.. Well I guess that's a sign that I should try them out. I got sometime to kill.
> 
> Lots of people been praising them.. you should give them a go.

Installed 14.9, and afterwards I couldn't open anything. Rebooted so many times, same thing (the PC would freeze up no matter what). Went back to 14.7 and everything is smooth again.


----------



## Arizonian

Quote:


> Originally Posted by *Necrocis85*
> 
> GPU-Z Link http://www.techpowerup.com/gpuz/details.php?id=b5f6c
> 
> Gigabyte Windforce R9 290, stock cooling.
> 
> Long time lurker here, decided to go ahead and submit.


Congrats - added


----------



## Ironsmack

I'm just curious...

So I've been doing some stability testing on my one 290, and so far I have it stable at 1150/1270 with +55 mV using Unigine Valley.

However, with IBT running alongside Unigine, my GPU downclocks like mad, all the way down to 427 MHz. Is this normal?


----------



## ebhsimon

Quote:


> Originally Posted by *Ironsmack*
> 
> I'm just curious...
> 
> So I've been doing some stability test on my one 290 and so far - i have it stable on 1150/1270 with +55mv using unigine valley.
> 
> However, with IBT running along with unigine, my GPU downclocks like mad. All the way down to 427 Mhz.. Is this normal?


What are the temps?


----------



## Ramzinho

I need help OC'ing my GPU. I know I should stay away from MSI AB; I've got TriXX. Do you guys recommend anything else? Also, where should I start: memory or core? How far can I go on the VDDC offset, and does increasing the VDDC offset mean I'm increasing the voltage to the GPU? And should I be changing any settings in CCC?


----------



## MojoW

Quote:


> Originally Posted by *Ramzinho*
> 
> i need help ocing my gpu.. i know i should stay away form MSI AB. i've trixxx., you guys recommend anything else? also where should i start? memory, core? and how far can i go on the VDDC offset. and does increasing VDDC offset means i'm increasing the voltage the GPU or what? and should i be doing any settings in the CCC?


You're under water, so you can max the slider out if you wanted to; it's an offset of 200 mV.
Yes, that offset will increase voltage to the core, and the internal memory controller/regulator will also get a tad more volts, but you can't add those manually.
Leave CCC as it is and start with your core; once you've found your max core, do the memory if you want.

Edit: Oh, and "watch your temps", had to say it even if you're under water.


----------



## Newbie2009

Quote:


> Originally Posted by *Ramzinho*
> 
> i need help ocing my gpu.. i know i should stay away form MSI AB. i've trixxx., you guys recommend anything else? also where should i start? memory, core? and how far can i go on the VDDC offset. and does increasing VDDC offset means i'm increasing the voltage the GPU or what? and should i be doing any settings in the CCC?


Start with CCC and see how far you get with no extra volts. Then I'd say TriXX with volts until you hit your performance max; I personally wouldn't go higher than +100 mV for 24/7.
Then raise PowerTune with TriXX and keep overclocking until you hit the limit (you should get to 1125 on the core before PowerTune becomes an issue, I'd say). Are you on air? I would leave memory until last.

I have no stability problems with the newest MSI AB and it does have good monitoring, but it never sets PowerTune correctly.


----------



## Ramzinho

Quote:


> Originally Posted by *Newbie2009*
> 
> Start with CCC and see how far you get with no extra volts. Then I would use Trixx with volts until you hit your performance max. I personally wouldn't go higher than +100mV for 24/7.
> Then put up the power tune in Trixx and keep overclocking until you hit the limit. You on air? I would leave memory until last.
> 
> I have no problems with the newest MSI AB, but it never sets PowerTune correctly.


nope i'm under water


----------



## MojoW

Quote:


> Originally Posted by *MojoW*
> 
> You're under water, so you can max the slider out if you want to; it's an offset of 200mV.
> Yes, that offset will increase voltage to the core, but the internal memory controller/regulator will also get a tad more volts; you can't add that part manually.
> Leave CCC as it is and start with your core; once you've found your max core, do the memory if you want.
> 
> Edit: Oh, and ''watch your temps''. Had to say it, even if you're under water.


Quote:


> Originally Posted by *Ramzinho*
> 
> nope i'm under water


Like I said.


----------



## Newbie2009

Quote:


> Originally Posted by *Ramzinho*
> 
> nope i'm under water


Nice. I used 3DMark Fire Strike for stability testing and double-checked stability with 3DMark 11; 11 is good for memory testing.


----------



## Ramzinho

Quote:


> Originally Posted by *MojoW*
> 
> Like i said.


Valley, Heaven, 3DMark... which should I use to bench and test stability?


----------



## Vici0us

Quote:


> Originally Posted by *Ramzinho*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MojoW*
> 
> Like i said.
> 
> 
> 
> Valley, Heaven, 3D Mark, what should i use to bench and test stability?

Use 3DMark (especially Fire Strike) and Heaven.


----------



## Ramzinho

Quote:


> Originally Posted by *Vici0us*
> 
> Use 3DMark (specially Firestrike) and Heaven.


Funny story.. GPU-Z says my GPU hit a maximum temp of 146C







... way to go GPU-Z ... is that common?


----------



## naved777

One of my friends is going through a nightmare after installing 14.9.
He's using two 290Xs, and after installing 14.9 he couldn't enable CrossFire; the system freezes constantly, and now his Windows install is (probably) busted.
I'm not sure if he uninstalled the previous driver before installing 14.9. Should I tell him to install 14.9 after a fresh Windows install, or is it better to stay with 14.7?
It's weird, because I'm also running two 290Xs and haven't faced any trouble with 14.9.


----------



## MojoW

Quote:


> Originally Posted by *naved777*
> 
> One of my friends is going through a nightmare after installing 14.9.
> He's using two 290Xs, and after installing 14.9 he couldn't enable CrossFire; the system freezes constantly, and now his Windows install is (probably) busted.
> I'm not sure if he uninstalled the previous driver before installing 14.9. Should I tell him to install 14.9 after a fresh Windows install, or is it better to stay with 14.7?
> It's weird, because I'm also running two 290Xs and haven't faced any trouble with 14.9.


Let him run DDU (Display Driver Uninstaller) and then reinstall the drivers.
If it is because of driver remnants (leftovers), this will tell you.


----------



## Sgt Bilko

Quote:


> Originally Posted by *MojoW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *naved777*
> 
> One of my friend is going through nightmare after installing 14.9
> he's using two 290x and after installing 14.9 he couldn't enable crossfire and system freezes constantly and now his windows got busted (probably)
> I am not sure if he uninstalled the previous driver before installing 14.9. So would i tell him to install 14.9 after fresh windows install or its better stay with 14.7 ?
> Its weird because i also am running 2 290x and haven't faced any troubles with 14.9
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Let him run DDU(Display driver uninstaller) and then reinstall the drivers.
> If it is because of driver remnants (leftovers) this will tell.

^ Yup, What he said


----------



## Ironsmack

Quote:


> Originally Posted by *ebhsimon*
> 
> What are the temps?


GPU:
Core: 56C
VRM's: 62C

CPU:
Core: 56C OC to 4.5 Ghz @ 1.376 volts

Ambient: 23C


----------



## kizwan

Quote:


> Originally Posted by *Ironsmack*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ebhsimon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ironsmack*
> 
> I'm just curious...
> 
> So I've been doing some stability test on my one 290 and so far - i have it stable on 1150/1270 with +55mv using unigine valley.
> 
> However, with IBT running along with unigine, my GPU downclocks like mad. All the way down to 427 Mhz.. Is this normal?
> 
> 
> 
> What are the temps?
> 
> 
> GPU:
> Core: 56C
> VRM's: 62C
> 
> CPU:
> Core: 56C OC to 4.5 Ghz @ 1.376 volts
> 
> Ambient: 23C

Only GPU downclock? Did CPU downclock too?


----------



## tsm106

The 14.9 driver brings some much-needed speed to 3DMark 11. Here are some runs; they're not my fastest, just the fastest ones that show the clock speeds. The Extreme run cracked the HOF lol, so sad.

http://www.3dmark.com/3dm11/8802164

http://www.3dmark.com/3dm11/8801950


----------



## DividebyZERO

Quote:


> Originally Posted by *tsm106*
> 
> The 14.9 driver brings some much needed speed to 3dmark 11. Here are some runs, but they're not the fastest just the fastest ones that show the clock speeds. The extreme run cracked the hof lol, so sad.
> 
> http://www.3dmark.com/3dm11/8802164
> 
> http://www.3dmark.com/3dm11/8801950


its good to see you back here in the 290 thread showing the cards are still pretty good


----------



## matti2

R9 290
core clocked to 1090MHz (no voltage or power limit added), no problems running Valley or Heaven, no problems in Arma 3... or other games.

But when BF4 has been running a few minutes: no artifacts, JUST a freeze, and then I have to hard-boot.

Temps are ok; maybe I have to add some power, or what??

Just tested more: when default-clocked and fan-limited, temps can go much higher without problems,
but at 1090 it freezes at lower temps... weird..


----------



## Ironsmack

Quote:


> Originally Posted by *kizwan*
> 
> Only GPU downclock? Did CPU downclock too?


Just the GPU.

Here's a few things I've tried:

With IBT + valley:

Back to stock clocks on the GPU, with the power limit at +50%: tried IBT and Valley, and the GPU downclocks, but not as badly as when it's OC'd.

Same settings as above, but with 13mV... it crashes. The screen freezes and the picture gets garbled.

So finally I tried it with just Valley (same settings): it still crashes.

So I went back to my stable OC (with Valley and IBT), 1100/1270 with +56mV: no crashes, just downclocks.

Same settings as above, but with just Valley: no downclocks.

So I'm stumped....


----------



## Ironsmack

Quote:


> Originally Posted by *matti2*
> 
> Just tested more: when default-clocked and fan-limited, temps can go much higher without problems,
> but at 1090 it freezes at lower temps... weird..


Did you try adding some voltage?


----------



## matti2

not yet, how much is needed?


----------



## reedy777

Well, Valley tests a system's graphics, whereas IBT tests the CPU. Basically, running IBT maxes out the CPU and limits its ability to feed the GPU the data it needs to perform graphics processing. In other words, the GPU needs the CPU to feed it, and if the CPU is busy, the GPU must wait; so I would say it's logical and what you might expect.


----------



## reedy777

Quote:


> Originally Posted by *matti2*
> 
> R9 290
> core clocked to 1090Mhz(no voltage or powerlimit added), no problems running valley or heaven, no problems on Arma3...ot other games.
> 
> But when bf4 running a few minutes, no artifacts, JUST freeze and then have to hardboot.
> 
> Temps ok, maybe i have to add some power or what??
> 
> Just tested more, when default clocked and fan limited, temps can go much higher without problems,
> but on 1090 freezing on lower temps...weird..


It's just BF4; google it. It will crash regularly if you alter settings in-game.


----------



## Vici0us

Quote:


> Originally Posted by *naved777*
> 
> One of my friend is going through nightmare after installing 14.9
> he's using two 290x and after installing 14.9 he couldn't enable crossfire and system freezes constantly and now his windows got busted (probably)
> I am not sure if he uninstalled the previous driver before installing 14.9. So would i tell him to install 14.9 after fresh windows install or its better stay with 14.7 ?
> Its weird because i also am running 2 290x and haven't faced any troubles with 14.9


I had the same problem. My PC would freeze; after cleaning everything and a reinstall of 14.7, everything seems to be good so far. Maybe 14.9 is just not a big fan of CrossFire? Make sure he uninstalls 14.9 and then does a clean install of 14.7.


----------



## tsm106

If you are having install issues on 14.9, AMD just released a patched 14.9, aka 14.9.1. It's in the drivers forum.


----------



## Ironsmack

Quote:


> Originally Posted by *reedy777*
> 
> Well, Valley tests a system's graphics, whereas IBT tests the CPU. Basically, running IBT maxes out the CPU and limits its ability to feed the GPU the data it needs to perform graphics processing. In other words, the GPU needs the CPU to feed it, and if the CPU is busy, the GPU must wait; so I would say it's logical and what you might expect.


Yeah, that seems logical. I just never thought about how much the GPU still needs the CPU to feed it tasks. But thanks for the insight


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> The 14.9 driver brings some much needed speed to 3dmark 11. Here are some runs, but they're not the fastest just the fastest ones that show the clock speeds. The extreme run cracked the hof lol, so sad.
> 
> http://www.3dmark.com/3dm11/8802164
> 
> http://www.3dmark.com/3dm11/8801950


reminds me of Tahiti when the 12.11 driver finally came out. you beat my 290 by 900 pts in graphics at the same clock. nice one, tsm.


----------



## sugarhell

Quote:


> Originally Posted by *rdr09*
> 
> reminds me of Tahiti when 12.11 driver finally came out. you beat my 290 by 900 pts in graphics at same clock. nice one tsm.


Look, a new score:

http://www.3dmark.com/fs/2915162
http://puu.sh/c2o6i/d005c746de.jpg


----------



## Spectre-

my score


----------



## rdr09

Quote:


> Originally Posted by *sugarhell*
> 
> Look a new score :
> 
> http://www.3dmark.com/fs/2915162
> http://puu.sh/c2o6i/d005c746de.jpg


Quote:


> Originally Posted by *Spectre-*
> 
> 
> 
> my score


nice. that 7970 beats a 290 at stock, lol. i finally broke 14K using the same driver. waiting for more from AMD.


----------



## Ramzinho

i feel disappointed at my score of 9480.. that's with everything at stock.. that 1400-core 7970 got 14000 points.. pretty impressive.

http://www.3dmark.com/3dm/4285447?

I've a question though: when i autorun Fire Strike, does it run at native resolution (1440p in my case), or will it always run at 1080p?


----------



## sugarhell

It will run at 1080p. FSE (Fire Strike Extreme) will run at 1440p; it's an internal resolution.


----------



## Ramzinho

Quote:


> Originally Posted by *sugarhell*
> 
> It will run at 1080p. FSE (Fire Strike Extreme) will run at 1440p; it's an internal resolution.


then my scores are laughable.


----------



## rdr09

Quote:


> Originally Posted by *Ramzinho*
> 
> then my scores are laughable.


Ranger's 7970 is more than golden, combined with OC'ing skillz. compare the graphics scores, not overall. hawaii is like tahiti: it needs an OC to shine. OC your 290X to 1200 core and you'll see that it is indeed still faster than any tahiti by about 20% or more.

fixed.


----------



## zealord

guys, what kind of VRM 1 and VRM 2 temperatures do you get in FurMark? overclocked? what model, what voltage, duration of run?
My normal gaming VRM temps are within a tolerable range, but my FurMark VRM temps are bang out of order.

I am not too worried about FurMark, because it is an unrealistic scenario and does not represent real-world gaming, but I am still curious


----------



## sugarhell

Quote:


> Originally Posted by *rdr09*
> 
> Ranger's 7970 is more than golden. combined with oc'ing skillz. compare the graphics scores not overall. hawaii is like tahiti. it needs an oc to shine. oc your 290X to 1200 core and you'll see that it is indeed still faster than any tahiti by about 20% or more.
> 
> fixed.


Nah, it's tsm's card + my bios. He just changed some sliders, and he is using hoax with his ambients


----------



## Ramzinho

Quote:


> Originally Posted by *rdr09*
> 
> Ranger's 7970 is more than golden. combined with oc'ing skillz. compare the graphics scores not overall. hawaii is like tahiti. it needs an oc to shine. oc your 290X to 1200 core and you'll see that it is indeed still faster than any tahiti by about 20% or more.
> 
> fixed.


i am in the process of OCing it. i tried 1080 core, but 3DMark only shows it running at 1000, although GPU-Z shows the max speed was 1080.. and i got a lower score than my stock run, lol.

pumped the core to 1085.. got a lower score again... i'm confused


----------



## rdr09

Quote:


> Originally Posted by *Ramzinho*
> 
> i am in the process of OCing it. i try 1080Core. but 3D mark only shows it running at 1000. although GPU-Z shows max speed was 1080.. i got less scores than my stock run lol.


don't worry about 3DMark not showing the clocks. i think it stopped doing that after 14.4. depending on the silicon, you should not need any increase in VDDC at 1080 core; just raise your power limit. maybe not even at 1100. just max your power limit.


----------



## Ramzinho

Quote:


> Originally Posted by *rdr09*
> 
> don't worry about 3DMark not showing the clocks. i think it stopped doing it after 14.4. depending on the silicon, you should not need any increase in VDDC with 1080 core. just raise your power limit. maybe not even at 1100. just max your power limit.


well rdr09... i'm using Trixx. it only has VDDC, and i've not even raised the VDDC. i tried looking up a quick tutorial for OCing 290Xs, but all that's there is mining vids.. can you give me a hand here?


----------



## rdr09

Quote:


> Originally Posted by *zealord*
> 
> guys what kind of VRM 1 and 2 temperatures do you guys have in Furmark ? overclocked, what model, what voltage, duration of run?
> normal gaming vrm temps are within tolerable range, but furmark vrm tems are bang out of order.
> 
> I am not too worried about furmark, because it is an unrealistic scenario and does not represent real world gaming, but I am still curious


nah, stay away from furmark. the render test in GPUZ will suffice. same temps i get in BF4.

Quote:


> Originally Posted by *sugarhell*
> 
> Nah its tsm's card + my bios.He just changed some sliders and he is using hoax with his ambients


i knew it. lol


----------



## rdr09

Quote:


> Originally Posted by *Ramzinho*
> 
> well rdro... i'm using trix. it only had vddc and i've not even raised the vddc. i tried looking up a quick tutorial for Ocing 290Xs but all what's there is just mining vids.. can you give me a hand here?


i use Trixx, too. try 1080 core and up the Power Limit; you should have it (use the slider on the side). Max the PL, apply, and run a bench (FS).

i think it is safe to run GPU-Z on the Sensors tab during the test and check the temps (core and VRMs) after. make sure those temps stay below 80.

also, make sure Overdrive is disabled in CCC.

edit: check your temps first at stock. i forgot you have that lovely watercooling setup.


----------



## Ramzinho

Quote:


> Originally Posted by *rdr09*
> 
> i use Trixx, too. try 1080 core and up the Power Li. you should have it (use the slider on the side). Max PL, apply and run a bench - FS.
> 
> i think it is safe to run GPUZ using Sensor tab during the test and check the temps (core and vrms) after. make sure those temps are staying below 80.
> 
> also, make sure Overdrive is disabled in CCC.
> 
> edit: check your temps first at stock. i forgot you have that lovely watercooling setup.


Damn, i missed the slider on the side.. what a terrible GUI, Sapphire.

It worked...


----------



## Ramzinho

my target is to reach 1200 core and the highest possible memory under a safe VDDC offset. my GPU idles at 30C, while the CPU idles at 26C.. i'll be happy with both idling at 35/36 and being 55C under load. these are my target temps as well.


----------



## rdr09

Quote:


> Originally Posted by *Ramzinho*
> 
> my target is to reach 1200 Core and highest possible memory under the safe VDDC offset. my GPU idles at 30, while CPU idles at 26C.. i'll be happy with both idling at 35/36 and Being 55C under load. these are my target temps as well.


1200 for daily use? is the cpu in the same loop? you might be better off shooting for 1100 or 1150 core and OCing the cpu a bit more.

1200 . . you might need +100 or even more. or it might be less. raise your core clock in small steps without touching the VDDC, max out the PL, and test. see how far stock volts will take you. temp is a non-issue.

btw, the final test would be your games.
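That stepping routine (max the power limit, then raise the core in small increments at stock volts and test each step) can be sketched as a loop. Note this is a hypothetical illustration: `set_core_clock` and `run_benchmark` are invented placeholders standing in for whatever tool and stability test you actually use (Trixx, Fire Strike, your games), not a real API.

```python
# Hypothetical sketch of "small steps at stock volts, PL maxed".
# set_core_clock() and run_benchmark() are invented placeholders for
# whatever tool/test you actually use (Trixx, Fire Strike, games).

def set_core_clock(mhz):
    """Placeholder: would apply the clock via your tuning tool."""
    pass

def run_benchmark(clock_mhz):
    """Placeholder: pretend this card's wall is 1150 MHz on stock volts."""
    return clock_mhz <= 1150

def find_max_stable_core(start_mhz=1000, step_mhz=10, limit_mhz=1200):
    """Raise the core in small steps until a test fails, then back off."""
    stable = start_mhz
    clock = start_mhz
    while clock + step_mhz <= limit_mhz:
        clock += step_mhz
        set_core_clock(clock)
        if not run_benchmark(clock):   # crash/artifacts: stop stepping
            break
        stable = clock                 # passed: remember this clock
    set_core_clock(stable)             # fall back to the last good step
    return stable

print(find_max_stable_core())  # 1150 with the placeholder above
```

The same loop works for memory afterwards; just swap which clock you step.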


----------



## Ramzinho

Quote:


> Originally Posted by *rdr09*
> 
> 1200 for daily use? is the cpu in the same loop? you might be better off shooting for 1100 or 1150 core and oc the cpu a bit more.


Yes one loop fits all









I said that 'cause i wanted to know the safe limits for daily use... yes, i'll OC the CPU a bit more, but GPU overclocking is a way shorter process than OCing my CPU. my last OC was on a Q9550, and it took me 3 days to find the sweet 24-hour Prime-stable spot. it was a hell of an OC back then, but again, it's been useful.

Finally, what is the difference between power limit and VDDC offset?


----------



## rdr09

Quote:


> Originally Posted by *Ramzinho*
> 
> Yes one loop fits all
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I said that cause i wanted to know what are the safe limits for daily use... yes i'll OC the CPU a bit more. but GPU Overclocking i way shorter process than OCing my CPU. my last OC was on a Q9550 and it took me 3 days to find the sweet 24 hr prime stable test. was a hell of an OC back then. but again it's been useful.
> 
> Finally what is the difference between power limit and VDDC offset?


not sure exactly. what i know is i can't oc high without PL, so i just max it. i'd say +100 VDDC is safe for daily use 'cause you have water. check your temps once and forget it.

edit: blow the rads clean once a month. i use a small dry vac.


----------



## Ramzinho

Quote:


> Originally Posted by *rdr09*
> 
> not sure exactly. what i know is i can't oc high without PL, so i just max it. i'd say +100 VDDC is safe for daily use 'cause you have water. check your temps once and forget it.
> 
> edit: blow clean the rads once a month. i use a small dry vac.


i do that every couple of weeks actually


----------



## DividebyZERO

Is it normal for 290X VRMs to hit the mid-to-high 70s on water using +200mV?
Under load, of course.


----------



## ZealotKi11er

Quote:


> Originally Posted by *DividebyZERO*
> 
> Is it normal for 290x vrms to hit mid to high 70s on water using +200mv?
> under load of course


I think its normal.


----------



## MojoW

Quote:


> Originally Posted by *Ramzinho*
> 
> Yes one loop fits all
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I said that cause i wanted to know what are the safe limits for daily use... yes i'll OC the CPU a bit more. but GPU Overclocking i way shorter process than OCing my CPU. my last OC was on a Q9550 and it took me 3 days to find the sweet 24 hr prime stable test. was a hell of an OC back then. but again it's been useful.
> 
> Finally what is the difference between power limit and VDDC offset?


Power limit is how much power the card is allowed to draw in total, and VDDC offset is the voltage for the core.
When you overclock and don't max your PL, the card will try to stay within its TDP range, if i'm correct.
Some cards need PL sooner than others, even without a voltage increase, because of the silicon lottery.
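That TDP behaviour can be illustrated with a toy model: estimate power as scaling with V² × f against a reference point, and throttle the clock when the estimate exceeds the budget the PL slider allows. Every number here is invented for illustration; real PowerTune is far more sophisticated than this.

```python
# Toy model of PowerTune-style throttling; every number here is
# invented for illustration, real PowerTune is far more sophisticated.

def effective_clock(target_mhz, vddc, tdp_watts=250, power_limit=0.0):
    """Estimate power as ~V^2 * f against a reference point and
    throttle the clock when the estimate exceeds the PL budget."""
    base_mhz, base_v, base_w = 1000, 1.25, 250   # assumed reference card
    budget = tdp_watts * (1 + power_limit)       # PL slider widens this
    est = base_w * (vddc / base_v) ** 2 * (target_mhz / base_mhz)
    if est <= budget:
        return target_mhz                  # fits in the budget
    return target_mhz * budget / est       # throttle back under the cap

# Overvolted OC at stock budget gets pulled down...
print(effective_clock(1100, 1.35))                    # ~857 MHz
# ...but with the PL slider raised (+50%) it holds full clocks.
print(effective_clock(1100, 1.35, power_limit=0.5))   # 1100 MHz
```

This is why maxing the PL slider is the first step even before touching voltage: some chips blow past the stock budget at stock VDDC.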


----------



## LandonAaron

I am getting temps as high as 92C on my R9 290X running 1100 core / 1400 memory with +100mV and +50 power limit, which would be normal except I am using a water cooler. It is only a 120mm rad, but I would still think the temps would be better than that. I thought the VRMs would be the issue, as they are on air with aluminum heatsinks, but I am getting 50C on one and 70C on the other.


----------



## sugarhell

Quote:


> Originally Posted by *LandonAaron*
> 
> I am getting temps as high as 92 C on my R9 290x running 1100 core, 1400 memory with +100 mv and +50 power limit. Which would be normal except I am using a water cooler. It is only a 120mm rad, but still I would think the temps would be better than that. I thought the VRM's would be the issue as they are on air with aluminum heat sinks but I am getting 50c on one and 70c on the other.


A 120mm rad for a 350-watt card (+100mV, +50 power limit)? Yeah, that's the reason.


----------



## LandonAaron

Forgot to mention that it is just in 3DMark that the temp gets that high; normal gaming hits like 80 max. Since the thermal cutoff on this card is 95, should I even be concerned? It's not like I will be benchmarking every day, and I will probably only keep this card for a year max.


----------



## sugarhell

Quote:


> Originally Posted by *LandonAaron*
> 
> Forgot to mention that it is just in 3dmark that the temp gets that high. Normal gaming it hits like 80 max. Since the thermal cut off on this card is 95 should I even be too concerned? Its not like I will be benchmarking everyday, and will probably only keep this card for a year max.


Try reseating your block too. Even with a 120mm rad you shouldn't reach 90C.


----------



## Talon720

Quote:


> Originally Posted by *zealord*
> 
> it is way easier if you just post the link to the result of the firestrike test and tell us your clocks of GPU and CPU.


Quote:


> Originally Posted by *chronicfx*
> 
> I get just under 21000 with my 4790k at 4.7 and 3x 290x at stock on air
> http://www.3dmark.com/3dm/4227170
> 
> Can you link yours as well? I want to see


Yea I'll post it when I get home later. I sorta forgot I posted this when I was at work


----------



## joeh4384

Seems pretty high, when all the reviews of people with Kraken G10s and AIO coolers show temps in the 50s/60s. I get upper 60s with a 295X2.


----------



## tsm106

Quote:


> Originally Posted by *DividebyZERO*
> 
> Is it normal for 290x vrms to hit mid to high 70s on water using +200mv?
> under load of course


My 290X VRM only goes to 47-50C (w/o Fujipoly pads) at a lot higher than +200mV, lol. On the reference PCB with Fujis (if I can remember that far back; I never used the ref PCB without them), VRM temps were mid-40s.


----------



## BradleyW

New drivers are out. No improvements or whitelist profiles for any of the upcoming titles. NOT HAPPY.


----------



## Klocek001

Supposing my card gets back from RMA soon, should I choose 14.4, 14.8, or 14.9.1? I remember I preferred 14.4 over 14.7: much less stuttering in FC3. I don't care about benchmarks; they're basically a waste of time and electricity to me.


----------



## DividebyZERO

Quote:


> Originally Posted by *tsm106*
> 
> My 290x vrm only goes to 47-50c (w/o fujipoly pads) at a lot higher than +200mv lol. On reference pcb, if I can remember far back enough with fujis since I never used the ref pcb w/o them, vrm temps were mid 40s.


Interesting, since my core never seems to exceed 54 max running quads. Would stock pads make that much of a difference, even with thermal paste lightly added to the pads? Are those single-card results in a single loop? I will rerun mine single-card and see.
I have 2 MCP35X pumps in my loop, but 4 GPUs + CPU, so I don't think it's flow.


----------



## sugarhell

Quote:


> Originally Posted by *BradleyW*
> 
> New drivers are out. No improvements or list profiles/ white list profiles for any of the up and coming titles. NOT HAPPY.


It's not a new driver; it's a fix for the 14.9 BSOD. Read the notes!


----------



## tsm106

Quote:


> Originally Posted by *DividebyZERO*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> My 290x vrm only goes to 47-50c (w/o fujipoly pads) at a lot higher than +200mv lol. On reference pcb, if I can remember far back enough with fujis since I never used the ref pcb w/o them, vrm temps were mid 40s.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> interesting since my core never seems to exceed 54 max at quad. Would stock pads make that much difference even with thermal paste slightly added to pads? Is this single card results in a single loop? I will rerun mine again single card and see .
> I have 2 mcp35x pumps in my loop but 4 gpus + cpu so I dont think its flow.

What block are you using? Personally, I like to keep the VRMs around 50C: not too hot and not too cool. If all else is equal and those are the temps you get, it's a matter of improving the heat transfer, and in that regard you are limited by the block design and the thermal pads. Stock pads are usually crap in the context of cooling the VRMs, which get quite toasty when lots of volts are added. One definite way of improving the heat transfer is with Fuji Ultra Extremes. They're not that expensive, but they are a colossal pain in the rear with quads.


----------



## BLOWNCO

Quote:


> Originally Posted by *BLOWNCO*
> 
> has anyone with the Vapor-X 290 tried the new EK 290X water blocks? i've been comparing pics of the PCBs and they look identical. i just want to make sure before i drop 300 on 2 blocks to water-cool them.


anyone?


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> Its not a new driver. Its a fix of 14.9 BSOD. Read the notes !!


I did read the notes, hence knowing that these drivers do not include anything new for upcoming titles.


----------



## Ramzinho

Quote:


> Originally Posted by *BradleyW*
> 
> I did read the notes, hence knowing that these drivers do not include anything new towards up and coming titles.


well, aren't you impressed by the 5-day update to 14.9, Brad? i think they will release a 14.10 beta by mid-month at the latest... although if any new games are coming this month, i think they will wait and patch it all together.


----------



## Buehlar

Quote:


> Originally Posted by *zealord*
> 
> guys what kind of VRM 1 and 2 temperatures do you guys have in Furmark ? overclocked, what model, what voltage, duration of run?
> normal gaming vrm temps are within tolerable range, but furmark vrm tems are bang out of order.
> 
> I am not too worried about furmark, because it is an unrealistic scenario and does not represent real world gaming, but I am still curious


I really don't recommend FurMark to anyone, but I did perform some tests the other day just to see how many watts the 290X pulls at stock vs. OC.

*
powered by a Corsair AX1200i, using Corsair Link software to monitor wattage
1x Asus R9 290X DCII (stock OC @ 1050MHz)
*

*For this test to be as accurate as possible, the only thing powered by the AX1200i PSU is the one 290X! All other system hardware is powered by a separate PSU (AX850).*

As you can see, power consumption scales up really quickly when upping the voltage... >100 watts more when moving from stock 1.25v @ 1050MHz to 1.312v @ 1150MHz.
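As a rough sanity check on that delta: dynamic power scales roughly with V² × f, so the voltage/clock bump alone predicts about a 21% increase. A measured jump of over 100 W is bigger than that, which fits, since leakage also climbs with voltage and temperature. A quick back-of-envelope (the 300 W stock figure is an assumed baseline, not a measurement):

```python
# Back-of-envelope check: dynamic power ~ V^2 * f (leakage ignored).
base_v, base_f = 1.25, 1050      # stock volts / MHz
oc_v, oc_f = 1.312, 1150         # overclocked volts / MHz

scale = (oc_v / base_v) ** 2 * (oc_f / base_f)
print(f"predicted scaling factor: {scale:.3f}")       # 1.207

base_watts = 300                 # assumed stock board power, not measured
print(f"predicted OC power: {base_watts * scale:.0f} W")   # 362 W
# The measured jump (>100 W) is larger than this predicts because
# leakage current also grows with voltage and temperature.
```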

*Test #1
Stock clocks/voltage
@1050MHz*


Spoiler: Warning: Spoiler!









*Test #2
Overclock/voltage
1150MHz/ 1.312v*


Spoiler: Warning: Spoiler!


----------



## FrancisJF

Will my 850W PSU still power these cards with an i7-3770K?


----------



## ZealotKi11er

Quote:


> Originally Posted by *FrancisJF*
> 
> Will my 850 watts still power these cards with i7- 3770K?


Should be fine up to +100mV. After that, you'd need 1000W+.


----------



## Buehlar

Quote:


> Originally Posted by *FrancisJF*
> 
> Will my 850 watts still power these cards with i7- 3770K?


A single card? Yes. If you plan on high overclocks with the 3770K and 1x GPU, you shouldn't have any issues.

Crossfire? Yes... but:

If you plan on highly OCing 2x GPUs in Crossfire plus the 3770K, then 850w will be the bare minimum... You should also allow some headroom for your other hardware: HDDs, fans, pumps if any, etc.


----------



## FrancisJF

Quote:


> Originally Posted by *Buehlar*
> 
> A single card?...yes. If you plan on high overclocks with 3770K and 1x GPU you shouldn't have any issues.
> 
> Crossfire?...yes...but...
> 
> If you plan on highly OCing 2x GPU'sCrossfire and the 3770K then 850w will be the bare minimum...You also consider some headroom for your other hardware...HDD's, fans, pumps if any, etc.


Ahh thanks, bought 2 of them over the weekend. Won't overclock them now til tax seasons...


----------



## maynard14

how about a corsair tx750m power supply? is it enough to power two 290s in crossfire and a 4770k overclocked at 1.3v?


----------



## Vici0us

Quote:


> Originally Posted by *maynard14*
> 
> how bout corsair tx750m power supply, is it enough to power to 290in crossfire and a 1.3 v overclock 4770k?


Definitely not; you need something around 1000W or more, especially if you're planning on overclocking the 290s. More like a Corsair HX1050.


----------



## Buehlar

Quote:


> Originally Posted by *maynard14*
> 
> how bout corsair tx750m power supply, is it enough to power to 290in crossfire and a 1.3 v overclock 4770k?


Nope. A 290 will pull pretty much the same as a 290X when overclocked. An 850w PSU will run Crossfire, BUT it'll be maxed out when OCing high, with little to no headroom.
Quote:


> Originally Posted by *Vici0us*
> 
> Definitely not, you need something around 1000W or more specially if you're planning on overclocking 290s. More like, Corsair HX1050


Yeah, 1000w is recommended for a 290 CrossFire OC system. You can top 700w alone for both GPUs with a good OC. 1000w gives you just enough headroom and doesn't tax the PSU so harshly.
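That headroom argument can be written out as a simple budget. The per-component wattages below are ballpark assumptions pulled from this discussion (roughly 700 W for the overclocked pair), not measurements:

```python
# Rough PSU budget for an OC'd 290 CrossFire rig; the wattages are
# ballpark assumptions drawn from this discussion, not measurements.

draw = {
    "two OC'd 290s":          700,   # "~700 W alone for both GPUs"
    "OC'd 4770K":             100,
    "board/RAM/drives/fans":   40,
}

total = sum(draw.values())           # 840 W estimated load
for psu_watts in (750, 850, 1000):
    headroom = (psu_watts - total) / psu_watts * 100
    print(f"{psu_watts:>4} W PSU: {headroom:+.0f}% headroom")
# -> 750 W is overloaded, 850 W is the bare minimum (+1%),
#    and 1000 W leaves real headroom (+16%).
```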


----------



## maynard14

i see. i plan not to overclock the two cards, just use them as-is, but i will overclock my 4770k to 1.3v.. but i think 750w is still not enough; maybe i'll buy a 1000w power supply, but i'm kinda broke right now if i buy a 2nd 290 haha


----------



## MojoW

Quote:


> Originally Posted by *maynard14*
> 
> i see but i plan not to overclock the 2 cards, just use them as is, but i will overclock my 4770k to 1.3 v.. but i think 750w is still not enough, maybe ill buy a 1000w power supply, but kinda broke right now if i buy a 2nd 290 haha


If you're just gonna OC your CPU, then your PSU will be enough.


----------



## smoke2

I'm playing Saints Row: The Third on my 290.
CPU is an i5-4590, with 16GB RAM.
Graphics settings are on Ultra quality, V-Sync ON.
The graphics itself isn't very high quality, but during cinematic animations it runs at low FPS; it definitely isn't smooth.
During actual gameplay it runs smooth.
What can cause this?


----------



## maynard14

Quote:


> Originally Posted by *MojoW*
> 
> If your just gonna OC your CPU then your PSU will be enough.


thank you so much







time to dispose of my PS3 and buy a 2nd card haha


----------



## incog

Quote:


> Originally Posted by *smoke2*
> 
> I'm playing Saints Row The Third on my 290.
> CPU is i5-4590, 16GB RAM.
> Graphics settings are on Ultra quality, V-Sync ON.
> The graphics itself isn't much high quality, but when there are some cinematic animations it runs like low FPS, it isn't definitely smooth.
> During playing the game it runs smooth.
> What can cause it?


Maybe V-Sync? Disable it and see if it gets better.


----------



## SadisticMajor

Hey guys, I was wondering if I could get some help. I just bought a Sapphire R9 290X for my rig, and I'm wondering if it comes with the power cables required to plug it into my PSU. Also, if it does not, what do I need to purchase to plug it in? It's still shipping, so I haven't had a chance to look at it yet to tell what to use or whether it comes with what I need, and I can't find the information online. Thanks in advance.


----------



## mAs81

There should be PCIe cables with your PSU, use those.
I mean, what cables are on your current GPU, if you have one? I'm on my phone, can't see.
GPU companies usually ship Molex-to-GPU adapters, among other stuff.


You'll probably need two PCIe 8-pin (6+2) cables for that card.

EDIT:
Pics added


----------



## Talon720

Quote:


> Originally Posted by *zealord*
> 
> it is way easier if you just post the link to the result of the firestrike test and tell us your clocks of GPU and CPU.


Quote:


> Originally Posted by *chronicfx*
> 
> I get just under 21000 with my 4790k at 4.7 and 3x 290x at stock on air
> http://www.3dmark.com/3dm/4227170
> 
> Can you link yours as well? I want to see


OK, here's the Firestrike run. Someone tell me if you can't see it: http://www.3dmark.com/compare/fs/2869727/fs/2869877 (CPU uncore/core 44/45, on the 14.8/14.9 driver). The weird thing is it says I ran my GPU at stock on the faster run; I was pretty sure I didn't, I thought I did 1190 core or something. I can always just retest







OK, I think what might explain the higher score is that I went from 14.8 to 14.9; that's the only thing I can think of that would explain the jump. I don't remember exactly when I installed the new driver, but it would make sense.


----------



## zealord

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Buehlar*
> 
> I really don't recommend furmark to anyone but I did perform some tests the other day just to see how many watts the 290x pulls at stock vs OC.
> 
> *
> powered by a Corsair AX1200i using Corsair Link software to monitor wattage
> 1x Asus R9 290x DCII (stock OC @ 1050MHz)
> *
> 
> *For this test to be as accurate as possible, the only thing powered by the AX1200i PSU is one 290x! All other system hardware is powered by a separate PSU (AX850)*
> 
> As you can see, the power consumption scales up really quickly when upping the voltage... >100watts when moving from stock 1.25v to 1.312v @1150MHZ
> 
> *Test #1
> Stock clocks/voltage
> @1050MHz*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Test #2
> Overclock/voltage
> 1150MHz/ 1.312v*
> 
> 
> Spoiler: Warning: Spoiler!






Thanks for that. Your VRM temps seem super low for Furmark
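As a side note on Buehlar's measurements above, the steep rise in draw when upping the voltage is roughly what the usual dynamic-power approximation P ≈ k·f·V² predicts. A minimal sketch, assuming an illustrative 250 W stock draw (an assumption for this example, not an AMD spec):

```python
# Back-of-envelope check of the measured jump using the common
# dynamic-power approximation: power scales linearly with clock and
# quadratically with voltage. The 250 W stock figure is assumed.
def scaled_power(p_base, f_base, v_base, f_new, v_new):
    """Scale dynamic power with frequency (linear) and voltage (quadratic)."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# Buehlar's two operating points: 1050 MHz / 1.25 V -> 1150 MHz / 1.312 V
p_oc = scaled_power(250, 1050, 1.25, 1150, 1.312)
print(f"Predicted OC draw: {p_oc:.0f} W (+{p_oc - 250:.0f} W over stock)")
```

The simple model only predicts around +50 W, so the >100 W jump actually measured likely also includes temperature-dependent leakage, which grows quickly as voltage and heat climb.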


----------



## matti2

Does the card throttle down to keep temperatures in check? Is there such a thing on the R9 290?


----------



## chronicfx

Quote:


> Originally Posted by *Talon720*
> 
> OK, here's the Firestrike run. Someone tell me if you can't see it: http://www.3dmark.com/compare/fs/2869727/fs/2869877 (CPU uncore/core 44/45, on the 14.8/14.9 driver). The weird thing is it says I ran my GPU at stock on the faster run; I was pretty sure I didn't, I thought I did 1190 core or something. I can always just retest
> 
> 
> 
> 
> 
> 
> 
> OK, I think what might explain the higher score is that I went from 14.8 to 14.9; that's the only thing I can think of that would explain the jump. I don't remember exactly when I installed the new driver, but it would make sense.


Everything from your physics to your framerates to your combined was better. Does a driver affect physics??? I guess combined would be affected, but physics shouldn't be. Is it possible you had a background task going during the first run that lowered it a bit? Try a fresh restart and then disable virus scanning for the run. See what you get. Great score!

Edit: FYI, I broke 21k on 14.9; it does add some points. I was also close to 25k on 3DMark11.


----------



## battleaxe

Anyone have an idea why VRM 1 on one of my 290s gets really hot? I have the Gelid Icy revision kit on the VRMs of both cards. The top card is fine, under 60°C at load, but the bottom card gets to 85°C-plus while benching. I even added a larger cooler to the VRM and still the same thing; it doesn't seem to matter what cooler is on there, it just runs a lot hotter. Any ideas?


----------



## zealord

Quote:


> Originally Posted by *battleaxe*
> 
> Anyone have an idea why VRM 1 on one of my 290s gets really hot? I have the Gelid Icy revision kit on the VRMs of both cards. The top card is fine, under 60°C at load, but the bottom card gets to 85°C-plus while benching. I even added a larger cooler to the VRM and still the same thing; it doesn't seem to matter what cooler is on there, it just runs a lot hotter. Any ideas?


85°C? What benchmark?

My VRM 1 temp gets to 110°C with Furmark, but in games and other benchmarks it's 70-90°C. I don't think you have to worry


----------



## battleaxe

Quote:


> Originally Posted by *zealord*
> 
> 85°C? What benchmark?
> 
> My VRM 1 temp gets to 110°C with Furmark, but in games and other benchmarks it's 70-90°C. I don't think you have to worry


I can't say I'm worried, I just don't get it. One card runs pretty cool, but the bottom card is hotter than it should be; it should be about 55-60°C like the other one. I know it's fine to run that way, but there's no reason it should be getting to 85°C.

3dMark11 is the test.


----------



## smoke2

Quote:


> Originally Posted by *incog*
> 
> Maybe V-sync? disable that and see it it becomes better


I've set V-sync off, but after an hour of playing it starts lagging in-game as well.
Drivers are 14.9.
Does anyone have an idea what could cause this?


----------



## kizwan

Quote:


> Originally Posted by *DividebyZERO*
> 
> Is it normal for 290x vrms to hit mid to high 70s on water using +200mv?
> under load of course


Depends. Is this temp while gaming, benching, or in a torture test (e.g. Furmark)? What is your ambient/room temp or water temp (under load)? How much radiator space do you have? Mine only hit the mid 60s while benching (+200mV) at 32°C ambient with the stock EK pad & 360+240 rad space. Now with 360+240+120 & Fujipoly Extreme, mine is in the high 40s. I know that with pretty bad airflow & high ambient it can easily go up to the 70s.


----------



## matti2

How much voltage can I add to the R9 290 to keep it safe? And the power limit?
I'm now at 1100 GPU and 1350 memory, default volts.


----------



## kizwan

Quote:


> Originally Posted by *matti2*
> 
> How much voltage can I add to the R9 290 to keep it safe? And the power limit?
> I'm now at 1100 GPU and 1350 memory, default volts.


You can max out the power limit, no problem there. It doesn't dictate how much power your card will draw, but how much additional power the card can draw if needed.

Based on the information shared here, 1.45V is enough to pop the caps/VRMs even under LN2 (ask @jomama22). So I think you want to stay below 1.4V to keep your precious safe.
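To make the power-limit point concrete: the slider scales a ceiling, it doesn't force the card to draw more. A small sketch, where the 208 W base limit is an assumed illustrative figure, not a confirmed spec for this card:

```python
# The power limit slider raises the allowed ceiling; actual draw only
# rises if the load asks for it. BASE_LIMIT_W is an assumed figure.
BASE_LIMIT_W = 208

def effective_limit(base_w, limit_pct):
    """PowerTune-style limit: +50% means a 1.5x ceiling."""
    return base_w * (1 + limit_pct / 100)

def actual_draw(demand_w, base_w, limit_pct):
    """The card draws what the load demands, capped at the ceiling."""
    return min(demand_w, effective_limit(base_w, limit_pct))

print(effective_limit(BASE_LIMIT_W, 50))   # ceiling with the slider at +50%
print(actual_draw(180, BASE_LIMIT_W, 50))  # light load: slider changes nothing
print(actual_draw(300, BASE_LIMIT_W, 0))   # heavy load at stock: clamped to the cap
```

This is why maxing the slider is harmless on its own: a light load draws the same either way, while a heavy overclocked load avoids being clamped (throttled) at the stock ceiling.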


----------



## tainle

Can you tell me whether non-reference R9 290X cards running in CrossFire will hit 94°C at full load on both cards? I understand that a single non-reference card can run cool, around 80°C average for most of them...


----------



## jagdtigger

@smoke2
The driver, maybe? I also have 14.9 and I have a problem with a different game. I can run Unigine Valley all day, no problems, but if I fire up Neverwinter it crashes after 20-30 min and leaves me with three black monitors (the funny thing is that the game runs on only one)... The only solution is to reset the PC...


----------



## SadisticMajor

Well, I don't have a modular power supply, and it seems all the cables are taken except one 4-pin that has a fan siphoning off it. I had a guy build it for me as a gaming rig, and he didn't even include the graphics card (hence I'm ordering the 290X), so I don't really know how he set it up. I opened the case to see the cables and try to figure out how to plug it in ahead of time, but there seemed to be nothing extra. I would have thought they would come standard on a 500 W to 1000 W power supply. My only guess is I'll have to use a divider of some sort. The split ends of the Y connectors seem like they would fit in that one plug, but I assume I'll need three more.


----------



## Red1776

Quote:


> Originally Posted by *tainle*
> 
> Can you tell me whether non-reference R9 290X cards running in CrossFire will hit 94°C at full load on both cards? I understand that a single non-reference card can run cool, around 80°C average for most of them...


That depends on the model of the cards. For example, before watercooling my 4 x MSI Twin Frozr R9 290Xs in quadfire, none of them reached 94°C or even close; they all ran between 78-82°C. Assuming good case airflow, you should be able to keep a pair of non-reference-cooled cards under the 95°C limit


----------



## maynard14

Is my card broken? Even in MSI Afterburner, trying to OC to 1150 GPU core with the power limit maxed at +50 and the max +100 mV, it still artifacts like crazy.

Reference XFX R9 290X, 80 percent fan speed. What could be wrong?


----------



## battleaxe

Quote:


> Originally Posted by *maynard14*
> 
> Is my card broken? Even in MSI Afterburner, trying to OC to 1150 GPU core with the power limit maxed at +50 and the max +100 mV, it still artifacts like crazy.
> 
> Reference XFX R9 290X, 80 percent fan speed. What could be wrong?


Might not be able to handle 1150mhz. Not all of them can.

For that matter, not all of them can even hit 1100 on the core... FYI


----------



## Mega Man

Broken vs. not being able to OC but working at stock are two different things.


----------



## maynard14

Quote:


> Originally Posted by *Mega Man*
> 
> Broken vs. not being able to OC but working at stock are two different things.


Yes, I know, but the last time I tried to OC to 1150 it worked, no artifacts, no glitches. And I only OC for benching, not gaming.


----------



## rdr09

Crossed 18K in graphics at only 1200 core with the new Beta . . .

14.9.1 vs 14.X

http://www.3dmark.com/compare/3dm11/8811868/3dm11/8765640


----------



## Buehlar

Well guys... I had to RMA one of my GPUs. It worked fine as the secondary card for CrossFire, but when I tested it as my primary I got plagued with black screens...









...sooo... yeah... back in the box it went. Still within my 30-day Newegg replacement window. Good thing I tested it before putting the waterblock on.

I've installed the block on the good GPU and am loving these temps









@1200/1500 @1.35v http://www.3dmark.com/fs/2931000

max core temp @53°c
max vrm1 @55°c
max vrm2 @39°c



It's temporarily cooled by a single EX420 at the moment, with a chipset and HDD block tied into the loop.
I'll add the external AX480 back into the loop once I get to test and waterblock the new card.

A single card beats my old CrossFire 7870s, so I'm pretty stoked


----------



## gkolarov

Hi, I'm thinking about a second R9 290 for CrossFire with my PCS+, but I'm wondering about a couple of things:

1. Can I cross my R9 290 PCS+ with a Sapphire Tri-X, or is it better to look for another PCS+ card? The prices are the same; I'm concerned about possible stability issues because of the different BIOSes.
2. I also have concerns about cooling. Right now I overclock my PCS+ to 1200/6800 with +175mV on the core and the VRMs go to 89-92 degrees in games, with the fan at 80% (quite loud). I suppose with two cards there will be no need to overclock so high, so the cards won't get as hot, but there will be two of them with twice the amount of hot air to dissipate.

My case is a HAF X. I'm attaching a picture of it (the CPU water cooling is mounted at the top of the case where the DVD drive is in the picture; in its previous place on the back of the case there is now a 140mm fan).


----------



## Imprezzion

I wonder... I have a second 290 lying around here that I bought as ''being broken'' for €20, just for the reference cooler, since I chopped mine up for VRM cooling under the Accelero Hybrid...
It's an ASUS, and out of nowhere it just stopped giving any signal to a monitor. None of the card's ports seem to work anymore..

However.. I can hear the Windows startup sound.. So the PC does actually boot with the card in.

I tried flashing the BIOS with it in my second PCIe slot next to my working 290 @ 290X unlock, and ATIFlash recognizes it and I can flash it. That didn't fix the issue, but.. since the PC seems to see the card there..

How big is the chance it will work as a secondary card in CrossFire.....?


----------



## kizwan

Quote:


> Originally Posted by *tainle*
> 
> can you tell me that does r9 290x non-references card running in crossfire will get 94C in full load for both cards? i understand that when a single non-reference card can run cool at 80C average for most of them...


If my reference cards in CrossFire can run in the 80s Celsius with custom fan profiles, there's no reason non-reference cards can't do the same or even better.
Quote:


> Originally Posted by *SadisticMajor*
> 
> Well, I don't have a modular power supply, and it seems all the cables are taken except one 4-pin that has a fan siphoning off it. I had a guy build it for me as a gaming rig, and he didn't even include the graphics card (hence I'm ordering the 290X), so I don't really know how he set it up. I opened the case to see the cables and try to figure out how to plug it in ahead of time, but there seemed to be nothing extra. I would have thought they would come standard on a 500 W to 1000 W power supply. My only guess is I'll have to use a divider of some sort. The split ends of the Y connectors seem like they would fit in that one plug, but I assume I'll need three more.


What is the brand of the PSU? Rule of thumb: if possible, avoid Y connectors. It's best to connect both PCIe power connectors directly from the PSU (basically a separate cable from the PSU for each PCIe power connector).
Quote:


> Originally Posted by *maynard14*
> 
> is my card broken even in msi after burner tried to oc to 1150 fpu core, max power limit of 50, max 100 mv volts still artifacts like crazy
> 
> reference xfx r9 290x 80 percent fan speed, what could be wrong


That just means it needs more voltage. Were you using 14.9 when you got the artifacts?
Quote:


> Originally Posted by *rdr09*
> 
> Crossed 18K in graphics at only 1200 core with the new Beta . . .
> 
> 14.9.1 vs 14.X
> 
> http://www.3dmark.com/compare/3dm11/8811868/3dm11/8765640


My 3DMark11 version & "UI Version" have an "s" at the end though. No idea what that is.

18K here too: 14.9 vs. 14.6
http://www.3dmark.com/compare/3dm11/8812448/3dm11/8382247
Quote:


> Originally Posted by *Imprezzion*
> 
> I wonder... I have a second 290 lying around here that I bought as ''being broken'' for €20, just for the reference cooler, since I chopped mine up for VRM cooling under the Accelero Hybrid...
> It's an ASUS, and out of nowhere it just stopped giving any signal to a monitor. None of the card's ports seem to work anymore..
> 
> *However.. I can hear the Windows startup sound.. So the PC does actually boot with the card in.*
> 
> I tried flashing the BIOS with it in my second PCIe slot next to my working 290 @ 290X unlock, and ATIFlash recognizes it and I can flash it. That didn't fix the issue, but.. since the PC seems to see the card there..
> 
> *How big is the chance it will work as a secondary card in CrossFire.....?*


Not necessarily. Windows can still boot without a GPU being detected.

Why don't you try it? If the clocks go up (on the faulty card) & performance increases, it's working. I would say the chance is slim, so better not get your hopes up. Still, I would try it.


----------



## battleaxe

Quote:


> Originally Posted by *maynard14*
> 
> Yes, I know, but the last time I tried to OC to 1150 it worked, no artifacts, no glitches. And I only OC for benching, not gaming.


New drivers can make a difference. Several things could: a fan not turning, overheating, drivers, an unstable OC... etc...


----------



## battleaxe

Quote:


> Originally Posted by *Imprezzion*
> 
> I wonder... I have a second 290 lying around here that I bought as ''being broken'' for €20, just for the reference cooler, since I chopped mine up for VRM cooling under the Accelero Hybrid...
> It's an ASUS, and out of nowhere it just stopped giving any signal to a monitor. None of the card's ports seem to work anymore..
> 
> However.. I can hear the Windows startup sound.. So the PC does actually boot with the card in.
> 
> I tried flashing the BIOS with it in my second PCIe slot next to my working 290 @ 290X unlock, and ATIFlash recognizes it and I can flash it. That didn't fix the issue, but.. since the PC seems to see the card there..
> 
> How big is the chance it will work as a secondary card in CrossFire.....?


Boot into safe mode and make sure AB isn't applying an OC on startup. Then reboot. This has happened to me before.


----------



## Aaron_Henderson

Hey guys, I've been using my DCII OC for about a week now, and I still can't figure out why it won't connect properly to my HDTV without scaling issues, wrong resolutions, etc. It literally will not allow me to set 1920x1080 @ 60Hz; it's not even in the list of resolutions, even if I force it through Catalyst. Quite frustrating, and I would love a solution. Also, I'm able to run 1080p on a 1440x900 monitor without forcing anything... yet can't on my TV... I tried DVI-HDMI on both DVI ports, as well as the HDMI out port, and they all have the same results... I did already post this, but am hoping it doesn't get missed this time.


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> If my reference cards in CrossFire can run in the 80s Celsius with custom fan profiles, there's no reason non-reference cards can't do the same or even better.
> What is the brand of the PSU? Rule of thumb: if possible, avoid Y connectors. It's best to connect both PCIe power connectors directly from the PSU (basically a separate cable from the PSU for each PCIe power connector).
> That just means it needs more voltage. Were you using 14.9 when you got the artifacts?
> My 3DMark11 version & "UI Version" have an "s" at the end though. No idea what that is.
> 
> 18K here too: 14.9 vs. 14.6
> http://www.3dmark.com/compare/3dm11/8812448/3dm11/8382247


FS scores are down, though. Haven't tried any games.

BTW, again: just installing the beta uninstalled the 14.X without issues. No DDU, no safe mode, no problems. It worked for both my 7950 and 290 every time!

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> Hey guys, been using my DCII OC for about a week now, and I still can't figure out why it won't connect properly to my HDTV without scaling issues, wrong resolutions, etc. It literally will not allow me to set 1920x1080 @ 60HZ, it's not even in the list of resolutions, even if I force it through Catalyst. Quite frustrating, and would love for a solution. Also, I am able to run 1080p on a 1440x900 monitor without forcing anything...yet can't on my TV...I tried DVI-HDMI on both DVI ports, as well as the HDMI out port, and they all have the same results...I did already post here but was hoping it might not get missed this time.


Have you tried adjusting the scaling using the Scaling option in CCC?

My Digital Panels > Scaling Options > Use the slider


----------



## Aaron_Henderson

Quote:


> Originally Posted by *rdr09*
> 
> FS scores are down, though. Haven't tried any games.
> 
> BTW, again: just installing the beta uninstalled the 14.X without issues. No DDU, no safe mode, no problems. It worked for both my 7950 and 290 every time!
> Have you tried adjusting the scaling using the Scaling option in CCC?
> 
> My Digital Panels > Scaling Options > Use the slider


I can't even get it to set 1920x1080 @ 60Hz, but I did play with the slider just to see what it does. It won't help in my scenario.


----------



## BradleyW

Quote:


> Originally Posted by *BradleyW*
> 
> The only thing wrong with AMD drivers is the driver overhead. We are losing large amounts of frames per second. Instead of trying to fix this, AMD hides behind Mantle. Granted, Mantle is great and it inspired DirectX 12. But until then, fix the overhead please! It's ridiculous. Nvidia fixed their overhead with the game-ready drivers a few months back. Here is an example.
> 
> Nvidia:
> 
> 
> AMD:
> 
> 
> *Note that the same CPUs perform far worse on the AMD 295X2 system. CFX is being massively bottlenecked by AMD driver overhead. This is seen in every game I've played over the past 3 years. It's a cold hard fact and needs to be addressed right now.*


----------



## rdr09

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> I can't even get it to set 1920x1080 @ 60Hz, but I did play with the slider just to see what it does. It won't help in my scenario.


idk, could be your cable. Let me be clear: when you check "List All Modes"... you don't see 1080?


----------



## rdr09

Quote:


> Originally Posted by *BradleyW*


i don't understand. care to explain this . . .



?

Bradley, are you there?


----------



## sugarhell

Quote:


> Originally Posted by *rdr09*
> 
> i don't understand. care to explain this . . .
> 
> 
> 
> ?


It's just a graph without context


----------



## BradleyW

Quote:


> Originally Posted by *rdr09*
> 
> i don't understand. care to explain this . . .
> 
> 
> 
> ?
> 
> Bradley, are you there?


Yes, I can explain. It's 4K; the bottleneck is somewhat removed. But at 1080p, the CPU is heavily bottlenecked by AMD driver overhead.

However, if that 295X2 could get the fps up to the 182 mark at 4K, you'd see the bottleneck again. Higher resolution somewhat hides the bottleneck because the fps isn't high enough to run into the limitation.

For this game it's not that important. However, many other games have suffered greatly in CFX in CPU-bound sections simply due to AMD's extensive driver overhead. Ever wondered why SLI has higher min fps at 1080p and 1440p compared to CFX in most games?


----------



## rdr09

Quote:


> Originally Posted by *sugarhell*
> 
> Its just a graph without context


you know this stuff better than most. So the AMD driver takes up more CPU overhead, but the cards compensate with scaling?

edit: lol at the 760 and 7950. It's worse than Anand.


----------



## BradleyW


Also, how can CFX make up for it with scaling? It's the scaling that's getting bottlenecked in the first place. Nvidia have addressed this issue; AMD hide behind Mantle. Again, I'm not hating on Mantle, because it works perfectly. But when most games are DirectX, that makes Mantle less important, despite it inspiring DirectX 12's low-level API qualities.


----------



## sugarhell

Quote:


> Originally Posted by *rdr09*
> 
> you know these stuff better than most. so, the amd driver takes up more cpu overhead but the cards compensate on scaling?


A few more CPU cycles, yeah, but nothing as stupid as this.

The 5960X is 3GHz at stock and boosts 2 cores, IIRC, to 3.5. The 3970X is 3.5GHz stock and boosts to 4GHz. The game doesn't scale past 6 cores, and still we see the 5960X, with much lower clocks and only 6 cores in use, winning by over 20 fps.

The results don't make sense for either AMD or Nvidia. Intel's 6-core and 8-core parts should perform the same, with the edge going to the 4770K because it boosts to 3.9.

Also:



Look at the scaling


----------



## rdr09

Quote:


> Originally Posted by *BradleyW*
> 
> Yes I can explain. It's 4K. The bottleneck is somewhat removed. But at 1080p, the CPU is heavily bottlenecked by AMD driver overhead.
> 
> However if that 295X2 could get the fps up to the 182 mark at 4K, you'd see the bottleneck again. Higher resolution is somewhat hiding the existence of a bottleneck because fps is not high enough to run into the limitation.
> 
> For this game it's not that important. However many other games have suffered greatly in the CFX department in CPU bound parts of games simply due to AMD's extensive driver overhead. Ever wondered why SLI has higher min fps at 1080p and 1440p compared to CFX in most games?
> 
> Also, how can CFX make up for the scalability? It's the scalability that's getting bottlenecked in the first place. Nvidia have addressed this issue. AMD hide behind Mantle. Again, not hating on Mantle, because it works perfectly. But when most games are Direct-X, it makes Mantle less important despite inspiring Direct-X 12's low level API qualities.


min?



i think you are trying to psych yourself into going 980s.


----------



## rdr09

Quote:


> Originally Posted by *sugarhell*
> 
> A few more CPU cycles, yeah, but nothing as stupid as this.
> 
> The 5960X is 3GHz at stock and boosts 2 cores, IIRC, to 3.5. The 3970X is 3.5GHz stock and boosts to 4GHz. The game doesn't scale past 6 cores, and still we see the 5960X, with much lower clocks and only 6 cores in use, winning by over 20 fps.
> 
> The results don't make sense for either AMD or Nvidia. Intel's 6-core and 8-core parts should perform the same, with the edge going to the 4770K because it boosts to 3.9.
> 
> Also:
> 
> 
> 
> Look at the scaling


+rep. Let me do my homework.

You guys have a nice night or day.


----------



## BradleyW

Quote:


> Originally Posted by *rdr09*
> 
> min?
> 
> 
> 
> i think you are trying to psych yourself into going 980s.


I think I'd rather wait for the 980 Ti, or see what the 390X can offer. As for the min on that 1440p benchmark, the results are good, they really are. But look at the 1080p benchmark; the overhead is silly. I'm sure you'll agree. Why? AMD driver overhead.

@sugarhell

Usage isn't always an indication of scaling. I've had 100% usage on both cards in the past with only 1.5x scaling at best in some titles (tested in GPU-intensive parts of games).


----------



## zealord

guys, I think we can look forward to Ryse: Son of Rome









http://www.overclock.net/t/1517881/pcgh-ryse-son-of-rome-benchmarks


----------



## Imprezzion

Quote:


> Originally Posted by *battleaxe*
> 
> Boot into safe mode and make sure AB isn't applying an OC on startup. Then reboot. This has happened to me before.


It does not boot into safe mode, and there's no BIOS screen either; there's just no output on DVI, HDMI or DP.
I bought it as a broken card just for the stock cooler, but it would be the most epic €20 ever spent if it works in CrossFire somehow!
It probably won't unlock, so I'll have to run the stock BIOS on my unlocked XFX for proper scaling, but yeah, 2x 290 beats a single 290X @ 1200 easily


----------



## sugarhell

Quote:


> Originally Posted by *BradleyW*
> 
> I think I'd rather wait for the 980 Ti, or see what the 390X can offer. As for the min on that 1440p benchmark, the results are good, they really are. But look at the 1080p benchmark; the overhead is silly. I'm sure you'll agree. Why? AMD driver overhead.
> 
> @sugarhell
> 
> Usage isn't always an indication of scaling. I've had 100% usage on both cards in the past with only 1.5x scaling at best in some titles (tested in GPU-intensive parts of games).


Look at the graphs: the 290X gets 96 fps, the 295X2 181. Almost perfect scaling.

You have zero evidence, and you don't even question that the CPU results are really strange for both Nvidia and AMD. A 5960X shouldn't have 20 fps over the 3970X or 4770K.
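For what it's worth, the scaling cited here can be quantified directly from the quoted averages:

```python
# CrossFire scaling efficiency from the quoted averages: a single 290X
# at 96 fps vs the dual-GPU 295X2 at 181 fps (figures taken from the
# review being discussed, used here purely as inputs).
def scaling_efficiency(single_fps, dual_fps, n_gpus=2):
    """Fraction of ideal n-way scaling actually achieved."""
    return (dual_fps / single_fps) / n_gpus

eff = scaling_efficiency(96, 181)
print(f"{eff:.1%} of ideal 2-way scaling")
```

Anything above roughly 90% is usually considered near-perfect multi-GPU scaling, which supports the point that the GPUs aren't the anomaly in that benchmark; the CPU numbers are.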


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> Look at the graphs: the 290X gets 96 fps, the 295X2 181. Almost perfect scaling.
> 
> You have zero evidence, and you don't even question that the CPU results are really strange for both Nvidia and AMD. A 5960X shouldn't have 20 fps over the 3970X or 4770K.


Do you think these guys might have done some shoddy testing?


----------



## sugarhell

Quote:


> Originally Posted by *BradleyW*
> 
> Do you think these guys might have done some shoddy testing?


No. If you translate the article, they say they found some locations where the GPU never reaches 99% usage. Maybe it's a glitch or something. Either way, the results are over 200 fps, which is kinda high for the CPU.


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> No. If you translate the article, they say they found some locations where the GPU never reaches 99% usage. Maybe it's a glitch or something. Either way, the results are over 200 fps, which is kinda high for the CPU.


I noticed this. On the first level I looked down a corridor and the fps plummeted from 300 to 140. CPU usage spiked and GPU usage dropped.

There's no doubt that all the CPU results are far lower on the AMD system. Also, average fps doesn't mean anything; anything can shoot the average up. Look at the minimum fps: it supports there being no difference between 8 cores and 6 in this game. Which brings me back to my original point that AMD overhead is a cause.


----------



## igrease

I swear, every driver I've used past 14.6 has actually given me worse performance in Battlefield 4. I don't understand how that can happen.

14.6-
100 ~ 120 fps
No stutters

14.7+
55 ~ 85 fps
Stutters pretty much all the time.

This is on Mantle, by the way.


----------



## sugarhell

Quote:


> Originally Posted by *igrease*
> 
> I swear, every driver I've used past 14.6 has actually given me worse performance in Battlefield 4. I don't understand how that can happen.
> 
> 14.6-
> 100 ~ 120 fps
> No stutters
> 
> 14.7+
> 55 ~ 85 fps
> Stutters pretty much all the time.
> 
> This is on Mantle, by the way.


Hmm, big difference. Did you try cleaning with DDU?

1) Uninstall the AMD drivers normally.
2) Then use DDU in safe mode to clean the registry.
3) Use CCleaner to clean the registry further.
4) Install the new drivers.

If that doesn't help, maybe it's a problem with the BF4 installation or settings. Can you check CPU usage when the stutter happens?


----------



## Aaron_Henderson

Quote:


> Originally Posted by *rdr09*
> 
> idk, could be your cable. let me just be clear. when you check list all modes . . . you don't see 1080?


No, I can't get it to show 1920x1080 in the list... I'll try hooking it up again later and take some screens of the Catalyst software to show what I mean... I'm pretty sure it's just reading the EDID wrong somehow, as it says the maximum supported resolution is [email protected] (in Catalyst software), though that is wrong; it should only be 60Hz. And it won't let me add 1920x1080 to the list of resolutions; well, I can do 1080i (30Hz) under "HDTV". If I try to force the resolution from the Catalyst software, nothing happens. It works fine on another monitor. The cable should be fine: it works on my Blu-ray player, and the TV + cable were working fine with the GTX 570 SLI setup I had before the 290X. Another thing to mention is that when I hook the TV up via HDMI, a message pops up and says HDMI audio out will not function because the display does not support it, or something to that extent.


----------



## sugarhell

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> No, I can't get it to show 1920x1080 in the list... I'll try hooking it up again later and take some screens of the Catalyst software to show what I mean... I'm pretty sure it's just reading the EDID wrong somehow, as it says the maximum supported resolution is [email protected] (in Catalyst software), though that is wrong; it should only be 60Hz. And it won't let me add 1920x1080 to the list of resolutions; well, I can do 1080i (30Hz) under "HDTV". If I try to force the resolution from the Catalyst software, nothing happens. It works fine on another monitor. The cable should be fine: it works on my Blu-ray player, and the TV + cable were working fine with the GTX 570 SLI setup I had before the 290X. Another thing to mention is that when I hook the TV up via HDMI, a message pops up and says HDMI audio out will not function because the display does not support it, or something to that extent.


It reports the maximum refresh rate of the panel.

Did you try reinstalling the drivers?


----------



## igrease

Quote:


> Originally Posted by *sugarhell*
> 
> Hmm, big difference. Did you try to clean with DDU?
> 
> 1) Uninstall the AMD drivers normally.
> 2) Then use DDU in safe mode to clean the registry.
> 3) Use CCleaner to clean the registry further.
> 4) Install the new drivers.
> 
> If not, maybe it's a problem with the BF4 installation or settings. Can you check CPU usage when the stutter happens?


Yes, I use DDU to uninstall, like always. And there shouldn't be anything wrong with the BF4 installation.


----------



## ZealotKi11er

Quote:


> Originally Posted by *BradleyW*
> 
> I noticed this. On the first level I looked down a corridor and the fps plummeted from 300 to 140. CPU usage spiked and GPU usage dropped.
> 
> There is no doubt that all the CPU results are far lower on the AMD system. Also, average fps does not mean anything; anything can shoot an average up. Look at the minimum fps: it shows there is no difference between 8 cores and 6 in this game. This brings me back to my original point that AMD driver overhead is a cause.


I am on the same page as you. I don't understand why AMD cards now do so badly in DX11 with the CPU. We all know Nvidia improved their DX11 driver to counter Mantle, but Mantle is not in every game. As far as Alien goes, the fps are too high to care about. Also, they are using a 5960X @ 4.6GHz, the fastest CPU available, which is not a representative example.
I hope AMD does something about DX11, or maybe they are not bothering anymore since DX12 will come shortly and they'll let this slide until people forget.


----------



## Wezzor

Should I be happy with 1100/1400 and power limit on +50%? I've tried to go for more but I get artifacts so I need to start feeding voltage to the card. My card during load on a heavy game is around 64-66C same goes with VRM temps.


----------



## BradleyW

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am same page with you. I dont understand why AMD cards do so bad in DX11 now with CPU. We all know Nvidia improved their DX11 driver to attack Mantle but Mantle is not in every game. As far as Alien goes the fps are too high to care. Also they are using 5960X @ 4.6Ghz which is the fastest CPU available which is not a good example.
> I hope AMD does something about DX11 or maybe they are not bothering anymore since DX12 will come shortly and pass this up until people forget.


Thank the lord for someone else who can see this. I've done my own extensive testing. This issue exists and it's a big issue in my eyes. I sure hope we don't suffer the same fate with the AMD drivers in the future, for DX12 at least.


----------



## DividebyZERO

So am I reading that fps dropped from 300 to 140?? This is gonna sound rough but seriously 140fps? Please explain this to me why 140fps is bad?


----------



## ZealotKi11er

Quote:


> Originally Posted by *BradleyW*
> 
> Thank the lord for someone else who can see this. I've done my own extensive testing. This issue exists and it's a big issue in my eyes. I sure hope don't suffer the same fate with the AMD drivers in the future for DX12 at least.


The games where I experience this the most are BF4 and BF3 in DX11. I can use Mantle, but Mantle does not work well @ 4K because of a memory leak. Basically, I have to run BF4 @ 4K Ultra to eliminate the CPU bottleneck caused by AMD drivers. I really wish they would release a driver to fix this problem. There are going to be a lot of DX11 games even after DX12 comes out.


----------



## Widde

Anyone know how I can get my eyefinity setup to work?







I have two BenQ G2440HDBL 24-inch monitors and one Samsung SyncMaster 2243BW 22-inch. The thing is, they are different resolutions; the Samsung's resolution is 1680 x 1050. Is there any way to force it to 1080p?







Or do I need to get another monitor? I got it to work, but it looks horrid :S


----------



## ZealotKi11er

Quote:


> Originally Posted by *Widde*
> 
> Anyone know how I can get my eyefinity setup to work?
> 
> 
> 
> 
> 
> 
> 
> Have 2 benq g2440hdbl 24 inch and one Samsung Syncmaster 2243BW 22 inch, Thing is they are different resolutions, Samsungs resolution is 1680 x 1050, Is there any way to force it to 1080?
> 
> 
> 
> 
> 
> 
> 
> Or do I need to get another monitor. Got it to work with it but it looks horrid :S


Pretty sure Eyefinity supports monitors with different resolutions.


----------



## Aaron_Henderson

Quote:


> Originally Posted by *sugarhell*
> 
> it reports the maximum refresh rate of the panel.
> 
> Did you try reinstall the drivers?


Yeah, a couple of times now, same result. I am going to give a different cable a try. I tried another one I have that I know works with my Bluray player, and I couldn't even get it to detect my TV; while it detects on my current cable, I'm just not able to set the native resolution. Nothing I have tried so far has made much difference except the cable, so I'll get a better one and cross my fingers.


----------



## kizwan

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *sugarhell*
> 
> it reports the maximum refresh rate of the panel.
> 
> Did you try reinstall the drivers?
> 
> 
> 
> 
> 
> 
> 
> Yeah, a couple times now...same result. I am going to give a different cable a try...I tried another one I have that I know works with my Bluray player, and I couldn't even get it to detect my TV when I tried it...while it detects on my other cable, just not able to set native resolution. No matter what I have tried so far has made much difference except the cable, so I'll get a better one and cross my fingers.
Click to expand...

What is the exact model of the HDTV?


----------



## Widde

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Pretty sure Eyefinity supports monitors with different resolutions.


Cant seem to get it to work without looking horrible :S

It downscales my two 1080p monitors and games look awful. Is there a way to force the 1680 x 1050 panel to 1920x1080?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Widde*
> 
> Cant seem to get it to work without looking horrible :S
> 
> It downscales my 2 1080p monitors and games look awful, Is there a way to force the 1680 x 1050p to 1920x1080?


Not with the current AMD drivers you can't. That would be down-sampling.


----------



## Widde

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Not with current AMD drivers you cant. That would be Down-sampling.


I kinda got it to work now. It downscales my 1080p monitors to 1050, so now I'm running 5040 x 1050. Just tested Crysis 3 and it looked alright. Only WoW looks horrid, but that's fine with a single monitor. Shadow of Mordor doesn't seem to support Eyefinity yet, though.
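For what it's worth, the 5040 x 1050 figure follows from the group being driven at a lowest common mode across the mixed panels. A rough sketch of that arithmetic (illustrative only; `eyefinity_group` is a made-up helper, and AMD's actual grouping logic lives in the driver):

```python
def eyefinity_group(panels):
    """Given (width, height) per panel in a horizontal group, assume every
    panel is driven at the lowest common mode, so the group resolution is
    that common width summed across panels at the lowest height."""
    common_w = min(w for w, h in panels)
    common_h = min(h for w, h in panels)
    return (common_w * len(panels), common_h)

# Two 1080p BenQs plus the 1680x1050 Samsung:
print(eyefinity_group([(1920, 1080), (1920, 1080), (1680, 1050)]))
# -> (5040, 1050), matching the group resolution reported above
```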


----------



## Aaron_Henderson

Quote:


> Originally Posted by *kizwan*
> 
> What is the exact model of the HDTV?


Just a cheapo Insignia - http://www.insigniaproducts.com/products/televisions/NS-32L450A11.html


----------



## BradleyW

Quote:


> Originally Posted by *DividebyZERO*
> 
> So am I reading that fps dropped from 300 to 140?? This is gonna sound rough but seriously 140fps? Please explain this to me why 140fps is bad?


I am using an example to show possible driver overhead.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> The game i have experience this the most is DX11 BF4 and BF3. I can use Mantle but Mantle does not work well @ 4K because of memory leak. Basically i have to run BF4 @ 4K Ultra to eliminate CPU bottleneck cause by AMD drivers. I really wished they release a driver to fix this problem. There are going to be a lot of DX11 game even after DX12 comes out.


Yes, this is a concern for me.
Just to show you the poor DX11 performance with AMD drivers, listen to this:
BF4 + 1080p + Ultra + FOV 85 + 32 man Shanghai conquest (Top of skyscraper looking down on the front side of the building).
DX11.1 = min fps 68
Mantle = min fps 112

If that isn't bottleneck 101, I don't know what is! For ages I was saying I was bottlenecked and nobody would believe me. Then I posted that evidence with screenshots and everyone was like... oh... so a 3930K can bottleneck 290Xs? Now I see people informing others about my findings. In fact, it's been one of the hottest topics on OCN since I came out with these claims and backed them with evidence.

In other words, poor optimization in games + AMD drivers = derp.
Don't get me wrong, I also hate many things with Nvidia!
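For scale, the minimum-fps figures quoted above amount to roughly a two-thirds uplift for Mantle; a quick check of the arithmetic:

```python
# Relative uplift implied by the min-fps figures quoted above.
dx11_min = 68
mantle_min = 112

uplift_pct = (mantle_min - dx11_min) / dx11_min * 100
print(f"Mantle min fps is {uplift_pct:.0f}% higher than DX11")  # -> 65% higher
```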


----------



## sugarhell

And I never had a single bottleneck with my 3770K and 3x 7970s in BF4 after 200 hours. You need time, knowledge and patience to tweak an OCed system.


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> And i never had a single bottleneck with my 3770k and 3 7970 on bf4 after 200 hours. You need time,knowledge and patience to tweak an oced system.


Did you actually go looking for a bottleneck by doing extensive testing on suspected bottlenecked scenarios? Did you play at 1080p? Can you explain your second sentence a little more? I have the feeling you are suggesting that I don't have the, as you put it, time, knowledge and patience to tweak an overclocked system. Would I be wrong with that assumption?


----------



## rdr09

Quote:


> Originally Posted by *BradleyW*
> 
> I am using an example to show possible driver overhead.
> Yes, this is a concern for me.
> Just to show you the poor DX11 performance with AMD drivers, listen to this:
> BF4 + 1080p + Ultra + FOV 85 + 32 man Shanghai conquest (Top of skyscraper looking down on the front side of the building).
> DX11.1 = min fps 68
> Mantle = min fps 112
> 
> If that isn't bottleneck 101, I don't know what is! For ages I was screaming out that I was bottlenecked. Nobody would believe me. I then posted that evidence with SS's and everyone was like....oh....so a 3930K can bottleneck 290X's? Now I see people informing others about my findings. In fact it's one of the hottest topics on OCN since I came out with all these claims with evidence later on.
> 
> In other words, poor optimization in games + AMD drivers = derp.
> Don't get me wrong, I also hate many things with Nvidia!


You are making me nervous. I just got another 290 ref with a block for $260. My SB might moan. Or, rather, it will.


----------



## sugarhell

Quote:


> Originally Posted by *BradleyW*
> 
> Did you actually go looking for a bottleneck by making extensive tests on suspected bottlenecked situations? Did you play at 1080p?


200 hours. My highest CPU usage was 45%. I changed from 1080p Eyefinity to 1440p 120Hz, and now I am playing at 4K.


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> 200 hours. My highest *CPU usage was 45%*. I changed from *1080p Eyefinity* to 1440p 120Hz, and now I am playing at 4K.


Read the bold bits. No wonder you did not see a bottleneck. I was on a single 1080p screen, and I suffered CPU limitation as a result because my frame rate was being pushed. If AMD's overhead were lower, that 68 fps might have been higher. Also, all cores were pegged at 99%. All 6 of them.


----------



## sugarhell

Quote:


> Originally Posted by *BradleyW*
> 
> Read the bold bits. No wonder you did not see a bottleneck. I was on a single 1080p screen.


A BF4 MP bottleneck is because of AI and network. If you have a CPU bottleneck, it's not because of the resolution.


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> Bf4 MP bottleneck is because of AI and Network. If you have a cpu bottleneck its not because of the resolution.


Resolution is partly tied to a CPU bottleneck. Overhead in drivers and games is the main culprit; the CPU itself is fine. It's the software that always lets you down, and it determines your lowest fps on a high-end machine. Like I said, all 6 cores were pegged at 99%. Mantle fixed the low fps and the pegged 99% usage. CFX performance also shot up since there was no longer a limitation on draw calls.


----------



## sugarhell

Quote:


> Originally Posted by *BradleyW*
> 
> Resolution is partly tied to CPU bottleneck. Overhead in drivers and games is the main culprit. CPU itself is fine. It's the software that always lets you down. It determines your lowest fps when using a high end machine. Like I said, all 6 cores pegged at 99. Mantle fixed the low fps and the pegged 99 usage. CFX performance also shot up since there was no longer a limitation with the draw calls.


Someone said not to try to debate with you. I see the reason.

If all 6 of your cores are pegged at 99% usage, I can easily say what's wrong: it's on your end.

Frostbite uses only 2 rendering threads + 1 DX11 API thread. The rest are for sound and the like. It's impossible for 6 cores to even spike at 99% usage.

I really hate when random people just read 2 articles on the internet and then spread misinformation.


----------



## bbond007

I did the "red mod" and wow, what a difference.


Spoiler: Warning: Spoiler!








what a difference.

best 3dmark yet.

http://www.3dmark.com/fs/2934941

plus $40 in rebates coming my way









anyway, is this a good heatsink for my VRMs?

http://www.ebay.com/itm/200801993333

thanks!

cheers!


----------



## sugarhell

Quote:


> Originally Posted by *bbond007*
> 
> i did the "red mod" and wow, what a difference.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> what a difference.
> 
> best 3dmark yet.
> 
> http://www.3dmark.com/fs/2934941
> 
> plus $40 in rebates coming my way
> 
> 
> 
> 
> 
> 
> 
> 
> 
> anyway, is this a good heatsink for my VRMs?
> 
> http://www.ebay.com/itm/200801993333
> 
> thanks!
> 
> cheers!


Yeah, it looks like a good heatsink. You can get the Gelid heatsink for the 290X too, IIRC.


----------



## kizwan

I actually did extensive tests (BF4) with DirectX vs. Mantle @1080p in CFX. As far as I can remember, BF4 did perform poorly (high CPU usage, a lot of CPU spikes) with the 13.x beta drivers, but CPU usage improved a lot with the 14.1 beta driver (DirectX). Basically, I did not experience any CPU bottleneck @1080p with the 14.1 beta driver using DirectX. Mantle improved the average FPS, but the difference between min/max was minimal. With the 14.1 beta driver, of course. I did not do any tests with later drivers, though.


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> Someone said not to try to debate with you. I see the reason.
> 
> If all 6 of your cores are pegged at 99% usage, I can easily say what's wrong: it's on your end.
> 
> Frostbite uses only 2 rendering threads + 1 DX11 API thread. The rest are for sound and the like. It's impossible for 6 cores to even spike at 99% usage.
> 
> I really hate when random people just read 2 articles on the internet and then spread misinformation.


I've done my own testing and supplied evidence. I've not just read two articles and proceeded to spread information; that's not what I do. I seek out information, user reports, my own testing, etc.
It's not impossible for 6 cores to spike at 99% usage. You've just not seen the elevated CPU usage in BF4 because you've always gamed at much higher resolutions, which put maximum strain on the GPUs. If you still have the 290Xs, I'd advise you to try a single 1080p screen and watch the CPU usage become pegged in various situations when CFX is enabled. It's as if the software or the CPU can't keep up with the GPUs sometimes. I'm not saying that's true, I'm saying it appears so.


----------



## rdr09

Quote:


> Originally Posted by *BradleyW*
> 
> I've made my own testing and supplied evidence. I've not just read two articles and proceeded to spread information. That's not what I do. I seek information, user reports, own testing ext.
> It's not impossible for 6 cores to spike at 99 usage. You've just not seen the elevated CPU usage on BF4 because you've always gamed at much higher resolutions which put maximum strain on the GPU's. If you still have the 290X's, I'd advise you to try single 1080p and watch the CPU usage become pegged in various situations when CFX is enabled. It's as if the software or the CPU can't keep up with the GPU's sometimes. I'm not saying that's true, I'm saying it appears so.


Even a single 7950 can max out BF4 at 1080p. Can't you disable CrossFire if using just 1080p?


----------



## sugarhell

Quote:


> Originally Posted by *BradleyW*
> 
> I've made my own testing and supplied evidence. I've not just read two articles and proceeded to spread information. That's not what I do. I seek information, user reports, own testing ext.
> It's not impossible for 6 cores to spike at 99 usage. You've just not seen the elevated CPU usage on BF4 because you've always gamed at much higher resolutions which put maximum strain on the GPU's. If you still have the 290X's, I'd advise you to try single 1080p and watch the CPU usage become pegged in various situations when CFX is enabled. It's as if the software or the CPU can't keep up with the GPU's sometimes. I'm not saying that's true, I'm saying it appears so.


It's impossible for 6 cores to spike at 99% usage because BF4 doesn't scale equally across cores.

Go play at 1080p in SP, and then go play MP. Check your usage. Same res but higher usage? Why does this happen? I will let you find out.


----------



## BradleyW

Quote:


> Originally Posted by *rdr09*
> 
> even a single 7950 can max out bf4 using 1080. can't you disable crossfire if using just 1080.


Well, luckily Mantle solved the issue by getting away from the elevated CPU scenario when CFX is enabled.
Quote:


> Originally Posted by *sugarhell*
> 
> Its impossible 6 cores to spike at 99% usage because bf4 doesnt scale equal.
> 
> Go play at 1080p SP and then go play MP. Check your usage. Same res but higher usage? Why this happens? I will let you to find out


Yes, higher usage in multiplayer. I understand there is a lot more going on, that's true, especially on those huge 64-man servers. I'm just saying that certain parts of certain maps throw my usage to 95-99% on each core (HT off) whilst reducing GPU usage. If I OC the CPU a little higher, the fps increases and the GPU usage also increases a bit. But at higher resolutions you'll not run into that sort of thing, because the GPU is always the bottlenecking factor when crunching that many pixels.









As for those assumptions that I read the odd article and spread things based on it: don't make assumptions like that, because it isn't true. I supplied details and conclusions from my own testing, which proves I don't do that sort of thing.


----------



## ZealotKi11er

Quote:


> Originally Posted by *BradleyW*
> 
> Well luckily Mantle solved the issue by getting away from the elevated CPU scenario when CFX is enabled.


I'm not sure how well Mantle is working for me; I have to do more testing. Right now I run the game at 1440p Ultra, 0xMSAA, 0xFXAA, and 150% resolution scaling.


----------



## Ramzinho

With the 3770K + CrossFire being brought up: I'm about to buy a second 290X, and where do you guys see a stock 3770K bottlenecking dual 290Xs? And what's the sweet spot on the CPU to avoid it?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ramzinho*
> 
> With the 3770K + Crossfire brought up.. i'm about to buy a second 290X and i where do you guys see a stock 3770k bottle necking dual 290x? and what's the sweet spot on the CPU not to ?


4.6GHz?

Other than Crysis 3, I really did not need my second 290 for 1440p. In BF4 it does help, since I use 150% resolution scaling. Once more demanding games come out, the second card will start to help. CPUs are not really getting much faster: 3770K OCed vs. the best Intel CPU OCed, in 95% of games there is not much difference.


----------



## BradleyW

Quote:


> Originally Posted by *Ramzinho*
> 
> With the 3770K + Crossfire brought up.. i'm about to buy a second 290X and i where do you guys see a stock 3770k bottle necking dual 290x? and what's the sweet spot on the CPU not to ?


I think you'll be OK at 1440p with the 3770K + 290X CFX. Overclocking it to around 4.5GHz or higher would help out a lot.








I guess you could also look into running a single GTX 980. Just a thought.


----------



## Ramzinho

Quote:


> Originally Posted by *BradleyW*
> 
> I think you'll be OK at 1440p with the 3770K + 290X CFX. Overclocking it to around 4.5GHz or higher would help out a lot.
> 
> 
> 
> 
> 
> 
> 
> 
> I guess you could also look into running a single GTX 980. Just a thought.


If I could I would, but sadly I can't, for many reasons.
1. My GPU is watercooled. If I decided to sell it now, I'd be throwing the block in the trash, simply because there is no watercooling market in my country. I can say, with only a bit of doubt, that mine might be one of maybe 10 out of 2 million gaming PCs in my country that is watercooled.
2. I have an outstanding offer on a 290X + block; it's so good that it's like I'm getting it for free.
3. A 980 is $200 more here in my country, at $780.


----------



## BradleyW

Quote:


> Originally Posted by *Ramzinho*
> 
> If i can i would but sadly i can't for many many reasons.
> 1- my GPU is watercooled. if i decide to sell it now i'd throw away the block in the trash simply cause there is no market of watercooling in my country. i can say with a bit of doubt. that i might be one of like 10 out of 2 million gaming PCs in my country that is watercooled.
> 2- is that i've an outstanding offer on a 290X+Block it's so good that it's like i'm taking it for free..
> 3- A 980 is 200$ More here in my country at 780$


Fair enough.


----------



## Ramzinho

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Not sure how well Mantle is working for me. I have to do more testing. Right now i run the game 1440p Ultra 0XMSAA - 0XFXAA and 150% resolution scaling.


Does 150% scaling equate to 4K?

Also guys, trust me, this is not hate or anything like that, but isn't downsampling a feature Nvidia is hyping as natively supported on their new GPUs? If so, how long has it been possible on AMD (or the 7XX series, if it was)? I personally didn't know about it until I followed Brad's posts.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ramzinho*
> 
> 150% scaling equates 4k ?


Yeah, at 1440p; it's 200% if at 1080p.

I can't play the game anymore without 150%. It looks too damn good. Also, I don't have to use Mantle, because the GPUs, even with an OC, are at 99% usage, which is nice.


----------



## BradleyW

Quote:


> Originally Posted by *Ramzinho*
> 
> 150% scaling equates 4k ?


1080p 200% = 4K
1440p 150% = 4K
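Those equivalences hold because the resolution-scale percentage applies per axis, so the rendered pixel count grows with the square of the scale. A quick check (`render_resolution` is just an illustrative helper, not anything from the game or driver):

```python
def render_resolution(width, height, scale_pct):
    """Internal render target implied by a per-axis resolution scale."""
    return (int(width * scale_pct / 100), int(height * scale_pct / 100))

print(render_resolution(1920, 1080, 200))  # -> (3840, 2160), i.e. 4K UHD
print(render_resolution(2560, 1440, 150))  # -> (3840, 2160), i.e. 4K UHD
```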


----------



## sugarhell

Quote:


> Originally Posted by *Ramzinho*
> 
> 150% scaling equates 4k ?
> 
> Also guys.. trust me this is not a Hate or anything like that? but isn't downsampling a feature Nvidia is Hyping for being natively featured in their new GPUs? if so how long has it been done on AMD or 7XX series if they were done .. i personally didn't know about it until i followed Brad's posts













Soon i believe


----------



## Ramzinho

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah is 1440p or 200% if 1080p.
> 
> I cant play the game anymore without 150%. It looks too dam good. Also i dont have to use Mantle because GPU even with OC are 99% usage which is nice.


Quote:


> Originally Posted by *BradleyW*
> 
> 1080p 200% = 4K
> 1440p 150% = 4K


Yeah, I've been asking Zaelo because he is on 1440p, as am I.


----------



## Ramzinho

Quote:


> Originally Posted by *sugarhell*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Soon i believe


I'm so sad; this seems to be in German and I can't understand a thing.


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Soon i believe


Let us all hope! This would be a really good feature to have.


----------



## ZealotKi11er

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah is 1440p or 200% if 1080p.


Quote:


> Originally Posted by *BradleyW*
> 
> Let us all hope! This would be a really good feature to have.


I was going to try that driver with built-in custom resolutions, but never really followed up with AMD.


----------



## BradleyW

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I was going to try that driver with build in Custom Resolutions but never really followed up with AMD.


Yeah, it would be nice to try. One thing I always noticed about SSAA and downsampling is that my mouse becomes slightly more delayed. I guess the down sample process adds a bit of latency?


----------



## ZealotKi11er

Quote:


> Originally Posted by *BradleyW*
> 
> Yeah, it would be nice to try. One thing I always noticed about SSAA and downsampling is that my mouse becomes slightly more delayed. I guess the down sample process adds a bit of latency?


It could add a few extra ms, but it's probably different if it's done from the drivers. This should work the same way as Nvidia's own SSAA.


----------



## BradleyW

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Could add extra ms but probably different if its done from the drivers. This should work same ways as Nvidias own SSAA.


Might depend on the game as well. I noticed no mouse lag with BF4's native scaling option. However, other games felt a bit more delayed. It's good to see in-game scaling though! Definitely a step in the right direction.


----------



## StrongForce

Just a quick question: would an R9 290 with the stock cooler "throttle", as some people say it does, or not even? I'm thinking about getting a second-hand one as a temporary solution; there seem to be plenty on eBay.


----------



## ZealotKi11er

Quote:


> Originally Posted by *StrongForce*
> 
> Just a quick question, would a r9 290 with stock cooler would "throttle" as some people say it does ? or not even ? I'm thinking about getting a second hand one as a temporary solution.. there are plenty on ebay it seem.


I don't think it does if you've got good case cooling. If you're worried, just increase the fan speed a bit.


----------



## tsm106

Quote:


> Originally Posted by *StrongForce*
> 
> Just a quick question, would a r9 290 with stock cooler would "throttle" as some people say it does ? or not even ? I'm thinking about getting a second hand one as a temporary solution.. there are plenty on ebay it seem.


All Hawaii cards will throttle to meet whatever limit is in effect. The only time one won't throttle is if you use one of the unlocked BIOSes, but that is not recommended on air.


----------



## Ramzinho

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Could add extra ms but probably different if its done from the drivers. This should work same ways as Nvidias own SSAA.


Quote:


> Originally Posted by *BradleyW*
> 
> Might depend on the game as well. I noticed no mouse lag with BF4's native scaling option. However, other games felt a bit more delayed. It's good to see in-game scaling though! Definitely a step in the right direction.


OK, that's confusing me. Downsampling gives you mouse lag if done in the driver, but not in-game? Please elaborate, guys.







... as you can see, I'm trying to maximize the output of my rig.


----------



## StrongForce

Quote:


> Originally Posted by *tsm106*
> 
> All hawaii cards will throttle to meet whatever limit is in effect. The only time it won't throttle is if you use one of the unlocked bios but that is not recommended on air.


I'm thinking of going crazy with a Kraken G10 + H55, plus extra VRM cooling of course. I might need a better case for that too; the Phanteks Enthoo Pro series was tempting me, as it seems like the best price/quality case around to me.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ramzinho*
> 
> Ok that's confusing me. downsampling gives you mouse lag? if done in driver but not in game? please elaborate more guys.
> 
> 
> 
> 
> 
> 
> 
> ... as you see i'm trying to maximize the output of my rig


Input lag. There is an extra process going on before you see the image: the GPU renders it @ 4K, then downsamples. In BF4 I had no problem with it, since it's native. To do it in other games you basically have to hack the drivers into thinking you have a 4K monitor.


----------



## Aaron_Henderson

So I tried a new HDMI cable, and still can't get my HDTV to display a proper 1080p signal...I give up


----------



## Widde

Quote:


> Originally Posted by *StrongForce*
> 
> I'm thinking of going crazy with a kraken g10+h55 .. and extra VRM cooling of course, I might need a better case for that too Phanteks Enthoo Pro Series was tempting me, seems like the best price/quality case arround to me.


http://piclair.com/y0po2 No throttling







A little noisy, but ^^ (CrossFired)


----------



## DividebyZERO

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> So I tried a new HDMI cable, and still can't get my HDTV to display a proper 1080p signal...I give up


What do you mean proper?


----------



## Aaron_Henderson

Quote:


> Originally Posted by *DividebyZERO*
> 
> What do you mean proper?


I made some posts a page or two back, but in a nutshell: I can't for the life of me get Catalyst to show 1920x1080 @ 60Hz in the list of resolutions. I've been trying for a few days now; I've done a fresh Windows install, re-installed drivers several times, and tried every port/adapter combo I could, 3 different HDMI cables, etc.


----------



## BradleyW

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> I made some posts a page or two back, but in a nutshell, I can't for the life of me get Catalyst to show 1920x1080 @ 60Hz in the list of resolutions...been trying now for a few days....tried fresh Windows install, re-installing drivers several times, every port/adapter combo I could try, 3 different HDMI cables, etc...


I'm sorry if this has been asked before, but have you tried a different TV or monitor? Just to confirm this is a completely isolated issue, which I assume it is.


----------



## DividebyZERO

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> I made some posts a page or two back, but in a nutshell, I can't for the life of me get Catalyst to show 1920x1080 @ 60Hz in the list of resolutions...been trying now for a few days....tried fresh Windows install, re-installing drivers several times, every port/adapter combo I could try, 3 different HDMI cables, etc...


I have a similar issue: I can see HDTV 1080p in the list, but not the 1920x1080 PC resolution. Is this what you mean?


----------



## StrongForce

Quote:


> Originally Posted by *Widde*
> 
> http://piclair.com/y0po2 No throtteling
> 
> 
> 
> 
> 
> 
> 
> alittle noisy but ^^ (crossfired)


Not sure what this graph means lol


----------



## Ramzinho

Quote:


> Originally Posted by *StrongForce*
> 
> Not sure what this graph means lol


It's a fan profile. Not so aggressive, but it's quite decent.


----------



## Aaron_Henderson

Quote:


> Originally Posted by *DividebyZERO*
> 
> I have a similar issue, i can see HDTV 1080p in the list, but not 1920x1080 PC resolutions. Is this what you mean?


I can't even get 1080p to show up under HDTV, even if I force it; I get 1000i and 1080i as my highest. Yet Catalyst reports the maximum resolution of my TV as 1920x1080 @ 75Hz. The highest resolution it will show @ 60Hz under the normal PC resolutions is 1280x1024 or something. I can randomly get it to show 1680x1050 and a bunch of non-16:9 resolutions if I unplug/replug it a bunch of times. I've actually gotten a couple of BSODs playing with the Catalyst software and changing resolutions around. The strangest thing is that during POST it displays a 1080p signal just fine; then, as soon as Windows begins to load, it kicks down to 720p.
Quote:


> Originally Posted by *BradleyW*
> 
> I'm sorry if this has been asked before but have you tried a different TV or monitor? Just seeing this is a completely isolated issue, which I assume it is.


The only other display I have with me at the moment is a 1440x900 75Hz monitor, and for some odd reason it LETS me set 1920x1080 60Hz, even though it's down-scaling. I agree it is likely just a rare incompatibility between the card and my TV, but I would love to figure it out somehow. Honestly, I am in the market for a new display anyway, but this is currently driving me up the wall, since it's my first AMD card in 10+ years and I never ran into so many issues with my Nvidia cards; they just seemed to work right without any headaches. I am not complaining though: I got a great deal on this card, and gaming-wise it's great. Missing simple down-sampling something awful though, let me tell you.
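For anyone wanting to check what the TV's EDID actually advertises (rather than what Catalyst says), the native mode lives in the first 18-byte Detailed Timing Descriptor of the EDID block. A minimal decoding sketch; the sample descriptor below is a hypothetical one for a 1920x1080 @ 60Hz mode, not a dump from this particular TV:

```python
def dtd_resolution(dtd):
    """Decode the active pixel counts from an 18-byte EDID Detailed
    Timing Descriptor (EDID 1.3/1.4 layout): low 8 bits in their own
    bytes, high 4 bits packed into the upper nibbles of bytes 4 and 7."""
    h_active = dtd[2] | ((dtd[4] & 0xF0) << 4)
    v_active = dtd[5] | ((dtd[7] & 0xF0) << 4)
    return (h_active, v_active)

# Hypothetical DTD for 1920x1080 (only the fields used above are meaningful):
sample = bytes([0x02, 0x3A,        # pixel clock: 148.50 MHz in 10 kHz units
                0x80, 0x18, 0x71,  # h active 1920, h blank 280
                0x38, 0x2D, 0x40,  # v active 1080, v blank 45
                0, 0, 0, 0, 0, 0, 0, 0, 0, 0])
print(dtd_resolution(sample))  # -> (1920, 1080)
```

If a mode like this is present in the EDID but Catalyst still won't list it, that points at the driver's EDID handling rather than the TV.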


----------



## StrongForce

Quote:


> Originally Posted by *Ramzinho*
> 
> It's a fan profile - not so aggressive. It's quite decent though.


Ah ok, I see! I saw in a review that the default fan profile had a dB level of nearly 50! Just like my old HD 5870, lol, which was not all that noisy once you start playing and have your headset on, but if I were to increase the default any it might be ..


----------



## DividebyZERO

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> I can't even get 1080p to show up under HDTV, even if I force it...I get 1000i and 1080i as my highest. Yet Catalyst reports the maximum resolution of my TV as 1920x1080 75Hz...the highest resolution it will show @ 60Hz under the normal PC resolutions is 1280x1024 or something. I can randomly get it to show 1680x1050 and a bunch of non-16:9 resolutions if I unplug/plug it in a bunch of times...I've actually gotten a couple of BSODs playing with the Catalyst software and changing resolutions around...strangest thing is that during POST, it displays a 1080p signal just fine, then as soon as Windows begins to load, it kicks down to 720p...
> Only other display I have with me at the moment is a 1440x900 75Hz monitor, and for some odd reason, it LETS me set 1920x1080 60Hz...even though it's down-scaling. I agree it is likely just a rare incompatibility between the card and my TV...but I would love to figure it out somehow. Honestly, I am in the market for a new display anyway, but this is currently driving me up the wall since it's my first AMD card in 10+ years, and I've never run into so many issues with my Nvidia cards...they just seemed to work right without any headaches. I am not complaining though, I got a great deal on this card, and gaming-wise it's great. Missing the simple down-sampling something awful though, let me tell you.


Have you tried an HDMI-to-DVI adapter and using the DVI port?


----------



## Aaron_Henderson

Quote:


> Originally Posted by *DividebyZERO*
> 
> Have you tried an HDMI-to-DVI adapter and using the DVI port?


Yeah, I have tried just about every combo of cable/port/adapter possible. I haven't bothered trying DVI to D-sub though, but an analog connection would not be ideal anyway. Also, another thing to note...before, when I tried hooking it up, I got a message about my display not being able to do audio over HDMI. With the new HDMI cable, I no longer get this message, but still no 1080p...


----------



## DividebyZERO

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> Yeah, I have tried just about every combo of cable/port/adapter possible. I haven't bothered trying DVI to D-sub though, but an analog connection would not be ideal anyway. Also, another thing to note...before, when I tried hooking it up, I got a message about my display not being able to do audio over HDMI. With the new HDMI cable, I no longer get this message, but still no 1080p...


Would you mind giving a list of information?

driver version, OS version, brand of screen, type of GPU, how many displays used at one time.

Can you see what Windows detects the monitor/HDTV as? Is it Generic PnP?


----------



## tsm106

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DividebyZERO*
> 
> Have you tried an HDMI-to-DVI adapter and using the DVI port?
> 
> 
> 
> Yeah, I have tried just about every combo of cable/port/adapter possible. I haven't bothered trying DVI to D-sub though, but an analog connection would not be ideal anyway. Also, another thing to note...before, when I tried hooking it up, I got a message about my display not being able to do audio over HDMI. With the new HDMI cable, I no longer get this message, but still no 1080p...

A proper HDMI-to-DVI adapter will make the HDMI monitor present as a DVI one, and for all intents and purposes it should be no different. Now that doesn't mean your TV will be able to run at the resolution/refresh rate you desire. What TV is it?

http://www.amazon.com/gp/product/B0028Y4FWK/ref=oh_aui_search_detailpage?ie=UTF8&psc=1


----------



## InfraRedRabbit

Quote:


> Originally Posted by *StrongForce*
> 
> I'm thinking of going crazy with a Kraken G10 + H55 .. and extra VRM cooling of course. I might need a better case for that too; the Phanteks Enthoo Pro Series was tempting me, seems like the best price/quality case around to me.


Go crazy - just did that with my 290 - temps at max load (Shadow of Mordor) are 61C after 2 hours of play - with a Corsair SP fan (H80i fan - runs up to ~2600rpm) set at 1800rpm. In Borderlands 2 I haven't seen the temps pass 50C yet.

I'm running at 1000/1300 with no VRM sinks on (yet - they haven't arrived) and my VRMs don't go into the 80s.

Further mods include a Noctua PPC 2000rpm fan for the rad (in pull) and a Noctua 92mm Redux fan for the G10.

P.S. I'm running this in an Ncase M1 (small ITX case) so airflow is limited.


----------



## Aaron_Henderson

Quote:


> Originally Posted by *tsm106*
> 
> A proper HDMI-to-DVI adapter will make the HDMI monitor present as a DVI one, and for all intents and purposes it should be no different. Now that doesn't mean your TV will be able to run at the resolution/refresh rate you desire. What TV is it?
> 
> http://www.amazon.com/gp/product/B0028Y4FWK/ref=oh_aui_search_detailpage?ie=UTF8&psc=1


I've tried the adapters on both ends, TV and 290x, no luck. Same exact issue as with standard HDMI connection.

Quote:


> Originally Posted by *DividebyZERO*
> 
> would you mind giving a list of information?
> 
> driver version, os version, brand of screen, type of gpu, how many displays used at one time.
> 
> Can you see what windows detects the monitor/hdtv as? is it Generic PNP?


Driver Version - 14.9
OS Version - Windows 7 Ultimate 64
Screen - http://www.insigniaproducts.com/products/televisions/NS-32L450A11.html
GPU - Asus 290x DirectCU II OC
- tried the TV as a single display, as well as with my 1440x900 monitor
Catalyst recognizes the HDTV the same as my 570 SLI did, as "NS-something-or-other"; I'd have to plug it in to check again.


----------



## rdr09

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> Yeah, I have tried just about every combo of cable/port/adapter possible. I haven't bothered trying DVI to D-sub though, but an analog connection would not be ideal anyway. Also, another thing to note...before, when I tried hooking it up, I got a message about my display not being able to do audio over HDMI. With the new HDMI cable, I no longer get this message, but still no 1080p...


I read the manual for your TV and it says that HDMI 1 will not carry audio, so you prolly hooked your GPU to that. Try the other HDMI ports if you haven't. It could be some fancy setting in your TV like auto-sensing input.


----------



## Aaron_Henderson

Quote:


> Originally Posted by *rdr09*
> 
> I read the manual for your TV and it says that HDMI 1 will not carry audio, so you prolly hooked your GPU to that. Try the other HDMI ports if you haven't. It could be some fancy setting in your TV like auto-sensing input.


I have tried each port several times, and it's odd that the manual states that, since I have been using audio over HDMI on HDMI 1 for a few years now without issue...I am honestly going to try a couple more HDMI cables, maybe some older GPU drivers, but other than that, I think I have wasted enough time on this. It would have been more efficient to just buy a new display. I have also checked for any HDMI settings on my TV and messed with those to no avail...3 different cables now...all ports on the 290x except DisplayPort, several DVI-HDMI adapters, a fresh Windows install, several driver uninstall/re-install attempts...messing with every setting in the Catalyst software even remotely related...frustrated, to say the least. I can get audio over HDMI 1 now actually, just not proper resolutions, and the ones it does set are scaled strangely. GPU upscaling + maintain aspect ratio/fill screen/centered timings settings do absolutely nothing except make the screen flicker black the odd time. Something to do with the drivers, I am sure, since 1080p displays as it should in POST and UEFI screens; it's only when Windows loads that it fails to set the proper resolution. I am almost certain this has something to do with the way the AMD Windows drivers are reading the EDID from my TV.
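For what it's worth, the EDID theory is checkable: CRU displays the raw EDID bytes, and they also sit in the Windows registry under HKLM\SYSTEM\CurrentControlSet\Enum\DISPLAY. A minimal sketch of decoding the first detailed timing descriptor of a base EDID block, to see whether the display actually advertises 1920x1080 - the bytes below are synthetic, not this Insignia's real EDID:

```python
def first_detailed_timing(edid: bytes):
    """Decode (width, height) from the first 18-byte detailed timing
    descriptor, which starts at offset 54 of a 128-byte base EDID block."""
    d = edid[54:72]
    width = ((d[4] >> 4) << 8) | d[2]   # horizontal active pixels
    height = ((d[7] >> 4) << 8) | d[5]  # vertical active lines
    return width, height

# Synthetic descriptor advertising 1920x1080 (for illustration only)
edid = bytearray(128)
edid[56] = 0x80   # h-active low byte   (1920 = 0x780)
edid[58] = 0x70   # h-active upper nibble
edid[59] = 0x38   # v-active low byte   (1080 = 0x438)
edid[61] = 0x40   # v-active upper nibble
print(first_detailed_timing(edid))  # -> (1920, 1080)
```

If the first descriptor in the TV's real EDID doesn't say 1920x1080, the driver is only doing what the display told it to.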


----------



## battleaxe

Quote:


> Originally Posted by *StrongForce*
> 
> I'm thinking of going crazy with a Kraken G10 + H55 .. and extra VRM cooling of course. I might need a better case for that too; the Phanteks Enthoo Pro Series was tempting me, seems like the best price/quality case around to me.


I just bought an Enthoo Pro. Very nice case. Very nice. I will tell you though that the stock cooling fans are not enough to keep two 290's cool plus a CPU. Just no way. You will need to move to higher-pressure fans, maybe five 120mm ones, to get it done. That's what I plan to do. My cards do stay cool enough to run, but I know they are hotter than they could be with some better fans. The fans do the job sorta, just not enough pressure. You can barely feel the air coming out from them. The case itself is awesome though, I love mine. Crazy value too.


----------



## DividebyZERO

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> I have tried each port several times, and it's odd that the manual states that, since I have been using audio over HDMI on HDMI 1 for a few years now without issue...I am honestly going to try a couple more HDMI cables, maybe some older GPU drivers, but other than that, I think I have wasted enough time on this. It would have been more efficient to just buy a new display. I have also checked for any HDMI settings on my TV and messed with those to no avail...3 different cables now...all ports on the 290x except DisplayPort, several DVI-HDMI adapters, a fresh Windows install, several driver uninstall/re-install attempts...messing with every setting in the Catalyst software even remotely related...frustrated, to say the least. I can get audio over HDMI 1 now actually, just not proper resolutions, and the ones it does set are scaled strangely. GPU upscaling + maintain aspect ratio/fill screen/centered timings settings do absolutely nothing except make the screen flicker black the odd time. Something to do with the drivers, I am sure, since 1080p displays as it should in POST and UEFI screens; it's only when Windows loads that it fails to set the proper resolution. I am almost certain this has something to do with the way the AMD Windows drivers are reading the EDID from my TV.


Maybe a suggestion, if you don't mind: download ToastyX's CRU utility; when you extract it, there is a reset-all.exe in the folder. Try running it, and it should remove all the old monitor entries. Then reboot and see if it helps? Only thing I can think to suggest, as I have had to do this myself when I was screwing around with monitor resolutions.


----------



## Aaron_Henderson

Quote:


> Originally Posted by *DividebyZERO*
> 
> Maybe a suggestion, if you don't mind: download ToastyX's CRU utility; when you extract it, there is a reset-all.exe in the folder. Try running it, and it should remove all the old monitor entries. Then reboot and see if it helps? Only thing I can think to suggest, as I have had to do this myself when I was screwing around with monitor resolutions.


Thanks a lot, I'll give it a go and report back.


----------



## Widde

Quote:


> Originally Posted by *StrongForce*
> 
> Ah ok, I see! I saw in a review that the default fan profile had a dB level of nearly 50! Just like my old HD 5870, lol, which was not all that noisy once you start playing and have your headset on, but if I were to increase the default any it might be ..


It's actually not as noisy as its reputation would suggest. I owned a 6970 reference - not that loud either. Really loud compared to aftermarket coolers, but hey ^^ That's what you pay for most of the time.


----------



## BradleyW

Quote:


> Originally Posted by *DividebyZERO*
> 
> Maybe a suggestion, if you don't mind: download ToastyX's CRU utility; when you extract it, there is a reset-all.exe in the folder. Try running it, and it should remove all the old monitor entries. Then reboot and see if it helps? Only thing I can think to suggest, as I have had to do this myself when I was screwing around with monitor resolutions.


I was also going to suggest using CRU. Used it myself. It does work well.


----------



## Buehlar

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> Thanks a lot, I'll give it a go and report back.


Quote:


> Originally Posted by *BradleyW*
> 
> I was also going to suggest using CRU. Used it myself. It does work well.


@Aaron_Henderson
After reading several pages of your frustrations, I was going to suggest trying a rollback and/or updating the monitor drivers from Device Manager (if the option is available), but as has already been suggested, CRU should take care of that.

Did you say that the TV displays 1920x1080 correctly via the on-board HDMI (motherboard port)?


----------



## StrongForce

Quote:


> Originally Posted by *InfraRedRabbit*
> 
> Go crazy - just did that with my 290 - temps at max load (Shadow of Mordor) are 61C after 2 hours of play - with a Corsair SP fan (H80i fan - runs up to ~2600rpm) set at 1800rpm. In Borderlands 2 I haven't seen the temps pass 50C yet.
> 
> I'm running at 1000/1300 with no VRM sinks on (yet - they haven't arrived) and my VRMs don't go into the 80s.
> 
> Further mods include a Noctua PPC 2000rpm fan for the rad (in pull) and a Noctua 92mm Redux fan for the G10.
> 
> P.S. I'm running this in an Ncase M1 (small ITX case) so airflow is limited.


Yeah, the H55 already seems pretty damn good for an R9 290.. so I can imagine with an H80i.

Aren't those Noctua PPC fans super loud, or are you undervolting them?
Quote:


> Originally Posted by *battleaxe*
> 
> I just bought an Enthoo Pro. Very nice case. Very nice. I will tell you though that the stock cooling fans are not enough to keep two 290's cool plus a CPU. Just no way. You will need to move to higher-pressure fans, maybe five 120mm ones, to get it done. That's what I plan to do. My cards do stay cool enough to run, but I know they are hotter than they could be with some better fans. The fans do the job sorta, just not enough pressure. You can barely feel the air coming out from them. The case itself is awesome though, I love mine. Crazy value too.


Yeah, I'd need to; as default it won't be good enough, as there are not even top fans provided. I'd go for 3x 140mm, but as you say they need to be high-pressure ones; the best appear to be the Noctuas.. which cost a freaking 30 bucks each, so that's already 90 bucks added to the case spending. Actually I have one square 140mm Noctua, so that's 2, and one round one also, which was supposed to be on my cooler but I use it for back-of-motherboard cooling. Oh, also there seems to be no air entry for the backside on this case, and I fear I might not be able to keep that high OC if I change case, as right now I have that 14cm fan sucking heat out from the back of the socket....
Quote:


> Originally Posted by *Widde*
> 
> It's actually not as noisy as its reputation would suggest. I owned a 6970 reference - not that loud either. Really loud compared to aftermarket coolers, but hey ^^ That's what you pay for most of the time.


It does get loud, don't kid yourself, lol. When I was playing with the OC on the 5870 it was nuts near 52% fan speed already, if I remember properly.


----------



## ZealotKi11er

Quote:


> Originally Posted by *StrongForce*
> 
> Yeah, the H55 already seems pretty damn good for an R9 290.. so I can imagine with an H80i.
> 
> Aren't those Noctua PPC fans super loud, or are you undervolting them?
> Yeah, I'd need to; as default it won't be good enough, as there are not even top fans provided. I'd go for 3x 140mm, but as you say they need to be high-pressure ones; the best appear to be the Noctuas.. which cost a freaking 30 bucks each, so that's already 90 bucks added to the case spending. Actually I have one square 140mm Noctua, so that's 2, and one round one also, which was supposed to be on my cooler but I use it for back-of-motherboard cooling. Oh, also there seems to be no air entry for the backside on this case, and I fear I might not be able to keep that high OC if I change case, as right now I have that 14cm fan sucking heat out from the back of the socket....
> It does get loud, don't kid yourself, lol. When I was playing with the OC on the 5870 it was nuts near 52% fan speed already, if I remember properly.


I think the only AMD card I thought had a decent stock cooler was the HD 5850. It was way better than the GTX 470. After that, AMD fell hard behind Nvidia.


----------



## bbond007

Quote:


> Originally Posted by *StrongForce*
> 
> Yeah, the H55 already seems pretty damn good for an R9 290.. so I can imagine with an H80i.


I just did mine with a Cooler Master Seidon, and the cool thing with that is you need no brackets or anything, just some M3x20mm bolts. I went with 25mm because I may try to piggyback some sort of bracket or something on there.
The pump block and 290x have the exact same hole pattern.

Nice that they currently have a $20 rebate too, but you'll have to use different mailing addresses.

Temps reach the low 60s so far, but for now I'm underclocked until I get my VRM thing from China.


----------



## maynard14

Hi guys, what's the best non-reference 290 cooler out there? I want to crossfire it with my XFX reference 290x with the NZXT G10.


----------



## Mega Man

best non ref cooler out is a waterblock


----------



## maynard14

Quote:


> Originally Posted by *Mega Man*
> 
> best non ref cooler out is a waterblock


wahah, I mean Sapphire Tri-X vs PowerColor PCS?


----------



## Mega Man

One of these - you should invest in your PC's silent future!


----------



## maynard14

Quote:


> Originally Posted by *Mega Man*
> 
> One of these - you should invest in your PC's silent future!


thanks bro, I just sold my PS3 to get a 2nd 290 so I'm on a kinda tight budget, so I can't afford a waterblock for now. I'm only using a Corsair TX750M and I'm not planning to overclock the two 290s. Also I only have a 1080p monitor; I just want to add another card so that I won't have to upgrade for many years haha, and I think 1080p is fine for me.


----------



## BLOWNCO

guess i can finally join the club

gpu-z link

http://www.techpowerup.com/gpuz/details.php?id=hsfnv

and a screenshot


----------



## kizwan

Quote:


> Originally Posted by *maynard14*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> One of these - you should invest in your PC's silent future!
> 
> 
> 
> 
> 
> thanks bro, I just sold my PS3 to get a 2nd 290 so I'm on a kinda tight budget, so I can't afford a waterblock for now. I'm only using a Corsair TX750M and I'm not planning to overclock the two 290s. Also I only have a 1080p monitor; I just want to add another card so that I won't have to upgrade for many years haha, and I think 1080p is fine for me.

If you're keeping the 4770K at stock, I don't think it's a good idea to get another 290, especially if you're staying at 1080p. It may only give you headaches in the future. And you can't overclock because of your PSU. I think if you want to upgrade, better to upgrade the 1080p monitor to a 1440p one.

edit: typo
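Rough numbers behind the PSU concern (ballpark figures, not measurements): an overclocked reference 290 can pull on the order of 300W, so two of them plus an overclocked 4770K and the rest of the system can brush up against a 750W unit's limit:

```python
# Ballpark power budget for two 290s on a 750W PSU (estimates, not measurements)
psu_watts = 750
gpu_oc_watts = 300        # per overclocked 290, rough worst case
cpu_and_rest_watts = 200  # OC'd 4770K + board, drives, fans

total = 2 * gpu_oc_watts + cpu_and_rest_watts
print(total, psu_watts - total)  # -> 800 -50 (over budget if everything is OC'd)
```

At stock clocks the cards draw meaningfully less, which is why running the pair un-overclocked on a TX750M is plausible while overclocking them is not.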


----------



## Arizonian

Quote:


> Originally Posted by *BLOWNCO*
> 
> guess i can finally join the club
> 
> gpu-z link
> 
> http://www.techpowerup.com/gpuz/details.php?id=hsfnv
> 
> and a screenshot
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## BLOWNCO

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added


thanks


----------



## Klocek001

Quote:


> Originally Posted by *maynard14*
> 
> wahah, I mean Sapphire Tri-X vs PowerColor PCS?


I've got the Tri-X (currently RMA'd, which is sad). I remember it didn't exceed 70 degrees in July (quite hot out there) with the fan at 60% and a 140mm side intake. If your case has a lot of air coming in, you might be able to drop the fan to 50%, and it'll be relatively quiet.


----------



## Klocek001

Quote:


> Originally Posted by *maynard14*
> 
> thanks bro, I just sold my PS3 to get a 2nd 290 so I'm on a kinda tight budget, so I can't afford a waterblock for now. I'm only using a Corsair TX750M and I'm not planning to overclock the two 290s. Also I only have a 1080p monitor; I just want to add another card so that I won't have to upgrade for many years haha, and I think 1080p is fine for me.


Two 290s for 1080p is ridiculous; I get 80-100 fps on mine in every game with just an old 2500K. I'd rather put my money into 6 cores right now.


----------



## maynard14

Quote:


> Originally Posted by *Klocek001*
> 
> Two 290s for 1080p is ridiculous; I get 80-100 fps on mine in every game with just an old 2500K. I'd rather put my money into 6 cores right now.


thanks bro but i already have 4770k...


----------



## Kalistoval

Hey errbody, I'm having a hard time with the 14.9 drivers & 14.9.1 beta. I have an Asus R9 290X DCUII; could someone recommend the best driver for me at the moment? I would surely appreciate it, thanks in advance.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Kalistoval*
> 
> Hey errbody, I'm having a hard time with the 14.9 drivers & 14.9.1 beta. I have an Asus R9 290X DCUII; could someone recommend the best driver for me at the moment? I would surely appreciate it, thanks in advance.


I'm going well on 14.9 WHQL atm; what sort of issues are you having, mate?


----------



## ebhsimon

Quote:


> Originally Posted by *Klocek001*
> 
> Two 290s for 1080p is ridiculous; I get 80-100 fps on mine in every game with just an old 2500K. I'd rather put my money into 6 cores right now.


6 core, as in 2011-3? The CPU might be kind of affordable, but the motherboards are crazy expensive, not to mention the sky high price of DDR4 memory at the moment. I'd rather NOT put my money into 6 cores for a gaming machine right now.


----------



## Kalistoval

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I'm going well on 14.9 WHQL atm; what sort of issues are you having, mate?


So what I have been doing: I cleaned the lint and dust out of my rig, went into the BIOS and set everything to default, re-flashed the BIOS to 2501, and reinstalled my OS, Windows 8.1 Pro. After setup I installed the 14.9 drivers, got a good-to-go message, rebooted, and got a black/blank screen - this is through both DVI and HDMI, and it's never happened before. My card has the newest 290x BIOS; it was working all good before. I reassembled, re-thermal-pasted, and re-seated the heatsink, the works - checked for loose screws, dust, lint, moisture, damaged cables or improper grounding, everything. I narrowed it down to the driver, because even in Windows 7 Ultimate I have the same issue. So I tried the driver from Asus, I believe 13-something, and it worked. One thing to note is that when it did kick into Windows, it reverted to the Windows driver and CCC was not installed.


----------



## Kalistoval

Got it working! How it fixed itself, IDK, it just did.


----------



## Klocek001

Quote:


> Originally Posted by *ebhsimon*
> 
> 6 core, as in 2011-3? The CPU might be kind of affordable, but the motherboards are crazy expensive, not to mention the sky high price of DDR4 memory at the moment. I'd rather NOT put my money into 6 cores for a gaming machine right now.


I'm not talking about X99, but an X79 with my "old" 2666 RAM.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Kalistoval*
> 
> Got it working! How it fixed itself, IDK, it just did.


Well good to see it's working now, keep us updated with how you go


----------



## heroxoot

Hey guys, do any of you know how to force the D3D9 override in a game that rejects RadeonPro? Trying to play Ultra Street Fighter 4, and I enabled Vsync + lock to refresh rate in RadeonPro. It felt weird, so I enabled triple buffering in the driver, but that setting is OpenGL-only and I think the experience is placebo. When I enable triple buffering and/or the D3D9 override, it says it didn't work when the game launches.


----------



## Ramzinho

Quote:


> Originally Posted by *maynard14*
> 
> thanks bro but i already have 4770k...


Now you need to save up for 1440p 120Hz... and then you will see what a difference and beauty that is.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Ramzinho*
> 
> Quote:
> 
> 
> 
> Originally Posted by *maynard14*
> 
> thanks bro but i already have 4770k...
> 
> 
> 
> Now you need to save up for 1440p 120Hz... and then you will see what a difference and beauty that is.

290 Crossfire at 1440p? You'll be needing 120Hz.









Looking forward to seeing what Freesync brings next year


----------



## Ramzinho

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 290 Crossfire at 1440p? You'll be needing 120Hz.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looking forward to seeing what Freesync brings next year


look at what you quoted me saying mate... 1440p 120Hz


----------



## Sgt Bilko

Quote:


> Originally Posted by *Ramzinho*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> 290 Crossfire at 1440p?, you'll be needing 120hz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looking forward to seeing what Freesync brings next year
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> look at what you quoted me saying mate... 1440p 120Hz

Yup, i was agreeing with you


----------



## Wezzor

Are there any good 1440p 120Hz IPS monitors out yet?


----------



## Ramzinho

Quote:


> Originally Posted by *Wezzor*
> 
> Are there any good 1440p 120hz IPS monitors out yet?


Only Korean IPS .. others are all TN panels


----------



## Wezzor

Quote:


> Originally Posted by *Ramzinho*
> 
> Only Korean IPS .. others are all TN panels


Ahh okay. I'd rather skip those Korean monitors.







Do you think it'll take much longer before we start seeing those type of monitors on the market from the more known companies?


----------



## maynard14

haha.
Quote:


> Originally Posted by *Ramzinho*
> 
> Now you need to save up for 1440p 120Hz... and then you will see what a difference and beauty that is.


haha, I'll buy the 290 Tri-X then... after that I'll upgrade my monitor to 1440p, wahoooo, looking forward to my 13th-month pay!

thank you so much guys. I'd like to see my fps jump to 120 fps and beyond, wahooo. btw I have a monitor that's 120Hz, so I hope crossfired 290s will give me a boost in fps


----------



## Ramzinho

Quote:


> Originally Posted by *Wezzor*
> 
> Ahh okay. I'd rather skip those Korean monitors.
> 
> 
> 
> 
> 
> 
> 
> Do you think it'll take much longer before we start seeing those type of monitors on the market from the more known companies?


There is a huge community of Korean monitor owners on OCN.. and I personally own one.. it's gorgeous.. and it's $300, mate.


----------



## Vici0us

Quote:


> Originally Posted by *Ramzinho*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Wezzor*
> 
> Ahh okay. I'd rather skip those Korean monitors.
> 
> 
> 
> 
> 
> 
> 
> Do you think it'll take much longer before we start seeing those type of monitors on the market from the more known companies?
> 
> 
> 
> There is a huge community of Korean monitor owners on OCN.. and I personally own one.. it's gorgeous.. and it's $300, mate.

I got a 27" Acer K272HUL IPS @ 1440p on sale for $300 half a year ago. It's got beautiful colors and no light bleed. They go for $400, but if you keep an eye out, you can grab it on Newegg for $300. It's 60Hz though, but still nice for Crossfire.


----------



## ebhsimon

Quote:


> Originally Posted by *Ramzinho*
> 
> There is a huge community of Korean monitor owners on OCN.. and I personally own one.. it's gorgeous.. and it's $300, mate.


I own one too, and they are amazing! Except an overclock to 120Hz is not guaranteed, much like an overclock to 1200MHz on your 290(X) is not guaranteed.

On another note, 4K IPS panels from QNIX are just around the corner!

http://www.overclock.net/t/1384767/official-the-qnix-x-star-1440p-monitor-club/21090#post_22984632


----------



## leeroy123

Hi guys, I just signed up to see what people thought of my overclock on my XFX Radeon R9 290. Here are the results after testing for days.

https://dl.dropboxusercontent.com/u/82401287/Screenshot%202014-10-11%2014.23.43.png

Coreclock 1150
Memory Clock 1400

I don't know if this is good or bad, but I thought it would be interesting to find out.


----------



## leeroy123

The GPU is on stock cooling. Don't know where edit is, lol. Edit - found it, sorry about that, won't double post again.


----------



## nightfox

I own Korean monitors. I don't know the difference between a branded one or not. So far I have the 3020MDP and it is still working fine.

As far as I know, Korean maker Crossover is preparing a 5K monitor.

It is more like a 16:10 4K monitor and they are calling it 5K. 32 inches, as far as I remember. I will try to find the source again.


----------



## Ramzinho

Quote:


> Originally Posted by *ebhsimon*
> 
> I own one too, and they are amazing! Except an overclock to 120hz is not guaranteed, much like an overclock to 1200mhz on your 290(x) is not guaranteed.
> 
> On another note, 4K IPS panels from QNIX are just around the corner!
> 
> http://www.overclock.net/t/1384767/official-the-qnix-x-star-1440p-monitor-club/21090#post_22984632


Pretty much, but there are rare cases I've seen where people can't get at least 96Hz.. I got mine to 96Hz, which I think is enough.


----------



## ebhsimon

Quote:


> Originally Posted by *Ramzinho*
> 
> Pretty much, but there are rare cases I've seen where people can't get at least 96Hz.. I got mine to 96Hz, which I think is enough.


Push it 'till it hertz. Seriously. I get occasional horizontal lines at 115Hz, so I backed down to 110Hz, which I've had running for just about a year now. And it still works perfectly fine. Simply amazing in games.
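What actually limits these panel overclocks is pixel clock: the single dual-link DVI input has to carry every pixel of every frame. With assumed CVT-reduced-blanking-style totals of roughly 2720x1481 for 2560x1440 (the real totals depend on the timings set in CRU), the clock scales linearly with refresh rate:

```python
# Approximate pixel clock for a 2560x1440 panel at a given refresh rate.
# Totals assume CVT reduced blanking (~2720x1481); exact values depend on
# the timings CRU / the driver actually programs.
H_TOTAL, V_TOTAL = 2720, 1481

def pixel_clock_mhz(refresh_hz):
    """Pixel clock in MHz = total pixels per frame * frames per second."""
    return H_TOTAL * V_TOTAL * refresh_hz / 1e6

for hz in (96, 110, 120):
    print(hz, round(pixel_clock_mhz(hz), 1))  # ~386.7 / 443.1 / 483.4 MHz
```

Dual-link DVI is nominally specced around 330MHz (2x165MHz), so 110-120Hz on these monitors runs the link well past spec, which fits the artifacts showing up first at the highest refresh rates.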


----------



## Sgt Bilko

Quote:


> Originally Posted by *ebhsimon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ramzinho*
> 
> Pretty much, but there are rare cases I've seen where people can't get at least 96Hz.. I got mine to 96Hz, which I think is enough.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Push it 'till it hertz. Seriously. I get occasional horizontal lines at 115hz, so I backed down to 110hz, which I've had running for just about a year now. And it still works perfectly fine. Simply amazing in games.

Mine's running at 110Hz most of the time; I have the same issue higher than that. Great monitor though, very happy with it.


----------



## FrancisJF

Received my 2 290x on Thursday(10/09/14)



Waiting on waterblock now.


----------



## Arizonian

Quote:


> Originally Posted by *FrancisJF*
> 
> Received my 2 290x on Thursday(10/09/14)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Waiting on waterblock now.


Congrats









I'll take your word for it and update the roster to under water.









EDIT: But that doesn't mean you get out of posting pics of the blocks.


----------



## FrancisJF

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll take your word for it and update the roster to under water.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: But that doesn't mean you get out of posting pics of the blocks.


Thanks Arizonian!







I was supposed to receive my waterblocks today, but USPS said that I have to pick them up on Monday....


----------



## Gualichu04

Crossfire 3DMark with the 14.9 drivers, 1150 core / 1300 mem; my last score was with 1160 core / 1400 mem on 14.x 3DMark.


----------



## BranField

Well, I had a bit of an issue getting my Vapor-X under water. The kit was supplied with one screw that was the wrong length (and I didn't notice); it tore the standoff off of the block, and it took me a long time to get the screw out as it kept spinning. Hoping that EK can sort me out a new block so I can get cracking.


----------



## BLOWNCO

Does anyone know if the Vapor-X 290X water blocks will fit the Vapor-X 290 PCB?


----------



## Sgt Bilko

Quote:


> Originally Posted by *BLOWNCO*
> 
> Does anyone know if the Vapor-X 290X water blocks will fit the Vapor-X 290 PCB?


The EK Vapor-X block will fit both the 290 and 290x Vapor-X


----------



## BLOWNCO

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The EK Vapor-X block will fit both the 290 and 290x Vapor-X


That's awesome, thanks for the confirmation. I've got a third 290 on the way, but it won't fit the mobo since the stock coolers are huge. Guess I'll have to wait a bit and put the system under water now.


----------



## Sgt Bilko

Quote:


> Originally Posted by *BLOWNCO*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> The EK Vapor-X block will fit both the 290 and 290x Vapor-X
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> thats awesome thanks for the confirmation i've got a third 290 on the way but it wont fit the mobo since the stock coolers are huge. guess ill have to wait a bit and put the system under water now.

Cooling Configurator hasn't been updated yet, but here's the block announcement: http://www.ekwb.com/news/523/19/EK-introduces-Sapphire-Vapor-X-R9-290X-water-block/


----------



## BLOWNCO

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Cooling Configurator hasn't been updated yet, but here's the block announcement: http://www.ekwb.com/news/523/19/EK-introduces-Sapphire-Vapor-X-R9-290X-water-block/


Well, my cards are the 290 version, not the X version; that's why I was curious whether those 290X blocks would fit.

my cards for reference

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202103


----------



## Sgt Bilko

Quote:


> Originally Posted by *BLOWNCO*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Cooling Configurator hasn't been updated yet, but here's the block announcement: http://www.ekwb.com/news/523/19/EK-introduces-Sapphire-Vapor-X-R9-290X-water-block/
> 
> 
> 
> well my cards are the 290 version not the X version thats why i was curious those 290X blocks would fit
> 
> my cards for reference
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202103

Well, AFAIK there are no PCB changes between the 290 and 290X versions of the cards for any AIB, so you'll be safe.


----------



## BLOWNCO

Quote:


> Originally Posted by *Sgt Bilko*
> 
> well afaik there is no PCB changes between 290 and 290x versions of the cards for any AIB so you'll be safe


Yeah, I kinda figured; I've looked at pictures of both PCBs and they look identical, but I figured someone had done it, so I'd ask first.

Again, thanks for the info.


----------



## VSG

Quote:


> Originally Posted by *Sgt Bilko*
> 
> The EK Vapor-X block will fit both the 290 and 290x Vapor-X


You sure? I read the PCBs were different.

http://www.coolingconfigurator.com/step1_complist?gpu_gpus=1468 also says the same.


----------



## Sgt Bilko

Quote:


> Originally Posted by *geggeg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> The EK Vapor-X block will fit both the 290 and 290x Vapor-X
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You sure? I read the PCBs were different.
> 
> http://www.coolingconfigurator.com/step1_complist?gpu_gpus=1468 also says the same.

You're right, the PCBs are different: some of the caps around the VRMs are in different places, and the 290X has more chokes.

First time I've seen this; thanks for correcting me, man.








Quote:


> Originally Posted by *BLOWNCO*
> 
> yea i kinda figured ive looked at pictures of both pcb's and they look identical but i figured someone had done it so id ask first.
> 
> again thanks for the info


VSG corrected me, looking at them head to head:

Vapor-X 290x:


Vapor-X 290:


----------



## BLOWNCO

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You're right, the PCB's are different, some of the Caps around the vrm's are in different places and the 290x has more chokes.
> 
> first time i've seen this, thanks for correcting me man
> 
> 
> 
> 
> 
> 
> 
> 
> VSG corrected me, looking at them head to head:
> 
> Vapor-X 290x:
> 
> 
> Vapor-X 290:


Well, crap! Guess I'll have to switch to Intel then so I can get a mobo that has enough slots to run three of these cards... shucks, lol.

Thanks to both of you for the help.


----------



## BLOWNCO

Quote:


> Originally Posted by *geggeg*
> 
> You sure? I read the PCBs were different.
> 
> http://www.coolingconfigurator.com/step1_complist?gpu_gpus=1468 also says the same.


Do you post on POTN?

Your avatar looks familiar.


----------



## VSG

Quote:


> Originally Posted by *BLOWNCO*
> 
> do you post on POTN?
> 
> your avatar looks familar


Yeah, POTN, FM and a couple of other forums as well (although I am not very active on the photography forums lately).


----------



## BLOWNCO

Quote:


> Originally Posted by *geggeg*
> 
> Yeah, POTN, FM and a couple of other forums as well (although I am not very active on the photography forums lately).


Haha, yeah, me either; the PC bug has bitten me. But I knew your avatar was familiar.


----------



## jagdtigger

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll take your word for it and update the roster to under water.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: But that doesn't mean you get out of posting pics of the blocks.


Can I ask you to modify mine as well? I just noticed that according to the roster I'm on stock, but my card has a Koolance VID-AR290X water block:
https://dl.dropboxusercontent.com/u/1201829/OCN/20140921_231316.jpg


----------



## amalinkin

Hi everyone.
I have a reference Gigabyte R9 290X; the memory is Elpida, and I've installed an Arctic Hybrid II water cooler.

Does anyone know why I can't overclock at all? Even +10MHz gives me a black screen.

Is it possible to flash my card with a BIOS from another manufacturer? If yes, which BIOS?
I saw that some people have flashed Sapphire cards with a PowerColor BIOS and gained +100MHz.

Thank you.


----------



## mAs81

Quote:


> Originally Posted by *amalinkin*
> 
> Hi everyone.
> Is it possible to flash my card with the bios from other manufacturer? If yes- which bios?
> I saw that some people have flashed sapphire card with powercolor bios, and they got +100MHz productivity.


Check out info for flashing your card here.

Have you upped the voltage?
What OC utility are you using?


----------



## amalinkin

Quote:


> Originally Posted by *mAs81*
> 
> Check out info for flashing your card here .
> 
> Have you upped the voltage ??
> What OC utility are you using?


I hadn't; only after setting +50mV did I get 1070MHz core working in games.
MSI AB 4.0 is my favourite.

Thank you, I'm checking the link.


----------



## StonedAlex

Hey guys, I think I'm going to be buying a used R9 290 soon and I'm trying to decide between the PowerColor PCS+ and the Tri-X. Anyone care to advise? I'm leaning towards the PowerColor right now just because it has a backplate.


----------



## CravinR1

The Tri-X is supposed to be the best cooler.


----------



## BLOWNCO

Quote:


> Originally Posted by *CravinR1*
> 
> The tri-x supposed to be best cooler


Actually, the Vapor-X cooler is the best one.


----------



## Arizonian

Quote:


> Originally Posted by *jagdtigger*
> 
> Can i ask you to modify mine as well? I just noticed that according to the roster im on stock. But my card has a Koolance VID-AR290X water block:
> https://dl.dropboxusercontent.com/u/1201829/OCN/20140921_231316.jpg


Nice. Sorry about that - updated


----------



## TTheuns

Can any of you confirm that EK's Reference R9 290 blocks fit R9 290 Windforce cards?


----------



## FrancisJF

Quote:


> Originally Posted by *TTheuns*
> 
> Can any of you confirm that EK's Reference R9 290 blocks fit R9 290 Windforce cards?


According to Cooling Configurator, it will fit.


----------



## gkolarov

I am a new owner of an R9 290 PCS+ and I'm encountering a strange problem. I have a monitor connected to the card via DVI and a second display (a Sony TV) connected via HDMI. When I'm doing something on the PC, no matter whether it's gaming or just browsing, and I turn on the TV, the screen on the DVI monitor starts to flicker (black screen several times), and if the card is overclocked (no matter whether with AB or Trixx), after the flickering is over the card has reset to its default clocks. Is this normal? Where is the problem? I had an NVIDIA card before this R9 and never encountered such behavior!


----------



## maynard14

Guys, help: my reference 290, unlocked to 290X, suddenly black-screened on me. I've already tried all the usual troubleshooting, like setting the BIOS switch back to the original 290 position, trying a different PCIe slot, and trying different cables, but there's still no display, even in the BIOS or at startup. I'm using an NZXT G10 and an H105, so it's not a temperature issue.

Before the black screen I was playing Shadow of Mordor; then it just went black with no display, and the last time I saw my temps they were normal.

I also checked the GPU core: no cracks, no dents.

Am I screwed? Since my second BIOS is flashed to 290X, do I technically have no warranty?


----------



## MojoW

Quote:


> Originally Posted by *maynard14*
> 
> guys help, my 290 reference unlock to 290x suddleny black screen on me, i already tried doing all troubleshoot, like placing the original bios switch to 290, changing diff pcie line, tried diff cable still no display even in bios or pc start up., im using nzxt g10 and h105 so no temp issue,
> 
> before it black screen im playing shadow of mordor, then i just black screen no display ,,, and the last time i saw my temps they are in normal state,,
> 
> and i check also the gpu core, no crack no dents,,,
> 
> im a screwed coz my 2nd bios is flash to 290x so technically i have no warranty?


Use the integrated GPU on your Haswell and then flash back the original BIOS.
Then put the card back to its original state, cooler-wise.

Edit: You can also check whether the card still responds to flashing in ATIFlash.


----------



## maynard14

Quote:


> Originally Posted by *MojoW*
> 
> Use your internal gpu from your haswell and then flash back the original bios.
> Then put the card back to it's original state cooler wise.
> 
> Edit:You can also check if the card still responds to flashing in atiflash


Thank you, bro... but my mistake is that it's at the store right now.

As for the NZXT G10 that I put on my 290: according to the store I bought it from, it's still under warranty. But I'm so worried about the 290X BIOS that I flashed... I didn't know I could still flash the card even with a black screen.


----------



## ZealotKi11er

Quote:


> Originally Posted by *maynard14*
> 
> thank you bro,.. but my mistake is that its on the store right now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> but as for the nzxt g10 that i have put on my 290 my warranty according to the store i bought is still warrantied
> 
> but im so worried about the 290x bios that i flash,,, i didn't know that i can still flash the card even black screen


So the card does not work anymore?


----------



## jagdtigger

Quote:


> Originally Posted by *Arizonian*
> 
> Nice. Sorry about that - updated


Thanks.


----------



## maynard14

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So the card does not work anymore?


Yeah, the card doesn't work: no screen at all, all black, but my PC still boots with no errors. So I removed the NZXT G10, put the reference cooler back on, and returned it to the shop where I bought it. Now I'm really hoping they won't see the flash on the second BIOS, or else I'm dead, no warranty.

I'd also like to add: he offered to let me add a little cash so he'd replace my reference 290 with a brand-new reference 290X, so I'll just grab that before they find out my BIOS 2 is flashed, haha.


----------



## DividebyZERO

Quote:


> Originally Posted by *maynard14*
> 
> yeah the card doesnt work no screen at all, all black but my pc still boots no error, so i remove the nzxt g10 and put it in the reference cooler, and return to the shop where i bought it, now im really hopiing that they wont see the bios flash on the 2nd bios or else im dead no warranty
> 
> i would like to add also, he offered me to add a little cash so that my 290 ref he will replace it with brand new ref 290x,,, so ill just grab that before they find out that my bios 2 is flash haha


Have you verified your PSU's PCIe power is good? I've had bad rails before that caused no video while the system still ran.


----------



## maynard14

Yes, bro... my PSU is good. I did try a different video card and it's working fine. I'll just take the offer to upgrade to a true 290X rather than lose my warranty if they find out I flashed the BIOS of the 290 to unlock it to a 290X.


----------



## bluedevil

Should I even bother with getting the 14.9.1 beta driver? I have the 14.9s installed now, no issues.


----------



## VSG

Stick with it then; 14.9.1 was just to fix some bugs affecting some users of the 14.9 driver.


----------



## rdr09

took time out from homework.

http://www.3dmark.com/3dm/4354146?

1250 core.


----------



## Agent Smith1984

What's really nice is that CPU score on the old i7!!








This old Thuby needs 4.2GHz to break 9,000 physics in FireStrike


----------



## mojobear

Hey guys,

Well, 3DMark has a new Ultra preset for 4K resolution. Give it a try; there are too many GTX 980s dotting the scoreboard. Get a score up to represent our owners club, haha.


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What's really nice is that CPU score on the old i7!!
> 
> 
> 
> 
> 
> 
> 
> 
> This old Thuby needs 4.2GHz to break 9,000 physics in FireStrike


Got another 290 coming. Not sure if the i7 will handle them.
Quote:


> Originally Posted by *mojobear*
> 
> Hey guys,
> 
> Well 3dmark has a new ultra setting for 4K resolution. Give it a try. There are too many GTX 980s dotting the score board. Got a score to represent our owners club
> 
> 
> 
> 
> 
> 
> 
> haha


lol at tsm.

BTW, Good job mojobear!


----------



## Agent Smith1984

You should be good, man.
Post up some highly overclocked Fire Strikes when you get them rolling. I'm thinking about grabbing a 290 or 290X off eBay for $200 +/-.

I'm curious: what would be better, a reference 290X (I'm guessing lower OC and higher temps), or an aftermarket 290 with a good cooler (Tri-X, etc.)?
The two options are selling for about the same amount used.


----------



## battleaxe

Having two 290s, I would personally go for the 290X. Add an AIO and VRM coolers if you don't like the heat. I picked up an H80 the other day for $38 and added it to my other 290; now I have an H80 on both of them. Temps never exceed 60°C now. Quiet too. If I had it to do over again, I would go 290X. But mining paid for my cards, so I can't really complain, nor do I care that much. It's still twice as much horsepower as I need until I do a monitor upgrade (which is hopefully coming soon).


----------



## devilhead

http://www.3dmark.com/fs/2967990 tested FSU with stock bios and daily cpu overclock


----------



## pkrexer

#22 at the moment for 2x GPUs. This was pushing my system to its limits.

http://www.3dmark.com/fs/2969030

If I had a 5920k or something, maybe I could get close to the top 10.


----------



## tsm106

Quote:


> Originally Posted by *devilhead*
> 
> http://www.3dmark.com/fs/2967990 tested FSU with stock bios and daily cpu overclock


You better watch out; my 4820 with 11K physics will catch you.


----------



## Widde

http://www.3dmark.com/3dm/4357865?

My 3570k brought to its knees in the combined test


----------



## bond32

http://www.3dmark.com/3dm/4358192?

Dang... Poor 4770k...


----------



## Widde

Anyone else had a green screen with the 14.9 drivers? Or is it my overclock that has decided to go bad? Been running +88mV for a while now but temps have been good.


----------



## bond32

Slightly better score: http://www.3dmark.com/3dm/4358480?

Too tired now but that was with the stock bios. Will try with PT1 bios later, might get a little higher.


----------



## chiknnwatrmln

I'm going CrossFire!

The upgrade bug has been nagging at me for the past few weeks, looked around on eBay figuring out whether it'd be cheaper to buy a used GPU and new block or buy them together... I found a guy selling a handful of 290x's with EK blocks (matching the one I have already). I went to buy one for $285 at the last minute but stupid eBay made me change my password and that took up the remaining minute and a half... Regardless, I bought another one of his GPU + block combos for $305 + 30 shipping, not too bad I guess. I just gotta buy a terminal (parallel) and reinforcement bracket and I'm good to go.

Anything in particular I need to know about CF? I'm going to be using a 290 with a 290x, so I figure I can clock up the 290 a bit to match the 290x. I'm gonna run close to stock speeds though, I run my PC off an 865 watt UPS so I can't exceed that figure. That means no overvolting for me...

Regardless, I'm excited and will update back when I get it installed.

One question, do I need to reinstall my drivers for CF? Or am I good to go? I'm still running 14.4 iirc.


----------



## Sgt Bilko

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I'm going CrossFire!
> 
> The upgrade bug has been nagging at me for the past few weeks, looked around on eBay figuring out whether it'd be cheaper to buy a used GPU and new block or buy them together... I found a guy selling a handful of 290x's with EK blocks (matching the one I have already). I went to buy one for $285 at the last minute but stupid eBay made me change my password and that took up the remaining minute and a half... Regardless, I bought another one of his GPU + block combos for $305 + 30 shipping, not too bad I guess. I just gotta buy a terminal (parallel) and reinforcement bracket and I'm good to go.
> 
> Anything in particular I need to know about CF? I'm going to be using a 290 with a 290x, so I figure I can clock up the 290 a bit to match the 290x. I'm gonna run close to stock speeds though, I run my PC off an 865 watt UPS so I can't exceed that figure. That means no overvolting for me...
> 
> Regardless, I'm excited and will update back when I get it installed.
> 
> One question, do I need to reinstall my drivers for CF? Or am I good to go? I'm still running 14.4 iirc.


Just plug and play, man.

You don't even have to clock up the 290 if you don't want to; the 290x will clock itself down to match the 290, IIRC.


----------



## JourneymanMike

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I'm going CrossFire!
> 
> The upgrade bug has been nagging at me for the past few weeks, looked around on eBay figuring out whether it'd be cheaper to buy a used GPU and new block or buy them together... I found a guy selling a handful of 290x's with EK blocks (matching the one I have already). I went to buy one for $285 at the last minute but stupid eBay made me change my password and that took up the remaining minute and a half... Regardless, I bought another one of his GPU + block combos for $305 + 30 shipping, not too bad I guess. I just gotta buy a terminal (parallel) and reinforcement bracket and I'm good to go.
> 
> Anything in particular I need to know about CF? I'm going to be using a 290 with a 290x, so I figure I can clock up the 290 a bit to match the 290x. I'm gonna run close to stock speeds though, I run my PC off an 865 watt UPS so I can't exceed that figure. That means no overvolting for me...
> 
> Regardless, I'm excited and will update back when I get it installed.
> 
> One question, do I need to reinstall my drivers for CF? Or am I good to go? I'm still running 14.4 iirc.


I just added another 290x to my setup and crossfire couldn't be enabled until I reinstalled the drivers...

I'm using 14.9.1 - works fine!


----------



## chiknnwatrmln

Cool, I'm gonna run at stock speeds and see how much power headroom I have first before I try any OC'ing.

I haven't gotten around to updating drivers in a while, I tried but I think they wrecked a program I use called F.lux; I'll have to check out the newest drivers soon though. Can't wait to pump up the res scale on BF4.


----------



## Roy360

Anyone managed to get a portrait setup with these cards?

I'm thinking of switching my landscape setup to portrait, but whenever I enable Eyefinity, the orientation box disappears.


----------



## DividebyZERO

Quote:


> Originally Posted by *Roy360*
> 
> Anyone managed to get a portrait Setup with these cards?
> 
> I'm thinking of switching my landscape setup to portrait, but whenever I enable eyefinity, the oriention box disappears.


you probably need to rotate the screens first before enabling eyefinity


----------



## kizwan

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I'm going CrossFire!
> 
> The upgrade bug has been nagging at me for the past few weeks, looked around on eBay figuring out whether it'd be cheaper to buy a used GPU and new block or buy them together... I found a guy selling a handful of 290x's with EK blocks (matching the one I have already). I went to buy one for $285 at the last minute but stupid eBay made me change my password and that took up the remaining minute and a half... Regardless, I bought another one of his GPU + block combos for $305 + 30 shipping, not too bad I guess. I just gotta buy a terminal (parallel) and reinforcement bracket and I'm good to go.
> 
> Anything in particular I need to know about CF? I'm going to be using a 290 with a 290x, so I figure I can clock up the 290 a bit to match the 290x. I'm gonna run close to stock speeds though, I run my PC off an 865 watt UPS so I can't exceed that figure. That means no overvolting for me...
> 
> Regardless, I'm excited and will update back when I get it installed.
> 
> One question, do I need to reinstall my drivers for CF? Or am I good to go? I'm still running 14.4 iirc.


You don't need to use the reinforcement bracket if you use the terminal to link up your cards. Also I don't think you can use the reinforcement bracket with the terminal. Check that before buying.


----------



## chiknnwatrmln

Yep, I figured that out when looking at the terminals - the bracket gets in the way. Thanks for the heads up though.


----------



## Roy360

Quote:


> Originally Posted by *DividebyZERO*
> 
> you probably need to rotate the screens first before enabling eyefinity


I don't think my screens have any special sensors.

In any case, 14.9 fixed it. However, for some reason portrait is much more demanding than 5760x1080.

E.g. Dark Souls 2 with the widescreen patcher runs much slower.


----------



## DividebyZERO

Quote:


> Originally Posted by *Roy360*
> 
> I don't think my screens have any special sensors.
> 
> In any case, 14.9 fixed it. However, for some reason Portrait is much more demanding than 5760x1080p
> 
> ie. Darks Souls 2 with Wide Screen patcher, run much slower.


I've done benching in both portrait and landscape, and some games like one better than the other.

Edit: I just realized what you said. I meant rotate the monitors to portrait in CCC before enabling Eyefinity, not rotate the monitors physically.


----------



## jagdtigger

Quote:


> Originally Posted by *bond32*
> 
> Slightly better score: http://www.3dmark.com/3dm/4358480?
> 
> Too tired now but that was with the stock bios. Will try with PT1 bios later, might get a little higher.


Nice score







My poor single 290X card is not up to the task:
http://www.3dmark.com/3dm/4360989?


----------



## mAs81

Well here's mine at stock clocks :
Vapor-X 290


----------



## Vici0us

Just wanted to let people know, especially with CrossFire: if 14.9 is giving you problems, try 14.9.1. I had mad problems with 14.9, but with 14.9.1 everything is very smooth. Also, what's a safe voltage for a daily OC on two R9 290s on air?


----------



## gkolarov

1.3V?


----------



## jagdtigger

Okay, finally my OC is stable (GPU: 1230MHz, VRAM: 1500MHz):
http://www.3dmark.com/3dm/4365500?
Ultra:
http://www.3dmark.com/3dm/4365584?

And I have a strange problem. I have three screens: one TV (HDMI) and two monitors (one DVI-D, one DVI-D->HDMI). When I run 3DMark with the card OC'd, the two HDMI-connected displays give me a black screen (for a moment the screen goes black, then the image comes back, and 3DMark isn't even interrupted by it). But if I run it on the DVI-D monitor, the problem is gone...







Does anyone have any idea what the problem is here?


----------



## devilhead

Quote:


> Originally Posted by *tsm106*
> 
> You better watchout, my 4820 with 11K physics will catch you.


Heh, I can do 24,000.

Still waiting for 3000MHz memory, because right now I'm using 2133.
So try to catch this score: http://www.3dmark.com/fs/2977596. It's hard to do anything more because of the 20°C ambient in my room.


----------



## maynard14

Finally got my replacement card from the store where I bought my XFX R9 290; they replaced it with an XFX reference 290X.

ASIC quality is 79.9 percent and it has Elpida memory, but I'm still happy because I'm back to playing Ryse: Son of Rome.


----------



## Vici0us

Quote:


> Originally Posted by *maynard14*
> 
> finally got my replacement card from the store where i bought my xfx r9 290, they replace it with a 290x xfx reference card also
> 
> asic quality is 79.9 percent and it has elpida memory, but still im happy because im back playing ryse son of rome


Congrats!


----------



## tsm106

Quote:


> Originally Posted by *devilhead*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> You better watchout, my 4820 with 11K physics will catch you.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> heh, i can do and 24 000
> 
> 
> 
> 
> 
> 
> 
> still waiting for 3000mhz memory, because now i use 2133.....
> so try to catch this score http://www.3dmark.com/fs/2977596, hard to do anything more. because of ambient 20 of room

Nice run. What clocks did you run? To catch ya, it looks like I'm gonna have to stick that card into my main rig, but I'm too lazy, lol.


----------



## Gabkicks

Anyone have luck with downsampling? I'm having trouble getting 1440p on my VG248QE; I tried using Custom Resolution Utility and keep getting "out of range". Any tips? I'm on the 14.9.1 drivers.


----------



## devilhead

Quote:


> Originally Posted by *tsm106*
> 
> Nice run. What clocks did you run? To catch ya looks like I'm gonna have to stick that card into my main rig, but I'm too lazy lol.


1360/1725, but I need to install Windows 8.1 for the next run; maybe it will help. With good ambient temps I might then be able to push 1380-1390.


----------



## mojobear

Hey guys,

How is everyone testing stability of OC?

I have run Fire Strike Extreme (now Ultra) graphics test 1 on loop and looked for artifacts, but it's kind of hit and miss. For example, sometimes I can go many runs without artifacts; other times it's artifact central... must be related to the flux in VDDC.

Any insights guys?


----------



## Spectre-

Quote:


> Originally Posted by *mojobear*
> 
> Hey guys,
> 
> How is everyone testing stability of OC?
> 
> I have looped firestrike test 1 extreme (now ultra) on loop and look for artifacts but its kinda hit and miss. For example sometimes I can go many runs without artifacts, other times its artifact central...must be related to the flux in VDDC.
> 
> Any insights guys?


I loop Metro LL 20-25 times,

or loop Heaven/Valley 10 times, for 24/7 stability.


----------



## mojobear

Interesting. And what do you look for: artifacts, or just making sure it doesn't crash?
Quote:


> Originally Posted by *Spectre-*
> 
> i loop Metro LL 20-25 times
> 
> or loop Heaven Valley 10 times for 24/7 stability


----------



## FrancisJF

Quote:


> Originally Posted by *FrancisJF*
> 
> Received my 2 290x on Thursday(10/09/14)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> Waiting on waterblock now.


Picked up my waterblock on Monday, and today I put the block on.


----------



## Cyber Locc

Does anyone know of anywhere that sells custom aluminum backplates for reference 290Xs? I don't need anything crazy, just a plain black one that will work with EK blocks. The reason I don't just get an EK backplate is that EK seems to be blind and has put a CrossFire cutout on a card that doesn't use that bridge, and I can't stand it, so I need to find one that doesn't have it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *cyberlocc*
> 
> Does anyone know of anywhere that has custom aluminum back-plates for 290xs reference. Dont need anything crazy just want a plain black one that will work for ek blocks. The reason I dont just get an ek backplate is EK seems to be blind and has put a crossfire cut out on a card that doesnt use that adapter and I can not stand it so I need to find one that doesnt have that.


Lol i just noticed. That is stupid.


----------



## Arizonian

Quote:


> Originally Posted by *maynard14*
> 
> finally got my replacement card from the store where i bought my xfx r9 290, they replace it with a 290x xfx reference card also
> 
> asic quality is 79.9 percent and it has elpida memory, but still im happy because im back playing ryse son of rome
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats on that upgrade - updated


----------



## Cyber Locc

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Lol i just noticed. That is stupid.


Right, it sure is, lol.

Arizonian, I also got an XFX card now as well, if you want to put it down: XFX 290. I will add screenshots once I get everything together.


----------



## Arizonian

Quote:


> Originally Posted by *cyberlocc*
> 
> Right It sure is lol.
> 
> Arizonan I also got a xfx card now as well if you want to put it down
> 
> 
> 
> 
> 
> 
> 
> xfx to 290. I will add screenshots once I get everything together.


Cool, no worries updated


----------



## maynard14

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats on that upgrade - updated


thank you sir


----------



## sinnedone

Quote:


> Originally Posted by *mojobear*
> 
> Interesting and what do you look for? Artifacts or just ensure it doesnt crash


I use 3DMark Vantage and Valley to check overclocks, followed by a couple of hours of gaming in a demanding game.

Look for artifacts. Unless you go absurd with your overclock, it probably won't crash; it might dump the driver, but not crash your PC.
Quote:


> Originally Posted by *cyberlocc*
> 
> Does anyone know of anywhere that has custom aluminum back-plates for 290xs reference. Dont need anything crazy just want a plain black one that will work for ek blocks. The reason I dont just get an ek backplate is EK seems to be blind and has put a crossfire cut out on a card that doesnt use that adapter and I can not stand it so I need to find one that doesnt have that.


I wondered the same and asked in the EK thread. Apparently they were sent an engineering sample before release and it had CrossFire bridges. It takes more work to make that cutout, sooooo... yeah, I don't know.


----------



## Spectre-

Quote:


> Originally Posted by *mojobear*
> 
> Interesting and what do you look for? Artifacts or just ensure it doesnt crash


Really, I just look for FPS drops and stable clocks (no drops) on both core and memory,

and I keep the volts under 1.35V.


----------



## Cyber Locc

Quote:


> Originally Posted by *sinnedone*
> 
> I use 3dmark vantage and valley to check overclocks followed by a couple of hours gaming on a demanding game.
> 
> Look for artifacts. Unless you go absurd with your overclock it probably won't crash, might dump the driver but not crash your pc.
> I wondered the same and asked on the ek thread. Apparently they were sent an engineering sample before release and it had crossfire bridges. It takes more work to make that cutout, sooooo yeah don't know


That's what I'm saying. I mean, yes, I understand if a mistake was made, but are they saying that all the GPU backplates still for sale a year after launch were made before launch? That seems illogical to me. The Lightning backplate and the SE backplate, etc., have no such cutout, but would they work with the reference design? I'm thinking not. Then again, it's just a backplate and the screws seem to be in the same places. Shoot, I'd pay more if they made a new batch without it. I won't see it in my 900D anyway, but just knowing it's there will drive me up the wall.


----------



## ebhsimon

@Arizonian Add me down for another 290! I wonder which is faster: one 290 @ 1100 MHz or two 290s @ 550 MHz.

http://www.techpowerup.com/gpuz/details.php?id=gr5u

Now this PCS 290 looks better than the Vapor-X and runs cooler, but it is also louder since the fan profile is more aggressive. Not sure which one I'll put on top. At idle they sound really similar, though the PCS+ idles at 39°C (while web browsing) and the Vapor-X idles at 41°C.

I had to use my stock power cables for this one, since my sleeved cables are 8-pin & 8-pin while this card is 8-pin & 6-pin. Definitely going to get more cables and sleeve them myself though!


----------



## Cyber Locc

Quote:


> Originally Posted by *ebhsimon*
> 
> @Arizonian Add me down for another 290! I wonder if 1 290 @ 1100mhz is faster or 2 290s @ 550mhz is faster.
> 
> http://www.techpowerup.com/gpuz/details.php?id=gr5u
> 
> Now this PCS 290 looks better than the Vapor X and is cooler (temps), but is also louder since the fan profile is more aggressive. Not sure which one I'll put on top.
> 
> I had to use my stock power cables for this one since my sleeved cables are 8pin & 8pin while this is 8pin & 6pin. Definitely going to get more sleeved cables and sew them myself though!


One 290 at 1100 MHz is faster, as CrossFire degrades scaling. Granted, not by as much as it used to, but it degrades nonetheless, so one card at 1100 MHz will be about 20% faster than two at 550 MHz. Of course, I don't know why you would run two 290s at 550 MHz each, lol.
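That scaling math can be sketched quickly. A minimal back-of-the-envelope model; the 0.65 scaling factor for the second card is an assumed figure chosen to roughly match the ~20% estimate, not a measured value (real CrossFire scaling varies per game and driver):

```python
# Back-of-the-envelope: one fast card vs. two slow cards in CrossFire.
# SCALING is an assumed efficiency for the second card, not a measurement.

SCALING = 0.65  # assumed contribution of each extra card

def effective_clock(clock_mhz: float, n_cards: int) -> float:
    """First card counts fully; each extra card counts at SCALING."""
    return clock_mhz * (1 + (n_cards - 1) * SCALING)

single = effective_clock(1100, 1)  # one 290 @ 1100 MHz -> 1100.0
duo = effective_clock(550, 2)      # two 290s @ 550 MHz -> 907.5

print(f"single is ~{(single / duo - 1) * 100:.0f}% faster")
```

Note that even with perfect 100% scaling, two halved cards would merely tie the single card, which is why running two 290s at 550 MHz each never makes sense.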


----------



## ebhsimon

Quote:


> Originally Posted by *cyberlocc*
> 
> 1 290 at 1100 mHz is faster as crossfire will degrade performance granted not by as much as it use to but it degrades none the less so 1 at 1100 Wil be about 20% faster than 2 at 550mhz of course Idk why you would run 2 290s at 550 each lol


Hah, just wondering. But you're right, with multi-GPU, performance doesn't scale 100%.


----------



## Arizonian

Quote:


> Originally Posted by *ebhsimon*
> 
> @Arizonian Add me down for another 290! I wonder if 1 290 @ 1100mhz is faster or 2 290s @ 550mhz is faster.
> 
> http://www.techpowerup.com/gpuz/details.php?id=gr5u
> 
> Now this PCS 290 looks better than the Vapor X and is cooler (temps), but is also louder since the fan profile is more aggressive. Not sure which one I'll put on top. At idle they sound really similar though while the PCS+ idles at 39 (while web browsing) while the Vapor-X idles at 41 (while web browsing).
> 
> I had to use my stock power cables for this one since my sleeved cables are 8pin & 8pin while this is 8pin & 6pin. Definitely going to get more sleeved cables and sew them myself though!


Congrats - updated


----------



## Unknownm

2x XFX 290 @ 947 MHz. Hopefully they will unlock.









Sorry for the messy wires; cable management will come tomorrow.


----------



## Spectre-

I love PT.1 bios

http://www.3dmark.com/3dm11/8835464

god damn


----------



## Unknownm

This means one card can unlock and the other can't?

Could you crossfire 290 & 290x together?


----------



## sinnedone

Quote:


> Originally Posted by *Spectre-*
> 
> I love PT.1 bios
> 
> http://www.3dmark.com/3dm11/8835464
> 
> god damn


What's so special about that BIOS? Will it work on 290s?


----------



## battleaxe

Quote:


> Originally Posted by *ebhsimon*
> 
> @Arizonian Add me down for another 290! I wonder if 1 290 @ 1100mhz is faster or 2 290s @ 550mhz is faster.
> 
> http://www.techpowerup.com/gpuz/details.php?id=gr5u
> 
> Now this PCS 290 looks better than the Vapor X and is cooler (temps), but is also louder since the fan profile is more aggressive. Not sure which one I'll put on top. At idle they sound really similar though while the PCS+ idles at 39 (while web browsing) while the Vapor-X idles at 41 (while web browsing).
> 
> I had to use my stock power cables for this one since my sleeved cables are 8pin & 8pin while this is 8pin & 6pin. Definitely going to get more sleeved cables and sew them myself though!


Those are both some sweet looking cards! Love that PCS


----------



## LandonAaron

Hello I am switching over to a custom water cooling loop for my CPU and R9 290x, and plan on using a universal GPU water block on the R9 290x. I already have Gelid's VRM heatsink and some nice copper heatsinks from Enzotech for all the Vram, I just need some sort of fan to assist in actively cooling these heatsinks. I thought something like that would be easy to find, but all of the PCI slot fans I have been able to find look really cheap.

The best I have found so far is this:

And this: 

I prefer the second one as I could get two of these and combine them together for a 3 fan configuration and the red fans would match my color scheme. Also they use 90mm fans vs. 80mm fans, and if the fans suck I could replace them, as they are just standard 90mm fans, so the brackets should work on anything. The big negative on this one is that it ships from China and won't arrive till probably the beginning of December (I am still waiting on some sleeved PSU cables I ordered nearly a month ago from China).

Does anyone have some other recommendations?


----------



## bond32

Quote:


> Originally Posted by *sinnedone*
> 
> Whats so special about that Bios? Will it work on 290's?


Yes. The PT1/PT3 BIOSes are benching BIOSes, and honestly I don't know where they came from. If you use them, be careful, especially on air: the cards will NOT downclock. They stay at full 3D clocks at all times (1000/1250) and the TDP limit is removed.

Seriously, be careful though; I have pulled almost 400 watts per card with the PT1 BIOS. PT3 is identical except there's no vdroop. I recommend doing a lot of research before jumping in, as you can fry your card with it.


----------



## tsm106

They came from kingpin by way of shammy.


----------



## nightfox

Quote:


> Originally Posted by *Unknownm*
> 
> This means one card can unlock and the other can't?
> 
> Could you crossfire 290 & 290x together?


Yes, you can CrossFire a 290 and a 290X. I am running 2x 290 and 2x 290X.







Performance? To be honest, I don't know. Too lazy to find out.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Unknownm*
> 
> This means one card can unlock and the other can't?
> 
> Could you crossfire 290 & 290x together?


That's what I have right now. To be honest, it bugs me having a 290X + 290; I am all for equality. Performance of the 290X vs. the 290 is ~7% clock for clock, so 290X + 290 will give you like 3% more than 290 + 290.


----------



## sugarhell

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Thats what i have right now. TO be honest it bugs me having 290X + 290. I am all for equality. Performance 290X vs 290 ~ 7% Clock per Clock. 290X + 290 will give you like 3% more then 290 + 290.


No, it will give you 290X performance and 290 performance.


----------



## pkrexer

I used to have a 290x + 290 without any problems. My OCD made me sell the 290 and pick up a 290x for an extra $50 ugh lol


----------



## Agent Smith1984

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Thats what i have right now. TO be honest it bugs me having 290X + 290. I am all for equality. Performance 290X vs 290 ~ 7% Clock per Clock. 290X + 290 will give you like 3% more then 290 + 290.


You have my deepest sympathies









I know what you mean though. I seriously thought about grabbing a cheap used 7970 to CrossFire, but I know it would bug me on the inside a little....
The way VRAM demands have gone up lately, I am thinking a used 290X (selling for roughly $225-250 on eBay right now) is the way to go.... I can add another later.


----------



## Jorginto

Guys, max voltage on the Gigabyte GV-R929XOC-4GD rev. 1.0 is 1.3 V. I can buy a used one, but I keep wondering.


----------



## Ziox

Wanted to join the club









290X, 1225/1625 stable

39°C idle, 55°C load over 3 hrs of Heaven Benchmark


----------



## Nopileus

If you don't mind me asking, what core voltage do you get under load with +200mv?


----------



## Ziox

Quote:


> Originally Posted by *Nopileus*
> 
> If you don't mind me asking, what core voltage do you get under load with +200mv?


The highest I've seen it go is 1.367 V at max load, but it's usually around 1.325 V. I could use +150 mV to get 1205/1625 stable, but I think I'm safe at +200 mV.
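The gap between the requested offset and what shows up under load comes down to vdroop; a hypothetical sketch, where the base voltage, load-line resistance, and currents are all assumed illustration figures, not readings from this card:

```python
# Why +200 mV doesn't show up as +200 mV under load: load-line droop
# pulls the rail down as current rises. All figures are assumptions.

BASE_VCORE = 1.200   # assumed stock voltage, volts
OFFSET = 0.200       # the +200 mV set in TriXX/Afterburner
LOADLINE = 0.0003    # assumed effective load-line resistance, ohms

def vcore_under_load(current_a: float) -> float:
    """Requested voltage minus resistive droop at the given current."""
    return BASE_VCORE + OFFSET - current_a * LOADLINE

print(round(vcore_under_load(100), 3))  # lighter load -> ~1.37 V
print(round(vcore_under_load(250), 3))  # heavy load  -> ~1.325 V
```

Under these assumed numbers, the heavier the load, the further the measured vcore sits below the requested 1.4 V, which is consistent with seeing ~1.325 V typically and higher readings only in lighter-load moments.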


----------



## GoldenboyXD

Hi guys, I have a question for R9 290/290X users: have you experienced this after updating drivers? My R9 290 Tri-X defaults to 1000/1300 and has always run stock, but after updating from 14.4 to 14.9 (and again from 14.9 to 14.9.1), I noticed my clocks in TRIXX now show as 500/600. I don't have any problem running lower clocks, since I only play Dota 2 at the moment; it seems stable with no issues so far, and less heat, which is great!









Is this normal? I can always reset back to default speeds if I need to, but I don't need a speed bump at the moment since I only play less graphically intensive games and surf the internet.


----------



## ZealotKi11er

Quote:


> Originally Posted by *GoldenboyXD*
> 
> Hi guys, I have a question for the R9 290/290X users if you experienced this after updating the drivers... My R9 290 Tri-X default speed runs at 1000/1300 and always on stock and after updating from 14.4 to 14.9.. same goes with 14.9 to 14.9.1. I've noticed my clocks changed in TRIXX now showing as 500/600... I don't have any problems running lower clocks since i only play Dota2 at the moment and seems stable and no issues so far and less heat which is great!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is this normal? I can always reset it back to default speeds if i need it to but I don't need a speed bump at the moment since I only play less graphic intensive games and internet surfing.


Because Dota 2 is not that demanding, the card will downclock itself. For me it usually runs at ~700-800 MHz.


----------



## GoldenboyXD

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Because Dota 2 is not that demanding the card will down clock itself. For me its usually runs ~ 700-800MHz.


I understand, but what I encountered was that my clocks were lowered right after the driver update, while I was not playing any games. Only when I checked the TRIXX software did I see, to my surprise, that it was 500/600.


----------



## Spectre-

Quote:


> Originally Posted by *bond32*
> 
> Yes. The PT1/PT3 bios are benching bios' and honestly I don't know where they came from. If you use them be careful, especially on air as the cards will NOT downclock - they stay in the full 3d clocks at all times (1000/1250) and the tdp limit is removed
> 
> Seriously be careful though, I have pulled almost 400 watts per card with PT1 bios. PT3 is identical except there's no vdrop. I recommend doing a lot of research before jumping into it as you can fry your card with it.


I pulled 1.5 volts with nearly 430 watts.


----------



## sinnedone

Quote:


> Originally Posted by *bond32*
> 
> Yes. The PT1/PT3 bios are benching bios' and honestly I don't know where they came from. If you use them be careful, especially on air as the cards will NOT downclock - they stay in the full 3d clocks at all times (1000/1250) and the tdp limit is removed
> 
> Seriously be careful though, I have pulled almost 400 watts per card with PT1 bios. PT3 is identical except there's no vdrop. I recommend doing a lot of research before jumping into it as you can fry your card with it.


Thank you. I'll look into it further before I do anything risky.


----------



## bond32

Quote:


> Originally Posted by *Spectre-*
> 
> i pulled 1.5 volts with nearly 430 watts


This... Awesome...

Now that I have a second 1300 G2, I can try a few different things. Testing the PT3 bios now...


----------



## Spectre-

Quote:


> Originally Posted by *bond32*
> 
> This... Awesome...
> 
> Now that I have a second 1300 G2, I can try a few different things. Testing the PT3 bios now...


Let me know how it goes.

Here are a bunch of my benches:

FS Extreme: http://www.3dmark.com/3dm/4374626?
FS: http://www.3dmark.com/3dm/4374580?

I'm only going to compare graphics score since I have a hexa-core.


----------



## bond32

Quote:


> Originally Posted by *Spectre-*
> 
> let me know how it goes
> 
> heres a bunch of my benches
> 
> fs exteme-http://www.3dmark.com/3dm/4374626?
> fs- http://www.3dmark.com/3dm/4374580?
> 
> i am only gonna compare gfx score since i have a hexy


That's a sick FS score... Not sure, but I think I barely hit 13k with my best single-card run a while back...


----------



## Spectre-

Quote:


> Originally Posted by *bond32*
> 
> That's a sick FS score... Not sure, think I barely hit 13k with my best run a while back in single card...


Make sure you disable tessellation and enable GPU upscaling.


----------



## Devildog83

Just wondering why 3DMark Fire Strike loads and works fine for me, but 3DMark 11 doesn't load and I have to restart.


----------



## BradleyW

How is Borderlands 3 running on CFX for you guys?


----------



## Unknownm

I flashed both my cards with the PT1T and PT1 BIOSes and both failed. Even the BIOS switch didn't work, and I forgot to back up the original BIOS. Restoring with any other XFX BIOS would fail too...

Every time drivers installed I would get a BSOD on either card. One card reported 12 TB of RAM with Micron memory, and since they were only a day old I returned them. The tech tested them out and said both cards are defective, and he gave me two open-box cards as an exchange.

Seems like both cards work; going to run the Valley benchmark to make sure.


----------



## Unknownm

Good starter score? No GPU overclock, fresh drivers installed


----------



## Widde

Is there any big difference in achievable clocks on the PT1 BIOS compared to the stock one? My cards are utterly useless, doing 1125/1350 at +100 mV benching. Just wondering. Temps outside are below zero now, so let the benching commence.


----------



## Cyber Locc

Quote:


> Originally Posted by *Unknownm*
> 
> Good starter score? No GPU overclock, fresh drivers installed


Umm, wow, that's as high as I get with two cards. What is going on here? Do you have two cards in there? Why does it say 1?









----------



## 1EvilMan

Quote:


> Originally Posted by *BradleyW*
> 
> How is Borderlands 3 running on CFX for you guy's?


I would be surprised if there is even a CrossFire profile for it. Borderlands 1 and 2 wouldn't engage my second card.
My single 290X was running it fine before I sent it back for a different issue. If you somehow have PhysX turned on, the game will bog down in places.


----------



## Takla

Quote:


> Originally Posted by *mojobear*
> 
> Hey guys,
> 
> How is everyone testing stability of OC?
> 
> I have looped firestrike test 1 extreme (now ultra) on loop and look for artifacts but its kinda hit and miss. For example sometimes I can go many runs without artifacts, other times its artifact central...must be related to the flux in VDDC.
> 
> Any insights guys?


First I run Unigine Valley four times, then Crysis 3 chapter two with everything on Very High + 8x MSAA. If it's still stable by this point, I run the Battlefield 4 Test Range with no AA but 200% resolution scale: just shoot the tower with a helicopter and then blow up the tanks a few times.

The ultimate stability test; it has never failed me before.


----------



## Vici0us

Quote:


> Originally Posted by *1EvilMan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BradleyW*
> 
> How is Borderlands 3 running on CFX for you guy's?
> 
> 
> 
> I would be surprised if there is even a crossfire profile for it. Borderlands1 and 2 wouldn't start my second card.
> My single 290X was running it fine before I sent it back for a different issue. If you somehow have physx turned on, the game will bog in places.

How's Borderlands 3 in general? I really enjoyed the first two.


----------



## Vici0us

Quote:


> Originally Posted by *Takla*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mojobear*
> 
> Hey guys,
> 
> How is everyone testing stability of OC?
> 
> I have looped firestrike test 1 extreme (now ultra) on loop and look for artifacts but its kinda hit and miss. For example sometimes I can go many runs without artifacts, other times its artifact central...must be related to the flux in VDDC.
> 
> Any insights guys?
> 
> 
> 
> first i run 4 Unigine Valley, then crysis 3 second chapter everything on very high + 8x msaa, if its still stable by this point i run battlefield 4 test range without aa but 200% resolution scale. just shoot the tower with a helicopter and then blow up the tanks a few times.
> 
> ultimate stability test never failed me before.

Use Unigine Heaven - Extreme


----------



## joeh4384

Hello, I am just curious what the temp of everyone's top card is in a CrossFire setup with non-reference cards. I am trying to see if there is a significant difference in having more than one slot between cards. Right now I have a 295X2 and an MSI Twin Frozr 290X, and when I run the 290X on the bottom, the VRMs overheat on the 295X2, causing the clocks to drop to 300 MHz. The GPU temps are OK in this setup: 295X2 in the low 70s and 290X in the high 70s. When I run the 290X on top, the 290X reaches 95 degrees. Any suggestions? Right now I am running a Z87 ASUS Hero board in a Fractal Arc Midi R2. I have two 140 mm Cougar intakes in the front, one attached to a Corsair H90 in push/pull, two on top exhausting along with the 295X2 rad, and Corsair SP fans in the back.


----------



## Mydog

Hello, I just got an MSI R9 290X Twin Frozr from my store as a replacement for my old HD 5970, and I want to do some OC'ing.
I've got a water block for it, so cooling is good.
Can anyone here tell me if it's possible to unlock vcore and flash a custom BIOS on this card?
I'm familiar with flashing AMD cards, but as this is not a reference PCB, I'm not sure if it's possible.

Any help or pointers to the right threads/posts would be much appreciated.


----------



## Red1776

Quote:


> Originally Posted by *Mydog*
> 
> Hello I just got an MSI R9 290X Twin Frozr from my store as a replacement for my old HD5970 and I want to do some OC'ing.
> Got a water block for it so cooling is good.
> Can anyone here tell me if it's possible to unlock vcore, flash custom bios to this card?
> I'm familiar with flashing AMD but as this is not a ref pcb I'm not sure if it's possible.
> 
> Any help or pointers to the right threads/posts would be much appreciated


First of all, are you sure you have the correct waterblock? The Twin Frozrs used a larger (taller) cap around the VRM area and require a rev 2.0 waterblock, for example in the case of EK blocks.

Secondly, if the 290X you bought is new: MSI departed from the reference layout on some of their 290Xs, so neither of the waterblocks will work (rev 1.0 or 2.0), because of the chokes (the gold ones) around the VRM area.





See the lower image with the gold chokes? They are not reference and not compatible with either version of the waterblocks; it is a non-reference layout design.

The next pics are my 290X Twin Frozr cards with the larger caps that accept rev 2.0 (in EK).





What model and serial number MSI 4GB Twin Frozr card do you have?

If you have not already, it is worth removing the factory cooling and doing a visual inspection to make sure you do not have a gold SSC choke version, to confirm a reference layout, and to determine which waterblock revision you need.







I had to re-order the waterblocks for my 290Xs as rev 2.0 EKWB blocks, so a bit of removal and verification can save you a lot of problems.

The card comes with two BIOSes built in and is voltage-unlocked on BIOS 1, so I am not sure what it is you are trying to do, but it is flashable if you wish.

Hope this helps

Greg


----------



## Mydog

@Red1776

Thanks for all that good info. I've got the rev 2.0 EK block but have not removed the stock cooler yet; I will check tonight. I didn't know that BIOS 1 was voltage-unlocked, so that was useful info too. I assume it's only unlocked for vcore and not vmem?

I have the MSI Radeon R9 290X Gaming 4GB GDDR5 PCI-Express 3.0, "Twin Frozr IV", DL-DVI-D + DL-DVI-D, HDMI, DP, 1040 MHz.
I don't have the serial number here.


----------



## Red1776

Quote:


> Originally Posted by *Mydog*
> 
> @Red1776
> 
> Thanks for all that good info. I've got the Rev 2.0 EK block but have not removed the stock cooler yet, will check tonight. I didn't know that the bios 1 was voltage unlocked so that was useful info too, I assume that it's only unlocked for vcore and not vmem?


Using Afterburner I get control of memory voltage as well.

BTW, this is how the layout looks in the EK compatibility chart (it's a better view than the images of mine disassembled).



Here is the link to the EK cooling configurator

http://www.coolingconfigurator.com/waterblock/3831109869024


----------



## Mydog

Quote:


> Originally Posted by *Red1776*
> 
> Using afterburner I get control for mem voltage as well.
> 
> BTW, this is how the layout looks from the EK comp chart (its a better view)


Thanks again








I saw that in the EK configurator, but as you say the PCB might not be the reference design. Have you got any info on which serial numbers are non-reference and will not fit even the rev 2.0?


----------



## joeh4384

I thought I read somewhere that the power connections were flipped (PCIe clip on top instead of next to the cooler) on the non-compatible MSI 290Xs. That could be wrong though.


----------



## Red1776

Quote:


> Originally Posted by *Mydog*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Using afterburner I get control for mem voltage as well.
> 
> BTW, this is how the layout looks from the EK comp chart (its a better view)
> 
> 
> 
> 
> Thanks again
> 
> 
> 
> 
> 
> 
> 
> 
> I saw that in the EK configurator but as you say the PCB might not be to ref design, have you got any info on what serial number that are not ref and will not fit even the rev 2.0?

The above link has several MSI 4GB 290Xs with the cooler removed, some with the Twin Frozr and some with the blower type. They have slight variations but will still work.

I ended up calling the MSI tech department and reading the model/serial numbers to them, and they looked each one up to verify.

You may want to do the same. GPU companies have a habit of changing the design without notice, so it is worth the time to save yourself the headache. The gold SSC chokes are a dead giveaway though.


----------



## Mydog

Quote:


> Originally Posted by *joeh4384*
> 
> I thought I read somewhere that the power connections were flipped (pcie clip on top instead of next to cooler) on the non compatible one on the MSI 290x's. That could be wrong though.


Mine are flipped, but nothing to worry about, right?

@Red1776

I can't really see whether the power connections are flipped in your pics of the cards with the gold SSC chokes.


----------



## joeh4384

Are the clips at the bottom, toward the heatsink, or at the top of the card?


----------



## Mydog

Quote:


> Originally Posted by *joeh4384*
> 
> Are the clips at the bottom towards the heatsink or top of card?


The clips are at the top of the card, not oriented the way I'm used to on the 7970, 780 Ti or 980.


----------



## Red1776

Quote:


> Originally Posted by *Mydog*
> 
> Quote:
> 
> 
> 
> Originally Posted by *joeh4384*
> 
> Are the clips at the bottom towards the heatsink or top of card?
> 
> 
> 
> Clips are at the top of the card, not oriented the way I'm used to on 7970, 780 Ti or 980

It's the SSC chokes and non-reference layout that cause the incompatibility, though.

Pull the cooler off your card and send me the PCB layout image.


----------



## Mydog

Quote:


> Originally Posted by *Red1776*
> 
> Its the SSC chokes and non reference layout that cause the non comp though.
> Pull the cooler off your card and send me the PCB layout image.


Thanks I'll do that









@Red1776

On visual inspection, even without taking off the cooler, I can confirm that my card has the golden SSC chokes.









I'm now on BIOS 1, but memory voltage is locked and core voltage only goes to +100 mV. Is there a special version of Afterburner that I need to use?


----------



## Red1776

Quote:


> Originally Posted by *Mydog*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Its the SSC chokes and non reference layout that cause the non comp though.
> Pull the cooler off your card and send me the PCB layout image.
> 
> 
> 
> Thanks I'll do that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @Red1776
> 
> I can on visual inspection even without taking of the cooler confirm that my card has the golden SSC chokes
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm now on bios 1 but memory voltage are locked and core voltage only goes to +100mv, are there a special version of AB that I need to use?

Did you check 'unofficial overclocking' to unlock the increased voltage range?


----------



## Mydog

Quote:


> Originally Posted by *Red1776*
> 
> Did you check the 'unofficial Oc'ing' to unlock the increased voltage add?


The one in the first post?
Checking


----------



## sugarhell

If you want to bench, use the PT1 BIOS + GPU Tweak.


----------



## cplifj

And how should one go about RMA'ing a faulty card that only black-screens after prolonged (several hours of) gaming heat?

Indeed, we can't. Shops tell us to bring in the complete PC, and then they find no problems...

I am stuck with this ASUS 290X reference crap forever. For once I was able to buy such a "killer" top-line card, only to find what many have found.

Slap some other cooling on it? Yeah, if AMD sends me one free of cost to make up for their engineering fail.

I think people are entitled to know this bad side of the story too.

Great service; well, they got my money, so now they are off and running...

I'll remember that.


----------



## sugarhell

Quote:


> Originally Posted by *cplifj*
> 
> and how should one go about RMA'ing a faulty card that only black screens after prolonged (several hours) of gaming heat ??
> 
> indeed we can't . shops tell us to bring in the complete pc and then they find no problems......
> 
> i am stuck with this asus 290X reference crap for ever . For once being able to buy such a "killer" topline card only to find what many have found.
> 
> slap some other cooling on it ? yeah if amd sends me one free of cost to make up for their engineering fail.
> 
> i think people are entitled to know this bad side of the story too.
> 
> great service, well they got my money so now they are a running.....
> 
> i'll remember that.


Don't buy an ASUS GPU again...


----------



## bluedevil

Just figured out my "power savings" compared to a 970: about $0.90 a month... might not be worth going to a 970 for power savings.
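For anyone wanting to redo that math with their own numbers, a minimal sketch; the 100 W delta, hours per day, and $0.12/kWh rate below are assumed illustration figures, not the actual inputs behind the $0.90 estimate:

```python
# Sketch of the "power savings" math: extra electricity cost per month
# of one card drawing delta_watts more than another while gaming.

def monthly_savings(delta_watts: float, hours_per_day: float, usd_per_kwh: float) -> float:
    """Cost difference over a 30-day month, in dollars."""
    kwh = delta_watts / 1000 * hours_per_day * 30
    return kwh * usd_per_kwh

# e.g. a 290 drawing ~100 W more than a 970, 2.5 h/day, $0.12/kWh:
print(f"${monthly_savings(100, 2.5, 0.12):.2f}")  # $0.90
```

Even doubling the gaming hours or the electricity rate keeps the figure in pocket-change territory, which is the point being made above.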


----------



## Mega Man

Quote:


> Originally Posted by *bluedevil*
> 
> Just figured out my "power savings" compared to a 970. About $.90 a month....might not be worth going to a 970 for power savings.


Yeah, people always try to play the power-saving game with PC components. There is not a large amount to be saved.
Quote:


> Originally Posted by *sugarhell*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cplifj*
> 
> and how should one go about RMA'ing a faulty card that only black screens after prolonged (several hours) of gaming heat ??
> 
> indeed we can't . shops tell us to bring in the complete pc and then they find no problems......
> 
> i am stuck with this asus 290X reference crap for ever . For once being able to buy such a "killer" topline card only to find what many have found.
> 
> slap some other cooling on it ? yeah if amd sends me one free of cost to make up for their engineering fail.
> 
> i think people are entitled to know this bad side of the story too.
> 
> great service, well they got my money so now they are a running.....
> 
> i'll remember that.
> 
> 
> 
> Dont buy asus gpu again...

This is AMD's fault how?


----------



## joeh4384

Think I posted this one too early today and it got buried. Any advice?

Hello, I am just curious what the temp of everyone's top card is in a CrossFire setup with non-reference cards. I am trying to see if there is a significant difference in having more than one slot between cards. Right now I have a 295X2 and an MSI Twin Frozr 290X, and when I run the 290X on the bottom, the VRMs overheat on the 295X2, causing the clocks to drop to 300 MHz. The GPU temps are OK in this setup: 295X2 in the low 70s and 290X in the high 70s. When I run the 290X on top, the 290X reaches 95 degrees. Any suggestions? Right now I am running a Z87 ASUS Hero board in a Fractal Arc Midi R2. I have two 140 mm Cougar intakes in the front, one attached to a Corsair H90 in push/pull, two on top exhausting along with the 295X2 rad, and Corsair SP fans in the back.


----------



## Agent Smith1984

Power savings???
Is anybody even considering going to a 970 for power savings?

How about the butt-whooping that a $330 card hands it when overclocked (and this is with BOTH cards being overclocked...)?

I'm an AMD fan for sure, but I appreciate value. I know AMD is launching cards in early '15, and those will probably be great, but right now the 970 is the leader in value, and it even competes closely with the 980 when overclocked!!!
Not to mention, I'm not reading about black screens, artifacts, BSODs, overheating, and bogus driver issues all over the place. These cards have been sold out everywhere for a month now, so if these were issues, we'd know about 'em.

Big props to NVIDIA this go-round.... I'm by no means a disappointed AMD owner (despite the issues that have plagued the 280X cards). The value was there for these cards, and at the $175-200 price tag they will be down to soon, the value is still somewhat there, but I feel like the 290/290X are in a bad segment of the market right now....

I'd expect to see $240 290s and $290 290Xs very soon..... and even then, the 970 for $40 more is a better value.


----------



## sugarhell

And the 7870 OC'd can compete with a stock 7950, and a 7970 OC'd can compete with a stock 780, etc., etc.

It looks like people get overhyped for the same performance as a 290X or 780 Ti. I don't know why.

Maybe it's the frequency thing: 1300 MHz stock looks good compared to 1000 MHz stock, or an 870 MHz stock Titan, lol.

I don't say the 970 is bad; it's really good. But it's just that: nothing new on the performance front.


----------



## Mydog

Quote:


> Originally Posted by *sugarhell*
> 
> If you want to bench use pt1 bios + gpu tweak


Links, please?

Disabling ULPS is a must, right, even with a single GPU?

Sorry for all the questions; trying to catch up fast.


----------



## sugarhell

Quote:


> Originally Posted by *Mydog*
> 
> Links please?
> 
> Disabling ULPS is a must right, even with single GPU?
> 
> Sorry for all the questions, trying to catch up fast


ULPS only applies with CrossFire. ZeroCore is what you don't want: just set the Control Panel power plan to High Performance and disable Link State Power Management in the advanced settings.

You can easily find the PT1 BIOS, I think.


----------



## battleaxe

Quote:


> Originally Posted by *bluedevil*
> 
> Just figured out my "power savings" compared to a 970. About $.90 a month....might not be worth going to a 970 for power savings.


Which is why I'll be keeping both my 290s.

Are they as good as the 970s? No.... but c'mon.... not that much different either, let's all be honest.


----------



## battleaxe

Quote:


> Originally Posted by *battleaxe*
> 
> Which is why I'll be keeping both my 290's.
> 
> Are they as good as the 970's? No.... but c'mon.... not that much different either, lets all be honest.


I need a new monitor. I am now running only one 290 at 750 MHz on the core and still maxing out the frames my 60 Hz display can show. This is pathetic.

Edit: sorry, I meant to edit my last post, not double-post.


----------



## bond32

When you hear people here say "power savings" it doesn't refer to the fractions of cents you save on your power bill, but rather the fact that you won't need 2 kW worth of power supplies to run multi-GPU setups...

Two 970s, even OC'd, probably draw 60% of what two 290s draw (I'm guessing)...


----------



## DampMonkey

Quote:


> Originally Posted by *bond32*
> 
> When you hear people here say "power savings" it doesn't refer to the fractions of cents you save on your power bill, but rather the fact that you won't need 2 kW worth of power supplies to run multi-GPU setups...
> 
> 2 970s, even OC'ed, probably draw 60% of what 2 290s draw (I'm guessing)...


60% sounds about right for stock clocks against a 290X. I know people who have sold their GK110s and Hawaiis for 970s, citing the lower power usage. But they already had power supplies, so it didn't make any sense.


----------



## bond32

Quote:


> Originally Posted by *DampMonkey*
> 
> 60% sounds about right for stock clocks against a 290X. I know people who have sold their GK110s and Hawaiis for 970s, citing the lower power usage. But they already had power supplies, so it didn't make any sense.


Consider a case like I was in - 3 290Xs with a 1300 watt G2; the three cards topped out the G2 at anything over 1.4 V.... Hence I needed a second PSU or a bigger one, neither of which is cheap. Whereas with 3 970s, I doubt they'd come close to that 1300 G2's capacity even max OC'ed.
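Back-of-the-envelope, that math checks out. A quick sketch, where every number (stock board power, stock voltage, system overhead) is an assumption for illustration rather than a measurement, using the rule of thumb that dynamic power scales roughly with the square of core voltage:

```python
# Rough sanity check: three Hawaii cards at 1.4 V vs a 1300 W PSU.
# All per-card numbers below are assumptions, not measurements.
STOCK_POWER_W = 290      # assumed stock board power of one card
STOCK_VOLTAGE = 1.2      # assumed stock core voltage
SYSTEM_OVERHEAD_W = 250  # assumed CPU + rest of system under load

def card_power(voltage, clock_scale=1.0):
    """Dynamic power scales roughly with V^2 (and linearly with clock)."""
    return STOCK_POWER_W * (voltage / STOCK_VOLTAGE) ** 2 * clock_scale

total_w = 3 * card_power(1.4) + SYSTEM_OVERHEAD_W
print(f"Estimated DC load: {total_w:.0f} W")  # comfortably above 1300 W
```

With these assumed figures the estimate lands well past the G2's rating, which is consistent with needing a second or bigger PSU.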


----------



## ZealotKi11er

Quote:


> Originally Posted by *bond32*
> 
> Consider a case like I was in - 3 290Xs with a 1300 watt G2; the three cards topped out the G2 at anything over 1.4 V.... Hence I needed a second PSU or a bigger one, neither of which is cheap. Whereas with 3 970s, I doubt they'd come close to that 1300 G2's capacity even max OC'ed.


There is no reason to push 1.4 V on those cards considering scaling. I found 1.375 V max is all that's needed for 2 cards.


----------



## DampMonkey

Quote:


> Originally Posted by *bond32*
> 
> Consider a case like I was in - 3 290Xs with a 1300 watt G2; the three cards topped out the G2 at anything over 1.4 V.... Hence I needed a second PSU or a bigger one, neither of which is cheap. Whereas with 3 970s, I doubt they'd come close to that 1300 G2's capacity even max OC'ed.


That makes sense. The AX1500i and similar are mighty pricey, and dual PSUs are just overcomplication


----------



## LandonAaron

I think the lower heat and noise of a 970 would be the real draw, IMO. So strange to me that one of the hottest, most power-hungry cards on the market is also the one with the fewest non-reference cooling designs available for it. Maybe that is a misconception on my end, since I mainly shop eBay when upgrading my GPU, but it seems 9 out of 10 R9 290Xs use the stock cooler, which is unbearably loud, and the non-reference cards cost significantly more.

I wasn't really worried about that, though, since I have an Arctic Accelero Hybrid water cooler which has served me well and provided excellent cooling for every card I have used previously, but it has met its match with the R9 290X. Playing Skyrim with a full ENB and all effects enabled, I am running at a constant 94 degrees and downclocking my 1100 OC to 1075 or so. I can run my card fully stable with about +80 mV, but lately I have taken to only giving it +38 mV, so I can run at about 92 degrees with only the occasional artifact and no downclocking. Maybe the power savings they are referring to is from not having to run their AC so much.


----------



## battleaxe

Quote:


> Originally Posted by *bond32*
> 
> When you hear people here say "power savings" it doesn't refer to the fractions of cents you save on your power bill, but rather the fact that you won't need 2 kW worth of power supplies to run multi-GPU setups...
> 
> 2 970s, even OC'ed, probably draw 60% of what 2 290s draw (I'm guessing)...


Well... yeah... that makes sense. But when you already have the setup, it makes zero sense to switch to 970s when you're already set up for the power draw they bring.

But yeah, starting fresh I would be going to the 970 anyway. The power consideration you mention is a good one, but if you already have the power needed, the savings make the 970 over a 290 a moot point. At that point I think the 290 is still very valid, as it's $50 cheaper than a 970. I personally think the 290s are priced right where they should be. The performance on tap for $300 is pretty amazing, really. Think about one year ago...

Competition is a wonderful thing.


----------



## bond32

Quote:


> Originally Posted by *ZealotKi11er*
> 
> There is no reason to push 1.4 V on those cards considering scaling. I found 1.375 V max is all that's needed for 2 cards.


Don't be ridiculous... This is an overclock site







As soon as I get time I plan to go to 1.5 V...

Point being, the 1300 watt G2 is no slouch of a PSU. Arguably there are only a tiny handful of equivalent or better PSUs out there, and none of them are cheap.

In my case, I was able to get a second G2 from another member here for $100. I told him to keep the cable set since I had extra cables.


----------



## Agent Smith1984

If you already have 290's.... AWESOME.....

If you are sitting on a 7970/280x like I am right now... I think selling out for $150 used, and adding $180 to it for a 970 is a pretty SWEET DEAL!!!
Just imagine if there were some used ones going for $275









OR I could just buy a used 280x, add it to mine and have crossfire, which on paper performs MUCH better, but then I'm still limited to 3GB, ANNNDDDD, well, we all know all the other downfalls to crossfire by now don't we?


----------



## Dynamo11

Don't suppose anybody in here can help me. I just bought a brand new R9 290 with the intention of putting a water block on it and adding it to a custom loop. So I go to boot the system after fully building it, leak testing and all that good stuff, to be greeted by this. It's really hard for me to get a decent picture, but essentially it is just a block of bright artifacts running down the screen. I've tested it with 3 different, perfectly fine monitors and each one shows that same line of light. So, any ideas? I was really careful when applying the block and thought I'd taken every precaution; I can't think what it is. I really hope I haven't damaged the card.


Spoiler: Warning: Spoiler!







EDIT: Figured I'd try booting into windows and this happened:


Spoiler: Warning: Spoiler!







Then I got a blue screen


----------



## joeh4384

Did you run the card on the stock cooler first?


----------



## Dynamo11

Quote:


> Originally Posted by *joeh4384*
> 
> Did you run the card on the stock cooler first?


No, rookie mistake I know, but it was the last thing I was waiting for, so I just set to work on it straight away


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If you already have 290's.... AWESOME.....
> 
> If you are sitting on a 7970/280x like I am right now... I think selling out for $150 used, and adding $180 to it for a 970 is a pretty SWEET DEAL!!!
> Just imagine if there were some used ones going for $275
> 
> 
> 
> 
> 
> 
> 
> 
> 
> OR I could just buy a used 280x, add it to mine and have crossfire, which on paper performs MUCH better, but then I'm still limited to 3GB, ANNNDDDD, well, we all know all the other downfalls to crossfire by now don't we?


Yeah.. I'd do that. I'd probably get the 970 too in your case. Why the heck not?

Hey any chance you get to upgrade in this hobby you take it man!!!


----------



## battleaxe

Quote:


> Originally Posted by *Dynamo11*
> 
> Don't suppose anybody in here can help me. I just bought a brand new R9 290, with the intention of putting a water block on it and adding it to a custom loop. So I go to boot the system after fully building it, leak testing and all that good stuff, to be greeted by this. It's really hard for me to get a decent picture but essentially it is just a block of bright artifacts running down the screen. I've tested it with 3 different perfectly fine monitors and each one has that same line of light. So any ideas? I was really careful when applying the block and thought I'd taken every precaution, I can't think as to what it is. I really hope I haven't damaged the card though.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> EDIT: Figured I'd try booting into windows and this happened:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Then I got a blue screen


Use the DDU uninstaller in safe mode, then try again. I always uninstall the drivers about 5 times in safe mode just to be sure. I was having major problems with CrossFire a few weeks ago, but now it's running awesome.

Edit: Also, last week I had a weird black-screen issue that was caused by not having enough thermal compound on the core. When I reapplied the block with new paste, the black-screen issue went away. There was just one tiny corner that didn't get enough compound on it. So reseat that block and try again. This die does not like it when it doesn't get a good interface with the block.


----------



## Agent Smith1984

That's a lot of artifacts
That GPU is definitely reaching maturity.....

j/k, good luck man


----------



## Dynamo11

Quote:


> Originally Posted by *battleaxe*
> 
> Edit: Also, last week I had a weird black-screen issue that was caused by not having enough thermal compound on the core. When I reapplied the block with new paste, the black-screen issue went away. There was just one tiny corner that didn't get enough compound on it. So reseat that block and try again. This die does not like it when it doesn't get a good interface with the block.


When I applied the TIM (I used the line method down the middle), the top was a little thinner than the bottom. I had assumed it'd be fine, but perhaps not. Probably wouldn't hurt to reseat the block and see if that fixes it.
If you have any other ideas, let me know


----------



## battleaxe

Quote:


> Originally Posted by *Dynamo11*
> 
> When I did apply the TIM (I used the line method down the middle) the top was a little thinner than the bottom, I had assumed it'd be fine but perhaps not. Probably wouldn't hurt to try and reseat the block to see if that fixes it.
> Any other ideas let me know


Yeah, I was shocked that it fixed it for me. But as soon as I redid it (and made sure there was plenty) it fired up just fine.

What thermal compound are you using? If you are using a compound that is not conductive, the extra will just squeeze out around the die anyway and won't hurt anything. I put about a BB-sized glob in the middle, which is enough for a CPU, so I know it was more than necessary, but it sure made a difference. My temps are great now and it runs instead of black-screening on me.

Have you ever gotten past the boot screen? When does it do this?

Edited for spelling


----------



## Dynamo11

Quote:


> Originally Posted by *battleaxe*
> 
> Yeah, I was shocked that it fixed it for me. But as soon as I redid it (and made sure there was plenty) it fired up just fine.
> 
> What thermal compound are you using? If you are using a compound that is not conductive it will just squeeze out the extra around the die anyway and won't hurt anything. I put about a BB sized glob in the middle which is enough for using on a CPU. So I know it was too much or more than necessary, but it sure made a difference. My temps are great now and it runs instead of black-screening on me.
> 
> Have you ever gotten past the boot screen? When does it do this?
> 
> Edited for spelling


I'm using PK-3; it's non-conductive and I've used it plenty of times with no issues. I can get past the BIOS boot just fine, albeit with those artifacts on the screen. I can get to the Windows boot screen every time, but then it blue screens and restarts itself. It's hard to make the blue screen out, but I think it's the one where it detects a hardware malfunction or change; I can't make out the code, which would make my life a little easier.


----------



## battleaxe

Quote:


> Originally Posted by *Dynamo11*
> 
> I'm using PK-3, it's non conductive and I've used it plenty of times with no issues. I can get past the BIOS boot just fine, albeit with those artifacts on the screen. I can get to the Windows boot screen everytime but then it blue screens and restarts itself. It's hard to make the blue screen out but I think it's the one where it detects a hardware malfunction or change, I can't make out the code which would make my life a little easier.


Boot into safe mode and use DDU to uninstall all the drivers that may be on the PC. Do that about 4-5 times. Then try to reboot into Windows.

Here's the file if you can't retrieve it.

DDU.zip 1503k .zip file


Sorry I had to reattach the file as zipped.


----------



## rdr09

Arizonian, please add another 290 (Gigabyte reference).



it will be watercooled soon.



ASIC is 80%. My other one, which benches to 1300 core, is 79%. Will eventually CrossFire on my Intel rig.

http://www.3dmark.com/3dm/4395468?

Here's 3DMark11 crossfire at 1100 core . . .

http://www.3dmark.com/3dm11/8842661

Pardon the cables. Thanks.


----------



## Cyber Locc

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If you already have 290's.... AWESOME.....
> 
> If you are sitting on a 7970/280x like I am right now... I think selling out for $150 used, and adding $180 to it for a 970 is a pretty SWEET DEAL!!!
> Just imagine if there were some used ones going for $275
> 
> 
> 
> 
> 
> 
> 
> 
> 
> OR I could just buy a used 280x, add it to mine and have crossfire, which on paper performs MUCH better, but then I'm still limited to 3GB, ANNNDDDD, well, we all know all the other downfalls to crossfire by now don't we?


Not all of us. Please explain these downfalls of CrossFire? I mean, sure, I've read about stuttering etc., but so far (since Saturday) I have had CrossFire 290s and have experienced zero issues while benching hard and gaming a lot on them, pretty much every second of spare time I've had (maybe 5 hours on weekdays, and I was messing with it for 12 hours on Sunday). I bought my second card used, so I have been testing it as hard as possible. So what are these issues "we all know"? I'd like to know.

A 970 is a good card for sure







If you're on 1080p, that will be a good upgrade from a 280x. At any higher resolution, though, I'm not sure whether the 280x would have been better or not, as a 290 is much better above 1080p.


----------



## Dynamo11

Quote:


> Originally Posted by *battleaxe*
> 
> Boot into safe mode and use DDU to uninstall all the drivers that may be on the PC. Do that about 4-5 times. Then try to reboot into Windows.
> 
> Here's the file if you can't retrieve it.
> 
> DDU.zip 1503k .zip file
> 
> 
> Sorry I had to reattach the file as zipped.


Thanks for the help, I'll report back tomorrow as to if reseating the block and uninstalling the drivers via DDU fixed the issue.


----------



## MojoW

Quote:


> Originally Posted by *Dynamo11*
> 
> No, rookie mistake I know, but it was the last thing I was waiting for, so I just set to work on it straight away


Don't worry, I had the exact same screen and BSOD problem; it was a faulty card that still had the original cooler on it.
It got RMA'd, so you know what to do.


----------



## battleaxe

Quote:


> Originally Posted by *cyberlocc*
> 
> Not all of us please explain these downfalls of crossfire? I mean sure I read about stuttering etc but so far. (since Saturday) I have had crossfire 290s and have experienced zero issues while benching hard and gaming a lot on them. pretty much every second of spare time I've had (maybe 5 hours on weekdays and was messing with it for 12 hours on Sunday). I bought my second card used so have been testing it hard as possible. So what issues are the ones we all know I'd like to know.
> 
> A 970 is a good card for sure
> 
> 
> 
> 
> 
> 
> 
> if your on 1080p that will be a good upgrade from a 280x however any other resolution. (higher I mean) I'm not sure if the 280x would have been better or not as a 290 is much better above 1080.


I initially had some CrossFire issues, but I since found out it was my motherboard causing them. Once I got that RMA'd it worked. Then I had issues with the black screen; turned out that was not enough TIM on the water block. Now that that issue is also fixed, CrossFire works awesome. I have not noticed any stuttering issues whatsoever. Gaming I get in excess of 150 FPS, and no tearing either. Not quite sure how they do that, because tearing used to be a problem back in the day from what I had heard, and I have seen it on my friend's system. It looks terrible. But I've seen zero tearing and zero stuttering, even in Xfire, on my 290s.

I think the issue is the complexity of these systems. With CrossFire or SLI there are just that many more things to go wrong. Some games don't support Xfire or SLI, so that can be an issue. I have to disable Xfire if I play Age of Empires, but that's no big deal. My biggest issue is I need a new monitor; so dumb having all this power and not being able to really enjoy it. Getting 150 FPS is kinda silly when the monitor only does 60 Hz. So in essence, I agree: Xfire and SLI both just add complexity to the PC, which can cause problems if everything isn't working properly. Like in my case, my mobo was bad.


----------



## Cyber Locc

Quote:


> Originally Posted by *battleaxe*
> 
> I initially had some crossfire issues. But I since found out it was my motherboard that was causing it. Once I got that RMA'd it worked. Then I had issues with the blackscreen. Turned out that was not enough TIM on the water block. And now that that issue is also fixed crossfire works awesome. I have not noticed any stuttering issues whatsoever. Gaming I get in excess of 150 FPS and no tearing either. Not quite sure how they do that because tearing used to be a problem back in the day from what I had heard and I have seen it on my friends system. It looks terrible. But I've seen zero tearing and zero stuttering even in Xfire on my 290's.
> 
> I think the issue is the complexity of these systems. With crossfire or SLI there are just that many more things to go wrong. Some games don't support xfire or SLI, so that can be an issue. I have to disable xfire if I play age of Empires, but that's no big deal. My biggest issue is I need to get a new monitor, so dumb having all this power and not able to really enjoy it. 'Getting 150fps is kinda silly when the monitor only does 60hz. So in essense, I agree, but xfire or SLI both just adds complexity to the PC which can cause problems if everything isn't working properly. Like in my case my MOBU was bad.


Ya, more points of failure, I can see that. I thought he was referring to performance issues. I can't speak to previous AMD CrossFire as I never used it, but my 670 SLI definitely did have stuttering at times, and in my experience the games that don't support SLI or CrossFire don't need them, as in your example.

And in your case it seems CrossFire was more helpful than harmful: you had a bad board and CrossFire found it. I had a board where one PCIe slot didn't work, and 3 months later none of them worked, so that is most likely a blessing in disguise.


----------



## battleaxe

Yeah, my 670 SLI rig stutters a lot more than the Xfire 290s do. I played BF3 last night on the 670 machine and I did notice it. But I still love my SLI 670s. I have no plans to upgrade them; I really like those cards.


----------



## Mega Man

Quote:


> Originally Posted by *bond32*
> 
> When you hear people here say "power savings" it doesn't refer to the fractions of cents you save on your power bill, but rather the fact that you won't need 2 kW worth of power supplies to run multi-GPU setups...
> 
> 2 970s, even OC'ed, probably draw 60% of what 2 290s draw (I'm guessing)...


Meh, unless it's ITX I will probably never do a build with less than 2500 W
Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cyberlocc*
> 
> Not all of us please explain these downfalls of crossfire? I mean sure I read about stuttering etc but so far. (since Saturday) I have had crossfire 290s and have experienced zero issues while benching hard and gaming a lot on them. pretty much every second of spare time I've had (maybe 5 hours on weekdays and was messing with it for 12 hours on Sunday). I bought my second card used so have been testing it hard as possible. So what issues are the ones we all know I'd like to know.
> 
> A 970 is a good card for sure
> 
> 
> 
> 
> 
> 
> 
> if your on 1080p that will be a good upgrade from a 280x however any other resolution. (higher I mean) I'm not sure if the 280x would have been better or not as a 290 is much better above 1080.
> 
> 
> 
> I initially had some crossfire issues. But I since found out it was my motherboard that was causing it. Once I got that RMA'd it worked. Then I had issues with the blackscreen. Turned out that was not enough TIM on the water block. And now that that issue is also fixed crossfire works awesome. I have not noticed any stuttering issues whatsoever. Gaming I get in excess of 150 FPS and no tearing either. Not quite sure how they do that because tearing used to be a problem back in the day from what I had heard and I have seen it on my friends system. It looks terrible. But I've seen zero tearing and zero stuttering even in Xfire on my 290's.
> 
> I think the issue is the complexity of these systems. With crossfire or SLI there are just that many more things to go wrong. Some games don't support xfire or SLI, so that can be an issue. I have to disable xfire if I play age of Empires, but that's no big deal. My biggest issue is I need to get a new monitor, so dumb having all this power and not able to really enjoy it. 'Getting 150fps is kinda silly when the monitor only does 60hz. So in essense, I agree, but xfire or SLI both just adds complexity to the PC which can cause problems if everything isn't working properly. Like in my case my MOBU was bad.

IMO, like many AMD vs. (insert competitor here) debates, the issues are greatly exaggerated. I have not had many issues, and the few I do or did have were either game-specific or user error


----------



## Ironsmack

So I finally got my 290s running 1150/1380 with +63 mV stable.

But I noticed, according to my CyberPower UPS and kill-a-watt meter, I'm pulling 915 watts running Valley.

Playing BF4 for an hour or so pegs me at around the same usage.

Well, I'm adding another 290 and I'm concerned whether my PSU (EVGA 1300) can even handle that amount. I feel like I'm pushing the limits of my PSU.


----------



## rdr09

Got my second 290 with a full EK block for $247.


----------



## bluedevil

Crazy question: with a 4790K or 5820K, which would you choose with CrossFired 290s running 4320x2560 (3x 1440p vertical)?


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> Crazy question, with a 4790k or 5820k, which would you choose with cross-fired 290s running 4320x2560 (3 1440P Vertical)?


If you are anywhere near a Microcenter . . . I'd get the 5820K. Any of those will handle 2 of these GPUs just fine.

Anyway, I replaced the 7950 in my AMD rig with a 290 and it fired right up. No driver uninstall or install; just switched. Then I took out the 290 and CrossFired it in my Intel rig, again with no driver uninstall, which I was expecting.

Reinstalled my 7950 in my AMD rig, again no driver uninstall. Just switched. Amazing.

edit: But that second 290's fan (reference) goes full blast playing PS2. Smooth, though.


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> if you are anywhere near a Microcenter . . . i'd get the 5820K. Any of those will handle 2 of these GPUs just fine.
> 
> Anyways, I replaced my 7950 in my AMD rig with a 290 and fired right up. No driver uninstall or install. Just switched. Then, took out the 290 and crossfired it to my intel rig, again no driver uninstall, which i was expecting.
> 
> Reinstalled my 7950 to my AMD rig, again no driver uninstall. Just switched. Amazing.
> 
> edit: But, that second 290's fan (reference) goes full blast playing PS2. Smooth, though.


It was just a crazy thought... might pitch it to the Mrs. come tax time, or before, and just PayPal Credit it.


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> It was just a crazy thought...might pitch it the Mrs come Tax time, or before and just PayPal credit it.


If I may suggest . . . just get a 3770K, used. My bad, you are building a whole new rig. Yeah, 5820K.


----------



## Cyber Locc

Quote:


> Originally Posted by *bluedevil*
> 
> Crazy question, with a 4790k or 5820k, which would you choose with cross-fired 290s running 4320x2560 (3 1440P Vertical)?


Neither; a 5930K. Why I say that is because when you add a third card, which you're going to need at that resolution if you want to play stuff maxed at 60 FPS, the 5930K will run all 3 at x8 where the others can't


----------



## rdr09

Quote:


> Originally Posted by *cyberlocc*
> 
> neither 5930k. Why I say that is because when you add a third card which your going to need for that resolution if you want to play stuff maxed and 60fps the 5930k will run all 3 at x8 where the others cant


I think the 5820K will run three at x8, too. Actually x16, if I'm not mistaken.


----------



## Cyber Locc

Quote:


> Originally Posted by *rdr09*
> 
> i think the 5820K will run three at X8, too. actually, X16. i'm not mistaken.


No it won't; that was the cut on Haswell-E. Ivy-E had fewer cores; Haswell-E has fewer lanes. A 5820K has 20 lanes, just like a 4790K.

I just recently chose to build Ivy-E for this exact reason.

If you don't want 3 cards though (and don't mind less than maxed or less than a constant 60 FPS), and are sure you only want 2, then a 4790K would be better if gaming is the main use, as the 6 cores are useless in gaming; it will only use 4. The reason some gaming systems are built on the E platforms is the 40 lanes and the other applications the PC also runs, but if you aren't doing any work that would benefit from 6 cores and aren't using more than 2 GPUs, you're paying a huge premium for stuff you won't use


----------



## bluedevil

Quote:


> Originally Posted by *cyberlocc*
> 
> no it won't that was the cut on haswell e ivy e had less cores haswell e has less lanes a 5820k has 20 lanes just like a 4790k


The 5820K has 28 PCIe lanes; it can do x8/x8/x8.

http://ark.intel.com/products/82932/

The 5930K has 40 PCIe lanes; it can do x16/x8/x8.

http://ark.intel.com/products/82931

So really not much difference with 3 cards; with 4 cards, yes, that would make a slight difference.

And this is for the naysayers about x8 vs. x16, not much difference:
http://www.pugetsystems.com/labs/articles/Impact-of-PCI-E-Speed-on-Gaming-Performance-518/
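The lane budgets being debated here are easy to sanity-check. This little table just records the arithmetic of the quoted configurations (the actual split is fixed by the motherboard's slot wiring, not computed by the CPU):

```python
# PCIe lane budgets for the CPUs being compared, and the common
# three-card slot configurations quoted above.
CPU_LANES = {"i7-4790K": 16, "i7-5820K": 28, "i7-5930K": 40}
THREE_CARD_SPLIT = {"i7-5820K": (8, 8, 8), "i7-5930K": (16, 8, 8)}

for cpu, split in THREE_CARD_SPLIT.items():
    used = sum(split)
    # A split is only possible if it fits within the CPU's lane budget.
    assert used <= CPU_LANES[cpu]
    links = "/".join(f"x{w}" for w in split)
    print(f"{cpu}: {links} uses {used} of {CPU_LANES[cpu]} lanes")
```

Running the same check for the 4790K's 16 lanes shows why it can't feed three cards at x8 each.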


----------



## Cyber Locc

Quote:


> Originally Posted by *bluedevil*
> 
> The 5820K has 28 PCIe lanes, it can do 8x8x8x.
> 
> http://ark.intel.com/products/82932/
> 
> The 5930K has 40 PCIe lanes, it can do 16x8x8.
> 
> http://ark.intel.com/products/82931
> 
> So really not much difference if doing 3 cards, 4 cards yes that would make a slight difference.
> 
> And this for the nay sayers about 8x vs 16x. Not much difference.
> http://www.pugetsystems.com/labs/articles/Impact-of-PCI-E-Speed-on-Gaming-Performance-518/


Yep, you're right; sorry, for some reason I thought it was 20. So for 3 cards, yep, you're good. For 4 cards it's more than a slight difference: a card running at x4 at 4K isn't going to work, as 4K saturates x4, especially with CrossFireX.

So yes, if you want 3 cards, 5820K; if you want 2, then 4790K


----------



## kizwan

Quote:


> Originally Posted by *Ironsmack*
> 
> So finally got my 290's running 1150/1380 with +63mV stable.
> 
> But i noticed according to my Cyberpower UPS and kill-a-watt meter, im pulling in 915 watts using Valley.
> 
> Playing with BF4 for an hour or so, pegs me around the same usage.
> 
> Well, I'm adding another 290 and im concerned if my PSU (EVGA 1300) can even handle that amount. I feel like I'm pushing the limits of my PSU.


What is your 3930K running at? You know 915 W is power draw from the wall, right? The actual power consumption is less than that; guesstimate around 8XX W. That is the number you want, and with it you can estimate whether your PSU can handle three cards. You can run three at stock clocks, but I don't think your PSU can handle three when overclocked.
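To put that wall-vs-DC point in numbers: a kill-a-watt reads AC input, and the PSU's DC output is roughly the wall figure times efficiency. A sketch, where the 90% is an assumed ballpark for a Gold-rated unit at this load, not a measured value (check your PSU's own efficiency curve):

```python
# Convert a wall (AC) reading to the approximate DC load the PSU delivers.
WALL_DRAW_W = 915    # the kill-a-watt reading from the post above
EFFICIENCY = 0.90    # assumed efficiency at this load, not measured

dc_load_w = WALL_DRAW_W * EFFICIENCY
print(f"Estimated DC load: {dc_load_w:.0f} W")  # lands in the 8XX W range
```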


----------



## Cyber Locc

Quote:


> Originally Posted by *bluedevil*
> 
> The 5820K has 28 PCIe lanes, it can do 8x8x8x.
> 
> http://ark.intel.com/products/82932/
> 
> The 5930K has 40 PCIe lanes, it can do 16x8x8.
> 
> http://ark.intel.com/products/82931
> 
> So really not much difference if doing 3 cards, 4 cards yes that would make a slight difference.
> 
> And this for the nay sayers about 8x vs 16x. Not much difference.
> http://www.pugetsystems.com/labs/articles/Impact-of-PCI-E-Speed-on-Gaming-Performance-518/


I'm not a naysayer about x16 being only a slight improvement, but say what they will, that test only covers NVIDIA GPUs, and old ones at that, which have quite a bit less bandwidth than 290s (also why they aren't as good at 4K), and they also have SLI bridges, so they don't send everything over the PCIe lanes. I do agree x8/x8/x8 is enough; x8/x8/x4 isn't, however.
Quote:


> Originally Posted by *Ironsmack*
> 
> So finally got my 290's running 1150/1380 with +63mV stable.
> 
> But i noticed according to my Cyberpower UPS and kill-a-watt meter, im pulling in 915 watts using Valley.
> 
> Playing with BF4 for an hour or so, pegs me around the same usage.
> 
> Well, I'm adding another 290 and im concerned if my PSU (EVGA 1300) can even handle that amount. I feel like I'm pushing the limits of my PSU.


1300 W will most likely be enough; as you said they're 290s, that's about a 50 W difference at stock. If you want to be sure, put the cards under load and record the wattage, then run just 1 card and record again to figure out how much 1 card is using; add that to your 915, and as said, factor in that it's measured from the wall.
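That measure-the-difference procedure, as arithmetic. The one-card reading below is a made-up placeholder, not a measurement; substitute your own meter readings:

```python
# Project three-card wall draw from two readings: both cards under load,
# then one card under load. The difference approximates one card's share.
two_card_wall_w = 915   # measured figure from the post above
one_card_wall_w = 640   # hypothetical placeholder reading, NOT measured

per_card_w = two_card_wall_w - one_card_wall_w
three_card_wall_w = two_card_wall_w + per_card_w
print(f"Projected three-card wall draw: ~{three_card_wall_w} W")
```

Remember the projection is still a wall (AC) figure; multiply by your PSU's efficiency for the DC load it would actually have to supply.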


----------



## bluedevil

Quote:


> Originally Posted by *cyberlocc*
> 
> I'm not a nay sayer about x16 being only slight improvement however just wanted to say what they will that only is about nvidia gpus and old ones at that which have less bandwidth than 290s by quite a bit also why they aren't as good at 4k they also have sli cables they don't provide over the pci lane I do agree x8 x8 x8 is enough x8 x8 x4 isn't however.


Unless I get this beast.









http://www.newegg.com/Product/Product.aspx?Item=N82E16813132263

Not really.....


----------



## Cyber Locc

Quote:


> Originally Posted by *bluedevil*
> 
> Unless I get this beast.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16813132263
> 
> Not really.....


That's a good board to get; the price ain't that bad. X99 is expensive no matter what; plan on spending at least $400 on a decent X99 board. It's a much more expensive platform by far.

If you live by a Microcenter, first off I hate you lol

Secondly, you could prolly get a good deal on that board and CPU.

Also, that board uses a PLX chip to get x16/x16/x16/x16, so I wouldn't suggest it for that reason; I hate PLX. Been there, done that, threw it out the window (however, that was a very long time ago; it is probably better now)


----------



## bluedevil

Quote:


> Originally Posted by *cyberlocc*
> 
> Thats a good board to get the price aint that bad x99 is expensive no matter what plan on spending at least 400 on a decent x99 board. its a much more expensive platform by far.


However, as I am finding out, it is built to last much longer than the mainstream platforms.


----------



## Cyber Locc

Quote:


> Originally Posted by *bluedevil*
> 
> However, as I am finding out, it is built to last much longer than the mainstream platforms.


This is true, server grade.







Plus, if you need a better CPU down the line, drop in a 16-core Xeon FTW


----------



## Arizonian

Quote:


> Originally Posted by *Unknownm*
> 
> 2x XFX 290 @ 947MHz, hopefully they will unlock
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sorry for the messy wires, cable management will come tomorrow.
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated









Quote:


> Originally Posted by *Ziox*
> 
> Wanted to join the club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 290x 1225/1625 stable
> 
> 39c Idle 55c load 3hrs Heaven Benchmark
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added















Quote:


> Originally Posted by *rdr09*
> 
> Arizonian, please add another 290 (Gigabyte reference).
> 
> 
> 
> it will be watercooled soon.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Asic is 80%. My other one that benches to 1300 core is 79%. Will eventually crossfire on my Intel rig.
> 
> http://www.3dmark.com/3dm/4395468?
> 
> Here's 3DMark11 crossfire at 1100 core . . .
> 
> http://www.3dmark.com/3dm11/8842661
> 
> Pardon the cables. Thanks.


Congrats - updated


----------



## Ironsmack

Quote:


> Originally Posted by *kizwan*
> 
> What is your 3930k running at? You know 915W is the power draw from the wall, right? The actual power consumption is less than that, guesstimate around 8XXW. This is the figure you want to know, and with it you can estimate whether your PSU can handle three cards. You can run three at stock clocks, but I don't think your PSU can handle three when overclocked.


Quote:


> Originally Posted by *cyberlocc*
> 
> I'm not a naysayer about x16 being only a slight improvement, but I wanted to point out that that only applies to Nvidia GPUs, and old ones at that, which have quite a bit less bandwidth than 290s (also why they aren't as good at 4k). They also have SLI cables, so they don't send everything over the PCIe lanes. I do agree x8/x8/x8 is enough; x8/x8/x4 isn't, however.
> 1300W will most likely be enough; as you said, they're 290s, so that's about a 50W difference at stock. If you want to be sure, put the cards under load and record the wattage, then do it with one card and figure out how much one card is using, add that to your 915, and, as said, factor in that it's measured from the wall.


Well, my 3930k runs at 4.5GHz @ 1.39 volts.

With one card, power draw (from the wall) is 610 watts (+/- 10).
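Kizwan's point above, that wall draw overstates the real load because of PSU conversion losses, can be sketched in a few lines. The 88% efficiency figure here is an assumption (roughly an 80 Plus Gold unit at mid load), not a measurement:

```python
def dc_load(wall_watts, efficiency=0.88):
    """Estimate the actual DC load from wall-socket draw.

    A wall meter includes the PSU's conversion losses, so the power
    the PSU really delivers is roughly wall draw times efficiency.
    0.88 is an assumed figure (about 80 Plus Gold at mid load).
    """
    return wall_watts * efficiency

# 915 W at the wall with two cards is roughly 805 W of actual load
print(round(dc_load(915)))

# one card added about 305 W at the wall (915 W minus the 610 W
# single-card reading), so a rough three-card estimate at these clocks:
print(round(dc_load(915 + (915 - 610))))
```

Rough numbers, but they show why a "915W" wall reading doesn't mean the PSU is delivering 915W.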


----------



## gkolarov

Hi, I am new to crossfire, with two R9 290 cards in my rig. They work fine. Just one thing annoys me: no matter whether I am playing a game or benching (Valley, for example), the motion is flawless for 6-7 seconds, then there is a little pause of about half a second, then 6-7 seconds flawless again, then another half-second pause. It feels like the application stops for a moment to load the next portion of graphics data. What is this? I didn't do a driver clean when I installed the second 290 card; I just installed the drivers over the previous ones and restarted, and Windows restarted fine. The cards are not from the same vendor (one is a PCS+, the other an XFX) and the BIOS versions are different. Could this be the problem?


----------



## devvfata1ity

Quote:


> Originally Posted by *zealord*
> 
> Maybe someone here with a normal 290 that runs at 1100 core clock / 1350 mem clock could run 3DMark Fire Strike for me and post the result?
> 
> I would also be interested in an R9 290X 1100/1350 run.
> 
> OS, CPU etc. don't matter. Thanks.


Fire Strike overall score of 10700, using a 4670k @ 4.4 and an R9 290 @ 1100/1450.


----------



## Buehlar

Quote:


> Originally Posted by *gkolarov*
> 
> Hi, I am new to crossfire, with two R9 290 cards in my rig. They work fine. Just one thing annoys me: no matter whether I am playing a game or benching (Valley, for example), the motion is flawless for 6-7 seconds, then there is a little pause of about half a second, then 6-7 seconds flawless again, then another half-second pause. It feels like the application stops for a moment to load the next portion of graphics data. What is this? I didn't do a driver clean when I installed the second 290 card; I just installed the drivers over the previous ones and restarted, and Windows restarted fine. The cards are not from the same vendor (one is a PCS+, the other an XFX) and the BIOS versions are different. Could this be the problem?


Sounds like another running process to me. Hardware monitoring software can cause this, as it refreshes every few seconds.
A certain version of the GPU Tweak monitor has given me this trouble before; can't remember exactly which version it was....2.6.xx something.

What programs do you have running in the background?

You can troubleshoot by killing unnecessary processes until you find the culprit.


----------



## neurotix

Hey guys, haven't posted in here in a while, here are some of my recent benches. I still think these are fantastic cards, and now priced at $300 they're a price/performance king. (I paid $450 for a 7970 in early 2013.)

4770k @ 4.5ghz, two 290 Tri-X

Valley Crossfire


Fire Strike Crossfire


Fire Strike Single


3dm11 Crossfire


3dm11 Single (17k!)



3dmark Vantage


----------



## gkolarov

Ok, thanks, I will try. Everything is running in the background: GPU-Z x2, MSI AB, Chrome, a movie on the second monitor (HDMI), Skype.... I will try with minimal background programs.


----------



## rdr09

Quote:


> Originally Posted by *cyberlocc*
> 
> This is true server grade
> 
> 
> 
> 
> 
> 
> 
> . Plus if you need a better cpu down the line drop in a 16 core xeon FTW


@bluedevil, this is the best part.

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - updated
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - updated


Thanks.


----------



## bluedevil

I dunno, a single 4k monitor is looking purdy good now over 3x QX2710s.


----------



## sugarhell

21:9 for the win.


----------



## bond32

My roommate has a 4k monitor... although he only has 2 280X's, Battlefield 4 at low/medium at 4k LOOKS good but PLAYS terrible. I would claw my eyes out personally after playing on my X-Star 1440p 120Hz for so long.

In my opinion, the 1440p monitors at high refresh rates look magnitudes better than 4k, at least right now.


----------



## Agent Smith1984

Quote:


> Originally Posted by *cyberlocc*
> 
> Not all of us. Please explain these downfalls of crossfire? I mean, sure, I read about stuttering etc., but so far (since Saturday) I have had crossfire 290s and have experienced zero issues while benching hard and gaming a lot on them, pretty much every second of spare time I've had (maybe 5 hours on weekdays, and I was messing with it for 12 hours on Sunday). I bought my second card used, so I have been testing it as hard as possible. So what are these issues we all supposedly know about? I'd like to know.
> 
> A 970 is a good card for sure; if you're on 1080p, that will be a good upgrade from a 280X. At any higher resolution, though, I'm not sure if the 280X would have been better or not, as a 290 is much better above 1080p.


To be honest, all I know personally is from reading....
But there has been enough research on my end to be somewhat wary of going the CF route....

I am finding 7970's and 280Xs for $125 though... I AM SO TEMPTED to just pick another up and run crossfire....
I mean, from what I see, when working correctly the scaling is 60-95%!!! That's huge....


----------



## sinnedone

Quote:


> Originally Posted by *bond32*
> 
> My roommate has a 4k monitor... although he only has 2 280X's, Battlefield 4 at low/medium at 4k LOOKS good but PLAYS terrible. I would claw my eyes out personally after playing on my X-Star 1440p 120Hz for so long.
> 
> In my opinion, the 1440p monitors at high refresh rates look magnitudes better than 4k, at least right now.


This so much.

After overclocking my Qnix's refresh rate, the difference is night and day. I only have one of my 290's in right now, but even with only one 290 at 1050/1250 on stock volts and a 96Hz Qnix profile, the fluidity is just too awesome to give up. BF4 is so smooth.







I can run 120hz no problem on my qnix, but one 290 doesn't have the horsepower to keep the minimums up there at 1440p.

Quote:


> Originally Posted by *Agent Smith1984*
> 
> To be honest, all I know personally is from reading....
> But there has been enough research on my end to be somewhat wary of going the CF route....
> 
> I am finding 7970's and 280Xs for $125 though... I AM SO TEMPTED to just pick another up and run crossfire....
> I mean, from what I see, when working correctly the scaling is 60-95%!!! That's huge....


When crossfire works, it's plain awesome. Now, when it doesn't (cough, cough, Far Cry 3, cough), it's so horrible. The 7970 is still a very capable card.


----------



## Agent Smith1984

Quote:


> Originally Posted by *sinnedone*
> 
> When crossfire works, it's plain awesome. Now, when it doesn't (cough, cough, Far Cry 3, cough), it's so horrible. The 7970 is still a very capable card.


And that is my fear..... I have seen reviews where Far Cry 3 scales 80+% with Crossfire on 7970/280x, and then I have read other stories where people are running the game better on a single card....

I would like to know this from any experienced crossfire user though;

Can I set profiles in Catalyst Control Center that use Crossfire for certain games and a single GPU for others, on the fly, without needing reboots etc.???

I play Skyrim, Far Cry 3, Crysis 3, BF 4, and will be getting Far Cry 4 as soon as it drops (which will likely stop me from playing FC3, but my 9 year old may still want to play it).....









I also am faced with needing to get a 900W+ PSU if I do this, though my case mounts two PSUs, so I could just add a second one.


----------



## sinnedone

Quote:


> Originally Posted by *Agent Smith1984*
> 
> And that is my fear..... I have seen reviews where Far Cry 3 scales 80+% with Crossfire on 7970/280x, and then I have read other stories where people are running the game better on a single card....
> 
> I would like to know this from any experienced crossfire user though;
> 
> Can I set profiles in Catalyst Control Center that use Crossfire for certain games and a single GPU for others, on the fly, without needing reboots etc.???
> 
> I play Skyrim, Far Cry 3, Crysis 3, BF 4, and will be getting Far Cry 4 as soon as it drops (which will likely stop me from playing FC3, but my 9 year old may still want to play it).....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I also am faced with needing to get a 900W+ PSU if I do this, though my case mounts two PSUs, so I could just add a second one.


I had crossfire 7870's. In other games I was always pushing around triple digits at 1080p. In Far Cry 3 I could get a good 60 with one card, but crossfire dropped frames into the 30's, if I remember correctly. With that game it was just poor optimization on the devs' part. It's been a while since I played it though, and I believe they fixed it. You'd have to look at current reviews and not ones from a year ago.


----------



## Dynamo11

Hey guys, back again. This morning I reseated the water block and the problem remains. I then booted into safe mode and uninstalled the original drivers a few times, and the problem remained. So what would my best course of action be now? Slap the stock cooler on and try to RMA (bearing in mind I think applying a water block voids the warranty), or wait for some other solution? I really should have run it when I got it, as I could have seen whether it was like this straight away or whether I broke it in the meantime.


----------



## sugarhell

Quote:


> Originally Posted by *Dynamo11*
> 
> Hey guys, back again. This morning I reseated the water block and the problem remains. I then booted into safe mode and uninstalled the original drivers a few times, and the problem remained. So what would my best course of action be now? Slap the stock cooler on and try to RMA (bearing in mind I think applying a water block voids the warranty), or wait for some other solution? I really should have run it when I got it, as I could have seen whether it was like this straight away or whether I broke it in the meantime.


Can you try with the stock cooler? I searched your previous comments and I didn't find anything similar.

Can you explain a bit better? Do you see artifacts when Windows loads, or from start-up (BIOS etc.)?


----------



## battleaxe

Quote:


> Originally Posted by *Dynamo11*
> 
> Hey guys, back again. This morning I reseated the water block and the problem remains. I then booted into safe mode and uninstalled the original drivers a few times, and the problem remained. So what would my best course of action be now? Slap the stock cooler on and try to RMA (bearing in mind I think applying a water block voids the warranty), or wait for some other solution? I really should have run it when I got it, as I could have seen whether it was like this straight away or whether I broke it in the meantime.


I'm betting the card is DOA. RMA it. Put the cooler back on and send it in quoting said problems. Shouldn't be an issue. What brand is it?

You could try the air cooler first too as mentioned. Probably couldn't hurt on the outside chance that the block is shorting somewhere for some unknown reason.


----------



## Cyber Locc

Quote:


> Originally Posted by *bond32*
> 
> My roommate has a 4k monitor... although he only has 2 280X's, Battlefield 4 at low/medium at 4k LOOKS good but PLAYS terrible. I would claw my eyes out personally after playing on my X-Star 1440p 120Hz for so long.
> 
> In my opinion, the 1440p monitors at high refresh rates look magnitudes better than 4k, at least right now.


Low to medium settings on mid-range cards is a far shot from generalizing 4k, just saying. Also, too little info: does his monitor have a 60Hz refresh rate? Or 30Hz? Or 30Hz split into 2? Is it a TN panel or IPS? Your 1440p is undoubtedly IPS. I'm just saying that making a generalization like that is a little naive. I do agree that 120Hz etc. looks good, though; G-Sync and FreeSync help with that slightly, and 4k versions of those are coming.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> And that is my fear..... I have seen reviews where Far Cry 3 scales 80+% with Crossfire on 7970/280x, and then I have read other stories where people are running the game better on a single card....
> 
> I would like to know this from any experienced crossfire user though;
> 
> Can I set profiles in Catalyst Control Center that use Crossfire for certain games and a single GPU for others, on the fly, without needing reboots etc.???
> 
> I play Skyrim, Far Cry 3, Crysis 3, BF 4, and will be getting Far Cry 4 as soon as it drops (which will likely stop me from playing FC3, but my 9 year old may still want to play it).....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I also am faced with needing to get a 900W+ PSU if I do this, though my case mounts two PSUs, so I could just add a second one.


Agent Smith, I can only speak for 290s, but crossfire on those right now is very good, and I would assume the 280X shares that. I honestly think it would be worth a shot; if you don't like it, sell 'em and no harm done.

As for crossfire profiles for some games only, I am not sure; however, you don't need to reboot the PC or anything, and there is a toggle in CCC. The right-click menu also gives you profiles, where you have the same toggle.


----------



## Dynamo11

Quote:


> Originally Posted by *sugarhell*
> 
> Can you try with the stock cooler? I searched your previous comments and I didn't find anything similar.
> 
> Can you explain a bit better? Do you see artifacts when Windows loads, or from start-up (BIOS etc.)?


It's hard to explain; I think a video shows it better than I can write (sorry for the bad quality, I'm recording it on my mobile):


Spoiler: Warning: Spoiler!










Quote:


> Originally Posted by *battleaxe*
> 
> I'm betting the card is DOA. RMA it. Put the cooler back on and send it in quoting said problems. Shouldn't be an issue. What brand is it?
> 
> You could try the air cooler first too as mentioned. Probably couldn't hurt on the outside chance that the block is shorting somewhere for some unknown reason.


Yeah, as of right now I think RMA'ing it might be the only solution; it's an XFX reference card. Before I send it back, though, I'll try it with the stock cooler on to eliminate the block as the issue.


----------



## bond32

Quote:


> Originally Posted by *cyberlocc*
> 
> Low to medium settings on mid-range cards is a far shot from generalizing 4k, just saying. Also, too little info: does his monitor have a 60Hz refresh rate? Or 30Hz? Or 30Hz split into 2? Is it a TN panel or IPS? Your 1440p is undoubtedly IPS. I'm just saying that making a generalization like that is a little naive. I do agree that 120Hz etc. looks good, though; G-Sync and FreeSync help with that slightly, and 4k versions of those are coming.
> Agent Smith, I can only speak for 290s, but crossfire on those right now is very good, and I would assume the 280X shares that. I honestly think it would be worth a shot; if you don't like it, sell 'em and no harm done.
> 
> As for crossfire profiles for some games only, I am not sure; however, you don't need to reboot the PC or anything, and there is a toggle in CCC. The right-click menu also gives you profiles, where you have the same toggle.


He claims it is a 60Hz refresh rate, but it's not like it matters; the framerate never gets over 50fps. It's an IPS panel. There are no 4k monitors capable of anything over 60Hz right now, but I think I saw LG develop an IPS panel that could do 120Hz at 4k, so perhaps we could see that soon.

I'm not sure what you're saying about a "generalization"... I observed that even his IPS 4k panel, with a constant framerate of 60fps or more, still looks unplayable to me personally after playing on my own 1440p monitor at 120Hz. This observation has been shown over and over... people who have played at 60Hz for years try 120 and never go back.

So, again, in my opinion there are no 4k monitors out yet that will provide a smooth gaming (shooter) experience. The 1440p overclocked monitors are the way to go.


----------



## Cyber Locc

Quote:


> Originally Posted by *bond32*
> 
> He claims it is a 60Hz refresh rate, but it's not like it matters; the framerate never gets over 50fps. It's an IPS panel. There are no 4k monitors capable of anything over 60Hz right now, but I think I saw LG develop an IPS panel that could do 120Hz at 4k, so perhaps we could see that soon.
> 
> I'm not sure what you're saying about a "generalization"... I observed that even his IPS 4k panel, with a constant framerate of 60fps or more, still looks unplayable to me personally after playing on my own 1440p monitor at 120Hz. This observation has been shown over and over... people who have played at 60Hz for years try 120 and never go back.
> 
> So, again, in my opinion there are no 4k monitors out yet that will provide a smooth gaming (shooter) experience. The 1440p overclocked monitors are the way to go.


Well, if it's IPS then it's split panels, as AFAIK there aren't any single-panel IPS screens capable of 60Hz at 4k. There also aren't any IPS 4k screens under $2,000, so that may give insight into which it is, 60Hz or IPS.

Whether a monitor can do 120Hz at 4k is irrelevant, as if you cannot render 120fps then it doesn't matter, and no setup can render 120fps at 4k on max settings; most likely it can't at medium either.

From what I have seen, G-Sync is as smooth if not smoother than 120Hz, and I disagree that everyone loves 120Hz, as I have used quite a few and none have had me rushing out to buy one. I notice little difference, but to each their own.

However, I will note that part of my reluctance toward 120Hz is most likely that I don't play many shooter games. I do occasionally, mostly with friends, and have played FPS games on their 120Hz screens, and it makes a bigger difference there. However, the majority of my gaming consists of DOTAs, MMOs and RPGs, so to me graphics quality is much more important than refresh rate, and 4k has that beat by a long shot, if you have the hardware to push it correctly, that is.

Also, a really cool feature of a few 4k monitors I was looking at: if it's a single panel and supports 4k at 60Hz, it also supports 120Hz @ 1440p, so it's kinda the best of both worlds. If you're playing a shooter, drop to 1440p; if not, get the graphical benefits of 4k.


----------



## bond32

Quote:


> Originally Posted by *cyberlocc*
> 
> Well, if it's IPS then it's split panels, as AFAIK there aren't any single-panel IPS screens capable of 60Hz at 4k. There also aren't any IPS 4k screens under $2,000, so that may give insight into which it is, 60Hz or IPS.
> 
> Whether a monitor can do 120Hz at 4k is irrelevant, as if you cannot render 120fps then it doesn't matter, and no setup can render 120fps at 4k on max settings; most likely it can't at medium either.
> 
> From what I have seen, G-Sync is as smooth if not smoother than 120Hz, and I disagree that everyone loves 120Hz, as I have used quite a few and none have had me rushing out to buy one. I notice little difference, but to each their own.
> 
> However, I will note that part of my reluctance toward 120Hz is most likely that I don't play many shooter games. I do occasionally, mostly with friends, and have played FPS games on their 120Hz screens, and it makes a bigger difference there. However, the majority of my gaming consists of DOTAs, MMOs and RPGs, so to me graphics quality is much more important than refresh rate, and 4k has that beat by a long shot, if you have the hardware to push it correctly, that is.
> 
> Also, a really cool feature of a few 4k monitors I was looking at: if it's a single panel and supports 4k at 60Hz, it also supports 120Hz @ 1440p, so it's kinda the best of both worlds. If you're playing a shooter, drop to 1440p; if not, get the graphical benefits of 4k.


Which monitors have that dual refresh rate support??

Also, pretty sure plenty of people have rigs that can do 120fps at 4k...

I realize supersampling isn't the same, but I for one play BF4 at 150% resolution scale with all settings maxed at 1440p, and the framerate goes between 19 and 160, averaging around 110-120 fps... This is of course with 3 heavily overclocked 290X's.
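Incidentally, if BF4's resolution scale multiplies each axis (an assumption worth double-checking), then 150% at 1440p is already rendering the full UHD "4K" pixel grid:

```python
def scaled_res(width, height, scale):
    # assumes the engine applies resolution scale per axis,
    # not per total pixel count
    return int(width * scale), int(height * scale)

# 1440p at 150% render scale comes out to 3840 x 2160, i.e. UHD "4K"
print(scaled_res(2560, 1440, 1.5))
```

So that supersampled 1440p run is closer to a real 4k workload than it might sound.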


----------



## Agent Smith1984

I dunno about "plenty" with rigs doing 120FPS at 4k..... have you seen 4k benchmarks?? They make ALL cards look stupid....


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I dunno about "plenty" with rigs doing 120FPS at 4k..... have you seen 4k benchmarks?? They make ALL cards look stupid....


Triple 4k on Quad 290x's


----------



## Agent Smith1984

HOLY CRAP!!!

Yeah, with quad CF, I can see it..... but that's not plenty!!


----------



## Cyber Locc

Quote:


> Originally Posted by *bond32*
> 
> Which monitors have that dual refresh rate support??
> 
> Also, pretty sure plenty of people have rigs that can do 120fps at 4k...
> 
> I realize supersampling isn't the same, but I for one play bf4 at 150% res scale with all settings max'ed, at 1440p and framerate goes between 19 and 160, averages around 110-120 fps... This is of course with 3 heavily overclocked 290x's


Off the top of my head, the Asus one claims that, as the monitor itself has the ability to change resolutions and refresh rates. A lot of them boast the ability; whether it works, and works well, is another story. However, in theory, if a monitor over DisplayPort 1.2 can support 60Hz at 4k, it can also support 120Hz @ 1440p. I have not seen this in action, so I make no claims as to how well it works, but it says it works.
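The DisplayPort 1.2 reasoning at least checks out on raw pixel throughput; here's a quick sketch that ignores blanking intervals and color depth (which a real link budget has to include):

```python
def pixel_rate(width, height, hz):
    # raw pixels per second; ignores blanking overhead and bit depth
    return width * height * hz

uhd_60 = pixel_rate(3840, 2160, 60)    # ~498 Mpx/s
qhd_120 = pixel_rate(2560, 1440, 120)  # ~442 Mpx/s

# 1440p at 120Hz actually needs slightly LESS raw throughput than 4k60
print(qhd_120 < uhd_60)
```

So a link that carries 4k60 has the raw headroom for 1440p at 120Hz, assuming the monitor's electronics accept that mode.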

No, no set of cards at all can run 4k maxed out at 120Hz. It takes 3 980s or 3 R9 290Xs just to get a constant 60 (not dropping below). People may have PCs that can spike to 120fps at 4k, but they will not hold a steady 120fps anytime soon without lowering settings quite a bit. For that very reason, many people feel we aren't ready for 4k yet.

Here is just one bench of such, playing Crysis 3, an old game: http://www.hardocp.com/article/2014/05/13/amd_radeon_r9_295x2_xfx_290x_dd_trifire_review/4#.VEFTkPldVps Quadfire 290Xs can't even get a 60fps average at 4k (58.2, so it's very close), and the 980s and 295X2 trade blows at 4k, with AMD winning half and Nvidia the other. You could most likely achieve 4k 120Hz on some setups, but it would require turning off AA and maybe turning down some other settings depending on the game; no setup will max every game at 4k 120Hz.

And as for the quadfire for 3x 4k: umm, not happening, lol, unless you're playing on low. 3x 1440p taxes quadfire 290Xs, with a lot of games not reaching 60fps when maxed out, and 4k has more than twice the pixels of 1440p.


----------



## Cyber Locc

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Triple 4k on Quad 290x's


Last I checked, 4k doesn't have 4:3 screens, which is what makes those wings, as is apparent from the color shift on them. Also, I can't read the fps, nor are any settings attached, and looking at a wall is not fragging a group of 10 guys. Nice try, though. (Unless that's the end of a bench; I still need to see more than that. How about running Valley at 3x 4k on max and let's see the 5fps.)


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> HOLY CRAP!!!
> 
> Yeah, with quad CF, I can see it..... but that's not plenty!!


Single 290x at 4k on Hitman Absolution:




Spoiler: Settings







From what I've heard from most people who have been using 4k, AA usually isn't needed and can actually make the game look worse in some cases, so turning it off just makes sense.


----------



## Sgt Bilko

Quote:


> Originally Posted by *cyberlocc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Triple 4k on Quad 290x's
> 
> 
> 
> 
> 
> Last I checked, 4k doesn't have 4:3 screens, which is what makes those wings, as is apparent from the color shift on them. Also, I can't read the fps, nor are any settings attached, and looking at a wall is not fragging a group of 10 guys. Nice try, though.

That was in Eyefinity, and here are the settings:
Quote:


> Tomb raider - Ultimate preset with FXAA turned off. TressFX on. Quad 290x @ 1175/1575 clocks. CPU [email protected] - PCIE 2.0 x16 (NF200 PLX chips EVGA 4 way classified X58)


EDIT: to see them in full you need to right click then "open in new tab" or "open image in new tab"


----------



## Cyber Locc

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Single 290x at 4k on Hitman Absolution:
> 
> 
> 
> 
> Spoiler: Settings
> 
> 
> 
> 
> 
> 
> 
> From what i've heard from most people who have been using 4k is that AA usually isn't needed and can actually make the game look worse in some cases so turning it off just makes sense


Ya, Hitman is very demanding. And hey, I found a review showing 120fps is reachable at 4k, as to your claim: http://www.tweaktown.com/articles/6726/4k-showdown-intel-x79-vs-x99-with-asus-geforce-gtx-980-4gb-quad-sli/index8.html (118fps).


----------



## Cyber Locc

Quote:


> Originally Posted by *Sgt Bilko*
> 
> that was in eyefinity and here are the settings:
> EDIT: to see them in full you need to right click then "open in new tab" or "open image in new tab"


Not bad. And what are those, 21:9 and 4:3 monitors? Whatever they are, that's not 3x 4k screens.


----------



## Agent Smith1984

I agree on the AA with 4k....
At 4k resolution, what the hell do you even need AA for???


----------



## bluedevil

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I agree on the AA with 4k....
> At 4k resolution, what the hell do you even need AA for???


You wouldn't.


----------



## Sgt Bilko

Quote:


> Originally Posted by *cyberlocc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> that was in eyefinity and here are the settings:
> EDIT: to see them in full you need to right click then "open in new tab" or "open image in new tab"
> 
> 
> 
> Not bad. And what are those, 21:9 and 4:3 monitors? Whatever they are, that's not 3x 4k screens.

They aren't my screencaps, but they are running at 11520 x 2160, and 11520 divided by 3 = 3840, so that's 3x 3840 x 2160 monitors.
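The arithmetic is easy to double-check, and it also puts a number on how much heavier triple 4k is than triple 1440p:

```python
surround_w, surround_h = 11520, 2160

# three UHD panels side by side: 11520 / 3 = 3840
assert surround_w // 3 == 3840

pixels_4k_x3 = surround_w * surround_h      # 24,883,200 pixels
pixels_1440_x3 = 3 * 2560 * 1440            # 11,059,200 pixels

# triple 4k pushes 2.25x the pixels of triple 1440p
print(pixels_4k_x3 / pixels_1440_x3)
```

Which is why a setup that merely copes with 3x 1440p gets crushed at 3x 4k.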


----------



## bond32

Quote:


> Originally Posted by *cyberlocc*
> 
> Off the top of my head, the Asus one claims that, as the monitor itself has the ability to change resolutions and refresh rates. A lot of them boast the ability; whether it works, and works well, is another story. However, in theory, if a monitor over DisplayPort 1.2 can support 60Hz at 4k, it can also support 120Hz @ 1440p. I have not seen this in action, so I make no claims as to how well it works, but it says it works.
> 
> No, no set of cards at all can run 4k maxed out at 120Hz. It takes 3 980s or 3 R9 290Xs just to get a constant 60 (not dropping below). People may have PCs that can spike to 120fps at 4k, but they will not hold a steady 120fps anytime soon without lowering settings quite a bit. For that very reason, many people feel we aren't ready for 4k yet.
> 
> Here is just one bench of such, playing Crysis 3, an old game: http://www.hardocp.com/article/2014/05/13/amd_radeon_r9_295x2_xfx_290x_dd_trifire_review/4#.VEFTkPldVps Quadfire 290Xs can't even get a 60fps average at 4k (58.2, so it's very close), and the 980s and 295X2 trade blows at 4k, with AMD winning half and Nvidia the other. You could most likely achieve 4k 120Hz on some setups, but it would require turning off AA and maybe turning down some other settings depending on the game; no setup will max every game at 4k 120Hz.
> 
> And as for the quadfire for 3x 4k: umm, not happening, lol, unless you're playing on low. 3x 1440p taxes quadfire 290Xs, with a lot of games not reaching 60fps when maxed out, and 4k has more than twice the pixels of 1440p.


Trifire can play 4k: ultra settings but no AA. If I up the res scale to 200%, which is past 4k, my framerate is between 60-115 ish...

From what I hear, AA at 4k isn't needed. I don't know how true that is, as I don't have a 4k monitor.


----------



## DampMonkey

Quote:


> Originally Posted by *bluedevil*
> 
> You wouldn't.


Actually, you do. 4k has jaggies just like any other res


----------



## bluedevil

Quote:


> Originally Posted by *DampMonkey*
> 
> Actually, you do. 4k has jaggies just like any other res


Except they are smaller, thus lowering the need to have AA.


----------



## sugarhell

Quote:


> Originally Posted by *bluedevil*
> 
> Except they are smaller, thus lowering the need to have AA.


Depends on the graphics engine. Some engines have so many jaggies that they need AA; others produce jaggies that can be fixed with just the resolution.


----------



## Cyber Locc

Quote:


> Originally Posted by *Sgt Bilko*
> 
> They aren't my screencaps but they are running in 11520 x 2160 and 11520 divided by 3 = 3840 so that's 3 x 3840 x 2160 Monitors.


I believe you that they said it's 3x 4k, but it isn't. I sometimes use Eyefinity with some old 4:3 aspect ratio screens, and when I do, I get what you just linked, as the wings are a different color ratio than the main one. Nvidia just did a demo with 4 Titan Blacks and they didn't get close to 60fps in a racing game. Google is your friend; no one is running 12k, period, especially at ultra with quad 290s. Quad 290s get taxed by 3x 1440p, which is around 8k, and 12k is a large jump.


----------



## Sgt Bilko

Quote:


> Originally Posted by *cyberlocc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> They aren't my screencaps but they are running in 11520 x 2160 and 11520 divided by 3 = 3840 so that's 3 x 3840 x 2160 Monitors.
> 
> 
> 
> I believe you that they said it's 3x 4k, but it isn't. I sometimes use Eyefinity with some old 4:3 aspect ratio screens, and when I do, I get what you just linked, as the wings are a different color ratio than the main one. Nvidia just did a demo with 4 Titan Blacks and they didn't get close to 60fps in a racing game. Google is your friend; no one is running 12k, period, especially at ultra with quad 290s. Quad 290s get taxed by 3x 1440p, which is around 8k, and 12k is a large jump.

If you are talking about the black shaded rectangle, then that's just part of the benchmark screen for Tomb Raider; it does it in 5760 x 1080 as well.

here's another if you wish though:



Resolution is even written on that one


----------



## DampMonkey

Quote:


> Originally Posted by *bluedevil*
> 
> Except, they are smaller, thus lowering then need to have AA.


Definitely, but you still need AA in most cases. Usually x2 is enough.


----------



## Cyber Locc

Quote:


> Originally Posted by *Sgt Bilko*
> 
> If you are talking about the black shaded rectangle then that's just part of the benchmark screen for Tomb Raider, it does it in 5760 x 1080 as well
> 
> here's another if you wish though:
> 
> 
> 
> Resolution is even written on that one


Yep, that's what I wanted.







That seems more in line, and notice the top fps is 63, lol. Nice to see that's possible; quad 290s is what I'm currently working on







But I don't like Eyefinity, it makes my stomach hurt. Also, it appears AA is off there as well, which is understandable. But as said, some notice the difference and some don't; it's up to the user.


----------



## Sgt Bilko

Quote:


> Originally Posted by *cyberlocc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> If you are talking about the black shaded rectangle then that's just part of the benchmark screen for Tomb Raider, it does it in 5760 x 1080 as well
> 
> here's another if you wish though:
> 
> 
> 
> Resolution is even written on that one
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yep, that's what I wanted.
> 
> 
> 
> 
> 
> 
> 
> That seems more in line, and notice the top fps is 63, lol. Nice to see that's possible; quad 290s is what I'm currently working on
> 
> 
> 
> 
> 
> 
> 
> But I don't like Eyefinity, it makes my stomach hurt
Click to expand...

All you needed to do was save the picture and check the res out for yourself. I don't know what settings he used for Valley, but I do know that 4K Eyefinity is possible on current gen so long as you are prepared to turn down a few settings in game.


----------



## Cyber Locc

Quote:


> Originally Posted by *Sgt Bilko*
> 
> All you needed to do was save the picture and check the res out for yourself, I don't know what settings he used for Valley but i do know that 4k eyefinity is possible on current gen so long as you are prepared to turn down a few settings in game.


He used ultra settings, so maxed out, though without AA; Valley will say if AA is on.


----------



## sinnedone

I don't think 4K gaming is quite there yet. Honestly, one card barely runs 30 fps, which is horrible when gaming. It takes 3-4 cards to keep up at 4K 60fps at higher settings. I've been watching videos and reading for the past year on what it would take to push games at 4K, and the averages and mins honestly look like total crap.

It was such a difference going from 60Hz 1080p to 96-120Hz 1440p that honestly that is my personal standard. I mean, at the very minimum you need to keep your minimum frames at 60fps, which is pretty difficult at 4K.
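The 60 fps minimum being described here is really a per-frame time budget; a quick sketch of why holding minimums gets harder as the refresh rate goes up:

```python
# Convert an fps target into the time budget the GPU has per frame.
def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 96, 120):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):.2f} ms per frame")
# Dropping below 60 fps means a frame took longer than ~16.7 ms;
# at 96-120 Hz the budget shrinks to roughly 10.4-8.3 ms, which is
# why the minimum fps, not the average, is the number that hurts.
```

A single slow frame that blows its budget is what you feel as a stutter, regardless of what the average says.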


----------



## sugarhell

Quote:


> Originally Posted by *sinnedone*
> 
> I don't think 4K gaming is quite there yet. Honestly, one card barely runs 30 fps, which is horrible when gaming. It takes 3-4 cards to keep up at 4K 60fps at higher settings. I've been watching videos and reading for the past year on what it would take to push games at 4K, and the averages and mins honestly look like total crap.
> 
> It was such a difference going from 60Hz 1080p to 96-120Hz 1440p that honestly that is my personal standard. I mean, at the very minimum you need to keep your minimum frames at 60fps, which is pretty difficult at 4K.


Yeah, because some stupid sites want to run 4K with 4xMSAA or SSAA. 3 cards are more than enough for 4K.


----------



## sinnedone

Quote:


> Originally Posted by *sugarhell*
> 
> Yeah, because some stupid sites want to run 4K with 4xMSAA or SSAA. 3 cards are more than enough for 4K.


3 high-end cards are the only ones that are going to get you there right now. I honestly look at the minimum frames most of the time rather than the average; highest fps is just for epeen and useless when it comes to judging gameplay.

Yes, some sites do turn everything up to ultra as is the norm, but some don't. Some people say they still need AA at 4K while others don't. I honestly would take the worst-case scenario and then know it can only get better from there.







That said, I am one of those people that likes to go all out as far as eye candy goes, so that's more personal preference there.


----------



## DampMonkey

Quote:


> Originally Posted by *sinnedone*
> 
> I don't think 4K gaming is quite there yet. Honestly, one card barely runs 30 fps, which is horrible when gaming. It takes 3-4 cards to keep up at 4K 60fps at higher settings. I've been watching videos and reading for the past year on what it would take to push games at 4K, and the averages and mins honestly look like total crap.
> 
> It was such a difference going from 60Hz 1080p to 96-120Hz 1440p that honestly that is my personal standard. I mean, at the very minimum you need to keep your minimum frames at 60fps, which is pretty difficult at 4K.


It really depends on the game. The majority of games run great at 4K. Some of the big-name ones that are go-tos for benchmarks, however, do not. From my experience, BF4, Metro: LL, and Crysis 3 are rough at 4K. You're definitely going to want at least 2 GPUs if you want all the eye-candy settings and anywhere near 60fps. CS:GO and any other Source game, however, I run on a single card and average 100+ easily. Depends on what you plan to play, I guess you could say.

Also, Mantle helps a lot with the min frames in BF4. It's a noticeable improvement over DX11, even if the avg fps gains aren't huge.


----------



## chiknnwatrmln

Random question for anybody who might know a bit about power supplies:
If my PSU (Corsair AX860) is modular and has PCIe power cables that go from one connector at the power supply to two connectors at the graphics card, is it OK to run my card off one cable? I mean, it's run fine at high voltages for a year now, but I just randomly thought of this. I am drawing hundreds of watts through a single PCIe power connector.


----------



## battleaxe

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Random question for anybody who might know a bit about power supplies:
> If my PSU (Corsair AX860) is modular and has PCIe power cables that go from one connector at the power supply to two connectors at the graphics card, is it OK to run my card off one cable? I mean, it's run fine at high voltages for a year now, but I just randomly thought of this. I am drawing hundreds of watts through a single PCIe power connector.


There were several guys (miners) that burnt the power connections on their cards doing that - burnt the connectors where they solder into the card. What happens is the increased resistance of the cables (not enough cross-section of wire) creates heat. The more heat, the more resistance; the more resistance... well, it's a cycle, which can end in burnt connections. FYI.

So is it possible to kill a card like this? Yes.


----------



## chiknnwatrmln

Interesting, thanks for sharing that. I'm gonna have to buy some single-connector cables; my rig would look goofy with four extra connectors dangling.


----------



## DampMonkey

Just an FYI for everyone: I've been running the PT1 bios for a while now, but for the last few months I've experienced a lot of issues with crossfire 290x's not getting 100% utilization on both cards. In most cases I would get better performance running 1 card rather than both. This was a sporadic occurrence, though, and it was difficult to nail down what was causing the issue. I tried different RAM, operating systems, drivers, and driver settings, but nothing seemed to work. Turns out it was the bios causing the issue. I'd recommend using the Asus.rom bios rather than the PT1 or PT3 bios for full-time use. They are still good for benching and hard overvolting, but inconsistent for daily usage.


----------



## pkrexer

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Random question for anybody who might know a bit about power supplies:
> If my PSU (Corsair AX860) is modular and has PCIe power cables that go from one connector at the power supply to two connectors at the graphics card, is it OK to run my card off one cable? I mean, it's run fine at high voltages for a year now, but I just randomly thought of this. I am drawing hundreds of watts through a single PCIe power connector.


Some people will suggest running two separate power cables from the power supply. I for one have had no issues running my 290x's with one power cable. I've benched them up to +200mv and didn't experience any problems.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *pkrexer*
> 
> Some people will suggest running two separate power cables from the power supply. I for one have had no issues running my 290x's with one power cable. I've benched them up to +200mv and didn't experience any problems.


I have also gone up to +220mV without an issue, however I'd rather be safe than sorry. The cables seem pretty thick, so I asked Corsair to get their opinion.

Once I get my 2nd GPU I'm done overvolting, anyway.


----------



## Devildog83

There are cards out there that say individual cables are a must; I would guess a 295X2 would be one. I don't like the looks of a daisy-chained cable, though, so even when I had a 7870 and R9 270X I ran one cable per connector (4 total 6-pin connectors), but that was more for looks than anything else. I run 2 for my 290 also.

Do you think it makes a difference if you have a single 12V rail?


----------



## tsm106

It's all going to the same power source. Only time it would make a real difference is if you are heavily modded and are drawing something silly like 800w per card.


----------



## Ironsmack

Out of curiosity, I installed 3DMark and wanted to see what my score was, although it sucks I couldn't use Firestrike Ultra. During the benchmarking I saw a lot of artifacts; I may have to up the voltage a bit more.



Is that okay for crossfired 290's OC'd to 1150/1380 with +69mV?


----------



## amalinkin

Hi everyone.
I have a reference R9 290X, Gigabyte GV-R929XD5-4GD-B; the memory is Elpida, and I've installed Arctic Hybrid 2 water cooling.

I usually use MSI AB 4 and TriXX (GPU-Z for monitoring).
I overclocked the core to 1130MHz with +38mV (+31mV also works), AUX +6mV, Power +50%, ULPS disabled, Force constant voltage enabled.
Memory speed is 1375MHz - effective 5500.
The situation is the same with both versions of the BIOS - uber and silent.
All temps are under 70C, core max 63.

So:
1. Is it normal that the memory can't run at even 1400MHz? (I saw the description of the RAM chips and was confused, because by specification the normal speed is 1500.) If not, is there any way to get more memory frequency (maybe some BIOS for Elpida memory exists)?
I also tried starting the memory at 1500 with core +100mV and AUX +50mV, and even at 1400 I get a black screen.

2. What is the max safe AUX voltage?

3. After going above 1150MHz core, the card throttles the core to a lower frequency. How do I avoid this?

Thank you.
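The "effective 5500" figure in the post above follows from GDDR5 being quad data rate; a quick check (512-bit is the reference R9 290/290X bus width):

```python
# GDDR5 moves 4 bits per pin per clock, so effective rate = 4x memory clock.
mem_clock_mhz = 1375
effective_mt_s = mem_clock_mhz * 4            # the "effective 5500" above
bus_width_bits = 512                          # reference R9 290/290X bus
bandwidth_gb_s = effective_mt_s * 1e6 * bus_width_bits / 8 / 1e9
print(effective_mt_s, bandwidth_gb_s)  # 5500 352.0
```

The wide 512-bit bus is also why modest memory overclocks matter less on these cards than on narrower-bus GPUs: there is already a lot of bandwidth at stock.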


----------



## Dasboogieman

Quote:


> Originally Posted by *Mydog*
> 
> Thanks I'll do that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @Red1776
> 
> I can on visual inspection even without taking of the cooler confirm that my card has the golden SSC chokes
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm now on bios 1 but memory voltage are locked and core voltage only goes to +100mv, are there a special version of AB that I need to use?


I've got the same MSI card as you, gold SSC chokes. If you want more voltage, get Sapphire TriXX. The VRMs on this MSI design run much cooler (at least 15 degrees less) than the reference design. The only issue is the core temps are higher than on the Tri-X.


----------



## Red1776

Quote:


> Originally Posted by *Dasboogieman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mydog*
> 
> Thanks I'll do that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @Red1776
> 
> I can on visual inspection even without taking of the cooler confirm that my card has the golden SSC chokes
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm now on bios 1 but memory voltage are locked and core voltage only goes to +100mv, are there a special version of AB that I need to use?
> 
> 
> 
> I've got the same MSI card as you, gold SSC chokes. If you want more voltage, get Sapphire Trixx. The VRMs on this MSI design is much cooler (at least 15 degrees less) than the reference design. The only issue is the core temps are higher than the Tri X.
Click to expand...

You need to (on the latest AB) enable 'unofficial overclocking'; it lets you exceed the default voltage limit.

You no longer need to go into the .cfg file and change unofficial overclocking to '1'.

If you need more help, let me know.

Greg (Red1776)


----------



## Aussiejuggalo

Got my 2 new Dell U2414Hs set up, but CCC decided to set one of them to underscan 2% and the YCbCr pixel format.







Anyone else have this problem with their 290? One's on HDMI, the other's on DP.

On another note, gonna be WC'ing my Tri-X in the coming weeks







(already have all the parts but my back's been too sore the last week







)


----------



## StrongForce

Quote:


> Originally Posted by *amalinkin*
> 
> Hi everyone.
> I have a reference R9 290X, Gigabyte GV-R929XD5-4GD-B; the memory is Elpida, and I've installed Arctic Hybrid 2 water cooling.
> 
> I usually use MSI AB 4 and TriXX (GPU-Z for monitoring).
> I overclocked the core to 1130MHz with +38mV (+31mV also works), AUX +6mV, Power +50%, ULPS disabled, Force constant voltage enabled.
> Memory speed is 1375MHz - effective 5500.
> The situation is the same with both versions of the BIOS - uber and silent.
> All temps are under 70C, core max 63.
> 
> So:
> 1. Is it normal that the memory can't run at even 1400MHz? (I saw the description of the RAM chips and was confused, because by specification the normal speed is 1500.) If not, is there any way to get more memory frequency (maybe some BIOS for Elpida memory exists)?
> I also tried starting the memory at 1500 with core +100mV and AUX +50mV, and even at 1400 I get a black screen.
> 
> 2. What is the max safe AUX voltage?
> 
> 3. After going above 1150MHz core, the card throttles the core to a lower frequency. How do I avoid this?
> 
> Thank you.


How do you change the AUX and power settings, and what does the ULPS thing do? Should I disable it on mine (HD 7950)?


----------



## Sgt Bilko

Quote:


> Originally Posted by *StrongForce*
> 
> Quote:
> 
> 
> 
> Originally Posted by *amalinkin*
> 
> Hi everyone.
> I have a reference R9 290X, Gigabyte GV-R929XD5-4GD-B; the memory is Elpida, and I've installed Arctic Hybrid 2 water cooling.
> 
> I usually use MSI AB 4 and TriXX (GPU-Z for monitoring).
> I overclocked the core to 1130MHz with +38mV (+31mV also works), AUX +6mV, Power +50%, ULPS disabled, Force constant voltage enabled.
> Memory speed is 1375MHz - effective 5500.
> The situation is the same with both versions of the BIOS - uber and silent.
> All temps are under 70C, core max 63.
> 
> So:
> 1. Is it normal that the memory can't run at even 1400MHz? (I saw the description of the RAM chips and was confused, because by specification the normal speed is 1500.) If not, is there any way to get more memory frequency (maybe some BIOS for Elpida memory exists)?
> I also tried starting the memory at 1500 with core +100mV and AUX +50mV, and even at 1400 I get a black screen.
> 
> 2. What is the max safe AUX voltage?
> 
> 3. After going above 1150MHz core, the card throttles the core to a lower frequency. How do I avoid this?
> 
> Thank you.
> 
> 
> 
> How do you change the AUX and power settings, and what does the ULPS thing do? Should I disable it on mine (HD 7950)?
Click to expand...

ULPS (Ultra Low Power State) is fine for single-GPU setups, but it can cause issues with Crossfire, so it's easier to disable it.

AUX voltage is listed in MSI Afterburner in the little "+" tab next to Core Voltage; it just raises the voltage supplied to the PCIe slot (not that helpful, really).


----------



## Forceman

Quote:


> Originally Posted by *amalinkin*
> 
> Hi everyone.
> I have a reference R9 290X, Gigabyte GV-R929XD5-4GD-B; the memory is Elpida, and I've installed Arctic Hybrid 2 water cooling.
> 
> I usually use MSI AB 4 and TriXX (GPU-Z for monitoring).
> I overclocked the core to 1130MHz with +38mV (+31mV also works), AUX +6mV, Power +50%, ULPS disabled, Force constant voltage enabled.
> Memory speed is 1375MHz - effective 5500.
> The situation is the same with both versions of the BIOS - uber and silent.
> All temps are under 70C, core max 63.
> 
> So:
> 1. Is it normal that the memory can't run at even 1400MHz? (I saw the description of the RAM chips and was confused, because by specification the normal speed is 1500.) If not, is there any way to get more memory frequency (maybe some BIOS for Elpida memory exists)?
> I also tried starting the memory at 1500 with core +100mV and AUX +50mV, and even at 1400 I get a black screen.
> 
> 2. What is the max safe AUX voltage?
> 
> 3. After going above 1150MHz core, the card throttles the core to a lower frequency. How do I avoid this?
> 
> Thank you.


The limiting factor for memory overclocks seems to be the memory controller on these cards, not the memory chips themselves. Raising core voltage can help with getting higher memory clocks, but if you are at +100mV and it isn't getting better then you are probably just out of luck. The good thing is that the memory clock doesn't make that much difference to performance.

For the throttling, make sure you have the power limit maxed out, and make sure it isn't thermal throttling.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Got my 2 new Dell U2414Hs set up, but CCC decided to set one of them to underscan 2% and the YCbCr pixel format.
> 
> 
> 
> 
> 
> 
> 
> Anyone else have this problem with their 290? One's on HDMI, the other's on DP.
> 
> On another note, gonna be WC my Tri-X in the coming weeks
> 
> 
> 
> 
> 
> 
> 
> (already have all the parts but my back's been too sore the last week
> 
> 
> 
> 
> 
> 
> 
> )


Hey there








It's 'cause you're running DP and HDMI. Try using DP and DVI... Also, in CCC > My digital flat panels > Properties > Enable GPU scaling > scale image to full panel size to adjust.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Hey there
> 
> 
> 
> 
> 
> 
> 
> 
> It's 'cause you're running DP and HDMI. Try using DP and DVI... Also, in CCC > My digital flat panels > Properties > Enable GPU scaling > scale image to full panel size to adjust.


Thought so. I can't run DP and DVI; these monitors only have DP and HDMI, lol.

I found the GPU scaling and colour options after a while; still kind of annoying, but it seems to be running right now.

Would be lovely if AMD started doing full-DP cards again.


----------



## HOMECINEMA-PC

Tri monitors: one DP and 2 DVIs on the card, but all of the monitors have DP....... but once I start punching up the vcore







the DP / DVI / HDMI goes wonky and it's black screen benchmarking time







LooooooL

edit: I can tell if it's still running the bench though.......... coil whine


----------



## Mega Man

Yes. Single-slot-able 6x DP cards = dream.


----------



## HOMECINEMA-PC

Just like an IVB-E that will bench at 5 gigs
and
EVGA / Gigabyte support too


----------



## chiknnwatrmln

Hey Arizonian, update me to CF XFX 290 + Tri-X OC 290x. Both ref boards under water. I will post a GPUz screenshot once my loop is done being bled, along with some prettier pictures. Thanks


Spoiler: Warning: Spoiler!


----------



## maynard14

Guys, why is it that 3DMark11, after running it twice, gives me different results?









heres my first run on my 290x gpu core 1100 and gpu memory 1350 no voltage added

http://www.3dmark.com/3dm11/8851094

2nd run

also 1100 and 1350 oc

http://www.3dmark.com/3dm11/8851109

Can someone shed some light?


----------



## Spectre-

Quote:


> Originally Posted by *maynard14*
> 
> Guys, why is it that 3DMark11, after running it twice, gives me different results?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> heres my first run on my 290x gpu core 1100 and gpu memory 1350 no voltage added
> 
> http://www.3dmark.com/3dm11/8851094
> 
> 2nd run
> 
> also 1100 and 1350 oc
> 
> http://www.3dmark.com/3dm11/8851109
> 
> can someone shed some light


If you aren't waiting long enough, it could just be temps... usually I turn my PC off after one run and then do another one.


----------



## Arizonian

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Hey Arizonian, update me to CF XFX 290 + Tri-X OC 290x. Both ref boards under water. I will post a GPUz screenshot once my loop is done being bled, along with some prettier pictures. Thanks
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated


----------



## maynard14

Quote:


> Originally Posted by *Spectre-*
> 
> If you aren't waiting long enough, it could just be temps... usually I turn my PC off after one run and then do another one.


Oh, I see, I didn't know that; after the first test I ran the 2nd test immediately.

Now I know.

Thank you so much. I'll try to run it later and see if my scores are the same, or at least close to the highest score I got.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> The limiting factor for memory overclocks seems to be the memory controller on these cards, not the memory chips themselves. Raising core voltage can help with getting higher memory clocks, but if you are at +100mV and it isn't getting better then you are probably just out of luck. The good thing is that the memory clock doesn't make that much difference to performance.
> 
> For the throttling, make sure you have the power limit maxed out, and make sure it isn't thermal throttling.


Going from 1250 to 1500 I did notice a difference in games. In 3DMark I tested up to 1650 and it made no difference over 1500. Maybe at 4K it can be different.


----------



## th3illusiveman

In BF4, do you guys get some shadow flickering? Usually when you zoom in through a sniper scope? What about some small random texture flickering every now and then (usually on a wall, sometimes a blood patch)?


----------



## Klocek001

Quote:


> Originally Posted by *th3illusiveman*
> 
> in BF4 do you guys get some shadow flickering? Usually when you zoom in through a sniper scope? What about some small random texture flickering every now and then (usually on a wall sometimes blood patch)


What drivers are you using? 14.7 had some mad flickering in the few games I played; I went back to 14.4 immediately and that solved everything. I didn't really try 14.9 yet.


----------



## th3illusiveman

Quote:


> Originally Posted by *Klocek001*
> 
> what drivers are you using ? 14.7 had some mad flickering in the few games I played, I went back to 14.4 immediately and that solved everything. I didn't really try 14.9 yet.


14.9.0. It's not really horrible, and it doesn't affect gameplay or graphical quality that much, but it does happen and I would prefer it didn't.


----------



## Klocek001

Did it happen on 14.4 ?


----------



## th3illusiveman

Quote:


> Originally Posted by *Klocek001*
> 
> Did it happen on 14.4 ?


I've never used 14.4.


----------



## Buehlar

I also have some flicker in BF4 with 14.9


----------



## chiknnwatrmln

I get some z-fighting on certain textures at a long distance. I honestly think it's just the game, I messed with the settings a bunch and couldn't get it to go away.


----------



## Klocek001

Quote:


> Originally Posted by *th3illusiveman*
> 
> i've never used 14.4


Try 'em.


----------



## amalinkin

Quote:


> Originally Posted by *Forceman*
> 
> The limiting factor for memory overclocks seems to be the memory controller on these cards, not the memory chips themselves. Raising core voltage can help with getting higher memory clocks, but if you are at +100mV and it isn't getting better than you are probably just out of luck. Good thing is that the memory clock doesn't make that much difference to performance.
> 
> For the throttling, make sure you have the power limit maxed out, and make sure it isn't thermal throttling.


Thank you for the answer.
Power limit is +50% (288W in GPU-Z).
Max temp 65C.

Anyway, judging by the monitoring in MSI AB or GPU-Z, +62mV doesn't result in a real +62mV with dynamic voltages.


----------



## Klocek001

I don't think the 290 series needs any memory OC in the first place.


----------



## jagdtigger

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Tri monitors one DP and 2 DVI's on card but all have DP on the monitors ....... but once I start punching up the vcore
> 
> 
> 
> 
> 
> 
> 
> the DP / DVI / HDMI goes wonky and its black screen benchmarking time
> 
> 
> 
> 
> 
> 
> 
> LooooooL
> 
> edit : I can tell if its still running the bench though .......... coil whine


My card is doing the same; the only difference is that I use 2xDVI + 1xHDMI...


----------



## Wezzor

Quote:


> Originally Posted by *Buehlar*
> 
> I also have some flicker in BF4 with 14.9


Same here. I've tried everything to get rid of it but nothing works.


----------



## rdr09

Quote:


> Originally Posted by *maynard14*
> 
> Guys, why is it that 3DMark11, after running it twice, gives me different results?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> heres my first run on my 290x gpu core 1100 and gpu memory 1350 no voltage added
> 
> http://www.3dmark.com/3dm11/8851094
> 
> 2nd run
> 
> also 1100 and 1350 oc
> 
> http://www.3dmark.com/3dm11/8851109
> 
> can someone shed some light


It's your physics scores that varied. You prolly have turbo enabled.
Quote:


> Originally Posted by *Wezzor*
> 
> Same here. I've tried everything to get rid of it but nothing works.


try this.

I used it on both single and crossfire 290s, and the same with my 7950; it works with no issues. Although 14.9 never gave me issues either.

BTW, I don't use a third-party app to uninstall the driver, just the new driver alone. It's pretty similar to going to Device Manager to uninstall the old driver.


----------



## maynard14

Quote:


> Originally Posted by *rdr09*
> 
> It's your physics scores that varied. You prolly have turbo enabled.
> Try this.
> 
> I used it on both single and crossfire 290s, and the same with my 7950; it works with no issues. Although 14.9 never gave me issues either.
> 
> BTW, I don't use a third-party app to uninstall the driver, just the new driver alone. It's pretty similar to going to Device Manager to uninstall the old driver.


What turbo, sir? Should I turn it off while benching? And how can I turn it off? Maybe that's why I could not pass 4.4GHz at 1.3V on my 4770k.

And lastly, sir,

here is my graph after benching Shadow of Mordor, GPU core 1100 and GPU memory 1350, default voltage:



On the graph my GPU core is kinda unstable? Or is that normal? Whereas my GPU memory is a straight line at 1350.

Please help, I just want to learn how to stabilize my OC.

Thank you.


----------



## Wezzor

Quote:


> Originally Posted by *rdr09*
> 
> Try this.
> 
> I used it on both single and crossfire 290s, and the same with my 7950; it works with no issues. Although 14.9 never gave me issues either.
> 
> BTW, I don't use a third-party app to uninstall the driver, just the new driver alone. It's pretty similar to going to Device Manager to uninstall the old driver.


Thank you! I'll definitely check it out. BTW, is it safe to use modded GPU drivers?


----------



## Mydog

So, no water block fits my R9 290X Twin Frozr, but at least I can now get 1.346 vcore through TriXX. There's no memory voltage control in MSI AB or TriXX, though, and of course temps go through the roof on air.


----------



## rdr09

Quote:


> Originally Posted by *Wezzor*
> 
> Thank you! I'll definitely check it out. Btw is it safe to use modded GPU drivers?


The one I linked? I am using it right now on both my sig rigs.

See post #3. NEK4TE hosted it for easy download.

DL the driver, run it as if you are about to install, pick Uninstall, then Reboot. It takes about 2 minutes to uninstall the old driver.

Run it again, to Install this time. Before rebooting, go to msconfig and uncheck CCC under Startup (unless using crossfire).

I use the Express method for both procedures (Uninstall and Install). The Windows logon may take a while to show up - just wait. And wait.


----------



## Wezzor

Quote:


> Originally Posted by *rdr09*
> 
> the one i linked? i am using it rightnow on both my sigs.
> 
> see post#3. NEK4TE hosted it for easy download.
> 
> DL the driver, run it as if you are about to install, pick Uninstall, then Reboot. Takes about 2 minutes to uninstall old driver.
> 
> Run it again to Install this time. Before Rebooting, go to msconfig and uncheck CCC under startup (unless using crossfire).
> 
> I use Express method for both procedures (Uninstall and Install). Windows log on may take awhile to show up - just wait. And wait.


Alright! Hopefully they will fix my flickering in BF4. I'll come back with the results.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Wezzor*
> 
> Alright! Hopefully they will fix my flickering in BF4. I'll come back with the results.


What flickering are you talking about? I have always had flickering in BF3 and BF4 when I look far away through a sniper scope.


----------



## Wezzor

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What flickering are you talking about? I have always had flickering in BF3 and BF4 when I look far away through a sniper scope.


Well, I would actually call it an artifact. It's like 3-6 small black square boxes that appear on the screen sometimes when I play. One day I can play for 2 hours without a single one, and the next day I get them multiple times. I have tried increasing the voltage but it won't help, and I don't really want to lower the memory clock or core clock only because of BF4. It started after the big patch, and BF4 is the only game I get it in. It's the same with Vsync: I can't use it anymore in BF4 because I get FPS drops, whereas before I would always have a stable 60FPS that never dropped. I tested like 10 other games plus Valley, and the problem ONLY appears in BF4. I run the card at 1100/1400 with a 50% power limit.


----------



## bond32

I'm still not convinced the PT1 bios is really the better option for benching... Seems I get the same or worse scores with it. Anyone else have thoughts?

Edit: Is anyone familiar enough with AB to "fake" it into thinking all three of my cards are the same? I have one 290X and two 290s; any time I want to change the clocks I have to enter the clocks for the 290X, then go into settings and switch to the 290s, then enter their clocks. Having identical bioses doesn't work, even with the box checked in settings to adjust clocks of similar GPUs.

Edit again: I only ask because GPU Tweak will adjust all three cards the same when I set the clocks once. But GPU Tweak is wonky... AB seems to be the better option.


----------



## StrongForce

Right now I can get a stock-cooler R9 290 for $200 shipping included; should I just go for it?









Then I might stick a liquid cooler on it, depending on how noisy the thing is during gaming. I don't really mind spending $100 on a Kraken G10, Corsair H55, and probably a Gelid VRM cooling kit, since at least the H55 can be reused, and the Kraken can probably be sold easily when it's not needed anymore, right?









Of course, if the 970s here were the same price as the ones in the USA I would go that way, but most of 'em are $400 currently...

What do y'all think?


----------



## BradleyW

Has anyone experienced FPS drops in Borderlands 3 in large environments? One CPU core is pegged at 99% and GPU usage drops (Crossfire).


----------



## kizwan

Quote:


> Originally Posted by *maynard14*
> 
> Guys, why is it that 3DMark11, after running it twice, gives me different results?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> heres my first run on my 290x gpu core 1100 and gpu memory 1350 no voltage added
> 
> http://www.3dmark.com/3dm11/8851094
> 
> 2nd run
> 
> also 1100 and 1350 oc
> 
> http://www.3dmark.com/3dm11/8851109
> 
> can someone shed some light


Just small variation; it's normal. Usually when benching I disable SpeedStep, C1E, C3, C6 & C7.


----------



## caenlen

Is a 22,200 graphics score in 3DMark Fire Strike with 290X CF normal? Not overclocked. That's what I got (not the overall Fire Strike score, just the graphics score).


----------



## chronicfx

Yes, it's about 22k.


----------



## battleaxe

Quote:


> Originally Posted by *StrongForce*
> 
> Right now I can get a stock-cooler R9 290 for $200, shipping included. Should I just go for it?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Then I might stick a liquid cooler on it, depending on how noisy the thing is during gaming. I don't really mind spending $100 on a Kraken G10, a Corsair H55, and probably a Gelid VRM cooling kit, since the H55 at least can be reused, and the Kraken could probably be sold easily when it's no longer needed, right?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Of course, if 970s here were the same price as the ones in the USA I would go for one, but most of them are currently $400.
> 
> What do y'all think?


Well, I certainly think that's a good idea. I personally have H80s on my 290s, with the Gelid VRM coolers on the VRMs. I don't use the Kraken G10 at all and what I have works great. Temps are much lower than stock and well within safe ranges. The additional cost this way is about $50-60 more per card if you get the water coolers on sale like I did. I am really happy with this setup. I also have Nvidia, so I'm no fanboy: I have GTX 670s in SLI and the R9 290s in CrossFire on another machine. I like both platforms and have no allegiance to either. I like what works. The 290s are beast cards to be sure, and at $200 I don't see anything that could beat them. Not even at $300, in my mind.


----------



## ArchieGriffs

I'm kind of wanting to take advantage of the 290 sale for the MSI gaming right now: http://www.newegg.com/Product/Product.aspx?Item=N82E16814127774&cm_re=r9_290-_-14-127-774-_-Product

Not sure how much of a pain it would be to CrossFire it with my existing Tri-X, though. Obviously the same memory amount, and MSI still has Hynix (right?). I'm getting conflicting reports about MSI's board, though: some say it's not the reference PCB, others just say custom components (obviously). So if I were to put both under water, more specifically under the HG10 bracket, I still shouldn't have any issues, right?

I have a feeling we're not going to get much as far as Black Friday sales go, as AMD said something about expecting lower sales despite the upcoming holiday season, so I'm not sure it's going to get much better than this.

Has anyone who has RMA'ed an MSI Gaming after taking off the stickers gotten it back with no problems? From what I heard over on the MSI forums, you can take them off to watercool, change thermal paste, etc.; they're only there to tell MSI whether to inspect for user damage, and they don't affect the RMA outcome assuming nothing is damaged.

Back when I used to stalk R9 290 reviews for different manufacturers, MSI was definitely the #2 choice (before the PCS+ and Vapor-X came out), and most if not all of the advantages the Tri-X has over the MSI look like they'll disappear once I watercool and do a BIOS flash. Has anyone had experience CrossFiring them together to confirm?


----------



## battleaxe

Quote:


> Originally Posted by *ArchieGriffs*
> 
> I'm kind of wanting to take advantage of the 290 sale for the MSI gaming right now: http://www.newegg.com/Product/Product.aspx?Item=N82E16814127774&cm_re=r9_290-_-14-127-774-_-Product
> 
> Not sure how much of a pain it would be to crossfire it with my existing Tri-X though, obviously the same memory amount, and MSI still has hynix (right?). I'm getting conflicting reports about the MSI's board though, some are saying it's not the reference PCB, others are just saying custom components (obviously), so if I were to put both under water, more specifically the HG10 bracket, I still shouldn't have any issues right?
> 
> I have a feeling we're not going to be getting much as far as black Friday sales go as AMD said something about them expecting a lower sales despite the upcoming holiday season, so I'm not sure it's going to get much better than this.
> 
> Has anyone who has RMA'ed a MSI gaming and has taken off the stickers gotten it back with no problems? From what I heard over on the MSI forums you can take them off to watercool/change thermal past etc. and it's only there to tell them whether or not to inspect for user damage and doesn't effect the RMA outcome assuming it isn't damaged.
> 
> Back when I used to stalk R9 290 reviews for different manufacturers MSI was definitely the #2 choice (before the PCS+ Vapor X came out), and most if not all the advantages the Tri-X has over the MSI looks like they'll disappear when I watercool, and do a bios flash, has anyone had experience xfiring them together to confirm?


The Gaming does have Hynix RAM. I can't comment on whether it would have a problem CrossFiring with a Tri-X, though it shouldn't be an issue. I have no idea on the block configs; someone else will have to help there. If it were me I'd just pony up the last $30 and get another Tri-X. It's not that much more.


----------



## StrongForce

Quote:


> Originally Posted by *battleaxe*
> 
> Well, I certainly think that's a good idea. I personally have h80's on my 290's. And the Gelid VRM coolers on the VRM's. I don't use the Kraken g10 at all and what I have works great. Temps are much lower than stock and well within safe ranges. Additional cost this way is about $50-60 more per card if you get the water coolers on sale like I did. I am really happy with this setup. I also have Nvidia, so I'm no fan-boy. I have GTX670's in SLI and the R9 290's in crossfire on another machine. I like both platforms and have no allegiance to either. I like what works. The 290's are beast cards to be sure and at $200.00 I don't see anything that could beat them. Not even at $300.00 in my mind.


Yea... getting more and more tempted as time goes on. If I were in the USA at those prices I would definitely go for a 970 at $330 or so.

Although I've heard about the coil whine problems, and the locked overclocking voltage or something. Let's hope they patch that!

But here the cheapest is like $400 (for some odd reason; in the USA, Newegg is apparently constantly out of stock and prices haven't moved, while here we have stock).


----------



## battleaxe

Quote:


> Originally Posted by *StrongForce*
> 
> Yea... getting more and more tempted as time goes on. If I were in the USA at those prices I would definitely go for a 970 at $330 or so.
> 
> Although I've heard about the coil whine problems, and the locked overclocking voltage or something. Let's hope they patch that!
> 
> But here the cheapest is like $400 (for some odd reason; in the USA, Newegg is apparently constantly out of stock and prices haven't moved, while here we have stock).


If it were me getting one now, this is what I would pull the trigger on. http://www.newegg.com/Product/Product.aspx?Item=N82E16814202103


----------



## ArchieGriffs

Quote:


> Originally Posted by *battleaxe*
> 
> The gaming does have Hynix RAM. I can't comment on whether it would have a problem xfiring with a Tri-X though. It shouldn't be an issue though. I have no idea on the block configs, someone else will have to help there. If it were me I'd just poney up the last $30.00 and get another Tri-X personally. Its not that much more.


It comes down to a $50 difference with the 5% off, plus a pretty decent headset. If it were just a $30 difference I'd get another Tri-X for the simplicity.


----------



## battleaxe

Quote:


> Originally Posted by *ArchieGriffs*
> 
> It comes down to a $50 difference with the 5% off, plus a pretty decent headset. If it were just a $30 difference I'd get another Tri-X for the simplicity.


Or this: http://www.newegg.com/Product/Product.aspx?Item=N82E16814202103

Jeez... just posting that makes me want to buy another one.... I have a problem.


----------



## ArchieGriffs

Quote:


> Originally Posted by *battleaxe*
> 
> Or this: http://www.newegg.com/Product/Product.aspx?Item=N82E16814202103
> 
> Jeez... just posting that makes me want to buy another one.... I have a problem.


Custom PCB :\. Still incredibly tempting. Me hunting for the lowest deal is also me trying to justify not selling my 290 and getting two 970s; it'd be a more costly path, especially since at some point I'd want to watercool those for the heck of it. I guess if you think about it, it's only a $150 difference, assuming you were going to watercool either the 290 or the 970 and needed a PSU upgrade, whereas with the 970 you wouldn't be upgrading a PSU (like myself). I'd definitely make use of the game bundle, though... Grrr, so hard to decide.


----------



## bluedevil

What everyone needs to realize is that the biggest bottleneck in a system is your monitor. If your monitor's refresh rate is 60 Hz, you will not be able to see above that without massive screen tearing. So getting two 290s is moot if you can't display it properly.
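To put rough numbers on that (my sketch, not bluedevil's): a 60 Hz panel draws a new frame every ~16.7 ms, so any frames rendered faster than that are either torn across refreshes or discarded by V-Sync.

```python
def frame_time_ms(fps: float) -> float:
    """Milliseconds spent rendering one frame at a given frame rate."""
    return 1000.0 / fps

REFRESH_HZ = 60  # the panel's refresh rate caps what can actually be shown

for rendered_fps in (60, 100, 144):
    displayed_fps = min(rendered_fps, REFRESH_HZ)
    print(f"{rendered_fps} fps rendered ({frame_time_ms(rendered_fps):.1f} ms/frame) "
          f"-> at most {displayed_fps} fps displayed "
          f"(panel refreshes every {frame_time_ms(REFRESH_HZ):.1f} ms)")
```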


----------



## ArchieGriffs

Quote:


> Originally Posted by *bluedevil*
> 
> What everyone needs to realize is that the biggest bottleneck in a system is your monitor. If your monitor's refresh rate is 60 Hz, you will not be able to see above that without massive screen tearing. So getting two 290s is moot if you can't display it properly.


Very true, but personally that's not my reason for wanting two. Skyrim can't really make use of more than 90 FPS before physics glitches, and I'm typically seeing 30-40 FPS in Skyrim and want to see 50+ with every mod I can throw at it.


----------



## bluedevil

Quote:


> Originally Posted by *ArchieGriffs*
> 
> Very true, but personally that's not my reason for wanting two. Skyrim can't really make use of more than 90 FPS before physics glitches, and I'm typically seeing 30-40 FPS in Skyrim and want to see 50+ with every mod I can throw at it.


Yeah, I realize it depends on a few factors: what game, what resolution, settings, etc. Nothing kills me more than when I see a member saying they get 100 fps in BF4 (or any other FPS) on a 60 Hz panel.


----------



## StrongForce

Quote:


> Originally Posted by *battleaxe*
> 
> If it were me getting one now, this is what I would pull the trigger on. http://www.newegg.com/Product/Product.aspx?Item=N82E16814202103


Ahh man, yeah, I'd like me some Vapor-X, though my AMD FX 8320 already overheats so much that I had to go back down to 4.635 GHz, so water cooling would be nice. I've got a way that an H55 would actually push heat outside the case with my case... I believe, at least.

And yeah, I don't want to start the debate about which card is better, but like I said, if I were in the USA I'd probably go for a 970: less power consumption, lower temps, and slightly more performance according to the benchmarks I saw. But then the 970 is also a bit more expensive, so you pay more, you get more; logical after all!

Though here, like I previously mentioned, prices are higher, and not only Nvidia unfortunately, AMD too. Look at these two pages; the first is the Swiss local shop I usually go to, usually the cheapest prices:

http://www.prodimex.ch/pSearch.aspx?SEARCH=r9+290

Yeah, there is the MSI at 309 ($327)... a bit tempting, I'll admit, though I'd prefer water cooling (and getting a stock-cooler one on eBay). Although, thinking about it, this would most likely mean losing the warranty.

As a point of comparison, let's look at the MSI again here:

http://www.ldlc.com/navigation/r9+290/

that's $376 without shipping, holy.

Something is wrong with the prices here. And to be honest, even in the USA the AMD would need to be like 20 bucks cheaper to be really competitive.

I could try to get the Vapor-X on eBay maybe... that could be a nice plan, and forget about water cooling, hmm.

EDIT: Wow, you know LDLC is a multi-country website (French, Swiss, Belgian, Luxembourgish). The R9 290 Vapor-X at 320 euros converts to 385 Swiss francs, but when I clicked the Swiss flag it shows 330. What the... I guess taxes are different, but I never noticed such a fluctuation in prices.

And Inno3D at 330 Swiss francs, hmm. OK, for now I'll try to get one on eBay; if I can't, we'll see how things go!

And I just looked at some R9 290 Vapor-X reviews on Newegg. It's not looking good!


----------



## Dimaggio1103

Just bought an MSI 290 card. How well do y'all think it will fare with my 1080p Eyefinity setup?


----------



## ArchieGriffs

Quote:


> Originally Posted by *bluedevil*
> 
> Yeah, I realize it depends on a few factors: what game, what resolution, settings, etc. Nothing kills me more than when I see a member saying they get 100 fps in BF4 (or any other FPS) on a 60 Hz panel.


That's on par with someone asking for a mail-in rebate at a post office.


----------



## Bertovzki

Quote:


> Originally Posted by *ArchieGriffs*
> 
> I'm kind of wanting to take advantage of the 290 sale for the MSI gaming right now: http://www.newegg.com/Product/Product.aspx?Item=N82E16814127774&cm_re=r9_290-_-14-127-774-_-Product
> 
> Not sure how much of a pain it would be to crossfire it with my existing Tri-X though


Re: Tri-X, do you have that under water yet? I have never seen this card under water, period!
I have had debates with several PC companies, and my initial letter to Sapphire was also answered with NO! It's not reference!
To which I replied that it has to be, as EK have passed it as reference, and I sent Sapphire pics of the identical PCB, and they then replied that yes, it is indeed the reference design.

I have got the Aquacool Hawaii water block for an R9, but no GPU yet.
I have always intended to get a Tri-X, and I know most of the advantage of one is the heatsink, which will be binned, but the R9 290X Tri-X OC does have more stream processors and probably better 4K performance, even though it loses its main advantage.
Please correct me, anyone who thinks I'm wasting money on a Tri-X, if the above is not true, as it will be under water anyway. Overclocking I'm really not interested in, on either my R9 or my i7-4790K, but I want to have the ability if I wish or need to later.

Re: the R9 sale, I'd say they will only get cheaper and cheaper from this point on. Other cards are always coming out, and the R9 has dropped consistently $200-$250 in my country in just the last 2 weeks. If anything, the reality is that the retailers and AMD will be very keen indeed to get rid of them at a good price while they still have the chance, before they just can't sell them and have to sell at their own original purchase price or less to clear stock. I'd be more focused on watching stock levels, as they may disappear, if you want one.

I have no idea what AMD have lined up after the R9; I've not bothered to look, but I'd say they have something on the way and the retailers know this too.


----------



## bond32

You are wasting your money on the Tri-X, especially if you already have a block. It's a reference PCB with a non-reference cooler... Just get a reference-design card; they go for around $200-$250 now. There is no difference between the Tri-X and a reference card if you're water cooling.


----------



## ArchieGriffs

Quote:


> Originally Posted by *bond32*
> 
> You are wasting your money on the tri-x especially if you already have a block. It's a reference pcb with non reference cooler... Just get a reference design card. They go for around $200 - $250 now. There is no difference if you got the tri-x vs reference card if you're water cooling.


Guaranteed Hynix, unless the reference card you're getting will always come with Hynix; then that point is moot. Hynix has proven to be the better overclocker this time around, if you're worried about something like that. For $50-100 more, though? $50 probably isn't worth the extra money, but at $100, yeah, just get a reference card. I like having a nice backup air cooler, and I also don't plan to watercool it immediately, as the HG10 bracket isn't out yet.
Quote:


> Originally Posted by *Bertovzki*
> 
> Re - Tri X ,do you have that under water yet ? I have never seen this card under water period !
> I have had debates with several PC companies and my initial letter to Sapphire was also replied with NO ! it's not reference !
> to which I replied it has to be,as EK have passed it as reference and I sent Sapphire pic's with identical PCB ,and they then replied yes it is indeed reference design.


No, the HG10 bracket hasn't come out yet, so it will only be AIO cooled, not a custom water cooling setup, both for cost and convenience. The Tri-X doesn't make as much sense to watercool as the other cards do, but once you add a second card and you have them in an airflow-restricted case like the H440, it becomes advantageous. I also want it on a consistent moderate overclock; I'm not going for anything extreme, so an AIO is all I really want.


----------



## Bertovzki

Quote:


> Originally Posted by *bond32*
> 
> You are wasting your money on the tri-x especially if you already have a block. It's a reference pcb with non reference cooler... Just get a reference design card. They go for around $200 - $250 now. There is no difference if you got the tri-x vs reference card if you're water cooling.


Quote:


> Originally Posted by *ArchieGriffs*
> 
> Guaranteed hynix, unless the reference card you're getting will always come with hynix, then that point is moot. Hynix has proven to be the better overclocker this time around, so if you are worried about something like that. For 50-100$ more though? 50$ probably isn't worth the extra money, but 100$, yeah just get a reference. I like having a nice backup air cooler, and I also don't plan to immediately watercool it as the HG10 bracket isn't out yet.


OK guys, cool, thanks for the info. Is there a reference brand that is preferable, with fewer issues? I have usually used Gigabyte or Asus.


----------



## ArchieGriffs

Quote:


> Originally Posted by *Bertovzki*
> 
> OK guys, cool, thanks for the info. Is there a reference brand that is preferable, with fewer issues? I have usually used Gigabyte or Asus.


Unfortunately, I only know enough about the non-reference coolers, otherwise I'd give advice. I guess go with your favorite manufacturer, then. AMD stopped manufacturing the reference models, though, so I don't know how many manufacturers still have stock of their non-reference coolers.


----------



## kizwan

Quote:


> Originally Posted by *ArchieGriffs*
> 
> I'm kind of wanting to take advantage of the 290 sale for the MSI gaming right now: http://www.newegg.com/Product/Product.aspx?Item=N82E16814127774&cm_re=r9_290-_-14-127-774-_-Product
> 
> Not sure how much of a pain it would be to crossfire it with my existing Tri-X though, obviously the same memory amount, and MSI still has hynix (right?). I'm getting conflicting reports about the MSI's board though, some are saying it's not the reference PCB, others are just saying custom components (obviously), so if I were to put both under water, more specifically the HG10 bracket, I still shouldn't have any issues right?
> 
> I have a feeling we're not going to be getting much as far as black Friday sales go as AMD said something about them expecting a lower sales despite the upcoming holiday season, so I'm not sure it's going to get much better than this.
> 
> Has anyone who has RMA'ed a MSI gaming and has taken off the stickers gotten it back with no problems? From what I heard over on the MSI forums you can take them off to watercool/change thermal past etc. and it's only there to tell them whether or not to inspect for user damage and doesn't effect the RMA outcome assuming it isn't damaged.
> 
> Back when I used to stalk R9 290 reviews for different manufacturers MSI was definitely the #2 choice (before the PCS+ Vapor X came out), and most if not all the advantages the Tri-X has over the MSI looks like they'll disappear when I watercool, and do a bios flash, has anyone had experience xfiring them together to confirm?


Last time I checked, the HG10 is for reference-design cards only. It also seems to use the stock fan for VRM1 cooling.

There are two versions of the MSI Gaming. The first version uses larger caps and is only compatible with the EK rev 2.0 (& Alphacool NexXxoS M02) waterblock. The second version, which comes with gold chokes, is *not* compatible with any waterblock as far as I know. Check out @Red1776's *post*.
Quote:


> Originally Posted by *Bertovzki*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *ArchieGriffs*
> 
> I'm kind of wanting to take advantage of the 290 sale for the MSI gaming right now: http://www.newegg.com/Product/Product.aspx?Item=N82E16814127774&cm_re=r9_290-_-14-127-774-_-Product
> 
> Not sure how much of a pain it would be to crossfire it with my existing Tri-X though
> 
> 
> 
> 
> 
> 
> 
> Re - Tri X ,do you have that under water yet ? I have never seen this card under water period !
> I have had debates with several PC companies and my initial letter to Sapphire was also replied with NO ! it's not reference !
> to which I replied it has to be,as EK have passed it as reference and I sent Sapphire pic's with identical PCB ,and they then replied yes it is indeed reference design.
> 
> I have got the Aquacool Hawaii water block for an R9 , but have no GPU yet.
> I have always intended to get a Tri - X and I know most of the advantage of one is the heat sink which will be binned but the R9 290X Tri-X OC does have more processor streams and probably better 4K performance, even though it looses it's main advantage.
> Please correct me anyone that thinks I'm wasting money on a Tri-x if above is not true as it will be under water anyway, though over clocking I'm really not interested in, on either my R9 or i74790K ,but want to have the ability if I wish or need to latter.
> 
> Re - R9 sale ,Id say they will only get cheaper and cheaper for ever, from this point on,other cards are always coming out and the R9 has dropped consistently $200 - $250 in my country in just last 2 weeks ,if anything the reality is the retailers and AMD will be very keen indeed to get rid of them while they still have the chance !! at a good price ,before they just cant sell them and have to sell at there own original purchase price or less to get rid of stock, I'd be more focused on watching stock levels as they may disappear ,if you want one.
> 
> I have no idea what AMD have lined up after R9 ,iv not bothered to look, but Id say they have something on the way and the retailers know this too.
Click to expand...

Aquacool Hawaii? Is this an Alphacool or Aquacomputer waterblock?

The reference 290X has 2816 stream processors, and the Tri-X 290X OC also has 2816 stream processors. How are the Tri-X's 2816 stream processors more than the reference's 2816?

BTW, if I'm not mistaken, there are at least two versions of the Tri-X 290X. The first version is a reference-design card, and any waterblock should fit. I don't know about the second version of the Tri-X; you might want to check this out before buying the card.


----------



## Bertovzki

Quote:


> Originally Posted by *kizwan*
> 
> Last time I checked, the HG10 is for referenced design card only. It also seems to use the stock fan for VRM1 cooling.
> 
> There are two version of MSI Gaming. The first version using larger caps which only compatible with EK rev 2.0 (& Alphacool NexXxos M02) waterblock. The second version, come with gold chokes is *not* compatible with any waterblocks as far as I know. Check out @Red1776's *post*.
> Aquacool Hawaii? Is this Alphacool or Aquacomputer waterblock?
> 
> The referenced 290X have 2816 Stream Processors & the TRI-X 290X OC also have 2816 Stream Processors. How TRI-X 2816 Stream Processors is more than referenced 2816 Stream Processors?
> 
> BTW, if I'm not mistaken, there are two version at least for TRI-X 290X. The first version is referenced design card & any waterblock should fit. I don't know about the second version of TRI-X. You might want to check this out before buying the card.


Yes, excuse me, I was thinking as I typed "what is that block?"... It is the Aquacomputer Hawaii, copper + plexi.

There are 3 versions of the Tri-X: the R9 290 Tri-X, the R9 290 Tri-X OC, and the one with more stream processors, the R9 290X Tri-X OC.

I know next to nothing about any R9 beyond my original GPU research a while ago; I liked the price/performance ratio and the very good 4K performance of the R9 290X Tri-X OC, as it was the best by test at the time, but that's irrelevant to me now, I guess, as I am water cooling it.
Are there other, cheaper reference-PCB cards that also have more stream processors? The above-mentioned is the only one I noticed.


----------



## ArchieGriffs

Quote:


> Originally Posted by *kizwan*
> 
> Last time I checked, the HG10 is for referenced design card only. It also seems to use the stock fan for VRM1 cooling.
> 
> There are two version of MSI Gaming. The first version using larger caps which only compatible with EK rev 2.0 (& Alphacool NexXxos M02) waterblock. The second version, come with gold chokes is *not* compatible with any waterblocks as far as I know. Check out @Red1776's *post*.


Ahhh... that's why people cared about the golden caps; it's all coming together now. Thanks so much for that, I'll definitely hold off on the MSI purchase then. Corsair is selling a replacement blower for those with a reference card but not a reference cooler (and who don't have the blower fan), so it should be compatible with my current R9 290 Tri-X OC, as it was early February when I got it, which should have been before the revision. I'll have to figure out what has a reference design, though I can always just get a reference card, but then I'd be concerned about getting Hynix or not >_<.. Way too complicated. Hopefully you can still buy the original Tri-X.


----------



## Bertovzki

Quote:


> Originally Posted by *ArchieGriffs*
> 
> Ahhh.. that's why people cared about the golden caps, it's all coming together now. Thanks so much for that, I'll definitely hold out on the MSI purchase then. Corsair is selling a replacement blower for those with a reference card but not a reference cooler (and don't have the blower fan), so it should be compatible with my current R9 290 Tri X OC, as it was early februrary when I got it, which should have been before the revision. I'll have to figure out what has a reference design, though I can always just get a reference card, but then I'd be concerned about getting hynix or not >_<.. Way too complicated. Hopefully you can still buy the original Tri-X.


Hynix? Different revisions of the Tri-X? Is there something I should not be buying for my Hawaii waterblock? Is Hynix bad?

What, in short, is a good reference-design card that I can use with my waterblock and have the fewest issues with? Anyone? Thanks in advance, guys, any help appreciated.

I'm starting to surf the net and look at other cards besides the Tri-X, if I'm definitely wasting money by going under water.

Some quick replies from dudes who know could save me many hours of research on which is reference, non-Hynix, or whatever.


----------



## Bertovzki

I just looked at Pricespy.co.nz, a must-use site here in New Zealand for finding the cheapest anything in NZ, and the R9 290 Tri-X OC is only $1.63 more anyway, and I know that card.
You've got to shop around; some shops are still selling them for $1000+ instead of $500, lol, twice the price. I'll pay $519, though, from a company I deal with that is closer and has a good rep.

Still, does anyone know of a list of reference R9 290s?


----------



## rdr09

Quote:


> Originally Posted by *Bertovzki*
> 
> I have got the Aquacool Hawaii water block for an R9 , but have no GPU yet.
> I have always intended to get a Tri - X and I know most of the advantage of one is the heat sink which will be binned but the R9 290X Tri-X OC does have more processor streams and probably better 4K performance, even though it looses it's main advantage.
> Please correct me anyone that thinks I'm wasting money on a Tri-x if above is not true as it will be under water anyway, though over clocking I'm really not interested in, on either my R9 or i74790K ,but want to have the ability if I wish or need to latter.
> 
> Re - R9 sale ,Id say they will only get cheaper and cheaper for ever, from this point on,other cards are always coming out and the R9 has dropped consistently $200 - $250 in my country in just last 2 weeks ,if anything the reality is the retailers and AMD will be very keen indeed to get rid of them while they still have the chance !! at a good price ,before they just cant sell them and have to sell at there own original purchase price or less to get rid of stock, I'd be more focused on watching stock levels as they may disappear ,if you want one.
> 
> I have no idea what AMD have lined up after R9 ,iv not bothered to look, but Id say they have something on the way and the retailers know this too.


Quote:


> Originally Posted by *bond32*
> 
> You are wasting your money on the tri-x especially if you already have a block. It's a reference pcb with non reference cooler... Just get a reference design card. They go for around $200 - $250 now. There is no difference if you got the tri-x vs reference card if you're water cooling.


I agree with bond; just get a reference 290 or 290X if you have a waterblock. Core is king when it comes to OC'ing. You have a better chance of winning the silicon lottery with a reference.


----------



## Bertovzki

Quote:


> Originally Posted by *rdr09*
> 
> i agree with bond, just get a reference 290 or 290X if you have a waterblock. core is king when it comes to oc'ing. you have a better chance of winning the silicon with a reference.


Excuse my lack of education; usually I have researched more thoroughly before asking questions.

I'm a little confused, as I now see that the 290 Tri-X OC is in fact the cheapest card I can get in NZ now, and it is a reference design anyway.

And I don't know the difference between the 290 and the 290X. Is the 290X the one with more stream processors? What is the advantage, and why is it more expensive?


----------



## rdr09

Quote:


> Originally Posted by *Bertovzki*
> 
> Excuse my lack of education ,usually I have researched more thoroughly before asking qestions.
> 
> I'm a little confused,as I now see that the 290 Tri-x OC is infact the cheapest card I can get in NZ now ,and it is a reference design anyway.
> 
> And I don't know the difference between 290 and 290X is the 290X one with more processor streams ?what is the advantage and why is it more expensive ?


The 290X is the top of the line among Hawaii cards; it is about 7-10% faster than the 290. The difference may come into play at higher resolutions or if you are into benching. And if you are into benching, you might want to go with a USED reference card; it is kinda late to buy these things new. I got my second 290 used for $247 with a full waterblock. Check the used market, but make sure you screen the seller and verify the return policy. I recommend cards like XFX, MSI, or Gigabyte for used; their warranty is transferable.

Now, if you are not comfortable with used, then get the Tri-X.
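As a back-of-the-envelope check of that 7-10% figure (my numbers, purely illustrative, not rdr09's benchmarks): on a hypothetical 50 fps baseline for the 290, the 290X's uplift works out to only a few frames per second.

```python
# Hypothetical baseline: what a 7-10% uplift over a 50 fps R9 290 result looks like.
baseline_fps = 50.0
for uplift in (0.07, 0.10):
    faster = baseline_fps * (1 + uplift)
    print(f"{uplift:.0%} faster -> {faster:.1f} fps (+{faster - baseline_fps:.1f} fps)")
```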


----------



## Bertovzki

Quote:


> Originally Posted by *rdr09*
> 
> the 290X is top of the line in Hawaii cards. it is about 7 - 10% faster than the 290. the difference may come to play in higher rez or if you are into benching. And if you are into benching, you might want to go with a USED reference card. It is kinda late to buy these things new. got my second 290 used for $247 with a full waterblock. check the used market but make sure you screen the seller and verify return policy. i recommend cards like msi or gigabyte for used. their warranty is transferrable.


Yes, thanks for the feedback. Yes, the R9 290X Tri-X OC is better at 4K, that is for sure; Tom's Hardware's review clarifies this. It is equal to the GTX 780 Ti on some high-res tests, exactly the same fps.
And yeah, I do want high-res performance; that's why I started looking at the R9 in the first place (although this won't matter again under water).
I've just been reading about why the 290X is faster and what it is.

I think it is the 290X I want, though maybe I'm splitting hairs; 4-5 fps is all you gain.


----------



## bluedevil

Would it be foolish to get rid of my VisionTek R9 290 for a Sapphire Vapor-X R9 290? I really have to finagle a lot to get it stable at 1200/1550; without finagling it's more comfortable at 1150/1400.


----------



## BradleyW

Quote:


> Originally Posted by *bluedevil*
> 
> Would it be foolish to get rid of my VisionTek R9 290 for a Sapphire Vapor-X R9 290? I really have to finagle a lot to get it stable at 1200/1550; without finagling it's more comfortable at 1150/1400.


You won't see much of an increase at all. It's a massive waste of time and money.


----------



## bustacap22

I have the opportunity to purchase a Sapphire R9 290X, model 21226-00-40G. It's a reference model and I do plan on putting a waterblock on it. Any OCN users out there who can chime in with their experience with this model 290X? Good, bad, decent, OC potential, memory, STAY AWAY, etc...

I will be playing on a 1440p monitor. Currently have dual 7970s. Just trying to decide if I should pull the trigger or not. Thanks.


----------



## bluedevil

Quote:


> Originally Posted by *BradleyW*
> 
> You won't see much of an increase at all. It's a massive waste of time and money.


Kinda thought so, might wait to upgrade the GPU when the 3XX series comes out.


----------



## bond32

Vapor-X cards are arguably the best cards you can get (design-wise); however, they are indeed non-reference. With that said, EK now has a full-cover block, which looks sick.

As for the memory, Hynix is hands down better for overclocking. Elpida is fine, it will OC some, but Hynix has been shown to do much better.


----------



## Aaron_Henderson

Just wanted to pipe up to say that after mucking about with my card for the past couple weeks, this 290X is great. I was kind of on the fence about the switch to AMD, but I have been getting back into gaming lately and pretty much all of my games hold 60 fps constant. Nice and smooth compared to the GTX 570 SLI setup I came from. I was unable to get it to work correctly with my one HDTV, but other than that, it's a nice, cool and quiet card. Way more quiet than I expected, that's for sure. Thought it would be a hair dryer, to be honest, lol. Haven't bothered benching yet, as I am in the midst of moving, but I'll get around to it. The temps I get at stock give me hope for good clocks though.


----------



## bluedevil

Kinda thinking that those who are looking to upgrade a 290 should either look at a 290X or wait for the 3XX series to launch.







So hard....really want to do a chipset and gpu upgrade this time!


----------



## ArchieGriffs

Quote:


> Originally Posted by *Bertovzki*
> 
> Hynix? Different revisions of Tri-X? Is there something I should not be buying for my Hawaii waterblock, and is Hynix bad?


There are two different types of memory the 290 comes with (three if you count the MSI Lightning): Hynix and Elpida. Neither will affect what kind of waterblock you can put on them, assuming they're both reference PCBs; Hynix overclocks better, which is why I'm concerned about it.


----------



## Bertovzki

Quote:


> Originally Posted by *ArchieGriffs*
> 
> There are two different types of memory the 290 comes with (three if you count the MSI Lightning): Hynix and Elpida. Neither will affect what kind of waterblock you can put on them, assuming they're both reference PCBs; Hynix overclocks better, which is why I'm concerned about it.


Right, thanks. I remember now reading about Elpida having some issues; thanks for the info.
I will be buying a Tri-X, either the 290X or the standard 290 version, as they are the cheapest anyway and reference.
R9 290X Tri-X OC $633, R9 290 Tri-X OC $519 NZD


----------



## battleaxe

Quote:


> Originally Posted by *ArchieGriffs*
> 
> There's two different types of memory the 290 comes with (3 if you count the MSI lightning), hynix and elpida. Neither will affect what kind of waterblock you can put on them assuming they're both reference PCBs, hynix overclocks better which is why I'm concerned about it.


Hynix doesn't necessarily OC better, but their RAM timings are better or faster for some reason... My Elpida RAM 290 overclocks much higher than my Hynix RAM 290. The Elpida will hit 1550MHz, but the Hynix only about 1350MHz. However, even at 1350MHz, the card with Hynix beats the scores of my Elpida card at 1550MHz in 3DMark 11 at the same core of 1150MHz... Easily.

So the issue is more of a RAM latency and timings issue than the speed of the RAM itself. That's why the Hynix cards are typically better regardless of how high they OC, and why lots of people want the Tri-X, as it's guaranteed Hynix. The MSI Gaming is Hynix also. Can't believe the prices on these on the bay. Dang.
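The raw-speed side of that argument can be put in numbers. Peak GDDR5 bandwidth scales linearly with memory clock on the 290's 512-bit bus, so on paper the Elpida card at 1550MHz actually has more of it; anything the Hynix card wins back has to come from timings. A minimal sketch (the 512-bit bus and quad-pumped GDDR5 are the card's real specs; the clocks are just the ones from this post):

```python
# Theoretical peak GDDR5 bandwidth for a 290/290X: 512-bit bus,
# quad-pumped, so effective transfer rate = 4x the memory clock.
BUS_WIDTH_BITS = 512

def peak_bandwidth_gbs(mem_clock_mhz: float) -> float:
    """Peak bandwidth in GB/s for a given memory clock (MHz)."""
    effective_mts = mem_clock_mhz * 4                 # mega-transfers/s
    return effective_mts * BUS_WIDTH_BITS / 8 / 1000  # MB/s -> GB/s

# Stock clock vs. the Hynix and Elpida OCs discussed above.
for clock in (1250, 1350, 1550):
    print(f"{clock} MHz -> {peak_bandwidth_gbs(clock):.1f} GB/s")
```

On raw throughput alone the 1550MHz Elpida card should win (roughly 397 vs 346 GB/s), so the benchmark results going the other way is exactly what points at looser timings rather than clock speed.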

I am getting a new monitor and soon.


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> Hynix doesn't necessarily OC better but their RAM timings are better or faster for some reason... My Elpida RAM 290 overclocks much higher than my Hynix RAM 290. The Elpida will hit 1550mhz, but the Hynix only about 1350mhz. However, even at 1350mhz The card with Hynix beats the scores on my Elpida card at 1550mhz on 3dMark11 at the same core of 1150mhz... Easily.
> 
> So the issue is more of a RAM and latency timings issue than the speed of the RAM itself. That's why the Hynix are typically better regardless of how high they OC. And why lots of people want the Tri-X as its guaranteed Hynix. The MSI gaming is Hynix also. Can't believe the prices on these on the bay. Dang.
> 
> I am getting a new monitor and soon.


i have to agree with you on hynix being a better brand than elpida, but i am not sure if both oc'ed the same will make a difference in results. my 290 has elpida and can only oc to 1500. it has gone to 1620 before blackscreening. i still think that core oc is what really matters. i'll take a card with a core that can oc to 1300 even if the memory can only oc to 1400 over one that can only do 1200 core and 1700 memory. but it all comes down to silicon. here is my 290 with elpida memory . . .

http://www.3dmark.com/3dm11/8776470


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> i have to agree with you on hynix being a better brand than elpida, but i am not sure if both oc'ed the same will make a difference in results. my 290 has elpida and can only oc to 1500. it has gone to 1620 before blackscreening. i still think that core oc is what really matters. i'll take a card with a core that can oc to 1300 even if the memory can only oc to 1400 over one that can only do 1200 core and 1700 memory. but it all comes down to silicon. here is my 290 with elpida memory . . .
> 
> http://www.3dmark.com/3dm11/8776470


You have a point. Maybe my Elpida card is just a dud. Totally possible. I can't get anywhere near that kind of score.


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> You have a point. Maybe my Elpida card is just a dud. Totally possible. I can't get anywhere near that kind of score.


if all other settings are equal, a 290 at 1300 core with hynix memory that can oc to 1700 will prolly score 100 more points in graphics.


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> if all other settings equal, a 290 at 1300 core with a hynix memory that can oc to 1700 will prolly score 100 more points in graphics.


Neither of my cards will hit 1200. Boo-hoo...


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> Neither of my cards will hit 1200. Boo-hoo...


i know. lol. it comes down to silicon. are they watercooled?

i've got another 290 that i have yet to watercool. it has a higher asic (80%; the other is 79%) than my old one. well, they are both old. it has elpida, too, which i was hoping for, cause i read mixing memory brands might cause problems.


----------



## bond32

It's not make or break in my opinion; however, when I get my fourth card I will be making sure it is Hynix, just because I like consistency with my other three.

My three cards need around 1.4 V for 1240/1500. Still haven't been able to get any higher stable yet. 1.5 V is scary even though temps are not an issue.


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> i know. lol. it comes down to silicon. are they watercooled?
> 
> i've got another 290 and i have yet to watercool. it has a higher asic (80% the other 79%) than my old one. well, they are both old. it has elpida, too, which i was hoping cause i read mixing them memory might cause problems.


They are both on H80s with the Accelero kit on the VRMs. Max temps I have seen with +200mv are about 60 on the core and about 80 on VRM1. So they stay plenty cool.

Edit: And mine are mixed RAM makers and they work fine together.


----------



## MrWhiteRX7

Ran into a very random issue... installed 14.9.1 and now CCC only recognizes 2 of my 3 GPUs. If I look at Device Manager or even Afterburner it shows all 3. So does GPU-Z, but CCC and games won't use all three. Rolled back to 14.4 and now the same thing. I've had tri-fire for years and never had this happen lol.

Any ideas?


----------



## battleaxe

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Ran in to a very random issue... installed 14.9.1 and now CCC only recognizes 2 of my 3 gpu's. If I look at device manager or even afterburner it shows all 3. So does GPUZ, but CCC and playing in games it won't use all three. Rolled back to 14.4 and now the same thing. I've had tri-fire for years and never had this happen lol.
> 
> Any ideas?


I had issues with that driver too. Try using DDU to uninstall from safe mode; you need to completely get rid of that driver. I went back to 14.7, which is working great for me. But I had serious issues going to the 14.9.1 driver as mentioned. It didn't work right at all, and it took me a bit to get rid of everything and get it working right again. After uninstalling the drivers, completely shut down and take all power off the board, as in disconnecting power. Let it sit at least 10 sec, then reboot before you install the new drivers. That's what I had to do, not sure why either.

Edit: turning off the PSU should work fine. That's all I meant by killing power to the motherboard.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *battleaxe*
> 
> I had issues with that driver too. Try using DDU to uninstall from safe mode. You need to completely get rid of that driver. I went back to 14.7 which is working great for me. But I had serious issues going to the 14.9.1 driver as mentioned. Didn't work right at all and it took me a bit to get rid of everything to get it working right again. After uninstalling the drivers, completely shut down and take all power off the board as in disconnecting power. Let it sit at least 10 sec, then reboot before you install the new drivers. That's what I had to do, not sure why either.
> 
> Edit: turning off the PSU should work fine. That's all I meant by killing power to the motherboard.


Good idea on the hard reset; I usually run DDU and just let it boot right up and install the driver. I'll try this when I get home. Thanks for the insight! I'll let you know how this goes.

I did notice one thing though: GPU-Z showed my 3rd card running PCIe 3.0 @ 1x??? Hopefully this is due to the driver and not my RIVE board taking a dump.


----------



## chiknnwatrmln

Tried to update my drivers to the newest ones (CF 290X + 290) and every time my PC crashes when the installation is almost done... Any tips? I am on 14.4 now and it works just fine.

Also, kinda disappointed with CF. Of all the games I play, every single one has issues with CF.


----------



## joeh4384

Pretty tough to beat the price-to-performance of a used 290.
Quote:


> Originally Posted by *bluedevil*
> 
> Kinda thinking if those who are looking to upgrade a 290, either should look at a 290x or wait for the 3XX series to launch.
> 
> 
> 
> 
> 
> 
> 
> So hard....really want to do a chipset and gpu upgrade this time!


The market on used 290s is so cheap right now. You can get good cards like the Tri-X 290X used for 250-270 now. I am sure reference cards go for close to 200 for people putting blocks on them.


----------



## Klocek001

Well, I'm finally going to get my second-hand, formerly mining 290 Tri-X back from RMA, and with all the fuss about GTX 970 cards and 290 prices dropping, I think I'm not going to play with OC or WC to get it to 1250 MHz; I'll just buy another second-hand 290, even if it needs to be RMA'd within the first month. Thank God I bought an 850W XFX instead of a 550W when I was building my rig. I'll have much more performance than I need for 1080p, but I like that. It will give me the possibility to upgrade to 27" or more some time next year.


----------



## Mopar63

HOLY PRICE CUT, BATMAN!!!!!

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202103&cm_re=290_Vapor-_-14-202-103-_-Product

This is a freaking STEAL!!!!!!


----------



## Agent Smith1984

It's a good price if you are adding it to another 290X in your current system and you have the power for it, but if you're doing a single-GPU upgrade OR building a new system, you are still going to want to spend the extra $30 to go to a GTX 970, or wait for the next AMD cards, in my opinion....


----------



## battleaxe

Quote:


> Originally Posted by *Mopar63*
> 
> HOLY PRICE CUT, BATMAN!!!!!
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202103&cm_re=290_Vapor-_-14-202-103-_-Product
> 
> This is a freaking STEAL!!!!!!


Yeah, I thought that was amazing too... very tempting!!!!
Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Good idea on the hard reset, I usually run DDU and just let it boot right up and install the driver. I'll try this when I get home. Thanks for insight! I'll let you know how this goes.
> 
> I did notice one thing though, gpuz showed my 3rd card running pcie 3.0 @ 1x??? hopefully this is due to the driver and not my RIVE board taking a dump


I usually do the uninstall-without-restarting option about 4 or 5 times, then the option to restart. But you'd just hit the power button and turn it off after it came back to the boot screen. Then disconnect power and let her sit a bit. Then reinstall drivers, shut down, and repeat the power-down technique. Worked for me anyway. Hope it does for you.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> It's a good price if you are adding it to another 290x in your current system and you have the power for it, but if doing a single GPU upgrade, OR building a new system, you are still going to want to spend the extra $30 to go to GTX 970, or wait for the next AMD cards in my opinion....


Well, yeah, I pretty much agree with you, but a lot of guys are getting the used ones for 250.00... that's hard to pass up, and a good deal less expensive than any 970. Still, the 970/980s are sweet and I'd love to have one. But holding off for a new monitor first.

I wish 4K was developed a bit more. The displays still seem a bit buggy to me... I want higher refresh rates than 60Hz, which pretty much leaves me with 1440p displays right now. 4K is so beautiful to look at though; I'm so torn on what to get... 4K at 60Hz or 1440p at 100Hz+...? What do you guys think I should do?


----------



## rdr09

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Tried to update my drivers to the newest ones (CF 290x + 290) and every time my PC crashes when the installation is almost done.. Any tips? I am on 14.4 now and it works just fine.
> 
> Also, kinda disappointed with CF. Of all the games I play, every single one has issues with CF.


did you disable crossfire first, prior to uninstalling and installing drivers?

first time i've crossfired in over a year (7900s were first) and my games work. no issues here.

edit: i disable crossfire, shut down the system, unplug power to the secondary card, boot, and proceed to updating the driver.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It's a good price if you are adding it to another 290x in your current system and you have the power for it, but if doing a single GPU upgrade, OR building a new system, you are still going to want to spend the extra $30 to go to GTX 970, or wait for the next AMD cards in my opinion....


Quote:


> Originally Posted by *Mopar63*
> 
> HOLY PRICE CUT, BATMAN!!!!!
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202103&cm_re=290_Vapor-_-14-202-103-_-Product
> 
> This is a freaking STEAL!!!!!!


Quote:


> Originally Posted by *rdr09*
> 
> you disable crossfire first prior to uninstalling and installing drivers?
> 
> first time i crossfired after over a year (7900s were first) and my games work. no issues here.
> 
> edit: i disable crossfire, shutdown system, unplug power to secondary card, boot, and proceed to updating driver.


Hey, on that note, does it hurt to leave a card in the slot with no power run to it? Of course it wouldn't be used at all. I'm just wondering if it could mess something up, leaving it sitting in the motherboard with power disconnected and not really doing anything while the other card is running?


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> Hey on that note, does it hurt to leave a card in the slot with no power run to it? Of course it wouldn't be used at all. I'm just wondering if it could mess something up leaving power disconnected and not really doing anything sitting in the motherboard while the other card is running?


yes, keep the second card in place.


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> yes, keep the second card in place.


Long term? Like if I just left it there for, say, 2 months until I get a 1440p display? Won't hurt anything?


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> Long term? Like if I just left it there for, say, 2 months until I get a 1440p display? Won't hurt anything?


it should not hurt anything.


----------



## Agent Smith1984

It would not affect anything, in my opinion. I'd never just "let a GPU sit there" personally, though.








It'd be too tough on my soul......
BUT I have left PCI, and PCI-E expansion cards installed without drivers or usage for months with no negative impacts...... Seems like a safe storage place to me


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It would not affect anything in my opinion. Never just "let a GPU sit there" personally
> 
> 
> 
> 
> 
> 
> 
> 
> It'd be too tough on my soul......
> BUT I have left PCI, and PCI-E expansion cards installed without drivers or usage for months with no negative impacts...... Seems like a safe storage place to me


Yeah, I don't want to put any stress on the card. But I'm too lazy to take it out for a few months when it's going right back in there. I don't want to mess with re-positioning everything... lol... I'm so blasted lazy.


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *chiknnwatrmln*
> 
> Tried to update my drivers to the newest ones (CF 290x + 290) and every time my PC crashes when the installation is almost done.. Any tips? I am on 14.4 now and it works just fine.
> 
> Also, kinda disappointed with CF. Of all the games I play, every single one has issues with CF.
> 
> 
> 
> 
> 
> 
> 
> you disable crossfire first prior to uninstalling and installing drivers?
> 
> first time i crossfired after over a year (7900s were first) and my games work. no issues here.
> 
> edit: i disable crossfire, shutdown system, unplug power to secondary card, boot, and proceed to updating driver.

No need to do all that, though. I've updated drivers without removing the second card or even unplugging the PCIe power cable from the second card. No problem whatsoever. Crossfire doesn't work without drivers, and disabling crossfire before updating drivers is moot since you're going to uninstall the driver first anyway.


----------



## chiknnwatrmln

I will try unplugging my 2nd card before installing the newest drivers.


----------



## SeanEboy

Hey guys, I just finished my build, and I'm looking to dial it in a bit. I'm running quad 290X with EK blocks, and I'm hoping to go 1440p times three. However, it looks like I might need to purchase an active adapter; I lose the 3rd monitor. Darn, looks like that's what I need to do.

Now, what software setup should I be running for optimal performance? Eventually I'll be overclocking them, but for now I'm just looking to get things going at stock levels.


----------



## StrongForce

Maaan, I just got such a good deal, and lucky. I was bidding on an R9 290 on eBay, stock cooler, and had won the bid at 200 including shipping. Then right after that I checked my email and saw that another guy I had contacted on a Swiss classified-ad website was trying to sell his R9 290X Windforce. I offered him the same price, well, 215 euros with shipping ($260), and he accepted. Woa, I'm so happy. Also glad the guy on eBay accepted that I retract my purchase... geeez!

Only problem, well, not so much as winter is coming: it's gonna be heat, ahha.

I'm gonna heat up the whole house with my AMD config (with [email protected] atm) lol.


----------



## bond32

Quote:


> Originally Posted by *SeanEboy*
> 
> Hey guys, I just finished my build, and I'm looking to dial it in a bit. I'm running quad 290x, EK blocks, and I'm hoping to go 1440p by three. However, it looks like I might need to purchase an active adapter? I lose the 3rd monitor. Darn, looks like that's what I need to do.
> 
> Now, what software setup should I be running for optimal performance? Eventually I'll be overclocking them, but for now I'm just looking to get things going at stock levels.


All the DVI connections are dual-link, so you should be fine... unless I'm wrong and every display needs to be on the top card... I'm curious, as I want to do the same.


----------



## pdasterly

Did some playing around with GPU clocks today since I had a fresh restart (giving Windows 10 a try).
Sapphire reference 290X with an EK waterblock.
The card doesn't like high voltage; I get a flashing black screen. Sometimes video can recover via Ctrl+Alt+Del, and Task Manager will pop up and let me close the benchmark. That was using +150mv VDDC.

I have the card stable at 1200/1600, +125mv VDDC offset and +50 power limit, using TriXX. Temps are in check; I didn't even bother to install GPU-Z, but core, VRM1 and VRM2 never go above 50C in the FurMark burn-in test. Ran Valley: 50C GPU, 1.352 VDDC, 366W VDDC power in, VRM1 48 and VRM2 38.

Is there a problem with my card?
Seems like I should be able to pump +200mv no problem?


----------



## bond32

I too played around with some clocks... Discovered the PT1 BIOS has been troll'in me. On the stock BIOS, my scores are much better in Fire Strike Ultra. Hell, on PT1 it wouldn't even finish at clocks I thought were stable. This is on the 14.9 drivers...

Small thing to note too: at 1200/1500 +200mv my score was 7730: http://www.3dmark.com/3dm/4442444?

At 1200/1625 +200mv my score was 7789: http://www.3dmark.com/3dm/4442497?

I noticed the physics score changed too; might be an error. But interesting nonetheless...


----------



## pdasterly

my firestrike score is 11492


----------



## bond32

Is there no way to get over 1.4 V without pt1 or pt3 bios?


----------



## ZealotKi11er

Quote:


> Originally Posted by *bond32*
> 
> Is there no way to get over 1.4 V without pt1 or pt3 bios?


Do these voltages even help? My cards stop improving after +125 mV; +200 mV gets me like 20-25 MHz extra.


----------



## Cyber Locc

Quote:


> Originally Posted by *SeanEboy*
> 
> Hey guys, I just finished my build, and I'm looking to dial it in a bit. I'm running quad 290x, EK blocks, and I'm hoping to go 1440p by three. However, it looks like I might need to purchase an active adapter? I lose the 3rd monitor. Darn, looks like that's what I need to do.
> 
> Now, what software setup should I be running for optimal performance? Eventually I'll be overclocking them, but for now I'm just looking to get things going at stock levels.


That changed as of late; you don't need an active adapter as long as all 3 monitors are the same. If you mix resolutions, then you will need an active adapter. At least that is what I have read; I don't use Eyefinity myself, it makes me sick. But maybe try using both DVIs and the third monitor on DisplayPort. If you can't, and need to go from DP to DVI, you might as well just get an active one; they're cheap now, I got one for 15 bucks on eBay.

Quote:


> Originally Posted by *bond32*
> 
> All the dvi connections are dual link so you should be fine... Unless I'm wrong and any display needs to be in the top card... I'm curious as I want to do the same.


All monitors can only be hooked to the top card, and not just for Eyefinity; that's period. Ports on the other cards won't work, sadly; I wanted to add a 5th screen and can't without a DP splitter.


----------



## SeanEboy

Thanks for the response, Cyber Locc. I don't think I can do it, as I can see the third monitor there until I start to create an Eyefinity profile. I need that adapter. I was also told that you NEED an active unit, specifically:
Quote:


> Originally Posted by *Roland2*
> 
> You will need an active DisplayPort to DVI adaptor. I used two on an ATI 7970 and put all three (one on DVI, the other two through DP-DVI) in Eyefinity with no problems.
> 
> The adaptor I would recommend is the Accell B087B-007B like this one: http://www.amazon.com/Accell-B087B-007B-DisplayPort-Dual-Link-Adapter/dp/B00856WJH8/ref=sr_1_1?ie=UTF8&qid=1413849315&sr=8-1&keywords=Accell+B087B-007B
> 
> Don't get the 002B version, it doesn't work with these monitors.


----------



## CriticalHit

Quote:


> Originally Posted by *pdasterly*
> 
> Did some playing around with gpu clocks today since I had a fresh restart(giving windows 10 a try).
> sapphire reference 290X with ek waterblock.
> Card doesn't like high voltage, I get flashing blackscreen, sometimes video can recover by ctrl alt del and task manager will pop-up and let me close benchmark. That was using 150mv vddc
> 
> I have card stable at 1200\1600, 125 vddc offset and 50 power limit. Using triXX. Temps are in check, I didn't even bother to install gpu-z, but core, vrm1 and vrm2 never go above 50c on furmark burn in test. Ran valley, 50c gpu, 1.352 vddc, 366W vddc power in, vrm1 48 and vrm2 38
> 
> Is there a problem with my card?
> Seems like I should be able to pump 200mv no problem?


I have the same flashing black screen if I push voltages and the power limit too high... Can't get to +200mv even with PT1...

Have just learned to accept it... Some cards seem a bit fussy with power delivery, IMO.

Edit: I can't even get mine to 1200 core; max stable is around 1135, and I can't touch memory speeds.


----------



## bluedevil

Anyone know of a BIOS editor for the R9 290(X)? Would love to play around with the voltages and frequencies on the BIOS side of things.


----------



## pdasterly

Problem with the 2nd GPU: I can't seem to install it. When I run CCC, everything installs except the display driver.


----------



## Red1776

Quote:


> Originally Posted by *CriticalHit*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pdasterly*
> 
> Did some playing around with gpu clocks today since I had a fresh restart(giving windows 10 a try).
> sapphire reference 290X with ek waterblock.
> Card doesn't like high voltage, I get flashing blackscreen, sometimes video can recover by ctrl alt del and task manager will pop-up and let me close benchmark. That was using 150mv vddc
> 
> I have card stable at 1200\1600, 125 vddc offset and 50 power limit. Using triXX. Temps are in check, I didn't even bother to install gpu-z, but core, vrm1 and vrm2 never go above 50c on furmark burn in test. Ran valley, 50c gpu, 1.352 vddc, 366W vddc power in, vrm1 48 and vrm2 38
> 
> Is there a problem with my card?
> Seems like I should be able to pump 200mv no problem?
> 
> 
> 
> I have the same flashing black screen if I push voltages and power limit too high .. Can't get to +200mv even with pt1 ..
> 
> Have just learned to accept it .. Some cards seem a bit fussy with power delivery IMO
> 
> Edit I can't even get mine to 1200 core - max stable around 1135 and can't touch memory speeds

Have you tried checking how consistent your PSU's power delivery is?

You might try OCCT and generating charts to see if you are having wild swings under stressed conditions:

http://www.ocbase.com/index.php/download

It will generate charts on just about everything on your voltages and voltage rails. Look for oddities and/or unusual amounts of Vdroop.

These are OCCT voltage charts from a review I did a while back.


----------



## pdasterly

Using evga 1300 g2


----------



## CriticalHit

Quote:


> Originally Posted by *Red1776*
> 
> Have you tried checking out how consistent your PSU power delivery is?
> you might try OCCT and generating charts to see if you are having wild swings while under stressed conditions
> 
> http://www.ocbase.com/index.php/download
> 
> It will generate charts on just about everythinghing on you voltages and voltage rails. look for oddities and or unusual amounts of Vdroop
> 
> These are OCCT voltage charts from a review I did a while back.

How did you get the voltage charts? I just tried the power supply tab but only got GPU and CPU usage and FPS charts.


----------



## Red1776

Quote:


> Originally Posted by *CriticalHit*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Have you tried checking out how consistent your PSU power delivery is?
> you might try OCCT and generating charts to see if you are having wild swings while under stressed conditions
> 
> http://www.ocbase.com/index.php/download
> 
> It will generate charts on just about everythinghing on you voltages and voltage rails. look for oddities and or unusual amounts of Vdroop
> 
> These are OCCT voltage charts from a review I did a while back.
> 
> 
> 
> 
> 
> How did you get the voltage charts? I just tried the power supply tab but only got GPU and CPU usage and FPS charts.

Go to Settings (the orange button) and check all of the tests you want charted. Disable everything in the next column except the temp limit if you want. Run the CPU stress test for about 10 minutes and shut it off. The charts will be generated in the same folder.


----------



## CriticalHit

Quote:


> Originally Posted by *Red1776*
> 
> Go to settings (the orange button) and check all of the tests you want charted. Disable everything in the next column except the temp limit if you want. Run the CPU stress test for about 10 min and shut off. The charts will generate in the same folder.


Thanks... I actually had to downclock my CPU and vcore to get any readings. Perhaps my old i7 860 is on its way out; it was running very hot in the OCCT burn test, so it will be a permanent downclock and an upgrade next year, I think... It's managed to sustain 4GHz+ for years, so I can't complain...

Anyway... voltages are nice and smooth (it's a 1050W Corsair)... And I did a bit more testing with a lower CPU clock just to see if it affected how the GPUs overclock, but nope... stuck at the same OC and voltage limit.


----------



## bond32

What, aside from unstable overclocks, causes the display to show lines all over? In some testing last night I observed that setting the voltage to 1.36-ish on the PT3 BIOS, regardless of clock speeds, would cause vertical lines to skew the display. I thought that had to do with a memory issue, but what are the chances the 3 cards I have were drawing more current than my PSU can provide?

I'm going to run 2 cards per PSU tonight. Last night's tests were running all 3 on one power supply.


----------



## Buehlar

Quote:


> Originally Posted by *bond32*
> 
> What, aside from unstable overclocks, causes the display to show lines all over? In some testing last night I observed that setting the voltage to 1.36-ish on the PT3 BIOS, regardless of clock speeds, would cause vertical lines to skew the display. I thought that had to do with a memory issue, but what are the chances the three cards I have were drawing more current than my PSU can provide?
> 
> I'm going to run two cards per PSU tonight. Last night's tests were running all three on one power supply.


Your PSU "should" meet the requirements for 290X tri-fire; however, with everything heavily OC'd, you're at the very minimum.

Consider my power usage test @ 1.35v. With the PSU powering "only one 290x", it was drawing a massive 405.6W input from the wall, and GPU output was 373.5W.
373.5 x 3 = 1120.5W

And you're adding even more voltage... so yeah, the odds of a PSU bottleneck are fair. Figure 400W per GPU = 1200W, and that leaves you with 100W of headroom.

Is your CPU also OC'd? If so, try a run without OCing your CPU.

290x @1.35v
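
The headroom arithmetic above can be sketched in a few lines. Note the ~400 W/GPU figure is an assumption extrapolated from the 1.35v wall-power test, not a measured value:

```python
# Rough PSU headroom estimate for tri-fire. The ~400 W/GPU figure is an
# assumption extrapolated from the 1.35 V wall-power test, not measured.
def psu_headroom(psu_rating_w, gpu_count, watts_per_gpu):
    """Watts left for CPU, drives, and fans after budgeting the GPUs."""
    return psu_rating_w - gpu_count * watts_per_gpu

print(psu_headroom(1300, 3, 400))  # 100 W of headroom on a 1300 W unit
```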


----------



## Jorginto

Hey guys,
I just got myself a pair of powercolor R9 290 reference design. Before installing water blocks I tested the cards a little bit. Separetly they work like a charm but when installed together crossfire doesn't work. Every try of running any monitoring program like HWinfo or Gpu-z ends up in a crash. HWinfo crashes while accessing pcibus. Any ideas? My r9 280X-es worked without any issues.


----------



## Devildog83

Quote:


> Originally Posted by *Jorginto*
> 
> Hey guys,
> I just got myself a pair of powercolor R9 290 reference design. Before installing water blocks I tested the cards a little bit. Separetly they work like a charm but when installed together crossfire doesn't work. Every try of running any monitoring program like HWinfo or Gpu-z ends up in a crash. HWinfo crashes while accessing pcibus. Any ideas? My r9 280X-es worked without any issues.


Did you sweep and reinstall the drivers after you installed both cards. Sometimes deleting and reinstalling the monitoring software helps too because they all have there own system info they run.


----------



## bond32

Quote:


> Originally Posted by *Buehlar*
> 
> Your PSU "should" meet the requirements for 290x tri-fire, however, with everything heavily OCed, you're at the very minimum.
> 
> Consider my power usage test @ 1.35v. With the PSU powering "only one 290x" it was drawing a massive 405.6w input from the wall and GPU output was 373.5w.
> 373.5 x 3 = 1120.5w
> 
> and you're adding even more voltage...so yea, the odds of a PSU bottleneck are fair... figure 400w per GPU = 1200w and that leaves you with 100w of headroom.
> 
> Is your CPU also OCed? If so, try a run without OCing your CPU.
> 
> 290x @1.35v


Yes, I understand this. And for the record, I am using two EVGA 1300-watt G2 power supplies. But I am wondering: once the voltage is raised to the levels I am referring to, such as 1.4 V, how much current does the card draw?

P = VI - is it really that simple? I have seen one of my cards draw 380 watts; what does that mean for the current supplied? Is the G2 power supply not capable of that kind of current?

Another question: would I have any reason to believe dual-link DVI isn't best for benching? The issue I'm facing is that once the cards are pushed to 1.38 V or more, the display gets distorted with vertical lines to all hell and back... I always assumed this meant unstable clocks, but I have seen the lines appear on the desktop after only adjusting the voltage...
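
At the rail level the arithmetic really is that simple: a 290 draws nearly all of its power from the +12 V rail (PCIe slot plus the 6/8-pin plugs), so rail current is roughly power divided by 12. A rough sketch with the 380 W figure from the post:

```python
# Sketch of the P = V * I question. A 290 draws nearly all of its power
# from the +12 V rail (PCIe slot plus 6/8-pin plugs), so rail current is
# roughly power divided by 12 V. The 380 W figure is from the post above.
def rail_current_amps(power_w, rail_v=12.0):
    return power_w / rail_v

print(round(rail_current_amps(380), 1))  # ~31.7 A on the +12 V rail
```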


----------



## kizwan

If I remember correctly, someone complained about this in the past when overclocking with a 120Hz (or overclocked) monitor.


----------



## Buehlar

Quote:


> Originally Posted by *bond32*
> 
> Yes, I understand this. And for the record, I am using two EVGA 1300-watt G2 power supplies. But I am wondering: once the voltage is raised to the levels I am referring to, such as 1.4 V, how much current does the card draw?
> 
> P = VI - is it really that simple? I have seen one of my cards draw 380 watts; what does that mean for the current supplied? Is the G2 power supply not capable of that kind of current?
> 
> Another question: would I have any reason to believe dual-link DVI isn't best for benching? The issue I'm facing is that once the cards are pushed to 1.38 V or more, the display gets distorted with vertical lines to all hell and back... I always assumed this meant unstable clocks, but I have seen the lines appear on the desktop after only adjusting the voltage...


Well then, in that case, maybe it's simply too much voltage for one of your cards? Ease off of her a little bit, man








I've never seen the lines you're describing, but I haven't pushed mine past 1.35 yet - just got the waterblocks on, and I'm leak-testing as we speak.
I'll do more tests @ higher voltages soon.









...also, DVI should have no effect whatsoever on your benches as long as it supports your monitor's resolution. Running tests between DVI, HDMI, and DP will reveal no plus or minus in performance.


----------



## bluedevil

I think I just figured out why my 290 isn't performing up to par: my PSU is overtaxed. With my 290 and an OC'd 3470, I am pulling almost 500W. Now, my PSU is Gold-rated at 92%, so that's 505W. So I am faced with a dilemma: get a bigger (750W) PSU, or go to team green for a GTX 970?


----------



## the9quad

Quote:


> Originally Posted by *bluedevil*
> 
> I think I just figured out why my 290 isn't performing up to par: my PSU is overtaxed. With my 290 and an OC'd 3470, I am pulling almost 500W. Now, my PSU is Gold-rated at 92%, so that's 505W. So I am faced with a dilemma: get a bigger (750W) PSU, or go to team green for a GTX 970?


If I were in your shoes, I'd sell the 290 and get a 970, to be honest. You'd probably end up cheaper out of pocket, and you'd have a nicer (IMO) card.


----------



## Ironsmack

Quote:


> Originally Posted by *the9quad*
> 
> If I were in your shoes, I'd sell the 290 and get a 970 to be honest. Probably end up out of pocket cheaper, and you'd have a nicer (imo) card.


That's assuming you break even selling the 290 and buying the 970. With the overabundance of AMD cards on the market, it's tough to sell those cards.

But that's good for those looking to crossfire/trifire, like myself. It's a buyer's market right now.

IMO, just upgrade to a higher-wattage PSU.


----------



## Roboyto

Quote:


> Originally Posted by *bluedevil*
> 
> I think I just figured out why my 290 isn't performing up to par: my PSU is overtaxed. With my 290 and an OC'd 3470, I am pulling almost 500W. Now, my PSU is Gold-rated at 92%, so that's 505W. So I am faced with a dilemma: get a bigger (750W) PSU, or go to team green for a GTX 970?


Pulling 500W under what kind of load/OC?

My HTPC is 290/4770k powered by a Capstone 450M.

Gigabyte Z97N-WiFi

4770k @ 4.3GHz 1.25V // 1.812 VRIN

8GB 2133 RAM 1.5V

290 @ 1075/1375 +37mV

1 SSD & 2 HDD

2 Antec 620 AIOs

3 SilenX Effizio 120mm Fans

Running Tomb Raider all max settings at 1080P gives the 290 one hell of a workout and it runs like a champ.

Also, my main rig is 4770k/290 powered by the Capstone 650M. My 290 in here clocks about as well as any other under water up to 1300/1700 and the 650M doesn't flinch when driving 5760*1080. Gaming clocks are 1200/1500 +87mV; 4770k @ 4.5GHz 1.259V


----------



## bluedevil

Here is my setup

Gigabyte Z77N-WiFi
3470 @ 4.1GHz 1.3V
8GB 1600 RAM 1.35
290 @ 1200/1400 +75mV
1 SSD & 1 HDD
2 CM Seidon AIOs
3 CM BladeMaster fans
1 Antec Spotcool

And I am using:
http://powersupplycalculator.net/

Also, I am using this review as the basis for the GPU's consumption:
http://www.guru3d.com/articles_pages/radeon_r9_290_review_benchmarks,10.html


----------



## ZealotKi11er

Quote:


> Originally Posted by *bluedevil*
> 
> I think I just figured out why my 290 isn't performing up to par: my PSU is overtaxed. With my 290 and an OC'd 3470, I am pulling almost 500W. Now, my PSU is Gold-rated at 92%, so that's 505W. So I am faced with a dilemma: get a bigger (750W) PSU, or go to team green for a GTX 970?


What is your PSU? From your sig rig I see you have a Rosewill CAPSTONE-550-M (550W).

You are saying your system pulls 500W, which is a bit high considering a stock 290X for me pulls ~390W, and my system has a lot more fans and is watercooled. If you are using +200mV, then that's why.

Now, if you do the math, your PSU (Gold) is ~87% efficient at low load, 90% at half load, and 87% at full load, so you are looking at ~88-89% efficiency. 500W * 0.88 = 440W - that's the actual power you are drawing through the PSU. You have 100W more to go.


----------



## bluedevil

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What is your PSU? From your sig rig I see you have a Rosewill CAPSTONE-550-M (550W).
> 
> You are saying your system pulls 500W, which is a bit high considering a stock 290X for me pulls ~390W, and my system has a lot more fans and is watercooled. If you are using +200mV, then that's why.
> 
> Now, if you do the math, your PSU (Gold) is ~87% efficient at low load, 90% at half load, and 87% at full load, so you are looking at ~88-89% efficiency. 500W * 0.88 = 440W - that's the actual power you are drawing through the PSU. You have 100W more to go.


Yep, that's my PSU:
http://www.newegg.com/Product/Product.aspx?Item=N82E16817182262

All I can figure is that my 290 (254W stock) is pulling more than 300W and my 3470 (77W stock) about 115W when both are OC'd. So that's at least 415W just from those two components, leaving about 91W for the rest of the system.

Also, my math is this: 92% efficiency x 550W = 506W
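
The budget arithmetic in that post works out like this (the wattages are the poster's own ballpark OC estimates; whether to count against the full 550 W rating or the derated 506 W figure is the point of contention a few posts down):

```python
# Component power budget using the rough OC'd estimates from the post.
# All wattages are the poster's ballpark figures, not measurements.
loads_w = {"R9 290 OC'd": 300, "i5-3470 OC'd": 115}

total = sum(loads_w.values())
print(total)        # 415 W for GPU + CPU alone
print(506 - total)  # 91 W left against the derated 506 W figure
print(550 - total)  # 135 W left against the full 550 W DC rating
```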


----------



## Roboyto

Quote:



> Originally Posted by *bluedevil*
> 
> Here is my setup
> 
> Gigabyte Z77N-WiFi
> 3470 @ 4.1GHz 1.3V
> 8GB 1600 RAM 1.35
> 290 @ 1200/1400 +75mV
> 1 SSD & 1 HDD
> 2 CM Seidon AIOs
> 3 CM BladeMaster fans
> 1 Antec Spotcool
> 
> And I am using.
> http://powersupplycalculator.net/
> 
> Also I am using this review for the basis on the GPU consumption.
> http://www.guru3d.com/articles_pages/radeon_r9_290_review_benchmarks,10.html


I got 489 according to the calculator. I guessed at 20% additional voltage for the CPU overclock. Add a few more watts for your AIO pumps and a little more for your GPU overclock, and you would probably end up at around 550W.

However, you have to figure that cooling the GPU with an AIO is doing wonders for its efficiency, so even with your OC you are probably pulling around stock wattage. Those stock consumption figures were probably read with the 290 running at 94C, just under throttling. LegitReviews showed a 44W decrease when attaching an AIO cooler: http://www.legitreviews.com/nzxt-kraken-g10-gpu-water-cooler-review-on-an-amd-radeon-r9-290x_130344/5

I have a very similar setup, with a more power-hungry CPU and a smaller GPU clock, but only a 450W PSU. I'm not leaning towards the PSU in this situation unless it is failing. Do you happen to have another one to test with?


----------



## bluedevil

Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> I got 489 according to the calculator. Guessed at 20% additional voltage for CPU overclock. Add a few more watts for your AIO pumps, and add a little more for your GPU overclock and you would probably end up at around 550W.
> 
> However, you have to figure that cooling the GPU with an AIO is doing wonders for it's efficiency so even with your OC you are probably pulling around stock wattage. Those stock consumption figures were probably read with the 290 running at 94C just under throttling. LegitReviews showed a 44W decrease when attaching an AIO cooler: http://www.legitreviews.com/nzxt-kraken-g10-gpu-water-cooler-review-on-an-amd-radeon-r9-290x_130344/5
> 
> I have a very similar setup, with a more power hungry CPU, a smaller GPU clock, but only a 450W PSU. I'm not leaning towards the PSU in this situation unless it is failing. Happen to have another one to test with?


No, unfortunately I don't.


----------



## Roboyto

Quote:



> Originally Posted by *bluedevil*
> 
> No unfortunately I don't.


A local store that sells PSUs with a good return policy? Nicely open it up, plug it in and see if anything changes.

What is the problem you are having BTW?


----------



## ZealotKi11er

Quote:


> Originally Posted by *bluedevil*
> 
> yep that's my PSU
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817182262
> 
> All I can figure is that my 290 (254w stock) is pulling more than 300w and my 3470 (77w stock) about 115w when both are OC'd. So that's at least 415w just from those two components, leaving about 91w to go for the rest of the system.
> 
> Also my math is this 92% efficiency X 550w = 506w


Why 550W * 0.92? What is that for? If the PSU is 92% efficient, then it can pull 550/0.92 ~ 600W from the wall. Your PSU is not 92%, just so you know - it's ~87% at 100% load and 90% at 50% load.
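
That's the key direction of the efficiency math: the label rating is DC output, and efficiency only tells you how much extra the unit pulls from the wall to deliver it. A minimal sketch using the round numbers quoted in the thread:

```python
# Efficiency relates wall (AC) draw to DC output; the label rating is
# the DC side. Percentages here are the round numbers from the thread.
def wall_draw_w(dc_output_w, efficiency):
    """AC watts pulled from the outlet to deliver a given DC output."""
    return dc_output_w / efficiency

def dc_delivered_w(wall_w, efficiency):
    """DC watts delivered for a measured wall draw."""
    return wall_w * efficiency

print(round(wall_draw_w(550, 0.92)))     # 598 W at the outlet for 550 W out
print(round(dc_delivered_w(500, 0.88)))  # 440 W DC behind a 500 W wall reading
```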


----------



## bluedevil

Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> 
> A local store that sells PSUs with a good return policy? Nicely open it up, plug it in and see if anything changes.
> 
> What is the problem you are having BTW?


For example, in 3DMark my 290 gets a lower score than most, and in BF4 I get a lot of dips. Things like that. I also have the bug - the "I need to get new hardware all the time" bug.


----------



## Roboyto

Quote:



> Originally Posted by *bluedevil*
> 
> For example, in 3DMark my 290 gets a lower score than most, and in BF4 I get a lot of dips. Things like that. I also have the bug - the "I need to get new hardware all the time" bug.


You need faster RAM. BF4 is especially sensitive to RAM speeds. The best blend of performance/cost is going to be 2133MHz. I suggest going with some that is a little more expensive with a 1.5V requirement, as opposed to 1.6V-1.65V. This helps keep your CPU temps down as well, since RAM voltage regulation is handled in the CPU.

Is it your graphics score that is lower, or your overall score? A 3470 will get crushed by i7s, and you will show a lower overall score.


----------



## bluedevil

Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> 
> You need faster RAM. BF4 is especially sensitive to RAM speeds. Best blend of performance/cost is going to be 2133MHz. I suggest going with some that is a little more expensive with a 1.5V requirement opposed to 1.6V-1.65V.. This helps keep your CPU temps down as well since RAM voltage regulation is handled in the CPU.
> 
> Is it your graphics score that is lower, or your overall score? A 3470 will get crushed by i7s and you will show a lower overall score.


Here is my best run ever... still a low score for a 290 @ 1.2GHz, though.

http://www.3dmark.com/fs/2871888


----------



## Roboyto

Quote:


> Originally Posted by *bluedevil*
> 
> Here is my best run ever.....still low score for a 290 though @ 1.2ghz.
> 
> http://www.3dmark.com/fs/2871888


BF4 RAM Speed Comparison

http://www.overclock.net/t/1442038/build-log-the-hawaiian-heat-wave/120_20#post_21628080

You can compare benchmark scores to my spreadsheet here

http://www.overclock.net/t/1456279/honey-i-shrunk-the-ultra-tower-beast-my-journey-to-creating-a-more-compact-pc-with-an-r9-290/20_20#post_21847939

I haven't run Firestrike in a while, I can do it in a little bit.


----------



## bluedevil

So what you are saying is that my lil 1600MHz sticks should be swapped out for 2400s?


----------



## BradleyW

Quote:


> Originally Posted by *bluedevil*
> 
> So what you are saying is that my lil 1600mhz sticks should be swapped out for 2400s?


See my sig. I did RAM testing ages ago.


----------



## battleaxe

Well, I finally got one of my 290's to do 1200 on the core. Then I tried 1210 and it gave me a black screen. Now, when it boots past the Windows splash screen, it's all black. I took the card out, re-seated it, and used DDU to uninstall the drivers - same thing.

If I uninstall the drivers completely it will go to the desktop, but as soon as I re-install the drivers, it's back to a black screen at the Windows login. Awesome. I think I found a solution on Tom's Bumware... a few others had a similar issue... sure hope it works.


----------



## Roboyto

Quote:


> Originally Posted by *bluedevil*
> 
> So what you are saying is that my lil 1600mhz sticks should be swapped out for 2400s?


2133 at the least.

Your graphics score is better than mine, but that was with 14.3. What driver version are you running now?

My 290 at 1200/1500... but I need a driver update, apparently, since I'm on 14.6

Graphics Score: 11009

http://www.3dmark.com/3dm/4452761?

After a DDU driver wipe and 14.9 Install.

Graphics Score 11606

http://www.3dmark.com/3dm/4452870?


----------



## bluedevil

Quote:


> Originally Posted by *Roboyto*
> 
> 2133 at the least.
> 
> Your graphics score is better than mine, but that was with 14.3 What driver version you running now?
> 
> My 290 at 1200/1500..but I need a driver update apparently since I'm on 14.6
> 
> Graphics Score: 11009
> 
> http://www.3dmark.com/3dm/4452761?
> 
> After a DDU driver wipe and 14.9 Install.
> 
> Graphics Score 11606
> 
> http://www.3dmark.com/3dm/4452870?


Mmmm, running 14.9.1 atm. I am not buying into this memory talk......


----------



## Ironsmack

Quote:


> Originally Posted by *bluedevil*
> 
> So what you are saying is that my lil 1600mhz sticks should be swapped out for 2400s?


Not necessarily. What happens if you leave the 290 at stock settings? Do you still have the same problems in BF4?


----------



## bluedevil

Quote:


> Originally Posted by *Ironsmack*
> 
> Not necessarily. What happens if you leave the 290 on stock settings? Do you still have the same problems on BF4?


Not really.....thinking I just have a bum OCing 290.


----------



## Roboyto

Quote:


> Originally Posted by *bluedevil*
> 
> Mmmm, running 14.9.1 atm. I am not buying into this memory talk......


Far Cry 3 and Metro aren't BF4. BF4 is designed around, and to utilize, modern hardware. Are you running DX or Mantle?

Latency should also definitely be considered, so it is a good idea to get some quality RAM with good speed and latency.

Run Firestrike now. What score do you get with 14.9? Compare to mine, because I know mine performs as it should.

Are you sure your overclock is stable? How extensively have you tested it? I literally ran benches hundreds of times to find my overclock. Sometimes a few mV or MHz can make a big difference in performance.


----------



## ZealotKi11er

Quote:


> Originally Posted by *bluedevil*
> 
> Not really.....thinking I just have a bum OCing 290.


What kind of voltage do you need for 1200MHz?


----------



## Ironsmack

Quote:


> Originally Posted by *bluedevil*
> 
> Not really.....thinking I just have a bum OCing 290.


Well, maybe...

My two 290's OC to 1150/1400 with +69mV; from the outlet, my system pulls around 900+ watts.

However, when it's at 1050/1300 with +53mV, it pulls around 755+ watts.

So it could be your PSU.
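
Read as a back-of-envelope delta, those two data points put a number on what the extra clocks and voltage cost at the wall (illustrative arithmetic on the quoted figures, ignoring any CPU share of the change):

```python
# Wall-power delta between the two dual-290 profiles quoted above.
# Figures are the poster's outlet readings, treated as round numbers.
high_oc_w, low_oc_w, gpu_count = 900, 755, 2

delta_w = high_oc_w - low_oc_w
print(delta_w)              # 145 W extra at the wall for the higher profile
print(delta_w / gpu_count)  # 72.5 W per card, CPU share ignored
```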


----------



## bluedevil

Quote:


> Originally Posted by *Roboyto*
> 
> Far Cry 3 and Metro aren't BF4. BF4 is designed around and to utilize modern hardware. You running DX or Mantle?
> 
> Latency should also definitely be considered. So it is a good idea to get some quality RAM with good speed and latency.
> 
> Run Firestrike now. What score do you get with 14.9? Compare to mine, because I know mine performs as it should.
> 
> Are you sure your overclock is stable? How extensively have you tested your overclock? I literally ran benches 100s of times to find my overclock. Sometimes a few mV or MHz can make a big difference in performance.


I will run Firestrike in a bit....but I did find this. It's older.


----------



## Roboyto

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What kind of voltage do you need for 1200MHz?


+75mV, but he didn't specify which OC utility.

Quote:


> Originally Posted by *Ironsmack*
> 
> Well, maybe....
> 
> My (2) 290's OC to 1150/1400 with 69mV, from the outlet - my system pulls around 900+ watts.
> 
> However, when its 1050/1300 with +53 mV, it pulls around 755+ watts.
> 
> So it could be your PSU.


The 3930X is likely pulling much more than the 3470.

Quote:


> Originally Posted by *bluedevil*
> 
> I will run Firestrike in a bit....but I did find this. It's older.


This video just leaves too many questions, with unknown variables. I used to have the exact same RAM you do, and I was able to get it up to a little over 2000 MHz. You should try overclocking yours and see if there is a difference.


----------



## josephimports

Quote:


> Originally Posted by *bluedevil*
> 
> Here is my best run ever.....still low score for a 290 though @ 1.2ghz.
> 
> http://www.3dmark.com/fs/2871888


Quote:


> Originally Posted by *Roboyto*
> 
> 2133 at the least.
> 
> Your graphics score is better than mine, but that was with 14.3 What driver version you running now?
> 
> My 290 at 1200/1500..but I need a driver update apparently since I'm on 14.6
> 
> Graphics Score: 11009
> 
> http://www.3dmark.com/3dm/4452761?
> 
> After a DDU driver wipe and 14.9 Install.
> 
> Graphics Score 11606
> 
> http://www.3dmark.com/3dm/4452870?


Here is my new HIS IceQ 290 at 1200/1500. Graphics score of 13,272. No mods or tweaks. Latest updates and drivers. Stock cooler using Trixx and +200mv.

http://www.3dmark.com/3dm/4420953


----------



## StrongForce

Quote:


> Originally Posted by *Buehlar*
> 
> Your PSU "should" meet the requirements for 290x tri-fire, however, with everything heavily OCed, you're at the very minimum.
> 
> Consider my power usage test @ 1.35v. With the PSU powering "only one 290x" it was drawing a massive 405.6w input from the wall and GPU output was 373.5w.
> 373.5 x 3 = 1120.5w
> 
> and you're adding even more voltage...so yea, the odds of a PSU bottleneck are fair... figure 400w per GPU = 1200w and that leaves you with 100w of headroom.
> 
> Is your CPU also OCed? If so, try a run without OCing your CPU.
> 
> 290x @1.35v
> 


Lol dude, how do you configure Corsair Link so it actually works like that?







I tried to look around and couldn't find it!
Quote:


> Originally Posted by *bond32*
> 
> What, aside from unstable overclocks, causes the display to show lines all over? In some testing last night I observed that setting the voltage to 1.36-ish on the PT3 BIOS, regardless of clock speeds, would cause vertical lines to skew the display. I thought that had to do with a memory issue, but what are the chances the three cards I have were drawing more current than my PSU can provide?
> 
> I'm going to run two cards per PSU tonight. Last night's tests were running all three on one power supply.


Do you mean lines when it crashes, or just lines like that?


----------



## bluedevil

Quote:


> Originally Posted by *josephimports*
> 
> Here is my new HIS IceQ 290 at 1200/1500. Graphics score of 13,272. No mods or tweaks. Latest updates and drivers. Stock cooler using Trixx and +200mv.
> 
> http://www.3dmark.com/3dm/4420953


This graphics score is what I am talking about. I have no clue how in the world you can beat me running the same 290 clock.









Running Firestrike now.


----------



## Roboyto

Quote:


> Originally Posted by *josephimports*
> 
> Here is my new HIS IceQ 290 at 1200/1500. Graphics score of 13,272. No mods or tweaks. Latest updates and drivers. Stock cooler using Trixx and +200mv.
> 
> http://www.3dmark.com/3dm/4420953


Intriguing. Your score led me to enable monitoring of clock speeds via RTSS. My GPU is not hitting 1200 during the benchmark for some reason; it is only peaking at 1026. But the VRAM is going to the appropriate speed of 1500.

Trixx was acting funny, so I double-checked with Afterburner, and the same thing is happening.


----------



## bluedevil

Here's my run of 9644

http://www.3dmark.com/3dm/4453136

3470 @ stock settings

290 @ 1200/1500 with +75vddc


----------



## josephimports

The Hynix memory helps benchmark scores. Set the power limit to +50% to hold the clock speed.


----------



## Roboyto

Quote:


> Originally Posted by *josephimports*
> 
> Here is my new HIS IceQ 290 at 1200/1500. Graphics score of 13,272. No mods or tweaks. Latest updates and drivers. Stock cooler using Trixx and +200mv.
> 
> http://www.3dmark.com/3dm/4420953


Quote:


> Originally Posted by *Roboyto*
> 
> Intriguing. Your score led me to enable monitoring of clock speeds via RSS. My GPU is not hitting 1200 during the benchmark for some reason, it is only peaking at 1026...but the VRAM is going to the appropriate speed of 1500.
> 
> Trixx was acting funny so I double checked with Afterburner and the same thing is happening.


Uninstalled Afterburner and Trixx, installed an older version of Trixx, and it is holding the 1200 core speed now.

1200/1500 +87mV

Graphics: 13328

http://www.3dmark.com/3dm/4453199?

Pushing the envelope

1290/1675 +175mV

Graphics: 14191

http://www.3dmark.com/3dm/4453286?

Quote:


> Originally Posted by *bluedevil*
> 
> Here's my run of 9644
> 
> http://www.3dmark.com/3dm/4453136
> 
> 3470 @ stock settings
> 
> 290 @ 1200/1500 with +75vddc


9644 is your combined score; your graphics score is 12484.

Use something to monitor clocks during the benchmark - maybe they aren't holding at peak, like mine wasn't.

Quote:


> Originally Posted by *bluedevil*
> 
> This graphics score is what I am talking about. I have no clue how in the world you can beat me running the same 290 clock.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Running Firestrike now.


Try a little more voltage or power limit, or reduce the clocks just a hair. Instability can show up as lower scores before visible signs appear.


----------



## Jorginto

Quote:


> Originally Posted by *Devildog83*
> 
> Did you sweep and reinstall the drivers after you installed both cards. Sometimes deleting and reinstalling the monitoring software helps too because they all have there own system info they run.


But should I install the drivers with one card and add the second one after that? Because with R9 280X CF it didn't make any difference.


----------



## Buehlar

Quote:


> Originally Posted by *StrongForce*
> 
> Lol dude how do you configure corsair link so it actually works like that ?
> 
> 
> 
> 
> 
> 
> 
> I tryed to look arround and couldn't find !
> To show line when it crashes or just lines like that ?


All I did was plug the Link adapter into the PSU and a USB port, then installed the software. I didn't configure anything; all the settings are still at their defaults.


----------



## JourneymanMike

Quote:


> Originally Posted by *bluedevil*
> 
> Here's my run of 9644
> 
> http://www.3dmark.com/3dm/4453136
> 
> 3470 @ stock settings
> 
> 290 @ 1200/1500 with +75vddc


Here's my run - everything stock - my base line I guess.....

http://www.3dmark.com/fs/2937995


----------



## StrongForce

Quote:


> Originally Posted by *bond32*
> 
> What, aside from unstable overclocks, causes the display to show lines all over? In some testing last night I observed that setting the voltage to 1.36-ish on the PT3 BIOS, regardless of clock speeds, would cause vertical lines to skew the display. I thought that had to do with a memory issue, but what are the chances the three cards I have were drawing more current than my PSU can provide?
> 
> I'm going to run two cards per PSU tonight. Last night's tests were running all three on one power supply.


Quote:


> Originally Posted by *Buehlar*
> 
> All I did was plug the link adapter into the PSU & USB port then installed the software. I didn't configure anything, all the settings are still set to default.


A USB port?? I don't remember seeing a USB port. I'll have to take a look again... thanks for the info


----------



## bluedevil

Creepin.....9745

http://www.3dmark.com/3dm/4455613

and another run. Sooooo close to 10k....







9906

http://www.3dmark.com/fs/3040676


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> Creepin.....9745
> 
> http://www.3dmark.com/3dm/4455613
> 
> and another run. Sooooo close to 10k....
> 
> 
> 
> 
> 
> 
> 
> 9906
> 
> http://www.3dmark.com/fs/3040676


Try turning off all power-saving features in the BIOS, and set the Windows power option to High Performance.

Your graphics score is only down about 200 pts; the rest is down to the CPU.


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> try turning off all power saving features in BIOS. Windows power option to High.


Lol, done and done. I think it's all my chip. Just weighing my options - this mobo is killing me: no vcore options other than Auto. Makes upgrading to a 3770K seem hard; I don't really want to gimp it on vcore. I just might have to take the plunge and get a 4790K. I know this is what's holding back all of my scores.


----------



## kizwan

I think the reason why you only see Auto for Vcore is because you have a locked CPU.


----------



## bluedevil

Quote:


> Originally Posted by *kizwan*
> 
> I think the reason why you only see Auto for Vcore is because you have a locked CPU.


Nope - the vcore is locked by the board:

http://www.tweaktown.com/reviews/5240/gigabyte-z77n-wifi-intel-z77-mini-itx-motherboard-review/index3.html

http://www.overclock.net/t/1316594/gigabyte-z77n-wifi-h77n-wifi-no-voltage-control-voltage-hard-mod-info-inside

http://www.pcper.com/reviews/Motherboards/GIGABYTE-Z77N-WiFi-Motherboard-Review/Overclocking-Results

http://www.techspot.com/products/motherboards/gigabyte-ga-z77n-wifi.87405/


----------



## kizwan

I stand corrected! I've never owned a mini-ITX motherboard before.


----------



## bluedevil

Quote:


> Originally Posted by *kizwan*
> 
> I stand corrected! I've never own mini motherboard before.


Lol.







All is good. I feel like I should have waited a month to get my CPU/mobo, though... Z87 was a month out, and so was Haswell.

Should have done a Z87/4670K instead.


----------



## bluedevil

Odd. My 1200/1500 with +75mV and +50% power limit is stable in 3DMark for the 5 passes I did this morning, but when I go to play BF4, I can't even get 1100/1400 stable. WTH?


----------



## Agent Smith1984

Power supply...

Once you start running BF4, you have the CPU and GPU under load at the same time...
550W doesn't cut it for that... my OCZ 700W gets crushed when overclocking my CPU and GPU together...

If I run Prime95 and Kombustor together, it overloads and shuts off, and will not turn back on without being left unplugged for 30 seconds (to reset the breaker).

Get some more power going to that system, and you will be ooootttaaaayyyy









Edit: That's just my theory... You didn't specify if you were getting hangs or really bad artifacts... your symptoms prompted my suggestion...


----------



## bluedevil

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Power Supply....
> 
> Once you start running BF4, you have CPU and GPU under load at the same time...
> 550W doesn't cut it for that..... my OCZ 700W gets crushed when overclocking my CPU and GPU together....
> 
> If I run Prime95 and Kombuster together, it overloads, and shuts off, and will not turn back on without leaving it unplugged for 30 seconds (to reset the breaker).
> 
> Get some more power going to that system, and you will be ooootttaaaayyyy


Been saying that! Nobody wants to admit it to me! lol... It's either a new PSU or a new GPU - one of the two.









Getting red, white, blue, and green screens. I believe it's GPU-related, but only when I OC it.

Here is my card.

http://www.tweaktown.com/guides/6667/visiontek-radeon-r9-290-video-card-circuit-and-overclocking-guide/index.html

Also found this power draw to be quite interesting....
http://www.tomshardware.com/reviews/radeon-r9-290-and-290x,3728-4.html


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> Been saying that! Nobody wants to admit it to me! lol....It's either a new PSU or a new GPU, one of the two.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Getting red, white, blue, green screens. I believe its GPU related, but only when I OC it.
> 
> Here is my card.
> 
> http://www.tweaktown.com/guides/6667/visiontek-radeon-r9-290-video-card-circuit-and-overclocking-guide/index.html
> 
> Also found this power draw to be quite interesting....
> http://www.tomshardware.com/reviews/radeon-r9-290-and-290x,3728-4.html


any way to test the gpu in another system? i think i asked you that before. yah, i think it's time to stop the madness and go green. lol

anyway, got my block installed on my second 290.



now the hard part. Installing. got to drain and all that.

Do I have to get rid of the silver coil since this block is copper? Thanks.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> 
> 
> now the hard part. Installing. got to drain and all that.
> 
> Do I have to get rid of the silver coil since this block is copper? Thanks.


1- buy a qdc set, connect it to the in/out ports on the gpu. Now you can simply pop the gpu block in and out of the loop without draining. Draining to swap gpu blocks is very pedestrian; there are many shortcuts to swapping blocks. I can swap blocks in 2-3 minutes without draining or refilling.

2- you still use a kill coil?


----------



## bond32

Quote:


> Originally Posted by *tsm106*
> 
> 1- buy qdc set, connect to in/out ports on gpu. Now you can simply pop the gpu block in and out of the loop without draining. Draining to swap gpu blocks is very pedestrian. There's many shortcuts to swapping blocks. I can swap blocks in 2-3 minutes without draining nor refilling.
> 
> 2- you still use a kill coil?


What's your qdc recommendation? Koolance?


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> 1- buy qdc set, connect to in/out ports on gpu. Now you can simply pop the gpu block in and out of the loop without draining. Draining to swap gpu blocks is very pedestrian. There's many shortcuts to swapping blocks. I can swap blocks in 2-3 minutes without draining nor refilling.
> 
> 2- you still use a kill coil?


do you mind linking a good qdc, tsm? thanks.

yah, the coil has been there since i built the loop. like 2 years ago. i'll take it out.


----------



## VSG

Not TSM, but the best QDCs atm are the QD3/QD4 (if you can fit the latter) from Koolance. Just get the nickel finish versions.


----------



## rdr09

Quote:


> Originally Posted by *geggeg*
> 
> Not TSM, but the best QDCs atm are the QD3/QD4 (if you can fit the latter) from Koolance. Just get the nickel finish versions.


thanks, geggeg. i'll check them out.


----------



## tsm106

Quote:


> Originally Posted by *bond32*
> 
> What's your qdc recommendation? Koolance?


Quote:


> Originally Posted by *rdr09*
> 
> do you mind linking a good qdc, tsm? thanks.
> 
> yah, the coil has been there since i built the loop. like 2 years ago. i'll take it out.


Use a coolant instead of a coil?

I use Koolance because they've been making pc-specific qdcs the longest, so they generally have it down. They are far from perfect, mind you: the plating gets stripped over time, they can clog, and they are expensive. Then again, wc as a hobby is expensive, so it goes with the territory, as they say.

Basically, for connecting gpus you need a set of two each, male and female, thus 4 total. **One matched set are compression fittings and one matched set are G1/4 (these screw into the gpu ports). If you are enterprising, add another matched set with a length of tubing for those days when you need the block(s) out of the loop for longer. That extra set "completes" your loop so it can run without your gpu, for example when testing an air card or during a repair/RMA, whatever may arise.

When picking qdcs out, go slow and make sure you have matching sets, i.e. male/female loop-side and male/female gpu-side, then decide whether you want a direct fitting on the gpu side or not, and finally your tubing size. Triple check: you do not want to order the wrong ones, because shipping and returns are hard to negotiate with wc vendors.

**I prefer shorter/closer connections when using qdcs, but you are free to use compressions throughout with a length of tubing. Choice is good, but that is rather redundant imo.

G1/4 vs Compression to gpu block:
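If the fitting count above is hard to follow, here is a tiny sketch of it. The helper name and per-card tally are my reading of the post, not a vendor's parts list:

```python
# Counting QDC halves per tsm106's description (illustrative only):
# each GPU needs one male/female pair loop-side and one pair GPU-side,
# so four halves per card, plus an optional bypass pair with tubing
# that lets the loop run with the GPU block removed.

def qdc_halves(gpus: int, bypass_set: bool = False) -> int:
    """Total individual QDC halves to order for this many GPU blocks."""
    per_gpu = 4                     # male+female loop-side, male+female GPU-side
    extra = 2 if bypass_set else 0  # one spare matched pair for the bypass
    return gpus * per_gpu + extra

print(qdc_halves(1))                   # 4
print(qdc_halves(2, bypass_set=True))  # 10
```

The count grows fast with a second card and the bypass pair, which is why triple-checking the order before paying shipping matters.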


----------



## bluedevil

Quote:


> Originally Posted by *rdr09*
> 
> anyway to test the gpu in another system? i think i asked you that before. yah, i think its time to stop the madness and go green. lol
> 
> anyway, got my block installed on my second 290.
> 
> 
> 
> now the hard part. Installing. got to drain and all that.
> 
> Do I have to get rid of the silver coil since this block is copper? Thanks.


Done. Ordered a Gigabyte GTX 970 Windforce OC.


----------



## tsm106

Oh, forgot: there's a secret to filling for speed. I do this all the time with my quads. Filling a quad-blocked array creates not only massive amounts of bubbles, but very, very tiny bubbles as well, the kind that get stuck in rads and leave pockets of air behind. I avoid this by pre-filling my blocks. It's a lil or a lot crazy depending on how daring/experienced you are, I suppose. I also use coolant, which reduces the chances of liquid death. I'll try to explain it using the pic of my son's rig.



The 90-deg top fitting is what I use for filling. The bottom fitting I twist so it points away from the block. After the block is removed, I stand it up against the sink with that fitting, the one pointing away from the gpu, now pointed at the sink. Loosen this fitting a few turns so air can escape. You can bag (and tape) this fitting end to catch the water blow-out so liquid doesn't get on your block. Backplates are great while doing this. Now the other end you fill slowly; a syringe helps too, another must-have in a loop tool box. When I do this with quads, I fill a lil bit, blow into the fill fitting to push water thru the array, fill some more, blow again, rinse, repeat. When the block is filled, water and air will obviously come out the other side. Now screw that side tight thru the baggy. This side is done for now, until you are ready to remove the baggy and wipe the fitting dry. Back on the fill side, I also pre-fill the qdc and then screw it on carefully so as not to spill liquid all over my work. And that's how I pre-fill my gpu block. It saves me a few hours of bubble-removal time, which is about how long it takes for the super tiny bubbles to dissipate.

I'll be doing this again Thurs as I will be testing another Lightning then. I'll try to take pics of the process. Remember to use paper towels, keep things dry, and prevent drips at all times.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> Use a coolant vs a coil?
> 
> I use koolance because they've been making pc specific qdc the longest so they generally have it down. Though they are far from perfect mind you. The plating gets stripped over time, they can clog too, and they are expensive, though wc as a hobby is expensive so it goes with the territory as they say.
> 
> Basically, for connecting gpus you need a set of two each, male and female, thus 4 total. **One matched set are compression fittings and one matched set are G1/4 (these screw into the gpu ports). If you are enterprising, add another matched set with a length of tubing for those days when you need the block(s) out of the loop for longer. That extra set "completes" your loop so it can run without your gpu, for example when testing an air card or during a repair/RMA, whatever may arise.
> 
> When picking qdcs out, go slow and make sure you have matching sets, i.e. male/female loop-side and male/female gpu-side, then decide whether you want a direct fitting on the gpu side or not, and finally your tubing size. Triple check: you do not want to order the wrong ones, because shipping and returns are hard to negotiate with wc vendors.
> 
> **I prefer shorter/closer connections when using qdc, but you are free to use compressions completely with a length of tubing. Choice is good, but that is rather redundant imo.
> 
> G1/4 vs Compression to gpu block:
> 
> 
> Spoiler: Warning: Spoiler!


Went to the Koolance site and found some. About $50 for a pair (4 pcs) excluding shipping; the company is local, based in Washington state. I may have to invest in those.


----------



## Devildog83

Quote:


> Originally Posted by *bluedevil*
> 
> yep that's my PSU
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817182262
> 
> All I can figure is that my 290 (254w stock) is pulling more than 300w and my 3470 (77w stock) about 115w when both are OC'd. So that's at least 415w just from those two components, leaving about 91w to go for the rest of the system.
> 
> Also my math is this 92% efficiency X 550w = 506w


I think you are right; that's an awfully big load for a 550w psu. Even if it can handle it, which it should be able to, do you want to run it at that for any length of time? They do degrade over time, especially when they carry a big load. I have a 660w platinum with my 290 and it's just about right, but if I needed a new PSU I would get an 850w at least, maybe 1000w, because at some point I will be adding another card and I am always upgrading.
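On the efficiency math that keeps coming up: a 550 W unit is rated for 550 W of DC output, and the efficiency figure describes wall draw, not a cap on output. A rough sketch, using the ballpark component wattages mentioned above (assumptions, not measurements):

```python
# Rough PSU headroom check. A 550 W PSU can deliver 550 W of DC power;
# efficiency (e.g. 92%) only tells you how much it pulls from the outlet
# to do so. All wattages below are illustrative guesses from the thread.

def wall_draw(dc_load_w: float, efficiency: float) -> float:
    """Watts pulled from the outlet to supply dc_load_w to the components."""
    return dc_load_w / efficiency

def headroom(psu_rated_w: float, dc_load_w: float) -> float:
    """DC wattage left before the PSU is at its rated output."""
    return psu_rated_w - dc_load_w

# Illustrative loads: OC'd 290 ~300 W, OC'd i5 ~115 W,
# plus ~75 W for board, drives and fans (an assumption).
load = 300 + 115 + 75  # 490 W DC

print(headroom(550, load))           # 60 W of rated headroom left
print(round(wall_draw(load, 0.92)))  # ~533 W drawn at the wall
```

So the "92% x 550 W = 506 W" figure runs the arithmetic in the wrong direction; the efficiency number never shrinks the rated output.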


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> Done. Ordered a Gigabyte GTX 970 Windforce OC.


Congrats, blue!


----------



## ZealotKi11er

Quote:


> Originally Posted by *bluedevil*
> 
> Done. Ordered a Gigabyte GTX 970 Windforce OC.


Hopefully you don't end up changing everything in your rig to attain high scores that the 290 never could.


----------



## bluedevil

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Hopefully you don't end up changing everything in your rig to attain high scores that the 290 never could.


I will let you know when I hit 1.5ghz with it.


----------



## Jorginto

Guys, is there any way to stabilize the voltage on my 290s? While benching it's all over the place, from 1.213 to 1.313, and I'm getting artifacts at 1125MHz core. I was hoping for the 1200-ish area, but with that vdrop I doubt it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Jorginto*
> 
> Guys, is there any way to stabilize the voltage on my 290's. While benching it's all over the place from 1.213 to 1.313 and I'm getting artifacts at 1125MHz core. Was hoping for a 1200 ish area but with that vdrop I doubt it.


How much voltage are you adding?


----------



## Jorginto

100 mV, I've got the reference powercolors.

This is my best so far:
http://www.3dmark.com/3dm/4460102


----------



## ZealotKi11er

Quote:


> Originally Posted by *Jorginto*
> 
> 100 mV, I've got the reference powercolors.


Heat is your problem. Try +50mV. I don't even run +100mV with water-cooled cards.


----------



## Jorginto

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Heat is your problem. Try +50mV. I don't even run +100mV with water-cooled cards.


Nah, that's not it. I'm benching jet-turbine style before I install my water blocks: 80% fan speed and the temps hold at 70C. VRMs the same.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Jorginto*
> 
> Nah, that's not it. I'm benching jet-turbine style before I install my water blocks: 80% fan speed and the temps hold at 70C. VRMs the same.


What happens if you add more voltage? I only start seeing artifacts past 1220MHz.


----------



## Jorginto

Didn't try to go past 100mV; 1.313 is quite a lot already. But as I told you before, I think it would be way better if the voltage were constant.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Jorginto*
> 
> Didn't try to go past 100mV; 1.313 is quite a lot already. But as I told you before, I think it would be way better if the voltage were constant.


That's not the point. If you are getting artifacts, then there are other things at play. For example, let's say your card does 1.2-1.3v and you want to do 1.3v all the time. If you go +200 mV you will go 1.3-1.4v. If you still get artifacts, then it's not the core voltage. The vCore changes based on the game load; you are getting 1.2v when there is not much going on. You can use those custom bioses to see if they help you. I know for my card voltage is not the limit; the chip itself can't OC more than 1220MHz. I can bench 1275MHz, but with artifacts.
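A minimal sketch of the point being made: the core voltage swings with load, so a fixed software offset shifts the whole observed band rather than pinning a single value. The 1.200-1.300 V baseline here is an assumed illustration, not a measured stock band:

```python
# Sketch: vCore on these cards varies with load, so an offset moves the
# entire min-max band. Baseline band is an assumption for illustration.

BASE_MIN_V, BASE_MAX_V = 1.200, 1.300  # assumed swing under varying load

def band_with_offset(offset_mv: int) -> tuple[float, float]:
    """Voltage band you would observe with a software offset applied."""
    return (round(BASE_MIN_V + offset_mv / 1000, 3),
            round(BASE_MAX_V + offset_mv / 1000, 3))

print(band_with_offset(100))  # (1.3, 1.4) - band reported at +100 mV
```

In other words, adding offset raises both ends of the swing; it does not make the sensors read one flat number.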


----------



## Sempre

These new prices are really tempting. Vapor-x 290 for $300


----------



## SeanEboy

...And, I'm in the hole $2400... lol. Thank goodness I make stupid money doing stupid things sometimes.


----------



## BLOWNCO

Quote:


> Originally Posted by *Sempre*
> 
> These new prices are really tempting. Vapor-x 290 for $300


thats why i now have three of them lol


----------



## Sempre

Quote:


> Originally Posted by *BLOWNCO*
> 
> thats why i now have three of them lol


Yeah I remember you're the one who accidentally bought the third card


----------



## rdr09

Quote:


> Originally Posted by *Jorginto*
> 
> Guys, is there any way to stabilize the voltage on my 290's. While benching it's all over the place from 1.213 to 1.313 and I'm getting artifacts at 1125MHz core. Was hoping for a 1200 ish area but with that vdrop I doubt it.


have you played with the Power Limit?

I'd advise you to watercool them first before shooting for higher OCs like 1200, even if the temps look fine on air.


----------



## BLOWNCO

Quote:


> Originally Posted by *Sempre*
> 
> Yeah I remember you're the one who accidentally bought the third card


haha yea, told my wife the same thing! lol. my next check will go to my x79 mobo and then the Intel swap will happen!


----------



## Cyber Locc

Quote:


> Originally Posted by *bluedevil*
> 
> yep that's my PSU
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817182262
> 
> All I can figure is that my 290 (254w stock) is pulling more than 300w and my 3470 (77w stock) about 115w when both are OC'd. So that's at least 415w just from those two components, leaving about 91w to go for the rest of the system.
> 
> Also my math is this 92% efficiency X 550w = 506w


Okay, first off, you're overclocking a non-overclockable processor lol. Explain: are you overclocking the bus? In my past experience that is not a good idea; if you want to overclock, buy a K-series chip, JS. The vcore on your board may be locked, but so is the chip's, as it is not meant to be overclocked. If you're OCing by modding the bus speed, which is the only possible way that I know of, then your crazy experiences with the gpu are likely caused by that right there. You're messing with stuff that board isn't designed to be messed with; it may let you do it, but that doesn't mean it will work right.

That's not how efficiency works; you still get 550 watts of output. However, depending on the overclock these cards can pull up to 450w, so that could easily tax your system. That said, I highly doubt that is the issue, seeing how I had similar issues on my old 680 when trying to overclock a 3560 by bus overclocking; I took the OC off and all of it went away. I also had blue screens every now and then and random lockups.

Also, I would suggest you get a bigger power supply if you plan on going to a 1500 boost clock, as that's going to need a hefty psu. http://www.guru3d.com/articles_pages/msi_geforce_gtx_970_gaming_review,7.html


----------



## bluedevil

Quote:


> Originally Posted by *cyberlocc*
> 
> Okay, first off, you're overclocking a non-overclockable processor lol. Explain: are you overclocking the bus? In my past experience that is not a good idea; if you want to overclock, buy a K-series chip, JS. The vcore on your board may be locked, but so is the chip's, as it is not meant to be overclocked. If you're OCing by modding the bus speed, which is the only possible way that I know of, then your crazy experiences with the gpu are likely caused by that right there. You're messing with stuff that board isn't designed to be messed with; it may let you do it, but that doesn't mean it will work right.
> 
> That's not how efficiency works; you still get 550 watts of output. However, depending on the overclock these cards can pull up to 450w, so that could easily tax your system. That said, I highly doubt that is the issue, seeing how I had similar issues on my old 680 when trying to overclock a 3560 by bus overclocking; I took the OC off and all of it went away. I also had blue screens every now and then and random lockups.
> 
> Also, I would suggest you get a bigger power supply if you plan on going to a 1500 boost clock, as that's going to need a hefty psu. http://www.guru3d.com/articles_pages/msi_geforce_gtx_970_gaming_review,7.html


And what's wrong with OCing a locked processor? Did you know they can move up to 4 bins up? Guess not; that's why you went off all half-cocked. Here is some info for you.

http://forums.vortez.net/overclocking-cooling/5109-how-overclock-non-k-intel-cpus.html

and here

http://www.tomshardware.com/answers/id-1787337/overclocking-locked-multiplier-cpu-3470.html

and here




Also, I am not OCing via the bus; I do know that leads to system instability. Must have learned that over the last 9 years of being here....


----------



## Cyber Locc

Quote:


> Originally Posted by *bluedevil*
> 
> And what's wrong with OCing a locked processor? Did you know they can move up to 4 bins up? Guess not, that's why you went off all half cocked. Here is some info for you.
> 
> http://forums.vortez.net/overclocking-cooling/5109-how-overclock-non-k-intel-cpus.html
> 
> and here
> 
> http://www.tomshardware.com/answers/id-1787337/overclocking-locked-multiplier-cpu-3470.html
> 
> and here
> 
> 
> 
> 
> Also I am not OCing via the bus, I do know that leads to system instability, must have learned that over what the last 9 years of being here....


Lol, you tell me you're not overclocking using the FSB, then link a guide that talks about overclocking the FSB. Too funny.

Also, let me give you a tip from that very guide: "Note - It's actually very hard if not impossible to damage your CPU or motherboard by overclocking the bus. What it is likely to do however is corrupt your Windows install and give you visual artifacts." Sound familiar? *"Visual artifacts"*

"Did you know they can move up to 4 bins up? Guess not." Pretty sure I covered that I knew this when I said I have experienced the same issues doing the same thing.

"Also I am not OCing via the bus, I do know that leads to system instability, must have learned that over the last 9 years of being here...." That blanket statement is not true. Some boards can modify the bus and some chips can get away with it; my RIVE BE benching bios has its bus and BCLK modded and it works fine and stable. That's just one example. I said "that board isn't designed" for it, which is correct; saying it's generally bad is wrong.

"that's why you went off all half cocked" If you don't want help, then I will no longer offer it, no problem. However, that guide is saying the same thing as me, as will many people who have had the same issue; if you don't want to acknowledge that it could be, and most likely is, the issue, that's on you.

Also, in your 9 years here you should have learned that no 2 chips are the same; just because one can do it by no means can yours (also what power supply efficiency means). You should also see that 99% of the time, when overclockers have issues like this, it's the overclock that is the problem. There is more to overclocking than just changing a multiplier; your chip might be stable at 4.0 with a higher voltage, but you cannot give it that. Have you checked it thoroughly for stability?


----------



## bluedevil

Quote:


> Originally Posted by *cyberlocc*
> 
> Lol, you tell me you're not overclocking using the FSB, then link a guide that talks about overclocking the FSB. Too funny.
> 
> Also, let me give you a tip from that very guide: "Note - It's actually very hard if not impossible to damage your CPU or motherboard by overclocking the bus. What it is likely to do however is corrupt your Windows install and give you visual artifacts." Sound familiar? *"Visual artifacts"*
> 
> "Did you know they can move up to 4 bins up? Guess not." Pretty sure I covered that I knew this when I said I have experienced the same issues doing the same thing.
> 
> "Also I am not OCing via the bus, I do know that leads to system instability, must have learned that over the last 9 years of being here...." That blanket statement is not true. Some boards can modify the bus and some chips can get away with it; my RIVE BE benching bios has its bus and BCLK modded and it works fine and stable. That's just one example. I said "that board isn't designed" for it, which is correct; saying it's generally bad is wrong.
> 
> "that's why you went off all half cocked" If you don't want help, then I will no longer offer it, no problem. However, that guide is saying the same thing as me, as will many people who have had the same issue; if you don't want to acknowledge that it could be, and most likely is, the issue, that's on you.
> 
> Also, in your 9 years here you should have learned that no 2 chips are the same; just because one can do it by no means can yours. You should also see that 99% of the time, when overclockers have issues like this, it's the overclock that is the problem.


Moving on.....thanks, but no thanks.


----------



## Cyber Locc

Quote:


> Originally Posted by *bluedevil*
> 
> Moving on.....thanks, but no thanks.


Just lol is all I can say. Have fun.


----------



## bluedevil

Quote:


> Originally Posted by *cyberlocc*
> 
> Just lol is all I can say. Have fun.


Thanks again. Have a great night. I will take it into high consideration.


----------



## bluedevil

Sorry about that earlier guys.









Here is a ss of what I am running day to day. Very very stable.


----------



## ArchieGriffs

But is it very very very stable? Because we know that extra very is incredibly important. I kid, I kid.

My system is not stable and I haven't even been overclocking (it's also why I haven't been overclocking). Gotta get around to plugging in a different PSU and seeing if things change, preferably before I can no longer send my HDD back to newegg for a replacement, assuming it's not the PSU. Whoops, it's already been a month. Good thing it actually works, but the clicking is annoying and clearly a bad sign. When 2 HDDs, one new and one old, start acting up (the new one from the get-go), you have to think it's something else. Oh well... we'll see.


----------



## Jorginto

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Thats not the point. If you are getting artifacts then there are other things at play. For example let say you card does 1.2-1.3v. You want to do 1.3v all the time. If you got +200 mV you will go 1.3-.14v. If you still get artifacts then its not the voltage core. The vCore changes based on the game load. You are getting 1.2v when there is not much going on. You can use those custom bios to see if they help you. I know for my card voltage is not the limit but the chip itself cant OC more then 1220MHz. I can bench 1275MHz but with artifacts.


+125mV helped, but I don't wanna do more on air. This weekend I'm going to have more time to properly install LC blocks and heatsinks.

You're doing 1220 on +100 mV? Is your card also a reference design?


----------



## bluedevil

Quote:


> Originally Posted by *ArchieGriffs*
> 
> But is it very very very stable? Because we know that extra very is incredibly important. I kid, I kid.
> 
> My system is not stable and I haven't even been overclocking (it's also why I haven't been overclocking). Gotta get around to plugging in a different PSU and see if things change, preferably before I can no longer send my HDD back in to newegg for a replacement assuming it's not the PSU Whoops it's already been a month, good thing it actually works, but the clicking is annoying and clearly a bad sign. When 2 HDDs one new and one old start acting up (the new one from the get go), you have to think it's something else. Oh well... we'll see.


Lol 1000 runs of 1000 min of Prime 95. Lol.

In other news, I will be doing a side by side comparison of the VisionTek R9 290 vs Gigabyte GTX 970 Windforce OC, including system wattage draw thanks to the P3 P4400 Kill A Watt I ordered last night.







We will put an end to this PSU talk once and for all. If I am wrong about the power usage and I need a bigger PSU, I will put in an order for one that same day.


----------



## ebhsimon

Quote:


> Originally Posted by *bluedevil*
> 
> Lol 1000 runs of 1000 min of Prime 95. Lol.
> 
> In other news, I will be doing a side by side comparison of the VisionTek R9 290 vs Gigabyte GTX 970 Windforce OC, including system wattage draw thanks to the P3 P4400 Kill A Watt I ordered last night.
> 
> 
> 
> 
> 
> 
> 
> We will put an end to this PSU talk once and for all. If I am wrong about the power usage and I need a bigger PSU, I will then put in a order that day for one.


Yes let us know everything!
Performance @ stock
Performance @ OC
Heat output
Sound levels and signature
Power requirements

I am a little biased towards the 970 and I think it'll win, even though I have two 290s...


----------



## bluedevil

Quote:


> Originally Posted by *ebhsimon*
> 
> Yes let us know everything!
> Performance @ stock
> Performance @ OC
> Heat output
> Sound levels and signature
> Power requirements
> 
> I am a little biased towards the 970 and I think it'll win, even though I have two 290s...


Will do.


----------



## rdr09

Testing my second 290 on my AMD rig. Temps went to almost 50C testing FSU at 1200 core / 1500 memory using AB. The rig only has one 120 rad with a single fan cooling both cpu and gpu. Just testing before crossfiring in the Intel rig.



http://www.3dmark.com/3dm/4469323?

edit: I get a higher score using Trixx at only 1100 core.









http://www.3dmark.com/3dm/4470421?


----------



## battleaxe

Well, it's official: my reference MSI 290 died. I just packed it up for RMA. The RMA is approved.

Do they still have reference cards laying around to send me if mine is totally dead? I kinda doubt it. I'm betting they send me an MSI 290 Gaming. Am I right to think that?


----------



## Jorginto

Quote:


> Originally Posted by *battleaxe*
> 
> Well, its official my reference MSI 290 died.


Was that a voltage issue? What were your 24/7 overclock values?


----------



## battleaxe

Quote:


> Originally Posted by *Jorginto*
> 
> Was that a voltage issue. What were your 24/7 overclock values?


I ran at stock. Did some overclocking and benching the other day and it black-screened and would not come back. Tried everything.


----------



## Jorginto

Did you switch between normal and uber bios?


----------



## battleaxe

It's a 290, so there is no uber. But yes, I did try the other BIOS, to no avail.


----------



## Fickle Pickle

Quote:


> Originally Posted by *battleaxe*
> 
> Well, its official my reference MSI 290 died. I just packed it up for RMA. RMA is approved.
> 
> Do they still have reference cards laying around to send me if mine is totally dead? I kinda doubt it. I'm betting they send me an MSI290 Gaming. Am I right to think that?


I just RMA'd my MSI reference R9 290X and received back another reference version. Good thing too, since I'm under water. So you will pretty much receive a reference.


----------



## battleaxe

Quote:


> Originally Posted by *Fickle Pickle*
> 
> I just RMA'd my MSI reference R9-290x and received back another reference version. Good thing too since I'm underwater. So you will pretty much receive a reference.


Darnit. I was actually hoping to get a gaming card. Oh well.


----------



## Arizonian

Quote:


> Originally Posted by *battleaxe*
> 
> Well, its official my reference MSI 290 died. I just packed it up for RMA. RMA is approved.
> 
> Do they still have reference cards laying around to send me if mine is totally dead? I kinda doubt it. I'm betting they send me an MSI290 Gaming. Am I right to think that?


Sorry to hear that.

Technically, to fulfill the RMA they would only need a refurbished 290 laying around to replace yours.

However, if they didn't have one, I'm not sure whether they would side-grade you. They may upgrade you to a 290X for not being able to provide the exact model, which makes for good customer service. You'll also probably receive some sort of correspondence to get your approval before they just send something else out.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Jorginto*
> 
> +125mV helped, but I don't wanna do more on air. This weekend I'm going to have more time to properly install LC blocks and heatsinks.
> 
> Youre doing 1220 on +100 mV? Is your card also reference design?


One card needs 100 mV, the other needs 125 mV. The thing is, they start at different stock voltages and end up with the same voltage in the end. For 1220 I used 150mV. Tried 200 and it did nothing to remove the artifacts.


----------



## rdr09

1200 core sticks using Trixx . . .

http://www.3dmark.com/3dm/4472826?


----------



## Abyssic

guys... i've got a strange anomaly to talk about...

i wanted to test the performance impact of switching from PCIe 3.0 x16 to x8 on my R9 290X.
to do so, i placed the card in my lower slot, which is only x8. that is the only thing i changed; clocks, settings, even the psu connection stayed the same.
well, it turned out that it really had quite the impact... but here it comes: a POSITIVE impact!

i used valley to benchmark the results and got around 54 fps with my settings @x16. when using x8 i got 68 fps!

my mind is blown and i have no idea what reality is anymore. i tested both multiple times, watching temps and everything. nothing changed, except a good 14 fps more...

any opinions on that? similar experiences?
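One way to sanity-check a surprising result like this is to average several runs per slot and compare the means rather than trusting single numbers. The repeat-run figures below are hypothetical padding around the two reported fps values:

```python
# Sketch of a repeat-run comparison. Only 54 and 68 fps were actually
# reported; the extra runs are made-up values for illustration.

from statistics import mean

x16_runs = [54.0, 54.3, 53.8]  # hypothetical Valley runs in the x16 slot
x8_runs = [68.0, 67.6, 68.2]   # hypothetical Valley runs in the x8 slot

def pct_diff(a: float, b: float) -> float:
    """Percent change going from result a to result b."""
    return (b - a) / a * 100

# A gap this large and this repeatable is well outside normal run-to-run
# noise, so either something real changed or the runs are not comparable.
print(round(pct_diff(mean(x16_runs), mean(x8_runs)), 1))
```

If the gap survives several alternating runs in each slot, it is unlikely to be a one-off fluke, whatever the underlying cause.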


----------



## battleaxe

Quote:


> Originally Posted by *Arizonian*
> 
> Sorry to hear that.
> 
> Technically to fulfill RMA they would only need a refurbished 290 laying around to replace yours.
> 
> However if they didn't have one Im not sure if they would side grade you. They may upgrade you to a 290X for not being able to provide you with the exact model. Which makes for good customer service. You'll also probably receive some sort of correspondence to get your approval before they just sent something else out.


Well, as long as I get one that works I don't really care. These are good cards. It worked flawlessly up until it croaked. Kinda a bummer. I like MSI though, so I'm happy I bought it. At least they have good/decent customer service.


----------



## rdr09

Quote:


> Originally Posted by *Abyssic*
> 
> guys... i've got a strange anomaly to talk about...
> 
> i wanted to test the performance impact of switching from PCIe 3.0 x16 to x8 on my R9 290X.
> to do so, i placed the card in my lower slot, which is only x8. that is the only thing i changed; clocks, settings, even the psu connection stayed the same.
> well, it turned out that it really had quite the impact... but here it comes: a POSITIVE impact!
> 
> i used valley to benchmark the results and got around 54 fps with my settings @x16. when using x8 i got 68 fps!
> 
> my mind is blown and i have no idea what reality is anymore. i tested both multiple times, watching temps and everything. nothing changed, except a good 14 fps more...
> 
> any opinions on that? similar experiences?


based on my test . . . hardly any difference on current cards. must just be a fluke on your run in the 1st slot.

Stock 290 at X8

http://www.3dmark.com/3dm/4403968?

X16

http://www.3dmark.com/3dm/4036887?

got the same (lack of) difference testing with a 7950

Quote:


> Originally Posted by *battleaxe*
> 
> Well, as long as I get one that works I don't really care. These are good cards. It worked flawlessly up until it croaked. Kinda a bummer. I like MSI though, so I'm happy I bought it. At least they have good/decent customer service.


bet they'll give you a 290X.


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> bet they'll give you a 290X.


Let us hope.
















That would be sweet.


----------



## boot318

Quote:


> Originally Posted by *bluedevil*
> 
> Lol 1000 runs of 1000 min of Prime 95. Lol.
> 
> In other news, I will be doing a side by side comparison of the VisionTek R9 290 vs Gigabyte GTX 970 Windforce OC, including system wattage draw thanks to the P3 P4400 Kill A Watt I ordered last night.
> 
> We will put an end to this PSU talk once and for all. If I am wrong about the power usage and I need a bigger PSU, I will then put in a order that day for one.


290 vs 970? You're going to find out the ugly truth about power consumption with AMD. Looking forward to the results.


----------



## battleaxe

Quote:


> Originally Posted by *boot318*
> 
> 290 vs 970? You're going to find out the ugly truth about power consumption with AMD. Looking forward to the results.


Sounds awesome. My bet is that the 290 uses at least 50 watts more. And I don't care one bit.


----------



## Abyssic

Quote:


> Originally Posted by *rdr09*
> 
> based on my test . . . hardly any difference on current cars. must just be fluke on your run on the 1st slot.


well, i'd like to think that too but it doesn't seem like it.
i tested slot one first, then slot two, then slot one again and also slot 2 again (ofc shutdowns in between).
i now also tested 3d mark 11 which gave me 500 gpu-points more in the second slot...


----------



## rdr09

Quote:


> Originally Posted by *Abyssic*
> 
> well, i'd like to think that too but it doesn't seem like it.
> i tested slot one first, then slot two, then slot one again and also slot 2 again (ofc shutdowns in between).
> i now also tested 3d mark 11 which gave me 500 gpu-points more in the second slot...


so, slot one which is X16 is giving lower score?


----------



## Abyssic

Quote:


> Originally Posted by *rdr09*
> 
> so, slot one which is X16 is giving lower score?


exactly.


----------



## wolf9466

I have 2x290X - one of which isn't running, the other is mining X11 in my testing rig.


----------



## rdr09

Quote:


> Originally Posted by *Abyssic*
> 
> exactly.


it could be running cooler in the second slot or . . . there really is something wrong with the first.


----------



## battleaxe

Quote:


> Originally Posted by *Abyssic*
> 
> exactly.


Sum Ting Wong...


----------



## bluedevil

Quote:


> Originally Posted by *boot318*
> 
> 290 vs 970? You're going to find out the ugly truth about power consumption with AMD. Looking forward to the results.


Quote:


> Originally Posted by *battleaxe*
> 
> Sounds awesome. My bet is that the 290 uses at least 50 watts more. And I don't care one bit.


From what I gather, the 970 that I am getting uses about 171W on load, vs 254W for the 290, stock vs stock.


----------



## battleaxe

Quote:


> Originally Posted by *bluedevil*
> 
> From what I gather, the 970 that I am getting uses about 171w on load vs the 290 uses 254w on load, stock vs stock.


Still don't care. But that's not surprising... still good cards if you ask me, and worth the asking price all day long. That being said, I'd love to get a 970... one of those jet-looking things from Zotac, preferably. But I'm done buying GPUs for a while.


----------



## battleaxe

Quote:


> Originally Posted by *wolf9466*
> 
> I have 2x290X - one of which isn't running, the other is mining X11 in my testing rig.


You must have free electricity.


----------



## wolf9466

Quote:


> Originally Posted by *bluedevil*
> 
> Quote:
> 
> 
> 
> Originally Posted by *boot318*
> 
> 290 vs 970? You're going to find out the ugly truth about power consumption with AMD. Looking forward to the results.
> 
> Quote:
> 
> 
> 
> Originally Posted by *battleaxe*
> 
> Sounds awesome. My bet is that the 290 uses at least 50 watts more. And I don't care one bit.
> 
> 
> From what I gather, the 970 that I am getting uses about 171w on load vs the 290 uses 254w on load, stock vs stock.

I'm a miner, so hash per watt matters more to me; so far the 970 looked a bit better, though. I think it gets around 6 MH/s on X11 with an OC, while I can pull 9.1 - 9.2 MH/s X11 from my 290X OC'd. Using your numbers, it's drawing 67.32% of the power while getting 65.22% of the hash. Oh, nice! The 290X is actually doing better! Still gonna work more on my OpenCL kernels; I want to squeeze more out of it.
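wolf9466's ratio math above can be sketched like this (the hashrates and wattages are the rough figures quoted in this exchange, not measured values):

```python
def efficiency_ratio(hash_a: float, watts_a: float,
                     hash_b: float, watts_b: float) -> float:
    """Hash-per-watt of card A relative to card B (>1 means A is more efficient)."""
    return (hash_a / watts_a) / (hash_b / watts_b)

# ~9.15 MH/s at ~254 W for the OC'd 290X vs ~6 MH/s at ~171 W for the 970.
r = efficiency_ratio(9.15, 254, 6.0, 171)
print(f"290X does {r:.2f}x the work per watt of the 970 on X11")
```

At these numbers the 290X comes out slightly ahead per watt, which matches the conclusion drawn in the post.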


----------



## wolf9466

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wolf9466*
> 
> I have 2x290X - one of which isn't running, the other is mining X11 in my testing rig.
> 
> 
> 
> You must have free electricity.

No, I'm a coder. It helps.


----------



## rdr09

Quote:


> Originally Posted by *bluedevil*
> 
> From what I gather, the 970 that I am getting uses about 171w on load vs the 290 uses 254w on load, stock vs stock.


my AC uses 5000w per hour.


----------



## battleaxe

Coder? How exactly does that help? You're burning your boss's electricity at work? lol


----------



## wolf9466

Quote:


> Originally Posted by *battleaxe*
> 
> Coder? How exactly does that help? You're burning your boss's electricity at work? lol


Nope; I have 50%+ more hashrate on AMD using around 17% more power. Therefore, at 0.11 USD per kWh, I can profit.
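The running-cost side of the argument is simple arithmetic. A minimal sketch, assuming a constant load billed at wolf9466's $0.11/kWh (the 80 W figure is only an illustrative gap between two cards, not a measurement):

```python
def monthly_cost_usd(watts: float, rate_per_kwh: float = 0.11,
                     hours: float = 24 * 30) -> float:
    """Electricity cost of running a constant load for a month."""
    return watts / 1000.0 * hours * rate_per_kwh

# An extra ~80 W running 24/7:
print(f"${monthly_cost_usd(80):.2f}/month")
```

Run 24/7, an 80 W gap is a few dollars a month at that rate; for a card that only games a few hours a day, the difference shrinks accordingly.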


----------



## battleaxe

Quote:


> Originally Posted by *wolf9466*
> 
> Nope; I have 50%+ more hashrate on AMD using around 17% more power. Therefore, at 0.11 USD per kWh, I can profit.


Mind sharing how you are doing that? I would love to do a little mining again. I hate seeing my 290 just sitting here doing nothing all day.
Quote:


> Originally Posted by *rdr09*
> 
> my AC uses 5000w per hour.


Seriously... who cares about 50-70-100 watts in a PC? I've got power supplies falling out of my you-know-what over here. I couldn't care less about $.50 a month more to run a 290 over anything else.


----------



## wolf9466

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wolf9466*
> 
> Nope; I have 50%+ more hashrate on AMD using around 17% more power. Therefore, at 0.11 USD per kWh, I can profit.
> 
> 
> 
> Mind sharing how you are doing that? I would love to do a little mining again. I hate seeing my 290 just sitting here doing nothing all day.
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> my AC uses 5000w per hour.
> 
> 
> Seriously... who cares about 50-70-100 watts in a PC? I've got power supplies falling out of my you-know-what over here. I couldn't care less about $.50 a month more to run a 290 over anything else.

I rewrote most of the hashes in the X11 kernel; also modified the rest.


----------



## rdr09

290 1200 with an i7 @ 4.5 - 2977

http://www.3dmark.com/3dm/4473939?

290 1200 with a thuban @ 4.0 - 2921

http://www.3dmark.com/3dm/4472826


----------



## Tokkan

Quote:


> Originally Posted by *rdr09*
> 
> 290 1200 with an i7 @ 4.5 - 2977
> 
> http://www.3dmark.com/3dm/4473939?
> 
> 290 1200 with a thuban @ 4.0 - 2921
> 
> http://www.3dmark.com/3dm/4472826


Looking good


----------



## Abyssic

Quote:


> Originally Posted by *Abyssic*
> 
> guys... i've got a strange anomaly to talk about...
> 
> i wanted to test the performance impact when switching from PCIe3.0x16 to x8 on my R9 290X.
> to do so, i placed the card in my lower slot, wich is only x8. i only changed that, clocks, settings even psu connection stayed the same.
> well, it turned out that it really had quite the impact... but here it comes: POSITIVE impact!
> 
> i used valley to benchmark the results and got around 54 fps with my settings @x16. when using x8 i got 68 fps!
> 
> my mind is blown and i have no idea what is reality anymore. i tested both multiple times, watching temps and everything. nothing changed, except good 14 fps more...
> 
> any opinions on that? similar experiences?


add: 3d mark 11 gives same improvement on x8
note: temps, volts are the same


----------



## BradleyW

Hello fellow AMD users.
Has anyone found a profile that works for The Evil Within, or does it use OpenGL?


----------



## sugarhell

Quote:


> Originally Posted by *BradleyW*
> 
> Hello fellow AMD users.
> Has anyone found a profile that works for The Evil Within, or does it use OpenGL?


What's the point? 30 fps locked for le cinematic feel.


----------



## thrgk

For some reason when I leave my computer idle, I can see on my Logitech keyboard LCD that the time is still ticking away, so it's not stalled, but it's just a black screen and doesn't turn back on, like it's in perma-sleep mode even though sleep is disabled.


----------



## rdr09

Quote:


> Originally Posted by *Tokkan*
> 
> Looking good


i think my thuban or my motherboard is showing its age. i have to up the voltage for 4GHz to 1.42v from 1.4v. well, our thubans are indeed old. but i love it.


----------



## Forceman

Quote:


> Originally Posted by *thrgk*
> 
> For some reason when I leave my computer idle, I can see on my Logitech keyboard LCD that the time is still ticking away, so it's not stalled, but it's just a black screen and doesn't turn back on, like it's in perma-sleep mode even though sleep is disabled.


Have you tried disabling ULPS? That seemed to help for me. But recent drivers haven't seemed to have the problem; what version are you on?


----------



## amptechnow

Quote:


> Originally Posted by *thrgk*
> 
> For some reason when I leave my computer idle, I can see on my Logitech keyboard LCD that the time is still ticking away, so it's not stalled, but it's just a black screen and doesn't turn back on, like it's in perma-sleep mode even though sleep is disabled.


are you getting a black screen after you log in to Windows? Sometimes if I push my cards too much and they crash or fail, the next time I log in to Windows I'll get a black screen after a few seconds. But if I start Afterburner right away and set my normal overclock as fast as I can, no black screen. Then I can hit reset a few minutes later and I'm back to normal. If I do not apply the overclock I will get a black screen no matter how many times I restart. Maybe give it a shot.


----------



## Dimaggio1103

Someone plz help, I'm pulling my hair out here. I upgraded to a 290 to push my Eyefinity setup better. I read that you no longer need an active adapter, so I did not buy one. I have three monitors at 1080p: 2 plugged in via DVI, the third via an HDMI to DVI cable. My third monitor will not work now; only 2 will. Why is this? I have reinstalled the driver twice now, restarted and everything, yet I cannot get my Eyefinity setup working.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Someone plz help, I'm pulling my hair out here. I upgraded to a 290 to push my Eyefinity setup better. I read that you no longer need an active adapter, so I did not buy one. I have three monitors at 1080p: 2 plugged in via DVI, the third via an HDMI to DVI cable. My third monitor will not work now; only 2 will. Why is this? I have reinstalled the driver twice now, restarted and everything, yet I cannot get my Eyefinity setup working.


Silly question, but I assume you have HDMI selected as the source on the monitor?

happened to me once or twice


----------



## Dimaggio1103

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Silly question, but I assume you have HDMI selected as the source on the monitor?
> 
> happened to me once or twice


My monitor does not have that option since it's an HDMI to DVI cable. However, yes, DVI is selected. The monitor works great booting up, but when Windows is loading it stops working, like the driver is kicking it off. Even if I unplug the other two cables so that the HDMI to DVI is the only thing plugged in, it still refuses to work in Windows.

I know it has to be a software issue.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Silly question but i assume you have HDMI selected as source on the monitor?
> 
> happened to me once or twice
> 
> 
> My monitor does not have that option since it's an HDMI to DVI cable. However, yes, DVI is selected. The monitor works great booting up, but when Windows is loading it stops working, like the driver is kicking it off. Even if I unplug the other two cables so that the HDMI to DVI is the only thing plugged in, it still refuses to work in Windows.
> 
> I know it has to be a software issue.

It's enabled in CCC as well?


----------



## Dimaggio1103

I do not see anywhere to enable HDMI in CCC. I have tried detecting and get nothing. Man, this is irritating...

Thanks for trying to help plus rep my man.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Dimaggio1103*
> 
> I do not see anywhere to enable HDMI in CCC I have tried detecting and get nothing. Man this is irritating..
> 
> Thanks for trying to help plus rep my man.


Sorry, I had it mixed up: when you click extend desktop in CCC under the "common display tasks" tab, it should automatically enable all displays connected.


----------



## Dimaggio1103

No, not letting me. Something is preventing this port from working. Spent the last three hours on this. Having buyer's remorse right now....

UPDATE: Works fine if I plug the HDMI one into the motherboard HDMI. Means it's an AMD driver issue or an AMD card issue.


----------



## ArchieGriffs

I have no idea what it is, but every new driver release seems to come with a new problem on my secondary monitor. Sometimes when I turn my monitor on it will disconnect and reconnect repeatedly until I turn it off and back on (could easily be my monitor). After updating to whatever the newest CCC is (I can't keep track anymore), my monitor is no longer recognized in Windows whenever I turn it off, and only by having the monitor turned on before my computer boots up can I actually have it running. If I put a bunch of desktop icons on the second monitor and turn off my monitor, it moves everything back onto the first :\.

I would play around with different versions, the newest isn't always the best, which is kind of sad.


----------



## th3illusiveman

do these cards scale well with frequency? Seems like the GTX 780 is a sleeper card and gains massive performance from overclocks, whereas the 290X gains a few frames for like a 100MHz increase. The 780 seems to overclock well past stock OC 780 Ti performance, while an OC'd 290X can only seem to slightly beat a stock 780 Ti in the reviews I've seen. I chose the 290X over the 780 because I thought it would scale better with frequency... did I choose wrong?


----------



## ebhsimon

Idk about scaling; I see systems with one 290 getting like 9 thousand, while I only get 15 thousand with 2 cards on Firestrike.

http://www.3dmark.com/3dm/4477673?
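One way to read numbers like these is to compare the measured two-card score against ideal linear scaling. A rough sketch (the 9k/15k figures are the round numbers from the post above; graphics score rather than overall score is the fairer input, as noted later in the thread):

```python
def scaling_efficiency(single_score: float, multi_score: float,
                       n_cards: int = 2) -> float:
    """Measured multi-GPU score as a fraction of ideal linear scaling."""
    return multi_score / (single_score * n_cards)

# ~9000 for one 290 vs ~15000 for two:
print(f"{scaling_efficiency(9000, 15000):.0%} of ideal 2-card scaling")
```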


----------



## Dimaggio1103

Welp, spent my whole night trying to figure this out. Tried 5 different versions of drivers and none worked. No idea why this is happening. Really wish now I would have sprung the extra for the GTX 970. So tired of these driver problems; I seem to always have bad luck when I go AMD. Pretty sad that a 270X can push three monitors no problem but this 290X just does not seem to want to work. Back to Newegg it will have to go, I guess.


----------



## kizwan

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Welp, spent my whole night trying to figure this out. Tried 5 different versions of drivers and none worked. No idea why this is happening. Really wish now I would have sprung the extra for the GTX 970. So tired of these driver problems; I seem to always have bad luck when I go AMD. Pretty sad that a 270X can push three monitors no problem but this 290X just does not seem to want to work. Back to Newegg it will have to go, I guess.


That's too bad. This is what I found on the 290 product page: _"*To support 3 displays, one of the monitors has to support DisplayPort"_.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dimaggio1103*
> 
> Welp spent my whole night trying to figure this out. Tried 5 different versions of drivers and none worked. No idea why this is happening. Really wish now I would have sprung the extra for the GTX 970. So tired of these driver problems. I seem to always have bad luck when I go AMD. Pretty sad a 270X can push three monitors no problem but this 290X just does not want to seem to work. Back to newegg it will have to go I guess.
> 
> 
> 
> That's too bad. This what I found in 290 product page: _"*To support 3 displays, one of the monitors has to support DisplayPort"_.

I was running 3 monitors DVI/DVI/HDMI......and I am again on the 295x2


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dimaggio1103*
> 
> Welp spent my whole night trying to figure this out. Tried 5 different versions of drivers and none worked. No idea why this is happening. Really wish now I would have sprung the extra for the GTX 970. So tired of these driver problems. I seem to always have bad luck when I go AMD. Pretty sad a 270X can push three monitors no problem but this 290X just does not want to seem to work. Back to newegg it will have to go I guess.
> 
> 
> 
> That's too bad. This what I found in 290 product page: _"*To support 3 displays, one of the monitors has to support DisplayPort"_.
> 
> 
> I was running 3 monitors DVI/DVI/HDMI......and i am again on the 295x2

Eyefinity setup with a 290? I just quoted what I found on the product page. If that is not true for Eyefinity, then either AMD is lying or the spec is wrong. I only use one monitor, though; I'm not familiar with Eyefinity.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dimaggio1103*
> 
> Welp spent my whole night trying to figure this out. Tried 5 different versions of drivers and none worked. No idea why this is happening. Really wish now I would have sprung the extra for the GTX 970. So tired of these driver problems. I seem to always have bad luck when I go AMD. Pretty sad a 270X can push three monitors no problem but this 290X just does not want to seem to work. Back to newegg it will have to go I guess.
> 
> 
> 
> That's too bad. This what I found in 290 product page: _"*To support 3 displays, one of the monitors has to support DisplayPort"_.
> 
> 
> I was running 3 monitors DVI/DVI/HDMI......and i am again on the 295x2
> 
> 
> Eyefinity setup with 290?

Yup. Well, most of the time I run 3 monitors and just game on the middle one, but I can/have run Eyefinity like that.


----------



## kizwan

Probably something wrong with the HDMI to DVI adapter then.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> Probably something wrong with the HDMI to DVI adapter then.


Well, on the 290's it was 2 x DVI and 1 x HDMI; on the 295x2 it's 1 x DVI, 1 x MiniDP to DVI and 1 x MiniDP to HDMI.

I got this from XFX with the 295x2:

It says you can't do it but I actually can


----------



## MojoW

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Well on the 290's it was 2 x DVI and 1 x HDMI, on the 295x2 it's 1 x DVI, 1 x MiniDP to DVI and 1 x MiniDP to HDMI
> 
> I got this from XFX with the 295x2:
> 
> 
> 
> It says you can't do it but i actually can


It depends on the card: if you have HDMI and DVI-D active together, they share a DAC (pixel clock).
With the 295x2 there is only one DVI-D, so there is more pixel clock to go around.
With the 290 you would need an active adapter so the third display uses its own clock.
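The "shared pixel clock" constraint MojoW describes comes down to the total clock the non-DisplayPort outputs can drive. A rough back-of-the-envelope sketch (the 1.2 blanking factor is a generic approximation, not AMD's actual timing formula):

```python
def pixel_clock_mhz(h_active: int, v_active: int, refresh_hz: float,
                    blanking_factor: float = 1.2) -> float:
    """Approximate pixel clock: active pixels x refresh x blanking overhead."""
    return h_active * v_active * refresh_hz * blanking_factor / 1e6

# A single 1080p60 head needs roughly:
print(f"{pixel_clock_mhz(1920, 1080, 60):.0f} MHz")
```

Each 1080p60 head needs on the order of 150 MHz, so several of them on a shared clock source adds up quickly; an active adapter sidesteps this by driving the extra display from DisplayPort with its own clock.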


----------



## Sgt Bilko

Quote:


> Originally Posted by *MojoW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Well on the 290's it was 2 x DVI and 1 x HDMI, on the 295x2 it's 1 x DVI, 1 x MiniDP to DVI and 1 x MiniDP to HDMI
> 
> I got this from XFX with the 295x2:
> 
> 
> Spoiler: Warning: Spoiler!
> It says you can't do it but i actually can
> 
> It depends on the card if you have hdmi and dvi-d it will share it's dac(pixelclock)
> With the 295x2 it has one dvi-d so more pixelclock to share.
> With the 290 you will need a active adapter so it uses it's own dac(pixelclock)

But that's the thing, i was running DVI/DVI/HDMI on my 290 Crossfire
and it was working fine.


----------



## MojoW

Quote:


> Originally Posted by *Sgt Bilko*
> 
> But that's the thing, i was running DVI/DVI/HDMI on my 290 Crossfire
> and it was working fine.


That's strange; it shouldn't work.
The only thing I can think of is your screens had lower timings, resolution and refresh rate.
What screens did you use?
And what screens is Dimaggio1103 using?


----------



## Sgt Bilko

Quote:


> Originally Posted by *MojoW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> But that's the thing, i was running DVI/DVI/HDMI on my 290 Crossfire
> and it was working fine.
> 
> 
> 
> That's strange it shouldn't work.
> Only thing i can think of is your screen had lower timings, resolution and refresh rate.
> What screens did you use?
> And what screens is Dimaggio1103 using?

Well, normally I'm just using a Qnix QX2710, an Asus VE248H and an Asus VW246.

2 x 1080p panels and a 1440p one, all at 60Hz. I imagine if I had the Qnix at 110Hz then it would derp out :/


----------



## rdr09

Quote:


> Originally Posted by *ebhsimon*
> 
> Idk about scaling, I see systems with one 290 getting like 9 thousand, while I only get 15 thousand with 2 cards on firestrike.
> 
> http://www.3dmark.com/3dm/4477673?


go by graphics score. the overall relies on both cpu and gpu. fs loves cores like BF3 or 4. my thuban at 4GHz scores 8600 in physics. a tad higher than your i5.


----------



## jagdtigger

Quote:


> Originally Posted by *MojoW*
> 
> That's strange; it shouldn't work.
> The only thing I can think of is your screens had lower timings, resolution and refresh rate.
> What screens did you use?
> And what screens is Dimaggio1103 using?


But it works for me as well. One FHD TV->HDMI, LG W2486L FHD->DVI-D, LG IPS277L->DVI-D (with an adapter cable, DVI-D->HDMI). The only problem is that if I start OC-ing I get a black screen, then the image comes back, and eventually I get a permanent black screen (only a reset helps). The strange thing is only the TV and the IPS277L do this; the W2486L works fine and I can use the machine while the other two show a black screen...

On stock, no probs.


----------



## Jorginto

Guys, when I plug my monitors (Qnix and iiyama) into the DVI ports I cannot move the mouse cursor between displays. When the iiyama is connected via HDMI everything is fine. Is it possible to use both DVI ports at the same time?


----------



## aaroc

Quote:


> Originally Posted by *MojoW*
> 
> It depends on the card: if you have HDMI and DVI-D active together, they share a DAC (pixel clock).
> With the 295x2 there is only one DVI-D, so there is more pixel clock to go around.
> With the 290 you would need an active adapter so the third display uses its own clock.


On the presentation of the R9 290(X), the AMD slides showed that you could connect one monitor to each connector without an active adapter. That's why I changed from a 7870. I have tested an R9 290 and an R9 290X with three monitors: DVI-DL + DVI-DL + HDMI, and DVI-DL + DVI-DL + DisplayPort. My monitors are 2560x1440 Samsungs and it worked very well. Did you try the cables/adapters?

I have another PC connected to one of the monitors and select the input manually (DVI-DL/HDMI), and it works fine.


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> Whats the point? 30 fps locked for the le cinematic feel


+1 Most helpful.


----------



## DVIELIS

Decided to finally join this awesome group.

My data below:

Spoiler: GPU-Z



http://www.techpowerup.com/gpuz/details.php?id=m3fqz






Spoiler: PIC









Spoiler: DATA



BRAND: Gigabyte
MODEL: Radeon R9 290
COOLING: Water (Corsair H90 under NZXT Kraken G10 bracket + Gelid VRM heatsink kit+ VRAM heatsinks)


----------



## Silent Scone

http://forums.overclockers.co.uk/showthread.php?t=18632104

Apparently these are in demand...

Anyone interested in an 8GB 290X reference board with the reference blower, order away. I'd be quick though; the new clueless rep (more commonly known here, and banished, as the 'AMD God') bought four of the five initially made.


----------



## sugarhell

this one?

Meh, I need to copyright my pics.


----------



## Silent Scone

I do find it slightly amusing, considering the one thing that was a massive issue with the Vapor-X range was that there were simply no waterblocks. Then, not long ago, EK stepped up to the challenge (after saying via support they had absolutely no plans; thanks, EK).

Bit late now, unless you want to stack them all together on air.


----------



## VSG

lol, that 5 made was just a number I told you sarcastically; pretty sure they made more, or will decide based on the pre-order amount.


----------



## sugarhell

Quote:


> Originally Posted by *Silent Scone*
> 
> I do find it slightly amusing considering the one thing that was a massive issue with the Vapor-X range was that there were simply no waterblocks. Then, not long ago EK stepped up to the challenge (after saying via support they had absolutely no plans, thanks EK).
> 
> Bit late now. Unless you want to stack them altogether on air.


Who stacks 4 heaters? Except ltmatt lol

(By the way, I don't know why he works for AMD; he is clueless.)


----------



## Sgt Bilko

Quote:


> Originally Posted by *geggeg*
> 
> lol that 5 made was just a number I told you sarcastically, pretty sure they made more or will decide based on the pre-order amount


Yeah, even for a card like that, 5 would be way too low a number to make.

I'd guess they will make them based on orders placed.


----------



## Silent Scone

There is an initial quantity but I know it's not 5


----------



## Mirob0t

Hey guys, I need some help with my Sapphire Tri-X R9 290X OC.

The problem is, after playing a game, once the temp reaches around 70 it stays like that the whole time, even when I'm idle. Also, when I watch something on YouTube the temp increases to 60/70 without even gaming. This did not happen before; I noticed it the first time after I played Shadow of Mordor.

Before, when I quit a game after hours, the temp decreased fast and went back to normal.

I don't know what the problem is. I changed drivers and also the BIOS of my motherboard. I also tried a different PCIe 3.0 slot and it still gave me the same problem.

I'm using the 14.9 beta atm; I tried other drivers and still the same problem. It's the second time I've bought this card: the first one died on me while playing Watch Dogs, and now this one is starting to give me problems.

I'm getting tired of this R9 290 series.

Peace, Miro


----------



## bbond007

Quote:


> Originally Posted by *kizwan*
> 
> That's too bad. This is what I found on the 290 product page: _"*To support 3 displays, one of the monitors has to support DisplayPort"_.


IDK, I have 4 monitors plugged into 1 card....

I did get an MSI R9 290X Gamer with a defective DisplayPort. Luckily I noticed on day 30 after I bought it and was able to exchange it.


----------



## connectwise

Has changing the fan profile made any differences?


----------



## Mirob0t

Nope, fan speed is around 60% and it just doesn't want to get lower, even when I'm idle for hours.


----------



## Dimaggio1103

So I finally got my Eyefinity setup working. I went to the store and grabbed a DisplayPort to DVI adapter and now all three screens work. So despite AMD's spec sheets saying the pixel clock can handle three monitors, mine apparently cannot. Probably the last time I buy AMD; I really want to support them and the amazing deals they give us, but stuff like this has happened so much in the past. It might just be a driver problem, but that still puts the blame on them.

Anyways, sorry for the rant, just glad it's working now. Now time to see if the 290 was worth the upgrade. Here's hoping I'm not disappointed.


----------



## cplifj

Just typing this to mention that I'm running my 3 monitors, which are the same type, on my Asus 290X reference; ports used are 2x DVI and 1x HDMI. 3 full-HD panels from Medion, 60Hz.

While everything works in Windows, I do see trouble during Windows boot: the card keeps switching monitors on and off until the Windows login.

This happens the second I hook up a 3rd display, usually via HDMI. Also, the card always seems to want to put my HDMI screen up as the boot screen for the BIOS at startup.

When I have only 2 displays connected, either both DVI or 1 DVI and 1 HDMI, both screens show me the BIOS startup screen; the second a third display is hooked up, the BIOS boot screen ONLY appears on the HDMI screen.

Maybe not a big deal, but I would like everything to happen on my center screen, which is DVI connected of course. So in effect, whenever I hook up 3 screens, something starts to gasp for air and goes a bit haywire. These things might be helping cause the black screens (heat in that area; every bit of heat gets blown over the display parts/connectors).

I also don't find any references on whether there is a particular way one should connect; it says anything is good, just hook up. I don't know if there is a way to force the displays to do what I want, which is for HDMI to be off until Windows loads whenever I have 3 displays hooked up, and maybe to stop the pinball-machine on/off of monitors during Windows load.

Oh, and I do love cleaning out all those left-behind monitors in Windows devices (show hidden devices on, of course). Every other driver update or safe boot, Windows throws in another couple of references to monitors that will never get used again, because it re-detected them as new hardware.


----------



## kizwan

You can't control which monitor(s) the boot screen appears on. That's the (motherboard) BIOS's job, so to speak; the only way is to edit/hack the BIOS. Once Windows is loaded, that's when the drivers take over.

A black screen is a symptom of many underlying issues. It could be overheated memory, unstable memory clocks/timings, or just a simple driver issue with specific hardware (motherboard, monitors, etc.), especially when entering/exiting power management modes/levels.


----------



## Arizonian

Going to be away traveling for a few days so just setting myself a marker so I don't miss anyone. This way I can pick up from here. Have a good weekend everyone, play nice.


----------



## Buehlar

Quote:


> Originally Posted by *Arizonian*
> 
> Going to be away traveling for a few days so just setting myself a marker so I don't miss anyone. This way I can pick up from here. Have a good weekend everyone, play nice.


Without internet access??? j/k... be safe

BTW... could you update me? I have my 290x's underwater now.


----------



## heroxoot

Quote:


> Originally Posted by *Buehlar*
> 
> Without internet access??? j/k... be safe
> 
> BTW... could you update me? I have my 290x's underwater now.


Man that is the sickest looking test bench setup I've seen in a minute. Love the colors.


----------



## cplifj

Quote:


> Originally Posted by *kizwan*
> 
> You can't control on which monitor(s) the boot screen going to appear. That (motherboard) BIOS job so to speak. The only way is to edit/hack the BIOS. When windows loaded, that's when drivers take over.
> 
> Blackscreen is a symptom for many underlying issue/problem. It could be overheated memory, unstable memory clocks/timings or just a simple drivers issue with specific hardware (motherboard, monitors, etc) especially when entering/exiting power management modes/levels.


And here I was thinking that on a multi-output graphics card it was the job of the card's own BIOS to choose which output to use during boot.


----------



## rdr09

Quote:


> Originally Posted by *Buehlar*
> 
> Without internet access??? j/k... be safe
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BTW...could you update me? I have my 290x's underwater now


i want one of those.


----------



## alancsalt

Only one?


----------



## rdr09

Quote:


> Originally Posted by *alancsalt*
> 
> Only one?


maybe two. one for spare. lol


----------



## Buehlar

Quote:


> Originally Posted by *heroxoot*
> 
> Man that is the sickest looking test bench setup I've seen in a minute. Love the colors.


Thanks








Quote:


> Originally Posted by *rdr09*
> 
> i want one of those.


What? One of my GPUs? No way, I just got 'em!









You can have my old 7870's though


----------



## rdr09

Quote:


> Originally Posted by *Buehlar*
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> What? One of my GPUs? No way I just got 'em!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You can have my old 7870's though


I meant the whole setup. May not be advisable if there are kids or cats around, though.


----------



## Devildog83

Quote:


> Originally Posted by *Buehlar*
> 
> Without internet access??? j/k... be safe
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BTW...could you update me? I have my 290x's underwater now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 


You most definitely outdid yourself on the test bench, "B". I'm lovin' it.


----------



## Cyber Locc

Quote:


> Originally Posted by *Dimaggio1103*
> 
> SO I finally got my eyefinity setup working. I went to the store and grabbed a Displayport to DVI adapter and now all three screens work. So despite AMDs spec sheets saying the pixel clock can handle three monitors mine actually cannot apparently. Probably last time I buy AMD, I really wanna support them and the amazing deals they give us however stuff like this has happened so much in the past. It might just be a driver problem however that still puts the blame on them.
> 
> Anyways sorry for the rant just glad its working now. Now time to see if the 290 was worth the upgrade. Here is hoping im not dissapointed.


Are all 3 of your screens exactly the same: same brand, size, everything? I ask because not needing DisplayPort is a new thing from what I have read, and it requires all 3 monitors to be 100% identical (as does NVIDIA's Surround); if they're not exactly the same, then you need an active DisplayPort adapter. Not sure how much truth there is to that, just what I read.
Quote:


> Originally Posted by *cplifj*
> 
> Just typing this to mention that I'm running my 3 monitors, which are the same type, on my ASUS 290X reference; ports used are 2x DVI and 1x HDMI. 3 full-HD panels from Medion, 60Hz.
> 
> While everything works in Windows, I do see the trouble during Windows boot: the card keeps switching monitors on and off until the Windows login.
> 
> This happens the second I hook up a 3rd display, usually the HDMI one. Also, the card always seems to want to put the BIOS boot screen on my HDMI at startup.
> 
> When I have only 2 displays connected (either both DVI, or 1 on DVI and 1 on HDMI), both screens show me the BIOS startup screen; the second a third display is hooked up, the BIOS boot screen ONLY appears on the HDMI screen.
> 
> Maybe not a big deal, but I would like everything to happen on my center screen, which is DVI-connected, of course. So in effect, whenever I hook up 3 screens, something starts gasping for air and going a bit haywire. These things might be contributing to the black screens (heat in that area; every bit of heat gets blown over the display parts/connectors).
> 
> I also can't find any references on whether there are certain ways one should connect; it says anything is good, just hook up. I don't know if there is a way to force the displays to do what I want, which is for HDMI to stay off until Windows loads whenever I have 3 displays hooked up, and maybe stop the pinball-machine on/off flashing of monitors during Windows load.
> 
> Oh, and I do love having to clean out all those leftover monitors in Windows Device Manager (with hidden devices shown, of course). Every other driver update or safe boot, Windows throws in another couple of references to monitors that will never get used again, because it re-detected them as a new hardware addition.


Mine does the same thing. I have 4 screens, and the HDMI and DisplayPort screens both show the BIOS. It does suck.


----------



## Dimaggio1103

^Yes, all three exactly the same. For some odd reason it just would not work. The DisplayPort adapter fixed it, though. I don't believe it's even an active adapter, but for some reason it works.


----------



## Buehlar

Quote:


> Originally Posted by *Devildog83*
> 
> 
> Most definitely outdid yourself on the test-bench "B". I'm lovin' it.


Thanks man.








I'm pretty happy with it... makes upgrades and maintenance a breeze


----------



## Silent Scone

Quote:


> Originally Posted by *Buehlar*
> 
> Thanks man.
> 
> 
> 
> 
> 
> 
> 
> 
> I'm pretty happy with it..*.make the upgrades and maintenance a breeze*


It does? I don't see any QDCs









Looks awesome though obviously!


----------



## Vici0us

Quote:


> Originally Posted by *Buehlar*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Arizonian*
> 
> Going to be away traveling for a few days so just setting myself a marker so I don't miss anyone. This way I can pick up from here. Have a good weekend everyone, play nice.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Without internet access??? j/k... be safe
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BTW...could you update me? I have my 290x's underwater now

Wow.. that's so clean! Nice job!


----------



## Dimaggio1103

Welp my 290 has the code that says it cannot unlock. Was hoping for a repeat of my 6950>6970 days.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Welp my 290 has the code that says it cannot unlock. Was hoping for a repeat of my 6950>6970 days.


Don't worry, you're not losing much. There's only like a 7% difference.


----------



## Vici0us

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Welp my 290 has the code that says it cannot unlock. Was hoping for a repeat of my 6950>6970 days.


Overclock and you'll have a 290X.


----------



## Buehlar

Quote:


> Originally Posted by *Silent Scone*
> 
> It does? I don't see any QDCs
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Looks awesome though obviously!


Yeah, I did have QDCs in place, but it just didn't look right. They were so big and bulky, and the flex tubing wasn't pretty... in the end, I decided to remove them for a cleaner look with acrylic.



I still use the ones on the external rads.

The bench is still very easy to maintain compared to my case build. Just drain enough fluid to slip off the acrylic tubes.









Quote:


> Originally Posted by *Vici0us*
> 
> Wow.. that's so clean! Nice job!


Thanks


----------



## bond32

Here's my test bench... It's not near as pretty.


----------



## Buehlar

Quote:


> Originally Posted by *bond32*
> 
> Here's my test bench... It's not near as pretty.
> 
> 


You're packing a bit mo powa there









I'm toeing the limit of the Z77 platform... gotta get some more lanes next upgrade if I want a 3rd 290X. I'm torn between the Deluxe and the Extreme, but it will be after the holidays.


----------



## bond32

Quote:


> Originally Posted by *Buehlar*
> 
> You're packing a bit mo powa there
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm toting the limit on the z77 platform...gotta get some more lanes next upgrade if I want a 3rd 290x. I'm torn between the Deluxe and Extreme but it will be after the holidays.


Yep, have a fourth card coming Monday. Already have the GPU block and 4-card link. My board is the M6E; it has a PLX chip, so they all run at x8.


----------



## Fickle Pickle

Quote:


> Originally Posted by *bond32*
> 
> Yep, have a fourth card coming monday. Already have the gpu block and 4 card link. My board is the M6E, has a plx so they all run at 8x.


Nice. You could probably warm up a large living room with those GPUs.


----------



## bond32

Quote:


> Originally Posted by *Fickle Pickle*
> 
> Nice. You could probably warm up a large living room with those GPUs.


You have no idea... LOL.

My roommate was saying how my desk area feels about 10 degrees warmer every time he walks past it...


----------



## ZealotKi11er

Quote:


> Originally Posted by *bond32*
> 
> You have no idea... LOL.
> 
> My roommate was saying how my desk area feels about 10 degrees warmer every time he walks past it...


Wait where the hell do you have that PC? Dorm?


----------



## bond32

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Wait where the hell do you have that PC? Dorm?


Ha no, my apartment. Would be quite a challenge for a dorm though...


----------



## Buehlar

Quote:


> Originally Posted by *Fickle Pickle*
> 
> Nice. You could probably warm up a large living room with those GPUs.


Winter is coming pretty quick!


----------



## Spectre-

Quote:


> Originally Posted by *Buehlar*
> 
> Winter is coming pretty quick!


Summer in the land down under, mate.

It's 38 degrees right now.


----------



## ebhsimon

@bond32 What kind of temps do you get under full load? You have what appears to be 3 monstrously thick rads.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Spectre-*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Buehlar*
> 
> Winter is coming pretty quick!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> summer in the land down under mate
> 
> its 38 degrees right now

Aye......getting kinda toasty ain't it?


----------



## Spectre-

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Aye......getting kinda toasty ain't it?


still waiting for the heatwave


----------



## Sgt Bilko

Quote:


> Originally Posted by *Spectre-*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Aye......getting kinda toasty ain't it?
> 
> 
> 
> still waiting for the heatwave

Yeah.....not looking forward to it myself









At least my CPU will be under water by then and maybe the 295x2 if i can swing it


----------



## kizwan

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah.....not looking forward to it myself
> 
> At least my CPU will be under water by then and maybe the 295x2 if i can swing it

Even under water, the water temp will be at least in the 50s Celsius, unless you turn on the A/C or use a chiller.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> Even under water, the water temp will be at least in the 50s Celsius, unless you turn on the A/C or use a chiller.

That's ok, 50c is better than 70c


----------



## HOMECINEMA-PC

Only hit 30°C here today, with a reported 38°C tomorrow........... everywhere I go tomorrow will involve A/C.


----------



## Spectre-

Quote:


> Originally Posted by *Sgt Bilko*
> 
> That's ok, 50c is better than 70c


I am going back to stock on my 3930K.

AMD, release the new cards.

I don't need a heater anymore.


----------



## HOMECINEMA-PC

Temps


----------



## Dimaggio1103

So I got my 290 up to 1100/1300. I have a question: I set MSI Afterburner to +100mV. Is that safe for 24/7? Wanna squeeze out every bit of performance I can. I think it hits like 1.25V in games.


----------



## jagdtigger

Since we are on the OC topic again, I have a question. I have an OC'ed 4670K (@4.5GHz) and I'm thinking of getting a second 290X next year, but first I want to know if the CPU can handle it, or if I have to buy a new CPU before that. What is your opinion?


----------



## ebhsimon

Quote:


> Originally Posted by *jagdtigger*
> 
> Since we are on the OC topic again, I have a question. I have an OC'ed 4670K (@4.5GHz) and I'm thinking of getting a second 290X next year, but first I want to know if the CPU can handle it, or if I have to buy a new CPU before that. What is your opinion?


Depends on what game.
I have a 4670k @ 4.4Ghz and 2 290s and get about ~120 fps (80-150) on ultra @ 1440p on BF3.


----------



## StrongForce

I got my R9 290X Gigabyte WindForce installed today; here is my Firestrike: http://www.3dmark.com/3dm/4506474? I was expecting a bit more (9000-10000+) because my 7950 OC'ed to the max does around 7200. Anything wrong? I'm trying the latest 14.9 drivers; I was running 14.4 on the 7950. Hope those drivers aren't too bad, lol.

CPU: FX [email protected] or so atm; I reduced it a bit because of heat (plus now with this card..)


----------



## joeh4384

Here is a comparison with an MSI Gaming 290X and an Intel 4690K at stock.

http://www.3dmark.com/compare/fs/3078351/fs/3012928
Quote:


> Originally Posted by *StrongForce*
> 
> I got my r9 290x gigabytes windforce installed it today, here is my firestrike :http://www.3dmark.com/3dm/4506474? I was expecting a bit more (9000-10000+) because my 7950 OC'ed to the max does arround 7200, anything wrong ? I'm trying the latest 14.9 drivers, was running the 14.4 on the 7950, hope those drivers aren't too bad lol.
> 
> CPU fx [email protected] or so atm I reduced a bit because of heat(plus now with this card..)


----------



## StrongForce

Quote:


> Originally Posted by *joeh4384*
> 
> Here is a compare with a MSI Gaming 290x and Intel 4690k at stock.
> 
> http://www.3dmark.com/compare/fs/3078351/fs/3012928


Not too far away, considering the Haswell is a bit more powerful.

Which driver are you using?


----------



## mirzet1976

@StrongForce You need to OC your CPU and GPU to get a higher result. See my score, same CPU and GPU:
GPU clock *1240/1500* and CPU at *5GHz*, both cooled with WC.
http://www.3dmark.com/3dm/4396172


----------



## joeh4384

Quote:


> Originally Posted by *StrongForce*
> 
> Not too far away considering the haswell is bit more powerful
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Which driver you using ?


14.9.1


----------



## mirzet1976

Add me to the owners list

Gigabyte R9 290 - GV-R929D5-4GD-B
Water Cooling

Validation - http://www.techpowerup.com/gpuz/details.php?id=uxk7b


----------



## Dimaggio1103

Quote:


> Originally Posted by *Dimaggio1103*
> 
> So I got my 290 up to 1100/1300 I have a question I put MSI afterburner to +100mv is that safe for 24/7 Wanna squeeze every bit of performance I can. I think it hits like 1.25v in games.


Anyone?


----------



## mirzet1976

I think it's safe to sit at 1.25V for some hours of gaming; you're not gaming 24/7 at 3D clocks. Most of the time it's at lower 2D clocks and voltage, right?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dimaggio1103*
> 
> Anyone?


+100 mV is fine, but even under water I don't leave it on 24/7.


----------



## Dimaggio1103

So I can't run my overclock 24/7? Any reference as to why 1.25V would be damaging for 24/7?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dimaggio1103*
> 
> So i cannot have my overclock 24/7? Any refrence to why 1.25 would be damaging for 24/7?


There is no problem with it. I just find the extra OC wasteful for 24/7, probably because I see no difference since I have 2 cards. There's also the increase in heat.


----------



## Dimaggio1103

So I have one card, and from what I can tell I get a nice increase in performance from it. So running 24/7 should be fine in my case then? I run Eyefinity, so I need all the extra performance I can get.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dimaggio1103*
> 
> So i have one card and what i can tell have a nice increase in performance from it. So running 24/7 should be fine in my case then? I run eyefinity so i need all the extra performance i can get.


I suggest you set a profile and apply it when you are gaming.


----------



## StrongForce

Quote:


> Originally Posted by *mirzet1976*
> 
> @StrongForce You need to OC your CPU and GPU to get higher result, see my score, same CPU and GPU
> GPU clock *1240/1500* and CPU freq *5ghz* both coold with WC
> http://www.3dmark.com/3dm/4396172


Yeah, I need to, but I'd need water cooling, as 4.7 is already too much for my NH-D14; and yeah, I will OC the card a little.

Just saying I was expecting a bit more at stock.


----------



## Dimaggio1103

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I suggest you set a profile and apply it when you are gaming.


For what reason? The card throttles down in core speed and volts when not being taxed, so why the need for a profile?

Also, do we not have official safe limits? Or is all of this a guessing game? I don't need the card to last forever, just a couple of years until the 390X drops in price and I can move up.


----------



## sugarhell

The limit is close to 1.5V, so don't worry about it.
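(Side note for anyone wondering how these Afterburner offsets map to an absolute voltage: here's a minimal sketch in Python. The 1.15 V stock VID is purely an illustrative assumption; the real base voltage varies card to card, so check your own GPU-Z/Afterburner readout.)

```python
STOCK_VID = 1.15  # volts; illustrative assumption, actual stock VID varies per card

def core_voltage(offset_mv):
    """Approximate absolute core voltage from an Afterburner-style mV offset."""
    return STOCK_VID + offset_mv / 1000.0

# A +100 mV offset on a 1.15 V stock VID lands around 1.25 V under load.
print(round(core_voltage(100), 3))
```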


----------



## mirzet1976

Quote:


> Originally Posted by *StrongForce*
> 
> yea I need to but I need watercooling as 4.7 is already too much for my Nhd-14, and the card I will OC it a little yea.
> 
> Just saying was expecting a bit more at stock.


The overall score would be greater with the Intel, but the graphics result remains the same. And the FX Vishera chip and Hawaii are both known for putting out a lot of heat when overclocked. I think this setup requires water cooling to stay cool.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Dimaggio1103*
> 
> For what reason? card throttles down in core speed and volts when not being taxed so why the need for a profile?
> 
> Also, do we not have official safe limits? or is all of this like a guessing game? I do not need the card to last forever just a couple of years until the 390X drops in price and I can move up.


Pretty sure the voltage will ramp up if you use the PC for other things, like a web browser. It's up to you. I get weird voltage jumps if I leave my cards at +100 mV 24/7, which causes screen flicker from time to time.


----------



## Jorginto

Guys, I'm struggling with my 2x PowerColor R9 290 OC. I hope I didn't lose the silicon lottery. My ASIC readings are 69.5 and 77.2. Finally installed water blocks.

Cards work great at +31 mV and 1080 core, but this is when the magic starts happening:

1100 - +63mV
1120 - +81mV
1130 - +100mV
1140 - +125mV
1150/1160 - +150mV
Above that I get artifacts no matter the voltage.

Damn, I was hoping for a 1200/1225 MHz benchmarking score.

My best so far, with single artifacts:

http://www.3dmark.com/3dm/4504177

This is with a Cooler Master Silent Pro M 850W. Hope it's not the issue, but my max power draw from the wall is around 700/725 W.

I've got another Cooler Master 1200W Gold. Maybe it would be worth a shot?

And is there any solution to that goddamn cold-boot black screen entering Windows? Uncle Google didn't help me that much.
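(Those offsets show the usual voltage wall. A quick illustrative Python pass over the numbers in this post, nothing card-specific, just the marginal cost per extra MHz:)

```python
# (core MHz, mV offset) pairs as reported in the post above
steps = [(1080, 31), (1100, 63), (1120, 81), (1130, 100), (1140, 125), (1150, 150)]

for (f0, v0), (f1, v1) in zip(steps, steps[1:]):
    cost = (v1 - v0) / (f1 - f0)  # extra millivolts needed per extra MHz
    print(f"{f0} -> {f1} MHz: {cost:.1f} mV/MHz")
```

The cost per MHz nearly triples past 1120 (from about 0.9 to 2.5 mV/MHz), which is why the last 50 MHz feels so expensive.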


----------



## Dimaggio1103

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Pretty sure the voltage will ramp up if you use the PC for other things, like a web browser. It's up to you. I get weird voltage jumps if I leave my cards at +100 mV 24/7, which causes screen flicker from time to time.


OK, good idea, will do. Now, regarding safe voltage: would +100 be OK for the few hours of gaming a day I do? Also, any idea on a safe VRM temp threshold?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Jorginto*
> 
> Guys, I struggling with my Powercolor 2xR9 290 OC. I hope I didn't loose at the silicon lottery. My ASIC readings are 69,5 ans 77,2. Finally installed water blocks.
> 
> Cards work great at +31 mV and 1080 core but this is when the magic starts happening:
> 
> 1100 - +63mV
> 1120 - +81mV
> 1130 - +100mV
> 1140 - +125mV
> 1150/1160 - +150mV
> above that I get artifacts no matter the voltage.
> 
> Damn I was hoping for that 1200/1225 Mhz benchmarking score.
> 
> My best so far with single artifacts:
> 
> http://www.3dmark.com/3dm/4504177
> 
> This is with the Cooler Master Silent Pro M - 850W - hope it's not the issue, but my max power draw from the wall is around 700/725 W.
> 
> I've got another Cooler Master 1200W Gold. Maybe it would be worth a shot?
> 
> And is there any solution to that god damn cold boot windows entering black screen? Uncle google didn't help me that much.


I have a similar system. I can tell you, you are very close to your PSU's limit at +150 mV. I hit 1080W off the wall with +200 mV; +100 mV is ~870W.
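(For a rough sanity check on the PSU side: wall draw times efficiency gives the DC load the unit is actually delivering. The ~90% figure below is an assumption for a decent unit at this load point, not the Silent Pro M's measured spec.)

```python
def dc_load(wall_watts, efficiency=0.90):
    """Estimate DC-side load from measured wall draw.

    efficiency is an assumed AC-to-DC conversion factor (~90% here);
    the real figure depends on the specific PSU and its load point.
    """
    return wall_watts * efficiency

print(round(dc_load(1080)))  # ~972 W of DC load, well past an 850 W rating
print(round(dc_load(700)))   # ~630 W, still inside an 850 W unit's envelope
```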


----------



## Jorginto

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I have similar system. I can tell you are very close a the limit with 150 mV for you PSU. I hit 1080W off the wall with 200 mV. 100 mV ~ 870W.


Well, I gotta play around with the second PSU. I was wondering whether it could be connected to power delivery.

Any tips on that cold-boot black screen entering Windows?


----------



## battleaxe

Quote:


> Originally Posted by *Jorginto*
> 
> Well, I gotta play around with the second PSU. I was wandering that it could connected to power delivery.
> 
> Any tips on that cold boot windows entering black screen?


Nope. I got that too. Couldn't fix it and RMA'd the card.


----------



## Jorginto

Quote:


> Originally Posted by *battleaxe*
> 
> Nope. I got that too. Couldn't fix it and returned the card RMA.


I'm not gonna RMA, since it's not a big issue, as long as I'm not getting any BSODs in games or benchmarks.

BTW, could you link a picture of your VRM cooling solution? I managed to deal with the main VRM line, but the three small ones, even with heatsinks on, hit the 90s to 100s °C.


----------



## battleaxe

Quote:


> Originally Posted by *Jorginto*
> 
> I'm not gonna RMA since it's not a big issue. As long as I'm not getting any BSOD's in game or benchmarks.
> 
> BTW could you link a picture of your VRM cooling solution? I manged to deal with the main VRM line, but the three small ones even with heatsinks on, hit the 90's to 100's d. C.


I am using this on the VRMs. VRM2 never got over 55°C; VRM1 maxed at 65°C. At +200mV, VRM1 got up to about 85°C though, but I never run it that way for very long (not more than a few minutes, other than to bench).

http://www.newegg.com/Product/Product.aspx?Item=N82E16835426042



I'd have to take a pic of the DIY cooler I use in addition to the VRM1 cooler. It's part of a CPU block that's been cut up. I'd take a pic, but that was installed on the card that is now dead, so I can't very well show you till I get it back.


----------



## jagdtigger

Quote:


> Originally Posted by *ebhsimon*
> 
> Depends on what game.
> I have a 4670k @ 4.4Ghz and 2 290s and get about ~120 fps (80-150) on ultra @ 1440p on BF3.


Well, judging from the AC: Unity system requirements, at some point I will need a new CPU sooner or later. But I have time to think; I only want to buy it next year...


----------



## Sgt Bilko

Quote:


> Originally Posted by *jagdtigger*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ebhsimon*
> 
> Depends on what game.
> I have a 4670k @ 4.4Ghz and 2 290s and get about ~120 fps (80-150) on ultra @ 1440p on BF3.
> 
> 
> 
> Well judging from AC: Unity system requirements at some point i will need a new CPU sooner or later. But i have time to think, i want to buy it only in next year...

"Ubisoft"......nuff said


----------



## ebhsimon

Quote:


> Originally Posted by *jagdtigger*
> 
> Well judging from AC: Unity system requirements at some point i will need a new CPU sooner or later. But i have time to think, i want to buy it only in next year...


That's it: if you want to play an Ubisoft game, you'll need a crazy-ass CPU. I plan on upgrading to a 4790K, and hopefully I get a decent one, because I want to run high clocks on it (4.8+). Then I'll keep the i7 for maybe 2 years before upgrading again.
A 4790K and 2x 290s is a good combination for a 1440p screen @ 110Hz.


----------



## Cyber Locc

Okay guys, so I have encountered a major issue. I have been experiencing problems with my audio in games, like crackling. I had explored every possibility, tried sound cards and USB headsets, and still the issue was there, only in games, which was weird. I finally figured it out (which I should have earlier, as it started when I put in my second card): CrossFire causes the sound to crackle. Upon Googling, I'm not alone; as a matter of fact this issue is pretty common and to date still hasn't been fixed by AMD. I got it to stop for a little while by uninstalling all the AMD sound drivers, but I just tried that again and it didn't work this time (granted, I uninstalled the HD audio devices first, so the others just disappeared; that may be why). Anyway, this obviously has to do with the audio on the cards. Is there any way to remove that, a BIOS or something? Does anyone have any fix for this? It's not one card or the other, as I have tried both by themselves with no issues; also, disabling CrossFire solves it. It only happens when CrossFire is on, and it makes games unplayable for me. (It also doesn't happen with music, even when running Heaven; only game audio.)

Okay, upon some more reading I have found out something else: it only happens in fullscreen with vsync off. Games that do it for me are all that I play lately: Dragon Age, Witcher 2, ArcheAge, Shadow of Mordor, Need for Speed Most Wanted/Shift. TBH it probably does it in all my games; those are just the ones I have checked. So I get to choose now: no CrossFire or no vsync (i.e. not enough FPS or stuttery gameplay), or deal with crackling audio. TBH, as my first AMD cards, not too happy with this issue.


----------



## DividebyZERO

Quote:


> Originally Posted by *cyberlocc*
> 
> Okay guys so I have encountered a major issue. I have been experiencing problems with my audio in games like crackling. I had explored every possibility tried sound cards usb headsets and still the issue was there. Only in games which was weird. I figured it out (which i should have earlier as it happened when i put in my second card) crossfire causes the sound to crackle. upon goggling im not alon as a matter of a fact this issue is pretty common and to date still hasn't been fixed by amd. I got it to not do it for a little while by uninstalling all amd drivers(well sound drivers) however I just tried that again but it didn't work this time (granted i uninstalled the hd audio devices first so the others just disappeared that may be why). Anyway needless to say this obviously has to do with the audio on the cards anyway to remove that is there a bios or something does anyone have any fix for this. Its not 1 card or the other as I have tried both by themselves with no issues also disabling crossfire solves it its only when crossfire is on it does it and it makes games unplayable for me. (it also doesn't happen with music even when running heaven only game audio).
> 
> Okay upon some more reading I have found out something else it only happens when in fullscreen and with vsync off. Games that do it for me are all that i play lately. Dragon age, Wicther 2, archeage, shadow of mordor, need for speed most wanted/shift. TBH it probably does it on all my games those are just the ones i have checked. so I get to choose now no crossfire or no vsync IE. not enough FPS or Stutter game play or deal with crackling audio. TBH as my first amd cards not to happy with this issue.


What driver version and are you heavily overclocked on cpu?


----------



## Cyber Locc

Quote:


> Originally Posted by *DividebyZERO*
> 
> What driver version and are you heavily overclocked on cpu?


All of them, lol. I have tried all the drivers I still had in my downloads, as well as the new betas, so like 6 different drivers: 14.6-14.9, plus 14.9.1 and 14.9.2.

My CPU overclock is mildly high, but that's also not the problem, as I have cleared the CMOS twice and run with no overclock on CPU or memory, reset a few times, and swapped drivers, all with no overclock.

Also, again, I'll point out that a lot of people are having the same issue. The people I read about who have experienced it have been on Windows 8.1, 8, and 7. It's not the OS, nor anything else; it's the drivers. It has to be, there's simply nothing else it could be. Also, pretty much all the other people I have seen with the same issue have R9 290s, not 290Xs, so maybe the issue isn't there with 290Xs; I'm not sure.

Also, just in case it comes up: yes, I have reinstalled the sound drivers multiple times and reinstalled the games and Windows. The GPUs are the problem.

Here is one such thread that was updated recently: http://www.tomshardware.com/answers/id-1916025/audio-stutter-crackling-vsync-crossfire.html. After suspecting the GPU and searching for the issue with "GPU" included, there are a ton of threads just like this one. I'm seriously amazed no one on here has mentioned this, as apparently it's been happening since the launch of these cards.

So thanks to this thread http://www.overclock.net/t/1516541/sound-crackling-crossfire-only/30#post_23058446 I have found another variable: it only happens if one of the monitors is connected via HDMI (which makes sense, as that's the only port that will pass sound AFAIK). That still really isn't viable for me; I need that port. It's bad enough they limit us to 4 monitors; limiting me to 3 ain't gonna work.


----------



## taem

So I have a PowerColor PCS+ 290. This card has a default +50mV adjustment in its BIOS. I want to flash it with a BIOS from a different 290 to run at reference voltage.

Can I use any BIOS? This card has Elpida VRAM. Do I need to find a BIOS with Elpida listed in the BIOS details, or can I use one that lists Hynix?

Case in point: I flashed it with a 290 Sapphire Tri-X BIOS (I chose the one that lists both Hynix and Elpida), and it's behaving oddly. For one thing, I owned the Tri-X card for a while, and the fan curve I'm getting is off: it caps at 41% max fan speed no matter how high the temp gets. The real Tri-X definitely did not cap at 41% fan.

And with this new BIOS flashed on the card, Sapphire TriXX crashes every time I try to go into its settings.

The GPU-Z readout looks fine though, and the flashing itself obviously went fine; at least, I got the successful-flash window and GPU-Z shows the correct info for the new BIOS.


----------



## DividebyZERO

Quote:


> Originally Posted by *cyberlocc*
> 
> All of them lol, I have tried all the drivers I still had in downloads as well as the new betas, so like 6 different drivers, 14.6-14.9 and 9.1 and 9.2.
> 
> My CPU overclock is mildly high; it's also not the problem, as I have cleared the CMOS 2 times, used no overclock on CPU or memory, reset a few times, and swapped drivers, all with no overclock.
> 
> Also, again I'll refer you to the fact that a lot of people are having the same issue; it's the GPU. The people I read about that have also experienced it have been on Windows 8.1, 8, and 7. It's not the OS nor anything else, it's the drivers, it has to be; there's simply nothing else it could be. Also, pretty much all the other people I have seen having the same issue have R9 290s, not 290Xs, so maybe the issue isn't there with 290Xs, I'm not sure.
> 
> Also, just as it might come up: yes, I have reinstalled the sound drivers multiple times and reinstalled the games and Windows. The GPUs are the problem.
> 
> Here is one such thread that was updated recently: http://www.tomshardware.com/answers/id-1916025/audio-stutter-crackling-vsync-crossfire.html. After thinking it may be the GPU and searching for the issue with the GPU in it, there are a ton of threads just like this one. I'm seriously amazed no one on here has mentioned this, as apparently it's been happening since the launch of these cards.
> 
> So thanks to this thread http://www.overclock.net/t/1516541/sound-crackling-crossfire-only/30#post_23058446 I have found another variable: it only happens if one of the monitors is connected via HDMI (which makes sense, as that's the only port that will pass sound AFAIK). That still really isn't viable for me though; I need that port. It's bad enough they limit us to 4 displays, limiting me to 3 ain't gonna work.


I bought my R9 290s on launch day and have been running quadfire since. The only sound issue I've had was with the 13.12 drivers and vsync enabled. They knew it was a bug and have since fixed it, supposedly. Not sure what to say, I'm using onboard audio.


----------



## Cyber Locc

Quote:


> Originally Posted by *DividebyZERO*
> 
> I bought and have had my R9 290s since launch day running quadfire. The only sound issue I've had was with 13.12 drivers and vsynch enabled. They knew it was a bug and since fixed it supposedly. Not sure what to say, im using onboard audio.


Do you have an HDMI monitor connected? Good to hear the "fix" worked for you; you're the first person I have seen that it's been fixed for, so maybe there's hope of a fix for us too.









I also am usually using onboard audio, as I have really good onboard audio, but my Corsair 2100s also suffer the same fate, as does an old sound card I have and a friend's that he let me try.


----------



## Sgt Bilko

Quote:


> Originally Posted by *cyberlocc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DividebyZERO*
> 
> I bought my R9 290s on launch day and have been running quadfire since. The only sound issue I've had was with the 13.12 drivers and vsync enabled. They knew it was a bug and have since fixed it, supposedly. Not sure what to say, I'm using onboard audio.
> 
> 
> 
> Do you have an HDMI monitor connected? Good to hear the "fix" worked for you; you're the first person I have seen that it's been fixed for, so maybe there's hope of a fix for us too.

Same here, 13.12 was the only driver that caused any audio crackling and that was only with Vsync on.

Every driver after that has been great for me.

Oh... yes, I am/was using HDMI for both the 295X2 and the R9 290 Crossfire I had before.


----------



## Cyber Locc

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Same here, 13.12 was the only driver that caused any audio crackling and that was only with Vsync on.
> 
> Every driver after that has been great for me.
> 
> Oh....yes i am/was using HDMI for both the 295x2 and R9 290 Crossfire i had before.


Hmm, were you also using DisplayPort? Just trying to figure out why it fixed it for you guys but not all these other people. That doesn't seem to be the issue though; it's way less noticeable if DisplayPort is not hooked up at the same time, but it's still there, granted fainter, you really have to listen.

I need more info on your guys' HDMI monitors: did they support sound over HDMI or not, and were they adapted to DVI or straight HDMI?


----------



## Sgt Bilko

Quote:


> Originally Posted by *cyberlocc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Same here, 13.12 was the only driver that caused any audio crackling and that was only with Vsync on.
> 
> Every driver after that has been great for me.
> 
> Oh....yes i am/was using HDMI for both the 295x2 and R9 290 Crossfire i had before.
> 
> 
> 
> Hmm, were you also using DisplayPort? Just trying to figure out why it fixed it for you guys but not all these other people.

I never changed anything; it was just the 13.12 WHQL driver, when I had Vsync on, that caused it. Every other driver afterwards was fine.

Have you tried swapping the cards around so you are using the second card's ports instead of the first's?


----------



## Cyber Locc

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I never changed anything, it was just the 13.12 WHQL driver when i had Vsync on that caused it, every other driver afterwards was fine.
> 
> Have you tried swapping the cards around so you are using the second cards ports instead of the first?


Yes, and so did the guy from the other OCN thread. From those 2 threads alone there are 15 people that still have this issue; I don't think all of our cards are bad. (And that's only 2 threads of a lot of threads that were made after 13.12, so it's not the cards.) There has to be some difference in the setups that makes the fix work for you and not us. It may be the card's brand, the production model, the number of monitors, any number of things, but it never hurts to try and diagnose to help fix it.

Well, I just hooked up my TV, which does have speakers, and it still makes the noise. My HDMI cable is a few years old but still works for other things, so I doubt that's the issue. What revision are your guys' cards?

My TV is fairly old though, so it uses the original version of HDMI. Also my monitor is using an adapter.


----------



## Cyber Locc

Well, I tried unhooking and rehooking stuff and it got even worse lol. So I uninstalled 14.9 and installed 13.12, and then it got extremely bad, so now I'm reinstalling 14.9 to see if coming back from 13.12 fixes it.


----------



## Yvese

Does anyone know whether or not the 2nd card in CF HAS to be in the 2nd PCIe slot, or if you can put it in the far bottom slot? I checked the manual on my mobo, but it only mentions the 3rd PCIe slot for tri-fire, which it lists as x4. Curious if it's still x4 in CF though.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Yvese*
> 
> Anyone know whether or not the 2nd card in CF HAS to be on the 2nd PCIE slot or if you can put it at the far bottom slot? I checked the manual on my mobo but it only mentions the 3rd PCIE slot if using tri-fire which lists it as x4. Curious if it's still x4 in CF though.


Any slot you wish, just x8 or x16 imo


----------



## battleaxe

Quote:


> Originally Posted by *cyberlocc*
> 
> All of them lol, I have tried all the drivers I still had in downloads as well as the new betas, so like 6 different drivers, 14.6-14.9 and 9.1 and 9.2.
> 
> My CPU overclock is mildly high; it's also not the problem, as I have cleared the CMOS 2 times, used no overclock on CPU or memory, reset a few times, and swapped drivers, all with no overclock.
> 
> Also, again I'll refer you to the fact that a lot of people are having the same issue; it's the GPU. The people I read about that have also experienced it have been on Windows 8.1, 8, and 7. It's not the OS nor anything else, it's the drivers, it has to be; there's simply nothing else it could be. Also, pretty much all the other people I have seen having the same issue have R9 290s, not 290Xs, so maybe the issue isn't there with 290Xs, I'm not sure.
> 
> Also, just as it might come up: yes, I have reinstalled the sound drivers multiple times and reinstalled the games and Windows. The GPUs are the problem.
> 
> Here is one such thread that was updated recently: http://www.tomshardware.com/answers/id-1916025/audio-stutter-crackling-vsync-crossfire.html. After thinking it may be the GPU and searching for the issue with the GPU in it, there are a ton of threads just like this one. I'm seriously amazed no one on here has mentioned this, as apparently it's been happening since the launch of these cards.
> 
> So thanks to this thread http://www.overclock.net/t/1516541/sound-crackling-crossfire-only/30#post_23058446 I have found another variable: it only happens if one of the monitors is connected via HDMI (which makes sense, as that's the only port that will pass sound AFAIK). That still really isn't viable for me though; I need that port. It's bad enough they limit us to 4 displays, limiting me to 3 ain't gonna work.


Try using DDU to uninstall the AMD audio bus only.


----------



## bond32

Did you try a sound card? The ASUS Xonar is only in the $20 range on Amazon...


----------



## taem

Quote:


> Originally Posted by *Yvese*
> 
> Anyone know whether or not the 2nd card in CF HAS to be on the 2nd PCIE slot or if you can put it at the far bottom slot? I checked the manual on my mobo but it only mentions the 3rd PCIE slot if using tri-fire which lists it as x4. Curious if it's still x4 in CF though.


If the manual specs the slot as the third in tri-fire, there's a good chance the slot is capped at x4. I don't use the high end x99 and prior iteration boards, but the z77, z87, z97 boards are often this way.


----------



## Cyber Locc

Quote:


> Originally Posted by *battleaxe*
> 
> Try using DDU to uninstall the AMD audio bus only.


Yep, tried removing the drivers a few different ways and in different orders.

@Bond32 - yes I did, 2 different sound cards and 2 USB headsets, which are their own DACs. Plus I have a Rampage IV Black Edition, so that would be a serious audio downgrade.


----------



## bond32

Quote:


> Originally Posted by *cyberlocc*
> 
> Yep, tried removing the drivers a few different ways and in different orders.
> 
> @Bond32 - yes I did, 2 different sound cards and 2 USB headsets, which are their own DACs. Plus I have a Rampage IV Black Edition, so that would be a serious audio downgrade.


Dunno man, probably not a solution at all but if I were you I would just reformat and start fresh. Sounds like some sort of driver issue, would be my guess.


----------



## Dimaggio1103

So I've had the card a few days now and I gotta say it is a champ. Aside from the driver issues I've had, this card is a beast. Playing Shadow of Mordor at 5760x1080 with Ultra textures at playable framerates. Totally stunned. So happy with its horsepower.


----------



## pdasterly

Middle earth is nice. Disappointed in civilization beyond earth


----------



## Dimaggio1103

Quote:


> Originally Posted by *pdasterly*
> 
> Middle earth is nice. Disappointed in civilization beyond earth


That was gonna be my next buy after Borderlands. What was disappointing about it?


----------



## pdasterly

They tried too hard. It's all outer space. Nothing is the same except the gameplay.


----------



## sugarhell

Quote:


> Originally Posted by *pdasterly*
> 
> They tried too hard. Its all outer space. Nothing is the same except gameplay.


That's the point...


----------



## pdasterly

They should have built on top of the other game, like the previous editions did.


----------



## Cyber Locc

Quote:


> Originally Posted by *bond32*
> 
> Dunno man, probably not a solution at all but if I were you I would just reformat and start fresh. Sounds like some sort of driver issue, would be my guess.


Did that too. However, I'm going to try to install Windows 8.1 today, see if maybe that works.

I just don't understand what could be causing it, as I have disabled the audio drivers in every way possible. I went as far as to disable all HD audio in the registry, which also disabled my onboard sound, and still it persists.

Unplugging the HDMI cable was working, but it doesn't now.


----------



## NotReadyYet

Took advantage of the deal going on at Newegg for an XFX 290X DD Edition card. I bought it for $330 ($300 after rebate) and sold my 2-year-old Gigabyte 7970 for $150 flat. In the end, I basically paid $150 to upgrade to the 2nd best AMD card. I haven't decided if I want to sell the Never Settle bundle code to save even more money.

Any XFX owners on here and care to comment on their card?


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Use a coolant vs a coil?
> 
> I use koolance because they've been making pc specific qdc the longest so they generally have it down. Though they are far from perfect mind you. The plating gets stripped over time, they can clog too, and they are expensive, though wc as a hobby is expensive so it goes with the territory as they say.
> 
> Basically, for connecting GPUs, you need a set of two each, male and female, thus 4 total. **One matched set are compression fittings and one matched set are G1/4 (screw into the GPU ports). And also, if you are enterprising, another matched set that you use with a length of tubing for those days when you need the block(s) out of the loop for longer. This is where the extra set comes in, as you use it to "complete" your loop so it can run w/o your GPU. For ex. testing an air card or repair/RMA/etc., whatever may arise.
> 
> When picking QDCs out, go slow and make sure you have matching sets, i.e. male/female loop-side and male/female GPU-side, then whether you want a direct fitting on the GPU side or not, and finally your tubing size. Triple check, as you do not want to order the wrong ones because shipping and returns are hard to negotiate with WC vendors.
> 
> **I prefer shorter/closer connections when using QDCs, but you are free to use compressions completely with a length of tubing. Choice is good, but that is rather redundant imo.
> 
> G1/4 vs Compression to gpu block:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Went to Koolance site and found some. about $50 for a pair (4pcs) excluding shipping. local company is based in Washington state. i may have to invest on those.

You should have gotten an extra matching set so it is more flexible. For ex., in this pic you can see the Lightning block removed and the black section of tube and QDC. Without this "adapter", testing air-cooled cards and general troubleshooting is a huge pain in the ass.


----------



## ZealotKi11er

Quote:


> Originally Posted by *NotReadyYet*
> 
> Took advantage of the deal going on at Newegg for an XFX 290X DD Edition card. I bought it for $330 ($300 after rebate) and sold my 2-year-old Gigabyte 7970 for $150 flat. In the end, I basically paid $150 to upgrade to the 2nd best AMD card. I haven't decided if I want to sell the Never Settle bundle code to save even more money.
> 
> Any XFX owners on here and care to comment on their card?


Good upgrade.


----------



## Yvese

Quote:


> Originally Posted by *taem*
> 
> If the manual specs the slot as the third in tri-fire, there's a good chance the slot is capped at x4. I don't use the high end x99 and prior iteration boards, but the z77, z87, z97 boards are often this way.


Hmm that's the thing though. My manual says all 3 slots are PCIE 3.0 x16 which is why I'm a bit confused.

Here's the manual: ftp://66.226.78.21/manual/Z87%20Extreme6.pdf

It's on page 8.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Any slot you wish, just x8 or x16 imo


I'm hoping this is the case. I'd like to CF, but the Tri-X/Vapor-X takes up like 2.5 PCI slots, so if I use the 2nd slot there would be barely any space between them. If the third slot is locked to x4, though, that would be a problem.


----------



## kizwan

Quote:


> Originally Posted by *Yvese*
> 
> Quote:
> 
> 
> 
> Originally Posted by *taem*
> 
> If the manual specs the slot as the third in tri-fire, there's a good chance the slot is capped at x4. I don't use the high end x99 and prior iteration boards, but the z77, z87, z97 boards are often this way.
> 
> 
> 
> Hmm that's the thing though. My manual says all 3 slots are PCIE 3.0 x16 which is why I'm a bit confused.
> 
> Here's the manual: ftp://66.226.78.21/manual/Z87%20Extreme6.pdf
> 
> It's on page 8.

Yup, you're confused.







Yes, they're PCIe 3.0 x16 *slots*, but they're x16, x8 & x4 lanes respectively. The "PCIe 3.0 x16 *slot*" is actually referring to the *physical slot* size, not the actual lanes.

When running Crossfire, your cards will run at x8/x8 (PCIE2 & PCIE3).

When running 3-way Crossfire, your cards will run at x8/x4/x4.
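For anyone who wants to see the split spelled out, here's a toy sketch (Python, purely illustrative, not from any board vendor; the 16 / 8+8 / 8+4+4 splits are just the standard bifurcation options for Haswell's 16 CPU lanes discussed above):

```python
# Toy model of CPU PCIe lane bifurcation on a plain Z87 board (no PLX chip).
# Assumption for illustration: only the fixed splits below are possible,
# matching the Z87 Extreme6 behavior described in this thread.

def lane_split(populated_slots):
    """Return lanes per populated x16-size slot, given 16 CPU lanes total."""
    splits = {1: [16], 2: [8, 8], 3: [8, 4, 4]}
    if populated_slots not in splits:
        raise ValueError("board feeds at most 3 slots from the CPU")
    return splits[populated_slots]

print(lane_split(1))  # [16]        single card
print(lane_split(2))  # [8, 8]      2-way CrossFire
print(lane_split(3))  # [8, 4, 4]   3-way CrossFire
```

Note that every split sums to 16: the physical slot size never changes, only how the CPU's lanes are divided among the populated slots.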


----------



## jagdtigger

Quote:


> Originally Posted by *cyberlocc*
> 
> Did that too. However, I'm going to try to install Windows 8.1 today, see if maybe that works.
> 
> I just don't understand what could be causing it, as I have disabled the audio drivers in every way possible. I went as far as to disable all HD audio in the registry, which also disabled my onboard sound, and still it persists.
> 
> Unplugging the HDMI cable was working, but it doesn't now.


Try it out with a Linux live CD... If it's working properly there, then it's a software issue.


----------



## Cyber Locc

Quote:


> Originally Posted by *jagdtigger*
> 
> Try it out with a linux live cd... If its working properly than its a software issue.


Can you even run Crossfire on Linux?

Update: Googled it; no, you can't.


----------



## jagdtigger

I didn't know about that...







I use a Linux live CD when I can't solve some issue that is related to hardware, and I'm not sure if the HW or the software is the cause of the problem...


----------



## Cyber Locc

Quote:


> Originally Posted by *jagdtigger*
> 
> I didn't know about that...
> 
> 
> 
> 
> 
> 
> 
> I use a Linux live CD when I can't solve some issue that is related to hardware, and I'm not sure if the HW or the software is the cause of the problem...


Software is the cause of the problem; if it were hardware, it wouldn't be such a limited scope, i.e. only when using Crossfire with vsync on. That is definitely a driver issue.

Thanks for trying to help though, I appreciate it.


----------



## taem

Quote:


> Originally Posted by *Yvese*
> 
> Hmm that's the thing though. My manual says all 3 slots are PCIE 3.0 x16 which is why I'm a bit confused.
> 
> Here's the manual: ftp://66.226.78.21/manual/Z87%20Extreme6.pdf
> 
> It's on page 8


Like kizwan said. I've never seen a z87 board without a PLX chip that has more than one slot running at an actual x16.

All the ones I've seen have one x16, one x8, and one x4, and it's hardwired; you don't get to choose which slot runs at which width.

I don't see a difference between x8 and x16 personally. Even x4 doesn't incur a huge performance loss. Right now I have my 290 in an x8 slot because it's more convenient there.
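To put rough numbers on why x8 barely matters: PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding, so theoretical one-direction bandwidth scales linearly with lane count. A quick back-of-the-envelope calc (Python; these are theoretical peaks, real-world throughput is lower):

```python
# Theoretical one-direction PCIe 3.0 bandwidth per link width.
# Gen3: 8 GT/s per lane, 128b/130b encoding -> ~0.985 GB/s per lane.

def pcie3_bandwidth_gbps(lanes):
    """Approximate PCIe 3.0 bandwidth in GB/s for a given lane count."""
    per_lane = 8e9 * (128 / 130) / 8 / 1e9  # ~0.985 GB/s per lane
    return lanes * per_lane

for lanes in (16, 8, 4):
    print(f"x{lanes}: {pcie3_bandwidth_gbps(lanes):.2f} GB/s")
# x16 ~ 15.75, x8 ~ 7.88, x4 ~ 3.94 GB/s
```

Even x4 gives you close to 4 GB/s each way, which is why the hit from a narrower slot is usually small at these cards' frame rates.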


----------



## Cyber Locc

Quote:


> Originally Posted by *taem*
> 
> Like Kizwan said. I've never seen a z87 board without a plx chip that has more than one slot that runs at actual x16.
> 
> All the ones I've seen have one x16, one x8, and one x4, and it's hardwired, you don't get to choose which slot to use.
> 
> I don't see a difference between x8 and x16 personally. Even x4 doesn't incur a huge performance loss. Right now I have my 290 in an x8 slot because it's more convenient there.


You will never see a z87 board that supports x16/x16 without a PLX chip, as Haswell CPUs only have 16 lanes. You need a PLX chip to add more lanes.

You can't even run 1 GPU at x16 and another at x8 according to Intel's specs (that's actually from the 4790K spec sheet, although I'm sure it's the same), so your options are, as said, x16, or x8/x8, or x8/x4/x4.

As to the question: yeah, it's x16/x8/x4, and running cards in slots 1 and 3 will give you x8/x4.


----------



## Arizonian

Quote:


> Originally Posted by *Buehlar*
> 
> Without internet access??? j/k... be safe
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BTW...could you update me? I have my 290x's underwater now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


That is sweet man. Nice work. Congrats - updated


----------



## amptechnow

Quote:


> Originally Posted by *NotReadyYet*
> 
> Took advantage of the deal going on at Newegg for an XFX 290X DD Edition card. I bought it for $330 ($300 after rebate) and sold my 2-year-old Gigabyte 7970 for $150 flat. In the end, I basically paid $150 to upgrade to the 2nd best AMD card. I haven't decided if I want to sell the Never Settle bundle code to save even more money.
> 
> Any XFX owners on here and care to comment on their card?


I have a reference XFX 290. It has Elpida memory. On the stock BIOS I can get 1125 core for gaming and 1150 core for benching, with a 1625 memory clock for either. Good bench scores and smooth gameplay performance. Using the PT1 BIOS I can hit 1200/1625 with modded Trixx or modded ASUS GPU Tweak and upping the voltage. It's under water, and temps stay low, as do the VRMs. I am very happy with my XFX card. I also contacted customer support and got a quick response: they said it was OK to remove the stickers on the card and install a water block, and that my warranty would still be valid. I would buy XFX again, despite many people having bad things to say about them. It also plays well with my PowerColor PCS+ 290.

At first I had some black screen issues when the computer went to sleep, but that was a year ago and I haven't had one in a long, long time. Seems drivers fixed it.

Welcome to the 290 party, enjoy your card.

EDIT:
I have a few questions for everyone with a 290 and/or 290 CFX... What settings do you run in AMD CCC under the gaming tab? I have been messing with settings for months trying to get it perfect, with the best quality and a good frame rate, but I just can't decide. I would really like to shoot for the absolute best quality. I game at 1080p and have a 120Hz panel, but many settings look the same to me and there are huge performance differences. Does anyone use supersampling? Or just multisampling? Or adaptive multisampling?
And do you use vsync and triple buffering?
Surface format optimization?
Normal, performance, or high quality?

Are the visual differences that noticeable? Maybe my vision is bad, idk...

I like the higher frame rates of the default settings, but I really like the games to look as good as possible... what do you guys use?


----------



## taem

Quote:


> Originally Posted by *cyberlocc*
> 
> You will never see a z87 board that supports x16x2 without plx as haswell cpus only have 16 lanes. You need a plx chip to add more lanes.
> 
> You cant even run 1 gpu at x16 and another at x8 according to intels specs (thats actually from the 4790k spec sheet although im sure its the same) so your options are as said x16, or x8 x8, or x8 x4 x4.
> 
> As to the question ya its x16 x8 x4 runnibg cards in 1 and 3 will cause x8 x4.


Right. The key issue here is that of all the z87/z97 boards I know of, how you allot those PCI lanes is not user configurable. The slots on the mb are already hard wired with only the top slot switching between x16 and x8, so you can't put your card in a slot spec'd at x4 and run it in x16 or x8, even if you have no other cards in the system. So the CPU lanes aren't the only limit.

It needn't be that way, there's no reason a z87/97 mb couldn't configure each slot for x16 and allow the user to define how the lanes are allotted. You already have this with the x4 slot if you have m2 PCI on the board. I doubt the cost of this would be very much.


----------



## NotReadyYet

Quote:


> Originally Posted by *amptechnow*
> 
> I have a reference XFX 290. It has Elpida memory. On the stock BIOS I can get 1125 core for gaming and 1150 core for benching, with a 1625 memory clock for either. Good bench scores and smooth gameplay performance. Using the PT1 BIOS I can hit 1200/1625 with modded Trixx or modded ASUS GPU Tweak and upping the voltage. It's under water, and temps stay low, as do the VRMs. I am very happy with my XFX card. I also contacted customer support and got a quick response: they said it was OK to remove the stickers on the card and install a water block, and that my warranty would still be valid. I would buy XFX again, despite many people having bad things to say about them. It also plays well with my PowerColor PCS+ 290.
> 
> At first I had some black screen issues when the computer went to sleep, but that was a year ago and I haven't had one in a long, long time. Seems drivers fixed it.
> 
> Welcome to the 290 party, enjoy your card.
> 
> EDIT:
> I have a few questions for everyone with a 290 and/or 290 CFX... What settings do you run in AMD CCC under the gaming tab? I have been messing with settings for months trying to get it perfect, with the best quality and a good frame rate, but I just can't decide. I would really like to shoot for the absolute best quality. I game at 1080p and have a 120Hz panel, but many settings look the same to me and there are huge performance differences. Does anyone use supersampling? Or just multisampling? Or adaptive multisampling?
> And do you use vsync and triple buffering?
> Surface format optimization?
> Normal, performance, or high quality?
> 
> Are the visual differences that noticeable? Maybe my vision is bad, idk...
> 
> I like the higher frame rates of the default settings, but I really like the games to look as good as possible... what do you guys use?


Thanks for the feedback. This is my first XFX card ever. My first card goes way back in the day to a Voodoo FX, then an Nvidia once Voodoo was bought out by them. The Nvidia card gave me so many problems that I switched to ATI and have stayed with them ever since (I did have a 9800GT briefly).

I wasn't aware there were any bad things to say about XFX. What's up with that? Lots of bad cards or something?


----------



## Cyber Locc

Quote:


> Originally Posted by *taem*
> 
> Right. The key issue here is that of all the z87/z97 boards I know of, how you allot those PCI lanes is not user configurable. The slots on the mb are already hard wired with only the top slot switching between x16 and x8, so you can't put your card in a slot spec'd at x4 and run it in x16 or x8, even if you have no other cards in the system. So the CPU lanes aren't the only limit.
> 
> It needn't be that way, there's no reason a z87/97 mb couldn't configure each slot for x16 and allow the user to define how the lanes are allotted. You already have this with the x4 slot if you have m2 PCI on the board. I doubt the cost of this would be very much.


I do agree that it doesn't need to be like that, but I'm pretty sure it is. I do not know for absolute certain though. I know my z77 boards were that way, and every board, even x79 boards, has slots where they want the GPUs to go. The only way to truly find out is to try it; worst case it doesn't work, then move them up. I wouldn't plan for it to work, however, as I'm quite sure it won't.


----------



## amptechnow

Quote:


> Originally Posted by *NotReadyYet*
> 
> Thanks for the feedback. This is my first XFX card ever. My first card goes way back in the day to VoodooFx, then an Nvidia once Voodoo was bought out by them. The Nvidia card gave me so many problems that I switched to ATI and have stayed with them ever since (I did have a 9800GT briefly).
> 
> I wasn't aware there were any bad things to say about XFX. What's up with that? Lots of bad cards or something?


I am not sure. I have just heard many bad things on this site about them, especially when the 290 first dropped; when people were asking which cards were recommended, everyone said don't go XFX. I was so scared after reading this stuff because I had just got my card, but it has turned out to be a great card. With some overclocking I get 290X scores and beat some 290Xs too! And like I said, I never have any issues as of the last long while: no game crashes, no driver crashes, no issues unless I push my overclock too far.

I have, though, recently seen people recommend XFX because of their good warranty and the fact that it transfers to a new buyer.

In hindsight I would rather have bought another reference XFX 290 over my PowerColor PCS+. Cheaper and better overclocking, under water that is. On stock cooling the PCS+ destroys a reference design, and my PCS+ never goes above 62-65 degrees and the fan isn't too loud.


----------



## Yvese

Thanks for the responses guys. It's a shame, since I don't think I'll be able to CF now; I'd be cutting it close with how big the Tri-X/Vapor-X cooler is. I was hoping to put the 2nd card in the 3rd PCIe slot, but alas.









Now my only option is to go 970 SLI if I want more power. Buying a different brand 290 is not an option since Sapphire is the only one that's smart enough to combine GPU and VRM cooling on the same heatsink.


----------



## ebhsimon

Quote:


> Originally Posted by *Yvese*
> 
> Thanks for the responses guys. It's a shame since I don't think I'll be able to CF now since I'd be cutting it close with how big the Tri-X/Vapor-X cooler is. I was hoping to put the 2nd card on the 3rd PCIE slot but alas
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now my only option is to go 970 SLI if I want more power. Buying a different brand 290 is not an option since Sapphire is the only one that's smart enough to combine GPU and VRM cooling on the same heatsink.


Hold on, what's the problem? I have crossfired 290s, a Vapor-X and a PCS+, and I can tell you the Vapor-X is HUGE. But because I don't want to go back and read through the last 10 pages: what was your problem? I might be able to help you out.


----------



## Arctic Storm

I am SO tempted to pick up two 290x's and two water blocks to update my rig. I currently have 2x GTX680 (2gb) driving a 1440p screen. I'm definitely starting to see the limitations for 2gb in stuff like Metro Last Light. I would eventually like to go 4k, so hell, why not even get three. Prices are pretty much going down every time I check!


----------



## Yvese

Quote:


> Originally Posted by *ebhsimon*
> 
> Hold on what's the problem? I have crossfire 290s Vapor-X and PCS+ and I can tell you the Vapor-X is HUGE. But because I don't want to go back and read through the last 10 pages what was your problem? I might be able to help you out.


My concern is the size of the tri-x/vapor-x as you said. If I CF, there will be very little space between the two cards due to their size. My earlier post asked if I could move the 2nd card down to the next PCIE slot but turns out I can't since it would run in x4.

So really I'm just concerned about airflow. Maybe you could post a pic of your cards? I'm curious just how close together they are. Then again I see your case has a modded side exhaust so it shouldn't be a problem for you.


----------



## ebhsimon

Quote:


> Originally Posted by *Yvese*
> 
> My concern is the size of the tri-x/vapor-x as you said. If I CF, there will be very little space between the two cards due to their size. My earlier post asked if I could move the 2nd card down to the next PCIE slot but turns out I can't since it would run in x4.
> 
> So really I'm just concerned about airflow. Maybe you could post a pic of your cards? I'm curious just how close together they are. Then again I see your case has a modded side exhaust so it shouldn't be a problem for you.


I actually run my crossfire setup with another un-modded side panel. The main problem was the intake fans, which weren't really strong enough, and I thought 70C under load was hot. Also, I face my PSU up now to create another point of exhaust, to hopefully 'pull' cool air down to the bottom corner. It works quite well: in crossfire, overclocked with +100mV and +50% power limit @ 1070/1400, in Battlefield @ 1440p, both cards get to 72C max. The PCS+ is on top because it has a more aggressive fan profile and in single-GPU setups is about 6C cooler than the Vapor-X.


----------



## Yvese

Quote:


> Originally Posted by *ebhsimon*
> 
> I actually run my crossfire setup with another un-modded side panel. The main problem was the intake fans which weren't really strong enough and I thought 70C under load was hot. Also I face my PSU up now to create another point of exhaust to hopefully 'pull' cool air down to the bottom corner. It works quite well and in crossfire overclocked with +100mV, +50% power limit @ 1070/1400 in battlefield @ 1440p both cards get to 72C max. The PCS+ is on top because it has a more aggressive fan profile and in single GPU set ups is about 6C cooler than the Vapor-X.


Very nice. I looked at your board on newegg and it looks like your board lets you put the 2nd card one slot below compared to my board, allowing more space for airflow. Looking at your third picture, that black PCI slot would be where my PCIE slot would be for a 2nd GPU. That would be an extremely tight fit


----------



## ebhsimon

Quote:


> Originally Posted by *Yvese*
> 
> Very nice. I looked at your board on newegg and it looks like your board lets you put the 2nd card one slot below compared to my board, allowing more space for airflow. Looking at your third picture, that black PCI slot would be where my PCIE slot would be for a 2nd GPU. That would be an extremely tight fit


These are both 2.5 slot coolers, you can't put it in the PCI-E slot under it. If you want to put two 290s in PCI-E slots 1 and 2 you'll need to get 2 slot cards WITHOUT back plates. Your Tri-X is already classified as a 2.2 slot card, but I don't have one so I can't comment on if you'd be able to fit another card in.


----------



## Bertovzki

What is the best R9 290 to put under water? The most stable, trouble-free card?


----------



## Red1776

Quote:


> Originally Posted by *Bertovzki*
> 
> What is the best R9 290 to put under water ? the most stable trouble free card ?


 Well you have opened a can of worms here.









Here is my opinion and why.

I have built a quadfire machine for my own use every 9-12 months, and the last three have been MSI for the reasons you mention.

This current build is a quartet of MSI Gaming cards, but I'd recommend MSI as a whole. I'm not saying they're the only good option, but I have had a terrific experience with MSI for three generations now. I don't know how it's decided who gets what chips, but...

Certainly one to be considered.


----------



## Bertovzki

Quote:


> Originally Posted by *Red1776*
> 
> Well you have opened a can of worms here.


Good I like worms !







Yeah, I guess there could be a lot of opinions about this topic; that's what the thread's for, though, I suppose.

Thanks for your 2 cents


----------



## Ironsmack

Quote:


> Originally Posted by *Bertovzki*
> 
> What is the best R9 290 to put under water ? the most stable trouble free card ?


Simple... A stock reference card.

If you want OC'd, then there are variable factors - BIOS, VRM's temp, Elpida/Hynix memory, etc.


----------



## Bertovzki

Quote:


> Originally Posted by *Ironsmack*
> 
> Simple... A stock reference card.
> 
> If you want OC'd, then there are variable factors - BIOS, VRM's temp, Elpida/Hynix memory, etc.


At first I'm not concerned about OC, but I will be later down the track. Also just one card at first, then two one day. I hear Hynix is better for OC so I'll plan for that.

Thanks for reply


----------



## maynard14

i have a 290x xfx reference card, mine is elpida but with an asic quality of 79.9 percent. i can oc to a gpu core of 1105 and memory to 1460 with stock voltage using the stock reference cooler







so i think i got a good card even its elpida


----------



## Spectre-

Quote:


> Originally Posted by *Bertovzki*
> 
> At first not concerned about OC , but I will later down the track and ,also just one card at first, then just 2 one day., I hear Hynix better for OC so ill plan for that.
> 
> Thanks for reply


hynix oc's way better

i can oc my memory up to 1650mhz


----------



## Bertovzki

Quote:


> Originally Posted by *maynard14*
> 
> i have 290x xfx referece card, mine is elpida but with asic quality of 79.9 percent, i can oc to gpu core of 1105 and memory to 1460 with stock voltage using stock reference cooler
> 
> 
> 
> 
> 
> 
> 
> so i think i got a good card even its elpida


Quote:


> Originally Posted by *Spectre-*
> 
> hynix oc's a way better
> 
> i can oc my memory upto 1650mhz


Ok, thanks for the info guys. Anyone know which cards have Hynix and are good reliable cards? Still thinking of the Tri-X 290X OC as it's a good price here, and the standard Tri-X is still the cheapest in NZ. Does the Tri-X have Hynix?


----------



## ArchieGriffs

Tri-x has Hynix.


----------



## mAs81

Vapor-X 290 has Hynix too..


----------



## taem

There's a bios for the Tri-X that lists both Hynix and Elpida. Not sure if that means some of the Tri-Xs use Elpida.

As to which OCs better, I've had high-clocking Elpida cards and low-clocking Hynix cards. I assume that as a general rule Hynix OCs better, since everyone keeps saying it, but in my personal experience I've seen no difference.


----------



## amptechnow

Quote:


> Originally Posted by *Bertovzki*
> 
> What is the best R9 290 to put under water ? the most stable trouble free card ?


my xfx 290 reference with elpida does 1125/1625 no problem. though hynix does seem to overclock better more regularly, elpida cards can hit really good numbers. but it seems less likely according to most people's experiences.

like mentioned, if overclocking and going under water, definitely get a reference card!

msi and xfx have good warranties. i build pc's for a living and use msi and xfx cards all the time. they have good prices usually. i have had no issues with any of them with amd chips. one of my old gaming rigs has 2 msi gtx660's in sli and i have had nothing but trouble with them and have rma'ed soooo many times. but the process is easy and semi-quick. so if something were to happen you know you are covered with those companies and the warranties transfer if you sell the cards.

personally i recommend msi or xfx. i also like/use powercolor cards. one of my favorite brands has been asus, but i have not been too impressed with their gpu's and feel they are overpriced at times. ive done some builds with gigabyte cards with good results as well. i dont think i have ever used sapphire, his, or visiontek so i cant comment on them.....

i think it really comes down to personal preference. and with the good old silicon lottery its a toss up. reference cards under water are my personal favorite. youll be fine with msi, xfx, powercolor, or gigabyte.

or say forget the reference design and get this.... http://www.newegg.com/Product/Product.aspx?Item=N82E16814121904&cm_re=r9_290-_-14-121-904-_-Product lol


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> You should have gotten an extra matching set so it is more flexible. For ex. in this pic you can see the lightning block removed and the black section of tube and qdc. Without this "adapter" testing aircooled cards and general troubleshooting is a huge pain in the ass.


makes sense. no downtime. thanks.


----------



## ZealotKi11er

Quote:


> Originally Posted by *amptechnow*
> 
> my xfx 290 reference with elpida does 1125/1625 no problem. though hynix does seem to overclock better more regularly, elpida cards can hit really good numbers. but it seems less likely according to most peoples experiences.
> 
> like mentioned, if overclocking and going under water, definitely get a reference card!
> 
> msi and xfx have good warranties. i build pc's for a living and use msi and xfx cards all the time. they have good prices usually. i have had no issues with any of them with amd chips. one of my old gaming rigs has 2 msi gtx660's in sli and i have had nothing but trouble with them and have rma'ed soooo many times. but the process is easy and semi-quick. so if something were to happen you know you are covered with those companies and the warranties transfer if you sell the cards.
> 
> personally i recommend msi or xfx. i also like/use powercolor cards. one of my favorite brands has been asus, but i have not been too impressed with their gpu's and feel they are overpriced at times. ive done some builds with gigabyte cards with good results as well. i dont think i have ever used sapphire, his, or visiontek so i cant comment on them.....
> 
> i think it really comes down to personal preference. and with the good old silicon lottery its a toss up. reference cards under water are my personal favorite. youll be fine with msi, xfx, powercolor, or gigabyte.
> 
> or say forget the reference design and get this.... http://www.newegg.com/Product/Product.aspx?Item=N82E16814121904&cm_re=r9_290-_-14-121-904-_-Product lol


Memory after 1500MHz does not seem to make a difference at least in 3DMark. Might have to test it on games and see.


----------



## Buehlar

Quote:


> Originally Posted by *Arizonian*
> 
> That is sweet man. Nice work. Congrats - updated


Thanks man!

Leak check passed and it's now full of coolant. Really happy with the performance of these beasts, especially now that they're running a lot cooler.









I just sent in my photos and info for this month's Professional MOTM.

Drop by and vote for your favorite mod guys!

More photos in my log Mid-Lif Cry-Sys


----------



## mAs81

That is seriously amazing , kudos


----------



## Buehlar

Quote:


> Originally Posted by *mAs81*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That is seriously amazing , kudos


Thanks man... love the power these things have! My 3x1 Eyefinity is screaming!


----------



## bond32

It's about to get real... Time to throw some breakers.


----------



## Fickle Pickle

Quote:


> Originally Posted by *bond32*
> 
> It's about to get real... Time to throw some breakers.


Are you planning on replacing the H from your HVAC system in your house? Seems to me with those, you no longer need another unit.


----------



## bond32

Quote:


> Originally Posted by *Fickle Pickle*
> 
> Are you planning on replacing the H from your HVAC system in your house? Seems to me with those, you no longer need another unit.


YES. Brilliant. Now I need to design a geothermal loop... Send that heat 300+ feet down. Wouldn't need any radiators then...

I have a 360 monsta, 240 monsta, and 360 RX360 all in push pull with AP-15's. I still have plenty more radiators but according to my calculations I will be fine on rad space.


----------



## Fickle Pickle

Quote:


> Originally Posted by *bond32*
> 
> YES. Brilliant. Now I need to design a geothermal loop... Send that heat 300+ feet down. Wouldn't need any radiators then...
> 
> I have a 360 monsta, 240 monsta, and 360 RX360 all in push pull with AP-15's. I still have plenty more radiators but according to my calculations I will be fine on rad space.


I like the Geothermal loop idea!

Yeah, that sounds like enough rad. Funny thing is I have exactly that much rad space just for my single 290x and a 4790k.


----------



## tsm106

Ok, had a decent time this morning. Got my 290x back into the hof in some benches. I should run some singles but haven't gotten to that yet. Testing this new lightning. First one I bought. She went right to 1300 and change. Not bad lol.

fsu
http://www.3dmark.com/fs/3094249
fse
http://www.3dmark.com/fs/3094483
fs
http://www.3dmark.com/fs/3094515

Quote:


> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Fickle Pickle*
> 
> Are you planning on replacing the H from your HVAC system in your house? Seems to me with those, you no longer need another unit.
> 
> 
> 
> YES. Brilliant. Now I need to design a geothermal loop... Send that heat 300+ feet down. Wouldn't need any radiators then...
> 
> I have a 360 monsta, 240 monsta, and 360 RX360 all in push pull with AP-15's. I still have plenty more radiators but according to my calculations I will be fine on rad space.

Are you really going to feed volts to it? If you are, I'm not sure that's enough rad space.


----------



## bond32

Quote:


> Originally Posted by *tsm106*
> 
> Ok, had a decent time this morning. Got my 290x back into the hof in some benches. I should run some singles but haven't gotten to that yet. Testing this new lightning. First one I bought. She went right to 1300 and change. Not bad lol.
> 
> fsu
> http://www.3dmark.com/fs/3094249
> fse
> http://www.3dmark.com/fs/3094483
> fs
> http://www.3dmark.com/fs/3094515
> Are you really going to feed volts to it? If you are, I'm not sure that's enough rad space.


Those kept the 3 cards at reasonable temps when I was pushing 1.37-1.39 volts to the three. I did see a vrm hit 70 but the ambient was very warm, around 77-78 F. I am curious to see how what I have does but I have plenty more - 720mm worth of ST30's that are in push pull (no ap-15's though).

I am about 74% sure a breaker is getting thrown, so I'll probably either 1. buy a thick ole' extension cord to reach another circuit for one of the two PSUs, or 2. figure out which breaker feeds the outlet, or 3. go to a 20 amp breaker.
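For anyone doing the same breaker math: here's a rough back-of-the-envelope sketch of whether a rig's wall draw is likely to trip a standard 15 A breaker. All the numbers (per-card wattage, PSU efficiency, 120 V mains) are assumptions for illustration, not measurements from bond32's rig.

```python
# Estimate wall-outlet amp draw for a multi-GPU rig and check it
# against a 15 A breaker. All load figures below are assumptions.

def outlet_amps(dc_load_watts, psu_efficiency=0.90, mains_volts=120.0):
    """Amps drawn at the wall for a given DC load, accounting for PSU losses."""
    wall_watts = dc_load_watts / psu_efficiency
    return wall_watts / mains_volts

# Hypothetical quadfire load: 4 x 350 W (overvolted 290X) + 300 W rest of system
load = 4 * 350 + 300
amps = outlet_amps(load)
print(f"{amps:.1f} A at the wall")  # → 15.7 A at the wall

# Breakers are typically only good for ~80% of their rating as a continuous load
print("likely trips a 15 A breaker?", amps > 15 * 0.8)  # → True
```

Which is why splitting the PSUs across two circuits (as suggested below the quote) is the usual answer rather than hoping a single 15 A circuit holds.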


----------



## tsm106

I use a heavy gauge extension for my 2nd psu.

Off the top of my head, with serious volts which you'd need if you want to break 1300 and lay the smack down, you'll be in the ludicrous speed wattage area. Yea, so two circuits for sure. Get a heavy gauge extension.


----------



## Red1776

Quote:


> Originally Posted by *tsm106*
> 
> I use a heavy gauge extension for my 2nd psu.
> 
> Off the top of my head, with serious volts which you'd need if you want to break 1300 and lay the smack down, you'll be in the ludicrous speed wattage area. Yea, so two circuits for sure. Get a heavy gauge extension.


Yep, I concur.

I got a pair of short 18A extension cords and run off two circuits for my 2200W.

1200W on one, and the other two PSUs (1000W) on another.


----------



## LandonAaron

What is considered the safe temperature range for the VRM?


----------



## Bertovzki

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *ArchieGriffs*
> 
> Tri-x has Hynix.


Quote:


> Originally Posted by *mAs81*
> 
> Vapor-X 290 has Hynix too..


Quote:


> Originally Posted by *taem*
> 
> There's a bios for the Tri-x that lists both Hynix and elpida. Not sure if that means some of the tri X's use elpida.
> 
> As to which oc's better, I've had high clocking elpida cards, and low-clocking Hynix cards. I assume that as a general rule Hynix oc's better since everyone keeps saying it but in my personal experience I've seen no difference.


Quote:


> Originally Posted by *amptechnow*
> 
> my xfx 290 reference with elpida does 1125/1625 no problem. though hynix does seem to overclock better more regularly, elpida cards can hit really good numbers. but it seems less likely according to most peoples experiences.
> 
> like mentioned, if overclocking and going under water, definitely get a reference card!
> 
> msi and xfx have good warranties. i build pc's for a living and use msi and xfx cards all the time. they have good prices usually. i have had no issues with any of them with amd chips. one of my old gaming rigs has 2 msi gtx660's in sli and i have had nothing but trouble with them and have rma'ed soooo many times. but the process is easy and semi-quick. so if something were to happen you know you are covered with those companies and the warranties transfer if you sell the cards.
> 
> personally i recommend msi or xfx. i also like/use powercolor cards. one of my favorite brands has been asus, but i have not been too impressed with their gpu's and feel they are overpriced at times. ive done some builds with gigabyte cards with good results as well. i dont think i have ever used sapphire, his, or visiontek so i cant comment on them.....
> 
> i think it really comes down to personal preference. and with the good old silicon lottery its a toss up. reference cards under water are my personal favorite. youll be fine with msi, xfx, powercolor, or gigabyte.
> 
> or say forget the reference design and get this.... http://www.newegg.com/Product/Product.aspx?Item=N82E16814121904&cm_re=r9_290-_-14-121-904-_-Product lol





Ok, thanks all for the info. I already have my Aquacomputer Hawaii block, so I need a plain reference card, and I have a budget to keep to. Thanks for the link though, amptechnow.


----------



## Paul17041993

I have to ask those with experience: how well does the memory on a 290X go without heatsinks and [light] forced airflow? I might attach an Accelero IV to mine, without using the backside heatsink, and use the GELID 290X expansion kit for the VRMs (+ an extra heatsink for the lowest memory chip). Not sure if it's really worth spending the extra ~50AUD just for the VRAM; I don't intend to majorly overclock either.

Though I might even just get some custom machined blocks made if I can get them cheap...


----------



## Spectre-

Quote:


> Originally Posted by *Paul17041993*
> 
> I have to ask, those with experience that is, how well the memory on a 290X goes without heatsinks and [light] forced airflow? might attach an accelero IV to mine, without using the backside heatsink and using the GELID 290X expansion kit for the VRMs (+ extra heatsink for the lowest memory unit). not sure if its really worth spending the extra ~50AUD just for the VRAM, dont intend to majorly overclock either.
> 
> though I might even just get some custom machined blocks made if I can get them cheap...


i got my 290X with a G10 and i dont have any heatsinks on the memory. i can still do 24/7 1400mhz on memory and 1650mhz when benching


----------



## heroxoot

Quote:


> Originally Posted by *LandonAaron*
> 
> What is considered the safe temperature range for the VRM?


I'd say below 80C. Mine don't pass 70 very often in heavy games, with an OC. The VRMs might be able to go higher, but I just can't stand to see something handling that much power getting too hot.


----------



## battleaxe

Quote:


> Originally Posted by *Paul17041993*
> 
> I have to ask, those with experience that is, how well the memory on a 290X goes without heatsinks and [light] forced airflow? might attach an accelero IV to mine, without using the backside heatsink and using the GELID 290X expansion kit for the VRMs (+ extra heatsink for the lowest memory unit). not sure if its really worth spending the extra ~50AUD just for the VRAM, dont intend to majorly overclock either.
> 
> though I might even just get some custom machined blocks made if I can get them cheap...


How about 40 pcs for $12 ??? These are the RAM coolers and they are dirt cheap.

http://www.ebay.com/itm/141264980186


----------



## Paul17041993

Quote:


> Originally Posted by *Spectre-*
> 
> i got my 290X with a G10 and i dont have any heatsinks on the memory i can still do 24/7 1400mhz on memory and 1650mhz when benching


how hot do they get on each and what are your ambients? do they go unstable if you clock higher than 1400 for a long time?

Quote:


> Originally Posted by *battleaxe*
> 
> How about 40 pcs for $12 ??? These are the RAM coolers and they are dirt cheap.
> 
> http://www.ebay.com/itm/141264980186













didnt think of ebay though, should probably be able to find a similar kit locally...


----------



## Spectre-

Quote:


> Originally Posted by *Paul17041993*
> 
> how hot do they get on each and whats your ambients? they go unstable if you clock higher than 1400 for a long time?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> didnt think of ebay though, should probably be able to find a similar kit locally...


not very hot. ambient is around 36ish and load is probs around 60

1100/1400mhz is my 24/7 stable with no added VDDC

anything above 1530mhz i need to push 150mV+


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> Ok, had a decent time this morning. Got my 290x back into the hof in some benches. I should run some singles but haven't gotten to that yet. Testing this new lightning. First one I bought. She went right to 1300 and change. Not bad lol.
> 
> fsu
> http://www.3dmark.com/fs/3094249
> fse
> http://www.3dmark.com/fs/3094483
> fs
> http://www.3dmark.com/fs/3094515


nice. something i can only dream of. i go crossfire tomorrow.


----------



## battleaxe

Quote:


> Originally Posted by *Paul17041993*
> 
> how hot do they get on each and whats your ambients? they go unstable if you clock higher than 1400 for a long time?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> didnt think of ebay though, should probably be able to find a similar kit locally...


Bummer... sorry bout that....


----------



## boot318

Somebody got a 290X with a 1080p monitor? Just wondering what the benchmark looks like for Tomb Raider 2013 on Ultimate.


----------



## Severon300

Quote:


> Originally Posted by *boot318*
> 
> Somebody got an 290x with an 1080p monitor? Just wondering what the benchmark looks like for Tomb Raider 2013 on Ultimate.


R9 290 Ultimate with TressFx 1080p



R9 290 Ultimate without TressFx 1080p


----------



## th3illusiveman

Quote:


> Originally Posted by *Severon300*
> 
> R9 290 Ultimate with TressFx 1080p
> 
> 
> 
> R9 290 Ultimate without TressFx 1080p


isn't 75 a little low? I get 91 fps at 1100MHz. the delta is too big for just ~300 cores.


----------



## Bertovzki

Quote:


> Originally Posted by *Paul17041993*
> 
> how hot do they get on each and whats your ambients? they go unstable if you clock higher than 1400 for a long time?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> didnt think of ebay though, should probably be able to find a similar kit locally...


Yeah, eBay sellers are useless, too lazy to ship to OZ or NZ. There is something here called YouShop I think, where you order through them and they ship to a warehouse and then on to your home; you may be able to do the same if you don't find anything. I just use FrozenCPU for everything now, as they are better than NZ shops, have what I want, and it gets here in 8 days for less money.


----------



## Paul17041993

Quote:


> Originally Posted by *Bertovzki*
> 
> Yeah ebay are useless too lazy to ship to OZ or NZ, there is something here called youshop I think ? where you can order through them and they ship to a warehouse then to your home , you maybe able to do the same if you dont find something , I just use Frozen CPU for everything now, as they are better than NZ shops ,have what i want and gets here in 8 days, for less money


ok, I had the hilarious idea of using a bunch of ~40mm fansinks (ones meant for old GPUs and chipsets), then I saw your profile pic...

Though not to worry, as I've found various heatsinks I can get; they all involve shipping from the US, UK or China, so I don't think I'll bother anyway. That's assuming I ever actually change the cooling on this card, as I'm quite content with the blower (~40% max fan) and the performance at underclocked settings (750/1000) anyway... (even though it can do 1200/1650 quite easily if cooled well...)


----------



## alancsalt

I use http://www.ustooz.com/


----------



## Bertovzki

Quote:


> Originally Posted by *Paul17041993*
> 
> ok I had the hilarious idea of using a bunch of ~40mm fansinks (ones meant for old GPUs and chipsets), then I saw your profile pic...
> 
> though not to worry as Ive found various heatsinks I can get, though they all involve shipping from US, UK or china so I don't think Ill bother anyway. Assuming I ever actually change the cooling on this card as I'm quite content with the blower ( ~40% max fan) and the performance on underclocked settings (750/1000) anyway... (even though it can do 1200/1650 quite easily if cooled well...)


Some VGA heat sinks at Frozen, on the Air cooling page , and AMD specific ones :

http://www.frozencpu.com/cat/l3/g40/c16/s224/list/p1/Air_Cooling-Chipset_HeatsinksCoolers-BGAVGA_Heatsinks-Page1.html

http://www.frozencpu.com/cat/l3/g40/c21/s224/list/p1/Air_Cooling-VGA_HeatsinksCoolers-BGAVGA_Heatsinks-Page1.html

And you have some anodized colour choice, if you do decide to get some. Have a shop around, fill your cart up lol







lot's of goodies on there


----------



## bond32

Quadfire is up and running as of last night. Works well. Having a challenge though; I just need to test more, but as soon as I add voltage and a small OC, my entire setup freezes after about 10 seconds in game. I need to test each card individually...


----------



## Aaron_Henderson

Sweet setup there *Bond32*...and oh, the joys of watercooled, multi-gpu setups, eh? You'll get it figured out...hope one of your cards isn't a dud overclocker that's dragging the others down.


----------



## battleaxe

Quote:


> Originally Posted by *bond32*
> 
> Quadfire is up and running as of last night. Works well. Having a challenge, just need to test more but as soon as I add voltage and a small OC, my entire setup freezes after about 10 seconds in game. I need to test each card individually...


I love this open rack design man... so beautiful!

Makes me want to try it myself.


----------



## bond32

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> Sweet setup there *Bond32*...and oh, the joys of watercooled, multi-gpu setups, eh? You'll get it figured out...hope one of your cards isn't a dud overclocker that's dragging the others down.


I bought the fourth card from a member on these forums. Even though he claimed it to be hynix, it's elpida. I'm already frustrated but not sending it back at this point. I know my original 3 were awesome clocking cards, all hynix.


----------



## Aaron_Henderson

Quote:


> Originally Posted by *bond32*
> 
> I bought the fourth card from a member on these forums. Even though he claimed it to be hynix, it's elpida. I'm already frustrated but not sending it back at this point. I know my original 3 were awesome clocking cards, all hynix.


Massive bummer dude...sounds like you are after the "perfect" setup, so I get your frustration. But selling/returning the card is always a bother...I'd keep my eyes out for another Hynix memory card, and once you've got it, then worry about selling the dud. If that is indeed what is going on. Never know, with that many cards, there could be lots of issues. Try some different drivers yet? Able to try a different PSU or run a couple cards off a second/dual PSU?


----------



## Buehlar

Quote:


> Originally Posted by *bond32*
> 
> Quadfire is up and running as of last night. Works well. Having a challenge, just need to test more but as soon as I add voltage and a small OC, my entire setup freezes after about 10 seconds in game. I need to test each card individually..


Well duh...your crossfire cable is missing!









Looks insane man! Let us know how well it scales once you get it sorted out.


----------



## bond32

At this point I'm not sure what I will do. Something isn't right... the whole freezing issue. I have tried multiple drivers with no luck; it could be a problem with my board. Will try bone stock tonight, then try each card individually.

Kinda funny: when I had it initially set up, I had the inlet on the left gpu coupling block and the discharge on the right. When it was running, the first and third cards were really hot - 50+ C. Turns out the top two cards are parallel, the bottom two are parallel, and the two parallel circuits are in series with each other. I goof'ed...


----------



## aaroc

Quote:


> Originally Posted by *bond32*
> 
> Quadfire is up and running as of last night. Works well. Having a challenge, just need to test more but as soon as I add voltage and a small OC, my entire setup freezes after about 10 seconds in game. I need to test each card individually...


Do any of your GPUs have coil whine? Is it louder or softer after installing the waterblocks?
Superb rig!


----------



## afokke

I wonder, how much can I expect to sell a used Sapphire Vapor-X R9 290 for? it was $455 when I bought it







and now it's way down to $300 (on Newegg). Searching on eBay, all I see are new ones being sold for outrageous $400-500+ prices.


----------



## bond32

Quote:


> Originally Posted by *aaroc*
> 
> Has some of your GPUS coil wine? Its louder or softer after installing the waterblocks?
> Superb rig!


Not really sure, as the only air cooling I have personally seen on these 290Xs is with reference coolers, whose fan noise totally overtakes anything else. But I am fairly certain the water blocks would help with that. I can hear coil whine, but it is pretty minor compared to some other cards I have heard.


----------



## aaroc

Quote:


> Originally Posted by *bond32*
> 
> Not really sure as the only air cooling I have personally seen on these 290x's are with reference coolers which their fan noise totally overtakes anything else. But I am fairly certain the water blocks would help with that, I can hear coil whine but it is pretty minor compared to some other cards I have heard.


Thanks for the reply. I have 4 R9 290Xs and only one has coil whine. If I'm lucky I will install WBs on them this weekend


----------



## SeanEboy

Definitely hear coil whine in one of my cards as well. I'm willing to bet it has something to do with the lead card, versus the other cards behind/under it. Know what I mean?

Furthermore, some peeps have mentioned that it might be because the card is rendering at a billion fps, despite the fact the screen can only show so many fps. So basically, setting an FPS limit might take care of it. Some people with 290Xs said this worked. Mine is quiet as hell, but I'm watercooled, running 22 AP-15s @ 35%, so I can hear it. It doesn't really bother me, but I'll be messing with the FPS limit thing tonight.
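The FPS-limit idea boils down to sleeping out the unused part of each frame's time budget, so the GPU idles instead of churning out frames the screen can't show (and whining its coils doing it). A minimal sketch of that logic; the function name and numbers here are illustrative, not any particular limiter's actual API:

```python
# Minimal frame-cap sketch: render a frame, then sleep for whatever is
# left of the frame's time budget so the GPU doesn't run uncapped.

import time

def run_capped(render_frame, fps_cap=60, frames=10):
    budget = 1.0 / fps_cap          # seconds allotted per frame
    for _ in range(frames):
        start = time.perf_counter()
        render_frame()              # stand-in for the real render call
        elapsed = time.perf_counter() - start
        if elapsed < budget:        # finished early: idle out the rest
            time.sleep(budget - elapsed)

# A do-nothing "render" still takes ~0.1 s for 10 frames at a 100 fps cap,
# because the limiter sleeps away the unused budget.
t0 = time.perf_counter()
run_capped(lambda: None, fps_cap=100, frames=10)
print(f"10 frames took {time.perf_counter() - t0:.2f} s")
```

Real limiters (driver-level or in-game) do the same thing with higher-precision timers, but the effect is identical: less work per second, less heat, and usually less whine.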


----------



## Ironsmack

@bond32 I'm looking at your bridge... shouldn't you plumb one fitting on one bridge and use another fitting for the other bridge?

I plumbed my GPUs like that by accident and one GPU temp rose to 76+ C before I realized my mistake.


----------



## afokke

Quote:


> Originally Posted by *afokke*
> 
> I wonder, how much can I expect to sell a used Sapphire Vapor-X R9 290 for? it was $455 when I bought it
> 
> 
> 
> 
> 
> 
> 
> and now is way down to $300 (on newegg). searching on ebay, all I see are new ones being sold for outrageous $400-500+ prices.


probably only $200


----------



## Fickle Pickle

Quote:


> Originally Posted by *afokke*
> 
> probably only $200


Welcome to the world of tech.


----------



## afokke

Quote:


> Originally Posted by *Fickle Pickle*
> 
> Welcome to the world of tech.


what


----------



## bond32

Quote:


> Originally Posted by *Ironsmack*
> 
> @bond32 im looking at your bridge... Shouldnt you plumb one fitting on one bridge and use another fitting for the other bridge?
> 
> I plumbed my GPU's like that by accident and one GPU temp rose to 76+ C, before i realize my mistake.


This particular Koolance bridge isn't like normal ones. It only has one setup: the inlet (where I have the feed coming in from the cpu) flows to the top two cards, which are in parallel with each other, then out the left block on the second card into the third card on that left side. The bottom two cards are in parallel with each other too, and both parallel circuits are in series with each other.
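For the curious, the thermal side of that topology can be sketched with the standard coolant temperature-rise formula ΔT = P / (ṁ·c). All the numbers below (per-card heat dump, loop flow rate) are assumptions for illustration, not measurements from this loop:

```python
# Coolant temperature rise through two parallel pairs plumbed in series.
# Numbers are illustrative assumptions, not measurements.

WATER_CP = 4186.0  # J/(kg*K), specific heat of water

def delta_t(watts, flow_lpm):
    """Coolant temp rise (K) for heat load `watts` at total flow `flow_lpm` L/min."""
    kg_per_s = flow_lpm / 60.0  # water is ~1 kg per litre
    return watts / (kg_per_s * WATER_CP)

card_watts = 300.0  # assumed heat dumped by each overclocked 290X
loop_flow = 4.0     # assumed total loop flow, L/min

# Within a parallel pair the flow splits in half, but each branch only
# carries one card's heat; after the branches remix, the outlet has risen
# by the pair's combined heat over the FULL loop flow.
rise_per_pair = delta_t(2 * card_watts, loop_flow)
total_rise = 2 * rise_per_pair  # the two pairs are in series
print(f"rise per pair: {rise_per_pair:.2f} K, total: {total_rise:.2f} K")
# → rise per pair: 2.15 K, total: 4.30 K
```

The takeaway: with decent flow the water only gains a few degrees end to end, which is why the parallel-pairs-in-series layout works fine once it's plumbed the way the bridge expects.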


----------



## Fickle Pickle

Quote:


> Originally Posted by *afokke*
> 
> what


I meant that with technology, items lose value extremely quickly. The only way to not lose ~50% from an item is to buy early and sell early in the lifecycle before the next product meant to replace it is announced.


----------



## Ironsmack

Quote:


> Originally Posted by *bond32*
> 
> This particular Koolance bridge isn't like normal ones. It only has one setup which is the inlet (where I have the inlet coming in from the cpu) flows to the top two cards which are in parallel with each other, then out the left block on the second card into the third card on that left side. The bottom two cards are in parallel with each other two, both parallel circuits are in series with each other.


Ohh cool. Thanks for the clarification!


----------



## bond32

Best run of the night in Firestrike Ultra: 9814 http://www.3dmark.com/3dm/4543467?


----------



## bond32

Well, there doesn't seem to be a way to get over 1.4 volts on the stock bios... It's certainly possible on PT1, but I've noticed that clock for clock, PT1 yields worse results. This could be some sort of issue with trifire/quadfire...


----------



## Cyber Locc

So I think I may have figured something out about my sound problem: what should be the AMD audio driver isn't. It's simply "High Definition Audio", and under properties it says Windows. Anyone know how I could get Windows to stop overwriting the driver, and/or get the driver myself?


----------



## heroxoot

Quote:


> Originally Posted by *cyberlocc*
> 
> So i think i may have figured something out about my sound problem my what should be amd audio driver is not its simply high definition audio and under properties it says windows. Anyone know how i could get windows to stop overwriting the driver and or get the driver myself.


Win+Pause/Break > Advanced system settings > Hardware tab > Device Installation Settings > Let me choose > Never.

If this isn't it then I don't know. This stopped windows from attempting to change my drivers because it thinks it knows more than I do.


----------



## Cyber Locc

Quote:


> Originally Posted by *heroxoot*
> 
> Win+Pause/Break > Advanced system settings > Hardware tab > Device installation settings > Let me choose > Never.
> 
> If this isn't it then I don't know. This stopped windows from attempting to change my drivers because it thinks it knows more than I do.


I seem to have it working now. I don't know how, because I've tried this before, but I left the Microsoft driver in place and disabled it, and it worked. Disabling CrossFire made two entries appear; I disabled both, re-enabled CrossFire, and it's still working. So now I'm sitting at Skyrim's menu screen, listening and waiting to see if it happens again lol


----------



## Spectre-

Quote:


> Originally Posted by *bond32*
> 
> Well, there doesn't seem to be a way to get over 1.4 volts on stock bios... It's certainly possible on pt1 but I've noticed clock for clock pt1 yields worse results. This could be some sort of issues with trifire/quadfire...


my Gigabyte 290X's stock BIOS does 1.42 after drop-off

PT1 in my case turned out to be better clock for clock


----------



## BranField

Finally got my watercooling up and running. Did a quick bench loop this morning, a ~25 min loop of Heaven. Max GPU temp of 38°C. Really happy with that.


----------



## ebhsimon

So my i5 is clearly bottlenecking my 290s in BF3. Would hyperthreading alone fix this problem, or would I need to go for a six-core?


----------



## rdr09

Quote:


> Originally Posted by *ebhsimon*
> 
> So my i5 is clearly bottle-necking my 290s in BF3. Would hyper-threading alone fix this problem or would I need to go for a six core?
> 
> 
> Spoiler: Warning: Spoiler!


this was with my i7 @ 4.5 and 7900 crossfired . . .

HT off



HT on



HT should help with the 290s as well in BF3.


----------



## ebhsimon

Quote:


> Originally Posted by *rdr09*
> 
> this was with my i7 @ 4.5 and 7900 crossfired . . .
> 
> HT off
> 
> 
> 
> HT on
> 
> 
> 
> HT should help with the 290s as well in BF3.


Did your FPS change at all? Do you think HT would be enough to record as well? Even using H.264 encoding in Bandicam, it drops from a 90 fps minimum to a 60 fps minimum. I feel like HT would help relieve that bottleneck a little, but there's only so much 4 cores can do, which is why I might go for a 5930K.


----------



## rdr09

Quote:


> Originally Posted by *ebhsimon*
> 
> Did your FPS change at all? Do you think that HT would be enough to record as well? Even using H.264 encoders in Bandicam it drops from 90 minimum to 60 minimum. I feel like HT would help relieve that bottle neck a little but there's only so much 4 cores can do, which is why I might go for a 5930k.


yes. might not be very scientific but here the gpus were at stock (7950/7970) . . .





the minimum was holding higher with HT on. Huge explosions were where fps dropped with HT off.


----------



## ebhsimon

Quote:


> Originally Posted by *rdr09*
> 
> yes. might not be very scientific but here the gpus were at stock (7950/7970) . . .
> 
> 
> 
> 
> 
> the minimum was holding higher with HT on. Huge explosions were when fps drops with HT off.


I think that's great information rdr. Thanks. Do you think an i7 would be enough to record at the same time while keeping the frame rate, or probably jump to 6 cores?


----------



## pompss

The competition is open. If you guys like mine or other users' builds, go vote. We need more votes.

Thanks

http://www.overclock.net/t/1517172/ocn-mod-of-the-month-october-2014-professional-class-nominations/50#post_23074548


----------



## Dynamo11

Hey guys, just a little update on my borked card I got about 3 weeks ago. I RMA'd it, and it seems XFX agrees that it was faulty when it shipped, and they are sending me a replacement as we speak. This time I'll be sure to check whether there's anything wrong with it as soon as I receive it, but hopefully it will be fine. The next time I post here it should be about the monster overclock I've put on it (here's hoping)


----------



## LandonAaron

I just got my card and CPU liquid cooled with a real custom loop (no more AIOs!). Anyway, now that I have plenty of thermal headroom, I was messing around with Trixx and AB trying to give the card some more voltage to find my highest possible OC. I checked "extend official overclocking limits" and "disable ULPS", and changed unofficial overclocking mode both with and without "PowerPlay support" (not sure what that is). I still wasn't able to add more than +100 mV using AB, but in Trixx I was able to push the voltage up to +200 mV.

Since the afterburner settings didn't seem to be helping anything I went ahead and reset all those back to normal, but I noticed in the monitoring graph my card is running at a constant 0.961 V even when just idling on the desktop.

Is this the normal idle voltage or is my ULPS still disabled even though I reset the setting and rebooted the computer?


----------



## rdr09

Quote:


> Originally Posted by *ebhsimon*
> 
> I think that's great information rdr. Thanks. Do you think an i7 would be enough to record at the same time while keeping the frame rate, or probably jump to 6 cores?


the newer 6 cores are still expensive. i don't record but i read so long as you use a different drive to record, then it should not tax the cpu much. it also depends on what recording app you use. i recommend going for an i7 that will work on your current setup.

anyway, i'm testing my 290s and this is what i get so far at 1200/1400 . . .

http://www.3dmark.com/3dm/4548331?

this weekend it's supposed to freeze in our area, so i'll see what my second card can do. shhhh . . . zero coil whine.

EDIT: Kizwan is right. No need to unplug second card's power when updating driver but i did not try with crossfire enabled. I still disabled crossfire when updating. next beta i will try with crossfire enabled.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> the newer 6 cores are still expensive. i don't record but i read so long as you use a different drive to record, then it should not tax the cpu much. it also depends on what recording app you use. i recommend going for an i7 that will work on your current setup.
> 
> anyway, i'm testing my 290s and this is what i get so far at 1200/1400 . . .
> 
> http://www.3dmark.com/3dm/4548331?
> 
> this weekend its suppose to freeze in our area, so i'll see what my second card can do. shhhh . . . zero coil whine.
> 
> EDIT: Kizwan is right. No need to unplug second card's power when updating driver but i did not try with crossfire enabled. I still disabled crossfire when updating. next beta i will try with crossfire enabled.


What do you get in the normal 1080p benchmark?


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What do u get in the normal 1080p benchmark.


here you go, Zeal. using second card (second slot) as primary right now for testing . . .

http://www.3dmark.com/3dm/4550854?


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> here you go, Zeal. using second card (second slot) as primary rightnow for testing . . .
> 
> http://www.3dmark.com/3dm/4550854?


This is what i am getting in W7.

http://www.3dmark.com/fs/2960425


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> This is what i am getting in W7.
> 
> http://www.3dmark.com/fs/2960425


that's about in line with the difference between the two cards, about 7%. i may need about 1250 on the core to match your score.
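The back-of-envelope math here can be sketched in a few lines; the scores below are made-up placeholders (not the linked runs), and linear clock-to-score scaling is an optimistic assumption:

```python
# Percentage gap between two Firestrike graphics scores, and the core clock
# needed to close it, assuming score scales linearly with clock (it doesn't
# quite, so treat the result as an upper-ish estimate).
def pct_gap(score_low, score_high):
    """Relative gap of score_low below score_high, in percent."""
    return (score_high - score_low) / score_high * 100

def clock_to_match(current_clock_mhz, score_low, score_high):
    """Core clock needed to match score_high under linear scaling."""
    return current_clock_mhz * score_high / score_low

low, high = 18600, 20000                        # placeholder scores
print(round(pct_gap(low, high), 1))             # gap in percent: 7.0
print(round(clock_to_match(1200, low, high)))   # MHz estimate: 1290
```

With these placeholder numbers the gap works out to 7.0% and a ~1290 MHz core to match, which is in the same ballpark as the "about 1250" guess above.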


----------



## amptechnow

my 290s are like 200 points behind your 290Xs, with about 100 MHz less on the core too.
http://www.3dmark.com/fs/2891244


----------



## tsm106

Quote:


> Originally Posted by *amptechnow*
> 
> my 290's are like -200 behing your 290x's. with about 100 less on core too.
> http://www.3dmark.com/fs/2891244










Tess disabled.


----------



## aaroc

I want to share my experience moving from an FX 8350 + R9 290 to an FX 9590 + R9 290X, all stock. In the F1 2012/2013 games it's almost 10 fps more in the Texas circuit benchmark at 1440p.


----------



## Cyber Locc

This is a shame: my first and probably last AMD cards. Good thing I haven't bought blocks; I was getting ready to as well. Might have to sell these and buy 980s. I simply cannot deal with this crackling audio; a week's worth of trying to figure it out and 4 OS installs is getting out of hand.


----------



## SeanEboy

Soo... would a complete system freeze in BF4 and Firestrike mean not enough PSU? How do I troubleshoot the freeze? Turn off a card?


----------



## bond32

Quote:


> Originally Posted by *SeanEboy*
> 
> Soo... would a complete system freeze on BF4, and firestrike mean not enough psu? How do I troubleshoot the freeze? Turn off a card?


Doubtful... Are any other components overclocked?


----------



## SeanEboy

Quote:


> Originally Posted by *bond32*
> 
> Doubtful... Are any other components overclocked?


Nothing. I get the watchdog blue screen error. Nothing is overclocked.


----------



## bond32

Quote:


> Originally Posted by *SeanEboy*
> 
> Nothing. I get the watchdog blue screen error.


Highly doubt that has to do with the video cards. I always get that if something else is unstable. Not saying you're wrong or anything, but that's what I've observed.


----------



## SeanEboy

Quote:


> Originally Posted by *bond32*
> 
> Highly doubt that has to do with the video cards. I always get that if something else is unstable. Not saying you're wrong anything, but that's what I observed


Yeah, well it's my first water build... 3930K, RIVBE, quad 290X, 16 GB Corsair, himmph... Could it perhaps be the RAM? Maybe once I tax the RAM it hangs? Maybe I should try a re-seat? Man, I'd hate to have to re-do the CPU...


----------



## bond32

Quote:


> Originally Posted by *SeanEboy*
> 
> Yeah, well it's my first water build... 3930k, rivbe, quad 290x, 16gb corsair, himmph... Could it perhaps be the ram? Maybe once I tax the ram it hangs? Maybe should I try a re-seat? Man, I'd hate to have to re-do the cpu...


No, I doubt it's anything like that. I'd reset the BIOS to stock and check to see if there are any updates to the BIOS. Also, just for good measure, I'd reinstall the GPU drivers (use DDU to remove the old ones).


----------



## SeanEboy

Quote:


> Originally Posted by *bond32*
> 
> No, I doubt it's anything like that. I'd reset the BIOS to stock and check to see if there are any updates to the BIOS. Also, just for good measure, I'd reinstall the GPU drivers (use DDU to remove the old ones).


THIS.. See, this is what I need.. DDU, I need to learn this DDU.. ;c)

DDU, and are there any drivers I should aim for? Non-beta, regular drivers? Bios was never moved, I haven't even started futzing yet!


----------



## bond32

Quote:


> Originally Posted by *SeanEboy*
> 
> THIS.. See, this is what I need.. DDU, I need to learn this DDU.. ;c)
> 
> DDU, and are there any drivers I should aim for? Non-beta, regular drivers? Bios was never moved, I haven't even started futzing yet!


I'm trying to figure the drivers out... I just tested the non-beta vs beta, long story short. I think the beta yielded better results in firestrike ultra


----------



## pdasterly

The watchdog error is one of your CPU cores stopping. I get this error on a heavy overclock when running the RealBench benchmark.


----------



## SeanEboy

Quote:


> Originally Posted by *bond32*
> 
> I'm trying to figure the drivers out... I just tested the non-beta vs beta, long story short. I think the beta yielded better results in firestrike ultra


At least you hope it would... But, if I'm testing stability, perhaps I should save the beta for end game?

Uhh, I'm trying to figure out how to do the bios.. EZ flash I guess? I haven't done this in 15 years.


----------



## SeanEboy

Quote:


> Originally Posted by *pdasterly*
> 
> watchdog error is one of your cpu cores stopping, I get this error on a heavy overclock when running realbench benchmark


Yeah, I did a little googling and kind of got that feeling, but still not sure. So, would that be attributed to a bunk/overdrawn psu?


----------



## tsm106

Quote:


> Originally Posted by *SeanEboy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pdasterly*
> 
> watchdog error is one of your cpu cores stopping, I get this error on a heavy overclock when running realbench benchmark
> 
> 
> 
> Yeah, I did a little googling and kind of got that feeling, but still not sure. *So, would that be attributed to a bunk/overdrawn psu*?
Click to expand...

No, your setup/x is unstable.


----------



## pdasterly

Unstable overclock. You can try to narrow down which core is failing by setting affinity to a single core in Task Manager and running the benchmark (or whatever repeatably triggers your error) on each core, to find which one needs to be downclocked.
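For what it's worth, that per-core pinning can also be scripted instead of clicking through Task Manager; this sketch assumes Windows' `start /affinity`, which takes a hex bitmask where bit N selects logical core N (the `realbench.exe` name is just a placeholder):

```python
# Build the hex affinity masks used by `start /affinity` to pin a benchmark
# to one logical core at a time, so a failing core can be isolated.
def affinity_mask(cores):
    """Bitmask for `start /affinity` from a list of logical core indices."""
    return sum(1 << c for c in set(cores))

# One command per core on a 4-core chip:
for core in range(4):
    mask = affinity_mask([core])
    print(f"start /affinity {mask:#x} realbench.exe  :: core {core}")
```

Running each command in turn and noting which core reproduces the watchdog error tells you which one needs less clock or more voltage.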


----------



## SeanEboy

Well, that's just the problem... I didn't have anything overclocked..


----------



## pdasterly

Quote:


> Originally Posted by *SeanEboy*
> 
> Well, that's just the problem... I didn't have anything overclocked..


Download RealBench and run the benchmark. I agree you have a problem. I only get that error when I push my chip past 5 GHz.


----------



## tsm106

Quote:


> Originally Posted by *SeanEboy*
> 
> Well, that's just the problem... I didn't have anything overclocked..


You don't have to be oc'd to be unstable.


----------



## SeanEboy

Quote:


> Originally Posted by *pdasterly*
> 
> download realbench and run benchmark. I agree you have a problem. I only get that error when I push my chip past 5ghz


Cool, thanks, downloading now. BIOS is done.
Quote:


> Originally Posted by *tsm106*
> 
> You don't have to be oc'd to be unstable.


Gotcha. Actually, there could be an OC going on via that Asus bios 'optimized defaults' couldn't there be...?


----------



## tsm106

Also, CPUs and GPUs degrade over time. Sometimes you will find that settings you once ran stable are no longer stable. If you didn't realize it had degraded, it would be like looking for a needle in a haystack, searching every which way without luck. It's something to keep in mind so you don't go off on wild goose chases; confirm system stability at stock before moving on.


----------



## SeanEboy

Yeah, well I'm hoping it's not the cpu, I bought it used awhile back, and probably don't even have much time on it at all. Guy seemed straight up, I have a feeling bios/drivers/etc will clean it up. I was pretty anal about applying thermal compound and what not...

Looks like realbench is going to take a half hour to download.. Might be bedtime, fishing in the morning. Thanks for the help/encouragement guys.. Chat tomorrow.


----------



## pdasterly

asus servers are slow


----------



## kizwan

Quote:


> Originally Posted by *SeanEboy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Doubtful... Are any other components overclocked?
> 
> 
> 
> Nothing. I get the watchdog blue screen error. Nothing is overclocked.
Click to expand...

Are you using Windows 8?
Quote:


> Originally Posted by *cyberlocc*
> 
> This is a shame my first and probally last amd cards. Good thing I have havent bought blocks I was getting ready too as well. Mighthave to sell these and huy 980s I sinply cannot deal with this craxkling audio and a weeks worth of trying to figure it out and 4 os installs this is getting out of hand.


Crackling audio usually means interference. I don't know your computer specs, or whether you're using onboard audio or a standalone sound card.


----------



## Cyber Locc

Quote:


> Originally Posted by *kizwan*
> 
> Are you using Windows 8?
> Crackling audio usually means interference. I don't know your computer specs, or whether you're using onboard audio or a standalone sound card.


Crackling audio means interference or latency; in my case it's latency from the DirectX kernel. It has nothing to do with my rig, it's AMD's driver issue. They acknowledged that in January, and recently it's been acknowledged that the issue wasn't fixed for everyone.

It only happens when CrossFire is enabled and vsync is on; that's about as far from a hardware issue as you can get.

And I'm not trying to be mean or anything with the above; typically I would say the same as you did. Sadly, that's not the case here. If it were, it would be fixable by me, so I wish it was. At this point I have three options:

1. Deal with it. (Not going to happen; it's a horrid sound.)

2. Play with vsync off or CrossFire disabled (so deal with stutter, or play with only half of my cards; both of those suck).

Note: Either of those is just until a fix comes, if ever (they said they fixed this in March, which for most people they did).

3. Sell my cards, take a major hit money-wise, and buy 980s.

So you see, I wish it was interference; all my options suck.


----------



## LandonAaron

What voltage do your cards run at during idle/desktop? Mine is at a constant 0.961 which seems a bit high.


----------



## pdasterly

Quote:


> Originally Posted by *cyberlocc*
> 
> Crackling audio means interference or latency; in my case it's latency from the DirectX kernel. It has nothing to do with my rig, it's AMD's driver issue. They acknowledged that in January, and recently it's been acknowledged that the issue wasn't fixed for everyone.
> 
> It only happens when CrossFire is enabled and vsync is on; that's about as far from a hardware issue as you can get.
> 
> And I'm not trying to be mean or anything with the above; typically I would say the same as you did. Sadly, that's not the case here. If it were, it would be fixable by me, so I wish it was. At this point I have three options:
> 
> 1. Deal with it. (Not going to happen; it's a horrid sound.)
> 
> 2. Play with vsync off or CrossFire disabled (so deal with stutter, or play with only half of my cards; both of those suck).
> 
> Note: Either of those is just until a fix comes, if ever (they said they fixed this in March, which for most people they did).
> 
> 3. Sell my cards, take a major hit money-wise, and buy 980s.
> 
> So you see, I wish it was interference; all my options suck.


are you using onboard sound or sound from gpu card?


----------



## Cyber Locc

Quote:


> Originally Posted by *pdasterly*
> 
> are you using onboard sound or sound from gpu card?


The only audio I haven't tried, I should say, is from the cards themselves; I don't have HDMI speakers or a stereo. I have used my RIVBE onboard, which is pretty good as onboard audio goes. I have tried a buddy's Xonar and another buddy's SoundBlaster sound cards, and my Corsair 2100s and Astro A40s, which both use USB sound. I have also tried a friend's DAC stack via S/PDIF, and all of them have the same issue. I have reinstalled drivers maybe 50 times now, and Windows 4 times (one time I actually used the Windows 8.1 trial). I have updated all drivers to no avail, and even bought Driver Fusion to make sure they were all up to date, and through it all it crackles. There have been a few times where I thought it was fixed, when it went 20-30 minutes without doing it, but sadly it always comes back.

There was that ray of hope on another forum, but aside from that I have given up. I also finally got a reply from AMD tech support a little while ago telling me to try installing 14.9.2 again (I have done so about 4 times now, which makes 5, along with every other driver on TechPowerUp and Guru3D since 13.12).

This has been a 3-week ordeal (since I entered CrossFire sound hell, i.e. got my second card). At first I thought, like kizwan, that it was interference, and later found out that it was the cards. I have done everything aside from pulling my own hair out, which is next on my list. There is also another thread here on OCN with a few people having the same issue; if you Google it you will find quite a few of us, actually, but apparently not enough for AMD to do anything about it.

I am open to suggestions for other things to try, though. I like my hair where it is.


----------



## pdasterly

Quote:


> Originally Posted by *cyberlocc*
> 
> The only audio I haven't tried, I should say, is from the cards themselves; I don't have HDMI speakers or a stereo. I have used my RIVBE onboard, which is pretty good as onboard audio goes. I have tried a buddy's Xonar and another buddy's SoundBlaster sound cards, and my Corsair 2100s and Astro A40s, which both use USB sound. I have also tried a friend's DAC stack via S/PDIF, and all of them have the same issue. I have reinstalled drivers maybe 50 times now, and Windows 4 times (one time I actually used the Windows 8.1 trial). I have updated all drivers to no avail, and even bought Driver Fusion to make sure they were all up to date, and through it all it crackles. There have been a few times where I thought it was fixed, when it went 20-30 minutes without doing it, but sadly it always comes back.


Try posting over at rog forum


----------



## Cyber Locc

Quote:


> Originally Posted by *pdasterly*
> 
> Try posting over at rog forum


Already been done, and it has nothing to do with ROG anyway, so they're no more help than anyone else. I don't think there is a forum in existence where you can't find this issue.
http://www.overclock.net/t/1516541/sound-crackling-crossfire-only is a thread from another guy in CrossFire audio hell, and he has an MSI board. Sadly it doesn't discriminate. I have also seen reports of people RMAing cards, which still didn't fix the problem. Why some are affected and others aren't, you've got me. Horrid luck, I guess.

I assume it's a particular BIOS or card model that is affected, and maybe the few I have seen report RMAing either got the same card back or one from the same batch, but IDK.

The only silver lining I have found is this:
Quote:

OP: "For some reason in games I am getting insane audio stutter in games when Vsync is on, but when it's off, it's fine, tried all drivers including 14.4, 14.6, 14.7 but it still does it.

What is wrong?

Thanks!"

AMD REP: "Sorry to hear of the issue you're having.

What resolution are you using out of interest?

I've actually raised this bug myself with the driver team a few days back. Since then they've reproduced it and are working on a fix. As soon as i have more information on this i will be back to update the thread. Hang tight, a fix is coming in a future driver release.

I can offer this as a possible, temporary fix in the meantime. Please try running dual card crossfire for now, as in my screenshot below. So just disable the third card in CCC, like I've done in the picture below using the CrossfireX tab. Note i have Quad Crossfire so just ignore the 4th card option." (The guy only has 2 cards; why the rep thought 3, neither he nor I know lol.)

That was on 9/20/14, and there's been no update to the thread since. He said later (that day, but further down the thread) that a fix was being worked on, that it was top priority, and that he would update the thread, but he hasn't.


----------



## pdasterly

it's not ROG-specific, but there are some very good Asus techs on that forum


----------



## Cyber Locc

Quote:


> Originally Posted by *pdasterly*
> 
> its not rog but there are some very good asus techs on forum


I know there are, and it's already been posted there, and they said the same thing I'm saying: no one but AMD can fix AMD issues, as we don't know what the issue in their drivers is. All the posts about it on the AMD forums (there are a lot) have been neglected. They did post about it once way back in February saying a fix was coming, and it did, and for a lot of people it worked, but for some it didn't. Believe me, this has consumed me lol; at work, when I'm not busy, I'm thinking about it and reading up on what I'll try next when I get home, and I've spent all my free time on this.


----------



## rdr09

Quote:


> Originally Posted by *cyberlocc*
> 
> I know there are, and it's already been posted there, and they said the same thing I'm saying: no one but AMD can fix AMD issues, as we don't know what the issue in their drivers is. All the posts about it on the AMD forums (there are a lot) have been neglected. They did post about it once way back in February saying a fix was coming, and it did, and for a lot of people it worked, but for some it didn't. Believe me, this has consumed me lol; at work, when I'm not busy, I'm thinking about it and reading up on what I'll try next when I get home, and I've spent all my free time on this.


i just tried BF4 with vsync on (normally off), and i am also using onboard sound, but did not hear any crackling. i used headphones for the test to make sure. i know yesterday when i switched HDDs (i have 2 imaged, but with different drivers), at first windows did not recognize my secondary card and tried to install the amd high definition audio device, prolly cause my windows update is set to auto. i tried to stop it but couldn't, so i let it be. thought it would mess up my sound - it didn't.

maybe your sound driver for the motherboard is conflicting with amd's. i recommend turning off windows update, or at least not letting your rig update drivers by itself, then reinstalling the driver.

i have a cheap biostar Z77 motherboard that uses THX, which i normally hook up to 5.1 speakers.

edit: what are your specs? Rigbuilder is at the upper right corner of the page. and why would it stutter with vsync off? which game?


----------



## lyxthe

Hi, I hope i'm asking my question on the right place.

I have a strange bug on my new configuration that I bought some times ago :

Sapphire R9 290X Vapor-X
Intel i7 4790K
16 GB RAM
Asus Z97-A motherboard
Crucial MX100 SSD
OCZ 700 W PSU
1.5 TB HDD
Windows 8.1 Pro (official licence)

When I am in a game (any game; I tested it with The Witcher 2, Tomb Raider...), if I go back to the desktop (whether intentionally, via Alt-Tab or the Windows key, or not), I can't get back into the game. When I try, the game process has been killed and I have to relaunch it.

I play in fullscreen.

In addition, I have a secondary screen in extended desktop that stays black while I am playing fullscreen (which is completely normal). The second problem is that when I play with mouse and keyboard, whenever I move the mouse to the right and click, I am ejected from the game and sent back to the desktop, as if the game lost focus in Windows because of the click. This is not normal behavior; games know how to handle clicks in fullscreen without the secondary screen capturing them. This issue is less problematic than the first one, since I only have to change my configuration before the game launches to avoid it. It is really annoying though.

Would you have an explanation for both of these issues?

When I play, I have Steam and AMD Gaming Evolved running; could the problem come from one of them?

I don't know if it is software or hardware, but I don't think it comes from the games, as it happens with several games.

Thanks for your attention, and sorry for my bad English (I'm French).

lyxthe


----------



## aaroc

cyberlocc, a few years ago your problem would have pointed to an IRQ conflict. maybe you can check that.

lyxthe, do you have TeamViewer installed on your PC? I had similar problems, so I kill TV before playing.


----------



## Cyber Locc

Quote:


> Originally Posted by *rdr09*
> 
> i just tried BF4 with vsync on (normally off) and i am also using on board sound but did not hear any cracking. i used a headphones for the test to make sure. i know yesterday when i switched HDDs (i have 2 imaged but different drivers) at first windows did not recognize my secondary card and tried to install amd high definition device prolly cause my windows update is set to auto. i tried to stop but can't, so i let it be. thought it would mess up my sound - it didn't.
> 
> maybe your sound driver for the motherboard is conflicting with amd's. i recommend turning off windows update or don't let your sig update drivers itself and reinstall driver.
> 
> i have a cheap biostar Z77 motherboard that uses THX, which i normally hooked up to a 5.1 speaker.
> 
> edit: what are your specs? Rigbuilder is at the upper right corner of page. why would stutter with vsync off? which game?


I have uninstalled my onboard audio and disabled it in the BIOS before; it did nothing. Also, that would only explain the issue with onboard audio, and the issue happens with a lot of audio sources. And if it were two drivers not playing nice, it would happen all the time, not just in a game with vsync on.

My rig is on my profile, just not in my sig.

What games do I play? Umm, quite a few, and they all stutter when vsync is off. That's no fault of the cards; I have a 60 Hz monitor, it can't handle 100 fps. Also, some games don't let you disable vsync, such as Skyrim.

But really, in the end that's okay, as my plan is to go to 4K, where I won't need vsync, and 1 card gets me 60 right now on max settings in just about all games.
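A quick sketch of the frame-timing arithmetic behind the 60 Hz point (the numbers are illustrative):

```python
# Why a 60 Hz panel "can't handle" 100 fps: the display refreshes every
# 1000/60 ms, so frames rendered faster than that either tear (vsync off)
# or get held back to the refresh rate (vsync on).
REFRESH_HZ = 60
refresh_interval_ms = 1000 / REFRESH_HZ        # ~16.7 ms per refresh

def displayed_fps(rendered_fps, vsync=True):
    """With vsync on, output is capped at the refresh rate; off, frames tear."""
    return min(rendered_fps, REFRESH_HZ) if vsync else rendered_fps

print(round(refresh_interval_ms, 1))    # 16.7
print(displayed_fps(100))               # 60  (vsync on)
print(displayed_fps(100, vsync=False))  # 100 (but frames tear)
```

This is also why the cap disappears on a higher-refresh or higher-resolution target: at 4K the cards render under the refresh rate anyway, so vsync has nothing to hold back.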


----------



## Cyber Locc

Quote:


> Originally Posted by *aaroc*
> 
> cyberlooc, a few years before your problem would represent an IRQ conflict. maybe you can check that.
> 
> lyxthe, do you have teamviewer installed on your PC?I had similar problems, so I kill TV before playing.


That is what it is: audio latency from the DirectX kernel, caused by something in AMD's driver conflicting with a driver I have, or something.


----------



## MojoW

Quote:


> Originally Posted by *aaroc*
> 
> cyberlooc, a few years before your problem would represent an IRQ conflict. maybe you can check that.
> 
> lyxthe, do you have teamviewer installed on your PC?I had similar problems, so I kill TV before playing.


That TeamViewer service is a resource hog; you should disable it from running in the background. It starts with Windows, and it should just stay disabled until you actually start TeamViewer.


----------



## lyxthe

Nope, sorry, I don't have TeamViewer installed, so that can't be the issue... A friend of mine told me that it could come from a bad DirectX installation. What would you think about that?


----------



## LandonAaron

Can someone please tell me what their GPU voltage is while on the desktop/idle?


----------



## chiknnwatrmln

My two reference cards are at .953 and .961v idle at stock speeds.


----------



## battleaxe

Quote:


> Originally Posted by *LandonAaron*
> 
> Can someone please tell me what their GPU voltage is while on the desktop/idle?


According to GPU-Z I am seeing .984v


----------



## taem

Quote:


> Originally Posted by *LandonAaron*
> 
> Can someone please tell me what their GPU voltage is while on the desktop/idle?


Quote:


> Originally Posted by *battleaxe*
> 
> According to GPU-Z I am seeing .984v


That is, afaik, the idle vddc for reference cards. My card is a Powercolor PCS+ 290 (BIOS is +50 mV by default) flashed to a Tri-X BIOS (reference AMD voltage), and I also get 0.984 V. Is your ULPS disabled?



Quote:


> Originally Posted by *chiknnwatrmln*
> 
> My two reference cards are at .953 and .961v idle at stock speeds.


That's odd. What are your cards? Is that ULPS state?


----------



## battleaxe

Quote:


> Originally Posted by *taem*
> 
> That is afaik the idle vddc for reference. My card is a Powercolor PCS+ 290 (bios is +50mV by default) flashed to a Tri-X bios (reference amd voltage) and I also get 0.984v. Is your ULPS disabled?
> 
> 
> That's odd. What are your cards? Is that ULPS state?


My ULPS is not disabled right now. My second 290 is away on RMA. Once its back it will go back to ULPS "off".


----------



## rdr09

Quote:


> Originally Posted by *cyberlocc*
> 
> That is what it is an audio latency from the direct X kernal caused by something in amds driver conflicting woth a driver I have or something


you can try the way i uninstall the old driver using . . .

1. the uninstall option in the new (newer) driver and Express method.



2. Reboot

3. Install using Custom, then uncheck HDMI Audio Driver



Now, that is not exactly the way i do it. i simply use Express for both. ALSO, make sure Windows will not search and install a driver.


----------



## Cyber Locc

Quote:


> Originally Posted by *rdr09*
> 
> you can try the way i uninstall the old driver using . . .
> 
> 1. the uninstall option in the new (newer) driver and Express method.
> 
> 
> 
> 2. Reboot
> 
> 3. Install using Custom, then uncheck HDMI Audio Driver
> 
> 
> 
> Now, that is not exactly the way i do it. i simply use Express for both. ALSO, make sure Windows will not search and install a driver.


I have tried that. The problem is that whether you install the HDMI audio driver or not, Microsoft installs one anyway. You can disable it, but even then the issue is still there.

You say to make sure Windows doesn't install the driver, but how do you do that? The HDMI audio is plug and play; it doesn't search, it just installs the second the PC turns on. Even with the ethernet unplugged, Windows gives it a generic plug-and-play HDMI audio driver.

This is why I said I really hate to do it, especially as I already got backplates. But at some point I am going to have to consider switching to the green team. My first foray into AMD GPUs isn't turning out as great as I hoped it would.


----------



## taem

Well what I find odd about chiknnwtrmln's vddc is that I've directly observed 4 different 290s, my Tri-X and Powercolor and my buddy's pair of MSI Gaming. At reference volts the idle vddc is exactly 0.984 on all 4 cards. Always. There are slight fluctuations in the stuff like vddc current/power in/out, but on all 4 of these cards, the 12v, vddc, and vddci don't fluctuate at idle.


----------



## Buehlar

Quote:


> Originally Posted by *LandonAaron*
> 
> Can someone please tell me what their GPU voltage is while on the desktop/idle?


ASUS 290X DCII OC

Voltage for both are almost identical. @ Stock OC 1050/1350. ULPS off.


----------



## chiknnwatrmln

I have ULPS disabled. I think that different chips need different voltages, similar to how if you run 5 cards at, say 1150 MHz you will need varying voltages for each one. It would only make sense that this would also apply to stock and 2D speeds.

My idle temps aren't crazy low or anything either, it's about 30c right now after sitting for a little bit. In my experience watercooling doesn't idle as low as air cooling does.


----------



## taem

Quote:


> Originally Posted by *Buehlar*
> 
> ASUS 290X DCII OC
> 
> Voltage for both are almost identical. @ Stock OC 1050/1350. ULPS off.


Can you check with GPU-Z? HWInfo numbers are always slightly different from everything else. But your idle vddc is also lower than .984. Hmm. Edit: oh wait, you have 290Xs.


----------



## Buehlar

Quote:


> Originally Posted by *taem*
> 
> Can you check with GPU-Z? HWInfo numbers are always slightly different from everything else. But your idle vddc is also lower than .984. Hmm. Edit: oh wait, you have 290Xs.


----------



## taem

Hmm. So why do some of the cards have .961 idle vddc and others .984? Can't be ASIC; the 4 cards I know personally have ASIC quality ranging from low to high, and all 4 are .984 idle vddc.


----------



## taem

Quote:


> Originally Posted by *amptechnow*
> 
> my 290's are like -200 behind your 290x's, with about 100 less on core too.
> http://www.3dmark.com/fs/2891244


Dang, how are you guys getting such huge graphics scores? On a single 290 @ 1070 I get about 12000. On the highest OC I've tried (1170, I think? I hit a thermal limit at that point) I get about a 13000 graphics score. You guys are getting more than double that!! That's crazy, because when I crossfire I don't get anywhere near double the performance; I net about +60-70% in the scores and in-game framerate.


----------



## chiknnwatrmln

ASIC quality is a useless number that means nothing. I would wager that it's just because different chips operate at different voltages, similar to how some chips can be clocked higher than others on the same voltage.


----------



## rdr09

Quote:


> Originally Posted by *cyberlocc*
> 
> I have tried that. The problem is that whether you install the HDMI audio driver or not, Microsoft installs one anyway. You can disable it, but even then the issue is still there.
> 
> You say to make sure Windows doesn't install the driver, but how do you do that? The HDMI audio is plug and play; it doesn't search, it just installs the second the PC turns on. Even with the ethernet unplugged, Windows gives it a generic plug-and-play HDMI audio driver.
> 
> This is why I said I really hate to do it, especially as I already got backplates. But at some point I am going to have to consider switching to the green team. My first foray into AMD GPUs isn't turning out as great as I hoped it would.


i would switch too if i had that type of issue. not sure if this would help coz, like i said, i have no issues with sound with the way i install drivers.



___________________________________________________

@taem, like tsm said, amp's tess is off in that run.


----------



## taem

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> ASIC quality is a useless number that means nothing. I would wager that it's just because different chips operate at different voltages, similar to how some chips can be clocked higher than others on the same voltage.


But all 4 cards I know of have the same 0.984 vddc idle (and that other poster as well). And you and Buehlar have the same 0.961 vddc idle. So it's not variance so much as two different idle voltages. Wonder what puts a GPU into one category or the other?


----------



## chiknnwatrmln

They might be binned or something; I'm not exactly sure. What I do know is that the voltage goes in certain increments, so if a chip requires, say, .980v then it will be fed .984v. My 290 (the one that runs at .961v) is a decently good OCer; it was able to hit 1245MHz on something like +200mV.
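That increment behaviour can be sketched as a simple lookup. The VID table below is hypothetical, built only from the idle voltages people have reported in this thread; the real regulator exposes many more steps than this:

```python
import bisect

# Hypothetical VID table, built only from idle voltages reported in this thread;
# the real regulator has far more steps than this.
VID_STEPS = [0.953, 0.961, 0.969, 0.984, 1.000, 1.031]

def fed_voltage(requested_v: float) -> float:
    """Return the smallest VID step at or above the requested voltage."""
    i = bisect.bisect_left(VID_STEPS, requested_v)
    if i == len(VID_STEPS):
        raise ValueError("requested voltage above the table")
    return VID_STEPS[i]

print(fed_voltage(0.980))  # a chip asking for .980v gets fed .984v
```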

As for ASIC quality, here's a quote from AMD spokesperson Dave Baumann:
Quote:


> ASIC "quality" is a misnomer propagated by GPU-z reading a register and not really knowing what it results in. It is even more meaningless with the binning mechanism of Hawaii.


----------



## LandonAaron

Quote:


> Originally Posted by *rdr09*
> 
> i would switch too if i had that type of issue. not sure if this would help coz, like i said, i have no issues with sound with the way i install drivers.
> 
> 
> 
> ___________________________________________________
> 
> @taem, like tsm said, amp's tess is off in that run.


He can also try the "Prevent Installation of Devices not Described by Other Policy Settings" setting, but in my experience this will sometimes prevent any driver from being installed. It's good for cases where you have a driver you want to uninstall and it keeps auto-installing immediately after it's been uninstalled. http://www.skidzopedia.com/2012/05/28/how-to-disable-automatic-driver-installation-in-windows-7-vista/
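For reference, the same thing can also be reached without group policy through the Device Installation Settings dialog, which maps to the SearchOrderConfig registry value (0 is commonly reported to stop Windows searching Windows Update for drivers). A .reg sketch, assuming Windows 7; later versions may handle this differently:

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\DriverSearching]
"SearchOrderConfig"=dword:00000000
```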


----------



## Cyber Locc

Quote:


> Originally Posted by *LandonAaron*
> 
> He can also try the "Prevent Installation of Devices not Described by Other Policy Settings" setting, but in my experience this will sometimes prevent any driver from being installed. It's good for cases where you have a driver you want to uninstall and it keeps auto-installing immediately after it's been uninstalled. http://www.skidzopedia.com/2012/05/28/how-to-disable-automatic-driver-installation-in-windows-7-vista/


I will give that a shot tonight. However, now that I'm thinking about it more and more, I wonder if that's even the issue. Myself and others that have this problem have been assuming it was. What actually presents is that LatencyMon and DPC Latency Checker show dxgkrnl.sys causing serious IRQ and sound latency, so that seems like a different kind of problem to me.

It would seem that it's something else in the drivers that DirectX doesn't like. How are you guys installing your DirectX, from Microsoft's site?

Also, if anyone has a chance, check your drivers: under System Devices there are HD Audio Controllers that run the HD audio. Mine are just that, "HD Audio Controller", and the driver info shows Microsoft generic; nothing in there is AMD at all.


----------



## kizwan

Quote:


> Originally Posted by *taem*
> 
> Quote:
> 
> 
> 
> Originally Posted by *amptechnow*
> 
> my 290's are like -200 behind your 290x's, with about 100 less on core too.
> http://www.3dmark.com/fs/2891244
> 
> 
> 
> Dang, how are you guys getting such huge graphics scores? On a single 290 @ 1070 I get about 12000. On the highest OC I've tried (1170, I think? I hit a thermal limit at that point) I get about a 13000 graphics score. You guys are getting more than double that!! That's crazy, because when I crossfire I don't get anywhere near double the performance; I net about +60-70% in the scores and in-game framerate.

Tessellation OFF.


----------



## LandonAaron

Quote:


> Originally Posted by *cyberlocc*
> 
> I will give that a shot tonight. However, now that I'm thinking about it more and more, I wonder if that's even the issue. Myself and others that have this problem have been assuming it was. What actually presents is that LatencyMon and DPC Latency Checker show dxgkrnl.sys causing serious IRQ and sound latency, so that seems like a different kind of problem to me.
> 
> It would seem that it's something else in the drivers that DirectX doesn't like. How are you guys installing your DirectX, from Microsoft's site?
> 
> Also, if anyone has a chance, check your drivers: under System Devices there are HD Audio Controllers that run the HD audio. Mine are just that, "HD Audio Controller", and the driver info shows Microsoft generic; nothing in there is AMD at all.


I have never actually installed Direct X independently from Games. Usually whenever I install a game it will update Direct X and that has always worked fine for me. Here is what my AMD HD Audio drivers look like:


----------



## LandonAaron

Thanks for the replies regarding idle voltage. I was afraid my ULPS wasn't working, but it seems it is. I also have 0.961 idle voltage. It is a reference card. I wonder if those with the .984 voltages are non-reference cards with factory OCs, perhaps.


----------



## Cyber Locc

Quote:


> Originally Posted by *LandonAaron*
> 
> I have never actually installed Direct X independently from Games. Usually whenever I install a game it will update Direct X and that has always worked fine for me. Here is what my AMD HD Audio drivers look like:


Is that the ones that are in sound and game controllers or the ones in system devices?


----------



## kizwan

Quote:


> Originally Posted by *LandonAaron*
> 
> Thanks for the replies regarding idle voltage. I was afraid my ULPS wasn't working, but it seems it is. I also have 0.961 idle voltage. It is a reference card. I wonder if those with the .984 voltages are non-reference cards with factory OCs, perhaps.


ULPS is only available with Crossfire. A single card doesn't use ULPS.
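For anyone wondering how ULPS actually gets toggled: tools like MSI Afterburner just flip the EnableUlps registry value under each display adapter's class subkey. A .reg sketch of the change; the 0000 subkey number is an example, it varies per system, and every Radeon subkey under that class key needs the same edit:

```
Windows Registry Editor Version 5.00

; Example adapter subkey - find your own under this class key
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```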


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> This is what i am getting in W7.
> 
> http://www.3dmark.com/fs/2960425


may need a bit more . . .

http://www.3dmark.com/3dm/4562432?

that's at +160 already.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> This is what i am getting in W7.
> 
> http://www.3dmark.com/fs/2960425
> 
> 
> 
> may need a bit more . . .
> 
> http://www.3dmark.com/3dm/4562432?
> 
> that's at +160 already.

Push it moar?

http://www.3dmark.com/fs/2519782


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> Push it moar?
> 
> http://www.3dmark.com/fs/2519782


may need PT1 bios. lol

here is FSU at same clocks . . .

http://www.3dmark.com/3dm/4564345?


----------



## taem

Quote:


> Originally Posted by *kizwan*
> 
> Tessellation OFF.


With tess off my score jumps by +1000 (from 11k to 12k). This is at stock volts with a mild OC just on the core, to 1070, so I could boost that by a few thousand if I up the volts and clocks. But what I'm wondering about is the boost the second 290/X gets you.

You guys getting 26k GPU score on Firestrike, what would you guess that score would be with a single GPU?


----------



## Buehlar

Quote:


> Originally Posted by *LandonAaron*
> 
> Thanks for the replies regarding idle voltage. I was afraid my ULPS wasn't working, but it seems it is. I also have 0.961 idle voltage. It is a reference card. I wonder if those with the .984 voltages are non-reference cards with factory OCs, perhaps.


Mine idles @ 0.961v, non-ref (ASUS DCII OC) with a factory OC.


----------



## rdr09

Quote:


> Originally Posted by *taem*
> 
> With tess off my score jumps by +1000 (from 11k to 12k). This is at stock volts with a mild OC just on the core, to 1070, so I could boost that by a few thousand if I up the volts and clocks. But what I'm wondering about is the boost the second 290/X gets you.
> 
> You guys getting 26k GPU score on Firestrike, what would you guess that score would be with a single GPU?


here is FS @ 1260 . . .

http://www.3dmark.com/3dm/4163936?


----------



## taem

Quote:


> Originally Posted by *rdr09*
> 
> here is FS @ 1260 . . .
> 
> http://www.3dmark.com/3dm/4163936?


So 13621 GPU score @ 1260. If that's with tess off it's in line with what I'm getting, I suppose: 12k @ 1070 and 1350 mem.

But again -- how are you doubling your score in crossfire?? I'm not running xfire atm but iirc I didn't get anywhere near double performance on benches or games when I did. Am I cpu bottlenecking with a 4670k @ 4.7? Anandtech and other sources say a 4670k will not bottleneck 290 xfire but several savvy users here have said otherwise. I think you were one of them.


----------



## rdr09

Quote:


> Originally Posted by *taem*
> 
> So 13621 gpu score @ 1260. If that's with tess off its in line with what I'm getting I suppose, 12k @ 1070 and 1350 mem.
> 
> But again -- how are you doubling your score in crossfire?? I'm not running xfire atm but iirc I didn't get anywhere near double performance on benches or games when I did. Am I cpu bottlenecking with a 4670k @ 4.7? Anandtech and other sources say a 4670k will not bottleneck 290 xfire but several savvy users here have said otherwise. I think you were one of them.


no, that's tess on. here i found a 1250 run . . .

http://www.3dmark.com/3dm/4002563?

graphics score for 290 @ 1250 is 13500

graphics score for 290s @ 1250 is 25200

so, it's not 100% scaling. may have to do with the cards now running at x8/x8.


----------



## taem

Quote:


> Originally Posted by *rdr09*
> 
> no, that's tess on. here i found a 1250 run . . .
> 
> http://www.3dmark.com/3dm/4002563?
> 
> graphics score for 290 @ 1250 is 13500
> 
> graphics score for 290s @ 1250 is 25200
> 
> so, it's not 100% scaling.


It's pretty darned close and your scaling is way better than what I was getting. I can't find the Firestrike scores but in 3dmark11 for example I would score around 16-17k with one card, and 28k with xfire. You're getting like 85% scaling. I was getting under 70% scaling.
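Since scaling keeps coming up, the percentages being thrown around are just (crossfire score / single score - 1). A quick sketch using the graphics scores quoted in this exchange (taem's single/crossfire numbers are the rough 16-17k and 28k figures above):

```python
def xfire_scaling(single_score: float, xfire_score: float) -> float:
    """CrossFire scaling: extra performance from the second card, as a fraction."""
    return xfire_score / single_score - 1.0

# rdr09's Firestrike graphics scores at 1250 core
print(f"rdr09: {xfire_scaling(13500, 25200):.0%}")  # ~87%

# taem's 3DMark11 numbers (roughly 16.5k single, 28k crossfire)
print(f"taem:  {xfire_scaling(16500, 28000):.0%}")  # ~70%
```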


----------



## tsm106

Quote:


> Originally Posted by *taem*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> no, that's tess on. here i found a 1250 run . . .
> 
> http://www.3dmark.com/3dm/4002563?
> 
> graphics score for 290 @ 1250 is 13500
> 
> graphics score for 290s @ 1250 is 25200
> 
> so, it's not 100% scaling.
> 
> 
> 
> It's pretty darned close and your scaling is way better than what I was getting. I can't find the Firestrike scores but in 3dmark11 for example I would score around 16-17k with one card, and 28k with xfire. You're getting like 85% scaling. I was getting under 70% scaling.

Here:

15k gs

http://www.3dmark.com/fs/2889373

30k gs

http://www.3dmark.com/fs/3100892


----------



## NotReadyYet

I installed my new XFX 290X DD card yesterday. After I installed the latest AMD drivers I started up Warframe for a late night gaming session. After a couple of hours of smooth frames my game began to stutter and lock up a couple of times. This only happened when, during a mission, this NPC known as the Stalker appears to try and kill your guy. This caused my game to crash, and it has been crashing only when he appears. This game has never crashed on me before, only since I installed the new card (I previously had a 7970). Now, I did a basic driver install, but I am thinking that maybe I should have done a total purge of everything from the registry. I am reluctant to do this since I am worried it might cause more problems than it's worth. So I began to wonder if the new GPU is being bottlenecked by my CPU. Could this cause a crash? I'm using my sig rig to play. Hoping for some input, thanks.


----------



## Cyber Locc

Quote:


> Originally Posted by *NotReadyYet*
> 
> I installed my new XFX 290X DD card yesterday. After I installed the latest AMD drivers I started up Warframe for a late night gaming session. After a couple of hours of smooth frames my game began to stutter and lock up a couple of times. This only happened when, during a mission, this NPC known as the Stalker appears to try and kill your guy. This caused my game to crash, and it has been crashing only when he appears. This game has never crashed on me before, only since I installed the new card (I previously had a 7970). Now, I did a basic driver install, but I am thinking that maybe I should have done a total purge of everything from the registry. I am reluctant to do this since I am worried it might cause more problems than it's worth. So I began to wonder if the new GPU is being bottlenecked by my CPU. Could this cause a crash? I'm using my sig rig to play. Hoping for some input, thanks.


Use DDU to uninstall and reinstall. No, that CPU is not bottlenecking that card.


----------



## Forceman

Quote:


> Originally Posted by *taem*
> 
> Hmm. So why do some of the cards have .961 idle vddc and others .984? Can't be ASIC; the 4 cards I know personally have ASIC quality ranging from low to high, and all 4 are .984 idle vddc.


Pretty sure the difference is 290 vs 290X. I seem to recall that when I flashed my card from a 290 to a 290X the idle voltage went from 0.984 to 0.961.


----------



## th3illusiveman

mine idles at 1.031v :$


----------



## tsm106

Quote:


> Originally Posted by *th3illusiveman*
> 
> mine idles at 1.031v :$


That's silly high idle voltage.


----------



## Sgt Bilko

Quote:


> Originally Posted by *NotReadyYet*
> 
> I installed my new XFX 290X DD card yesterday. After I installed the latest AMD drivers I started up Warframe for a late night gaming session. After a couple of hours of smooth frames my game began to stutter and lock up a couple of times. This only happened when, during a mission, this NPC known as the Stalker appears to try and kill your guy. This caused my game to crash, and it has been crashing only when he appears. This game has never crashed on me before, only since I installed the new card (I previously had a 7970). Now, I did a basic driver install, but I am thinking that maybe I should have done a total purge of everything from the registry. I am reluctant to do this since I am worried it might cause more problems than it's worth. So I began to wonder if the new GPU is being bottlenecked by my CPU. Could this cause a crash? I'm using my sig rig to play. Hoping for some input, thanks.


It's Warframe....

Have you tried both DX 11 and DX 10 for it?
Sometimes just messing about with Settings fixes it for me


----------



## heroxoot

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *th3illusiveman*
> 
> mine idles at 1.031v :$
> 
> 
> 
> That's silly high idle voltage.

I agree. If you don't have 2D/3D profiles set, it should clock down and idle at .9xx volts. Mine idles at .969 I think.

So has anyone hit winter yet? My GPU idles 3°C colder as of today since it was 30 degrees outside. My PC is my heater, so I'd better start running Crysis 3 five times.


----------



## th3illusiveman

Quote:


> Originally Posted by *tsm106*
> 
> That's silly high idle voltage.


it is what it is


----------



## YellowBlackGod

Hello everybody....

I was about to order a GTX 980... but then I saw the following monster and decided on it for 200 euros less:

http://www.overclockers.co.uk/showproduct.php?prodid=GX-353-SP

What do you guys think? I think that for the price it's the best bet.


----------



## zealord

Quote:


> Originally Posted by *YellowBlackGod*
> 
> Hello everybody....
> 
> I was about to order a GTX 980... but then I saw the following monster and decided on it for 200 euros less:
> 
> http://www.overclockers.co.uk/showproduct.php?prodid=GX-353-SPGX-353-SP
> 
> What do you guys think? I think that for the price it's the best bet.


link is not working properly


----------



## YellowBlackGod

Must be ok now. Just checked.


----------



## zealord

Quote:


> Originally Posted by *YellowBlackGod*
> 
> Must be ok now. Just checked.


card is good, probably the best 290X out there.


----------



## YellowBlackGod

From what I understood, performance-wise it's equal to Nvidia's top cards, plus more RAM, cheaper, and future-proof (DX12). Best bang for the buck. I couldn't ignore it. The specs and the look impressed me.


----------



## MrTOOSHORT

Quote:


> Originally Posted by *YellowBlackGod*
> 
> From what I understood, performance-wise it's equal to Nvidia's top cards, plus more RAM, cheaper, and future-proof (DX12). Best bang for the buck. I couldn't ignore it. The specs and the look impressed me.


Well, you won't need 8GB of VRAM running only one card. Save £100 and go with the 4GB version:

*http://www.overclockers.co.uk/showproduct.php?prodid=GX-351-SP*


----------



## YellowBlackGod

Well, the new games will slowly need more VRAM because of the new consoles and port structure. 4GB of VRAM must be considered the minimum. But apart from that, it was the price that made me look at this particular model.


----------



## th3illusiveman

Quote:


> Originally Posted by *YellowBlackGod*
> 
> Well, the new games will slowly need more VRAM because of the new consoles and port structure. 4GB of VRAM must be considered the minimum. But apart from that, it was the price that made me look at this particular model.


a single 290x won't be fast enough to use all that VRAM for whatever game requires it. If you're buying that, then you'd better be planning on buying another; otherwise it's a waste of money.


----------



## rdr09

Quote:


> Originally Posted by *cyberlocc*
> 
> I will give that a shot tonight. However, now that I'm thinking about it more and more, I wonder if that's even the issue. Myself and others that have this problem have been assuming it was. What actually presents is that LatencyMon and DPC Latency Checker show dxgkrnl.sys causing serious IRQ and sound latency, so that seems like a different kind of problem to me.
> 
> It would seem that it's something else in the drivers that DirectX doesn't like. How are you guys installing your DirectX, from Microsoft's site?
> 
> Also, if anyone has a chance, check your drivers: under System Devices there are HD Audio Controllers that run the HD audio. Mine are just that, "HD Audio Controller", and the driver info shows Microsoft generic; nothing in there is AMD at all.


i just tried Skyrim using my 290s and i have no sound issues of the kind you are having. i'll test more.

edit: lol. i had to check if crossfire was enabled. it is very smooth.

Quote:


> Originally Posted by *tsm106*
> 
> Here:
> 
> 15k gs
> 
> http://www.3dmark.com/fs/2889373
> 
> 30k gs
> 
> http://www.3dmark.com/fs/3100892


----------



## chiknnwatrmln

http://www.3dmark.com/3dm/4569561

Probably the best I'm gonna get. Clocks were 1175/5850 with +81mV, I'm drawing nearly 800 watts from my 865 watt UPS so I really can't go higher.


----------



## jagdtigger

Quote:


> Originally Posted by *YellowBlackGod*
> 
> Well the new games will slowly need more VRAM


----------



## chiknnwatrmln

I wouldn't really use one poorly optimized game as representative of every game.

Although recently the amount of poor PC ports is astounding, Ubisoft in particular.

Also http://www.3dmark.com/3dm11/8902300


----------



## chronicfx

I am going to have to disagree. They are writing these console games a certain way, and it seems like too much work for them to undo the unified memory usage during the port to PC. I think we are all gonna need more VRAM; too bad there is no need for more core power with any of this, as the other requirements are pretty weak.


----------



## YellowBlackGod

Quote:


> Originally Posted by *chronicfx*
> 
> I am going to have to disagree. They are writing these console games a certain way, and it seems like too much work for them to undo the unified memory usage during the port to PC. I think we are all gonna need more VRAM; too bad there is no need for more core power with any of this, as the other requirements are pretty weak.


That's what I am talking about too. It's the port structure that makes more RAM necessary. This is proven by the LotF diagram. 4GB is the minimum now. And still, the PS4 GPU, an HD 78xx-like GPU, can handle most of the available 8GB of system RAM. I don't see a reason why an R9 290X, a top-of-the-line GPU, could not do this.


----------



## Forceman

Quote:


> Originally Posted by *th3illusiveman*
> 
> mine idles at 1.031v :$


Powercolor PCS+? They have the +50mV BIOS, which would make the voltage make sense for a 290.


----------



## pipes

When will a video BIOS editor come out that works with the R9 290X?


----------



## Cyber Locc

Quote:


> Originally Posted by *YellowBlackGod*
> 
> That's what I am talking about too. It's the port structure that makes more RAM necessary. This is proven by the LotF diagram. 4GB is the minimum now. And still, the PS4 GPU, an HD 78xx-like GPU, can handle most of the available 8GB of system RAM. I don't see a reason why an R9 290X, a top-of-the-line GPU, could not do this.


4GB is the minimum for what? I understand what you're saying and all, but consoles only have 5GB of usable RAM when all is said and done, so you're saying that games will use 5GB of VRAM because of this? I'm sorry, but I'm thinking you're wrong. Maybe not, only time will tell, but Shadow of Mordor shows that even devs are overestimating. The 6GB for ultra is complete bull; I run on ultra, and it barely uses over 3GB. The fact is you're half right: there will be some lazy companies, such as that one, that will poorly code their game. However, that doesn't mean that everyone will.


----------



## ZealotKi11er

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> http://www.3dmark.com/3dm/4569561
> 
> Probably the best I'm gonna get. Clocks were 1175/5850 with +81mV, I'm drawing nearly 800 watts from my 865 watt UPS so I really can't go higher.


That's a pretty good score. What is 5850? You mean 1450MHz memory?

Also, are you running 290 + 290X or 2x 290X?
Quote:


> Originally Posted by *Forceman*
> 
> Powercolor PCS+? They have the +50mV BIOS, which would make the voltage make sense for a 290.


It's @ stock +50mV? And you can add +50mV or 100mV for a total of 150mV with MSI AB?


----------



## Forceman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It's @ stock +50mV? And you can add +50mV or 100mV for a total of 150mV with MSI AB?


Not sure about the AB limit, never really thought about that aspect. I flashed it to my card so I could run my day-to-day overclock without messing with AB voltage, which sometimes didn't get applied correctly after waking from sleep. I'll have to check my idle voltage when I get home, but pretty sure it's 1.0+ now.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> Powercolor PCS+? They have the +50mV BIOS, which would make the voltage make sense for a 290.


Quote:


> Originally Posted by *Forceman*
> 
> Not sure about the AB limit, never really thought about that aspect. I flashed it to my card so I could run my day-to-day overclock without messing with AB voltage, which sometimes didn't get applied correctly after waking from sleep. I'll have to check my idle voltage when I get home, but pretty sure it's 1.0+ now.


The reason I am asking is that MSI AB is really the best tool for these cards. Sapphire's and ASUS's tools are very poorly done in terms of UI, features, and stability, but they have 200mV support. I need +125mV for one card to do 1200MHz.


----------



## jamaican voodoo

anyone can build a computer, but can they configure it? hmm!


----------



## chiknnwatrmln

Quote:


> Originally Posted by *ZealotKi11er*
> 
> That's a pretty good score. What is 5850? You mean 1450MHz memory?
> 
> Also, are you running 290 + 290X or 2x 290X?
> It's @ stock +50mV? And you can add +50mV or 100mV for a total of 150mV with MSI AB?


Yes, I have a tendency to list effective memory clocks. You are correct, I have a 290 and 290x.

Max power draw was actually 820 watts from the wall; if I got rid of my UPS I could do more, but the safety of my rig is not worth an extra 20 or so megahertz.

Maybe one day I'll go for a 5GHz CPU at 1.45v to bump my physics score a bit; too sick of watching benchmarks to do it now though.


----------



## Roy360

First time doing a Firestrike test since I was a kid.

http://www.3dmark.com/3dm/4575737 How'd my computer do?

Too low or okay?

I could have sworn my CPU was overclocked to 4.3GHz; I'll have to check the next time I turn it off.

I wish Firestrike showed temperatures. My computer bombed the physics test, ~20fps. I'm wondering if the CPU was being cooked by the GPUs.


----------



## chiknnwatrmln

Are you running stock speeds on your graphics?

Also, if you can OC your RAM do it. Bumping my 1600MHz RAM to 1866 is about 99% stable, not good for daily usage but gives a nice bump in physics score.

Today was actually my first time running FS, Cloud Gate, Skydiver, etc. Usually I stick to 3dmk11.


----------



## pkrexer

Seems low. Looks like your GPUs are stock, but my 2x 290x's have you beat by a bit.

http://www.3dmark.com/fs/2861090
Quote:


> Originally Posted by *Roy360*
> 
> First time doing a Firestrike test since I was a kid.
> 
> http://www.3dmark.com/3dm/4575737 How'd my computer do?
> 
> Too low or okay?
> 
> I could have sworn my CPU was overclocked to 4.3GHz; I'll have to check the next time I turn it off.
> 
> I wish Firestrike showed temperatures. My computer bombed the physics test, ~20fps. I'm wondering if the CPU was being cooked by the GPUs.


----------



## Roy360

Quote:


> Originally Posted by *pkrexer*
> 
> Seems low. Looks like your GPUs are stock, but my 2x 290x's have you beat by a bit.
> 
> http://www.3dmark.com/fs/2861090


I keep getting beat by 2x R9 290X.
I'm going to have to check my CPU overclock and switch back to dual-channel memory. On a side note, I wish I had tsm106's motherboard. My 4th slot is only for peripherals.
That Maximus could handle 4 GPUs. (*takes a look at the BNIB R9 290X sitting in its box)
Quote:


> Originally Posted by *tsm106*
> 
> Here:
> 
> 15k gs
> 
> http://www.3dmark.com/fs/2889373
> 
> 30k gs
> 
> http://www.3dmark.com/fs/3100892


----------



## ZealotKi11er

Quote:


> Originally Posted by *Roy360*
> 
> I keep getting beat by 2xR9 290X.
> I'm going have to check my CPU overclock, and switch back to Dual Channel memory. On a side note, I wish I had tsm106's motherboard. I'm using a PCIe riser so my sound card would fit with my 3 cards. (Should have bought a USB dac)
> That maximus could handle 4 GPUs + a sound card, unlike this silly deluxe which only has a 4th PCIe for show. (Me looks at the R9 290X sitting in box)


A 290 should score 10.5K stock. 3 of them should be ~30K. My 2x 290s score 25K.


----------



## Roy360

Quote:


> Originally Posted by *ZealotKi11er*
> 
> A 290 should score 10.5K stock. 3 of them should be ~30K. My 2x 290s score 25K.


How are you guys getting such high scores?

I've been searching through 3DMark; 15k is the norm for a 3820 with 3 or more cards. Looks like I have a bottleneck.

http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/fs/P/1397/906/19595?minScore=0&cpuName=Intel Core i7-3820 Processor&gpuName=AMD Radeon R9 290

Surprisingly, two-card configs score higher than three or more.


----------



## chiknnwatrmln

What drivers are you running? Going from 14.4 to 14.9 Beta gave me a decent bump.


----------



## Roy360

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> What drivers are you running? Going from 14.4 to 14.9 Beta gave me a decent bump.


14.1

Looks like I was wrong about my CPU being the bottleneck.



Still not 30k, much higher than mine.

I wonder why their CPU speed shows up correctly on their tests, but mine still says 3.6GHz (which is lower than stock)

I'm just going to assume Firestrike needs a patch or something. Or maybe it's time to reformat. Maybe I'll try Windows 9.


----------



## alancsalt

There is no Windows 9









They went straight to 10.....


----------



## heroxoot

Quote:


> Originally Posted by *alancsalt*
> 
> There is no Windows 9
> 
> 
> 
> 
> 
> 
> 
> 
> 
> They went straight to 10.....


I hear this was due to incompatibility with code that works on 95/98. Oh well, whatever works right?


----------



## kizwan

Quote:


> Originally Posted by *Roy360*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> A 290 should score 10.5K stock. 3 of them should be ~30K. My 2 x 290s score 25K.
> 
> 
> 
> How are you guys getting such high scores?
> 
> I've been searching through 3DMark; 15k is the norm for a 3820 with 3 or more cards. Looks like I have a bottleneck
> 
> http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/fs/P/1397/906/19595?minScore=0&cpuName=Intel Core i7-3820 Processor&gpuName=AMD Radeon R9 290
> 
> 
> Surprisingly, two card configs score higher than 3 or more.
Click to expand...

Overclock your CPU and overclock your cards. It's not a fair comparison when you're running at stock clocks on both the CPU and the GPUs.


----------



## Ironsmack

Did you guys disable ULPS to have your voltage idle at around 0.9xx volts? Or is that with ULPS?

I disabled mine and both cards hover at 1.2xx.


----------



## heroxoot

Quote:


> Originally Posted by *Ironsmack*
> 
> Did you guys disable ULPS to have your voltage idle at around 0.9xx volts? Or is that with ULPS?
> 
> I disabled mine and both cards hover at 1.2xx.


I run a single card, so ULPS doesn't affect me. Even on dual monitors it volts down at the 300/1250 clocks.


----------



## MojoW

Quote:


> Originally Posted by *Ironsmack*
> 
> Did you guys disable ULPS to have your voltage idle at around 0.9xx volts? Or is that with ULPS?
> 
> I disabled mine and both cards hover at 1.2xx.


Looks like Force Constant Voltage is enabled.


----------



## Red1776

Just a Holodeck XI update for the FX-8350 build portion of the AMD HPP project:

4 x R9 290X MSI Gaming Edition, EK watercooled.

All four cards have easily gone over 1200 core.


----------



## rdr09

Quote:


> Originally Posted by *Red1776*
> 
> Just a Holodeck XI update for the FX-8350 build portion of the AMD HPP project:
> 
> 4 x R9 290X MSI Gaming Edition, EK watercooled.
> 
> All four cards have easily gone over 1200 core.


nice! power misers are scratching their head. lol, that's prolly 10% of your power consumption, which i really do not have any business on.


----------



## taem

Query. Anyone know off hand any 290 bios with special characteristics to consider?

Example, Powercolor pcs+ has +50mv stock. Any other bios with a volt adjust in the bios?

What are the bios with aggressive fan curves, and what have relaxed fan curves?

And anything else of note.


----------



## Red1776

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> Just a Holodeck XI update for the FX-8350 build portion of the AMD HPP project:
> 
> 4 x R9 290X MSI Gaming Edition, EK watercooled.
> 
> All four cards have easily gone over 1200 core.
> 
> 
> 
> 
> 
> 
> 
> nice! power misers are scratching their head. lol, that's prolly 10% of your power consumption, which i really do not have any business on.
Click to expand...

What? I didn't get that. But I have three PSUs and 2200W in this thing, so no problem.


----------



## amptechnow

http://www.3dmark.com/3dm/4583914?

This is with 1135/1625 on my 290s, with tessellation ON...


----------



## ZealotKi11er

Quote:


> Originally Posted by *amptechnow*
> 
> http://www.3dmark.com/3dm/4583914?
> 
> This is with 1135/1625 on my 290s, with tessellation ON...


Damn, Windows 8 seems to score much higher than Windows 7. I only get 1K more, but that's with an extra 100MHz OC and one card being a 290X.


----------



## amptechnow

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Damn, Windows 8 seems to score much higher than Windows 7. I only get 1K more, but that's with an extra 100MHz OC and one card being a 290X.


Yeah, Win 8.1 helps scores out some, and I hear 10 does even more. I run 10 on my HTPC, but I don't bench it because it only has a G3258 and an R7 250 lol.

But I am using the stock BIOS on my cards. In CFX I can get up to about 1150 on the core before artifacts and crashes. My XFX goes higher alone, and much higher with the PT1 BIOS, but my PCS+ seems to be holding it back. It can run the PT1 BIOS, but when I have them both set to it I can't install drivers or log into Windows because I get a black screen. I would love to see my scores with both on PT1 and a 1200-1250 core overclock, but I just can't get it working.


----------



## killertea

I am running two Sapphire R9 290X Tri-X OC cards.



Both cards are running at 1040/1300, no OC other than Sapphire's factory OC.

http://gpuz.techpowerup.com/14/11/02/k24.png


----------



## bond32

Broke 10k on firestrike ultra:
http://www.3dmark.com/3dm/4585091?


----------



## Sgt Bilko

Quote:


> Originally Posted by *bond32*
> 
> Broke 10k on firestrike ultra:
> http://www.3dmark.com/3dm/4585091?


Nice work









I'm hoping to break 9k soon


----------



## bond32

I'm not sure if you guys do this, but I noticed I have to restart between runs. Could just be a quadfire thing; not a huge deal. But I am really pleased with that result. Pretty sure my CPU is the limiting factor now, as the majority of the time the GPUs don't stay around 100% usage.

The right thing to do would be to test each card individually, but I'm lazy. I know my Elpida card is a piece of crap and it's holding the entire rig back, but it is what it is.


----------



## rdr09

Quote:


> Originally Posted by *bond32*
> 
> Broke 10k on firestrike ultra:
> http://www.3dmark.com/3dm/4585091?


indeed. very nice run.


----------



## Roy360

Are you guys testing with the demo, or with Fire Strike Extreme/Ultra?

I'm envious of you guys with 1200 core. I get artifacts after 977, and my memory is limited by the ASUS DirectCU card.


----------



## amptechnow

Quote:


> Originally Posted by *bond32*
> 
> Broke 10k on firestrike ultra:
> http://www.3dmark.com/3dm/4585091?


very very nice!


----------



## SeanEboy

Quote:


> Originally Posted by *pdasterly*
> 
> Download RealBench and run the benchmark. I agree you have a problem. I only get that error when I push my chip past 5GHz


So, I finally ran RealBench. Now, I've also reinstalled those drivers in safe mode after using the wiper, etc., so perhaps that solved the problem? Anyway, here's my RealBench. Looks like my third card isn't listed (crossfire); the 4th was off for troubleshooting. Checked again to see the amber light lit; it's on, and CCC sees it.



Looks like I'm a fan of the beta drivers as well, 11195, and i haven't done anything yet. http://www.3dmark.com/3dm/4586024


----------



## amptechnow

Has anyone checked out the Firestrike hall of fame? We are getting owned by 980s... a 290 is in 32nd for Normal, and for Ultra we aren't even in the top 100...


----------



## DividebyZERO

Quote:


> Originally Posted by *amptechnow*
> 
> Has anyone checked out the Firestrike hall of fame? We are getting owned by 980s... a 290 is in 32nd for Normal, and for Ultra we aren't even in the top 100...


That's fine with me because I don't play Firestrike. I play games







.. Synthetic benchmarks don't really hold much weight with me.


----------



## SeanEboy

Quote:


> Originally Posted by *amptechnow*
> 
> Has anyone checked out the Firestrike hall of fame? We are getting owned by 980s... a 290 is in 32nd for Normal, and for Ultra we aren't even in the top 100...


----------



## amptechnow

Quote:


> Originally Posted by *SeanEboy*
> 
> So, I finally ran RealBench. Now, I've also reinstalled those drivers in safe mode after using the wiper, etc., so perhaps that solved the problem? Anyway, here's my RealBench. Looks like my third card isn't listed (crossfire); the 4th was off for troubleshooting. Checked again to see the amber light lit; it's on, and CCC sees it.
> 
> 
> 
> Looks like I'm a fan of the beta drivers as well, 11195, and i haven't done anything yet. http://www.3dmark.com/3dm/4586024


Here, I ran RealBench to compare... I will try it again after disabling my 2 side monitors... they run on the onboard Intel graphics, and most of the encoding was happening on them, not on the center screen, which uses my CFX 290s... think that matters?


----------



## SeanEboy

Quote:


> Originally Posted by *amptechnow*
> 
> Here, I ran RealBench to compare... I will try it again after disabling my 2 side monitors... they run on the onboard Intel graphics, and most of the encoding was happening on them, not on the center screen, which uses my CFX 290s... think that matters?


Thanks for that. Well, not as much as having twice the ram does, I'd imagine. ;c) Right?


----------



## amptechnow

Yeah, that definitely helps. I just figured 2 290s would render and encode better than an Intel HD 4000 iGPU... here's a bigger shot; the other is hard to read.


----------



## rdr09

Quote:


> Originally Posted by *DividebyZERO*
> 
> That's fine with me because I don't play Firestrike. I play games
> 
> 
> 
> 
> 
> 
> 
> .. Synthetic benchmarks don't really hold much weight with me.


If I had the kind of monitor setup you have . . . I would not mind either. I saw them in another thread, and that was a lot of work. Jelly indeed.

Anyway, these 290s are in the top 50 . . .

http://www.3dmark.com/fs/2970450

and tsm's

http://www.3dmark.com/fs/3100782

no chance with single.


----------



## pdasterly

Quote:


> Originally Posted by *SeanEboy*
> 
> So, I finally ran RealBench. Now, I've also reinstalled those drivers in safe mode after using the wiper, etc., so perhaps that solved the problem? Anyway, here's my RealBench. Looks like my third card isn't listed (crossfire); the 4th was off for troubleshooting. Checked again to see the amber light lit; it's on, and CCC sees it.
> 
> 
> 
> Looks like I'm a fan of the beta drivers as well, 11195, and i haven't done anything yet. http://www.3dmark.com/3dm/4586024


http://www.3dmark.com/fs/3025719
http://rog.asus.com/realbench/show_comment.php?id=4773


----------



## SeanEboy

Quote:


> Originally Posted by *pdasterly*
> 
> http://www.3dmark.com/fs/3025719
> http://rog.asus.com/realbench/show_comment.php?id=4773


That was my extreme score before.

This is my reggie Firestrike score: http://www.3dmark.com/3dm/4586934


----------



## rdr09

Quote:


> Originally Posted by *SeanEboy*
> 
> That was my extreme score before.
> 
> This is my reggie Firestrike score: http://www.3dmark.com/3dm/4586934


That is really low. I've seen 280Xs (x4) score almost 40K in graphics score. My 2 290s get 25K.


----------



## SeanEboy

Uh oh.. Well then, I wonder what the next step is..?


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *SeanEboy*
> 
> That was my extreme score before.
> 
> This is my reggie Firestrike score: http://www.3dmark.com/3dm/4586934
> 
> 
> 
> 
> 
> 
> 
> That is really low. I've seen 280Xs (x4) score almost 40K in graphics score. My 2 290s get 25K.
Click to expand...

Almost 43K...

Quote:


> Originally Posted by *SeanEboy*
> 
> Uh oh.. Well then, I wonder what the next step is..?


What are your CPU's and your GPUs' max overclocks? You need to push your CPU's IMC and run the highest RAM clocks you can.


----------



## xer0h0ur

Yup, that is low. The last time I benched Firestrike I got a 33473 graphics score running a 295X2 and a reference 290X. Granted, my clocks are higher, though.

http://www.3dmark.com/fs/2909462


----------



## Ironsmack

Quote:


> Originally Posted by *MojoW*
> 
> Looks like Force Constant Voltage is enabled.


I turned off constant voltage and rebooted. It was still hovering around 1.2xx at idle.

So I disabled PowerPlay and rebooted. Now it hovers at 1.0xx, and both cards are idling at 30x MHz on the core now, thanks!
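For anyone who wants to flip ULPS without a tuning tool: it is conventionally controlled by the `EnableUlps` registry value on each AMD display-adapter class subkey. A minimal .reg sketch of the commonly circulated method (not an official AMD switch; the subkey indices 0000/0001/etc. vary from system to system, so back up the registry first and reboot afterwards):

```reg
Windows Registry Editor Version 5.00

; Set EnableUlps to 0 on every display-adapter subkey that has it.
; {4d36e968-...} is the display-adapter device class GUID; the trailing
; index (0000 here) varies per system, so also check 0001, 0002, ...
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Setting the value back to `dword:00000001` re-enables ULPS.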


----------



## ZealotKi11er

Quote:


> Originally Posted by *SeanEboy*
> 
> Uh oh.. Well then, I wonder what the next step is..?


Are they on air?


----------



## SeanEboy

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Are they on air?


No, but they're not OC'd; could that be the difference? Or the 3930K?


----------



## pdasterly

I think one of your cards isn't working; that score is closer to tri-fire.
http://www.3dmark.com/3dm/4586934
http://www.3dmark.com/fs/2519704

compare your scores, look at detailed scores

This system is actually closer to your score, so you can get an idea of what's wrong
http://www.3dmark.com/fs/2188312


----------



## Themisseble

R9 290 PowerColor for $213 on Newegg


----------



## heroxoot

Has anyone here had a good time with the Evolve alpha? My 290X only gets a 60-70% load, but it keeps 60fps nicely. I have max settings + SMAA 2TX, no Vsync. I am a little shocked that it runs so well without AMD dropping a driver update just for the game.


----------



## Vici0us

Quote:


> Originally Posted by *heroxoot*
> 
> Has anyone here had a good time with the Evolve alpha? My 290X only gets a 60-70% load, but it keeps 60fps nicely. I have max settings + SMAA 2TX, no Vsync. I am a little shocked that it runs so well without AMD dropping a driver update just for the game.


Tried it today with Crossfire disabled. I had Vsync on and kept a steady 60 FPS the whole time with one 290.


----------



## SeanEboy

Quote:


> Originally Posted by *pdasterly*
> 
> I think one of your cards isn't working; that score is closer to tri-fire.
> http://www.3dmark.com/3dm/4586934
> http://www.3dmark.com/fs/2519704
> 
> compare your scores, look at detailed scores
> 
> This system is actually closer to your score, so you can get an idea of what's wrong
> http://www.3dmark.com/fs/2188312


Thanks for that info man, I appreciate it.

I wonder if one isn't installed correctly? All (4) LEDs are lit up on the RIVBE, and I could've sworn I saw a quadfire setup in CCC. I'll have to look again when I get home tonight. Could it be a power-supply-maxing-out thing? I only have a 1500W, Cooler Master at that (I know.. I know..).


----------



## ledzepp3

Here's a follow-up after using the EK 290X Reinforcement Bracket for the past 6 months









I don't ever shut off my PC. It's either folding or gaming, there's almost no downtime unless the power goes out completely. This means that my cards stay nice and toasty (around ~50C when folding). At load temps is exactly when my old 7970 began to warp and bend just a little bit. I am pleased to report that with the reinforcement brackets installed along with a backplate- there is _no_ flexing, warping, or bending of any kind present in either card. The finish of the brackets is still phenomenal, and nothing has scratched the paint. I assumed that so many cycles of cooling between folding and gaming would create some abnormalities in the paint- but there's nothing.

If anyone doesn't have these brackets on their cards yet, I highly encourage it







plus they're not too expensive if you can find them! Probably the best investment for your graphics card if it's already liquid cooled.

-Zepp


----------



## SeanEboy

Quote:


> Originally Posted by *ledzepp3*
> 
> Here's a follow-up after using the EK 290X Reinforcement Bracket for the past 6 months
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't ever shut off my PC. It's either folding or gaming, there's almost no downtime unless the power goes out completely. This means that my cards stay nice and toasty (around ~50C when folding). At load temps is exactly when my old 7970 began to warp and bend just a little bit. I am pleased to report that with the reinforcement brackets installed along with a backplate- there is _no_ flexing, warping, or bending of any kind present in either card. The finish of the brackets is still phenomenal, and nothing has scratched the paint. I assumed that so many cycles of cooling between folding and gaming would create some abnormalities in the paint- but there's nothing.
> 
> If anyone doesn't have these brackets on their cards yet, I highly encourage it
> 
> 
> 
> 
> 
> 
> 
> plus they're not too expensive if you can find them! Probably the best investment for your graphics card if it's already liquid cooled.
> 
> -Zepp


Great write up.. Unfortunately that means my quad terminal is no longer used. Meh.


----------



## tsm106

Quote:


> Originally Posted by *SeanEboy*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *ledzepp3*
> 
> Here's a follow-up after using the EK 290X Reinforcement Bracket for the past 6 months
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't ever shut off my PC. It's either folding or gaming, there's almost no downtime unless the power goes out completely. This means that my cards stay nice and toasty (around ~50C when folding). At load temps is exactly when my old 7970 began to warp and bend just a little bit. I am pleased to report that with the reinforcement brackets installed along with a backplate- there is _no_ flexing, warping, or bending of any kind present in either card. The finish of the brackets is still phenomenal, and nothing has scratched the paint. I assumed that so many cycles of cooling between folding and gaming would create some abnormalities in the paint- but there's nothing.
> 
> If anyone doesn't have these brackets on their cards yet, I highly encourage it
> 
> 
> 
> 
> 
> 
> 
> plus they're not too expensive if you can find them! Probably the best investment for your graphics card if it's already liquid cooled.
> 
> -Zepp
> 
> 
> 
> 
> 
> 
> 
> Great write up.. Unfortunately that means my quad terminal is no longer used. Meh.
Click to expand...

If your cards are sagging with a terminal bridge, you've got some serious issues.


----------



## SeanEboy

Quote:


> Originally Posted by *tsm106*
> 
> If your cards are sagging with a terminal bridge, you've got some serious issues.


LOL, definitely not sagging, and the Caselabs M8 probably helps there as well.. But he sold the bracket quite well in terms of his previous experience with warping cards, etc. But I appreciate the concern, for sure. They're sitting pretty in there.. I think I still have some software troubleshooting to do though.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> If your cards are sagging with a terminal bridge, you've got some serious issues.


Question is . . . is 1500W enough for 4 x 290X?


----------



## tsm106

Quote:


> Originally Posted by *SeanEboy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> If your cards are sagging with a terminal bridge, you've got some serious issues.
> 
> 
> 
> LOL, definitely not sagging, and the Caselabs M8 probably helps there as well.. But he sold the bracket quite well in terms of his previous experience with warping cards, etc. But I appreciate the concern, for sure. They're sitting pretty in there.. I think I still have some software troubleshooting to do though.
Click to expand...

You missed the point. It's almost impossible for cards to warp in quad/tri setups because they act as an array, a single unit. Card sag doesn't really apply to crossfire over three cards; it mostly affects single cards because they are not supported on another plane. The sag is due to weak reinforcement of the PCB at the PCIe bracket, very prevalent in single cards. But in a multi-GPU stack, each successive card reinforces the one above it, and so on.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> If your cards are sagging with a terminal bridge, you've got some serious issues.
> 
> 
> 
> Question is . . . is 1500W enough for 4 x 290X?
Click to expand...

I would say most people don't feed enough volts to make 1500W a concern. But if they are so equipped, one can trip a 1.5kW PSU easily.
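A back-of-envelope sketch of why a heavily volted quadfire rig can trip a 1.5kW unit: dynamic power scales roughly with frequency times voltage squared. All figures below (board power, clocks, voltages, system overhead) are illustrative assumptions, not measurements from this thread:

```python
def card_power(stock_watts, f_stock, f_oc, v_stock, v_oc):
    # Dynamic power scales roughly with frequency and voltage squared:
    # P_oc ~ P_stock * (f_oc / f_stock) * (v_oc / v_stock)^2
    return stock_watts * (f_oc / f_stock) * (v_oc / v_stock) ** 2

# Assumed figures: ~250 W per 290X at a stock 1000 MHz / 1.20 V,
# pushed to 1200 MHz at +200 mV for a hard bench run.
benched = card_power(250, 1000, 1200, 1.20, 1.40)  # ~408 W per card

cards = 4
system_overhead = 250  # CPU, board, pump, fans (assumed)
total = cards * benched + system_overhead
print(round(total))  # ~1883 W estimated load, well past a 1500 W rating
```

The exact numbers don't matter; the voltage-squared term is the point, and it is why stock quadfire fits under 1500W while +200mV runs do not.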


----------



## DividebyZERO

Quote:


> Originally Posted by *tsm106*
> 
> I would say most people don't feed enough volts to make 1500W a concern. But if they are so equipped, one can trip a 1.5kW PSU easily.


Pretty much right on point. I've tripped my G1600 when I pushed +200mV to all my cards in FSU. I am usually good up to +100mV on all 4; after that it's questionable. Some games don't work the best in quadfire. In those titles I run tri-fire overclocked (+200mV, 1200/1600) and can reach the same performance as quad at stock. It also hits about the same wattage as 4 stock cards.

Speaking of which, a question:

I am debating adding a second PSU. If I do, do I need to match the PSUs up? For example, if I use the multi-rail G1600 and a 1200W single-rail PSU, will it have any issues? Or should I go dual G1600? Dual 1200W single rails? Not sure if that's a Shilka-style question.


----------



## tsm106

The multi-rail PSU is a pain to work with because there is not enough power on each rail to run one card at its max. Hence the original problem that started this; it also makes it a bad host for multi-PSU setups. You would have to bond rails in a way, one rail per plug on a card with two plugs, etc. Then add in a decent large PSU, for example a G2 1300W. If it were me, I would just sell the G1600 and get either two 1300W G2s, or a mix of 1300+1000, etc. Add in a PSU jumper adapter and you are set.

http://www.amazon.com/gp/product/B00DL3L2J6/ref=oh_aui_search_detailpage?ie=UTF8&psc=1


----------



## Wezzor

Would you guys consider 72-75C high on the GPU core and VRMs during full load? My GPU is just overclocked with +50% power limit at 1100/1400. I also still have the fan speed on automatic, since it's so quiet that way.


----------



## bond32

Quote:


> Originally Posted by *tsm106*
> 
> The multi-rail PSU is a pain to work with because there is not enough power on each rail to run one card at its max. Hence the original problem that started this; it also makes it a bad host for multi-PSU setups. You would have to bond rails in a way, one rail per plug on a card with two plugs, etc. Then add in a decent large PSU, for example a G2 1300W. If it were me, I would just sell the G1600 and get either two 1300W G2s, or a mix of 1300+1000, etc. Add in a PSU jumper adapter and you are set.
> 
> http://www.amazon.com/gp/product/B00DL3L2J6/ref=oh_aui_search_detailpage?ie=UTF8&psc=1


That's actually the exact setup I have, 2x EVGA 1300 G2s with that exact "splitter"... works quite well. Giant PSUs at full bench cause the lights to flicker in my apartment... which I cannot figure out. I have each on a different circuit, and the lights are on a different circuit. Think it's a frequency thing.. who knows.


----------



## tsm106

Quote:


> Originally Posted by *bond32*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> The multi-rail PSU is a pain to work with because there is not enough power on each rail to run one card at its max. Hence the original problem that started this; it also makes it a bad host for multi-PSU setups. You would have to bond rails in a way, one rail per plug on a card with two plugs, etc. Then add in a decent large PSU, for example a G2 1300W. If it were me, I would just sell the G1600 and get either two 1300W G2s, or a mix of 1300+1000, etc. Add in a PSU jumper adapter and you are set.
> 
> http://www.amazon.com/gp/product/B00DL3L2J6/ref=oh_aui_search_detailpage?ie=UTF8&psc=1
> 
> 
> 
> 
> 
> 
> 
> That's actually the exact setup I have, 2x EVGA 1300 G2s with that exact "splitter"... works quite well. Giant PSUs at full bench cause the lights to flicker in my apartment... which I cannot figure out. I have each on a different circuit, and the lights are on a different circuit. Think it's a frequency thing.. who knows.
Click to expand...

I would get that checked as that sounds like a brownout.


----------



## DividebyZERO

I have an 1100W and a 1200W single-rail already here. The G1600 has done pretty well for me except when I try to push really far. Truth be told, I really don't need to overclock the cards for gaming. This was just something I was debating doing for the times I feel like benching. I am not sure I really care to bench anymore, to be honest.

I am really wondering what the next-gen AMD GPUs will pull power-wise... guess it doesn't hurt to future-proof


----------



## LandonAaron

Quote:


> Originally Posted by *cyberlocc*
> 
> Is that the ones that are in sound and game controllers or the ones in system devices?


Sound and Game controllers


----------



## tsm106

Quote:


> Originally Posted by *DividebyZERO*
> 
> I have an 1100W and a 1200W single-rail already here. The G1600 has done pretty well for me except when I try to push really far. Truth be told, I really don't need to overclock the cards for gaming. This was just something I was debating doing for the times I feel like benching. I am not sure I really care to bench anymore, to be honest.
> 
> I am really wondering what the next-gen AMD GPUs will pull power-wise... guess it doesn't hurt to future-proof


They will use 40% delta color compression (10% more than Maxwell and, IIRC, 20% more than Tonga), power gating (on-demand deactivation, like cylinder deactivation in cars), 20nm, and lower-powered HBM (though in what manner remains to be seen), and those key points should drop power consumption by a large amount. However, physics is still physics, so they will still draw power when power is needed. There's been a lot of BS over Maxwell's low power consumption, but that is only true when it is at stock and within its design parameters. When those cards, specifically the 980, are overclocked and benched hard, their power consumption jumps right back up to their Kepler counterparts'. So all this power saving comes from the color compression and power gating, i.e. saving power when you don't need to use it. In other words, it will still use essentially the same power when overclocked and pushed hard. That's my educated interpretation as seen through this crystal ball.
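To put a number on the color-compression point: if some average fraction of framebuffer traffic is compressed away, the effective bandwidth is the raw figure divided by what remains. A toy sketch (the 320 GB/s raw figure is a placeholder, and reading "40% compression" as 40% of traffic eliminated is one interpretation, not a published definition):

```python
def effective_bandwidth(raw_gb_s, avg_compression):
    # If avg_compression of the traffic is eliminated, the bus behaves
    # as if it were raw / (1 - avg_compression) wide.
    return raw_gb_s / (1.0 - avg_compression)

# Hypothetical 320 GB/s raw bus with 40% average delta color compression.
print(round(effective_bandwidth(320.0, 0.40)))  # 533, i.e. ~533 effective GB/s
```

Which is also why compression only saves power on traffic it applies to; it does nothing for the clock and voltage terms once you overclock.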


----------



## joeh4384

I can imagine the next-gen flagship using similar amounts of power to the 290X. I think the watercooling in the reference design is more a cool factor than a necessity. I think the 290X reference design would have been a big hit if it had come factory-cooled with a 120mm AIO.


----------



## SeanEboy

Quote:


> Originally Posted by *joeh4384*
> 
> I can imagine the next-gen flagship using similar amounts of power to the 290X. I think the watercooling in the reference design is more a cool factor than a necessity. I think the 290X reference design would have been a big hit if it had come factory-cooled with a 120mm AIO.


It was also a sound factor, and much better cooling... I have (22) AP-15s running @ 35% push/pull on (3) 360s that I sit 3' from, and I can only hear them when I try... That 290X reference cooler was capable of pushing my case across the desk at 100%, let alone acting as a white noise machine for the whole house.


----------



## bond32

Today is just not my day... Could use some help troubleshooting some things.

I have no idea what happened or how. I thought it was a failed OC/driver failure, but when my PC rebooted, the Windows emblem would appear, the monitor would flicker off as normal, then black screen. Tried various GPU BIOSes, tried each card, different PCIe slots; still nothing. I was able to reformat my SSD; however, when Windows attempts to boot for the first time, rather than entering Windows the PC reboots into a boot loop.

When I get home I will be testing out my roommate's video card, likely in the slot that bypasses the PLX.

My thoughts are:
- SSD has died?
- PLX Chip has died? (board)
- CPU died? 4770k

It seems unlikely to me that the CPU would be dead if it still boots fine and the BIOS works as it should. Any ideas?

To add insult to injury, I made a quick small CPU water loop only to find the impeller in my D5 is now shot. Not a good day...


----------



## battleaxe

Quote:


> Originally Posted by *bond32*
> 
> Today is just not my day... Could use some help troubleshooting some things.
> 
> I have no idea what happened or how. I thought it was a failed OC/driver failure, but when my PC rebooted, the Windows emblem would appear, the monitor would flicker off as normal, then black screen. Tried various GPU BIOSes, tried each card, different PCIe slots; still nothing. I was able to reformat my SSD; however, when Windows attempts to boot for the first time, rather than entering Windows the PC reboots into a boot loop.
> 
> When I get home I will be testing out my roommate's video card, likely in the slot that bypasses the PLX.
> 
> My thoughts are:
> - SSD has died?
> - PLX Chip has died? (board)
> - CPU died? 4770k
> 
> It seems unlikely to me that the CPU would be dead if it still boots fine and the BIOS works as it should. Any ideas?
> 
> To add insult to injury, I made a quick small CPU water loop only to find the impeller in my D5 is now shot. Not a good day...


This just happened to my reference 290. Dead. My other 290 is completely fine.

Same thing. Windows splash screen then black. Did the same exact thing on two different machines too, so it was only the card. I RMA'd the thing hoping to get her back soon.


----------



## bond32

Quote:


> Originally Posted by *battleaxe*
> 
> This just happened to my reference 290. Dead. My other 290 is completely fine.
> 
> Same thing. Windows splash screen then black. Did the same exact thing on two different machines too, so it was only the card. I RMA'd the thing hoping to get her back soon.


That doesn't make sense to me though... I used onboard video on one test, turned all PCIe slots off (DIP switches), and got the same results...

Plus, I have 4 cards. I'm not opposed to testing each one/RMA but I am stumped as to troubleshooting this as it seems it could be many things.


----------



## battleaxe

Quote:


> Originally Posted by *bond32*
> 
> That doesn't make sense to me though... I used onboard video on one test, turned all PCIe slots off (DIP switches), and got the same results...
> 
> Plus, I have 4 cards. I'm not opposed to testing each one/RMA but I am stumped as to troubleshooting this as it seems it could be many things.


Ohh.... well then that's not the same. Sorry. I only experienced this on one card. Not both.


----------



## taem

The XFX Double Dissipation Black Edition is a reference PCB, right? I think I saw it on the EK compatibility list, so I'm pretty sure it is, but I want to make 100% sure, because it's cheaper than the lower-clocked non-Black Double Dissipation, and that makes me wonder.



I want to put an aquacomputer kryographics on it so it has to be reference.

Any reason I shouldn't get this card and should get a different reference instead? This one is the cheapest I can find.


----------



## Jflisk

Quote:


> Originally Posted by *bond32*
> 
> That doesn't make sense to me though... I used onboard video on one test, turned all PCIe slots off (DIP switches), and got the same results...
> 
> Plus, I have 4 cards. I'm not opposed to testing each one/RMA but I am stumped as to troubleshooting this as it seems it could be many things.


Bond, try booting into safe mode and see if it will go; F8 at boot.


----------



## Cyber Locc

Quote:


> Originally Posted by *taem*
> 
> The xfx double dissipation black edition is a reference pcb right? I think I saw it on the ek compat list so I'm pretty sure it is but I want to make 100% sure. Because it's cheaper than the lower clocked non-black double diss so that makes me wonder.
> 
> 
> 
> I want to put an aquacomputer kryographics on it so it has to be reference.
> 
> Any reason I shouldn't get this card and should get a different reference instead? This one is the cheapest I can find.


Yes, it's reference. That is the only reference card I would get unless you're going water immediately. It's cheaper than the non-Black Edition? That's strange; is it on sale? Black Edition is XFX's top tier, kind of like EVGA's FTWs: they're binned, and only the best get the Black tag. Whether that's bull or not is a matter of opinion, but that's what XFX claims. (AFAIK that is the only reference card that has a non-reference cooler.)

Okay guys, I'm going to give AMD the benefit of the doubt and buy some blocks. I hope I don't regret this. Partly because I will be adding another card, and the day 4K FreeSync monitors hit I will have one, so this Vsync issue is temporary.

Why, you ask? Because where I'm going, we don't need Vsync, baby....


----------



## Ironsmack

Ok, so this just popped up on my IG feed:

http://www.anandtech.com/show/8001/sapphire-r9-290x-vaporx-8gb-hits-retail-uk-only-600

Who's getting one??


----------



## YellowBlackGod

Quote:


> Originally Posted by *Ironsmack*
> 
> Ok, so this just popped up on my IG feed:
> 
> http://www.anandtech.com/show/8001/sapphire-r9-290x-vaporx-8gb-hits-retail-uk-only-600
> 
> Who's getting one??


Me. It's on the way. £369 including shipping. Best GPU on the market.


----------



## BradleyW

Quote:


> Originally Posted by *Ironsmack*
> 
> Ok, so this just popped up on my IG feed:
> 
> http://www.anandtech.com/show/8001/sapphire-r9-290x-vaporx-8gb-hits-retail-uk-only-600
> 
> Who's getting one??


8GB version of GTX 980 will be out soon.


----------



## YellowBlackGod

Quote:


> Originally Posted by *BradleyW*
> 
> 8GB version of GTX 980 will be out soon.


The price is what I want to see. Can't beat the PowerColor and Sapphire 8GB offers.


----------



## battleaxe

Quote:


> Originally Posted by *BradleyW*
> 
> 8GB version of GTX 980 will be out soon.


Who cares? It won't be anywhere near this price. Maybe close to twice as much actually.


----------



## SeanEboy

Quote:


> Originally Posted by *pdasterly*
> 
> I think one of your cards isn't working, that score is closer to tri-fire.
> http://www.3dmark.com/3dm/4586934
> http://www.3dmark.com/fs/2519704
> 
> compare your scores, look at detailed scores
> 
> This system is actually closer to you score so you can get idea of whats wrong
> http://www.3dmark.com/fs/2188312


Annnd we have a winner! Yep, I have the ole yellow caution triangle in Device Manager. Finally figured out how to get to the effin' Device Manager. So annoying trying to figure out Windows 7, and now I have 8 to figure out. Anyway, I just tried installing CCC again; it didn't pick up the card. Now what?


----------



## chiknnwatrmln

I would test the card individually to rule out hardware issues. If you can't remove the other card due to watercooling you can just unplug the power from the other 3.


----------



## SeanEboy

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I would test the card individually to rule out hardware issues. If you can't remove the other card due to watercooling you can just unplug the power from the other 3.


Well, I can turn them off via the Rampage's black switches on the board. So, I'll be doing that here. Turn all off, except for the 4th. Now, if it's listed as 4th in Device Manager, it's 4th, right? Or do I need to know more? Wait, it says PCI bus 0, device 31, function 2. So, if it's PCI bus 0, would that be card #1?

Also, I just read that some guy had this problem but hadn't installed the chipset drivers. Perhaps I need to do a fresh Windows install, and install chipset drivers and all that?

Ok, so I kill power to the 4th card, just for fun... The (4) cards still all show in Device Manager, despite my uninstalling the faulty card, turning off its 'slot', and unplugging it. I wonder if this slot switch is busted on #4 or whatever?
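For anyone else trying to map Device Manager entries back to physical slots, a rough sketch of the idea (the location strings and sorting rule are assumptions; board layouts vary, and bus numbers don't always track slot order):

```python
import re

def parse_pci_location(location):
    """Parse a Device Manager location string like
    'PCI bus 0, device 31, function 2' into a (bus, device, function) tuple."""
    m = re.match(r"PCI bus (\d+), device (\d+), function (\d+)", location)
    if not m:
        raise ValueError("unrecognized location string: " + location)
    return tuple(int(x) for x in m.groups())

def sort_by_slot(locations):
    """Order location strings the way the PCIe root enumerates them:
    lowest bus first, then device, then function. The first entry is
    usually (not always) the card in the top slot."""
    return sorted(locations, key=parse_pci_location)

# Hypothetical locations for a 4-card setup:
cards = [
    "PCI bus 5, device 0, function 0",
    "PCI bus 1, device 0, function 0",
    "PCI bus 9, device 0, function 0",
    "PCI bus 13, device 0, function 0",
]
print(sort_by_slot(cards)[0])  # lowest bus number -> likely the top slot
```

One caveat: bus 0, device 31 is typically a chipset device rather than a GPU, so match against the display adapters' own location strings, not everything on the bus.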


----------



## taem

Quote:


> Originally Posted by *cyberlocc*
> 
> Yes, it's reference. That is the only reference card I would get unless you're going water immediately. It's cheaper than the non-Black Edition? That's strange; is it on sale? Black Edition is XFX's top tier, kind of like EVGA's FTWs: they're binned, and only the best get the Black tag. Whether that's bull or not is a matter of opinion, but that's what XFX claims. (AFAIK that is the only reference card that has a non-reference cooler.)


Lol, the Black Edition clock is still under 1 GHz; what kind of binning are they doing? Is there a 290 out there that can't do 980 MHz? The Tri-X is an unbinned reference that comes stock at a 1000 MHz clock. That XFX is the best looking card ever made, though. I wish more PC parts had those sorts of aesthetics, instead of going the Transformers route on everything.

The Sapphire Tri-X and PowerColor PCS+ 290 are both reference PCBs with custom coolers. Well, the PCS+ used to be; PowerColor revised the PCB but kept the name and model number, with no way for an end user to know which one he's ordering. You have to look at the stamp right above the PCIe connector to tell: reference is R29F and custom is R29FA.

What blocks you ordering?


----------



## chiknnwatrmln

I would think so, but you know what they say about people who assume..

Personally I would test one card at a time, if each of the four cards work independently then at least you know the cards and the motherboard are not faulty.

My money is on a software issue though. Make sure you're using DDU or a similar program in safe mode. Also, I had some issues with the 14.9 WHQL drivers, but the latest beta ones work for me. Maybe try older/newer drivers.


----------



## SeanEboy

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I would think so, but you know what they say about people who assume..
> 
> Personally I would test one card at a time, if each of the four cards work independently then at least you know the cards and the motherboard are not faulty.
> 
> My money is on a software issue though. Make sure you're using DDU or a similar program in safe mode. Also, I had some issues with the 14.9 WHQL drivers, but the latest beta ones work for me. Maybe try older/newer drivers.


Ok, so I found 14.4 straight off AMD, so I'll take a crack at those. Sound like a good idea? I'm going to do a driver wipe (DDU style), install the 14.4, then I'll go into the individual card testing...


----------



## chiknnwatrmln

Can't hurt. Good luck figuring out the issue, hopefully it's software and not something needing to be replaced.


----------



## Cyber Locc

Quote:


> Originally Posted by *taem*
> 
> Lol the black ed clock is still under 1ghz, what kind of binning they doing? Is there a 290 out there that can't do 980? The tri-x is an unbinned reference that comes stock 1000 clock. That xfx is the best looking card ever made though. I wish more pc parts had those sorts of aesthetics, instead of going the transformers route on everything.
> 
> The sapphire tri-x and powercolor pcs+ 290 are both reference pcbs with custom coolers. We'll the pcs+ used to be; powercolor revised the pcb, but kept the name and model number with no way for an end user to know which one he's ordering. You have to look at the stamp right above the PCIE connector to tell, reference is r29f and custom is r29fa.
> 
> What blocks you ordering?


Like I said, they say they're binned better; whether that's the truth or not is another matter. I wasn't sure if the Tri-X was reference or not; if it is, that's also a good choice. (I only knew for sure the XFX DD was.)

Got some EK nickels on the way, the newest ones; they're beautiful, although I am still very upset with EK for the backplates having an ugly CrossFire cutout they don't need. Got some Fuji pads as well.







.


----------



## taem

Quote:


> Originally Posted by *cyberlocc*
> 
> Got some ek nickels on the way the newest ones there beatiful although i am still very upset with ek for the backplates having a ugly xfire cut out they dont need. Got some fuji pads as well
> 
> 
> 
> 
> 
> 
> 
> .


I really want the aquacomputer kryographics Hawaii, for the looks and performance, but EK makes the most sense. Cheaper, great all around performer, and they have blocks for every card that you can link together with the fc terminal. If I go with kryographics I can only use reference pcbs unless I want to mess with tubing or vid connecting the cards, and my board has only 1 slot spacing making that a pain. Vid connectors are leak prone too I hear.

Stren says the aquacomputer kryographics ac backplate works with ek. Prevents use of a terminal bridge but it's supposed to have by far the best performance. The passive is supposed to be good too.


----------



## Cyber Locc

Quote:


> Originally Posted by *taem*
> 
> I really want the aquacomputer kryographics Hawaii, for the looks and performance, but EK makes the most sense. Cheaper, great all around performer, and they have blocks for every card that you can link together with the fc terminal. If I go with kryographics I can only use reference pcbs unless I want to mess with tubing or vid connecting the cards, and my board has only 1 slot spacing making that a pain. Vid connectors are leak prone too I hear.
> 
> Stren says the aquacomputer kryographics ac backplate works with ek. Prevents use of a terminal bridge but it's supposed to have by far the best performance. The passive is supposed to be good too.


Personally, I don't like the look of the Kryos; it doesn't cover the whole card, which is mainly why. And the temp difference between them isn't much, so either one is great.

Besides, the price difference ain't much, only like 20-30 dollars; totally worth it if that's what you like, IMO.

The backplate does work, but only in passive; it doesn't work actively, as the connection isn't the same. It's close, but not the same. Also, the reason the passive Kryo back works better is that the plate has a VRM thermal pad on it where the EK doesn't; adding thermal pads there brings the EK up to about the same. (That's why I have Fuji thermal pads coming, the Extreme ones, for the block-side VRMs, and the pads that come with it are going on the backplate.)

Also, if you use the backplate in passive mode, it has 4 holes and a big groove chunk missing out of it. Even the passive-only backplate is like this; it is something horrendous.

Yep the link is nice







However, it also adds to the price of the blocks, as the terminal is 30 bucks, plus 10 for blanks if you need them.


----------



## Bertovzki

Quote:


> Originally Posted by *cyberlocc*
> 
> Like I said, they say they're binned better; whether that's the truth or not is another matter. I wasn't sure if the Tri-X was reference or not; if it is, that's also a good choice. (I only knew for sure the XFX DD was.)
> 
> Got some EK nickels on the way, the newest ones; they're beautiful, although I am still very upset with EK for the backplates having an ugly CrossFire cutout they don't need. Got some Fuji pads as well.
> 
> 
> 
> 
> 
> 
> 
> .


That is one of the few things I do know for sure: all 3 Tri-X 290s are reference design PCBs!

I was trying to find that out months ago and was told by several PC companies here in NZ that it was not, yet I had seen pictures of it next to a reference card, and they were identical. Then I saw it on the EK website, then emailed Sapphire to make 100% sure. They replied "NO IT IS NOT" in capitals! So I emailed them again, sent pictures this time, and asked that they talk to a technician, as either Sapphire or EK was wrong and I'm blind... They replied with an apology and said yes, it is indeed reference design. The same PC company here that told me it is not reference is now selling one with a block installed second hand, after I emailed them back and told them the official news.


----------



## Bertovzki

Quote:


> Originally Posted by *taem*
> 
> I really want the aquacomputer kryographics Hawaii, for the looks and performance, but EK makes the most sense. Cheaper, great all around performer, and they have blocks for every card that you can link together with the fc terminal. If I go with kryographics I can only use reference pcbs unless I want to mess with tubing or vid connecting the cards, and my board has only 1 slot spacing making that a pain. Vid connectors are leak prone too I hear.
> 
> Stren says the aquacomputer kryographics ac backplate works with ek. Prevents use of a terminal bridge but it's supposed to have by far the best performance. The passive is supposed to be good too.


The Hawaii block is the best performer, and the backplate is also the best, but only if active; with just the block it's by far the worst!

If you have the Hawaii you need the active backplate, or it's a very bad performer on the VRAM, so you would not buy it without one. I guess you guys have both seen the review I have? The EK block is the best all-rounder for price, and the 3rd place winner:
http://www.xtremerigs.net/2014/08/11/r9-290x-gpu-waterblock-roundup/

I have the Hawaii; I like the looks and the performance, and I'm fully happy with it covering all it needs to. In fact I prefer it to cover less, as I like to see the card too, but that's PCs at this high level: we all like our own style and taste, and that's a good thing, or we would all have the same thing.

My whole system is built around performance first, then aesthetics.


----------



## pdasterly

Quote:


> Originally Posted by *Bertovzki*
> 
> My whole system is built around performance first then aesthetics


Dark brown 1976 camaro lol


----------



## alancsalt

Rat rod?


----------



## taem

Quote:


> Originally Posted by *Bertovzki*
> 
> The Hawaii block is the best performer, and the backplate is also the best, but only if active; with just the block it's by far the worst!
> 
> If you have the Hawaii you need the active backplate, or it's a very bad performer on the VRAM, so you would not buy it without one. I guess you guys have both seen the review I have? The EK block is the best all-rounder for price, and the 3rd place winner:
> http://www.xtremerigs.net/2014/08/11/r9-290x-gpu-waterblock-roundup/
> 
> I have the Hawaii; I like the looks and the performance, and I'm fully happy with it covering all it needs to. In fact I prefer it to cover less, as I like to see the card too, but that's PCs at this high level: we all like our own style and taste, and that's a good thing, or we would all have the same thing.
> 
> My whole system is built around performance first, then aesthetics.


Yeah, I'm pretty much going by what stren says. Stren and Martin: those guys are my go-to for part picking.

Stren actually says the passive backplate is good enough: block + passive = second best VRM cooling, and active is even better but probably not worth it since the passive is so good. That's his advice. The advantage of the passive is you can use the kryoconnect bridge (I think... have to PM Shaggy on that one).

Actually, the only reason I'm going this route is the assumption that the backplate will guard against a leak from the CPU block and any fittings right above the GPU. Is that correct? Because if not, I'd go with just the Watercool block and skip the backplate.

I wish you could use the backplate that comes with your card, because mine did.

Edit to add: btw, temps with a $140 block had better be outstanding, because this is what I get after hours of gaming at full load on my PowerColor PCS+:



It's only stock bios which is +50mV for this card and 1130 core 1450 mem clocks, but still, I consider these outstanding temps for air. Of the 290/x family, I'd bet only the Lightning and Club3D Royal Ace can match these temps on air.


----------



## Bertovzki

Quote:


> Originally Posted by *pdasterly*
> 
> Dark brown 1976 camaro lol


Ha, yeah, some might think it will look like a mongrel, a bit of this and that, no matching blocks here lol. A rat rod lol.


----------



## Cyber Locc

Quote:


> Originally Posted by *Bertovzki*
> 
> The Hawaii block is the best performer, and the backplate is also the best, but only if active; with just the block it's by far the worst!
> 
> If you have the Hawaii you need the active backplate, or it's a very bad performer on the VRAM, so you would not buy it without one. I guess you guys have both seen the review I have? The EK block is the best all-rounder for price, and the 3rd place winner:
> http://www.xtremerigs.net/2014/08/11/r9-290x-gpu-waterblock-roundup/
> 
> I have the Hawaii; I like the looks and the performance, and I'm fully happy with it covering all it needs to. In fact I prefer it to cover less, as I like to see the card too, but that's PCs at this high level: we all like our own style and taste, and that's a good thing, or we would all have the same thing.
> 
> My whole system is built around performance first, then aesthetics.


I agree, and I'm the same way, but we're talking less than a degree of temp; that's not a huge difference, and at that point aesthetics make the choice. Also, that review is from the first version of EK's R9 290 blocks, of which there are now 4 revisions. Revision 4 could very well beat the AC, especially seeing how it is a full cover block and has way more fins than revision 1 or the AC block. Also, while the AC backplate is the best at cooling VRMs, doing so with the active backplate also increases your core temps, for obvious reasons.

Also, that guide and his findings are more of a guide than fact: a Koolance block beats the Aquacomputer, and add in the backplate and the VRMs get cooler again. Like I said, the 0.5 of a degree we're talking about is not by any means game changing. If you have a delta of 10, then your GPU core will be 30.47 vs 30.61; we're not talking about 35 vs 55, we're talking a small fraction of a degree that won't net you anything at all, period. Most software on your PC won't even tell you the difference; some people take things a little too extreme.

And really, there again, it could come down simply to the thermal pads included in one versus the other. That's why I grabbed Fujis for my EK backplates; there's a thread on here with people that did the same, and the VRM temps went down 15-20 degrees. Now that is a difference.
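The fractions-of-a-degree argument can be put in numbers with a back-of-envelope sketch (the 20 °C ambient, 10 °C loop delta, and the two block deltas are all assumed values, not measurements):

```python
def core_temp(ambient_c, loop_delta_c, block_delta_c):
    """Rough steady-state estimate: the water sits loop_delta above
    ambient, and the die sits block_delta above the water."""
    return ambient_c + loop_delta_c + block_delta_c

# Two blocks that differ by 0.14 C in die-to-water delta
# (the 30.47 vs 30.61 figures from the post above):
block_a = core_temp(20.0, 10.0, 0.47)
block_b = core_temp(20.0, 10.0, 0.61)
print(round(block_b - block_a, 2))  # 0.14 -- fractions of a degree
```

Swap in your own ambient and loop delta; the block-to-block gap stays the same small fraction of a degree.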


----------



## Buehlar

Quote:


> Originally Posted by *cyberlocc*
> 
> I agree, and I'm the same way, but we're talking less than a degree of temp; that's not a huge difference, and at that point aesthetics make the choice. Also, that review is from the first version of EK's R9 290 blocks, of which there are now 4 revisions. Revision 4 could very well beat the AC, especially seeing how it is a full cover block and has way more fins than revision 1 or the AC block. Also, while the AC backplate is the best at cooling VRMs, doing so with the active backplate also increases your core temps, for obvious reasons. Also, that guide and his findings are more of a guide than fact: a Koolance block beats the Aquacomputer, and add in the backplate and the VRMs get cooler again. Like I said, the 0.5 of a degree we're talking about is not by any means game changing. If you have a delta of 10, then your GPU core will be 30.47 vs 30.61; we're not talking about 35 vs 55, we're talking a small fraction of a degree that won't net you anything at all, period. Most software on your PC won't even tell you the difference; some people take things a little too extreme. And really, there again, it could come down simply to the thermal pads included in one versus the other. *That's why I grabbed Fujis for my EK backplates; there's a thread on here with people that did the same, and the VRM temps went down 15-20 degrees. Now that is a difference.*


I was a bit hesitant when buying Fujipoly for my block & backplates, but I'm not at all disappointed. Drastically dropped my VRM temps, ~5°C lower than the core.


----------



## bond32

You guys realize those VRMs can withstand upwards of 110-120°C, right? Even pushing 1.5 volts to my four cards, on stock Koolance pads the max VRM gets into the 70s...


----------



## NotReadyYet

Add me to the club 

http://www.techpowerup.com/gpuz/details.php?id=df95a


----------



## taem

Quote:


> Originally Posted by *cyberlocc*
> 
> I agree, and I'm the same way, but we're talking less than a degree of temp; that's not a huge difference, and at that point aesthetics make the choice. Also, that review is from the first version of EK's R9 290 blocks, of which there are now 4 revisions. Revision 4 could very well beat the AC, especially seeing how it is a full cover block and has way more fins than revision 1 or the AC block. Also, while the AC backplate is the best at cooling VRMs, doing so with the active backplate also increases your core temps, for obvious reasons. Also, that guide and his findings are more of a guide than fact: a Koolance block beats the Aquacomputer, and add in the backplate and the VRMs get cooler again. Like I said, the 0.5 of a degree we're talking about is not by any means game changing. If you have a delta of 10, then your GPU core will be 30.47 vs 30.61; we're not talking about 35 vs 55, we're talking a small fraction of a degree that won't net you anything at all, period. Most software on your PC won't even tell you the difference; some people take things a little too extreme. And really, there again, it could come down simply to the thermal pads included in one versus the other. That's why I grabbed Fujis for my EK backplates; there's a thread on here with people that did the same, and the VRM temps went down 15-20 degrees. Now that is a difference.


I wish stren had done a 5-mount test for these blocks like he did with CPU blocks. But hey, I don't want to sound like an ingrate; I appreciate what he did!

I agree with what you say. The core temp differences were small; for core you could go with the worst on his list and it's only a few degrees off the best. There was quite a bit of variance on the VRMs, though.

With the AC, stren noted that one of the screws felt stripped and he might not have gotten good contact on the VRMs.

I just put more faith in German engineering tolerances, though. So I would definitely go AC or Watercool. I also prefer a smaller footprint block vs a full length one. That can matter for fitting in a case: with the Enthoo Primo res bracket, for example, you'd have to Dremel the bracket to fit a full length block in slot 4 or 5.

I plan to try the AC without backplate first and get a backplate only if vrms do get hot.


----------



## taem

Quote:


> Originally Posted by *Buehlar*
> 
> I was a bit hesitant when buying fujipoly for my block & backplates, but I'm not at all disappointed. Drastically doped my vrm temps ~5c lower than the core.


Those are my idle temps on air...
Quote:


> Originally Posted by *bond32*
> 
> You guys realize those VRM can withstand upwards in the 110-120 C range right? Even pushing 1.5 volts to my four cards, on stock koolance pads max vrm gets in the 70's...


I get instability at high oc if vrm temp gets that high. Not keen on those sorts of temps just as general principle either.


----------



## Cyber Locc

Quote:


> Originally Posted by *taem*
> 
> I wish stren had done a 5-mount test for these blocks like he did with CPU blocks. But hey, I don't want to sound like an ingrate; I appreciate what he did!
> 
> I agree with what you say. The core temp differences were small; for core you could go with the worst on his list and it's only a few degrees off the best. There was quite a bit of variance on the VRMs, though.
> 
> With the AC, stren noted that one of the screws felt stripped and he might not have gotten good contact on the VRMs.
> 
> I just put more faith in German engineering tolerances, though. So I would definitely go AC or Watercool. I also prefer a smaller footprint block vs a full length one. That can matter for fitting in a case: with the Enthoo Primo res bracket, for example, you'd have to Dremel the bracket to fit a full length block in slot 4 or 5.
> 
> I plan to try the AC without backplate first and get a backplate only if vrms do get hot.


Well, personal preference is always most important, IMO.

When you're talking about VRM temps and the backplates, are we talking active or passive? He doesn't record active, and he also himself states that the EK backplate has no thermal pad on the backside of the VRMs where the AC does (and it's the only one that does); adding said pads would make up the difference, as your backplate is then a heatsink for the VRMs.

The only difference between the passive AC and the EK is that thermal pad; they're both aluminium, the same thickness, etc. That thermal pad is the difference. I.e., if you do what I suggest and put the Fujipoly on the block side and use the EK thermal pad on the backplate, the VRM temps will go down by about 15-20°C. And this isn't just me saying this; there's a whole thread here on OCN reporting the same result, which puts it way ahead of the AC block and back. But again, we are splitting hairs, and changing the thermal pad on the AC may help it by a lot as well.

With that said, that is all about the AC backplate in passive. In active mode there is no competition, and if you like the AC block then you should run active. However, per stren, your core temps will go up quite a bit; from reviews I have seen, it goes up by about 5°C on the core, which makes sense, as you're raising the delta and adding more heat to the water.


----------



## Cyber Locc

Quote:


> Originally Posted by *taem*
> 
> Those are my idle temps on air...
> I get instability at high oc if vrm temp gets that high. Not keen on those sorts of temps just as general principle either.


290s throttle when the VRMs get to 70°C.


----------



## bond32

Quote:


> Originally Posted by *cyberlocc*
> 
> 290s throttle when the vrm gets to 70c


I have literally never seen that. If you're getting throttling then there is some other problem.

The point is that the VRM's don't care if the temperature is 40 C or 70 C.... Sure, 90+ you should be concerned. I just don't see the point of the expensive fuji pads. I know they are effective, but what do you gain?


----------



## Cyber Locc

Quote:


> Originally Posted by *bond32*
> 
> I have literally never seen that. If you're getting throttling then there is some other problem.
> 
> The point is that the VRM's don't care if the temperature is 40 C or 70 C.... Sure, 90+ you should be concerned. I just don't see the point of the expensive fuji pads. I know they are effective, but what do you gain?


You gain not throttling. They throttle at 70°C; even AMD will tell you that, not to mention the endless threads and posts about it in this group alone. It may not show in clock speed, but run Heaven and watch: when it gets past 70°C your usage will drop and spike up and down in Valley or Heaven or whatever. That's throttling; get the VRMs down and it will stop.

And it is not something else; the 7000 series did the same thing. And maybe yours don't because you have non-reference. Non-reference cards use different VRMs than reference. Reference will throttle at 70°C on VRM1.

This may be fixable by raising the target throttle temp, which is set to 70°C; I'm not sure, I don't care to try. The reason: yes, the VRMs can run at 100°C for like 1000 hours, but what they can handle is based on a lifespan equation, and if you want the cards to last you don't run them past a certain degree. Board VRMs are the same way.
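The lifespan point above is usually captured with a rule of thumb derived from the Arrhenius equation; a minimal sketch, assuming the common "life halves per 10°C" simplification rather than any actual VRM datasheet figure:

```python
def relative_lifetime(t_hot_c, t_ref_c, halving_step_c=10.0):
    """Rule-of-thumb lifetime model (assumed, not a datasheet value):
    component life roughly halves for every halving_step_c rise in
    operating temperature, a simplification of the Arrhenius equation."""
    return 2.0 ** ((t_ref_c - t_hot_c) / halving_step_c)

# Running VRMs at 90 C instead of 70 C -> about a quarter of the life.
print(relative_lifetime(90, 70))  # 0.25
```

The real curve depends on the part's activation energy, so treat this strictly as a why-cooler-is-better illustration.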


----------



## chiknnwatrmln

Quote:


> Originally Posted by *bond32*
> 
> I have literally never seen that. If you're getting throttling then there is some other problem.
> 
> The point is that the *VRM's don't care if the temperature is 40 C or 70 C*.... Sure, 90+ you should be concerned. I just don't see the point of the expensive fuji pads. I know they are effective, but what do you gain?


This is just plain wrong. Maybe at low clocks it doesn't matter, but as the VRM temps get higher they use more power and get inherently less stable.

When I was on air and my VRMs would get to 75+, I would run into instability. On water, with Fujis on both cards, they stay below 50°C after hours of gaming, and my cards are rock solid stable at a lower voltage.

I have never seen cards throttling due to VRM temps, though. Whether I have not seen them get hot enough or just not paid attention, I'm not sure.

Also, with Fujis you gain overclocking headroom due to lower temps. Whether or not it's worth the $15 is up to the end user, but if I'm spending $3k+ on my PC, I'm going to do it right and get the lowest temps possible.
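The "hotter VRMs use more power" effect comes from MOSFET on-resistance rising with temperature. A hedged sketch, where the 0.4%/°C coefficient and 2 mΩ Rds(on) are assumed typical values, not figures from any specific card:

```python
def conduction_loss_w(current_a, rdson_25c_ohm, temp_c, tc_per_c=0.004):
    """Conduction loss of one MOSFET phase: P = I^2 * Rds(on).
    Rds(on) rises roughly linearly with junction temperature; the
    ~0.4%/C coefficient here is an assumed typical value."""
    rdson = rdson_25c_ohm * (1.0 + tc_per_c * (temp_c - 25.0))
    return current_a ** 2 * rdson

# Same 30 A phase at 50 C vs 90 C with an assumed 2 mOhm part:
p_cool = conduction_loss_w(30, 0.002, 50)
p_hot = conduction_loss_w(30, 0.002, 90)
print(round(p_hot / p_cool, 3))  # hotter VRM -> measurably higher loss
```

More loss means more heat, which raises Rds(on) further, which is why cooler VRMs run both more efficiently and more stably.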


----------



## ZealotKi11er

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> This is just plain wrong. Maybe at low clocks it doesn't matter, but as the VRM temps get higher they use more power and get inherently less stable..
> 
> When I was on air and my VRMs would get to 75+ I would run into instability. On water with Fujis on both cards they stay below 50c after hours of gaming and my cards are rock solid stable at a lower voltage.
> 
> I have never seen cards throttling due to VRM temps, though. Whether I have not seen them get hot enough or just not paid attention, I'm not sure.
> 
> Also, wit Fuji's you gain overclocking headroom due to lower temps. Whether or not it's worth the $15 is up to the end user, but if I'm spending $3k+ on my PC I'm going to do it right and get the lowest temps possible.


They affect OC ability, but not by much unless you are trying to hit very high clocks.


----------



## bond32

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> This is just plain wrong. Maybe at low clocks it doesn't matter, but as the VRM temps get higher they use more power and get inherently less stable..
> 
> When I was on air and my VRMs would get to 75+ I would run into instability. On water with Fujis on both cards they stay below 50c after hours of gaming and my cards are rock solid stable at a lower voltage.
> 
> I have never seen cards throttling due to VRM temps, though. Whether I have not seen them get hot enough or just not paid attention, I'm not sure.


I always assumed the instability came from the fact I was pushing 40-60% more voltage to the card (clocks too) than what was intended... Even if the VRM temps were in the 40s to 50s, stability would still be an issue. Either way, I will consider the Fuji pads, but buying them for 4 cards isn't going to be cheap. And by the way, I am not talking about 1.3 V... but 1.5 V.


----------



## ThijsH

I've never seen VRM throttling under 105°C. They might be less efficient, but throttling at 70°C? I highly doubt it.


----------



## bond32

I see the Fuji pads come in 60x50. Those of you with them, did you cut them so that 2 pieces fit across VRM1 together?


----------



## Cyber Locc

Quote:


> Originally Posted by *ThijsH*
> 
> I've never seen vrm throtling under 105 degrees C. They might be less efficient, but throtling at 70c? I highly doubt it.


It's not really the temp so much as that AMD coded it to throttle at 70°C; that's by design. If you don't believe me, search through this thread; it's been discussed quite a few times.

Ya Bond, 1 strip of Fuji is enough for 2 cards, so it would only be 30 bucks to do all your cards, and you will also still have enough for VRM2 as well.

Here's a long, detailed post about it; he does the XSPC Razor in the OP, but EK and some other blocks are discussed later: http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures
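The one-strip-per-two-cards math above, as a quick sketch (coverage and price are the in-thread figures, not vendor-confirmed):

```python
import math

def pads_order(num_cards, cards_per_strip=2, price_per_strip_usd=15.0):
    """Back-of-envelope for a Fujipoly order: one strip covers two
    cards' VRM1 (per the thread), so round up and multiply."""
    strips = math.ceil(num_cards / cards_per_strip)
    return strips, strips * price_per_strip_usd

strips, cost = pads_order(4)
print(strips, cost)  # 2 strips, $30 for a quad-card setup
```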


----------



## bond32

Quote:


> Originally Posted by *cyberlocc*
> 
> Its not really the temp as much as amd coded it to throttle at 70c thats by design if you dont beleive me search through this thread its been discussed quite a few times
> 
> Ya bond 1 strip of fuji is enough for 2 cards so it would only be 30 bucks to do all your cards and you will also still have enough for vrm2 as well.


Where did you order yours from? Frozencpu has them for $19.99 for one pad. Thinking I might get just the VRM's, use the koolance pads for the memory as they are.


----------



## Cyber Locc

Quote:


> Originally Posted by *bond32*
> 
> Where did you order yours from? Frozencpu has them for $19.99 for one pad. Thinking I might get just the VRM's, use the koolance pads for the memory as they are.


From FrozenCPU; you must be looking at the bigger pad, mine was 15 dollars. The thread above has a link.


----------



## bond32

Quote:


> Originally Posted by *cyberlocc*
> 
> From FrozenCPU; you must be looking at the bigger pad, mine was 15 dollars. The thread above has a link.


I'm a little confused... I need to wait to see how much of them the Koolance block covers, I guess. I thought it was 100mm x 17mm; if that's the case then I would need 4x http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html?tl=g8c487s1797

But that doesn't seem right...


----------



## tsm106

Quote:


> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cyberlocc*
> 
> It's not really the temp so much as that *AMD coded it to throttle at 70C; that's by design. If you don't believe me, search through this thread; it's been discussed quite a few times*
> 
> Ya bond, 1 strip of Fuji is enough for 2 cards, so it would only be 30 bucks to do all your cards, and you will also still have enough for VRM2 as well.
> 
> 
> 
> Where did you order yours from? FrozenCPU has them for $19.99 for one pad. Thinking I might get them just for the VRMs, and use the Koolance pads for the memory as they are.

What the five? Someone is smoking some stuff or just flat out pulling stuff out of their ass.

And yeah, you don't need to waste money on the memory pads; just get VRM pads if that's what you are set on. And btw, some of us have been quietly using Fujis for years. It's more of a benchers' thing, ya know...


----------



## bond32

Quote:


> Originally Posted by *tsm106*
> 
> What the five? Someone is smoking some stuff or just flat out pulling stuff out of their ass.
> 
> And yeah, you don't need to waste money on the memory pads; just get VRM pads if that's what you are set on. And btw, some of us have been quietly using Fujis for years. It's more of a benchers' thing, ya know...


I guess when I initially thought about it, for reasons unknown to me, I was thinking I had to get memory pads too, which would end up costing a ton for 4 cards... but if it helps squeeze a little more out of them, then I guess I will consider it.

In other news, I am still troubleshooting my rig. Last night I put in my roommate's card and the system still behaved the same - it attempts to boot Windows, reaches the splash screen, then goes to a black screen. I suspect I did not have my board properly supported on my acrylic sheet - the 4 cards caused it to sag but I didn't address it. Totally my fault if that was the case.

Managed to find the Z87 Classified board on Amazon for $200. Ordered one; it arrives today. Had to be a mistake on that price... it has gone back up today lol.


----------



## tsm106

If by squeeze you mean throw money at useless things, go for it.


----------



## bond32

Quote:


> Originally Posted by *tsm106*
> 
> If by squeeze you mean throw money at useless things, go for it.


Well, isn't that what we do here? Such is the price of a painfully expensive hobby...


----------



## tsm106

Quote:


> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> If by squeeze you mean throw money at useless things, go for it.
> 
> 
> 
> Well, isn't that what we do here? Such is the price of a painfully expensive hobby...

Go ahead, all you.


----------



## Klocek001

Looks like I'm gonna hold on to my 290 Tri-X for a little longer. I asked the guys in the 970 club which 970 to get - had several replies immediately. I asked whether AA has almost no negative influence on performance like with my 290 - didn't get a single reply.


----------



## Agent Smith1984

Hey, it's cheaper than race cars guys!!! LMAO


----------



## Agent Smith1984

Quote:


> Originally Posted by *Klocek001*
> 
> Looks like I'm gonna hold on to my 290 Tri-X for a little longer. I asked the guys in the 970 club which 970 to get - had several replies immediately. I asked whether AA has almost no negative influence on performance like with my 290 - didn't get a single reply.


How do you like that 290 Tri-X? I got a guy locally wanting to sell me his for $220.
My 280X died, and it's a fairly decent upgrade for me.....

How high were you able to clock it? Seems the card has no problem staying cool, so it's just silicon luck I guess, or are there other limitations with the 290? (i.e. VRM temps, power phases/limits, etc...)


----------



## Cyber Locc

Quote:


> Originally Posted by *bond32*
> 
> I'm a little confused... I need to wait to see how big the koolance block covers them I guess. I thought it was 100mm x 17mm, if that's the case then I would nee 4x http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html?tl=g8c487s1797
> 
> But that doesn't seem right...


That's the one; they're $15.99, not $19.99, and that will cover 2 VRMs. If Koolance has bigger strips, it really doesn't matter, as it's not needed; you just need to cover the VRMs.

@tsm Like I said, mine throttle at 70C; however, I'm still on air, so my cores are at 95C by that point, so it could be a false positive. And again, I have read on here and on the AMD forums that the cards throttle at 70C on VRM1. Whether that's true or not is not for me to say; all I can say is mine do, and that I have read they all do. However, as said here before, that could be overclocks failing etc., not necessarily a VRM temp problem at stock; again, I don't know.


----------



## tsm106

Quote:


> Originally Posted by *Klocek001*
> 
> Looks like I'm gonna hold on to my 290 Tri-X for a little longer. I asked the guys in the 970 club which 970 to get - had several replies immediately. I asked whether AA has almost no negative influence on performance like with my 290 - *didn't get a single reply*.


Lmao that is funny and there's your answer.


----------



## Cyber Locc

Quote:


> Originally Posted by *Klocek001*
> 
> Looks like I'm gonna hold on to my 290 Tri-X for a little longer. I asked the guys in the 970 club which 970 to get - had several replies immediately. I asked whether AA has almost no negative influence on performance like with my 290 - didn't get a single reply.


Dang, that makes me glad I decided to give AMD the chance to fix my audio plague lol. Most of the other people have gone to 970s; I'm not jumping ship yet.


----------



## Klocek001

Quote:


> Originally Posted by *Agent Smith1984*
> 
> How do you like that 290 Tri???? I got a guy locally wanting to sell me his for $220.
> My 280x died, and it's a fairly decent upgrade for me.....
> 
> How high were you able to clock it? Seems the card has no problem staying cool, so it's just silicon luck I guess, or are there other limitations with the 290? (IE: VRM temps, power phase/limits, etc...)


Last time I measured my temps with HWMonitor I got 57 degrees with the fan speed at 70% (2 hrs of FC3, 1080p Ultra). My 2500K got 60+ degrees, but the Tri-X never went over 60. In other games I play, the highest I noticed was 65, but the fan was on auto (it's not too noisy, I assure you). As far as OC, my card is already clocked at 1000 out of the box; I just raised power +50 and core to 1100 in CCC, no added vcore.
My card was 2nd hand but died after a month (the 1st owner used it for mining). I RMA'd it and got a new one after 2 weeks.


----------



## Agent Smith1984

You gotta FireStrike??


----------



## Klocek001

sorry mate, I don't do benchmarks.


----------



## Agent Smith1984

Don't need to with an overclocked 290


----------



## Klocek001




----------



## chiknnwatrmln

Dang, this thread is moving fast. As for which Fujis to get, make sure you get the right thickness. Different blocks may require different thickness pads.

I got this one here, and it was just enough to do the VRMs on 3 cards with EK blocks. http://www.frozencpu.com/products/17504/thr-185/Fujipoly_Ultra_Extreme_System_Builder_Thermal_Pad_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html?id=X9UrALqs&mv_pc=178


----------



## taem

Quote:


> Originally Posted by *ThijsH*
> 
> I've never seen VRM throttling under 105 degrees C. They might be less efficient, but throttling at 70C? I highly doubt it.


I doubt they throttle at 70C VRM also. Are we talking mem VRM here? Because if the mem VRM is at 70C, then the core VRM might throttle. But if the card throttled at 70C core VRM, that would mean the 290 is designed to throttle at load, because only a handful of custom 290s even run at load with VRMs under 70C.


----------



## FragZero

Note for people with a reference 290: the PowerColor PCS+ R9 290 BIOS works on reference cards (the PCS+ does seem to use the reference PCB).

Max temperature: 70 celsius
Max fan speed: seems to be 100%

Core: 1040MHz
Memory: 1350MHz

Ideal if you don't mind the noise; it has R9 290 device IDs so it works on any motherboard, and it offers 290X performance out of the box with no chance of throttling!


----------



## SeanEboy

Newb question... So, if trifire 290X does better than quadfire, should I sell off one of my cards? Or could I somehow put it to good use?

Also, I recall someone telling me that another platform would give me much better PCIe usage, or something to that effect..? Which mobo/CPU would best make use of trifire or quadfire?


----------



## Agent Smith1984

Quote:


> Originally Posted by *SeanEboy*
> 
> Newb question... So, if trifire 290x does better than quadfire.. Should I sell off one of my cards? Or, could I somehow put it to good use?
> 
> Also, I recall someone telling me that another platform would enable me much better usage of PCIE, or something to that effect..? Which mobo/cpu would best make usage of trifire, or quadfire?


If you have kids..... build a cheap i3 rig, throw the 290x in there, and get those little ones gaming!!


----------



## SeanEboy

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If you have kids..... build a cheap i3 rig, throw the 290x in there, and get those little ones gaming!!


No kids, and I already have a Mac mini as an HTPC... perhaps I should sell that off and use one of the 290Xs for a proper HTPC build. Meh, I also have an unlockable 290 sitting in my desk. What a first world problem.

Perhaps I should throw a decent rig together for my 60" DLP.... Decisions, decisions. I also have a Corsair H55 sitting around, and even had a 4930K at one point, but sold it. I'm trying to liquidate, and yet my cool answers always involve spending more money.


----------



## taem

Let me ask this again cos I'd really like to know.

Does a backplate protect against leaks from the cpu block and any other fittings that might hang above the gpu blocks? Because that's the only reason I'd get one. I'll leak test and all that of course but a $30 investment to help protect $700 worth of gpus in addition to better cooling on vrms might be worthwhile. But if it doesn't offer any leak protection then it's probably not worth it to me, I don't care about the aesthetics and I won't oc the cards too much cos I only have a 4670k so I don't need to eke out every single degree of performance.
Quote:


> Originally Posted by *cyberlocc*
> 
> Well, personal preference is always most important IMO.
> 
> When you're talking about VRM temps and the backplates, are we talking active or passive? He doesn't record active. He also himself states that the EK backplate has no thermal pad on the backside of the VRMs, where the AC does and is the only one that does, and adding said pads would make the difference, as your backplate is then a heatsink for the VRMs.
> 
> The only difference between the passive AC and the EK is that thermal pad; they're both aluminium, the same thickness, etc. That thermal pad is the difference. I.e., if you do what I suggest and put the Fujipoly on the block side and use the EK thermal pad on the backplate, the VRM temps will go down by about 15-20C. And this isn't just me saying this; there's a whole thread here on OCN reporting the same result, which puts it way ahead of the AC block and backplate. But again, we are splitting hairs, and changing the thermal pad on the AC may help it by a lot as well.
> 
> With that said, that is all about the AC backplate in passive mode; in active mode there is no competition. And if you like the AC block, then you should run active; however, as per stren, your core temps will go up quite a bit. From reviews of it I have seen, it goes up by about 5C on the core, which makes sense, as you're raising the delta and adding more heat to the water.


It's the passive he doesn't record.



But here is what he says:
Quote:


> Partly the backplate is so good because it cools so much of the PCB. The PCB gets hot because the chokes next to the VRM are not actively cooled and receive no airflow - their only heatsinking is through the PCB. Even a low impedance choke gets hot when 200A has to get through to the GPU core. Partly *the backplate does well I think because the screws seem to be a better length.* The block on its own seemed to run out of screw thread before the VRM section gets clamped to the block particularly well.
> 
> The block also has decent flow. *The $20 extra for the active backplate with the heatpipe is really a luxury. The passive backplate is already so good that it's unnecessary and the additional difference is small*, however if you want the absolute best performance then it is the one to get.


I was thinking I'd get just the AC block with slightly longer screws and try it that way first. If I get the backplate it will be the passive; he doesn't record the temps, but it's clear from this statement that the passive backplate is a top performer on VRM temps.

But depending on the answer to my question posted above your quote I'd go with the Watercool. Because this is sex:


----------



## Agent Smith1984

Ol' lady maybe? lol


----------



## Cyber Locc

Quote:


> Originally Posted by *taem*
> 
> Let me ask this again cos I'd really like to know.
> 
> Does a backplate protect against leaks from the cpu block and any other fittings that might hang above the gpu blocks? Because that's the only reason I'd get one. I'll leak test and all that of course but a $30 investment to help protect $700 worth of gpus in addition to better cooling on vrms might be worthwhile. But if it doesn't offer any leak protection then it's probably not worth it to me, I don't care about the aesthetics and I won't oc the cards too much cos I only have a 4670k so I don't need to eke out every single degree of performance.
> It's the passive he doesn't record.
> 
> 
> 
> But here is what he says:
> I was thinking I'd get just the AC block with slightly longer screws and try it that way first. If I get the block it will be the passive, he doesn't record the temps but its clear from this statement that the passive backplate is a top performer on vrm temps.
> 
> But depending on the answer to my question posted above your quote I'd go with the Watercool. Because this is sex:


Dude, you need to read that thread more, bud; I have read the whole thing many times. He is testing the passive abilities of the heatsink. The active heatsink is the same as the passive; they just took the heat pipe portion off, which is attachable to either. The EK block can't use the active heatsink, so all the backplate tests there are in passive mode.

He does test active in the very beginning, in the first AC backplate test; however, later results are tested via the passive plate, as the EK can't use active mode. He says that in the comments when discussing the EK block with the AC plate.

Also, really, either way it doesn't matter much, as he said himself:
"In reality the active part of the backplate is a marginal gain at best."
He goes on to say that the reason it cools better than the others, he thinks, is simply that there is thermal pad on the rear side of the VRMs; no other backplate has this, and adding it would in theory provide the same results.

Just FYI, what I'm saying and he says himself: unless it states active, it's being used passively in his results, the first test aside. What you keep linking is the passive test.


----------



## SeanEboy

Quote:


> Originally Posted by *taem*
> 
> Let me ask this again cos I'd really like to know.
> 
> Does a backplate protect against leaks from the cpu block and any other fittings that might hang above the gpu blocks? Because that's the only reason I'd get one. I'll leak test and all that of course but a $30 investment to help protect $700 worth of gpus in addition to better cooling on vrms might be worthwhile. But if it doesn't offer any leak protection then it's probably not worth it to me, I don't care about the aesthetics and I won't oc the cards too much cos I only have a 4670k so I don't need to eke out every single degree of performance.
> It's the passive he doesn't record.
> 
> 
> 
> But here is what he says:
> I was thinking I'd get just the AC block with slightly longer screws and try it that way first. If I get the block it will be the passive, he doesn't record the temps but its clear from this statement that the passive backplate is a top performer on vrm temps.
> 
> But depending on the answer to my question posted above your quote I'd go with the Watercool. Because this is sex:


In addition, you should definitely check out this thread if you're going to be watercooling:
http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures

Also, I have backplates on all my cards; they add stability to the card, along with cooling. They're not really a 'protective cowl' of sorts, though.


----------



## NathG79

Hi folks.
I'm currently running a 4770K on a Z87 Sabertooth with 2x 7970s in crossfire, remounted on Kraken G10s.
I have a be quiet! 1200W PSU.

Just made an impulse buy and got the re-released Sapphire 290X Vapor-X OC 8GB, and coupled it with the same model with 4GB of VRAM.
2 questions:
1. Will my 1200W power these beasts with headroom?

2. Am I likely to run into any issues crossfiring these cards?

I only ask because of the different video memory.
They have exactly the same clocks.
I just couldn't justify £389 x2, so went for the cheaper 4GB model to crossfire with.
My plans are to sell the 7970s with the G10 brackets and Antec Kühlers attached.
Question 3: How much do you think the 7970s are worth with the AIO setup included? I still have the original XFX DD cooler, box and some of the accessories.

Thanks in advance.


----------



## ThijsH

Quote:


> Originally Posted by *taem*
> 
> I doubt they throttle at 70c vrm also. Are we talking mem vrm here? Because if mem vrm is at 70c then core vrm might throttle. But if the card throttled at 70c core vrm then that would mean the 290 is designed to throttle at load. Because only a handful of custom 290s even run at load with vrms under 70c.


Was talking about VRM1, yeah. VRM2 is never really a problem; it usually stays very cold. I could understand if the mem VRM throttles at that temp.
Quote:


> Originally Posted by *cyberlocc*
> 
> It's not really the temp so much as that AMD coded it to throttle at 70C; that's by design. If you don't believe me, search through this thread; it's been discussed quite a few times.


I can't be arsed to search through ~3.2k posts. Could you be so kind as to provide me with sources and/or proof / articles etc. on the matter if you have any? Would make an interesting evening read.


----------



## taem

Quote:


> Originally Posted by *cyberlocc*
> 
> Dude, you need to read that thread more, bud; I have read the whole thing many times. He is testing the passive abilities of the heatsink. The active heatsink is the same as the passive; they just took the heat pipe portion off, which is attachable to either. The EK block can't use the active heatsink, so all the backplate tests there are in passive mode.


He must have posted that either on the Extreme Systems thread or the one here; he doesn't mention it in the review on his own site at Extreme Rigs. Weird terminology he's using then, though, since the active and passive backplates are the same thing; the active just includes the piping and the water channel block. If you take that off, then it's the passive backplate, not the active plate in passive mode.

Incidentally, this is even better for me, since I was going with the passive backplate anyway. I assumed it was a few degrees worse than what he listed as the AC backplate, but if it's even better, well, great.

Re the pads thread, yeah, I have it bookmarked, but I'm going to try it with the stock pads first.


----------



## tsm106

Quote:


> Originally Posted by *SeanEboy*
> 
> In addition, you should definitely check out this thread if you're going to be watercooling:
> http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures
> 
> Also, I have backplates on all my cards, it adds stability to the card, along with cooling. It's not really a 'protective cowl' of sorts.


It depends on the backplate. The EK full-PCB plates do add a big degree of protection from debris damage and water damage. Btw, you don't need to read a thread to know there's a difference between the stock 5 W/mK pads and the Ultra Extreme 17 W/mK pads. Do you need someone to tell you there's a difference using pads with over three times the heat transfer ability? As I wrote in that actual thread, the pads are not without their downsides: they dry up and become brittle (think one-time application only), the Ultras are obscenely expensive, and they are only worthwhile if you are a bencher. If you are not feeding a lot of volts and running high clocks, there is no point in reducing your VRM temps from 65C avg to 45C avg. That said, they are a waste to use on the memory ICs, since those are easily handled by the stock pads.
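For a sense of what the conductivity jump actually buys, here's a minimal sketch with assumed numbers (roughly 30 W of VRM loss through a 100 x 15 mm, 1 mm-thick pad; not measurements from any card): the conduction drop across the pad is delta_T = q * t / (k * A).

```python
# Temperature drop across a thermal pad: delta_T = q * t / (k * A).
# The wattage, pad area, and thickness here are assumed for illustration.

def pad_delta_t(q_w: float, k_w_mk: float, t_m: float, area_m2: float) -> float:
    """Conduction temperature drop (C) across a pad of conductivity
    k_w_mk (W/mK), thickness t_m (m), and contact area area_m2 (m^2)."""
    return q_w * t_m / (k_w_mk * area_m2)

area = 0.100 * 0.015   # 100 mm x 15 mm VRM strip
for k in (5.0, 17.0):  # stock-type pad vs Fujipoly Ultra Extreme
    print(f"{k} W/mK -> {pad_delta_t(30, k, 0.001, area):.1f} C across the pad")
# 5 W/mK -> 4.0 C; 17 W/mK -> 1.2 C
```

In other words, under these assumptions, pure pad conduction accounts for only a few degrees; which fits the point that the upgrade mostly pays off for benchers chasing every last degree.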


----------



## Cyber Locc

Quote:


> Originally Posted by *taem*
> 
> He must have posted that either on the extreme systems thread or the one here, he doesn't mention that on the review on his own site at extreme rigs. Weird terminology he's using though then since the active and passive backplanes are the same thing, the active just includes the piping and the water channel block. If you take that off then it's the passive backplate, not active plate in passive mode.
> 
> Incidentally this is even better for me since I was going with the passive backplate anyway, I assumed it was a few degrees worse than what he listed as ac backplate but if it's even better, we'll great.
> 
> Re the pads thread, yeah I have it bookmarked but I'm going to try it with the stock pads first.


Yes, it was on Extreme Systems, when someone gave him the idea to use the backplate on the EK block. Although his graphs there that show active and passive modes show the difference is like a degree, tops.
And again, his logic for why is that on the AC backplate there is thermal pad on the backside of the VRMs (where there are also a few VRMs as well); no other backplate has this strip, and adding this strip to any backplate will surely drop the temps of the VRMs, as you've just put a heatsink on them that wasn't there before.

And ya, just like tsm said, it might help if there's a drip, but nothing is going to ensure a leak won't fry the system; that's up to luck.


----------



## Cyber Locc

Quote:


> Originally Posted by *ThijsH*
> 
> Was talking about VRM1, yeah. VRM2 is never really a problem; it usually stays very cold. I could understand if the mem VRM throttles at that temp.
> I can't be arsed to search through ~3.2k posts. Could you be so kind as to provide me with sources and/or proof / articles etc. on the matter if you have any? Would make an interesting evening read.


It may have been VRM2; I don't remember precisely. I do remember it being on this thread and some others, both here and on the AMD forums; I will try to find it. I do remember the people discussing it said the problem was also found in the 7000 series, which I don't know about; these are my first AMD cards.


----------



## taem

Quote:


> Originally Posted by *cyberlocc*
> 
> Yes, it was on Extreme Systems, when someone gave him the idea to use the backplate on the EK block. Although his graphs there that show active and passive modes show the difference is like a degree, tops.
> And again, his logic for why is that on the AC backplate there is thermal pad on the backside of the VRMs (where there are also a few VRMs as well); no other backplate has this strip, and adding this strip to any backplate will surely drop the temps of the VRMs, as you've just put a heatsink on them that wasn't there before.
> 
> And ya, just like tsm said, it might help if there's a drip, but nothing is going to ensure a leak won't fry the system; that's up to luck.


Hmm. So instead of the $30, I could just get some regular Fuji pads and stick the heatsinks I remove from the PCB (when putting on the waterblock) on the back of the card. Save a lot of money and probably get better temps. Yes?


----------



## Cyber Locc

Quote:


> Originally Posted by *taem*
> 
> Hmm. So instead of the $30, I could just get some regular Fuji pads and stick the heatsinks I remove from the PCB (when putting on the waterblock) on the back of the card. Save a lot of money and probably get better temps. Yes?


From what I have seen yes.


----------



## ZealotKi11er

I was hitting ~80C with the stock EK pads. With the Fuji 11 W/mK pads I hit ~55C.


----------



## SeanEboy

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I was hitting ~80C with the stock EK pads. With the Fuji 11 W/mK pads I hit ~55C.


Cooler cards are happier cards. I would never reuse thermal anything, even a pad.


----------



## amptechnow

I have the Watercool block and backplate on my XFX reference 290, and at 1135/1625 my core temp and both VRMs sit in the mid 50s under load from gaming, and the mid 20s at idle. During long, very stressful benches they may hit the very low 60s. I love how the block and backplate look too.

edit: and I have no pads installed, well, except for whatever pads came with the block.


----------



## Bertovzki

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> This is just plain wrong. Maybe at low clocks it doesn't matter, but as the VRM temps get higher they use more power and get inherently less stable.
> 
> When I was on air and my VRMs would get to 75+, I would run into instability. On water, with Fujis on both cards, they stay below 50C after hours of gaming, and my cards are rock solid stable at a lower voltage.
> 
> I have never seen cards throttling due to VRM temps, though. Whether I have not seen them get hot enough or just not paid attention, I'm not sure.
> 
> Also, with Fujis you gain overclocking headroom due to lower temps. Whether or not it's worth the $15 is up to the end user, but if I'm spending $3k+ on my PC, I'm going to do it right and get the lowest temps possible.


Can you get these Fuji pads at Frozen? I don't like PPCs; they have a very bad attitude. I live in New Zealand; I've not done a search here yet, but will be ordering from Frozen or Amazon again soon.
The way I see it is I've not started the build yet, so if it's worth doing the Fujis before I start then I will, and $15 is no big deal for me.

I should also add, I'm not worried about OCs and will have just 1 card for a while; I'd like to overclock later though if I need to.


----------



## xer0h0ur

I bought my Fujipoly Ultra Extreme pads from frozencpu.com so yes they have them.


----------



## Bertovzki

Yeah, I was playing catch-up on the thread and didn't see all the references to Fuji and Frozen until later. Thanks.


----------



## taem

Quote:


> Originally Posted by *amptechnow*
> 
> i have the watercool block and backplate on my xfx reference 290 and at 1135/1625 my core temp and both vrm's sit in the mid 50's under load from gaming and mid 20's idle. during long very stressful benches they may hit very low 60's. i love how the block and backplate look too.
> 
> edit: and i have no pads installed. well except for whatever pads came with block.


Anything you don't like about the Watercool block? Trying to choose between that and the Aquacomputer.


----------



## battleaxe

I personally love how the introduction of the GTX970/980 has brought this thread back to life. Now all of a sudden guys/gals like us are nabbing up the 290's or realizing what a great card they really are. Yeah, AMD is such a terrible company. I'll keep my 290's thank you. What a great bargain these have now become. Who knew?


----------



## amptechnow

Quote:


> Originally Posted by *taem*
> 
> Anything you don't like about the watercool block? Trying to choose between that and the aquacomputer.


I have no complaints. I think it comes down to personal preference and style. I bought it without reading any reviews, just because of how it looked and the colors. I got it right when it came out, took a gamble, and it paid off. I will be getting another for my PCS+ eventually, since I think I have a reference-PCB model.

I do wish it covered the card from end to end so no PCB was showing, but it's not something I dislike; I just kind of like the look of a fully covered card with a full backplate. But when do you really see that side of the card in a normally oriented case?

I say just go with whichever one looks the best in your case. All the temps are close, and a, say, 5-degree difference only really matters if you are trying to break a world record or something. Plus, get some good pads and that will take off some degrees too. I just used the ones that came with the block and used none on the backplate, and I have no throttling or temp issues, even with a decent OC and my 3770K at 4.8 and 1.41 volts on a single 240mm rad....


----------



## amptechnow

Quote:


> Originally Posted by *battleaxe*
> 
> I personally love how the introduction of the GTX970/980 has brought this thread back to life. Now all of a sudden guys/gals like us are nabbing up the 290's or realizing what a great card they really are. Yeah, AMD is such a terrible company. I'll keep my 290's thank you. What a great bargain these have now become. Who knew?


Good point...

I love my 290s!!! And have since day 1, or even since I first saw them revealed. I was depressed that the 980 is all over the bench charts, but I care more about gaming than benching, and they have done a wonderful job for me and have been a great value.

I've gotten many different periods of enjoyment from them... getting my first 290, then going water, then going crossfire (next will be getting the 2nd under water), trying different BIOSes.... And even though they are just 290s, they will hang with any other card....


----------



## xer0h0ur

Quote:


> Originally Posted by *battleaxe*
> 
> I personally love how the introduction of the GTX970/980 has brought this thread back to life. Now all of a sudden guys/gals like us are nabbing up the 290's or realizing what a great card they really are. Yeah, AMD is such a terrible company. I'll keep my 290's thank you. What a great bargain these have now become. Who knew?


I don't care about what the other guys are doing. I only care about getting better drivers, STAT! Preferably I wouldn't have to wait 5 months between WHQL-approved drivers that turn out to need betas immediately after release because they don't work properly. Even worse, crossfire scaling has gone completely to hell in a handbasket. Never mind having to disable crossfire in DX9 games due to stuttering.


----------



## pdasterly

Quote:


> Originally Posted by *taem*
> 
> Anything you don't like about the watercool block? Trying to choose between that and the aquacomputer.


Heatkiller backplate has no thermal pads


----------



## taem

Quote:


> Originally Posted by *pdasterly*
> 
> Heatkiller backplate has no thermal pads


Yeah. And it's a vented plate, and since the main reasons I'd get a backplate are some protection against leaks from above and better VRM temps, I would not get this one. My choice is between the Watercool block only and the AC block + passive backplate.

Quote:


> Originally Posted by *amptechnow*
> 
> i have no complaints. i think it comes down to personal preference and style. i bought it without reading any reviews and just bc of how it looked and colors. i got it right when it came out and took a gamble and it paid off. i will be getting another for my pcs+ eventually since i have a reference pcb model i think


You can tell if your pcs+ is reference or revised custom by looking at the pcb right above the PCIE connector (it can be seen with the cooler on): the reference is stamped R29F and the revised is R29FA. EK has a block for the R29FA if that's what you have, though of course you wouldn't be able to use a bridge if you mix different blocks.

I have a reference pcs+, seems almost a shame to put it under water since it's such a beast of a cooler. I can only hit 1130/1450 at stock volt tho (which is +50mv) and I need to go to +100mv to get to 1160/1500, and to +200mv for 1200/1675. How's yours on oc?


----------



## pdasterly

Quote:


> Originally Posted by *taem*
> 
> Yeah. And it's a vented plate, and since the main reason I'd get a backplate is if it provided some protection against leaks from above as well as better vrm temps, I would not get this. My choice is between watercool block only and AC block + passive backplate.


I just had a small water leak from my mosfets; the EK flat backplate caught all the water even though it's not designed to (got lucky). The Heatkiller has holes in it, so it's mostly for appearance and adds a little stability against the card flexing.


----------



## Widde

Benched with the latest beta drivers now ^^ Graphics score seems to have gotten about 500 points higher at slightly lower clocks 1110/1350 http://www.3dmark.com/3dm/4608183? compared to my previous 1125/1350 http://cdn.overclock.net/9/90/90305833_3uzyw.jpeg

Too bad it seems I've lost the lottery both when it comes to gpus and my cpu :/ Hopefully next build will be better


----------



## kizwan

Quote:


> Originally Posted by *SeanEboy*
> 
> Newb question... So, if trifire 290x does better than quadfire.. Should I sell off one of my cards? Or, could I somehow put it to good use?
> 
> Also, I recall someone telling me that another platform would enable me much better usage of PCIE, or something to that effect..? Which mobo/cpu would best make usage of trifire, or quadfire?


I think it depends on the games; you may not get any performance increase at all, or even worse performance, when running quadfire. If I were you I wouldn't sell the 4th card, because you could put it in a new rig if you build one. The GPU is the thing I upgrade least frequently because I'm not a hardcore gamer. I usually sell the whole rig if I want to let it go. You can configure trifire without physically removing the 4th card, right?

The only meaningful difference between quadfire-capable motherboard A and quadfire-capable motherboard B is the number of PCIe lanes. If quadfire performed poorly on A, there's little chance it'll work any better on B even if B has more lanes.


----------



## iCrap

Hey what do you guys have set for overclocks?
I have MSI afterburner and have set my sapphire 290 (on water) to 1100mhz core / 1300mhz memory and 30 percent power

anything higher and it seems to crash after a while in games. Does that seem about the limit of what most cards can do or am I doing it wrong?


----------



## Spectre-

Quote:


> Originally Posted by *iCrap*
> 
> Hey what do you guys have set for overclocks?
> I have MSI afterburner and have set my sapphire 290 (on water) to 1100mhz core / 1300mhz memory and 30 percent power
> 
> anything higher and it seems to crash after a while in games. Does that seem about the limit of what most cards can do or am I doing it wrong?


Running 1100/1400 24/7, 50% power, stock vcore (1.22V).
Looped it 20 times on Metro LL and Heaven.


----------



## iCrap

have you tried to push the core clock any higher? even 1120mhz crashes after like 30 minutes (was playing ac4)


----------



## taem

Quote:


> Originally Posted by *iCrap*
> 
> have you tried to push the core clock any higher? even 1120mhz crashes after like 30 minutes (was playing ac4)


Are you increasing voltage? 1100 core without a voltage increase isn't bad at all IMHO.


----------



## iCrap

Quote:


> Originally Posted by *taem*
> 
> Are you increasing voltage? 1100 core without a voltage increase isn't bad at all IMHO.


Nah I haven't.
Actually I don't even see a way to increase it via afterburner. is there a way?


----------



## SeanEboy

Quote:


> Originally Posted by *iCrap*
> 
> have you tried to push the core clock any higher? even 1120mhz crashes after like 30 minutes (was playing ac4)


Dat avatar. Dude, I did indeed crap. So good.

In other news.... I did a drive wipe (using wet ones, came out clean, checked with a mirror) - no, using DDU. Anyway, got that 4th card to wake up; I think it was a "just the tip" situation on one of the power plugs. Having said that, CCC and device manager both show the card, so I guess it's time to play!


----------



## taem

Quote:


> Originally Posted by *iCrap*
> 
> Nah I haven't.
> Actually I don't even see a way to increase it via afterburner. is there a way?


Go to settings and in the first tab check unlock voltage control. Restart AB.

If you have a Sapphire card any reason you're not using Trixx? Trixx allows voltage increase to +200mv and if you're on water you might want to go higher than the +100 AB allows. Not the most stable app though.

Anyway if you can hit 1100 on stock voltage I'd guess you'll be able to hit 1200 on that card. More likely than not anyway.


----------



## Bertovzki

I'm curious. I'm planning on just one R9 290, maybe 2 one day, or maybe not.

Why do most of you guys have more than one card? I would have thought one 290 would run any high-res game at full detail, smoothly, even at 4K, with 2 cards just adding some extra performance at 4K and no need at all for more cards. So why? Is it because some guys just want to chase benchmarks, or is there a reason behind it?

I like high resolutions and smooth gameplay, so that's why I got a 290. Am I going to see any difference on screen in FPS after 1 card, in something like Project CARS, IL-2 Sturmovik: Battle of Stalingrad or Assetto Corsa, for example?







at stock no OC


----------



## iCrap

One 290 is not enough for my 3x 1440p setup for sure. But if you are running a single 1440p i think a single 290 is fine.

4k and eyefinity 1440 would need 2+. i get fps drops quite a bit.

I won't get another 290 though. I'll upgrade when the new cards drop.


----------



## Spectre-

Quote:


> Originally Posted by *iCrap*
> 
> have you tried to push the core clock any higher? even 1120mhz crashes after like 30 minutes (was playing ac4)


With +60mV I can push 1170/1500.

Anything beyond that needs +120mV or more.


----------



## taem

Quote:


> Originally Posted by *Bertovzki*
> 
> I'm curious. I'm planning on just one R9 290, maybe 2 one day, or maybe not.
> 
> Why do most of you guys have more than one card? I would have thought one 290 would run any high-res game at full detail, smoothly, even at 4K, with 2 cards just adding some extra performance at 4K and no need at all for more cards. So why? Is it because some guys just want to chase benchmarks, or is there a reason behind it?
> 
> I like high resolutions and smooth gameplay, so that's why I got a 290. Am I going to see any difference on screen in FPS after 1 card, in something like Project CARS, IL-2 Sturmovik: Battle of Stalingrad or Assetto Corsa, for example?
> 
> 
> 
> 
> 
> 
> 
> at stock no OC


I have a single 290 and a 1440p display. I can't max most recent titles and maintain 60fps. I had crossfire for a while, that can't max everything on a 1440p either, not with a 4670k anyway. I don't know what it would take to max a title like Metro Last Light and maintain 60fps.


----------



## Bertovzki

Quote:


> Originally Posted by *taem*
> 
> I have a single 290 and a 1440p display. I can't max most recent titles and maintain 60fps. I had crossfire for a while, that can't max everything on a 1440p either, not with a 4670k anyway. I don't know what it would take to max a title like Metro Last Light and maintain 60fps.


Ok thanks. I have a 4790K, and if I have 2 R9 290X Tri-X OC under water then maybe I'd be pretty good for high-res gaming, with only M.2 or EVO SSDs. I was thinking the new M.2 from Samsung, the XP941, plus SSDs.


----------



## Klocek001

Quote:


> Originally Posted by *taem*
> 
> I have a single 290 and a 1440p display. I can't max most recent titles and maintain 60fps. I had crossfire for a while, that can't max everything on a 1440p either, not with a 4670k anyway. I don't know what it would take to max a title like Metro Last Light and maintain 60fps.


what is your cpu clocked ?


----------



## Bertovzki

Quote:


> Originally Posted by *Klocek001*
> 
> what is your cpu clocked ?


I was hoping I'd be able to get good performance and FPS with one card and no OC on the CPU or GPU; maybe I need a mild stable OC on both.


----------



## Klocek001

that might be the way. remember to disable PhysX in LL too, that's a huge performance hit.


----------



## Bertovzki

Quote:


> Originally Posted by *iCrap*
> 
> One 290 is not enough for my 3x 1440p setup for sure. But if you are running a single 1440p i think a single 290 is fine.
> 
> 4k and eyefinity 1440 would need 2+. i get fps drops quite a bit.
> 
> I won't get another 290 though. I'll upgrade when the new cards drop.


I do forget about the fact I may get 3 monitors instead of a 4K TV or monitor, and maybe some Oculus Rift thing when there is a good one on the market, so yeah, I'll probably look at 2 cards eventually.

Quote:


> Originally Posted by *Klocek001*
> 
> that might be the way, remember to disable physiX in LL too, that's a huge performance hit.


What is LL? I don't have my card yet; I'll order my 290 either this week or in 2 weeks' time.

Thanks guys


----------



## nightfox

Quote:


> Originally Posted by *Bertovzki*
> 
> I do forget about the fact I may get 3 monitors instead of a 4K TV or monitor , and maybe some oculus rift thing when there is a good one on the market , so yeah probably will look at 2 cards eventually.
> What is LL ? , I don't have my card yet , either this week or in 2 weeks time I order my 290.
> 
> Thanks guys


I am guessing that he means Metro Last Light


----------



## taem

Quote:


> Originally Posted by *Klocek001*
> 
> what is your cpu clocked ?


Right now it's 4.2; got a new mb and haven't had time to tweak. My old settings on the Z87 Pro are not stable on the Maximus VII Gene.

But I've been running at 4.7 and that's what I was talking about. I feel like even at that decent clock the 4670K is holding back 290 xfire, because my scaling in games and benches was only 60-70%. Folks with i7s like the 3770K seem to get 80%+ scaling, at least in benches. That's why I gave up on xfire a while back, though I plan to go back to it as I set up my loop. Thinking about picking up a 4790K, but I wonder if that's worth it vs. waiting a bit and then going X99 with DDR4 and an x16/x16-capable chip. Though I seriously doubt x8/x8 holds me back on a single 1440p. Still, maybe I'd get better scaling with a better chip.
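For anyone wondering how scaling percentages like these are worked out, here's a quick back-of-the-envelope sketch (the benchmark scores below are made-up placeholders, not real results):

```python
# Rough CrossFire scaling estimate: the second card's effective contribution
# relative to a single card. Plug in your own Fire Strike graphics scores;
# the numbers here are hypothetical placeholders.

def scaling_percent(single_score: float, xfire_score: float) -> float:
    """Return the second card's effective contribution as a percentage."""
    return (xfire_score / single_score - 1.0) * 100.0

single = 10000   # hypothetical single-290 graphics score
xfire = 16500    # hypothetical CrossFire graphics score

print(f"Scaling: {scaling_percent(single, xfire):.0f}%")  # -> Scaling: 65%
```

80%+ is the kind of scaling the i7 owners report; 60-70% points at a CPU or driver bottleneck rather than the cards themselves.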

I'm fine with the 290 though, I feel like I can skip the 980 gen and maybe the one after that as well. 290 is a great card and at $300 it's such a steal.


----------



## YellowBlackGod

Some games are pretty tough to beat. For example Ryse: Son of Rome, with all effects and settings maxed, can't be handled even by the 290X's equal, the 780 Ti, which hits just 30 fps at 1080p.


----------



## Bertovzki

Quote:


> Originally Posted by *YellowBlackGod*
> 
> Some games are pretty tough to beat. For example Ryse: Son of Rome, with all effects and settings maxed, can't be handled even by the 290X's equal, the 780 Ti, which hits just 30 fps at 1080p.


yeah, true. This however says more to me about lazy game programming and a poor game engine. From what I see it is common now for developers to just not spend the time writing code that uses CPU power efficiently and uses more threads, because they don't need to; they just rely on us buying brute-force CPUs and GPUs to do the job. Some game developers create awesome engines that need considerably less power to do the same thing. I just hope the games I am into will have this sorted.


----------



## YellowBlackGod

Yes, indeed... These console ports ruin the PC gaming experience (locked FPS, more RAM, poor coding), just like the ZX Spectrum did to the Amstrad CPC back in the '80s.


----------



## amptechnow

i run stock clocks for programs and web surfing. i run 1135/1626 for gaming and benching, and sometimes bump the core to 1150 for benches. i usually use afterburner, it seems more stable for me, and i set mV, power limit and aux all to max (too lazy to find the lowest working settings).

i just beat ryse: son of rome last week. my 2 290's ran it fine at ultra settings with everything maxed. i never checked frame rates. sometimes if a game seems smooth i dont bother running fraps to see the frame rate, because sometimes a game will run lower than id like, and even though it seemed smooth, seeing the low frame rates then bothers me. i have a 144hz panel so i try to get to 120-144 fps because its so smooth. i run with lightboost off since it washes out color and i dont have much ghosting anyway. i always turn off motion blur because i just dont like it.

just my 2 cents... remember all cards are different, and what works for one may not work for you. just spend some time finding your card(s) sweet spot and enjoy.


----------



## amptechnow

Quote:


> Originally Posted by *YellowBlackGod*
> 
> Yes, indeed....These consoles ports ruin the PC gaming experience (locked FPS, more RAM,poor coding) , just like ZX Spectrum did to Amstrad CPC back to the 80's.


so true. i remember when they used to make games just for the pc and they were awesome. waiting for that to happen again. the pc gaming market is still pretty big; i dont know why companies arent taking advantage of it and building good games. if someone made a great pc-exclusive game that capitalized on our architecture and hardware, i think most pc gamers would buy it. its so sad some of us have so much power and cant unleash it because of these crappy ports.

consoles are just a way bigger market. most people want that brainless put-in-a-disc-and-play experience, and most dont even know how different pc gaming is. in the past 2 years i have had several console gamers bring me their computers for repairs, virus removals or upgrades, and they got a chance to see my own setup. a few i let their kids play on it while i finished up their repairs. they were all amazed at how much better it looked and played than their xbox/ps's. several of them later bought a gaming pc from us, either for themselves or their kids. i think most hear "pc gaming" and think about their crappy laptop or best-buy-bought hp desktop and figure gaming on that cant be that great. a lot of people outside our community dont even know pcs like ours exist. ive had clients see some of my rigs and be blown away, or not even know what they were looking at. same goes for eyefinity. people just dont know whats out there or how good it can be. we need to educate more console gamers, in a good/nice way, and show them what they have been missing.

sorry about that. ive just been really frustrated with the pc gaming situation lately and these terrible ports.


----------



## Widde

Quote:


> Originally Posted by *amptechnow*
> 
> so true. i remember when they used to make games just for the pc and they were awesome. waiting for that to happen again. the pc gaming market is still pretty big; i dont know why companies arent taking advantage of it and building good games. if someone made a great pc-exclusive game that capitalized on our architecture and hardware, i think most pc gamers would buy it. its so sad some of us have so much power and cant unleash it because of these crappy ports.
> 
> consoles are just a way bigger market. most people want that brainless put-in-a-disc-and-play experience, and most dont even know how different pc gaming is. in the past 2 years i have had several console gamers bring me their computers for repairs, virus removals or upgrades, and they got a chance to see my own setup. a few i let their kids play on it while i finished up their repairs. they were all amazed at how much better it looked and played than their xbox/ps's. several of them later bought a gaming pc from us, either for themselves or their kids. i think most hear "pc gaming" and think about their crappy laptop or best-buy-bought hp desktop and figure gaming on that cant be that great. a lot of people outside our community dont even know pcs like ours exist. ive had clients see some of my rigs and be blown away, or not even know what they were looking at. same goes for eyefinity. people just dont know whats out there or how good it can be. we need to educate more console gamers, in a good/nice way, and show them what they have been missing.
> 
> sorry about that. ive just been really frustrated with the pc gaming situation lately and these terrible ports.


Then we also have the companies pushing for 30 fps across the board, just because their precious consoles are already outdated and obsolete. 30 fps isn't acceptable.


----------



## amptechnow

Quote:


> Originally Posted by *Widde*
> 
> Then we also have the companies pushing for 30 fps across the board, just because their precious consoles are already outdated and obsolete. 30 fps isn't acceptable.


i know! so sad. thats what i was explaining to someone the other day. you buy a console and it runs on the same exact hardware until the next generation comes out; no upgrades for years. and the way technology, coding and games advance so quickly, they are outdated almost as soon as they are released. a pc can be upgraded pretty much as often as your wallet will allow...

but 30 fps is totally unacceptable!!!


----------



## Klocek001

If you have a 120Hz or 144Hz display and set your vsync to 1 frame, can you hit 100 fps for example, or does it automatically drop to 60 if it can't maintain 120?

I'm just wondering if spending extra for an nvidia card + G-Sync display even makes any sense with a very powerful card like the 290 and a 120Hz 1080p display....
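A rough way to model the question: with plain double-buffered vsync, the frame rate snaps down to the next integer divisor of the refresh rate whenever the GPU misses a refresh. This is a simplified sketch (triple buffering and adaptive vsync behave differently), with illustrative numbers only:

```python
# Effective fps under strict double-buffered vsync: the displayed rate is
# refresh_hz / n for the smallest n whose rate the GPU can actually sustain.

def vsync_fps(refresh_hz: int, raw_fps: float) -> float:
    """Frame rate after vsync quantization (simplified model)."""
    n = 1
    while refresh_hz / n > raw_fps:
        n += 1
    return refresh_hz / n

for raw in (150, 100, 70, 45):
    print(f"{raw:>3} fps raw -> {vsync_fps(120, raw):.1f} fps on a 120 Hz panel")
```

So under this model, rendering 100 fps on a 120 Hz panel does drop you to 60 rather than holding 100, which is the gap variable-refresh tech like G-Sync/FreeSync is meant to close.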


----------



## Fickle Pickle

Quote:


> Originally Posted by *amptechnow*
> 
> so true. i remember when they used to make games just for the pc and they were awesome. waiting for that to happen again. the pc gaming market is still pretty big, i dont know why companies arent taking advantage of it and building good games. if some one made a great pc exclusive game and it capitalized on our architecture and hardware i think most pc gamers would buy it. its so sad some of us have so much power and cant unleash it bc of these crappy ports.
> 
> consoles are just a way bigger market. most people want that brainless put in a disk and play experience. and most dont even know how different pc gaming is. in the past 2 years i have had several console gamers bring me their computers for repair, virus removals, or upgrades and they got a chance to see my one setup. a few i let their kids play on it while i finished up their repairs. they were all amzaed at how much better it looked and played then their xbox/ps's. several of them later bought a gaming pc from us either for them or their kids. i think most hear pc gaming and thionk about their crappy laptop or best buy bought hp desktop and think, well gaming on that cant be that great. its small or slow or whatever. a lot of people outside our community dont even know pc's like ours exist. ive had clients see some of my rigs and were blown away or didnt even know what they were looking at. same goes for eyefinity. people just dont know whats out there or how good it is or can be. we need to educate more console gamers, in a good/nice way. and show them what they have been missing.
> 
> sorry about that. ive just been really frustrated with the pc gaming situation lately and these terrible ports.


There are so many more varieties of games out now than ever before on PC, one does not need to necessarily play just AAA titles. There's not enough time in the day to play all the games one would want to play. I love my gaming PC, but I also enjoy having my PS3 and PS4 around for exclusives. There has never been a better time to be a gamer. Literally an endless supply of games of all genres to play on all platforms.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Fickle Pickle*
> 
> There are so many more varieties of games out now than ever before on PC, one does not need to necessarily play just AAA titles. There's not enough time in the day to play all the games one would want to play. I love my gaming PC, but I also enjoy having my PS3 and PS4 around for exclusives. *There has never been a better time to be a gamer.* Literally an endless supply of games of all genres to play on all platforms.


So so true....
People should not lose sight of that. The country is economically headed back in the right direction as a whole....
People are getting back to work, and getting back to being consumers, not just survivalists.... I know this first hand.
I got 99 problems but a glitch ain't one......


----------



## Klocek001

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So so true....
> People should not lose sight of that. The country is economically headed back in the right direction as a whole....
> People are getting back to work, and getting back to being consumers, and not just survivalist.... I know this first hand.
> I got 99 problems but a glitch ain't one......


R U pickin up that TriX ?


----------



## Agent Smith1984

I'm on the fence now with my 280x working again...... what do you think?

Sell the 280x for $140 or so, add $60 and have a 290 Tri-X that should clock pretty well?

Keep in mind, my 280x is within 10% of most factory overclocked 290's with my overclocks.


----------



## pdasterly

Pc> ps4> xbox one


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm on the fence now with my 280x working again...... what do you think?
> 
> Sell the 280x for $140 or so, add $60 and have a 290 Tri-X that should clock pretty well?
> 
> Keep in mind, my 280x is within 10% of most factory overclocked 290's with my overclocks.


Me? In a heartbeat.


----------



## LandonAaron

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm on the fence now with my 280x working again...... what do you think?
> 
> Sell the 280x for $140 or so, add $60 and have a 290 Tri-X that should clock pretty well?
> 
> Keep in mind, my 280x is within 10% of most factory overclocked 290's with my overclocks.


If it were me I would get a 290X, since like you said it would only be $60 more. Also it seems most people can hit about 1100/1400 with +100mV or 1200/1600 with +200mV on the 290X, but you will need a water cooling setup or an open-air cooler or you will likely go deaf.


----------



## amptechnow

Quote:


> Originally Posted by *Klocek001*
> 
> If you have a 120Hz or 144Hz display and set your Vsync to 1 frame, can you hit 100 fps for example or does it automatically drop to 60 if it can't maintain 120 ?
> 
> I'm just wondering if spending extra for a G-sync + nvidia card display even makes any sense with a very powerful card like 290 and a 120Hz 1080p display....


not really sure, I never use vsync. but i THINK it does drop to 60, then 30. i could be wrong.


----------



## PillarOfAutumn

Can someone please tell me what the power draw is of two Crossfired 290s during normal gaming and during benchmarking? Overclocked and stocked?

I currently have the Seasonic x-750 KM3 PSU and I'm wondering if this will be enough to run an i5-3570k OCed to 4.0 and two stock 290s. Otherwise, I'm looking to upgrade to the Seasonic 860 Platinum PSU.
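As a very rough sanity check, here's a power-budget sketch. The board-power figures are ballpark assumptions from typical reviews (roughly 250-300 W per stock 290 under load, ~100 W for an overclocked i5), not measurements of any particular rig:

```python
# Crude DC load estimate for a CrossFire build: GPUs + CPU + everything else
# (drives, fans, motherboard). All wattages are ballpark assumptions.

def total_draw(gpu_watts: int, gpu_count: int, cpu_watts: int,
               rest_watts: int = 75) -> int:
    """Estimated total system draw in watts."""
    return gpu_watts * gpu_count + cpu_watts + rest_watts

load = total_draw(gpu_watts=275, gpu_count=2, cpu_watts=100)
print(f"Estimated load: {load} W")          # ~725 W
print(f"Headroom on a 750 W PSU: {750 - load} W")
```

By this estimate two stock 290s put a 750 W unit close to its limit under benchmarking loads; if you plan to overclock both cards, the 860 W upgrade looks like the safer bet.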


----------



## Agent Smith1984

Quote:


> Originally Posted by *LandonAaron*
> 
> If it were me I would get a 290X, since like you said it would only be $60 more. Also it seems most people can hit about 1100/1400 with +100mV or 1200/1600 with +200mV on the 290X, but you will need a water cooling setup or an open-air cooler or you will likely go deaf.


Well, it's actually a 290 Tri-X OC, not a 290X, though those cards seem to have no problems getting 1100 or better on the core, and around 1500 on the memory.

Edit:
Holy crap, check this out.... 1225/1600, now THAT would be all kinds of an upgrade!!!
http://www.overclock.net/t/1456488/290-tri-x-non-x-version-silly-sapphire-tri-x-are-for-kids-some-temp-test-for-those-who-want-to-know

I guess with some overclocking I'd see a 20% performance increase for the most part, which is usually my criteria for upgrading. I mean, $60 after I sell my card is a pretty good price on an upgrade.

I'll be doing CPU/mobo/power supply around tax time too!


----------



## Klocek001

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, it's actually a 290 Tri-X OC, not a 290X, though those cards seem to have no problems getting 1100 or better on the core, and around 1500 on the memory.
> 
> I guess with some overclocking I'd see a 20% performance increase for the most part, which is usually my criteria for upgrading. I mean, $60 after I sell my card is a pretty good price on an upgrade.
> 
> I'll be doing CPU/mobo/power supply around tax time too!


Quote:


> Originally Posted by *LandonAaron*
> 
> If it were me I would get a 290X, since like you said it would only be $60 more. Also it seems most people can hit about 1100/1400 with +100mV or 1200/1600 with +200mV on the 290X, but you will need a water cooling setup or an open-air cooler or you will likely go deaf.


mine does 1100 on stock vcore.


----------



## Agent Smith1984

To give you an idea of why I am on the fence....

Here is my FireStrike:
http://www.3dmark.com/fs/2915459









Here is a 1090t at around 4GHz (though my CPU performs a bit better, likely due to NB/RAM clocks) with a lightly clocked 290
http://www.3dmark.com/fs/1892086









Not really much of an upgrade....

With some over clocks though, I'll be looking for something like this (not my CPU, but similar score from an 8350)
http://www.3dmark.com/fs/2001744


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> To give you an idea of why I am on the fence....
> 
> Here is my FireStrike:
> http://www.3dmark.com/fs/2915459
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here is a 1090t at around 4GHz (though my CPU performs a bit better, likely due to NB/RAM clocks) with a lightly clocked 290
> http://www.3dmark.com/fs/1892086
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not really much of an upgrade....
> 
> With some over clocks though, I'll be looking for something like this (not my CPU, but similar score from an 8350)
> http://www.3dmark.com/fs/2001744


that 290 @ 1225 was with a 13.x driver. here's one at 1220 using a 14.x driver . . .

http://www.3dmark.com/3dm/4222057?

here's a comparison at the same clocks . . .

http://www.3dmark.com/compare/fs/1392805/fs/2746418


----------



## LandonAaron

I would get a used 290X on eBay. I got mine there for $200. It was used in a mining rig for 6 months by the previous owner and hasn't given me any problems. If you don't want to water cool it, though, you would likely need to spend $250 or more for one of the open-air cooler designs. I don't think those mining cards are anything to be worried about, but there is always some risk. I always buy used and have never had any problems; I was a little apprehensive this time with the whole mining thing, but it worked out well for me in the end.

Actually, I just checked eBay and it seems most R9 290Xs are going for the $260 range now. There was a recent price cut on new 290Xs though, so maybe wait a week or two to see if eBay used prices come down as a result. If it were me I would wait for a deal and just get the best; as you said, a 280X to 290 might not be much of an upgrade.


----------



## Agent Smith1984

Well, mining cards don't scare me any.....

My 280x was in a mining rig, and I bet I have put it through more hell than crypto mining ever did. Maybe not 24/7, but certainly when it's under load....

1.3v 1250 is a big bump from 1.15 1070....

I have had this card at 1.35v 1290 core before, but that was using Trixx, and it doesn't have a force constant voltage option..... so with my PSU being as old and frail as it is, the droop was getting down near 1.23v and artifacting.

She's quite the clocker...

As far as 290 or 290x.... in my opinion, I'll be happier pushing a non-reference 290 as far as it can go than running a vanilla 290x at 90C+.

$200 for a Tri-x is a pretty good price.... my only fear is all the black screen issues I have read about everywhere.... especially for the third party cards like this one...

Nobody in here recommends the GTX 780 I was offered for $200 at all, huh? 'Tis where the big reds come to play, I suppose...


----------



## Agent Smith1984

What are the actual clocks on the 290 scoring 13k graphics I wonder? You know it's not what the link is reporting... can't be....

To hit 13k, I'm guessing 1200+/1500+ on 14.* drivers


----------



## bond32

Little update: as it turns out, the splitting cable I used to join my two power supplies had a wire come out of its pin in the 24-pin connector. Turns out said wire was a 5V... I think that shorted something on the board (Asus Maximus VI Extreme).... Doesn't matter, got the EVGA Classified all installed, an arguably better board anyway. Also plumbed up 2 more 480's. So cooling the 4770K and 4 290X's:

240 monsta push pull
360 monsta push pull
rx360 push pull
2xek xt480's pull

Really digging the EVGA board over the ASUS too. Not a lot of information out there about the classified boards. Progress pictures:


Spoiler: Warning: Spoiler!


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What are the actual clocks on the 290 scoring 13k graphics I wonder? You know it's not what the link is reporting... can't be....
> 
> To hit 13k, I'm guessing 1200+/1500+ on 14.* drivers


1200/1500 will give a 13K graphics score in FS. Win 8 will give slightly higher.
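For a quick ballpark of what other clocks might score, here's a naive linear extrapolation from that one data point. Real scaling is sub-linear (memory and CPU limits bite), so treat the output as an optimistic estimate:

```python
# First-order estimate: assume Fire Strike graphics score scales linearly
# with core clock from one known (clock, score) point. A crude assumption,
# good enough for "is 1250 MHz roughly enough for 13K?" questions.

def estimate_gs(known_clock: float, known_score: float,
                target_clock: float) -> float:
    """Linearly extrapolated graphics score at target_clock."""
    return known_score * target_clock / known_clock

# rdr09's data point: ~13000 graphics score at 1200 MHz core.
print(round(estimate_gs(1200, 13000, 1250)))  # -> 13542 (ballpark)
```

Which lines up with Sammyboy83's guess above that ~1250 on the core is about where 13K becomes comfortable.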


----------



## Sammyboy83

My 290 scored 13677 graphics score with 1300/1600. I think 1250 is enough for 13k.


----------



## rdr09

Quote:


> Originally Posted by *Sammyboy83*
> 
> My 290 scored 13677 graphics score with 1300/1600. I think 1250 is enough for 13k.


here is 1200 core . . .

http://www.3dmark.com/3dm/4079609?

newest beta is slightly lower . . .

http://www.3dmark.com/3dm/4616436?

here is 1300 core . . .

http://www.3dmark.com/3dm/4140843?


----------



## Agent Smith1984

Did most people notice improvements in games too, or was this just another "improved score in 3DMark Fire Strike" driver update?
Have seen plenty of those in the past.....

Not to say the pretty impressive showing my 280x gives in FireStrike isn't the result of the same gimmick, but I will say for certain, that no other driver has worked as well with this card as my 14.7 beta....

Just sucks cause my scores are never valid. Maybe I will try 14.7rc3 final though (didn't even bother)

I did try all the 14.9 releases and get black screens galore....


----------



## xer0h0ur

Quote:


> Originally Posted by *bond32*
> 
> Little update: as it turns out, the splitting cable I used to join my two power supplies had a wire come out of its pin in the 24-pin connector. Turns out said wire was a 5V... I think that shorted something on the board (Asus Maximus VI Extreme).... Doesn't matter, got the EVGA Classified all installed, an arguably better board anyway. Also plumbed up 2 more 480's. So cooling the 4770K and 4 290X's:
> 
> 240 monsta push pull
> 360 monsta push pull
> rx360 push pull
> 2xek xt480's pull
> 
> Really digging the EVGA board over the ASUS too. Not a lot of information out there about the classified boards. Progress pictures:
> 
> 
> Spoiler: Warning: Spoiler!


Great googly moogly thassalotta cooling


----------



## Sammyboy83

wrong post


----------



## Agent Smith1984

Yeah, that's overvolting/clocking paradise right there.


----------



## LandonAaron

I score 13396 in Graphics in FS running at 1200/1600.

I definitely noticed an improvement playing Skyrim, but I was coming from a GTX 770 with only 2GB of VRAM, and Skyrim was completely maxing those 2 gigs out.

You will only see a few FPS improvement most likely, but sometimes it only takes a few more FPS to hit that 60 FPS mark which if you use Vsync, makes all the difference in the world.


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Did most people notice improvements in games too, or was this just another "Improved score in 3dmark FireStrike" driver improvement.....
> Have seen plenty of those in the past.....
> 
> Not to say the pretty impressive showing my 280x gives in FireStrike isn't the result of the same gimmick, but I will say for certain, that no other driver has worked as well with this card as my 14.7 beta....
> 
> Just sucks cause my scores are never valid. Maybe I will try 14.7rc3 final though (didn't even bother)
> 
> I did try all the 14.9 releases and get black screens galore....


with the Thuban . . . you don't need to OC the 290. probably the CPU needs to OC higher.


----------



## Knight26

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Can someone please tell me what the power draw is of two Crossfired 290s during normal gaming and during benchmarking? Overclocked and stocked?
> 
> I currently have the Seasonic x-750 KM3 PSU and I'm wondering if this will be enough to run an i5-3570k OCed to 4.0 and two stock 290s. Otherwise, I'm looking to upgrade to the Seasonic 860 Platinum PSU.


If my memory serves me, I believe my rig was pulling between 650 & 700 watts with 2 290's at stock clocks and an i7-3770k @ 4.5. That's playing BF4 multiplayer. I was using a 1,000 watt OCZ PSU at the time. I added a 3rd card and a dual pump setup and my power draw went up to just over 1,000 watts at peak, so I replaced the OCZ unit with a 1300 watt unit. The Seasonic 860 would probably be a better choice if you want a little bit of power overhead as well as keeping the PSU working closer to its most efficient range.
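To put rough numbers on a question like PillarOfAutumn's, here's a small Python sketch of the math. The wattages are assumptions (stock 290 ~250 W board power, overclocked quad-core ~120 W, ~75 W for the rest of the system), and PSU efficiency is assumed around 90%; measuring at the wall with a meter is the only reliable check:

```python
# Quick wall-draw sanity check for a CrossFire build.
# All wattages below are rough assumptions, not measured values.

def estimated_wall_draw(gpu_watts=250, n_gpus=2, cpu_watts=120,
                        rest_watts=75, psu_efficiency=0.90):
    """Return (DC load on the PSU, estimated AC draw at the wall)."""
    dc_load = gpu_watts * n_gpus + cpu_watts + rest_watts
    return dc_load, dc_load / psu_efficiency

dc, ac = estimated_wall_draw()
print(f"DC load ~{dc:.0f} W, wall draw ~{ac:.0f} W")
```

With these assumptions a 750 W unit covers the stock-clocked DC load, but leaves little headroom for overclocking, which is why a step up to the 860 is reasonable.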


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sammyboy83*
> 
> My 290 scored 13677 graphics score with 1300/1600. I think 1250 is enough for 13k.
> 
> 
> 
> here is 1200 core . . .
> 
> http://www.3dmark.com/3dm/4079609?
> 
> newest beta is slightly lower . . .
> 
> http://www.3dmark.com/3dm/4616436?
> 
> here is 1300 core . . .
> 
> http://www.3dmark.com/3dm/4140843?

Here:



But its still slower than mah 290x.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> Here:
> 
> 
> 
> But its still slower than mah 290x.


not fair. i can hit 14, too . . .



lol Win 10.


----------



## Agent Smith1984

I can go real close to 4.2, but don't like the heat.

I think the games are demanding enough that any overclock you give the GPU will get used..... even with this ol' Thuban
1080p max settings, 4xAA on Crysis 3 is only going to put me in the 60FPS realm, and this Thuban can handle that.

I only see 60-65% CPU utilization when playing on my 280x. I'm expecting that to increase by 10% or so with the 290....
There's no way the GPU won't be completely utilized. If I see that it's not, I will reduce the resolution and peg out all 6 cores









I got this.... trust me!!









Not to mention I'm upgrading CPU/mobo/PSU around tax time....








Who's to say another 290 is not in the works as well... BAHAHA


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I can go real close to 4.2, but don't like the heat.
> 
> I think the games are difficult enough to run to use any overclock on the GPU you give it..... even with this ol' Thuban
> 1080p max settings, 4xAA on Crysis 3 is only going to put me in the 60FPS realm, this thuban can handle that.
> 
> I only see 60-65% CPU utilization when playing on my 280x. I'm expecting that to increase by 10% or so with the 290....
> There's no way the GPU won't be completely utilized. If I see that it's not, I will reduce the resolution timer and peg out all 6 cores
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I got this.... trust me!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not to mention I'm upgrading CPU/mobo/PSU around tax time....
> 
> 
> 
> 
> 
> 
> 
> 
> Who's to say another 290 is not in the works as well... BAHAHA


i used to have a 7950/7970 to max C3 at 1080. a single 290 can do the same. yes, maxed 8xMSAA. smooth.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Here:
> 
> 
> 
> But its still slower than mah 290x.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> not fair. i can hit 14, too . . .
> 
> 
> 
> lol Win 10.

Win 10 is annoying with the time problem, but it's the fact that MS keeps mucking with the OS. I've got a couple copies of windows.old on my laptop that they keep leaving behind, and they've taken away access so I can't delete them. FTL!


----------



## taem

So XFX is making an 8GB 290X, joining Sapphire and a few others: http://www.tomshardware.com/news/xfx-radeon-r9-290x-8gb,28024.html

I've heard Sapphire has no plans to bring their 8GB Vapor-X to the USA, but hopefully one of these others will so I can at least think about it.


----------



## HakscH

*Hello everybody. My name is Stefan aka HakscH. I am 26 years young and I live in Germany.
Sorry for my English at this point ^^
i just got my new R9 290 and i was looking for some benchmark results, just to know where my R9 scales. im kind of new to the area of GPU overclocking, so i hope you guys gonna teach me some interesting stuff ^^*

 http://www.sysprofile.de/id166023



Spoiler: GPU-Z Validation



http://www.techpowerup.com/gpuz/details.php?id=vdkn9





Spoiler: CPU-Z Validation



http://valid.canardpc.com/qg5gka



  


*GIGABYTE RADEON R9 290 OC WINDFORCE 3X*
*AMD FX-8150*
*CORSAIR HYDRO H80*
*4x4GB CORSAIR VENGEANCE*
*ASUS SABERTOOTH 990FX*
*COOLERMASTER GX650*
*SAMSUNG 840 EVO*
*HITACHI HDD*
*SAMSUNG DVD/RW*
*NZXT SENTRY LX*
*COOLERMASTER HAF-X 942*



Spoiler: From VTX3D Radeon HD 7950 X-Edition to Gigabyte R9 290 OC Windforce 3x

















I HOPE YOU GUYS ACCEPT MY INVITE! PLEASE


----------



## taem

Quote:


> Originally Posted by *HakscH*
> 
> *Hello everybody . My name is Stefan aka: HakscH . I am 26 years young and I live in Germany .
> Sorry for my English at this point ^^
> i just got my new R9 290 and i was looking for some benchmark results . just to know where my r9 scales. im kind of new in the area of GPU overclocking. so i hope you guys gonna teach me some interesting suff^^*


A reference R9 290 should score about 10,000 GPU score in 3DMark Fire Strike, and about 4600-4700 GPU score in Fire Strike Extreme.

Gaming frame rates and most other benches are hard to compare because the CPU and memory will affect the results. So the 3DMark GPU score is the most comparable number I can think of. But even there, some guys will run the test with texture filtering set to performance, surface optimization off, and tessellation off (these are all settings in the Gaming tab of Catalyst Control Center) and get higher numbers.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *tsm106*
> 
> Here:
> 
> 
> 
> But its still slower than mah 290x.


Holy crap you got a good clocking card. RDR's 1300MHz core is no slouch either.

Seems like all the GPU's/CPU's I've gotten are slightly above average, can't wait til I really hit the silicon lottery though.

Just curious, how many 290/x's have you guys used, and how many have been able to clock that high?


----------



## ZealotKi11er

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Holy crap you got a good clocking card. RDR's 1300MHz core is no slouch either.
> 
> Seems like all the GPU's/CPU's I've gotten are slightly above average, can't wait til I really hit the silicon lottery though.
> 
> Just curious, how many 290/x's have you guys used, and how many have been able to clock that high?


290X/290 can't go above 1250MHz unless you run the benchmark with artifacts or have super low ambient temps.


----------



## tsm106

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Holy crap you got a good clocking card. RDR's 1300MHz core is no slouch either.
> 
> Seems like all the GPU's/CPU's I've gotten are slightly above average, can't wait til I really hit the silicon lottery though.
> 
> Just curious, how many 290/x's have you guys used, and how many have been able to clock that high?


Hmm in total it's something like 2 sapphire 290x, 1 xfx 290x, 2 290x lightnings. All clocked to at least 1300. With tahiti, success was something like 80% clocked over 1300, and one of them was super gold clocking to 1400.


----------



## Red1776

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *chiknnwatrmln*
> 
> Holy crap you got a good clocking card. RDR's 1300MHz core is no slouch either.
> 
> Seems like all the GPU's/CPU's I've gotten are slightly above average, can't wait til I really hit the silicon lottery though.
> 
> Just curious, how many 290/x's have you guys used, and how many have been able to clock that high?
> 
> 
> 
> Hmm in total it's something like 2 sapphire 290x, 1 xfx 290x, 2 290x lightnings. All clocked to at least 1300. With tahiti, success was something like 80% clocked over 1300, and one of them was super gold clocking to 1400.

I hit rather well this time as well. 4 MSI R290X Game Ed off the line consecutively.


----------



## ZealotKi11er

Quote:


> Originally Posted by *tsm106*
> 
> Hmm in total it's something like 2 sapphire 290x, 1 xfx 290x, 2 290x lightnings. All clocked to at least 1300. With tahiti, success was something like 80% clocked over 1300, and one of them was super gold clocking to 1400.


You are not the average overclocker. Telling people that these cards do at least 1300MHz is false. I have tested 3 x 290 and none could do 1300MHz on the normal BIOS with +200mV.


----------



## boot318

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You are not the average overclocker. Telling people that these cards do at least 1300MHz is false. I have tested 3 x 290 and neither could do 1300MHz no normal BIOS and +200mV.


I was thinking the same thing. tsm106 must be underwater and pushing volts to the max.


----------



## Spectre-

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You are not the average overclocker. Telling people that these cards do at least 1300MHz is false. I have tested 3 x 290 and neither could do 1300MHz no normal BIOS and +200mV.


my stock bios lets me do 1250/1600 at max

PT.1 is the ultimate benching bios next to Lightning


----------



## tsm106

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Hmm in total it's something like 2 sapphire 290x, 1 xfx 290x, 2 290x lightnings. All clocked to at least 1300. With tahiti, success was something like 80% clocked over 1300, and one of them was super gold clocking to 1400.
> 
> 
> 
> You are not the average overclocker. Telling people that these cards do at least 1300MHz is false. I have tested 3 x 290 and neither could do 1300MHz no normal BIOS and +200mV.

What's your problem? He asked me what I could clock my cards to, and that's what they clocked to. Don't blame others because you couldn't hit 1300.


----------



## tsm106

For the funs. It was a great shot, thru the 'nads lol.



I told ranger, and he lol'd too.


----------



## devilhead

Quote:


> Originally Posted by *tsm106*
> 
> For the funs. It was a great shot, thru the 'nads lol.
> 
> 
> 
> I told ranger, and he lol'd too.


good card







i can play BF4 too at 1280 with +200mv, but for me 1200/1500 at +90mv is better


----------



## TommyGunn123

Sorry if it's been answered already (over 1000 pages is hard to scan), but I was curious whether you can buy a backplate like the EK one and slap it on a reference-PCB card with the air cooler still on. I was thinking of getting either the XFX DD or the Tri-X, plus a backplate, for both a slight cooling improvement and for looks. I mean, imagine the Double Dissipation with the black EK backplate? Yeah, exactly









If someone's jury-rigged one on with extra screws, or knows whether the EK-supplied screws work fine in either of those configs, I'd love to hear it. Thanks!


----------



## Arizonian

Quote:


> Originally Posted by *HakscH*
> 
> *Hello everybody . My name is Stefan aka: HakscH . I am 26 years young and I live in Germany .
> Sorry for my English at this point ^^
> i just got my new R9 290 and i was looking for some benchmark results . just to know where my r9 scales. im kind of new in the area of GPU overclocking. so i hope you guys gonna teach me some interesting suff^^*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://www.sysprofile.de/id166023
> 
> 
> 
> 
> 
> Spoiler: GPU-Z Validation
> 
> 
> 
> http://www.techpowerup.com/gpuz/details.php?id=vdkn9
> 
> 
> 
> 
> 
> Spoiler: CPU-Z Validation
> 
> 
> 
> http://valid.canardpc.com/qg5gka
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> *GIGABYTE RADEON R9 290 OC WINDFORCE 3X*
> *AMD FX-8150*
> *CORSAIR HYDRO H80*
> *4x4GB CORSAIR VENGEANCE*
> *ASUS SABERTOOTH 990FX*
> *COOLERMASTER GX650*
> *SAMSUNG 840 EVO*
> *HITACHI HDD*
> *SAMSUNG DVD/RW*
> *NZXT SENTRY LX*
> *COOLERMASTER HAF-X 942*
> 
> 
> 
> Spoiler: From VTX3D Radeon HD 7950 X-Edition to Gigabyte R9 290 OC Windforce 3x
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I HOPE YOU GUYS ACCEPT MY INVITE! PLEASE


Congrats - added









Nice pics


----------



## ZealotKi11er

Quote:


> Originally Posted by *tsm106*
> 
> What your problem? He asked me what I could clock my cards to and that's what they clocked to. Don't blame others because you couldn't hit 1300.


Not my problem. I have been following this forum since it started, and not many people get over 1200, let alone 1300.


----------



## Ized

Could someone let me in on the secret of using "profiles"? Specifically "official" AMD profiles.

For example, the beta 14.9.2 comes with "Project Cars profile update"

I installed the driver.. is that it? Nothing to fiddle with inside the Catalyst Control Panel? Run the game and its using the new updated profile?

I would like to confirm that the game is in fact using this new profile, but I don't know how.

Thanks!


----------



## Vici0us

AMD Confirms 8GB Radeon R9 290X Card

Spotted 8GB XFX Radeon R9 290X Graphics Card

Sapphire Radeon R9 290X Vapor-X OC 8192MB GDDR5 Graphics Card - £379.99 (Now Selling in UK)
*
Recent rumors have suggested AMD is working on an improved version of its R9 290X graphics card, and today the company has confirmed it's true. According to an AMD representative, an overclocked 290X with 8GB of GDDR5 memory is on the way, although further details are a bit scarce at the moment.

AMD says it's already working with Sapphire, PowerColor, and MSI to manufacture the cards, but there are also "additional" AIBs in the works. As noted by Tom's Hardware, a purportedly leaked image from earlier this week shows an 8GB 290X from XFX. Of course, a beefed-up 8GB 290X already exists from Sapphire (for example), but this is AMD's first go-ahead on pushing 8GB single-GPU 290Xs out to most manufacturers.

The R9 290X's main competitor in its generation, Nvidia's GTX 780Ti, would appear to be a bit behind now in the memory department, although we have a sneaking suspicion Nvidia has 8GB Maxwell-based GTX 970 or 980s waiting closely in the wings.

Prices for current-gen 4GB 290X cards are a bit all over the map, with most of the lower-end units residing around the $330 USD mark. But according to what AMD is hinting, the 8GB 290X card will presumably fall somewhere between the current going prices for Nvidia's GTX 970 and 980.

AMD has said the 8GB R9 290X should arrive at retailers later this month, but has not made any reference to any other specification changes beyond the memory upgrade.*


----------



## amptechnow

Quote:


> Originally Posted by *Vici0us*
> 
> AMD Confirms 8GB Radeon R9 290X Card
> 
> Spotted 8GB XFX Radeon R9 290X Graphics Card
> 
> Sapphire Radeon R9 290X Vapor-X OC 8192MB GDDR5 Graphics Card - £379.99 (Now Selling in UK)
> *
> Recent rumors have suggested AMD is working on an improved version of its R9 290X graphics card, and today the company has confirmed it's true. According to an AMD representative, an overclocked 290X with 8GB of GDDR5 memory is on the way, although further details are a bit scarce at the moment.
> 
> AMD says it's already working with Sapphire, PowerColor, and MSI to manufacture the cards, but there are also "additional" AIBs in the works. As noted by Tom's Hardware, a purportedly leaked image from earlier this week shows an 8GB 290X from XFX. Of course, a beefed-up 8GB 290X already exists from Sapphire (for example), but this is AMD's first go-ahead on pushing 8GB single-GPU 290Xs out to most manufacturers.
> 
> The R9 290X's main competitor in its generation, Nvidia's GTX 780Ti, would appear to be a bit behind now in the memory department, although we have a sneaking suspicion Nvidia has 8GB Maxwell-based GTX 970 or 980s waiting closely in the wings.
> 
> Prices for current-gen 4GB 290X cards are a bit all over the map, with most of the lower-end units residing around the $330 USD mark. But according to what AMD is hinting, the 8GB 290X card will presumably fall somewhere between the current going prices for Nvidia's GTX 970 and 980.
> 
> AMD has said the 8GB R9 290X should arrive at retailers later this month, but has not made any reference to any other specification changes beyond the memory upgrade.*


8GB 290X for less than $400! Can't wait...


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 290X/290 cant go above 1250MHz unless you run the benchmark with artifacts or have super low ambient temps.


one of my 290s, bought at launch, does not show artifacts till 1300. the one i bought used, not till 1280. they both have the original bioses. problem is . . . they both have elpida, so 1500 memory max for both.


----------



## Bertovzki

Well, I just pulled the trigger. I was waiting to buy my R9 290X Tri-X OC because of the timing of me shifting home, but I looked on pricespy.co.nz and saw the R9s were fast getting sold out, so I just paid for mine, had no choice. On the bright side, this means I will definitely be starting my build soon now, finally! I have a huge pile of parts on the PC desk and the GPU was the last thing I needed to start


----------



## YellowBlackGod

My 8GB 290X Vapor-X OC from Sapphire is on the way. Just couldn't resist. These 8GB cards are equal, performance-wise, to the 780 Ti/980 for less money. I must confess I sold my reference 780 Ti for this one.....


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> one of my 290s i bought at launch does not show artifacts not till 1300. the one i bought used not till 1280. they both have the original bioses. problem is . . . they both have elpida, so 1500 memeory max for both.


How cold are the cards running?


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> How cold are the cards running?


the only time i need to go below room temp is when i have to oc higher than 1300, like here . . .

http://www.3dmark.com/3dm/2098310

1300 and below on water just needs room temp, which is pretty much even throughout the year in my house - 72F.

it's supposed to go down to almost freezing here in a few days, so i may have to crack the windows open for benching and see what my second 290 can do.


----------



## rt123

Quote:


> Originally Posted by *devilhead*
> 
> good card
> 
> 
> 
> 
> 
> 
> 
> i can play bf4 too with 1280 +200mv, but for me is better 1200/1500 +90mv


Can you tell me what the ASIC is on that card of yours?

1200MHz @ +90mv is amazing.


----------



## avirex81

Hello Admin can you add me


2 on water, 1 in other tower and 1 in box still.

4 x R290's


----------



## pipes

Quote:


> Originally Posted by *avirex81*
> 
> Hello Admin can you add me
> 
> 
> 2 on water, 1 in other tower and 1 in box still.
> 
> 4 x R290's


Have you already checked the temperature of the GPUs?
Why does it look like you have installed the connector in the leakage hole?


----------



## LandonAaron

How does the dual BIOS switch on the R9 290X work? Can I flash PT1 over the quiet BIOS and keep my Uber BIOS? I want to experiment with the PT1 BIOS, but I read that it doesn't have any of the power saving features, and I use my computer for work and play so it's running over 12 hours a day; power saving tech is a must for full-time use.


----------



## VSG

^Yeah, you can do that with any card with multiple BIOS.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> the only time i need to go below room temp is when i have to oc higher than 1300 like here . . .
> 
> http://www.3dmark.com/3dm/2098310
> 
> 1300 below in water just needs room temp which is pretty much even throughout the year in my house - 72F.
> 
> its suppose to go down to almost freezing here in a few days, so may have to crack the windows open for benching and see what my second 290 can do.


What is the card hitting on the core and VRM1? Also, how much voltage are you using for 1300MHz?


----------



## bustacap22

Wondering if someone with a 290X w/ EK waterblock and backplate could provide a measurement of the length of the card once installed. I am wondering if I might have a clearance issue with my current watercooling setup. Planning on pulling the trigger on dual 290Xs. thanks


----------



## ZealotKi11er

Quote:


> Originally Posted by *bustacap22*
> 
> Wondering if someone with a 290x w/ EK waterblock and backplate provide measurement of the length of the card once installed. I am wondering if I might have clearance issue with my current watercooling setup. Planning on pulling the trigger on dual 290x. thanks


Should be same length as the PCB of 290X.


----------



## bustacap22

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Should be same length as the PCB of 290X.


Unfortunately, I don't have a 290X to reference. Plus, all the measurements I have been able to obtain online include the OEM heatsink/cooler.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> 290X/290 cant go above 1250MHz unless you run the benchmark with artifacts or have super low ambient temps.
> 
> 
> 
> one of my 290s i bought at launch does not show artifacts not till 1300. the one i bought used not till 1280. they both have the original bioses. problem is . . . they both have elpida, so 1500 memeory max for both.

I heard that was unpossible??


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What is the card hitting in the Core and VRM1. Also how much voltage are you using for 1300Mhz?


i don't really monitor my temps when benching. what i do know is that anything higher than 1300 will need lower ambient. like a wait-for-winter thing. i did compare my temps on air vs water on my second 290 when i tested it in my amd sig . . .



on air it was already sitting at 50. lol

edit: my vrms are always 4-5 C cooler than the core.
Quote:


> Originally Posted by *bustacap22*
> 
> Unfortunately, I dont have a 290x to reference from. Plus, all the measurements that I am able to obtained from online is a measurement that includes the OEM heatsink/cooler.


it should not be any longer than 10.5 inches. same length as the pcb.

Quote:


> Originally Posted by *tsm106*
> 
> I heard that was unpossible??


everything is possible. even artifacts at stock. lol


----------



## devilhead

Quote:


> Originally Posted by *rt123*
> 
> Can you tell me what is the ASIC on that card of yours.?
> 
> 1200Mhz @ +90mv is amazing.


76.1%, but that doesn't make any sense


----------



## tsm106

The equation/data gpuz uses to derive asic is incorrect for Hawaii.


----------



## avirex81

Quote:


> Originally Posted by *pipes*
> 
> You've already checked the temperature of the GPU?
> Why does it seem that you have installed the connector in the hole leakage


Its fine. Its pretty cold. They run at stock anyways.


----------



## rt123

Quote:


> Originally Posted by *tsm106*
> 
> The equation/data gpuz uses to derive asic is incorrect for Hawaii.


Really.?
Are you sure.??


----------



## taem

Quote:


> Originally Posted by *bustacap22*
> 
> Unfortunately, I dont have a 290x to reference from. Plus, all the measurements that I am able to obtained from online is a measurement that includes the OEM heatsink/cooler.


My reference 290 pcb length is 267mm. And reference pcb dimensions are the same for 290 and 290x.

And yeah I hate how waterblock manufacturers do not provide info that should be basic. For air coolers you can get exact dimensions to the mm including orientations -- stuff like this:



but for waterblocks it's next to impossible to get any sort of detailed info. You're lucky just to get a compatibility list.

Quote:


> Originally Posted by *YellowBlackGod*
> 
> My 8 GB 290X Vapor-X OC from Sapphire is on the way. Just couldn't resist. These 8GB cards are equal performancewise to the 780ti/980 for less Money. I must confess I sold my 780ti Reference for this one.....


What's your display setup?

Quote:


> Originally Posted by *rdr09*
> 
> one of my 290s i bought at launch does not show artifacts not till 1300. the one i bought used not till 1280. they both have the original bioses. problem is . . . they both have elpida, so 1500 memeory max for both.


I have Elpida and I can go well over 1500. Totally stable at 1600; 1650 artifacts in some games and benches, but not all. 1675 is crash-free but artifacts a lot.

Btw I agree with that other guy. 1300 is freaking ridiculous, that's what 1% of the cards out there? Imho with 290/x if you can hit 1200 you should be happy.


----------



## amptechnow

yeah my one elpida card does 1625 on memory no problem at all. i wish the core OC'ed as well. i can barely hit 1200 on core, adding tons of voltage, and only for benching. games seem to not want to go higher than 1135.

and im not sure about the asic score in gpuz. it's reading both my cards as 81.8% and neither are very good overclockers.


----------



## taem

Quote:


> Originally Posted by *amptechnow*
> 
> yeah my one elpida card does 1625 on memory no problem at all. i wish the core oc'ed as good. i can barely hit 1200 on core adding tons of voltage and only for benching. games seems to not want to go higher then 1135.


I'm not unhappy with the oc on my PCS+ 290 but yeah, I wish it did a little better. At the stock +50mv I can hit 1130, and then I need to go to +100mv to hit 1160, and all the freakn way to +200mv to hit 1200.
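Those offsets show the usual diminishing returns: each extra MHz costs more voltage as you climb. A tiny Python helper to put numbers on it, using the figures from this post (they're card- and cooling-specific, so purely illustrative):

```python
# Voltage cost per MHz, from one PCS+ 290's reported offsets.
# Card-specific sample data, not a general rule.
offsets_mv = [50, 100, 200]    # voltage offset in millivolts
core_mhz   = [1130, 1160, 1200]  # stable core clock at that offset

for (v0, f0), (v1, f1) in zip(zip(offsets_mv, core_mhz),
                              zip(offsets_mv[1:], core_mhz[1:])):
    cost = (v1 - v0) / (f1 - f0)  # extra millivolts per extra MHz
    print(f"{f0}->{f1} MHz: {cost:.1f} mV/MHz")
```

On this card the last 40 MHz cost 2.5 mV each versus 1.7 mV for the first 30, which is why chasing 1200+ gets expensive in heat and voltage.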


----------



## Performer81

I also have a 290 PCS+. i can also reach 1160 with +100; sometimes i play BF4 or bench with 1200/[email protected]+150mv. it artifacts here and there sometimes but i don't care. Temps are still absolutely fine, around 70 for the core and under 80 VRM with fans @AUTO. voltage under load is 1.29-1.33 then, don't really wanna go higher.
I can also unlock it to a 290X; the default voltage is a little lower then. I can barely run the default 1050/1350 with 0 offset stable. With the 290 bios i can reach 1070-1080. With +50mv, 1120. I have the revised design of the PCS+.
The card also doesn't seem to like any voltage over +150; then the screen flickers and blackscreens for a second in the beginning, then after 20 seconds or so it goes away and i can play. If i push it higher towards +200 it blackscreens longer or doesn't even stop and i have to exit.


----------



## Red1776

Quote:


> Originally Posted by *bustacap22*
> 
> Wondering if someone with a 290x w/ EK waterblock and backplate provide measurement of the length of the card once installed. I am wondering if I might have clearance issue with my current watercooling setup. Planning on pulling the trigger on dual 290x. thanks


hey Busta,

The EK blocks do not cover the entire card, so the length is the PCB's original dimension of 10.75" from the inside of the expansion plate to the end of the card.









Hope that helps.


----------



## cephelix

Quote:


> Originally Posted by *amptechnow*
> 
> yeah my one elpida card does 1625 on memory no problem at all. i wish the core oc'ed as good. i can barely hit 1200 on core adding tons of voltage and only for benching. games seems to not want to go higher then 1135.
> 
> and im not sure about asic score on gpuz. its reading both my cards as 81.8% and neither are very good overclockers.


asic values on hawaii cards are kind of useless. mine reads 82% and it's still a poor overclocker. i could only go to 1060MHz on the core before i needed to add voltage.

anyway, I've been off the forum a few months, so i've been somewhat out of touch and wanted to ask some questions before i make an order on frozencpu.
I have an msi r9 290 gaming twin frozr card with a pcb revision 2.2.

Last I checked, and posted in this thread, the EK full cover water block doesn't fit my particular card. Is that still the case? there are no new revisions to the EK water block, right?
If that's the case, then i'll be ordering the universal waterblock with copper heatsinks. anything else i missed?


----------



## heroxoot

I can vouch for this. My ASIC is 76% and I can't do more than 1060 or so before I need voltage. I had to do like +50mV to get 1100MHz even somewhat stable.


----------



## Red1776

Quote:


> Originally Posted by *cephelix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *amptechnow*
> 
> yeah my one elpida card does 1625 on memory no problem at all. i wish the core oc'ed as good. i can barely hit 1200 on core adding tons of voltage and only for benching. games seems to not want to go higher then 1135.
> 
> and im not sure about asic score on gpuz. its reading both my cards as 81.8% and neither are very good overclockers.
> 
> 
> 
> asic values on hawaii cards are kind of useless. mine reads 82% and still a poor overclocker. i could only go to 1060MHz on core before I need to add voltages.
> 
> anyway, I've been off the forum a few months so i've been somewhat out of touch and wanted to ask questions before i make an order on frozencpu
> I have an msi r9 290 gaming twin frozr card with a pcb revision 2.2.
> 
> Last I checked,and posted on this thread, the EK full cover water block doesn't fit my particular card. Is that still the case? there are no new revisions to the EK water block right?
> If that's the case, then i'll be ordering the universal waterblock with copper heatsinks. anything else that I missed out?

I have MSI R290X Gaming ED Twin Frozr GPU's as well and the only difference is that the caps around the VRM are taller. EK is making a Rev 2.0 full coverage block.


----------



## cephelix

@heroxoot
hey there hero... did u finally resolve your driver issues from last time?

@Red1776
i saw the rev 2.0 version of the blocks. not to question you or anything, just want to confirm before i make any purchases. i posted pics of my bare card a while back that showed that besides the taller caps, mine are also gold in colour instead of grey, and the capacitor next to the 6-pin connector has also been moved inwards when it used to be further out on the board. So there have been no new revisions besides the 2.0? Do pardon my ignorance


----------



## Red1776

Quote:


> Originally Posted by *cephelix*
> 
> @heroxoot
> hey there hero...did u finally resolve your driver issues from last time?
> 
> @Red1776
> i saw the rev 2.0 version of the blocks. not to question you or anything, just want to confirm before i make any purchases. i posted pics of my bare card a while back that showed that besides the taller caps, mine are also gold in colour instead of grey, and the capacitor next to the 6-pin connector has also been moved inwards when it used to be further out on the board. So there have been no new revisions besides the 2.0? Do pardon my ignorance


That is not ignorance; you should question everything having to do with waterblocks. The manufacturer compatibility lists aren't updated promptly, either.

I'm glad you did ask. That particular card with gold caps does not have a water block made for it from EK last I corresponded with EK.


----------



## taem

ASIC has to do with voltage efficiency, not overclockability. The lower the ASIC the more leakage, is what I've been told. In fact I've heard a lower ASIC is desirable for water or ln2 to get the highest overclocks, though I don't understand the technical reason for that.
Quote:


> Originally Posted by *Red1776*
> 
> hey Busta,
> The EK blocks do not cover the entire block so the length is the PCB's original dimension of 10.75" from the inside of the expansion plate to the end of the card.


The length of a reference 290/X PCB is 10.5", not 10.75". And EK has two different types of blocks: one is shorter than the PCB, and the newer CSQ blocks are the full length of the PCB.


----------



## cephelix

Quote:


> Originally Posted by *Red1776*
> 
> That is not ignorance; you should question everything having to do with waterblocks. The manufacturer compatibility lists aren't updated promptly, either.
> I'm glad you did ask. That particular card with gold caps does not have a water block made for it from EK last I corresponded with EK.


that's what i thought. thanks for the confirmation. seems like a universal waterblock with copper heatsinks it is for me then...just wondering how to secure them to the VRMs to make sure they don't fall off. all i can think of now is using low-load fishing line threaded through the screw holes, since i'm planning to add heatsinks to the front and back of VRM1. would that work or am i just being imaginative...lol


----------



## amptechnow

Quote:


> Originally Posted by *taem*
> 
> I'm not unhappy with the oc on my PCS+ 290 but yeah, I wish it did a little better. At the stock +50mv I can hit 1130, and then I need to go to +100mv to hit 1160, and all the freakn way to +200mv to hit 1200.


yeah my pcs+ is hynix too. i haven't really tested it as the main or only card, only overclocking with the xfx reference under water as card 1, which seems to be a better overclocker and with elpida mem can hit 1625 just fine like the pcs+. my xfx requires way less voltage to get to 1135.
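The numbers taem quoted show the usual voltage/frequency wall. A quick back-of-envelope sketch makes the diminishing returns explicit; the offsets and clocks are from the post above, while the 1040 MHz baseline (the PCS+ 290 stock clock at 0 mV offset) is an assumption:

```python
# MHz gained per extra mV for the overclock steps taem quoted:
# +50 mV -> 1130 MHz, +100 mV -> 1160 MHz, +200 mV -> 1200 MHz.
# Baseline of 1040 MHz at 0 mV offset is an assumption (PCS+ 290 stock clock).

steps = [(50, 1130), (100, 1160), (200, 1200)]  # (offset in mV, core MHz)

prev_mv, prev_mhz = 0, 1040
for mv, mhz in steps:
    rate = (mhz - prev_mhz) / (mv - prev_mv)  # MHz per additional mV
    print(f"+{mv} mV: {mhz} MHz ({rate:.1f} MHz/mV over the last step)")
    prev_mv, prev_mhz = mv, mhz
```

The MHz-per-mV rate falling from 1.8 to 0.6 to 0.4 is exactly why that last 40 MHz to 1200 costs a full extra +100 mV.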


----------



## heroxoot

Quote:


> Originally Posted by *cephelix*
> 
> @heroxoot
> hey there hero...did u finally resolve your driver issues from last time?


Not sure which you mean. I had a performance issue, but now my 290X is benching much closer to what it should given my CPU. I had a Netflix problem caused by the driver, which was resolved by reinstalling Silverlight.

Overall I have had bliss since 14.6. Thanks for asking though.


----------



## Red1776

Quote:


> Originally Posted by *taem*
> 
> ASIC has to do with voltage efficiency, not overclockability. The lower the ASIC the more leakage, is what I've been told. In fact I've heard a lower ASIC is desirable for water or ln2 to get the highest overclocks, though I don't understand the technical reason for that.
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> hey Busta,
> The EK blocks do not cover the entire block so the length is the PCB's original dimension of 10.75" from the inside of the expansion plate to the end of the card.
> 
> 
> 
> The length of a reference 290/x pcb is 10.5", not 10.75. And EK has two different types of blocks, one is shorter than pcb and the newer csq are length of pcb.
Click to expand...

I wasn't answering with the length of the reference PCB; I measured the MSI Twin Frozr Gaming edition from a stated spot to the end of the card.

I also specified the block I was referring to and posted an image of the Rev 2.0 block for the MSI cards with taller caps.


----------



## pkrexer

Good deal if anyone is looking for a 290


----------



## ArchieGriffs

Quote:


> Originally Posted by *pkrexer*
> 
> Good deal if anyone is looking for a 290


Geez, and you get 4 games instead of 3. If only it was a Tri-X and $230 after MIR, I'd bite. It's even cheaper than the Tri-X right now; that makes no sense but is still awesome for anyone picking up a Vapor-X.


----------



## taem

Quote:


> Originally Posted by *ArchieGriffs*
> 
> Geez, and you get 4 games instead of 3. If only it was a Tri-X and $230 after MIR, I'd bite. It's even cheaper than the Tri-X right now; that makes no sense but is still awesome for anyone picking up a Vapor-X.


Same here, if it were a Tri-X I'd buy right now. But I want a reference pcb cos I want to use an AC or watercool block. There is an EK block for the Vapor-X though. Why do you want a Tri-X over the Vapor-X? The Vapor-X is clocked higher, has better components, and is quieter and cooler. If on air it's the better card for sure. Has a backplate too.


----------



## rdr09

Quote:


> Originally Posted by *taem*
> 
> Same here if it were a tri-x I'd buy right now. But I want a reference pcb cos I want to use an AC or watercool block. There is an ek block for the vapor x though. Why do you want a tri x over vapor x? Vapor x is clocked higher and has better components and is quieter and cooler. If on air it's the better card for sure. Has a backplate too.


are you from the US? if you are, i suggest a used reference 290 or 290X. i think (just my thoughts) those non-refs' bioses are what's keeping them from being oc'ed. you'll have a better chance of oc'ing higher with reference.

But, that Beyond Earth alone is like . . . what? $40. if you have two 290s . . . why even oc. lol


----------



## battleaxe

Quote:


> Originally Posted by *pkrexer*
> 
> Good deal if anyone is looking for a 290


Yeah, I saw that earlier. Man what a deal. So tempting. But I have no reason to get one. Don't even have 1440 panel yet.


----------



## PillarOfAutumn

Hey everyone! I'm looking to buy a second 290 to crossfire. I have an offer for $190 shipped for an Asus reference 290. Is this a decent deal? Also, if I remember correctly, the Asus had some problems with overclocking where the screen would blacken and it would require a driver or OS reinstall....is this problem still there? Anything I should know about regarding these cards? Thank you!


----------



## taem

Quote:


> Originally Posted by *rdr09*
> 
> are you from the US? if you are, i suggest a reference used 290 or 290X. i think (just my thoughts), those non-refs' bioses are what's keeping them from being oc'ed. you'll get a good chance of oc'ing higher with reference.
> 
> But, that Beyond Earth alone is like . . . what? $40. if you have two 290s . . . why even oc. lol


I'm not going to buy a used 290 because the odds are so high that it came out of a mining rig. It'd be the gpu equivalent of screwing a crack whore. And can't I just flash a reference bios onto my reference-pcb card with custom cooler and bios?

Free game value is not that high for me since I don't buy games until they're $5 on Steam. And I'm not sure I'd buy Beyond Earth for $5. I wouldn't mind Alien Isolation though; I'll get that on Steam when it hits like $7 for sure.


----------



## bond32

Quote:


> Originally Posted by *taem*
> 
> I'm not going to buy a used 290 because the odds are so high that it came out of a mining rig. It'd be the gpu equivalent of screwing a crack whore. And can't I just flash a reference bios on to my reference pcb with custom cooler and bios card?
> 
> Free game value is not that high for me since I don't buy games until they're $5 on Steam. And I'm not sure I'd buy Beyond Earth for $5. I wouldn't mind Alien Isolation though; I'll get that on Steam when it hits like $7 for sure.


You should really reconsider that thinking... At least if you ever cool it with water. I bought 2 out of 4 of mine from a miner about 8 months ago and those two are the best clocking cards I have...

And yes I have run them all at 1.5 volts before.


----------



## ArchieGriffs

Quote:


> Originally Posted by *taem*
> 
> Same here if it were a tri-x I'd buy right now. But I want a reference pcb cos I want to use an AC or watercool block. There is an ek block for the vapor x though. Why do you want a tri x over vapor x? Vapor x is clocked higher and has better components and is quieter and cooler. If on air it's the better card for sure. Has a backplate too.


It isn't compatible with the watercooling setup I'm going to have; it isn't a simple matter of which block, it needs to be a reference board. Either way I'd probably be picky and want them to match anyways. If I decide to use the non-reference coolers for any length of time, blue and yellow in a black/grey/white rig would look hideous, so I think I'll just stick with two yellow Tri-X's.


----------



## heroxoot

Quote:


> Originally Posted by *taem*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> are you from the US? if you are, i suggest a reference used 290 or 290X. i think (just my thoughts), those non-refs' bioses are what's keeping them from being oc'ed. you'll get a good chance of oc'ing higher with reference.
> 
> But, that Beyond Earth alone is like . . . what? $40. if you have two 290s . . . why even oc. lol
> 
> 
> 
> I'm not going to buy a used 290 because the odds are so high that it came out of a mining rig. It'd be the gpu equivalent of screwing a crack whore. And can't I just flash a reference bios on to my reference pcb with custom cooler and bios card?
> 
> Free game value is not that high for me since I don't buy games until they're $5 on Steam. And I'm not sure I'd buy Beyond Earth for $5. I wouldn't mind Alien Isolation though; I'll get that on Steam when it hits like $7 for sure.
Click to expand...

Consider it. You can grab MSI or Sapphire and they will RMA for you no problem as long as the serial is valid. MSI especially.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Hey everyone! I'm looking to buy a second 290 to crossfire. I have an offer for $190 shipped for an Asus reference 290. Is this a decent deal? Also, if I remember correctly, the Asus had some problems with overclocking where the screen would blacken and it would require a driver or OS reinstall....is this problem still there? Anything I should know about regarding these cards? Thank you!


That's very cheap. Just do it already. Blackscreen = lower mem and/or gpu clock,
OR flash the PT1T bios and add more volts till your DP, HDMI or DVI go wonky = blackscreen benchmarking LooooooL


----------



## chiknnwatrmln

Quote:


> Originally Posted by *boot318*
> 
> I was thinking the same thing. tsm106 must be underwater and pushing volts to the max.


That's what I'm doing and my 290 maxed at 1245MHz with over +220mV. Haven't tested my 290X to the max but it seems about equal.

You guys talking about winter makes me want to move my rig across the room next to my window. My temps aren't bad, but today after playing BF4 for a few hours I saw my GPUs hit 58c core. This is at 1111/5500 on +31 mV, rad fans at 80% and my fan cover closed. If I had it open it'd be cooler but louder. Also, my ambient is kinda high: 77F.


----------



## Buehlar

Trying to break 18000... *stable without artifacts!*

Anybody can post up some crazy bogus numbers that pass test full of artifacts...but who wants to see a load of baloney? Stop it guys...you know who you are









3770K @4.7 gave me a nice boost in physics. Working on a stable 4.8 OC

X-fire 290X
1185/1500 @1.36v
ULPS disabled
All temps are good...highest temps are on the cores at 58°C ~ 60°C; the VRMs are much lower
Rock solid w/o artifacts

1200 core @1.37v gives me some artifacts even though the test completes successfully.
Increasing RAM clocks doesn't seem to net anything.
Need some advice.
Should I just close my eyes and push voltage higher?

http://www.3dmark.com/fs/3166254


----------



## rt123

Quote:


> Originally Posted by *Buehlar*
> 
> Trying to break 18000... *stable without artifacts!*
> 
> Anybody can post up some crazy bogus numbers that pass test full of artifacts...but who wants to see a load of baloney? Stop it guys...you know who you are
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 3770K @4.7 gave me a nice boost in physics. Working on a stable 4.8 OC
> 
> X-fire 290X
> 1185/1500 @1.36v
> ULPS disabled
> All temps are good...highest temps are on the cores 58° ~ 60°, the vrms are much lower
> Rock solid w/o artifacts
> 
> 1200 core @1.37v gives me some artifacts and test completes successfully.
> Increasing RAM clocks doesn't seem to net anything.
> Need some advice.
> Should I just close my eyes and push voltage higher?
> 
> http://www.3dmark.com/fs/3166254


As far as I remember, FireStrike likes tighter RAM more than faster RAM.

Try benching with your GPU fans at 100%, so that you can push the clocks further without worrying about Temps.


----------



## KGBinUSA

Anybody want to buy a modded R9 290?


----------



## Buehlar

Quote:


> Originally Posted by *rt123*
> 
> As far as I remember, FireStrike likes tighter RAM more than faster RAM.
> 
> Try benching with your GPU fans at 100%, so that you can push the clocks further without worrying about Temps.


Ummm I don't have fans...


----------



## rt123

Quote:


> Originally Posted by *Buehlar*
> 
> Ummm I don't have fans...


Lol.








You can always turn up the radiator fans.

My card doesn't artifact till above 70C, so I guess you can push more voltage in.


----------



## braddyjr

Hey there! Anyone know if you can run CF with an Asus 290X DirectCU II and a Sapphire Tri-X OC?


----------



## Buehlar

Quote:


> Originally Posted by *rt123*
> 
> Lol.
> 
> 
> 
> 
> 
> 
> 
> 
> You can always turn up the radiator fans.
> 
> My card doesn't artifact till above 70C, so I guess you can push more voltage in.


Ha...yea man, the rad fans are up... Ambient is 27~28c tonight

Tried 1200 @ 1.375v but only nets a lower score. Minor artifacts (twice) at the end of the FS demo and once at begin of graphic test 1

I guess my card's sweet spot is 1185 @1.36


----------



## pkrexer

Have you tried the latest 14.9.2 beta drivers? They gave me a nice boost in fire strike

http://www.3dmark.com/fs/3135045

Quote:


> Originally Posted by *Buehlar*
> 
> Trying to break 18000... *stable without artifacts!*
> 
> Anybody can post up some crazy bogus numbers that pass test full of artifacts...but who wants to see a load of baloney? Stop it guys...you know who you are
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 3770K @4.7 gave me a nice boost in physics. Working on a stable 4.8 OC
> 
> X-fire 290X
> 1185/1500 @1.36v
> ULPS disabled
> All temps are good...highest temps are on the cores 58° ~ 60°, the vrms are much lower
> Rock solid w/o artifacts
> 
> 1200 core @1.37v gives me some artifacts and test completes successfully.
> Increasing RAM clocks doesn't seem to net anything.
> Need some advice.
> Should I just close my eyes and push voltage higher?
> 
> http://www.3dmark.com/fs/3166254


----------



## pdasterly

sapphire 290x 1200/1600 +118mV +50 Powerlimit
gpu core 60c
vrm1 59c
vrm2 55c

No artifacts in Firestrike, Unigine, or Catzilla; stable so far. I get an occasional game crash but that's another issue.
http://www.3dmark.com/fs/3025719


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Buehlar*
> 
> Trying to break 18000... *stable without artifacts!*
> 
> Anybody can post up some crazy bogus numbers that pass test full of artifacts...but who wants to see a load of baloney? Stop it guys...you know who you are
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 3770K @4.7 gave me a nice boost in physics. Working on a stable 4.8 OC
> 
> X-fire 290X
> 1185/1500 @1.36v
> ULPS disabled
> All temps are good...highest temps are on the cores 58° ~ 60°, the vrms are much lower
> Rock solid w/o artifacts
> 
> 1200 core @1.37v gives me some artifacts and test completes successfully.
> Increasing RAM clocks doesn't seem to net anything.
> Need some advice.
> Should I just close my eyes and push voltage higher?
> 
> http://www.3dmark.com/fs/3166254


See if you can do more CPU. These are my results with a 290 and 290x at 1175 core, and my 3770k at 4.8 with my RAM oc'ed to 1866. http://www.3dmark.com/3dm/4569561

I should try again and see if I can break 18k.


----------



## Buehlar

Quote:


> Originally Posted by *pkrexer*
> 
> Have you tried the latest 14.9.2 beta drivers? They gave me a nice boost in fire strike
> 
> http://www.3dmark.com/fs/3135045


Haven't tried the beta yet....been shooting for a valid result.

Nice score. Your 4770K OC gives you a better boost vs my 3770K

We're pretty close on the Graphic score though.
What are your GPU clocks/voltage at for that run? It seems FS reported the stock clocks.


----------



## Buehlar

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> See if you can do more CPU. These are my results with a 290 and 290x at 1175 core, and my 3770k at 4.8 with my RAM oc'ed to 1866. http://www.3dmark.com/3dm/4569561
> 
> I should try again and see if I can break 18k.


I've been working on a 4.8 OC, I'll rerun once I'm stable with offset. My RAM is also @1866mhz


----------



## taem

Quote:


> Originally Posted by *rdr09*
> 
> are you from the US? if you are, i suggest a reference used 290 or 290X. i think (just my thoughts), those non-refs' bioses are what's keeping them from being oc'ed. you'll get a good chance of oc'ing higher with reference.


Btw maybe you can answer this for me: if I have a reference 290 with custom cooler and bios, and I flash it with a reference bios, don't I have a reference card, just with a different cooler on it? I don't quite understand what goes on when you flash a bios.

Also -- if I have elpida ram, do I need to use a bios that specifies elpida? Because there are bioses out there that list Hynix, or elpida, or Hynix and elpida both, in the details. What happens if I flash my elpida card with a Hynix bios?


----------



## Red1776

Quote:


> Originally Posted by *taem*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> are you from the US? if you are, i suggest a reference used 290 or 290X. i think (just my thoughts), those non-refs' bioses are what's keeping them from being oc'ed. you'll get a good chance of oc'ing higher with reference.
> 
> 
> 
> Btw maybe you can answer this for me, if I have a reference 290 with custom cooler and bios, and I flash it with reference bios, don't I have a reference card, just with a different cooler on it? I don't quite understand what goes on when you flash bios.
> 
> Also -- if I have elpida ram, do I need to use a bios that specifies elpida? Because there are bioses out there that list Hynix, or elpida, or Hynix and elpida both, in the details. What happens if I flash my elpida card with a Hynix bios?
Click to expand...

The BIOS controls and sets the operating parameters for the GPU, from initialization to the memory and core clocks, power management, timings, etc.

What you are after is a BIOS that is unlocked and allows access to change/mod voltages, fan profiles, etc.

On some cards one model below the next model up, a BIOS flash could even unlock additional pipelines/shaders.
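For anyone following this exchange, the flashing procedure people in the thread are describing usually goes through ATIFlash. This is only a sketch: the adapter index `0` and the ROM filenames are placeholders, and it assumes the card's dual-BIOS switch is set to the position you actually intend to overwrite.

```
:: List adapters so you know which index belongs to which card (run as admin)
atiflash -i

:: ALWAYS save the current BIOS first so you can flash back if anything fails
atiflash -s 0 stock_backup.rom

:: Program the new BIOS to adapter 0, then reboot
atiflash -p 0 new_bios.rom
```

The second BIOS position (the switch rdr09 mentions below) stays untouched, so a bad flash can usually be recovered by booting on the other switch position and re-flashing.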


----------



## rdr09

Quote:


> Originally Posted by *bond32*
> 
> You should really reconsider that thinking... At least if you ever cool it with water. I bought 2 out of 4 of mine from a miner about 8 months ago and those two are the best clocking cards I have...
> 
> And yes I have run them all at 1.5 volts before.


+1.

Quote:


> Originally Posted by *taem*
> 
> Btw maybe you can answer this for me, if I have a reference 290 with custom cooler and bios, and I flash it with reference bios, don't I have a reference card, just with a different cooler on it? I don't quite understand what goes on when you flash bios.
> 
> Also -- if I have elpida ram, do I need to use bios that specifies elpida in the bios? Because you have bios out there that list Hynix, of rllpida, or Hynix and elpida both, in the details. What happens if I flash my elpida card with a Hynix bios?


if you are going to flash the bios with another, might as well use PT1. if i ever do, that's the one i'll use. you can always experiment with a bios from a true reference card from the same manufacturer, since if it fails you can always flash back to the original bios.

even with reference, different manufacturers have different bioses. my msi and gigabyte reference cards do not have the same bios. my theory is the bios of non-ref cards is gimped to prevent users from oc'ing much and in turn lessen chances of rma. that's just my thinking.


----------



## Klocek001

Quote:


> Originally Posted by *bond32*
> 
> You should really reconsider that thinking... At least if you ever cool it with water. I bought 2 out of 4 of mine from a miner about 8 months ago and those two are the best clocking cards I have...
> 
> And yes I have run them all at 1.5 volts before.


I bought a mining card and it broke after 3 weeks.


----------



## rdr09

Quote:


> Originally Posted by *Klocek001*
> 
> I bought a mining card and it broke after 3 weeks.


me too. mine was supposedly used for mining but i think it is brand spanking new.


----------



## taem

Quote:


> Originally Posted by *rdr09*
> 
> +1.
> if you are going to flash the bios with another, might as well use PT1. if i ever do, that's the one i'll use. you can always experiment with a bios from a true reference card from the same manufacturer, since if it fails you can always flash back to the original bios.
> 
> even with reference, different manufacturers have different bioses. my msi and gigabyte reference cards do not have the same bios. my theory is the bios of non-ref cards is gimped to prevent users from oc'ing much and in turn lessen chances of rma. that's just my thinking.


Doesn't PT1 disable all power saving? I don't want that.

I've flashed a number of times so I know how to do it, but I'm wondering about your theory that ref bioses oc best. Since I have a reference pcb, if I flash my custom-cooler card with a ref bios, isn't that the same thing as having a full reference card?

I still wonder whether you should match the mem you have with a bios that specifies elpida or Hynix.


----------



## rdr09

Quote:


> Originally Posted by *taem*
> 
> Doesn't pt.1 disable all power saving? I don't want that.
> 
> I've flashed a number of times so I know how to do it, but I'm wondering about your theory that ref bios oc best. Since I have a reference pcb if I flash my custom cooler card with ref bios, isn't that the same thing as having a full reference card?
> 
> I still wonder about whether you should match the mem you have with a bios that specifies elpida or Hynix.


we all have 2 bioses on our cards. you can always use the original by flipping the switch. yes, i believe you have to pick the bios for your particular memory type if possible.

like i said, i am not into flashing bioses, so take it with a grain of salt.


----------



## dookiebot

Just purchased a Sapphire Vapor-X 290 with the great deal on newegg ($250 after rebate). I had been tempted to get SLI 970s to power my triple monitor setup, but I really want to stick with a single card solution so this will be my stop gap until I can see what AMD brings to the table in 2015. My 7950 will go into my second computer. Looking forward to joining the club.


----------



## pkrexer

Quote:


> Originally Posted by *Buehlar*
> 
> Haven't tried the beta yet....been shooting for a valid result.
> 
> Nice score. Your 4770K OC gives you a better boost vs my 3770K
> 
> We're pretty close on the Graphic score though.
> What are your GPU clocks/voltage at for that run? It seems FS reported the stock clocks.


Both cards are running 1150/1500 +50mv


----------



## tsm106

Quote:


> Originally Posted by *taem*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> +1.
> if you are going to flash the bios with another, might as well use PT1. if i ever do, that's the one i'll use. you can always experiment with a bios from a true reference card from the same manufacturer, since if it fails you can always flash back to the original bios.
> 
> even with reference, different manufacturers have different bioses. my msi and gigabyte reference cards do not have the same bios. my theory is the bios of non-ref cards is gimped to prevent users from oc'ing much and in turn lessen chances of rma. that's just my thinking.
> 
> 
> 
> Doesn't pt.1 disable all power saving? I don't want that.
> 
> I've flashed a number of times so I know how to do it, but I'm wondering about your theory that ref bios oc best. Since I have a reference pcb if I flash my custom cooler card with ref bios, isn't that the same thing as having a full reference card?
> 
> *I still wonder about whether you should match the mem you have with a bios that specifies elpida or Hynix.*
Click to expand...

Hawaii BIOSes from TPU's database denote what memory type is supported. A BIOS usually supports both types, but if you're worried, check the details. Btw PT1 is different afaik: it sets all memory, regardless of type, to the same timings. That can be a good thing or not depending on how close or far apart your cards (memory ICs) are in a crossfire setup.


----------



## amptechnow

i just noticed on my PowerColor PCS+, on the pcb marked R29FA ver1.0, that there are 2 squares labeled 4G and 8G, mine having a black dot on 4G. does this mean they have been planning 8GB models all along? why didn't they just come out with the 8GB models right away and blow everyone away?


----------



## Neet_za

Hey guys, I have the Sapphire 290 Vapor-X @ 1100MHz/1500MHz.
I came here to ask if anyone has the 290X Vapor-X; I want to unlock my 290 to a 290X and need that 290X Vapor-X bios. I did skim through the list on the first post but didn't find anyone with Vapor-X coolers.


----------



## amptechnow

got my xfx reference 290 stable at 1200/1625 with +50% power limit and 1.5mV with elpida memory. how does that score look for a single card? and for some reason 3dmark is showing the core clock wrong. so is gpuz. maybe it's bc it is the pt1 bios???

http://www.3dmark.com/3dm/4636919?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Neet_za*
> 
> Hey guys, have the Sapphire 290 Vapor-x @ 1100mhz ,1500mhz
> I come here to ask if anyone has the 290x Vapor-x, want to unlock my 290 to a 290x, I need that 290x vapor-x bios, I did skim through the list on the first post, didn't find anyone with vapor-x coolers


Is that possible on vapor x?? And if so, is it also possible on the Tri-x?? I am picking one up for $200 this weekend!!
If I could unlock that would be awesome...


----------



## Neet_za

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Is that possible on vapor x?? And if so, is it also possible on the Tri-x?? I am picking one up for $200 this weekend!!
> If I could unlock that would be awesome...


ye, go have a look at the unlock thread here: http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/0_20
my memory allows it to load the 290x bios, ultimately unlocking it to a 290x.
For your card you're gonna have to read the first post of that thread; there are some checks you have to do with the memory to see if it is able to "_unlock_",
but it's pretty easy to tell if it's unlockable with the programs


----------



## Buehlar

Quote:


> Originally Posted by *amptechnow*
> 
> got my xfx reference 290 stable at 1200/1625 with +50% power limit and 1.5mV with elpida memory. how does that score look for a single card? *and for some reason 3dmark is showing core clock wrong. so is gpuz*. maybe it's bc it is the pt1 bios???
> 
> http://www.3dmark.com/3dm/4636919?


I don't use PT1; however, this happens to me too whenever I change my clocks... afterwards, I need to restart GPU-Z and 3DMark so the apps re-read the system info for the updated clocks.
Give it a try and see if that resolves the problem...worked for me.


----------



## Buehlar

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> See if you can do more CPU. These are my results with a 290 and 290x at 1175 core, and my 3770k at 4.8 with my RAM oc'ed to 1866. http://www.3dmark.com/3dm/4569561
> 
> I should try again and see if I can break 18k.


cpu now @4.8
GPU's @1185/1500

Buttery smooth run with valid results! (14.9 drivers) http://www.3dmark.com/fs/3173578

I'm sooo close to 18k now!







Gonna try to squeeze a bit more from the GPUs.


----------



## Klocek001

I've read the article on 980 scaling, and what worries me is that PCIe 2.0 x16 has only 94% of the performance of PCIe 3.0 x16 at 1080p....
Have any of you found any info on 290 PCIe 2.0 scaling? I'm afraid my bus might affect min fps more than I've thought so far...
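To put that 94% figure in frame-rate terms, here is a quick sketch; only the 94% scaling number comes from the article mentioned, and the sample frame rates are hypothetical:

```python
# Apply the quoted PCIe 2.0 x16 vs 3.0 x16 scaling figure (94% at 1080p)
# to a few hypothetical average frame rates.

PCIE2_SCALING = 0.94  # from the 980 scaling article mentioned above

for fps_pcie3 in (60, 100, 144):
    fps_pcie2 = fps_pcie3 * PCIE2_SCALING
    print(f"{fps_pcie3:>3} fps on PCIe 3.0 -> {fps_pcie2:6.1f} fps on PCIe 2.0")
```

Note that this says nothing definite about minimum fps, which is the actual worry here: averages can mask deeper dips when the bus saturates.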


----------



## NathG79

Hi Folks.
My new pair of sapphire 290x vapor-x 8Gb arrived on Thursday.
Got them installed and up and running.

20141108_112551.jpg 4644k .jpg file


20141108_112604.jpg 5289k .jpg file


20141108_124432.jpg 4414k .jpg file


20141108_124441.jpg 4938k .jpg file


20141108_124450.jpg 4920k .jpg file


20141108_155001.jpg 2249k .jpg file


Top card has been running on desktop for about 4hrs now idling at 41c (with IFC switch on)
Ambient temp is 20c

Can I join?


----------



## tsm106

Quote:


> Originally Posted by *Klocek001*
> 
> I've read the article on 980 scaling, and what worries me is that PCIE 2.0 16X has only 94% performance of PCIE 3.0 16X at 1080p....
> Have any of you found some info on 290 PCI-E 2.0 scaling ? I'm afraid my bus might affect min fps more than I've thought so far...


There's actually lower overhead on PCIe 2. But really, unless you are running quads, the differences are so small that it's not worth worrying about.


----------



## Arizonian

Quote:


> Originally Posted by *dookiebot*
> 
> Just purchased a Sapphire Vapor-X 290 with the great deal on newegg ($250 after rebate). I had been tempted to get SLI 970s to power my triple monitor setup, but I really want to stick with a single card solution so this will be my stop gap until I can see what AMD brings to the table in 2015. My 7950 will go into my second computer. Looking forward to joining the club.


Nice. Post the pics when you get it.









Quote:


> Originally Posted by *NathG79*
> 
> Hi Folks.
> My new pair of sapphire 290x vapor-x 8Gb arrived on Thursday.
> Got them installed and up and running.
> 
> 20141108_112551.jpg 4644k .jpg file
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 20141108_112604.jpg 5289k .jpg file
> 
> 
> 20141108_124432.jpg 4414k .jpg file
> 
> 
> 20141108_124441.jpg 4938k .jpg file
> 
> 
> 20141108_124450.jpg 4920k .jpg file
> 
> 
> 20141108_155001.jpg 2249k .jpg file
> 
> 
> 
> Top card has been running on desktop for about 4hrs now idling at 41c (with IFC switch on)
> Ambient temp is 20c
> 
> Can I join?


You sure can. Congrats - added









Also, I'd like to just say: if you're a 290 or 290X owner who hasn't submitted proof yet and is just lurking, please do and join us.









On the chance I missed you, please PM me with your submission post linked and I'll add you.


----------



## NathG79

These 8GB cards make mincemeat of Shadow of Mordor at 1440p, ultra, everything maxed out, in crossfire. The LED on the top card did go red after 20min though.

Waiting for the next big AAA title to really push these beasts.


----------



## xer0h0ur

The gaming industry is actually about to shift toward using less vRAM, not more. You shouldn't expect many more titles in the near future requiring anywhere near that much vRAM.


----------



## velocityx

It's kind of funny what you say; it's not like designers can flip a switch and magically compress everything to use less VRAM. Just like system RAM grows and grows, VRAM will also grow.


----------



## ZealotKi11er

Quote:


> Originally Posted by *xer0h0ur*
> 
> The gaming industry is actually about to shift to using less vRAM than more. You shouldn't expect many more titles in the near future requiring anywhere near that much vRAM.


The opposite, since resolutions are increasing dramatically.


----------



## NathG79

Quote:


> Originally Posted by *velocityx*
> 
> It's kinda funny what you say, it's not like designers can tap a switch and magically compress anything to use less v ram. Just like systemram grows and grows, v ram will also grow.


----------



## NathG79

This


----------



## TTheuns

Hello guys, I'm a long time lurker in this thread, and will be for another week, when I get a crossfire set of Sapphire R9 290 4GB cards.


----------



## xer0h0ur

Don't believe me then. It's no skin off my back.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Buehlar*
> 
> cpu now @4.8
> GPU's @1185/1500
> 
> Buttery smooth run with valid results! (14.9 drivers) http://www.3dmark.com/fs/3173578
> 
> I'm sooo close to 18k now!
> 
> 
> 
> 
> 
> 
> 
> Gonna try to squeeze a bit more from the GPUs.


You can definitely break 18k, keep trying. Our physics scores are only 2 pts apart and your GPU score is like 500 higher than mine; I wonder why my combined score is higher than yours...


----------



## Buehlar

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> You can def break 18k, keep trying. Our physics score is only 2 pts apart and your GPU score is like 500 higher than mine, wonder why my combined score is higher than yours..


Broke 18k last night with valid results







http://www.3dmark.com/fs/3174137
That's all she'll do for smooth solid runs without artifacts. I got another 10MHz out of the GPUs with CPU OC @4.8



14.9 drivers
Core @ 1195MHz
Voltage @1375mV
RAM @ 1500MHz
Power Limit +150

Now time to try out the beta drivers.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Buehlar*
> 
> Broke 18k last night with valid results
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/3174137
> That's all she'll do for smooth solid runs without artifacts. I got another 10MHz out of the GPUs with CPU OC @4.8
> 
> 
> 
> 14.9 drivers
> Core @ 1195MHz
> Voltage @1375mV
> RAM @ 1500MHz
> Power Limit +150
> 
> Now time to try out the beta drivers.


Nice work.









Sometime this week I'm gonna try for 4.9GHz and see if I can beat your score lol. You're clocked higher than I can do on my PSU, and you have two 290x's as opposed to my combo of 290+290x, but I'm hoping that extra 100MHz on my CPU will give me an edge.

It needs 1.4v for 4.8, so I'm anticipating ~1.45 for 4.9.

Random question, running the program on an SSD doesn't make a difference in the score right? I save it on my HDD to save space.


----------



## Buehlar

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Nice work.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sometime this week I'm gonna try for 4.9GHz and see if I can beat your score lol. You're clocked higher than I can do on my PSU, and you have two 290x's as opposed to my combo of 290+290x, but I'm hoping that extra 100MHz on my CPU will give me an edge.
> 
> It needs 1.4v for 4.8, so I'm anticipating ~1.45 for 4.9.
> 
> Random question, running the program on an SSD doesn't make a difference in the score right? I save it on my HDD to save space.


Makes no difference. An SSD will make the 3DMark sub-programs load faster, but once loaded they execute from RAM.


----------



## ZealotKi11er

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Nice work.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sometime this week I'm gonna try for 4.9GHz and see if I can beat your score lol. You're clocked higher than I can do on my PSU, and you have two 290x's as opposed to my combo of 290+290x, but I'm hoping that extra 100MHz on my CPU will give me an edge.
> 
> It needs 1.4v for 4.8, so I'm anticipating ~1.45 for 4.9.
> 
> Random question, running the program on an SSD doesn't make a difference in the score right? I save it on my HDD to save space.


I am very close to breaking 18K too, but I have a 290 and a 290X, so I need a larger OC on the 290.


----------



## Buehlar

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am very close to breaking 18K too but i have 290 and 290X so i need a larger OC on the 290.


What are your clocks?


----------



## webhito

I tried searching the forum for black screen reports but don't seem to find any recent ones. I have tried several drivers. I had Windows 8.1 and ended up ditching it for Win 7 again; most of the drivers were giving me the black screen problem, sometimes during video playback, sometimes during gaming sessions. I have not been able to figure out if it's something I am doing wrong or if it's driver related. I just today formatted with Win 7 and installed the 13.12 drivers, so I am not sure how it's going to play out. My machine should be in my sig; I can't recall tbh, but everything is stock including the 290X. I had a manual profile set to ramp up to 100% at 75C with MSI Afterburner, and I believe the BIOS switch was set to Uber. Is this still an issue or has it been fixed? I have no Windows crash reports as it just freezes and I need to do a hard reset.


----------



## Buehlar

Quote:


> Originally Posted by *webhito*
> 
> Tried searching the forum for some late black screens but don't seem to find any recent ones. I have tried several drivers, had windows 8.1 and ended up ditching it for win 7 again, most of them were giving me issues with the black screen problem, sometimes during video playback, some times during gaming sessions, I have not been able to figure out if its something I am doing wrong or if its driver related. I just now today formatted with win 7 and installed 13.12 drivers so I am not sure how its going to play out, Machine should be in my sig, I can't recall tbh but everything is stock including the 290x. I had a manual profile set to ramp up to 100% at 75c with msi afterburner and I believe the bios switch was set to uber. Is this still an issue or has it been fixed? I have no windows crash reports as it just freezes and I need to do a hard reset.


You're running crossfire? Try disabling ULPS.
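For anyone searching this later: ULPS is usually toggled via the `EnableUlps` registry value under each display-adapter class key. A sketch of the .reg change (the `0000`/`0001` subkey indices vary per system, so check that each key actually contains an `EnableUlps` value before applying, or just use the "Disable ULPS" checkbox in Sapphire TriXX):

```reg
Windows Registry Editor Version 5.00

; Repeat for each adapter subkey (0000, 0001, ...) that has an EnableUlps value.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Reboot afterward so the change takes effect.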


----------



## webhito

Quote:


> Originally Posted by *Buehlar*
> 
> You're running crossfire? Try disabling ULPS.


Nope, its a single card.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Buehlar*
> 
> What are your clocks?


1225MHz/1500MHz

4.6GHz CPU


----------



## pdasterly

Quote:


> Originally Posted by *webhito*
> 
> Tried searching the forum for some late black screens but don't seem to find any recent ones. I have tried several drivers, had windows 8.1 and ended up ditching it for win 7 again, most of them were giving me issues with the black screen problem, sometimes during video playback, some times during gaming sessions, I have not been able to figure out if its something I am doing wrong or if its driver related. I just now today formatted with win 7 and installed 13.12 drivers so I am not sure how its going to play out, Machine should be in my sig, I can't recall tbh but everything is stock including the 290x. I had a manual profile set to ramp up to 100% at 75c with msi afterburner and I believe the bios switch was set to uber. Is this still an issue or has it been fixed? I have no windows crash reports as it just freezes and I need to do a hard reset.


My black screen problems occur with too much voltage. I searched the thread and it seems like a memory problem, idk.
If it's a pure gaming machine, why not give the Windows 10 tech preview a try? I love it so far over 8.1.


----------



## webhito

Quote:


> Originally Posted by *pdasterly*
> 
> My blackscreen problems occurs with too much voltage. searched thread and seems like a memory problem, idk.
> If pure gaming machine why not give windows 10 tech preview a try. I love it so far over 8.1


It's stock, so I don't think the voltage should be causing an issue; well, I hope not at least. Regarding Windows 10, I don't do betas lol; the few times I ever tried them out they were really incompatible and caused more trouble than I could handle.


----------



## pdasterly

Everything has worked so far; the only issue was getting an ISO mounting app. PowerISO won't work.


----------



## tsm106

Quote:


> Originally Posted by *webhito*
> 
> Tried searching the forum for some late black screens but don't seem to find any recent ones. I have tried several drivers, had windows 8.1 and ended up ditching it for win 7 again, most of them were giving me issues with the black screen problem, sometimes during video playback, some times during gaming sessions, I have not been able to figure out if its something I am doing wrong or if its driver related. I just now today formatted with win 7 and installed 13.12 drivers so I am not sure how its going to play out, Machine should be in my sig, I can't recall tbh but everything is stock including the 290x. I had a manual profile set to ramp up to 100% at 75c with msi afterburner and I believe the bios switch was set to uber. Is this still an issue or has it been fixed? I have no windows crash reports as it just freezes and I need to do a hard reset.


Try this. It's probably the shortest most concise explanation I've written. Your symptoms line up with those in this thread too.

http://www.overclock.net/t/1522761/vapor-x-r9-290-causing-black-screen-lockup/0_40


----------



## webhito

Quote:


> Originally Posted by *tsm106*
> 
> Try this. It's probably the shortest most concise explanation I've written. Your symptoms line up with those in this thread too.
> 
> http://www.overclock.net/t/1522761/vapor-x-r9-290-causing-black-screen-lockup/0_40


Thanks Tsm, I will give that a shot.

Btw, does this only happen with the 290X or is it with AMD cards in general? Does it happen with every driver?


----------



## tsm106

Quote:


> Originally Posted by *webhito*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Try this. It's probably the shortest most concise explanation I've written. Your symptoms line up with those in this thread too.
> 
> http://www.overclock.net/t/1522761/vapor-x-r9-290-causing-black-screen-lockup/0_40
> 
> 
> 
> Thanks Tsm, I will give that a shot.
> 
> btw, does this only happen with 290x or is it with amd cards in general? Does it happen with every driver?

It can happen to any GPU that uses power states, which is all current-gen cards. And it can happen on any driver, because it is symptomatic of AMD's view that apps should be free to initiate power state changes. It's a byproduct of being more reactive to differing apps' needs. To you and me it's more annoying than anything lol. The idea that an app can dictate a power state sounds cool, but in reality it's not all that and a bag of chips.


----------



## webhito

Quote:


> Originally Posted by *tsm106*
> 
> It can happen to any gpu that uses powerstates which is all current gen cards. And it can happen to any driver because it is symptomatic of AMD's viewpoint on apps being free to initiate powerstates changes. It's a byproduct of being more reactive to differing apps needs. Too you and I it's more annoying than anything lol. The idea that an app can dictate a powerstate sounds cool but in reality it's not all that and a bag of chips.


Thanks for the insight. It sure is annoying, and it seemed really random: sometimes during gaming, sometimes watching movies, and other times while watching YouTube. I hope they get their stuff together and fix it; it would be a damn shame to have to switch over to the green side.


----------



## tsm106

Quote:


> Originally Posted by *webhito*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> It can happen to any gpu that uses powerstates which is all current gen cards. And it can happen to any driver because it is symptomatic of AMD's viewpoint on apps being free to initiate powerstates changes. It's a byproduct of being more reactive to differing apps needs. Too you and I it's more annoying than anything lol. The idea that an app can dictate a powerstate sounds cool but in reality it's not all that and a bag of chips.
> 
> 
> 
> Thanks for the insight, it sure is annoying, seemed really random, sometimes during gaming, sometimes watching movies and others while watching youtube, i hope they get their stuff together and fix it, it would be a damn shame to have to switch over to the green side.

It's kind of a necessary evil with a catch-22, unfortunately. If you read my how-to thread you'll see how Mozilla still has trouble accessing AMD GPUs without incurring excessive calls. Mozilla, like other developers, are sort of mavericks; they do things their own way regardless. This also has something to do with the industry moving towards GPGPU and compute, utilizing the GPU more. Again, it sounds cool on paper, but in reality we don't want our GPUs being used, due to the heat created, when as far as we're concerned we're just watching YouTube vids.


----------



## webhito

Quote:


> Originally Posted by *tsm106*
> 
> It's kind of like a necessary evil with a catch 22 unfortunately. If you read my how to thread you'll read about how Mozilla still has trouble accessing amd gpus without incurring excessive calls. Mozilla like other developers are sort of mavericks they do things their own way regardless. This also has something to do with the industry moving towards gpgpu and compute, utilizing the gpu more. Again it sounds cool on paper but in reality we don't want our gpus being used due to the heat created when as far as we're concerned we're just watch YouTube vids.


Yea, I completely agree; too bad we have to be the beta testers though lol. I don't like paying a premium price to have something not work properly. I don't mind the heat, nor do I mind having to set up a card manually to avoid overheating and throttling, but getting BSODs from regular use is kind of out of the question. The sad thing is that I had issues with a 780 Ti and Nvidia's drivers too.


----------



## ZealotKi11er

Yeah broke 18K http://www.3dmark.com/3dm/4654265?


----------



## bond32

New run here, new cpu too
bond32 - 4790k @ 5.0 ghz, 4x290x's @ 1200/1320

Tess Off: 10184 http://www.3dmark.com/3dm/4654683?


----------



## Buehlar

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah broke 18K http://www.3dmark.com/3dm/4654265?


Nice







Our scores are almost identical
...but man you sure are pushing that voltage!









Are you getting artifacts?

When I push my cards' cores past 1200 I end up with lower scores.

I just loaded the new betas... what a friggin nightmare, black screen city!
Finally got them sorted by disabling ULPS via safe mode.


----------



## neurotix

Just bumping this up in my sub list. Ignore.


----------



## amptechnow

So close to breaking 18K with my two 290's. My PCS+ is really holding back my XFX reference. I can push the XFX past 1200/1625 now, but the PCS+ doesn't want to go over 1150/1625. I have the day off, so I am going to try and catch up with you guys.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Buehlar*
> 
> Nice
> 
> 
> 
> 
> 
> 
> 
> Our scores are almost identical
> ...but man you sure are pushing that voltage!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Are you getting artifacts?
> 
> When I push my cards cores past 1200 I end up with lower scores.
> 
> I just loaded the new betas...what a friggin nightmare with blackscreen city!
> Finally got them sorted disabling ULPS via safe mode.


The problem is heat. With 1.4V my system is using ~1050W.

The cards are stable as long as temps are very low; I can probably push them more. As soon as the water delta is over 12C they crash. No artifacts though. I have to test them a bit more, but in reality an extra 50MHz of OC for an extra 250W is not worth it unless I am using the PC as a heater.


----------



## Widde

Quote:


> Originally Posted by *ZealotKi11er*
> 
> unless i am using the PC as a heater.


+1, I am while playing BF4 for a couple of hours


----------



## Performer81

Quote:


> Originally Posted by *amptechnow*
> 
> so close to breaking 18K with my 2 290's. my pcs+ is really holding back my xfx reference. i can push the xfx past 1200/1625 noiw, but the pcs+ doesnt want to go over 1150/1625. i have the day off so i am going to try and catch up with ypou guys.


What voltage, offset?


----------



## bustacap22

Seeking anyone here who is running reference Sapphire 290Xs in crossfire (model #21226-00-40G) at single 1440p and watercooled. I am looking for the benchmarks people are getting with these cards. I have been debating whether I should go with the Sapphire reference, since I can purchase 2 cards for a sweet deal. Thanks.


----------



## Buehlar

Quote:


> Originally Posted by *bond32*
> 
> New run here, new cpu too
> bond32 - 4790k @ 5.0 ghz, 4x290x's @ 1200/1320
> 
> Tess Off: 10184 http://www.3dmark.com/3dm/4654683?











I have nothing else to add...








Quote:


> Originally Posted by *ZealotKi11er*
> 
> Problem is heat. With 1.4V my system is using ~ 1050W.
> 
> The cards are stable as long as temps are very low. I can probably push them more. As soon as Water Delta is over 12C they crash. No artifacts though. I have to test them a bit more but in reality a 50MHz more OC for extra 250W is not worth it unless i am using the PC as a heater.


Yea... your GPUs and CPU both ask for too much voltage. You need more rad to bring down that delta.

I'm running a dual loop here.
@1.375v my GPU loop delta T is ~5C with an AX480 and RX420, so heat isn't an issue for me. However, adding more core voltage and pushing clocks over 1200 doesn't gain me any performance. No crash, just lower scores.

Any ideas? Or maybe this just suggests max performance...


----------



## PillarOfAutumn

What type of benchmarks are you looking for, synthetic or actual in-game benchmarks? XF 290Xs will be able to crush anything at 1440p. If I remember correctly, XF 290s were able to do 140+ in BF4, 100+ in Tomb Raider, and 80+ in Metro LL, all at 1440p.


----------



## amptechnow

Quote:


> Originally Posted by *Performer81*
> 
> What voltage, offset?

Well, Afterburner seems to be the most stable for me, so even if I set voltage all the way to max it won't go any further than 1150 stable. GPU Tweak is useless unless I run the XFX card with the PT1 BIOS, then I can change voltage. TriXX seems very unstable with my system and gives me poor scores.

I've been having a strange issue for the longest time that no one has been able to figure out. I flashed both my cards' quiet BIOSes with PT1. If I set both cards to PT1, Windows boots then BSODs. If I uninstall drivers, manually or with DDU, both ways very thoroughly, the driver install crashes halfway through. Now, if I set the top card (XFX reference) to PT1 and the bottom (PCS+) to stock, everything is fine. And vice versa: if I set the bottom card (PCS+) to PT1 and the top card (XFX) to stock, everything is fine. But no matter what I try, I can't get both to work with the PT1 BIOS. With one card installed, they both run PT1 fine.

I might try setting the bottom card to PT3 and give that a try; maybe there is some conflict with IDs??? idk...

But my XFX reference will hit 1200+ using 1.456 volts...


----------



## ZealotKi11er

Quote:


> Originally Posted by *Buehlar*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have nothing else to add...
> 
> 
> 
> 
> 
> 
> 
> 
> Yea...your GPUs and CPU both ask for too much V. You need more rad to bring down that Delta.
> 
> Running a dual loop here
> @1.75v my GPU loop Delta T is ~5c with AX480 and RX420 so heat isn't an issue for me however, adding more core voltage and pushing clocks over 1200 doesn't gain me any performance. No crash, just lower scores.
> 
> Any ideas? Or maybe this just suggest max performance...


I would get more rad space, but really this is the max I am ever going to get as far as GPU power consumption goes. It's ~400W per GPU and 200W for the CPU. The thing is, I only need +125mV on the GPUs for 1200MHz, and the CPU needs +165mV for 4.6GHz instead of +210mV for 4.7GHz.

In reality I gain 5% extra performance for a 25% increase in heat and power consumption. For 24/7 I just run the cards at stock 1000/1250. There's really no point to OC for the games I play.
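The efficiency math above is easy to sanity-check. A quick sketch with assumed round numbers (the ~1050W figure at +25% power implies a ~840W baseline; the benchmark score is made up purely for illustration):

```python
def perf_per_watt(score, watts):
    """Benchmark score per watt of total system draw."""
    return score / watts

# Assumed figures: ~840 W baseline draw, +25% power for +5% performance
stock = perf_per_watt(17000, 840)
oc = perf_per_watt(17000 * 1.05, 840 * 1.25)

loss = 1 - oc / stock
print(f"Efficiency drop from the OC: {loss:.0%}")  # prints "Efficiency drop from the OC: 16%"
```

In other words, the last 50MHz costs roughly a sixth of the perf-per-watt, which is why running stock for 24/7 makes sense.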


----------



## bond32

Quote:


> Originally Posted by *Buehlar*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have nothing else to add...
> 
> 
> 
> 
> 
> 
> 
> 
> Yea...your GPUs and CPU both ask for too much V. You need more rad to bring down that Delta.
> 
> Running a dual loop here
> @1.75v my GPU loop Delta T is ~5c with AX480 and RX420 so heat isn't an issue for me however, adding more core voltage and pushing clocks over 1200 doesn't gain me any performance. No crash, just lower scores.
> 
> Any ideas? Or maybe this just suggest max performance...


1.75 V on gpu's?? If so, that's entirely too high... I don't go over 1.5 V. You use pt1 bios to bench? Also, restart your pc between runs. I know it takes longer but I discovered it helps...


----------



## ZealotKi11er

Quote:


> Originally Posted by *bond32*
> 
> 1.75 V on gpu's?? If so, that's entirely too high... I don't go over 1.5 V. You use pt1 bios to bench? Also, restart your pc between runs. I know it takes longer but I discovered it helps...


He probably means +175 mV.


----------



## Buehlar

Quote:


> Originally Posted by *bond32*
> 
> 1.75 V on gpu's?? If so, that's entirely too high... I don't go over 1.5 V. You use pt1 bios to bench? Also, restart your pc between runs. I know it takes longer but I discovered it helps...


Oops...sorry for typo *1.375 V*









no pt1...just stock uber. Yes with reboots


----------



## bond32

Quote:


> Originally Posted by *Buehlar*
> 
> Oops...sorry for typo *1.375 V*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> no pt1...just stock uber. Yes with reboots


Haha ok makes more sense... You can probably push it further, sounds about like my cards. Takes about 1.43 V for 1240 on the core for mine.


----------



## Bertovzki

My R9 290X Tri-X OC arrived today.

Sorry, I can't link to the Sapphire site in the pic as it requires my password; I must be doing something wrong, I probably spent at least an hour on this.
Below is a pic with the Sapphire registration serial code on screen, and a pic of me and the card.

Pic's :


Spoiler: Warning: Spoiler!


----------



## Spectre-

Quote:


> Originally Posted by *bond32*
> 
> Haha ok makes more sense... You can probably push it further, sounds about like my cards. Takes about 1.43 V for 1240 on the core for mine.


Sounds about right; my ref. 290X needs 1.46 (1.43 after droop) to do 1250/1600.

With PT1 I can push 1.5 and do 1270/1640.

Also, as stated before, restart your PC every time before doing a benchmark run (not just for GPU but also CPU benchmarks).


----------



## Jaimelmiel

Hello all, new to this forum. My validation is http://www.techpowerup.com/gpuz/details.php?id=2dzxs. My manufacturer is MSI. My make is R9 290X (x2). My cooling is stock.

I am having a problem crossfiring my 2 R9 290Xs. They both run in either slot by themselves hooked to the monitor. In Dirt 3 their avg framerates are 115-145 depending on the course. When either is set as the second GPU in slot 3, framerates are almost cut in half. Also, while I was crossfiring, the audio bus on the mobo went kaput. Everything else is fine, temps etc. The CPU is running @ 40-50% when gaming, according to the resource monitor. Is the mobo a possible culprit? I have lost one bus already, according to the device manager.

Computer specs:

Cpu: FX 9590 @4.7 Ghz
Mobo: Sabertooth 990FX R 2.0
Cpu Cooler: Thermaltake Frio Extreme Air Cooler
SSD Intel Series 335 Jay Crest 240 GB
Memory: Team Zeus 2133MHz @ 1866MHz
Case: Rosewill Blackhawk Ultra
OS: Windows 8.1



----------



## kizwan

Quote:


> Originally Posted by *amptechnow*
> 
> well afterburner seems to be the most stable for me. so even if i set voltage all the way to max it wont go any further then 1150 stable. gpu tweaks is useless unless i run xfx card with pt1 bios, then i can change voltage. trix seems very unstable with my system and gives me poor scores.
> 
> ive been having a strange issue for the longest time no one has been able to figure out. i flashed both my cards quiet bios' with pt1. if i set both cards to pt1 windows boots then bsod. if i uninstall drivers, manually or with ddu, and both ways very thuroughly; driver install crashes half way through. now if i set top card(xfx reference) to pt1 and bottom(pcs+) to stock. everything is fine. and vice versa if i set bottom card(pcs+) to pt1 and top card(xfx) to stock everything is fine. but no matter what i try i cant get both to work with pt1 bios. with one card installed they both run pt1 fine.
> 
> i might try setting bottom card to pt3 and give that a try. maybe there is some conflict with id's??? idk...
> 
> but my xfx reference will hit 1200+ using 1.456 volts...


Did you use TriXX v4.8.2? I've heard the later versions cause instability on some systems.


----------



## pdasterly

The newest version of TriXX crashes on my machine.


----------



## pdasterly

What causes a red screen? A heavy overclock?


----------



## kizwan

Quote:


> Originally Posted by *pdasterly*
> 
> what cause red screen? Heavy overclock


I don't know but I found this.

http://www.tomshardware.com/faq/id-1961964/fix-red-screen-death.html


----------



## Arizonian

Quote:


> Originally Posted by *Jaimelmiel*
> 
> Hello all, new to this forum. My validation is http://www.techpowerup.com/gpuz/details.php?id=2dzxs. My manufacturer is MSI. My make is R9 290x (2). My cooling is stock.
> 
> *I am having a problem with crossfireing my 2 R9 290x's. They both run in either slot by themselves hooked to monitor. On Dirt 3 their avg framerates are 115 - 145 depending on the course. When either is set as the second gpu in slot 3. Framerates almost cut in half. Also while I was crossfiring the audio bus on the mobo went kaput. Everything else is fine. Temps etc. Cpu is running @ 40-50% when gaming
> according to the resource monitor. Is the mobo a possible culprit?* I have lost one bus already. According to the device manager.
> 
> Computer specs:
> 
> Cpu: FX 9590 @4.7 Ghz
> Mobo: Sabertooth 990FX R 2.0
> Cpu Cooler: Thermaltake Frio Extreme Air Cooler
> SSD Intel Series 335 Jay Crest 240 GB
> Memory: Team Zues 2133Mhz @ 1866Mhz
> Case: Roswill Blackhawk Ultra
> OS: Windows 8.1
> 
> .


Congrats - added









Hope you get your crossfire issue resolved; I can't help you there, but you came to the right place to ask that question. I bolded it.

Don't forget to add your system build to your profile using *Rigbuilder* when you get the chance. Good luck.


----------



## Forceman

Quote:


> Originally Posted by *pdasterly*
> 
> what cause red screen? Heavy overclock


The times I got it, it was because of a failed CPU overclock. When the system would hard crash, the screen would go all red; some kind of AMD driver thing maybe. I used to get them in BF4, but upping the CPU voltage took care of it.


----------



## Roboyto

Quote:


> Originally Posted by *amptechnow*
> 
> but my xfx reference will hit 1200+ using 1.456 volts...


I think you are trying to push cards too hard that simply aren't capable of it, and your problem is only compounded with Xfire. I have owned (4) 290's and helped a friend set up a 5th one. I have had 2 lackluster performers which wouldn't break 1100 and 1200, respectively, with +200mV. When I max the voltage/power sliders in TriXX and the cards don't respond, I just let it be, since I know what a good card is capable of.

My reference XFX Black Edition benches 1300/1700 at +200mV, or 1200/1500 at +75mV, with the stock BIOS. Even at +200mV in TriXX it peaks at 1.422V in HWiNFO, and that is before droop. I know this is OCN, but swapping BIOSes and ramming excessive voltage through your cards is A) only going to get you so far and B) possibly going to do irreparable damage.


----------



## Bertovzki

So how do I get added to the club? I posted pics a page or so back, and I have registered my R9 290X Tri-X OC with the Sapphire Select club, but I don't have any validation page like the dude above?


----------



## Arizonian

Quote:


> Originally Posted by *Bertovzki*
> 
> My R9 290X Tri-X OC arrived today the pic's are very bad , my cell phone does not take good photos
> 
> I had trouble registering it on the Sapphire site , it would not accept the number, told me i was liar :
> 
> Pic's :
> 
> 
> Spoiler: Warning: Spoiler!


Quote:


> Originally Posted by *Bertovzki*
> 
> So how do I get added to the club, I posted pics a page or so back , and i have registered my R9 290X-Tri-X OC with Sapphire select club , but I dont have any validation page like dude above ?


*1STpost* gives how to join. Like poster above a GPU-Z validation link with OCN name showing would work or photo with OCN name on piece of paper.

Just go back to your original post one page back and add the *GPU-Z* validation link and all is good.







In the meantime.....good enough for me.

Congrats - already added









As for the Sapphire site registration... if your Sapphire is new, it's possible there was a previous owner. A returned box perhaps, from someone who registered it and then returned it for whatever reason.


----------



## Bertovzki

Quote:


> Originally Posted by *Arizonian*
> 
> *1STpost* gives how to join. Like poster above a GPU-Z validation link with OCN name showing would work or photo with OCN name on piece of paper.
> 
> Just go back to your original post one page back and add the *GPU-Z* validation link and all is good.
> 
> 
> 
> 
> 
> 
> 
> In the meantime.....good enough for me.
> 
> Congrats - already added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As for the Sapphire site registration....if your Sapphire is new, it's possible there was a previous owner. Returned box perhaps, who registered it and then returns it for whatever reason.


Ok thanks man. I could not link any page as it requires my password on the Sapphire site, but I have posted pics and my user name; I'm the youngster in the pic


----------



## cephelix

Welcome, welcome. Always good to have new members here. I have learned a lot from this thread. Now if only I had the time to put it into practice; it seems like a waste to have my 290 sit at stock for close to 6 months now.


----------



## Bertovzki

Quote:


> Originally Posted by *cephelix*
> 
> Welcome welcome.always good to have new members here.have learned alot from this thread.now if only i have the time to put it into practice.seems like a waste to have my 290 sit at stock for close to 6 months now.










Hi, I will run stock for a while too, see how it performs first. I lurk on this thread and read every day to gain some knowledge


----------



## cephelix

I've played quite a few games on this new card at stock, 1200p, namely Mass Effect 1-3 and Sleeping Dogs; it performs swimmingly well. I recently did a clean install of W7 and am playing Shadow of Mordor. I just use the preset settings, mostly on high, and since I haven't had time to download monitoring software I don't have numbers. But all I can say is that it runs smooth, no hitches at all


----------



## Bertovzki

Quote:


> Originally Posted by *cephelix*
> 
> I've playes quite a few games on this new card at stoco,1200p..namely mass effect 1-3, sleeping dogs ..performs swimmingly well.recently did a clean install of w7 and playing shadow of mordor.i just use the preset settings, mostly on high and since i havent had time to download monitoring softwares, i dont have numbers.but all i can say is that it runs smooth..no hitches at all


Yeah sweet. I'm not into benching, just gaming; mostly any very realistic driving sim, good flight sims, and strategy. But mostly I just want to send fast email

I have not even played a game for almost a year, ha, and I'm in no real hurry to build the rig, but I do game a lot when I do. I am also into my music; I record my own sounds, I've been playing guitar for 20-odd years, and I have a blatant Iommi style. So I will look forward to the extra power now for real-time effects when I play and record.
I'm looking forward to trying IL-2 Sturmovik too; I loved the first ones.
I will be interested to see how the 290 performs with Project CARS and Assetto Corsa. I see some of the debate over Nvidia and AMD as far as game developers are concerned; I hope Project CARS has good 290 support when it comes out.
I'm guessing the 290 performs very well in most games and at good resolution, even with one card. I mostly hear about drivers, benchmarks, and voltage on this thread, not much about 290s and gaming yet.


----------



## cephelix

@Bertovzki dude, that is some awesome artistic ability and creativity you have. Wish I could be like that, but sadly I'm not, and they say left-handers are supposed to be more artistic. I suppose benches are easier to standardise, as they run on the same graphical settings, along with having consolidated sites for results, e.g. 3DMark. Games, especially with the myriad of settings available, are tedious to compare...


----------



## Vici0us

Can someone with a custom loop answer a question? If I go with an open loop, how many rads, and how big, would I need to run an OC'd 4770K and 2x R9 290s (possibly 3 eventually)?


----------



## rdr09

Quote:


> Originally Posted by *Vici0us*
> 
> Can someone with a custom loop answer a question? If I go with an open loop then how many rads and how big rads would I need to run an OC'd 4770K and X2 R9 290's (possibly 3 eventually).


I recommend 2x 240 rads for the CPU and 2 290s. If you are sure to add a third 290, then go at least 1x 360 and 1x 240 rads.

I currently have 3 120 rads for my CPU (i7 SB @ 4.5GHz) and 2 290s, but my ambient is pretty low all year long and my GPUs stay at stock. I don't see 60C on the core and my VRMs are lower by about 4-5C. One rad has 2 fans. My CPU, though, reaches the 70s during games.


----------



## jagdtigger

I don't understand what's wrong with my monitor. The problem is that my monitor (LG IPS277L on HDMI) gives me black screens when my card is OC'ed. The two others (1x DVI-D and 1x DP with a DP->HDMI adapter) are working just fine. Any ideas?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Vici0us*
> 
> Can someone with a custom loop answer a question? If I go with an open loop then how many rads and how big rads would I need to run an OC'd 4770K and X2 R9 290's (possibly 3 eventually).


Depends how hard you will OC. I have 1x 360mm thick rad and 1x 240 thick rad, push/pull GT15s. They are fine for 2 cards. Good temps but not amazing. If you get 3 cards I would say 240mm per card and 120mm for the CPU, so for you ~480 + 360. If you OC +200mV then you need even more. You can run your setup with a 360mm rad, but your CPU will suffer.
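The rule of thumb above (~240mm of rad per GPU plus ~120mm for the CPU, more for a heavy overvolt) can be sketched as a quick helper. This is just the forum heuristic restated in code, not a thermal model; the 50% padding for a +200mV overclock is my own rough reading of "you need even more":

```python
def recommended_rad_mm(num_gpus: int, heavy_overvolt: bool = False) -> int:
    """Rule-of-thumb total radiator length in mm (summed 120mm fan slots):
    ~240mm per GPU plus ~120mm for the CPU, per the post above.
    A heavy overvolt (+200mV) wants noticeably more surface, so pad by 50%
    (the padding factor is a guess, not from the thread)."""
    base = 240 * num_gpus + 120
    return int(base * 1.5) if heavy_overvolt else base

# 3 cards -> 840mm, i.e. roughly the "480 + 360" suggested above.
print(recommended_rad_mm(3))
```

For two cards it gives 600mm, which lines up with the 360 + 240 setup described in the same post.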


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vici0us*
> 
> Can someone with a custom loop answer a question? If I go with an open loop then how many rads and how big rads would I need to run an OC'd 4770K and X2 R9 290's (possibly 3 eventually).
> 
> 
> 
> i recommend 2X240 rads for the cpu and 2 290s. if you are sure to add a third 290, then go at least 1X360 and 1X240 rads.
> 
> i currently have 3 120 rads for my cpu (i7 SB 4.5GHz) and 2 290s but my ambient is pretty low all year long and my gpus stay at stock. i don't see 60C on the core and my vrms are lower by about 4-5C. one rad has 2 fans. my cpu, though, reaches the 70s during games.

That's not very good cooling. It's really a question of what level of delta anyone is willing to achieve or put up with. Judging by your temps, you are fine with a large delta, but that's not the same for everyone. Moving on, a lower delta equates to more rad space. You say you have low ambient, so what, maybe 18C? That's a load delta of 42C on the GPU and 52C on the CPU for stock-clocked GPUs. That's not very good cooling. The level of cooling also has a direct impact on overclocking and benching; more is better. I'm with the longtime WC reviewer and guru Martin, who applies his own formula of one 360mm rad per block. Now of course not everyone has the budget or the space for that level of rad surface, but I think it's good that they know exactly what it is to have very good temps. There is also a relationship between TDP, watts drawn and radiator surface area. Many dependencies, not enough time.
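The delta arithmetic in the post above (ambient 18C, GPU just under 60C, CPU in the 70s) is simple enough to sketch; the numbers below are the ones quoted in this thread, not measurements of mine:

```python
# Load delta is just component temperature minus ambient; a lower
# delta means the loop is shedding heat more effectively.

def load_delta(component_temp_c: float, ambient_c: float) -> float:
    """Return degrees C over ambient under load."""
    return component_temp_c - ambient_c

# Figures from the discussion above: 18C ambient, ~60C GPU, ~70C CPU.
print(load_delta(60.0, 18.0))  # GPU: 42C over ambient
print(load_delta(70.0, 18.0))  # CPU: 52C over ambient
```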


----------



## PillarOfAutumn

Quick question. I have one R9 290 that is being cooled with an EK water block. I've got a good deal on a 290 Asus DCUII. I will just buy a DCUII-specific waterblock from EK. Now, since I'll have two different waterblocks, will I be able to connect them together with an EK bridge? Both are compatible with the EK terminal, so I'm just wondering if it will line up correctly. Thank you.


----------



## tsm106

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Quick question. I have one R9 290 that is being cooled with an EK water block. I've got a good deal on a 290 Asus DCUII. I will just buy a DCUII-specific waterblock from EK. Now, since I'll have two different waterblocks, will I be able to connect them together with an EK bridge? Both are compatible with the EK terminal, so I'm just wondering if it will line up correctly. Thank you.


EK bridge will not work as the ports won't line up. You will want to keep as much space between the blocks as possible so that you can get creative with connecting the ports between the two cards.


----------



## taem

Quote:


> Originally Posted by *tsm106*
> 
> EK bridge will not work as the ports won't line up. You will want to keep as much space between the blocks as possible so that you can get creative with connecting the ports between the two cards.


Seriously?? I thought all the EK blocks could be linked with their SLI bridges.

I would probably use a vid connector if I couldn't use a bridge.


----------



## BradleyW

Just wanted to say:
The following games don't support CFX at the driver level:

Ryse
Mordor
The Evil Within
Borderlands 3
COD: Advanced Warfare
AC Unity
Far Cry 4
Lords of the Fallen
Dead Rising 3










I fear that AMD won't release a new driver until Dragon Age 3. That driver might only increase performance for Civ 6 and Dragon Age 3.


----------



## Agent Smith1984

To have Far Cry 4 hitting the ground in a week, and not have crossfire support already squared away is a friggin' disgrace!


----------



## BradleyW

Quote:


> Originally Posted by *Agent Smith1984*
> 
> To have Far Cry 4 hitting the ground in a week, and not have crossfire support already squared away is a friggin' disgrace!


That whole list is a complete disgrace. They only care about Civ 6 and Dragon Age 3. (It appeared to be this way at the time this comment was written.)


----------



## ZealotKi11er

Quote:


> Originally Posted by *Agent Smith1984*
> 
> To have Far Cry 4 hitting the ground in a week, and not have crossfire support already squared away is a friggin' disgrace!


Who is going to play FC4 anyway? Better to wait for them to iron out the bugs and then play. These days it's really stupid to buy and try to play PC games on day 1. I have limited time for playing games, so when I do play these games I want the best experience. I only waited 2 years to play FC3. Well worth the wait.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> That's not very good cooling. It's really a question of what level of delta anyone is willing to achieve or put up with. Judging by your temps, you are fine with a large delta, but that's not the same for everyone. Moving on lower delta equates to more rad space. You say you have low ambient so what maybe 18c? That's a load delta of 42c on gpu and 52c on cpu for stock clocked gpus. That's not very good cooling. The level of cooling also has a direct impact on overclocking and benching, more is better. I'm with the longtime wc reviewer and guru Martin, who applies his own formula of one 360mm rad per block. Now of course not everyone has the budget or the space for that level of rad surface, but I think it's good that they know what exactly it is to have very good temps. There is also a relationship between TDP, watts drawn and radiator surface area. Many dependencies not enough time.


Wow, 360 per block. No wonder the 295 got such a bad rap.

Quote:


> Originally Posted by *Agent Smith1984*
> 
> To have Far Cry 4 hitting the ground in a week, and not have crossfire support already squared away is a friggin' disgrace!


I wouldn't crossfire even the 280X with the Thuban. Also, there is an alternative: Nvidia.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> Wow, 360 per block. No wonder the 295 got such a bad rap.
> I wouldn't crossfire even the 280X with the Thuban. Also, there is an alternative: Nvidia.


Considering a 290X can dump 450W of heat alone, even a 360 is not enough to keep a good delta temp. I can easily see a 15C+ delta with my system, which sucks. One day I will have more rad space and not have to worry, but I need money, one more pump and a larger case.


----------



## Bertovzki

Quote:


> Originally Posted by *cephelix*
> 
> @Bertovzki dude,that is some awesome artistic ability and creativity you have.....wish i could be that...but sadly, i'm not....and they say left handers are supposed to be more artistic. I suppose benches are easier to standardise as they run on the same graphical settings along with having consolidated sites for results eg 3dmark. Games, especially with the myriad of settings available make it tedious to compare...


I will at some point do a bench, just to see how things are working, more out of curiosity for a comparison to my current system, an HD 5850 and a 2.1GHz dual core, lol. The 4790K and 1 or 2 290Xs should be a bit better.
Thanks for the compliment on the 2 pics I posted. I have many more; this is a sadly neglected hobby. I am more passionate about my guitar skills, and that's one of the reasons I'm building a new system soon, for recording music and playing guitar with real-time effects; the dual core is too slow.
This whole PC build has given me a new idea for my art though: "UV paintings that glow bright", so that a drawing in colour pencil of a sunset, for example, if done in UV paint, would be awesome! Also I want to get a second NZXT Hue RGB for room lighting as well, and some more Darkside lights for my paintings.
This sort of thing:
This sort of thing :


Spoiler: Warning: Spoiler!







Does anyone on here know what AMD is doing regarding continued manufacture of the 290, or more specifically the Sapphire R9 290X Tri-X OC? I hope it will be manufactured for some time, as I will want a second Tri-X in about 6 months. I'm just hoping they will be around for a while.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Bertovzki*
> 
> I will at some point do a bench ,just to see how things are working , more out of curiosity for a comparison to my current system a HD5850 and dual core 2.1 ghz lol , the 4790K and 1 or 2 x 290x's should be a bit better
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for Compliment of my 2 pic's I posted , I have many more , this is a sadly neglected hobby ,I am more passionate about my guitar skills , and one of the reasons I have a new system soon , for recording music and playing guitar with real time effects , dual CPU too slow.
> This whole PC build has given me a new idea for my art though , " UV paintings that glow bright " , so that drawing in colour pencil of the sunset for e.g. ,if in UV paint would be awesome ! , also I want to get a second NZXT Hue RGB for room lighting as well and some more Darkside lights for my paintings.
> This sort of thing :
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Anyone on here know what AMD are doing regarding continued manufacture of the 290 , or more specifically the Sapphire R9 290X Tri-X OC , I hope it will be manufactured for some time , as I will want a second Tri-X in about 6 months , I'm just hoping they will be around for a while


Pretty sure you can find used ones, and knowing AMD, the 290 is here to stay. Just look at the 7970: 3 years and going strong.


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Considering 290X can dump 450W of heat lone even 360 is not enough to keep good delta temp. I can easily see 15C+ detla with my system which sucks. One day i will have more RAD space and not have to worry but need money, one more pump and a larger case.


I can only fit 3 120 rads in my case and just recently added a 290 using the same config. I am sure if my ambient were higher than 25C the rads would not be enough. But before I crossfired, my 290 would not even see 60C on the core playing BF4, though I had it at stock as usual.

Maybe the 290X is significantly warmer even at stock. Probably a 240 will suffice for a 290, but not a 290X.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> I can only fit 3 120 rads in my case and just recently added a 290 using the same config. I am sure if my ambient were higher than 25C the rads would not be enough. But before I crossfired, my 290 would not even see 60C on the core playing BF4, though I had it at stock as usual.
> 
> Maybe the 290X is significantly warmer even at stock. Probably a 240 will suffice for a 290, but not a 290X.


For games at stock it's fine. It's just scary to play BF4 on my system with +200mV, as the water temp starts at ~28C and climbs to ~42C, and that's where the OC crashes. Also you have to consider summer temps. My loop was really made for a 1-card setup pushed hard, or 2 cards very lightly. It does not help that my fans are @ 7V; 12V would drop the temps a good 7-8C, but at the cost of noise. Hopefully AMD and Nvidia don't make 300W+ cards.


----------



## Bertovzki

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Pretty sure you can find used and knowing AMD 290 is here to say. Just look at 7970. 3 years and going strong
> 
> 
> 
> 
> 
> 
> 
> .


Ok, does that mean there will only be second-hand options soon? I definitely want a new second card, not a second-hand one, so maybe I need to buy one sooner?


----------



## Jflisk

I have 3x R9 290X. I see about 58C max across the board under water, playing Titanfall for 3-4 hours straight. That's with 2 D5s, 1x 240 Monsta rad, 1x 120 and 1x 240. The only thing I don't like about this system is that it idles at 44C across the board.

So that's the FX-9590 processor, 225W at idle or thereabouts, and I think this is where my high idle temp comes from.
The 3x R9 290X I'm not sure what they idle at in watts, but I know what they hit under load. I have measured the whole system with a WEMO, basically an electronic kilowatt meter: it hits 1130 watts at full load.


----------



## cephelix

Quote:


> Originally Posted by *Bertovzki*
> 
> Ok, does that mean there will only be second-hand options soon? I definitely want a new second card, not a second-hand one, so maybe I need to buy one sooner?


For the lights, why not get an RGB light kit? It comes with a controller to adjust lighting, etc. Check it out if you're interested. UV paints would be cool, but would they lose their fluorescence over time? I've always envied those cases that are air-brushed or modded, or have trimmings made of acrylic and the like...
As for the card, if you can afford it, why not just get it now if you know you're going to use it down the road? But as @ZealotKi11er has mentioned, it should still be available even if AMD releases a new generation of cards, especially if it's one of the more popular models.


----------



## Bertovzki

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Quick question. I have one R9 290 that is being cooled with an EK water block. I've got a good deal on a 290 Asus dcuii. I will just buy a dcuii specific wsterblock from ek. Now, since I'll have two different wsterblocks, will I be able to connect them together with an ek bridge? Both are compatible with the ek terminal so I'm just wondering if it will line up correctly. Thank you.


Quote:


> Originally Posted by *ZealotKi11er*
> 
> Considering 290X can dump 450W of heat lone even 360 is not enough to keep good delta temp. I can easily see 15C+ detla with my system which sucks. One day i will have more RAD space and not have to worry but need money, one more pump and a larger case.


It's interesting and good to see the same questions asked here as are currently on the 750D thread. I asked anyone there to post more detail about their entire loop and cooling system, because to say "I have this rad and this card" is not enough. There are so many variables that can make a system run cooler, and there's some debate on exhausting vs intaking through the rads; I have a different view for my requirements and preferences.

I'd like to see a few posts of all the variables in some dudes' systems, with water temp and CPU/GPU temps posted, to get comparisons:

What is the exhaust vs intake setup? Are your intakes through a rad, or are you exhausting through a rad? What fan speeds? What fans? Size, CFM, H2O?

What radiators? What size?

What CPU block? What GPU block? What thermal paste?

What case? How much cool air is coming in? What is the flow of air like through the case, obstructed or unobstructed?

What balance of intake vs exhaust fans?

How many HDDs or optical drives? They add a lot of heat to a case. Do you have none, like me, or do you have 5 HDDs, lol? Is there also a fan controller or light controller?

What are your temps, no OC vs OC?

What CPU and GPU do you have?

What is the ambient temperature in your room? The temperature in your case?

What are the temps of your CPU and GPU, and the water temp?

Why haven't you got a fridge or nitrogen cooling? What's wrong with you?

All these variables can add up and make a difference, half a degree here and there, and maybe 2 or 3 elsewhere.

This sort of thing comes up often on threads. It's not really off topic for the 290 thread or the 750D thread, for example, because so many want to know this specific to their card or case or both. I could start a thread for this data on people's different variables where we could all post this info, but it's probably not necessary, as it's just as useful seeing it here or on some case thread. But maybe it would be a good way to specifically compare your setup or case to others?


----------



## Bertovzki

Quote:


> Originally Posted by *cephelix*
> 
> for the lights, why not get an RGB light kit? Comes with controller to adjust lighting etc etc. Check it out if you're interested. UV paints would be cool, but would they lose the flourscence over time?? I've always envied those cases that are air-brushed or modded/have trimmings made of acrylic and the like...
> as for the card, if you could afford it, why not just get it now if you know you're going to use it down the road?? But as @ZealotKi11er has mentioned, it should still be available even if amd releases a new generation of cards.and especially if it's one of the more popular models


True, that's what I was hoping: that they will be around for quite a while, because they are still very current and very good. I won't be able to afford another for 6 months; there's a car to repair, moving house, SSDs to buy and the PSU yet.
I do have the NZXT Hue RGB for my case and Darkside UV strips, and various different single- and double-colour LEDs too.
I just want a second NZXT for around my desk and room lighting, and some more UV strips for paintings, so it will be a light fest.


----------



## Buehlar

Quote:


> Originally Posted by *Bertovzki*
> 
> It's interesting and good to see the same questions asked here as is currently on the 750D thread , I asked anyone to post more detail about their entire loop and cooling system , because to say I have this rad and this card is not enough , there are so many variables that can make a system run cooler , and som debate on Exhausting vs Intaking through the rads , I have a different view for my requirements and preferences .
> 
> Id like to see a few posts of all the variables in some dudes systems with water temp and CPU , GPU temps posted to get comparisons :
> 
> What is the exhaust vs intake setup ?, are your intakes through a rad or exhausting through a rad ?, what fan speeds ? what fans ? size Cfm ? H20 ?
> 
> What Radiators ? what size ?
> 
> What CPU block ? what GPU block ? what thermal paste ?
> 
> What case ? how much cool air is coming in ? what is the flow of air like through the case ? obstructed or unobstructed ?
> 
> What balance of intake vs exhaust fans ?
> 
> how many HDD's or optical drives , they add a lot of heat to a case , do you have none like me , or do you have 5 lol HDD's is there also fan controller light controller ?
> 
> What are temps no OC vs OC temp
> 
> What CPU ? and GPU ? do you have ?
> 
> What is your ambient temperature in room ? temperature in your case ?
> 
> What is the temps of your CPU and GPU ? and the water temp ?
> 
> Why havn't you got a fridge or nitrogen cooling , whats wrong with you ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All these variables can add up and make a difference a half a degree here and there , and maybe 2 or 3 here.
> 
> This sort of thing comes up often on threads it's not really off topic to 290 thread or a 750D thread for eg because so many want to know this specific to their card or case or both, however I can start a thread on this data of peoples different variables where we can all post this info , but probably not necessary as , it's just as useful seeing it here or some case thread , but maybe it would be a good way to specifically compare your setup or case to others ?


Out of all the above, the one thing that trumps everything is having adequate rad space. The more surface area of rad, the cooler your temps; simple as that.

Of course everything else is a factor, but it has minimal impact on temps compared to rad capacity. People like to split hairs debating the performance/noise ratio of fans, intake vs exhaust, push/pull configs, etc.
You do gain or lose a degree here and there, but it's a fraction of the delta gained via rad space.


----------



## bond32

Proper rad space for 4 290x's:


----------



## Bertovzki

Quote:


> Originally Posted by *bond32*
> 
> Proper rad space for 4 290x's:


That's the only way rad space is not an issue, and why the other variables don't matter.

Re splitting hairs: this is so often the case, agreed. It isn't with my build, and that's why 2 rads are fine for me, an RX360 and an EX280 with Aerocool DS 140s on the EX. I'll be exhausting out through my rads, not intaking, because I prefer cool air coming in and not hot rad air.


----------



## tsm106

^^ There are easier ways of getting similar rad surface area, and for less too, I might add.


----------



## amptechnow

I know I need more rad, but a 240 keeps my 290 @ 1135/1625 and 3770K @ 4.8-5 cool. I keep everything clean, have good case airflow and push/pull on the rad. Soon I will be modding the case, removing the hard drive/SSD cages, adding a 360 to the front intake fans and putting my other 290 under water.


----------



## bond32

At some point I will break down and get a bunch of QDCs. I would like to test each of these rads, not that I would be posting any data like some of the other members here, but just because I am curious.


----------



## VSG

^ That's the spirit


----------



## Buehlar

Quote:


> Originally Posted by *geggeg*
> 
> ^ That's the spirit


^^ Speaks the great "hair splitter" ^^ LOL


----------



## bond32

I mean, I want to test things like what the temperatures would actually be if I tried to run all four cards off one 240 or 480... Crazy, but I'm curious.


----------



## VSG

Quote:


> Originally Posted by *Buehlar*
> 
> ^^ Speaks the great "hair splitter" ^^ LOL


??
Quote:


> Originally Posted by *bond32*
> 
> I mean, I want to test things like what the temperatures would actually be if I tried to run all four cards off one 240 or 480... Crazy, but I'm curious.


Not crazy at all. People keep asking if a single 120/240 can cool 1-2 components all the time. Using a 480 for your 4 cards would be similar.


----------



## Buehlar

Quote:


> Originally Posted by *geggeg*
> 
> ??


Oh nothing...just trolling ya VSG.









In context with my previous post, I just thought it was ironic that you popped in @ the right moment


----------



## ZealotKi11er

Quote:


> Originally Posted by *bond32*
> 
> I mean, I want to test things like what the temperatures would actually be if I tried to run all four cards off one 240 or 480... Crazy, but I'm curious.


You will kill the pump. With 2 cards I get about a 15C delta; 4 cards would be a 30C delta. Water is ~25-30C normally; @ 65C it's too much.


----------



## VSG

Quote:


> Originally Posted by *Buehlar*
> 
> Oh nothing...just trolling ya VSG.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In context with my previous post, I just thought it was ironic that you popped in @ the right moment


lol ok









Quote:


> Originally Posted by *ZealotKi11er*
> 
> You will kill the pump. With 2 cards i get about 15C delta. 4 cards would be 30C delta. Water ~ 25C-30C. @ 65C its too much.


Very good point, +1. This need not be the case, but it's definitely something to watch out for. I wouldn't ever run 4 cards on a single 240 myself.


----------



## bond32

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You will kill the pump. With 2 cards i get about 15C delta. 4 cards would be 30C delta. Water ~ 25C-30C. @ 65C its too much.


What? lol.

Not going to kill the pump lol.


----------



## Buehlar

Quote:


> Originally Posted by *bond32*
> 
> What? lol.
> 
> Not going to kill the pump lol.


Pump life is relative to water temps. Too high is bad for the pump.


----------



## bond32

Quote:


> Originally Posted by *Buehlar*
> 
> Pump life is relative to water temps. Too high is bad for the pump.


Yeah... 100-degree water is bad. 15 degrees over ambient? Are you joking??

In case you guys didn't know, the D5 pump is a repurposed solar recirculation pump. Max water temperature is 160 degrees. Just an FYI...


----------



## ZealotKi11er

Quote:


> Originally Posted by *bond32*
> 
> What? lol.
> 
> Not going to kill the pump lol.


Water temp kills the pump. One reason the 295X2 has a temp limit is because of the pump.

GPUs are fine running with limited rad. CPUs are not. Let's say your CPU hits 75C alone. It would probably have a 3-5C delta. Make that 30C and your CPU will hit 100C.
Quote:


> Originally Posted by *bond32*
> 
> Yeah... 100 degree water is bad. 15 degrees over ambient? You joking??
> 
> In case you guys didn't know, the D5 pump is a repurposed solar recirq pump. Max water temperature is 160 degrees. Just an fyi...


For the D5, max liquid temp is 60C.


----------



## bond32

So you mean to tell me that a pump circulating water at 90, possibly 100 degrees F is going to fail, even if it has a temperature limit of 160 F?

Well, can't argue with that logic...


----------



## Buehlar

Quote:


> Originally Posted by *bond32*
> 
> Yeah... 100 degree water is bad. 15 degrees over ambient? You joking??
> 
> In case you guys didn't know, the D5 pump is a repurposed solar recirq pump. Max water temperature is 160 degrees. Just an fyi...


Dude I didn't say 15 degrees over ambient will kill a pump. Where the ___ did you get that from?


----------



## bond32

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Water Temp kills pump. One reason 295X2 has a limit in temp because of the pump.
> 
> GPUs are fine running with limited RAD. CPUs are not. Lets say your CPU hits 75C alone. It would probably have 3-5C Delta. Make that 30C and your CPU will hit 100C.
> For D5 max liquid temp is 60C.


The pump within the 295X2? You're seriously comparing that to a D5??

60C is still 140F. If you have water temperatures anywhere remotely close to that, you have a serious problem.

Edit:
Quote:


> Originally Posted by *Buehlar*
> 
> Dude I didn't say 15 degrees over ambient will kill a pump. Where the ___ did you get that from?


I didn't say you said he said, blah blah, who cares. It's been stated multiple times in this thread. Since I don't actually have a thermocouple in mine, I will assume it is about 15 degrees (worst case) over ambient. Is that not fair? What water temperature delta do you use?


----------



## Buehlar

Quote:


> Originally Posted by *ZealotKi11er*
> 
> For D5 max liquid temp is 60C.


^^This!


----------



## Buehlar

Quote:


> Originally Posted by *bond32*
> 
> The pump within the 295x2? You seriously comparing that to a D5??
> 
> 60 C is still 140 F. If you have water temperature anywhere remotely close to that you have a serious problem.
> 
> Edit:
> Didn't say you said he said blah blah who cares. It's been stated multiple times in this thread. Since I don't actually have a thermocouple in mine, I will assume it is about 15 degrees (worst case) over ambient. Is that not fair? What water temperature delta do you use?


Whatever man... if you really wanna run quad 290Xs through a single 240mm then go ahead. I have no clue how you expect to test this when you don't even know your current delta.
Regardless, the manufacturer rating is 60C for a D5.

P.S.
Get yourself a temp probe.


----------



## ZealotKi11er

Quote:


> Originally Posted by *bond32*
> 
> The pump within the 295x2? You seriously comparing that to a D5??
> 
> 60 C is still 140 F. If you have water temperature anywhere remotely close to that you have a serious problem.
> 
> Edit:
> Didn't say you said he said blah blah who cares. It's been stated multiple times in this thread. Since I don't actually have a thermocouple in mine, I will assume it is about 15 degrees (worst case) over ambient. Is that not fair? What water temperature delta do you use?


Don't mix F and C in the same discussion, as it confuses people. I got my water temp to 50C with just 2x 290s on a 360 and a 240. You can easily hit 60C with 4 cards.
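Since the argument above turns partly on mixed scales (60C vs "160 degrees"), the conversions are worth spelling out; these are just the standard formulas applied to the figures quoted in this exchange:

```python
def c_to_f(celsius: float) -> float:
    """Convert Celsius to Fahrenheit."""
    return celsius * 9 / 5 + 32

def f_to_c(fahrenheit: float) -> float:
    """Convert Fahrenheit to Celsius."""
    return (fahrenheit - 32) * 5 / 9

# The figures being argued about, in both scales:
print(c_to_f(60))   # 140.0 F -- the D5's 60C plastic-housing limit
print(f_to_c(160))  # ~71.1 C -- the "160 degrees" figure, read as F
```

Read this way, the two sides are closer than the argument suggests: 60C and 160F are about 11C apart, and both are far above any sane loop temperature.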


----------



## kizwan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> What? lol.
> 
> Not going to kill the pump lol.
> 
> 
> 
> Water Temp kills pump. One reason 295X2 has a limit in temp because of the pump.
> 
> GPUs are fine running with limited RAD. CPUs are not. Lets say your CPU hits 75C alone. It would probably have 3-5C Delta. Make that 30C and your CPU will hit 100C.
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Yeah... 100 degree water is bad. 15 degrees over ambient? You joking??
> 
> In case you guys didn't know, the D5 pump is a repurposed solar recirq pump. Max water temperature is 160 degrees. Just an fyi...
> 
> 
> 
> 
> 
> 
> For D5 max liquid temp is 60C.

Max. system temperature
-10 to + 95°C for pumps with brass housing (non-freezing)
+/- 0 to + 60°C for pumps with plastic housing (non-freezing)
Quote:


> Originally Posted by *bond32*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Water Temp kills pump. One reason 295X2 has a limit in temp because of the pump.
> 
> GPUs are fine running with limited RAD. CPUs are not. Lets say your CPU hits 75C alone. It would probably have 3-5C Delta. Make that 30C and your CPU will hit 100C.
> For D5 max liquid temp is 60C.
> 
> 
> 
> 
> 
> 
> 
> The pump within the 295x2? You seriously comparing that to a D5??
> 
> 60 C is still 140 F. If you have water temperature anywhere remotely close to that you have a serious problem.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Edit:
> Quote:
> 
> 
> 
> Originally Posted by *Buehlar*
> 
> Dude I didn't say 15 degrees over ambient will kill a pump. Where the ___ did you get that from?
> 
> Click to expand...
> 
> Didn't say you said he said blah blah who cares. It's been stated multiple times in this thread. Since I don't actually have a thermocouple in mine, I will assume it is about 15 degrees (worst case) over ambient. Is that not fair? What water temperature delta do you use?
Click to expand...

That's not what they're trying to say to you. You did say you wanted to test the radiators one by one to see their performance. They just want to warn you of the likely high water temp when testing multiple cards on 120mm or 240mm rads. That's all.


----------



## Buehlar

Thank you @kizwan your translation skills are impressive


----------



## Jflisk

Quote:


> Originally Posted by *bond32*
> 
> Proper rad space for 4 290x's:


I have to say one thing: your system is neat. It's a new spin on an old ball. Nice.


----------



## tsm106

Quote:


> Originally Posted by *kizwan*
> 
> They just want to warn you of the likely high water temp when testing multiple cards with 120mm or 240mm rads. That's all.


It's a non-issue. If you think anyone running 60°C water temp has not noticed his crap melting right in front of him, well... can't argue with that. Shrugs...


----------



## Bertovzki

Quote:


> Originally Posted by *bond32*
> 
> I mean, I want to test things like, what actually the temperatures would be if I tried to run all four cards and off one 240 or 480... Crazy but I'm curious


I would like to see that, and what the difference is with the rest of the system being the same. If it could be done at the same ambient room temp it would be interesting, and as far as I know no one has bothered to test like that, so if you go to the effort it would be good to take all the relevant data and post it.
The more rads, the greater the diminishing returns, especially for one card, but with 4 cards you could probably still yield some improvement from many more rads. Another interesting test would be to try 1 x 290 on all your rads as opposed to your smallest rad, preferably a 120 if you could get your hands on one.

Quote:


> Originally Posted by *geggeg*
> 
> ??
> Not crazy at all. People keep asking if a single 120/240 can cool 1-2 components all the time. Using a 480 for your 4 cards would be similar.


Yeah, what you said. It does get asked all the time; it is one of the most asked questions on the 750D thread.
I see some quite big differences between similar rad setups (at least in rad size); that's why I gave a long list of variables, because it's not always about rad surface area and as simple as that. It's a great place to start, though, and just following the basic rule of thumb is all that is needed most of the time. I'm not into splitting hairs, I would just like to see a bunch of other tests and comparisons done on some aspects.


----------



## kizwan

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> They just want to warn you of the likely high water temp when testing multiple cards with 120mm or 240mm rads. That's all.
> 
> 
> 
> It's a non-issue. If you think anyone running 60c water temp has not noticed his crap melting right in front of them, well... can't argue with that. Shrugs...
Click to expand...

Well, that is the idea of a warning. No one wants to deliberately kill or melt their crap. If I were running the test, I would stop it before the water temp reached 60°C. And not everyone knows the temp limit of D5 pumps or takes into account the temp limits of the other stuff in the loop. It was just a friendly warning that turned into an unnecessary argument.


----------



## Sgt Bilko

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> They just want to warn you of the likely high water temp when testing multiple cards with 120mm or 240mm rads. That's all.
> 
> 
> 
> It's a non-issue. If you think anyone running 60c water temp has not noticed his crap melting right in front of them, well... can't argue with that. Shrugs...
> 
> Click to expand...
> 
> Well that is the idea of a warning. No one want to deliberately killed or melted their crap. If I'm running the test, I would have stopped the test before water temp reached 60C. And, not all people know temp limit of D5 pumps and/or take into account the temp limit of other stuff in the loop. It's just friendly warning that turned into unnecessary argument.
Click to expand...

All good info to me, just about to setup my first loop and every bit I learn now will help out in the future


----------



## tsm106

Quote:


> Originally Posted by *Sgt Bilko*
> 
> All good info to me, just about to setup my first loop and every bit I learn now will help out in the future


I think you'd glean a lot more from reading all the articles at Martin's.

http://martinsliquidlab.org/


----------



## Sgt Bilko

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> All good info to me, just about to setup my first loop and every bit I learn now will help out in the future
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think you'd glean a lot more from reading all the articles at Martin's.
> 
> http://martinsliquidlab.org/
Click to expand...

Yep, been doing a bit of research lately when time allows it.

GPU(s) aren't going under water yet, as my 290s are getting donated to my wife's build and I'll just be running the 295X2 for the time being, but info helps no matter the source.

(although Martin is probably the best source of info going for watercooling)


----------



## Mega Man

Quote:


> Originally Posted by *Bertovzki*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> I mean, I want to test things like, what actually the temperatures would be if I tried to run all four cards and off one 240 or 480... Crazy but I'm curious
> 
> 
> 
> I would like to see that , and what the difference is , with the rest of the system being the same ,and if it could be done in the same ambient room temp would be interesting and as far as i know no one has bothered to test like that, if you were to bother with the effort it would be good if you did take all relevant data and post it .
> The more rads the greater the diminishing the returns are ,especially for one card , but 4 cards you could still probably yield some improvement for many more rads , another interesting test to do would be , try 1 x 290 for all your rads as opposed using your smallest rad,preferably a 120 if you could get your hands on one
> 
> Quote:
> 
> 
> 
> Originally Posted by *geggeg*
> 
> ??
> Not crazy at all. People keep asking if a single 120/240 can cool 1-2 components all the time. Using a 480 for your 4 cards would be similar.
> 
> Click to expand...
> 
> Yeah what you said , it does get asked all the time , it is one of the most asked questions on the 750D thread
> I see some quite big differences between similar rad setups ( at least in rad size ) ,thats why I gave a long list of variables ,because its not always about rad surface and as simple as that
Click to expand...

I love when people talk about diminishing returns,

as it can be completely false.

What people should talk about, as it has much more relevance, is:

what do you want out of your system

how much space and money do you have available

The answer to the first question is either max performance, silence, or a mix.

The more rads you have, the slower your fans can run and the less noise is made.

And even with high-speed fans, a small improvement is more than none.

The money and space are obvious.

Ambient temps are easy to overcome in testing.

Generally speaking, if your ambient temp goes down 1 degree, so will your core. This is why a delta is a far more accurate way to test.
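That delta-based comparison is easy to show with numbers. Raw core temps logged on different days (different room temps) aren't comparable, but the deltas over ambient are. The run values below are made-up examples:

```python
# Comparing two cooling test runs by delta over ambient rather than raw temps.

def delta_over_ambient(core_c, ambient_c):
    """Temperature rise of the component above room ambient."""
    return core_c - ambient_c

# Hypothetical runs on different days with different room temperatures:
run_a = delta_over_ambient(core_c=48.0, ambient_c=21.0)  # cool room
run_b = delta_over_ambient(core_c=52.0, ambient_c=26.0)  # warm room
print(run_a, run_b)
# Run B looks 4 C hotter in absolute terms, yet its delta is actually
# 1 C lower, so setup B cools slightly better.
```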


----------



## cephelix

Quote:


> Originally Posted by *Bertovzki*
> 
> True thats what I was hoping , that they will be around for quite a while because they are still very current and very good. I wont be able to afford another for 6 months ,car to repair ,moving house , SSD's to buy and PSU yet.
> I do have the NZXT Hue RGB for my case and Darkside UV strips and various different single and double colour led's too
> 
> 
> 
> 
> 
> 
> 
> I just want a second NZXT for around my desk and room lighting
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> and some more UV strips for paintings , so it will be a light fest


All the best in your future endeavours, man... where are you going to mount the dials if you put the NZXT LEDs around your desk?? After buying the 290, I feel somewhat shortchanged by Nvidia for the GTX 460 and 570 I bought.


----------



## Bertovzki

Quote:


> Originally Posted by *cephelix*
> 
> all the best in your future endeavours man...where are you going to mount the dials if u put the nzxt leds around your desk?? After buying the 290, i feel somewhat shortchanged by nvidia for the gtx 460 n 570 i bought.


I will get one of those small PSUs I can just plug into the wall and run it from there, and probably build a small acrylic box with an optical drive in it and an external HDD, like a mini 3-bay rack. That will be some way down the track; I would get another 290 first and the rest of my gear.
Re: shortchanged by Nvidia, you mean you get far more bang for the buck with a 290?


----------



## cephelix

Quote:


> Originally Posted by *Bertovzki*
> 
> I will get one of those small PSU ,i can just plug into the wall and run it from there and probably build a small acrylic box with a optical drive in it and a external HHD ,like a mini rack 3 bay, that will be some ways down the track , i would get another 290 first and the rest of my gear.
> Re - short changed by Nvidea , you mean you have far more bang for the buck with a 290 ?


Nice... post pics when you're done. Considering the specs between the two, at the point I purchased my 290 the Nvidia equivalent was the 780. The 290 cost me SGD 650 while the GTX 780 would've been around 800...
Only the 970 now seems quite reasonable, but the price difference between that and the 980 is about SGD 300, I think.


----------



## Bertovzki

Quote:


> Originally Posted by *cephelix*
> 
> Nice..post pics when you're done.considering the specs between the two and at the point of me purchasing my 290, the nvidia equivalent was the 780.the 290 costed me SGD650 while the gtx 780 would've been around 800....
> only the 970 now seems quite reasonable but the price difference between that and the 980 is about Sgd300 i think


My 290 cost NZD 642; a 780 Ti would be $800, so it's the same story for me, just better value for money, especially if I get two. And I could have just paid $500 for an R9 290 Tri-X, which is the cheapest 290 I can find here, so I'm paying more, at greatly diminished returns, for the extra factory overclock and a few more stream processors, just to put it under water anyway, plus a nice heat sink I can hang on the wall.


----------



## cephelix

Quote:


> Originally Posted by *Bertovzki*
> 
> My 290 cost NZD $642 a 780 ti would be $800 ,so same for me its just better value for money, especially if i get 2 , and i could have just paid $500 for a R9 290 trix which is the cheapest 290 I can find here, so paying more for greatly diminished returns
> 
> 
> 
> 
> 
> 
> 
> for the extra factory over clock and a few more stream processors ,just to put it under water anyway ,and a nice heat sink I can hang on the wall


or you could you know....sell your 290 and get 2 X's..lol


----------



## Sgt Bilko

290X's have dropped down again on Newegg:

XFX DD R9 290X for $329.99


----------



## cephelix

That is cheap!!... too bad the prices here aren't.


----------



## Bertovzki

Quote:


> Originally Posted by *cephelix*
> 
> or you could you know....sell your 290 and get 2 X's..lol


Sorry, I was not very clear. I do have the R9 290X Tri-X OC and will get another one. I meant I could have got the cheapest plain 290, which happened to be the R9 290 Tri-X. There are 3 versions of the Sapphire Tri-X: "R9 290 Tri-X, R9 290 Tri-X OC, R9 290X Tri-X OC".
All the same yellow and black. The first one is stock, the second has a small factory overclock, and the third, which I have, has the factory overclock plus more stream processors.


----------



## cephelix

Quote:


> Originally Posted by *Bertovzki*
> 
> Sorry I was not very clear I do have the R9 290X Tri-X OC and will get another one, I meant i could have got the cheapest just plain 290 , which happened to be the R9 290 TriX , there are 3 versions of the Sappire Tri -X , they are " R9 290 Tr-X , R9 290 Tri-X OC , R9 290X Tri-X OC "
> All the same yellow and black the first one is stock ,the second a small factory over clock the third one i have, has the factory over clock + more processor streams


My bad, I just assumed it was a 290. Well then, you could sell it off and get 2 of the 295X2... lol


----------



## Bertovzki

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 290x's have dropped down again on Newegg
> 
> 
> 
> 
> 
> 
> 
> 
> 
> XFX DD R9 290x for $329.99


Quote:


> Originally Posted by *cephelix*
> 
> That is cheap!!....too bad the prices here arent so


Yeah, that is cheap, and I knew I could get one cheaper overseas, but did not look. I have bought most of my stuff from Frozen and PPCs, but I guess I just wanted to buy it close to home, delivered locally in a couple of days for a change, with less chance of it getting bashed about by truck drivers and couriers, and easier to sort any warranty issue if I was unfortunate enough to experience one. So I just pulled the trigger; now I can stare at the box until I get my @#^* together and start my build. Really I'm just waiting on a camera, as I want good-quality photos instead of that awful cell phone I have to use at the moment, and the plan is I may get one tomorrow. Because I am doing so many mods and acrylic lining etc., I want to record the whole build log in nice clear pics.


----------



## tarui04

Do I need to apply thermal paste/pads to the chokes (on a reference MSI 290X) when installing an Aquacomputer waterblock? The stock heatsink has a thermal pad that contacts the heatsink's fins, but the AQC manual didn't mention anything about applying thermal paste/pads to them.

This is the version I am using
http://www.coolingconfigurator.com/upload/pictures/HIS-Radeon-R9-290X-IceQ-X2-Turbo-4GB-GDDR5-%28H290XQMT4GD%29-PCB.jpg


----------



## Bertovzki

Quote:


> Originally Posted by *cephelix*
> 
> my bad..i just assumed it was a 290.well then, you could sell it off and get 2 of the 295x2..


Oh yeah, why didn't I think of that... oh that's right, I'm poor.


----------



## cephelix

Quote:


> Originally Posted by *Bertovzki*
> 
> Yeah that is cheap and I knew I could get one cheaper over seas , but did not look , and I have bought most of my stuff from Frozen and PPC's , but I guess I just wanted to get It close ,local delivered in a couple of days for a change , and less chance of it getting bashed about by truck drivers and couriers etc...and easier to sort any warranty issue if i was unfortunate enough to experience one, so just pulled the trigger , now I can stare at the box , until i get my @#^* together and start my build, really just waiting on a camera, as I want good quality photos instead of that awful cell phone I have to use at moe , and I may get one tomorrow is the plan , because I am doing so many mods and acrylic lining etc.. I want to record the whole build log ,and in nice clear pic's


Looking forward to seeing your build log. I'm far, far too lazy, disorganized and impatient to start one.

But with the 295s you wouldn't need to go anywhere... just sitting down in front of your computer, recording or playing games... you save by not going out... and the sheer joy of having a blazing fast computer with awesome graphics will make you forget that you are hungry or thirsty... it's a win-win situation if you ask me... lol


----------



## Bertovzki

Quote:


> Originally Posted by *cephelix*
> 
> looking forward to seeing your build log.i'm far far too lazy, disorganized and impatient to start one
> 
> But with the 295s you wouldnt need to go anywhere...just sitting down infront of your computer, recording or playing games...you save by not going out..and the shwer joy of having a blazing fast computer with awesome graphics will make you forget tht you are hungry or thirsty...it's a win-win situation if you ask me..lol


Ha yeah, I can't argue with logic like that. I haven't done that for a while, lol. I have been known to play guitar for 20 hours straight until the skin is peeling off my fingertips.


----------



## kizwan

Quote:


> Originally Posted by *tarui04*
> 
> Do I need to apply thermal paste/pad to the chokes (for a reference msi 290x) when installing on a Aquacomputer waterblock? the stock heatsink has a thermal pad that contacts the heatsink's fins but the AQC's manual didn't mention anything about applying thermal paste/pad on to it.
> 
> This is the version I am using
> http://www.coolingconfigurator.com/upload/pictures/HIS-Radeon-R9-290X-IceQ-X2-Turbo-4GB-GDDR5-%28H290XQMT4GD%29-PCB.jpg


If the manual doesn't mention it, then you don't need to put thermal pads on the chokes. Always follow the manual. My EK water block also doesn't require thermal pads on the chokes.


----------



## Bertovzki

Quote:


> Originally Posted by *tarui04*
> 
> Do I need to apply thermal paste/pad to the chokes (for a reference msi 290x) when installing on a Aquacomputer waterblock? the stock heatsink has a thermal pad that contacts the heatsink's fins but the AQC's manual didn't mention anything about applying thermal paste/pad on to it.
> 
> This is the version I am using
> http://www.coolingconfigurator.com/upload/pictures/HIS-Radeon-R9-290X-IceQ-X2-Turbo-4GB-GDDR5-%28H290XQMT4GD%29-PCB.jpg


I'm sorry dude, I can't help; I have not put one of these under water yet, so I'm not sure. As far as I know you just apply the thermal pads that come with the block, or the Fujipoly ones from Frozen that some dudes talk about. Some guru will chime in soon and help; I will listen in too.


----------



## cephelix

That is insane....but to be able to get so engrossed in something till the point of losing track of time is great i feel


----------



## Bertovzki

Quote:


> Originally Posted by *kizwan*
> 
> If the manual doesn't mention it, then you don't need to put thermal pad on the choke. Always follow the manual. My EK water block also doesn't require thermal pad on the chokes.


OK, it doesn't require it, but is it bad if you do it? Or advantageous to do so? And is "follow the manual" because, say, it would cause the block to not sit right, for example?


----------



## Klocek001

where's the power control option in Trixxxx ?


----------



## Bertovzki

Quote:


> Originally Posted by *cephelix*
> 
> That is insane....but to be able to get so engrossed in something till the point of losing track of time is great i feel


It was a period of time I was going through that let me play for a long time, and I just loved it, so I did. I have not done that for a long time.


----------



## kizwan

Quote:


> Originally Posted by *Bertovzki*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> If the manual doesn't mention it, then you don't need to put thermal pad on the choke. Always follow the manual. My EK water block also doesn't require thermal pad on the chokes.
> 
> 
> 
> 
> 
> 
> Ok doesnt require it , but is it bad if you do it ? ,or advantageous to do so ?
Click to expand...

I don't know. It's possible that if you put pads on the chokes, it may lift the block enough to prevent perfect contact between the GPU die and the waterblock. This will affect cooling performance. If I remember correctly, there have been a few cases where people mistakenly put thermal pads on the memory (with the Aquacomputer Kryographics block) and this caused high GPU temps because of poor contact between the GPU die and the water block.
Quote:


> Originally Posted by *Klocek001*
> 
> where's the power control option in Trixxxx ?


Scroll down.


----------



## cephelix

Quote:


> Originally Posted by *Bertovzki*
> 
> It was a period of time I was going through that enabled me to play for a long time ,and just loved it, so I did , I have not done that for a long time


Go do it when you can...would feel like the old days


----------



## Klocek001

Quote:


> Originally Posted by *kizwan*
> 
> Scroll down.


Sorry, I was being a moron.

Anyway, what can your 290 do on stock vcore? I've got 1110 from the original 1000 so far, stable, with just +50% power. That comes as a surprise to me since my former 290 needed +100mV for 1150. I'm going to continue raising the core clock by 10MHz without added vcore, but I feel I'm probably close to the limit now. If my card could do 1200 at 50-70mV that would be sweet.


----------



## kizwan

Quote:


> Originally Posted by *Klocek001*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Scroll down.
> 
> 
> 
> 
> 
> 
> Sorry, I was being a moron.
> 
> Anyway, what can your 290 do on stock vcore ? I've got 1110 from the original 1000 so far, stable, with just +50% power. That's comes as a surprise to me since my former 290 needed +100mV for 1150. I'm gonna continue raising the core clock by 10MHz without added vcore but I feel I'm prolly close to the limit now. If my card could do 1200 at 50-70mV that would be sweet.
Click to expand...

Don't worry about it. A lot of people missed it.

I didn't test mine thoroughly at stock voltage. Both of my cards can do 1100 at stock voltage and +50% power in the Unigine Valley benchmark with zero artifacts. I only tested them up to 1100 at stock voltage; maybe they can go higher, but stable in Valley doesn't mean they're stable in games. Interestingly, for 1200MHz I only need +100mV in Valley, but in other benchmarks like 3DMark I'll need +200mV.

Heat is an issue for me because ambient is high, around ~27°C minimum. My cards will always crash in under an hour when overclocked to 1200 while playing BF4. They simply crash, with no artifacts beforehand. I think I'll try 1050 or 1100 at stock voltage later, when my internet quota replenishes. I never thought to try this.


----------



## Klocek001

I only test in games; I don't even have any benchmarks. If heat's the issue, why not try working with Sapphire TriXX's custom fan control? It's been awesome for me so far. I set up 50% for 50 degrees, 55% for 55 degrees and so on, up to 75. I didn't even hit 60% once; my temps (and rpm as well) stay between 50-55 (1110MHz core).

It's 11.11 today, so I'm just going to go from 1110 to 1111 and check for artifacts. So far I've been fine in Max Payne 3 and AC4BF, but those were only 1-hour sessions.
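The curve described above ("50% for 50 degrees, 55% for 55 degrees, and so on up to 75") boils down to fan percentage tracking temperature one-to-one, clamped at a floor and ceiling. A minimal sketch of that mapping, purely illustrative (the function and its bounds are assumptions, not TriXX's actual internals):

```python
# Sketch of the stepped fan curve described in the post: fan % equals the
# temperature in C, clamped between a 50% floor and a 75% ceiling.

def fan_percent(temp_c, low=50, high=75):
    """50% at or below 50 C, then +1% per degree, capped at 75%."""
    return max(low, min(high, int(temp_c)))

for t in (45, 53, 80):
    print(f"{t} C -> {fan_percent(t)} %")
```

With the card holding 50-55°C, this curve never exceeds ~55% fan speed, which matches the behaviour reported.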


----------



## dubdidub

I bought a Sapphire Vapor-X Radeon R9 290 Tri-X OC at amazon.fr last week, luckily it has unlockable shaders. I flashed the Vapor-X 290X Tri-X BIOS (11226-10-40G), now I am searching for the BIOS of the Vapor-X 290X Tri-X OC Version (11226-09-40G). Would be great if anyone had it and could send it to me / upload it somewhere. Thanks!


----------



## kizwan

Quote:


> Originally Posted by *Klocek001*
> 
> I only test in games, I don't even have any benchmarks. If heat's the issue why not try to work with sapphire trixxx custom fan control, it's been awesome so far to me. I set up 50% for 50 degrees, 55% for 55 degrees and so on and son on up to 75. I didn't even hit 60% once, my temps (rpm as well) stay between 50-55 (1110MHz core).
> 
> It's 11.11 today so I'm just gonna go from 1110 to 1111 and check for artifacts. So far I've been fine in Maxpayne 3 and AC4BF but these were only like 1h sessions.


Mine is under water. I didn't record it, but water temps were high. I just checked my old log: when it crashed after 45 minutes in BF4, VRM1 was 79°C, running at 1150/1600 with +100mV, in 30°C ambient. Interestingly, VRM1 had been at 79°C for 20 minutes before it crashed.


----------



## Klocek001

So far I've got 1120MHz stable on stock vcore (Tri-X 290), no artifacts that I saw, temps max 53 degrees in custom fan % mode, testing in AC4BF (my favourite testing game; it always keeps the core clock at 100%).
Anyway, I want to ask some more experienced PC gamers about a thing that's starting to get on my nerves of steel. I get a stable 60fps in AC4, but whenever I'm running fast and changing direction the fps drops by one or two frames, I see a little stuttering, and after a second it gets back to 60. What's responsible for that? Is it my HDD lagging? Not enough GPU power (I doubt that)? Vsync's fault? When I already have the level of detail set up correctly so as not to drop below 60, it's small things like this that are starting to become annoying.


----------



## Agent Smith1984

Sounds good on stock voltage.... I'm picking up the same card today.....
Looking forward to playing with the clocks tonight. Looks like those coolers do a really really good job.

I'm hoping to at least hit 1200 core/1500 memory with some votlage


----------



## Klocek001

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Sounds good on stock voltage.... I'm picking up the same card today.....
> Looking forward to playing with the clocks tonight. Looks like those coolers do a really really good job.
> 
> I'm hoping to at least hit 1200 core/1500 memory with some votlage


So far 1140MHz at stock vcore has given me no artifacts; I have high hopes it'll stay that way. From my experience testing with AC4, if the clock was too high the artifacts showed up right away.

1150 on stock vcore would be sweet, 15% more performance. Taking the 512-bit bus into account, I don't really have to OC my memory for 1080p.


----------



## jagdtigger

Quote:


> Originally Posted by *jagdtigger*
> 
> I dont understand whats wrong with my monitors. The problem is that my monitor(LG IPS277L on HDMI) gives me black screens when my card is OC-ed. The two others(1xDVI-D and 1xDP with a DP->HDMI adapter) are working just fine, any ideas?


Nothing?

And I just decided that I will buy a second R9 290X next year, but before that I have some questions. Obviously my 650W PSU won't be enough, but I have no clue how big the load on the PSU will be with 2 OC'ed cards... The second question is of course the water cooling: do I need a new/second pump and one more radiator, or not?

Thanks in advance for the help.


----------



## pdasterly

Quote:


> Originally Posted by *kizwan*
> 
> Mine underwater. I didn't record it but water temps was high. I just checked my old log, when it crashed after 45 minutes in BF4, VRM1 was 79C, running at 1150/1600 with +100mV, in 30C ambient. Interestingly VRM1 was at 79C for 20 minutes before it crashed.


Something is wrong with your system; that's too hot. I have a similar setup and VRM1 never goes past 60°C at full load at 1200/1600.
Maybe some Fujipoly Ultra Extreme is needed.


----------



## PillarOfAutumn

I'm looking to buy a second reference R9-290 to add to my watercooled PC. I was about to pull the trigger on this one:

www.newegg.com/Product/Product.aspx?Item=N82E16814150697

but I realized it wasn't listed on cooling configurator (only the 290x variant is).

So instead, I may have to shell out some extra cash for this model which IS listed on cooling configurator

www.newegg.com/Product/Product.aspx?Item=N82E16814150701

My questions:

1. I know XFX is notorious for silently changing PCBs. I was wondering if the black edition will be alright?

2. Does anyone know of a cheaper option? I would really like to get a reference PCB and hopefully one with a reference cooler to save some money.

3. I've found better deals on the 290x. I was thinking of getting a 290x and crossfiring it with my R9-290 but my limiting factor is my PSU. I have an 860 watt platinum Seasonic PSU. Will this be able to support a 290x +290 crossfire? I was thinking this will be too close to the edge and for this reason I should get a 290.

Any guidance will be appreciated.


----------



## kizwan

Quote:


> Originally Posted by *pdasterly*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Mine underwater. I didn't record it but water temps was high. I just checked my old log, when it crashed after 45 minutes in BF4, VRM1 was 79C, running at 1150/1600 with +100mV, in 30C ambient. Interestingly VRM1 was at 79C for 20 minutes before it crashed.
> 
> 
> 
> Something is wrong with your system, too hot. I have similar setup, vrm1 never goes past 60c at full load 1200/1600
> maybe some fuijpoly ultra extreme is needed
Click to expand...

I forgot to mention this: it was pre-Fujipoly. It's the only GPU-Z log I can find. Post-Fujipoly, VRM1 temps dropped a lot, but even with 360 + 240 + 120 rads the water temp is still high when overclocking. I need more rads.


----------



## aaroc

Quote:


> Originally Posted by *Jflisk*
> 
> I have 3x R9290x I see about 58C max across the board under water and playing Titanfall for 3-4hrs straight. That's with 2 D5s 1x 240 Monsta Rad 1x120 and 1x240.The only thing I don't like about this system is it idles at 44C across the board.
> 
> So that's FX9590 Processor 225W at idle there about and I think this is where my high idle temp comes from.
> 3 X R9 290X Not sure what it idles at in watts but I know what it hits under load. I have seen the whole system with a WEMO basically a electronic kilowatt. Hit 1130 Watts full load.


The F2004 PC in my signature draws 120-150W at idle with the FX 9590 and R9 290X. Values are from Corsair Link on a Corsair AX1200i. I will post more data when I test with more GPUs.


----------



## Klocek001

Should I even try to OC my memory at 1080p? Will it make any difference going from 1300 to 1400 or 1500?
So far I've settled my card at 1140 on stock vcore, but 1140 gives me 1 fps less than 1100 even though there are no artifacts (weird).
+62mV was not enough for 1200; it seems +81mV does the job. I'm going to try something in between, like +70mV, but later.


----------



## Agent Smith1984

I am t-minus 2 hours and 30 minutes from joining this club. HELLO 290 Tri-X OC!!!

$180 on CL


----------



## tsm106

Quote:


> Originally Posted by *jagdtigger*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jagdtigger*
> 
> I dont understand whats wrong with my monitors. The problem is that my monitor(LG IPS277L on HDMI) gives me black screens when my card is OC-ed. The two others(1xDVI-D and 1xDP with a DP->HDMI adapter) are working just fine, any ideas?
> 
> 
> 
> Nothing?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And i just decided that i will buy a second r9 290x in next year, but before that i have some questions. Obviously my 650W PSU wont be enough, but i have no clue big will be the load on the PSU with 2 OC-ed card... The second is of course the water cooling. Do i need a new/second pump and one more radiator or not?
> 
> Thanks in advance for the help
> 
> 
> 
> 
> 
> 
> 
> .
Click to expand...

Try a higher-quality HDMI cable. Generally, the HDMI port is the most resistant to blackscreens.

For two 290X's, watercooled and OC'ed, with your CPU, I'd recommend at least a 1kW PSU. Something like an EVGA G2 1000W, but the 1300W is only about 20 bucks more, so personally I would splurge the extra 20 and never worry about power problems again. A single DDC can handle the restriction; there are other reasons for getting a newer pump, but as far as restriction goes you will be fine. Btw, make sure you have radiator surface area matched to the amount of wattage you will be drawing. That's the safest road to great overclocks.
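The PSU recommendation above can be reduced to back-of-envelope arithmetic: sum the component draws and add headroom so the PSU isn't running flat out. All the wattage figures below are rough assumptions for illustration, not measured numbers:

```python
# Back-of-envelope PSU sizing for two overclocked 290X cards plus CPU.
# gpu_w=350 assumes a heavily OC'ed 290X; cpu_w=200 an OC'ed CPU;
# rest_w covers drives, fans, pump. All values are illustrative guesses.

def system_draw(gpu_w, gpu_count, cpu_w, rest_w=100):
    """Estimated peak system load in watts."""
    return gpu_w * gpu_count + cpu_w + rest_w

load = system_draw(gpu_w=350, gpu_count=2, cpu_w=200)
psu_rating = load / 0.8  # keep the load near 80% of the PSU's rating
print(f"~{load} W estimated load -> ~{psu_rating:.0f} W PSU recommended")
```

Under these assumptions the estimate lands around the 1kW+ figure suggested in the post, which is why a 1000-1300W unit is the comfortable choice over a 650W PSU.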


----------



## jagdtigger

Quote:


> Originally Posted by *tsm106*
> 
> Try a higher ql hdmi cable. Generally, the hdmi port is most resistant to blackscreens.


I don't think it's the cable; it's fully shielded and has two separate ferrite cores...
http://www.hama.hu/go/reszlet/5712/35/TV_HIFI_Hazimozi-Kabel-HDMI_DVI_HDD/83056/High_Speed_HDMI_kabel_ethernettel_es_ferrit_szurovel_1_5_m/


----------



## tsm106

Shrugs, lol. What are you going to do then? HDMI is the port to use to combat black screens, and if you are getting them, either your port is bad or your monitor is. The easiest thing to do is swap cables and pray for rain, because the alternative is not pretty.


----------



## Jflisk

Quote:


> Originally Posted by *aaroc*
> 
> The F2004 PC of my Signature is using input 120-150W on idle with FX 9590 and R9 290X. Values from Corsair Link on a Corsair AX1200i. I will post more data when I test with more GPUs.


You know what, I had to check the WeMo after you said that. Mine idles at 207 W and hits 1130 W at max. Thanks.


----------



## jagdtigger

@tsm106
It seems the damned HDMI port doesn't like the OC :/ ... I swapped connectors between the TV and the problematic monitor (monitor to the DP->HDMI adapter, TV to the HDMI port) and now the TV produces the same symptoms. And only when the card is OC'd; at stock everything is fine...


----------



## tsm106

Quote:


> Originally Posted by *jagdtigger*
> 
> @tsm106
> It seems the damned HDMI port dont like the OC :/ ... The TV and the problematic monitor changed connectors(monitor to the DP->HDMI adapter and TV to the HDMI port) and now the TV producing the same symptoms. And only if the card is OC-ed. On stock everything is fine...


Try a dvi to hdmi adapter?


----------



## Forceman

Quote:


> Originally Posted by *Klocek001*
> 
> Should I even try to OC my memory at 1080p ? Will it give me any difference going from 1300 to 1400 or 1500?
> So far I established my card clocks to 1140 on stock vcore, but 1140 gives me 1 fps less than 1100 even though there are no artifacts (weird).
> +62mV was not enough for 1200, seems like +81mV does the job, I'm gonna try sth in between like + 70mV but later.


It doesn't provide much of a boost; the core is definitely the better thing to push. You might see slightly better scores in Firestrike or something, but games won't show much difference.


----------



## DividebyZERO

Can anyone confirm a finding I've run into? I have AMD graphics cards, and I seem to have vsync (no frame tearing) on the desktop. When running games in windowed mode it's perfect: no tearing or stuttering. My in-game meters show 100+ fps as well.
My global settings in CCC are not set for vsync.


----------



## aaroc

Quote:


> Originally Posted by *Jflisk*
> 
> You know what had to check the wemo after you said that mine idles at 207Watts and at max hits 1130 Watts. Thanks


Here a screenshot, W8.1 update1 only running desktop and the applications in the SC.


----------



## DividebyZERO

Disregard the Win10 post I made; apparently no one has ever had screen tearing on the desktop. I am apparently the only person in the whole wide world to get it. Carry on.


----------



## Agent Smith1984

Well, I am officially in the club!! Just picked up my Sapphire Tri-X 290 OC....

BAD NEWS IS..... I have no clue what I am doing with the overclock... everything I just learned over the last 4 months with my 280x is out the door.

GPU-Z says this card has 80.2% ASIC quality... not sure if that means anything, since my last card was a 64.5 and it clocked to the numbers you see in my sig (gotta change those when I find my clocks for this thing)

I'll google some info before I ask a million questions..... My initial Firestrike was only 9k, so I definitely have some work to do.


----------



## Arizonian

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, I am officially in the club!! Just picked my Sapphire Tri-x 290 OC....
> 
> BAD NEWS IS..... I have no clue what I am doing with the overclock... everything I just learned over the last 4 months with my 280x is out the door.
> 
> GPU-Z says this card has 80.2 ASIC quality... not sure if that means anything since my last card was a 64.5 and it clocked to the numbers you see in my sig (gotta change when I find my clocks for this thing
> 
> 
> 
> 
> 
> 
> 
> )
> 
> I'll google some info before I ask a million questions..... My inital Firestrike was only 9k, so I definitely have some work to do.


Nice.

Learning a new GPU's clocks is part of the fun. I'd love to add you to the roster. Post a GPU-Z link with your OCN name, or a screenshot of the GPU-Z validation tab open with your OCN name displayed.


----------



## Agent Smith1984

Here we go!!! WEEEEE

$160 guys..... even if it won't OC all that well..... that's a great price.

Edit: Nevermind... this card probably clocks great.....

MY PSU IS HITTING 11.7 on the 12v rail, and the card is throttling all the way down to 950MHz....

I already did a 9500 Firestrike run with the card throttling to 950 and the memory at 1500.... 11,600 graphics score...

I am in desperate need of a power supply at this point; I can't believe it made such a big impact. My 280x at 1.3 V / 1250 MHz ran perfectly on this PSU.


----------



## Arizonian

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Here we go!!! WEEEEE
> 
> $160 guys..... even if it won't OC all that well..... that's a great price.
> 
> Edit: Nevermind... this card probably clocks great.....
> 
> MY PSU IS HITTING 11.7 on the 12v rail, and the card is throttling all the way down to 950MHz....
> 
> I already did a 9500 Firestrike run with the card throttling to 950 and the memory at 1500.... 11,600 graphics score...
> 
> I am in desperate need of a power supply at this power, I can't believe it made such a big impact. My 280x at 1.3v 1250MHz ran perfect on this PSU.


Congrats - added


----------



## amptechnow

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 
> 
> Here we go!!! WEEEEE
> 
> $160 guys..... even if it won't OC all that well..... that's a great price.
> 
> Edit: Nevermind... this card probably clocks great.....
> 
> MY PSU IS HITTING 11.7 on the 12v rail, and the card is throttling all the way down to 950MHz....
> 
> I already did a 9500 Firestrike run with the card throttling to 950 and the memory at 1500.... 11,600 graphics score...
> 
> I am in desperate need of a power supply at this power, I can't believe it made such a big impact. My 280x at 1.3v 1250MHz ran perfect on this PSU.


When I use Afterburner I have to set the power limit to +50% in CCC or my cards downclock. Even though I set the power limit in Afterburner, it doesn't take for some reason. Set your OC and then check CCC to see if the power limits did go up; if not, try raising them in CCC and watch your score go way up.


----------



## Agent Smith1984

HOLY CRAP!
That is exactly what it was..... how do I get the 50% to stick? It goes away (and I get a black screen for 2 or 3 seconds) every time I increase clocks, and then I have to go back into CCC and reset the 50% power limit...
Really weird stuff....


----------



## Klocek001

With an air-cooled 290 I really suggest TriXX instead of AB, and setting a custom fan curve of % per degree, like 65 degrees = 65% fan. It really brings the temps down and the card stays quiet most of the time. My card shows 11.75 on the 12 V rail but doesn't seem to clock down; it stays between 960-999 MHz in GR:FS (1000 MHz stock).
Was your card second-hand? I remember mine clocked down before it died of an angry capacitor.....
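The "percent equals degrees" curve described above is just linear interpolation between a few set points. A rough sketch (the curve points other than 65 °C → 65% are made up for illustration):

```python
# Illustrative fan curve: (temperature in C, fan duty in %). Only the 65/65
# point comes from the post; the rest are assumed for the example.
CURVE = [(40, 30), (65, 65), (80, 90), (94, 100)]

def fan_percent(temp_c):
    """Linearly interpolate fan duty between the surrounding curve points."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # pin at max above the last point

print(fan_percent(65))  # -> 65.0
```

TriXX and Afterburner both do essentially this internally; the custom curve just lets you pick the points.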


----------



## xer0h0ur

Lol if I leave my settings and raise power limit I lose over 500 points overall score. Wat


----------



## amptechnow

Quote:


> Originally Posted by *Agent Smith1984*
> 
> HOLY CRAP!
> That is exactly what it was..... how do I get the 50% to stick? It goes away and I get a black screen for 2 or 3 seconds every time I increase clocks, and then I have to go back into CCC and reset the 50% power limit...
> really weird stuff....


I have not found a way to make it stick with Afterburner. Every time you change clocks you will have to reset it in CCC, and be sure to set each card if you have more than one.

You can use TriXX and keep the power limit, but different systems work better with different programs. Afterburner is most stable for me, but I can get bigger overclocks with TriXX.

And the quick black screen when you change clocks is fine; it's just the display resetting.


----------



## amptechnow

Quote:


> Originally Posted by *Klocek001*
> 
> With air cooled 290 I really suggest TriXX instead of AB and setting the custom fan control % for degrees - like 65 degrees - 65% fan, it really brings the temps down and the cards stays quiet for most of the time. My cards shows 11.75 on 12V but doesn't seems to clock down, stays between 960-999 in GR:FS (1000MHz stock).
> Was your card 2nd hand ? I remember mine clocked down before it died of an angry capacitor.....


It's just a bug with Afterburner not keeping the power limit set. I have the same issue; I have to set the power limit for each card in CCC. But like you, I am fine if I use TriXX, because TriXX actually sets the power limit. He confirmed what I said was right, so I doubt it's a bad card; don't scare the guy, lol. Sorry to hear about your card, though. If one of my 290s died it would be such a sad day. My cards have done this since day one and they are both new; it took me so long to figure out, and I couldn't for the life of me understand why they were downclocking so much and getting bad scores.

AB has fan profiles too. Both programs are good; they just work better for some people, and AB requires the extra step in CCC. I'd say use whatever works best for you, or whichever one you like more.


----------



## Agent Smith1984

I had to settle for 1080 MHz at +50 mV on the core last night.... anything higher and it throttles from not having enough power. I had no idea the power requirements would change so much with this card.

I was also able to get the memory to run artifact-free in 3D benchmarks at 1600 MHz last night!!!
The problem is, once I close the 3D application, I get a black screen.....
I have read a lot of bad stuff about the black screen thing with these cards... what gives?

It may be a power issue too.....

Vcore should be 1.21 V with +50 mV; it drops down to 1.009 V under load...
The 12 V rail idles at 12 flat and bottoms out at 11.63 under load, which is crazy, because with my highly clocked 280x it idled at 12.25 and hit 11.96 under load.

Time for a new PSU. Age (8 years old) and a dated amperage distribution scheme across the rails make this old 700 W unit less effective than a 500-600 W would be today. This 290 is a total power hog!


----------



## bond32

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I had to settle for 1080 +50mv on the core last night.... anything higher and it throttles from not having enough power. I had no idea the power requirements would change so much for this card.
> 
> I was also able to get the memory to run artifact free in 3d benchmarks at 1600MHz last night!!!
> The problem is, once I close the 3d application, I get a black screen.....
> I have read a lot of bad stuff about the black screen thing with these cards... what gives?


The black screen comes from the memory controller not getting adequate voltage. You can try the PT1 benching BIOS if you can keep the card cooled, and it might help. Your throttling issue is likely thermal: when the card hits 94°C it will throttle. When you bench, put the fans at 80-100% and see if it still throttles. I doubt your PSU is to blame.


----------



## amptechnow

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I had to settle for 1080 +50mv on the core last night.... anything higher and it throttles from not having enough power. I had no idea the power requirements would change so much for this card.
> 
> I was also able to get the memory to run artifact free in 3d benchmarks at 1600MHz last night!!!
> The problem is, once I close the 3d application, I get a black screen.....
> I have read a lot of bad stuff about the black screen thing with these cards... what gives?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It may be a power issue too.....
> 
> vcore should be 1.21 with +50mv, it drops down to 1.009 under load...
> 12v rail is idling at 12 flat, and bottoming out at 11.63 under load, which is crazy because with my highly clocked 280x it idled at 12.25 and hit 11.96 under load.
> 
> Time for a new PSU. Age (8 years old), and dated amperage distribution scheme for the rails make this old 700w less effective than what a 500-600w would be today. This 290 is a total power hog!


What PSU are you using? Did you check its specs and compare them to the requirements for a 290? Check the amps; watts aren't as important if you have a quality PSU. I had a single Corsair RM750 pushing two OC'd 290s and my 3770K @ 4.7. I now run dual RM750s; they are silent and the fan doesn't even turn on most of the time. Very, very pleased with them.

Also, like bond32 said, turn those fans up and check your temps. What temps were you reaching? Also, what drivers are you using? Which GPU BIOS?

When 290s first came out I was getting black screens once in a while. Driver and BIOS updates have fixed it and I haven't had one in a super long time. Raising the power limit, volts, and aux power in Afterburner can help too.


----------



## Klocek001

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Agent Smith1984*
> 
> I had to settle for 1080 +50mv on the core last night.... anything higher and it throttles from not having enough power. I had no idea the power requirements would change so much for this card.
> 
> I was also able to get the memory to run artifact free in 3d benchmarks at 1600MHz last night!!!
> The problem is, once I close the 3d application, I get a black screen.....
> I have read a lot of bad stuff about the black screen thing with these cards... what gives?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It may be a power issue too.....
> 
> vcore should be 1.21 with +50mv, it drops down to 1.009 under load...
> 12v rail is idling at 12 flat, and bottoming out at 11.63 under load, which is crazy because with my highly clocked 280x it idled at 12.25 and hit 11.96 under load.
> 
> Time for a new PSU. Age (8 years old), and dated amperage distribution scheme for the rails make this old 700w less effective than what a 500-600w would be today. This 290 is a total power hog!






Are you sure your card's previous owner was a gamer? I bought mine for a very low price from a miner; it worked for 2 weeks or something like that, but I had clock-drop problems as well (sometimes it would drop to 800 MHz, causing terrible stutter). I RMA'd it and the store sent me a message that I had bought a complete wreck; there were damaged capacitors and resistors, but they sent it to Sapphire Germany anyway and I received a new one. Just saying the vdroop might be your card's fault, not the PSU.


----------



## Agent Smith1984

The guy I got it from just replaced it with a GTX 970....
He was definitely a gamer, but he bought the card used 6 months ago, and I believe he probably got it from a miner....

Everyone did see the part where I said my PSU is over 8 years old right??

It's got active PFC and 4 rails...

This is a problem because one rail feeds the primary CPU 4-pin, another the motherboard/peripherals, the third the first PCI-E connector, and the fourth is shared between the second PCI-E connector and the second CPU 4-pin.... that last part is the biggest problem.

My CPU and this video card trying to share one rail is a big problem, and it's causing the 12 V rail to drop way below acceptable output.

Here is my power supply:

http://www.extremeoverclocking.com/reviews/cases/OCZ_GameXStream_700W_1.html

Notice the 2006 review date!!! LMAO









Edit:

Just so everyone knows why the Tri-X is prone to throttling and downclocking.....
It's a reference design, and it DOES NOT use PowerPlay support. I did a lot of reading about it this morning.
I am 95% sure so far that this card is just fine and just needs some appropriate power sent to it.
It will run FireStrike at 1120/1500 with no problems, and it scores over 12k (graphics).....
Mind you, that is with some throttling....

I looked at the GPU-Z sensor log and can see places where three things all happen at the same time:

1) The core clock drops 20-50 MHz
2) The vddc drops 0.1+ volts!!!!
3) The 12 V rail drops from 11.75 to 11.63

All three happen at the exact same time, and if I drop the core back to stock 1000 with the 50% power limit, none of them happens.

It is a classic case of "needs a better power supply" if I have ever seen one.....
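The shared-rail situation described above can be sketched numerically. The 18 A per-rail limit and all the wattages below are illustrative guesses, not the GameXStream's actual ratings:

```python
# Per-rail load check for a hypothetical quad-rail 12 V PSU. The point: each
# rail may be fine on its own, but a GPU and CPU sharing one rail can exceed
# its individual amp limit even when total wattage is within spec.
RAIL_LIMIT_A = 18  # assumed per-rail limit
rails = {
    "rail1 (CPU 4-pin)":            [90],
    "rail2 (board/peripherals)":    [120],
    "rail3 (PCIe #1)":              [150],
    "rail4 (PCIe #2 + CPU 4-pin)":  [150, 90],  # shared load
}

for name, loads_w in rails.items():
    amps = sum(loads_w) / 12.0  # I = P / V
    flag = "OVERLOADED" if amps > RAIL_LIMIT_A else "ok"
    print(f"{name}: {amps:.1f} A ({flag})")
```

With these assumed numbers only the shared rail trips the limit, which is consistent with the 12 V droop showing up only when both the CPU and the 290 pull hard at once.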


----------



## Klocek001

Find somebody with a fairly new, decent PSU, plug your card in, and just check the AB log.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bond32*
> 
> The black screen comes from the memory controller not getting adequate voltage. You can try the PT1 benching bios if you can keep it cooled and it might help. Your throttling issue is likely due to keeping it cooled. When the card hits 94 C it will throttle. When you bench put the fans at 80-100% and see if it still throttles. I doubt your psu is to blame.


The card does not break 73°C core / 80°C VRM with 60% fan......
Mind you, that was with +100 mV and 1120 core.... though it did throttle here and there, it was not a temp thing...
Truly an amazing cooler on these cards...


----------



## PillarOfAutumn

Hey guys... I just had a buy-first, ask-questions-later thing happen right now. I bought the Sapphire Tri-X 290. This is the non-reference cooler on the reference PCB, the one that can be watercooled with EK's block. I bought it from eBay; the seller said he had returned his old reference Sapphire card for warranty and Sapphire replaced it with this one. He just got it, so it's kind of "brand new" or "factory refurbished." I paid $220 for it, including shipping. Is this a good or at least okay deal?

Also, I have a reference 290, the one that came with the 947 MHz clock speed, whereas this one is 1000. Does it matter where these go on the mobo (as in first slot or second slot)?


----------



## Agent Smith1984

Refurb for $220 is a so-so price...

Remember, that exact card is selling for $250 new after rebate, with $120 in free games....

I bought the same card on CL yesterday for $160, obviously not a replacement card, but still functioning perfectly (or at least it appears to be after testing it all night).

You still got a good score at a fair price either way.... and the coolers are so great.
It's the overclocking that seems to be the most complicated thing about these cards....

As far as slot placement, it shouldn't matter.
I actually have to run my card in my second slot because my primary slot got damaged during a toddler-related tower tip


----------



## PillarOfAutumn

This is the cheapest that I could find: http://www.superbiiz.com/detail.php?name=AT-290TR4G

I was about to buy this one and try to sell off the 4 games for $60, but I figured that may be too much work on my part and even doing the rebate would be a hassle because I'm lazy.

Well, in any case, if the card works and is fully functional, I have no regrets. I just want to put this under water and finish upgrading this PC forever, haha.


----------



## Agent Smith1984

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> This is the cheapest that I could find: http://www.superbiiz.com/detail.php?name=AT-290TR4G
> 
> I was about to buy this one and try to sell off the 4 games for $60, but I figured that may be too much work on my part and even doing the rebate would be a hassle because I'm lazy.
> 
> Well, in any case, if the card works and is fully functional, I have no regrets. I just want to put this under water and finish upgrading this PC forever, haha.


Glad you like it... I haven't finished toying with mine for daily use yet.....
My PSU is about 80 years old in tech years, so it is not liking its new job of powering this card at all.

Going from the 280x to this 290 was like switching from WalMart door greeter to hanging drywall.


----------



## jagdtigger

Quote:


> Originally Posted by *tsm106*
> 
> Try a dvi to hdmi adapter?


No effect; I just give up... 3DMark runs fine on the monitor, but the TV acts "funny" during the benchmark:
http://dl.dropboxusercontent.com/u/1201829/OCN/20141112_172714.mp4
(open it as a URL in WMPlayer)


----------



## tsm106

Quote:


> Originally Posted by *jagdtigger*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Try a dvi to hdmi adapter?
> 
> 
> 
> No effect, i just give up... 3DMARK runs fine on the monitor but the TV acting "funny" during the benchmark:
> http://dl.dropboxusercontent.com/u/1201829/OCN/20141112_172714.mp4
> (open it as url in WMPlayer)

You know... if both the TV and the panel do it and you used the same cable, then obviously it could still be the port or the cable.


----------



## jagdtigger

Quote:


> Originally Posted by *tsm106*
> 
> You know... if both tv and panel do it and you used the same cable, then the obviously it still could be the port or the cable.


I used a different cable for the TV, so the culprit is the port. Well, I guess I must get used to it....


----------



## bustacap22

Installed dual reference 290Xs (Sapphire) last night. Yes, so far I'm impressed with the results coming from dual 7970s running at 1440p. I was already aware of the heat that dual 290Xs would bring to the table. The cards are at stock as I am waiting to order blocks for them. My prior setup with dual 7970s achieved fantastic temps to my liking: both averaged 58-61°C at full load. I currently have an Alphacool XT45 420 rad in push/pull and an Alphacool UT60 240 rad in push/pull, which achieved temps to my liking. With dual 290Xs that will be OC'd and a 3930K OC'd @ 4.6 GHz, is this rad setup enough to achieve good temps? If not, what would I need? Thanks.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Glad you like it... haven't finished toying with mine for daily use yet.....
> My PSU is about 80 years old in tech years, so it is not liking it's new job of powering this card at all.
> 
> 280x to this 290 was like switching from WalMart door greater, to hanging drywall.


lol! I already own one 290 but was running it from a Seasonic X750 Gold. My new 860 W Seasonic Platinum PSU just came in yesterday, so hopefully that should have no problem running two 290s at 1000 MHz core.


----------



## taem

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> I bought the Sapphire Tri-X 290. This is the one non reference cooler on the reference PCB, the one that can be watercooled with EK's block.


XFX DD Black 290 and XFX DD 290x are both listed as compatible with the EK reference 290x block. And if those two are compatible, the DD 290 and DD Black 290x should both be also, but those two are not listed. http://www.coolingconfigurator.com/waterblock/3831109868539

If anyone knows more about XFX DD 290/x compatibility with reference blocks, please post, because I'm considering picking one up, as the 290 DD can be had for ~$250 now, making used $200-220 less attractive.

Powercolor PCS+ 290 is also listed, but Powercolor revised the pcb. The first version, R29F, is reference, the newer version, R29FA, is not.

In any event -- afaik EK has a block for every single 290x out there except the HIS IceQ. They even made a block for the revised Powercolor PCS+, which is the same as the Club3D Royal Ace. Other than the IceQ, what card *can't* you put an EK block on? I'm actually a bit wary of EK gpu blocks simply because they make one for every card. I like the idea of a firm making just the one block and putting all their r&d into that, and not facing any manufacturing cost issues in terms of block design when you're going to make 10 different ones for the same gpu series. Not that EK blocks suffer in terms of performance, they all seem to do just fine. So I'm probably just way off base here. But it sort of makes sense in theory maybe.


----------



## xer0h0ur

So you're displeased with a company that makes versions of blocks that fit different cards which aren't identical? How does that even make sense in your head much less out loud.


----------



## taem

Quote:


> Originally Posted by *xer0h0ur*
> 
> So you're displeased with a company that makes versions of blocks that fit different cards which aren't identical? How does that even make sense in your head much less out loud.


Displeased is a strong word. I was just musing out loud about whether, when you make that many variations on a block, the economics of the manufacturing process might not call for slight modifications rather than a new design that fully reflects that pcb layout. Again I'm just musing out loud, not arguing that this has basis in fact. But take for example Aquacomputer. They only make the one reference block. And they're so confident of their engineering tolerances that they don't even use pads on the vram.


----------



## Jflisk

Quote:


> Originally Posted by *taem*
> 
> XFX DD Black 290 and XFX DD 290x are both listed as compatible with the EK reference 290x block. And if those two are compatible, the DD 290 and DD Black 290x should both be also, but those two are not listed. http://www.coolingconfigurator.com/waterblock/3831109868539
> 
> If anyone knows more about XFX DD 290/x compatibility with reference blocks, please post, because I'm considering picking one up, as the 290 DD can be had for ~$250 now, making used $200-220 less attractive.
> 
> Powercolor PCS+ 290 is also listed, but Powercolor revised the pcb. The first version, R29F, is reference, the newer version, R29FA, is not.
> 
> In any event -- afaik EK has a block for every single 290x out there except the HIS IceQ. They even made a block for the revised Powercolor PCS+, which is the same as the Club3D Royal Ace. Other than the IceQ, what card *can't* you put an EK block on? I'm actually a bit wary of EK gpu blocks simply because they make one for every card. I like the idea of a firm making just the one block and putting all their r&d into that, and not facing any manufacturing cost issues in terms of block design when you're going to make 10 different ones for the same gpu series. Not that EK blocks suffer in terms of performance, they all seem to do just fine. So I'm probably just way off base here. But it sort of makes sense in theory maybe.


You might want to have a look here; it looks like there is no difference other than the cooler. I own the XFX DD R9 290X and it uses reference blocks. While you're picking up the block, pick up the back plate too; it helps.

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/65741-xfx-r9-290-double-dissipation-review.html


----------



## taem

Quote:


> Originally Posted by *Jflisk*
> 
> You might want to have a look here looks like there is no difference other then the cooler. I own the XFX DD R290X and it uses reference blocks and while you picking up the block pick up the back plate it helps.
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/65741-xfx-r9-290-double-dissipation-review.html


It says this
Quote:


> XFX does use their own PCB design but from what we can gather, there aren't any upgraded components. If anything, the layout has been simplified. We can also see that the heatsink's length has necessitated the inclusion of a small "lip" at the PCB's outer edge in order to maintain a fluid looking design.


"Layout has been simplified" -- what does that mean for block compatibility? The Black DD 290 is listed, but the DD 290 spoken of here is not.


----------



## xer0h0ur

Quote:


> Originally Posted by *taem*
> 
> Displeased is a strong word. I was just musing out loud about whether, when you make that many variations on a block, the economics of the manufacturing process might not call for slight modifications rather than a new design that fully reflects that pcb layout. Again I'm just musing out loud, not arguing that this has basis in fact. But take for example Aquacomputer. They only make the one reference block. And they're so confident of their engineering tolerances that they don't even use pads on the vram.


With current technology like 3D scanners, CNC machines, waterjet machines and the like, it's hardly difficult at all to crank out revisions. All you need is the PCB on hand; that is your cost. I am also wary of having no thermal pads between block and vRAM. My cards are using Fujipoly Ultra Extreme pads and can overclock the vRAM well past 1700 MHz before artifacting, though I don't, since my cooling is inadequate right now for heavy overclocks.


----------



## Jflisk

Quote:


> Originally Posted by *taem*
> 
> It says this
> "Layout has been simplified" -- what does that mean for block compatibility? The Black DD 290 is listed but the DD290 spoken of here is not.


This is odd. The R9 290X DD referenced below is the one I have, with a standard reference waterblock, but it says non-reference PCB. The only difference I saw was the DD fan and a bracket they affix to the board first (removable).

"XFX rolled out its first non-reference design Radeon R9 290 series graphics cards, the Radeon R9 290X Double Dissipation (model: R9-290X-EDFD), and the R9 290 Double Dissipation (model: R9-290A-EDFD). The two are based on a common board design, with a non-reference design PCB by the company."

Referenced here:
http://www.techpowerup.com/196072/xfx-rolls-out-radeon-r9-290-series-double-dissipation-cards.html


----------



## taem

Quote:


> Originally Posted by *xer0h0ur*
> 
> With current technology like 3D scanners, CNC machines, waterjet machines and the like, its hardly difficult at all to crank out revisions. All you need is to have the PCB on hand. That is your cost. I am also weary of having no thermal pads between block and vRAM. My cards are using Fujipoly Ultra Extreme pads and can overclock vRAM well past 1700MHz before artifacting, though I don't since my cooling is inadequate right now for heavy overclocks.


Designing revisions would be easy, but are the metal blocks themselves made in house? I assume they'd be sent to a manufacturing plant, and revisions would incur an up-front cost to alter the manufacturing process. I wonder if cost increases the more you change the block, and whether planning a lot of variations would lead to design choices that make such minor changes easier and cheaper. That was the thought that popped into my head; again, just musing out loud, I'm no expert by any means on engineering or manufacturing. And I'm not bashing EK at all, btw; EK would be my second choice after Aquacomputer.

But here's a question about thermodynamics. Would thermal pads, even the top-of-the-line Fuji, *ever* be better than paste-based TIM? My understanding is no. You use pads on memory and VRMs to ensure contact, not because they dissipate heat better. If pads could ever be better, you'd see them used on GPU cores and CPUs. So it's a question of engineering tolerances: ensure contact on the core while using pads to cover the other parts like VRMs and memory. Aquacomputer is staking a claim to superior tolerances with their pad-less memory cooling. Since there isn't a vRAM temp monitor for the 290/X, I have no way to check whether this design is working, but I like the in-your-face arrogance of that design choice. It's so German.


----------



## Sgt Bilko

Quote:


> Originally Posted by *taem*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jflisk*
> 
> You might want to have a look here looks like there is no difference other then the cooler. I own the XFX DD R290X and it uses reference blocks and while you picking up the block pick up the back plate it helps.
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/65741-xfx-r9-290-double-dissipation-review.html
> 
> 
> 
> It says this
> Quote:
> 
> 
> 
> XFX does use their own PCB design but from what we can gather, there aren't any upgraded components. If anything, the layout has been simplified. We can also see that the heatsink's length has necessitated the inclusion of a small "lip" at the PCB's outer edge in order to maintain a fluid looking design.
> 
> 
> "Layout has been simplified" -- what does that mean for block compatibility? The Black DD 290 is listed but the DD290 spoken of here is not.

There is one member in here who has 2 x XFX DD cards and they confirmed they are ref design.

I'm on mobile so I can't dig through the info, but have a look at the OP's member list for someone who has 2 XFX cards under water cooling and you should find your answer.


----------



## xer0h0ur

Fujipoly Ultra Extreme pads are rated at 17.0 W/mK while the TIM I use, Prolimatech PK-3, is rated at 11.2 W/mK. Just because the use of thermal pads on CPUs/GPUs isn't widespread doesn't mean it doesn't exist or isn't in use. An example of extremely high-quality CPU/GPU thermal pads is Coollaboratory's Liquid MetalPad, which is just a hair under the performance of the Liquid Metal and Liquid Metal Pro TIMs they make. I personally stay away from all-metal TIMs on GPUs but some are braver than I.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *xer0h0ur*
> 
> Fujipoly Ultra Extreme pads are rated at 17.0 W/mK while the TIM I use Prolimatech PK-3 is rated at 11.2 W/mK. Just because the use of thermal pads on CPU's/GPU's is not widespread by people doesn't mean it doesn't exist or isn't in use. An example of extremely high quality CPU/GPU thermal pads are Coollaboratory's Liquid MetalPad which is just a hair under the performance of the Liquid Metal and Liquid Metal Pro TIMs they make. I personally stay away from all metal TIMs on GPUs but some are braver than I.


I used Fujipoly Ultra Extreme on my GPU. If I remember correctly, my VRMs were hitting 40°C max. Since I'm redoing my system, I ordered IC 24 Carat thermal compound for the GPU and CPU blocks, mainly because it does a better job of transferring heat. Would you recommend that compound? I understand it can cause minor abrasions, but aside from that, it shouldn't fry anything, right? I've just been using the TIM that came with EK.


----------



## Jflisk

Quote:


> Originally Posted by *taem*
> 
> Designing revisions would be easy, but are the metal blocks themselves made in house? I assume they'd be sent to a manufacturing plant, and revisions would incur an up-front cost to alter the manufacturing process. And I wonder if the cost increases the more you change the block, and whether plans to make a lot of variations would lead to design choices that make such minor changes easier and cheaper. That was the thought that popped into my head. Again, just musing out loud, I'm no expert by any means on engineering or manufacturing. And I'm not bashing EK at all btw, EK would be my second choice after Aquacomputer.
> 
> But here's a question about thermodynamics. Would thermal pads, even the top-of-the-line Fuji, *ever* be better than paste-based TIM? My understanding is no. You use pads on memory and VRMs to ensure contact, not because they dissipate heat better. If pads could ever be better, you'd see them used on GPU cores and CPUs. So it's a question of engineering tolerances and the need to ensure contact on the core while using pads to cover the other parts like the VRMs and mem. And Aquacomputer is staking a claim to superior tolerances with their pad-less mem cooling. Since there isn't a VRAM temp monitor for the 290/X I have no way to check whether this design is working. But I like the in-your-face arrogance of that design choice. It's so German.


Quote:


> Originally Posted by *PillarOfAutumn*
> 
> I used Fujipoly Ultra Extreme on my GPU. If I remember correctly, my VRMs were going 40 C max. Since I'm redoing my system, I ordered IC 24 Carat thermal compound for the GPU and CPU blocks, mainly because it does a better job at transferring heat. Would you recommend that compound? I understand it can cause minor abrasions, but aside from that, it shouldn't fry anything right? I've just been using the TIM that came with EK.


You pointed out the only problem with it: small abrasive marks. I used it for a while and it does work.


----------



## taem

Quote:


> Originally Posted by *xer0h0ur*
> 
> Fujipoly Ultra Extreme pads are rated at 17.0 W/mK while the TIM I use Prolimatech PK-3 is rated at 11.2 W/mK. Just because the use of thermal pads on CPU's/GPU's is not widespread by people doesn't mean it doesn't exist or isn't in use. An example of extremely high quality CPU/GPU thermal pads are Coollaboratory's Liquid MetalPad which is just a hair under the performance of the Liquid Metal and Liquid Metal Pro TIMs they make. I personally stay away from all metal TIMs on GPUs but some are braver than I.


I don't doubt the specs you cite, but this place being what it is, if the Fuji Ultra Extreme could dissipate heat better than paste TIM, I have to think that would be widespread practice here. I read through a lot of threads here, and you have guys lapping, delidding, stressing every point along the gradient on flow rates and fan curves, but I've not seen one post yet that says Fuji Ultra Extreme or a similar top-end pad dissipates heat better than paste. I'm not arguing with you, I'm just expressing surprise. Why wouldn't guys like rdr09 or roboyto put pads on the core if they were better?


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Jflisk*
> 
> You pointed out the only problem with it: small abrasive marks. I used it for a while and it does work.


I feel like it's one of those compounds that you'll want to change as few times as possible. Maybe just once a year when you're flushing out and cleaning your loop, so as not to scratch the blocks too much. But how noticeable are these abrasions vs regular silver-based TIMs?


----------



## Jflisk

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> I feel like it's one of those compounds that you'll want to change as few times as possible. Maybe just once a year when you're flushing out and cleaning your loop, so as not to scratch the blocks too much. But how noticeable are these abrasions vs regular silver-based TIMs?


You will notice them. I never tried it on a CPU, but I could see it rubbing the serial number off. On a GPU they look like slight scratches. Your call. I can say you will notice a temp drop once it's properly applied.


----------



## xer0h0ur

Quote:


> Originally Posted by *taem*
> 
> I don't doubt the specs you cite, but this place being what it is, if the Fuji ultra extreme could dissipate heat better than paste tim I have to think that would be widespread practice here. I read through a lot of threads here, and you have guys lapping, delidding, stressing every point along the gradient on flow rates and fan curves, but I've not seen one post yet that says Fuji ultra extreme or similar top end pad dissipates heat better than paste. I'm not arguing with you, I'm just expressing surprise. Why wouldn't guys like rdr09 or roboyto put pads on the core if they were better?


I can't speak as to why someone does what they do. All I can do is provide information and let you know with absolute certainty that at least that thermal pad (Liquid MetalPad) out-performs damn near any TIM you throw at it other than other all metal TIMs.

I know it's expensive to put Fujipoly Ultra Extreme pads on everything, as I spent a good amount of coin on 0.5mm, 1mm and 1.5mm pads for my cards. I am certain cost has to factor in.


----------



## Agent Smith1984

Help:
http://www.overclock.net/t/1524337/290-tri-x-oc-throttling#post_23136429

Edit: I mean, help me... PLEASE!


----------



## taem

Quote:


> Originally Posted by *xer0h0ur*
> 
> I can't speak as to why someone does what they do. All I can do is provide information and let you know with absolute certainty that at least that thermal pad (Liquid MetalPad) out-performs damn near any TIM you throw at it other than other all metal TIMs.
> 
> I know that its expensive to put Fujipoly Ultra Extreme pads on everything as I spent a good amount of coin for 0.5mm, 1mm and 1.5mm pads for my cards. I am certain cost has to factor in.


Hmm. You know what I'm wondering? If it's a minimum thickness issue. Thinnest pad you can get is what, 0.5 mm? Paste would be way thinner than that. The spec'd heat dissipation is based on what unit? Maybe what happens is, despite the higher heat dissipation rating in the abstract, you have to use more material with pads, and this makes the lower-spec'd paste preferable. Which takes us back to the pad-less memory cooling of the kryographics. This being Aquacomputer we're talking about, I have to think there is a thermodynamic basis for that design choice; it's not a cost-saving measure. I have to think that AC believes paste on the mem rather than pads is better. Though of course, no manufacturer is thinking about including Ultra Extreme as a stock pad.

Cost would not be an issue for the amounts we're talking about in terms of OCN users though. You're already buying sheets that come in various sizes for the mem and VRMs. A small square for the core would be peanuts when you're talking about a clientele that's shelling out $180+ for a block and backplate and $20+ per 120mm fan, etc.


----------



## razaice

I have a Sapphire Vapor-X 290. What do you guys think would be the highest safe +mV setting for 24/7 gaming usage?


----------



## xer0h0ur

Quote:


> Originally Posted by *taem*
> 
> Hmm. You know what I'm wondering? If it's a minimum thickness issue. Thinnest pad you can get is what, 0.5 mm? Paste would be way thinner than that. The spec'd heat dissipation is based on what unit? Maybe what happens is, despite the higher heat dissipation rating in the abstract, you have to use more material with pads, and this makes the lower-spec'd paste preferable. Which takes us back to the pad-less memory cooling of the kryographics. This being Aquacomputer we're talking about, I have to think there is a thermodynamic basis for that design choice; it's not a cost-saving measure. I have to think that AC believes paste on the mem rather than pads is better. Though of course, no manufacturer is thinking about including Ultra Extreme as a stock pad.
> 
> Cost would not be an issue for the amounts we're talking about in terms of OCN users though. You're already buying sheets that come in various sizes for the mem and VRMs. A small square for the core would be peanuts when you're talking about a clientele that's shelling out $180+ for a block and backplate and $20+ per 120mm fan, etc.


Sure, I get that in a perfect world two perfectly flat surfaces mated together don't require TIM between them. Frankly though, perfection is rarely attained, and even if the surface of their block is milled to perfection, that would still require the vRAM's surface to be perfectly flat to match it. I may be a trusting person, but I am also a realist who knows theory rarely matches reality.


----------



## Performer81

Is this supposed to be normal or is PowerColor kidding me?
This is a 290 PCS+ with the revised PCB. The VRM cooler isn't even touching these 2 VRMs. Some guy in another forum measured the heat with a laser thermometer and they are at 93 degrees at stock, while GPU-Z showed 77! I'm afraid of pushing the voltage now, though it has survived gaming sessions with +175 anyhow.


----------



## taem

Quote:


> Originally Posted by *xer0h0ur*
> 
> Sure I get that in a perfect world two perfectly flat surfaces mated together don't require TIM between them. Frankly though perfection is rarely attained and even if the surface of their block is milled to perfection then that would still require that the vRAM's surface also be perfectly flat to match it. I may be a trusting person but I am also a realist that knows theory rarely is reality.


You do use paste on the mem modules with the kryographics, just not pads. Absent soldering I don't think you can mate two discrete pieces of metal without some sort of thermal interface.


----------



## xer0h0ur

Oh I completely misunderstood you. Didn't realize it was TIM on the vRAM. As for two perfectly flat metal surfaces, they will stick together. I forgot what they call that.


----------



## Agent Smith1984

Update on my new 290 (still learning this thing, and afterburner was giving me tons of problems!!!)

Just switched to trixx and got 1100/1500 with +80 offset, 50% power limit, 72c max temp with 50% fan.....

No throttling whatsoever.

Now here is the real kicker....

My 12v rail is at 11.63 during all this, and actual VDDC is reporting between 1.145 and 1.19..... completely stable and artifact free....
That tells me that the +80 isn't doing anything for me but giving the PSU a higher point to drop from so it can fall within a stable voltage for the overclock.
If I leave the voltage alone, it artifacts and falls to around .99???

Can't push any further until I ditch this crapped out PSU.

I am looking forward to hitting somewhere in the ballpark of 1200/1600 with this card once I get a nice PSU......

$160 CL find, with 80.2 ASIC quality and runs cool as a breeze.... I say WIN!


----------



## chiknnwatrmln

Quote:


> Originally Posted by *taem*
> 
> Hmm. You know what I'm wondering? If it's a minimum thickness issue. Thinnest pad you can get is what 0.5 mm? Paste would be way thinner than that. The spec'd heat dissipation is based on what unit? Maybe what happens is, despite the higher heat dissip rating in the abstract, you have to use more units with pads and this makes the lower spec'd paste preferable. Which takes us back to the pad-less memory cooling of the kryographics. This being Aquacomputer were talking about, I have to think there is a thermodynamic basis for that design choice, it's not a cost saving measure. I have to think that AC believes paste on the mem rather than pads is better. Though of course, no manufacturer is thinking about including ultra extreme as stock pad.
> 
> Cost would not be an issue for the amount were talking about in terms of OCN users though. You're already buying sheets than come in various sizes for the mem and vrms. A small square for the core would be peanuts, when you're talking about a clientele that's shelling out $180+ for block and backplate and $20+ per 120mm fan etc etc.


I would imagine this is the reasoning. The unit used for thermal pad/TIM conductivity is W/m·K. Just by looking at the units you can tell that if you have a thicker thermal interface (a .5mm thermal pad vs, say, a .1mm-thick layer of TIM paste) then you will get less efficient heat transfer than with the thinner layer, assuming their thermal conductivities are similar.

Furthermore, TIM paste is not meant to be a layer between the die and heatsink. All it is meant to do is fill in the microscopic cracks and pits between the two surfaces, as air is an insulator. This is why you need to avoid bubbles in your TIM applications. Compared to the metal and the die silicon (copper's thermal conductivity is 401 W/m·K), thermal pads/paste are actually not very good conductors of heat. Off the top of my head some Fuji pads are what, 17 W/m·K? And I know TIM paste is usually in that area or lower, so compared to the metals used it's only a fraction as conductive.

If you look at things in an ideal scenario, almost all the TIM paste that is not filling in the microscopic cracks is pushed out from between the die and heatsink. This would leave a very, very thin layer (no idea how much, but I'd be willing to bet <.1mm)... now compare this to your five-times-thicker .5mm thermal pad, and even if the pad is rated slightly better for thermal conductivity, the TIM paste would clearly be the better choice. There is a reason that putting pads on dies is not a thing. This is just my guess.
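The thickness argument above can be put in rough numbers. A minimal sketch, assuming illustrative bond-line thicknesses (a 0.5 mm pad vs a hypothetical 0.1 mm paste layer; real paste layers vary) and the conductivity ratings mentioned in the thread, comparing thermal resistance per unit area, R = t / k:

```python
# Thermal resistance per unit area of a flat interface layer: R = t / k.
# Thicknesses here are illustrative assumptions, not measured bond lines.

def interface_resistance(thickness_m, conductivity_w_mk):
    """Return thermal resistance per unit area (m^2*K/W)."""
    return thickness_m / conductivity_w_mk

pad = interface_resistance(0.0005, 17.0)    # 0.5 mm pad at 17 W/m*K
paste = interface_resistance(0.0001, 11.2)  # ~0.1 mm paste at 11.2 W/m*K

print(f"pad:   {pad:.2e} m^2*K/W")
print(f"paste: {paste:.2e} m^2*K/W")
print(f"paste has ~{pad / paste:.1f}x less resistance")
```

On these assumed numbers the thinner paste layer ends up with roughly a third of the pad's resistance despite the paste's lower W/m·K rating, which is exactly the point about thickness dominating the rating.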
Quote:


> Originally Posted by *Agent Smith1984*
> 
> Update on my new 290 (still learning this thing, and afterburner was giving me tons of problems!!!)
> 
> Just switched to trixx and got 1100/1500 with +80 offset, 50% power limit, 72c max temp with 50% fan.....
> 
> No throttling whatsoever.
> 
> Now here is the real kicker....
> 
> My 12v rail is at 11.63 during all this, and actual VDDC is reporting between 1.145 and 1.19..... completely stable and artifact free....
> That tells me that the +80 isn't doing anything for me but giving the PSU a higher point to drop from so it can fall within a stable voltage for the overclock.
> If I leave the voltage alone, it artifacts and falls to around .99???
> 
> Can't push any further until I ditch this crapped out PSU.
> 
> I am looking forward to hitting somewhere in the ballpark of 1200/1600 with this card once I get a nice PSU......
> 
> $160 CL find, with 80.2 ASIC quality and runs cool as a breeze.... I say WIN!


Software voltage readings are usually inaccurate. HWMonitor says my past 3 PSUs have been running at 6.9v on the 12v rail; if that were true I would have no computer left. Also, the phenomenon you're seeing is called vdroop. Simply put, it's a safety mechanism to prevent voltage dips/spikes to your GPU core, and a result of it is that more current drawn = slightly less voltage. You can read up on it on Wikipedia if you're interested.
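A hedged sketch of that load-line behavior (the loadline resistance value here is invented for illustration; it is not a 290 spec):

```python
# Simple vdroop model: delivered voltage sags in proportion to current.
R_LOADLINE = 0.001  # ohms; hypothetical illustrative value

def delivered_voltage(v_set, current_a, r_ll=R_LOADLINE):
    """Voltage seen at the core: set point minus loadline drop."""
    return v_set - current_a * r_ll

light = delivered_voltage(1.25, 10)   # light load
heavy = delivered_voltage(1.25, 100)  # heavy load draws more current
print(f"light: {light:.2f} V, heavy: {heavy:.2f} V")

# Raising the offset shifts the droop's starting point upward,
# which is why a +mV offset can stabilize a card that sags under load.
heavy_offset = delivered_voltage(1.25 + 0.08, 100)
print(f"heavy with +80 mV offset: {heavy_offset:.2f} V")
```

Under this toy model the +80 mV offset doesn't remove the droop, it just raises the point the voltage droops from, which matches what was described a few posts up.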


----------



## bond32

Discussed a few times, but I missed it. How do those of you with three or more QNIX (dual-link DVI only) monitors have them connected? Will a DisplayPort to dual-link DVI adapter work?


----------



## xer0h0ur

Christ on a cracker. Nevermind. Proceed with your circle jerk.


----------



## Agent Smith1984

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I would imagine this is the reasoning. The unit used for thermal pad/TIM conductivity is W/m·K. Just by looking at the units you can tell that if you have a thicker thermal interface (a .5mm thermal pad vs, say, a .1mm-thick layer of TIM paste) then you will get more efficient heat transfer with the thinner layer, assuming their thermal conductivities are similar.
> 
> Furthermore TIM paste is not meant to be a layer between the die and heatsink. All it is meant to do is fill in the microscopic crack and pits between the two surfaces, as air is an insulator. This is why you need to avoid bubbles in your TIM applications. Compared to metal and the die silicon (copper's thermal conductivity is 401 W/m*k) the thermal pads/paste is actually not a very good conductor of heat. Off the top of my head some Fuji pads are what, 17 W/m*k? And I know TIM paste is usually in that area or lower, so compared to the metals used it's only a fraction as conductive.
> 
> If you look at things in an ideal scenario, almost all the TIM paste that is not filling in the microscopic cracks is pushed out between the die and heatsink. This would leave a very very thin layer (no idea how much, but I'd be willing to bet <.1mm)... now compare this to your five times thicker .5mm thermal pad and even if the pad is rated slightly better for thermal conductivity the TIM paste would clearly be a better choice. There is a reason that putting pads on dies is not a thing, this is just my guess.
> Software voltage readings are usually inaccurate. HWMonitor says my past 3 PSUs have been running at 6.9v on the 12v rail; if that were true I would have no computer left. Also, the phenomenon you're seeing is called vdroop. Simply put, it's a safety mechanism to prevent voltage dips/spikes to your GPU core, and a result of it is that more current drawn = slightly less voltage. You can read up on it on Wikipedia if you're interested.


Yeah, I'm all too familiar with vdroop..... No newbness here








But it was not like this before; this is ridiculous. My 280X @ 1.3v, 1250 core would dip down to 1.22 sometimes and flicker, but only if the CPU was under heavy load too.... My 12v would bottom out at 11.8; it is now hitting 11.63, and after 8 great years it's just not able to handle this card.... This is definitely not typical vdroop at all.


----------



## taem

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I would imagine this is the reasoning. The units used for thermal pad/TIM conductivity is W/m*k. Just but looking at the units you can tell that if you have a thicker thermal interface (a .5mm thermal pad vs say a .1mm thick layer of TIM paste) then you will get more efficient heat transfer with the thinner layer, assuming their thermal conductivity is similar.
> 
> Furthermore TIM paste is not meant to be a layer between the die and heatsink. All it is meant to do is fill in the microscopic crack and pits between the two surfaces, as air is an insulator. This is why you need to avoid bubbles in your TIM applications. Compared to metal and the die silicon (copper's thermal conductivity is 401 W/m*k) the thermal pads/paste is actually not a very good conductor of heat. Off the top of my head some Fuji pads are what, 17 W/m*k? And I know TIM paste is usually in that area or lower, so compared to the metals used it's only a fraction as conductive.


Right. I assume that what happens with GPU block manufacture is, the point of emphasis is contact between block and core, and the screws that secure the block tighten this fit. Given that there are engineering tolerances involved, blocks would necessarily be designed to overshoot the distance between the block and the other parts of the PCB that need to be cooled, like the memory modules and the VRMs, and you make up the difference with the thermal pads. Otherwise, if you design for the same level of tolerance between all parts of the PCB, you would get samples where the block runs into a memory module or VRM before making contact with the core. That would be my guess, but I am certainly no engineer, quite the opposite in fact.

In any event I am still impressed with Aquacomputer's claim to a level of manufacturing precision that allows for pad-less memory cooling. I just wish I had a way to know whether they got it right. Stren's testing showed that there was quite a bit of variance in the paste spread on the vram modules with this block.


----------



## tsm106

I don't like Aqua blocks; their fitment can be wacky at times and they are too chromed out. The 90s are over yo, chrome is old.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *xer0h0ur*
> 
> Christ on a cracker. Nevermind. Proceed with your circle jerk.


Speculating with logic = circlejerk? Lol okay.


----------



## xer0h0ur

Your statement reads "well 2 is more than 1 but since its not much better its not as good as 1." Seems legit. The fact is by actual real world testing the pad performed better at heat transfer.


----------



## taem

Quote:


> Originally Posted by *tsm106*
> 
> I don't like aqua blocks, their fitment can be whacky at times and they are too chromed out. The 90s are over yo, chrome is old.


It's the same nickel plating everyone uses, no? Anyway, IT'S GOT A MAP OF HAWAII ON THE BLOCK!! And a palm tree.



Ehh, I'm not crazy about the aesthetics of it either, but that's not the part you see. The part you see I sort of like, because the slab-ness is relieved by that block and heat pipe.



For looks this is unbeatable IMHO, it's like a supermodel's waist.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *xer0h0ur*
> 
> Your statement reads "well 2 is more than 1 but since its not much better its not as good as 1." Seems legit. The fact is by actual real world testing the pad performed better at heat transfer.


Not exactly, my whole point was that a much thinner thermal interface will transmit heat more easily than a much thicker, slightly more conductive one.

Also, do you have data? I haven't been keeping up with this thread so if you posted any I probably missed it.


----------



## taem

Quote:


> Originally Posted by *xer0h0ur*
> 
> Your statement reads "well 2 is more than 1 but since its not much better its not as good as 1." Seems legit. The fact is by actual real world testing the pad performed better at heat transfer.


What study is there that compares paste to pads? What chiknnwatrmln pointed out is that you don't apply paste as a layer of heat-transfer material between two pieces of metal the way you do with pads. The purpose of paste is to fill the gaps between pieces of metal to prevent pockets of superheated air. So the fact that Ultra Extreme's heat transfer rating is higher doesn't matter, since the paste's heat transfer rating only applies to those gaps, and it allows metal-to-metal heat transfer without hotspots. Obviously if you could fill those gaps with Ultra Extreme pad that would be better, but that would require microscopic work. So it's sort of apples and oranges here; paste and pad are intended for different functions. That's how I understood his comment. And that makes perfect sense to me, because you cannot tell me that using Ultra Extreme on the cores and CPUs would not be the norm here if this worked the way you say. Guys here stress and begrudge every last degree Celsius; you bet they'd use a superior interface between blocks and chips if it were available.


----------



## buttface420

I know I'm late, but I just found out they have an 8GB version of the Vapor-X 290X out!!!! Imagine the powah of 2 of those crossfired.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *buttface420*
> 
> I know I'm late, but I just found out they have an 8GB version of the Vapor-X 290X out!!!! Imagine the powah of 2 of those crossfired.


Haha, have you seen the rumored/leaked specs of the 390x??


----------



## xer0h0ur

Quote:


> Originally Posted by *taem*
> 
> What study is there that compares paste to pads? What chiknnwatrmln pointed out is that you don't apply paste as a layer of heat-transfer material between two pieces of metal the way you do with pads. The purpose of paste is to fill the gaps between pieces of metal to prevent pockets of superheated air. So the fact that Ultra Extreme's heat transfer rating is higher doesn't matter, since the paste's heat transfer rating only applies to those gaps, and it allows metal-to-metal heat transfer without hotspots. Obviously if you could fill those gaps with Ultra Extreme pad that would be better, but that would require microscopic work. So it's sort of apples and oranges here; paste and pad are intended for different functions. That's how I understood his comment. And that makes perfect sense to me, because you cannot tell me that using Ultra Extreme on the cores and CPUs would not be the norm here if this worked the way you say. Guys here stress and begrudge every last degree Celsius; you bet they'd use a superior interface between blocks and chips if it were available.


Did you already forget about the Liquid MetalPad for CPUs and GPUs? The pad that is just a hair below the performance of CLU and CLP, TIMs rated at 38.4 W/mK and 32.6 W/mK.
Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Haha, have you seen the rumored/leaked specs of the 390x??


I haven't. What are they?


----------



## Arizonian

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Haha, have you seen the rumored/leaked specs of the 390x??


Glad you said rumored. If it's even close to what's being speculated Ive got a 780TI I'd sell to make up the cost so I can play with a 390X or whatever the name will be. Fun times ahead.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Arizonian*
> 
> Glad you said rumored. If it's even close to what's being speculated Ive got a 780TI I'd sell to make up the cost so I can play with a 390X or whatever the name will be. Fun times ahead.


Haha, I built my first PC in March 2013. I had the HD 7970 and, because I was looking to watercool and didn't check my card's compatibility with waterblocks (it was not compatible), I used that as an excuse to sell it. Then I held off on buying a new GPU until November, when I bought the R9 290. Then early this year I put everything under water using copper pipes and went all out. Now I was looking to sell my 290 with the hope of buying a 390, and saw that the next round of cards isn't due for at least 4 months. So I used this as an excuse to get a second 290 to XF. I think I'm done upgrading this PC forever. It's a hassle to redo the pipes lol. I will keep this as it is and it will go to the grave like this.

And AMD is gonna be in a price war with nVidia, so you may be able to buy 2 390Xs after you sell your 780 Ti.








Quote:


> Originally Posted by *xer0h0ur*
> 
> I haven't. What are they?


20nm cards with the following:

Stream Processors: 4096
Memory: 4GB
Memory clock: 1.25GHz
Memory bandwidth: 533GB/s

This is as per WCCFTech


----------



## taem

Quote:


> Originally Posted by *xer0h0ur*
> 
> Did you already forget about the Liquid MetalPad for CPUs and GPUs? The pad that is just a hair below CLU and CLP's performance. TIMs with a 38.4 w/mk and 32.6 w/mk.


That's 100% metal though. Different from what I thought we were talking about, which is whether you can directly compare Fuji Ultra Extreme's heat dissipation rating to that of paste.


----------



## taem

So I didn't know anything about the MetalPad, so I glanced through some info. It's not a pad in the sense of the Fuji. The "pad" is basically a delivery form of a product they also sell in liquid syringe form. (There are some other improvements, like eliminating the issue of the liquid form hardening before proper application.) The thing is supposed to melt between the pieces of metal. It's more like low-melting-point solder.


----------



## BLOWNCO

Can my post be updated? I now have 3 total Sapphire Vapor-X R9 290s.


----------



## buttface420

Quote:


> Originally Posted by *BLOWNCO*
> 
> Can my post be updated? I now have 3 total Sapphire Vapor-X R9 290s.


Sweeeeeet! When I can get just one of those I'll be chilling. Unless the 380X/390X are what they say they're going to be, I'm probably gonna get a Vapor-X 290/290X.


----------



## BLOWNCO

Quote:


> Originally Posted by *buttface420*
> 
> Sweeeeeet! When I can get just one of those I'll be chilling. Unless the 380X/390X are what they say they're going to be


Thanks. If I don't get the performance I want with these cards, they will get sold and I'll get 3 980s and be done with it lol.


----------



## Arizonian

Quote:


> Originally Posted by *BLOWNCO*
> 
> Can my post be updated? I now have 3 total Sapphire Vapor-X R9 290s.


Congrats - updated


----------



## BLOWNCO

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - updated


thanks


----------



## Bertovzki

Quote:


> Originally Posted by *Arizonian*
> 
> Glad you said rumored. If it's even close to what's being speculated Ive got a 780TI I'd sell to make up the cost so I can play with a 390X or whatever the name will be. Fun times ahead.


Well, if that's the case anytime soon, I'd not bother with a second 290X myself. Probably going to be costly though.


----------



## Jaimelmiel

Can I be updated? I have 2 MSI R9 290Xs, stock cooling.


----------



## taem

Quote:


> Originally Posted by *BLOWNCO*
> 
> Can my post be updated? I now have 3 total Sapphire Vapor-X R9 290s.


Gotta say, that looks beastly. Those accents in the middle of the card all lined up like that, nice.


----------



## BLOWNCO

Quote:


> Originally Posted by *taem*
> 
> Gotta say, that looks beastly. Those accents in the middle of the card all lined up like that, nice.


thanks







Once I order my CPU I'll post pics of them all installed.


----------



## YellowBlackGod

Just received my Sapphire R9 290X Vapor-X with 8GB RAM. Looks amazing. Hope it performs as well. Now I am waiting for the ASRock Z97 Extreme6 to install it.... These new 8GB Radeon models are the best answer to Nvidia: the same performance for a better price.

By the way, I am thinking about pairing all this with the Vapor-X CPU cooler. Does anyone have it? As far as I understood, I won't have any clearance issues.


----------



## rdr09

Quote:


> Originally Posted by *YellowBlackGod*
> 
> Just received my Sapphire R9 290x Vapor-x with 8 GB RAM. Looks amazing.Hope it performs as well. Now I am waiiting for the Asrock Z97 extreme6 to install it.... These new 8GB Radeon models are the best answer to Nvidia, for the same performance and for a better price.
> 
> By the way. I am thinking about pairing all these with the Vapor-x CPU cooler. Does anyone have it? As far I understood I won't have any clearance issues.


That PCIe x1 slot will go between the CPU cooler and the first card, so you are good. I recommend a good AIO, or just build a custom loop if blocks are available.


----------



## YellowBlackGod

Quote:


> Originally Posted by *rdr09*
> 
> that PCIe X1 will go between the cpu cooler and the first car, so you are good. i recommend a good aio or just build a custom loop if blocks are available.


I will have one card, so no problem there. Just not sure about water cooling yet; I will stick with air cooling, and this particular cooler looks great.


----------



## PillarOfAutumn

For those buying the 8GB 290X: is there really a difference in performance between the 8GB and the regular 290X? I'd imagine 1080p and 1440p performance would be the same and 4K only slightly better, but nothing to get a smooth 60 FPS, since the horsepower would be the same as the regular 290X.


----------



## battleaxe

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> For those buying the 8gb 290x, is there really a differenc in performance between the 8 and regular 290x? I'd imagine the 1080p and 1440p performance would be the same and 4k will only be slightly better, but nothing to get the smooth 60fps since horsepower would be the same as the 290x.


I think there would be an advantage for tri-fire and quad-fire.


----------



## bond32

Guessing the ideal setup would be to have the top card be an 8 GB model and the crossfired cards the 4 GB models...

In other news, I have 2 more QNIX monitors coming. Will be running 7680 x 1440.


----------



## YellowBlackGod

And there is also the RAM issue with new games at 1080p and above; they are RAM hungry. All the custom R9 290Xs are high-end cards, but especially these 8GB models can't easily be overlooked at this price. You need $150 more for a 980, and more than that for a Titan Black. It costs about the same as the 970. But hey, it's the RAM amount and the price/performance that make the difference here.


----------



## bond32

Quote:


> Originally Posted by *YellowBlackGod*
> 
> And there is also this RAM issue with the new games in 1080p and above. They are RAM hungry. All the custom R9 290x are high end cards.But especially these 8GB models for the price can't be easilly overseen.You need 150 dollars more for an 980 and more than that for a titan black. Costs about the same as the 970. But hey....it's the RAM amount and the price performance that make the difference here.


Maybe. If I run battlefield 4 with supersampling up to 4k then my 290x's are limited by the ram. But at 150% (3/4 4k) it runs fine. 4gb is fine for pretty much anything except 4k.


----------



## tsm106

Quote:


> Originally Posted by *bond32*
> 
> Maybe. If I run battlefield 4 with supersampling up to 4k then my 290x's are limited by the ram. But at 150% (3/4 4k) it runs fine. 4gb is fine for pretty much anything except 4k.


That's only 1.5x the "resolution" of 4k.


----------



## bond32

Quote:


> Originally Posted by *tsm106*
> 
> That's only 1.5x the "resolution" of 4k.


I thought 200% supersampling at 1440p was equal to 4k...


----------



## battleaxe

What does supersampling do? If I set it to 200% supersampling it cripples my 290...


----------



## xer0h0ur

1440p at 200% is 5120x2880, you best believe that is like a kick in the nuts to your graphics card.


----------



## tsm106

Quote:


> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> That's only 1.5x the "resolution" of 4k.
> 
> 
> 
> I thought 200% supersampling at 1440p was equal to 4k...

Oh sorry, yer at 1440P. Yea, that is equal almost. Only single panel? You should try out triples sometime. Make those cards stretch their legs.

Quote:


> Originally Posted by *battleaxe*
> 
> What does supersampling do? If I set it to 200% supersampling it cripples my 290...


It makes your game/app render at a larger resolution internally then display that image shrunken down in essence to fit on your smaller screen.


----------



## nightfox

Quote:


> Originally Posted by *bond32*
> 
> Guessing, would be the ideal setup to have the top card an 8 gb model and the crossfire'ed cards the 4 gb models...
> 
> In other news, have 2 more QNIX monitors coming. Will be running 7690 x 1440p.


Let me know how it performs. We have almost the same setup, but I could feel a serious CPU bottleneck during BF4 multiplayer. Could it be a driver issue? I'm thinking of upgrading to an X99 system.


----------



## battleaxe

Quote:


> Originally Posted by *tsm106*
> 
> Oh sorry, yer at 1440P. Yea, that is equal almost. Only single panel? You should try out triples sometime. Make those cards stretch their legs.
> It makes your game/app render at a larger resolution internally then display that image shrunken down in essence to fit on your smaller screen.


Maybe a stupid question, but does it usually look any better? I have a 1080p display, and when I set it to 200% I get like 40 FPS unless I overclock.


----------



## tsm106

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Oh sorry, yer at 1440P. Yea, that is equal almost. Only single panel? You should try out triples sometime. Make those cards stretch their legs.
> It makes your game/app render at a larger resolution internally then display that image shrunken down in essence to fit on your smaller screen.
> 
> 
> 
> Maybe a stupid question.. but does it usually look any better? I have a 1080p display and when I set it to 200% I get like 40fps unless I overclock.

It does look better than just simply rendering at your native resolution. However it does NOT look better than actually playing at the real resolution. It's like a good facsimile, but not better than the real thing obviously.


----------



## bond32

Quote:


> Originally Posted by *tsm106*
> 
> Oh sorry, yer at 1440P. Yea, that is equal almost. Only single panel? You should try out triples sometime. Make those cards stretch their legs.
> It makes your game/app render at a larger resolution internally then display that image shrunken down in essence to fit on your smaller screen.


Literally just ordered 2 more qnix monitors, adapter, and triple monitor stand this morning.


----------



## battleaxe

Quote:


> Originally Posted by *tsm106*
> 
> It does look better than just simply rendering at your native resolution. However it does NOT look better than actually playing at the real resolution. It's like a good facsimile, but not better than the real thing obviously.


Okay. Yeah I didn't expect it to look like the real resolution it is trying to render at. But it was interesting to see how it looked at 200%. I'd need my other 290 back to actually play at that.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *battleaxe*
> 
> What does supersampling do? If I set it to 200% supersampling it cripples my 290...


Quick and dirty of supersampling is just that the game is being run at a higher resolution and being made to fit into your monitor. For example, at 1080p, if you were to super sample it to 200%, you're essentially running 4k resolution that's being squeezed into your 1080p monitor. Likewise, 133% super sampling on 1080p is equivalent to 1440p and 150% super sampling on 1440p is equivalent to 4k. Advantage is that because your GPU is rendering such a huge image for a small monitor, you're getting sharper details. Disadvantage is that it wrecks your performance because you're rendering higher resolutions.
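The arithmetic in this explanation can be sketched in a few lines of Python. This is illustrative only (the function name is made up), and it assumes the scale percentage multiplies width and height independently, which is exactly the interpretation debated later in the thread:

```python
# Illustrative sketch: compute the internal render resolution for a given
# supersampling / resolution-scale percentage, assuming the percentage
# applies to each axis independently (the per-axis interpretation).

def scaled_resolution(width, height, scale_pct):
    """Return (w, h) of the internal render target for a scale percentage."""
    factor = scale_pct / 100.0
    return round(width * factor), round(height * factor)

print(scaled_resolution(1920, 1080, 200))  # (3840, 2160) -> 4K
print(scaled_resolution(2560, 1440, 150))  # (3840, 2160) -> 4K
print(scaled_resolution(1920, 1080, 133))  # (2554, 1436), close to 1440p
```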


----------



## tsm106

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Quote:
> 
> 
> 
> Originally Posted by *battleaxe*
> 
> What does supersampling do? If I set it to 200% supersampling it cripples my 290...
> 
> 
> 
> Quick and dirty of supersampling is just that the game is being run at a higher resolution and being made to fit into your monitor. For example, at 1080p, if you were to super sample it to 200%, you're essentially running 4k resolution that's being squeezed into your 1080p monitor. Likewise, 133% super sampling on 1080p is equivalent to 1440p and 150% super sampling on 1440p is equivalent to 4k. Advantage is that because your GPU is rendering such a huge image for a small monitor, you're getting sharper details. Disadvantage is that it wrecks your performance because you're rendering higher resolutions.

Your math is incorrect. Resolution = area not perimeter.


----------



## battleaxe

Quote:


> Originally Posted by *tsm106*
> 
> Your math is incorrect. Resolution = area not perimeter.


No wonder my FPS went to the crapper. Makes sense now.

Jeez... I'm gonna need 3 of these cards to do 4k justice... Aren't I?


----------



## PillarOfAutumn

Quote:


> Originally Posted by *tsm106*
> 
> Your math is incorrect. Resolution = area not perimeter.


Isn't that what I said?
Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Quick and dirty of supersampling is just that the game is being run at a higher resolution and being made to fit into your monitor. *For example, at 1080p, if you were to super sample it to 200%, you're essentially running 4k resolution that's being squeezed into your 1080p monitor*. Likewise, 133% super sampling on 1080p is equivalent to 1440p and 150% super sampling on 1440p is equivalent to 4k. Advantage is that because your GPU is rendering such a huge image for a small monitor, you're getting sharper details. Disadvantage is that it wrecks your performance because you're rendering higher resolutions.


1080p supersampled by 200% is essentially running four 1080p monitors' worth of pixels, in other words "4K resolution."

Whatever the supersample percentage is, you multiply the height and the width by that much, and you can work out the area from that.

1920 x 1080 supersampled by 200% = 3840 x 2160
2560 x 1440 supersampled by 150% = 3840 x 2160


----------



## bond32

Quote:


> Originally Posted by *battleaxe*
> 
> No wonder my FPS went to the crapper. Makes sense now.
> 
> Jeez... I'm gonna need 3 of these cards to do 4k justice... Aren't I?


BF4 at 1080p with 200% supersampling is still only half of 4k (1440p).

Correct me if I am wrong, but if you wanted to play at 4k 3 or 4 cards would be able to do it and still have a decent framerate.


----------



## battleaxe

Quote:


> Originally Posted by *bond32*
> 
> BF4 at 1080p with 200% supersampling is still only half of 4k (1440p).


How do you figure that?


----------



## bond32

1440p is half of 4k resolution. 1080p is half the resolution of 1440p. So 1080p is one fourth the resolution of 4k. If you super sample 200% then you are doubling your resolution, so 1080p @ 200% = 1440p, 1440p @ 200% = 4k


----------



## xer0h0ur

Quote:


> Originally Posted by *battleaxe*
> 
> No wonder my FPS went to the crapper. Makes sense now.
> 
> Jeez... I'm gonna need 3 of these cards to do 4k justice... Aren't I?


I don't know; my tri-fire setup can push 4K fairly well, but in absolute honesty tri-fire scaling isn't as good as I had hoped. The GPUs in general are grossly underutilized. It's been a while since I was only running the 295X2, so I don't know if two-card crossfire scaling is any better than tri-fire right now.


----------



## Agent Smith1984

Have seen this confusing everyone, and argued in several different threads...

Still don't have an answer as to whether 200% = (1920x2) x (1080x2) or if it equals (1920x1080)x2









I won't even argue, cause I'm confused my damn self at this point....


----------



## PillarOfAutumn

Quote:


> Originally Posted by *bond32*
> 
> BF4 at 1080p with 200% supersampling is still only half of 4k (1440p).
> 
> Correct me if I am wrong, but if you wanted to play at 4k 3 or 4 cards would be able to do it and still have a decent framerate.


No. You're doubling the height and width. You're running 4 x 1080p, or 4K, when you supersample to 200%. Supersampling 1080p up to 1440p only requires 133%.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> Have seen this confusing everyone, and argued in several different threads...
> 
> Still don't have an answer as to whether *200% = (1920x2) x (1080x2)* or if it equals (1920x1080)x2
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I won't even argue, cause I'm confused my damn self at this point....


With 200% supersampling you're doubling the height and the width; 200% at 1080p is like having four 1080p monitors.


----------



## bond32

Quote:


> Originally Posted by *xer0h0ur*
> 
> I don't know, my tri-fired setup can pusk 4K fairly well but in absolute honesty tri-fire scaling isn't as good as I had hoped. The GPUs in general are grossly underutilized. Its been a while since I was only running the 295X2 so I don't know if crossfire scaling is any better than tri-fire right now.


Crossfire (2 cards) is ideal, at least from my understanding... Going from one to two cards nets you about a 70% performance increase. Two to three cards, 50-60% increase over 2. Adding the fourth card 30-40% more. There's some pretty good articles around, ill try and find them.
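Using the rough per-card gains quoted above (midpoints assumed for the ranges; these are ballpark forum figures, not benchmark data), the cumulative scaling can be sketched like this:

```python
# Illustrative sketch of cumulative multi-GPU scaling from the ballpark
# per-card gains quoted above (assumed midpoints; not benchmark data).

def cumulative_scaling(per_card_gains):
    """per_card_gains: fractional gain each added card provides over the
    previous total. Returns the running performance multiplier per card count."""
    total = 1.0
    totals = [total]
    for gain in per_card_gains:
        total *= 1.0 + gain
        totals.append(round(total, 2))
    return totals

# 2nd card +70%, 3rd +55% (mid of 50-60%), 4th +35% (mid of 30-40%)
print(cumulative_scaling([0.70, 0.55, 0.35]))
```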


----------



## xer0h0ur

I am not talking about synthetic benchmarks. I know that in 3DMark 11 it shows a significant bump going from 2 to 3 Hawaii cores. The only thing that really matters to me, though, is actual in-game performance. I love benchmarking just as much as the next guy, but ultimately the entire system was built with gaming in mind, and in real-world usage the GPUs have god-awful low usage in tri-fire. I have a lot of screenshots of GPU usage in Thief, for instance, that I have been keeping across various driver versions since I went to tri-fire. Both in Mantle and DX11 it's bad.

As for supersampling, 1440p at 200% is 5120x2880.


----------



## ledzepp3

Hey everyone









For the whole time my D'yer Mak'er rig has been alive, it's been running on a single 1080P screen- which means my rig is total overkill. Now here's my question... If I were to get a hold of another BenQ XL2420TE, would that be a worthwhile investment? I can't game at 60FPS anymore, because I get wicked headaches. The other choice would be to get something like the ROG Swift, as it's a jump to 1440P (which I've never experienced), plus it's got the refresh rate to match.

As of now, I'm running pretty heavily OC'd dual 290Xs (mostly on the core), with a 4.4GHz 3930K running with 1600MHz memory with 9-10-9-24 timings, all on a RIV:BE with a full water cooling loop.

Thoughts?

-Zepp

*EDIT*: I mostly play BF4 on Conquest Large (so lots of textures to render), some Borderlands 2, occasionally Tribes Ascend, and Galaxy on Fire 2. Oh, and Goat Simulator


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Have seen this confusing everyone, and argued in several different threads...
> 
> Still don't have an answer as to whether 200% = (1920x2) x (1080x2) or if it equals (1920x1080)x2
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I won't even argue, cause I'm confused my damn self at this point....


With 200% supersampling you're doubling the height and the width; 200% at 1080p is like having four 1080p monitors.
Quote:


> Originally Posted by *xer0h0ur*
> 
> I am not talking about synthetic benchmarks. I know that in 3dmark11 it shows a significant bump going from 2 to 3 Hawaii cores. The only thing that really matters to me though is actual in game performance. I mean I love benchmarking just as much as the next guy but ultimately the entire system was built with gaming in mind and in real world usage the GPU's have god awful low usage in tri-fire. I have a lot of screenshots I have been keeping of GPU usage in Thief for instance across various driver versions since I have gone to tri-fire. Both in Mantle and DX11 its bad.
> 
> As for supersampling, 1440p at 200% is 5120x2880


Sorry, I think I was throwing this term around too. But BF4 uses downsampling (the thing you control with the resolution scale). So resolution scale, or downsampling, set at 200% on a 1080p monitor is effectively 3840x2160 (4K).


----------



## bond32

Quote:


> Originally Posted by *xer0h0ur*
> 
> I am not talking about synthetic benchmarks. I know that in 3dmark11 it shows a significant bump going from 2 to 3 Hawaii cores. The only thing that really matters to me though is actual in game performance. I mean I love benchmarking just as much as the next guy but ultimately the entire system was built with gaming in mind and in real world usage the GPU's have god awful low usage in tri-fire. I have a lot of screenshots I have been keeping of GPU usage in Thief for instance across various driver versions since I have gone to tri-fire. Both in Mantle and DX11 its bad.
> 
> As for supersampling, 1440p at 200% is 5120x2880


Suppose I kinda had the same idea in mind. But when I purchased the second and third cards I knew the drivers were not quite there. I don't have any complaints, though. It sounds like the only way to get the best single-GPU performance is with a 980... There's a guy who's able to play BF4 on three ROG Swifts (1440p Eyefinity, $3k+) on only one 980, at around 50-60 FPS.


----------



## xer0h0ur

I am a fairly patient person, so I am still going to give AMD the benefit of the doubt that they won't abandon me or my tech while trying to pump out the next generation. It's just jaw-dropping to me that if I manually create a crossfire profile through CCC and force near-full utilization of a single GPU, it results in higher framerates than low usage across three damned GPUs. That is grossly pathetic. They really need to do better.


----------



## Agent Smith1984

Quote:


> Originally Posted by *bond32*
> 
> Crossfire (2 cards) is ideal, at least from my understanding... Going from one to two cards nets you about a 70% performance increase. Two to three cards, 50-60% increase over 2. Adding the fourth card 30-40% more. There's some pretty good articles around, ill try and find them.


I've seen a lot of benches lately that show 85-100% dual GPU scaling..... that's insane for something that has shifted from hardware (chipsets) to all software (drivers) within the last 5 or 6 years.....

It's going over 3 that really suffers. Sometimes it's a CPU bottleneck. On mid-range CPU based systems the struggle is real, but on high-end or highly overclocked CPUs, 3-4 GPUs CAN be utilized; it's just that without a 4K display, the graphical demand isn't there to warrant 100% usage of all those GPUs.... At least that is what I have gathered from all my research....

I recently got rid of my 280x, and bought a 290.... I had considered going crossfire with a 7970 for cheap, and looked at lots of crossfire reviews.....

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> 200% super sampling, you're doubling the height and width. 200% at 1080p is like having 4 1080p monitor
> Sorry, I think I was throwing this term around too. But BF4 uses "Downsampling" (the thing you use with the resolution scale). So Resolution scale or downsampling set at 200% on a 1080p monitor is effectively 3840x2160 (4k).


I understand that 200% is 4K... or at least the majority claim it to be the equivalent... but I have read some saying that it's the equivalent of 1440p, because 100% of something is 1 and 200% of something is 2. How can two 1920x1080 displays, in any way, be equal to 4K???

I don't side with either opinion, just letting you know the arguments I have seen.....

It's going on in several threads, on several forums, all over the internet..


----------



## tsm106

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Your math is incorrect. Resolution = area not perimeter.
> 
> 
> 
> 
> 
> Isn't that what I said?
> Quote:
> 
> 
> 
> Originally Posted by *PillarOfAutumn*
> 
> Quick and dirty of supersampling is just that the game is being run at a higher resolution and being made to fit into your monitor. *For example, at 1080p, if you were to super sample it to 200%, you're essentially running 4k resolution that's being squeezed into your 1080p monitor*. Likewise, 133% super sampling on 1080p is equivalent to 1440p and 150% super sampling on 1440p is equivalent to 4k. Advantage is that because your GPU is rendering such a huge image for a small monitor, you're getting sharper details. Disadvantage is that it wrecks your performance because you're rendering higher resolutions.
> 
> 
> 1080p supersampled by 200%, you're essentially running 4 1080p monitor, otherwise "4k resolution."
> 
> Whatever the super sample percentage is, you multiply the height and width with that much, and you can work out the area with that.
> 
> 1920 x 1080 super sampled by 200% = 3840 x 2160
> 2560 x 1440 super sampled by 150%= 3840 x 2160

No. Your math is still wrong. You keep multiplying the perimeter, not the area. Google area vs perimeter. The resolution is 1920 times 1080 = 2MP. 2MP x 2 = 4MP, which is roughly equal to 2560 times 1600 = 4MP. Btw, 2560x1440 = 3.7MP.
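The megapixel arithmetic in this post is easy to check with a couple of lines of Python (the function name is illustrative):

```python
# Checking the megapixel figures: resolution is an area (width times height),
# so doubling the pixel count is not the same as doubling each axis.

def megapixels(width, height):
    return width * height / 1e6

print(f"1920x1080 = {megapixels(1920, 1080):.2f} MP")  # 2.07 MP
print(f"2560x1600 = {megapixels(2560, 1600):.2f} MP")  # 4.10 MP
print(f"2560x1440 = {megapixels(2560, 1440):.2f} MP")  # 3.69 MP
```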


----------



## sugarhell

Ocn of math


----------



## tsm106

Yea, we need more argumentative threads like this.

http://www.overclock.net/t/1522690/r9-280-crossfire-4k


----------



## Agent Smith1984

Quote:


> Originally Posted by *tsm106*
> 
> No. You're math is still wrong. You keep doing this: multiplying the perimeter not the area. Google how to area vs perimeter. The resolution is 1920 times 1080 = 2MP. 2MP x 2 = 4MP, which is roughly equal to 2560 times 1600 = 4MP. Btw 2560x1440 = 3.7MP.


Yeah, I can dig that math!

So basically, if I play BF4 at 1080P, with 200% res scaling, it should be close in image quality to a 1600p display......

Sounds like a good conclusion, cause this is a badly beaten horse.


----------



## Agent Smith1984

Quote:


> Originally Posted by *tsm106*
> 
> Yea, we need more argumentative threads like this.
> 
> http://www.overclock.net/t/1522690/r9-280-crossfire-4k


I was knee deep in that one at one point....

The problem with the OP, and why it wasn't worth carrying on, is that the 280s can RENDER 4K, but the 3GB of VRAM will not suffice for smooth gameplay.


----------



## Cyber Locc

Just wanted to say I got my cards under water. Had a few hiccups with my terminal adapter and midplate, so those aren't installed, but next update they will go in with a bunch of other stuff. Anyway, here's the current state.

Now if they would fix the driver issue that causes me to choose between horrid sound and massive screen tearing when using both cards, mhmm, that would be great. And if they don't, well I, I, I, burn down the building


----------



## xer0h0ur

Oh screw it, I ordered my 4K monitor. Had enough of this relic of a Samsung 204B.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *tsm106*
> 
> No. You're math is still wrong. You keep doing this: multiplying the perimeter not the area. Google how to area vs perimeter. The resolution is 1920 times 1080 = 2MP. 2MP x 2 = 4MP, which is roughly equal to 2560 times 1600 = 4MP. Btw 2560x1440 = 3.7MP.


Yes, I should google and try to learn how to do grade school math again.

Tell me something, then. People can run BF4 on a 290X at 1080p with an 80+ FPS average. However, once they bump the resolution scale to 200%, why do they suddenly get sub-30 or sub-20 FPS if all 200% scaling does is bump the resolution to 2560 x 1600? The 290X is certainly capable of running BF4 at 1600p above a 40-50 FPS average.

Take a look at the resolution scaling vs FPS on the 290x:
www.bit-tech.net/hardware/graphics/2013/11/27/battlefield-4-performance-analysis/7
Tell me that 25 FPS average is normal for BF4 at 1600p

Also, here's the FPS of BF4 at 4k resolution:
www.bit-tech.net/hardware/graphics/2013/11/27/battlefield-4-performance-analysis/6
The results don't carry over since on the scaling graphs, the reviewers used 4xMSAA on 200% scaling.

But if you look at this graph, it shows you that at 4k with ultra presets (what was used with the 200% scaling in the other graph), the performance of 1080p at 200% is very similar to 4k.
www.legitreviews.com/nvidia-geforce-gtx-780-ti-vs-amd-radeon-r9-290x-at-4k-ultra-hd_130055/4

Also, here's a quote from the Bit-tech article:

"Increasing the resolution scale percentage is akin to upping the resolution, so it's no surprise to see massive drops at each step. At 200 percent, the cards are effectively running at 4K with ultra settings, so it's no wonder that even the R9 290X crumbles."


----------



## tsm106

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Yea, we need more argumentative threads like this.
> 
> http://www.overclock.net/t/1522690/r9-280-crossfire-4k
> 
> 
> 
> I was knee deep in that one at one point....
> 
> The problem with the OP, and why it wasn't worth carrying on, is that the 280's can RENDER 4k, but the 3GB of VRAM will not suffice for smooth grameplay

That's not what I was referring to but more of posts like the ones below.

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> No. You're math is still wrong. You keep doing this: multiplying the perimeter not the area. Google how to area vs perimeter. The resolution is 1920 times 1080 = 2MP. 2MP x 2 = 4MP, which is roughly equal to 2560 times 1600 = 4MP. Btw 2560x1440 = 3.7MP.
> 
> 
> 
> 
> 
> 
> 
> Yes, I should google and try to learn how to do grade school math again.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Tell me something then. People can run BF4 on a 290x at 1080p at 80+ FPS average. However, once they bump up the resolution scale to 200%, why do they suddenly get sub 30 or sub 20 FPS if all 200% scaling does is bump up the resolution to 2560 x 1600? I mean the 290x is certainly capable of running BF4 at 1600p at above 40-50 FPS average.
> 
> Take a look at the resolution scaling vs FPS on the 290x:
> www.bit-tech.net/hardware/graphics/2013/11/27/battlefield-4-performance-analysis/7
> Tell me that 25 FPS average is normal for BF4 at 1600p
> 
> Also, here's the FPS of BF4 at 4k resolution:
> www.bit-tech.net/hardware/graphics/2013/11/27/battlefield-4-performance-analysis/6
> The results don't carry over since on the scaling graphs, the reviewers used 4xMSAA on 200% scaling.
> 
> But if you look at this graph, it shows you that at 4k with ultra presets (what was used with the 200% scaling in the other graph), the performance of 1080p at 200% is very similar to 4k.
> www.legitreviews.com/nvidia-geforce-gtx-780-ti-vs-amd-radeon-r9-290x-at-4k-ultra-hd_130055/4
> 
> Also, here's a quote from the Bit-tech article:
> 
> "Increasing the resolution scale percentage is akin to upping the resolution, so it's no surprise to see massive drops at each step. At 200 percent, the cards are effectively running at 4K with ultra settings, so it's no wonder that even the R9 290X crumbles."


----------



## xer0h0ur

You're creating a significant additional load on your entire system by scaling up. It's not like you can just divide by two and expect that framerate. Then there's the vRAM.


----------



## Forceman

Quote:


> Originally Posted by *tsm106*
> 
> No. You're math is still wrong. You keep doing this: multiplying the perimeter not the area. Google how to area vs perimeter. The resolution is 1920 times 1080 = 2MP. 2MP x 2 = 4MP, which is roughly equal to 2560 times 1600 = 4MP. Btw 2560x1440 = 3.7MP.


I think it's a safe bet that BF4 supersampling is not working off some kind of area-based math - it is linearly increasing the resolution in both dimensions at the same time. Hence the name "resolution scale" in the options, not "megapixel scale". In any case, it's easy for anyone with a 1440p monitor to test: just run once at 1440p, then rerun at 1080p with 133% scaling and see if the results are the same.

Edit: I'm so nice I just went and did it myself.

1440p - 109 fps
1080p 135% (~1440p equiv, I couldn't set 133% exactly) - 108 fps
1080p 200% (4k equiv) - 60 fps

It's clearly applying the scale to both the horizontal and vertical resolution.
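The pixel math behind this test can be sketched in Python (names are illustrative): if the scale multiplies both axes, 1080p at 135% should carry close to the same pixel load as native 1440p, which matches the near-identical framerates above.

```python
# Illustrative check of the test above: if the resolution scale multiplies
# both axes, 1080p at 135% carries nearly the same pixel load as native
# 1440p, and 1080p at 200% exactly the load of 4K.

def pixels(width, height, scale_pct=100):
    f = scale_pct / 100.0
    return int(width * f) * int(height * f)

print(pixels(1920, 1080, 135) / pixels(2560, 1440))   # ~1.025, near parity
print(pixels(1920, 1080, 200) == pixels(3840, 2160))  # True
```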


----------



## xer0h0ur

I can't say with any certainty until I get my monitor in but I highly doubt the results would be equal, all things considered.


----------



## jojo9752

XFX R9 290X Core Edition
Kraken G10 + Corsair H55

XFX R9 290X Core Edition
Kraken G10 + Antec Kuhler H20 620

GPU-Z Link: http://www.techpowerup.com/gpuz/details.php?id=9qn4x
CPU-Z Link: http://valid.x86.fr/myzfp9

I'm not entirely sure if this is how I'm supposed to do this haha


----------



## PillarOfAutumn

Quote:


> Originally Posted by *jojo9752*
> 
> XFX R9 290X Core Edition
> Kraken G10 + Corsair H55
> 
> XFX R9 290X Core Edition
> Kraken G10 + Antec Kuhler H20 620
> 
> GPU-Z Link: http://www.techpowerup.com/gpuz/details.php?id=9qn4x
> CPU-Z Link: http://valid.x86.fr/myzfp9
> 
> I'm not entirely sure if this is how I'm supposed to do this haha


Take a picture of the cards with a piece of paper showing your username and the date.

Welcome.


----------



## Arizonian

Quote:


> Originally Posted by *Jaimelmiel*
> 
> Can I be updated I have 2 Msi R9 290x's, Stock cooling?










That's strange, because I have you down for 2 MSI 290Xs already. You're number 480.









Quote:


> Originally Posted by *cyberlocc*
> 
> Just wanted to say I got my cards under water
> 
> 
> 
> 
> 
> 
> 
> . Had a few hiccups with my terminal adapter and midplate so those arent installed but next update they will go in with a bunch of other stuff. Anyway heres the
> current state.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Now if they would fix the driver issue that cause me to choose horrid sound or massive screen tearing when using both cards, Mhmm that would be great. And if they dont well I, I, I, Burn down the building
> 
> 
> Spoiler: Warning: Spoiler!


Looks great. Nice work.









Quote:


> Originally Posted by *jojo9752*
> 
> XFX R9 290X Core Edition
> Kraken G10 + Corsair H55
> 
> XFX R9 290X Core Edition
> Kraken G10 + Antec Kuhler H20 620
> 
> GPU-Z Link: http://www.techpowerup.com/gpuz/details.php?id=9qn4x
> CPU-Z Link: http://valid.x86.fr/myzfp9
> 
> I'm not entirely sure if this is how I'm supposed to do this haha


Congrats - added









Yup your GPU-Z validation link worked as it displayed your OCN name.


----------



## Cyber Locc

Quote:


> Originally Posted by *Arizonian*
> 
> 
> 
> 
> 
> 
> 
> 
> That's strange because I have you down for 2 MSI 290X already. Your number 480.
> 
> 
> 
> 
> 
> 
> 
> 
> Looks great. Nice work.
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yup your GPU-Z validation link worked as it displayed your OCN name.


Thanks, and ya I know, just wanted to share the sexiness. I loveeee them EK blocks.


----------



## BLOWNCO

Mine in the Google doc still shows 1 card and not 3?


----------



## mAs81

Hey Arizonian, I need an update too. My Sapphire is a 290, not a 290X as it is listed in the members list.


----------



## Regnitto

I'm going to be ordering an XFX Double D R9 290 next week, and I was just wondering: should I get the Black Edition, or just get the regular one and overclock to BE specs (and beyond) to save $60?


----------



## ThijsH

They are the same except for the higher default clock. Go for the cheapest one; they'll likely overclock the same. Personally I bought a Black Edition only because it was cheaper than the DD, lol.


----------



## Severon300

I would like to be added. I have an MSI R9 290 4G Gaming.

GPUZ http://www.techpowerup.com/gpuz/details.php?id=zcwhg

Stock 3D Mark Firestrike http://www.3dmark.com/fs/3037240

And a picture for good measure


----------



## Regnitto

Quote:


> Originally Posted by *ThijsH*
> 
> They are the same except for the higher default clock. Go for the cheapest one, they'll likely overclock the same. Personally I bought a black edition only because it was cheaper than dd lol.


that's what I was thinking, just wanted a second opinion. Once I have it in my hands I will be back to join the club


----------



## Agent Smith1984

oo ooo ooooohhhh

About to hit 10k.....
http://www.3dmark.com/3dm/4707384

Gonna keep pushing, this is 1150/1600

Completely artifact free and stable, several tests and games.....

12v rail is hitting 11.5v









UPDATE:
Captain, I cont do et, I dont hov thu powa


----------



## kizwan

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> For those buying the 8gb 290x, is there really a differenc in performance between the 8 and regular 290x? I'd imagine the 1080p and 1440p performance would be the same and 4k will only be slightly better, but nothing to get the smooth 60fps since horsepower would be the same as the 290x.


The horsepower is still the same. More vRAM is useful when gaming on a multi-monitor setup.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Crossfire (2 cards) is ideal, at least from my understanding... Going from one to two cards nets you about a 70% performance increase. Two to three cards, 50-60% increase over 2. Adding the fourth card 30-40% more. There's some pretty good articles around, ill try and find them.
> 
> 
> 
> I've seen a lot of benches lately that show 85-100% dual GPU scaling..... that's insane for something that has shifted from hardware (chipsets) to all software (drivers) within the last 5 or 6 years.....
> 
> It's going over 3 that really suffers. Sometimes it's a CPU bottleneck. On mid-range CPU based systems, the struggle is real, but on high end, or highly overclocked CPU's, 3-4 GPU's CAN be utilized, it's just that , without a 4k display, the graphical demand isn't there to warrant 100% usage of all those GPU's.... At least that is what I have gathered from all my research....
> 
> I recently got rid of my 280x, and bought a 290.... I had considered going crossfire with a 7970 for cheap, and looked at lots of crossfire reviews.....
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *PillarOfAutumn*
> 
> 200% super sampling, you're doubling the height and width. 200% at 1080p is like having 4 1080p monitor
> Sorry, I think I was throwing this term around too. But BF4 uses "Downsampling" (the thing you use with the resolution scale). So Resolution scale or downsampling set at 200% on a 1080p monitor is effectively 3840x2160 (4k).
> 
> Click to expand...
> 
> I understand that 200% is 4k.... or at least the majority claim it to be the equivalent..... but I have read some saying that it's the equivalent of 1440p, because 100% of something is 1, 200% of something is 2
> 
> 
> 
> 
> 
> 
> 
> , how can 2, 1920x1080P displays, in any way, be equal to 4k???
> 
> I don't side with either opinion, just letting you know the arguments I have seen.....
> 
> It's going on in several threads, on several forums, all over the internet..
Click to expand...

With a 1080p monitor, 100% resolution scale = 1920 x 1080, and 200% resolution scale = (1920 x 200%) x (1080 x 200%) = 3840 x 2160 = 4K (internally).

With BF4, it's not really an argument because 200% really is 2160p (internally). It says so if you enable DrawScreenInfo in game.
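The arithmetic behind the debate fits in a few lines. A minimal sketch (the numbers are just the 1080p case discussed above): because "resolution scale" multiplies each axis, 200% means four times the pixels, which is where both camps in the argument come from.

```python
def internal_resolution(width, height, scale_pct):
    """BF4-style resolution scale: each axis is multiplied by scale_pct / 100."""
    factor = scale_pct / 100
    return int(width * factor), int(height * factor)

# 200% scale on a 1080p monitor renders internally at 4K
w, h = internal_resolution(1920, 1080, 200)
print(w, h)                         # 3840 2160
print((w * h) / (1920 * 1080))      # 4.0 - four times the pixel count
```

So "200% of 1080p" is 2x per axis but 4x in total pixels; the disagreement is purely over which of those two numbers the label refers to.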


----------



## mus1mus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> oo ooo ooooohhhh
> 
> About to hit 10k.....
> http://www.3dmark.com/3dm/4707384
> 
> Gonna keep pushing, this is 1150/1600
> 
> Completely artifact free and stable, several tests and games.....
> 
> 12v rail is hitting 11.5v
> 
> 
> 
> 
> 
> 
> 
> 
> 
> UPDATE:
> Captain, I cont do et, I dont hov thu powa


very good card you got there..

any notable updates on the drivers?


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> very good card you got there..
> 
> any notable updates on the drivers?


Seems good indeed..... with some power it'll do for sure









I am actually running 14.9.1, and not having a single issue....... haven't tried anything else, I know AMD drivers well enough to just leave it be, lol


----------



## Arizonian

Quote:


> Originally Posted by *BLOWNCO*
> 
> mine in the google doc still shows 1 card and not 3?












#475 BLOWNCO SAPPHIRE R9 290 VAPOR X *3* http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/30710#post_22984394
Quote:


> Originally Posted by *mAs81*
> 
> Hey Arizonian I need an update too..My Sapphire is a 290 and not a 290X as it is listed in the members list..


Sorry about that, might have been an auto-fill on the spreadsheet that I didn't notice. Fixed.








Quote:


> Originally Posted by *Severon300*
> 
> I would like to be added I have a MSI R9290 4G Gaming
> 
> GPUZ http://www.techpowerup.com/gpuz/details.php?id=zcwhg
> 
> Stock 3D Mark Firestrike http://www.3dmark.com/fs/3037240
> 
> And a picture for good measure
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## BLOWNCO

Quote:


> Originally Posted by *Arizonian*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> #475 BLOWNCO SAPPHIRE R9 290 VAPOR X *3* http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/30710#post_22984394
> Sorry about that, might have been auto fill on the spreadsheet I didn't notice happen. Fixed.


Ha, weird, it still showed 1 the other night when I looked.


----------



## tsm106

Quote:


> Originally Posted by *kizwan*
> 
> With 1080p monitor, 100% resolution scale = 1920 x 1080 & 200% resolution scale = (1920 x 200%) x (1080 x 200%) = 3840 x 2160p = 4K (internally)
> 
> With BF4, it's not really an argument because 200% is really is 2160p (internally). It said so if you enabled DrawScreenInfo in game.


As Forceman alluded to, BF4 is using its own scale. Dice is whacked because it IS 4x the resolution. There's no denying that four 1080p screens fit into a 4K space. I don't think anyone can argue that, can they? Thus it comes down to what "Dice" is calling 200% resolution scale. It's all Dice's fault lol.


----------



## Cyber Locc

Okay guys, I need some brainstorming here. I have found another clue in my audio issue and why others may not have it. If I unplug all my monitors but 1 (I have 4), my audio issue goes away. So now there is another variable, making it Crossfire + Vsync + Multiple Monitors = audio issues caused by dxgkrnl.sys.

I'm not sure if this issue happens with Eyefinity as I don't use it, but I will try and find out.

EDIT: Eyefinity does not have the audio issues. So the issue is only present if you have multiple monitors not in Eyefinity, with Crossfire and Vsync.


----------



## kizwan

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> With 1080p monitor, 100% resolution scale = 1920 x 1080 & 200% resolution scale = (1920 x 200%) x (1080 x 200%) = 3840 x 2160p = 4K (internally)
> 
> With BF4, it's not really an argument because 200% is really is 2160p (internally). It said so if you enabled DrawScreenInfo in game.
> 
> 
> 
> As forceman alluded to, BF4 is using it's own scale. Dice is whacked because it IS 4x times the resolution. There's no denying that four 1080 screens fit into a 4k space. I don't think anyone can argue that, can they? Thus it comes down to what "Dice" is calling 200% resolution scale. It's all Dice's fault lol.
Click to expand...

I agree. If we follow the standard, saying 200% (2x) resolution scale from 1080p is 4K would be wrong. I think in DICE's view, "resolution scale" can be an ambiguous term.


----------



## DividebyZERO

Quote:


> Originally Posted by *cyberlocc*
> 
> Okay guys I need some brainstorming here. I have found another clue in my audio issue and why others may not have it. If i unplug all my monitors but 1(I have 4) my audio issue goes away. So now there is another variable making it Crossfire + Vsync + Multiple Monitors = Audio issues caused by dxkrnl.sys.
> 
> Im not sure if this issue happens with eyefinity as I don't use it but I will try and find out.
> 
> EDIT: Eyefinity does not have the audio issues. So the issue is only present if you have multiple monitors not in eyefinity with crossfire and vsync.


that would explain why i dont have this issue, i only run eyefinity.


----------



## Cyber Locc

Quote:


> Originally Posted by *DividebyZERO*
> 
> that would explain why i dont have this issue, i only run eyefinity.


Ya, can you do me a favor: get dpclat or LatencyMon (or both), try running your screens without Eyefinity, and see if you have the issue. I'm beginning to think the issue isn't isolated to only a few people; I think only a few meet all the requirements, as most people with multiple screens use Eyefinity.

And I guess I should have put this together sooner, as everyone else with the issue through this nightmare had multiple screens. Also, that explains why people's "disable your HDMI monitor" advice worked for them but not me: they only had 2 screens and I have 4, so their disable took them to 1 screen while mine took me to 3.


----------



## PillarOfAutumn

Quick question. Aside from frozencpu, is there any other place that sells FujiPoly ultra extremes? The ones rated at 17 W/mK? Or do you guys recommend any other thermal pad that's comparable or better? FCPU is currently out in the thickness I'm looking for.


----------



## Forceman

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Quick question. Aside from frozencpu, is there any other place that sells FujiPoly ultra extremes? The ones rated at 17 W/mK? Or do you guys recommend any other thermal pad that's comparable or better? FCPU is currently out in the thickness I'm looking for.


Performance PCs (www.performance-pcs.com) has some, not sure what size/thickness though.


----------



## tsm106

Why the heck is everyone so obsessed with the Fujis? We benchers have been using them for years, mind you, but it's not like it does most people any good. An EK block can keep VRM temps with the stock 5 W/mK pads in the range of 65C at upwards of +200mv. The 17 W/mK brings that down to 50C. To get to a 1300MHz core overclock you are going to surpass +200mv, and so the Fujis come in handy as the VRM temps will spike past that 65C. However, for everyone else not pushing stupid amounts of voltage, y'all are wasting your money and making the pads scarce for those who are going to feed obscene voltage. Just saying...

Btw, on custom cards like the Lightning, Fujis are like burning money, as the Lightning VRM section with the stock 5 W/mK pads hardly ever breaks 45C, and that's with obscene volts. Different levels of mosfet efficiency and volume at play...
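The trade-off described above follows from plain 1-D conduction through the pad, dT = P * t / (k * A). The wattage and pad area below are made-up illustrative numbers, not measurements from any particular block:

```python
def pad_delta_t(power_w, thickness_m, k, area_m2):
    """Temperature drop across a thermal pad, 1-D conduction: dT = P*t/(k*A)."""
    return power_w * thickness_m / (k * area_m2)

area = 0.015 * 0.060                        # hypothetical 15 mm x 60 mm VRM strip
stock = pad_delta_t(30, 1.0e-3, 5, area)    # stock 5 W/mK pad
fuji = pad_delta_t(30, 1.0e-3, 17, area)    # Fujipoly 17 W/mK pad
print(round(stock, 1), round(fuji, 1))      # 6.7 2.0
```

The pad is only one leg of the thermal path, so the real difference at the sensor depends on everything else in the stack; the point is simply that the benefit scales with how much power the VRMs are actually dissipating, which is why it only pays off at high voltage.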


----------



## cephelix

Well, not all of us (i.e. me) happen to have a card that can fit the full cover block, and ambient temps may be high enough that even without pushing the volts, VRM temps are already high. Anyways, there's nothing wrong with wanting the best for your system.


----------



## Buehlar

Quote:


> Originally Posted by *tsm106*
> 
> Why the heck is everyone so obsessed with the Fujis. We benchers have been using them for years mind you, but it's not like it does most ppl any good. An ek block can keep vrm temps with the stock pads at 5w/mk in the range of 65c at upwards of +200mv. The 17w/mk brings that down to 50c. To get to 1300mhz core overclock you are going to surpass +200mv and so the Fujis come in handy as the vrm temps will spike past that 65c. However most everyone else that are not pushing stupid amounts of voltage, yall are *wasting your money* and making the pads scarce for those who are going to feed obscene voltage. Just saying...
> 
> Btw, on custom cards like the lightning, Fujis are like burning money as the Lightning vrms section with the stock 5w/mk pads hardly ever breaks 45c and thats with obscene volts. Different levels of mosfet efficiency and volume at play...


I wanted my temps as low as possible and if any card merits the use of fujipoly, the 290x is at the top of that list.

Furthermore, perhaps we sometimes forget that this is OCN, eh?


----------



## Agent Smith1984

Well.....

Got 100% stable at 1150/1600 on my 290... and then my power supply EXPLODED....

Looked like a shotgun let off out the back of the case..... was playing a 64 player BF4 map.... ultra settings, 175% res scale, 4xMSAA, about 45 minutes of playing at 1150/1600, everything was beautiful....

I could hear the PSU coil whining a bit, and I have known it was on its way out since I got this card..... but I have never seen one pop like this one did.

Everything shut down..... tried unplugging and resetting the tripped breaker, plugged it back in, nothing.... complete silence...... then a few seconds later, it just went boom!








Stunk up the whole living room.....


----------



## bond32

Sounds like a capacitor blew...


----------



## Agent Smith1984

Well.... it's 8 years old..... that's like 100 in PSU years









I had just posted a thread yesterday asking for PSU suggestions.... it must have read it....


----------



## bond32

Anything Shilka says... My vote goes to the EVGA G2 series... If you only have one card I'd go with the 850. Two cards the 1000 and so on. Personally I have 2 of the 1300 watt g2's.


----------



## Agent Smith1984

Looking to stay $70-90 if possible....

Really interested in the Rosewill Photon or Capstone modular series... looks like they are aimed at competing directly with the G2 for a good bit less cash....


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Looking to stay $70-90 if possible....
> 
> Really interested in the Rosewill Photon or Capstone modular series... looks like they are aimed at competing directly with the G2 for a good bit less cash....


The PSU is one part of the system you should never go cheap on. Buying a good, expensive PSU on sale is one thing, but looking specifically for one that is cheap is asking for trouble. If I had to build my system all over again, I'd go for a 1000 watt PSU that's at least Gold-rated from Seasonic, Corsair, or EVGA. But this will cost you at least $130+.


----------



## shilka

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Looking to stay $70-90 if possible....
> 
> Really interested in the Rosewill Photon or Capstone modular series... looks like they are aimed at competing directly with the G2 for a good bit less cash....


I don't know anything about the Photon series as there are no reviews yet, but the Capstone is a good option if you don't have a ton of money.
Post number 25,001 lol.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Looking to stay $70-90 if possible....
> 
> Really interested in the Rosewill Photon or Capstone modular series... looks like they are aimed at competing directly with the G2 for a good bit less cash....


Lucky for you, the Seasonic X-1250 is on sale for $130!

http://www.newegg.com/Product/Product.aspx?Item=N82E16817151109


----------



## Sgt Bilko

The XFX DD R9 290x 8GB is up for pre-order in Aus!


----------



## Agent Smith1984

Quote:


> Originally Posted by *shilka*
> 
> I dont know anything about the Photon series as there are no reviews yet but the Capstone is a good option if you dont have a ton of money.
> Post number 25.001 lol.


Thanks for the input Shilka!

I have been reading through all of your PSU info... great stuff man!!

I am all too familiar with the importance of a good PSU. Back in '06 when I bought the OCZ, I had been through 3 or 4 cheap PSUs in a few months' time before it, and I had finally had enough of the garbage. I did some research and found that the OCZ was one of the best on the market at the time for the money ($180 +/-), and I had found it on sale for $160, so I ordered it. I was finally pleased with my PSU.....

In the end, it has been through 4 rigs, and has more than earned its keep and praise. I don't have one bad thing to say about it... it was 82% efficient before the 80 Plus cert even existed!!
At the time 75-79% was considered good!

I knew its days were drawing near, so I was pretty much just counting down to the firework show!
I was overclocking my 290 last night and saw the 12v rail dropping down to 11.5 (it had been at 11.63 in the days before).... within 30 minutes of that, she went BOOM.

I think I will opt for the Capstone modular 750 since my budget is pretty limited.... it should be all I need for one GPU.

Thanks guys!
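For reference, the ATX spec allows +/-5% regulation on the +12V rail, so 11.40V is the floor. A quick sanity check like the sketch below (not any official tool) shows that a reading of 11.5V is technically still in spec, but sagging toward the limit, which matches how the story above ended:

```python
def rail_within_spec(nominal, measured, tolerance=0.05):
    """True if a rail reading is inside the ATX +/- tolerance window."""
    return abs(measured - nominal) <= nominal * tolerance

print(rail_within_spec(12.0, 11.63))  # True  - healthy
print(rail_within_spec(12.0, 11.5))   # True  - but only 0.1 V above the 11.40 V floor
print(rail_within_spec(12.0, 11.3))   # False - out of spec
```

A rail trending steadily downward under the same load is the warning sign, even before it actually crosses the 5% boundary.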


----------



## LandonAaron

Be sure to check Jonny Guru's reviews before you purchase any power supply. You could get an Evga G1 750w for $100, but I would spend a little more and get the G2 850W for $145/ $120 after rebate.


----------



## shilka

Quote:


> Originally Posted by *LandonAaron*
> 
> Be sure to check Jonny Guru's reviews before you purchase any power supply. You could get an Evga G1 750w for $100, but I would spend a little more and get the G2 850W for $145/ $120 after rebate.


The G1 is a rebrand of the old NEX, so it's still the same old rubbish.
http://www.overclock.net/t/1476935/why-you-should-not-buy-an-evga-supernova-nex650g-750g


----------



## Agent Smith1984

Yeah, I read the info you had posted on the NEX750 and decided to definitely stay away.....

As far as the G2, it's a nice unit, but I am trying to stay at $90 or less, and the Rosewill Capstone 750-M is going for $85 right now. That PSU seems to have a good rep.

I am really interested in the Photon, but it's so new there are no reviews. It appears to be targeted directly at the G2 for $30 or so less. I'm guessing it's a Super Flower rebrand....

Also, the 1000W Capstone-M is only $75 after a $40 rebate; that seems like a great deal......


----------



## shilka

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I am really interested in the Photon, but it's so new there are no reviews. It appears to be targeted directly at the G2 for $30 or so less. I'm guessing it's a super flower re-brand....


No, it's HighPower (aka Sirtec) made; I think it might be based on the HighPower Astro GD (F), but I don't know since no one has reviewed it yet.
Could be great or it could be rubbish.


----------



## Cyber Locc

Okay guys, so I need your help in my study here to find a fix for my audio issue. So far I have given my info here and to a few people; of those people (4 in total), 2 had the problem and 2 didn't. The 2 that did have the problem were using multiple monitors; the 2 that didn't were only using one. For the 2 that had the issue with multiple screens, it was gone after unhooking them, and the ones that didn't have it hooked up their TV or another monitor and boom, they had it.

So if anyone could help me out, as I would like to see how widespread this is:

Have Crossfire enabled,
Have Vsync on,
Have more than 1 screen hooked up, enabled and working (but not in Eyefinity),

And see if you have audio issues, and if you have the program (or don't mind getting it), check LatencyMon and see if dxgkrnl is having high spikes.

And report back with your findings. I would really appreciate it, so we can hopefully gather enough findings on the issue that AMD will work on a fix.


----------



## flopper

Quote:


> Originally Posted by *cyberlocc*
> 
> Okay guys so I need your help in my study here to find a fix to my audio issue. So far I have gave me info here and to a few people out of the people (4 in total) 2 had the problem and 2 didn't have it. The 2 that did have the problem were using multiple monitors the 2 that didn't were only using one. The 2 that had the issue before with multi screens it was gone by unhooking and the ones that didn't have it hooked up there tv or another monitor and boom they had it.
> 
> So if anyone could help me out as I would like to see how widespread this is.
> 
> Have crossfire enabled,
> Have vsync on,
> Have more than 1 screen hooked up enabled and working (but not in eyefinity),
> 
> And see if you have audio issues and if you have the program or don't mind getting it check latency mon and see if dxgkrnl is having high spikes.
> 
> And report back with your findings. I would really appreciate it so we can hopefully get enough findings of the issue so AMD will work on a fix.


AMD is aware of this and a fix is coming -
the audio issue with multiple screens and such.

if its this you mean

http://forums.overclockers.co.uk/showthread.php?p=27211741#post27211741


----------



## Cyber Locc

Quote:


> Originally Posted by *flopper*
> 
> amd is aware of this and fix is coming.
> the audio issue when multiple screens and such.
> 
> if its this you mean
> 
> http://forums.overclockers.co.uk/showthread.php?p=27211741#post27211741


I replied to the other thread as well. But yes, this is the issue, and yes, I know AMD is aware, as I provided the info to Dev and asked him to forward it to AMD Matt. I also provided it to the driver team and AMD support, so hopefully this will get taken care of soon.

Until then guys, you just have to disable (not unhook); not the best workaround, but it will work.


----------



## arrow0309

Quote:


> Originally Posted by *LandonAaron*
> 
> Be sure to check Jonny Guru's reviews before you purchase any power supply. You could get an Evga G1 750w for $100, but I would spend a little more and get the G2 850W for $145/ $120 after rebate.


This









Quote:


> Originally Posted by *shilka*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> I am really interested in the Photon, but it's so new there are no reviews. It appears to be targeted directly at the G2 for $30 or so less. I'm guessing it's a super flower re-brand....
> 
> 
> 
> No its HighPower aka Sirtec made, think it might be based on the HighPower Astro GD (F) but i dont know since no one has reviewed it yet.
> Could be great or it could be rubbish.
Click to expand...

Read the review first









http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story&reid=377

*Performance 10

Functionality 10

Value 10

Build Quality 9.5

Total Score 9.9*


----------



## PillarOfAutumn

Hey guys. I'm looking for some help. I recently bought a used Sapphire 290 Tri-X OC. This is my second 290, which will run in Crossfire. So I hooked everything up and popped it into the PC, but it seems as though my PC doesn't detect the second GPU. I plugged that GPU into a different PC I have and it ran perfectly: GPU-Z picked up the appropriate clock frequencies, it scored 9900 on 3DMark Firestrike, and it is overall stable. On the PC that this second GPU is intended for, I plugged my soundcard into the PCIe slot I was going to use for the GPU, and the soundcard worked.

So the graphics card is working and the PCIe slot is also working, however my PC isn't detecting the second GPU. It's not being picked up by Device Manager, CCC, Sapphire TriXX, or GPU-Z. Is there something I'm missing? Am I supposed to enable something from a menu or options? I just took a look at CCC and under "Performance" I only see AMD Overdrive, and no Crossfire option.


----------



## Sgt Bilko

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Hey guys. I'm looking for some help. I recently bought a used Sapphire 290 Trix-OC. This is my second 290, which will run in crossfire. So I hooked everything up, popped it into the PC, but it seems as though my PC doesn't detect the second GPU. I plugged in that GPU into a different PC I have and it ran perfectly, GPUz picked up the appropriate clock frequencies, it scored 9900 on 3Dmark Firestrike, and is overall stable. On the PC that this second GPU is intended for, I plugged in my soundcard in the PCIe lane I was going to use for my GPU and that soundcard worked.
> 
> So the graphics card is working, the PCIe lane is also working, however my PC isn't detecting the second GPU. Its not being picked up on device manager, CCC, sapphire trixx, or GPUz. IS there something I'm missing? Am I supposed to enable something from a menu or options? I just took a look at CCC and under "Performance", I only see AMD Overdrive, and no Crossfire option.


DDU Wipe then re-install the drivers, I had that issue once


----------



## Chopper1591

Quote:


> Originally Posted by *arrow0309*
> 
> This
> 
> 
> 
> 
> 
> 
> 
> 
> Read the review first
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story&reid=377
> 
> *Performance 10
> 
> Functionality 10
> 
> Value 10
> 
> Build Quality 9.5
> 
> Total Score 9.9*


I love reading the stuff from Jonny. He actually knows what he is talking about when he reviews PSUs.
And that is a damn tight score indeed.

ON TOPIC:
I really could use some insight from you guys here @ the club.

Been a happy owner of a 7950 Vapor-X for the past ~1.5 years.
Sadly, recent games and the games coming in the next few months will struggle on this card.

I've been playing with the idea of selling this card and grabbing a r9 290 Tri-X.

What do you guys think?
A good move?

I am doubting between the Sapphire tri-x and the Asus dc2oc.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Chopper1591*
> 
> ON TOPIC:
> I really could use some insight from you guys here @ the club.
> 
> Been a happy owner of a 7950 vapor-x for the past ~1,5 years.
> Sadly the recent games and games to come in the next few months will struggle with this card.
> 
> I've been playing with the idea of selling this card and grabbing a r9 290 Tri-X.
> 
> What do you guys think?
> A good move?
> 
> I am doubting between the Sapphire tri-x and the Asus dc2oc.


Tri-X over the DCU II anyday of the week.


----------



## tsm106

DC2, are you masochistic?? Into S&M?


----------



## Chopper1591

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Tri-X over the DCU II anyday of the week.


I kinda feel dumb here.
The DCU2 wasn't reviewed that badly, if I am not mistaken.

How is overclocking on the Tri-X?
Am I right to go for the Sapphire over the Gigabyte?
Quote:


> Originally Posted by *tsm106*
> 
> DC2, are you masochistic?? Into S&M?


Fair enough


----------



## tsm106

Quote:


> Originally Posted by *Chopper1591*
> 
> I am doubting between the Sapphire tri-x and the Asus dc2oc.


The 290 trix is a solid choice. The other one not so much.


----------



## Chopper1591

Quote:


> Originally Posted by *tsm106*
> 
> The 290 trix is a solid choice. The other one not so much.


Alright.

Seems that I stick with Sapphire once again.
Been rolling with them since my 9700 pro.

Bought an Asus 4870 once and it gave me problems, so it went back to the store.


----------



## Jflisk

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Hey guys. I'm looking for some help. I recently bought a used Sapphire 290 Trix-OC. This is my second 290, which will run in crossfire. So I hooked everything up, popped it into the PC, but it seems as though my PC doesn't detect the second GPU. I plugged in that GPU into a different PC I have and it ran perfectly, GPUz picked up the appropriate clock frequencies, it scored 9900 on 3Dmark Firestrike, and is overall stable. On the PC that this second GPU is intended for, I plugged in my soundcard in the PCIe lane I was going to use for my GPU and that soundcard worked.
> 
> So the graphics card is working, the PCIe lane is also working, however my PC isn't detecting the second GPU. Its not being picked up on device manager, CCC, sapphire trixx, or GPUz. IS there something I'm missing? Am I supposed to enable something from a menu or options? I just took a look at CCC and under "Performance", I only see AMD Overdrive, and no Crossfire option.


I had this problem with trifire. If the DDU driver removal and reinstall does not work: plug in only the GPU in the slot that is not being seen. Start the machine with a monitor attached to it. Make sure it goes into Windows and the card is seen and installs drivers.
Shut down, install the second card. Rinse and repeat and you should be good to go.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *Jflisk*
> 
> I had this problem with trifire. Plug in only the GPU in the slot that is not being seen. Start the machine with a monitor attached to it. Make sure it goes into windows and the card is seen-installs drivers.
> Shut down Install second card. Rinse repeat you should be good to go. If the DDU driver remove and install does not work.


Quote:


> Originally Posted by *Sgt Bilko*
> 
> DDU Wipe then re-install the drivers, I had that issue once


Thanks guys. I appreciate it. I will try doing all of this next week once I get more time. But just to be clear, I put in a sound card in the PCIe lane and I was able to get sound from the soundcard. This means that the PCIe lane is clearly working and will function with the GPU, right?


----------



## Jflisk

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Thanks guys. I appreciate it. I will try doing all of this next week once I get more time. But just to be clear, I put in a sound card in the PCIe lane and I was able to get sound from the soundcard. This means that the PCIe lane is clearly working and will function with the GPU, right?


It should, yes, but the drivers are funny sometimes; that's why I gave you my suggestion and the other person gave you theirs. Between the 2 you should be okay. Once your system sees the card at the driver level it should be fine.


----------



## bond32

Sooooooooooooooo,........

I just now noticed that on the Koolance blocks I have, the new revision uses 0.7mm thick thermal pads on the VRM1 strip. I have 3 of these blocks. The first revision used 1.0mm thickness on that VRM strip. I used the 1.0 on all of mine, and I ordered the Fuji k=17 1.0mm thick pads. I think they will be fine to use, but what's up, Koolance? Why 0.7 when everyone else uses 1.0, right?


----------



## xer0h0ur

Quote:


> Originally Posted by *bond32*
> 
> Sooooooooooooooo,........
> 
> I just now noticed, the Koolance blocks I have: the new revision uses 0.7mm thick thermal pads on the VRM1 strip. I have 3 of these blocks. The first revision used 1.0mm thickness on that vrm strip. I used the 1.0 on all of mine and I ordered the Fuji k=17 1.0 thick pads. Think they will be fine to use, but Koolance? Why 0.7 when everyone else uses 1.0 right?


0.3mm is a small difference, so the 1mm pads themselves can squeeze to 0.7 when you tighten everything down. Hell, I accidentally put so much pressure on the vRAM pads with my fingers holding them down when I was cutting them that some got squished and were wasted, since they weren't making good contact anymore. And those were the 0.5mm Fujis.


----------



## Mugamat

Hi everyone. I've been using a Kraken G10 for about 4 months on my overclocked R9 290, with a Corsair H55 in push & pull on the exhaust of my case. I made an acrylic cover for the top mount and it glows under UV light. I also added one cooler for the other VRM; I'm using Gelid Icy Vision heatsinks on that VRM and some no-name heatsinks on the memory. Here are some photos of the card and the temps. (Sorry for the bad quality and bad English.)


Spoiler: Warning: Spoiler!


----------



## shilka

Quote:


> Originally Posted by *arrow0309*
> 
> This
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Read the review first
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story&reid=377
> 
> *Performance 10
> 
> Functionality 10
> 
> Value 10
> 
> Build Quality 9.5
> 
> Total Score 9.9*


You misunderstood me; I said that I did not know about the Rosewill Photon series and it could be good or bad. I did not say that the HighPower Astro GD (F) was bad.
Read again.


----------



## arrow0309

Quote:


> Originally Posted by *shilka*
> 
> You misunderstood me i said that i did not know about the Rosewill Photon series and it could be good or bad i did not say that the HighPower Astro GD (F) was bad.
> Read again.


My bad









I've just read some of your threads; good job


----------



## PillarOfAutumn

Hey Arizonian.

Can you please update me on the list? I recently got a second 290 that I will be crossfiring.


----------



## taem

Something odd going on with my 290.

At idle, the system consumes ~130 W and the GPU sits in the low 30s °C on the core, high 20s on the VRMs.

But if the displays turn off, the system draws 200+ W and GPU temps climb into the 40s.

Any theories as to cause?


----------



## bond32

Nice new score...

http://www.3dmark.com/fs/3239962


----------



## UGApcGaming

Hey guys, new to Overclock.net. I've been lurking here for a while, but figured I'd introduce myself and share my rig. The main reason for joining is that I haven't found a solution for 4K gaming on a television. Here's the dilemma.

I want the new Vizio due to its price and high-velocity mode. It's an excellent TV for gaming, as it has the lowest input lag of all the TVs out there by a large margin. The problem is the R9 290X doesn't have HDMI 2.0.

The Panasonic AX900 that comes out this month does have a DisplayPort, but it's expected to cost almost $8,000.

If I buy the Vizio, do you guys think I could add a future AMD card with HDMI 2.0 support in a tri-fire configuration, or will mixing different chips possibly cause an issue? I appreciate any suggestions and welcome rumors and speculation. Thanks!

4770k
16gb memory
Gigabyte GA-z87x-OC
2 R9 290X with XSPC copper blocks/Alphacool 80mmx140mm radiator/XSPC photon reservoir

Displays: 2x Acer H236HL with LG ultrawide 29um65p (6400x1080)


----------



## maynard14

Here is my Firestrike score for my 290X: core overclocked to 1105 MHz and memory to 1460 MHz at default voltage.


----------



## Blameless

Quote:


> Originally Posted by *mus1mus*
> 
> I think the reason most of the guys who have used CLP advise against using that TIM for direct-die application is that they scratch the die. There's the issue.


They don't scratch the die, and it wouldn't really be an issue if they did.
Quote:


> Originally Posted by *mus1mus*
> 
> We don't know how thick the non-conducting layer of the die is. So the danger is there.


The insulating layer is orders of magnitude thicker than the actual metal layers that contain the transistors. Any scratch deep enough to risk shorting something on the die through a conductive TIM would rapidly progress to outright failure anyway.
Quote:


> Originally Posted by *mus1mus*
> 
> As well as CLP overflowing onto the circuit traces surrounding the actual die.
> 
> But no, die-to-heatsink with CLP is not much of an issue. They wouldn't be using a copper lid if it were that serious. CLP on traces and surrounding components is.


Yes, this can be an issue. The surface mount capacitors, resistors, and the odd exposed contact can be shorted by conductive TIM, or otherwise interfered with by capacitive ones.

Being very careful with TIM application, or covering any exposed components with a conformal coating can get around this.
Quote:


> Originally Posted by *xer0h0ur*
> 
> 0.3 mm is a small difference, so the 1 mm pads can compress down to 0.7 mm once you tighten everything down. I accidentally put so much pressure on the vRAM pads with my fingers while holding them down as I cut them that some got squished and were ruined; they weren't making good contact anymore. And those were the 0.5 mm Fujis.


Yeah, excessive thickness is bad, but you do generally want the pad to see some compression so everything makes solid contact and has sufficient mounting pressure for good thermal conductivity.
Quote:


> Originally Posted by *UGApcGaming*
> 
> If I buy the Vizio, do you guys think I could add a future AMD card with HDMI 2.0 support in a tri-fire configuration, or will mixing different chips possibly cause an issue? I appreciate any suggestions and welcome rumors and speculation. Thanks!


You generally cannot mix GPUs of different generations/architectures. Hawaii based parts do not support HDMI 2.0 and it's unlikely that they ever will. Whatever replaces them is highly unlikely to be able to crossfire with them.


----------



## devilhead

Quote:


> Originally Posted by *bond32*
> 
> Nice new score...
> 
> http://www.3dmark.com/fs/3239962


Good score, but why are the memory clocks so low? I've had around twenty 290/290Xs; Hynix memory is better, of course, but even my worst Elpida card has done 1500 MHz.







And in Firestrike, a memory overclock makes a real difference to the score.


----------



## bond32

Quote:


> Originally Posted by *devilhead*
> 
> Good score, but why are the memory clocks so low? I've had around twenty 290/290Xs; Hynix memory is better, of course, but even my worst Elpida card has done 1500 MHz.
> 
> 
> 
> 
> 
> 
> 
> And in Firestrike, a memory overclock makes a real difference to the score.


Of the four cards, three are Hynix, and they are really good memory clockers. The Elpida one won't do over 1400 MHz on memory, and that was in regular Firestrike, not Ultra. I was using the modified GPU Tweak, which doesn't appear to be able to adjust each card individually, but I could be wrong.


----------



## virpz

Does anyone know of new 290/290X revisions that still have the reference PCB?

I need a new card with a reference PCB so I can fit a waterblock that I already have.

Looking for info on the cards below, and on cards that are currently stocked at major vendors:

Gigabyte
GV-R929WF3-4GD
GV-R929OC-4GD
GV-R929XOC-4GD

Sapphire
100362-2SR
100361-2SR

Powercolor
AXR9 290 4GBD5-PPDHE


----------



## Klocek001

Quote:


> Originally Posted by *bond32*
> 
> Of the four cards, three are Hynix, and they are really good memory clockers. The Elpida one won't do over 1400 MHz on memory, and that was in regular Firestrike, not Ultra. I was using the modified GPU Tweak, which doesn't appear to be able to adjust each card individually, but I could be wrong.


My previous R9 290 had Elpida and also couldn't do 1500 stable; now I have Hynix and it clocks to 1600 no problem.


----------



## tsm106

Quote:


> Originally Posted by *Klocek001*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Of the four cards, three are Hynix, and they are really good memory clockers. The Elpida one won't do over 1400 MHz on memory, and that was in regular Firestrike, not Ultra. I was using the modified GPU Tweak, which doesn't appear to be able to adjust each card individually, but I could be wrong.
> 
> 
> 
> My previous R9 290 had Elpida and also couldn't do 1500 stable; now I have Hynix and it clocks to 1600 no problem.

I had two Elpida cards; both did 1600 comfortably. I've seen Elpida do 1700, by the way, and I've also seen Samsung chips that can't do over 1650.


----------



## PillarOfAutumn

Quote:


> Originally Posted by *virpz*
> 
> Does anyone know of new 290/290X revisions that still have the reference PCB?
> 
> I need a new card with a reference PCB so I can fit a waterblock that I already have.
> 
> Looking for info on the cards below, and on cards that are currently stocked at major vendors:
> 
> Gigabyte
> GV-R929WF3-4GD
> GV-R929OC-4GD
> GV-R929XOC-4GD
> 
> Sapphire
> 100362-2SR
> 100361-2SR
> 
> Powercolor
> AXR9 290 4GBD5-PPDHE


I can only speak for the 290, not the 290X. From the list you've given, I believe only the Gigabyte is reference. Gigabyte marks it on the PCB, whether it's rev 1.0 or 2.x. I haven't heard of any rev 2.0 Gigabyte WindForce cards, so that may be your best bet of the three.

On top of that, the XFX DD Black Edition is also a reference PCB; I'm not sure about the regular XFX DD. You should check out coolingconfigurator.com to see what is reference and what isn't.

If you're looking for brand-new cards, check out superbiiz.com. They have the Sapphire Tri-X OC (I don't remember the model number), but it's the reference one, not the new revision. There's also the MSI R9 290 Gaming with the Twin Frozr cooler, which is reference as well.


----------



## bond32

Quote:


> Originally Posted by *tsm106*
> 
> I had two Elpida cards; both did 1600 comfortably. I've seen Elpida do 1700, by the way, and I've also seen Samsung chips that can't do over 1650.


Yeah, I just haven't spent enough time with mine. The Elpida card was the last one I purchased; I've been meaning to test it. Unfortunately, the PCIe DIP switches on this Z87 Classified board sit directly under the first card (poor design), so I'd need to unplug the others, and I am lazy.


----------



## sTOrM41

Did anybody recently buy a PowerColor 290X PCS+ or LCS? If so, could you please upload the BIOS for me?


----------



## Bertovzki

Can anyone tell me whether my new Sapphire R9 290X Tri-X OC has Hynix VRAM, or how I can find out? I'm not too concerned, as I'm probably not overclocking the CPU or GPU, but I will if I need to one day.


----------



## xer0h0ur

Quote:


> Originally Posted by *Bertovzki*
> 
> Can anyone tell me whether my new Sapphire R9 290X Tri-X OC has Hynix VRAM, or how I can find out? I'm not too concerned, as I'm probably not overclocking the CPU or GPU, but I will if I need to one day.


GPU-Z


----------



## Bertovzki

Quote:


> Originally Posted by *xer0h0ur*
> 
> GPU-Z


I'm new here. Does GPU-Z mean I have to have my card installed and run some software to detect my VRAM, or can I just read it somewhere? My GPU will be in its box for a month or so yet.


----------



## xer0h0ur

Yes, it means the card needs to be installed; then install and run GPU-Z to see the brand of memory and the GPU's ASIC quality.


----------



## Bertovzki

Quote:


> Originally Posted by *xer0h0ur*
> 
> Yes, it means the card needs to be installed; then install and run GPU-Z to see the brand of memory and the GPU's ASIC quality.


OK, thanks for the info.


----------



## aaroc

For a three-display setup at 7680x1440, is the new 8 GB R9 290X worth the upgrade? I have a quadfire of 4 GB R9 290X cards right now.


----------



## battleaxe

Quote:


> Originally Posted by *Bertovzki*
> 
> Can anyone tell me whether my new Sapphire R9 290X Tri-X OC has Hynix VRAM, or how I can find out? I'm not too concerned, as I'm probably not overclocking the CPU or GPU, but I will if I need to one day.


The Tri-X all have Hynix. You win.


----------



## bond32

Quote:


> Originally Posted by *battleaxe*
> 
> The Tri-X all have Hynix. You win.


I'm not entirely sure that's accurate... at least the Tri-X BIOS has the capability to utilize Elpida...

The Vapor-X only comes with Hynix, I believe.


----------



## razaice

Quote:


> Originally Posted by *bond32*
> 
> I'm not entirely sure that's accurate... at least the Tri-X BIOS has the capability to utilize Elpida...
> 
> The Vapor-X only comes with Hynix, I believe.


I've heard something similar to this. If you want to know for sure what type of memory you have, GPU-Z will tell you.


----------



## battleaxe

Quote:


> Originally Posted by *bond32*
> 
> I'm not entirely sure that's accurate... at least the Tri-X BIOS has the capability to utilize Elpida...
> 
> The Vapor-X only comes with Hynix, I believe.


Back when I was mining, the Tri-X was known to come only with Hynix. The Hynix RAM meant a great deal to those of us who were mining, so some would buy the Tri-X solely for the Hynix memory. I can say with almost certainty that Hynix is all they come with.

I could be wrong, though. But I'm pretty sure I'm right.


----------



## bond32

Quote:


> Originally Posted by *battleaxe*
> 
> Back when I was mining, the Tri-X was known to come only with Hynix. The Hynix RAM meant a great deal to those of us who were mining, so some would buy the Tri-X solely for the Hynix memory. I can say with almost certainty that Hynix is all they come with.
> 
> I could be wrong, though. But I'm pretty sure I'm right.


http://www.techpowerup.com/vgabios/index.php?architecture=&manufacturer=Sapphire&model=R9+290X&interface=&memType=&memSize=

All the Tri-X have Elpida listed in the BIOS. Just saying...


----------



## taem

Quote:


> Originally Posted by *bond32*
> 
> http://www.techpowerup.com/vgabios/index.php?architecture=&manufacturer=Sapphire&model=R9+290X&interface=&memType=&memSize=
> 
> All the Tri-X have Elpida listed in the BIOS. Just saying...


I don't think that's right. I flashed my PowerColor PCS+ with a Tri-X BIOS, and of the four or so Tri-X BIOSes in the TechPowerUp database, only one listed both Hynix and Elpida; the others listed only Hynix. Speaking only of the 290, not the 290X, though.


----------



## battleaxe

Quote:


> Originally Posted by *bond32*
> 
> http://www.techpowerup.com/vgabios/index.php?architecture=&manufacturer=Sapphire&model=R9+290X&interface=&memType=&memSize=
> 
> All the Tri-X have Elpida listed in the BIOS. Just saying...


That doesn't really mean anything. They could have just used the Sapphire reference BIOS as a starting point.


----------



## Bertovzki

Quote:


> Originally Posted by *battleaxe*
> 
> That doesn't really mean anything. They could have just used the Sapphire reference BIOS as a starting point.


OK guys, thanks all for your feedback. I guess I'll find out soon. As for the month or so before starting my build: I'll probably have the card in before then, since the weather is good here tomorrow and I'll take my case outside and start to mutilate it.


----------



## DividebyZERO

Quote:


> Originally Posted by *aaroc*
> 
> For a three-display setup at 7680x1440, is the new 8 GB R9 290X worth the upgrade? I have a quadfire of 4 GB R9 290X cards right now.


I am wondering the same thing. I have triple 4K displays and run anything below that as a custom resolution. I'm not sure if it's worth the money, because the next series may be launching in Q1 2015...


----------



## aaroc

Quote:


> Originally Posted by *DividebyZERO*
> 
> I am wondering the same thing. I have triple 4K displays and run anything below that as a custom resolution. I'm not sure if it's worth the money, because the next series may be launching in Q1 2015...


I have 4 EK waterblocks for them. I don't plan on changing GPUs and waterblocks soon, but a GPU that uses the same waterblock sounds interesting.


----------



## DividebyZERO

Quote:


> Originally Posted by *aaroc*
> 
> I have 4 EK waterblocks for them. I don't plan on changing GPUs and waterblocks soon, but a GPU that uses the same waterblock sounds interesting.


It would be amazing if someone who has them ran some 4K benches. Anything above that would be awesome, too.


----------



## tsm106

Quote:


> Originally Posted by *DividebyZERO*
> 
> Quote:
> 
> 
> 
> Originally Posted by *aaroc*
> 
> For a three-display setup at 7680x1440, is the new 8 GB R9 290X worth the upgrade? I have a quadfire of 4 GB R9 290X cards right now.
> 
> 
> 
> I am wondering the same thing. I have triple 4K displays and run anything below that as a custom resolution. I'm not sure if it's worth the money, because the next series may be launching in Q1 2015...

For quads at very high res? Hell yes. It will let you run the highest IQ settings and the hi-res textures in new games like Shadow of Mordor.
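To put rough numbers on the resolutions being discussed (a simplified sketch: the final framebuffer is only a small slice of VRAM use; textures and render targets dominate, which is where the extra 4 GB actually helps):

```python
# Rough size of a single 32-bit framebuffer at multi-monitor resolutions.
def framebuffer_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    """Return the size of one frame in MiB at the given resolution."""
    return width * height * bytes_per_pixel / (1024 ** 2)

print(f"7680x1440 (3x 1440p): {framebuffer_mib(7680, 1440):.1f} MiB")   # ~42.2 MiB
print(f"11520x2160 (3x 4K):   {framebuffer_mib(11520, 2160):.1f} MiB")  # ~94.9 MiB
```

Even at triple 4K, the framebuffer itself is under 100 MiB per frame; the 8 GB card's headroom matters mainly for hi-res texture packs and high-IQ settings at those resolutions.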


----------



## Arizonian

Quote:


> Originally Posted by *Mugamat*
> 
> Hi everyone. I've been using a Kraken G10 on my overclocked R9 290 for about 4 months, with a Corsair H55 in push/pull on my case's exhaust. I made an acrylic cover for the top mount that glows under UV light, and I also added a fan for the second VRM. I'm using Gelid Icy Vision heatsinks on that VRM and some no-name heatsinks on the memory. Here are some photos of the card and the temps (sorry for the poor quality and my bad English).
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added















Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Hey Arizonian.
> 
> Can you please update me on the list? I recently got a second 290 that I will be crossfiring.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated








Quote:


> Originally Posted by *UGApcGaming*
> 
> Hey guys, new to Overclock.net. I've been lurking here for a while, but figured I'd introduce myself and share my rig. The main reason for joining is that I haven't found a solution for 4K gaming on a television. Here's the dilemma.
> 
> I want the new Vizio due to its price and high-velocity mode. It's an excellent TV for gaming, as it has the lowest input lag of all the TVs out there by a large margin. The problem is the R9 290X doesn't have HDMI 2.0.
> 
> The Panasonic AX900 that comes out this month does have a DisplayPort, but it's expected to cost almost $8,000.
> 
> If I buy the Vizio, do you guys think I could add a future AMD card with HDMI 2.0 support in a tri-fire configuration, or will mixing different chips possibly cause an issue? I appreciate any suggestions and welcome rumors and speculation. Thanks!
> 
> 4770k
> 16gb memory
> Gigabyte GA-z87x-OC
> 2 R9 290X with XSPC copper blocks/Alphacool 80mmx140mm radiator/XSPC photon reservoir
> 
> Displays: 2x Acer H236HL with LG ultrawide 29um65p (6400x1080)
> 
> 
> Spoiler: Warning: Spoiler!


Welcome to OCN with your first post. Nice work. Congrats - added


----------



## Chopper1591

*Asking advice from Tri-x owners.*

Is there something to look out for when shopping for these?
Like with my current 7950 Vapor-X, where the BIOS number indicates the type of memory chips.

Are there known bad batches?
And last but not least, is there a big difference between the regular and OC versions?


----------



## Agent Smith1984

I'm hearing a lot about the Tri-X 290...

I too can confirm Hynix memory... almost every one I have seen has Hynix...

What confuses me is why some people say they can't get past 1400 MHz on memory while others (like myself) have no problem going to 1600+.
Doesn't the memory voltage track the core voltage? If that's the case, maybe people aren't juicing it enough?


----------



## LandonAaron

Quote:


> Originally Posted by *UGApcGaming*
> 
> Hey guys, new to Overclock.net. I've been lurking here for a while, but figured I'd introduce myself and share my rig. The main reason for joining is that I haven't found a solution for 4K gaming on a television. Here's the dilemma.
> 
> I want the new Vizio due to its price and high-velocity mode. It's an excellent TV for gaming, as it has the lowest input lag of all the TVs out there by a large margin. The problem is the R9 290X doesn't have HDMI 2.0.
> 
> The Panasonic AX900 that comes out this month does have a DisplayPort, but it's expected to cost almost $8,000.
> 
> If I buy the Vizio, do you guys think I could add a future AMD card with HDMI 2.0 support in a tri-fire configuration, or will mixing different chips possibly cause an issue? I appreciate any suggestions and welcome rumors and speculation. Thanks!


Before buying that Vizio TV, be sure to play around with a display model, or maybe head over to the AVS forums and see if they have a thread for it. I have a Vizio TV (just 1080p) that I bought last year. While it has a good picture, one very annoying thing is that the only input mode with low input lag is Game mode, and Game mode comes with its own display settings for things like brightness and contrast. You cannot adjust the display settings at all while in Game mode; doing so creates a new "custom" display mode which doesn't have Game mode's low input lag. Even PC mode has horrible input lag, so only Game mode can be used for gaming. Fortunately, Game mode has the best all-round balance of settings of the default display modes, but it's still annoying that they can't be adjusted. Just wanted to give you a heads up.


----------



## cephelix

@Mugamat, your VRM temps look good. What is the ambient temp? I'm looking at a universal block, so I'll need copper heatsinks like yours.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm hearing a lot about the Tri-X 290...
> 
> I too can confirm Hynix memory... almost every one I have seen has Hynix...
> 
> What confuses me is why some people say they can't get past 1400 MHz on memory while others (like myself) have no problem going to 1600+.
> Doesn't the memory voltage track the core voltage? If that's the case, maybe people aren't juicing it enough?


The memory on these is all a bit different. I have Hynix memory that cannot go over 1350 MHz no matter how much voltage I push through the card. It just depends on the lottery; the silicon lottery applies to memory too. You can get a great core and terrible memory, or a good sample of both. It's all down to what you win in memory and core on a specific card.


----------



## Mugamat

Quote:


> Originally Posted by *cephelix*
> 
> @Mugamat, your VRM temps look good. What is the ambient temp? I'm looking at a universal block, so I'll need copper heatsinks like yours.


It's about 30 °C, and the PC takes in air right next to the heating radiator. I'd recommend the Gelid Icy Vision heatsink kit: it's cheap, fits perfectly, and keeps the VRM cool with some airflow from the fans.


----------



## LandonAaron

Looks like the R9 290X will lag a bit behind the GTX 970 in Far Cry 4.


----------



## bond32

Yet another game that looks to be more optimized for Nvidia... Also, 1080p? Really?


----------



## Mugamat

I'm a noob at overclocking and just tried to overclock my R9 290 under the Kraken G10. Can anybody tell me whether that is a good result, or should I never do it again?


----------



## Agent Smith1984

Use TriXX...

AB doesn't set the 50% power limit in CCC as it should.

Also, AB limits you to +100 mV...


----------



## Chopper1591

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm hearing a lot about the Tri-X 290...
> 
> I too can confirm Hynix memory... almost every one I have seen has Hynix...
> 
> What confuses me is why some people say they can't get past 1400 MHz on memory while others (like myself) have no problem going to 1600+.
> Doesn't the memory voltage track the core voltage? If that's the case, maybe people aren't juicing it enough?


Yeah, it seems that most, if not all, Tri-X cards use Hynix.
Quote:


> Originally Posted by *battleaxe*
> 
> The memory on these is all a bit different. I have Hynix memory that cannot go over 1350 MHz no matter how much voltage I push through the card. It just depends on the lottery; the silicon lottery applies to memory too. You can get a great core and terrible memory, or a good sample of both. It's all down to what you win in memory and core on a specific card.


Could it be the same as with the 7950 back then, where later batches started to be worse than the first series?

I am looking at a second-hand card, 8 months old: 210 euros (261 USD) excluding shipping.

Seems like a decent price to me. What about you?


----------



## Klocek001

My 290 Tri-X just died again, after being exchanged for a new one about a month ago. Rats! Not even while playing; I was just writing in MS Word.


----------



## Chopper1591

Quote:


> Originally Posted by *Klocek001*
> 
> My 290 Tri-X just died again, after being exchanged for a new one about a month ago. Rats! Not even while playing; I was just writing in MS Word.


How is your psu?

I've had one doing that on me once.


----------



## Klocek001

I don't know; it seems fine. The 12 V rail reads 11.75 V, and people told me that's all right. It's an XFX 850 W.
What exactly happened to you?
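For what it's worth, the ATX12V spec allows ±5% on the +12 V rail (11.4-12.6 V), so 11.75 V is within tolerance; keep in mind software sensor readings are often off, and only a multimeter reading is really trustworthy. A quick sanity check:

```python
# Check a +12 V rail reading against the ATX12V +/-5% tolerance.
NOMINAL_V = 12.0
TOLERANCE = 0.05  # ATX12V allows +/-5% on the +12 V rail

measured_v = 11.75
deviation = abs(measured_v - NOMINAL_V) / NOMINAL_V
print(f"deviation: {deviation:.1%}, within spec: {deviation <= TOLERANCE}")
# prints: deviation: 2.1%, within spec: True
```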


----------



## tsm106

Quote:


> Originally Posted by *Klocek001*
> 
> I don't know; it seems fine. The 12 V rail reads 11.75 V, and people told me that's all right. It's an XFX 850 W.
> What exactly happened to you?


What will you do when the third GPU dies? If the second one died that quickly, I suspect the third will die just as fast.


----------



## Chopper1591

Quote:


> Originally Posted by *Klocek001*
> 
> I don't know, it seems fine. 12V is 11.75V guys told me it's alright. It's XFX 850W.
> What exactly happened to you ?


Well... electronics are weird sometimes.

A faulty PSU can kill electronics.

But then again, bad units happen... so it might just as well be plain bad luck.

How old is the PSU?
Do you have other weird things happening sometimes, or was everything working fine?


----------



## Klocek001

Quote:


> Originally Posted by *tsm106*
> 
> What will you do when the third gpus dies? You know, if the second one died so quickly, I suspect the third will die just as fast.


You think it's my PSU? There's nothing weird happening; the cards just died, not even in games.
It's about 12 months old (the PSU).


----------



## Chopper1591

Quote:


> Originally Posted by *Klocek001*
> 
> You think it's my PSU? There's nothing weird happening; the cards just died, not even in games.
> It's about 12 months old (the PSU).


If I were you, I'd put the matter to XFX and ask their opinion.

I had weird issues one day and couldn't locate the problem...
I swapped hardware one piece at a time until I thought of the PSU. I contacted Corsair about it and they offered me an RMA.


----------



## Jflisk

Quote:


> Originally Posted by *Klocek001*
> 
> I don't know, it seems fine. 12V is 11.75V guys told me it's alright. It's XFX 850W.
> What exactly happened to you ?


The XFX power supplies do not have the best track record, just a heads up. Their video cards, on the other hand, I have never had a problem with.


----------



## Chopper1591

Quote:


> Originally Posted by *Jflisk*
> 
> The XFX power supplies do not have the best track record, just a heads up. Their video cards, on the other hand, I have never had a problem with.


XFX PSUs can actually be expected to be pretty good.
They are made by Seasonic.


----------



## shilka

Quote:


> Originally Posted by *Jflisk*
> 
> The XFX power supplies do not have the best track record, just a heads up. Their video cards, on the other hand, I have never had a problem with.


http://www.overclock.net/t/1436079/xfx-power-supplies-information-thread


----------



## Jflisk

Quote:


> Originally Posted by *shilka*
> 
> http://www.overclock.net/t/1436079/xfx-power-supplies-information-thread


Okay, got it. When I was looking for a PSU a while ago, I did not see many good things about them. It might have been an older model I was looking at. Thanks.


----------



## Chopper1591

Quote:


> Originally Posted by *Jflisk*
> 
> Okay, got it. When I was looking for a PSU a while ago, I did not see many good things about them. It might have been an older model I was looking at. Thanks.


Ah well.
We are all here to learn, right?

And, as with all brands, there are mediocre and better models. It all comes down to the price you pay.


----------



## Agent Smith1984

I wish someone would review the new Rosewill Photon series before I order a new PSU on Friday...

At this point, it's come down to the Rosewill Capstone 750-M or the Rosewill Photon (the Photon has a striking feature list for the price) at around $90...

Of course, that raises the question: do I go $120 for the EVGA 750 G2, which is basically the best PSU you can get for the money, or go ahead and get the power I need for crossfire at the same price ($120 for the Rosewill Photon 1050 W)?









Decisions.....


----------



## shilka

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I wish someone would review the new Rosewill Photon series before I order a new PSU on Friday...
> 
> At this point, it's come down to the Rosewill Capstone 750-M or the Rosewill Photon (the Photon has a striking feature list for the price) at around $90...
> 
> Of course, that raises the question: do I go $120 for the EVGA 750 G2, which is basically the best PSU you can get for the money, or go ahead and get the power I need for crossfire at the same price ($120 for the Rosewill Photon 1050 W)?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Decisions.....


All I know about the Photon is that the OEM is HighPower; I can't tell you whether it's good or not.


----------



## Agent Smith1984

Thanks, Shilka!

I wonder if any other HighPower 550-1050 W units have popped up on the market under other brandings... they could be the same units.
I have heard HighPower is hit or miss, no?


----------



## Chopper1591

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I wish someone would review the new Rosewill Photon series before I order a new PSU on Friday...
> 
> At this point, it's come down to the Rosewill Capstone 750-M or the Rosewill Photon (the Photon has a striking feature list for the price) at around $90...
> 
> Of course, that raises the question: do I go $120 for the EVGA 750 G2, which is basically the best PSU you can get for the money, or go ahead and get the power I need for crossfire at the same price ($120 for the Rosewill Photon 1050 W)?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Decisions.....


I never skimp on the PSU.

And ask yourself this:
why would a unit at the same price deliver 300 W more?

Looks like you have your answer there.
Trust me: don't skimp on the PSU.

The G2 is a solid unit.
They are made by Super Flower, by the way, if you didn't know.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Chopper1591*
> 
> I never skimp on the PSU.
> 
> And ask yourself this:
> why would a unit at the same price deliver 300 W more?
> 
> Looks like you have your answer there.
> Trust me: don't skimp on the PSU.
> 
> The G2 is a solid unit.
> They are made by Super Flower, by the way, if you didn't know.


I _agree_... to a _degree_...

I'm all too familiar with the importance of getting a good PSU (I paid $180 for my OCZ in 2006 and it lasted 8 years!), but the reviews for the Rosewill units have been great...
Rosewill is a store brand, so there is the majority of your savings right there... They were once viewed as a poor choice, but have since made a pretty decent name for themselves...

Now, this raises an even bigger point: it's said that nothing can compare to the performance of the EVGA G2 units in their price range, and they do represent a name brand, but I think they are simply pricing themselves to stay in the market because of how low-priced, and how horrible, the first-gen units were...

In my opinion, with Shilka's information on the Photon, there is a really good chance that this:
http://www.newegg.com/Product/Product.aspx?Item=N82E16817182325

is actually this:
http://www.newegg.com/Product/Product.aspx?Item=N82E16817250013&cm_re=astro_power_supply-_-17-250-013-_-Product


----------



## BLOWNCO

Is there any way to set overclock profiles like in AB, so the card clocks up when entering 3D (a game) and downclocks when not in a game, without having to use Afterburner?


----------



## Agent Smith1984

Quote:


> Originally Posted by *BLOWNCO*
> 
> Is there any way to set overclock profiles like in AB, so the card clocks up when entering 3D (a game) and downclocks when not in a game, without having to use Afterburner?


Yes, that's exactly what I did with my 280X... it will also make your Firestrike clocks report correctly.

It's in Settings > "Profiles"; select whichever one you want in the 2D and 3D drop-down menus...


----------



## shilka

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Thanks, Shilka!
> 
> I wonder if any other HighPower 550-1050 W units have popped up on the market under other brandings... they could be the same units.
> I have heard HighPower is hit or miss, no?


It might be based on the HighPower Astro GD (F) or it might be something brand new, but I have no idea, as I have never seen the insides of the Photon.


----------



## BLOWNCO

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yes, that's exactly what I did with my 280X... it will also make your Firestrike clocks report correctly.
> 
> It's in Settings > "Profiles"; select whichever one you want in the 2D and 3D drop-down menus...


How do you do it without using Afterburner?


----------



## tsm106

Quote:


> Originally Posted by *BLOWNCO*
> 
> How do you do it without using Afterburner?


You don't.


----------



## cephelix

Quote:


> Originally Posted by *Mugamat*
> 
> It's about 30 °C, and the PC takes in air right next to the heating radiator. I'd recommend the Gelid Icy Vision heatsink kit: it's cheap, fits perfectly, and keeps the VRM cool with some airflow from the fans.


Bodes well for me; my ambient temps are similar to yours... can't wait to get my R9 290 into the loop... it seems a bit of a waste that one 360 mm and one 240 mm rad are cooling only my CPU.


----------



## pipes

When will a video BIOS editor that works with the R9 290X come out?


----------



## BradleyW

Hoping we get an FC4 driver within the next 16 hours at least.


----------



## BLOWNCO

Quote:


> Originally Posted by *tsm106*
> 
> You don't.


That's a bummer. I can't for the life of me get as good an overclock in AB as I can in Trixx, or get the profiles to actually work when I set them up in AB.


----------



## LandonAaron

Quote:


> Originally Posted by *BradleyW*
> 
> Hoping we get a FC4 driver in the next 16 hours at least.


+1. I have been looking forward to this game all year. I know there is a lot of Ubisoft hate, but I am going to try not to let it get in the way of me enjoying this game.


----------



## BradleyW

Quote:


> Originally Posted by *LandonAaron*
> 
> +1. I have been looking forward to this game all year. I know there is a lot of Ubisoft hate, but I am going to try not to let it get in the way of me enjoying this game.


I absolutely loved the Far Cry series. Can't wait for FC4. I played an early build several months ago at EU Gamer Expo London.


----------



## taem

Quote:


> Originally Posted by *BradleyW*
> 
> I absolutely loved the Far Cry series. Can't wait for FC4. I played an early build several months ago at EU Gamer Expo London.


I enjoyed the hell out of FC3. Can't point to any one thing that was great, and I wouldn't call it one of the all-time greats, but, shrug, it's one of the most fun games I've ever played. And there are a ton of hours of gaming to be had in it. I always wait for Steam sales to pick up titles, but I just might pay full price for FC4. I'm considering getting another 290 because of this; I can't max out FC3 on my one 290, not even close.


----------



## chiknnwatrmln

FC4 is really just like a DLC expansion of FC3.. imo it feels like FC3.5. Don't get me wrong, it's still fun, but it's pretty much FC3 on a new map with a new story and a grappling hook and ultralight helicopter.

Having said that, I really want the CF driver to come out. Running the game on a single card is killing me. Ubi also needs to fix the game a bit; after an hour or two I get some hardcore stuttering.


----------



## LandonAaron

http://www.overclock.net/t/1525279/amd-catalyst-14-11-2-released

FC4 and Dragon Age: Inquisition optimizations with the beta driver. No CrossFire for FC4 though. But it does claim a 50% performance increase in FC4 when using anti-aliasing over the previous driver.


----------



## chiknnwatrmln

Sweet, I checked like an hour ago and those weren't there.


----------



## taem

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> FC4 is really just like a DLC expansion of FC3.. imo it feels like FC3.5. Don't get me wrong, it's still fun, but it's pretty much FC3 on a new map with a new story and a grappling hook and ultralight helicopter.


Sounds great








Quote:


> Having said that, I really want the CF driver to come out. Running the game on a single card is killing me. Ubi also needs to fix the game a bit; after an hour or two I get some hardcore stuttering.


Farg1. No cf support? I've been debating whether to invest further into 290 with xfire and blocks or just save the cash for 390x. Looking at the list of titles that don't support xfire it looks like waiting on 390x might be the better option.

Does fc4 have the frames to render ahead option that fc3 did? That solved the stuttering for me in fc3.


----------



## USlatin

Does someone know if the MSI R9 280x can run three monitors like this:

a. DVI > DVI
b. HDMI > HDMI to DVI adapter > DVI
c. HDMI > HDMI input

and what kind of adapter do I need for monitor B, a regular one or would I need an active adapter?


----------



## Mega Man

You can run up to 3 monitors without an active adapter with a 280X.

With a 7950/7970 you can run 2.


----------



## USlatin

Thanks, and does the adapter degrade the image at all?


----------



## Mega Man

no imo no


----------



## chiknnwatrmln

Quote:


> Originally Posted by *taem*
> 
> Sounds great
> 
> 
> 
> 
> 
> 
> 
> 
> Farg1. No cf support? I've been debating whether to invest further into 290 with xfire and blocks or just save the cash for 390x. Looking at the list of titles that don't support xfire it looks like waiting on 390x might be the better option.
> 
> Does fc4 have the frames to render ahead option that fc3 did? That solved the stuttering for me in fc3.


Not yet, AMD and Ubi are working on getting CF to work properly. I'm sure it will be done soon, considering the game isn't even released yet..

And I don't believe it does, I haven't looked closely enough though.


----------



## Mega Man

meh look what they did with AC

they will just cap it at 30 fps anyway


----------



## neurotix

1 290 Vapor-X


----------



## tsm106

Gah, I'm so sleepy now. Been transferring my rig from the 700D to a Primo. There's so much more to do and I have to finish before the woman gets home tomorrow night. She'll freak to see all the crap I have out modding this lol.


----------



## Mega Man

NICE !


----------



## Sempre

Just got these







two Sapphire R9 290s


----------



## kizwan

Quote:


> Originally Posted by *Mugamat*
> 
> I`m a noobee in OC and just tried to overclock my R9 290 under Kraken G10. Can anybody tell me is that a good result or i should never do it again?)


Unless you can run at least at 1080p, Preset Custom, Quality Ultra & Tessellation Extreme, you should never do this again.








Quote:


> Originally Posted by *neurotix*
> 
> 1 290 Vapor-X
> 
> 
> Spoiler: Warning: Spoiler!


Tessellation OFF?


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> Gah, I'm so sleepy now. Been transferring my rig from the 700D to a Primo. There's so much more to do and I have to finish before the woman gets home tomorrow night. She'll freak to see all the crap I have out modding this lol.


Very nice.


----------



## bond32

Quote:


> Originally Posted by *tsm106*
> 
> Gah, I'm so sleepy now. Been transferring my rig from the 700D to a Primo. There's so much more to do and I have to finish before the woman gets home tomorrow night. She'll freak to see all the crap I have out modding this lol.


Damn you and your creativity... Been on the fence about ordering the primo (again) and decided against it as I didn't think of that second psu location.


----------



## Chopper1591

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I _agree_..... to a _degree_....
> 
> I'm all too familiar with the importance of getting a good PSU (I paid $180 for my OCZ in 2006 and it lasted 8 years!!
> 
> 
> 
> 
> 
> 
> 
> ), but the reviews for the Rosewill units have been great....
> Rosewill is a store brand, so there is the majority of your savings right there..... They were once viewed as a poor choice, but have since made a pretty decent name for themselves.....
> 
> Now, this does draw an even bigger point though..... it is said that nothing can even compare to the performance of the EVGA G2 units in their pricerange, and they do represent a name brand, but I think they are simply pricing themselves to stay in the market because of how lowly priced the first gen units were, and how horrible they were.....
> 
> In my opinion, with Shilka's information on the photon, there is a really good chance that this:
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817182325
> 
> Is actually this:
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817250013&cm_re=astro_power_supply-_-17-250-013-_-Product


Hard to tell just by looking at the two units.
One thing is for sure. They are both made by Sirtec.

Still.
Like you said yourself.
You paid a price premium on an OCZ in the past, which lasted you a good while.

Why risk it by saving a few bucks?
The G2 really is nicely priced. My Corsair was more expensive.

Quote:


> Originally Posted by *tsm106*
> 
> Gah, I'm so sleepy now. Been transferring my rig from the 700D to a Primo. There's so much more to do and I have to finish before the woman gets home tomorrow night. She'll freak to see all the crap I have out modding this lol.


Damn it.

So much room in that case.
I need one now.









I haven't paid much attention to that case before, but that turned PSU design is just plain nice.

Quote:


> Originally Posted by *neurotix*
> 
> 1 290 Vapor-X


Would you be so kind as to make a Fire Strike run at 1000/1500 clocks on your 290?

Thanks.


----------



## Agent Smith1984

That is a lot of EVGA PSU nested in there!!!

2300 watts









Oh, and regarding the PSU: the local Tiger Direct has the XFX Pro 750W Black Edition for $120..... that is a very well-performing Seasonic unit, correct?


----------



## Red1776

Quote:


> Originally Posted by *tsm106*
> 
> Gah, I'm so sleepy now. Been transferring my rig from the 700D to a Primo. There's so much more to do and I have to finish before the woman gets home tomorrow night. She'll freak to see all the crap I have out modding this lol.


Nice T









Now let's see if you get as much grief as I get for having 2.3kW. If you ("you" generically) are a heavy OC'er, I don't think it's overkill.

You going to leave the left panel open, or do you have something planned with the res in that position?


----------



## bond32

Quote:


> Originally Posted by *Red1776*
> 
> Nice T
> 
> 
> 
> 
> 
> 
> 
> 
> Now lets see if you get as much grief as I get having 2.3Kw. If you ("you" generically) are a heavy OC'er I don't think it's overkill.
> You going to leave the left panel open, or do you have something planned with the res in that position?


That location in that case should still allow one to put the side panel on... It has 2x140mm fan mounts on that side panel, can even put a rad there. Another brilliant feature of that case.

Pretty darn good idea putting the PSU there. My only question: how would you get the power cable to it? I have seen 90° adapters; I suppose that is the best option. You might even run it out the back.


----------



## Agent Smith1984

$$$$$$$$$









Nice stuff dude....


----------



## shilka

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That is a lot of EVGA PSU nested in there!!!
> 
> 2300 watts
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh, and in regards to the PSU, the Tiger Direct locally has the XFX Pro 750W Black Edition for $120..... that is a very well performing seasonic unit correct?


Is it the semi or fully modular Black Edition?


----------



## Chopper1591

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That is a lot of EVGA PSU nested in there!!!
> 
> 2300 watts
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh, and in regards to the PSU, the Tiger Direct locally has the XFX Pro 750W Black Edition for $120..... that is a very well performing seasonic unit correct?


Do you mean this one?
XPS-750W-BEF (P1-750B-BEFX)

If so, yeah that is a good unit.
Grab it if you can. Those look to be discontinued.

Read here


----------



## Agent Smith1984

Quote:


> Originally Posted by *shilka*
> 
> Is it the semi or fully modular Black Edition?


It would be this baby right here:
http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=7580853&CatId=2496


----------



## shilka

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It would be this baby right here:
> http://www.tigerdirect.com/applications/SearchTools/item-details.asp?EdpNo=7580853&CatId=2496


That's a Seasonic KM3-based unit; the Corsair AX760/860 and Cooler Master V700/850/1000 use the same platform.


----------



## Klocek001

Please fellow OCN members help me choose from some PSUs I've found on my fav site








Here are the ones in my price range, please just scroll down and tell me which one you'd choose for yourself.

http://www.morele.net/komputery/podzespoly-komputerowe/zasilacze-61/300.00,400.00,,,,,,,,,/1/#product_list

It's gonna power a single 290 (moderate OC, like 1100/1500) with a 4.8GHz 2500K, 8GB RAM and a 1TB HDD. Thx for any help!

I'd choose Be Quiet! PUREPOWER L8 730W CM, please correct me if I'm wrong Shilka.
Also, quality is more important than wattage for me; if a 600W NZXT is better quality than the 730W BQ I'd go for the 600W (I don't know if it is, just trying to explain).


----------



## Jflisk

All this talk about power supplies reminds me I have to find a 1500W power supply (any suggestions are more than welcome) to replace my Thermaltake 1350W (already split the rail). Can't do my 5GHz on the processor any more. Between the R9 290X tri-fire video cards and the FX-9590 at full tilt, it's putting a hurting on my power supply. It started freezing after adding the 3rd card and trying 5GHz. With the processor dropped back down to 4.7 and the video cards overclocked, it runs fine. The WeMo at the wall shows it pulling 1130W at 4.7GHz; I think it's jumping up enough to cause a freeze at 5GHz.


----------



## Agent Smith1984

Hmm, seems PSU discussion has found its way into this thread....

It seems semi-off topic, but I'm the one who started it, my apologies....
It is only because I know how power hungry the 290 is, and it is the entire reason why my last PSU blew up , lol.

Another user with a 290 (I expect to see a TON more people running 290's now that they are sub $200 used, and $250+/- new....)
Maybe there are others who have realized they need more power also.

Could we be looking at starting a 290/290x POWER Thread???
Of course, if there isn't one already.....

I am way late to the party on getting a 290, but that's because I have to keep my PC spending to a min....And I'm not the only one apparently, because I am seeing users pop up several times a day who have just gotten their hands on one, especially now that prices are down so much.


----------



## shilka

Quote:


> Originally Posted by *Klocek001*
> 
> Please fellow OCN members help me choose from some PSUs I've found on my fav site
> 
> 
> 
> 
> 
> 
> 
> 
> Here are the ones in my price range, please just scroll down and tell me which one you'd choose for yourself.
> 
> http://www.morele.net/komputery/podzespoly-komputerowe/zasilacze-61/300.00,400.00,,,,,,,,,/1/#product_list
> 
> It's gonna power a single 290 (moderate OC, like 1100/1500) with a 4,8GHz 2500k, 8GB RAM and 1TB HDD. Thx for any help!
> 
> I'd choose Be Quiet! PUREPOWER L8 730W CM, please correct me if I'm wrong Shilka.
> Also, the quality is more important than the power for me, if a 600W NZXT is better quality than 730W BQ I'd go for 600W (I don't know if it is, just trying to explain).


The Be Quiet! L8 is a HEC-made unit, so no way in hell.
You would be way better off with this Seasonic M12II-based unit:
http://www.morele.net/zasilacz-xfx-xxx-edition-pro-modularny-80plus-bronze-650w-p1-650x-xxb9-511076/
Quote:


> Originally Posted by *Jflisk*
> 
> All this talk about power supplies reminds me I have to find a 1500W power supply (any suggestions are more then welcome) to replace my Thermaltake1350W (already split the rail). Cant do my 5 GHZ on my processor any more. Between then R9 290X trifire videocards and the FX9590 at full tilt its putting a hurting on my power supply. Started freezing after adding the 3rd card and trying 5GHZ. Dropped the processor back down to 4.7 and the videocards overclocked runs fine.With the wemo at wall pulling 1130W that I can see at 4.7GHZ. Think its jumping up enough to cause a freeze at 5GHZ.


EVGA SuperNova G2 or P2 1600 watts then.


----------



## tsm106

Quote:


> Originally Posted by *Red1776*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Gah, I'm so sleepy now. Been transferring my rig from the 700D to a Primo. There's so much more to do and I have to finish before the woman gets home tomorrow night. She'll freak to see all the crap I have out modding this lol.

Nice T








Now lets see if you get as much grief as I get having 2.3Kw. If you ("you" generically) are a heavy OC'er I don't think it's overkill.
You going to leave the left panel open, or do you have something planned with the res in that position?

The res is coupled to the pumps. You can see it more clearly if you open the pic. I will have the cpu dump into the res. Res will be mounted to that drive cage side. I don't need to leave the left door open because it comes with grills for a 240mm. There's the ventilation for the psu.

Quote:


> Originally Posted by *bond32*
> 
> Damn you and your creativity... Been on the fence about ordering the primo (again) and decided against it as I didn't think of that second psu location.


I asked in the Enthoo thread if anyone had done dual 480s and dual PSUs. Apparently no one has. I did get some annoying opinion that the case wasn't very good for this lol. Like I asked for that... Anyways, the PSU snaps into place there, which saves me from having to mount it to the front panel. The curved panel the PSU sits up against has a lip which you will have to flatten; I will use velcro. Btw, the front panel right there is perforated and has a nice big cable hole. I have a 2ft right-angle extension coming, so it will route through that hole under the case to the back. I'll fasten it to the case right under PSU 1. Where the 2nd PSU is atm is really an ideal place imo. It's perfect, works as a pump stand too lol.


----------



## Klocek001

Quote:


> Originally Posted by *shilka*
> 
> The Be Quiet L8 is a HEC made unit so no way in hell.
> EVGA SuperNova G2 or P2 1600 watts then.


That's what I'm on about; I know to look at the OEM, but that doesn't tell me much.
What would be the best choice then? I know XFX are Seasonic and they're good, but the one I have has killed two of my 290s even though it seems to work fine. I'd rather not go with XFX.


----------



## Jflisk

Quote:


> Originally Posted by *shilka*
> 
> The Be Quiet L8 is a HEC made unit so no way in hell.
> You would be way better off with this Seasonic M12II bbased unit
> http://www.morele.net/zasilacz-xfx-xxx-edition-pro-modularny-80plus-bronze-650w-p1-650x-xxb9-511076/
> EVGA SuperNova G2 or P2 1600 watts then.


+1 and Thank you


----------



## shilka

Quote:


> Originally Posted by *Klocek001*
> 
> That's what I'm on about; I know to look at the OEM, but that doesn't tell me much.
> What would be the best choice then? I know XFX are Seasonic and they're good, but the one I have has killed two of my 290s even though it seems to work fine. I'd rather not go with XFX.


How much do you want to spend?
Quote:


> Originally Posted by *Jflisk*
> 
> +1 and Thank you


http://www.overclock.net/t/1492076/1375-1700-watts-comparison-thread#post_22324235


----------



## Klocek001

Quote:


> Originally Posted by *shilka*
> 
> How much do you want to spend?


In PLN, 400 is the upper limit.
That's 120 USD.


----------



## Agent Smith1984

You are going to want every bit of 700 watts for a highly OC'd CPU and an OC'd 290.....

Now, I know people are getting away with running a 550W unit with a single 290 on some Intel systems and whatnot, but I don't think they are pushing very high voltage/clocks.
I say that because I have seen several cases where an overclocked CPU, combined with coolers, a few hard drives, some mid-high speed fans, and a highly overclocked 290, is pulling between 650-700W..... if your PSU is 80 Plus Gold or better, then you can get by with it....

Something about the 290's power draw/voltage ratio is very strange....
With just a touch of voltage and clock speed, this thing goes to sucking some serious juice from the wall......


----------



## shilka

Quote:


> Originally Posted by *Klocek001*
> 
> In PLN, 400 is the upper limit.
> That's 120 USD.


The best unit below 400 with more than 550 watts is this:
http://www.morele.net/zasilacz-cooler-master-v650s-650w-modularny-80-gold-rs-650-amaa-g1-633092/


----------



## tsm106

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Something about the 290's power draw/voltage ratio is very strange....
> With just a touch of voltage and clock speed, this thing goes to sucking some serious juice from the wall......


There's nothing strange about it, actually. The cards are TDP-restricted at stock. The first step is to raise that limit, and the power limit slider does just that. Then when you add in voltage it goes up and up. I think when unlocked and heavily overclocked they use more power than unlocked Tis and Titans, but not much more. All cards suck down power when clocked high; it's not unusual, you know. Hell, even Maxwell goes ape when you OC it.


----------



## Chopper1591

Quote:


> Originally Posted by *Klocek001*
> 
> In PLN, 400 is the upper limit.
> That's 120 USD.


In that price range you should be able to grab an M12II 750W
A G2 will be a bit more.

Although it does depend on the prices where you live.

edit:
Can you buy from the bay?


----------



## Klocek001

Quote:


> Originally Posted by *shilka*
> 
> Best unit below 400 with more then 550 watts is this
> http://www.morele.net/zasilacz-cooler-master-v650s-650w-modularny-80-gold-rs-650-amaa-g1-633092/


thanks for your time.


----------



## Agent Smith1984

Quote:


> Originally Posted by *tsm106*
> 
> There's nothing strange about it actually. The cards are TDP restricted stock. First step is to raise that limit, and that power limit slider does just that. And then when you add in voltage it goes up and up. I think when unlocked and heavily overclocked they use more power than unlocked ti and titans but not much more. All cards when clocked high suck down power, its not unusual you know. Hell even maxwell goes ape when you oc them.


I guess it's unusual to me 'cause I just came from Tahiti..... overclocking Tahiti, and overvolting Tahiti by the same percentages, doesn't yield a similar increase in power draw compared to Hawaii. It seems to scale way higher (worse) on Hawaii.... that's been my experience anyways...


----------



## Klocek001

Is the CM VS550 the same as the V550S? I got lost in their nomenclature







I've read Shilka wrote that it's the best bang for the buck there is. I'm actually thinking of spending less on the PSU and trading my 290, as soon as they send it back from RMA, to get a 970. Ubisoft fan here (GR, FC, AC and you know... the rest. TC I guess), you prolly know why I'm doing this... The price gap between the 550 and 650 should cover the gap between the 970 and 290.


----------



## tsm106

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> There's nothing strange about it actually. The cards are TDP restricted stock. First step is to raise that limit, and that power limit slider does just that. And then when you add in voltage it goes up and up. I think when unlocked and heavily overclocked they use more power than unlocked ti and titans but not much more. All cards when clocked high suck down power, its not unusual you know. Hell even maxwell goes ape when you oc them.
> 
> 
> 
> I guess it's unusual to me 'cause I just came from Tahiti..... overclocking Tahiti, and overvolting Tahiti by the same percentages, doesn't yield a similar increase in power draw compared to Hawaii. It seems to scale way higher (worse) on Hawaii.... that's been my experience anyways...

Tahiti uses a lot of power too. I don't know how you got the impression it didn't. I've posted on numerous occasions data showing one card drawing over 300w at 1300 core. My quad tahiti setup which clocked over 1300, used to eat psus alive until I went to a dual setup. I think maybe most ppl overplay how they overclock and bench and that gives ppl the wrong impression.


----------



## shilka

Quote:


> Originally Posted by *Klocek001*
> 
> Is CM VS550 same as V550S ? I got lost in their nomenclature
> 
> 
> 
> 
> 
> 
> 
> I've read Shilka wrote that's the best bang for buck there is, I'm actually thinking of spending less on PSU and trade my 290 as soon as they send it back to me from RMA to get a 970. Ubisoft fan here (GR,FC,AC and you know... the rest. TC I guess), you prolly know why I'm doing this...The price gap between 550 and 650 should cover the gap between the 970 and 290.


The Cooler Master V550, aka VS550, aka V550S, aka Vanguard 550, is the same unit, yes.
Just don't confuse it with the Corsair VS550, which is trash.


----------



## Klocek001

I heard Corsair has a good AX series, shame they cost so much for a 760W unit.


----------



## shilka

Quote:


> Originally Posted by *Klocek001*
> 
> I heard Corsair has a good AX series, shame they cost so much for a 760W unit.


Too overpriced. The EVGA SuperNova G2 series is better and cheaper, as is the fully modular (not the semi-modular) Cooler Master V, which is based on the same Seasonic KM3 platform as the AX and is often the same price or cheaper.
And the V doesn't have any problems with coil whine, something the AX has had problems with.


----------



## tsm106

lol, it's sooo overpriced. You can get an evga g2 1kw for the price of the corsair ax 760. That's a good deal right? If you deal watch you can get the G2's for cheap.


----------



## Klocek001

Damn, now that I'm reading more of this stuff I'm thinking about splurging on the Vanguard V700; that's gonna be a PSU for years and it seems of almost impeccable quality.


----------



## shilka

Quote:


> Originally Posted by *Klocek001*
> 
> Damn, now that I'm reading more of this stuff I'm thinking about splurging on the Vanguard V700; that's gonna be a PSU for years and it seems of almost impeccable quality.


Seasonic KM3-based unit with a 135mm Protechnic fluid dynamic bearing fan.
Every other KM3-based unit I know of uses 120mm double ball-bearing fans.


----------



## Klocek001

Quote:


> Originally Posted by *shilka*
> 
> Seasonic KM3 based unit with a 135mm Protechnic fluid dynamic bearing fan.
> .


Is that good ? Do I want that ?








In the long run the fan is actually VERY important to me; I want to screw my PSU in and not hear from it for 5 years.


----------



## shilka

Quote:


> Originally Posted by *Klocek001*
> 
> Is that good ? Do I want that ?
> 
> 
> 
> 
> 
> 
> 
> 
> In the long run the fan is actually VERY important to me, I want to screw my PSU in and don't wanna hear from it for 5 years.


I am not a fan expert, but I believe that fluid dynamic bearing fans are quieter than double ball-bearing fans, though double ball-bearing fans will last longer.
By last longer I mean at least 7-8 years; a fluid dynamic bearing should last at least 5 years, if not longer.


----------



## Klocek001

Just quoting a guy from Tom's Hardware:

"Fluid bearings have the advantages of near-silent operation and high life expectancy (comparable to ball bearings), but tend to be the most expensive."

Seems like I've chosen my PSU, the Vanguard V700. More expensive than the others, but after 2 years I won't remember the price, just appreciate the quality and be satisfied with the purchase.


----------



## Agent Smith1984

Quote:


> Originally Posted by *tsm106*
> 
> Tahiti uses a lot of power too. I don't know how you got the impression it didn't. I've posted on numerous occasions data showing one card drawing over 300w at 1300 core. My quad tahiti setup which clocked over 1300, used to eat psus alive until I went to a dual setup. I think maybe most ppl overplay how they overclock and bench and that gives ppl the wrong impression.


I know Tahiti uses a ton of power.... I watched my 280X's power usage go through the roof when heavily overclocking (1270 @ 1.3V / 1800 @ 1.6V).
My 12V rail would bottom out at 11.84......

With the 290, on stock clocks, the 12V rail was dipping down to 11.75, and once I overclocked it with more voltage it was toast.... it ran at about 11.5V for 45 minutes, and then shut down like a good boy.....
I unplugged it to trip the breaker, plugged it back in, hit the switch, and got nothing but a big pop.....

Essentially, both cards use a ton of power, but equivalent clock/voltage increases (actually higher percentages on my 280X) netted a much higher percentage increase in power draw on the 290.....

That struck ME as unusual for a new architecture.... that's all I was getting at


----------



## DolanTheDuck

Do you guys also get GPU usage spikes in Far Cry 4? No matter which settings I use, the game keeps stuttering when I'm running or driving.
R9 290 Tri-X @ latest beta.


----------



## chiknnwatrmln

FC4 seems poorly optimized for AMD cards. I also have crazy stuttering while forcing one GPU. 290x.


----------



## BradleyW

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> FC4 seems poorly optimized for AMD cards. I also have crazy stuttering while forcing one GPU. 290x.


Load stutter or frame latency stutter? Disabling *HT* might fix the stuttering.


----------



## Bertovzki

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Hmm, seems PSU discussion has found it's way into this thread....
> 
> It seems semi-off topic, but I'm the one who started it, my apologies....
> It is only because I know how power hungry the 290 is, and it is the entire reason why my last PSU blew up , lol.
> 
> Another user with a 290 (I expect to see a TON more people running 290's now that they are sub $200 used, and $250+/- new....)
> Maybe there are others who have realized they need more power also.
> 
> Could we be looking at starting a 290/290x POWER Thread???
> Of course, if there isn't one already.....
> 
> I am way late to the party on getting a 290, but that's because I have to keep my PC spending to a min....And I'm not the only one apparently, because I am seeing users pop up several times a day who have just gotten their hands on one, especially now that prices are down so much.


Quote:


> Originally Posted by *Klocek001*
> 
> That's what I'm about, I know to look at OEM but it doesn't tell me duck.
> Whta would be the best choice then ? I know XFX are Seasonic and they're good but the one I have has killed two of my 290s even though it seems to work fine. I'd rather not go with XFX.


IMHO nothing is off topic if it's because you have a 290 that is cooking, throttling, or stressing its PSU — anything caused by having a 290, including whether your air or water cooling (rads, fans) is adequate to cool it.
Some of the experts here may disagree, but I have done a lot of research on PSUs. It started because I wanted less than 190mm length for a 1000W+ PSU, but in the process I looked at every aspect of their performance, plus some clips from JayzTwoCents and the Linus Tech Tips guy on what a PSU actually requires; both said that with their test setups the PSU didn't actually work very hard at all.













But I have also seen an AXi 860W draw 1000W from the wall with 2 x 290 and some i7 chip; this does not say much for the wattage choice, but it does say a lot for the AXi 860.

The conclusion I come to is: if you are running 1 x 290 then 750W is fine; if you are going to run 2 x 290, now or ever, then IMHO 1000W is the bare minimum! (It gives headroom, and you can operate without sitting in the 90%-110% power range, stressing the PSU all the time.)

My system will have an i7 4790K, 2 x 290, a D5 pump, the mobo, an FC5 V3, Darkside UV lights, an NZXT Hue RGB, 10 fans, SSDs and maybe a HDD, so I want headroom and prefer 1200W-1300W.
Also, I have done all my reading on sites like HardOCP (the best IMHO) and JonnyGuru; I'm only interested in PSUs that have a pass from HardOCP, preferably a gold pass.

I have seen someone on here say they have seen 290s draw 450W when OC'd high. So at a nominal full load of 300W each that's 600W, or if the other dude is right, 450W x 2 = 900W just for the 290s?
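That back-of-the-envelope wattage math can be sketched as a quick calculation. All wattage figures below are ballpark assumptions taken from the thread, not measurements, and `psu_headroom` is a made-up helper name:

```python
# Rough PSU sizing sketch for a dual-290 build.
# Every wattage here is an assumption from the thread, not a measured value.

def psu_headroom(psu_watts, component_watts):
    """Return (total draw, remaining headroom as a fraction of PSU capacity)."""
    total = sum(component_watts.values())
    return total, (psu_watts - total) / psu_watts

# Hypothetical build using the thread's ~300 W-per-290 estimate.
build = {
    "r9_290_x2": 2 * 300,          # ~300 W each at full load (thread estimate)
    "cpu_oc": 220,                 # heavily overclocked i7/FX (assumption)
    "board_ram_drives_fans": 80,   # everything else (assumption)
}

total, headroom = psu_headroom(1000, build)
print(f"Total draw: {total} W, headroom: {headroom:.0%}")
# With the 450 W-per-card figure instead, the same 1000 W unit would be
# over budget — which is the argument for 1200 W+ in a dual-card rig.
```

On these numbers a 1000W unit is left with about 10% headroom at full load, i.e. right in the 90%+ stress zone the post above warns about.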


----------



## bond32

Highly recommend checking the marketplace here for a PSU.... I got my second EVGA 1300 G2 for $100 shipped, and it looked unused. Granted, I did tell him I didn't need the cable set; still a heck of a deal if you ask me...


----------



## diggiddi

Guys, would I be correct in saying the average overclock for reference 290s is 1000 to 1100MHz?


----------



## dspacek

Hello, I have a problem with my CrossFire setup.
I have a Sapphire R9 290 Tri-X and a Sapphire R9 290 in my Gigabyte X79-UD3 motherboard, and with CrossFire the performance is half that of a single card. I don't know how to turn both on. Everything is connected OK. The BIOS PCI-E setting is set for 3rd-generation GPUs.
CrossFire is turned on in CCC.
Drivers 14.9 (14.301.1001-140915a-176154C)

HELP me PLS, I'm tired.


----------



## battleaxe

Quote:


> Originally Posted by *diggiddi*
> 
> Guys would I be correct in saying avg overclock for reference 290's are 1000 to 1100 mhz?


I would think most can do 1100MHz on core. Very good 290s can do 1200; great ones can hit 1300. Mine do 1170MHz, so not the greatest, but not bad enough to be upset about either. I think 1100 would be fairly low and most could do that (but maybe not all).


----------



## diggiddi

Quote:


> Originally Posted by *battleaxe*
> 
> I would think most can do 1100mhz on core. Very good 290's can do 1200. Great ones can hit 1300. Mine do 1170mhz, so not the greatest but not bad enough to be upset about either. I think 1100 would be fairly low and most could do that. (but maybe not all)


It seems you have watercooled GPUs in your sig? I'm talking about bone-stock reference GPUs here.


----------



## battleaxe

Quote:


> Originally Posted by *diggiddi*
> 
> It seems you have watercooled gpus in your sig? I'm talking about bone stock Reference gpu's here


Should still apply. Most I gained was about 30mhz on water.

Edit: replace your thermal paste with Gelid Extreme. I gained about 8C from doing that. FYI


----------



## diggiddi

Ok then


----------



## chiknnwatrmln

Ref board should be good for ~1150MHz if you crank the fans and add voltage.

Also, I just found out my 290X is a pretty good clocker; because FC4 does not yet support CF, I've been running just one card. It runs 1200 core stable on +118mV; my 290 required around +175mV for the same speed.


----------



## DividebyZERO

Quote:


> Originally Posted by *dspacek*
> 
> hello, I have a problem with my crossfire setup.
> I have a Sapphire R9 290 Tri-X and a Sapphire R9 290 in my Gigabyte X79 UD3 motherboard, and CrossFire performance is half that of a single card. I don't know how to get both working. Everything is connected OK. The BIOS PCI-E setting is set for 3rd-generation GPUs.
> Crossfire turned on in CCC.
> Drivers 14.9 (14.301.1001-140915a-176154C)
> 
> HELP me PLS, Im tired.


On a fresh boot-up, go into CCC, disable CrossFire, then re-enable it. Then launch a game or benchmark and test it. I've had issues where CF doesn't work right until it's been turned off then back on, mainly on newer drivers. I'm still stuck back on 13.12 for various reasons.


----------



## LandonAaron

When updating Catalyst drivers do you need to uninstall the old ones first or will the installation of the new drivers handle that as part of its setup?


----------



## joeh4384

I always use DDU when uninstalling AMD drivers. I have had no driver issues doing this.

http://www.wagnardmobile.com/DDU/downloads.html


----------



## LandonAaron

Quote:


> Originally Posted by *joeh4384*
> 
> I always use DDU when uninstalling AMD drivers. I have had no driver issues doing this.
> 
> http://www.wagnardmobile.com/DDU/downloads.html


I'd rather not use a 3rd party program unless I have to.


----------



## kizwan

Quote:


> Originally Posted by *LandonAaron*
> 
> When updating Catalyst drivers do you need to uninstall the old ones first or will the installation of the new drivers handle that as part of its setup?


Just uninstall the old ones from Control Panel. I don't use DDU. No need to IMO.


----------



## Regnitto

Quote:


> Originally Posted by *LandonAaron*
> 
> When updating Catalyst drivers do you need to uninstall the old ones first or will the installation of the new drivers handle that as part of its setup?


I just go thru the Catalyst installer, and it prompts to completely remove old AMD drivers if it detects any. I just click ok when it prompts and let it do the work


----------



## chiknnwatrmln

AMD's driver remover does not remove all of AMD's drivers. Last time I had Nvidia (GTX 670), their driver remover did not work fully either..

The only way to fully uninstall drivers is with a program like DDU (I use it for convenience) or to know exactly which files, folders, and registry entries need to be deleted. I have had to do that several times too, it's a pain.

Using Catalyst uninstaller leaves folders, files, and registry entries associated with drivers. Several times when updating drivers I've had to run DDU or manually remove them in safe mode because the Catalyst uninstaller does not do its job fully.
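As a minimal sketch of the "leftovers" problem described above, you can check whether typical AMD driver folders survive an uninstall. The paths listed are common examples (assumptions), not an official or exhaustive list; a tool like DDU covers far more than this, including registry entries:

```python
# Sketch: check a few locations where AMD driver leftovers tend to remain
# after the Catalyst uninstaller runs. The paths are typical examples
# (assumptions), not an authoritative list.
from pathlib import Path

CANDIDATE_LEFTOVERS = [
    r"C:\AMD",                              # driver package extraction folder
    r"C:\Program Files\AMD",
    r"C:\Program Files\ATI Technologies",
]

def find_leftovers(paths):
    """Return the subset of candidate paths that still exist on disk."""
    return [p for p in paths if Path(p).exists()]

if __name__ == "__main__":
    for leftover in find_leftovers(CANDIDATE_LEFTOVERS):
        print("leftover:", leftover)
```

Anything this finds after an "uninstall" is exactly the kind of residue being complained about; registry leftovers need a different approach entirely.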


----------



## JourneymanMike

sorry wrong place...


----------



## kizwan

It probably doesn't remove everything, but the official uninstaller has always worked flawlessly for me. I don't have any driver problems at all.


----------



## chiknnwatrmln

It doesn't cause issues for some people, but for others like me it can make updating drivers a pain. I have had buggy or corrupt driver installations at least half a dozen times in the past year alone, and all of them were fixed by fully uninstalling then reinstalling the drivers.


----------



## tsm106

Just finished the majority of the case transfer. Off to the airport to pick up the woman. Here's a phone pic, lil blurry I think.


----------



## cephelix

Looking good!! Wonder what the missus will say if she knew what you've been up to while she was away


----------



## tsm106

Putting aside her annoyance at my mess, she'd probably lol, like how she heckled me in a video in the AMD How To thread. Women just don't understand.


----------



## Arizonian

Quote:


> Originally Posted by *neurotix*
> 
> 1 290 Vapor-X
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added








Quote:


> Originally Posted by *Sempre*
> 
> Just got these
> 
> 
> 
> 
> 
> 
> 
> two Sapphire R9 290s
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## cephelix

Quote:


> Originally Posted by *tsm106*
> 
> Putting aside her annoyance at my mess, she'd probably lol, like how she heckled me in a video in the AMD How To thread. Women just don't understand.


Lol. Annoyance at messes is totally normal.. gf gets annoyed at my messes as well


----------



## Klocek001

Good news, my 290 is working again. Would you tell me why it could stop working for a while and then get back to normal? Maybe it got a little too warm, then gave me a black screen and no signal. I had already sent an RMA ticket, but it's working again.


----------



## diggiddi

Quote:


> Originally Posted by *tsm106*
> 
> Putting aside her annoyance at my mess, she'd probably lol, like how she heckled me in a video in the AMD How To thread. *Women just don't understand*.


LOL


----------



## Chopper1591

Some last advice needed before I go for the kill and buy my gpu upgrade.


Sapphire 290 Tri-x OC 8 months old: 210 euro
Asus 290X DCUII OC 10 months old(no invoice): 260 euro

Which of the two should I grab?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Chopper1591*
> 
> Some last advice needed before I go for the kill and buy my gpu upgrade.
> 
> 
> Sapphire 290 Tri-x OC 8 months old: 210 euro
> Asus 290X DCUII OC 10 months old(no invoice): 260 euro
> 
> Which of the two should I grab?


Tri-X hands down.

Direct CU II sucks on Hawaii......290x just means it will overheat quicker


----------



## Chopper1591

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Tri-X hands down.
> 
> Direct CU II sucks on Hawaii......290x just means it will overheat quicker


If Bilko says it.

The tri-x will probably clock the same anyway.


----------



## vallonen

Tri-X


----------



## Chopper1591

Quote:


> Originally Posted by *vallonen*
> 
> Tri-X


It's funny that the Tri-X is still better, even though we are comparing a 290 with a 290X.


----------



## rdr09

Quote:


> Originally Posted by *Chopper1591*
> 
> It's funny that the tri-x is still better. Even though we are comparing a 290 with a 290x.


how much are those? if your system is watercooled, then i guess you'll watercool the gpu, too. if so, then just get a used reference. either 290X or 290. got my second 290 for $187 and a full block for $60. But, if you like the games bundled, then go for it.


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> how much are those? if your system is watercooled, then i guess you'll watercool the gpu, too. if so, then just get a used reference. either 290X or 290. got my second 290 for $187 and a full block for $60. But, if you like the games bundled, then go for it.


ATM I am not going to put the gpu under water.
Will need to buy more stuff to do that.
The 360 UT60 won't be sufficient to cool both the 290(x) and the 5ghz fx-8320.
On top of that, the gpu block will set me back around 100 euro.

So I think the tri-x is the way to go.


----------



## rdr09

Quote:


> Originally Posted by *Chopper1591*
> 
> ATM I am not going to put the gpu under water.
> Will need to buy more stuff to do that.
> The 360 UT60 won't be sufficient to cool both the 290(x) and the 5ghz fx-8320.
> On top of that, the gpu block will set me back around 100 euro.
> 
> So I think the tri-x is the way to go.


well, the Tri X is of reference design, so it will be easy to watercool. i think the 290s run much cooler than the 290X unless you get a good clocker and oc'ed much. i only have 3 120 rads = to a 360 rad to cool 2 290 and my i7 @ 4.5. but my ambient is low. may have to add another 120 or 240 in the future in a bigger case.


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> well, the Tri X is of reference design, so it will be easy to watercool. i think the 290s run much cooler than the 290X unless you get a good clocker and oc'ed much. i only have 3 120 rads = to a 360 rad to cool 2 290 and my i7 @ 4.5. but my ambient is low. may have to add another 120 or 240 in the future in a bigger case.


Look @ my rig in the sig.
It ain't in the case.









I didn't know the Tri-X was reference. Kinda expected it not to be, just like my 7950 Vapor-X.
Maybe I will put it under water in the (near) future.

But... lets just see how it works out of the box first.

Btw...
I just discovered that the guy I want to buy from has sold more of the same (saw it in his selling feedback).
So I asked him if he could explain that to me.
He did confess that it was in fact an ex-miner card...
But hey, I am just going for it I guess. It's a nice deal and there is still 14 months of warranty left.
So I should be good, right?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Chopper1591*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> well, the Tri X is of reference design, so it will be easy to watercool. i think the 290s run much cooler than the 290X unless you get a good clocker and oc'ed much. i only have 3 120 rads = to a 360 rad to cool 2 290 and my i7 @ 4.5. but my ambient is low. may have to add another 120 or 240 in the future in a bigger case.
> 
> 
> 
> Look @ my rig in the sig.
> It ain't in the case.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I didn't know the Tri-X was reference. Kinda expected it not to be, just like my 7950 Vapor-X.
> Maybe I will put it under water in the (near) future.
> 
> But... lets just see how it works out of the box first.
> 
> Btw...
> I just discovered that the guy from who I want to buy sold more of the same(saw it in his selling feedback).
> So I asked him if he could explain that to me.
> He did confess that it was in fact an ex miner card...
> But hey, I am just going for it I guess. It's just a nice deal and there is still 14 months of warranty left.
> So I should be good, right?
Click to expand...

Tri-X is a Ref PCB, Vapor-X is not (to top it off, the 290 and 290X Vapor-X are different PCBs anyway)

But either way unless you are benching or going above 1440p the 290 has more than enough power


----------



## Nwanko

Hey guys, I have a quick question. What is the MAX temperature for gaming on a Gigabyte 290X WF3 OC? I'm now at 1100 core / 1400 mem, MSI AB settings: +13mV / +50 power limit. Temps in gaming are around 82C core with the fan at around 80% (performance mode BIOS switch). ASIC: 73.7%


----------



## Chopper1591

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Tri-X is a Ref PCB, Vapor-X is not (to top it off, the 290 and 290X Vapor-X are different PCBs anyway)
> 
> But either way unless you are benching or going above 1440p the 290 has more than enough power


Thought so, yeah.

For the moment I am not seeing 1440p happening in the near future.
My 7950 has been plenty for the past few months... but recent games just seem to like a lot of resources, and 3GB isn't going to cut it anymore.
And if it needs to, the 290 will probably clock a bit higher than stock anyway.


----------



## Agent Smith1984

In any case.... I was shocked at how well my tri-x overclocks for a reference design....

I always thought the edge went to non-reference boards like the Asus, with its DIGI+ power VRM design and whatnot, but I am seeing my Tri-X clock better than many of the Vapor-X and Asus boards.....
I guess there are some big variances in overclocking with Hawaii whether it's reference PCB or not.......
I think it's the memory I got lucky on, because I see several people not able to get past 1400+/-

I'm not done pushing the core yet, not until I get my PSU Friday.....

Is +200mv pretty safe for daily driving as long as temps are good? My card runs 74c MAX with 80c VRM on +150mv.
I was running 1165 @ +150 completely stable when the PSU crapped out. I am hoping for 1200.

Also, isn't there some benching BIOS going around (I believe PT3?) that will allow for 1.4V core? I hear that isn't safe though?
Is it risky because of possible temp issues, or is it just a dangerous BIOS regardless?

I definitely wouldn't mind trying to break some FireStrike/Thuban records (I still hold the record for the 280X on Thuban), lol.
Not that that's a big deal or anything...








Mostly I'm just a cheapskate who is still putting new GPUs on an old CPU... but getting good results doing it though!


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> In any case.... I was shocked at how well my tri-x overclocks for a reference design....
> 
> I always thought the edge went to non-reference boards like the Asus, with it's DIGI power VRM design and what not, but I am seeing my tri-x clock better than many of the vapor-x and Asus boards.....
> I guess there are some big variances in overclocking with Hawaii whether it's reference PCB or not.......


The Ref design is very robust and quite powerful. Not really sure why people seem to think otherwise, tbh.


----------



## Agent Smith1984

It was just the impression I had got from 280x I guess.... my non-reference Asus TOP 280x seemed to overclock a lot better than reference 7970/280x's.....

I thought it was the power delivery (12 phase DIGI VRM), but I am beginning to think now that it was less a board thing, and more a matter of core binning.

I have been totally impressed with the 290 so far..... I honestly think that the new pricing I am seeing makes them just as good a value in the $250-270 range as the GTX 970 is in the $330-350 range......

Agreed?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It was just the impression I had got from 280x I guess.... my non-reference Asus TOP 280x seemed to overclock a lot better than reference 7970/280x's.....
> 
> I thought it was the power delivery (12 phase DIGI VRM), but I am beginning to think now that it was less a board thing, and more a matter of core binning.
> 
> I have been totally impressed with the 290 so far..... I honestly think that the new pricing I am seeing makes them just as good a value in the $250-270 range as the GTX 970 is in the $330-350 range......
> 
> Agreed?


Well, the 280X was a more mature process and there were no reference ones to compare it to (although Tsm's are pretty beastly).

But yeah, the 290 is a great card for the price, the GTX 970 is about $60 more than a 290x here, kinda sad really....


----------



## rdr09

Quote:


> Originally Posted by *Chopper1591*
> 
> Look @ my rig in the sig.
> It ain't in the case.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I didn't know the tr-x was refference. Kinda expected it not to be, just like my 7950 vapor-x.
> Maybe I will put it under water in the (near) future.
> 
> But... lets just see how it works out of the box first.
> 
> Btw...
> I just discovered that the guy from who I want to buy sold more of the same(saw it in his selling feedback).
> So I asked him if he could explain that to me.
> He did confess that it was in fact an ex miner card...
> But hey, I am just going for it I guess. It's just a nice deal and there is still 14 months of warranty left.
> So I should be good, right?


that is a good plan. make sure it works out of the box first. my second one was a miner's card supposedly, but i think it was never used. when i took the stock cooler off, there was not a trace of dust. i don't think a dust blower can do such a good job. not a scratch on the pcie connector.

keep an eye out for a used full waterblock.


----------



## PillarOfAutumn

Quick question. How long does it take for EKWB to ship their products? I bought a water block for a 290 last week and since then, it has gone in and out of stock, but my order still says "processing". Does anyone have experience with buying directly from ekwb?


----------



## Agent Smith1984

I have a question about running water on 290's.....

What kind of overclocking results are people REALLY getting??

For example, if my card is doing 1150/1600 on air with low 70's on core, and 80c on VRM, do I have a ton more overclocking potential on water, or am I to assume that the card is not dealing with a thermal ceiling, and wouldn't benefit much either way.....

Not sure if it matters or not, but I have an 80.2% ASIC quality.....

For the first time, I am actually interested in water cooling my GPU......


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I have a question about running water on 290's.....
> 
> What kind of overclocking results are people REALLY getting??
> 
> For example, if my card is doing 1150/1600 on air with low 70's on core, and 80c on VRM, do I have a ton more overclocking potential on water, or am I to assume that the card is not dealing with a thermal ceiling, and wouldn't benefit much either way.....
> 
> Not sure if it matters or not, but I have an 80.2% ASIC quality.....
> 
> For the first time, I am actually interested in water cooling my GPU......


What voltage? Water will let you safely push higher voltage with lower temps.

Highest I would go daily is +180 or so. My ref 290 went from 1100 daily (noise limited) to 1200 daily and much quieter.

Imo the major benefit of watercooling is lower temps and noise. You gain some OCing headroom as opposed to air but a card that doesn't clock well on air won't clock that much better under water.


----------



## LandonAaron

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> What voltage? Water will let you safely push higher voltage with lower temps.
> 
> Highest I would go daily is +180 or so. My ref 290 went from 1100 daily (noise limited) to 1200 daily and much quieter.
> 
> Imo the major benefit of watercooling is lower temps and noise. You gain some OCing headroom as opposed to air but a card that doesn't clock well on air won't clock that much better under water.


Also, those lower temps will keep you from throttling. So say you could overclock to 1200 with +200mV on the stock cooler; you may not actually be able to run at 1200 for very long, as your card quickly shoots up to 94 degrees and you find yourself throttling and running much closer to stock.


----------



## Agent Smith1984

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> What voltage? Water will let you safely push higher voltage with lower temps.
> 
> Highest I would go daily is +180 or so. My ref 290 went from 1100 daily (noise limited) to 1200 daily and much quieter.
> 
> Imo the major benefit of watercooling is lower temps and noise. You gain some OCing headroom as opposed to air but a card that doesn't clock well on air won't clock that much better under water.


Well, I had stopped at 1165 on +175 when my PSU blew up, but I think it will do much better than that with clean, steady power. My 12V rail was hitting 11.5V, and vcore was drooping as low as 1.15, yet it was perfectly stable and artifact-free at those clocks for an hour before the PSU died.

I'm hoping for 1200 core @ +200 on air...... I think I have a pretty good clocking card. I'm going to research a good used/entry level water solution, and see if the temps/noise reduction would be worth it.
I'm assuming a full-cover block vs. something like the Kraken is the only thing worth doing IF I am already seeing great air-cooled temps, right?


----------



## Chopper1591

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> What voltage? Water will let you safely push higher voltage with lower temps.
> 
> Highest I would go daily is +180 or so. My ref 290 went from 1100 daily (noise limited) to 1200 daily and much quieter.
> 
> Imo the major benefit of watercooling is lower temps and noise. You gain some OCing headroom as opposed to air but a card that doesn't clock well on air won't clock that much better under water.


Seems legit, have to agree on this.

And for me going under water was mostly because of the lower noise. And because of aesthetics.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, I had stopped at 1165 on +175 when my PSU blew up, but I think it will do much better than that with clean, steady power. My 12V rail was hitting 11.5V, and vcore was drooping as low as 1.15, yet it was perfectly stable and artifact-free at those clocks for an hour before the PSU died.
> 
> I'm hoping for 1200 core @ +200 on air...... I think I have a pretty good clocking card. I'm going to research a good used/entry level water solution, and see if the temps/noise reduction would be worth it.
> I'm assuming a full plate vs something like the kraken is the only thing worth doing IF I am already seeing great air cooled temps right?


Edit: I get no throttling whatsoever with 50% PL and +200mV..... I was told the 290s don't throttle until 90C+/-???
My card never broke 74C at roughly 70% fan with +175mV @ 1165MHz.
It was getting into the realm of.... LOUD though.


----------



## Chopper1591

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Edit: I get no throttling whatsoever with 50% PL and 200mv..... I was told the 290's don't throttle until 90C+/-???
> My card never broke 74c at roughly 70% fan with +175mv @ 1165MHz
> It was getting into the realm of.... LOUD though,


I was told, I don't remember by whom, that the Tri-X uses a different PowerTune (1.0 vs 2.0).
This makes it react differently to temps.

Most cards run at or around 94C and adjust fan speed to stay around that.
Then if your card needs like 60-65% fan to stay below that threshold, it will start to throttle.

The Tri-X doesn't do that, and thus stays lower in temp but is louder.

Have a read here:
http://www.overclock.net/t/1452250/semiaccurate-what-is-amd-s-powertune-2-0-and-what-does-it-do
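The behavior described above (the card lets temps climb toward a target, spins the fan up to hold it there, and only cuts clocks when the fan alone can't cope) can be illustrated with a toy control loop. This is NOT AMD's actual PowerTune algorithm; the 94C target matches the thread's discussion, but the step sizes and recovery logic are purely illustrative:

```python
# Toy illustration of a PowerTune-style temperature target, not AMD's
# actual firmware logic: raise the fan first, and only drop the core
# clock once the fan is maxed and the card is still at the target temp.

TARGET_C = 94      # reference 290/290X temperature target discussed above
FAN_MAX = 100      # percent

def powertune_step(temp_c, fan_pct, clock_mhz):
    """Return adjusted (fan_pct, clock_mhz) for one control tick."""
    if temp_c >= TARGET_C:
        if fan_pct < FAN_MAX:
            fan_pct = min(FAN_MAX, fan_pct + 5)   # try more airflow first
        else:
            clock_mhz = max(300, clock_mhz - 13)  # then shed clocks (throttle)
    elif temp_c < TARGET_C - 5 and clock_mhz < 1000:
        clock_mhz += 13                           # recover clocks once cool
    return fan_pct, clock_mhz

# Hot card with the fan already maxed: the clock gets cut.
print(powertune_step(95, 100, 1000))  # -> (100, 987)
```

A cooler like the Tri-X that simply runs a more aggressive fan curve spends its time in the first branch (more airflow) instead of the second (throttling), which is exactly the lower-temp-but-louder trade-off described above.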


----------



## Agent Smith1984

Quote:


> Originally Posted by *Chopper1591*
> 
> I was told, don't remember by who, that the tri-x uses a different powertune 1.0/2.0.
> This makes it react different to temps.
> 
> Most cards run at or around 94c and adjust fan speed to stay around that.
> Then if your card needs like 60-65% fan to stay below that threshold, it will start to throttle.
> 
> Tri-x doesn't do that. And thus stays lower in temp but is louder.


You may be thinking of the Vapor-X....

Tri-X is totally reference and uses the standard temp throttle.
It's basically a totally reference 290 with an AWESOME cooler slapped on it. I tried backing the cooler down to 35%, where it's literally inaudible over my other fans, but it runs at about 87C. It still never throttled, but I figured it'd be better to keep temps down with a more aggressive fan profile.


----------



## Chopper1591

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You may be thinking of the Vapor-X....
> 
> Tri-X is totally reference and uses the standard temp throttle.
> It's basically a totally reference 290 with an AWESOME cooler slapped on it. I tried backing the cooler down to 35%, where it's literally inaudible over my other fans, but it runs at about 87C. It still never throttled, but I figured it'd be better to keep temps down with a more aggressive fan profile.


I will have to let someone else fill in on this.
But I am pretty sure it was about the Tri-x.

Also, have a read here. It somewhat discusses it:
http://www.anandtech.com/show/7601/sapphire-radeon-r9-290-review-our-first-custom-cooled-290/2

If you do some reading on various reviews, you can see multiple 290s running 94C, reference and custom.


----------



## LandonAaron

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, I had stopped at 1165 on +175 when my PSU blew up, but I think it will do much better than that with clean steady power. My 12v rail was hittiing 11.5v, and vcore was drooping as low as 1.15, yet it was perfectly stable and artifact free at those clocks for an hour before the PSU died.
> 
> I'm hoping for 1200 core @ +200 on air...... I think I have a pretty good clocking card. I'm going to research a good used/entry level water solution, and see if the temps/noise reduction would be worth it.
> I'm assuming a full plate vs something like the kraken is the only thing worth doing IF I am already seeing great air cooled temps right?


I am using a custom water loop with a universal block (EK VGA Supremacy) just for the GPU portion of the card, and the Gelid Icy Vision for the VRMs. I also have a PCI-slot fan blowing on the card to assist the VRM heatsinks, and this setup works really well for me. Before going custom loop I had an Arctic Accelero Hybrid cooler and it couldn't keep up; I kept hitting 94 and throttling. I think I may have had a bad mount though, as I used that cooler on several Nvidia cards with no issues (GTX 580, GTX 670, and GTX 770). But all those cards are lower TDP than the 290X.

If you go with something like the Kraken G10, I would still suggest either a 240mm radiator, or something like the Corsair H80 with a double-thick 120 rad. The Arctic Accelero I had had a standard-thickness 120 rad and it couldn't keep up.

Here's a pic of my current setup:


----------



## Chopper1591

Quote:


> Originally Posted by *LandonAaron*
> 
> I am using a custom water loop with a universal block (EK VGA Supremacy) just for the GPU portion of the card, and the Gelid Icy Vision for the VRMs. I also have a PCI-slot fan blowing on the card to assist the VRM heatsinks, and this setup works really well for me. Before going custom loop I had an Arctic Accelero Hybrid cooler and it couldn't keep up; I kept hitting 94 and throttling. I think I may have had a bad mount though, as I used that cooler on several Nvidia cards with no issues (GTX 580, GTX 670, and GTX 770). But all those cards are lower TDP than the 290X.
> 
> If you go with something like the Kraken G10, I would still suggest either a 240mm radiator, or something like the Corsair H80 with a double-thick 120 rad. The Arctic Accelero I had had a standard-thickness 120 rad and it couldn't keep up.
> 
> Here's a pic of my current setup:
> 
> 
> Spoiler: Warning: Spoiler!


That's a smart drain spot.
I like it.


----------



## Agent Smith1984

Nice stuff bro!
You got a FireStrike on that thing? Curious to see how that ol' i7 does on physics.


----------



## Chopper1591

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Nice stuff bro!
> You gotta firesrike on that thing? Curious to see how that ol' i7 does on physics.


Not TOO great, I think.

But I agree with you. I am always curious about scores.


----------



## kizwan

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Quick question. How long does it take for EKWB to ship their products? I bought a water block for a 290 last week and since then, it has gone in and out of stock, but my order still says "processing". Does anyone have experience with buying directly from ekwb?


For me it took almost a week for them to process my order (from order to shipment). You'll get shipment notification via email soon.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Chopper1591*
> 
> Not TOO great I think.
> 
> But I agree with you. I am Always curious to scores.


Honestly, the i7 965 @ 4GHz is probably right in line with an FX-8350 @ 4.5 +/-


----------



## Chopper1591

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Honestly, the i7 965 @ 4GHz is probably right inline with an FX8350 @ 4.5+/-


Lets find out:


----------



## LandonAaron

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Nice stuff bro!
> You gotta firesrike on that thing? Curious to see how that ol' i7 does on physics.


I just upgraded to the i7-4790K and I haven't updated my sig rig yet. My i7-965 at 4.2GHz gave me an identical overall FireStrike score to my i7-4790K at 4.7GHz. That is the overall score though; I don't know what the specific physics score was. I basically lost 3 hard drives to a short circuit last week, plus a ton of notes, work files, and all sorts of stuff. It really sucked, and I am starting all over now with a completely empty hard drive.

A side note: if you have a modular power supply, always check that the cables are pinned correctly before you plug them in.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Chopper1591*
> 
> Lets find out:


My thuban can break 9,000 @ 4.2GHz.....

The 83** needs around 4.7GHz to break 9,000

The i7 965 @ 4.2GHz scores over 10,000

The i5 4690k needs around 4.6GHz to break 9,000


----------



## Chopper1591

Quote:


> Originally Posted by *Agent Smith1984*
> 
> My thuban can break 9,000 @ 4.2GHz.....
> 
> The 83** needs around 4.7GHz to break 9,000
> 
> The i7 965 @ 4.2GHz scores over 10,000
> 
> The i5 4690k needs around 4.6GHz to break 9,000


Yeah, Thuban's are pretty epic.









I have to look at the core usage sometime while running FireStrike. It looks like multithreading isn't working optimally there.

Anyway, performance per dollar is pretty nice on the fx-83x0's if you ask me.


----------



## rdr09

Quote:


> Originally Posted by *DampMonkey*
> 
> Installed 10-28-2013
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


is that the carbide case?


----------



## tsm106

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I have a question about running water on 290's.....
> 
> What kind of overclocking results are people REALLY getting??
> 
> For example, if my card is doing 1150/1600 on air with low 70's on core, and 80c on VRM, do I have a ton more overclocking potential on water, or am I to assume that the card is not dealing with a thermal ceiling, and wouldn't benefit much either way.....
> 
> Not sure if it matters or not, but I have an 80.2% ASIC quality.....
> 
> For the first time, I am actually interested in water cooling my GPU......


On water, OC becomes easier but it still comes down to luck, skill, and cojones. ASIC is not really relevant.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> You may be thiking of vapor-x....
> 
> Tri-x is totally reference, uses the standard temp throttle.
> It's basically a totally reference 290 with an AWESOME cooler slapped on it. I tried backing the cooler down to 35% where it's literally inaudible over my other fans, but it runs at about 87c, It still never throttled, but I figured it'd be better to keep temps down with a more aggressive fan profile.


The Tri-X is not reference. It is actually a custom card that uses the reference layout. And IIRC the later Tri-X cards cannot use a reference BIOS anymore since they have changed things under the hood. That is probably their main disadvantage, since they cannot use the unlocked ref BIOSes, which are needed to break 1300.


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> is that the carbide case?


OMG
I love black/white.
Quote:


> Originally Posted by *tsm106*
> 
> On water, oc becomes easier but it still comes down to luck, skill, and cojones. ASIC is not really relevant.
> The Tri-X is not reference. It is actually a custom card that uses the reference layout. And iirc the later Tri-X cards cannot use a reference bios anymore since they have changed things under the hood. That is probably its main disadvantage, since it cannot use the unlocked ref BIOSes, which are needed to break 1300.


Was it you?
Informing me about the PowerTune thing on the Tri-x cards?

@ Agent Smith:

Care to post a run of Cinebench?
Here's one with my current settings.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Chopper1591*
> 
> OMG
> I love black/white.
> Was it you?
> Informing me about the PowerTune thing on the Tri-x cards?
> 
> @ Agent Smith:
> 
> Care to post a run of Cinebench?
> Here's one with my current settings.


I can't post a new one until I get my PSU put back in Friday, I'm on my work rig right now (FX-6200)....

At 4.2 I hit 7.4 for sure though









Just looked and saw you are running 15; I haven't used that one yet. Do you have an 11.5 run?


----------



## tsm106

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It was just the impression I had got from 280x I guess.... my non-reference Asus TOP 280x seemed to overclock a lot better than reference 7970/280x's.....
> 
> I thought it was the power delivery (12 phase DIGI VRM), but I am beginning to think now that it was less a board thing, and more a matter of core binning.
> 
> I have been totally impressed with the 290 so far..... I honestly think that the new pricing I am seeing makes them just as good a value in the $250-270 range as the GTX 970 is in the $330-350 range......
> 
> Agreed?


Sorry, not meaning to nitpick your posts, but dude, you're all wrong. I have not seen one Asus DC/Matrix card that I didn't crush (lol, ok, a bit of hyperbole) with a reference card, whether it be on Tahiti or Hawaii.


----------



## Agent Smith1984

Quote:


> Originally Posted by *tsm106*
> 
> Sorry, not meaning to nitpick your posts, but dude, you're all wrong. I have not seen one Asus DC/Matrix card that I didn't crush (lol, ok, a bit of hyperbole) with a reference card, whether it be on Tahiti or Hawaii.


I have seen reference 7950/70/280x cards all clock great too, but my 280x at 1250/1800 was definitely a better overclocker than most reference 280x's that users right here on OCN are posting about in the owners club.... just go look









I know there were water users running 1.35-1.4 and getting 1300-1400 core clock on some of the older 79** reference cards, but for out-of-box potential and air cooling, the DC/Matrix and Toxic cards were the best overclocking cards I have seen reviewed and posted about on here..... I have also done BIOS modding on reference 7950's in the past, and found that with hynix memory, I could get 1800+ memory no problem, but on three different reference 79** cards at 1.3v, all they did was 1200MHz and cook eggs..... again, that's out of the box...

The whole point of my post was to say that I am impressed with the reference board, and think that maybe the non-reference boards being better for OC is BS, and it all comes down to chip binning, and luck of the silicon....


----------



## Klocek001

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, I had stopped at 1165 on +175 when my PSU blew up, but I think it will do much better than that with clean steady power. My 12v rail was hitting 11.5v, and vcore was drooping as low as 1.15, yet it was perfectly stable and artifact free at those clocks for an hour before the PSU died.
> 
> I'm hoping for 1200 core @ +200 on air...... I think I have a pretty good clocking card. I'm going to research a good used/entry level water solution, and see if the temps/noise reduction would be worth it.
> I'm assuming a full plate vs something like the kraken is the only thing worth doing IF I am already seeing great air cooled temps right?


whoa that's crazy to go +175mv on air, I wouldn't do that. btw that's a lot of voltage you need to push, mine was stable at 1140 on stock vcore, needed just +81mv for 1200.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Klocek001*
> 
> whoa that's crazy to go +175mv on air, I wouldn't do that. btw that's a lot of voltage you need to push, mine was stable at 1140 on stock vcore, needed just +81mv for 1200.


The 175 was just testing.... I started at 200 and decided to work my way down








I don't see the dangers of +200 if the temps are in the low 70's though?
I plan on running the card at +200 24/7 with max potential clocks....

I've run every video card I've ever had dating back to the Ti 4200.... BALLS TO THE WALL


----------



## tsm106

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Sorry, not meaning to nitpick your posts, but dude, you're all wrong. I have not seen one Asus DC/Matrix card that I didn't crush (lol, ok, a bit of hyperbole) with a reference card, whether it be on Tahiti or Hawaii.
> 
> 
> 
> I have seen reference 7950/70/280x cards all clock great too, but my 280x at 1250/1800 was definitely a better overclocker than most reference 280x's that users right here on OCN are posting about in the owners club.... just go look
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I know there were water users running 1.35-1.4 and getting 1300-1400 core clock on some of the older 79** reference cards, but for out-of-box potential and air cooling, the DC/Matrix and Toxic cards were the best overclocking cards I have seen reviewed and posted about on here..... I have also done BIOS modding on reference 7950's in the past, and found that with hynix memory, I could get 1800+ memory no problem, but on three different reference 79** cards at 1.3v, all they did was 1200MHz and cook eggs..... again, that's out of the box...
> 
> The whole point of my post was to say that I am impressed with the reference board, and think that maybe the non-reference boards being better for OC is BS, and it all comes down to chip binning, and luck of the silicon....
Click to expand...

I think you're the first person I've seen bragging about their DC2, Asus card.


----------



## Agent Smith1984

Quote:


> Originally Posted by *tsm106*
> 
> I think you're the first person I've seen bragging about their DC2, Asus card.


ROFL, I bet I am!!!
They are nightmares


----------



## Chopper1591

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I can't post a new one until I get my PSU put back in Friday, I'm on my work rig right now (FX-6200)....
> 
> At 4.2 I hit 7.4 for sure though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just looked and saw you are running 15; I haven't used that one yet. Do you have an 11.5 run?


Here is a shot with results from various runs:


----------



## wermad

Sapphire 290 on its way from the egg











Crossfire will be added down the road. Time to sell em 280X's.


----------



## battleaxe

Quote:


> Originally Posted by *wermad*
> 
> Sapphire 290 on its way from the egg
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Crossfire will be added down the road. Time to sell em 280X's.


Ahhh.... nice!

Me wants three of them. I must resist for Maxwell or 390X though. Must resist.


----------



## neurotix

Arizonian,

Please update me to 2x Sapphire R9 290 Vapor-X:





Thanks.


----------



## wermad

Quote:


> Originally Posted by *battleaxe*
> 
> Ahhh.... nice!
> 
> Me wants three of them. I must resist for Maxwell or 390X though. Must resist.


Hehe, me too. Hopefully, these are still reference (planned water loop).


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, I had stopped at 1165 on +175 when my PSU blew up, but I think it will do much better than that with clean steady power. My 12v rail was hitting 11.5v, and vcore was drooping as low as 1.15, yet it was perfectly stable and artifact free at those clocks for an hour before the PSU died.
> 
> I'm hoping for 1200 core @ +200 on air...... I think I have a pretty good clocking card. I'm going to research a good used/entry level water solution, and see if the temps/noise reduction would be worth it.
> I'm assuming a full plate vs something like the kraken is the only thing worth doing IF I am already seeing great air cooled temps right?


If the lower sound and temps are worth it to you then by all means go for it. I would definitely recommend building a full loop if you're up to it, do your homework first though. It doesn't hurt to read up on things like how to prevent galvanic corrosion, algae, etc. Depending on your budget you can look into things like getting a backplate (aesthetics + minor cooling benefits), putting your CPU under water, and high-end thermal pads for VRMs.

If you're more interested in using a Kraken-esque setup, you'll obviously need a closed loop cooler and some VRM and VRAM heatsinks. Because you're running such high voltage, I'm not too sure how well different heat sinks on the main VRM's would work out.

I personally would not run much more voltage than that for a daily OC though. I did run +175mv daily for ~11 months with no issues on an early reference card, however.


----------



## Draygonn

Is it ok to run a 290 without heatsinks on the memory? I'm trying to reuse my Gelid Icy Vision. I have the VRMs covered, but the heatsink tape isn't sticking to the memory. My 480's memory ran cool and didn't have heatsinks, so I'm hoping the same can be done here.


----------



## cephelix

Quote:


> Originally Posted by *Draygonn*
> 
> Is it ok to run a 290 without heatsinks on the memory? I'm trying to reuse my Gelid Icy Vision. I have the VRMs covered, but the heatsink tape isn't sticking to the memory. My 480's memory ran cool and didn't have heatsinks, so I'm hoping the same can be done here.


Do correct me if I'm wrong, but you should be able to. Just need a fan blowing at it. The VRAM doesn't get too hot; it's the VRMs, and specifically VRM 1, that get really hot.


----------



## Chopper1591

Quote:


> Originally Posted by *Draygonn*
> 
> Is it ok to run a 290 without heatsinks on the memory? I'm trying to reuse my Gelid Icy Vision. I have the VRMs covered, but the heatsink tape isn't sticking to the memory. My 480's memory ran cool and didn't have heatsinks, so I'm hoping the same can be done here.


Quote:


> Originally Posted by *cephelix*
> 
> Do correct me if I'm wrong, but you should be able to. Just need a fan blowing at it. The VRAM doesn't get too hot; it's the VRMs, and specifically VRM 1, that get really hot.


I am thinking the same thing here.

But...
Buying some ram sinks and/or thermal tape now isn't going to cost you much.
So you might as well play it safe here.


----------



## cephelix

Quote:


> Originally Posted by *Chopper1591*
> 
> I am thinking the same thing here.
> 
> But...
> Buying some ram sinks and/or thermal tape now isn't going to cost you much.
> So you might as well play it safe here.


Agreed.... that's what I'll be doing for mine as well, since I'm planning to buy some stuff from FCPU; might as well get them together and be on the safe side instead of having to order, ship, and wait again


----------



## Chopper1591

God damnit.

Forgot to do measurements...
Bought the 290 Tri-X yesterday, replacing my 7950 Vapor-X. The Tri-X is 30mm longer.

My reservoir is placed next to my gpu with about 10mm clearance.


















What should I do?


----------



## Aaron_Henderson

So how come when I was looking at reference cards, OCN basically told me that was stupid and that I should be getting a non-reference, even though I planned on watercooling? Now that I have purchased my DCu II, reference is the way to go? Remind me not to listen to you guys all the time and to trust what I already know...


----------



## Sgt Bilko

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> So how come when I was looking at reference cards, OCN basically told me that was stupid and that I should be getting a non-reference, even though I planned on watercooling? Now that I have purchased my DCu II, reference is the way to go? Remind me not to listen to you guys all the time and to trust what I already know...


Only reason i can think of to go non-ref Hawaii (if you're watercooling) is the Lightning, tbh.

If you are sticking with the stock cooling then non-ref is the way to go, but it depends more on the cooler than the card.


----------



## kizwan

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> So how come when I was looking at reference cards, OCN basically told me that was stupid and that I should be getting a non-reference, even though I planned on watercooling? Now that I have purchased my DCu II, reference is the way to go? Remind me not to listen to you guys all the time and to trust what I already know...


All I know is the DCU II is not a crappy card, it just has a crappy cooler. If your card runs without any problems at stock clocks, then you got a good card. What are the max clocks you're able to get with your card?

For Hawaii, reference cards are not as far beneath non-reference cards as you might think. Whatever max clocks a non-reference card can reach when overclocking, reference cards have been known to achieve the same numbers too. It all comes down to the silicon lottery for Hawaii cards.


----------



## Chopper1591

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> So how come when I was looking at reference cards, OCN basically told me that was stupid and that I should be getting a non-reference, even though I planned on watercooling? Now that I have purchased my DCu II, reference is the way to go? Remind me not to listen to you guys all the time and to trust what I already know...


Sorry to hear that.

My info is: stick with the ref design if you plan to do a full-cover block.
Some non-ref cards can do it, but most won't.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Only reason i can think of to go non-ref Hawaii (if you're watercooling) is the Lightning, tbh.
> 
> If you are sticking with the stock cooling then non-ref is the way to go, but it depends more on the cooler than the card.


Spot on.


----------



## Aaron_Henderson

I'm just cheesed I was told NOT to get a reference card...even though I stated its intended purpose, that it would eventually be water-cooled and Crossfired...not too worried about it since EK released a block for the DCu II, and I am pretty sure my DCu II OC is the same PCB...but it's expensive and, from what I can tell, my only option for a full cover block on this card. Oh well, like I said, my own fault, should have just trusted my initial instinct and gone for a reference card, like I always try to. All I was really offered were vague statements like, "Trust me, you are going to regret going reference..." and the like. Nothing wrong with my card though, just the conflicting advice that led to me owning it. I would have much rather just got a reference card...also, now to get Crossfire, I need to find another DCu II OC so the PCB layout matches... reference cards all look the same once the cooler is removed.


----------



## battleaxe

Quote:


> Originally Posted by *cephelix*
> 
> Do correct me if I'm wrong, but you should be able to. Just need a fan blowing at it. The VRAM doesn't get too hot; it's the VRMs, and specifically VRM 1, that get really hot.


This is the absolute best deal I have seen on memory sinks. Very, very cheap. No reason not to buy. 3M tape too.


----------



## pkrexer

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> I'm just cheesed I was told NOT to get a reference card...even though I stated its intended purpose, that it would eventually be water-cooled and Crossfired...not too worried about it since EK released a block for the DCu II, and I am pretty sure my DCu II OC is the same PCB...but it's expensive and, from what I can tell, my only option for a full cover block on this card. Oh well, like I said, my own fault, should have just trusted my initial instinct and gone for a reference card, like I always try to. All I was really offered were vague statements like, "Trust me, you are going to regret going reference..." and the like. Nothing wrong with my card though, just the conflicting advice that led to me owning it. I would have much rather just got a reference card...also, now to get Crossfire, I need to find another DCu II OC so the PCB layout matches... reference cards all look the same once the cooler is removed.


Pretty sure it's common knowledge that you should go reference if you plan on water cooling... I've seen that suggested hundreds of times throughout this thread.


----------



## rdr09

Quote:


> Originally Posted by *pkrexer*
> 
> Pretty sure it's common knowledge that you should go reference if you plan on water cooling... I've seen that suggested hundreds of times throughout this thread.


^this. reference for watercooling. it's been that way since the 6900 series iirc.

edit: he prolly asked in the wrong section. if he asked here, then i am sure a reference card would be the recommendation, since he is watercooling.


----------



## Buehlar

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> I'm just cheesed I was told NOT to get a reference card...even though I stated its intended purpose, that it would eventually be water-cooled and Crossfired...not too worried about it since EK released a block for the DCu II, and I am pretty sure my DCu II OC is the same PCB...but it's expensive and, from what I can tell, my only option for a full cover block on this card. Oh well, like I said, my own fault, should have just trusted my initial instinct and gone for a reference card, like I always try to. All I was really offered were vague statements like, "*Trust me, you are going to regret going reference...*" and the like. Nothing wrong with my card though, just the conflicting advice that led to me owning it. I would have much rather just got a reference card...also, now to get Crossfire, I need to find another DCu II OC so the PCB layout matches... reference cards all look the same once the cooler is removed.


I've done a lot of reading, and the only advice I've ever seen against getting a reference card was because the coolers are loud. As for watercooling, whoever gave you that advice may have misunderstood your plans.
Quote:


> Originally Posted by *pkrexer*
> 
> Pretty sure it's common knowledge that you should go reference if you plan on water cooling... I've seen that suggested hundreds of times throughout this thread.


I wouldn't go so far as to call it "common knowledge". It's a bit easier to suggest getting a reference card to "first time watercoolers" inquiring about watercooling advice, mainly because there are many more manufacturers producing ref waterblocks.
Unless you've already researched and/or know that there's a non-ref block and you plan for it (like I did), there's absolutely nothing wrong with watercooling non-ref cards.


----------



## Chopper1591

Quote:


> Originally Posted by *battleaxe*
> 
> This is the absolute best deal I have seen on memory sinks. Very, very cheap. No reason not to buy. 3M tape too.


Pretty nice indeed.

Although I like my stuff copper.








These are 18 usd for 10 pieces though.


----------



## Agent Smith1984

Well, there are a lot of non-reference, but reference BASED cards.....

My tri-x for example, should have no problem going to water, but the cooling solution I have on it now is excellent until then...
I really think that I am going to get a 1000w+ PSU, and get another one of these cards, and try to ride out on that setup for a little while. I can't see (2) 290's not being able to get acceptable framerates at 1080p for the next few years


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, there are a lot of non-reference, but reference BASED cards.....
> 
> My tri-x for example, should have no problem going to water, but the cooling solution I have on it now is excellent until then...
> I really think that I am going to get a 1000w+ PSU, and get another one of these cards, and try to ride out on that setup for a little while. I can't see (2) 290's not being able to get acceptable framerates at 1080p for the next few years


like i said, Smith . . . Aaron prolly asked in the wrong section and was given advice by a member who prolly does not even own the hardware. it happens, i find myself giving advice at times without reading much of the OP.


----------



## Chopper1591

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, there are a lot of non-reference, but reference BASED cards.....
> 
> My tri-x for example, should have no problem going to water, but the cooling solution I have on it now is excellent until then...
> I really think that I am going to get a 1000w+ PSU, and get another one of these cards, and try to ride out on that setup for a little while. I can't see (2) 290's not being able to get *acceptable framerates at 1080p for the next few years*


Comes down to what you think is acceptable.
If you mean 60+.

I am not so sure.
GPUs age real quick.

2 years on one card is long if you ask me.


----------



## Aaron_Henderson

Yeah, I definitely did not ask the club here...looks like I would have got some solid advice if I had. I thought maybe there was something wrong with the reference cards, similar to when I got my GTX 570 and people were recommending against reference cards due to a power delivery issue and cards going "pop" for no reason...the purchase was rushed as I had a ton going on, and I figured I'd better err on the side of caution as it seemed strange to have people so sure the reference cards were a bad buy, water-cooling or crossfire be damned. Like I said, my fault for not trusting my gut, and especially my fault for not asking in the club here. It's only a minor inconvenience; I just saw a bunch of posts of people preaching how the reference cards seem to clock better in general, and kind of had a "WTH?" moment as that is the opposite of what I was told before making my purchase. All in all though, I am happy with my card I guess, so I am not really complaining, just wish I hadn't had people here talk me out of the reference cards for seemingly no reason other than heat/noise, when in the long run, with a waterblock, it wouldn't matter one bit...oh well, live and learn to not blindly follow advice, I guess.


----------



## Agent Smith1984

Quote:


> Originally Posted by *rdr09*
> 
> like i said, Smith . . . Aaron prolly asked in the wrong section and was given advice by a member who prolly does not even own the hardware. it happens, i find myself giving advice at times without reading much of the OP.


Yeah, I too read a lot of info before getting mine that said to stay away from reference because of temps and throttling. To be honest, I wanted to go water, but it's not in the budget at the moment. Luckily, my card was kind of a win/win because it's at least reference based, and the stock cooling solution works really well for now..... I definitely plan on running a kraken and ram sinks with a closed loop at minimum.

My younger brother has been preaching to me for years to always go reference because they usually have the best VRAM, the best chance of modifying BIOS, and they will accept all aftermarket cooling solutions. Seems he's been right....

Back in the day (like FX5500/9600 Pro days, and all the way up until the 7800GTX and X1900XT), almost every PCB out there was at least a reference design, and worst case it would be a different color or have an additional power port, but now there are a lot of different things being done with boards.... especially from Asus and Sapphire I've noticed.

Edit: And in the case of my previous card, the 280x..... research quickly revealed that if you didn't have a reference card, there was a GOOD chance you had an artifacting piece of crap.... Mine was an Asus TOP, and it took some serious tinkering, driver swaps, and BIOS edits to get that card working correctly 95% of the time


----------



## Agent Smith1984

Quote:


> Originally Posted by *Chopper1591*
> 
> Comes down to what you think is acceptable.
> If you mean 60+.
> 
> I am not so sure.
> Gpu's age real quick.
> 
> 2 years on one card is long if you ask me.


Yeah, but look at how well the 7970 still does..... some people bought them new and ran them since early 2012, and are just now adding a second used card for dirt cheap and seeing 70-100% gains.
The fact that I know I'll be sticking with 1080p gives me some breathing room. Plus, the not-so-impressive consoles out right now will slow down development a little. Not that it's a good thing, it's just a reality.


----------



## Chopper1591

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, but look at how well the 7970 still does..... some people bought them new and ran them since early 2012, and are just now adding a second used card for dirt cheap and seeing 70-100% gains.
> The fact that I know I'll be sticking with 1080p gives me some breathing room. Plus, the not-so-impressive consoles out right now will slow down development a little. Not that it's a good thing, it's just a reality.


Don't get me started on consoles.
They always lag behind PCs.









720p 30fps. Say what?


----------






## ArcticZero

Hi guys, with Black Friday coming up, I have a quick question. I can't seem to find any cheap reference 290X's around the web (US). Anyone know where one could grab one below $400 new like with many of the aftermarket cooled cards? Really prefer reference so it doesn't dump heat inside my case.


----------



## heroxoot

Quote:


> Originally Posted by *ArcticZero*
> 
> Hi guys, with Black Friday coming up, I have a quick question. I can't seem to find any cheap reference 290X's around the web (US). Anyone know where one could grab one below $400 new like with many of the aftermarket cooled cards? Really prefer reference so it doesn't dump heat inside my case.


It's your location. They are all sub 400USD in the USA.

http://www.newegg.com/Product/ProductList.aspx?N=100007709%20600473871&IsNodeId=1&Submit=ENE

So unless you are going to move here, I don't think you'll easily find one willing to ship your way.


----------



## LandonAaron

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, I too read a lot of info before getting mine that said to stay away from reference because of temps and throttling. To be honest, I wanted to go water, but it's not in the budget at the moment. Luckily, my card was kind of a win/win because it's at least reference based, and the stock cooling solution works really well for now..... I definitely plan on running a kraken and ram sinks with a closed loop at minimum.
> 
> My younger brother has been preaching to me for years to always go reference because they usually have the best VRAM, the best chance of modifying BIOS, and they will accept all aftermarket cooling solutions. Seems he's been right....
> 
> Back in the day (like FX5500/9600 Pro days, and all the way up until the 7800GTX and X1900XT), almost every PCB out there was at least a reference design, and worst case it would be a different color or have an additional power port, but now there are a lot of different things being done with boards.... especially from Asus and Sapphire I've noticed.
> 
> Edit: And in the case of my previous card, the 280x..... research quickly revealed that if you didn't have a reference card, there was a GOOD chance you had an artifacting piece of crap.... Mine was an Asus TOP, and it took some serious tinkering, driver swaps, and BIOS edits to get that card working correctly 95% of the time


And then there are cards like the Nvidia GTX 970, which doesn't even have a reference design. I think each generation of cards is different and its hard to make generalizations about whether Reference is better or not. I generally go non-reference as I don't use full cover water blocks and if I am going to run on air I prefer an open air design for less heat and noise. Also if f I put on water many non-reference cards have VRM heatsinks you can use even after the rest of the cooler is removed. MSI twin frozer cards especially.


----------



## ArcticZero

Quote:


> Originally Posted by *heroxoot*
> 
> It's your location. They are all sub 400USD in the USA.
> 
> http://www.newegg.com/Product/ProductList.aspx?N=100007709%20600473871&IsNodeId=1&Submit=ENE
> 
> So unless you are going to move here, I don't think you'll easily find one willing to ship your way.


My contact is in the US. I did mention the US in my first post. And I was referring to all-reference cards. The only one I see on there is the PowerColor, but I've heard most of those are on Elpida memory.


----------



## heroxoot

Quote:


> Originally Posted by *ArcticZero*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> It's your location. They are all sub 400USD in the USA.
> 
> http://www.newegg.com/Product/ProductList.aspx?N=100007709%20600473871&IsNodeId=1&Submit=ENE
> 
> So unless you are going to move here, I don't think you'll easily find one willing to ship your way.
> 
> 
> 
> My contact is in the US. I did mention the US in my first post. And I was referring to all-reference cards. The only one I see on there is the PowerColor, but I've heard most of those are on Elpida memory.
Click to expand...

Sapphire or MSI is the way to go, but that's just my opinion. I can confirm a 290X Gaming edition has Hynix memory. They are 360USD. Or, if your contact can manage, you can get one off eBay and RMA it for a new one if it's bad. MSI only cares about a valid serial for RMA.


----------



## Draygonn

Quote:


> Originally Posted by *cephelix*
> 
> do corrent me if i'm wrong but you should be able to.just need a fan blowing at it.the vrams dont get too hot, it's the vrms and specifically vrm 1 that gets really hot


Seems to be working fine. No temp sensor, but putting a finger on the back of the memory, it doesn't get too hot.


----------



## diggiddi

Quote:


> Originally Posted by *Chopper1591*
> 
> God damnit.
> 
> Forgot to do measurements...
> Bought the 290 Tri-X yesterday, replacing my 7950 Vapor-X. The Tri-X is 30mm longer.
> 
> My reservoir is placed next to my gpu with about 10mm clearance.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What should I do?


Give it away...... to me


----------



## incog

Hello everyone, what is better:

http://www.materiel.net/carte-graphique/sapphire-radeon-r9-290-vapor-x-oc-4-go-100683.html

http://www.materiel.net/carte-graphique/sapphire-radeon-r9-290x-tri-x-uefi-4-go-110111.html

The 290X is a bit stronger than the 290; however, the Vapor-X cooler is better. So which one would you prefer?


----------



## battleaxe

Quote:


> Originally Posted by *incog*
> 
> Hello everyone, what is better:
> 
> http://www.materiel.net/carte-graphique/sapphire-radeon-r9-290-vapor-x-oc-4-go-100683.html
> 
> http://www.materiel.net/carte-graphique/sapphire-radeon-r9-290x-tri-x-uefi-4-go-110111.html
> 
> The 290X is a bit stronger than the 290; however, the Vapor-X cooler is better. So which one would you prefer?


The 290x and put it under water. Better, cooler, faster, quieter.


----------



## Chopper1591

Quote:


> Originally Posted by *diggiddi*
> 
> Give it away...... to me


You wish.









Placed an order for this.


----------



## rdr09

Still testing my second 290. It is slightly slower than my first, and I accidentally installed it in the first slot.









Looks like VRM1 temp keeps up with the core temp when the card is oc'ed. I shut down the system and waited about 15 minutes before benching oc'ed.

947MHz to 1260MHz


----------



## cephelix

Quote:


> Originally Posted by *battleaxe*
> 
> This is the absolute best deal I have seen on memory sinks. Very, very cheap. No reason not to buy. 3M tape too.


That is cheap... but I'm looking for copper heatsinks. Just browsed FCPU and the low-profile memory heatsinks are out of stock.
Quote:


> Originally Posted by *Chopper1591*
> 
> Pretty nice indeed.
> 
> Although I like my stuff copper.
> 
> 
> 
> 
> 
> 
> 
> 
> These are 18 usd for 10 pieces though.


Pricey, but that's what I'm looking for! Any idea where to get them?
Quote:


> Originally Posted by *Chopper1591*
> 
> Comes down to what you think is acceptable.
> If you mean 60+.
> 
> I am not so sure.
> Gpu's age real quick.
> 
> 2 years on one card is long if you ask me.


Agreed, especially if you want to play at absolute max settings with no compromise.
If you're willing to tweak settings lower as long as it's playable, then I guess your card's life would be longer.

On a separate note: anyone using the EK Thermosphere know if I could reuse the cooling mid-plate on the MSI R9 290 Gaming 4G? I'm fairly certain I'd have to remove the backplate; I'm just wondering about the plate under the shroud that cools the components.


----------



## ZealotKi11er

Anyone with 2 x 290/X played Crysis 3 lately? I am getting 1s hangs every 10s of gameplay. Don't remember having this problem when I first played it with 2 x HD 7970. I am using the latest drivers.


----------



## Kelwing

Back to AMD. Think I'm done with duel cards and such. Had a MSI 290 when they first came out and ended up with all kinds of issues. Went back to Nvidia with two 770's. Decided I want a good single card and went with a MSI 290X Lightning. Could not pass on the price.

http://www.techpowerup.com/gpuz/details.php?id=4r5xq



http://s222.photobucket.com/user/mi.../2014-11-20_16-54-40_338_zps5e4679f5.jpg.html


----------



## DividebyZERO

Quote:


> Originally Posted by *Kelwing*
> 
> Back to AMD. Think I'm done with duel cards and such. Had a MSI 290 when they first came out and ended up with all kinds of issues. Went back to Nvidia with two 770's. Decided I want a good single card and went with a MSI 290X Lightning. Could not pass on the price.
> 
> http://s222.photobucket.com/user/mi.../2014-11-20_16-54-40_338_zps5e4679f5.jpg.html


two cards dueling to the death! I can see that being a problem when they are supposed to work in tandem.


----------



## Arizonian

Quote:


> Originally Posted by *LandonAaron*
> 
> I am using a custom water cooler with a universal block (EK VGA Supremacy) just for the GPU portion of the card, and the Gelid Icy Vision for the VRMs. I also have a PCI-slot fan blowing on the card to assist the VRM heatsinks, and this setup works really well for me. Before going custom loop I had an Arctic Accelero Hybrid cooler and it couldn't keep up; I kept hitting 94 and throttling. I think I may have had a bad mount though, as I used that cooler on several Nvidia cards with no issues (GTX 580, GTX 670, and GTX 770). But all those cards are lower TDP than the 290X.
> 
> If you go with something like the Kraken G10, I would still suggest either a 240mm radiator or something like the Corsair H80 with a double-thick 120 rad. The Arctic Accelero I had used a standard-thickness 120 rad and it couldn't keep up.
> 
> Here's a pic of my current setup:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added
















_If you could add a GPU-Z validation link to your original post with your OCN name showing, that would be great._
Quote:


> Originally Posted by *neurotix*
> 
> Arizonian,
> 
> Please update me to 2x Sapphire R9 290 Vapor-X:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks.


Congrats - updated







Tri-X to VaporX









Quote:


> Originally Posted by *Kelwing*
> 
> Back to AMD. Think I'm done with duel cards and such. Had a MSI 290 when they first came out and ended up with all kinds of issues. Went back to Nvidia with two 770's. Decided I want a good single card and went with a MSI 290X Lightning. Could not pass on the price.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s222.photobucket.com/user/mi.../2014-11-20_16-54-40_338_zps5e4679f5.jpg.html


Congrats - added









_If you could add a GPU-Z validation link to your original post with your OCN name showing, that would be great._

If your submission hasn't been added yet, or I missed a post that had one, join us...


----------



## neurotix

Thanks Arizonian


----------



## Bertovzki

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, but look at how well the 7970 still does..... some people bought them new and ran them since early 2012, and are just now adding a second used card for dirt cheap and seeing 70-100% gains.
> The fact that I know I'll be sticking with 1080P gives me some breathing room. Plus, the not-so-impressive console out right now will slow down development a little. Not that it's a good thing, it's just a reality.


It's been nearly 5 years, and I'm still using my HD 5850 until my new system is up and running.


----------



## Regnitto

I will be purchasing a VisionTek R9 290 from @bluedevil tomorrow. Once it comes in and I get it installed I will post gpu-z screenshot


----------



## Chopper1591

Quote:


> Originally Posted by *cephelix*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> That is cheap... but I'm looking for copper heatsinks. Just browsed FCPU and the low-profile memory heatsinks are out of stock.
> 
> 
> Pricey, but that's what I'm looking for! Any idea where to get them?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Agreed, especially if you want to play at absolute max settings with no compromise.
> If you're willing to tweak settings lower as long as it's playable, then I guess your card's life would be longer.
> 
> On a separate note: anyone using the EK Thermosphere know if I could reuse the cooling mid-plate on the MSI R9 290 Gaming 4G? I'm fairly certain I'd have to remove the backplate; I'm just wondering about the plate under the shroud that cools the components.


I don't know about stores where you live.

But you can get some on the bay:
http://www.ebay.com/itm/Enzotech-RAM-cooler-BCC9-Low-Profile-passive-/201221950101?pt=UK_Computing_Water_Cooling&hash=item2ed9c34a95

Or through Frozencpu:
http://www.frozencpu.com/search.html?mv_profile=keyword_search&mv_session_id=e8Kpi5fd&searchspec=bcc9&go.x=0&go.y=0

These are low profile ram sinks from Enzotech.
There are cheaper copper sinks, but Enzotech uses better copper. They are also really nice to look at.

Here are some pics from my old HD6850:




Those are also really soft, so if they interfere with the GPU cooler you can easily bend some of the pins. I even cut some shorter on an HD4870 once.


----------



## cephelix

Quote:


> Originally Posted by *Chopper1591*
> 
> I don't know about stores where you live.
> 
> But you can get some on the bay:
> http://www.ebay.com/itm/Enzotech-RAM-cooler-BCC9-Low-Profile-passive-/201221950101?pt=UK_Computing_Water_Cooling&hash=item2ed9c34a95
> 
> Or through Frozencpu:
> http://www.frozencpu.com/search.html?mv_profile=keyword_search&mv_session_id=e8Kpi5fd&searchspec=bcc9&go.x=0&go.y=0
> 
> These are low profile ram sinks from Enzotech.
> There are cheaper copper sinks, but Enzotech uses better copper. They are also really nice to look at.
> 
> Here are some pics from my old HD6850:
> 
> 
> 
> 
> Those are also really soft, so if they interfere with the GPU cooler you can easily bend some of the pins. I even cut some shorter on an HD4870 once.


Googled for places that sell copper heatsinks where I live but didn't turn up any results. I can find the aluminium ones, but the copper does look good. Either way, I found a place in Australia that delivers, and since I'm going there next month I might as well have them delivered to where I'm staying in Sydney.

I was looking for the really low ones since I thought the BCC9s were too tall and might interfere with the GPU block.


----------



## Chopper1591

Quote:


> Originally Posted by *cephelix*
> 
> Googled for places that sell copper heatsinks where I live but didn't turn up any results. I can find the aluminium ones, but the copper does look good. Either way, I found a place in Australia that delivers, and since I'm going there next month I might as well have them delivered to where I'm staying in Sydney.
> 
> I was looking for the really low ones since I thought the BCC9s were too tall and might interfere with the GPU block.


The bcc9's are:
Dimensions (mm)
14(L) mm x 14mm(W) x 9mm(H)

Like I said earlier.
You can trim those to smaller sizes real easy. The copper is high quality, thus soft.

Which brand does the store you are talking about sell? And what are the prices?
Can't you buy from e-bay? I buy lots of stuff from there. Shipping costs ain't that bad.

For ram these days, aluminum will suffice. But copper is better and looks nice.


----------



## cephelix

Quote:


> Originally Posted by *Chopper1591*
> 
> The bcc9's are:
> Dimensions (mm)
> 14(L) mm x 14mm(W) x 9mm(H)
> 
> Like I said earlier.
> You can trim those to smaller sizes real easy. The copper is high quality, thus soft.
> 
> Which brand does the store you are talking about sell? And what are the prices?
> Can't you buy from e-bay? I buy lots of stuff from there. Shipping costs ain't that bad.
> 
> For ram these days, aluminum will suffice. But copper is better and looks nice.


TheKoolRoom

For what I ordered, it would cost 54AUD incl. shipping, looking at the Enzotech MOS-C10 and the BCC9 I suppose.
Any links to reputable sellers? Never used eBay before.








if you can't post here, do pm me


----------



## rdr09

Quote:


> Originally Posted by *Kelwing*
> 
> Back to AMD. Think I'm done with duel cards and such. Had a MSI 290 when they first came out and ended up with all kinds of issues. Went back to Nvidia with two 770's. Decided I want a good single card and went with a MSI 290X Lightning. Could not pass on the price.
> 
> http://www.techpowerup.com/gpuz/details.php?id=4r5xq
> 
> http://s222.photobucket.com/user/mistwalker7/media/MSI290XLightning_zps506bb624.gif.html
> 
> http://s222.photobucket.com/user/mi.../2014-11-20_16-54-40_338_zps5e4679f5.jpg.html


I have no issues running dual 290s. Heck, even the notorious Skyrim, which I've read does not play well in CrossFire, is surprisingly smooth. But I only have one screen.









nice car, btw.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> I have no issues running dual 290s. Heck, even the notorious Skyrim, which I've read does not play well in CrossFire, is surprisingly smooth. But I only have one screen.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> nice car, btw.


Skyrim has been smooth for me since my HD 6970s, lol. All I had to do was ALT-TAB to turn on CF.


----------



## bond32

Insanely good price for a 290x lightning...:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814127787


----------



## DividebyZERO

Quote:


> Originally Posted by *bond32*
> 
> Insanely good price for a 290x lightning...:
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127787


Must resist...... ugh....

Do they overclock better? I don't mean for benching, but for game-stable overclocks? I heard they aren't much different aside from memory voltage.


----------



## tsm106

Quote:


> Originally Posted by *DividebyZERO*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Insanely good price for a 290x lightning...:
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814127787
> 
> 
> 
> Must resist...... ugh....
> 
> Do they overclock better? I don't mean for benching, but for game-stable overclocks? I heard they aren't much different aside from memory voltage.
Click to expand...

They're very different actually. The vrm section runs stupid cool, especially under water (no need for special pads). It's a shame that you still have to pray to the silicon lottery god, however you do get more volts by default from AB.

Btw, if you WC, beware that these blocks are really overpriced ($150+), and with backplate and shipping you're looking at nearly $200 lol.


----------



## bond32

Quote:


> Originally Posted by *tsm106*
> 
> They're very different actually. The vrm section runs stupid cool, especially under water (no need for special pads). It's a shame that you still have to pray to the silicon lottery god, however you do get more volts by default from AB.
> 
> Btw, if you WC, beware that these blocks are really overpriced ($150+), and with backplate and shipping you're looking at nearly $200 lol.


This... Interestingly enough, I have checked the lightning thread. Not many users getting that high, I expected much more tbh.


----------



## tsm106

Quote:


> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> They're very different actually. The vrm section runs stupid cool, especially under water (no need for special pads). It's a shame that you still have to pray to the silicon lottery god, however you do get more volts by default from AB.
> 
> Btw, if you WC, beware that these blocks are really overpriced ($150+), and with backplate and shipping you're looking at nearly $200 lol.
> 
> 
> 
> This... Interestingly enough, I have checked the lightning thread. Not many users getting that high*, I expected much more tbh*.
Click to expand...

You expect too much from most ppl lol.


----------



## bond32

Quote:


> Originally Posted by *tsm106*
> 
> You expect too much from most ppl lol.


Haha, guess so...

Too much emphasis on that cooler for the lightning. In my early years, before I knew anything, I bought a 780 lightning. Wasn't very impressive at all and the fans are surprisingly loud. Figured, the absurd amount of power delivery coupled with samsung would be a winner...


----------



## LandonAaron

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _If you could please link a GPU-Z validation link in your original post with OCN name showing would be great._
> Congrats - updated
> 
> 
> 
> 
> 
> 
> 
> Tri-X to VaporX
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _If you could please link a GPU-Z validation link in your original post with OCN name showing would be great._
> 
> If anyone hasn't added your submission yet or I missed your post that didn't have submission, join us...


Quote:


> Originally Posted by *LandonAaron*
> 
> I am using a custom water cooler with a universal block (EK VGA Supremacy) just for the GPU portion of the card, and the GelID IcyVision for the VRM's. I also have a PCI slot fan blowing on the card to assist the VRM heatsinks, and this setup works really well for me. Before going custom loop I had an Artic Accelero Hybrid Cooler and it couldn't keep up. I kept hitting 94 and throlttling. I think I may have had a bad mount though, as I used that cooler on several Nvidia cards with no issues (GTX 580, GTX670, and GTX 770). But all those cards are lower TDP than the 290x.
> 
> If you go with something like the kraken g10, I would still suggest either a 240 mm radiator, or something like the Corsair H80 with a 120 double thick rad. The artic accelero I had had a standard thickness 120 rad and it couldn't keep up.
> 
> Here's a pic of my current setup:


Here is my GPUz validation link it has my OCN name. http://www.techpowerup.com/gpuz/details.php?id=dkgnk

Let me know if there is anything else you need. This will be my first time joining a club


----------



## tsm106

Quote:


> Originally Posted by *bond32*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> You expect too much from most ppl lol.
> 
> 
> 
> Haha, guess so...
> 
> Too much emphasis on that cooler for the lightning. In my early years, before I knew anything, I bought a 780 lightning. Wasn't very impressive at all and the fans are surprisingly loud. Figured, the absurd amount of power delivery coupled with samsung would be a winner...
Click to expand...

Silicon lottery, ain't nothing you can do about bad luck except throw moar money at it lol.


----------



## ZealotKi11er

Quote:


> Originally Posted by *tsm106*
> 
> Silicon lottery, ain't nothing you can do about bad luck except throw moar money at it lol.


Silicon lottery is a myth. You either get a good card or a bad card. Most people will get an average card. 28nm is too mature right now to call it a lottery. Almost all cards will be identical in OC potential; 10-20 MHz is not really a lottery. For the 290X, depending on the cooling, all cards should do 1180-1220.


----------



## bond32

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Silicon lottery is a myth. You either get a good card or a bad card. Most people will get an average card. 28nm is too mature right now to call it a lottery. Almost all cards will be identical in OC potential; 10-20 MHz is not really a lottery. For the 290X, depending on the cooling, all cards should do 1180-1220.


What's your basis of this? How many cards have you had to say this? 2?

Sorry, but I honestly don't agree.


----------



## ZealotKi11er

Quote:


> Originally Posted by *bond32*
> 
> What's your basis of this? How many cards have you had to say this? 2?
> 
> Sorry, but I honestly don't agree.


It has more to do with cooling, especially on the 290X. I can do 1300MHz game stable if the card is below 40C, but I can't go claim my card can do 1300MHz, can I? Under 24/7 operation and normal ambient temps my cards do lower than that. I am not trying to make excuses for my OC, but simply letting people who have not OCed this card know what numbers they can expect. I have 3 x 290s and a 290X.


----------



## tsm106

This guy seems to think his opinion is the rule of thumb lol.









Quote:


> Originally Posted by *ZealotKi11er*
> 
> You are not the average overclocker. Telling people that these cards do at least 1300MHz is false. I have tested 3 x 290 and none could do 1300MHz on normal BIOS and +200mV.


Quote:


> Originally Posted by *ZealotKi11er*
> 
> 290X/290 can't go above 1250MHz unless you run the benchmark with artifacts or have super low ambient temps.


Quote:


> Originally Posted by *ZealotKi11er*
> 
> Silicon lottery is a myth. You either get a good card or a bad card. Most people will get an average card. 28nm is too mature right now to call it a lottery. Almost all cards will be identical in OC potential; 10-20 MHz is not really a lottery. For the 290X, depending on the cooling, all cards should do 1180-1220.


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It has more to do with cooling, especially on the 290X. I can do 1300MHz game stable if the card is below 40C, but I can't go claim my card can do 1300MHz, can I? Under 24/7 operation and normal ambient temps my cards do lower than that. I am not trying to make excuses for my OC, but simply letting people who have not OCed this card know what numbers they can expect. I have 3 x 290s and a 290X.


But, Zeal, both my 290s can easily do 1200 while I see some struggle to get 1150 even using the same cooling. Isn't that lottery?


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> But, Zeal, both my 290s can easily do 1200 while I see some struggle to get 1150 even using the same cooling. Isn't that lottery?


Can easily do 1200MHz? What kind of cooling and voltage?


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Can easily do 1200MHz? What kind of cooling and voltage?


When I say easy, I mean I don't have to find out if +60 or +100 will do for a certain OC. I play at stock, but if I check how a driver performs I use a synthetic bench first.

My cards can do 1200 using +100, maybe lower; I am not sure. They can bench and game at 1100 using stock. My first one does not need extra VDDC until 1170, the other until 1150.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> When I say easy, I mean I don't have to find out if +60 or +100 will do for a certain OC. I play at stock, but if I check how a driver performs I use a synthetic bench first.
> 
> My cards can do 1200 using +100, maybe lower; I am not sure. They can bench and game at 1100 using stock. My first one does not need extra VDDC until 1170, the other until 1150.


It's like that for all AMD cards. They can hit certain clocks with no problem; after that they need a stupid amount of voltage for an extra 10-20MHz. 3DMark is usually not a good indication of stability. I have to drop my cards to 1185MHz + 100mV if I want to keep playing BF4, for example, for 3 hours. I can do 1220MHz in 3DMark no problem because temps don't get high enough per run.


----------



## LandonAaron

I've got an issue maybe someone can help me with. I have found my max stable OC for gaming and benchmarking: 1200/1600 with +200mV using Trixx. The card is water-cooled with a custom loop; GPU temp stays in the low 50s, VRM1 gets up to about 70 max, and VRM2 stays in the low 50s. I have found this OC to be pretty much 100% stable, with no artifacts that I have noticed.

However, if I put the computer to sleep and forget to reset back to stock first, I just get black screens on my monitors when I wake it up. This isn't too much of an issue, as all I need to do is remember to reset back to stock before sleeping. Today something different happened, though. I went to put the computer to sleep and for once remembered to reset back to stock, but when I hit the reset button in Trixx both monitors went to black screens and stayed that way. It wasn't as if they weren't getting a signal, either; when that happens they go to sleep after a few seconds, but instead they stayed on with black screens.

Should I maybe turn down the clocks first and then the voltage?


----------



## bond32

Quote:


> Originally Posted by *LandonAaron*
> 
> I've got an issue maybe someone can help me with. I have found my max stable OC for gaming and benchmarking: 1200/1600 with +200mV using Trixx. The card is water-cooled with a custom loop; GPU temp stays in the low 50s, VRM1 gets up to about 70 max, and VRM2 stays in the low 50s. I have found this OC to be pretty much 100% stable, with no artifacts that I have noticed. However, if I put the computer to sleep and forget to reset back to stock first, I just get black screens on my monitors when I wake it up. Today something different happened, though: I remembered to reset back to stock, but when I hit the reset button in Trixx both monitors went to black screens and stayed that way. Should I maybe turn down the clocks first and then the voltage?


Probably a memory issue. Drop the memory down to 1500 and see if it persists
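If dropping to 1500 fixes it, you can find the actual ceiling in a handful of runs by bisecting between the last-good and first-bad clock instead of guessing in small steps. A rough sketch of the idea (the `is_stable` predicate is hypothetical; in practice it's you eyeballing a benchmark run at that clock and reporting pass/fail):

```python
def find_max_stable(lo, hi, is_stable, step=5):
    """Binary-search the highest clock (MHz) that still passes is_stable().

    lo must be a known-stable clock, hi a known-unstable one."""
    while hi - lo > step:
        mid = (lo + hi) // 2
        if is_stable(mid):
            lo = mid   # mid passed, so the real limit is at or above mid
        else:
            hi = mid   # mid failed, so the real limit is below mid
    return lo

# Example with a fake stability predicate standing in for a manual
# benchmark run at each candidate clock:
best = find_max_stable(1500, 1600, lambda mhz: mhz <= 1583)
```

Seven or so runs between 1500 and 1600 pins the limit to within 5MHz, which beats retesting every 25MHz notch by hand.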


----------



## rdr09

Quote:


> Originally Posted by *LandonAaron*
> 
> I've got an issue maybe someone can help me with. I have found my max stable OC for gaming and benchmarking: 1200/1600 with +200mV using Trixx. The card is water-cooled with a custom loop; GPU temp stays in the low 50s, VRM1 gets up to about 70 max, and VRM2 stays in the low 50s. I have found this OC to be pretty much 100% stable, with no artifacts that I have noticed. However, if I put the computer to sleep and forget to reset back to stock first, I just get black screens on my monitors when I wake it up. Today something different happened, though: I remembered to reset back to stock, but when I hit the reset button in Trixx both monitors went to black screens and stayed that way. Should I maybe turn down the clocks first and then the voltage?


Quote:


> Originally Posted by *bond32*
> 
> Probably a memory issue. Drop the memory down to 1500 and see if it persists


Like bond says, or do 1180/1550 and maybe a lower VDDC of +180.


----------



## ZealotKi11er

Quote:


> Originally Posted by *LandonAaron*
> 
> I've got an issue maybe someone can help me with. I have found my max stable OC for gaming and benchmarking: 1200/1600 with +200mV using Trixx. The card is water-cooled with a custom loop; GPU temp stays in the low 50s, VRM1 gets up to about 70 max, and VRM2 stays in the low 50s. I have found this OC to be pretty much 100% stable, with no artifacts that I have noticed. However, if I put the computer to sleep and forget to reset back to stock first, I just get black screens on my monitors when I wake it up. Today something different happened, though: I remembered to reset back to stock, but when I hit the reset button in Trixx both monitors went to black screens and stayed that way. Should I maybe turn down the clocks first and then the voltage?


I find TriXX a horrible tool for 24/7 OC; it's just good enough for benchmark runs, then uninstall. You should not need that many volts for 1200MHz. Try lower and see if it helps. It's all about the temp/voltage balance.


----------



## Agent Smith1984

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I find TriXX a horrible tool for 24/7 OC; it's just good enough for benchmark runs, then uninstall. You should not need that many volts for 1200MHz. Try lower and see if it helps. It's all about the temp/voltage balance.


That's weird, because I find Trixx to be the only thing that works well for me....

Afterburner never sticks the 50% power limit setting; it has to be redone in CCC every time you apply a new setting. I found out about the problem because of the horrible throttling I was getting while overclocking... come to find out, I was using AB and the power limit kept going back to 0% in CCC even though AB said it was at 50.

Also, I have seen a lot of people use 150-200mv to get 1150-1200 core on the 290. I was under the impression that additional voltage was okay as long as temps were good... mine haven't broken 74c with 150mv, and I am planning on pushing +200mv tonight once I get my new PSU installed (blew my old one up during BF4







)

I have also heard others say don't use that much voltage though... I'm confused.... I ALWAYS push for the highest obtainable clock speeds within safe temps, regardless of how much voltage it requires (assuming no higher voltage than the manufacturer has allowed will be used).

I am hoping to hit 1200+/1600+ tonight with my new PSU.


----------



## wermad




----------



## Agent Smith1984

It's a thing of beauty isn't it???


----------



## ZealotKi11er

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's weird, because I find Trixx to be the only thing that works well for me....
> 
> Afterburner never sticks the 50% power limit setting; it has to be redone in CCC every time you apply a new setting. I found out about the problem because of the horrible throttling I was getting while overclocking... come to find out, I was using AB and the power limit kept going back to 0% in CCC even though AB said it was at 50.
> 
> Also, I have seen a lot of people use 150-200mv to get 1150-1200 core on the 290. I was under the impression that additional voltage was okay as long as temps were good... mine haven't broken 74c with 150mv, and I am planning on pushing +200mv tonight once I get my new PSU installed (blew my old one up during BF4
> 
> 
> 
> 
> 
> 
> 
> )
> 
> I have also heard others say don't use that much voltage though... I'm confused.... I ALWAYS push for the highest obtainable clock speeds within safe temps, regardless of how much voltage it requires (assuming no higher voltage than the manufacturer has allowed will be used).
> 
> I am hoping to hit 1200+/1600+ tonight with my new PSU.


74C @ +150mV probably yields the same OC as 45C @ +75mV.


----------



## wermad

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It's a thing of beauty isn't it???


Yes, pic sucks though

Picked up another one, crossfire here I come

edit: Note to self, need to find a crossfire bridge


----------



## bond32

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 74C @ +150mV probably yields the same OC as 45C @ +75mV.


No. Just no...


----------



## LandonAaron

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I find TriXX a horrible tool for 24/7 OC; it's just good enough for benchmark runs, then uninstall. You should not need that many volts for 1200MHz. Try lower and see if it helps. It's all about the temp/voltage balance.


Well, my approach to OCing wasn't to pick a clock speed and then find whatever voltage I would need; I just maxed out the voltage and then found my highest stable clock speed, which happened to be 1200/1600. I think Agent Smith and I have pretty similar philosophies when it comes to OCing. I pretty much have to use Trixx, as I know of no other program that will allow +200mV, but what I typically do is open Trixx, apply my overclock, close it, and open Afterburner to monitor my usage and temps. Afterburner always reports the same settings I set in Trixx, so it seems to be working fine.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> 74C @ +150mV probably yields the same OC as 45C @ +75mV.


My setup is water-cooled, so the difference between +100mV and +200mV is about 5 degrees for the GPU and maybe 10 max for the VRM, with the GPU in the low 50s and the VRM in the high 60s.


----------



## ZealotKi11er

Quote:


> Originally Posted by *LandonAaron*
> 
> Well, my approach to OCing wasn't to pick a clock speed and then find whatever voltage I would need; I just maxed out the voltage and then found my highest stable clock speed, which happened to be 1200/1600. I think Agent Smith and I have pretty similar philosophies when it comes to OCing. I pretty much have to use Trixx, as I know of no other program that will allow +200mV, but what I typically do is open Trixx, apply my overclock, close it, and open Afterburner to monitor my usage and temps. Afterburner always reports the same settings I set in Trixx, so it seems to be working fine.
> 
> My setup is water-cooled, so the difference between +100mV and +200mV is about 5 degrees for the GPU and maybe 10 max for the VRM, with the GPU in the low 50s and the VRM in the high 60s.


For me +200mV is beyond what the loop can handle, so temps increase by more than 5C. 45C vs 50C can mean the difference between perfectly stable and instability with artifacts.


----------



## sf101

Odd. I usually have issues straight away if I add too many initial volts.


----------



## LandonAaron

Quote:


> Originally Posted by *ZealotKi11er*
> 
> For me +200mV is beyond what the loop can handle, so temps increase by more than 5C. 45C vs 50C can mean the difference between perfectly stable and instability with artifacts.


I've never heard this before. It has always been my understanding that instability arises at very high temperatures, and that temperature itself does not have that large an impact on stability, or only an indirect effect in that it limits the amount of voltage you can feed the chip. I'll test this theory and see if I can't get a higher overclock with less voltage, but the idea seems rather counterintuitive to me, especially when we are talking about temps as low as 45 and 50 degrees.


----------



## ZealotKi11er

Quote:


> Originally Posted by *LandonAaron*
> 
> I've never heard this before. It has always been my understanding that instability arises at very high temperatures, and that temperature itself does not have that large of an impact on stability, or that it only has an indirect effect on stability in that it limits the amount of voltage you can feed the chip. I'll test this theory though and see if I can't get a higher overclock with less voltage, but the idea seems rather counterintuitive to me, especially when we are talking about temps as low as 45 and 50 degrees.


40C will get you a much better OC than 50C+. You can easily prove it.


----------



## Kelwing

Quote:


> Originally Posted by *rdr09*
> 
> i have no issues running dual 290s. heck, even notorious Skyrim that i've been reading does not play well in crossfire is surprisingly smooth. but, i only have one screen.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> nice car, btw.


If I had the extra money I would grab a second one. I am very impressed with the card, especially after the nightmare the 290 launch was a year ago. So far it's going through DA:I with everything maxed with ease.

As for the car, thanks. My main hobby is drag racing. Between it and messing with my PC for gaming, I'm surprised my wallet hasn't beaten me to death yet.


----------



## rt123

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's weird, because I find Trixx to be the only thing that works well for me....
> 
> Afterburner never sticks the 50% power limit setting; it has to be redone in CCC every time you apply a new setting. I found out about the problem because of the horrible throttling I was getting while overclocking..... come to find out, I was using AB, and the power limit kept going back to 0% in CCC even though AB said it was at 50....
> 
> Also, I have seen a lot of people use 150-200mv to get 1150-1200 core on the 290. I was under the impression that additional voltage was okay as long as temps were good... mine haven't broken 74c with 150mv, and I am planning on pushing +200mv tonight once I get my new PSU installed (blew my old one up during BF4).
> 
> I have also heard others say don't use that much voltage though... I'm confused.... I ALWAYS push for the highest obtainable clock speeds within safe temps, regardless of how much voltage it requires (assuming no higher voltage than the manufacturer has allowed will be used).
> 
> I am hoping to hit 1200+/1600+ tonight with my new PSU.


When using Afterburner or any other OCing tool, stop using AMD OverDrive. There is a conflict between the two utilities, and that's why it keeps resetting.


----------



## rdr09

Quote:


> Originally Posted by *Kelwing*
> 
> If I had the extra money I would grab a second one. I am very impressed with the card, especially after the nightmare the 290 launch was a year ago. So far it's going through DA:I with everything maxed with ease.
> 
> As for the car, thanks. My main hobby is drag racing. Between it and messing with my PC for gaming, I'm surprised my wallet hasn't beaten me to death yet.


both hobbies are expensive. i'm not into any racing but once a NY cab driver told me to slow down.


----------



## Mega Man

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Silicon lottery, ain't nothing you can do about bad luck except throw moar money at it lol.
> 
> 
> 
> Silicon Lottery is a myth. You either get good card or bad card. Most people will get average card. 28nm is too mature right now to call it lottery. Almost all cards will be identical in OC potential. 10-20 MHz is not really lottery. For 290X depending on the cooling all cards should do 1180-1220.
Click to expand...





Quote:


> Originally Posted by *tsm106*
> 
> This guy seems to think his opinion is the rule of thumb lol.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> You are not the average overclocker. Telling people that these cards do at least 1300MHz is false. I have tested 3 x 290 and none could do 1300MHz on the normal BIOS and +200mV.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> 290X/290 cant go above 1250MHz unless you run the benchmark with artifacts or have super low ambient temps.
> 
> Click to expand...
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Silicon Lottery is a myth. You either get good card or bad card. Most people will get average card. 28nm is too mature right now to call it lottery. Almost all cards will be identical in OC potential. 10-20 MHz is not really lottery. For 290X depending on the cooling all cards should do 1180-1220.
> 
> Click to expand...
Click to expand...

Absolutely agree with you, tsm!


----------



## Agent Smith1984

The only clubbin' I'm doin' on Friday night is 290 clubbin'










I just noticed the date on this said 11/14... that's funny because I blew my PSU up last Friday, and just got my system back up and running tonight.... that means my battery is dead as hell....

YOU KNOW IT'S TIME FOR A CPU/MOBO UPGRADE WHEN YOUR BATTERY IS DEAD!!! ROFL


----------



## wermad

Lol, don't worry about your cpu, I'm using a lowly Pentium


----------



## Agent Smith1984

Quote:


> Originally Posted by *wermad*
> 
> Lol, don't worry about your cpu, I'm using a lowly Pentium


To be honest, I'm looking at my physics and combined scores, and telling most i5's to kiss my ass..... it'd have to be an i7 to make it worth my while.....

My Thuban says, "No to that upgrade, buy this b!tc4 a battery"










My Thuban is a Chuck Norris stepping, for those who didn't know.....


----------



## wermad

I'm only playing classic (GOG) games at the moment. It handles them properly at stock. I have a second 290 ordered, so the CPU will surely be a bottleneck. Going to try to OC it to 4.7, as that's a fairly reachable speed for many PAEs. Eventually, I'm going to pick up a 4770K or 4790K (probably tax season).


----------



## Agent Smith1984

Quote:


> Originally Posted by *wermad*
> 
> I'm only playing classic (GOG) games at the moment. It handles them properly at stock. I have a second 290 ordered, so the CPU will surely be a bottleneck. Going to try to OC it to 4.7, as that's a fairly reachable speed for many PAEs. Eventually, I'm going to pick up a 4770K or 4790K (probably tax season).


Those little Pentiums crank out some serious FPS in games that don't peg out the cores too badly! Especially with a nice round 4.8 (you'll need some loud air, or decent water for that)... at least you already have a board for the upgrade path.


----------



## DividebyZERO

DR. COX is the bomb


----------



## Chopper1591

Quote:


> Originally Posted by *cephelix*
> 
> TheKoolRoom
> 
> what I ordered would cost 54AUD incl. shipping, looking at the ENZOTECH MOS-C10 and the BCC9, I suppose
> any links to reputable sellers? never used ebay before
> 
> 
> 
> 
> 
> 
> 
> 
> if you can't post here, do pm me


You could check what the shipping costs are when you order from FrozenCPU.
Otherwise, it will just be from the store you mentioned, I guess.
Ebay has only one offering, it seems, and that is a bit expensive.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Chopper1591*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cephelix*
> 
> TheKoolRoom
> 
> what I ordered would cost 54AUD incl. shipping, looking at the ENZOTECH MOS-C10 and the BCC9, I suppose
> any links to reputable sellers? never used ebay before
> 
> 
> 
> 
> 
> 
> 
> 
> if you can't post here, do pm me
> 
> 
> 
> You could check what the shipping costs are when you order from FrozenCPU.
> Otherwise, it will just be from the store you mentioned, I guess.
> Ebay has only one offering, it seems, and that is a bit expensive.
Click to expand...

FrozenCPU is your best bet. I order from there fairly often, shipping is pretty quick considering, and the prices are reasonable for said shipping.


----------



## cephelix

Quote:


> Originally Posted by *Chopper1591*
> 
> You could check what the shipping costs are when you order from FrozenCPU.
> Otherwise, it will just be from the store you mentioned, I guess.
> Ebay has only one offering, it seems, and that is a bit expensive.


Quote:


> Originally Posted by *Sgt Bilko*
> 
> FrozenCPU is your best bet, i order from there a little and shipping is pretty quick considering and the prices are reasonable for said shipping


Thanks for that, guys. I do think the shipping is reasonable. Bought the Fujipolys from there. Looking to add CLU and a few other things on my next order.


----------



## LandonAaron

Quote:


> Originally Posted by *cephelix*
> 
> Thanks for that, guys. I do think the shipping is reasonable. Bought the Fujipolys from there. Looking to add CLU and a few other things on my next order.


My advice on buying from FrozenCPU is to wait until you have your order completely picked out. They tend to discount shipping pretty heavily for combined orders vs. what you would end up paying in shipping if you had it all on separate orders. IMO their shipping is a little overpriced, but they have very quick service, and you won't find a better selection.

I think you are ordering some copper VRAM and VRM heatsinks. I have some of the copper Enzotechs and they are nice; just be sure you get quality thermal transfer tape of appropriate thickness. There isn't a lot of surface area on the VRM, and thermal transfer tape alone probably isn't going to give you a good mount. I would suggest the Gelid Icy Vision pack for the VRM. It's aluminum instead of copper, but the high mounting pressure of using screws instead of just tape will give you better thermal transfer and lower temps IMO. On the VRAM side of things any heatsink will really do, and thus I wouldn't suggest copper, as they are heavier and harder to keep attached. Thermal tape, even the stickiest, has a hard time holding copper heatsinks. Remember, these are going to be mounted upside down once you get the card installed. I would just use some small aluminum ones instead.

Finally, don't use an epoxy for attaching the heatsinks; it will rip off the VRM and VRAM chips when you try to remove a heatsink. Epoxy is really that permanent. The best thing I have found for attaching copper heatsinks is the thermal adhesive compound that Arctic Cooling makes. They only seem to sell it packaged with their coolers, though they have a spare parts page, and I think you can order it from there now. I wrote to their customer service once, and they shipped me some for free. The stuff is really cool: it comes out like a liquid but dries into a hard, rubbery substance with an awesome hold. And it cleans off very easily.

I have been using a combination of water and air cooling on my last 3 cards, so this is just what I have found in that time. Finally, my number one suggestion, if you are considering using something like the Kraken G10 or a universal GPU block, is to buy a card like the MSI series with the Twin Frozr coolers. These cards come with a base plate already installed that completely covers all the VRM and VRAM components, meaning you don't have to mess with a bunch of small individual heatsinks for each VRAM chip. The base plates work very well as long as there is no flex in the card. If the card is flexed, the base plate may pull away from the card on one side.

Anyway, sorry for the wall of text. I think universal coolers are the way to go. I upgrade my GPU way too often to sink a bunch of money into full cover blocks.


----------



## tsm106

Quote:


> They tend to discount shipping pretty heavily for combined orders vs. what you would end up paying in shipping if you had it all on separate orders.


It's priced by weight not quantity.

If you guys are talking GPU-only cooling, have you looked at the Gelid kit?


----------



## LandonAaron

Quote:


> Originally Posted by *tsm106*
> 
> It's priced by weight not quantity.
> 
> If you guys are talking GPU-only cooling, have you looked at the Gelid kit?


I purchased a single sheet of thermal transfer tape. It was the size of a sticky post-it and the shipping was $5.65. So I always try to combine orders from them now.

Oh, I also ordered an XSPC water cooling kit, with a pump/res, 420mm radiator, 10 feet of hose, then another 12 feet of hose in a different color, and like 3 or 4 metal fittings on top of the 6 fittings it came with, and the shipping was only like $11.50. So I don't know how they calculate it, but I know that single piece of transfer tape didn't weigh half as much as that water cooling kit, yet its shipping cost half as much.


----------



## cephelix

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *LandonAaron*
> 
> My advice on buying from FrozenCPU is to wait until you have your order completely picked out. They tend to discount shipping pretty heavily for combined orders vs. what you would end up paying in shipping if you had it all on separate orders. IMO their shipping is a little overpriced, but they have very quick service, and you won't find a better selection.
> 
> I think you are ordering some copper VRAM and VRM heatsinks. I have some of the copper Enzotechs and they are nice; just be sure you get quality thermal transfer tape of appropriate thickness. There isn't a lot of surface area on the VRM, and thermal transfer tape alone probably isn't going to give you a good mount. I would suggest the Gelid Icy Vision pack for the VRM. It's aluminum instead of copper, but the high mounting pressure of using screws instead of just tape will give you better thermal transfer and lower temps IMO. On the VRAM side of things any heatsink will really do, and thus I wouldn't suggest copper, as they are heavier and harder to keep attached. Thermal tape, even the stickiest, has a hard time holding copper heatsinks. Remember, these are going to be mounted upside down once you get the card installed. I would just use some small aluminum ones instead. Finally, don't use an epoxy for attaching the heatsinks; it will rip off the VRM and VRAM chips when you try to remove a heatsink. Epoxy is really that permanent. The best thing I have found for attaching copper heatsinks is the thermal adhesive compound that Arctic Cooling makes. They only seem to sell it packaged with their coolers, though they have a spare parts page, and I think you can order it from there now. I wrote to their customer service once, and they shipped me some for free. The stuff is really cool: it comes out like a liquid but dries into a hard, rubbery substance with an awesome hold. And it cleans off very easily.
> 
> I have been using a combination of water and air cooling on my last 3 cards, so this is just what I have found in that time. Finally, my number one suggestion, if you are considering using something like the Kraken G10 or a universal GPU block, is to buy a card like the MSI series with the Twin Frozr coolers. These cards come with a base plate already installed that completely covers all the VRM and VRAM components, meaning you don't have to mess with a bunch of small individual heatsinks for each VRAM chip. The base plates work very well as long as there is no flex in the card. If the card is flexed, the base plate may pull away from the card on one side.
> 
> Anyway, sorry for the wall of text. I think universal coolers are the way to go. I upgrade my GPU way too often to sink a bunch of money into full cover blocks.






Thank you so much for the info, even in "wall of text" form..... I try to consolidate my orders as much as possible (don't want to waste money on shipping costs). I already have the MSI R9 290 4G Gaming, which of course, with the newest PCB revision, doesn't fit the EK Rev 2.0 block. I was wondering if I could reuse the mid-plate, actually. Then all I have to do is find screws of the appropriate length/type. The question is, though, I'm planning to use the EK Thermosphere and wondering if it'll clear the midplate. Also, that would mean the backplate would have to be removed, correct? Unless I mod the backplate.
Earlier this year I ordered Fujipolys from FCPU because they ran out of stock for one of them, so when they got back in stock, I snapped them up as quick as I could. Been looking around for VRM heatsinks like the Gelid ones but in copper. Found one, almost, but then realised that the mounting screw holes were diagonal from each other (not the exact one, but you get my point).
For the Arctic Cooling thermal adhesive, the only one I could find from a brief Google search is this, which is not the one I think you're talking about, because the site states it "is a two part permanent adhesive for thermal joints". Would you be referring to the G1 instead? If you are, then I'm out of luck, as the product has been discontinued.

One last thing, how do you cool your VRM2? Since I know the VRAM should be fine. Planning to mount 2 x 120mm fans under the GPU like so:


----------



## laitoukid

1. http://www.techpowerup.com/gpuz/details.php?id=mnkg7
2. XFX
3. Aftermarket


----------



## LandonAaron

Quote:


> Originally Posted by *cephelix*
> 
> 
> Thank you so much for the info, even in "wall of text" form..... I try to consolidate my orders as much as possible (don't want to waste money on shipping costs). I already have the MSI R9 290 4G Gaming, which of course, with the newest PCB revision, doesn't fit the EK Rev 2.0 block. I was wondering if I could reuse the mid-plate, actually. Then all I have to do is find screws of the appropriate length/type. The question is, though, I'm planning to use the EK Thermosphere and wondering if it'll clear the midplate. Also, that would mean the backplate would have to be removed, correct? Unless I mod the backplate.
> Earlier this year I ordered Fujipolys from FCPU because they ran out of stock for one of them, so when they got back in stock, I snapped them up as quick as I could. Been looking around for VRM heatsinks like the Gelid ones but in copper. Found one, almost, but then realised that the mounting screw holes were diagonal from each other (not the exact one, but you get my point).
> For the Arctic Cooling thermal adhesive, the only one I could find from a brief Google search is this, which is not the one I think you're talking about, because the site states it "is a two part permanent adhesive for thermal joints". Would you be referring to the G1 instead? If you are, then I'm out of luck, as the product has been discontinued.
> 
> One last thing, how do you cool your VRM2? Since I know the VRAM should be fine. Planning to mount 2 x 120mm fans under the GPU like so:


I actually purchased a reference design Sapphire R9 290X, so I am not completely familiar with the MSI Twin Frozr one, but I have owned the MSI versions of other cards. I didn't think the MSI Gaming edition had a back plate.



The Gelid Icy Vision comes with a small T-shaped heatsink for VRM2. It's incredibly tiny, but it seems to work fine. I am just using the thermal tape that it came with. I use a PCI slot fan that has 3 x 80mm fans to blow air across all the heatsinks. I am sure the Fujipoly thermal tape will work fine; just be careful not to bump the heatsinks once you get them attached. I would maybe consider some of the aluminum heatsinks made by Akust.

Edit: I found that pic here: http://www.pcper.com/news/General-Tech/Another-Custom-Hawaii-MSI-R9-290-290X-GAMING-4G

but I don't even think that's a pic of an R9 290; looking at it, I think it's a pic of an Nvidia card. Do you have a pic of your card, including the baseplate and whatnot?


----------



## LandonAaron

@cephelix My only concern with the PCI slot fan you linked is that it's just powered by 4-pin molex, so you won't have any speed control. If you can, you should try to find those same metal brackets on eBay and then pick out your own 2 x 120mm PWM fans. You can get a small wire that will convert the PWM fan header on the card to a regular 4-pin PWM header, and you can connect your fans to that using a splitter. That way you can control your fan speeds through Afterburner.


----------



## cephelix

@LandonAaron that is a picture of the 290....
as for pics, these are the only ones I have. No picture of the back plate, though, but basically it looks like this.

If the Gelid ones come with everything, I may just get that.


----------



## Vici0us

Has anyone experienced Windforce rattling? It's on and off. Is there any way of fixing it besides going with an aftermarket cooler?


----------



## LandonAaron

@cephelix
PCI slot fan bracket:

The 120mm version of these brackets is a lot harder to find than their 80/90mm counterparts, and they pretty much all come from China, so you would be in for a wait.
http://www.ebay.com/itm/PCI-12cm-Dual-Fans-Mount-Rack-Cooling-Heatsink-Bracket-for-Graphics-video-card-/150964042576?pt=LH_DefaultDomain_0&hash=item2326287b50

GPU PWM to regular PWM adapter:
http://www.ebay.com/itm/Gelid-PWM-Adaptor-Cable-for-VGA-Cooler-Fans-/271553591380?pt=LH_DefaultDomain_0&hash=item3f39daf854

I see the pictures now. Huh, this one is going to be kind of tricky. On the reference R9 290 the VRM is in a single row, but on this card there are two rows of VRMs, so the Gelid Icy Vision wouldn't work. You need something a little wider. I would look at Enzotech's MST series VRM heatsinks; they are full copper and really heavy duty, but you will have to get creative on mounting them. Check out the Enzotech MST-81. A mounting mechanism I came up with for these types of heatsinks is to take a PCI slot blanking plate, stick it between the fins, and then run a zip tie through the screw holes on either end of the VRM area, holding the PCI blanking plate down. If you can put a spring on the zip ties, that is even better to keep constant pressure on it.

I wouldn't worry too much about using the mid-plate on this card, as all it really cools is some of the VRAM and VRM2. If you can, that's cool, but if you can't, no big deal.

Edit: Those gold chokes are sexy.


----------



## cephelix

Quote:


> Originally Posted by *LandonAaron*
> 
> @cephelix
> PCI slot fan bracket:
> 
> The 120mm version of these brackets is a lot harder to find than their 80/90mm counterparts, and they pretty much all come from China, so you would be in for a wait.
> http://www.ebay.com/itm/PCI-12cm-Dual-Fans-Mount-Rack-Cooling-Heatsink-Bracket-for-Graphics-video-card-/150964042576?pt=LH_DefaultDomain_0&hash=item2326287b50
> 
> GPU PWM to regular PWM adapter:
> http://www.ebay.com/itm/Gelid-PWM-Adaptor-Cable-for-VGA-Cooler-Fans-/271553591380?pt=LH_DefaultDomain_0&hash=item3f39daf854
> 
> I see the pictures now. Huh, this one is going to be kind of tricky. On the reference R9 290 the VRM is in a single row, but on this card there are two rows of VRMs, so the Gelid Icy Vision wouldn't work. You need something a little wider. I would look at Enzotech's MST series VRM heatsinks; they are full copper and really heavy duty, but you will have to get creative on mounting them. Check out the Enzotech MST-81. A mounting mechanism I came up with for these types of heatsinks is to take a PCI slot blanking plate, stick it between the fins, and then run a zip tie through the screw holes on either end of the VRM area, holding the PCI blanking plate down. If you can put a spring on the zip ties, that is even better to keep constant pressure on it.
> 
> I wouldn't worry too much about using the mid-plate on this card, as all it really cools is some of the VRAM and VRM2. If you can, that's cool, but if you can't, no big deal.
> 
> Edit: Those gold chokes are sexy.


thanks for all the advice! will take a look into it. looking to order some stuff before year's end.... and I'll probably go with the MST-81 as you suggested. The heatsink's dimensions closely match the VRM1 length and width.


----------



## bond32

Crap. Killed one of my 290's... RMA time. How's XFX's RMA system?

Basically, I am re-doing a lot in my room, going to be coming up with some other way to mount my computer other than this rack now that I have the 3 QNIX monitors. So I put all the air coolers back on until I figure it out. After about 24 hours of troubleshooting, one of them is def done...


----------



## jagdtigger

How? Did you push them over the edge in OC?


----------



## wermad

Recently, I've heard it's been pretty good. In the past, you got your replacement with man-love juices.


----------



## bond32

Quote:


> Originally Posted by *jagdtigger*
> 
> How? Did you push them over the edge in OC?


Doubtful... Haven't done any benching in a while. It's actually a driver issue, but I have reformatted and it still persists... And I have tried the card by itself and I get no display. If I install the one in question with the other three, I get a blue screen (atikmdag.sys) every time.


----------



## jagdtigger

Interesting. Well, good luck with the RMA.


----------



## black7hought

I ordered a Sapphire R9 290 Tri-X OC to replace my reference Sapphire R9 290. Has anyone else made the switch and noticed a significant difference with the fan noise?


----------



## wermad

I'm new to Hawaii, and compared to my MSI 280X, the Sapphire Tri-X is super quiet. I haven't gamed yet, but I'm taking a guess it should be hella better than the reference turbine. I ordered a second Sapphire and hope to have it by the end of next week.


----------



## hornedfrog86

Quote:


> Originally Posted by *black7hought*
> 
> I ordered a Sapphire R9 290 Tri-X OC to replace my reference Sapphire R9 290. Has anyone else made the switch and noticed a significant difference with the fan noise?


I just ordered one; my R9 290 is quiet.


----------



## black7hought

Quote:


> Originally Posted by *hornedfrog86*
> 
> I just ordered one; my R9 290 is quiet.


Which 290 did you order?

I currently have a reference 290, which is loud at 20% and only gets louder as I game and the fan has to work harder. I ordered the Tri-X to replace it after watching a couple of videos on the differences.


----------



## razaice

My Vapor-X is similar to a Tri-X and it's practically silent even when overclocked at full load.


----------



## black7hought

Thanks for the information razaice.


----------



## hornedfrog86

Quote:


> Originally Posted by *black7hought*
> 
> Which 290 did you order?
> 
> I currently have a reference 290, which is loud at 20% and only gets louder as I game and the fan has to work harder. I ordered the Tri-X to replace it after watching a couple of videos on the differences.


Sapphire R9 290X 4GB GDDR5 DUAL DVI-D/HDMI/DP TRI-X OC Version PCI-Express Graphics Card 11226-00-40G

My 290 is a Tri-X


----------



## black7hought

Alright, that is the same as mine except I ordered a 290 non-X.


----------



## chiknnwatrmln

Need some help...

Was using 14.11.2 Beta drivers without a problem, but Skyrim was not working properly so I rolled back to 14.9 WHQL to see if that would fix it.

Uninstalled 14.9 and reinstalled 14.11.2 (used DDU + manually removed reg entries, fresh download of drivers too) and after five clean reinstalls it's broken every time. CCC uses 10-15% CPU, Eyefinity isn't working properly, and every time I try to make an Eyefinity group the drop-down menu goes back up.

All this and I didn't even figure out the problem with Skyrim. What the hell, man, I've been uninstalling and reinstalling drivers for two hours now.

Okay after reinstalling three more times I have no idea what's wrong. I tried reinstalling from the exact file I originally had, but no dice.

I'm at a loss for words. I don't understand how uninstalling and then reinstalling is creating so many goddam issues. Dealing with graphics drivers is worse than getting teeth pulled.


----------



## rotorwash

How does one sign up? Like this? MSI R9 290 Twin Frozr; Stock; GPU-Z validation


----------



## pshootr

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Need some help...
> 
> Was using 14.11.2 Beta drivers without a problem, but Skyrim was not working properly so I rolled back to 14.9 WHQL to see if that would fix it.
> 
> Uninstalled 14.9 and reinstalled 14.11.2 (used DDU + manually removed reg entries, fresh download of drivers too) and after five clean reinstalls it's broken every time. CCC uses 10-15% CPU, Eyefinity isn't working properly, and every time I try to make an Eyefinity group the drop-down menu goes back up.
> 
> All this and I didn't even figure out the problem with Skyrim. What the hell, man, I've been uninstalling and reinstalling drivers for two hours now.
> 
> Okay after reinstalling three more times I have no idea what's wrong. I tried reinstalling from the exact file I originally had, but no dice.
> 
> I'm at a loss for words. I don't understand how uninstalling and then reinstalling is creating so many goddam issues. Dealing with graphics drivers is worse than getting teeth pulled.


Sorry to hear you're having such trouble. I have been through this headache myself. Have a look through this thread if you haven't already. There is a handy script from El_Capitan that makes cleaning files/registry entries much easier than doing it manually. After reading the thread and using the script mentioned, I got back on track. Good luck.

http://www.overclock.net/t/988215/how-to-remove-your-amd-ati-gpu-drivers/340#post_22584270
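For anyone curious what that kind of cleanup script actually does: this is not El_Capitan's script from the linked thread, just a minimal, hypothetical Python sketch of the first step, checking which folders AMD/ATI installers commonly leave behind after an uninstall. The folder paths are assumptions you should adjust for your own system, and the real script also removes files and registry entries, which this sketch deliberately does not.

```python
# Hypothetical sketch only -- NOT the script from the linked thread.
# It merely reports which common AMD/ATI leftover folders still exist
# after an uninstall; it deletes nothing.
import os

# Folders AMD/ATI installers commonly leave behind (assumed paths).
LEFTOVER_DIRS = [
    r"C:\AMD",
    r"C:\ATI",
    r"C:\Program Files\ATI",
    r"C:\Program Files\ATI Technologies",
    r"C:\Program Files (x86)\ATI Technologies",
]

def find_leftovers(dirs=LEFTOVER_DIRS):
    """Return the subset of candidate folders that actually exist on disk."""
    return [d for d in dirs if os.path.isdir(d)]

if __name__ == "__main__":
    leftovers = find_leftovers()
    if leftovers:
        for d in leftovers:
            print("leftover driver folder:", d)
    else:
        print("no known leftover folders found")
```

Running this before and after a DDU pass is a quick sanity check that the uninstall actually cleaned up; anything it still reports is a candidate for manual removal.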


----------



## Vici0us

Guys! Help me out! I hear Gigabyte has a common problem with their Windforce cards' fans rattling. Is there any way of fixing it without an aftermarket cooler? It's on and off and getting annoying. Thanks in advance.


----------



## tsm106

Quote:


> Originally Posted by *Vici0us*
> 
> Guys! Help me out! I hear Gigabyte has a common problem with their Windforce cards' fans rattling. Is there any way of fixing it without an aftermarket cooler? It's on and off and getting annoying. Thanks in advance.


Figure out where the rattle is, squeeze some rubber cement into it, wait for it to dry, win.


----------



## Vici0us

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vici0us*
> 
> Guys! Help me out! I hear Gigabyte has a common problem with their Windforce cards' fans rattling. Is there any way of fixing it without an aftermarket cooler? It's on and off and getting annoying. Thanks in advance.
> 
> 
> 
> Figure out where the rattle is, squeeze some rubber cement into it, wait for it to dry, win.
Click to expand...

I used Super Lube (synthetic oil); it always worked on fans. But the fan on my Windforce didn't even budge.
Also, what's the best Arctic cooler to use on a 290?


----------



## tsm106

That works for a lubrication problem, but some rattles are not fan-motor related; they're vibration.


----------



## Vici0us

Quote:


> Originally Posted by *Vici0us*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vici0us*
> 
> Guys! Help me out! I hear Gigabyte has a common problem with their Windforce cards' fans rattling. Is there any way of fixing it without an aftermarket cooler? It's on and off and getting annoying. Thanks in advance.
> 
> 
> 
> Figure out where the rattle is, squeeze some rubber cement into it, wait for it to dry, win.
> 
> Click to expand...
> 
> I used Super Lube (synthetic oil); it always worked on fans. But the fan on my Windforce didn't even budge.
Click to expand...

Quote:


> Originally Posted by *tsm106*
> 
> That works for a lubrication problem, but some rattles are not fan-motor related; they're vibration.


I'll give it a try, thanks for the help! Nice rig btw.


----------



## Paliosh

If you have a vibration problem with the fans on your Tri-X, here is my way to fix it; I even got the temperature down a couple of degrees.
First of all, make sure your problem with the Tri-X really is vibration: when you start hearing the noise, push the base of the cooler up a little bit. If the noise stops, this is your problem.
How to fix it:
First, tighten all of the screws (I mean every one you can see) on the video card with just a 1/4 turn.
Then you will need some thin piece of rubber; in my case I used the tip of this:
You are gonna need just a little bit, so cut off 0.5mm of it at maximum.


Then you put it right here, between the metal and the plastic:


And it's done.
In my case this stopped all of the noise produced by vibration, and in addition, when I tightened all of the screws I apparently got better contact between the cooler and the chip, which gave me better temperatures.


----------



## gertruude

im trying to find out what the highest voltage i can use on air for this card

thanks


----------



## Sgt Bilko

Quote:


> Originally Posted by *gertruude*
> 
> im trying to find out what the highest voltage i can use on air for this card
> 
> thanks


Before it overheats?

Not much, better off going for something different gertie


----------



## gertruude

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Before it overheats?
> 
> Not much, better off going for something different gertie


bit late lol bought it yesterday


----------



## LandonAaron

Quote:


> Originally Posted by *gertruude*
> 
> bit late lol bought it yesterday


It seems the DirectCU II version has had some issues specific to it, but I don't remember what they are. I think you should be okay though. The DirectCU II has an open-air cooler, so you can probably run the fans at 100% without too much noise, and thus you should have some headroom in your temps for adding voltage.


----------



## tsm106

Quote:


> Originally Posted by *gertruude*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Before it overheats?
> 
> Not much, better off going for something different gertie
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> bit late lol bought it yesterday

Refuse the shipment stat.


----------



## sf101

Odd. I usually have issues straight away if I add too many initial volts.
Quote:


> Originally Posted by *LandonAaron*
> 
> My advice on buying from FrozenCPU is to wait until you have your order completely picked out. They tend to discount shipping pretty heavily for combined orders vs. what you would end up paying in shipping if you had it all on separate orders. IMO their shipping is a little overpriced, but they have very quick service, and you won't find a better selection. I think you are ordering some copper VRAM and VRM heatsinks. I have some of the copper enzotech's and they are nice, just be sure you get quality thermal transfer tape, of appropriate thickness. There isn't alot of surface area on VRM and thermal transfer tape alone probably isn't going to give you a good mount. I would suggest the Gelid Icy Vision pack for the VRM. Its aluminum instead of copper, but the high mounting pressure of using screws instead of just tape will give you better thermal transfer and lower temps IMO. On the VRAM side of things any heatsink will really do, and thus I wouldn't suggest copper, as they are heavier and harder to keep attached. Thermal tape even the stickiest has a hard time holding copper heatsinks. Remeber these are going to be mounted upside down once you get the card installed. I would just use some small alluminum ones instead. Finnally don't use an epoxy for attaching the heatsink, they will rip off the VRM and VRAM chips when you try to remove the heatsink. Epoxy is really that permanent. The best thing I have found for attaching copper heatsinks is thermal adhesive compound that Arctic Cooling makes. They only seem to sale it packaged with their coolers. Though they have a spare parts page, and I think you can order it from there now. I wrote to their customer service once, and they shipped me some for free. The stuff is really cool it comes out like a liquid but drys into hard rubbery like substance which has an awesome hold. And it cleans off very easily. 
I have been using a combination of Water and Air with on my last 3 cards so this is just what I have found in that time. Finally my number one suggestion if you are considering using something like the Kraken G10 or a universal GPU block, is to buy a card like the MSI series with the Twin Frozen coolers. These cards come with a base plate already installed that completely covers all the VRM and VRAM componenets. Meaning you don't have to mess with a bunch of small individual heatsinks for each VRAM chip. The base plates work very well as long as there is no flex in the card. If the card is flexed the base plate may pull away from the card on one side.
> 
> Anyway sorry for the wall of text. I think universal coolers are they way to go. I upgrade my GPU way too often to sink a bunch of money into full cover blocks.


This guy seriously needs to fing learn to use punctuation, sentence spacing, commas, periods and paragraphs..

I can not make it through this wall of text.


----------



## Mega Man

i refuse to use punctuation unless my phone adds it for me i also NEVER capitalize unless i want emphasis again, unless my phone does it for me


----------



## rdr09

keep an eye out for used full waterblocks. got mine for $60 shipped.


----------



## Arizonian

Boy I missed a lot in the last two days I've been away. Thanks for being patient with me everyone.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> The only clubbin' I'm doin' on Friday night, is 290 clubbin'
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I just noticed the date on this said 11/14... that's funny because I blew my PSU up last Friday, and just got my system back up and running tonight.... that means my battery is dead as hell....
> 
> YOU KNOW IT'S TIME FOR A CPU/MOBO UPGRADE WHEN YOUR BATTERY IS DEAD!!! ROFL


Congrats - added









Quote:


> Originally Posted by *laitoukid*
> 
> 1. http://www.techpowerup.com/gpuz/details.php?id=mnkg7
> 2. XFX
> 3. Aftermarket


Congrats - added









Quote:


> Originally Posted by *rotorwash*
> 
> How does one sign up? Like this? MSI R9 290 Twin Frozr; Stock; GPU-Z validation


Perfect - Congrats - added








Quote:


> Originally Posted by *wermad*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Quote:


> Originally Posted by *hornedfrog86*
> 
> I just ordered one, my R9 290 is quiet.


Quote:


> Originally Posted by *gertruude*
> 
> bit late lol bought it yesterday


Add a submission, gentlemen, and I'll add you to the club; the same goes for anyone else who's purchased recently but hasn't joined. Please do so.









Good times: with these prices, a lot of people are now able to get in on a great GPU.


----------



## DividebyZERO

Wow, so I was toying around with one of the reference coolers that I removed. I cannot believe how soft the copper plate is!! I tapped it with a screwdriver and it dented accidentally. I literally made another dent just by tapping it with my knuckle. Anyway, I am going to modify one with an aftermarket cooler, so I removed the GPU heatsink from the ref cooler. It was probably not the best way to remove it: I used a screwdriver and a hammer, lightly tapped it in places, and finally the solder/glue detached. This would probably be easier if it was heated enough to loosen the solder/glue first. I did it without heat.

pics:



The cooler possibly to be added soon (one of my ideas):


----------



## Nwanko

Add me pls.

Gigabyte R9 290X Windforce 3x OC - Stock Cooling



GPU-Z validation link : http://www.techpowerup.com/gpuz/details.php?id=uu34e


----------



## bond32

Spent a good bit of time this weekend rebuilding my rig. Built a few small brackets as well for radiators and put the gpu blocks back on the three cards I have and got the fujipoly pads (k=17) on:




Dem fuji pads... Wow. I had no idea how effective they really are. Had enough to do all the VRM1's and 2 VRM2. Load temps on the VRM1 topped 31 C @ 1.32 volts on pt1 bios... What the heck. Dang that's low.


----------



## Agent Smith1984

Well, I see somebody DOES NOT have kids..... LMAO


----------



## bond32

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, I see somebody DOES NOT have kids..... LMAO


Ha no... No kids, no pets


----------



## Agent Smith1984

Right.... cause there would be fur and fingers laying inside that thing!!!

Hope the locals know that those strange light flickerings they see at night aren't ghosts.... it's you playing BF4!!!


----------



## bond32

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Right.... cause there would be fur and fingers laying inside that thing!!!
> 
> Hope the locals know that those strange light flickerings they see at night aren't ghosts.... it's you playing BF4!!!


Do you live near me?? Lol. Yes actually the lights do flicker, especially when all 4 gpu's are full tilt. Roommate gets a kick out of it too. Not actually sure why the lights flicker yet either, the lights are on a separate circuit.


----------



## battleaxe

Quote:


> Originally Posted by *bond32*
> 
> Spent a good bit of time this weekend rebuilding my rig. Built a few small brackets as well for radiators and put the gpu blocks back on the three cards I have and got the fujipoly pads (k=17) on:
> 
> 
> 
> 
> Dem fuji pads... Wow. I had no idea how effective they really are. Had enough to do all the VRM1's and 2 VRM2. Load temps on the VRM1 topped 31 C @ 1.32 volts on pt1 bios... What the heck. Dang that's low.


Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, I see somebody DOES NOT have kids..... LMAO


Kids learn pretty fast. They would only stick their fingers in those fans once at most. My boy maybe twice, but even he would draw the line at that. I always say.. let em' learn the way we did. Go ahead son, stick your finger in there. See what happens. Isn't that the way we all learned? Safety is so over the top these days. We've raised a bunch of sissies. lol

Just kidding... well not really...


----------



## bond32

Quote:


> Originally Posted by *battleaxe*
> 
> Kids learn pretty fast. They would only stick their fingers in those fans once at most. My boy maybe twice, but even he would draw the line at that. I always say.. let em' learn the way we did. Go ahead son, stick your finger in there. See what happens. Isn't that the way we all learned? Safety is so over the top these days. We've raised a bunch of sissies. lol
> 
> Just kidding... well not really...


No way man... Wouldn't let anyone near that close to my AP-15's those things were over $20 each!


----------



## Agent Smith1984

I think I would spend as much time singing black sabbath into those fans as I would playing games....

** Ozzy voice "IIII aaaammmmm IIiirrrrroooonn Maaaaaaaannn"

ROFL

Nice f'n rig though bro, seriously.... I am surprised it doesn't blow itself around the room, haha


----------



## battleaxe

Quote:


> Originally Posted by *bond32*
> 
> No way man... Wouldn't let anyone near that close to my AP-15's those things were over $20 each!


That's why you make the kids pay for what they break. I have 5 kids. They break all kinds of crap. Yeah, stuff like dad's PC stuff gets paid for by whoever broke it. There are no accidents, only carelessness.









Yes I was in the military.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> That's why you make the kids pay for what they break. I have 5 kids. They break all kinds of crap. Yeah, stuff like dads PC stuff is paid for by whomever broke it. There are no accidents. Only carelessness.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes I was in the military.


ROFL,

I have 3 kids and one on the way, none of them are old enough to pay for anything they break yet..... they'd never be able to afford to leave home if I put it on a tab though..


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> ROFL,
> 
> I have 3 kids and one on the way, none of them are old enough to pay for anything they break yet..... they'd never be able to afford to leave home if I put it on a tab though..


LOL... yeah, it was sorta a joke. But my little boy is six and man can he save some money. He stashes quarters all over the house and he's got $100+ saved up. He's so funny; his sisters either break things and have to pay for them or just spend their money on dumb crap... He knows well enough not to touch dad's computers though.









Mostly he just takes his own toys apart and breaks all his stuff. Kid is pretty hilarious.


----------



## LandonAaron

Quote:


> Originally Posted by *bond32*
> 
> Spent a good bit of time this weekend rebuilding my rig. Built a few small brackets as well for radiators and put the gpu blocks back on the three cards I have and got the fujipoly pads (k=17) on:
> 
> 
> 
> 
> Dem fuji pads... Wow. I had no idea how effective they really are. Had enough to do all the VRM1's and 2 VRM2. Load temps on the VRM1 topped 31 C @ 1.32 volts on pt1 bios... What the heck. Dang that's low.


It took me a minute to figure out what I was looking at here. I have never seen radiators that thick before. I would have to put grills on those fans, not for the kids but for myself. I always end up getting my fingers in the fans somehow. Got a pretty bad cut from Delta once. I will hardly handle a spinning fan anymore. I am pretty clumsy though. I'm sure my knee or my ankle would end up in there somehow.


----------



## Agent Smith1984

Is any of your hardware saturating the cooling ability past ambient room temps? I don't see how they could... lol


----------



## LandonAaron

It seems like crossfiring an R9 290X with an R9 290 is pretty popular in this thread. Why is that? My understanding of Crossfire and SLI is that you are basically going to run at the slower of the two cards' speeds, which kind of makes getting the 290X a waste if you're pairing it with a 290.

Is it so that you can have a single R9 290X for games that either don't scale well in Crossfire or that have problems with Crossfire, while also being able to crossfire in games that do scale well, without having to shell out the extra cash for a whole other 290X?


----------



## bond32

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Is any of your hardware saturating the cooling ability past ambient room temps? I don't see how they could... lol


No not possible... Ambient in my apartment is about 20-21 C (69-70 F). Cannot cool it lower than that without the help of other means (refrigeration cycle, ice bath, etc...).
Quote:


> Originally Posted by *LandonAaron*
> 
> It seems like Crossfiring R9 290x with a R9 290 is pretty popular in this thread. Why is that? My understanding of Crossfire and SLI, is that you are basically going to run at the slower of the two card's speed, which kind of make getting the 290x a waste if your pairing it with a 290.
> 
> Is it so that you can have a single R9 290x for games that either don't scale well in Crossfire or that have problems with crossfire, while also being able to crossfire in games that do scale well, without having to shell out the extra cash for a whole other 290x?


You're talking about a 20 MHz difference in what they would run at, assuming one were to leave a 290 and a 290X at stock clocks. I for one use the pt1 bios, which holds the stock frequency at 1000 on the core. The reason it is popular, my guess, would simply be getting whichever was cheapest. I bought a 290X the day it was released, then the prices skyrocketed. Earlier in the year I found someone willing to part with two 290's, so I jumped on it, but honestly the difference is almost negligible, especially when you can adjust the clocks.

I don't understand your last part... Not getting a second card just because you think the scaling is poor is a poor decision in itself. The second card will give you about 70-90% more performance in almost all situations. Saying "I play xx game, it doesn't support CF" is a weak argument as well, because if it doesn't support it now, it's almost guaranteed to soon. If you can get more cards for relatively cheap, go for it (if you have the power supply to handle the load).
Quote:


> Originally Posted by *LandonAaron*
> 
> It took me a minute to figure out what I was looking at here. I have never seen radiators that thick before. I would have to put grills on those fans, not for the kids but for myself. I always end up getting my fingers in the fans somehow. Got a pretty bad cut from Delta once. I will hardly handle a spinning fan anymore. I am pretty clumsy though. I'm sure my knee or my ankle would end up in there somehow.


It's a Monsta 240 and 360 (80mm thick) and an RX360, all in push/pull. Interestingly enough they keep my 3 heavily OC'ed cards quite cool; I honestly would have thought that wouldn't be enough rad space. I also have a 4790K @ 5 GHz to keep cool.


----------



## LandonAaron

Quote:


> Originally Posted by *bond32*
> 
> I don't understand your last part... Not getting a second card just because you think the scaling is poor is a poor decision in itself. The second card will give you about 70-90% more performance in almost all situations. Saying "I play xx game, it doesn't support CF" is a weak argument as well because if it does't support it now, it's almost guaranteed to soon. If you can get more cards for relatively cheap, go for it (if you have the power supply to handle the load).


What I was saying was that in Crossfire a 290 and a 290X are basically going to operate as two 290's, so why do people bother getting a 290X? Then I was supposing that maybe they got the 290X for the better single-card performance in games that might have problems with Crossfire or whatever. I wasn't trying to make a case against Crossfire.

I currently have a 290X, and chronic upgrade-itis. I was kind of toying with the idea of selling it and getting a GTX 980, but the price-per-performance ratio isn't that convincing to me. I usually just go single card (the last time I crossfired or SLI'd anything was the GTX 260), but I am considering Crossfire and am trying to figure out if I should save my money and get a 290 or spend extra and get another 290X.


----------



## cephelix

Quote:


> Originally Posted by *bond32*
> 
> Spent a good bit of time this weekend rebuilding my rig. Built a few small brackets as well for radiators and put the gpu blocks back on the three cards I have and got the fujipoly pads (k=17) on:
> 
> 
> 
> 
> Dem fuji pads... Wow. I had no idea how effective they really are. Had enough to do all the VRM1's and 2 VRM2. Load temps on the VRM1 topped 31 C @ 1.32 volts on pt1 bios... What the heck. Dang that's low.


dude, is that a computer on your fan rack??


----------



## bond32

Quote:


> Originally Posted by *cephelix*
> 
> dude,is that a computer on your fan rack??


yep


----------



## chiknnwatrmln

Quote:


> Originally Posted by *LandonAaron*
> 
> It seems like Crossfiring R9 290x with a R9 290 is pretty popular in this thread. Why is that? My understanding of Crossfire and SLI, is that you are basically going to run at the slower of the two card's speed, which kind of make getting the 290x a waste if your pairing it with a 290.
> 
> Is it so that you can have a single R9 290x for games that either don't scale well in Crossfire or that have problems with crossfire, while also being able to crossfire in games that do scale well, without having to shell out the extra cash for a whole other 290x?


Nope, with CF they both run at independent speeds. In games that are optimized and actually utilize my hardware I have a constant 100% GPU usage on both my 290 and 290x at the same clocks, using 2560 and 2816 cores respectively.

The performance difference between a 290 and 290x is almost nonexistent anyway, a good OC'ing 290 will beat an average 290x.
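To put rough numbers on that (the clocks below are illustrative examples, not measured cards), a back-of-the-envelope sketch shows how an overclocked 290 can match a reference-clocked 290X in raw shader throughput:

```python
# Rough shader throughput for Hawaii cards: 2 FLOPs per stream
# processor per cycle (fused multiply-add), clock in MHz -> GFLOPS.
def shader_gflops(stream_processors, clock_mhz):
    return 2 * stream_processors * clock_mhz / 1000

# R9 290 (2560 SPs) overclocked to 1100 MHz vs. an R9 290X
# (2816 SPs) at a 290X-style 1000 MHz: identical raw throughput.
print(shader_gflops(2560, 1100))  # 5632.0 GFLOPS
print(shader_gflops(2816, 1000))  # 5632.0 GFLOPS
```

The 290X's extra 10% of shaders is worth only about 100 MHz of core clock, which is why a good-clocking 290 closes the gap so easily.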


----------



## Sgt Bilko

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Quote:
> 
> 
> 
> Originally Posted by *LandonAaron*
> 
> It seems like Crossfiring R9 290x with a R9 290 is pretty popular in this thread. Why is that? My understanding of Crossfire and SLI, is that you are basically going to run at the slower of the two card's speed, which kind of make getting the 290x a waste if your pairing it with a 290.
> 
> Is it so that you can have a single R9 290x for games that either don't scale well in Crossfire or that have problems with crossfire, while also being able to crossfire in games that do scale well, without having to shell out the extra cash for a whole other 290x?
> 
> 
> 
> Nope, with CF they both run at independent speeds. In games that are optimized and actually utilize my hardware I have a constant 100% GPU usage on both my 290 and 290x at the same clocks, using 2560 and 2816 cores respectively.
> 
> The performance difference between a 290 and 290x is almost nonexistent anyway, a good OC'ing 290 will beat an average 290x.

^That


----------



## Roboyto

Quote:


> Originally Posted by *bond32*
> 
> Spent a good bit of time this weekend rebuilding my rig. Built a few small brackets as well for radiators and put the gpu blocks back on the three cards I have and got the fujipoly pads (k=17) on:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Dem fuji pads... Wow. I had no idea how effective they really are. Had enough to do all the VRM1's and 2 VRM2. Load temps on the VRM1 topped 31 C @ 1.32 volts on pt1 bios... What the heck. Dang that's low.


Dem Fujis do w-o-r-k









Nice use of a simple metal rack there, I love it.









What's your clocks, volts, and load temps overall? CPU/GPUs


----------



## USlatin

If I were to hook up 3 monitors to an MSI R9 280X, should I get two Mini DisplayPort to DVI cables if two of them are DVI-only monitors? And then hook up the main monitor to the DVI or HDMI?

What's the best way?

They will most likely be three 1920x1200 monitors, one of them for video color-correction work.


----------



## bond32

Quote:


> Originally Posted by *Roboyto*
> 
> Dem Fujis do w-o-r-k
> 
> 
> 
> 
> 
> 
> 
> 
> Nice use of a simple metal rack there, I love it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What's your clocks, volts, and load temps overall? CPU/GPUs


Not doing any benching at the moment, but playing BF4 at +100mv, 1200/1320, the VRM1 temps maxed out at 32 C. My ambient is pretty low tonight, but still, that is pretty dang good. One of the GPU core temps hit 39 C; the rest were under that. CPU temps are in the 70's, sometimes with a spike to 80.

I put a few more fans on the top to cool the board components and get some airflow across the top of the GPUs. Should also help.


----------



## Arizonian

Quote:


> Originally Posted by *Nwanko*
> 
> Add me pls.
> 
> Gigabyte R9 290X Windforce 3x OC - Stock Cooling
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> GPU-Z validation link : http://www.techpowerup.com/gpuz/details.php?id=uu34e


Congrats - added


----------



## Red1776

If you are having problems with the standard 290/290X heatsinks not being sufficient, or not leaving you any OC headroom, this could be the answer for you.


----------



## Gobigorgohome

I have read somewhere that the Fiji XT board is coming sometime early next year; when is the best time to sell? I want to get rid of at least two of my four cards.

All four have EK-FC R9 290X Acetal+Nickel blocks, black backplates and Fujipoly Ultra Extreme pads, and they are not used much (3 weeks at most, they've only been sitting in my rig), never clocked higher than 1100/1300 (just for benching); I don't know what they can do on pt1. They are all Sapphire Radeon R9 290X and have all the original coolers, boxes and so on. What are the used prices for something like this?
The two cards I may want to sell both have Elpida memory.

I am not using my computer that much, so I am thinking: why not sell two of them and keep two?


----------



## battleaxe

I own a decently clocking R9 290. It can do 1170 on the core. The memory is not so hot though, only 1350 MHz max.

I just bought a new Gigabyte G1 970.

So I can compare the two. I'm not going to run a bunch of benches, review sites have done that. I did however run Valley and play some games so I can tell you how they fare on those.

On Valley the two produced almost identical scores. The G1 970 scored 2723 with everything on Extreme at 1080p. The 290 scored 2683 at the same settings. So almost identical scores; the difference is negligible.

In games I cannot tell the difference between the two cards. Both are very smooth and play great. Frames seem to be about the same on BF3 and BF4 as I get on either card. At least by a margin of error.

I could have bought another 290 for about $100 less than I paid for the G1 970. This probably would have been the smarter deal. I already have plenty of power and PSU's laying around here. I already have plenty of water cooling units too. But I wanted to try the G1 and figured selling my GTX670's was a good way to pay for it. My 670's have already sold on flea bay, so I guess I was right.

The G1 is a great card. The 290 is a great card. Both are excellent products. There are bonuses to the Nvidia, like a bit lower power draw, running cooler and quieter, etc... but then the R9 is a lot cheaper too. So I would say the 290 is still the card to beat for the money. That being said, I'm glad I bought the G1. And I'm glad I own the 290. Such a terrible time to be in this hobby right now.









I think I'll keep them both. And in a little while I'll probably get another of each too just for fun. Then finally when 4k becomes a bit more stable, with higher frames and such a new monitor will be on the horizon. But for now, these are some great, fantastic cards we have available right now.

Gotta love competition!!!


----------



## pdasterly

Had to RMA my Sapphire card; they are sending a Vapor-X as a replacement. Is the Vapor-X board a reference design?
I don't think my water block will fit?


----------



## BLOWNCO

Quote:


> Originally Posted by *pdasterly*
> 
> Had to rma sapphire card, they are sending vapor x as replacement. Is vapor-x board a reference design?
> I dont think my water block will fit?


Nope, the Vapor-X card uses a custom PCB.

Edit: if it's the 290X Vapor-X version, EK makes a block for it; other than that you're SOL.


----------



## pdasterly

They said the reference card is unavailable


----------



## rotorwash

@bond32
Quote:


> Also have a 4790k @ 5 ghz to keep cool.


How do you do that? My 4790K chokes @ 4.6GHz :-| Nice rig though, very creative, chuckle.


----------



## pdasterly

Will the Tri-X version work?
I asked for a 295X2 and they said no.


----------



## arrow0309

Quote:


> Originally Posted by *pdasterly*
> 
> Will tri-x version work?
> I asked for 295x2 and they said no


Get the Tri-X, will work fine


----------



## heroxoot

Quote:


> Originally Posted by *Red1776*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> If you are having problems with the standard 290/290x heatsinks not being sufficient, or leaving you room for OC head room. This could be the answer for you.
> 


You're saying some of the cards have no heat pads under them? I know mine do thank god.


----------



## black7hought

My 290 Tri-X OC arrived today. I'm moving to it from my reference 290, if that could be updated on the list.


Spoiler: Warning: Spoiler!


----------



## Red1776

Quote:


> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> If you are having problems with the standard 290/290x heatsinks not being sufficient, or leaving you room for OC head room. This could be the answer for you.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You're saying some of the cards have no heat pads under them? I know mine do thank god.

No hex.

I benched the MSI Gaming edition quadfire with the Twin Frozr as part of the AMD HPP project. The next step was changing to watercooling.







They performed very well even under quadfire conditions; the warmest they got was 78C (a single card topped out at 63C). Moving on to watercooling, I no longer need them, and I thought that folks who have the very warm reference coolers could benefit from the superior Twin Frozr coolers. Thus, I am putting them on the market. All of them are equipped with thermal pads, and new thermal pads and TIM are included in the price with each Twin Frozr.

Hope that clears things up.


----------



## battleaxe

Quote:


> Originally Posted by *Red1776*
> 
> No hex.
> I benched the MSI game edition quadfire with the twin Frozr as part of the AMD HPP project. The next step was changing to watercooling
> 
> 
> 
> 
> 
> 
> 
> 
> They performed very well even under quadfire conditions. the warmest they got was 78c ( single card topped out at 63C) Moving on to watercooling I no longer need them and thought that folks who have the very warm ref coolers could benefit from the superior twin frozr coolers. Thus, I am putting them on the market.
> Hope that clears things up


So, you're saying you no longer need the air coolers? Hmmm..... I wonder what this could mean???

Hmmmm.....


----------



## heroxoot

Ahh, very good. There was a lack of context, I suppose. The Twin Frozr is a pretty banging aftermarket cooler. I love mine.


----------



## Red1776

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> No hex.
> I benched the MSI game edition quadfire with the twin Frozr as part of the AMD HPP project. The next step was changing to watercooling
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> They performed very well even under quadfire conditions. the warmest they got was 78c ( single card topped out at 63C) Moving on to watercooling I no longer need them and thought that folks who have the very warm ref coolers could benefit from the superior twin frozr coolers. Thus, I am putting them on the market.
> Hope that clears things up
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So, you're saying you no longer need the air coolers? Hmmm..... I wonder what this could mean???
> 
> Hmmmm.....

It means they are under water with EK-FC R9 290X Rev 2.0 blocks, 5 rads, 2 D5's and a load temp of 39C.


----------



## battleaxe

Quote:


> Originally Posted by *Red1776*
> 
> It means they are under water with EK FC R290X Rev 2.0 with 5 rads ,2 D5's and a load temp of 39C


Yeah, of course. That's what I meant. LOL


----------



## Red1776

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> It means they are under water with EK FC R290X Rev 2.0 with 5 rads ,2 D5's and a load temp of 39C
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, of course. That's what I meant. LOL

 I know....it just gave me the chance to say 39c load ...hehehe


----------



## ZealotKi11er

I am having a stuttering problem with Crysis 3. If any of you have this problem, please let me know. I need to find a solution.


----------



## battleaxe

Quote:


> Originally Posted by *Red1776*
> 
> I know....it just gave me the chance to say 39c load ...hehehe


It's always a good thing when you can use a sentence with "load" in it.


----------



## pdasterly

http://slickdeals.net/f/7422576-xfx-radeon-r9-295x2-video-card-656-50-ar-free-games-at-newegg


----------



## kizwan

Quote:


> Originally Posted by *Red1776*
> 
> Quote:
> 
> 
> 
> Originally Posted by *heroxoot*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Red1776*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> If you are having problems with the standard 290/290x heatsinks not being sufficient, or leaving you room for OC head room. This could be the answer for you.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You're saying some of the cards have no heat pads under them? I know mine do thank god.
> 
> 
> No hex.
> I benched the MSI game edition quadfire with the twin Frozr as part of the AMD HPP project. The next step was changing to watercooling
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> They performed very well even under quadfire conditions. The warmest they got was 78C (a single card topped out at 63C). Moving on to watercooling I no longer need them, and thought that folks who have the very warm reference coolers could benefit from the superior Twin Frozr coolers. Thus, I am putting them on the market. All of them are equipped with thermal pads, and new thermal pads and TIM are included in the price with each Twin Frozr.
> Hope that clears things up

They say "a picture is worth a thousand words," but a thousand words can cause confusion. I'm glad you cleared things up.


----------



## LandonAaron

Quote:


> Originally Posted by *Red1776*
> 
> If the standard 290/290X heatsinks aren't sufficient, or aren't leaving you any OC headroom, this could be the answer for you.


Do those coolers fit the reference PCB? I was under the impression that the MSI Gaming Edition cards had a custom design.


----------



## Agent Smith1984

Quote:


> Originally Posted by *pdasterly*
> 
> http://slickdeals.net/f/7422576-xfx-radeon-r9-295x2-video-card-656-50-ar-free-games-at-newegg


DAAYUUMMM

That's a sick deal....

$660 for 295??? You get crossfire on one board, you get watercooling, and you get games! That's awesome!


----------



## LandonAaron

So on a dual card like that can you use all 8GB of Vram or is it like traditional crossfire where you can only use 4GB?


----------



## Agent Smith1984

Quote:


> Originally Posted by *LandonAaron*
> 
> So on a dual card like that can you use all 8GB of Vram or is it like traditional crossfire where you can only use 4GB?


Sadly they dupe you, and there is only 4GB of usable VRAM...

Although, I would suggest anyone getting VRAM lag (due to not having enough) just turn your pagefile off and let the RAM handle everything (provided you have 16GB of RAM).

Also, if you have issues with turning off the pagefile, you can create a 4GB RAMDISK and put your pagefile on that....
It won't work as fast as GDDR5, but it's certainly faster than hard drive swaps.


----------



## miraldo

Hello

One simple question: is the Seasonic S12 II 620 a good choice for running an R9 290 Tri-X, i5 2500K (stock), and 8GB Crucial Ballistix Tactical?


----------



## madmanmarz

Is there a way to make it so Afterburner gets a voltage bump automatically on startup? I am currently getting about 1.2V with the slider maxed out, and I have read about making the .bat.


----------



## Sgt Bilko

Quote:


> Originally Posted by *madmanmarz*
> 
> Is there a way to make it so Afterburner gets a voltage bump automatically on startup? I am currently getting about 1.2V with the slider maxed out, and I have read about making the .bat.


"Apply overclock on startup"

Should be an option next to the profiles and set AB to start with windows etc.

Only use it if your overclock is solid though


----------



## BLOWNCO

finally got all 3 of them in and running


----------



## LandonAaron

Quote:


> Originally Posted by *Sgt Bilko*
> 
> "Apply overclock on startup"
> 
> Should be an option next to the profiles and set AB to start with windows etc.
> 
> Only use it if your overclock is solid though


I still wouldn't recommend applying the overclock on start-up. I can overclock my card to where its completely stable for gaming and benchmarks, but then when I put the computer to sleep and try to wake it back up I get a black screen. I then have to reset the computer, which makes it try to resume to windows again. Then reset the computer one more time, to get the option to delete restoration data and boot normally, which will cancel the overclock and allow computer to work. I wouldn't want to be in a situation where the only way to get the computer to boot without black screen would be to safe boot.


----------



## madmanmarz

Yeah but what about having to use the .bat file to get the extra 100 or 200mv? Will that extra voltage bump automatically apply or do I have to run the .bat file each time I start up?
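For what it's worth, the .bat approach people describe usually amounts to just launching Afterburner with one of its profile switches at login, so the saved profile (including the voltage bump) gets re-applied. A minimal sketch, assuming a default install path and that profile slot 1 holds the overvolted settings (both the path and the slot number are assumptions here, not anything confirmed in this thread):

```bat
@echo off
REM Hypothetical startup script: re-apply a saved Afterburner profile at login.
REM Put a shortcut to this .bat in shell:startup, or run it via Task Scheduler.

REM Default install path -- adjust if Afterburner lives elsewhere on your system.
set AB="C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe"

REM -Profile1 applies saved profile slot 1 (the one holding the voltage bump).
%AB% -Profile1
```

As with the "apply overclock on startup" option, only wire this up once the profile is known rock solid, or a bad boot loop gets annoying fast.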


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am having stutter problem with Crysis 3. If any of you has this problem please let me know. Need to find a solution.


what are you using to record? also, where are you recording it to? if in the same place the os is installed, then that could be it. just finished playing using 2 290s, too, and it was very smooth. using 4K. but even when i played using 1080 same thing. latest beta, btw.


----------



## aaroc

2x R9 290X are faster than a single R9 295? Do they use more or less energy? At current prices, I can sell my 4x R9 290X's and buy two R9 295's and have some money left.

OOOOO what to do..... I can sell them and buy new XFX R9 290X's and the same thing....


----------



## Sgt Bilko

Quote:


> Originally Posted by *aaroc*
> 
> 2x R9 290X are faster than a single R9 295? Do they use more or less energy? At current prices, I can sell my 4x R9 290X's and buy two R9 295's and have some money left.
> 
> OOOOO what to do..... I can sell them and buy new XFX R9 290X's and the same thing....


Two R9 290X's are about the same as an R9 295x2 performance-wise, but the 295x2 consumes less power.
The disadvantage is you cannot disable Crossfire, so if you get two R9 295x2's you will either be running Crossfire or Quadfire.

You can change Catalyst profiles so that only one core is used in an application, but that's a per-game profile thing.

Having had CF R9 290's and a 295x2, I can tell you that the 295x2 is a beast of a card and performs awesome, but the downside of not being able to disable Crossfire with the click of a button is a small pain


----------



## xer0h0ur

If I could do it all over again, I would have dual waterblocked 8GB 290X's instead of this 295X2 since I stopped running tri-fire. Tri-fire is a mess that only invites headache after headache.


----------



## Regnitto

My VisionTek R9 290 ships out friday!!!


----------



## Red1776

Quote:


> Originally Posted by *xer0h0ur*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bertovzki*
> 
> Can anyone tell me , Does my new Sapphire R9 290X Tri-X OC have Hynix Vram ? or how do I find out ? , I'm not too concerned as probably not over clocking CPU or GPU , but will if I need to one day
> 
> 
> 
> GPU-Z

 It does have the Hynix modules


----------



## arrow0309

I agree, my 290 Tri-X has those Hynix too.
It can do 1500 and more daily, however I'm keeping it at 1475 only (1135/1475, still on air).
It's already drawing a lot of power juice in ACU, for instance.

Can't wait to receive the new EK waterblock (bought from eBay, I already have the backplate) and finally go under water with my whole system.

Can you guys confirm the EK-FC R9-290X (copper plexi) Rev 1.0 is 100% compatible with my Tri-X?


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> what are you using to record? also, where are you recording it to? if in the same place the os is installed, then that could be it. just finished playing using 2 290s, too, and it was very smooth. using 4K. but even when i played using 1080 same thing. latest beta, btw.


Same drive, but it's an SSD so it's more than fast enough. Happens even when not recording. I might have to do a fresh Windows install.


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Same drive, but it's an SSD so it's more than fast enough. Happens even when not recording. I might have to do a fresh Windows install.


have you tried repairing the game?

i recommend making an image of the ssd once fixed to a HDD. that way, in the future if something goes wrong, you don't need to reinstall everything.


----------



## Chopper1591

Count me in on the fun:

Sapphire r9 290 Tri-x OC.
Validation.

1150/1300 +0mV +50 powerlimit


Spoiler: Warning: Spoiler!







1200/1500 +100mV +50 powerlimit


Spoiler: Warning: Spoiler!







What do you guys advice as a safe daily overclock? Increased voltage wise.
And temps? I made a fan profile that keeps the gpu around 80c while gaming and vrm1 around 88-90c.
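As an aside on fan profiles in general: a custom curve like that is just piecewise-linear interpolation between (temp, fan%) points, which is easy to sanity-check outside Afterburner. A small sketch with made-up curve points (illustrative only, not Chopper1591's actual profile):

```shell
#!/bin/sh
# Piecewise-linear fan curve: GPU temp (C) in, fan duty (%) out.
# The curve points below are invented for illustration.
fan_for_temp() {
  awk -v t="$1" 'BEGIN {
    n = split("40:20 55:35 70:55 80:75 90:100", p, " ")
    for (i = 1; i <= n; i++) { split(p[i], kv, ":"); T[i] = kv[1]; F[i] = kv[2] }
    if (t <= T[1]) { print F[1]; exit }   # below the curve: floor speed
    if (t >= T[n]) { print F[n]; exit }   # above the curve: pin to max
    for (i = 1; i < n; i++)
      if (t <= T[i+1]) {                  # interpolate between the two neighbours
        printf "%d\n", F[i] + (F[i+1] - F[i]) * (t - T[i]) / (T[i+1] - T[i])
        exit
      }
  }'
}

fan_for_temp 75
```

With these points, 75C lands halfway between the 70C and 80C entries so the fan sits at 65%, and anything past 90C pins to 100%.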


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> have you tried repairing the game?
> 
> i recommend making an image of the ssd once fixed to a HDD. that way, in the future if something goes wrong, you don't need to reinstall everything.


How does that work? Never done that before.

I did repair the game. That was the first thing I did.


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> How does that work? Never done that before?
> 
> I did repair the game. That was the first thing i did.


what imaging or cloning?

i bork my os once oc'ing my cpu too far, so to avoid reinstalling i use the free version . . .

http://www.partition-tool.com/easeus-partition-manager/manual.htm

just go to wizard tab and choose clone. just take caution which drive to pick as source and which is for destination. source, of course, is the one with the os. shutdown, unplug ssd, go to bios and set hdd as primary (may be automatic), boot and test. if it works, then set it aside. i keep both imaged hdds inside my sig. all my PCs have imaged drives even laptops. less downtime.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> what imaging or cloning?
> 
> i bork my os once oc'ing my cpu too far, so to avoid reinstalling i use the free version . . .
> 
> http://www.partition-tool.com/easeus-partition-manager/manual.htm
> 
> just go to wizard tab and choose clone. just take caution which drive to pick as source and which is for destination. source, of course, is the one with the os. shutdown, unplug ssd, go to bios and set hdd as primary (may be automatic), boot and test. if it works, then set it aside. i keep both imaged hdds inside my sig. all my PCs have imaged drives even laptops. less downtime.


Oh, so you have an SSD plus an HDD holding the OS image, in case the SSD OS fails and you have to restore. My backup 500GB HDD died and I don't have a place to clone my SSD to.


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Oh, so you have an SSD plus an HDD holding the OS image, in case the SSD OS fails and you have to restore. My backup 500GB HDD died and I don't have a place to clone my SSD to.


just get a cheap HDD bigger capacity than your ssd. copy (image) the SSD to the hdd and keep the hdd. when your os, files, games gets bork . . . instead of reinstalling everything, just *re-image* the content of the HDD to the SSD.

it is better than reinstalling all software including the os.
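rdr09's image-then-restore workflow can be sketched with plain dd. This is a toy run on ordinary files standing in for drives (a real clone would point if=/of= at block devices like /dev/sdX, which is exactly where being careful about source vs destination matters):

```shell
#!/bin/sh
set -e

# Stand-in "SSD" with known contents.
printf 'os + games + settings' > ssd.img

# Image the drive while it is healthy (real disks: dd if=/dev/sdX of=backup.img bs=4M).
dd if=ssd.img of=backup.img bs=1M 2>/dev/null

# Simulate the OS getting borked by an OC gone wrong.
printf 'corrupted' > ssd.img

# Re-image from the backup instead of reinstalling everything.
dd if=backup.img of=ssd.img bs=1M 2>/dev/null

cat ssd.img
```

After the restore, ssd.img holds the original contents again; the same idea is what tools like the EaseUS clone wizard wrap in a GUI.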


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> just get a cheap HDD bigger capacity than your ssd. copy (image) the SSD to the hdd and keep the hdd. when your os, files, games gets bork . . . instead of reinstalling everything, just *re-image* the content of the HDD to the SSD.
> 
> it is better than reinstalling all software including the os.


What about GPU drivers? That's mostly the reason why I reinstall. Should I just create an image with all games/programs installed but no GPU drivers?


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What about GPU drivers? That's mostly the reason why I reinstall. Should I just create an image with all games/programs installed but no GPU drivers?


that's another reason to image a drive. test the new driver on the spare HDD or SSD. if it is all good, then update driver on the primary drive.

my spare drive is not plugged when not in use (power is but not sata cable).

Fix your issues in your SSD first, then image it.

edit: you don't have to do this . . . but all my Win7 OS have not been updated since Feb of this year. the only thing i update is MSE. i turned off auto update.


----------



## jagdtigger

"but all my Win7 OS have not been updated since Feb of this year. the only thing i update is MSE. i turned off auto update."
That's not a wise decision...


----------



## black7hought

New Validation for my switch to a Tri-X OC from a reference card.


Spoiler: Warning: Spoiler!


----------



## Chopper1591

Quote:


> Originally Posted by *black7hought*
> 
> New Validation for my switch to a Tri-X OC from a reference card.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats.

What did you do with the reference?


----------



## rdr09

Quote:


> Originally Posted by *jagdtigger*
> 
> "but all my Win7 OS have not been updated since Feb of this year. the only thing i update is MSE. i turned off auto update."
> Thats not a wise decision...


i know. that's why i said, Zeal does not have to do it. my systems are all working fine. i update gpu driver as soon as a new one comes out and never a single issue. my theory is MS updates affect systems. So long as my MSE is up-to-date - cool with that.


----------



## Chopper1591

You guys missed my post?
Seeking advice on a daily OC.

What do you find safe? Temp and voltage wise.

Now running a gaming OC of 1100/1300 with +50 power limit and +0 voltage.
1200/1500 +50 with +100mV seems stable. 80c on the core and ~90c on vrm1.

Is that safe for prolonged use?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Chopper1591*
> 
> You guys missed my post?
> Seeking advice on a daily OC.
> 
> What do you find safe? Temp and voltage wise.
> 
> Now running a gaming OC of 1100/1300 with +50 power limit and +0 voltage.
> 1200/1500 +50 with +100mV seems stable. 80c on the core and ~90c on vrm1.
> 
> Is that safe for prolonged use?


That's a fine OC if it's stable. I find it hard to believe you can maintain 1200 core @ those temps, though.


----------



## razaice

Quote:


> Originally Posted by *Chopper1591*
> 
> You guys missed my post?
> Seeking advice on a daily OC.
> 
> What do you find safe? Temp and voltage wise.
> 
> Now running a gaming OC of 1100/1300 with +50 power limit and +0 voltage.
> 1200/1500 +50 with +100mV seems stable. 80c on the core and ~90c on vrm1.
> 
> Is that safe for prolonged use?


I think the vrm1 temp on your +100mV oc is getting a little high. You also have to figure it'll be even higher in the summer. Everything else looks pretty good.


----------



## rdr09

Quote:


> Originally Posted by *Chopper1591*
> 
> You guys missed my post?
> Seeking advice on a daily OC.
> 
> What do you find safe? Temp and voltage wise.
> 
> Now running a gaming OC of 1100/1300 with +50 power limit and +0 voltage.
> 1200/1500 +50 with +100mV seems stable. 80c on the core and ~90c on vrm1.
> 
> Is that safe for prolonged use?


yes on both. but, that 90 is too close to the top and your gpu clocks might be cycling at load. check using AB. when at load . . . gpu usage should be flat.


----------



## Chopper1591

Quote:


> Originally Posted by *ZealotKi11er*
> 
> That's a fine OC if it's stable. I find it hard to believe you can maintain 1200 core @ those temps, though.


I haven't played games on the 1200 clock, yet.
Am a bit scared to push things high before I got some more info.

Will do a bit of Far Cry 4 and monitor temps on 1200 core.
Quote:


> Originally Posted by *razaice*
> 
> I think the vrm1 temp on your +100mV oc is getting a little high. You also have to figure it'll be even higher in the summer. Everything else looks pretty good.


Yeah.
I am a bit disappointed on the vrm1 temp. Tri-x should be one of the better, if not the best, coolers. Right?
Is there a way to improve the vrm cooling? Adding sinks to the backside of the card?

Using this fan curve right now.


Quote:


> Originally Posted by *rdr09*
> 
> yes on both. but, that 90 is too close to the top and your gpu clocks might be cycling at load. check using AB. when at load . . . gpu usage should be flat.


Will do.

Edit:

1200 core is not stable at all.
Did pass a few benches...

But less than 1 minute into Far Cry and it throttles and shows artifacts.
Temps: 81C core, 90C vrm1, 59C vrm2

Edit 2:

Far Cry looks like a horrible way to see if the gpu throttles.
Here is a shot after 3 minutes of heavy combat.
Gpu usage is all over the place while temps look fine.


To rule things out I just did 3 minutes in the same area in Far Cry.
The cooler actually looks pretty good, looking at the small increase of temp going from stock to 1150 core with +50mv.


----------



## LandonAaron

I think there is something off with the way MSI AB records GPU usage on the R9 290X. When playing Far Cry 4 and Skyrim, it will either report 100% usage or 0% usage. I noticed in 3DMark, though, it seems to report normal/correct usage. Anyway, I wouldn't be looking at the usage variable if you are trying to see if it is throttling; you should be looking at the core clock, which is a better indicator. But really and truly it isn't going to throttle until it hits 94 degrees. That is the point it throttles at, and you will never see it hit 95 degrees.

Edit: it throttles at 94 degrees on the core. I have no idea if or when it throttles on VRM temps. The best way to reduce VRM temps, if you want to keep using the same cooler, is with Fujipoly pads. I don't use them personally, but plenty of people do, and everyone says they bring their temps down several degrees.


----------



## Chopper1591

Quote:


> Originally Posted by *LandonAaron*
> 
> I think there is something off with the way MSI AB records GPU usage on the R9 290X. When playing Far Cry 4 and Skyrim, it will either report 100% usage or 0% usage. I noticed in 3DMark, though, it seems to report normal/correct usage. Anyway, I wouldn't be looking at the usage variable if you are trying to see if it is throttling; you should be looking at the core clock, which is a better indicator. But really and truly it isn't going to throttle until it hits 94 degrees. That is the point it throttles at, and you will never see it hit 95 degrees.
> 
> Edit: it throttles at 94 degrees on the core. I have no idea if or when it throttles on VRM temps. The best way to reduce VRM temps, if you want to keep using the same cooler, is with Fujipoly pads. I don't use them personally, but plenty of people do, and everyone says they bring their temps down several degrees.


Yeah, I thought about that.
Thermal pad upgrade.

But I am afraid that will void my warranty.

Just did two runs of 5 minute each, Unigine Heaven.
One on stock, 1000, and one on 1150 with +50 +50

The overclock even seems to be more solid.


----------



## Dynamo11

So finally got around to overclocking my water cooled card the other day. Finally got 1200/1500. 1200 was my target from the get-go, and lowering the memory after the fact didn't seem to let me increase the core clock any more, so I just left that at 1500. To get it though I had to push the volts to +143mV, but heat isn't an issue and from what I've read anything up to +200mV seems to be "ok".


Spoiler: Warning: Spoiler!







Seems to be stable in games after using it for about a week, so pretty happy with it


----------



## razaice

Quote:


> Originally Posted by *Chopper1591*
> 
> Will do a bit of Far Cry 4 and monitor temps on 1200 core.
> Yeah.
> I am a bit disappointed on the vrm1 temp. Tri-x should be one of the better, if not the best, coolers. Right?
> Is there a way to improve the vrm cooling? Adding sinks to the backside of the card?


I just did a few runs in valley at +150mV, 1190 core, 1580 mem and came out to 78c core, 77c vrm1, and 65c vrm2. Maybe your case doesn't have good air flow? Or it's possible the vrm1 on your card isn't being properly cooled for some reason, because a vapor-x shouldn't have temps that much lower than a tri-x.


----------



## Chopper1591

Quote:


> Originally Posted by *Dynamo11*
> 
> So finally got around to overclocking my water cooled card the other day. Finally got 1200/1500. 1200 was my target from the get go and lowering the memory after the fact didn't seem to let me increase the core clock anymore so I just left that at 1500. To get it though I had to push the volts to 143mV, but heat isn't an issue and from what I've read anything up to 200mV seems to be "ok".
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Seems to be stable in games after using it for about a week, so pretty happy with it


Can you fill in your rig and put it in your sig?
For now, do you have an 290 or 290x?

What do you guys think about these temps?
Did 10 minutes of Heaven with a fixed 60% fan.

Quote:


> Originally Posted by *razaice*
> 
> I just did a few runs in valley at +150mV, 1190 core, 1580 mem and came out to 78c core, 77c vrm1, and 65c vrm2. Maybe your case doesn't have good air flow? Or it's possible the vrm1 on your card isn't being properly cooled for some reason, because a vapor-x should have temps that much lower than a tri-x.


That's a lot lower indeed.

How long did you run Valley?
And do you have a custom fan curve?

Build is in a 650D with a Bitfenix Spectre Pro 200mm as intake, which is said to be 148cfm.

Thinking about placing the stock 200mm Corsair fan on top as exhaust.
I did make it 200mm intake and 120mm exhaust to keep the dust out (positive pressure).


----------



## Dynamo11

Quote:


> Originally Posted by *Chopper1591*
> 
> Can you fill in your rig and put it in your sig?
> For now, do you have an 290 or 290x?
> 
> What do you guys think about these temps?
> Did 10 minutes of Heaven with a fixed 60% fan.


Whoops, hadn't realised I hadn't put my rig in my sig. R9 290X


----------



## razaice

Quote:


> Originally Posted by *Chopper1591*
> 
> That's allot lower indeed.
> 
> How long did you run Valley?
> And do you have a custom fan curve?


I only ran it for two runs, but it was obvious that it was pretty stable at those temps and would maybe only go up a degree or two if I went longer. I just left the fans on the default setting.


----------



## razaice

Quote:


> Originally Posted by *Chopper1591*
> 
> Build is in a 650D with a Bitfenix Spectre Pro 200mm as intake, which is said to be 148cfm.
> 
> Thinking about placing the stock 200mm Corsair fan on top as exhaust.
> I did make it 200mm intake and 120mm exhaust to keep the dust out(positive pressure).


You've obviously put thought into your air flow, so that's probably not the problem.


----------



## Chopper1591

Quote:


> Originally Posted by *razaice*
> 
> You've obviously put thought into your air flow, so that's probably not the problem.


A little frustrated here.
Surely, the vapor-x is better. But by this much?

Do you have time to do 10 minutes of Valley or Heaven, and post that?

Here was my 3dmark run. +100


----------



## razaice

Quote:


> Originally Posted by *Chopper1591*
> 
> A little frustrated here.
> Surely, the vapor-x is better. But by this much?
> 
> Do you have time to do 10 minutes of valley or heaven, and post that?
> 
> Here was my 3dmark run. +100


I mistyped with my earlier comment. I meant to say a vapor-x *shouldn't* be that much lower than a tri-x. The difference should probably be 5c at the most. Sure I'll post it when it's done.


----------



## Chopper1591

Quote:


> Originally Posted by *razaice*
> 
> I mistyped with my earlier comment. I meant to say a vapor-x *shouldn't* be that much lower than a tri-x. The difference should probably be 5c at the most. Sure I'll post it when it's done.


Quote:


> Originally Posted by *razaice*
> 
> I mistyped with my earlier comment. I meant to say a vapor-x *shouldn't* be that much lower than a tri-x. The difference should probably be 5c at the most. Sure I'll post it when it's done.


Just found this:




Seems to have max of 68 on vrm1 after 30 minutes of 3dmark with a 1200 core +143mV.


----------



## razaice

The temps were a little higher than I thought they'd be, but this is +150mV with 1180 core and 1580 memory.


----------



## kizwan

Quote:


> Originally Posted by *razaice*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Chopper1591*
> 
> Will do a bit of Far Cry 4 and monitor temps on 1200 core.
> Yeah.
> I am a bit disappointed on the vrm1 temp. Tri-x should be one of the better, if not the best, coolers. Right?
> Is there a way to improve the vrm cooling? Adding sinks to the backside of the card?
> 
> 
> 
> 
> 
> 
> I just did a few runs in valley at +150mV, 1190 core, 1580 mem and came out to 78c core, 77c vrm1, and 65c vrm2. Maybe your case doesn't have good air flow? Or it's possible the vrm1 on your card isn't being properly cooled for some reason, because a vapor-x shouldn't have temps that much lower than a tri-x.
Click to expand...

Actually, the Vapor-X should have better VRM temps than the Tri-X because the former has a custom VRM. And as far as I can remember, the Tri-X is a good cooler for the core but not for VRM1.


----------



## razaice

Quote:


> Originally Posted by *kizwan*
> 
> Actually, the Vapor-X should have better VRM temps than the Tri-X because the former has a custom VRM. And as far as I can remember, the Tri-X is a good cooler for the core but not for VRM1.


Ah I see. I didn't know that.


----------



## Chopper1591

Quote:


> Originally Posted by *razaice*
> 
> 
> The temps were a little higher than I thought they'd be, but this is +150mV with 1180 core and 1580 memory.


Temps are pretty low anyway.
Especially considering you were pumping 1.343v max.

Still, what concerns me is the difference between vrm1 and vrm2.
Yours is 82C and 66C where mine was 82C and 57C.

Sadly I don't think this is a valid reason to ask the retailer for a replacement.
Quote:


> Originally Posted by *kizwan*
> 
> Actually Vapor-X should have better VRM temp than Tri-X because the former have custom VRM. And as far as I can remember, Tri-X is good cooler for the core but not for VRM1.


Any ideas on how to improve?


----------



## ZealotKi11er

First card is the 290X, second is the 290.

Air intake temp ~ 26.5C
Water temp under load ~ 33.5C

That's a 7C delta, which is really good. For normal usage I have to add ~7-8C, which results in a ~15C delta.

If so, the temps are not that impressive for these cards.

OC is 1200/1500 +100mV


----------



## LandonAaron

So I've been playing some Far Cry 4, which stutters pretty badly. Monitoring my GPU frequency, which I have OC'd to 1200MHz, I see it bouncing up and down and floating around 1130MHz. Memory speed is locked in at 1600MHz. I am giving it +200mV and +50 power limit. Temps are in the 50s for the core and the 60s for the VRM. Do you think I might get better performance using the PT1 BIOS? Can you lock in the core frequency with PT1 so that it does not vary and just stays at what it's set at?


----------



## ZealotKi11er

Quote:


> Originally Posted by *LandonAaron*
> 
> So I've been playing some Far Cry 4, which stutters pretty badly. Monitoring my GPU frequency, which I have OC'd to 1200MHz, I see it bouncing up and down and floating around 1130MHz. Memory speed is locked in at 1600MHz. I am giving it +200mV and +50 power limit. Temps are in the 50s for the core and the 60s for the VRM. Do you think I might get better performance using the PT1 BIOS? Can you lock in the core frequency with PT1 so that it does not vary and just stays at what it's set at?


I would think the power limit isn't working.


----------



## Gobigorgohome

Quote:


> Originally Posted by *xer0h0ur*
> 
> If I could do it all over again, I would have dual waterblocked 8GB 290X's instead of this 295X2 since I stopped running tri-fire. Tri-fire is a mess that only invites headache after headache.


What? Tri-fire is a mess? I am running quadfire R9 290X with no big problems other than CPU bottlenecks (everything above 4.7GHz solves that issue). I have been running tri-fire as well, with even fewer problems than with four cards (did not need to go past stock clocks for it to work).

Gaming @ 4K and I have 60+ FPS pretty much all the time at max settings without AA. Have not tried C3, but I am going to do that and report back how bad it is.

If it is not running well then something is wrong.

Oh, and btw, I am running the cards @ stock clocks (1000/1250).

The only problem I have had running more than two cards is with Nvidia Surround/Eyefinity and compatibility issues with three screens and games that are not optimized for three-screen setups.

Is this a good time to try to get rid of a couple of R9 290X's?


----------



## MrWhiteRX7

I've had tri-fire 290's pretty much since they were released... I love it


----------



## xer0h0ur

Get back to me when you try using a 295X2 with a 290X.


----------



## LandonAaron

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I would think the power limit isn't working.


I thought that too, but I don't know how to check if the power limit has actually been applied or not. I apply my OC with Trixx, since it lets me do +200mV, and then load Afterburner just for monitoring, making no changes to the OC. I tried using MSI AB to apply the power limit too, just to see, but I don't think it changed anything. I like how on Nvidia cards GPU-Z reports the reason for any downclocking, be it power limit, thermal throttling, or low utilization. I wish AMD had a similar variable.


----------



## Chopper1591

Quote:


> Originally Posted by *LandonAaron*
> 
> So I've been playing some Far Cry 4, which stutters pretty badly. Monitoring my GPU frequency, which I have OC'd to 1200MHz, I see it bouncing up and down and floating around 1130MHz. Memory speed is locked in at 1600MHz. I am giving it +200mV and +50 power limit. Temps are in the 50s for the core and the 60s for the VRM. Do you think I might get better performance using the PT1 BIOS? Can you lock in the core frequency with PT1 so that it does not vary and just stays at what it's set at?


I also experience weird stuff in Far Cry.
We really need updates, both on the game and the drivers.

Just did ~25 minutes of Heaven.
1200 core with +100mv and a fixed 60% fan.



Since we are talking about bios switches.
I really can't find much info on it.
Which one should I use on my tri-x, and why?


----------



## kizwan

Quote:


> Originally Posted by *Chopper1591*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Actually Vapor-X should have better VRM temp than Tri-X because the former have custom VRM. And as far as I can remember, Tri-X is good cooler for the core but not for VRM1.
> 
> 
> 
> Any ideas on how to improve?

Change to better thermal pad. That should help.


----------



## madmanmarz

This week was a good week.

Buy a used r9 290 for $200
Unlock it to 290x
Put it on the same waterblock I've been using forever and install VRM/memory sinks
Overclock it
Bout to test 1250/1750 but 1200/1700 is stable. Crashes around 1300/1775. Whatttttaaaabeasttt and only 1.28v


----------



## Gobigorgohome

Quote:


> Originally Posted by *xer0h0ur*
> 
> Get back to me when you try using a 295X2 with a 290X.


That will never happen, no reason to buy a 295x2 now, waiting for next generation.


----------



## xer0h0ur

Then obviously we're not talking about the same scenario to be making comparisons. Ya think?


----------



## rdr09

Quote:


> Originally Posted by *madmanmarz*
> 
> This week was a good week.
> 
> Buy a used r9 290 for $200
> Unlock it to 290x
> Put it on the same waterblock I've been using forever and install VRM/memory sinks
> Overclock it
> Bout to test 1250/1750 but 1200/1700 is stable. Crashes around 1300/1775. Whatttttaaaabeasttt and only 1.28v


very rare that a 290 that unlocks can oc as much. you won the lottery twice. congrats. at 1250 . . . no 290 will catch it.


----------



## Dynamo11

Quote:


> Originally Posted by *madmanmarz*
> 
> This week was a good week.
> 
> Buy a used r9 290 for $200
> Unlock it to 290x
> Put it on the same waterblock I've been using forever and install VRM/memory sinks
> Overclock it
> Bout to test 1250/1750 but 1200/1700 is stable. Crashes around 1300/1775. Whatttttaaaabeasttt and only 1.28v


Damn son, you really did luck out with that one.


----------



## xer0h0ur

Is there much correlation between high ASIC ratings and better overclocking?


----------



## madmanmarz

Quote:


> Originally Posted by *Dynamo11*
> 
> Damn son, you really did luck out with that one.


Quote:


> Originally Posted by *rdr09*
> 
> very rare that a 290 that unlocks can oc as much. you won the lottery twice. congrats. at 1250 . . . no 290 will catch it.


Thanks! I never seem to have good luck with CPU's but I keep getting lucky with the cards!


----------



## Chopper1591

Quote:


> Originally Posted by *kizwan*
> 
> Change to better thermal pad. That should help.


I will probably just do it.
Sadly I can only buy it from FrozenCPU.

Should I try my luck and re-use the RAM pads?
Also, which thickness should I get for the VRMs? 1mm?

Will this suffice?
http://www.frozencpu.com/products/17504/thr-185/Fujipoly_ModRight_Ultra_Extreme_System_Builder_Thermal_Pad_Blister_Pack_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html

It will set me back around $22 shipped, with 15-20 days delivery.









Quote:


> Originally Posted by *madmanmarz*
> 
> Thanks! I never seem to have good luck with CPU's but I keep getting lucky with the cards!


Congrats dude.

Really good deal. A 200 usd card that clocks like a beast.


----------



## rdr09

Quote:


> Originally Posted by *Chopper1591*
> 
> I haven't played games on the 1200 clock, yet.
> Am a bit scared to push things high before I got some more info.
> 
> Will do a bit of Far Cry 4 and monitor temps on 1200 core.
> Yeah.
> I am a bit disappointed on the vrm1 temp. Tri-x should be one of the better, if not the best, coolers. Right?
> Is there a way to improve the vrm cooling? Adding sinks to the backside of the card?
> 
> Using this fan curve right now.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Will do.
> 
> Edit:
> 
> 1200 core is not stable at all.
> Did pass a few benches...
> 
> But less then 1 minute in Far Cry and it throttles and shows artifacts.
> Temp: 81c core, vrm1 90c vrm2 59c
> 
> Edit 2:
> 
> Far Cry looks like a horrible way to see if the gpu throttles.
> Here is a shot after 3 minutes of heavy combat.
> Gpu usage is all over the place while temps look fine.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> To rule things out I just did 3 minutes in the same area in Far Cry.
> The cooler actually looks pretty good, looking at the small increase of temp going from stock to 1150 core with +50mv.


see the core clock? that should be flat like a stretched string.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> very rare that a 290 that unlocks can oc as much. you won the lottery twice. congrats. at 1250 . . . no 290 will catch it.


The 290 that unlocked for me overclocks better than the 290 that did not.


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> see the core clock? that should be flat like a stretched string.


Do you consider that small fluctuation to be throttling?
If so, then it throttles at stock too.


----------



## rdr09

Quote:


> Originally Posted by *Chopper1591*
> 
> Do you consider that very little fluctuation as throttling?
> If so, then it throttles at stock.


i am comparing it to mine which is a reference. maybe the Tri X has a boost feature or something coz its oc'ed out the box. that's prolly why it is not flat. not necessarily throttling.


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> i am comparing it to mine which is a reference. maybe the Tri X has a boost feature or something coz its oc'ed out the box. that's prolly why it is not flat. not necessarily throttling.


Yeah, the Tri-X is close to reference.
Although it uses a different PowerTune version, 1.0 vs 2.0.

It is indeed a boost function.

How are your vrm temps btw?


----------



## rdr09

Quote:


> Originally Posted by *Chopper1591*
> 
> Yeah, the tri-x is somewhat reference.
> Although it uses another powertune, 1.0 vs 2.0.
> 
> It is indeed a boost function.
> 
> How are your vrm temps btw?


mine is watered. the vrms are always cooler than the core when watercooled. all temps are always below 60C at load but my ambient is low - 22C.

your vrm temps are very good considering yours is on air. lower than the core temp. nice.

@ Zeal, you won the lottery twice, too. lol


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> mine is watered. the vrms are always cooler than the core when watercooled. all temps are always below 60C at load but my ambient is low - 22C.
> 
> your vrms temps are very good considering yours on air. lower than the core temp. nice.
> 
> @ Zeal, you won the lottery twice, too. lol


Really?
I am a bit lost now...

I actually think my vrm temps are too high...



Should I upgrade to Fujipoly?
Or do you think that is a waste with stock air?
Not going to put it under water for some time. Maybe next year when the 300 series starts to roll out and people sell their 290 waterblocks.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> mine is watered. the vrms are always cooler than the core when watercooled. all temps are always below 60C at load but my ambient is low - 22C.
> 
> your vrms temps are very good considering yours on air. lower than the core temp. nice.
> 
> @ Zeal, you won the lottery twice, too. lol


My 290 that unlocks is a pre-production 290 (engineering sample). It has many readout pins and a Crossfire finger.


----------



## LandonAaron

Well, I figured out why my core clock wasn't staying flat at the 1200 where I had it set. It was just because of low utilization. I disabled Vsync, GPU usage went to a full 100%, and the core clock went "flat as a string". I think that is probably what is going on with Chopper's too.
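The behaviour described above can be checked mechanically from a sensor log. Below is a minimal, illustrative sketch (the thresholds and function name are made up, not from any specific tool) that separates utilization-driven downclocking from real throttling, given (GPU load %, core clock MHz) samples such as a GPU-Z sensor-log export:

```python
# Hypothetical helper: classify clock dips in a run of (load %, clock MHz)
# samples. Thresholds are illustrative defaults, not tool-accurate values.

def classify_clock_dips(samples, target_mhz, load_floor=95, tol_mhz=10):
    """Return 'stable', 'low_utilization', or 'throttling' for a run."""
    # A "dip" is any sample noticeably below the clock you set.
    dips = [(load, clk) for load, clk in samples if clk < target_mhz - tol_mhz]
    if not dips:
        return "stable"
    # If the clock only drops while the GPU is not fully loaded, it is the
    # power-saving downclock described above, not thermal throttling.
    if all(load < load_floor for load, clk in dips):
        return "low_utilization"
    return "throttling"
```

With Vsync on, the dips all coincide with sub-100% load, so a run classifies as low utilization rather than throttling; dips at full load are the worrying case.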


----------



## LandonAaron

How can you know if a 290 will unlock or not?


----------



## ZealotKi11er

Quote:


> Originally Posted by *LandonAaron*
> 
> How can you know if a 290 will unlock or not?


Flash a 290X BIOS and see if it works as a 290X.


----------



## rdr09

Quote:


> Originally Posted by *Chopper1591*
> 
> Really?
> I am a bit lost now...
> 
> I actually think my vrm temps are too high...
> 
> 
> 
> Should I upgrade to Fujipoly?
> Or do you think that is a waste with stock air?
> Not going to put it under water for some time. Maybe next year when the 300 series starting to roll out and people sell their 290 waterblocks.


just keep an eye out for a used full block. got mine for $60. your temps are fine.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> My 290 that unlocks is a pre production 290 (Engineering Sample) Has many readout pins and Crossfire finger.


i see.

Quote:


> Originally Posted by *LandonAaron*
> 
> How can you know if a 290 will unlock or not?


http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread

there is a small app you need to download to read the results.


----------



## black7hought

Quote:


> Originally Posted by *Chopper1591*
> 
> Congrats.
> 
> What did you do with the reference?


I have the ref 290 on eBay at the moment. I don't think I have enough rep to post it on the Sale page here.


----------



## rdr09

Planetside 2 in 4K maxed with a single 290 stock.



Resized from 23MB to 9MB.


----------



## razaice

Quote:


> Originally Posted by *LandonAaron*
> 
> Well, I figured out why my Core Clock wasn't stay straight on 1200 where I had it set. It was just cause of low utilization. Disabled Vsync, and GPU usage went to full 100%, and core clock went "flat as a string". I think that is probably what is going on with Chopper's too.


Yeah, I think that's a feature specific to the 290 and 290X. When the card isn't being utilized at 100%, it downclocks itself to stay cooler. Apparently it changes very rapidly too. If you speed up the sensor refresh rate in GPU-Z you can see the clock going nuts.


----------



## LandonAaron

Quote:


> Originally Posted by *rdr09*
> 
> Planetside 2 in 4K maxed with a single 290 stock.
> 
> 
> 
> Resized from 23MB to 9MB.


Is that game any good? I was thinking of getting it.


----------



## <({D34TH})>

Quote:


> Originally Posted by *LandonAaron*
> 
> Is that game any good. Was thinking of getting it?


It'll ruin the way you see Battlefield forever









It tanks even the fastest of CPUs on heavy battles though.


----------



## bond32

New trifire score ultra:http://www.3dmark.com/3dm/4861585

1290/1500 and 1290/1300

4790k is holding my scores down now...

Extreme: http://www.3dmark.com/3dm/4861542

Guess I'll try to squeeze a bit more out of the physics...


----------



## kizwan

Quote:


> Originally Posted by *Chopper1591*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Change to better thermal pad. That should help.
> 
> 
> 
> 
> 
> 
> 
> I will probably just do it.
> Sadly I can only buy it from Frozen cpu.
> 
> Should I try my luck and re-use the ram pads?
> Also, which size should I get for the vrm's? 1mm?
> 
> Will this suffice?
> http://www.frozencpu.com/products/17504/thr-185/Fujipoly_ModRight_Ultra_Extreme_System_Builder_Thermal_Pad_Blister_Pack_-_Mosfet_Block_-_100_x_15_x_10_-_Thermal_Conductivity_170_WmK.html
> 
> Will set me back around 22 usd shipped. 15-20 days.

I don't know the thickness required for the VRMs with the Tri-X cooler. I think 1mm should be OK.

Either you want it or you don't want it.







They're expensive, yes. You could just run your card at stock clocks.



Quote:


> Originally Posted by *Chopper1591*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> i am comparing it to mine which is a reference. maybe the Tri X has a boost feature or something coz its oc'ed out the box. that's prolly why it is not flat. not necessarily throttling.
> 
> 
> 
> 
> 
> 
> 
> Yeah, the tri-x is somewhat reference.
> Although it uses another powertune, 1.0 vs 2.0.
> 
> It is indeed a boost function.
> 
> How are your vrm temps btw?

As far as I know both use the same PowerTune. The 290 Tri-X OC is simply an overclocked card, not a boost feature. My clocks are flat when running Valley or Heaven. Try increasing only the power limit.


----------



## Gobigorgohome

Quote:


> Originally Posted by *bond32*
> 
> New trifire score ultra:http://www.3dmark.com/3dm/4861585
> 
> 1290/1500 and 1290/1300
> 
> 4790k is holding my scores down now...
> 
> Extreme: http://www.3dmark.com/3dm/4861542
> 
> Guess I'll try to squeeze a bit more out of the physics...


Have you considered going X99, DDR4 and so on? I hear those 5960Xs wreck everything in benchmarking.







But it will set you back a few dollars.


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> just keep an eye out for a used full block. got mine for $60. your temps are fine.
> i see.
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread
> 
> there is small app you need to download and read the results.


I will.

Think I'll just leave it for the time being.
Quote:


> Originally Posted by *razaice*
> 
> Yeah I think that's a feature specific to the 290 and 290x. When the card isn't being utilized at 100%, it downclocks itself to stay cooler. Apparently it changes very rapidly too. If you speed up the sensor refresh rate in gpu-z you see the clock is going nuts.


Explains things a bit.
Utilization is just horrible in Far Cry atm.
Some scenes give me 90 fps with a smooth ~100% GPU usage. Others give me a choppy 30-35 fps with GPU usage shooting up and down between 100% and 25-30%.
Quote:


> Originally Posted by *kizwan*
> 
> I don't know the thickness required for VRM with Tri-X cooler. I think 1mm should be ok.
> 
> Either you want it or you don't want it.
> 
> 
> 
> 
> 
> 
> 
> They're expensive, yes. You could just run your card at stock clocks.
> 
> 
> As far as I know both use the same powertune. The 290 Tri-X OC is pretty much an overclocked card, not boost feature. Mine clocks are flat when running Valley or Heaven. Try increase only the power limit.


Wow.
For me they are not THAT expensive.
You are showing the 11 W/mK one, btw.

The only sizes I see, plus pricing inc. shipping to my home:

Fujipoly / ModRight Ultra Extreme System Builder Thermal Pad Blister Pack - 1/2 Sheet - 200 x 150 x 1.0 - Thermal Conductivity 17.0 W/mK $ 194.99 + $ 32.50
Fujipoly / ModRight Ultra Extreme System Builder Thermal Pad Blister Pack - 60 x 50 x 1.0 - Thermal Conductivity 17.0 W/mK $ 19.99 + $ 7.99
Fujipoly / ModRight Ultra Extreme System Builder Thermal Pad Blister Pack - Mosfet Block - 100 x 15 x 1.0 - Thermal Conductivity 17.0 W/mK $ 15.99 + $ 7.99

About the clocks in Heaven.
See for yourself. The difference in Heaven and Far Cry.

Same clocks/settings:


----------



## rdr09

Quote:


> Originally Posted by *LandonAaron*
> 
> Is that game any good. Was thinking of getting it?


something to play between Skyrim and BF4. it has gotten better.
Quote:


> Originally Posted by *bond32*
> 
> New trifire score ultra:http://www.3dmark.com/3dm/4861585
> 
> 1290/1500 and 1290/1300
> 
> 4790k is holding my scores down now...
> 
> Extreme: http://www.3dmark.com/3dm/4861542
> 
> Guess I'll try to squeeze a bit more out of the physics...


Nice runs.


----------



## Chopper1591

1200 looks like a no go with the stock heatsink.

15 minutes or so into Valley Extreme HD:
85c on the core
vrm1... 105c
~55% fan(auto)


----------



## rdr09

Quote:


> Originally Posted by *Chopper1591*
> 
> 1200 looks like a no go with the stock heatsink.
> 
> 15 minutes or so into Valey Extreme HD:
> 85c on the core
> vrm1... 105c
> ~55% fan(auto)


i recommend not leaving your fans on auto when oc'ed. set them to a fixed manual speed, or use a fan curve, to keep temps low before the card throttles. the auto settings are not aggressive enough to keep up with temps and are really only designed for stock clocks.

one of the downsides of oc'ing on air is fan noise.
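For reference, the custom fan curve suggested above is just piecewise-linear interpolation between temperature/speed points. A hedged sketch of how tools like MSI Afterburner or Sapphire TriXX evaluate such a curve; the points below are made-up example values, not a recommendation:

```python
# Illustrative fan curve: (temperature degC, fan duty %) points, evaluated
# by linear interpolation. Example values only, not recommended settings.
CURVE = [(40, 30), (60, 45), (75, 60), (85, 80), (95, 100)]

def fan_speed(temp_c, curve=CURVE):
    """Return the fan duty (%) for a given temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the curve: minimum duty
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # above the curve: maximum duty
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two neighbouring points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
```

The trade-off discussed in the thread lives entirely in where you place the points: steeper segments catch temperature spikes sooner at the cost of noise.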


----------



## Gobigorgohome

Quote:


> Originally Posted by *Chopper1591*
> 
> 1200 looks like a no go with the stock heatsink.
> 
> 15 minutes or so into Valey Extreme HD:
> 85c on the core
> vrm1... 105c
> ~55% fan(auto)


Overclocking the card with auto fan-settings sounds like a bad idea.


----------



## Performer81

Quote:


> Originally Posted by *Chopper1591*
> 
> 1200 looks like a no go with the stock heatsink.
> 
> 15 minutes or so into Valey Extreme HD:
> 85c on the core
> vrm1... 105c
> ~55% fan(auto)


My PCS+ handles 1200/1400 quite well with the stock heatsink and auto fans. This is with +160 and an effective ~1.3V under load.
GPU temp goes only a little over 70 and the VRMs to the middle 80s.
I'm only concerned about 2 of my VRMs because they have absolutely no heatsink. Some other user measured 93 degrees at stock settings with a laser thermometer. Do they all have a temp diode inside, with GPU-Z showing the hottest of them?


----------



## Chopper1591

Quote:


> Originally Posted by *Performer81*
> 
> MY PCS+ handles 1200/1400 quite good with STock heat sink and auto fans. This is with +160 and effective ~1,3V under load.
> GPU Temp goes only little over 70 and Vrm in the middle 80s.
> Im only concerned of 2 of my vrms because they have absolutely no heatsink. SOme other user measured 93 degrees at stock settings with a laser thermometer. DO they all have a temp diode inside and GPU-z shows the hottest of them?












Can you show a gpu-z shot after 15-20 minutes of Valley Extreme HD?


----------



## Performer81

15 min Heaven. I have a 140mm side fan blowing directly at the card. Without it, it would be much warmer.

http://abload.de/image.php?img=heaven_1200_15h7avc.jpg


----------



## Chopper1591

Quote:


> Originally Posted by *Performer81*
> 
> 15min Heaven. I have *a 140mm side fan blowing directly at the card*. WIthout it would be much warmer.
> 
> http://abload.de/image.php?img=heaven_1200_15h7avc.jpg


Explains things.








I can't... nice big window there.

Is that auto fan speed? 70% must be hard on the ears.
Mine stays below 60% on auto @ 75c.


----------



## Performer81

Auto fan, yeah, 70 is quite noisy. You could just test without the side panel to see if the temps are better. Should have the same effect.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Chopper1591*
> 
> Explains things.
> 
> 
> 
> 
> 
> 
> 
> 
> I can't... nice big window there.
> 
> Is that auto fan speed? 70% must be hard on the ears.
> Mine auto's below 60% @ 75c.


You'd be much better off with a more aggressive fan profile if you are overclocking, and the PCS+ is a very capable cooler, equal to the Tri-X from what I've seen.


----------



## Chopper1591

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You'd be much better off with a more aggressive fan profile if you are overclocking and the PCS+ is a very capable cooler equal to the Tri-X from what i've seen.


You are right...
But it kinda takes away the advantage of having my CPU under water.








Things are quiet now.

This is my current profile:


60% is already pretty loud.

I am really thinking about putting it under water.
But then comes the next question...
Will my current rad be enough to cool the cpu and gpu?
Alphacool 360 UT60 with 3 ap-15's.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Chopper1591*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> You'd be much better off with a more aggressive fan profile if you are overclocking and the PCS+ is a very capable cooler equal to the Tri-X from what i've seen.
> 
> 
> 
> You are right...
> But it kinda takes the advantage of my cpu under water away.
> 
> 
> 
> 
> 
> 
> 
> 
> Things are quiet now.
> 
> This is my current profile:
> 
> 
> 60% is already pretty loud.
> 
> I am really thinking about putting it under water.
> But then comes the next question...
> Will my current rad be enough to cool the cpu and gpu?
> Alphacool 360 UT60 with 3 ap-15's.
Click to expand...

Yeah, I'd bump it to 70% at 70c, but I don't mind noise all that much.

I haven't even started my first loop yet, so I'll let someone with more knowledge answer that one, but I'd say you can; just don't expect a high clock on the GPU.


----------



## kizwan

Quote:


> Originally Posted by *Chopper1591*
> 
> About the clocks in Heaven.
> See for yourself. The difference in Heaven and Far Cry.
> 
> Same clocks/settings:
> 
> 
> Spoiler: Warning: Spoiler!


Sorry, I forgot. I get the same core clock fluctuation with single 290.

With Crossfire, I get flat line. So, if you want to see it flat line, go for Crossfire.


----------



## Chopper1591

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Yeah i'd bump it to 70% at 70c but i don't mind noise all that much.
> 
> I haven't even started my first loop yet so i'll let someone with more knowledge answer that one but I'd say you can just don't expect a high clock on the GPU,


Will do a run for fun.
Fixed fan speed @ 70%, running Valley Extreme with 1200 core +125mv (this setting gave me no artifacts but 105c on the vrm this morning).
Quote:


> Originally Posted by *kizwan*
> 
> Sorry, I forgot. I get the same core clock fluctuation with single 290.
> 
> With Crossfire, I get flat line. So, if you want to see it flat line, go for Crossfire.


----------



## Chopper1591

Quote:


> Originally Posted by *launch*
> 
> I can overclock my card to where its completely stable for gaming and benchmarks, but then when I put the computer to sleep and try to wake it back up I get a black screen. I then have to reset the computer, which makes it try to resume to windows again.


I've had the same for months with ati.

For me disabling sleep mode on the monitor solved this.


----------



## ZealotKi11er

Quote:


> Originally Posted by *launch*
> 
> I can overclock my card to where its completely stable for gaming and benchmarks, but then when I put the computer to sleep and try to wake it back up I get a black screen. I then have to reset the computer, which makes it try to resume to windows again.


Why put the PC to sleep? If you have an SSD, just shut it down.


----------



## Chopper1591

I've had contact with the store where my card was bought. I got it second hand.

There is an issue with the fans. Sometimes at various speeds the card makes an annoying rattling noise.
Contacted the store by e-mail to tell them about the issue. And told about the rather high vrm temp.
Changing the thermal pad will void the warranty, they told me.

They did offer an RMA though.

I am not sure if I should send it in for repair/replacement.
What do you guys think about the stock temps? Are they off?


----------



## razaice

Quote:


> Originally Posted by *Chopper1591*
> 
> I've had contact with the store where my card was bought. I got it second hand.
> 
> There is an issue with the fans. Sometimes at various speeds the card makes an annoying rattling noise.
> Contacted the store by e-mail to tell them about the issue. And told about the rather high vrm temp.
> Changing the thermal pad will void the warranty, they told me.
> 
> They do offered an RMA though.
> 
> I am not sure if I should send it in for repair/replacement.
> What do you guys think about the stock temps? Are they off?


This thread might be helpful: http://www.overclock.net/t/1456488/290-tri-x-non-x-version-silly-sapphire-tri-x-are-for-kids-some-temp-test-for-those-who-want-to-know

It shows temps at different settings for a 290 tri-x.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Chopper1591*
> 
> I've had contact with the store where my card was bought. I got it second hand.
> 
> There is an issue with the fans. Sometimes at various speeds the card makes an annoying rattling noise.
> Contacted the store by e-mail to tell them about the issue. And told about the rather high vrm temp.
> Changing the thermal pad will void the warranty, they told me.
> 
> They do offered an RMA though.
> 
> I am not sure if I should send it in for repair/replacement.
> What do you guys think about the stock temps? Are they off?
> 
> 
> Spoiler: Warning: Spoiler!


Your card is fine, just saying the same as before, if you are going to overclock then raise the fan speed manually or have a more aggressive fan curve


----------



## Chopper1591

Quote:


> Originally Posted by *razaice*
> 
> This thread might be helpful: http://www.overclock.net/t/1456488/290-tri-x-non-x-version-silly-sapphire-tri-x-are-for-kids-some-temp-test-for-those-who-want-to-know
> 
> It shows temps at different settings for a 290 tri-x.


I did see that post a couple of days ago.

That's really what made me doubt the cooling on mine in the first place.








Quote:


> Originally Posted by *Sgt Bilko*
> 
> Your card is fine, just saying the same as before, if you are going to overclock then raise the fan speed manually or have a more aggressive fan curve


This is with an aggressive fan curve. At least, I call that aggressive.

After half an hour or so into far cry.


The guy from the post that razaice shared had this:

Max VDDC Volts-1.313v -messed up copy and paste(results are correct below)
Fan Speed Setting Fixed-70%
GPU Temp Max-70°c
VRM 1 Temp Max-71°c
VRM 2 Temp MAX-46°c
Core-1150
MEM-1450

Higher voltage, same core, higher memory, and 14c cooler on the VRM. That's a lot.


----------



## kizwan

Quote:


> Originally Posted by *Chopper1591*
> 
> The guy from the post that razaice shared had this:
> 
> Max VDDC Volts-1.313v -messed up copy and paste(results are correct below)
> Fan Speed Setting Fixed-70%
> GPU Temp Max-70°c
> VRM 1 Temp Max-71°c
> VRM 2 Temp MAX-46°c
> Core-1150
> MEM-1450
> 
> Higher voltage, same core, higher memory and 14c cooler on the vrm. Thats a lot.


The problem with the result is that he (or she) doesn't record the ambient/room temp. This makes the result meaningless as a comparison. If your ambient temp is 14C higher than his (or hers), there's no way you're going to get the same temps.

No idea why I didn't record the temps when my cards were still using the stock cooler. These are the only screenshots I have with the stock reference cooler: BF3 - CFX - stock clocks (947/1250) - ambient 30-32C - fan speed 80%. GPU1 & GPU2 respectively.
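The point about ambient temperature can be stated as a one-liner: compare cards by their delta over ambient, not by the absolute sensor reading. A trivial sketch with made-up numbers:

```python
# Sketch: normalize a sensor reading by room temperature before comparing
# two cards. All numbers below are invented for illustration.

def delta_over_ambient(sensor_c, ambient_c):
    """Temperature rise above room temperature, in degC."""
    return sensor_c - ambient_c

# Two "71C VRM" results are not equal if one was measured in a 16C room
# and the other in a 30C room:
cool_room = delta_over_ambient(71, 16)   # 55C rise above ambient
warm_room = delta_over_ambient(71, 30)   # 41C rise above ambient
```

On this view, the card tested in the warmer room actually has the better cooler, even though both report the same absolute temperature.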


----------



## Chopper1591

Quote:


> Originally Posted by *kizwan*
> 
> The problem with the result is that he (or she) doesn't record ambient/room temp. This make the result meaningless. If your ambient temp is 14C higher than his (or her), then no way you're going to get the same temps.
> 
> No idea why I didn't record the temps when my cards still using stock cooler. These are the only screenshots I have with stock referenced cooler; BF3 - CFX - stock clock (947/1250) - ambient 30 - 32C - fan speed 80%. GPU1 & GPU2 respectively.


That is true.
Ambient is needed to make a statement. Maybe he was running it in his refrigerator?
My ambient is around 20-22c. Maybe even less.

Here's a quote from a Sapphire guy on that topic we are talking about earlier:
Quote:


> Originally Posted by *VaporX*
> 
> Hey guys you have been pressing for this so I thought I would get you some answers. This what I am being told by the engineers.
> This is the reason the reference boards have better VRM temps. As for the Tri-X design our card should be able to keep *VRM temps under 75C* with a *VDDC at 1.3V and the 1.2 Ghz* core speed. However this would require *a lot of fan speed adjustment* and would result in a louder card.
> 
> In the end everything is a trade off of noise vs cooling and our goal is to try find the best balance. We have had a few reviews in Europe show our VRM at around 80C in a chassis for prolonged gaming sessions with the fan set on auto.
> 
> I have passed along your thoughts and comments and our team will be keeping VRM temps more in mind with future designs.


He must be talking about a very high fan speed then.

Considering I reached 80 on the VRM with only 1.148v and 62% fan.


----------



## LandonAaron

Quote:


> Originally Posted by *rdr09*
> 
> Planetside 2 in 4K maxed with a single 290 stock.
> 
> 
> 
> Resized from 23MB to 9MB.


Quote:


> Originally Posted by *Chopper1591*
> 
> I did saw that post a couple of days ago.
> 
> That's really what started me to doubt the cooling of mine in the first place.
> 
> 
> 
> 
> 
> 
> 
> 
> This is with aggresive fan. At least, I call that aggresive.
> 
> After half an hour or so into far cry.
> 
> 
> The guy from the post that razaice shared had this:
> 
> Max VDDC Volts-1.313v -messed up copy and paste(results are correct below)
> Fan Speed Setting Fixed-70%
> GPU Temp Max-70°c
> VRM 1 Temp Max-71°c
> VRM 2 Temp MAX-46°c
> Core-1150
> MEM-1450
> 
> Higher voltage, same core, higher memory and 14c cooler on the vrm. Thats a lot.


I don't really think you have anything to worry about with the VRM temps. From what I have read they can go all the way up to 120C. I say as long as you're under 90 you have nothing to worry about.


----------



## Chopper1591

Quote:


> Originally Posted by *LandonAaron*
> 
> I don't really think you have anything to worry about with the VRM temps. From what I have read they can go all the way up to 120C. I say as long as your under 90 you have nothing to worry about.


Rated up to 120c, true.
But it won't be very good for a stable clock.
Under 90 should be fine indeed.

Maybe I am just spoiled.
I expected to be able to clock higher on this card.

When I see a fullcover block offered second hand I am going to jump on it.


----------



## tsm106

Haha, I would not run my vrms at 120c.


----------



## LandonAaron

Those Alphacool blocks that just have a water chamber for the GPU and passive air heatsinks for the rest of the card are fairly cheap and are supposed to do a decent job of cooling the VRMs despite not having true water cooling for that portion of the card.


----------



## Chopper1591

Quote:


> Originally Posted by *tsm106*
> 
> Haha, I would not run my vrms at 120c.











Quote:


> Originally Posted by *LandonAaron*
> 
> Those alphacool blocks that just have a water chamber for the GPU, and passive air heatksins for the rest of the card are fairly cheap and suppose to do a decent job of cooling the VRM despite not have true water blocks for that portion of the card.


I can't follow you.

Suppose to do a decent job?
Do you have anything to back that up? And which kind of passive cooling do you mean?

I take you are talking about the universal gpu blocks to cool the gpu core itself?
That is less expensive, yes.
But going that route takes away the airflow over the RAM and VRMs. So you would have to add a fan (or fans) to cool the areas where you apply heatsinks.

Fullcover should be about the same price as a universal block + decent vrm sinks and extra fan for the airflow.


----------



## welshy46

Has anyone modded an EK rev 1 block for the 290X to fit the MSI rev 2.2? I bought one after the death of my rev 1 Gigabyte OC and only realised it wouldn't fit my water block when I tried to fit it.







as far as I can tell it's only the 4 larger caps that stop it fitting. Is it just a matter of drilling out the points where the caps hit, or am I missing something obvious?


i put a dab of thermal paste on the caps to see where they hit the block


----------



## tsm106

Quote:


> Originally Posted by *welshy46*
> 
> Has anyone modded an EK rev 1 block for the 290x to fit the MSI rev2.2. I bought one after the death of my rev 1 Gigabyte oc and only realised it wouldn't fit my water block when I tried to fit it
> 
> 
> 
> 
> 
> 
> 
> as far as I can tell it's only the 4 larger caps that stop it fitting. Is it just a matter of drilling out the points where the caps hit or am I missing something obvious.
> 
> 
> i put a dab of thermal paste on the caps to see where they hit the block


No point to it because that 2.2 board is the biggest pain in the ass. It has ICs that need to be cooled stuffed all over the PCB. It's like the designer of that PCB was being devious on purpose. No one designs a PCB where hot components that need to be cooled are spread that far and randomly apart from each other unless they are trying to be a prick on purpose. Best thing to do is sell that card, or go GPU-only.


----------



## welshy46

@tsm106 Thanks, that'll be the obvious thing I was missing. I knew it couldn't be that easy or EK would have done it. Looks like a new card will have to go on the Xmas list.


----------



## LandonAaron

Quote:


> Originally Posted by *Chopper1591*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can't follow you.
> 
> Suppose to do a decent job?
> Do you have anything to back that up? And which kind of passive cooling do you mean?
> 
> I take you are talking about the universal gpu blocks to cool the gpu core itself?
> That is less expensive, yes.
> But going that route takes the airflow over the ram and vrm's away. So one shall have to add fan(s) to cool that portion where you apply heatsinks.
> 
> Fullcover should be about the same price as a universal block + decent vrm sinks and extra fan for the airflow.


Here is the waterblock I was talking about. http://www.xtremerigs.net/2014/08/11/alphacool-gpx-a290-review/ These are cheaper than most blocks and offer decent performance. I read a different review on these when I first saw them, and it seems that review stated that some of the problems with these were resolved in later revisions, but I'm not sure.


----------



## pdasterly

Received my 290X back from RMA; they sent me a Tri-X model. Should I copy the BIOS from the Tri-X to my reference model? Both are under water in Crossfire.


----------



## ZealotKi11er

Quote:


> Originally Posted by *pdasterly*
> 
> received 290x back from rma, they sent me a tri-x model. Should I copy bios from tri-x to my reference model, both are under water in crossfire


Are the cards both reference? If so, I don't see a problem.


----------



## pdasterly

http://www.3dmark.com/fs/3341084


----------



## ZealotKi11er

Quote:


> Originally Posted by *pdasterly*
> 
> http://www.3dmark.com/fs/3341084


You could probably break 20K if you pushed for 1300MHz.


----------



## pdasterly

I think that's as far as it will go. I get driver errors @ 1200 no matter the voltage.


----------



## pdasterly

New beta driver made my machine the fastest of its kind, for now








4790k xfire 290x

There are no good oc guides for the 290x


----------



## ZealotKi11er

Quote:


> Originally Posted by *pdasterly*
> 
> New beta driver made my machine fastest of its kind for now
> 
> 
> 
> 
> 
> 
> 
> 
> 4790k xfire 290x
> 
> There are no good oc guides for the 290x


How hot do the cards get under load, and how much voltage are you using for 1200MHz?


----------



## pdasterly

60C core and VRM1
150mV in Trixx


----------



## ZealotKi11er

Quote:


> Originally Posted by *pdasterly*
> 
> 60c core and vrm1
> 150mv in trixx


That's on the high side. Your deltas are too high.


----------



## kizwan

Quote:


> Originally Posted by *pdasterly*
> 
> New beta driver made my machine fastest of its kind for now
> 
> 
> 
> 
> 
> 
> 
> 
> 4790k xfire 290x
> 
> There are no good oc guides for the 290x


No need for guides. If there were a guide at all, it would be only one or two lines of instructions: "move the slider" & "hit apply".
Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pdasterly*
> 
> 60c core and vrm1
> 150mv in trixx
> 
> 
> 
> That's on the high side. Your deltas are too high.
Click to expand...

How do you know his deltas are too high? You don't know his ambient or water temp as a reference. Or do you know his ambient or water temp?
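For anyone following along, the "delta" being argued about is just the card's sensor reading minus a reference temperature (water or ambient). A rough sketch with illustrative numbers (a 60C reading against a 78F ambient; these are example figures, not anyone's logged data):

```python
def f_to_c(temp_f):
    """Convert Fahrenheit to Celsius."""
    return (temp_f - 32) * 5 / 9

def delta(card_temp_c, reference_temp_c):
    """Delta between a sensor reading and a reference (water or ambient) temp."""
    return card_temp_c - reference_temp_c

# Example: a 60C core/VRM reading against a 78F ambient
ambient_c = f_to_c(78)
print(round(ambient_c, 1))             # 25.6
print(round(delta(60, ambient_c), 1))  # 34.4
```

So whether a given delta is "too high" depends entirely on what reference temp you compare against, which is the point being made here.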


----------



## ZealotKi11er

Quote:


> Originally Posted by *kizwan*
> 
> No need for guides. If there were a guide at all, it would be only one or two lines of instructions: "move the slider" & "hit apply".
> How do you know his deltas are too high? You don't know his ambient or water temp as a reference. Or do you know his ambient or water temp?


I know because I have the same setup. I hit 60C on very hot days, so I am assuming the worst here.


----------



## pdasterly

78F ambient


----------



## Arizonian

Quote:


> Originally Posted by *BLOWNCO*
> 
> finally got all 3 of them in and running
> 
> 
> 
> Spoiler: Warning: Spoiler!


Nice








Quote:


> Originally Posted by *black7hought*
> 
> New Validation for my switch to a Tri-X OC from a reference card.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated


----------



## LandonAaron

Quote:


> Originally Posted by *pdasterly*
> 
> 60c core and vrm1
> 150mv in trixx


Quote:


> Originally Posted by *ZealotKi11er*
> 
> That's on the high side. Your deltas are too high.


60C isn't hot at all. The core can go up to 94C and the VRM can go up to 120C; if anything it's on the cold side.
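To frame that as headroom, here's a quick sketch in Python; the 94C/120C figures are the limits quoted above, taken at face value rather than from an official spec sheet:

```python
# Thermal limits as quoted above for Hawaii cards (taken at face value here)
CORE_LIMIT_C = 94
VRM_LIMIT_C = 120

def headroom(core_c, vrm_c):
    """Remaining thermal headroom (degrees C) for core and VRM readings."""
    return CORE_LIMIT_C - core_c, VRM_LIMIT_C - vrm_c

# A 60C reading on both sensors leaves plenty of margin
print(headroom(60, 60))  # (34, 60)
```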


----------



## chiknnwatrmln

My cards hit 60C after an hour or so of gaming. Granted, I have a good amount of rad space, but this is with my air vents closed for less noise. My ambient is ~72F. Also, this is with +100mV, 1165 core.

To above, they are talking about water cooling.


----------



## Chopper1591

Quote:


> Originally Posted by *LandonAaron*
> 
> Here is the waterblock I was talking about: http://www.xtremerigs.net/2014/08/11/alphacool-gpx-a290-review/ These are cheaper than most blocks and offer decent performance. The first review I read of them suggested that some of their problems were resolved in later revisions, but I'm not sure.


Thanks for the share.
That seems like a no-go though.

Firstly, I don't seem to be able to buy it where I live.
And the restriction on that block is just huge. I will have to do a dual loop if I go that route.

And after looking it up on FrozenCPU, it even seems to be more expensive than the EK full-cover.

Last but not least, decent performance isn't enough for me.


----------



## LandonAaron

I think those Heatkiller blocks are really good. Performance-PCs had the copper one on sale for 86 dollars a month ago, but now it's back up to like 136, so that doesn't really help you much.


----------



## Raephen

Quote:


> Originally Posted by *Chopper1591*
> 
> I am really thinking about putting it under water.
> But then comes the next question...
> Will my current rad be enough to cool the cpu and gpu?
> Alphacool 360 UT60 with 3 ap-15's.


A single R9 and your CPU? Should be doable. My setup has a loop with my mildly clocked 3570K and an R9 290 on a single slim (30mm) 360mm rad. It does a good job of keeping it all cool enough, so your UT60 should only add to what's already good enough (for mild overclocks).


----------



## Chopper1591

Quote:


> Originally Posted by *LandonAaron*
> 
> I think those Heatkiller blocks are really good. Performance-PCs had the copper one on sale for 86 dollars a month ago, but now it's back up to like 136, so that doesn't really help you much.


I just read through the review you posted of the Alphacool block.
Did you even read it thoroughly?









The summary actually sums it up pretty well:
Quote:


> *Summary*
> 
> This card is designed to be a budget alternative to the real-deal water blocks. While up front you don't save much ($96 vs the $118 for the EK block), the cost savings instead come when or if you upgrade your GPU. If a new backplate and front plate combo is still being designed and manufactured by then, you'll be able to transition cheaply (assuming the price is cheap) to a fully watercooled new GPU. While the idea and the thermal performance on the core are good, this block/backplate is sloppily designed, unattractive and very restrictive. Outstanding core results only get you so far, but by far the biggest problem is the compatibility issue, which means that this version at least should be avoided.


The thermal pads used are just hilarious.








1.5mm on the memory and 3mm on half the backside of the card for the backplate.

And the pricing ain't that good either.
Top that off with the major restriction it creates, and you have one unit to avoid.

I will probably stick with EK and use Fujipoly.
I will grab one if I can find it reasonably priced, used.

Quote:


> Originally Posted by *Raephen*
> 
> A single R9 and your cpu? Should be doable. My setup has a loop with my mildly clocked 3570K, and a R9 290 on a single slim (30mm) 360mm rad. For keeping it all cool enough it does a good job. So your UT60 should only add to what's already good enough (for mild overclocks).


Yeah a single gpu.

Would you be so kind as to overclock that i5 to, let's say, 4.5GHz and run Heaven or Valley for 10-15 minutes and post the results?
Oh, lol, just saw you are also from the Netherlands.

I am willing to keep my cpu at 4.6-4.8ghz and clock the card to 1200 or so.


----------



## Raephen

Quote:


> Originally Posted by *Raephen*
> 
> A single R9 and your cpu? Should be doable. My setup has a loop with my mildly clocked 3570K, and a R9 290 on a single slim (30mm) 360mm rad. For keeping it all cool enough it does a good job. So your UT60 should only add to what's already good enough (for mild overclocks).


Here:


This was after 20 minutes of OCCT 3D-test (looks very much like the FurMark donut to me).

Mind you, my loop setup isn't that optimal: bay-res -> pump -> rad -> cpu -> gpu -> back to the res.

And during the test, both the pump and the 3 Bitfenix Pro LED PWM fans on my rad never spun faster than 25.5%.

In my Phantom case I also have 2 200mm fans in the top, so in a way I have a push-pull setup on my rad, but they are just two old white-LED stock fans from my old Corsair case.


----------



## Raephen

Quote:


> Originally Posted by *Chopper1591*
> 
> Yeah a single gpu.
> 
> Would you be so kind to overclock that i5 to lets say 4.5 and run Heaven or Valley for 10-15 minutes and post results?
> Oh, lol, just saw you are also from the Netherlands.
> 
> I am willing to keep my cpu at 4.6-4.8ghz and clock the card to 1200 or so.


I actually began running OCCT on my gaming rig just after I made that first post.

The i5 is clocked to 4.3GHz, the card at 1100 core, 1250 mem, +13mV.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Chopper1591*
> 
> Quote:
> 
> 
> 
> Originally Posted by *LandonAaron*
> 
> I think those Heatkiller blocks are really good. Performance-PCs had the copper one on sale for 86 dollars a month ago, but now it's back up to like 136, so that doesn't really help you much.
> 
> 
> 
> I just read through the review you posted of the Alphacool block.
> Did you even read it thoroughly?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The summary sums it up pretty good actually:
> Quote:
> 
> 
> 
> *Summary*
> 
> This card is designed to be a budget alternative to the real-deal water blocks. While up front you don't save much ($96 vs the $118 for the EK block), the cost savings instead come when or if you upgrade your GPU. If a new backplate and front plate combo is still being designed and manufactured by then, you'll be able to transition cheaply (assuming the price is cheap) to a fully watercooled new GPU. While the idea and the thermal performance on the core are good, this block/backplate is sloppily designed, unattractive and very restrictive. Outstanding core results only get you so far, but by far the biggest problem is the compatibility issue, which means that this version at least should be avoided.
> 
> Click to expand...
> 
> The thermal pads used are just hilarious.
> 
> 
> 
> 
> 
> 
> 
> 
> 1.5mm on the memory and 3mm on half the backside of the card for the backplate.
> 
> And the pricing ain't that good either.
> Top that off with the major restriction it creates, and you have one unit to avoid.
> 
> I will probably stick to EK and use Fujipoly.
> Will grab that if I can find it reasonably priced, used.
Click to expand...

Considering the block is only on the core, it gives decent performance for the VRMs.

By all means the EK blocks are better, but it was an option provided for you in case you couldn't find the EK one.


----------



## Chopper1591

Quote:


> Originally Posted by *Raephen*
> 
> Here:
> 
> 
> This was after 20 minutes of OCCT 3D-test (looks very much like the FurMark donut to me).
> 
> Mind you, my loop setup isn't that optimal: bay-res -> pump -> rad -> cpu -> gpu -> back to the res.
> 
> And during the test, both the pump and the 3 Bitfenix Pro Ledw PWM fans on my rad never spun faster than 25,5%
> 
> In my Phantom case I also have 2 200mm fans in top, so in a way I have a push-pull setup on my rad, but they are just two old Corsair white led stock fans from m yold Corsair case.


Hehe, you read my mind.

I do hope to get better temps than that, though.
83C on the VRMs is very high for water, IMO.

About the loop order: I don't think any order should make much of a difference.
Some people say putting the rad before the blocks is better, but the water temperature evens out around the loop anyway.

Thanks for posting the results.

But I really think my card is cooler with the stock cooler than yours is under water.
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Considering the block is only on the Core it gives decent performance for the vrms.
> 
> By all means the EK blocks are better but it was an option provided for you if you couldn't find the EK one.


You are right.
And I am thankful for the share.
I've been a bit sick since last weekend, so I might come across a little snappy.









But for the minor price difference I might as well go with the fullcover block.


----------



## kizwan

@Chopper1591, Raephen used OCCT/FurMark to heat up the card. Temps will be a lot higher than with Heaven/Valley.


----------



## Chopper1591

Quote:


> Originally Posted by *kizwan*
> 
> @Chopper1591, Raephen used OCCT/FurMark to heat up the card. Temps will be a lot higher than with Heaven/Valley.


I know that should take the temps higher. But that high?


----------



## Raephen

Quote:


> Originally Posted by *Chopper1591*
> 
> Hehe, you read my mind.
> 
> I do hope to get better temps than that, though.
> 83C on the VRMs is very high for water, IMO.
> 
> About the loop order: I don't think any order should make much of a difference.
> Some people say putting the rad before the blocks is better, but the water temperature evens out around the loop anyway.
> 
> Thanks for posting the results.
> 
> But I really think my card is cooler with the stock cooler than yours is under water.
> You are right.
> And I am thankful for the share.
> I've been a bit sick since last weekend, so I might come across a little snappy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But for the minor price difference I might as well go with the fullcover block.


Yeah, the VRM1 seemed a bit high, yes. But then I thought: OCCT looks like FurMark, so it probably behaves like FurMark too... bloody stressful.

And mind you: both the pump and the relevant fans topped out at a 25.5% duty cycle.

I'm running Heaven 4.0 now, with the pump and fans pegged at 40%.

I'll post back later.


----------



## Raephen

And here it is, after some 15+ minutes of Heaven:



More than reasonable for a rad half the thickness of yours ;-p


----------



## Raephen

Quote:


> Originally Posted by *Chopper1591*
> 
> I know, that should take the temps higher. But that high?


FurMark yes, especially the VRM1.


----------



## welshy46

Looks like it is possible to fit a rev1 EK block to an MSI R9 290X rev2.2, and it's pretty simple too. I drilled out the parts of the acetal that were in the way, applied the pads from the stock cooler to any other likely points of conflict, and put the card onto the block to check for any impressions in the pads. Happy that nothing was likely to short out, I screwed it all together and ran FurMark at 2560x1440 for an hour: the first 20 minutes without fans on the rads, until the GPU temp hit 65C, then 40 minutes with the fans, where it stayed at around 44C. I'm going to run a couple of longer stress tests, but so far everything looks good.



proper ghetto test rig

apologies for potato pics


----------



## Chopper1591

Quote:


> Originally Posted by *Raephen*
> 
> And here it is, after some 15+ minutes of Heaven:
> 
> 
> 
> More than reasonable for a rad half the thickness of yours ;-p


Ok.
I'm sold.

Will most likely not be disappointed.

Do you use the Aquacomputer backplate btw?


----------



## pkrexer

I seem to have run into a new issue since updating to the latest betas. For some reason my 2nd GPU won't downclock at times. If I restart, it fixes itself.

Anyone else run into this?


----------



## Raephen

Quote:


> Originally Posted by *Chopper1591*
> 
> Ok.
> I'm sold.
> 
> Will most likely not be disappointed.
> 
> Do you use the Aquacomputer backplate btw?


No, just the AC full-cover block. And switching the stock AC pads on VRM1 for something else might bring it down a few C.


----------



## Chopper1591

Quote:


> Originally Posted by *Raephen*
> 
> No. Just the AC full cover block. And switching the stock AC pads on VRM1 for something else might bring it down a few C.


Impressive.

The AC backplate seems to be the best there is for VRM cooling.
I am looking around for where I can get one.
It is even compatible with the EK full-cover.


----------



## Raephen

Quote:


> Originally Posted by *Chopper1591*
> 
> Impressive.
> 
> The AC backplate seems to be the best there is in VRM cooling.
> I am kinda looking where I can get one.
> It is even compatible with the EK fullcover.


http://www.aquatuning.nl/waterkoeling/gpu-waterkoelers/gpu-backplates/16363/aquacomputer-backplate-fuer-kryographics-hawaii-r9-290x/290-aktiv-xcs?c=4796

http://www.aquatuning.nl/waterkoeling/gpu-waterkoelers/gpu-backplates/16362/aquacomputer-back-plate-for-kryographics-hawaii-r9-290x/290-passive?c=4796


----------



## pdasterly

Quote:


> Originally Posted by *Raephen*
> 
> http://www.aquatuning.nl/waterkoeling/gpu-waterkoelers/gpu-backplates/16363/aquacomputer-backplate-fuer-kryographics-hawaii-r9-290x/290-aktiv-xcs?c=4796
> 
> http://www.aquatuning.nl/waterkoeling/gpu-waterkoelers/gpu-backplates/16362/aquacomputer-back-plate-for-kryographics-hawaii-r9-290x/290-passive?c=4796


I tried to buy one everywhere, but they're out of stock.


----------



## Raephen

They appear to be in stock at Aquatuning; more than 50 of the active backplate are in stock in their German warehouse.

It is a .nl website, but really it's a German company.


----------



## Chopper1591

Quote:


> Originally Posted by *Raephen*
> 
> http://www.aquatuning.nl/waterkoeling/gpu-waterkoelers/gpu-backplates/16363/aquacomputer-backplate-fuer-kryographics-hawaii-r9-290x/290-aktiv-xcs?c=4796
> 
> http://www.aquatuning.nl/waterkoeling/gpu-waterkoelers/gpu-backplates/16362/aquacomputer-back-plate-for-kryographics-hawaii-r9-290x/290-passive?c=4796


Thanks.

I found it @ FrozenCPU, but they only have the active version, for 53 USD (€42.50).


----------



## LandonAaron

Have you checked out the Heatkiller blocks? They give great cooling for the GPU and VRM without the need for a backplate. They have a backplate available, but it's purely aesthetic from what I understand. The acrylic version is 125 at FCPU. http://www.frozencpu.com/products/25834/ex-blc-1903/HEATKILLER_GPU-X_R9_290X_Acrylic_Edition_Full_Coverage_Water_Block_-_Copper_15031.html?tl=c309s2073b180#blank

It was the second-best block reviewed in that Xtreme Rigs roundup. They call it the "Watercool" in their roundup, but it's called "Heatkiller" elsewhere. http://www.xtremerigs.net/2014/08/11/r9-290x-gpu-waterblock-roundup/


----------



## Chopper1591

Quote:


> Originally Posted by *LandonAaron*
> 
> Have you checked out the Heatkiller Blocks? They give great cooling for GPU and VRM without the need for a back plate. They have a back plate available but its purely aesthetic from what I understand. Acryilic version is 125 at FCPU. http://www.frozencpu.com/products/25834/ex-blc-1903/HEATKILLER_GPU-X_R9_290X_Acrylic_Edition_Full_Coverage_Water_Block_-_Copper_15031.html?tl=c309s2073b180#blank
> 
> It was the second best block reviewed in that extreme rigs roundup. They call it the "Watercool" in their roundup, but its called "heatkiller" elsewhere. http://www.xtremerigs.net/2014/08/11/r9-290x-gpu-waterblock-roundup/


I did.
Maybe I will just grab one of those.

The block is somewhat more expensive than the EK.
But buying the EK with an AC backplate will set me back about 10 euros more.

Tough decision.
A backplate does look cool, though.









And Watercool is the brand, if I am not mistaken.


----------



## tsm106

Quote:


> Originally Posted by *Chopper1591*
> 
> Quote:
> 
> 
> 
> Originally Posted by *LandonAaron*
> 
> Have you checked out the Heatkiller Blocks? They give great cooling for GPU and VRM without the need for a back plate. They have a back plate available but its purely aesthetic from what I understand. Acryilic version is 125 at FCPU. http://www.frozencpu.com/products/25834/ex-blc-1903/HEATKILLER_GPU-X_R9_290X_Acrylic_Edition_Full_Coverage_Water_Block_-_Copper_15031.html?tl=c309s2073b180#blank
> 
> It was the second best block reviewed in that extreme rigs roundup. They call it the "Watercool" in their roundup, but its called "heatkiller" elsewhere. http://www.xtremerigs.net/2014/08/11/r9-290x-gpu-waterblock-roundup/
> 
> 
> 
> I did.
> Maybe I will just grab one of those.
> 
> The block is somewhat more expensive then the EK.
> But buying the EK with an AC backplate will set me back like 10 euro's more.
> 
> Tough decision.
> A backplate does look cool though.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And Watercool is the brand if I am not mistaken.
Click to expand...

I really am in disbelief of that testing, and always have been. It's showing deltas of 40 degrees over water temp. My VRMs don't even breach 45C, let alone go 40 degrees over water temp.


----------



## Gobigorgohome

How far will I probably get with my Sapphire Radeon R9 290X (Hynix) under water with Fujipoly Ultra Extreme? Thinking of selling off two cards, keeping two, and clocking them with PT1; I have no reason to keep all four cards... As long as it can take max settings in pretty much any game at 4K without AA, I am happy. The CPU can reach 5GHz with acceptable voltage; anyway, it is going below zero here now, so I'm not really concerned about voltage.

Will 1200/1400 be a BIG difference from 1000/1250 for gaming? Or should I just keep three cards and overclock them?







I am sure as heck not keeping four of them.


----------



## King PWNinater

OK, so my 290 is watercooled. Should I be able to turn up the voltage by +100mV (and nothing else, just voltage) without any repercussions or crashing/stalling?


----------



## Roboyto

Quote:


> Originally Posted by *King PWNinater*
> 
> Ok, so my 290 is watercooled. Should I be able to turn up the voltage by +100mv (And nothing else, just voltage.) without any repercussions or crashing/stalling?


You probably can turn up the voltage, but what is the point of adding the voltage if you aren't going to increase the clocks? All you will be doing is creating unnecessary stress and heat.


----------



## King PWNinater

Quote:


> Originally Posted by *Roboyto*
> 
> You probably can turn up the voltage, but what is the point of adding the voltage if you aren't going to increase the clocks? All you will be doing is creating unnecessary stress and heat.


I mean, I'm not going to do that, but theoretically, if I did, and if my card could take the heat, I won't get instability/crashes, etc... Right?


----------



## rdr09

Quote:


> Originally Posted by *King PWNinater*
> 
> Ok, so my 290 is watercooled. Should I be able to turn up the voltage by +100mv (And nothing else, just voltage.) without any repercussions or crashing/stalling?


If you're going to OC, you probably need to. Others, like myself, will probably want to lower the voltage from default and hope to stay stable; this even though my temps are already good. Mine is watercooled too. It might move the temps from good to great.

edit: great meaning the core and the VRMs won't see 50C in BF4 under normal ambient.


----------



## bond32

Quote:


> Originally Posted by *Gobigorgohome*
> 
> How far will I probably get my Sapphire Radeon R9 290X Hynix under water with Fujipoly Ultra Extreme? Thinking of selling off two cards and keep two and clock them with PT1, I have no reason to keep all four cards ... as long as it can take max settings in pretty much any game at 4K without AA I am happy. CPU can reach 5 Ghz with acceptable voltage, anyway it is going below zero now so not really concerned about voltage.
> 
> Will 1200/1400 be a BIG difference from 1000/1250 for gaming? Or should I just keep three cards and overclock them?
> 
> 
> 
> 
> 
> 
> 
> I am sure as heck not keeping four of them.


You get out of here with that noise...

Might sell one; I have observed only a small increase when the fourth card was added... But as for temps: I put the k=17 Fujis on mine, and I can honestly say I am blown away by how much lower the VRM temps are than I expected.


----------



## Roboyto

Quote:


> Originally Posted by *King PWNinater*
> 
> I mean, I'm not going to do that, but theoretically, if I did, and if my card could take the heat, I won't get instability/crashes, ect... Right?


Yeah, it shouldn't cause any issues. Although the 290 in my HTPC doesn't like it at all once I surpass ~175mV in Trixx. The only way to find out is to give it a whirl.

Quote:


> Originally Posted by *rdr09*
> 
> if you will oc you prolly need to. others, like myself, will prolly want to lower the voltage from default and hope to stay stable. this even though my temps are already good. mine is watercooled too. might move the temp from good to great.
> 
> edit: great being the core and the vrms won't see 50C in BF4 under normal ambient.


Yup.

If you are going to run stock then this is the road I would be going down. Same concept applies with decreasing voltage to sustain a clock as it does to increase volts/clocks. With the increase in efficiency due to the lower temperatures you may be able to drop them more than you would think.

Quote:


> Originally Posted by *Gobigorgohome*
> 
> How far will I probably get my Sapphire Radeon R9 290X Hynix under water with Fujipoly Ultra Extreme? Thinking of selling off two cards and keep two and clock them with PT1, I have no reason to keep all four cards ... as long as it can take max settings in pretty much any game at 4K without AA I am happy. CPU can reach 5 Ghz with acceptable voltage, anyway it is going below zero now so not really concerned about voltage.
> 
> Will 1200/1400 be a BIG difference from 1000/1250 for gaming? Or should I just keep three cards and overclock them?
> 
> 
> 
> 
> 
> 
> 
> I am sure as heck not keeping four of them.


The overclocking capabilities of your card depend on the silicon lottery more than anything, really. The AMD reference cooler is sufficient to tell if you have a capable overclocker or not. Watercooling can extend the clocks you are able to achieve and/or lower the voltage required to hit a given clock. The first pair of 290s I bought were identical and 5 serial #s apart. One was a 'dud' that was capable of only a ~10% overclock to core or RAM.

The other, which I am still using, clocks up to 1300/1700 under water and was able to make it to ~1215/1675, depending on the bench, with the stock blower. Before the full-cover block I needed ~115mV to get to a ~1200 core clock; after going under water that was reduced to ~85mV.

You should see a solid jump in performance going to 1200 on the core clock. The 150MHz jump on the RAM is only good for a very minor performance gain. Going all the way up to 1700 on my card netted, on average across 3DMark11 Performance/Xtreme, Unigine Heaven/Valley, and the FFXIV benchmark, only a 5.5% gain in bench scores. You can see my spreadsheet from my benches back in Feb here: http://www.overclock.net/t/1456279/honey-i-shrunk-the-ultra-tower-beast-my-journey-to-creating-a-more-compact-pc-with-an-r9-290/20_20#post_21847939
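For context, a percentage gain like that is just the relative score change averaged across benches. A sketch with hypothetical scores (the numbers below are made up to illustrate a ~5.5% average, not my actual results):

```python
def pct_gain(old_score, new_score):
    """Relative gain of new_score over old_score, in percent."""
    return (new_score / old_score - 1) * 100

# Hypothetical bench scores illustrating a ~5.5% average gain from the RAM clock bump
scores = [(10000, 10480), (5000, 5310)]
gains = [pct_gain(old, new) for old, new in scores]
print(round(sum(gains) / len(gains), 1))  # 5.5
```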


----------



## Chopper1591

Quote:


> Originally Posted by *tsm106*
> 
> I really am in disbelief of that testing, and always been. It's showing deltas of 40 degrees over water temp.My vrms don't even breach 45c let alone go 40 degrees over water temp.


What do you advise then?
Maybe the absolute temperatures are off.
But do you disagree with the ranking of the blocks too?

I am just looking for proper cooling (especially VRM) for not too much money.
Quote:


> Originally Posted by *King PWNinater*
> 
> Ok, so my 290 is watercooled. Should I be able to turn up the voltage by +100mv (And nothing else, just voltage.) without any repercussions or crashing/stalling?


Quote:


> Originally Posted by *King PWNinater*
> 
> I mean, I'm not going to do that, but theoretically, if I did, and if my card could take the heat, I won't get instability/crashes, ect... Right?


Of course.
The stock cooler can probably take it, mine can at least.

Can you give some more info on how you have it cooled?
What block? In which loop?
Quote:


> Originally Posted by *bond32*
> 
> You get out of here with that noise...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Might sell one, I have observed only a small increase when the fourth card was added... But as for temps, put the k=17 fuji's on mine, can honestly say I am blown away by how much lower the VRM temps are from what I expected.


Quote:


> Originally Posted by *Roboyto*
> 
> Yeah, it shouldn't cause any issues. Although, the 290 in my HTPC doesn't like it at all once I surpass ~175mV in Trixx. Only way to find out is to give it a whirl.
> 
> Yup.
> 
> If you are going to run stock then this is the road I would be going down. Same concept applies with decreasing voltage to sustain a clock as it does to increase volts/clocks. With the increase in efficiency due to the lower temperatures you may be able to drop them more than you would think.
> 
> The overclocking capabilities of your card are more reliant on the Silicon Lottery than anything really. The AMD reference cooler is sufficient to tell if you have a capable overclocker or not. Watercooling can expand the clocks in which you are able to achieve and/or lower the voltage required to hit a given clock. The first pair of 290s I bought were identical and 5 serial #s apart. One was a 'dud' that was capable of ~10% overclock to core or RAM.
> 
> The other, which I am still using, clocks up to 1300/1700 under water and was able to make it to ~1215/1675, depending on the bench, with the stock blower. Before the full cover block I was needing ~115mV to get to ~1200 core clock; After going under water that was reduced to ~85mV.
> 
> You should see a solid jump in performance going to 1200 on the core clock. The 150MHz jump on the RAM is only good for a very minor performance jump. Going all the way up to 1700 on my card netted on average, between 3DMark11 Performance/Xtreme , Unigine Heaven/Valley, and FFXIV Benchmark, only a 5.5% gain in bench scores. You can see my spreadsheet from my benches back in Feb here: http://www.overclock.net/t/1456279/honey-i-shrunk-the-ultra-tower-beast-my-journey-to-creating-a-more-compact-pc-with-an-r9-290/20_20#post_21847939


I hadn't looked at it that way.
That's actually a good tip.

Might try and see how far the card goes with 100% fan speed.
Under water it should be able to keep at least the same temps, right?

And what do you advise for testing stability?
I have had 3DMark runs that were stable, well, at least artifact-free. Then when I started Far Cry at that clock it almost instantly started artifacting.

Edit:
Did a quick test.

1200 core +150mv 80% fixed fan speed:


----------



## kizwan

@Chopper1591, the results I got with my two 290s with EK water blocks & the stock EK thermal pads on VRM1 are pretty much in line with Stren's result for the EK water block, so I believe Stren's review. The Fujipoly Extreme thermal pad helps a lot, reducing the delta by more than half. FYI, I've known about Fujipoly thermal pads for a long time now and never thought I was going to need them. The VRM on my previous GPU, a 5870, ran cool with the EK water block & stock EK thermal pad.


----------



## Chopper1591

Quote:


> Originally Posted by *kizwan*
> 
> @Chopper1591, the results I got with my two 290s with EK water blocks & the stock EK thermal pads on VRM1 are pretty much in line with Stren's result for the EK water block, so I believe Stren's review. The Fujipoly Extreme thermal pad helps a lot, reducing the delta by more than half. FYI, I've known about Fujipoly thermal pads for a long time now and never thought I was going to need them. The VRM on my previous GPU, a 5870, ran cool with the EK water block & stock EK thermal pad.


Good to hear some more experience.

I probably won't need the Fuji, no.
But knowing me, if it can be better... it has to be.

We are on OC.net here.









Any tips on how to do the loop?



I can leave the CPU tubing as-is and change the tubing that currently goes res>rad to rad>gpu>res.
Or change rad>cpu>res to rad>cpu>gpu>res.

I think the latter will look cleaner.

Current, as is:


----------



## kizwan

Quote:


> Originally Posted by *Chopper1591*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> @Chopper1591, the results I got with my two 290's with EK water blocks & stock EK thermal pads on the VRM1 pretty much in line with Stren's result for EK water block. So I believe with Stren's review. Fujipoly Extreme thermal pad helps a lot, reducing the delta more than half. FYI, I already know about Fujipoly thermal pad for a long time now & I've never thought I'm going to need it ever. VRM on my previous GPU, 5870 running cool with EK water block & stock EK thermal pad.
> 
> 
> 
> 
> 
> 
> 
> Good to hear some more experience.
> 
> I probably won't need the Fuji no.
> But knowing me, if it can be better... it has to.
> 
> We are on OC.net here.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any tips on how to do the loop?
> 
> 
> 
> I can leave the cpu tubing as is and change the tubing that goes from res>rad now to rad>gpu>res.
> Or change the rad>cpu>res to rad>cpu>gpu>res.
> 
> I think the latter will look cleaner.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Current, as is:
Click to expand...

Choose the order or route that looks good to you. If I were you, I would:

Rotate the top radiator so that the water ports are at the back.
Connect tubing from bay-res >> CPU >> top rad >> GPU >> bay-res.

To me this way looks much cleaner. Also, you do need another rad, 120mm or 240mm; I recommend at least the latter.

This is how my loop looks currently. Not built with aesthetics in mind, if I'm honest. The red dye was a bad choice; both the tubing & res are now stained because of it.


----------



## Chopper1591

Quote:


> Originally Posted by *kizwan*
> 
> Choose the order or route that looks good to you. If I were you, I would:
> 
> Rotate the top radiator so that the water ports are at the back.
> Connect tubing from bay-res >> CPU >> top rad >> GPU >> bay-res.
> 
> To me this way looks much cleaner. Also, you do need another rad, 120mm or 240mm; I recommend at least the latter.
> 
> This is how my loop looks currently. Not built with aesthetics in mind, if I'm honest. The red dye was a bad choice; both the tubing & res are now stained because of it.












That's why I decided to go with colored tubing.
I don't like the staining that dyes cause. And they all do.

I don't plan on adding another radiator though. I don't think I need it.
Maybe later.

And sadly turning the radiator is a no-go, as I have already made two holes in the top of the case for the tubing...


----------



## kizwan

Quote:


> Originally Posted by *Chopper1591*
> 
> And sadly turning the radiator is a no-go, as I have already made two holes in the top of the case for the tubing...


I missed that your radiator is external. Your current tube routing is fine too: bay-res >> rad >> CPU >> GPU >> bay-res.


----------



## Roboyto

Quote:


> Originally Posted by *Chopper1591*
> 
> I am just looking for proper cooling(especially vrm) for not too expensive.
> 
> Under water it should be able to keep at least the same temps, right?
> 
> And what do you advice to test stability with?
> I have had 3dmark runs stable, well at least artifact free. Then when I started Far Cry with that clock it almost instantly started artifacting.


The Fujis are worth the money for the performance increase they give.

Under water it should give better temperatures than air; however, you mentioned you didn't want to add any more radiator space. You will not see temperatures as good as others' with only a 360mm radiator for an 8320 and a 290, especially if you plan on overclocking/volting both of them. You might be able to squeak by with an Intel CPU, but that FX furnace you are running throws a wrench in the works.

Since you have that massive 200mm fan up front I would stick another radiator right there. A 120mm at the minimum, as @kizwan suggested, but I would be stuffing a Phobya Xtreme 200mm up there if it were me: http://www.microcenter.com/product/393102/Xtreme_200mm_Liquid_Cooling_Radiator

You must take into consideration that a benchmark is simply that: a benchmark. It can't cover all the possibilities for code, graphics engines, etc. that your GPU is going to see. I've suggested it 100 times and I will again: the Final Fantasy XIV benchmark is my go-to for unquestionable stability. The clocks and voltages I find for running the FFXIV bench have worked for anything else I have thrown at them.

Also, artifact-free isn't the only thing to be concerned with. If a run appears clean but your score drops, it isn't really stable, because performance decreased. If that happens, you need to lower clocks/voltages and try again.
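That score-drop check is easy to automate if you log your bench scores. A minimal sketch in Python; the baseline score and the 2% tolerance are made-up illustrative values, not FFXIV-specific numbers:

```python
# Sketch of the "clean run but lower score" stability check described above.
# A run counts as unstable if its score falls more than `tolerance` below
# a known-good baseline, even when no artifacts were visible.

def is_stable(baseline, scores, tolerance=0.02):
    """True only if every run stays within `tolerance` of the baseline score."""
    return all(s >= baseline * (1 - tolerance) for s in scores)

baseline = 11500  # hypothetical known-good bench score at stock clocks
print(is_stable(baseline, [11480, 11510, 11495]))  # True: scores held up
print(is_stable(baseline, [11490, 10900]))         # False: one run dropped ~5%
```

If a clean-looking run fails a check like this, lower clocks/voltages and re-test.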


----------



## Chopper1591

Quote:


> Originally Posted by *kizwan*
> 
> I missed that your radiator is external. Your current tube routing is fine too: bay-res >> rad >> CPU >> GPU >> bay-res.


No problem.
If I could turn back time I probably would have made the holes in the back, so I could have rotated the rad...

A UT60 should do fine with a 290 and FX-8320, judging by the temps with the CPU only. Fans at 600-800rpm give very good temps.


----------



## Sempre

Is ~90C with 35% fan normal? I know that reference coolers are not the best, but I didn't expect it to be this severe!
This is after a couple of minutes of watching YouTube full screen (Flash, not HTML5).
Hardware acceleration in Waterfox is disabled and ambient is 24C.


----------



## Chopper1591

Quote:


> Originally Posted by *Sempre*
> 
> Is ~90C with 35% fan normal? I know that reference coolers are not the best but i didn't expect it to be this severe!
> This is after a couple of minutes of watching Youtube full screen (Flash, not HTML5).
> Hardware acceleration in Waterfox is disabled and ambient is 24C.


You are running CF, right?

90c is pretty high if you ask me.
Why do you have a fixed 35% fan setting?
I would leave it on auto and see how it goes.

How is the airflow in your case?
Tried with the side panel off, to rule things out?


----------



## Sempre

Quote:


> Originally Posted by *Chopper1591*
> 
> You are running CF, right?
> 
> 90c is pretty high if you ask me.
> Why do you have a fixed 35% fan setting?
> I would leave it on auto and see how it goes.
> 
> How is the airflow in your case?
> Tried with the side panel off, to rule things out?


No, only one card is installed right now temporarily.
I have it at 35% when not gaming because on auto it'll be at 20%, the temp will rise, the fans will speed up, and so on. So far 35% worked for me, until I watched a long YouTube playlist and this happened.

Ventilation isn't so great with the HDD cage and cables going on, but I don't think it would matter much, would it? The CPU is watercooled and the card blows hot air outside the case, so there isn't hot air sticking around inside.

Just tried 40% with the same playlist and it's now around ~60C.

Edit: There is no difference when removing the side panel.


----------



## Sonic_AFB

1150/1325 +0.44 mv on a Asus R9 290x watercooled is good? tested a lot of hours on BF4 0 artifacts, and what is the maximun safe voltage? my asic is 73.9%


----------



## tsm106

Quote:


> Originally Posted by *Sempre*
> 
> So far 35% worked for me until i watched a long Youtube playlist and this happened.


Hmm, that reads like hw accel is enabled for Flash/YouTube. If you don't want the GPU being used and creating heat, disable hw accel.


----------



## Sempre

Quote:


> Originally Posted by *tsm106*
> 
> Hmm, that reads like hw accel is enabled on the flash/utube. If you don't want the gpu used creating heat, disable hw accel.


I already said I disabled HW acceleration in Waterfox. But maybe this setting doesn't affect the Flash plugin?
I'll try HTML5 and see what happens.


----------



## Chopper1591

Quote:


> Originally Posted by *Sempre*
> 
> No, only one card is installed right now temporarily.
> I have it 35% when not gaming because at auto it'll be 20% and the temp will rise, fans will speed up and so on. So far 35% worked for me until i watched a long Youtube playlist and this happened.
> 
> Ventilation isn't so great with the HDD cage and cables going on, but i don't think it would matter much would it. The cpu is watercooled and the card blows hot air outside the case so there isn't hot air sticking around inside.
> 
> Just tried out 40% with the same playlist and it's now around ~60C.
> 
> Edit: There is no difference when removing the side panel.


You could do like me and create a custom fan curve in Afterburner or Sapphire TriXX, where you just raise the minimum fan speed.
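For anyone unsure what a raised-minimum curve looks like, here is a rough sketch of the idea in Python. The breakpoints and percentages are made-up illustrative values, not recommended settings; Afterburner/TriXX do this interpolation for you, this only shows the temperature-to-fan-speed mapping:

```python
# Illustrative custom fan curve: minimum speed raised to 40% so the fan
# never idles too low, with linear ramps between (temp C, fan %) points.
FAN_CURVE = [(0, 40), (60, 40), (75, 60), (85, 80), (94, 100)]

def fan_speed(temp_c):
    """Linearly interpolate fan % between curve points, clamped at both ends."""
    if temp_c <= FAN_CURVE[0][0]:
        return FAN_CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(FAN_CURVE, FAN_CURVE[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return FAN_CURVE[-1][1]

print(fan_speed(50))  # 40.0: idle stays at the raised minimum
print(fan_speed(80))  # 70.0: halfway up the 75-85C ramp
```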


----------



## tsm106

Quote:


> Originally Posted by *Sempre*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Hmm, that reads like hw accel is enabled on the flash/utube. If you don't want the gpu used creating heat, disable hw accel.
> 
> 
> 
> I already said i disabled HW acceleration in waterfox. But maybe this setting doesn't affect the Flash plugin?
> I'll try out HTML5 and see what happens.
Click to expand...

If you want it disabled in flash, you have to do it on the plugin itself. Right click the video, etc...


----------



## Sempre

Ok, I just tried HTML5 in IE and it dropped to ~80C. Not much, but still better than 90+.
Quote:


> Originally Posted by *Chopper1591*
> 
> You could do like me, and create an custom fan curve in Afterburner or Sapphire Trixx.
> Where you just raise the minimum fan speed.


+R Yeah I have a profile for gaming and it's pretty aggressive. But I didn't think I needed one for video watching








Quote:


> Originally Posted by *tsm106*
> 
> If you want it disabled in flash, you have to do it on the plugin itself. Right click the video, etc...


Thanks tsm, I totally missed that. Right-clicked it and disabled HW acceleration; temps dropped to ~75C +R

Obviously that's still not a good temp to sit at, but at least I now know what the issue was. I guess I'll have to make a custom profile for watching videos until I pick up used blocks.


----------



## chiknnwatrmln

Helping my friend with his first build, budget of $1200 including OS and monitor. We ordered a PowerColor PCS+ 290; we got it for $230 with an additional $20 rebate. Once we get it built I'll post some pics.

Also, I think I just lost my 290... I was playing Skyrim and my PC locked up. After a hard reset I got stuck in a bluescreen boot loop. I got to safe mode and uninstalled the drivers, rebooted, reinstalled, rebooted, and now my PC won't recognize my 2nd GPU and I get this.



I'm really bummed, I haven't been running CF for that long and I don't have the money to replace the card.


----------



## Gobigorgohome

Quote:


> Originally Posted by *bond32*
> 
> You get out of here with that noise...
> 
> Might sell one, I have observed only a small increase when the fourth card was added... But as for temps, put the k=17 fuji's on mine, can honestly say I am blown away by how much lower the VRM temps are from what I expected.


Yes, the fourth card is pretty much meaningless... anyway, at the settings I am using in BF4 I am getting 100+ FPS, so there is no need for the fourth card. I have VRM temperatures in the mid 50s at 1100/1300 with no voltage added. It crashed at Firestrike Extreme but runs everything else fine; when I overclock these cards it will be on PT1.









Might just sell one and keep three then, two Hynix and one Elpida. Or should I sell off both Elpidas and get a Lightning/Matrix or something to pair up with the two Hynix cards? The Matrix is quite cheap in Norway these days, around 414 USD shipped (original price 660 USD), 30 days return policy and so on. All my cards have Fujipoly Ultra Extreme already, btw.








Quote:


> Originally Posted by *Roboyto*
> 
> The overclocking capabilities of your card are more reliant on the Silicon Lottery than anything really. The AMD reference cooler is sufficient to tell if you have a capable overclocker or not. Watercooling can expand the clocks in which you are able to achieve and/or lower the voltage required to hit a given clock. The first pair of 290s I bought were identical and 5 serial #s apart. One was a 'dud' that was capable of ~10% overclock to core or RAM.
> 
> The other, which I am still using, clocks up to 1300/1700 under water and was able to make it to ~1215/1675, depending on the bench, with the stock blower. Before the full cover block I was needing ~115mV to get to ~1200 core clock; After going under water that was reduced to ~85mV.
> 
> You should see a solid jump in performance going to 1200 on the core clock. The 150MHz jump on the RAM is only good for a very minor performance jump. Going all the way up to 1700 on my card netted on average, between 3DMark11 Performance/Xtreme , Unigine Heaven/Valley, and FFXIV Benchmark, only a 5.5% gain in bench scores. You can see my spreadsheet from my benches back in Feb here: http://www.overclock.net/t/1456279/honey-i-shrunk-the-ultra-tower-beast-my-journey-to-creating-a-more-compact-pc-with-an-r9-290/20_20#post_21847939


Yes, I did not really have the time to test out the air cooler; 100% fan speed, 78C and stock clocks was all I wanted to do with that thing... been on water since late June. Have not really done any clocking of them yet. Could just throw in all the cards and see what they are capable of on PT1. I have 2x MO-RA3 420 LTs with 18 fans to cool them, so they are not getting too hot.







With 9 fans they are 46C core at load, VRM around 55C I think.


----------



## pdasterly

What does the PT1 bios do?


----------



## Gobigorgohome

Quote:


> Originally Posted by *pdasterly*
> 
> what does pt1 bios do?


I think it is a modded Asus bios which unlocks voltage control(?). Have not really used it yet, so I am not 100% sure. @bond32 knows


----------



## Spectre-

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I think it is a modded Asus-bios, which unlocks voltage control(?), have not really used it yet so I am not 100% sure. @bond32 knows


Unlocked voltage, no downclocking, and constantly running 1000/1250MHz with your stock volts all the time.


----------



## Chopper1591

Quote:


> Originally Posted by *Sempre*
> 
> Ok i just tried HTML5 in IE and it dropped to ~80C. Not much but still better than 90+
> +R Yeah I have a profile for gaming and it's pretty aggressive. But I didn't think I needed one for video watching
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks tsm I totally missed that. Right clicked it and disabled HW acceleration, Temps dropped to ~75C +R
> 
> Obviously that's still not a good temp to sit on but at least i now know what was the issue. I guess i'll have to make a custom profile for watching videos until i pick up used blocks.


Has this been asked before?
What are your idle temps?

I would really create the custom fan curve and use it for everything. Your fan speed seems to be the problem, after all.
What you could do is make the curve start at ~40%. I don't know at which % the fan starts to be audible; mine is pretty quiet around 46%.

Try some things out.

If none of that works, you have no other option than to increase airflow.
Do you have a side intake fan?
Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Helping my friend with his first build, budget of $1200 including os and monitor. We ordered a Powercolor PCS+ 290, we got it for $230 and an additional $20 rebate. Once we get it built I'll post some pics.
> 
> Also, I think I just lost my 290... I was playing Skyrim and my PC locked up, did a hard reset and got stuck in a boot loop bluescreen. I got to safemode and uninstalled drivers, rebooted, reinstalled, rebooted, and now my PC won't recognize my 2nd GPU and I get this.
> 
> 
> 
> I'm really bummed, I haven't been running CF for that long and I don't have the money to replace the card.


That sucks.

Have you tried with just the one card?

And... no warranty?


----------



## Gobigorgohome

Quote:


> Originally Posted by *Spectre-*
> 
> unlocked voltage no downclocking and constantly running 1000/1250 mhz with your stock volts all the time


Yeah, isn't that pretty standard for any R9 290X bios? My Sapphire is already running 1000/1250 with stock volts, but the whole "good thing" with PT1 is that you/I/whoever can use GPU Tweak for overclocking. Blah... do not really know that much about PT1.


----------



## Spectre-

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Yeah, that is pretty general for any R9 290X bios? My Sapphire Radeon already running 1000/1250 with stock volts, but the whole "good thing" with PT1 is that you/I/whoever can use GPU tweak for overclocking. I think PT1 has unlocked voltage up to 1,4 volts (which is seen like the safe limit) while the PT3 has beyond 1,4 volts (not considered safe), that is pretty much the understanding I have gotten from the threads about it. I will only do the "safe version" on my cards.


What I meant was that the cards won't downclock and will use full power all the time (basically 3D mode).
lol, PT1 lets me go up to 1.9 volts.
I bench at 1.55 volts.

My stock bios does 1.43.


----------



## Chopper1591

Quote:


> Originally Posted by *Spectre-*
> 
> what i meant was that the cards wont downclock and they will use full power all the time ( basically 3d mode)
> lol PT1 lets me go upto 1.9 volts
> i bench at 1.55 volts
> 
> my stock bios does 1.43


2nd monitor attached?


----------



## Spectre-

Quote:


> Originally Posted by *Chopper1591*
> 
> 2nd monitor attached?


dont quite get what you mean


----------



## Sgt Bilko

Quote:


> Originally Posted by *Spectre-*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Chopper1591*
> 
> 2nd monitor attached?
> 
> 
> 
> dont quite get what you mean
Click to expand...

He is thinking that you mean the card was stuck at 3D clocks because of a second monitor attached and not because you were running the PT1 Bios which *locks* your clocks at their 3D setting


----------



## Gobigorgohome

Quote:


> Originally Posted by *Spectre-*
> 
> what i meant was that the cards wont downclock and they will use full power all the time ( basically 3d mode)
> lol PT1 lets me go upto 1.9 volts
> i bench at 1.55 volts
> 
> my stock bios does 1.43


As I said, I've never used the PT1/PT3 bios on an R9 290X, so I do not know. Hmmm, I have not really thought about the downclocking; does it matter much for the lifespan of the cards?

1.9 volts, okay. I am going to take it slow, to start with at least.


----------



## Spectre-

Quote:


> Originally Posted by *Sgt Bilko*
> 
> He is thinking that you mean the card was stuck at 3D clocks because of a second monitor attached and not because you were running the PT1 Bios which *locks* your clocks at their 3D setting


oh no

its 24/7 3d clocks max power all the time


----------



## kizwan

Quote:


> Originally Posted by *Chopper1591*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Spectre-*
> 
> what i meant was that the cards wont downclock and they will use full power all the time ( basically 3d mode)
> lol PT1 lets me go upto 1.9 volts
> i bench at 1.55 volts
> 
> my stock bios does 1.43
> 
> 
> 
> 2nd monitor attached?
Click to expand...

With the PT1 & PT3 vBIOS, your card will always run at 3D clocks. No idle clocks.


----------



## Chopper1591

Quote:


> Originally Posted by *Sgt Bilko*
> 
> He is thinking that you mean the card was stuck at 3D clocks because of a second monitor attached and not because you were running the PT1 Bios which *locks* your clocks at their 3D setting


True.
Quote:


> Originally Posted by *kizwan*
> 
> With the PT1 & PT3 vBIOS, your card will always run at 3D clocks. No idle clocks.


Didn't know that...
Don't want that either.









Pretty satisfied with the bios as is on my 290 tri-x.
Has a pretty solid constant clock out of the box.


----------



## LandonAaron

Quote:


> Originally Posted by *Sempre*
> 
> Is ~90C with 35% fan normal? I know that reference coolers are not the best but i didn't expect it to be this severe!
> This is after a couple of minutes of watching Youtube full screen (Flash, not HTML5).
> Hardware acceleration in Waterfox is disabled and ambient is 24C.


This doesn't seem right to me. I only used the reference cooler for a few days before I put the card under water, but hitting 90C on a YouTube video seems pretty "severe", as you said. The first thing I would try is setting your fan back to the default "Auto" mode and seeing how high the temps get and how far the fan ramps up. If the fan goes into leaf-blower mode and the temps are still skyrocketing, you have a problem, and if your card is overheating now it will most definitely be blowing up once you start 3D gaming. If it's a new card I would RMA it. If it's a used card I would reapply the thermal paste and re-seat the cooler.

I know you said disabling HW acceleration for YouTube videos fixed the issue, but that's more of a workaround: right now you are just not using the card, and its thermal problems are still going to be there the next time you do.


----------



## Chopper1591

Quote:


> Originally Posted by *LandonAaron*
> 
> This doesn't seem right to me. I only used the reference cooler for a few days before I put the card under water, but hitting 90C on a YouTube video seems pretty "severe", as you said. The first thing I would try is setting your fan back to the default "Auto" mode and seeing how high the temps get and how far the fan ramps up. If the fan goes into leaf-blower mode and the temps are still skyrocketing, you have a problem, and if your card is overheating now it will most definitely be *blowing up* once you start 3D gaming. If it's a new card I would RMA it. If it's a used card I would reapply the thermal paste and re-seat the cooler.
> 
> I know you said disabling HW acceleration for YouTube videos fixed the issue, but that's more of a workaround: right now you are just not using the card, and its thermal problems are still going to be there the next time you do.


----------



## YellowBlackGod

May I join? Just arrived. The best GPU on the market! :D It gave me a WEI score of 8.9 (!!!) on Windows 8.1, paired with an ASRock Z97 Extreme6 and an i7 4790K. Complete rig pictures still to come.


----------



## xer0h0ur

Not gonna lie, wish I had two of those ^


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Chopper1591*
> 
> Has this been asked before?
> What are your idle temps?
> 
> I would really create the custom fan cruve and use it for everything. Your fan speed seems to be the problem after all.
> What you could do is make the curve start at ~40%? I don't know till which % the fan starts to be audible. Mine is pretty quiet around 46%.
> 
> Try some things out.
> 
> Of none works you have no other option then to increase airflow.
> Do you have a side intake fan?
> That sucks.
> 
> Have you tried with just the one card?
> 
> And... no warranty?


My card is under warranty, but after leaving my PC for a bit it started up just fine and both cards are working. Go figure, I'm not complaining lol.


----------



## BLOWNCO

Quote:


> Originally Posted by *xer0h0ur*
> 
> Not gonna lie, wish I had two of those ^


I'd take 3 of them for sure.


----------



## Mega Man

meh ill stick with my watercooling !


----------



## Regnitto

Quote:


> Originally Posted by *YellowBlackGod*
> 
> 
> 
> May I join? Just arrived. The best GPU on the market! :D It gave me a WEI score of 8.9 (!!!) on Windows 8.1, paired with an ASRock Z97 Extreme6 and an i7 4790K. Complete rig pictures still to come.


that card would match my rig perfectly!


----------



## tsm106

Quote:


> Originally Posted by *Mega Man*
> 
> meh ill stick with my watercooling !


My fancy pants aircoolers are all sitting comfy in their boxes sleeping.


----------



## Mega Man




----------



## Chopper1591

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> My card is under warranty, but after leaving my PC for a bit it started up just fine and both cards are working. Go figure, I'm not complaining lol.


Good to hear.

Although I wouldn't settle for that...
I'd want to know why it was behaving so badly.
Quote:


> Originally Posted by *Mega Man*
> 
> meh ill stick with my watercooling !


Agreed.
Once you go swimming, you never go back.
Quote:


> Originally Posted by *Regnitto*
> 
> that card would match my rig perfectly!


My eyes...

Can't see the rig at all. It's all lights.


----------



## YellowBlackGod

Quote:


> Originally Posted by *Regnitto*
> 
> that card would match my rig perfectly!


It matches the ASRock Z97 and the CoolerMaster Hyper 103 cooler perfectly colorwise. It just happens that I have a white-LED bottom case fan as well, and a Super Flower Leadex PSU with white LEDs in the connectors. But that's OK. I will still just change two RAM sticks from white (or black) to blue for better color matching (RAM is Kingston HyperX Fury 1866MHz). What do you think would suit better for RAM: 2 black / 2 blue, or 2 white / 2 blue modules?


----------



## Unknownm

With stock TIM and fan profiles, with the ghetto duct-tape mod: 80/90c load from the Valley benchmark at 1000MHz, no voltage added.

After re-applying TIM using AS5 and the same benchmark: 82c on the top card & 75c on the bottom card with +5 core voltage @ 1050MHz.
Quote:


> Originally Posted by *Spectre-*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> He is thinking that you mean the card was stuck at 3D clocks because of a second monitor attached and not because you were running the PT1 Bios which *locks* your clocks at their 3D setting
> 
> 
> 
> oh no
> 
> its 24/7 3d clocks max power all the time
Click to expand...

I used that PT1 bios on my older R9 290. Idle was about 80W according to GPU-Z.

If you are like me and want 3D clocks all the time but wanna save some power when you are AFK, just create a new profile with the lowest voltage for core/mem; it dropped idle power from 80W down to 50W, just from voltage control.

But make sure you load the default voltages back before running anything demanding.


----------



## Arizonian

Quote:


> Originally Posted by *YellowBlackGod*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> May I join? Just arrived. The best GPU on the market! :D It gave me a WEI score of 8.9 (!!!) on Windows 8.1, paired with an ASRock Z97 Extreme6 and an i7 4790K. Complete rig pictures still to come.


Congrats - added


----------



## Chopper1591

Quote:


> Originally Posted by *YellowBlackGod*
> 
> It matches the ASRock Z97 and the CoolerMaster Hyper 103 cooler perfectly colorwise. It just happens that I have a white-LED bottom case fan as well, and a Super Flower Leadex PSU with white LEDs in the connectors. But that's OK. I will still just change two RAM sticks from white (or black) to blue for better color matching (RAM is Kingston HyperX Fury 1866MHz). What do you think would suit better for RAM: 2 black / 2 blue, or 2 white / 2 blue modules?


I advise you to just use two sticks of RAM, to avoid stability issues.
Go with the color you like best.


----------



## YellowBlackGod

Quote:


> Originally Posted by *Chopper1591*
> 
> I advise you to just use two sticks of RAM, to avoid stability issues.
> Go with the color you like best.


They are on the way now. Can't go back. But I already use 4. The motherboard is updated to the latest BIOS and everything seems to work pretty solidly. Why would 4 RAM sticks affect stability?


----------



## Chopper1591

Quote:


> Originally Posted by *YellowBlackGod*
> 
> They are on the way now. Can't go back. But I already use 4. The motherboard is updated to the latest BIOS and everything seems to work pretty solidly. Why would 4 RAM sticks affect stability?


When overclocking, 2 vs 4 sticks makes a difference.
4 sticks put more stress on the IMC, thus needing more voltage to be stable.

Then again, I use 2400 C9 RAM, which adds some more stress.

I would probably go with the 2 white / 2 blue, for the contrast against the black board.
My color scheme is pretty messed up anyway: red RAM vs a black board and white tubing.


----------



## YellowBlackGod

Quote:


> Originally Posted by *Chopper1591*
> 
> When overclocking, 2 vs 4 sticks makes a difference.
> 4 sticks put more stress on the IMC, thus needing more voltage to be stable.
> 
> Then again, I use 2400 C9 RAM, which adds some more stress.
> 
> I would probably go with the 2 white / 2 blue, for the contrast against the black board.
> My color scheme is pretty messed up anyway: red RAM vs a black board and white tubing.


I will go with blue and white then. It suits well, I think: with blue & white LEDs and a blue/black motherboard it will look quite nice. As for overclocking, for now I will run all four cores in turbo mode at 4.4GHz. Later on I may use ASRock's auto-overclock function.


----------



## Regnitto

Quote:


> Originally Posted by *Chopper1591*
> 
> My eyes...
> 
> Can't see the rig at all. It's all lights.


it's a bad pic. I'll post another one tonight where the light isn't glaring so much.


----------



## Chopper1591

Quote:


> Originally Posted by *YellowBlackGod*
> 
> I will go with blue and white then. It suits well, I think: with blue & white LEDs and a blue/black motherboard it will look quite nice. As for overclocking, for now I will run all four cores in turbo mode at 4.4GHz. Later on I may use ASRock's auto-overclock function.


No... don't do it.

Death to auto overclocking.
Manual is the way to go.

Auto will unnecessarily raise voltages.

This is my current:


Will go higher pretty easy, but I like my system cool and silent.
Quote:


> Originally Posted by *Regnitto*
> 
> it's a bad pic. I'll post another one tonight where the light isn't glaring so much.












Waiting in anticipation.


----------



## Mugamat

Here are some new OC results. I'm still a newbie at this, but I guess my R9 290 does well. For some reason the Fire Strike test didn't recognize my clocks and showed stock speeds in the result, along with a driver problem. I will try to run it again as soon as possible. For now I can share my Furmark and Valley results.
About the card: it's a VisionTek R9 290 with a G10 bracket and a Corsair H55, also upgraded with Gelid ICY heatsinks on the VRMs and memory.


Spoiler: Warning: Spoiler!








I think I can get a bigger result after upgrading. I'm going to upgrade the CPU to an i7 and install a custom liquid cooling loop. Sorry for my bad English. :)


----------



## Performer81

OMG, I wouldn't use Furmark with that voltage. It puts tons of load on the card; even at stock settings most cards overheat.


----------



## Agent Smith1984

My 290 has turned my rig into a friggin' space heater when it's overclocked...

The card temps themselves are good for air (74c core / 84c VRM), but the heat that the cooler is dissipating inside the case is insane...
It even makes my CPU unstable (it's watercooled, but the rad is internal and sucking up all that hot air).
Of course, this is only if the door is shut on the front. As soon as I open the door, the ambient temp goes down 10 degrees and the issues go away.

The long-term plan is to get this card under water, but for the short term I am thinking of making the following changes. This should also reduce our heating needs in the living room, I think.











Thoughts??


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> My 290 has turned my rig into a friggin' space heater when it's overclocked...
> 
> The card temps themselves are good for air (74c core / 84c VRM), but the heat that the cooler is dissipating inside the case is insane...
> It even makes my CPU unstable (it's watercooled, but the rad is internal and sucking up all that hot air).
> Of course, this is only if the door is shut on the front. As soon as I open the door, the ambient temp goes down 10 degrees and the issues go away.
> 
> The long-term plan is to get this card under water, but for the short term I am thinking of making the following changes. This should also reduce our heating needs in the living room, I think.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thoughts??


Yeah, get an H80 and put it under water. Then add a Gelid Icy revision kit to the VRMs. I ran a bench today at 1170MHz and the VRM hit 41C max, while the core hit 47C max. Cheap, easy, and replaceable. Quiet too.

That was at +150mV BTW, so not bad at all.


----------



## Chopper1591

Quote:


> Originally Posted by *Mugamat*
> 
> Here are some new OC results. I'm still a newbie at this, but I guess my R9 290 does well. For some reason the Fire Strike test didn't recognize my clocks and showed stock speeds in the result, along with a driver problem. I will try to run it again as soon as possible. For now I can share my Furmark and Valley results.
> About the card: it's a VisionTek R9 290 with a G10 bracket and a Corsair H55, also upgraded with Gelid ICY heatsinks on the VRMs and memory.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> I think I can get better results after upgrading. I'm going to upgrade my CPU to an i7 and install a custom liquid cooling loop. Sorry for bad English)


Can you post results showing the max values in GPU-Z?
The shots you provided are not really useful.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> My 290 has turned my rig into a friggin space heater when it's overclocked...
> 
> The card temps themselves are good for air (74c core / 84c VRM), but the heat that the cooler is dissipating inside the case is insane....
> It even makes my CPU unstable (it's watercooled, but the rad is internal, and sucking up all that hot air).
> Of course, this is only if the door is shut on the front. As soon as I open the door the ambient temp goes down 10 degrees and resolves the issues.
> 
> Long term plan is to get this card under water, but for the short term, I am thinking of making the following changes. This will also reduce our heating needs in the living room I think
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thoughts??


I know the feeling.
Lucky me, my rad is external.

But my advice: change the CPU rad to intake. Done.
Try to keep positive pressure in the case.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Chopper1591*
> 
> Can you post results with showing max values in gpu-z.
> The shots you provided are not really usefull.
> I know the feeling.
> Lucky me, my rad is external.
> 
> But what I advice, change cpu rad to intake. Done.
> Try to keep possitive pressure in the case.


Well, it certainly has a ton of positive pressure now, which I had attributed to the high ambient temp?
I don't put a ton of stock into a case having +/- pressure, when most cases provide enough nooks, cracks, and crannies to keep the air balanced regardless. But in my case (Edit: seriously had no pun intended here, wow), the entire case has been lined with padding from FrozenCPU and seals up very tightly


----------



## Chopper1591

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, it certainly has a ton of positive pressure now, which I had attributed to the high ambient temp?
> I don't put a ton of stock into a case having +/- pressure, when most cases provide enough nooks, cracks, and crannies, to keep the air balanced regardless, but it my case, the entire case has been lined with padding from Frozen CPU, and and seals up very tightly


Oh, that sucks.

Padded, very silent cases don't go well with very hot hardware.
With that case you would've been better off with a reference card, dumping most of the heat out of the case through the PCIe cover.
But then again... it would've been very noisy.









You probably need higher cfm fans as intake and exhaust to deal with the extra heat from the gpu.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Agent Smith1984*
> 
> My 290 has turned my rig into a friggin space heater when it's overclocked...
> 
> The card temps themselves are good for air (74c core / 84c VRM), but the heat that the cooler is dissipating inside the case is insane....
> It even makes my CPU unstable (it's watercooled, but the rad is internal, and sucking up all that hot air).
> Of course, this is only if the door is shut on the front. As soon as I open the door the ambient temp goes down 10 degrees and resolves the issues.
> 
> Long term plan is to get this card under water, but for the short term, I am thinking of making the following changes. This will also reduce our heating needs in the living room I think
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thoughts??


I've got the same issue, except I have two cards. It really doesn't help that two of my four intake fans are behind a rad, so I'm drawing hot air up through my case and then through my top rad. My temps aren't too bad (with my top vents closed I get around 60c on the cards with +100mv after an hour or so; with the vents open, ~55c), but the inside of my case gets pretty hot. Just sticking my hand in through one of the front 5.25" bays, it's noticeably warmer. My SSDs and HDD read an ambient of ~100F (~38C) inside the case, lol. My CPU is still stable and not too hot (~65c max), but my temps would be lower if my bottom rad wasn't bringing hot air into my case.
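For reference, the drive sensors above report in Fahrenheit while the rest of the thread talks Celsius; the conversion is just the standard formula (this snippet is an illustration, not something from the thread):

```python
# Sanity check of the case-ambient reading above: ~100F is roughly 38C.

def f_to_c(deg_f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (deg_f - 32) * 5 / 9

print(round(f_to_c(100), 1))  # -> 37.8
```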


----------



## wermad

Second card came in. Waiting on a ton of parts so no water atm. Temps are pretty good, ~55c during a quick run of 3d11 & 3d13.


----------



## Chopper1591

Quote:


> Originally Posted by *wermad*
> 
> Second card came in. Waiting on a ton of parts so no water atm. Temps are pretty good, ~55c during a quick run of 3d11 & 3d13.


Nice.

We want results.


----------



## Nwanko

Anyone got an H55+G10 with a 290X? How much worse is it than the X40+G10?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Nwanko*
> 
> Anyone got H55+G10 with 290x? How much worse is it than X40+G10.


Would also love to know myself, since that is the cooling option I am considering for my 290...

I did notice that, all of a sudden, the G10 has gone up to $60 on Newegg???


----------



## Spectre-

Quote:


> Originally Posted by *Nwanko*
> 
> Anyone got H55+G10 with 290x? How much worse is it than X40+G10.


The H55 I used to have did 35 idle and 70 max while benching, and the VRMs stayed around 60-75ish.

I have the Intel Asetek unit now, which is a thick 120mm rad, and my temps are 35 idle, 62 max while benching; VRMs are the same.

Ambient temps at most times are around 24-26 degrees.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Spectre-*
> 
> The H55 I used to have did 35 idle and 70 max while benching, and the VRMs stayed around 60-75ish.
> 
> I have the Intel Asetek unit now, which is a thick 120mm rad, and my temps are 35 idle, 62 max while benching; VRMs are the same.
> 
> Ambient temps at most times are around 24-26 degrees.


That's not even worth it for me then...

I am idling at 36c and hitting 72c max on air with +100mv.
*VRM hitting around 84c +/-

I was hoping for sub-50c temps, just based on the G10 review... with a Kraken X40 they were around 46c load (stock), and the power usage of the card went down 30 watts.
AMD attributed the lower TDP to less cap leakage with such drastic board temp drops... 30 watts is a huge savings in power when you consider that's stock clock/voltage usage with no other changes besides cooling!


----------



## Chopper1591

Quote:


> Originally Posted by *Nwanko*
> 
> Anyone got H55+G10 with 290x? How much worse is it than X40+G10.


Quote:


> Originally Posted by *Agent Smith1984*
> 
> Would also love to know myself, since that is the cooling option I am considering for my 290.....
> 
> I did notice that all of the sudden, the G10 has gone up to $60 on newegg???


I guess you could get an idea by comparing both units...

Here ya go:
clicky

I still really advise people to get either a thick 120mm rad, or a dual, to do the Kraken mod.


----------



## Roboyto

Quote:


> Originally Posted by *Nwanko*
> 
> Anyone got H55+G10 with 290x? How much worse is it than X40+G10.


Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's not even worth it for me then.....
> 
> I am idling at 36c and hitting 72c max on air with +100mv
> VRM hitting around 84c +/-
> 
> I was hoping for sub-50c temps, just based on the G10 review... with a Kraken X40 they were around 46c load (stock), and the power usage of the card went down 30 watts.
> AMD attributed the lower TDP to less cap leakage with such drastic board temp drops... 30 watts is a huge savings in power when you consider that's stock clock/voltage usage with no other changes besides cooling!


You could probably achieve sub-50C temps at stock settings, but not overclocked and overvolted; get yourself a refurb 240mm kit and you'd have no problem.

The biggest gain is definitely going to be noise level. I have mine setup with a SilenX Effizio 120mm on an Antec 620 and it is literally inaudible. The temperatures could definitely be better with a more powerful fan, but I value silence over performance in this instance. The fan has thermistor control, but I have the temp probe taped to the bottom of the fan shroud so it isn't influencing fan speed at all; Something I am going to have to experiment with soon.

http://www.overclock.net/t/1478544/the-build-formerly-known-as-the-devil-inside-a-gaming-htpc-for-my-wife-4770k-corsair-250d-powercolor-r9-290/20_20#post_22214255

You can get some very nice VRM temps if you swap out the kraken fan and upgrade thermal pads. With stock clocks I am seeing just under 50C VRM1 temps after 1 hour bench loop.

I am going to be upgrading to CLU with my Kraken setup tonight or tomorrow... once that happens, sub-50C may be plausible with a mild overclock. I will have results posted in the thread linked above, and I will probably make a post here as well.


----------



## Gobigorgohome

Is the R9 290X a good photography editing card? Thinking of throwing one of my cards into my Z77 for photography editing and gaming @ 1080P, what do you guys say?


----------



## VSG

Depends on what programs you are using. Adobe's OpenGL and CUDA acceleration are close, but I would still give the edge to CUDA with Photoshop CS myself.


----------



## Gobigorgohome

Quote:


> Originally Posted by *geggeg*
> 
> Depends on what programs you are using. Adobe OpenGL and CUDA are close, but I would give the edge to CUDA still with Photoshop CS myself.


Will the GTX 660 Ti be better than the R9 290X then? Or something like the GTX 970?


----------



## VSG

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Will the GTX 660 Ti be better than the R9 290X then? Or something like the GTX 970?


I am not an expert in this matter, but the number of cores/stream processors plays an important role here, as well as the performance per unit, just like with gaming, except scaling can be a lot better and exceed driver limitations. I would dare say the 290X would be better than both those Nvidia cards, but don't take my word for it.


----------



## tsm106

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *geggeg*
> 
> Depends on what programs you are using. Adobe OpenGL and CUDA are close, but I would give the edge to CUDA still with Photoshop CS myself.
> 
> 
> 
> Will the GTX 660 Ti be better than the R9 290X then? Or something like the GTX 970?

No. Plus the industry is moving towards OpenCL in general as more and more apps move towards the middle.

http://www.cgchannel.com/2014/06/adobe-rolls-out-photoshop-cc-2014/


----------



## VSG

About time too!


----------



## Gobigorgohome

Quote:


> Originally Posted by *geggeg*
> 
> I am not an expert in this matter but the number of core/stream processors plays an important role here as well as the performance per unit- just like with gaming except scaling can be a lot better and exceed driver limitations. I would dare say the 290x would be better than both those Nvidia cards but don't take my word for it.


I understand. Well, then I guess the GTX 660 Ti will stay in that one for now, until further change... still doing quad CrossFire then.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Would also love to know myself, since that is the cooling option I am considering for my 290.....
> 
> I did notice that all of the sudden, the G10 has gone up to $60 on newegg???


You guys don't need the G10 bracket. It's pointless... unless it's just aesthetics you are after. You can strap the block on with zip ties, put some cheap sinks on the RAM and the Gelid kit on the VRMs. I mounted a 120mm fan to the bottom of the card too, just to push air across VRM1. My VRMs don't exceed 41C even at 150mv of extra juice. The core hasn't gone over 52C even while clocked at 1200 and running +150mv. Very cool. This is with an H80 AIO strapped to it. Just FYI.


----------



## tsm106

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> Would also love to know myself, since that is the cooling option I am considering for my 290.....
> 
> I did notice that all of the sudden, the G10 has gone up to $60 on newegg???
> 
> 
> 
> You guys don't need the G10 bracket. It's pointless... unless it's just aesthetics you are after. You can strap the block on with zip ties, put some cheap sinks on the RAM and the Gelid kit on the VRMs. I mounted a 120mm fan to the bottom of the card too, just to push air across VRM1. *My VRMs don't exceed 41C even at 150mv of extra juice. The core hasn't gone over 52C even while clocked at 1200 and running +150mv.* Very cool. This is with an H80 AIO strapped to it. Just FYI.

I call shenanigans. Having put together a CLC recently for someone using a Gelid kit, the above is laughable. To get VRM temps that low you'd need a full-cover block and overpriced Fuji Extremes, but even that won't drop down to 41c, closer to mid-40s. lol, 41c VRM is even lower than the 20-phase Lightning with a full cover. The core temps, not even worth it...


----------



## battleaxe

Quote:


> Originally Posted by *tsm106*
> 
> I call shenanigans. Having put together a clc recently for someone using a gelid kit, the above is laughable. To get vrm temps that low you'd need a fullcover and overpriced fuji extremes, but even that won't drop down to 41c, closer to mid 40s. lol, 41c vrm is even lower than the 20 phase lightning with a fullcover. The core temps, not even worth it...


LOL... well, you're right. It was VRM2 that was 41c. VRM1 was 46c. No need to be a jerk about it, bro.


----------



## Chopper1591

Quote:


> Originally Posted by *battleaxe*
> 
> LOL... well, you're right. It was VRM2 that was 41c. VRM1 was 46c. No need to be a jerk about it, bro.


12c ambient?


----------



## battleaxe

Quote:


> Originally Posted by *Chopper1591*
> 
> 12c ambient?


19-20C

I like my house cold. I guess that does matter a bit...

I don't feel the need to keep my house at 80°F in the winter. Okay, so that matters quite a bit... forgot about that. Still, even if you add 5-6C the temps are quite fine, and again, that bracket is pointless other than for aesthetics.


----------



## silencespr

Quote:


> Originally Posted by *battleaxe*
> 
> 19-20C
> 
> I like my house cold. I guess that does matter a bit...
> 
> I don't feel the need to keep my house at 80 deg F in the winter. Okay, so that matters quite a bit... Forgot about that. Still even if you add 5-6C the temps are quite fine, and again that bracket is pointless other than for aesthetics.


I used to have to sit in my basement in a jacket and gloves because of how cold it was in the winter, but the system loved it.


----------



## battleaxe

Quote:


> Originally Posted by *silencespr*
> 
> I used to have to sit in my basement in a jacket and gloves because of how cold it was in the winter, but the system loved it.


Yeah, it definitely helps. I just bought a new G1 970 and it has never hit 53C yet in the same room... and it's on air.

The temps on my 290 that is running with an H80 are close to the same as my temps on the G1 970...

But yeah, this room hits 20c at the most during the winter, usually around 19C in here. So I guess if you add 5C to the 41C and 46C, you end up with 46C on VRM2 and 51C on VRM1. The core on the 290 is usually around 44-46C while gaming; it can hit the low 50s running Fire Strike if I'm pumping a lot of volts to it. So if you add 5C to those numbers, then I would be right where tsm said I would.

But I still think these are good numbers, and it definitely helped the 290 hit higher clocks. I couldn't get this thing to 1200MHz until it was under water and the VRMs were also very cool. It just wouldn't do it. But last night I was playing BF4 at 1200MHz for about 30 minutes, so it seems the colder these get, the happier they are. There could be other factors at work here too. IDK... I don't pay that much attention to the details to get too worried about them.
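The ambient-adjustment arithmetic in the post above (shifting readings taken at ~19-20C up to a warmer room) can be sketched as a small helper. Both the function name and the underlying assumption, that the delta over ambient stays roughly constant, are illustrative, not something stated in the thread:

```python
# Sketch of the ambient-correction arithmetic discussed above.
# Assumption (mine, not the thread's): a component's delta over ambient
# stays roughly constant, so a reading can be shifted between room
# temperatures by simple addition. This is only a first-order
# approximation for air/water cooling.

def shift_to_ambient(temp_c, measured_ambient_c, target_ambient_c):
    """Estimate what a reading taken at one ambient would be at another."""
    return temp_c - measured_ambient_c + target_ambient_c

# The example from the post: 41C VRM2 / 46C VRM1 at ~20C ambient,
# shifted to a 25C room:
print(shift_to_ambient(41, 20, 25))  # -> 46
print(shift_to_ambient(46, 20, 25))  # -> 51
```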


----------



## Sonikku13

After selling my 290X for a phone, I realized gaming on an iGPU stinks. So I bought a new 290 for $230 on Cyber Monday. Sure, it's a downgrade from a 290X, but it was cheap.


----------



## Agent Smith1984

I used to duct-tape flex pipe to a window AC unit and then tape the other end to my 120mm front intake... but that was 9 years ago...

A 7900GT with a circuit-pen volt mod and a Zalman Fatal1ty cooler seemed to FRIGGIN LOVE IT, hahahah.

Water cooling with AIO kits wasn't even available... it was custom loops and blocks, or bulky-ass air coolers...

My ambient temp in my case was around 36F when doing this... I kept a bag of instant rice in the bottom of the case for moisture, hahaha.

I guess my point is, I wouldn't do anything that ghetto-fabbed at 30 with a 290, but I am not above zip-tying the block on with some good TIM... my GPU sits right side up in my BTX case anyways.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I used to duct tape flex pipe to a window unit and then tape the other end to my 120mm front intake.... but that was 9 years ago....
> 
> 7900GT with circuit pen volt mod and zalman fatality cooler seemed to FRIGGIN LOVE IT, hahahah
> 
> Water cooling with AIO kits wasn't even available... it was custom loop and blocks or bulky ass air coolers.....
> 
> My ambient temp in my case was around 36F when doing this.... I kept a bag of instant rice in the bottom of the case for moisture, hahaha
> 
> I guess my point is, I wouldn't do anything that ghetto-fabbed at 30 with a 290, but I am not above zip-tying the block on with some good TIM..... my GPU sits right side up in my BTX case anyways.


LOL... I think I'll just keep my room at 20c instead. That sounds awesome though.


----------



## BradleyW

Has anyone tried forcing CF in FC4?


----------



## silencespr

Quote:


> Originally Posted by *Agent Smith1984*
> 
> *I used to duct tape flex pipe to a window unit and then tape the other end to my 120mm front intake*.... but that was 9 years ago....
> 
> 7900GT with circuit pen volt mod and zalman fatality cooler seemed to FRIGGIN LOVE IT, hahahah
> 
> Water cooling with AIO kits wasn't even available... it was custom loop and blocks or bulky ass air coolers.....
> 
> My ambient temp in my case was around 36F when doing this.... I kept a bag of instant rice in the bottom of the case for moisture, hahaha
> 
> I guess my point is, I wouldn't do anything that ghetto-fabbed at 30 with a 290, but I am not above zip-tying the block on with some good TIM..... my GPU sits right side up in my BTX case anyways.


Always wanted to do that but would ruin the look of my office.


----------



## battleaxe

Quote:


> Originally Posted by *BradleyW*
> 
> Has anyone tried forcing CF in FC4?


Judging by how well FC4 works overall, I would venture that it won't work.


----------



## BradleyW

Quote:


> Originally Posted by *battleaxe*
> 
> Guessing by how well FC4 works overall I would venture that it won't work.


It has been disabled in the FC4 profile whilst AMD investigates performance problems. AFR will work, but yeah, maybe it won't work well...


----------



## Agent Smith1984

From what I'm hearing, that title is a FAR CRY from performing well on AMD, especially with CF......

Lots of pun intended.


----------



## battleaxe

Quote:


> Originally Posted by *BradleyW*
> 
> It has been disabled in the FC4 profile whilst AMD investigate performance problems. AFR will work, but yeah, maybe it won't work well...


FC4 is so bad right now... Ubisoft should be forced to pull that title. Seriously, it's a mess and Ubisoft is just blaming the consumer. Hard to believe what these developers are getting away with now.

It took Origin a year to get BF4 working correctly. BF3 is a decent game now; I play BF3 more than I play BF4 just because it works better, it's smoother, etc... IDK why we keep letting them do this to us?
Quote:


> Originally Posted by *Agent Smith1984*
> 
> From what I'm hearing, that title is a FAR CRY from performing well on AMD, especially with CF......
> 
> Lots of pun intended.


That game is a FAR CRY from working well on any platform. It's not just AMD... don't let anyone fool you.


----------



## LandonAaron

Quote:


> Originally Posted by *Agent Smith1984*
> 
> From what I'm hearing, that title is a FAR CRY from performing well on AMD, especially with CF......
> 
> Lots of pun intended.


Quote:


> Originally Posted by *battleaxe*
> 
> FC4 is so bad right now.... Ubisoft should be forced to pull that title. Seriously, its a mess and Ubisoft is just blaming the consumer. Hard to believe what these developers are getting away with now.
> 
> It took Origin a year to get BF4 working correctly. BF3 is a decent game now. I play BF3 more than I play BF4 just because it works better, its smoother, etc... IDK why we keep letting them do this to us?
> That game is a FAR CRY from working well on any platform. Its not just AMD... don't let anyone fool you.


Far Cry 4 is a stutter factory. Check out the thread: http://www.overclock.net/t/1520430/official-far-cry-4-information-discussion-thread. You will find like 2 people who claim it runs flawlessly on their GTX 770s or 5870s, and plenty of people with R9 290Xs and GTX 980s who can tell you the game is an absolute mess. I'm not saying that the people who claim it runs flawlessly are liars; I'm just saying I don't believe them.


----------



## Agent Smith1984

Quote:


> Originally Posted by *LandonAaron*
> 
> Check out the Far Cry 4 thread.
> Far Cry 4 is a stutter factory. Check out the thread: http://www.overclock.net/t/1520430/official-far-cry-4-information-discussion-thread. You will find like 2 people who claim it runs flawlessly on their GTX 770s or 5870s, and plenty of people with R9 290Xs and GTX 980s who can tell you the game is an absolute mess. I'm not saying that the people who claim it runs flawlessly are liars; I'm just saying I don't believe them.


I've read all the same stuff..... I really wanted this title too... it's a shame.

I will say, though, that the people with less powerful cards may be getting decent gameplay because they are inherently having to run lower settings. It may be that the higher the settings used, the more prevalent the issues become, and that's why they are more obvious to high-end hardware users.

Even with my thuban, and a single 290, I expect to run max settings on this title at 1080p.
I haven't bothered getting it yet, and probably won't for a while....
It'll be optimized and running well about the time it's dropped down to $30, lol


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I've read all the same stuff..... I really wanted this title too... it's a shame.
> 
> I will say though, that the people with less powerful cards, may be getting decent gameplay, because they are inherently having to run lower settings. It may be that the higher the settings used, the more prevalent the issues become, and that's why they are more obvious to high end hardware users.
> 
> Even with my thuban, and a single 290, I expect to run max settings on this title at 1080p.
> I haven't bothered getting it yet, and probably won't for a while....
> It'll be optimized and running well about the time it's dropped down to $30, lol


I got the game for free with my new G1 970... I may as well have gotten a paperweight. No, wait, a paperweight would have been a lot more useful. "Stutter factory" is right, as said earlier. Junk would be another way of putting it.


----------



## bond32

Quote:


> Originally Posted by *battleaxe*
> 
> Yeah, it definitely helps. I just bought a new G1 970 and it has never hit 53C yet in the same room... And its on air.
> 
> The temps on my 290 that is running with an H80 are close to the same as my temps on the G1 970...
> 
> But yeah, this room hits 20c at the most during the winter. Usually around 19C in here. So I guess if you add 5C to the 41C and 46C you end up with 46C VRM2 and 51C on VRM1. The core on the 290 is usually around 44-46C while gaming. It can hit the low 50's running Firestrike if I'm pumping a lot of volts to it. So if you add 5C to those numbers, then I would be right where TSM said I would. But, I still think these are good numbers and it definitely helped the 290 hit higher clocks. I couldn't get this thing to 1200mhz until it was under water and the VRM's were also very cool. It just wouldn't do it. But last night I was playing BF4 at 1200mhz for about 30 minutes, so it seems the colder these get the happier they are. But there could be other factors at work here too. IDK... I don't pay that much attention to the details to get too worried about them.


Your H80 and aluminum heat sinks don't cool to those numbers.

At the exact same ambient, with +150mV, my three VRM1s will run between 41 and 46C. So you mean to tell me that a closed-loop cooler and aluminum air-cooled heat sinks cool the same as 960mm of 80mm-thick rad space in push/pull with AP-15s and 2 MCP50X pumps?

Yeah I call bs.


----------



## battleaxe

Quote:


> Originally Posted by *bond32*
> 
> Your H80 and aluminum heat sinks don't cool to those numbers.
> 
> At the exact same ambient, with +150mV my three VRM1's will run between 41 and 46 C. So you mean to tell me that a closed loop cooler and aluminum air cooled heat sinks cool the same as 960mm of 80mm thick rad space in push pull with AP-15's and 2 MCP50x pumps?
> 
> Yeah I call bs.


Lol... Whatever... Go bark up someone else's tree. I have nothing to prove to you or anyone else. Get a job or a life. Get both.


----------



## bond32

Quote:


> Originally Posted by *battleaxe*
> 
> Lol... Whatever... Go bark up someone else's tree. I have nothing to prove to you or anyone else. Get a job or a life. Get both.


Uh, what? Not sure if you're actually trying to respond to my post, or just upset that you spilled your Cheerios on your belly this morning...

You do realize that other people read this, right? So if someone were thinking, "Man, I need to get one of those H80s so I can get those VRM temps!", then when they get said setup only to find they now have temperatures 20 degrees higher than yours, where's the headache going to be then?

No one is trying to argue with you. We are actually calling shenanigans. And you don't have to get butthurt about it.


----------



## battleaxe

Quote:


> Originally Posted by *bond32*
> 
> Uh, what? Not sure if you're actually trying to respond to my post, or just upset that you spilled your cheeros on your belly this morning...
> 
> You do realize that other people read this right? So if someone were thinking, "Man, I need to get one of those H80's so I can get those VRM temps!" Then, when they get said setup only to find they now have temperatures 20 degrees higher than yours, where's the headache going to be at then?
> 
> No one is trying to argue with you. We are actually calling bs. And you don't have to get butthurt about it.


Yeah. My reference 290 would have run hotter than this because its stock volts were higher for some reason. This one runs 5-8c cooler than the other 290. You do realize there can be differences between GPUs, right?

I have no interest in arguing with you about this. I know what my temps are. I'm not blind. Believe it or don't. Makes no difference to me.


----------



## Spectre-

Quote:


> Originally Posted by *battleaxe*
> 
> You guys don't need the G10 bracket. its pointless... unless its just aesthetics you are after. You can strap the block on with zip-ties, put some cheap sinks on the RAM and the Gelid kit on the VRM's. I put a 120mm fan mounted to the bottom of the card too just to push air accross the VRM 1. My VRM's don't exceed 41C even at 150mv of extra juice. Core hasn't gone over 52C even while clocked at 1200 and running +150mv. Very cool. This is with an h80 AIO strapped to it. Just FYI


lol, I used the AMD bracket to hold my 290 with the H100i... too bad the H100i died


----------



## wermad

Quote:


> Originally Posted by *Chopper1591*
> 
> Nice.
> 
> We want results.


The G3258 is really holding these back, so I need to search for a new CPU.


----------



## battleaxe

Quote:


> Originally Posted by *Spectre-*
> 
> lol, I used the AMD bracket to hold my 290 with the H100i... too bad the H100i died


Yeah, I have had one die on me too, an AIO that is. So this does have its drawbacks...

BTW: @bond32 you quoted me, but I had already admitted that those results weren't right. 41c was only VRM2... and my ambient is 19 to 20c. So 41c VRM2, 46c VRM1, and no more than 52c on the core at 19-20c ambient. Also, this one is cooler than my other 290 by another 5c-plus. Just to clear things up.

But there are differences between how hot GPUs get. I also found some various issues with mounting. So if you aren't doing better than this on your loop, it could be you just have some hot cores, I have a cool one, or your mount is not ideal. Any one of those things could cause a difference of 10c on its own.


----------



## Tokkan

Finally strapped an AIO onto my R9 290 reference, just finished it. Using the stock heatsink plate for the VRM and VRAM, working perfectly. Have the stock fan in place too, and letting it run on auto because it'll create airflow for the VRMs etc.
My computer has changed a lot now; it isn't clean-looking though...


----------



## battleaxe

Quote:


> Originally Posted by *Tokkan*
> 
> Finally strapped an AIO onto my R9 290 reference, just finished it. Using the stock heatsink plate for the VRM and VRAM, working perfectly. Have the stock fan in place too, and letting it run on auto because it'll create airflow for the VRMs etc.
> My computer has changed a lot now; it isn't clean-looking though...


I think you would get better airflow with a conventional 100mm or 120mm fan pushing air over the VRM1 area. I'm using a 120 mounted right to the back section of the GPU and it works really well. Just an idea. I found a low VRM1 temp helps dramatically for overclocking. Now that my VRM1 is under 50c, I was able to get about 30MHz more on the core. I think as they get colder the VRMs are more efficient, so higher OCs can become a bit more stable. Seems I read that somewhere back at the beginning of the OP. I could be wrong though.


----------



## Roboyto

Quote:


> Originally Posted by *battleaxe*
> 
> You guys don't need the G10 bracket. It's pointless... unless it's just aesthetics you are after. You can strap the block on with zip ties, put some cheap sinks on the RAM and the Gelid kit on the VRMs. I mounted a 120mm fan to the bottom of the card too, just to push air across VRM1. My VRMs don't exceed 41C even at 150mv of extra juice. The core hasn't gone over 52C even while clocked at 1200 and running +150mv. Very cool. This is with an H80 AIO strapped to it. Just FYI.


> 19-20C
> 
> I like my house cold. I guess that does matter a bit...
> 
> I don't feel the need to keep my house at 80 deg F in the winter. Okay, so that matters quite a bit... Forgot about that. Still even if you add 5-6C the temps are quite fine, and again that bracket is pointless other than for aesthetics.


I guess the G10 is pointless until you want to service the card by... cutting and reattaching zip ties... no thanks. Zip ties are for cable management.

I'll take the aesthetic benefit and the luxury of screws holding things together. Got some pics of how you attached the 120mm fan to the bottom of the card, perchance?

I will join the group calling BS on your temperature claims with those clocks/volts, as I was one of the first adopters of the Gelid kit... I ordered it back in May directly from China. With the Gelid VRM kit, Fujipoly Ultra Extreme thermal pads, an upgraded fan for the G10, and *STOCK settings* on my 290, I don't have that low of a VRM or core temperature under load. My house sits at 68F-70F in the winter, so my ambients aren't that far off from yours.

Your GPU core claims *might* be possible with a 240 or 280mm AIO with push/pull but not with an H80.


----------



## battleaxe

Quote:


> Originally Posted by *Roboyto*
> 
> I will join into the group calling BS of your temperature claims with those clocks/volts as I was one of the first adopters of the Gelid kit..I ordered it back in May directly from China. With the Gelid VRM kit, Fujipoly Ultra Extreme thermal pads, an upgraded fan for the G10 and _*STOCK settings*_ on my 290 I don't have that low of a VRM or core temperature under load. My house sits at 68F-70F in the winter time so my ambients aren't that far off from yours.
> 
> Your GPU core claims _*might*_ be possible with a 240 or 280mm AIO with push/pull but not with an H80.


Whatever... I already explained this.

Read my posts, because clearly you missed a post or two, as did bond32. Or don't; either way, I have nothing to prove.

This is what I wrote in response after tsm noticed my temps were a bit lower than they should have been.
Quote:


> But yeah, this room hits 20c at the most during the winter. Usually around 19C in here. So I guess if you add 5C to the 41C and 46C you end up with 46C VRM2 and 51C on VRM1. The core on the 290 is usually around 44-46C while gaming. It can hit the low 50's running Firestrike if I'm pumping a lot of volts to it. So if you add 5C to those numbers, then I would be right where TSM said I would. But, I still think these are good numbers and it definitely helped the 290 hit higher clocks. I couldn't get this thing to 1200mhz until it was under water and the VRM's were also very cool. It just wouldn't do it. But last night I was playing BF4 at 1200mhz for about 30 minutes, so it seems the colder these get the happier they are. But there could be other factors at work here too. IDK... I don't pay that much attention to the details to get too worried about them.
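
The ambient correction described above is a one-liner; this is just a sketch of the arithmetic, assuming load temperature tracks ambient roughly 1:1 (only approximately true in practice):

```python
# Sketch: shift load temps measured at one ambient to a common reference
# ambient so figures from a 19-20C room and a 25C room can be compared.
# Assumes the component's delta over ambient stays constant, which is only
# an approximation.

def normalize_temp(reported_c: float, ambient_c: float,
                   reference_ambient_c: float = 25.0) -> float:
    """Return what the reading would roughly be at the reference ambient."""
    return reported_c + (reference_ambient_c - ambient_c)

# The figures discussed above: 41C VRM2 and 46C VRM1 measured at ~20C ambient
print(normalize_temp(41, 20))  # 46.0
print(normalize_temp(46, 20))  # 51.0
```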


----------



## Roboyto

Quote:


> Originally Posted by *battleaxe*
> 
> Whatever... I already explained this.
> 
> Read my posts cause clearly you missed a post or two as did bond32, or don't, either way - I have nothing to prove.


Your 'explanation' due to a cold room and not paying attention to details is a load of Wookiee dookiee. This is the wrong place, and a particularly poor thread to choose, to try and throw around ridiculous figures without evidence to back them up; people around here pay very close attention to the details.

I didn't miss any posts, but I did miss any kind of photo, screenshot, GPU-Z log, or anything else for that matter, to support your claims. I have had my hands on (5) 290's and out of the five, three of them were modded to help reduce temperatures with AIO coolers, and the 4th is under a full block. A 290 at 1200 core, with +150mV, in a ~20C ambient environment, will not run the Core @46C, VRM 1 @51C, and VRM2 @ 46C being cooled by a 'home brew' Corsair H80 with a Gelid VRM kit setup; that should sum up all the explaining from your last dozen, or so, posts in this thread.


----------



## Spectre-

Quote:


> Originally Posted by *Roboyto*
> 
> I guess the G10 is pointless until you want to service the card by..cutting and reattaching zip ties..no thanks. Zip ties are for cable management.
> 
> I'll take the aesthetic benefit and the luxury of screws holding things together. Got some pics of how you attached the 120mm fan to the bottom of the card, perchance?
> 
> I will join into the group calling BS of your temperature claims with those clocks/volts as I was one of the first adopters of the Gelid kit..I ordered it back in May directly from China. With the Gelid VRM kit, Fujipoly Ultra Extreme thermal pads, an upgraded fan for the G10 and _*STOCK settings*_ on my 290 I don't have that low of a VRM or core temperature under load. My house sits at 68F-70F in the winter time so my ambients aren't that far off from yours.
> 
> Your GPU core claims _*might*_ be possible with a 240 or 280mm AIO with push/pull but not with an H80.


umm not to be that guy but i can do 1.5 volts with my intel asetek cooler thingy

i mean all my hwbot submissions are done using the h100i (before it died) and this new intel cooler with the G10


----------



## Roboyto

Quote:
Originally Posted by *Spectre-* 

umm not to be that guy but i can do 1.5 volts with my intel asetek cooler thingy

i mean all my hwbot submissions are done using the h100i (before it died) and this new intel cooler with the G10

Yeah, you can do that... and I can run my 290 with the Kraken at +187mV and it will run; it's a crap card that black screens after +187mV.

*But* you're not claiming that the card is running at temperatures on par with, or better than, full water loops. The operating temperatures he claims aren't feasible with the circumstances he has stated.

*My AIO cooled 290 Setup*

PowerColor R9 290 OC Edition (originally equipped w/ Reference Cooler)

Running at its stock 'OC' settings of 975/1250

FFXIV Benchmark looped for 1 hour and 3 minutes

Card is cooled by:

Kraken G10

Antec Kuhler 620

Xigmatek PTI G4512 TIM

(1) SilenX Effizio 120mm fan

Gelid VRM Kit w/ Fuji Ultra Extreme Pads for VRM1 & JunPus Thermal Tape for VRM2

Zalman Shark Fin 92mm fan for VRM1 cooling

VRM2 is passively cooled

Max temperatures were:

Core - 62C

VRM1 - 49C

VRM2 - 55C



Spoiler: 1 hour bench loop















Tell me how a similar card, cooled with a similar setup, can run at lower temperatures when it is running at significantly higher clocks and voltages?

It simply doesn't add up.

Quote:
Originally Posted by *Spectre-* 

oh yeah no way

unless its like 10 degrees ambient ( i used to bench outside in winter)

i get 40 idle and 80-90ish while benching on 1.5 volts vcore and i am pretty sure my vrm's are over 100


> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> He obviously doesn't know what is really possible so while pulling numbers from out of his ass, he picked numbers that are way over the unpossible line. Fyi, if his numbers were real, the rest of the world is doing something wrong lol.
Click to expand...

Precisely.

Throwing around false information doesn't do anyone any good.


----------



## Spectre-

Quote:


> Originally Posted by *Roboyto*
> 
> Yeah, you can do that..and I can run my 290 with the Kraken with +187mV and it will run; crap card that black screens after +187mV.
> 
> _*But*_ you're not claiming that the card is running at temperatures on par with, or better than, full water loops. The operating temperatures he claims aren't feasible with the circumstances he has stated.


oh yeah no way

unless its like 10 degrees ambient ( i used to bench outside in winter)

i get 40 idle and 80-90ish while benching on 1.5 volts vcore and i am pretty sure my vrm's are over 100


----------



## tsm106

He obviously doesn't know what is really possible so while pulling numbers from out of his ass, he picked numbers that are way over the unpossible line. Fyi, if his numbers were real, the rest of the world is doing something wrong lol.


----------



## Chopper1591

Quote:


> Originally Posted by *wermad*
> 
> G3258 really holding these back, so I need to search for a new cpu.


You're kidding, right?
You're really running CF 290 Tri-X with a G3258?









Quote:


> Originally Posted by *battleaxe*
> 
> yeah I have had one die on me too, an AIO that is. So this does have its drawbacks...
> 
> BTW: @bond32 you quoted me but I had already admitted that those results weren't right. 41c was only vrm2... And my ambient is 19 to 20c. So 41c vrm2, 46c vrm1, and no more than 52c on core at 19-20c ambient. Also, this one is cooler than my other 290 by another 5c plus. Just to clear things up.
> 
> But there are differences between how hot GPUs get. I also found some various issues with mounting. So if you aren't doing better than this on your loop it could be you just have some hot cores, I have a cool one, or your mount is not ideal. Any one of those things could cause a difference of 10c on its own.


I don't believe it.
Even 46C on VRM1 is not possible with high volts and clocks. At least not with a DIY mod.
Quote:


> PowerColor R9 290 OC Edition (originally equipped w/ Reference Cooler)
> Running at it's stock 'OC' settings of 975/1250
> FFXIV Benchmark looped for 1 hour and 3 minutes
> Card is cooled by:
> Kraken G10
> Antec Kuhler 620
> Xigmatek PTI G4512 TIM
> (1) SilenX Effizio 120mm fan
> Gelid VRM Kit w/ Fuji Ultra Extreme Pads for VRM1 & JunPus Thermal Tape for VRM2
> Zalman Shark Fin 92mm fan for VRM1 cooling
> VRM2 is passively cooled
> 
> Max temperatures were:
> Core - 62C
> VRM1 - 49C
> VRM2 - 55C
> 
> 
> Spoiler: 1 hour bench loop
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Tell me how a similar card, cooled with a similar setup, can run at lower temperatures when it is running at significantly higher clocks and voltages?
> 
> It simply doesn't add up.
> 
> Precisely.
> 
> Throwing around false information doesn't do anyone any good.


This is valid data.
I do believe this.

VRM2 is on par with my stock 290 Tri-X VRM2 temp.

I can't wait to put my 290 in the loop.
That will shine some light on this subject. I will post proper before and after results for you guys.

Just bought an extra 140mm radiator to add to the single loop, as I think the UT60 360 will be on the edge of keeping a sub-10C delta-T.
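
For what it's worth, the sub-10C delta-T worry can be sanity-checked with a back-of-the-envelope estimate. The ~10W per degree C per 120mm of radiator figure below is a loose rule of thumb at moderate fan speeds, not a measured spec, and the 450W load is an assumed combined CPU + overvolted 290 figure:

```python
# Rough steady-state coolant delta-T: delta_T = heat_load / total_dissipation.
# Dissipation is modeled per 120mm radiator section; 10 W/degC per section is
# an assumed rule-of-thumb value, not a manufacturer rating.

def loop_delta_t(heat_load_w: float, rad_120mm_sections: int,
                 w_per_degc_per_section: float = 10.0) -> float:
    """Estimate coolant-over-ambient delta-T in degC."""
    return heat_load_w / (rad_120mm_sections * w_per_degc_per_section)

# UT60 360 alone (3 sections) vs 360 + 140 (counted as ~4 sections):
print(loop_delta_t(heat_load_w=450, rad_120mm_sections=3))  # 15.0
print(loop_delta_t(heat_load_w=450, rad_120mm_sections=4))  # 11.25
```

By that estimate the extra 140 helps a lot, though at a heavy load it may still land just above a 10C delta-T.
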
Quote:


> Originally Posted by *Spectre-*
> 
> oh yeah no way
> 
> unless its like 10 degrees ambient ( i used to bench outside in winter)
> 
> i get 40 idle and 80-90ish while benching on 1.5 volts vcore and i am pretty sure my vrm's are over 100


Quote:


> Originally Posted by *tsm106*
> 
> He obviously doesn't know what is really possible so while pulling numbers from out of his ass, he picked numbers that are way over the unpossible line. Fyi, if his numbers were real, the rest of the world is doing something wrong lol.


Weird, right?

Multiple people saying it ain't happening, and it still did.
What is his secret?


----------



## wermad

Quote:


> Originally Posted by *Chopper1591*
> 
> Your kidding, right?
> You really running cf 290 tri-x with a G3258?


It's temporary, as I mentioned before. GOG's have no issues with a G3258, so that's what I'm playing right now.


----------



## Chopper1591

Quote:


> Originally Posted by *wermad*
> 
> It's temporary, as I *mentioned before*. GOG's have no issues with a G3258, so that's what I'm playing right now.


I must have missed that.


----------



## battleaxe

Quote:


> Originally Posted by *Roboyto*
> 
> Your 'explanation' due to a cold room and not paying attention to details is a load of Wookiee dookiee. This is the wrong place, and a particularly poor thread to choose, to try and throw around ridiculous figures without evidence to back them up; people around here pay very close attention to the details.
> 
> I didn't miss any posts, but I did miss any kind of photo, screenshot, GPU-Z log, or anything else for that matter, to support your claims. I have had my hands on (5) 290's and out of the five, three of them were modded to help reduce temperatures with AIO coolers, and the 4th is under a full block. A 290 at 1200 core, with +150mV, in a ~20C ambient environment, will not run the Core @46C, VRM 1 @51C, and VRM2 @ 46C being cooled by a 'home brew' Corsair H80 with a Gelid VRM kit setup; that should sum up all the explaining from your last dozen, or so, posts in this thread.


Okay, thanks for telling me what my system does. I am forever grateful for your expertise on the matter of all things AMD 290s... I didn't realize you broke into my house and ran benches while I was sleeping.

I'm done with this conversation.

Go on thinking whatever you want. I won't produce any photos, because even if I did, you three would clearly not believe them and find a way to dismiss them anyway. So I'll just show myself out of this forum thread. I'll keep my results to myself on my last 290 now that I know how ridiculous they are and no one would believe them. Save myself some sore fingers. Thanks buddy. Appreciate it.









Edit: Okay, decided to post a screenie since I don't really enjoy being called a liar. Now I'm sure you will tell me what's wrong with this, right? What voodoo did I perform on my H80 and air-cooled VRMs to accomplish this, aside from my cold room? And are these about where I said they were? Or am I missing something? Maybe I used the wrong test or the wrong GPU-Z version? I'm sure you will all tell me what I did wrong and how my results are not valid. So here I am, waiting on said rebuttal, since you know it all.



http://www.3dmark.com/3dm11/9074303

And before you even mention that I used 3dMark11 I already said that the core goes up to 52c while running Firestrike. And no I cannot run my RAM any higher as I get black-screens above 1250 cause my RAM sucks. So yes, that would explain my low VRM2 temps.

Edit: still waiting guys... what's wrong with my results? Surely you will find some way to discount my readings...?
Quote:


> Originally Posted by *bond32*
> 
> Your H80 and aluminum heat sinks don't cool to those numbers.
> 
> At the exact same ambient, with +150mV my three VRM1's will run between 41 and 46 C. So you mean to tell me that a closed loop cooler and aluminum air cooled heat sinks cool the same as 960mm of 80mm thick rad space in push pull with AP-15's and 2 MCP50x pumps?
> 
> Yeah I call shenanigans.


Care to elaborate?
Quote:


> Originally Posted by *Roboyto*
> 
> Your 'explanation' due to a cold room and not paying attention to details is a load of Wookiee dookiee. This is the wrong place, and a particularly poor thread to choose, to try and throw around ridiculous figures without evidence to back them up; people around here pay very close attention to the details.
> 
> I didn't miss any posts, but I did miss any kind of photo, screenshot, GPU-Z log, or anything else for that matter, to support your claims. I have had my hands on (5) 290's and out of the five, three of them were modded to help reduce temperatures with AIO coolers, and the 4th is under a full block. A 290 at 1200 core, with +150mV, in a ~20C ambient environment, will not run the Core @46C, VRM 1 @51C, and VRM2 @ 46C being cooled by a 'home brew' Corsair H80 with a Gelid VRM kit setup; that should sum up all the explaining from your last dozen, or so, posts in this thread.


There ya go. Enjoy. Don't forget to tell me what I did wrong and call shenanigans again. I really appreciated that BTW.
Quote:


> Originally Posted by *Roboyto*
> 
> Yeah, you can do that..and I can run my 290 with the Kraken with +187mV and it will run; crap card that black screens after +187mV.
> 
> _*But*_ you're not claiming that the card is running at temperatures on par with, or better than, full water loops. The operating temperatures he claims aren't feasible with the circumstances he has stated.
> 
> *My AIO cooled 290 Setup*
> 
> PowerColor R9 290 OC Edition (originally equipped w/ Reference Cooler)
> Running at it's stock 'OC' settings of 975/1250
> FFXIV Benchmark looped for 1 hour and 3 minutes
> Card is cooled by:
> Kraken G10
> Antec Kuhler 620
> Xigmatek PTI G4512 TIM
> (1) SilenX Effizio 120mm fan
> Gelid VRM Kit w/ Fuji Ultra Extreme Pads for VRM1 & JunPus Thermal Tape for VRM2
> Zalman Shark Fin 92mm fan for VRM1 cooling
> VRM2 is passively cooled
> 
> Max temperatures were:
> Core - 62C
> VRM1 - 49C
> VRM2 - 55C
> 
> 
> Spoiler: 1 hour bench loop
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Tell me how a similar card, cooled with a similar setup, can run at lower temperatures when it is running at significantly higher clocks and voltages?
> 
> It simply doesn't add up.
> 
> Precisely.
> 
> Throwing around false information doesn't do anyone any good.


Being an uneducated cynic doesn't do anyone any good either. But don't let that stop you. Go on calling shenanigans on me.


----------



## Chopper1591

Quote:


> Originally Posted by *battleaxe*
> 
> Okay, thanks for telling me what my system does. I am forever grateful for your expertise on the matter of all things AMD 290s... I didn't realize you broke into my house and ran benches while I was sleeping.
> 
> I'm done with this conversation.
> 
> Go on thinking whatever you want. I won't produce any photos because even if I did you three would clearly not believe it and find a way to dismiss it anyway. So I'll just show myself off this forum thread. I'll keep my results to myself on my last 290 now that I know how ridiculous they are and no-one would believe them. Save myself some sore fingers. Thanks buddy. Appreciate it.












*Edit*:

Good score?


----------



## Chopper1591

Quote:


> Originally Posted by *battleaxe*
> 
> Wow... it's so quiet in here. Where are the three know-it-alls that were calling shenanigans on me?
> 
> Not everyone is trying to make their rigs out to be better than yours. Yes, my 290 runs cooler than most apparently. Yes my room is cold. Why should that be such a surprise to all of you? I remember seeing someone on here who had a couple 290's on a custom loop and if memory serves their 290's VRM's were not even getting into the 40's at max. So it is not impossible as I hope my screenshot above can easily show.
> 
> On BF3 my temps are similar to those above running 3dMark11.
> 
> On BF4 my temps are similar to running Firestrike which is about 52C on core. I forget what the VRM's are while running Firestrike, but I'm sure its higher than my results posted above. Just because your cards will or will not do something does not mean that is true of all hardware. It doesn't work that way, there are always exceptions to the general rule. My card runs cool, but my RAM also sucks and won't overclock at all, so what's that saying?
> 
> It means nothing... its just the silicon lottery. Get over it.
> 
> Edit: Oh wait you probably can't respond because you are still in class right? I'll check back later after school is out and you ride the bus home.


Haha.
You sure are very nice.

Must have loads of friends.


----------



## battleaxe

Quote:


> Originally Posted by *Chopper1591*
> 
> Haha.
> You sure are very nice.
> 
> Must have loads of friends.


Yeah, I don't take kindly to being called a liar.

Actually, I do have lots of friends (probably hard to believe, I know), but they don't act like those guys just did, and I wouldn't take this kind of crap from my friends either, and they know it.

LOL... yeah, I was kinda rude I guess. Oh well.









Edit: Under the circumstances, I think I had the right to defend myself. Maybe not?...

I'll apologize if everyone here agrees I was a total jerk. Or was my response correct and just?


----------



## cplifj

So what happened, AMD? I've been working with triple screens on this 290X since I got it a year ago.

Today, all of a sudden, when starting the PC I NO LONGER have triple screen; only one screen still works. I just shut it down yesterday and started it up today.

So while sitting powered off, two of the screen outputs (the DVIs) just died overnight.

Some real fabulous piece of tech, AMD. Bunch of crap. A just-over-a-year-old reference card, now killed via a programmed flaw? Nice period of the year to have this problem, too.

Programmed to do so via your multitude of crap drivers perhaps, AMD?? I believe so already. I don't trust your industry anymore.


----------



## rdr09

XBOX!


----------



## battleaxe

These recent events have got me thinking. If the H80/air-cooled method really is as bad as I have recently been told... I am wondering what this card might do on a proper loop. This card does 1200MHz core on only 1.266V, so if I flashed the BIOS and ran a lot more volts into it, I wonder what it could do on the core?



Only problem is the RAM sucks on it... won't go over 1300MHz. So I'm guessing that would keep my benchmarks from being anything special...

What kind of voltage are the guys running to get stable at 1300mhz? Those of you who have reached those core speeds?
Quote:


> Originally Posted by *rdr09*
> 
> XBOX!


Can't say I blame you for trying to change the subject. Truth be told, I've had more fun in the last three pages than in the previous 100.


----------



## Gobigorgohome

Quote:


> Originally Posted by *cplifj*
> 
> So what happened, AMD? I've been working with triple screens on this 290X since I got it a year ago.
> 
> Today, all of a sudden, when starting the PC I NO LONGER have triple screen; only one screen still works. I just shut it down yesterday and started it up today.
> 
> So while sitting powered off, two of the screen outputs (the DVIs) just died overnight.
> 
> Some real fabulous piece of tech, AMD. Bunch of crap. A just-over-a-year-old reference card, now killed via a programmed flaw? Nice period of the year to have this problem, too.
> 
> Programmed to do so via your multitude of crap drivers perhaps, AMD?? I believe so already. I don't trust your industry anymore.


Are you sure the ports are broken? I sometimes have a similar experience with my quadfire and the DP port on my R9 290X: suddenly a black screen; plug it in and out a couple of times and it is good for a while... I don't know if it is the DP, the card, or the monitor. The monitor seems to be on when it comes back up, so I don't think it is the screen. I'm using the DP cable that came with the U28D590 too, so there should be no problem; then again, the first DP cable lasted one week before it broke, so maybe it's just bad quality?


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> These recent events have got me thinking. If the H80/air-cooled method really is as bad as I have recently been told... I am wondering what this card might do on a proper loop. This card does 1200MHz core on only 1.266V, so if I flashed the BIOS and ran a lot more volts into it, I wonder what it could do on the core?
> 
> Only problem is the RAM sucks on it... won't go over 1300MHz. So I'm guessing that would keep my benchmarks from being anything special...
> 
> What kind of voltage are the guys running to get stable at 1300MHz? Those of you who have reached those core speeds?
> Can't say I blame you for trying to change the subject. Truth be told, I've had more fun in the last three pages than in the previous 100.
> 
> But my funniest post got deleted... took a bit to come up with that one too... What a bummer.


what's the subject anyways?


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> XBOX!


Quote:


> Originally Posted by *rdr09*
> 
> what's the subject anyways?


I was being accused of lying earlier. So instead of saying my card was hitting "X" max voltage I figured I had to post a screenie as proof so they wouldn't throw a conniption and call crap on me.

I'm just wondering if (as I recently found out) my 290 runs really cool, how well it would do with more volts and might be able to hit close to 1300mhz?

And I was curious what kind of volts others needed to hit 1300MHz on the core? That's all. Cause it seems this card runs cool and needs very little voltage to hit 1200 on the core, but I will quickly be limited by Trixx at +200mV.

Not sure what's going on with this thing... ? Maybe its just a fluke card or something? 1.266v was the max while running 3dMark11 at 1200mhz and it only hit 46c on core max (as shown in my earlier screenshot). And from what a few have said this is abnormal.

So what could it do on a proper loop? That's kinda what I'm wondering. I just cannot see the point of going to the expense of a full loop when this is what I'm getting already. But maybe it would improve even further. IDK


----------



## Tokkan

Quote:


> Originally Posted by *battleaxe*
> 
> I think you would get better airflow with a conventional 100mm or 120mm fan pushing air over the VRM1 area. I'm using a 120 mounted right to the back section of the GPU and it works really well. Just an idea. I found a low VRM1 temp helps dramatically for overclocking. Now that my VRM1 is under 50C I was able to get about 30MHz more on the core. I think as they get colder the VRMs are more efficient, so higher OCs can become a bit more stable. Seems I read that somewhere back at the beginning of the OP. I could be wrong though.


Actually, the AIO radiator is below the VRM zone and it is pushing air to the VRM's.
Idk how much lower it can get though, getting 55ºC on core at full load and 35-40 for VRM's. Have a side fan blowing right on the card also.


----------



## battleaxe

Quote:


> Originally Posted by *Tokkan*
> 
> Actually, the AIO radiator is below the VRM zone and it is pushing air to the VRM's.
> Idk how much lower it can get though, getting 55ºC on core at full load and 35-40 for VRM's. Have a side fan blowing right on the card also.


Yeah, that's cold. Seems to be working just fine... is it loud?

What are your voltages at those temps BTW? And your core and MEM speeds?


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> I was being accused of lying earlier. So instead of saying my card was hitting "X" max voltage I figured I had to post a screenie as proof so they wouldn't throw a conniption and call crap on me.
> 
> I'm just wondering if (as I recently found out) my 290 runs really cool, how well it would do with more volts and might be able to hit close to 1300mhz?
> 
> And I was curious what kind of volts others were needing to hit 1300mhz on the core? That's all. Cause it seems this card runs cool, needs very little volts to hit 1200 on the core, but I will quickly be limited by TRIX at 200mv extra.
> 
> Not sure what's going on with this thing... ? Maybe its just a fluke card or something? 1.266v was the max while running 3dMark11 at 1200mhz and it only hit 46c on core max (as shown in my earlier screenshot). And from what a few have said this is abnormal.
> 
> So what could it do on a proper loop? That's kinda what I'm wondering. I just cannot see the point of going to the expense of a full loop when this is what I'm getting already. But maybe it would improve even further. IDK


it comes down to silicon. one of my 290s can do 1300 at +200 (just for benching). my other can only do 1280.

http://www.3dmark.com/3dm11/8776470

yah, with a full block you might hit higher clocks.

i use Trixx.


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> it comes down to silicon. one of my 290s can do 1300 at +200 (just for benching). my other can only do 1280.
> 
> http://www.3dmark.com/3dm11/8776470
> 
> yah, with a full block you might hit higher clocks.


Wow... nice score... what volts were you running to get that, do you know?


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> Wow... nice score... what volts were you running to get that, do you know?


i've seen it maxed out at 1.41v using +200. not sure how droop comes into play.

edit: and, yah, synthetic benches (or even games, some say) benefit from a little OC on the memory; as much as 100 points in graphics score, I think, from 1300 to 1600MHz. Then there is Win 8 or 10 showing better figures, by as much as 200 pts.


----------



## Chopper1591

Quote:


> Originally Posted by *battleaxe*
> 
> These recent events have got me thinking. If the H80/air-cooled method really is as bad as I have recently been told... I am wondering what this card might do on a proper loop. This card does 1200MHz core on only 1.266V, so if I flashed the BIOS and ran a lot more volts into it, I wonder what it could do on the core?
> 
> Only problem is the RAM sucks on it... won't go over 1300MHz. So I'm guessing that would keep my benchmarks from being anything special...
> 
> What kind of voltage are the guys running to get stable at 1300MHz? Those of you who have reached those core speeds?
> Can't say I blame you for trying to change the subject. Truth be told, I've had more fun in the last three pages than in the previous 100.
> 
> But my funniest post got deleted... took a bit to come up with that one too... What a bummer.


Do the mod and see for yourself.
Benches won't take the temp up that high anyway.
It's gaming that brings the heat. At least for me.
Benches are short and intense.

Pump that voltage up and see where you get...
You have amazing temps to start with, right?


----------



## Tokkan

Quote:


> Originally Posted by *battleaxe*
> 
> Yeah, that's cold. Seems to be working just fine... is it loud?
> 
> What are your voltages at those temps BTW? And your core and MEM speeds?


I'm at stock currently because I believe my PSU is on the verge of death, the 12V rail is at 11.8V and drops to 11.4V.
I did a quick run at 1150Mhz /1350Mhz and the temps were good, in comparison its dead silent.


----------



## battleaxe

Quote:


> Originally Posted by *Tokkan*
> 
> I'm at stock currently because I believe my PSU is on the verge of death, the 12V rail is at 11.8V and drops to 11.4V.
> I did a quick run at 1150Mhz /1350Mhz and the temps were good, in comparison its dead silent.


I don't know much about PSU rails, but 11.4 doesn't sound very good.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Tokkan*
> 
> I'm at stock currently because I believe my PSU is on the verge of death, the 12V rail is at 11.8V and drops to 11.4V.
> I did a quick run at 1150Mhz /1350Mhz and the temps were good, in comparison its dead silent.


If you're reading the voltage from HWMonitor, it's most likely wrong. HWMonitor has been telling me my 12V rail has been at 7V for the past two years, across four different power supplies.


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> i've seen it maxed out at 1.41v using +200. not sure how droop comes into play.
> 
> edit: and, yah, synthetic benches (or even games some say) benefit a little oc on the memory. as much as 100 points in graphics score i think from 1300 to 1600 MHz. then there is Win 8 or 10 showing better figures as much as 200 pts.


Well, I tried to run it at +200mv and it just blackscreened on me. So I guess +150mv at 1200 is best it will do. Poor me.


----------



## Roboyto

I'm at work; I'll dissect the information once I get home. I'd like to see all the max values in GPU-Z, since you could very easily have set the voltage after the bench completed. You also need to run a bench loop, or show temps after gaming for some time. Temps after a single run are meaningless... unless you only play games for a few minutes at a time.


----------



## Chopper1591

Quote:


> Originally Posted by *Tokkan*
> 
> I'm at stock currently because I believe my PSU is on the verge of death, the 12V rail is at 11.8V and drops to 11.4V.
> I did a quick run at 1150Mhz /1350Mhz and the temps were good, in comparison its dead silent.


That's still all right.

ATX v2.3 rates the 12V rail between 11.40V and 12.60V.
Your PSU is ATX v2.2, but I suspect that isn't much different.

So you are OK... could be better, could be worse.

But measuring through software isn't the most accurate way to do it.
If you really want to know, grab a multimeter.
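The window quoted there is just the nominal 12V with a ±5% tolerance, so the check is easy to sanity-check yourself. A minimal sketch (the helper name and the example readings are mine, not from the thread):

```python
# Check a reported 12V rail reading against the ATX v2.x tolerance
# window (nominal 12.0V +/- 5%, i.e. 11.40V to 12.60V).

def rail_within_spec(measured_v, nominal_v=12.0, tolerance=0.05):
    """Return True if the reading falls inside nominal +/- tolerance."""
    low = nominal_v * (1 - tolerance)
    high = nominal_v * (1 + tolerance)
    return low <= measured_v <= high

print(rail_within_spec(11.8))  # True
print(rail_within_spec(11.4))  # True, right at the floor
print(rail_within_spec(11.3))  # False, below the 11.40V floor
```

A multimeter reading is still the real authority here; software sensors can be off by whole volts.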


----------



## Chopper1591

Quote:


> Originally Posted by *Roboyto*
> 
> I'm at work. I'll disect the information once I get home. I'd like to see all the max values for GPU-Z, you could very easily set voltage after the bench was completed. Also you need to run a bench loop or show temps after gaming for some time. Temps after a single run are meaningless..Unlesss you play games for a few minutes at a time


I agree on the temp thing after benches.
My temps are a good deal lower after just a bench run compared to a few hours of gaming.


----------



## bond32

Quote:


> Originally Posted by *battleaxe*
> 
> These recent event have got me thinking. If the h80/air cooled method really sucks as bad as I have recently been told... I am wondering what this card might do on a proper loop? This card does 1200mhz core on only 1.266volts, so if I flashed the BIOS and ran a lot more volts into it I wonder what it could do on the core?
> 
> 
> 
> Only problem is the RAM sucks on it... won't go over 1300mhz
> 
> 
> 
> 
> 
> 
> 
> So I'm guessing that would kill my benchmarks from doing anything special...
> 
> What kind of voltage are the guys running to get stable at 1300mhz? Those of you who have reached those core speeds?
> Can't say I blame you for trying to change the subject. Truth be told I've had more fun on the last three pages then the previous 100.
> 
> But my funniest post got deleted... took a bit to come up with that one too... What a bummer.


Is this for real? The max power draw of your card says 270-something watts? You didn't run it at the voltage you claimed, I can tell you right now. At +200mV, one of my 4 cards will draw well over 300 watts.

The voltage indicates absolutely no change from 2D power levels, which means the card hit your "1200" for a fraction of a second, then fell back.

I realize that you are already really butthurt, but if you keep posting nonsense, especially in this thread, you're going to get shot down.
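For anyone wondering why an overvolt balloons the draw the way bond32 describes: dynamic power in CMOS scales roughly with frequency times voltage squared. A back-of-envelope sketch (the baseline numbers are hypothetical, not any particular card's measurements):

```python
# Rough dynamic-power scaling rule of thumb: P ~ f * V^2.
# Illustrates why a few hundred mV of overvolt raises draw so sharply.

def scale_power(p_base_w, f_base, f_new, v_base, v_new):
    """Estimate new dynamic power from clock and voltage ratios."""
    return p_base_w * (f_new / f_base) * (v_new / v_base) ** 2

# Hypothetical: 250W at 1000MHz / 1.20V, pushed to 1200MHz / 1.40V.
est = scale_power(250.0, 1000, 1200, 1.20, 1.40)
print(round(est))  # roughly 408 watts
```

It ignores static leakage (which also climbs with voltage and temperature), so real cards tend to do even worse than the formula suggests.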


----------



## Gobigorgohome

Baaaaaaaah .... a little more work than just using the EK-FC Terminal Quad Semi-Parallel ....


----------



## tsm106

^^Bond, look at the power out, it's too low. For 1.25V +150mV = 220W, lolzers.









Quote:


> Originally Posted by *Gobigorgohome*
> 
> Baaaaaaaah .... a little more work than just using the EK-FC Terminal Quad Semi-Parallel ....


Is your loop that bad? Why go full parallel? There's a math equation that explains it, but in essence you are only running a quarter of the flow through each block in quad parallel.
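The "math equation" alluded to is roughly an even split of flow across identical parallel blocks. A toy sketch of that split (the numbers are made-up illustrations, not measurements from anyone's loop):

```python
# Idealized flow split: N identical blocks in parallel each see about
# 1/N of the loop's total flow, while blocks in series all see the
# full flow (at the cost of more total restriction).

def flow_per_block(total_flow_lph, blocks_in_parallel):
    """Ideal even split across identical parallel blocks (L/h)."""
    return total_flow_lph / blocks_in_parallel

total = 300.0  # hypothetical loop flow in liters per hour
print(flow_per_block(total, 1))  # series: 300.0 L/h through each block
print(flow_per_block(total, 4))  # quad parallel: 75.0 L/h per block
```

Real blocks don't split perfectly evenly (restriction varies block to block), but the quarter-flow intuition is why parallel GPUs can run warmer than the same cards in series.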


----------



## Tokkan

Quote:


> Originally Posted by *Chopper1591*
> 
> Thats still alright.
> 
> Atx v2.3 12v is rated between 11.40v and 12.60v
> Your psu is atx v2.2, but I suspect that to not be very different.
> 
> So you are ok... could be better, could be worse.
> 
> But, measuring through software isn't the most accurate way to do it.
> If you want to know it, grab the Multi-meter.


Yeah, I'm not too worried about it. I'll have to switch the PSU eventually, and I'll take the opportunity to build a completely new system when the time comes.


----------



## battleaxe

Quote:


> Originally Posted by *Roboyto*
> 
> I'm at work. I'll disect the information once I get home. I'd like to see all the max values for GPU-Z, you could very easily set voltage after the bench was completed. Also you need to run a bench loop or show temps after gaming for some time. Temps after a single run are meaningless..Unlesss you play games for a few minutes at a time


So you called crap, but now that you look like a fool you're gonna change the way this is done, right? You called crap after I made a comment on my temps running 3DMark11 or Firestrike. I said nothing about running a game for an hour. So now you're gonna change your tune and change how I have to measure my temps.

Wow, thanks so much for your permission. I had no idea you were the overclocking police. I'm so sorry.

So how would you like me to do this? Why don't you tell me how I have to run the bench to your satisfaction so my temps are high enough or until you actually realize you were wrong? I'm sorry, I had no idea I had to run these temp clarifications to suit your purposes and tastes.

Wow, this is a lot harder than I had imagined. So now all this is because I fabricated and turned the voltage up after the test? Well, that's creative...

And why would I do that? Do you mind sharing why I would do that?

So now what? I have to run the bench again right? And this time I have to post the max voltages just so you are satisfied? Why? Are you going to publicly apologize to me? What are you going to do to make this worth my time?

Edited: I said some not-so-nice things so took them off to be less of a jerk.


----------



## battleaxe

Quote:


> Originally Posted by *bond32*
> 
> Is this for real? The max power draw of your card says 270 something watts? You didn't run it at that voltage you claimed I can tell you right now. At +200mV one of my 4 cards will draw well over 300 watts.
> 
> The voltage indicates absolutely no change from leaving 2D power levels which means they hit your "1200" for a fraction of a second, then fell.
> 
> I realize that you are already really butthurt, but if you keep posting nonsense, especially in this thread, you're going to get shot down.


What do you want this test to look like? Huh? What's the proof on screen you want? Go ahead and tell me now, so I can shut all you high school kids up at once.

I never said +200mV. I said +150!!!

My card cannot even go to +200mV or it black-screens.

Edited: I got mad and said some stuff I shouldn't have said. Shame on me.


----------



## Chopper1591

Quote:


> Originally Posted by *bond32*
> 
> Is this for real? The max power draw of your card says 270 something watts? You didn't run it at that voltage you claimed I can tell you right now. At +200mV one of my 4 cards will draw well over 300 watts.
> 
> The voltage indicates absolutely no change from leaving 2D power levels which means they hit your "1200" for a fraction of a second, then fell.
> 
> I realize that you are already really butthurt, but if you keep posting nonsense, especially in this thread, you're going to get shot down.


Yeah, it's weird, right?
Something is smelly here.

Look, this is a shot of mine from a week or so ago.
1200/1500 with only +150mV. Look at the peak voltage and power:


Quote:


> Originally Posted by *Tokkan*
> 
> Yea I'm not too worried about it, I'll have to switch the PSU eventually and I'll take advantage to make a complete new system when the time comes.


Good plan.
I think it will still last a good while. Just don't overload it for too long.
Meaning, don't push that card too hard.


----------



## tsm106

Lmao, when you have to resort to personal insults and attacks...


----------



## Roboyto

Quote:


> Originally Posted by *battleaxe*
> 
> Go away. I'm done with you.


You asked where I was...here I am, yet now you tell me to go away









The screenshot I posted was after an hour-long loop, i.e. a real-world scenario. A single run through 3DMark11 doesn't tell you much of anything.

I'm sorry, but these cards are best known for their high operating temperatures, and there isn't a single other person coming close to what you're claiming.

I have no problem admitting when I am wrong, but there needs to be more information to substantiate your claims.


----------



## Roboyto

Quote:


> Originally Posted by *battleaxe*
> 
> So you called crap, but now that you look like a fool you are gonna change the way this is done right? You called crap after I made a comment on my temps and running 3dMark11 or Firestrike. I said nothing about running a game for an hour. So now you're gonna change your tune and change the way I have to measure my temps.
> 
> Wow, thanks so much for your permission. I had no idea you were the overclocking police. I'm so sorry.
> 
> So how would you like me to do this? Why don't you tell me how I have to run the bench to your satisfaction so my temps are high enough or until you actually realize you were wrong? I'm sorry, I had no idea I had to run these temp clarifications to suit your purposes and tastes.
> 
> Wow, this is a lot harder than I had imagined. So now all this is because I fabricated and turned the voltage up after the test? Well, that's creative...
> 
> And why would I do that? Do you mind sharing why I would do that?
> 
> So now what? I have to run the bench again right? And this time I have to post the max voltages just so you are satisfied? Why? Are you going to publicly apologize to me? What are you going to do to make this worth my time?
> 
> Go Troll somewhere else.


Hey guys, look how stable my card is and how low my temperatures are after 4 minutes!

When I post results for an hour loop, I would anticipate seeing something similar... you know, an equal comparison... apples to apples.


----------



## Chopper1591

Quote:


> Originally Posted by *Roboyto*
> 
> You asked where I was...here I am, yet now you tell me to go away
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The screenshot I posted was after an hour loop, i.e. real world scenario. A single run through 3DMark11 doesn't tell much of anything.
> 
> I'm sorry but these cards are most known for high operating temperatures and there isn't a single other person even coming close to what you're claiming.
> 
> I have no problem admitting when I am wrong, but there needs to be more information to substantiate your claims.


It keeps getting better.

I really think someone is gonna explode here someday.
The friction is building. I can feel it.

Here, 25 minutes of Heaven, fixed 60% fan:


----------



## Gobigorgohome

Quote:


> Originally Posted by *tsm106*
> 
> ^^Bond, look the power out, it's too low. For 1.25v +150mv = 220w lolzers.
> 
> 
> 
> 
> 
> 
> 
> 
> Is your loop that bad? Why go full parallel? There's a math equation that explains it, but you are in essence only running a 1/4th the cooling thru each block in quad parallel.


I saw some Titan 4-way SLI setups doing full parallel, so if that could be cooled I figured this would work as well. If it doesn't, I can just remove the two C47s on the second card and use plugs, and it's essentially the same as the EK Quad Semi-Parallel bridge. I don't think the temperature will be too awful: ambient is at 13C, with dual MO-RA3 1260s just for the GPUs, so I don't think it will run too hot. Anyway, getting rid of those two C47s only takes like 30 minutes, so no big deal if I get bad temperatures... a quarter of 5 liters of cold water... hmm, I will check back when it is tested out.


----------



## battleaxe

Quote:


> Originally Posted by *Roboyto*
> 
> Hey guys look how stable my card is and low my temperatures are after 4 minutes!
> 
> When I post results for an hour then I would anticipate to see something similar..you know an equal comparison..apples-to-apples


Except... that's not what I claimed, is it? Ever.

I clearly said after 3DMark11 or Firestrike. You guys said I was a liar. What is missing on my screenshot that would make it clearer? I'll be glad to put it up there to shut you all up on the matter.

Why does the GPU-Z look weird? I have no idea. But this is what it does, so I can only assume it will do the same thing again.

How do I do this so it's clear and I don't get accused any more? Someone tell me, please. And no, I have no interest in running it for an hour, as that is not what I ever said in my original claim. Now was it?

So who's full of crap then? I never made a claim that was not true, and I proved it.

BTW, how did I get the 3DMark11 result, which shows my core at 1200/1250, and the screenshots running GPU-Z too? How would I trick you on that? Clearly I haven't put this much thought into falsifying benchmarks... as maybe some of you have? Hmmm....


----------



## tsm106

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> ^^Bond, look the power out, it's too low. For 1.25v +150mv = 220w lolzers.
> 
> 
> 
> 
> 
> 
> 
> 
> Is your loop that bad? Why go full parallel? There's a math equation that explains it, but you are in essence only running a 1/4th the cooling thru each block in quad parallel.
> 
> 
> 
> Saw some Titan 4-way SLI doing full parallel so if that could be cooled I figured this would work as well, well, if it does not I could just remove the two C47s on the second card and use plugs and it is essential the same as the EK Quad Semi-Parallel bridge ... do not think the temperature will be too awful, ambient at 13C and dual MO-RA3 1260s just for the GPU's, I do not think it will run too hot, anyways, to get rid of those two C47's only take like 30 minutes so no big deal if I get bad temperatures ... 1/4th of 5 liters with cold water ... hmm, I will check back when it is tested out.

The crux of full parallel on GPUs is to reduce the flow "thru" whichever block is in question, in this case your quad GPUs, to improve flow through the rest of the loop. Essentially you are sacrificing cooling of the GPUs for the sake of another block, typically the CPU. You shouldn't copy parallel cooling just because you see others do it; I would hope they made an active choice to accept that compromise. Know what I mean?


----------



## chiknnwatrmln

This thread has been very entertaining lately...


----------



## Gobigorgohome

Quote:


> Originally Posted by *tsm106*
> 
> The crux of full parallel on gpus is to reduce the flow "thru" whichever block in question, in this case your quad gpus, to better flow thru the rest of the loop. Essentially you are sacrificing cooling of the gpus for the sake of another block, typically the cpu. You shouldn't copy parallel cooling just because you see others do it, because I hope that they are making an active choice for making this compromise. Know what I mean?


Yes, I know what you mean. As said before, re-do project takes maybe 30 minutes, just have to drain, plug out two EK-FC Terminals and have two stop plugs in ... I may do it later tonight.


----------



## tsm106

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> This thread has been very entertaining lately...


I think we're all doing it wrong if you can get 41c or 46c or whatever with some rinky dink gelid heatsinks and a fan. Even AMD is clueless if they didn't know that!

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> The crux of full parallel on gpus is to reduce the flow "thru" whichever block in question, in this case your quad gpus, to better flow thru the rest of the loop. Essentially you are sacrificing cooling of the gpus for the sake of another block, typically the cpu. You shouldn't copy parallel cooling just because you see others do it, because I hope that they are making an active choice for making this compromise. Know what I mean?
> 
> 
> 
> Yes, I know what you mean. As said before, re-do project takes maybe 30 minutes, just have to drain, plug out two EK-FC Terminals and have two stop plugs in ... I may do it later tonight.

Try it out like as is if it's already plumbed up. I suspect you won't be happy, but I could be wrong. You could have silly powerful pumps and have flow to spare.

** I should comment that in my loop, or with respect to the blocks and hardware in my loop, my GPUs are the most important cooling-wise since they dump the most heat overall. On an individual basis the CPU dumps the most heat, but it is not utilized very much in games, unlike our GPUs. Thus keeping the GPUs cool trumps my need to keep the CPU cool by a mile. And in that regard, parallel is not for me.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *tsm106*
> 
> I think we're all doing it wrong if you can get 41c or 46c or whatever with some rinky dink gelid heatsinks and a fan. Even AMD is clueless if they didn't know that!


I think you replied to the wrong person..


----------



## Tokkan

Played a few rounds of BF4, and the final temps I reached were around 65 degrees for everything. Last night I only loaded it up to see if it was seated correctly; after one round, VRM1 max was 68 degrees, VRM2 max was 59 degrees, and max core temp was 66 degrees. No change in fan speed.
40 euros bought me silence. Finally my laptop is louder than my desktop again









I didn't give it enough time before for the temperatures to settle in.


----------



## Gobigorgohome

Quote:


> Originally Posted by *tsm106*
> 
> Try it out like as is if it's already plumbed up. I suspect you won't be happy, but I could be wrong. You could have silly powerful pumps and have flow to spare.
> 
> ** I should comment that in my loop or in respect to the blocks in my loop and hardware, my gpus are most important cooling wise since they dump the most heat overall. On an individual basis the cpu dumps the most heat, but it is not utilized vary much in games unlike our gpus. Thus keeping the gpus cooled trumps my need to keep the cpu cool by a mile. And in that regard, parallel is not for me.


Okay, I re-did it; leak testing as we speak. Now it is running exactly like the EK-FC Terminal Quad Semi-Parallel.







I re-did the ins and outs as well to be sure it is truly semi-parallel. So, 45 minutes to drain the system, re-do the loop, and fill it up again.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Okay, I re-did it, leak testing as we speak. Now it is running exactly as the EK-Terminal Quad Semi-Parallel.
> 
> 
> 
> 
> 
> 
> 
> Re-did the in and outs as well to be sure it is truly semi-parallel. Soo, 45 minutes on draining the system, re-do the loop and fill it up again.


Why don't you just pinch or bend the tubing closed around the GPUs so you don't have to drain and refill? That's what I did with my CPU block when I found out I had mixed up the in and out ports. As long as you have a towel handy you shouldn't get water anywhere bad.

Never mind, I just saw that you have solid tubing.


----------



## battleaxe

Okay, I ran Valley as I am not sure how to run 3dMark and also show the temps at the same time while the bench is running too. I ran long enough for the temps to stabilize which hopefully is clear by the graph.

Or is this still wrong and I'm still full of crap? Seems to me the VRMs get a bit hotter in Valley than they do in 3DMark11? So that's a bit higher...

So is something wrong with my card? Is that what you guys are saying? Something wrong with my voltage or something?

Or does this somehow prove nothing too, since you are the Gods of AMD 290 watercooled cards and I'm just an idiot with a 290 and an H80?

Edit: had to remove some of my jerk comments...


----------



## LandonAaron

Quote:


> Originally Posted by *battleaxe*
> 
> Okay, I ran Valley as I am not sure how to run 3dMark and also show the temps at the same time while the bench is running too. I ran long enough for the temps to stabilize which hopefully is clear by the graph.
> 
> Or is this still wrong and I'm still full of crap? Seems to me the VRM's get bit hotter on Valley than they do on 3dMark11? So that's a bit higher..
> 
> So is something wrong with my card? Is that what you guys are saying? Something wrong with my voltage or something?
> 
> Or does this somehow prove nothing too since you are the Gods of AMD290 watercooled cards and i'm just an idiot with a 290 and an h80?
> 
> Here come the comments about how I made something up right? C'mon don't let me down... here they come...


I think they don't believe it because your power-in and power-out watts are lower than theirs. You can see the 1200MHz core clock over time in your GPU-Z shot, but you can't tell whether Valley has actually been running or not. Take a screenshot where you can see the Valley timer.


----------



## battleaxe

Quote:


> Originally Posted by *LandonAaron*
> 
> I think they don't believe it cause your power in and power out watts are lower than theirs. But you can see the 1200mhz core clock over time in your GPUz shot, but you can't tell if Valley has actually been running or not. Take a screenshot where you can see the valley timer.


Hey, that's actually a helpful suggestion.

I'm doing all I can to prove to these guys that I'm telling the truth... cause I really want to be like them. They are the real GPU GODS and know all... I bow in their presence truthfully. lol They're just trying to teach me something. They're trying to help me cause I'm so dull and can't read GPU-Z very well. I really owe everything I know to them... plus I'm just an idiot with an h80, that's really the issue here.

The funny thing is I'm willing to admit my card is weird. But they insist that I'm a fraud, so here we go...


----------



## LandonAaron

Quote:


> Originally Posted by *battleaxe*
> 
> Hey, that's actually a helpful suggestion.
> 
> I'm doing all I can to prove to these guys that I'm telling the truth... cause I really want to be like them. They are the real GPU GODS and know all... I bow in their presence truthfully. lol They're just trying to teach me something. They're trying to help me cause I'm so dull and can't read GPU-Z very well. I really owe everything I know to them... plus I'm just an idiot with an h80, that's really the issue here.
> 
> The funny thing is I'm willing to admit my card is weird. But they insist that I'm a fraud, so here we go...


Yeah, I would definitely say a card that can do 1200 core but only 1250 memory is a weird card.


----------



## bond32

Quote:


> Originally Posted by *battleaxe*
> 
> Hey, that's actually a helpful suggestion.
> 
> I'm doing all I can to prove to these guys that I'm telling the truth... cause I really want to be like them. They are the real GPU GODS and know all... I bow in their presence truthfully. lol They're just trying to teach me something. They're trying to help me cause I'm so dull and can't read GPU-Z very well. I really owe everything I know to them... plus I'm just an idiot with an h80, that's really the issue here.
> 
> The funny thing is I'm willing to admit my card is weird. But they insist that I'm a fraud, so here we go...


And the butthurt continues...

For starters, your voltage chart looks wonky. I would bet that if someone else here had an identical setup to yours, running identical clocks but perhaps different drivers, they would have higher benches, but also a steady voltage/power. Capisce?

Your power consumption is still unusually low, which implies there is an issue, likely with a driver. My bet is the card isn't staying in 3D clocks nearly as often as it should, hence the lower temperatures you see.

Do you want to brag about lower temps with low bench scores, or normal temps with higher scores?
Quote:


> Originally Posted by *Gobigorgohome*
> 
> Okay, I re-did it, leak testing as we speak. Now it is running exactly as the EK-Terminal Quad Semi-Parallel.
> 
> 
> 
> 
> 
> 
> 
> Re-did the in and outs as well to be sure it is truly semi-parallel. Soo, 45 minutes on draining the system, re-do the loop and fill it up again.


I'm curious about your results... With my koolance 4 way connections, I don't get any choice on flow. The top two cards are parallel, bottom two are parallel, both parallel loops are connected in series.

Edit: And because I'm curious, here's a score from the 3DMark11 thread showing a 3570K and 290 from another member:
http://www.3dmark.com/3dm11/7620150

Pretty substantial over 14k if you ask me...

And another thought, no need to get all bent out of shape. If you're running something and it turns out there's a problem you don't know of, is it so bad that we pointed that out? Despite your childish insults?


----------



## battleaxe

Quote:


> Originally Posted by *LandonAaron*
> 
> Yeah I would definitely say that card that can do 1200 core but only 1250 memory is definitely a weird card.


Yeah, the memory is trash... I'm wondering if that's why the temps are so low? I have no idea, but it's a strange animal for sure. This is not the only 290 I have owned, so I am quite aware it is not typical.









That doesn't mean the results are wrong either, but whatever.


----------



## battleaxe

Quote:


> Originally Posted by *bond32*
> 
> And the butthurt continues...
> 
> For starters, your voltage chart looks wonky. I would bet, that if someone else here had an identical setup to yours, running identical clocks but perhaps different drivers, they would have higher benches. But also a steady voltage/power ka-piche?
> 
> Your power consumption is still unusually low which would imply there is an issue likely with a driver. My bet is the card isn't staying in 3d clocks near as often as it should hence the lower temperatures you see.


So my low temps could be a result of something wrong with the voltage? Or drivers?

It scores 17500ish on 3dMark11??

Isn't that about normal though? Of course what you're saying is possible, but seems the scores are okay? My clocks are only 1200/1250 so 17500 is about right isn't it?

It also games fine, but temps do about the same thing on games too. IDK
Quote:


> Edit: And because I'm curious, heres the score from the 3dmark11 thread showing a 3570k and 290 from another member:
> http://www.3dmark.com/3dm11/7620150
> 
> Pretty substantial over 14k if you ask me...
> 
> And another thought, no need to get all bent out of shape. If you're running something and it turns out there's a problem you don't know of, is it so bad that we pointed that out? Despite your childish insults?


And besides you saying my results were false, which obviously implies I am lying? How is that not a childish insult? If someone accuses me of lying over and over... like yourself, who got this whole thing started... and then three other people join in trolling my posts, yeah, I got kinda ticked off. But whatever, man...

...your score was tessellation-modified, right? Mine isn't. And mine was 17500, not 14k.

I was pretty sure that 17,500 was about right for my clock speeds. I could be wrong though. I'm telling you my card is fine. Maybe it's different... I get that now. But it seems to work as it should, other than having low voltage, low temps, and RAM that sucks. I'm not saying this is the case for sure, maybe it is screwed up, but everything else besides the low temps and low voltages has been normal. This card was quite a bit stronger than my other 290, which was reference and died on me a bit ago.

FYI - I just posted my results (several pages ago) and never thought it was any kind of issue. I wasn't trying to one up anyone. Just saying what my setup does and I was happy with it. I didn't realize it is so far from the norm and never meant to get all this started. But to be honest it was kinda funny to post some of the stuff, I had a bit of fun with it as I do like to write some. But then you guys started making condescending remarks implying I was full of crap and lying about what my card does. Yeah, that kinda got the "butt hurt" going a bit.

Anyway, water under the bridge. If there's something to make me think something is wrong with my card or drivers then I'll try to figure it out. But the card works fine as far as I can tell. Unless those benchmarks really are way off. Then I'll try to figure it out too.


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> So my low temps could be a result of something wrong with the voltage? Or drivers?
> 
> It scores 17500ish on 3dMark11??
> 
> Isn't that about normal though? Of course what you're saying is possible, but seems the scores are okay? My clocks are only 1200/1250 so 17500 is about right isn't it?
> 
> It also games fine, but temps do about the same thing on games too. IDK


no your card is fine. they're just jelly.


----------



## bond32

This guy's pretty hilarious.


----------



## battleaxe

Quote:


> Originally Posted by *bond32*
> 
> This guy's pretty hilarious.


I do what I can...

the doctors say I'm bipolar, but I don't listen to those voices any more.


----------



## Gobigorgohome

Quote:


> Originally Posted by *bond32*
> 
> I'm curious about your results... With my koolance 4 way connections, I don't get any choice on flow. The top two cards are parallel, bottom two are parallel, both parallel loops are connected in series.
> 
> Edit: And because I'm curious, heres the score from the 3dmark11 thread showing a 3570k and 290 from another member:
> http://www.3dmark.com/3dm11/7620150
> 
> Pretty substantial over 14k if you ask me...
> 
> And another thought, no need to get all bent out of shape. If you're running something and it turns out there's a problem you don't know of, is it so bad that we pointed that out? Despite your childish insults?


If I understand what you are saying correctly, that is how the EK-FC Terminal Quad Semi-Parallel works (pretty much only one way to use it when you take the ports into account).


----------



## battleaxe

Quote:


> Originally Posted by *Gobigorgohome*
> 
> If I understand what you are saying correctly that is how the EK-FC Terminal Quad Semi-Parallel works (pretty much only one way to use it when taken ports in mind).
> 
> Have I been bent out of shape? No. Have I said it is bad that it gets pointed out when I have done something less smart than the alternatives? No. Have I behaved in a way that would be considered "childish insults"? Except maybe one time, no. I just argued that I had seen it done with similar systems before, and I said I would try it out before I gave any more feedback about running full serial; I re-did it without testing it so no temperature problems had to be taken into account. Besides, I just want to cool the GPUs; if I do not have the best temperatures, that does not bother me as long as it is considered cold enough.
> 
> 
> 
> 
> 
> 
> 
> 46C core and 55C VRM is cold enough for what I am doing at this time.


He was talking about me. I was being childish. I'll own it. I got kinda pissed off.

All good now... Took my medication and everything is back to normal now.


----------



## pshootr

I received my Sapphire R9 290 OC Tri-X from Newegg, but have not installed it yet as I am waiting on a new case to be delivered. I was a little upset that the box for my card was not wrapped in plastic and the sticker seal was not properly sealing the box! I paid for a new card, and I have no way of knowing if it is new or used. Even if it were only slightly used, I have no idea who used it or what they did with it. Is it common for items to be shipped without being wrapped in plastic?


----------



## JourneymanMike

Quote:


> Originally Posted by *pshootr*
> 
> I recieved my Sapphire R9 290 OC Tri-X from newegg, but have not installed it yet as I am waiting on a new case to be delivered. I was a little upset that the box for my card was not wrapped in plastic and the sticker seal was not properly sealing the box! I paid for a new card, and I have no way of knowing if it is new or used. Even if it were only slightly used, I have no idea who used it, or what they did with it. Is it common for items to be shipped without being wrapped in plastic?


That's happened to me a couple of times as well... I've always been lucky that the component worked as expected.

Now I've heard of others that got either used or refurbished junk and had to RMA because the part didn't work!

You might be better off emailing or calling the supplier and discussing the situation before installing and testing it...









Good Luck to you...


----------



## pshootr

Quote:


> Originally Posted by *JourneymanMike*
> 
> I've had that happen a couple of times to me also... I've always had luck that the component worked as expected.
> 
> Now I've heard of others that got either used or refurbished junk and had to RMA because the part didn't work!
> 
> Maybe one might be better off emailing or calling the supplier and discussing the situation before installing and testing it...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Good Luck to you...


Thank you. I think I will call them. This really bugs me because I paid full price rather than open-box price. :/


----------



## tsm106

Not all brands shrink-wrap. I've gotten various boards that were not wrapped, from various vendors, the Egg and Amazon included. If you are concerned, check for fingerprints. Workers at the factory use gloves, so you'll know if someone unofficial handled your card.


----------



## pshootr

Quote:


> Originally Posted by *tsm106*
> 
> Not all brands shrink wrap. I've gotten various boards that were not wrapped from various vendors, egg and amazon included. If you are concerned check for fingerprints. Workers at the factory use gloves so you'll know if someone not official handled your card.


Thing is, the sticker seal for the box was not sealing the box. You know how after you reseal a sticker it does not stay sealed; it lifts back up because the adhesive is no longer as sticky. And no matter who handled the card, it still appears to be an open box. Can anyone say whether the Sapphire R9 290 OC Tri-X box is supposed to come wrapped in plastic?


----------



## Regnitto

Got my 290 today!!!!! Thanks to @bluedevil for the great deal! VisionTek R9 290




GPU-z validation

Fire Strike Results 1050/1250


----------



## pshootr

Quote:


> Originally Posted by *Regnitto*
> 
> Got my 290 today!!!!! Thanks to @bluedevil for the great deal! VisionTek R9 290
> 
> 
> 
> 
> GPU-z validation
> 
> Fire Strike Results 1050/1250


Congrats dude I bet you are pumped!


----------



## alancsalt

Instead of that two letter acronym for nonsense starting with B that hopefully is now edited out, you might try "nonsense" or "shenanigans".
Numerous reports. Please avoid this or risk further warnings/infractions.
Quote:


> You are EXPECTED to:
> Maintain an environment that is friendly to all ages
> No swearing, racy images etc.
> 
> You may not:
> Use profanity. This includes the use of symbols, abbreviations, or *acronyms* to circumvent the no profanity rule.


Quote:


> Adding ***** or using acronyms that are synonymous with swearing ARE both the same as swearing.
> 
> - The censorship ("*****") is just reactive action to someone who has broken our rules. It does not make things "alright".
> - Acronyms that are synonymous with swearing are defined as circumvention of our rules.
> 
> Both of these will not be tolerated - the same way that swearing here has never been tolerated.
> 
> We are a professional community. Let's keep it that way.
> 
> Thread Origin: http://www.overclock.net/t/219513/the-dreaded-f/50_50#post_2541569


*So, anything that turns into asterisks, edit it out with other words.*


----------



## sinnedone

Need help with the fan pinout on these cards. Does anyone happen to know the pinout, and whether the cards do voltage control of the fan or PWM only?


----------



## hornedfrog86

Quote:


> Originally Posted by *pshootr*
> 
> Thing is, the sticker seal for the box was not sealing the box. You know how after you reseal a sticker it does not stay sealed; it lifts back up because the adhesive is no longer as sticky. And no matter who handled the card, it still appears to be an open box. Can anyone say whether the Sapphire R9 290 OC Tri-X box is supposed to come wrapped in plastic?


Mine had stickers at the ends but not sealed in plastic.


----------



## bluedevil

Quote:


> Originally Posted by *Regnitto*
> 
> Got my 290 today!!!!! Thanks to @bluedevil for the great deal! VisionTek R9 290
> 
> 
> 
> 
> GPU-z validation
> 
> Fire Strike Results 1050/1250


That's a good looking 290 right there...


----------



## josephimports

Corsair HG10 / Antec 620 push
NZXT H2
Ref. 290 stock
60 min valley loop
ambient 17c


NZXT G10 / Zalman LQ320 push/pull
Corsair 730T
Ref. 290 stock
60 min valley loop
ambient 17c


IMO, the HG10 looks better and is quicker to install. Performance is nearly identical.


----------



## pshootr

Quote:


> Originally Posted by *hornedfrog86*
> 
> Mine had stickers at the ends but not sealed in plastic.


Hmm, maybe Sapphire does not use shrink-wrap plastic. I would still be concerned, though, because one of the stickers was not sealed on one side. Thanks for your response.


----------



## Regnitto

Quote:


> Originally Posted by *pshootr*
> 
> Congrats dude I bet you are pumped!


That I most certainly am



Quote:


> Originally Posted by *bluedevil*
> 
> That's a good looking 290 right there...


I know, right! Thanks again for the great deal, can't wait to get it under water and see the results you had with it







I need to go give you some positive trader rating.


----------



## hornedfrog86

Quote:


> Originally Posted by *pshootr*
> 
> Hmm, maybe Sapphire does not use shrink-wrap plastic. I would still be concerned, though, because one of the stickers was not sealed on one side. Thanks for your response.


Sure, the card seems fine and looks new.


----------



## Bertovzki

I had my R9 290X Tri-X OC arrive with no plastic but what seemed like untampered seal stickers, from a company with big turnover, and at the time I bought the card they were going like hot cakes!

But I did hear from another local guy that he had a card that was clearly used. That alone was enough for me to never use that company, even though they had the cheapest 290 in the country (maybe that's why). It is incredibly rude and dishonest to sell a card that has, for any reason, come out of its packaging without stating as much to the buyer.

Adding to that, the fans on the Tri-X were all covered in plastic and clearly not used!


----------



## pshootr

Quote:


> Originally Posted by *Bertovzki*
> 
> I had my R9 290X Tri-X OC arrive with no plastic but what seemed like untampered seal stickers, from a company with big turnover, and at the time I bought the card they were going like hot cakes!
> 
> But I did hear from another local guy that he had a card that was clearly used. That alone was enough for me to never use that company, even though they had the cheapest 290 in the country (maybe that's why). It is incredibly rude and dishonest to sell a card that has, for any reason, come out of its packaging without stating as much to the buyer.
> 
> Adding to that, the fans on the Tri-X were all covered in plastic and clearly not used!


I'm pretty sure this box has been opened, given that one of the stickers was no longer sticky enough to seal the box. This card has plastic covering the face of the cooler, but it does not cover the fans themselves. I agree it is very uncool and unprofessional to sell a product as new if the box has already been opened. For all I know, someone overclocked the piss out of it with God knows how many volts.

I suppose I'll be happy enough so long as it works well. Still, it's somewhat disappointing not to know what was done with the card before it was shipped to me. I think all companies should use shrink-wrap to avoid this kind of issue.


----------



## Mega Man

@battleaxe
Have you thought about HWiNFO? It shows min/max/average and the time it has been open.



If I had a 290/290X installed, it would show VRM temps too.


----------



## Bertovzki

Quote:


> Originally Posted by *pshootr*
> 
> I'm pretty sure this box has been opened due to the fact one of the stickers was no longer sticky enough to seal the box. This card has plastic covering the face of the cooler, but does not cover the fans themselves. I agree it is very uncool and nonprofessional to sell a product as new if the box has already been opened. For all I know someone overclocked the piss out of it with god knows how many volts.
> 
> I suppose I'll be happy enough so long as it works well. Still somewhat disappointing to not know what has been done with the card before it was shipped to me. I think all companies should use shrink-wrap to avoid this kind of issue.


Yeah, OK, you would just have to hope it was only taken out of the box to show some customer what it looked like. But still, like you say, it should be shrink-wrapped at the factory to eliminate this question entirely. I just had a look back at the original post and see it was Newegg; you would hope to have faith that it was not used, as they have the turnover and move these cards regularly.


----------



## pshootr

Quote:


> Originally Posted by *Bertovzki*
> 
> Yeah ok you would just have to hope that it was taken out the box to show some customer what it looked like and that was all , but still , like you say it should be shrink wrapped in factory to eliminate this question or possibility , just had a look back at original post , and see it was Newegg , you would hope to have faith in them that it was not used , as they have the turnover of product and move the cards regularly


I'm trying to stay positive-minded, but it just bothers me that the sticker is not sealing the box. I have not checked the warranty details yet, but if overclocking the card voids the warranty, then I could end up with a void warranty through no fault of my own. That would be a terrible shame.


----------



## joeh4384

Quote:


> Originally Posted by *pshootr*
> 
> I'm trying to stay positive-minded, but it just bothers me that the sticker is not sealing the box. I have not checked the warranty details yet, but if overclocking the card voids the warranty, then I could end up with a void warranty through no fault of my own. That would be a terrible shame.


You should be fine with Sapphire's warranty as long as you do not mod the cooler and have the original invoice.


----------



## pshootr

Quote:


> Originally Posted by *joeh4384*
> 
> You should be fine with Sapphire's warranty as long as you do not mod the cooler and have the original invoice.


That is good news. Thanks for the word.


----------



## Arizonian

Quote:


> Originally Posted by *Regnitto*
> 
> Got my 290 today!!!!! Thanks to @bluedevil for the great deal! VisionTek R9 290
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> GPU-z validation
> 
> Fire Strike Results 1050/1250


Congrats - added









In other news today - *[Techspot] AMD Catalyst Omega Drivers*


----------



## Chopper1591

Quote:


> Originally Posted by *battleaxe*
> 
> So my low temps could be a result of something wrong with the voltage? Or drivers?
> 
> It scores 17500ish on 3dMark11??
> 
> Isn't that about normal though? Of course what you're saying is possible, but seems the scores are okay? My clocks are only 1200/1250 so 17500 is about right isn't it?
> 
> It also games fine, but temps do about the same thing on games too. IDK
> And besides you saying my results were false, which obviously implies I am lying? How is that not a childish insult? If someone accuses me of lying over and over... and like yourself getting the whole thing started... then three other people joining in the Trolling on my posts, yeah. I got kinda ticked off. But whatever man...
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> ...your score was tess modified right? Mine isn't. And mine was 17500, not 14k.
> 
> I was pretty sure that 17,500 was about right for my clock speeds. I could be wrong though. I'm telling you my card is fine. Maybe its different... I get that now. But it seems to work as it should other than having low voltage, low temps, and the RAM sucks. I'm not saying this is the case for sure, maybe it is screwed up, but everything else besides the low temps and low voltages has been normal. This card was quite a bit stronger than my other 290 that was reference and died on me a bit ago.
> 
> FYI - I just posted my results (several pages ago) and never thought it was any kind of issue. I wasn't trying to one up anyone. Just saying what my setup does and I was happy with it. I didn't realize it is so far from the norm and never meant to get all this started. But to be honest it was kinda funny to post some of the stuff, I had a bit of fun with it as I do like to write some. But then you guys started making condescending remarks implying I was full of crap and lying about what my card does. Yeah, that kinda got the "butt hurt" going a bit.
> 
> Anyway, water under the bridge. If there's something to make me think something is wrong with my card or drivers then I'll try to figure it out. But the card works fine as far as I can tell. Unless those benchmarks really are way off. Then I'll try to figure it out too


I've had it with you and your "thing".

It's starting to stain this thread, big time.

Get your results on the table here, showing your settings with them.
Run whatever you want... Heaven, Valley, 3DMark 11, 3DMark 2013. And do show scores with Sapphire TriXX or Afterburner open, coupled with the max values from GPU-Z.

Then I will copy your clocks/settings and do the same runs.
I want to clear this up now. I'm done with it...


----------



## Sgt Bilko

Quote:


> Originally Posted by *Arizonian*
> 
> In other news today - *[Techspot] AMD Catalyst Omega Drivers*


Now that looks like a damn nice driver!


----------



## bluedevil

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In other news today - *[Techspot] AMD Catalyst Omega Drivers*


You can also remove me.


----------



## Silent Scone

Still can't believe they've called them Omega Drivers!


----------



## rdr09

Quote:


> Originally Posted by *Silent Scone*
> 
> Still can't believe they've called them Omega Drivers!


It would have been better if AMD had called it the "Wonder" driver, but that wouldn't be original.


----------



## Kokin

Quote:


> Originally Posted by *Chopper1591*
> 
> Yeah it's weird, right?
> Something is smelly here.
> 
> Look, this is a shot of mine from last week or so.
> 1200 1500 with only +150. Look at the peak voltage and power:


Actually, his wattage values look legit compared to my XFX R9 290 set to the same 1200/1250 +150mv (assuming 50% power limit). I ran Valley for about 15~20 minutes to get these max results. Memory was at 1250 for the test, but it's normally 1400, which is when I opened GPU-Z. I use an EK full-cover block + backplate and a full custom watercooling setup. Temps are a bit higher than normal, but my fans were set to 800~900RPM and the pump was running at 2000RPM (4500RPM max), not ideal speeds for this type of intense load.



Edit: Forgot to mention my card's ASIC quality is 72.0%. Maybe you guys with the higher wattages have leakier GPUs? Also, ambient temp is 30C; water temps went up to 37C.


----------



## SPLWF

I got this for free, brand new. It's got Hynix memory and all; go figure, my other 290 is Elpida.


----------



## Aaron_Henderson

Hey guys, haven't had time to get my CPU overclock dialed back in since I redid a bunch of my system, but is 10500 around what my setup should be scoring in Firestrike with a mild overclock? I just wanted to get a baseline before I start adding more volts...CPU was just at 4.6GHz auto overclock and 290X was just bumped up a hair to 1150/1500. Haven't played with the card clocking yet, just set it to something I knew should be stable. I am hoping to get 1250 core with some volts, and put my CPU back to 5GHz, and overclock my RAM...hopefully be in the 11500-12000 range then?


----------



## tsm106

Quote:


> Originally Posted by *Kokin*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Chopper1591*
> 
> Yeah it's weird, right?
> Something is smelly here.
> 
> Look, this is a shot of mine from last week or so.
> 1200 1500 with only +150. Look at the peak voltage and power:
> 
> 
> 
> 
> Actually his wattage values look legit compared to my XFX R9 290 set to the same 1200/1250 +150mv (assuming 50% power limit). Ran Valley for about 15~20 minutes to get these max results. Memory was at 1250 for the test, but it's normally 1400, which was when I opened GPU-Z. I do use an EK full cover block + backplate and a full custom watercooling setup. Temps are a bit higher than normal but my fans were set to 800~900RPM and pump was running at 2000RPM (4500RPM max), not ideal speeds for this type of intense activity.
> 
> 
> 
> Edit: Forgot to mention my card's ASIC quality is 72.0%. Maybe for you guys with the higher wattages, you have more leaky GPUs? Also, ambient temp is 30C, water temps went up to 37C.

No... I disagree. Nothing in his screens or posts makes sense. OK, so you managed similar wattage out, but why are your temps so high compared to his? What are you doing wrong? BTW, the ASIC figures are not accurate, and we don't really know what they mean.

Anyway, he keeps claiming he hit 17K in 3DMark 11. That's not possible at the clocks he says are his maximum. Do you think 1200/1250 is capable of hitting 17K in 3DM11? There are so many inconsistencies that just shouldn't be there.

http://www.3dmark.com/3dm11/8786414


----------



## Sempre

Sorry for the late replies I've been busy the last couple of days.

Quote:


> Originally Posted by *Chopper1591*
> 
> Has this been asked before?
> What are your idle temps?
> 
> I would really create the custom fan curve and use it for everything. Your fan speed seems to be the problem after all.
> What you could do is make the curve start at ~40%? I don't know till which % the fan starts to be audible. Mine is pretty quiet around 46%.
> 
> Try some things out.
> 
> If none works, you have no option other than to increase airflow.
> Do you have a side intake fan?
> That sucks.
> 
> Have you tried with just the one card?
> 
> And... no warranty?


Idle temps are around ~57C with 30% fan speed, ~50C with 40% fan speed. By idle I mean browsing, not AFK.
I tried out a custom fan curve but found manual control better for light usage, since I don't expect sudden temp spikes like in gaming or watching videos/YouTube.
Are you using a reference cooler? Because mine starts being noticeable at 35%, and at 40% it's distracting. Edit: I just saw that you have a Tri-X, and that's a far better and quieter cooler, so no comparison here.
I removed the HDD cage but it didn't help much, and I don't plan on adding a side fan.
Got it used, no warranty.

Quote:


> Originally Posted by *LandonAaron*
> 
> This doesn't seem right to me. I only used the reference cooler for a few days before I put the card under water, but hitting 90C on a youtube video seems pretty "severe" as you said. First thing I would try is setting your fan back to the default "Auto" mode, and see how high the temps get and how high the fan turns up to. If the fan goes into the leaf blower mode, and the temps are still skyrocketing you have a problem, and if your card is overheating now it will most definitely be blowing up once you start 3d gaming. If its a new card I would RMA. If its a used card I would reapply thermal paste and re-seat the cooler.
> 
> I know you said disabling HW acceleration for youtube videos fixed the issue, but thats more of a workaround, as now you are just not using the card, but the cards thermal problems are still going to be there the next time you do.


Auto mode is useless; it only starts speeding up when it's too late. Just now the card was at 60C with the fan manually at 30%; I clicked auto mode and the fan went back to 20%.
When playing Battlefield at full load, the temps stay below 90C with no throttling* if I crank the fan really high, like 70-80%, so it's doable but really noisy.

I agree with you on the last part. It's not a fix, but at least it doesn't reach 90C now.

*By forcing the clock speed using RadeonPro Overdrive, because apparently even if the card doesn't reach 94C, the clock speed still fluctuates, causing stutter.

I'm done with the ref cooler. I'm thinking of either going the NZXT G10 route *OR* trying to find used full blocks and getting a Swiftech H220-X. Any thoughts?
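(For anyone building the custom curve discussed above: a fan curve is just a piecewise-linear map from GPU temperature to fan duty, which tools like Afterburner and TriXX interpolate between your breakpoints. A minimal sketch of that interpolation; the breakpoints here are made-up illustrations, not a recommended curve for any specific card:)

```python
# Sketch of a piecewise-linear fan curve, the way Afterburner/TriXX
# apply one. Breakpoints (temp C -> fan %) are illustrative only.
CURVE = [(40, 30), (60, 45), (75, 70), (90, 100)]

def fan_percent(temp_c, curve=CURVE):
    """Linearly interpolate fan duty for a given GPU temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]      # below the first point: hold minimum duty
    if temp_c >= curve[-1][0]:
        return curve[-1][1]     # past the last point: hold maximum duty
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(50))  # 37.5 -> halfway between the 40C and 60C points
```

Starting the curve at ~30-40% (as suggested above) just means raising the fan-% value of the first breakpoint; everything hotter scales up from there.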


----------



## Chopper1591

Quote:


> Originally Posted by *Kokin*
> 
> Actually his wattage values look legit compared to my XFX R9 290 set to the same 1200/1250 +150mv (assuming 50% power limit). Ran Valley for about 15~20 minutes to get these max results. Memory was at 1250 for the test, but it's normally 1400, which was when I opened GPU-Z. I do use an EK full cover block + backplate and a full custom watercooling setup. Temps are a bit higher than normal but my fans were set to 800~900RPM and pump was running at 2000RPM (4500RPM max), not ideal speeds for this type of intense activity.
> 
> 
> 
> Edit: Forgot to mention my card's ASIC quality is 72.0%. Maybe for you guys with the higher wattages, you have more leaky GPUs? Also, ambient temp is 30C, water temps went up to 37C.


We should compare apples to apples and do the same benches with the same clocks.
Compare scores and then look at the difference in wattage.

My card has an ASIC of 80.8%, so in theory it should use less power.
That is, if you believe what most people say about ASIC quality.

Cooling shouldn't matter for power draw, although it is known that a card can hold a clock with less voltage when kept cooler.

Here's Firestrike, 1200/1500 +100:


Spoiler: Warning: Spoiler!







And 1200/1300 +100 after ~25 minutes of Heaven:


Spoiler: Warning: Spoiler!







Quote:


> Originally Posted by *SPLWF*
> 
> I got this for free, brand new. It's got hynix memory and all, go figure my other 290 is Elpida.


Congrats.
I thought all tri-x 290s had hynix?

Who'd you rob for that btw?








Quote:


> Originally Posted by *Aaron_Henderson*
> 
> Hey guys, haven't had time to get my CPU overclock dialed back in since I redid a bunch of my system, but is 10500 around what my setup should be scoring in Firestrike with a mild overclock? I just wanted to get a baseline before I start adding more volts...CPU was just at 4.6GHz auto overclock and 290X was just bumped up a hair to 1150/1500. Haven't played with the card clocking yet, just set it to something I knew should be stable. I am hoping to get 1250 core with some volts, and put my CPU back to 5GHz, and overclock my RAM...hopefully be in the 11500-12000 range then?


I can't give advice on this.
Do you have shots with your settings and runs?


----------



## pdasterly

What games are you guys/gals playing? Alien Isolation runs great on the 290X; I get up to 130 FPS at 5760x1080 with CrossFire 290Xs (overclocked, of course, 1200/1600) on Ultra settings.


----------



## SPLWF

Quote:


> Originally Posted by *Chopper1591*
> 
> Congrats.
> I thought all tri-x 290s had hynix?
> 
> Who'd you rob for that btw?


I thought Tri-Xs were a mixed bag when it came to Hynix or Elpida? I could be wrong.

Fell off the back of a truck, lol.

My wife's company gives her $1000 every two years for a laptop, tablet, or desktop PC. We used it for PC parts this year.


----------



## battleaxe

Quote:


> Originally Posted by *Chopper1591*
> 
> We should compare apples to apples and do the same benches with the same clocks.
> Compare scores and then look at the difference in wattage.
> 
> My card has an ASIC of 80.8%, so in theory it should use less power.
> That is, if you believe what most people say about ASIC quality.
> 
> Cooling shouldn't matter for power draw, although it is known that a card can hold a clock with less voltage when kept cooler.




For reference to see what happens at higher core/voltages



3dMark11 score on 1200/1250mhz run



3dMark11 score at 1230mhz/1250mhz

http://www.3dmark.com/3dm11/9081080



3dMark11 score at 1240mhz/1250mhz

needed +187mv crashed when trying +200mv


----------



## Aaron_Henderson

Quote:


> Originally Posted by *Chopper1591*
> 
> I can't give advice on this.
> Do you have shots with your settings and runs?


I was just getting a quick baseline before I start clocking everything, so I didn't bother with any screens, but I will post some later, if I remember. Was just wondering if everything is as it should be before i start clocking...if my score is low for current settings (4.6GHz, 1150/1500 290X), then I would need to look after whatever is wrong first, is my reasoning. 10,500 in Firestrike seems around where I should be though, no? I'll get some screens...


----------



## Agent Smith1984

Quote:


> Originally Posted by *Aaron_Henderson*
> 
> I was just getting a quick baseline before I start clocking everything, so I didn't bother with any screens, but I will post some later, if I remember. Was just wondering if everything is as it should be before i start clocking...if my score is low for current settings (4.6GHz, 1150/1500 290X), then I would need to look after whatever is wrong first, is my reasoning. 10,500 in Firestrike seems around where I should be though, no? I'll get some screens...


I'm getting 9900 with a Thuban at 4GHz and a 290 at the same clocks you are running, so your score is dead on!


----------



## Aaron_Henderson

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm getting 9900 with thuban at 4GHz, and 290 @ the same clocks you are running, so your score is dead on!


Cool, that's all I needed to hear: that I was in the ballpark of where my system should be before I start doing some higher-volt runs. Thanks! I had some weird stuttering/usage spikes, but figured out that a setting in MSI Afterburner was causing it, and since sorting that out, everything has been running great. Oddly enough, the stuttering/usage spikes did not affect my Firestrike scores... which is a bummer, since I half expected a tiny boost in my score after I fixed it









EDIT - it's tempting to pick up a 3770K... but I will keep my 2500K for now, I suppose. Once I get everything dialed in again for 5GHz, I think it will be alright. But the extra points in benchmarks make the 3770K a want, if not a need







Plus, I highly doubt I'll get 5GHz out of a 3770K on my setup anyway. Wonder what 2600Ks/2700Ks are going for nowadays...


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> 
> 
> For reference to see what happens at higher core/voltages
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 3dMark11 score on 1200/1250mhz run
> 
> 
> 
> 3dMark11 score at 1230mhz/1250mhz
> 
> http://www.3dmark.com/3dm11/9081080
> 
> 
> 
> 3dMark11 score at 1240mhz/1250mhz
> 
> 
> 
> needed +187mv crashed when trying +200mv


Good job figuring out how much VDDC is needed for a given OC.


----------



## Performer81

Valley is a bad OC test. Try 3DMark or a demanding game like BF4. Somehow my clocks jump around like crazy in the Valley bench; in Heaven or 3DMark they are rock solid.


----------



## battleaxe

Quote:


> Originally Posted by *Performer81*
> 
> Valley is a bad OC test. Try 3DMark or a demanding game like BF4. Somehow my clocks jump around like crazy in the Valley bench; in Heaven or 3DMark they are rock solid.


Yeah, it was more to see what my max voltage is doing. Mine seems too low for some reason, which in turn seems to make my temps lower than they should be. My card is running really strange voltages, not like anyone else's card apparently. Has me a little stumped. I can't get over +200mv or it just crashes, so I'm guessing they did this on purpose in the BIOS, but I really don't know.


----------



## Spectre-

Quote:


> Originally Posted by *Performer81*
> 
> Valley is a bad OC test. Try 3DMark or a demanding game like BF4. Somehow my clocks jump around like crazy in the Valley bench; in Heaven or 3DMark they are rock solid.


The Metro LL benchmark is where your GPU will really be tested.


----------



## Regnitto

I just put my 290 under water.

Cooler Master Seidon 120XL bolted to the GPU with M3 20mm cap screws, and a 120mm Bygears 110CFM fan on the VRMs:








I haven't figured out my radiator mounting yet, so it's propped up on a box for the time being.


----------



## pshootr

Damn, you didn't waste any time, did you... lol







Enjoy.


----------



## Regnitto

Bought the card and cooler together from @bluedevil, who previously ran the same setup (only with a spot cool), intending to do this from the start. I hate seeing any temp over 70C in my rig, even if it's designed for it


----------



## Agenesis

Is the best 290X the Lightning? I didn't realize I had a few hundred dollars' worth of reward points on my Discover card, and now I've got the itch


----------



## rt123

Quote:


> Originally Posted by *Agenesis*
> 
> Is the best 290X the Lightning? I didn't realize I had a few hundred dollars' worth of reward points on my Discover card, and now I've got the itch


Yes the Lightning is the Best 290X.


----------



## Spectre-

Quote:


> Originally Posted by *Agenesis*
> 
> Is the best 290X the Lightning? I didn't realize I had a few hundred dollars' worth of reward points on my Discover card, and now I've got the itch


I would try my best to get rid of that itch and save those points till January


----------



## melodystyle2003

When I overclock my monitor's refresh rate from 60 to 80Hz, the memory clock at idle doesn't drop but runs continuously at 1300MHz.
Any idea if this can be fixed, and how?
Freshly formatted system, Win 8.1 x64, 14.9 drivers.


----------



## Kokin

Quote:


> Originally Posted by *Chopper1591*
> 
> Here's Firestrike, 1200/1500 +100:
> 
> 
> Spoiler: Warning: Spoiler!


Unfortunately it's AMD vs Intel here, so we can't do apples to apples. I'm kind of confused how I scored higher overall despite a much lower graphics/physics score.

Fire Strike @ 1200/1400, +100mv, 50% power limit


Spoiler: Warning: Spoiler!








http://www.3dmark.com/3dm/4953153
Quote:


> Originally Posted by *tsm106*
> 
> No... I disagree. Nothing in his screens or posts makes sense. Ok, so you managed similar wattage out, but why are your temps so high compared to his? What are you doing wrong? Btw, the asic figures are not accurate, and we don't really know what they mean.
> 
> Anyways, he keeps claiming he hit 17K in 3dm11. That's not possible at the clocks he says are his maximum. Do you think 1200/1250 is capable of hitting 17K in 3md11? There are so many inconsistencies that just shouldn't be there.
> 
> http://www.3dmark.com/3dm11/8786414


Don't get me wrong, I'm not defending him; I'm simply showing everyone that my power output at those settings is similar to his. People were raising pitchforks when he showed lower power usage, yet my GPU was outputting the same power with the same settings. If my R9 290 is supposed to output more power, then I would love to access that, since I'm not limited by temps.

I explained why my temps were so high: 30C ambient temp, and the fans and pump were set to almost minimum RPM. Realistic temps are in my screenshot above.
Quote:


> Originally Posted by *melodystyle2003*
> 
> When I overclock my monitor's refresh rate from 60 to 80Hz, the memory clock at idle doesn't drop but runs continuously at 1300MHz.
> Any idea if this can be fixed, and how?
> Freshly formatted system, Win 8.1 x64, 14.9 drivers.


What are you using to overclock your monitor? For those of us using the Qnix/X-Star, we lose memory downclocking if we don't use the standard timings in CRU. However, using the standard timings in CRU also means a higher pixel clock, which possibly meant lower overclocks for some folks.
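(The pixel-clock point can be made concrete: the dot clock is horizontal total × vertical total × refresh rate, where the totals include blanking, which is why standard vs reduced timings matter. A rough sketch; the CVT-style totals below are illustrative figures for a 2560x1440 panel, not exact Qnix timings:)

```python
# Why a higher refresh rate means a higher pixel clock (and can stop
# VRAM from downclocking at idle). Totals include blanking intervals;
# these CVT-style figures are illustrative, not exact Qnix timings.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

# 2560x1440: standard blanking at 60 Hz vs reduced blanking at 96 Hz
std = pixel_clock_mhz(2720, 1481, 60)
rb = pixel_clock_mhz(2608, 1445, 96)
print(f"{std:.1f} MHz vs {rb:.1f} MHz")  # ~241.7 vs ~361.8
```

Reduced blanking shrinks the totals, but the refresh jump still pushes the clock well past the 60 Hz figure, which is the trade-off being described above.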


----------



## rdr09

Quote:


> Originally Posted by *Kokin*
> 
> Unfortunately it's AMD vs Intel here, so can't do apples to apples. I'm kind of confused how I scored higher overall despite a much lower graphics/physics score.
> 
> Fire Strike @ 1200/1400, +100mv, 50% power limit
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/4953153
> Don't get me wrong, I'm not defending him, I'm simply showing everyone that my power output at those settings are similar to his. People were raising pitch forks when he showed lower power usage, yet my GPU was outputting the same power with the same settings. If my R9 290 is supposed to be outputting more power, then I would love to access that since I'm not limited by temps.
> 
> I explained why my temps were so high: 30C ambient temp, fans and pump were set to almost minimum RPM. Realistic temps on my screenshot above.
> What are you using to overclock your monitor? For people using the Qnix/X-Star, we lose downclocking memory if we don't use the standard timings in CRU. However, using the standard timings in CRU would also mean a higher pixel clock rate, which possibly meant lower overclocks for some folks.


You should be getting a graphics score similar to Chopper's, around 13,100. See his graph (fps); that's how it should look in FS. You might want to check with the manufacturer for a new BIOS or try the second BIOS.


----------



## Kokin

Quote:


> Originally Posted by *rdr09*
> 
> you should be getting similar score as chopper's in graphics - 13100. see his graph (fps), that's how it should look like in FS. you might want to check for a new bios with a manufacturer or try the second bios.


Both BIOSes on my 290 are the same, at least that's how it came. I'm guessing +100mv is not stable if it's supposed to be at 13K-ish.


----------



## Sempre

Quote:


> Originally Posted by *melodystyle2003*
> 
> When i overclock monitor's refresh rate from 60 to 80Hz, memory clocks on idle dont get lower but run continuously @1300mhz.
> Any idea if this can be fixed and how?
> Fresh formatted system, win 8.1 x64, 14.9 drivers.


On my Qnix I used Lawson's memory-downclockable timings from this thread, for the same issue you described.

But since you're using a 1080p Dell, I reckon you could download CRU and fiddle with the timings to try and make the memory downclock when idle.


----------



## rdr09

Quote:


> Originally Posted by *Kokin*
> 
> Both of the BIOS on my 290 are the same, at least it came like that. I'm guessing +100mv is not stable if it's supposed to be at 13K-ish.


Could be. Run at lower clocks using the same voltage, 1150 maybe, and lower memory by 50MHz. It should still score higher than your current 1200 run.

battleaxe was crashing with +200 @ 1240MHz core and he lowered it to +180 and it worked.


----------



## MystaMagoo

A little guidance and help needed

Got an R9 290 from eBay and
found out it had been used with two other R9s as a bitcoin miner.

It's a reference-design Sapphire with the BIOS switch.

Not sure if my crashing is related to the R9 or the PC, so I'm testing.
I flashed one BIOS with the latest from Sapphire in case it was the old heat/fan-speed problem.

Thinking of OC'ing if the card turns out to be good...
Can I flash any 290 BIOS?
I'm also going to get a spare PCI plate and remove the fins for extra cooling.

Thanks


----------



## Chopper1591

Quote:


> Originally Posted by *battleaxe*
> 
> 
> 
> For reference to see what happens at higher core/voltages
> 
> 
> 
> 3dMark11 score on 1200/1250mhz run
> 
> 
> 
> 3dMark11 score at 1230mhz/1250mhz
> 
> http://www.3dmark.com/3dm11/9081080
> 
> 
> 
> 3dMark11 score at 1240mhz/1250mhz
> 
> needed +187mv crashed when trying +200mv


Sorry, but I really need to be difficult.
Can you do a 1200/1250 run of 3dmark 2013?
I don't have 3dmark11, and that is old anyway. Somehow it doesn't like my Win8.1 install.

Show the results and the max values in gpu-z in a single screenshot.
Then we can compare.

But it is weird indeed...
The voltage on your card.

Quote:


> Originally Posted by *Kokin*
> 
> Unfortunately it's AMD vs Intel here, so can't do apples to apples. I'm kind of confused how I *scored higher overall* despite a much lower graphics/physics score.
> 
> Fire Strike @ 1200/1400, +100mv, 50% power limit
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/4953153
> Don't get me wrong, I'm not defending him, I'm simply showing everyone that my power output at those settings are similar to his. People were raising pitch forks when he showed lower power usage, yet my GPU was outputting the same power with the same settings. If my R9 290 is *supposed to be outputting more power*, then I would love to access that since I'm not limited by temps.
> 
> I explained why my temps were so high: 30C ambient temp, fans and pump were set to almost minimum RPM. Realistic temps on my screenshot above.
> What are you using to overclock your monitor? For people using the Qnix/X-Star, we lose downclocking memory if we don't use the standard timings in CRU. However, using the standard timings in CRU would also mean a higher pixel clock rate, which possibly meant lower overclocks for some folks.


I have no idea why your cards are different.
Mine is totally out of the box and shows well over 300 watts with a 1200/1500 clock, and with only +150.

The overall score is different with your build because you use Intel. I can't recall exactly why, but Fire Strike somehow prefers Intel and gives it a better Combined score, thus creating the higher overall.
But I wasn't aiming at that at all. I just want to compare graphics scores.
As I said earlier, the fps chart is weird... Are you sure it is stable?


----------



## melodystyle2003

Quote:


> Originally Posted by *Kokin*
> 
> What are you using to overclock your monitor? For people using the Qnix/X-Star, we lose downclocking memory if we don't use the standard timings in CRU. However, using the standard timings in CRU would also mean a higher pixel clock rate, which possibly meant lower overclocks for some folks.


Quote:


> Originally Posted by *Sempre*
> 
> On my Qnix i used Lawson's memory downclockable timings from this thread, for the same issue you described.
> 
> But since you're using a 1080P Dell I reckon you could download CRU and fiddle with the timings to try and make the memory downlclock when idle.


Guys, thanks for your replies.
I am using CRU, and the pixel clock is raised from 148.5 to 177.78 MHz in order to achieve the 80Hz.
Hm, so I need to find the appropriate memory timings to make it idle down? Any reference would be helpful; I will dig in myself too.
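For anyone curious where that 177.78 comes from: the pixel clock is just total horizontal pixels × total vertical lines × refresh rate. A rough Python sketch (the reduced-blanking totals below are assumptions for illustration; check the actual totals CRU shows for your mode):

```python
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz from the *total* timing (active + blanking)."""
    return h_total * v_total * refresh_hz / 1e6

# Standard CEA 1080p timing is 2200 x 1125 total:
print(pixel_clock_mhz(2200, 1125, 60))  # 148.5
print(pixel_clock_mhz(2200, 1125, 80))  # 198.0 - too high for many cards

# Reduced blanking (assumed totals) gets 80 Hz down near what CRU reports:
print(pixel_clock_mhz(2000, 1111, 80))  # 177.76, close to the 177.78 above
```

So tighter (non-standard) blanking keeps the pixel clock down, which is exactly the trade-off Kokin mentioned: standard timings mean a higher pixel clock and sometimes lower stable overclocks.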


----------



## battleaxe

Quote:


> Originally Posted by *Chopper1591*
> 
> Sorry, but I really need to be difficult.
> Can you do a 1200/1250 run of 3dmark 2013?
> I don't have 3dmark11, and that is old anyway. Somehow doesn't like my win8.1 install.
> 
> Show the results and the max values in gpu-z in a single screenshot.
> Then we can compare.
> 
> But it is weird indeed...
> The voltage on your card.
> I have no idea why your cards are different.
> Mine is totally out of the box and shows well over 300 watts with a 1200/1500 clock, and with only +150.
> 
> The overal score is different with your build because you use Intel. I can't recall why exactly, but FireStrike somehow prefers Intel and gives it a better Combined score, thus creating the higher overal.
> But I wasn't aiming at that at all. Just to compare graphics scores.
> Like said earlier, the fps chart is weird... You sure it is stable?


I'll try to get 3dmark2013 installed this weekend and do that.

Maybe mine is just unstable... IDK

I can play bf3 with these settings. I think bf4 too, but not totally sure. Have to try it again, don't play that title much.

Edit: I'm wondering if I have a funky bios. Might flash to see if it changes things.


----------



## Forceman

Quote:


> Originally Posted by *melodystyle2003*
> 
> Guys thanks for your replies.
> I am using CRU and pixel clock rate is elevated from 148.5 to 177.78 in order to achieve the 80Hz.
> Hm i need to find the appropriate mem timings in order to make it idle? Any reference would be helpful, i will dig in my self too.


If you just use the LCD Standard timings from the drop-down, it should work. I'm running 96Hz with LCD Standard and mine downclocks fine. Using different timings doesn't though.


----------



## Chopper1591

Quote:


> Originally Posted by *battleaxe*
> 
> I'll try to get 3dmark2013 installed this weekend and do that.
> 
> Maybe mine is just unstable... IDK
> 
> I can play bf3 with these settings. I think bf4 too, but not totally sure. Have to try it again, don't play that title much.
> 
> Edit: I'm wondering if I have a funky bios. Might flash to see if it changes things.


The second part of my response was actually for Kokin.
I didn't mean you with the question about stability.

I wouldn't bother changing to another BIOS.
Why risk it if you find your card working properly?

We can compare when you find time to install 3dmark 2013 though.
Just run Firestrike and post.

____________________________________________________________________________________________

Argh...
I can't wait until I have the funds to put this hot-head under water.
I just can't stand 80C on the VRM; 70C on the core is okay, btw.


----------



## tsm106

All these posts from fantastical claims, what a waste of time and bandwidth.


----------



## Chopper1591

Quote:


> Originally Posted by *tsm106*
> 
> All these posts from fantastical claims, what a waste of time and bandwidth.


Dude?


----------



## battleaxe

Well, I'm not trying to claim anything. At this point I'm trying to find out if something is wrong with my GPU or my PC causing said "fantastical claims". And I appreciate someone (Chopper) actually taking the time to try and help me. Which instead of accusing me of lying might have been a better way to approach this... like, your temps are so low; that's kinda weird, "I think there might be something wrong with your system"? Just an idea.

Anyway, back to the issue, I had a strange experience with Firestrike... at the end of the demo it would black screen. It did this even at stock clocks.

I saw you were running Heaven so I downloaded that and tried it. Some quite strange things going on with the core temp, with spikes to 146C. Might that indicate a software problem, or pump failure? I also re-installed my CCC drivers (after using DDU). These are the latest beta drivers for all these runs. I ran this as a continuous loop, never stopping Heaven. I just upped the voltage or clocks, then hit F9 to restart the bench; that way temps would not go back down. Took a screenshot at the end of each run to show max voltages and such.

There are spikes on the core temp graphs in the last couple of graphs. Those spikes go up to 146C, so I switched to "avg" for the core temp at that point so it would be clearer what is going on. I'm starting to wonder if the temps are a result of a software error or something. Still, the voltages look a bit low, no?

Heaven: 1200/1250 +175mv



Heaven: 1200/1250 +175mv showing Heaven settings



Heaven: 1230/1250 +187mv



Heaven: 1240/1250 +187



Heaven: 1245/1250 +193, notice spikes on core temp graph.



>>>

I tried 1250/1250 +193 but it black screened halfway through.


----------



## devilhead

battleaxe try to set 30hz at your monitor and test again







1250/1250 +193


----------



## trihy

I had spikes to 146 when using TriXX.

So it's a software/driver reporting problem.


----------



## battleaxe

Quote:


> Originally Posted by *devilhead*
> 
> battleaxe try to set 30hz at your monitor and test again
> 
> 
> 
> 
> 
> 
> 
> 1250/1250 +193


My monitor did not enjoy that. It started jumping like I just drank 12 beers.
Quote:


> Originally Posted by *trihy*
> 
> I had spikes to 146 when using trixxx.
> 
> So, it´s a software/driver report problem.


Thank you. That's a relief.

*Public apology:*

For anyone whom I may have offended in the last few days, I apologize. I let my blood pressure get a bit too high and said some things I ought not to have said. In short, I acted like a complete jerk, and I had a bit too much fun doing it...

For those of you who actually tried to help me instead of posting a bunch of crap about me lying (insert the other word here) I am going to personally look for your posts and give you reps whenever possible. This is a great forum and I did not mean to offend anyone else not directly involved. Again, I am sorry and I hope you can forgive me.

Moving on...


----------



## chiknnwatrmln

I saw your 3dm11 score a few pages back and I can say mine was similar.. http://www.3dmark.com/3dm11/7941320

Almost the same core clock (1245MHz), but my memory was a bit higher; I can't quite remember the exact clock though. Of course this is an old bench using older drivers, so that accounts for your graphics score being just about the same despite your lower mem clock. It was also on an absurd amount of voltage (+220mV, I think), and I would never run that high voltage daily.


----------



## battleaxe

Quote:


> Originally Posted by *trihy*
> 
> I had spikes to 146 when using trixxx.
> 
> So, it´s a software/driver report problem.


Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I saw your 3dm11 score a few pages back and I can say mine was similar.. http://www.3dmark.com/3dm11/7941320
> 
> Almost the same core clock (1245MHz) but my memory was a bit higher, can't quite remember what clock though. Of course this is an old bench user older drivers so that accounts for your graphics score being just about the same despite your lower mem clock. Of course this was on an absurd amount of voltage (+220mV I think) and I would never run that high voltage daily.


Well, that's actually some very good news. Thanks!

There's the possibility that my GPU isn't junk... that's always good.

Edit: How did you get it to +220mv may I ask?

wish mine could just get to +200mv...


----------



## Mugamat

Just made some modifications to my AIO WC... Combined a Corsair H55 (GPU) and a Cooler Master Seidon 120V (CPU) into one closed loop. Changed the tubing and fluid. Getting quite low temps now, even with the GPU OC'd. Here are some screens.

http://www.3dmark.com/3dm11/9088475




I badly need to change my CPU, I think...


----------



## Arizonian

Quote:


> Originally Posted by *bluedevil*
> 
> You can also remove me.


Got ya.









Quote:


> Originally Posted by *SPLWF*
> 
> I got this for free, brand new. It's got hynix memory and all, go figure my other 290 is Elpida.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats added








Quote:


> Originally Posted by *Regnitto*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> I just put my 290 under water.
> 
> Cooler Master Seidon 120XL bolted to gpu with m3-20mm cap screws and 120mm Bygears 110cfm on the VRMs:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I haven't Figured out my Radiator mounting yet, so it's propped up on a box for the time being.


Congrats - updated


----------



## Chopper1591

Quote:


> Originally Posted by *battleaxe*
> 
> Well, I'm not trying to claim anything. At this point I'm trying to find out if something is wrong with my GPU or my PC causing said "fantastical claims". And I appreciate someone (Chopper) actually taking the time to try and help me. Which instead of accusing me of lying might have been a better way to approach this... like, your temps are so low; that's kinda weird, "I think there might be something wrong with your system"? Just an idea.
> 
> Anyway, back to the issue, I had a strange experience with Firestrike... at the end of the demo it would black screen. It did this even at stock clocks.
> 
> I saw you were running Heaven so I downloaded that and tried it. Some quite strange things on the core temp going on with spikes to 146C. Might indicate a software problem, or pump failure? I also re-installed my CCC drivers too (after using DDU). These are the latest BETA drivers fro all these runs. I ran this as a continuous loop never stopping heaven. I just upped the voltage or clocks, then hit F9 to restart the Bench, that way temps would not go back down. Took a screen shot at the end of each run to show max voltages and such.
> 
> There are spikes on the core temp graphs on the last couple graphs. Those spikes go up to 146C, so I switched to "avg" for the core temp at that point so it would be more clear what is going on. I'm starting to wonder if the temps are not a result of a software error or something? Still the voltages look a bit low no?
> 
> Heaven: 1200/1250 +175mv
> 
> 
> 
> Heaven: 1200/1250 +175mv showing Heaven settings
> 
> 
> 
> Heaven: 1230/1250 +187mv
> 
> 
> 
> Heaven: 1240/1250 +187
> 
> 
> 
> Heaven: 1245/1250 +193, notice spikes on core temp graph.
> 
> 
> 
> >>>
> 
> I tried 1250/1250 +193 but it black screened halfway through.


Are you sure your voltage is even changing when you set it in Trixx?
It almost looks like you are at stock volts.
I really think your problem is in the voltage, thus the black-screening.

Here are two runs I just did.
Heaven 1200/1250 +100 (trying to duplicate your voltage, which was still lower).


+150


I will do some more runs in the afternoon.
Am a bit busy now.









Quote:


> Originally Posted by *Mugamat*
> 
> Just made some modification to my AIO WC... Combined Corsair H55 (GPU) and Cooler Master Seidon 120v (CPU) into the one closed loop. Changed tubing and fluid. Got quiet low temp now even on OC GPU. Here is some screens.
> 
> http://www.3dmark.com/3dm11/9088475
> 
> 
> 
> 
> I deadly need to change my CPU i think....


What do you mean? Why do you need to change that CPU so badly?
And... judging by your bench shots, your GPU could use some more cooling on the VRM.


----------



## Newbie2009

Too many pictures!









Anyway, thought I'd pop in to see if anyone picked up the £500 295X. What a nice deal. If I had just the one card, I'd be adding one for trifire.


----------



## Chopper1591

Quote:


> Originally Posted by *Newbie2009*
> 
> Too many pictures!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway thought I'd pop in to see if anyone picked up the £500 295x. What a nice deal. If I had just the one card I'd be adding for trifire.


Nice deal.

But personally I'd rather go with tri-fire 290(x).

Just did measurements with the multimeter as some people were talking about the 12v rail being low(software).
I got the same thing, so decided to put things to the test.

Sorry for the bad quality, video is 1080p but YouTube messed it up:


----------



## zpaf

After one year I decided to swap my Asus reference non-X for an X card.
Good-looking card.










One run at defaults.









http://www.3dmark.com/3dm/4947954

and another at max clocks.









http://www.3dmark.com/3dm/4948674


----------



## Mugamat

Quote:


> Originally Posted by *Chopper1591*
> 
> Are you sure your voltage is even changing when you set it in Trixx?
> It almost looks like you are at stock volts.
> I really think your problem is in the voltage, thus the black-screening.
> 
> Here are two runs I just did.
> Heaven 1200/1250 +100(trying to duplicate your voltage, which was still lower).
> 
> 
> +150
> 
> 
> I will do some more runs in the afternoon.
> Am a bit busy now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What do you mean? Why do you need to change that cpu so badly?
> And.. by looking at your bench shots, your gpu asks for some more cooling on the vrm.


I want to be able to OC the CPU... As far as I know, it's impossible to overclock the i5 4570.
The VRM temps are high because I set the exhaust from the radiator to blow straight at the GPU. It's still pretty acceptable; otherwise I can make the tubing a little longer and move the cooling to the top of the case.


----------



## Chopper1591

Quote:


> Originally Posted by *Mugamat*
> 
> I want to be able to OC CPU... As i know it`s impossible to overclock i5 4570.
> VRM temps high because i set exhaust from radiator straight to gpu. It`s still pretty acceseble, otherway i can make tubing a little longer and set cooling to top side of case.


I can't follow you.

You can add a fan to blow air over the vrm sink. Should get the temps down.

Why didn't you buy a K CPU in the first place? Isn't that more expensive?


----------



## Mugamat

Quote:


> Originally Posted by *Chopper1591*
> 
> I can't follow you.
> 
> You can add a fan to blow air over the vrm sink. Should get the temps down.
> 
> Why didn't you buy a K cpu in the first place? Isn't that more expensive.


There is a fan blowing on the VRM, but it blows the hot air that comes out of the radiator.


----------



## Chopper1591

Quote:


> Originally Posted by *Mugamat*
> 
> There is fan blowing on VRM. But he blows hot air that outcomes from radiator.


Can you post a photo of what you mean?

And is the air really that warm?
If so, then I think your radiator is struggling to keep up.

When I put my cpu under heavy stress with high voltage the air coming from my radiator is warmish at most.

Why don't you turn the radiator around then, so it blows the hot air out of the case?


----------



## Roboyto

Late replies as I have been busy the last 2 days.

Quote:



> Originally Posted by *battleaxe*
> 
> Yeah, it was more to see what my max voltage is doing. Seems mine is too low for some reason. Which in turn seems to make my temps lower than they should be. My card is running really strange voltages, not like anyone else's card apparently. Has me a little stumped. I can't get over +200mv though or it just crashes, so I'm guessing they did this on purpose with the BIOS, but I really don't know.


The max voltage is questionably low for the amount of offset voltage you are applying. This could explain why your temperatures overall are so low.

My XFX 290 that is fully watercooled looks like this at stock clocks of 980/1250

Max voltage reported at stock is 1.219V in FireStrike



Spoiler: Firestrike stock















Raising my core clock to 1200 and adjusting only the necessary +87mV my cards needs for 1200 core clock gives me this:

Core voltage to 1.313V in FireStrike



Spoiler: Firestrike 1200-1250 87mV















Here you can also see I am getting the 146C spikes as you and @trihy have mentioned. I can likely confirm it is a Trixx issue.

Quote:


> Originally Posted by *Chopper1591*
> 
> Like said earlier, the fps chart is weird... You sure it is stable?


I have seen irregularities with holding a steady clock rate, FPS dips, or low bench scores, and the power limit was the issue. I have applied my overclock settings and forgotten to increase the power limit; once the power limit is adjusted accordingly, everything goes back to normal.

Here are a couple runs of Firestrike on my XFX 290. The first at 1200/1250 with +50% power, and the second is the same clocks with 0% power. The overall score drops nearly 8% from the power limit alone.

First run: 1200/1250 +150mV +50% power



Second Run: 1200/1250 +150mV +0% Power



Quote:


> Originally Posted by *battleaxe*
> 
> Well, I'm not trying to claim anything. At this point I'm trying to find out if something is wrong with my GPU or my PC causing said "fantastical claims". And I appreciate someone (Chopper) actually taking the time to try and help me. Which instead of accusing me of lying might have been a better way to approach this... like, your temps are so low; that's kinda weird, "I think there might be something wrong with your system"? Just an idea.
> 
> Anyway, back to the issue, I had a strange experience with Firestrike... at the end of the demo it would black screen. It did this even at stock clocks.
> 
> I saw you were running Heaven so I downloaded that and tried it. Some quite strange things on the core temp going on with spikes to 146C. Might indicate a software problem, or pump failure? I also re-installed my CCC drivers too (after using DDU). These are the latest BETA drivers fro all these runs. I ran this as a continuous loop never stopping heaven. I just upped the voltage or clocks, then hit F9 to restart the Bench, that way temps would not go back down. Took a screen shot at the end of each run to show max voltages and such.
> 
> There are spikes on the core temp graphs on the last couple graphs. Those spikes go up to 146C, so I switched to "avg" for the core temp at that point so it would be more clear what is going on. I'm starting to wonder if the temps are not a result of a software error or something? Still the voltages look a bit low no?
> 
> Heaven: 1200/1250 +175mv
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Heaven: 1200/1250 +175mv showing Heaven settings
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Heaven: 1230/1250 +187mv
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Heaven: 1240/1250 +187
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Heaven: 1245/1250 +193, notice spikes on core temp graph.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> >>>
> 
> I tried 1250/1250 +193 but it black screened halfway through.


The voltages are still very low; however, *your Heaven scores are in line* with what I was seeing several months ago when I was benching the hell out of my XFX card. Take a look here and you can see what I mean.

http://www.overclock.net/t/1456279/honey-i-shrunk-the-ultra-tower-beast-my-journey-to-creating-a-more-compact-pc-with-an-r9-290/20_20#post_21847939

It also seems your 3DMark11 graphics scores are where they are supposed to be. The card must be functioning at the reported clock speeds.

Your graphics score at 1230/1250 was 17953.

My graphics score at 1230/1250 is 18036, a difference of 0.46%.
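That percentage is just the relative difference between the two graphics scores; a quick check:

```python
def pct_diff(a, b):
    """Relative difference of b vs a, in percent."""
    return abs(b - a) / a * 100

print(round(pct_diff(17953, 18036), 2))  # 0.46
```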



Spoiler: 3DMark11 1230/1250















Quote:


> Originally Posted by *battleaxe*
> 
> My monitor did not enjoy that. It started jumping like I just drank 12 beers.
> Thank you. That's a relief.
> 
> *Public apology:*
> 
> For anyone whom I may have offended in the last few days I apologize. I let my blood pressure get a bit too high and said some things I aught not have said. In short I acted like a complete jerk and I had a bit too much fun doing it...
> 
> For those of you who actually tried to help me instead of posting a bunch of crap about me lying (insert the other word here) I am going to personally look for your posts and give you reps whenever possible. This is a great forum and I did not mean to offend anyone else not directly involved. Again, I am sorry and I hope you can forgive me.
> 
> Moving on...


I would like to do the same and apologize for the way things ended up playing out.

I have not been active in this community for that long, especially when compared to some other folks. However, in the short time that I have been here, nearly all of my time has been spent in this thread or doing something pertaining to it. I don't know everything about these cards, but I have a pretty good idea of what temps, voltages, bench scores, etc. are supposed to look like. That being said, when you initially posted your figures, they appeared unreal.

Now that we have some screenshots with pertinent information we can try to get to the bottom of what is happening.



My fully watercooled XFX 290 peaks at 1.375V with +150mV applied compared to your 1.273V. That's a 100mV difference between the two, and will definitely cause considerably lower temperatures at same/similar clock rates.

Even with your lower voltage, your core temps are still particularly low being cooled by an AIO. Are you running push pull on your H80? If so, what fans are you using and what speed are you running them at? I would still like to see a pic of how your rig is setup, particularly how you rigged up the 120mm fan for VRM1.

Here is a run through Heaven with my AIO cooled PowerColor 290. I tried to mimic your peak voltage and the closest I got was 1.281V with +50mV, unfortunately this card will not make it to 1200 core at this voltage level so I had to settle for 1100/1250. This is not ideal for an exact comparison, but this is the best I could get and I'm more concerned with voltages presently.

My Antec 620 radiator has a fairly weak SilenX Effizio on it simply to keep it quiet since this is my HTPC. This was after 1 run through full screen, ultra/extreme, 1080P:

1100/1250 +50mV +50% Power

Max Core: 67C

Max VRM1: 62C

Max VRM2: 57C



Spoiler: 1100/1250 Heaven Ultra















And then after 5 more consecutive runs in 720P windowed the temps didn't move much with a gain of 1C on core/VRM1:



Spoiler: 6 consecutive heaven runs
















Your core temperatures, *at this low of a voltage*, could be attributed to a couple of powerful fans in push/pull, with your thicker radiator, when compared to my setup. 
Our VRM1 temperatures are nearly identical with a similar voltage.
My VRM2 difference is likely due to the confined space in the 250D and the utter lack of airflow to the back end of the card.

Can you give us the specifics for your GPU; make/model/board revision?

Do you know exactly what BIOS your card is running? Your BIOS could be the cause of your considerably lower voltages compared to everyone else. Would you upload the BIOS here or to TechPowerUp, so maybe someone could try flashing their card with it to see what happens?

I don't believe we have seen your card's max voltage at stock clocks/volts. It would be nice to see its baseline to compare to your overvolted settings.

Typically simple math will get you in the ballpark when you increase the voltage offset. As seen at the beginning of my post, my fully watercooled XFX 290 at stock peaks at 1.219V; Adding +87mV it peaks at 1.313V. This is an actual increase of .094V which is pretty close to the adjustment in Trixx of +87mV.

If we try this same simple math in reverse for your card then 1.273V - 193mV would yield a stock peak of ~1.08V which is pretty darn low. This is ~100mV, or more, lower than either of my 290s.
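That offset arithmetic is simple enough to sanity-check in a couple of lines (a rough sketch; real peaks also wobble a bit with load and sensor error):

```python
def expected_peak_v(stock_peak_v, offset_mv):
    # Ballpark: the overvolted peak is roughly stock peak + offset.
    return stock_peak_v + offset_mv / 1000.0

def implied_stock_peak_v(observed_peak_v, offset_mv):
    # Reverse: subtract the offset from the observed overvolted peak.
    return observed_peak_v - offset_mv / 1000.0

print(expected_peak_v(1.219, 87))        # ~1.306 V (1.313 V was observed)
print(implied_stock_peak_v(1.273, 193))  # ~1.080 V, oddly low for a 290
```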

I tried using AfterBurner to undervolt my PowerColor 290 and it would do nothing but blackscreen idling at the desktop, which required a hard reboot by switching my PSU on/off; it didn't even want to idle with -10mV.

After tinkering around with the clocks/voltages on my PowerColor 290, I started to see a slight variation in what maximum voltage was recorded in GPU-Z. A variation of .008V-.016V in either direction depending on how many benches I ran and if the PC had been rebooted recently.

This made me think that it is more important to observe the average voltage rather than a brief high spike, so we can more accurately compare our voltages and temperatures. (Yes, I do have Force Constant Voltage checked in the settings.)
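Rather than juggling three GPU-Z windows, you can also tick "Log to file" in GPU-Z and average the voltage column afterwards. A minimal sketch, assuming a CSV-style log with a `VDDC [V]` column (the exact header name varies by GPU-Z version and may carry padding spaces, so check your log's first line):

```python
import csv

def avg_sensor(path, column="VDDC [V]"):
    """Average one numeric sensor column from a GPU-Z style log file."""
    values = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            cell = (row.get(column) or "").strip()
            try:
                values.append(float(cell))
            except ValueError:
                continue  # skip blank or malformed rows
    return sum(values) / len(values) if values else None
```

For example, a log with readings of 1.20 V and 1.30 V averages to 1.25 V; run it over a full bench session to get the figure to compare.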

I finally matched your peak voltage of 1.273V for 5 consecutive runs; The peak spiked to 1.297V after I closed the bench for some reason.

I wanted to see min/max/avg values simultaneously so I opened up 3 instances of GPU-Z and then immediately launched Heaven.

Unigine Heaven - 900P Windowed - Ultra/Extreme Settings - 5 Consecutive Runs

1120/1250 +68mV +50% Power

Max Core: 71C

Max VRM1: 66C

Max VRM2: 60C

The max core temp will have to be noted by the Trixx panel due to the 146C logging error, which wasn't the only error I noticed. There was also 0C minimum core temp and 0 MHz RAM speed.



Spoiler: Heaven 5 Consecutive Runs 1.273V Peak

















Spoiler: 5 run average

















Spoiler: 5 run max

















Spoiler: 5 run min















I ended up with an average VDDC of 1.207V for the 5 runs.

Maybe if you ran a similar test to find an average voltage, as well as recording your max voltage at stock settings, we can get to the bottom of the mystery that is your GPU...?


----------



## battleaxe

Quote:


> Originally Posted by *Roboyto*
> 
> The max voltage is questionably low for the amount of offset voltage you are applying. This could explain why your temperatures overall are so low.


Okay, this was kinda fun.

Ran 3 instances of GPU-Z as suggested showing min/max and avg. Hope this shows something tangible.



BIOS: Hopefully it attaches okay. It's just in a zipped folder...

Hawaii-BIOS-legacy.zip 99k .zip file


Lots of pics and graphs inside.


Spoiler: Warning: Spoiler!



Stock Sets:



1050/1250/+50



1100/1250/+100 I didn't try to test for stability or min voltages, just set it to something I was pretty sure it would pass on. I didn't want to get a blackscreen in the middle of a test. Same for all below except last few tests...



1200/1250/+170



1240/1250/+193



1250/1250/+200... figured I might as well go for it here. It held +200mv. First time I've ever seen it do this. Usually black-screens...



After this I tried 1255/1275/+200mv and it black screened immediately. End of testing. Heaven tests on each round. Non stop test, continuous loop.

Pics of the rig... (Disclaimer) this is ghetto as all get out, okay... it's ugly... butt ugly. This is not my primary rig but a mining rig/gaming rig; this 290 started out its life as a scrypt miner. A buddy of mine bought an MSI Gaming 290 just like it at the same time and his temps were always higher than mine. Guess we know why now...

Edit: Oh, and there is a single fan on the h80 radiator in "pull" mode only, it is pulling the air from where you see to the other side... toward the front of the inside of the case...

I used the stock H80 120mm fan that came with it. They move quite a bit of air, yes. They are not silent, but not overly noisy either, IMO. I think they do 1400rpm until the CPU revs up, then they go up to around 1800rpm. This fan on the radiator is hooked to the fan controller of my case (Enthoo Pro), which is tethered to the CPU fan header on the mobo.

So when the CPU fan spools up the GPU fan also spools up. Seems to work well since a load on the CPU (common in games) will push the fans up on the 290 also.

There is about a four inch gap to the front panel (you cannot see this as its behind the radiator shown) where two 120mm fans pull the hot air out of the case.

There are several other fans pushing air into the case from above and behind the Scythe Mine CPU cooler, and an overall positive air pressure inside the case. It vents heat pretty well. All hot air from the 290 comes out the front of the case. The fan strapped to VRM1 is a 120mm Corsair fan, the same kind that comes stock with the H80 water cooler and identical to the one pulling air through the H80's radiator.









Quote:


> Originally Posted by *Chopper1591*
> 
> Nice deal.
> 
> But personally I'd rather go with tri-fire 290(x).
> 
> Just did measurements with the multimeter as some people were talking about the 12v rail being low(software).
> I got the same thing, so decided to put things to the test.
> 
> Sorry for the bad quality, video is 1080p but YouTube messed it up:


Seeing this makes me wonder if GPU-Z is even accurate at all?


----------



## Aaron_Henderson

Turns out my 290X is a bit of a dud clocker...doesn't like to go more than 1150 on the core







Got it clocked at 1150/1500, and my 2500K sitting pretty at 5GHz again, but my scores aren't quite there yet... I forgot my RAM was at 1333MHz for this run, though... guess 11000 in Firestrike might be out of reach with my setup as-is, suicide runs excluded. I could probably do a run at 5.2+GHz, but the 290X just won't go much further without artifacts. Might have to look into getting it a block eventually.


----------



## cephelix

Ok, I got a question, since the last few pages have been about benches and whatnot. In 3DMark, when overclocking my 290, it gets through Firestrike, albeit stuttering, but in the earlier tests it artifacts heavily and sometimes crashes. That means my OC is unstable, right? Besides 3DMark, what's another good test? Heaven? Valley? I don't feel like running the whole suite of tests in 3DMark every time I increase clocks initially; of course, for final clocks I would.


----------



## pdasterly

Quote:


> Originally Posted by *cephelix*
> 
> Ok i got a question since the last few pages have been about benches and whatnot.in 3dmark, when overclocking my 290, it gets through firestrike albeit stuttering but in the earlier tests, it artifacts heavily and sometimes crashes.that means that my oc is unstable right? Besides 3d mark, what another good test? Heaven? Valley? Since i don't feel like running the whole suite of tests in 3dmark every time i increase clocks initially. Of course for final clocks i would.


My Firestrike stutters also until I reset the machine. Catzilla is another option; I actually went back to OCCT for my final OC.


----------



## cephelix

Quote:


> Originally Posted by *pdasterly*
> 
> My firestrike stutters also until I reset machine. catzilla, I actually went back to occt for final oc


So running Catzilla at 720p is fine? The last time I tried, any higher resolution required buying the license.


----------



## aaroc

Quote:


> Originally Posted by *cephelix*
> 
> so running catzilla at 720p is fine? Since the last time i tried, any higher resolution required buying the license


When I tested, Catzilla at 720p did not stress the PC like 1440p did (higher temps).


----------



## chiknnwatrmln

IMO use whatever games you play as a stability test. I mean, if that's what you're using your rig for and they run stable then that should be fine.

BF4 is a pretty good stability test, although it sometimes crashes on its own. Skyrim with ENB and various other mods will give your GPU a workout too. Either Metro game will also suffice.


----------



## tsm106

Prime and FSU work pretty darn good for me.


----------



## cephelix

Quote:


> Originally Posted by *aaroc*
> 
> When I tested, Catzilla 720p did not stress the PC like 1440p (higher Temps)..


Quote:


> Originally Posted by *chiknnwatrmln*
> 
> IMO use whatever games you play as a stability test. I mean, if that's what you're using your rig for and they run stable then that should be fine.
> 
> BF4 is a pretty good stability test, although it sometimes crashes on its own. Skyrim with ENB and various other mods will give your GPU a workout too. Either Metro game will also suffice.


Quote:


> Originally Posted by *tsm106*
> 
> Prime and FSU work pretty darn good for me.


Thanks guys, that's what I thought about Catzilla too... also, do any of the games come as standalone benchmarks? The games I play are mainly RPGs: Witcher, Mass Effect, AC, and the like.


----------



## Chopper1591

Quote:


> Originally Posted by *battleaxe*
> 
> Okay, this was kinda fun.
> 
> Ran 3 instances of GPU-Z as suggested showing min/max and avg. Hope this shows something tangible.
> 
> 
> 
> BIOS: Hopefully it attaches okay. Its just in a zipped folder...
> 
> Hawaii-BIOS-legacy.zip 99k .zip file
> 
> 
> Lots of pics and graphs inside.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Stock Sets:
> 
> 
> 
> 1050/1250/+50
> 
> 
> 
> 1100/1250/+100 I didn't try to test for stability or min voltages, just set it to something I was pretty sure it would pass on. I didn't want to get a blackscreen in the middle of a test. Same for all below except last few tests...
> 
> 
> 
> 1200/1250/+170
> 
> 
> 
> 1240/1250/+193
> 
> 
> 
> 1250/1250/+200... figured I might as well go for it here. It held +200mv. First time I've ever seen it do this. Usually black-screens...
> 
> 
> 
> After this I tried 1255/1275/+200mv and it black screened immediately. End of testing. Heaven tests on each round. Non stop test, continuous loop.
> 
> Pics of the rig... (Disclaimer) this is Ghetto as all get out okay... its ugly... butt ugly. This is not my primary rig but a mining rig/gaming rig, this 290 started out its life as a script miner. A buddy of mine bought an MSI Gaming 290 just like it at the same time and his temps were always higher than mine. Guess we know why now...
> 
> Edit: Oh, and there is a single fan on the h80 radiator in "pull" mode only, it is pulling the air from where you see to the other side... toward the front of the inside of the case...
> 
> I used the stock h80 120mm fan that came with it. They move quite a bit of air yes. They are not silent, but not overly noisy either IMO. I think they do 1400rpm until the cpu revs up then they go up to around 1800rmp. This fan on the radiator is hooked to the fan controller of my case (Enthoo Pro), which is tethered to the CPU fan out on the MOBU.
> 
> So when the CPU fan spools up the GPU fan also spools up. Seems to work well since a load on the CPU (common in games) will push the fans up on the 290 also.
> 
> There is about a four inch gap to the front panel (you cannot see this as its behind the radiator shown) where two 120mm fans pull the hot air out of the case.
> 
> There are several other fans pushing air into the case from above and behind the Skythe Mine CPU cooler above, and an overall positive air pressure inside case. It vents heat pretty well. All hot air from the 290 comes out the front of the case. The fan strapped to the VRM1 is a 120mm Corsair fan, the same kind that comes stock with the h80 water cooler and identical to the one pulling air through the H20 cooler.
> 
> 
> 
> 
> 
> 
> 
> 
> Seeing this makes me wonder if GPU-Z is even accurate at all?


Food...

That is something I can work with.
When I find the time I will duplicate most settings and post results to you.
Quote:


> Originally Posted by *cephelix*
> 
> Ok i got a question since the last few pages have been about benches and whatnot.in 3dmark, when overclocking my 290, it gets through firestrike albeit stuttering but in the earlier tests, it artifacts heavily and sometimes crashes.that means that my oc is unstable right? Besides 3d mark, what another good test? Heaven? Valley? Since i don't feel like running the whole suite of tests in 3dmark every time i increase clocks initially. Of course for final clocks i would.


Could be memory overclock.
Which clocks were you running and at what voltage?

Artifacts are always a sign of instability.

Have to agree with the other guys on this.
Benchmarks are not really stability tests IMO.
Do what you normally do with your system. And work on making that stable.

Benchmarks are nice to compare various clocks...
I use them to compare the performance gained versus the voltage (thus wattage) needed, to find the sweet spot.

You could always buy 3DMark through Steam; it's only 25 euros.


----------



## cephelix

Quote:


> Originally Posted by *Chopper1591*
> 
> Food...
> 
> That is something I can work with.
> When I find the time I will duplicate most settings and post results to you.
> Could be memory overclock.
> Which clocks were you running and at what voltage?
> 
> Artifacts is Always instability.
> 
> Have to agree with the other guys on this.
> Benchmarks are not really stability tests IMO.
> Do what you normally do with your system. And work on making that stable.
> 
> Benchmarks are nice to compare various clocks...
> I use it to compare the gains in performance versus the need in voltage(thus wattage) to find the sweetspot.
> 
> You could always buy 3dmark through steam, it's only 25 euro's.


Memory clocks were stock. I can't remember exactly now, as I reset my system back to stock when I got my 4790K, and the data is at home. Waiting on a few WC components before I start overclocking again.


----------



## Performer81

@Battleaxe
That average voltage at +170mV (1.1V) is less than my average stock voltage here at +0 and 1050MHz with my 290X PCS+. And I need that voltage.








But maybe measurements between different PCBs can't be compared.


----------



## Chopper1591

Quote:


> Originally Posted by *Performer81*
> 
> @Battleaxe
> That average voltage @+170mv (1,1V) is less than my average stock voltage here @+0 and 1050MHZ with my 290x pcs+. And i need that voltage
> 
> 
> 
> 
> 
> 
> 
> 
> But maybe the measuring between different PCB cannot ber compared.


No, you're right...

Somehow his voltage is a good deal lower.
I average higher than 1.3V with +150.

That is with the default bios.


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> Okay, this was kinda fun.
> 
> Ran 3 instances of GPU-Z as suggested showing min/max and avg. Hope this shows something tangible.
> 
> 
> 
> BIOS: Hopefully it attaches okay. Its just in a zipped folder...
> 
> Hawaii-BIOS-legacy.zip 99k .zip file
> 
> 
> Lots of pics and graphs inside.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Stock Sets:
> 
> 
> 
> 1050/1250/+50
> 
> 
> 
> 1100/1250/+100 I didn't try to test for stability or min voltages, just set it to something I was pretty sure it would pass on. I didn't want to get a blackscreen in the middle of a test. Same for all below except last few tests...
> 
> 
> 
> 1200/1250/+170
> 
> 
> 
> 1240/1250/+193
> 
> 
> 
> 1250/1250/+200... figured I might as well go for it here. It held +200mv. First time I've ever seen it do this. Usually black-screens...
> 
> 
> 
> After this I tried 1255/1275/+200mv and it black screened immediately. End of testing. Heaven tests on each round. Non stop test, continuous loop.
> 
> Pics of the rig... (Disclaimer) this is Ghetto as all get out okay... its ugly... butt ugly. This is not my primary rig but a mining rig/gaming rig, this 290 started out its life as a script miner. A buddy of mine bought an MSI Gaming 290 just like it at the same time and his temps were always higher than mine. Guess we know why now...
> 
> Edit: Oh, and there is a single fan on the h80 radiator in "pull" mode only, it is pulling the air from where you see to the other side... toward the front of the inside of the case...
> 
> I used the stock h80 120mm fan that came with it. They move quite a bit of air yes. They are not silent, but not overly noisy either IMO. I think they do 1400rpm until the cpu revs up then they go up to around 1800rmp. This fan on the radiator is hooked to the fan controller of my case (Enthoo Pro), which is tethered to the CPU fan out on the MOBU.
> 
> So when the CPU fan spools up the GPU fan also spools up. Seems to work well since a load on the CPU (common in games) will push the fans up on the 290 also.
> 
> There is about a four inch gap to the front panel (you cannot see this as its behind the radiator shown) where two 120mm fans pull the hot air out of the case.
> 
> There are several other fans pushing air into the case from above and behind the Skythe Mine CPU cooler above, and an overall positive air pressure inside case. It vents heat pretty well. All hot air from the 290 comes out the front of the case. The fan strapped to the VRM1 is a 120mm Corsair fan, the same kind that comes stock with the h80 water cooler and identical to the one pulling air through the H20 cooler.
> 
> 
> 
> 
> 
> 
> 
> 
> Seeing this makes me wonder if GPU-Z is even accurate at all?


it may be inaccurate. i get 1.32v max in Heaven at +180. check your 12v reading . . . it is low as well.


----------



## battleaxe

Quote:


> Originally Posted by *Chopper1591*
> 
> Nice deal.
> 
> But personally I'd rather go with tri-fire 290(x).
> 
> Just did measurements with the multimeter as some people were talking about the 12v rail being low(software).
> I got the same thing, so decided to put things to the test.
> 
> Sorry for the bad quality, video is 1080p but YouTube messed it up:


Quote:


> Originally Posted by *rdr09*
> 
> it may be inaccurate. i get 1.32v max in Heaven at +180. check your 12v reading . . . it is low as well.


Chopper already tested and found that GPU-Z does not record the correct voltage readings. He measured with a DMM. So I would assume that's the issue. But it may also indicate other readings are not right either, like my voltages and temps. Might be completely false altogether. I have no idea how to find out though.


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> Chopper already tested and found that GPU-Z does not record the correct voltage readings. He measured with a DMM. So I would assume that's the issue. But it may also indicate other readings are not right either, like my voltages and temps. Might be completely false altogether. I have no idea how to find out though.


My bad, I saw that. You can use HWiNFO64 too... here is a +180 run in Heaven.


----------



## bond32

Try this program: http://www.hwinfo.com/download.php

It tends to be the most accurate I have found.

Edit: nvm you have it already.


----------



## battleaxe

Quote:


> Originally Posted by *bond32*
> 
> Try this program: http://www.hwinfo.com/download.php
> 
> It tends to be the most accurate I have found.
> 
> Edit: nvm you have it already.


I tried to flash a different BIOS on the UEFI switch but it wouldn't take. Kept black screening during the flash. Had to hard reset each time. Ran that HW info test. Showing similar results...

I think AMD must have set this BIOS to do this on purpose. Maybe a certain ASIC quality indicates something, so they flash this BIOS on those cards. I wish I could get a BIOS to take so I could test out some different sets, but it may not be possible because of the card's limitations? I'm not at all sure of anything here.


----------



## arrow0309

Got a brand new *EKWB* + backplate and a compact dual 240x60 mm custom liquid cooling on my CM 690 III








Huge improvement on the 290 @1150/1475: ~30°C down on the GPU and over 40°C lower VRM temps








Very happy with my new build















Some more pics here:

http://www.overclock.net/t/294838/the-cooler-master-690-club/19550#post_23250478


----------



## tsm106

^^Nice, and shiny all copper. Btw, if it's not already sealed you might want to clear coat that rad before it changes colors on you.


----------



## arrow0309

Naaah, it won't last forever
Next year I'm gonna switch to a new Enthoo Primo build, with new rads too
For now, let it shine


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> it may be inaccurate. i get 1.32v max in Heaven at +180. check your 12v reading . . . it is low as well.


Yep, we got that sorted out.

Should be able to measure gpu voltage as well, right?
But I am a bit scared to mess things up.
I don't know if the Tri-X has dedicated measuring points, like some motherboards do.
Quote:


> Originally Posted by *battleaxe*
> 
> Chopper already tested and found that GPU-Z does not record the correct voltage readings. He measured with a DMM. So I would assume that's the issue. But it may also indicate other readings are not right either, like my voltages and temps. Might be completely false altogether. I have no idea how to find out though.


If we go that route...
Then what is the point in monitoring at all?
Temp sensors probably read on the high side, to keep things safe; if they're inaccurate, real temps are most likely lower.

Seems like GPU-Z is to blame. HWiNFO64 shows more accurate readings (will have to test more extensively, though).

Measuring temps is better done with an IR gun, btw.
They are pretty cheap actually.
Quote:


> Originally Posted by *battleaxe*
> 
> I tried to flash a different BIOS on the UEFI switch but it wouldn't take. Kept black screening during the flash. Had to hard reset each time. Ran that HW info test. Showing similar results...
> 
> I think AMD must have set this BIOS to do this on purpose. Maybe a certain ASIC quality indicates something so they flash this BIOS on these types of cards. I wish I could get a BIOS to take on it so I could test out some different sets, but it may not be possible because of the cards limitations? I'm not at all sure of anything here.


Have you flashed GPUs before?

Want to make sure you did it right.
Did atiflash complete?
Quote:


> Originally Posted by *arrow0309*
> 
> Got a bran new *EKWB* + backplate and a compact dual 240x60 mm custom liquid cooling on my CM 690 III
> 
> 
> 
> 
> 
> 
> 
> 
> Huge improvement on the 290 @1150/1475, ~ 30°C down on the gpu and over 40° lower vrm temps
> 
> 
> 
> 
> 
> 
> 
> 
> Very happy with my new build
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Some more pics here:
> 
> http://www.overclock.net/t/294838/the-cooler-master-690-club/19550#post_23250478


Ah man!
Keep that stuff away from here....










Spoiler: Warning: Spoiler!



I really can't wait to put mine under water. Will have the money in a couple of weeks.
Already have an extra 140mm radiator (to accompany my 360), the Fujipoly Ultra VRM pad, and a 140mm Apache fan on the way.
Now the cash for the EK block and AC Backplate.


Quote:


> Originally Posted by *tsm106*
> 
> ^^Nice, and shiny all copper. Btw, if it's not already sealed you might want to clear coat that rad before it changes colors on you.


True.
Copper will discolor pretty quickly when exposed to oxygen.
Coating would be a good idea.
Will remember that when I ever get something shiny.


----------



## battleaxe

Quote:


> Originally Posted by *Chopper1591*
> 
> Yep, we got that sorted out.
> 
> Have you flashed gpu's before?
> 
> Want to make sure you did it right.
> Did atiflash complete?


Not AMD; I've flashed several Nvidia cards though... so quite possible I did it wrong.

It did not complete. It black-screened before finishing, so pretty sure it did not take at all.

GPU-Z and HWiNFO were showing almost identical voltages. I just didn't have GPU-Z set to show max voltages as shown in HWiNFO; forgot to do that...

So if one is wrong, then they both are, which seems unlikely. I'll try to flash again with some instructions I found that differ from the ones on OCN.


----------



## arrow0309

Quote:


> Originally Posted by *arrow0309*
> 
> Naaah, it won't last forever
> Next year I'm gonna switch to a new Enthoo Primo build and with new rads too
> For now, let it shine


Quote:


> Originally Posted by *Chopper1591*
> 
> Quote:
> 
> 
> 
> Originally Posted by *arrow0309*
> 
> Got a bran new *EKWB* + backplate and a compact dual 240x60 mm custom liquid cooling on my CM 690 III
> 
> 
> 
> 
> 
> 
> 
> 
> Huge improvement on the 290 @1150/1475, ~ 30°C down on the gpu and over 40° lower vrm temps
> 
> 
> 
> 
> 
> 
> 
> 
> Very happy with my new build
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Some more pics here:
> 
> http://www.overclock.net/t/294838/the-cooler-master-690-club/19550#post_23250478
> 
> 
> 
> Ah men!
> Keep that stuff away from here....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I really can't wait to put mine under water. Will have the money in a couple of weeks.
> Already have an extra 140mm radiator(to company my 360), the Fujipoly Ultra vrm pad, 140mm Apache fan on the way.
> Now the cash for the EK block and AC Backplate.
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> ^^Nice, and shiny all copper. Btw, if it's not already sealed you might want to clear coat that rad before it changes colors on you.
> 
> Click to expand...
> 
> True.
> Copper will discolor pretty quickly when exposed to oxygen.
> Coating would be a good idea.
> Will remember that when I ever get something shiny.
Click to expand...

Forgot to mention, the copper rad is a Coolgate CG-240 CuP and it already has a transparent factory coating


----------



## eperelez

Hello, I'm thinking of adding a 2nd R9 290X to my sig rig. Is my power supply enough for this? Also, is my CPU strong enough, or should I think of moving to an i7 build? Thanks in advance and +rep for all who respond.


----------



## Agenesis

Yikes, I got my 290X Lightning in today, and at 1150 core w/ +200 I'm pulling 710W from the wall running the OCCT GPU test alone, with the CPU not loaded. I realize this doesn't translate to typical gaming power consumption, but it's still more than I've seen from any GPU, even the GTX 480.

Temps are pretty good, topping out at 76C. This card seems capable of overclocking to 1150~1200 without producing any errors on the OCCT error-checking test. How does it rank for those familiar with the average clocks? Good? Bad? Decent?


----------



## Roboyto

Quote:


> Originally Posted by *eperelez*
> 
> Hello, I'm thinking of adding a 2nd R9 290X to my sig rig. Is my powers supply enough for this? Also, is my CPU strong enough, or should I think of moving to an i7 build? Thanks in advance and +rep for all who respond.


850 could be enough for your build depending on what kind of clocks you are running on everything.

A heavily overclocked 83XX CPU can draw ~300W alone. R9 290(X) can be in the same ballpark when overclocked and under load.

That is a solid PSU, but depending on its age, and if you're looking to run max clocks on all your components, then you may want to consider something in the 1kW range.

The FX chip with a solid clock shouldn't have trouble keeping up with the pair of 290s. I haven't seen anything in the thread stating there is an issue, but some others with first hand experience may chime in.

Now wouldn't be a bad time for an Intel upgrade, especially if you're near a Microcenter, which has the 4790K for $250 when bundled with a motherboard.
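As a rough sketch of that sizing logic (the per-component figures are ballpark assumptions from the numbers above, not measurements):

```python
# Ballpark worst-case system draw for a CrossFire 290X build.
# All figures are assumed estimates; measure at the wall for real numbers.
loads_w = {
    "FX-83xx CPU, heavy OC": 300,
    "R9 290X #1, OC under load": 300,
    "R9 290X #2, OC under load": 300,
    "board, RAM, drives, fans": 100,
}
total = sum(loads_w.values())
print(f"~{total} W worst case -> a quality PSU in the 1 kW range")
```

At stock clocks the CPU and GPU entries drop considerably, which is why an 850W unit can be borderline-workable without overclocks.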


----------



## bond32

Quote:


> Originally Posted by *Agenesis*
> 
> Yikes I got my 290x lightning in today and at 1150 core w/ +200 I'm pulling 710w from the wall running occt gpu only with the cpu not loaded. I realize this doesn't translate to the typical gaming power consumption, but it's still more than I've seen with any gpu - even the gtx 480.
> 
> Temps are pretty good topping out at 76c. This card seems capable of overclocking to 1150~1200 without producing any errors on the occt error checking test. How does it rank for those familiar with the average clocks? Good? Bad? Decent?


It's a good card, but honestly I haven't seen anything impressive over reference. It has extremely good power delivery, yet nothing mind-blowing all things considered. Those clocks you listed are pretty commonplace for reference boards.


----------



## Roboyto

Quote:


> Originally Posted by *Agenesis*
> 
> Yikes I got my 290x lightning in today and at 1150 core w/ +200 I'm pulling 710w from the wall running occt gpu only with the cpu not loaded. I realize this doesn't translate to the typical gaming power consumption, but it's still more than I've seen with any gpu - even the gtx 480.
> 
> Temps are pretty good topping out at 76c. This card seems capable of overclocking to 1150~1200 without producing any errors on the occt error checking test. How does it rank for those familiar with the average clocks? Good? Bad? Decent?


Most cards don't need +200mV to make it to 1150-1200. If you haven't spent time overclocking a little at a time, gradually increasing clocks and volts, I would suggest doing so. I don't see the point of running high volts unless they're necessary for a given overclock.

Yes, the card can handle +200mV, but it is a lot of stress, especially for an air-cooled card, even if it's a Lightning.

My good card makes 1200/1500 with +87mV.

My weaker card is only in the 1150 range at around +100mV. This card isn't pushed hard because it only drives 1080p on my TV. It is typically run at 1075/1375 +37mV.

One other thing: to find your max core clock you should leave the RAM alone initially. RAM speed contributes a small fraction of performance on these cards compared to core.

If you've got other questions or concerns see the link in my sig for the "Need To Know" post.
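That step-at-a-time approach can be written down as a simple test ladder. The starting point and step size here are illustrative, not Roboyto's exact numbers:

```python
# Generate a gradual core-clock test ladder, leaving RAM at stock until
# the max stable core is found (step sizes are illustrative).
def core_ladder(start=1050, target=1200, step=25):
    clocks, clock = [], start
    while clock < target:
        clock = min(clock + step, target)
        clocks.append(clock)
    return clocks

print(core_ladder())  # test each step; bump voltage only when one fails
```

Work down the list one entry at a time, and only add voltage when a step artifacts or crashes, so you end up with the minimum volts for the clock you settle on.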


----------



## Chopper1591

Quote:


> Originally Posted by *arrow0309*
> 
> Forgot to mention, the copper rad is a Coolgate CG-240 CuP and it's already factory transparent treated


Ah, man...
I need one of those rads.

Even if it is only for the looks.








Doubt it will cool better.
Quote:


> Originally Posted by *eperelez*
> 
> Hello, I'm thinking of adding a 2nd R9 290X to my sig rig. Is my powers supply enough for this? Also, is my CPU strong enough, or should I think of moving to an i7 build? Thanks in advance and +rep for all who respond.


Your CPU might bottleneck it a bit, but I wouldn't expect it to be extreme.
What is your OC on the CPU?

Personally, I wouldn't want to CrossFire that card with an 850W PSU.

If you use the eXtreme PSU calculator, and trust it:
when you don't overclock your CPU you might be able to pull it off.
However, if you do plan to OC (including the GPUs), I would advise you to upgrade the PSU.


----------



## Agenesis

Quote:


> Originally Posted by *bond32*
> 
> It's a good card, but I haven't seen anything impressive over reference honestly. It has extremely good power delivery, yet I haven't seen anything mind blowing all things considered. Those clocks you listed are pretty common place for reference boards.


That's what I was afraid of, but considering these cards aren't fetching any premium over other 290X models at current prices, I think it's acceptable. It would've been different had I paid the $150-or-whatever premium these cards originally sold for.

Also thanks roboyto, I'll definitely look into it.


----------



## eperelez

Quote:


> Originally Posted by *Roboyto*
> 
> 850 could be enough for your build depending on what kind of clocks you are running on everything.
> 
> A heavily overclocked 83XX CPU can draw ~300W alone. R9 290(X) can be in the same ballpark when overclocked and under load.
> 
> That is a solid PSU, but depending on it's age and if you're looking to run max clocks on all your components than you may want to consider something in the 1kW range.
> 
> The FX chip with a solid clock shouldn't have trouble keeping up with the pair of 290s. I haven't seen anything in the thread stating there is an issue, but some others with first hand experience may chime in.
> 
> Now wouldn't be a bad time for an Intel upgrade especially if you're near a Microcenter who has the 4790k for $250 when bundled with a motherboard.


Quote:


> Originally Posted by *Chopper1591*
> 
> Your cpu might bottleneck it a bit, but I wouldn't expect it be extreme.
> What is your OC on the cpu?
> 
> Personally I wouldn't want to cf that card with a 850w psu.
> 
> If you use the eXtreme psu calculator, and trust that:
> When you don't overclock your cpu you might be able to pull it off.
> How ever, if you do plan to OC(including the gpu's) I would advice you to upgrade the psu.


I'm running at stock clocks, but I will probably keep this setup and overclock my CPU. I think I'll hold off and start planning a new i7 build. I'll wait to see what the 390X has to offer. It's just tempting with all the price drops and deals going around. Thanks for your input!


----------



## pshootr

Hello, I am going to be installing a Sapphire r9 290 Tri-X OC in my system soon and was wondering what driver/catalyst package is best for this card?


----------



## BLOWNCO

Quote:


> Originally Posted by *pshootr*
> 
> Hello, I am going to be installing a Sapphire r9 290 Tri-X OC in my system soon and was wondering what driver/catalyst package is best for this card?


http://support.amd.com/en-us/download

go there and dl the latest driver


----------



## pshootr

Quote:


> Originally Posted by *BLOWNCO*
> 
> http://support.amd.com/en-us/download
> 
> go there and dl the latest driver


Ok thanks, wasn't sure if you guys find the latest to be best or not.


----------



## BLOWNCO

Quote:


> Originally Posted by *pshootr*
> 
> Ok thanks, wasn't sure if you guys find the latest to be best or not.


My setup likes the latest beta drivers, but yours may vary. AMD is launching a new set of drivers, so if you can wait till tomorrow, I would.


----------



## joeh4384

Quote:


> Originally Posted by *pshootr*
> 
> Hello, I am going to be installing a Sapphire r9 290 Tri-X OC in my system soon and was wondering what driver/catalyst package is best for this card?


Apparently tomorrow AMD is releasing Catalyst Omega drivers that are supposed to bring nice improvements and something similar to DSR.


----------



## Harry604

I'm picking up an ASUS DirectCU II 290X for $200 off Craigslist.

Sold my GTX 980; looking to try AMD.


----------



## pshootr

Wow, figures, drivers tomorrow lol. I am currently using the latest drivers (14.9) for my HD 5570. I wonder if I can just swap cards and leave the drivers intact?


----------



## Roboyto

Quote:
Originally Posted by *Mugamat* 

VRM temps are high because I set the exhaust from the radiator straight at the GPU. It's still pretty acceptable; otherwise I can make the tubing a little longer and move the cooling to the top side of the case.

90C is not a good operating temperature for the VRM. If it is possible to turn the fan around and exhaust out the bottom your VRM temperature would be much better and you may even be able to reduce your voltage to get your desired overclock.

Quote:
Originally Posted by *battleaxe* 

Okay, this was kinda fun.

Ran 3 instances of GPU-Z as suggested showing min/max and avg. Hope this shows something tangible.

BIOS: Hopefully it attaches okay. It's just in a zipped folder...

Hawaii-BIOS-legacy.zip 99k .zip file


Lots of pics and graphs inside.


Spoiler: Warning: Spoiler!



Stock Sets:



1050/1250/+50



1100/1250/+100 I didn't try to test for stability or min voltages, just set it to something I was pretty sure it would pass on. I didn't want to get a blackscreen in the middle of a test. Same for all below except last few tests...



1200/1250/+170



1240/1250/+193



1250/1250/+200... figured I might as well go for it here. It held +200mv. First time I've ever seen it do this. Usually black-screens...



After this I tried 1255/1275/+200mv and it black screened immediately. End of testing. Heaven tests on each round. Non stop test, continuous loop.

Pics of the rig... (Disclaimer) this is ghetto as all get out, okay... it's ugly... butt ugly. This is not my primary rig but a mining/gaming rig; this 290 started out its life as a script miner. A buddy of mine bought an MSI Gaming 290 just like it at the same time, and his temps were always higher than mine. Guess we know why now...

Edit: Oh, and there is a single fan on the h80 radiator in "pull" mode only, it is pulling the air from where you see to the other side... toward the front of the inside of the case...

I used the stock H80 120mm fan that came with it. They move quite a bit of air, yes. They are not silent, but not overly noisy either, IMO. I think they do 1400rpm until the CPU revs up, then they go up to around 1800rpm. This fan on the radiator is hooked to the fan controller of my case (Enthoo Pro), which is tethered to the CPU fan header on the MOBO.

So when the CPU fan spools up the GPU fan also spools up. Seems to work well since a load on the CPU (common in games) will push the fans up on the 290 also.

There is about a four inch gap to the front panel (you cannot see this as its behind the radiator shown) where two 120mm fans pull the hot air out of the case.

There are several other fans pushing air into the case from above and behind the Scythe Mine CPU cooler above, and an overall positive air pressure inside the case. It vents heat pretty well. All hot air from the 290 comes out the front of the case. The fan strapped to VRM1 is a 120mm Corsair fan, the same kind that comes stock with the H80 water cooler and identical to the one pulling air through the H80 cooler.









Your peak and average voltages are considerably lower than either of my cards. The ~100mV difference would have to be the primary contributing factor to lower temperatures overall.

Ghetto-ish yes. I would have at least made a trip to the store for some black zip-ties on the pump









Be mindful of those hard plastic tubes, they are prone to cracks/leaks.

Quote:
Originally Posted by *cephelix* 

> Ok, I got a question, since the last few pages have been about benches and whatnot. In 3DMark, when overclocking my 290, it gets through Firestrike, albeit stuttering, but in the earlier tests it artifacts heavily and sometimes crashes. That means my OC is unstable, right? Besides 3DMark, what's another good test? Heaven? Valley? Since I don't feel like running the whole suite of tests in 3DMark every time I increase clocks initially. Of course for final clocks I would.

Quote:
Originally Posted by *Chopper1591* 

Could be memory overclock.

I disagree. It is extremely rare, from my experience, for artifacting to be caused by RAM speeds with these cards. Too high of a RAM clock will give black screens, lock ups, or very rarely a BSOD.

Artifacting on these cards generally means you need more voltage for your core clock. Or, if you have no more voltage to add, then you must reduce your core clock.

If you're doing a lot of back-to-back benching, then rebooting every few runs can make a difference from my experience.

As others have mentioned, benchmarks are benchmarks. They're good tests, but no guarantee for all the different games/engines/coding that are out there.

However..

My go-to for absolute stability is the Final Fantasy XIV benchmark. Any overclock I have completed successfully in that benchmark has worked for any game I have played. Roughly halfway through the bench there is a scene with several Coeurls (cat/lizard-ish creatures with two very large whiskers), which I have found to be the earliest problematic point of the bench for artifacting, if it doesn't happen immediately, obviously. If you artifact here, you might as well exit and readjust your overclock settings, because it will only get worse from that point on.



Spoiler: A Coeurl







Quote:
Originally Posted by *Agenesis* 

That's what I was afraid of, but considering the going price for these cards aren't fetching any premium over other 290x models I think it's acceptable. Would've been different had I paid $150 or whatever premium these cards originally sold for.

Also thanks roboyto, I'll definitely look into it.









No problem.

I agree with @bond32 in regard to the Lightning. For whatever reason, with this round of AMD cards the Lightning was no guarantee of an above-average card. Its only real benefit is for extreme overclockers with something like LN2 in mind.

At least you didn't overpay... I would have been salty if I were an early adopter expecting the Lightning legacy...

Quote:
Originally Posted by *joeh4384* 

Apparently tomorrow AMD is releasing Catalyst Omega drivers that are supposed to bring nice improvements and something similar to DSR.

I believe those large claims were comparing the first set of drivers released with the 290(X) cards, 13.12 if I'm not mistaken, to the Omega Drivers. See link:

http://www.overclock.net/t/1528550/techspot-amd-catalyst-omega-drivers/0_20#post_23236535

Quote:
Originally Posted by *pshootr* 

Hello, I am going to be installing a Sapphire r9 290 Tri-X OC in my system soon and was wondering what driver/catalyst package is best for this card?


> Quote:
> 
> 
> 
> Originally Posted by *pshootr*
> 
> Wow, figures, drivers tomorrow lol. I am currently using the latest drivers (14.9) for my HD 5570. *I wonder if I can just swap cards and leave the drivers intact?*

Most likely the latest drivers. 14.9 is working fine in both my systems with 290's. Make sure to properly scrub whatever drivers you were running, with something like Display Driver Uninstaller, from your previous card even if it was AMD.

Quote:


> Originally Posted by *Harry604*
> 
> im picking up a asus direct cu2 290x for 200$ off craigslist
> 
> sold my gtx 980 looking to try amd


Before installing it in your system I would pop the cooler off and see what kind of contact is being made with the GPU die. ASUS was extremely lazy and reused the cooler from the GK110-based GTX 780 without modifying it at all.

The GK110 die is significantly larger than the 290(X) die (28% larger, to be exact), and because of this only 3 of the 5 heatpipes make contact with the 290(X) die. This does not bode well for load temperatures, especially if you plan to overclock.

I'm uncertain if they ever addressed the issue, so it is best to check for yourself.
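
For what it's worth, the 28% figure lines up with the commonly cited die areas: the GTX 780's GK110 die measures about 561 mm² and Hawaii about 438 mm². A quick sanity check (the numbers are published die sizes, not my own measurements):

```python
# Die-size arithmetic behind the "28% larger" claim.
# Published figures: GK110 (GTX 780) ~561 mm^2, Hawaii (290/290X) ~438 mm^2.
gk110_mm2 = 561
hawaii_mm2 = 438

ratio = gk110_mm2 / hawaii_mm2  # ~1.28
print(f"GK110 is {100 * (ratio - 1):.0f}% larger than Hawaii")  # -> GK110 is 28% larger than Hawaii
```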



Spoiler: Lazy ASUS


----------



## cephelix

As always, @Roboyto, thank you very much for the info. I completely forgot about the Final Fantasy benchmark. I do realise that all these are tests and not proof of stability and whatnot, but I feel it helps to have a point of reference, so at least if I do run into problems I'd know where they're coming from. Still new to these things, so confidence in my own abilities is somewhat lacking.


----------



## Buehlar

The new drivers are here, guys. Currently running beta 2 stable.
Anyone planning to test the new ones?

AMD Catalyst™ Driver 14.12


----------



## Chopper1591

Quote:


> Originally Posted by *eperelez*
> 
> I'm running at stock clocks, but I will probably keep this setup and overclock my CPU. I think I'll hold off and start planning a new i7 build. I'll wait to see what the 390X has to offer. It's just tempting with all the price drops and deals going around. Thanks for your input!


What you could do is buy a wattage meter and see what you pull from the wall.
That will give you a rough idea of whether you need to change that PSU or not.

But like you said: you are going to overclock.
I wouldn't hesitate; buy the 1k(+) unit.
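
The wall-meter arithmetic is simple enough to sketch. A rough estimate, assuming a typical ~90% PSU efficiency at load (80 Plus Gold ballpark); the 700W reading and 850W rating below are illustrative, not measurements:

```python
# Rough PSU headroom estimate from a wall-meter reading.
# Assumption: ~90% PSU efficiency at load; numbers are illustrative.

def dc_load_watts(wall_watts, efficiency=0.90):
    """Convert AC draw at the wall to the DC load the PSU actually delivers."""
    return wall_watts * efficiency

def headroom_pct(wall_watts, psu_rating, efficiency=0.90):
    """Percent of the PSU's rated capacity left unused."""
    load = dc_load_watts(wall_watts, efficiency)
    return 100.0 * (1.0 - load / psu_rating)

# e.g. 700 W at the wall on an 850 W unit:
load = dc_load_watts(700)        # 630 W of DC load
spare = headroom_pct(700, 850)   # ~25.9% headroom left
```

Note the efficiency figure changes with load; a meter reading near the PSU's rating at the wall is the real warning sign.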
Quote:


> Originally Posted by *pshootr*
> 
> Hello, I am going to be installing a Sapphire r9 290 Tri-X OC in my system soon and was wondering what driver/catalyst package is best for this card?


Depends..
For some things the beta is better.
Quote:


> Originally Posted by *Harry604*
> 
> im picking up a asus direct cu2 290x for 200$ off craigslist
> 
> sold my gtx 980 looking to try amd


Trolling?

Why'd you do that?
Quote:


> Originally Posted by *Buehlar*
> 
> The new dirvers are here guys. Currently running beta 2 stable.
> Anyone planning to test the new ones?
> 
> AMD Catalyst™ Driver 14.12


Will try it this afternoon.


----------



## pdasterly

They fixed the Call of Duty: Advanced Warfare Crossfire profile big time.
Getting 60fps min, 110 max @ 5760x1080
Extra settings, [email protected] HBAO, everything else maxed

Was getting 55fps average on previous driver


----------



## Arizonian

Quote:


> Originally Posted by *arrow0309*
> 
> Got a bran new *EKWB* + backplate and a compact dual 240x60 mm custom liquid cooling on my CM 690 III
> 
> 
> 
> 
> 
> 
> 
> 
> Huge improvement on the 290 @1150/1475, ~ 30°C down on the gpu and over 40° lower vrm temps
> 
> 
> 
> 
> 
> 
> 
> 
> Very happy with my new build
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Some more pics here:
> 
> http://www.overclock.net/t/294838/the-cooler-master-690-club/19550#post_23250478


Congrats - looks great. Updated









OP updated with latest driver info.

*Introducing the* *Catalyst™ Omega Driver* *Windows®*

*
AMD Catalyst™ Driver 14.12 Windows®*


Spoiler: Warning: Spoiler!



*Highlights of AMD Catalyst™ 14.11.2 Windows® Beta Driver Performance Improvements*
◾Dragon Age: Inquisition performance optimizations
- Up to 5% performance increase over Catalyst™ 14.11.1 beta in single GPU scenarios with Anti-Aliasing enabled.
- Optimized AMD CrossFire™ profile
◾Far Cry 4 performance optimizations
- Up to 50% performance increase over Catalyst™ 14.11.1 beta in single GPU scenarios with Anti-Aliasing enabled.

*Important Notes*
◾All previous versions of AMD Catalyst™ drivers must be completely uninstalled to receive full benefits of this driver. Please make sure to completely uninstall all previous AMD Catalyst™ display drivers before installing the 14.11.2 Beta.
◾The AMD CrossFire™ profile for Far Cry 4 is currently disabled in this driver while AMD works with Ubisoft to investigate an issue where AMD CrossFire™ configurations are not performing as intended. An update is expected on this issue in the near future through an updated game patch or an AMD driver posting.

*Resolved Issues*
◾[409235]: Small chance of intermittent screen tearing or corruption in Call of Duty®: Advanced Warfare on high settings 4K resolution in AMD CrossFire™ mode.
◾[408892]: World of Warcraft can sometimes exhibit corruption when using CMAA in AMD CrossFire™ configurations.
◾[407431]: Minecraft sometimes produces corruption when changing video settings in windowed mode.
◾[407338]: XDMA Quad CrossFire™ configurations in portrait Eyefinity modes sometimes display tearing or stuttering.

*Known Issues*
◾[408723]: System can sometimes hang when upgrading to Catalyst™ 14.11.2 from Catalyst™14.7 in AMD CrossFire™ configurations. As a workaround please completely uninstall previous Catalyst™ software versions before installing Catalyst™14.11.2 beta.
◾[409518]: Slight performance drops in FIFA 2015 on AMD CrossFire™ configurations.
◾[409502]: Occasional flickering sometimes observed while playing FIFA 2015 in AMD Dual Graphics configurations.
◾[409638]: Slight Battlefield 4 performance drop on AMD Radeon™ R9 290X in AMD CrossFire™ configuration.
◾[409628]: AMD Radeon™ R9 285 intermittently hangs in Hitman Absolution on new game start.
◾[408484]: AMD Radeon™ R9 285 can sometimes exhibit flickering in Assassins Creed Unity.
◾[409613]: Assassins Creed Unity can sometimes experience frame stutter on some AMD CrossFire™ configurations.
◾[408706]: Call of Duty: Advanced Warfare intermittent black screen when loading game in Quad AMD Crossfire™ configurations.
◾[409600]: Civilization: Beyond Earth mantle users in AMD CrossFire™ configurations may sometimes experience an issue where they cannot change their game resolution. As a work around please use "Enable MGPU=1" in the games configuration .ini file.

AMD is currently working with BioWare to resolve the following issues:
◾Flickering is sometimes observed in Dragon Age: Inquisition on a limited number of surfaces in AMD CrossFire™ configurations.

AMD is currently working with Ubisoft to resolve the following issues:
◾Uneven hair corruption sometimes observed in Assassins Creed Unity when applying "ultra" game settings.
◾Flickering occasionally observed between windows on walls in Assassins Creed Unity.
◾Windows/Doors intermittently flash with black textures in Assassins Creed Unity.
◾Assassins Creed Unity occasionally exits to desktop when "ultra" game settings are applied.


----------



## Harry604

I just got my DirectCU II R9 290X.

I threw an Antec 620 cooler on it with an EX360 rad; it maxes out at 45C.

My problem now is it keeps downclocking; it never stays at 1150MHz.

How can I fix this?


----------



## directorJay

Anyone running the 14.12 Omega drivers? How are they? I'm downloading right now, and a heads-up on what I can expect would be great.


----------



## Spooby904

Just downloaded them... But have no option for VSR with my r9 290 xfire setup


----------



## Mega Man

so how are people liking the omega driver ?


----------



## Takla

Quote:


> Originally Posted by *Spooby904*
> 
> Just downloaded them... But have no option for VSR with my r9 290 xfire setup


anyone else with some info for VSR?


----------



## kizwan

Quote:


> Originally Posted by *Harry604*
> 
> i just got my direct cu2 r9 290x
> 
> i threw a antec 620 cooler on it with a ex360 rad maxes out at 45c
> 
> my problem now is it keeps downclocking never stays at 1150mhz
> 
> how can i fix this


Downclocking while under load (e.g. gaming)? More info would be helpful, plus a screenshot showing the core downclocking. What are the VRM temps when it happens?
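
If a screenshot is hard to catch at the moment it happens, GPU-Z's sensor log can be scanned after the fact. A minimal sketch; the column header names here are assumptions and vary by GPU-Z version, so adjust them to match your file:

```python
# Sketch: scan a GPU-Z sensor log for samples where the core dropped
# below its target clock, along with the VRM temp at that moment.
# Assumption: the log is comma-separated; the header names below are
# illustrative and should be adjusted to your GPU-Z version.
import csv

def throttle_events(rows, target_mhz=1150, margin=25):
    """Yield (core_clock, vrm_temp) for samples below the target clock."""
    for row in rows:
        clock = float(row["GPU Clock [MHz]"])
        vrm = float(row["VRM Temperature [C]"])
        if clock < target_mhz - margin:
            yield clock, vrm

# Hypothetical usage against GPU-Z's default log file name:
# with open("GPU-Z Sensor Log.txt") as f:
#     for clock, vrm in throttle_events(csv.DictReader(f, skipinitialspace=True)):
#         print(f"dropped to {clock:.0f} MHz with VRM at {vrm:.0f} C")
```

If the drops line up with VRM temps in the 80s or 90s, that points at VRM throttling rather than the power limit.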


----------



## Chopper1591

Quote:


> Originally Posted by *kizwan*
> 
> Downclocking while under load (e.g. gaming)? More info would be helpful, plus a screenshot showing the core downclocking. What are the VRM temps when it happens?


This.

We need more info to give advice.


----------



## kizwan

I just did a quick Valley bench with the Omega drivers. No tweaks. The highest VSR I can get is 3200x1800 on a 1080p monitor. What max res do you guys get?

1080p


1440p


----------



## mojobear

Anyone with eyefinity try VSR - does it work? I tried it on my system @ 5760 x 1080 and no dice









Any help or comments about their experience would be appreciated thanks


----------



## mus1mus

Just got my 290 swimming. And haven't been in touch for a while.

Can someone share a good BIOS to work with a reference MSI?

40s on Core and 50s hottest VRM. Am I in the ballpark for water cooled 290s using ek full cover?

Thanks gang


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> Just got my 290 swimming. And haven't been in touch for a while.
> 
> Can someone share a good BIOS to work with a reference MSI?
> 
> *40s on Core and 50s hottest VRM. Am I in the ballpark for water cooled 290s using ek full cover?*
> 
> Thanks gang


Sounds about right with stock EK thermal pads. VRM1 on mine was a couple degrees higher than the core.


----------



## LandonAaron

14.12, no FC4 CFX profile?


----------



## ebduncan

I updated to the Omega drivers last night.

Not much changed for me performance-wise vs the 14.11.2 betas. I picked up around 30-100 3DMark Firestrike points, although the runs couldn't be directly compared: with the older drivers the CPU was at 5052MHz, and it's back at its normal 4945MHz with the Omega drivers. 3DMark is weird, though, and the physics score pretty much remained the same.

Here are the runs. Will have to do some actual game benches later. I will probably put the cpu back to 5052 and rerun a few times of firestrike just to see a more direct comparison to my 14.11.2 scores.

Omega
http://www.3dmark.com/fs/3421965
CPU at 4945mhz
[email protected] 1240/1600
Score: 9549
Graphics:13552.0

14.11.2 betas
http://www.3dmark.com/fs/3377805
CPU at 5052mhz
[email protected] 1240/1600
Score :9509
Graphics:13397.0
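
Putting those two runs in percentage terms (just arithmetic on the scores quoted above):

```python
# Percent change between the two Firestrike runs listed above.
def pct_change(new, old):
    return 100.0 * (new - old) / old

overall = pct_change(9549, 9509)         # ~0.42% overall
graphics = pct_change(13552.0, 13397.0)  # ~1.16% graphics
```

Both deltas are small enough that the CPU clock difference between the runs could plausibly account for the overall score gap.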


----------



## ElevenEleven

Just received a PowerColor PCS+ 290X yesterday and have a lot to test. It was an open box from NewEgg, but came in the original box and really looked new and unused (just the serial number on the box is crossed out with a marker, and a new serial number is pasted on).

Using the pre-Omega beta driver and have a lot to test in terms of stability and such, as I've read those cards might have some older BIOS issues. GPU-Z reads my BIOS as follows:
015.044.000.001.000000 (113-C5711101-C672). Not sure if it needs to be updated or not yet. Elpida memory. ASIC of 82.7%.

Stock with the previous beta driver, hopefully this is okay (I've got a lot to read about to catch up on the latest AMD series):



Is that low? The stock configuration is 1050MHz core / 1350MHz memory. The i5 2400K is overclocked to basically be like the stock i5 2500K. I'd just love to know if this is reasonable or if I have a Powercolor PCS+ with problems and need a BIOS change.


----------



## chiknnwatrmln

I can't upgrade past 14.9, what the hell? On 14.11 or the newest 14.12, CCC uses ~10% CPU and flickers, and any time I try to change any settings it reverts back. I will try to take a video. Does this happen to anyone else?


----------



## mAs81

Hey guys I'm having a peculiar problem..

I always had an idle temp of max 45c on my card,but today after I unplugged everything to fix the cables and clean my desk,it idles @ 55-56c..
I have a 2560x1440 monitor connected via dvi and a 1920x1080 tv monitor connected via hdmi..

At first I thought that it was the cable,because I changed the hdmi cable with a longer dvi one..I changed back to the hdmi one but no dice..
I was running this setup with the 14.11.2 drivers , and even though I rolled back to 14.9 nothing changed..I'm running now the Omega drivers with no temp change..

Here's my HWINFO and GPU-Z readings.. :



Why is this happening and how can I fix it?? It's really eerie seeing the Sapphire logo on my Vapor-X turn yellow at almost minimum load









Any advice would be highly appreciated


----------



## joeh4384

Are you using CRU by chance? It looks like your memory is running at its high clock.


----------



## mAs81

Quote:


> Originally Posted by *joeh4384*
> 
> Are you using CRU by chance? It looks like your memory is running at its high clock.


The memory was always running @ 1400Mhz with only the core clock fluctuating depending on use


----------



## ebduncan

Quote:


> Originally Posted by *mAs81*
> 
> Hey guys I'm having a peculiar problem..
> 
> I always had an idle temp of max 45c on my card,but today after I unplugged everything to fix the cables and clean my desk,it idles @ 55-56c..
> I have a 2560x1440 monitor connected via dvi and a 1920x1080 tv monitor connected via hdmi..
> 
> At first I thought that it was the cable,because I changed the hdmi cable with a longer dvi one..I changed back to the hdmi one but no dice..
> I was running this setup with the 14.11.2 drivers , and even though I rolled back to 14.9 nothing changed..I'm running now the Omega drivers with no temp change..
> 
> Here's my HWINFO and GPU-Z readings.. :
> 
> 
> 
> Why is this happening and how can I fix this??It's really eerie seeing my Sapphire logo on my Vapor-X get yellow with almost minimum load
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any advice would be highly appreciated


Did you go from a single monitor to a dual monitor setup?

If so, it's completely normal.


----------



## joeh4384

Quote:


> Originally Posted by *ElevenEleven*
> 
> Just received a PowerColor PCS+ 290X yesterday and have a lot to test. It was an open box from NewEgg, but came in the original box and really looked new and unused (just the serial number on the box is crossed out with a marker, and a new serial number is pasted on).
> 
> Using the pre-Omega beta driver and have a lot to test in terms of stability and such, as I've read those cards might have some older BIOS issues. GPU-Z reads my BIOS as follows:
> 015.044.000.001.000000 (113-C5711101-C672). Not sure if it needs to be updated or not yet. Elpida memory. ASIC of 82.7%.
> 
> Stock with the previous beta driver, hopefully this is okay (I've got a lot to read about to catch up on the latest AMD series):
> 
> 
> 
> Is that low? The stock configuration is 1050MHz core / 1350MHz memory. The i5 2400K is overclocked to basically be like the stock i5 2500K. I'd just love to know if this is reasonable or if I have a Powercolor PCS+ with problems and need a BIOS change.


That score seems pretty similar to what I get with my 290X at stock clocks. Mine is at 1030MHz though, and I am using it with a 4690K.


----------



## mAs81

Quote:


> Originally Posted by *ebduncan*
> 
> Did you go from a single monitor to a dual monitor setup?
> 
> IF so its completely normal.


No, I always had dual monitors.. I just unplugged everything, cleaned my desk, re-connected everything (using a DVI cable for the 1920x1080 TV monitor) and booted..


----------



## ebduncan

Quote:


> Originally Posted by *mAs81*
> 
> No,I always had dual monitors..I just unplugged everything,cleaned my desk,re-connected everything(using a dvi cable for the 1920x1080 tv monitor)and booted..


Not sure what you did while cleaning your desk, but did that include your computer as well? You might have unplugged a fan cable or something...

Check to see if one of your fans died.

Other than that I have no idea.


----------



## mAs81

Quote:


> Originally Posted by *ebduncan*
> 
> Not sure what you did while cleaning you desk, but did that include your computer as well? might have unplugged a fan cable or something...
> 
> Check to see if one of your fans died.
> 
> Other than that I have no idea.


That's the first thing I checked; I've done it before, getting anxious about temps with no intake fan









Thanks anyway man,appreciate it


----------



## Performer81

Quote:


> Originally Posted by *ElevenEleven*
> 
> Just received a PowerColor PCS+ 290X yesterday and have a lot to test. It was an open box from NewEgg, but came in the original box and really looked new and unused (just the serial number on the box is crossed out with a marker, and a new serial number is pasted on).
> 
> Using the pre-Omega beta driver and have a lot to test in terms of stability and such, as I've read those cards might have some older BIOS issues. GPU-Z reads my BIOS as follows:
> 015.044.000.001.000000 (113-C5711101-C672). Not sure if it needs to be updated or not yet. Elpida memory. ASIC of 82.7%.
> 
> Stock with the previous beta driver, hopefully this is okay (I've got a lot to read about to catch up on the latest AMD series):
> 
> 
> 
> Is that low? The stock configuration is 1050MHz core / 1350MHz memory. The i5 2400K is overclocked to basically be like the stock i5 2500K. I'd just love to know if this is reasonable or if I have a Powercolor PCS+ with problems and need a BIOS change.


Could you please upload the BIOS of your card? I'd like to test it on my unlocked 290; my other 290X PCS+ BIOS gives me some trouble.


----------



## chiknnwatrmln

Here is a video of my issue with CCC.




Anyone have any ideas?


----------



## ElevenEleven

Quote:


> Originally Posted by *joeh4384*
> 
> That score seems pretty similar to what I get with my 290X at stock clocks. Mine is at 1030MHz though, and I am using it with a 4690K.


Thank you! At least I know it's a reasonable score and not too low. Looks like I can comfortably run 1100/1400 at reduced stock voltage (testing at -19mV now; going to see how far down I can push). The PCS+ cooling is pretty good, but the fans become audible approaching 40%+ speed (a whiny kind of sound, which I dislike, not a rush of air), so I'd like to stay there or lower. So far this gives a max of 64C core, 72C VRM1.



(what's up with the crazy temperature numbers as reported by Valley?)


----------



## Harry604

My VRM 1 and 2 temps are around or under 60C (I have a fan blowing on them), core under 48C.

In gaming and running Heaven it downclocks.

I'm running a 2560x1440 monitor at 120Hz.

The ASUS DirectCU II 290X comes with a heatsink over VRM 1.

After 15 mins of Heaven, VRM 1 is at 64C and VRM 2 at 61C.

Power limit is +50.


----------



## mAs81

Quote:


> Originally Posted by *mAs81*
> 
> Hey guys I'm having a peculiar problem..
> 
> I always had an idle temp of max 45c on my card,but today after I unplugged everything to fix the cables and clean my desk,it idles @ 55-56c..
> I have a 2560x1440 monitor connected via dvi and a 1920x1080 tv monitor connected via hdmi..
> 
> At first I thought that it was the cable,because I changed the hdmi cable with a longer dvi one..I changed back to the hdmi one but no dice..
> I was running this setup with the 14.11.2 drivers , and even though I rolled back to 14.9 nothing changed..I'm running now the Omega drivers with no temp change..
> 
> Here's my HWINFO and GPU-Z readings.. :
> 
> 
> 
> Why is this happening and how can I fix this??It's really eerie seeing my Sapphire logo on my Vapor-X get yellow with almost minimum load
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any advice would be highly appreciated


Has this never happened before to anyone?


----------



## The Mac

yup. memory is all full clock.

Unplug the TV and see if it drops.


----------



## mAs81

Quote:


> Originally Posted by *The Mac*
> 
> yup. memory is all full clock.
> 
> unplug the TV and see if it dropds.


Well, I did it and it dropped all right, but I thought that running memory at full clock is normal when you have dual monitors.
Plus, how come it was running at high clocks before and staying cool??


----------



## tsm106

Quote:


> Originally Posted by *mAs81*
> 
> Quote:
> 
> 
> 
> Originally Posted by *The Mac*
> 
> yup. memory is all full clock.
> 
> unplug the TV and see if it dropds.
> 
> 
> 
> Well I did it and it dropped alright,but I thought that running memory at full clock is something normal when you have dual monitors,
> plus how come before it was working at high clocks and was cool??

By default in multi mon it will run memory at fullspeed. You can however set it up to run downclocked, depending on the specific driver. Some versions won't allow memory to be downclocked.


----------



## mAs81

Quote:


> Originally Posted by *tsm106*
> 
> By default in multi mon it will run memory at fullspeed. You can however set it up to run downclocked, depending on the specific driver. Some versions won't allow memory to be downclocked.


What I can't explain, for the life of me, is this +10C rise in my card.. All I did was unplug the monitors and plug them in again.. It makes no difference whatsoever what cable I use to connect my TV monitor..
I'm currently connected to my card via a dual-link DVI (2560x1440) and a DVI (1920x1080)..

Would it make any sense to connect them in some other way? I can always get a DisplayPort cable for my Dell and reconnect the TV monitor via HDMI..

This is all so random


----------



## tsm106

I suppose it can seem random. All of this stuff is very intertwined with your oc app and how you oc. How are you going about that process?


----------



## mAs81

Nothing's changed..Stock clocks with downvolting on the Vcore because my gigabyte m/b set it at 1.275 on its own


----------



## tsm106

I meant on the gpu. Long story short, you need an oc app like msi ab with defined default and 3D clocks, so that ab can negotiate the powerstates for you and sort of do a trick to enable the gpu to run in low power mode.


----------



## mAs81

Quote:


> Originally Posted by *tsm106*
> 
> I meant on the gpu. Long story short, you need an oc app like msi ab with defined default and 3D clocks, so that ab can negotiate the powerstates for you and sort of do a trick to enable the gpu to run in low power mode.


Whoops, sorry, of course that's what you meant.. I'm using Sapphire TriXX and I'm not overclocking at the moment


----------



## chiknnwatrmln

Anyone have any insight as to why CCC is so bugged for me? This is making CF and Eyefinity totally unusable..
Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I can't upgrade past 14.9, what the hell? On an 14.11 or the newest 14.12 CCC uses ~10% CPU and flickers, and any time I try to change any settings it reverts back. I will try to take a video. Does this happen to anyone else?


Here are two videos of this.


----------



## tsm106

Quote:


> Originally Posted by *mAs81*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I meant on the gpu. Long story short, you need an oc app like msi ab with defined default and 3D clocks, so that ab can negotiate the powerstates for you and sort of do a trick to enable the gpu to run in low power mode.
> 
> 
> 
> Whoops,sorry,of course that's what you meant..I'm using Sapphire TriXX and I'm not overclocking at the moment

It doesn't matter whether you're actually overclocking atm; it's how, and with what app, you choose to control it all. TriXX doesn't support auto-switching profiles and AFAIK uses unofficial OC mode, both of which are not conducive to seamless interaction with AMD's power-saving tech. AB, however, can coexist fairly well, like I described before.

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Anyone have any insight as to why CCC is so bugged for me? This is making CF and Eyefinity totally unusable..
> Quote:
> 
> 
> 
> Originally Posted by *chiknnwatrmln*
> 
> I can't upgrade past 14.9, what the hell? On an 14.11 or the newest 14.12 CCC uses ~10% CPU and flickers, and any time I try to change any settings it reverts back. I will try to take a video. Does this happen to anyone else?
> 
> 
> 
> Here are two videos of this.
> 
> 
> Spoiler: Warning: Spoiler!

Borked driver install. Uninstall and reinstall. How are you cleaning drivers? At this point in time, I just use DDU.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *tsm106*
> 
> It doesn't matter if you're actually overclocking atm, it's how and what app you choose to use to control it all. Trixx doesn't support auto switching profiles and afaik uses unofficial oc mode, both of which are not conducive to seamless interaction with AMD's powersavings tech. AB however can coexist fairly well like I described before.
> Borked driver install. Uninstall and reinstall. How are you cleaning drivers? At this point in time, I just use DDU.


I also use DDU. This has happened after several clean driver uninstalls; I checked the registry and the appropriate folders manually afterward, and all traces were removed. 14.9 and any drivers before work just fine.

I also tried not using any other programs such as Trixx, MSIAB, etc. No luck.


----------



## tsm106

Eeek. How old is your os install at this point?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *tsm106*
> 
> Eeek. How old is your os install at this point?


That was my second thought. It's not too old, last time I reinstalled Windows was late April I think. This installation has only seen 290/x, no Nvidia or different cards.
I might try reinstalling Windows once I get some time after final exams.


----------



## Regnitto

Does anyone have the measurements for the cooler mounting hole locations on the reference 290/290X PCB? I am working on designing a custom full-card fan shroud for my red-modded VisionTek R9 290. I don't personally have anything to measure such a small distance as the holes around the edge of the card. Once I have my blueprints drawn up I will take a pic so you can see what I have in mind. (Of course I'll have to go back in and add mounting hole locations to the blueprints once I have this info.)


----------



## jstoneky

I have this exact same issue, and it only causes the problem when I try to configure eyefinity. Same flickering, not able to click, etc.

I have tried countless uninstall/reinstalls and DDU -- it doesn't break with 14.9, so I don't believe it's a Windows OS problem. I can reinstall 14.9 and configure Eyefinity without issue.

I am running 290x x2 on an Asus Z97-C Mobo ... really not sure what the issue is at this point, so let me know if you figure anything out.


----------



## Harry604

Quote:


> Originally Posted by *kizwan*
> 
> Downclocking while under load (e.g. gaming)? More info would be helpful. Also screenshot showing the core downclocking. What is the VRM temps when it happened?


Quote:


> Originally Posted by *Chopper1591*
> 
> This.
> 
> We need more info to give advice.


My VRM 1 and 2 temps are under 70C (I have a fan blowing on them), core under 48C.

In gaming and running Heaven it downclocks.

I'm running a 2560x1440 monitor at 120Hz.

The Asus DirectCU II 290X comes with a heatsink over top of VRM 1.

After 15 mins of Heaven, VRM 1 is 64 and VRM 2 is 61.

Power limit is +50.


----------



## kizwan

Quote:


> Originally Posted by *Harry604*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Downclocking while under load (e.g. gaming)? More info would be helpful. Also screenshot showing the core downclocking. What is the VRM temps when it happened?
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Chopper1591*
> 
> This.
> 
> We need more info to give advice.
> 
> Click to expand...
> 
> 
> 
> 
> 
> my vrm 1 and 2 temps around under 70c i have a fan blowing on them core under 48c
> 
> in gaming and running heaven it downclocks
> 
> im running a 2560x1440p monitor at 120hz
> 
> the asus direct cu2 290x comes with a heatsink over top vrm 1
> 
> after 15 mins of heaven vrm 1 is 64 and vrm 2 is 61
> 
> power limit is +50
> 
> 
Click to expand...

Core clocks fluctuating when running Unigine is normal. This only happens with a single GPU though. With crossfire, for some reason it'll run at max clock.


----------



## Harry604

Quote:


> Originally Posted by *kizwan*
> 
> Core clocks fluctuating when running Unigine is normal. This only happen with single gpu though. With crossfire for some reason it'll run at max clock.


It happens when I game too.


----------



## boredmug

Needing some opinions on R9 290s. They are XFX, which I know isn't a great brand. I was going to buy a pair of R9 290Xs used here pretty soon, but a buddy of mine has a couple of these reference 290s that I was going to slap my EK-Supremacy Bridge Edition GPU blocks on, retaining the stock heatspreader to cool RAM and VRMs. I guess it's possible they could be unlockable, which would be a bonus, but I'm really wondering if it's worth spending 350 on the pair? I was looking at spending 250 or so a card for some used Gigabyte or Sapphire 290Xs.


----------



## Arctic Storm

I just picked up a 3rd R9 290 (non-X) and am currently running a watercooled trifire setup. I think I may be confused about how MSI Afterburner reads the cards. Under "GPU 3 Memory Usage", when playing Shadow of Mordor on Very High at 4K, it reads over 10,000 MB of VRAM usage... I am assuming this means that real VRAM usage is 10,000/3? Memory usage seems very high for all games (Dragon Age: Inquisition reads 8000 MB @ 4K at Very High). So that leads me to conclude VRAM use per card = total VRAM (in this case let's use 10,000) / # of cards (3) = 3333 MB VRAM, which seems much more appropriate to me.


----------



## tsm106

Quote:


> Originally Posted by *Arctic Storm*
> 
> I just picked up a 3rd r9 290 (non x) and am currently running a watercooled trifire setup. I think I may be confused about how MSI afterburner reads the cards. Under "GPU 3 Memory Usage", when playing Shadows of Mordor on Very High at 4k, *it reads over 10,000mb of VRAM usage... I am assuming this means that real VRAM usage is 10,000/3?* Memory usage seems very high for all games (Dragon Age Inquistion reads 8000mb @ 4k at very high). So that leads me to conclude VRAM use per card = Total Vram (in this case lets use 10000)/# of cards (3) = 3333mb Vram, which seems much more appropriate to me.


Correct, it's total VRAM, then ya divide that by the number of cards. I would recommend people not get overly critical in regards to VRAM usage. The readings from all the apps are inexact anyway, and many games/apps reserve memory space on top of that. VRAM memory paging happens very fast, too often and in too many places to really track, even if these apps could report true VRAM usage. About the only time I start to wonder about memory usage is when I'm stuck in a VRAM choke situation, the dreaded stutter step. In that case, reduce your game settings to free up memory.
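The per-card arithmetic above is simple enough to sketch; here it is using the numbers from the Shadow of Mordor example (the function name is mine, and this only illustrates the rule of thumb from the post, not how Afterburner actually meters memory):

```python
def per_card_vram(reported_total_mb: float, num_cards: int) -> float:
    """In AFR CrossFire each card mirrors roughly the same frame data,
    and the monitoring readout can show the summed allocation, so the
    per-card figure is the reported total divided by the card count."""
    return reported_total_mb / num_cards

# ~10,000 MB reported across a 3-card trifire setup, as in the post
print(round(per_card_vram(10_000, 3)))  # 3333 (MB per card)
```

So the alarming 10 GB readout works out to roughly 3.3 GB per card, comfortably inside a 290's 4 GB.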


----------



## kizwan

Quote:


> Originally Posted by *Harry604*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Core clocks fluctuating when running Unigine is normal. This only happen with single gpu though. With crossfire for some reason it'll run at max clock.
> 
> 
> 
> it happens when i game to
Click to expand...

Which games? If the fluctuations are similar to what you get with Unigine, it should be normal. Mine are crossfire-ed; I rarely play games with a single GPU & I only play a few games. Maybe @Chopper1591 could help you more if you tell us the names of the games.


----------



## Harry604

Quote:


> Originally Posted by *kizwan*
> 
> Which games? If the fluctuations similar to what you get with Unigine it should be normal. Mine crossfire-ed, I rarely play games with single gpu & I only play few games. Maybe @Chopper1591 could help you more if you tell us the name of the games.


BF4, Far Cry 4, CS:GO, Ryse: Son of Rome, Assassin's Creed

Isn't there a BIOS I can flash that stops the downclocking?
Obviously not temp related or my PSU.

First time with an AMD card.


----------



## tsm106

Quote:


> Originally Posted by *Harry604*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Which games? If the fluctuations similar to what you get with Unigine it should be normal. Mine crossfire-ed, I rarely play games with single gpu & I only play few games. Maybe @Chopper1591 could help you more if you tell us the name of the games.
> 
> 
> 
> Bf4,far cry 4,cs global,ryse son of Rome,assasins creed
> 
> Isn't there a bios I can flash that stops the down clocking
> Obviously not temp related or my psu
> 
> First time with a amd card
Click to expand...

What app are you using to control your clocks?


----------



## Harry604

Quote:


> Originally Posted by *tsm106*
> 
> What app are you using to control your clocks?


msi afterburner


----------



## Aaron_Henderson

I think I can push everything a bit more, but I really need better cooling on the 290X... and I still haven't overclocked my RAM. I am still the 8th-highest 2500K+290X setup (10745 is the top) I could find using the search and compare functions on the 3DMark website... at least I know everything is running as it should. I still want to push a bit more though. These are just the daily-driver settings with the AMD Omega drivers.


----------



## rdr09

Quote:


> Originally Posted by *Harry604*
> 
> Bf4,far cry 4,cs global,ryse son of Rome,assasins creed
> 
> Isn't there a bios I can flash that stops the down clocking
> Obviously not temp related or my psu
> 
> First time with a amd card


Don't combine AB and GPU-Z. Use just AB to monitor your GPU core clock while testing Heaven.


----------



## tsm106

Quote:


> Originally Posted by *Harry604*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> What app are you using to control your clocks?
> 
> 
> 
> msi afterburner
Click to expand...

Set up AB like in the pic below. Don't combine apps; as rdr mentioned, don't run GPU-Z or OverDrive while AB is running. Adding to that, multiple diagnostic apps reading the same sensors can/will cause conflicts.



After configuring like so, set up two presets. One is default and one is for your OC/daily settings/whatever use, etc. Now go to Settings > Profiles and pick your default preset for the 2D profile and the other preset for the 3D profile. Exit AB, and restart it to save settings. Your card should now drop down to low-power clocks while idle, and conversely maintain the clocks you chose for the 3D profile under load. Btw, be sure to max Power Limit on your 3D profile.
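The 2D/3D switching described above can be pictured like this. This is a Python sketch of the idea only (Afterburner isn't scripted this way, and the clock and power-limit numbers are placeholders, not recommendations):

```python
# Two presets mirroring the AB profiles described above (placeholder values).
PROFILE_2D = {"core_mhz": 300, "mem_mhz": 150, "power_limit_pct": 0}    # default/idle
PROFILE_3D = {"core_mhz": 1100, "mem_mhz": 1350, "power_limit_pct": 50} # OC, Power Limit maxed

def active_profile(app_is_3d: bool) -> dict:
    """AB detects a 3D application starting and applies the 3D preset;
    when it exits, the card falls back to the low-power 2D preset."""
    return PROFILE_3D if app_is_3d else PROFILE_2D

print(active_profile(False)["core_mhz"])  # idle clocks on the desktop -> 300
print(active_profile(True)["core_mhz"])   # full clocks under load -> 1100
```

The point is that the switch is automatic: you never have to remember to flip profiles before launching a game, and the card still idles cold on the desktop.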


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> don't combine AB and GPUZ. use just AB to monitor your GPU Core Clock testing Heaven.


Quote:


> Originally Posted by *tsm106*
> 
> Setup AB like in the pic below. Don't combine apps, as rdr mentioned use gpuz or overdrive while AB is running. Adding to that multiple diagnostic apps that are reading the same sensors can/will cause conflicts.
> 
> 
> 
> After configuring like so, setup two presets. One is default and one is for your oc/daily settings/whatever use, etc. Now go to settings/profiles and pick your default preset for 2D profile and other preset for 3D profile. Exit AB, and restart it to save settings. You card should now drop down to low power clocks while idle, and conversely maintain the clocks you chose for the 3D profile under load. Btw, be sure to max Power Limit on your 3D profile.


So, do you guys advise running Trixx and GPU-Z together, for example?
I do want another monitoring tool because AB doesn't monitor VRM temps (for me).

Just did some comparing of DSR with 3DMark.
What do you guys think? I hadn't expected such a big difference...

1080p:









1440p:









I guess I won't be using it in games...
At least not in new games.

Edit:

Dirt 3 on the other hand.









1080p:








Couldn't switch to window here so made the ss with Fraps.

1440p:








* I switched to window view to make the shot with snipping tool.

Looks almost like dsr isn't working.


----------



## zpaf

New driver, new score.

http://www.3dmark.com/3dm/5004011?


----------



## Chopper1591

Quote:


> Originally Posted by *zpaf*
> 
> New driver, new score.
>
> http://www.3dmark.com/3dm/5004011?


Would be nice if you also posted the scores with the previous driver...


----------



## zpaf

Quote:


> Originally Posted by *Chopper1591*
> 
> Would be nice if you also posted the scores with the previous driver...



http://www.3dmark.com/3dm/4948674?


----------



## LandonAaron

Quote:


> Originally Posted by *tsm106*
> 
> Setup AB like in the pic below. Don't combine apps, as rdr mentioned use gpuz or overdrive while AB is running. Adding to that multiple diagnostic apps that are reading the same sensors can/will cause conflicts.
> 
> 
> 
> After configuring like so, setup two presets. One is default and one is for your oc/daily settings/whatever use, etc. Now go to settings/profiles and pick your default preset for 2D profile and other preset for 3D profile. Exit AB, and restart it to save settings. You card should now drop down to low power clocks while idle, and conversely maintain the clocks you chose for the 3D profile under load. Btw, be sure to max Power Limit on your 3D profile.


I don't see how this will maintain a constant clock under load. Seems the only difference would be that the OC profile would be applied automatically when a 3d game is launched. The actual clock speed will still fluctuate below the set clock speed, the same as if you had manually selected the profile.


----------



## jagdtigger

Quote:


> Originally Posted by *zpaf*
> 
> New driver, new score.
>
> http://www.3dmark.com/3dm/5004011?


Nice, here is mine:
http://www.3dmark.com/3dm/4999066?

With previous driver
http://www.3dmark.com/3dm/4971240?

GPU=1220 MHz, VRAM=1500 MHz, VDDC offset +162 mV


----------



## Harry604

Quote:


> Originally Posted by *tsm106*
> 
> Setup AB like in the pic below. Don't combine apps, as rdr mentioned use gpuz or overdrive while AB is running. Adding to that multiple diagnostic apps that are reading the same sensors can/will cause conflicts.
> 
> 
> 
> After configuring like so, setup two presets. One is default and one is for your oc/daily settings/whatever use, etc. Now go to settings/profiles and pick your default preset for 2D profile and other preset for 3D profile. Exit AB, and restart it to save settings. You card should now drop down to low power clocks while idle, and conversely maintain the clocks you chose for the 3D profile under load. Btw, be sure to max Power Limit on your 3D profile.


Quote:


> Originally Posted by *LandonAaron*
> 
> I don't see how this will maintain a constant clock under load. Seems the only difference would be that the OC profile would be applied automatically when a 3d game is launched. The actual clock speed will still fluctuate below the set clock speed, the same as if you had manually selected the profile.


Quote:


> Originally Posted by *jagdtigger*
> 
> Nice, here is mine:
> http://www.3dmark.com/3dm/4999066?
> 
> With previous driver
> http://www.3dmark.com/3dm/4971240?
> 
> GPU=1230,1 MHz VRAM=1500 MHz


It worked, it does not downclock in games now.


----------



## kizwan

Quote:


> Originally Posted by *Harry604*
> 
> it worked does not downclock in games now


Do you see any difference in FPS compared to before?


----------



## s7Design

I've upgraded to a crossfire combination of an R9 290 and an R9 290X, all running at stock settings on stock coolers.


----------



## tsm106

Quote:


> Originally Posted by *Harry604*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Setup AB like in the pic below. Don't combine apps, as rdr mentioned use gpuz or overdrive while AB is running. Adding to that multiple diagnostic apps that are reading the same sensors can/will cause conflicts.
> 
> 
> 
> After configuring like so, setup two presets. One is default and one is for your oc/daily settings/whatever use, etc. Now go to settings/profiles and pick your default preset for 2D profile and other preset for 3D profile. Exit AB, and restart it to save settings. You card should now drop down to low power clocks while idle, and conversely maintain the clocks you chose for the 3D profile under load. Btw, be sure to max Power Limit on your 3D profile.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *LandonAaron*
> 
> I don't see how this will maintain a constant clock under load. Seems the only difference would be that the OC profile would be applied automatically when a 3d game is launched. The actual clock speed will still fluctuate below the set clock speed, the same as if you had manually selected the profile.
> 
> Click to expand...
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *jagdtigger*
> 
> Nice, here is mine:
> http://www.3dmark.com/3dm/4999066?
> 
> With previous driver
> http://www.3dmark.com/3dm/4971240?
> 
> GPU=1230,1 MHz VRAM=1500 MHz
> 
> Click to expand...
> 
> 
> 
> 
> 
> it worked does not downclock in games now
Click to expand...


----------



## pshootr

I just upgraded my old HD 5570 to a Sapphire R9 290 Tri-X OC and scored Hynix memory.


----------



## Regnitto

I started working on designing a full-card fan shroud for my red-modded R9 290. My card is a reference PCB, so once I'm done I should have a custom shroud that would fit any reference 290/290X. This will feature mounting for a 120mm fan over the VRMs and will bolt on using the existing reference cooler mounting holes and a backplate (my VisionTek R9 290 came with a nice backplate that has threaded M3x0.5 mounting holes). This is intended to be made on a 3D printer, and is 2mm thick, so it shouldn't add too much weight. Here are some screenshots of what I've got done so far; let me know if you think there is any market value in this... I may be talked into having extras made for the right price...




Keep in mind, this is still an early rendering. I still need to make an opening for the radiator hoses and the power connectors, make the mounting brackets, put some designs on it... lots of work left to do.


----------



## pshootr

I downloaded 3DMark since I installed a 290 Tri-X. The first time I ran Fire Strike the test completed, but the program froze my PC right after the last test. I had to use my reset button to reboot. The second time I started up 3DMark, I tried to uncheck (show demo) and my PC froze again. I was wondering if this is a known issue with an easy fix, or if I may be having instability issues? I just upgraded the card from an HD 5570, did a thorough uninstall/cleaning of previous AMD drivers, then rebooted and ran CCleaner before I installed the Omega drivers. I also just installed an FX-8320E CPU.

All clocks are stock for CPU and GPU for now.

Using Auto on everything in my bios (latest release) for now.


----------



## rdr09

Quote:


> Originally Posted by *pshootr*
> 
> I downloaded 3dmark since I installed a 290 tri-x. The first time I ran fire-strike the test completed but the program froze my pc rite after the last test. I had to use my reset button to reboot. Second time I started up 3dmark, I tried to uncheck (show-demo) and my pc froze again. I was wondering if this is a known issue with an easy fix, or if I may be having instability issues? I just upgraded the card from a HD5570, and did a thorough uninstall/cleaning of previous AMD drivers, and then rebooted and ran ccleaner before I installed the Omega drivers. I also just installed a FX-3820E CPU.
> 
> All clocks are stock for CPU and GPU for now.
> 
> Using Auto on everything in my bios (latest release) for now.


what's the rest of your components?


----------



## Chopper1591

Quote:


> Originally Posted by *pshootr*
> 
> I just upgraded my old HD5570 with a Sapphire R9 290 Tri-X OC and scored hynix memory.


That's a massive upgrade.
Quote:


> Originally Posted by *Regnitto*
> 
> I started working on designing a full card fan shroud for my red-modded R9-290. My card is a reference pcb, so once I'm done I should have a custom shroud that would fit any reference 290/290x. This will feature mounting for a 120mm fan over the VRMs and will bolt on using the existing reference cooler mounting holes and a backplate (My Visiontek R9 290 came with a nice backplate that has threaded m3 0.5 mounting holes). This is intended to be made on a 3D printer, and is 2mm thick, so It shouldn't add too much weight. Here are some screen shots of what i've got done so far, let me know if you think there is any market value in this.....I may be talked into having extras made for the right price......
> 
> 
> 
> 
> keep in mind, this is still an early rendering, still need to make an opening for the radiator hoses and the power connectors, and make the mounting brackets.........put some designs on it........lot of work left to do.


Will need to improve a lot before I will like it.
But I do like the idea.

Interested in how it will turn out, keep us updated.
Quote:


> Originally Posted by *pshootr*
> 
> I downloaded 3dmark since I installed a 290 tri-x. The first time I ran fire-strike the test completed but the program froze my pc rite after the last test. I had to use my reset button to reboot. Second time I started up 3dmark, I tried to uncheck (show-demo) and my pc froze again. I was wondering if this is a known issue with an easy fix, or if I may be having instability issues? I just upgraded the card from a HD5570, and did a thorough uninstall/cleaning of previous AMD drivers, and then rebooted and ran ccleaner before I installed the Omega drivers. I also just installed a FX-3820E CPU.
> 
> All clocks are stock for CPU and GPU for now.
> 
> Using Auto on everything in my bios (latest release) for now.


If that info in your sig is updated, you are seriously under-powered.
A 430W unit is way too small for your build.

You've got your problem right there.

If you'd read the box of your 290 Tri-X, you'd have seen that the requirement is a 750W PSU.

My vote goes to a SuperNOVA G2, for the value of it. Best bang for the buck IMHO.
Cheaper would be an M12II unit. Anything cheaper than that and you are sacrificing quality.


----------



## Raephen

Quote:


> Originally Posted by *Chopper1591*
> 
> You got your problem right there.
> 
> If you've read the box of you 290 tri-x you'd have seen that the requirements are a 750w psu.
> 
> My vote goes to a SuperNOVA G2, for the value of it. Best bang for the buck IMHO.
> Cheaper would be a M12II unit. Anything cheaper than that and you are sacrificing quality.


I agree: it's very likely a power issue.

But the minimum 750W requirement is a recommendation manufacturers make with a VERY wide margin, to account for all types of crappy components.

For instance: the Seasonic M12II you mentioned is a very good PSU for its price (I've used the 430W version in an HTPC). The 620W version would power his rig with more than enough juice. The 520W might also be up to the job, but that would be too narrow a margin for my personal liking to recommend to anyone -- I might do it for myself, but when giving advice I prefer to keep average use closer to 50%.

Before I bought my platinum XP2, I powered my rig with the Seasonic X-560. The only times I recall my watt meter at the wall going past 400W were when I ran stuff like FurMark.

Long story short: with a good quality PSU, don't put too much stock in minimum PSU requirements.
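That ~50% average-load rule of thumb is easy to put in numbers. A quick sketch with illustrative wattages (the ~400 W figure is the wall reading from the post above; wall draw includes PSU conversion losses, so the DC-side load is lower still):

```python
def load_fraction(system_draw_w: float, psu_rated_w: float) -> float:
    """Fraction of the PSU's rated capacity a given draw represents."""
    return system_draw_w / psu_rated_w

# ~400 W at the wall vs. the candidate PSU sizes mentioned in the thread
for rated in (430, 520, 620, 750):
    print(f"{rated} W PSU -> {load_fraction(400, rated):.0%} load")
```

On those numbers a 430 W unit would be running flat out, the 520 W sits near 77%, and only the 620 W and 750 W units leave the kind of headroom being recommended.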


----------



## Regnitto

Quote:


> Originally Posted by *Chopper1591*
> 
> Will need to improve a lot before I will like it.
> But I do like the idea.
> 
> Interested in how it will turn out, keep us updated.


That was just a couple hrs of dicking around in a program I've never used before. It will need to improve a lot before I like it enough to pay to have it 3D printed, lol. Just wanted to throw the concept up and see what people think of the idea. Thanks for the input, and I will post more pics of the concept as it comes along!


----------



## tsm106

But why bother? I'm not sure anyone, or the market, has asked for a full shroud, especially with a specific product already out there like the Corsair HG10 that addresses all the problems that have plagued CLC mod setups in the past. I think you've been beaten to the punch by the HG10.


----------



## battleaxe

Quote:


> Originally Posted by *Regnitto*
> 
> I started working on designing a full card fan shroud for my red-modded R9-290. My card is a reference pcb, so once I'm done I should have a custom shroud that would fit any reference 290/290x. This will feature mounting for a 120mm fan over the VRMs and will bolt on using the existing reference cooler mounting holes and a backplate (My Visiontek R9 290 came with a nice backplate that has threaded m3 0.5 mounting holes). This is intended to be made on a 3D printer, and is 2mm thick, so It shouldn't add too much weight. Here are some screen shots of what i've got done so far, let me know if you think there is any market value in this.....I may be talked into having extras made for the right price......
> 
> 
> 
> 
> keep in mind, this is still an early rendering, still need to make an opening for the radiator hoses and the power connectors, and make the mounting brackets.........put some designs on it........lot of work left to do.


Quote:


> Originally Posted by *tsm106*
> 
> But why bother? I'm not sure anyone or the market has asked for a full shroud? And especially with a specific product on the market like the Corsair HG10 that addresses all the problems that have plagued CLC mod setups in the past. I think you've been beat to the punch by the HG10.


Maybe because he wanted it? Which kinda means someone else might want one too?


----------



## Scorpion49

So, I now have an XFX 290X DD 8GB card. I'm wondering if an HG10 will fit it; there are no memory modules on the back, so I'm assuming they just used double-density packages on the front and left the rest alone. I might try to take it apart tomorrow and see what's up with the PCB.


----------



## tsm106

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> No one asked for your opinion.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No one asked for yours either, now did they?
> 
> And by the way, ever notice that my results were indeed legitimate? How's that crow taste?
Click to expand...

I asked him a question, now go away.

You're the genius who claimed silly temps right? Ah I remember now.


----------



## pshootr

Quote:


> Originally Posted by *pshootr*
> 
> I downloaded 3dmark since I installed a 290 tri-x. The first time I ran fire-strike the test completed but the program froze my pc rite after the last test. I had to use my reset button to reboot. Second time I started up 3dmark, I tried to uncheck (show-demo) and my pc froze again. I was wondering if this is a known issue with an easy fix, or if I may be having instability issues? I just upgraded the card from a HD5570, and did a thorough uninstall/cleaning of previous AMD drivers, and then rebooted and ran ccleaner before I installed the Omega drivers. I also just installed a FX-3820E CPU.
> 
> All clocks are stock for CPU and GPU for now.
> 
> Using Auto on everything in my bios (latest release) for now.


[Edit:]

Sorry, I forgot to update my sig. I am using a Rosewill Photon 850W PSU. It seems I've read some posts about disabling 2D acceleration to stop issues similar to this? So far it has only happened during 2D operations, such as the 2D interface of 3DMark, and one time with Firefox. The system did not freeze during the 3D portion of 3DMark.


----------



## battleaxe

Quote:


> Originally Posted by *tsm106*
> 
> I asked him a question, now go away.
>
> You're the genius who claimed silly temps right? Ah I remember now.


Yeah, I'm that genius and proved those results were correct, valid, and posted plenty of data to back it up.

So now you're onto trolling the next person huh?


----------



## silencespr

boom


----------



## Archea47

Hey Team,

Can anyone confirm that the aquacomputer full cover and active back plates fit the Sapphire tri-x 290x?

Thanks!


----------



## boredmug

I wouldn't mind something like that for my cards. I use the stock heat spreader and ek supremacy bridge edition gpu blocks. Would be nice to have something that worked with that so I could exhaust any hot air from the vrms out the back like it was intended.
Quote:


> Originally Posted by *tsm106*
> 
> But why bother? I'm not sure anyone or the market has asked for a full shroud? And especially with a specific product on the market like the Corsair HG10 that addresses all the problems that have plagued CLC mod setups in the past. I think you've been beat to the punch by the HG10.


----------



## Mega Man

Quote:


> Originally Posted by *boredmug*
> 
> I wouldn't mind something like that for my cards. I use the stock heat spreader and ek supremacy bridge edition gpu blocks. Would be nice to have something that worked with that so I could exhaust any hot air from the vrms out the back like it was intended.
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> But why bother? I'm not sure anyone or the market has asked for a full shroud? And especially with a specific product on the market like the Corsair HG10 that addresses all the problems that have plagued CLC mod setups in the past. I think you've been beat to the punch by the HG10.
Click to expand...

Not to argue, but you do realize how much that CLC blocks airflow?


----------



## boredmug

Well, I would only want it for VRM cooling, and I don't use a CLC system. It would still suck cool air in across the VRMs with the heat spreader attached and exhaust it out the back of the case. I don't care much about the GPU, as the water from the universal block is cooling it.
Quote:


> Originally Posted by *Mega Man*
> 
> not to argue but you do realize how much that clc blocks airflow ?


----------



## Regnitto

Quote:


> Originally Posted by *boredmug*
> 
> I wouldn't mind something like that for my cards. I use the stock heat spreader and ek supremacy bridge edition gpu blocks. Would be nice to have something that worked with that so I could *exhaust any hot air from the vrms out the back like it was intended*.


Missed your post earlier. Glad you like the Idea. The rear exhaust is one of the reasons I started this project.
Quote:


> Originally Posted by *Mega Man*
> 
> not to argue but you do realize how much that clc blocks airflow ?


The CLC I used has a pretty low-profile pump. I know it will be a restriction, but it shouldn't be a really big one.


----------



## MrWhiteRX7




Quote:


> Originally Posted by *tsm106*
> 
> Setup AB like in the pic below. Don't combine apps, as rdr mentioned use gpuz or overdrive while AB is running. Adding to that multiple diagnostic apps that are reading the same sensors can/will cause conflicts.
> 
> 
> 
> After configuring like so, setup two presets. One is default and one is for your oc/daily settings/whatever use, etc. Now go to settings/profiles and pick your default preset for 2D profile and other preset for 3D profile. Exit AB, and restart it to save settings. You card should now drop down to low power clocks while idle, and conversely maintain the clocks you chose for the 3D profile under load. Btw, be sure to max Power Limit on your 3D profile.






I LOVE YOU!

Using this method is so easy and works great. Definitely helps, check out my BF4 frame times before and after setting up MSI to work in this fashion.







I have mostly just overclocked with CCC since I never have to touch my voltages... I guess my clocks weren't staying static. WOOOOO!!! That BF4 run was with 3 cards jamming too, obviously immense overkill for the FPS I cap at, but I don't like disabling and re-enabling GPUs all the time.


----------



## rt123

Quote:


> Originally Posted by *tsm106*
> 
> Setup AB like in the pic below. Don't combine apps, as rdr mentioned use gpuz or overdrive while AB is running. Adding to that multiple diagnostic apps that are reading the same sensors can/will cause conflicts.
> 
> 
> 
> After configuring like so, setup two presets. One is default and one is for your oc/daily settings/whatever use, etc. Now go to settings/profiles and pick your default preset for 2D profile and other preset for 3D profile. Exit AB, and restart it to save settings. You card should now drop down to low power clocks while idle, and conversely maintain the clocks you chose for the 3D profile under load. Btw, be sure to max Power Limit on your 3D profile.


So a question for you.

Do you enable Unofficial Overclocking Mode while benching on the 290XL?
Also, with or without PowerPlay?

Thank You.


----------



## zpaf

Best cooler for R9 290.
Here is with my daily clocks.


----------



## Aaron_Henderson

So maybe I can get past 11,000 in Firestrike, but I will need a new cooler...any air coolers out there that will get the job done? Not likely, eh? Right now, I am only able to get my 290X to 1170 core @ +50, so I am definitely getting heat artifacts. I am nearly positive I can go much higher if I can keep the temps in check, which the stock cooling is failing to do at anything past these clocks/volts.

I have a spare H80 I could use for "the red mod", but it seems kind of ghetto, to be honest, and I don't believe the Kraken works with the H80. I am not really into spending $100 to cool this thing; at that price, I might as well just pick up a 140 rad + full cover block and add it to my current loop. I guess a universal block and the 140 rad would be a little more cost efficient, but again, kind of ghetto, IMO.

Right now I am able to score into the 10,700 range...which is OK, but I would like to get my 290X up to 1250-1300 core. That should be more than enough to be the highest scoring 2500K+290X, and by quite a bit.


----------



## jagdtigger

How many physics points do you get with that CPU? I'm just curious, since my previous CPU was a 2500K.


----------



## Aaron_Henderson

Quote:


> Originally Posted by *jagdtigger*
> 
> How many physics points do you get with that CPU? I'm just curious, since my previous CPU was a 2500K.


9000 for physics.


----------



## jagdtigger

Nice







, my 4670K gets only 8877 at 4.5 GHz...


----------



## Aaron_Henderson

Quote:


> Originally Posted by *jagdtigger*
> 
> Nice
> 
> 
> 
> 
> 
> 
> 
> , my 4670K gets only 8877 at 4.5 GHz...


Thanks man, I can probably go a bit higher too, just overclocked my RAM a bit, and still haven't done any runs at non-daily settings. I can probably get a run in at 5.1-5.2 GHz, though it wouldn't be stable for 24/7 use.


----------



## 21cage12

Good stuff, I'm getting the PowerColor R9 290, the best 290 IMO, though not in terms of beauty. I might need to polish it up or change the cooler, maybe polish it, because that cooler does the job really well!


----------



## joeh4384

Quote:


> Originally Posted by *21cage12*
> 
> Good stuff, I'm getting the PowerColor R9 290, the best 290 IMO, though not in terms of beauty. I might need to polish it up or change the cooler, maybe polish it, because that cooler does the job really well!


The PCS doesn't look too bad. The HIS one is pretty ugly though.


----------



## rdr09

Quote:


> Originally Posted by *Archea47*
> 
> Hey Team,
> 
> Can anyone confirm that the aquacomputer full cover and active back plates fit the Sapphire tri-x 290x?
> 
> Thanks!


the Tri-X is based on the reference design and the aqua block is as well, so they should be fine. i recommend taking a pic of the pcb on the gpu and posting it here so others can verify.


----------



## Chopper1591

Quote:


> Originally Posted by *Raephen*
> 
> I agree: it's very likely a power issue.
> 
> But the minimum 750W requirement is a recommendation manufacturers make with a VERY wide margin, to account for all types of crappy components.
> 
> For instance: the Seasonic M12II you mentioned is a very good PSU for its price (I've used the 430W version in an HTPC). The 620W version would power his rig with more than enough juice. The 520 might also be up to the job, but that would be too narrow a margin for my personal liking to recommend to anyone -- I might do it for myself, but when giving advice I prefer to keep average use closer to 50%.
> 
> Before I bought my platinum XP2, I powered my rig with the Seasonic 560-X. The only times I recall my watt meter at the wall going past 400W were when I ran stuff like FurMark.
> 
> Long story short: with a good quality PSU, don't put too much stock in minimum PSU requirements.


Yeah, I know that.
Personally I wouldn't want a 560w unit to power my rig.

I think the 750w I have now is sometimes on the small side...
My cpu is power hungry though...

And I've been using A-grade PSUs for the past few years.

IMO it's better to spend some more than to save a few bucks and be sorry later.
Been there, done that.









Quote:


> Originally Posted by *tsm106*
> 
> But why bother? I'm not sure anyone or the market has asked for a full shroud? And especially with a specific product on the market like the Corsair HG10 that addresses all the problems that have plagued CLC mod setups in the past. I think you've been beat to the punch by the HG10.


I agree on that.
Corsair nailed it IMO.

Quote:


> Originally Posted by *pshootr*
> 
> [Edit:]
> 
> Sorry, I forgot to update my sig. I am using a "Rosewill Photon 850W" PSU. It seems that I read some posts about disabling 2D acceleration to stop some sort of issues similar to this? So far it has only happened during 2D operations such as the 2D interface from 3Dmark, and one time with firefox. System did not freeze during the 3D portion of 3Dmark.


Ah, ok.









Did you solve it already?
I really think you should contact the store if you bought the card new...
Needing to change stuff to make it stable at stock is a no go IMO.

It's been hashed out here already.
The conclusion was: your card runs at too low a voltage, thus creating less heat and being less stable...
Done. Now move on and forget it.

Quote:


> Originally Posted by *Archea47*
> 
> Hey Team,
> 
> Can anyone confirm that the aquacomputer full cover and active back plates fit the Sapphire tri-x 290x?
> 
> Thanks!


I dare say it fits.
Haven't tried it myself, but the PCB is totally reference, so it should fit.









Quote:


> Originally Posted by *zpaf*
> 
> Best cooler for R9 290.
> Here is with my daily clocks.


Care to post a better picture?
I can't read it at all.


----------



## kizwan

Quote:


> Originally Posted by *Chopper1591*
> 
> Quote:
> 
> 
> 
> Originally Posted by *zpaf*
> 
> Best cooler for R9 290.
> Here is with my daily clocks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Care to post a better picture?
> I can't read it at all.

Right-click on the picture & choose open in new tab.


----------



## Chopper1591

Quote:


> Originally Posted by *kizwan*
> 
> Right-click on the picture & choose open in new tab.


Ehmm..
I can't?

That's why I asked...
Normally I just click on it so it shows as a pop-up and then, if needed, click on "Original" in the lower right corner.

Can you open the furmark shot?


----------



## kizwan

Quote:


> Originally Posted by *Chopper1591*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Right-click on the picture & choose open in new tab.
> 
> 
> 
> Ehmm..
> I can't?
> 
> That's why I asked...
> Normally I just click on it so it shows as a pop-up and then, if needed, click on "Original" in the lower right corner.
> 
> Can you open the furmark shot?

Yes, I can open the furmark shot.

EDIT: My bad. Depending on what browser you're using, you probably need to copy the image link & open it in a new tab/window. It's three steps: right-click on the picture, copy the image link & paste it in a new tab/window. The steps I gave earlier work if you're using Chrome.


----------



## pdasterly

Machine is overheating after the new drivers. Idk if the Omega drivers are the cause, but when I play Shadow of Mordor my GPU core/VRM1 temps skyrocket.


----------



## LandonAaron

I am playing Bioshock 1 in DX10 mode currently and searching for an anti-aliasing solution. Nothing I have tried seems to work, including SMAA injector, SweetFX SMAA, Radeon Pro SMAA, Radeon Pro MLAA, and Catalyst Morphological Filtering. The only thing I haven't really tried so far is downsampling, as I have only ever tried it once on an Nvidia card and am not really sure how to do it on AMD.

Does anyone know of an easy-to-follow guide for downsampling with AMD?


----------



## Sgt Bilko

Well, I've transferred my R9 290s over to my wife's rig (FX-9590 + R9 290) and my secondary rig (FX-6300 + R9 290), and it's working just fine on a 550W PSU


----------



## kizwan

Quote:


> Originally Posted by *LandonAaron*
> 
> I am playing Bioshock 1 in DX10 mode currently and searching for an Anti-aliasing solution. Nothing I have tried seems to work including SMAA injector, SweetFX SMAA, Radeon Pro SMAA, Radeon Pro MLAA, and Catalyst Morphological Filtering. The only thing I haven't really tried so far is Down-sampling, as I have only ever tried it once on an Nvidia card, and am not really sure how to do it on AMD.
> 
> Does anyone know of an easy to follow guide for down-sampling with AMD?


Correct me if I'm wrong but I thought you can do that with Omega drivers.


----------



## hyp36rmax

Look what showed up to the office today. One out of two of my Sapphire R9 290x Vapor-X 8GB cards arrived!


----------



## LandonAaron

Quote:


> Originally Posted by *kizwan*
> 
> Correct me if I'm wrong but I thought you can do that with Omega drivers.


Okay cool. Yes, it works. It's a new option called Virtual Super Resolution. The max resolution available after enabling it was 3200x1800 on my 1920x1080 monitor. It did help a bit with the aliasing but did not completely remove it; I think regular anti-aliasing techniques would look better. The performance impact seemed relatively minor, only increasing VRAM from 664 MB to 808 MB, and GPU usage went from about 25% to 35% (Bioshock is incredibly easy to run nowadays!)
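For anyone curious about the cost of VSR, the pixel-count arithmetic behind those two resolutions can be sketched quickly. This is just illustrative math, not anything measured in the post above:

```python
# Rough pixel-count math behind VSR: the GPU renders at the virtual
# resolution, then scales the frame down to the native panel.
def pixels(width, height):
    return width * height

native = pixels(1920, 1080)   # the 1080p panel
vsr = pixels(3200, 1800)      # the max VSR mode reported above

ratio = vsr / native
print(f"VSR renders {ratio:.2f}x the native pixel count")
```

Note that the ~2.8x pixel load doesn't translate into 2.8x VRAM or GPU usage; framebuffers are only one part of the memory footprint, which is consistent with the modest 664 MB to 808 MB jump reported.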


----------



## rdr09

Quote:


> Originally Posted by *pdasterly*
> 
> Machine is overheating after the new drivers. Idk if the Omega drivers are the cause, but when I play Shadow of Mordor my GPU core/VRM1 temps skyrocket.


here were my temps in BF4 with 2 290s after about 2 hours of play. ambient is between 22-25C. This is with 3 120 rads cooling the cpu as well.


Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!







How are they? I don't like to see 60C, but it may have to wait.


----------



## boredmug

You need more rad.
Quote:


> Originally Posted by *rdr09*
> 
> here were my temps in BF4 with 2 290s after about 2 hours of play. ambient is between 22-25C. This is with 3 120 rads cooling the cpu as well.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> How are they? I don't like to see 60C, but it may have to wait.


----------



## rdr09

Quote:


> Originally Posted by *boredmug*
> 
> You need more rad.


i need a new case, then more rad space. thanks


----------



## hyp36rmax

Quote:


> Originally Posted by *rdr09*
> 
> i need a new case, then more rad space. thanks


I agree, your rads are fine, maybe look into your fan curves. What fans are you using? Case?


----------



## boredmug

I always thought the point of water cooling was for the system to be quieter than air cooling and stay cooler. You are trying to dissipate an awful lot of heat with not much rad space. Higher speed fans might help a little, but the difference between lowest setting and full speed on my fan controller only makes a few degrees difference at most.
Quote:


> Originally Posted by *hyp36rmax*
> 
> I agree, your rads are fine, maybe look into your fan curves. What fans are you using? Case?


----------



## rdr09

Quote:


> Originally Posted by *hyp36rmax*
> 
> I agree, your rads are fine, maybe look into your fan curves. What fans are you using? Case?


i only have a 912. never thought going xfire. cougar fans are dead silent but i can hear the pump. i am thinking of Air 540 maybe. all the fans are just hooked up to the motherboard. yah, i like the silence.


----------



## boredmug

Do a little research and find a case you like that can support at least 5x120mm worth of rad. That's what I would suggest.


----------



## hyp36rmax

Quote:


> Originally Posted by *boredmug*
> 
> *I always thought the point of water cooling was for the system to be quieter than air cooling and stay cooler.* You are trying to dissipate an awful lot of heat with not much rad space. Higher speed fans might help a little, but the difference between lowest setting and full speed on my fan controller only makes a few degrees difference at most.


Well, yes and no. Just like air cooling, you can go for maximum performance and sacrifice sound, or vice versa. The possibilities are endless; you can keep adding radiators up to the point where your temperature delta won't change much. There are many factors, such as radiator type (high or low FPI), fans with higher static pressure, the fan curves you set, and your ambient temperature, that all feed into your final temperature delta.

I personally have a build with *one* 120mm radiator cooling both a GTX 780Ti and an i7 4770k in a Cooler Master Elite 130 with awesome temps









*Source:* Link


Spoiler: #beastMODE Temperatures



*Benchmarks*

*3DMark Firestrike*


Spoiler: 3D Mark: Firestrike



*Score:* 9957
*CPU Core:* 3.9 GHz (Stock)
*GPU Core:* 1046 MHz (Overclocked)

*CPU temperature (Min) / (Max):* 39C / 43C
*GPU temperature (Min) / (Max):* 36C / 44C

*Link:* http://www.3dmark.com/3dm/4196284

*3DMark Firestrike*





*HWINFO 64 Intel i7 4770K*


*HWINFO 64 Nvidia GTX 780Ti*




*Battlefield 4 Multiplayer*


Spoiler: Battlefield 4!



*CPU Core:* 3.9 GHz (Stock)
*GPU Core:* 1046 MHz (Overclocked)

*CPU temperature (Min) / (Max):* 34C / 45C
*GPU temperature (Min) / (Max):* 31C / 47C

*Battlefield 4*







*HWINFO 64 Intel i7 4770K*


*HWINFO 64 Nvidia GTX 780Ti*




*Unigine Heaven 4.0*


Spoiler: Unigine Heaven 4.0



*Score:* 1485
*FPS:* 58.9
*Min FPS:* 8.5
*Max FPS:* 122.3
*CPU Core:* 3.9 GHz (Stock)
*GPU Core:* 1046 MHz (Overclocked)

*CPU temperature (Min) / (Max):* 33C / 40C
*GPU temperature (Min) / (Max):* 30C / 43C

*Unigine Heaven 4.0*







*HWINFO 64 Intel i7 4770K*



*HWINFO 64 Nvidia GTX 780Ti*





*Metro Last Light*


Spoiler: Metro Last Light



*CPU Core:* 3.9 GHz (Stock)
*GPU Core:* 1046 MHz (Overclocked)

*Ambient temperature:* 23C
*CPU temperature (Min) / (Max):* 36C / 43C
*GPU temperature (Min) / (Max):* 34C / 45C

*Metro Last Light*









*HWINFO 64 Intel i7 4770K*



*HWINFO 64 Nvidia GTX 780Ti*







Quote:


> Originally Posted by *rdr09*
> 
> i only have a 912. never thought going xfire. cougar fans are dead silent but i can hear the pump. i am thinking of Air 540 maybe. all the fans are just hooked up to the motherboard. yah, i like the silence.


Yes the HAF 912 can be a really tight fit for crossfire.


----------



## melodystyle2003

Quote:


> Originally Posted by *rdr09*
> 
> i only have a 912. never thought going xfire. cougar fans are dead silent but i can hear the pump. i am thinking of Air 540 maybe. all the fans are just hooked up to the motherboard. yah, i like the silence.


The 540's stock fans are not quiet, but at 7V they work pleasantly, providing adequate airflow with low noise.


----------



## boredmug

What kinda rad would that be? Lemme guess? Something with the name Monsta in it? Lol
Quote:


> Originally Posted by *hyp36rmax*
> 
> Well yes and no, just like air cooling you can go for maximum performance hence sacrificing sound and vice versa. There are many factors such as the type of radiators (High or Low FPI), Fans with higher static pressure, the fan curves you set, your ambient is all relative to your final temperature delta.
> 
> I personally have a build with *one* 120mm radiator cooling both a GTX 780Ti and an i7 4770k in a Cooler Master Elite 130 with awesome temps
> 
> *snip*


----------



## hyp36rmax

Quote:


> Originally Posted by *boredmug*
> 
> What kinda rad would that be? Lemme guess? Something with the name Monsta in it? Lol


Of course







, It's still a 120mm radiator in an SFF case... I probably could have gotten away with the UT60 as well. rdr09 also stated his goal was silence, sacrificing some performance in heat dissipation; I'm sure if he mounted some Gentle Typhoons and set his fan curves a little more aggressively he could shed a few more degrees. You make it sound like there is only one cookie-cutter way to build a rig...


----------



## boredmug

No, I see your point. I checked out your rig; it's impressive. BUT he's obviously running thinner rads like I have in my rig. Even with faster fans I doubt he's going to see temps that he is satisfied with. Curious what fan speeds you achieved those temps with? I'm assuming a big fatty responds better the more you crank the fans up
Quote:


> Originally Posted by *hyp36rmax*
> 
> Of course
> 
> 
> 
> 
> 
> 
> 
> , It's still a 120mm radiator in an SFF case... I probably could have gotten away with the UT60 as well. rdr09 also stated his goal was silence, sacrificing some performance in heat dissipation; I'm sure if he mounted some Gentle Typhoons and set his fan curves a little more aggressively he could shed a few more degrees. You make it sound like there is only one cookie-cutter way to build a rig...


----------



## rdr09

Quote:


> Originally Posted by *melodystyle2003*
> 
> The 540's stock fans are not quiet, but at 7V they work pleasantly, providing adequate airflow with low noise.


i have cougar fans.


----------



## reedy777

Quote:


> Originally Posted by *rdr09*
> 
> i need a new case, then more rad space. thanks


I have a Corsair Air 540 with 1x240 and 1x360 rads, both in push. 2600K @ 5GHz 1.42V, two 290Xs @ 1100MHz +56mV. Max temps ~60C CPU, ~50C GPUs, ambient 20-25C. Really like the case, would recommend.


----------



## rdr09

Quote:


> Originally Posted by *reedy777*
> 
> I have a Corsair Air 540 with 1x240 and 1x360 rads, both in push. 2600K @ 5GHz 1.42V, two 290Xs @ 1100MHz +56mV. Max temps ~60C CPU, ~50C GPUs, ambient 20-25C. Really like the case, would recommend.


thanks. i think i'll head over to microcenter tomorrow and pick one up. replace 2 of my 120s to 240s.


----------



## tsm106

Quote:


> Originally Posted by *Regnitto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *boredmug*
> 
> I wouldn't mind something like that for my cards. I use the stock heat spreader and ek supremacy bridge edition gpu blocks. Would be nice to have something that worked with that so I could *exhaust any hot air from the vrms out the back like it was intended*.
> 
> 
> 
> Missed your post earlier. Glad you like the Idea. The rear exhaust is one of the reasons I started this project.
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> not to argue but you do realize how much that clc blocks airflow ?
> 
> 
> The CLC I used has a pretty low profile pump. I know it will be a restriction, but shouldn't be a really big one
Click to expand...

It will be restrictive because the pump is in the path of the airflow. There's no way to get around the lump of block/pump and tubes, which are far from aerodynamic. As you can see below, there's _tons_ of room for airflow. Note the products on the market are designed with the tubes on the horizontal plane, not vertical. And imo the bulk of the heat produced is from the core, and that is taken care of by the CLC, so the heat from the VRMs, if it is dumped back into the case, should really not be a problem. And if it were a problem, I suspect one would have more pressing issues to deal with first.


----------



## wermad




----------



## tsm106

^^Nice, blocks on!

Quote:


> Originally Posted by *rt123*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Setup AB like in the pic below. Don't combine apps; as rdr mentioned, use GPU-Z or OverDrive while AB is running. Adding to that, multiple diagnostic apps reading the same sensors can and will cause conflicts.
> 
> 
> 
> After configuring like so, set up two presets: one is default and one is for your OC/daily settings/whatever use, etc. Now go to Settings/Profiles and pick your default preset for the 2D profile and the other preset for the 3D profile. Exit AB and restart it to save settings. Your card should now drop down to low power clocks while idle, and conversely maintain the clocks you chose for the 3D profile under load. Btw, be sure to max Power Limit on your 3D profile.
> 
> 
> 
> So a question for you.
> 
> Do you Enable Unofficial Overclocking Mode while Benching on the 290XL.?
> Also With or Without PowerPlay.?
> 
> Thank You.

On the 290XL, I've been using the official method. I feel AB has evolved quite a bit, and when you get the settings right there are NO issues, whether 24/7 or benching. However, it really comes down to what method is open to you. The Lightning bios is pretty relaxed restriction-wise until you get to around +350mv, starting at 1.25v. But it only works with AB. The reference bios, on the other hand, sucks stock, so you have to swap to the PT bios, and that bios really needs GPU Tweak to work its magic, and GPU Tweak is unofficial only. Iirc Trixx is unofficial, or at least the older ones were. It's strange but I still haven't used the new Trixx with Hawaii lol. Thus, it depends on the app and type of card I'm using.


----------



## rt123

Quote:


> Originally Posted by *tsm106*
> 
> ^^Nice, blocks on!
> On the 290XL, I've been using the official method. I feel AB has evolved quite a bit, and when you get the settings right there are NO issues, whether 24/7 or benching. However, it really comes down to what method is open to you. The Lightning bios is pretty relaxed restriction-wise until you get to around +350mv, starting at 1.25v. But it only works with AB. The reference bios, on the other hand, sucks stock, so you have to swap to the PT bios, and that bios really needs GPU Tweak to work its magic, and GPU Tweak is unofficial only. Iirc Trixx is unofficial, or at least the older ones were. It's strange but I still haven't used the new Trixx with Hawaii lol. Thus, it depends on the app and type of card I'm using.


Okay, thanks. I have been using both unofficial & official together and had no issues; I was just wondering if I was doing something wrong.

The +350mv debacle has more to do with the 0.95 rail mod, I think.

PT Bios on the Lightning..?


----------



## Arizonian

Quote:


> Originally Posted by *s7Design*
> 
> I've upgraded to a crossfire combination of a R9 290 and a R9 290x all running on stock settings and stock coolers
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added both









Quote:


> Originally Posted by *pshootr*
> 
> I just upgraded my old HD5570 with a Sapphire R9 290 Tri-X OC and scored hynix memory.


Post a GPU-Z link with the validation tab open showing your OCN name and I will add you to the roster.

Anyone else reading who hasn't done so yet, please do so as well.









Quote:


> Originally Posted by *hyp36rmax*
> 
> Look what showed up to the office today. One out of two of my Sapphire R9 290x Vapor-X 8GB cards arrived!
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Chopper1591

Quote:


> Originally Posted by *hyp36rmax*
> 
> Look what showed up to the office today. One out of two of my Sapphire R9 290x Vapor-X 8GB cards arrived!


Oeh, nice.
Congrats.

What's up with those, are they EOL?
I can't seem to find them in my country any more.
Quote:


> Originally Posted by *LandonAaron*
> 
> Okay cool. Yes it works. Its a new option called Virtual Super Resolution. Max resolution available after enabling it was 3200 x 1800, on my 1920 x 1080 monitor. It did help a bit with the aliasing but did not completely remove it. I think regular Anti-Aliasing techniques would look better. Performance impact seemed relatively minor, only increasing VRAM from 664 MB to 808MB, and GPU usage went from about 25% to 35% (Bioshock is incredibly easy to run now days!)


Hehe,
Of course.

You are talking about a near-ancient game here.
PCs advance really fast.









Aren't there some mods or something to get proper AA?

I play CSS at 1440p, everything maxed with 8xAA, averaging ~250 fps.








Quote:


> Originally Posted by *rdr09*
> 
> i need a new case, then more rad space. thanks


Or you could always grab a rad-box, or mount it externally.

I do agree that 120.3 is pretty small for cpu+dual gpu.
Quote:


> Originally Posted by *hyp36rmax*
> 
> I agree, your rads are fine, maybe look into your fan curves. What fans are you using? Case?


Nope.
More rad is the way to go here IMO.
Quote:


> Originally Posted by *boredmug*
> 
> I always thought the point of water cooling was for the system to be quieter than air cooling and stay cooler. You are trying to dissipate an awful lot of heat with not much rad space. Higher speed fans might help a little, but the difference between lowest setting and full speed on my fan controller only makes a few degrees difference at most.


Most think that way, I think.









But there are also plenty of guys who run 2000+ RPM fans all the time.
It's really personal. If you want sub-800 RPM fans you need a lot of rad space, whereas if you were running 1500 RPM, not so much.
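To put a rough number on the rad-space debate: a common community rule of thumb (not from this thread, and very approximate) is about one 120mm radiator section per ~100W of heat at moderate fan speeds, with extra sections needed for a low-RPM silent build. A sketch of that estimate, using hypothetical wattages:

```python
import math

# Hedged rule of thumb: ~100W of heat per 120mm radiator section at
# moderate fan speeds. Real capacity varies hugely with rad thickness,
# FPI, and fan RPM, so treat these as ballpark numbers only.
def rad_sections_needed(total_watts, watts_per_section=100):
    return math.ceil(total_watts / watts_per_section)

cpu_heat = 200        # assumption: heavily overclocked CPU
gpu_heat = 2 * 300    # assumption: two overvolted 290s
sections = rad_sections_needed(cpu_heat + gpu_heat)
print(f"~{sections} x 120mm of radiator")
```

By this (conservative) estimate a 120.3 rad is well short for a CPU plus two 290s, which matches the advice in the thread.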


----------



## Scorpion49

Quote:


> Originally Posted by *hyp36rmax*
> 
> Look what showed up to the office today. One out of two of my Sapphire R9 290x Vapor-X 8GB cards arrived!
> 
> *snip*


Hah! I'm not the only sucker who bought an 8GB card.


----------



## Chopper1591

Dude...

What's up with all those PSU calculators?
I'm starting to doubt that my PSU likes the current components.

FX-8320 @ 5.0 GHz, 1.575V.
290 Tri-X @ 1200(+) when under water in a few weeks.

Or do you think I'm stressing over nothing and my HX750 will last?


----------



## Raephen

Quote:


> Originally Posted by *Chopper1591*
> 
> Dude...
> 
> What's up with all those PSU calculators?
> I'm starting to doubt that my PSU likes the current components.
> 
> FX-8320 @ 5.0 GHz, 1.575V.
> 290 Tri-X @ 1200(+) when under water in a few weeks.
> 
> Or do you think I'm stressing over nothing and my HX750 will last?


In my opinion it should hold, even if you pushed your FX way past 1.4 V and 5.0 GHz.

But, if you are stressing out, buy a cheap watt meter. I think that would ease your mind more than enough.

I used to own a 990X Sabertooth, and while I only used a FX 4170 @ 4.8 GHz with sub 1.4 V Vcore along with a HD 7870, I really think you have enough juice for your system.

But get yourself an early Christmas present and buy a watt meter
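Until a watt meter arrives, a back-of-envelope load estimate can be sketched like this. All wattages below are assumptions for illustration, not measurements from this thread:

```python
# Hypothetical component draws for a heavily overclocked FX + 290 rig.
cpu_w = 250    # rough guess: FX-8320 pushed to 5.0 GHz at 1.575 V
gpu_w = 300    # rough guess: R9 290 at ~1200 MHz under water
rest_w = 75    # motherboard, RAM, drives, fans, pump

psu_rating = 750
load = cpu_w + gpu_w + rest_w
print(f"Estimated load: {load} W "
      f"({load / psu_rating:.0%} of a {psu_rating} W unit)")
```

Even with these pessimistic guesses the load sits around 80-85% of the rating under worst-case stress, so a quality 750W unit has headroom, though less margin than the ~50% average-use target mentioned earlier.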


----------



## hyp36rmax

Quote:


> Originally Posted by *Scorpion49*
> 
> Hah! I'm not the only sucker who bought an 8GB card.


Lol!


----------



## mAs81

Hey guys, I've found the culprit behind my high idle temps.. Seems like something is fried on my card


Spoiler: Warning: Spoiler!






and unfortunately only the middle fan works while the other two don't..(I made sure I had IFC off)
After one run of Unigine Valley this is what it looked like :


I've opened a ticket with Sapphire but I'm not very optimistic about it..

Is it safe to run my card with only one fan, seeing that even though the temps are 80C+ my VRMs are well below?


----------



## hyp36rmax

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added both
> 
> 
> 
> 
> 
> 
> 
> 
> Post a GPU-Z link with validation tab open with OCN name, will add you to the roster.
> 
> Also anyone else reading, who hasn't done so yet, please do so.
> 
> 
> 
> 
> 
> 
> 
> 
> *Congrats - added*


Thank you thank you! Now to wait for the second card to come in to update the xXcrossfireXx Club








Quote:


> Originally Posted by *Chopper1591*
> 
> Oeh, nice.
> Congrats.
> 
> *What's up with those, are they EOL?
> I can't seem to find them in my country any more.*
> Hehe,
> Of course.
> 
> You are talking about a near-ancient game here.
> PCs advance really fast.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Aren't there some mods or something to get proper AA?
> 
> I play CSS at 1440p, everything maxed with 8xAA, averaging ~250 fps.
> 
> 
> 
> 
> 
> 
> 
> 
> Or you could always grab a rad-box, or mount external.
> 
> I do agree that 120.3 is pretty small for cpu+dual gpu.
> Nope.
> More rad is the way to go here IMO.
> Most think that way, I think.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But there are also plenty of guys who run 2000+ RPM fans all the time.
> It's really personal. If you want sub-800 RPM fans you need a lot of rad space, whereas if you were running 1500 RPM, not so much.


Yes our CM Storm Trigger has been discontinued.


----------



## Gumbi

Probably the worst place to post this, but I'm considering grabbing a Vapor-X 290 as a replacement for my 7950. The 290 Tri-X seems to be the best value (almost the best cooler but not the most expensive); however, all the little perks the Vapor-X has over the Tri-X seem worth the 20 euro, and **** it, it's Christmas









- Slightly better cooling
- Custom PCB and binned chip = slightly better OC on average
- Backplate is always nice on a big card
- LEDs are pretty sexy, and the blue colour scheme should meld with my Gigabyte board

My CPU is a 4790K clocked at 4.7 GHz. I'm thinking of upgrading my 21.5" 1080p monitor to a 144Hz 1080p screen or a 1440p one; however, nVidia wrecks AMD in CPU-limited games, but AMD tends to excel at higher resolutions.

So what do ye think? Is the 290 Vapor-X worth the upgrade? And should I pair it with a high-res or high-refresh-rate monitor? Selling the 7950 plus Accelero cooler plus current monitor should offset the costs nicely if I do decide to go ahead with the upgrade.

Thanks!


----------



## Mega Man

Quote:


> Originally Posted by *Gumbi*
> 
> My CPU is a 4790K clocked at 4.7 GHz. I'm thinking of upgrading my 21.5" 1080p monitor to a 144Hz 1080p screen or a 1440p one; however, nVidia wrecks AMD in CPU-limited games, but AMD tends to excel at higher resolutions.
> 
> Thanks!


?? huh ??


----------



## Chopper1591

Quote:


> Originally Posted by *Raephen*
> 
> In my opinion it should hold, even if you pushed your FX way past 1.4 V and 5.0 GHz.
> 
> But, if you are stressing out, buy a cheap watt meter. I think that would ease your mind more than enough.
> 
> I used to own a 990X Sabertooth, and while I only used a FX 4170 @ 4.8 GHz with sub 1.4 V Vcore along with a HD 7870, I really think you have enough juice for your system.
> 
> But get yourself an early christmas present and buy a watt meter


You're probably right.
Will grab a watt meter some day.

Just curious if someone is in sort of the same situation and can shed some light on my case.









And I know, my unit is pretty decent.
If I had a b-grade 750w unit it would be another story.
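The back-of-envelope version of that check is quick to script (all wattages below are illustrative assumptions, not measurements — a watt meter still gives the real answer):

```python
# Rough PSU headroom check. Component wattages are guesses, not measured draw;
# a watt meter at the wall is the only way to know the real figure.

def psu_headroom(psu_watts, component_watts, safety_margin=0.8):
    """Return (total_draw, budget, ok) where budget = psu_watts * safety_margin.

    Staying under ~80% of the PSU's rating keeps it in its efficient,
    low-stress operating range.
    """
    total = sum(component_watts.values())
    budget = psu_watts * safety_margin
    return total, budget, total <= budget

# Rough worst-case estimates for an overclocked FX + R9 290 rig:
rig = {
    "FX-8320 @ 4.7 GHz": 220,    # heavily overclocked Vishera
    "R9 290 (OC)": 300,          # power limit raised
    "board/RAM/drives/fans": 80,
}

total, budget, ok = psu_headroom(750, rig)
print(f"{total} W draw vs {budget:.0f} W budget -> {'OK' if ok else 'tight'}")
# prints: 600 W draw vs 600 W budget -> OK
```

Right on the line with a quality 750 W unit, which is why a cheaper unit of the same rating would be a different story.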
Quote:


> Originally Posted by *mAs81*
> 
> Hey guys I've found the culprit about my high idle temps..Seems like something is fried in my card
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> and unfortunately only the middle fan works while the other two don't..(I made sure I had IFC off)
> After one run of Unigine Valley this is what it looked like :
> 
> 
> I've opened a ticket with Sapphire but I'm not very optimistic about it..
> 
> Is it safe to run my card with only one fan, seeing that even though the temps are 80°C+ my VRMs are well below??


Oops.
I would return the card if I were you.

Temp wise you should be good...
But it sucks to have such a nice card and have mediocre cooling.
Quote:


> Originally Posted by *hyp36rmax*
> 
> Thank you thank you! Now to wait for the second card to come in to update the xXcrossfireXx Club
> 
> 
> 
> 
> 
> 
> 
> 
> Yes our CM Storm Trigger has been discontinued.


Ehmm..
What?
Quote:


> Originally Posted by *Gumbi*
> 
> Probably the worst place to post this, but I'm considering grabbing a Vapor-X 290 as a replacement for my 7950. The 290 Tri-X seems to be the best one (almost the best cooler but not the most expensive), however all the little perks the Vapor-X has over the Tri-X seem worth the 20 euro, and **** it, it's Christmas
> 
> 
> 
> 
> 
> 
> 
> 
> 
> -Slightly better cooling
> - Custom PCB and binned chip = slightly better oc on average
> - Backplate is always nice on a big card
> - LEDs are pretty sexy, and the blue colour scheme should meld with my Gigabyte board
> 
> My CPU is a 4790K clocked at 4.7 GHz. I'm thinking of upgrading my 21.5" 1080p monitor to a 144 Hz 1080p screen or a 1440p one; however, nVidia wrecks AMD in CPU-limited games, while AMD tends to excel at higher resolutions.
> 
> So what do ye think? The 290 VaporX worth the upgrade? And should I pair it with a high res or high refresh rate monitor? 7950 plus accelero cooler plus current monitor should offset the costs nicely if I do decide to go ahead with the upgrade.
> 
> Thanks!


Go go go.
Get yourself one.

Either the tri-x or vapor-x will be good.
Coming from a 7950 vapor-x myself.

And my vote goes to.....
1440p 144hz monitor.
But, a single 290 will probably not be enough.


----------



## hyp36rmax

Quote:


> Originally Posted by *Chopper1591*
> 
> You'r probably right.
> Will grab a watt meter some day.
> 
> Just curious if someone is in sort of the same situation and has some light on my case.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And I know, my unit is pretty decent.
> If I had a b-grade 750w unit it would be another story.
> Oops.
> I would return the card I were you.
> 
> Temp wise you should be good...
> But it sucks to have such a nice card and have mediocre cooling.
> *Ehmm..
> What?*
> Go go go.
> Get yourself one.
> 
> Either the tri-x or vapor-x will be good.
> Coming from a 7950 vapor-x myself.
> 
> And my vote goes to.....
> 1440p 144hz monitor.
> But, a single 290 will probably not be enough.


That was to Arizonian, reread the quote... I'm in charge of the AMD Crossfire club...


----------



## Gumbi

Haha 1440p 144hz... It was 1080p 144hz or 1440p 60hz







Don't think I can afford a high-res high-refresh-rate monitor, not to mention the horsepower needed to power it (2 290s minimum I'd say to make it worth it).

But I'm pretty much set on a Vapor-X. I know a Tri-X will perform just as well, but all the small, sweet bonuses of the Vapor-X over it and the fact it's Christmas make it much more enticing







And plus the Gold Never Settle offer might make me some further cash to further offset the cost.


----------



## mAs81

Quote:


> Originally Posted by *Chopper1591*
> 
> Oops.
> I would return the card if I were you.
> Temp wise you should be good...
> But it sucks to have such a nice card and have mediocre cooling.


Tell me about it..
I've filed a ticket with Sapphire, but as I said before I'm not very optimistic about it..
First of all, they have terrible customer service, at least from what I've seen posted on OCN, and second, I don't have a receipt for it.. It was purchased using my cousin's Amazon Prime account when I visited my family in the US..
I've also notified a Sapphire rep here on OCN, but he's been offline for more than 10 days..


----------



## Gumbi

Quote:


> Originally Posted by *Mega Man*
> 
> ?? huh ??


Pasting these on mobile from skype, hopefully they're correct.

http://www.pcgameshardware.de/screenshots/original/2014/11/WoW_Warlords_of_Draenor_GPU_Benchmark_1080p-pcgh.png

http://cdn.overclock.net/8/86/86444eea_http--www.gamegpu.ru-images-stories-Test_GPU-Action-Plants_vs._Zombies_Garden_Warfare-test-PVZ_proz_radeon.jpeg

http://cdn.overclock.net/e/ed/ed47445d_http--www.gamegpu.ru-images-stories-Test_GPU-Action-Plants_vs._Zombies_Garden_Warfare-test-PVZ_proz_nv.jpeg


----------



## Chopper1591

Quote:


> Originally Posted by *hyp36rmax*
> 
> That was to Arizonian, reread the quote... I'm in charge of the AMD Crossfire club...


I wasn't talking about the quote you made from Arizonian.
Good for you, to lead the club.









I was talking about:
Quote:


> Quote:
> Originally Posted by Chopper1591 View Post
> 
> Oeh, nice.
> Congrats.
> 
> *What's up with those, are they EOL?
> * I can't seem to find them in my country any more.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Hehe,
> Of course.
> 
> You are talking about an near ancient game here.
> Pc's grow really fast. biggrin.gif
> 
> Aren't there some mods or something? To use proper AA.
> 
> I play CSS 1440p everything max with 8xAA, average ~250 fps. tongue.gif
> Or you could always grab a rad-box, or mount external.
> 
> I do agree that 120.3 is pretty small for cpu+dual gpu.
> Nope.
> More rad is the way to go here IMO.
> Most think that way, I think. tongue.gif
> 
> But there are also plenty of guys than run 2000+ rpm fans all the time.
> It's really personal. If you want sub-800 rpm fans you need a lot of rad space, whereas if you were running 1500 rpm, not so much.
> 
> 
> *Yes our CM Storm Trigger has been discontinued*.


I didn't get your response.
I asked if the Vapor-X 8GB was EOL and you said yeah, the Storm Trigger has been discontinued.









Quote:


> Originally Posted by *Gumbi*
> 
> Haha 1440p 144hz... It was 1080p 144hz or 1440p 60hz
> 
> 
> 
> 
> 
> 
> 
> Don't think I can afford a high-res high-refresh-rate monitor, not to mention the horsepower needed to power it (2 290s minimum I'd say to make it worth it).
> 
> But I'm pretty much set on a Vapor-X. I know a Tri-X will perform just as well, but all the small, sweet bonuses of the Vapor-X over it and the fact it's Christmas make it much more enticing
> 
> 
> 
> 
> 
> 
> 
> And plus the Gold Never Settle offer might make me some further cash to further offset the cost.


Haha, true.
I was trolling a bit.

The only display I can find for sale here is a 27" ROG 1440p 144 Hz 1 ms. Pretty nice.
But... € 750,-.










But, honestly, I would prefer 144 Hz over 1440p.
We have VSR now, after all.
Quote:


> Originally Posted by *mAs81*
> 
> Tell me about it..
> I've filed a ticket with Sapphire, but as I said before I'm not very optimistic about it..
> First of all, they have terrible customer service, at least from what I've seen posted on OCN, and second, I don't have a receipt for it.. It was purchased using my cousin's Amazon Prime account when I visited my family in the US..
> I've also notified a Sapphire rep here on OCN, but he's been offline for more than 10 days..


Kinda sounds like you are.... in trouble.

No receipt, no fun...


----------



## mAs81

Quote:


> Originally Posted by *Chopper1591*
> 
> Kinda sound like you are.... in trouble.
> No receipt, no fun...


That's why I'm not very optimistic..
Even though I have registered it on their site.. Plus I have all the S/N numbers and everything..


----------



## hyp36rmax

Quote:


> Originally Posted by *Chopper1591*
> 
> I wasn't talking about the quote you made from Arizonian.
> Good for you, to lead the club.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was talking about:
> I didn't get your response.
> Asked if the vapor-x 8gb was EOL and you said yea, the Storm Trigger has been discontinued.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Haha, true.
> I was trolling a bit.
> 
> Only display I see, where I can buy, is a ROG 27" 1440p 144hz 1ms. Pretty nice.
> But... € 750,-.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But, honestly, I would prefer a 144hz over a 1440p.
> We have VSR now, after all.
> Kinda sound like you are.... in trouble.
> 
> No receipt, no fun...


Ok, looks like a miscommunication; I thought you were talking about the Trigger keyboard. No clue about the Vapor-X


----------



## mAs81

Quote:


> Originally Posted by *hyp36rmax*
> 
> No clue about the vapor x


Seems like I'm not the only one with the _exact same_ problem.. Another OCN member had it, plus I've seen it posted in Sapphire's forums..
That's a really bad show, considering the cost of the card itself


----------



## Chopper1591

Quote:


> Originally Posted by *hyp36rmax*
> 
> Ok looks like a miscommunication, thought you were talking about the Trigger keyboard. No clue about the vapor x


Ah np.









I was just a bit lost on your reply, hehe.
Quote:


> Originally Posted by *mAs81*
> 
> Seems like I'm not the only one with the _exact_ problem..Another ocn member had it plus I've seen it posted in Sapphire's forums..
> That's a really bad show,considering the cost of the card itself


If that is the case I wouldn't draw conclusions too early.

Let's just wait for a reply from Sapphire.
An S/N should be enough to verify warranty IMO.

I've done RMAs with Corsair purely on the S/N before, multiple times.
With and without receipt.


----------



## taem

After installing 14.12 Omega, the memory clock on my PCS+ 290 is running at full all the time. The core downclocks to 300 MHz at idle as it should, but the mem clock stays at max. I'm not running any OC apps like SB or Trixx. The mem-clock-at-max problem happens whether I OC or run stock (through CCC).

Anyone have a guess what the issue might be and how I might fix it?

Edit: something is using almost 200 MB of VRAM constantly. Turning off everything including antivirus doesn't change this. Any way to see what's using VRAM?


----------



## Mega Man

Quote:


> Originally Posted by *Gumbi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> ?? huh ??
> 
> 
> 
> Pasting these on mobile from skype, hopefully they're correct.
> 
> http://www.pcgameshardware.de/screenshots/original/2014/11/WoW_Warlords_of_Draenor_GPU_Benchmark_1080p-pcgh.png
> 
> http://cdn.overclock.net/8/86/86444eea_http--www.gamegpu.ru-images-stories-Test_GPU-Action-Plants_vs._Zombies_Garden_Warfare-test-PVZ_proz_radeon.jpeg
> 
> http://cdn.overclock.net/e/ed/ed47445d_http--www.gamegpu.ru-images-stories-Test_GPU-Action-Plants_vs._Zombies_Garden_Warfare-test-PVZ_proz_nv.jpeg
Click to expand...

cherry picking one game does proof not make ! that one game may be programmed better for nvidia
Quote:


> Originally Posted by *mAs81*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Chopper1591*
> 
> Kinda sound like you are.... in trouble.
> No receipt, no fun...
> 
> 
> 
> That's why I'm not very optimistic..
> Even though I have registered it on their site ..Plus I have all the S/N numbers and everything..
Click to expand...

iirc Sapphire only requires the SN for warranty ( i could be wrong though ) however most companies require warranty service from the country it was purchased in
Quote:


> Originally Posted by *taem*
> 
> After installing 14.12 Omega, the memory clock on my PCS+ 290 is running at full all the time. The core downclocks to 300 MHz at idle as it should, but the mem clock stays at max. I'm not running any OC apps like SB or Trixx. The mem-clock-at-max problem happens whether I OC or run stock (through CCC).
> 
> Anyone have a guess what the issue might be and how I might fix it?
> 
> Edit something is using almost 200mb of VRAM constantly. Turning off everything including anti virus doesn't change this. Any way to see what's using VRAM?


how many displays are you running


----------



## mAs81

Quote:


> Originally Posted by *Mega Man*
> 
> iirc saphire only requires SN for warranty ( i could be wrong though ) however most companies require warranty service from the country it was purchased from


I filed a ticket this morning and they still haven't got back to me.. For what it's worth I've PM'ed their rep here too..
I'll just have to wait and see I guess..
Thanks


----------



## PillarOfAutumn

hey guys quick question.

My Fire Strike Extreme score on a single 290 was around 5000, but a second GPU in XF with the same clocks scores 7582. Is this normal for two 290s?


----------



## Gumbi

Quote:


> Originally Posted by *Mega Man*
> 
> cherry picking one game does proof not make ! that one game may be programmed better for nvidia
> iirc Sapphire only requires the SN for warranty ( i could be wrong though ) however most companies require warranty service from the country it was purchased in
> how many displays are you running


2 games actually. And here are two more (Mantle-less Star Swarm and Mantle-less Battlefield).

http://cdn.overclock.net/f/f2/f2b0a642_i7_bf4_1280.png

http://cdn.overclock.net/6/64/645c13a2_i7_sw_1920.png


----------



## kizwan

Quote:


> Originally Posted by *mAs81*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> iirc saphire only requires SN for warranty ( i could be wrong though ) however most companies require warranty service from the country it was purchased from
> 
> 
> 
> I filed a ticket this morning and they still haven't got back to me.. For what it's worth I've PM'ed their rep here too..
> I'll just have to wait and see I guess..
> Thanks
Click to expand...

I'm pretty sure no one is working on the weekends.
Quote:


> Originally Posted by *Gumbi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> cherry picking one game does proof not make ! that one game may be programmed better for nvidia
> iirc Sapphire only requires the SN for warranty ( i could be wrong though ) however most companies require warranty service from the country it was purchased in
> how many displays are you running
> 
> 
> 
> 2 games actually. And here are two more (Mantle-less Star Swarm and Mantle-less Battlefield).
> 
> http://cdn.overclock.net/f/f2/f2b0a642_i7_bf4_1280.png
> 
> http://cdn.overclock.net/6/64/645c13a2_i7_sw_1920.png
Click to expand...

I will not be playing BF4 at 1280 x 720.


----------



## Fickle Pickle

Quote:


> Originally Posted by *mAs81*
> 
> I filed a ticket this morning and they still haven't got back to me.. For what it's worth I've PM'ed their rep here too..
> I'll just have to wait and see I guess..
> Thanks


Unlikely they will respond to your ticket until the weekend is over.


----------



## pshootr

Hello guys, I have installed a Sapphire R9 290 Tri-X OC along with AB. I would like to OC a bit but I'm unsure what would be a good starting point. What would be reasonable core/mem clocks and voltage to start with?

And also, what is a good way to test stability for the OC? I have 3DMark installed, but I don't know if it is well suited to stability testing.

Thanks


----------



## kizwan

Quote:


> Originally Posted by *pshootr*
> 
> Hello guys, I have installed a sapphire R9 290 Tri-x OC and installed AB. I would like to OC a bit but I'm unsure what would be a good starting point. I would like to know what reasonable core/mem clocks and voltage to start with?
> 
> And also what is a good way to test stability for the OC? I have 3dmark installed, I didn't know if it is well suited for stability testing.
> 
> Thanks


Games will be a good stability test tool.


----------



## Regnitto

New Fire Strike score on my VisionTek R9 290, clocked 1200/1500, FX-6100 @ 4.6 GHz:

http://www.3dmark.com/3dm/5045979?


----------



## pshootr

Quote:


> Originally Posted by *Regnitto*
> 
> New Fire Strike Score on my Visiontek R9 290. Clocked 1200/1500. FX-6100 @ 4.6ghz:
> 
> http://www.3dmark.com/3dm/5045979?


Nice score. What voltage on your GPU? Using AB by any chance?


----------



## Regnitto

Quote:


> Originally Posted by *pshootr*
> 
> Nice score. What voltage on your GPU? Using AB by any chance?


knew I forgot something in the screenshots:


----------



## pshootr

Quote:


> Originally Posted by *Regnitto*
> 
> knew I forgot something in the screenshots:


Holy crap!







Thank you

Can you game with those clocks?


----------



## Regnitto

Quote:


> Originally Posted by *pshootr*
> 
> Holy crap!
> 
> 
> 
> 
> 
> 
> 
> Thank you
> 
> Can you game with those clocks?


Games like a charm: 120 fps in BF3, 100 fps in Shadow of Mordor


----------



## pshootr

Quote:


> Originally Posted by *Regnitto*
> 
> games like a charm 120fps bf3. 100fps Shadow of Mordor


Nice, very impressive clocks indeed. Some creative cooling you have there


----------



## Regnitto

Quote:


> Originally Posted by *pshootr*
> 
> Nice, very impressive clocks indeed.


Thanks









Would probably get a higher score with a better cpu....thinking about getting a Devil's Canyon at tax time.


----------



## pshootr

I started to tinker with my new CPU and GPU. CPU at 4.0 GHz, GPU at 1100/1400. Here are the Fire Strike results.



Edit: Hmm, AB said 1300 memory clock I think and GPU-Z showed 1400 max. I dunno


----------



## zpaf

Tried to emulate the MSI Lightning's clocks (1080/1250) with my Tri-X.
The Trixx software can't downclock the RAM, but MSI AB does.
Strange result.
With Trixx at 1080/1300 the GPU score is ~12300









http://www.3dmark.com/3dm/4991652


----------



## Chopper1591

Quote:


> Originally Posted by *pshootr*
> 
> I started to tinker with my new CPU and GPU. CPU at 4.0Ghz, GPU at 1100/1400 Here is the Fire Strike results.
> 
> 
> 
> Edit: Hmm, AB said 1300 memory clock I think and GPU-Z showed 1400 max. I dunno


Scores look a bit low.
You also have a 290 tri-x, right?

Here are a few shots of mine:
fx-8320 @ 4.7

1150 1300(stock mem):


Spoiler: Warning: Spoiler!







1150 1500:


Spoiler: Warning: Spoiler!







1200 1500 +100


Spoiler: Warning: Spoiler!







1100 1300 is my regular gaming clock.
That is stable at stock voltage (+0). I do set the power limit to +50 though, to avoid throttling.


----------



## Gumbi

Quote:


> Originally Posted by *kizwan*
> 
> I'm pretty sure no one is working on the weekends.
> I will not be playing BF4 at 1280 x 720.


So? How is that relevant to the point at hand, that nVidia outperforms AMD when games are CPU-limited? This is what you initially had a problem with: I posted benches, you accused me of cherry-picking, I posted more benches, and now you cherry-pick yourself by saying one of the benches is irrelevant because it doesn't apply to you. Well, so what? That's not the point.

It's not about you. We are discussing whether nVidia outperforms AMD in CPU-limited games, which I have amply demonstrated at this point.


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> I'm pretty sure no one is working on the weekends.
> I will not be playing BF4 at 1280 x 720.


even with my HD7770. lol


----------



## Chopper1591

Quote:


> Originally Posted by *Gumbi*
> 
> So? How is that relevant to the point at hand, that nVidia outperforms AMD when games are CPU-limited? This is what you initially had a problem with: I posted benches, you accused me of cherry-picking, I posted more benches, and now you cherry-pick yourself by saying one of the benches is irrelevant because it doesn't apply to you. Well, so what? That's not the point.
> 
> It's not about you. We are discussing whether nVidia outperforms AMD in CPU-limited games, which I have amply demonstrated at this point.


IMO low resolution is the way to go to expose a CPU limitation.
That's a well-known thing..


----------



## Forceman

Quote:


> Originally Posted by *Chopper1591*
> 
> IMO low resolution is the way to go to display cpu limitation.
> That's a well known thing..


But if all those differences wash out at higher resolutions, which they often do, it's of pretty questionable value. Who cares which CPU is faster at a resolution no one plays, if at reasonable resolutions there is no significant difference?

Same with cranking the settings to max on GPU tests, sure card X is faster than card Y at settings Z, but if the result is playing at 10 FPS does it really matter?


----------



## Regnitto

Quote:


> Originally Posted by *pshootr*
> 
> Some creative cooling you have there


just your standard red mod. just wait till i get my custom fan shroud made.


----------



## Chopper1591

Quote:


> Originally Posted by *Forceman*
> 
> But if all those differences wash out at higher resolutions, which they often do, it's of pretty questionable value. Who cares which CPU is faster at a resolution no one plays, if at reasonable resolutions there is no significant difference?
> 
> Same with cranking the settings to max on GPU tests, sure card X is faster than card Y at settings Z, but if the result is playing at 10 FPS does it really matter?


That is true.

I shouldn't have mixed into an already ongoing discussion.








I just pointed out that CPU bottlenecks are far more noticeable at lower resolutions.

But I do not agree about there being only a small difference between CPUs at a reasonable resolution.
There are a good deal of games out there that would benefit a good deal more from a 5930K, or even a 4770K, over my 8320 (even though it is clocked at 4.8 GHz).


----------



## aaroc

Quote:


> Originally Posted by *hyp36rmax*
> 
> Look what showed up to the office today. One out of two of my Sapphire R9 290x Vapor-X 8GB cards arrived!


Super nice!!!! Can you elaborate on the reason for the upgrade to the R9 290X 8GB? I think I will upgrade too, to run better with 3 2560x1440 monitors.


----------



## Archea47

Quote:


> Originally Posted by *Gumbi*
> 
> So what do ye think? The 290 VaporX worth the upgrade?


My only note to add with a custom PCB is: make sure full-cover water blocks are available if you think you'll ever want to go that route.

My past pair of 280Xs performed great, but I was so upset there were no full-cover blocks for their Vision PCBs that I couldn't look at them without frustration. Not a problem if you don't want to go there, though


----------



## wermad

Probably shadow of mordor
















'tastic!


----------



## Gumbi

Quote:


> Originally Posted by *Archea47*
> 
> My only note to add with custom PCB is: make sure full cover water blocks are available if you think you'll ever want to go that route
> 
> My past pair of 280Xs performed great but I couldn't look at them out of frustration I was so upset there were no full cover blocks for their vision pcbs. Not a problem if you don't want to go there, though


I probably won't go down that road, but there do in fact exist blocks specifically for the Vapor X cards


----------



## BradleyW

Hello fellow 290x owners, please could you help me with a strange issue I'm currently suffering from?

http://www.overclock.net/t/1530369/amd-omega-driver-negative-crossfire-scaling

Thank you.


----------



## VSG

That's a link that will just take users to their subscribed threads


----------



## BradleyW

Fixed, sorry


----------



## hyp36rmax

Quote:


> Originally Posted by *aaroc*
> 
> Super nice!!!! Can you elaborate on the reason for the upgrade to the R9 290X 8GB? I think I will upgrade too, to run better with 3 2560x1440 monitors.


My Crossfire 7970s felt very dated with an ASUS PB287Q 4K monitor; I figured two of these bad boys would do it justice.









Quote:


> Originally Posted by *wermad*
> 
> Probably shadow of mordor
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 'tastic!


Not sure if you were talking to me, sir, hehe, but yes, that game also


----------



## pdasterly

Quote:


> Originally Posted by *mAs81*
> 
> Not sure if you were talking to me, sir, hehe, but yes, that game also


On Mordor, did your overall temp go up (core and VRM1)? Also, what resolution are you running? I can run 5760x1080 but the sides are cut off in Eyefinity


----------



## Widde

Quote:


> Originally Posted by *pdasterly*
> 
> On Mordor, did your overall temp go up (core and VRM1)? Also, what resolution are you running? I can run 5760x1080 but the sides are cut off in Eyefinity


Have you tried Flawless Widescreen? It worked for me in Shadow of Mordor; it fixed the black borders and an FOV issue with the menu


----------



## kizwan

Quote:


> Originally Posted by *Gumbi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I will not be playing BF4 at 1280 x 720.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So? How is that relevant to the point at hand, that nVidia outperforms AMD when the games are CPU limited. This is what you initially had a problem with, I posted benches, you accused me of cherry picking, I posted more benches, and you cherry pick yourself by saying one of the benches is irrelevant because it doesn't apply to you. Well so what? That's not the point.
> 
> It's about you. We are discussinf whether nVidia outperforms AMD in CPU limited gamez, which I have amply demonstrated at this point.
Click to expand...

Huh ??? You seriously need to make sure you're replying to the correct person here.









Since you replied to me, I've got to ask, seriously: who is playing BF4 at 1280 x 720 with a 290/290X or 780/780 Ti/980? Using your own argument, you can't exclude the Mantle result just because it's irrelevant to you. The last two results pretty much show the 290X outperforming the 780. If you want to post some graphs, at least choose ones run at reasonable resolutions: 1080p, 1440p or 4K.

For example, I found a couple of newer reviews with newer drivers, just to show you that at reasonable resolutions the differences are smaller than you might think.

I don't think running BF4 at 1440p or 4K with Mantle on a single GPU is necessary though. You can get similar average FPS with DX11 too.

Single - 1440p
_(source)_




Single - 4k


CFX/SLI - 4k
_(source)_


----------



## BradleyW

Any Crossfire owners here who own a copy of AC Unity? For some reason I have negative CF scaling.

http://www.overclock.net/t/1530369/amd-omega-driver-negative-crossfire-scaling


----------



## MrWhiteRX7

Quote:


> Originally Posted by *BradleyW*
> 
> Any Crossfire owners here who own a copy of AC Unity? For some reason I have negative CF scaling.
> 
> http://www.overclock.net/t/1530369/amd-omega-driver-negative-crossfire-scaling


I've never really even messed with it in Xfire. I do have the game though. I can't mess with it tonight, but probably tomorrow after work I'll try it with 2 and 3 GPUs, both with the ACU profile and AFR.


----------



## Mega Man

Quote:


> Originally Posted by *BradleyW*
> 
> Any Crossfire owners here who own a copy of AC Unity? For some reason I have negative CF scaling.
> 
> http://www.overclock.net/t/1530369/amd-omega-driver-negative-crossfire-scaling


my bet .... it's because ubi can't release a game worth a darn


----------



## Chopper1591

Quote:


> Originally Posted by *Mega Man*
> 
> my bet .... its because ubi cant release a game worth a darn


That pretty much sums it up.


----------



## Gumbi

Quote:


> Originally Posted by *kizwan*
> 
> Huh ??? You seriously need to make sure you're replying to the correct person here.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Since you replied to me, I've got to ask, seriously: who is playing BF4 at 1280 x 720 with a 290/290X or 780/780 Ti/980? Using your own argument, you can't exclude the Mantle result just because it's irrelevant to you. The last two results pretty much show the 290X outperforming the 780. If you want to post some graphs, at least choose ones run at reasonable resolutions: 1080p, 1440p or 4K.
> 
> For example, I found a couple of newer reviews with newer drivers, just to show you that at reasonable resolutions the differences are smaller than you might think.
> 
> I don't think running BF4 at 1440p or 4K with Mantle on a single GPU is necessary though. You can get similar average FPS with DX11 too.
> 
> Single - 1440p
> _(source)_
> 
> 
> 
> 
> Single - 4k
> 
> 
> CFX/SLI - 4k
> _(source)_


I think I meant Megaman. Either way, the point of the slides was to demonstrate the inferiority of AMD when a game is CPU-limited. The other slides showed more "practical" results of this (the WoW one in particular). I'm not bashing; I have a 7950 myself and am toying with grabbing a 290 Vapor-X this Christmas.


----------



## Mega Man

no it does not show anything,

all you would post is very specific games ( iirc 2 ? )

no mention of drivers, etc.

it is called cherry picking

you would not show full reviews, just this or that from them. i can do that too and make whatever i want to say look like it is true, it does not mean it is


----------



## Gobigorgohome

I am building another X79 machine with a GA-X79-UD3 and 3930K, together with 16 GB RAM, an SSD, an HDD and a few GPUs.

When it comes to the GPUs I am down to the R9 290X. This time around the cards will be air cooled ONLY, no water for these cards, so the cooler has to be able to get rid of all the heat from the cards. I am doing 2-way, maybe 3-way Crossfire, and my alternatives are these:

Sapphire Radeon TRI-X R9 290X: 405 USD each
Asus Matrix Platinum R9 290X: 420 USD each
MSI Lightning R9 290X: 520 USD each

The cards will not be stacked, they have the same distance between each other as the RIVBE with x16 x16.









I will probably buy two cards right away and use them for 1080p/5760x1080/4K and maybe throw in one of my "old" R9 290X together with the new ones if the performance is not satisfying with two cards.


----------



## bond32

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I am building another X79 machine with a GA-X79-UD3 and 3930K, together with 16 GB RAM, SSD, HDD and a few GPU's.
> 
> When it comes to the GPUs I am down to the R9 290X. This time around the cards will be air cooled ONLY, no water for these cards, so the cooler has to be able to get rid of all the heat from the cards. I am doing 2-way, maybe 3-way Crossfire, and my alternatives are these:
> 
> Sapphire Radeon TRI-X R9 290X: 405 USD each
> Asus Matrix Platinum R9 290X: 420 USD each
> MSI Lightning R9 290X: 520 USD each
> 
> The cards will not be stacked, they have the same distance between each other as the RIVBE with x16 x16.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will probably buy two cards right away and use them for 1080p/5760x1080/4K and maybe throw in one of my "old" R9 290X together with the new ones if the performance is not satisfying with two cards.


I know you say no water, but if you're interested I am selling my 290's. Check out the marketplace. They are reference 290's and 290x's.


----------



## hyp36rmax

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I am building another X79 machine with a GA-X79-UD3 and 3930K, together with 16 GB RAM, SSD, HDD and a few GPU's.
> 
> When it comes to the GPUs I am down to the R9 290X. *This time around the cards will be air cooled ONLY, no water for these cards, so the cooler has to be able to get rid of all the heat from the cards.* I am doing 2-way, maybe 3-way Crossfire, and my alternatives are these:
> 
> Sapphire Radeon TRI-X R9 290X: 405 USD each
> Asus Matrix Platinum R9 290X: 420 USD each
> MSI Lightning R9 290X: 520 USD each
> 
> The cards will not be stacked, they have the same distance between each other as the RIVBE with x16 x16.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will probably buy two cards right away and use them for 1080p/5760x1080/4K and maybe throw in one of my "old" R9 290X together with the new ones if the performance is not satisfying with two cards.


I'm interested to see your results with an air-cooled multi-GPU 290X setup. I'd imagine even with much more efficient air coolers your computer case will be one hell of a hot box, as the heat has to go somewhere and the hot air recirculates back into the open coolers. I wonder how much you'll be affected by throttling.

What case will you be using? What does your airflow configuration look like? Fans? Front as intake, rear as exhaust? Inquiring minds want to know


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I am building another X79 machine with a GA-X79-UD3 and 3930K, together with 16 GB RAM, SSD, HDD and a few GPU's.
> 
> When it comes to the GPUs I am down to the R9 290X. This time around the cards will be air cooled ONLY, no water, so the coolers have to be able to get rid of all the heat from the cards. I am doing 2-way, maybe 3-way CrossFire, and my alternatives are these:
> 
> Sapphire Radeon TRI-X R9 290X: 405 USD each
> Asus Matrix Platinum R9 290X: 420 USD each
> MSI Lightning R9 290X: 520 USD each
> 
> The cards will not be stacked, they have the same distance between each other as the RIVBE with x16 x16.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will probably buy two cards right away and use them for 1080p/5760x1080/4K and maybe throw in one of my "old" R9 290X together with the new ones if the performance is not satisfying with two cards.


Definitely not the Matrix. Tbh I'd say reference is the way to go if you want air only, but if you can't get them then the Lightning would probably be the best choice.

I know tsm106 is running 3 Lightnings, but I'm not sure if he had them on air to start with or just threw them straight under water.


----------



## tsm106

I went straight to water, but the Lightning cooler really is a marvel. Between those three at that pricing, I'd go Tri-X then Lightning. If it was US pricing, the Lightning first obviously, since it's $350 at Amazon; there's no need for second place then.


----------



## Gobigorgohome

Quote:


> Originally Posted by *hyp36rmax*
> 
> I'm interested to see your results with an air-cooled multi-GPU 290X setup. I'd imagine that even with much more efficient air coolers your case will be one hell of a hot box, since the heat has to go somewhere and the hot air recirculates back into the open coolers. I wonder how much you'll be affected by throttling.
> 
> What case will you be using? What does your airflow configuration look like? Fans? Front as intake, rear as exhaust? Inquiring minds want to know


I actually thought of getting a Corsair 450D, or just using my Carbide Air 540 and mounting 2x 140 mm fans in the front for intake and a 120 mm as exhaust or something. I do not think the cards would throttle with fans at around 90-100%.







I did have my four reference cards in quadfire on air at 100% fan speed; I do not want that again.








Quote:


> Originally Posted by *Sgt Bilko*
> 
> Definitely not the Matrix, Tbh i'd say Reference is the way to go for that if you are wanting air only but if you can't get them then the Lightning would probably be the best choice.
> 
> I know Tsm106 is running 3 Lightning's but i'm not sure if he had them on air to start with or just threw them straight under water


Then I guess it will be the Lightning; the price difference is almost nothing anyway.
Quote:


> Originally Posted by *tsm106*
> 
> I went straight to water, but the Lightning cooler really is a marvel. Between those three at that pricing, I'd go trix then lightning. If it was US pricing, the lightning first obviously since its 350 at amazon, there's no need for second place then.


$350 on Amazon is good. I could get one card here for 465 USD with free shipping, or 500 USD from a Norwegian shop ... Amazon seems to add fees and charges of 27 USD ... that is enough ...


----------



## Levesque

Hey guys.

Just got an R9 290X ($249! lol), and I'm using it with a 144Hz Asus monitor. I get a lot of "stuttering" at 144Hz in some games. Is there a way to get rid of this?

Thx.


----------



## tsm106

Quote:


> Originally Posted by *Levesque*
> 
> Hey guys.
> 
> Just got an R9 290x (249$! lol), and using it with a 144Hz Asus monitor. I get alot of ''stuttering'' at 144Hz in some games. Is there a way to get rid of this?
> 
> Thx.


Omg, he's back from the dead!

Single gpu... lower IQ settings.


----------



## Chopper1591

Quote:


> Originally Posted by *Levesque*
> 
> Hey guys.
> 
> Just got an R9 290x (249$! lol), and using it with a 144Hz Asus monitor. I get alot of ''stuttering'' at 144Hz in some games. Is there a way to get rid of this?
> 
> Thx.


Ahh, not another one.
250 usd is about the price I paid for my used 290 tri-x.









Quote:


> Originally Posted by *JourneymanMike*
> 
> Please don't use that language


Sorry.


----------



## Levesque

Quote:


> Originally Posted by *tsm106*
> 
> Omg, he's back from the dead!
> 
> Single gpu... lower IQ settings.


Yeah, got some new business, etc, so way too busy to play with my computers anymore.









I'm using a lowly 1080p monitor at work... do you really think I should lower my settings to use 144Hz with that resolution?!?!?


----------



## kizwan

Quote:


> Originally Posted by *hyp36rmax*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gobigorgohome*
> 
> I am building another X79 machine with a GA-X79-UD3 and 3930K, together with 16 GB RAM, SSD, HDD and a few GPU's.
> 
> When it comes to the GPU's I am down to R9 290X, t*his time around the cards will be air cooled ONLY, no water for these cards so the cooler have to be able to get rid of all the heat from the cards.* I am doing 2-way, maybe 3-way crossfire and my alternatives is these:
> 
> Sapphire Radeon TRI-X R9 290X: 405 USD each
> Asus Matrix Platinum R9 290X: 420 USD each
> MSI Lightning R9 290X: 520 USD each
> 
> The cards will not be stacked, they have the same distance between each other as the RIVBE with x16 x16.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will probably buy two cards right away and use them for 1080p/5760x1080/4K and maybe throw in one of my "old" R9 290X together with the new ones if the performance is not satisfying with two cards.
> 
> 
> 
> 
> 
> 
> 
> I'm interested to see your results with an air-cooled multi GPU 290X set. I'd imagine even with much more efficient air coolers your computer case will be one hell of a hot box as the heat has to go somewhere with the hot air recirculating back into the open coolers, I wonder how much you'll be affected by throttling.
> 
> What computer case will you be using? What is your air-flow configuration look like? Fans? Front as intake rear as exhaust? Inquiring minds want to know

For two cards it should be OK though, depending on how high you overclock and whether you're willing to crank up the fan(s) to max.
Quote:


> Originally Posted by *Chopper1591*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I went straight to water, but the Lightning cooler really is a marvel. Between those three at that pricing, I'd go trix then lightning. If it was US pricing, the lightning first obviously since its 350 at amazon, there's no need for second place then.
> 
> 
> 
> Those prices.
> 
> In that aspect europe sucks.

Nah, the rest of the world sucks too.


----------



## Chopper1591

Quote:


> Originally Posted by *kizwan*
> 
> For two cards, it should be ok though, depending on how high you overclock & you willing to crank up the fan(s) to max.
> *Nah, the rest of the world sucks too.*


----------



## wermad

3rd sapphire ordered







. Two more blocks incoming! Now, on the hunt for a quad


----------



## mAs81

Sapphire support about my RMA is going......slow.
After 3 days we're still trying to establish whether the intelligent fan control switch is on or off..


----------



## Gobigorgohome

Quote:


> Originally Posted by *kizwan*
> 
> For two cards, it should be ok though, depending on how high you overclock & you willing to crank up the fan(s) to max.


I am kind of switching teams for this build; I will try to get 3x pre-owned 780 Tis instead (mostly because they run cooler), and I have always wanted to own a few of those cards.


----------



## BLOWNCO

Quote:


> Originally Posted by *bond32*
> 
> I know you say no water, but if you're interested I am selling my 290's. Check out the marketplace. They are reference 290's and 290x's.


What cards are you moving to?


----------



## bond32

Quote:


> Originally Posted by *BLOWNCO*
> 
> what cards are you moving to?


Not sure yet, just dropping to 1 card at the moment. Probably get the 390x when it's released


----------



## BLOWNCO

Quote:


> Originally Posted by *bond32*
> 
> Not sure yet, just dropping to 1 card at the moment. Probably get the 390x when it's released


nice!! i wanna see what they drop for the new cards then maybe move to them also.


----------



## BradleyW

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I've never really even messed with it in Xfire. I do have the game though. Can't mess with it tonight, but probably tomorrow after work I'll try it with 2 and 3 GPUs, both with the ACU profile and AFR.


This would be extremely useful thank you.


----------



## thrgk

What's the best way to uninstall AMD drivers, DDU or the uninstall interface? These Omega drivers suck for 4-card performance.


----------



## tsm106

Quote:


> Originally Posted by *thrgk*
> 
> whats the best way to uninstall amd drivers? DDU or the uninstall interface? These omega suck for 4 card performance


At this point DDU has won me over, it makes it so easy to swap.


----------



## thrgk

Quote:


> Originally Posted by *tsm106*
> 
> At this point DDU has won me over, it makes it so easy to swap.


works ok on win 8.1?


----------



## tsm106

Quote:


> Originally Posted by *thrgk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> At this point DDU has won me over, it makes it so easy to swap.
> 
> 
> 
> works ok on win 8.1?

Yeah, hell, it can even reboot to safe mode for you so you don't have to jump through MS's triple-boot hoops.


----------



## chiknnwatrmln

I wonder if AMD's next flagship will be enough to push 1440p on its own. If so, I might have to sell my cards and blocks, pick up a 1440p monitor, and get one. I like the raw power of CF, but literally no games utilize my cards 100%.


----------



## sinnedone

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I wonder if AMD's next flagship will be enough to push 1440p on its own.. If so I might have to sell my cards and blocks, pick up a 1440p monitor and get one. I like the raw power of CF, but literally no games utilize my cards 100%.


One R9 290 reference pushes bf4 ultra everything above 60 fps for me at 1440p.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *sinnedone*
> 
> One R9 290 reference pushes bf4 ultra everything above 60 fps for me at 1440p.


I might have to look into getting a 1440p monitor soon then. I like Eyefinity but too many games are not compatible with it and I've kinda had my fill of having three monitors.


----------



## Wezzor

Do you really see a big difference between 1080p and 1440p? Is it like comparing 60Hz and 144Hz?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Wezzor*
> 
> Do you really see a big difference between 1080p and 1440p? Is it like comparing 60Hz and 144Hz?


I've personally never seen a 1440p monitor, lol. Currently running 3x 900p TN panels (long story why I got those instead of 1080p). They're really nice quality with a great image (and 75Hz), but I think a nice Korean 27" 1440p would look better. A bit of simple math tells me that a 27" 1440p screen would have about 20% higher pixel density than my current screens. On a side note, my mom just bought a 4K TV; I'm going to run my rig off of that for a bit just to see what it's like. I would like to think my two cards would be able to push playable framerates in some games.
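That pixel-density estimate can be checked with a quick Python sketch. The 20" diagonal for the 900p panels is an assumption (the post doesn't state their size); 27" is the stated 1440p candidate:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch: diagonal pixel count over diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Assumed 20" for the 900p panels (not stated above); 27" for the 1440p candidate.
current = ppi(1600, 900, 20)       # ~91.8 PPI
candidate = ppi(2560, 1440, 27)    # ~108.8 PPI
print(f"density increase: {(candidate / current - 1) * 100:.0f}%")  # ~19%
```

With those assumed sizes the jump comes out just under 20%, in line with the estimate above.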


----------



## Chopper1591

Quote:


> Originally Posted by *kizwan*
> 
> For two cards, it should be ok though, depending on how high you overclock & you willing to crank up the fan(s) to max.
> Nah, the rest of the world sucks too.


Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I've personally never seen a 1440p monitor, lol. Currently running 3 900p TN panels (long story why I got those instead of 1080p). They're really nice quality and great image (an 75hz) but I think a nice Korean 27" 1440p would like better. A bit of simple math tells me that a 27" 1440p screen would have about 20% higher pixel density than my current screens. On a side note, my mom just bought a 4k TV, I'm going to run my rig off of that for a bit just to see what it's like.. I would like to think *my two cards would be able to push playable framerates* in some games.


Would like to know too.

But my money is on no.


----------



## rdr09

Quote:


> Originally Posted by *Chopper1591*
> 
> Would like to know to.
> 
> But my money is on, no.


i have no issue playing using a 4K monitor with my 2 290s. C3 in medium no AA looks better than 1080p Ultra 8xMSAA. BF4 - easy. Skyrim and a few other games can be played even with one 290. love it. only downside is Adobe menus are tiny; forced to use reading glasses. lol

edit: with a 4K TV ... it will run at 30Hz. Hawaii does not have HDMI 2.0; you need a 970 or 980 for that.


----------



## Gobigorgohome

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I've personally never seen a 1440p monitor, lol. Currently running 3 900p TN panels (long story why I got those instead of 1080p). They're really nice quality and great image (an 75hz) but I think a nice Korean 27" 1440p would like better. A bit of simple math tells me that a 27" 1440p screen would have about 20% higher pixel density than my current screens. On a side note, my mom just bought a 4k TV, I'm going to run my rig off of that for a bit just to see what it's like.. I would like to think my two cards would be able to push playable framerates in some games.


Two R9 290Xs is the "entry" at 4K ... although you are running one R9 290 and one R9 290X ... I had two MSI Lightning R9 290Xs and they did pretty well at 4K, maxed Hitman Absolution @ 4K without any problems ... I am sure if you push those cards together with a good CPU you would not need "new" GPUs.

I think I am getting rid of one R9 290X to start off with and cut back to one PSU so that I could use my other EVGA G2 1300W for the other X79 rig I am building.


----------



## bond32

One 290 is perfectly fine to run a single 1440p monitor. I'm using one 290X at the moment... BF4 usually runs between 70 and 100 fps on ultra and 120% res scale.

For 4K you will indeed need to get a second card. Seeing how 4K has 2.25x the pixels of 1440p, you need more than twice the graphics computing power, yes?
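For what it's worth, the pixel counts can be checked directly:

```python
# Pixel counts: 1440p (QHD) vs 4K (UHD).
pixels_1440p = 2560 * 1440   # 3,686,400
pixels_4k = 3840 * 2160      # 8,294,400
ratio = pixels_4k / pixels_1440p
print(ratio)  # 2.25
```

So 4K carries 2.25x the pixels of 1440p, a bit more than "twice".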


----------



## Gobigorgohome

Quote:


> Originally Posted by *bond32*
> 
> One 290 is perfectly fine to run a single 1440p monitor. I'm using one 290x at the moment... Bf4 usually runs between 70 and 100 fps on ultra amd 120 res scale.
> 
> For 4k you will indeed need to get a second card. Seeing how 4k is twice the resolution of 1440p, need almost twice the graphic computing power yes?


I have only tried FC4 at 1440p because it was so badly optimized; BF4 and everything else I run at 4K. Really, I would have three cards for 4K: if you want to push your cards, two is fine for 4K; if not, get three.







Four cards is pretty much just for the aesthetics of having four cards.


----------



## HOMECINEMA-PC

I have no probs with TRI 290 on a tri-1440p setup: 2x 27-inchers and a 28" Sammy UHD in da middle. So that's 8079x1440 = about 150fps avg using crossfire on COD Ghosts.


----------



## Wezzor

But guys, is there any reason to have AA on at all at 1440p? If that's the case, I think I'd be able to run 1440p with my rig. CPU at 4.5GHz and GPU at 1100/1400.


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> i have no issue playing using a 4K monitor with my 2 290s. C3 in medium no AA iooks better than 1080 Ultra 8msaa. BF4 - easy. Skyrim and a few other games can be played even with one 290. love it. only downside is Adobe menus are tiny. forced to use reading glasses. lol
> 
> edit: with a 4K TV . . . it will run in 30Hz. Hawaii does not have HDMI 2.0. you need a 970 or 980 for that.


Really?

I can believe that higher resolution reduces the need for AA.
But lowering other settings and still looking better? Hmm.

If only I had the money to go 4K.


----------



## Forceman

Quote:


> Originally Posted by *Wezzor*
> 
> But guys, is there any reason to have AA on at all for 1440p? If that's the case, I think I'd be able to run 1440p with my rig. CPU at 4,5Ghz and GPU at 1100/1400.


I think AA is still helpful even at 1440p, but it kind of depends on how sensitive you are to aliasing.


----------



## rdr09

Quote:


> Originally Posted by *Chopper1591*
> 
> Really?
> 
> I could believe that higher resolution lacks the need of AA.
> But lowering other settings and still look better? Hmm.
> 
> If only I had the money to go 4k.


you'll be surprised. more like amazed. i got mine cheap. $350.


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> you'll be surprised. more like amazed. i got mine cheap. $350.


A quick look gives me these.
If I go 3840x2160 and buy a decent brand, the cheap options are:

Samsung U28D590D: TN, 1ms, edge-lit, 60Hz - €470 (585 USD)
Dell UltraSharp UP2414Q: AH-IPS, 8ms, edge-lit, 60Hz - €580 (722 USD)


----------



## Performer81

How does GPU-Z read out the VRM temps? On my PCS+, 2 VRMs have no contact with the cooler at all, but even at max overclock the VRM temp is under 80 degrees. Is this possible?
I really don't feel comfortable pushing high voltage.



Or are they not important? Also, there is too little space to put a useful cooler on them.


----------



## heroxoot

Quote:


> Originally Posted by *Performer81*
> 
> How does gpu-z read out the vrm temps? On my pcs+ 2 vrms have no contact to the cooler at all. But even with max. overclock the vrm temp is under 80 degrees. Is this possible?
> I really dont feel comfortable in pushing high voltage.


I'd assume they have a sensor. Mine never pass 80C with an OC, or 70C at stock.


----------



## rdr09

Quote:


> Originally Posted by *Chopper1591*
> 
> A quick look gives me these.
> If I go 3840x2160 and buy a decent brand, the cheap options are:
> 
> Samsung U28D590D : TN 1ms edge-lit 60hz €470(585usd)
> Dell Ultrasharp UP2414Q : AH-IPS 8ms edge-lit 60hz €580(722usd)


I find those expensive prolly 'cause i am cheap. how much are 1440s?
Quote:


> Originally Posted by *Performer81*
> 
> How does gpu-z read out the vrm temps? On my pcs+ 2 vrms have no contact to the cooler at all. But even with max. overclock the vrm temp is under 80 degrees. Is this possible?
> I really dont feel comfortable in pushing high voltage.


double check your readings with HWiNFO64. GPU-Z and HWiNFO read the same for my 290s.


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> I find those expensive prolly 'cause i am cheap. how much are 1440s?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> double check your readings with HWINFO64. GPUZ and HW read the same for my 290s.


I also find those expensive...
That's why I said, I wish I had the money to go 4k.









I am not going with Acer or AOC.
The cheapest 4K is an AOC, 400 euro: 28-inch TN, 1ms.
Something has got to be wrong there... it looks too good to be true, which it probably is.









1440p...
Goes all the way from €320 for an Acer 27-inch A-HVA panel (***?), 6ms, edge-lit,
to the epic Asus ROG 27-inch 144Hz 1ms for a whopping €739, that is 921 USD.

Mid-class looks to be 400-500 euro,
the Dell UltraSharp 27-inch being 480.


----------



## sinnedone

I am currently running a QNIX 27" 1440p monitor (which goes for about 300-350 USD). It overclocks to 120Hz, and visually the fluidity is very nice. Using only one R9 290 right now until I finish my watercooled build, in which I will be running two R9 290s. I'm hoping 2 of them can keep my lower fps drops close to triple digits at 1440p.

It's better for me because of screen real estate. You can actually snap 2 windows to each side and not scroll, meaning you will have 2 full windows instead of the zoomed-in scroll fest at 1080p. Much more usable in this regard.

As far as AA goes I guess it depends on the person and how close you sit. I personally notice AA because the monitor is only about 2 feet away from my face. If I sat back a bit it might not be noticeable to me.


----------



## rdr09

Quote:


> Originally Posted by *Performer81*
> 
> How does gpu-z read out the vrm temps? On my pcs+ 2 vrms have no contact to the cooler at all. But even with max. overclock the vrm temp is under 80 degrees. Is this possible?
> I really dont feel comfortable in pushing high voltage.
> 
> 
> 
> Or are they not important? Also there is too less space to put a useful cooler on them.


if the thermal pad is touching them, then heat will still transfer. if not, then i recommend re-doing the pad. make sure those two exposed ones are covered. the sensor might not be located on all of them. not sure.

@chopper, i've owned an Acer monitor for almost 4 years now without issue, so i went and got an Acer 4K. no dead pixel, which can happen to any brand. did not even calibrate the color on both 7950 and my 290s. plug and play. the stand is superb. better than my sony tv. no screws to assemble and easy to set portrait mode.

edit: But, despite the naysayers, i recommend waiting for freesync. might be the same price in your neck of the woods.


----------



## Performer81

Nothing is touching them; they are half under the cooler but the thermal pad is far away. I think the cooler is not designed for this PCB.
Some other user measured 93 degrees at stock settings with a laser thermometer while GPU-Z showed high 70s, but he put water cooling on it right after.


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> if the thermal pad is touching them, then heat will still transfer. if not, then i recommend re-doing the pad. make sure those two exposed ones are covered. the sensor might not be located on all them. not sure.
> 
> 
> 
> @chopper, i've owned an Acer monitor for almost 4 years now without issue, so i went and got an Acer 4K. no dead pixel, which can happen to any brand. did not even calibrate the color on both 7950 and my 290s. plug and play. the stand is superb. better than my sony tv. no screws to assemble and easy to set portrait mode.
> 
> edit: But, despite the naysayers, i recommend waiting for freesync. might be the same price on your neck of the woods.


Good to hear about the Acer.

Any idea when we can expect freesync?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Chopper1591*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> if the thermal pad is touching them, then heat will still transfer. if not, then i recommend re-doing the pad. make sure those two exposed ones are covered. the sensor might not be located on all them. not sure.
> 
> 
> 
> @chopper, i've owned an Acer monitor for almost 4 years now without issue, so i went and got an Acer 4K. no dead pixel, which can happen to any brand. did not even calibrate the color on both 7950 and my 290s. plug and play. the stand is superb. better than my sony tv. no screws to assemble and easy to set portrait mode.
> 
> edit: But, despite the naysayers, i recommend waiting for freesync. might be the same price on your neck of the woods.
> 
> 
> 
> Good to hear. About the acer.
> 
> Any idea when we can expect freesync?

Samsung and others will be releasing FreeSync-enabled monitors in Q1 2015.


----------



## Chopper1591

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Q1 2015 Samsung and others will be releasing Freesync enabled Monitors


Thanks mate.

Time to start saving then.


----------



## hyp36rmax

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Q1 2015 Samsung and others will be releasing Freesync enabled Monitors


That's what i'm waiting for also! I'm currently using an ASUS PB287Q 4K (which is awesome btw); I'll pick up one of these for my Crossfire 290X setup and move the ASUS to my other rig.








Quote:


> Originally Posted by *Chopper1591*
> 
> Thanks mate.
> 
> Time to start saving then.


Yep, i know the feeling


----------



## tsm106

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Wezzor*
> 
> But guys, is there any reason to have AA on at all for 1440p? If that's the case, I think I'd be able to run 1440p with my rig. CPU at 4,5Ghz and GPU at 1100/1400.
> 
> 
> 
> I think AA is still helpful even at 1440p, but it kinds of depends how sensitive you are to aliasing.

I sometimes game on a 70" 4K and at that size the jaggies are very obvious so AA is still needed.


----------



## Agent Smith1984

So how does quadfire hold up at 4K with 4x-8x AA on those 290s?
Flexing those 512-bit muscles at that res/setting, I bet... does the VRAM fill up (obviously this will vary from title to title)?


----------



## tsm106

Quote:


> Originally Posted by *Agent Smith1984*
> 
> So how does quadfire hold up at 4k with 4x-8x AA on those 290's??
> Flexing those 512 bit muscles at that res/setting I bet...... does the VRAM fill up (obviously this will vary from title to title).


I haven't messed with it in a while; last time was with just three cards. It handles it rather well. But tbh, it's only 2MP higher density vs my display setup, and technically a 6MP load at 120-144fps is roughly twice the load of 4K, equivalent to a 14MP load. Thus, it's harder to run my games on Eyefinity at optimal fps than 4K at 60fps. I don't ever bother monitoring VRAM usage; the apps cannot directly read VRAM anyway, so it's fundamentally inaccurate. If you notice VRAM choke, lower textures till it stops.
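That load comparison can be sketched as pixel throughput normalized to a 60 fps baseline. The 5760x1080 Eyefinity layout is an assumption (the post only says ~6MP):

```python
# Pixels drawn per second, expressed as an equivalent megapixel load at 60 fps.
eyefinity_mp = 5760 * 1080 / 1e6   # ~6.2 MP (assumed triple-1080p layout)
uhd_mp = 3840 * 2160 / 1e6         # ~8.3 MP

for fps in (120, 144):
    equivalent_mp = eyefinity_mp * fps / 60
    print(f"{eyefinity_mp:.1f} MP @ {fps} fps ~= {equivalent_mp:.1f} MP @ 60 fps")
# 6.2 MP at 144 fps works out to a ~15 MP equivalent load at 60 fps,
# versus 8.3 MP for straight 4K @ 60 fps.
```

So driving ~6MP at 120-144fps really is well beyond 4K at 60fps, matching the "equivalent to a 14MP load" figure above.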


----------



## Agent Smith1984

Not sure if it's still applicable, but my brother was running into VRAM limits in Watch Dogs with his 7950, and he just disabled the pagefile and it fixed the stutters.
Basically it kept the textures from pagefile swapping and forced them to swap from RAM, which is apparently fast enough to fix the issues on his rig. He was running an i7 3820 with DDR3 1866 in quad channel though....


----------



## tsm106

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Not sure if it's still applicable, but my brother was running into VRAM load on Watch Dogs with his 7950, and he just disabled the pagefile, and it fixed the stutters.
> Basically kept the textures from pagefile swapping, and forced them to swap from RAM, which is apparently fast enough to fix the issues on his rig. He was running a i7 3820 with DDR3 1866 in quad channel though....


People come up with the oddest fixes, so I suppose if it works for him that's good. It would probably have been easier to just lower IQ settings/textures. WD is a horrifically unoptimized game btw. However, disabling the swapfile in BF4 will not get you very far.


----------



## wermad

How's your Phanteks case???

Here's my 900D:




I'm hoping the 3rd Sapphire gets in by this weekend (and new cpu).


----------



## tsm106

That's different, psu in the same chamber as the mb. I like the Primo a lot; it was the easiest case to set up, more than any case I've used before. I had to use the dremel just once, then the two bore holes for the couplers. The loop was transferred and done in 3 days, so I was happy about the turnover time.


----------



## wermad

I ended up removing the ODD bay - very easy with a drill and bit, and a few screw holes here and there. Was able to squeeze in the massively overkill triple Monsta 480s







.

I'm a good 10C on the water block at idle. I'm hoping both the last Sapphire and the two remaining blocks arrive by this weekend.


----------



## tsm106

How many cards are you running? Three right? Where will your psu go? I'm guessing you have a bigger psu or more psu cuz that v1k won't cut it.


----------



## Klocek001

Well, I'm back on the thread again; my brand new 290 Tri-X will be back from RMA tomorrow or Friday. The shop offered me a G1 Gaming 970 for some extra cash but I turned it down. That's what makes a devoted member of this club, ain't it?

I got a new PSU in the meantime; hope this will solve my cards' mysterious dying thing once and for all. I even had time to call a friend of mine, an electrician. He checked the outlet in my room and the main wire and said it was fine. I also bought a new power bar, with surge protection and a varistor. I pretty much did everything that came to my mind to ensure this won't happen again.

Since the first thing I'll have to do is install the driver, my question is: which driver should I choose? The last one I was using was 14.11.1, and there have been 2 new ones in the meantime.


----------



## wermad

Quote:


> Originally Posted by *tsm106*
> 
> How many cards are you running? Three right? Where will your psu go? I'm guessing you have a bigger psu or more psu cuz that v1k won't cut it.


TPU pulled 500W at the wall for two 290s. They're stock, so no worries about power. Actually, I was about to pick a new unit but decided to wait. Will measure with the Kill A Watt.


----------



## tsm106

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> How many cards are you running? Three right? Where will your psu go? I'm guessing you have a bigger psu or more psu cuz that v1k won't cut it.
> 
> 
> 
> tpu pulled 500w at the wall for two 290s. They're stock so no worries about power. Actually, I was about to pick a new unit, but decided to wait. Will measure with the killawatt.

Dude, my cpu pulls 450w at the wall priming with nothing connected except the cpu at 5.0, so I take website findings with a gigantic grain of salt. Though TPU is usually the most accurate, they typically measure from the plugs, not the wall. It's Guru that always posts stupidly low numbers from the wall. Anyways, I remember this because I just had to redo my cpu validations, so I went through every profile over again, from 4.3 to 5.0. Since I run dual psu, I have all accessories, drives, pumps, and fans running off my 2nd psu. Two cards on the 1300 with the cpu; two cards on the 1000 with the loop, fans, drives, etc. That reminds me, I should do a thorough Kill A Watt test maxed out with furmark.

I can say without a doubt, pulling 500w at the wall with two 290x for full system draw is bs.
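For a sanity check, here's a back-of-envelope estimate. All figures are assumptions for illustration, not measurements: roughly 290 W board power per stock 290/290X, ballpark CPU and accessory loads, and ~90% PSU efficiency:

```python
# Rough full-system wall draw: DC load divided by PSU efficiency.
# All figures are assumptions for illustration, not measurements.
def wall_draw(n_gpus, gpu_w=290, cpu_w=150, rest_w=60, psu_eff=0.90):
    dc_load = n_gpus * gpu_w + cpu_w + rest_w
    return dc_load / psu_eff

print(f"2 stock 290s: ~{wall_draw(2):.0f} W at the wall")   # ~878 W
print(f"3 stock 290s: ~{wall_draw(3):.0f} W at the wall")   # ~1200 W
```

Even with conservative assumptions, a two-card system lands well above 500 W at the wall, and three cards push toward the 1 kW figure [H] measured.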


----------



## F4ze0ne

Quote:


> Originally Posted by *wermad*
> 
> tpu pulled 500w at the wall for two 290s. They're stock so no worries about power. Actually, I was about to pick a new unit, but decided to wait. Will measure with the killawatt.


3 cards running over 1kW according to [H]...



Spoiler: Warning: Spoiler!







http://www.hardocp.com/article/2014/05/13/amd_radeon_r9_295x2_xfx_290x_dd_trifire_review/9


----------



## wermad

You guys keep missing that I'm running 290s, and not overclocking to the sky.

And you have to factor in efficiency, as these measurements are AT THE WALL!!!

I'll be fine, since I don't run furmark 24/7 nor vie for OC king of the world









Edit: also note that's a Haswell system, not X79


----------



## chiknnwatrmln

Quote:


> Originally Posted by *rdr09*
> 
> i have no issue playing using a 4K monitor with my 2 290s. C3 in medium no AA iooks better than 1080 Ultra 8msaa. BF4 - easy. Skyrim and a few other games can be played even with one 290. love it. only downside is Adobe menus are tiny. forced to use reading glasses. lol
> 
> edit: with a 4K TV . . . it will run in 30Hz. Hawaii does not have HDMI 2.0. you need a 970 or 980 for that.


Blech, 30Hz... I thought DisplayPort supported 4K at 60Hz? Or is that a newer version of DP?


----------



## bond32

I will say I had my doubts too... But when I added voltage and my 1300 watt G2 shut down on 3 cards (one 290X and 2 290s), I promptly bought another 1300 G2.


----------



## mAs81

Hey guys just a quick question..

The RMA process isn't going too well with Sapphire, and I thought of using a 4 pin Y splitter fan extension cable to connect the fans to my mobo..

Does that sound like a viable solution? The m/b fan header is PWM to my knowledge, so the fans will run at an appropriate speed in regards to temps, and not at a fixed speed, right?


----------



## wermad

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Blech, 30 fps... I thought that displayport supported 4k60fps? Or is that a newer version of DP?


DP 1.2 has enough bandwidth for 4k@60hz, and 1.3 has almost twice as much.

Most budget 4k tv's and monitors are limited to 30hz. But progress is coming, as some Acers are going for ~$350 (60hz, 28", tn). A nice 32" 60hz ips will be well over $1k.


----------



## Gobigorgohome

Quote:


> Originally Posted by *bond32*
> 
> I will say I had my doubts too... But when I added voltage and my 1300 watt G2 shut down on 3 cards (one 290x and 2 290's), promply bought another 1300 G2.


Have you any idea how much your system was using with four cards with that overclock?

1300W is enough for dual cards on PT1, right?


----------



## tsm106

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> I will say I had my doubts too... But when I added voltage and my 1300 watt G2 shut down on 3 cards (one 290x and 2 290's), promply bought another 1300 G2.
> 
> 
> 
> Have you any idea how much your system was using with four cards with that overclock?
> 
> 1300W is enough for dual cards on PT1, right?
Click to expand...

A G2 1300 is enough for my hexacore and two very highly volted Lightnings, though I also moved all accessories off this psu.


----------



## Gobigorgohome

Quote:


> Originally Posted by *tsm106*
> 
> A G2 1300 is enough for my hexacore and two very highly volted Lightnings, though I also moved all accessories off this psu.


I kind of hit a bump in the road: I actually had an agreement to buy two cards, and he sold one card to someone else. So yeah, then I guess I have to keep my cards and do crossfire on both rigs.

Rig 1 will be:

4930K @ 4,7 Ghz (might do 4,5 Ghz on this too)
Asus Rampage IV Black Edition
16GB Crucial Ballistix Elite 1866 Mhz
2x R9 290X (Hynix)
EVGA G2 1300W
LD Cooling PC-V8
1x MO-RA3 420 LT with 9x 140 mm fans
(+ maybe 1/2 480s)

Rig 2 will be:

3930K @ ~4,5 Ghz
Gigabyte GA-X79-UD3
16GB Corsair Dominator Platinum 2133 Mhz
2x R9 290X (Elpida)
EVGA G2 1300W
Corsair 450D
1x MO-RA3 420 LT with 9x 140 mm fans
(+ maybe 360, 240 and 120)

Or I could keep three cards on the first rig and buy another R9 290X for the second rig (a card with Hynix). Both will be pushing 4K.


----------



## tsm106

Both those setups look good. That said, at 4k they won't be 4k killers on triple A titles. If you think about it in megapixel loads, it makes sense. One 290x is a killer at 1080, not so much at 1440 though still very good. At 4k it is being slayed. 1080 = 2mp, 1440 = 3.7mp ish, 4k = 8mp. It would take 4 gpus to maintain the "killer" level that you had at 1080.
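If anyone wants to sanity-check the megapixel math, here's a quick sketch (assuming 2560x1440 for "1440"; GPU load only scales roughly linearly with pixel count):

```python
# Per-frame pixel loads at each resolution, relative to 1080p.
base = 1920 * 1080 / 1e6  # 1080p in megapixels

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    mp = w * h / 1e6
    print(f"{name}: {mp:.1f} MP ({mp / base:.1f}x the 1080p load)")
# 4K works out to ~8.3 MP, exactly 4x the pixels of 1080p.
```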


----------



## Gobigorgohome

Quote:


> Originally Posted by *tsm106*
> 
> Both those setups look good. That said, at 4k they won't be 4k killers on triple A titles. If you think about it in megapixel loads, it makes sense. One 290x is a killer at 1080, not so much at 1440 though still very good. At 4k it is being slayed. 1080 = 2mp, 1440 = 3.7mp ish, 4k = 8mp. It would take 4 gpus to maintain the "killer" level that you had at 1080.


I could spring for two TRI-X R9 290X's and buy blocks for them and do full trifire on both setups, but I am not sure the extra FPS would make me spend another 810 USD on GPU's.


----------



## tsm106

What's the 2nd rig for? It makes sense to get moar gpu for your personal use rig, but if that second is a communal/shared rig, I wouldn't bother adding more gpu to it.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *wermad*
> 
> Dp 1.2 has enough bandwidth for 4k, and 1.3 almost has twice as much.
> 
> Most budget 4k tv's and monitors are limited to 30hz. But progress is coming as some acer s are going for ~$350 (60hz, 28", tn). A nice 32" 60hz ips will be well over $1k.


Cool, ty. My mom bought an expensive TV, it's 120hz 55" 4k. I would think it has DP in, I'll have to check when we get it though.


----------



## tsm106

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> Dp 1.2 has enough bandwidth for 4k, and 1.3 almost has twice as much.
> 
> Most budget 4k tv's and monitors are limited to 30hz. But progress is coming as some acer s are going for ~$350 (60hz, 28", tn). A nice 32" 60hz ips will be well over $1k.
> 
> 
> 
> Cool, ty. My mom bought an expensive TV, it's 120hz 55" 4k. I would think it has DP in, I'll have to check when we get it though.
Click to expand...

Nooo, 99% of TV's do not have DP. Only 4k I know of with DP is the Panasonic. If you wanna connect your gpu to the tv and use the full 4k@60hz, you will need a gpu with HDMI 2.0. AMD has not released a HDMI 2.0 gpu yet, will have to wait for next gen release or get a 970/980.

In my case I bench with our 4k tv, but don't game on it so I don't care if the image is smeared plastered at 30hz.


----------



## Gobigorgohome

Quote:


> Originally Posted by *tsm106*
> 
> Whats the 2nd rig for? It makes sense to get moar gpu for your personal use rig, but if that second is a communal/shared rig, I wouldn't bother adding more gpu to it.


Second rig is pretty much spare parts, got the UD3 for 80 USD new, so I kind of had to buy it.







The thought behind it was to have something to play on at LAN parties and such; nobody other than me (and some friends) will ever use it. I thought of maybe getting a third 1080p screen and doing 5760x1080 eyefinity again.

I might be able to pick up some Gigabyte R9 290s where I live, do not know how many cards the seller wants to sell, but I think I can get three for 810 USD. Is that good or not? I might also get one Lightning R9 290X for 300 USD pre-owned.


----------



## wermad

These 120hz 4k tv's, like the Seiki, are 30hz at 4k and 120hz at 2k (aka 1080).

A 55" 4k 60hz would be a few grand atm. Don't know if any 120hz 4k hdtv's are available for mainstream consumers.

For hdtv's, hdmi 2.0 will provide the bandwidth (I think 1.4 does support 4k @60hz, but not 5k).


----------



## tsm106

Quote:


> Originally Posted by *wermad*
> 
> These 120hz 4k tv like seike, are 30hz 4k and 120hz 2k (aka 1080).
> 
> A 55" 4k 60hz would a be few grand atm. Don't know if any 120hz 4k hdtv's are available for consumer mainstream users.
> 
> For hdtv's, hdmi 2.0 will provide the bandwidth ( I think 1.4 does support 4k @60hz, but not do 5k).


You know... my Vizio Pseries 70" is only $2K and it does 4k@60hz. You can get a Pseries in 50" for $1k iirc. Not sure what deals are like now but that was last month.

You need hdmi 2.0 for 4k@60hz. 5K isn't even in the spec yet and HDMI 2.0 is already maxed out at 4k@60hz.
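Rough back-of-envelope on the bandwidth side (uncompressed 8-bit RGB, blanking overhead ignored, so real requirements run ~10-20% higher; the link data rates are the commonly quoted post-encoding figures):

```python
# Payload bandwidth needed for a given mode vs. what each link can carry.
def payload_gbps(w, h, hz, bits_per_pixel=24):
    return w * h * hz * bits_per_pixel / 1e9

need_30 = payload_gbps(3840, 2160, 30)  # ~6.0 Gbps
need_60 = payload_gbps(3840, 2160, 60)  # ~11.9 Gbps

for link, cap in [("HDMI 1.4 (8.16 Gbps data)", 8.16),
                  ("HDMI 2.0 (14.4 Gbps data)", 14.4),
                  ("DP 1.2 HBR2 (17.28 Gbps data)", 17.28)]:
    ok30 = "yes" if cap >= need_30 else "no"
    ok60 = "yes" if cap >= need_60 else "no"
    print(f"{link}: 4k@30 {ok30}, 4k@60 {ok60}")
```

Which lines up with the thread: HDMI 1.4 tops out at 4k@30, HDMI 2.0 and DP 1.2 both clear 4k@60.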


----------



## Raephen

Quote:


> Originally Posted by *mAs81*
> 
> Hey guys just a quick question..
> 
> The RMA process isn't going too well with Sapphire,and I thought of using a 4 pin Y splitter fan extension cable to connect the fans to my mobo..
> 
> Does that sound like a viable solution?The m/b fan header is PWM to my knowledge , so the fans will run at an appropriate speed in regards to temps,and not at a fixed speed,right?


As long as the cabling has the right type of headers, I guess you could. I know Gelid offers a GPU-PWM female to standard PWM male cable, but I've never seen them the other way around, as you would need.

If you've found that, you'd need software capable of running a motherboard header based off GPU temps. Speedfan might do the trick, but I'm no expert on that.

I mostly use either a curve with the software that comes with the motherboard, or - in the case of my htpc - Open Hardware Monitor to run them at a fixed speed. And I've never seen the GPU temp as a variable in either solution (I must admit: I've never had need to look that hard into it, though...)


----------



## wermad

Quote:


> Originally Posted by *tsm106*
> 
> You know... my Vizio Pseries 70" is only $2K and it does 4k@60hz. You can get a Pseries in 50" for $1k iirc. Not sure what deals are like now but that was last month.
> 
> You need hdmi 2.0 for 4k@60hz. 5K isn't even in the spec yet and HDMI 2.0 is already maxed out at 4k@60hz.


Ips? These are the pricey ones me thinks.

Was confusing it with wqhd @60hz.


----------



## tsm106

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> You know... my Vizio Pseries 70" is only $2K and it does 4k@60hz. You can get a Pseries in 50" for $1k iirc. Not sure what deals are like now but that was last month.
> 
> You need hdmi 2.0 for 4k@60hz. 5K isn't even in the spec yet and HDMI 2.0 is already maxed out at 4k@60hz.
> 
> 
> 
> Ips? These are the pricey ones me thinks.
> 
> 
> 
> 
> 
> 
> 
> 
> , confusing with wqhd @60hz.
Click to expand...

Vizio's Pseries is a mix of ips and va. The whole line up is 4k@60hz, it starts at $1K, and it is mostly VA. It's not about ips being more expensive. VA is better in most respects. Want lag? Get ips. The Pseries is known for its extremely low lag for a 4k panel and most of the series is VA.


----------



## Gobigorgohome

3x Gigabyte Windforce 3X 290 for 810 USD, worth it? How big of a difference is it between R9 290 and R9 290X? Will be running them on air if I decide to go with them.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *tsm106*
> 
> Nooo, 99% of TV's do not have DP. Only 4k I know of with DP is the Panasonic. If you wanna connect your gpu to the tv and use the full 4k@60hz, you will need a gpu with HDMI 2.0. AMD has not released a HDMI 2.0 gpu yet, will have to wait for next gen release or get a 970/980.
> 
> In my case I bench with our 4k tv, but don't game on it so I don't care if the image is smeared plastered at 30hz.


I gets 60hz @ 4k on DP , 30hz with hdmi . Gots a 4k currrrvy sammie in da lounge room , salesbloke didn't know what DP was LoooooL


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Gobigorgohome*
> 
> 3x Gigabyte Windforce 3X 290 for 810 USD, worth it? How big of a difference is it between R9 290 and R9 290X? Will be running them on air if I decide to go with them.


If you're running on air don't bother . Watercooling these beasts is really the wisest thing you can do . Not even sure you can waterblock those gigas anyways . Mem chips for me would be a deciding factor


----------



## Gumbi

Does anybody know what the max RPM of the fans on the VaporX cooling system is (for the 290, if that makes a difference)?


----------



## rdr09

Not sure about you guys with Tri and Quads but playing with 2 290s is like playing with one card. soooooo smoooooth.


----------



## thrgk

i am on 14.9 and in bf4, not all my 4 290x are at 100%, is this normal?


----------



## Gobigorgohome

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> If you're running on air don't bother . Watercooling these beasts is really the wisest thing you can do . Not even sure you can waterblock those gigas anyways . Mem chips for me would be a deciding factor


Yes, water cooling is preferred, but I want to minimize the cost ... the best way I could do it is by running three cards in both setups. I can get a Lightning R9 290X for 300 USD used, but there is no way I am able to put that thing under water ... if so I would need three Lightning R9 290X's, because of the larger PCB compared to the reference models I already have. As far as I know availability of the EK-FC R9 290X Lightning block is kind of limited, I am afraid, and getting the block into Norway is going to cost me more than it is worth ... In that case I have to buy the BP one from FCPU; that will be 175 USD without shipping, and shipping is 50+ USD for one block ... already 225 USD (this is the bare minimum, probably unrealistic too).
I am not sure if the Gigabyte Windforce is Elpida or Hynix, I guess I can find out though. Anyways, I thought a bit more on the price of them; I do not want to pay more than 240 USD for one of those cards (even though they are not used), new they are 350 USD. This is a bit of a dilemma ...








Quote:


> Originally Posted by *rdr09*
> 
> Not sure about you guys with Tri and Quads but playing with 2 290s is like playing with one card. soooooo smoooooth.


I have been running tri-fire and quad-fire and it is all very smooth, but I would like to cut down to tri-fire, I do not utilize the fourth card at all for what I am doing, so no need for having it in there drawing more power than necessary.


----------



## tsm106

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Nooo, 99% of TV's do not have DP. Only 4k I know of with DP is the Panasonic. If you wanna connect your gpu to the tv and use the full 4k@60hz, you will need a gpu with HDMI 2.0. AMD has not released a HDMI 2.0 gpu yet, will have to wait for next gen release or get a 970/980.
> 
> In my case I bench with our 4k tv, but don't game on it so I don't care if the image is smeared plastered at 30hz.
> 
> 
> 
> I gets 60hz @ 4k on DP , 30hz with hdmi . Gots a 4k currrrvy sammie in da lounge room , salesbloke didn't know what DP was LoooooL
Click to expand...

You're using a DP to HDMI converter? If yes, yer not getting 4k@60hz. Unless I'm mistaken Sammie TVs don't have DP.


----------



## kizwan

Quote:


> Originally Posted by *wermad*
> 
> You guys keep missing I'm running 290s, and not overclocking to the sky.
> 
> And you have factor in efficiency as these measurements are AT THE WALL!!!
> 
> I'll be fine, since I don't run furmark 24/7 nor vie for oc king of the world
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: also note haswell, not x79 system


I'm running fine with two 290's on a 1050W PSU. Most of the time I'm gaming at stock clock (290's) but I did play BF4 with the 290's overclocked to 1150/1600 & CPU at 4.75GHz. No problem. The PSU did run hot though. I think with Haswell you should be fine.
Quote:


> Originally Posted by *wermad*
> 
> These 120hz 4k tv like seike, are 30hz 4k and 120hz 2k (aka 1080).
> 
> A 55" 4k 60hz would a be few grand atm. Don't know if any 120hz 4k hdtv's are available for consumer mainstream users.
> 
> For hdtv's, hdmi 2.0 will provide the bandwidth ( I think 1.4 does support 4k @60hz, but not do 5k).


AFAIK 1.4 supports up to 4k@30hz.
Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Nooo, 99% of TV's do not have DP. Only 4k I know of with DP is the Panasonic. If you wanna connect your gpu to the tv and use the full 4k@60hz, you will need a gpu with HDMI 2.0. AMD has not released a HDMI 2.0 gpu yet, will have to wait for next gen release or get a 970/980.
> 
> In my case I bench with our 4k tv, but don't game on it so I don't care if the image is smeared plastered at 30hz.
> 
> 
> 
> I gets 60hz @ 4k on DP , 30hz with hdmi . Gots a 4k currrrvy sammie in da lounge room , salesbloke didn't know what DP was LoooooL
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> You're using a DP to HDMI converter? If yes, yer not getting 4k@60hz. Unless I'm mistaken Sammie TVs don't have DP.
Click to expand...

I don't know how many 4k monitors/tv's he has, but I think he is referring to the UHD monitor. The Sammie 4k curved UHD TV does not have DP AFAIK.


----------



## tsm106

Quote:


> Originally Posted by *kizwan*
> 
> I don't know how many 4k monitor/tv he have but I think he is referring to UHD monitor. The Sammie 4k curved UHD TV do not have DP AFAIK.


Oh fancy!


----------



## wermad

Ok, ran 3d11 and 3dfs, both 1080, stock clocks, crossfire, confirmed (via AB) both gpu's loading @99%.

-3D11 Demo, G1, G2, G3: ~550w
-3dFS Demo, G1, G2: ~510w

These were on a brand new Kill-A-Watt I just picked up tonight from Fry's. The old one was pretty whacky, reporting idle ~450w and load ~1400w. New one shows idle ~80w. The incoming i5 should add a bit but not much.

REMEMBER, these are at-the-wall readings. Factor in my psu's efficiency (gold, ~89%) and the V1000 is churning out ~490w and ~454w respectively. Very close to some of the crossfire 290 figures.

I'll keep an eye on the Kill-A-Watt. Once it hits ~1150w, then I know I'm at the max rated output of the V1000.









Enter the excuses, in 3...2.....
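For anyone wanting to redo the math with their own readings, the conversion is just wall draw times efficiency (89% is the assumed Gold-range figure here; the real curve varies per unit and load point):

```python
# Wall-to-DC conversion for the two runs above, plus the wall reading
# that would correspond to the V1000's 1000w rated DC output.
EFFICIENCY = 0.89  # assumed Gold-range efficiency at this load

for run, wall_w in [("3DMark 11", 550), ("Fire Strike", 510)]:
    print(f"{run}: {wall_w}w at the wall -> ~{wall_w * EFFICIENCY:.0f}w DC")

print(f"V1000 limit at the wall: ~{1000 / EFFICIENCY:.0f}w")
```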


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *tsm106*
> 
> You're using a DP to HDMI converter? If yes, yer not getting 4k@60hz. Unless I'm mistaken Sammie TVs don't have DP.
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I'm running fine with two 290's on 1050W PSU. Most of the time I'm gaming at stock clock (290's) but I did play BF4 with 290's overclocked to 1150/1600 & CPU at 4.75GHz. No problem. The PSU did running hot though. I think with Haswell you should be fine.
> AFAIK 1.4 supports up to 4k@30hz.
> I don't know how many 4k monitor/tv he have but I think he is referring to UHD monitor. The Sammie 4k curved UHD TV do not have DP AFAIK.
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Oh fancy!
> 
> 
> 
> 
> 
> Click to expand...
Click to expand...

Yes I have a 28" UHD Sammy with DP and a 67" sammy curvy in the lounge room with no DP, but it will run UHD thru PC / HDMI










Howsit goin kizwan , tsm106 and @wermad


----------



## chiknnwatrmln

Quote:


> Originally Posted by *tsm106*
> 
> Nooo, 99% of TV's do not have DP. Only 4k I know of with DP is the Panasonic. If you wanna connect your gpu to the tv and use the full 4k@60hz, you will need a gpu with HDMI 2.0. AMD has not released a HDMI 2.0 gpu yet, will have to wait for next gen release or get a 970/980.
> 
> In my case I bench with our 4k tv, but don't game on it so I don't care if the image is smeared plastered at 30hz.


Thanks, just checked online and you are correct. The damn thing has like 15 inputs and a quad core processor but no displayport. Kinda lame imo, but whatever, it was not bought for gaming.


----------



## Bertovzki

Quote:


> Originally Posted by *wermad*
> 
> Ok, ran 3d11 and 3dfs, both 1080, stock clocks, crossfire, confirmed (via AB) both gpu's loading @99%.
> 
> -3D11 Demo, G1, G2, G3: ~550w
> -3dFS Demo, G1, G2: ~510w
> 
> These were on a brand new Kill-A-Watt I just picked up tonight from Fry's. The old one was pretty whacky, reporting idle ~450w and load ~1400w. New one shows idle ~80w. The incoming i5 should add a bit but not much.
> 
> REMEMBER, these are at the wall readings. Factor in my psu's efficiency (gold ~89%), the V1000 is churning out ~485w and 449w respectively. Very close to the some of the crossfire 290 figures.
> 
> I'll keep an eye on the Kill-A-Watt. Once it hits ~1150w, then I know i"m at the max rated output of the V1000
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Enter the excuses, in 3...2.....


I am interested in any feedback about PSU's , I'd prefer the Seasonic 1200W Platinum , but as is the case with most 1000W+ PSU's they are nearly all 190 mm long , at least the best ones I want anyway.
So the V1000 is a very good PSU and I like the fact it is 170 mm long , very short indeed for this class of PSU ! , that is a very attractive quality for my 750D case as it opens up more possibilities for modding and bottom fan size up to 140 mm or even a 280 rad in the bottom.

So to sum up , I prefer the head room of a 1200W PSU , but it is tempting to go V1000 for size , and I will never run more than 2 290x's , so feedback welcome , any opinions on this subject ? , it would be a no brainer if the 1200W PSU's were 170 mm but they are not.


----------



## Chopper1591

Quote:


> Originally Posted by *Bertovzki*
> 
> I am interested in any feedback about PSU's , id prefer the Seasonic 1200W Platinum , but as is the case with most 1000W PSU's they are nearly all 190 mm long , if they are the best ones i want anyway.
> So the V1000 is a very good PSU and i like the fact it is 170 mm long , very short indeed for this class of PSU ! , that is a very attractive quality for my 750D case as it opens up more possibilities for modding and bottom fan size up to 140 mm or even a 280 rad in the bottom.
> 
> So to sum up , i prefer the head room of a 1200W PSU , but tempting to go V1000 for size , and i will never run more than 2 290x's , so feedback welcome , any opinions on this subject ? , it would be a no brainer if the 1200 W PSU's were 170 mm but they are not.


If you don't clock those cards like crazy you should be good with the v1000.


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yes I have a 28" UHD Sammy with DP and a 67" sammy curvy in the lounge room with no DP, but it will run UHD thru PC / HDMI
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Howsit goin kizwan , tsm106 and @wermad


Not bad bro. I got into new hobby & got to spend more $$$$ that I don't have.







I see you taking over the lounge room now.


----------



## ivers

So I wiped my SSD completely, installed a fresh copy of Win7 and a new AMD driver, and I am still getting a black screen in game with my R9 290.


----------



## Archea47

Quote:


> Originally Posted by *wermad*
> 
> Ok, ran 3d11 and 3dfs, both 1080, stock clocks, crossfire, confirmed (via AB) both gpu's loading @99%.
> 
> -3D11 Demo, G1, G2, G3: ~550w
> -3dFS Demo, G1, G2: ~510w
> 
> These were on a brand new Kill-A-Watt I just picked up tonight from Fry's. The old one was pretty whacky, reporting idle ~450w and load ~1400w. New one shows idle ~80w. The incoming i5 should add a bit but not much.
> 
> REMEMBER, these are at the wall readings. Factor in my psu's efficiency (gold ~89%), the V1000 is churning out ~485w and 449w respectively. Very close to the some of the crossfire 290 figures.
> 
> I'll keep an eye on the Kill-A-Watt. Once it hits ~1150w, then I know i"m at the max rated output of the V1000
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Enter the excuses, in 3...2.....


Interesting and what I want to hear, but troubling

Interesting because I thought my 1000W would be solid for OC'd 290Xs. Over in the 280x thread people used to claim 600 was plenty for CFX'd 280xs (which I thought was madness). Now on the other side of the spectrum I'm hearing my 1000W might not be enough for overclocked 290Xs!









FYI my processor takes >300W and my PSU is an EVGA G2 1000. Peripherals are 2 SSDs, 1 7.2K HDD, Logitech headset and mouse.

Troubling because from your data it looks like we can't trust Kill-a-Watt. If you can go buy 5 more and see the standard deviation between their readings that'd be swell


----------



## statyksyn

How can i Join the club?


----------



## bond32

Quote:


> Originally Posted by *Archea47*
> 
> Interesting and what I want to hear, but troubling
> 
> Interesting because I thought my 1000W would be solid for OC'd 290Xs. Over in the 280x thread people used to claim 600 was plenty for CFX'd 280xs (which I thought was madness). Now on the other side of the spectrum I'm hearing my 1000W might not be enough for overclocked 290Xs!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> FYI my processor takes >300W and my PSU is EVGA G2 1000. Peripherals are 2 SSDs, 1 7.2K HDD, Logitec headset and mouse.
> 
> Troubling because from your data it looks like we can't trust Kill-a-Watt. If you can go buy 5 more and see the standard deviation between their readings that'd be swell


You should be fine with the 1000 G2. It is capable of delivering over 1200 watts if I recall correctly, but at any rate you will need to be approaching 1.5 volts to the cards to start to see its limits.

For example, and these are purely rough numbers just fyi, I have seen software readings of around 370-380 watts per card before when they were at 1.46 volts.


----------



## rdr09

Quote:


> Originally Posted by *statyksyn*
> 
> How can i Join the club?


Show off your setup with a sign that states your ocn name.


----------



## Devildog83

Quote:


> Originally Posted by *Bertovzki*
> 
> I am interested in any feedback about PSU's , id prefer the Seasonic 1200W Platinum , but as is the case with most 1000W PSU's they are nearly all 190 mm long , if they are the best ones i want anyway.
> So the V1000 is a very good PSU and i like the fact it is 170 mm long , very short indeed for this class of PSU ! , that is a very attractive quality for my 750D case as it opens up more possibilities for modding and bottom fan size up to 140 mm or even a 280 rad in the bottom.
> 
> So to sum up , i prefer the head room of a 1200W PSU , but tempting to go V1000 for size , and i will never run more than 2 290x's , so feedback welcome , any opinions on this subject ? , it would be a no brainer if the 1200 W PSU's were 170 mm but they are not.


1000w is plenty for a 4790k and 2 x 290x's, even if overclocked. As long as it's a top quality PSU you will be fine with that.


----------



## F4ze0ne

Quote:


> Originally Posted by *bond32*
> 
> You should be fine with the 1000 G2. It is capable of drawing over 1200 watts if I recall correctly, but at any rate you will need to be approaching 1.5 volts to the cards to start to see it's limits.
> 
> For example, and these are purely rough numbers just for fyi, but I have seen software readings of around 370-380 watts per card before when they were at 1.46 volts.


How many watts can the EVGA 1200w P2 handle then?

I'm putting a 295x2 and a 290x on an X99 system. Hopefully, I have enough power... I'm only planning on overclocking the cpu.


----------



## wermad

Quote:


> Originally Posted by *Bertovzki*
> 
> I am interested in any feedback about PSU's , id prefer the Seasonic 1200W Platinum , but as is the case with most 1000W PSU's they are nearly all 190 mm long , if they are the best ones i want anyway.
> So the V1000 is a very good PSU and i like the fact it is 170 mm long , very short indeed for this class of PSU ! , that is a very attractive quality for my 750D case as it opens up more possibilities for modding and bottom fan size up to 140 mm or even a 280 rad in the bottom.
> 
> So to sum up , i prefer the head room of a 1200W PSU , but tempting to go V1000 for size , and i will never run more than 2 290x's , so feedback welcome , any opinions on this subject ? , it would be a no brainer if the 1200 W PSU's were 170 mm but they are not.


The V1000 is a great unit. JG gave it an excellent score (9.7/10). The cables are a tad bit short but nothing too troublesome. I love that it's 100% modular, not too long, and has the ribbon cables. You can actually buy these cables now off CM's store.

With my psu's unique location, space is tight. Something like a Seasonic 1250w (or its xfx twin) or an Enermax 1500w or lepa 1600w should fit. Albeit, the cables are not that great looking (my ocn ocd will kick in).

The mining craze crash has flooded ebay w/ used power supplies. The AX1200 used to sell strong, now they're going for ~$75-100. My V1000 is going for ~$100-110, and I've seen a bunch of great ones for ~$100 (Enermax 1500w sold for $105, lepa 1600 for $140).

I wish the V1200 was priced better or more abundant. Best price I've found is new ~$270.
Quote:


> Originally Posted by *Chopper1591*
> 
> If you dont clock those cards like crazy you should be good with the v1000.


Nah, with three and currently only at 1080, I don't have a need to oc. I'm going WQHD once I get my taxes (k-monitor, 32" 2560x1440) and so three will still be ok. These are factory oc'd and close to 290x performance.
Quote:


> Originally Posted by *Archea47*
> 
> Interesting and what I want to hear, but troubling
> 
> Interesting because I thought my 1000W would be solid for OC'd 290Xs. Over in the 280x thread people used to claim 600 was plenty for CFX'd 280xs (which I thought was madness). Now on the other side of the spectrum I'm heading my 1000W might not be enough for overclocked 290Xs!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> FYI my processor takes >300W and my PSU is EVGA G2 1000. Peripherals are 2 SSDs, 1 7.2K HDD, Logitec headset and mouse.
> 
> Troubling because from your data it looks like we can't trust Kill-a-Watt. If you can go buy 5 more and see the standard deviation between their readings that'd be swell


It's not a precise tool but it does give you a good indication. I think I killed my old one a few years back when we did the Furmark+prime testing (triple 6950s).

I've owned some power hungry setups in the past (quad 480s, 580s, 7970 ltg.) with many fans (up to 52 fans) so I'm familiar with power needs. My room is set up on a 20amp breaker (good for ~2400w at 120v). Also, keep in mind your cpu is definitely power hungry. My Pentium may not add much and there should be a moderate increase with the i5.
Quote:


> Originally Posted by *bond32*
> 
> You should be fine with the 1000 G2. It is capable of drawing over 1200 watts if I recall correctly, but at any rate you will need to be approaching 1.5 volts to the cards to start to see it's limits.
> 
> For example, and these are purely rough numbers just for fyi, but I have seen software readings of around 370-380 watts per card before when they were at 1.46 volts.


A lot of ppl don't factor in efficiency and the first thing they point out is overclocking. I don't plan to overclock (as stated a bunch of times). And, it's simple to calculate efficiency.

I may do some oc'ing for the Red Vs Green competition but nothing atm if I'm honest.
Quote:


> Originally Posted by *Devildog83*
> 
> 1000w is plenty for a 4790k and 2 x 290x's even if overclocked as long as it's a top quality PSU you will be fine with that.


Yup, quality is the name of the game. Something cheap will probably fry with some highly oc'd cards. And more than likely, kill something else.


----------



## mAs81

Quote:


> Originally Posted by *Raephen*
> 
> As long as the cabling has the right type of headers, I guess you could. I know Gelid overs a GPU-PWM female to standard PWM male cable, but I've never seen them the other way arround as you would need.
> 
> If you've found that, you'd need software capable to run a motherboard header based of GPU temps. Speedfan might do the trick, but I'm no expert on that.
> 
> I mostly use either a curve with the software that comes with the motherboard, or - in the case of my htpc - Open Hardware Monitor to run htem at a fixed speed. And I've never seen the GPU temp as a variable in either solution (I must admit: I've never had need to look that hard into it, though...)


That's the plan.. I ordered a custom cable from icemodz, and I'm planning to use either EZ setup for fan control or find a more suitable program that will adjust fan speed by temps.. I'll give Speedfan a try, thanks









If anyone else knows a program for that kind of use, give me a holler


----------



## wermad

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yes I have a 28' UHD Sammy with DP and a 67' sammy curvy in lounge room with no DP but will run UHD thru PC / HDMI
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Howsit goin kizwan , tsm106 and @wermad












Comfy chairs and dat tv


----------



## tsm106

Total man cave with the bench stand and innards hanging out like that.


----------



## conwa

Quote:


> Originally Posted by *Devildog83*
> 
> 1000w is plenty for a 4790k and 2 x 290x's even if overclocked as long as it's a top quality PSU you will be fine with that.


I got an 850w CM Silent Pro (garbage, I know) and with a 4790K (@ 4.8) and 290x's in crossfire overclocked to 1170 core, I got no problems whatsoever!!!


----------



## Devildog83

Quote:


> Originally Posted by *F4ze0ne*
> 
> How much watts can the EVGA 1200w P2 handle then?
> 
> I'm putting a 295x2 and a 290x on an X99 system. Hopefully, I have enough power... I'm only planning on overclocking the cpu.


I know that TTL tested 2 - 295x2 cards and 1200w would not be enough if overclocked and loaded to the max but for your tri-fire set-up 1200w should be enough, especially if you don't overclock the GPU's.

Video -


----------



## AlienPrime173

Just got mine today!

PCS+ 290X!









Got it delivered to my work!


----------



## tsm106

People, PSU PSA. I find that most ppl don't actually overclock as much as they say. That doesn't mean their low numbers are the rule of thumb. You can run these cards in a manner that does not use up a lot of power, if you wanted to. On the flip side you can draw a helluva lot of power as well. Below is my dual PSU setup. The first PSU, a 1300 G2, for this test runs a 3930K at 5GHz and two 290X at 1100/1400 stock voltage, and 1250/1600 +300 (more than needed but extra just in case). There is NOTHING ELSE CONNECTED! No drives, fans, nothing. The second PSU runs the other two 290X and all the accessories, pumps, 30 or so odd fans, etc etc.



Now the wattage at the wall. It's not so bad for a 5GHz CPU etc. 923*.87 = 803w converted.



Now let's see really overclocked, though actually it's not that overclocked compared to the max OC, which is 1300/1700 for these two.



Look at the temps after 5mins lol. Now for the martini shot.



Lmao. This surprised me a bit because I hadn't actually run this specific test till now. 1537w at the wall lol. 1537*.87 = 1337w. I guess I am over-running my psu.

Long story short, you COULD run these cards, rigs in similar low draw setups. And yet on the flipside you can also run them balls to the wall.

Thus how do you make a recommendation? For the low end only? That would mean you won't ever get to OC balls to the wall. Are you guys ok with that? Personally, I wouldn't be, because I don't want to be limited in what I can do.
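For anyone following the arithmetic above: a Kill-a-Watt reads AC draw at the wall, and the DC load the PSU actually delivers is roughly wall draw times efficiency. The 87% here is the efficiency figure assumed in the post for this unit at this load, not a universal constant; a quick sketch:

```python
# Convert wall (AC) draw to approximate DC load delivered by the PSU.
# efficiency=0.87 is the assumed figure from the post, not a measured value.
def dc_load(wall_watts, efficiency=0.87):
    return wall_watts * efficiency

print(round(dc_load(923)))   # ~803 W
print(round(dc_load(1537)))  # ~1337 W
```

The same formula explains why a 1300w-rated PSU can show 1500w+ at the wall without the DC side exceeding its rating by much.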


----------



## Archea47

Scary stuff ... and here I thought I was overkilling it!

Time to look at 1300W+ PSUs ... I'm looking to put them underwater in a couple weeks but if I can't OC them there's not much point apart from the aesthetics, which I don't want to admit is a driver

Is there a way to test the calibration of the Kill-a-Watt to be sure?


----------



## tsm106

Quote:


> Originally Posted by *Archea47*
> 
> Scary stuff ... and here I thought I was overkilling it!
> 
> Time to look at 1300W+ PSUs ... I'm looking to put them underwater in a couple weeks but if I can't OC them there's not much point apart from the aesthetics, which I don't want to admit is a driver
> 
> Is there a way to test the calibration of the Kill-a-Watt to be sure?


You can run a variety of load points, boot up, idle, stock stock stock, to get an idea of what it is reporting. Defective units will give erroneous readings which is a major tell.


----------



## Mugamat

Renewed aio cooling on my R9 290 to Thermaltake Water 3.0 Ultimate. Here are some screens with temps in idle and load. Heaven result and case look.

*Idle:* 

*Load:*

*Result:*

*Case Look:*


----------



## Mugamat

Damn, placed old screen on load. Here is the newest


----------



## thrgk

Hey guys I have an issue.

I had 14.9 drivers installed, and started to play bf4. it was going fine, I had my cards overclocked to 1100core, 1300mem, +50mv, +50% pl, and +50mv aux voltage.

Like 30mins in it bsod with atikimg or something. So i rebooted, and it showed the same bsod, rebooted again it started, but it came out No drivers detected or there is a problem with the drivers please reinstall. So I uninstalled using DDU, went to install 14.9 and it bsod while installing again.

I have now done a clean Windows install; after installing motherboard drivers and Windows updates I went to install drivers but it BSOD again. So I went into Device Manager, right clicked on one and updated driver software. Now it shows all 200 series in Device Manager, but the last one in the list has a yellow triangle next to it. Any idea why? I right clicked it and installed, I tried disabling it, but when I go to install drivers like the 14.12 or 14.9 I BSOD, so I think the yellow triangle is causing the issue. What should I do? These are new cards, less than a year old.

Thanks guys

EDIT: The code windows gives is CODE 43 (The device has stopped working) when I click properties on it in device manager


----------



## F4ze0ne

Quote:


> Originally Posted by *Devildog83*
> 
> I know that TTL tested 2 - 295x2 cards and 1200w would not be enough if overclocked and loaded to the max but for your tri-fire set-up 1200w should be enough, especially if you don't overclock the GPU's.
> 
> Video -


Thanks. I kind of figured I'd be ok. After reading the tri-fire review on [H] they stated 1200w minimum. So, I knew going into it that overclocking the cards would not be possible if I went the 1200w route. I guess I've boxed myself into a corner not going with at least 1350w+, but I can always add a 2nd PSU down the line if I get the itch to clock the cards up. I suppose I could put the system components on my 650w and use the 1200w for the cards alone.


----------



## dteg

Well, I definitely want in. Took me a while to decide which one I wanted, and then I forgot to consider if it could even fit in my case







So I went back through my top list and found one that could. BARELY though.
Got the MSI one and with Amazon Prime had it in my hands in no time.


----------



## wermad

Quote:


> Originally Posted by *tsm106*
> 
> You can run a variety of load points, boot up, idle, stock stock stock, to get an idea of what it is reporting. Defective units will give erroneous readings which is a major tell.


Mother-of-god! That's a lot of powah!

Yup, my old unit was giving me stupidly absurd readings. New one seems to be working properly.

Try the furmark + prime torture test (at your own risk)?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> Not bad bro. I got into new hobby & got to spend more $$$$ that I don't have.
> 
> 
> 
> 
> 
> 
> 
> 
> I see you taking over the lounge room now.


When it's not in use ..........









What hobby is this ?? Sports cars ?? New Wife !? Casino ?!
Quote:


> Originally Posted by *wermad*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Comfy chairs and dat tv


Those chairs are full leather / electric recliners . They turn into beds !!










Yep that teev

Quote:


> Originally Posted by *tsm106*
> 
> Total man cave with the bench stand and innards hanging out like that.


That's my Z97 rig ( mothballed ) with chilled cpu and 290's

This is man cave rig





U like ??

Sorry about the quality of those pics . Me thinks I need a new camera ..............


----------



## Gobigorgohome

How high can MSI Lightning R9 290X's clock on air with fans @ 100%? Thinking of picking up one card for 300 USD pre-owned and buying another one brand new for crossfire, and pushing them a little on air.


----------



## Gumbi

Quote:


> Originally Posted by *Gobigorgohome*
> 
> How high clocks is possible with MSI Lightning R9 290X's on air with fans @ 100%? Thinking of picking up one card for 300 USD pre-owned and buy another one brand new for crossfire and push them a little on air.


They suck a lot more juice when overclocked; 200 more watts is possible I think. But if you're happy with that and 100% fans, you're looking at 1200-1250MHz.


----------



## tarui04

I just found out the hard way that the MSI 290X Gaming, while it uses a reference board design, uses components a little different from reference.

The chokes or something beside the capacitors have solder legs that stick out too much, such that they prevent the AQC active backplate from installing properly.

There are 4 contact areas, 1 denoted by the screwdriver, showing the contact between the solder legs and the raised portion of the backplate.

Is there any solution besides drilling out the affected areas (4 holes or 1 rectangular hole to cover all 4 solder legs) if I want to use the active backplate?


----------



## HOMECINEMA-PC

Sounds like a job for Mr Dremel


----------



## tsm106

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Sounds like a job for Mr Dremel


There's a water channel there. Mr Dremel could create a flood... so be careful.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *tsm106*
> 
> There's water channel there. Mr Dremel could create a flood... so be careful.


LooooL
Once again another post with very little info eg : RIGBUILDER
ROG boards are ' nearly waterproof '
Speaking from experience









Anyways back to COD BO 2


----------



## tarui04

Quote:


> Originally Posted by *tsm106*
> 
> There's water channel there. Mr Dremel could create a flood... so be careful.


there's no drilling on the waterblock

just the backplate


----------



## tsm106

Quote:


> Originally Posted by *tarui04*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> There's water channel there. Mr Dremel could create a flood... so be careful.
> 
> 
> 
> there's no drilling on the waterblock
> 
> just the backplate
Click to expand...

The active backplate has a water channel, so you have to be careful if you drag out Mr Dremel to grind something to clear something...?

*Oh, looking closer: the channel is just screwed onto the plate? Somewhat gimmicky...


----------



## invincible20xx

Is there any way to get away with a 40% maximum fan speed while gaming without hitting 95C on a reference-model R9 290?


----------



## Klocek001

why do you ask ?


----------



## Bertovzki

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yes I have a 28' UHD Sammy with DP and a 67' sammy curvy in lounge room with no DP but will run UHD thru PC / HDMI
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Howsit goin kizwan , tsm106 and @wermad


You are very brave indeed putting that can of drink on the same table as your exposed PC components







So easy to knock a drink over, zapp! Smoke, sparks... it scares me looking at it. Nice rig though, nice screen!


----------



## kizwan

Quote:


> Originally Posted by *tsm106*
> 
> People, PSU PSA. I find that most ppl don't actually overclock as much as they say. That doesn't mean their low numbers are the rule of thumb. You can run these cards in a manner that does not use up a lot of power, if you wanted to. On the flip side you can draw a helluvlot of power as well. Below is my dual psu setup. On the first psu, a 1300 G2 which for this test runs a 3930K at 5ghz and two 290x at 1100/1400 stock voltage, and 1250/1600 +300 (more than needed but extra just in case). There is NOTHING ELSE CONNECTED! No drives, fans, nothing. Second psu runs the other two 290x and all the accessories, pumps, 30 or so odd fans, etc etc.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Now the wattage at teh wall. It's not so bad 5ghz cpu etc. *923*87 = 803w* converted.
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Now lets see really overclocked, though actually its not that overclocked compared to what max OC is with is 1300/1700 for two.
> 
> 
> 
> Look at the temps after 5mins lol. Now for the martini shot.
> 
> 
> 
> Lmao. THis surprised me a bit because I hadn't actually run this specific test till now. 1537w at the wall lol. 1537*.87=1337w. I guess I am over running my psu.
> 
> Long story short, you COULD run these cards, rigs in similar low draw setups. And yet on the flipside you can also run them balls to the wall.
> 
> Thus how do you make a recommendation? For the low end only? That would mean you won't ever get to oc balls to the walls? Are you guys ok with that? Personally, I wouldn't be because I don't want to be limited in what I can do.


Why 87%?


----------



## Trys0meM0re

Quote:


> Originally Posted by *kizwan*
> 
> Why 87%?


Efficiency conversion.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Bertovzki*
> 
> You are very brave indeed putting that can of drink on the same table as your exposed PC components
> 
> 
> 
> 
> 
> 
> 
> so easy to knock a drink over ,zapp ! smoke sparks , scares me looking at it , nice rig though , nice screen !


If your rough as guts it _might_ spill but nowhere near it








The mobo is sitting on a 3inch thick foam pad , plus it had a water chiller on it at the time as well .


----------



## Chopper1591

Quote:


> Originally Posted by *tsm106*
> 
> People, PSU PSA. I find that most ppl don't actually overclock as much as they say. That doesn't mean their low numbers are the rule of thumb. You can run these cards in a manner that does not use up a lot of power, if you wanted to. On the flip side you can draw a helluvlot of power as well. Below is my dual psu setup. On the first psu, a 1300 G2 which for this test runs a 3930K at 5ghz and two 290x at 1100/1400 stock voltage, and 1250/1600 +300 (more than needed but extra just in case). There is NOTHING ELSE CONNECTED! No drives, fans, nothing. Second psu runs the other two 290x and all the accessories, pumps, 30 or so odd fans, etc etc.
> 
> 
> 
> Now the wattage at teh wall. It's not so bad 5ghz cpu etc. 923*87 = 803w converted.
> 
> 
> 
> Now lets see really overclocked, though actually its not that overclocked compared to what max OC is with is 1300/1700 for two.
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Look at the temps after 5mins lol. Now for the martini shot.
> 
> 
> 
> Lmao. THis surprised me a bit because I hadn't actually run this specific test till now. 1537w at the wall lol. 1537*.87=1337w. I guess I am over running my psu.
> 
> Long story short, you COULD run these cards, rigs in similar low draw setups. And yet on the flipside you can also run them balls to the wall.
> 
> 
> Thus how do you make a recommendation? For the low end only? That would mean you won't ever get to oc balls to the walls? Are you guys ok with that? Personally, I wouldn't be because I don't want to be limited in what I can do.


Thanks for sharing this tsm.

By looking at that I might even be able to run my 8320 with CF 290 tri-x, mildly OCed, on my hx750 psu.

Any idea what your cpu wattage is? With the shown overclock?
Quote:


> Originally Posted by *Gobigorgohome*
> 
> How high clocks is possible with MSI Lightning R9 290X's on air with fans @ 100%? Thinking of picking up one card for 300 USD pre-owned and buy another one brand new for crossfire and push them a little on air.


My 290 tri-x reaches 1200 no problem with 100% fan.
I know this is not a very good comparison, but you should be able to achieve at least that.
Depends on the card also of course... some just clock better.

But the question is....
Can you really stand 100% fan?

Ear damage!
Quote:


> Originally Posted by *Bertovzki*
> 
> You are very brave indeed putting that can of drink on the same table as your exposed PC components
> 
> 
> 
> 
> 
> 
> 
> so easy to knock a drink over ,zapp ! smoke sparks , scares me looking at it , nice rig though , nice screen !


Early fireworks this year.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Chopper1591*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Thanks for sharing this tsm.
> 
> By looking at that I might even be able to run my 8320 with CF 290 tri-x, mildly OCed, on my hx750 psu.
> 
> Any idea what your cpu wattage is? With the shown overclock?
> My 290 tri-x reaches 1200 no problem with 100% fan.
> I know this is not a very good comparison, but you should be able to achieve at least that.
> Depends on the card also of course... some just clock better.
> 
> But the question is....
> Can you really stand 100% fan?
> 
> Ear damage! [/B]
> 
> 
> Early fireworks this year.


There will be when you try to overclock that FX thing past 5 gigglehurtles LooooooL


----------



## Gobigorgohome

Quote:


> Originally Posted by *Gumbi*
> 
> They suck a lot more juice when overclocked; 200 more watts is possible I think. But if you're happy with that and 100% fans, you're looking at 1200-1250MHz.


1200 Mhz sounds fine to me. Thank you!








Quote:


> Originally Posted by *Chopper1591*
> 
> My 290 tri-x reaches 1200 no problem with 100% fan.
> I know this is not a very good comparison, but you should be able to achieve at least that.
> Depends on the card also of course... some just clock better.
> 
> But the question is....
> Can you really stand 100% fan?
> 
> Ear damage!


I have run four reference cards stacked at 100% fan speed; two cards (even though they have more fans) cannot be worse than that!







.


----------



## Mega Man

Quote:


> Originally Posted by *F4ze0ne*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> You should be fine with the 1000 G2. It is capable of drawing over 1200 watts if I recall correctly, but at any rate you will need to be approaching 1.5 volts to the cards to start to see it's limits.
> 
> For example, and these are purely rough numbers just for fyi, but I have seen software readings of around 370-380 watts per card before when they were at 1.46 volts.
> 
> 
> 
> How much watts can the EVGA 1200w P2 handle then?
> 
> I'm putting a 295x2 and a 290x on an X99 system. Hopefully, I have enough power... I'm only planning on overclocking the cpu.
Click to expand...

umm... is this a trick question.... 1200

unless you mean the " buffer " in which case i recommend reading this
http://www.overclock.net/t/928113/a-message-to-the-community-on-enthusiast-power-supplies/0_100
Quote:


> Originally Posted by *tsm106*
> 
> Total man cave with the bench stand and innards hanging out like that.











Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Sorry about the quality of those pics . Me thinks I need a new camera ..............


totally stealing that !
Quote:


> Originally Posted by *tarui04*
> 
> I just found out the hard way that the msi 290x gaming, while uses a reference board design, uses components a little different from ref.
> 
> the chokes or something beside the capacitors has solder legs that stick out too much, such that it prevents the AQC active backplate from installing properly.
> 
> there are 4 contact areas, 1 denoted by the screwdriver. shows the contact between the solder legs and the raised portion of the backplate.
> 
> is there any solution besides drilling out the affected areas (4 holes or 1 rectangular hole to cover all 4 solder legs) If i want to use the active backplate?


wait ... are you talking about this ?

( please note this is a random pic, not a gpu )

if so then you need a pair of these 
( side cutters ) and just nip the tip, _you can_ ( please note this does not mean you WILL ) do it cleanly so you can keep the warranty as well * at your own risk


----------



## Chopper1591

Quote:


> Originally Posted by *Gobigorgohome*
> 
> 1200 Mhz sounds fine to me. Thank you!
> 
> 
> 
> 
> 
> 
> 
> 
> I have done four referencecards stancked at 100% fan speed, two cards (even though they have more fans) cannot be worse than that!
> 
> 
> 
> 
> 
> 
> 
> .


Haha omg.
Nah those cards will be much nicer on the ears.
Reference is horrible.


----------



## Gobigorgohome

Quote:


> Originally Posted by *Chopper1591*
> 
> Haha
> Nah those cards will be much nicer on the ears.
> Reference is horrible.


Four reference cards on air with 100% fan speed are seriously *bad*. The cards did manage to stay at 78C (yes, all four), but the amount of noise they produce is unbelievable; I laughed so hard the first time I heard it. It was that bad that I had to put my microphone on the other side of the room for my mates to hear me while we were gaming.







To make the situation a little worse it was 25C+ ambient when I had those cards in that setup. The room my cards are in now is 10-13C ambient, so it would be good with a few heaters (R9 290X's).


----------



## tarui04

any suggestions? drill hole straight through or mill some holes into the backplate to fit the "solder legs"?


----------



## Chopper1591

Quote:


> Originally Posted by *tarui04*
> 
> any suggestions? drill hole straight through or mill some holes into the backplate to fit the "solder legs"?


If you can mill, do it.
Would look cleaner after all.

You could also put some rubber in or something to isolate it, just in case.


----------



## tarui04

Quote:


> Originally Posted by *Chopper1591*
> 
> If you can mill... do it.
> Wou look cleaner after all.
> 
> You could also put some rubber in or something to isolate, just in case.


any idea what kind bits i should be looking for?

i need those that can cut horizontally. as in, it can remove material when moved horizontally. I have no idea what those things are called in the cnc world.


----------



## Chopper1591

Quote:


> Originally Posted by *tarui04*
> 
> any idea what kind bits i should be looking for?
> 
> i need those that can cut horizontally. as in, it can remove material when moved horizontally. I have no idea what those things are called in the cnc world.


Thats not my field of knowledge.
I would just grab mr dremel and use something like these


----------



## Mega Man

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Chopper1591*
> 
> Haha
> Nah those cards will be much nicer on the ears.
> Reference is horrible.
> 
> 
> 
> Four reference-cards on air with 100% fan speed is seriously *bad*, the cards did manage to stay at 78C (yes, all four), but the amount of noise they produces is unbelievable, I laughed so hard the first time I heard it that it is terrible. It was that bad that I had to put my microphone on the other side of the room while gaming for my mates to hear me while we were gaming.
> 
> 
> 
> 
> 
> 
> 
> To make the situation a little worse it was 25C+ ambient when I had those cards in that setup. The room my cards are in now is 10-13C ambient, so it would be good with a few heaters (R9 290X's).
Click to expand...

pfffft stand next to a chiller

Quote:


> Originally Posted by *tarui04*
> 
> any suggestions? drill hole straight through or mill some holes into the backplate to fit the "solder legs"?


@tarui04


Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tarui04*
> 
> I just found out the hard way that the msi 290x gaming, while uses a reference board design, uses components a little different from ref.
> 
> the chokes or something beside the capacitors has solder legs that stick out too much, such that it prevents the AQC active backplate from installing properly.
> 
> there are 4 contact areas, 1 denoted by the screwdriver. shows the contact between the solder legs and the raised portion of the backplate.
> 
> is there any solution besides drilling out the affected areas (4 holes or 1 rectangular hole to cover all 4 solder legs) If i want to use the active backplate?
> 
> 
> 
> wait ... are you talking about this ?
> 
> ( please note this is a random pic, not a gpu )
> 
> if so then you need a pair of these
> ( side cutters ) and just nip the tip, _you can_ ( please note this does not mean you WILL ) do it cleanly so you can keep the warranty as well * at your own risk
Click to expand...


----------



## tarui04

Quote:


> Originally Posted by *Mega Man*
> 
> pfffft stand next to a chiller
> @tarui04


I'm not willing to cut/trim the solder bit off, just in case I damage the board.


----------



## Chopper1591

Quote:


> Originally Posted by *tarui04*
> 
> I'm not wiling to cut/trim the solder bit off. just in case i damage the board.


Then follow my reply.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *tarui04*
> 
> I'm not wiling to cut/trim the solder bit off. just in case i damage the board.


Put tape around them and then snip if you're scared to scratch the PCB


----------



## tarui04

Quote:


> Originally Posted by *Chopper1591*
> 
> Then follow my reply.


I know... I am trying to find a CNC services workshop that is willing to handle this kind of one-off, low-volume job.
I have a dremel but the work stand isn't that precise; it tends to wobble a little


----------



## ebduncan

Hey guys.

I have been thinking about selling my R9-290 with EK water block. What would be a fair asking price? Its a XFX Black Edition R9-290 (reference pcb/cooler) Would come with the factory cooling solution in the box as well as the EK water block installed on card.

If it makes any difference the card will do 1250 core/1600+ mem (hynix)

About to make the move to 4k. With all the talk of new AMD cards coming out soon figured now would be the time to sell. I can rough it out on my old 7950 until the new cards are released.


----------



## Gobigorgohome

Okay, I have a question, guys. I have decided to go with tri-fire on my RIVBE; it runs games smoother or as smooth as quadfire, so I do not see the point in having four cards in that system, it just uses more power.

I have got an MSI Lightning for 345 USD shipped and I am ordering at least one card from Amazon. Should I do:

2x MSI Lightning R9 290X and 1x Sapphire Radeon R9 290X (Hynix)

OR

3x MSI Lightning R9 290X (the middle one will have to be on water)










Edit: The water-cooled Lightning will go well with the phase-changed i5-3570K as well


----------



## wermad

@Ebduncan.

Ebay is selling 290s (ref) for ~$180-220 depending on condition. The blocks usually go for ~$65-90 depending on model and whether a backplate is included. Now typically, the forums tend to sell for about 10-15% less than ebay.

I bought three Sapphire Tri-X OC 290s. With a better cooler and slight OC, they sell a bit higher than the reference cards.

Normally you would need to go to the market for an appraisal, but you can get an idea from what I posted. Good luck, and what are you moving on to? Maxwell? GM200?

Edit: damn ups decided to make package (cpu) five day air vs two









Crossing my fingers usps delivers my 3rd sapphire today or tomorrow. Pae will be choking







, lol.


----------



## Chopper1591

Quote:


> Originally Posted by *tarui04*
> 
> i know...i am trying to find a cnc services workshop that is willing to handle this kind of one-off low volume jobs.
> i have a dremel but the work stand isn't that precise. it tends to wobble a little


Good luck.
Post back.

Maybe I am crazy or something, but it sounds like you are about to overdo it...
If I were in your place I would just trim it while holding the dremel in one hand and the backplate in the other.









Does it need to be that pretty? It's the downside of the backplate, right?


----------



## ebduncan

Quote:


> Originally Posted by *wermad*
> 
> @Ebduncan.
> 
> Ebay is selling 290s (ref) ~$180-220 depending on condition. The block, they usually go for~$65-90 depending on model and if a backplate is included. Now typically, the forums tend to sell about 10-15% less then ebay.
> 
> I bought three sapphire trix oc 290s. With a better cooler and slight oc, they sell a bit higher then the reference.
> 
> Normally you would need to go the market for an appraisal, but you can get an idea from what I posted. Good luck and what are you moving on to? Maxwell? Gm200?
> 
> Edit: damn ups decided to make package (cpu) five day air vs two
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Crossing my fingers usps delivers my 3rd sapphire today or tomorrow. Pae will be choking
> 
> 
> 
> 
> 
> 
> 
> , lol.


Thanks. Rep'ed. Looks like I will just sell the reference cooler separately. I'm sure someone will want to buy a brand new unused one to replace their broken fan etc....

Not sure what I will move to. It will be a toss-up between GM200 and the 390/390X. Probably the 390X, as it has FreeSync and I am going to buy one of those Samsung FreeSync 4K monitors that will be released soon. We will see.


----------



## Chopper1591

Quote:


> Originally Posted by *ebduncan*
> 
> Thanks. Rep'ed. Looks like I will just sell the reference cooler separate. I'm sure someone will want to buy a brand new unused one to replace their broken fan etc....
> 
> Not sure what I will move to. It will be a toss up between GM200, and the 390/390x. Probably the 390x as it has free sync and I am going to buy one of those Samsung free sync 4k monitors that will be released soon. We will see.


390x all the way!

Very interested in the HBM thingy.


----------



## bond32

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Okay, I have a question guys. I have decided to go with tri-fire with my RIVBE, it run games smoother or as smooth as quadfire so I do not see the point in having four cards in that system, it just uses more power.
> 
> I have got a MSI Lightning for 345 USD shipped and I am ordering at least one card from Amazon, should I do:
> 
> 2x MSI Lightning R9 290X and 1x Sapphire Radeon R9 290X (Hynix)
> 
> OR
> 
> 3x MSI Lightning R9 290X (the middle one will have to be on water)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: The water cooled Lightning will go good with the phase changed i5-3570K as well


Suppose you are already aware but I thought that lightning air cooler needed triple slot spacing no?


----------



## wermad

Boards like the z97 classy allow you to run 3way triple slot cards. Though, that is gonna be a hawt sammich







.


----------



## Gobigorgohome

Quote:


> Originally Posted by *bond32*
> 
> Suppose you are already aware but I thought that lightning air cooler needed triple slot spacing no?


Yeah? Both the Rampage IV Black Edition and the GA-X79-UD3 have the space for 4-way CFX/SLI: the first card occupies the first and second slots, the second card (water cooled) would take up two spots, and the third card would sit in the fourth slot ... I do not see the problem ...


----------



## tsm106

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Suppose you are already aware but I thought that lightning air cooler needed triple slot spacing no?
> 
> 
> 
> Yeah? Both the Rampage IV Black Edition and the GA-X79-UD3 have the space for 4-way CFX/SLI, the first card occupy the first and second slot, the second card (water cooled would take up two spots) and the third card would sit in the fourth slot ... I do not see the problem ...
Click to expand...

Lightning is 2.5-slot spacing. Some boards, as mentioned, have a "2B" slot that will allow thicker-than-two-slot cards to run triple. The RIVE Black is not one of them. The regular RIVE does, however.

Note the black slot between the two middle red slots. That is the 2B slot.


----------



## Roboyto

*Finally got around to making an Omega Drivers Comparison....better late than never?*


All runs were made with identical CPU/RAM settings.
This is a fairly fresh Win7 Ultimate install (November 30th).
DDU was run to upgrade drivers; CCleaner was also used to make sure any extra traces were gone.
Tessellation was always on unless noted.
All runs were made at least twice and the best score was recorded.
If there was a large delta, I would reboot and run another 1-2 times to make sure the results were valid.

Stock clocks of 980/1250 were used for the first set.
They were then compared to a fairly high OC of 1280/1675 +200mV +50% power.
This is on the stock XFX BIOS.
Trixx was used for the OC.

Some screenshots are missing, my apologies...

My system...

*Win7 Ultimate x64*

*ASUS Z87 Gryphon w/ Thermal Armor*

*i7 4770k @ 4.5GHz - XSPC Raystorm - Delidded/Naked*

*2x8 GB Dominator Platinum 2400MHz*

*XFX BE R9 290 Reference - XSPC Razor w/ Backplate*

*Rosewill Capstone 650W Modular*

*Stock Clocks - 980/1250 -Driver 14.301.1001-140915a-176154C - Catalyst 14.9*

*Bioshock Infinite Benchmark (Ultra DX11 DDOF - 1920*1080)*




Per Scene Stats:

| Scene Duration (seconds) | Average FPS | Min FPS | Max FPS | Scene Name |
| --- | --- | --- | --- | --- |
| 32.79 | 128.77 | 29.39 | 305.94 | Welcome Center |
| 6.54 | 141.22 | 21.6 | 168.62 | Scene Change: Disregard Performance In This Section |
| 22.14 | 159.65 | 125.24 | 190.89 | Town Center |
| 8.05 | 151.12 | 82.86 | 230.01 | Raffle |
| 9.04 | 201.96 | 95.1 | 255.04 | Monument Island |
| 3.02 | 211.46 | 179.5 | 237.86 | Benchmark Finished: Disregard Performance In This Section |
| 81.57 | 151.56 | 21.6 | 305.94 | Overall |




*Cloudgate* - 27,725 - http://www.3dmark.com/3dm/5051789?
*Firestrike* - 9,526 - http://www.3dmark.com/3dm/5051701?
*Sky Diver* - 27,291 - http://www.3dmark.com/3dm/5051763?
*3DMark11 Performance* - P13870 - http://www.3dmark.com/3dm11/9134896
*3DMark11 Performance *Tess OFF** - P15282 - http://www.3dmark.com/3dm11/9134871
*Final Fantasy XIV Benchmark* *(Max Settings - 1920*1080 Windowed)* - 13,090
*Tomb Raider (Ultimate Settings - 5760*1080)* - Min: 34.0 | Max: 48.0 | Avg: 40.1
*Tomb Raider (Ultimate Settings - 1920*1080 Windowed)* - Min: 62.1 | Max: 108.0 | Avg: 80.7
*Heaven (Extreme Presets - 1600*900 Windowed)* - Min: 8.6 | Max: 138.3 | FPS: 65.6 | Score: 1653
*Valley (Extreme HD - 1920*1080 Windowed)* - Min: 16.5 | Max: 107.0 | FPS: 58.2 | Score: 2433



Spoiler: Screenies - Stock Clocks - Catalyst 14.9



*3DMark11 Tessellation OFF*



*3DMark11 Tessellation ON*



*FFXIV Benchmark*



*Unigine Heaven*



*Tomb Raider 1920*1080*



*Tomb Raider 5760*1080*



*Unigine Valley*





*GPU Clocks - 1280/1675 - Driver 14.301.1001-140915a-176154C - Catalyst 14.9*

*Bioshock Infinite Benchmark (Ultra DX11 DDOF - 1920*1080)*




Per Scene Stats:

| Scene Duration (seconds) | Average FPS | Min FPS | Max FPS | Scene Name |
| --- | --- | --- | --- | --- |
| 32.79 | 145.81 | 30.25 | 371.04 | Welcome Center |
| 6.55 | 146.33 | 21.77 | 206.03 | Scene Change: Disregard Performance In This Section |
| 22.12 | 201.57 | 153.63 | 257.75 | Town Center |
| 8.05 | 190.84 | 90.3 | 280.09 | Raffle |
| 9.04 | 251.41 | 134.85 | 379.44 | Monument Island |
| 3.01 | 261.73 | 210.91 | 290.62 | Benchmark Finished: Disregard Performance In This Section |
| 81.56 | 181.47 | 21.77 | 379.44 | Overall |




*Cloudgate* - 29,725 - http://www.3dmark.com/3dm/5051855?
*Firestrike* - 11,778 - http://www.3dmark.com/3dm/5051981?
*Sky Diver* - 31,201 - http://www.3dmark.com/3dm/5051944?
*3DMark11 Performance* - P16609 - http://www.3dmark.com/3dm11/9134823
*3DMark11 Performance *Tess OFF** - P18079 - http://www.3dmark.com/3dm11/9134835 *My best 3DMark11 run EvAr*








*Final Fantasy XIV Benchmark (Max Settings - 1920*1080 Windowed)* - 16,508
*Tomb Raider (Ultimate Settings - 5760*1080)* - Min: 43.4 | Max: 58.4 | Avg: 50.2
*Tomb Raider (Ultimate Settings - 1920*1080 Windowed)* - Min: 78.0 | Max: 126.5 | Avg: 101.8
*Heaven (Extreme Presets - 1600*900 Windowed) -* Min: 33.5 | Max: 176.1 | FPS: 83.9 | Score: 2115
*Valley (Extreme HD - 1920*1080 Windowed) -* Min: 35.1 | Max: 135.3 | FPS: 74.2 | Score: 3,103



Spoiler: The Warm & Fuzzy Screenshot



*P18079 w/ Tessellation OFF - Similar Scores*







Spoiler: Screenies - OverClock - Catalyst 14.9



*3DMark11 Tessellation OFF*



*3DMark11 Tessellation ON*



*FFXIV Benchmark*



*Unigine Heaven*



*Tomb Raider 1920*1080*



*Unigine Valley*





*Overclock Performance Increase* - *Catalyst 14.9*

*The overall average has been calculated. I also took the average w/o Cloudgate/Sky Diver, since they are less common benchmarks and posted the worst gains.*




*14.9 Catalyst*

| Benchmark | Score: Stock | Score: OverClock | Increase | % |
| --- | --- | --- | --- | --- |
| Bioshock (Avg FPS) | 151.56 | 181.47 | 29.91 | 19.73% |
| Cloudgate | 27,275 | 29,725 | 2,450 | 8.98% |
| Firestrike | 9,526 | 11,778 | 2,252 | 23.64% |
| Sky Diver | 27,291 | 31,201 | 3,910 | 14.33% |
| 3DMark11 | 13,870 | 16,609 | 2,739 | 19.75% |
| 3DMark11 Tess OFF | 15,282 | 18,079 | 2,797 | 18.30% |
| FFXIV | 13,090 | 16,508 | 3,418 | 26.11% |
| Tomb Raider 5760 (Avg FPS) | 48.0 | 58.4 | 10.4 | 21.67% |
| Tomb Raider 1080 (Avg FPS) | 80.7 | 101.8 | 21.1 | 26.15% |
| Heaven | 1,653 | 2,115 | 462 | 27.95% |
| Valley | 2,433 | 3,103 | 670 | 27.54% |
| *Overall Average Improvement* | | | | 21.29% |
| *Without Cloudgate & Sky Diver* | | | | 23.43% |
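If you want to double-check the table math, the Increase and % columns and the two averages can be recomputed from the stock/overclock scores above. A quick Python sketch (the variable names are mine, not from the original post):

```python
# Recompute the "Increase" and "%" columns of the Catalyst 14.9 table above,
# plus the overall averages with and without Cloudgate/Sky Diver.
# Each entry is a (stock, overclock) pair taken from the results posted above.
results = {
    "Bioshock (Avg FPS)": (151.56, 181.47),
    "Cloudgate": (27275, 29725),
    "Firestrike": (9526, 11778),
    "Sky Diver": (27291, 31201),
    "3DMark11": (13870, 16609),
    "3DMark11 Tess OFF": (15282, 18079),
    "FFXIV": (13090, 16508),
    "Tomb Raider 5760 (Avg FPS)": (48.0, 58.4),
    "Tomb Raider 1080 (Avg FPS)": (80.7, 101.8),
    "Heaven": (1653, 2115),
    "Valley": (2433, 3103),
}

gains = {}
for name, (stock, oc) in results.items():
    gains[name] = (oc - stock) / stock * 100  # percentage improvement
    print(f"{name}: +{oc - stock:g} ({gains[name]:.2f}%)")

overall = sum(gains.values()) / len(gains)
trimmed = [g for n, g in gains.items() if n not in ("Cloudgate", "Sky Diver")]
print(f"Overall average improvement: {overall:.2f}%")  # ~21.29%
print(f"Without Cloudgate & Sky Diver: {sum(trimmed) / len(trimmed):.2f}%")  # ~23.43%
```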

 

Omega

*Stock Clocks - 980/1250 - Driver 14.501.1003-141120a-177998C - Omega 14.12*

*Bioshock Infinite Benchmark (Ultra DX11 DDOF - 1920*1080)*




*Per Scene Stats:*

| Scene Duration (seconds) | Average FPS | Min FPS | Max FPS | Scene Name |
| --- | --- | --- | --- | --- |
| 32.78 | 140.93 | 29.57 | 310.45 | Welcome Center |
| 6.54 | 147.92 | 20.06 | 195.9 | Scene Change: Disregard Performance In This Section |
| 22.12 | 184.35 | 141.35 | 263.82 | Town Center |
| 8.05 | 173.96 | 62.15 | 240.11 | Raffle |
| 9.03 | 239.74 | 118 | 363 | Monument Island |
| 3.51 | 252.68 | 180.74 | 386.26 | Benchmark Finished: Disregard Performance In This Section |
| 82.04 | 172.16 | 20.06 | 386.26 | Overall |




*Cloudgate* - 27,686 - http://www.3dmark.com/3dm/5067286?
*Firestrike* - 9,529 - http://www.3dmark.com/3dm/5067344?
*Sky Diver* - 27,533 - http://www.3dmark.com/3dm/5067312?
*3DMark11 Performance* - P13,898 - http://www.3dmark.com/3dm11/9135153
*3DMark11 Performance *Tess OFF** - P15,313 - http://www.3dmark.com/3dm11/9135164
*Final Fantasy XIV Benchmark* *(Max Settings - 1920*1080 Windowed)* - 13,215
*Tomb Raider (Ultimate Settings - 5760*1080)* - Min: 33.9 | Max: 46.6 | Avg: 39.6
*Tomb Raider (Ultimate Settings - 1920*1080 Windowed)* - Min: 62.0 | Max: 106.0 | Avg: 80.8
*Heaven (Extreme Presets - 1600*900 Windowed)* - Min: 25.8 | Max: 139.2 | FPS: 66.2 | Score: 1,668
*Valley (Extreme HD - 1920*1080 Windowed)* - Min: 28.6 | Max: 107.8 | FPS: 58.4 | Score: 2,442



Spoiler: Screenies - Stock Clocks - Omega



*Unigine Heaven*



*Tomb Raider 1920*1080*



*Tomb Raider 5760*1080*



*Unigine Valley*





*GPU Clocks - 1280/1675 - Driver 14.501.1003-141120a-177998C - Omega 14.12*

*Bioshock Infinite Benchmark (Ultra DX11 DDOF - 1920*1080)*


 

*Per Scene Stats:*

| Scene Duration (seconds) | Average FPS | Min FPS | Max FPS | Scene Name |
| --- | --- | --- | --- | --- |
| 32.96 | 155.36 | 1.46 | 417.19 | Welcome Center |
| 7.06 | 145.46 | 20.02 | 304.15 | Scene Change: Disregard Performance |
| 22.1 | 226.79 | 139.98 | 318.79 | Town Center |
| 8.03 | 221.12 | 92.08 | 307.18 | Raffle |
| 9.02 | 300.5 | 200.87 | 463.84 | Monument Island |
| 3.01 | 314.86 | 234.79 | 343.14 | Benchmark Finished: Disregard Performance |
| 82.18 | 202.15 | 1.46 | 463.84 | Overall |




*Cloudgate* - 29,828 - http://www.3dmark.com/3dm/5102119?
*Firestrike* - 11,799 - http://www.3dmark.com/3dm/5102205?
*Sky Diver* - 31,479 - http://www.3dmark.com/3dm/5102167?
*3DMark11 Performance* - P16,568 - http://www.3dmark.com/3dm11/9151648
*3DMark11 Performance *Tess OFF** - P18043 - http://www.3dmark.com/3dm11/9151703
*Final Fantasy XIV Benchmark* *(Max Settings - 1920*1080 Windowed)* - 16,502
*Tomb Raider (Ultimate Settings - 5760*1080)* - Min: 44.0 | Max: 60.4 | Avg: 51.9
*Tomb Raider (Ultimate Settings - 1920*1080 Windowed)* - Min: 77.7 | Max: 134.2 | Avg: 103.1
*Heaven (Extreme Presets - 1600*900 Windowed)* - Min: 33.3 | Max: 177.5 | FPS: 83.9 | Score: 2,114
*Valley (Extreme HD - 1920*1080 Windowed)* - Min: 34.6 | Max: 135.3 | FPS: 73.5 | Score: 3,077



Spoiler: Screenies - OverClock - Omega



*3DMark11 Tessellation OFF*



*3DMark11 Tessellation ON*



*FFXIV Benchmark*



*Unigine Heaven*



*Tomb Raider 1920*1080*



*Tomb Raider 5760*1080*



*Unigine Valley*





*Overclock Performance Increase - Omega*

*The overall average has been calculated. I also took the average w/o Cloudgate/Sky Diver, since they are less common benchmarks and posted the worst gains.*




*14.12 Omega*

| Benchmark | Score: Stock | Score: OverClock | Increase | % |
| --- | --- | --- | --- | --- |
| Bioshock (Avg FPS) | 172.16 | 202.15 | 29.99 | 17.42% |
| Cloudgate | 27,686 | 29,828 | 2,142 | 7.74% |
| Firestrike | 9,529 | 11,799 | 2,270 | 23.82% |
| Sky Diver | 27,533 | 31,479 | 3,946 | 14.33% |
| 3DMark11 | 13,898 | 16,568 | 2,670 | 19.21% |
| 3DMark11 Tess OFF | 15,313 | 18,043 | 2,730 | 17.83% |
| FFXIV | 13,215 | 16,502 | 3,287 | 24.87% |
| Tomb Raider 5760 (Avg FPS) | 39.6 | 51.9 | 12.3 | 31.06% |
| Tomb Raider 1080 (Avg FPS) | 80.8 | 103.1 | 22.3 | 27.60% |
| Heaven | 1,668 | 2,114 | 446 | 26.74% |
| Valley | 2,442 | 3,077 | 635 | 26.00% |
| *Overall Average Improvement* | | | | 21.51% |
| *Without Cloudgate & Sky Diver* | | | | 23.84% |

 

*Stock Clock Comparison: Catalyst 14.9 ~VS~ Omega 14.12*

*As can be seen below, there is barely any difference between 14.9 & Omega, with the exception of BioShock, which shows solid gains, and Tomb Raider at 5760*1080, which takes a hard performance hit.*

*Averages were calculated the same as above, but also without Tomb Raider & BioShock since they influence the averages heavily.*




| Benchmark | Stock - 14.9 | Stock - Omega | Increase | % |
| --- | --- | --- | --- | --- |
| Bioshock (Avg FPS) | 151.56 | 172.16 | 20.60 | 13.59% |
| Cloudgate | 27,275 | 27,686 | 411 | 1.51% |
| Firestrike | 9,526 | 9,529 | 3 | 0.03% |
| Sky Diver | 27,291 | 27,533 | 242 | 0.89% |
| 3DMark11 | 13,870 | 13,898 | 28 | 0.20% |
| 3DMark11 Tess OFF | 15,282 | 15,313 | 31 | 0.20% |
| FFXIV | 13,090 | 13,215 | 125 | 0.95% |
| Tomb Raider 5760 (Avg FPS) | 48.0 | 39.6 | -8.4 | -17.50% |
| Tomb Raider 1080 (Avg FPS) | 80.7 | 80.8 | 0.1 | 0.12% |
| Heaven | 1,653 | 1,668 | 15 | 0.91% |
| Valley | 2,433 | 2,442 | 9 | 0.37% |
| *Overall Average Improvement* | | | | 0.12% |
| *Without Cloudgate & Sky Diver* | | | | -0.12% |
| *Without Tomb Raider* | | | | 2.07% |
| *Without Tomb Raider & Bioshock* | | | | 0.63% |
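The exclusion averages above are plain arithmetic on the per-benchmark deltas; here is a small Python sketch (variable and function names are mine) that recomputes them from the stock 14.9/Omega scores:

```python
# Recompute the Catalyst 14.9 vs Omega 14.12 stock-clock deltas, including the
# averages with the heavy outliers (Tomb Raider, BioShock) excluded.
# Each entry is a (Catalyst 14.9, Omega 14.12) pair from the tables above.
stock = {
    "Bioshock": (151.56, 172.16),
    "Cloudgate": (27275, 27686),
    "Firestrike": (9526, 9529),
    "Sky Diver": (27291, 27533),
    "3DMark11": (13870, 13898),
    "3DMark11 Tess OFF": (15282, 15313),
    "FFXIV": (13090, 13215),
    "Tomb Raider 5760": (48.0, 39.6),
    "Tomb Raider 1080": (80.7, 80.8),
    "Heaven": (1653, 1668),
    "Valley": (2433, 2442),
}

# Signed percentage change per benchmark (negative = Omega regressed).
pct = {n: (omega - cat) / cat * 100 for n, (cat, omega) in stock.items()}

def avg_excluding(*skip):
    """Average the percentage changes, dropping any benchmark whose name
    contains one of the given substrings."""
    kept = [p for n, p in pct.items() if not any(s in n for s in skip)]
    return sum(kept) / len(kept)

print(f"Overall: {avg_excluding():.2f}%")                                         # ~0.12%
print(f"No Cloudgate/Sky Diver: {avg_excluding('Cloudgate', 'Sky Diver'):.2f}%")  # ~-0.12%
print(f"No Tomb Raider: {avg_excluding('Tomb Raider'):.2f}%")                     # ~2.07%
print(f"No Tomb Raider & Bioshock: {avg_excluding('Tomb Raider', 'Bioshock'):.2f}%")  # ~0.63%
```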

 

*Overclock Comparison: Catalyst 14.9 ~VS~ Omega 14.12*

*As can be seen below, there is barely any difference between 14.9 & Omega, with the exception of BioShock, which shows solid gains, and Tomb Raider at 5760*1080, which takes a hard performance hit.*

*Averages were calculated the same as above, but also without Tomb Raider & BioShock since they influence the averages heavily.*




| Benchmark | OC - 14.9 | OC - Omega | Increase | % |
| --- | --- | --- | --- | --- |
| Bioshock (Avg FPS) | 181.47 | 202.15 | 20.68 | 11.40% |
| Cloudgate | 29,725 | 29,828 | 103 | 0.35% |
| Firestrike | 11,778 | 11,799 | 21 | 0.18% |
| Sky Diver | 31,201 | 31,479 | 278 | 0.89% |
| 3DMark11 | 16,609 | 16,568 | -41 | -0.25% |
| 3DMark11 Tess OFF | 18,079 | 18,043 | -36 | -0.20% |
| FFXIV | 16,508 | 16,502 | -6 | -0.04% |
| Tomb Raider 5760 (Avg FPS) | 58.4 | 51.9 | -6.5 | -11.13% |
| Tomb Raider 1080 (Avg FPS) | 101.8 | 103.1 | 1.3 | 1.28% |
| Heaven | 2,115 | 2,114 | -1 | -0.05% |
| Valley | 3,103 | 3,077 | -26 | -0.84% |
| *Overall Average Improvement* | | | | 0.14% |
| *Without Cloudgate & Sky Diver* | | | | 0.04% |
| *Without Tomb Raider* | | | | 1.25% |
| *Without Tomb Raider & Bioshock* | | | | -0.02% |

 

This obviously isn't a comprehensive list, but it tells a familiar story for us enthusiast-class GPU owners. AMD's primary focus with Omega, and previously Mantle, is lower-tier hardware. Hopefully the drivers address pertinent issues in games, where a smooth gaming experience can be more important than benchmarks.

-El Fin









P.S.

The experience while benching with Omega was smooth and enjoyable; this is a definite plus compared to some of the other drivers released in the year since the 290(X) debuted.

Considering the performance results for BioShock and Tomb Raider, which aren't synthetic benchmarks, it is certainly plausible that there is solid performance to be gained, or lost, in other titles. I just don't have the game library, or the time, to test other titles that don't have convenient benchmarks built in.

If you have detailed before and after results from a game, then let me know and I will add the information to the post.


----------



## mAs81

Nice comparison , good job


----------



## Gobigorgohome

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Lightning is 2.5 slot spacing. Some boards as mentioned have a "2B" slot that will allow thicker than 2 slot cards to run triple. The RIVE Black is not one of them. The regular RIVE does however.
> 
> Note the black slot between the two middle red slots. That is the 2B slot.


Yes, did you read the post that bond32 quoted? I wrote that the middle card (the card that will be in the third PCIe x16 slot) will have to be water cooled; then trifire is possible. I am well aware of the layout of my motherboard(s).

Three (or 2.5) slots means slots 1 and 2 are "occupied" by the first card, slot 3 is where I place the 2-slot card (a water cooled MSI Lightning/reference is ~2 slots), and the last Lightning goes in slot 4. Then I will be able to do tri-fire with both the RIVBE and the GA-X79-UD3, if my math is not waaaaaay north. The top card will have the space of two 2-slot cards, so there will be 1.5 slots of "space" between the first Lightning and the water cooled card. Three Lightnings on air is a no go, which I have not claimed either.









Edit: The bottom card will be "lower" in the case than the motherboard, but that does not matter; it will actually fit inside my Air 540 too.


----------



## tsm106

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Lightning is 2.5 slot spacing. Some boards as mentioned have a "2B" slot that will allow thicker than 2 slot cards to run triple. The RIVE Black is not one of them. The regular RIVE does however.
> 
> Note the black slot between the two middle red slots. That is the 2B slot.
> 
> 
> 
> 
> 
> 
> 
> 
> Yes, did you read the post that bond32 quoted? I wrote that I need to have the middle card (the card that will be in the third PCI-E x16 port) will have to be water cooled, then trifire is possible, I am well aware of the layout of my motherboard(s).
> 
> Three (or 2,5) ports equal slot 1 and slot 2 is "occupied", slot 3 is where I place the 2 slot card (water cooled MSI Lightning/reference is ~2 slot) and the last Lightning I put in slot 4. Then I will be able to do tri-fire with both the RIVBE and the GA-X79-UD3 if my math is not waaaaaay north. The top card will have the space for two 2 slot cards ... so there will be 1,5 slot "space" between the first lightning and the water cooled card. Three Lightnings on air is a no go, which I have not claimed either.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: The bottom card will be "lower" in the case than the motherboard, but that does not mather, it will actually fit inside my Air 540 too.
Click to expand...

That's a strange extreme of compromises. I would go all air or not at all, because it seems extremely strange to go water on only one card, and not on the main card, so you would not get to capitalize on the gains of water. However, if I was going to go water just to fit that setup, I would go water on the first slot and air on the second and fourth; that way you get the main card cooled well. Though personally I might entertain the idea of selling a board and swapping to a RIVE for the 2B slot. The BE board still has a lot of street value for some strange reason and the regular RIVE doesn't. Imo, the BE version... is really inflexible.


----------



## Gobigorgohome

Quote:


> Originally Posted by *tsm106*
> 
> That's a strange extreme of compromises. I would go all air or not at all because it seems extremely strange to go water only on one card. And not on the main card so you would not get to capitalize on the gains of water. However if I was going to go water just to fit that setup, I would go water on the first slot, air on the second and fourth. That way you get the main card cooled well. Though personally I might entertain the idea of selling a board and swapping to a RIVE for the 2B slot.The BE board still has a lot of street value for some strange reason and the regular RIVE doesn't. Imo, the BE version... is really inflexible.


The RIVE is hard to come by in Norway these days; it's mostly sold out in every store, so the price is 550 USD+, and used ones are also hard to come by. The idea is to sell off one of the Elpida cards and buy blocks for the Lightnings.







The water cooled card among the Lightnings was just a temporary solution until I could get some better setup up and running.

Or I could just order one more card and use them on air (which was the plan) and skip tri-fire.


----------



## tsm106

Oh, it's a temporary thing. In that case, do what you gotta do.


----------



## pshootr

Did some overclocking with the Sapphire R9 290 Tri-X OC and used BF4 as a testing platform. So far I can only go to 1160 core / 1400 mem with +63mV, or I get white flickering artifacts.

Max temp: 70C
Max fan speed: 75%
Max VDDC: 1.305v

Was hoping for better. Maybe with some more experience with tweaking I can get a little more.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Yes, did you read the post that bond32 quoted? I wrote that I need to have the middle card (the card that will be in the third PCI-E x16 port) will have to be water cooled, then trifire is possible, I am well aware of the layout of my motherboard(s).
> 
> Three (or 2,5) ports equal slot 1 and slot 2 is "occupied", slot 3 is where I place the 2 slot card (water cooled MSI Lightning/reference is ~2 slot) and the last Lightning I put in slot 4. Then I will be able to do tri-fire with both the RIVBE and the GA-X79-UD3 if my math is not waaaaaay north. The top card will have the space for two 2 slot cards ... so there will be 1,5 slot "space" between the first lightning and the water cooled card. Three Lightnings on air is a no go, which I have not claimed either.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: The bottom card will be "lower" in the case than the motherboard, but that does not mather, it will actually fit inside my Air 540 too.


If you're going to all that effort to cool one card, why not all 3? It's the only way to fly with these beasts.
Temp or not, waterblock your cards; don't stuff around.


----------



## Buehlar

Quote:


> Originally Posted by *Roboyto*
> 
> *Finally got around to making an Omega Drivers Comparison....better late than never?*
> 
> 
> 
> [Roboyto's full Omega drivers comparison snipped; see the original post above.]


Thanks... +rep


----------



## Roboyto

Quote:


> Originally Posted by *pshootr*
> 
> Did some overclocking with my Sapphire R9 290 Tri-X OC, using BF4 as a testing platform. So far I can only go to 1160 core / 1400 mem with +63mV before I get white flickering artifacts.
>
> Max temp: 70C
> Max fan speed: 75%
> Max VDDC: 1.305V
>
> Was hoping for better. Maybe with some more tweaking experience I can get a little more.


If you're artifacting then you need more voltage. 1160 core is pretty solid for only +63mV; bump the voltage up marginally until the artifacts disappear.

Also, it's wise to keep an eye on the VRM1 temperature along with the core, especially with an air-cooled card. You want to keep VRM1 in the 80C range, and no more than 90C, for longevity's sake.
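For anyone new to this, the "bump marginally" procedure above can be sketched as a little Python loop. This is purely illustrative: `is_stable` is a hypothetical stand-in for actually running a benchmark and watching for artifacts, and the 6.25mV step matches the offset granularity the usual voltage tools work in.

```python
# Sketch of the incremental voltage-tuning procedure described above.
# `is_stable` is a hypothetical stand-in for a manual benchmark/artifact
# check; 6.25 mV is the offset granularity the voltage tools expose.
STEP_MV = 6.25
CAP_MV = 200.0  # a conservative ceiling for an air-cooled reference card

def find_stable_offset(start_mv, is_stable, step=STEP_MV, cap=CAP_MV):
    """Raise the offset one step at a time until artifacts stop, or give up."""
    offset = start_mv
    while offset <= cap:
        if is_stable(offset):
            return offset
        offset += step
    return None  # never stabilized below the cap: lower the core clock instead

# Example: suppose this core clock needs at least +81.25 mV to run clean
print(find_stable_offset(62.5, lambda mv: mv >= 81.25))  # 81.25
```

If the loop hits the cap (or VRM1 temps climb past 90C along the way), the right move is backing off the core clock rather than adding more volts.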


----------



## rdr09

Quote:


> Originally Posted by *Roboyto*
> 
> *Finally got around to making an Omega Drivers Comparison....better late than never?*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> All runs were made with identical CPU/RAM settings.
> This is a fairly fresh Win7 Ultimate install (November 30th).
> DDU was run when upgrading drivers; CCleaner was also used to make sure any leftover traces were gone.
> Tessellation was always on unless noted.
> All runs were made at least twice and the best score was recorded.
> If there was a large delta, I would reboot and run another 1-2 times to make sure the results were valid.
>
> Stock clocks of 980/1250 were used for the first set.
> They were then compared to a fairly high OC of 1280/1675, +200mV, +50% power.
> This is on the stock XFX BIOS.
> Trixx was used for the OC.
>
> Some screenshots are missing, my apologies...
> _My system..._
> 
> *Win7 Ultimate x64*
> *ASUS Z87 Gryphon w/ Thermal Armor*
> *i7 4770k @ 4.5GHz - XSPC Raystorm - Delidded/Naked*
> *2x8 GB Dominator Platinum 2400MHz*
> *XFX BE R9 290 Reference - XSPC Razor w/ Backplate*
> *Rosewill Capstone 650W Modular*
> 
> *Stock Clocks - 980/1250 -Driver 14.301.1001-140915a-176154C - Catalyst 14.9*
> 
> *Bioshock Infinite Benchmark (Ultra DX11 DDOF - 1920*1080)*
> 
> 
> 
> Per Scene Stats:
>
> | Scene Duration (seconds) | Average FPS | Min FPS | Max FPS | Scene Name |
> |---|---|---|---|---|
> | 32.79 | 128.77 | 29.39 | 305.94 | Welcome Center |
> | 6.54 | 141.22 | 21.6 | 168.62 | Scene Change: Disregard Performance In This Section |
> | 22.14 | 159.65 | 125.24 | 190.89 | Town Center |
> | 8.05 | 151.12 | 82.86 | 230.01 | Raffle |
> | 9.04 | 201.96 | 95.1 | 255.04 | Monument Island |
> | 3.02 | 211.46 | 179.5 | 237.86 | Benchmark Finished: Disregard Performance In This Section |
> | 81.57 | 151.56 | 21.6 | 305.94 | Overall |
> 
> 
> *Cloudgate* - 27,725 - http://www.3dmark.com/3dm/5051789?
> *Firestrike* - 9,526 - http://www.3dmark.com/3dm/5051701?
> *Sky Diver* - 27,291 - http://www.3dmark.com/3dm/5051763?
> *3DMark11 Performance* - P13870 - http://www.3dmark.com/3dm11/9134896
> *3DMark11 Performance *Tess OFF** - P15282 - http://www.3dmark.com/3dm11/9134871
> *Final Fantasy XIV Benchmark* *(Max Settings - 1920*1080 Windowed)* - 13,090
> *Tomb Raider (Ultimate Settings - 5760*1080)* - Min: 34.0 | Max: 48.0 | Avg: 40.1
> *Tomb Raider (Ultimate Settings - 1920*1080 Windowed)* - Min: 62.1 | Max: 108.0 | Avg: 80.7
> *Heaven (Extreme Presets - 1600*900 Windowed)* - Min: 8.6 | Max: 138.3 | FPS: 65.6 | Score: 1653
> *Valley (Extreme HD - 1920*1080 Windowed)* - Min: 16.5 | Max: 107.0 | FPS: 58.2 | Score: 2433
> 
> 
> 
> Spoiler: Screenies - Stock Clocks - Catalyst 14.9
> 
> 
> 
> _*3DMark11 Tesselation OFF*_
> 
> 
> 
> _*3DMark11 Tesselation ON*_
> 
> 
> 
> _*FFXIV Benchmark*_
> 
> 
> 
> _*Unigine Heaven*_
> 
> 
> 
> _*Tomb Raider 1920*1080*_
> 
> 
> 
> _*Tomb Raider 5760*1080*_
> 
> 
> 
> _*Unigine Valley*_
> 
> 
> 
> 
> 
> _*GPU Clocks - 1280/1675 - Driver 14.301.1001-140915a-176154C - Catalyst 14.9*_
> 
> *Bioshock Infinite Benchmark (Ultra DX11 DDOF - 1920*1080)*
> 
> 
> 
> Per Scene Stats:
>
> | Scene Duration (seconds) | Average FPS | Min FPS | Max FPS | Scene Name |
> |---|---|---|---|---|
> | 32.79 | 145.81 | 30.25 | 371.04 | Welcome Center |
> | 6.55 | 146.33 | 21.77 | 206.03 | Scene Change: Disregard Performance In This Section |
> | 22.12 | 201.57 | 153.63 | 257.75 | Town Center |
> | 8.05 | 190.84 | 90.3 | 280.09 | Raffle |
> | 9.04 | 251.41 | 134.85 | 379.44 | Monument Island |
> | 3.01 | 261.73 | 210.91 | 290.62 | Benchmark Finished: Disregard Performance In This Section |
> | 81.56 | 181.47 | 21.77 | 379.44 | Overall |
> 
> 
> *Cloudgate* - 29,725 - http://www.3dmark.com/3dm/5051855?
> *Firestrike* - 11,778 - http://www.3dmark.com/3dm/5051981?
> *Sky Diver* - 31,201 - http://www.3dmark.com/3dm/5051944?
> *3DMark11 Performance* - P16609 - http://www.3dmark.com/3dm11/9134823
> *3DMark11 Performance *Tess OFF** - P18079 - http://www.3dmark.com/3dm11/9134835 *My best 3DMark11 run EvAr*
> 
> 
> 
> 
> 
> 
> 
> 
> *Final Fantasy XIV Benchmark (Max Settings - 1920*1080 Windowed)* - 16,508
> *Tomb Raider (Ultimate Settings - 5760*1080)* - Min: 43.4 | Max: 58.4 | Avg: 50.2
> *Tomb Raider (Ultimate Settings - 1920*1080 Windowed)* - Min: 78.0 | Max: 126.5 | Avg: 101.8
> *Heaven (Extreme Presets - 1600*900 Windowed) -* Min: 33.5 | Max: 176.1 | FPS: 83.9 | Score: 2115
> *Valley (Extreme HD - 1920*1080 Windowed) -* Min: 35.1 | Max: 135.3 | FPS: 74.2 | Score: 3,103
> 
> 
> 
> Spoiler: The Warm & Fuzzy Screenshot
> 
> 
> 
> *P18079 w/ Tesselation OFF* *- Similar Scores*
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Screenies - OverClock - Catalyst 14.9
> 
> 
> 
> *3DMark11 Tesselation OFF*
> 
> 
> 
> *3DMark11 Tesselation ON*
> 
> 
> 
> *FFXIV Benchmark*
> 
> 
> 
> *Unigine Heaven*
> 
> 
> 
> *Tomb Raider 1920*1080*
> 
> 
> 
> *Unigine Valley*
> 
> 
> 
> 
> 
> *Overclock Performance Increase* - *Catalyst 14.9*
> 
> *Overall average has been calculated. I also took the average w/o Cloudgate/Sky Diver since they are less common and posted the worst gains*
> 
> 
> 
> *14.9 Catalyst*
>
> | Benchmark | Score: Stock | Score: OverClock | Increase | % |
> |---|---|---|---|---|
> | *Bioshock (Avg FPS)* | 151.56 | 181.47 | 29.91 | 19.73% |
> | *Cloudgate* | 27,275 | 29,725 | 2,450 | 8.98% |
> | *Firestrike* | 9,526 | 11,778 | 2,252 | 23.64% |
> | *Sky Diver* | 27,291 | 31,201 | 3,910 | 14.33% |
> | *3DMark11* | 13,870 | 16,609 | 2,739 | 19.75% |
> | *3DMark11 Tess OFF* | 15,282 | 18,079 | 2,797 | 18.30% |
> | *FFXIV* | 13,090 | 16,508 | 3,418 | 26.11% |
> | *Tomb Raider 5760 (Avg FPS)* | 48.0 | 58.4 | 10.4 | 21.67% |
> | *Tomb Raider 1080 (Avg FPS)* | 80.7 | 101.8 | 21.1 | 26.15% |
> | *Heaven* | 1,653 | 2,115 | 462 | 27.95% |
> | *Valley* | 2,433 | 3,103 | 670 | 27.54% |
>
> *Overall Average Improvement:* 21.29%
> *Without Cloudgate & Sky Diver:* 23.43%
>  
> 
> Omega
> 
> _*Stock Clocks - 980/1250 - Driver 14.501.1003-141120a-177998C - Omega 14.12*_
> 
> *Bioshock Infinite Benchmark (Ultra DX11 DDOF - 1920*1080)*
> 
> 
> 
> *Per Scene Stats:*
>
> | Scene Duration (seconds) | Average FPS | Min FPS | Max FPS | Scene Name |
> |---|---|---|---|---|
> | 32.78 | 140.93 | 29.57 | 310.45 | Welcome Center |
> | 6.54 | 147.92 | 20.06 | 195.9 | Scene Change: Disregard Performance In This Section |
> | 22.12 | 184.35 | 141.35 | 263.82 | Town Center |
> | 8.05 | 173.96 | 62.15 | 240.11 | Raffle |
> | 9.03 | 239.74 | 118 | 363 | Monument Island |
> | 3.51 | 252.68 | 180.74 | 386.26 | Benchmark Finished: Disregard Performance In This Section |
> | 82.04 | 172.16 | 20.06 | 386.26 | Overall |
> 
> 
> *Cloudgate* - 27,686 - http://www.3dmark.com/3dm/5067286?
> *Firestrike* - 9,529 - http://www.3dmark.com/3dm/5067344?
> *Sky Diver* - 27,533 - http://www.3dmark.com/3dm/5067312?
> *3DMark11 Performance* - P13,898 - http://www.3dmark.com/3dm11/9135153
> *3DMark11 Performance *Tess OFF** - P15,313 - http://www.3dmark.com/3dm11/9135164
> *Final Fantasy XIV Benchmark* *(Max Settings - 1920*1080 Windowed)* - 13,215
> *Tomb Raider (Ultimate Settings - 5760*1080)* - Min: 33.9 | Max: 46.6 | Avg: 39.6
> *Tomb Raider (Ultimate Settings - 1920*1080 Windowed)* - Min: 62.0 | Max: 106.0 | Avg: 80.8
> *Heaven (Extreme Presets - 1600*900 Windowed)* - Min: 25.8 | Max: 139.2 | FPS: 66.2 | Score: 1,668
> *Valley (Extreme HD - 1920*1080 Windowed)* - Min: 28.6 | Max: 107.8 | FPS: 58.4 | Score: 2,442
> 
> 
> 
> Spoiler: Screenies - Stock Clocks - Omega
> 
> 
> 
> *Unigine Heaven*
> 
> 
> 
> *Tomb Raider 1920*1080*
> 
> 
> 
> *Tomb Raider 5760*1080*
> 
> 
> 
> *Unigine Valley*
> 
> 
> 
> 
> 
> _*GPU Clocks - 1280/1675 - Driver 14.501.1003-141120a-177998C - Omega 14.12*_
> 
> *Bioshock Infinite Benchmark (Ultra DX11 DDOF - 1920*1080)*
> 
> 
> 
> *Per Scene Stats:*
>
> | Scene Duration (seconds) | Average FPS | Min FPS | Max FPS | Scene Name |
> |---|---|---|---|---|
> | 32.96 | 155.36 | 1.46 | 417.19 | Welcome Center |
> | 7.06 | 145.46 | 20.02 | 304.15 | Scene Change: Disregard Performance |
> | 22.1 | 226.79 | 139.98 | 318.79 | Town Center |
> | 8.03 | 221.12 | 92.08 | 307.18 | Raffle |
> | 9.02 | 300.5 | 200.87 | 463.84 | Monument Island |
> | 3.01 | 314.86 | 234.79 | 343.14 | Benchmark Finished: Disregard Performance |
> | 82.18 | 202.15 | 1.46 | 463.84 | Overall |
> 
> 
> *Cloudgate* - 29,828 - http://www.3dmark.com/3dm/5102119?
> *Firestrike* - 11,799 - http://www.3dmark.com/3dm/5102205?
> *Sky Diver* - 31,479 - http://www.3dmark.com/3dm/5102167?
> *3DMark11 Performance* - P16,568 - http://www.3dmark.com/3dm11/9151648
> *3DMark11 Performance *Tess OFF** - P18043 - http://www.3dmark.com/3dm11/9151703
> *Final Fantasy XIV Benchmark* *(Max Settings - 1920*1080 Windowed)* - 16,502
> *Tomb Raider (Ultimate Settings - 5760*1080)* - Min: 44.0 | Max: 60.4 | Avg: 51.9
> *Tomb Raider (Ultimate Settings - 1920*1080 Windowed)* - Min: 77.7 | Max: 134.2 | Avg: 103.1
> *Heaven (Extreme Presets - 1600*900 Windowed)* - Min: 33.3 | Max: 177.5 | FPS: 83.9 | Score: 2,114
> *Valley (Extreme HD - 1920*1080 Windowed)* - Min: 34.6 | Max: 135.3 | FPS: 73.5 | Score: 3,077
> 
> 
> 
> Spoiler: Screenies - OverClock - Omega
> 
> 
> 
> *3DMark11 Tesselation OFF*
> 
> 
> 
> *3DMark11 Tesselation ON*
> 
> 
> 
> *FFXIV Benchmark*
> 
> 
> 
> *Unigine Heaven*
> 
> 
> 
> *Tombraider 1920*1080*
> 
> 
> 
> *Tomb Raider 5760*1080*
> 
> 
> 
> *Unigine Valley*
> 
> 
> 
> 
> 
> _*Overclock Performance Increase - Omega*_
> 
> *Overall average has been calculated. I also took the average w/o Cloudgate/Sky Diver since they are less common and posted the worst gains*
> 
> 
> 
> *14.12 Omega*
>
> | Benchmark | Score: Stock | Score: OverClock | Increase | % |
> |---|---|---|---|---|
> | *Bioshock (Avg FPS)* | 172.16 | 202.15 | 29.99 | 17.42% |
> | *Cloudgate* | 27,686 | 29,828 | 2,142 | 7.74% |
> | *Firestrike* | 9,529 | 11,799 | 2,270 | 23.82% |
> | *Sky Diver* | 27,533 | 31,479 | 3,946 | 14.33% |
> | *3DMark11* | 13,898 | 16,568 | 2,670 | 19.21% |
> | *3DMark11 Tess OFF* | 15,313 | 18,043 | 2,730 | 17.83% |
> | *FFXIV* | 13,215 | 16,502 | 3,287 | 24.87% |
> | *Tomb Raider 5760 (Avg FPS)* | 39.6 | 51.9 | 12.3 | 31.06% |
> | *Tomb Raider 1080 (Avg FPS)* | 80.8 | 103.1 | 22.3 | 27.60% |
> | *Heaven* | 1,668 | 2,114 | 446 | 26.74% |
> | *Valley* | 2,442 | 3,077 | 635 | 26.00% |
>
> *Overall Average Improvement:* 21.51%
> *Without Cloudgate & Sky Diver:* 23.84%
>  
> 
> _*Stock Clock Comparison: Catalyst 14.9 ~VS~ Omega 14.12*_
> 
> *As can be seen below, there is barely any difference between 14.9 & Omega...with the exception of BioShock with solid gains, and Tomb Raider with a hard performance hit.*
> 
> *Averages were calculated the same as above, but also without Tomb Raider & BioShock since they influence the averages heavily.*
> 
> 
> 
> | Benchmark | Stock - 14.9 | Stock - Omega | Increase | % |
> |---|---|---|---|---|
> | *Bioshock (Avg FPS)* | 151.56 | 172.16 | 20.60 | 13.59% |
> | *Cloudgate* | 27,275 | 27,686 | 411 | 1.51% |
> | *Firestrike* | 9,526 | 9,529 | 3 | 0.03% |
> | *Sky Diver* | 27,291 | 27,533 | 242 | 0.89% |
> | *3DMark11* | 13,870 | 13,898 | 28 | 0.20% |
> | *3DMark11 Tess OFF* | 15,282 | 15,313 | 31 | 0.20% |
> | *FFXIV* | 13,090 | 13,215 | 125 | 0.95% |
> | *Tomb Raider 5760 (Avg FPS)* | 48.0 | 39.6 | -8.4 | -17.50% |
> | *Tomb Raider 1080 (Avg FPS)* | 80.7 | 80.8 | 0.1 | 0.12% |
> | *Heaven* | 1,653 | 1,668 | 15 | 0.91% |
> | *Valley* | 2,433 | 2,442 | 9 | 0.37% |
>
> *Overall Average Improvement:* 0.12%
> *Without Cloudgate & Sky Diver:* -0.12%
> *Without Tomb Raider:* 2.07%
> *Without Tomb Raider & BioShock:* 0.63%
>  
> 
> _*Overclock Comparison: Catalyst 14.9 ~VS~ Omega 14.12*_
> 
> *As can be seen below, there is barely any difference between 14.9 & Omega...with the exception of BioShock with solid gains, and Tomb Raider with a hard performance hit.*
> 
> *Averages were calculated the same as above, but also without Tomb Raider & BioShock since they influence the averages heavily.*
> 
> 
> 
> | Benchmark | OC - 14.9 | OC - Omega | Increase | % |
> |---|---|---|---|---|
> | *Bioshock (Avg FPS)* | 181.47 | 202.15 | 20.68 | 11.40% |
> | *Cloudgate* | 29,725 | 29,828 | 103 | 0.35% |
> | *Firestrike* | 11,778 | 11,799 | 21 | 0.18% |
> | *Sky Diver* | 31,201 | 31,479 | 278 | 0.89% |
> | *3DMark11* | 16,609 | 16,568 | -41 | -0.25% |
> | *3DMark11 Tess OFF* | 18,079 | 18,043 | -36 | -0.20% |
> | *FFXIV* | 16,508 | 16,502 | -6 | -0.04% |
> | *Tomb Raider 5760 (Avg FPS)* | 58.4 | 51.9 | -6.5 | -11.13% |
> | *Tomb Raider 1080 (Avg FPS)* | 101.8 | 103.1 | 1.3 | 1.28% |
> | *Heaven* | 2,115 | 2,114 | -1 | -0.05% |
> | *Valley* | 3,103 | 3,077 | -26 | -0.84% |
>
> *Overall Average Improvement:* 0.14%
> *Without Cloudgate & Sky Diver:* 0.04%
> *Without Tomb Raider:* 1.25%
> *Without Tomb Raider & BioShock:* -0.02%
>  
> 
> _This obviously isn't a comprehensive list, but it tells a familiar story for us enthusiast class GPU owners. AMD's primary focus with Omega, and previously Mantle, is lower tier hardware. Hopefully the drivers address pertinent issues in games where a smooth gaming experience can be more important than benchmarks._
> 
> _-El Fin_


Wow. Lot of work. +rep. thanks.
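As a side note for anyone wanting to verify Roboyto's arithmetic: the "Overall Average Improvement" rows are plain arithmetic means of the per-benchmark percentage changes, and the exclusions just drop rows before averaging. A quick sketch using the stock 14.9-vs-Omega scores from the quoted tables:

```python
# Recomputing the stock 14.9-vs-Omega averages from the quoted table.
rows = {  # benchmark: (stock 14.9, stock Omega)
    "Bioshock": (151.56, 172.16), "Cloudgate": (27275, 27686),
    "Firestrike": (9526, 9529), "Sky Diver": (27291, 27533),
    "3DMark11": (13870, 13898), "3DMark11 Tess OFF": (15282, 15313),
    "FFXIV": (13090, 13215), "Tomb Raider 5760": (48.0, 39.6),
    "Tomb Raider 1080": (80.7, 80.8), "Heaven": (1653, 1668),
    "Valley": (2433, 2442),
}

def avg_gain(data, exclude=()):
    """Mean of per-benchmark % changes, optionally dropping some rows."""
    pct = [(new - old) / old * 100 for name, (old, new) in data.items()
           if name not in exclude]
    return round(sum(pct) / len(pct), 2)

print(avg_gain(rows))                              # 0.12
print(avg_gain(rows, ("Cloudgate", "Sky Diver")))  # -0.12
```

Both results match the posted summary rows, and it makes clear why excluding Tomb Raider swings the average so much: its -17.5% outlier dominates an otherwise flat set.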


----------



## Gobigorgohome

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> If your going to all that effort to coolo one card , why not all 3 ?? Its the only way to fly with these beasts .
> Temp or not wblock your cards , don't stuff around .


I understood about half of what you just wrote ...









I think I will stick to two cards on air and keep the quadfire as it is. The Lightnings will then be air-cooled (pretty much just for LAN parties anyway) and double as a heater in my freakishly cold room. Yes, I am serious about the heater.


----------



## HOMECINEMA-PC

Sorry mate, it's my slight typo, plus my Oztralian and yourself being from Norway.

If you're going to cool one card, why not all three?? It's the only way to go with these beasts.
Temporary or not, waterblock your cards, don't mess around.

Is what I stated .


----------



## pshootr

Quote:


> Originally Posted by *Roboyto*
> 
> If you're artifacting then you need more voltage. 1160 core is pretty solid for only +63mV; bump the voltage up marginally until the artifacts disappear.
>
> Also, it's wise to keep an eye on the VRM1 temperature along with the core, especially with an air-cooled card. You want to keep VRM1 in the 80C range, and no more than 90C, for longevity's sake.


Thanks, I will try what you suggested. I haven't figured out how to check the VRM temps yet; after I get dinner started I will do some Google searches. Is +100mV perfectly safe as long as temps are in check?

Edit: I found the VRM temps in GPU-Z


----------



## X-Alt

My XFX DD 290 is on its way to replace my tired old 7970, and I'm looking forward to it quite a bit. Is overclocking more or less the same with Hawaii as it was with Tahiti, or is it more sensitive to voltage? A couple of friends say these cards are quite finicky with volts compared to their 7950s; is that correct, or is the silicon lottery just giving them bad luck? Also, from user experience in this thread, what seems to be the average OC range for a reference board? I'm not that worried about temps since I don't mind a 70%+ fan profile, and my Tahiti could handle a good 1.325V without any issue as long as it was cool.

Regards,
Alt


----------



## ebduncan

Quote:


> Originally Posted by *pshootr*
> 
> Thanks, I will try what you suggested. I have not figured out yet how to check the VRM temps though, after I get dinner started I will do some Google searches. Is +100mv perfectly safe as long as temps are in check?
> 
> Edit: I found VRM temp on GPU-Z


+100mV isn't really much...

I use Trixx and go up to +200mV; Afterburner = crap. Good luck keeping your card cool on air, though.

My card is waterblocked.


----------



## pshootr

Quote:


> Originally Posted by *ebduncan*
> 
> +100mV isn't really much...
>
> I use Trixx and go up to +200mV; Afterburner = crap. Good luck keeping your card cool on air, though.
>
> My card is waterblocked.


Thank you. I use AB for the profiles; otherwise shifting power states causes my system to freeze when Flash is active. Disabling hardware acceleration in Waterfox does not help.


----------



## Archea47

Quote:


> Originally Posted by *tsm106*
> 
> People, PSU PSA. I find that most ppl don't actually overclock as much as they say. That doesn't mean their low numbers are the rule of thumb. You can run these cards in a manner that does not use up a lot of power, if you wanted to. On the flip side you can draw a helluvlot of power as well. Below is my dual psu setup. On the first psu, a 1300 G2 which for this test runs a 3930K at 5ghz and two 290x at 1100/1400 stock voltage, and 1250/1600 +300 (more than needed but extra just in case). There is NOTHING ELSE CONNECTED! No drives, fans, nothing. Second psu runs the other two 290x and all the accessories, pumps, 30 or so odd fans, etc etc.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> Now the wattage at the wall. It's not so bad for a 5GHz CPU etc. 923*0.87 = 803W converted.
> 
> 
> 
> Now let's see really overclocked, though it's actually not that overclocked compared to max OC, which is 1300/1700 for two.
> 
> 
> 
> Look at the temps after 5mins lol. Now for the martini shot.
> 
> 
> 
> Lmao. This surprised me a bit because I hadn't actually run this specific test till now. 1537W at the wall lol. 1537*0.87 = 1337W. I guess I am overrunning my PSU.
> 
> Long story short, you COULD run these cards and rigs in similar low-draw setups. And yet on the flip side you can also run them balls to the wall.
>
> Thus how do you make a recommendation? For the low end only? That would mean you won't ever get to OC balls to the wall. Are you guys OK with that? Personally, I wouldn't be, because I don't want to be limited in what I can do.


Thanks to your info I'm now the proud owner of Bond's EVGA G2 1300. I will be replacing my G2 1000 with the new unit

Really appreciate the reality check ... and really hope it will be enough







I sometimes run 1.67 Vcore which comes out to around 300W on the AMD 8350
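For reference, the wall-to-DC conversion tsm106 does in the quote is just wall draw multiplied by PSU efficiency. A one-line sketch (the 0.87 figure is his assumed efficiency; in reality efficiency varies with load and unit):

```python
# Estimate the DC load on the PSU from a wall-meter reading.
# 0.87 is the efficiency assumed in the quoted post (varies with load).
def dc_load_watts(wall_watts, efficiency=0.87):
    """Watts the components actually draw, given watts measured at the wall."""
    return round(wall_watts * efficiency)

print(dc_load_watts(923))   # 803
print(dc_load_watts(1537))  # 1337
```

Both values reproduce the figures in the quote, which is why a 1537W wall reading means roughly 1337W of real load on the 1300W unit.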


----------



## pshootr

1200 core at +100mV artifacted on rare occasions. Can someone explain some guidelines for using the auxiliary voltage in AB, please?


----------



## Regnitto

Quote:


> Originally Posted by *pshootr*
> 
> 1200 core at +100mV artifacted on rare occasions. Can someone explain some guidelines for using the auxiliary voltage in AB, please?


I've been wondering about the same thing and was going to ask, but you beat me to it.


----------



## ebduncan

Quote:


> Originally Posted by *pshootr*
> 
> 1200 core at +100mV artifacted on rare occasions. Can someone explain some guidelines for using the auxiliary voltage in AB, please?


Quote:


> Originally Posted by *Regnitto*
> 
> I've been wondering about the same thing and was going to ask, but you beat me to it.


It's the phase-locked loop (PLL) voltage.

It can help with stability, but it's not going to drastically increase your overclock.


----------



## Regnitto

Quote:


> Originally Posted by *ebduncan*
> 
> It's the phase-locked loop (PLL) voltage.
>
> It can help with stability, but it's not going to drastically increase your overclock.


I start getting artifacts over 1215/1500 on my 290. Think it might help me get to 1220 or 1225?


----------



## Mega Man

Quote:


> Originally Posted by *tarui04*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Chopper1591*
> 
> Then follow my reply.
> 
> 
> 
> i know...i am trying to find a cnc services workshop that is willing to handle this kind of one-off low volume jobs.
> i have a dremel but the work stand isn't that precise. it tends to wobble a little
Click to expand...

There is next to zero chance of damaging the PCB with the side cutters, and I do mean zero.

Besides, with the cost of a CNC job, wouldn't it be worth spending 10 and trying it to see if you feel comfortable?
Quote:


> Originally Posted by *Chopper1591*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tarui04*
> 
> i know...i am trying to find a cnc services workshop that is willing to handle this kind of one-off low volume jobs.
> i have a dremel but the work stand isn't that precise. it tends to wobble a little
> 
> 
> 
> Good luck.
> Post back.
> 
> Maybe I am crazy or something, but it sounds like you are about to overdo it...
> If I were in your place, I would just trim it while holding the dremel in one hand and the backplate in the other.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Does it need to be that pretty? It's the downside of the backplate, right?
Click to expand...

agreed


----------



## ebduncan

Quote:


> Originally Posted by *Regnitto*
> 
> I start getting artifacts over 1215/1500 on my 290. Think it might help me get to 1220 or 1225?


Certainly wouldn't hurt to try, as long as your temps are in check.

Are you running at +100mV? Have you tried modding the command line in MSI Afterburner to allow a higher voltage offset? You would likely have better luck doing that than messing with the PLL voltage.

But like I said, it wouldn't hurt to try.


----------



## Regnitto

I've done this:


But still get this:



my settings in AB:


----------



## ebduncan

Quote:


> Originally Posted by *Regnitto*
> 
> I've done this:


You have to do this.

Guys, it's easy to give more volts in MSI Afterburner.

Just use /wi4,30,8d,10 for +100mV. The offset step is 6.25mV and the value is written in hexadecimal: 0x10 is 16 in decimal, and 16 * 6.25 = 100mV. For +50mV you need 8. For +200mV you need 20 (0x20 is 32 in decimal, and 32 * 6.25 = 200mV).

The easy way to make the change:

Create a .txt file on the desktop and write:
CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /wi4,30,8d,10

then save it as a .bat file. Every time you start this .bat file, Afterburner will start with +100mV.

For 50mV: 8
For 100mV: 10
For 125mV: 14
For 150mV: 18
For 175mV: 1C
For 200mV: 20

I wouldn't go over this point because:
1) You are close to leaving the sweet spot of the reference PCB's VRM efficiency.
2) These commands add 200mV on top of the 100mV offset from the AB GUI. That means 300mV total.

There are plenty of guides on here, just hit Search :-D
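If it helps, here is the mV-to-hex arithmetic from the post above as a small Python sketch. It's purely illustrative (the value itself still goes into the .bat file by hand), but it reproduces the lookup table:

```python
# The last field of the /wi command is the voltage offset expressed in
# 6.25 mV steps, written in hex: mV -> mV/6.25 decimal steps -> hex.
STEP_MV = 6.25

def ab_offset_hex(mv):
    """Hex digits for a given mV offset (must be a multiple of 6.25)."""
    steps = mv / STEP_MV
    if steps != int(steps):
        raise ValueError("offset must be a multiple of 6.25 mV")
    return format(int(steps), "X")

for mv in (50, 100, 125, 150, 175, 200):
    print(mv, ab_offset_hex(mv))  # matches the table above: 8, 10, 14, 18, 1C, 20
```

So, for example, +175mV is 175 / 6.25 = 28 steps, and 28 in hex is 1C, exactly as listed.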


----------



## pshootr

Quote:


> Originally Posted by *ebduncan*
> 
> Its the phase locked loop (PLL) voltage
> 
> it can help with stability, but its not going to drastically increase your overclock.


Thank you.


----------



## pshootr

Quote:


> Originally Posted by *ebduncan*
> 
> You have to do this.
>
> Guys, it's easy to give more volts in MSI Afterburner.
>
> Just use /wi4,30,8d,10 for +100mV. The offset step is 6.25mV and the value is written in hexadecimal: 0x10 is 16 in decimal, and 16 * 6.25 = 100mV. For +50mV you need 8. For +200mV you need 20 (0x20 is 32 in decimal, and 32 * 6.25 = 200mV).
>
> The easy way to make the change:
>
> Create a .txt file on the desktop and write:
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi4,30,8d,10
>
> then save it as a .bat file. Every time you start this .bat file, Afterburner will start with +100mV.
>
> For 50mV: 8
> For 100mV: 10
> For 125mV: 14
> For 150mV: 18
> For 175mV: 1C
> For 200mV: 20
>
> I wouldn't go over this point because:
> 1) You are close to leaving the sweet spot of the reference PCB's VRM efficiency.
> 2) These commands add 200mV on top of the 100mV offset from the AB GUI. That means 300mV total.
>
> There are plenty of guides on here, just hit Search :-D


Thank you for the instructions. +1

But I have to open AB manually after using the .bat file; is this normal?

So if 100mV on the slider is now 300mV, then would 25mV on the slider be 150mV?


----------



## ebduncan

Quote:


> Originally Posted by *pshootr*
> 
> Thank you for the instructions. +1
> 
> But I have to open AB manually after using the bat. file, is this normal?
> 
> So if 100mv on the slider is now 300mv, then 25mv on slider would be 150mv?


Errm, no. The slider will change to reflect the new values; if it doesn't, then you probably did something wrong.


----------



## thrgk

When running BF4 and monitoring VRAM usage, it says I'm using 7300MB on average; that's with 100% resolution scale, all AA off, at 1440p. Is that true? Aren't my 4GB 290Xs enough? How can I run 150 or 200% resolution scale then?


----------



## pshootr

Quote:


> Originally Posted by *ebduncan*
> 
> ermm no, the slider will change  to reflect the new values if it doesn't then you probably did something wrong.


Ok. I'm glad the slider is supposed to change. Now I will try to make the .bat file again. I did it like this:

CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /wi4,30,8d,20

Is it normal to have to open AB manually after executing the .bat file?


----------



## ebduncan

Quote:


> Originally Posted by *pshootr*
> 
> Ok. I'm glad the slider is supposed to change. Now I will try to make the .bat file again. I did it like this:
> 
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi4,30,8d,10
> 
> Is it normal to have to open AB manually after executing the .bat file?


It should open automatically. Make sure you're using the correct target directory for MSIAfterburner.exe
(i.e. where the program is actually installed on your computer).

Also, if you're trying to raise the maximum voltage, you need to change the last value from 10 to 20.
Quote:


> Originally Posted by *thrgk*
> 
> When running bf4 and monitoring vram usage, its says im using 7300mb on average, thats with 100% resolution scale, all AA off, and 1440p. Is that true? My 4gb 290x arent enough? How can I run 150 or 200% res scale then


Divide the VRAM usage by the number of cards you have. Each card has to hold the same information in VRAM, but the program monitoring VRAM usage reports the total across all cards. Meaning if you have 4 GPUs, each with 4GB of VRAM, total available VRAM will report as 16GB. With CrossFire, even though you have 4GB of VRAM per card, the game can only use up to 4GB because the data across the cards must be the same.

If the info in your signature is correct, then

you're only using 1825MB on each card.
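The division above, as a trivial sketch (the 4-card count is assumed from the signature mentioned):

```python
# Monitoring tools sum VRAM across CrossFire cards, but each card holds a
# mirror of the same data, so the real per-card usage is total / card count.
def per_card_vram_mb(reported_mb, num_cards):
    """Actual VRAM used on each card, given the tool's summed figure."""
    return reported_mb / num_cards

print(per_card_vram_mb(7300, 4))  # 1825.0
```

So the reported 7300MB is nowhere near the 4GB per-card limit, and there's plenty of headroom for a higher resolution scale.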


----------



## pshootr

Quote:


> Originally Posted by *ebduncan*
> 
> It should open automatically. Make sure you're using the correct target directory for MSIAfterburner.exe
> (i.e. where the program is actually installed on your computer).


I did check the path, and the one you provided is the correct path to my install. Strange that it is not opening, and even when I open it manually the slider is still limited to 100mv.


----------



## Regnitto

Quote:


> Originally Posted by *pshootr*
> 
> I did check the path, and the one you provided is the correct path to my install. Strange that it is not opening, and even when I open it manually the slider is still limited to 100mv.


I'm having the same problem


----------



## ebduncan

Quote:


> Originally Posted by *pshootr*
> 
> I did check the path, and the one you provided is the correct path to my install. Strange that it is not opening, and even when I open it manually the slider is still limited to 100mv.


Try this instead then

CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /wi6,30,8d,20

Put that in the bat file.

run the bat file

Then open afterburner.


----------



## Regnitto

Quote:


> Originally Posted by *ebduncan*
> 
> Try this instead then
> 
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi6,30,8d,20
> 
> Put that in the bat file.
> 
> run the bat file
> 
> Then open afterburner.


slider was still limited to 100mv


----------



## venom55520

sign me up


----------



## ebduncan

Quote:


> Originally Posted by *Regnitto*
> 
> slider was still limited to 100mv


Don't know what to tell you then. I literally just downloaded MSI Afterburner, opened it up, turned on "extend overclocking limits", and changed the unofficial overclocking mode to "without PowerPlay support".

Closed it.

Made that batch file:
CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /wi6,30,8d,20

Ran MSI Afterburner again, and blam, the voltage was set to +200mV on the slider.

Worked for me...


----------



## pshootr

Quote:


> Originally Posted by *ebduncan*
> 
> Don't know what to tell you then. I literally just downloaded MSI Afterburner, opened it up, turned on "extend overclocking limits", and changed the unofficial overclocking mode to "without PowerPlay support".
>
> Closed it.
>
> Made that batch file:
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi6,30,8d,20
>
> Ran MSI Afterburner again, and blam, the voltage was set to +200mV on the slider.
>
> Worked for me...


Wow, OK. Thanks a lot for your help. I will try again. I did try unofficial overclocking mode without PowerPlay, but I didn't notice the "extend overclocking limits" option. I will check it out.

Edit: Oh, I see it; it was already checked though.


----------



## Arizonian

Quote:


> Originally Posted by *pshootr*
> 
> I started to tinker with my new CPU and GPU. CPU at 4.0GHz, GPU at 1100/1400. Here are the Fire Strike results.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Edit: Hmm, AB said 1300 memory clock, I think, and GPU-Z showed 1400 max. I dunno.


Noticed you weren't added to the list. I can't find your submission and would like to add you. Please post a GPU-Z link with your OCN name, a screenshot of the GPU-Z validation tab open with your OCN name, or a pic of the GPU with a piece of paper showing your OCN name. I'll get you on the list.









Quote:


> Originally Posted by *wermad*
> 
> How's your Phanteks case???
> 
> Here's my 900D:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> I'm hoping the 3rd Sapphire gets in by this weekend (and new cpu).


Strange, bud. I looked to update you and noticed I never added you! I must have missed you; just pop a validation link into the post I linked as proof. Congrats - added















Quote:


> Originally Posted by *statyksyn*
> 
> How can i Join the club?


To be added to the member list, please submit in your post a GPU-Z link with your OCN name, a screenshot of the GPU-Z validation tab open with your OCN name, or a pic of the GPU with a piece of paper showing your OCN name.









Quote:


> Originally Posted by *AlienPrime173*
> 
> Just got mine today!
> 
> PCS+ 290X!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Got it delivered to my work!
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Just add a validation link to your original post as well.









Quote:


> Originally Posted by *dteg*
> 
> well i definitely want in. took me awhile to decide which one i wanted, and then i forgot to consider if it could even fit in my case
> 
> 
> 
> 
> 
> 
> 
> . so i went back through my top list and found 1 that could. BARELY though.
> got the MSI one and with amazon prime had it in my hands in no time.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Again, just add a validation link to your original post as well.








Quote:


> Originally Posted by *Roboyto*
> 
> *Finally got around to making an Omega Drivers Comparison....better late than never?*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> All runs were made with identical CPU/RAM settings.
> This is a fairly fresh Win7 Ultimate install; November 30th
> DDU was run to upgrade drivers; CCleaner was also used to make sure any extra traces were gone.
> Tessellation was always on unless noted.
> All runs were made at least twice and the best score was recorded
> If there was a large delta, then I would reboot and run another 1-2 times to make sure the results were valid.
> 
> Stock clocks of 980/1250 were used for the first set
> They were then compared to a fairly high OC of 1280/1675 +200mV +50% power
> This is on stock XFX BIOS.
> Trixx for OC
> 
> Some screenshots are missing, my apologies...
> _My system..._
> 
> *Win7 Ultimate x64*
> *ASUS Z87 Gryphon w/ Thermal Armor*
> *i7 4770k @ 4.5GHz - XSPC Raystorm - Delidded/Naked*
> *2x8 GB Dominator Platinum 2400MHz*
> *XFX BE R9 290 Reference - XSPC Razor w/ Backplate*
> *Rosewill Capstone 650W Modular*
> 
> *Stock Clocks - 980/1250 - Driver 14.301.1001-140915a-176154C - Catalyst 14.9*
> 
> *Bioshock Infinite Benchmark (Ultra DX11 DDOF - 1920*1080)*
> 
> 
> 
> Per Scene Stats:
> 
> | Scene Duration (seconds) | Average FPS | Min FPS | Max FPS | Scene Name |
> | --- | --- | --- | --- | --- |
> | 32.79 | 128.77 | 29.39 | 305.94 | Welcome Center |
> | 6.54 | 141.22 | 21.6 | 168.62 | Scene Change: Disregard Performance In This Section |
> | 22.14 | 159.65 | 125.24 | 190.89 | Town Center |
> | 8.05 | 151.12 | 82.86 | 230.01 | Raffle |
> | 9.04 | 201.96 | 95.1 | 255.04 | Monument Island |
> | 3.02 | 211.46 | 179.5 | 237.86 | Benchmark Finished: Disregard Performance In This Section |
> | 81.57 | 151.56 | 21.6 | 305.94 | Overall |
> 
> 
> *Cloudgate* - 27,275 - http://www.3dmark.com/3dm/5051789?
> *Firestrike* - 9,526 - http://www.3dmark.com/3dm/5051701?
> *Sky Diver* - 27,291 - http://www.3dmark.com/3dm/5051763?
> *3DMark11 Performance* - P13870 - http://www.3dmark.com/3dm11/9134896
> *3DMark11 Performance *Tess OFF** - P15282 - http://www.3dmark.com/3dm11/9134871
> *Final Fantasy XIV Benchmark* *(Max Settings - 1920*1080 Windowed)* - 13,090
> *Tomb Raider (Ultimate Settings - 5760*1080)* - Min: 34.0 | Max: 48.0 | Avg: 40.1
> *Tomb Raider (Ultimate Settings - 1920*1080 Windowed)* - Min: 62.1 | Max: 108.0 | Avg: 80.7
> *Heaven (Extreme Presets - 1600*900 Windowed)* - Min: 8.6 | Max: 138.3 | FPS: 65.6 | Score: 1653
> *Valley (Extreme HD - 1920*1080 Windowed)* - Min: 16.5 | Max: 107.0 | FPS: 58.2 | Score: 2433
> 
> 
> 
> Spoiler: Screenies - Stock Clocks - Catalyst 14.9
> 
> 
> 
> _*3DMark11 Tessellation OFF*_
> 
> 
> 
> _*3DMark11 Tessellation ON*_
> 
> 
> 
> _*FFXIV Benchmark*_
> 
> 
> 
> _*Unigine Heaven*_
> 
> 
> 
> _*Tomb Raider 1920*1080*_
> 
> 
> 
> _*Tomb Raider 5760*1080*_
> 
> 
> 
> _*Unigine Valley*_
> 
> 
> 
> 
> 
> _*GPU Clocks - 1280/1675 - Driver 14.301.1001-140915a-176154C - Catalyst 14.9*_
> 
> *Bioshock Infinite Benchmark (Ultra DX11 DDOF - 1920*1080)*
> 
> 
> 
> Per Scene Stats:
> 
> | Scene Duration (seconds) | Average FPS | Min FPS | Max FPS | Scene Name |
> | --- | --- | --- | --- | --- |
> | 32.79 | 145.81 | 30.25 | 371.04 | Welcome Center |
> | 6.55 | 146.33 | 21.77 | 206.03 | Scene Change: Disregard Performance In This Section |
> | 22.12 | 201.57 | 153.63 | 257.75 | Town Center |
> | 8.05 | 190.84 | 90.3 | 280.09 | Raffle |
> | 9.04 | 251.41 | 134.85 | 379.44 | Monument Island |
> | 3.01 | 261.73 | 210.91 | 290.62 | Benchmark Finished: Disregard Performance In This Section |
> | 81.56 | 181.47 | 21.77 | 379.44 | Overall |
> 
> 
> *Cloudgate* - 29,725 - http://www.3dmark.com/3dm/5051855?
> *Firestrike* - 11,778 - http://www.3dmark.com/3dm/5051981?
> *Sky Diver* - 31,201 - http://www.3dmark.com/3dm/5051944?
> *3DMark11 Performance* - P16609 - http://www.3dmark.com/3dm11/9134823
> *3DMark11 Performance *Tess OFF** - P18079 - http://www.3dmark.com/3dm11/9134835 *My best 3DMark11 run EvAr*
> 
> 
> 
> 
> 
> 
> 
> 
> *Final Fantasy XIV Benchmark (Max Settings - 1920*1080 Windowed)* - 16,508
> *Tomb Raider (Ultimate Settings - 5760*1080)* - Min: 43.4 | Max: 58.4 | Avg: 50.2
> *Tomb Raider (Ultimate Settings - 1920*1080 Windowed)* - Min: 78.0 | Max: 126.5 | Avg: 101.8
> *Heaven (Extreme Presets - 1600*900 Windowed) -* Min: 33.5 | Max: 176.1 | FPS: 83.9 | Score: 2115
> *Valley (Extreme HD - 1920*1080 Windowed) -* Min: 35.1 | Max: 135.3 | FPS: 74.2 | Score: 3,103
> 
> 
> 
> Spoiler: The Warm & Fuzzy Screenshot
> 
> 
> 
> *P18079 w/ Tesselation OFF* *- Similar Scores*
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Screenies - OverClock - Catalyst 14.9
> 
> 
> 
> *3DMark11 Tessellation OFF*
> 
> 
> 
> *3DMark11 Tessellation ON*
> 
> 
> 
> *FFXIV Benchmark*
> 
> 
> 
> *Unigine Heaven*
> 
> 
> 
> *Tomb Raider 1920*1080*
> 
> 
> 
> *Unigine Valley*
> 
> 
> 
> 
> 
> *Overclock Performance Increase* - *Catalyst 14.9*
> 
> *Overall average has been calculated. I also took the average w/o Cloudgate/Sky Diver since they are less common and posted the worst gains*
> 
> 
> 
> *14.9 Catalyst*
> 
> | Benchmark | Score: Stock | Score: OverClock | Increase | % |
> | --- | --- | --- | --- | --- |
> | Bioshock (Avg FPS) | 151.56 | 181.47 | 29.91 | 19.73% |
> | Cloudgate | 27,275 | 29,725 | 2,450 | 8.98% |
> | Firestrike | 9,526 | 11,778 | 2,252 | 23.64% |
> | Sky Diver | 27,291 | 31,201 | 3,910 | 14.33% |
> | 3DMark11 | 13,870 | 16,609 | 2,739 | 19.75% |
> | 3DMark11 Tess OFF | 15,282 | 18,079 | 2,797 | 18.30% |
> | FFXIV | 13,090 | 16,508 | 3,418 | 26.11% |
> | Tomb Raider 5760 (Avg FPS) | 48.0 | 58.4 | 10.4 | 21.67% |
> | Tomb Raider 1080 (Avg FPS) | 80.7 | 101.8 | 21.1 | 26.15% |
> | Heaven | 1,653 | 2,115 | 462 | 27.95% |
> | Valley | 2,433 | 3,103 | 670 | 27.54% |
> | *Overall Average Improvement* | | | | 21.29% |
> | *Without Cloudgate & Sky Diver* | | | | 23.43% |
>  
> 
> Omega
> 
> _*Stock Clocks - 980/1250 - Driver 14.501.1003-141120a-177998C - Omega 14.12*_
> 
> *Bioshock Infinite Benchmark (Ultra DX11 DDOF - 1920*1080)*
> 
> 
> 
> Per Scene Stats:
> 
> | Scene Duration (seconds) | Average FPS | Min FPS | Max FPS | Scene Name |
> | --- | --- | --- | --- | --- |
> | 32.78 | 140.93 | 29.57 | 310.45 | Welcome Center |
> | 6.54 | 147.92 | 20.06 | 195.9 | Scene Change: Disregard Performance In This Section |
> | 22.12 | 184.35 | 141.35 | 263.82 | Town Center |
> | 8.05 | 173.96 | 62.15 | 240.11 | Raffle |
> | 9.03 | 239.74 | 118 | 363 | Monument Island |
> | 3.51 | 252.68 | 180.74 | 386.26 | Benchmark Finished: Disregard Performance In This Section |
> | 82.04 | 172.16 | 20.06 | 386.26 | Overall |
> 
> 
> *Cloudgate* - 27,686 - http://www.3dmark.com/3dm/5067286?
> *Firestrike* - 9,529 - http://www.3dmark.com/3dm/5067344?
> *Sky Diver* - 27,533 - http://www.3dmark.com/3dm/5067312?
> *3DMark11 Performance* - P13,898 - http://www.3dmark.com/3dm11/9135153
> *3DMark11 Performance *Tess OFF** - P15,313 - http://www.3dmark.com/3dm11/9135164
> *Final Fantasy XIV Benchmark* *(Max Settings - 1920*1080 Windowed)* - 13,215
> *Tomb Raider (Ultimate Settings - 5760*1080)* - Min: 33.9 | Max: 46.6 | Avg: 39.6
> *Tomb Raider (Ultimate Settings - 1920*1080 Windowed)* - Min: 62.0 | Max: 106.0 | Avg: 80.8
> *Heaven (Extreme Presets - 1600*900 Windowed)* - Min: 25.8 | Max: 139.2 | FPS: 66.2 | Score: 1,668
> *Valley (Extreme HD - 1920*1080 Windowed)* - Min: 28.6 | Max: 107.8 | FPS: 58.4 | Score: 2,442
> 
> 
> 
> Spoiler: Screenies - Stock Clocks - Omega
> 
> 
> 
> *Unigine Heaven*
> 
> 
> 
> *Tomb Raider 1920*1080*
> 
> 
> 
> *Tomb Raider 5760*1080*
> 
> 
> 
> *Unigine Valley*
> 
> 
> 
> 
> 
> _*GPU Clocks - 1280/1675 - Driver 14.501.1003-141120a-177998C - Omega 14.12*_
> 
> *Bioshock Infinite Benchmark (Ultra DX11 DDOF - 1920*1080)*
> 
> 
> 
> Per Scene Stats:
> 
> | Scene Duration (seconds) | Average FPS | Min FPS | Max FPS | Scene Name |
> | --- | --- | --- | --- | --- |
> | 32.96 | 155.36 | 1.46 | 417.19 | Welcome Center |
> | 7.06 | 145.46 | 20.02 | 304.15 | Scene Change: Disregard Performance |
> | 22.1 | 226.79 | 139.98 | 318.79 | Town Center |
> | 8.03 | 221.12 | 92.08 | 307.18 | Raffle |
> | 9.02 | 300.5 | 200.87 | 463.84 | Monument Island |
> | 3.01 | 314.86 | 234.79 | 343.14 | Benchmark Finished: Disregard Performance |
> | 82.18 | 202.15 | 1.46 | 463.84 | Overall |
> 
> 
> *Cloudgate* - 29,828 - http://www.3dmark.com/3dm/5102119?
> *Firestrike* - 11,799 - http://www.3dmark.com/3dm/5102205?
> *Sky Diver* - 31,479 - http://www.3dmark.com/3dm/5102167?
> *3DMark11 Performance* - P16,568 - http://www.3dmark.com/3dm11/9151648
> *3DMark11 Performance *Tess OFF** - P18043 - http://www.3dmark.com/3dm11/9151703
> *Final Fantasy XIV Benchmark* *(Max Settings - 1920*1080 Windowed)* - 16,502
> *Tomb Raider (Ultimate Settings - 5760*1080)* - Min: 44.0 | Max: 60.4 | Avg: 51.9
> *Tomb Raider (Ultimate Settings - 1920*1080 Windowed)* - Min: 77.7 | Max: 134.2 | Avg: 103.1
> *Heaven (Extreme Presets - 1600*900 Windowed)* - Min: 33.3 | Max: 177.5 | FPS: 83.9 | Score: 2,114
> *Valley (Extreme HD - 1920*1080 Windowed)* - Min: 34.6 | Max: 135.3 | FPS: 73.5 | Score: 3,077
> 
> 
> 
> Spoiler: Screenies - OverClock - Omega
> 
> 
> 
> *3DMark11 Tessellation OFF*
> 
> 
> 
> *3DMark11 Tessellation ON*
> 
> 
> 
> *FFXIV Benchmark*
> 
> 
> 
> *Unigine Heaven*
> 
> 
> 
> *Tomb Raider 1920*1080*
> 
> 
> 
> *Tomb Raider 5760*1080*
> 
> 
> 
> *Unigine Valley*
> 
> 
> 
> 
> 
> _*Overclock Performance Increase - Omega*_
> 
> *Overall average has been calculated. I also took the average w/o Cloudgate/Sky Diver since they are less common and posted the worst gains*
> 
> 
> 
> *14.12 Omega*
> 
> | Benchmark | Score: Stock | Score: OverClock | Increase | % |
> | --- | --- | --- | --- | --- |
> | Bioshock (Avg FPS) | 172.16 | 202.15 | 29.99 | 17.42% |
> | Cloudgate | 27,686 | 29,828 | 2,142 | 7.74% |
> | Firestrike | 9,529 | 11,799 | 2,270 | 23.82% |
> | Sky Diver | 27,533 | 31,479 | 3,946 | 14.33% |
> | 3DMark11 | 13,898 | 16,568 | 2,670 | 19.21% |
> | 3DMark11 Tess OFF | 15,313 | 18,043 | 2,730 | 17.83% |
> | FFXIV | 13,215 | 16,502 | 3,287 | 24.87% |
> | Tomb Raider 5760 (Avg FPS) | 39.6 | 51.9 | 12.3 | 31.06% |
> | Tomb Raider 1080 (Avg FPS) | 80.8 | 103.1 | 22.3 | 27.60% |
> | Heaven | 1,668 | 2,114 | 446 | 26.74% |
> | Valley | 2,442 | 3,077 | 635 | 26.00% |
> | *Overall Average Improvement* | | | | 21.51% |
> | *Without Cloudgate & Sky Diver* | | | | 23.84% |
>  
> 
> _*Stock Clock Comparison: Catalyst 14.9 ~VS~ Omega 14.12*_
> 
> *As can be seen below, there is barely any difference between 14.9 & Omega...with the exception of BioShock with solid gains, and Tomb Raider with a hard performance hit.*
> 
> *Averages were calculated the same as above, but also without Tomb Raider & BioShock since they influence the averages heavily.*
> 
> 
> 
> | Benchmark | Stock - 14.9 | Stock - Omega | Increase | % |
> | --- | --- | --- | --- | --- |
> | Bioshock (Avg FPS) | 151.56 | 172.16 | 20.60 | 13.59% |
> | Cloudgate | 27,275 | 27,686 | 411 | 1.51% |
> | Firestrike | 9,526 | 9,529 | 3 | 0.03% |
> | Sky Diver | 27,291 | 27,533 | 242 | 0.89% |
> | 3DMark11 | 13,870 | 13,898 | 28 | 0.20% |
> | 3DMark11 Tess OFF | 15,282 | 15,313 | 31 | 0.20% |
> | FFXIV | 13,090 | 13,215 | 125 | 0.95% |
> | Tomb Raider 5760 (Avg FPS) | 48.0 | 39.6 | -8.4 | -17.50% |
> | Tomb Raider 1080 (Avg FPS) | 80.7 | 80.8 | 0.1 | 0.12% |
> | Heaven | 1,653 | 1,668 | 15 | 0.91% |
> | Valley | 2,433 | 2,442 | 9 | 0.37% |
> | *Overall Average Improvement* | | | | 0.12% |
> | *Without Cloudgate & Sky Diver* | | | | -0.12% |
> | *Without Tomb Raider* | | | | 2.07% |
> | *Without Tomb Raider & Bioshock* | | | | 0.63% |
>  
> 
> _*Overclock Comparison: Catalyst 14.9 ~VS~ Omega 14.12*_
> 
> *As can be seen below, there is barely any difference between 14.9 & Omega...with the exception of BioShock with solid gains, and Tomb Raider with a hard performance hit.*
> 
> *Averages were calculated the same as above, but also without Tomb Raider & BioShock since they influence the averages heavily.*
> 
> 
> 
> | Benchmark | OC - 14.9 | OC - Omega | Increase | % |
> | --- | --- | --- | --- | --- |
> | Bioshock (Avg FPS) | 181.47 | 202.15 | 20.68 | 11.40% |
> | Cloudgate | 29,725 | 29,828 | 103 | 0.35% |
> | Firestrike | 11,778 | 11,799 | 21 | 0.18% |
> | Sky Diver | 31,201 | 31,479 | 278 | 0.89% |
> | 3DMark11 | 16,609 | 16,568 | -41 | -0.25% |
> | 3DMark11 Tess OFF | 18,079 | 18,043 | -36 | -0.20% |
> | FFXIV | 16,508 | 16,502 | -6 | -0.04% |
> | Tomb Raider 5760 (Avg FPS) | 58.4 | 51.9 | -6.5 | -11.13% |
> | Tomb Raider 1080 (Avg FPS) | 101.8 | 103.1 | 1.3 | 1.28% |
> | Heaven | 2,115 | 2,114 | -1 | -0.05% |
> | Valley | 3,103 | 3,077 | -26 | -0.84% |
> | *Overall Average Improvement* | | | | 0.14% |
> | *Without Cloudgate & Sky Diver* | | | | 0.04% |
> | *Without Tomb Raider* | | | | 1.25% |
> | *Without Tomb Raider & Bioshock* | | | | -0.02% |
>  
> 
> _This obviously isn't a comprehensive list, but it tells a familiar story for us enthusiast-class GPU owners. AMD's primary focus with Omega, and previously Mantle, is lower-tier hardware. Hopefully the drivers address pertinent issues in games, where a smooth gaming experience can be more important than benchmarks._
> 
> _-El Fin _
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _P.S._
> 
> _The experience while benching with Omega was smooth and enjoyable; this is a definite plus compared to some of the other drivers released in the year since the 290(X) debuted._
> 
> _Considering the performance results for BioShock and Tomb Raider, which aren't synthetic benchmarks, it is certainly plausible that there is solid performance to be gained, or lost, in other titles. I just don't have the game library, or the time, to test other titles that don't have convenient benchmarks built in._
> 
> _If you have detailed before and after results from a game, then let me know and I will add the information to the post._


Nice work - going to add this to OP information section for members to link to easier without having to dig for it.
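For anyone checking the math, the percentage columns in the comparison above are just (new − old) / old; a quick sketch that reproduces a couple of Roboyto's figures:

```python
# Reproduce the percentage-gain column from the driver comparison tables:
# gain % = (new - old) / old * 100, rounded to two decimals.
def gain_pct(old: float, new: float) -> float:
    return round((new - old) / old * 100, 2)

# Firestrike on Catalyst 14.9: 9,526 stock vs 11,778 overclocked
print(gain_pct(9526, 11778))   # 23.64

# Tomb Raider 5760x1080 at stock clocks, 14.9 vs Omega: a regression
print(gain_pct(48.0, 39.6))    # -17.5
```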








Quote:


> Originally Posted by *venom55520*
> 
> sign me up
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added 







Thank you for the proper submission.









*Wow, this marks 502 members, minus those who requested to be removed, with 727 GPUs total purchased among us*.


----------



## ultraex2003

Quote:


> Originally Posted by *ebduncan*
> 
> don't know what to tell you then. I literally just downloaded MSI Afterburner, opened it up, turned on extend overclocking limits, and changed the unofficial overclocking mode to "without PowerPlay support".
> 
> closed it.
> 
> made that batch file,
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi6,30,8d,20
> 
> ran MSI again, and blam, voltage was set to +200mV on the slider.
> 
> worked for me...


Yes, it works for me too, thanks!

But is there a way to get the full +200mV range on the slider without the .bat file?


----------



## pshootr

Here is my GPU-Z Validation link http://www.techpowerup.com/gpuz/details.php?id=narfq and thank you.


----------



## Arizonian

Quote:


> Originally Posted by *pshootr*
> 
> Here is my GPU-Z Validation link http://www.techpowerup.com/gpuz/details.php?id=narfq and thank you.


Congrats - added









_#503 / 728 GPUs club total._


----------



## venom55520

Quote:


> Originally Posted by *Arizonian*


I should note that it's a 290x, not a 290.

And thanks for the welcome


----------



## pshootr

Is 1.438V normal for +100mV?


----------



## Roboyto

Quote:


> Originally Posted by *X-Alt*
> 
> My XFX DD 290 is on its way to replace my tired old 7970, and I'm looking forward to it quite a bit. Anyway, is overclocking more or less the same principle with Hawaii compared to Tahiti, or is it more sensitive to voltages? I've got a couple of friends who say these cards are quite finicky with the volts compared to their 7950s; is this correct, or is the silicon lottery giving them bad luck? Also, from user experience in this thread, what seems to be the average OC range for a reference board? I'm not that worried about temps since I don't mind a 70%+ fan profile; my Tahiti could handle a good 1.325V without any issue as long as it was cool.
> 
> Regards,
> Alt


OC capabilities can vary greatly from one card to another, reference or not.

Silicon lottery winners come in more than one flavor with these cards, it seems. Some cards make it into the ballpark of 1100 core without adjusting voltage at all, but then fail to hit a higher number once voltage is added. Others start needing voltage fairly soon, but clocks continue to climb as the voltage does, like my XFX BE, which will hit 1300 core. The only way to know is to test your card and find out.

Overclocking here is going to be a little different than Tahiti. Here is a little blurb from my 'Need to Know' post about these cards. Check my sig for the link to it and other pertinent information.



Spoiler: Warning: Spoiler!




*Core clock is king! Overclock core 1st, worry about memory 2nd. The 512-bit bus makes up for the 5GHz default memory speeds.*
If you experience *artifacting/tearing/flashing* then odds are you:
pushed core clock too high
need to add more voltage (if temperatures will allow)
should increase power limit (if temperatures will allow)
If you can't add more voltage/power, bring the core clock down a little



*If you experience black screen, BSOD, or other lockups then odds are you have pushed the RAM clock too high.*
*RAM speed gives a fraction of a performance boost compared to core*
MSI Afterburner AUX voltage can sometimes assist with RAM clocks.
Make sure you *Force Constant Voltage* in your OC utility
Make sure you *Disable ULPS* via registry or OC Utility; especially with multiple GPUs!



*RAM Speed Comparisons I found @ 1080P*




*Above average cards can get to ~1100 core clock without needing additional voltage. *
This is of course dependent upon:
silicon lottery/specific GPU
PSU
cooling
voltage regulation
and the varying card/BIOS default voltage settings






If you're sticking with air cooling then you must closely monitor VRM1 temperatures. They will quickly become your limiting factor for an overclock long before the core temperature does; a problem the 79XX cards never had.

Quote:


> Originally Posted by *pshootr*
> 
> > Is 1.438V normal for +100mV?


That is about right depending on the stock voltage for your card. My XFX peaks at 1.414V with +200mV and 50% power limit.

*edit* thought it said +200mV...let me install AB and see

That is different than what I see in AB with the default +100mV. Mine peaked at 1.320V
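The readings line up with simple stock-plus-offset arithmetic; the stock voltages in this sketch are illustrative assumptions, not measured values:

```python
# Peak GPU-Z voltage is roughly the card's stock voltage plus the software
# offset, which is why two cards can show different peaks at the same slider
# setting. Stock values here are illustrative assumptions.
def peak_voltage(stock_v: float, offset_mv: float) -> float:
    return round(stock_v + offset_mv / 1000, 3)

print(peak_voltage(1.338, 100))  # a higher-stock card at +100 mV -> 1.438
print(peak_voltage(1.214, 200))  # a lower-stock card at +200 mV -> 1.414
```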


----------



## pshootr

Quote:


> That is about right depending on the stock voltage for your card. My XFX peaks at 1.414V with +200mV and 50% power limit.
> 
> *edit* thought it said +200mV...let me install AB and see


Thank you. I made a .bat file to increase the voltage slider in AB, but the slider did not change for me, unfortunately. I wanted to see if my slider at +100mV was accurate after using the .bat file.

Edit: Ah ok, I just noticed your [Edit] Thanks


----------



## pshootr

Oh wow, interesting, perhaps my voltage was affected even though the slider did not change. That is what I wanted to figure out lol

I will have to do some testing; earlier I saw that GPU-Z reported my max voltage at 1.438

Thank you very much


----------



## Performer81

Quote:


> Originally Posted by *AlienPrime173*
> 
> Just got mine today!
> 
> PCS+ 290X!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Got it delivered to my work!
> 
> http://www.overclock.net/content/type/61/id/2287643/
> http://www.overclock.net/content/type/61/id/2287644
> http://www.overclock.net/content/type/61/id/2287645/


Nice, you have the new revision also. Could you check if the X also has two VRMs with no contact to the VRM cooler? You can see that even when the card is in the case.

This pic is from my 290 pcs+:

http://abload.de/image.php?img=h3paqct5drwm.jpg


----------



## Forceman

Quote:


> Originally Posted by *pshootr*
> 
> Oh wow, interesting, perhaps my voltage was effected even though the slider did not change. That is what I wanted to figure out lol
> 
> I will have to do some testing, earlier I seen that GPU-Z reported my max voltage at 1.438
> 
> Thank you very much


Unless they've changed something, using the command line doesn't change the slider, it just adds 100mV to whatever is in the slider. So it'll still say 100, but it'll actually be 200 (and 0 would be 100).


----------



## Performer81

Yeah, hexadecimal 10 is +100mV when the slider is at 0. You can see it on the idle voltage graph. Somehow when I took the slider to -100 to get back to 0, it black-screened.


----------



## Gobigorgohome

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Sorry mate its my slight typo and my Oztralian and your self being from Norway
> 
> If your going to cool one card , why not all three ?? Its the only way to go with these beasts .
> Temporary or not waterblock your cards , don't mess around .
> 
> Is what I stated .


Yeah, language differences are common in here I guess.









As for the cards, I see what you mean about water cooling them; I just have to see what I do. If the Lightnings are air-cooled there will be no water-cooled card together with them. Crossfire on air is doable, not so sure about tri-fire.







The quadfire system is water cooled and running like a champ (gaming machine).


----------



## thrgk

Quote:


> Originally Posted by *ebduncan*
> 
> It should automatically open. Make sure you're using the correct target directory for MSIAfterburner.exe
> (i.e. where the program is actually installed on your computer)
> 
> Also, if you're trying to change the voltage max, you need to change the last digit from 10 to 20.
> Divide the VRAM usage by the number of cards you have. Each card has to hold the same information in VRAM, but the program monitoring VRAM usage reports the total GB used across all cards. Meaning if you have 4 GPUs, each with 4 GB VRAM, total available VRAM will report as 16 GB. With CrossFire, even though you have 4 GB of VRAM per card, the game can only use up to 4 GB because all the data across the cards must be the same.
> 
> If the info in your signature is correct, then
> 
> you're only using 1825 MB on each card.


Ah, so whatever number AB says I am using, divide it by 4 since I have 4 cards?
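Pretty much; the correction ebduncan describes is just a division (a minimal sketch):

```python
# CrossFire mirrors the same data into every card's VRAM, so monitoring tools
# that sum usage across GPUs over-report by a factor of the card count.
def per_card_vram(reported_mb: float, num_gpus: int) -> float:
    """Divide the tool's reported total by the number of GPUs."""
    return reported_mb / num_gpus

# e.g. a tool reports 7300 MB across a quad-CrossFire setup:
print(per_card_vram(7300, 4))  # 1825.0 MB actually used on each 4 GB card
```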


----------



## Archea47

Hey gang,

I'm finalizing the parts for adding the 290Xs to my loop, which I then hope to 1200+

My Tri-X cards have 8+6 PCIe power connectors; is this anything to be concerned about? It seems like the power consumption at that voltage for those clocks exceeds the rated capacity of the cabling powering the cards.

Thanks!


----------



## Chopper1591

Quote:


> Originally Posted by *Archea47*
> 
> Hey gang,
> 
> I'm finalizing the parts for adding the 290Xs to my loop, which I then hope to 1200+
> 
> My Tri-X cards have 8+6 PCIe power connectors; is this anything to be concerned about? It seems like *the power consumption at that voltage for those clocks exceeds the rated capacity of the cabling powering the cards*
> 
> Thanks!


Uhmm..
What?

8-pin is rated at 150w and 6-pin at 75w.
That's it... they can't supply more.

The pci-e slot itself also delivers 75w btw.

You have some secret power source connected to it?
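For reference, the rated numbers above sum like this (these are connector spec ratings, not hard electrical limits, which is what the question is really about):

```python
# Rated PCIe power budget for a reference 8+6-pin card (spec ratings only --
# heavily overclocked cards can draw more than the rated figures).
SLOT_W = 75        # PCIe x16 slot
SIX_PIN_W = 75     # 6-pin PCIe connector
EIGHT_PIN_W = 150  # 8-pin PCIe connector

rated_budget = SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(rated_budget)  # 300 -- why a >300 W overclock looks worrying on paper
```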


----------



## ebduncan

Quote:


> Originally Posted by *Forceman*
> 
> Unless they've changed something, using the command line doesn't change the slider, it just adds 100mV to whatever is in the slider. So it'll still say 100, but it'll actually be 200 (and 0 would be 100).


ERM? NO

http://smg.photobucket.com/user/ebduncan/media/Computer 2014/msitweaked_zps2ed08bfd.png.html

http://smg.photobucket.com/user/ebduncan/media/Computer 2014/msidefault_zpse0b53cb4.png.html

The slider changes....


----------



## Forceman

Quote:


> Originally Posted by *Chopper1591*
> 
> Uhmm..
> What?
> 
> 8-pin is rated at 150w and 6-pin at 75w.
> That's it... they can't supply more.
> 
> The pci-e slot itself also delivers 75w btw.
> 
> You have some secret power source connected to it?


Rated for is not the same as limited to.
Quote:


> Originally Posted by *ebduncan*
> 
> ERM? NO
> 
> http://smg.photobucket.com/user/ebduncan/media/Computer 2014/msitweaked_zps2ed08bfd.png.html
> 
> http://smg.photobucket.com/user/ebduncan/media/Computer 2014/msidefault_zpse0b53cb4.png.html
> 
> The slider changes....


Is that new? I haven't messed with it in a long while, but I'd have sworn it didn't change the slider before.


----------



## Archea47

Quote:


> Originally Posted by *Chopper1591*
> 
> Uhmm..
> What?
> 
> 8-pin is rated at 150w and 6-pin at 75w.
> That's it... they can't supply more.
> 
> The pci-e slot itself also delivers 75w btw.
> 
> You have some secret power source connected to it?


That was exactly my question - it seems like the cards on heavy OC can use > 300W


----------



## Performer81

Slider doesn't change here; +100mV is max. But stock voltage changes (290 PCS+)


----------



## Chopper1591

Quote:


> Originally Posted by *Performer81*
> 
> SLider doesnt change here. +100mv is max. But stock voltage changes (290 pcs+)


Same here.

Actually the main reason I use Trixx over AB.


----------



## Archea47

Nevermind - took a leap and ordered the right backplates from aquatuning.us ... I hope they're legit


----------



## pshootr

If I set disable ULPS in TRIXX, it does not remember my setting after restarting it. Am I missing something?

I have been using AB instead of TRIXX with 2D/3D profiles in order to stop freezes related to flash (disabling HA didn't help me) but would like to try and get TRIXX setup in a way that will work as well. However as stated above TRIXX is not remembering my setting for (disable ULPS).

I would actually be satisfied with AB if the slider were changed appropriately after using a .bat file to increase the voltage, but I have had no luck with that.


----------



## kizwan

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Chopper1591*
> 
> Uhmm..
> What?
> 
> 8-pin is rated at 150w and 6-pin at 75w.
> That's it... they can't supply more.
> 
> The pci-e slot itself also delivers 75w btw.
> 
> You have some secret power source connected to it?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Rated for is not the same as limited to.
Click to expand...

Yup, they can draw more than that.
Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ebduncan*
> 
> ERM? NO
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://smg.photobucket.com/user/ebduncan/media/Computer 2014/msitweaked_zps2ed08bfd.png.html
> 
> http://smg.photobucket.com/user/ebduncan/media/Computer 2014/msidefault_zpse0b53cb4.png.html
> 
> 
> 
> The slider changes....
> 
> 
> 
> Is that new? I haven't messed with it in a long while, but I'd have sworn it didn't change the slider before.
Click to expand...

It's not new. When using that command, the voltage slider is automatically set to max, and the max value is changed to whatever we set.
Quote:


> Originally Posted by *pshootr*
> 
> If I set disable ULPS in TRIXX, it does not remember my setting after restarting it. Am I missing something?
> 
> I have been using AB instead of TRIXX with 2D/3D profiles in order to stop freezes related to flash (disabling HA didn't help me) but would like to try and get TRIXX setup in a way that will work as well. However as stated above TRIXX is not remembering my setting for (disable ULPS).
> 
> I would actually be satisfied with AB is the slider was changed appropriately after using a .bat file to increase voltage slider. But I have had no luck with that.


I disabled ULPS using AB & overclock using TRIXX.


----------



## Regamaster

Hi guys, I recently got an ASUS R9 290 4GB for my build and now understand that my ambient temps just aren't low enough to run it acceptably quiet. I was planning on getting a Corsair HG10 bracket and coupling it with an Antec Kuhler 620 closed loop but ended up changing my mind.

So now it's *for sale* in the marketplace; if anyone is interested, feel free to PM me with questions or offers.


----------



## pshootr

Quote:


> I disabled ULPS using AB & overclock using TRIXX.


Good idea, but having to use both programs is not ideal. Kind of crappy that TRIXX will not remember settings; seems like such a basic feature to neglect.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *ebduncan*
> 
> +100mV isn't really anything...
> 
> I use Trixx and go up to +200mV; Afterburner = crap. Good luck keeping your card cool on air, though.
> 
> my card is waterblocked.


You know that the Asus PT1T BIOS and Asus GPU Tweak will give you +400mV on the slider?? .............. and that's no crap either









Quote:


> Originally Posted by *Gobigorgohome*
> 
> Yeah, language differences is common in here I guess.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As of the cards, I see what you mean about water cooling them, I just have to see what I do. If the Lightnings is air-cooled there will be no water cooled card together with them, crossfire on air is doable, not so sure about tri-fire.
> 
> 
> 
> 
> 
> 
> 
> The quadfire system is water cooled and running like a champ (gaming machine).


Water cooling dropped load temps by 35°C; essential for a multi-card gaming rig. I run my Tri on a separate loop with a twin D5 pump/res combo, one 360mm and one 420mm 60mm-thick XSPC rad, in line with a 1 HP water chiller with a 4-litre res.
All with QDCs so I can remove the rads and hook up the chiller directly. The CPU loop has a chiller and an inline pump/res, no rads.


----------



## ElevenEleven

Well, my new R9 290X by Powercolor (PCS+) is probably going back to NewEgg, unless someone has some ideas







I was getting occasional black screens while playing, but also while just browsing the web in Chrome or Waterfox. I've updated to the latest custom BIOS straight from Powercolor technical support and tried running at complete stock settings with no fan-curve modification or overclock (Afterburner process disabled), but I still got a black screen earlier today while browsing the web in Waterfox (only 2 tabs open). Then, after a restart and leaving the computer idle for an hour, I came back to the monitor not turning on and the computer not responding. My previous HD7970 would sometimes have issues waking up the monitor, but that was rare, and it definitely never black-screened doing even simple things. This was an open-box purchase, so I can only return it, not exchange it, sadly







Not sure if I should return or RMA.

P.S.: I'm on the latest Omega driver. Always a complete clean install. Windows 7 Pro 64-bit. Display cable is a good DL-DVI. Went back to HD7970 for now.


----------



## tsm106

Quote:


> Originally Posted by *Archea47*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Chopper1591*
> 
> Uhmm..
> What?
> 
> 8-pin is rated at 150w and 6-pin at 75w.
> That's it... they can't supply more.
> 
> The pci-e slot itself also delivers 75w btw.
> 
> You have some secret power source connected to it?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That was exactly my question - *it seems like the cards on heavy OC can use > 300W*
Click to expand...


----------



## Archea47

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> Spoiler: Warning: Spoiler!


Yep, that's the picture that got me thinking about my Tri-X cards only having 8+6 power. I believe you're on Lightnings with 8+8+6, which is 150W worth of difference. BTW, very frustrating to notice that difference so soon after buying the Sapphires
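As a sanity check on that 150W figure, the spec-rated budgets work out like this (slot 75W, 6-pin 75W, 8-pin 150W, as quoted above). A minimal sketch, bearing in mind that heavily overclocked cards routinely pull past these ratings:

```python
# Spec-rated power budget: PCIe slot plus auxiliary connectors.
RATINGS = {"slot": 75, "6pin": 75, "8pin": 150}  # watts, per the PCIe spec

def rated_power(*aux):
    """Slot power plus the rated wattage of each auxiliary connector."""
    return RATINGS["slot"] + sum(RATINGS[c] for c in aux)

print(rated_power("8pin", "6pin"))           # reference 290X (8+6): 300 W
print(rated_power("8pin", "8pin", "6pin"))   # Lightning-style 8+8+6: 450 W
```

So an 8+6 card is already at its rated 300W ceiling at heavy OC, while an 8+8+6 layout has 150W of extra rated headroom.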


----------



## ebduncan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> You know that Asus PT1T bios and Asus GPU tweek will give you 400mv on the slider ?? .............. and that's no crap either


You're not limited in Afterburner. I mean, I could run +600mv if I really wanted to. I don't use MSI's tool though; I prefer Trixx. It's a lighter-weight program and plays well with other monitoring programs.

I'm not interested in frying my card, so I keep the voltage at sane levels. I already run 1250 core and 1600 mem, so there's no real need for more. When I upgrade to a new card, I'll see about pushing this one to its absolute limits for fun. You know, putting the radiators in ice water and all


----------



## thrgk

I have 4 290Xs connected with a bridge and water blocks on all of them. If I wanted to take them out of my motherboard, is there an easy way to unlock all the PCIe locks on the slots? That little latch thing you push down so the card comes out when you pull on it.


----------



## ElevenEleven

Use something thin and narrow (and strong) other than your finger to push on those tabs? The only way to unlock the slots is to push the tabs, so you may have to get creative if you can't reach them easily.


----------



## tsm106

Quote:


> Originally Posted by *thrgk*
> 
> I have 4 290x connected with a bridge and all water blocks on them. If i wanted to take them out of my motherboard, is there an easy way to to unlock all the pcie locks on the slots? That little latch thing you push down so the card comes out when you pull on it.


I usually use a plastic chopstick, like the kind they sell at ikea. I also have one of those long plastic door panel removers. Either works, just no metal or magnetic.


----------



## Gobigorgohome

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Water cooling dropped ambient load temps by 35c . Essential for a multicard gaming rig . I run my Tri on a separate loop with a twin D5 pump / res combo , 1 360 60mm and 1 420 60mm Xspc rads in line with a 1hp water chiller with a 4 litre res .
> All with QDC's so I can remove the rads and hook up the chiller direct . The CPU loop has a chiller , inline pump / res no rads .


I do not really need a water chiller for my gaming machine; at 10C ambient temperature there is no way it will get "too hot". Load temperatures are about 35C on all four cards and 40C on the 4930K @ 4.7 GHz ... the HDDs and SSD are about 20C, idle temperature on the CPU is 13C average ... the GPUs are 19C average. There is no way I am heating up that room (concrete walls and no heating possibilities whatsoever). I usually game in a winter jacket and cap to stay warm.









In other words, to get that room remotely warm (~17C) I need a couple of Lightnings air-cooled.


----------



## pdasterly

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I do not really need a water chiller for my gaming machine, 10C ambient temperature there is no way it will get "too hot", load temperatures is about 35C on all four cards and 40C on the 4930K @ 4,7 Ghz ... HDD's and SSD is about 20C, idle temperatures on the CPU is 13C average ... the GPU's are 19C average. There is no way I am heating up that room (concrete walls and no heating possibilities whatsoever). I usually gaming in a winter jacket and cap to stay warm.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In other words, to get that room remotely warm (~17C) I need a couple of Lightnings air-cooled.


just give it some time, your room will eventually heat up


----------



## Gobigorgohome

Quote:


> Originally Posted by *pdasterly*
> 
> just give it some time, your room will eventually heat up


Yes, I guess it will get warmer in May/June 2015


----------



## Buehlar

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I do not really need a water chiller for my gaming machine, 10C ambient temperature there is no way it will get "too hot", load temperatures is about 35C on all four cards and 40C on the 4930K @ 4,7 Ghz ... HDD's and SSD is about 20C, idle temperatures on the CPU is 13C average ... the GPU's are 19C average. There is no way I am heating up that room (concrete walls and no heating possibilities whatsoever). *I usually gaming in a winter jacket and cap to stay warm.*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In other words, to get that room remotely warm (~17C) I need a couple of Lightnings air-cooled.


Burrrrr
I'm all for keeping temps as low as possible but that's not my idea of a comfortable gaming environment LOL


----------



## Gumbi

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I do not really need a water chiller for my gaming machine, 10C ambient temperature there is no way it will get "too hot", load temperatures is about 35C on all four cards and 40C on the 4930K @ 4,7 Ghz ... HDD's and SSD is about 20C, idle temperatures on the CPU is 13C average ... the GPU's are 19C average. There is no way I am heating up that room (concrete walls and no heating possibilities whatsoever). I usually gaming in a winter jacket and cap to stay warm.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In other words, to get that room remotely warm (~17C) I need a couple of Lightnings air-cooled.


You do realise that water cooling doesn't reduce the amount of heat your cards produce lol? They still produce the same amount of heat; the water cooling just moves it more quickly from the cards to the room.


----------



## kizwan

Quote:


> Originally Posted by *Gumbi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gobigorgohome*
> 
> I do not really need a water chiller for my gaming machine, 10C ambient temperature there is no way it will get "too hot", load temperatures is about 35C on all four cards and 40C on the 4930K @ 4,7 Ghz ... HDD's and SSD is about 20C, idle temperatures on the CPU is 13C average ... the GPU's are 19C average. There is no way I am heating up that room (concrete walls and no heating possibilities whatsoever). I usually gaming in a winter jacket and cap to stay warm.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In other words, to get that room remotely warm (~17C) I need a couple of Lightnings air-cooled.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You do realise that water cooling doesn't reduce the amount of heat your cards produce lol? They still produce the same amount of heat, the water colling just moves it more quickly from the cards to the room.
Click to expand...









He didn't say otherwise.


----------



## pshootr

Quote:


> Originally Posted by *ebduncan*
> 
> don't know what to tell you then. I literally just downloaded MSIafterburner. opened it up, turned extend overclocking limits, and changed the unoffical overclocking mode to with out powerplay support
> 
> closed it.
> 
> made that batch file,
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi6,30,8d,20
> 
> ran msi again, and blam voltage was set to +200mv on the slider.
> 
> worked for me...


I opened settings in AB, went to profiles, and set the 2D and 3D profiles to "none"; after that, AB started up with the slider at 200mv. I should be able to create new profiles now without issue, hopefully. Hope this helps others who had trouble doing this.

Thank you again for all your effort/help.









Edit: This method does seem quirky though; if you click the slider to adjust it, it reverts back to 100mv, so I am unable to slide down to, say, 150mv. Anyway, it is progress.


----------



## Forceman

Quote:


> Originally Posted by *Gumbi*
> 
> You do realise that water cooling doesn't reduce the amount of heat your cards produce lol? They still produce the same amount of heat, the water colling just moves it more quickly from the cards to the room.


There are some efficiency gains from running at lower temps though, especially in these cards. So running water cooled may also result in less heat.


----------



## kizwan

Quote:


> Originally Posted by *pshootr*
> 
> Quote:
> 
> 
> 
> I disabled ULPS using AB & overclock using TRIXX.
> 
> 
> 
> Good idea, but having to use both programs is kind of not ideal. Kind of crappy that TRIXX will not remember settings. Seems like such a basic feature to neglect.
Click to expand...

Disabling ULPS is a one-time job. You only need to redo it if you re-install the drivers.


----------



## pshootr

Quote:


> Originally Posted by *kizwan*
> 
> Disabling ULPS is one time job only. You only need to redo it if you re-install the drivers.


So once I disable ULPS through AB, It will always be disabled whether AB is running or not?


----------



## kizwan

Quote:


> Originally Posted by *pshootr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Disabling ULPS is one time job only. You only need to redo it if you re-install the drivers.
> 
> 
> 
> So once I disable ULPS through AB, It will always be disabled whether AB is running or not?
Click to expand...

Yes.


----------



## pshootr

Quote:


> Originally Posted by *kizwan*
> 
> Yes.


Ok good to know, thank you. I guess my next question is when I start TRIXX, ULPS is not disabled in the settings. Does this mean that TRIXX has re-enabled ULPS upon its start-up??


----------



## kizwan

Quote:


> Originally Posted by *pshootr*
> 
> Ok good to know, thank you. I guess my next question is when I start TRIXX, ULPS is not disabled in the settings. Does this mean that TRIXX has re-enabled ULPS upon its start-up??


No, that means you have not disabled ULPS. BTW, are you running CrossFire?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *tsm106*
> 
> I usually use a plastic chopstick, like the kind they sell at ikea. I also have one of those long plastic door panel removers. Either works, just no metal or magnetic.


Same ..... chopstick









Quote:


> Originally Posted by *Gobigorgohome*
> 
> I do not really need a water chiller for my gaming machine, 10C ambient temperature there is no way it will get "too hot", load temperatures is about 35C on all four cards and 40C on the 4930K @ 4,7 Ghz ... HDD's and SSD is about 20C, idle temperatures on the CPU is 13C average ... the GPU's are 19C average. There is no way I am heating up that room (concrete walls and no heating possibilities whatsoever). I usually gaming in a winter jacket and cap to stay warm.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In other words, to get that room remotely warm (~17C) I need a couple of Lightnings air-cooled.


LoooooL, of course, being near the Arctic Circle you don't need to run that.








Because I live in Queensland, Oztralia, I run my portable mancave a/c nearly 12 hrs a day, 6 days a week, and on HOT days (over 32C inside the house, not the mancave) I will crank the ducted a/c as well

Quote:


> Originally Posted by *pshootr*
> 
> Ok good to know, thank you. I guess my next question is when I start TRIXX, ULPS is not disabled in the settings. Does this mean that TRIXX has re-enabled ULPS upon its start-up??


Be careful running more than one monitoring program at once, e.g. AB and Trixx; it can cause conflicts between each other's settings


----------



## pshootr

Quote:


> Originally Posted by *kizwan*
> 
> No, that means you have not disable ULPS. BTW, are you running crossfire?


In AB disable ULPS is checked, so it should be disabled. However when TRIXX is started after closing AB, disable ULPS is unchecked in the TRIXX settings. That is what has me confused. Perhaps TRIXX does not recognize that ULPS has already been disabled by AB. I don't know..

No I only run one card for now.


----------



## pshootr

Quote:


> Be carefull running more than one monitoring program at once eg: AB and Trixx can cause conflicts between each others settings


Thank you, I am only running one at a time.


----------



## kizwan

Quote:


> Originally Posted by *pshootr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> No, that means you have not disable ULPS. BTW, are you running crossfire?
> 
> 
> 
> In AB disable ULPS is checked, so it should be disabled. However when TRIXX is started after closing AB, disable ULPS is unchecked in the TRIXX settings. That is what has me confused. Perhaps TRIXX does not recognize that ULPS has already been disabled by AB. I don't know..
> 
> No I only run one card for now.
Click to expand...

ULPS is for CrossFire only. It's not available when running a single card. Trixx says it's not disabled because there's nothing to disable.
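For the curious: the "disable ULPS" toggle in these tools is generally understood to write a registry value on each *secondary* GPU's driver subkey, something along these lines (the numeric subkey index, `0001` here, varies per system, so treat this as a sketch rather than a recipe):

```
Windows Registry Editor Version 5.00

; AMD display driver subkey for a secondary CrossFire GPU.
; The index (0000, 0001, ...) differs between systems.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0001]
"EnableUlps"=dword:00000000
```

With a single card there is no secondary subkey to patch, which is why there's nothing for Trixx to report as disabled.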


----------



## pshootr

Quote:


> Originally Posted by *kizwan*
> 
> ULPS is for Crossfire only. It's not available when running single card. TRIXX says it's not disable because there's nothing to disable.


Ah, I see. This whole new video card thing is really throwing me for a loop.


----------



## pshootr

Another funny thing: I was having freezing issues with Flash (disabling HA did not help). The only thing that helped was using AB with ULPS disabled and 2D/3D profiles created. Recently I removed the 2D/3D profiles to get the slider to 200mv (using a .bat file), and now Flash no longer seems to cause a freeze even without the 2D/3D profiles. I guess it's because without profiles my OC and voltage were constant. So confusing..

I actually like Trixx for its simplicity; it is much more straightforward and far less overwhelming than AB, and has the 200mv option by default. However, I like having 2D/3D profiles in AB, because then you don't have to apply OC settings every time you want to start a game.

It seems to me that overall AB might be best for advanced users, while Trixx may be the better option for average users. If Trixx had 2D/3D profiles it would be near perfect for the average user.


----------



## ebduncan

Quote:


> Originally Posted by *Gumbi*
> 
> You do realise that water cooling doesn't reduce the amount of heat your cards produce lol? They still produce the same amount of heat, the water colling just moves it more quickly from the cards to the room.


Not entirely true; it's well known that water cooling reduces the temperature and has a small impact on power consumption, because leakage is lower at lower temperatures. In the case of the R9 290/290X this drop is around 20-40 watts when water cooling.
Quote:


> Originally Posted by *Gobigorgohome*
> 
> I do not really need a water chiller for my gaming machine, 10C ambient temperature there is no way it will get "too hot", load temperatures is about 35C on all four cards and 40C on the 4930K @ 4,7 Ghz ... HDD's and SSD is about 20C, idle temperatures on the CPU is 13C average ... the GPU's are 19C average. There is no way I am heating up that room (concrete walls and no heating possibilities whatsoever). I usually gaming in a winter jacket and cap to stay warm.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In other words, to get that room remotely warm (~17C) I need a couple of Lightnings air-cooled.


10c ambient temp? That's a cold room. I wouldn't want to game in a room like that. You need a space heater or something.


----------



## pshootr

I have been using Trixx tonight and so far I have not had any freezing issues despite not having AB open with 2D/3D profiles, knock on wood. I've got 1200 core / 1500 mem so far with +125mv, running BF4 with no issues. Max VRM1 = 74C on stock air, max fan 80%. Not too shabby; I'll try tomorrow to see how high the memory can go at +125mv. I think I'm at my max core on air for this card. Thanks for all the help guys/gals. Goodnight


----------



## Gumbi

Quote:


> Originally Posted by *Forceman*
> 
> There are some efficiency gains from running at lower temps though, especially in these cards. So running water cooled may also result in less heat.


True, but these gains are fairly minimal.


----------



## zealord

Wondering when we will have a new (beta) driver for better Metal Gear Solid V: Ground Zeroes AMD GPU performance.

A 290X performs about as well as a GTX 680/770 in this game, or rather as "bad", lol.


----------



## Gobigorgohome

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> LoooooL , of cause your near the arctic circle you don't need to run that .
> 
> 
> 
> 
> 
> 
> 
> 
> Because I live in Queensland Oztralia I run my portable mancave a/c nearly 12 hrs a day 6 days a week , and on HOT days ( over 32c inside house not mancave ) I will crank the ducted a/c as well


I do not think the house was warmer than 30C this summer, when I ran crossfire Lightning R9 290Xs on air on an Asus Rampage IV Gene (stacked) and they thermal-throttled on the stock fan profile ... that is not going to happen in my cold room.
Quote:


> Originally Posted by *ebduncan*
> 
> 10c ambient temp? That's a cold room. I wouldn't want to game in a room like that. You need a space heater or something.


That is right now, it will get colder ... it is actually on the cold side of 10C now ...


----------



## AlienPrime173

will check once i'm home


----------



## joeh4384

Quote:


> Originally Posted by *ebduncan*
> 
> Not entirely true, its well known water cooling reduces the temperature and has a small impact on power consumption because the resistance is lower. In the case of the R9-290/290x this drop is around 20-40 watts when water cooling.
> 10c ambient temp? That's a cold room. I wouldn't want to game in a room like that. You need a space heater or something.


10c is frigid, I would have my 290x's folding to provide warmth when not gaming.


----------



## DANKO

Hello, good day to everyone! How are you? My name is Darian, and I am from Argentina. I have an Asus Matrix R9 290X Platinum, and I need the standard BIOS, because a friend of mine flashed the LN2 BIOS over the standard one. Is there any possibility you could send it to me? I'll wait for your response. Greetings and thanks to everyone, DJR!

The Asus website contains no BIOS for this card!









My Email is [email protected]


----------



## AlienPrime173

I can supply a 290X BIOS from the PCS+.

It's all I have. Not sure if it works on reference/non-reference cards.


----------



## DANKO

Quote:


> Originally Posted by *AlienPrime173*
> 
> i can supply 290x from pcs+
> 
> it's all i have. not sure if it works on reference/non ref


Thank you very much AlienPrime173, but I need the original one for the Matrix Platinum.

I tried the Asus DirectCU II 290X one, but it doesn't work properly.

Thanks also!


----------



## snow cakes

is there a visible performance difference between the 290x and the 290?


----------



## DANKO

Quote:


> Originally Posted by *snow cakes*
> 
> is there a visible performance difference between the 290x and the 290?


Snow cakes, you can see that here:

http://www.tomshardware.com/reviews/radeon-r9-290-and-290x,3728.html

http://www.anandtech.com/show/7481/the-amd-radeon-r9-290-review

http://www.techspot.com/review/762-gigabyte-radeon-r9-290-oc/


----------



## Performer81

Quote:


> Originally Posted by *AlienPrime173*
> 
> i can supply 290x from pcs+
> 
> it's all i have. not sure if it works on reference/non ref


What version do you have? I am interested because I have some issues with my unlocked PCS+.


----------



## hyp36rmax

Quote:


> Originally Posted by *DANKO*
> 
> Hello, good day to everyone! How are you? My name is Darian, and I am from Argentina. I have an Asus Matrix R9 290X Platinum, and I need the standard BIOS, because a friend of mine flashed the LN2 BIOS over the standard one. Is there any possibility you could send it to me? I'll wait for your response. Greetings and thanks to everyone, DJR!
> 
> The Asus website contains no BIOS for this card!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My Email is [email protected]


Here you go! It should be at TechPowerUp's repository shortly.

Go to town and enjoy!

*Source:* Link


----------



## snow cakes

Quote:


> Originally Posted by *DANKO*
> 
> Snow cakes you could see that here
> 
> http://www.tomshardware.com/reviews/radeon-r9-290-and-290x,3728.html
> 
> http://www.anandtech.com/show/7481/the-amd-radeon-r9-290-review
> 
> http://www.techspot.com/review/762-gigabyte-radeon-r9-290-oc/


great thanks


----------



## tsm106

Quote:


> Originally Posted by *snow cakes*
> 
> is there a visible performance difference between the 290x and the 290?


It depends, under typical gaming the gap is small, half a dozen percentage points. However under benching the gap can be quite large when every inch counts.


----------



## ebduncan

Quote:


> Originally Posted by *snow cakes*
> 
> is there a visible performance difference between the 290x and the 290?


The performance difference is 3-4% at the same clock speeds.
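That lines up with the paper specs; here's a quick check (stream-processor counts per AMD's published specs; the real-world gap is smaller than the raw 10% because games rarely scale linearly with shader count):

```python
# Hawaii XT (290X) vs Hawaii Pro (290) stream-processor counts.
sp_290x, sp_290 = 2816, 2560

extra_pct = (sp_290x / sp_290 - 1) * 100
print(round(extra_pct, 1))  # 10.0% more shaders on paper; ~3-4% in games
```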


----------



## DANKO

Quote:


> Originally Posted by *hyp36rmax*
> 
> Here you go! It should be at Tech Powerup's depository shortly.
> 
> Go to town and enjoy!
> 
> *Source:* Link


Thank you very much hyp36rmax!

Unfortunately the BIOS for the Asus Matrix Platinum is not on that page, but I think I found it on another site!


----------



## DANKO

Quote:


> Originally Posted by *snow cakes*
> 
> great thanks


You're welcome, snow cakes!


----------



## ElevenEleven

Current owners of the R9 290X: would you swap to a GTX 970 now for the same price? Just trying to decide if I should keep/RMA my faulty new R9 290X with an excellent cooler, or return it and keep waiting for GTX 970 sales to match (my 290X was effectively $255). The high power consumption just bugs me on principle, though my PSU can easily handle it and more. I don't know which is the better long-term investment; I usually keep my GPUs for a couple of years or a bit longer. This is for a single 2560x1440 monitor.
(This is NOT an nVidia vs. AMD choice; it's a product line vs. product line question for me, which does include various pluses and minuses, drivers, etc.)


----------



## velocityx

Quote:


> Originally Posted by *ElevenEleven*
> 
> Current owners of 290X R9 : would you swap to a GTX970 now for the same price? Just trying to decide if I should keep/RMA my faulty new R9 290X with an excellent cooler or return and keep waiting for GTX970 sales to match (my 290X was effectively $255). Just bugs me to have high power consumption on principle, though my PSU can easily handle it and more. I don't know what is better for a long term investment--I usually keep my GPUs for a couple of years or a bit longer. This is for a single 2560x1440 monitor.
> (This is NOT an nVidia vs. AMD choice, it's a product line vs. product line question for me, which does include various pluses and minuses, drivers, etc.)


I bought two of these a year ago. The earliest I plan to get new cards is probably when the next series of GeForce cards comes out, or when there is a Radeon 490 ready, or something else entirely after the next-generation Radeons. Also using a single 2560x1440 monitor. I have so much performance right now that the only thing I plan to do is invest some money in water cooling.


----------



## Archea47

290X Club submission:

http://www.techpowerup.com/gpuz/details.php?id=u8huc

2x Sapphire R9 290X with Tri-X cooler

Aquacomputer waterblocks and backplates are in the mail (why aren't these in the list of waterblocks on the first post?)


----------



## chiknnwatrmln

Quote:


> Originally Posted by *ElevenEleven*
> 
> Current owners of 290X R9 : would you swap to a GTX970 now for the same price? Just trying to decide if I should keep/RMA my faulty new R9 290X with an excellent cooler or return and keep waiting for GTX970 sales to match (my 290X was effectively $255). Just bugs me to have high power consumption on principle, though my PSU can easily handle it and more. I don't know what is better for a long term investment--I usually keep my GPUs for a couple of years or a bit longer. This is for a single 2560x1440 monitor.
> (This is NOT an nVidia vs. AMD choice, it's a product line vs. product line question for me, which does include various pluses and minuses, drivers, etc.)


I wouldn't. Performance is very similar between the two (coming down to silicon lottery, as I understand it), but the R9 has a bit of an edge at higher resolutions. Also, since I run CF, bridgeless multi-GPU is a must; IMO GPU bridges are hideous and just look goofy. As long as your PSU can support the extra power, the decrease in power consumption of the 970 over the life of the card does not justify the up-front price right now. I would think it's going to be a while before we see ~$250 970s.


----------



## Arizonian

Quote:


> Originally Posted by *Archea47*
> 
> 290X Club submission:
> 
> http://www.techpowerup.com/gpuz/details.php?id=u8huc
> 
> 2x Sapphire R9 290X with Tri-X cooler
> 
> Aquacomputer waterblocks and backplates are in the mail (why aren't these in the list of waterblocks on the first post?)


Congrats - added
















Please send me the link to the Aquacomputer 290x product page and I'd be glad to add it. I don't mind at all.









When I searched (translated to English) I couldn't find it. The closest was the news release.

http://www.aquacomputer.com/newsreader/items/komplettkuehler-fuer-amd-r9-290x-und-290.html


----------



## kizwan

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Please send me the link to the Aquacomputer 290x product page and I'd be glad to add it. I don't mind at all.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> When I searched translated to English I couldn't find it. Closest was the news release.
> 
> http://www.aquacomputer.com/newsreader/items/komplettkuehler-fuer-amd-r9-290x-und-290.html


Here you go:-
http://shop.aquacomputer.de/product_info.php?language=en&products_id=3151
http://shop.aquacomputer.de/product_info.php?language=en&products_id=3152
http://shop.aquacomputer.de/product_info.php?language=en&products_id=3135
http://shop.aquacomputer.de/product_info.php?language=en&products_id=3136

Backplate:-
http://shop.aquacomputer.de/product_info.php?language=en&products_id=3149
http://shop.aquacomputer.de/product_info.php?language=en&products_id=3150


----------



## Arizonian

Quote:


> Originally Posted by *kizwan*
> 
> Here you go:-
> http://shop.aquacomputer.de/product_info.php?language=en&products_id=3151
> http://shop.aquacomputer.de/product_info.php?language=en&products_id=3152
> http://shop.aquacomputer.de/product_info.php?language=en&products_id=3135
> http://shop.aquacomputer.de/product_info.php?language=en&products_id=3136
> 
> Backplate:-
> http://shop.aquacomputer.de/product_info.php?language=en&products_id=3149
> http://shop.aquacomputer.de/product_info.php?language=en&products_id=3150


Sweet - thank you for that.







OP updated with that info.


----------



## pshootr

I did some more testing with my Sapphire Tri-X OC. It was running BF4 at 1200/1600 +143mv with no issues, max VRM1 74C on stock air, max fan 81%. However, when I ran Fire Strike I got artifacts. Fire Strike heats things up more than BF4, so I did not want to increase the offset. So now I am running 1160/1500 +87mv (almost half the offset), max VRM1 64C, max fan 71%. Looks like this is my sweet spot: quieter and cooler, for what should be only a small performance hit.


----------



## zealord

Hmm, is it normal for AMD to take so long releasing new drivers for games? The AMD GPU performance in Metal Gear Ground Zeroes is embarrassing to say the least, and Nvidia had drivers ready 2 days prior to release (yeah, yeah, I know it's a "way it's meant to be played" title, but still).

I wouldn't care if it were just 5-10% less performance in a single title compared to average performance in games, but in this case it is around 30-40% less, and it is making me wait, because with a new driver this game would run rock solid with no dips below 60 fps at 1080p max settings. Currently it dips down to the low 40s, and that bothers me quite a lot.

A GTX 970 can keep a constant 60, and the 290X is at least on par with that card, so I see no reason why AMD can't deliver


----------



## joeh4384

Quote:


> Originally Posted by *zealord*
> 
> hmm is it normal for AMD too take so long with releasing new drivers for games? The AMD GPU performance in Metal Gear Ground Zeroes is embarrassing to say the least and Nvidia had drivers ready 2 days prior to release (yeah yeah I know it is a way meant to be played title but still)
> 
> I wouldn't care if it is just 5-10% less performance in a single title compared to the average performance in games, but in this case it is around 30-40% less performance and it is making me wait, because with a new driver this game would run at a rock solid no dips below 60 fps at 1080p max settings. Currently it dips down to the low 40s and it bothers me quite a lot.
> 
> A GTX 970 can keep a constant 60 and the 290X is atleast on par with that card so I see no reason why AMD can't deliver


AMD has been pretty good this fall with coming out with game ready beta drivers pretty quick.


----------



## Scorpion49

Anyone with an XFX Double Dissipation card having sagging issues? The back end of my card is almost 1/2 inch lower than the slot now, after only a week or two of use, and it seems to be getting worse.


----------



## ThijsH

That's surprising; my XFX DD has zero sag. You could install a backplate to reduce the sagging.


----------



## Scorpion49

Quote:


> Originally Posted by *ThijsH*
> 
> That's surprising, my xfx DD is having 0 sagging. You could install a backplate to reduce the sagging.


I'm thinking of just putting a water block on it with a backplate at some point since I already have a CPU loop, but the sag is pretty bad.


----------



## ThijsH

Now's a good time to buy a waterblock for it then ^^. Sagging can be really bad for your card if it gets too severe.


----------



## Klocek001

Quote:


> Originally Posted by *Scorpion49*
> 
> Anyone with an XFX Double Dissipation card having sagging issues? The back end of my card is almost 1/2 inch lower than the slot now over only a week or two of use and it seems to be getting worse.


my case came with a VGA guide and it's really doing the job, maybe you should try a makeshift one.


----------



## wermad

The Cooler Master store sells the HAF 935 (HAF X) VGA support bracket, though it only fits two-slot coolers.

Some people use supports like a pencil or a piece of small plastic tube.

Lastly, go fancy with a PowerColor VGA support bracket.


----------



## kizwan

Quote:


> Originally Posted by *Scorpion49*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ThijsH*
> 
> That's surprising, my xfx DD is having 0 sagging. You could install a backplate to reduce the sagging.
> 
> 
> 
> I'm thinking of just putting a water block on it with a backplate at some point since I already have a CPU loop, but the sag is pretty bad.
Click to expand...

YMMV. If it's already sagging with the stock cooler, it may still sag even with a water block and backplate; a backplate only helps a little. EK sells a support bracket to prevent sagging, but I don't know whether it's any good. You can also DIY a GPU support stand yourself.

EDIT: Ninja'd by wermad (on the GPU support stand).


----------



## Scorpion49

Quote:


> Originally Posted by *wermad*
> 
> Cooler master store sells the Haf 935 (haf x) vga support bracket. Though it only does two-slot coolers.
> 
> Some ppl use supports like a pencil or a piece of small plastic tube.
> 
> Lastly, go fancy with a power color vga support bracket.


Yeah, the thing is though the bend is permanent. I tried to prop it up but it just pushed the slot way up and it ends up banana shaped with a ton of pressure on the slot. I was hoping putting a block on with a backplate might straighten the PCB, probably do that after the holidays.


----------



## Agent Smith1984

Quote:


> Originally Posted by *ThijsH*
> 
> That's surprising, my XFX DD has zero sag. You could install a backplate to reduce the sagging.


No way dude, DD's are going to have _some_ sagging, even if it's just cause of their sheer size!

Oh... wait.... you meant....









HAPPY HOLIDAYS!!!!


----------



## wermad

Quote:


> Originally Posted by *Scorpion49*
> 
> Yeah, the thing is though the bend is permanent. I tried to prop it up but it just pushed the slot way up and it ends up banana shaped with a ton of pressure on the slot. I was hoping putting a block on with a backplate might straighten the PCB, probably do that after the holidays.


That doesn't sound right. Did you buy the card new? Might wanna get in touch with XFX support.


----------



## F4ze0ne

Quote:


> Originally Posted by *Scorpion49*
> 
> Yeah, the thing is though the bend is permanent. I tried to prop it up but it just pushed the slot way up and it ends up banana shaped with a ton of pressure on the slot. I was hoping putting a block on with a backplate might straighten the PCB, probably do that after the holidays.


Why not contact XFX and see if it can be replaced under warranty?


----------



## pshootr

Quote:


> Originally Posted by *wermad*
> 
> Cooler master store sells the Haf 935 (haf x) vga support bracket. Though it only does two-slot coolers.
> 
> Some ppl use supports like a pencil or a piece of small plastic tube.
> 
> Lastly, go fancy with a power color vga support bracket.


Is it just me, or are you not using screws to hold your card to the chassis? I use both screws for my two-slot Sapphire Tri-X, and I lock the PCIe slot. I get no sag at all.

Edit: I just realized the pic is not your rig. My mistake.


----------



## ThijsH

Quote:


> Originally Posted by *Agent Smith1984*
> 
> No way dude, DD's are going to have _some_ sagging, even if it's just cause of their sheer size!
> 
> Oh... wait.... you meant....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> HAPPY HOLIDAYS!!!!


Haha nice one.
Happy Holidays to you as well and everyone!!


----------



## chiknnwatrmln

As for GPU sagging, as stated earlier you can use an EK bracket for reference cards. I'm not totally sure if it will fit without an EK block, but it only costs $10 or so, so it's worth a shot. It also goes horizontally instead of vertically, which is much less ugly than the other thing posted imo.

You will always have a bit of GPU sag though. When I had one ref card with an EK block, plate, and bracket I still got a bit of sag. Now with two cards with blocks, plates, and the water terminal I still have a little sag. Granted it's only 1/6 inch or so, but it's still noticeable.


----------



## tsm106

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scorpion49*
> 
> Yeah, the thing is though *the bend is permanent*. I tried to prop it up but it just pushed the slot way up and it ends up banana shaped with a ton of pressure on the slot. I was hoping putting a block on with a backplate might straighten the PCB, probably do that after the holidays.
> 
> 
> 
> That doesn't sound right. Did you buy the card new? Might wanna get in touch with xfx support.
Click to expand...

Warped board? Something tells me that is way out of spec!


----------



## rivaldog

1. http://www.techpowerup.com/gpuz/details.php?id=4g8nb
2. Sapphire Tri-X R9 290
3. Stock cooling


----------



## combine1237

I have a question about the asus dcuii 290x and average vrm temps.


----------



## Mega Man

Quote:


> Originally Posted by *wermad*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Cooler master store sells the Haf 935 (haf x) vga support bracket. Though it only does two-slot coolers.
> 
> Some ppl use supports like a pencil or a piece of small plastic tube.
> 
> Lastly, go fancy with a power color vga support bracket.
Click to expand...

PowerColor's jack is called the Power Jack!

others use fishing line


----------



## Arizonian

Quote:


> Originally Posted by *rivaldog*
> 
> 1. http://www.techpowerup.com/gpuz/details.php?id=4g8nb
> 2. Sapphire Tri-X R9 290
> 3. Stock cooling


Perfect. Congrats - added


----------



## arrow0309

Do you guys use the new Afterburner 4.1.0?
It looks really cool











I wonder if we have to configure it in the same way as before, who wants to try it and give the feedback?


----------



## smoke2

I often run YouTube in the background and listen to music.
While playing videos in the background, my Tri-X 290 sits at 60 degrees on the GPU core.
Is it safe to keep the GPU at this temp for several hours every day?


----------



## zealord

Quote:


> Originally Posted by *smoke2*
> 
> I often run YouTube in the background and listen to music.
> While playing videos in the background, my Tri-X 290 sits at 60 degrees on the GPU core.
> Is it safe to keep the GPU at this temp for several hours every day?


yes. you can also turn off hardware acceleration in chrome or flash player


----------



## EddieEdit

Anyone have any experience with the PowerColor R9 290X PCS+? I am able to purchase this card for $300 and was wondering if it would be a safe investment. I've been reading a lot of mixed reviews on these cards, such as needing to update the BIOS and black screens. What would you guys do?


----------



## zealord

Quote:


> Originally Posted by *EddieEdit*
> 
> Anyone have any experience with the PowerColor R9 290X PCS+? I am able to purchase this card for $300 and was wondering if it would be a safe investment. I've been reading a lot of mixed reviews on these cards, such as needing to update the BIOS and black screens. What would you guys do?


don't think it has anything to do with manufacturer. many 290X's had this problem. I haven't encountered it with the last couple of drivers.


----------



## EddieEdit

Quote:


> Originally Posted by *zealord*
> 
> don't think it has anything to do with manufacturer. many 290X's had this problem. I haven't encountered it with the last couple of drivers.


How about the card itself? Is there any reason why I should opt for the more common 290X Tri-X? Or is this card pretty much in the same ballpark as the more common brands in regards to build quality and longevity?


----------



## smoke2

Quote:


> Originally Posted by *zealord*
> 
> yes. you can also turn off hardware acceleration in chrome or flash player


When I tried turning off hardware acceleration the video looked a little more degraded.
Dunno if it was just a feeling or reality


----------



## Gumbi

Quote:


> Originally Posted by *EddieEdit*
> 
> How about the card itself? Is there any reason why I should opt for the more common 290X Tri-X? Or is this card pretty much in the same ballpark as the more common brands in regards to build quality and longevity?


The Tri-X and PCS+ are about equal. They are the best aftermarket coolers on 290s barring the Vapor-X or Lightning.


----------



## the9quad

Quote:


> Originally Posted by *arrow0309*
> 
> Do you guys use the new Afterburner 4.1.0?
> It looks really cool
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I wonder if we have to configure it in the same way as before, who wants to try it and give the feedback?


works the same, in fact the older type skins come with it as well.


----------



## Regnitto

Quote:


> Originally Posted by *the9quad*
> 
> works the same, in fact the older type skins come with it as well.


but that new skin looks freaking sweet


----------



## X-Alt

Quote:


> Originally Posted by *Scorpion49*
> 
> Anyone with an XFX Double Dissipation card having sagging issues? The back end of my card is almost 1/2 inch lower than the slot now over only a week or two of use and it seems to be getting worse.


I'm installing my card later tonight, I will be sure to tighten the screws really hard, considering the fact it has no backplate. I'll report what I see, hopefully your card's issue is an easy fix.


----------



## tsm106

Quote:


> Originally Posted by *X-Alt*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scorpion49*
> 
> Anyone with an XFX Double Dissipation card having sagging issues? The back end of my card is almost 1/2 inch lower than the slot now over only a week or two of use and it seems to be getting worse.
> 
> 
> 
> I'm installing my card later tonight, *I will be sure to tighten the screws really hard*, considering the fact it has no backplate. I'll report what I see, hopefully your card's issue is an easy fix.
Click to expand...

Not sure what was discussed earlier, but really hard can be a bad thing just saying.


----------



## X-Alt

Quote:


> Originally Posted by *tsm106*
> 
> Not sure what was discussed earlier, but really hard can be a bad thing just saying.


By really hard, I meant quite snug, not to the point I strip them or anything. I'm not a fan of sagging, especially with such a pretty looking card. On an unrelated note, should I sweep my drivers completely, or should I simply update Catalyst to Omega and call it a day?


----------



## wermad

Triplets under water. Will do some benching for some power numbers if I have time today.





edit: idles ~120-130w.


----------



## snow cakes

Quote:


> Originally Posted by *wermad*
> 
> Triplets under water. Will do some benching for some power numbers if I have time today.
> 
> 
> 
> 
> 
> edit: idles ~120-130w.


That looks great man, can't wait for the benches. Performance-wise, how much better are 3 compared to 2x when gaming?


----------



## HOMECINEMA-PC

Merry XMAS to all from 'Stralia


----------



## chiknnwatrmln

Quote:


> Originally Posted by *X-Alt*
> 
> By really hard, I meant quite snug, not to the point I strip them or anything. I'm not a fan of sagging, especially with such a pretty looking card. On an unrelated note, should I sweep my drivers completely, or should I simply update Catalyst to Omega and call it a day?


Be careful, the main concern is not stripping the screws but cracking the PCB itself. Silicon is pretty rigid, but if it cracks you're pretty screwed. I normally tighten screws until I get a decent amount of resistance and everything is held in place, no more. It's not like my rig moves around or anything, and a bit of GPU sag is better than overtightened screws and risking damage imo.


----------



## DividebyZERO

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Merry XMAS to all from 'Stralia


you too!


----------



## X-Alt

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Be careful, the main concern is not stripping the screws but cracking the PCB itself. Silicon is pretty rigid, but if it cracks you're pretty screwed. I normally tighten screws until I get a decent amount of resistance and everything is held in place, no more. It's not like my rig moves around or anything, and a bit of GPU sag is better than overtightened screws and risking damage imo.


It went well, and I put it in snugly and the card has very little, if any sag. Anyways, I have no experience dealing with Elpida VRAM, do they need more voltage than Hynix, or are they all around more unstable than the former, and what are the usual clocks that people with Elpida RAM achieve?


----------



## wermad

Quote:


> Originally Posted by *snow cakes*
> 
> That looks great man, can't wait for the benches. Performance-wise, how much better are 3 compared to 2x when gaming?


I'm still undecided on the cpu, so I'm still on the G3258. I'll run some benches for some power draw figures though. All three idle at 35C on the Koolance blocks.

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Merry XMAS to all from 'Stralia


Happy Christmas to all


----------



## HOMECINEMA-PC

Just finished a couple of hours of XMAS gaming ........
COD Black Ops 2 ..... a oldie but a goodie
25C idle, 35C load temps....... stock clocks with [email protected]@2800. In 4K of course


----------



## chiknnwatrmln

Quote:


> Originally Posted by *X-Alt*
> 
> It went well, and I put it in snugly and the card has very little, if any sag. Anyways, I have no experience dealing with Elpida VRAM, do they need more voltage than Hynix, or are they all around more unstable than the former, and what are the usual clocks that people with Elpida RAM achieve?


That's good. Some people have done a lot of research into Elpida vs Hynix/Samsung but I run one Hynix and one Elpida.. imo there is almost no real world difference. My Elpida was capable of 1500+MHz (been a long time since I ran it that high though) and I haven't tested the max for my Hynix card but I run both at 1375MHz daily. With such a wide bus overclocking the RAM doesn't provide a huge benefit outside of benchmarking.


----------



## BradleyW

Patch 1.04 for AC Unity has caused massive flickering on AMD CFX. Anyone know of a fix / different profile? (290X CFX)


----------



## X-Alt

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> That's good. Some people have done a lot of research into Elpida vs Hynix/Samsung but I run one Hynix and one Elpida.. imo there is almost no real world difference. My Elpida was capable of 1500+MHz (been a long time since I ran it that high though) and I haven't tested the max for my Hynix card but I run both at 1375MHz daily. With such a wide bus overclocking the RAM doesn't provide a huge benefit outside of benchmarking.


I've got mine tested stable at 1380, that should do. I'm used to Tahiti chips, I had my Hynix around 1750 at a certain point, and I got some benefits from it. Now, to see what this core can do..


----------



## Gabkicks

The v1000 is enough for 3 r9 290s? How is the latest MSI afterburner? Does it hold settings now, or should I stick to Trixx?


----------



## Klocek001

Quote:


> Originally Posted by *Gabkicks*
> 
> The v1000 is enough for 3 r9 290s? How is the latest MSI afterburner? Does it hold settings now, or should I stick to Trixx?


Stock or OC? Unless you're in it for benching I'd advise against getting the third if you're happy with two. Why? Additional trouble (compatibility, heat, noise, stutter). A 2-card setup is the sweet spot, best scaling. I'm sure you're already getting 144 FPS on 2 cards; 1 card can get anywhere between 80 and 120 fps.


----------



## rivaldog

Quote:


> Originally Posted by *Arizonian*
> 
> Perfect. Congrats - added


Thanks a lot man


----------



## wermad

Ran 3D11 (sans physics and combined), this seems to consume the most vs 3DFS and Heaven. Also, made sure all three hit 100% usage in AB (the demo and Graphics Test #1 seem to push the gpu's more).

~775-810W *@ the Kill-A-Watt*. Factor in efficiency, ~680-715w *total system draw*. Subtract the cpu, mb, extras (fans & pump), ~100W, I'm guessing I'm ~600-650w for total gpu consumption.

So, a 1000w psu is good enough for three factory oc 290s and an enthusiast platform (ie LGA1150/1155). If these were 290X's and a X79/99 system, I would say you're pushing it and would opt for at least a 1200W unit. Make sure you guys consider good quality units. There's lots of cheap Gold and now some Platinum units on ebay thanks to the fall of the mining craze. Saw another Enermax 1500w go for ~$125.
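For anyone who wants to redo that math with their own meter reading, here's a quick Python sketch of the estimate above. The 88% PSU efficiency and the ~100 W CPU/board/fans/pump budget are just assumed figures to match the post, not measurements; plug in your own numbers.

```python
# Rough per-GPU power estimate from a wall-meter (Kill-A-Watt) reading.
# Assumptions: PSU efficiency and non-GPU power budget are guesses, not measured.
def estimate_gpu_draw(wall_watts, psu_efficiency=0.88, non_gpu_watts=100, num_gpus=3):
    dc_watts = wall_watts * psu_efficiency   # DC power the PSU actually delivers
    gpu_total = dc_watts - non_gpu_watts     # remainder attributed to the GPUs
    return gpu_total, gpu_total / num_gpus   # total and per-card draw in watts

total, per_card = estimate_gpu_draw(810)     # high end of the ~775-810 W readings
print(f"{total:.0f} W total, {per_card:.0f} W per card")
```

With these assumed numbers the high-end 810 W wall reading works out to roughly 613 W for the three cards, about 204 W each, right in line with the ballpark above.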


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Just finished a couple of hours of XMAS gaming ........
> COD Black Ops 2 ..... a oldie but a goodie
> 25c idle 35c load temps....... stock clocks with [email protected]@2800 . In 4k of cause


me too but different games . . .


Spoiler: Warning: Spoiler!











BF4 server is down.









But, Merry Christmas!


----------



## xKrNMBoYx

I've obtained a good amount of Graduation, Christmas, New Years, Birthday money, enough to build a whole new rig (Intel i5/i7, GTX 980, etc) but I decided against that. Instead though I purchased an R9 290 and can't wait for when it arrives. Obviously being the 26th means it will be a few days before things get processed and moving so the wait will be rough. It'll probably be just in time to test Unity. I'll be looking through this thread for valuable info.


----------



## pshootr

I was wondering if drive speed would affect scores at all on Fire Strike benches?

Also wondering if RAM clock speed should affect scores much?


----------



## rdr09

Quote:


> Originally Posted by *pshootr*
> 
> I was wondering if drive speed would affect scores at all on Fire Strike benches?
> 
> Also wondering if RAM clock speed should affect scores much?


RAM does, but not the drive. HDD to SSD will not impact the score; if it does, it's negligible. RAM from 1333 to 1800 or higher will add to the graphics score. Also, OS: Win 8 gives better scores in FS.


----------



## pshootr

Quote:


> Originally Posted by *rdr09*
> 
> RAM does, but not the drive. HDD to SSD will not impact the score; if it does, it's negligible. RAM from 1333 to 1800 or higher will add to the graphics score. Also, OS: Win 8 gives better scores in FS.


Good to know, thank you for your helpful response.


----------



## Archea47

Yep, my girlfriend buys the best Christmas gifts


----------



## JourneymanMike

I have the same blocks for my 290x's, but with nickel - you'll be real pleased with them...

You are blessed to have such a nice girlfriend - don't blow it! In fact you should marry her!


----------



## hyp36rmax

Quote:


> Originally Posted by *Archea47*
> 
> 
> 
> Yep, my girlfriend buys the best Christmas gifts


Mrs hyp36rmax did the same for me for my Vapor-X 290X's, EK blocks with backplanes and Alphacool radiators.


----------



## Gobigorgohome

How will 1x R9 290X and 2x R9 290 do in tri-fire? The R9 290X is a Lightning, the R9 290s are Sapphire reference PCBs.









I could get the R9 290 real cheap, R9 290X's is almost impossible to find used in Norway nowadays, lots of R9 290s though.


----------



## Gumbi

Quote:


> Originally Posted by *Gobigorgohome*
> 
> How will 1x R9 290X and 2x R9 290 do in tri-fire? The R9 290X is a Lightning, the R9 290s are Sapphire reference PCBs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I could get the R9 290 real cheap, R9 290X's is almost impossible to find used in Norway nowadays, lots of R9 290s though.


Pretty sure it will work, they will just all work at the frequency and speed of the 290s.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Gumbi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gobigorgohome*
> 
> How will 1x R9 290X and 2x R9 290 do in tri-fire? The R9 290X is a Lightning, the R9 290s are Sapphire reference PCBs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I could get the R9 290 real cheap, R9 290X's is almost impossible to find used in Norway nowadays, lots of R9 290s though.
> 
> 
> 
> Pretty sure it will work, they will just all work at the frequency and speed of the 290s.
Click to expand...

When I was using an R9 290 with my 295x2, the 290 clocked up to the 295x2's speeds; other times all 3 GPUs stayed at ref clocks 1018/1250 + 980/1250

Worked fine for me either way


----------



## Gobigorgohome

Quote:


> Originally Posted by *Gumbi*
> 
> Pretty sure it will work, they will just all work at the frequency and speed of the 290s.


It should work too, but the thing you are saying about them operating at the 290s' frequency seems a little weird to me ... I rather think they will work at their stock frequency; the R9 290X Lightning will not "downclock" to R9 290 levels. If so it does not sound right: I had 3-way SLI before, the overclocked cards were not running the same frequency, and it worked fine.








Quote:


> Originally Posted by *Sgt Bilko*
> 
> When i was using an R9 290 with my 295x2 the 290 clocked up to the 295x2's speeds and other times all 3 GPU's stayed at Ref clocks 1018/1250 + 980/1250
> 
> Worked fine for me either way


Yes, this is pretty much what I am planning on doing. R9 290X will be operating at stock clocks mostly (1080 Mhz), the R9 290s just have to match that.


----------



## nightfox

Quote:


> Originally Posted by *Gumbi*
> 
> Pretty sure it will work, they will just all work at the frequency and speed of the 290s.


Wrong, they will work independently. It's been a long time since those limitations were gone. I know that because I have 2x 290X and 2x 290 quadfired.


----------



## Gobigorgohome

Quote:


> Originally Posted by *nightfox*
> 
> wrong. they will work independently. its been long time those were gone. i know that cause i have 2x290'x and 2x290's quadfired


How is the 2x r9 290x and 2x r9 290 in quadfire compared to 4x r9 290x in quadfire? Do you know? I have quadfire R9 290X myself, the Lightning R9 290X is for another machine, the one that probably will get two R9 290s too.


----------



## wermad

Tahiti (HD 7xxx) series also ran independently in crossfire. They no longer had to match the lowest-clocked card. So a 7970 GE could be paired with a 7950.

Ie: 290x + 290 will each run at their default speeds in crossfire (947 + 1000 in reference-stock guise).

As far as quadfire, Hawaii didn't have the magic of Tahiti, so from what owners have told me (as I was interested in quads), it's mainly for benching. You really have to push resolutions really high, but even then, gains are minimal (check out DeadlyDNA's 12k Eyefinity quad). This was the deal breaker for me tbh:

http://forums.anandtech.com/showthread.php?t=2359120

edit: ^^^ I'm sure someone will come up with the excuse of the cpu being a bottleneck (lol).


----------



## sTOrM41

Does somebody have an up-to-date 290X PCS+ BIOS for me? (ideally from week 47!)
I just got a BIOS from AlienPrime173 (thanks man!)
from week 44, but it's not working on my 290 (non-X)


----------



## Gobigorgohome

Quote:


> Originally Posted by *wermad*
> 
> Tahiti (HD 7xxx) series also ran independent in crossfire. They no longer had to match the lowest clocked card. So a 7970 GE could be paired with a 7950.
> 
> Ie: 290x + 290 will run at its default speeds in crossfire (947 + 1000 in reference-stock guise).
> 
> As far as quadfire, Hawaii didn't have the magic of Tahiti, so from what owners have told me, as i was interested in quads, its mainly for benching. You really have to push resolutions really high, but even then, gains are minimal (check out DeadlyDNA 12k Eyefinity quad). This was the deal breaker for me tbh:
> 
> http://forums.anandtech.com/showthread.php?t=2359120
> 
> edit: ^^^ I'm sure someone will come up with the excuse of the cpu being a bottleneck (lol).


I am aware of DeadlyDNA's setup (both cards and monitor-setup), I am pushing 4K only on my machine and it is working good, the fourth card does not gain much performance. Could even run three cards with very good performance at 4K. In my case the quadfire R9 290X is the main gaming rig, the R9 290X Lightning and R9 290s will be a LAN party rig and spare rig when the other rig is under construction.


----------



## levism99




----------



## pshootr

Quote:


> Originally Posted by *levism99*


Holy cow?


----------



## Gumbi

Quote:


> Originally Posted by *pshootr*
> 
> Holy cow?


Let us know how they clock, my 290 Vapor X is on the way, can't wait to get my hands on it.


----------



## levism99

I will, just waiting on my 3 EK-FC R9-290X VaporX - Nickel water blocks


----------



## Harry604

I just installed an Asus reference 290 with my Asus DirectCU II 290X.

I'm getting low GPU usage in BF3.

In Shadow of Mordor I'm getting 99 percent usage with both cards, but the fps haven't gone up from a single card.

I have a 3570K at 4.7, Corsair AX 750W PSU,

Z77 MSI GD65 mb,

installed new Omega drivers


----------



## VSG

Shadow of Mordor has a frame cap of 100 FPS. Are you hitting that with a single card?


----------



## Arizonian

Quote:


> Originally Posted by *wermad*
> 
> Triplets under water. Will do some benching for some power numbers if I have time today.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> edit: idles ~120-130w.


Nice work.








Quote:


> Originally Posted by *Archea47*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Yep, my girlfriend buys the best Christmas gifts


Computer gifts make the best gifts IMO.








Quote:


> Originally Posted by *levism99*
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added.


----------



## Algeron

http://www.techpowerup.com/gpuz/details.php?id=fu3bk

Msi 290x 4gd5, Reference cooler version.

For now anyways.









Still debating going g10 or hg10 a1.


----------



## tsm106

Quote:


> Originally Posted by *Algeron*
> 
> Still debating going g10 or hg10 a1.


If you don't plan on changing your card for a while, get the HG10; it's better in every way but it is limited to the reference PCB. In it for the long haul. If not, then get the G10.


----------



## hyp36rmax

Quote:


> Originally Posted by *levism99*
> 
> I will, just waiting on my 3 EK-FC R9-290X VaporX - Nickel water blocks


Performance-PCs? I got the last few; I'm sure they are restocking after X-mas. FrozenCPU has a few in stock as well.


----------



## xKrNMBoYx

Quote:


> Originally Posted by *Algeron*
> 
> Still debating going g10 or hg10 a1.


I personally just bought a reference R9 290 (yet to come) and depending on how much I can tolerate the noise/temps of the stock cooler I am planning on getting the HG10. I have space constraints due to a mandatory/necessary sound card, meaning I need the cooler to be dual-slot. The G10 is cheaper but I've heard it makes the GPU take over 3 slots. Additionally I'm using a lot of Corsair products so it will look good/match the other components/accessories. I like the idea of reusing the blower fan of the original cooler and how the HG10 kind of looks like an actual GPU (shroud) compared to a barebone look of the G10.


----------



## sjgusmc21

Quote:


> Originally Posted by *xKrNMBoYx*
> 
> I personally just bought a reference R9 290 (yet to come) and depending on how much I can tolerate the noise/temps of the stock cooler I am planning on getting the HG10. I have space constraints due to a mandatory/necessary sound card, meaning I need the cooler to be dual-slot. The G10 is cheaper but I've heard it makes the GPU take over 3 slots. Additionally I'm using a lot of Corsair products so it will look good/match the other components/accessories. I like the idea of reusing the blower fan of the original cooler and how the HG10 kind of looks like an actual GPU (shroud) compared to a barebone look of the G10.


I went the G10 route because i had a spare Antec 620. However, about 3 weeks after installing it my VRM2 temps skyrocketed up in the 90's. I then purchased the HG10 and haven't looked back. Works great.


----------



## ebduncan

Quote:


> Originally Posted by *xKrNMBoYx*
> 
> I personally just bought a reference R9 290 (yet to come) and depending on how much I can tolerate the noise/temps of the stock cooler I am planning on getting the HG10. I have space constraints due to a mandatory/necessary sound card, meaning I need the cooler to be dual-slot. The G10 is cheaper but I've heard it makes the GPU take over 3 slots. Additionally I'm using a lot of Corsair products so it will look good/match the other components/accessories. I like the idea of reusing the blower fan of the original cooler and how the HG10 kind of looks like an actual GPU (shroud) compared to a barebone look of the G10.


I dunno why people say the reference cooler is loud. Mine wasn't loud unless I manually turned the fan speed up to 60%+.

Honestly I think folks who reviewed them turned the fan speed up to 100% to overclock and couldn't erase that dreadful noise from their memory (it's fricken loud at 100%). If you left it on auto or manually adjusted it to run under 60%, it wasn't loud.


----------



## Forceman

I had a reference card, and I thought it was comically loud in Auto. Like, it was so loud it was like a hyperbole of loud. But maybe that was just because I was used to using a AIO cooled GTX680.


----------



## sjgusmc21

Loud? I commonly referred to mine as a 'leaf blower'. Holy crap was this thing loud, both of them (started with a XFX and hosed it up...now I have the Powercolor one) were freak'n loud beyond belief. I can't believe they put these things on the market with the noise they make. Anyhow, doesn't matter now because it is relatively quiet with all of my mods. Don't get me wrong...my pc is far from quiet....I run Bgears 140 fans so quiet isn't in my knowledge base...but NOT on the same scale as a single reference R9 290.


----------



## Regnitto

Quote:


> Originally Posted by *sjgusmc21*
> 
> Loud? I commonly referred to mine as a 'leaf blower'. Holy crap was this thing loud, both of them (started with a XFX and hosed it up...now I have the Powercolor one) were freak'n loud beyond belief. I can't believe they put these things on the market with the noise they make. Anyhow, doesn't matter now because it is relatively quiet with all of my mods. Don't get me wrong...my pc is far from quiet....I run Bgears 140 fans so quiet isn't in my knowledge base...but NOT on the same scale as a single reference R9 290.


I got a BGears 120 on my 290 VRM. those things are LOUD.


----------



## wermad

Sapphire Tri-X coolers are a treat. Too bad they're just sitting in boxes now that the cards are on water. My rad's SP120 HPs at 12v are hella loud. I currently have them down to 5v and they're whisper quiet.


----------



## ebduncan

loud?

mine didn't even break 30dba......

ya'll are crazy, or have high ambient temps or something.


----------



## wermad

Quote:


> Originally Posted by *ebduncan*
> 
> loud?
> 
> mine didn't even break 30dba......
> 
> ya'll are crazy, or have high ambient temps or something.


12x @ 12v =


----------



## ebduncan

Quote:


> Originally Posted by *wermad*
> 
> 12x @ 12v =


was referring to the reference 290/290x cooler

not SP120's. I have a set of SP120's; I run them at 7 volts with the included low-voltage adapters. At 12 volts they are a little too noisy for me. Definitely louder than my 290 was with the reference cooler.


----------



## farzam

Hi guys. I want to update my R9 290 BIOS to increase performance. Can you give me the best BIOS for this graphics card?
Sapphire r9 290


----------



## sjgusmc21

_I got a BGears 120 on my 290 VRM. those things are LOUD._

Hell yeah they are...but...NOTHING compared to a stock R9 290....even with all four running at max. But man, do they push the air.....kind of like an F4 engine!


----------



## wermad

Reference turbine coolers will always be loud. Hawaii is Fermi all over again (lol).

Even with the 7v adapters, the HPs are loud. I made a custom harness for the psu to power the hubs for my HPs. With the pressure they produce and the massive radiator overkill, they run fine. Can't run more than eight HP PWMs and I didn't want to ruin a controller, so this was my solution to my needs.

The Sapphire Tri-X coolers are a bit quieter than the MSI TFIV (280x).


----------



## Roboyto

Quote:
Originally Posted by *xKrNMBoYx* 

I personally just bought a reference R9 290 (yet to come) and depending on how much I can tolerate the noise/temps of the stock cooler I am planning on getting the HG10. I have space constraints due to a mandatory/necessary sound card, meaning I need the cooler to be dual-slot. The G10 is cheaper but I've heard it makes the GPU take over 3 slots. Additionally I'm using a lot of Corsair products so it will look good/match the other components/accessories. I like the idea of reusing the blower fan of the original cooler and how the HG10 kind of looks like an actual GPU (shroud) compared to a barebone look of the G10.

You likely won't tolerate the noise/temps of the reference cooler; it is quite obnoxious.

With the long wait, I'm surprised Corsair even released that attachment with 390(X) likely around the corner.

Kraken G10 definitely doesn't take up 3 slots...it would never have fit in my 250D if that was the case.

http://www.overclock.net/t/1478544/the-build-formerly-known-as-the-devil-inside-a-gaming-htpc-for-my-wife-4770k-corsair-250d-powercolor-r9-290/20_20#post_22214255



Spoiler: Kraken in 250D















My gripe with the Corsair version is that it forces all the heat from VRM1 directly at the pump/heatsink. VRM1 makes a bunch of heat, and I would guess, depending on the speed the blower is running at, you could get a bit of noise from turbulence caused by all the air being forced into the pump/heatsink. Edit: Just remembered the fan runs according to the core temp, and it is adjustable in OC utilities.

I've been running the Kraken G10, with a few tweaks, for the last 7 months without any issues at all. Temperatures are pretty solid even with a bit of voltage and overclock applied.

Here's a 5-run loop of heaven on extreme settings. 1125/1250 +68mV +50% power



Spoiler: heaven loop







The core temperature isn't the best, maxing at 73C; however, this is running an older Antec 620 AIO with a single, absolutely silent, SilenX Effizio fan. This is my HTPC, so silence was top priority. You can easily achieve lower GPU core temps with a better AIO and better fan(s). This also isn't the greatest-performing 290, so you may be able to get a few more MHz on the core with similar or slightly higher voltage.

Quote:
Originally Posted by *ebduncan* 

I dunno why people say the reference cooler is loud. Mine wasn't loud unless I manually turned up the fan speed to 60%+

Honestly I think folks who reviewed them turned the fan speed up to 100% to overclock and couldn't erase that dreadful noise from their memory (it's fricken loud at 100%). If you left it on auto or manually adjusted it to run under 60%, it wasn't loud.

You must have a high tolerance for noise.

Anything over ~40% was much too loud for my taste, but both my PCs are quieter than the average laptop. I'd rather hear nothing coming from my computer than have a few extra MHz here or there.

Quote:
Originally Posted by *Forceman* 

I had a reference card, and I thought it was *comically loud* in Auto. Like, it was so loud it was like a *hyperbole* of loud. But maybe that was just because I was used to using a AIO cooled GTX680.

Quote:
Originally Posted by *sjgusmc21* 

Loud? I commonly referred to mine as a 'leaf blower'. Holy crap was this thing loud, both of them (started with a XFX and hosed it up...now I have the Powercolor one) were freak'n loud beyond belief. I can't believe they put these things on the market with the noise they make.

Originally Posted by *wermad* 

Reference turbine coolers will always be loud. Hawaii is Fermi all over again (lol).

Yes.

Quote:
Originally Posted by *wermad* 

Sapphire Tri-X coolers are a treat. Too bad they're just sitting in boxes now that the cards are on water. My rad's SP120 HPs at 12v are hella loud. I currently have them down to 5v and they're whisper quiet.


> Originally Posted by *wermad*
> 
> Even with the 7v adapters, the HPs are loud.


I agree, the SP120 HPs are quite loud at 12V. I have mine using the 7V resistors and turned down to silent through ASUS AI Suite.

I was going to use the SP120 HPs in my HTPC, but without a controller they didn't make the cut. Switched to SilenX Effizios, which are inaudible, and only took a minor hit to temperatures even when the CPU is maxed out for ~40 min doing a BD rip.
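Those 7V resistor cables work because a brushless DC fan's speed tracks supply voltage roughly linearly. A quick back-of-envelope sketch; the linear scaling is only an approximation (real fans deviate and most won't spin up below ~4-5V), and the ~2350 RPM figure is the SP120 High Performance rating at 12V:

```python
def approx_rpm(rated_rpm_12v, volts):
    # Rough estimate: DC fan speed scales roughly linearly with voltage.
    # Real fans deviate, and most won't start below ~4-5 V.
    return rated_rpm_12v * volts / 12.0

# SP120 High Performance is rated ~2350 RPM at 12 V
print(round(approx_rpm(2350, 7)))  # ~1371 RPM on the 7 V adapter
print(round(approx_rpm(2350, 5)))  # ~979 RPM at 5 V
```

Which lines up with why 7V is a big noise drop and 5V is near-silent.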

Quote:


> Originally Posted by *farzam*
> 
> Hi guys. I want to update my R9 290 BIOS to increase performance. Can you give me the best BIOS for this card?


You may want to try overclocking the card as is to see where you can get with it. If you have a fair performer, then a BIOS update may help you push it even further.

If you want to increase your performance, the first thing you should do is ditch that reference cooler.


----------



## xKrNMBoYx

I haven't experienced it yet so I can't provide my opinion. My guess is like yours: reviewers are highly likely to overclock their video cards, and to keep them cool they probably raised the fan speed and were horrified at the sound. The thing is, though, most blower fans are absolutely loud at 100%. Not sure about the newer NVIDIA ones. When I bought the GTX 465 I turned the fan up to max speed to test it and boy was it loud. My power supply has only 50A on the 12V rail and my CPU is overclocked, so I don't plan to overclock the 290 right away. That probably means I won't need super high fan RPMs. I've noticed that GPU fans have some sort of max fan speed when on auto settings, probably the max speed before the fan starts getting loud. As long as the fan can cool the 290 at stock speeds I will be fine.

The 4 SP120L get loud when they are at max RPM so I don't want to add a blower fan at 100% on top of that. That would be one loud microatx PC.


----------



## magicase

I'm using TriXX to OC my 290X to 1200/1600, but I have noticed one issue. For some odd reason the clock speeds stay at max with 100% load after gaming and won't go down, making the GPU hot at idle. Does anyone know how to fix this issue?


----------



## Roboyto

Quote:


> Originally Posted by *magicase*
> 
> I'm using TriXX to OC my 290X to 1200/1600, but I have noticed one issue. For some odd reason the clock speeds stay at max with 100% load after gaming and won't go down, making the GPU hot at idle. Does anyone know how to fix this issue?


Clockspeed stays at your max overclock, or GPU load stays at 100%?


----------



## pshootr

Quote:


> Originally Posted by *magicase*
> 
> I'm using TriXX to OC my 290X to 1200/1600, but I have noticed one issue. For some odd reason the clock speeds stay at max with 100% load after gaming and won't go down, making the GPU hot at idle. Does anyone know how to fix this issue?


Set up one preset for stock clocks and one for your overclock. Then go to settings/profiles and select the stock preset for the 2D profile and the OC preset for the 3D profile. Save, close, and restart so it will remember your settings.


----------



## magicase

Quote:


> Originally Posted by *Roboyto*
> 
> Clockspeed stays at your max overclock, or GPU load stays at 100%?


Both clock speeds and load are on max.


----------



## Roboyto

Quote:


> Originally Posted by *magicase*
> 
> Both clock speeds and load are on max.


Well, be certain the game has actually closed in task manager; this is silly I know, but it doesn't hurt to check.

Not downclocking to 300/150 is one issue, but still showing a load of 100% is another entirely.

Have you altered the BIOS? I believe some of the more extreme BIOSes force voltages and clocks.

As @pshootr has suggested, you should save a profile for the stock settings of the card in TriXX, and have your desired overclock(s) on another. After you're done gaming, either load the stock profile or hit reset/apply so things go back to stock. Try this and see what happens.

I use Trixx as well and am not a fan of having it start with Windows, and definitely don't allow it to apply overclocks/voltage on start up. It's only a few mouse clicks to get things going so I just turn it on when I'm going to be gaming.

If this doesn't change anything, try gaming without Trixx, and see what happens. Best to know if it is hardware/software related.

If it doesn't happen without Trixx running, then try Afterburner and see what happens.

What version of Windows are you running?

What version of drivers are you running? Did you recently upgrade to this card? Did you properly remove the old drivers, with DDU, even if they were from an AMD card?

http://www.guru3d.com/files-get/display-driver-uninstaller-download,16.html

What version of Trixx are you running?

What game(s) does this happen with?

The more detailed information you can give, the quicker we can get to the bottom of your issue.


----------



## pshootr

Quote:


> Originally Posted by *pshootr*
> 
> Set up one preset for stock clocks and one for your overclock. Then go to settings/profiles and select the stock preset for the 2D profile and the OC preset for the 3D profile. Save, close, and restart so it will remember your settings.


Edit: I thought I read you were using AB. You can only do what I stated in AB, not TRIXX. I am very sorry, my mistake.


----------



## mistax

Anyone with a 290X having issues with World of Warcraft or Heroes of the Storm @ 2560x1440? I'm currently sitting on a 4790K that has yet to be opened, and I was wondering if anyone with a 290X had issues with either of those games. I can't seem to keep 60 fps at all, and I think it's because I'm CPU bound, but people are telling me it's a GPU issue and that I need to crossfire?


----------



## ebhsimon

Quote:


> Originally Posted by *mistax*
> 
> Anyone with a 290X having issues with World of Warcraft or Heroes of the Storm @ 2560x1440? I'm currently sitting on a 4790K that has yet to be opened, and I was wondering if anyone with a 290X had issues with either of those games. I can't seem to keep 60 fps at all, and I think it's because I'm CPU bound, but people are telling me it's a GPU issue and that I need to crossfire?


When you're playing, what is your CPU and GPU usage? If your GPU is at 100% and your CPU is at, say, 70% (an arbitrary number; the point is the usage isn't maxed out), then it is your GPU that is not strong enough. If your CPU is at 100% and your GPU is below 100% usage, then it is your CPU that is not strong enough. That's the general rule of thumb, anyway.
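That rule of thumb boils down to a tiny decision function. A sketch; the 95% threshold is my own arbitrary cutoff, not a hard rule:

```python
def bottleneck(cpu_pct, gpu_pct, threshold=95):
    # Whichever component is pinned near 100% while the other
    # has headroom is the likely limiter.
    cpu_maxed = cpu_pct >= threshold
    gpu_maxed = gpu_pct >= threshold
    if gpu_maxed and not cpu_maxed:
        return "GPU-bound"
    if cpu_maxed and not gpu_maxed:
        return "CPU-bound"
    if cpu_maxed and gpu_maxed:
        return "both maxed"
    return "neither maxed (frame cap, engine limit, or throttling?)"

print(bottleneck(30, 100))   # GPU-bound
print(bottleneck(100, 60))   # CPU-bound
```

The "neither maxed" case is the interesting one: if both have headroom and fps is still low, look at frame limiters, single-thread limits, or throttling instead.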


----------



## mistax

Quote:


> Originally Posted by *ebhsimon*
> 
> When you're playing, what is your CPU and GPU usage? If your GPU is at 100% and your CPU is at, say, 70% (an arbitrary number; the point is the usage isn't maxed out), then it is your GPU that is not strong enough. If your CPU is at 100% and your GPU is below 100% usage, then it is your CPU that is not strong enough. That's the general rule of thumb, anyway.


GPU always sitting @ 100%; CPU never goes above 30%. I guess I need crossfire/trifire to play @ 2560x1440?


----------



## Roboyto

Quote:


> Originally Posted by *mistax*
> 
> Anyone with a 290X having issues with World of Warcraft or Heroes of the Storm @ 2560x1440? I'm currently sitting on a 4790K that has yet to be opened, and I was wondering if anyone with a 290X had issues with either of those games. I can't seem to keep 60 fps at all, and I think it's because I'm CPU bound, but people are telling me it's a GPU issue and that I need to crossfire?


I just got back into WoW the last couple days. My 4770K @ 4.5GHz and a single 290 at stock settings run every setting maxed at 5760x1080; frames don't dip under 50.

I have the RTSS/HWiNFO feed on my screen, and WoW does utilize 8 threads, more heavily than I had anticipated.

With all settings maxed at 5760*1080 WoW is pushing the 290 to 100% full time, at stock clocks anyhow. Haven't tried my 1200/1500 gaming clocks yet.

If your CPU information is correct with an i7 920, I wouldn't think you'd be having issues with your CPU as a bottleneck...but I doubt your 290X is holding you back at that resolution.


----------



## ebhsimon

Quote:


> Originally Posted by *Roboyto*
> 
> I just got back into WoW the last couple days. My 4770k @ 4.5GHz and single 290 at stock settings, is running every setting maxed at 5760*1080; Frames don't dip under 50.
> 
> I have the RTSS/HWinfo feed on my screen and WoW does utilize 8 threads, and heavier than I had anticipated.
> 
> With all settings maxed at 5760*1080 WoW is pushing the 290 to 100% full time, at stock clocks anyhow. Haven't tried my 1200/1500 gaming clocks yet.
> 
> If your CPU information is correct with an i7 920, I wouldn't think you'd be having issues with your CPU as a bottleneck...but I doubt your 290X is holding you back _at that resolution_.


Quote:


> Originally Posted by *mistax*
> 
> gpu always sitting @ 100%. cpu never goes above 30%. I guess i need crossfire/trifire to play @ 2560x1440?


Hm. @mistax, can you observe your GPU clock speeds? It could be throttling, seeing as @Roboyto's 290 can run it at 5760x1080, which is nearly double the pixel count of 2560x1440.
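One way to eyeball a sensor log for throttling is to count how often the core falls below its target clock under load. A sketch; the sample values, 5% tolerance, and 10% dip-share cutoff are all made-up numbers to tune to taste:

```python
def is_throttling(clock_samples_mhz, target_mhz, tolerance=0.05, dip_share=0.10):
    # Flag throttling if more than dip_share of the samples fall
    # below (1 - tolerance) * target while the card is under load.
    floor = target_mhz * (1 - tolerance)
    dips = sum(1 for c in clock_samples_mhz if c < floor)
    return dips / len(clock_samples_mhz) > dip_share

# Hypothetical samples from a GPU-Z/HWiNFO log under full load:
steady = [1100] * 20
dipping = [1100] * 17 + [840, 860, 900]
print(is_throttling(steady, 1100))   # False
print(is_throttling(dipping, 1100))  # True
```

If the core holds its clock the whole time and fps is still low, throttling isn't the culprit.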


----------



## mistax

Quote:


> Originally Posted by *ebhsimon*
> 
> Hm. @mistax, can you observe your GPU clock speeds? It could be throttling, seeing as @Roboyto's 290 can run it at 5760x1080, which is nearly double the pixel count of 2560x1440.


My clock speed sits @ 1100 and GPU usage is 100%.


----------



## chiknnwatrmln

Having a big problem with my PC. The past few days I've been getting random hard crashes while playing Fallout, and only Fallout. Other games played just fine, so I figured it was a mod conflict or something. But today I got a hard crash and restarted my PC.

I got a BSoD on boot, immediately after the Windows startup screen. I booted into safemode and did a clean uninstall and reinstall of both 14.9 and Omega drivers but the same thing happens. You might remember me mentioning a different bug with Omega drivers a few months ago, so I figured that I would just reinstall Windows and see if that would help.

Fast forward an hour and I got a new installation of Windows 7 and all my essential drivers. Then I installed Omega drivers and got a BSoD when rebooting. Clean uninstall and 14.9, same problem.

The BSoD says atikmdag and some other stuff, but it's too quick to read. I don't think it's a hardware issue because both my cards were working fine in all games and benchmarks besides Fallout. The PC also runs fine without AMD drivers installed. Any idea what to troubleshoot next? And how can I test the video card(s) if I can't even boot into Windows? I will try booting with just one GPU at a time, but I don't think it will work.


----------



## Gobigorgohome

So impressed with my quadfire, even though the GPUs are stock. It does 4K very well.


----------



## DividebyZERO

Quote:


> Originally Posted by *Gobigorgohome*
> 
> So impressed with my quadfire, even though the GPUs are stock. It does 4K very well.


What driver version are you using? I am on 14.12 and noticed a reduction in power and performance in quadfire. That said, it's also never run better with vsync and smoothness; even when I overclock, my performance/power usage is way lower than before. Just curious on your thoughts.

I've also noticed my clock speeds vary during usage unless pegged at 100%.


----------



## Roboyto

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Having a big problem with my PC. So the past few days I've been getting random hard crashes while playing Fallout and only Fallout. Other games played just fine so i figured it was a mod conflict or something. But today I got a hard crash and restarted my PC.
> 
> I got a BSoD on boot, immediately after the Windows startup screen. I booted into safemode and did a clean uninstall and reinstall of both 14.9 and Omega drivers but the same thing happens. You might remember me mentioning a different bug with Omega drivers a few months ago, so I figured that I would just reinstall Windows and see if that would help.
> 
> Fast forward an hour and I got a new installation of Windows 7 and all my essential drivers. Then I installed Omega drivers and got a BSoD when rebooting. Clean uninstall and 14.9, same problem.
> 
> The BSoD says atikmdag and some other stuff, but it's too quick to read. I don't think it's a hardware issue because both my cards were working fine in all games and benchmarks besides Fallout. The PC also runs fine without AMD drivers installed. Any idea what to troubleshoot next? And how can I test the video card(s) if I can't even boot into Windows? I will try booting with just one GPU at a time, but I don't think it will work.


Disconnect power to one or the other GPU and see what happens.

Also, check your RAM! I have had that same BSOD error on two separate occasions and RAM was the culprit.

Don't rely on synthetic tests either, they have let me down several times. Swap with other known good RAM is easiest, if you can't do that then check each DIMM in each slot and see where you end up.


----------



## Gobigorgohome

Quote:


> Originally Posted by *DividebyZERO*
> 
> What driver version are you using? I am on 14.12 and noticed a reduction in power and performance in quadfire. That said, it's also never run better with vsync and smoothness; even when I overclock, my performance/power usage is way lower than before. Just curious on your thoughts.
> 
> I've also noticed my clock speeds vary during usage unless pegged at 100%.


Yes, I am on 14.12. I have not noticed any reduction in power or performance, but I do not monitor either of the two anyway.


----------



## alancsalt

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Having a big problem with my PC. So the past few days I've been getting random hard crashes while playing Fallout and only Fallout. Other games played just fine so i figured it was a mod conflict or something. But today I got a hard crash and restarted my PC.
> 
> I got a BSoD on boot, immediately after the Windows startup screen. I booted into safemode and did a clean uninstall and reinstall of both 14.9 and Omega drivers but the same thing happens. You might remember me mentioning a different bug with Omega drivers a few months ago, so I figured that I would just reinstall Windows and see if that would help.
> 
> Fast forward an hour and I got a new installation of Windows 7 and all my essential drivers. Then I installed Omega drivers and got a BSoD when rebooting. Clean uninstall and 14.9, same problem.
> 
> The BSoD says atikmdag and some other stuff, but it's too quick to read. I don't think it's a hardware issue because both my cards were working fine in all games and benchmarks besides Fallout. The PC also runs fine without AMD drivers installed. Any idea what to troubleshoot next? And how can I test the video card(s) if I can't even boot into Windows? I will try booting with just one GPU at a time, but I don't think it will work.


There's a setting in Windows to not instantly reboot on a crash, which usually gives you time to read it; or a program like "WhoCrashed" will read your dumps and give you the BSOD number to check against the overclocking BSOD list.

There's probably a newer one, but:
BSOD codes for overclocking
0x101 = increase vcore
0x124 = increase/decrease vcore or QPI/VTT...have to test to see which one it is
0x0A = unstable RAM/IMC, increase QPI first, if that doesn't work increase vcore
0x1E = increase vcore
0x3B = increase vcore
0x3D = increase vcore
0xD1 = QPI/VTT, increase/decrease as necessary
0x9C = QPI/VTT most likely, but increasing vcore has helped in some instances
0x50 = RAM timings/Frequency or uncore multi unstable, increase RAM voltage or adjust QPI/VTT, or lower uncore if you're higher than 2x
0x109 = Not enough or too Much memory voltage
0x116 = Low IOH (NB) voltage, GPU issue (most common when running multi-GPU/overclocking GPU)
0x7E = Corrupted OS file, possibly from overclocking. Run sfc /scannow and chkdsk /r
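For anyone scripting their minidump triage, the table above drops straight into a lookup; a sketch, where the hints are the community rules of thumb from the list, not official Microsoft guidance:

```python
# Community overclocking hints keyed by BSOD stop code (from the list above).
BSOD_HINTS = {
    0x101: "increase vcore",
    0x124: "increase/decrease vcore or QPI/VTT; test to see which",
    0x0A:  "unstable RAM/IMC: raise QPI first, then vcore",
    0x1E:  "increase vcore",
    0x3B:  "increase vcore",
    0x3D:  "increase vcore",
    0xD1:  "QPI/VTT, adjust as necessary",
    0x9C:  "QPI/VTT most likely; vcore sometimes helps",
    0x50:  "RAM timings/frequency or uncore: raise RAM voltage or QPI/VTT",
    0x109: "not enough or too much memory voltage",
    0x116: "low IOH (NB) voltage or GPU issue (multi-GPU / GPU overclocking)",
    0x7E:  "possibly corrupted OS file: run sfc /scannow and chkdsk /r",
}

def hint(stop_code):
    # Look up a stop code; anything not in the table gets a fallback.
    return BSOD_HINTS.get(stop_code, "not in the overclocking list")

print(hint(0x116))  # the code chiknnwatrmln's atikmdag crash would likely map to
```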


----------



## JourneymanMike

This is my sig - stock settings

http://www.3dmark.com/3dm/5294041


----------



## Roboyto

Quote:


> Originally Posted by *alancsalt*
> 
> There's a setting in Windows to not instantly reboot on a crash, which usually gives you time to read it; or a program like "WhoCrashed" will read your dumps and give you the BSOD number to check against the overclocking BSOD list.
> 
> There's probably a newer one, but:
> BSOD codes for overclocking
> 0x101 = increase vcore
> 0x124 = increase/decrease vcore or QPI/VTT...have to test to see which one it is
> 0x0A = unstable RAM/IMC, increase QPI first, if that doesn't work increase vcore
> 0x1E = increase vcore
> 0x3B = increase vcore
> 0x3D = increase vcore
> 0xD1 = QPI/VTT, increase/decrease as necessary
> 0x9C = QPI/VTT most likely, but increasing vcore has helped in some instances
> 0x50 = RAM timings/Frequency or uncore multi unstable, increase RAM voltage or adjust QPI/VTT, or lower uncore if you're higher than 2x
> 0x109 = Not enough or too Much memory voltage
> 0x116 = Low IOH (NB) voltage, GPU issue (most common when running multi-GPU/overclocking GPU)
> 0x7E = Corrupted OS file, possibly from overclocking. Run sfc /scannow and chkdsk /r


Very nice information to have







@Arizonian maybe add to the OP?

BlueScreenView is another quick/easy BSOD dumpfile reader: http://www.nirsoft.net/utils/blue_screen_view.html


----------



## chiknnwatrmln

OK, my PC booted up today and I did a clean install of 14.9 drivers and it's working. I tried out the new MSI AB, but I'm having an issue with it: even though I click the setting to sync my two GPUs, I have to manually change the clocks for each one individually. Also, I can't increase the power limit; even though I move the slider up and hit OK, it just resets itself to zero. I have unofficial overclocking mode on.

Also, TY alancsalt, that will be helpful if I have that problem again in the future.


----------



## kizwan

Just re-enable the page file & disable automatic restart on failure in Windows if you want to see the BSOD. The page file is useful if you want to get the minidump, but otherwise disabling auto-restart should be enough.


----------



## Gumbi

Hey, just got my new Vapor x 290







Will post up proof soon so that I may be part of the esteemed club. Just a quick question though: does anybody have a recommendation on whether I should use TriXX or Afterburner as an overclocking tool? I have used both successfully in the past.


----------



## Its L0G4N

Has anyone used the Aquacomputer Kryographics Hawaii for their R9 290? I'll be ordering two of them with backplates in February. I found a website that did water block comparisons, and it showed the Aquacomputer had the lowest temps.

Want to hear opinions/alternatives for this block


----------



## wermad

It's very popular, primarily for the looks (imho), and usually they're all pretty close, though any of them is a huge amount cooler vs. stock, especially the turbine. I've heard mixed results on the backplate; I actually asked a few users in the water thread. I ended up w/ Koolance since I got a block on eBay pretty inexpensively. I found another member selling more of these blocks, so I was extremely lucky they were available pre-owned.

Check out Stren's review:

http://www.xtremerigs.net/2014/08/11/r9-290x-gpu-waterblock-roundup/

water cooling thread:

http://www.overclock.net/t/584302/ocn-water-cooling-club-and-picture-gallery


----------



## Roboyto

Quote:


> Originally Posted by *Its L0G4N*
> 
> Has anyone used the Aquacomputer Kryographics Hawaii for their R9 290? I'll be ordering two of them with backplates in February. I found a website that did water block comparisons, and it showed the Aquacomputer had the lowest temps.
> 
> Want to hear opinions/alternatives for this block


Core temperature performance between all blocks is going to be very similar. I am using the XSPC Razor and have wonderful results. I haven't seen anyone particularly complain about any brand of block in this thread, aside from the VRM1 cooling; which there is more information about below.

The Aquacomputer block/backplate pulls ahead with its out-of-the-box VRM1 cooling capabilities.

Other blocks/backplates can match/exceed the VRM performance with a thermal pad upgrade to Fujipoly Ultra Extreme. A link to my thread for Fujipoly performance gains is listed below.

290 Need to Know: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21880_20#post_22208781

This is a little blurb from my '290 Need to Know' post regarding the block:

*Which waterblock/backplate is best?*


Pretty much all waterblocks will cool the GPU core the same
The core isn't the *problem child* of the 290(X) though, it's *VRM1 *



*Best overall waterblock/backplate performance when using exactly what is given to you in the box goes to Aquacomputer.*
Whether you choose the passive/active backplate, you will get the best VRM1 temps here. 




The card looks a lot better with the active attachment, but it is not necessary, as you get 90% of the performance without it.




*Other manufacturers have slacked in the VRM1 area, especially when you start pushing these cards...but there is an easy fix!*
Replacing the thermal pads with Fujipoly Ultra Extreme will slash VRM temps.
*See my thread here:* http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures
Make sure you double check pad thickness for your specific block in my ^ thread
the 15x100mm strip of Fujipoly Ultra Extreme is easily enough to do 2 cards; 3 if you cut thin enough.


----------



## bond32

Just a little heads up for those of you with the Koolance blocks: they require pads that are 0.7mm thick, which is what's included. I have now tried both the 1.0mm and the 0.5mm thick pads on VRM1. The 1.0mm pads perform vastly better; however, they cannot be reused after application. The 0.5mm pads still perform much better than the stock 0.7mm pads, though not as well as I hoped.

Had I known Koolance used these non-standard-size pads, I would have passed on it long ago.


----------



## Widde

Could it be the Omega drivers making my PC freeze and/or BSOD while folding on both cards? GPUs are at stock speed and temps are not over 75C. Or is my PSU going bad? Gaming, benchmarks, etc. are fine even overclocked.


----------



## wermad

Quote:


> Originally Posted by *bond32*
> 
> Just a little heads up for those of you with the Koolance blocks: they require pads that are 0.7mm thick, which is what's included. I have now tried both the 1.0mm and the 0.5mm thick pads on VRM1. The 1.0mm pads perform vastly better; however, they cannot be reused after application. The 0.5mm pads still perform much better than the stock 0.7mm pads, though not as well as I hoped.
> 
> Had I known Koolance used these non-standard-size pads, I would have passed on it long ago.


Pick up a small sheet from ppcs.com. I was lucky the first block came w/ the 0.7 pad unused; more than likely, the first owner used 1.0mm on the VRM. Since my second and third blocks didn't have pads (pre-owned as well), I picked up a small sheet from ppcs.com. The first sheet was actually enough for all three (the first block did have the VRAM pad on there properly). It is kind of annoying since I have large sheets of spare pad, but not in this size (0.5, 1.0, 1.5mm). All three are the revised "short" block versions btw (v1.1).


----------



## StrongForce

I started overclocking my R9 290X Windforce today; here are my results so far. I started getting artifacts very quickly, so I had to up the voltage by a lot to get higher:

1155 mhz core

6080 mhz memory

max voltage 1.250

It ran fine for 7 min in Furmark, 18 fps average at 1900x1200 with 8x MSAA. I know I should run it longer and I will, but then I wanted to run 3DMark Fire Strike... and boom, crash. It won't even start; I get a weird app error







so I guess it's not 100% stable, unless..?

will do more testing.

Also running Omega drivers.

Oh, also in Furmark the voltage shows 1.133; is there vdroop too? I guess so, because increasing the voltage definitely improved stability.


----------



## Raephen

Quote:


> Originally Posted by *Its L0G4N*
> 
> Has anyone used the Aquacomputer Kryographics Hawaii for their R9 290? I'll be ordering two of the them with backplates in February. I found a website that did water block comparisons and it was showing the Aquacomputer had the lowest temps.
> 
> Want to hear opinions/alternatives for this block


My 290 is cooled by a Kryographics block, without backplate, and it performs just fine









I posted some images in this thread with results, please look them up if you are interested.


----------



## EagleOne

wrong area


----------



## DividebyZERO

Anyone else notice power consumption is down on the 14.12 Omega drivers? My quadfire at stock barely breaks 1kW in most games @ 4K or triple 4K. I am pretty sure at stock on the old drivers I was pulling 1.2kW. I am using the PT1 BIOS on all cards.


----------



## Roboyto

Quote:


> Originally Posted by *StrongForce*
> 
> I started overclocking my R9 290X Windforce today; here are my results so far. I started getting artifacts very quickly, so I had to up the voltage by a lot to get higher:
> 
> 1155 mhz core
> 
> 6080 mhz memory
> 
> max voltage 1.250
> 
> It ran fine for 7 min in Furmark, 18 fps average at 1900x1200 with 8x MSAA. I know I should run it longer and I will, but then I wanted to run 3DMark Fire Strike... and boom, crash. It won't even start; I get a weird app error
> 
> 
> 
> 
> 
> 
> 
> so I guess it's not 100% stable, unless..?
> 
> will do more testing.
> 
> Also running Omega drivers.
> 
> Oh, also in Furmark the voltage shows 1.133; is there vdroop too? I guess so, because increasing the voltage definitely improved stability.


Try GPU-Z or HWINFO to view stats.

1.25V really isn't very high at all for Vcore on these cards. The Powercolor reference 290 'OC' Edition in my HTPC maxes at 1.227V with all stock settings.


Odds are you are having crashes because your voltage is too low for that core clock speed. However, pushing the RAM too far can just as easily cause crashes as well.
I would suggest returning your RAM clocks to stock to find your GPU core overclock, and then work on the RAM afterwards.
The RAM speed gives you only a fraction of a performance gain compared to core.
If you notice in your benching that your scores don't improve, or decrease, when you're adding more voltage/clocks then you will likely need to increase the power limit so the card has enough juice to hold max clocks.
How much offset were you applying and what OC utility are you running? You should be able to pull off around +100mV in Trixx with the Gigabyte cooler, but you must keep an eye on VRM1 temperature. They will be your limiting factor well before the core temperature. You don't want to be exceeding 90C for VRM1.

Typically you aren't at a 'dangerous' voltage until you're in the 1.4V range, where you must rely on at least full cover water cooling to keep the VRM1 temperatures within reason. Once you start adding some more voltage and power limit you will see how quickly temperatures get out of hand with an air cooler on these cards.

Also Furmark isn't a very good test as it creates unnecessary/unrealistic draw/load on the card. You are much better off with benches from 3DMark and Unigine, than Furmark/Kombustor and things of that nature. And lastly, you must also keep in mind these are benchmarks and are typically a good indicator of stable settings, but they don't guarantee them. Always be prepared for the possibility of a little voltage/clock tweaking when playing your favorite games.

Other pertinent 290 info can be found in my sig within the 'Need to Know' post.
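The core-first approach above (stock memory, walk the core up, re-test each step) boils down to a simple stepping loop. A sketch only; the `stable` callback stands in for a real benchmark run like Heaven or 3DMark, and the clocks shown are hypothetical:

```python
def find_max_core(start_mhz, stable, step=10, limit_mhz=1300):
    # Walk the core clock up in small steps while the stability
    # check keeps passing; return the last stable clock.
    clock = start_mhz
    while clock + step <= limit_mhz and stable(clock + step):
        clock += step
    return clock

# Hypothetical card that artifacts above 1155 MHz:
print(find_max_core(1000, stable=lambda mhz: mhz <= 1155))  # 1150
```

Once the core tops out, repeat the same loop for memory, then re-verify the combination, since core and memory overclocks can interact.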


----------



## wermad

Quote:


> Originally Posted by *DividebyZERO*
> 
> Anyone else notice power consumption is down on the 14.12 Omega drivers? My quadfire at stock barely breaks 1kW in most games @ 4K or triple 4K. I am pretty sure at stock on the old drivers I was pulling 1.2kW. I am using the PT1 BIOS on all cards.


Run 3DMark 11 and/or Fire Strike to compare scores and power usage. I'm curious about my power usage too, since I'm still riding 14.9.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *EagleOne*
> 
> Having a real hard time seeing the: Bios / Ln2 switch 290X msi Lightning
> 
> can someone Help


It's a little black switch towards the middle of the card.


----------



## StrongForce

Quote:


> Originally Posted by *Roboyto*
> 
> Try GPU-Z or HWINFO to view stats.
> 
> 1.25V really isn't very high at all for Vcore on these cards. The 290 Powercolor reference 'OC' Edition in my HTPC maxes at 1.227 with all stock settings.
> 
> 
> Odds are you are having crashes because your voltage is too low for that core clock speed. However, pushing the RAM too far can just as easily cause crashes as well.
> I would suggest returning your RAM clocks to stock to find your GPU core overclock, and then work on the RAM afterwards.
> The RAM speed gives you only a fraction of a performance gain compared to core.
> If you notice in your benching that your scores don't improve, or decrease, when you're adding more voltage/clocks then you will likely need to increase the power limit so the card has enough juice to hold max clocks.
> How much offset were you applying and what OC utility are you running? You should be able to pull off around +100mV in Trixx with the Gigabyte cooler, but you must keep an eye on VRM1 temperature. They will be your limiting factor well before the core temperature. You don't want to be exceeding 90C for VRM1.
> 
> Typically you aren't at a 'dangerous' voltage until you're in the 1.4V range, where you must rely on at least full cover water cooling to keep the VRM1 temperatures within reason. Once you start adding some more voltage and power limit you will see how quickly temperatures get out of hand with an air cooler on these cards.
> 
> Also Furmark isn't a very good test as it creates unnecessary/unrealistic draw/load on the card. You are much better off with benches from 3DMark and Unigine, than Furmark/Kombustor and things of that nature. And lastly, you must also keep in mind these are benchmarks and are typically a good indicator of stable settings, but they don't guarantee them. Always be prepared for the possibility of a little voltage/clock tweaking when playing your favorite games.
> 
> Other pertinent 290 info can be found in my sig within the 'Need to Know' post.


Yea, but usually I run Furmark; if it's stable for 45 min on Furmark that's usually a good indication, lol.

I use OC Guru II from Gigabyte, and 1.25 is the max voltage. I don't wanna use volt mods or anything, as it would void the warranty.

VRM 1 was good; I believe VRM 2 was 93 and I assumed that was fine also.

OK, I had to reboot because my drivers weren't stable. My 3DMark still isn't stable and crashed after a few seconds.







will investigate


----------



## Roboyto

Quote:


> Originally Posted by *StrongForce*
> 
> yea but usually I run furmark, if it's stable 45mn on furmark.. usually a good indication lol.
> 
> I use OC guru 2 from Gigabytes, and the 1.25 is the max voltage, I don't wanna use volt mods or anything as it would void warranty.
> 
> VRM 1 was good I believe VRM 2 was 93 and I assumed it was good also.
> 
> Unexpected error running tests.
> Workload Single init returned error message: DXGI call IDXGISwapChain::SetFullscreenState failed [-2005270494]:
> 
> The requested functionality is not supported by the device or the driver.
> 
> DXGI_ERROR_NOT_CURRENTLY_AVAILABLE
> 
> that's the error I get on 3dmark, weird, it ran fine the other day, so I really think it's the clocks, even with memory default clock ! mmh will investigate more, even try with stock core clocks again.
> 
> EDIT: oh actually stock clocks still crash.. lemme see. maybe verify cache if i can


But Furmark isn't a good test for stability, especially since it doesn't load modern cards as much as it once did, due to AMD/Nvidia adding safety features so you don't fry your card. There are so many other graphical scenarios than a swaying furry donut, ya know? If you must use Furmark, then go with the preset 15 min bench.

If VRM2 was at 93C, then you would have a problem. I would double-check that, because it is hard, if not impossible, for VRM2 (memory) to get to that temperature; you were probably looking at VRM1. If VRM1 was at 93C, then I would suggest not pushing the card any further, for the sake of its longevity.

I highly doubt that using a program like Trixx/Afterburner to add voltage is going to void your warranty, but I don't know for certain, as Gigabyte doesn't have a warranty statement on their website like most other manufacturers. If you refuse to use another utility besides Gigabyte's, then don't expect high overclocks from the card with such a low voltage cap.

If you crash at stock, then there is likely another issue at hand. If you have updated drivers recently, make sure you did a clean install using Display Driver Uninstaller. I did a thorough comparison of Catalyst 14.9 to Omega and didn't run into any crashing issues, even with fairly high clocks at 1280/1675 pushing slightly over 1.4V.
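The ~90C VRM1 guideline repeated in this thread is easy to encode if you script your monitoring. A minimal sketch (the 90C figure is this thread's rule of thumb, not an official AMD spec, and the helper name is made up for illustration):

```python
def vrm1_headroom_c(vrm1_temp_c, limit_c=90):
    """Degrees of headroom left before the ~90C VRM1 guideline from this thread.

    Negative means you're past the guideline and should back off the overclock.
    """
    return limit_c - vrm1_temp_c

# The 93C reading discussed above would be 3C past the guideline:
print(vrm1_headroom_c(93))  # -3
```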


----------



## StrongForce

Ah yea, definitely there is some problem; even at 1115 core I get flicker/artifacts and crashes in 3DMark. Will download Trixx. You're saying the voltage cap is higher there?? I thought it was card-limited, not software, lol.

Also, I tried other overclocking software before; Trixx was one of them, and I had big trouble with one of them.. so I figured I would try OC Guru II.

Will download Trixx..


----------



## Scorpion49

Anyone else noticing massive tearing in any and all videos with the omega drivers? I know I didn't have this problem before I installed them and now it almost makes me nauseous to watch netflix because the tearing is so awful. Is there any way to fix it?


----------



## Arizonian

Quote:


> Originally Posted by *Roboyto*
> 
> Very nice information to have
> 
> 
> 
> 
> 
> 
> 
> @Arizonian
> maybe add to the OP?
> 
> BlueScreenView is another quick/easy BSOD dumpfile reader: http://www.nirsoft.net/utils/blue_screen_view.html


Quote:


> Originally Posted by *Algeron*
> 
> http://www.techpowerup.com/gpuz/details.php?id=fu3bk
> 
> Msi 290x 4gd5, Reference cooler version.
> 
> For now anyways.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Still debating going g10 or hg10 a1.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added








Quote:


> Originally Posted by *Roboyto*
> 
> Very nice information to have
> 
> 
> 
> 
> 
> 
> 
> @Arizonian
> maybe add to the OP?
> 
> BlueScreenView is another quick/easy BSOD dumpfile reader: http://www.nirsoft.net/utils/blue_screen_view.html


Quote:


> Originally Posted by *chiknnwatrmln*
> 
> OK, my PC booted up today and I did a clean install of 14.9 drivers and it's working. I tried out the new MSI AB but I'm having an issue with it. Event though I click the setting to sync my two GPUs I have to manually change the clocks for each one individually. Also I can't increase the power limit. Even though I move the slider up and hit OK, it just resets itself to zero. I have unofficial overclocking mode on.
> 
> Also TY Arizonian, that will be helpful if I have that problem again in the future.


I don't see why not, good info. Done.


----------



## magicase

Just bought a 295x2 and I have found 1 issue with it already. It's not the card but the driver side.

Mantle on BF4 with Omega driver doesn't activate the 2nd GPU. DX11 works fine but Mantle won't kick in the 2nd gpu.


----------



## StrongForce

Quote:


> Originally Posted by *Scorpion49*
> 
> Anyone else noticing massive tearing in any and all videos with the omega drivers? I know I didn't have this problem before I installed them and now it almost makes me nauseous to watch netflix because the tearing is so awful. Is there any way to fix it?


I had that problem before, I think. It might be because you need to have the Aero theme activated.. not sure if that fixed the tearing on Windows or also in videos, but worth a try.







So I tried Trixx, and that thing is bad: when I applied settings my card started to go under load?! So I cancelled and closed it.

Now I ran with Afterburner, +100mV and +39% power limit.

1181 seems to be my limit on core (which is a lot better!).

How do you get 1.4V, though??

And I'm at 1500 on memory; I think I might keep it that way, at least for now..

http://www.3dmark.com/fs/3646281 (for some reason it doesn't work in the browser I'm using, but in another it does, uh..)

9535 score.

8600 at stock GPU, so nearly a 1k boost, yay. I'm sure with some tweaks I can take it further, but for now it's perfect. I'll monitor VRM temps and stuff and hope it's stable..
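For what it's worth, that jump works out to just under 11%. A quick sketch of the arithmetic using the two scores from this post:

```python
stock_score = 8600  # Fire Strike score at stock clocks (from the post above)
oc_score = 9535     # score at 1181 MHz core / 1500 MHz memory

gain = oc_score - stock_score
percent = 100 * gain / stock_score
print(f"+{gain} points ({percent:.1f}%)")  # +935 points (10.9%)
```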


----------



## pshootr

Quote:


> Originally Posted by *StrongForce*
> 
> I had that problem before, I think, it might be cause you need to have the Aero theme activated.. not sure if that fixed the tearing on windows or also in videos but worth a try
> 
> 
> 
> 
> 
> 
> 
> 
> So I tryed Trixx, and that thing is bad, when I applyed settings my card started to go under load ?! so I cancelled and closed it.
> 
> Now I ran with afterburner, +100 mV +39% power limit
> 
> 1181 seems to be my limit with Core (which is alot better !)
> 
> How do you get 1.4 v though ??
> 
> and I'm at 1500 with memory and think I might keep it that way, at least for now ..
> 
> http://www.3dmark.com/fs/3646281 (for some reason doesn't work on that browser I'm using but on another it does uh..)
> 
> 9535 Score
> 
> 8600 at stock GPU so nearly 1k boost, yay, I'm sure with some tweaks I can take it further but for now it's perfect, I'll monitor VRM temps and stuff ! and hope it's stable..


AB will not let you go up to 1.4V by default; I think you need +200mV, or possibly more, for 1.4V. I don't understand why your GPU went under load after applying OC settings in Trixx, that is strange. That kind of sux; I find Trixx much simpler for novice users, hence why I like it, and it also allows up to +200mV by default. There is a workaround for getting AB to go higher than +100mV, but it comes with some drawbacks and is a pain.
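As a rough sketch of how the offset adds up (the stock VID below is an assumed illustrative value; the real VID varies from card to card):

```python
# Hypothetical numbers: offset tools add millivolts on top of the card's stock VID.
stock_vid_v = 1.200   # assumed stock VID for this sketch; real cards vary
offset_mv = 200       # the +200mV offset mentioned above

final_vcore_v = stock_vid_v + offset_mv / 1000
print(f"{final_vcore_v:.3f} V")  # 1.400 V
```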


----------



## StrongForce

Yea, strange. I will try again next time; sounds good if I can push it more. The thing is, I've got an FX CPU, so at some point I'll reach my max CPU temp (because of the GPU heat), lol.

thanks for the tips though !


----------



## pshootr

Quote:


> Originally Posted by *StrongForce*
> 
> yea strange, I will try again next time, sounds good though if I can push it more, well, the thing is I got a FX CPU, so at one point I'll reach my max CPU temp (because of the GPU heat) lol
> 
> thanks for the tips though !


No problem. If you like AB and want to try getting more voltage using it, you can have a look at some posts from here. Good luck. By the way, your Fire Strike score is pretty nice; 9128 is the highest I've got so far, and I think I saw a slight flash of artifacting. I am still tinkering with my card: 1200/1600 at +156mV for the 9128 score. You have to really pay attention in Fire Strike, because sometimes an artifact will be so small and fast you might miss it if you are not paying close attention. I tend to be very critical as to what I call stable, hehe.


----------



## StrongForce

Quote:


> Originally Posted by *pshootr*
> 
> No problem. If you like AB and want to try getting more voltage using it you can have a look at some posts from here. Good luck, by the way your Fire-Strike score is pretty nice. 9128 is the highest I got so far, and I think a seen a slight flash of artifact. I am tinkering still with my card. 1200/1600 +156mv for the 9128 score. You have to really pay attention to Fire-Strike because sometimes an artifact will be so small and fast you might miss it if you are not really paying close attention. I tend to be very critical as to what I call stable, hehe.


Yea, true, and in a real game you'll notice the artifacts more easily anyway. And damn, that's actually awesome.. I mean, I've got an R9 290X and score only 300 points more, huh?

Plus your RAM speed is much slower (my RAM runs a little bit slower than 2133), so you should be happy with the score you got; I wouldn't touch it.

With better RAM speeds you'd probably get more FPS in a real game scenario too, just saying.









----------



## amdzack

Anyone else have coil whine problems with this card??
One of my R9 290s had it.. and it caught fire in my rig....


----------



## Scorpion49

Quote:


> Originally Posted by *StrongForce*
> 
> I had that problem before, I think, it might be cause you need to have the Aero theme activated.. not sure if that fixed the tearing on windows or also in videos but worth a try
> 
> 
> 
> 
> 
> 
> 


I think I figured it out: after I installed the drivers I didn't re-run the Windows performance tool that tunes the video settings. I re-ran that and now it seems to be fine.


----------



## HOMECINEMA-PC

Happy new year 290 peeps from 'stralia


----------



## Arizonian

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Happy new year 290 peeps from 'stralia


Way to go, Australia! Hang tight everyone, and happy new year too!









Here's hoping we don't have to wait long before the 390X with reference liquid cooling comes out.


----------



## Spectre-

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Happy new year 290 peeps from 'stralia


Hopefully the other great countries have their drinks prepared for NYE (just went through a bottle of absinthe with mates).


----------



## thrgk

Anyone know who sells single-slot brackets for the reference 290X? I am trying to fit a sound card in my MSI X99 AC Power and I need one of my cards to have a single-slot bracket.


----------



## hyp36rmax

Quote:


> Originally Posted by *thrgk*
> 
> anyone know who sells single brackets for the 290x reference? I am trying to fit a soundcard in my x99 msi ac power and i need one of my cards to have a single bracket


You would still have an issue with the 2nd DVI on 290X Reference coolers...



Not like the HD 7970 Reference coolers that could be converted to single slot.


----------



## thrgk

So no way to fit 4 GPUs and a sound card? Know of any good USB sound cards?


----------



## hyp36rmax

Quote:


> Originally Posted by *thrgk*
> 
> So no way to fit 4 gpus and a soundcard? know of any good usb sound cards?


Well you can always go with a PCIe Riser ribbon as long as you have enough slots on the computer case.











*Source:* Link


----------



## thrgk

Yea, I've got the slot, though it's just a small PCIe x1 that I need, not a PCIe x2. Where would I put the sound card anyway? It would look very jumbled. I think a USB sound card would be better, but I know nada about audio.


----------



## hyp36rmax

Quote:


> Originally Posted by *thrgk*
> 
> yea I got the slot, tho its just a small pcie x1 that I need, not a pcie x2. Where would I put the sound card tho anyway, it would look very jumbled. I think a usb soundcard would be better, but i know notta about audio


Here's another one, designed for miners, that is much more compact and gives you some room to route the cable for a cleaner look; it will also work for your sound card. Just connect it in between one of your GPUs and the x1 slot.



*Source:* Link


----------



## thrgk

What do you use for a sound card?

Also, with that I really have no room for the sound card's bracket I/O; all of my brackets are full other than the last one.

See, I circled them: the one on the right is where the sound card goes, but the left circle is my empty bracket.


----------



## wermad

I used a long ribbon PCIe x1 cable when I had quad 7970 Lightnings. They're getting hard to find, but I was able to easily reroute my PCIe x1 sound card. It has an angled slot to make it more compact.


----------



## bond32

That board has a really good onboard Realtek... an ALC1150, probably, right? It's as good as, if not better than, most sound cards in my opinion.


----------



## thrgk

No, mine is the MSI X99 AC Power.


----------



## bond32

Hmmm don't see anything yet. Have you tried the onboard?


----------



## thrgk

Quote:


> Originally Posted by *bond32*
> 
> Hmmm don't see anything yet. Have you tried the onboard?


what do you mean?


----------



## wermad

Realtek sucks, tbh. The Core3D on some boards is a lot better, but nowhere near a discrete card.

Here's my old STH10: RIVE X79 + quad Lightnings + SB Core3D Z.


----------



## bond32

Have you tried the onboard audio, as opposed to the sound card?


----------



## wermad

Here's the one I used (and still have, btw):

http://www.ebay.com/itm/PCIe-PCI-express-36pin-1x-1x-FFC-Flexible-Ribbon-PCI-E-x1-extender-riser-cable-/161216122548?pt=LH_DefaultDomain_0&hash=item25893aceb4

This ribbon's thin cable allows you to contort it a bit, and it's long enough to reach the sound card, with the important angled slot:


----------



## tsm106

I don't get the obsession with sound cards, but I guess that's because I use toslink.


----------



## thrgk

Can you still lock it into the bracket though? Or does it kinda just float? It would seem the height of the PCIe slot on the ribbon would not let it fit correctly.


----------



## thrgk

Quote:


> Originally Posted by *wermad*
> 
> Here's the one i used (and still have btw
> 
> 
> 
> 
> 
> 
> 
> ):
> 
> http://www.ebay.com/itm/PCIe-PCI-express-36pin-1x-1x-FFC-Flexible-Ribbon-PCI-E-x1-extender-riser-cable-/161216122548?pt=LH_DefaultDomain_0&hash=item25893aceb4
> 
> This ribbon and thin cable allows you to contort it a bit and its long enough to reach the sound card w/ the important angle slot:


Would this work? http://www.amazon.com/gp/product/B0067WZH2Q/ref=pd_lpo_sbs_dp_ss_1?pf_rd_p=1944687682&pf_rd_s=lpo-top-stripe-1&pf_rd_t=201&pf_rd_i=B00GXHOMFS&pf_rd_m=ATVPDKIKX0DER&pf_rd_r=1H9NRKYZPPRPVJWFAE59


----------



## wermad

Yes, mine was just held by the slot thumbscrew. As long as it's not a heavy card, it's fine. PM'ing in a bit; let me look for the cable.

Edit: I have that one too. The cable is fixed and short, and the slot requires cutting the case to make it fit. The one I linked is better.

Got to dig through my parts bin, but I do have it.


----------



## thrgk

Quote:


> Originally Posted by *wermad*
> 
> Yes, mine was just held by the slot thumbscrew. As long as its not a heavy card, its fine. Pm'ing in a bit. Let me look for the cable.
> 
> Edit: I have that one to. The cable is fixed, short, and the slot requires cutting in the case to make it fit. The one I linked is better.
> 
> Got to dig through my parts bin but I do have it.


If you aren't using them, I'd buy it off you.


----------



## hyp36rmax

Quote:


> Originally Posted by *tsm106*
> 
> I don't get the obsession with sound cards, but I guess that's because I use toslink.


Same here. I have a dedicated Denon receiver via TOSLink optical; looking for some speakers to complement my setup.


----------



## thrgk

Quote:


> Originally Posted by *hyp36rmax*
> 
> Same here I have a dedicated Denon Reciever via Toslink Optical, looking for some speakers to compliment my setup.


What are these? I have a 2.1 setup now with an amp and a 12-inch sub; would this be more worthwhile than a sound card? Where would I get them?


----------



## hyp36rmax

Quote:


> Originally Posted by *thrgk*
> 
> What are these? I have a 2.1 setup now with amp and 12inch sub, would this be more worth while then a sound card? where would i get em


I'm using a Denon AVR-487 home theater receiver connected with an optical cable from my motherboard to process all the sound. I have all 5 speakers and the sub that came with the set connected, and it sounds awesome! It can only get better with a new set of speakers. Another option, if you choose a headset, is to get a headphone amp+DAC along with some higher-end headphones, and it's bliss. I currently own a pair of Beyerdynamic DT990s (250 ohm) connected to the same Denon AVR-487. (Not a fan of all those "gaming" USB surround sound headsets.)


----------



## thrgk

Yea, I was going to get a receiver instead.

Can you list a link to all the parts you have? My plan was just to get 5 speakers, 1 sub, and a Yamaha receiver, and attach it using an optical cable. The receiver is the amp, right?


----------



## hyp36rmax

Quote:


> Originally Posted by *thrgk*
> 
> yea I was going to get a receiver instead.
> 
> Can you list a link to all the parts you have? My plan was just to get 5 speakers, 1 sub, yamaha receiver, and attach it using an optical cable. The receiver is the amp right?


Correct. You're fine with your Yamaha receiver. Sounds like you're on the right track and just need to connect an optical cable from your motherboard to your receiver/amp. I got my set several years ago and it's very dated by today's audio tech; however, all I really need is to upgrade the speakers and I'll be good.


----------



## tsm106

Quote:


> Originally Posted by *hyp36rmax*
> 
> Quote:
> 
> 
> 
> Originally Posted by *thrgk*
> 
> yea I was going to get a receiver instead.
> 
> Can you list a link to all the parts you have? My plan was just to get 5 speakers, 1 sub, yamaha receiver, and attach it using an optical cable. The receiver is the amp right?
> 
> 
> 
> Correct You're fine with your Yamaha Receiver. Sounds like you're on the right track and just need to connect an optical cable from your motherboard to your receiver / amp. I got my set several years ago and is very dated by today's audio tech, however all I really need is to upgrade the speakers and i'll be good.
Click to expand...

I use a Yamaha Neo receiver with a Dayton 3-way soundbar and an Energy compact ESW-M6 sub. I did jump on the Pioneer sub by AJ for 60 bucks, so I might swap to that for more low-end extension, but really it's icing on a fine cake as is. All in all, it's a lot more sound than I actually need, lol.


----------



## thrgk

Still, wermad, let me know if you've got an extra; I'd still buy it off ya.


----------



## kizwan

Quote:


> Originally Posted by *thrgk*
> 
> yea I got the slot, tho its just a small pcie x1 that I need, not a pcie x2. Where would I put the sound card tho anyway, it would look very jumbled. I think a usb soundcard would be better, but i know notta about audio


Quote:


> Originally Posted by *thrgk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bond32*
> 
> Hmmm don't see anything yet. Have you tried the onboard?
> 
> 
> 
> what do you mean?
Click to expand...

Your motherboard does have an onboard sound card. You can use that.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Arizonian*
> 
> Way to go Australlia! Hang tight everyone and happy new too!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Here's hoping we don't have to wait long for the 390X w/reference Liqiud cooling comes out sooner.


Yes, the 390 series...... I will never get rid of my 290s though.








Quote:


> Originally Posted by *Spectre-*
> 
> hopefully the other great countries have there drinks prepared for NYE ( just went through a bottle of absinth with mates)


La de da top shelf mate








Quote:


> Originally Posted by *tsm106*
> 
> I don't get the obsession with sound cards, but I guess that's because I use toslink.


Me three. Sony 5.2ch amp with double 12" subs. I can run my games in DD. Who needs headphones?









Quote:


> Originally Posted by *kizwan*
> 
> Your motherboard does have onboard sound card. You can use that.


Hey there you


----------



## Forceman

Quote:


> Originally Posted by *thrgk*
> 
> yea I was going to get a receiver instead.
> 
> Can you list a link to all the parts you have? My plan was just to get 5 speakers, 1 sub, yamaha receiver, and attach it using an optical cable. The receiver is the amp right?


If you get a receiver get one with HDMI and then just connect it to the video card. That way you get full uncompressed 5.1. If you use coax or optical you'll still need something that can compress the audio (Dolby Digital Live or DTS) to send it over that connection. Not all motherboards can do that. Of course, if you only want 2.1 you should be fine with whatever.


----------



## alancsalt

Well, I bought all the Audioengine gear: the DAC, so I run off USB, the 5+ speakers, and the S8 bass unit...

I assume that means I'm not using the onboard sound?


----------



## Forceman

If you have a USB DAC, then you aren't using the onboard. The DAC is doing your audio for you.


----------



## thrgk

This is the receiver I was eyeing. What speakers do you guys recommend? I don't want to spend $100 for a pair of speakers, lol. I have Dayton ones now, but I think they are crap, 'cause I got 2 Daytons and an amp for $65.


----------



## hyp36rmax

Much better! R9 290X Vapor-X 8GB Crossfire with EK blocks


----------



## rdr09

Quote:


> Originally Posted by *hyp36rmax*
> 
> Much better! Vapor-X 8GB with EK blocks


Nice! i am jelly.


----------



## hyp36rmax

Quote:


> Originally Posted by *rdr09*
> 
> Nice! i am jelly.


Thank you sir! Waiting on my Seasonic 1200 Watt PSU to start on my rebuild this weekend. First build for 2015!! I'm so excited, and I just can't hide it... hahaha


----------



## HOMECINEMA-PC

Hmmm 8gb cards eh ??
Tasty


----------



## Raephen

Quote:


> Originally Posted by *thrgk*
> 
> no mine is the MSI X99 AC POWER


I personally don't see the need for a dedicated soundcard with that motherboard.

It uses a Realtek 1150 based sound solution, and that is a good one.

http://www.guru3d.com/articles_pages/msi_x99s_xpower_ac_motherboard_review,7.html

On another note: best wishes for 2015, OCN!


----------



## Archea47

Quote:


> Originally Posted by *hyp36rmax*
> 
> Much better! R9 290X Vapor-X 8GB Crossfire with EK blocks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Quote:


> Originally Posted by *rdr09*
> 
> Nice! i am jelly.


Also quite jealous! That setup sir is the real deal. I'd have loved to go that route but I couldn't handle the extra $400+ for two cards at 1440p. What resolution are you going to run them at?


----------



## thrgk

Quote:


> Originally Posted by *Forceman*
> 
> If you get a receiver get one with HDMI and then just connect it to the video card. That way you get full uncompressed 5.1. If you use coax or optical you'll still need something that can compress the audio (Dolby Digital Live or DTS) to send it over that connection. Not all motherboards can do that. Of course, if you only want 2.1 you should be fine with whatever.


Quote:


> Originally Posted by *Raephen*
> 
> I personally don't see the need for a dedicated soundcard with that motherboard.
> 
> It uses a Realtek 1150 based sound solution, and that is a good one.
> 
> http://www.guru3d.com/articles_pages/msi_x99s_xpower_ac_motherboard_review,7.html
> 
> On another note: best wishes for 2015, OCN!


So a Xonar U7 would not be better? I thought mine was good already, though I'd still like to use the Creative Z if possible.

Now, a few posts above it said to use HDMI from the receiver, but isn't optical better? And where would I plug the optical into?

Is the receiver like an amp/DAC/sound card all in one, and can it be used with only a 2.1 setup if I was thinking about 5.1 in the future?


----------



## Raephen

Quote:


> Originally Posted by *thrgk*
> 
> So a Xonar u7 would not be better? I thought mine was good already, though id still like to use the creative z if possible.
> 
> No in a few posts above it said to use HDMI from the receiver, but isnt optical better? And I would need somewhere to plug the optical into?
> 
> Is the receiver like an amp/dac/sound card all in one, and can it be used with only a 2.1 setup if I was thinking about 5.1 in the future?


I am not an expert, and neither do I have an expensive audio set hooked up to my computer, but as far as my knowledge goes, a sound card wouldn't make a difference when sending the digital signal via TOSLink. If I remember correctly, it's mostly just the raw digital sound data, to be handled by your audio device.

EDIT: My advice would be just to try the two and see if you can hear the difference...


----------



## Forceman

Quote:


> Originally Posted by *thrgk*
> 
> So a Xonar u7 would not be better? I thought mine was good already, though id still like to use the creative z if possible.
> 
> No in a few posts above it said to use HDMI from the receiver, but isnt optical better? And I would need somewhere to plug the optical into?
> 
> Is the receiver like an amp/dac/sound card all in one, and can it be used with only a 2.1 setup if I was thinking about 5.1 in the future?


HDMI is better than TOSLink, if your receiver supports it. HDMI has significantly more bandwidth available (I don't recall the actual number) than optical/coax, so it can directly support 5.1 audio. Optical/coax can only carry 5.1 audio if it has been compressed, which is no problem for videos (which are already compressed) but means you need something to compress the audio for games. So you either need a motherboard that supports Dolby Digital Live or DTS:Connect (or whatever the new name is), or you need a sound card.

The Creative Z almost certainly supports one of those techs, if not both, so you can just use it. You won't notice any difference between the Z and a U7, or even the on-board, if you are using digital (optical/coax).
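A back-of-the-envelope sketch of why uncompressed 5.1 fits over HDMI but not over consumer S/PDIF/TOSLink (the channel counts, rates, and bit depths below are common nominal figures, not taken from the post):

```python
def lpcm_bps(channels, rate_hz, bits):
    # Raw LPCM payload bandwidth, ignoring framing/parity overhead
    return channels * rate_hz * bits

spdif_payload = lpcm_bps(2, 48_000, 24)   # ~2.3 Mbps: consumer S/PDIF is a 2-channel link
surround_51 = lpcm_bps(6, 48_000, 16)     # ~4.6 Mbps: uncompressed 5.1 LPCM
dolby_digital = 640_000                   # max AC-3 bitrate: compressed 5.1
hdmi_lpcm_max = lpcm_bps(8, 192_000, 24)  # ~36.9 Mbps: HDMI's 8-channel LPCM ceiling

print(surround_51 > spdif_payload)    # True  -> 5.1 LPCM won't fit over TOSLink
print(dolby_digital < spdif_payload)  # True  -> DD-compressed 5.1 does fit
print(surround_51 < hdmi_lpcm_max)    # True  -> HDMI carries it uncompressed
```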


----------



## thrgk

Quote:


> Originally Posted by *Forceman*
> 
> HDMI is better than TOSLink, if your receiver supports it. HDMI has significantly more bandwidth available (I don't recall the actual number) than optical/coax, so it can directly support 5.1 audio. Optical/coax can only carry 5.1 audio if it has been compressed, which is no problem for videos (which are already compressed) but means you need something to compress the audio for games. So you either need a motherboard that supports Dolby Digital Live or DTS:Connect (or whatever the new name is), or you need a sound card.
> 
> The Creative Z almost certainly supports one of those techs, if not both, so you can just use it. You won't notice any difference between the Z and a U7, or even the on-board, if you are using digital (optical/coax).


Do you know if my mobo's onboard supports it? Idk how to check; it's the MSI X99 AC Power.


----------



## thrgk

Right now I am not using digital, optical, or HDMI, just an amp, 2 speakers, and a sub.

If I did upgrade to a U7, would I hear a large difference, since I am not digital?

Also, if I do go with a receiver, what speakers would you recommend? I don't want to spend $200 on 5 speakers, and I already have a $120 12-inch sub.


----------



## Forceman

Quote:


> Originally Posted by *thrgk*
> 
> do you know if my mobo onboard supports it? Idk how to check, its the MSI X99 AC Power.


The Realtek chip on that (the ALC1150) supports it, and the board has an optical out, but it really comes down to the drivers. You'd have to install the drivers from MSI and see if it enables those options in the Realtek control panel. If not, you'd need to use hacked drivers to get it to work.

Edit: Looks like it does not support it out of the box, at least from looking at the manual.


----------



## hyp36rmax

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Hmmm 8gb cards eh ??
> Tasty


Quote:


> Originally Posted by *Archea47*
> 
> Also quite jealous! That setup sir is the real deal. I'd have loved to go that route but I couldn't handle the extra $400+ for two cards at 1440p. What resolution are you going to run them at?


Thanks guys, can't wait to get them in my system this weekend. I'll be using 4K (3840x2160) with an ASUS PB287Q. Surely putting them to work.


----------



## thrgk

Quote:


> Originally Posted by *Forceman*
> 
> The Realtek chip on that (the ALC1150) supports it, and the board has an optical out, but it really comes down to the drivers. You'd have to install the drivers from MSI and see if it enables those options in the Realtek control panel. If not, you'd need to use hacked drivers to get it to work.
> 
> Edit: Looks like it does not support it out of the box, at least from looking at the manual.


OK, so I would need a sound card? Is a U7 good enough, or would the Creative Z be better?

Also, since I am on analog, would you go U7 or Creative Z?


----------



## wermad

Quote:


> Originally Posted by *Raephen*
> 
> I personally don't see the need for a dedicated soundcard with that motherboard.
> 
> It uses a Realtek 1150 based sound solution, and that is a good one.
> 
> http://www.guru3d.com/articles_pages/msi_x99s_xpower_ac_motherboard_review,7.html
> 
> On another note: best wishes for 2015, OCN!


It's not.
I made the mistake of thinking the same thing coming off a board with Core3D. The Core3D is far better than Realtek's. Obviously, my old SB Core3D Z card offered even more, but I had to get creative (no pun intended) to install it since I was running quads. It's a shame more boards don't come with the Core3D.


----------



## thrgk

Quote:


> Originally Posted by *wermad*
> 
> It's not.
> I made the mistake of thinking the same thing coming off a board with Core3D. The Core3D is far better. Obviously, my old SB Core3D Z card offered even more, but I had to get tricky to install it since I was running quads. It's a shame more boards don't come with the Core3D.


What's a Core3D? Is it built into the ASUS Rampage?

BTW, did you have any extra of those risers I could buy? If not, I'll order one from China.


----------



## wermad

Quote:


> Originally Posted by *thrgk*
> 
> What's a Core3D? Is it built into the ASUS Rampage?
> 
> BTW, did you have any extra of those risers I could buy? If not, I'll order one from China.


It's an audio chip built into a few boards. It's far better than the vanilla stuff Realtek puts on most boards. I bought a $500 RIVBE thinking it would beat the Sniper 5's Core3D, and it was a complete letdown. Traded the BE for cash and a RIVE, and I got the Z card.

I do; PM me your info and I'll ship it tomorrow. On the road right now on a family trip.


----------



## thrgk

Quote:


> Originally Posted by *wermad*
> 
> It's an audio chip built into a few boards. It's far better than the vanilla stuff Realtek puts on most boards. I bought a $500 RIVBE thinking it would beat the Sniper 5's Core3D, and it was a complete letdown. Traded the BE for cash and a RIVE, and I got the Z card.
> 
> I do; PM me your info and I'll ship it tomorrow. On the road right now on a family trip.


Awesome, thanks so much; PM sent.

I wonder if I should return my Creative Z for a Core3D.


----------



## wermad

Same chip. The card is still the better option vs. switching motherboards for one with a Core3D (the hassle).


----------



## thrgk

Quote:


> Originally Posted by *wermad*
> 
> Same chip. The card is still the better option vs. switching motherboards for one with a Core3D (the hassle).


Ah, so I'll keep the Creative Z. How about the U7, ever think of using that? The external one.


----------



## thrgk

Wermad, do you have any pics of how you positioned the sound card when you had 4 GPUs? It's very tight already; the only way I can see is sticking in the riser cable and going over the fourth card. No real way to go under?


----------



## Gobigorgohome

Anybody having problems with the 14.12 drivers? I've done a little BF4 online gaming the last few days (also some Far Cry 4) and my experience is worse than before; FPS in BF4 dips all the way down to 35 in-game (which never happened with 14.9). Guess that's reason enough to go back to 14.9. Anyone else?


----------



## edo101

Hey guys, I don't know if many of you have the Gigabyte R9 290 Windforce OC card, but I bought one back in June and it wasn't performing as it should. So I flashed the card with Gigabyte's own BIOS; it kept working but still showed no performance improvement.

Then AMD released the 14.6 CCC drivers and it started performing as it should. All of a sudden it started crashing. If I remember correctly, this card ran hot as well.

I finally made the effort to RMA it this week; I haven't shipped it yet though. I went to Newegg and checked out reviews, and it appears other people have the same issues. In fact, one guy RMA'd his card and Gigabyte "fixed" it by downclocking it. Yes, you heard right.

Anyway, I'm paranoid now. Would you recommend I RMA it? And since I can no longer return it to the seller, should I put it up for sale on eBay after I get it back (it has a transferable warranty) and get the Tri-X, which is on sale today for $259 after rebate?

*I actually just fixed it with a BIOS update, but the question remains:*
Should I sell this card and try to buy the Tri-X? Is it worth it? I'd probably be losing $100 though. I bought the 290 for $310 back then; they seem to be selling for around $250 right now, and I'd probably have to shell out $270 for a new Tri-X because I NEED the warranty.


----------



## Klocek001

What's the logical explanation for this: I get terrible vibration and rattle when I set my fan above 50% in AB, but there's no vibration or rattling at the same speed in CCC.


----------



## kizwan

Quote:


> Originally Posted by *edo101*
> 
> Hey guys, I don't know if many of you have the Gigabyte R9 290 Windforce OC card, but I bought one back in June and it wasn't performing as it should. So I flashed the card with Gigabyte's own BIOS; it kept working but still showed no performance improvement.
> 
> Then AMD released the 14.6 CCC drivers and *it started performing as it should*. All of a sudden *it started crashing*. If I remember correctly, this card ran hot as well.
> 
> I finally made the effort to RMA it this week; I haven't shipped it yet though. I went to Newegg and checked out reviews, and it appears other people have the same issues. In fact, one guy RMA'd his card and Gigabyte "fixed" it by downclocking it. Yes, you heard right.
> 
> Anyway, I'm paranoid now. Would you recommend I RMA it? And since I can no longer return it to the seller, should I put it up for sale on eBay after I get it back (it has a transferable warranty) and get the Tri-X, which is on sale today for $259 after rebate?
> 
> *I actually just fixed it with a BIOS update, but the question remains:*
> Should I sell this card and try to buy the Tri-X? Is it worth it? I'd probably be losing $100 though. I bought the 290 for $310 back then; they seem to be selling for around $250 right now, and I'd probably have to shell out $270 for a new Tri-X because I NEED the warranty.


Well, which one is it??? If it started crashing, then it's *not* performing as it should.


----------



## Archea47

Quote:


> Originally Posted by *edo101*
> 
> I finally made the effort to RMA it this week. I haven't shipped it in yet though.


If you do RMA, don't forget the Swedish Fish trick!


----------



## Klocek001

Have you had any problems since the last BIOS flash? How high are your clocks out of the box? Gigabyte is infamous for overly aggressive factory OCs; they did it with the 780 Ti "GHz" edition. I thought the 290 was problem-free, but I might be wrong. Anyway, I haven't heard of many people getting a Gigabyte 290/290X; people usually choose the Tri-X, Vapor-X, or XFX DD.


----------



## rdr09

So, I played BF3 MP 64-player at 1080p and 4K using two 290s and a Sandy Bridge i7 @ 4.5GHz...

I get virtually the same CPU usage. Is HWiNFO64 at all accurate?


----------



## edo101

Before the 14.6 beta driver it was not performing as it should: low FPS relative to similarly clocked cards.

I thought it might be a BIOS issue, so I flashed an updated BIOS, but that wasn't it. I was running out of ideas; luckily AMD released the 14.6 beta driver, and it worked wonders. Then it started to act up again and I reflashed the old BIOS. After that the card became wildly unstable, even in Windows. So I boxed it up but never RMA'd it because I just didn't have time.

I sent Gigabyte an RMA request two days ago. And then, for some reason, today I decided to try one more thing on the card: I wiped all drivers, flashed the latest BIOS, and installed AMD's Omega drivers, and the card is working again.

I've been using it since this morning with no issues yet. I thought I noticed some weird artifacting in Windows, but since then nothing has happened. This card is loud, hot, and a horrible overclocker, though it runs all my games decently at my 1440p resolution.

I asked that question before I "fixed" it. Not sure if the fix will hold, because this card has been wildly erratic in the past. It's working for now, but it did the same thing back in June: working one day, then not.

I'm at a crossroads right now. Go ahead with the RMA since I already have a ticket, spend time I don't have testing the card to see if it's stable enough to sell so I can buy the Tri-X whenever it's next on sale, or just stick with it? I don't know how much louder the Tri-X is, but it seems to run cooler at lower fan speeds and also has a better VRM cooler.

It would be nice if Gigabyte sent me back a fully repackaged new or refurbished card, although it sounds like sometimes they just tweak it.

Lesson learned, though: never buying a Gigabyte product again. When I bought it, the Tri-X was very expensive thanks to Bitcoin mining (even in June it was).


----------



## edo101

Quote:


> Originally Posted by *kizwan*
> 
> Well, which one is it??? If it started crashing, then it's *not* performing as it should.


Lol, I get your point. The card was underperforming FPS-wise, then it became unstable.

But yeah, I guess in both scenarios it wasn't performing like it should.

Now it looks fixed. I'm just not sure whether, two days from now, it won't start artifacting all over the place and fail to boot into Windows with drivers installed.


----------



## edo101

Quote:


> Originally Posted by *Archea47*
> 
> If you do RMA, don't forget the Swedish Fish trick!


Thanks for showing me that. I'm gonna RMA it. The card started throwing random white and green dots in Crysis, so there it is; unless this is normal for AMD cards in general, I'm RMA'ing it. COH 2 ran with no problems, but Crysis for some reason gives me faint dots, lol.

Actually, the card shows no artifacts in any of my other games. Just Crysis, lol.


----------



## xKrNMBoYx

MSI R9 290 Reference Design/Stock Cooler


----------



## combine1237

Does anyone know good VRM heatsinks, thermal adhesive, and 1mm thermal pads for my 290Xs? Also, I'm having trouble identifying some of the VRMs on my ASUS DCUII 290X; could anyone help point them out?

XFX 290x


Asus DCUII OC 290x



Thanks for any help I can get.

Edit: I need everything to fit under a Kraken G10.


----------



## Buehlar

Quote:


> Originally Posted by *combine1237*
> 
> Does anyone know good VRM heatsinks, thermal adhesive, and 1mm thermal pads for my 290Xs? Also, I'm having trouble identifying some of the VRMs on my ASUS DCUII 290X; could anyone help point them out?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> XFX 290x
> 
> 
> Asus DCUII OC 290x
> 
> 
> 
> 
> Thanks for any help I can get.
> 
> Edit: I need everything to fit with a kraken g10


Here are the vrm details from EK for the DCII 290x


----------



## THUMPer1

Man, guys, I have some questions. I've always been an ATI/AMD guy.
My system:
3570K @ 4.5 GHz
8GB Mushkin 1600 RAM at 8-8-8-24
840 Pro SSD
*Diamond R9 290X* (the V2 with Hynix RAM and aftermarket cooler), 78.2 ASIC
I have clocked the card a few different ways, but most of the clocks are around 1100/1300, 1150/1350, or 1100/1350. The card stays cool, never above 70°C, and my CPU never gets above 55°C when gaming. I have the cold weather to thank for that. I am using the latest version of Afterburner; I have also tried TriXX.

Here are my settings.




The issue I am having is that my performance in BF4 varies. I run BF4 in DX11, as Mantle is inconsistent. I am also on the Omega driver, and I have tried 14.11.2 as well, always using DDU in safe mode and restarting before installing different drivers. The only thing I have not done is a clean Windows install. I came from a 280X just about two weeks ago. So below is what I'm annoyed with: the GPU usage is ALL OVER the place, and it feels like it in game.
This is a typical GPU-Z sensor screenshot regardless of clocks or voltage. FYI, GPU usage in Afterburner is the same.


My FPS is mostly above 100, but I get dips to 70 and it feels like a huge drop. I play at 1080p; most settings are Ultra, some are low, but with 4xAA.
GPU usage on the 280X and my 7950 (which I still have) was always 99-100%.
*Every other game I play has no issues.* What gives? Is it the game? Bad drivers?


----------



## pshootr

Quote:


> Originally Posted by *THUMPer1*
> 
> Man, guys, I have some questions. I've always been an ATI/AMD guy.
> My system:
> 3570K @ 4.5 GHz
> 8GB Mushkin 1600 RAM at 8-8-8-24
> 840 Pro SSD
> *Diamond R9 290X* (the V2 with Hynix RAM and aftermarket cooler), 78.2 ASIC
> I have clocked the card a few different ways, but most of the clocks are around 1100/1300, 1150/1350, or 1100/1350. The card stays cool, never above 70°C, and my CPU never gets above 55°C when gaming. I have the cold weather to thank for that. I am using the latest version of Afterburner; I have also tried TriXX.
> 
> Here are my settings.
> 
> 
> 
> 
> The issue I am having is that my performance in BF4 varies. I run BF4 in DX11, as Mantle is inconsistent. I am also on the Omega driver, and I have tried 14.11.2 as well, always using DDU in safe mode and restarting before installing different drivers. The only thing I have not done is a clean Windows install. I came from a 280X just about two weeks ago. So below is what I'm annoyed with: the GPU usage is ALL OVER the place, and it feels like it in game.
> This is a typical GPU-Z sensor screenshot regardless of clocks or voltage. FYI, GPU usage in Afterburner is the same.
> 
> 
> My FPS is mostly above 100, but I get dips to 70 and it feels like a huge drop. I play at 1080p; most settings are Ultra, some are low, but with 4xAA.
> GPU usage on the 280X and my 7950 (which I still have) was always 99-100%.
> *Every other game I play has no issues.* What gives? Is it the game? Bad drivers?


Seems like your clocks are throttling. In AB, make two presets: one for default clocks and one for your OC. Then go to Settings in AB, navigate to the Profiles tab, select your stock preset for the 2D profile and the OC preset for the 3D profile, then save and exit AB. After you restart AB you should be good to go.


----------



## THUMPer1

Quote:


> Originally Posted by *pshootr*
> 
> Seems like your clocks are throttling. In AB, make two presets, one for default clock and one for your OC, then go to settings in AB and navigate to the profiles tab, select your stock preset for the 2D profile and the OC preset for the 3D profile, then save and exit AB. After you restart AB you should be good to go.


I'll try that, thanks. It should be noted, though, that my game was minimized in that screenshot. I had it set to 1100 core. Would only 4 MHz be considered throttling? Or is that just margin of error?


----------



## pshootr

Quote:


> Originally Posted by *THUMPer1*
> 
> I'll try that. Thanks. It should be noted though that my game was minimized in that screenshot. I had it set to 1100 core. Would only 4 mhz be considered throttling? Or is that just a margin of error?


Well, the GPU load bar bounces up and down a lot, which is why I thought of throttling.


----------



## pshootr

Although, if the GPU clocks down in certain parts of the map or when there are no explosions etc., it shouldn't be an issue unless you can feel it. I think most people just like to see that their GPU has been hammered the whole time.


----------



## pshootr

Perhaps you didn't notice this with your other cards because they had less headroom and had to work that much harder. I don't know; it could just be a difference in drivers or hardware.

Someone else with more experience will chime in, I'm sure.


----------



## kizwan

That doesn't look like throttling, because the clocks aren't fluctuating.

@THUMPer1, I always play BF4 at 4K (200% resolution scale), so GPU usage is almost maxed out. The last time I played BF4 at 1080p with a single GPU, GPU usage was better than yours. FPS-wise, yours looks OK to me at Ultra. Did you try frame pacing method 1?

RenderDevice.FramePacingMethod 1
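For anyone who wants to persist console commands like this, BF4 reads a plain-text user.cfg placed in the game's install folder at launch. A minimal sketch; the frame pacing line is the one from above, while the other two variables are just illustrative extras, not required:

```
RenderDevice.FramePacingMethod 1
PerfOverlay.DrawFps 1
GameTime.MaxVariableFps 120
```

You can also type these one at a time into the in-game console; the user.cfg route just saves retyping them every session.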


----------



## DividebyZERO

I have the same thing going on with the 14.12 Omega drivers. My overclock isn't consistent under load, though it seems to stay consistent at stock clocks. I've also noticed my power consumption has been down along with the performance. It's weird, because I have ULPS off and I'm running the PT1 BIOS. At any rate, when I set, say, 1100/1600, it rarely stays at 1100 and normally bounces around the 1000-1100 range. I'm water cooled, and this only started with the driver change.

Vsync seems to work like a pro, so as long as I can use vsync and keep a steady 60/95 FPS I don't care too much for now. It does lower my benchmark scores, and for some reason 3DMark won't run in Eyefinity. I think it may just be the driver saving some power.


----------



## pshootr

Quote:


> Originally Posted by *DividebyZERO*
> 
> I have the same thing going on with the 14.12 Omega drivers. My overclock isn't consistent under load, though it seems to stay consistent at stock clocks. I've also noticed my power consumption has been down along with the performance. It's weird, because I have ULPS off and I'm running the PT1 BIOS. At any rate, when I set, say, 1100/1600, it rarely stays at 1100 and normally bounces around the 1000-1100 range. I'm water cooled, and this only started with the driver change.
> 
> Vsync seems to work like a pro, so as long as I can use vsync and keep a steady 60/95 FPS I don't care too much for now. It does lower my benchmark scores, and for some reason 3DMark won't run in Eyefinity. I think it may just be the driver saving some power.


In your case, your clocks are changing during your 3D apps. I would recommend using AB and trying the settings mentioned in my earlier reply to THUMPer1, if this is a problem for you.

I learned this from another OCN member and would like to give credit where it's due, but I can't seem to find his post.


----------



## chiknnwatrmln

If your issue only shows up in BF4, then BF4 is the problem: if the card loads up properly and maintains clocks in other games and benches, the problem is with BF4.


----------



## rdr09

Quote:


> Originally Posted by *THUMPer1*
> 
> Man, guys, I have some questions. I've always been an ATI/AMD guy.
> My system:
> 3570K @ 4.5 GHz
> 8GB Mushkin 1600 RAM at 8-8-8-24
> 840 Pro SSD
> *Diamond R9 290X* (the V2 with Hynix RAM and aftermarket cooler), 78.2 ASIC
> I have clocked the card a few different ways, but most of the clocks are around 1100/1300, 1150/1350, or 1100/1350. The card stays cool, never above 70°C, and my CPU never gets above 55°C when gaming. I have the cold weather to thank for that. I am using the latest version of Afterburner; I have also tried TriXX.
> 
> Here are my settings.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The issue I am having is that my performance in BF4 varies. I run BF4 in DX11, as Mantle is inconsistent. I am also on the Omega driver, and I have tried 14.11.2 as well, always using DDU in safe mode and restarting before installing different drivers. The only thing I have not done is a clean Windows install. I came from a 280X just about two weeks ago. So below is what I'm annoyed with: the GPU usage is ALL OVER the place, and it feels like it in game.
> This is a typical GPU-Z sensor screenshot regardless of clocks or voltage. FYI, GPU usage in Afterburner is the same.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> My FPS is mostly above 100, but I get dips to 70 and it feels like a huge drop. I play at 1080p; most settings are Ultra, some are low, but with 4xAA.
> GPU usage on the 280X and my 7950 (which I still have) was always 99-100%.
> *Every other game I play has no issues.* What gives? Is it the game? Bad drivers?


Not sure what you're using to monitor FPS, but if it's Fraps, don't. See if unticking voltage monitoring in AB helps, and close AB after applying the settings; I've had issues in the past combining monitoring apps. Try with the GPU at stock, set your pagefile to auto, max out the game and, last, raise your resolution scale to 130%. I can max out that game with a 7950 at 1080p. I mean max.


----------



## Roboyto

Quote:


> Originally Posted by *combine1237*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Anyone know good vrm heatsinks, thermal adhesive and 1mm thermal pads for my 290x's. Also I am having trouble identifying some of the vrms on my asus dcuii 290x could anyone help point them out?
> 
> XFX 290x
> 
> 
> Asus DCUII OC 290x
> 
> 
> 
> 
> 
> Thanks for any help I can get.
> 
> Edit: I need everything to fit with a kraken g10


See my posts in my build log for suggestions regarding thermal pads and tape. You can see the large thermal performance increase I saw with the Kraken and a few select enhancements to it.

Post #25 Kraken/RAM Sinks/Thermal Tape - http://www.overclock.net/t/1478544/the-build-formerly-known-as-the-devil-inside-a-gaming-htpc-for-my-wife-4770k-corsair-250d-powercolor-r9-290/20_20#post_22214255

Post #30 has the Kraken Fan upgrade

Post #41 is Fujipoly Ultra Extreme thermal pad upgrade

You should just reuse the VRM sink that comes attached to your card for the main VRMs (VRM1) on the right side of the card, highlighted in yellow. Upgrading the thermal pads to Fujipoly Ultra Extreme will likely improve its performance.



I don't suggest using thermal adhesive. It is more hassle than it is worth, and it is a PITA to remove compared to thermal tape.

For the RAM/VDD VRMs (VRM2) on the left side of the card, highlighted in green, you will want to use some of that JunPus thermal tape I mention in my build log. Just about any heatsink will work fine, copper or aluminum; it shouldn't make much of a difference. You could get several small sinks, or one large sink to cover all of them. You won't need to worry about airflow to that end of the card to keep good temps. I haven't seen anything over ~60°C on VRM2 with the very small sink and JunPus thermal tape.

Measure the dimensions of that cluster on the left side and you may be able to find something here that tailors to it nicely... or a couple of small ones will work just as well.

http://www.performance-pcs.com/mobo-chipset-mosfet-coolers


----------



## Gobigorgohome

Okay, these 14.12 drivers are kind of weird. I get better FPS in BF4 @ 4K with crossfire than with either trifire or quadfire. Trifire and quadfire are both bad: dips down to 30-35 FPS now and then make the game less than playable. With 14.9 I had no problems with either trifire or quadfire.


----------



## THUMPer1

Thanks for the info, guys.
*RenderDevice.FramePacingMethod 1* does not seem to work with DX11, or at all; it does not show up in the in-game command list. I have a user config file with some tweaks, and I am using the in-game FPS counter.
I added *ANTIALIASINGPOST - HIGH*.
I unticked *Unlock voltage monitoring* in Afterburner.
I tried resolution scaling at 130%, but that seemed to cause a noticeable "clunky" feeling during play.


----------



## silencespr

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Okay, these 14.12 drivers are kind of weird. I get better FPS in BF4 @ 4K with crossfire than with either trifire or quadfire. Trifire and quadfire are both bad: dips down to 30-35 FPS now and then make the game less than playable. With 14.9 I had no problems with either trifire or quadfire.


I've noticed the same... I have one R9 295X2 and two R9 290Xs, and when I enable quadfire at 4K the game gets choppy and unplayable. When I run trifire it runs well. Waiting for my Mini DisplayPort to DisplayPort adapter to come in to see how the R9 295X2 performs alone.


----------



## dodgethis

Long story short.

The MSI 290 Twin Frozr I have is from a very early batch of the release and I am wondering if anyone can positively ID that this is the reference design without me having to remove the card from my current rig.

Here is a link to the PCB in question

http://www.coolingconfigurator.com/step1_complist?gpu_gpus=1286

Here's a link to the newer PCB

http://www.coolingconfigurator.com/step1_complist?gpu_gpus=1401

Here is the picture of the two caps near the top of the card near the PCI-E connectors.



What do you guys think? Do I have a reference layout card?


----------



## fat4l

Hi guys.
I would like to ask what *voltages* you're running with *watercooling* for everyday use.

If anyone who has been with the thread longer can share their knowledge, that would be great. Thanks.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *fat4l*
> 
> Hi guys.
> I would like to ask what *voltages* you're running with *watercooling* for everyday use.
> 
> If anyone who has been with the thread longer can share their knowledge, that would be great. Thanks.


I run +37mV to both my cards. 1111MHz. I can run 1165MHz with +87mV, but that draws a lot more power for not a lot more performance. My limiting factor right now is my PSU.

When I had a single card I ran +170mV or so with 1200MHz. I ran it like this for around eight months every day, the card still works just fine.

This is with reference boards.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *silencespr*
> 
> I've noticed the same... I have one R9 295X2 and two R9 290Xs, and when I enable quadfire at 4K the game gets choppy and unplayable. When I run trifire it runs well. Waiting for my Mini DisplayPort to DisplayPort adapter to come in to see how the R9 295X2 performs alone.


Moar PSU
Quote:


> Originally Posted by *fat4l*
> 
> Hi guys.
> I would like to ask what *voltages* you're running with *watercooling* for everyday use.
> 
> If anyone who has been with the thread longer can share their knowledge, that would be great. Thanks.


Daily: stock. Gaming: stock @ 1000 / [email protected] 1.25vc.
Benchmarking: 1360/1400 @ 1.5vc+ (single card); xfire 1330/1400-odd; tri 1300/1400-odd @ 1.5vc+.


----------



## Blue Dragon

Has anyone seen this (shaders/TMUs/texture fillrate)? They are identical ASUS DCUII R9 290 cards; neither is an 'X' model...

I updated to Catalyst 14.12 Omega and lost GPU core temps on the second card in HWiNFO64 (which I also updated), but the card seems to be working fine. Really busy, so I don't know when I'll get time to swap the cards to see if it still acts the same in the first slot. I didn't think these cards were unlockable, so I'm assuming it's a misread on GPU-Z's part.


----------



## MojoW

Quote:


> Originally Posted by *Blue Dragon*
> 
> has anyone seen this(shaders/TMU/texture fillrate)? they are identical Asus DCUII r9-290 cards- neither is an 'X' model...
> 
> 
> I updated to cat. 14.12 Omega and I lost GPU core temps on second card in HWinfo64(which I also updated), but card seems to be working fine... really busy so I don't know when I'll get time to switch the cards to see if it still acts same in first slot. didn't think these cards were unlockable so I'm assuming it's a mis-read on GPUZ's part.


GPU-Z will misread the second card in CFX when ULPS isn't disabled!
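For reference, ULPS is controlled per display adapter in the Windows registry; tools like MSI Afterburner can flip it for you, but the underlying tweak looks roughly like this (a sketch; the `0001` subkey below is an example, since the numbering varies per system, so check which `0000`, `0001`, ... subkeys actually contain an `EnableUlps` value):

```
Windows Registry Editor Version 5.00

; Display adapter class key. The 0001 subkey is illustrative --
; enumerate the numbered subkeys and set EnableUlps=0 on each AMD adapter entry.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0001]
"EnableUlps"=dword:00000000
```

A reboot is typically needed before the secondary card stops dropping into its low-power state.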


----------



## Sgt Bilko

Quote:


> Originally Posted by *MojoW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Blue Dragon*
> 
> has anyone seen this(shaders/TMU/texture fillrate)? they are identical Asus DCUII r9-290 cards- neither is an 'X' model...
> 
> 
> I updated to cat. 14.12 Omega and I lost GPU core temps on second card in HWinfo64(which I also updated), but card seems to be working fine... really busy so I don't know when I'll get time to switch the cards to see if it still acts same in first slot. didn't think these cards were unlockable so I'm assuming it's a mis-read on GPUZ's part.
> 
> 
> 
> GPU-Z will misread CFX when ulps isn't disabled!
Click to expand...

^ Yep that


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> ^ Yep that


Gidday Jetsetter / Trendsetter


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> ^ Yep that
> 
> 
> 
> Gidday Jetsetter / Trendsetter
Click to expand...

Afternoon there!

Hows things?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Afternoon there!
> 
> Hows things?


No, it's 12:40 AM here, but I am good and ready for sleeps








And 4000 posts yipee


----------



## Sgt Bilko

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Afternoon there!
> 
> Hows things?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No its 12.40Am here , but i am goods and ready for sleeps
> 
> 
> 
> 
> 
> 
> 
> 
> And 4000 posts yipee
Click to expand...

Afternoon here









grats on the 4k posts man, I'm not too far off it


----------



## rdr09

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Afternoon here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> grats on the 4k posts man, I'm not too far off it


4K? Pfft, those are just my infractions. lol

kid kid


----------



## muhd86

Can we install four Sapphire R9 290 Tri-X GPUs on a Rampage IV board? Need a quick reply, please.


----------



## tsm106

Quote:


> Originally Posted by *muhd86*
> 
> can we install 4 r9-290 tri-x sapphire gpus on a rampage iv board // need a quick reply please .


Wasn't it already answered and you disregarded it?

http://www.overclock.net/t/1534060/quad-r9-290-tri-x-sapphire/0_40#post_23361203


----------



## Arizonian

Quote:


> Originally Posted by *xKrNMBoYx*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> MSI R9 290 Reference Design/Stock Cooler


Congrats - added









Quote:


> Originally Posted by *Sgt Bilko*
> 
> Afternoon here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> grats on the 4k posts man, I'm not too far off it


It's not the quantity, it's the quality of the posts that counts.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Afternoon here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> grats on the 4k posts man, I'm not too far off it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's not the quantity, It's the quality of the posts that count.
Click to expand...

Very true


----------



## tsm106

Quote:


> Originally Posted by *THUMPer1*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pshootr*
> 
> Seems like your clocks are throttling. In AB, make two presets, one for default clock and one for your OC, then go to settings in AB and navigate to the profiles tab, select your stock preset for the 2D profile and the OC preset for the 3D profile, then save and exit AB. After you restart AB you should be good to go.
> 
> 
> 
> I'll try that. Thanks. It should be noted though that my game was minimized in that screenshot. I had it set to 1100 core. Would only 4 mhz be considered throttling? Or is that just a margin of error?
Click to expand...

There are a few steps missing in that profile setup. Do what is written here, and don't forget to check unified monitoring. GPU-Z is not good for usage monitoring, btw.

http://www.overclock.net/t/1529888/pc-freezes-from-youtube-browser-apps-after-r9-290-install/0_40#post_23271996


----------



## pshootr

Quote:


> Originally Posted by *tsm106*
> 
> There are a few steps missing in that profile setup. Do what is written here, and don't forget to check unified monitoring. GPU-Z is not good for usage monitoring, btw.
> 
> http://www.overclock.net/t/1529888/pc-freezes-from-youtube-browser-apps-after-r9-290-install/0_40#post_23271996


Thank you. I wanted to direct him to this post but I could not seem to find it.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Afternoon here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> grats on the 4k posts man, I'm not too far off it


You did it in a much shorter time frame. Say hello to da missus for me, eh









Quote:


> Originally Posted by *rdr09*
> 
> 4K. pfft. those are just my infractions.lol
> 
> kid kid


LoooooL
You, I like you









Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> It's not the quantity, It's the quality of the posts that count.


You're a 'cup half full' kind of guy? Yes??


----------



## ahmedmo1

After switching to the Omega drivers and increasing voltage (+30mv), my black screen issue at idle and BSODs during flash videos have completely disappeared. Problem-free for over two weeks.


----------



## xKrNMBoYx

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> It's not the quantity, It's the quality of the posts that count.


Thanks. As soon as I get a bigger/better power supply it's time to put it through some stress testing and then OC (along with the 8350).


----------



## mambastik

Hey guys! Building my first desktop and decided to pick up a used R9 290 on eBay from a bitcoin-mining rig: no OC, housed in a cooled warehouse. It must've been the packaging, but it seems a small piece got ripped off the board. My guess is that it snagged on the grippy plastic bubble wrap it was shipped in. The area you are looking at is towards the back of the card, opposite the ports. Attached are images from 2 different angles:




Since that piece was between 2 screws, I didn't find out until after I started taking apart the stock cooler. Unless I was scanning the whole board centimeter by centimeter, I probably wouldn't have noticed it. So here's the thing: the GPU runs perfectly fine, benchmarked and everything. After babying it around and installing a Kraken G10 + Corsair H55, the piece fell off completely. Re-benchmarked and it still works fine.

I know everyone here's a lot more knowledgeable than I am, so my question is what exactly is it? And should I be worried down the line?


----------



## Mercy4You

I have a rather unusual question, I guess, for Overclock.net









Why does CCC allow power limit settings under 0%, and does it really lower voltage etc. when downclocking core and memory?

Anyone tested this yet because I cannot find any information about this...

To be more specific, when I run my core @ 900 and memory @ 1250, should I also reduce my power limit setting then







?


----------



## maynard14

Guys, help... I'm playing Ryse: Son of Rome and I'm getting a black screen while playing; my PC lags and I can't use it. I'm using a 290X with a G10 and H105 on the Omega drivers. I already uninstalled and reinstalled the Omega drivers, but still black screens. Tried stock settings on my 4770K, no luck either. What could be the problem? I also run my 290X at default clocks. I have no idea what the problem is...


----------



## Mercy4You

Quote:


> Originally Posted by *maynard14*
> 
> Guys, help... I'm playing Ryse: Son of Rome and I'm getting a black screen while playing; my PC lags and I can't use it. I'm using a 290X with a G10 and H105 on the Omega drivers. I already uninstalled and reinstalled the Omega drivers, but still black screens. Tried stock settings on my 4770K, no luck either. What could be the problem? I also run my 290X at default clocks. I have no idea what the problem is...


Please give some more information: did you have black screens before on other drivers, did you use DDU, and what about other games?


----------



## maynard14

Quote:


> Originally Posted by *Mercy4You*
> 
> Please give some more information: did you have black screens before on other drivers, did you use DDU, and what about other games?


Sorry for the missing info. No, this is the first time I've gotten a black screen, and only in Ryse: Son of Rome. I already played Assassin's Creed Unity on the Omega drivers with no problems and no black screens; only Ryse is giving me a black screen. Yes, I uninstalled the Omega drivers with the latest DDU.


----------



## Mercy4You

Quote:


> Originally Posted by *maynard14*
> 
> Sorry for the missing info. No, this is the first time I've gotten a black screen, and only in Ryse: Son of Rome. I already played Assassin's Creed Unity on the Omega drivers with no problems and no black screens; only Ryse is giving me a black screen. Yes, I uninstalled the Omega drivers with the latest DDU.










OK, then I would roll back to 14.9 if I were you! BTW, I find 14.9 very fast and stable, so I didn't bother going to the Omega drivers...


----------



## maynard14

Quote:


> Originally Posted by *Mercy4You*
> 
> 
> 
> 
> 
> 
> 
> 
> OK, then I would roll back to 14.9 if I were you! BTW, I find 14.9 very fast and stable, so I didn't bother going to the Omega drivers...


Really? Hmm, I might try it. I already tried upping my voltage by +30 mV and the power limit to +20 with stock clocks! Still black screens and hangs. This is heartbreaking.


----------



## StrongForce

Just wondering, has anyone tried VSR? How does it work? I mean, how do I activate it? Do I have to enable supersampling in the Catalyst drivers?


----------



## Mercy4You

Quote:


> Originally Posted by *maynard14*
> 
> Really? Hmm, I might try it. I already tried upping my voltage by +30 mV and the power limit to +20 with stock clocks! Still black screens and hangs. This is heartbreaking.


You 'should' try it, it's the only way to exclude drivers as the 'problem'








Or stop playing Ryse son of Rome


----------



## maynard14

Quote:


> Originally Posted by *Mercy4You*
> 
> You 'should' try it, it's the only way to exclude drivers as the 'problem'
> 
> 
> 
> 
> 
> 
> 
> 
> Or stop playing Ryse son of Rome


Hahah, yeah, that game is good but I think it's buggy? Ahmm, but my card is not broken, right? It's a driver issue?


----------



## Mercy4You

Quote:


> Originally Posted by *maynard14*
> 
> Hahah, yeah, that game is good but I think it's buggy? Ahmm, but my card is not broken, right? It's a driver issue?


That's why I would roll back drivers first and then play this game again to see what happens. To be honest, if this problem only happens in that game, I wouldn't worry about your card being broken


----------



## maynard14

Thanks sir. I'm downloading 14.9 now. But while downloading, I played Shadow of Mordor for about 10 min... boom, it black screened also. Damn... I think my card is broken. Last chance is to install the 14.9 driver


----------



## Forceman

Quote:


> Originally Posted by *Mercy4You*
> 
> I have a rather unusual question, I guess, for Overclock.net
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Why does CCC allow power limit settings under 0%, and does it really lower voltage etc. when downclocking core and memory?
> 
> Anyone tested this yet because I cannot find any information about this...
> 
> To be more specific, when I run my core @ 900 and memory @ 1250, should I also reduce my power limit setting then
> 
> 
> 
> 
> 
> 
> 
> ?


Changing just the power limit won't change the delivered voltage unless you exceed it. So setting 90% on the power limit will just change the throttle point, it won't affect anything that happens as long as you are below that limit.

If you want to undervolt, use AB or Trixx or something and manually reduce the voltage with a negative offset.
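Forceman's point can be put into a toy calculation; a minimal sketch, assuming a 250 W reference board power and a simple linear scaling (both are assumptions for illustration, not AMD specifications):

```python
# Toy model of the PowerTune power limit slider.
# Assumption: the slider linearly scales an assumed 250 W board power cap;
# voltage and clocks are untouched until measured draw exceeds the cap.
TDP_WATTS = 250

def power_cap(limit_percent):
    """Board power cap for a slider value, e.g. -10 for '90%', +50 for '150%'."""
    return TDP_WATTS * (1 + limit_percent / 100)

print(power_cap(-10))  # 225.0 -> throttling kicks in only above this draw
print(power_cap(50))   # 375.0 -> extra headroom for overclocking
```

In other words, lowering the slider only moves the point where throttling begins; to actually lower voltage while below the cap, you set a negative offset in Afterburner or Trixx as described above.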


----------



## maynard14

Still black screen with the 14.9 driver... Hmm, I hope I can still RMA the card. Weird, I finished AC Unity with no issues, but Ryse: Son of Rome and Shadow of Mordor give me black screens. Damn. Hope I'm able to RMA it.


----------



## maynard14

Guys... is this normal? While checking my card I found this! What could it be?



It's like liquid or something...


----------



## ThijsH

What are we supposed to look at in that picture?


----------



## maynard14

Here's another picture. I'm using my phone right now. The capacitors? The little square things... It's like black-colored liquid coming out of the square thing...


----------



## Mercy4You

Quote:


> Originally Posted by *Forceman*
> 
> Changing just the power limit won't change the delivered voltage unless you exceed it. So setting 90% on the power limit will just change the throttle point, it won't affect anything that happens as long as you are below that limit.
> 
> If you want to undervolt, use AB or Trixx or something and manually reduce the voltage with a negative offset.


Ok thx









Can I count on Powertune to automatically reduce voltage when running manually adjusted lower clocks?


----------



## maynard14

I googled them and it's called an inductor. So there's some black liquid coming out of them... but it seems the black liquid is hard. Is that normal? Will I still be able to RMA the card in that shape?


----------



## BradleyW

Quote:


> Originally Posted by *maynard14*
> 
> I googled them and it's called an inductor. So there's some black liquid coming out of them... but it seems the black liquid is hard. Is that normal? Will I still be able to RMA the card in that shape?


When did it appear, or might it have always been there?


----------



## Mercy4You

Quote:


> Originally Posted by *maynard14*
> 
> I googled them and it's called an inductor. So there's some black liquid coming out of them... but it seems the black liquid is hard. Is that normal? Will I still be able to RMA the card in that shape?


Too bad the rolling back didn't solve the problem, but now you know the problem is not the Omega driver!

It's not normal that the liquid thing is hard







But I think bleeding cards fall under the warranty terms...


----------



## maynard14

I didn't really notice them before, but I might be paranoid because of the black screens I've been getting lately. I googled some reference PCBs for the 290X; some have the exact same inductor with the liquid thing, but some don't. I hope this liquid thing is normal. I'm sorry guys, I'm just kind of lost because I don't know what to do now. My card is just 2 months old. I can barely afford a new one.


----------



## maynard14

Quote:


> Originally Posted by *Mercy4You*
> 
> Too bad the rolling back didn't solve the problem, but now you know the problem is not the Omega driver!
> 
> It's not normal that the liquid thing is hard
> 
> 
> 
> 
> 
> 
> 
> But, I think bleeding cards fall under the guarantee terms...


I hope so, sir, that they will still RMA it. I don't know what caused it; my temps are OK. My VRMs are at 71°C and 80°C while my GPU core is at 55°C.


----------



## BradleyW

That inductor is coated. Liquid or no liquid, it is not making contact with the component under the cover. I would say this is not the cause of the issue. If you still get black screens, RMA the card. A small number of cards have a black screen issue; I don't know why. Never had it myself with these day-one 290Xs I have.


----------



## ThijsH

Found someone with a relevant thread: http://linustechtips.com/main/topic/111418-r9-290x-chokes/ .

@maynard14


----------



## Mercy4You

Quote:


> Originally Posted by *maynard14*
> 
> I hope so, sir, that they will still RMA it. I don't know what caused it; my temps are OK. My VRMs are at 71°C and 80°C while my GPU core is at 55°C.


Be glad you didn't buy an ASUS card! I had the same black screen problems with my first R9 290X and ASUS did not take it back, while it was only 2 months old...


----------



## maynard14

Guys, thank you... thank you! I can sleep better now. I'm confident that they will replace my card. And that's another thing learned. Hope tomorrow I'll be able to RMA it. Thanks guys! Blasted black screen issue!


----------



## owcraftsman

Question: I have a second 290X coming sometime today and I want to install it and set it up to run BF4.

Can you all give me some suggestion to setup properly?

Been a Green team player many years, which is why I ask. I'm clear on install and setup. For example, I know that for a 2nd Nvidia card I'd have to install drivers and SLI wouldn't be enabled automatically, and I do like to tweak my global and per-game settings. So that's the type of advice I'm looking for, from basic to advanced. I've read so many horror stories here as a subscriber for a year now, especially regarding BF4. I'd just like to get it as close to right the first time if at all possible.

Thanks in advance!


----------



## tsm106

Quote:


> Originally Posted by *maynard14*
> 
> Guys, thank you... thank you! I can sleep better now. I'm confident that they will replace my card. And that's another thing learned. Hope tomorrow I'll be able to RMA it. Thanks guys! Blasted black screen issue!


That black liquidy looking stuff is a composite that is used to hold the choke inside the two halves. It's there to absorb the natural vibrations. If you see that on your inductors, they've already blown their load so to speak. You should never see that on a new board.


----------



## hyp36rmax

Yay they are in!


----------



## kawasakizx10rr

AMD R9 290X Battlefield 4 reference edition from club3d, with a EK acetal & nickel full water block

The gpu shares the same loop as the motherboard and cpu but temps are still nice and low


----------



## ahmedmo1

Quote:


> Originally Posted by *owcraftsman*
> 
> Question? I have a second 290x coming sometime today and I want to install and setup to run BF4.
> 
> Can you all give me some suggestion to setup properly?
> 
> Been a Green team player many years which is why I ask. I'm clear on install and setup. For example I know I have to install drivers for a 2nd Nvidia card and SLI won't be enabled automatically and I do like to tweak my global and per-game settings. So that would be the type of advice I'm looking for for basic to advanced. I've read so many horror stories here as a subscriber for a year now especially regarding BF4. I'd just like to get it as close to right the first time if at all possible.
> 
> Thanks in advance!


What resolution are you running BF4 at?


----------



## Gobigorgohome

Okay, I threw out two of my cards earlier this afternoon and tried some BF4 @ 4K afterwards; it looked quite good. PT1 and overclocking next.









Did notice something weird in GPU-Z though: card 1 is showing Hynix while card 2 is showing unidentified... any way to get around this?


----------



## silencespr

Hey guys, what's the best BIOS for the R9 290X? Don't feel like reading 3,387 pages. Thx


----------



## Forceman

Quote:


> Originally Posted by *Mercy4You*
> 
> Ok thx
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can I count on Powertune to automatically reduce voltage when running manually adjusted lower clocks?


I don't think it will if it is in 3D clock mode - it'll lower the 2D clocks (on the desktop, for instance), but if you lower the 3D clock speed you'll still get the same voltage. You'd have to change it manually using AB or Trixx. Although I guess it wouldn't hurt to try it and see what happens - I'm not positive it won't, I just highly doubt it.


----------



## owcraftsman

Quote:


> Originally Posted by *ahmedmo1*
> 
> Quote:
> 
> 
> 
> Originally Posted by *owcraftsman*
> 
> Question? I have a second 290x coming sometime today and I want to install and setup to run BF4.
> 
> Can you all give me some suggestion to setup properly?
> 
> Been a Green team player many years which is why I ask. I'm clear on install and setup. For example I know I have to install drivers for a 2nd Nvidia card and SLI won't be enabled automatically and I do like to tweak my global and per-game settings. So that would be the type of advice I'm looking for for basic to advanced. I've read so many horror stories here as a subscriber for a year now especially regarding BF4. I'd just like to get it as close to right the first time if at all possible.
> 
> Thanks in advance!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What resolution are you running BF4 at?

2560x1440, 60-70 Hz capable, plus I extend my desktop to a 1080p monitor, but the game remains on the main monitor only


----------



## maynard14

Quote:


> Originally Posted by *tsm106*
> 
> That black liquidy looking stuff is a composite that is used to hold the choke inside the two halves. It's there to absorb the natural vibrations. If you see that on your inductors, they've already blown their load so to speak. You should never see that on a new board.


I see. I hope they will still accept the warranty for my card


----------



## maynard14

Quote:


> Originally Posted by *maynard14*
> 
> i see, i hope they will still accept warranty for my card


i found this while searching for bleeding chokes



sorry double post


----------



## mojobear

Hey guys,

Just wanted to confirm that the Fujipoly pad upgrade for the VRMs, posted originally by roboyto, works amazingly well

http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures

I only put the pads on VRM1, but its temps have decreased by about 20°C. VRM1 temps are now below the GPU core... ridiculous. See the picture below. This was looped 3DMark Fire Strike, first scene


----------



## pshootr

Quote:


> Originally Posted by *mojobear*
> 
> Hey guys,
> 
> Just wanted to confirm that the fuji pad upgrade for VRMs posted originally by roboyto works amazingly well
> 
> http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures
> 
> I only put the pads on VRM1 but they have decreased by about 20 C. VRM1 temps are now below GPU core...ridiculous. See below picture. This was looped 3dmark firestrike first scene


Wow man, that is sweet


----------



## Arizonian

Quote:


> Originally Posted by *kawasakizx10rr*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> AMD R9 290X Battlefield 4 reference edition from club3d, with a EK acetal & nickel full water block
> 
> The gpu shares the same loop as the motherboard and cpu but temps are still nice and low
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## maynard14

Ahaha... my 3rd 290X! But this one is UEFI BIOS ready. What's the difference between it and a normal 290X?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *maynard14*
> 
> 
> Ahaha.. My 3rd 290x ! But this one has uefi bios ready.. Whats the diff btw from a normal 290x?


Unlocks to a 290 ?????


----------



## maynard14

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Unlocks to a 290 ?????


Hahaha! Still having problems! Black screen even with the new 290X.

Hmm, what else can cause a black screen? Is it my PSU?


----------



## Mercy4You

Quote:


> Originally Posted by *maynard14*
> 
> hahaha! still having problems! black screennnnnnnn even with new 290x
> 
> hmm what can be also the cause of black screen? is it my psu?


WOW, you arranged a new R9 fast









Now you already know 2 things:
1: it's not the drivers
2: It's not your GPU


----------



## maynard14

Haha yes... lucky me, it's a Hynix 290X, but I have a new problem. While playing at stock CPU speed my PC will restart, and after the restart it gives me an error in the POST/BIOS telling me anti-surge was activated... which means my PSU? But if I play with the CPU overclocked to 4.3 GHz at 1.27 volts, it black screens while playing and the PC hangs...
I thought it was my video card, but it isn't.

I don't know if the PSU, RAM, processor, or even my mobo is the faulty one... Help please...


----------



## Mercy4You

Quote:


> Originally Posted by *maynard14*
> 
> Haha yes... lucky me, it's a Hynix 290X, but I have a new problem. While playing at stock CPU speed my PC will restart, and after the restart it gives me an error in the POST/BIOS telling me anti-surge was activated... which means my PSU? But if I play with the CPU overclocked to 4.3 GHz at 1.27 volts, it black screens while playing and the PC hangs...
> I thought it was my video card, but it isn't.
> 
> I don't know if the PSU, RAM, processor, or even my mobo is the faulty one... Help please...


It may be the electrical wiring at your wall socket. Try your PC somewhere else in your house first before looking further...


----------



## ThijsH

That does sound like there might be something wrong with your power. It could also explain why the choke on the VRM started bleeding. Maybe the PSU, or maybe it's the wall wiring? Either way, it could be very dangerous for your components if the power is the problem. Also, gratz on getting Hynix memory


----------



## sTOrM41

Quote:


> Originally Posted by *maynard14*
> 
> But this one has uefi bios ready.. Whats the diff btw from a normal 290x?


Every 290(X) should be UEFI-ready;
even my 2-year-old 7950 was UEFI-ready


----------



## maynard14

Quote:


> Originally Posted by *Mercy4You*
> 
> It may be the electrical wiring on your wall-socket. Try your pc on another place in your house first before looking further...


Quote:


> Originally Posted by *ThijsH*
> 
> That does sound like there might be something wrong with your power. This could also explain why the choke on the vrm started bleeding. Maybe the psu or maybe it's the wall wiring? Either way it could be very dangerous for your components indeed if the power is the problem. Also gratz on getting hynix memory


I already tried plugging my PC directly into a wall outlet... still the same. But I never tried a different outlet...

I will try that later. But is there another way?


----------



## HOMECINEMA-PC

@maynard14
The VGA BIOS might need flashing to the ASUS PT1T for better compatibility, perhaps. Is this occurring after the Windows boot splash screen?? That's what I did to stop it from happening.
http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/0_20
Have a read of that, give it a go, and call me in the morning LooooL
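For anyone new to flashing, the ATIFlash steps referenced here generally look like the sketch below, run from an elevated command prompt. The adapter index `0` and the ROM filenames are placeholders I've assumed; confirm your adapter number first and keep the backup somewhere safe.

```shell
# Hypothetical sketch: adapter index 0 and the ROM filenames are assumptions.
# List adapters first to confirm which index is your card.
atiflash -i

# Save a backup of the BIOS currently on adapter 0 before touching anything.
atiflash -s 0 backup.rom

# Program the new ROM to adapter 0, then reboot for it to take effect.
atiflash -p 0 pt1t.rom
```

If the flash refuses to write over a mismatched subsystem ID, the `-f` switch forces it, but forcing a ROM meant for a different board can brick the card, so use it cautiously and only on reference PCBs.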


----------



## maynard14

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> @maynard14
> VGA bios might need flashing to the asus PT1T for better compatibility perhaps . Is this occurring after win / boot splash screen ?? That's what I did to stop it from happening .
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread/0_20
> Have a read of that give it a go and call me in the morning LooooL


Thanks sir. No, it only occurs while playing a game. Playing Ryse: Son of Rome and any high-end game at max settings, after about 1 min of play it will black screen if my 4770K is OC'd to 4.3 GHz.

But if I play with my 4770K at stock clocks, my PC simply restarts and gives me the error that anti-surge has been activated.

I also tried flashing my card to the PT1 BIOS, but still no luck.

After work I'll try to stress test my PC without the card.

I think my PSU is failing.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *maynard14*
> 
> Thanks sir. No, it only occurs while playing a game. Playing Ryse: Son of Rome and any high-end game at max settings, after about 1 min of play it will black screen if my 4770K is OC'd to 4.3 GHz.
> 
> But if I play with my 4770K at stock clocks, my PC simply restarts and gives me the error that anti-surge has been activated.
> 
> I also tried flashing my card to the PT1 BIOS, but still no luck.
> 
> After work I'll try to stress test my PC without the card.
> 
> I think my PSU is failing.


Ohhh, sounds like you're onto it. Replace and test, or turn off surge protection........

Are you using PSU cable extensions at all?? If so, remove them; they could be the culprit........


----------



## maynard14

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Ohhh, sounds like you're onto it. Replace and test, or turn off surge protection........
> 
> Are you using PSU cable extensions at all?? If so, remove them; they could be the culprit........


Yes, I'm using BitFenix extensions; I will remove them later

and test my PC outside the case. I hope no short is being triggered.

I already replaced the card with a new 290X, so I think the GPU is not the issue.


----------



## thrgk

I was wondering how to get my speeds and voltage to go lower at idle on my 4x 290X? Right now at idle my clocks are 300/150, but my voltage is 961 mV. Is this normal? Should I enable PowerPlay, and if so does it affect my performance in games, etc.?


----------



## tsm106

Quote:


> Originally Posted by *maynard14*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HOMECINEMA-PC*
> 
> Ohhh sounds like your on to it , Replace and test or turn off surge protection ........
> 
> *Are you using PSU cable extensions at all ?? If so remove them could be culprit ........*
> 
> 
> 
> yes *im using bitfenix extensions*, i will remove them later
> 
> 
> 
> 
> 
> 
> 
> and test my pc without the case,, i hope no shorts is triggering
> 
> i already replace the card with a new 290x, so i think the gpu is not the issue











Btw, don't flash to pt1, it will exacerbate power issues because that bios has no powersaving limits.


----------



## maynard14

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Btw, don't flash to pt1, it will exacerbate power issues because that bios has no powersaving limits.


On my last 290X, the one with the bleeding chokes, I flashed the PT1 BIOS.

But on my new GPU I never flashed the PT1 BIOS; it's just 1 day old haha. I haven't installed the NZXT G10 and never disassembled it. I'll just test my PC later without the GPU.

I'm hoping it's not my processor or my motherboard.

If my PSU is causing my problem I can still buy one; processors and mobos are expensive! haha


----------



## Regnitto

I'm sorry if I'm asking a question that's been answered earlier in the thread, but I really don't feel like digging through almost 1,700 pages of posts... Is the PT1 BIOS just for the 290X, or is there a PT1 for the 290 as well? My card can't be flashed to a 290X, so that's not an option. And if so, does it give any performance boost, or just prevent throttling when fully loaded?


----------



## tsm106

Quote:


> Originally Posted by *Regnitto*
> 
> I'm sorry if I'm asking a question that's been answered earlier in these threads, but I really don't feel like digging through almost 1700 pages of posts.......Is the pt1 bios just for the 290x, or is there a pt1 for the 290 as well? My card is not able to be flashed to a 290x, so that is not an option. And if so, does it give any performance boost or just prevent throttling when fully loaded?


PT1 bios doesn't boost performance but allows you to run more voltage, provided you are running the modded gputweak. It also has had powertune removed so it will not downclock frequencies at idle. Regarding throttling, I suppose you could use this to combat throttling but imo its the wrong way to go about it. That said, PT1 is strictly for reference cards, in either form 290x or 290. If your card is not compatible with a 290x bios, chances are it may not be fine with the PT1 bios either.


----------



## DividebyZERO

Quote:


> Originally Posted by *tsm106*
> 
> PT1 bios doesn't boost performance but allows you to run more voltage, provided you are running the modded gputweak. It also has had powertune removed so it will not downclock frequencies at idle. Regarding throttling, I suppose you could use this to combat throttling but imo its the wrong way to go about it. That said, PT1 is strictly for reference cards, in either form 290x or 290. If your card is not compatible with a 290x bios, chances are it may not be fine with the PT1 bios either.


PT1 saved my butt on my x58/290x ref combo, of course i doubt anyone runs that combo these days.


----------



## EdWeisz

Quote:


> Originally Posted by *DividebyZERO*
> 
> PT1 saved my butt on my x58/290x ref combo, of course i doubt anyone runs that combo these days.


Still clinging to my X58 (with an R9 290X).


----------



## Regnitto

Quote:


> Originally Posted by *tsm106*
> 
> PT1 bios doesn't boost performance but allows you to run more voltage, provided you are running the modded gputweak. It also has had powertune removed so it will not downclock frequencies at idle. Regarding throttling, I suppose you could use this to combat throttling but imo its the wrong way to go about it. That said, PT1 is strictly for reference cards, in either form 290x or 290. If your card is not compatible with a 290x bios, chances are it may not be fine with the PT1 bios either.


The downclock at idle is one of the things I was hoping to get rid of. Also (and this may be something I can fix with settings in AB, maybe?), if I'm in game and it's not running at 90-100% load, my clocks will fluctuate between 1050 and 1200 (OC is 1200/1500, +100 mV, +50% power limit). I don't see this in Fire Strike or Heaven, as they stay at 100% load. It has done this since day 1. Is this part of PowerTune, and/or something I can prevent through AB? Also... what happens if I flip my BIOS switch? I've yet to mess with that. I haven't messed with the BIOS on this card either, but I did run Hawaii Info to determine that I am hardlocked to 290.


----------



## maynard14

Currently on the process of finding the defective part of my pc..ehe


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Regnitto*
> 
> the downclock at idle is one of the things I was hoping to get rid of, also (and this may be something I can fix with settings in AB maybe?) if I'm in game and it's not running 90-100% load, my clocks will fluctuate between 1050-1200 (oc is 1200/1500 +100mv, +50% power limit). I do not see this in firestrike or heaven as they stay 100% load. It has done this since day 1. is this part of the powertune and/or something I can prevent through AB? also..... *what happens if I flip my BIOS switch?* I've yet to mess with that, also I have not messed with the BIOS on this card, however I did run Hawaii Info to determine that I am hardlocked to 290.


Nothing, just the fan profile is different... till you flash a different BIOS


----------



## Blue Dragon

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Nothing , just the fan profile is different ....., till you flash a different bios


That's one thing I was never really clear on with R9 290(X)s... is it just a fan-profile switch, or is it an actual BIOS switch that carries different fan profiles? If the latter, why don't they release two different BIOS versions when updating? Which BIOS am I getting? It doesn't really affect my cards, but I'd still like to know.


----------



## Forceman

I'm not positive on the 290X (but I'm pretty confident it's the same), but on the 290 it is two different BIOSes. I have a 290X BIOS flashed on one side and a 290 BIOS on the other, and both work just fine.


----------



## thrgk

If I did decide to go 7.1 or 5.1, what speakers would you recommend? Would the back ones be wireless, or use ceiling mounts?

I feel that 6 1/2 inch speakers like my Daytons would be too large


----------



## silencespr

Quote:


> Originally Posted by *maynard14*
> 
> Currently on the process of finding the defective part of my pc..ehe


how is that Mobo working out for you ?


----------



## maynard14

Quote:


> Originally Posted by *silencespr*
> 
> how is that Mobo working out for you ?


I think my rig is fine now. Played Ryse: Son of Rome for 15 mins, still no black screen.

It could be my case causing a short on my mobo, or maybe the sleeved extensions.

I also tried the latest beta driver and it's working great. But I would like to sell the 290X simply because of the driver issues; this problem made me lose my mind. Haha, but I'm thankful I worked it out.

Tomorrow I'll reassemble my rig into my Raven case.

The motherboard looks really good and is a tough board. But my OC is at 4.3 GHz only, delidded, with voltage at 1.27 V


----------



## tsm106

Quote:


> Originally Posted by *Regnitto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> PT1 bios doesn't boost performance but allows you to run more voltage, provided you are running the modded gputweak. It also has had powertune removed so it will not downclock frequencies at idle. Regarding throttling, I suppose you could use this to combat throttling but imo its the wrong way to go about it. That said, PT1 is strictly for reference cards, in either form 290x or 290. If your card is not compatible with a 290x bios, chances are it may not be fine with the PT1 bios either.
> 
> 
> 
> the downclock at idle is one of the things I was hoping to get rid of, also (and this may be something I can fix with settings in AB maybe?) if I'm in game and it's not running 90-100% load, my clocks will fluctuate between 1050-1200 (oc is 1200/1500 +100mv, +50% power limit). I do not see this in firestrike or heaven as they stay 100% load. It has done this since day 1. is this part of the powertune and/or something I can prevent through AB? also.....what happens if I flip my BIOS switch? I've yet to mess with that, also I have not messed with the BIOS on this card, however I did run Hawaii Info to determine that I am hardlocked to 290.
Click to expand...

Rule of thumb, you don't want to be running PT1, it is a hammer when a chisel is all that's required. You don't want to use PT1 for throttling issues, because most likely AB is not configured properly. Follow the link in this post below for starters.

Quote:


> Originally Posted by *pshootr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> There are a few steps missing when running a profile setup. Do what is written here. Don't forget to check unified monitoring. GPU-Z is not good for usage monitoring, btw.
> 
> http://www.overclock.net/t/1529888/pc-freezes-from-youtube-browser-apps-after-r9-290-install/0_40#post_23271996
> 
> 
> 
> Thank you. I wanted to direct him to this post but I could not seem to find it.
Click to expand...

I should recommend this to be stickied in the OP.


----------



## aaroc

Hello. Does anyone have an XFX R9 290X 8GB Double Dissipation? Is it a reference board?
I have EK waterblocks for the reference R9 290X board and want to reuse them.
The PowerColor R9 290X 8GB is a lot cheaper but doesn't use a reference PCB, so I would have to buy new waterblocks. I checked the configurator, but they removed the 8GB cards from the available options.


----------



## Regnitto

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Nothing , just the fan profile is different ....., till you flash a different bios


Fan profile doesn't matter since I don't have a fan plugged into my red modded 290


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Regnitto*
> 
> Fan profile doesn't matter since I don't have a fan plugged into my red modded 290


Position 1 = stock BIOS
Position 2 = 290X BIOS (PT1T for benching) with GPU Tweak
I run this 24/7 @ 1000/[email protected] but I run chillers as well. No dramas


----------



## Regnitto

Quote:


> Originally Posted by *tsm106*
> 
> Rule of thumb, you don't want to be running PT1, it is a hammer when a chisel is all that's required. You don't want to use PT1 for throttling issues, because most likely AB is not configured properly. Follow the link in this post below for starters.
> I should recommend this to be stickied in the OP.




I used that link a few days ago and matched my configuration to it, but I still have the same issue, although I'm not really sure how much of an issue it really is. The only performance problem I seem to have in games is CPU physics with my FX-6100. Maybe what I'm seeing is the clocks just following the load and a CPU bottleneck? I'm thinking about moving to Devil's Canyon if I get a good enough tax refund. If it's really good like last year, maybe even Haswell-E, although I'm also looking into getting more monitors and setting up Eyefinity, so that may affect how I upgrade the rest of my rig.

I know it's not a thermal throttling issue, as even at my OC I never see the core go over 65°C unless I leave my radiator fan on low, and my VRMs never break 67°C in games or 75°C in benchmarks.


----------



## Jabba1977

Hello!! I've been with Nvidia for the last two years...

Do you think CF 290X is smoother than 980 SLI or 970 SLI? It's for gaming at 1600p.

What do you think? Are frame pacing and the AMD drivers working well?

Thanks!!!


----------



## rdr09

Quote:


> Originally Posted by *Jabba1977*
> 
> Hello!!, I´m with Nvidia for the latest two years...
> 
> Do you think CF 290X is smoother than a 980 SLI or 970...is for gaming at 1600p.
> 
> What do you think?, are the framepacing and AMD drivers working???.
> 
> Thanks!!!.


Depends on the games. If they are Nvidia-optimized, then more than likely they'll run better with the 980s.

http://www.hardocp.com/article/2014/11/11/nvidia_geforce_gtx_980_sli_overclocked_gpu_review/7#.VK3OzCvF-6N

XDMA baby!


----------



## Jabba1977

I mean in general: is CF with the current 290X smoother than SLI 970/980?

Thanks.


----------



## tsm106

Quote:


> Originally Posted by *Regnitto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Rule of thumb, you don't want to be running PT1, it is a hammer when a chisel is all that's required. You don't want to use PT1 for throttling issues, because most likely AB is not configured properly. Follow the link in this post below for starters.
> I should recommend this to be stickied in the OP.
> 
> 
> 
> I used that link a few days ago and matched my configuration to that, but still have the same issue, although I'm not really sure how much of an issue it really is. The only performance issues I seem to have in games is CPU physics with my FX-6100........maybe what i'm seeing is the clocks just following the load and CPU bottleneck? Thinking about moving to Devil's Canyon if I get a good enough tax check.....If it's really good like last year maybe even Haswell-e, although I'm looking into getting some more monitors and setting up eyefinity too, so that may make a difference on how I upgrade the rest of my rig.
> 
> I know it's not a thermal throttle issue, as even at my oc, I never see the core go over 65c unless i leave my radiator fan on low. and my VRMs never break 67 in game and 75 in benchmark.
Click to expand...

Ok, two things to clear up usage and throttling. Usage deals with the cpu being able to process data enough to keep the gpu loaded. Usage variation is not throttling. Throttling happens when the gpu hits the temp target or powerlimit target. With a properly setup profile, AB will do exactly as instructed. Did you enable the unified monitoring option? Are you actually getting both issues, low usage and throttling?
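tsm106's distinction above (usage variation vs. genuine throttling) can be turned into a quick sanity check on a monitoring log. A minimal sketch, assuming you've exported samples of (GPU usage %, core clock MHz, core temp °C) from your monitoring tool of choice — the log format, the thresholds, and the 1200 MHz target clock are assumptions matched to this discussion, not Afterburner's actual API; 94 °C is Hawaii's stock temperature target:

```python
# Hedged sketch: classify monitoring samples as CPU-limited (low usage)
# versus genuine throttling (clock drop while the GPU is fully loaded).
# Log format and thresholds are assumptions, not any tool's real output.

TARGET_CLOCK = 1200   # your OC core clock, MHz (example value)
TEMP_TARGET = 94      # Hawaii's default temperature target, deg C

def classify(sample):
    usage, clock, temp = sample
    if clock >= TARGET_CLOCK - 10:
        return "steady"              # holding the OC clock
    if usage < 90:
        return "cpu-limited"         # clock follows load: not a throttle
    if temp >= TEMP_TARGET:
        return "thermal-throttle"    # fully loaded and hot: real throttling
    return "power-throttle"          # loaded, cool, still downclocked

# A few made-up samples: (usage %, clock MHz, temp C)
log = [(99, 1200, 65), (45, 1080, 60), (100, 1050, 95), (100, 1100, 70)]
print([classify(s) for s in log])
```

If most of the low-clock samples come back "cpu-limited", the clocks are just following the load (e.g. an FX-6100 bottleneck) rather than hitting a temperature or power limit.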


----------



## Regnitto

Quote:


> Originally Posted by *tsm106*
> 
> Ok, two things to clear up usage and throttling. Usage deals with the cpu being able to process data enough to keep the gpu loaded. Usage variation is not throttling. Throttling happens when the gpu hits the temp target or powerlimit target. With a properly setup profile, AB will do exactly as instructed. Did you enable the unified monitoring option? Are you actually getting both issues, low usage and throttling?


Depends on the game, but yes, both. The throttling only happens when usage is under 75%; otherwise it's rock solid.

edit: here's screen of my AB settings:



and here is my OC settings (taken at idle)



I will see about taking video of gameplay with OSD to catch the issue and post. just played Diablo 3 and had 100% usage and steady full clock throughout.


----------



## tsm106

^^Your settings are wrong. You enabled "extend OFFICIAL overclock limits" and unofficial at the same time! Needless to say, it's not a good idea to run both official and unofficial modes at the same time.


----------



## Gumbi

Quote:


> Originally Posted by *tsm106*
> 
> ^^Your settings are wrong. You enabled "extend OFFICIAL overclock limits" and unofficial at the same time! Needless to say, it's not a good idea to run both official and unofficial modes at the same time.


What's bad about it? I think I have the same settings myself and haven't run into issues thus far.


----------



## Regnitto

Quote:


> Originally Posted by *tsm106*
> 
> ^^Your settings are wrong. You enabled "extend OFFICIAL overclock limits" and unofficial at the same time! Needless to say, it's not a good idea to run both official and unofficial modes at the same time.


so turn off official and run with unofficial? keep with powerplay or without?


----------



## Regnitto

Turned off "extend official overclocking limits" and left unofficial mode with PowerPlay turned on. Just played a little BF4: 30-50% GPU usage, core fluctuating between 1050-1150, never hitting my full 1200 clock, with CPU usage on all 6 cores running 75-95%.


----------



## Mercy4You

Quote:


> Originally Posted by *Jabba1977*
> 
> I speak in general...is CF with actual 290x smoother than sli 970/980.
> 
> Thanks.


In general 1 of each is smoothest.


----------



## kizwan

Quote:


> Originally Posted by *Regnitto*
> 
> turned off "extend official overclocking limits" and left unofficial with powerplay turned on, just played a little bf4........30-50% gpu usage core fluctuating between 1050-1150, never goes to my full 1200 clock, cpu usage all 6 cores running 75-95%


Untick "Unlock Voltage Monitoring" - I think there was an issue with this setting, but I can't remember what it is. It's probably already fixed, but turn it off anyway.
Untick "Extend official overclocking limits"
Disable "Unofficial overclocking mode"

Try the above first.


----------



## Archea47

Quick warning: Aquacomputer blocks don't fit on (at least my) Tri-X 290Xs. I'm going to try milling them tomorrow to fit. The main VRM chokes Sapphire is using (bought from Newegg a few months ago) must be taller than reference: the VRM cooler doesn't touch the thermal pad, and the chokes are hitting the top of the section cut out for them (verified with a little paste on them and a test fit).


----------



## JourneymanMike

Quote:


> Originally Posted by *Archea47*
> 
> Quick warning: aquacomputer blocks don't fit on (at least my) Tri-X 290Xs. Going to try milling them tomorrow to fit. The main VRM chokes Sapphire is using (bought from Newegg a few months ago) must be taller than reference. The VRM cooler doesn't touch the thermal pad and the chokes are hitting the top of the section cut out for them (verified with a little paste on them and test fitting)


Are you using a Reference water block on a Non-Reference card? If so


----------



## Regnitto

Quote:


> Originally Posted by *kizwan*
> 
> Untick "Unlock Voltage Monitoring" - I think there is issue with this setting but I can't remember what it is. Probably already fixed but turn this off anyway.
> Untick "Extend official overclocking limits"
> Disabled "Unofficial overclocking mode"
> 
> Try the above first.


I'll try that after work


----------



## Jabba1977

Quote:


> Originally Posted by *Mercy4You*
> 
> In general 1 of each is smoothest.


I need two of them, so I want to know if CF is smoother (FCAT analysis, frame pacing, etc.) than current SLI 970/980.

I have read many reviews... HardOCP, etc.

But I'd be grateful if someone could tell me first-hand. It's for playing on one monitor (no G-Sync) at 1600p on an X79 platform.

Thanks, best regards.


----------



## braddyjr

Hi, can anyone post a screenshot of your MSI Afterburner with a Crossfire configuration? A screenshot for each card... I have doubts about my Crossfire performance.


----------



## Mercy4You

Quote:


> Originally Posted by *Jabba1977*
> 
> I need two of them, so, I want to know if CF is smoother (fcat analysis, frame-peacing, etc, etc) than actual sli of 970 / 980.
> 
> I have read many reviews...hardcop, etc, etc.
> 
> But I,ll be gratefull if someone can tell me from "first hand", is for playing with one monitor (no g-sync) at 1600p with x79 platform.
> 
> Thanks, best regards.


I know, I was joking









My plan was also to maybe go X-fire one day, but after reading many reviews I came to the conclusion that nothing beats a single GPU, especially if you want ultra-fast FPS gaming like I do (Deathmatch in BF3 and BF4).
I hate stutter and micro-stutter. I've also found articles stating that with 2 cards, while FRAPS may show a decent, stable FPS count, in real gameplay there can be a delay in what you see on your monitor.

Not to joke now: why such a big monitor? I game on a 24-inch monitor and sit 50 cm from my screen. A bigger screen would mean more water/heat/noise/power, and I'd have to sit further back to get the same result







?


----------



## Jabba1977

I have an HP ZR30W (2560x1600) because I also work with Photoshop, etc.

So I have to decide between Crossfire 290X or 970/980... I think SLI 970/980 has more microstuttering... doesn't it?

Thanks.


----------



## Mercy4You

Quote:


> Originally Posted by *Jabba1977*
> 
> I have a HP ZR30W (2560x1600) because I working too with photoshop, etc, etc....
> 
> So, I have to decide between Xfire 290x or 970 / 980...I think sli 970/980 has more microstuttering...isn´t it?.
> 
> Thanks.


Lol, you're a persistent guy









Both X-fire and SLI suffer from microstutter depending on which game you run, as rdr09 already pointed out to you.

Have you considered staying at 1 GPU and just lowering the resolution on your screen while gaming? That way you won't have to go the dual-GPU route









BTW, in a few months time there will be new single GPU's which can produce very decent FPS counts on 2560 x 1600 monitors


----------



## DarthBaggins

I've been debating snagging a couple of reference 290s and slapping some waterblocks on them. The only thing keeping me from getting a pair is whether my PSU will handle them (I plan to keep stock clocks on the 290s). The only time I might run Crossfire is for BF4 and a few other games; other than that they'll be folding. My current screen setup is 3840 x 1080, but I tend to game on one monitor, so 1920 x 1080 is preferred, and from what I've heard these can handle that with little to no issue.


----------



## Jabba1977

I don't want to lower the resolution!!! Why???....

I want the best possible Xfire/SLI solution for 1600p... is that not possible?... AaaAAAaaAAarg!!!

Thanks.


----------



## Mercy4You

Quote:


> Originally Posted by *Jabba1977*
> 
> I don´t want lower resolution!!! Why???....
> 
> I want best solution possible Xfire / SLI for 1600p... is not possible?...AaaAAAaaAAarg!!!
> 
> Thanks.


Now ur mad lol

Did you see my last remark I edited in my last post?

""BTW, in a few months time there will be new single GPU's which can produce very decent FPS counts on 2560 x 1600 monitors







"" (R9 390X among others)


----------



## sugarhell

Quote:


> Originally Posted by *Jabba1977*
> 
> I need two of them, so, I want to know if CF is smoother (fcat analysis, frame-peacing, etc, etc) than actual sli of 970 / 980.
> 
> I have read many reviews...hardcop, etc, etc.
> 
> But I,ll be gratefull if someone can tell me from "first hand", is for playing with one monitor (no g-sync) at 1600p with x79 platform.
> 
> Thanks, best regards.


Everything is smooth if you have experience with Crossfire and SLI. If you don't, and you don't have time to mess with things, wait for the next-gen GPUs. Don't buy a 980 or a 290X; it's pointless at the point we're at now


----------



## Jabba1977

Yeah... I have experience... SLI Titans, CF 7970s, a 7990, 780 Ti, single 980... lol, but the 290X is tempting me at its current price.

I have the opportunity to get 290X Lightnings from a friend at a very good price...

So perhaps I'll get two of those... and afterwards move to the 390X... when is it being unveiled?

Thanks.


----------



## sugarhell

Quote:


> Originally Posted by *Jabba1977*
> 
> Yeaa...I have experience... SLI TITAN, CF 7970, 7990, 780ti, single 980...looollll, but 290x are tempting me at his actual price.
> 
> I have the oportunitty tu put 290x lightning from friend at very good price...
> 
> So, perhaps, I put two of this... and, after, moves to 390x...when is his presentation?
> 
> Thanks.


If you have experience, then why do you ask? Microstutter is not an objective matter: you either see it or you don't. Even so, frame variance on Crossfire nowadays is less than 5 ms, impossible to see with the human eye


----------



## Mercy4You

Quote:


> Originally Posted by *Jabba1977*
> 
> Yeaa...I have experience... SLI TITAN, CF 7970, 7990, 780ti, single 980...looollll, but 290x are tempting me at his actual price.
> 
> I have the oportunitty tu put 290x lightning from friend at very good price...
> 
> So, perhaps, I put two of this... and, after, moves to 390x...when is his presentation?
> 
> Thanks.


You should ask 'the9quad'; he's a member here, runs 2x R9 290s, and has lots of experience with them









But I agree with sugarhell: it's better to wait for the next-gen GPUs, which are around the corner now


----------



## Jabba1977

I ask because, since my CF 7970s, I haven't put any AMD/ATI cards in my system... and I think CF has evolved further than SLI on the latest 970/980, despite their power....

I have experience with RivaTuner, ATITool, injectors, etc.

That's why I want to know....

Thanks.


----------



## Mercy4You

Quote:


> Originally Posted by *Jabba1977*
> 
> I ask because , since CF 7970, I don,t put any AMD/ATI in my system...and, I think CF is more evolutionated than SLI with the latest 970/980 despite of his powerfull....
> 
> I have experience with rivaturner, Ati_tools, injector, etc, etc...
> 
> Is for this reason, I want to know....
> 
> Thanks.


I just noticed you have a 1200 W PSU, so you should definitely get 2 or 3 GPUs









(joking, again...)


----------



## Archea47

Quote:


> Originally Posted by *JourneymanMike*
> 
> Are you using a Reference water block on a Non-Reference card? If so


It's a 'reference' board, and in a couple of threads around here I was told it should work. Of course... in this thread it was also suggested I take some measurements to be sure


----------



## Jabba1977

For me it's not a joke... because I work hard and money is money... and yes, I have a 1200W PSU; my rig has had a Titan Z, quad 590s, etc. mounted in it...

So, all I want to know is whether 290X CF works better in terms of microstutter than 980 SLI...

ThaaaAAAaaaanks.


----------



## sugarhell

Quote:


> Originally Posted by *Jabba1977*
> 
> For me, is not a joke...because I worked hard and money, is money...and yes, I have a 1200W PSU, in my rig has been mounted, TITAN-Z, 590 Quad, etc, etc, etc...
> 
> So, only I want is if 290X CF working best in terms of MS, than 980 Sli...
> 
> ThaaaAAAaaaanks.


You have a titan z? Okay


----------



## velocityx

Crossfire with two 290s is butter smooth in the games I play: Tomb Raider, Diablo 3, Dragon Age: Inquisition, Battlefield 3 and 4, Crysis 3. I believe it's a smoother combo than SLI right now because of the XDMA engine and the lack of bridges.


----------



## Mercy4You

Quote:


> Originally Posted by *Jabba1977*
> 
> For me, is not a joke...because I worked hard and money, is money...and yes, I have a 1200W PSU, in my rig has been mounted, TITAN-Z, 590 Quad, etc, etc, etc...
> 
> So, only I want is if 290X CF working best in terms of MS, than 980 Sli...
> 
> ThaaaAAAaaaanks.


Ok, no jokes anymore







Here's a link with the9quad discussing frame times (= microstutter) with X-fire and SLI

http://www.overclock.net/t/1469627/retired-battlefield-4-frame-time-analyzer-version-4-2-released-major-release/160

Ask him, he's a very nice guy, and then decide what you're going to do


----------



## Jabba1977

Ok, thanks... I think perhaps I'll try 290X CF.

Yes, I had a Titan Z, but for me it was worthless because of the problems with the latest drivers and the internal SLI, so I returned it some time ago.

Regards.


----------



## JourneymanMike

Quote:


> Originally Posted by *Archea47*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JourneymanMike*
> 
> Are you using a Reference water block on a Non-Reference card? If so
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's a 'reference' board and on a couple threads round here I was told it should work. Of course ... in this thread it was also suggested I take some measurements to be sure
Click to expand...

Don't even try milling the chokes down! You'll end up with junk!


----------



## tsm106

Quote:


> Originally Posted by *JourneymanMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Archea47*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JourneymanMike*
> 
> Are you using a Reference water block on a Non-Reference card? If so
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's a 'reference' board and on a couple threads round here I was told it should work. Of course ... in this thread it was also suggested I take some measurements to be sure
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> Don't even try milling the chokes down! You'll end up with junk!
> 
> 
> 
> 
> 
> 
> 
> <<<---
Click to expand...

lol, he was referring to milling the block not the inductors.


----------



## DividebyZERO

*revs up grinder* lets fix this dilemma now!


----------



## Archea47

Quote:


> Originally Posted by *tsm106*
> 
> lol, he was referring to milling the block not the inductors.


Yep, that's the case









I've made some progress but I'm going slow

What I'm not sure about is how much clearance I need between the chokes and the waterblock - will it short if it touches? Copper is obviously conductive, but I don't know if current runs through the outer part of the chokes or just the coil inside

Edit: Added picture. Taking a break now, my lady came over to cool my nerves


----------



## JourneymanMike

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JourneymanMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Archea47*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JourneymanMike*
> 
> Are you using a Reference water block on a Non-Reference card? If so
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's a 'reference' board and on a couple threads round here I was told it should work. Of course ... in this thread it was also suggested I take some measurements to be sure
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> Don't even try milling the chokes down! You'll end up with junk!
> 
> 
> 
> 
> 
> 
> 
> <<<---
> 
> Click to expand...
> 
> lol, he was referring to milling the block not the inductors.
Click to expand...

I should have read the post from the beginning!!


----------



## JourneymanMike

Quote:


> Originally Posted by *Archea47*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> lol, he was referring to milling the block not the inductors.
> 
> 
> 
> Yep, that's the case
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've made some progress but I'm going slow
> 
> What I'm not sure about is how much clearance I need between the chokes and the waterblock - will it short if it touches? Copper is obviously conductive but I don't know if current runs through the outer part of the chokes or just the coil inside
> 
> Edit: Added picture. Talking a break now, my lady came over to cool my nerves
Click to expand...

Take a little off and try the fit, and so on... until you get the clearance that you need! Then, just for good measure, if you have enough material, take a couple more thousandths off


----------



## tsm106

Quote:


> Originally Posted by *Archea47*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> lol, he was referring to milling the block not the inductors.
> 
> 
> 
> Yep, that's the case
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've made some progress but I'm going slow
> 
> What I'm not sure about is how much clearance I need between the chokes and the waterblock - will it short if it touches? Copper is obviously conductive but I don't know if current runs through the outer part of the chokes or just the coil inside
> 
> Edit: Added picture. Talking a break now, my lady came over to cool my nerves
Click to expand...

Measure the height the cap is over the inductor. There's your clearance requirement.
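tsm106's measuring advice reduces to simple arithmetic. A sketch with hypothetical caliper readings — every number here is a made-up placeholder to substitute with your own measurements, and the clearance term is the cap-over-inductor height he describes:

```python
# Hedged sketch: how much to mill off a block's VRM pocket so the
# chokes clear it. All heights are hypothetical placeholders, in inches.

choke_height = 0.155   # PCB to top of the tallest choke (your measurement)
pocket_depth = 0.130   # depth of the block's cutout over the chokes
clearance    = 0.005   # safety gap so the copper never touches the choke

mill_depth = choke_height - pocket_depth + clearance
print(f"take about {mill_depth:.3f} in off the pocket floor")
```

With these placeholder numbers the answer comes out to 0.030 in; the point is only that the measured overhang sets the clearance term, and anything milled beyond choke height minus pocket depth plus that gap is wasted material.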


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Jabba1977*
> 
> I need two of them, so, I want to know if CF is smoother (fcat analysis, frame-peacing, etc, etc) than actual sli of 970 / 980.
> 
> I have read many reviews...hardcop, etc, etc.
> 
> But I,ll be gratefull if someone can tell me from "first hand", is for playing with one monitor (no g-sync) at 1600p with x79 platform.
> 
> Thanks, best regards.


CF runs like one big card at 1440p, 1600p and 4K


----------



## nosequeponer

I run BF4 all maxed on one R9 290X at 1440p downscaled to 1080p with no problem, and over 60 fps


----------



## Archea47

First block is done, but I'm hesitant to celebrate on that one as the copper flaked off showing a tiny edge of an o-ring. Should be fine but I'll leak test it overnight and put a little silicone gasket sealant if it doesn't leak

Second block is almost done


----------



## wermad

What's the purpose in milling the AC blocks? To fit a non-reference model?


----------



## tsm106

Quote:


> Originally Posted by *wermad*
> 
> What's the purpose in milling the AC blocks? fit a non reference model?


Sapphire using taller caps than reference spec again, surprise!

Quote:


> Originally Posted by *Archea47*
> 
> First block is done, but I'm hesitant to celebrate on that one as the copper flaked off showing a tiny edge of an o-ring. Should be fine but I'll leak test it overnight and put a little silicone gasket sealant if it doesn't leak
> 
> Second block is almost done


Don't mill the other one till you test the milled one first. No sense in ruining two blocks right? Got an air pressure dial and pump?


----------



## virpz

The best thing one could do to a 290/290X is to put it under water. You get more performance and better overclocking
Quote:


> Originally Posted by *tsm106*
> 
> Sapphire using taller caps than reference spec again, surprise!
> Don't mill the other one till you test the milled one first. No sense in ruining two blocks right? Got an air pressure dial and pump?


AC blocks are the best.

Wouldn't it be easier to just replace the caps?


----------



## tsm106

Quote:


> Originally Posted by *virpz*
> 
> AC blocks are the best.
> 
> Wouldn't be easier to just replace the caps ?


Let me guess... You read that review that showed the AC blocks at the top right?

No, it would be silly to replace the caps. Possibly destroy a $120 block or a $300 GPU? Hmm...


----------



## virpz

Quote:


> Originally Posted by *tsm106*
> 
> Let me guess... You read that review that showed the AC blocks at the top right?


I own two of these blocks .
Quote:


> Originally Posted by *tsm106*
> 
> No, it would be silly to replace the caps. Possible destroy $120 block or $300 gpu? Hmm...


You need to know at least the basics of electronics/soldering...
Through-hole caps: a five-minute task, not silly at all.

Good luck !


----------



## wermad

Quote:


> Originally Posted by *tsm106*
> 
> Sapphire using taller caps than reference spec again, surprise!


Damn. Did they keep the AMD logo on the PCB? I purchased two new ref 290s and the Koolance blocks went on them without a hitch. The third is about 9 months old, and all of them do have the logo. Seems like they're pulling an XFX


----------



## Bertovzki

Quote:


> Originally Posted by *Archea47*
> 
> First block is done, but I'm hesitant to celebrate on that one as the copper flaked off showing a tiny edge of an o-ring. Should be fine but I'll leak test it overnight and put a little silicone gasket sealant if it doesn't leak
> 
> Second block is almost done


What's the deal with that? I have an R9 290X Tri-X OC and the Aquacomputer Hawaii block; is there a compatibility problem?


----------



## Archea47

Quote:


> Originally Posted by *wermad*
> 
> Damn, they did keep the amd logo on the pcb? I did purchase two new ref 290s and the koolance block went on them without a hitch. The third is about 9 months old and all do have the logo. Seems like they're pulling an xfx



Yep, it has the AMD logo next to the PCIe slot.

The component that interfered was the chokes. On the board, the physical order goes VRMs -> chokes/inductors -> capacitors (I'm not an EE, so sorry if the terms could be better). I had to take ~0.030" off for the blocks to fit the cards.

Do other waterblocks cool the chokes?


----------



## tsm106

Quote:


> Originally Posted by *virpz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> No, it would be silly to replace the caps. Possibly destroy a $120 block or a $300 GPU? Hmm...
> 
> 
> 
> You need to know atleast the basics on electronics/soldering...
> Thru hole caps, five minutes task, not silly at all.
> 
> Good luck !
Click to expand...

You're right, it's not silly, it's freaking ******ed! Buy a new card, and replace the caps on it. Brilliance in action.


----------



## virpz

Quote:


> Originally Posted by *tsm106*
> 
> You're right, it's not silly, it's freaking ******ed! Buy a new card, and replace the caps on it. Brilliance in action.


By removing the heatsink you are pretty much voiding any warranty from most brands.









Replacing capacitors is not rocket science.









Milling a water block, failing, and then using silicone on copper to prevent a leak is certainly the way to go.









You are totally ignorant about electronics.

Treat that anger disorder.


----------



## Bertovzki

Is there anyone here who has not had a problem fitting the Hawaii block to a 290X Tri-X OC? Or do I have to damn well mill my new block or sell the GPU?


----------



## Mega Man




----------



## Bertovzki

Quote:


> Originally Posted by *Mega Man*


?


----------



## Mega Man

It was not directed at you, but I don't want to quote them


----------



## Bertovzki

Quote:


> Originally Posted by *Mega Man*
> 
> was not directed at you, but i dont want to quote them


All good lol, I am feeling a bit freaked out. I am hoping I do not have another mod to do, but I can if I have to; hell, I've done so many now it will just be one more. It is just very annoying to hear that Sapphire sells a card as reference, or at least others classify it as such (like EK, for example), but then here it is not reference, it has different chokes. That makes me choke,


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Mega Man*


What does that mean anyways ??


----------



## wermad

Quote:


> Originally Posted by *Bertovzki*
> 
> is there anyone here who has not had a problem fitting the hawaii block to a 290 X tri X OC ? , or do i have to damn well mill my new block or sell GPU ?


I purchased a new *290* Tri-X OC from newegg.com in November and another new one off eBay at that time too. I also picked up a pre-owned one from eBay, a few months old. All three of my Koolance blocks went on with no issues (v1.1, aka the "short block"). This may be something with the 290X. I would shoot an email to Sapphire to confirm whether they've made a SKU change. MSI is also known to make tweaks frequently, but they typically do update the SKU numbers.


Spoiler: Warning: Spoiler!








Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> What does that mean anyways ??


Look up "Jerry Springer show" and "Jerry Springer show chant". It's a very old and raunchy (also probably very fake) talk show with mostly trash for guests, fighting most of the time. Eventually, one of the "security" bouncer guys got his own spin-off show. Whenever a fight would break out, or was about to, the audience would chant "Jerry! Jerry!...".


----------



## DividebyZERO

Quote:


> Originally Posted by *wermad*
> 
> I purchases a new *290* tri-x oc from newegg.com in November and another new one off ebay at that time too. I also picked up a preowned one from ebay a few months old. All three of my koolance blocks went on w/ no issues (v1.1, aka "short block). This maybe something w/ the 290X. I would shoot an email to sapphire to confirm and if they've made an sku change. MSI is also known to make tweaks frequently but they typically do update the sku numbers.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Look up "Jerry Springer show" and "Jerry springer show chant". Its a very old and raunchy (also probably very fake) talk show w/ mostly trash for guests fighting most of the time. Eventually, one of the "security" bouncer guys got his own spin off show. When ever a fight would break or about to break, the audience would chant "Jerry!, Jerry!, ...".


Haha, Jerry Springer is an American crap show, so I wouldn't expect anyone outside the US to know what that meant. I wondered why they had that one in the emoticons.


----------



## virpz

Gonna join the club

Powercolor R9 290X cooled by Aquacomputer Kryographics + Active Backplate.

http://www.techpowerup.com/gpuz/details.php?id=g6k6k


----------



## Bertovzki

Quote:


> Originally Posted by *wermad*
> 
> I purchased a new *290* Tri-X OC from newegg.com in November and another new one off eBay at that time too. I also picked up a pre-owned one from eBay, a few months old. All three of my Koolance blocks went on with no issues (v1.1, aka the "short block"). This may be something with the 290X. I would shoot an email to Sapphire to confirm whether they've made a SKU change. MSI is also known to make tweaks frequently, but they typically do update the SKU numbers.
> 
> 
> Spoiler: Warning: Spoiler!


Thanks, I can only hope I have no problem either; it gives me hope.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *wermad*
> 
> I purchased a new *290* Tri-X OC from newegg.com in November and another new one off eBay at that time too. I also picked up a pre-owned one from eBay, a few months old. All three of my Koolance blocks went on with no issues (v1.1, aka the "short block"). This may be something with the 290X. I would shoot an email to Sapphire to confirm whether they've made a SKU change. MSI is also known to make tweaks frequently, but they typically do update the SKU numbers.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Look up "Jerry Springer show" and "Jerry Springer show chant". It's a very old and raunchy (also probably very fake) talk show with mostly trash for guests, fighting most of the time. Eventually, one of the "security" bouncer guys got his own spin-off show. Whenever a fight would break out or was about to, the audience would chant "Jerry! Jerry! ...".
> Quote:
> 
> 
> 
> Originally Posted by *DividebyZERO*
> 
> Haha, Jerry Springer is an American crap show, so I wouldn't expect anyone outside the US to know what that meant. I wondered why they had that one in the emoticons.

LoooooooooL








I used to watch it here for the... umm... the nakedness. Long time ago.
And yes, it's utter, utter 'murican rubbish, like Ricki Lake and all of that format......
Extremely off topic


----------



## Mega Man

I can only imagine a life without knowing about Jerry.... it would be such a good life


----------



## Arizonian

Quote:


> Originally Posted by *virpz*
> 
> Gonna join the club
> 
> Powercolor R9 290X cooled by Aquacomputer Kryographics + Active Backplate.
> 
> http://www.techpowerup.com/gpuz/details.php?id=g6k6k


Congrats - added


----------



## Mercy4You

Quote:


> Originally Posted by *Mega Man*
> 
> i can only imagine a life without knowing about jerry .... it would be so good of a life


I just LOVED to watch Jerry Springer, made me feel all fuzzy and warm inside about having NO family...


----------



## Jabba1977

Very happy with my "NEW BABY" --> 290X LIGHTNING (first version)... I think I WON THE SILICON LOTTERY!!









Let´s rock...
















1282 Core / 1670 Mem (on air)



http://www.3dmark.com/3dm/5430640?


----------



## DarthBaggins

Think I'll be snagging a Ref. 290x w/in a week unless AMD announces the next line-up


----------



## hyp36rmax

Quote:


> Originally Posted by *Bertovzki*
> 
> is there anyone here who has not had a problem fitting the Hawaii block to a 290X Tri-X OC? Or do I have to damn well mill my new block or sell the GPU?


There are GPU blocks available that will fit; I own two Vapor-X 290Xs (similar cooler) with EK backplates, no problems.


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> CF runs like one big card 1440p , 1600p and 4k


Yup, just like one big card. A little tearing in Titanfall (just this game), but no biggie in 4K.


----------



## DarthBaggins

That's odd; my 7870 & 270X CF handles Titanfall perfectly, but of course that's at 1080p, not 4K. Either way that's good, especially if it's only in that one game.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *hyp36rmax*
> 
> There are GPU blocks available that will fit; I own two Vapor-X 290Xs (similar cooler) with EK backplates, no problems.


Hmm ek blocks








Those don't have those stupid oval rings for the bridge, do they??

Quote:


> Originally Posted by *rdr09*
> 
> Yup, just like one big card. A little tearing in Titanfall (just this game), but no biggie in 4K.


Add a third and it's one big card with 12 GB of video RAM.....








Der


----------



## DarthBaggins

Only issue I have with EK is that their blocks for these cards are in nickel, not copper, or else I'd use them.


----------



## hyp36rmax

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Hmm ek blocks
> 
> 
> 
> 
> 
> 
> 
> 
> *Those don't have those stupid oval rings for the bridge, do they??*
> Add a third and it's one big card with 12 GB of video RAM.....
> 
> 
> 
> 
> 
> 
> 
> 
> Der


I think they do use the FC Link with the ovals as a bridge; I'm using a coupler instead.



Quote:


> Originally Posted by *DarthBaggins*
> 
> Only issue I have with EK is that their blocks for these cards are in nickel, not copper, or else I'd use them.


I prefer copper as well, as it would match my Alphacool radiators. I don't have much of a selection for a non-reference GPU, but hey, at least it's still copper underneath that nickel plating.


----------



## Gumbi

Quote:


> Originally Posted by *Jabba1977*
> 
> Very happy with my "NEW BABY" --> 290X LIGHTNING (first version)... I think I WON THE SILICON LOTTERY!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Let´s rock...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1282 Core / 1670 Mem (on air)
> 
> 
> 
> http://www.3dmark.com/3dm/5430640?


Wow, sweet, man. Especially those mem clocks. Did you use any aux voltage to get your clocks more stable?


----------



## Jabba1977

I think my mem can go higher!!!!

This test was only with +30 on mem, aux +50, and +200 mV (less than 1.35 V on my card), on the LN2 BIOS with the 8+8+6 connectors. Even so, my card can do 1260-1270 with 1650 mem on the normal BIOS with 8+8. I reach about 1200 on stock voltage. For gaming I think it's very, very good at stock voltage, underclocked to 1180 / 6000.

Soon I'll receive another 290X to build my CF... I'll add a TRI-X 290X OC... and I want to run the CF at about 1150-1200 / 5500-6000, depending on when the second card arrives, temps, silicon, etc., etc.

Core tops out at about 1286-1290... Mem, I think my card could go higher, but I haven't flashed a modded BIOS because of black screens.

Very, very happy with this 290X; the temps are incredible!!! And on air...

Regards.


----------



## kizwan

Quote:


> Originally Posted by *DarthBaggins*
> 
> Only issue I have with ek is their blocks for these cards are in Nickel not copper or else I'd use their blocks


EK does have copper. Mine is copper.


----------



## DarthBaggins

Acrylic topped? Because when I use their configurator it shows nickel for Rev 2.0.


----------



## kizwan

Quote:


> Originally Posted by *DarthBaggins*
> 
> Acrylic topped? Because when I use their configurator it shows nickel for Rev 2.0.
> 
> 
> Spoiler: Warning: Spoiler!


Better double-check their online store too. The Cooling Configurator appears to mistakenly show the nickel block for "3831109869000", but this EAN no. is for the copper block.

http://www.ekwb.com/shop/blocks/vga-blocks/ati-radeon-full-cover-blocks/radeon-rx-200-series/ek-fc-r9-290x-rev-2-0.html


----------



## DarthBaggins

hmmm guess someone dropped the ball on the stock photo, lol


----------



## grifers

Hi guys, I'm new here, with an ASUS 290X card. Are these results good for one 290X in Unigine Heaven?

http://s309.photobucket.com/user/grifers/media/heaven2015-01-0921-06-54-11.png.html

I've seen 68 fps around this forum with these settings







with the same settings. I have a 2600K at up to 4800 MHz with HT on; do I have a bottleneck? 8 GB RAM. Before, I had 7970s in Crossfire. I've reinstalled new drivers, etc.... Please help me!!!

P.D. - Sorry for my poor English, I'm Spanish.

P.D. - The overclock is 1050/1300 with CCC Overdrive.


----------



## hyp36rmax

Quote:


> Originally Posted by *grifers*
> 
> Hi guys, I'm new here, with an ASUS 290X card. Are these results good for one 290X in Unigine Heaven?
> 
> http://s309.photobucket.com/user/grifers/media/heaven2015-01-0921-06-54-11.png.html
> 
> I've seen 68 fps around this forum with these settings
> 
> with the same settings. I have a 2600K at up to 4800 MHz with HT on; do I have a bottleneck? 8 GB RAM. Before, I had 7970s in Crossfire. I've reinstalled new drivers, etc.... Please help me!!!
> 
> P.D. - Sorry for my poor English, I'm Spanish.


Crossfire 7970s give a single R9 290X a good fight. I came from overclocked 7970s in Crossfire myself and couldn't justify a single card unless I went Crossfire 290Xs.


----------



## grifers

Quote:


> Originally Posted by *hyp36rmax*
> 
> Crossfire 7970s give a single R9 290X a good fight. I came from overclocked 7970s in Crossfire myself and couldn't justify a single card unless I went Crossfire 290Xs.


With two 7970 GHz Editions at stock clocks in Crossfire, I get 72 fps at the same settings.


----------



## Bertovzki

Quote:


> Originally Posted by *hyp36rmax*
> 
> There are GPU blocks available that will fit; I own two Vapor-X 290Xs (similar cooler) with EK backplates, no problems.


Thanks for the info, but I prefer it the other way around: I will buy a GPU for the block and give the 290X away to my dad, or start again and get a 980 and a block for it.


----------



## Aaron_Henderson

Played with my card a bit more. Now I can get it to 1170 core / 1625 mem at +50 mV, but I think I'm limited by temps on the core. Is it a pretty safe bet that if I get better cooling on the card, I might be able to take it to 1250-ish on the core? I don't want to sink $100-$200 into it and not get more out of the card. I can run benches up to 1200, but it artifacts pretty badly as temps rise well above 80°C on the core and VRM. I'd like to put a block on it, but I'd be in for the block, another rad, and most likely a new pump. I have been looking at the AIO GPU brackets, but it seems a tad ghetto, and I'm not sure it would offer the performance I'd be looking for anyway. I do have a spare H80, though, so that would be a FAR cheaper option.


----------



## hyp36rmax

Quote:


> Originally Posted by *Bertovzki*
> 
> Thanks for the info, but I prefer it the other way around: I will buy a GPU for the block and give the 290X away to my dad, or start again and get a 980 and a block for it.


The same block that fits my Vapor-X will fit your Tri-X OC as well. Most people generally pick up reference coolers to accommodate aftermarket GPU blocks, however EK has been pretty good in accommodating the most popular higher end non-reference designs as well. I don't see why you would need to give up your current card...


----------



## Bertovzki

Quote:


> Originally Posted by *hyp36rmax*
> 
> The same block that fits my Vapor-X will fit your Tri-X OC as well. Most people generally pick up reference coolers to accommodate aftermarket GPU blocks, however EK has been pretty good in accommodating the most popular higher end non-reference designs as well. I don't see why you would need to give up your current card...


What pisses me off is that EK say on their website configurator that the R9 290X Tri-X OC is a reference design. I then emailed Sapphire to confirm; the first reply was "it's not reference". I emailed back and said "either EK or Sapphire have it wrong, as the pictures of the card are identical; one of you is wrong", and the reply from Sapphire was "yes sir, it is indeed a reference design card". So for the record, guys, from both Sapphire and EK, the design is reference!! But it seems it's only 99.9% reference, which is not good enough, and we are not told this by either EK or Sapphire. I will be sending them both emails soon, and Aquacomputer too, and I may even try to get a refund from Sapphire, as they have supplied me incorrect info.
If it is indeed impossible for the Hawaii Aquacomputer block to fit, maybe mine is a one-off, or an Aquacomputer run of blocks not milled deep enough; another possibility, but wishful thinking.

Re: what is easier to sell, the card is far easier. I will never sell the block here, and the Aquacomputer is the best block, so I will get a card for the block, as I did in the first place. I bought this GPU for the block, not the other way around, so that may clear up the seemingly illogical thinking; it is in fact full of logic and reason. And if it is Sapphire and the chokes, then that is the fault, and the one to fix.
Any 290 will do.

Another option is to mill the block and be more careful than the dude who did it before. I will be talking to him when I get more time; I will get to the bottom of this and find the best solution.

If I keep the card, I lose over $200 NZD for the block. If I sell the card for $100 less than I bought it for, that's a real bargain for someone for a brand-new, unused card, and I am $100 better off and rid of the fault, so that is the logic as well.

I'd still like to hear if anyone in this club has the same combo and no problem????

I am reluctant to take the ambient cooler off to try, but I may; it's just a warranty-void issue then, so maybe I should start emailing first.

Damn, what are the chances, aye. I was so careful, researched, read, did everything right, and now I have this problem because EK and Sapphire are both inaccurate and give false information: EK by giving it the tick of approval as reference design, and Sapphire by telling me via customer support, in reply to a very direct and specific question ("is this a reference design PCB?"), "YES it is", and failing to tell me the full truth, which would have been: yes, this is a reference design, but the chokes are taller, which is in other words not reference design!!!
I also told Sapphire that I was going to use it with an Aquacomputer water block in a custom loop, to which they replied "we have to tell you this voids the warranty"; I replied "yes, I am very happy to, thanks".

As far as customer service goes with nearly any company, I feel, IMO, that you are never really talking to someone who knows what they are talking about, when you really just want to talk to an engineer. But the appropriate questions should be asked and answered either way.


----------



## Bertovzki

Quote:


> Originally Posted by *Archea47*
> 
> 
> Yep it has the AMD logo next to the PCIE slot
> 
> The component that interfered was the chokes. On the board the physical order goes VRM -> Chokes/inductors -> capacitors (I'm not a EE so sorry if the terms could be better). I had to take ~0.030" off for the blocks to fit the cards
> 
> Do other waterblocks cool the chokes?


0.030" sounds like a relatively small amount (= 0.76 mm), but enough to mill right through the block in one spot, you said?
How did you get on with the second block? If you don't mind, can you please take some close-up, clear, high-resolution photos so I can see the spot you milled through, and so I can plan how to avoid it if I choose this option, which I will if it is a good one.

Thanks in advance for any help

And re: the EK blocks from the other member's post, once again thanks for the info, any is appreciated, but if it's a nickel-only option, then it too is out of the question, as I am only interested in full copper.

EDIT: I missed the post above, so there is a full-copper option if I really have to go this route.
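For anyone sanity-checking the milling numbers being thrown around above, here is a minimal conversion sketch. It assumes the ~0.030" removed and the ~0.050" wall thickness quoted in the posts; both figures are approximate and from the thread, not from any manufacturer spec.

```python
# Sanity check of the milling figures discussed in this thread.
# Assumption: ~0.030 in milled off a wall only ~0.050 in thick
# (both numbers quoted from the posts above, treated as approximate).

MM_PER_INCH = 25.4

def inches_to_mm(inches: float) -> float:
    """Convert a length in inches to millimetres."""
    return inches * MM_PER_INCH

removed = 0.030   # material milled off, in inches
wall = 0.050      # approximate wall thickness, in inches

print(f"Removed:        {inches_to_mm(removed):.2f} mm")        # ~0.76 mm
print(f"Remaining wall: {inches_to_mm(wall - removed):.2f} mm") # ~0.51 mm
```

With only about half a millimetre of copper left against a sealing o-ring, it's easy to see why a hand-milled block could end up compromised in one spot.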


----------



## wermad

Full copper: go with the AC, it's smexy








Also, XSPC is making some pretty solid copper blocks, though the top covers the base entirely. I don't like the multi-hole port system; too busy, imho.

Tbh, if you're interested in the GTX 980, wait for PI and see what the 390X can do. We're close if AMD is on schedule (though past experience may hint at a few, ahem, delays).


----------



## pshootr

Quote:


> Originally Posted by *wermad*
> 
> Full copper: go with the AC, it's smexy
> 
> Also, XSPC is making some pretty solid copper blocks, though the top covers the base entirely. I don't like the multi-hole port system; too busy, imho.
> 
> Tbh, if you're interested in the GTX 980, wait for PI and see what the 390X can do. We're close if AMD is on schedule (though past experience may hint at a few, ahem, delays).


I like the idea of full copper, but it does oxidize much faster than nickel. I think that is why heat shields etc. are plated with nickel.


----------



## Bertovzki

Quote:


> Originally Posted by *wermad*
> 
> Full copper: go with the AC, it's smexy
> 
> Also, XSPC is making some pretty solid copper blocks, though the top covers the base entirely. I don't like the multi-hole port system; too busy, imho.
> 
> Tbh, if you're interested in the GTX 980, wait for PI and see what the 390X can do. We're close if AMD is on schedule (though past experience may hint at a few, ahem, delays).


Thanks for the feedback, wermad. Yeah man, that Aquacomputer Hawaii is damn nice and a great performer, hence my preference to keep the block over the GPU.








Of course, this is after I get more info from Archea47, hopefully, and from anyone else who may have good or bad experience on this subject; then I can assess the situation better and make a plan.

I think this combo of block and GPU is very rare; as far as I know, we may be the only two.

And regarding the 980 option: if this whole situation does not work out quickly and easily, I regretfully have to say the build will be postponed in a major way and will come to a complete halt for at least 6 months. I will do nothing whatsoever and forget about it; I have other, more important things in life to do, and will happily do so. Other things require the money it would cost to get another card.
In which case I will have an entirely new GPU and block for sure.

Why would anyone do that, you may ask? Really? Answer: because I do not care. I have not played a single minute of a game in well over a year! This is a project as an art form, more about the case and the build than what it will be used for. I have waited 4 years to do this; I will wait some more if I have to, but I hope not to.


----------



## pshootr

Quote:


> Originally Posted by *Bertovzki*
> 
> Thanks for the feedback, wermad. Yeah man, that Aquacomputer Hawaii is damn nice and a great performer, hence my preference to keep the block over the GPU.
> 
> Of course, this is after I get more info from Archea47, hopefully, and from anyone else who may have good or bad experience on this subject; then I can assess the situation better and make a plan.
> 
> I think this combo of block and GPU is very rare; as far as I know, we may be the only two.
> 
> And regarding the 980 option: if this whole situation does not work out quickly and easily, I regretfully have to say the build will be postponed in a major way and will come to a complete halt for at least 6 months. I will do nothing whatsoever and forget about it; I have other, more important things in life to do, and will happily do so. Other things require the money it would cost to get another card.
> In which case I will have an entirely new GPU and block for sure.
> 
> Why would anyone do that, you may ask? Really? Answer: because I do not care. I have not played a single minute of a game in well over a year! This is a project as an art form, more about the case and the build than what it will be used for. I have waited 4 years to do this; I will wait some more if I have to, but I hope not to.


Kind of like kicking the soccer ball as hard as you can, even if you're blindfolded and spinning in circles? hehe


----------



## Bertovzki

Quote:


> Originally Posted by *pshootr*
> 
> Kind of like kicking the soccer ball as hard as you can, even if you're blindfolded and spinning in circles? hehe


Sorry dude, not sure what you mean; excuse my ignorance. I feel some frustration at being dicked around by Sapphire, but what I say is true. I have things to do: urgent repairs on the car, moving house, etc., so I'm saving money, not spending. I can't continue the build if I need a new card. We'll see what happens soon; I may find an easy fix for the issue, or just get my head into other things.
I don't mind giving the GPU to my old man; he is looking for a new GPU and has been eyeing the 290. He is near 70, but has a better water-cooled PC than me.

I can then buy an R10 490 when they are about







I will see what can be done. Decisions to make, like: do I take the card apart and just see what I have to do? Will see soon.


----------



## Archea47

Quote:


> Originally Posted by *Bertovzki*
> 
> 0.030" sounds like a relatively small amount (= 0.76 mm), but enough to mill right through the block in one spot, you said?
> How did you get on with the second block? If you don't mind, can you please take some close-up, clear, high-resolution photos so I can see the spot you milled through, and so I can plan how to avoid it if I choose this option, which I will if it is a good one.
> 
> Thanks in advance for any help
> 
> And re EK blocks to other post from other member , once again thanks for the info , any is appreciated , but if this is only a nickel option , then this too is out of the question as i am only interested in full copper.
> 
> EDIT : i missed the post above , so there is a full copper if i really have to go this choice


The second block was a success. The copper did start to peel up a bit and one small piece chipped off, but it was away from the seal.

The first block failed the leak test. I packed the area with JB Water Weld and am going to leak test again in the morning. Yeah, yeah, JB Weld, but the block is no use at this point so it can't hurt.

If your card is like mine, and I hope it isn't, 0.030" makes a difference: it meant I could see a sliver of light between the VRM cooler on the block and the thermal pad on the VRMs. These cards I have, which may or may not be different from yours, would obviously overheat immediately, and I bought them for overclocking.

The difficulty I had milling them: 1) the thickness of the area to begin with is only about 0.050", and 2) there's a seal pressing against that area for the water channel going to the VRMs. As I milled to the point where the test fit finally cleared, the copper started lifting from the acrylic (from the o-ring?). The cutting tool I was using was brand new, but on both blocks it chipped off a couple of spots; in the wrong spot (probably combined with the metal being pushed from the glass), like on the first block, that can compromise the seal.

We'll see how it goes with the Water Weld on the leaker. FWIW I'll be running parallel (still crossing fingers).

Like Bertovzki, I really want the Aquacomputer to work. I fell in love with the card at first sight, a few days after buying my previous set of 280Xs, and was kicking myself for not going '90s. If I can't get it to work, I might buy another Aquacomputer block and have a friend try cutting it at work on a cooled CNC. I admittedly have never cut copper (except for the computer







) and it can probably be done better.


----------



## Bertovzki

Quote:


> Originally Posted by *Archea47*
> 
> The second block was a success. The copper did start to peel up a bit and one small piece chipped off, but it was away from the seal.
> 
> The first block failed the leak test. I packed the area with JB Water Weld and am going to leak test again in the morning. Yeah, yeah, JB Weld, but the block is no use at this point so it can't hurt.
> 
> If your card is like mine, and I hope it isn't, 0.030" makes a difference: it meant I could see a sliver of light between the VRM cooler on the block and the thermal pad on the VRMs. These cards I have, which may or may not be different from yours, would obviously overheat immediately, and I bought them for overclocking.
> 
> The difficulty I had milling them: 1) the thickness of the area to begin with is only about 0.050", and 2) there's a seal pressing against that area for the water channel going to the VRMs. As I milled to the point where the test fit finally cleared, the copper started lifting from the acrylic (from the o-ring?). The cutting tool I was using was brand new, but on both blocks it chipped off a couple of spots; in the wrong spot (probably combined with the metal being pushed from the glass), like on the first block, that can compromise the seal.
> 
> We'll see how it goes with the Water Weld on the leaker. FWIW I'll be running parallel (still crossing fingers).
> 
> Like Bertovzki, I really want the Aquacomputer to work. I fell in love with the card at first sight, a few days after buying my previous set of 280Xs, and was kicking myself for not going '90s. If I can't get it to work, I might buy another Aquacomputer block and have a friend try cutting it at work on a cooled CNC. I admittedly have never cut copper (except for the computer) and it can probably be done better.


Ok, thanks a heap for the feedback. I appreciate your time and effort; it does not sound too hopeful.

My old man has a milling machine, so I have a few options. Not sure what to do; most likely giving the card away and pulling the plug on this build. At this stage I'm thinking I'll cut my losses and try to sell the lot or give it away, though probably not without trying for a refund from Sapphire ("idiots" for giving wrong info), or voiding the warranty and seeing if my card fits after taking the ambient cooler off.

Or I have to buy another block that fits, which would be extremely annoying. If I do, I will try to mill the Aqua first, I guess, as there's nothing to lose; it's useless otherwise.
Damn Sapphire support!!









Thanks again. Decisions, decisions, what to do; the budget is stretched already, way beyond the original MOBO, GPU, RAM, and OS, and that was all. Hell, I've spent probably 3500 since then on tools and components.


----------



## wermad

Quote:


> Originally Posted by *pshootr*
> 
> I like the idea of full copper. But it does oxidize much faster than nickle. I think that is why heat-sheilds ect. are plated with nickle.


It tarnishes into a dull brown eventually. It really has to be put in a position with a lot of air and moisture/water to form a patina. I've seen a few builds that were never drained for over a year and the copper is fine. Think about this: the pipes that route clean water in your home are probably copper, and they have to last decades. Also, the moving water acts as a mild abrasive, so it's less likely to patina.

The one thing with nickel is the failure. It has happened and it will happen. Though there's no ill effect other than ending up with areas of small exposed copper. Nickel is just for looks, that's all.


----------



## pshootr

Quote:


> Originally Posted by *wermad*
> 
> It tarnishes into a dull brown eventually. It really has to be put in a position with a lot of air and moisture/water to form a patina. I've seen a few builds that were never drained for over a year and the copper is fine. Think about this: the pipes that route clean water in your home are probably copper, and they have to last decades. Also, the moving water acts as a mild abrasive, so it's less likely to patina.
> 
> The one thing with nickel is the failure. It has happened and it will happen. Though there's no ill effect other than ending up with areas of small exposed copper. Nickel is just for looks, that's all.


Point taken. Yes, very true. However, in the long term nickel will hold up far longer. Have you ever seen a penny dropped into salt water? The effects are large compared to nickel. Large companies have standards of quality and endurance to keep, as well as aesthetic quality.

I understand that for someone who wants to keep hardware for a few years this is no problem, but that is not the way large companies think about it as far as standards go.


----------



## pshootr

Imagine if you were a company and after a few years your machine's processor had green stuff on it.


----------



## pshootr

And even if my pipes still don't leak, how would they perform for thermal transfer?


----------



## Bertovzki

Quote:


> Originally Posted by *pshootr*
> 
> Yes, very true. However, in the long term nickel will hold up far longer. Have you ever seen a penny dropped into salt water? The effects are large compared to nickel. Large companies have standards of quality and endurance to keep, as well as aesthetic quality.
> 
> I understand that for someone who wants to keep hardware for a few years this is no problem, but that is not the way large companies think about it as far as standards go.


Is a nickel coin solid nickel, though, like the copper coin dropped into salt water, or a plated nickel coin?

Quote: "A nickel, in American usage, is a five-cent coin struck by the United States Mint. Composed of 75% copper and 25% nickel, the piece has been issued since 1866. The silver half dime, equal to five cents, had been issued since the 1790s."

25% nickel; hell, that must be a very thick layer on the outside, not like a very thin water-block layer??

It is this failure of nickel that makes me not want a block with it.


----------



## pshootr

A penny is not all copper now. But that is not the point.

Nickel is more resilient; that is the point.

If most enthusiasts wanted to look at copper, we would see more products comprised of 100% copper, even if it meant an inferior product.

If I were a manufacturer, I think I would go by the current model and plate my copper with nickel. This gives you the best of both worlds: the thermal conductivity of copper, with the added bonus of longevity from the nickel plating.

In all honesty, it would be even cheaper for them not to plate products in nickel.

Having said that, I do get the concept of sacrificing longevity for performance; after all, this is OCN. But from a manufacturing point of view...

For example, I am willing to lap my processor to the bare copper to achieve a flat surface, even if it means losing some other protective properties. However, I would not build a large-scale market with this idea in mind.


----------



## wermad

Quote:


> Originally Posted by *pshootr*
> 
> Yes, very true. However, in the long term nickel will hold up far longer. Have you ever seen a penny dropped into salt water? The effects are large compared to nickel. Large companies have standards of quality and endurance to keep, as well as aesthetic quality.
> 
> I understand that for someone who wants to keep hardware for a few years this is no problem, but that is not the way large companies think about it as far as standards go.


Lol, seems like you haven't been around long enough to see how nickel can fail. Remember, its very thin layer of nickel plating. Lots of things can cause the nickel to start eroding. Ek had a very nasty incident a few years back. There's still reports of nickel failure from different block makers but they're less common now. From what I've gathered over the years, its usually down to faulty manufacturing (process). If you'd like, take a stroll down history:

http://www.overclock.net/t/915966/please-read-before-purchasing-ek-nickel-plated-blocks-update-revised-plating-info

The best bit of advice w/ nickel blocks: buy new, so the warranty applies to you, and follow the block maker's recommendations. It won't prevent nickel failure, but if it does happen, you're covered. There's no thermal advantage to either tbh (copper vs nickel plated).


----------



## pshootr

Quote:


> Originally Posted by *wermad*
> 
> Lol, seems like you haven't been around long enough to see how nickel can fail. Remember, it's a very thin layer of nickel plating. Lots of things can cause the nickel to start eroding. EK had a very nasty incident a few years back. There are still reports of nickel failure from different block makers, but they're less common now. From what I've gathered over the years, it's usually down to a faulty manufacturing process. If you'd like, take a stroll down history:
> 
> http://www.overclock.net/t/915966/please-read-before-purchasing-ek-nickel-plated-blocks-update-revised-plating-info
> 
> The best bit of advice w/ nickel blocks: buy new, so the warranty applies to you, and follow the block maker's recommendations. It won't prevent nickel failure, but if it does happen, you're covered. There's no thermal advantage to either tbh (copper vs nickel plated).


I am not in the nickel market, and I am not trying to place a value on one over the other in any terms other than our interest.

What exactly is "so called" nickel failure anyway? Does it mean a failure of the metal itself, of the way it was applied, or of the quality of the nickel that was used in the first place?

Look for direct sources on metal composition, not just OC forums.


----------



## pshootr

Bottom line: as far as the market standard goes, copper is the best choice for thermal conductivity, and nickel is the best way to protect the copper.

Otherwise, Intel would not do it.


----------



## Bertovzki

@ Archea47 and anyone else who has answers or something to say :

I need to take photos in daylight; I have terrible light now and can't get a good shot.

I just decided to take the GPU apart and try the block now.

And I am fully naive about where the VRM, VRAM and chokes are. All I know is I have felt like "RAMing" this card down Sapphire's throat and making them "Choke".









But as far as I can tell from my investigation there is zero problem! I cannot find an issue anywhere. I looked back at page 3 million and 80, I mean 3380, for the reference of where the VRM in question is, plus anything else that needs cooling, thermal pads and paste areas.

I can't see anywhere the block does not sit down far enough, or anywhere the pads will not make contact. On the contrary, the best photos I can offer show the mounts (where you would put the screws) and the gap between the PCB and the mount, both with and without the stock thermal pad. Without the thermal pads, the block sits all the way down and the mounts just touch the PCB. If I put even one thermal pad on a VRAM chip on either side of the GPU itself, the block no longer wants to go down and there is clearance under the mounts. I have tried a piece of the stock thermal pad on every point that needs one, and to me it seems perfect!

I cannot see a problem. Am I missing something? I hope not. As I say, better pics tomorrow in the light.


Spoiler: Warning: Spoiler!






Above 3 pics, all with zero thermal pads.




And with one pad on either side of the GPU itself


If I put a pad anywhere on a VRAM or VRM, it leaves a deep impression on it and holds the whole block up off the PCB, suggesting that I have excellent contact all around.

I cannot get even close to seeing any light between the pad and any VRAM, VRM, choke or the block.


----------



## combine1237

I need to change the status of my XFX 290X to cooled by an aftermarket G10. I also need to add an Asus DCUII 290X with a G10 cooler. http://www.techpowerup.com/gpuz/details.php?id=nyh5z


----------



## grifers

Quote:


> Originally Posted by *grifers*
> 
> Hi guys, I'm new here with a 290X Asus card. Are these results good for one 290X in Unigine Heaven?
> 
> http://s309.photobucket.com/user/grifers/media/heaven2015-01-0921-06-54-11.png.html
> 
> I've seen posts in this forum getting 68 fps with the same settings. I have a 2600K up to 4800 MHz with HT on; do I have a bottleneck? 8 GB RAM. Before this I had 7970s in Crossfire. I've reinstalled new drivers and so on... Please help me!!!
> 
> P.S. - Sorry for my poor English, I'm Spanish.
> 
> P.S. - The overclock is 1050/1300 with CCC Overdrive


Guys, are these results OK for one 290X at 1050/1300? I need your help, please.


----------



## Archea47

Quote:


> Originally Posted by *Bertovzki*
> 
> @ Archea47 and anyone else who has answers or something to say :
> If i put a pad anywhere on a VRAM VRM , it leaves a deep impression on it , and holds the whole block up off the PCB , suggesting that i have excellent contact all around.
> 
> I can not get even close to seeing any light between the pad and any VRAM , VRM , Choke or the block.


You Lucky Punk!!

Whether I test-fit or screw down mine, I got zero impressions on the VRM (before modification) and the 'mounts' only touched on the VGA input side of the card.

Do you have calipers to measure the height of your chokes and the depth of the recess in the AC block for the chokes? I'm curious whether it's my cards or my blocks.


----------



## Klocek001

Today I found an Alpenfohn fan bracket I haven't been using for quite some time, and it gave me an idea: to see if I can keep my 290 Trix really cool and quiet on air.
The only spare fan I could find was a 120mm Blue Vortex. I'ma find another one (the bracket can take two 120mm fans), run them on 7V adapters and see if I can stay under 70 degrees on air with the card's fan set manually to 40%.
Here are the pics; this is actually a cool invention.


Spoiler: Warning: Spoiler!







I've just given this a try, and I'm surprised in a good way.
A 15 min run in Far Cry 4, fan at 40%, one Prolimatech Blue Vortex 120 with a 7V adapter on the bracket, 22 degrees in my room. Results: max GPU temp 65 degrees. I bet if I get one more of those fans I'll be able to go down to 30%. This Blue Vortex 120mm on a 7V adapter is nearly silent.
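The near-silent result tracks with a first-order rule of thumb: a 12V DC fan's speed falls roughly in proportion to supply voltage, so a 7V adapter lands around 58% of rated RPM. A minimal sketch under that linear-scaling assumption (the 1600 RPM rating is a made-up example, not a fan from this thread):

```python
def fan_rpm(rated_rpm: float, rated_v: float = 12.0, supply_v: float = 7.0) -> float:
    """Estimate fan speed at a reduced supply voltage (linear model).

    Real fans deviate from this near their start-up threshold, so treat
    the result as a ballpark figure only.
    """
    return rated_rpm * supply_v / rated_v

# A hypothetical 1600 RPM 120mm fan on a 7V adapter:
print(round(fan_rpm(1600)))  # ~933 RPM
```

Noise tends to fall faster than RPM, which is why 7V mods are a popular middle ground between full speed and a fan controller.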


----------



## Archea47

Well, the first block failed the second leak test. I took off the top plate, scraped all the initial water weld off, and put a bead along the outside edge of the VRM seal. I put the top plate back on and cranked her down slowly, and while I don't want to get my hopes up too much, it looks like it'll work! The water weld spread and filled in nicely, and it looks like the main gasket is fully engaged.

Going to let it set for a few hours (directions say 1 hr set time) and give it the final leak test. If this doesn't work, I think I'm just going to have to bring it to the recycler.


----------



## moorhen2

Might need to put this in the GPU forum, but I'll try here first. Having a real mare at the moment: I got myself a new card Wednesday, an XFX R9 290. I uninstalled all drivers, took out my 6970, installed the 290 and re-installed the drivers. All was well for a couple of hours, then "ping": the monitor goes black, no signal. The PC is still running normally, just a black screen; the only way to do anything is to turn off the PSU and turn it on again, and then the same thing happens again shortly after. The GPU went back for replacement and worked OK yesterday, but now the new one is randomly black-screening too: monitor goes blank, no signal, GPU fans start running fast. Any ideas? I am pulling my hair out trying to find the problem; surely I can't have got 2 duff cards. Any help would be much appreciated.

Sorry, I am in the GPU section. lol


----------



## Regnitto

Quote:


> Originally Posted by *Regnitto*
> 
> turned off "extend official overclocking limits" and left unofficial with powerplay turned on, just played a little bf4........30-50% gpu usage core fluctuating between 1050-1150, never goes to my full 1200 clock, cpu usage all 6 cores running 75-95%


I found a solution to this problem, in BF4 at least. By turning the resolution scaling up to 200%, it is now holding the full OC steady and keeping the load at 95-99%. I guess 1920x1080 just wasn't enough of a challenge for the card? I still have the issue in NFS: Most Wanted (2013), so I guess it must be game-related rather than card-related.
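The 200% finding also checks out on paper: if the resolution-scale slider multiplies each axis (the usual reading of such sliders), the card renders four times the pixels of native 1080p, which explains the load jumping to 95-99%. A quick sketch under that per-axis assumption:

```python
def rendered_pixels(width: int, height: int, scale_pct: int) -> int:
    """Pixels actually rendered if the scale factor applies to each axis."""
    s = scale_pct / 100
    return int(width * s) * int(height * s)

base = rendered_pixels(1920, 1080, 100)    # native 1080p: 2,073,600 px
scaled = rendered_pixels(1920, 1080, 200)  # 3840x2160: 8,294,400 px
print(scaled // base)  # 4 -- four times the shading work per frame
```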


----------



## Bertovzki

Quote:


> Originally Posted by *Archea47*
> 
> You Lucky Punk!!
> 
> If I both test fit or screw down mine I got zero (before modification) impressions on the VRM and the 'mounts' only touched on the VGA input side of the card
> 
> Do you have calipers to measure the height of your chokes and the depth of your recess on the AC block for the chokes? Im curious if it's my cards or my blocks


Yeah, lucky punk alright, but I fully feel your pain. It just totally sucks, and I'm absolutely sure I have not missed anything.

I need a diagram of specifically what the chokes even are; I still don't know. I know the VRAM and VRM, as I looked at the schematic with red and green highlights on these on page 3380, but I have checked clearance on all the contact surfaces before any pads and after, and clearly have plenty of contact. My only concern was the complete opposite of your scenario, in that it holds the block up in the air when sitting on the pads, so I was more concerned about whether it was contacting the GPU properly, but that also gets a good dose of thermal paste transferred, so that is OK too.

I will do all I can to help find the source of this problem. I will go visit the old man, use his calipers and measure every aspect of the block and card: heights of the VRAMs, VRMs, "chokes when I know what one is and where it is", and the depths of all the block channels, and make a clear photo with references for you to look at.

It is my last day at work, here for another 8 hours. I will have a look tonight or tomorrow for you, probably tonight; the sooner the better.


----------



## Archea47

The problematic block passed the (1 hr) leak test!









What a relief. This brings me back to where I was Wednesday evening. Time to get back to enjoying the build.


----------



## tsm106

Quote:


> Originally Posted by *moorhen2*
> 
> Might need to put this in the GPU forum, but will try here first, having a real mare at the moment, got myself a new card Wednesday, XFX R9 290, uninstalled all drivers etc, took out my 6970, installed the 290,re-installed drivers, all's well for a couple of hours, then "ping" monitor goes black, no signal, pc is still running normal, just a black screen, only way to do anything is turn off PSU, turn on again, same thing happens again shortly after. GPU goes back for replacement, worked ok yesterday, now the new one is randomly black screening, monitor goes blank, no signal, gpu fans start running fast. Any ideas, as I am pulling my hair out trying to find out the problem, surely I cant have got 2 duff cards. Any help would be much appreciated.
> 
> Sorry, I am in the GPU section.lol


Check your PSU? The HX1000 is a difficult PSU to run because of its design; I would replace it just on principle. The HX1000 is actually two 500W PSUs bonded to make one, so you have to think of it as two separate 500W PSUs. View the HX1000 rail chart below. You will most likely have to load-balance the rails.
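tsm106's "two 500W PSUs bonded" framing can be sketched as a toy check: the total draw fitting under 1000W isn't enough, since each virtual half must also stay under its own 500W limit. The wattage figures below are hypothetical, purely for illustration:

```python
def rails_ok(loads_a, loads_b, per_half=500):
    """True only if each virtual half of the PSU stays under its own limit.

    loads_a / loads_b are lists of device draws (watts) attached to each half.
    """
    return sum(loads_a) <= per_half and sum(loads_b) <= per_half

# Balanced: GPU (300W) + CPU (150W) on one half, the rest on the other.
print(rails_ok([300, 150], [120, 60]))   # True

# Unbalanced: 550W piled onto one half fails even though the total fits.
print(rails_ok([300, 150, 100], [50]))   # False
```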


----------



## tsm106

Quote:


> Originally Posted by *Mega Man*
> 
> was not directed at you, but i dont want to quote them


lol, not worth getting into it. Is it even worth explaining to someone that clueless?


----------



## moorhen2

Quote:


> Originally Posted by *tsm106*
> 
> Check your psu? The HX1000 is a difficult psu to run because of its design. I would replace it just on principle. That said, the HX1000 is actually two 500w psu bonded to make one psu. Thus you have to think of it as two separate 500w psus. View the HX1000 rail chart below. You will most likely have to load balance the rails.


Thanks for the reply; I should have said I now have a Super Flower 1300W Gold PSU.


----------



## Performer81

Quote:


> Originally Posted by *Regnitto*
> 
> I found a solution to this problem in BF4 at least. By turning the resolution scaling up to 200% it is now holding the full oc steady and keeping the load 95-99%. I guess it's just a case of 1920x1080 wasn't enough of a challenge for the card? I still have the issue in NFS most wanted (2013). I guess it must be game related rather than card related.


If I set +50% PowerTune in Afterburner it has no effect here; it only works directly in CCC.


----------



## wermad

Appraisals are done in the market (read the market T&Cs before posting, btw).


----------



## tsm106

Quote:


> Originally Posted by *wermad*
> 
> The one thing w/ nickel is the failure. It has happened and it will happen. Though, there's no ill affect other then you end up w/ areas of small exposed copper. Nickel is just for looks, that's all.


Technically, it doesn't matter if the nickel strips on the inside, man. People get nickel plating because it's chrome-looking and blingtastic, just saying. Copper on the outside looks ugly after a few months and gets worse over time, btw.


----------



## thrgk

Quote:


> Originally Posted by *Forceman*
> 
> HDMI is better than TOSLink, if your receiver supports it. HDMI has significantly more bandwidth available (I don't recall the actual number) than optical/coax, so it can directly support 5.1 audio. Optical/coax can only carry 5.1 audio if it has been compressed, which is no problem for videos (which are already compressed) but means you need something to compress the audio for games. So you either need a motherboard that supports Dolby Digital Live or DTS:Connect (or whatever the new name is), or you need a sound card.
> 
> The Creative Z almost certainly supports one of those techs, if not both, so you can just use it. You won't notice any difference between the Z and a U7, or even the on-board, if you are using digital (optical/coax).


Why is one of my cards running at x16?

I have 4 290x and a 5960x (not sure how many lanes that is) but why are the top 3 x8 and the fourth x16?


----------



## wermad

The 5960X has 40 lanes, as does the 5930K. More than likely it's running 8x/8x/8x/16x in 4-way. The 5820K has 28 lanes.
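The arithmetic behind that split, for anyone counting lanes:

```python
# Four full x16 links would need 64 lanes. A 40-lane chip (5960X/5930K)
# can't provide that, so boards drop three slots to x8: the 8x/8x/8x/16x
# split uses the 40 lanes exactly. A 28-lane 5820K can't fit even that.
CPU_LANES_5960X = 40
CPU_LANES_5820K = 28

split = [8, 8, 8, 16]

print(4 * 16)                         # 64 lanes needed for 4-way x16
print(sum(split))                     # 40 -- fits the 5960X exactly
print(sum(split) <= CPU_LANES_5820K)  # False
```

Which physical slot gets the x16 link is fixed by the motherboard's lane routing, which is why the "fourth" card can end up being the full-width one.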


----------



## thrgk

Since my 4 cards are reference, can I flash any reference BIOS to them, as long as the clocks are equal?


----------



## Tobiman

A reference Sapphire BIOS for a 290 could brick a reference MSI 290 card but it *rarely* happens. Personally, I try to stick to BIOSes from the same vendor. I'd try it on one card before proceeding to flash the others. Remember to back up your old ROM before flashing.
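Tobiman's "back up your old ROM" advice can be sanity-checked in software before flashing anything. The helper below is hypothetical (not something from this thread): it only verifies the standard two-byte PCI expansion-ROM signature (0x55 0xAA) that video BIOS images start with, which is enough to catch a truncated or empty save.

```python
def looks_like_vga_rom(data: bytes) -> bool:
    """PCI expansion ROM images (video BIOSes included) begin 0x55 0xAA."""
    return len(data) >= 2 and data[0] == 0x55 and data[1] == 0xAA

# Hypothetical usage on a backup saved with ATIFlash or GPU-Z:
# with open("backup.rom", "rb") as f:
#     assert looks_like_vga_rom(f.read()), "backup looks corrupt"

print(looks_like_vga_rom(bytes([0x55, 0xAA]) + b"\x00" * 100))  # True
print(looks_like_vga_rom(b"\x00" * 100))                        # False
```

A valid signature doesn't prove the ROM matches your card, only that the file isn't obviously broken; the vendor/IC matching still matters.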


----------



## tsm106

Quote:


> Originally Posted by *Tobiman*
> 
> A reference Sapphire BIOS for a 290 could brick a reference MSI 290 card but it *rarely* happens. Personally, I try to stick to BIOSes *from the same vendor.* I'd try it on one card before proceeding to flash the others. Remember to back up your old ROM before flashing.


You should match by memory type and VR type. Some cards have different VR controllers, so if a BIOS is written for a different controller, that's a brick waiting to happen. The same can be expected of a BIOS for a different type of memory IC. For a quick reference, at TPU's VGA BIOS database, when you click on details, it lists which memory ICs are supported by that BIOS's timing tables.
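That matching rule can be written down as a tiny filter. The data below is invented for illustration (the file names, memory IC names and controller labels are hypothetical), shaped loosely like the details TPU's BIOS database lists per ROM:

```python
# Hypothetical BIOS entries, roughly the shape of TPU "details" listings.
roms = [
    {"file": "a.rom", "memory_ics": {"Hynix", "Elpida"}, "vrm": "IR3567B"},
    {"file": "b.rom", "memory_ics": {"Samsung"},         "vrm": "IR3567B"},
    {"file": "c.rom", "memory_ics": {"Hynix"},           "vrm": "NCP81022"},
]

def safe_candidates(roms, memory_ic, vrm):
    """Keep only BIOSes whose timing tables cover this card's memory IC
    and that target the same voltage-regulator controller."""
    return [r["file"] for r in roms
            if memory_ic in r["memory_ics"] and r["vrm"] == vrm]

print(safe_candidates(roms, "Hynix", "IR3567B"))  # ['a.rom']
```

Anything the filter rejects on either axis is the "brick waiting to happen" case tsm106 describes.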


----------



## Spectre-

Quote:


> Originally Posted by *thrgk*
> 
> since my 4 cards are reference can i flash any reference bios to them? As long as the clocks are equal


I have flashed my 290X BIOSes (one is PT1 and the other is AMD).

I think you should be fine.

From what I have seen, the Sapphire and AMD BIOSes tend to give higher scores in benchmarks (3DMark 11, FS and Catzilla).


----------



## Archea47

@Arizonian can my Cooling on the roster be updated to Water?


----------



## Bertovzki

Quote:


> Originally Posted by *Archea47*
> 
> @Arizonian can my Cooling on the roster be updated to Water?


I took all the measurements for you. It looks like you do not need them now, but I will keep them for a later date if you want.


----------



## thrgk

I flashed an Asus BIOS and it worked. See, I don't know which are which; 2 are Asus and 2 are Sapphire. When I looked at the memory, most BIOSes supported Hynix and Elpida. I'll just keep this Asus BIOS, since I know it works well on my fourth card.


----------



## arrow0309

Quote:


> Originally Posted by *Archea47*
> 
> @Arizonian can my Cooling on the roster be updated to Water?


Man, I definitely need one of those SuperNOVA G2 1300s too, to replace my G2 850,
especially for my new setup with two 290s and an Enthoo Primo (and new rads).


----------



## thrgk

Do all reference 290Xs have BIOS switches? I have 2 Asus and 2 Sapphire with EK blocks on, but I can't see them. Do the blocks cover it, or...

It seems ATIFlash says image 0 and image 1. Can I somehow get the backup BIOS from ATIFlash?


----------



## Klocek001

Quote:


> Originally Posted by *thrgk*
> 
> Do all reference 290Xs have BIOS switches? I have 2 Asus and 2 Sapphire with EK blocks on, but I can't see them. Do the blocks cover it, or...
> 
> It seems ATIFlash says image 0 and image 1. Can I somehow get the backup BIOS from ATIFlash?


Do you mean legacy/UEFI or uber/quiet? The second one, yes; the first one, no. My 290 Trix won't allow me to fast boot as it doesn't support UEFI GOP.


----------



## thrgk

So will my reference 290Xs have BIOS switches?


----------



## JourneymanMike

Quote:


> Originally Posted by *thrgk*
> 
> Do all reference 290Xs have BIOS switches? I have 2 Asus and 2 Sapphire with EK blocks on, but I can't see them. Do the blocks cover it, or...
> 
> It seems ATIFlash says image 0 and image 1. Can I somehow get the backup BIOS from ATIFlash?


I have two Sapphire reference 290Xs; I can't find any BIOS switches...


----------



## hyp36rmax

Quote:


> Originally Posted by *JourneymanMike*
> 
> I have two Sapphire Reference 290X's - I can't find any bios switches...


Reference models didn't have a bios switch unless you have a Vapor-X or Tri-X cooler


----------



## thrgk

So is there no way I can get my original BIOS back, as I lost the folder? How can I know which BIOS from the BIOS collection to use?


----------



## Klocek001

Quote:


> Originally Posted by *hyp36rmax*
> 
> Reference models didn't have a bios switch unless you have a Vapor-X or Tri-X cooler


no legacy/uefi switch on a trix


----------



## tsm106

Quote:


> Originally Posted by *hyp36rmax*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JourneymanMike*
> 
> I have two Sapphire Reference 290X's - I can't find any bios switches...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Reference models didn't have a bios switch unless you have a Vapor-X or Tri-X cooler
Click to expand...

Some serious misinformation here...









Quote:


> Originally Posted by *thrgk*
> 
> Do all reference 290Xs have BIOS switches? I have 2 Asus and 2 Sapphire with EK blocks on, but I can't see them. *Do the blocks cover it* or...
> 
> It seems ATIFlash says image 0 and image 1. Can I somehow get the backup BIOS from ATIFlash?


Yes, blocks cover them.


----------



## thrgk

Quote:


> Originally Posted by *tsm106*
> 
> Some serious misinformation here...
> 
> 
> 
> 
> 
> 
> 
> 
> Yes, blocks cover them.


Can I get to them with the blocks on?

If not, is there any way in ATIFlash I can tell it to save a copy of the backup BIOS to a file?


----------



## tsm106

Maybe if you had some kind of a hook and pick set. Though, if you didn't have the foresight to see this coming... I dunno.


----------



## thrgk

Quote:


> Originally Posted by *tsm106*
> 
> Maybe if you had some kind of a hook and pick set. Though, if you didn't have the foresight to see this coming... I dunno.


Can I save it from ATIFLASH anyway?


----------



## tsm106

Quote:


> Originally Posted by *thrgk*
> 
> if not is there anyway in ATIFLASH I can tell it to save a copy of the backup bios to so and so a file?


Magic? I don't understand. You have to physically move the switch. If you cannot access the switch physically, software cannot magically achieve the same task.


----------



## hyp36rmax

Quote:


> Originally Posted by *thrgk*
> 
> So no way I can get my original bios back as I lost the folder ? How can o know what bios from the bios collection to use ?


*Check out Techpowerup*: Link
Quote:


> Originally Posted by *Klocek001*
> 
> no legacy/uefi switch on a trix


I stand corrected as I only own an R9 290X Vapor-X which has a switch.


----------



## Klocek001

Quote:


> Originally Posted by *hyp36rmax*
> 
> *Check out Techpowerup*: Link
> I stand corrected as I only own an R9 290X Vapor-X which has a switch.


I've owned three Trixes; unless it's playing Houdini, there's no switch.


----------



## hyp36rmax

Quote:


> Originally Posted by *Klocek001*
> 
> I've owned three trix's unless it's playing Houdini there's no switch


There we have it: the *Tri-X does not* have a switch.







However I will say my *Vapor-X does*


----------



## thrgk

Quote:


> Originally Posted by *hyp36rmax*
> 
> *Check out Techpowerup*: Link
> I stand corrected as I only own an R9 290X Vapor-X which has a switch.


Yes, but how do I know which Asus or Sapphire one to use?


----------



## thrgk

Does mine even have one? It looks like I'd be able to see it if I did, even with the WB.


----------



## hyp36rmax

Quote:


> Originally Posted by *thrgk*
> 
> Yes but how do I know which asus or sapphire one to use


Whichever BIOS you choose is at your discretion.

Quote:


> Originally Posted by *thrgk*
> 
> Does mine even have one ? Looks like od be able to see it if I did even with the wb


As stated above, the reference models *do* have a BIOS switch.

*Here is my Vapor-X with EK water blocks:*


----------



## tsm106

Stop with the crazy no bios switch statements. Zoom in on the pic below.

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *utnorris*
> 
> Yeah, it always seems like when we want to have fun, well sh** we have a party too go to or some other event for the kids. It probably would not be that big of a deal if it didn't HAPPEN EVERY WEEKEND!!!! But seriously, just means late nights for me while everyone else sleeps.
> 
> 
> 
> Amen. I had 20 minutes till basketball lol. I managed to finished the gpu only block. Its cobbled together from spare parts. It should be good, just need to watch vrm temps. I might use a temp lead on the rivf if/when I get a chance. I also have a big 200mm fan which should keep air fed to the rest of the pcb beyond the 80mm fan there.
Click to expand...


----------



## kizwan

Reference cards does have BIOS switch. Sapphire TRI-X 290/290X does have BIOS switch. The switch is to switch between BIOS1 & BIOS2, whether it's between quiet/uber or legacy/uefi.

For TRI-X, both 290 & 290X, as you can see in the pic, the switch is on the right of the used-to-be Crossfire finger. Same location too on reference cards.


----------



## hyp36rmax

Quote:


> Originally Posted by *tsm106*
> 
> Stop with the crazy no bios switch statements. Zoom in on the pic below.


Here's a better shot, *you're correct*







For some reason I didn't think the reference model had one. You just kept calling it out as wrong without an image. I clearly don't own a reference design.


----------



## tsm106

Btw, for those concerned, IIRC you can also use a bent paperclip to reach the switch hidden behind the waterblock bridge.


----------



## wermad

Ugh, EK and XSPC, why do you put notches on backplates knowing Hawaii no longer uses Crossfire bridges???


Spoiler: Warning: Spoiler!


Lazy R&D and manufacturing?


Looks like I'm going w/ HK backplates:


edit: BP also guilty. HK it is then


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *wermad*
> 
> Ugh, EK and XSPC, why do you put notches on backplates knowing Hawaii no longer uses Crossfire bridges???
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> Lazy R&D and manufacturing?
> 
> 
> Looks like I'm going w/ HK backplates:


I like the look of that


----------



## VSG

A mould, maybe? If future AMD cards keep dropping the bridge, maybe this will stop being the case. Good point though; I will talk to the guys at EK and XSPC (no BP contact here) and see.


----------



## wermad

It looks like they're just milled from slices cut off billets (I've had their backplates before, and they show the saw swirl marks and the milling swirls on the underside). They probably just took the Tahiti prints and redid the screw holes, since both PCBs are 10.5". The SE doesn't have the notch, but the configurator says it's not compatible w/ the reference Hawaii design.

The HK looks to be a stamped design, but I do like the polished look, and it will go nicely w/ the nickel BP fittings and the polished SS & nickel on the Koolance blocks. They're only $5 more than the EK, so not a biggie tbh.


----------



## Archea47

Quote:


> Originally Posted by *Klocek001*
> 
> I've owned three trix's unless it's playing Houdini there's no switch


Both of my Tri-X cards have a switch

Maybe there was a rev change, which could explain my AC blocks not fitting.

Edit: the Aquacomputer backplate is full cover; no cutout for xfire.


----------



## tsm106

They notched it for bridges, but then their block's bridge covers the BIOS switch, lol.

Btw, lightning backplates are not notched.


----------



## wermad

Quote:


> Originally Posted by *tsm106*
> 
> They notched it for bridges, but then their block bridge blocks the bios switch lol.
> 
> Btw, lightning backplates are not notched.


Yup, the non-reference blocks got the solid design. As I mentioned, they just got lazy (all three), since Tahiti was the same length and all it took was redoing the screw holes.

HK and AC did theirs in full. I wish Koolance had some sort of backplate, but the HK will do.


----------



## Buehlar

FWIW, the EK backplates for DC2s don't have the x-fire bridge notched out either.


----------



## Archea47

Quote:


> Originally Posted by *Bertovzki*
> 
> I took all the measurements for you , looks like you do not need them now though , but i will keep them for a latter date if you want


Hey Bertovski,

The chokes are the 6 gray components just left of the VRMs highlighted in yellow in this picture from this club:


If you could post the height of those that'd be great so I can make sure if I get a third card it has that measurement to fit. Thanks for doing the measurements

BTW I'm on the rig right now and she's nice and frosty


----------



## kizwan

I think I'm the only one who didn't bother about the Crossfire finger cutout on the EK backplates.







Anyway, the copper on my blocks still looks OK after almost a year, but there are minor oxidation spots, greenish I think, here & there. Only on the side of the block.


----------



## nosequeponer

What's the usual OC under water? So far 1200 core and 1350 mem, and it seems to hold.


----------



## Klocek001

Quote:


> Originally Posted by *Archea47*
> 
> Both of my Tri-X cards have a switch
> 
> maybe there was a rev change, which could explain my AC blocks not fitting
> 
> Edit: the aquacomputer backplate is full cover - no cutout for xfire


Where is it?


----------



## rdr09

Quote:


> Originally Posted by *nosequeponer*
> 
> What's the usuall oc under water?? So far 1200 core and 1350 mem and seems to hold


For 24/7 use or benching? It comes down to silicon. I only OC for benching, stock in games. 2 290s (it's like one big card) for 4K. Zero stutter, but I don't play Ubi crap games.

1400 on the memory should be doable. If you won the silicon lottery, then 1500-1600. Set the PL to max.


----------



## Spectre-

Quote:


> Originally Posted by *rdr09*
> 
> For 7/24 use or benching? it comes down to silicon. i only oc for benching. stock in games. 2 290s (it's like one big card) for 4K. Zero stutter but i don't play Ubi crap games.
> 
> 1400 on the memory should be doable. if you won the silicon, then 1500 - 1600. set the PL to max.


I do 1100/1400 on stock volts.

My benching is on PT1 at 1.5v+, and I do 1270/1600.

I've also had bad luck OC'ing Elpida RAM.


----------



## Chopper1591

Quote:


> Originally Posted by *wermad*
> 
> Ugh, EK and XSPC, why yous puts notches on backplates knowing Hawaii no longer uses crossfire bridges (
> 
> 
> 
> 
> 
> 
> 
> ) ???
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Lazy r&d and manufacturing, ? (
> 
> 
> 
> 
> 
> 
> 
> )
> 
> 
> 
> Looks like I'm going w/ HK backplates:
> 
> 
> 
> edit: bp also guilty. Hk it is then


I wouldn't do that.
Sure, it looks nice.

But that plate is more like an insulator.
It makes no contact with the VRMs at all.
It will raise temps more than it will help cooling.


----------



## nosequeponer

Quote:


> Originally Posted by *rdr09*
> 
> For 7/24 use or benching? it comes down to silicon. i only oc for benching. stock in games. 2 290s (it's like one big card) for 4K. Zero stutter but i don't play Ubi crap games.
> 
> 1400 on the memory should be doable. if you won the silicon, then 1500 - 1600. set the PL to max.


Quote:


> Originally Posted by *Spectre-*
> 
> i do 1100/1400 on stock volts
> 
> my benching is on Pt1 and 1.5+ and do 1270/1600
> 
> also had bad luck oc'ing elpida ram


For 24/7 use; I'm not really interested in benching.

So far it's using 1.25v max, and it seems stable with Unigine and the Hitman benchmark. I'll try later with BF4.

I can also try to raise the mem a bit more; no artifacts so far.


----------



## rdr09

Quote:


> Originally Posted by *Spectre-*
> 
> i do 1100/1400 on stock volts
> 
> my benching is on Pt1 and 1.5+ and do 1270/1600
> 
> also had bad luck oc'ing elpida ram


Nice. I'm a wuss at flashing BIOSes. Both of my 290s can bench 1280; one can do a little higher, but it's in the second slot.

Quote:


> Originally Posted by *nosequeponer*
> 
> for 24/7, not really interested in benching.
> 
> so far it´s using 1.25v max, and it seems stable with unigine and hitman becnhmark, i´ll try later on with BF4..
> 
> i can also try to raise a bit more the mem, no artifacts so far


1.25v is fine I suppose, so long as your temps are below 80C (imo). Watercooled should be able to stay below 60C if you have enough rad space. BF4 is sensitive to OC, I've read; test BF4 with the memory at stock and OC'd.

I only use one app to monitor temps if I have to . . . HWiNFO64. I close GPU-Z, AB, Trixx and disable Overdrive in CCC. Your VRMs should be cooler than the core.

edit: do you really need to OC that high? What CPU?


----------



## nosequeponer

Temps are below 50 with those clocks (core max has been 50C, VRM1 48 and VRM2 39),

so temps are under control.

CPU is a 2500K @ 4.5; I haven't tried to OC it more yet, but I think it could go higher (it's also under water).


----------



## Bertovzki

Quote:


> Originally Posted by *Archea47*
> 
> Hey Bertovski,
> 
> The chokes are the 6 gray components just left of the VRMs highlighted in yellow in this picture from this club:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> If you could post the height of those that'd be great so I can make sure if I get a third card it has that measurement to fit. Thanks for doing the measurements
> 
> BTW I'm on the rig right now and she's nice and frosty
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


The chokes are 7.0 mm high; you will have to convert if needed, as the board is made to metric measurements.




I will double-check the 1.7 mm strip at the end. It may be 1.8 like the other end, but that's unlikely, as I triple-measured everything for max accuracy.
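For anyone who does need to convert those metric figures, the 7.0 mm choke height works out to roughly 0.276 in; a one-liner sketch:

```python
MM_PER_INCH = 25.4  # exact by definition

def mm_to_in(mm: float) -> float:
    """Convert millimetres to inches."""
    return mm / MM_PER_INCH

# The measured choke height from the post:
print(round(mm_to_in(7.0), 3))  # 0.276
```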


----------



## rdr09

Quote:


> Originally Posted by *nosequeponer*
> 
> temps are below 50 with those clocks (core max temp has been 50c, vrm1 48 and vrm2 39)
> 
> so temps are under control
> 
> cpu is 2500k @4.5, haven´t tried to oc it more yet, but i think it could go higher (it´s also under water)


your temps and oc for both gpu and cpu are perfectly fine but i'll await the BF4 results. please test MP64. thanks.


----------



## nosequeponer

Quote:


> Originally Posted by *rdr09*
> 
> your temps and oc for both gpu and cpu are perfectly fine but i'll await the BF4 results. please test MP64. thanks.


will do later tonight


----------



## hyp36rmax

I have a similar setup with an i5 2500K @ 4.5 and crossfire Vapor-X 290X 8GB under water. Here are my results in BF4 (64MP Siege of Shanghai), as I'm curious how my performance compares in FPS and temperatures. Thoughts?

*BF4 Video Settings and Temperature Benchmark*



*Intel i5 2500k @ 4.5ghz:* *Avg:* 37.8 C *Min:* 23 C *Max:* 55.8 C
*Sapphire VAPOR-X R9 290X 8GB @ 1030mhz GPU 01:* *Avg:* 46.2 C *Min:* 34 C *Max:* 53 C
*Sapphire VAPOR-X R9 290X 8GB @ 1030mhz GPU 02:* *Avg:* 48.1 C *Min:* 35 C *Max:* 56 C



*Battlefield 4 Ultra Settings No Anti Aliasing*


----------



## nosequeponer

i use 2560x1440 everything ultra and downscaled to 1080p, should get fraps for a proper comparison
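For context on what that downscaling costs: rendering at 2560x1440 and scaling the output to a 1080p display still pushes the full 1440p pixel load. A quick check of the ratio:

```python
# Rendering at 2560x1440 and downscaling to 1080p still means the GPU
# shades every 1440p pixel; the ratio shows how much heavier that is
# than native 1920x1080.

def pixels(width: int, height: int) -> int:
    """Total pixels per frame at a given resolution."""
    return width * height

ratio = pixels(2560, 1440) / pixels(1920, 1080)
print(f"1440p renders {ratio:.2f}x the pixels of native 1080p")  # ~1.78x
```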


----------



## hyp36rmax

Quote:


> Originally Posted by *nosequeponer*
> 
> i use 2560x1440 everything ultra and downscaled to 1080p, should get fraps for a proper comparison


I can also run it at 2560x1440 as well as downscaled to 1080p when I get back from work later tonight. Are you in crossfire? I also use Fraps for my data analysis.


----------



## nosequeponer

no crossfire, single 290x 4gb


----------



## Archea47

Quote:


> Originally Posted by *Bertovzki*
> 
> The Chokes are 7.0 mm high , you will have to convert if needed , the board is made in metric measurements.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> I will double check the 1.7 mm strip at the end , it may be 1.8 like the other end ,but unlikely as i tripple measured everything for max accuracy.


Wow, that's some serious and impressive effort! Thanks, +rep!


----------



## thrgk

Could my dual PSU be causing detection/gpu issues with my 4 290Xs? My top 2 cards are powered by the Corsair 1200 (main) PSU, and in GPU-Z it says its 12V rail is giving 11.88v, whereas my secondary PSU powering the last 2 290Xs is giving 11.75. Could that voltage difference be causing an issue?


----------



## Klocek001

Quote:


> Originally Posted by *thrgk*
> 
> Could my dual PSU be causing detection/gpu issues with my 4 290Xs? My top 2 cards are powered by the Corsair 1200 (main) PSU, and in GPU-Z it says its 12V rail is giving 11.88v, whereas my secondary PSU powering the last 2 290Xs is giving 11.75. Could that voltage difference be causing an issue?


it's just a software voltage reading. you never know if it's accurate. btw 11.75v is fine actually from what I've been told.


----------



## thrgk

Quote:


> Originally Posted by *Klocek001*
> 
> it's just a software voltage reading. you never know if it's accurate. btw 11.75v is fine actually from what I've been told.


Yea, but would having 2 cards at 11.75 and 2 at 11.88 screw stuff up? since they are working together in xfire?


----------



## Talon720

Also, for anyone looking for an alternative method of getting to the bios switch behind the bridge block: you can use a zip tie (not too thin) to reach behind the block. Super easy with the zip tie. Just have to cut it flat.


----------



## Talon720

Quote:


> Originally Posted by *kizwan*
> 
> I think I'm the only one that did not bother with the Crossfire finger cutout on the EK backplates.
> 
> 
> 
> 
> 
> 
> 
> Anyway, the copper on my blocks still looks ok after almost a year, but there are minor oxidation spots, greenish I think, here & there. Only on the side of the block.


Yea, the sides of my copper blocks are getting some dark blackish oxidation spots. Not too worried, copper is super easy to clean off. Kinda wish I went nickel, but my last AMD mb had a copper block and my new Intel mb has nickel lol. I was trying to match but ended up unmatched in the end.

I'm super jealous of you guys who have cards that can keep pushing voltage. My cards blackscreen if the voltage gets too high, even tested at stock clocks. I wanted to swap my 1st card (Hynix) out for one of the 2 Elpidas to see if it helped, but my pump is taped on with that 3M pad and that stuff is so strong. I'm not complaining, I can still oc the gpu, and my Elpidas will do 1400. The Hynix will go higher, but there's not much point, especially in trifire.


----------



## BuildTestRepeat

Just ordered myself a PowerColor PCS+ R9 290 yesterday! Will be my first AMD card. Always been an nvidia fan but thought I'd give the "dark side" a whirl


----------



## vieuxchnock

*And the third baby just came in.

http://www.servimg.com/view/17159996/606

Have to install his waterblock and will be ready to join the 2 others.
*


----------



## Archea47

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Just ordered myself a Powercolor PCS+ R9 290 yesterday! Will be my first AMD card. Always been an nvidia fan but thought id give the "dark side" a whirl


Welcome to the cool kids' club


----------



## Regnitto

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Just ordered myself a Powercolor PCS+ R9 290 yesterday! Will be my first AMD card. Always been an nvidia fan but thought id give the "dark side" a whirl


welcome to the dark side. We have cookies!


----------



## nosequeponer

Quote:


> Originally Posted by *hyp36rmax*
> 
> I have a similar setup with an i5 2500K @ 4.5 and crossfire Vapor-X 290X 8gb under water. Here are my results in BF4 (64MP Siege of Shanghai) as i'm curious on my performance comparatively in FPS and temperatures. Thoughts?
> 
> *BF4 Video Settings and Temperature Benchmark*
> 
> 
> 
> *Intel i5 2500k @ 4.5ghz:* *Avg:* 37.8 C *Min:* 23 C *Max:* 55.8 C
> *Sapphire VAPOR-X R9 290X 8GB @ 1030mhz GPU 01:* *Avg:* 46.2 C *Min:* 34 C *Max:* 53 C
> *Sapphire VAPOR-X R9 290X 8GB @ 1030mhz GPU 02:* *Avg:* 48.1 C *Min:* 35 C *Max:* 56 C
> 
> 
> 
> *Battlefield 4 Ultra Settings No Anti Aliasing*


quick game:

Frames: 11202 - Time: 159870ms - Avg: 70.069 - Min: 41 - Max: 105


continue testing

Not that stable... testing at 1100 core 1300 mem

hate this game, 40 minutes stable and then it locks...
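Sanity-checking the Fraps summary line above: the reported average is just total frames divided by elapsed seconds.

```python
# A Fraps summary reports frames, elapsed time in ms, and an average;
# the average is simply frames over elapsed seconds. Numbers below are
# from the benchmark run posted above.

def avg_fps(frames: int, time_ms: int) -> float:
    """Average frame rate from a Fraps-style frames/time pair."""
    return frames / (time_ms / 1000.0)

print(round(avg_fps(11202, 159870), 3))  # ~70.069, matching the reported Avg
```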


----------



## BuildTestRepeat

Quote:


> Originally Posted by *Regnitto*
> 
> welcome to the dark side. We have cookies!


Quote:


> Originally Posted by *Archea47*
> 
> Welcome to the cool kids' club


Thanks guys, I'm not one to purchase "off brand" cards. But it seems PowerColor has gotten their game together, especially with this PCS+ R9 290. Couldn't resist









Now the wait..... Just wanna see the big brown UPS truck in my driveway already lol


----------



## Mega Man

powercolor isn't an off brand


----------



## virpz

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Thanks guys, im not one to purchase "off brand" cards. But it seems powercolor has gotten their game together especially with this PCS+ R9 290. Couldnt resist


Powercolor is an old manufacturer, and during the R350 era they made the best ATI cards one could buy.
They were doing that before MSI, ASUS, GIGABYTE and the like entered the gaming video card market.

[]'s


----------



## wermad

i was a little put off by PC, seeing a ton of refurbs on newegg and the not-so-good reviews. I know, I know, newegg reviews are worth nil. But one can tell things are a little off when you see a lot of refurbs from a retailer. Most other brands' refurb/open-box units would get snagged up quickly, but PC still sat on there (a lot of stock, or a revolving door by both retailer and manufacturer?). Though, the Devil series look noyce







.

I went with sapphire since the TriX cooler was pretty good and at that time I was still on air.


----------



## Archea47

Quote:


> Originally Posted by *wermad*
> 
> i was a little put off by PC, seeing a ton of refurbs on newegg and the not-so-good reviews. I know, I know, newegg reviews are worth nil. But one can tell things are a little off when you see a lot of refurbs from a retailer. Most other brands' refurb/open-box units would get snagged up quickly, but PC still sat on there (a lot of stock, or a revolving door by both retailer and manufacturer?). Though, the Devil series look noyce
> 
> 
> 
> 
> 
> 
> 
> .
> 
> I went with sapphire since the TriX cooler was pretty good and at that time I was still on air.


Pretty much the same experience here. I'd previously wanted to pick up their 7990. Was adding them (290Xs) to the cart and saw all the reviews with issues. I went with the TriX as it was cheaper than reference-cooled and would work well until I went water


----------



## Arizonian

Quote:


> Originally Posted by *Jabba1977*
> 
> Very happy with my "NEW BABY" --> 290x LIGHTNING (first version) ...I think that: I WON THE SILICON LOTTERY !!
> 
> 
> 
> 
> 
> 
> 
> 
> Let´s rock...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1282Core / 1670 Mem (by air)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/5430640?


Nice clocks for air, dang.

Submit either a GPU-Z Link with OCN name showing, Screen shot of GPU-Z validation tab open with OCN Name or a pic of GPU with piece of paper with OCN name showing next to your GPU.
Quote:


> Originally Posted by *Archea47*
> 
> @Arizonian can my Cooling on the roster be updated to Water?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Updated TRI X Water








Quote:


> Originally Posted by *vieuxchnock*
> 
> *And the third baby just came in.
> *
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *http://www.servimg.com/view/17159996/606*
> 
> 
> *
> Have to install his waterblock and will be ready to join the 2 others.
> *


Post a submission when you get it lined up.








Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Just ordered myself a Powercolor PCS+ R9 290 yesterday! Will be my first AMD card. Always been an nvidia fan but thought id give *the "dark side"* a whirl


Fixed. You've joined *the force* Luke.









_Awaiting submission._


----------



## BuildTestRepeat

Quote:


> Originally Posted by *Mega Man*
> 
> powercolor isnt an off brand


Quote:


> Originally Posted by *virpz*
> 
> Powercolor is an old manufacturer, and during the R350 era they made the best ATI cards one could buy.
> They were doing that before MSI, ASUS, GIGABYTE and the like entered the gaming video card market.
> 
> []'s


Maybe i could have put that better. Not most people's go-to brand? Regardless, excited as all hell to get that PCS+ 290


----------



## pdasterly

I'm ready to upgrade my monitors. I currently have three 23" 1080p IPS monitors.
What would be a good upgrade? I have crossfire 290Xs.


----------



## BuildTestRepeat

Quote:


> Originally Posted by *pdasterly*
> 
> Im ready to upgrade my monitors, I currently have three 23" 1080p ips monitors
> What would be a good upgrade? I have crossfire 290x


The Korean 1440p monitors that overclock damn well, like the QNIX QX2710, are badass. In the market for one myself. 96Hz seems easily doable. Some get 120Hz out of them!


----------



## wermad

4k is just too new and too expensive. The best 4k experience is with a 32" or greater screen imho for typical monitor setups. So the natural suggestion would be triple WQHD, but I'm not sure if amd has fixed the many issues that plagued Eyefinity WQHD users.

I'm going w/ a single 32" WQHD (2560x1440). I think this is a good spot for single monitors w/out going Eyefinity or 4k. I haven't dabbled w/ 120hz, so I can't really offer comment on it. Plus, ips 120hz WQHD monitors are very few and I only like to run portrait in Eyefinity.

I do miss my five-dell 6000x1920 Eyefinity setup though :'(


----------



## Klocek001

Quote:


> Originally Posted by *thrgk*
> 
> Yea but would have 2 cards 11.75 and 2 11.88 screw stuff up? since they are working together in xfire?


measure it with a digital PSU tester, that'll give more precise results. I guess it's improbable to get 2 PSUs to run at EXACTLY the same voltage, and the 2-PSU method works for people on OCN.
PM Shilka.


----------



## Mega Man

Quote:


> Originally Posted by *Klocek001*
> 
> Quote:
> 
> 
> 
> Originally Posted by *thrgk*
> 
> Yea but would have 2 cards 11.75 and 2 11.88 screw stuff up? since they are working together in xfire?
> 
> 
> 
> measure it with a digital PSU tester that'll give more precise results. I guess it's improbable to get 2 PSUs to work at EXACTLY the same voltage and the 2 PSUs method works for people on OCN.
> PM Shilka.
Click to expand...

first, please don't. use a DMM, not a psu tester....

second why do they need to be at exactly the same voltage

never will that happen

there is a range of acceptable voltage that is good, and then there is atx spec.

if you buy a quality psu neither will be an issue or worry

you have to realize it takes the 12v and converts it; slightly different input voltages do not matter.
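To put numbers on the acceptable range: the commonly cited ATX tolerance on the 12V rail is +/-5%, and both readings from the dual-PSU setup fall well inside it. A minimal check, assuming that +/-5% figure:

```python
# Minimal sanity check against the commonly cited +/-5% ATX tolerance
# on the 12V rail (11.40 V to 12.60 V). The 11.75 V and 11.88 V values
# are the two software readings discussed above.

NOMINAL_12V = 12.0
TOLERANCE = 0.05  # +/-5%

def within_spec(volts: float) -> bool:
    """True if a 12V-rail reading falls inside the +/-5% window."""
    low = NOMINAL_12V * (1 - TOLERANCE)
    high = NOMINAL_12V * (1 + TOLERANCE)
    return low <= volts <= high

for reading in (11.75, 11.88):
    print(reading, within_spec(reading))  # both within spec
```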


----------



## hyp36rmax

Hey guys, Come join the VAPOR-X + TRI-X Owners Club I just made for those of you with Sapphire VAPOR-X + TRI-X GPU's. *All* are welcome.











*Source: *Link


----------



## fragamemnon

Update:


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Mega Man*
> 
> powercolor isnt an off brand


My first 290 is a Powercolor, and at the time (Oct last year) it was $40 cheaper in 'Stralia than the others.
It didn't last long though.

Quote:


> Originally Posted by *wermad*
> 
> 4k is just too new and too expensive. Best 4k experience is with 32" or great screen imho for typical monitor setups. So, the natural suggestion would be triple WQHD, but I"m not sure if amd has fixed the many issues that plagued Eyefinity WQHD users.
> 
> I'm going w/ a single 32" WQHD (2560x1440). I think this is a good spot for single monitors w/out going Eyefinity or 4k. I've haven't dabbled w/ 120hz, so I can't really offer comment on it. Plus, ips 120hz WQHD are very few and I only like to run portrait in Eyefinity.
> 
> I do miss my five dell 6000x1920 Eyefinity setup though :'(


I have no dramas running eyefinity using triple 1440p. My best 4k experience is on the UHD Sammy 67" currvy, and the 28" one too (not currvy)








Damn good frame rates too


----------



## Arizonian

Quote:


> Originally Posted by *fragamemnon*
> 
> Update:
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## rdr09

Quote:


> Originally Posted by *pdasterly*
> 
> Im ready to upgrade my monitors, I currently have three 23" 1080p ips monitors
> What would be a good upgrade? I have crossfire 290x


if you want to take a chance, i'd say go 4K (not a TV) and skip 1440. my 2 290s at stock handle my games just fine on a 28-in 4K. very smooth. Or, wait for a Freesync monitor.

others have issues with flickering but i have zero issues. plug and play.

edit: if you use Adobe for work like Dreamweaver, then you might have to get some reading glasses.







other apps are fine as well as games.

Here's what I am talking about . . .


----------



## silencespr

Anyone from Bergen County NJ want to help me update the bios on my damn cards? i can't seem to be able to do it.


----------



## rdr09

Quote:


> Originally Posted by *silencespr*
> 
> Anyone from Bergen County NJ want to help me update the bios on my damn cards? i can't seem to be able to do it.


silence, i'd like to help but idk how to flash bioses. i did it once with a 7950 and borked it. check out post # 1 if you haven't . . .

http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread


----------



## Widde

Is there any chance that overclocking my gpu's makes my cpu unstable? Because 1080 on the core at +88mV ain't normal (I hope).


----------



## wermad

Quote:


> Originally Posted by *Widde*
> 
> Is there any chance that when overclocking my gpu's my cpu becomes unstable? Because 1080 on the core at +88mV aint normal (I hope).


What cooling do you have (both cpu and gpu)?


----------



## Widde

Quote:


> Originally Posted by *wermad*
> 
> What cooling do you have (both cpu and gpu)?


Temps are never over the 80s on the gpus (ref coolers) and the H80i on the cpu rarely sees over 70 while gaming (It's a loud rig







)


----------



## wermad

What specifically happens to your cpu? There could be lots of things. So stock, everything is fun, but oc your cards and your cpu starts to fail? Run a cpu bench for stability if you haven't done so.


----------



## Jester435

Can I be added to the club

XFX r9 290x DD


----------



## Widde

Quote:


> Originally Posted by *wermad*
> 
> What specifically happens to your cpu? There could be lots of things. So stock, everything is fun, but oc your cards and your cpu starts to fail? Run a cpu bench for stability if you haven't done so.


Gaming is fine, overclocked or not. But I've had bsods while folding on both cards at stock with the cpu at 4.5; took it down to 4.4 now. Is it possible that folding makes the cpu oc go bad? Haven't dared to fold on both since, might screw up the ppd







It's rock solid in everything else, it's just folding that got me a bsod


----------



## wermad

run prime or aida64 for a bit and see how the cpu holds @ 4.5. Is your Ivy delidded btw?


----------



## Widde

Quote:


> Originally Posted by *wermad*
> 
> run prime or aida64 for a bit and see how the cpu holds @ 4.5. Is your Ivy delidded btw?


Done both aida and prime95 for 24 hours when I did my oc. Thing is, it's stable in everything except when I'm folding only on the gpus (I'm not folding on the cpu). Can the power draw make the cpu oc unstable in some way? Got a 1000w seasonic platinum so it should not have any problems supplying power


----------



## DarthBaggins

Quote:


> Originally Posted by *Widde*
> 
> Done both aida and prime95 for 24 hours when I did my oc. Thing is, it's stable in everything except when I'm folding only on the gpus (I'm not folding on the cpu). Can the power draw make the cpu oc unstable in some way? Got a 1000w seasonic platinum so it should not have any problems supplying power


What are your F@H settings? are you allowing 2 cores/threads for your GPU's? if not, that could be why


----------



## Widde

Quote:


> Originally Posted by *DarthBaggins*
> 
> What are your F@H settings? are you allowing 2 cores/threads for your GPU's? if not, that could be why


Pretty new to folding but here is some settings http://piclair.com/qb3rx


----------



## DarthBaggins

Hmm, if you're just folding on the GPU's then you shouldn't be getting a bsod







You have xfire off?


----------



## thrgk

If i have a quad EK bridge, can I use only 3 cards and plug the holes where the 4th card would go? Or are they unpluggable?


----------



## DarthBaggins

Should be able to plug them off


----------



## Widde

Quote:


> Originally Posted by *DarthBaggins*
> 
> Should be able to plug them off


Have turned crossfire off now, havent tried folding since the last bsod when crossfire was enabled


----------



## Lao Tzu

Hi all, well, i ordered my new GPU, a SAPPHIRE TRI-X R9 290X 4GB GDDR5 OC. i'm upgrading from a HD7970 GHz (at that moment the most powerful single GPU), now it's time to evolve!!! gaming evolved!!! it arrives 22/01/2015, and when i have it i'll put up new photos of my rig!!!


----------



## thrgk

Quote:


> Originally Posted by *DarthBaggins*
> 
> Should be able to plug them off


Do they include the blanking plugs with it?


----------



## wermad

Quote:


> Originally Posted by *thrgk*
> 
> If i have a quad ek bridge, can I only use 3 cards and plug the holes where the 4th card would go? Or are they unplugable?


There's a lane switch on your mb if you just need to turn off one card and not physically remove it (also helps to remove the power connections after doing the switch).

If you must remove the card/block and still retain the bridge, ek does sell a parallel plate to block off one port.

http://www.performance-pcs.com/ek-fc-terminal-blank-parallel.html



Your bridge is semi-parallel, so I would just block off the bottom ports.


----------



## thrgk

Quote:


> Originally Posted by *wermad*
> 
> There's a lane switch on your mb if you just need to turn off one card and not physically remove it (also helps to remove the power connections after doing the switch).
> 
> If you must remove the card/block and still retain the bridge, ek does sell a parallel plate to block off one port.
> 
> http://www.performance-pcs.com/ek-fc-terminal-blank-parallel.html
> 
> 
> 
> Your bridge is semi-parallel, so I would just block off the bottom ports.


Ah awesome, thanks. I keep having issues with quad fire. Cards keep dropping from detection; the same thing happened with 7970s in quad fire, so i think i'll try to sell the 290x, fewer problems. IDK why it stops working. it was working, but I played bf4, got a red screen, rebooted, and now it only detects 3, so idk


----------



## wermad

Sounds like you may have a bad card. I had issues w/ one Lightning, and eventually CCC would detect only three of four cards (gpuz saw all four). That bad one ended up crashing on its own.

Try the switches first to find out if you have a bad card. If you go w/ three, the blocking plate would be cheaper than buying a new triple terminal.


----------



## tsm106

^^They're called blanks btw.


----------



## wermad

Quote:


> Originally Posted by *tsm106*
> 
> ^^They're called blanks btw.


EK has some interesting names that can confuse, such as "terminal". So it's easier to use terminology that is more common (unless it's causing confusion). I hate how Koolance calls their blocks "cold plates". Stupid marketing-gimmicky names, but for practicality's sake it's still a plate blocking some holes and a waterblock







.

Also, I provide links to the exact product most of the time so it gives the full specifications, as opposed to me typing the info (and free marketing for the products







).


----------



## nightfox

Quote:


> Originally Posted by *thrgk*
> 
> Ah awesome, thanks. I keep having issues with quad fire. Cards keep dropping from detection; the same thing happened with 7970s in quad fire, so i think i'll try to sell the 290x, fewer problems. IDK why it stops working. it was working, but I played bf4, got a red screen, rebooted, and now it only detects 3, so idk


are u using the omega drivers? i experienced this a lot with the omega driver. what i did was reflash all my gpus with their corresponding vbios. i don't know how long it will last, but so far it's been 2 weeks and i haven't experienced it anymore. i'm talking about only 3 cards detected.


----------



## tsm106

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> ^^They're called blanks btw.
> 
> 
> 
> EK has some interesting names that can confuse, such as "terminal". So it's easier to use terminology that is more common (unless it's causing confusion). I hate how Koolance calls their blocks "cold plates". Stupid marketing-gimmicky names, but for practicality's sake it's still a plate blocking some holes and a waterblock
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Also, I provide links to the exact product most of the time so it gives the full specifications, as opposed to me typing the info (and free marketing for the products
> 
> 
> 
> 
> 
> 
> 
> ).
Click to expand...

Shrugs. Blanks are blanks. I don't think it's marketing, because they (ek in this case) have to differentiate the bridge systems or ppl will be buying parts they think will fit, only to be irate later when they won't go together. Anyways, as far as blanks go there are specific ones too. Like, you can't use a parallel blank in a "terminal" series bridge, for obvious reasons. And the terminal system is the newer bridge system that's only compatible with current-gen and newer cards, ie. csq and csq clean. And technically, Koolance is right, blocks are coldplates, though "coldplate" is hardly used anymore.


----------



## fishingfanatic

Anyone know if a pair of universal blocks would work on a D13 290x dual core?

FF


----------



## Archea47

I'm still getting used to this watercooled video card thing ...

Spent the past half hour freaking out wondering why Afterburner doesn't have fan speed or a tachometer after disabling ULPS


----------



## thrgk

do hynix 290x cards clock better on memory than elpida? if so, what percentage usually


----------



## wermad

Quote:


> Originally Posted by *fishingfanatic*
> 
> Anyone know if a pair of universal blocks would work on a D13 290x dual core?
> 
> FF


Since it's a custom card, I would hit up an owner to get some measurements on the screw holes for the cores. I would also ask the distance from the core (or screw holes) to the edge of the pcb. Most uniblocks are a tad short and may require extensions. Don't want to end up w/ an extension or fitting clearance issue. Curious why you didn't go w/ the reference 295x2? tons of blocks for this monster and the stock cooler is pretty good on its own ("hybrid" clc w/ fan).
Quote:


> Originally Posted by *Archea47*
> 
> I'm still getting used to this watercooled video card thing ...
> 
> Spent the past half hour freaking out wondering why Afterburner doesn't have fan speed or a tachometer after disabling ULPS


Lol, I had this issue the first time i watercooled (4870x2 + 4870). The old versions would bug you that there's an issue, but MSI did update it later on to just "gray" it out (tach/sensor).
Quote:


> Originally Posted by *thrgk*
> 
> do hynix 290x cards clock better on memory than elpida? if so, what percentage usually


wish i could answer this for you, as I've heard the arguments. But I would leave it to a more knowledgeable member to answer this in full for ya







.

Btw, don't give up on quads! it's so deliciously yummy







.


----------



## tsm106

Quote:


> Originally Posted by *thrgk*
> 
> do hynix 290x clock better on memory then epildea? if so what percentage usually


Yea, in general terms hynix tends to clock higher.


----------



## Archea47

Took my first swing at OC'ing these 290Xs yesterday

+100mv (1.3V) and 1165/1500 worked perfectly during a couple hours of BF4, CQ Large on outdoor maps

Temps:


Along those lines, do you guys get three sections in HWinfo for each GPU? I posted this in the HWinfo official thread and the developer asked:
Quote:


> The 1st section per each card is general GPU information, the other two are concerning the VRMs - 2nd is most probably the GPU Core VRM, but I'm not sure what the 3rd is. Usually the 3rd is GPU Memory VRM, but in your case 1.001V is too low for a GPU Memory voltage. Any idea what might be running at such voltage on your GPU?


http://www.overclock.net/t/1235672/official-hwinfo-32-64-thread/600#post_23411557

Is this normal? Am I undervolting my vRAM? I didn't observe any artifacts etc. The only monitoring software I'm using is AB


----------



## rdr09

Quote:


> Originally Posted by *Archea47*
> 
> Took my first swing at OC'ing these 290Xs yesterday
> 
> +100mv (1.3V) and 1165/1500 worked perfectly during a couple hours of BF4, CQ Large on outdoor maps
> 
> Temps:
> 
> 
> Along those lines, do you guys get three sections in HWinfo for each GPU? I posted this in the HWinfo official thread and the developer asked:
> http://www.overclock.net/t/1235672/official-hwinfo-32-64-thread/600#post_23411557
> 
> Is this normal? Am I undervolting my vRAM? I didn't observe any artifacts etc. The only monitoring software I'm using is AB


what's your ambient? i am jelly of your temps. and, yes, you get 3 sections.


----------



## Archea47

Quote:


> Originally Posted by *rdr09*
> 
> what's your ambient? i am jelly of your temps. and, yes, you get 3 sections.


They're on water (Aquacomputer blocks). See sig rig









Are the voltage readings in the third section normal?

Also what's up with the voltage on the 2nd card? It was +0.032 over card1 at stock voltage, too. Both have same BIOS


----------



## rdr09

Quote:


> Originally Posted by *Archea47*
> 
> They're on water (Aquacomputer blocks). See sig rig
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Are the voltage readings in the third section normal?
> 
> Also what's up with the voltage on the 2nd card? It was +0.032 over card1 at stock voltage, too. Both have same BIOS


i saw a pic of your rig, so i know they are watered. very nice btw. not sure if the voltage readings are accurate. the tolerance is 12V +/- 5%, i believe, so your cards are good. my 290s don't show the same readings either, but they're about the same difference apart.


----------



## kizwan

Quote:


> Originally Posted by *Archea47*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> what's your ambient? i am jelly of your temps. and, yes, you get 3 sections.
> 
> 
> 
> They're on water (Aquacomputer blocks). See sig rig
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Are the voltage readings in the third section normal?
> 
> Also what's up with the voltage on the 2nd card? It was +0.032 over card1 at stock voltage, too. Both have same BIOS
Click to expand...

It's probably AUX voltage.


----------



## Chopper1591

Quote:


> Originally Posted by *kizwan*
> 
> It's probably AUX voltage.


I see that come by every now and then.

Could someone be so kind to give me some info on AUX voltage?
I have absolutely no idea what it does.


----------



## Regnitto

Quote:


> Originally Posted by *Chopper1591*
> 
> I see that come by every now and then.
> 
> Could someone be so kind to give me some info on AUX voltage?
> I have absolutely no idea what it does.


I believe (and I could be wrong if I misinterpreted the post about it several pages back) it is similar to the CPU PLL voltage for the GPU. I also believe it MAY help with stability on your vram clock, but I'm sure one of the more experienced guys on here can give you a better answer and correct me if I'm wrong.


----------



## Bertovzki

Quote:


> Originally Posted by *Archea47*
> 
> They're on water (Aquacomputer blocks). See sig rig
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Are the voltage readings in the third section normal?
> 
> Also what's up with the voltage on the 2nd card? It was +0.032 over card1 at stock voltage, too. Both have same BIOS


Yeah, what are your ambients? looks good for me then, 1 less card and similar rad surface area, same card and block.


----------



## Archea47

Quote:


> Originally Posted by *Bertovzki*
> 
> Yeah, what are your ambients? looks good for me then, 1 less card and similar rad surface area, same card and block.


Ambient temps are maybe 68°F ... until this thing's been gaming for a couple hours (which it was when recorded)

Also keep in mind I run my pump and fans (3000RPM 120s and 1850 140s) 100% whenever it's on. I have a controller but wear headphones anyway


----------



## Bertovzki

Quote:


> Originally Posted by *Archea47*
> 
> Ambient temps maybe 68*F ... until this thing's been gaming for a couple hours (which it was when recorded)
> 
> Also keep in mind I run my pump and fans (3000RPM 120s and 1850 140s) 100% whenever it's on. I have a controller but wear headphones anyway


Ok cool thanks for info , good to have a benchmark comparison , for my rig when its done.


----------



## tsm106

Quote:


> Originally Posted by *Archea47*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Bertovzki*
> 
> Yeah what are your ambients , looks good for me then 1 less card and similar rad surface area , same card and block.
> 
> 
> 
> Ambient temps maybe 68*F ... until this thing's been gaming for a couple hours (which it was when recorded)
> 
> *Also keep in mind I run my pump and fans (3000RPM 120s and 1850 140s) 100% whenever it's on*. I have a controller but wear headphones anyway

lol, that's one way to do it!


----------



## wermad

Got my 14 fans at ~1000rpm. Can't hear them at all; the pump runs louder than they do.


----------



## Bertovzki

Quote:


> Originally Posted by *tsm106*
> 
> lol, that's one way to do it!


I was hoping for good temps at about 800rpm or less.


----------



## Archea47

Yeah I just only have so much room in my case for rads so ... RPM

Any additional input on what the 1.00V reading in the third section of HWinfo64 is and how/when to adjust it?

And also voltage on my gpu cores? AB is set to synchronize settings between cards but gpu2 reads 1.332V while gpu1 reads 1.300V with +100mV. At stock gpu2 was 1.232 and gpu1 1.200. Is it probably just poor sensor calibration from Sapphire or? GPU2 is 69% ASIC and GPU1 is 78% FWIW


----------



## Chopper1591

Quote:


> Originally Posted by *Archea47*
> 
> Yeah I just only have so much room in my case for rads so ... RPM
> 
> Any additional input on what the 1.00V reading in the third section of HWinfo64 is and how/when to adjust it?
> 
> And also voltage on my gpu cores? AB is set to synchronize settings between cards but gpu2 reads 1.332V while gpu1 reads 1.300V with +100mV. At stock gpu2 was 1.232 and gpu1 1.200. Is it probably just poor sensor calibration from Sapphire or? GPU2 is 69% ASIC and GPU1 is 78% FWIW


I have no experience with cf myself.
But as you said, you have synchronize on and gpu2 has a 0.032v higher stock voltage. It makes perfect sense that when you put +100 on both GPUs (thus the sync), gpu2 still reads 0.032v more.

Do you get what I mean?


----------



## tsm106

Quote:


> Originally Posted by *Archea47*
> 
> Yeah I just only have so much room in my case for rads so ... RPM
> 
> Any additional input on what the 1.00V reading in the third section of HWinfo64 is and how/when to adjust it?
> 
> And also voltage on my gpu cores? AB is set to synchronize settings between cards but gpu2 reads 1.332V while gpu1 reads 1.300V with +100mV. At stock gpu2 was 1.232 and gpu1 1.200. *Is it probably just poor sensor calibration from Sapphire or?* GPU2 is 69% ASIC and GPU1 is 78% FWIW


Don't worry about it. The synchronize setting is NOT there to make them run the same voltage in an offset voltage setup; it's to give them the same offset increment in voltage. Each GPU is going to run at whatever leakage it is going to run at, and there's nothing to do about it. That said, it is easier to run cards with the same leakage level, since it keeps their voltages closer, and temps for that matter.
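To put numbers on that (a purely illustrative sketch using the voltages reported above, not anything Afterburner actually runs), a synchronized offset adds the same increment on top of each card's own stock voltage, so the leakage-driven gap between cards is preserved:

```python
# Illustrative only: how a synchronized +100mV offset applies per GPU.
# Stock voltages taken from the readings posted above.
stock_v = {"gpu1": 1.200, "gpu2": 1.232}  # volts

offset_mv = 100  # the shared offset AB applies to both cards

# Each card gets its own stock voltage plus the same offset,
# so the 0.032 V gap between the cards remains.
applied = {gpu: round(v + offset_mv / 1000, 3) for gpu, v in stock_v.items()}

print(applied)  # {'gpu1': 1.3, 'gpu2': 1.332}
```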


----------



## BuildTestRepeat

IT CAME TODAY!

Got the Powercolor PCS+ R9 290 Installed !


----------



## Lao Tzu

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> 
> 
> IT CAME TODAY!
> 
> Got the Powercolor PCS+ R9 290 Installed !


Congratz!!!!


----------



## Orga

Hi all,

I need help.

I've owned a Sapphire R9 290 4GB for about a year, but didn't really play games on it until recently.

The problem is that the fan fails to respond to increasing temperature. The fan spins at about 1075rpm at MAX even at 95 degrees C, and fails to keep the GPU at a lower temp.

I can't hear the fan's noise getting louder, which is supposed to happen to keep the GPU cool.

I did not OC the card; I have the latest official driver from the Sapphire website (14.12) and updated to the latest vbios as well. I'm running on air with Win8.1 64-bit, an i7 4770k and an Asus Maximus VI Hero.

I used TechPowerUp GPU-Z and Unigine Valley to monitor temps and run the test.

Do you have this problem?


----------



## wermad

Any one tried running 5x1 eyefinity by any chance w/ 290/290X?


----------



## pshootr

Why do the cards have the BIOS switch anyway? Did they really think it would be necessary, or is it just to appeal to enthusiasts?
Quote:


> Originally Posted by *Orga*
> 
> Hi all,
> 
> I need help.
> 
> I've owned a Sapphire R9 290 4GB for about a year, but didn't really play games on it until recently.
> 
> The problem is that the fan fails to respond to increasing temperature. The fan spins at about 1075rpm at MAX even at 95 degrees C, and fails to keep the GPU at a lower temp.
> 
> I can't hear the fan's noise getting louder, which is supposed to happen to keep the GPU cool.
> 
> I did not OC the card; I have the latest official driver from the Sapphire website (14.12) and updated to the latest vbios as well. I'm running on air with Win8.1 64-bit, an i7 4770k and an Asus Maximus VI Hero.
> 
> I used TechPowerUp GPU-Z and Unigine Valley to monitor temps and run the test.
> 
> Do you have this problem?


Hello, you can use Sapphire's "TRIXX" to set up fan profiles, or you could also use "Afterburner" from MSI.


----------



## tsm106

Quote:


> Originally Posted by *wermad*
> 
> Any one tried running 5x1 eyefinity by any chance w/ 290/290X?


You're gonna need that DP hub. Btw, I don't notice tearing using two DVI and one DP at 144hz.


----------



## keikei

Hi Everyone,

I have a 290 trix, what is considered a decent/stable OC without voltage? I'm trying to squeak out some more performance. I have the gpu clock at 1100 right now. Thanks.


----------



## wermad

Quote:


> Originally Posted by *tsm106*
> 
> You gonna need that DP hub. Btw, I don't notice tearing using two dvi and 1 dp at 144hz.


I have some inexpensive monitors for 5x1 in mind: dvi, dvi, hdmi, 2x dvi>dp hub (Accell dual DP hub w/ USB power). I wasn't sure if the tearing was still present when using DVI to DP adapters. I know Karlitos had issues w/ a WQHD (dvi-d to dp) and his quads. I've read all his threads and it seems the issue was w/ the DisplayPort and drivers. Last news was this last fall, when he abandoned his Eyefinity setup for a 4k screen. I may sell two of my cards and pick up a 295x2 if it's too much hassle.

Any concerns running active DVI to DP adapters for all screens (295x2), in your opinion? If this doesn't come to fruition, I may go w/ a 32" WQHD monitor or a used Swift.


----------



## Regnitto

Well, I had myself a crappy evening last night. I pulled my 290 out because I didn't like the way I had the pump mounted (hoses coming out of the pump right next to the PCIe connector instead of at the top of the card), and it finally got to bugging me enough to try to fix it.

So I got everything redone, got it all back in the computer, booted up and got into Windows just fine... then the screen started flashing badly after a minute or two. I discovered AB had for some reason set the 1200/1500 OC on stock volts/power limit. Resetting AB made the flashing go away for the moment, but all the sliders were at minimum instead of in the middle as they were supposed to be. Two minutes later, the flashing started back up, then the screen went black and the computer hard froze. I reset the PC; it booted to Windows fine, AB looked normal with stock clocks and all sliders in the middle as they should be... five minutes later the flashing started back up, black screen, freeze.

At this point I'm wondering if I got everything plugged in all the way, so I unplug the VGA power connectors, take the card out of the slot, put it all back in and double/triple check all connections, and boot into Windows. Artifacting desktop within seconds, hard freeze, then the screen goes black.

So now I'm running the 7850 again and the ol' lady is on onboard HD4250. I've had no problems since.

One person in the red mod thread said he had the same problem with a 290x and fixed it by reseating the pump and uninstalling/reinstalling drivers. So I'm gonna try that tonight, and if that don't work, try it one more time with the stock cooler.

Do you guys think that a re-seat of the pump and fresh drivers might fix this, or do you think it's done? I will be disassembling the red mod and inspecting the card this evening; if I find anything visually, I'll take pics and post to get your opinions as well.

Also, I never had any temp problems; while this issue was occurring, it was idling at its usual 37c. I had no problems with the card before repositioning the pump.


----------



## Roboyto

Quote:


> Originally Posted by *keikei*
> 
> Hi Everyone,
> 
> I have a 290 trix, what is considered a decent/stable OC without voltage? I'm trying to squeak out some more performance. I have the gpu clock at 1100 right now. Thanks.


1100 without voltage is pretty good already when you consider the stock clock for a 290 is 947 MHz; you're at 16% with no additional voltage.

Card sounds like it acts similar to my XFX, and if that is the case you may be able to get to ~1200 with ~75mV assuming the VRMs don't get too toasty; 80C max is about where you want to be.

Quote:


> Originally Posted by *Regnitto*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> well, I had myself a crappy evening last night. I pulled my 290 out because I didn't like the way I had the pump mounted (hoses coming out of the pump right next to the pcie connector instead of at the top of the card), and it finally got to bugging me enough to try to fix it.
> 
> So I got everything redone, got it all back in the computer, boot up and get into windows just fine............screen starts flashing after a minute or 2, bad. discover AB had for some reason set the 1200/1500 oc on stock volts/power limit. Reset AB and the flashing went away for the moment, but all the sliders were at minimum instead of in the middle as they were supposed to be. 2 min later, the flashing started back up and then screen goes black and computer hard froze. reset pc, boots to windows fine, AB looks normal w stock clocks and all sliders in the middle as they should be.......5 min later the flashing starts back up, black screen, freeze.
> 
> at this point I'm wondering if I got everything plugged in all the way, so i unplug the VGA power connectors, take the card out of the slot, put it all back in and double/triple check all connections, boot into windows. Artifacting desktop within seconds, hard freeze, then screen goes black.
> 
> so now I'm running the 7850 again and the ol' lady is on onboard HD4250. I've had no problems since.
> 
> One person in the red mod thread said he had the same problem with a 290x and fixed it by reseating the pump and uninstall/reinstall drivers. so I'm gonna try that tonight, then if that don't work, try it one more time with the stock cooler.
> 
> 
> 
> Do you guys think that a re-seat of the pump and fresh drivers might fix this, or do you think it's done? I will be disassembling the red mod and inspecting the card this evening, if I find anything visually, I'll take pics and post to get your opinions as well.
> 
> also, never had any temp problems, while this issue was occurring, it was idling at it's usual 37c. I had no problems with the card before repositioning the pump.


Maybe.

I would uninstall the drivers with Display Driver Uninstaller, uninstall any OC utilities, and then use a registry cleaner, like CCleaner, just to make sure everything is gone. Re-install the GPU, start fresh with drivers and see what happens at stock settings.

You're running the red mod, so what are you using to keep the primary VRMs cool? If those have been running without proper cooling you could have damaged something.

Primary VRMs in yellow


----------



## Regnitto

Quote:


> Originally Posted by *Roboyto*
> 
> Maybe.
> 
> I would uninstall the drivers with Display Driver Uninstaller, uninstall any OC utilities, and then use a registry cleaner, like CCleaner, just to make sure everything is gone. Re-install the GPU, start fresh with drivers and see what happens at stock settings.
> 
> You're running the red mod, so what are you using to keep the primary VRMs cool? If those have been running without proper cooling you could have damaged something.
> 
> Primary VRMs in yellow
> 
> 
> Spoiler: Warning: Spoiler!





Spoiler: Here's a pic of my red mod:




you can see why i repositioned my pump










The primary VRM is cooled with a 120mm Bygears fan (110cfm @ 2000rpm), and VisionTek already has an aluminum heatsink on it. I've never seen VRM1 or VRM2 go over 70c. The GPU also stays in the 60s.


Spoiler: Temps:


----------



## keikei

Quote:


> Originally Posted by *Roboyto*
> 
> 1100 without voltage is pretty good already when you consider the stock clock for a 290 is 947 MHz; you're at 16% with no additional voltage.
> 
> Card sounds like it acts similar to my XFX, and if that is the case you may be able to get to ~1200 with ~75mV assuming the VRMs don't get too toasty; 80C max is about where you want to be.


Great. I'm using Trixx; do I need to use the VDDC offset, and what power limit setting should I use? I'm at 20% right now. Thank you.


----------



## Regnitto

Took a couple pics of my TIM spread, cuz it looks like I may not have had the pump seated properly...


Spoiler: Warning: Spoiler!


----------



## bluedevil

Quote:


> Originally Posted by *Regnitto*
> 
> took a couple pics of my TIM spread cuz it looks like I may not have had the pump seated properly........
> 
> 
> Spoiler: Warning: Spoiler!


Might just be overheating due to the TIM application. From what you were sayin', the symptoms would add up.


----------



## Regnitto

Quote:


> Originally Posted by *bluedevil*
> 
> Might just be overheating due to the TIM application. From what you were sayin', the symptoms would add up.


That's what I'm hoping at the moment. Getting it cleaned up and gonna retest. Not seeing anything else wrong at the moment.


----------



## Roboyto

Quote:


> Originally Posted by *keikei*
> 
> Great. I'm using trixx, do i need to use the vddc offset and what power limit settings should I use? I'm at 20% right now. Thank you.


Yeah, use the offset increase; it maxes at +200mV. Make sure you keep an eye on VRM1 temps though; they get out of hand quickly on air cooling.

Power limit you will have to play around with. To find out if you have enough power limit for a given clock speed, run a bench and record your score, then increase the power limit to the max of +50% and see if your score comes up or not. If the score increases, then you know the card was throttling clocks due to a lack of power. You can also monitor the GPU-Z graphs for your clock speed: if you notice the core clock graph sporadically moving up and down, then you know you have a power issue, or a temperature one if you're at 95C.

My card will run 100% stable at 1200/1500 with +87mV. One time I was benching and noticed my scores were down. After a quick investigation of the Trixx software I noticed my power limit was at 0%. I re-ran the bench and all was well. You can see that in the link below:

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/33060_20#post_23247070
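The bench-twice test and the clock-graph check described above can be sketched as a couple of tiny helpers (illustrative only; the function names, thresholds, and sample data are assumptions, not part of GPU-Z or Trixx):

```python
# Illustrative sketch of the two throttle checks described above.

def power_limit_helped(score_before, score_after, margin=0.02):
    """Bench, raise the power limit, bench again: if the score improves
    by more than `margin` (fractional), the card was power-throttling."""
    return score_after > score_before * (1 + margin)

def clocks_unstable(core_clocks_mhz, target_mhz, tolerance=0.05):
    """Mimics eyeballing the GPU-Z core-clock graph: any sample more
    than `tolerance` below the target clock suggests throttling."""
    floor = target_mhz * (1 - tolerance)
    return any(mhz < floor for mhz in core_clocks_mhz)

# Hypothetical logged samples: a dip to 870 MHz against a 1200 MHz target
print(clocks_unstable([1200, 1197, 870, 1200], 1200))  # True
```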

Quote:


> Originally Posted by *Regnitto*
> 
> That's what I'm hoping at the moment. getting it cleaned up and gonna retest. not seeing anything else wrong at the moment.


Been in this situation before with my Kraken cooled 290. I fired up Heaven and the GPU temp went straight to 95C and started throttling. I didn't see any artifacts though


----------



## Regnitto

Quote:


> Originally Posted by *Roboyto*
> 
> Been in this situation before with my Kraken cooled 290. I fired up Heaven and the GPU temp went straight to 95C and started throttling. I didn't see any artifacts though


I was on the desktop idling; never got a chance to run Heaven. GPU temp was 37c in HWiNFO (sorry, no screenshot).


----------



## tsm106

Quote:


> Originally Posted by *wermad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> You gonna need that DP hub. Btw, I don't notice tearing using two dvi and 1 dp at 144hz.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i have some inexpensive monitors for 5x1 in mind: dvi, dvi, hdmi, 2x dvi>dp hub (accell dual dp hub w/ usb power). I wasn't sure if the tearing was still present when using dvi to dp adapters. I know Karlitos had issues w/ a WQHD (dvi-d to dp) and his quads. I've read all his threads and it seems the issue was w/ the displayport and drivers. Last news was this last fall where he abandoned his Eyefinity setup for a 4k screen. i may sell two of my cards and pick up a 295x2 if its too much hassle.
> 
> *Any concerns running active dvi to dp adapters for all screens* (295x2) in your opinion? If this doesn't go to fruition, I may go w/ a 32" wqhd k monitor or a used Swift.

Karl's issues are the DL DVI to DP adapters. Basically they suck, are incredibly inconsistent, and often fail over time. Ironically, the one AMD sent him didn't work either, lol. Only go DP if you can go DP natively, and/or avoid the DL adapter if possible. My 144hz panels support every port interface, so going two DVI and one DP is not an issue. If you have native DP on yours, try it out. Btw, the DP 1.2 hub has limited bandwidth, so it will depend on your panel density.


----------



## Regnitto

Spoiler: Update!



Quote:


> Originally Posted by *Regnitto*
> 
> well, I had myself a crappy evening last night. I pulled my 290 out because I didn't like the way I had the pump mounted (hoses coming out of the pump right next to the pcie connector instead of at the top of the card), and it finally got to bugging me enough to try to fix it.
> 
> So I got everything redone, got it all back in the computer, boot up and get into windows just fine............screen starts flashing after a minute or 2, bad. discover AB had for some reason set the 1200/1500 oc on stock volts/power limit. Reset AB and the flashing went away for the moment, but all the sliders were at minimum instead of in the middle as they were supposed to be. 2 min later, the flashing started back up and then screen goes black and computer hard froze. reset pc, boots to windows fine, AB looks normal w stock clocks and all sliders in the middle as they should be.......5 min later the flashing starts back up, black screen, freeze.
> 
> at this point I'm wondering if I got everything plugged in all the way, so i unplug the VGA power connectors, take the card out of the slot, put it all back in and double/triple check all connections, boot into windows. Artifacting desktop within seconds, hard freeze, then screen goes black.
> 
> so now I'm running the 7850 again and the ol' lady is on onboard HD4250. I've had no problems since.
> 
> One person in the red mod thread said he had the same problem with a 290x and fixed it by reseating the pump and uninstall/reinstall drivers. so I'm gonna try that tonight, then if that don't work, try it one more time with the stock cooler.
> 
> Do you guys think that a re-seat of the pump and fresh drivers might fix this, or do you think it's done? I will be disassembling the red mod and inspecting the card this evening, if I find anything visually, I'll take pics and post to get your opinions as well.
> 
> also, never had any temp problems, while this issue was occurring, it was idling at it's usual 37c. I had no problems with the card before repositioning the pump.






I re-installed the stock air cooler, uninstalled/reinstalled drivers, and uninstalled AB. Everything seems to be working fine so far; it's been up for 30 min with no flickering or artifacting. Ran a few passes of Heaven, and all is well. Performance is right where it should be at stock, and temps are identical to what they were before I removed the cooler (screen coming soon): GPU 79c max, VRM1 72c max. Looks like the problem was the pump not being seated properly. I'm just gonna leave it stock for now and test drive it for the weekend. Not gonna re-mod it though; gotta make it last a while. I'll use the AIO to cool my girl's Athlon II X3.


----------



## pb4ugo2bed

This is my first time with an AMD card
I'll add my Gigabyte Radeon R9 290 4GB WINDFORCE to the mix.

OverclockGPU1175150069mVBenchmarkmxtmp89.png 2011k .png file


----------



## pb4ugo2bed




----------



## TeddyShot

I purchased a Powercolor PCS+ R9 290 from Newegg this week and it arrived a couple days ago. I haven't tried it yet, as I'm waiting for the rest of my parts to arrive. I've been looking through this thread and noticed a lot of people have had issues with this card, but the number of those reports has apparently died out in the last couple months. I'm certain you guys have more experience with R9 290s, so I'm wondering: have the issues that people were experiencing before been fixed in the latest revision? Is there any way of telling which revision you have received? And is there anything I should look out for when I finally build my PC?


----------



## Sgt Bilko

@Arizonian Can you take me off the list?
My DD 290's have since moved to the wife's rig and I'm running a 295x2 now


----------



## Orga

Hi, thanks for replying.

Like I stated earlier, both Afterburner and Trixx failed to force the fan to full speed. When the card first powers up during boot, I can hear the fan spin at a higher speed; I know this because the noise level from the graphics card is high. But it's only for a moment, and then it slows down, even if I use Afterburner or other tools. Yes, I have set it to 100% and hit apply.

Has anyone experienced this before?


----------



## keikei

Quote:


> Originally Posted by *Orga*
> 
> Hi, thanks for replying.
> 
> Like I stated earlier, both Afterburner and Trixx failed to force the fan to full speed. When the card first powers up during boot, I can hear the fan spin at a higher speed; I know this because the noise level from the graphics card is high. But it's only for a moment, and then it slows down, even if I use Afterburner or other tools. Yes, I have set it to 100% and hit apply.
> 
> Has anyone experienced this before?


What I do in CCC (Catalyst Control Center): if I'm not playing games, I set the fan speed to manual @ 20%. When I decide to game, I change it to about 55%. *I also set all my games to vsync or something similar to cap the fps*, otherwise the GPU will overwork itself to the max (the reason for high temps). When I'm done playing, I manually change the fan speed back to 20%.


----------



## kizwan

Quote:


> Originally Posted by *Orga*
> 
> Hi, thanks for replying.
> 
> Like I stated earlier, both Afterburner and Trixx failed to force the fan to full speed. When the card first powers up during boot, I can hear the fan spin at a higher speed; I know this because the noise level from the graphics card is high. But it's only for a moment, and then it slows down, even if I use Afterburner or other tools. Yes, I have set it to 100% and hit apply.
> 
> Has anyone experienced this before?


It would probably help if you posted a screenshot of your custom fan speed graph. You probably forgot to enable "User Define", which causes AB to ignore your settings.


----------



## dodgethis

Got the reference 290x under my AQC waterblock and backplate. I used Fujipoly Extreme thermal pads for the memory instead of TIM.

Here be photos.


----------



## Orga

Quote:


> Originally Posted by *kizwan*
> 
> It would probably help if you posted a screenshot of your custom fan speed graph. You probably forgot to enable "User Define", which causes AB to ignore your settings.









Hi,

Thanks again for helping me here. I am pretty sure I have enabled all the necessary items. I've included a few screenshots taken while running Unigine. As you can see, soon after the temp hit 94c, it started to downclock to about 350MHz while the fan ran at 100% the whole time.

By the way, my power supply is a Corsair TX750M. Should be no problem, I guess.


----------



## Roboyto

Quote:


> Originally Posted by *Orga*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hi,
> 
> Thanks again for helping me here. I am pretty sure I have enabled all the necessary items. I've included a few screenshots taken while running Unigine. As you can see, soon after the temp hit 94c, it started to downclock to about 350MHz while the fan ran at 100% the whole time.
> 
> By the way, my power supply is a Corsair TX750M. Should be no problem, I guess.


It looks like you are using AMD Overdrive and MSI Afterburner, typically this will cause nothing but problems. Disable AMD Overdrive, and only use Afterburner to make adjustments to the card. You also stated above you have tried with Trixx as well. Be certain that only one overclocking utility is running at a time.


----------



## Archea47

Quote:


> Originally Posted by *dodgethis*
> 
> Got the reference 290x under my AQC waterblock and backplate. I used Fujipoly Extreme thermal pads for the memory instead of TIM.


Looking great! Mine have been champs so far.

What thickness did you use on the vram and vrms? Are your core temps OK?


----------



## Orga

I am sure I only used one OC program while running the test. The reason I included AMD Overdrive is to show that I knew of its existence.

I have an issue with Trixx: whenever I try to click the settings tab, it crashes. So I never really use Trixx.

I also installed RadeonPro and created a game profile for it; it shouldn't run unless I launch the specified game.


----------



## Regnitto

Quote:


> Originally Posted by *Orga*
> 
> I am sure I only used one OC program while running the test. The reason I included AMD Overdrive is to show that I knew of its existence.
> 
> I have an issue with Trixx: *whenever I try to click the settings tab, it crashes*. So I never really use Trixx.
> 
> I also installed RadeonPro and created a game profile for it; it shouldn't run unless I launch the specified game.


There is a bug in one version of Trixx where clicking the settings tab doesn't work and crashes the program. It was the newest version when I tried Trixx a few months ago; idk if they have released a newer version since then, but if not, roll back one version and the settings tab works fine.


----------



## tsm106

Quote:


> Originally Posted by *Orga*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> It would probably help if you posted a screenshot of your custom fan speed graph. You probably forgot to enable "User Define", which causes AB to ignore your settings.
> 
> 
> 
> 
> 
> Hi,
> 
> Thanks again for helping me here. I am pretty sure I have enabled all the necessary items. I've included a few screenshots taken while running Unigine. As you can see, soon after the temp hit 94c, it started to downclock to about 350MHz while the fan ran at 100% the whole time.
> 
> By the way, my power supply is a Corsair TX750M. Should be no problem, I guess.

I can't tell anything about your settings from that screen. Grab the AB screen and extend it all the way so we can see the rest of the settings. Btw, why do you have no voltage monitoring and voltage not unlocked, but force voltage enabled? The card will throttle when you hit the overheat threshold.


----------



## dodgethis

Quote:


> Originally Posted by *Archea47*
> 
> Looking great! Mine have been Champs so far.
> 
> What thickness did you use on the vram and vrms? Are your core Temps OK?


I used the 0.5mm ones. Truth be told, I have not tested the card yet; it's sitting in the box while I put together my rig. But you're right, I ought to do some testing to make sure. I think I will do so next weekend.

I did remove the block after the application to check on the contact. The contact was good but not complete, probably due to my CoolerMaster X-1 being very viscous until I burn in the TIM. I tried to remove the block just now but the surface tension won't let me. I hope that's an indicator of solid contact.

We will know how well it works once I do some testing


----------



## Arizonian

Quote:


> Originally Posted by *Jester435*
> 
> Can I be added to the club
> 
> XFX r9 290x DD
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added








Quote:


> Originally Posted by *BuildTestRepeat*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> IT CAME TODAY!
> 
> Got the Powercolor PCS+ R9 290 Installed !


Congrats - added








Quote:


> Originally Posted by *pb4ugo2bed*
> 
> This is my first time with an AMD card
> I'll add my Gigabyte Radeon R9 290 4GB WINDFORCE to the mix.
> 
> OverclockGPU1175150069mVBenchmarkmxtmp89.png 2011k .png file


Congrats - added









Quote:


> Originally Posted by *Sgt Bilko*
> 
> @Arizonian Can you take me off the list?
> My DD 290's have since moved to the Wife's rig and i'm running a 295x2 now


As far as I see it, you're still an owner.







Any GPU's I give my kids rigs are really mine.








Quote:


> Originally Posted by *dodgethis*
> 
> Got the reference 290x under my AQC waterblock and backplate. I used Fujipoly Extreme thermal pads for the memory instead of TIM.
> 
> Here be photos.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## keikei

BF4 is a very finicky game. Even with a GPU clock of 1100, I get crashes. The new gen of GPUs can't come soon enough.


----------



## DDSZ

I'm with you








Gigabyte R9 290 with stock cooler


Spoiler: GPU-Z!






http://www.techpowerup.com/gpuz/details.php?id=d62kv


----------



## Sgt Bilko

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> @Arizonian Can you take me off the list?
> My DD 290's have since moved to the Wife's rig and i'm running a 295x2 now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As far as I see it, you're still an owner.
> 
> 
> 
> 
> 
> 
> 
> Any GPU's I give my kids rigs are really mine.

Roger That


----------



## NFleck

Just got my XFX R9 290X today. Waiting on a Kraken G10, as my AIO watercooler bracket is nVidia-only. I may go drill some holes if I get bored, but for now the stock cooler isn't working too badly; in Valley extreme stress testing I can get it to stay around 70C at 100% fan, but it's really loud.


Spoiler: GPU-Z






http://www.techpowerup.com/gpuz/details.php?id=7652e


----------



## kizwan

Quote:


> Originally Posted by *Orga*
> 
> I am sure I only used one OC program while running the test. The reason I included AMD Overdrive is to show that I knew of its existence.
> 
> I have an issue with Trixx: whenever I try to click the settings tab, it crashes. So I never really use Trixx.
> 
> I also installed RadeonPro and created a game profile for it; it shouldn't run unless I launch the specified game.


AMD CCC runs in the background. Even if you didn't open CCC, AMD Overdrive will still be working if you enabled it. Just disable it and restart the computer, then try again with only AB running. Make sure Trixx is also not running in the background.


----------



## Arizonian

Quote:


> Originally Posted by *DDSZ*
> 
> I'm with you
> 
> 
> 
> 
> 
> 
> 
> 
> Gigabyte R9 290 with stock cooler
> 
> 
> Spoiler: GPU-Z!
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/details.php?id=d62kv


Congrats - added








Quote:


> Originally Posted by *NFleck*
> 
> Just got my XFX R9 290X today. Waiting on Kraken G10 as my aio watercooler bracket is nVidia only. I may go drill some holes if I get bored but for now the stock cooler isn't working too badly on valley extreme stress testing I can get it to stay around 70C at 100% fan but it's really loud.
> 
> 
> Spoiler: GPU-Z
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/details.php?id=7652e


Congrats - added


----------



## YellowBlackGod

Long live the R9 290/290x. It is still the best card around (the custom ones) and a good value for money.


----------



## Orga

Quote:


> Originally Posted by *tsm106*
> 
> I can't tell anything about your settings from that screen. Grab the ab screen and extend it all the way so we can see the rest of settings. Btw, why do you have no voltage monitoring, nor voltage unlocked, but you then have force voltage enabled? The card will throttle when you hit the overheat threshold.


Here is the rest of them.

I uninstalled RadeonPro and Trixx, and also disabled Overdrive in CCC. The result is still the same.

By the way, I am running the card in "quiet mode", not Uber.


----------



## ultraex2003

add me
gigabyte 290x @asus bios Raijintek Morpheus+ 2x12 Arctic F12 PWM
http://www.techpowerup.com/gpuz/details.php?id=nbywk


Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!


----------



## dodgethis

Truth be told, my decision to use thermal pads on the VRAM, instead of only TIM, has me worried about contact between the core and the block. Granted, I did use the thinnest ones, the 0.5mm Fujipoly Extreme. I rigged up a ghetto cooling setup to test the 290X and AQC waterblock combo, using some spare tubing and connectors from my previous plans, including some Koolance QDCs.





I will install the card into my rig tomorrow.


----------



## BuildTestRepeat

My Powercolor PCS+ R9 290 seems to have the common black screen bug. It constantly and randomly black screens, but only when trying to display 4K 60Hz; 1080p seems fine. I sent raymond.hsu an email.


----------



## sinnedone

Quote:


> Originally Posted by *keikei*
> 
> BF4 is a very finicky game. Even with a GPU clock of 1100, I get crashes. The new gen of GPUs can't come soon enough.


You probably need more voltage for your overclock. Back down to 1000-1050 and see if it still crashes.


----------



## hyp36rmax

Quote:


> Originally Posted by *ultraex2003*
> 
> add me
> gigabyte 290x @asus bios Raijintek Morpheus+ 2x12 Arctic F12 PWM
> http://www.techpowerup.com/gpuz/details.php?id=nbywk
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Been curious about this cooler, how are your temps? What is your ambient during your testing?


----------



## Jester435

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> As far as I see it, you're still an owner.
> 
> 
> 
> 
> 
> 
> 
> Any GPU's I give my kids rigs are really mine.
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added



I have an R9 290X, but you have me listed as an R9 290 on the list.

Thanks


----------



## Roboyto

Quote:


> Originally Posted by *NFleck*
> 
> Just got my XFX R9 290X today. Waiting on Kraken G10 as my aio watercooler bracket is nVidia only. I may go drill some holes if I get bored but for now the stock cooler isn't working too badly on valley extreme stress testing I can get it to stay around 70C at 100% fan but it's really loud.
> 
> 
> Spoiler: GPU-Z
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/details.php?id=7652e


Congrats and welcome to the club!

If your card has the reference VRM layout I would suggest getting the Gelid VRM kit and using some good thermal pads to tame the volcanic VRMs on your card. Check out my build log for information about the few tweaks I did to my 290 to keep it cool and quiet:

http://www.overclock.net/t/1478544/the-build-formerly-known-as-the-devil-inside-a-gaming-htpc-for-my-wife-4770k-corsair-250d-powercolor-r9-290/20_20#post_22214255


----------



## ultraex2003

Quote:


> Originally Posted by *hyp36rmax*
> 
> Been curious about this cooler, how are your temps? What is your ambient during your testing?


20C ambient, after half an hour of gaming (Battlefield 4, ultra settings).









Heads up: the cooler with the 2x120mm fans is very, very big!








before


Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!







after


Spoiler: Warning: Spoiler!









Spoiler: Warning: Spoiler!


----------



## wermad

Nice temp drop


----------



## tsm106

Quote:


> Originally Posted by *Orga*
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I can't tell anything about your settings from that screen. Grab the ab screen and extend it all the way so we can see the rest of settings. Btw, why do you have no voltage monitoring, nor voltage unlocked, but you then have force voltage enabled? The card will throttle when you hit the overheat threshold.
> 
> 
> 
> Here is the rest of them.
> 
> I uninstalled RadeonPro and Trixx. I also disabled Overdrive in CCC. The result is still the same.
> 
> By the way, I am running the card in "quiet mode", not Uber.

Quote:


> Originally Posted by *tsm106*
> 
> Setup AB like in the pic below. Don't combine apps, as rdr mentioned use gpuz or overdrive while AB is running. Adding to that multiple diagnostic apps that are reading the same sensors can/will cause conflicts.
> 
> 
> 
> After configuring like so, set up two presets: one default, and one for your OC/daily settings/whatever use, etc. Now go to Settings/Profiles and pick your default preset for the 2D profile and the other preset for the 3D profile. Exit AB, and restart it to save settings. Your card should now drop down to low power clocks while idle, and conversely maintain the clocks you chose for the 3D profile under load. Btw, be sure to max Power Limit on your 3D profile.


For more reference:

http://www.overclock.net/t/1529888/pc-freezes-from-youtube-browser-apps-after-r9-290-install/0_40#post_23271996


----------



## NFleck

Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NFleck*
> 
> Just got my XFX R9 290X today. Waiting on Kraken G10 as my aio watercooler bracket is nVidia only. I may go drill some holes if I get bored but for now the stock cooler isn't working too badly on valley extreme stress testing I can get it to stay around 70C at 100% fan but it's really loud.
> 
> 
> Spoiler: GPU-Z
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/details.php?id=7652e
> 
> 
> 
> Congrats and welcome to the club!
> 
> If your card has the reference VRM layout I would suggest getting the Gelid VRM kit and using some good thermal pads to tame the volcanic VRMs on your card. Check out my build log for information about the few tweaks I did to my 290 to keep it cool and quiet:
> 
> http://www.overclock.net/t/1478544/the-build-formerly-known-as-the-devil-inside-a-gaming-htpc-for-my-wife-4770k-corsair-250d-powercolor-r9-290/20_20#post_22214255

Card is http://www.newegg.ca/Product/Product.aspx?Item=N82E16814150675&cm_re=xfx_4gb_r9_290x-_-14-150-675-_-Product

I have the GELID Solutions CL-R9290-01-A R9 290 Series VRM Cooling Enhancement Kit (received), with a 50mm x 50mm x 0.5mm (6W/mK) thermal pad, copper VRAM heatsinks, a Kraken G10, and an Arctic F9 92mm (1800rpm, 43CFM) fan in transit.

I also have a custom bracket (http://keplerdynamics.com/sigmacool/mki) for attaching an AIO cooler to my GTX 660, but the hole spacing is slightly larger, the plate is a bit too big on the backside of the card, and there is no mount for a fan, so I'm just going to wait for the G10 mount.

Your guide is actually the one that led me to the decision to switch from my AIO-watercooled GTX 660 to the 290X and do this mod. Thanks a million for writing and sharing it; I'm sure it's been an inspiration to many.


----------



## Roboyto

Quote:


> Originally Posted by *NFleck*
> 
> 50mm x 50mm x 0.5mm (6w/mK) thermal pad
> 
> Also have a custom bracket (http://keplerdynamics.com/sigmacool/mki) for attaching an aio cooler to my GTX 660, but the holes are slightly larger spacing and the plate is a bit too big on the backside of the card as well as not having a mount for a fan, so I'm just going to wait for the g10 mount.
> 
> Your guide is actually the one I followed to make me come to the decision to make the switch from my aio watercooled gtx 660 to get the 290x and do this mod.. Thanks a million for writing and sharing it as I'm sure it's been an inspiration to many.


I'm curious to see how those pads perform. I don't think you will see the same temps that I was getting, though. The Fuji Extreme have nearly double the thermal conductivity at 11 W/mK, and the Ultra Extreme that I used have nearly triple at 17 W/mK. I know they are a bit on the pricey side, but they do work extremely well. Depending on how far you can/want to push your card, you may end up needing the better pads.
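For a rough sense of what those conductivity numbers mean, here's a minimal sketch of the conduction math. The 30W heat load and full 50mm x 50mm contact area are assumed round numbers for illustration, not measured values:

```python
# Steady-state conduction across a pad: delta_T = Q * t / (k * A)
# Q, t, and A below are assumed round numbers for illustration only.
Q = 30.0         # watts pushed through the pad (assumed)
t = 0.5e-3       # pad thickness in metres (0.5 mm)
A = 0.05 * 0.05  # contact area in m^2 (50 mm x 50 mm, assumed full contact)

for name, k in [("6 W/mK pad", 6.0),
                ("Fuji Extreme, 11 W/mK", 11.0),
                ("Fuji Ultra Extreme, 17 W/mK", 17.0)]:
    delta_t = Q * t / (k * A)  # temperature drop across the pad, in C
    print(f"{name}: ~{delta_t:.2f} C drop across the pad")
```

With ideal full-area contact the differences look small; in practice the effective contact area is much smaller and the heat far more concentrated under the MOSFETs, which is why the higher-grade pads pull noticeably better real-world temps.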

I've used those Kepler Dynamics brackets before. He does make them specifically for AMD, but they have been listed as sold out the last 2-3 times I've looked on his site. Maybe the G10 took all of his demand.









I'm glad the guide helped you out









I've actually replaced the 290 from that build log with a GTX 970 that I got for $260 from NewEgg open box. The VRM temp situation on the little Zotac 970 I got isn't any different than the 290(X); it just isn't as well documented or known, I think. This is likely because some of the cards aren't capable of monitoring the VRM temps; at least my little Zotac isn't.



Spoiler: It's just a little guy















I was surprised to see the VRMs hitting 94C with my IR thermometer, with the stock heatsink and *all stock settings*, while running benchmarks. This is actually worse than either of my reference 290's did at stock clocks; and it only got worse once I attached the Kraken, because the VRMs are on the opposite side of the card so the fan is worthless.

Granted, some of the companies have addressed this issue with a large heatsink/'frontplate' covering the mosfets and RAM chips, but I can't imagine the temperatures being that outstanding, especially when the GPU heat is being blown on them.

The problem is only compounded when you turn up the heat by pushing the power limit with an extremely simple BIOS edit. In fact, the edit to increase the power limit from 6% to 25% was actually necessary to keep the card from throttling at stock settings. I've since rectified the hot VRM problem with a little ingenuity, but I was shocked to see those temperatures, with *all stock settings*, when the cards are so highly regarded for their efficiency.
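To put that power-limit edit in perspective, a quick sketch of the arithmetic. The 160W nominal board power is an assumed figure for illustration; the value a given card's BIOS actually encodes varies:

```python
# A power limit of +N% lets the card draw up to (1 + N/100) x nominal
# board power before the driver starts throttling clocks.
# The nominal figure below is assumed for illustration only.
nominal_w = 160.0  # assumed nominal board power, watts

for limit_pct in (6, 25):
    ceiling_w = nominal_w * (1 + limit_pct / 100)
    print(f"+{limit_pct}% limit -> throttle ceiling ~{ceiling_w:.0f} W")
```

The point being that a +6% cap leaves very little headroom above nominal, so even stock boost clocks can bump into it.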


----------



## NFleck

Quote:


> Originally Posted by *Roboyto*
> 
> I'm curious to see how those pads perform. I don't think you will see the same temps that I was getting though. The Fuji Extreme have nearly double the conductance @ 11w/mK, and the Ultra Extreme that I used have nearly triple @ 17 w/mK. I know they are a bit on the pricey side, but they do work extremely well. Depending on how far you can/want to push your card, you may end up needing the better pads.


The Fujipoly 11W/mK ones were quite hard to get shipped to Canada for a reasonable price. I went way over budget on the gfx card and all the extras, so paying that much extra for shipping on top of the price of the Fuji forced me to go with something a bit cheaper. I don't really overclock my GPU at all, and more or less just like the look and temps of liquid cooling on my card, so I think these will suffice for now; I can always upgrade to Fuji down the line. Thank you for your advice.
Quote:


> Originally Posted by *Roboyto*
> 
> I've used those kepler dynamics brackets before. He does make them specific for AMD but they have been listed as sold out the last 2-3 times I've looked on his site. Maybe the G10 took all of his demand


Yeah, mine's the Nvidia one. I was thinking about taking the bracket out to my shop and drilling 4 new holes, but I haven't had the urge to do it yet, and as mentioned there is no mount for a fan, so I don't think it would be good enough. I do have a 200mm fan intaking from the side and front of my (HAF XM) case, a 140mm fan exhausting out the top rear, and my CPU is using an H100 with a 240 rad in push/pull out the top. Do you think it's worth it to try the Kepler mount and Gelid kit while I'm waiting for the G10 and ramsinks?

Quote:


> Originally Posted by *Roboyto*
> 
> I'm glad the guide helped you out
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've actually replaced the 290 from that build log with a GTX 970 that I got for $260 from NewEgg open box. The VRM temp situation on the little Zotac 970 I got isn't any different than the 290(X); it just isn't as well documented or known, I think. This is likely because some of the cards aren't capable of monitoring the VRM temps; at least my little Zotac isn't.
> 
> 
> Spoiler: It's just a little guy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was surprised to see the VRMs hitting 94C, with my IR thermometer, with stock heatsink and *all stock settings* while running benchmarks. This is actually worse than either of my reference 290's did at stock clocks; and it only got worse once I attached the Kraken because the VRMs are on the opposite side of the card so the fan is worthless.
> 
> Granted some of the companies have addressed this issue with a large heatsink/'frontplate' covering the mosfets and RAM chips, but I can't imagine the temperatures being that outstanding especially when the GPU heat is being blown on them.
> 
> The problem is only compounded when you turn up the heat by pushing the power limit, with an extremely simple BIOS edit. The BIOS edit to increase power limit from 6% to 25% was actually necessary to keep the card from throttling with stock settings. I've since rectified the hot VRM problem with a little ingenuity, but was shocked to see those temperatures, with *all stock settings*, when the cards are so highly regarded for their efficiency.


Does the 970 outperform the 290? This is one of the largest hardware purchases I've made for this PC, and I was hoping the 290X would be a good card to get. I hope I didn't make a dumb decision.

Also, is it helpful to flash any different ROMs on my 290X? Or is that only useful for 290 -> 290X?


----------



## Roboyto

Quote:
> Originally Posted by *NFleck*
> 
> The fujipoly 11w/mk ones were quite hard to get shipped to canada for a reasonable price.. I went way over-budget on the gfx card and with all the extras, so paying that much extra for shipping on top of the price of the fuji forced me to go with something a bit lower.. I don't really overclock my gpu at all, and more-or-less just like the look and temps of liquid cooling on my card, so I think these will suffice for now and I can always upgrade to fuji down the line. Thank you for your advice.
> Yeah mines the nvidia one.. I was thinking about taking the bracket out to my shop and drilling 4 new holes, but I haven't had the urge to do it yet and as mentioned there is no mount for a fan so I don't think it would be good enough. I do have a 200mm fan intaking from the side and front of my (HAF XM) case, and a 140mm fan exhausting out the top rear, and my cpu is using an H100 with a 240 rad push/pull out the top. Do you think it's worth it to try the kepler mount and gelid kit while I'm waiting for the G10 and ramsinks?


> Does the 970 outperform the 290? This is one of the largest hardware purchases I've made for this pc, and was hoping the 290x would be a good card to get. Hope I didn't make a dumb decision.
> 
> Also, is it helpful to flash any different roms on my 290x? or is that only useful for 290->290x?


Let me know how it goes; you may be fine with those pads. You may get lucky and get a card that will do a decent overclock and require little to no extra voltage or power limit.









I would say just wait for the Kraken. Do you know if that Kepler Dynamics is compatible with a 970? I'd buy yours and use that rather than butchering my Kraken if I don't have to. It looks rather ridiculous with the whole fan section hanging off the end of the card right now.

I am working on a comparison of 970 and 290. If you look around on the review sites you will typically see it outperforming the 290(X) depending on the test/bench/game they are running. However, I don't think that (m)any of those sites re-run the benchmarks for the 290(X) with current drivers, which can make a BIG difference in the scores/FPS they generate. They also don't specify what version of the 290(X) they are using benchmark results from. If they chose to use results from the throttle prone reference cards, then those scores are considerably lower than they could be.









From what I have seen so far they trade blows if you have an average card. However, my good 290 which clocks to 1275/1675, give or take a lil' depending on bench, definitely beats the 970 I have. On the other end of the spectrum, above average 970's would probably be in the same ballpark or above my good 290.

I figured you, or someone, would ask why I bought a 970 as it is a lateral move in performance. I have 4 primary reasons for my switch to the 970:


1. CUDA encoding for Blu-ray rips
  - This takes a lot of stress off my 4790K so it isn't running at 100% for a rip
  - Rip times have been cut in half or less
2. PhysX for when the new Batman comes out
  - My wife LOVES the Batman games, and when PhysX is done right it is pretty sweet
  - The HTPC this card resides in is *technically* her PC since I sold the PS3 on her
3. The lower power consumption with my 450W PSU
  - The PSU wasn't having any issues running the 290, but for longevity's sake this is a better choice
4. Boredom... I wanted something new to tinker with
  - Wanted to see what all the hype was about
  - Wanted to pit the 290/970 against each other myself


I wasn't willing to pay full price for the card though. I patiently watched Newegg open box until one became available and snatched it. For $264 shipped, it was definitely worth buying for what I am using the card for.

If you take efficiency into consideration, this is where Nvidia definitely wins. However, most people in the enthusiast GPU market don't really care about efficiency so take that as you will.

It will be very interesting if the rumors for the R9 380X are true: 45% more GPU cores with the same 290W TDP. That would make 4096 cores, matching the 4096-bit memory bus from the 3D-stacked HBM. Not a bad move as far as efficiency is concerned on AMD's part... but these are rumors, so we will see what unfolds.

You didn't make a bad decision by any means. You have the top tier (single GPU) from AMD, and it will likely serve you well for years to come before you need to make an upgrade.

If you don't want to go down the overclocking road, then a BIOS flash isn't really going to be something that you will need to explore. See how the card runs before you think about doing anything like that.


----------



## Mega Man

Quote:


> Originally Posted by *YellowBlackGod*
> 
> Long live the R9 290/290x. It is still the best card around (the custom reference ones) and a good value for money.


fixed


----------



## Bertovzki

Quote:


> Originally Posted by *dodgethis*
> 
> Truth be told, I am rather worried that my decision to use thermal pads on the VRAM, instead of only TIM, has gotten me worried about contact between the core and the block. Granted, I did use the thinnest ones in the form of Fujipoly Extreme 0.5mm ones. I did up a ghetto cooling rig to test the 290x and AQC waterblock combo. I used some spare tubing and connecters from my previous plans, including some Koolance QDCs.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> I will install the card into my rig tomorrow.


Did you do a test run and check that the TIM had good contact between the GPU and block, and that your thermal pads had been compressed and made good contact?
If it was like my block, the mounts sat up a little off the PCB, but I'd say they pull down nicely when screwed down on all mounts.
Did you see my pics on page 3409 of all the block measurements?
http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/34080

Just curious if any of these blocks are different given that @Arcea47 had to mill his.


----------



## NFleck

Quote:


> Originally Posted by *Roboyto*
> 
> If you don't want to go down the overclocking road, then a BIOS flash isn't really going to be something that you will need to explore. See how the card runs before you think about doing anything like that.


My EVGA GTX 660 SC had locked voltages and was not really overclockable at all, so it's not something I've been able to try. I do, however, have extensive experience overclocking CPUs under water, and actually have wanted to try at least seeing how far I can push it (safely). Is there a benefit to flashing ROMs for overclocking? From what I've read you need to have an ASUS BIOS to use some tools. I'll do some searching and see what I can find.


----------



## Roboyto

Quote:


> Originally Posted by *NFleck*
> 
> My EVGA GTX 660 SC had locked voltages and was not really overclock'able at all, so it's not something I've tried. I do, however, have extensive experience overclocking cpus under water, and definitely want to try at least seeing how far I can push it (safely). Is there a benefit to flashing roms for overclocking? From what I've read you need to have an asus bios to use some tools. I'll do some searching and see what I can find.


You only need an ASUS BIOS to use GPU Tweak. You can have plenty of success with Sapphire Trixx or MSI Afterburner to overclock your card. See the 'Need to Know' post in my sig for general 290(X) information. If you've got problems/questions with your OC, feel free to shoot me a PM; I've tweaked, poked, and prodded 6 of these cards and know them pretty well.

With the water cooler on the core you won't need to worry about temperatures there; it is the primary VRM1 that will limit your overclockability. For benching purposes you can run up to ~90C, but I wouldn't run that high on a regular basis. The general consensus around here is ~80C and under for normal use.

Just noticed you bought an XFX card, so make sure you register it for the lifetime warranty







My XFX is a champ overclocker, so hopefully you receive a good one as well.


----------



## NFleck

Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *NFleck*
> 
> My EVGA GTX 660 SC had locked voltages and was not really overclock'able at all, so it's not something I've tried. I do, however, have extensive experience overclocking cpus under water, and definitely want to try at least seeing how far I can push it (safely). Is there a benefit to flashing roms for overclocking? From what I've read you need to have an asus bios to use some tools. I'll do some searching and see what I can find.
> 
> 
> 
> You only need an ASUS BIOS to use GPU Tweak. You can have plenty of success with Sapphire Trixx or MSI Afterburner to overclock your card. See the 'Need to Know' post in my sig for general 290(X) information. If you've got problems/questions with your OC, feel free to shoot me a PM; I've tweaked, poked, and prodded 6 of these cards and know them pretty well.
> 
> With the water cooler on the core you won't need to worry about temperatures there; it is the primary VRM1 that will limit your overclockability. For benching purposes you can run up to ~90C, but I wouldn't run that high on a regular basis. The general consensus around here is ~80C and under for normal use.
> 
> Just noticed you bought an XFX card, so make sure you register it for the lifetime warranty
> 
> 
> 
> 
> 
> 
> 
> My XFX is a champ overclocker, so hopefully you receive a good one as well.

It has void stickers on the screws of the stock cooler, so I think I'll be screwed once I take it off.
Also, I'm using AB for fan control while on the stock fan, so I'll probably stick with it for OC.


----------



## Mega Man

You can read it in the warranty FAQ on the website, but in the _*USA*_ installing a different cooler DOES NOT void the warranty.

Didn't anyone notice that it seems as if HydraVision is gone?


----------



## Roboyto

Quote:


> Originally Posted by *NFleck*
> 
> It has void stickers on the screws on the stock cooler so I think i'll be screwed once I take it off..
> Also, I'm using AB for my fan control while using stock fan, so I'll probably stick with it for oc.


If you can be a little crafty and warm up the little stickers, you may be able to remove them in one piece. Stick them on a piece of wax paper, and if you need to RMA the card at some point, pop 'em back on.

Quote:


> Originally Posted by *Mega Man*
> 
> you can read it on the warranty faq on the website but in the *usa* installing a different cooler DOES NOT violate warranty
> 
> didnt anyone notice it seems as if hydravision is gone ?


He resides north of the border, unfortunately.









Yes I did...strange indeed


----------



## dodgethis

Quote:


> Originally Posted by *Bertovzki*
> 
> Did you do a test run ? and see if the Tim had good contact between the GPU and block , and that your thermal pads had been compressed and had good contact ?
> If it was like my block , the mounts sat up a little off the PCB , but would id say pull down nicely when screwed down on all mounts.
> Did you see my pics on page 3409 of all the block measurements ? :
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/34080
> 
> Just curious if any of these blocks are different given that @Arcea47 had to mill his.


Yep, the thermal pads all came off the card onto the block during a test fit with the screws, and with good impressions. The core had good contact during the same test fit.

This is where I am now, just after the first boot. Praise be to GabeN.


__
https://flic.kr/p/qScDGF


Going to run some games now.

After running The Evil Within: that game taxed the core of my MSI 290 Gaming up to about 95 degrees.










TBH I am even more impressed by the lower CPU temps and the complete lack of noise. I guess dumping the heat outside of the case, rather than letting it rise to the CPU, really makes a difference. And the radiator with the two Scythe Grand Flexes running at 1400rpm is about 50cm away from me and I cannot even notice it at all!


----------



## ZealotKi11er

I run my cards at 1175MHz/1500MHz +100mV. I can clock them higher, but after you play for an hour plus, the extra heat is not worth it. Even 360 + 240 push/pull struggles with these cards. The good news is that I can run most games @ 1440p fine with one card, or two cards @ stock. I only need to OC for Crysis 3 and BF4 @ 4K.


----------



## Bertovzki

Quote:


> Originally Posted by *dodgethis*
> 
> Yep, the thermal pads all came off the card onto the block after I removed the card during a test fit with the screws, and with good impressions. The core had good contact during the same test fit.
> 
> This is where I am now just after the first boot. Praise be to GabeN.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> __
> https://flic.kr/p/qScDGF
> 
> 
> 
> Going to run some games now.
> 
> After running The Evil Within. That game taxed my MSI 290 Gaming up to about 95 degrees.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> TBH I am even more impressed by the lower CPU temps and the complete lack of noise. I guess dumping the heat outside of the case, rather than letting it rise to the CPU, really makes a difference. And the radiator with the two Scythe Grand Flexes running at 1400rpm is about 50cm away from me and I cannot even notice it under load!


Sweet, sounds all good then.


----------



## hyp36rmax

Anyone have this happen to them before in Titanfall? #errorfall



I just did a clean reformat of Windows 8.1 with my CrossfireX AMD R9 290X VAPOR-X cards and loaded up a game of Titanfall to configure my settings and smooth everything out at 4K (3840x2160), and noticed "ERROR" all over the map. It was pretty spectacular; I wish I had taken video of it. I exited the game thinking I did something wrong with my drivers or setup, and could not reproduce it, as everything was normal when I reloaded the game. If it happens again I'll be ready.


----------



## ZealotKi11er

Quote:


> Originally Posted by *hyp36rmax*
> 
> Anyone have this happen to them before in Titanfall? #errorfall
> 
> 
> 
> 
> I just did a clean reformat of Windows 8.1 with CrossfireX AMD R9 290X VAPOR-X and loaded up a game of Titanfall to configure my settings and smooth everything out in 4k 3840x2160 and noticed "ERROR" all over the map. It was pretty spectacular as I wished I had taken video of this. I exited the game thinking I did something wrong with my drivers or setup and coul not reproduce this as it was normal when I reloaded up the game. If it happens again i'll be ready.


More like a game error.


----------



## hyp36rmax

Quote:


> Originally Posted by *ZealotKi11er*
> 
> More like a game error.


Of course, I thought it was kind of cool... lol


----------



## Bertovzki

Quote:


> Originally Posted by *hyp36rmax*
> 
> Of course, I thought it was kind of cool... lol


Ha, yeah, Titanfall taking the piss by the looks of it. A good way to do an error msg.


----------



## fragamemnon

After 24hrs of folding, this is as high as I got. Might be able to push it even further, but we'll see about that.








3+1x120 rads with EK-Vardars are currently cooling both cards and the CPU running at 4.8GHz. Power draw at the wall is 700W for the system alone. Fans are PWM-controlled, alas, because my fan controller hasn't arrived yet.


----------



## Orga

Quote:


> Originally Posted by *tsm106*
> 
> For more reference:
> 
> http://www.overclock.net/t/1529888/pc-freezes-from-youtube-browser-apps-after-r9-290-install/0_40#post_23271996


No, changing the setting doesn't solve my problem. It seems like the software fails to run the GPU fan at its maximum speed even though I set it to 100% in whatever utility I use. Maybe the BIOS is wrongly configured?

Is there something limiting the fan's speed in the BIOS? I notice it never goes past 1070 RPM.


----------



## k2blazer

Hi, wondering who can help me here.

I have a pair of R9 290 Sapphire Tri-X cards, I'll be sticking with stock cooling but I would like to get a pair of backplates to improve the look and stop the card flexing.

Any recommendations?


----------



## Lao Tzu

Hi, it came today, Sapphire TRI-X R9 290X OC 4GB GDDR5, Cooling: TRI-X, validation:

http://www.techpowerup.com/gpuz/details.php?id=u37eg


----------



## Widde

Just checked my ASIC quality; it's 69 and 71%. Don't know if it's related to why I'm not able to get over 1100 on the core.







No matter what I do the cards won't go over 1100, although I haven't re-checked after lowering my CPU OC, which had been seemingly stable for over a year and a half; the PC went unstable when trying to push the GPUs.


----------



## kizwan

Quote:


> Originally Posted by *Orga*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> For more reference:
> 
> http://www.overclock.net/t/1529888/pc-freezes-from-youtube-browser-apps-after-r9-290-install/0_40#post_23271996
> 
> 
> 
> No, changing the setting doesn't solve my problem. It seem like the software failed to run the GPU fan to its maximum speed even though i set it to 100% in whatever utilities I used. May be the bios is wrongly configure?
> 
> Is there something limiting fan's speed in the bios? I notice it's never past 1070 rpm?

Did you try flipping the BIOS switch to the other/Uber BIOS?
Quote:


> Originally Posted by *k2blazer*
> 
> Hi, wondering who can help me here.
> 
> I have a pair of R9 290 Sapphire Tri-X cards, I'll be sticking with stock cooling but I would like to get a pair of backplates to improve the look and stop the card flexing.
> 
> Any recommendations?


Backplate won't be able to stop card flexing. YMMV though. Good luck!
Quote:


> Originally Posted by *Widde*
> 
> Just checked my ASIC quality, it's 69 and 71 %, Dont know if it's related to why I'm not able to get over 1100 on the core
> 
> 
> 
> 
> 
> 
> 
> No matter what I do the cards wont go over 1100 allthough I havent checked after lowering my cpu oc that have been seemingly stable for over a year and a half, pc went unstable when trying to push the gpus.


Even after increasing voltage?


----------



## Widde

Quote:


> Originally Posted by *kizwan*
> 
> Did you try flip the BIOS switch to other/uber BIOS?
> Backplate won't be able to stop card flexing. YMMV though. Good luck!
> Even after increasing voltage?


+100mV and +50% power limit. Temps are fine, even the VRMs.


----------



## kizwan

Quote:


> Originally Posted by *Widde*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Did you try flip the BIOS switch to other/uber BIOS?
> Backplate won't be able to stop card flexing. YMMV though. Good luck!
> Even after increasing voltage?
> 
> 
> 
> +100mV and +50% power limit, Temps are fine even vrms
Click to expand...

May I know the temps, please? While folding, right?


----------



## Widde

83°C on the core, VRM1 55°C, VRM2 71°C

1075/1250 is the overclock atm


----------



## rdr09

Quote:


> Originally Posted by *Widde*
> 
> 83c on the core, vrm1 55c, vrm2 71c
> 
> 1075/1250 is the overclock atm


You are already temp-limited (core).


----------



## Widde

Quote:


> Originally Posted by *rdr09*
> 
> you are already temp limited (core).


How? It isn't throttling







And I can't get over 1100 even when I drop the ambient so the core hovers around 65-68°C


----------



## kizwan

Quote:


> Originally Posted by *Widde*
> 
> 83c on the core, vrm1 55c, vrm2 71c
> 
> 1075/1250 is the overclock atm


Yeah, temps look OK, especially VRM1. VRM2 is strangely high though. I suspected VRM1 temp, but it's not that. If +100mV is enough for gaming, then it should be enough for folding too. I don't fold, but in games, especially BF4, mine consistently crashed/BSOD'd when VRM1 ran in the 60s Celsius for an extended period of time (5 to 10 minutes, based on the records I have here). The overclock was 1150/1400 at +100mV.


----------



## fragamemnon

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Widde*
> 
> 83c on the core, vrm1 55c, vrm2 71c
> 
> 1075/1250 is the overclock atm
> 
> 
> 
> you are already temp limited (core).
Click to expand...

This.
I was able to extend my OC margin by 6% without adding extra current when I changed the cooling solution - even though they throttle at 94°C, these GPUs, like all others, perform better at lower operating temperatures.
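To put fragamemnon's point in numbers: a toy model of temperature-based throttling (this is *not* AMD's actual PowerTune control loop; the step size and formula are made up for illustration) shows why staying under the 94°C target keeps the full requested clock available.

```python
# Toy model of temperature-based clock throttling. NOT AMD's real
# PowerTune algorithm -- just an illustration of the behavior the
# thread discusses: under the 94 C target you hold your clock, over
# it the card sheds MHz, bottoming out around 500 MHz.

TEMP_TARGET_C = 94      # Hawaii's default throttle target
STEP_MHZ = 13           # hypothetical down-step per degree of overshoot

def effective_clock(requested_mhz, core_temp_c):
    """Return the clock this simplified model would actually run at."""
    if core_temp_c < TEMP_TARGET_C:
        return requested_mhz            # under target: full requested clock
    # over target: shed clock in proportion to the overshoot
    overshoot = core_temp_c - TEMP_TARGET_C
    return max(requested_mhz - STEP_MHZ * (overshoot + 1), 500.0)

print(effective_clock(1100, 83))   # cool card: holds the requested 1100 MHz
print(effective_clock(1100, 95))   # over target: starts shedding clock
```

The shape matches what owners in this thread report: a card sitting at 83°C keeps its clock, while one pinned at 94-95°C bleeds MHz until the cooling catches up.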


----------



## Widde

Quote:


> Originally Posted by *kizwan*
> 
> Yeah, temps look ok, especially VRM1. VRM2 strangely high though. I was suspecting VRM1 temp but it's not. If +100mV is enough for gaming, then it should be enough for folding too. I don't fold but with games, especially BF4, mine consistently crashed/BSOD when VRM1 running at 60s Celsius for extended period of time (based on the records I have here 5 to 10 minutes). Overclocked was 1150/1400 +100mV.


Sigh. I'll just have to accept that the silicon in these cards has been scraped off the concrete and put into GPUs, then


----------



## rdr09

Quote:


> Originally Posted by *Widde*
> 
> How? It isnt throtteling
> 
> 
> 
> 
> 
> 
> 
> And cant get over 1100 when I'm dropping the ambient so the core hovers around 65-68


I based it on your initial temp of 83°C.


----------



## Widde

Quote:


> Originally Posted by *rdr09*
> 
> i based it on your initial temp of 83.


I'm just clinging to hope







I have 2 dud GPUs that won't overclock and a crappy 3570K that won't go over 4.5, so I'm just getting sick and tired of it. Gonna go full custom and build a new PC when I get a job.


----------



## fragamemnon

I was barely keeping 1100MHz on the stock cooler on one GPU; the other became unstable at anything over 1050MHz.
If you get your temps lower, you'll see a miracle.

Oh, and that was not game-stable. I could only fold at those clocks.


----------



## kizwan

Quote:


> Originally Posted by *Widde*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> you are already temp limited (core).
> 
> 
> 
> 
> 
> 
> 
> How? It isnt throtteling
> 
> 
> 
> 
> 
> 
> 
> And cant get over 1100 when I'm *dropping the ambient* so the core hovers around 65-68
Click to expand...

If this is the case, folding probably needs more than +100mV. I personally wouldn't run more than +100mV 24/7 unless I could keep temps in the 50s at least.

EDIT: I meant the range between +100 and +200mV.


----------



## rdr09

Quote:


> Originally Posted by *Widde*
> 
> I'm just clinging to hope
> 
> 
> 
> 
> 
> 
> 
> Have 2 dud gpus that wont overclock and a crappy 3570k that wont go over 4.5 so just getting sick and tired of it, Gonna go full custom and a new pc when I get a job


I understand how it feels when your components can't OC the way you want them to. My 290s are a little above average but not the best. I game at stock, though. It's my CPU that I have to OC.


----------



## Lao Tzu

Hi, I have a problem with VSR. I choose a resolution beyond native, but some games, like NFS, don't display full screen. Hitman Absolution has an option called "full screen exclusive"; when I turn it off, the game runs full screen with VSR. I have the same problem on the desktop. Some help, please!


----------



## miklkit

Hi!

My MSI 280X is bottlenecking my system and I'm looking for an upgrade. I can get a PowerColor 290X 4GB for a decent price, but is it any good? Or should I wait for the 300 series? I have a pretty new Seasonic 850W PSU.


----------



## rdr09

Quote:


> Originally Posted by *miklkit*
> 
> Hi!
> 
> My MSI 280X is bottlenecking my system and I'm looking for an upgrade. I can get a Powercolor 290X 4gb for a decent price, but is it any good? Or should I wait for the 3xxx series? I have a pretty new Seasonic 850watt PSU.


Next gen is almost here; just wait. Out of curiosity . . . how did you check for the bottleneck?


----------



## miklkit

It's pretty easy to do actually.

Here is Heaven 4.0 

Here is Half Life 2 Ep. 2 

Here is a pre alpha game on the Unity engine. 

The CPU is typically just loafing along while the GPU is running around 100%, and FPS varies with how much action there is on screen.


----------



## rdr09

Quote:


> Originally Posted by *miklkit*
> 
> It's pretty easy to do actually.
> 
> Here is Heaven 4.0
> 
> Here is Half Life 2 Ep. 2
> 
> Here is a pre alpha game on the Unity engine.
> 
> The cpu is typically just loafing along while the gpu is running around 100% and fps varies by how much action there is on the screen.


Well, not sure about that game, but Heaven hardly uses the CPU.

Run something like Firestrike. A game to test is BF4 MP 64 or BF3 MP 64.

edit: my Phenom II X4 manages Heaven and Firestrike with a 7950 . . .


----------



## miklkit

Is BF4 free? I have exactly zero interest in online games.


----------



## rdr09

Quote:


> Originally Posted by *miklkit*
> 
> Is BF4 free? I have exactly zero interest in online games.


Not yet. BF3, I think, is. It will flex your CPU just the same.

Here are some other synthetic benchmarks . . .

http://www.techpowerup.com/downloads/Benchmarking/Futuremark/

They are useful for diagnosing problems with systems.


----------



## miklkit

Here are the Firestrike loads. Not much going on CPU-wise.


----------



## tsm106

Quote:


> Originally Posted by *miklkit*
> 
> Here is Firestrike loads. Not much going on cpu wise.


Ugh... man, not even sure where to start. Run a Firestrike pass with your CPU clocked as high as it will go and your GPU maxed out. Post the link here.


----------



## miklkit

Link? Didn't see anything about a link. Here is the score.


----------



## Blue Dragon

Quote:


> Originally Posted by *miklkit*
> 
> Link? Didn't see anything about a link. Here is the score.


Click the orange bar directly underneath the score frame to drop down the result details; the link is in there.


----------



## tsm106

Quote:


> Post the link here.


The link to the run.

Why did you crop the info out of the screenshot? I can't tell what your clocks are from that.


----------



## stren

So I have 4 reference 290s (one flashed to a 290X), all running The Stilt's old BIOSes from the mining days. However, those BIOSes seemed to remove LLC, so stable volts were quite high. Now that I'm done repurposing them in my build, it's time to tweak.

So all I'd like is a stable, non-throttling BIOS to run. Is there anything new since those days? I was never a fan of the PT1/PT1T BIOSes; they seemed to make the cards noticeably less stable.

Any new BIOSes? Are we still flashing from a bootable USB at a command line? Pic relevant because they are under water:


----------



## miklkit

Meh. Nevermind..........


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *miklkit*
> 
> It's pretty easy to do actually.
> 
> Here is Heaven 4.0
> 
> Here is Half Life 2 Ep. 2
> 
> Here is a pre alpha game on the Unity engine.
> 
> The cpu is typically just loafing along while the gpu is running around 100% and fps varies by how much action there is on the screen.
> 
> 
> 
> 
> 
> 
> 
> well, not sure about that game but Heaven hardly uses the cpu.
> 
> run something like Firestrike. A game to test is BF4 MP 64 or BF3 MP 64.
> 
> edit: my Phenom II X 4 usages Heaven and Firestrike with a 7950 . . .
Click to expand...

He is complaining about a GPU bottleneck, not a CPU bottleneck.


----------



## tsm106

Quote:


> Originally Posted by *kizwan*
> 
> He is complaining about gpu bottleneck, not cpu bottleneck.


I think he's complaining about the wrong thing lol. But apparently he dun wanna go there.
Quote:


> Originally Posted by *miklkit*
> 
> Meh. Nevermind..........


----------



## kizwan

Quote:


> Originally Posted by *tsm106*
> 
> I think he's complaining about the wrong thing lol. But apparently he dun wanna go there.


Hmmm. Based on his previous post, his 280X is tapped out (GPU-usage-wise) & he was looking for upgrade advice on the PowerColor 290X.

Regarding his 280X, his Firestrike score looks OK to me compared to other 280X scores.
http://www.3dmark.com/fs/3853395


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> Hmmm. Based on his previous post, his *280X tapped out* (gpu usage wise) & was looking an upgrade advice on Powercolor 290X.
> 
> Regarding his 280X, his firestrike score look ok to me, compare to the other 280X scores.
> http://www.3dmark.com/fs/3853395


That is how it should be. But Heaven is not a good test, and I am not sure about the games he/she tested. I recommended BF4 or BF3 MP64.


----------



## tsm106

Shrugs... I think in most situations an AMD CPU is the bottleneck in comparison to Intel's much higher-IPC counterparts. That is beside the point if the question is whether to get a faster GPU.


----------



## mus1mus

His score is just off. He should be getting around 9K in Firestrike; stock clocks should give him around 8.5K total.

I have the same set-up, and though AMD lags Intel, the deficit should only show in the Physics and Combined scores.

Core 1180 / 1600 memory, IIRC.
Quote:


> Originally Posted by *miklkit*
> 
> Meh. Nevermind..........


I would start with a driver uninstall and cleaning out your OS if you can. Grab the Omega driver, then figure out the OC and temps.


----------



## Orga

Quote:


> Originally Posted by *kizwan*
> 
> Did you try flip the BIOS switch to other/uber BIOS?


Hi,

I flipped the switch to the uber BIOS... and I am getting something weird. The fan spins like a motorbike... vroom vroom. I attached a screenshot from GPU-Z.

Not enough power?


----------



## kizwan

Quote:


> Originally Posted by *Orga*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Did you try flip the BIOS switch to other/uber BIOS?
> 
> 
> 
> Hi,
> 
> I flip the switch to uber bios.. and I am getting something wired. The fan spin like a motorbike... vroom vroom.. I attached a screenshot from GPUZ
> 
> Not enough power?
Click to expand...

I think the fan PWM controller is probably kaput. Can you check whether the fan connector is plugged in properly?


----------



## fragamemnon

You are throttling to 500MHz, core temp is 94°C, and the VRMs are also hot, especially VRM2. Even when the fan is at 100% it still can't cool the card down.

This seems like bad contact to me - have you disassembled the card or done anything to it?
I'd suggest taking the cooler off, if it doesn't void your warranty, reapplying the TIM, and replacing the thermal pads. If you are comfortable with that, of course.


----------



## rdr09

Quote:


> Originally Posted by *fragamemnon*
> 
> You are throttling to 500MHz, core temp is 94o and VRMs are also hot, especially VRM 2. Even when the fan is at 100% it still can't cool it down.
> 
> This seems like bad contact to me - have you disassembled or done anything to it?
> I'd suggest taking the cooler off, if it doesn't break your warranty, reapplying the TIM and replacing thermal pads. If you are comfortable with that, of course.


^This exactly. GPU-Z was showing the current temp. I bet if that was the _Max_ . . . the temp was going over 95°C, making the GPU throttle like you said.


----------



## joeh4384

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Roger That


Quote:


> Originally Posted by *Orga*
> 
> Hi,
> 
> I flip the switch to uber bios.. and I am getting something wired. The fan spin like a motorbike... vroom vroom.. I attached a screenshot from GPUZ
> 
> Not enough power?


I would take it apart and reapply paste. It seems like the core doesn't have good contact or it has a **** paste job.


----------



## Orga

Well, it's the stock design. I never took the heatsink/cooler off.

But previously, under the silent-mode BIOS, it didn't go vroom vroom. It seems like something is holding the fan back.


----------



## Orga

Quote:


> Originally Posted by *kizwan*
> 
> I think the fan PWM controller probably kaput. Can you see whether the fan connector is plugged in properly?


I have no idea where the fan connector is. We'll investigate more tomorrow. What is PWM?


----------



## fragamemnon

Quote:


> Originally Posted by *Orga*
> 
> Well, it's from stock design. I never take off the heat sink/cooler.
> 
> But previously under silent mode, it's doesn't go vroom vroom.. Seem like something holding up the fan


Yes. The BIOS was holding it back.
The normal BIOS limits it to 47% fan speed (IIRC). Otherwise it sounds like a jet-powered vacuum.

I bet your GPU was throttling even worse.


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fragamemnon*
> 
> You are throttling to 500MHz, core temp is 94o and VRMs are also hot, especially VRM 2. Even when the fan is at 100% it still can't cool it down.
> 
> This seems like bad contact to me - have you disassembled or done anything to it?
> I'd suggest taking the cooler off, if it doesn't break your warranty, reapplying the TIM and replacing thermal pads. If you are comfortable with that, of course.
> 
> 
> 
> ^this exactly. GPUZ was showing current temp. i bet if that was at _Max_ . . . that temp was going over 95 making the gpu throttle like you said.
Click to expand...

Yes, it's throttling because the core is at 94°C. But that is not the issue being discussed; it was a continuation of a previous discussion - see a couple of pages back. The issue is the fan running at a lower RPM than it should at 100% fan speed.








Quote:


> Originally Posted by *Orga*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I think the fan PWM controller probably kaput. Can you see whether the fan connector is plugged in properly?
> 
> 
> 
> I have no idea where is the fan connector is. We'll investigate more tomorrow. What is the PMW?
Click to expand...

If I'm not mistaken the fan is a PWM fan. If you can, examine the fan connector.


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> Yes, it's throttling because core is at 94C. But that is not the issue that was being discuss. It was continuation from previous discussion. See a couple pages back. The issue is the fan running at lower rpm than it should for 100% fan speed.
> 
> 
> 
> 
> 
> 
> 
> 
> If I'm not mistaken the fan is PWM fan. If you can, examine the fan connector.


My bad. frag pointed out the issue with the BIOS.


----------



## Lao Tzu

Hi, add me!!!

validation:
http://www.techpowerup.com/gpuz/details.php?id=u37eg

Sapphire AMD R9 290X OC 4GB GDDR5
Cooling: TRI-X


----------



## fragamemnon

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fragamemnon*
> 
> You are throttling to 500MHz, core temp is 94o and VRMs are also hot, especially VRM 2. Even when the fan is at 100% it still can't cool it down.
> 
> This seems like bad contact to me - have you disassembled or done anything to it?
> I'd suggest taking the cooler off, if it doesn't break your warranty, reapplying the TIM and replacing thermal pads. If you are comfortable with that, of course.
> 
> 
> 
> ^this exactly. GPUZ was showing current temp. i bet if that was at _Max_ . . . that temp was going over 95 making the gpu throttle like you said.
> 
> Click to expand...
> 
> Yes, it's throttling because core is at 94C. But that is not the issue that was being discuss. It was continuation from previous discussion. See a couple pages back. The issue is the fan running at lower rpm than it should for 100% fan speed.
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Orga*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I think the fan PWM controller probably kaput. Can you see whether the fan connector is plugged in properly?
> 
> Click to expand...
> 
> I have no idea where is the fan connector is. We'll investigate more tomorrow. What is the PMW?
> 
> Click to expand...
> 
> If I'm not mistaken the fan is PWM fan. If you can, examine the fan connector.
Click to expand...

Yes, I guess I deviated from the topic.
But that is bothersome, you have to admit! I can't just skim over it.

Plus, I also gave him the reason for the 'vroom vroom' effect.









It could be the PWM controller or it could be the reading. It depends on how loud the cooler is. We could use either noise or another tachometer in order to determine the cause.

Also yes, the fan is 4-pin, PWM controlled.
see here.

Edit: On a side note, Sapphire doesn't take kindly to opening the card up, even for a TIM replacement.
You might want to try an RMA first if the problem is in the PWM controller, the fan, or anything else hardware-related.


----------



## kizwan

Quote:


> Originally Posted by *fragamemnon*
> 
> Yes, I guess I deviated from the topic.
> But that is bothersome, you have to admit! I can't just skim over it.
> 
> Plus, I also gave him the reason for the 'vroom vroom' effect.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It could be the PWM controller or it could be the reading. It depends on how loud the cooler is. We could use either noise or another tachometer in order to determine the cause.
> 
> Also yes, the fan is 4-pin, PWM controlled.
> see here.


Well, you did skim the important part of the issue. 94°C is not bothersome, to be honest. The Hawaii cards were designed that way. If the card runs like that for a thousand years, it still won't break. If you don't believe me, see me a thousand years later.







To summarize for you: he tried to increase the fan speed using three different utilities (not at the same time), but the fan doesn't respond to the setting. The fan speed is correctly set & detected by GPU-Z at 100%, but the RPM remains low. Hence I suggested flipping the BIOS switch.

There you go. No more detours.

I'm running out of ideas. Maybe you guys can figure it out. There's also a member at our local forum having the same problem.
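For what it's worth, the symptom kizwan summarizes (duty commanded at 100%, RPM stuck around 1070) can be spotted mechanically from a monitoring log. A minimal sketch, assuming hypothetical (duty %, RPM) sample pairs and a made-up healthy-slope threshold - not any real GPU-Z API:

```python
# Rough sanity check for the symptom discussed above: commanded fan
# duty rises but measured RPM barely moves. The threshold is made up;
# the "stuck" data mimics Orga's readings (100% duty, ~1070 RPM).

def fan_tracks_duty(samples, min_rpm_per_duty=20.0):
    """samples: list of (duty_percent, rpm) pairs from a monitoring log.
    Returns False when RPM fails to scale with commanded duty."""
    lo = min(samples, key=lambda s: s[0])
    hi = max(samples, key=lambda s: s[0])
    if hi[0] == lo[0]:
        return True                       # can't judge from one duty level
    slope = (hi[1] - lo[1]) / (hi[0] - lo[0])
    return slope >= min_rpm_per_duty      # healthy fans gain RPM with duty

healthy = [(20, 1100), (50, 2400), (100, 4000)]
stuck   = [(20, 1050), (50, 1060), (100, 1070)]
print(fan_tracks_duty(healthy))  # True
print(fan_tracks_duty(stuck))    # False -- fan or controller likely faulty
```

A flat duty-to-RPM curve like the "stuck" data points at hardware (PWM controller, fan motor, or connector) rather than software, which is consistent with the RMA advice later in the thread.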


----------



## rdr09

Quote:


> Originally Posted by *Orga*
> 
> Hi,
> 
> I flip the switch to uber bios.. and I am getting something wired. The fan spin like a motorbike... vroom vroom.. I attached a screenshot from GPUZ
> 
> Not enough power?


If ambient is normal and case airflow is good . . . then nothing else will help but an RMA. It should still be under warranty.


----------



## fragamemnon

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fragamemnon*
> 
> Yes, I guess I deviated from the topic.
> But that is bothersome, you have to admit! I can't just skim over it.
> 
> Plus, I also gave him the reason for the 'vroom vroom' effect.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It could be the PWM controller or it could be the reading. It depends on how loud the cooler is. We could use either noise or another tachometer in order to determine the cause.
> 
> Also yes, the fan is 4-pin, PWM controlled.
> see here.
> 
> 
> 
> Well you did skimmed the important part of the issue. 94C is not bothersome to be honest. The Hawaii card was designed that way. If the card run like that for a thousand years, it still won't break the card. If you don't believe me, see me a thousand years later.
> 
> 
> 
> 
> 
> 
> 
> To summarize for you; He tried to increase fan speed using three software (not at the same time) but the fan doesn't respond to the setting. The fan speed is correctly set & detected by GPU-Z at 100% but RPM remain low. Hence I suggest flip the BIOS switch.
> 
> There you go. No more detour.
> 
> I'm running out of ideas. Maybe you guys can figure it out. Also there's member at our local forum also having same problem.
Click to expand...

But 94°C at 2750RPM, and throttling down to _500MHz_ - half of what it's supposed to run at - is abnormal!
You could almost cool it passively at that clock.

Another thing I can think of is a broken fan - motor or bearings. It could be vibrating, which a) produces noise, and b) keeps it from maintaining its rated RPM.


----------



## Agent Smith1984

That thing is toasting hot and throttling horribly.
There is nothing about 94°C that wouldn't be "bothersome" to me..... even _if_ it wasn't throttling like crazy.









Either RMA the card if you can (ideal), buy a third-party cooler ($$), purchase a used cooler/fan from someone who has gone water cooling, or find a way to rig up a different fan on that thing.

Those are listed in order of best solution to worst, btw.....


----------



## thrgk

I have a question that has been bugging me for years. When handling video cards or any CPU or PC part, when they are not hooked up to power, can static kill them? Can laying a video card on a rug kill it, or touching the gold PCB contacts?

I am an OCD person and don't want to hurt my cards. Nothing is broken; just wondering out of curiosity.


----------



## Agent Smith1984

Couldn't tell ya.

I mean, of course we all know you are supposed to keep these things static-free, and most recommend a wrist strap during hardware installations, but to be honest, I have left PC hardware all over the place - carpet, laundry piles (don't ask) - and never had anything die from being "mishandled"....

However, that is probably all luck. I recommend none of those things and highly advise you to keep your stuff static-free.


----------



## JourneymanMike

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Couldn't tell ya.
> 
> I mean, of course we all know you are supposed to keep these things static free, and most recomend a wrist band during hardware installations, but to be honest, I have left PC hardware all over the place; carpet, laundry piles (don't ask), and never had anything die from being "mishandled"....
> 
> However, that is probably all luck, and I recomend none of those things, and highly advise you keep your stuff static free


I've done the same... However, I do set them down carefully... Probably luck also!


----------



## pdasterly

Sold one of my 290X cards. Should I get a 295X or wait for the 390X? Also, is there any info on whether the 290X will Crossfire with the 390X?


----------



## hyp36rmax

Quote:


> Originally Posted by *pdasterly*
> 
> sold one of my 290x cards, should I get 295X or wait for the 390x. *Also is there any info if the 290x will xfire with the 390x*?


I don't think this will work as it's supposed to be a different series altogether and not a rebrand like the HD 7970 and R9 280X.


----------



## Agent Smith1984

Quote:


> Originally Posted by *hyp36rmax*
> 
> I don't think this will work as it's supposed to be a different series altogether and not a rebrand like the HD 7970 and R9 280X.


I highly doubt you will be able to crossfire 290x with the 390x when it drops....

However, you will be able to crossfire a 290x with a 295x, since they are both Hawaii....

Damn Hawaii core had 3072 stream processors shut off in it all along....

I WONDER if any of the folks with a 290X or an unlockable 290 will be able to flash a 295 BIOS and unlock even more potential.


----------



## kizwan

Quote:


> Originally Posted by *thrgk*
> 
> I have a question that has been bugging me for years. When handling video cards or any cpu or PC part, when they are not hooked to power, can static kill the card? Can like laying a video card on a rug kill it or toughing the gold pcb?
> 
> I am an OCD person and dont want to hurt cards. Nothing is broken, just wondering out of curiosity.


Yes, static discharge can in fact kill a card. Laying a video card on a rug won't necessarily kill it if there's no static charge there. Have you ever seen a small spark when your finger nearly touches the metal surface of a PC case? IMO the chance of a damaging static discharge is pretty small, but better safe than sorry. I always touch the metal surface of the case to discharge any static before handling electronic hardware.


----------



## mirzet1976

Can someone explain this total memory reading? The left is 32-bit dxdiag and the right is 64-bit dxdiag. The graphics card is an R9 290.


----------



## thrgk

Quote:


> Originally Posted by *kizwan*
> 
> Yes, static discharge is in fact can kill the card. Laying video cards on the rug not necessarily will kill the cards if there's no static charge there. Have you seen a small spark when your finger(s) nearly touching the metal surface of the PC case? IMO, the chance for static discharge is pretty small but better safe than sorry. I always touching the metal surface of the case to discharge any static charge before handling any electronic hardware.


So if you touch metal (with one hand or both?) it discharges you?


----------



## fragamemnon

Quote:


> Originally Posted by *thrgk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Yes, static discharge is in fact can kill the card. Laying video cards on the rug not necessarily will kill the cards if there's no static charge there. Have you seen a small spark when your finger(s) nearly touching the metal surface of the PC case? IMO, the chance for static discharge is pretty small but better safe than sorry. I always touching the metal surface of the case to discharge any static charge before handling any electronic hardware.
> 
> 
> 
> So if you touch metal (with one hand or both?) it discharges u?
Click to expand...

As long as it is grounded. Your PC case is; typically the outside of your stereo system (if metal) is, and a home radiator is also a viable choice.


----------



## hyp36rmax

Quote:


> Originally Posted by *mirzet1976*
> 
> Does someone can explain this about total memory, the left dxdiag32 and right dxdiag 64. Graphic is R9 290


Here you go!









Quote:


> *DirectX Diagnostic or DxDiag shows wrong total memory*
> 
> Running the Microsoft DirectX Diagnostic Tool may show the incorrect total memory under the Display tab for the video card, as shown in the picture below. In this example, a NVIDIA GeForce 9800GT with 512MB of video memory is being reported as 3303MB approx. total memory.
> 
> 
> 
> This is only the "approximate" total of memory the video card is using and will include not only the total video memory but also the shared system memory in the computer. Viewing the display properties for the video card will break down how the memory is used, as shown in the example below. As can be seen in the picture below, the adapter properties show both the dedicated video memory and the shared system memory.


*Source:* Link
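The quoted explanation, in numbers - a minimal sketch (the shared-memory figure is just the example's 3303 − 512; the function name is mine, not DxDiag's):

```python
# DxDiag's "Approx. Total Memory" lumps dedicated VRAM together with
# shared system memory, which is why a 512 MB card can report ~3303 MB.
# The figures below match the 9800GT example in the quote.

def approx_total_mb(dedicated_vram_mb, shared_system_mb):
    """What DxDiag's Display tab roughly reports as total memory."""
    return dedicated_vram_mb + shared_system_mb

# 512 MB dedicated VRAM + ~2791 MB shared system memory
print(approx_total_mb(512, 2791))  # 3303
```

The same arithmetic explains mirzet1976's 32-bit vs. 64-bit difference: a 64-bit process can address (and therefore report) far more shared system memory than a 32-bit one.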


----------



## thrgk

Quote:


> Originally Posted by *fragamemnon*
> 
> As long as it is grounded. Your PC case is, typically the outside of your stereo system (if metal) is, or a home radiator is also a viable choice.


Ah OK, that's cool. Hopefully my STH10 is grounded then, lol. I always touch it with both hands before working on anything.


----------



## mirzet1976

Quote:


> Originally Posted by *hyp36rmax*
> 
> Here you go!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Source:* Link


----------



## rdr09

Quote:


> Originally Posted by *thrgk*
> 
> Ah ok thats cool, hopefully my sth10 is grounded then lol, I always touch it with both hands before working on anything.


If everything is assembled (motherboard, power) and the PSU is plugged into the wall, then your STH10 is grounded.


----------



## pshootr

Quote:


> Originally Posted by *Orga*
> 
> Hi,
> 
> I flip the switch to uber bios.. and I am getting something wired. The fan spin like a motorbike... vroom vroom.. I attached a screenshot from GPUZ
> 
> Not enough power?


The vroom vroom can happen with a poorly configured fan profile - for instance, 50°C/35% jumping straight to 52°C/70%.

Also, please make sure you don't have any other monitoring software loaded, including AMD OverDrive. That can lead to unreliable temp readings. It's very hard to believe your GPU is 90°C when the VRMs are not even at 90°C.

You can only hope your issues are from simple things like this.

Edit: I just noticed that your VRM2 is reading hotter than VRM1; that is indeed odd. I would uninstall your video drivers completely and start over.

I have a feeling your readings are bogus.

I hope I'm not just being a nooblet.
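pshootr's steep-profile example can be put in numbers. A hedged sketch (the linear interpolation and profile points are illustrative, not any specific utility's algorithm) of why a two-degree window covering a 35-point duty jump makes the fan audibly rev:

```python
# Why a steep fan profile "vrooms": with points like 50C/35% -> 52C/70%
# (the example above), a 2 C temperature wobble swings the fan 35 duty
# points, so it audibly revs up and down. A gentler slope spreads the
# same duty range over a much wider temperature band.

def duty_for_temp(curve, temp_c):
    """Linear interpolation over (temp_C, duty_%) profile points,
    the way many fan-control utilities interpolate between points."""
    curve = sorted(curve)
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

steep  = [(50, 35), (52, 70)]          # the example profile above
gentle = [(40, 35), (80, 70)]          # same duty range over 40 C
print(duty_for_temp(steep, 51))        # 52.5 -- halfway up already
print(duty_for_temp(gentle, 51))       # 44.625 -- barely moved
```

With the steep profile, every one-degree wobble around 51°C moves the fan 17.5 duty points; with the gentle one it moves under one point, which is why spaced-out profile points sound steady.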


----------



## Orga

Quote:


> Originally Posted by *pshootr*
> 
> The vroom vroom can happen from a poorly configured fan profile. For instance 50C/35%-52C/70%
> 
> Also please make sure you don't have any other monitoring software loaded, including AMD Overdrive. This can lead to unreliable Temp readings. Very hard to believe your GPU is 90C when VRM is not even 90C
> 
> You can only hope your issues are from simple things like this.
> 
> Edit: Just noticed that your VRM2 is reading hotter than VRM1, that is indeed odd. I would uninstall your Video drivers completely, and start over.
> 
> I have a feeling your readings are bogus.
> 
> I hope I'm not just being a nooblet.


I understand how to configure a fan profile correctly. The problem is that the fan is not running at full speed even though I set it manually to 100%. I think the BIOS provided on the Sapphire website has a bug.

I have no idea about the difference in temperature readings between the VRMs.

Apart from GPU-Z, I have CPU-Z and the ROG version of CPU-Z by ASUS.

Is there a chance that the fan settings on my motherboard (Fan Xpert, by ASUS ROG) have anything to do with this problem?

I am using DDU to uninstall the driver. I will try the driver from the Sapphire website when I am at home.


----------



## pshootr

Quote:


> Originally Posted by *Orga*
> 
> I understand how to configure the fan profile correctly. The problem is, the fan is not running at full speed even though I set it manually to 100% rpm. I think the bios provided from the sapphire website has a bug.
> 
> I have no idea on the deference temperature reading on VRAM.
> 
> Apart from GPUZ, I have CPUZ and RoG-CPUZ by asus.
> 
> Is there a chance that the fan setting for my motherboard ( fan xpert by Asus RoG) has anything to do with this problem?
> 
> I am using DDU to uninstall the driver. I will try driver from sapphire website when I am at home.


CPU-Z should not interfere, but any GPU monitoring software running alongside GPU-Z is not recommended. The programs can confuse each other.

I have never used the ASUS ROG software, so I'm not sure, but I would imagine it can only control your chassis and CPU fans.

Have you made sure AMD OverDrive is not running?

Any other monitoring software in use, like HWiNFO64? If so, turn it off.

Maybe also try the original BIOS again.

Let us know how things go after you reinstall your drivers.


----------



## Arizonian

Quote:


> Originally Posted by *ultraex2003*
> 
> add me
> gigabyte 290x @asus bios Raijintek Morpheus+ 2x12 Arctic F12 PWM
> http://www.techpowerup.com/gpuz/details.php?id=nbywk
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added








Quote:


> Originally Posted by *Jester435*
> 
> I have a r9 290x.. You have me listed as a r9 290 on the list..
> 
> Thanks


Fixed








Quote:


> Originally Posted by *Lao Tzu*
> 
> Hi, it came today, Sapphire TRI-X R9 290X OC 4GB GDDR5, Cooling: TRI-X, validation:
> 
> http://www.techpowerup.com/gpuz/details.php?id=u37eg


Congrats - added


----------



## dodgethis

Quote:


> Originally Posted by *Bertovzki*
> 
> Sweet sounds all good then.


Oh dear, despite my good results of using thermal pads instead of TIM on the VRAM, AQC has told me to not do that and use the TIM as prescribed. It seems the core would get contact, but not good enough contact, for cooling. I guess I'd better listen to them.


----------



## Bertovzki

Quote:


> Originally Posted by *dodgethis*
> 
> Oh dear, despite my good results of using thermal pads instead of TIM on the VRAM, AQC has told me to not do that and use the TIM as prescribed. It seems the core would get contact, but not good enough contact, for cooling. I guess I'd better listen to them.


OK, yes, I just checked my instructions for the AQC block and it says non-conductive paste on the RAM and GPU and thermal pads on the voltage regulators. Interesting, since it is all pads except the GPU when you take the stock cooler off. But that's fine; if the block is designed to be used like that, then that is how it should be used.


----------



## dodgethis

AQC claims their block is so precisely machined that you can use TIM instead of pads on the VRAM.

I went with pads because I didn't like the way the paste spread out during my contact test. It's hard to gauge an appropriate amount of those guys... And the X-1 paste I'm using ain't exactly cheap.


----------



## Bertovzki

Quote:


> Originally Posted by *dodgethis*
> 
> AQC claims their block is so precisely machined that you can use TIM instead of pads on the VRAM.
> 
> I went with pads because I didn't like the way the paste spread out during my contact test. It's hard to gauge an appropriate amount of those guys... And the X-1 paste I'm using ain't exactly cheap.


Yeah, getting the right amount must be hard alright. A test spot on each of 2-4 RAM chips and see, I guess.

I'm not familiar with the TIM you used; I want to get the right one for the job, good and non-conductive.


----------



## dodgethis

Quote:


> Originally Posted by *Bertovzki*
> 
> yeah getting the right amount must be hard alright , a test spot each on 2-4 rams and see i guess
> 
> Im not familiar with the tim you used , i want to get the right one for the job , good and non conductive.


Coolermaster's top-of-the-line TIM, second only to GC-Extreme. Definitely non-conductive.


----------



## Bertovzki

Quote:


> Originally Posted by *dodgethis*
> 
> Coolermaster's top of the line TIM, second only to GC-Extreme. Definitely non conductive.


Yeah, thanks. I already Googled it and sourced a local supply, so I'll get some. It will be my 3rd brand-new unused syringe: there's the Gelid that came with the EK Supremacy CSQ clear, and I've got some Arctic Silver 5, so I'll get the X-1 for the GPU


----------



## Orga

Quote:


> Originally Posted by *pshootr*
> 
> CPUZ should not interfere, but any GPU monitoring software running along with GPUZ is not recommended. The software's can confuse each other.
> 
> I have never used the ASUS RoG, so I'm not sure, but I would imagine it would only be able to control your Chassis and CPU fans.
> 
> Have you made sure that AMD overdrive is not running?
> 
> Any other monitoring software in use, like HWINFO64? If so turn it off.
> 
> Maybe also try The original bios again.
> 
> Let us know how things go after you reinstall your drivers.





Here are my GPU-Z readings with two different BIOSes (.41 and .39), each after a complete driver uninstall using DDU. Both are running the latest driver.

Only MSI Afterburner is used. AMD OverDrive is disabled.


----------



## rdr09

Quote:


> Originally Posted by *Orga*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Here are my GPUZ reading with 2 different bios (.41 and .39) with completes driver uninstall using DDU. Both running with latest driver.
> 
> Only MSI Afterburner used. AMD Overdrive is disable.


orga,

set your GPU Temp to Max using the dropdown.



I must have missed it, but changing the TIM on the core might help. Still, 1000 RPM is not gonna cut it.

You know what? Set the RPM to Max as well. But, by the looks of it . . . it was staying around 1000 RPM.

If you can't RMA your card . . . check out Post #21896 . . . (scroll down to "My card runs too hot . . .")

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781

*edit: i think i messed up with that figure above. the other end is 91C, which was the temp showing. my bad.*


----------



## BuildTestRepeat

Have to send my PowerColor PCS+ R9 290 back to Newegg for RMA. After I told Raymond at PowerColor about a few red screens I've had, he said the card is most likely defective. Bummer.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Orga*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Here are my GPUZ reading with 2 different bios (.41 and .39) with completes driver uninstall using DDU. Both running with latest driver.
> 
> Only MSI Afterburner used. AMD Overdrive is disable.
> 
> 
> 
> orga,
> 
> *set your GPU Temp to Max using the dropdown.
> *
> 
> 
> i must have missed it but if you change the TIM on the core it might help. but 1000 rpm is not gonna cut it.
> 
> edit: you know what? set the RPM to Max as well. But, by the looks of it . . . it was staying around 1000 RPM.
Click to expand...

Set all your temps to max, it's more meaningful that way.


----------



## thrgk

I was wondering: I have 4 290Xs with waterblocks and they weigh a lot. Does anyone use a support of any kind? So much weight on the PCIe slots seems harsh, but is it just my OCD? How do most people do it? It seems laying the case down would be the best solution


----------



## mojobear

Hey Thrgk,

No support for me! Just 4 cards hanging in the breeze haha. I think as long as they are secured well to the case then it's okay. I haven't noticed much sag over time.


----------



## Phaster89

Well, I've been testing my Hynix-memory MSI R9 290 Gaming 4G OC, and this is what I got. All runs were made on air, and all temps and clocks were verified in GPU-Z:



Well, I have a few questions:

Is it normal to only achieve a 3 fps increase in minimum fps and a 14 fps increase in maximum fps?
It seems that +100mV gives me an instant crash; why is that?
I also can't go over 1400 on the memory without an instant crash; why is that?
Are the VRM temps within a reasonable range?
Can I squeeze anything more out of this card? What do you suggest, apart from other forms of cooling?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Phaster89*
> 
> well i've been testing my hynix memory msi r9 290 gaming 4g oc and this is what i got, all runs were made on air and all temps and clocks were verified on gpu-z:
> 
> 
> 
> well i have a few questions:
> 
> is it normal to only achieve a 3 fps increase in the minimum fps and a 14 fps on maximum fps?
> it seems that +100mv gives me an instant crash, why is that?
> i also can't go over 1400 on the memory without an instant crash, why is that?
> are the vrm temps within a reasonable range?
> can i squeeze anything more this card? what do you suggest? except from other forms of cooling


Need moar info from you .
Like what monitoring prog are you using ?? AB or Trixx or GPU Tweak ?
Whats the cpu o/c ?
Voltages ??
Air or watercooling ?
O/c the vid clock first and work on that before messin with the mem o/c .


----------



## thrgk

I have 4 290Xs. Can I use the second card's HDMI, or do only the first card's ports work since they are in CrossFire?

I currently have my receiver plugged into my 2nd card's HDMI, but it's not working


----------



## pshootr

Quote:


> Originally Posted by *Orga*
> 
> 
> 
> 
> Here are my GPUZ reading with 2 different bios (.41 and .39) with completes driver uninstall using DDU. Both running with latest driver.
> 
> Only MSI Afterburner used. AMD Overdrive is disable.


Damn. Ya those fans are not spinning up properly. Has the card always been like this?

Unfortunately it does seem something is wrong with the card. When you post your Max temps and fan speed as some others have suggested, also show the first tab in GPUZ please.


----------



## Mega Man

Quote:


> Originally Posted by *thrgk*
> 
> I have 4 290x, can I use the second cards hdmi cable, or does _*only the first cards ports work since they are in crossfire*_?
> 
> I have currently my receiver plugged into my 2nd cards hdmi but its not working


You can use an HDMI splitter though


----------



## thrgk

Quote:


> Originally Posted by *Mega Man*
> 
> you can use a hdmi splitter though


Ah OK, I thought only the first card's ports worked


----------



## Phaster89

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Need moar info from you .
> Like what monitoring prog are you using ?? AB or Trixx or GPU tweek ?


msi afterburner
Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Whats the cpu o/c ?


I have a stock 4690 non-K, also on air
Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Voltages ??




these are the stock values

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Air or watercooling ?


air
Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> O/c the vid clock first and work on that before messin with the mem o/c .


1100 on the core was the furthest I could get without artifacts in Valley, though even at 1100 I sometimes get artifacts while indoors in ARMA 3


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Phaster89*
> 
> msi afterburner
> i have a stock 4690 non k also on air
> 
> 
> these are the stock values
> air
> 1100 on the core was the furthest i could get without getting artifacts on valley, even though with 1100 i sometimes get artifacts while indoors on arma 3


Put the power limit up all the way and same with core voltage for a start


----------



## Lao Tzu

Quote:


> Originally Posted by *Phaster89*
> 
> msi afterburner
> i have a stock 4690 non k also on air
> 
> 
> these are the stock values
> air
> 1100 on the core was the furthest i could get without getting artifacts on valley, even though with 1100 i sometimes get artifacts while indoors on arma 3


Try upping the core clock 5MHz at a time until you get no artifacts and a stress test passes, with the power limit at +50%.


----------



## Agent Smith1984

Definitely want the power limiter at 50%.

My card throttles at anything over 1050MHz without the power limiter that high!

I get 1175 with +200mv and 50% power limiter, but my card is not a great clocker.
The RAM does 1600MHz though, so I guess I won there....


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Definitely want the power limiter at 50%.
> 
> My card throttles at anything over 1050MHz without the power limiter that high!
> 
> I get 1175 with +200mv and 50% power limiter, but my card is not a great clocker.
> The RAM does 1600MHz though, so I guess I won there....


how's your 290 doing?


----------



## hyp36rmax

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Definitely want the power limiter at 50%.
> 
> My card throttles at anything over 1050MHz without the power limiter that high!
> 
> I get 1175 with +200mv and 50% power limiter, but my card is not a great clocker.
> The RAM does 1600MHz though, so I guess I won there....


What's the advantage of overclocking the ram?


----------



## Phaster89

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Put the power limit up all the way and same with core voltage for a start


i'll try that
Quote:


> Originally Posted by *Lao Tzu*
> 
> Try to up vCore by 5MHz until dont have artifacts and tress test pass, with power limit +50%.


with stock voltage?

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Definitely want the power limiter at 50%.
> 
> My card throttles at anything over 1050MHz without the power limiter that high!
> 
> I get 1175 with +200mv and 50% power limiter, but my card is not a great clocker.
> The RAM does 1600MHz though, so I guess I won there....


Afterburner only goes up to +100; how could you get +200mV?


----------



## Agent Smith1984

Quote:


> Originally Posted by *rdr09*
> 
> how's your 290 doing?


I honestly love this thing man. It does everything I could ever want it to do at 1080P.

To be honest, it doesn't even need the overclock, but I'm just the type to crank it up ya know?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Phaster89*
> 
> i'll try that
> with stock voltage?
> afterburner only goes up to 100 how could you get +200mv?


Using Trixx, but the settings tab in Trixx always makes the app crash, so I can't start the OC with Windows, or any of that good stuff.
The fan control seems to work though, so I just launch it every time I'm gaming. Seems to be a common but unsolved issue.


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I honestly love this thing man. It does everything I could ever want it to do at 1080P.
> 
> To be honest, it doesn't even need the overclock, but I'm just the type to crank it up ya know?


i should have asked . . .

how's that thuban holding up?


----------



## Phaster89

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Using trixx, but the settings tab in trixx always makes the app crash, so I can't start the OC with windows, or any of that good stuff.
> The fan control seems to work though, so I just launch it every time I'm gaming. Seems to be a common but unsolved issue.


Well, I tried +200mV with +50% power limit on stock clocks, and as soon as I start Unigine Valley my GPU crashes


----------



## MrWhiteRX7

Haven't chimed in here in a while... still jamming on all three of my 290's







No rAgrEtz lol


----------



## pshootr

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Haven't chimed in here in a while... still jamming on all three of my 290's
> 
> 
> 
> 
> 
> 
> 
> No rAgrEtz lol


Glad to hear it







Been running a 290 Tri-X OC for a couple months and happy so far


----------



## MrWhiteRX7

Quote:


> Originally Posted by *pshootr*
> 
> Glad to hear it
> 
> 
> 
> 
> 
> 
> 
> Been running a 290 Tri-X OC for a couple months and happy so far


Ham yea! You'll be good for years! Mine are reference cards from launch with no degradation in sight. 1100MHz on all 3 in trifire at stock volts for about a year now







Just showing my love in here. I know there's a lot of hate towards the red team sometimes. Wooooo!


----------



## pshootr

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Ham yea! You'll be good for years! Mine are reference cards from launch and no degradation in sight. 1100mhz on all 3 in trifire stock volts for about a year now
> 
> 
> 
> 
> 
> 
> 
> Just showing my love in here. I know there's a lot of hate towards the red team sometimes. Wooooo!


Nice man, I have mine folding at 1200/1600, but for gaming I use 1160/1500.

I would like to get a second one


----------



## paconan

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Using trixx, but the settings tab in trixx always makes the app crash, so I can't start the OC with windows, or any of that good stuff.
> The fan control seems to work though, so I just launch it every time I'm gaming. Seems to be a common but unsolved issue.


I'm using the previous version (v4.8.2) and it doesn't crash...
It only crashes on the very latest version.

Sorry for my English, I'm Spanish


----------



## DDSZ

Is 80°C (75°C / 65°C on the VRMs) a safe temp for those cards?


----------



## Gumbi

Quote:


> Originally Posted by *DDSZ*
> 
> Is 80*C (75*C / 65*C on VRM) - safe temp for those cards?


Absolutely. The core is safe till 95, the VRMs to 120, but their performance degrades massively before that, so keeping the core and VRMs below 95 is a good marker.

What card do you have? If it's got an aftermarket cooler, there is a good chance applying some fresh thermal paste could drop your core temps a nice bit.
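As a rule of thumb, the limits above can be jotted down as a tiny check. This is just a sketch based on the numbers in this discussion, not an official AMD spec, and `temps_ok` is a hypothetical helper:

```python
# Rule-of-thumb Hawaii temperature ceilings from the discussion above:
# the core protects itself around 95C, and the VRMs are rated to ~120C
# but degrade well before that, so ~95C is a practical ceiling for both.
CORE_LIMIT_C = 95
VRM_LIMIT_C = 95

def temps_ok(core_c, vrm1_c, vrm2_c):
    """Return True if all readings sit below the practical ceilings."""
    return core_c < CORE_LIMIT_C and max(vrm1_c, vrm2_c) < VRM_LIMIT_C

# The temps asked about (80C core, 75C/65C VRMs) pass comfortably.
print(temps_ok(80, 75, 65))
```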


----------



## Lao Tzu

Playing Middle-earth: Shadow of Mordor, maxed out, V-Sync on, HD texture pack, 1080p, 60 FPS always !!!


----------



## Archea47

+100mV +50% power limit and 1200/1500 on my xfired 290Xs - no artifacts or any other issues yet after playing BF4 for a few hours. Highest I've tested so far


----------



## rdr09

Quote:


> Originally Posted by *Lao Tzu*
> 
> Playing Middle Earth: Shadow of Mordor, maxed out, v-sync on, HD texture pack, 1080p, 60FPS allways !!!


at 1100 oc or stock?
Quote:


> Originally Posted by *Archea47*
> 
> +100mV +50% power limit and 1200/1500 on my xfired 290Xs - no artifacts or any other issues yet after playing BF4 for a few hours. Highest I've tested so far


Wow.


----------



## Roboyto

Quote:


> Originally Posted by *Phaster89*
> 
> well i tried +200mv with +50% power limit on stock clocks and as soon as i start unigine valley my gpu crashes


One of my 290's does this with max voltage as well. You have a card that isn't capable of handling all of the additional voltage for some reason.

As @HOMECINEMA-PC said previously you should leave the RAM speed alone entirely until you find your max clocks for the core. You will get significantly more performance out of the core than the RAM.

It looks like you got 1100 core with the default +13mV, which is pretty good. Increase your power limit to 50%. Put the RAM back to stock 1250, and continue to increase the core speed until you experience some issues. 99% of the time the *core speed is not* going to cause the benchmark to crash, give you a black screen, or cause a BSOD/lockup; that is caused by an unstable RAM overclock. An unstable core speed, however, will give you artifacts, tearing, or screen flickering, or your benchmark score will stop improving; that's when you know you need to add more voltage.

Once you experience any of the aforementioned issues, bump your core voltage 10-15mV and re-run the benchmark. If you get a clean run, then increase your core clock a little more, ~25MHz, and re-run the benchmark.

Continue increasing voltage and clocks until you reach a point where increasing either value doesn't do you any good.

At this point you can start fiddling with the RAM speeds. However, you may have already found your RAM limit at 1400. When you have your max GPU core speed, I would try 1375 on the RAM and see what happens. From what you've said and what's in your spreadsheet, your card acts very similarly to my PowerColor 290. The 290(X) doesn't really benefit from RAM speed due to the massive 512-bit bus...so don't be discouraged when the RAM doesn't OC well; it doesn't matter all that much!

You already have a nice start with your spreadsheet, so just continue doing that and you will be able to find your GPU's sweet spot where you get a great blend of performance increase with moderate alterations in voltage and GPU/VRM temps.

Keep your eye on VRM1 temperature; Once you start adding voltage the temperature will climb quickly. You really don't want it to be anything more than 85C-90C for everyday usage.

Last thing to keep in mind is that you are testing with a synthetic benchmark. They are typically a very good sign of stability, but offer no guarantees for the games you will play. Once you find your max clocks for Unigine, you should test with other benchmarks to see how your card acts. 3DMark is a great place to go next, and my all-time favorite is still the Final Fantasy XIV benchmark. For my AMD cards it hasn't let me down in finding a 100% absolutely stable OC for anything else I have subjected them to.
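For what it's worth, the stepping routine described above (raise the core in ~25MHz steps, add 10-15mV when artifacts appear, stop when neither helps) can be sketched as a loop. All the function names here are hypothetical placeholders standing in for whatever tool drives the card, not a real overclocking API:

```python
# Sketch of the clock/voltage stepping procedure described above.
# run_benchmark / set_core_clock / set_voltage_offset are hypothetical
# callbacks, not a real tuning API.

def find_max_core_clock(run_benchmark, set_core_clock, set_voltage_offset,
                        start_clock=1000, clock_step=25,
                        volt_offset=13, volt_step=15, volt_max=100):
    """Raise the core clock in small steps; on artifacts, add a little
    voltage and retry, until neither change helps anymore."""
    clock = start_clock
    best = clock
    while True:
        set_core_clock(clock)
        set_voltage_offset(volt_offset)
        ok = run_benchmark()          # True = clean run, no artifacts
        if ok:
            best = clock
            clock += clock_step       # clean run: push the clock a bit more
        elif volt_offset + volt_step <= volt_max:
            volt_offset += volt_step  # artifacts: add 10-15mV and retry
        else:
            break                     # out of voltage headroom: stop
    return best, volt_offset
```

The same skeleton works for the RAM afterward; only the step sizes and the failure symptoms (crashes and black screens rather than artifacts) change.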

Quote:


> Originally Posted by *Archea47*
> 
> +100mV +50% power limit and 1200/1500 on my xfired 290Xs - no artifacts or any other issues yet after playing BF4 for a few hours. Highest I've tested so far


Sounds like a couple of lottery winners there







My good 290 makes 1200/1500 +87mV, and can hit 1300/1700 with max voltage/power.

Have you tried pushing them further?


----------



## Klocek001

My 290 does 1200 with +81mV and 1140 on stock voltage. The previous one I had couldn't even do 1150 with +100mV without artifacting.
Quote:


> Originally Posted by *Roboyto*
> 
> Sounds like a couple of lottery winners there
> 
> 
> 
> 
> 
> 
> 
> My good 290 makes 1200/1500 +87mV, and can hit 1300/1700 with max voltage/power.


You mean 1300 at +200mV? That's a lot of extra potential to unleash...


----------



## Roboyto

Quote:


> Originally Posted by *Klocek001*
> 
> you mean 1300 at +200mv ? that's a lot of extra potential to unleash...


Yes, 1300 with +200mV 50% power limit.

The extra 100MHz from 1200-1300 wasn't as big of a jump as you would think, plus that is A LOT of additional abuse to the card for a much smaller performance gain than going from 975-1200. This is why I settled at 1200/1500 +87mV. These settings gave me the best blend of performance and temperatures all around. I have my 290 in a small loop, dual 240mm, with a 4770k, so temperatures could have become an issue with maximum overclocks on my hardware.

I did a lot of benching...you can see here in my build log: http://www.overclock.net/t/1456279/honey-i-shrunk-the-ultra-tower-beast-my-journey-to-creating-a-more-compact-pc-with-an-r9-290/20_20#post_21847939

1200/1500 on a single card provides a very enjoyable 5760*1080 gaming experience. I can't max every setting, but with AA dialed down a bit I can get pretty close in most games.

The best part of my build as a whole is the utter lack of noise...it is whisper quiet sitting on its pedestal above my head


----------



## Lao Tzu

Quote:


> Originally Posted by *rdr09*
> 
> at 1100 oc or stock?
> Wow.


With the OC at 1100/1400, and indeed in all games except Crysis 3, which is badly optimized...and it's an AMD "Gaming Evolved" sponsored game


----------



## GOLDDUBBY

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I know there's a lot of hate towards the red team sometimes. Wooooo!


Nah we nvidia customers are getting mighty sick of getting milked. Paying twice the money for less performance. My next cards will be red, for sure!

Can't be a gamer with the nvidia crap.


----------



## Lao Tzu

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> Nah we nvidia customers are getting mighty sick of getting milked. Paying twice the money for less performance. My next cards will be red, for sure!
> 
> Cant be a gamer with the nvidia crap.


I use both companies, nVidia and AMD. In CPUs Intel will always be a step ahead, but in graphics, right now and with the new 300 series coming, AMD is fighting


----------



## MrWhiteRX7

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> Nah we nvidia customers are getting mighty sick of getting milked. Paying twice the money for less performance. My next cards will be red, for sure!
> 
> Cant be a gamer with the nvidia crap.


Red ftw







I have a 780ti Kingpin and it just collects dust lol.


----------



## Archea47

Quote:


> Originally Posted by *Roboyto*
> 
> Sounds like a couple of lottery winners there
> 
> 
> 
> 
> 
> 
> 
> My good 290 makes 1200/1500 +87mV, and can hit 1300/1700 with max voltage/power.
> 
> Have you tried pushing them further?


Thanks for the encouragement & data. I haven't tinkered with them further yet, but I will be back in town Monday to see how far they can go at +100mV and beyond. The trick for me is I'm sharing the loop with my 8350. The 8350 is more sensitive to temps (the GPU readings are all in the low 50s at 1200) than the 290Xs in this scenario, and I don't want to push the 290Xs so far that I can't do at least 5GHz on the CPU


----------



## Roboyto

Quote:


> Originally Posted by *Archea47*
> 
> Thanks for the encouragement & data. I haven't tinkered with them further yet, but I will be back in town Monday to see how far they can go at +100mV and beyond. The trick for me is I'm sharing the loop with my 8350. The 8350 is more sensitive to temps (the GPU readings are all in the low 50s at 1200) than the 290Xs in this scenario, and I don't want to push the 290Xs so far that I can't do at least 5GHz on the CPU


Yeah, your 8350 and motherboard cooling definitely create a lot of additional heat; You have some thick radiators though. What kind of CPU temps are you seeing presently?

My loop is quite small, with (2) XSPC EX 240 radiators for a 4770k and a 290. It doesn't sound like a lot until you realize you have 2 radiators, 6 fans, a dual-bay reservoir, 3 SSDs, and a 2TB 3.5" drive stuffed inside a mATX case. It took a lot of thinking, tweaking, swearing, and benching to get everything how it is now, but the juice was worth the squeeze. My temps are solid for the circumstances...they could be better, but I value silence over a little extra performance. All my fans are controlled through AI Suite on silent mode, and the computer doesn't make a peep except when it is booting up and AI Suite isn't running yet.


----------



## Phaster89

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> One of my 290's does this with max voltage as well. You have a card that isn't capable of handling all of the additional voltage for some reason.
> 
> As @HOMECINEMA-PC
> said previously you should leave the RAM speed alone entirely until you find your max clocks for the core. You will get significantly more performance out of the core than the RAM.
> 
> It looks like you got 1100 core with the default +13mV, which is pretty good. Increase your power limit to 50%. Put the RAM back to stock 1250, and continue to increase the core speed until you experience some issues. 99% of the time the *core speed is not* going to cause the benchmark to crash, give you a black screen, or cause a BSOD/lockup; this caused by unstable RAM overclock. However, unstable core speed will get artifacts, tearing, screen flickering, or when your benchmark score doesn't improve you will know you need to add more voltage.
> 
> Once you experience any of the aforementioned issues, bump you core voltage 10-15mV and re-run the benchmark. If you get a clean run, then increase your core clock a little more, ~25MHz, and re-run the benchmark.
> 
> Continue increasing voltage and clocks until you reach a point where increasing either value doesn't do you any good.
> 
> At this point you can start fiddling with the RAM speeds. However, you may have already found your RAM limit at 1400. When you have your max GPU core speed, I would try 1375 on the RAM and see what happens. From what you've said and is in your spreadsheet, your card acts very similarly to my Powercolor 290. The 290(X) don't really benefit from RAM speeds due to the massive 512-bit bus...so don't be discouraged when the RAM doesn't OC well, it doesn't matter all that much!
> 
> You already have a nice start with your spreadsheet, so just continue doing that and you will be able to find your GPU's sweet spot where you get a great blend of performance increase with moderate alterations in voltage and GPU/VRM temps.
> 
> Keep your eye on VRM1 temperature; Once you start adding voltage the temperature will climb quickly. You really don't want it to be anything more than 85C-90C for everyday usage.
> 
> Last thing to keep in mind is you are using a synthetic benchmark to test with. They are typically a very good sign of stability, but offer no guarantees for the games you will play. Once you find your max clocks for Unigine, you should test with other benchmarks to see how your card acts. 3DMark is a great place to go next, and my all time favorite is still the Final Fantasty XIV benchmark. For my AMD cards it hasn't let me down to find a 100% absolutely stable OC for anything else I have subjected them to.
> 
> Sounds like a couple of lottery winners there
> 
> 
> 
> 
> 
> 
> 
> My good 290 makes 1200/1500 +87mV, and can hit 1300/1700 with max voltage/power.
> 
> Have you tried pushing them further?


Well my friend, I have to thank you for the tips. Here's the afternoon of testing after your helpful post:



I've highlighted the best results based on the highest minimum fps. 1150 on the core was the highest I could go without artifacts in Valley; beyond 1150, bumping core voltage and core clock gave me artifacts.


----------



## PachAz

So how does a 290/290X stand up against these new GTX 9xx cards from nvidia these days? I have looked at this thread for 6 months or so ^^.


----------



## Bertovzki

Quote:


> Originally Posted by *PachAz*
> 
> So how does a 290/290x stand up against thes new gtx 9xx from nvidia these days, I have looked at this thread for 6 months or so ^^.


Very well :

http://www.hwcompare.com/18225/geforce-gtx-980-vs-radeon-r9-290x/


----------



## Roboyto

Quote:


> Originally Posted by *PachAz*
> 
> So how does a 290/290x stand up against thes new gtx 9xx from nvidia these days, I have looked at this thread for 6 months or so ^^.


I'm working on a comparison of my 290 and 970 I bought a few weeks ago. I will have details later tonight or tomorrow.
Quote:


> Originally Posted by *Phaster89*
> 
> well my friend i have to thank you for the tips, here's the afternoon of testing after your helpful post
> 
> 
> 
> i've highlighted the best results based on the highest minimum fps. 1150 on the core was the highest i could go without artifacts on valley, beyond 1150 bumping core voltage and core clock gave me artifacts.


Glad I was able to help you out


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Roboyto*
> 
> I'm working on a comparison of my 290 and 970 I bought a few weeks ago. I will have details later tonight or tomorrow.
> Glad I was able to help you out


Likewise


----------



## PachAz

I'm even thinking of removing one 290 from my system because I don't play any games where I benefit from it. Kind of a shame, since it has been disabled in Catalyst for the past 6 months.


----------



## Kaltenbrunner

I'm looking at a Gigabyte R9 290 OC, so it's set at 1040/5000 (I guess 5000/4 = 1250MHz).

What do these OC like in general? I'm figuring I'll CF for 1600p, but chances are I won't get another Gigabyte OC, which is fine; I could downclock it if I had to in Afterburner or whatever
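The clock arithmetic above is just GDDR5's quad data rate: spec sheets quote the effective rate, while tools like Afterburner show the actual memory clock. A throwaway sketch:

```python
# GDDR5 transfers data four times per memory-clock cycle, so the
# advertised "effective" rate is 4x the clock Afterburner shows.
def gddr5_actual_mhz(effective_mhz):
    return effective_mhz / 4

def gddr5_effective_mhz(actual_mhz):
    return actual_mhz * 4

print(gddr5_actual_mhz(5000))     # 1250.0 -> the 1040/5000 card's memory clock
print(gddr5_effective_mhz(1400))  # 5600   -> a 1400MHz OC, advertised-style
```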


----------



## Vici0us

Quote:


> Originally Posted by *Kaltenbrunner*
> 
> I'm looking at a gigabyte r9 290 OC so its set at 1040/5000 (i guess 5000/4=1250MHz)
> 
> What do these OC like in general ? I'm figuring I'll CF for 1600p, but chances are I won't get another gigabyte OC, which is fine, I could down clock it if I had to in afterburner or whatever


Never really pushed mine with voltage because I have two cards, so no need for that. But it's sitting @ 1100/1400 (5600MHz effective) on stock voltage.


----------



## fragamemnon

Quote:


> Originally Posted by *PachAz*
> 
> I even think of removing one 290 from my system because I dont even play any games where I benefit from it. Kinda shame, since it is disabled in catalyst for the past 6 months.


Why don't you fold then?


----------



## HOMECINEMA-PC

I run my tri fire setup @ Stock [email protected]@1.25v before droop on the Asus PT1T BIOS, with GPU Tweak to suit


----------



## Phaster89

How can I check the exact voltage of my 290? +13 on Afterburner isn't exactly precise lol


----------



## Lao Tzu

Quote:


> Originally Posted by *Phaster89*
> 
> how can i check the exact voltage of my 290? +13 on afterburner is exactly precise lol


for the exact voltage you'd need an expensive voltmeter wired straight to the board !!!, xd, but u can check it in HWMonitor or GPU-Z
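As a rough illustration of what that Afterburner "+13" actually means (the stock VID below is a made-up example figure; read your card's real VDDC from GPU-Z or HWMonitor):

```python
# Afterburner's "+13" is a millivolt OFFSET added on top of the card's stock VID,
# not an absolute voltage. STOCK_VID_V is a hypothetical example value; the real
# figure varies per card, and VRM droop means the rail sits a bit lower under load.
STOCK_VID_V = 1.163   # assumed stock VDDC, for illustration only
OFFSET_MV = 13        # the "+13" dialed into Afterburner

requested_v = STOCK_VID_V + OFFSET_MV / 1000
print(f"requested VDDC ~ {requested_v:.3f} V")
```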


----------



## rt123

Quote:


> Originally Posted by *Lao Tzu*
> 
> exact voltage need to use a spensive voltimeter direct to the board !!!, xd, u can check in HWMonitor, GPU-Z


A DMM is $20-$30 at Walmart. Not that Spensive.


----------



## Lao Tzu

The posts saying that the GTX 900 series runs a lot cooler than the 290X are wrong. Mine, on air and OC'd to 1100/1400, has a max temp of 68°C while a 970 hits 62°C (only 6°C less)


----------



## Lao Tzu

Quote:


> Originally Posted by *rt123*
> 
> A DMM is $20-$30 at Walmart. Not that Spensive.


I was referring to the most expensive ones, I'm joking, don't take it too seriously


----------



## Kaltenbrunner

Quote:


> Originally Posted by *PachAz*
> 
> I even think of removing one 290 from my system because I dont even play any games where I benefit from it. Kinda shame, since it is disabled in catalyst for the past 6 months.


well see I'll be in the same boat if I CF I'm sure, a 280 CF would be less wasteful, but the extra cores, ROPs, 4GB, etc, makes the 290 well worth the extra price imo over the 280.

I have just enough cash for a 290 today, but not for anything bigger, and if I waited an extra 2-4 weeks to get a 980, I'd almost definitely dip into the money for partying $$$

A 290 will be quite the improvement over a 7950, maybe 1 will be enough for now


----------



## Kaltenbrunner

This round of cards is confusing to me. A 280~7970, yet a 290 is way ahead, but then a 290x is barely better than a 290


----------



## PachAz

Quote:


> Originally Posted by *Kaltenbrunner*
> 
> well see I'll be in the same boat if I CF I'm sure, a 280 CF would be less wasteful, but the extra cores, ROPs, 4GB, etc, makes the 290 well worth the extra price imo over the 280.
> 
> I have just enough cash for a 290 today, but not for anything bigger, and if I waited an extra 2-4 weeks to get a 980, I'd almost definitely dip into the money for partying $$$
> 
> A 290 will be quite the improvement over a 7950, maybe 1 will be enough for now


The question is what games you are playing, and whether you are CPU or GPU limited. To be fair I could still manage with a 7970 these days because I only play WoT, which is very CPU bound. Why not buy a used R9 290 and save the extra money for the new generation R9 xxx or something? Even I bought my second R9 290 used, plus a discounted waterblock, which was kinda good considering my 2nd GPU doesn't add any value for me.


----------



## Roboyto

Quote:


> Originally Posted by *PachAz*
> 
> The questions is what games you are playing, or if you are cpu or gpu limited. To be fair I could still manage with a 7970 these days because I only play wot, which is very cpu bound. Why not buy a used R9 290 and save the extra money to the new generation R9 xxx or sumething? Even I bought my second R9 290 used and a discounted WB, which was kinda good considering my 2nd gpu dont add any value to me.


I have a Powercolor OC reference 290 for sale with or without Kraken G10/VRM cooling. No overclocking champion, but still solid have been running 1075/1375 with +37 mV for gaming. Goes higher, but starts needing more voltage.


----------



## long99x

I just bought 2x R9 290 DC2 OC today, love them, Heaven temps are 62-63. I use a DP adapter but get no sound through the monitor; my monitor is an LG 29UM95. With this DP cable and my old card (7950) I had sound to the monitor. I tried installing drivers 14.12, 14.9 and 13.12 but no difference.









Anyone know how to fix help me please


----------



## rdr09

Quote:


> Originally Posted by *long99x*
> 
> I just bought 2x r9 290 dc2OC today,love them,temp heaven is 62-63, I use DP adapter but no sound through monitor, my monitor is LG 29um95, with this DP cable and my old card 7950 i have sound to monitor, I try to installed driver 14.12, 14.9, 13.12 but no different
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone know how to fix help me please


why do you need an adapter?

i meant if both car and display have dp.


----------



## kizwan

Typo, I think he meant DP cable.


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> Typo, I think he meant DP cable.


prolly. that adapter threw me off. anyway, i read about this issue before in the drivers section but not sure how they fixed it. prolly onboard sound is conflicting with amd's.


----------



## tsm106

Check to see if the sound is enabled in the sound applet in control panel.


----------



## Zelx0

Time to join the club







I own a R9 290 gigabyte windforce with their stock air cooler ^^

Gpu Proof + 3d score


Spoiler: Warning: Spoiler!







Right now I'm using a soft overclock on it (need to find time to finish tweaking it):
+88 core voltage
+50 power limit
1150 coreclock
1300 memclock


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> why do you need an adapter?
> 
> i meant if both *car* and display have dp.
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Typo, I think he meant DP cable.
Click to expand...

Does your car have DP Kizwan ??

Quote:


> Originally Posted by *long99x*
> 
> I just bought 2x r9 290 dc2OC today,love them,temp heaven is 62-63, I use DP adapter but no sound through monitor, my monitor is LG 29um95, with this DP cable and my old card 7950 i have sound to monitor, I try to installed driver 14.12, 14.9, 13.12 but no different
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone know how to fix help me please
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> prolly. that adapter threw me off. anyway, i read about this issue before in the drivers section but not sure how they fixed it. prolly onboard sound is conflicting with amd's.
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Check to see if the sound is enabled in the sound applet in control panel.
> 
> 
> 
> 
> 
> Click to expand...
Click to expand...

Make your monitor sound as default


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Does your car have DP Kizwan ??
> Make your monitor sound as default
> 
> 
> Spoiler: Warning: Spoiler!


Home's, check out my new car . . .


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> Home's, check out my new car . . .


It looks like a black 3 Series BMW, 201x model, with the spare tyre on. Guessing of course. If so, you're doing well









My ride


http://www.overclock.net/t/961467/the-show-your-car-and-car-discussion-thread/16300_20#post_23458115


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Its looks like a black 3 series BMW 201x model . With spare tyre on . Guessing of cause . If so youre doing well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My ride
> 
> 
> http://www.overclock.net/t/961467/the-show-your-car-and-car-discussion-thread/16300_20#post_23458115


Wait wut?! wut do you mean? let me check outside. it's suppose to have all 4 tires not 3.5. if it is . . . ill rma this dang thing.


----------



## Bertovzki

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Its looks like a black 3 series BMW 201x model . With spare tyre on . Guessing of cause . If so youre doing well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My ride
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/t/961467/the-show-your-car-and-car-discussion-thread/16300_20#post_23458115


my car , what i like about it is i can smoke in it and it doesn't stink









Spoiler: Warning: Spoiler!






reality is probably almost as bad , all black in and out :


Spoiler: Warning: Spoiler!


----------



## Roboyto

Quote:


> Originally Posted by *PachAz*
> 
> So how does a 290/290x stand up against thes new gtx 9xx from nvidia these days, I have looked at this thread for 6 months or so ^^.


Quote:


> Originally Posted by *Roboyto*
> 
> I'm working on a comparison of my 290 and 970 I just bought a couple weeks ago. Will have details later tonight or Tomorrow.


*R9 290







~VS~







GTX 970*

**Disclaimer**

This, by far, is *not* a perfect comparison of the two cards since it was a large learning experience for me. I have never had an Nvidia card that I could tweak/overclock...so bear with me folks! (I did once own a GTX 760, but it was an absolute dud that did not overclock even 1MHz Core/RAM and died on me rather quickly)

You may be wondering why someone who has never delved into Nvidia would buy a 970....


The Hype
CUDA for BD Rips
PhysX for new Batman for my wife
Bored with both rigs having 290s
Open Box for $264

Though it is not a perfect comparison, it has some good information for those ignorant of the *'other side'* (like I was), and it paints a pretty decent picture with benchmark scores.

I felt this comparison needed to be done because I don't think (m)any of the professional review sites retest older hardware with current drivers. We don't always know what card they are posting the results from... What we DO KNOW, is that benchmark/FPS scores for a reference 290(X) from late 2013, look much different than benchmark/FPS scores for a properly cooled 290(X) with current drivers.

Keep in mind I only tested the 970 at 1080P. The VRAM issue for 970 can cause a problem after 3.5GB, so if you're running high resolution and plan to max out VRAM usage, then the 290(X) is likely the better choice so you don't suffer from a performance drop and choppy frame rates.
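A rough sketch of why that 3.5GB threshold matters. The bandwidth figures below are illustrative approximations of what reviewers measured for the 970's fast and slow segments, not official specs:

```python
# Toy model of the GTX 970's partitioned VRAM: the first 3.5 GB sits on a fast
# path, the last 0.5 GB on a much slower one. The point is how quickly the
# average bandwidth drops once the slow segment comes into play.
FAST_GBS = 196.0   # approx. bandwidth of the 3.5 GB segment (illustrative)
SLOW_GBS = 28.0    # approx. bandwidth of the 0.5 GB segment (illustrative)

def avg_bandwidth(used_gb):
    """Effective average bandwidth when streaming `used_gb` of VRAM once."""
    fast = min(used_gb, 3.5)
    slow = max(used_gb - 3.5, 0.0)
    seconds = fast / FAST_GBS + slow / SLOW_GBS
    return used_gb / seconds

for gb in (3.0, 3.5, 3.8, 4.0):
    print(f"{gb} GB in use -> ~{avg_bandwidth(gb):.0f} GB/s average")
```

With these numbers the average sits at the full rate up to 3.5GB, then falls off a cliff as soon as the slow half-gigabyte is touched, which lines up with the choppy frame rates people report.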

The 290 is in my main rig, and the 970 is in my HTPC. They are similar but not identical, so here you are:

*290 Rig:*
Win7 Ultimate x64
ASUS Z87 Gryphon w/ Thermal Armor
i7 4770k @ 4.5GHz - XSPC Raystorm - Delidded/Naked
2x8 GB Dominator Platinum 2400MHz
XFX BE R9 290 Reference - XSPC Razor w/ Backplate
Rosewill Capstone 650W Modular

*970 Rig:*
Win7 Ultimate x64
Gigabyte GA-Z97N Wi-Fi Mini-ITX
i7 4790k @ 4.7GHz - Antec 620 - Delidded/CLU
2x4 GB Dominator 2133MHz 1.5V
Zotac GTX 970 Reference - Kraken G10 w/ Antec 620 & VRM Heatsink Upgrade
Rosewill Capstone 450W Modular

The clock speed difference in CPU/RAM will make a slight difference in physics scores for 3DMark, so I will be focusing on graphics scores between the two. Interestingly enough the physics scores for the 4790K are better in firestrike, but worse in 3DMark11. I'm assuming this means that Firestrike relies more on CPU clock, and 3DMark11 more on RAM. I apologize, I now realize I should have just swapped RAM and clocked to 4.5GHz...but hindsight is 20/20







And..I don't feel like re-running all the benches

You will also notice a small number of the benches used a 4770k. At the same time I swapped the 4790k in, I also attached the Kraken to the 970...graphics scores are more important...and remember this isn't a perfect comparison.



Spoiler: Zotac GTX 970 Bios Mod & VRM Cooling



The GTX 970 also underwent a small BIOS mod to increase the power limit from 6% to 25%. The new Nvidia cards have a statistic in GPU-Z that tells you what is causing the core clock to throttle; i.e. temp, power, etc. I noticed immediately that GPU-Z was frequently showing power throttling.



The throttling statistic is handy, but this Zotac lacks VRM monitoring; I'm not sure if it is just this card, or all newer Nvidia. So I pulled out my trusty Rosewill IR thermometer to see how hot the VRMs were running. I was expecting fairly low temperatures because of the efficiency of these cards, but was shocked to see 94C with the stock cooler and all stock settings after a 2 run loop of Heaven; Overclocking the card pushed it up to 98C. Once I increased the power limit to 25%, the temperature skyrocketed to







*132C*







within 3 min of Heaven and the card shut itself down to prevent any damage.










The shoddy temperatures are due to a very crappy heatsink on Zotac's part; which they have since upgraded to one with fins. The crappy heatsink was a blessing in disguise however, because it allowed me to attach another heatsink directly to it. I swapped the crappy thermal pads for some Fuji Extreme, and slapped an Enzotech MST-66 heatsink directly on top of the factory one to bring temperatures within reason. With the Fuji Extreme, double heatsink, 25% power and an OC, the VRMs run 12C cooler than bone stock configuration.






























Since the 970 underwent a BIOS mod I recorded four sets of bench results:


Completely Stock
Stock Heatsink/BIOS Overclocked
Stock Clocks w/ 25% PWR BIOS Kraken
25% PWR BIOS Overclocked Kraken

These Nvidia cards with the new Boost 2.0 can be tricky since the card can be limited thermally or by power...AND you have the option to prioritize one or the other, which I didn't realize until I was pretty much done with my evaluation. More tricky than fiddling with a 290(X) anyhow...



Spoiler: Nvidia Boost 2.0 - For Those Who Don't Know



When I saw how hot the VRMs were getting with the stock configuration, I tackled that issue at the same time as attaching the Kraken. I initially assumed that the increase in benchmark scores was due to the increased power limit that in turn increased boost clock. However, I have come to realize that this is only partially true. The power limit did help a bit in some circumstances, but it turns out that the drastically lower GPU core temp allowed the higher boost clock. This struck me as odd because when I was using the stock cooler I made sure that it wasn't hitting the 81C threshold. I guess this all depends on how the BIOS is setup...so if the core temp is in the High 40s to Low 50s then it will boost higher.

That may be kind of confusing so let me just put the numbers down:

Completely Stock Everything: GPU 1076 | Boost 1216 | GPU-Z Actual Boost 1278

Stock Clocks/PWR & Kraken: GPU 1076 | Boost 1216 | GPU-Z Actual Boost 1366

Stock Cooler/BIOS Overclock (+210 core): GPU 1286 | Boost 1426 | GPU-Z Actual Boost 1488

Kraken 25% PWR Overclock (+150 core): GPU 1226 | Boost 1366 | GPU-Z Actual Boost 1516

Another thing that makes sense now, that didn't previously, is why I had to reduce the overclock in Firestorm/Afterburner once I installed the Kraken. The low core temp is allowing more boost, so I had to turn the overclock down. It was trying to reach ~1575 core boost clock with the Kraken cooling, which would crash immediately in anything I ran except for FFXIV.

You probably noticed I didn't mention additional voltage anywhere. This is because the card is limited to a mere +37mV and it rarely had any benefit to an overclock. Once I attached the Kraken, any amount of additional voltage made no difference whatsoever.



A long Boost 2.0 story made short: Keep the core very cool and you get higher than advertised boost clocks.
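The behavior described above can be caricatured in a few lines. This is a toy model with invented numbers, not Nvidia's actual algorithm: the card keeps adding ~13 MHz bins while it has both thermal and power headroom, which is why a cold core ends up well past the advertised boost clock.

```python
# Toy model of Boost 2.0: step the clock up in ~13 MHz bins while the card stays
# under its temperature target AND power limit. All numbers here are invented
# for illustration; the real boost tables live in the vBIOS.
BIN_MHZ = 13

def boosted_clock(rated_boost, core_temp_c, power_pct, temp_target=80, power_limit=100):
    clock = rated_boost
    while core_temp_c < temp_target and power_pct < power_limit:
        clock += BIN_MHZ
        core_temp_c += 2   # pretend each bin adds a little heat...
        power_pct += 3     # ...and a little power draw
    return clock

# Stock cooler sitting near the temp target: barely any headroom.
print(boosted_clock(1216, core_temp_c=76, power_pct=95))
# Kraken keeping the core in the low 50s, with the 25% power-limit BIOS:
print(boosted_clock(1216, core_temp_c=52, power_pct=90, power_limit=125))
```

Under this (made-up) model the hot card gains a couple of bins while the cold, power-unlocked card climbs roughly 150 MHz above its rated boost, the same pattern as the GPU-Z readings above.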

Alright! Now that all those shenanigans are out of the way, we can get to the benchmark results! Drum roll please!










*XFX R9 290 BE - Stock Clocks 975/1250 | Omega 14.12 | Overclock 1275/1675 +200mV +50% Power Limit*

*Zotac GTX 970 - Stock Clocks 1076/1216/1753 (Actual Boost Up to 1278) | ForceWare 347.09 | Overclock 1286/1426/1840 (Actual Boost Up to 1488)*

*970 Kraken PWR - Stock Clocks 1076/1216/1753 (Actual Boost Up to 1366) | ForceWare 347.09 | Overclock 1226/1366/1815 (Actual Boost Up to 1516)*


**Cloudgate** (Score / Graphics / Physics)

| | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
| --- | --- | --- | --- |
| Stock | 27,657 / 75,151 / 8,611 | 27,686 / 70,321 / 8,868 | 29,900 / **76,221** / 9,562 |
| Overclock | 28,882 / 85,881 / 8,692 | 29,828 / **88,574** / 8,981 | 30,520 / 82,335 / 9,530 |

**Skydiver** (Score / Graphics / Physics)

| | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
| --- | --- | --- | --- |
| Stock | 26,506 / 34,288 / 12,495 | 27,533 / **36,349** / 12,592 | 27,403 / 35,679 / 12,875 |
| Overclock | 28,141 / 40,349 / 11,717 | 31,479 / **46,589** / 12,578 | 28,452 / 38,493 / 12,709 |

**3DMark11** (Score / Graphics / Physics)

| | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
| --- | --- | --- | --- |
| Stock | 13,830 / **14,843** / 11,479 | 13,898 / 14,865 / 11,975 | 14,201 / **15,500** / 11,303 |
| Overclock | 14,605 / 16,393 / 11,024 | 16,568 / **19,315** / 12,162 | 14,927 / 16,585 / 11,534 |

**Firestrike** (Score / Graphics / Physics)

| | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
| --- | --- | --- | --- |
| Stock | 10,121 / **11,630** / 13,307 | 9,529 / 10,902 / 12,386 | 10,309 / **11,944** / 13,223 |
| Overclock | 11,031 / 12,891 / 13,314 | 11,799 / **14,057** / 12,363 | 11,031 / 12,891 / 13,314 |

**FFXIV** (Score)

| | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
| --- | --- | --- | --- |
| Stock | n/a | 13,215 | **15,433** |
| Overclock | n/a | 16,502 | **16,616** |

**Heaven** (Score / FPS (Min / Max))

| | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
| --- | --- | --- | --- |
| Stock | **1,725** / **68.5** (32.1 / 150.9) | 1,668 / 66.2 (25.8 / 139.2) | **1,832** / **72.7** (29.6 / 158.4) |
| Overclock | 1,943 / 77.1 (29.6 / 170.0) | **2,114** / **83.9** (33.3 / 177.5) | 1,968 / 78.1 (30.3 / 170.7) |

**Valley** (Score / FPS (Min / Max))

| | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
| --- | --- | --- | --- |
| Stock | 2,363 / 56.5 (28.9 / 103.2) | **2,442** / **58.4** (28.6 / 107.8) | 2,376 / 56.8 (29.9 / 109.2) |
| Overclock | 2,598 / 62.1 (26.1 / 118.8) | **3,077** / **73.5** (34.6 / 135.3) | 2,532 / 60.5 (32.9 / 114.6) |

**Bioshock** (Average / Min / Max)

| | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
| --- | --- | --- | --- |
| Stock | **179.36** / 23.17 / 682.24 | 172.16 / 20.06 / 386.26 | **185.22** / 23.27 / 505.41 |
| Overclock | 190.53 / 19.25 / 854.72 | **202.15** / 1.46 / 463.84 | 197.71 / 22.28 / 605.82 |

**Tomb Raider** (Average / Min / Max)

| | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
| --- | --- | --- | --- |
| Stock | **86.0** / 66.0 / 106.0 | 80.8 / 62.0 / 106.0 | **97.2** / 74.0 / 120.6 |
| Overclock | **96.1** / 73.8 / 120.6 | 103.1 / 77.7 / 134.2 | **144.0** / 80.0 / 105.6 |

All of the R9 290 results used were from my Omega Drivers Comparison from a few weeks back...http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/33460_20#post_23300417

All of the settings for the benches are the same that I used for the Omega Drivers Comparison, which are typically default/maximum settings.

I have 4 screenshots for every bench, but I'm not going to post them all since that is ridiculous. I will toss a few of them up so you can see what I was talking about with boost clocks changing and things of that nature.



Spoiler: Valley Stock Everything - Boosting 1278









Spoiler: Skydiver w/ Kraken Cooling - Stock Clocks - Boosting 1366









Spoiler: Firestrike Stock Cooler Overclocked - Boosting 1472









Spoiler: Firestrike Kraken 25% Power - Boosting 1516









Spoiler: 3DMark11 Kraken 25% Power - Boosting 1491









Spoiler: Tomb Raider Stock Everything - Boosting 1265









Spoiler: Tomb Raider Kraken 25% Power - Boosting 1516









Spoiler: Heaven Kraken 25% Power - Boosting 1501







*970 Stock Settings*

Cloudgate: http://www.3dmark.com/3dm/5636125?

Skydiver: http://www.3dmark.com/3dm/5636059?

3DMark11: http://www.3dmark.com/3dm11/9328408

FireStrike: http://www.3dmark.com/3dm/5635965?

*970 Overclock*

Cloudgate: http://www.3dmark.com/3dm/5352334?

Skydiver: http://www.3dmark.com/3dm/5352431?

3DMark11: http://www.3dmark.com/3dm/9225137

FireStrike: http://www.3dmark.com/3dm/5352499

*970 PWR BIOS Stock Settings*

Cloudgate: http://www.3dmark.com/3dm/5636341?

Skydiver: http://www.3dmark.com/3dm/5636307?

3DMark11: http://www.3dmark.com/3dm11/9328484

FireStrike: http://www.3dmark.com/3dm/5636253?

*970 PWR BIOS Overclock*

Cloudgate: http://www.3dmark.com/3dm/5636568?

Skydiver: http://www.3dmark.com/3dm/5636530?

3DMark11: http://www.3dmark.com/3dm11/9328563

FireStrike: http://www.3dmark.com/3dm/5636467?

*Alright, that is a ton of information to swallow at once so let's try and break it down, shall we?*

We have 9 benchmarks, with stock/overclocked settings which gives us a total of 18 possibilities for win/lose...and we have a tie *9/9*.

If we remove Cloudgate/Skydiver from the equation we drop down to a total of 14...and *970* takes it with 8/14 wins.

If we look at just stock clocks for both cards where the 970 doesn't have the Kraken/25% Power...the *290* takes it with 10/16 wins. (no FFXIV)

If we look at stock clocks where the 970 has the Kraken/25% Power...the *970* takes it with 7/9 wins.
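The first two tallies can be reproduced by scripting the graphics-score/FPS columns of the table, taking the 290 against the better of the two 970 configurations for each test (numbers transcribed from the table; FFXIV has no stock-BIOS 970 run, so the Kraken result stands in):

```python
# Win/loss tally from the benchmark table. Each entry:
# (290 stock, 290 OC, best 970 stock, best 970 OC) - graphics score or average FPS.
results = {
    "Cloudgate":   (70321,  88574,  76221,  85881),
    "Skydiver":    (36349,  46589,  35679,  40349),
    "3DMark11":    (14865,  19315,  15500,  16585),
    "Firestrike":  (10902,  14057,  11944,  12891),
    "FFXIV":       (13215,  16502,  15433,  16616),
    "Heaven":      (66.2,   83.9,   72.7,   78.1),
    "Valley":      (58.4,   73.5,   56.8,   62.1),
    "Bioshock":    (172.16, 202.15, 185.22, 197.71),
    "Tomb Raider": (80.8,   103.1,  97.2,   144.0),
}

def tally(names):
    """Return (290 wins, 970 wins) over the stock + overclocked runs of `names`."""
    wins_290 = sum((results[n][0] > results[n][2]) + (results[n][1] > results[n][3])
                   for n in names)
    return wins_290, 2 * len(names) - wins_290

print(tally(results))            # all nine benches: (9, 9) tie
without_cg_sd = [n for n in results if n not in ("Cloudgate", "Skydiver")]
print(tally(without_cg_sd))      # drop Cloudgate/Skydiver: (6, 8), 970 takes it
```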

*But we still have other factors to consider...Price, Power Consumption, Operating Temperatures, Overclockability, etc.*

*Price 290:* It looks like NewEgg is running low on stock of R9 290's as the cheapest one currently is an HIS for $269 + a $20 MIR. If you go used, then the 290 definitely wins here as you can easily find them in the $175-$200 range.

*Price 970:* This Zotac that I have is still the cheapest available at $329 + $10 gift card. Prices go all the way up to $399 where you are guaranteeing yourself a card that has an advertised stock boost clock in the mid 1300's...which will likely go even further. The more expensive cards may even have an 8+6 PCI-E power instead of 6+6 to ensure you get enough juice for high clocks.

*Power Consumption 290:* In reference configuration these cards are downright awful. They get too hot which destroys any chance they had at efficiency. Put a better cooler on them, or some form of water cooling and the power consumption can come down quite a bit. Keep it at stock clocks, or with a minor OC and it's not so bad. I know from first hand experience that a 290 @ 1075/1375 +37mV has no problems whatsoever running on a 450W gold PSU with a mildly OC'd 4770k; if you're running AMD FX CPU then you know you will need a big(ger) PSU.

*Power Consumption 970:* We all know the 970 wins this battle without question. But, it is the new kid on the block and this was one of Nvidia's primary concerns; KUDOS to them for such an efficient card. If you are thinking about upgrading from a smaller GPU and you have a smaller/older power supply, then the 970 is likely the better choice.

*Operating Temperatures 290:* Same story as the power consumption unfortunately. They can be tamed, but it takes time/money/effort especially if you want to run high overclocks all the time.

*Operating Temperatures 970:* In stock configuration the 970 also takes the cake. The cooler on this card is downright tiny, but it was still able to handle a decent overclock simply by forcing the fans up to 60%, where it was still pretty close to inaudible. Increase the power limit on the card to unlock maximum potential and the VRMs are just as much of an issue here as they are with the 290(X). *However*, this is likely due to the really crappy heatsink on this particular card. Some of the manufacturers are rolling out new revisions of the 970 already where they've added 'front plates' to the card for VRM/RAM cooling, and consequently they have increased core/boost speeds.

*Overclockability R9 290:* I have an above average 290 that holds stable in anything at 1275/1675. Not only is the core clock marvelous, the RAM speed is extremely uncommon in the 290(X) world. Odds are, if you have an average card, you would slip under the performance of the 970 in most, if not all, scenarios.

*Overclockability GTX 970:* With this little Zotac boosting, and holding, very near/above 1.5GHz, its performance downright shocked me. I have looked at numerous reviews of numerous 970's and this is par for the course; nearly all of them land in the 1450-1525 range. Where my 970 falls short is the RAM speed only going ~100MHz over stock. Odds are pretty good you could end up with better clocking RAM.

*ETC. R9 290:* They obviously carry an advantage for Folding @ Home. No Xfire bridges. You can run Quad-Fire.

*ETC. GTX 970:* An unusually large number of 970 cards suffer from annoying coil whine; Mine does, but it is tolerable even with my obsession for absolute silence. There are a few 970's that come in the small size like this Zotac, which is a plus for small cases. ASUS even has a mini version that is 6.7" long; This matches dimensions of a mini-ITX board! You also have CUDA and PhysX if you're into those things. Limited to Tri-SLI for 970.

*The VRAM issue must definitely be taken into consideration now that it has officially come to light...If you need to use more than 3.5GB of VRAM and don't want to experience problems, then the 970 isn't going to cut the mustard. The 290(X) definitely have the advantage here.*

*Conclusion...*

I have had my hands on numerous *290s*, and they are wonderful GPUs. The release of them stuck it to Nvidia when their pricing was really getting out of hand. The *290(X)* showed everyone that AMD can still pack a punch at a very reasonable price. But...they are 15 months old and that is an eternity in the computer world.

The *970* has nearly everything going for it except for price and the VRAM issue. If you take most everything into consideration...the savings in your electric bill, the probability of needing to upgrade cooling in one form or another, or even a new PSU...then the case for the *290* heavily relies on you requiring constant use of more than 3.5GB of VRAM.

Nvidia has the upper hand presuming you don't run higher resolutions with every setting maxed...AMD's 15 month old hardware is giving the 970 a run for its money, and it can still be the better choice if you need all the VRAM.

Am I converted to the Green Team? Not a chance!!! I am still impressed with the performance of this little *970*, even with the VRAM issue and the 'misunderstanding' regarding the card's actual specifications.

Your opinion of Nvidia's questionable business practices may sway you more in the red direction..but that is for you to decide.

*What will happen when the R9 300 series cards come out is a whole 'nother story! *

El Fin


----------



## hyp36rmax

Quote:


> Originally Posted by *Roboyto*
> 
> *R9 290
> 
> 
> 
> 
> 
> 
> 
> ~VS~
> 
> 
> 
> 
> 
> 
> 
> GTX 970*
> 
> **Disclaimer**
> 
> This, by far, is *not* a perfect comparison of the two cards since it was a large learning experience for me. I have not had an Nvidia card that I have been able to tweak/overclock ever...so bear with me folks! (I did once own a GTX 760, but it was an absolute dud that did not overclock even 1MHz Core/RAM and died on me rather quickly)
> 
> You may be wondering why someone who has never delved into Nvidia would buy a 970....
> 
> 
> The Hype
> CUDA for BD Rips
> PhysX for new Batman for my wife
> Bored with both rigs having 290s
> Open Box for $264
> 
> Though it is not a perfect comparison it has some good information for those ignorant to the *'other side'*(like I was), and it paints a pretty decent picture with benchmark scores.
> 
> I felt this comparison needed to be done because I don't think (m)any of the professional review sites retest older hardware with current drivers. We don't always know what card they are posting the results from... What we DO KNOW, is that benchmark/FPS scores for a reference 290(X) from late 2013, look much different than benchmark/FPS scores for a properly cooled 290(X) with current drivers.
> 
> Keep in mind I only tested the 970 at 1080P. What I have gathered from review sites is that the 970 keeps pace with higher resolutions.
> 
> The 290 is in my main rig, and the 970 is in my HTPC. They are similar but not identical, so here you are:
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *290 Rig:*
> Win7 Ultimate x64
> ASUS Z87 Gryphon w/ Thermal Armor
> i7 4770k @ 4.5GHz - XSPC Raystorm - Delidded/Naked
> 2x8 GB Dominator Platinum 2400MHz
> XFX BE R9 290 Reference - XSPC Razor w/ Backplate
> Rosewill Capstone 650W Modular
> 
> *970 Rig:*
> Win7 Ultimate x64
> Gigabyte GA-Z97N Wi-Fi Mini-ITX
> i7 4790k @ 4.7GHz - Antec 620 - Delidded/CLU
> 2x4 GB Dominator 2133MHz 1.5V
> Zotac GTX 970 Reference - Kraken G10 w/ Antec 620 & VRM Heatsink Upgrade
> Rosewill Capstone 450W Modular
> 
> The clock speed difference in CPU/RAM will make a slight difference in physics scores for 3DMark, so I will be focusing on graphics scores between the two. Interestingly enough the physics scores for the 4790K are better in firestrike, but worse in 3DMark11. I'm assuming this means that Firestrike relies more on CPU clock, and 3DMark11 more on RAM. I apologize, I now realize I should have just swapped RAM and clocked to 4.5GHz...but hindsight is 20/20
> 
> 
> 
> 
> 
> 
> 
> And..I don't feel like re-running all the benches
> 
> You will also notice a small number of the benches had a 4770k used. The same time I swapped the 4790k in, I also attached the Kraken to the 970...graphics scores are more important...and remember this isn't a perfect comparison.
> 
> 
> 
> Spoiler: Zotac GTX 970 Bios Mod & VRM Cooling
> 
> 
> 
> The GTX 970 also underwent a small BIOS mod to increase the power limit from 6% to 25%. The new Nvidia cards have a statistic in GPU-Z that tells you what is causing the core clock to throttle; i.e. temp, power, etc. I noticed immediately that GPU-Z was frequently showing power throttling.
> 
> 
> 
> The throttling statistic is handy, but this Zotac lacks VRM monitoring; I'm not sure if it is just this card, or all newer Nvidia. So I pulled out my trusty Rosewill IR thermometer to see how hot the VRMs were running. I was expecting fairly low temperatures because of the efficiency of these cards, but was shocked to see 94C with the stock cooler and all stock settings after a 2 run loop of Heaven; Overclocking the card pushed it up to 98C. Once I increased the power limit to 25%, the temperature skyrocketed to
> 
> 
> 
> 
> 
> 
> 
> *132C*
> 
> 
> 
> 
> 
> 
> 
> within 3 min of Heaven and the card shut itself down to prevent any damage.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The shoddy temperatures are due to a very crappy heatsink on Zotac's part; which they have since upgraded to one with fins. The crappy heatsink was a blessing in disguise however, because it allowed me to attach another heatsink directly to it. I swapped the crappy thermal pads for some Fuji Extreme, and slapped an Enzotech MST-66 heatsink directly on top of the factory one to bring temperatures within reason. With the Fuji Extreme, double heatsink, 25% power and an OC, the VRMs run 12C cooler than bone stock configuration.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Since the 970 underwent a BIOS mod I recorded four sets of bench results:
> 
> 
> Completely Stock
> Stock Heatsink/BIOS Overclocked
> Stock Clocks w/ 25% PWR BIOS Kraken
> 25% PWR BIOS Overclocked Kraken
> 
> These Nvidia card with the new Boost 2.0 can be tricky since the card can be limited thermally or by power...AND you have the option to prioritize one or the other, which I didn't realize until I was pretty much done with my evaluation. More tricky than fiddling with a 290(X) anyhow...
> 
> 
> 
> Spoiler: Nvidia Boost 2.0 - For Those Who Don't Know
> 
> 
> 
> When I saw how hot the VRMs were getting with the stock configuration, I tackled that issue at the same time as attaching the Kraken. I initially assumed that the increase in benchmark scores was due to the increased power limit that in turn increased boost clock. However, I have come to realize that this is only partially true. The power limit did help a bit in some circumstances, but it turns out that the drastically lower GPU core temp allowed the higher boost clock. This struck me as odd because when I was using the stock cooler I made sure that it wasn't hitting the 81C threshold. I guess this all depends on how the BIOS is setup...so if the core temp is in the High 40s to Low 50s then it will boost higher.
> 
> That may be kind of confusing so let me just put the numbers down:
> 
> Completely Stock Everything: GPU 1076 | Boost 1216 | GPU-Z Actual Boost 1278
> 
> Stock Clocks/PWR & Kraken: GPU 1076 | Boost 1216 | GPU-Z Actual Boost 1366
> 
> Stock Cooler/BIOS Overclock (+210 core): GPU 1286 | Boost 1426 | GPU-Z Actual Boost 1488
> 
> Kraken 25% PWR Overclock (+150 core): GPU 1226 | Boost 1366 | GpU-Z Actual Boost 1516
> 
> Another thing that makes sense now, that didn't previously, is why I had to reduce the overclock in Firestorm/Afterburner once I installed the Kraken. The low core temp is allowing more boost, so I had to turn the overclock down. It was trying to reach ~1575 core boost clock with the Kraken cooling, which would crash immediately in anything I ran except for FFXIV.
> 
> You probably noticed I didn't mention additional voltage anywhere. This is because the card is limited to a mere +37mV and it rarely had any benefit to an overclock. Once I attached the Kraken, any amount of additional voltage made no difference whatsoever.
> 
> 
> 
> A long Boost 2.0 story made short: Keep the core very cool and you get higher than advertised boost clocks.
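The behavior described above can be sketched as a toy model. This is purely my illustration of the concept; the temperature target, bin size, and headroom scaling are made-up assumptions, not Nvidia's actual firmware logic:

```python
# Toy model of Boost 2.0: the card adds boost bins until it runs out of
# either thermal or power headroom, and the scarcer of the two wins.
# All constants here (80C target, 13 MHz bins) are illustrative assumptions.

def effective_boost(rated_boost_mhz, core_temp_c, power_draw_pct,
                    temp_target_c=80.0, power_limit_pct=106.0, bin_mhz=13):
    """Approximate boost clock given temperature and power headroom."""
    temp_headroom = max(0.0, temp_target_c - core_temp_c)        # spare degrees C
    power_headroom = max(0.0, power_limit_pct - power_draw_pct)  # spare % TDP
    bins = int(min(temp_headroom, power_headroom))               # binding limit wins
    return rated_boost_mhz + bins * bin_mhz

# Rated boost 1216: a warm core on the stock cooler vs. a ~50C core on the
# Kraken with the raised power limit lands noticeably higher.
stock_cooler = effective_boost(1216, core_temp_c=72, power_draw_pct=100)
kraken = effective_boost(1216, core_temp_c=50, power_draw_pct=100,
                         power_limit_pct=125)
print(stock_cooler, kraken)
```

The point of the sketch is just the shape of the behavior: lowering core temperature (or raising the power limit) widens whichever headroom was binding, so the reported boost climbs even at "stock" clocks.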
> 
> Alright! Now that all those shenanigans are out of the way, we can get to the benchmark results! Drum roll please!
> 
> *XFX R9 290 BE - Stock Clocks 975/1250 | Omega 14.12 | Overclock 1275/1675 +200mV +50% Power Limit*
> 
> *Zotac GTX 970 - Stock Clocks 1076/1216/1753 (Actual Boost Up to 1278) | ForceWare 347.09 | Overclock 1286/1426/1840 (Actual Boost Up to 1488)*
> 
> *970 Kraken PWR - Stock Clocks 1076/1216/1753 (Actual Boost Up to 1366) | ForceWare 347.09 | Overclock 1226/1366/1815 (Actual Boost Up to 1516)*
> 
> 
> *Zotac GTX 970 Stock BIOS vs. XFX R9 290 Black Edition vs. GTX 970 Kraken PWR* (winning scores bolded, as in my notes)
> 
> *Cloudgate (Overall · Graphics · Physics)*
> 
> | | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
> | --- | --- | --- | --- |
> | Stock | 27,657 · 75,151 · 8,611 | 27,686 · 70,321 · 8,868 | 29,900 · *76,221* · 9,562 |
> | Overclock | 28,882 · 85,881 · 8,692 | 29,828 · *88,574* · 8,981 | 30,520 · 82,335 · 9,530 |
> 
> *Skydiver (Overall · Graphics · Physics)*
> 
> | | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
> | --- | --- | --- | --- |
> | Stock | 26,506 · 34,288 · 12,495 | 27,533 · *36,349* · 12,592 | 27,403 · 35,679 · 12,875 |
> | Overclock | 28,141 · 40,349 · 11,717 | 31,479 · *46,589* · 12,578 | 28,452 · 38,493 · 12,709 |
> 
> *3DMark11 (Overall · Graphics · Physics)*
> 
> | | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
> | --- | --- | --- | --- |
> | Stock | 13,830 · *14,843* · 11,479 | 13,898 · 14,865 · 11,975 | 14,201 · *15,500* · 11,303 |
> | Overclock | 14,605 · 16,393 · 11,024 | 16,568 · *19,315* · 12,162 | 14,927 · 16,585 · 11,534 |
> 
> *Firestrike (Overall · Graphics · Physics)*
> 
> | | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
> | --- | --- | --- | --- |
> | Stock | 10,121 · *11,630* · 13,307 | 9,529 · 10,902 · 12,386 | 10,309 · *11,944* · 13,223 |
> | Overclock | 11,031 · 12,891 · 13,314 | 11,799 · *14,057* · 12,363 | 11,031 · 12,891 · 13,314 |
> 
> *FFXIV (Score)*
> 
> | | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
> | --- | --- | --- | --- |
> | Stock | n/a | 13,215 | *15,433* |
> | Overclock | n/a | 16,502 | *16,616* |
> 
> *Heaven (Score · FPS · Min/Max)*
> 
> | | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
> | --- | --- | --- | --- |
> | Stock | *1,725* · *68.5* · 32.1/150.9 | 1,668 · 66.2 · 25.8/139.2 | *1,832* · *72.7* · 29.6/158.4 |
> | Overclock | 1,943 · 77.1 · 29.6/170.0 | *2,114* · *83.9* · 33.3/177.5 | 1,968 · 78.1 · 30.3/170.7 |
> 
> *Valley (Score · FPS · Min/Max)*
> 
> | | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
> | --- | --- | --- | --- |
> | Stock | 2,363 · 56.5 · 28.9/103.2 | *2,442* · *58.4* · 28.6/107.8 | 2,376 · 56.8 · 29.9/109.2 |
> | Overclock | 2,598 · 62.1 · 26.1/118.8 | *3,077* · *73.5* · 34.6/135.3 | 2,532 · 60.5 · 32.9/114.6 |
> 
> *Bioshock (Average · Min · Max FPS)*
> 
> | | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
> | --- | --- | --- | --- |
> | Stock | *179.36* · 23.17 · 682.24 | 172.16 · 20.06 · 386.26 | *185.22* · 23.27 · 505.41 |
> | Overclock | 190.53 · 19.25 · 854.72 | *202.15* · 1.46 · 463.84 | 197.71 · 22.28 · 605.82 |
> 
> *Tomb Raider (Average · Min · Max FPS)*
> 
> | | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
> | --- | --- | --- | --- |
> | Stock | *86.0* · 66.0 · 106.0 | 80.8 · 62.0 · 106.0 | *97.2* · 74.0 · 120.6 |
> | Overclock | *96.1* · 73.8 · 120.6 | 103.1 · 77.7 · 134.2 | *144.0* · 80.0 · 105.6 |
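One way to read the table is as relative headroom: convert a few stock-to-overclock graphics scores into percentage gains. A quick sketch, using numbers transcribed from the table above:

```python
# Stock -> overclock graphics-score gains for a few rows of the table above.
# Pairs are (stock score, overclocked score).
scores = {
    "R9 290 BE, Firestrike":          (10_902, 14_057),
    "GTX 970 stock BIOS, Firestrike": (11_630, 12_891),
    "R9 290 BE, 3DMark11":            (14_865, 19_315),
    "GTX 970 Kraken PWR, 3DMark11":   (15_500, 16_585),
}

for name, (stock, oc) in scores.items():
    print(f"{name}: +{(oc - stock) / stock:.1%}")
```

The 290's roughly 29-30% gains against the 970's 7-11% are the same story the win/loss tallies tell further down: the 290 sample here simply has far more overclocking headroom.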
> 
> All of the R9 290 results used were from my Omega Drivers Comparison from a few weeks back...http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/33460_20#post_23300417
> 
> All of the settings for the benches are the same that I used for the Omega Drivers Comparison, which are typically default/maximum settings.
> 
> I have 4 screenshots for every bench, but I'm not going to post them all since that is ridiculous. I will toss a few of them up so you can see what I was talking about with boost clocks changing and things of that nature.
> 
> 
> 
> Spoiler: Valley Stock Everything - Boosting 1278
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Skydiver w/ Kraken Cooling - Stock Clocks - Boosting 1366
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Firestrike Stock Cooler Overclocked - Boosting 1472
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Firestrike Kraken 25% Power - Boosting 1516
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: 3DMark11 Kraken 25% Power - Boosting 1491
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Tomb Raider Stock Everything - Boosting 1265
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Tomb Raider Kraken 25% Power - Boosting 1516
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Heaven Kraken 25% Power - Boosting 1501
> 
> 
> 
> 
> 
> 
> 
> *970 Stock Settings*
> 
> Cloudgate: http://www.3dmark.com/3dm/5636125?
> 
> Skydiver: http://www.3dmark.com/3dm/5636059?
> 
> 3DMark11: http://www.3dmark.com/3dm11/9328408
> 
> FireStrike: http://www.3dmark.com/3dm/5635965?
> 
> *970 Overclock*
> 
> Cloudgate: http://www.3dmark.com/3dm/5352334?
> 
> Skydiver: http://www.3dmark.com/3dm/5352431?
> 
> 3DMark11: http://www.3dmark.com/3dm/9225137
> 
> FireStrike: http://www.3dmark.com/3dm/5352499
> 
> *970 PWR BIOS Stock Settings*
> 
> Cloudgate: http://www.3dmark.com/3dm/5636341?
> 
> Skydiver: http://www.3dmark.com/3dm/5636307?
> 
> 3DMark11: http://www.3dmark.com/3dm11/9328484
> 
> FireStrike: http://www.3dmark.com/3dm/5636253?
> 
> *970 PWR BIOS Overclock*
> 
> Cloudgate: http://www.3dmark.com/3dm/5636568?
> 
> Skydiver: http://www.3dmark.com/3dm/5636530?
> 
> 3DMark11: http://www.3dmark.com/3dm11/9328563
> 
> FireStrike: http://www.3dmark.com/3dm/5636467?
> 
> *Alright, that is a ton of information to swallow at once so let's try and break it down, shall we?*
> 
> We have 9 benchmarks, each run at stock and overclocked settings, which gives us a total of 18 possible wins/losses...and we have a tie at *9/9*.
> 
> If we remove Cloudgate/Skydiver from the equation we drop down to a total of 14...and *970* takes it with 8/14 wins.
> 
> If we look at just stock clocks for both cards where the 970 doesn't have the Kraken/25% Power...the *290* takes it with 10/16 wins. (no FFXIV)
> 
> If we look at stock clocks where the 970 has the Kraken/25% Power...the *970* takes it with 7/9 wins.
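For anyone who wants to redo these tallies, the counting itself is trivial. Here's a sketch over the stock-vs-stock graphics/FPS numbers as I transcribed them from the table (my own transcription and a narrower slice than the tallies above, so treat the exact counts as illustrative):

```python
# Count stock-vs-stock wins: R9 290 vs. GTX 970 on its stock cooler/BIOS.
# Values are (R9 290, GTX 970) graphics scores or FPS from the table above.
stock_results = {
    "Cloudgate":   (70_321, 75_151),
    "Skydiver":    (36_349, 34_288),
    "3DMark11":    (14_865, 14_843),
    "Firestrike":  (10_902, 11_630),
    "Heaven":      (66.2, 68.5),
    "Valley":      (58.4, 56.5),
    "Bioshock":    (172.16, 179.36),
    "Tomb Raider": (80.8, 86.0),
}

wins_290 = sum(1 for r290, g970 in stock_results.values() if r290 > g970)
wins_970 = len(stock_results) - wins_290
print(f"R9 290 wins {wins_290}, GTX 970 wins {wins_970} of {len(stock_results)}")
```

Note this only covers the eight stock runs (no FFXIV, no overclocked runs), which is why it won't reproduce the 10/16 figure above on its own.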
> 
> *But we still have other factors to consider...Price, Power Consumption, Operating Temperatures, Overclockability, etc.*
> 
> *Price 290:* It looks like NewEgg is running low on stock of R9 290s, as the cheapest one currently is an HIS for $269 + a $20 MIR. If you go used, then the 290 definitely wins here, as you can easily find them in the $175-$200 range.
> 
> *Price 970:* This Zotac that I have is still the cheapest available at $329 + $10 gift card. Prices go all the way up to $399, where you are guaranteeing yourself a card with an advertised stock boost clock in the mid-1300s...which will likely go even further. The more expensive cards may even have 8+6 PCI-E power connectors instead of 6+6 to ensure you get enough juice for high clocks.
> 
> *Power Consumption 290:* In reference configuration these cards are downright awful. They get too hot, which destroys any chance they had at efficiency. Put a better cooler on them, or some form of water cooling, and the power consumption can come down quite a bit. Keep it at stock clocks, or with a minor OC, and it's not so bad. I know from firsthand experience that a 290 @ 1075/1375 +37mV has no problems whatsoever running on a 450W gold PSU with a mildly OC'd 4770k; if you're running an AMD FX CPU then you know you will need a big(ger) PSU.
> 
> *Power Consumption 970:* We all know the 970 wins this battle without question. But, it is the new kid on the block and this was one of Nvidia's primary concerns; KUDOS to them for such an efficient card. If you are thinking about upgrading from a smaller GPU and you have a smaller/older power supply, then the 970 is likely the better choice.
> 
> *Operating Temperatures 290:* Same story as the power consumption unfortunately. They can be tamed, but it takes time/money/effort especially if you want to run high overclocks all the time.
> 
> *Operating Temperatures 970:* In stock configuration the 970 also takes the cake. The cooler on this card is downright tiny, but it was still able to handle a decent overclock simply by forcing the fans up to 60%, where it was still pretty close to inaudible. Increase the power limit on the card to unlock maximum potential and the VRMs are just as much of an issue here as they are with the 290(X). *However*, this is likely due to the really crappy heatsink on this particular card. Some of the manufacturers are rolling out new revisions of the 970 already where they've added 'front plates' to the card for VRM/RAM cooling, and consequently they have increased core/boost speeds.
> 
> *Overclockability R9 290:* I have an above-average 290 that holds stable in anything at 1275/1675. Not only is the core clock marvelous, the RAM speed is extremely uncommon in the 290(X) world. Odds are that an average card would slip under the performance of the 970 in most, if not all, scenarios.
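As a quick sanity check on how big that overclock really is relative to the card's stock 975/1250 clocks listed earlier:

```python
# Relative size of the 1275/1675 overclock vs. the 290's 975/1250 stock clocks.
core_stock, core_oc = 975, 1275
mem_stock, mem_oc = 1250, 1675

print(f"core: +{(core_oc - core_stock) / core_stock:.1%}")  # about +30.8%
print(f"memory: +{(mem_oc - mem_stock) / mem_stock:.1%}")   # +34.0%
```

A ~31% core and 34% memory overclock is well beyond what a typical 290 sample manages, which is exactly why the comparison notes this card is above average.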
> 
> *Overclockability GTX 970:* With this little Zotac boosting, and holding, very near/above 1.5GHz, its performance downright shocked me. I have looked at numerous reviews of numerous 970s and this is par for the course; nearly all of them land in the 1450-1525 range. Where my 970 falls short is the RAM speed, which only goes ~100MHz over stock. Odds are pretty good you could end up with better-clocking RAM.
> 
> *ETC. R9 290:* They obviously carry an advantage for Folding @ Home. No Xfire bridges. You can run Quad-Fire.
> 
> *ETC. GTX 970:* An unusually large number of 970 cards suffer from annoying coil whine; mine does, but it is tolerable even with my obsession for absolute silence. A few 970s come in a small form factor like this Zotac, which is a plus for small cases. ASUS even has a mini version that is 6.7" long, matching the dimensions of a mini-ITX board! You also have CUDA and PhysX if you're into those things. The 970 is limited to Tri-SLI.
> 
> 
> 
> *Conclusion...*
> 
> I have had my hands on numerous *290s*, and they are wonderful GPUs. Their release stuck it to Nvidia when their pricing was really getting out of hand. The *290(X)* showed everyone that AMD can still pack a punch at a very reasonable price. But...they are 15 months old and that is an eternity in the computer world.
> 
> The *970* has everything going for it, except for the higher price tag. Then again, if you take everything into consideration...the savings in your electric bill, the probability of needing to upgrade cooling in one form or another, or even a new PSU...then the *290* doesn't have much to stand on unless you grab bargain used *290s*.
> 
> I am a realist. The facts are here and it is pretty plain that Nvidia has the upper hand *currently*...considering I'm comparing Nvidia's newest to AMD's 15-month-old hardware.
> 
> Does this mean I am converted? Not a chance!!! Am I impressed with the performance of this little *970*? Absolutely.
> 
> *What will happen when the R9 300 series cards come out is a whole 'nother story! *
> 
> El Fin


Wow very informative! Great job! Although the setup is similar between the two systems, swapping the cards on one system would be more accurate, however that's just being nitpicky and your results probably have a small margin of error anyways. With that said this is freaking awesome! Cheers!


----------



## Buehlar

Awesome review... +rep


----------



## hyp36rmax

Quote:


> Originally Posted by *rdr09*
> 
> Home's, check out my new car . . .


Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> It looks like a black 3-series BMW 201x model. With spare tyre on. Guessing of course. If so you're doing well
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My ride
> 
> 
> http://www.overclock.net/t/961467/the-show-your-car-and-car-discussion-thread/16300_20#post_23458115


Haha Sweet rides! It's show your car day! haha. My EVO IX RS





I had just finished mounting the new bumper and she's been painted since then. LOL. Been thinking of picking up a BMW M4 or last gen E92 M3.


----------



## Bertovzki

@Roboyto thanks for your time and effort in all the work you did to run the tests and present them for the club and OCN. Nothing better than facts and comparisons when looking at hardware.


----------



## sinnedone

Awesome post Roboyto, read through all of it.

Thank you for all the time and effort you put into this.


----------



## Roboyto

Quote:


> Originally Posted by *hyp36rmax*
> 
> Wow very informative! Great job! Although the setup is similar between the two systems, swapping the cards on one system would be more accurate, however that's just being nitpicky and your results probably have a small margin of error anyways. With that said this is freaking awesome! Cheers!


Thank you.

Removing the 290 from my main rig is a pain because of the waterloop, so it was out of the question. I could have gotten even closer results if I would have just moved the RAM from one rig to the other, and then clocked the 4790k down to 4.5GHz...but all that didn't pop into my mind until I was too far into it. Plus the way Boost 2.0 works threw a wrench into things as well because I didn't have screenshots of some benches with completely stock settings/cooler..and I really didn't feel like removing the card, the kraken/radiator, and then reinstalling the stock cooler. And then my damn bluray drive threw me a curve ball as well...it's malfunctioning and either causes the system to not boot or causes the SATA controller to act up and not recognize my (2) 2TB storage drives. I about crapped my pants when I thought I lost 1.8TB of movies!

This is why there is a large disclaimer at the start









It was an enjoyable learning experience though









Quote:


> Originally Posted by *Buehlar*
> 
> Awesome review... +rep


Thanks


----------



## hyp36rmax

Quote:


> Originally Posted by *Roboyto*
> 
> Thank you.
> 
> Removing the 290 from my main rig is a pain because of the waterloop, so it was out of the question. I could have gotten even closer results if I would have just moved the RAM from one rig to the other, and then clocked the 4790k down to 4.5GHz...but all that didn't pop into my mind until I was too far into it. Plus the way Boost 2.0 works threw a wrench into things as well because I didn't have screenshots of some benches with completely stock settings/cooler..and I really didn't feel like removing the card, the kraken/radiator, and then reinstalling the stock cooler. And then my damn bluray drive threw me a curve ball as well...it's malfunctioning and either causes the system to not boot or causes the SATA controller to act up and not recognize my (2) 2TB storage drives. I about crapped my pants when I thought I lost 1.8TB of movies!
> 
> This is why there is a large disclaimer at the start
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It was an enjoyable learning experience though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks


I can only imagine the adventure that would have been haha! I love how detailed you were with your comparison. Major kudos!


----------



## long99x

Quote:


> Originally Posted by *rdr09*
> 
> why do you need an adapter?
> 
> i meant if both car and display have dp.


Quote:


> Originally Posted by *kizwan*
> 
> Typo, I think he meant DP cable.


thank you, i meant DP cable
Quote:


> Originally Posted by *rdr09*
> 
> prolly. that adapter threw me off. anyway, i read about this issue before in the drivers section but not sure how they fixed it. prolly onboard sound is conflicting with amd's.


Quote:


> Originally Posted by *tsm106*
> 
> Check to see if the sound is enabled in the sound applet in control panel.


Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Does your car have DP Kizwan ??
> Make your monitor sound as default


i already did that


----------



## Arizonian

Quote:


> Originally Posted by *Zelx0*
> 
> Time to join the club
> 
> 
> 
> 
> 
> 
> 
> I own a R9 290 gigabyte windforce with their stock air cooler ^^
> 
> Gpu Proof + 3d score
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Right I'm using a soft overclock on it (need to get time to finish tweaking it around):
> +88 core voltage
> +50 power limit
> 1150 coreclock
> 1300 memclock


Congrats - added








Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> _*R9 290
> 
> 
> 
> 
> 
> 
> 
> 
> ~VS~
> 
> 
> 
> 
> 
> 
> 
> 
> GTX 970*_
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> **Disclaimer**
> 
> This, by far, is _*not*_ a perfect comparison of the two cards since it was a large learning experience for me. I have not had an Nvidia card that I have been able to tweak/overclock ever...so bear with me folks! (I did once own a GTX 760, but it was an absolute dud that did not overclock even 1MHz Core/RAM and died on me rather quickly)
> 
> You may be wondering why someone who has never delved into Nvidia would buy a 970....
> 
> The Hype
> CUDA for BD Rips
> PhysX for new Batman for my wife
> Bored with both rigs having 290s
> Open Box for $264
> 
> Though it is not a perfect comparison it has some good information for those ignorant to the *'other side'*(like I was), and it paints a pretty decent picture with benchmark scores.
> 
> I felt this comparison needed to be done because I don't think (m)any of the professional review sites retest older hardware with current drivers. We don't always know what card they are posting the results from... What we DO KNOW, is that benchmark/FPS scores for a reference 290(X) from late 2013, look much different than benchmark/FPS scores for a properly cooled 290(X) with current drivers.
> 
> Keep in mind I only tested the 970 at 1080P. What I have gathered from review sites is that the 970 keeps pace with higher resolutions.
> 
> The 290 is in my main rig, and the 970 is in my HTPC. They are similar but not identical, so here you are:
> 
> *290 Rig:*
> 
> Win7 Ultimate x64
> 
> ASUS Z87 Gryphon w/ Thermal Armor
> 
> i7 4770k @ 4.5GHz - XSPC Raystorm - Delidded/Naked
> 
> 2x8 GB Dominator Platinum 2400MHz
> 
> XFX BE R9 290 Reference - XSPC Razor w/ Backplate
> 
> Rosewill Capstone 650W Modular
> 
> *970 Rig:*
> 
> Win7 Ultimate x64
> 
> Gigabyte GA-Z97N Wi-Fi Mini-ITX
> 
> i7 4790k @ 4.7GHz - Antec 620 - Delidded/CLU
> 
> 2x4 GB Dominator 2133MHz 1.5V
> 
> Zotac GTX 970 Reference - Kraken G10 w/ Antec 620 & VRM Heatsink Upgrade
> 
> Rosewill Capstone 450W Modular
> 
> The clock speed difference in CPU/RAM will make a slight difference in physics scores for 3DMark, so I will be focusing on graphics scores between the two. Interestingly enough the physics scores for the 4790K are better in firestrike, but worse in 3DMark11. I'm assuming this means that Firestrike relies more on CPU clock, and 3DMark11 more on RAM. I apologize, I now realize I should have just swapped RAM and clocked to 4.5GHz...but hindsight is 20/20
> 
> 
> 
> 
> 
> 
> 
> And..I don't feel like re-running all the benches
> 
> You will also notice a small number of the benches had a 4770k used. The same time I swapped the 4790k in, I also attached the Kraken to the 970...graphics scores are more important...and remember this isn't a perfect comparison.
> 
> 
> 
> Spoiler: Zotac GTX 970 Bios Mod & VRM Cooling
> 
> 
> 
> The GTX 970 also underwent a small BIOS mod to increase the power limit from 6% to 25%. The new Nvidia cards have a statistic in GPU-Z that tells you what is causing the core clock to throttle; i.e. temp, power, etc. I noticed immediately that GPU-Z was frequently showing power throttling.
> 
> 
> 
> 
> The throttling statistic is handy, but this Zotac lacks VRM monitoring; I'm not sure if it is just this card, or all newer Nvidia. So I pulled out my trusty Rosewill IR thermometer to see how hot the VRMs were running. I was expecting fairly low temperatures because of the efficiency of these cards, but was shocked to see 94C with the stock cooler and all stock settings after a 2 run loop of Heaven; Overclocking the card pushed it up to 98C. Once I increased the power limit to 25%, the temperature skyrocketed to
> 
> 
> 
> 
> 
> 
> 
> _*132C*_
> 
> 
> 
> 
> 
> 
> 
> within 3 min of Heaven and the card shut itself down to prevent any damage.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The shoddy temperatures are due to a very crappy heatsink on Zotac's part; which they have since upgraded to one with fins. The crappy heatsink was a blessing in disguise however, because it allowed me to attach another heatsink directly to it. I swapped the crappy thermal pads for some Fuji Extreme, and slapped an Enzotech MST-66 heatsink directly on top of the factory one to bring temperatures within reason. With the Fuji Extreme, double heatsink, 25% power and an OC, the VRMs run 12C cooler than bone stock configuration.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Since the 970 underwent a BIOS mod I recorded four sets of bench results:
> 
> Completely Stock
> Stock Heatsink/BIOS Overclocked
> Stock Clocks w/ 25% PWR BIOS Kraken
> 25% PWR BIOS Overclocked Kraken
> These Nvidia card with the new Boost 2.0 can be tricky since the card can be limited thermally or by power...AND you have the option to prioritize one or the other, which I didn't realize until I was pretty much done with my evaluation. More tricky than fiddling with a 290(X) anyhow...
> 
> 
> Spoiler: Nvidia Boost 2.0 - For Those Who Don't Know
> 
> 
> 
> When I saw how hot the VRMs were getting with the stock configuration, I tackled that issue at the same time as attaching the Kraken. I initially assumed that the increase in benchmark scores was due to the increased power limit that in turn increased boost clock. However, I have come to realize that this is only partially true. The power limit did help a bit in some circumstances, but it turns out that the drastically lower GPU core temp allowed the higher boost clock. This struck me as odd because when I was using the stock cooler I made sure that it wasn't hitting the 81C threshold. I guess this all depends on how the BIOS is setup...so if the core temp is in the High 40s to Low 50s then it will boost higher.
> 
> That may be kind of confusing so let me just put the numbers down:
> 
> Completely Stock Everything: GPU 1076 | Boost 1216 | GPU-Z Actual Boost 1278
> Stock Clocks/PWR & Kraken: GPU 1076 | Boost 1216 | GPU-Z Actual Boost 1366
> 
> Stock Cooler/BIOS Overclock (+210 core): GPU 1286 | Boost 1426 | GPU-Z Actual Boost 1488
> Kraken 25% PWR Overclock (+150 core): GPU 1226 | Boost 1366 | GpU-Z Actual Boost 1516
> 
> Another thing that makes sense now, that didn't previously, is why I had to reduce the overclock in Firestorm/Afterburner once I installed the Kraken. The low core temp is allowing more boost, so I had to turn the overclock down. It was trying to reach ~1575 core boost clock with the Kraken cooling, which would crash immediately in anything I ran except for FFXIV.
> 
> You probably noticed I didn't mention additional voltage anywhere. This is because the card is limited to a mere +37mV and it rarely had any benefit to an overclock. Once I attached the Kraken, any amount of additional voltage made no difference whatsoever.
> 
> 
> 
> A long Boost 2.0 story made short: Keep the core very cool and you get higher than advertised boost clocks.
> 
> Alright! Now that all those shenanigans are out of the way, we can get to the benchmark results! Drum roll please! " src="https://www.overclock.net/images/smilies/drum.gif" style="width:65px;">
> *XFX R9 290 BE - Stock Clocks 975/1250 | Omega 14.12 | Overclock 1275/1675 +200mV +50% Power Limit*
> *Zotac GTX 970 - Stock Clocks 1076/1216/1753 (Actual Boost Up to 1278) | ForceWare 347.09 | Overclock 1286/1426/1840 (Actual Boost Up to 1488)*
> *970 Kraken PWR - Stock Clocks 1076/1216/1753 (Actual Boost Up to 1366) | ForceWare 347.09 | Overclock 1226/1366/1815 (Actual Boost Up to 1516)*
> 
> 
> *Zotac GTX 970 Stock BIOS*    *XFX R9 290 Black Edition*    *GTX 970 Kraken PWR*   *Cloudgate**Graphics**Physics*  *Cloudgate**Graphics**Physics*  *Cloudgate**Graphics**Physics**Stock*27,65775,1518,611 *Stock*27,68670,3218,868 *Stock*29,900*76,221*9,562*Overclock*28,88285,8818,692 *Overclock*29,828*88,574*8,981 *Overclock*30,52082,3359,530               *Skydiver**Graphics**Physics*  *Skydiver**Graphics**Physics*  *Skydiver**Graphics**Physics**Stock*26,50634,28812,495 *Stock*27,533*36,349*12,592 *Stock*27,40335,67912,875*Overclock*28,14140,34911,717 *Overclock*31,479*46,589*12,578 *Overclock*28,45238,49312,709               *3DMark11**Graphics**Physics*  *3DMark11**Graphics**Physics*  *3DMark11**Graphics**Physics**Stock*13,830*14,843*11,479 *Stock*13,89814,86511,975 *Stock*14,201*15,500*11,303*Overclock*14,60516,39311,024 *Overclock*16,568*19,315*12,162 *Overclock*14,92716,58511,534               *Firestrike**Graphics**Physics*  *Firestrike**Graphics**Physics*  *Firestrike**Graphics**Physics**Stock*10,121*11,630*13,307 *Stock*9,52910,90212,386 *Stock*10,309*11,944*13,223*Overclock*11,03112,89113,314 *Overclock*11,799*14,057*12,363 *Overclock*11,03112,89113,314               *FFXIV*    *FFXIV*    *FFXIV*  *Stock*n/a   *Stock*13,215   *Stock**15,433*  *Overclock*n/a   *Overclock*16,502   *Overclock**16,616*                 *Heaven**FPS**Min / Max*  *Heaven**FPS**Min / Max*  *Heaven**FPS**Min / Max**Stock**1,725**68.5*32.1 / 150.9 *Stock*1,66866.225.8 / 139.2 *Stock**1,832**72.7*29.6 / 158.4*Overclock*1,94377.129.6 / 170.0 *Overclock**2,114**83.9*33.3 / 177.5 *Overclock*1,96878.130.3 / 170.7               *Valley**FPS**Min / Max*  *Valley**FPS**Min / Max*  *Valley**FPS**Min / Max**Stock*2,36356.528.9 / 103.2 *Stock**2,442**58.4*28.6 / 107.8 *Stock*2,37656.829.9 / 109.2*Overclock*2,59862.126.1 / 118.8 *Overclock**3,077**73.5*34.6 / 135.3 *Overclock*2,53260.532.9 / 114.6               *Bioshock (average)**Min**Max*  *Bioshock (average)**Min**Max*  *Bioshock 
(average)**Min**Max**Stock**179.36*23.17682.24 *Stock*172.1620.06386.26 *Stock**185.22*23.27505.41*Overclock*190.5319.25854.72 *Overclock**202.15*1.46463.84 *Overclock*197.7122.28605.82               *Tomb Raider (average)**Min**Max*  *Tomb Raider (average)**Min**Max*  *Tomb Raider (average)**Min**Max**Stock**86.0*66.0106.0 *Stock*80.862.0106.0 *Stock**97.2*74.0120.6*Overclock**96.1*73.8120.6 *Overclock*103.177.7134.2 *Overclock**144.0*80.0105.6
> 
> All of the R9 290 results used were from my Omega Drivers Comparison from a few weeks back...http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/33460_20#post_23300417
> 
> All of the settings for the benches are the same that I used for the Omega Drivers Comparison, which are typically default/maximum settings.
> 
> I have 4 screenshots for every bench, but I'm not going to post them all since that is ridiculous. I will toss a few of them up so you can see what I was talking about with boost clocks changing and things of that nature.
> 
> 
> 
> Spoiler: Valley Stock Everything - Boosting 1278
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Skydiver w/ Kraken Cooling - Stock Clocks - Boosting 1366
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Firestrike Stock Cooler Overclocked - Boosting 1472
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Firestrike Kraken 25% Power - Boosting 1516
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: 3DMark11 Kraken 25% Power - Boosting 1491
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Tomb Raider Stock Everything - Boosting 1265
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Tomb Raider Kraken 25% Power - Boosting 1516
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Heaven Kraken 25% Power - Boosting 1501
> 
> 
> 
> 
> 
> 
> 
> 
> *970 Stock Settings*
> Cloudgate: http://www.3dmark.com/3dm/5636125?
> Skydiver: http://www.3dmark.com/3dm/5636059?
> 3DMark11: http://www.3dmark.com/3dm11/9328408
> FireStrike: http://www.3dmark.com/3dm/5635965?
> 
> *970 Overclock*
> Cloudgate: http://www.3dmark.com/3dm/5352334?
> Skydiver: http://www.3dmark.com/3dm/5352431?
> 3DMark11: http://www.3dmark.com/3dm/9225137
> FireStrike: http://www.3dmark.com/3dm/5352499
> 
> *970 PWR BIOS Stock Settings*
> Cloudgate: http://www.3dmark.com/3dm/5636341?
> Skydiver: http://www.3dmark.com/3dm/5636307?
> 3DMark11: http://www.3dmark.com/3dm11/9328484
> FireStrike: http://www.3dmark.com/3dm/5636253?
> 
> *970 PWR BIOS Overclock*
> Cloudgate: http://www.3dmark.com/3dm/5636568?
> Skydiver: http://www.3dmark.com/3dm/5636530?
> 3DMark11: http://www.3dmark.com/3dm11/9328563
> FireStrike: http://www.3dmark.com/3dm/5636467?
> 
> *Alright, that is a ton of information to swallow at once so let's try and break it down, shall we?*
> 
> We have 9 benchmarks, with stock/overclocked settings which gives us a total of 18 possibilities for win/lose...and we have a tie *9/9*.
> 
> If we remove Cloudgate/Skydiver from the equation we drop down to a total of 14...and *970* takes it with 8/14 wins.
> 
> If we look at just stock clocks for both cards where the 970 doesn't have the Kraken/25% Power...the *290* takes it with 10/16 wins. (no FFXIV)
> 
> If we look at stock clocks where the 970 has the Kraken/25% Power...the *970* takes it with 7/9 wins.
> 
> *But we still have other factors to consider...Price, Power Consumption, Operating Temperatures, Overclockability, etc.*
> 
> *Price* *290:* It looks like NewEgg is running low on stock of R9 290's as the cheapest one currently is an HIS for $269 + a $20 MIR. If you go used, then the 290 definitely wins here as you can easily find them in the $175-$200 range.
> 
> *Price970:* This Zotac that I have is still the cheapest available at $329 + $10 gift card. Prices go all the way up to $399 where you are guaranteeing yourself a card that has an advertised stock boost clock in the mid 1300's...which will likely go even further. The more expensive cards may even have an 8+6 PCI-E power instead of 6+6 to ensure you get enough juice for high clocks.
> 
> *Power Consumption 290:* In reference configuration these cards are downright awful. They get too hot which destroys any chance they had at efficiency. Put a better cooler on them, or some form of water cooling and the power consumption can come down quite a bit. Keep it at stock clocks, or with a minor OC and it's not so bad. I know from first hand experience that a 290 @ 1075/1375 +37mV has no problems whatsoever running on a 450W gold PSU with a mildly OC'd 4770k; if you're running AMD FX CPU then you know you will need a big(ger) PSU.
> 
> *Power Consumption 970:* We all know the 970 wins this battle without question. But, it is the new kid on the block and this was one of Nvidia's primary concerns; KUDOS to them for such an efficient card. If you are thinking about upgrading from a smaller GPU and you have a smaller/older power supply, then the 970 is likely the better choice.
> 
> *Operating Temperatures 290:* Same story as the power consumption unfortunately. They can be tamed, but it takes time/money/effort especially if you want to run high overclocks all the time.
> 
> *Operating Temperatures 970:* In stock configuration the 970 also takes the cake. The cooler on this card is downright tiny, but it was still able to handle a decent overclock simply bt forcing the fans up to 60% where it was still pretty close to inaudible. Increase the power limit on the card to unlock maximum potential and the VRMs are just as much of an issue here as they are with the 290(X). _*However*_, this is likely due to the really crappy heatsink on this particular card. Some of the manufacturers are rolling out new revisions of the 970 already where they've added 'front plates' to the card for VRM/RAM cooling, and consequently they have increased core/boost speeds.
> 
> *Overclockability R9 290:* I have an above average 290 that holds stable in anything at 1275/1675. Not only is the core clock marvelous, the RAM speed is extremely uncommon in the 290(X) world. Odds are that if you have an average card, you would slip under the performance of the 970 in most, if not all, scenarios.
> 
> *Overclockability GTX 970:* With this little Zotac boosting, and holding, very near/above 1.5GHz, its performance downright shocked me. I have looked at numerous reviews of numerous 970's and this is par for the course; nearly all of them land in the 1450-1525 range. Where my 970 falls short is the RAM speed only going ~100MHz over stock. Odds are pretty good you could end up with better clocking RAM.
> 
> *ETC. R9 290:* They obviously carry an advantage for Folding @ Home. No Xfire bridges. You can run Quad-Fire.
> 
> _*ETC. GTX 970:*_ An unusually large number of 970 cards suffer from annoying coil whine; Mine does, but it is tolerable even with my obsession for absolute silence. There are a few 970's that come in the small size like this Zotac, which is a plus for small cases. ASUS even has a mini version that is 6.7" long; This matches dimensions of a mini-ITX board! You also have CUDA and PhysX if you're into those things. Limited to Tri-SLI for 970.
> 
> _*Conclusion...*_
> 
> I have had my hands on numerous *290s*, and they are wonderful GPUs. The release of them stuck it to Nvidia when their pricing was really getting out of hand. The *290(X)* showed everyone that AMD can still pack a punch at a very reasonable price. But...they are 15 months old and that is an eternity in the computer world.
> 
> The *970* has everything going for it, except for the higher price tag. Then again, if you take _everything_ into consideration...the savings in your electric bill, the probability of needing to upgrade cooling in one form or another, or even a new PSU...then the *290* doesn't have much to stand on unless you grab bargain used *290s*.
> 
> I am a realist. The facts are here and it is pretty plain that Nvidia has the upper hand _*currently*_...considering I'm comparing Nvidia's newest to AMD's 15 month old hardware.
> 
> Does this mean I am converted? Not a chance!!! Am I impressed with the performance of this little *970*? Absolutely.
> 
> _*What will happen when the R9 300 series cards come out is a whole 'nother story! *_
> 
> El Fin


Awesome - this deserves its own thread in *Graphic Cards / Graphic Cards General*


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Does your car have DP Kizwan ??


My car doesn't have DP but my cards do.


----------



## hyp36rmax

Quote:


> Originally Posted by *kizwan*
> 
> My car doesn't have DP but my cards do.


Haha! When I read that I was thinking "down-pipe" since we were talking about cars for a little. #turbo #moreboost #nocats


----------



## X-Alt

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Roboyto*
> 
> _*R9 290 ~VS~ GTX 970*_
> 
> **Disclaimer**
> 
> This, by far, is _*not*_ a perfect comparison of the two cards since it was a large learning experience for me. I have not had an Nvidia card that I have been able to tweak/overclock ever...so bear with me folks! (I did once own a GTX 760, but it was an absolute dud that did not overclock even 1MHz Core/RAM and died on me rather quickly)
> 
> You may be wondering why someone who has never delved into Nvidia would buy a 970....
> 
> - The Hype
> - CUDA for BD Rips
> - PhysX for new Batman for my wife
> - Bored with both rigs having 290s
> - Open Box for $264
> 
> Though it is not a perfect comparison it has some good information for those ignorant to the *'other side'*(like I was), and it paints a pretty decent picture with benchmark scores.
> 
> I felt this comparison needed to be done because I don't think (m)any of the professional review sites retest older hardware with current drivers. We don't always know what card they are posting the results from... What we DO KNOW, is that benchmark/FPS scores for a reference 290(X) from late 2013, look much different than benchmark/FPS scores for a properly cooled 290(X) with current drivers.
> 
> Keep in mind I only tested the 970 at 1080P. What I have gathered from review sites is that the 970 keeps pace with higher resolutions.
> 
> The 290 is in my main rig, and the 970 is in my HTPC. They are similar but not identical, so here you are:
> 
> *290 Rig:*
> 
> Win7 Ultimate x64
> 
> ASUS Z87 Gryphon w/ Thermal Armor
> 
> i7 4770k @ 4.5GHz - XSPC Raystorm - Delidded/Naked
> 
> 2x8 GB Dominator Platinum 2400MHz
> 
> XFX BE R9 290 Reference - XSPC Razor w/ Backplate
> 
> Rosewill Capstone 650W Modular
> 
> *970 Rig:*
> 
> Win7 Ultimate x64
> 
> Gigabyte GA-Z97N Wi-Fi Mini-ITX
> 
> i7 4790k @ 4.7GHz - Antec 620 - Delidded/CLU
> 
> 2x4 GB Dominator 2133MHz 1.5V
> 
> Zotac GTX 970 Reference - Kraken G10 w/ Antec 620 & VRM Heatsink Upgrade
> 
> Rosewill Capstone 450W Modular
> 
> The clock speed difference in CPU/RAM will make a slight difference in physics scores for 3DMark, so I will be focusing on graphics scores between the two. Interestingly enough, the physics scores for the 4790K are better in Firestrike, but worse in 3DMark11. I'm assuming this means that Firestrike relies more on CPU clock, and 3DMark11 more on RAM. I apologize, I now realize I should have just swapped RAM and clocked to 4.5GHz...but hindsight is 20/20, and I don't feel like re-running all the benches.
> 
> You will also notice a small number of the benches had a 4770k used. The same time I swapped the 4790k in, I also attached the Kraken to the 970...graphics scores are more important...and remember this isn't a perfect comparison.
> 
> 
> 
> Spoiler: Zotac GTX 970 Bios Mod & VRM Cooling
> 
> 
> 
> The GTX 970 also underwent a small BIOS mod to increase the power limit from 6% to 25%. The new Nvidia cards have a statistic in GPU-Z that tells you what is causing the core clock to throttle; i.e. temp, power, etc. I noticed immediately that GPU-Z was frequently showing power throttling.
> 
> 
> 
> 
> The throttling statistic is handy, but this Zotac lacks VRM monitoring; I'm not sure if it is just this card, or all newer Nvidia cards. So I pulled out my trusty Rosewill IR thermometer to see how hot the VRMs were running. I was expecting fairly low temperatures because of the efficiency of these cards, but was shocked to see 94C with the stock cooler and all stock settings after a 2-run loop of Heaven; overclocking the card pushed it up to 98C. Once I increased the power limit to 25%, the temperature skyrocketed to _*132C*_ within 3 min of Heaven and the card shut itself down to prevent any damage.
> 
> The shoddy temperatures are due to a very crappy heatsink on Zotac's part; which they have since upgraded to one with fins. The crappy heatsink was a blessing in disguise however, because it allowed me to attach another heatsink directly to it. I swapped the crappy thermal pads for some Fuji Extreme, and slapped an Enzotech MST-66 heatsink directly on top of the factory one to bring temperatures within reason. With the Fuji Extreme, double heatsink, 25% power and an OC, the VRMs run 12C cooler than bone stock configuration.
> 
> 
> Since the 970 underwent a BIOS mod I recorded four sets of bench results:
> 
> - Completely Stock
> - Stock Heatsink/BIOS Overclocked
> - Stock Clocks w/ 25% PWR BIOS Kraken
> - 25% PWR BIOS Overclocked Kraken
> 
> These Nvidia cards with the new Boost 2.0 can be tricky since the card can be limited thermally or by power...AND you have the option to prioritize one or the other, which I didn't realize until I was pretty much done with my evaluation. More tricky than fiddling with a 290(X) anyhow...
> 
> 
> Spoiler: Nvidia Boost 2.0 - For Those Who Don't Know
> 
> 
> 
> When I saw how hot the VRMs were getting with the stock configuration, I tackled that issue at the same time as attaching the Kraken. I initially assumed that the increase in benchmark scores was due to the increased power limit that in turn increased boost clock. However, I have come to realize that this is only partially true. The power limit did help a bit in some circumstances, but it turns out that the drastically lower GPU core temp allowed the higher boost clock. This struck me as odd because when I was using the stock cooler I made sure that it wasn't hitting the 81C threshold. I guess this all depends on how the BIOS is set up...so if the core temp is in the high 40s to low 50s then it will boost higher.
> 
> That may be kind of confusing so let me just put the numbers down:
> 
> Completely Stock Everything: GPU 1076 | Boost 1216 | GPU-Z Actual Boost 1278
> Stock Clocks/PWR & Kraken: GPU 1076 | Boost 1216 | GPU-Z Actual Boost 1366
> 
> Stock Cooler/BIOS Overclock (+210 core): GPU 1286 | Boost 1426 | GPU-Z Actual Boost 1488
> Kraken 25% PWR Overclock (+150 core): GPU 1226 | Boost 1366 | GPU-Z Actual Boost 1516
> 
> Another thing that makes sense now, that didn't previously, is why I had to reduce the overclock in Firestorm/Afterburner once I installed the Kraken. The low core temp is allowing more boost, so I had to turn the overclock down. It was trying to reach ~1575 core boost clock with the Kraken cooling, which would crash immediately in anything I ran except for FFXIV.
> 
> You probably noticed I didn't mention additional voltage anywhere. This is because the card is limited to a mere +37mV and it rarely had any benefit to an overclock. Once I attached the Kraken, any amount of additional voltage made no difference whatsoever.
> 
> 
> 
> A long Boost 2.0 story made short: Keep the core very cool and you get higher than advertised boost clocks.
> 
> Alright! Now that all those shenanigans are out of the way, we can get to the benchmark results! Drum roll please!
> *XFX R9 290 BE - Stock Clocks 975/1250 | Omega 14.12 | Overclock 1275/1675 +200mV +50% Power Limit*
> *Zotac GTX 970 - Stock Clocks 1076/1216/1753 (Actual Boost Up to 1278) | ForceWare 347.09 | Overclock 1286/1426/1840 (Actual Boost Up to 1488)*
> *970 Kraken PWR - Stock Clocks 1076/1216/1753 (Actual Boost Up to 1366) | ForceWare 347.09 | Overclock 1226/1366/1815 (Actual Boost Up to 1516)*
> 
> 
> *Cloudgate* (Score / Graphics / Physics)
> 
> | Setting | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
> | --- | --- | --- | --- |
> | Stock | 27,657 / 75,151 / 8,611 | 27,686 / 70,321 / 8,868 | 29,900 / 76,221 / 9,562 |
> | Overclock | 28,882 / 85,881 / 8,692 | 29,828 / 88,574 / 8,981 | 30,520 / 82,335 / 9,530 |
> 
> *Skydiver* (Score / Graphics / Physics)
> 
> | Setting | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
> | --- | --- | --- | --- |
> | Stock | 26,506 / 34,288 / 12,495 | 27,533 / 36,349 / 12,592 | 27,403 / 35,679 / 12,875 |
> | Overclock | 28,141 / 40,349 / 11,717 | 31,479 / 46,589 / 12,578 | 28,452 / 38,493 / 12,709 |
> 
> *3DMark11* (Score / Graphics / Physics)
> 
> | Setting | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
> | --- | --- | --- | --- |
> | Stock | 13,830 / 14,843 / 11,479 | 13,898 / 14,865 / 11,975 | 14,201 / 15,500 / 11,303 |
> | Overclock | 14,605 / 16,393 / 11,024 | 16,568 / 19,315 / 12,162 | 14,927 / 16,585 / 11,534 |
> 
> *Firestrike* (Score / Graphics / Physics)
> 
> | Setting | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
> | --- | --- | --- | --- |
> | Stock | 10,121 / 11,630 / 13,307 | 9,529 / 10,902 / 12,386 | 10,309 / 11,944 / 13,223 |
> | Overclock | 11,031 / 12,891 / 13,314 | 11,799 / 14,057 / 12,363 | 11,031 / 12,891 / 13,314 |
> 
> *FFXIV* (Score)
> 
> | Setting | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
> | --- | --- | --- | --- |
> | Stock | n/a | 13,215 | 15,433 |
> | Overclock | n/a | 16,502 | 16,616 |
> 
> *Heaven* (Score / FPS / Min-Max)
> 
> | Setting | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
> | --- | --- | --- | --- |
> | Stock | 1,725 / 68.5 / 32.1-150.9 | 1,668 / 66.2 / 25.8-139.2 | 1,832 / 72.7 / 29.6-158.4 |
> | Overclock | 1,943 / 77.1 / 29.6-170.0 | 2,114 / 83.9 / 33.3-177.5 | 1,968 / 78.1 / 30.3-170.7 |
> 
> *Valley* (Score / FPS / Min-Max)
> 
> | Setting | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
> | --- | --- | --- | --- |
> | Stock | 2,363 / 56.5 / 28.9-103.2 | 2,442 / 58.4 / 28.6-107.8 | 2,376 / 56.8 / 29.9-109.2 |
> | Overclock | 2,598 / 62.1 / 26.1-118.8 | 3,077 / 73.5 / 34.6-135.3 | 2,532 / 60.5 / 32.9-114.6 |
> 
> *Bioshock* (Average / Min / Max FPS)
> 
> | Setting | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
> | --- | --- | --- | --- |
> | Stock | 179.36 / 23.17 / 682.24 | 172.16 / 20.06 / 386.26 | 185.22 / 23.27 / 505.41 |
> | Overclock | 190.53 / 19.25 / 854.72 | 202.15 / 1.46 / 463.84 | 197.71 / 22.28 / 605.82 |
> 
> *Tomb Raider* (Average / Min / Max FPS)
> 
> | Setting | Zotac GTX 970 Stock BIOS | XFX R9 290 Black Edition | GTX 970 Kraken PWR |
> | --- | --- | --- | --- |
> | Stock | 86.0 / 66.0 / 106.0 | 80.8 / 62.0 / 106.0 | 97.2 / 74.0 / 120.6 |
> | Overclock | 96.1 / 73.8 / 120.6 | 103.1 / 77.7 / 134.2 | 144.0 / 80.0 / 105.6 |
> 
> All of the R9 290 results used were from my Omega Drivers Comparison from a few weeks back...http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/33460_20#post_23300417
> 
> All of the settings for the benches are the same that I used for the Omega Drivers Comparison, which are typically default/maximum settings.
> 
> I have 4 screenshots for every bench, but I'm not going to post them all since that is ridiculous. I will toss a few of them up so you can see what I was talking about with boost clocks changing and things of that nature.
> 
> 
> 
> Spoiler: Valley Stock Everything - Boosting 1278
> 
> Spoiler: Skydiver w/ Kraken Cooling - Stock Clocks - Boosting 1366
> 
> Spoiler: Firestrike Stock Cooler Overclocked - Boosting 1472
> 
> Spoiler: Firestrike Kraken 25% Power - Boosting 1516
> 
> Spoiler: 3DMark11 Kraken 25% Power - Boosting 1491
> 
> Spoiler: Tomb Raider Stock Everything - Boosting 1265
> 
> Spoiler: Tomb Raider Kraken 25% Power - Boosting 1516
> 
> Spoiler: Heaven Kraken 25% Power - Boosting 1501
> 
> *970 Stock Settings*
> Cloudgate: http://www.3dmark.com/3dm/5636125?
> Skydiver: http://www.3dmark.com/3dm/5636059?
> 3DMark11: http://www.3dmark.com/3dm11/9328408
> FireStrike: http://www.3dmark.com/3dm/5635965?
> 
> *970 Overclock*
> Cloudgate: http://www.3dmark.com/3dm/5352334?
> Skydiver: http://www.3dmark.com/3dm/5352431?
> 3DMark11: http://www.3dmark.com/3dm/9225137
> FireStrike: http://www.3dmark.com/3dm/5352499
> 
> *970 PWR BIOS Stock Settings*
> Cloudgate: http://www.3dmark.com/3dm/5636341?
> Skydiver: http://www.3dmark.com/3dm/5636307?
> 3DMark11: http://www.3dmark.com/3dm11/9328484
> FireStrike: http://www.3dmark.com/3dm/5636253?
> 
> *970 PWR BIOS Overclock*
> Cloudgate: http://www.3dmark.com/3dm/5636568?
> Skydiver: http://www.3dmark.com/3dm/5636530?
> 3DMark11: http://www.3dmark.com/3dm11/9328563
> FireStrike: http://www.3dmark.com/3dm/5636467?
> 
> *Alright, that is a ton of information to swallow at once so let's try and break it down, shall we?*
> 
> We have 9 benchmarks, with stock/overclocked settings which gives us a total of 18 possibilities for win/lose...and we have a tie *9/9*.
> 
> If we remove Cloudgate/Skydiver from the equation we drop down to a total of 14...and *970* takes it with 8/14 wins.
> 
> If we look at just stock clocks for both cards where the 970 doesn't have the Kraken/25% Power...the *290* takes it with 10/16 wins. (no FFXIV)
> 
> If we look at stock clocks where the 970 has the Kraken/25% Power...the *970* takes it with 7/9 wins.
> 
> *But we still have other factors to consider...Price, Power Consumption, Operating Temperatures, Overclockability, etc.*
> 
> *Price* *290:* It looks like NewEgg is running low on stock of R9 290's as the cheapest one currently is an HIS for $269 + a $20 MIR. If you go used, then the 290 definitely wins here as you can easily find them in the $175-$200 range.
> 
> *Price 970:* This Zotac that I have is still the cheapest available at $329 + $10 gift card. Prices go all the way up to $399 where you are guaranteeing yourself a card that has an advertised stock boost clock in the mid 1300's...which will likely go even further. The more expensive cards may even have an 8+6 PCI-E power instead of 6+6 to ensure you get enough juice for high clocks.
> 
> *Power Consumption 290:* In reference configuration these cards are downright awful. They get too hot, which destroys any chance they had at efficiency. Put a better cooler on them, or some form of water cooling, and the power consumption can come down quite a bit. Keep it at stock clocks, or with a minor OC, and it's not so bad. I know from firsthand experience that a 290 @ 1075/1375 +37mV has no problems whatsoever running on a 450W gold PSU with a mildly OC'd 4770k; if you're running an AMD FX CPU then you know you will need a big(ger) PSU.
> 
> *Power Consumption 970:* We all know the 970 wins this battle without question. But, it is the new kid on the block and this was one of Nvidia's primary concerns; KUDOS to them for such an efficient card. If you are thinking about upgrading from a smaller GPU and you have a smaller/older power supply, then the 970 is likely the better choice.
> 
> *Operating Temperatures 290:* Same story as the power consumption unfortunately. They can be tamed, but it takes time/money/effort especially if you want to run high overclocks all the time.
> 
> *Operating Temperatures 970:* In stock configuration the 970 also takes the cake. The cooler on this card is downright tiny, but it was still able to handle a decent overclock simply by forcing the fans up to 60% where it was still pretty close to inaudible. Increase the power limit on the card to unlock maximum potential and the VRMs are just as much of an issue here as they are with the 290(X). _*However*_, this is likely due to the really crappy heatsink on this particular card. Some of the manufacturers are rolling out new revisions of the 970 already where they've added 'front plates' to the card for VRM/RAM cooling, and consequently they have increased core/boost speeds.
> 
> *Overclockability R9 290:* I have an above average 290 that holds stable in anything at 1275/1675. Not only is the core clock marvelous, the RAM speed is extremely uncommon in the 290(X) world. Odds are that if you have an average card, you would slip under the performance of the 970 in most, if not all, scenarios.
> 
> *Overclockability GTX 970:* With this little Zotac boosting, and holding, very near/above 1.5GHz, its performance downright shocked me. I have looked at numerous reviews of numerous 970's and this is par for the course; nearly all of them land in the 1450-1525 range. Where my 970 falls short is the RAM speed only going ~100MHz over stock. Odds are pretty good you could end up with better clocking RAM.
> 
> *ETC. R9 290:* They obviously carry an advantage for Folding @ Home. No Xfire bridges. You can run Quad-Fire.
> 
> _*ETC. GTX 970:*_ An unusually large number of 970 cards suffer from annoying coil whine; Mine does, but it is tolerable even with my obsession for absolute silence. There are a few 970's that come in the small size like this Zotac, which is a plus for small cases. ASUS even has a mini version that is 6.7" long; This matches dimensions of a mini-ITX board! You also have CUDA and PhysX if you're into those things. Limited to Tri-SLI for 970.
> 
> _*Conclusion...*_
> 
> I have had my hands on numerous *290s*, and they are wonderful GPUs. The release of them stuck it to Nvidia when their pricing was really getting out of hand. The *290(X)* showed everyone that AMD can still pack a punch at a very reasonable price. But...they are 15 months old and that is an eternity in the computer world.
> 
> The *970* has everything going for it, except for the higher price tag. Then again, if you take _everything_ into consideration...the savings in your electric bill, the probability of needing to upgrade cooling in one form or another, or even a new PSU...then the *290* doesn't have much to stand on unless you grab bargain used *290s*.
> 
> I am a realist. The facts are here and it is pretty plain that Nvidia has the upper hand _*currently*_...considering I'm comparing Nvidia's newest to AMD's 15 month old hardware.
> 
> Does this mean I am converted? Not a chance!!! Am I impressed with the performance of this little *970*? Absolutely.
> 
> _*What will happen when the R9 300 series cards come out is a whole 'nother story! *_
> 
> El Fin





If only everybody else did it like you. Best Review/Comparison 2015


----------



## Agent Smith1984

Agreed!!!

And as much as it shows how well the 970 is put together, it also reminded me that I don't need a new GPU yet.
So it really serves both purposes.

Thanks for the objective review!


----------



## rdr09

Nice cars, guys. I don't own any BMW. That was just a joke and I don't mean to derail the thread.

But this review of Rob's is really impressive. +rep.

The only thing I would like to add is that the OC on the 970 is a bit below average, while the one for the 290 is above. Keep it that way.


----------



## Roboyto

Quote:
Originally Posted by *Bertovzki* 

@Roboyto thanks for your time and effort in all your work to do the tests and present it for the club and OCN , nothing better than facts and comparisons to look at hardware









Quote:
Originally Posted by *sinnedone* 

Awesome post Roboyto, read through all of it.

Thank you for all the time and effort you put into this.









Quote:
Originally Posted by *hyp36rmax* 

I can only imagine the adventure that would have been haha! I love how detailed you were with your comparison. Major kudos!

Quote:
Originally Posted by *X-Alt* 
If only everybody else did it like you. Best Review\Comparison 2015

Quote:
Originally Posted by *Agent Smith1984* 

And as much as it shows how well the 970 is put together, it also reminded me that I don't need a new GPU yet.
So it really serves both purposes.

Thanks for the objective review!

Quote:
Originally Posted by *Arizonian* 



> Awesome - this deserves it's own thread in *Graphic Cards / Graphic Cards General*


Thanks everyone for the kind words; it was a blast making the comparison. Pretty sure the 970 is sold to a co-worker, so something else is going to have to take its place









Quote:


> Originally Posted by *rdr09*
> 
> But this review of Rob is really impressive. +rep.
> 
> the only thing i would like to add is the oc on the 970 is a bit below average while the one for the 290 is above. keep it that way.


While the increase to the core clock in Afterburner is below average, the boost speeds it achieves are fairly common. It doesn't have issues holding high-1400 to low-1500 core speeds in any of the benches I ran. The card would probably go even further if it were allowed more power than the 25%. What's interesting is that I edited the BIOS a 2nd time to go to 50% power, and AB or Firestorm wouldn't allow me to go any higher than 25%...? I'm guessing it could have hit the hardware limit of 225W from the PCI-E slot and 6+6-pin..?
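That 225W guess lines up with the PCI-E spec arithmetic: 75W from the slot plus 75W per 6-pin connector. A quick sketch of the math (the 145W reference TDP is my assumption, not a figure stated in this thread):

```python
# Rough board-power sanity check for a 6+6-pin GTX 970.
# PCI-E spec ratings: 75W from the slot, 75W per 6-pin, 150W per 8-pin.
SLOT_W = 75
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}

def board_power_ceiling(connectors):
    """Max power the card can draw within spec, in watts."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

tdp_w = 145  # assumed reference GTX 970 TDP, for illustration only
for limit_pct in (0, 6, 25, 50):
    target = tdp_w * (1 + limit_pct / 100)
    ceiling = board_power_ceiling(["6-pin", "6-pin"])
    print(f"+{limit_pct}% power limit -> {target:.0f}W target, "
          f"{ceiling}W spec ceiling")
```

Under these assumptions a +50% limit (~218W) already brushes the 225W ceiling for a 6+6-pin layout, so a tool refusing to go past +25% wouldn't be surprising.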

The RAM speed however, is definitely lackluster. It's not uncommon for Samsung chips to be in the high 7, if not 8GHz range.

The best review for a 970 I saw was on OverClockersClub for the MSI Gaming, and it reached 1609 Core(max boost) and 8208 on the RAM! I'm guessing it was a cherry picked card for the early review; 11/17/2014. This card scored 11,729 in firestrike with a 4770k @ 4.4GHz with a graphics score of 13,891. My overall score of 11,031 and graphics score of 12,891 is probably on par for the boost speed I got at 1516. I'm willing to bet that the RAM speed on these cards can have a decent effect on scores since their overall bandwidth is quite low, 234 GB/s ~vs~ 428 GB/s, compared to 290(X). http://www.overclockersclub.com/reviews/msi_gtx_970_gaming_4g/4.htm
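Those bandwidth figures fall straight out of effective memory clock times bus width; a quick check (standard GDDR5 arithmetic, nothing new assumed beyond the clocks quoted above):

```python
def mem_bandwidth_gbs(effective_mhz, bus_width_bits):
    """Peak memory bandwidth in GB/s: transfers/s times bytes per transfer."""
    return effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

# GTX 970: 256-bit bus, ~7.3GHz effective after the ~100MHz overclock
print(round(mem_bandwidth_gbs(7300, 256), 1))  # 233.6
# R9 290: 512-bit bus, 1675MHz GDDR5 = 6.7GHz effective (quad data rate)
print(round(mem_bandwidth_gbs(6700, 512), 1))  # 428.8
```

The 512-bit bus is what lets Hawaii nearly double the 970's bandwidth despite the lower memory clock.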

All in all, for a completely reference design 970 it holds its own...especially for the $264 I spent to acquire it.









I'm still all for the Red Team... As soon as the R9 300s are out you can expect to see me all over that thread like the red in *Radeon*.


----------



## Kaltenbrunner

Yeehaaa, my Gigabyte R9 290 OC has shipped. It's clocked at 1040/1250; what kind of OC might I expect from it? Hope I can finally play FC3, Hitman, Crysis 3 and max all settings, maybe turn down MSAA a bit, but get 50-60fps at 1080p, and if I can get 50fps @1600p in big games I won't have to CF. These really are beasts, right?

Also I have an XFX 750W Pro Black Edition which is Seasonic-built. Should that be able to handle CF of such cards?


----------



## Raephen

+rep, @Roboyto.

Compared to all the pro reviewers' reviews and the comparisons they made between the 970 and Hawaii GPUs, yours hits so much closer to home in real-life performance deltas, I feel.

Good job!


----------



## Roboyto

Quote:


> Originally Posted by *Kaltenbrunner*
> 
> Yeehaaa, my Gigabyte R9 290 OC has shipped. It's clocked at 1040/1250; what kind of OC might I expect from it? Hope I can finally play FC3, Hitman, Crysis 3 and max all settings, maybe turn down MSAA a bit, but get 50-60fps at 1080p, and if I can get 50fps @1600p in big games I won't have to CF. These really are beasts, right?
> 
> Also I have an XFX 750W Pro Black Edition which is Seasonic-built. Should that be able to handle CF of such cards?


Nice choice for a card. The Gigabyte usually have solid temps for core and VRMs. For general information see here: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781

You should be able to play most new games with maximum settings 1080P with very good frame rates. You may have to dial AA down a little bit, but that is likely about it. 1600P will be a little different story, but you should still have an enjoyable experience with all high settings since I can run 5760*1080 with high settings and still see 40-50 fps with a single, highly clocked, 290.

These cards do pack a punch. Your overclock is anyone's guess as it is a luck game really. You will have to keep an eye on your VRM temperatures when you start pushing the envelope as they will likely limit you before the core temperature. 80-90C is where you want to be for VRM1. The card should function normally if the GPU core is under 94C. Once it hits that temperature it will start throttling clock speed and voltage to bring the temperature down.

750W, I believe, is a stretch for a pair of these cards...you may be able to pull it off with stock clocks on both cards though. Someone else may chime in who has experience with Xfire.
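To see why 750W is a stretch for a pair of these cards, a back-of-envelope budget helps. Every wattage below is an assumption on my part (typical figures for overclocked Hawaii cards and a mid-range platform), not a measurement from this thread:

```python
# Back-of-envelope PSU budget for an R9 290 CrossFire rig.
# All wattages are rough assumptions for illustration, not measurements.
load_w = {
    "R9 290 #1 (overclocked)": 300,
    "R9 290 #2 (overclocked)": 300,
    "CPU (overclocked)": 150,
    "Board/RAM/drives/fans": 75,
}
psu_rating_w = 750
total_w = sum(load_w.values())
# Rule of thumb: keep sustained draw under ~80% of the rating for headroom.
comfortable_w = 0.8 * psu_rating_w
print(f"Estimated load {total_w}W vs {psu_rating_w}W PSU "
      f"(comfortable up to ~{comfortable_w:.0f}W)")
```

Overclocked, the estimate already exceeds the rating; drop both cards toward stock (~250W each under these assumptions) and it only just squeezes under, which matches the "stock clocks only" caveat.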

Quote:


> Originally Posted by *Raephen*
> 
> +rep, @Roboyto.
> 
> Compared to all the pro reviewers' reviews and the comparisons they made between the 970 and Hawaii GPUs, yours hits so much closer to home in real-life performance deltas, I feel.
> 
> Good job!


Thank you very much!


----------



## Spectre-

@arizonian can you remove me from the list

got a 970 since my 290x pooped on me


----------



## pshootr

Quote:


> Originally Posted by *Spectre-*
> 
> @arizonian can you remove me from the list
> 
> got a 970 since my 290x pooped on me


Any idea as to why your card popped?


----------



## Spectre-

Quote:


> Originally Posted by *pshootr*
> 
> Any idea as to why your card popped?


My guess is VRAM.

Everything turned on but I was getting no display.

I returned it to the store I bought it from, but they couldn't refund it, so they are gonna RMA it for me, and when I get it back I'll see which card I wanna keep.


----------



## pshootr

Quote:


> Originally Posted by *Spectre-*
> 
> My guess is VRAM.
> 
> Everything turned on but I was getting no display.
> 
> I returned it to the store I bought it from, but they couldn't refund it, so they are gonna RMA it for me, and when I get it back I'll see which card I wanna keep.


Ah I see. Was the price of the card much higher than online vendors?


----------



## Spectre-

Quote:


> Originally Posted by *pshootr*
> 
> Ah I see. Was the price of the card much higher than online vendors?


Nope, it was off by like $20,

so I don't really mind.


----------



## Kaltenbrunner

When I ordered my R9 290 on Sunday it said 4-6 days for shipping; now, according to this morning's email, it's 2-day shipping, so if I'm lucky I'll have it tomorrow, if not Thursday for sure


----------



## B-linq

Hello everyone,
So I decided to buy a Gigabyte R9 290X (second-hand) and now I'm looking for a better and quieter cooler. It will probably be the Accelero Xtreme IV, but I read somewhere that it's not as great as the III version because the heatsink doesn't cool well. Any experience? I read some, but I will be glad for any fresh ideas.

And the reason why I bought this one and not a new or different one is just that I live in an overpriced country (and I don't want to buy today's expensive hi-end because the new generation is near) ->

The cheapest model of a brand new R9 290 - ~$370
The cheapest model of R9 290X - ~$416
The cheapest Nvidia 970 - ~$370
The cheapest Nvidia 980 - ~$625
(I could buy it cheaper from, for example, Amazon, but the cost would be almost the same with shipping, not to mention some difficulties if a wild malfunction appears.)

Another reason: even though I don't mind Nvidia, this generation (960-980) of graphics is not "the new" generation. It's still 28nm with better power consumption, but that's it. I expected much much more. But hey, maybe next year







Well, and the 970 has some memory issues, it seems :/


----------



## JourneymanMike

Quote:


> Originally Posted by *B-linq*
> 
> Hello everyone,
> So I decided to buy a Gigabyte R9 290X (second-hand) and now I'm looking for a better and quieter cooler. It will probably be the Accelero Xtreme IV, but I read somewhere that it's not as great as the III version because the heatsink doesn't cool well. Any experience? I read some, but I will be glad for any fresh ideas.
> 
> And the reason why I bought this one and not a new or different one is just that I live in an overpriced country (and I don't want to buy today's expensive hi-end because the new generation is near) ->
> 
> The cheapest model of a brand new R9 290 - ~$370
> The cheapest model of R9 290X - ~$416
> The cheapest Nvidia 970 - ~$370
> The cheapest Nvidia 980 - ~$625
> (I could buy it cheaper from, for example, Amazon, but the cost would be almost the same with shipping, not to mention some difficulties if a wild malfunction appears.)
> 
> Another reason: even though I don't mind Nvidia, this generation (960-980) of graphics is not "the new" generation. It's still 28nm with better power consumption, but that's it. I expected much much more. But hey, maybe next year
> 
> Well, and the 970 has some memory issues, it seems :/


The USED 290's and 290X's come cheap since coin mining with PCs has gone down...

I bought three used 290X's on eBay; they all function perfectly!

For cooling I bought the Aqua Computer kryographics acetal/nickel blocks and active-cooling backplates... This provides very efficient cooling and great overclocking ability...


----------



## fragamemnon

Quote:


> Originally Posted by *B-linq*
> 
> Hello everyone,
> So I decided to buy Gigabyte R9 290x (second handed) and now I looking for a better and quieter cooler. It probably will be Accelero Extreme IV but I read somewhere that its not that great as III. version because of heatsink doesent cool well. Any experience? I read some but I will be glad for any fresh ideas.
> 
> And reason why i bought this and not a new one or a different one is just because i live in overpriced country (and i dont want to buy present-day expensive hi-end because new generation is near) ->
> 
> The cheapest model of brand new R9 290 - ~370$
> The cheapest model of R9 290x ~ 416$
> The cheapest model of Nvidia 970 ~ 370$
> The cheapest model of Nvidia 980 ~ 625$
> (I could buy it cheaper from for example amazon but the cost would be almos same with shipping, not to mention some difficulties if a wild malfunction appear.)
> 
> Another reason is even though I dont mind Nvidia - This generation (960-980) of graphics is not "the new" generation Its still 28nm with better power coms. but thats it. I expected much much more. But hey, maybe next year
> 
> 
> 
> 
> 
> 
> 
> Well and 970 has some memory issues as it seems :/


Are you looking into air or water cooling solutions? Are you up for modding the card? What are your case and airflow like?

You might be better off applying the Red Mod if you're feeling adventurous.


----------



## X-Alt

Quote:


> Originally Posted by *Kaltenbrunner*
> 
> Yeehaaa, my Gigabyte R9 290 OC has shipped. It's clocked at 1040/1250; what kind of OC might I expect from it? Hope I can finally play FC3, Hitman, and Crysis 3 maxed out, maybe turn down MSAA a bit, and get 50-60fps at 1080p. And if I can get 50fps at 1600p in big games, I won't have to CF. These really are beasts, right?
> 
> Also I have an XFX 750W Pro Black Edition, which is Seasonic-built. Should that be able to handle CF of such cards?


A good 750W can do it in a pinch, but it might be a good idea to get some headroom. Gigabyte cards have been voltage locked for a while, but that particular edition might be unlocked. The 290 does pretty well. I hover around 60-95 FPS on BF4 Rogue Transmission (Ultra, 120 FOV, single [email protected]\1400), so all should be good in terms of frames.


----------



## Klocek001

Quote:


> Originally Posted by *Lao Tzu*
> 
> The posts saying that the GTX 900 series runs a lot cooler than the 290X are wrong; mine, with an air cooler OC'd to 1100/1400, has a max temp of 68°C while the 970 hits 62°C (6°C less)


68 degrees at what fan setting?
edit: you've got a 200mm side intake fan; that's why your card runs cooler.


----------



## YellowBlackGod

My Sapphire Vapor-X 290X (OC) 8192MB hits a maximum of 70°C (stock speed 1060/1400), at stock fan speed too. It depends on the game as well; in my case, FC4 at ultra settings (AA included), 1080p.


----------



## Klocek001

My 290 Tri-X goes up to 72-73 degrees on auto, but I rarely keep it at that setting. I usually set the fan between 60 and 70%, which keeps the GPU below 60. I tried 100%; the result was 53 degrees, but the noise... 50% is probably the best fan setting for these cards: fairly quiet, with temps in the mid 60s. I also used to have an Alpenföhn fan bracket, which helped a little, but I've got to get better fans. The ones I used were just two 80mm SilentiumPC default fans; I'd need something like two 92mm Noctuas to feel the difference.


----------



## YellowBlackGod

Honestly, I am in love with this card. It replaced a reference 780 Ti, and I am so glad I made the move.


----------



## melodystyle2003

@Roboyto nice review mate








I would like to add that the GTX 970 clocks you tested can be used 24/7 on an average 970, while the R9 290 clocks are quite high and, from my personal perspective, difficult to achieve 24/7, at least on air (I had two R9 290s and the max I could push them was 1220/1600MHz, for benching only, with some artifacts).
In addition, a 1600+MHz GTX 970 does not gain much over one at 1500MHz, since its performance does not scale proportionally with clock speed, unlike the R9 290's.


----------



## B-linq

JourneymanMike
Nicely done. I'm playing with the idea of water cooling, but I've decided not to do it just yet. I will, just not yet.
I got mine from a miner too

fragamemnon
I'm looking into an air solution. If by modding you mean changing the fan, then yes. My case is a Define XL R2; airflow - solid?


----------



## Roboyto

Quote:



> Originally Posted by *melodystyle2003*
> 
> @Roboyto nice review mate
> 
> 
> 
> 
> 
> 
> 
> 
> I would like to add that the GTX 970 clocks you tested can be used 24/7 on an average 970, while the R9 290 clocks are quite high and, from my personal perspective, difficult to achieve 24/7, at least on air (I had two R9 290s and the max I could push them was 1220/1600MHz, for benching only, with some artifacts).
> In addition, a 1600+MHz GTX 970 does not gain much over one at 1500MHz, since its performance does not scale proportionally with clock speed, unlike the R9 290's.


Thank you.

I did mention these points in the review. My 290 is an exceptional performer in core and RAM speed, so odds are an average 290 would end up losing to an average 970 most of the time. It may not be by a large margin, but the 290 would still be trailing.

I still love my 290 in my main rig. It drives 5760*1080 wonderfully considering it is a single GPU. It hasn't given me any problems, and takes whatever abuse I throw at it.

I am looking forward to, and preparing for, the release of the 300 series. Just sold my 2nd 290 with Kraken/Gelid VRM cooling about a half hour ago...that $250 will hopefully cover 1/3-1/2 of the cost of the new Radeon Flagship









As soon as there is some solid info on a release date for the 300 series, my good 290 will be up for sale with the XSPC waterblock/backplate and all the original packaging/heatsink/accessories/etc.


----------



## Roboyto

I just gave the GTX 970 much praise, but Nvidia has apparently done some shady dealings regarding the ROPs, L2 cache, and fully usable VRAM... If you plan on using more than 3.5GB of VRAM, this is definitely worth knowing.

Since I didn't bench above 1080p, I didn't experience this.

*taken from here: http://www.gamespot.com/articles/nvidia-admits-to-error-in-gtx-970-specs-and-memory/1100-6424915/*

"GPU maker Nvidia has responded to claims that its critically praised GTX 970 uses only 3.5GB of its 4GB of VRAM, resulting in performance drops or stuttering in games that pass over that threshold (usually those running at 1440p and 4K resolutions). In doing so, the company has revealed that the published specifications for the GTX 970 were partially incorrect, with the GPU actually sporting fewer ROPs and less L2 cache than consumers and reviewers were initially led to believe."

Nvidia's Senior VP of GPU Engineering Jonah Alben spoke to PC Perspective about the issue, with the publication noting that "despite initial reviews and information from NVIDIA, the GTX 970 actually has fewer ROPs and less L2 cache than the GTX 980. *NVIDIA says this was an error in the reviewer's guide and a misunderstanding* between the engineering team and the technical PR team on how the architecture itself functioned. That means the GTX 970 has 56 ROPs and 1792 KB of L2 cache compared to 64 ROPs and 2048 KB of L2 cache for the GTX 980."

"While not a performance bottleneck in and of itself (the GPU's 13 SM units, running at four pixels per clock for a total of 52, limit the GPU more than the 56 pixels per clock processed by the ROPs), the reduced number of ROPs directly affects the GPU's memory subsystem. Unlike the pricier GTX 980, the GTX 970's 4GB of memory is divided into two pools in order to accommodate the reduced ROPs: one 3.5GB section, and one 0.5GB section.

The result, it transpires, is that the 0.5GB section is slower than the larger 3.5GB section. However, it is still faster than system memory accessed over the PCI Express bus."

"While this issue does not affect most games, those that feature large textures, or those that run at 1440p and higher resolutions, may see a drop in performance. In a statement, Nvidia claimed this would be around a three percent drop. It's worth noting that Nvidia's benchmarks only looked at average FPS performance, which may not account for the frame stuttering that some users claim to be experiencing.

A Change.org petition calling for refunds due to the incorrect specifications has since been launched. Those interested in a more technical breakdown of the GTX 970's memory subsystem and performance troubles should check out the full PC Perspective story."
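The pixel-rate arithmetic the article cites is easy to check. Here is a quick sketch (all figures are taken from the quoted text; this is plain arithmetic, not a benchmark):

```python
# Sanity check of the GTX 970 figures from the quoted article.
# All numbers come from the article text; this is arithmetic, not a measurement.

SM_UNITS = 13        # enabled SM units on the GTX 970 (per the article)
PIXELS_PER_SM = 4    # pixels per clock per SM (per the article)
ROPS = 56            # corrected ROP count (the GTX 980 has 64)

sm_pixel_rate = SM_UNITS * PIXELS_PER_SM  # 52 pixels/clock from the SMs
# The SMs, not the ROPs, are the tighter limit: 52 < 56
assert sm_pixel_rate < ROPS

# VRAM split: 4 GB total, a fast 3.5 GB pool and a slower 0.5 GB pool
fast_pool_gb, slow_pool_gb = 3.5, 0.5
total_vram_gb = fast_pool_gb + slow_pool_gb
print(sm_pixel_rate, total_vram_gb)  # 52 4.0
```

Which matches the article's point: the ROP cut doesn't reduce pixel throughput directly, but it forces the partitioned memory layout.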


----------



## Kaltenbrunner

Quote:


> Originally Posted by *X-Alt*
> 
> A good 750W can do it in a pinch, but it might be a good idea to get some headroom. Gigabyte cards have been voltage locked for a while, but that particular edition might be unlocked. The 290 does pretty well. I hover around 60-95 FPS on BF4 Rogue Transmission (Ultra, 120 FOV, single [email protected]\1400), so all should be good in terms of frames.


One thing is I haven't OC'd my i7 in ages; I'm worried the MB will die of "old age". If I do CF, I certainly won't need to OC them, and I could turn down the power slider, though I'm not sure how much difference that actually makes. My PSU had good reviews.

We'll see what one 290 runs like at 1600p in 3 weeks; can't wait for that. Hope the Canadian dollar is still worth enough in 3 weeks to buy a 30" IPS on eBay


----------



## Spectre-

after having the 970

i miss my 290x

nvidia is all good but i miss my space heater


----------



## X-Alt

Quote:


> Originally Posted by *Kaltenbrunner*
> 
> One thing is I haven't OC'd my i7 in ages; I'm worried the MB will die of "old age". If I do CF, I certainly won't need to OC them, and I could turn down the power slider, though I'm not sure how much difference that actually makes. My PSU had good reviews.
> 
> We'll see what one 290 runs like at 1600p in 3 weeks; can't wait for that. Hope the Canadian dollar is still worth enough in 3 weeks to buy a 30" IPS on eBay


I'm pretty sure Z68s will be ok for your setup, but one 290 should do ok at 2560x1600. Best of luck!


----------



## Roboyto

Quote:


> Originally Posted by *Kaltenbrunner*
> 
> One thing is I haven't OC'd my i7 in ages; I'm worried the MB will die of "old age". If I do CF, I certainly won't need to OC them, and I could turn down the power slider, though I'm not sure how much difference that actually makes. My PSU had good reviews.
> 
> We'll see what one 290 runs like at 1600p in 3 weeks; can't wait for that. Hope the Canadian dollar is still worth enough in 3 weeks to buy a 30" IPS on eBay


If I can run good frames for 5760*1080 Eyefinity, then your 1600P should be just fine









Quote:


> Originally Posted by *Spectre-*
> 
> after having the 970
> 
> i miss my 290x
> 
> nvidia is all good but i miss my space heater


With the price of the cards you can probably sell the 970, pick up a 290(X), and just about break even... There's at least one, if not two, 970 Wanted ads in the marketplace right now.


----------



## Spectre-

Quote:


> Originally Posted by *Roboyto*
> 
> If I can run good frames for 5760*1080 Eyefinity, then your 1600P should be just fine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> With the price of the cards you can probably sell the 970 and pickup a 290(X) and just about break even...There's at least one, if not two, 970 Wanted ads in the marketplace right now.


oh lol

my 290x pooped on me

its getting rma'd

ill probs buy a 780/ti for benching fun


----------



## Roboyto

Quote:


> Originally Posted by *Spectre-*
> 
> oh lol
> 
> my 290x pooped on me
> 
> its getting rma'd
> 
> ill probs buy a 780/ti for benching fun


Well then...that's not so bad...just sell the 970 when you get your Radeon back! What manufacturer? Hopefully your RMA is going well?

Sold that little Zotac I had and just picked up an MSI TwinFrozr V 970 from Micro Center. I'm hoping I don't have to resort to Kraken cooling on this one. I picked it up specifically because the whole heatsink and pipes are nickel-plated. Going to put some CLU on it and see what kind of temperatures I can achieve on air


----------



## Spectre-

Quote:


> Originally Posted by *Roboyto*
> 
> Well then...that's not so bad...just sell the 970 when you get your Radeon back! What manufacturer? Hopefully your RMA is going well?
> 
> Sold that little Zotac I had and just picked up a MSI TwinFrozr V 970 from MicroCenter. I'm hoping I don't have to resort to Kraken cooling on this one. Picked it up specifically because the whole heatsink/pipes are nickel plated. Going to put some CLU on it and see what kind of temperatures I can achieve on air


i had the gigabyte reference

man the thing was a beast

did 1240/1650 with liquid cooling on stock bios while benching

i even ran PT1


----------



## Roboyto

Quote:


> Originally Posted by *Spectre-*
> 
> i had the gigabyte reference
> 
> man the thing was a beast
> 
> did 1240/1650 with liquid cooling on stock bios while benching
> 
> i even ran PT1


Sounds like my XFX Black Edition reference. Mine gets a little better than that with XSPC block, but it's still running strong. I was anticipating issues because anything I play at 5760*1080 pushes the GPU to 100% pretty much the entire time...it just keeps going...8 hour gaming session? No problem!

I think you will be good with Gigabyte RMA. I just had an experience with them a few weeks back.

I do hardware reviews for NewEgg's EggXpert Review program. A Z87X board I received had the PCI-E slots die after only a few months and they handled my RMA lickety split. I was very pleased with their RMA process and communication through it all. I hope it goes smoothly for you.


----------



## Spectre-

Quote:


> Originally Posted by *Roboyto*
> 
> Sounds like my XFX Black Edition reference. Mine gets a little better than that with XSPC block, but it's still running strong. I was anticipating issues because anything I play at 5760*1080 pushes the GPU to 100% pretty much the entire time...it just keeps going...8 hour gaming session? No problem!
> 
> I think you will be good with Gigabyte RMA. I just had an experience with them a few weeks back.
> 
> I do hardware reviews for NewEgg's EggXpert Review program. A Z87X board I received had the PCI-E slots die after only a few months and they handled my RMA lickety split. I was very pleased with their RMA process and communication through it all. I hope it goes smoothly for you.


XFX DD's are ref. cards

someone on this thread posted that


----------



## Roboyto

Quote:


> Originally Posted by *Spectre-*
> 
> XFX DD's are ref. cards
> 
> someone on this thread posted that


No no, not a Double Dissipation. They had a standard reference card and a Black Edition reference card. It has a 3.5% factory overclock on the core to 980MHz, as opposed to 947, and I believe Hynix memory was standard on them, as opposed to the possibility of Elpida.

http://xfxforce.com/en-us/products/amd-radeon-r9-series/amd-radeon-r9-290-black-edition-r9-290a-enbc


----------



## Spectre-

Quote:


> Originally Posted by *Roboyto*
> 
> No no, not a Double Dissipation. They had a standard reference card and a Black Edition reference card. It has a 3.5% factory overclock on the core to 980MHz, as opposed to 947, and I believe Hynix memory was standard on them, as opposed to the possibility of Elpida.


oh

then it should be good

have you tried checking if it can unlock ?


----------



## NFleck

Finally got my G10 from NCIX.ca today. I'm only missing my high-W/mK thermal pad and VRAM heatsinks, but I decided to take my Antec Kühler 620 off my GTX 660 and test it out on my new XFX (core) R9 290X.
I'm using an Arctic F9 high-static-pressure fan (still pushing high airflow compared to other top 90mm fans, at 43cfm) in a push config, with a 200mm fan pulling air in from the side. The AIO is mounted at the rear of my HAF XM in a push/pull config with good static pressure fans (specs on request). Pics/RigBuilder of my PC (pre-mod) are in my sig.

Here are my preliminary test results to see if I got the thermal paste and torque right -- which it seems I did on the first try. I am using Heaven and Valley to benchmark temps (FPS/scores thrown in for "completeness") in conjunction with HWiNFO64 v4.48.
Ambient temps are somewhat cool in a basement during winter in Canada, but the inside of the case seems to be ~20°C.

*Stock Cooler - Stock Clocks*

*Heaven Extreme/Ultra @ 1920x1080* (15mins plus benchmark length)

GPU: 54-84C
VRM1: 39-63C
VRM2: 26-52C
FPS: 53.5 (24.9-109)
Score: 1347

*Valley ExtremeHD* (run directly after heaven for 15min plus benchmark length)

GPU: 54-88C
VRM1: 39-64C
VRM2: 26-55C
FPS: 58.2 (24.9-109.8)
Score: 2433

*Kraken G10 & Antec Kuhler 620 - MX2 Thermal Paste ++ Gelid Enhancement Kit -- No Vram Heatsinks @ Stock Clocks & Voltages*

*Heaven Extreme/Ultra @ 1920x1080* (15 mins plus benchmark length)

GPU: 30-58C (Hovers steady at 58C)
VRM1: 33-49C
VRM2: 21-49C
FPS: 54 (25-110)
Score: 1359

*Valley ExtremeHD* (run directly after heaven for 15min plus benchmark length)

GPU: 30-60C
VRM1: 33-49C
VRM2: 21-52C
FPS: 59.2 (24-115.7)
Score: 2479

Some interesting things to note: both VRMs now peak at the same temp at the same time, and then VRM2 actually goes slightly higher -- which makes sense, since it has no fan directly on it. Just goes to show how well the Gelid VRM bar works in conjunction with the G10.
The idle temps are slightly better, but the load temps are an amazing improvement.

I will add some pictures later... just really excited about my initial results and wanted to share. Definitely worth the investment for temps like this, IMO.


----------



## Roboyto

Quote:


> Originally Posted by *Spectre-*
> 
> oh
> 
> then it should be good
> 
> have you tried checking if it can unlock ?


Yeah, I ordered a pair of them way back on 1/2/14; neither one unlocked. One was a dud overclocker that only managed 10% on core/memory; I sold it to a co-worker. The other is the one I have been rambling about, and it still







on a daily basis.

Quote:


> Originally Posted by *NFleck*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Finally got my G10 from NCIX.ca today. I'm only missing my high-W/mK thermal pad and VRAM heatsinks, but I decided to take my Antec Kühler 620 off my GTX 660 and test it out on my new XFX (core) R9 290X.
> I'm using an Arctic F9 high-static-pressure fan (still pushing high airflow compared to other top 90mm fans, at 43cfm) in a push config, with a 200mm fan pulling air in from the side. The AIO is mounted at the rear of my HAF XM in a push/pull config with good static pressure fans (specs on request). Pics/RigBuilder of my PC (pre-mod) are in my sig.
> 
> Here are my preliminary test results to see if I got the thermal paste and torque right -- which it seems I did on the first try. I am using Heaven and Valley to benchmark temps (FPS/scores thrown in for "completeness") in conjunction with HWiNFO64 v4.48.
> Ambient temps are somewhat cool in a basement during winter in Canada, but the inside of the case seems to be ~20°C.
> 
> *Stock Cooler - Stock Clocks*
> 
> *Heaven Extreme/Ultra @ 1920x1080* (15mins plus benchmark length)
> 
> GPU: 54-84C
> VRM1: 39-63C
> VRM2: 26-52C
> FPS: 53.5 (24.9-109)
> Score: 1347
> 
> *Valley ExtremeHD* (run directly after heaven for 15min plus benchmark length)
> 
> GPU: 54-88C
> VRM1: 39-64C
> VRM2: 26-55C
> FPS: 58.2 (24.9-109.8)
> Score: 2433
> 
> *Kraken G10 & Antec Kuhler 620 - MX2 Thermal Paste ++ Gelid Enhancement Kit -- No Vram Heatsinks @ Stock Clocks & Voltages*
> 
> *Heaven Extreme/Ultra @ 1920x1080* (15 mins plus benchmark length)
> 
> GPU: 30-58C (Hovers steady at 58C)
> VRM1: 33-49C
> VRM2: 21-49C
> FPS: 54 (25-110)
> Score: 1359
> 
> *Valley ExtremeHD* (run directly after heaven for 15min plus benchmark length)
> 
> GPU: 30-60C
> VRM1: 33-49C
> VRM2: 21-52C
> FPS: 59.2 (24-115.7)
> Score: 2479
> 
> 
> 
> Some interesting things to note: both VRMs now peak at the same temp at the same time, and then VRM2 actually goes slightly higher -- which makes sense, since it has no fan directly on it. Just goes to show how well the Gelid VRM bar works in conjunction with the G10.
> The idle temps are slightly better, but the load temps are an amazing improvement.
> 
> I will add some pictures later... just really excited about my initial results and wanted to share. Definitely worth the investment for temps like this, IMO.


Congrats on the upgrade









You won't be disappointed with it one bit. Once you put the better pads on VRM1, you can even push a decent amount of voltage and OC, and still keep the VRMs under what they would have run at stock.

The tape that comes with the Gelid kit for VRM2 is quite awful stuff. Picking up just about anything else will likely give you better temps on VRM2.

If you can get this stuff, I would suggest it: http://www.performance-pcs.com/misc-tim/new-junpus-thermal-tape-t180-0-4mm.html

$0.85 USD per foot. It has the best thermal conductivity I could find for tape, at 1.8 W/mK.
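For a rough feel of what 1.8 W/mK buys you, Fourier's law for one-dimensional conduction (Q = k·A·ΔT/t) can be applied to a tape pad. The pad area and temperature delta below are made-up illustration numbers, not measurements from any card:

```python
# Idealized conductive heat flow through a 0.4 mm thermal tape pad
# (Fourier's law, one-dimensional, ignoring contact resistance).
# k comes from the tape listing above; area and delta-T are assumed examples.

k = 1.8             # thermal conductivity, W/(m*K) (from the tape spec)
thickness = 0.4e-3  # pad thickness in metres (0.4 mm)
area = 1.0e-4       # assumed pad area: 1 cm^2, in m^2
delta_t = 10.0      # assumed temperature difference across the pad, K

q_watts = k * area * delta_t / thickness
print(q_watts)  # 4.5 W through a 1 cm^2 pad at a 10 K delta
```

The takeaway is that even a few W/mK moves meaningful heat across such a thin layer, which is why swapping the stock tape for a higher-conductivity one helps VRM2.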


----------



## HoZy

Hey guys,

I've owned both of my reference XFX 290Xs since 11/3/13 (running the Sapphire TriXX BIOS).
Both have EK* full-cover water blocks in a system with dual 280mm rads.
Still going strong at 1150/1500, +100 VDDC offset, +50% power. (The highest I've had them at is 1275, but the PSU wasn't liking it.)

Just thought I'd touch base.

^_^

EDIT: I meant EK, Been playing with my gf's new EVGA SC ACX 2.0 GTX970.


----------



## Spectre-

Quote:


> Originally Posted by *HoZy*
> 
> Hey guys,
> 
> I've owned both of my reference XFX 290Xs since 11/3/13 (running the Sapphire TriXX BIOS).
> Both have EVGA full-cover water blocks in a system with dual 280mm rads.
> Still going strong at 1150/1500, +100 VDDC offset, +50% power. (The highest I've had them at is 1275, but the PSU wasn't liking it.)
> 
> Just thought I'd touch base.
> 
> ^_^


EVGA blocks on amd

thats new


----------



## wermad

EVGA used to partner with Swiftech to make their Hydro Copper blocks, and Swiftech sold AMD blocks as well. I believe EVGA has since moved on to EK (probably due to the better performance and cost).


----------



## HoZy

Sorry guys, I meant EK. I've got EVGA on the brain lately with the gf's new EVGA SC ACX 2.0 GTX 970.


----------



## Bertovzki

Quote:


> Originally Posted by *JourneymanMike*
> 
> The USED 290's and 290X'x come cheap since coin mining with PC's have gone down...
> 
> I bought three used 290X's on eBay, they all function perfectly!
> 
> For cooling I bought the Aquacomputer Kryographics Acetel/Nickle blocks and Active cooling backplates... This provides very efficient cooling and great overclocking ability...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Are they Sapphire 290X ?


----------



## GrimDoctor

Does anyone here own a Sapphire 290X 6GB? AMD seems to play much better with AutoCAD than Nvidia, from what I'm reading, and I can't really afford workstation gear.


----------



## wermad

This might not answer your question, but as far as Hawaii goes, anything that is not the reference turbine cooler should give you very good air temps. The Sapphires with the Tri-X coolers will run very nicely. I ran a couple before slapping blocks on them, since a hawt samwich was not my plan (







).


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Spectre-*
> 
> oh lol
> my 290x pooped on me
> its getting rma'd
> ill probs buy a 780/ti for benching fun


Hey there you guy.....









I hope you used the overvolt tool and got past 1400-odd MHz, or tried to beat my scores . Can't remember what they were LooooL








Quote:


> Originally Posted by *HoZy*
> 
> Hey guys,
> 
> I've owned both of my reference XFX 290Xs since 11/3/13 (running the Sapphire TriXX BIOS).
> Both have EK* full-cover water blocks in a system with dual 280mm rads.
> Still going strong at 1150/1500, +100 VDDC offset, +50% power. (The highest I've had them at is 1275, but the PSU wasn't liking it.)
> 
> Just thought I'd touch base.
> 
> ^_^
> 
> EDIT: I meant EK, Been playing with my gf's new EVGA SC ACX 2.0 GTX970.
> 
> Quote:
> 
> 
> 
> Originally Posted by *GrimDoctor*
> 
> Does anyone here own a Sapphire 290x 6GB? AMD looks to play much better with AutoCAD than Nvidia from what I'm reading. I can't really afford workstation gear.

Gidday fellow 'Stralians


----------



## maynard14

Guys, have you tried installing the Omega driver on the Windows 10 preview? I'm trying to install it now, and it seems to install, but when I open AMD Catalyst Control Center it doesn't work.


----------



## Spectre-

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Hey there you guy.....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I hope you use the overvolt tool and get past 1400 odd Mhz or try to beat my scores . Cant remember what they were LooooL
> 
> 
> 
> 
> 
> 
> 
> 
> Gidday fellow 'Stralians


I WILL BUY 2

jk

how ya been man

also why does the RIVE i bought from u keep killing my cards


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Spectre-*
> 
> I WILL BUY 2
> 
> jk
> 
> how ya been man
> 
> also why does the RIVE i bought from u keeps killing my cards


I benched 3 BTW








Great guns mate . Re-doing my forklift ticket, and the peeps I'm doing the course with are not very motivated .......


Good question . I've never had an issue with that board, and neither did the previous local owner .
Maybe you're just too much of a dang overvolting roughnut ??
Or the cards you've been using are weak as ??
Or your PSU is surging ??
Try the other PCIe slot ??
I have no idea why








But my BE , 4960X and Tri-X 290s have not missed a beat thank gawd


----------



## Spectre-

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I benched 3 BTW
> 
> 
> 
> 
> 
> 
> 
> 
> Great guns mate . Re-doing my forklift ticket, and the peeps I'm doing the course with are not very motivated .......
> 
> 
> Good question . I've never had an issue with that board, and neither did the previous local owner .
> Maybe you're just too much of a dang overvolting roughnut ??
> Or the cards you've been using are weak as ??
> Or your PSU is surging ??
> Try the other PCIe slot ??
> I have no idea why
> 
> 
> 
> 
> 
> 
> 
> 
> But my BE , 4960x and TRI 290's have not missed a beat thank gawd


lol i was joking

as for overvolting

1.6 isn't overvolting


----------



## YellowBlackGod

Quote:


> Originally Posted by *GrimDoctor*
> 
> Does anyone here own a Sapphire 290x 6GB? AMD looks to play much better with AutoCAD than Nvidia from what I'm reading. I can't really afford workstation gear.


Not 6GB but 8GB. I have one, the Vapor-X 290X OC. The Tri-X 8GB version was also released a couple of days ago. If you want to know about temps, I hardly ever hit 70°C, even in very demanding games like Far Cry 4.


----------



## Bertovzki

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I benched 3 BTW
> 
> 
> 
> 
> 
> 
> 
> 
> Great guns mate . Re-doing my forklift ticket, and the peeps I'm doing the course with are not very motivated .......


He's motivated, he's just cleaning the floor with his back


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Spectre-*
> 
> lol i was joking
> 
> as for overvolting
> 
> 1.6 isn't overvolting


No sheet Sherlock









No, 1.6 CPU vcore isn't









When I try to go past 1.53v my DP goes wonky and I bench blackscreen style ..... when I was benching









These days it's the PT1T BIOS and [email protected]@1.25v before vdroop . No problems, with good temps all round


----------



## GrimDoctor

Quote:


> Originally Posted by *YellowBlackGod*
> 
> Not 6GB but 8GB. I have one, the Vapor-X 290X OC. The Tri-X 8GB version was also released a couple of days ago. If you want to know about temps, I hardly ever hit 70°C, even in very demanding games like Far Cry 4.


Oh really? I'll have to see if I can find one in Australia; 8GB would be nice for rendering. Where did you get yours?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Bertovzki*
> 
> Hes motivated hes just cleaning the floor with his back


And the back of his sweaty baldy knob head too ........


----------



## YellowBlackGod

Quote:


> Originally Posted by *GrimDoctor*
> 
> Oh really. I'll have to see if I can find one in Australia, 8Gb would be nice for rendering. Where did you get yours?


I got it new from eBay.co.uk (I live in Greece). It is also available on Amazon, in other shops in Germany, as far as I remember, and in the Overclockers.co.uk store. There is also a PowerColor 290X version with 8GB of VRAM, and I think MSI was supposed to release one as well. Believe me, right now this is the best card on the market and the best value for money.


----------



## GrimDoctor

Quote:


> Originally Posted by *YellowBlackGod*
> 
> I got it new from eBay.co.uk (I live in Greece). It is also available on Amazon, in other shops in Germany, as far as I remember, and in the Overclockers.co.uk store. There is also a PowerColor 290X version with 8GB of VRAM, and I think MSI was supposed to release one as well. Believe me, right now this is the best card on the market and the best value for money.


If there's an MSI version, I'm interested; of those options, they would have the best support over here.

May I ask what you paid for it?


----------



## Bertovzki

Quote:


> Originally Posted by *Roboyto*
> 
> If you can get this stuff I would suggest it: http://www.performance-pcs.com/misc-tim/new-junpus-thermal-tape-t180-0-4mm.html
> 
> $0.85 USD per foot. It has the best thermal conductance I could find for tape at 1.8 W/mK


So is this the sort of thing I could use to attach heatsinks to an M.2 SSD, for example, as they get hot? Or is there something better for that?


----------



## Spectre-

Quote:


> Originally Posted by *GrimDoctor*
> 
> Oh really. I'll have to see if I can find one in Australia, 8Gb would be nice for rendering. Where did you get yours?


i think Kogan has the 8GB R9 290X by Sapphire
Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> No sheet Sherlock
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No 1.6 cpu vcore isn't
> 
> 
> 
> 
> 
> 
> 
> 
> 
> When I try to go past 1.53v my DP goes wonky and I bench blackscreen style ..... when I was benching
> 
> 
> 
> 
> 
> 
> 
> 
> 
> These days its PT1T bios and [email protected]@1.25v before vdroop . No problems with good temps allround


i flashed one of my BIOSes to PT1 and the other was the stock GB bios

i was using stock for the most part

my cards got to 1.6 (1.55 with droop) and then they did the black screen stuff

but my card could only do 1270 on core, though the mem got to 1640ish


----------



## YellowBlackGod

Quote:


> Originally Posted by *GrimDoctor*
> 
> If there's an MSI version I'm interested, they would have the best support over here of those options.
> 
> May I ask what you paid for it?


I bought it in November, and I did everything in my power to get one because it was not available in Greece. I sold a reference 780 Ti in order to get this baby. I have the 8GB OC version (1060/1400MHz) of the Vapor-X. I wanted this one because of the looks (it beautifully suits the blue-schemed ASRock Z97 Extreme6 that I have), features, and build quality. I bought it for 480 Euros (do the conversion to Australian currency; shipping costs were included).


----------



## GrimDoctor

Quote:


> Originally Posted by *YellowBlackGod*
> 
> I bought it in November, and I did everything in my power to get one because it was not available in Greece. I sold a reference 780 Ti in order to get this baby. I have the 8GB OC version (1060/1400MHz) of the Vapor-X. I wanted this one because of the looks (it beautifully suits the blue-schemed ASRock Z97 Extreme6 that I have), features, and build quality. I bought it for 480 Euros (do the conversion to Australian currency; shipping costs were included).


Thank you very much. I've seen some very good AutoCAD benchmarks with the 4GB 290X, so this might be exactly what I'm after, as well as being less expensive than a 980


----------



## Lao Tzu

Quote:


> Originally Posted by *GrimDoctor*
> 
> Thank you very much. I've seen some very good AutoCAD benchmarks with 4GB 290x so this might be exactly what I'm after as well as being not as expensive as a 980


And it renders 4:00 minutes of 720p video in approx 50 sec Oo ...


----------



## GrimDoctor

I'm starting to look at Crossfire and weighing up pricing. I haven't run CF OCs before, but I'm assuming that, like Nvidia's SLI, you can't OC as high when you have two?

If so, would I be better off going with 2 x 290 instead of 2 x 290X? I could get a much better price on two 290s.


----------



## rdr09

Quote:


> Originally Posted by *GrimDoctor*
> 
> I'm starting to look at CrossFire and weighing up pricing. I haven't run CrossFire OCs before, but I am assuming that, like Nvidia cards, you can't OC as high when you have two?
> 
> If so, would I be better off going 2 x 290 instead of 2 x 290X? I could get a much better price on two 290s.


Grim, I have two 290s and I love them. I know you used to own a 970, and somehow it did not meet what you needed it to do for work. In games it was perfect, right?

Not sure what games you play, but I suggest you start off with one AMD card first and gauge it. These things get hot, especially on air. You might end up having multiple issues and find out your 290 or 290X is the exact opposite of your 970: good for work but not for the games you play. If you weigh your work more heavily, then you might get a better experience.


----------



## JourneymanMike

Quote:


> Originally Posted by *Bertovzki*
> 
> Are they Sapphire 290X ?


Yes, they are Sapphire Reference cards...


----------



## fragamemnon

Quote:


> Originally Posted by *GrimDoctor*
> 
> I'm starting to look at CrossFire and weighing up pricing. I haven't run CrossFire OCs before, but I am assuming that, like Nvidia cards, you can't OC as high when you have two?
> 
> If so, would I be better off going 2 x 290 instead of 2 x 290X? I could get a much better price on two 290s.


In-game, I wouldn't say the small performance increase from the 290X justifies the price difference. I am more than happy with my 290s.

Also, as long as you can get the heat out of the case, you'll be just fine with overclocks. You will need to dissipate up to 650W for the GPUs alone, though; it depends on how far you are willing to push them. The same holds for Nvidia cards too. I don't know why your OC would be limited in a multi-GPU environment, apart from power draw or heat dissipation. I currently run 1225MHz per GPU game-stable with +163mV on each core.
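The 650W figure above can be sanity-checked with simple arithmetic. The per-card draws below are illustrative assumptions (a reference 290 is commonly cited at roughly 250W under gaming load at stock, and a heavy overclock with extra voltage can push it above 300W), not measured values:

```python
# Back-of-the-envelope heat budget for two R9 290s in CrossFire.
# Assumed figures, for illustration only: ~250 W per card at stock,
# ~325 W per card at a heavy overclock with added voltage.

STOCK_W = 250   # assumed typical gaming draw per card, stock clocks
OC_W = 325      # assumed draw per card at ~1225 MHz with +163 mV

CARDS = 2
stock_total = CARDS * STOCK_W   # 500 W
oc_total = CARDS * OC_W         # 650 W

print(f"stock pair: ~{stock_total} W, overclocked pair: ~{oc_total} W")
```

Nearly all of that power ends up as heat inside the case, which is why case airflow matters at least as much as the choice of cooler.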


----------



## GrimDoctor

Quote:


> Originally Posted by *fragamemnon*
> 
> In-game, I wouldn't say the small performance increase from the 290X justifies the price difference. I am more than happy with my 290s.
> 
> Also, as long as you can get the heat out of the case, you'll be just fine with overclocks. You will need to dissipate up to 650W for the GPUs alone, though; it depends on how far you are willing to push them. The same holds for Nvidia cards too. I don't know why your OC would be limited in a multi-GPU environment, apart from power draw or heat dissipation. I currently run 1225MHz per GPU game-stable with +163mV on each core.


Based on the prices I am finding, it's tempting.

Quote:


> Originally Posted by *rdr09*
> 
> Grim, I have two 290s and I love them. I know you used to own a 970, and somehow it did not meet what you needed it to do for work. In games it was perfect, right?
> 
> Not sure what games you play, but I suggest you start off with one AMD card first and gauge it. These things get hot, especially on air. You might end up having multiple issues and find out your 290 or 290X is the exact opposite of your 970: good for work but not for the games you play. If you weigh your work more heavily, then you might get a better experience.


Thanks for the tip. The 970 was quite good in games, single and SLI, though some 970 owners in those other threads have some interesting expectations. Heat was minimal; I understand that will be the biggest difference. My current build log is a little bit of an air experiment, so it could be interesting with a space heater, though I agree one card at a time is probably a better move. I had 760 SLI and that got hot; I don't know how it compares to 290s, as I haven't had an AMD card for a while now. Either way, this case setup will be a whole different kettle of fish.

In relation to the games I play most, some would probably benefit from AMD/Mantle, namely DAI and Civ BE. I tend to play RTS (extremely varied) and RPGs like Dragon Age and The Witcher. I like my action games too, but I get the most hours out of the previous genres, especially games like XCOM... I don't know how many Ironman playthroughs I've done now, lol.

Anyway, back on topic, I am torn between these three options:
1) 980
2) 2 x 290
3) 290X 8GB
1 and 2 would cost exactly the same, with 3 about halfway in between. I do wonder if I am putting too much stock into VRAM because of my previous experience. The 8GB cards are reviewing well, while not being needed by everyone. My setup is probably a bit unique, but even for work at home I can't justify workstation cards, and I don't know how they would handle gaming, hence I thought I might be better off with a gaming card that could also handle work.


----------



## rdr09

Quote:


> Originally Posted by *GrimDoctor*
> 
> Based on the prices I am finding, it's tempting.
> Thanks for the tip. The 970 was quite good in games, single and SLI, though some 970 owners in those other threads have some interesting expectations. Heat was minimal; I understand that will be the biggest difference. My current build log is a little bit of an air experiment, so it could be interesting with a space heater, though I agree one card at a time is probably a better move. I had 760 SLI and that got hot; I don't know how it compares to 290s, as I haven't had an AMD card for a while now. Either way, this case setup will be a whole different kettle of fish.
> 
> In relation to the games I play most, some would probably benefit from AMD/Mantle, namely DAI and Civ BE. I tend to play RTS (extremely varied) and RPGs like Dragon Age and The Witcher. I like my action games too, but I get the most hours out of the previous genres, especially games like XCOM... I don't know how many Ironman playthroughs I've done now, lol.
> 
> Anyway, back on topic, I am torn between these three options:
> 1) 980
> 2) 2 x 290
> 3) 290X 8GB
> 1 and 2 would cost exactly the same, with 3 about halfway in between. I do wonder if I am putting too much stock into VRAM because of my previous experience. The 8GB cards are reviewing well, while not being needed by everyone. My setup is probably a bit unique, but even for work at home I can't justify workstation cards, and I don't know how they would handle gaming, hence I thought I might be better off with a gaming card that could also handle work.


That 8GB would be my recommendation.


----------



## GrimDoctor

Quote:


> Originally Posted by *rdr09*
> 
> That 8GB would be my recommendation.


How are the Sapphire cards, quality- and support-wise? I've never owned one.
Sapphire Radeon R9 290X Vapor-X OC 8GB


----------



## rdr09

Quote:


> Originally Posted by *GrimDoctor*
> 
> How are the Sapphire cards, quality- and support-wise? I've never owned one.
> Sapphire Radeon R9 290X Vapor-X OC 8GB


One of the most highly recommended brands on air; not good for watercooling because of the company's policy.


----------



## GrimDoctor

Quote:


> Originally Posted by *rdr09*
> 
> One of the most highly recommended brands on air; not good for watercooling because of the company's policy.


I've read in a Tom's Hardware review of the 8GB card that it runs cooler than a 970... really? If that's true, it's quite an achievement from what I understand so far.


----------



## hyp36rmax

Quote:


> Originally Posted by *GrimDoctor*
> 
> How are the Sapphire cards, quality- and support-wise? I've never owned one.
> Sapphire Radeon R9 290X Vapor-X OC 8GB


A beautiful card! Check out our VAPOR-X and TRI-X Owners Club: *Link*

Quote:


> Originally Posted by *rdr09*
> 
> One of the most highly recommended brands on air; not good for watercooling because of the company's policy.


Correct! I remember that debacle of a thread regarding water-cooling... Such a grey area of the market; can't blame Sapphire for taking that stance and protecting their bottom line and resources. I really do wish they would lighten up. Oh well, water-blocks it is! haha


----------



## rdr09

Quote:


> Originally Posted by *hyp36rmax*
> 
> A beautiful card!
> 
> Correct! I remember that debacle of a thread regarding water-cooling... Such a grey area of the market; can't blame Sapphire for taking that stance and protecting their bottom line and resources. I really do wish they would lighten up. Oh well, water-blocks it is! haha


I was about to show Grim your thread . . .

http://www.overclock.net/t/1535830/sapphire-vapor-x-tri-x-owners-club


----------



## hyp36rmax

Quote:


> Originally Posted by *rdr09*
> 
> I was about to show Grim your thread . . .
> 
> http://www.overclock.net/t/1535830/sapphire-vapor-x-tri-x-owners-club


Thank you sir! I had ninja edited to add it too haha.


----------



## GrimDoctor

Quote:


> Originally Posted by *hyp36rmax*
> 
> A beautiful card! Check out our VAPOR-X and TRI-X Owners Club: *Link*
> 
> Correct! I remember that debacle of a thread regarding water-cooling... Such a grey area of the market; can't blame Sapphire for taking that stance and protecting their bottom line and resources. I really do wish they would lighten up. Oh well, water-blocks it is! haha


Quote:


> Originally Posted by *rdr09*
> 
> I was about to show Grim your thread . . .
> 
> http://www.overclock.net/t/1535830/sapphire-vapor-x-tri-x-owners-club


Quote:


> Originally Posted by *hyp36rmax*
> 
> Thank you sir! I had ninja edited to add it too haha.


Lol, thanks guys, reading it now. So do they cool as well as some of those reviews state? If so, I am getting close to being sold.


----------



## hyp36rmax

Quote:


> Originally Posted by *GrimDoctor*
> 
> Lol, thanks guys, reading it now. So do they cool as well as some of those reviews state? If so, I am getting close to being sold.


Yes sir! You'll love it! Have you considered the new R9 300 series, which is just around the corner?


----------



## GrimDoctor

Quote:


> Originally Posted by *hyp36rmax*
> 
> Yes sir! You'll love it! Have you considered the new R9 300 series, which is just around the corner?


I have, but as of right now my 760s just sold on eBay and I only have a single 570 lying around.
It's starting to make me think I should get a 4GB 290X, as it will probably have the best resale value once the 300/GM200 cards come around, letting me reassess then but still get work done now.


----------



## YellowBlackGod

Quote:


> Originally Posted by *GrimDoctor*
> 
> How are the Sapphire cards, quality- and support-wise? I've never owned one.
> Sapphire Radeon R9 290X Vapor-X OC 8GB


My recommendation as well. I will say it again: it is the best card on the market right now. I have had three Sapphire cards, zero problems; all have worked well and were stable.


----------



## hyp36rmax

Quote:


> Originally Posted by *GrimDoctor*
> 
> I have, but as of right now my 760s just sold on eBay and I only have a single 570 lying around.
> It's starting to make me think I should get a 4GB 290X, as it will probably have the best resale value once the 300/GM200 cards come around, letting me reassess then but still get work done now.


Fair enough. I actually did something similar, coming from CrossFire 7970s while anticipating the R9 390X; however, I figured why not just jump in and enjoy it now, and grab the 390X cards in a year for another build. All is well again in hyp36rland, lol!


----------



## YellowBlackGod

From now on I will be running only 8GB models. The 290X is future-proof as well, with confirmed DX12 support. It will run all upcoming games on ultra. No reason to change now, really. When the 400-series 8GB models come out, I will reconsider.


----------



## BradleyW

Quote:


> Originally Posted by *YellowBlackGod*
> 
> From now on I will be running only 8GB models. The 290X is future-proof as well, with confirmed DX12 support. It will run all upcoming games on ultra. No reason to change now, really. When the 400-series 8GB models come out, I will reconsider.


Where does it say DX12 support? I'm excited now!


----------



## YellowBlackGod

Quote:


> Originally Posted by *BradleyW*
> 
> Where does it say DX12 support? I'm excited now!


I have seen it written on either AMD's or Sapphire's website in the GPU specs; I did all that research before buying the card. It's written somewhere there, just with an asterisk: if nothing in the DX12 spec changes before release. But I am pretty sure it won't.


----------



## mAs81

Quote:


> Originally Posted by *YellowBlackGod*
> 
> I have seen it written on either AMD's or Sapphire's website in the GPU specs; I did all that research before buying the card. It's written somewhere there, just with an asterisk: if nothing in the DX12 spec changes before release. But I am pretty sure it won't.


Yeah, I think I read somewhere that the same thing applies to the R9 290 too.


----------



## YellowBlackGod

Quote:


> Originally Posted by *mAs81*
> 
> Yeah, I think I read somewhere that the same thing applies to the R9 290 too.


There you go, look at the specs tab:

http://www.amd.com/en-us/products/graphics/desktop/r9

Enjoy your cards guys!!!!

Footnote: Based on our review of the Microsoft DirectX(r) 12 specification dated July 23, 2014, we are confident that devices based on our GCN architecture will be able to support DirectX(r) 12 graphics when available. We recommend that you check AMD.com prior to purchase to confirm that a particular device will support DirectX(r) 12 graphics. Note however, any changes to the DirectX(r) 12 specification after this date could impact or completely eliminate this ability - and AMD disclaims all liability resulting therefrom.

But no change, I guess. So everything is fine.


----------



## mAs81

Quote:


> Originally Posted by *YellowBlackGod*
> 
> There you go, look at the specs tab:
> 
> http://www.amd.com/en-us/products/graphics/desktop/r9
> 
> No asterisk now, 1000% confirmed. Enjoy your cards guys!!!!


Thanks for confirming it, +REP
If only I could fix my GPU cooler now


----------



## YellowBlackGod

Quote:


> Originally Posted by *mAs81*
> 
> Thanks for confirming it, +REP
> If only I could fix my GPU cooler now


See the footnote, my mistake. But no problem, I guess.


----------



## mAs81

Quote:


> Originally Posted by *YellowBlackGod*
> 
> See the footnote, my mistake. But no problem, I guess.


No prob, I read it, but it still seems that we're good. Here's hoping that nothing will change


----------



## BradleyW

Quote:


> Originally Posted by *YellowBlackGod*
> 
> There you go, look at the specs tab:
> 
> http://www.amd.com/en-us/products/graphics/desktop/r9
> 
> Enjoy your cards guys!!!!
> 
> Footnote: Based on our review of the Microsoft DirectX(r) 12 specification dated July 23, 2014, we are confident that devices based on our GCN architecture will be able to support DirectX(r) 12 graphics when available. We recommend that you check AMD.com prior to purchase to confirm that a particular device will support DirectX(r) 12 graphics. Note however, any changes to the DirectX(r) 12 specification after this date could impact or completely eliminate this ability - and AMD disclaims all liability resulting therefrom.
> 
> But no change, I guess. So everything is fine.


+1 cheers buddy!


----------



## Bima Sylirian

Hi guys. I'm sorry if I'm in the wrong place. I'm thinking about upgrading to a Radeon R9 290 because it has twice the VRAM (per GPU) of my current graphics card, a Radeon HD 6990. However, I can only find the reference R9 290 for now. Does the reference one still perform okay? I'm afraid the GPU will throttle quite often, which will hammer its performance. FYI, the ambient temperature in my room is around 28-29°C, which might be a concern. Other than that, I heard the noise of the reference cooler is horrible as well, so I need some opinions about this.
Thank you.


----------



## YellowBlackGod

By the way, the new version of TriXX is out (4.8.9) and it STILL has the same problem: it crashes while attempting to open the settings tab!!!!


----------



## pshootr

Quote:


> Originally Posted by *YellowBlackGod*
> 
> By the way, the new version of TriXX is out (4.8.9) and it STILL has the same problem: it crashes while attempting to open the settings tab!!!!


Interesting, I still have the last release and it does not crash when I open the settings tab.


----------



## YellowBlackGod

Quote:


> Originally Posted by *pshootr*
> 
> Interesting, I still have the last release and it does not crash when I open the settings tab.


The 4.8.6? The problem is confirmed by others in the Sapphire Forums.


----------



## rdr09

Quote:


> Originally Posted by *YellowBlackGod*
> 
> The 4.8.6? The problem is confirmed by others in the Sapphire Forums.


Do you really have to use the .6 version?

I checked mine and it is still the 4.8.1 version. I use it to OC both my 290s.


----------



## Forceman

Quote:


> Originally Posted by *Bima Sylirian*
> 
> Hi guys. I'm sorry if I'm in the wrong place. I'm thinking about upgrading to a Radeon R9 290 because it has twice the VRAM (per GPU) of my current graphics card, a Radeon HD 6990. However, I can only find the reference R9 290 for now. Does the reference one still perform okay? I'm afraid the GPU will throttle quite often, which will hammer its performance. FYI, the ambient temperature in my room is around 28-29°C, which might be a concern. Other than that, I heard the noise of the reference cooler is horrible as well, so I need some opinions about this.
> Thank you.


Unless you plan to watercool it, I would avoid a reference card. They can get very loud, especially with that high an ambient temp.


----------



## YellowBlackGod

Quote:


> Originally Posted by *rdr09*
> 
> Do you really have to use the .6 version?
> 
> I checked mine and it is still the 4.8.1 version. I use it to OC both my 290s.


Well, I like to be up to date; that's what new releases are for. But for now I will stick with the 4.8.2 version, which seems to work fine. Still, Sapphire should update more frequently and fix the issues, like ASUS does with GPU Tweak.


----------



## Starage

Hello

I picked up two 290 cards.

Everything runs fine, but at startup I get this message: atipdlxx failed to load.
I found the atipdlxx.dll file and put it in

C:\AMD\AMD-Catalyst-14.11.2Beta-64Bit-Win8.1-Win7-Nov19\Packages\Drivers\Display\WB6A_INF\B177872

but the message still says it failed to load.

I ran Nvidia for years and just went back to ATI.

Any ideas how to solve this?


----------



## bluedevil

Thinking of ditching my 970 for CrossFired 290s. Looking for a constant 96 fps in BF4 @ 1440p, max settings.


----------



## tsm106

Quote:


> Originally Posted by *Starage*
> 
> Hello
> 
> I picked up two 290 cards.
> 
> Everything runs fine, but at startup I get this message: atipdlxx failed to load.
> I found the atipdlxx.dll file and put it in
> 
> C:\AMD\AMD-Catalyst-14.11.2Beta-64Bit-Win8.1-Win7-Nov19\Packages\Drivers\Display\WB6A_INF\B177872
> 
> but the message still says it failed to load.
> 
> I ran Nvidia for years and just went back to ATI.
> 
> Any ideas how to solve this?


The easiest way is to run DDU; that will clean out the drivers and set Windows not to auto-update them (auto-updating is bad).

Then install the latest driver build, the Omegas.

Download DDU below, and get the latest driver from AMD's site.

http://www.wagnardmobile.com/DDU/ddudownload.htm

Quote:


> Originally Posted by *bluedevil*
> 
> Thinking of ditching my 970 for CrossFired 290s. Looking for a constant 96 fps in BF4 @ 1440p, max settings.


http://www.newegg.com/Product/Product.aspx?Item=N82E16814127787&cm_re=radeon_r9_290x-_-14-127-787-_-Product

Lightning for $309 AR. Gah, that is silly low.


----------



## rdr09

Quote:


> Originally Posted by *YellowBlackGod*
> 
> Well, I like to be up to date; that's what new releases are for. But for now I will stick with the 4.8.2 version, which seems to work fine. Still, Sapphire should update more frequently and fix the issues, like ASUS does with GPU Tweak.


I see.
Quote:


> Originally Posted by *Starage*
> 
> Hello
> 
> I picked up two 290 cards.
> 
> Everything runs fine, but at startup I get this message: atipdlxx failed to load.
> I found the atipdlxx.dll file and put it in
> 
> C:\AMD\AMD-Catalyst-14.11.2Beta-64Bit-Win8.1-Win7-Nov19\Packages\Drivers\Display\WB6A_INF\B177872
> 
> but the message still says it failed to load.
> 
> I ran Nvidia for years and just went back to ATI.
> 
> Any ideas how to solve this?


I'd also recommend doing one card first and testing. I mean, do not install both cards and then install the driver; test each card.

You don't need to pull out the GPUs, just unplug the power to one.


----------



## Kaltenbrunner

My new/open-box Gigabyte R9 290 OC arrived, and I think I know why it was such a deal compared to some of the others in the NCIX eBay outlet: it has some coil-whine type thing. It's fairly low-pitched, maybe around 60Hz, since it sort of reminds me of some of the buzz I've heard out of my guitar amp.

It's not that loud either, and it doesn't do it all the time. I've only played 10 minutes of BF4 and 5 minutes of Skyrim; it does it in the menus more often so far, but also in game. I have v-sync on, btw. I never had the volume that loud; if there's any action I don't notice it at all, so it could be way, way worse.

I'll try changing its 1040/1250 OC over the next few days and see if that makes any difference, maybe lower or raise the power slider or the voltage itself. Worst case, does Gigabyte have serial-number-based RMA, so that even if it's open-box (no way to know the card's history) or second-hand, you can RMA it while it's within the two- or three-year warranty period?

On the good side, I have all settings maxed in BF4 @ 1080, keeping 60fps is easy, and it feels way smoother than with my 7950. And, as I predicted, the temps never went over 67°C; in Skyrim they were under 57°C. That was only after 10 and 5 minutes, but I don't expect much more than that with my case and no panels on it.


----------



## Starage

Okay guys, thank you. I will do that.


----------



## Roboyto

Quote:


> Originally Posted by *Bertovzki*
> 
> So is this the sort of thing I could use to attach heat sinks to an M.2 SSD, e.g., as they get hot? Or is there something better for this?


Sure, or some thermal pads.


----------



## Bima Sylirian

Quote:


> Originally Posted by *Forceman*
> 
> Unless you plan to watercool it, I would avoid a reference card. They can get very loud, especially with that high an ambient temp.


Thanks for the input.

I'll avoid the reference card.


----------



## pshootr

Quote:


> Originally Posted by *YellowBlackGod*
> 
> The 4.8.6? The problem is confirmed by others in the Sapphire Forums.


That is correct; I use 4.8.6 and the settings tab does not cause it to crash for me.

I use a single card, and I do not have AMD OverDrive running; not sure if any of that makes a difference. I also use the Omega drivers for the GPU and chipset.


----------



## Gobigorgohome

Today I kissed goodbye to two of my Sapphire Radeon R9 290Xs with waterblocks mounted. I hope the new owner takes care of them the same way I did. I just want to lie down and cry, but I can comfort myself with my MSI Lightning R9 290X (which I will not get rid of).









Waiting for next generation AMD ....


----------



## Starage

So I ran DDU and removed Sapphire TriXX and some other GPU programs. I also found some MSI programs from my Big Bang II X79 board, plus MSI Afterburner.
I restarted the system and ran DDU again; there was nothing to remove, but when I restarted, the failure message did not show up.
So I reinstalled Sapphire TriXX and restarted.
No messages or errors.
I have the most current AMD/ATI drivers, installed with one card at a time.
All is good now.

Thank you guys.

So happy I did not go with two GTX 970s.

These 290s do the job just fine.


----------



## Bertovzki

Quote:


> Originally Posted by *Roboyto*
> 
> Sure, or some thermal pads.


OK, yeah, I would need it to be adhesive, as I would need it to stick well without any mount: a heat sink straight on top of the memory chips.

Thanks for the info. I know it's a bit off topic, but this is a great place to ask.


----------



## Serandur

Hello, I was just wondering if anyone here has ever played KotOR I or II on their 290/290X. I'm curious to know if they have any graphical glitches or problems at all with those games. Thank you.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Starage*
> 
> So I ran DDU and removed Sapphire TriXX and some other GPU programs. I also found some MSI programs from my Big Bang II X79 board, plus MSI Afterburner.
> I restarted the system and ran DDU again; there was nothing to remove, but when I restarted, the failure message did not show up.
> So I reinstalled Sapphire TriXX and restarted.
> No messages or errors.
> I have the most current AMD/ATI drivers, installed with one card at a time.
> All is good now.
> 
> Thank you guys.
> 
> So happy I did not go with two GTX 970s.
> 
> These 290s do the job just fine.


Heck yes!!!


----------



## e4et

Hi guys.
I need some help, please.
I bought an MSI Gaming R9 290 and cannot get both my displays to work at the same time, one running from DVI and the other from HDMI.
Individually they work just fine, but together I get as far as the Windows login screen, then both screens just go black/no signal.

I had a pair of 7950s in this rig, so I don't think power from my 750W Corsair is an issue.
I also updated my ASRock Z77 Extreme4 to the latest available BIOS.

Please help.

Thanks in advance.


----------



## melodystyle2003

Do you have a signal on either or both monitors during POST and Windows loading, while both are connected to the GPU?


----------



## e4et

Quote:


> Originally Posted by *melodystyle2003*
> 
> Do you have a signal on either or both monitors during POST and Windows loading, while both are connected to the GPU?


Hi.
Yes, I have a signal on both during POST and through the Windows logo; both go blank just before the login screen.
On a single display, everything works fine.
Also, if I connect the second screen to the card (when running with only one), both screens go blank as soon as I connect it.


----------



## ghabhaducha

Quote:


> Originally Posted by *e4et*
> 
> Hi.
> Yes, I have a signal on both during POST and through the Windows logo; both go blank just before the login screen.
> On a single display, everything works fine.
> Also, if I connect the second screen to the card (when running with only one), both screens go blank as soon as I connect it.


Sounds like a possible driver issue/mishap to me. Try running DDU from safe mode, and remove all AMD drivers. After uninstalling, install a fresh copy of the newest ones and see if the issue still persists.


----------



## e4et

Quote:


> Originally Posted by *ghabhaducha*
> 
> Sounds like a possible driver issue/mishap to me. Try running DDU from safe mode, and remove all AMD drivers. After uninstalling, install a fresh copy of the newest ones and see if the issue still persists.


Hi, thanks, I'll try that tonight when I get home.
Do you think the DVI cable could have something to do with it? The card has dual-link DVI-D ports, and my cable is single-link DVI-D.
Unfortunately, I don't have a dual-link cable to test with.


----------



## ghabhaducha

Is your monitor 1440p, or 1080p at 144Hz? If not, I don't think the DVI cable is the problem. Not to mention, you are able to get video until the Windows logon screen pops up, which to me yells driver issue.
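For context, the single-link vs. dual-link question comes down to pixel clock: single-link DVI is capped at a 165 MHz pixel clock, and standard 1080p60 needs only about 148.5 MHz, so a single-link cable is enough here. A quick check using the standard CEA-861 timing values for 1080p60:

```python
# Check whether 1080p60 fits within single-link DVI's 165 MHz pixel
# clock cap. Pixel clock = total horizontal pixels (active + blanking)
# * total vertical lines (active + blanking) * refresh rate.

H_TOTAL = 2200     # 1920 active + horizontal blanking (CEA-861 1080p timing)
V_TOTAL = 1125     # 1080 active + vertical blanking
REFRESH_HZ = 60

pixel_clock_mhz = H_TOTAL * V_TOTAL * REFRESH_HZ / 1e6   # 148.5 MHz
needs_dual_link = pixel_clock_mhz > 165.0

print(f"1080p60 pixel clock: {pixel_clock_mhz:.1f} MHz, "
      f"dual link needed: {needs_dual_link}")
```

1440p60 or 1080p at 144Hz would blow past the 165 MHz cap, which is why those monitors require dual-link DVI (or DisplayPort/HDMI) instead.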


----------



## e4et

Quote:


> Originally Posted by *ghabhaducha*
> 
> Is your monitor 1440p, or 1080p at 144Hz? If not, I don't think the DVI cable is the problem. Not to mention, you are able to get video until the Windows logon screen pops up, which to me yells driver issue.


It's a 24" 1080 monitor.
Almost heading home.
Thanks for the help thus far.


----------



## e4et

Quote:


> Originally Posted by *e4et*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ghabhaducha*
> 
> Sounds like a possible driver issue/mishap to me. Try running DDU from safe mode, and remove all AMD drivers. After uninstalling, install a fresh copy of the newest ones and see if the issue still persists.
> 
> 
> 
> Hi, thanks, I'll try that tonight when I get home.
> Do you think the DVI cable could have something to do with it? The card has dual-link DVI-D ports, and my cable is single-link DVI-D.
> Unfortunately, I don't have a dual-link cable to test with.

Hi.
Okay, using only the DVI connection, I booted into safe mode and used DDU with the shut-down option.
Once the PC was powered off, I connected the HDMI cable, started the PC, and successfully booted into Windows with both screens working, mirrored. I then started to install the latest official Omega driver, and about halfway through the setup both screens just went black.


----------



## Siezureboy

Anyone have issues with the CTD (with no error message) in Dragon Age? I know it's a known issue already, especially with Nvidia cards, but I wasn't sure if anyone else on the AMD side besides me has had the same issue. I have a Sapphire 290X.


----------



## YellowBlackGod

AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price.
http://www.techpowerup.com/209412/amd-cashes-in-on-gtx-970-drama-cuts-r9-290x-price.html

4GB means 4GB!!!

Run Guys!!!! The best card right now at the best Price!!!!


----------



## DividebyZERO

Quote:


> Originally Posted by *YellowBlackGod*
> 
> AMD Cashes in On GTX 970 Drama, Cuts R9 290X Price.
> http://www.techpowerup.com/209412/amd-cashes-in-on-gtx-970-drama-cuts-r9-290x-price.html
> 
> 4GB means 4GB!!!
> 
> Run Guys!!!! The best card right now at the best Price!!!!


Never waste a good crisis, lol. Sadly, though, a majority of people who have a GTX 970 will simply get a GTX 980. They aren't fazed by this enough to give up Nvidia. They still believe AMD drivers are horrible and Nvidia drivers are perfect. They still believe the 290X is a furnace, despite cards having excellent cooling solutions. They simply believe AMD is trash and Nvidia is perfect and always the best.

Example
http://www.overclock.net/t/1535502/gtx-970s-can-only-use-3-5gb-of-4gb-vram-issue/1110#post_23472920


----------



## YellowBlackGod

There are many out there who think otherwise, like me. I replaced a reference 780 Ti, which was an elegant furnace. I didn't regret it, and AMD's Omega drivers are fine.


----------



## sinnedone

Quick question about card BIOSes.

It's been a while since I flashed a card BIOS, and I just wanted to know if there are performance gains or improvements to be had.

I currently own two reference Sapphire R9 290s with what is essentially the release-date BIOS on them. I wanted to know if using a newer BIOS from, say, a Tri-X would add better performance or maybe better overclockability (other than the stock core clock being bumped a bit, but I always overclock anyway).


----------



## Kaltenbrunner

Hey, my new R9 290 OC does not have coil whine; it's the PSU.

In 18 days I'm ordering another 30" 1600p IPS. Once I get 8GB of RAM, an 850W PSU, and CrossFire:

*THIS COMPUTER WILL THEN BE THE ULTIMATE POWER IN THE UNIVERSE*


----------



## Forceman

Quote:


> Originally Posted by *sinnedone*
> 
> Quick question about card BIOSes.
> 
> It's been a while since I flashed a card BIOS, and I just wanted to know if there are performance gains or improvements to be had.
> 
> I currently own two reference Sapphire R9 290s with what is essentially the release-date BIOS on them. I wanted to know if using a newer BIOS from, say, a Tri-X would add better performance or maybe better overclockability (other than the stock core clock being bumped a bit, but I always overclock anyway).


Typically, no. I'd also be a little cautious about using a non-reference BIOS on a reference card, just in case it has different voltage controllers or something (or, in the Tri-X's case, different fan profiles).


----------



## DividebyZERO

Quote:


> Originally Posted by *Kaltenbrunner*
> 
> Hey, my new R9 290 OC does not have coil whine; it's the PSU.
> 
> In 18 days I'm ordering another 30" 1600p IPS. Once I get 8GB of RAM, an 850W PSU, and CrossFire:
> 
> *THIS COMPUTER WILL THEN BE THE ULTIMATE POWER IN THE UNIVERSE*


Regardless of hardware or brands used, I always enjoy people's enthusiasm when they build something they like.

Enjoy it!


----------



## Kaltenbrunner

^^^ exactly


----------



## kizwan

Quote:


> Originally Posted by *sinnedone*
> 
> Quick question about card BIOS'.
> 
> Its been a while since I flashed a card Bios, and just wanted to know if there are performance gains or improvements to be had.
> 
> I currently own two reference Saphire R9 290's with like the release date BIOS on them. I wanted to know if say using a newer BIOS from say a Tri-X would add better performance or maybe better overclock ability? (other than the stock core clock bumped a bit, but I always overclock anyway)


I use Tri-X VBIOS for my 290's because with it the cards automatically run at 1000/1300 by default. Other than that I don't see any difference in overclockability between Tri-X & stock VBIOS.
Quote:


> Originally Posted by *Kaltenbrunner*
> 
> Hey, my new R9 290 OC does not have coil whine; it's the PSU.
> 
> In 18 days I'm ordering another 30" 1600p IPS. Once I get 8GB of RAM, an 850W PSU, and CrossFire...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *THIS COMPUTER WILL THEN BE THE ULTIMATE POWER IN THE UNIVERSE*


You might want to go with a 1000W PSU just in case.


----------



## pshootr

Quote:


> Originally Posted by *kizwan*
> 
> I use Tri-X VBIOS for my 290's because with it the cards automatically run at 1000/1300 by default. Other than that I don't see any difference in overclockability between Tri-X & stock VBIOS.
> You might want to go with a 1000W PSU just in case.


When I got my Tri-X, the gurus in the PSU section assured me that a quality 750W would be plenty for OC'd 290's in crossfire, even with an overclocked CPU. I did opt for an 850 though. Just stating what I was told by "shilka".


----------



## hyp36rmax

Quote:


> Originally Posted by *pshootr*
> 
> When I got my Tri-X, the gurus in the PSU section assured me that a quality 750W would be plenty for OC'd 290's in crossfire, even with an overclocked CPU. I did opt for an 850 though. Just stating what I was told by "shilka".


Shilka does have vast knowledge of PSUs; I find him and Phaedrus the go-to guys for power. Shoot, Phaedrus and I work in the same office haha. He probably suggested it on the basis that you wouldn't be feeding more voltage to the GPU and increasing its power-hungry characteristics lol.


----------



## tsm106

They were pretty clueless about these setups. It took a long time before it finally sunk in that you could suck down 1500W at the socket with two cards and a hexacore, lol, or on the flip side you could putter along under 800W. The problem is they think they know what's best for everyone. I don't buy a four-door car to only use two doors, know what I mean?


----------



## hyp36rmax

Quote:


> Originally Posted by *tsm106*
> 
> They were pretty clueless about these setups. Took a long time before it finally sunk in that you could suck down 1500w at the socket with two cards and a hexacore lol or on the flip side you could putter along under 800w. The problem is they think they know whats best for everyone. I don't buy a four door car to only use two doors, know what I mean?


Possibly; you had the best real-world example of power consumption: factory voltage, then give 'em some juice. The car example is subjective, as I've worked on 4-door cars used only as two-doors because they were the best performers in their class for competitive racing. Roll cages take up a lot of room. You have a good point though: while the PSU gurus have general knowledge of what will work, it does take some fine tuning and experience with the product to get a clear result.


----------



## pshootr

Quote:


> Originally Posted by *tsm106*
> 
> They were pretty clueless about these setups. Took a long time before it finally sunk in that you could suck down 1500w at the socket with two cards and a hexacore lol or on the flip side you could putter along under 800w. The problem is they think they know whats best for everyone. I don't buy a four door car to only use two doors, know what I mean?


Quote:


> Originally Posted by *hyp36rmax*
> 
> Possibly; you had the best real-world example of power consumption: factory voltage, then give 'em some juice. The car example is subjective, as I've worked on 4-door cars used only as two-doors because they were the best performers in their class for competitive racing. Roll cages take up a lot of room. You have a good point though: while the PSU gurus have general knowledge of what will work, it does take some fine tuning and experience with the product to get a clear result.


Well I hope he was right, because if I need to buy another PSU when I go crossfire I'm going to be pissed.


----------



## kizwan

Quote:


> Originally Posted by *tsm106*
> 
> They were pretty clueless about these setups. Took a long time before it finally sunk in that you could suck down 1500w at the socket with two cards and a hexacore lol or on the flip side you could putter along under 800w. The problem is they think they know whats best for everyone. I don't buy a four door car to only use two doors, know what I mean?


Exactly. My PSU runs hot under load. If I went with an 850W, there's a possibility the PSU might melt. lol.


----------



## steadly2004

Quote:


> Originally Posted by *pshootr*
> 
> Well I hope he was right, because if I need to buy another PSU when I go crossfire I'm going to be pissed.


Quote:


> Originally Posted by *tsm106*
> 
> They were pretty clueless about these setups. Took a long time before it finally sunk in that you could suck down 1500w at the socket with two cards and a hexacore lol or on the flip side you could putter along under 800w. The problem is they think they know whats best for everyone. I don't buy a four door car to only use two doors, know what I mean?


Just to throw it out there, my current setup: a 4930K at 4.3 plus a 290X and a 290 pull 700W from the wall. The cards are reference and at stock settings, running BOINC with everything loaded up. I haven't tested with the cards overclocked, as 700W running for a day or two already affects the power bill a little. I might OC them another time and see how far it goes up with more volts and maybe 1100 or 1150 core. I think those are my best game-stable clocks, but I can bench at like 1190/1500. Maybe a little more on the RAM, but it's been ages since I OC'd for benchies.

I wish I would have used the Kill A Watt meter when I had a 3rd card in there. All loaded up, I remember I tripped the breaker a time or two, lol. Had to kick the kids out watching TV and shut off whatever else was plugged in on the same circuit.
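For anyone sizing a PSU around wall numbers like these, a rough back-of-the-envelope sketch. All the wattage figures below are assumptions for illustration, not measurements from this rig:

```python
# Back-of-the-envelope PSU sizing for a CrossFire rig.
# Component wattages are rough assumed figures, not measured values.

def recommended_psu_watts(component_watts, headroom=0.25):
    """Sum the estimated DC draw and add headroom for transients/aging."""
    total = sum(component_watts.values())
    return total, total * (1 + headroom)

rig = {
    "cpu_overclocked": 180,        # assumed: hexacore with a healthy OC
    "gpu_1_r9_290": 275,           # assumed: stock-ish 290 under load
    "gpu_2_r9_290": 275,
    "board_ram_drives_fans": 80,   # assumed: everything else
}

total, recommended = recommended_psu_watts(rig)
print(f"Estimated load: {total} W, recommended PSU: about {recommended:.0f} W")
```

With those assumed numbers the estimate lands in the 800W-load / 1000W-PSU range, which lines up with the spread of wall readings people are reporting in this thread once overclocks and extra voltage enter the picture.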


----------



## mojobear

Also, just to add a different perspective: my 4 R9 290s OC'd to 1194/1350 +131mV pull 1350W from the wall with a solid load from games like Crysis 3, Sleeping Dogs, and Tomb Raider... aka games with good scaling. My CPU is more tame though, a 4770K.









If you can cool the R9s down it really increases efficiency. My cores go up to 60C max and VRM1 is about 50C max using Fuji pads. I really think lower temps do wonders for efficiency.


----------



## cosita88

Hello,
this is my reference Sapphire 290,
with an AX3 and two Zalman heatsinks for the VRMs.










I removed the excess thermal pad.









































Sorry for my English. Thanks!


----------



## kizwan

Quote:


> Originally Posted by *steadly2004*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *pshootr*
> 
> Well I hope he was rite, because if I need to buy another PSU when I go crossfire I'm going to be pissed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> They were pretty clueless about these setups. Took a long time before it finally sunk in that you could suck down 1500w at the socket with two cards and a hexacore lol or on the flip side you could putter along under 800w. The problem is they think they know whats best for everyone. I don't buy a four door car to only use two doors, know what I mean?
> 
> Click to expand...
> 
> 
> 
> 
> 
> Just to throw it out there, my current setup. 4930k when at 4.3 and a 290x plus 290 pull 700w from the wall. The cards are reference and at stock settings running BOINC with everything loaded up. I haven't tested with the cards over clocked as 700w running for a day or two already affects the power bill a little. I might OC them another time and see how far it goes up With more volts and maybe 1100 core or 1150. I think that's my best game stable clocks, but can bench at like 1190/1500. Maybe a little more on the ram, but it's been ages since I OC for benchies
> 
> I wish I would have used the kill-a-watt meter when I had a 3rd card in there. All loaded up, I remember I tripped the breaker a time or two, lol. Had to kick the kids out watching TV and shut off whatever else was plugged in on the same circuit.
Click to expand...

Your CPU at a low/moderate overclock helps keep power consumption down. You do need a separate circuit for your computer though. When I first ran my 290's, I did trip the main circuit breaker, but that's because the breaker my computer was on is shared with other appliances that also draw a lot of power. That circuit breaker went kaput & had a burn mark. Right now my computer is connected to a separate circuit & I can overclock and bench/game back-to-back without any problem.
Quote:


> Originally Posted by *mojobear*
> 
> Also just to add a different perspective my 4 r9 290s OC to 1194/1350 +131mV pulls 1350 watt from the wall w solid load from games like crysis 3, sleeping dogs, and tomb raider....aka games w good scaling. *My cpu is more tame though using a 4770k*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If u can cool the r9s down it really increased efficiency. My cores go up to 60C max and vrm 1 is about 50c max using Fuji pads. I really think lower temps does wonders for efficiency


Yeah, Haswell's low power consumption helps a lot.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> I use Tri-X VBIOS for my 290's because with it the cards automatically run at 1000/1300 by default. Other than that I don't see any difference in overclockability between Tri-X & stock VBIOS.
> You might want to go with a 1000W PSU just in case.
> Quote:
> 
> 
> 
> Originally Posted by *pshootr*
> 
> When I got my Tri-X, the gurus in the PSU section assured me that a quality 750W would be plenty for OC'd 290's in crossfire, even with an overclocked CPU. I did opt for an 850 though. Just stating what I was told by "shilka".
Click to expand...

Always future-proof yourself and always try to get the bigger PSU. So if you need 2 of 'em your wattage needs will be addressed, *specially if you're a bencher*.


----------



## fragamemnon

For reference, I'm running 2 290s and when folding with +163mV on the cores (1225MHz for one and 1200MHz for the other GPU), plus my quad-core Ivy overclocked to 4.8GHz, water loop, and all the other peripherals except monitors, my Bronze-rated PSU draws up to 800W from the wall. In games with VSync disabled I usually draw about 650-680W.


----------



## Spectre-

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Always future proof yourself and always try to get the Bigger PSU . So if you need 2 of em your wattage needs will be addressed * specially if your a bencher *


yeap

If I decide to bench 2 290X's with PT1, my 1000W PSU reaches its peak limit with an OC'd 3930K @ 1.55 vcore.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *tsm106*
> 
> They were pretty clueless about these setups. Took a long time before it finally sunk in that you could suck down 1500w at the socket with two cards and a hexacore lol or on the flip side you could putter along under 800w. The problem is they think they know whats best for everyone. I don't buy a four door car to only use two doors, know what I mean?


My lowest PSU size is 1200 watts, no ifs and no buts about it these days.
I remember a while ago certain peeps were beaten down 'cause their arguments didn't match their rep. Needless to say I go by what I've needed to do for my exploits... and it works for me. With correct advice of course.








Youse know who you are








Quote:


> Originally Posted by *Spectre-*
> 
> yeap
> 
> if i decide to bench 2 290x's with PT1 my 1000 watt psu reaches its peak limit with a oc'd 3930k @ 1.55vcore


Trifire + hexacore + heavy OCs = twin PSUs. I had to go back to one PSU and stock GPU clocks 'cause the cable was causing my board's surge protection to trip.








Haven't benched in over 3 months. Withdrawal effects have dissipated now...

@alancsalt
Are you gonna go red with the 300 series coming up at all ??


----------



## alancsalt

Where are they?


----------



## HOMECINEMA-PC

Yeah I know








Hoping for a big improvement. And a w/b for it too. No AIO pls.


----------



## Spectre-

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> My lowest PSU size is 1200 watts no ifs and no buts about it these days
> I remember awhile ago certain peeps were beaten down cause their arguments didn't match their rep . Needless to say I go by what ive need to do for my exploits ...and it works for me . With correct advise of cause
> 
> 
> 
> 
> 
> 
> 
> 
> Youse know who you are
> 
> 
> 
> 
> 
> 
> 
> 
> TRI fire + hexcore + heavy o/c's = twin PSU's . I had to go back to one PSU and stock gpu clocks cause the cable was causing my boards surge protection to trip .
> 
> 
> 
> 
> 
> 
> 
> 
> Havent benched in over 3 months . Withdrawal effects have dissipated now .......
> 
> @alancsalt
> Are you gonna go red with the 300 series coming up at all ??


Keen on going back to the red team with the new GPUs.

Hopefully I can go X99 by the end of this year or can afford LGA 1151 chips.
Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yeah I know
> 
> 
> 
> 
> 
> 
> 
> 
> Hoping for a big improvement . And a w/b for it too . no aio pls


Nearly 40% if the HBM thing is for real. Also keen for HSA to catch on in synthetic benchmarks.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Spectre-*
> 
> keen on going back to red team with new gpu's
> 
> hopefully i can go X99 by the end of this year or can afford LGA 1151 chips
> nearly 40% if the HBM thing is for real also keen for HSA to catch on to synthetic benchmarks


I'm gonna hang on to my RIVBE setup; I believe I can get more out of this socket with the new 300 series.
Definitely will be benching again when these red things are fer sale here.


----------



## Spectre-

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yeah I know
> 
> 
> 
> 
> 
> 
> 
> 
> Hoping for a big improvement . And a w/b for it too . no aio pls


Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Im gonna hang on to my RIVBE setup , I believe I can get more out of this socket with new 300 series .
> Defin will be benching again when these red things are fer sale here


I want to cling onto my 3930K but it's started to age.

Also, I have learnt that your benching rig and your daily gaming/use rig can't be the same if you bench a lot.

I might just convert my current rig to my gaming rig and grab the octa once I start full time in May, just for benching.

As for price, just add $150 to whatever the USA price would be (gotta love Abbott).


----------



## Gobigorgohome

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Im gonna hang on to my RIVBE setup , I believe I can get more out of this socket with new 300 series .
> Defin will be benching again when these red things are fer sale here


Me too, I will try out the 300-series as well on my RIVBE and 4930K. Only have one MSI Lightning R9 290X left in my rig now.







Not going to work well for 4K gaming.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Spectre-*
> 
> I want to cling onto my 3930k but its started to age
> 
> also i have learnt that your benching rig and your daily gaming/use rig cant be the same if you bench a lot
> 
> i am might just convert my current rig to my gaming rig and grab the octa once i start full time in may just for benching
> 
> as for price just add $150 to w.e USA price would be ( Gotta love abbott)


I've still got my new 3930K as a spare; it has done a 5.4GHz validation when I benched it a bit on my spare RIVE and 290 combo.


----------



## Spectre-

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Ive still got my new 3930k as a spare it has done a 5.4ghz val when I benched it a bit on my spare RIVE and 290 combo


That's true, but I don't wanna spend money on X79 now.

also check this out -

http://www.kitguru.net/components/power-supplies/matthew-wilson/8pack-and-super-flower-to-launch-worlds-first-2000w-consumer-psu/

Looks like dual-PSU setups might not be needed anymore (unless LN2).


----------



## long99x

Hi everyone, which is the best BIOS for the ASUS R9 290 DC2OC, and can I use ATIWinflash to flash the BIOS on an R9 290?

thanks


----------



## PsYcHo29388

Should I pull the trigger on getting a 290/290X or wait to see what AMD has to offer this year?


----------



## sinnedone

Depends on how much money you can spend. If you have around a $500 budget, I'd wait.

Two 290's are cutting it well for me at 1440p 120hz


----------



## keikei

Quote:


> Originally Posted by *PsYcHo29388*
> 
> Should I pull the plug on getting a 290/290X or wait to see what AMD has to offer this year?


Waiting won't hurt. You'll want to see the numbers.


----------



## PsYcHo29388

Quote:


> Originally Posted by *sinnedone*
> 
> Depends how much money you can spend. If you have around a 500 dollar budget I'd wait.


Well, my current budget is sub-$300, but it might be nice to wait because IMO the power consumption on the 290 at least is kinda high compared to the GTX 970. Perhaps the R9 3XX series will have similar or lower power consumption and perform just as well if not better.
Quote:


> Originally Posted by *keikei*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PsYcHo29388*
> 
> Should I pull the plug on getting a 290/290X or wait to see what AMD has to offer this year?
> 
> 
> 
> Waiting wont hurt. You'll want to see the numbers.
Click to expand...

Very true; I don't 100% need a new card right now, it's only a want. I'll probably get over it in a few days.


----------



## Siezureboy

I think the Hawaii chip this time round for AMD is the equivalent of Nvidia's 4xx line previously. Those cards consumed considerably more at stock as well. Hopefully they'll follow Nvidia's footsteps and learn a lot to improve for the next gen. I'm pretty much keeping a keen eye on what AMD has to offer.


----------



## rdr09

Quote:


> Originally Posted by *PsYcHo29388*
> 
> Well, my current budget is sub $300, but it might be nice to wait because IMO the power consumption on the 290 at least is kinda high compared to the GTX 970. Perhaps the R9 3XX series will have similar or less power consumption and perform just as good if not better.
> Very true, as I don't 100% need a new card right now, it's only a want. I'll probably get over it in a few days


Not sure about the Modstream combining the 290 with an 8300 chip. The 290 consumes about the same power as the 780 Ti, probably even less; the 290X is a bit higher.

If you are going for the 290 or 290X, then I recommend the Lightning for air. If you are watercooling, just get a used reference card and a full waterblock. Got my second one for $187 for the 290 and $60 for the full EK waterblock. I was just lucky.


----------



## Roboyto

Quote:


> Originally Posted by *long99x*
> 
> Hi everyone, which is the best bios for asus r9 290dc2oc and can I use atiwinflash to flash bios r9 290 ?
> 
> thanks


'best' is a matter of opinion and what you are trying to accomplish.

Bench/OC the card with the stock BIOS and see where you get with that; you may not need to flash the card for good performance.

If you are looking for maximum performance, then the ASUS PT1/PT1T (or some other variant), can help achieve better performance. However AFAIK, they don't allow the card to reduce voltage or downclock so you'd be running max all the time; this can obviously cause the GPU to degrade/fail sooner than normal. I have not experimented with these BIOS's, or flashing in general on my cards, but plenty of other people can tell you all about it.

General 290 info here: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781

BIOS flashing information can be found here: http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread

How are your temperatures on that card, BTW? ASUS used the GTX 780 cooler on the 290(X) and only 3 of 5 heatpipes make contact with the GPU die. Did you buy it recently or used? I have been wondering if they did anything to fix their lazy error.

Quote:


> Originally Posted by *PsYcHo29388*
> 
> Should I pull the plug on getting a 290/290X or wait to see what AMD has to offer this year?


290(X) are a hell of a lot of bang for your buck. They have some drawbacks, but they are solid cards.

The largest draw back is operating temperature on the cards, especially if you plan to overclock and push them to the max. This requires very good, if not exceptional, airflow in your case with a great heatsink on the card...if you're not watercooling them. Most of the heatsinks that are out now can quell the GPU core temperature, but VRM1 will still likely be a problem when air cooled once you start adding voltage/power limit.

If the *rumors* are true for the R9 300 series, then the cards will still have the same TDP ~290W with *supposedly* a ~40% increase in performance; from the last time I looked. The biggest question on my mind is if AMD is going to use a superior VRM setup this time around so they don't get so hot. Hopefully watercooling won't, practically, be a requirement for the new cards.

The choice is yours. If you've gotta have the new hotness, then hold out a little longer...otherwise the 290s are currently a great bargain new, or even better if you'd like to gamble on a used one in the ~$175 range.


----------



## kizwan

Quote:


> Originally Posted by *Siezureboy*
> 
> I think the hawaii chipset this time round for amd is the equivalent of nvidia's 4xx line previously. Those cards consumed considerably on stock as well. Hopefully theyll follow nvidias footsteps and learn a lot to improve for the next gen. Im pretty much keeping a keen eye on amd on what they have to offer.


Well I hope they follow Nvidia's footsteps on the low power consumption ONLY. I don't want my next AMD cards to have segmented VRAM.


----------



## long99x

Quote:


> Originally Posted by *Roboyto*
> 
> 'best' is a matter of opinion and what you are trying to accomplish.
> 
> Bench/OC the card with the stock BIOS and see where you get with that; you may not need to flash the card for good performance.
> 
> If you are looking for maximum performance, then the ASUS PT1/PT1T (or some other variant), can help achieve better performance. However AFAIK, they don't allow the card to reduce voltage or downclock so you'd be running max all the time; this can obviously cause the GPU to degrade/fail sooner than normal. I have not experimented with these BIOS's, or flashing in general on my cards, but plenty of other people can tell you all about it.
> 
> General 290 info here: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781
> BIOS flashing information can be found here: http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread
> 
> How are your temperatures on that card, BTW? ASUS used the GTX 780 cooler on the 290(X) and only 3 of 5 heatpipes make contact with the GPU die. Did you buy it recently or used? I have been wondering if they did anything to fix their lazy error.


Thanks for your answer.








On stock I played Metro Redux for 2 hours; my room temp is about 25°C. GPU 1 temp max 73, VRM1 max 76, VRM2 max 65; GPU 2 temp max 66, VRM1 65, VRM2 61, fan at 80% speed.
I read that the DC2 heatsink is worse before I bought this card, but at my place this card is cheapest. I bought used; they look like new.
By best BIOS I mean best for temperature and lowest voltage without underclocking.

I read that post and followed the commands but nothing changed. How can I change LLC for my VDDC controller? Thanks again.


----------



## Roboyto

Quote:


> Originally Posted by *kizwan*
> 
> Well I hope they follow Nvidia footstep on the low power consumption ONLY. I don't want my next AMD cards have segmented VRAM.


Not to defend Nvidia's shady dealings with the 970, but the 980 doesn't have the segmented VRAM issue.

There shouldn't be any issue with bandwidth on the 300 series since they will be packing the stacked HBM with a 4096-bit bus...Bandwidth for DAYSSSSSSSSSSSS
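For what it's worth, the rumored numbers pencil out like this. A sketch assuming the 4096-bit bus and the roughly 1 Gbps per pin that were the first-gen HBM figures floating around at the time:

```python
# Theoretical memory bandwidth from bus width and per-pin data rate.
# The HBM figures are the rumored/assumed specs, not confirmed ones.
bus_width_bits = 4096
per_pin_gbps = 1.0                       # assumed first-gen HBM rate

bandwidth_gb_s = bus_width_bits * per_pin_gbps / 8
print(f"Rumored HBM bandwidth: {bandwidth_gb_s:.0f} GB/s")   # 512 GB/s

# Compare with Hawaii (290X): 512-bit bus at 5 Gbps effective GDDR5
hawaii_gb_s = 512 * 5 / 8
print(f"R9 290X bandwidth: {hawaii_gb_s:.0f} GB/s")          # 320 GB/s
```

So if the rumor holds, that's a jump from 320 GB/s to 512 GB/s of raw bandwidth, i.e. bandwidth for days indeed.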









Quote:


> Originally Posted by *long99x*
> 
> thanks for your answer
> 
> 
> 
> 
> 
> 
> 
> 
> on stock I play metro redux for 2 hours, my room temp is about 25*C, gpu 1 temp max 73, vrm 1 max 76, vrm 2 max 65, gpu 2 temp max 66, vrm 1 65, vrm 2 61, fan 80% speed
> I read the dc2 heatsink is worst before i bought this card but at my place the card is cheapest, I bought used, they look like new
> Best bios I mean best for temperature and lowest voltage without underclock


You're welcome.

Temps are solid, especially with Xfire









I can't give you a good answer to suggest a BIOS to reduce your voltage. If I were to take a guess, I would presume a reference BIOS would have lower voltage than the DC2... but someone else is going to have to chime in on this one.


----------



## GOLDDUBBY

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Ive still got my new 3930k as a spare it has done a 5.4ghz val when I benched it a bit on my spare RIVE and 290 combo


My 3930K is a true gamer's CPU at a 4860MHz 24/7 clock, 1.455 vcore in Windows.

....

Any word on when the 3-series is coming?


----------



## Zelx0

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> My 3930k is a true gamers cpu at 4860mhz 24/7 clock 1.455vcore in win.
> 
> ....
> 
> Any word on when the 3-series is coming?


I'm not sure if there's any official date, but I'm betting on Q2/Q3 2015, likely June.


----------



## Arizonian

Quote:


> Originally Posted by *Spectre-*
> 
> @arizonian can you remove me from the list
> 
> got a 970 since my 290x pooped on me


Done.








Quote:


> Originally Posted by *cosita88*
> 
> hello
> This is my Sapphire 290 reference.
> With a AX3 and two sinks for VRMs the Zalman brand.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I remove excess of thermal pad
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sorry for my English .Thanks


Congrats - added















Quote:


> Originally Posted by *GOLDDUBBY*
> 
> -snip-
> ....
> Any word on when the 3-series is coming?


Nothing, except SK Hynix said it would start mass production of HBM memory in the first quarter of this year. Hint, hint at a possible time frame.

End of Feb I'll have a 4790K built; I'm selling the 780 Ti and hoping to put an AMD 390X in that box. Very interested in HBM memory.


----------



## Forceman

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> Any word on when the 3-series is coming?


In the investor conference call, the AMD CEO said 2Q. So no sooner than April, but it kind of seemed from the way she said it that it might be later in 2Q.


----------



## PsYcHo29388

Quote:


> Originally Posted by *rdr09*
> 
> not sure about the modstream combining the 290 with a 8300 chip. the 290 consumes about the same power as the 780Ti prolly even less. the 290x is a bit higher.
> 
> if you are going for the 290 or 290X, then i recommend the Lightning for air. if you are watercooling, just get a used reference and a full waterblock. got my second one for $187 for the 290 and $60 for the full ek waterblock. i was just lucky.


Nah, I'm not looking into water cooling any time soon, and with the new case I've got good airflow now, so just about any old cooler will most likely do. With that said, I will probably go MSI or Gigabyte if I do get a card. Personally, I don't think I'll have any problems with this PSU if I got a new card; maybe if I did crossfire, but I don't plan on doing that either.
Quote:


> Originally Posted by *Roboyto*
> 
> 290(X) are a hell of a lot of bang for your buck. They have some drawbacks, but they are solid cards.
> 
> The largest draw back is operating temperature on the cards, especially if you plan to overclock and push them to the max. This requires very good, if not exceptional, airflow in your case with a great heatsink on the card...if you're not watercooling them. Most of the heatsinks that are out now can quell the GPU core temperature, but VRM1 will still likely be a problem when air cooled once you start adding voltage/power limit.
> 
> If the *rumors* are true for the R9 300 series, then the cards will still have the same TDP ~290W with *supposedly* a ~40% increase in performance; from the last time I looked. The biggest question on my mind is if AMD is going to use a superior VRM setup this time around so they don't get so hot. Hopefully watercooling won't, practically, be a requirement for the new cards.
> 
> The choice is yours. If you've gotta have the new hotness, then hold out a little longer...otherwise the 290s are currently a great bargain new, or even better if you'd like to gamble on a used one in the ~$175 range.


Yep, the price point for used 290(X) cards is very sweet IMO, and that's why I was thinking about getting one, but like you mentioned: power consumption and heat. I don't plan on doing any overclocking, so heat shouldn't be an issue, but power consumption is everything to me. If I can pay $50-$100 more for a card straight up (e.g. a GTX 970) but it consumes 80-120 fewer watts, then I save more money in the long run on the power bill.
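That trade-off is easy to sanity-check. A hedged sketch of the break-even math; the price gap, watts saved, hours per day, and electricity rate below are assumed example numbers, not measurements:

```python
# Break-even on paying more up front for a lower-power card.
# All inputs are assumed example figures; plug in your own numbers.

def breakeven_hours(extra_cost_usd, watts_saved, usd_per_kwh=0.12):
    """Hours of load needed before power savings cover the price gap."""
    usd_saved_per_hour = watts_saved / 1000 * usd_per_kwh
    return extra_cost_usd / usd_saved_per_hour

hours = breakeven_hours(extra_cost_usd=75, watts_saved=100)
print(f"{hours:.0f} hours of load to break even "
      f"(~{hours / (4 * 365):.1f} years at 4 h/day)")
```

With those assumed numbers it takes thousands of hours of sustained load to recoup the price difference, so whether the efficiency premium pays off depends heavily on your electricity rate and how hard the card runs.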


----------



## mus1mus

This is Funny


----------



## Roboyto

Quote:


> Originally Posted by *PsYcHo29388*
> 
> Yep, the price point for used 290(x) cards is very sweet IMO and that's why I was thinking about getting one but like you mentioned, power consumption and heat. I don't plan on doing any overclocking so heat shouldn't be an issue but power consumption is everything to me, If I can pay $50-$100 more for a card straight up (ex. GTX 970) but it consumes 80-120 less watts then I save more money in the long run on the power bill.


The 290(X), if cooled properly, is more efficient than most think. I ran my 290, with an AIO cooler on it, off a 450W Gold PSU for 9 months without any problems whatsoever. The card had a mild OC of 1075/1375 +37mV. That little PSU also powered a 4770K @ 4.3GHz with 8GB of 2133 RAM, 2 AIO pumps, 3 fans, 1 SSD, 2 HDDs, and an ODD.

If you don't plan on overclocking the card, then one of the 290(X) with a good cooler on it would work well for you. Keeping the GPU/VRM temps cool can improve the efficiency of the card a fair amount. LegitReviews saw a 44W decrease in power consumption when using the Kraken G10: http://www.legitreviews.com/nzxt-kraken-g10-gpu-water-cooler-review-on-an-amd-radeon-r9-290x_130344/5


----------



## JourneymanMike

Quote:


> Originally Posted by *PsYcHo29388*
> 
> Nah I'm not looking into water cooling any time soon, and with the new case I've got good airflow now so just about any old cooler will do most likely. With that said, I will probably go MSI or Gigabyte if I do get a card. Personally, I don't think I'll have any problems with this PSU if I got a new card, maybe if I did crossfire, but I don't plan on doing that either.
> Yep, the price point for used 290(X) cards is very sweet IMO, and that's why I was thinking about getting one, but like you mentioned, power consumption and heat. I don't plan on doing any overclocking, so heat shouldn't be an issue, but power consumption is everything to me. If I can pay $50-$100 more for a card up front (e.g. a GTX 970) but it consumes 80-120 fewer watts, then I save more money in the long run on the power bill.


If you were to water cool your system, temps would come way down, depending on the water cooling you are using...

My 2x 290s were 53°C idle (I think) and 94°C all-out on stock cooling; with water cooling, 29°C idle and 56°C all-out...

You may want to consider WC.

And Yes, I know you said you don't want to water cool at this time...


----------



## Spectre-

Quote:


> Originally Posted by *Roboyto*
> 
> The 290(X), if cooled properly, is more efficient than most think. I had no issues whatsoever running my 290, with an AIO cooler on it, on a 450W Gold PSU for 9 months. The card had a mild OC of 1075/1375 +37mV. That little PSU also powered a 4770K @ 4.3GHz with 8GB of 2133 RAM, 2 AIO pumps, 3 fans, 1 SSD, 2 HDDs, and an ODD.
> 
> If you don't plan on overclocking the card, then one of the 290(X) with a good cooler on it would work well for you. Keeping the GPU/VRM temps cool can improve the efficiency of the card a fair amount. LegitReviews saw a 44W decrease in power consumption when using the Kraken G10: http://www.legitreviews.com/nzxt-kraken-g10-gpu-water-cooler-review-on-an-amd-radeon-r9-290x_130344/5


Yep, on my 290X, once I had the G10 installed my power draw went from 300ish to 250W (if GPU-Z shows me the right numbers).


----------



## sinnedone

I'm going to chime in on the cooler VRM discussion. I have 2 reference Sapphire R9 290s that by themselves couldn't muster more than 1050-1060 on the core without crashing (no added voltage, just +50 power limit). Mind you, this is with the stock reference blower going 60-80 percent fan speed and the VRMs in the mid 70s, as well as the core temp. I put waterblocks on them and they're doing 1100MHz on the core in Crossfire like nothing. So it does seem that even if your VRMs are within spec, you can get more out of your card the cooler you run it. Just for reference, the cards were at 1100/1250 at 50°C max on the core and 43°C max on the VRMs, with no added voltage, just +50 power limit.


----------



## GOLDDUBBY

Quote:


> Originally Posted by *mus1mus*
> 
> This is Funny


Funny cos it's true!


----------



## long99x

Quote:


> Originally Posted by *Roboyto*
> 
> You're welcome.
> 
> Temps are solid, especially with Xfire
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can't give you a good answer to suggest a BIOS to reduce your voltage. If I were to take a guess, I would presume a reference BIOS would have lower voltage than the DC2...but someone else is going to have to chime in on this one.


Thank you.

And what about LLC? I used this command but nothing changed:







Quote:


> /ai4,30,38,7f


----------



## tahirahmed

Hello everyone,

This is undeniably the biggest R9 290/290X thread I have seen, so I'll ask for advice here rather than anywhere else. I have had a Sapphire R9 290 Tri-X for about six months and I am very satisfied with it, but I am thinking about getting another for a Crossfire setup, as they are quite cheap now, though I have no prior dual-card experience, whether SLI or CF. So far I have done quite a bit of research on the internet and found mixed opinions: some say CF setups are absolutely worth it and bring improvements to almost every game, while others say they only bring issues. The reviews say XDMA CF solved all the major issues and works with almost every game out of the box.

My question, however, is: does a CF setup overall bring a good experience or a bad one? Also, does it work with newer games easily, or does it require a lot of tweaking/settings to make it work? My major concern is a few upcoming games, especially The Witcher 3, whose developers recently mentioned that a single R9 290 will be good enough only for med-high settings at 1080p.

Also, how's your experience with Nvidia-favoring titles like Far Cry 4, AC Unity, Watch Dogs, Shadow of Mordor, Skyrim, etc.? I tried finding videos on YouTube but didn't find much; does that mean CF doesn't work with these games?

Please help me out in this decision.


----------



## pshootr

Quote:


> Originally Posted by *tahirahmed*
> 
> Hello everyone,
> 
> This is undeniably the biggest R9 290/290X thread I have seen, so I'll ask for advice here rather than anywhere else. I have had a Sapphire R9 290 Tri-X for about six months and I am very satisfied with it, but I am thinking about getting another for a Crossfire setup, as they are quite cheap now, though I have no prior dual-card experience, whether SLI or CF. So far I have done quite a bit of research on the internet and found mixed opinions: some say CF setups are absolutely worth it and bring improvements to almost every game, while others say they only bring issues. The reviews say XDMA CF solved all the major issues and works with almost every game out of the box.
> 
> My question, however, is: does a CF setup overall bring a good experience or a bad one? Also, does it work with newer games easily, or does it require a lot of tweaking/settings to make it work? My major concern is a few upcoming games, especially The Witcher 3, whose developers recently mentioned that a single R9 290 will be good enough only for med-high settings at 1080p.
> 
> Also, how's your experience with Nvidia-favoring titles like Far Cry 4, AC Unity, Watch Dogs, Shadow of Mordor, Skyrim, etc.? I tried finding videos on YouTube but didn't find much; does that mean CF doesn't work with these games?
> 
> Please help me out in this decision.


Hello, welcome to OCN







I have not run CF yet either, so I can't really be of help in that area. But I'm sure some of our members will do their best to answer your questions. I will be following along, because I have basically the same questions in mind.


----------



## wermad

Quote:


> Originally Posted by *tahirahmed*
> 
> Hello everyone,
> 
> This is undeniably the biggest R9 290/290X thread I have seen, so I'll ask for advice here rather than anywhere else. I have had a Sapphire R9 290 Tri-X for about six months and I am very satisfied with it, but I am thinking about getting another for a Crossfire setup, as they are quite cheap now, though I have no prior dual-card experience, whether SLI or CF. So far I have done quite a bit of research on the internet and found mixed opinions: some say CF setups are absolutely worth it and bring improvements to almost every game, while others say they only bring issues. The reviews say XDMA CF solved all the major issues and works with almost every game out of the box.
> 
> My question, however, is: does a CF setup overall bring a good experience or a bad one? Also, does it work with newer games easily, or does it require a lot of tweaking/settings to make it work? My major concern is a few upcoming games, especially The Witcher 3, whose developers recently mentioned that a single R9 290 will be good enough only for med-high settings at 1080p.
> 
> Also, how's your experience with Nvidia-favoring titles like Far Cry 4, AC Unity, Watch Dogs, Shadow of Mordor, Skyrim, etc.? I tried finding videos on YouTube but didn't find much; does that mean CF doesn't work with these games?
> 
> Please help me out in this decision.


It typically comes down to the resolution and the game. Some games work great on both camps, and some run like crap on either. Also, multiple monitors and 4K really need more than one GPU for that extra grunt. I would recommend looking at the 295X2 reviews (essentially two 290Xs in Crossfire). There are a few of them already out testing WQHD and 4K.

Unfortunately, it's never going to be a perfect scenario for whichever camp you choose. I prefer the price/performance factor of AMD, and also the ability to do up to six monitors in Eyefinity (if I ever need it). Nvidia has really priced itself out of my budget, but I'm quite happy with AMD so far (my previous setup was quad 7970 Lightnings). Also, the driver issues I was plagued with in the past have never resurfaced with AMD. My last Nvidia setup (tri 780s), by contrast, ran into some driver issues (especially with older games). I have not run any of the newer games on this triple 290 setup, but I'll be finishing it up soon. I'm undecided on going 4K or Eyefinity WQHD.


----------



## Red1776

Quote:


> Originally Posted by *tahirahmed*
> 
> Hello everyone,
> 
> This is undeniably the biggest R9 290/290X thread I have seen, so I'll ask for advice here rather than anywhere else. I have had a Sapphire R9 290 Tri-X for about six months and I am very satisfied with it, but I am thinking about getting another for a Crossfire setup, as they are quite cheap now, though I have no prior dual-card experience, whether SLI or CF. So far I have done quite a bit of research on the internet and found mixed opinions: some say CF setups are absolutely worth it and bring improvements to almost every game, while others say they only bring issues. The reviews say XDMA CF solved all the major issues and works with almost every game out of the box.
> 
> My question, however, is: does a CF setup overall bring a good experience or a bad one? Also, does it work with newer games easily, or does it require a lot of tweaking/settings to make it work? My major concern is a few upcoming games, especially The Witcher 3, whose developers recently mentioned that a single R9 290 will be good enough only for med-high settings at 1080p.
> 
> Also, how's your experience with Nvidia-favoring titles like Far Cry 4, AC Unity, Watch Dogs, Shadow of Mordor, Skyrim, etc.? I tried finding videos on YouTube but didn't find much; does that mean CF doesn't work with these games?
> 
> Please help me out in this decision.


Hi There,

I will first give you my qualifications on the subject and practice of Crossfire: I have built at least one quad-CF machine for my own use every year since 2007.

Here are a few of them



CF tech has come a long way. Driver support for CF is fantastic these days and the scaling is very good, often approaching 100%. CF is supported by nearly every major title, and the ones that don't support it are not in need of it.
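Scaling figures like "approaching 100%" are just the multi-GPU frame rate compared against an ideal linear multiple of the single-card result. A minimal sketch (the FPS numbers are hypothetical, not benchmark results from the thread):

```python
def scaling_efficiency(fps_single, fps_multi, n_gpus):
    """Fraction of ideal linear scaling an n-GPU setup achieves."""
    return (fps_multi / fps_single) / n_gpus

# Hypothetical example: one card at 45 FPS, a CF pair at 85 FPS
print(round(scaling_efficiency(45, 85, 2) * 100))  # → 94 (% of ideal)
```

So "near 100%" scaling for a pair means the second card almost doubles the frame rate.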

There is also a myriad of very good tools at one's disposal for OC'ing, tweaking, and syncing CF cards running in tandem, such as Trixx, Afterburner, ASUS Tweak, and Radeon Pro, to mention a few.

Prior to the latest project with AMD, my last project was a 4 x 7970 rig:



I have been working on and just finished two new CF machines in a project called the AMD High Performance Project:

The first is a quad R290X



and the second is an A10 7850 Dual Graphics machine.



I am about to post the project results, if you would like to follow along, on the capability and performance of CF applications. I am currently using my CF machines with triple-monitor 5760x1080 and 4K resolutions. I am a graphics enthusiast, and having the ability to run 'Ultra' settings all the way around is fantastic.

This is the working build log/thread currently. The finished result thread will start new very soon here.

http://www.overclock.net/t/1473361/amd-high-performance-project-by-red1776

It should provide a lot of answers for you about multi-GPU systems and their performance, and what is required in terms of power, cooling, and other issues.

Feel free to PM me with any questions if you like.














Good Luck

Greg


----------



## Gil80

Hi all.

Do you know if there's an existing backplate for the Gigabyte R9 290 Windforce OC edition?

Or are all backplates for reference cards only?


----------



## pshootr

Very nice post, thank you


----------



## disintegratorx

I'm still in the process of rebuilding my system. I think I'll be getting a 4930K CPU to replace my Sandy Bridge, and I already have an ASUS Deluxe X79 board; I just have to save a little bit more to get the CPU.

Everything else is working fine; I'm just waiting for that to pair up with my LCS 290X. Man, that will be a fine day when it happens. Lol. Also thinking about getting a matching set of quad-channel RAM as well...


----------



## tahirahmed

@pshootr

Thank you very much







Yes, that is exactly what I thought. I have been looking for useful information for this decision, and so far this is the only thread where I see a lot of CF users, so this is the best place to ask.

@wermad

Sorry, I forgot to mention my resolution; for the moment I am playing on a single 1080p monitor. For this, 2x R9 290 will be overkill, so I will be switching to 1440p once I get into this. My prime concern is to play The Witcher 3 at 60fps with maxed-out settings; this is perhaps my most anticipated game, so I want to be ready. Unfortunately it is an Nvidia-backed title, so this gives me concerns about its CF support; a lot of Ubisoft games (FC4, ACU, WD) had problems with dual-card setups (even SLI), or so I heard, and that's why I am doing the necessary asking before I buy another card.

I checked several reviews of the R9 290X and 290 in CF, but none of them show anything for games like FC4 and ACU. I want to know what you guys do when a game has bad support for CF. Is there any workaround that allows you to use two cards even if they don't scale too well? It must be better than a single card? I also heard that a setting called "AFR Friendly" works in several games that don't have official CF support.

Do you mean AMD drivers are more stable in CF than Nvidia's SLI drivers? I heard SLI is generally better than CF and that Nvidia drivers are more stable for dual cards.

@Red1776

Hi, and wow, those are more CF setups than I have ever seen; you certainly are a CF lover









I would love to try something like quad cards, but that's something I cannot afford, and I want to experience the thing with two cards first, if it's not too difficult to get into. As you said, the driver support is now great for CF; does this mean CF works with most games out of the box? I am asking because AMD includes CF profiles in driver releases, and as far as I know AMD takes a bit of time before rolling out a new driver, so does that mean we have to wait a long time for working CF profiles for new games?

I will certainly be following your new project, and I am interested in the results and scaling of 4 cards. Do you test using the latest games or only benchmark suites? Also, I use MSI AB; is that alone enough to control a CF setup correctly?

Sorry for so many questions at once.


----------



## wermad

Quote:


> Originally Posted by *tahirahmed*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> @pshootr
> 
> Thank you very much
> 
> 
> 
> 
> 
> 
> 
> Yes, that is exactly what I thought. I have been looking for useful information for this decision, and so far this is the only thread where I see a lot of CF users, so this is the best place to ask.
> 
> @wermad
> 
> Sorry, I forgot to mention my resolution; for the moment I am playing on a single 1080p monitor. For this, 2x R9 290 will be overkill, so I will be switching to 1440p once I get into this. My prime concern is to play The Witcher 3 at 60fps with maxed-out settings; this is perhaps my most anticipated game, so I want to be ready. Unfortunately it is an Nvidia-backed title, so this gives me concerns about its CF support; a lot of Ubisoft games (FC4, ACU, WD) had problems with dual-card setups (even SLI), or so I heard, and that's why I am doing the necessary asking before I buy another card.
> 
> I checked several reviews of the R9 290X and 290 in CF, but none of them show anything for games like FC4 and ACU. I want to know what you guys do when a game has bad support for CF. Is there any workaround that allows you to use two cards even if they don't scale too well? It must be better than a single card? I also heard that a setting called "AFR Friendly" works in several games that don't have official CF support.
> 
> Do you mean AMD drivers are more stable in CF than Nvidia's SLI drivers? I heard SLI is generally better than CF and that Nvidia drivers are more stable for dual cards.
> 
> @Red1776
> 
> Hi, and wow, those are more CF setups than I have ever seen; you certainly are a CF lover
> 
> I would love to try something like quad cards, but that's something I cannot afford, and I want to experience the thing with two cards first, if it's not too difficult to get into. As you said, the driver support is now great for CF; does this mean CF works with most games out of the box? I am asking because AMD includes CF profiles in driver releases, and as far as I know AMD takes a bit of time before rolling out a new driver, so does that mean we have to wait a long time for working CF profiles for new games?
> 
> I will certainly be following your new project, and I am interested in the results and scaling of 4 cards. Do you test using the latest games or only benchmark suites? Also, I use MSI AB; is that alone enough to control a CF setup correctly?
> 
> Sorry for so many questions at once.


My last experience with Nvidia was in the summer of '13 (triple GTX 780s). Beyond two cards, AMD has generally had better scaling than Nvidia. In two-card situations I give the slight edge to Nvidia, especially in games that were optimized for them from the get-go.

As I mentioned, it's give or take with games; it's just the way it is. If the titles you play most frequently run better on Nvidia, I would switch, tbh. BioShock is a title that typically favors Nvidia, but it's not a bad experience with AMD; I had loads of fun playing through that in 5x1 Eyefinity with quad 7970s, something you can't do with Nvidia. I never had a single issue as long as I kept the game on medium settings (6000x1920). Over time, games do start improving with newer driver releases. Nvidia does tend to decrease its efforts at improving game performance in newer driver releases for the older generations of cards (i.e. GTX 7xx & 6xx). It's a grumbling I hear frequently, especially when AMD is optimizing older cards for newer games.

My main reason for switching from AMD to Nvidia was the dreaded drivers and crappy reference coolers (Cayman & Tahiti). It was really bad; I had to abandon two attempts to put quad-Crossfire setups together. Eventually, after some time and after learning a lot about DisplayPort 1.2 and MST hubs, I took the plunge with AMD again, and I haven't had a solid reason to leave them.

If you plan to switch, now is the time. There may be a slight increase in the used market for the 970 for the obvious reasons, and you don't want to have depreciation hit further once PI is launched. I was thinking of getting a fourth card, but I really need to push 4K or 5x1 to really make four cards work. Otherwise, like many have reported, quad is not much more beneficial than two or three cards (scaling). These Sapphires with the Tri-X coolers and factory OC still sell strong (I bought mine back in November): I bought the two new ones for ~$260 each and the used one for $220. Add "no mining" and it typically sells quicker.

Good luck


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *mus1mus*
> 
> This is Funny


That is pure gold ......

You'd think he could afford a top denture at least, being an 'engineer' and all that

I will never go back to green things. This proves they think we are all a bunch of mugs, ready for them to false-advertise their products and rip us off without a care in the world........








Quote:


> Originally Posted by *tahirahmed*
> 
> Hello everyone,
> 
> This is undeniably the biggest R9 290/290X thread I have seen so I'll ask for advice here rather than anywhere else. I have Sapphire R9 290 Tri-X from like six months and I am very satisfied with it but I am thinking about getting another as they are quite cheap now for a Crossfire setup though I have no dual card experience before whether in SLI or CF. So far I did quite a bit of research on internet and found mixed opinions about it, some say CF setups are absolutely worth it and bring improvements to almost every game while some say it only bring issues. The reviews say XDMA CF solved all the major issues and work with almost every game out of the box.
> 
> My question however is does CF setup overall bring a good experience or bad experience ? also does it work with newer games easily or require a lot of tweaking/settings to make it work ? My major concern are few upcoming games, specially The Witcher 3 which recently mentioned that a single R9 290 will be good enough only for med-high settings at 1080p.
> 
> Also how's your experience with Nvidia favoring titles like Farcry 4, AC Unity, Watch Dogs, Shadow of Mordor, Skyrim etc ? I tried finding videos on Youtube but didn't find much, does it mean CF doesn't work with these games ?
> 
> Please help me out in this decision.


CF is like one big 8GB card. Even better if it's w/blocked.......









I noticed that nearly all the water-cooled build pics posted here (e.g. quadfire / multi-rad setups and the like) don't use quick disconnects


----------



## Spectre-

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> That is pure gold ......
> 
> You'd think he could afford a top denture at least, being an 'engineer' and all that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will never go back to green things. This proves they think we are all a bunch of mugs, ready for them to false-advertise their products and rip us off without a care in the world........
> 
> 
> 
> 
> 
> 
> 
> 
> CF is like one big 8GB card. Even better if it's w/blocked.......


Besides the whole 3.5GB thing, I was impressed by the 970.

But then my 970 does 1600 on air, so that's my spot of luck.

Tbh, after being on the red team for so long, I can't wait for my 290X to come back from RMA.

I'll probs trade my 970 for a 780 Ti and then sell the Ti.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Spectre-*
> 
> Besides the whole 3.5GB thing, I was impressed by the 970.
> 
> But then my 970 does 1600 on air, so that's my spot of luck.
> 
> Tbh, after being on the red team for so long, I can't wait for my 290X to come back from RMA.
> 
> I'll probs trade my 970 for a 780 Ti and then sell the Ti.


1600MHz on air WoW









Lots of cooling and volts required to bench those power-hungry suckers past 1390MHz LoooL


----------



## Spectre-

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 1600mhz on air WoW
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Lots of cooling and volts required to bench those suckers past 1390mhz LoooL


Oh, I meant 1600 boost.

It's pretty stable at 1540ish in games, but in benchmarks I can do low 1600s easily.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Spectre-*
> 
> Oh, I meant 1600 boost.
> 
> It's pretty stable at 1540ish in games, but in benchmarks I can do low 1600s easily.


LooooL









Haven't been keeping up with the Joneses as of late. Still, 1540MHz is impressive, but I would rather have the 300 series.

Learnt my lesson about green things with the 760-series BIOS and stuff like that.

Doing full-time study ATM, re-edumacating oneself doing certs and stuff.........


----------



## Spectre-

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> LooooL
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Haven't been keeping up with the Joneses as of late. Still, 1540MHz is impressive, but I would rather have the 300 series.
> 
> Learnt my lesson about green things with the 760-series BIOS and stuff like that.
> 
> Doing full-time study ATM, re-edumacating oneself doing certs and stuff.........


Lol

I'll be grabbing a 7970 soon for some HWBot points.

That should be fun.

The R9 300 should definitely be my next upgrade as well.

Looking forward to another PT1-like BIOS thing for the 300 series.


----------



## Red1776

Quote:


> Originally Posted by *tahirahmed*
> 
> @pshootr
> 
> Thank you very much
> 
> 
> 
> 
> 
> 
> 
> Yes, that is exactly what I thought. I have been looking for useful information for this decision, and so far this is the only thread where I see a lot of CF users, so this is the best place to ask.
> 
> @wermad
> 
> Sorry, I forgot to mention my resolution; for the moment I am playing on a single 1080p monitor. For this, 2x R9 290 will be overkill, so I will be switching to 1440p once I get into this. My prime concern is to play The Witcher 3 at 60fps with maxed-out settings; this is perhaps my most anticipated game, so I want to be ready. Unfortunately it is an Nvidia-backed title, so this gives me concerns about its CF support; a lot of Ubisoft games (FC4, ACU, WD) had problems with dual-card setups (even SLI), or so I heard, and that's why I am doing the necessary asking before I buy another card.
> 
> I checked several reviews of the R9 290X and 290 in CF, but none of them show anything for games like FC4 and ACU. I want to know what you guys do when a game has bad support for CF. Is there any workaround that allows you to use two cards even if they don't scale too well? It must be better than a single card? I also heard that a setting called "AFR Friendly" works in several games that don't have official CF support.
> 
> Do you mean AMD drivers are more stable in CF than Nvidia's SLI drivers? I heard SLI is generally better than CF and that Nvidia drivers are more stable for dual cards.
> 
> @Red1776
> 
> Hi, and wow, those are more CF setups than I have ever seen; you certainly are a CF lover
> 
> I would love to try something like quad cards, but that's something I cannot afford, and I want to experience the thing with two cards first, if it's not too difficult to get into. As you said, the driver support is now great for CF; does this mean CF works with most games out of the box? I am asking because AMD includes CF profiles in driver releases, and as far as I know AMD takes a bit of time before rolling out a new driver, so does that mean we have to wait a long time for working CF profiles for new games?
> 
> I will certainly be following your new project, and I am interested in the results and scaling of 4 cards. Do you test using the latest games or only benchmark suites? Also, I use MSI AB; is that alone enough to control a CF setup correctly?
> 
> Sorry for so many questions at once.


No problem. I am benching everything: synthetic benchmarks, games, and all. AB is my choice for GPU setup/control.

The AMD driver team has improved immensely over the last two years, and they usually have a CF driver on day one; occasionally you have to wait for the next beta. They are getting traction, and the inside track, with devs signing up to get involved with Mantle.

I think most will be surprised at the scaling across all four cards at higher resolutions. The third and fourth cards are no longer a poor ROI. This is something AMD has over team green. I am rather excited to present this project; there are a lot of surprises coming.





Greg


----------



## Roboyto

Quote:


> Originally Posted by *Spectre-*
> 
> Besides the whole 3.5GB thing, I was impressed by the 970.
> 
> But then my 970 does 1600 on air, so that's my spot of luck.
> 
> Tbh, after being on the red team for so long, I can't wait for my 290X to come back from RMA.
> 
> I'll probs trade my 970 for a 780 Ti and then sell the Ti.

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 1600MHz on air WoW
> 
> Lots of cooling and volts required to bench those suckers past 1390MHz LoooL

This was my overall impression of the 970 as well.

I sold the Zotac I had because I thought I could get a better-performing 970, due to its crappy RAM clocks...so I bought an MSI Gaming.

I only briefly had the MSI 970, due to the 3.5GB issue, since I'll have a 4K TV sooner rather than later...it hit the exact same RAM speed as the Zotac, with a tiny improvement in core. However, I must say the Twin Frozr V is *downright stupendous*. At idle, where the fans are *off*, the card sat in the low 40s. With the voltage/power maxed, it would boost to 1564. The temp was 65°C with the fans at 47%, which was as quiet as the fans being off.

The performance these cards generate for the power they consume is still impressive.

I returned the MSI 970 and picked up an MSI 980. Same story here for the TF-V cooler...amazing/silent, 68°C at full load with power/voltage maxed...even at 100% fan speed the noise is ridiculously unobtrusive and drops the already-great temps another 8-10°C. The 980 allows a bit more power/voltage than the 970, with +87mV and 22% power without touching the BIOS. But Maxwell doesn't seem to respond to voltage/power very well...unless it's just the 3 cards I have had.

This 980 hits 1420MHz core, which translates to 1553MHz boost with stock power/voltage and 1584MHz boost with the power slider at 22%; maxing the voltage to +87mV gives me another 17MHz, to 1601MHz, allowing me to break 13k in Firestrike without touching the RAM. Of course my luck would be that this card has the worst RAM of any GPU I've ever owned, as it struggles to make even a 100MHz jump; it only completed one Firestrike run with +100 on the RAM.

I was quite pleased with the performance, despite the atrocious RAM speeds...until I went to put the lid on my 250D. The 980's power connectors are in the worst position and interfere with the removable ODD tray, which in turn interferes with the top going on. I can force it in, or Dremel away some material...but I really don't want to do that...decisions...I may just return the 980 as well and wait for AMD to strike, to see what Nvidia's response is. A fixed 970, a 970 Ti, a 980 Ti...and where will that put the prices for these cards? I want the HTPC to be a capable 4K machine, and realistically that requires a newer offering or a pair of cards...might as well just play the waiting game.


----------



## Roy360

Do you guys have trouble resuming from idle?

At first I thought it was my overclocks, so I disabled all of them. Running stock CPU and RAM.

But I'm still greeted by random black screens when I try to resume my PC.

Sleep is disabled. No Screen Savers. Monitor is set to turn off in 5 mins.

Can't tell if this is a GPU, CPU, or PSU problem.

I'm running 3 R9 290s and using MSI Afterburner to sync the clock speeds to match the first card: 1000/1260.

Card 1: ASUS OC DirectCU 1000/1260
Card 2: Reference XFX 947/1250
Card 3: Reference VisionTek 977/1250

No voltage changes; each card is using its own settings, with the VisionTek card using the most.

Using 14.3, but I've had the black screen problem for a while.


----------



## Roboyto

Quote:


> Originally Posted by *Roy360*
> 
> Do you guys have trouble resuming from idle?
> 
> At first I thought it was my overclocks, so I disabled all of them. Running stock CPU and RAM.
> 
> But I'm still greeted by random black screens when I try to resume my PC.
> 
> Sleep is disabled. No Screen Savers. Monitor is set to turn off in 5 mins.
> 
> Can't tell if this is a GPU, CPU, or PSU problem.
> 
> I'm running 3 R9 290s and using MSI Afterburner to sync the clock speeds to match the first card: 1000/1260.
> 
> Card 1: ASUS OC DirectCU 1000/1260
> Card 2: Reference XFX 947/1250
> Card 3: Reference VisionTek 977/1250
> 
> No voltage changes; each card is using its own settings, with the VisionTek card using the most.
> 
> Using 14.3, but I've had the black screen problem for a while.


It had happened to me on occasion previously, but I haven't had the issue in quite some time, running a single 290. You may want to consider 14.9 or Omega.


----------



## Forceman

Quote:


> Originally Posted by *Roboyto*
> 
> It had happened to me on occasion previously, but I haven't had the issue in quite some time; running single 290. You may want to consider 14.9 or Omega.


Same with me; the newer drivers seem to have fixed it. Disabling ULPS (even though it's not supposed to matter for a single card, I guess) also seemed to make it happen less often.


----------



## demitrisln

Hello all,
I have a question. I just bought my second Sapphire R9 290X with an EK full waterblock. I'm wondering which EK SLI connector I need to get to connect these together? Below is a link to my motherboard's page, if that helps.

http://www.gigabyte.com/products/product-page.aspx?pid=4672#ov

and I think this is the connector I need....

http://www.frozencpu.com/products/10970/ex-blc-757/EK_FC_Bridge_Dual_Serial_-_SLI_Connection_EK-FC_Bridge_DUAL_Serial.html?tl=g57c645s2061


----------



## Widde

Quote:


> Originally Posted by *demitrisln*
> 
> Hello all,
> I have a question: I just bought my second Sapphire R9 290X with an EK full waterblock, and I'm wondering which EK SLI connector I need to link these together. Below is a link to my motherboard's page if that helps.
> 
> http://www.gigabyte.com/products/product-page.aspx?pid=4672#ov
> 
> and I think this is the connector I need....
> 
> http://www.frozencpu.com/products/10970/ex-blc-757/EK_FC_Bridge_Dual_Serial_-_SLI_Connection_EK-FC_Bridge_DUAL_Serial.html?tl=g57c645s2061


Isn't that one for the old AMD 5xxx series cards and the Nvidia 200 series?


----------



## demitrisln

Yeah it is... Crap


----------



## Gil80

Quote:


> Originally Posted by *Gil80*
> 
> Hi all.
> 
> Do you know if there's an existing back plate to the Gigabyte R9 290 Windforce OC edition?
> 
> Or are all backplates for reference cards only?


Bump


----------



## Gil80

AMD is reducing the prices of 290/290X in response to Nvidia's 970 memory design issue.
I love their ad!



4GB is 4GB







The copywriter is a genius!!!


----------



## kizwan

Quote:


> Originally Posted by *demitrisln*
> 
> Hello all,
> I have a question: I just bought my second Sapphire R9 290X with an EK full waterblock, and I'm wondering which EK SLI connector I need to link these together. Below is a link to my motherboard's page if that helps.
> 
> http://www.gigabyte.com/products/product-page.aspx?pid=4672#ov
> 
> and I think this is the connector I need....
> 
> http://www.frozencpu.com/products/10970/ex-blc-757/EK_FC_Bridge_Dual_Serial_-_SLI_Connection_EK-FC_Bridge_DUAL_Serial.html?tl=g57c645s2061


Which terminal/bridge you need depends on which EK waterblocks you have:

http://www.ekwb.com/shop/blocks/vga-blocks/ati-radeon-full-cover-blocks/radeon-rx-200-series.html
http://www.ekwb.com/shop/blocks/vga-blocks/multiple-block-connectivity.html


----------



## Mega Man

Quote:


> Originally Posted by *Gil80*
> 
> AMD is reducing the prices of 290/290X in response to Nvidia's 970 memory design issue.
> I love their ad!
> 
> 
> 
> 4GB is 4GB
> 
> 
> 
> 
> 
> 
> 
> the copywriter is genius!!!


On some forums the Nvidia fanboys keep posting stuff to the effect of "this is in poor taste". I think they are referring to the taste in their mouths.


----------



## Gil80

I'm pretty sure Nvidia will get its chance to do the same when there's a big issue with AMD cards.

I'm not a fanboy of either brand; I'm just happy we have this competition between the two. That's how we get better prices. We need a third player, though.


----------



## ForNever

Quote:


> Originally Posted by *Mega Man*
> 
> on some forums the nvidia fanboys keep posting stuff to the extent of " this is poor taste " i think they are referring to the taste in their mouths


Poor taste??? They ripped off their customer base, then said, "aw shucks ya caught us...sooooorry". Defending that is really another level of pathetic.


----------



## Mega Man

Meh, doesn't Nvidia do it every generation?


----------



## chiknnwatrmln

Having a problem. For some reason all of my games stutter when I have vsync on. The problem is not present when it's off.

When I get the stutter, it's very consistent: it happens once per second, and the GPU usage on both cards drops, then goes right back up. It's almost as if, when games are locked to 60 fps, one frame every second is entirely skipped. It ruins the whole point of vsync.

Normally this isn't a problem because I don't like vsync, but for the Metro games vsync is a must because 75 fps feels like 15 sometimes.

Does anyone know how to fix this? I tried forcing vsync and triple buffering through a CCC profile, but they do not apply at all in-game.


----------



## Widde

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Having a problem. For some reason all of my games stutter when I have vsync on. The problem is not present when it's off.
> 
> When I get the stutter, it's very consistent: it happens once per second, and the GPU usage on both cards drops, then goes right back up. It's almost as if, when games are locked to 60 fps, one frame every second is entirely skipped. It ruins the whole point of vsync.
> 
> Normally this isn't a problem because I don't like vsync, but for the Metro games vsync is a must because 75 fps feels like 15 sometimes.
> 
> Does anyone know how to fix this? I tried forcing vsync and triple buffering through ccc profile but they do not apply at all ingame.


I've had problems in the past with vsync and audio stuttering. I never made sense of it and didn't think much of it since I don't use vsync, but as far as I'm aware it's been there since the start.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Gil80*
> 
> Hi all.
> 
> Do you know if there's an existing back plate to the Gigabyte R9 290 Windforce OC edition?
> 
> Or are all backplates for reference cards only?


No.
You would think there would be.
Why don't you Google it?








Quote:


> Originally Posted by *ForNever*
> 
> Poor taste??? They ripped off their customer base, then said, "aw shucks ya caught us...sooooorry". Defending that is really another level of pathetic.
> Quote:
> 
> 
> 
> Originally Posted by *HOMECINEMA-PC*
> 
> That is pure gold ......
> 
> You'd think he could afford a top denture at least, being an 'engineer' and all that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will never go back to green things , *this proves that they think that we are all a bunch of mugs , ready to false advertise their products and rip us off without a care in the world* ........
> 
> 
> 
> 
> 
> 
> 
> 
> CF is like one big 8gb card . Even better if its w/blocked .......
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Noticed that nearly all water cooled build pics posted here eg : Quadfire / multi rad setups and the like don't use quick disconnects

I wonder if there could be a possibility of a court case. In 'stralia, companies can be taken to court over false advertising and making misleading/false product claims.


----------



## om3nz

Hey everyone. Just got the r9 290.







Please add me to the club.

http://www.techpowerup.com/gpuz/details.php?id=8hyh5

Manufacturer: Sapphire
Cooling: VAPOR X


----------



## hyp36rmax

Quote:


> Originally Posted by *om3nz*
> 
> Hey everyone. Just got the r9 290.
> 
> 
> 
> 
> 
> 
> 
> Please add me to the club.
> 
> http://www.techpowerup.com/gpuz/details.php?id=8hyh5
> 
> Manufacturer: Sapphire
> Cooling: VAPOR X


Welcome aboard!


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Widde*
> 
> I've had problems in the past with vsync and audio stuttering. I never made sense of it and didn't think much of it since I don't use vsync, but as far as I'm aware it's been there since the start.


Weird issue, my audio is fine (perhaps because I have a dedicated sound card) but my video stutters. Super annoying.


----------



## BradleyW

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Weird issue, my audio is fine (perhaps because I have a dedicated sound card) but my video stutters. Super annoying.


I don't use Vsync, but when I have tried it on the 290s, I get the same issue as you do. It looks like Vsync has been broken for the past 4 months; go back and check all the release notes for CCC, and they all mention Vsync as a fixed or known issue. It's something they are having trouble with at the moment, and some users are feeling it.

Have you tried running an fps limiter alongside Vsync? For example, if your refresh rate is 60Hz, set the fps limiter to 60. I know it makes no sense at this point, but please do try it.
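For the curious, a frame limiter like RTSS basically paces the render loop against a fixed per-frame time budget. A minimal conceptual sketch (the real thing hooks the game's present call; the 60 fps cap and 30-frame run here are just illustrative numbers):

```python
import time

def run_frames(frame_count, fps_cap, render=lambda: None):
    """Render frames, sleeping so the loop never exceeds fps_cap."""
    interval = 1.0 / fps_cap
    next_deadline = time.perf_counter()
    for _ in range(frame_count):
        render()                       # draw the frame (a no-op stand-in here)
        next_deadline += interval
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)      # wait out the rest of this frame's slot

start = time.perf_counter()
run_frames(30, 60)                     # 30 frames capped at 60 fps
elapsed = time.perf_counter() - start  # should take roughly 0.5 s
```

Pacing every frame to the same budget is why a limiter can smooth things out even with Vsync on: the GPU never races ahead and then stalls on the swap.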

Does anyone know when we can expect the 15.2 CCC Beta with FC4 and Dying Light CFX profiles? Many users on Guru3D claim this week. AMDMatt, the rep on Guru3D, said there will be CFX profiles for FC4 and Dying Light, plus a CFX fix for AC Unity.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *BradleyW*
> 
> I don't use Vsync, but when I have tried it on the 290's, I get the same issue as you do. Looks like Vsync has been broken for the past 4 months. Go back and check all the release notes for CCC. They all mention Vsync as a fixed or known issue. It's something they are having trouble with at the moment and some users are feeling this.
> 
> Have you tried running an fps limiter alongside Vsync? For example, if your refresh rate is 60Hz, set the fps limiter to 60. I know it makes no sense at this point, but please do try it.
> 
> Does anyone know when we can expect 15.2 CCC Beta with FC4 and Dying Light CFX profiles? Many users on Guru3d claim this week. AMDMatt, the rep on Guru3d said CFX profiles will be for FC4, Dying Light and CFX fix for AC Unity.


What program do you recommend to limit the FPS? I can never get RadeonPro working correctly, and for some reason applying a profile via CCC does not work for Metro Redux.

I feel like FC4 CF is never coming. The game has been out for so long that I beat it a while ago. Really disappointed it's been this long with only single GPU support.


----------



## BradleyW

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> What program do you recommend to limit the FPS? I can never get RadeonPro working correctly, and for some reason applying a profile via CCC does not work for Metro Redux.
> 
> I feel like FC4 CF is never coming. The game has been out for so long that I beat it a while ago. Really disappointed it's been this long with only single GPU support.


MSI Afterburner's RivaTuner (RTSS) for limiting fps.

Yeah, the FC4 profile is out with the next driver release. I am sick of waiting for it.


----------



## joeh4384

AFR-friendly works pretty well in Far Cry 4, although the audio in cutscenes is completely muffled.


----------



## chiknnwatrmln

I always got spotty GPU usage and stuttering with AFR.

How exactly do you use RivaTuner to limit FPS? I installed it, and the only thing I'm really seeing is editing your monitor's supported resolutions, like CRU.

Got it working. The problem was FRAPS running in the background; vsync is now buttery smooth. Constant 60 fps in Metro Redux in the swamps, with entirely maxed-out settings. Amazing.


----------



## BradleyW

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I always got spotty GPU usage and stuttering with AFR.
> 
> How exactly do you use Rivatuner to limit FPS? I installed it and the only thing I'm seeing really is editing your monitor's supported resolution, like CRU.



Quote:


> Originally Posted by *BradleyW*
> 
> *MSI Afterburner's Rivatuner* for limiting fps.
> 
> Yeah, FC4 profile is out with next driver release. I am sick of waiting for it.


http://event.msi.com/vga/afterburner/download.htm


----------



## Arizonian

If you're using Raptr for gaming, you may want to change your passwords.

http://www.legitreviews.com/raptr-users-urged-change-password-hacking_158129


----------



## chiknnwatrmln

Ty, I didn't have Rivatuner installed, only MSI AB.


----------



## BradleyW

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Ty, I didn't have Rivatuner installed, only MSI AB.












Let me know how ya get on.


----------



## hamzta09

If anyone runs Crossfire on air in a case, please post your temps after ~30 min of gaming at ~22°C indoor temps, along with your card types, the slot spacing between them, and noise.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *BradleyW*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Let me know how ya get on.


I got it working OK, and I started using the OSD via RivaTuner. I disabled FRAPS as a result and found out FRAPS was causing the vsync stuttering.

Now I can run vsync with no problem.


----------



## Arizonian

Quote:


> Originally Posted by *tahirahmed*
> 
> Hello everyone,
> 
> This is undeniably the biggest R9 290/290X thread I have seen so I'll ask for advice here rather than anywhere else. I have Sapphire R9 290 Tri-X from like six months and I am very satisfied with it but I am thinking about getting another as they are quite cheap now for a Crossfire setup though I have no dual card experience before whether in SLI or CF. So far I did quite a bit of research on internet and found mixed opinions about it, some say CF setups are absolutely worth it and bring improvements to almost every game while some say it only bring issues. The reviews say XDMA CF solved all the major issues and work with almost every game out of the box.
> 
> My question however is does CF setup overall bring a good experience or bad experience ? also does it work with newer games easily or require a lot of tweaking/settings to make it work ? My major concern are few upcoming games, specially The Witcher 3 which recently mentioned that a single R9 290 will be good enough only for med-high settings at 1080p.
> 
> Also how's your experience with Nvidia favoring titles like Farcry 4, AC Unity, Watch Dogs, Shadow of Mordor, Skyrim etc ? I tried finding videos on Youtube but didn't find much, does it mean CF doesn't work with these games ?
> 
> Please help me out in this decision.


Welcome to OCN.









I see you already got your answer. We've got a great bunch of members here. Feel free to post a submission for your GPUs and join us.

Quote:


> Originally Posted by *Gil80*
> 
> AMD is reducing the prices of 290/290X in response to Nvidia's 970 memory design issue.
> I love their ad!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 4GB is 4GB
> 
> 
> 
> 
> 
> 
> 
> the copywriter is genius!!!


I can't blame AMD one bit. If the tables were turned it would have been no different; that's all I'll say.








Quote:


> Originally Posted by *om3nz*
> 
> Hey everyone. Just got the r9 290.
> 
> 
> 
> 
> 
> 
> 
> Please add me to the club.
> 
> http://www.techpowerup.com/gpuz/details.php?id=8hyh5
> 
> Manufacturer: Sapphire
> Cooling: VAPOR X


Congrats - added









And for anyone else who hasn't yet submitted proof, please see the OP for submission details and join us.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *hamzta09*
> 
> If anyone runs Crossfire on air in a case, please post your temps after ~30 min of gaming at ~22°C indoor temps, along with your card types, the slot spacing between them, and noise.


Hot and noisy, with toasty VRM and GPU temps... on air, that is, say 70°C plus under load.


----------



## GOLDDUBBY

Quote:


> Originally Posted by *ForNever*
> 
> Poor taste??? They ripped off their customer base, then said, "aw shucks ya caught us...sooooorry". Defending that is really another level of pathetic.


The way Nvidia treats its consumer-grade customer base, they don't deserve customers, tbh.

Milking every dollar and cent from normal people; they should be ashamed of themselves, their poor ethics and their products!

The video was great! *Funny cos it's true.


----------



## YellowBlackGod

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> the way nvidia treats the consumer grade customerbase, they don't deserve customers tbh.
> 
> milking every dollar and cent from normal people, they should be ashamed of themselves, their poor ethics and their products!
> 
> The video was great! *Funny cos it's true.


That's what I understood as well, and I changed from a 780 Ti to the 8GB Vapor-X.


----------



## mus1mus

"Still, it's not a 290 that can melt polar icebergs and pull more watts than the whole of Africa!" they said.

"As if I care!" I said.


----------



## DeviousAddict

Hey guys, I'm after some suggestions please.

I overheated my GPU the other week ( http://www.overclock.net/t/1536791/help-and-advice-required/0_30 )
Now I'm after some suggestions as to what to do next. Do I go ahead and bake my card? Will that actually work?
I can't RMA it because Sapphire's T&Cs say I can't, and since it's my fault I can't go through the supplier for a replacement either.









I don't want to bake it if there is another option I can try 1st.

Cheers guys


----------



## Agent Smith1984

Quote:


> Originally Posted by *DeviousAddict*
> 
> Hey guys, I'm after some suggestions please.
> 
> I overheated my GPU the other week ( http://www.overclock.net/t/1536791/help-and-advice-required/0_30 )
> Now I'm after some suggestions as to what to do next. Do I go ahead and bake my card? Will that actually work?
> I can't RMA it because Sapphire's T&Cs say I can't, and since it's my fault I can't go through the supplier for a replacement either.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't want to bake it if there is another option I can try 1st.
> 
> Cheers guys


The concept of baking a card to fix it after it has overheated is still foreign to me! lol


----------



## DeviousAddict

I think it has something to do with reflowing the solder; that's what I've gathered from reading up on it, anyway.


----------



## Starage

hello

I have an ASUS 290 at 947 and an XFX 290 at 980. Can I flash my ASUS card to 980 on the core so both match?


----------



## Hattifnatten

You could just use Afterburner, and set it to apply the overclock at start-up.


----------



## Starage

I have Sapphire TriXX running.

Also, I just updated to the new Catalyst drivers.

My Samsung T240 24" monitor is native 1920x1200, but now I am able to run 2560x1600.

http://www.directcanada.com/products/?sku=14040MN9260

Will running 2560 x 1600 damage my monitor?


----------



## Hattifnatten

Not at all. I do believe VSR only ever sends the monitor an image at its native resolution: the frame is rendered at the higher set resolution and scaled down by dedicated onboard hardware.
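To illustrate the idea (AMD hasn't published VSR's exact filter kernel, so the plain 2x2 box average below is an assumption for demonstration): the card renders at the higher virtual resolution, then averages blocks of rendered pixels down to one native pixel each, so the monitor only ever receives its native resolution.

```python
def downscale_2x(frame):
    """Average each 2x2 block of a high-res frame into one native pixel."""
    return [
        [
            (frame[y][x] + frame[y][x + 1] +
             frame[y + 1][x] + frame[y + 1][x + 1]) / 4.0
            for x in range(0, len(frame[0]), 2)
        ]
        for y in range(0, len(frame), 2)
    ]

# A 4x4 "virtual" frame of brightness values, scaled to a 2x2 native grid
rendered = [
    [0, 0, 8, 8],
    [0, 0, 8, 8],
    [2, 2, 4, 4],
    [2, 2, 4, 4],
]
native = downscale_2x(rendered)  # [[0.0, 8.0], [2.0, 4.0]]
```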


----------



## coolcole01

Alright guys, I just went and got two ASUS R9 290X cards for $289 apiece; it was impossible to pass up. Any advice for running these beasts in Crossfire? Thanks in advance for any help. I had two HD 7950s and two 6950s in the past, so I'm not completely clueless, but I've had a GTX 780 for the last year. Running at 1440p.


----------



## joeh4384

Quote:


> Originally Posted by *coolcole01*
> 
> Alright guys, I just went and got two ASUS R9 290X cards for $289 apiece; it was impossible to pass up. Any advice for running these beasts in Crossfire? Thanks in advance for any help. I had two HD 7950s and two 6950s in the past, so I'm not completely clueless, but I've had a GTX 780 for the last year. Running at 1440p.


The biggest challenge is keeping them cool.


----------



## coolcole01

I have been running them pretty hard. I have a spot fan pointed at them and a 240mm fan on the side; they've been staying under 82°C.


----------



## Starage

I have two 290s in Crossfire in a CM HAF X case. Idle is 42°C; gaming is 70-85°C.
I am using Sapphire TriXX to control the fan speed.
My cards are the single-blower-fan type, so all the heat exits the back of the case.
Not as hot as I figured they would be.

Good luck.


----------



## ghabhaducha

I've had my first 290X since launch, but I'm finally ready to provide a submission to the club. Thank you for maintaining it!


Spoiler: 1.



GPU-Z Validation Link






Spoiler: 2.



2x XFX R9 290x flashed w/AMD UEFI Bios
Bios Version: 015.041.000.000.003518 (113-C6710100-X01)





Spoiler: 3.



Both under water w/EK-FC R9-290X - Acetal+Nickel


----------



## JuliusCivilis

Just got my new GPU, yay! Very happy with it. Here's the validation to join the club.



Brand: Sapphire 290x

Cooling: Tri-X

I do have a few questions though: how do I stress test this card properly to see if my overclock is stable? I'm up to 1140 on the core and it seems to be stable (Valley Benchmark and gaming). Some people say play 30 minutes of Valley and look for crashes/artifacts. What do you think?

Another question: I want to use stock voltage for now, and what I have seen people do is first find their max core clock, then go back to stock and find their highest memory clock, and then combine the two and see if it is still stable. I have Googled around a bit but can't find any definitive answers. Can one of you help me out?

Thanks!

Edit: One more question if you people don't mind, haha: what is the max VRM 1 and VRM 2 temperature I should accept?


----------



## pshootr

Quote:


> Originally Posted by *JuliusCivilis*
> 
> Just got my new GPU, yeey! Very happy with it. Here the validation to join the club.
> 
> 
> 
> Brand: Sapphire 290x
> 
> Cooling: Tri-X
> 
> I do have a few questions though: how do I stress test this card properly to see if my overclock is stable? I'm up to 1140 on the core and it seems to be stable (Valley Benchmark and gaming). Some people say play 30 minutes of Valley and look for crashes/artifacts. What do you think?
> 
> Another question: I want to use stock voltage for now, and what I have seen people do is first find their max core clock, then go back to stock and find their highest memory clock, and then combine the two and see if it is still stable. I have Googled around a bit but can't find any definitive answers. Can one of you help me out?
> 
> Thanks!
> 
> Edit: One more question if you people don't mind, haha: what is the max VRM 1 and VRM 2 temperature I should accept?


Congrats. The R9 cards are very nice, and I am very happy with mine. 1140 on the core sounds very good for stock voltage; you may have a nice overclocker there.









Valley Benchmark and gaming should do it. From what I've read, it is best to keep the VRMs below 80°C for a long card life; slightly above should not be an issue. I use custom fan settings from TriXX, or you could use fan profiles from MSI Afterburner if needed or desired.


----------



## disintegratorx

Yeah. Here are some supposedly free programs you can use to delay startup programs, btw. Old subject for me, but I've only just found them:
http://www.thewindowsclub.com/set-delay-time-startup-programs-windows

I'm going to use one for MSI AB and see how it works out.

Edit: This post can be scratched, as it can be done from within Windows (with the Task Scheduler).


----------



## Galatian

Hi all,

I purchased two R9 290Xs recently and put water blocks on them. So far I'm pretty happy with them, but when I dropped some settings in Titanfall to get higher FPS, I realized that I sometimes get huge drops in performance. It occurs once (and only once) per map played, and the game becomes a complete slideshow for about 8 seconds. Is this perhaps a problem with PowerTune? Does the driver "think" that one GPU is not needed and powers it completely down? Is there any solution to my problem?


----------



## rdr09

Quote:


> Originally Posted by *Galatian*
> 
> Hi all,
> 
> I purchased two R9 290X recently and put a water block on them. So far I'm pretty happy with them, but when I dropped some settings in Titanfall to get high FPS I realized that sometimes I get huge drops in performance. It occurs once (and only once) per map played and it becomes a complete slideshow for like 8 seconds. Is this perhaps a problem with power tune? Does the driver "think" that one GPU is not needed so it powers it completely down? Is there any solution to my problem?


IIRC, that is one of the games I maxed out with one 290 at stock in 4K; it's not GPU-demanding at all.

Edit: I still play the game in Crossfire with no stutter. Tearing, yes.

Is your ULPS disabled?


----------



## k2blazer

Recently painted my Sapphire 290's white.

EDIT: Sorry about the picture size, resized.


----------



## bluedevil

Quote:


> Originally Posted by *k2blazer*
> 
> Recently painted my Sapphire 290's white.


please resize.......your photos are tiny....


----------



## mAs81

Quote:


> Originally Posted by *bluedevil*
> 
> please resize.......your photos are tiny....


+1 on that..I'm really interested in seeing the paint job on them Vapor-X


----------



## k2blazer

Quote:


> Originally Posted by *k2blazer*
> 
> Recently painted my Sapphire 290's white.
> 
> EDIT: Sorry about the picture size, resized.


There you go.


----------



## mAs81

Great job man, kudos!







they look great in your build!!
Quote:


> Originally Posted by *k2blazer*
> 
> 
> 
> Spoiler: Warning: Spoiler!


Question: in the picture the LEDs are not on?!? Did you paint them too, or what?


----------



## k2blazer

Those cards aren't the Vapor-X edition; they don't have LEDs.


----------



## mAs81

Quote:


> Originally Posted by *k2blazer*
> 
> Those cards aren't the vapor-x edition, they don't have led's.


Really? I thought the Tri-X had an LED in the Sapphire logo too... Sorry, brain freeze I guess.


----------



## k2blazer

Quote:


> Originally Posted by *mAs81*
> 
> Really?I thought that the Tri-X had LED in the Sapphire logo too..Sorry-brain freeze I guess


No worries, I wish man.. they would look awesome.


----------



## Phaster89

Quote:


> Originally Posted by *k2blazer*
> 
> Recently painted my Sapphire 290's white.
> 
> EDIT: Sorry about the picture size, resized.
> 
> 
> Spoiler: Warning: Spoiler!


how loud are those 2 while gaming?


----------



## k2blazer

Quote:


> Originally Posted by *Phaster89*
> 
> how loud are those 2 while gaming?


When I play BF4, I can't hear them... Lol.


----------



## Phaster89

Quote:


> Originally Posted by *k2blazer*
> 
> When I play BF4, I can't hear them... Lol.


It figures: I can't hear my 290 at 80% fan speed when I'm playing Hitman Absolution, for example, but in ARMA 2 and 3 the fan at 80% is really, really annoying.

One more nail in the air-cooling coffin, at least for me.


----------



## Klocek001

Quote:


> Originally Posted by *Phaster89*
> 
> it figures i can't hear my 290 at 80% fan speed when i'm playing hitman absolution for example, but on arma 2 and 3 the fan at 80% is really really annoying
> 
> one more nail in the air cooling coffin, at least for me


80% is the same as 80% no matter the game. It's loud as hell.


----------



## LandonAaron

I'm building my fiancée a gaming computer so we can game together (she's a gamer but has only ever played on consoles). Anyway, I have an R9 290X, and I am getting her one too. I'm getting the MSI Gaming edition, because I don't feel like messing with water cooling for the GPU on this build, but I don't want to deal with the noise from a blower cooler either. I am thinking that when I upgrade down the line, her computer will likely end up with both R9 290Xs in it, so I want to go ahead and get a PSU that can handle that sort of load. I was thinking 750W or 850W, probably EVGA's G2 line. Would this be enough power, or should I go 1000W? The CPU will be an i5-4690K, moderately overclocked, with just a 120mm-rad water cooler.


----------



## tsm106

Quote:


> Originally Posted by *LandonAaron*
> 
> I'm building my fiancée a gaming computer so we can game together (she's a gamer but has only ever played on consoles). Anyway, I have an R9 290X, and I am getting her one too. I'm getting the MSI Gaming edition, because I don't feel like messing with water cooling for the GPU on this build, but I don't want to deal with the noise from a blower cooler either. I am thinking that when I upgrade down the line, her computer will likely end up with both R9 290Xs in it, so I want to go ahead and get a PSU that can handle that sort of load. I was thinking 750W or 850W, probably EVGA's G2 line. Would this be enough power, or should I go 1000W? The CPU will be an i5-4690K, moderately overclocked, with just a 120mm-rad water cooler.


Don't get the Gaming 4G cards. There are so many better choices, like a Lightning or Tri-X or whatever; don't lock yourself in with a terrible PCB. PSU-wise, get the G2 1300. You can get it for around $130 after rebate every other month, it seems. It's more than you need, but I prefer it that way, and it's silly cheap when it's on sale at the Egg.

Looks like the code might still work.

http://www.overclock.net/t/1538028/newegg-rebate-evga-supernova-g2-1300w-130-after-30-rebate-and-code-exlaknt26/0_40


----------



## joeh4384

Quote:


> Originally Posted by *LandonAaron*
> 
> I'm building my fiancée a gaming computer so we can game together (she's a gamer but has only ever played on consoles). Anyway, I have an R9 290X, and I am getting her one too. I'm getting the MSI Gaming edition, because I don't feel like messing with water cooling for the GPU on this build, but I don't want to deal with the noise from a blower cooler either. I am thinking that when I upgrade down the line, her computer will likely end up with both R9 290Xs in it, so I want to go ahead and get a PSU that can handle that sort of load. I was thinking 750W or 850W, probably EVGA's G2 line. Would this be enough power, or should I go 1000W? The CPU will be an i5-4690K, moderately overclocked, with just a 120mm-rad water cooler.


I would go 1000W for Crossfire, but 850W would be enough. The MSI card is OK, but the Tri-X and Lightning are definitely better models.


----------



## Roboyto

Quote:


> Originally Posted by *JuliusCivilis*
> 
> Just got my new GPU, yeey! Very happy with it. Here the validation to join the club.
> 
> 
> 
> Brand: Sapphire 290x
> 
> Cooling: Tri-X
> 
> I do have a few questions though: how do I stress test this card properly to see if my overclock is stable? I'm up to 1140 on the core and it seems to be stable (Valley Benchmark and gaming). Some people say play 30 minutes of Valley and look for crashes/artifacts. What do you think?
> 
> Another question: I want to use stock voltage for now, and what I have seen people do is first find their max core clock, then go back to stock and find their highest memory clock, and then combine the two and see if it is still stable. I have Googled around a bit but can't find any definitive answers. Can one of you help me out?
> 
> Thanks!
> 
> Edit: One more question if you people don't mind, haha: what is the max VRM 1 and VRM 2 temperature I should accept?


Welcome









This should cover your questions: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21880_20#post_22208781

*Core* *>* RAM

The 512-bit bus makes up for the slower RAM speeds.

It is best to find the core clock first, but I wouldn't do the RAM clock completely separately: RAM clocks can affect the voltage necessary for overall stability. Find your core clock, then start working on the RAM with the core overclocked.


----------



## LandonAaron

Quote:


> Originally Posted by *tsm106*
> 
> Don't get the gaming 4g cards. There are so many better choices, like a lightning or trix or whatever. Don't lock yourself in with a terrible pcb. PSU wise, get the G2 1300. You can get it for around 130 AR every other month it seems like. It's more than ya need, I prefer it that way and its silly cheap when its on sale at the egg.
> 
> Looks like the code might still work.
> 
> http://www.overclock.net/t/1538028/newegg-rebate-evga-supernova-g2-1300w-130-after-30-rebate-and-code-exlaknt26/0_40


What specifically is bad about the MSI card? I already ordered it off eBay. I was going to get a new GTX 970, but I found this for $70 less. I have owned an MSI GTX 770 Gaming edition before and was really impressed with it, so I figured going MSI again would be a safe bet, as the cooler looks almost identical.

I found a good deal on a refurbished Corsair AX860. I am thinking of getting that for myself, as it matches my color scheme, and putting my old 1000W EX PSU in my girlfriend's computer. It's over 5 years old at this point, though, and was never anything special to begin with. I am also a little apprehensive about having two overpowered PSUs that are basically overkill in both of our systems. Isn't it inefficient to get a bigger PSU than what you need?


----------



## joeh4384

Quote:


> Originally Posted by *LandonAaron*
> 
> What specifically is bad about the MSI card? I already ordered it off eBay. I was going to get a new GTX 970, but I found this for $70 less. I have owned an MSI GTX 770 Gaming edition before and was really impressed with it, so I figured going MSI again would be a safe bet, as the cooler looks almost identical.
> 
> I found a good deal on a refurbished Corsair AX860. I am thinking of getting that for myself, as it matches my color scheme, and putting my old 1000W EX PSU in my girlfriend's computer. It's over 5 years old at this point, though, and was never anything special to begin with. I am also a little apprehensive about having two overpowered PSUs that are basically overkill in both of our systems. Isn't it inefficient to get a bigger PSU than what you need?


On the 290Xs, the three-fan cards have better cooling than the two-fan cards. I have the MSI 290X and it works fine, but I did have to reapply the thermal paste. I get load temps of 75-81°C at the stock 1030 MHz clock; before repasting, I was getting 83-86°C under load.


----------



## Roboyto

Quote:


> Originally Posted by *LandonAaron*
> 
> What specifically is bad about the MSI card? I already ordered it on eBay. I was going to get a new GTX 970, but I found this for $70 less. I owned a GTX 770 Gaming edition card from MSI before and was really impressed with it, so I figured going MSI again would be a safe bet, as the cooler looks almost identical.
> 
> I found a good deal on a refurbished Corsair AX860. I am thinking of getting that for myself, as it matches my color scheme, and putting my old 1000W EX PSU in my girlfriend's computer. It's over 5 years old at this point, though, and was never anything special to begin with. I am also a little apprehensive about having two overpowered PSUs that are basically overkill in both of our systems. Isn't it inefficient to get a bigger PSU than you need?


These cards run hot; you need an outstanding cooler to have an enjoyable experience. This question arises quite often, and Sapphire didn't slouch. If you were able to get the Twin Frozr V on a 290(X) it might be a different story, but that's not the case. With the Lightning at $329 on Newegg right now, I wouldn't look any further.

I'm no PSU expert, but they are typically most efficient in the ~30-50% load range. Running the PSU at very low or very high load typically shows worse efficiency.
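To make the efficiency point concrete, here is a rough Python sketch of how wall draw follows DC load under an assumed efficiency curve. The percentages and the 850 W rating below are illustrative placeholders, not measurements of any particular unit; check a proper review of your exact PSU for real numbers.

```python
# Rough wall-draw estimate from DC load and an assumed efficiency figure.
def wall_draw(dc_load_w: float, efficiency: float) -> float:
    """Power pulled from the outlet for a given DC load."""
    return dc_load_w / efficiency

# Hypothetical Gold-ish curve: {fraction of rated load: efficiency}.
# Efficiency usually peaks near mid-load and drops at the extremes.
curve = {0.20: 0.87, 0.50: 0.90, 1.00: 0.87}

rated = 850  # W, example unit
for frac, eff in curve.items():
    load = rated * frac
    print(f"{load:.0f} W DC load -> {wall_draw(load, eff):.0f} W at the wall")
```

The difference between 87% and 90% on a few hundred watts is only a handful of watts at the wall, which is why running outside the sweet spot is wasteful but rarely dramatic.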

These cards can run on 'small' PSUs if you're smart about it...

My HTPC ran on a Rosewill Capstone 450W:

4770k @ 4.3GHz

R9 290 1075/1375

1 SSD, 2 HDD, 1 ODD

2 AIO Pumps & 3 fans

The PSU obviously didn't run at its most efficient under high CPU/GPU stress, but it never gave me a single issue gaming or benching over a nine-month period.


----------



## Gobigorgohome

Got the third card sold today; going to try out the MSI Lightning R9 290X and the last Sapphire R9 290X in CrossFire, then. One R9 290X is pretty effective at 1440p but doesn't stand a chance at 4K, so I guess I'll keep playing at 1440p and kick the settings up to max.







Maybe start overclocking them as well.


----------



## LandonAaron

Quote:


> Originally Posted by *Roboyto*
> 
> These cards run hot, you have to have an outstanding cooler for you to have an enjoyable experience. This question arises quite often, and Sapphire didn't slouch. If you were able to get the TwinFrozr V on a 290(X) it might be a different story, but that's not the case. With the Lightning at $329 on NewEgg right now, I wouldn't look any further.
> 
> I'm no PSU expert, but they are typically most efficient in the ~ 30%-50% load range. Running the PSU with very low, or very high, load typically shows worse efficiency.
> 
> These cards can run on 'small' PSUs if you're smart about it...
> 
> My HTPC ran on a Rosewill Capstone 450W:
> 4770k @ 4.3GHz
> R9 290 1075/1375
> 1 SSD, 2 HDD, 1 ODD
> 2 AIO Pumps & 3 fans
> 
> The PSU obviously didn't run at its most efficient under high CPU/GPU stress, but it never gave me a single issue gaming or benching over a nine-month period.


I was able to get this card used (it looks brand new: trace amounts of dust, and it still had some of the clear plastic sticker stuff on it) for $260, so I'm happy with the deal I got. As long as it doesn't throttle, I'm okay with it running a little hot. Before I purchased, I looked at a Tom's Hardware article (http://www.tomshardware.com/reviews/radeon-r9-290-and-290x,3728-7.html) that compared all the third-party R9 290Xs, and the MSI only ran a few degrees higher than the Tri-X (76 vs. 73°C max) and was one decibel louder (43.9 vs. 42.8 dB).

Anyway, I think I am going to go with an 850W PSU, since I will most likely be running a single card most of the time but still want the headroom to run a dual-card configuration if I decide to. I used the power supply calculator here: http://www.extreme.outervision.com/index.jsp, and it said I would need 880W to run two R9 290Xs with my current rig at my current overclock. Pretty cool tool; it lets you enter everything, down to the last fan. Now to decide who gets what: do I give her the new 850W or the 5-year-old 1000W? In other words, what's better, a 5-year-old 1000W BFG or a new 850W EVGA G2? I get whatever's better.
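Calculators like the one linked above essentially just sum per-component draws and add a safety margin. A minimal sketch of the same idea, using made-up illustrative wattages rather than manufacturer specs (an overclocked 290X can pull well past its ~290 W board power, so these are deliberately generous):

```python
# Ballpark system-draw estimate, similar in spirit to online PSU calculators.
# All per-component wattages are rough illustrative figures, not specs.
draws_w = {
    "R9 290X (OC'd) #1": 330,
    "R9 290X (OC'd) #2": 330,
    "CPU (OC'd)": 130,
    "motherboard + RAM": 60,
    "drives + fans + pumps": 50,
}

total = sum(draws_w.values())
headroom = 1.2  # ~20% margin so the PSU isn't pinned near 100% load
print(f"Estimated draw: {total} W; suggested rating: {total * headroom:.0f} W")
```

With numbers in this range, an estimate near the calculator's 880 W figure for two overclocked 290Xs is plausible, and the margin is what pushes recommendations toward 1 kW-class units.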









So my current top picks are a refurbished Corsair AX860, a new EVGA G2 850W, and, just to muddy the water a bit, a Rosewill Capstone 1000-M. JonnyGURU gave the Rosewill a good rating, but there are several Newegg user reviews claiming the PSU died after just a few months of use, so I'm not sure what to think of that.


----------



## Roboyto

Quote:


> Originally Posted by *LandonAaron*
> 
> I was able to get this card used (it looks brand new: trace amounts of dust, and it still had some of the clear plastic sticker stuff on it) for $260, so I'm happy with the deal I got. As long as it doesn't throttle, I'm okay with it running a little hot. Before I purchased, I looked at a Tom's Hardware article (http://www.tomshardware.com/reviews/radeon-r9-290-and-290x,3728-7.html) that compared all the third-party R9 290Xs, and the MSI only ran a few degrees higher than the Tri-X (76 vs. 73°C max) and was one decibel louder (43.9 vs. 42.8 dB).
> 
> Anyway, I think I am going to go with an 850W PSU, since I will most likely be running a single card most of the time but still want the headroom to run a dual-card configuration if I decide to. I used the power supply calculator here: http://www.extreme.outervision.com/index.jsp, and it said I would need 880W to run two R9 290Xs with my current rig at my current overclock. Pretty cool tool; it lets you enter everything, down to the last fan. Now to decide who gets what: do I give her the new 850W or the 5-year-old 1000W? In other words, what's better, a 5-year-old 1000W BFG or a new 850W EVGA G2? I get whatever's better.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So my current top picks are a refurbished Corsair AX860, a new EVGA G2 850W, and, just to muddy the water a bit, a Rosewill Capstone 1000-M. JonnyGURU gave the Rosewill a good rating, but there are several Newegg user reviews claiming the PSU died after just a few months of use, so I'm not sure what to think of that.


You may not see the same results as Tom's Hardware, though. And that doesn't change the fact that running these cards on air requires excellent airflow through your case and directly to the card. If you have that, you'll probably be fine, but most people end up needing to do something to make the card run optimally, especially if they want to overclock.

If you're going to run dual cards, you're probably safer with a 1 kW unit. I wouldn't run new high-end GPU(s) on a 5-year-old PSU.









I dabble in refurb items, but I don't think a PSU would be one of them.

Don't know anything about the Capstone 1000W, but I have the 650W and 450W, which have both been silent and flawless for their two-year run so far in my two rigs. The 650/450 are manufactured by Super Flower, which has a pretty solid reputation for PSUs. If they are behind the 1000W unit, it may not be a bad choice... but there are other folks around who know a lot more about PSUs than I do.

If you have more questions about PSUs, there are other threads and posts to check; these three aren't a bad start:

http://www.overclock.net/t/1441118/290x-psu-power-output-tests/0_20#post_21156921

http://www.overclock.net/t/1482157/700-750-watts-comparison-thread#post_22109815

http://www.overclock.net/t/1438987/1000-1050-watts-comparison-thread#post_21108368


----------



## LandonAaron

@Roboyto Thanks. Yeah, the Capstone 1000M is a Super Flower. I'm kind of thinking of just getting two PSUs, since I recently upgraded my motherboard and CPU after a tragic accident (one that really makes the case for Asus's armor idea) took out my last motherboard. I just feel it's time to replace the PSU, as it is basically the last remaining item besides the optical drive that hasn't been upgraded over the years. I have just never really felt the need, as there is no immediate performance gain to be had from doing so. It's a very important yet utterly boring component.


----------



## sinnedone

So, is it possible for two reference Sapphire 290s to top out around 1050 on the core with stock volts and the stock cooler, then do 1125 on the core, stable at stock voltage, with a full waterblock?

Still checking, but I get no artifacts in games or benchmarks. I've only played BF4 for a little under two hours straight, though, so it could prove unstable in longer gaming sessions.


----------



## rdr09

Quote:


> Originally Posted by *sinnedone*
> 
> So, is it possible on 2 reference Saphire 290's to top out at 1050ish on the core with stock volts and cooler. Then do 1125 on the core stable with stock voltage on full waterblock?
> 
> Still checking but get no artifacts in games or benchmarks but I only played bf4 for little under 2 hours straight so it could prove unstable in longer gaming sessions.


It's silicon. I have not tried it in CrossFire, but individually one of my 290s can do 1150 at stock VDDC and the other 1170, just for benching. Both reference and, yes, they are watered. I never tested on air; I never OC'd on air.

Edit: I think they can do better...

http://www.3dmark.com/3dm/4644282


----------



## tsm106

Quote:


> Originally Posted by *LandonAaron*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Don't get the Gaming 4G cards. There are so many better choices, like a Lightning or Tri-X or whatever. Don't lock yourself in with a terrible PCB. PSU-wise, get the G2 1300. You can get it for around $130 AR every other month, it seems. It's more than you need, I prefer it that way, and it's silly cheap when it's on sale at the Egg.
> 
> Looks like the code might still work.
> 
> http://www.overclock.net/t/1538028/newegg-rebate-evga-supernova-g2-1300w-130-after-30-rebate-and-code-exlaknt26/0_40
> 
> 
> 
> What specifically is bad about the MSI card? I already ordered it on eBay. I was going to get a new GTX 970, but I found this for $70 less. I owned a GTX 770 Gaming edition card from MSI before and was really impressed with it, so I figured going MSI again would be a safe bet, as the cooler looks almost identical.
> 
> I found a good deal on a refurbished Corsair AX860. I am thinking of getting that for myself, as it matches my color scheme, and putting my old 1000W EX PSU in my girlfriend's computer. It's over 5 years old at this point, though, and was never anything special to begin with. I am also a little apprehensive about having two overpowered PSUs that are basically overkill in both of our systems. Isn't it inefficient to get a bigger PSU than you need?

Lots of wannabe PSU experts around here. Anyway, a PSU has to maintain its efficiency rating throughout its power range to get its cert, so it will be efficient regardless of the typical power level or usage. I would get the 1300 because it's cheap and it will run your cards since you're eventually going dual, and run them no matter how hard you OC them. These cards can either sip power or draw it like a monster, anywhere from 800W to 1300W. I prefer more than less, but I wouldn't go below 1000W.

The Gaming PCB is quite random in its part placement. That makes the PCB almost impossible to water-cool, and although you wrote that you probably won't water-cool, IMO it's smarter not to buy a PCB that no one wants, especially later at resale. I suppose it's much ado about nothing since you already purchased it. But if I were buying new, the Lightning is maybe ten bucks more and blows the Gaming card away.


----------



## ghabhaducha

Quote:


> Originally Posted by *LandonAaron*
> 
> What specifically is bad about the MSI card? I already ordered it on eBay. I was going to get a new GTX 970, but I found this for $70 less. I owned a GTX 770 Gaming edition card from MSI before and was really impressed with it, so I figured going MSI again would be a safe bet, as the cooler looks almost identical.
> 
> I found a good deal on a refurbished Corsair AX860. I am thinking of getting that for myself, as it matches my color scheme, and putting my old 1000W EX PSU in my girlfriend's computer. It's over 5 years old at this point, though, and was never anything special to begin with. I am also a little apprehensive about having two overpowered PSUs that are basically overkill in both of our systems. Isn't it inefficient to get a bigger PSU than you need?


I agree with tsm106 here about purchasing the EVGA 1300 G2 for $130. Check out my response to Shilka in this post. I did some quick calculations to find out how much one saves today by buying a smaller PSU vs. an "overkill" one. Depending on how much you pay for power, the price difference over a year between an 89%-efficient and a 92%-efficient unit can be less than one meal at a restaurant.

Don't get me wrong, I respect Shilka's views tremendously, and if I had to pay as much for power as he does, I would definitely buy more appropriate wattage power supplies. However, I feel that given how cheap we can find power supplies off of Slickdeals, etc, and how little many of us in the states pay for power compared to other parts of the world, buying larger units doesn't seem too bad. Especially, when you consider that they will likely run cooler since they have been designed to handle larger loads.

Having said that, I wouldn't spend more than $300 on a PSU just to future-proof myself! A $130 EVGA 1300 G2 or a $110 ($97) Seasonic X-1250, on the other hand, would be a better choice for me than a $90-$120 750-850W 80+ Gold/Platinum unit.
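The "less than one meal" comparison above is easy to reproduce. Here is a minimal sketch; the average load, hours of use, and electricity price are all assumptions you should replace with your own figures:

```python
# Yearly running-cost difference between an 89%- and a 92%-efficient PSU.
# All inputs are assumed values for illustration, not measured data.
dc_load_w = 400        # average DC draw while the rig is in use
hours_per_day = 4      # assumed daily usage
price_per_kwh = 0.12   # USD, assumed US-ish electricity rate

def yearly_cost(efficiency: float) -> float:
    """Annual electricity cost at the wall for a given PSU efficiency."""
    wall_w = dc_load_w / efficiency
    kwh_per_year = wall_w * hours_per_day * 365 / 1000
    return kwh_per_year * price_per_kwh

diff = yearly_cost(0.89) - yearly_cost(0.92)
print(f"Difference: ${diff:.2f} per year")
```

Under these assumptions the gap works out to a few dollars a year, which is consistent with the post's point: at cheap US power rates, efficiency-tier differences rarely justify a big price premium.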


----------



## Mega Man

I am with you, although soon I want to do a single-PSU build, and I will be buying an EVGA 1600W / Super Flower Leadex 1600W or 2000W (I have not decided). Since I normally pay more than $400 for both PSUs, I have no issue paying $400+ for one.


----------



## Arizonian

I was able to land an EVGA G2 1300 for $160, and the rebate made it less. The G2/P2 are solid PSUs and an amazing value at that price; I was impressed with the review tests.

I was caught with my shorts down when I was thinking about CrossFiring my 290X with an 850W PSU in my rig; it kept me from having fun. I promised myself my next PSU would be able to handle two power-hungry graphics cards no matter what. And even if I decide to stick with a single GPU, at least that door is open again in my next build, which I should be done with at the end of the month: a 4790K, and it will be just salivating for AMD to release the 390X, or whatever it may be called. It's peace of mind that I can CrossFire now.









Update, still last post:
Quote:


> Originally Posted by *ghabhaducha*
> 
> I've had my first 290x since launch, but I'm finally ready to provide a submission to the club. Thank you for maintaining it!
> 
> 
> Spoiler: 1.
> 
> 
> 
> GPU-Z Validation Link
> 
> 
> 
> 
> 
> 
> Spoiler: 2.
> 
> 
> 
> 2x XFX R9 290x flashed w/AMD UEFI Bios
> Bios Version: 015.041.000.000.003518 (113-C6710100-X01)
> 
> 
> 
> 
> 
> Spoiler: 3.
> 
> 
> 
> Both under water w/EK-FC R9-290X - Acetal+Nickel


Congrats - added















Quote:


> Originally Posted by *JuliusCivilis*
> 
> Just got my new GPU, yay! Very happy with it. Here's the validation to join the club.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Brand: Sapphire 290x
> 
> Cooling: Tri-X
> 
> I do have a few questions, though: how do I stress-test this card properly to see if my overclock is stable? I'm up to 1140 on the core and it seems to be stable (Valley benchmark and gaming). Some people say play 30 minutes of Valley and look for crashes/artifacts. What do you think?
> 
> Another question: I want to use stock voltage for now, and what I have seen people do is first find their max core clock, then go back to stock and look for their highest memory clock, and then combine the two and see if it is still stable. I have Googled around a bit but can't find any definitive answers. Can one of you help me out?
> 
> Thanks!
> 
> Edit: one more question, if you people don't mind, haha: what is the max VRM1 and VRM2 temperature I should accept?


Congrats - added








Quote:


> Originally Posted by *k2blazer*
> 
> Recently painted my Sapphire 290's white.
> 
> EDIT: Sorry about the picture size, resized.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Please throw a GPU-Z validation link in that post; I'm using it as your submission link in the list. Nice work on the modification. I currently have a white/black theme, and my next rig will as well. I like what you did.


----------



## BuildTestRepeat

Sadly, I will need to be removed, as I sent my card back for RMA. Since the rebate mail-out deadline was going to pass before I got my RMA card back, Newegg refunded me the money. Now I don't know what I want to pick up!


----------



## jidolboy

Hello guys!
I was wondering if I'm able to run a 290/X on my rig. Currently I have a 550W XFX Bronze+ PSU with an i5-4690K OC'd @ 1.3V.
From what I have researched, 550W is possible when the CPU + 290 are running at stock voltage, but that isn't the case for me. I always thought 550W would be enough to run any single GPU, but it doesn't seem like it.







I don't plan on getting another PSU just for this card either.


----------



## wermad

A brand-new G1600 went for $130 on eBay. Pissed I missed it.

Meh, I'll keep riding the V1000.


----------



## Kriant

So, does anyone know whether there is a way to properly sync the clocks of four R9 290X cards? (I know, the general answer is Afterburner or Trixx.) My specific problem is that two of my cards are Hynix-chipped and two are Elpida (and also, two have the stock BIOS with 1000/1250 clocks and two are PowerColor 1030/1250).
Basically, when I select "sync similar cards", it syncs them in pairs, but not all four of them.

I guess I can manually adjust the clocks, but it's a bit of a PITA (FYI, I refer to bread).


----------



## JourneymanMike

Quote:


> Originally Posted by *jidolboy*
> 
> Hello guys!
> I was wondering if I'm able to run 290/x on my rig. Currently, I have 550w XFX Bronze + PSU with i5-4690k OC'd @ 1.3v.
> From what I have researched 550w is possible when CPU + 290 is running at stock voltage but that isn't the case for me. I always thought 550w would be enough to run any single gpu but it doesn't seem like it
> 
> 
> 
> 
> 
> 
> 
> I don't plan on getting another PSU just for this card either.


To put it short... no.

Most PSU calculators recommend at least a 650W PSU.

You can search on this forum or Google for recommendations...

Mike


----------



## Dieselbird

http://www.techpowerup.com/gpuz/details.php?id=wwuzp

Mild overclock on stock cooling keeps it below 140°F; good enough for me.

The first one I bought had bad RAM and I had to RMA it, but this one rocks.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *JourneymanMike*
> 
> To put it short... NO
> 
> Most PSU calculators recommend @ least a 650 Watt PSU
> 
> You can search on this forum or Google for recommendations...
> 
> Mike


It is possible to run a 290/X and an i5 on 550 watts, even with overclocking.

When I had a single GPU, even OC'd with +180 mV or so, my whole rig pulled around 550 watts from the wall, which works out to about 500 watts DC. Whether or not he wants to run his PSU balls to the wall is his decision; if he has a quality brand, it should be OK.

My rig was very similar to his. I was running my 3770K at 1.35V, IIRC, at the time, also with water cooling, so that 500 watts included a pump, ten or so fans, an HDD, a fan controller, and a sound card.

Personally, with that PSU I would not overclock too much; maybe stick with +100 mV or a little less.


----------



## ghabhaducha

Spoiler: Quote



Quote:


> Originally Posted by *Kriant*
> 
> Sooo, does anyone know whether there is a way to properly sync four r9 290x cards clocks? (I know, the general answer is afterburner or trixx), my specific problem is that two of my cards are Hynix chipped, and two are Elpida (aaand also 2 have stock bios with 1000/1250 clocks and two are powercolor 1030/1250).
> Basically when I select to sync similar cards, it syncs them in pairs, but not all 4 of them.
> 
> I guess I can manually adjust clocks, but it's a bit of a pita (fyi, I refer to bread).






I had a similar problem with my two cards (one was an XFX reference, the other an XFX DD), so what I did was just flash the stock AMD 1000/1250 290X UEFI BIOS on both and then sync them in MSI Afterburner. I currently run them at 1100/1350 (one is Elpida).


----------



## Klocek001

Quote:


> Originally Posted by *JourneymanMike*
> 
> To put it short... NO
> 
> Most PSU calculators recommend @ least a 650 Watt PSU
> 
> You can search on this forum or Google for recommendations...
> 
> Mike


Even if you count a 300W TDP for the 290X and 150W for an OC'd i5 (I mean really high clocks, close to 5 GHz), there's still 100W left for the rest of the system. The question is: can the PSU really deliver 550W, and how well does it perform under max load? I don't know what PSU they're talking about, but if it were something like a CM V550S, Super Flower, or Seasonic, I'd say it's alright.

Edit: Okay, I'm a moron, the PSU brand is in the quote. I believe XFX is Seasonic-made, so I think he'll be fine. I don't remember exactly, but that 550W can do something like ~530W on the 12V rail, so I'd go easy on those overclocks.


----------



## rt123

Quote:


> Originally Posted by *Klocek001*
> 
> Even if you count a 300W TDP for the 290X and 150W for an OC'd i5 (I mean really high clocks, close to 5 GHz), there's still 100W left for the rest of the system. The question is: can the PSU really deliver 550W, and how well does it perform under max load? I don't know what PSU they're talking about, but if it were something like a CM V550S, Super Flower, or Seasonic, I'd say it's alright.
> 
> Edit: Okay, I'm a moron, the PSU brand is in the quote. I believe XFX is Seasonic-made, so I think he'll be fine. I don't remember exactly, but that 550W can do something like ~530W on the 12V rail, so I'd go easy on those overclocks.


Did you forget the CPU cooler, case fans, RAM, HDD, SSD...?


----------



## Klocek001

Quote:


> Originally Posted by *Klocek001*
> 
> it's still 100W left for the rest of the system.


----------



## rt123

You still shouldn't run a PSU that close to 100%.
Especially with a cheap XFX 550W.


----------



## JourneymanMike

Quote:


> Originally Posted by *rt123*
> 
> You still shouldn't run a PSU that close to 100%.
> Especially with a cheap XFX 550W.


^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
This

You need some headroom; the PSU calculators recommend at least 650W! That would be just getting by on his system!

Budget Components = Budget Performance


----------



## Mega Man

Quote:


> Originally Posted by *rt123*
> 
> You still shouldn't run a PSU that close to 100%.
> Especially with a cheap XFX 550W.


Quote:


> Originally Posted by *JourneymanMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rt123*
> 
> You still shouldn't run a PSU that close to 100%.
> Especially with a cheap XFX 550W.
> 
> 
> 
> ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
> This
> 
> You need some head room, the PSU calculators recommend at least 650W! That would be getting by on his system!
> 
> Budget Components = Budget Performance

Says who?

Quality > all!
Quote:


> Originally Posted by *Phaedrus2129*
> 
> These enthusiast power supplies we all have (high-end Corsair, Antec, SeaSonic, Enermax, Silverstone, etc.) can take a lot more abuse than you think. I have to tell people this a lot. These are heavy-duty, precision-engineered electron pushers, not tinker toys.
> 
> While you can't trust your average cheap or OEM power supply so much, and you can't trust a generic as far as you can throw it... A high-end enthusiast grade power supply is engineered with massive safety margins. Take the Corsair VX550. That's a CWT PSH, and Rocketfish/Best Buy took that same PSU (w/ minor modifications, nothing important) and rated it as a 700W _and it can hold that rating within ATX specs_.
> 
> When you buy a high-end PSU you aren't just buying reliability and performance, you're buying your headroom right there. Some people say, "Well the TX750 can push 900W, so when I buy it I'm getting a 900W PSU". That defeats the purpose. The purpose is that you can treat it as a 750W PSU and draw that amount from it long-term, for extended periods, while a cheap 750W might eventually break. That's part of the reason for buying a high-end PSU, instead of something just adequate, like an OCZ ModXStream or a Rosewill RV2 or a CM Silent Pro.
> 
> When you buy a high end enthusiast power supply, especially one that I can vouch for, you should know that you're buying into more than just a name. You're buying a machine, and one that's a lot tougher than the typical dreck you might have used before. So don't be afraid to use it for what it's intended for. Forget about "extra headroom", forget about babying your PSU or keeping some massive unnecessary safety margin. Go ahead, get that second Fermi and go wild. Most of you have already paid for that ability, so make the most of it.


with that said
http://www.overclock.net/t/1431929/psu-index-thread/0_100
http://www.overclock.net/t/1436079/xfx-power-supplies-information-thread/0_100

You may have to do your own research to find out whether your PSU can support what you want.


----------



## hyp36rmax

Any solution for CrossFire users whose second GPU sits at 3D clocks on boot or at idle? Currently using the Omega drivers.


----------



## BradleyW

Quote:


> Originally Posted by *hyp36rmax*
> 
> Any solution for crossfire users with the 2nd GPU in 3D clocks on boot or idle? Currently using OMEGA Drivers.


Enable ULPS?

Or force lower clocks without powerplay support in MSI AB.


----------



## hyp36rmax

Quote:


> Originally Posted by *BradleyW*
> 
> Enable ULPS?
> 
> Or force lower clocks without powerplay support in MSI AB.


Thank you! I also found that starting a game and exiting will do the same. Another solution I found: *Link*


----------



## Yorkston

Two questions: What do you use to get accurate voltage readings for what your GPU is running at? Afterburner and GPUz show it bouncing all over the place. And what do you consider to be safe temps for VRM1?


----------



## mAs81

I always use HWiNFO to monitor my system's voltages and temps, and have found it pretty reliable.


Spoiler: Warning: Spoiler!






These cards do tend to run hot, but they also have a high heat tolerance. VRM temperatures don't tend to be a problem until well above 100°C, though it's better to be on the safe side; for performance's sake, try to keep them under 90-95°C.


----------



## JourneymanMike

@MegaMan Says YOU!


----------



## vieuxchnock

*Finally, the sleeving is done for my 3x 290.
http://www.servimg.com/view/17159996/625

My fingers are gone with it.

Paracord is very nice, but it's hard on the fingers.

It looks yellow in the front, but look between the res and the card: it's really green.
One day, I'll buy a good camera.

*


----------



## Roboyto

Quote:


> Originally Posted by *Yorkston*
> 
> Two questions: What do you use to get accurate voltage readings for what your GPU is running at? Afterburner and GPUz show it bouncing all over the place. And what do you consider to be safe temps for VRM1?


Do you have "force constant voltage" checked in AfterBurner? There can be a little fluctuation, but it shouldn't be all over the place. Is the card holding constant clocks and not throttling? Once the core hits 95°C it throttles voltage and clock speeds, and you would see a lot of fluctuation in that scenario.

VRM1 you want to keep under 80°C, maybe 90°C max. The VRMs are rated to run at higher temperatures than this, but for efficiency, stability, overclockability, and longevity, the lower the temperature the better.

Quote:


> Originally Posted by *vieuxchnock*
> 
> *Finally, the sleeving is done for my 3X 290.
> 
> 
> My fingers are gone with.
> 
> Paracord is very nice but it's hard on fingers.
> 
> It looks yellow in the front but look between the res and the card, it's really green.
> One day, I'll buy a good camera.*


----------



## PillarOfAutumn

Quick question: I have two R9 290s in CrossFire, each running at 1000/1300, paired with an i5-3570K OC'd to 4.0 GHz.

My Fire Strike Extreme score is 7251. Is this reasonable?


----------



## Agent Smith1984

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> Quick question. I have 2 R9 290 in XF, each running at 1000/1300. Its paired with an i5-3570k OCed to 4.0ghz
> 
> My Fire Strike Xtreme score is 7251. Is this reasonable?


I think it's okay, but you will see a huge benefit from overclocking the CPU when running CrossFire. You should also have some wiggle room on the video cards, but the CPU will help the most for multi-GPU.

Somewhere in the 4.5-4.8 GHz range should be plenty doable on that chip, and it really opens up the GPUs, especially during gaming. The Fire Strike result is a little skewed by the CPU score, because a low physics score can drop the entire score (not that yours is super low, but my old AMD X6's physics score is higher than what an i5 will do at 4 GHz, just to put it in perspective). As far as the graphics score alone, we'd need to see it...


----------



## PillarOfAutumn

But here's the weird thing...
This is my system with a single 290: http://www.3dmark.com/3dm/2117812

And this is my score with 2 290s: http://www.3dmark.com/3dm/5805670?


----------



## Zelx0

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> But here's the weird thing...
> This is my system with a single 290: http://www.3dmark.com/3dm/2117812
> 
> And this is my score with 2 290s: http://www.3dmark.com/3dm/5805670?


It's not a valid result, so the score ends up being messed up


----------



## chiknnwatrmln

Quote:


> Originally Posted by *JourneymanMike*
> 
> @MegaMan Says YOU!


PSU calculators are known for wildly overestimating how much power a rig needs.

I don't know about the specific PSU he is using, but if it can reliably deliver 550 watts on the 12V rail, then that is enough for an OC'd 290 + i5 rig. I wouldn't recommend running balls to the wall, but with a mild overvoltage on the GPU he should still have a decent amount of headroom.


----------



## rdr09

Quote:


> Originally Posted by *PillarOfAutumn*
> 
> But here's the weird thing...
> This is my system with a single 290: http://www.3dmark.com/3dm/2117812
> 
> And this is my score with 2 290s: http://www.3dmark.com/3dm/5805670?


The CrossFire graphics score is kinda low; mine is about 9K at the stock 947 MHz core. Check your temps (core and VRMs) using HWiNFO64.


----------



## Yorkston

I've gotten my 290 stable at 1200/1300 with +88 mV in Afterburner. Is that good/bad/average for these cards? And how important is a memory OC vs. core? (Elpida memory, sadly.)


----------



## hamzta09

Anyone here got Sapphire 290X Tri-X in Crossfire?

Im going with either that or the 8GB Twin Frozr.

Wondering what the temps are on the Tri-X.


----------



## rdr09

Quote:


> Originally Posted by *Yorkston*
> 
> I've gotten my 290 stable at 1200/1300 with +88mV in Afterburner, is that good/bad/average for these cards? And how important is memory OC vs core? (elpida mem sadly)


that's very good. not sure about the second question.


----------



## QuantumPhyzx

The biggest issue with Firestrike for me is the physics processing. After overclocking my FSB, I noticed a huge increase in score (around 2.4k to be precise). I am guessing that perhaps since the FSB also overclocks memory and northbridge, that this is what benefited me in improving my score. The combined test was running at about 7 fps when my score was hitting 6.5k. After the FSB overclock I was getting about 13-14 fps in the combined test. It's crazy how different it looked too--14 fps actually looked smooth.

http://www.3dmark.com/fs/4005303

Here's my cpuid/hwmonitor/trixx:



I have an FX-8350, ASRock Fatal1ty Killer 990FX, 16gb Trident-X DDR3 1600 (I believe they are running at around 1680 or 1720 right now due to the FSB overclock, can't remember), Samsung 850 250gb, Sapphire R9 290 Tri-X.


----------



## JourneymanMike

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> PSU calculators are known for way overestimating how much power a rig needs..
> 
> I don't know about the specific PSU that he is using but if it can deliver 550 watts on the 12v rail reliably then that is more than enough for an OC'ed 290 + i5 rig. I wouldn't recommend running balls to the wall, but with a mild overvoltage on the GPU he should still have a good amount of headroom.


Hmmm uh, yeah, OK!

I wouldn't chance it... It would be safer to have some headroom

But that' his decision... I just commented on his post and gave my opinion...

Be Cool









Mike


----------



## coolcole01

Been loving these 2 cards 1440p gaming is being very good here is my validation link
http://www.techpowerup.com/gpuz/details.php?id=cs6r6

ASUS Direct CU II r9 290x


----------



## rgrwng

Bought these from Mainframe Customs this afternoon. Money well spent, at the moment.


----------



## venom55520

Quote:


> Originally Posted by *JourneymanMike*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hmmm uh, yeah, OK!
> 
> I wouldn't chance it... It would be safer to have some headroom
> 
> But that' his decision... I just commented on his post and gave my opinion...
> 
> Be Cool
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Mike


I'm gonna have to agree with chiknnwatrmln.

A quality 550W is more than adequate with headroom for an OC and a single 290x. Assuming all components are running at stock, I would bet that the max draw for that system is somewhere around 400. With OC's the MAX and I mean MAX draw he'll probably see is ~490W. Quality power supplies are rated to supply their advertised power 100% of the time continuously and that's why they are quality. Essentially, almost always, the enthusiast power supplies are way overkill for the components they power.


----------



## Klocek001

if he wanted to use a PSU calculator he'd do that, he can type. The difference is that the PSU calculator would tell him he can't use a 290x on a 550W PSU while he certainly can do that on XFX 550W. The fact that XFX is budget don't make it crap, it's certainly better than any 550W overpriced Corsair unit. I can't imagine a single 290x with OC'd i5 drawing more than 500W unless it's 1.5V on the card and 1.5V on the CPU, and this PSU can sustain a ~530W load.


----------



## tsm106

Could one run a puny 550w psu, yea you could. And you could do a lot of other stupid things in life, no one is going to stop you either.


----------



## Bertovzki

Quote:


> Originally Posted by *tsm106*
> 
> Could one run a puny 550w psu, yea you could. And you could do a lot of other stupid things in life, no one is going to stop you either.


Yeah , why would you use a toy to power top gear , of course you can argue that you can run the setup on a low wattage PSU , and there is proof out there that you can , and what a PSU can supply , and how low a wattage a system actually often uses , but equally i have seen the evidence of for eg. 2 x 290 + CPU drawing 1000W from th wall from a 860Ti ,
What is the point in stressing your PSU out running it to its absolute max capability when they are cheap as chips compared to the rest of your system , you just buy one that runs your gear comfortably with some head room for future up grade too ,IMO i would rather be running my PSU in the 50 - 90 power supply range , regardless of where its optimal efficiency sits , thats just my preference.

So IMO don't bother with anything less than a 750W and preferably a 860 or 1000,

My personal opinion for my rig , a i7 4790K , 2 x 290X TriX OC , 8-10 fans , 2x 850 SSD , maybe HDD , a FC5 V3 fan controller , NZXT RGB , numerous led's a D5 pump
I want a 1000 - 1200W PSU , prefer 1200W without question , but the V1000 is short at 170 mm so may compromise and get that as it will be fine.


----------



## hamzta09

Quote:


> Originally Posted by *rgrwng*
> 
> Bought these from Mainframe Customs this afternoon. Money well spent, at the moment.
> 
> 
> 
> [im.png[/img]


That top card must be gasping for air. holy hell


----------



## Roboyto

Quote:


> Originally Posted by *Yorkston*
> 
> I've gotten my 290 stable at 1200/1300 with +88mV in Afterburner, is that good/bad/average for these cards? And how important is memory OC vs core? (elpida mem sadly)


Core is way more important than RAM on these cards. If it doesn't go any further on the memory you aren't missing that much performance with the 512-bit bus.

See link in my sig for 290(X) info. Within that post there is some info on stock RAM clocks compared to very high @ ~1700.

Quote:


> Originally Posted by *venom55520*
> 
> I'm gonna have to agree with chiknnwatrmln.
> 
> A quality 550W is more than adequate with headroom for an OC and a single 290x. Assuming all components are running at stock, I would bet that the max draw for that system is somewhere around 400. With OC's the MAX and I mean MAX draw he'll probably see is ~490W. Quality power supplies are rated to supply their advertised power 100% of the time continuously and that's why they are quality. Essentially, almost always, the enthusiast power supplies are way overkill for the components they power.


I concur. I ran a 290/4770k off of a 450W for 9 months without any issues whatsoever. Rosewill(Super Flower) Capstone 450M Gold Modular. PowerColor 290 w/ Kraken OC 1075/1375 +37mV, 4770k @ 4.3GHz 1.25V/1.812 VRIN, 8GB Dominator 2133 @ 1.5V, SSD, 2 HDD, 2 AIOs and 3 fans.

I've been using this PSU for 2 years now, and it has seen it's share of different hardware. CPUs: Phenom II X6 stock, FX-8150 stock, i5 3570k 4.5GHz, i7 3770k 4.5GHz, i7 4770k 4.3GHz, and now i7 4790k 4.7GHz..and the GPUs: GT640, HD6770, HD6790, HD7770, GTX670, R9 290, and now GTX 970.

Never once has it given me a problem and is always dead silent. It powers my HTPC, which is my resident BD ripper, and has never flinched with 90-100% CPU load for rips.

My main rig running the Capstone 650M. 4770k 4.5GHz 1.26V, 16GB Dominator Platinum 2400 1.65V, R9 290 1200/1500 +87mv for gaming and ~1300/~1700 +200mV depending on bench, with 3 SSDs, 1 HDD, 1 MCP-655, and 6 fans. Same story, never an issue.

If the 290(X) are properly cooled and you're not running maximum voltage/clocks then you can get away with a smaller PSU; attaching an AIO to has been known to drop power consumption at stock ~40W. You must obviously be mindful of your other hardware though. http://www.legitreviews.com/nzxt-kraken-g10-gpu-water-cooler-review-on-an-amd-radeon-r9-290x_130344/5

This was posted a ways back, and is listed in the OP Useful Info Section:

http://www.overclock.net/t/1441118/290x-psu-power-output-tests/0_20#post_21156921

It shows a 4770k @ 4.8GHz 1.46V and Sapphire 290X @ 1200/1600 and flashed with an ASUS BIOS which is likely increasing the power draw. It shows 600W max system draw while running Furmark and 473W for Unigine/3DMark. With less extreme overclocks/voltages it is easy to reason that the power draw would undoubtedly be a fair bit less.

What's the bottom line here? It's divided into a few tiers IMO...


You want to run your hardware at/very near stock clocks and aren't pushing your hardware 24/7, then you can use a smaller high quality/efficiency PSU and you won't likely have any problems.
You want to run your hardware with a decent overclock and like to push your hardware a bit, then you will want to stick with/very near the guidelines that are set forth by the manufacturer; This is 750W IIRC.
You want to run your hardware at its maximum capable speeds/voltages/etc, then you will want to do your homework and buy what is appropriate for your specific needs.

IMO the recommendation of 750W is in place to cover crappy/tired PSUs and the majority of other hardware configurations, to ensure an enjoyable gaming experience. 750W should be able to cover any typical CPU/MoBo/RAM and a single 290(X) without issues.


----------



## Levys

Hello, I was wondering how many amperes I should have for cf R9 290 + FX 8350 + CVF-Z + D5 Photon Waterpump etc.
look in my sig. if you will. I can get this psu dirt cheap : http://dealer.huntkey.com/en/product/p-68-419.html

I can find enough on wattage use , but not so much about amperes.
also do these 4 rails stack up to 80 amps? or do i need like 40 amps per rail ?
Or am I safe using only 2 of these rails and does it split the amps 2x40amp
Or one cable from all four ( 2 rails per card ? )

Help please


----------



## Jabba1977

Added second 290x Lightning...(by air), happy!!!, this cards are beast....



http://www.3dmark.com/3dm/5818355?

Best Regards.


----------



## hamzta09

Quote:


> Originally Posted by *Jabba1977*
> 
> Added second 290x Lightning...(by air), happy!!!, this cards are beast....
> 
> 
> 
> http://www.3dmark.com/3dm/5818355?
> 
> Best Regards.


temps in lets say BF4 or any other xfire taxing games?


----------



## ConnorMcLeod

Hi,

When a log a whole heaven bench, at default setting (VTX3D R9 290 X-Edition 975MHz / 1250MHz) i have more than 10% of log entries that are at 866MHz and the rest of them at 975MHz, i've tried to set max power to 50% (with AMD Overdrive) but it doesn't affect this thing.
Is this a problem ? Is there anyway to improve it ?


----------



## jagdtigger

Its not OC-ed so its not normal if its downclocking, add a +20 mV VDDC offset and run it again(when i OC-ed my 290x and the voltage wasn't enough for the given clock it gave me similar symptomes)...


----------



## Chopper1591

Finally got time to add my gpu(290 tri-x) to the loop.
Temps are very nice IMO.

Single loop:
XSPC dual bay res/d5 top combo // w. swiftech mcp-655(setting 5 atm)
UT60 360
XTC 140.1
EK Supremacy Evo
EK 290x Acetal FC w. EK backplate
Fujipoly Ultra vrm pads, ek ram pads.

Comparison shots, Valley Extreme preset(~15 minutes):
1200 1500 +150 through Trixx.

Stock, with 100% fixed fan.


Water.


----------



## ConnorMcLeod

Quote:


> Originally Posted by *jagdtigger*
> 
> Its not OC-ed so its not normal if its downclocking, add a +20 mV VDDC offset and run it again(when i OC-ed my 290x and the voltage wasn't enough for the given clock it gave me similar symptomes)...


Do i need msi afterburner for that ? only have max power in AMD Overdrive.

Also, is Valley more stressfull than Heaven ?

Edit : Same is occuring with +20mV

18% 866MHz / 82% 975MHz

When i set 1120Hz : 40% 995MHz / 60% 1120Hz


----------



## Esperante

I have a question for you guys. My 290 used to have a DVI cable hooked up until I fried it, so now I'm using an HDMI cable. The gpu when not being used used to go into sleep modes and the fans stopped. I like this. It no longer does this with HDMI. Is this normal, any way to fix it?


----------



## kizwan

Quote:


> Originally Posted by *ConnorMcLeod*
> 
> Hi,
> 
> When a log a whole heaven bench, at default setting (VTX3D R9 290 X-Edition 975MHz / 1250MHz) i have more than 10% of log entries that are at 866MHz and the rest of them at 975MHz, i've tried to set max power to 50% (with AMD Overdrive) but it doesn't affect this thing.
> Is this a problem ? Is there anyway to improve it ?


Quote:


> Originally Posted by *ConnorMcLeod*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jagdtigger*
> 
> Its not OC-ed so its not normal if its downclocking, add a +20 mV VDDC offset and run it again(when i OC-ed my 290x and the voltage wasn't enough for the given clock it gave me similar symptomes)...
> 
> 
> 
> Do i need msi afterburner for that ? only have max power in AMD Overdrive.
> 
> Also, is Valley more stressfull than Heaven ?
> 
> Edit : Same is occuring with +20mV
> 
> 18% 866MHz / 82% 975MHz
> 
> When i set 1120Hz : 40% 995MHz / 60% 1120Hz
Click to expand...

Using Afterburner, create 2D & 3D profiles.


----------



## Arizonian

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Sadly i will need removed as i sent my card back for RMA, and being the rebate period for me to mail out was going to be after i got my RMA card back newegg refunded me the $. Now i dont know what i wanna pickup!


Sorry to hear that. Hope you have better luck with your next GPU you choose.
Quote:


> Originally Posted by *Dieselbird*
> 
> http://www.techpowerup.com/gpuz/details.php?id=wwuzp
> 
> mild overclock on stock cooling, keeps it below 140* good enough for me
> 
> first one I bought had the bad ram, had to RMA but this one rocks


Congrats - added








Quote:


> Originally Posted by *vieuxchnock*
> 
> *Finally, the sleeving is done for my 3X 290.*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> *http://www.servimg.com/view/17159996/625*
> 
> 
> *
> My fingers are gone with.
> 
> Paracord is very nice but it's hard on fingers.
> 
> It looks yellow in the front but look between the res and the card, it's really green.
> One day, I'll buy a good camera.
> 
> *


Congrats - updated









BTW ... AMAZING work.








Quote:


> Originally Posted by *coolcole01*
> 
> Been loving these 2 cards 1440p gaming is being very good here is my validation link
> http://www.techpowerup.com/gpuz/details.php?id=cs6r6
> 
> ASUS Direct CU II r9 290x


Congrats - added









Quote:


> Originally Posted by *rgrwng*
> 
> Bought these from Mainframe Customs this afternoon. Money well spent, at the moment.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *Jabba1977*
> 
> Added second 290x Lightning...(by air), happy!!!, this cards are beast....
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/5818355?
> 
> Best Regards.


Congrats - added


----------



## jagdtigger

Quote:


> Originally Posted by *ConnorMcLeod*
> 
> Do i need msi afterburner for that ? only have max power in AMD Overdrive.
> 
> Also, is Valley more stressfull than Heaven ?
> 
> Edit : Same is occuring with +20mV
> 
> 18% 866MHz / 82% 975MHz
> 
> When i set 1120Hz : 40% 995MHz / 60% 1120Hz


Thats strange, tried Heaven 4.0 on my OC-ed 290X and its maxing out the GPU. If its not overheating then i think its RMA time...


----------



## ConnorMcLeod

Quote:


> Originally Posted by *kizwan*
> 
> Using Afterburner, create 2D & 3D profiles.


Didn't help at all, used default settings for 2D, and OC +20mV for 3D, result is exactly the same.
Automatic profil management doesn't seem to work though, on windows gpuz reports OC value, and if i load 2D profil, when i launch Uningine, default clock is detected.

Also, is there anyway to fix uningine reporting wrong temp ? seems like it's reading float numbers and using it as integers.


----------



## Agent Smith1984

Quote:


> Originally Posted by *ConnorMcLeod*
> 
> Do i need msi afterburner for that ? only have max power in AMD Overdrive.
> 
> Also, is Valley more stressfull than Heaven ?
> 
> Edit : Same is occuring with +20mV
> 
> 18% 866MHz / 82% 975MHz
> 
> When i set 1120Hz : 40% 995MHz / 60% 1120Hz


You may find that the power limit increase on AB doesn't actually work..
I can set mine to 50% in AB, and check CCC and it's at 0%....

Not sure why, but it's always been that way for me. I have to use a buggy trixx for overclocking instead....
With my card's factory OC, it will power throttle at anything less than 20%, so I had throttling right out of the bag until I realized this......


----------



## Chopper1591

Nobody saw my post?
Post #34715

It seems like my power usage is actually lower now I have the card under water.
Anybody here seen the same? Maybe that the vrm's are more efficient because they are a LOT cooler.


----------



## mAs81

Finally,I got my Vapor-X cooler working again and did some benchmarks @ stock clocks..
Valley @ extreme HD :


Spoiler: Warning: Spoiler!






Valley @ 2560X1440


Spoiler: Warning: Spoiler!






FireStrike :
http://www.3dmark.com/3dm/5829919?

All in all I'm very pleased to have my card back








I control the fans with my fan controller and after doing the benchmarks one after the other , my temps were great!!


----------



## ConnorMcLeod

Just tested with my sapphire R9 290 tri-x and i don't have the problem at all, i've set +20mV and 1030MHz (default is 957) and frequency is stable during the whole bench (and temps 10°C cooler).


----------



## tsm106

Quote:


> Originally Posted by *Chopper1591*
> 
> Nobody saw my post?
> Post #34715
> 
> It seems like my power usage is actually lower now I have the card under water.
> Anybody here seen the same? Maybe that the vrm's are more efficient because they are a LOT cooler.


That's good, means your cooling is working. Lower temps equal lower consumption. higher efficiency etc.


----------



## Forceman

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You may find that the power limit increase on AB doesn't actually work..
> I can set mine to 50% in AB, and check CCC and it's at 0%....
> 
> Not sure why, but it's always been that way for me. I have to use a buggy trixx for overclocking instead....
> With my card's factory OC, it will power throttle at anything less than 20%, so I had throttling right out of the bag until I realized this......


I have seen the same issue. Normally setting the power limit to 50% in CCC makes it stick, and then you can use AB to overclock.


----------



## Chopper1591

Quote:


> Originally Posted by *tsm106*
> 
> That's good, means your cooling is working. Lower temps equal lower consumption. higher efficiency etc.


Ok, nice.

So in theory, if I put the card under LN2 I can lower the voltage even more?


----------



## hamzta09

Question how come there are like zero Crossfire reviews out there for the 290 series?
I mean for non-Ref models.


----------



## Maracus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You may find that the power limit increase on AB doesn't actually work..
> I can set mine to 50% in AB, and check CCC and it's at 0%....
> 
> Not sure why, but it's always been that way for me. I have to use a buggy trixx for overclocking instead....
> With my card's factory OC, it will power throttle at anything less than 20%, so I had throttling right out of the bag until I realized this......


I had the same problem with AB. Setting the power limit to +50 back to 49 then +50 seemed to get it to work.


----------



## Arizonian

Quote:


> Originally Posted by *Chopper1591*
> 
> Finally got time to add my gpu(290 tri-x) to the loop.
> Temps are very nice IMO.
> 
> Single loop:
> XSPC dual bay res/d5 top combo // w. swiftech mcp-655(setting 5 atm)
> UT60 360
> XTC 140.1
> EK Supremacy Evo
> EK 290x Acetal FC w. EK backplate
> Fujipoly Ultra vrm pads, ek ram pads.
> 
> Comparison shots, Valley Extreme preset(~15 minutes):
> 1200 1500 +150 through Trixx.
> 
> Stock, with 100% fixed fan.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Water.
> 
> 
> Spoiler: Warning: Spoiler!


I missed this but good enough for submission. Congrats - added


----------



## Gobigorgohome

If it is any reason to do so, I have "downgraded" to single MSI Lightning R9 290X on air from 4x Sapphire Radeon R9 290X on water. Still a R9 290X-owner though.


----------



## Bertovzki

Quote:


> Originally Posted by *Gobigorgohome*
> 
> If it is any reason to do so, I have "downgraded" to single MSI Lightning R9 290X on air from 4x Sapphire Radeon R9 290X on water. Still a R9 290X-owner though.


Why did you do that , you selling off , and waiting for 3xxx series ?


----------



## Chopper1591

Quote:


> Originally Posted by *Arizonian*
> 
> I missed this but good enough for submission. Congrats - added


Played some more with the settings yesterday.
And it seems that I can actually lower the voltage offset to +75 to keep it stable at 1200/1500.

Highest I got so far was 1275 1625 with +175.


----------



## Gobigorgohome

Quote:


> Originally Posted by *Bertovzki*
> 
> Why did you do that , you selling off , and waiting for 3xxx series ?


First off, I am gaming like 2-3 hours a week, sometimes even less, then I used to game at 4K with high/max settings. Then the Omega-drivers came and messed up everything, could just have installed 14.9, but I did not bother. The power consumption of four cards is just ridicilous, the heat dissipation is insane and gaming at 4K is less than necassary (at least for me). Now I am gaming at 1440p with single card and high settings and it is running pretty smooth in the games I play.








Better to sell off the cards than having them dusting, soon going off to study somewhere else in Europe and I cannot take my computer with me, so better sell them while there still is someone that wants them.








And yes, I am waiting for next generation AMD GPU's.


----------



## Bertovzki

Quote:


> Originally Posted by *Gobigorgohome*
> 
> First off, I am gaming like 2-3 hours a week, sometimes even less, then I used to game at 4K with high/max settings. Then the Omega-drivers came and messed up everything, could just have installed 14.9, but I did not bother. The power consumption of four cards is just ridicilous, the heat dissipation is insane and gaming at 4K is less than necassary (at least for me). Now I am gaming at 1440p with single card and high settings and it is running pretty smooth in the games I play.
> 
> 
> 
> 
> 
> 
> 
> 
> Better to sell off the cards than having them dusting, soon going off to study somewhere else in Europe and I cannot take my computer with me, so better sell them while there still is someone that wants them.
> 
> 
> 
> 
> 
> 
> 
> 
> And yes, I am waiting for next generation AMD GPU's.


Yes i can relate to that , for a start i have not played a single minute on a game for over a year now , im not a gamer at all at moe , but i do like a good driving sim and flight sim , music editing.
Id like to be able to run 4K , and hope i would be able to with 2 x 290X ? , but if i cant , then i will upgrade latter on.
4K looks awesome , but i can settle for full HD

I want to run Project cars in 4K if pos , but if not what ever i can run it at.

Are 2 x 290X and a 4790K good enough for 4 K ?


----------



## ghabhaducha

Quote:


> Originally Posted by *Chopper1591*
> 
> Nobody saw my post?
> Post #34715
> 
> It seems like my power usage is actually lower now I have the card under water.
> Anybody here seen the same? Maybe that the vrm's are more efficient because they are a LOT cooler.


I experienced that too when i went with water and agree with what's been said above. I also think its also because the stock fan is a massive power hog.


----------



## Offler

Quote:


> Originally Posted by *Offler*
> 
> Just did a "quick lapping". Which means I did it in 2 hours, not 4... Shape of the the cooler wasnt as concave as I feared. Near the center were two holes and one bump near the bottom. So i brushed them away with sandpaper. The I applied Arctic Silver 5. The paste needs some time to act as expected but first numbers:
> 
> OCCT
> Before: >94°C at 78% fan speed
> After: 89 at 70% fan speed and stable.
> 
> Air temperature near exhaust during stress test
> Before 45
> After 50
> 
> Idle:
> Before: 46
> After: 38
> 
> Browsing with "clock lock"bug*:
> Before: 80°C
> After: 66°C
> 
> Mortal Kombat 9
> Before 72
> After 64
> 
> Idle values are close to numbers archieved by other people. What the hell were guys at Tom's Hardware doing?
> 
> * Youtube video on non-visible tab, or minimized browser caused GPU frequency to lock on 954 or 977MHz - depending on driver version.


So after 4 months I checked the temperatures again when the thermal paste had enough time to fit. Browsing with "clock" bug is now at 57°C. Playing archeage was pushing fans toward 60% when temperatures of GPU were over 90 degrees. Now it does not go throught 42% / 82°C.

The card was being used for 8 hours everyday with Archeage, which is quite high GPU load.


----------



## Gobigorgohome

Quote:


> Originally Posted by *Bertovzki*
> 
> Yes i can relate to that , for a start i have not played a single minute on a game for over a year now , im not a gamer at all at moe , but i do like a good driving sim and flight sim , music editing.
> Id like to be able to run 4K , and hope i would be able to with 2 x 290X ? , but if i cant , then i will upgrade latter on.
> 4K looks awesome , but i can settle for full HD
> 
> I want to run Project cars in 4K if pos , but if not what ever i can run it at.
> 
> Are 2 x 290X and a 4790K good enough for 4 K ?


Two R9 290X in crossfire is good for 4K, do not think it will do high/max settings, because it wont (maybe unless you overclock the cards pretty high).
4K is gimmick really, in all honesty, if I had known how it looked back when the Samsung U28D590 got released I would not have bought it. I think it is good for better resolution in internetbrowsers, for video/photo editing and creativity, for gaming I will stick to 1440p for now. 4K monitors can be downscaled to 1440p and the difference in apperance in games (1440p vs 4K is not that big), for photo editing there is a difference with 1440p and 4K though.

I have just used my cards with a 3930K and a 4930K, also four cards do require that the cache is high enough on the CPU. 4,6 Ghz SB-E/IVY-E is cache-bottlenecking 4x R9 290X in quad crossfire, three cards is okay though. How the 4790K work I am not sure of, because I have never owned it.







You will probably be just fine.









So my conclution is, 4K okay for anything but gaming, to me 1440p and 4K look exactly alike only better framerates at 1440p. 4K is just unnecassary for gaming if you ask me.









Anyways, in my experience, pretty much any simulators is pretty easy to run at high resolution. Back in the day mid-end cards (GTX 660 Ti) could run Dirt 3 @ 5760x1080 without any problems, newer games might be requiring more though.


----------



## rdr09

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Two R9 290X in crossfire is good for 4K, do not think it will do high/max settings, because it wont (maybe unless you overclock the cards pretty high).
> 4K is gimmick really, in all honesty, if I had known how it looked back when the Samsung U28D590 got released I would not have bought it. I think it is good for better resolution in internetbrowsers, for video/photo editing and creativity, for gaming I will stick to 1440p for now. 4K monitors can be downscaled to 1440p and the difference in apperance in games (1440p vs 4K is not that big), for photo editing there is a difference with 1440p and 4K though.
> 
> I have just used my cards with a 3930K and a 4930K, also four cards do require that the cache is high enough on the CPU. 4,6 Ghz SB-E/IVY-E is cache-bottlenecking 4x R9 290X in quad crossfire, three cards is okay though. How the 4790K work I am not sure of, because I have never owned it.
> 
> 
> 
> 
> 
> 
> 
> You will probably be just fine.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So my conclution is, 4K okay for anything but gaming, to me 1440p and 4K look exactly alike only better framerates at 1440p. 4K is just unnecassary for gaming if you ask me.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyways, in my experience, pretty much any simulators is pretty easy to run at high resolution. Back in the day mid-end cards (GTX 660 Ti) could run Dirt 3 @ 5760x1080 without any problems, newer games might be requiring more though.


i use 2 290s at stock with a 4K just fine for gaming. But, like you said, no need to max games out. most my games are set to medium. i jumped from 1080, so i can't compare with any other rez. love my 4k.

edit: i think you can still be considered a pioneer in 4K gaming , thus your experiences was meh. i remember reading threads about that particular monitor not playing nice with both camps. we have members here whose been using 4K for at least a year.

my 4K monitor is plug and play. never even bothered checking or touching the settings.


----------



## GOLDDUBBY

Quote:


> Originally Posted by *Phaster89*
> 
> it figures i can't hear my 290 at 80% fan speed when i'm playing hitman absolution for example, but on arma 2 and 3 the fan at 80% is really really annoying
> 
> one more nail in the air cooling coffin, at least for me


That's such an easy fix now adays, with kraken g10. And I believe alphacool also have a simple solution for more advanced water cooling.


----------



## Phaster89

the kraken g10 would be a nice solution but i've already decided to endure loud noises until my next build which will be fully watercooled

the alphacool gpu block is not worth it, an ek block only costs an extra 20~25€ and has much better performance


----------



## Gobigorgohome

Quote:


> Originally Posted by *rdr09*
> 
> i use 2 290s at stock with a 4K just fine for gaming. But, like you said, no need to max games out. most my games are set to medium. i jumped from 1080, so i can't compare with any other rez. love my 4k.
> 
> edit: i think you can still be considered a pioneer in 4K gaming , thus your experiences was meh. i remember reading threads about that particular monitor not playing nice with both camps. we have members here whose been using 4K for at least a year.
> 
> my 4K monitor is plug and play. never even bothered checking or touching the settings.


Like I said, 4K seems meaningless for gaming, that is my experience after 7-8 months using 4K with max settings (pretty much) in any game. I like max settings because it looks best, but 1440p with high/max settings look pretty good for the price of just one card.









Regarding 4K gaming, it is okay, nothing more though. 1440p from this point and forward for me, seems smoother.


----------



## rdr09

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Like I said, 4K seems meaningless for gaming, that is my experience after 7-8 months using 4K with max settings (pretty much) in any game. I like max settings because it looks best, but 1440p with high/max settings look pretty good for the price of just one card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Regarding 4K gaming, it is okay, nothing more though. 1440p from this point and forward for me, seems smoother.


i can certainly max some games out. i've seen someone in this thread played 4K with a single 290 with higher settings than mine in BF4 MP. . . but, i don't oc my gpus in games. mine are watered and 1200 core are easy. my games are smooth except for tearing in some games like Titanfall. vsync kinda help. with no AA in BF4 i've seen my vram usage go as high as 3500MB per card. but, no need for AA for a 28 in.

no issues with my 4k. no flickering, no ghosting, and i think the 2 ms helps, too.

edit: i guess what i trying to say is 4K is doable. very doable nowadays.


----------



## fyzzz

What is a 'normal' asic value on r9 290? mine is 84,2 %. I know some say asic is pure bs but im curious


----------



## BradleyW

Quote:


> Originally Posted by *fyzzz*
> 
> What is a 'normal' asic value on r9 290? mine is 84,2 %. I know some say asic is pure bs but im curious


Average is 70 to 80. My ASIC is 77, but I can do 1150 on the cores without voltage tweaks. I run stock 24/7 on my GPU's though.


----------



## fyzzz

Quote:


> Originally Posted by *BradleyW*
> 
> Average is 70 to 80. My ASIC is 77, but I can do 1150 on the cores without voltage tweaks. I run stock 24/7 on my GPU's though.


Ah ok, I was just curious. My card can only do 1060 without voltage tweaks.


----------



## BradleyW

Quote:


> Originally Posted by *fyzzz*
> 
> Ah ok, I was just curious. My card can only do 1060 without voltage tweaks.


What are your stock clocks? Remember I have 290X's.


----------



## fyzzz

Quote:


> Originally Posted by *BradleyW*
> 
> What are your stock clocks? Remember I have 290X's.


I have the Asus 290 DC2, so the stock clock is 1000MHz.


----------



## Roy360

Still getting black screens whenever my monitors go to sleep. I'm on 14.3; I see that 14.4.6 is up, so I'm downloading that at the moment.

Could the black screens mean my vcore isn't high enough?

I'm using a mix of different cards in CF so their vcores and clock speeds are all over the place.

I'm using Afterburner to set all the cards at 1000/1260, but I'm not sure what's happening with the voltage (whether it's the same across all cards or not).

EDIT:
http://forums.guru3d.com/showthread.php?t=394780

It seems that DVI might be the problem. Off to go buy an MST hub.


----------



## Phaster89

I also get a black screen when my monitor goes to sleep, at whatever overclock I set.


----------



## Chopper1591

Quote:


> Originally Posted by *ghabhaducha*
> 
> I experienced that too when i went with water and agree with what's been said above. I also think its also because the *stock fan is a massive power hog*.


Hmm..
Interesting, I hadn't thought about that.

Makes sense though, the stock fan using more power. Especially as it's three fans.
But most of the difference is probably because of the lower temp, and thus higher efficiency.
Quote:


> Originally Posted by *fyzzz*
> 
> Ah ok, I was just curious. My card can only do 1060 without voltage tweaks.


That is most likely also because of the different stock voltage.
What voltage are you running on stock?

My 290 Tri-X could do (I'll have to see if there is a difference now that I am under water) 1100 core with stock voltage.
Now I have it under water I run 1200 core 1500 mem with only +75. With stock cooler I had +150 for that clock.

My ASIC is 80.8%
Quote:


> Originally Posted by *Phaster89*
> 
> I also get a black screen when my monitor goes to sleep, at whatever overclock I set.


I have it sometimes...
Can't seem to find out why exactly.

Sleep mode of the pc itself was also a PITA. I ended up disabling that.


----------



## fyzzz

My stock voltage is 1.141V at max.


----------



## Agent Smith1984

Is anyone running 200mv+ on air cooling, 24/7??

My card never breaks 78c core and 90c on the VRM with that voltage, and it's the only way I can get close to touching 1200MHz...
I see some people with the 290 Tri-X getting 1200 core at 100-150mV with the stock cooler, and some people needing over 100 just to hit 1100, so it looks like it's always luck of the draw with this card, even though it is somewhat binned for the factory OC.

It pretty much tells me that you don't even need a bin to hit 1000 on Hawaii, but getting over 1100 is a different story....
It seems the card is happiest at 150mV for 1150/1500, but will do 1175/1600 on 200mV. There is a performance advantage there, but I'm not sure if it's worth the power/thermal cost it takes to get there.....


----------



## chiknnwatrmln

^I would stick with 1150 if I were you. An extra 25MHz will not provide you any tangible benefit, not really worth the extra 50mV and heat imo. You would be lucky to get even 1 more frame per second in most games with that little speed bump.

Pretty much all Hawaii cards should do 1100 MHz, most will do 1150, some will do 1200, and a small portion will do 1250+. 1300 and higher is pretty rare, even for WC.


----------



## Roy360

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> ^I would stick with 1150 if I were you. An extra 25MHz will not provide you any tangible benefit, not really worth the extra 50mV and heat imo. You would be lucky to get even 1 more frame per second in most games with that little speed bump.
> 
> Pretty much all Hawaii cards should do 1100 MHz, most will do 1150, some will do 1200, and a small portion will do 1250+. 1300 and higher is pretty rare, even for WC.


Unless of course you have my luck and can't go above 1050 without a voltage bump. It's like VisionTek paid to get chips that can't overclock.


----------



## Regnitto

Quote:


> Originally Posted by *Roy360*
> 
> Unless of course you have my luck and can't go above 1050 without a voltage bump. It's like VisionTek paid to get chips that can't overclock.


my visiontek does pretty good. I can get 1100 without touching voltage and 1200 under water.


----------



## Bertovzki

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Like I said, 4K seems meaningless for gaming, that is my experience after 7-8 months using 4K with max settings (pretty much) in any game. I like max settings because it looks best, but 1440p with high/max settings look pretty good for the price of just one card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Regarding 4K gaming, it is okay, nothing more though. 1440p from this point and forward for me, seems smoother.


Quote:


> Originally Posted by *rdr09*
> 
> I can certainly max some games out. I've seen someone in this thread play 4K with a single 290 at higher settings than mine in BF4 MP... but I don't OC my GPUs in games. Mine are watered and 1200 core is easy. My games are smooth except for tearing in some games like Titanfall; vsync kinda helps. With no AA in BF4 I've seen my VRAM usage go as high as 3500MB per card. But there's no need for AA on a 28-inch.
> 
> No issues with my 4K. No flickering, no ghosting, and I think the 2 ms response time helps, too.
> 
> edit: I guess what I'm trying to say is 4K is doable. Very doable nowadays.


Thanks for the feedback guys. Yeah, 4K is still tempting, but if I were to lean in any direction, it would always be toward full detail in games! And a big-ass screen, so maybe I'm better off getting a full HD one instead, but a big one.
I don't know much about monitors and TVs, but I've found TVs as good as a monitor, I guess, as long as you have a good refresh rate and ms response time?
What would be the max ms before it's not very good? 2 ms sounds very good.


----------



## YellowBlackGod

Quote:


> Originally Posted by *Bertovzki*
> 
> Thanks for the feedback guys. Yeah, 4K is still tempting, but if I were to lean in any direction, it would always be toward full detail in games! And a big-ass screen, so maybe I'm better off getting a full HD one instead, but a big one.
> I don't know much about monitors and TVs, but I've found TVs as good as a monitor, I guess, as long as you have a good refresh rate and ms response time?
> What would be the max ms before it's not very good? 2 ms sounds very good.


We shouldn't forget the upscaling feature of the Omega drivers. We can play at up to 3K (3200x1800) at the moment. I tested it with FC4 (still unoptimised for Radeon GPUs) at 2K and 3K, maxed out. It works quite well. For me this feature eliminates the need for a 4K monitor at the moment. You can buy a good Full HD gaming monitor with a high refresh rate and response time and use this.


----------



## Bertovzki

Quote:


> Originally Posted by *YellowBlackGod*
> 
> We shall not forget the upscale feature of the Omega drivers. We can play up to 3K (3200x1800) at the moment. I tested it with FC4 (still unoptimised for Radeon GPUs) in 2K and 3K, maxed out. It works quite well. For me this feature eliminates the need for a 4K monitor at the moment. You can buy a good Full HD gaming Monitor with high refresh rate and response time and use this.


Ok thanks for the info, never heard of this upscaling thing, sounds alright, I'll read up


----------



## YellowBlackGod

There you go!

http://techreport.com/news/27483/vsr-is-amd-answer-to-nvidia-dsr-tech

It is called VSR, it is a simple setting in Catalyst Control Center. It adds up to 4K resolutions in games for 1080P monitors. The 4K res is only available for R9 285 though. The 290X can go up to 3K for the moment but this is OK for me. 2K is also fine. I have tested them both and it works great.







All I am watching for now is a good 27-28" Full HD gaming monitor, with a high refresh rate and probably FreeSync. This practically eliminates the need for an expensive 4K monitor.


----------



## Bertovzki

Quote:


> Originally Posted by *YellowBlackGod*
> 
> There you go!
> 
> http://techreport.com/news/27483/vsr-is-amd-answer-to-nvidia-dsr-tech
> 
> It is called VSR, it is a simple setting in Catalyst Control Center. It adds up to 4K resolutions in games for 1080P monitors. The 4K res is only available for R9 285 though. The 290X can go up to 3K for the moment but this is OK for me. 2K is also fine. I have tested them both and it works great.
> 
> 
> 
> 
> 
> 
> 
> All I am watching now for is a good 27-28" Full HD gaming monitor, with high refresh rate and probably Freesync. This eliminates practically the need for a 4k expensive monitor.


Thanks man, appreciate it. It all sounds pretty weird, upscaling making a monitor go higher res; I need to read up.









Yeah, I will be on the hunt soon for a monitor or good TV, as I need both. I've got a hell of a rig but nothing to view it on.


----------



## YellowBlackGod

Not weird at all. You turn it on in CCC. Nothing changes on your full HD monitor. In the game video settings you will see available 2K and 3K resolutions. Pick one of them and you will see 2K and 3K clearness while playing.


----------



## fyzzz

I have been trying to overclock my 290, but the voltage is very strange. GPU Tweak is only giving me just under 1.2V when I put 1.4V in.


----------



## Roboyto

Quote:


> Originally Posted by *fyzzz*
> 
> I have been trying to overclock my 290, but the voltage is very strange. GPU Tweak is only giving me just under 1.2V when I put 1.4V in.


Do you have an ASUS card or BIOS?

You must have an ASUS card or flash an ASUS BIOS to use GPU Tweak.

When you're having issues, it is best to give as much detail as possible; we're not psychic. Help us help you.


----------



## fyzzz

Quote:


> Originally Posted by *Roboyto*
> 
> Do you have an ASUS card or BIOS?
> 
> You must have an ASUS card or flash an ASUS BIOS to use GPU Tweak.
> 
> When you're having issues, it is best to give as much detail as possible; We're not psychic. Help us, help you


I have the Asus R9 290 DC2 OC, but GPU Tweak does not work well.


----------



## fyzzz

Here is a screenshot of GPU-Z after running Valley with 1.4V set in GPU Tweak:
http://gpuz.techpowerup.com/15/02/10/6yy.png


----------



## Roboyto

Quote:


> Originally Posted by *fyzzz*
> 
> I have the asus r9 290 dc2 oc. But gpu tweak does not work well

Please post some screenshots of the card running under high/maximum load. Use GPU-Z or HWInfo so we can see the temperatures the core and VRMs are running at, as well as the voltage.

1.4V is more than likely too much for an air-cooled card; it could be throttling due to temperature. Once the core reaches 95C it will throttle clocks/voltages to keep the temperature down. To run that much voltage you will also need to increase the power limit on the card. Increasing voltage/power makes the VRMs run even hotter than they do at stock, which is already a problem before you start pushing the limits of the GPU. Even if the core is running at an acceptable temperature, the VRMs can overheat. If the card doesn't go into safety mode, or the drivers don't crash and revert to default settings, you can cause serious/irreparable damage.
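
For anyone curious, that throttle-and-recover behaviour can be sketched as a toy model (illustrative only: the real PowerTune algorithm is firmware-controlled and far more gradual, and the step sizes here are made up):

```python
# Toy model of how a 290/290X pulls clocks under thermal or power pressure.

THERMAL_LIMIT_C = 95  # core temperature where Hawaii starts shedding clocks

def effective_clock(requested_mhz, core_temp_c, board_power_w, power_limit_w):
    """Return the clock the card would actually hold under this toy model."""
    clock = requested_mhz
    if core_temp_c >= THERMAL_LIMIT_C:
        clock -= 50   # thermal throttle: drop clocks until temps recover
    if board_power_w > power_limit_w:
        clock -= 50   # power throttle: raising the power-limit slider avoids this
    return max(clock, 300)  # floor at idle clock

# Over temperature AND over the power budget: the requested 1100MHz isn't held.
print(effective_clock(1100, 96, 320, 300))
```

The point being that temperature and power limit are independent triggers, so fixing only one of them may still leave the card throttling.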

The 290(X) DC2 has poor cooling performance because the heatsink was designed for the GTX 780, and only 3 of 5 heatpipes make contact with the 290(X) GPU die. This severely inhibits cooling performance on your card, especially when you want to add voltage/power.

Read here for general information on the 290(X):

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781

Quote:


> Originally Posted by *fyzzz*
> 
> here is a screenshot of gpu-z after running valley and 1.4v put in gpu tweak
> http://gpuz.techpowerup.com/15/02/10/6yy.png



This looks like you are running at stock settings. If you haven't overclocked the card, then it isn't going to require more voltage than stock. Just applying voltage without changing anything else isn't going to change your performance.


----------



## fyzzz

Quote:


> Originally Posted by *Roboyto*
> 
> Please post some screenshots of the card running under high/maximum load. Use GPU-Z or HWInfo so we can see the GPU statistics for temperatures the core and VRMs are running at as well as voltage.
> 
> 1.4V is more than likely too much for an air-cooled card; it could be throttling due to temperature. Once the core reaches 95C it will throttle clocks/voltages to keep the temperature down. To run that much voltage you will also need to increase the power limit on the card. Increasing voltage/power makes the VRMs run even hotter than they do at stock, which is already a problem before you start pushing the limits of the GPU. Even if the core is running at an acceptable temperature, the VRMs can overheat. If the card doesn't go into safety mode, or the drivers don't crash and revert to default settings, you can cause serious/irreparable damage.
> 
> The 290(X) DC2 has poor cooling performance because the heatsink was designed for the GTX 780, and only 3 of 5 heatpipes make contact with the 290(X) GPU die. This severely inhibits cooling performance on your card, especially when you want to add voltage/power.
> 
> Read here for general information on the 290(X):
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781
> 
> This looks like you are running at stock settings. If you haven't overclocked the card, then it isn't going to require more voltage than stock. Just applying voltage without changing anything else isn't going to change your performance.


I will try again then


----------



## fyzzz

Second try: http://gpuz.techpowerup.com/15/02/10/hwy.png


----------



## fyzzz

And yes, I know Asus was lazy and just took the cooler off the 780. But I did not have lots of money and the 290 was on sale, so I bought it. I have also reduced temps by taking in cold air from my window into my computer. I'm planning to watercool later when I get some money.


----------



## Zelx0

Is it normal for my R9 290 to cap at ~3400MB RAM usage? Or is it an AB bug?
I was running Watch Dogs at stock clocks.


----------



## Regnitto

Quote:


> Originally Posted by *Zelx0*
> 
> Is it normal for my R9 290 to cap at ~3400MB RAM usage? Or is it an AB bug?
> I was running Watch Dogs at stock clocks.


Don't have a screenshot of mine at the moment, but I've seen my 290 go up to 4009MB RAM usage in Shadow of Mordor with ultra texture quality and resolution scaling at 200% (4K). Idk about Watch Dogs though, as I never play it; maybe that's just as much RAM as it uses?


----------



## Zelx0

Quote:


> Originally Posted by *Regnitto*
> 
> don't have a screenshot of mine at the moment, but I've seen my 290 go up to 4009mb ram usage in Shadow of Mordor with ultra texture quality and resolution scaling at 200% (4k). idk about watch dogs tho, as I never play it, maybe that's just as much ram as it uses?


It's weird since afterburner is showing a yellow line, that's why I'm wondering if it's faulty, I need to try shadow of mordor to check then


----------



## Regnitto

Mine does the "yellow line" over 3GB too. If you look at the graph settings, it only goes to 3GB max on the graph; the yellow line is the reading going beyond the graph's measurement range. Don't know if there is a way to increase that to show all 4GB in the graph, or if that is just a bug on MSI's part.

edit: after posting this I ran the Shadow of Mordor Benchmark to take a screenshot of the ram usage in AB:


Spoiler: Warning: Spoiler!







As you can see from the pic, it used 4038MB of VRAM, although the graph itself only reads up to 3072MB.
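
In other words, the graph is just clamping the reading at its top edge. A minimal sketch of the effect (not MSI's actual code; the sample values are hypothetical):

```python
GRAPH_MAX_MB = 3072  # top edge of Afterburner's default memory-usage graph

def plot_value(reading_mb, graph_max=GRAPH_MAX_MB):
    """Clamp a sensor reading to the graph range. Readings above the top
    edge all draw at the maximum -- the flat 'yellow line' effect."""
    return min(reading_mb, graph_max)

# VRAM samples (MB): everything over 3072 pins to the top of the graph,
# even though the numeric readout still shows the true value.
samples = [2800, 3500, 4038, 3900]
print([plot_value(s) for s in samples])
```

So the numeric readout is trustworthy; only the plotted line saturates.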


----------



## Zelx0

Quote:


> Originally Posted by *Regnitto*
> 
> Mine does the "yellow line" over 3GB too. If you look at the graph settings, it only goes to 3GB max on the graph; the yellow line is the reading going beyond the graph's measurement range. Don't know if there is a way to increase that to show all 4GB in the graph, or if that is just a bug on MSI's part.
> 
> edit: after posting this I ran the Shadow of Mordor Benchmark to take a screenshot of the ram usage in AB:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> as you can see from the pic, it used 4038mb of VRAM, although the graph only reads up to 3072mb within the confines of the graph.


Thanks, that's exactly what I was wondering about! Rep


----------



## Widde

I've enabled VSR in CCC, but when I pick a higher resolution in-game everything becomes super small and looks really weird. Running a 1080p monitor.


----------



## Regnitto

Quote:


> Originally Posted by *Widde*
> 
> I've enabled VSR in ccc but when I'm picking a higher resolution ingame everything becomes super small and looks really weird, running a 1080p monitor


Everything is still the same size (in pixels), but now there are more pixels rendered, so it looks smaller; then it gets scaled down to fit your 1920x1080 screen and looks a little fuzzy.

I only noticed fuzziness on the desktop after enabling VSR and setting the resolution to 2560x1440 on my 1080p monitor.
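
A little arithmetic shows why some VSR resolutions look fuzzier on a 1080p panel than others (this is plain resolution math, nothing VSR-specific assumed):

```python
def vsr_scale(render, native):
    """Per-axis factor by which a VSR render resolution exceeds the native
    panel. Non-integer factors mean rendered pixels can't map cleanly onto
    physical pixels, which is where the desktop fuzziness comes from."""
    return (render[0] / native[0], render[1] / native[1])

native = (1920, 1080)
for render in [(2560, 1440), (3200, 1800), (3840, 2160)]:
    sx, sy = vsr_scale(render, native)
    clean = sx.is_integer() and sy.is_integer()
    print(render, round(sx, 3), "integer scale" if clean else "non-integer (fuzzier)")
```

1440p-on-1080p is a 1.33x factor, so text and UI blur slightly; 4K-on-1080p is exactly 2x per axis, which downsamples much more cleanly.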


----------



## YellowBlackGod

Quote:


> Originally Posted by *Regnitto*
> 
> Everything is still the same size (in pixels), but now there are more pixels rendered, so it looks smaller; then it gets scaled down to fit your 1920x1080 screen and looks a little fuzzy.
> 
> I only noticed fuzziness on the desktop after enabling VSR and setting the resolution to 2560x1440 on my 1080p monitor.


I have tried 2K and 3K. That's exactly what it does: it renders more pixels, then scales down. For me the picture is clearer and very smooth. I don't see any fuzziness, and no tearing or blurriness either. Could this be because of the Overdrive and ACR functions of my monitor, which are enabled (it's still a 60Hz monitor)? Game tested: FC4.


----------



## YellowBlackGod

DirectX 12, due to ship with the forthcoming Microsoft operating system Windows 10, is able to handle 600K draw calls *and boost AMD card performance by 600%, according to Stardock CEO Brad Wardell.*

http://www.eteknix.com/dx12-can-handle-600k-draw-calls-increasing-amd-gpu-performance-600/

290X = best futureproof card at the moment!!!!


----------



## KAS Phase4

My new R9 290, a Sapphire Tri-X (I flashed the Tri-X OC BIOS @ 1000/1300), ASIC 87.6%










OC stable ingame at 1200/1650 (+193mV)


----------



## mus1mus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I think it's okay, but you will see a huge benefit from overclocking CPU when running crossfire.
> You should also have some wiggle room on the video cards, but the CPU will help the most when talking mutli-GPU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Somewhere in the 4.5-4.8GHz should be plenty doable on that chip, and really opens up the GPU's, especially during gaming. The Firestrike results are a little skewed by the CPU score, because a low (not that yours is super low, but my physics score on my old AMD X6 is higher than what an i5 will do at 4GHz, just to put it in perspective) physics score can drop the entire score. As far as the graphics score alone, we'd need to see it...


Was a little late but here:

i5-3570K at 4.5 with 2400 memory:
http://www.3dmark.com/fs/4035364

FX8320 at 5.0 with 2133 memory:
http://www.3dmark.com/fs/3977214

The difference in Physics Score is about 1400 points. Favors the FX

The difference in Combined Score is about 1400 points. Favors the i5

Graphics scores are on par. Same card. Same OC.

Total score = the i5 murders the FX, which is due to the engine only using half of the FX's power in the Combined test. And the Combined score impacts the total score a lot more than the Physics score does.
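
3DMark builds the total from the sub-scores with a weighted harmonic mean, which is why one weak component (like the FX's Combined score here) drags the total down far more than a strong component lifts it. A quick sketch — the weights and sub-scores below are illustrative, not Futuremark's published coefficients or the actual linked runs:

```python
def weighted_harmonic(scores, weights):
    """Weighted harmonic mean: the general form 3DMark uses to combine
    sub-scores. The lowest component dominates the result, much more
    than it would in a simple weighted average."""
    return sum(weights) / sum(w / s for w, s in zip(weights, scores))

# Hypothetical (graphics, physics, combined) sub-scores and weights.
weights = (0.75, 0.15, 0.10)
i5 = (12900, 9500, 5600)    # weaker physics, stronger combined
fx = (12900, 10900, 4200)   # stronger physics, weaker combined

print(round(weighted_harmonic(i5, weights)))
print(round(weighted_harmonic(fx, weights)))
```

Even with identical graphics scores and the FX ahead on Physics, the i5's stronger Combined result wins the total under this kind of mean.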


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> Was a little late but here:
> 
> i5-3570K at 4.5 with 2400 memory:
> http://www.3dmark.com/fs/4035364
> 
> FX8320 at 5.0 with 2133 memory:
> http://www.3dmark.com/fs/3977214
> 
> The difference in Physics Score is about 1400 points. Favors the FX
> 
> The difference in Combined Score is about 1400 points. Favors the i5
> 
> Graphics scores are on par. Same card. Same OC.
> 
> Total score = the i5 murders the FX, which is due to the engine only using half of the FX's power in the Combined test. And the Combined score impacts the total score a lot more than the Physics score does.


What clocks on the 290x??


----------



## mus1mus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What clocks on the 290x??


1150 / 1600 IIRC


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> 1150 / 1600 IIRC


Hmm... I don't think I can clock my 290 high enough to post a matching graphics score....

Highest I can bench is around 1190-1200 core. However, my RAM seems to run all the way to 1630 with no issues, so I will give it a go later....

The reason I ask is, if I can put up a matching 12,900 +/- graphics score, I can actually beat that i5 Firestrike score you posted!
I know it's a silly bench, but it'd be interesting to see, right....


----------



## k2blazer

Thought I'd run that benchmark with two R9 290 Sapphire Tri-x cards.

They're not yet overclocked.

http://www.3dmark.com/3dm/5864853


----------



## mus1mus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Hmm... I don't think I can clock my 290 high enough to post a matching graphics score....
> 
> Highest I can bench is around 1190-1200 core. However, my RAM seems to run all the way to 1630 with no issues, so I will give it a go later....
> 
> Reason I ask is, if I can put up a matching 12,900+/- graphics scores, I can actually beat that i5 firestrike score you posted!
> I know it's a silly bench, but it'd be interesting to see right....


1200 / 1600 can score more if stable and cool. And yeah, your PSU should be able to support it if you are overvolting like nuts.







I have seen about 200+ total score bump by changing my PSU.









That i5 score is just with AB on my daily clocks.







Trixx should give it more, but that proves nothing in daily use, as my card can only clock to a benchable 1180 on the core.
Quote:


> Originally Posted by *k2blazer*
> 
> Thought I'd run that benchmark with two R9 290 Sapphire Tri-x cards.
> 
> They're not yet overclocked.
> 
> http://www.3dmark.com/3dm/5864853


I thought it was sorcery seeing your 2500K score higher than my 3570K on Physics... till I saw they're CrossFired.


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> 1200 / 1600 can score more if stable and cool. And yeah, your PSU should be able to support it if you are overvolting like nuts.
> 
> 
> 
> 
> 
> 
> 
> I have seen about 200+ total score bump by changing my PSU.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That i5 score is just with AB on my daily clocks.
> 
> 
> 
> 
> 
> 
> 
> Trixxx should give it more but proves nothing in daily use as my card can only clock to a benchable 1180 on core.
> I thought it's sorcery seeing your 2500K score higher than my 3570K on Physics. Til I saw they're XFire'd.


Funny you mention the PSU, because I am able to run +200 with no power throttling at all, as long as I am at a 50% power limit. But when I first got this card, I couldn't figure out what the hell was happening:

I was running stock clocks (1000/1300), with no additional voltage and 0 power limit. I looked at GPU-Z and saw that the core kept throttling down to 900-940MHz, yet the temps were 70C core / 79C VRM... I was like, huh? Then I set the power limit to 20%, and it stopped. The card was literally power throttling at its own stock clocks. I am assuming it's because the "stock" clocks are a bit of an OC compared to a reference 290.....

Then I began overclocking, and found that as long as I used a 50% power limit, I could run a 200mV offset with no throttling at all.
I would think that even this mid-range PSU (reviews show it to be a pretty capable unit, even though it's a Raidmax) could handle that kind of power draw on a single card, though I wouldn't dare throw two overclocked cards at it. It was pulled from a 290X crossfire rig used for mining though, so I know it will handle two cards at stock power draw.


----------



## mus1mus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Funny you mention the PSU, because I am able to run +200 with no power throttling at all, as long as I am at 50% power limit, but when I first got this card, I couldn't figure out what the hell was happening:
> 
> I was running stock clocks (1000/1300), with no additional voltage and 0 power limit. I looked at GPU-Z and saw that the core kept throttling down to 900-940MHz, yet the temps were 70C core / 79C VRM... I was like, huh? Then I set the power limit to 20%, and it stopped. The card was literally power throttling at its own stock clocks. I am assuming it's because the "stock" clocks are a bit of an OC compared to a reference 290.....
> 
> Then I began overclocking, and found that as long as I used a 50% power limit, I could run a 200mV offset with no throttling at all.
> I would think that even this mid-range PSU (reviews show it to be a pretty capable unit, even though it's a Raidmax) could handle that kind of power draw on a single card, though I wouldn't dare throw two overclocked cards at it. It was pulled from a 290X crossfire rig used for mining though, so I know it will handle two cards at stock power draw.


OC'ed or not, +50 power limit is your friend.









I have a Cooler Master Silent Pro 850 before and switched to a V1000.

850 is pretty much capable. But doesn't have a clean delivery, I suppose.


----------



## k2blazer

Quote:


> Originally Posted by *mus1mus*
> 
> 1200 / 1600 can score more if stable and cool. And yeah, your PSU should be able to support it if you are overvolting like nuts.
> 
> 
> 
> 
> 
> 
> 
> I have seen about 200+ total score bump by changing my PSU.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That i5 score is just with AB on my daily clocks.
> 
> 
> 
> 
> 
> 
> 
> Trixxx should give it more but proves nothing in daily use as my card can only clock to a benchable 1180 on core.
> I thought it's sorcery seeing your 2500K score higher than my 3570K on Physics. Til I saw they're XFire'd.


Haha









It's amazing seeing how good the 2500K still is this far down the line; it doesn't hold my 290s back.


----------



## Agent Smith1984

Quote:


> Originally Posted by *k2blazer*
> 
> Haha
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's amazing seeing how good the 2500k still is this far down the line, doesn't hold my 290's back.


Good score on the graphics.....

Do you have any headroom left to OC the CPU at all? I would assume it would pull at least a 4.8GHz OC if you push it.

In benchmarks the graphics score will scale accordingly with both cards, but in real world usage, that setup will benefit GREATLY from higher CPU clocks.


----------



## mus1mus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Good score on the graphics.....
> 
> Do you have any headroom left to OC the CPU at all? I would assume it would pull at least a 4.8GHz OC if you push it.
> 
> In benchmarks the graphics score will scale accordingly with both cards, but in real world usage, that setup will benefit GREATLY from higher CPU clocks.


Not that fast though. SLI/XFire will always be more beneficial than CPU clocks, unless a game doesn't support multi-GPU, IMO.

And oops: http://www.3dmark.com/3dm/5866991


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> Not that fast though. SLI/XFire will always be more beneficial than CPU clocks, unless a game doesn't support multi-GPU, IMO.
> 
> And oops: http://www.3dmark.com/3dm/5866991


Oh damn man.... what are the GPU clocks on that run??? I've got that CPU score smoked, but your graphics score is really nice on that run.


----------



## k2blazer

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Good score on the graphics.....
> 
> Do you have any headroom left to OC the CPU at all? I would assume it would pull at least a 4.8GHz OC if you push it.
> 
> In benchmarks the graphics score will scale accordingly with both cards, but in real world usage, that setup will benefit GREATLY from higher CPU clocks.


I think I tried pushing it past there a while ago and it just wasn't stable; I'll try again though. I don't have a whole lot of experience with overclocking CPUs.


----------



## mus1mus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Oh damn man.... what's the GPU clocks on that run??? I got that CPU score smoked, but your graphics score is really nice on that run.


1160 / 1625







http://www.techpowerup.com/gpuz/details.php?id=gmesv

Nothing really special. A lot of people are getting better Core speeds and scores than that.


----------



## Agent Smith1984

Quote:


> Originally Posted by *k2blazer*
> 
> I think I tried pushing it past there a while ago and it just wasn't stable, I'll try again though. I don't have a whole lot of experience with overclocking cpu's.


Well, this is definitely the place to go for help (in the appropriate section of course).....

I've never seen any k series Sandy Bridge chip with less than 4.6GHz in the tank....


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> 1160 / 1625
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nothing really special. A lot of people are getting better Core speeds and scores than that.


Well, I see some folks getting 1250+ on the cores sometimes.... I'm like wuuhhh???

I would have been thrilled to get one of those gems, but for the $160 I paid for my 290 Tri-X, I am MORE than happy with it.

Even at the semi-aggressive 1150/1500 I run daily, this thing churns through 1080p like a KitchenAid through flour.
Like a Quaker through oats, like a jet boat through calm water.... Really, I guess what I'm saying is, it does a really good job playing games at 1080p.


----------



## Yorkston

I think I lucked out in the silicon lottery. 1200/1500 stable so far with +94mV on my 290, and I'm still going on the memory clock. Unfortunately I'm running into a temperature wall, even with my G10 + H55 + Gelid VRM kit. Core and VRM2 hover in the low 60s, but VRM1 is hitting ~86C even with the Gelid kit, Fujipoly pad, and a side intake fan. Any ideas short of a full waterblock? MUST GO FASTER.

On a slightly unrelated note: even with force constant voltage enabled, my voltage readings for the card still fluctuate a bit. Is that normal?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Yorkston*
> 
> I think I lucked out in the silicon lottery. 1200/1500 stable so far with +94mV on my 290, and I'm still going on the memory clock. Unfortunately I'm running into a temperature wall, even with my G10 + H55 + Gelid VRM kit. Core and VRM2 hover in the low 60s, but VRM1 is hitting ~86C even with the Gelid kit, Fujipoly pad, and a side intake fan. Any ideas short of a full waterblock? MUST GO FASTER.
> 
> On a slightly unrelated note: even with force constant voltage enabled, my voltage readings for the card still fluctuate a bit. Is that normal?


On temps, I would think you can go a tad further.... maybe 150mV, and hit around 90C, which is still acceptable to my knowledge, considering the cores are "safe up to 94C"...









Also, yes, you will always get some vdroop on the core. Just part of the game. I do wonder, though, whether the vdroop worsens with VRM temps, or whether it's directly correlated to the PSU rail itself?

Just a note though....

Seeing 1200MHz at 100+/- offset is a good thing for sure, but don't be surprised when you see the core clock scaling go right down the crapper.

Seems like most cards I have read about begin to really taper off on core clock after 100-150mV, and from there you are getting around 5MHz per 10mV increase if you are lucky... So even though you have 1200 at 100mV, you could end up with a card that still falls short of 1250 at 200mV.
At least that's what I have seen in my research....

Not trying to be discouraging at all, just something to remember if you experience instability regardless of how much power you send. It may not be the temps; these cores just run out of legroom sometimes (mind you, that's considering regular cooling methods, standard BIOS, etc.).
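The diminishing-returns pattern described above can be sketched as a toy model (all numbers are illustrative guesses, not measured from any card):

```python
def estimated_core_clock(base_mhz, offset_mv, knee_mv=100,
                         easy_rate=1.0, taper_rate=0.4):
    """Toy model of Hawaii voltage/clock scaling: roughly linear gains
    up to a knee (~100mV here), then heavily tapered gains beyond it.
    Both rates are illustrative assumptions, not measured values."""
    easy = min(offset_mv, knee_mv) * easy_rate
    hard = max(offset_mv - knee_mv, 0) * taper_rate
    return base_mhz + easy + hard

# A card doing 1200MHz at +100mV can still fall short of 1250 at +200mV:
print(estimated_core_clock(1100, 100))  # 1200.0
print(estimated_core_clock(1100, 200))  # 1240.0
```

The point of the model is just that the second +100mV buys far fewer MHz than the first, which matches what people report in this thread.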


----------



## mus1mus

Quote:


> Originally Posted by *Yorkston*
> 
> I think I lucked out on the silicon lottery. 1200/1500 stable so far with +94mV on my 290, and I'm still going on the memory clock. Unfortunately i'm running into a temperature wall, even with my G10+H55+gellid VRM kit. Core and VRM2 hover in the low 60s, but VRM1 is hitting ~86C even with the Gellid kit, Fujipoly pad, and a side intake fan. Any ideas short of a full waterblock? MUST GO FASTER.
> 
> On a slightly unrelated note, even with force constant voltage enabled my voltage readings for the card still fluctuate a bit. Is that normal?


Hitting 86 on the VRMs is not that hot per spec. But some have been noticing artifacts with the VRMs hitting the 70s, so there's your limit for OC. Running them cool, like in the 50s, will let you push them further and lower your power usage.









Larger VRM sinks could help you. But a full-cover block will improve your chances of a quieter PC and/or a higher OC. A more stable one, to be exact.

And yeah, Vcore fluctuates.
↓↓↓
Quote:


> Originally Posted by *Agent Smith1984*
> 
> On temps, I would think you can go a tad further.... maybe 150mv, and hit around 90c, which is still acceptable to my knowledge, considering the cores are "safe up to 94c"...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, yes, you will always get some vdroop on the core. Just part of the game. I do wonder though, if the vdroop worsens with VRM temps, or if it directly correlated to the PSU rail itself??
> 
> Just a note though....
> 
> Seeing 1200MHz at 100+/- offset is a good thing for sure, but don't be surprised when you see the core clock scaling go right down the crapper.
> 
> Seems like most cards I have read about begin to really taper off core-clock after 100-150mv, and from there, you are getting around 5MHz per 10mv increase if you are lucky... So even though you have 1200 at 100mv, you could end up with a card that still falls short of 1250 at 200mv.
> At least that's what I have seen in my research....
> 
> Not trying to be discouraging at all, just something to remember if you experience instability regardless of how much power you send. It may not be the temps, these cores just run out of leg room sometimes (mind you, that considering regular cooling methods, with standard BIOS, etc).


----------



## GOLDDUBBY

Quote:


> Originally Posted by *Yorkston*
> 
> ..Unfortunately i'm running into a temperature wall, even with my G10+H55+gellid VRM kit. Core and VRM2 hover in the low 60s, but VRM1 is hitting ~86C even with the Gellid kit, Fujipoly pad, and a side intake fan. Any ideas short of a full waterblock? MUST GO FASTER.


I believe the VRMs are rated to run at 81°C, so you're just running them around the rated temperature is all.

But, go full block!!


----------



## LandonAaron

Hopefully someone here can help. So I decided that my R9 290X was pretty sweet, and it really hasn't broken a sweat chewing through game after game, spitting out 60 FPS at 1080p no problem, and it just so happens I'm currently building my fiancée a gaming computer and she needs a monitor for said computer. So I am thinking it's upgrade time. Time for me to step up to 1440p and let her have my old 1080p monitor. I shopped around, read the reviews, and decided on one of those cheaper Korean monitors, the QNIX 2710. It arrives a few days later, I hook it up, great picture, 1440p looking pretty sweet, life is good.

Now I'm ready to game in higher def, so I launch Trixx, OC the 290X to my usual 1200/1600MHz, and I'm off to the races. The game (Serious Sam 3) loads and it's beautiful; I can see the pores on these monsters' faces right as I unload a double-barrel shotgun into said faces, and then boom, the monitor is blinking red, then blue, and then green. Turns out this is what the monitor does when it doesn't detect a signal. I exit to the desktop, disable and re-enable the monitor in Windows, and get the picture back. But as soon as I load up the game again it starts all over with the blinking colors.

After trying various things, the only thing that seems to help is to not overclock the card at all. I mean at all. If I raise the core clock even 50MHz then the screen starts doing the blinking color thing. I have been OC'ing this card for months at 1080p without issue, and ever since the Omega drivers came out I have been playing games at 1440p and 4K on my 1080p monitor with virtual super resolution, so I don't understand why this is happening. I have posted about it on the Korean monitor forum, but it doesn't seem like they know what the issue could be. Any help would be appreciated.

P.S. I am not overclocking the monitor just the GPU.


----------



## Phaster89

Should I RMA my card? This is the first time anything like this has happened.
Temps were OK, the max core temp was 77, and I had it on stock clocks.


----------



## Yorkston

Quote:


> Originally Posted by *Phaster89*
> 
> should i rma my card? this is the first time anything like this happened
> temps were ok, the max core temp was 77 and i had it on stock clocks


DayZ- Working as intended

But seriously, has that only happened once or is it constant now? DayZ has always had issues with artifacting.


----------



## Phaster89

Well, I've been playing on the same server for almost 8 months and this is the first time I've had artifacts. I dunno if anyone can tell, but the artifacts are coming from the zombies.


----------



## wermad

Is a fourth worth the scaling (if any) in 4k?


----------



## tsm106

Quote:


> Originally Posted by *wermad*
> 
> Is a fourth worth the scaling (if any) in 4k?


What do you think?





http://www.techpowerup.com/reviews/MSI/R9_290X_Lightning/13.html

Also



http://www.digitalstormonline.com/unlocked/4-way-quad-crossfire-amd-r9-295x2-benchmarks-at-4k-idnum228/


----------



## Yorkston

How do you get Afterburner to let you raise the voltage past +100? I've enabled unofficial overclocking, but it only opened up the aux voltage. I also can't raise the mem clock past 1625.


----------



## wermad

Fourth = quad fire...

This is the best review I've seen so far. Besides nav's xfire 295x2 review, this one does show some scaling...

http://forums.anandtech.com/showthread.php?t=2359120

Just looking for more info and experience thoughts.


----------



## mojobear

Whoa, tsm106 - what are your OCs on the Lightnings for 24/7 gaming?


----------



## tsm106

Quote:


> Originally Posted by *mojobear*
> 
> Woha Tsm106 - what are your OCs on the lightnings for 24/7 gaming?


I actually run underclocked at 1000/1250mhz 24/7. It balances pretty well with my 4.6ghz daily cpu oc. For benching add 300/450mhz.


----------



## mojobear

Yikes, you do 1300 on the core? lol nice. I haven't dared bump the voltage > +150mV for 1200/1350 (Elpida memory).









What kind of wattage do you get when benching? The most I've pulled at 1167/1350 +87mV is about 1350W.


----------



## tsm106

I haven't actually run a Kill A Watt check when benching at or near the max. I should try it sometime. But I have done some quick tests, like here.

This is two cards at +300mV and the CPU at 5GHz. I forgot to ramp my pumps up here, so it's ah... a little warm lol, but you get the point.



http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/33400_40#post_23296223

http://www.3dmark.com/fs/3758553


----------



## mojobear

Damn, that looks like fun. Sigh, too bad installing a 20A circuit would be too much trouble for me. Otherwise I would go your route, do a 1600W + 1000W, and bench the crap out of them.

The best I got with +200mV is about 1240MHz on the core... which I suppose is not bad for 4 cards... but doesn't beat 1300MHz.


----------



## tsm106

To get to 1300, you need closer to a +300mv offset plus some lucky silicon.


----------



## mojobear

Oh yikes, okay. No way I am going there. I'll stick to 1240MHz +200mV max.









Nice rig. You should post some pics of your quad rig and MoRa setup. I was contemplating a MoRa but went with another 480.


----------



## mojobear

Quote:


> Originally Posted by *wermad*
> 
> Fourth = quad fire...
> 
> This is the best review I've seen so far. Besides nav's xfire 295x2 review, this one does show some scaling...
> 
> http://forums.anandtech.com/showthread.php?t=2359120
> 
> Just looking for more info and experience thoughts.


It's actually very smooth gaming on quad with most titles, esp with the Omega drivers. At least that has been my experience. Of course, you know it's diminishing returns as you go up the GPU scale haha.
I get similar results as the graph with good-scaling games like Tomb Raider etc., but really, fortune smiled down on me and handed me a 4th GPU. I would not have gotten one of my own accord.
I mean, if you were playing an intensive game and getting around 30 FPS with tri-fire and added a fourth, the best you could do would be 40, but more like 35-36 FPS... doesn't change the gaming experience.
At least that's my two cents.
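The 30 → 35-36 FPS arithmetic above amounts to assuming the fourth card only delivers a fraction of its ideal share. A quick sketch (the 60% efficiency figure is an assumption for illustration, not a benchmark):

```python
def projected_fps(current_fps, current_gpus, added_gpus=1, efficiency=0.6):
    """Estimate FPS after adding GPUs: ideal linear scaling vs. a
    discounted estimate where each new card contributes only
    `efficiency` of one card's ideal share (illustrative model)."""
    per_gpu = current_fps / current_gpus          # ideal share of one card
    ideal = current_fps + per_gpu * added_gpus
    realistic = current_fps + per_gpu * added_gpus * efficiency
    return ideal, realistic

# 30 FPS on tri-fire, adding a fourth card:
print(projected_fps(30, 3))  # (40.0, 36.0)
```

Real scaling varies per game and driver, of course; this just shows why the fourth card moves the needle less than the second or third did.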


----------



## mojobear

Quote:


> Originally Posted by *tsm106*
> 
> To get to 1300, you need closer to a +300mv offset plus some lucky silicon.


Actually, you should post some pics of your quad rig for your 20,000th post.


----------



## wermad

Quote:


> Originally Posted by *mojobear*
> 
> its actually very smooth gaming on quad with most titles esp with the omega drivers. At least that has been my experience. Of course you now its diminishing returns as you go up on the GPU scale haha.
> I get similar results as the graph with good scaling games like tomb raider etc but really fortune smiled down on me and handed me a 4th gpu. I would not have gotten one by my own accord.
> I mean if you were playing an intensive game and was getting around 30 FPS with tri-fire and added a fourth, best you could do would be 40, but more like 35-36 fps...doesn't change the gaming experience.
> At least that's my two cents.


One thing that does seem to come up is that quad Hawaii does better in Eyefinity (well, as long as it's set up right). At this point I'm really going with one monitor (from a previous 5x1 1200 setup). I'm not 100% sure, but if the opportunity comes up I want to take advantage of it. I'll mull it over for a while. I'm a few weeks out before my next case comes in (crossing fingers super tight!), so the only change I've made was adding a new CPU and motherboard setup.


----------



## tsm106

Quote:


> Originally Posted by *mojobear*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> To get to 1300, you need closer to a +300mv offset plus some lucky silicon.
> 
> 
> 
> actually you should post some pics of your quad rig for your 20,000 post
Click to expand...

Doh, wasn't even paying attention lol. That said, there are some in my rig pics. You can see my main rig's evolution too.


----------



## Arizonian

Quote:


> Originally Posted by *KAS Phase4*
> 
> My new R9 290, Sapphire Trix (I flashed bios 's Trix OC @ 1000/1300), ASIC 87.6%
> 
> 
> 
> 
> 
> 
> 
> 
> 
> OC stable ingame at 1200/1650 (+193mV)
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Alexbo1101

Alright boys (and gals), question time!









I'm tempted to buy another 290X with the current price drops, but with the 3X0X somewhere in the "near" future, I'm wondering if I should hold off and wait.

So my question for you is the following; yay or nay for another 290x?

(More specifically a Sapphire Tri-X; the Matrix is actually cheaper, but then I'd have to do a ******ed loop to fit it into my system.)


----------



## YellowBlackGod

Quote:


> Originally Posted by *Alexbo1101*
> 
> Alright boys (and gals), question time!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm temted to buy another 290x with the current price drops, but with the 3X0x somewhere in the "near" future I'm wondering if I should hold off and wait.
> 
> So my question for you is the following; yay or nay for another 290x?
> 
> (More specific a Sapphire Tri-X, the Matrix is actually cheaper, but then I have to do a ******ed loop to fit it into my system.)


I am thinking about the same, and here is what I think:

1) There will always be something new around the corner. The best thing is to buy today what you want now. When you sell it, you will buy something newer and better than the one you'd be waiting for now.
2) The 290X has nothing to lose in performance against at least the 380X. The 390X is another matter, not for everyday normal use; it is more like the upcoming Titan Z.
3) The 290X is future-proof. DX12, OpenGL 4.4, FreeSync, Mantle and so on. Currently the best card around.
4) Buy a second 290X and you won't need an upgrade for the next two years (especially now that Mantle and DirectX 12 will recognise the RAM of both GPUs as ONE). You will benefit more from selling them both in a couple of years.
5) It is cheaper.
6) For a good custom model of the 390X or 380X you will have to wait even longer, at least a couple of months.
7) Still need more arguments?


----------



## GOLDDUBBY

Quote:


> Originally Posted by *Yorkston*
> 
> ..Unfortunately i'm running into a temperature wall, even with my G10+H55+gellid VRM kit. Core and VRM2 hover in the low 60s, but VRM1 is hitting ~86C even with the Gellid kit, Fujipoly pad, and a side intake fan. Any ideas short of a full waterblock? MUST GO FASTER.


They're rated to run at 81°C, so you're running them around the rated temperature.
Quote:


> Originally Posted by *Alexbo1101*
> 
> Alright boys (and gals), question time!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm temted to buy another 290x with the current price drops, but with the 3X0x somewhere in the "near" future I'm wondering if I should hold off and wait.
> 
> So my question for you is the following; yay or nay for another 290x?
> 
> (More specific a Sapphire Tri-X, the Matrix is actually cheaper, but then I have to do a ******ed loop to fit it into my system.)


Buy now. With the price drop, you can sell them off in a couple of months when the 3xx arrives. Whatever little money you lose, you can consider "rent".


----------



## Alexbo1101

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> Buy now. With the pricedrop, you can sell them off in a couple of months when the 3xx arrives. What ever little money you lost you can consider "rent"


Quote:


> Originally Posted by *YellowBlackGod*
> 
> Ι am thinking about the same, and here is what I think:
> 
> 1) There will be always somenthing new around the corner. Best thing is to buy today what you want now. When you sell it you will buy something newer and better than the one you want to wait for now.
> 2) The 290X has nothing to lose in performance against at least 380X. 390X this is another matter, not for everyday normal use. It is more like the upcoming titan Z.
> 3) The 290X is futureproof. DX12, Open GL 4.4, Freesync, Mantle and so on. Currently the best card around.
> 4) Buy a second 290X, you won't need an upgrade for the next two years (especially now that mantle and DirectX 12 will recognise the RAM of both GPUs as ONE). You will benefit more from selling them both in a couple of years.
> 5) It is cheaper.
> 6) For a good custom model of the 390X or 380X you will have to wait even more, at least a couple of months.
> 7) Still need more arguments?


I wouldn't call it cheap, since the card still costs ~$420, but that's $205 less than launch price, so I guess that's something








And the problem with Denmark is that I'm probably never going to find someone to take it off my hands when I want to sell it...


----------



## YellowBlackGod

Quote:


> Originally Posted by *Alexbo1101*
> 
> I wouldn't call it cheap, since the card still costs ~$420, but that's $205 less than launch price, so I guess that's something
> 
> 
> 
> 
> 
> 
> 
> .
> 
> And the problem with Denmark is that I'm probably never going to find someone to take it off my hands when I want to sell it...


Same goes for me here in Greece. But there are tech forums that will help a lot with selling them.







Still, 2x 290X will give you performance near the level of a 390X.


----------



## GOLDDUBBY

get this one, http://www.club-3d.com/index.php/products/reader.en/product/radeon-r9-290x-8gb-royalace.html

kr 3.080 - kr 3.778 http://www.pricerunner.dk/pl/37-3024575/Grafikkort/Club-3D-Radeon-R9-290X-RoyalAce-%28CGAX-R929X9SO%29-Sammenlign-Priser

Like an AMD version of GTX Titan


----------



## Alexbo1101

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> get this one, http://www.club-3d.com/index.php/products/reader.en/product/radeon-r9-290x-8gb-royalace.html
> 
> kr 2.739 - kr 3.856 http://www.pricerunner.dk/pl/37-2972494/Grafikkort/Club-3D-Radeon-R9-290X-RoyalAce-%28CGAX-R929X8SO%29-Sammenlign-Priser


That's almost the same price as a Tri-X (which I'd rather have, as I don't trust XFX and Club3D).

EDIT: You're a sneaky one, but how well would that work (4GB plus 8GB)?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *tsm106*
> 
> I actually run underclocked at 1000/1250mhz 24/7. It balances pretty well with my 4.6ghz daily cpu oc. For benching add 300/450mhz.


That's what I run mine at for 24/7









It's not easy getting past 1300MHz

http://hwbot.org/submission/2568247_homecinema_pc_3dmark11___performance_radeon_r9_290_19462_marks

http://hwbot.org/submission/2619817_homecinema_pc_3dmark11___performance_radeon_r9_290x_19738_marks

http://hwbot.org/submission/2567679_homecinema_pc_3dmark11___performance_2x_radeon_r9_290_28744_marks?recalculate=true

http://hwbot.org/submission/2567357_homecinema_pc_3dmark11___performance_3x_radeon_r9_290_33217_marks


----------



## YellowBlackGod

Is the 290X unlockable to 3072 cores, or is it a full Hawaii chip?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *YellowBlackGod*
> 
> Is the 290X unlockable to 3072 cores, or is it a full Hawaii chip?


2816 unified cores for 290x
2560 unified cores for 290


----------



## YellowBlackGod

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 2816 shader cores for 290x


That I know.







Just wanted to make sure that it is a full chip and not a locked one.


----------



## Spectre-

Didn't 8Pack say there is supposed to be a full-fat Hawaii?


----------



## MikeMike86

http://www.overclock.net/t/1497475/various-amd-s-radeon-r9-290x-hawaii-xt-gpu-might-not-be-the-full-chip-after-all

It'd be awesome if we got a free 10% gain in performance with a BIOS flash. All I can find is that they're deactivated; I would assume laser-cut, but either way I'd definitely try a few BIOS flashes when the full-fat ones come out haha


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Spectre-*
> 
> didnt 8 pack say there is suppose suppose to be a full fat Hawaii


Sounds like a possible 3xx series card to me


----------



## Spectre-

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Sounds like a possible 3xx series card to me


Could just be the R9 380 will be the R9 290X (rehashed) and the R9 380X will be full-fat Hawaii,

and the 390 series will be Fiji with HBM.


----------



## Regnitto

Possibly 380 = 290, 380X = 290X, 390X = full-fat Hawaii, 390 = fat Hawaii with a couple deactivated cores........


----------



## Serandur

Well, I've been refunded for my 970s. Is the Lightning 290X the way to go? Or should I consider a Vapor X, Tri X, or PCS over it?


----------



## Alexbo1101

Of the models you list I would probably go like this: Lightning > Vapor-X > Tri-X = PCS+

Keep in mind that this is my personal preference.


----------



## rdr09

Quote:


> Originally Posted by *Serandur*
> 
> Well, I've been refunded for my 970s. Is the Lightning 290X the way to go? Or should I consider a Vapor X, Tri X, or PCS over it?


Kinda late in the game. Prices seem to be going up for Hawaii and the 300 series is almost here. Unless you buy used, I'd say wait.


----------



## k2blazer

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, this is definitely the place to go for help (in the appropriate section of course).....
> 
> I've never seen any k series Sandy Bridge chip with less than 4.6GHz in the tank....


4.6... Didn't want to push it any further.

http://valid.x86.fr/8w90cb


----------



## GOLDDUBBY

Quote:


> Originally Posted by *Alexbo1101*
> 
> That's almost the same price as a Tri-X (which I'd rather have, as I don't trust XFX and club3D).
> 
> EDIT: You're a sneaky one, but how good would that work (4GB plus 8GB)?


Sell your 4gb one and get 2x 8gb ones!







5k ready!


----------



## Agent Smith1984

Looks like the 3.5GB thing with the 970 is driving prices up..... Surprising, in my opinion. I dunno if I'd dump a 970 over 0.5GB.....


----------



## skupples

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Looks like the 3.5gb thing with the 970 is driving prices up..... Surprising in my opinion. I dunno if I'd dump a 970 over .5gb.....


Noticed this as well.


----------



## Alexbo1101

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> Sell your 4gb one and get 2x 8gb ones!
> 
> 
> 
> 
> 
> 
> 
> 5k ready!


I'll do that the minute someone wants to buy a 290X with an EK Copper/Acetal block for $455


----------



## GOLDDUBBY

Quote:


> Originally Posted by *Alexbo1101*
> 
> I'll do that the minute someone wants to buy a 290x with a EK Copper/Acetal block for $455


I would, just to have Mantle for BF4, but NVIDIA has implemented a "shut off" feature in their drivers that disables PhysX if you have a non-NVIDIA card in the system. Typical NVIDIA BS!

Upgrading my rig to X99 this fall, switching back to AMD (GPUs) then as well.


----------



## Serandur

Quote:


> Originally Posted by *Alexbo1101*
> 
> Of the models you list I would probably go like this: Lightning > Vapor-X > Tri-X = PCS+
> 
> Keep in mind that this is my personal preference.


I guess I'd agree overall, though I like the Vapor X most aesthetically.

Quote:


> Originally Posted by *rdr09*
> 
> kinda late in the game. prices seems to be going up for hawaii and the 300 series are almost here. unless you buy used, i'd say wait.


I need a "cheap" and temporary card to hold me over until the new stuff. Lightning 290X looks good.









I've got the real money saved for some next-gen flagship part, will sell the 290X too.


----------



## rdr09

Quote:


> Originally Posted by *Serandur*
> 
> I guess I'd agree overall, though I like the Vapor X most aesthetically.
> I need a "cheap" and temporary card to hold me over until the new stuff. Lightning 290X looks good.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've got the real money saved for some next-gen flagship part, will sell the 290X too.


if you view $400 as cheap - sure.


----------



## Spectre-

Quote:


> Originally Posted by *rdr09*
> 
> if you view $400 as cheap - sure.


$500 for a 290X in Aus


----------



## Agent Smith1984

Everyone has lost their minds.... lol


----------



## mojobear

Quote:


> Originally Posted by *wermad*
> 
> One thing that does seem to come up is that quad Hawaii does better in Eyefinity (well, as long as its setup right). At this point I'm really going w/ one monitor (from previous 5x1 1200). I'm not 100% sure but if the opportunity comes up, i want to take advantage of it. I'll mull over it for a while. I'm a few weeks out before my next case comes in (crossing fingers super tight!), so the only changes I've made was add a new cpu and mb setup.


Nice. I saw your old rig in your profile; it looks beastly. Eyefinity definitely works well, although I am coming from NVIDIA Surround, so I'm not sure how it compares to the 7970s. For sure you can run triple monitors now without DP, which is a nice change.

I am going to take the plunge into 1440p Eyefinity in a few months once some nice monitors come out from BenQ or Asus.


----------



## Agent Smith1984

Quote:


> Originally Posted by *k2blazer*
> 
> 4.6... Didn't want to push it any further.
> 
> http://valid.x86.fr/8w90cb


Taking it easy on the ol' girl, huh?
If you ever get brave, go for 5GHz; a lot of those have been known to do it. Push that RAM more too! CL9 @ 1333 is way conservative..... Good to see it's still doing a good job for you though.


----------



## Elmy

Quote:


> Originally Posted by *Alexbo1101*
> 
> That's almost the same price as a Tri-X (which I'd rather have, as I don't trust XFX and club3D).
> 
> EDIT: You're a sneaky one, but how good would that work (4GB plus 8GB)?


Why don't you trust Club3D? They are a very good company and have been around a long time in the Netherlands/Europe. They only started selling their products in North America about 2 years ago.


----------



## BradleyW

Quote:


> Originally Posted by *Elmy*
> 
> Why dont you trust Club3D? They are a very good company and have been around a long time in the Netherlands/Europe. They just started selling their products in North America about 2 years ago.


Club3D is Powercolor btw. I'd trust them.


----------



## Alexbo1101

Quote:


> Originally Posted by *Elmy*
> 
> Why dont you trust Club3D? They are a very good company and have been around a long time in the Netherlands/Europe. They just started selling their products in North America about 2 years ago.


I've always thought of them as a discount brand with their flashy cards and such, maybe I'm wrong.
Quote:


> Originally Posted by *BradleyW*
> 
> Club3D is Powercolor btw. I'd trust them.


Aren't PowerColor, Club3D and XFX all part of the TUL Corporation?


----------



## Spectre-

Quote:


> Originally Posted by *Alexbo1101*
> 
> I've always thought of them as a discount brand with their flashy cards and such, maybe I'm wrong.
> Aren't PowerColor, Club3D and XFX all part of the TUL Corporation?


TBH, with Hawaii, companies like PowerColor, Club3D and Sapphire amazed me with their cooling (considering Australia only just started getting Sapphire stuff in 2013).


----------



## yiannis

Hello guys, I am going to buy two Sapphire R9 290Xs for playing BF4 at 144Hz. I want to know your opinion; my CPU is an i5 2500K @ 4.5GHz. If somebody could tell me about performance and temps, it would be nice. Thanks a lot... greetings from Greece


----------



## tsm106

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I actually run underclocked at 1000/1250mhz 24/7. It balances pretty well with my 4.6ghz daily cpu oc. For benching add 300/450mhz.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's what I run mine at for 24/7
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *
> Its not easy getting past 1300mhz*
> 
> http://hwbot.org/submission/2568247_homecinema_pc_3dmark11___performance_radeon_r9_290_19462_marks
> 
> http://hwbot.org/submission/2619817_homecinema_pc_3dmark11___performance_radeon_r9_290x_19738_marks
> 
> http://hwbot.org/submission/2567679_homecinema_pc_3dmark11___performance_2x_radeon_r9_290_28744_marks?recalculate=true
> 
> http://hwbot.org/submission/2567357_homecinema_pc_3dmark11___performance_3x_radeon_r9_290_33217_marks
Click to expand...

True. And I do it on water only.









http://www.3dmark.com/fs/3758553


----------



## Alexbo1101

Think I'll be going with Sapphire for my second 290x, since I trust Sapphire and the only 290x that's cheaper is Asus' Matrix card.


----------



## tsm106

Quote:


> Originally Posted by *Alexbo1101*
> 
> Think I'll be going with Sapphire for my second 290x, since I trust Sapphire and the only 290x that's cheaper is Asus' Matrix card.


You guys are driving prices up lol.


----------



## Alexbo1101

Quote:


> Originally Posted by *tsm106*
> 
> You guys are driving prices up lol.


Whatever do you mean








No seriously, the Matrix is $25 cheaper, but I'm not touching that thing...

And I doubt that the Danish market has any effect on the American market at all









----------



## k2blazer

Quote:


> Originally Posted by *Agent Smith1984*
> 
> taking it easy on the ol' girl huh?
> You ever get brave, go for 5ghz, a lot of those have been known to do it. Push that ram more too! Cl 9 @ 1333 is way conservative..... Good to see its still doing a good job for you though.


Yes indeed, she's a few years old now. lol

Here's the result with the cards slightly overclocked.

http://www.3dmark.com/3dm/5876743


----------



## Agent Smith1984

Quote:


> Originally Posted by *k2blazer*
> 
> Yes indeed, she's a few years old now. lol
> 
> Here's the result with the cards slightly overclocked.
> 
> http://www.3dmark.com/3dm/5876743


I am so ready to get another 290, a new PSU, and a Sabertooth....

I'm totally cool with keeping the X6 still. Hell, I'm still putting up an 8700-point physics score in Firestrike! lol


----------



## rdr09

Quote:


> Originally Posted by *Spectre-*
> 
> $500 for 290X in aus


I'd like to visit beautiful Australia before winter sets in. Last time I went there the currencies were equal. Got a $90 speeding ticket in Sydney. lol


----------



## k2blazer

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I am so ready to get another 290, new PSU, and a sabertooth....
> 
> I'm totally cool with keeping the x6 still. Hell, I'm still putting up a 8700 point physics score in firestrike! lol


That's very decent, beats my 2500K.

I can't seem to push the core clocks on my 290s past 1100 without running into issues. I would like to get as far as possible without changing the voltage; maybe I've reached it? :s

And even pushing the memory a tiny bit, 1300-1320, will cause issues.


----------



## Agent Smith1984

Quote:


> Originally Posted by *k2blazer*
> 
> That's very decent, beats my 2500k.
> 
> I cannot seem to push the core clocks on my 290's past 1100 without running into issues, I would like to get as far as possible without changing the voltage, maybe I've reached it? :s
> 
> & even pushing a tiny bit on the memory 1300-1320 will cause issues.


You usually have to increase the voltage to push the memory past stock.
1100 is definitely near the top on stock voltage on the core also.

Try a 100mV offset if the temps are good.... 50% power limiter also.

Should give you around 1150 core, and around 1400-1500 on the VRAM.

Again, watch temps....
Also use Afterburner to make sure you aren't throttling.
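If you log clocks while gaming (Afterburner's hardware monitor can write a log), a small script can flag throttling dips. This sketch assumes a simplified CSV with `timestamp` and `core_clock` columns, which is not Afterburner's exact export format, so treat the column names as placeholders:

```python
import csv

def find_throttle_dips(log_path, target_mhz, tolerance=0.97):
    """Return (timestamp, clock) rows where the logged core clock fell
    more than a few percent below the target, which suggests power-limit
    or thermal throttling rather than an unstable OC."""
    dips = []
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            clock = float(row["core_clock"])
            if clock < target_mhz * tolerance:
                dips.append((row["timestamp"], clock))
    return dips
```

Dips that show up repeatedly during a steady benchmark load point at the power limit or temps; crashes without dips point at plain instability.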


----------



## tsm106

lol, the 290X Lightning just jumped 70 bucks overnight at Newegg and Amazon. And the crappy Gaming card jumped even higher, at 130 or something, lol. What the hell is going on?


----------



## k2blazer

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You have to increase the voltage to push the memory past stock usually.
> 1100 is definitely near the top on stock voltage on the core also.
> 
> Try a 100mV offset if the temps are good.... 50% power limiter also.
> 
> Should give you around 1150 core, and around 1400-1500 on the VRAM
> 
> Again, watch temps....
> Also use afterburner to make sure you aren't throttling.


Temps aren't great, highest hit was 76 and that was with 98% fan speed.

VRMs were under 60 though.


----------



## Agent Smith1984

Quote:


> Originally Posted by *k2blazer*
> 
> Temps aren't great, highest hit was 76 and that was with 98% fan speed.
> 
> VRMs were under 60 though.


Reference cards or aftermarket?


----------



## k2blazer

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Reference cards or aftermarket?


Sapphire Tri-x.


----------



## Agent Smith1984

Quote:


> Originally Posted by *k2blazer*
> 
> Sapphire Tri-x.


Oh, well you should be fine then...

76c is a good temp. They don't even throttle until the 90's I think.....

Also, make sure you use 50% power limit, even if you don't OC, trust me!
I was getting throttling on the same exact card, with no OC at all, until I set the power limit to 50%


----------



## Agent Smith1984

Quote:


> Originally Posted by *tsm106*
> 
> lol, 290x lightning just jumped 70 bucks overnight at newegg and amazon. And the crap gaming card jumped even higher at 130 or something lol what the hell is going on?


I know, it's insane....

I was planning on 290's being 150-175 used pretty much everywhere by now....
But NOOOO..... NVIDIA had to come out and admit they gimped their best selling card....

Now everybody wants a friggin 290 again cause of the 4GB. Kind of funny for the people who lost their minds, sold their 290's to get 970's, and are now watching their card go down in value, while the 290 goes right back up.

It's all ridiculous if you ask me though..... you'd think bitcoin had come back or something


----------



## Aaron_Henderson

I get about 9000 physics score with my 2500K


----------



## k2blazer

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Oh, well you should be fine then...
> 
> 76c is a good temp. They don't even throttle until the 90's I think.....
> 
> Also, make sure you use 50% power limit, even if you don't OC, trust me!
> I was getting throttling on the same exact card, with no OC at all, until I set the power limit to 50%


Ahh ok, so in Sapphire Trixx what should I change the VDDC Offset to?


----------



## GOLDDUBBY

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Oh, well you should be fine then...
> 
> 76c is a good temp. They don't even throttle until the 90's I think.....
> 
> Also, make sure you use 50% power limit, even if you don't OC, trust me!
> I was getting throttling on the same exact card, with no OC at all, until I set the power limit to 50%


Wait, what? 50%? Not 100%? How can a card throttle from too much power? The BIOS clock table might be faulty... seems weird, though.


----------



## Agent Smith1984

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> Wait, what? 50%? Not 100%? How can a card throttle from too much power? The BIOS clock table might be faulty... seems weird, though.


No....
It throttled at 0% power limit increase.
When I increased it to 50% power limit it stopped.

The 50% power limit means that the card can draw 50% more power than its standard power rating.

The card came overclocked from the factory to 1000MHz core and with no power limit increase it would fluctuate between 940 and 990MHz. I went straight to 50% and it stopped immediately and stayed pegged at 1000MHz core.
From there I was able to overclock.
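The arithmetic behind that power-limit slider is simple enough to sketch. Note the 250W nominal board power below is an assumed figure for illustration only, not a quoted spec for any particular 290/290X:

```python
# What a PowerTune power-limit offset works out to in watts.
# The 250W nominal board power is an assumed illustrative figure;
# actual board power varies by card and BIOS.

def effective_power_cap(board_power_w: float, limit_offset_pct: float) -> float:
    """A +50% power limit lets the card draw 1.5x its nominal budget."""
    return board_power_w * (1 + limit_offset_pct / 100.0)

print(effective_power_cap(250.0, 0))   # 250.0 W at stock (0% offset)
print(effective_power_cap(250.0, 50))  # 375.0 W with the +50% offset
```

So "50%" on the slider is the extra headroom on top of the stock budget, which is why the factory OC stopped bouncing off the cap once it was raised.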


----------



## Widde

Just a quick question: will running a 1080p monitor at 1440p with VSR damage it?


----------



## tsm106

Quote:


> Originally Posted by *Widde*
> 
> Just a quick question, Will running a 1080p monitor at 1440p with vsr damage it?


No.


----------



## Agent Smith1984

Quote:


> Originally Posted by *k2blazer*
> 
> Ahh ok, so in Sapphire Trixx what should I change the VDDC Offset to?


Starting at 100mV is usually good.... see how far you can push the core first.
Then do the VRAM.

If your temps are good, you can toy with 150-200mV
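The "push the core first, then the VRAM" routine can be sketched as a simple stepping loop. `run_stress_test` here is a hypothetical stand-in for whatever stability check you actually use (a Heaven loop, Firestrike, a long game session); it should return True only if the run finishes with no artifacts or crash:

```python
# Sketch of the stepping routine: raise one clock in small increments
# until the stress test fails or a ceiling is reached, keeping the
# last clock that passed. run_stress_test is a hypothetical callback.

def find_max_clock(start_mhz, step_mhz, ceiling_mhz, run_stress_test):
    """Return the highest clock (MHz) that still passes the stress test."""
    clock = start_mhz
    while clock + step_mhz <= ceiling_mhz and run_stress_test(clock + step_mhz):
        clock += step_mhz
    return clock

# Example with a pretend stability boundary at 1150 MHz:
print(find_max_clock(1100, 10, 1300, lambda mhz: mhz <= 1150))  # 1150
```

Run the same pass again for the memory clock once the core is settled, and treat anything that only just passes as one step too far for 24/7 use.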


----------



## Yorkston

Can Afterburner go past +100mv or will I need to switch to something else?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Yorkston*
> 
> Can Afterburner go past +100mv or will I need to switch to something else?


Afterburner is still capped at +100mV, I really don't understand why....

You will need trixx to get past that.


----------



## tsm106

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Yorkston*
> 
> Can Afterburner go past +100mv or will I need to switch to something else?
> 
> 
> 
> Afterburner is still capped at 100mv, I really don't understand why....
> 
> You will need trixx to get past that.
Click to expand...

You can use the internal command to tell AB to add more LLC offset. I'm pretty sure it's in the OP.
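For anyone hunting for that: I can't vouch for the exact LLC command tsm106 means, but the widely circulated MSIAfterburner.cfg switch for unlocking extended limits on AMD cards looks like the below. Treat it as illustrative, and check the comments shipped in your own Afterburner version's cfg before editing, since keys have changed between releases:

```ini
; MSIAfterburner.cfg, [ATIADLHAL] section (illustrative; verify against
; the notes in your Afterburner version's own cfg file before editing)
[ATIADLHAL]
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 1
; Mode values differ between versions (e.g. with or without PowerPlay
; support), so check the cfg's own comments for what 1 vs 2 means.
```

Back up the original cfg first; a malformed EULA string simply leaves the limits locked.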


----------



## Agent Smith1984

Quote:


> Originally Posted by *tsm106*
> 
> You can use the internal command to tell AB to add more LLC offset. I'm pretty sure it's in the OP.


If I could ever get Afterburner to apply my power limit increase, I'd love to, but no matter what I do I end up having to constantly set it in CCC, which defeats the purpose of having it. If I forget to launch CCC and set it back to 50, I get black screens whenever I launch a 3D app, because the VRAM clocks up to 1500 and freezes without the juice....

Then when I OC in Trixx, it works perfectly, but the damn settings screen causes a crash, so that app has been damn near worthless too. I can't get it to auto-launch with Windows, so it's a manual thing every time I want to crank it up.
OC for me personally with this card has been a pain in the ass....

I literally just run the damn thing stock when I play games most of the time. Luckily it's a strong enough card to not even notice.
I do see sometimes in Crysis 3 though, where I am like, "wait a minute, are my clocks set?"
That game actually seems to enjoy some extra RAM speed with AA cranked. Most games show a 2% gain or less from it though....


----------



## kizwan

You didn't use version 4.8.2, right?

It should be no problem if you disable overdrive in CCC.


----------



## cokker

Hi all, just picked up an MSI Gaming 290X and it's going great, but one thing is bugging me: programs like GPU-Z report VRM 2 at 82C. This temperature doesn't change, so I'm guessing it's a faulty reading?

Anyone else seen anything like this?


----------



## rt123

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If I could ever get Afterburner to apply my power limit increase, I'd love to, but no matter what I do I end up having to constantly set it in CCC, which defeats the purpose of having it. If I forget to launch CCC and set it back to 50, I get black screens whenever I launch a 3D app, because the VRAM clocks up to 1500 and freezes without the juice....
> 
> Then when I OC in Trixx, it works perfectly, but the damn settings screen causes a crash, so that app has been damn near worthless too. I can't get it to auto-launch with Windows, so it's a manual thing every time I want to crank it up.
> OC for me personally with this card has been a pain in the ass....
> 
> I literally just run the damn thing stock when I play games most of the time. Luckily it's a strong enough card to not even notice.
> I do see sometimes in Crysis 3 though, where I am like, "wait a minute, are my clocks set?"
> That game actually seems to enjoy some extra RAM speed with AA cranked. Most games show a 2% gain or less from it though....


You can never get AfterBurner to apply the power limit increase because you are supposed to disable AMD OverDrive while using AfterBurner; otherwise there is a conflict between the two, and OverDrive overrides AfterBurner's PL settings.


----------



## Agent Smith1984

Well, I have tried it with OD disabled, with OD enabled, and every combo there is, and it never works in AB. Trixx works fine and doesn't give a damn if OD is enabled, but the program crashes (though I had read trying an older version is the solution).

Never had this problem with my 280x though. I could have CCC, trixx, GPUTweak all open at one friggin time and AB would still change the PL......

Thinking about putting this tri-x up for sale..... maybe ask $200 for it and see what happens.
It's a great card, but may see if I can find a saddened 970 owner who REALLY has to have 4GB..... lol


----------



## tsm106

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, I have tried it with OD disabled, with OD enabled, and every combo there is, and it never works in AB. Trixx works fine and doesn't give a damn if OD is enabled, but the program crashes (though I had read trying an older version is the solution).
> 
> Never had this problem with my 280x though. I could have CCC, trixx, GPUTweak all open at one friggin time and AB would still change the PL......
> 
> Thinking about putting this tri-x up for sale..... maybe ask $200 for it and see what happens.
> It's a great card, but may see if I can find a saddened 970 owner who REALLY has to have 4GB..... lol


I never touch Overdrive, ever. I only use AB to raise powerlimit. Let me tell you, there is no way the below is possible if AB did not raise my powerlimit.


----------



## GOLDDUBBY

Quote:


> Originally Posted by *Agent Smith1984*
> 
> No....
> It throttled at 0% power limit increase.
> When I increased it to 50% power limit it stopped.
> 
> The 50% power limit means that the card can draw 50% more power than its standard power rating.
> 
> The card came overclocked from the factory to 1000MHz core and with no power limit increase it would fluctuate between 940 and 990MHz. I went straight to 50% and it stopped immediately and stayed pegged at 1000MHz core.
> From there I was able to overclock.


Sorry, but to me that would make 150%... but I understand now. I was puzzled whether you meant it overheated at 150% or something, but the card needs 150% TDP right out of the box. Sounds like a bad factory overclock? Like they maybe added voltage but didn't balance the TDP... weird


----------



## Agent Smith1984

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> Sorry, but to me that would make 150%... but I understand now. I was puzzled whether you meant it overheated at 150% or something, but the card needs 150% TDP right out of the box. Sounds like a bad factory overclock? Like they maybe added voltage but didn't balance the TDP... weird


It's certainly not the best overclocker, that's for sure....

Best I can get out of her 24/7 game stable is 1150/1600

Not a slouch by any means, but certainly not a lottery pick.


----------



## Agent Smith1984

Quote:


> Originally Posted by *tsm106*
> 
> I never touch Overdrive, ever. I only use AB to raise powerlimit. Let me tell you, there is no way the below is possible if AB did not raise my powerlimit.


Dude....

That's A LOT OF POWER

Just goes to show you that an overclocked CPU and 4 overclocked video cards can AND WILL use 300W or more A PIECE!!


----------



## GOLDDUBBY

Quad-fire OC'ed ?!


----------



## tsm106

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I never touch Overdrive, ever. I only use AB to raise powerlimit. Let me tell you, there is no way the below is possible if AB did not raise my powerlimit.
> 
> Dude....
> 
> That's A LOT OF POWER
> 
> Just goes to show you that an overclocked CPU and 4 overclocked video cards can AND WILL use 300W or more A PIECE!!
Click to expand...

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> Quad-fire OC'ed ?!


lol that is just *two cards and a cpu*.

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/33420#post_23296223


----------



## wermad

Quote:


> Originally Posted by *tsm106*
> 
> You guys are driving prices up lol.


It's probably the stupid GTX 970 issue. I'm noticing eBay is no longer as cheap as it was a couple of months ago.

Meh, looks like I won't need a 4th; just picked up a k monitor from a forum. Triple should be fine.


----------



## RelicLTD

Hey everyone! I just picked up an MSI 290x Twin Frozr as the first piece of my first ever gaming build. I know it's not the top non-reference card but I got a great deal. Obviously I've heard these cards run hot, and because I have yet to purchase the rest of my parts, I still have some flexibility. I am planning to stick to air for the one card, but am making my build keeping crossfire in mind for a few months down the line. I want to stick with air for crossfire, but I'm starting to realize this may not be an option for two of these cards. Keep in mind this will be my first ever build, so I apologize if I appear ignorant in some ways.

I have three questions:

1. What *case* do you recommend? I want to optimize airflow. I'm looking at the Corsair 400R vs Vengeance C70 for the side fan options, or the 750d(full tower) vs 450d. What do you think is best? I like the 750d in terms of flexibility if I have to go liquid in the end, but it wouldn't be my first choice if I can stick with air. I plan to throw on some SP fans in most slots. Are AF fans recommended for any particular fan locations (maybe the rear exhaust?)

2. If I pick up a *second card*, can it be any brand or reference for Crossfire?

3. Which *motherboard* is best? Obviously I want some space between the cards, but still have the best performance (not hitting the 4x on the lower slot). I would love some recommendations. I like the creature comforts of the Asus z97 Pro Gaming (but I can't tell if the gap is big enough) but am open to anything.

Here is my current parts list if anyone is interested:


Spoiler: Warning: Spoiler!



PCPartPicker part list / Price breakdown by merchant

*CPU:* Intel Core i5-4690K 3.5GHz Quad-Core Processor ($219.89 @ OutletPC)
*CPU Cooler:* Cooler Master Hyper 212 EVO 82.9 CFM Sleeve Bearing CPU Cooler ($28.98 @ OutletPC)
*Motherboard:* Asus Z97-A ATX LGA1150 Motherboard ($137.99 @ SuperBiiz)
*Memory:* G.Skill Ares Series 8GB (2 x 4GB) DDR3-1600 Memory ($69.98 @ OutletPC)
*Storage:* Samsung 850 EVO-Series 250GB 2.5" Solid State Drive ($99.99 @ Amazon)
*Storage:* Seagate Barracuda 1TB 3.5" 7200RPM Internal Hard Drive ($48.89 @ OutletPC)
*Video Card:* MSI Radeon R9 290X 4GB TWIN FROZR Video Card ($282.98 @ Newegg)
*Case:* Corsair Vengeance C70 (White) ATX Mid Tower Case ($109.99 @ Amazon)
*Power Supply:* EVGA SuperNOVA 1000G2 1000W 80+ Gold Certified Fully-Modular ATX Power Supply ($109.99 @ Newegg)
*Operating System:* Microsoft Windows 8.1 (OEM) (64-bit) ($89.75 @ OutletPC)
*Wireless Network Adapter:* TP-Link TL-WDN4800 802.11a/b/g/n PCI-Express x1 Wi-Fi Adapter ($34.98 @ OutletPC)
*Total:* $1233.41
_Prices include shipping, taxes, and discounts when available_
_Generated by PCPartPicker 2015-02-12 20:05 EST-0500_

Considering dropping the extra dough for the i7 and an upgraded CPU cooler (probably a Noctua that doesn't interfere with PCI slots). Video editing/streaming is more of a hobby and I can probably get by without the i7... anyway, that's another discussion.


----------



## Bertovzki

Quote:


> Originally Posted by *tsm106*
> 
> lol, 290x lightning just jumped 70 bucks overnight at newegg and amazon. And the crap gaming card jumped even higher at 130 or something lol what the hell is going on?


Strange? The 290X Tri-X OC has dropped $130 here in NZ... NZD of course


----------



## wermad

I bought a used 290 Tri-X OC for $200 less than two months ago. Now they're going for $250-260. I've tried offers but no one seems to be budging from $260. I missed one on hardforums for $225. Seems like everyone is getting greedy w/ the whole Nvidia fallout.


----------



## taem

Quote:


> Originally Posted by *wermad*
> 
> I bought a used 290 Tri-X OC for $200 less than two months ago. Now they're going for $250-260. I've tried offers but no one seems to be budging from $260. I missed one on hardforums for $225. Seems like everyone is getting greedy w/ the whole Nvidia fallout.


I find the used prices silly. It's easy to shop around and find a new 290 for $250-270. I'm not going to go with a used card to save like $20.


----------



## pshootr

I couldn't agree more. I got my Tri-X OC new for $269.99


----------



## wermad

Quote:


> Originally Posted by *taem*
> 
> I find the used prices silly. It's easy to shop around and find a new 290 for $250-270. I'm not going to go with a used card to save like $20.


These are Tri-X OCs. They tend to fetch more due to the better cooler and reference PCB design. They also have a factory OC (1GHz). I've seen a ton of vanilla 290s sell for less than $200. Newegg had these brand new at ~$250 (I bought one three months ago). Now they're ~$280-300. I saved $50 by carefully looking for a non-miner card that was ~six months old in December. If you are careful, you can find some used stuff that works great and saves you a bit.

Meh, will have to wait; just like the mining craze.


----------



## taem

Quote:


> Originally Posted by *wermad*
> 
> These are Tri-X OCs. They tend to fetch more due to the better cooler and reference PCB design. They also have a factory OC (1GHz). I've seen a ton of vanilla 290s sell for less than $200. Newegg had these brand new at ~$250 (I bought one three months ago). Now they're ~$280-300. I saved $50 by carefully looking for a non-miner card that was ~six months old in December. If you are careful, you can find some used stuff that works great and saves you a bit.
> 
> Meh, will have to wait; just like the mining craze.


The tri-x is actually one of the cheaper 290s you can find new at newegg and amazon. Regularly on sale for $270ish.


----------



## Alexbo1101

Alright, bought a 290X Tri-X OC; now I just need to see if I need a new PSU too. I hope I don't.


----------



## wermad

Quote:


> Originally Posted by *taem*
> 
> The tri-x is actually one of the cheaper 290s you can find new at newegg and amazon. Regularly on sale for $270ish.


Hmmm...not sure if you're making this up or actually looking at the source.

Asus DC2 Oc: $329 -> $299 mir
XFX BE: $314 -> $284 mir
Sapphire tri-x oc: $299 -> $279 mir
Gigabyte WF3/oc: $299 -> $279 mir
XFX DD: $299 -> $269mir
PC TurboDuo: $279 (oos)
MSI Gaming: $269 -> $239 mir
His iPower: $259 -> $249 mir
Diamond R9290D54Gv2: $259 (oos)

Make sure you use the newegg.com filters: R9 290, newegg seller, Highest to lowest

newegg.com

Amazon is too congested with many sellers, most of which tend to price their stuff wildly. Newegg is still pretty much the go-to for pricing. Plus, they tend to come up as one of the cheapest for current-market products.

Btw, I placed the Sapphire above the GB since it's a reference design, which gives you the option of any reference block. The GB uses a custom board, and though there are blocks for it, there aren't as many options as reference.


----------



## taem

Quote:


> Originally Posted by *wermad*
> 
> Hmmm...not sure if you're making this up or actually looking at the source.
> 
> Asus DC2 Oc: $329 -> $299 mir
> XFX BE: $314 -> $284 mir
> Sapphire tri-x oc: $299 -> $279 mir
> Gigabyte WF3/oc: $299 -> $279 mir
> XFX DD: $299 -> $269mir
> PC TurboDuo: $279 (oos)
> MSI Gaming: $269 -> $239 mir
> His iPower: $259 -> $249 mir
> Diamond R9290D54Gv2: $259 (oos)
> 
> Make sure you use the newegg.com filters: R9 290, newegg seller, Highest to lowest
> 
> newegg.com
> 
> Amazon is too congested with many sellers, most of which tend to price their stuff wildly. Newegg is still pretty much the go-to for pricing. Plus, they tend to come up as one of the cheapest for current-market products.
> 
> Btw, I placed the Sapphire above the GB since it's a reference design, which gives you the option of any reference block. The GB uses a custom board, and though there are blocks for it, there aren't as many options as reference.


A lot of those prices at Newegg just dropped over the past few days. For months now the Tri-X, XFX DD, and MSI Gaming have consistently been the cheapest new custom-cooled 290s. Believe me, I've been tracking this. The Tri-X was as low as $230 once.

But I still can't decide whether 290 xfire is worth it for me. It doesn't help me at 1440p and can't get me playable at 4K. That's why I never clicked buy. The low price on the Lightning was also a deterrent.


----------



## Serandur

I purchased a Lightning 290X today. Can't wait until it arrives... on Tuesday. Long wait ahead.


----------



## mAs81

Quote:


> Originally Posted by *Serandur*
> 
> I purchased a Lightning 290X today. Can't wait until it arrives... on Tuesday. Long wait ahead.


Congrats , that is one beast of a card


----------



## Widde

Quote:


> Originally Posted by *RelicLTD*
> 
> Hey everyone! I just picked up an MSI 290x Twin Frozr as the first piece of my first ever gaming build. I know it's not the top non-reference card but I got a great deal. Obviously I've heard these cards run hot, and because I have yet to purchase the rest of my parts, I still have some flexibility. I am planning to stick to air for the one card, but am making my build keeping crossfire in mind for a few months down the line. I want to stick with air for crossfire, but I'm starting to realize this may not be an option for two of these cards. Keep in mind this will be my first ever build, so I apologize if I appear ignorant in some ways.
> 
> I have three questions:
> 
> 1. What *case* do you recommend? I want to optimize airflow. I'm looking at the Corsair 400R vs Vengeance C70 for the side fan options, or the 750d(full tower) vs 450d. What do you think is best? I like the 750d in terms of flexibility if I have to go liquid in the end, but it wouldn't be my first choice if I can stick with air. I plan to throw on some SP fans in most slots. Are AF fans recommended for any particular fan locations (maybe the rear exhaust?)
> 
> 2. If I pick up a *second card*, can it be any brand or reference for Crossfire?
> 
> 3. Which *motherboard* is best? Obviously I want some space between the cards, but still have the best performance (not hitting the 4x on the lower slot). I would love some recommendations. I like the creature comforts of the Asus z97 Pro Gaming (but I can't tell if the gap is big enough) but am open to anything.
> 
> Here is my current parts list if anyone is interested:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> PCPartPicker part list / Price breakdown by merchant
> 
> *CPU:* Intel Core i5-4690K 3.5GHz Quad-Core Processor ($219.89 @ OutletPC)
> *CPU Cooler:* Cooler Master Hyper 212 EVO 82.9 CFM Sleeve Bearing CPU Cooler ($28.98 @ OutletPC)
> *Motherboard:* Asus Z97-A ATX LGA1150 Motherboard ($137.99 @ SuperBiiz)
> *Memory:* G.Skill Ares Series 8GB (2 x 4GB) DDR3-1600 Memory ($69.98 @ OutletPC)
> *Storage:* Samsung 850 EVO-Series 250GB 2.5" Solid State Drive ($99.99 @ Amazon)
> *Storage:* Seagate Barracuda 1TB 3.5" 7200RPM Internal Hard Drive ($48.89 @ OutletPC)
> *Video Card:* MSI Radeon R9 290X 4GB TWIN FROZR Video Card ($282.98 @ Newegg)
> *Case:* Corsair Vengeance C70 (White) ATX Mid Tower Case ($109.99 @ Amazon)
> *Power Supply:* EVGA SuperNOVA 1000G2 1000W 80+ Gold Certified Fully-Modular ATX Power Supply ($109.99 @ Newegg)
> *Operating System:* Microsoft Windows 8.1 (OEM) (64-bit) ($89.75 @ OutletPC)
> *Wireless Network Adapter:* TP-Link TL-WDN4800 802.11a/b/g/n PCI-Express x1 Wi-Fi Adapter ($34.98 @ OutletPC)
> *Total:* $1233.41
> _Prices include shipping, taxes, and discounts when available_
> _Generated by PCPartPicker 2015-02-12 20:05 EST-0500_
> 
> Considering dropping the extra dough for the i7 and an upgraded CPU cooler (probably a Noctua that doesn't interfere with PCI slots). Video editing/streaming is more of a hobby and I can probably get by without the i7... anyway, that's another discussion.


Yes, you can mix and match vendors with CrossFire ^^ Brand doesn't matter as long as it's the same GPU. One exception: a 290 will run with a 290X, but the X will slow down? Correct me if I'm wrong though ^^


----------



## wermad

Since Tahiti (i.e. the 7970), each card will run at its own clocks; it won't down-clock to match. As for xfire, as long as it's part of the same core family, they can crossfire (i.e. 7950, 7970, 280X, 7990; 290, 290X, 295X2, 380X*).

Vram will match the lowest card though. Hopefully this will pass once dx12 stacks vram.
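That VRAM behavior boils down to a min() over the group; a trivial sketch of the pre-DX12 mirroring, not any AMD API:

```python
# Under AFR CrossFire each card holds a full copy of the frame data,
# so the usable framebuffer is bounded by the smallest card in the
# bridge. Core clocks, per the post above, stay independent per card.

def crossfire_usable_vram_gb(vram_per_card_gb):
    """Effective VRAM for the CrossFire group = the smallest card's VRAM."""
    return min(vram_per_card_gb)

print(crossfire_usable_vram_gb([4, 8]))  # 4 -- an 8GB partner is held to 4GB useful
```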


----------



## fyzzz

I have finally got GPU Tweak to work properly by flashing my BIOS to the reference Asus 290 BIOS. Now the problem is that my card seems to be a bad overclocker and doesn't care about the extra voltage, and temperatures aren't good either (the DC2 cooler sucks for the 290).


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> I have finally got GPU Tweak to work properly by flashing my BIOS to the reference Asus 290 BIOS. Now the problem is that my card seems to be a bad overclocker and doesn't care about the extra voltage, and temperatures aren't good either (the DC2 cooler sucks for the 290).


What were the issues with the previous bios?


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> What were the issues with the previous bios?


The voltage was strange: I could only get just over 1.2V with 1.4V set in GPU Tweak. But now I get around 1.3V instead.


----------



## DDSZ

What would you recommend me: *NZXT Kraken G10 + Corsair H75* or *ARCTIC Accelero Hybrid II-120*?


----------



## Yorkston

Quote:


> Originally Posted by *DDSZ*
> 
> What would you recommend me: *NZXT Kraken G10 + Corsair H75* or *ARCTIC Accelero Hybrid II-120*?


Can't speak for the Accelero, but my G10+H55 works well. It dropped core temps at load from 80C to 55C versus the Windforce cooler and is way quieter. Currently running my 290 at 1200/1625 with +100mV. Core tops out at around 65C, VRM1 82C, VRM2 70C. Make sure you get the Gelid VRM kit if you go G10; VRM1 temps are going to be your bottleneck.


----------



## LandonAaron

Quote:


> Originally Posted by *DDSZ*
> 
> What would you recommend me: *NZXT Kraken G10 + Corsair H75* or *ARCTIC Accelero Hybrid II-120*?


I had the original Arctic Accelero Hybrid and it worked well, but I don't like the new one because it has no active cooling and doesn't even have heat sinks for the VRM. The one heat sink it has is on the wrong side of the card. Also the R9 290x is a hot card and my original Accelero had a hard time keeping it cool. If I were going to use a 120mm radiator I would use an extra thick one like H80 instead of the H75.

So basically I would go with the Kraken G10 with a Corsair H80.

Edit: I looked at the H80 and it doesn't have the right mounting mechanism for the G10. This Thermaltake one does though: http://www.newegg.com/Product/Product.aspx?Item=N82E16835106219R. But I would wait till they have a non-open-box one in stock, as it seems they didn't apply much of a discount for the open-box status.


----------



## Levys

Quote:


> Originally Posted by *Roy360*
> 
> unless of course you have my luck. And can't go above 1050 without a voltage bump. It's like VisionTek paid to get chips that can't overclock


The card I got from Club 3D is actually a VisionTek, and it clocks out at 1220/1600 max. I believe it does 1100 under water without extra juice.


----------



## bufferpl

Quote:


> Originally Posted by *Yorkston*
> 
> Can't speak for the Accelero, but my G10+H55 works well. It dropped core temps at load from 80C to 55C versus the Windforce cooler and is way quieter. Currently running my 290 at 1200/1625 with +100mV. Core tops out at around 65C, VRM1 82C, VRM2 70C. Make sure you get the Gelid VRM kit if you go G10; VRM1 temps are going to be your bottleneck.


Did you also get the heatsinks for the RAM, or only the Gelid VRM kit?


----------



## Yorkston

Quote:


> Originally Posted by *bufferpl*
> 
> Did u also get the heatsinks for RAM or only the Gelid VRM?


I got these for the VRAM; they fit perfectly and come with thermal tape.


----------



## Roy360

Quote:


> Originally Posted by *Levys*
> 
> The card I got from Club 3D is actually a VisionTek, and it clocks out at 1220/1600 max. I believe it does 1100 under water without extra juice.


then I guess it's my ASUS card.

When I only had my XFX card, it ran at 1250/1500 with a moderate bump. Since I added my DirectCU and VisionTek cards, my overclocks can't exceed 1000/1260 without a bump.

My R9 290X Club3D card also overclocks well. However, I've already purchased my waterblocks, and it seems like a waste to pair a 290X with two R9 290s.


----------



## LandonAaron

Alright so I got one of those discount Korean 2560 x 1440 monitors and have been having trouble with it where every time I try to play a game with 290x overclocked the monitor either gets double vision or just loses picture entirely. After much messing around and trying different settings I found out that it was specifically the voltage that was causing the issue. Anything over +100 seems to cause the monitor to screw up. I can leave everything at stock except voltage and turn it up to +200 and the monitor loses picture, but if I only turn the voltage up to +100 its fine. Due to the lower voltage setting this means I can now only overclock to 1150/1550 instead of the 1200/1600. So do you think this is a problem with my GPU or with the monitor?

If it was going crazy just from a high overclock I would be inclined to believe that it was just an unstable OC, but since I have been using this OC for months, and because it is specifically the voltage setting which causes issues I am inclined to believe it is the monitor and I should probably go ahead and get an RMA while I still can. What do yall think?


----------



## Forceman

I'd say it's the card. Something about the voltage or heat or something is causing the card to send a bad signal to the monitor, causing it to blank out. There is pretty much nothing in the monitor that could be affected by the card overclock, except maybe that it is less tolerant of borderline signals and so is more likely to blank out. If everything else is good with the monitor (no dead pixels, no back light bleed) I'd probably keep it.


----------



## ssgtnubb

So I've been wanting to try Team Red for a while now and wanted to ask the group's opinion on the PowerColor Devil 13 Dual Core AXR9 290X II. I'd like to get a crossfire setup to get a good feel compared to the 970 and 980 SLI setups I've had in the past. I don't see much on this card and I was just curious about the community's feelings towards it. My system won't have any issues pushing this card, from either a power standpoint or a cooling standpoint.


----------



## wermad

It's more in line with the 295X2 (it's a custom dual-290X), so the 295X2 reviews are a good place to start. Btw, I think this Devil did have some reviews.

edit:

http://www.overclockersclub.com/reviews/powercolor_devil13_dual_core_r9_290x_8gb/
http://www.guru3d.com/articles-pages/powercolor-radeon-290x-295x2-devil13-review,1.html
http://www.tomshardware.com/reviews/powercolor-devil-13-dual-amd-r9-290x-review,3853.html
http://hexus.net/tech/reviews/graphics/71425-powercolor-devil-13-dual-core-r9-290x/
http://www.fudzilla.com/35401-powercolor-devil-13-dual-gpu-r9-290x-reviewed
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/66784-powercolor-devil-13-r9-290x-dual-core-review.html
http://www.hardwareheaven.com/content/reviews/graphics-cards/40148/powercolor-devil13-r9-290x2-graphics-card-review
http://www.jagatreview.com/2014/06/review-powercolor-devil-13-dual-core-r9-290x-gaming-super-kencang-dengan-dual-hawaii-xt/

+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

Hey chaps! I sourced a couple of Vesuvius cards. I'm waiting on the seller to ship them next week. So that means my triplets are going bye-bye. They'll be in the market soon.


----------



## LandonAaron

Quote:


> Originally Posted by *bufferpl*
> 
> Did u also get the heatsinks for RAM or only the Gelid VRM?


If you have good airflow like what the G10 should provide you are probably good, but since you're on OCN, good probably isn't good enough. I would recommend aluminum heatsinks with thermal tape like the other poster recommended. I have the copper ones from Enzotech and they can be a pain: they are heavy, and thermal tape isn't strong enough to keep them attached very well.


----------



## LandonAaron

Quote:


> Originally Posted by *Forceman*
> 
> I'd say it's the card. Something about the voltage or heat or something is causing the card to send a bad signal to the monitor, causing it to blank out. There is pretty much nothing in the monitor that could be affected by the card overclock, except maybe that it is less tolerant of borderline signals and so is more likely to blank out. If everything else is good with the monitor (no dead pixels, no back light bleed) I'd probably keep it.


That's kind of what I'm thinking, as it just doesn't make sense to me how an overclock on the GPU could affect the monitor. I have been using this overclock for a long time at 1920 x 1080, and since the Omega drivers came out I have been using virtual super resolution at 2560 x 1440 or even higher on my 1920 x 1080 monitor without a problem. It seems to me that if it was the card, these problems would have shown up then as well, since it has to do the same amount of rendering either way. I would see artifacts sometimes when running with an OC on my 1080p monitor, so maybe it is just that on this monitor, whenever an artifact is passed to it, it causes it to blank out or something.

I don't know, I am probably just going to return it anyway. If the second one does this too, then I'll know for sure it's the card and not the monitor. But of the two 1080p monitors I have and the one new 1440p monitor, it is only this new one that has ever had an issue like this...


----------



## tsm106

Quote:


> Originally Posted by *ssgtnubb*
> 
> So I've been wanting to try Team Red for awhile now and wanted to ask the groups opinion on the PowerColor Devil 13 Dual Core AXR9 290X II. I'd like to get a crossfire setup to have a good feel compared to the 970 and 980 SLI setups I've had in the past. I don't see much on this card and I was just curious on the communities feelings towards it. My system won't have any issue's pushing this card both from a power standpoint nor a cooling standpoint.


Unless there's some reason you want a super massive aircooler, there isn't much reason to not get a reference 295x2.

Quote:


> Originally Posted by *LandonAaron*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bufferpl*
> 
> Did u also get the heatsinks for RAM or only the Gelid VRM?
> 
> 
> 
> If you have good airflow like what the G10 should provide you are probably good, *but since your on OCN good probably isn't good enough*. I would recommend aluminum heatsinks with thermal tape like the other poster recommended. I have the copper ones from Enzotech and they can be a pain. They are heavy and thermal tape isn't strong enough to keep them attached very well.
Click to expand...

lol it's never a good idea or good enough to not use heatsinks on your vrms, especially on cards like these.


----------



## Forceman

Quote:


> Originally Posted by *LandonAaron*
> 
> That's kind of what I'm thinking. As it just doesn't make sense to me how an overclock on the GPU could affect the monitor. But I have been using this overclock for a long time at 1920 x 1080 and since the omega drivers came out I have been using the virtual super resolution at 2560 x 1400 or even higher on my 1920 x 1080 monitor without a problem. And it seems to me that if it was the card these problems would have shown up then as well since it has to do the same amount of rendering either way. I would see artifacts sometimes when running with an OC on my 1080p monitor, so maybe it is just that on this monitor whenever an artifact is passed to it just causes it to blank out or something.
> 
> I don't know, I am probably just going to return it anyway. If the second one does this too then I'll know for sure its the card and not the monitor. But of the two 1080p monitors I have and the one new 1440p monitor it is only this new one that has ever had an issue like this...


VSR isn't sending an actual 1440p signal though, so I don't think that would show the issue. I think other people have had similar issues before - the voltage thing is ringing a bell in my mind somewhere. Maybe search the Korean monitor thread?


----------



## LandonAaron

Quote:


> Originally Posted by *tsm106*
> 
> Unless there's some reason you want a super massive aircooler, there isn't much reason to not get a reference 295x2.
> lol it's never a good idea or good enough to not use heatsinks on your vrms, especially on cards like these.


Not the VRMs, the VRAM. I would definitely get the Gelid kit if he has a reference board. I wouldn't splurge on the super high performance thermal tape (forget the brand name), but yes, for the VRMs you need a heatsink. For the VRAM, aluminum heatsinks are plenty, or no heatsink with adequate airflow.


----------



## wermad

Quote:


> Originally Posted by *tsm106*
> 
> Unless there's some reason you want a super massive aircooler, there isn't much reason to not get a reference 295x2.
> lol it's never a good idea or good enough to not use heatsinks on your vrms, especially on cards like these.


The reference cooler on the 295x2 is pretty mild imho. I talked to a few owners and most are having to take extreme measures, such as 3k rpm fans, to keep it under the 75C thermal limit AMD has imposed on the 295x2. Plus, owners tell me the fans are not controllable. I'll find out soon enough, though blocks are incoming (three Monsta 480s await!).

So the Devil's massive cooler doesn't look great at first, but if it doesn't have a low thermal limit like the 295x2, and you don't plan to crossfire (quad-fire), it would be a nice alternative to the 295x2's AIO cooler.


----------



## ghabhaducha

Hey guys, so I've been trying to figure this out for a couple of weeks: I can't seem to get crossfire working in Tomb Raider with my 2x 290x's. I'm currently running the latest Catalyst Omega drivers, and my rig is in my sig. I am also somewhat new to crossfire setups, so I don't know if there is some option that I have overlooked or what. Crossfire seems to work great in other games (GRID 2, SOM, AC:BF, etc.)

Thanks in advance!


----------



## kcuestag

Quick question: do the latest official Omega drivers support CrossfireX for Far Cry 4?

I am back on AMD with two R9 290X and I don't think my 2nd card is being used in that game.


----------



## CGabry

Add me...


----------



## ghabhaducha

Quote:


> Originally Posted by *kcuestag*
> 
> Quick question, does the latest official Omega drivers support CrossfireX for Far Cry 4?
> 
> I am back on AMD with two R9 290X and I don't think my 2nd card is being used in that game.


I don't think it does, at least it didn't for me.


----------



## GOLDDUBBY

Btw.. when running dual GPUs on a single card mounted in a 16x PCIe slot, would that be comparable to 2 cards in 8x PCIe?


----------



## joeh4384

Quote:


> Originally Posted by *kcuestag*
> 
> Quick question, does the latest official Omega drivers support CrossfireX for Far Cry 4?
> 
> I am back on AMD with two R9 290X and I don't think my 2nd card is being used in that game.


It doesn't have the profile enabled, but Far Cry 4 works pretty well with AFR-friendly mode set up in Catalyst.


----------



## wermad

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> Btw.. when running dual gpu on a single card, if mounted in a 16x pci.e would that be comparable to 2 cards in 8x pci.e ?


Well, before (7990 and previous) you did, though the 295x2 has its own onboard PLX chip. So when the 295x2 goes in a 3.0 16x slot, the PLX chip gives you 16x lanes. HardOCP verified, through their quad 295x2 review on a Maximus VI Extreme, that you can run two 295x2s on 8x 3.0.

As long as you're running PCIe 3.0 4x, you can run any single-core Hawaii, with the exception of the 295x2 (minimum is 8x 3.0). Hawaii will run on PCIe 2.0 8x (equivalent to 3.0 4x) and 2.0 16x (equivalent to 3.0 8x).

Which CPU and board do you have? That will determine if you're running 3.0 and what lanes are available to you.
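The lane equivalences above follow from the standard per-lane rates. A quick back-of-the-envelope sketch (the function name is just for illustration, not from any post) showing why 2.0 8x lines up with 3.0 4x:

```python
# Rough usable bandwidth per PCIe generation, to sanity-check the
# "2.0 8x ~= 3.0 4x" equivalence. Figures are the standard per-lane
# rates after line-coding overhead (8b/10b for gen 2, 128b/130b for
# gen 3); real-world throughput is a bit lower still.

def pcie_bandwidth_gbps(gen: int, lanes: int) -> float:
    gt_per_s = {2: 5.0, 3: 8.0}            # raw transfer rate, GT/s
    encoding = {2: 8 / 10, 3: 128 / 130}   # usable fraction after line coding
    return gt_per_s[gen] * encoding[gen] * lanes / 8  # bits -> bytes

print(pcie_bandwidth_gbps(2, 8))  # 2.0 x8 -> 4.0 GB/s
print(pcie_bandwidth_gbps(3, 4))  # 3.0 x4 -> ~3.94 GB/s
```

So 2.0 16x likewise works out to roughly the same 8 GB/s as 3.0 8x.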


----------



## taem

Why isn't multi-GPU support automatic? Why do we need title-specific driver-level support? It mystifies me in this day and age. I just don't understand the technical reasons for it.


----------



## mojobear

4k plunge. nice.

How do you feel about these types of monitors. http://www.ebay.ca/itm/221667696474

40 inch 4k. It's about the PPI of 1440p @ 27 inches.

I was contemplating going this, with PLP - 2 x 23 inch monitors in portrait on the sides. The PPI almost matches (about 10% less), and I don't think the image would be that distorted. THEN I find out that PLP is only on the R9 285.

Blah!


----------



## Forceman

Isn't 40" a little too big for a monitor? If you sit close enough to see the difference in resolution it seems like you'd be looking side to side, and if you sit farther back wouldn't the resolution be a waste? I'd think you'd want a higher PPI than a 1440p 27" if you are making the jump to 4k.


----------



## LandonAaron

Quote:


> Originally Posted by *mojobear*
> 
> 4k plunge. nice.
> 
> How do you feel about these types of monitors. http://www.ebay.ca/itm/221667696474
> 
> 40 inch 4k. Its about the PPI of 1440p @ 27 inches.
> 
> I was contemplating going this, with PLP - 2 x 23 inch monitors in portrait on the sides, PPI almost matches about 10% less, but I think the image wouldnt be that distorted. THEN I find out that PLP is only on the R9 285
> 
> 
> 
> 
> 
> 
> 
> Blah!


How do you mean R9 290x doesn't support PLP? I only have two monitors but I was able to rotate one of them to portrait mode.


----------



## mojobear

Quote:


> Originally Posted by *LandonAaron*
> 
> How do you mean R9 290x doesn't support PLP? I only have two monitors but I was able to rotate one of them to portrait mode.


My bad, I meant for Eyefinity PLP.


----------



## Arizonian

Quote:


> Originally Posted by *CGabry*
> 
> Add me...
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## ghabhaducha

Quote:


> Originally Posted by *joeh4384*
> 
> It doesn't have the profile enabled but Far Cry 4 works pretty well with AFR friendly mode setup in Catalyst.


Ah, this totally works. Hmm, now I know how to disable crossfire for Office 2013... lol. No longer a scrub with crossfire (ok, maybe less of one).


----------



## LandonAaron

Quote:


> Originally Posted by *Forceman*
> 
> Isn't 40" a little too big for a monitor? If you sit close enough to see the difference in resolution it seems like you'd be looking side to side, and if you sit farther back wouldn't the resolution be a waste? I'd think you'd want a higher PPI than a 1440p 27" if you are making the jump to 4k.


Quote:


> Originally Posted by *mojobear*
> 
> 4k plunge. nice.
> 
> How do you feel about these types of monitors. http://www.ebay.ca/itm/221667696474
> 
> 40 inch 4k. Its about the PPI of 1440p @ 27 inches.
> 
> I was contemplating going this, with PLP - 2 x 23 inch monitors in portrait on the sides, PPI almost matches about 10% less, but I think the image wouldnt be that distorted. THEN I find out that PLP is only on the R9 285
> 
> 
> 
> 
> 
> 
> 
> Blah!


That thing looks amazing. I think the pixel density is just right. Any higher and text will be too hard to read especially on the peripheral parts of the screen. I would like to get something like that and no longer have to mess with a multi monitor setup. But will it run at 60 FPS in 4k mode?


----------



## mojobear

Quote:


> Originally Posted by *LandonAaron*
> 
> That thing looks amazing. I think the pixel density is just right. Any higher and text will be too hard to read especially on the peripheral parts of the screen. I would like to get something like that and no longer have to mess with a multi monitor setup. But will it run at 60 FPS in 4k mode?


Pretty sure it runs 4k at 60Hz.

Pretty tempted, except for the fact that I can't do PLP Eyefinity with the R9 290s.


----------



## Bertovzki

I'm looking for a monitor next. I looked at this thing, but is it low grade? 8.5 ms response time? You guys will know - look at the specs on my local site here in NZ.

And if you say it has the same PPI as a 27" 1440p, then is there any advantage, as I will need to sit reasonably close?

http://www.pbtech.co.nz/index.php?z=p&p=MONPHS4065&name=Philips-BDM4065UC75-40-4K-UHD-Monitor--3840x2160--


----------



## fyzzz

I just ordered the Raijintek Morpheus with some Noctua NF-F12s to replace the crappy DC2 cooler on my 290. Can't wait until it's here.


----------



## leetmode

Hey guys, just finished upgrading my computer and finally have a pair of Sapphire Tri-X 290xs that I desperately needed - add me!

I have a quick question about my setup though: so far everything seems to be normal, except that when I play BF4 I am not getting 100% usage from both GPUs.

Is this normal? I was under the impression I should be seeing something like 99%...


----------



## taem

Quote:


> Originally Posted by *LandonAaron*
> 
> That thing looks amazing. I think the pixel density is just right. Any higher and text will be too hard to read especially on the peripheral parts of the screen. I would like to get something like that and no longer have to mess with a multi monitor setup. But will it run at 60 FPS in 4k mode?


What's your GPU setup that you'd worry about running 60fps @ 4k? In the reviews and benches of crossfire that I've looked at, you can't hold a constant 60 fps with fairly high settings. You're hard pressed to average 60 even on older titles like Hitman: Absolution.

I'd get this monitor right now if I could get 60fps with my 290 crossfire. As it stands though, unless you're going three cards or more, I don't think Hawaii cuts it for 4k gaming.

This screen looks great though, I'm hoping to get one when the 380x comes out.


----------



## mojobear

Quote:


> Originally Posted by *Bertovzki*
> 
> Im looking for a monitor next , I looked at this thing , but it is low grade ? is it not ? 8.5 ms response time ? , you guys will know , look at the specks on my local site here in NZ
> 
> And if you say it has same PPI as a 27" 1440p then is there any advantage , as i will need to sit reasonably close
> 
> http://www.pbtech.co.nz/index.php?z=p&p=MONPHS4065&name=Philips-BDM4065UC75-40-4K-UHD-Monitor--3840x2160--


The exact specs I'm not sure of. It's a VA panel, so the blacks are great, but there are some trade-offs.

PPI for 40 inch diagonal = 2160 pixels / 19.61inch height = 110 PPI
PPI for 27 inch diagonal = 1440 pixels / 13.24 inch height = 109 PPI

Here is the first review I found... there may be more now...

http://www.pcgamer.com/philips-bdm4065uc-monitor-review/

http://www.tftcentral.co.uk/reviews/philips_bdm4065uc.htm

There is also a Seiki 40 inch 4k coming out soon. http://www.pcgamer.com/seikis-40-inch-4k-display-is-a-desk-dominating-beauty-for-under-1000/
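The two densities quoted above can be double-checked without working out panel height at all, using the diagonal pixel count instead (helper name is illustrative, and 16:9 panels are assumed):

```python
import math

def ppi(diag_in: float, w_px: int, h_px: int) -> float:
    # pixels per inch = diagonal pixel count / diagonal length in inches
    return math.hypot(w_px, h_px) / diag_in

print(round(ppi(40, 3840, 2160)))  # 40" 4K    -> 110
print(round(ppi(27, 2560, 1440)))  # 27" 1440p -> 109
```

Same result as the height-based figures: the 40" 4K panel and a 27" 1440p panel are within ~1% of each other in density.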


----------



## Bertovzki

Quote:


> Originally Posted by *mojobear*
> 
> The exact specs Im not exactly sure of. Its a VA panel so the blacks are great but there are some trade offs.
> 
> PPI for 40 inch diagonal = 2160 pixels / 19.61inch height = 110 PPI
> PPI for 27 inch diagonal = 1440 pixels / 13.24 inch height = 109 PPI
> 
> Here is the first review I found... there may be more now...
> 
> http://www.pcgamer.com/philips-bdm4065uc-monitor-review/
> 
> http://www.tftcentral.co.uk/reviews/philips_bdm4065uc.htm
> 
> There is also a seiko 40 inch 4k coming out soon. http://www.pcgamer.com/seikis-40-inch-4k-display-is-a-desk-dominating-beauty-for-under-1000/


Thanks for the links. This screen is tempting. I probably won't be able to run 4k for a while, as I need a second GPU or a 3xxx card, but I guess I can downscale it to 1440p if needed.


----------



## hyp36rmax

Finally finished the window side panel for my TJ08-E and installed the Aquacomputer Farbwerk LED controller. I'll get a video up of the Farbwerk in action, rotating the LED colors automatically. Only thing left is the actual X99 gear, haha.





*Build Log: *Link


----------



## Bertovzki

Quote:


> Originally Posted by *hyp36rmax*
> 
> Finally finished the window side panel for my TJ08-E and installed the Aquacomputer Farbwerk LED Controller. I get a video up of the Farbwerk in action rotating the LED colors automatically. Only thing left is the actual X99 gear haha.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Build Log: *Link


The Farbwerk looks like an awesome controller. I nearly got one last night, but decided not to as I have an unused NZXT RGB, so I will fire that up first.

But I will get it at some point as a second controller for more controlled light in other areas too.


----------



## hyp36rmax

Quote:


> Originally Posted by *Bertovzki*
> 
> The Farbwerk looks like an awesome controller , i nearly got one last night , but decided not to as i have an unused NZXT RGB , so will fire that up first.
> 
> but i will get it at some point as a second controller for more controlled light in other areas too


Yes, the Farbwerk is pretty awesome. I look forward to discovering all the other applications I can use it for, as I'm going to look into how to program the Farbwerk to respond to music. Imagine the light show!!! haha


----------



## Bertovzki

Quote:


> Originally Posted by *hyp36rmax*
> 
> Yes the Farbwerk is pretty awesome, I look forward to discovering all the other applications I can use it for as I'm going to look into finding out how to program the Farbwerk to respond via music/ Imagine the light show!!! haha


Yip, I will definitely get one, and my m8 has a disco ball about the size of a tennis ball which should look great - or at least it will make for some amusing vids when I get the rig finished.


----------



## LandonAaron

Quote:


> Originally Posted by *Bertovzki*
> 
> Im looking for a monitor next , I looked at this thing , but it is low grade ? is it not ? 8.5 ms response time ? , you guys will know , look at the specks on my local site here in NZ
> 
> And if you say it has same PPI as a 27" 1440p then is there any advantage , as i will need to sit reasonably close
> 
> http://www.pbtech.co.nz/index.php?z=p&p=MONPHS4065&name=Philips-BDM4065UC75-40-4K-UHD-Monitor--3840x2160--


I really don't see why anyone would want a higher PPI than a 27" 1440p monitor, which is what I have. At this pixel density, with DPI set to default, you need to be within 2 feet of your monitor to read text. It really makes leaning back in my chair hard, lol.


----------



## LandonAaron

As I've previously mentioned, I recently made the jump to 1440p. I am currently running a single R9 290x, but am thinking of either going crossfire or just upgrading to a GTX 980. I can't find many (any?) benchmarks comparing two R9 290x's to a single GTX 980; though I'm sure the CF 290x would be the better performer, I just don't know by how much. It would cost about the same to go either route if I factor in selling my current card. My other concern is that I may not see the benefits of CF, since multi-GPU support seems kind of hit and miss nowadays. What would be y'all's rough estimate of the percentage of games that support crossfire?


----------



## Bertovzki

Quote:


> Originally Posted by *LandonAaron*
> 
> As I've previously mentioned I recently made the jump to 1440p. I am currently running a single R9 290x, but am thinking of either going crossfire or just upgrading to a GTX 980. I can't find many/any benchmarks comparing 2 R9 290x's to a single GTX 980, though I'm sure the CF 290x would be the better performer, I just don't know by how much. It would cost about the same to go either route if I factor in selling my current card. My other concern is that I may not see the benefits of CF since it seems kind of hit and miss now days for multi GPU support. What would be yall's rough estimate of percentage of games that support crossfire?


Right, I appreciate your feedback.

I guess to explain what might seem illogical: I just want a damn big monitor for one, or a big TV. The thing that made the aforementioned monitor attractive was its size and price for a large, good quality screen.

I am fully open to the TV options too if there are any suggestions, as I want a good TV as well. My old man has a 60 in plasma, and it looks fantastic and runs games perfectly, so I may be better off spending 2 times more and doing that, and perhaps getting a small 27" monitor in the meantime.

I guess the other reason I was thinking of the 4K was because I am thinking ahead to when, in a couple of years, it will probably be easy to run on one GPU.


----------



## YellowBlackGod

Quote:


> Originally Posted by *LandonAaron*
> 
> As I've previously mentioned I recently made the jump to 1440p. I am currently running a single R9 290x, but am thinking of either going crossfire or just upgrading to a GTX 980. I can't find many/any benchmarks comparing 2 R9 290x's to a single GTX 980, though I'm sure the CF 290x would be the better performer, I just don't know by how much. It would cost about the same to go either route if I factor in selling my current card. My other concern is that I may not see the benefits of CF since it seems kind of hit and miss now days for multi GPU support. What would be yall's rough estimate of percentage of games that support crossfire?


What kind of 290X do you have? A reference one? Or a good custom model? Because if you own a custom one you do not need to upgrade. Just take a look at the benchmarks of the new games. The 290X and the 980 have almost identical performance; sometimes the 980 is ahead, sometimes the 290X. The fact that the 290X is still, after almost 1.5 years, a competitive top-notch card shows how great its value for money is. Plus it is futureproof and has nothing less to offer compared with the 980: FreeSync, DX12 support, OpenGL 4.4, Omega Drivers, plus a better price and, in many custom models, more VRAM (8GB). I would crossfire if I were you. I am actually thinking about it, but the 8GB Vapor-X 290X from Sapphire that I own doesn't let me convince myself that I should do it, because it is really powerful even at 3K downscaled resolution (using the VSR feature of the Omega Drivers), not to mention 1080p and 2K.


----------



## Chopper1591

Quote:


> Originally Posted by *LandonAaron*
> 
> As I've previously mentioned I recently made the jump to 1440p. I am currently running a single R9 290x, but am thinking of either going crossfire or just upgrading to a GTX 980. I can't find many/any benchmarks comparing 2 R9 290x's to a single GTX 980, though I'm sure the CF 290x would be the better performer, I just don't know by how much. It would cost about the same to go either route if I factor in selling my current card. My other concern is that I may not see the benefits of CF since it seems kind of hit and miss now days for multi GPU support. What would be yall's rough estimate of percentage of games that support crossfire?


My vote goes to a second 290x.
It is roughly 25-30% more powerful.
Although the 980 would be more efficient in power consumption.

Do you need to upgrade now?
I'd advise you to wait for AMD to release the 300 series.


----------



## rdr09

Quote:


> Originally Posted by *leetmode*
> 
> Hey guys, just finished upgrading my computer and finally have a pair of Sapphire Tri-X 290xs that I desperately needed, add me!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I have a quick question about my set up though, so far everything seems to be normal except for the fact when I play BF4 I am not getting 100% usage from both GPUs.
> 
> Is this normal? I was under the impression I should be seeing something like 99%...


read this thread. it should include info about disabling ULPS if you have not done so.

http://www.overclock.net/t/1265543/the-amd-how-to-thread

is your cpu oc'ed?


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> read this thread. it should include info about disabling ULPS if you have not done so.
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread
> 
> is your cpu oc'ed?


Interested in seeing a solution to this.


----------



## rdr09

Quote:


> Originally Posted by *Chopper1591*
> 
> Interested in seeing a solution to this.


see post # 252 . . .

http://www.overclock.net/t/1442038/build-log-the-hawaiian-heat-wave/250

i have pretty much same usage with 2 290s in 4K with an i7 SB @ 4.5GHz.


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> see post # 252 . . .
> 
> http://www.overclock.net/t/1442038/build-log-the-hawaiian-heat-wave/250
> 
> i have pretty much same usage with 2 290s in 4K with an i7 SB @ 4.5GHz.


Interesting.
Which cards are you running?
And are they under water?

You are running a single 360 rad, right? So probably only cooling the cpu.

What do you think?
If I add another 290, will my fx-8320 be a bottleneck? It is running at 4.8ghz.


----------



## rdr09

Quote:


> Originally Posted by *Chopper1591*
> 
> Interesting.
> Which cards are you running?
> And are they under water?
> 
> You are running a single 360 rad, right? So probably only cooling the cpu.
> 
> What do you think?
> If I add another 290, will my fx-8320 be a bottleneck? It is running at 4.8ghz.


i am running 2 reference 290s at stock watercooled. i only have 3 120 rads, so like one rad per block. my temps are far from ideal. i doubt a 4.8GHz 8300 will be bottlenecked in games like BF4 at high rez.

here is my cpu usage in BF4 MP 64 . . .



thread 8 was sleeping.


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> i am running 2 reference 290s at stock watercooled. i only have 3 120 rads, so like one rad per block. my temps are far from ideal. i doubt a 4.8GHz 8300 will be bottlenecked in games like BF4 at high rez.
> 
> here is my cpu usage in BF4 MP 64 . . .
> 
> 
> 
> thread 8 was sleeping.


Hmm.. so you do.
Care to share your temps? I believe you if you say they are pretty bad.
That seems like very little rad space to me. I had a single 360 UT60 for the cpu only, then added another 140.1 when I added my single 290 Tri-X to the loop.
And I still struggle to keep it below a 10C delta with low fan speeds.

Are you running high rpm?

My 290 Tri-X runs around 40C on the core and around 35C on VRM1. The stock air cooler let the VRM reach around 80-90C.
That is at a 1200/1500 +175mV overclock.


----------



## rdr09

Quote:


> Originally Posted by *Chopper1591*
> 
> Hmm.. so you do.
> Care to share your temps? I believe you if you say they are pretty bad.
> I find that is very few rad-space. I had a single 360 UT60 for cpu only, then added another 140.1 when I added my single 290 tri-x to the loop.
> And I still struggle to keep it below 10c Delta with low fan speeds.
> 
> Are you running high rpm?
> 
> My 290 tri-x runs around 40c on the core and around 35c on vrm1. The stock air cooler let the vrm reach around 80-90c.
> That is at a 1200 1500 +175 overclock.


i don't oc my gpu in games. if i do, my temps will be worse . . .

BF4 MP 64 temps . . .



my fans are in silent mode in bios. cougar fans are dead silent. i hear the pump. not sure if those rpms are for the fans. most are running at 1200.


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> i don't oc my gpu in games. if i do my temps will be worst . . .
> 
> BF4 MP 64 temps . . .
> 
> 
> 
> my fans are in silent mode in bios. cougar fans are dead silent. i hear the pump.


I will do a test for you when I get home.
Stock settings. 15-20 minutes of Heaven. Will get me close to max temps I guess.
Can do gaming if you want.... Far Cry or Unity for example.


----------



## rdr09

Quote:


> Originally Posted by *Chopper1591*
> 
> I will do a test for you when I get home.
> Stock settings. 15-20 minutes of Heaven. Will get me close to max temps I guess.
> Can do gaming if you want.... Far Cry or Unity for example.


thanks. it's warm inside the house in winter cause of the heating. i do plan on replacing 2 of the 120 rads with 2 240 rads.


----------



## leetmode

Quote:


> Originally Posted by *rdr09*
> 
> read this thread. it should include info about disabling ULPS if you have not done so.
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread
> 
> is your cpu oc'ed?


I believe I do have ULPS disabled - I clicked the checkbox in the Afterburner settings (running 4.1). After going through that thread I think I pretty much installed and set up Afterburner the same way. My CPU is not overclocked. I am running these with a 2560x1600 monitor, but I still imagined I would be getting better results. I'm pretty much getting the same FPS and detail as I was with one 290x. I haven't overclocked the cards, but I still imagined things would be better.

Quote:


> Originally Posted by *rdr09*
> 
> see post # 252 . . .
> 
> http://www.overclock.net/t/1442038/build-log-the-hawaiian-heat-wave/250
> 
> i have pretty much same usage with 2 290s in 4K with an i7 SB @ 4.5GHz.


Sorry for the stupid question but how do you increase the resolution scale?


----------



## rdr09

Quote:


> Originally Posted by *leetmode*
> 
> I believe I do have ULPS disabled, I clicked the check box in the Afterburner settings (running 4.1). After going through that thread I think I pretty much installed and set up Afterburner the same way. My CPU is not over clocked. I am running these with a 2560x1600 monitor but I still imagined that I would be getting better results. I'm pretty much getting the same FPS and detail as I was with 1 290x. I haven't overclocked the cards but I still imagined things would better.
> Sorry for the stupid question but how do you increase the resolution scale?


in BF4 Graphics settings. But, you don't really have to anymore 'cause you are at 1600 rez. you still have to oc the cpu, though, or just turn off power saving features in bios.

check out post #81 . . .

http://www.overclock.net/t/1540718/which-is-a-better-card/80#post_23545980

the 3930K stock is getting matched by my Phenom in physics score.


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> thanks. its warm inside the house in winter cause of the heating. i do plan on replacing 2 of of the 120 rads to 2 240 rads.


I don't know how you hooked up the fans...
Are they on the cpu headers, fan1 and fan2? If they are, then they are indeed running around 1200rpm, which is pretty silent.
But 1200 ain't dead silent IMO.

I want 800 or lower. Sadly I need more rad space to do that.

Ohh.. you actually have 3 120mm rads. Hmm.
Can I ask why? Single rads are said to be better. Why haven't you bought a 240 and a 120 from the start?

Here is my current setup.
Will change back to a tube res this week.


Spoiler: Warning: Spoiler!








Just did around 20 minutes of Heaven on stock gpu clocks.
Here are the results.


Water temp was 25.6c

Pretty happy with the performance of my gpu cooling.


----------



## rdr09

Quote:


> Originally Posted by *Chopper1591*
> 
> I don't know how you hooked up the fans...
> If they are in the cpu header, fan1 and fan2? If they are, then they are indeed running around 1200rpm. Which is pretty silent.
> But 1200 ain't dead silent IMO.
> 
> 
> 
> 
> 
> 
> 
> 
> I want 800 or lower. Sadly I need more rad space to do that.
> 
> Ohh.. you actualy have 3 120mm rads. Hmm.
> Can I ask why? Single rads are said to be better. Why haven't you bought a 240 and a 120 from the start?
> 
> Here is my current setup.
> Will change back to a tube res this week.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just did around 20 minutes of Heaven on stock gpu clocks.
> Here are the results.
> 
> 
> Water temp was 25.6c
> 
> Pretty happy with the performance of my gpu cooling.


very nice. i need a new case. really nice temps you have. i never planned on getting another 290 - i was worried my i7 would not handle them.


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> very nice. i need a new case. really nice temps you have. never planned on getting another 290. was worried my i7 will not handle them.


Yeah I want another case too, but haven't got the money.
My 360 radiator is outside of the case because it won't fit.









650D isn't really a small case, I've had much smaller ones.
But it is pretty crowded already with the stuff in it now. Although a second card would fit in nicely.
A longer psu will be tough though, I guess. And it will most likely be longer when I go for a 1000W unit. Right?

I was actually amazed by the temps I get. I had expected it to be a good deal lower than air, but not this much.








Can't complain.

Edit:
For comparison, here is a run I just did.
1275 core, 1625 mem, +175mV.
Water temp rose to just 26.5°C.


I dare you to try that clock and see what temps you get.


----------



## Kaltenbrunner

How well do these run in crossfire? I plan to, but I fear driver problems like last time with 7950 CF in some games.


----------



## Chopper1591

Quote:


> Originally Posted by *Kaltenbrunner*
> 
> How well do these run in crossfire? I plan to, but I fear driver problems like last time with 7950 CF in some games.


Scaling is much better with the current drivers.

But it's still a personal choice.
What's your budget, and what's your goal?

It's said that a single card is best, and I somewhat agree.
Less power consumption, newest tech.

But...
CF is less expensive. You can also grab a second-hand 290 for a decent price lately, and even cheaper if you wait for the 380X/390 series to release.


----------



## Kriant

Quote:


> Originally Posted by *Kaltenbrunner*
> 
> How well do these run in crossfire? I plan to, but I fear driver problems like last time with 7950 CF in some games.


With 2 cards scaling is good, for the most part. But we are still waiting for working profiles for FC4, Dying Light, AC:U, Evolve etc. I put two in my brother's rig, because of the price/performance ratio at the time. And they work well.


----------



## twoofswords

Finally got my 290x under water... Now to play with some overclocking!


----------



## Chopper1591

Quote:


> Originally Posted by *twoofswords*
> 
> 
> 
> Finally got my 290x under water... Now to play with some overclocking!


Nooo.
What did you do?

Why not a fullcover block?








Those VRMs will be screaming, or your clock will be limited.


----------



## bond32

Quote:


> Originally Posted by *Chopper1591*
> 
> Nooo.
> What did you do?
> 
> Why not a fullcover block?
> 
> 
> 
> 
> 
> 
> 
> 
> Those vrm's will be screaming. Or your clock will be limited.


To be honest, that's the smarter route in my opinion... Slightly cheaper than a full-cover block, plus when you upgrade you can re-use it, with no need to buy yet another block...


----------



## Chopper1591

Quote:


> Originally Posted by *bond32*
> 
> To be honest that's the smarter route in my opinion... Slightly cheaper cost for that block over full cover plus when you upgrade you can re-use it and *no need to buy yet again another block*...


There you are right, no doubt.

Although I don't really care; I'll just sell the block, and since I take real good care of my stuff I'll get a decent amount of money back.
Enthusiasts also pay a fair price for second-hand hardware, especially watercooling gear. One-year-old parts will net at least 60% of the retail price.

But again,
I switched to water to tame those VRMs.
Sure, the core needs to be cooled too, but the VRMs were way hotter (90°C) than the core (70-75°C) during overclocking.


----------



## rdr09

Quote:


> Originally Posted by *Chopper1591*
> 
> Yeah I want another case too, but haven't got the money.
> My 360 radiator is outside of the case because it won't fit.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 650D isn't really a small case, I've had much smaller ones.
> But it is pretty crowded already with the stuff in it now. Although a second card would fit in nicely.
> A longer psu will be tough though, I guess. And it will most likely be longer when I go for a 1000W unit. Right?
> 
> I was actually amazed with the temps I get. Had expected it to be a good deal lower then air but not this much.
> 
> 
> 
> 
> 
> 
> 
> 
> Can't complain.
> 
> Edit:
> For comparison, here is a run I just did.
> 1275 core 1625 mem +175mv.
> Water temp raised to just 26.5c.
> 
> 
> I dare you to try that clock and see which temps you get.


haha. anything over 1275 core . . . i have to open the window just to bench. it's 19°F or -7°C outside.


----------



## twoofswords

Quote:


> Originally Posted by *Chopper1591*
> 
> Those vrm's will be screaming. Or your clock will be limited.


The VRMs are all under heatsinks. Judging from the forum reviews of people attaching the Corsair HG10 / NZXT Kraken G10 to their 290s, it works fine.
Quote:


> Originally Posted by *bond32*
> 
> To be honest that's the smarter route in my opinion... Slightly cheaper cost for that block over full cover plus when you upgrade you can re-use it and no need to buy yet again another block...


Even when adding the cost of the Gelid kit, I saved a bit of money versus the full size water block.


----------



## BigBeard86

nice temps there man (full water blocks)

What pump/reservoir/rads do you use?

I have two 290s in crossfire with the NZXT bracket and an NZXT X41 on each GPU. On stock volts I get ~40°C on the core and 35°C on the VRMs (upgraded fans plus Gelid sinks). With +200mV I get ~50°C core and 45°C VRM.

The temps you have for an overvolted card are very nice; I haven't seen anyone else get these cards to run that cool, even with a custom loop. Are you using chilled water?


----------



## duganator

Just bought a Gigabyte Windforce 290X and I can't seem to get the stupid thing to work. I'm coming from a 7970, btw. I seated the card in the PCIe slot and plugged in both power connectors on my 850W PSU, but the fans don't spin and I get no display from the card. Is it DOA?


----------



## Ganf

Quote:


> Originally Posted by *duganator*
> 
> Just bought a gigabyte windforce 290x and I can't seem to get the stupid thing to work. I'm coming from a 7970 btw. I seated the card in the pcie slot and plugged up both power connectors on my 850 watt psu and the fans don't spin and I get no display with the card. Is it doa?


Tried different slots and different power cords yet?

You didn't change anything else by chance did you?


----------



## duganator

Quote:


> Originally Posted by *Ganf*
> 
> Tried different slots and different power cords yet?
> 
> You didn't change anything else by chance did you?


Tried all three pcie slots on my board and both sets of power connectors on my psu and the fans don't even start on the card.


----------



## pshootr

Quote:


> Originally Posted by *duganator*
> 
> Tried all three pcie slots on my board and both sets of power connectors on my psu and the fans don't even start on the card.


Does your 7970 still fire up?


----------



## duganator

Quote:


> Originally Posted by *pshootr*
> 
> Does your 7970 still fire up?


Yup, I'm using it currently.


----------



## pshootr

Quote:


> Originally Posted by *duganator*
> 
> Yup, I'm using it currently.


Damn, sounds like you got a DOA card. Can you see the PCB where the fans connect, to ensure they are plugged in?


----------



## duganator

Quote:


> Originally Posted by *pshootr*
> 
> Damn, sounds like you got a DOA card. Can you see the PCB where the fans connect, to ensure they are plugged in?


Yeah, I already checked both connectors


----------



## pshootr

Quote:


> Originally Posted by *duganator*
> 
> Yeah, I already checked both connectors


I'm at a loss here as to what you can try other than trying it in another MB. What MB do you use? Was the card purchased new?


----------



## duganator

Quote:


> Originally Posted by *pshootr*
> 
> I'm at a loss here as to what you can try other than trying it in another MB. What MB do you use? Was the card purchased new?


I have a spare motherboard at my other house that I could try, but I'm running the newest BIOS on my P67 Fatal1ty Professional. I purchased the card open-box from Micro Center, so I'm guessing that's why I'm dealing with all these issues.


----------



## pshootr

Quote:


> Originally Posted by *duganator*
> 
> I have a spare motherboard at my other house that I could try, but I'm running the newest bios on my p67 fatal1ty professional. I purchased the card open box from microcenter so i'm guessing that is why I'm dealing with all these issues.


Ya, unfortunately, it seems the GPU is in fact defective. Well, you tried. I guess you win some/lose some with open boxes.


----------



## mfknjadagr8

Should be joining the club soon two reference 290s en route


----------



## mus1mus

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Should be joining the club soon two reference 290s en route


----------



## mfknjadagr8

Quote:


> Originally Posted by *mus1mus*


yeah, finally get to step up to real video cards for once... i haven't had that for a long time... when i bought my 8800gtx it was top of the line... before that it was the voodoo 5... so i haven't had high-end video for 12 years or so...


----------



## pshootr

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Should be joining the club soon two reference 290s en route


Awesome man, you must be pumped; I am very happy with my 290. BF4 is silky smooth maxed out with AA, and I still run 1080p. I went to this from an HD 5570.


----------



## elamelo

Hate to duplicate post...but count me in:

Got everything installed. Here is a quick update:






went from a 750ti to a WC'd R9 290 by Visiontek for $348 (shipped):
https://www.visiontek.com/graphics-cards/liquid-cooled-series/visiontek-cryovenom-liquidcooled-series-r9-290-detail.html


----------



## pshootr

Quote:


> Originally Posted by *elamelo*
> 
> Hate to duplicate post...but count me in:
> 
> Got everything installed. Here is a quick update:
> 
> 
> 
> 
> 
> 
> went from a 750ti to a WC'd R9 290 by Visiontek for $348 (shipped):
> https://www.visiontek.com/graphics-cards/liquid-cooled-series/visiontek-cryovenom-liquidcooled-series-r9-290-detail.html


Nice man







I like the black hoses.


----------



## mfknjadagr8

Quote:


> Originally Posted by *elamelo*
> 
> Hate to duplicate post...but count me in:
> 
> Got everything installed. Here is a quick update:
> 
> 
> 
> 
> 
> 
> went from a 750ti to a WC'd R9 290 by Visiontek for $348 (shipped):
> https://www.visiontek.com/graphics-cards/liquid-cooled-series/visiontek-cryovenom-liquidcooled-series-r9-290-detail.html


mine will be around 320 with the swiftech block for each one... but i get to support swiftech so that's a plus for me... their customer service won me over hands down... i was originally looking at the aquacomputer blocks, but i can get within a few °C of that, pay less, and support swiftech. win-win for me.. but they might run with stock blower coolers for a few weeks... the horror!


----------



## Bertovzki

Quote:


> Originally Posted by *duganator*
> 
> I have a spare motherboard at my other house that I could try, but I'm running the newest bios on my p67 fatal1ty professional. I purchased the card open box from microcenter so i'm guessing that is why I'm dealing with all these issues.


Quote:


> Originally Posted by *pshootr*
> 
> Ya, unfortunately, it seems the GPU is in fact defective. Well, you tried. I guess you win some/lose some with open boxes.


Yeah, open box is a bad start. I have been trying to buy a mobo for a month now. The first one came in a clearly well-used, opened box with big fingernail dents in the lock tab, and the PCB was bent 10 mm from one end to the other. It was clearly a demo model that had probably been sitting on its end for months, maybe in the sun, and the warp was so bad the bottom of the mobo touched the metal mobo tray.

I asked djthrottleboy how his was shipped; he said there was no sealed wrapping. I had another company look at a board for me yesterday, and he said it was a little bent (which means the seal was broken on that one too). Long story short: I have a list of four GA Z97X G1 WIFI BK mobos that I know had no seals on the box. I only found out they should have seals when I was enquiring whether the last two mobos in NZ had straight PCBs, and the supplier told the reseller they can't open the box as it is sealed. That finally gave me the confidence to buy a board again, rather than wait another month for resupply.

You see, if your GPU has been handled goodness knows how many times, by whoever, with static in their body or sharp rings on their fingers, who knows what damage may have been done.


----------



## Gil80

Hi all,

I use two R9 290s by Gigabyte; it's the Windforce OC edition.
I switched from air cooling to water cooling for both cards, using the EK R9 290 waterblocks.

Running Furmark for about 30 minutes, the GPU cores reached 53°C and I'm happy with that.

But the VRMs are a whole other story. The VRMs on the main GPU get to 90°C, and that's after applying the new thermal pads supplied with the EK waterblocks. I did everything as instructed.
The second GPU's VRMs get to about 70-75°C.

Can anyone give me some advice, what to do?

Maybe I'm testing wrong? I use GPU-Z to monitor the temps. I'd appreciate any help and advice to remedy this temp issue.
I live in Australia and couldn't find high-conductivity thermal pads here, so I was bound to use the EK-supplied pads.

Thanks!


----------



## Arizonian

Quote:


> Originally Posted by *leetmode*
> 
> Hey guys, just finished upgrading my computer and finally have a pair of Sapphire Tri-X 290xs that I desperately needed, add me!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I have a quick question about my set up though, so far everything seems to be normal except for the fact when I play BF4 I am not getting 100% usage from both GPUs.
> 
> Is this normal? I was under the impression I should be seeing something like 99%...


Congrats - added








Quote:


> Originally Posted by *hyp36rmax*
> 
> Finally finished the window side panel for my TJ08-E and installed the Aquacomputer Farbwerk LED Controller. I get a video up of the Farbwerk in action rotating the LED colors automatically. Only thing left is the actual X99 gear haha.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Build Log: *Link


Looks awesome. Nice job.
Quote:


> Originally Posted by *elamelo*
> 
> Hate to duplicate post...but count me in:
> 
> Got everything installed. Here is a quick update:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> went from a 750ti to a WC'd R9 290 by Visiontek for $348 (shipped):
> https://www.visiontek.com/graphics-cards/liquid-cooled-series/visiontek-cryovenom-liquidcooled-series-r9-290-detail.html


Congrats - added
















Anyone other 290 / 290X owners who haven't joined, please post submission as per OP and I'll be happy to get you added.


----------



## Nwanko

@duganator

Clear your CMOS a couple of times; I had the same issue with my WF3 290X when I went from a 7970. If that doesn't work, unplug the power cord, wait a while, then try starting it.


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> haha. anything over 1275 core . . . i have to open the window just to bench. its 19 F or - 7 C outside.


Hehe nice.

I would be tempted to place my pc outside in the freezing cold and see how it clocks.








Quote:


> Originally Posted by *twoofswords*
> 
> The VRM's are all under heatsinks. Judging from the forum reviews of the people attaching Corsair HG10 / NZXT Kraken G10 to their 290's, it works fine.
> Even when adding the cost of the Gelid kit, I saved a bit of money versus the full size water block.


I believe you.

But, for me "just fine" isn't going to cut it.








I also just like the looks of the full cover, plus the idea that everything is cooled. If I went passive on the VRMs, I'd need to put extra airflow over the sinks.

Quote:


> Originally Posted by *BigBeard86*
> 
> nice temps there man (full water blocks)
> 
> What pump/reservoir/rads you use?
> 
> I have 2 290's in crossfire with the NZXT bracket, and two NZXT x41 for each gpu. On stock volts i get ~40c on core and 35c Vrms (upgraded fans plus gelid sinks). With +200mv i get ~50 c core and 45c vrm.
> 
> The temps you have for overvolted card are very nice, i haven't seen anyone having the cards run that cool, even with a custom loop. Are you using chilled water?


The loop consists of a 360 UT60, a 140 XTC, a Swiftech MCP-655 (was on setting 5 for the bench), an XSPC dual-bay D5 combo atm (an X-Res 140 top combo is on the way to replace the dual bay), an EK full-cover block, and an EK Supremacy Evo CPU block.

Your temps are decent, I guess.
But the question is: how quiet is it? Which fans are you running on the NZXT rads? I dare say they are louder than my rads.


----------



## rdr09

Quote:


> Originally Posted by *Gil80*
> 
> Hi all,
> 
> I use two R9 290 by Gigabyte. It's the Windforce OC edition.
> I switched from air cooling to water cooling for both cards. I use the EK R9 290 waterblocks.
> 
> Running Furmark for about 30 minutes, the GPU cores reached 53deg and I'm happy with that.
> 
> But the VRMs are a whole other story. The VRMs on the main GPU get to 90°C, and that is after applying the new thermal pads supplied with the EK waterblocks. I did everything as instructed.
> The second GPU's VRMs get to about 70-75°C.
> 
> Can anyone give me some advice, what to do?
> 
> Maybe I'm testing wrong? I use GPU-z to monitor the temps. I'd appreciate any help and advice to remedy this temp issue.
> I live in Australia, and I couldn't find here high temp conductivity thermal pads. So I was bound to use the EK supplied thermal pads.
> 
> Thanks!


even with the stock thermal pads your temps on the vrms should not go that high. you sure you removed the protective covers on both sides of the pads? also, there are two sizes of pads even for the windforce. you used the right one for the vrms?

the No. 2 pad is for the vrms and it's 1mm thick.

Btw, they have a Rev. 2 of the Windforce full waterblock.


----------



## Gil80

With the air cooling, these were the temps that I got... go figure.

As for the EK thermal pads, it was easy to follow since the pre-cut pads are the 0.5mm ones and the rest are 1mm, and they illustrate where each one goes.
The 0.5mm pads go on the modules surrounding the GPU core.

I did another test, this time with my D5 pump set to 4 instead of 2.
I closed the case on both sides and ran Furmark again with all fans at 1200 RPM.
Furmark settings were full screen, 1920x1200, and 8x MSAA with post-processing.

After 30 minutes, the core temp stabilized at 53°C and the VRMs didn't reach 70°C.
So now I'm happy... but I doubt it was the pump setting going from 2 to 4.

Oh, and of course I removed the film from the thermal pads.


----------



## rdr09

Quote:


> Originally Posted by *Gil80*
> 
> With the air cooling, these were the temps that I got... go figure.
> 
> As for the EK thermal pads, it was easy to follow since the pre-cut thermal are the 0.5mm and the rest is 1mm and they illustrate where each one goes.
> The 0.5mm for the modules surrounding the GPU core.
> 
> I did another test, this time I set my D5 pump to 4 instead of 2.
> I closed the case from both sides and ran furmark again with all fans at 1200RPM.
> Setting of furmark are Full screen, 1920x1200 and 8MSAA with post processing.
> 
> After 30 minutes, the core temp stabilized at 53°C and the VRMs didn't reach 70°C.
> So now I'm happy... but I doubt it was the pump setting going from 2 to 4.
> 
> Oh and of course I removed the film from the thermal pads


Cool. there might still be air in the system. don't use furmark. use the render test in GPUZ. click the "?" and start the test. open another instance or two to monitor the temps.
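side note: if you'd rather have numbers than watch graphs, GPU-Z's "Log to file" option writes a CSV you can summarize afterwards. a rough python sketch — the exact column headers vary by card and GPU-Z version, so this just matches any column whose name contains "Temperature":

```python
import csv

def peak_temps(path):
    """Return the peak value of every temperature column in a GPU-Z sensor log."""
    peaks = {}
    with open(path, newline="", encoding="latin-1") as f:
        reader = csv.reader(f)
        # First row is the header; GPU-Z pads column names with spaces.
        header = [h.strip() for h in next(reader)]
        temp_cols = [i for i, h in enumerate(header) if "Temperature" in h]
        for row in reader:
            for i in temp_cols:
                try:
                    val = float(row[i])
                except (ValueError, IndexError):
                    continue  # skip blank or truncated rows
                peaks[header[i]] = max(peaks.get(header[i], val), val)
    return peaks
```

run it over the log after a 20-30 minute heaven or bf4 session and you get your worst-case core and vrm numbers without babysitting the sensor window.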


----------



## Gil80

Why not furmark?


----------



## rdr09

Quote:


> Originally Posted by *Gil80*
> 
> Why not furmark?


i read it puts an unrealistic amount of stress on a gpu. nothing in "real-world" use would ever come close. even though your cards are watered, it's safer to stay away from it. the render test in GPUZ is just as good if not better. i use that and my games to check temps.

perfect time to check your temps in your neck of the woods.


----------



## Chopper1591

Quote:


> Originally Posted by *Gil80*
> 
> Why not furmark?


It's just generally known that Furmark is something to avoid.
It's a power hog and does nothing more than heat up your GPU to unnatural levels. It does nothing to actually test stability.

You're far better off running benchmarks for extended periods or just gaming for a prolonged time.

I still think 70°C is high for the VRMs.
Are these with overclocked settings?

Mine reach 35°C max with a decent overclock of 1275 core at +175mV. The core is around 40-41°C at that clock.


----------



## Ganf

Quote:


> Originally Posted by *rdr09*
> 
> i read it puts an unrealistic amount of stress on a gpu. nothing in "real-world" use would ever come close. even though your cards are watered, it's safer to stay away from it. the render test in GPUZ is just as good if not better. i use that and my games to check temps.
> 
> perfect time to check your temps in your neck of the woods.


Proper watercooling setups provide an unnecessary amount of heat displacement, so what else are you gonna test it with? A torch?


----------



## rdr09

Quote:


> Originally Posted by *Ganf*
> 
> Proper watercooling setups provide an unnecessary amount of heat displacement, so what else are you gonna test it with? A torch?


sometimes, in Sydney, it feels like a torch in summer.


----------



## Ganf

Quote:


> Originally Posted by *rdr09*
> 
> sometimes, in Sydney, it feels like a torch in summer.


We get the same thing here in Florida, plus 80% humidity, and I dislike using AC unless it gets over 100 Fahrenheit. In order to make sure my loop is up for the summer heat I try to build it just before spring and spend March and April testing under extreme heat loads. It seems to work so far, and I get to run my PC at 35c ambient temps all summer long without an issue. Furmark hasn't failed me yet.


----------



## rdr09

Quote:


> Originally Posted by *Ganf*
> 
> We get the same thing here in Florida, plus 80% humidity, and I dislike using AC unless it gets over 100 Fahrenheit. In order to make sure my loop is up for the summer heat I try to build it just before spring and spend March and April testing under extreme heat loads. It seems to work so far, and I get to run my PC at 35c ambient temps all summer long without an issue. Furmark hasn't failed me yet.


what is it there - 60F? i hate you.

its 16F here. i'll check you guys later - have to shovel snow.


----------



## Ganf

Quote:


> Originally Posted by *rdr09*
> 
> what is it there - 60F? i hate you.
> 
> its 16F here. i'll check you guys later - have to shovel snow.


I miss the snow... Someday I wanna be able to hook my radiator up to a garden hose and chuck it out in the backyard, then fill it up with antifreeze and enjoy sub-freezing temps all winter long by the cozy fire like you do.


----------



## rdr09

Quote:


> Originally Posted by *Ganf*
> 
> I miss the snow... Someday I wanna be able to hook my radiator up to a garden hose and chuck it out in the backyard, then fill it up with antifreeze and enjoy sub-freezing temps all winter long by the cozy fire like you do.


snow is nice the first few hours.

but, back on topic, i was just parroting what others have been saying about furmark. i never use it, so i can't really say it's bad for the system. what i did find out is that the temps i get using the GPUZ render test are similar to the temps i get in BF4. i don't oc my gpus in games, so instability is never an issue.


----------



## Ganf

Quote:


> Originally Posted by *rdr09*
> 
> snow is nice the first few hours.
> 
> but, back on topic, i was just parroting what others have been saying about furmark. i never use it, so i can't really say it's bad for the system. what i did find out is that the temps i get using the GPUZ render test are similar to the temps i get in BF4. i don't oc my gpus in games, so instability is never an issue.


Yeah, I don't use Furmark until I know my system is stable, because even the slightest amount of instability will hard-lock your system. Better to get it stable using easier benchmarks and then use Furmark to test your heat dissipation.

15 minutes in Furmark will give you the same kind of heat soak you can expect after 5 hours of gaming in hot weather, in my experience. If your temps are still good after 15 minutes, you have a worry-free system. Of course, you don't run it without a full-coverage block; otherwise you're just begging to pop a VRM or something of the like.

OCCT will still reveal errors before any other benchmark does, in my experience. Between the two of them you can set up a rock-solid overclock with minimal effort.


----------



## kizwan

Modern GPUs are TDP-limited when running Furmark, so it's not a good indicator of real-world gaming stability. However, according to AnandTech's review, unlike the 290X the 290 for some reason is not TDP-limited when running Furmark. That was with older drivers, though, and has probably changed with newer ones. Anyway, Furmark is a power virus: if the driver doesn't TDP-limit the card, it will draw more power than normal gaming, and if it does, the temps will for sure still be higher. For this reason alone I will not use Furmark or recommend it to anyone. My stability tools are gaming and benching, for real-world temps and stability. I don't need an absolute-max overclock, because I don't mind having more than one overclock profile as long as each is stable for its game.


----------



## duganator

Quote:


> Originally Posted by *Nwanko*
> 
> @duganator
> 
> Clear your CMOS a couple of times,i had the same issue with my wf3 290x when i went from 7970. If that doesn't work plug the power cord out and wait a while then try starting it.


I'll give that a go and if it doesn't work I guess I'll return the card.


----------



## Ganf

Quote:


> Originally Posted by *kizwan*
> 
> Modern GPUs are TDP-limited when running Furmark, so it's not a good indicator of real-world gaming stability. However, according to AnandTech's review, unlike the 290X the 290 for some reason is not TDP-limited when running Furmark. That was with older drivers, though, and has probably changed with newer ones. Anyway, Furmark is a power virus: if the driver doesn't TDP-limit the card, it will draw more power than normal gaming, and if it does, the temps will for sure still be higher. For this reason alone I will not use Furmark or recommend it to anyone. My stability tools are gaming and benching, for real-world temps and stability. I don't need an absolute-max overclock, because I don't mind having more than one overclock profile as long as each is stable for its game.


All of this is true, which is why I don't use it for a stability benchmark or a performance benchmark. I use it as a heat dissipation benchmark. I'm testing my custom loop, not my GPU.


----------



## Alexbo1101

Welp... Finally got another card, and now after activating crossfire my 4K monitor won't work.









First off, some proof for you @Arizonian:


And now the 4k problem:


I can't choose any other resolutions than this, as you can see. Anyone got some ideas?


----------



## mAs81

I'm not a crossfire user, so take this with a grain of salt, but since you're using the Omega drivers, go to CCC and, under Flat Panel Properties, enable the Virtual Super Resolution box..


----------



## Alexbo1101

It just lists the display as not connected, and if I disable VSR I can only choose 640x480 as the resolution.


----------



## boot318

Quote:


> Originally Posted by *Alexbo1101*
> 
> It just states the display as not connected and if I disable VSR I can only choose 640x480 as resolution.


Sure your cable isn't loose to the monitor? Long shot.... but worth a shot.


----------



## Alexbo1101

Quote:


> Originally Posted by *boot318*
> 
> Sure your cable isn't loose to the monitor? Long shot.... but worth a shot.


Worth a shot, but already tried that


----------



## Alexbo1101

Lol, problem fixed... Pulled the power cable for a minute and it went straight back on again.









Dunno what caused it to go bananas...


----------



## mAs81

Quote:


> Originally Posted by *Alexbo1101*
> 
> Lol, problem fixed... Pulled the power cable for a minute and it went straight back on again.
> 
> 
> 
> 
> 
> 
> 
> 
> Dunno what caused it to go bananas...


It's the little things like this that make you wonder if the most obvious solution was the correct one all along, lmao..








Good thing that you were able to fix it tho..


----------



## taem

Is it a bad idea to test overclock limits while in crossfire? These cards will be watercooled and I don't want to swap them in and out. I could test on air but then I wonder about the temp issue.

Could I set one card after another as primary in CCC, turn crossfire off, and test that way? Is that possible?


----------



## tsm106

Quote:


> Originally Posted by *taem*
> 
> Is it a bad idea to test overclock limits while in crossfire? These cards will be watercooled and I don't want to swap them in and out. I could test on air but then I wonder about the temp issue.
> 
> Could I set one card after another as primary in CCC and turn crossfire off and test that way? Is that possible?


You can test however you want tbh though at some point you're going to have to test them in cfx. And to test individually, simply disable cfx and move the monitor plug from one card to the other. Gl.


----------



## LandonAaron

Quote:


> Originally Posted by *Ganf*
> 
> We get the same thing here in Florida, plus 80% humidity, and I dislike using AC unless it gets over 100 Fahrenheit. In order to make sure my loop is up for the summer heat I try to build it just before spring and spend March and April testing under extreme heat loads. It seems to work so far, and I get to run my PC at 35c ambient temps all summer long without an issue. Furmark hasn't failed me yet.


Wow. I live in Texas, and I can't imagine not running AC all year. Probably run it like 300 days of the year.


----------



## inov1

Hi All

I just upgraded from a 7950 Boost MSI Twin Frozr III, and I got a second-hand reference R9 290 with the Accelero Xtreme IV already fitted. Having spent a couple of weeks reading this and other forums, I am happily running it at 1100 MHz with Power +0 and stock volts. The temps I get in Metro: Last Light are about 80°C, the VRMs are similar, and the core sticks to 1100 MHz (unofficial overclocking in AB without PowerPlay support). When using OCCT I find that it downclocks to 880 MHz or thereabouts, and if I put it at stock speed and increase the power to +50 (with unofficial overclocking disabled) it hits 94°C pretty quickly (1-2 min) and starts throttling.

Having heard so many great things about the Xtreme IV, I am a little disappointed. Am I being greedy? I have read about some people running their cards at 1250 MHz with plenty of extra volts at roughly 70°C, so I wonder if there is a problem with the cooler or my case? It's an old Antec server case with several intake fans over the drives at the front and a 120mm exhaust at the back.

I have an overclocked 3570K at 4.7 GHz (Coollaboratory Liquid Ultra TIM, delidded, Arctic CPU cooler) which doesn't get much over 60°C, so I've figured the case temps can't be to blame. Mobo is a Gigabyte Z68XP-UD3P with 8GB RAM.

I think my options are to cut a couple of holes in the case (bottom and side panel near the 290) and stick a couple of 120mm fans there, add a PCI slot cooler (Gelid), or take apart the Xtreme IV cooler and see if the TIM needs changing. Looking from the top, the cooler looks properly seated. Running things with the side panel off makes no difference.

The seller claimed 70°C during gaming, but I didn't get much more elaboration out of him.

Would value any comments.

inov1


----------



## boot318

I can finally join the club!











(I don't know why it isn't required to put your user name in pic to join)

I returned my 970 after RAM-gate. It feels nice to see and feel a premium card (strictly build quality). Of course I wish I'd gotten the MSI Lightning, but the price went up $50 and it wouldn't have fit in my case.


----------



## GOLDDUBBY

Quote:


> Originally Posted by *boot318*
> 
> I can finally join the club!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (I don't know why it isn't required to put your user name in pic to join)
> 
> I returned my 970 after RAM-GATE. It feels nice to see and feel a premium card (strictly build quality). Of course I wished I would've gotten the MSI LIGHTNING, but the price went up $50 and it wouldn't have fit in my case.


Aww, you missed the perfect reason to get a new case


----------



## Ganf

Quote:


> Originally Posted by *LandonAaron*
> 
> Wow. I live in Texas, and I can't imagine not running AC all year. Probably run it like 300 days of the year.


My immune system doesn't agree with erratic 30 degree swings in temperature as I move in and out all day long, so I eliminated as much of the problem as I could and I haven't been sick since. I'll take being mildly uncomfortable over sinus infections, migraines and joint pain any day. I'm too young for that crap.


----------



## starjammer

Guys, I have a question. I realize that my 290x is not a retail version, probably a sample or early engineering version. The PCB serial is 102-C67101-00. Would you know if this card is a reference version, such that I can use full-cover water blocks for it?


----------



## Yorkston

Quote:


> Originally Posted by *starjammer*
> 
> Guys, I have a question. I realize that my 290x is not a retail version, probably a sample or early engineering version. The PCB serial is 102-C67101-00. Would you know if this card is a reference version, such that I can use full-cover water blocks for it?


Which specific card do you have? Many manufacturers use the reference PCB.


----------



## starjammer

I am not really sure, as it did not come with a box when I bought it. The only logo is AMD on the fan:



The GPU-Z report on the original bios is this:



Though I have since upgraded to the 15.039.000.007.003525 ATI 290x BIOS.


----------



## GOLDDUBBY

Subvendor ATI .. looks very reference indeed


----------



## starjammer

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> Subvendor ATI .. looks very reference indeed


Really? I am just worried because I am looking to get a Swiftech Komodo R9 LE for this, but the site compatibility states that:
Quote:


> The Komodo-R9 LE is compatible with AMD® Radeon™ reference PCB designs (109-C67157-00-02). _The PCB revision number represents the deciding factor to go by when selecting your Swiftech product,rather than the brand and/or model number of the card. AMD® reference design boards usually bear the AMD® logo, with the PCB number positioned directly below it. Please carefully inspect your card to locate the PCB revision number_.


So do you think I'd be safe using this GPU block?


----------



## Yorkston

Quote:


> Originally Posted by *starjammer*
> 
> Really? I am just worried because I am looking to get a Swiftech Komodo R9 LE for this, but the site compatibility states that:
> So do you think I'd be safe using this GPU block?


If nothing else you can pop the cooler off, find a pic of the reference PCB online, and just eyeball it.


----------



## Serandur

Uh-oh. I got my Lightning 290X yesterday and now, a couple of times, I've had that black screen issue I've heard about. The first time it happened when I tried loading up KotOR with the widescreen patch mod, the second time I just left my 290X running Unigine Heaven for a bit and came back to an unresponsive black screen that required a hard shutdown. I haven't overclocked or anything yet, no temperature issues (never crossed 76C with a more lax custom fan profile that ramps up heavily at 80C), my CPU isn't even overclocked right now. I've got:

MSI Lightning 290X (Samsung memory)
i7-3770K
EVGA Supernova G2 750W

Is there a fix to this?


----------



## starjammer

Quote:


> Originally Posted by *Yorkston*
> 
> If nothing else you can pop the cooler off, find a pic of the reference PCB online, and just eyeball it.


I am trying to avoid doing that, but if I can't find any other solid supporting fact on whether it's reference or not, I may just go do that. What are the important things to check when comparing it against a reference pcb?


----------



## Yorkston

Quote:


> Originally Posted by *starjammer*
> 
> I am trying to avoid doing that, but if I can't find any other solid supporting fact on whether it's reference or not, I may just go do that. What are the important things to check when comparing it against a reference pcb?


VRM and cap locations. Basically it should look identical to this. Usually it is only the "special" cards that change the layout, such as the Vapor-X.


----------



## starjammer

Quote:


> Originally Posted by *Yorkston*
> 
> VRM and cap locations. Basically it should look identical to this. Usually it is only the "special" cards that change the layout, such as the Vapor-X.


Gotcha. I'll probably try that when I don't have to work on my desktop.







Thanks!

*EDIT* How about the screw holes? Should I be concerned if there are discrepancies?


----------



## mfknjadagr8

Got a question for you guys. I know when you crossfire you have to match clocks, etc., so the best bet would be to test each card separately, find the max clocks for both, then make the lowest-temp card the bottom card and match the lowest clocks?


----------



## Yorkston

@star- Yes, screwholes will need to line up as well.

@mfknfkalhfhef- You don't "need" to match clocks for crossfire, it will auto-throttle the faster card down to match the slower one. You can keep the primary card clocked higher for games that don't support crossfire.


----------



## starjammer

Quote:


> Originally Posted by *Yorkston*
> 
> @star- Yes, screwholes will need to line up as well.
> ...


Thanks!


----------



## mfknjadagr8

Quote:


> Originally Posted by *Yorkston*
> 
> @star- Yes, screwholes will need to line up as well.
> 
> @mfknfkalhfhef- You don't "need" to match clocks for crossfire, it will auto-throttle the faster card down to match the slower one. You can keep the primary card clocked higher for games that don't support crossfire.


awesome even better...less work for me


----------



## kizwan

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Got a question for you guys I know when you crossfire you have to match clocks etc so best bet would be test each card separately find max clocks for both then make lowest temp card the bottom card and match lowest clocks?


No, you don't need to match clocks. Both cards will be running at their pre-set clocks even when in Crossfire. One good example is when running 290-290X in Crossfire.


----------



## Agonist

Finally was able to get me a good R9 290.
R9 290 with HD 2900 Pro

Also running a 9800 GTX+ with hybrid PhysX


----------



## taem

Quote:


> Originally Posted by *tsm106*
> 
> You can test however you want tbh though at some point you're going to have to test them in cfx. And to test individually, simply disable cfx and move the monitor plug from one card to the other. Gl.


Is there any benefit to testing individually? I did test individually when running crossfire on air earlier, because folks told me that's how you do, but I wondered at the time what the point was.


----------



## Regnitto

only reason I can think of off the top of my head would be to see which is the better overclocker to put as your primary card.....but then again, I've never crossfired.


----------



## rdr09

Quote:


> Originally Posted by *taem*
> 
> Is there any benefit to testing individually? I did test individually when running crossfire on air earlier, because folks told me that's how you do, but I wondered at the time what the point was.


Quote:


> Originally Posted by *Regnitto*
> 
> only reason I can think of off the top of my head would be to see which is the better overclocker to put as your primary card.....but then again, I've never crossfired.


You test them individually to find out if both are in working order. If you go straight to a Crossfire config and issues occur, you won't know if both cards or just one are bad.


----------



## kizwan

Like tsm106 mentioned earlier, you can test the cards one by one by disabling Crossfire & connecting the monitor to each card in turn. This is how I tested mine.


----------



## inov1

Great, thanks for confirming my suspicions!

When I don't need to work on the PC I will take apart the card and reseat the cooler + TIM


----------



## trelokomio58

I would like to ask a question. I just bought a 290X DCII OC and I was wondering if there is a way to raise the voltage beyond 100mV with MSI Afterburner. Right now Afterburner only allows me to raise it by +100mV.
Thanks in advance.


----------



## trelokomio58

I open a new text file on the desktop.
I type into the text file "C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /wi4,30,8d,20" and then I save this file as .bat on the desktop.

I run this .bat file with a double click and then I open Afterburner, but the max limit is again +100mV :-(

What am I doing wrong? :-(


----------



## mirzet1976

Try "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" /wi6,30,8d,20 or with Trixx you can go +200mV
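For anyone following along, a sketch of that fix as a complete .bat file; the install path and the wi6 bus index are assumptions, so run the /I2CD command first and substitute what it reports for your card:

```bat
@echo off
rem Unlock the extended voltage range in MSI Afterburner.
rem The "wi6" bus index and the I2C device/register values (30,8d) vary
rem by card; run MSIAfterburner.exe /I2CD first and use what it reports.
"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" /wi6,30,8d,20
```

Note the quotes around the full path: splitting the path across two lines, as in the earlier post, is what breaks the original attempt.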


----------



## mirzet1976

Run this command "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" /I2CD to see which is your I2C bus and ID device


----------



## Performer81

Quote:


> Originally Posted by *trelokomio58*
> 
> I open a new text file on desktop.
> I type in to the text file "C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi4,30,8d,20" and then i save this file as .bat on desktop..
> 
> I run this .bat file with double click and then i open the afterburner, but the max limit is again +100mv:-(
> 
> What i am doing wrong:-(


It's wi6, I think. Then you have to click on it after opening Afterburner. Don't change anything in Afterburner then.


----------



## Arizonian

Quote:


> Originally Posted by *boot318*
> 
> I can finally join the club!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> (I don't know why it isn't required to put your user name in pic to join)
> 
> I returned my 970 after RAM-GATE. It feels nice to see and feel a premium card (strictly build quality). Of course I wished I would've gotten the MSI LIGHTNING, but the price went up $50 and it wouldn't have fit in my case.


Congrats - added








Quote:


> Originally Posted by *starjammer*
> 
> I am not really sure, as it did not come with a box when I bought it. The only logo is AMD on the fan:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> The GPU-Z report on the original bios is this:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Though I have since upgraded to the 15.039.000.007.003525 ATI 290x BIOS.


Congrats - added








Quote:


> Originally Posted by *Agonist*
> 
> Finally was able to get me a good R9 290.
> R9 290 with HD 2900 Pro
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> Also running a 9800gtx+ with hybrid physx


Congrats - added


----------



## Unknownm

Can anyone help? Just out of nowhere my 4K screen flickers and blanks, but only on my AMD cards. I know this because when I plugged into the DP port on my motherboard and used the iGPU to push 4K, no flicker happened. The strange thing is that both R9 cards flicker, and since it's Crossfire I had never used the bottom card's DP port before, so there is no way that port is broken.

Drivers tested:
14.12
14.11 beta

I also completely reinstalled Windows 8, and also Windows 10. Both of them still showed the flickering on the AMD cards.





This is all I know so far: selecting DP 1.1 on my monitor completely stops the flicker, but only allows me to run 3200x1800 or 2560x1440 as the max resolution.


----------



## BradleyW

Did you disable crossfire when you plugged the monitor into the bottom GPU?


----------



## fyzzz

http://i.imgur.com/6v5ZQxC.jpg http://www.3dmark.com/3dm/5964403?

1250 mhz on my r9 290 dc2 on air (raijintek morpheus)


----------



## Unknownm

Quote:


> Originally Posted by *BradleyW*
> 
> Did you disable crossfire when you plugged the monitor into the bottom GPU?


I can boot into Windows with the monitor plugged into the second card, however the BIOS screens etc. won't show because the first card is the default output.

I've already pulled out each card and run them one at a time. Both cards gave the same results... it only happens on the AMD cards.


----------



## KAS Phase4

Quote:


> Originally Posted by *mirzet1976*
> 
> Run this command "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" /I2CD to see which is your I2C bus and ID device


Thanks


----------



## DeviousAddict

I've got a random problem. I have the curved LG UWHD monitor; it has 10-bit colour, which I've been using since I got it (set in the CCC monitor settings).
However, when I turned it on the other day the colours looked off, so I checked the settings and 10-bit is no longer an option, only 8 & 6 now.
I've got DP 1.2 turned on within my monitor's OSD settings and I don't think I've changed anything that would affect it.
Cheers guys


----------



## Digital X

Ordered a PowerColor PCS+ 290 yesterday, going from a GTX 760. Using a Corsair Air 240 case and an ASRock B75 Pro3-M. Would the case be too small if I ever decided to crossfire? I believe it's a triple-slot card, and it's my first ever AMD card.


----------



## Gabkicks

Anyone running 2 R9 290s (or 290Xs) with an Oculus Rift DK2/DK1? I get terrible stutter/judder with Crossfire enabled.
So my 2nd 290 is pretty much useless to me since I game a lot with the Rift, and for a single 1080p screen one 290 is enough for most games (except pCARS) D:


----------



## rdr09

Quote:


> Originally Posted by *Digital X*
> 
> Ordered a PowerColer PCS+ 290 yesterday, going from a GTX 760. Using a Corsair Air 240 case, Asrock B75 Pro3 M. Would it be too small if I ever decided to crossfire in this case? I believe it's a triple slot card, and first ever AMD card.


If another PCS+, the second one might hit anything below it, like the PSU. I believe that 290 is a 3-slot GPU. Also, looking at the spec of that mobo . . . the second GPU will run at x4. Not good.


----------



## LandonAaron

Have y'all seen this yet? It is like Corsair's version of the NZXT G10?


----------



## Serandur

I originally ordered an MSI Lightning 290X, now I'm returning it for a Vapor-X 290X (the 8GB model). Not that there's much of anything wrong with the Lightning, but my specific model had an issue with one of its fans and I was just too enticed by the beauty of the Vapor-X. Don't have much use for all its extra VRAM, but its metallic blue and black color scheme match my Z77-UD3H and Gskill Ares RAM perfectly and I hear it's the quietest/coolest air-cooled 290X out there.

It cost me quite a few extra pennies though, I hope it retains decent resale value when Fiji launches or is about to be launched.


----------



## tsm106

Quote:


> Originally Posted by *LandonAaron*
> 
> Have y'all seen this yet? It is like Corsair's version of the NZXT G10?


It's not very similar really. The Corsair is specific to the 290x reference pcb where the G10 is a universal. Obviously being specific the Corsair unit is useless on any other card.


----------



## pshootr

All of a sudden I am freezing again from Flash/YouTube. So I tried running AB again and plugged in these settings and made profiles, but it is not helping. Last time I went through this, I had uninstalled HWINFO64 around the same time I plugged in the mentioned AB settings, so I was never really sure which stopped the issue. Ironically enough, I recently installed HWINFO64 again :/

So I guess I am going to uninstall HWINFO64 again and keep my fingers crossed. Disabling HA did not help either.


----------



## Yorkston

Quote:


> Originally Posted by *pshootr*
> 
> All the sudden I am freezing again from flash/youtube. So I tried running AB again and plugged in these settings and made profiles, but it is not helping. Last time I went through this, I had uninstalled HWINFO64 around the same time I had plugged in the mentioned AB settings. So I was never really sure what stopped the issue. Ironically enough though, I recently installed HWINFO64 again :/
> 
> So I guess I am going to uninstall HWINFO64 it again and keep my fingers crossed. Disabling HA did not help either.


How did you turn off HA? I ended up having to right click on a youtube video > settings > display > uncheck HA to fully disable it.


----------



## pshootr

Quote:


> Originally Posted by *Yorkston*
> 
> How did you turn off HA? I ended up having to right click on a youtube video > settings > display > uncheck HA to fully disable it.


I disabled it through the setting in Firefox, then I disabled it by right-clicking on the video like you said. I also uninstalled HWINFO64 and I am still having the issue.
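(For reference, the Firefox setting can also be forced off via prefs in about:config or a user.js file in the profile folder; this is a sketch assuming the pref names of this Firefox era, so verify they exist in about:config on your build before relying on them:)

```
// user.js fragment (place in your Firefox profile folder)
user_pref("layers.acceleration.disabled", true); // turn off hardware compositing
user_pref("gfx.direct2d.disabled", true);        // turn off Direct2D rendering (Windows)
```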


----------



## tsm106

Quote:


> Originally Posted by *pshootr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Yorkston*
> 
> How did you turn off HA? I ended up having to right click on a youtube video > settings > display > uncheck HA to fully disable it.
> 
> 
> 
> I disabled it through the setting in Firefox, then I disabled it by rite clicking on the video like you said. I also uninstalled HWINFO64 and I am still having the issue.
Click to expand...

FF is notorious for not disabling hw accel. Link below; under RTSS you can make a profile to prevent the GPU from reacting to FF's GPU calls for good, or at least until the next FF update.

http://www.overclock.net/t/1265543/the-amd-how-to-thread


----------



## pshootr

Quote:


> Originally Posted by *tsm106*
> 
> FF is notorious for not disabling hw accel. Link below under rtss, you can make a profile to prevent the gpu from reacting to FF gpu calls for good, or that is until the next FF update.
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread


Thank you for the reply and the link/instructions. RTSS already has a profile for firefox with ADL set to none.

I don't get it, for the longest time I did not have this issue anymore even without starting AB, and now its back.

Edit:

I recently installed HWINFO64 again (have since uninstalled it again)

I also changed my MB to the Asus CHVFZ, but I don't see why this would make a difference.


----------



## tsm106

Quote:


> Originally Posted by *pshootr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> FF is notorious for not disabling hw accel. Link below under rtss, you can make a profile to prevent the gpu from reacting to FF gpu calls for good, or that is until the next FF update.
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread
> 
> 
> 
> Thank you for the reply and the link/instructions. RTSS already has a profile for firefox with ADL set to none.
> 
> I don't get it, for the longest time I did not have this issue anymore *even without starting AB*, and now its back.
> 
> Edit:
> 
> I recently installed HWINFO64 again (have since uninstalled it again)
> 
> I also changed my MB to the Asus CHVFZ, but I don't see why this would make a difference.
Click to expand...

Some apps do not do what the check boxes say they will do. They can make direct calls to the gpu which causes the abrupt powerstate changes.

My instructions are to use AB to control these powerstate changes. It is not because of AB that the powerstate changes happen. In other words it's not AB's fault but the apps themselves. Though that's not to say that AB cannot cause issues due to misconfigured setups because that can happen too.

I don't want to get into hypotheticals over what caused what, but at the end of the day the flicker returned. It's either adobe or mozilla, pick your poison.

http://www.overclock.net/t/1265543/the-amd-how-to-thread/280_40#post_20021232


----------



## pshootr

Funny thing is, what worked before is not working now. Go figure, lol.


----------



## tsm106

Quote:


> Originally Posted by *pshootr*
> 
> funny thimg is, what worked before, is not working now. go figure, lol


Are you running the same build of FF and Flash? And didn't you say you changed MBs?


----------



## pshootr

Quote:


> Originally Posted by *tsm106*
> 
> Are you running the same build of FF and Flash? And didn't you say you changed MBs?


I am running the same FF; Flash had an update a couple weeks ago (things were still OK after that), and yes, I changed my MB about a week ago and installed HWINFO64 again. A while back when I fixed this issue, I figured it was either getting rid of HWINFO64 or using the AB setting you mentioned. I was never certain which one fixed it.

Edited above.


----------



## tsm106

Quote:


> Originally Posted by *pshootr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Are you running the same build of FF and Flash? And didn't you say you changed MBs?
> 
> 
> 
> I am running the same FF, flash had an update a couple weeks ago, (things were still ok after that) and yes I changed my MB about a week ago, and installed HWINFO64 again. A while back when I fixed this issue, I figured it was either getting rid of HWINFO64 or using the AB setting you mentioned. I was never certain which one fixed it.
> 
> Edited above.
Click to expand...

Maybe it's just me, but this doesn't read like it just started randomly. What also strikes me as odd is that you have the tool at your discretion to check/confirm what app is using the GPU, and yet there is much confusion as to what is using what or causing what. Shrugs...

Quote:


> Originally Posted by *pshootr*
> 
> funny thimg is, what worked before, is not working now. go figure, lol


----------



## pshootr

Quote:


> Originally Posted by *tsm106*
> 
> Maybe it's just me, but this doesn't read like it just started randomly. What also strikes me as odd is that you have the tool at your discretion to check/confirm what app is using the the gpu and yet there is much confusion as to what is using what or causing what. Shrugs...


Well, a few things have changed, so you're right, it is not totally random. Though the MB and HWINFO64 have been in place for about a week, and I just started having this issue again, so in that respect it is kind of out of the blue. I understand that with so many things being mentioned this is not easy to pinpoint.

I know that flash/power-states are causing it, I am just not certain why yet. I have tried the AB fix as I did last time, but this time I am not getting past the freezing by doing this. I am probably not being as concise with my posts as I should be, and for that I am sorry.

Thanks for your help.


----------



## shoti02

Also posted in the R9 290 topic...

Which two should I mount in my rig??









----------



## Yorkston

Vapor-X cards are 3-slot coolers, aren't they? Might not fit in a crossfire setup if you are staying on air.


----------



## shoti02

Nope... 2.5 slot... it will fit with my Asus Maximus Hero VII


----------



## kizwan

Quote:


> Originally Posted by *pshootr*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Maybe it's just me, but this doesn't read like it just started randomly. What also strikes me as odd is that you have the tool at your discretion to check/confirm what app is using the the gpu and yet there is much confusion as to what is using what or causing what. Shrugs...
> 
> 
> 
> Well a few things have changed, so your rite it is not totally random. Though the MB and HWINFO64 has been installed for about a week, and I just started having this issue again. So in that respect is is kind of out of the blue. I understand with so many things being mentioned that this is not easy to pinpoint.
> 
> I know that flash/power-states is causing it, I am just not certain why yet. I have tried the AB fix as I did last time, but this time I am not getting past the freezing by doing this. I am probably not being as concise with my posts as I should be, and for that I am sorry.
> 
> Thanks for your help.
Click to expand...

I don't have this problem & I use both Chrome & Firefox extensively. Chrome uses HTML5 by default whereas FF uses Flash Player by default. I can force the computer to freeze (while playing YouTube in FF) by opening AB & pressing the info button. You can always use HTML5 instead of Flash Player.

https://www.youtube.com/html5


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> I don't have this problem & I use both Chrome & Firefox extensively. The Chrome use HTML5 by default whereas FF use Flash Player by default. I can force computer to freeze (while playing youtube in FF) by opening AB & press the info button. You can always use HTML5 instead of flash player.
> 
> https://www.youtube.com/html5


I wonder why some systems are affected by FF, AB, hardware acceleration, etc. I have those on both my sigs and I can run them simultaneously without issues.



I hit the info button.

Edit: OK, I suspect Windows updates.


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I don't have this problem & I use both Chrome & Firefox extensively. The Chrome use HTML5 by default whereas FF use Flash Player by default. I can force computer to freeze (while playing youtube in FF) by opening AB & press the info button. You can always use HTML5 instead of flash player.
> 
> https://www.youtube.com/html5
> 
> 
> 
> i wonder why some systems are affected by FF, AB, Hardware Acceleration, etc. i have those on both my sigs and i can run them simultaneously without issues.
> 
> 
> 
> i hit the info button.
> 
> edit: k, i suspect Windows updates.
Click to expand...

Why does your info window only show a little info?

I can have AB running in the background & not have any problem.



Windows will freeze up if I try to run AB while a YouTube/Flash video is playing or already open. Or if AB is already running, Windows will freeze up when I try to open the AB info window. Anyway, this is not really an issue, because why would we want to launch AB while watching a YouTube/Flash video?


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> Why is your info window only show little info.
> 
> I can have AB running in the background & not have any problem.
> 
> 
> 
> Windows will freeze up if I try to run AB while youtube/flash video is playing or already open. Or if AB already running, windows will freeze up when I try to open the AB info window. Anyway this is not really an issue because why we want to launch AB while watching youtube/flash video.


I think the issue is for some who use AB to OC and have it running in the background. But, like I said, I can run those without issues. I don't OC my GPUs, though.


----------



## RobzDragon

Just curious, would this be the place for me? Just got a r9 295x2..soon a 2nd.


----------



## kizwan

Quote:


> Originally Posted by *RobzDragon*
> 
> Just curious, would this be the place for me? Just got a r9 295x2..soon a 2nd.


Owner club here: http://www.overclock.net/t/1483021/official-amd-r9-295x2-owners-club


----------



## RobzDragon

Thanks!


----------



## GOLDDUBBY

Quote:


> Originally Posted by *kizwan*
> 
> Why is your info window only show little info.
> 
> I can have AB running in the background & not have any problem.
> 
> 
> 
> Windows will freeze up if I try to run AB while youtube/flash video is playing or already open. Or if AB already running, windows will freeze up when I try to open the AB info window. Anyway this is not really an issue because why we want to launch AB while watching youtube/flash video.


Quote:


> Originally Posted by *rdr09*
> 
> i wonder why some systems are affected by FF, AB, Hardware Acceleration, etc. i have those on both my sigs and i can run them simultaneously without issues.
> 
> 
> 
> i hit the info button.
> 
> edit: k, i suspect Windows updates.


Celebrity Jeopardy .. is that you grandma?









In your browser settings under Advanced, you can untick the box that says 'Enable hardware accelerated graphics'. Phrasing depends on the browser, ofc.


----------



## rdr09

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> Celebrity Jepardy .. is that you grandma?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In your browser settings under Advance, you can untick the box that says 'Enable hardware accelerated graphics' frasing depends on browser ofc.


I know. I tell others to disable those if they are encountering issues. I can enable mine and it would not give my sigs issues. Not sure why. Prolly because I don't use DDU.









Edit: that was Jeopardy in the SNL 40th. Quite funny. Watch it.


----------



## pshootr

When I first got my 290 I had this Flash issue where it would freeze my PC. It was using AB and creating 2D/3D profiles that actually helped get rid of the issue. Then I stopped using AB and started using TriXX instead, and oddly enough the issue was still gone. But now, after changing my MB, changing the slot my GPU is in, and installing HWINFO64, the issue is back. This time using AB with profiles has not been the miracle cure, sadly.

And disabling HA has not helped either. It's true I could use Chrome for YouTube, but I like my Firefox, and in my eyes Flash is a must-have for the web. Many sites use it; heck, even after a Firestrike run your results are shown with Flash. So not being able to use Flash is like having a broken card.


----------



## azcrazy

Glad I got my 290 today; sadly, still waiting on tubing for the water cooling.


----------



## Vici0us

Quote:


> Originally Posted by *pshootr*
> 
> When I first got my 290 I had this flash issue where it would freeze my pc. It was using AB and creating 2D/3D profiles that helped get rid of the issue actually. Then I stopped using AB and started using TRIXX instead, and oddly enough the issue was still gone. But now, after changing my MB and then changing the slot my GPU is in, and installing HWINFO64 the issue is back. This time using AB with profiles has not been the miracle cure sadly.
> 
> And disabling HA has not helped either. Its true I could use chrome for youtube. But I like my Firefox, and in my eyes Flash is like a must have for the web. Many sites use it, heck even after a Firestrike run your results are shown with Flash. So not being able to use flash is like having a broken card.


Why wouldn't you use your GPU in a primary PCI-E slot (top one)?


----------



## Coree

Hey, is it possible/recommended to use double-sided thermal tape for the VRMs? I'm thinking of buying this model in particular: http://www.watercoolinguk.co.uk/p/Alphacool-Double-Sided-Sticky-Thermal-Pad-100x100x05mm_20848.html How strong is the bond? Is there a possibility that the heatsink falls off due to high heat, etc.?
The availability of these thermal pads is limited in my country though; this is my only option..


----------



## givmedew

Quote:


> Originally Posted by *Coree*
> 
> Hey, is it possible/recommended to use double sided thermal tapes for the VRM's? I'm thinking of buying this model particularily http://www.watercoolinguk.co.uk/p/Alphacool-Double-Sided-Sticky-Thermal-Pad-100x100x05mm_20848.html How strong is the bond? Is there a possibility that the heatsink falls off due to high heat etc.
> The availability of these thermal pads are limited in my country though, this is my only option..


I DO NOT AT ALL RECOMMEND IT IN ANY WAY SHAPE OR FORM!!!

This is why:

*For full coverage blocks:*

There are going to be people who will say it is just fine... however, I know from experience that MEMORY chips come off! I have seen memory chips pulled off by people removing RAM heat spreaders.

I bought a used R9 290X last year in March for $200. When I pulled the cooler off, a ton of components fell off the board. Obviously this was a defective card with bad solder, and most likely while it was sitting on a runway in the belly of a plane it hit some extreme temp that loosened up the parts... You could tell from how they fell off the board when I removed the heat sink that they had all just recently come undone...

It wasn't the memory in that situation...

The thing is when your adhearing individual heat sinks you can twist them off which is mostly safe... MOSTLY...

*ONTO INDIVIDUAL HEAT SINKS!!!*

These must be put on with double sided tape or glue right? Well it is a little safer to remove these BUT!!! What if it falls off... that loose piece of copper can short things out on your motherboard or the GPU.

I had a Corsair heat spreader just fall right off and onto a video card. I was shocked that they where ONLY held on by the tape... not secured to each other which is what it looked like by the way they designed them.

That said...

If the card and motherboard is in the normal orientation that it is in most tower PCs and you don't have ANY cards below it then I don't see why it wouldn't be ok for individual heat sinks. Just know when it comes to removing them that you should use fishing line to cut them off or a scalpel/exacto knife (CAREFULLY!!!). Seriously chips come off!

Also if your talking about chips on the backside which is actually the top in a tower PC then I see no issue because gravity is pulling them towards the chips not away from them.

If you use a PC that the motherboard lays flat or you have any expensive cards below the GPU... DONT DO IT! Spend the money and buy full coverage components dissect and destroy a reference cooler to use to cool your components.

If your cooling backside components then you can buy a backplate and then glue heat sinks too it. Mix thermal paste with super glue or something... look up a how to on mixing those on google to make sure you are using compatible glues and pastes.


----------



## pdasterly

nice writeup
I've used SEKISUI tape, the cheapest on fleabay, with good success. I did have two heatsinks fall off a card, but they landed on top of the 2nd GPU, so shorting out wasn't an issue. If you go this route, just make sure you clean off the chip first and press and hold the heatsink firmly for 10 seconds to get good contact. After a few heat cycles the heatsink will settle in. No problem removing them either, simply twist them off.
Quote:


> Originally Posted by *givmedew*
> 
> I DO NOT RECOMMEND IT AT ALL, IN ANY WAY, SHAPE, OR FORM!!!
> 
> This is why:
> 
> *For full coverage blocks:*
> 
> There are going to be people who will say it is just fine... however, I know from experience that MEMORY chips come off! I have seen memory chips pulled off by people removing RAM heat spreaders.
> 
> I bought a used R9 290X last year in March for $200. When I pulled the cooler off, a ton of components fell off the board. Obviously this was a defective card with bad solder, and most likely while it was sitting on a runway in the belly of a plane it hit some extreme temp that loosened up the parts... You could tell by how they fell off the board when I removed the heat sink that they had all just recently come undone...
> 
> It wasn't the memory in that situation...
> 
> The thing is, when you're adhering individual heat sinks, you can twist them off, which is mostly safe... MOSTLY...
> 
> *ONTO INDIVIDUAL HEAT SINKS!!!*
> 
> These must be put on with double-sided tape or glue, right? Well, it is a little safer to remove these, BUT!!! What if one falls off... that loose piece of copper can short things out on your motherboard or the GPU.
> 
> I had a Corsair heat spreader just fall right off and onto a video card. I was shocked that they were ONLY held on by the tape... not secured to each other, which is what it looked like by the way they designed them.
> 
> That said...
> 
> If the card and motherboard are in the normal orientation they have in most tower PCs and you don't have ANY cards below it, then I don't see why it wouldn't be OK for individual heat sinks. Just know that when it comes to removing them, you should use fishing line to cut them off, or a scalpel/X-Acto knife (CAREFULLY!!!). Seriously, chips come off!
> 
> Also, if you're talking about chips on the backside, which is actually the top in a tower PC, then I see no issue, because gravity is pulling them towards the chips, not away from them.
> 
> If you use a PC where the motherboard lies flat, or you have any expensive cards below the GPU... DON'T DO IT! Spend the money on full coverage, or dissect a reference cooler and reuse it to cool your components.
> 
> If you're cooling backside components, then you can buy a backplate and glue heat sinks to it. Mix thermal paste with super glue or something... look up a how-to on mixing those on Google to make sure you are using compatible glues and pastes.


----------



## pshootr

Quote:


> Originally Posted by *Vici0us*
> 
> Why wouldn't you use your GPU in a primary PCI-E slot (top one)?


Because in the top slot, the GPU sits very close to my CPU cooler. That means combined heat and very restricted airflow for both components. In that regard I think it is silly to have the "primary" slot so close to the CPU.

Although, as far as I know, on my MB slot 1 and slot 3 have the same bandwidth for single-card use.


----------



## SLOWION

My new baby


----------



## GOLDDUBBY

Quote:


> Originally Posted by *SLOWION*
> 
> My new baby


Secksi !!


----------



## Bertovzki

Regarding heat sinks: if you never intend to remove them, would it not be better to use a good thermal paste or thermal pad and glue them on with epoxy around the perimeter of the heat sink?

I personally would only use a full block too, but just putting the idea out there.


----------



## trivium nate

okay, not to start any wars or anything, but I saw this card posted on Twitter:

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150723&cm_re=290x_xfx-_-14-150-723-_-Product

how much faster/bigger/better is this card than my current card? I don't need 8GB of VRAM, but hey, why not...

this is my current card

http://www.newegg.com/Product/Product.aspx?Item=N82E16814487079

I've never been an ATI guy... no offense, but if you don't mind...


----------



## Yorkston

980 > 290X unless you are doing 4K+ resolutions. 8GB of VRAM is somewhat wasted on this generation of cards, as the GPU will usually run out of power before you run out of the 4GB of VRAM.
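For a sense of scale, here's some rough arithmetic (my own assumptions: 32-bit RGBA and triple buffering) showing that the display buffers themselves are tiny; it's textures and render targets that actually fill 4GB:

```python
def framebuffer_mib(width, height, bytes_per_pixel=4, buffers=3):
    """Approximate VRAM used by the swap chain alone (RGBA8, triple-buffered)."""
    return width * height * bytes_per_pixel * buffers / 2**20

# Even at 4K the raw buffers are under 100 MiB; the rest of the
# 4GB (or 8GB) goes to textures, geometry, and render targets.
print(round(framebuffer_mib(1920, 1080), 1))  # 23.7 MiB at 1080p
print(round(framebuffer_mib(3840, 2160), 1))  # 94.9 MiB at 4K
```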


----------



## pshootr

Quote:


> Originally Posted by *trivium nate*
> 
> okay, not to start any wars or anything, but I saw this card posted on Twitter:
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150723&cm_re=290x_xfx-_-14-150-723-_-Product
> 
> how much faster/bigger/better is this card than my current card? I don't need 8GB of VRAM, but hey, why not...
> 
> this is my current card
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814487079
> 
> I've never been an ATI guy... no offense, but if you don't mind...


Looks like a great option if you intend to run 4K. It's a bit overkill for 1080p. But it is not really that much more after rebate than a 290, so the value looks very decent to me.

I'm not sure how to compare it to your card though, I'll leave that to someone else.


----------



## rdr09

Quote:


> Originally Posted by *trivium nate*
> 
> okay, not to start any wars or anything, but I saw this card posted on Twitter:
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150723&cm_re=290x_xfx-_-14-150-723-_-Product
> 
> how much faster/bigger/better is this card than my current card? I don't need 8GB of VRAM, but hey, why not...
> 
> this is my current card
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814487079
> 
> I've never been an ATI guy... no offense, but if you don't mind...


if that 290X had been available at launch i would have bought it. i got my 290 for $400. my second one, though, cost me $187, plus a full block for $60. got them to drive a 4K display. saw my vram usage in BF4 reach 3750 MB per card without AA.

might not use 5, 6, 7, or 8 GB, but i might just go over 4GB in the coming games. i read evolve is one.


----------



## taem

Quote:


> Originally Posted by *trivium nate*
> 
> okay, not to start any wars or anything, but I saw this card posted on Twitter:
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150723&cm_re=290x_xfx-_-14-150-723-_-Product
> 
> how much faster/bigger/better is this card than my current card? I don't need 8GB of VRAM, but hey, why not...
> 
> this is my current card
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814487079
> 
> I've never been an ATI guy... no offense, but if you don't mind...


I'd take the Lightning for $320 over that card even though it has half the vram.

But honestly you shouldn't upgrade what you have.


----------



## Euda

Hey guys, does anyone know this kind of "artifacts" (only occurring on shaded ground)?:
Watch_Dogs - artifacts?:


----------



## GOLDDUBBY

Quote:


> Originally Posted by *Euda*
> 
> Hey guys, does anyone know this kind of "artifacts" (only occurring on shaded ground)?:
> Watch_Dogs - artifacts?:


Looks like the core isn't getting enough volts, but that game is so broken, my money is on bad optimization...


----------



## Frontside

My MSI R9 290X gets a new dress









More pics of my upcoming build in signature below


----------



## Arizonian

Quote:


> Originally Posted by *Serandur*
> 
> I originally ordered an MSI Lightning 290X, now I'm returning it for a Vapor-X 290X (the 8GB model). Not that there's much of anything wrong with the Lightning, but my specific model had an issue with one of its fans and I was just too enticed by the beauty of the Vapor-X. Don't have much use for all its extra VRAM, but its metallic blue and black color scheme match my Z77-UD3H and Gskill Ares RAM perfectly and I hear it's the quietest/coolest air-cooled 290X out there.
> 
> It cost me quite a few extra pennies though, I hope it retains decent resale value when Fiji launches or is about to be launched.


Post a submission when you get your blue baby. Preferably in rig so I can see the aesthetics








Quote:


> Originally Posted by *shoti02*
> 
> also posted without the r 290 topic...
> 
> which two should i mount in my rig ??......
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Nice position to be in with a hard decision. IMO VaporX - Congrats - added








Quote:


> Originally Posted by *azcrazy*
> 
> Glad i got my 290 today , sadly waiting on tubes for the water cooling


Post a submission pic when you get it under water. I'll be glad to add you to the roster.








Quote:


> Originally Posted by *SLOWION*
> 
> My new baby
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added








Quote:


> Originally Posted by *Frontside*
> 
> My MSI R9 290X gets a new dress
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added
















On a side note: waiting on ordered parts today for the 'devil's advocate' build in my signature. Red/black/white color scheme, all set up and ready for AMD's next flagship.


----------



## artrzy

Hello to all.
I've got a question: which cooler is better for the R9 290, the Accelero Xtreme III or the Accelero Xtreme IV? I don't know which to choose to get better temps. Thanks!


----------



## Mega Man

Water block is


----------



## mfknjadagr8

Quote:


> Originally Posted by *Mega Man*
> 
> Water block is


lol...have you used the komodo block mega?


----------



## Mega Man

Yes.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Mega Man*
> 
> Yes.


how would you rate it amongst others you've used? Looking to get a pair soon


----------



## Mega Man

it works. i don't really feel like i can grade them, as i have not used them (or anything else) enough. i would suggest you look into the reviews (supposedly they don't cool the vrms as well). really though, watercooling vs anything else is always better

so the real question is

1 what do you want to do? (max clocks = ek + new pads)
2 looks, what do you like?


----------



## taem

I just ordered the Watercool Heatkiller GPU-X3 290X block. $92 for the copper. Stren gave it a silver award for the best combination of core and VRM cooling without a cooling backplate. Love the low price given where the 290X is in its life cycle, and love that it gets great VRM cooling without a backplate and pads.



Plus, wow, this is the best looking GPU block ever.



No one seems to use this block though... no reviews, no one talks about it, nothing. Everyone uses EK, apart from a few Aquacomputer users. I would have preferred Aquacomputer + backplate, but that's $150+ versus the Heatkiller's $92.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Mega Man*
> 
> it works. i don't really feel like i can grade them, as i have not used them (or anything else) enough. i would suggest you look into the reviews (supposedly they don't cool the vrms as well). really though, watercooling vs anything else is always better
> 
> so the real question is
> 
> 1 what do you want to do? (max clocks = ek + new pads)
> 2 looks, what do you like?


looks don't mean a lot to me. i mean, i don't want it to look like crap, but i'm open on looks. i read a review for the komodo blocks that stated the vrm cooling was bad (actually the one posted above), then turned around and read a second that completely contradicted it (found it interesting: http://forums.guru3d.com/showthread.php?t=389139 http://kingpincooling.com/forum/showthread.php?t=2710 )... talking with bram from swiftech, he says in their testing it was more in line with the second review i read, so who knows... the aqua computer block is pretty nice and performs, but it's more like $190ish with the backplate, which is something i want... as for what i wanna do: i'm looking for decent clocks, not necessarily super-volted benching-for-records clocks, but game stable, decent temps, and a modest overclock... i won't need the full grunt of both 290s for a while, as i still have a 1080p monitor, but i would like to try some downsampled stuff :0


----------



## Mega Man

My biggest problem is 2 things: 1, with the back plate I hate the "x" cutout, and 2, I wish they went RGB.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Mega Man*
> 
> My biggest problem is 2 things: 1, with the back plate I hate the "x" cutout, and 2, I wish they went RGB.


the x cutout kinda looks game console-esque, and the colored strips to change the led color were a way to cut costs I'm sure, but it's the same on my h220x so not too bad... I would've thought retention through the backplate would have been better for mounting though... I've got more time anyway, as lutro0 is still healing


----------



## Spongeboy5040

Just put the order in for a Sapphire Tri-X 8GB 290X. Anyone know if these cards fit reference water blocks like the 4GB models do?


----------



## DeviousAddict

Quote:


> Originally Posted by *Spongeboy5040*
> 
> Just put the order in for a Sapphire Tri-X 8GB 290X. Anyone know if these cards fit reference water blocks like the 4GB Models??


Hey, I've got two of those and no, a reference waterblock will not fit them.
I have one of each of these: http://www.ekwb.com/shop/blocks/vga-blocks/ati-radeon-full-cover-blocks/radeon-rx-200-series/ek-fc-r9-290x-vaporx-acetal-nickel.html

They also come in a clear version: http://www.ekwb.com/shop/blocks/vga-blocks/ati-radeon-full-cover-blocks/radeon-rx-200-series/ek-fc-r9-290x-vaporx-nickel.html

The backplate is specific too, you need this: http://www.ekwb.com/shop/blocks/vga-blocks/fc-backplates/amd-radeon-series/ek-fc-r9-290x-vaporx-backplate-black.html Which only comes in black, unfortunately.




Amazing cards too, btw. I mod Skyrim quite heavily with 8K and 4K textures and roughly use 6-7GB of VRAM. Also, Shadow of Mordor on max settings (inc. the hi-res texture pack) uses around 4.5GB of VRAM.

Edit: Just noticed you said Tri-X, are you sure they're not Vapor-X? Because I don't think there is an 8GB Tri-X version. (I could be wrong though.)


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *taem*
> 
> I just ordered the Watercool Heatkiller GPU-X3 290X block. $92 for the copper. Stren gave it a silver award for the best combination of core and VRM cooling without a cooling backplate. Love the low price given where the 290X is in its life cycle, and love that it gets great VRM cooling without a backplate and pads.
> 
> 
> 
> Plus wow this is the best looking gpu block ever.
> 
> 
> 
> No one seems to use this block though... no reviews, no one talks about it, nothing. Everyone uses EK, apart from a few Aquacomputer users. I would have preferred Aquacomputer + backplate, but that's $150+ versus the Heatkiller's $92.


No, not everyone uses EK.

Why don't I see the XSPC block??


----------



## LandonAaron

Quote:


> Originally Posted by *taem*
> 
> I just ordered the Watercool Heatkiller GPU-X3 290X block. $92 for the copper. Stren gave it a silver award for the best combination of core and VRM cooling without a cooling backplate. Love the low price given where the 290X is in its life cycle, and love that it gets great VRM cooling without a backplate and pads.
> 
> 
> 
> Plus wow this is the best looking gpu block ever.
> 
> 
> 
> No one seems to use this block though... no reviews, no one talks about it, nothing. Everyone uses EK, apart from a few Aquacomputer users. I would have preferred Aquacomputer + backplate, but that's $150+ versus the Heatkiller's $92.


Can you tell me where you ordered from? I have been shopping for a sub-$100 full cover water block and haven't been able to come up with anything. These Heatkillers were going for $86 at Performance-PCs several months ago (should have bought one then, doh) but now they are like $120 there.


----------



## SLAKER

Hi Fellow Owners,
I recently upgraded from a 7950 to an MSI R9 290X Gaming.
I noticed I was only getting ±100 fps (I have a 120Hz screen, so I need my 120 fps) in BF4, and I checked my GPU usage and it was only at ±40%. (Screenshot below)
In the Heaven and 3DMark benchmarks the GPU maxes out at 100% usage.
Temps are low, so it's not throttling.
In-game settings: all on low except mesh quality, which is on ultra, and FOV at 90.

My specs are:
FX-8350 OCed to 4.8GHz - it is not bottlenecking; CPU only at 60% usage. (Screenshot below)
ROG Crosshair V Formula-Z
8GB Kingston HyperX Beast 1866MHz CL9
MSI R9 290 Gaming
256GB Crucial SSD
640 Watt Vantec Ion2
Windows 8.1

What I have tried:

Different drivers
Mantle and DirectX
Unparked cores

Anyone have any idea what could be wrong?
Anyone want to share setups and configs?

Regards,
SLAKER

I'm new to this forum, so if this is in the wrong place please let me know.
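For reference, the way I'm reading those usage numbers can be sketched like this (just a heuristic of my own; the 90% threshold is arbitrary):

```python
def likely_limit(samples, high=90):
    """samples: (cpu_pct, gpu_pct) pairs logged while playing.
    Whichever side is pegged most of the time is the likely
    bottleneck; neither pegged points at a frame cap, vsync,
    or one saturated CPU core hidden by the overall average."""
    n = len(samples)
    cpu_pegged = sum(c >= high for c, _ in samples)
    gpu_pegged = sum(g >= high for _, g in samples)
    if gpu_pegged > n / 2:
        return "gpu-bound"
    if cpu_pegged > n / 2:
        return "cpu-bound"
    return "neither pegged - check fps caps, vsync, or per-core CPU load"

# ~60% CPU / ~40% GPU, as above, lands in the third case
print(likely_limit([(60, 40), (65, 42), (58, 38)]))
```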


----------



## rdr09

look at the column labeled Maximum. most of them are in the 90s.

here was my reading with my i7 SB @ 4.5, HT on and off, and a single 290 . . .



that was at 1080p.


----------



## SLAKER

That's the maximum for the day; I have not rebooted. That screenshot was taken while I was playing, I had not tabbed out or anything. (2nd screen)


----------



## rdr09

Quote:


> Originally Posted by *SLAKER*
> 
> That's the maximum for the day; I have not rebooted. That screenshot was taken while I was playing, I had not tabbed out or anything. (2nd screen)


well, there were times when the cpu was hitting as high as 90% usage, most prolly during huge explosions. if it's 1080p, try raising the resolution scale to 120, 130%.
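to put numbers on that (assuming bf4's resolution scale multiplies each axis, which is my understanding):

```python
def scaled_load(width, height, scale_pct):
    """Scaled render resolution and the relative pixel load vs. 100%."""
    f = scale_pct / 100.0
    w, h = round(width * f), round(height * f)
    return (w, h), (w * h) / (width * height)

# 130% at 1080p renders about 2496x1404 -- roughly 1.69x the pixels,
# which pushes the load back onto an underused GPU.
res, load = scaled_load(1920, 1080, 130)
print(res, round(load, 2))
```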


----------



## taem

Quote:


> Originally Posted by *LandonAaron*
> 
> Can you tell me where you ordered from? I have been shopping for a sub-$100 full cover water block and haven't been able to come up with anything. These Heatkillers were going for $86 at Performance-PCs several months ago (should have bought one then, doh) but now they are like $120 there.


Aquatuning. http://www.aquatuning.us/water-cooling/gpu-water-blocks/gpu-full-cover/?p=1&f=251&sSort=5 My first time ordering from them and I have yet to receive it, but the experience has been good so far. I screwed up my order and emailed them after payment had processed, when it was in the pick-to-be-shipped stage, and a rep named Julia Fenske sorted me out right away.

Great prices on GPU blocks there, and the funny thing is, shipping is cheaper than Performance-PCs. I know, right? Shipping from Germany, with customs fees, and it's STILL CHEAPER THAN P-PCs. /shakes head. I do like P-PCs though and order from them a lot, just saying.

The Aquacomputer Kryographics copper/plexi is only $106, with the backplate for like $30. I hope I don't regret not going that route, since that's rated the top performer. But the Heatkiller at $98 (it was $92 when I ordered but went up a bit) seemed too good a deal given stren's review of it. http://www.xtremerigs.net/2014/08/11/watercool-heatkiller-gpu-x3-r9-290x-review/

*"The heatkiller doesn't just stop at core temps either, it has the best results for VRM cooling for a block without a backplate. If watercool had only made a backplate that could enhance the cooling then this block might have cinched a gold award!

Another beautiful block from team Germany - excellent performance on the core and VRMs particularly at low flow where it counts most. An active backplate would have given AquaComputer a run for their money but as it is Watercool will have to settle for silver!"
*

And again -- I think it's the best looking GPU block ever made. I ordered the copper/plexi (the cheapest), but it comes in all flavors, with the most expensive, nickel-black, at $125.





I would only get the Heatkiller if you don't want a backplate though, because the Heatkiller backplate looks awful and doesn't help cooling. If you want a backplate you should definitely go with the Aquacomputer Kryographics + passive backplate. I like backplates on air-cooled custom cards, but I really do not like the plain slab appearance of backplates for GPU blocks, like something out of Poland in the communist 70s:

ugly

so ugly


Basically it looks like something some dude made in his garage with a Dremel.

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> No not everone uses ek .
> 
> Why don't I see xspc block ??


I think he got that block late in his review cycle, and he got poor results with it; there was a question of whether the unit he got was flawed in some way, because he couldn't get good contact. I think that's why he left it out of the summary.

http://www.xtremerigs.net/2014/10/22/xspc-r9-290x-waterblock/


----------



## LandonAaron

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *taem*
> 
> Aquatuning. http://www.aquatuning.us/water-cooling/gpu-water-blocks/gpu-full-cover/?p=1&f=251&sSort=5 My first time ordering from them and I have yet to receive it, but the experience has been good so far. I screwed up my order and emailed them after payment had processed, when it was in the pick-to-be-shipped stage, and a rep named Julia Fenske sorted me out right away.
> 
> Great prices on GPU blocks there, and the funny thing is, shipping is cheaper than Performance-PCs. I know, right? Shipping from Germany, with customs fees, and it's STILL CHEAPER THAN P-PCs. /shakes head. I do like P-PCs though and order from them a lot, just saying.
> 
> The Aquacomputer Kryographics copper/plexi is only $106, with the backplate for like $30. I hope I don't regret not going that route, since that's rated the top performer. But the Heatkiller at $98 (it was $92 when I ordered but went up a bit) seemed too good a deal given stren's review of it. http://www.xtremerigs.net/2014/08/11/watercool-heatkiller-gpu-x3-r9-290x-review/
> 
> *"The heatkiller doesn't just stop at core temps either, it has the best results for VRM cooling for a block without a backplate. If watercool had only made a backplate that could enhance the cooling then this block might have cinched a gold award!
> 
> Another beautiful block from team Germany - excellent performance on the core and VRMs particularly at low flow where it counts most. An active backplate would have given AquaComputer a run for their money but as it is Watercool will have to settle for silver!"
> *
> 
> And again -- I think it's the best looking GPU block ever made. I ordered the copper/plexi (the cheapest), but it comes in all flavors, with the most expensive, nickel-black, at $125.
> 
> 
> 
> 
> 
> I would only get the Heatkiller if you don't want a backplate though, because the Heatkiller backplate looks awful and doesn't help cooling. If you want a backplate you should definitely go with the Aquacomputer Kryographics + passive backplate. I like backplates on air-cooled custom cards, but I really do not like the plain slab appearance of backplates for GPU blocks, like something out of Poland in the communist 70s:
> 
> ugly
> 
> so ugly
> 
> 
> Basically it looks like something some dude made in his garage with a Dremel.
> I think he got that block late in his review cycle, and he got poor results with it; there was a question of whether the unit he got was flawed in some way, because he couldn't get good contact. I think that's why he left it out of the summary.
> 
> http://www.xtremerigs.net/2014/10/22/xspc-r9-290x-waterblock/






Lol, thanks for the links. I actually like the plain black slab look; I guess when it comes to looks it's pure subjectivity. I doubt I would buy one though, as they seem to serve very little purpose.


----------



## taem

Quote:


> Originally Posted by *LandonAaron*
> 
> 
> Lol, thanks for the links. I actually like the plain black slab look; I guess when it comes to looks it's pure subjectivity. I doubt I would buy one though, as they seem to serve very little purpose.


If the backplate takes thermal pads, then they serve a good purpose. A solid backplate with no holes in it, like the XSPC, EK, or Watercool ones, might also save your GPU from a leak in the CPU block.

Aesthetics-wise, if the motherboard were slab-like, then the slab backplates might work better. But unless you have a Sabertooth or Formula or some other board with armor, a motherboard has exposed circuit boards and capacitors etc. Imho the partial blocks work much better to match that appearance. And once that thin GPU PCB is exposed at all, the sleekness of the Heatkiller looks terrific.

It depends on your fittings and tubing too. If you're going to stick QDCs on your GPU block and use heavy Bitspower fittings with 3/4" tubing, then something as sleek as the Heatkiller would look unbalanced, and a thicker block like the Kryographics would look better.

Imho slabs also look a lot better if there is some relief, like the Aquacomputer.



But aesthetically, probably the crowning achievement of this AMD gen is the map of Hawaii on the Aquacomputer block.



that just kills me.


----------



## Serandur

Got my Vapor-X, some pics:
Quote:


> Originally Posted by *Arizonian*
> 
> Post a submission when you get your blue baby. Preferably in rig so I can see the aesthetics


Sorry about the quality, they're not the best but:








It's a really nice card, though it does have a few scratches/manufacturing defects on it for some reason, and I was disappointed to see Elpida GDDR5 in it (I expected Hynix), but it's fast, quiet, and looks perfect in my PC.


----------



## GOLDDUBBY

Quote:


> Originally Posted by *SLAKER*
> 
> Hi Fellow Owners,
> I recently upgraded from a 7950 to an MSI R9 290X Gaming.
> I noticed I was only getting ±100 fps (I have a 120Hz screen, so I need my 120 fps) in BF4, and I checked my GPU usage and it was only at ±40%. (Screenshot below)
> In the Heaven and 3DMark benchmarks the GPU maxes out at 100% usage.
> Temps are low, so it's not throttling.
> In-game settings: all on low except mesh quality, which is on ultra, and FOV at 90.
> 
> My specs are:
> FX-8350 OCed to 4.8GHz - it is not bottlenecking; CPU only at 60% usage. (Screenshot below)
> ROG Crosshair V Formula-Z
> 8GB Kingston HyperX Beast 1866MHz CL9
> MSI R9 290 Gaming
> 256GB Crucial SSD
> 640 Watt Vantec Ion2
> Windows 8.1
> 
> What I have tried:
> 
> Different drivers
> Mantle and DirectX
> Unparked cores
> 
> Anyone have any idea what could be wrong?
> Anyone want to share setups and configs?
> 
> Regards,
> SLAKER
> 
> I'm new to this forum, so if this is in the wrong place please let me know.


What AB skin is that? Lookin nice.

:edit: try using ddu to uninstall all the previous drivers, then do a fresh install of the latest drivers to your system.

http://www.instalki.pl/programy/download/Windows/deinstalatory/Display_Driver_Uninstaller.html

:edit 2: found the skin, http://grum-d.deviantart.com/art/MSI-Afterburner-ASUS-ROG-Skin-502513889


----------



## Mega Man

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> My biggest problem is 2 things: 1, with the back plate I hate the "x" cutout, and 2, I wish they went RGB.
> 
> 
> 
> the x cutout kinda looks game console-esque, and the colored strips to change the led color were a way to cut costs I'm sure, but it's the same on my h220x so not too bad... I would've thought retention through the backplate would have been better for mounting though... I've got more time anyway, as lutro0 is still healing

all in all, for them to make RGB it would just be too hard. i have a way to do it, but frankly it would be too hard imo for everyone to implement
Quote:


> Originally Posted by *LandonAaron*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *taem*
> 
> Aquatuning. http://www.aquatuning.us/water-cooling/gpu-water-blocks/gpu-full-cover/?p=1&f=251&sSort=5 My first time ordering from them and I have yet to receive it, but the experience has been good so far. I screwed up my order and emailed them after payment had processed, when it was in the pick-to-be-shipped stage, and a rep named Julia Fenske sorted me out right away.
> 
> Great prices on GPU blocks there, and the funny thing is, shipping is cheaper than Performance-PCs. I know, right? Shipping from Germany, with customs fees, and it's STILL CHEAPER THAN P-PCs. /shakes head. I do like P-PCs though and order from them a lot, just saying.
> 
> The Aquacomputer Kryographics copper/plexi is only $106, with the backplate for like $30. I hope I don't regret not going that route, since that's rated the top performer. But the Heatkiller at $98 (it was $92 when I ordered but went up a bit) seemed too good a deal given stren's review of it. http://www.xtremerigs.net/2014/08/11/watercool-heatkiller-gpu-x3-r9-290x-review/
> 
> *"The heatkiller doesn't just stop at core temps either, it has the best results for VRM cooling for a block without a backplate. If watercool had only made a backplate that could enhance the cooling then this block might have cinched a gold award!
> 
> Another beautiful block from team Germany - excellent performance on the core and VRMs particularly at low flow where it counts most. An active backplate would have given AquaComputer a run for their money but as it is Watercool will have to settle for silver!"
> *
> 
> And again -- I think it's the best looking gpu block ever made. I ordered the copper/plexi (cheapest) but it comes in all flavors with the most expensive nickel-black at $125
> 
> 
> 
> 
> 
> I would only get the Heatkiller if you don't want a backplate though, because the Heatkiller backplate looks awful and doesn't help cooling. If you want a backplate you should definitely go with the Aquacomputer Kryographics + passive backplate. I like backplates on air-cooled custom cards, but I really do not like the plain slab appearance of backplates for GPU blocks, like something out of Poland in the communist 70s:
> 
> ugly
> 
> so ugly
> 
> 
> Basically it looks like something some dude made in his garage with a Dremel.
> I think he got that block late in his review cycle, and he got poor results with it; there was a question of whether the unit he got was flawed in some way, because he couldn't get good contact. I think that's why he left it out of the summary.
> 
> http://www.xtremerigs.net/2014/10/22/xspc-r9-290x-waterblock/
> 
> 
> 
> 
> 
> 
> 
> Lol, thanks for the links. I actually like the plain black slab look; I guess when it comes to looks it's pure subjectivity. I doubt I would buy one though, as they seem to serve very little purpose.

They serve several purposes: they help stiffen the GPU,
they passively/actively cool the memory / back of the VRM / back of the GPU die,
and they protect from shorts ( this is a rather large + tbh )


----------



## SLAKER

On a 100% res scale










On a 130% res scale


----------



## Mega Man

soooo beyond lost


----------



## ghabhaducha

I think he's referring to VSR or something, and highlighting the increase in GPU usage.


----------



## SLAKER

It's a response to this:
Quote:


> Originally Posted by *rdr09*
> 
> well, there were times where the cpu was hitting as high as 90% usage and most prolly during huge explosions. if it's 1080, try raising the resolution scale to 120, 130%.
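For anyone wondering why bumping the resolution scale hits the GPU so hard, here's a quick sketch of the maths (a hypothetical helper with illustrative resolutions, assuming the scale applies per axis, as it typically does in BF4, so the pixel count grows with the square of the scale):

```python
def scaled_resolution(width, height, scale_pct):
    """Effective render resolution for a given in-game resolution scale."""
    s = scale_pct / 100.0
    return round(width * s), round(height * s)

w, h = scaled_resolution(1920, 1080, 130)
pixel_ratio = (w * h) / (1920 * 1080)
# 130% scale renders 2496x1404, i.e. ~1.69x the pixels of native 1080p,
# which is why GPU usage jumps at higher resolution scales.
```

At 130% that's roughly 1.69x the pixels of native 1080p, which lines up with the jump in GPU usage in the screenshots above.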


----------



## jmng14

Hey guys, I have a question: if I Crossfire two 290s, will I be able to run displays off the second card, or is the second card used for computing only?


----------



## tsm106

Quote:


> Originally Posted by *jmng14*
> 
> hey guys i have a question. If i Crossfire two 290s, will i be able to run displays off the second card or is the second card still used for computing only


The 2nd card (slave card) in CFX has its ports deactivated. You can only go through the active card.


----------



## DeviousAddict

I thought you could connect extra monitors to your 2nd card because of Eyefinity? I thought that was the point of it.
Never used it so I don't really know, I just assumed that was the case myself.


----------



## tsm106

Quote:


> Originally Posted by *DeviousAddict*
> 
> *I thought you could connect extra monitors into your 2nd card because of eyefinity?* I thought that was the point of it.
> Never used it so I dont really know, I just assumed that was the case myself.


It's never worked that way afaik.


----------



## DeviousAddict

Quote:


> Originally Posted by *tsm106*
> 
> 
> It's never worked that way afaik.


Roger dodger, good to know, thank you


----------



## ChronoBodi

Selling off my Titans; got my MSI R9 290X 8GBs, two of them....

Haven't installed them yet, but does anyone have experience with the MSI Gaming OC Twin Frozr IV's power plug? It's awfully close to the cooler heatsink; is it really that hard to remove, or is it all hyperbole?


----------



## LandonAaron

Quote:


> Originally Posted by *ChronoBodi*
> 
> selling off my Ttians, got my MSI R9 290X 8GBs, two of them....
> 
> haven't installed them yet, but does anyone have any experience with the MSI Gaming OC Twin Frozr IV's power plug? It's awfully close to the cooler heatsink, is it really that hard to remove or is it all hyperbole?


I've owned the MSI Twin Frozr GTX 770 and R9 290X, though I only watercooled the GTX 770, but their power plugs are in the same spot as on every other graphics card I've ever seen. Gave me no problems.


----------



## LandonAaron

Do all blower-type R9 290Xs use the reference PCB layout, e.g. MSI, Asus, VisionTek?

Also, I already own one R9 290X. Do y'all suggest a second for Crossfire, or save a few bucks and get an R9 290?

Finally, does it matter for Crossfire if one is a reference PCB and the other is a custom one?


----------



## ChronoBodi

Quote:


> Originally Posted by *LandonAaron*
> 
> Do all blower type R9 290x's use the reference PCB layout. ex: MSI, Asus, VisionTek?
> 
> Also I already own one R9 290x. Do yall suggest a second for crossfire or save a few bucks and get an R9 290?
> 
> Finally does it matter if one is reference PCB, and the other is a custom one for crossfire?


Doesn't matter what PCB either card has; as long as both GPUs are within R9 290/290X spec they'll both work. PCB and brand don't matter.


----------



## LandonAaron

Quote:


> Originally Posted by *ChronoBodi*
> 
> doesn't matter what PCB both cards are, as long as both GPUs are within the r9 290/290x specs, they both work, PCB or brand doesn't matter.


Okay. I want to get full-cover blocks too, so I'm looking to buy reference. Any R9 290X or R9 290 with a blower fan will work with a reference water block, right?


----------



## ChronoBodi

Well, more research would have to be done, and it's a bit hard to tell. For instance, the MSI Gaming OC has a "custom" PCB that's pretty much a reference PCB with better parts on it.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131617&cm_re=r9_290x-_-14-131-617-_-Product

That one should be reference, but, more research is needed.


----------



## GOLDDUBBY

Quote:


> Originally Posted by *SLAKER*
> 
> On a 100% res scale
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On a 130% res scale


Are you still talking about BF4? Did you try uninstalling the drivers using DDU? People have had the same issues you described, and after a fresh run of DDU and a clean install of one driver, the problems went away.

http://www.instalki.pl/programy/download/Windows/deinstalatory/Display_Driver_Uninstaller.html


----------



## LandonAaron

Quote:


> Originally Posted by *ChronoBodi*
> 
> Well more research would have to be done. and it's a bit hard to tell. Like, the MSI Gaming OC one has a "custom" PCB that's pretty much a reference PCB with better parts on it.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814131617&cm_re=r9_290x-_-14-131-617-_-Product
> 
> That one should be reference, but, more research is needed.


Yeah, that's the kind of thing I was afraid of. Better parts = taller caps = won't fit a reference water block. Not saying that's for sure the case, just a distinct possibility. It makes shopping for a card a lot harder when you don't know which ones are reference and which aren't. And ain't nobody got time for researching every single one: XFX, MSI, Gigabyte, VisionTek, Sapphire, PowerColor, the list goes on.


----------



## ChronoBodi

Quote:


> Originally Posted by *LandonAaron*
> 
> I've owned the MSI Twin Frozr GTX 770 and R9 290x though I only watercooled the GTX 770, but their power plugs are in the same spot as every other graphics card I have ever seen. Gave me no problems.


Holy crap, they actually fixed the power connector on the MSI Gaming OC. They reversed the connector on the 8 GB model, so it's not a PITA anymore to unplug.


----------



## cokker

Quote:


> Originally Posted by *ChronoBodi*
> 
> holy crap they actually fixed the power connection on the MSI gaming OC, they reversed the connector on the 8 GB model so its not a PITA anymore to take the connector out.


Picked up a 290X 4GB from Amazon about 2 weeks ago and it has the new PCI-E power connector orientation, nice surprise!

Also I had a peek inside the cooler and it looks like it has those gold-coloured SSC chokes like the Lightning model.


----------



## taem

Quote:


> Originally Posted by *LandonAaron*
> 
> Yeah, that's what I was afraid of stuff like that. Better parts=Taller caps= will not fit reference water block. Not saying that is for sure the case, just distinct possibility. Makes it a lot harder shopping for a card when you don't know which ones are reference and which one's aren't. And aint nobody got time for researching, every single one, XFX, MSI, Gigabyte, VisionTek, Saphire, PowerCooler, the list goes on.


How about the stunt PowerColor pulled, where they released a reference PCB with a custom cooler, then revised the PCB so it no longer fits reference blocks, and gave the consumer no way of knowing which is which when they place an order. The only way is to look at what's stamped on the PCB: R29F for reference, R29FA for custom. There is nothing on the box or part number to show which is which, so a vendor can't tell you.

Anyway, it's generally not hard to tell which cards are reference; block makers list compatibility. Here's a handy one from Watercool:





http://gpu.watercool.de/WATERCOOL_HEATKILLER_GPU_Compatibility.pdf

And EK makes blocks for pretty much every GPU ever made, so just go there and you'll be fine.

Anyway, I'm not sure those "better" parts on the MSI Gaming were a good idea. I've not heard that the card clocks better than a reference, and the cooler is designed for silence, not heat dissipation, so to push it you would need a water block anyway, and the "better" parts just mean you have only one choice of block. I personally favor reference PCBs for all cards that are not Lightning or Matrix; no point in doing otherwise. The custom PCBs this gen, like the Asus DC2, Gigabyte and MSI Gaming, don't clock any better than the Tri-X or XFX DD etc., which are reference. Just focus on the cooler and leave the PCB alone, except for the premium boards meant to be pushed hard or take LN2 cooling.

edited to include 290 chart


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *taem*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Aquatuning. http://www.aquatuning.us/water-cooling/gpu-water-blocks/gpu-full-cover/?p=1&f=251&sSort=5 My first time ordering from them and have yet to receive but experience good so far, I screwed up my order and emailed them after payment had processed and it was in the pick to be shipped stage, and a rep named Julia Fenske sorted me out right away.
> 
> Great prices on gpu blocks there, and the funny thing is, shipping is cheaper than Performance PCs. I know right? Shipping from Germany, with customs fees, and it's STILL CHEAPER THAN P-PCs. /shakes head. I do like P-PCs though and order from them a lot, just saying.
> 
> The Aquacomputer Kryographics copper/plexi is only $106, with the backplate for like $30. I hope I don't regret going that route since that's rated the top performer. But the Heatkiller for $98 (was $92 when I ordered but went up a bit) seemed too good a deal given stren's review of it. http://www.xtremerigs.net/2014/08/11/watercool-heatkiller-gpu-x3-r9-290x-review/
> 
> *"The heatkiller doesn't just stop at core temps either, it has the best results for VRM cooling for a block without a backplate. If watercool had only made a backplate that could enhance the cooling then this block might have cinched a gold award!
> 
> Another beautiful block from team Germany - excellent performance on the core and VRMs particularly at low flow where it counts most. An active backplate would have given AquaComputer a run for their money but as it is Watercool will have to settle for silver!"
> *
> 
> And again -- I think it's the best looking gpu block ever made. I ordered the copper/plexi (cheapest) but it comes in all flavors with the most expensive nickel-black at $125
> 
> 
> 
> 
> 
> I would only get the Heatkiller if you don't want a backplate though, because the Heatkiller backplate looks awful and doesn't help cooling. If you want a backplate you should definitely go the Aquacomputer Kryographics + passive backplate. I like backplates on air cooled custom cards, but I really do not like the plain slab appearance of backplates for gpu blocks, like something out of Poland in the communist 70s):
> 
> 
> 
> ugly
> 
> so ugly
> 
> 
> Basically looks like something some dude made in his garage with a dremel.
> I think he got that block late in his review cycle, and he got poor results with it and there was a question of whether the unit he got was flawed in some way because he couldn't get good contact. I think that's why he left it out of the summary.
> 
> http://www.xtremerigs.net/2014/10/22/xspc-r9-290x-waterblock/


I don't care about ugliness, you should see my Chillputer










Anyways, a block isn't a block unless it's full cover, top and bottom . That's what I pay the moolah for .

This looks quite balanced . Wouldn't you agree ??

@Mega Man

Some of these newbies need to have a look around at other peeps' posts before they post, to prevent us from being 'lost'








'Cause lately here it's been through the roof


----------



## Mega Man

Hahaha


----------



## ChronoBodi

Quote:


> Originally Posted by *cokker*
> 
> Picked up a 290x 4GB from amazon about 2 weeks ago and that has the new PCI-E power connector orientation, nice surprise!
> 
> Also I had a peek inside the cooler and it looks like it has those gold coloured SSC chokes like the lighting model.


Are there any comparisons between this revised model and the old one? I'm curious.


----------



## cokker

Quote:


> Originally Posted by *ChronoBodi*
> 
> is there any comparisons made between this revised model vs the old one? I'm curious.


No idea, I did a bit of googling but only found pictures of early review models.

I would pull the cooler off my card but I don't wanna break the warranty sticker. I know MSI are normally pretty cool with heatsink removals, but you know, lol.


----------



## taem

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Anyways a block isnt a block unless its a full cover top and bottom . That's what I pay the moolah for .
> 
> This looks quite balanced . Wouldn't you agree ??


Yeah it does. Balance depends on what else you have in there, and with 3+ gpus and corsair dominators that look works because there's a slab theme going on.

But generally speaking I don't think the slab works aesthetically, it dominates the motherboard. If you have one gpu in there with normal sized ram that slab looks like a prison building in the middle of a suburban street. Water cooling setups in general dominate the looks too much for my taste. I do get that aesthetics are half the reason folks do this anyway so I can kind of understand it but I prefer a loop that doesn't overwhelm the rest of the system. I dislike 3/4" OD tubing and thick rads for that reason.

Your chillputer is ridiculous lol. There's not even a place to sit in front of your screen you'd be like 2 yards away.


----------



## alancsalt

Chillputer looks just fine. It's for benchmarking, rather than office work, browsing, playing games or being pretty.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *taem*
> 
> Yeah it does. Balance depends on what else you have in there, and with 3+ gpus and corsair dominators that look works because there's a slab theme going on.
> 
> But generally speaking I don't think the slab works aesthetically, it dominates the motherboard. If you have one gpu in there with normal sized ram that slab looks like a prison building in the middle of a suburban street. Water cooling setups in general dominate the looks too much for my taste. I do get that aesthetics are half the reason folks do this anyway so I can kind of understand it but I prefer a loop that doesn't overwhelm the rest of the system. I dislike 3/4" OD tubing and thick rads for that reason.
> 
> *Your chillputer is ridiculous lol. There's not even a place to sit in front of your screen you'd be like 2 yards away.*



Like salty said . LoooL
Eat thy words


----------



## mfknjadagr8

OK, so tomorrow the 290s arrive according to USPS... then within the week my two blocks, MCP50 pump and extra rad should be here... then I can chill it out.


----------



## HOMECINEMA-PC

Come on 3xx series


----------



## taem

Quote:


> Originally Posted by *alancsalt*
> 
> Chillputer looks just fine. It's for benchmarking, rather than office work, browsing, playing games or being pretty.


Lol dude that does not look "fine." If I had a wife I know she wouldn't let me have that in the house.


----------



## Mega Man

Heh, you should see mine, I have more than 7 PCs here... some in pieces ATM.


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *taem*
> 
> Yeah it does. Balance depends on what else you have in there, and with 3+ gpus and corsair dominators that look works because there's a slab theme going on.
> 
> But generally speaking I don't think the slab works aesthetically, it dominates the motherboard. If you have one gpu in there with normal sized ram that slab looks like a prison building in the middle of a suburban street. Water cooling setups in general dominate the looks too much for my taste. I do get that aesthetics are half the reason folks do this anyway so I can kind of understand it but I prefer a loop that doesn't overwhelm the rest of the system. I dislike 3/4" OD tubing and thick rads for that reason.
> 
> Your chillputer is ridiculous lol. There's not even a place to sit in front of your screen you'd be like 2 yards away.
> 
> 
> 
> 
> 
> 
> 
> Like salty said . LoooL


----------



## taem

Quote:


> Originally Posted by *Mega Man*
> 
> heh you should see mine, i have more the n 7 pcs here ... some in peices atm


When it comes to DIY-looking setups though, I prefer the low end to the high end. Like this:



That's full Aussie

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> Like salty said . LoooL


Dam dude, just shaking head. How you live like that?? What do your parents say when they see that?


----------



## alancsalt

A 'puter of rare beauty....









He doesn't live with his parents for a start, nor does he have a wife unless you count his faithful Lexy, who never nags nor tries to tell him how to live.....


----------



## taem

Quote:


> Originally Posted by *alancsalt*
> 
> A 'puter of rare beauty....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> He doesn't live with his parents for a start, nor does he have a wife unless you count his faithful Lexy, who never nags nor tries to tell him how to live.....


It just kills me: he's got nice hardwood flooring and a nice ergonomic chair, he's got Corsair Dominators and backplates for his GPUs and high-end fittings, and then a cardboard box and industrial tubing sitting right there. Those tubes would drive me NUTS. The clutter is fine, I have clutter too, but those tubes, dam; even the wires on my displays drive me nuts. And I'm really hoping that cardboard isn't actually a part of his rig....

And is Lexy his dog? Cos I have dogs, and they nag PLENTY


----------



## rt123

Quote:


> Originally Posted by *taem*
> 
> It just kills me he's got nice hardwood flooring and a nice ergonomic chair and he's got corsair dominators and backplates for his gpus and high end fittings, and then a cardboard box and _*industrial tubing sitting right there. Those tubes would drive me NUTS.*_ The clutter is fine I have clutter too, but those tubes, dam, the wires on my displays drive me nuts. And I'm really hoping that cardboard isn't actually a part of his rig....


I think he runs a water chiller or phase-change setup.
That's why the industrial tubing.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *taem*
> 
> When it comes to DIY looking setups though I prefer the low end to the high end. Like this
> 
> 
> 
> That's full aussie
> Dam dude, just shaking head. How you live like that?? What do your parents say when they see that?



PFFFFFTTTTTTTT Ha check this out








Fully ducted Portable A/C water cooled 2011 Mk2 rig


New house , New bench room with old 46" monitor



Things got a bit outta control . The bug bit *HARD*









I bought 2 water chillers and it just doesn't make sense to run it in a case anymore ( my hands are too big LoooL







) so I turned it into a desk / bench / chill / puter .

Very easy to get at the bits . Can break it down and put it back together in 15 mins . Hence the Quick Disconnects
Quote:


> Originally Posted by *alancsalt*
> 
> A 'puter of rare beauty....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> He doesn't live with his parents for a start, nor does he have a wife unless you count his faithful Lexy, who never nags nor tries to tell him how to live.....
> Quote:
> 
> 
> 
> Originally Posted by *taem*
> 
> It just kills me he's got nice hardwood flooring and a nice ergonomic chair and he's got corsair dominators and backplates for his gpus and high end fittings, and then a cardboard box and industrial tubing sitting right there. Those tubes would drive me NUTS. The clutter is fine I have clutter too, but those tubes, dam, the wires on my displays drive me nuts. And I'm really hoping that cardboard isn't actually a part of his rig....
> *And is Lexy his dog?* Cos I have dogs, and they nag PLENTY
> 
> 
> Spoiler: Warning: Spoiler!





















This is Lexy










2002 IS 300 Lexus









Cardboard ( I know so ghetto







) is there to stop the heat generated from the water chiller exhaust getting into my cool atmosphere ....

and the portable A/C intake sucks that chiller's hot exhaust air in and recycles it through the portable A/C's outlet ( BTW the A/C exhaust is ducted outside as well ) and punches cold air out into the rads so the 1hp chiller doesn't need to kick in as often when switched on .


The cool air sucked in through the rads cools the 'puter down and me at the same time . PLUS I have a fully ducted 15kW reverse-cycle A/C in the roof to chill the room / house down as well .
Adjusted via touchpad for individual zoning !!

Quote:


> Originally Posted by *rt123*
> 
> I think he runs a Water Chiller or Phase Change setup.
> That's why the industrial tubing,


No phase change here . Can't run that here in Brisvegas 'Stralia 24/7 . High humidity = sweating = disaster








It's just 3/4" OD watercooling hose sheathed in industrial copper-pipe A/C insulation to prevent sweating and to keep a constant temp to the water pump when using the 1hp chiller ( GPUs only ) .


Hailea HC-1000 has a 4 L res in it so it's perfect to use and not run all the time . Sorts out my 290s easy . Like for example 30°C full-load temps


The CPU has its own chiller as well that runs 24/7 , it's only a 1/16th HP unit , a Hailea HC-150 , with no rads


Quote:


> Originally Posted by *kizwan*


Thank you Kizzy , May the Force be with you


----------



## Spectre-

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> PFFFFFTTTTTTTT Ha check this out
> 
> 
> 
> 
> 
> 
> 
> 
> Fully ducted Portable A/C water cooled 2011 Mk2 rig
> 
> 
> New house , New bench room with old 46" monitor
> 
> 
> 
> Things got a bit outta control . The bug bit *HARD*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I bought 2 waterchillers and it just doesn't make sense to run it in a case anymore ( my hands are too big LoooL
> 
> 
> 
> 
> 
> 
> 
> ) so I turned it into a desk / bench / chill / puter .
> 
> Very easy to get at the bits . Can break it down and put it back together in 15 mins . Hence the Quick Disconnects
> This is Lexy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2002 IS 300 Lexus
> 
> 
> 
> 
> 
> 
> 
> 
> 
> We need a HCPC cooling thread now
> 
> Cardboard ( I know so ghetto
> 
> 
> 
> 
> 
> 
> 
> ) is there to stop the heat generated from the water chiller exhaust getting into my cool atmosphere ....
> 
> and the portable A/C intake sucks that chillers exhaust hot air in and recycles it into the portable A/C's outtake ( BTW the A/C exhaust is ducted outside as well ) and punches out cold air into the rads so the 1hp chiller doesn't need to kick in as often when switched on .
> 
> 
> The cool air sucked in through the rads cools the 'puter down and me at the same time . PLUS I have a fully ducted 15kw reverse cycle A/C in the roof to chill the room / house down as well .
> Adjusted via touchpad for individual zoneing !!
> 
> No phase change here . Cant run that here in Brisvegas 'Stralia 24/7 . High humidity = sweating = disaster
> 
> 
> 
> 
> 
> 
> 
> 
> Its just 3/4 OD watercooling hose sheathed in industrial copper pipe A/C insulation to prevent sweating and to keep constant temp to waterpump when using 1hp chiller ( GPU's only ) .
> 
> 
> Hailer HC - 1000 has a 4ltr res in it so its perfect to use it and not run it all the time . Sorts out my 290's easy . Like for example 30c full load temps
> 
> 
> The CPU has its own chiller as well that runs 24/7 its only a 1/16th HP unit a Hailer HC 150 with no rads
> 
> Thank you Kizzy , May the Force be with you


Edit: Demanding HCPC cooling thread


----------



## HOMECINEMA-PC

Ummm hello there







^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^


----------



## sinnedone

Quick question for VisionTek owners (or anyone in the know): how have you been with this particular brand?

How does it compare to, say, Sapphire cards? (Talking about reference-style cards.)


----------



## taem

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> PFFFFFTTTTTTTT Ha check this out
> 
> 
> 
> 
> 
> 
> 
> 
> Fully ducted Portable A/C water cooled 2011 Mk2 rig...
> 
> Cardboard ( I know so ghetto
> 
> 
> 
> 
> 
> 
> 
> ) is there to stop the heat generated from the water chiller exhaust getting into my cool atmosphere ....


Lol I just had to give you grief about the cardboard and tubes straight out of the set of Pandorum. I get what you're doing and totally respect it, it's what separates the enthusiasts from the casually interested but ultimately just users like me. You're the successor generation to the first wave water coolers with home milled blocks and pond pumps in buckets next to their desks lol. If it had been left up to guys like me we'd still be using beige boxes with a single 80mm fan and our components cooking.

I guess though that there won't ever be an industry growing up around the extreme cooling methods like what you're doing and the stuff like ln2 and mineral oil etc because not enough folks will be willing to do it. They made water cooling really easy so guys like me can get on board but the esoteric stuff, I doubt enough of a market will ever emerge to support a bunch of manufacturers creating cheap and easy ways for home users to adopt the tech.

But dudes... the term "faithful" applies to dogs, and dogs only. Not to cars, not even nice ones.


----------



## kulomen

http://www.techpowerup.com/gpuz/details.php?id=a9za my R9 working on an EKWB water block









Sapphire R9 290X Vapor-X 8GB

and I have some issues with it. Anyone feel good enough to help?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *taem*
> 
> Lol I just had to give you grief about the cardboard and tubes straight out of the set of *Pandorum.* I get what you're doing and totally respect it, it's what separates the enthusiasts from the casually interested but ultimately just users like me. You're the successor generation to the first wave water coolers with home milled blocks and pond pumps in buckets next to their desks lol. If it had been left up to guys like me we'd still be using beige boxes with a single 80mm fan and our components cooking.
> 
> I guess though that there won't ever be an industry growing up around the extreme cooling methods like what you're doing and the stuff like ln2 and mineral oil etc because not enough folks will be willing to do it. They made water cooling really easy so guys like me can get on board but the esoteric stuff, I doubt enough of a market will ever emerge to support a bunch of manufacturers creating cheap and easy ways for home users to adopt the tech.
> 
> But dudes... the term "faithful" applies to dogs, and dogs only. Not to cars, not even nice ones.


Pandorum is a bit of a scary movie









Quote:


> Originally Posted by *kulomen*
> 
> http://www.techpowerup.com/gpuz/details.php?id=a9za my R9 working on EKWB water blok
> 
> 
> 
> 
> 
> 
> 
> 
> 
> sapphire r9 290x vapor-x 8gb
> 
> and I have some issue with it. anyone fell good enough to help?


What are you talking about , what problem ? If the block doesn't fit it must be the wrong one , at a guess









** Not a mind reader **


----------



## tsm106

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Pandorum is a bit of a scary movie


It's ok Cung Le will save you!


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *tsm106*
> 
> It's ok Cung Le will save you!


By MMA ing my azz , no thanks


----------



## tsm106

Would you let Ronda Rousey arm bar you? I'm kind of inclined to lol.

*Crap she fought tonight, missed it I think.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *tsm106*
> 
> Would you let *Ronda Rousey arm bar* you? I'm kind of inclined to lol.
> 
> *Crap she fought tonight, missed it I think.


Well ....... she's hot and scary at the same time ..... that's all I should add to that . TOS and all that jazz


----------



## azcrazy

Hello

I just got my 290 running but I have a few problems with it.

1st: if I install a driver I get a black screen after Windows starts (I have tried 14.12, 14.4 and the Omega one).

2nd: I have a red line running through the whole screen.

Any ideas?


----------



## HOMECINEMA-PC

Driver clean and re-install









hopefully that will work


----------



## kulomen

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Pandorum is a bit of a scary movie
> 
> 
> 
> 
> 
> 
> 
> 
> What are you talking about , what problem ? If the block don't fit it must be the wrong one , at a guess
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ** Not a mind reader **


First of all, I'm running the 14.12 drivers, and sometimes when I play the game just freezes; after closing it the GPU is still stuck at 100% usage and only a reboot helps.
Second: during the Valley benchmark, for some reason I get quite a low score, even after a small overclock.

1.jpg 2582k .jpg file


My mate said I should hit at least 4000.


----------



## azcrazy

It's a fresh install of Windows, just upgraded my SSD.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kulomen*
> 
> first at all I'm running on 14.12 drivers and sometimes when I play, game just freeze, after closing it GPU is still stuck on 100% gpu usage and only reboot can help.
> second. during valey benchmark from some reasons I have quite low score even after small over-clocking.
> 
> 1.jpg 2582k .jpg file
> 
> 
> my mate said to me that I should kick at least 4000


Haven't run Valley in over a year







.

I see you're new, so welcome









Okay need full details of your setup so others can also chime in .

So you need to put your rig in your signature....

http://www.overclock.net/t/1258253/how-to-put-your-rig-in-your-sig/0_20

Quote:


> Originally Posted by *azcrazy*
> 
> Is a fresh install of windows just upgraded my ssd


I meant wipe the driver and re-install your graphics driver

I'm running ......



so it's an older driver but it runs my TriFire setup well

Also check your monitor connections too









I *used to* get black screens till I flashed the BIOS to the Asus PT1T BIOS .

The PT1T BIOS and Asus GPU Tweak fixed that for me

Sometimes with earlier release cards , one vendor's board ( Gigabyte ) doesn't like another vendor's card's BIOS .


----------



## kulomen

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Havent run valley in over a year
> 
> 
> 
> 
> 
> 
> 
> .
> 
> I see your new so welcome
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Okay need full details of your setup so others can also chime in .
> 
> So you need to put your rig in your signature.... .


done.


----------



## HOMECINEMA-PC

From what I'm seeing of the Valley single-card scores , 75fps is what an FX 9590 with a 290X has scored there ....

Might have to add some vcore for example ... what monitoring program are you using ??


----------



## Wezzor

What exactly does the Asus PT1T BIOS do?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Wezzor*
> 
> What exactly does the Asus PT1T bios do?


It has vdroop unlocked , and it tricked my X79 Gigabyte board into identifying the Asus-flashed Gigabyte card I was using at the time .

Was looking for some text for you to read but I can't find it ATM ....


----------



## kulomen

Just Valley for the test and MSI Afterburner for the OSD showing GPU usage, FPS and temp. Temp goes up to 55°C.
Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> From what im seeing on the valley single card scores 75fps is what a FX 9590 with a 290x has scored there ....
> 
> Might have to add some vcore for example ... what monitoring program are you using ??


I compared to this one: http://www.vortez.net/articles_pages/amd_radeon_r9_290x_review,15.html




Freshly made test with temp at 47°C,
with memory clock 1400MHz, GPU clock 1100MHz and core voltage +25mV


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> It has vdroop unlocked, and it tricked my X79 Gigabyte board into identifying the Asus-flashed Gigabyte card I was using at the time.
> 
> I was looking for some text for you to read but I can't find it ATM ....


Home, I need your help.

what QDs do you recommend for these lines? tsm told me once and i forgot.

i am thinking QD3s. the RED ones are male.



i excluded the rest of the loop to make it simpler. Thanks.

edit: i am planning on getting them here . . .

http://www.performance-pcs.com/


----------



## Arizonian

Quote:


> Originally Posted by *Serandur*
> 
> Got my Vapor-X, some pics:
> Sorry about the quality, they're not the best but:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's a really nice card, though it does have a few scratches/manufacturing defects on it for some reason and I was disappointed to see Elpida GDDR5 in it (expected Hynix), but it's fast, quiet, and looks perfect in my PC.


Sweet. Love pics, I'm a computer voyeur.









Congrats - added

Quote:


> Originally Posted by *kulomen*
> 
> http://www.techpowerup.com/gpuz/details.php?id=a9za my R9 working on an EKWB water block
> 
> 
> 
> 
> 
> 
> 
> 
> 
> sapphire r9 290x vapor-x 8gb
> 
> and I have some issues with it. anyone feel good enough to help?


Congrats - added









Glad HOMECINEMA-PC was able to help. Welcome to OCN too.









Side note:
Got my new rig completed this weekend, and did it with the flu. I always have a lot of fun putting a rig together, and this one was not easy; having new parts at your door and just lying in bed with a fever wasn't going to fly.









Again - anyone with new 290 / 290X please post submission and I'll be glad to add you to roster.


----------



## starjammer

Quote:


> Originally Posted by *rdr09*
> 
> Home, I need your help.
> 
> what QDs do you recommend for these lines? tsm told me once and i forgot.
> 
> i am thinking QD3s. the RED ones are male.


The Koolance QDCs have these Inner Diameter measurements:

QD2: ID 6mm (1/4in)
QD3: ID 10mm (3/8in) to 13mm (1/2in)
QD4: ID 13mm (1/2in)

As you can see, the QD3 falls into most tubing sizes, so yes you should be good with that.


----------



## rdr09

Quote:


> Originally Posted by *starjammer*
> 
> The Koolance QDCs have these Inner Diameter measurements:
> 
> QD2: ID 6mm (1/4in)
> QD3: ID 10mm (3/8in) to 13mm (1/2in)
> QD4: ID 13mm (1/2in)
> 
> As you can see, the QD3 falls into most tubing sizes, so yes you should be good with that.


Thanks. i am thinking this will connect directly to the rad instead of the male . . .

http://www.performance-pcs.com/fittings-connectors/koolance-qd3-female-quick-disconnect-no-spill-coupling-male-threaded-g-1-4-bspp-black.html

and this to the tube . . .

http://www.performance-pcs.com/koolance-qd3-quick-disconnect-no-spill-coupling-male-compression-13-x-19mm-1-2-x-3-4in-black-21218.html#Specifications

will a 7/16 tube work? they're mixing english and metric measures.


----------



## GOLDDUBBY

The English use the metric system, as does the rest of the world except Liberia and North America.


----------



## rdr09

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> The English use the metric system, as does the rest of the world except Liberia and North America.


yah. guess it depends where the item was made.


----------



## GOLDDUBBY

Seven sixteenths of an inch is ~11 mm

(No, that tube seems a bit too narrow for that fitting)


----------



## starjammer

Quote:


> Originally Posted by *rdr09*
> 
> Thanks. i am thinking this will connect directly to the rad instead of the male . . .
> 
> http://www.performance-pcs.com/fittings-connectors/koolance-qd3-female-quick-disconnect-no-spill-coupling-male-threaded-g-1-4-bspp-black.html
> 
> and this to the tube . . .
> 
> http://www.performance-pcs.com/koolance-qd3-quick-disconnect-no-spill-coupling-male-compression-13-x-19mm-1-2-x-3-4in-black-21218.html#Specifications
> 
> will a 7/16 tube work? they're mixing english and metric measures.


7/16 in is 11.11mm, so you should be good. I am not sure though if Koolance has a fitting that will fit exactly that. Maybe get a fitting with 1/2 ID for a snug fit.

May I suggest though that you use the male part in the rad and the female part in the tube? The female part has the quick disconnect switch, so if you use it on the rad, you might have difficulty taking it off. Having it on the tube is more natural.

So something like this pair:

http://www.performance-pcs.com/xspc-g1-4-male-to-male-rotary-fitting-black-chrome-finish-18881.html
http://www.performance-pcs.com/koolance-qd3-female-quick-disconnect-no-spill-coupling-compression-for-13mm-x-16mm-1-2in-x-5-8in-black.html
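Since the thread mixes fractional-inch and metric tubing sizes, here's a quick sanity check of the conversion arithmetic. The helper function name is just for illustration (not from any Koolance tool); the only fact used is that 1 in = 25.4 mm exactly:

```python
# Convert fractional-inch tubing sizes to millimetres to double-check
# the numbers quoted above (1 in = 25.4 mm exactly).
MM_PER_INCH = 25.4

def inches_to_mm(inches: float) -> float:
    """Return the size in millimetres for a size given in inches."""
    return inches * MM_PER_INCH

# 7/16 in tubing against the two QD3 compression sizes mentioned (3/8 and 1/2 in)
for label, frac in [("7/16", 7 / 16), ("3/8", 3 / 8), ("1/2", 1 / 2)]:
    print(f"{label} in = {inches_to_mm(frac):.2f} mm")
```

So a 7/16 in (~11.11 mm) tube sits between the two QD3 sizes, which is why it has to stretch onto a 1/2 in barb.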


----------



## rdr09

Quote:


> Originally Posted by *starjammer*
> 
> 7/16 in is 11.11mm, so you should be good. I am not sure though if Koolance has a fitting that will fit exactly that. Maybe get a fitting with 1/2 ID for a snug fit.
> 
> May I suggest though that you use the male part in the rad and the female part in the tube? The female part has the quick disconnect switch, so if you use it on the rad, you might have difficulty taking it off. Having it on the tube is more natural.
> 
> So something like this pair:
> 
> http://www.performance-pcs.com/xspc-g1-4-male-to-male-rotary-fitting-black-chrome-finish-18881.html
> http://www.performance-pcs.com/koolance-qd3-female-quick-disconnect-no-spill-coupling-compression-for-13mm-x-16mm-1-2in-x-5-8in-black.html


you are right. that male is currently oos. i'll wait. thanks again.


----------



## GOLDDUBBY

Fitting an 11 mm tube to a 13 mm fitting: hot water, then stretch it on with scissors or pliers. It works, but I'm not sure how much the stress cuts the life of the hose material.

The compression barb usually adds 1 or 2 mm as well, so even a correctly sized hose gets stretched by the barb.


----------



## LandonAaron

What do you guys think of the PowerColor TurboDuo AXR9 290? I know the Tri-X is the best, but is this a pretty decent cooler/model? I am trying to go CrossFire, and my current card is water cooled with a universal GPU block. I was thinking of buying two full-cover blocks, but I just can't bring myself to spend the cash. So I am thinking I will keep water cooling my top card with the universal GPU block and get an open-air cooler design for the bottom one. Still, I want a card with a reference board, just in case I change my mind and decide to put a full-cover block on there.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> Home's need your help.
> 
> what QDs do you recommend for these lines? tsm told me once and i forgot.
> 
> i am think QD3s. the RED ones are male.
> 
> 
> 
> i excluded the rest of the loop to make it simpler. Thanks.
> 
> edit: i am planning on getting them here . . .
> 
> http://www.performance-pcs.com/


QD3's dude .



I'm certain screw-in QD4's require a 3/8 thread, not 1/4, so they are only good for rads or blocks that use a 3/8 thread, or as a compression joiner

I'd also budget for those on the card blocks as well, for super easy install/removal with no need to drain the loop



Also, I tried one pair of black QD3's and they got a tad rusty when the paint came off; they're basically stuffed (can't get a refund because Frozen PC is borked).

No problem with the chrome ones though









Male screw-ins on the rads and cards, and females on the CPU block, with compression QD3's to suit your arrangement


----------



## AmcieK

http://i.imgur.com/NZhvxnG.jpg
http://i.imgur.com/bIBW389.jpg

Any ideas? Not overclocked, and I didn't change drivers since last time ... settings medium ....


----------



## taem

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Also, I tried one pair of black QD3's and they got a tad rusty when the paint came off; they're basically stuffed (can't get a refund because Frozen PC is borked).
> 
> No problem with the chrome ones though


I just read this thread http://www.overclock.net/t/1404275/black-koolance-quick-disconnect-owners/0_100 right after receiving two pairs of black QD3s. That's just great.

Where are you finding chrome plated qd3s though? All the ones I'm looking at on P-PCs and Koolance web site say nickel plated. I'm only looking for g1/4 threaded qd3s though, is chrome only for compression versions?

Also thinking I should just avoid Koolance altogether, no matter how nice qd3s are, and just get Swiftech lok seals or something.


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> QD3's dude .
> 
> 
> 
> I'm certain screw-in QD4's require a 3/8 thread, not 1/4, so they are only good for rads or blocks that use a 3/8 thread, or as a compression joiner
> 
> I'd also budget for those on the card blocks as well, for super easy install/removal with no need to drain the loop
> 
> 
> 
> Also, I tried one pair of black QD3's and they got a tad rusty when the paint came off; they're basically stuffed (can't get a refund because Frozen PC is borked).
> 
> No problem with the chrome ones though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Male screw-ins on the rads and cards, and females on the CPU block, with compression QD3's to suit your arrangement


i'll match all based on the QD3s. i'll go with the females for the rads and add a couple more males for an extra line. i'll use it to complete the loop when the gpus are out. getting ready for the 300 series.

Quote:


> Originally Posted by *taem*
> 
> I just read this thread http://www.overclock.net/t/1404275/black-koolance-quick-disconnect-owners/0_100 right after receiving two pairs of black QD3s. That's just great.
> 
> Where are you finding chrome plated qd3s though? All the ones I'm looking at on P-PCs and Koolance web site say nickel plated. I'm only looking for g1/4 threaded qd3s though, is chrome only for compression versions?
> 
> Also thinking I should just avoid Koolance altogether, no matter how nice qd3s are, and just get Swiftech lok seals or something.


may have to go with the chrome instead of the blacks. good info. thanks.


----------



## devolved

Quote:


> Originally Posted by *LandonAaron*
> 
> What do you guys think of the PowerColor TurboDuo AXR9 290? I know the Tri-X is the best, but is this a pretty decent cooler/model? I am trying to go CrossFire, and my current card is water cooled with a universal GPU block. I was thinking of buying two full-cover blocks, but I just can't bring myself to spend the cash. So I am thinking I will keep water cooling my top card with the universal GPU block and get an open-air cooler design for the bottom one. Still, I want a card with a reference board, just in case I change my mind and decide to put a full-cover block on there.


Yeah, I have two of those running in a Cooler Master Storm Trooper case with good fans on the front and back giving good airflow, and it's still too hot for CrossFire in the summer.
I'm going to put Kraken G10's on them when I have time, before it starts getting hot again.
That PowerColor cooler isn't bad though.


----------



## azcrazy

Would a bad driver make the card artifact in the mobo BIOS?


----------



## BradleyW

Quote:


> Originally Posted by *azcrazy*
> 
> Would a bad driver make the card artifact in the mobo BIOS?


The BIOS is independent from the AMD GPU driver software I believe.


----------



## azcrazy

Quote:


> Originally Posted by *BradleyW*
> 
> The BIOS is independent from the AMD GPU driver software I believe.


That's what i thought but wasn't sure, i guess going red is gonna hurt me a little


----------



## BradleyW

Quote:


> Originally Posted by *azcrazy*
> 
> That's what i thought but wasn't sure, i guess going red is gonna hurt me a little


Tried updating motherboard BIOS?


----------



## azcrazy

Well, the card is artifacting in the BIOS and in Windows, and every time I try to install the driver I get a black screen. I have cleaned the drivers like 5 times (the way suggested on the 1st page), still no dice.

I'm doing Windows Update hoping that it will fix the problem, since it's a fresh install of Windows!


----------



## BradleyW

Quote:


> Originally Posted by *azcrazy*
> 
> Well, the card is artifacting in the BIOS and in Windows, and every time I try to install the driver I get a black screen. I have cleaned the drivers like 5 times (the way suggested on the 1st page), still no dice.
> 
> I'm doing Windows Update hoping that it will fix the problem, since it's a fresh install of Windows!


Faulty card.


----------



## tsm106

Artifacts in the BIOS, barring a loose plug, are a sure sign of a defective or damaged card.


----------



## azcrazy

Quote:


> Originally Posted by *BradleyW*
> 
> Faulty card.


Quote:


> Originally Posted by *tsm106*
> 
> Artifacts in the BIOS, barring a loose plug, are a sure sign of a defective or damaged card.


But , but , but i just got it









Time to go back to my sorry arse GTX 580


----------



## tsm106

Get your money back?


----------



## azcrazy

Quote:


> Originally Posted by *tsm106*
> 
> Get your money back?


hope so!


----------



## MU4L

Just picked up a Lightning edition for $200. I'm stoked!


----------



## Arizonian

Quote:


> Originally Posted by *MU4L*
> 
> Just picked up a Lightning edition for $200. I'm stoked!


That's a great deal.









Post a pic with OCN name or GPU-Z validation I'll add you to the roster. Welcome to OCN too.


----------



## tsm106

Someone call the cops, you got away with theft!


----------



## MU4L

Quote:


> Originally Posted by *Arizonian*
> 
> That's a great deal.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Post a pic with OCN name or GPU-Z validation I'll add you to the roster. Welcome to OCN too.


Thanks!

I haven't built my computer yet, unfortunately, so here's a pic of it with my OCN name on a piece of paper:



*Just kidding.*

Here's the real pic:


----------



## MU4L

Quote:


> Originally Posted by *tsm106*
> 
> Someone call the cops, you got away with theft!


Haha


----------



## DzillaXx

Ok.

Now, I have a 290, but AB only lets me take my voltage to +100 mV, with which I can only take my card to 1180.

Trixx will allow me to go up to +200 mV, which lets me push my card into the 1230-1240 MHz area. 1250 would be possible if I could push past +200 mV, but that's not really what I'm looking for (though it would be nice).

What I want is +200 mV in Afterburner. I tried setting an offset, but it will not take. I'm using AB 4.1.0 with the Omega drivers. So am I doing something wrong?

Does EVGA Precision allow for +200 mV?

I would love to get some +200 mV action, but I dislike Trixx.

Temps are not a problem. Even with my voltage maxed @ 1180 MHz, the card barely goes over 50°C. Sadly there is no BIOS editing tool for Hawaii.


----------



## mirzet1976

What's your VRM temperature?
I have a reference Gigabyte R9 290, benching clocks 1290/1600 MHz, VDDC offset +200 mV, power limit +50 in Trixx, cooled with an EK-FC R9 290X waterblock, Elpida RAM.

1290/1600

1290/1600

1300/1625


----------



## DzillaXx

Pretty low.

I couldn't tell you my VRM temp after long usage.

But after 10-15min of Shadow of Mordor gameplay, it didn't break 50c.

It has a full sized block, that has good VRM cooling.

Temps are not going to be a problem.

You have a nice card though. Mine isn't quite as lucky.


----------



## gordesky1

Has anyone tried Oblivion or any other older games on these cards? I went to play it again today after years of not doing so, and I was expecting super smooth fps even with mods, but I'm getting 30s and sometimes 20s in some areas...

I notice the clocks in MSI Afterburner going down to 600-800 and GPU load between 1% and a max of 40%... I have my Lightning clocked to 1200/1650, and the only games that will use that are newer, more demanding games...

I have tried everything in MSI Afterburner to stop this clock switching down... Here are my Afterburner settings.

I posted about this in the Lightning club but just got told to use the settings I already have in MSI, which still doesn't fix it...

I also tried the mode without PowerPlay support, and while the clocks stay maxed I can still feel the throttling...

The other weird thing is that if I look up at the sky in Oblivion it goes to max clocks, but if I look down it drops to 700-800 MHz when there's more stuff on the screen...


----------



## mirzet1976

Quote:


> Originally Posted by *DzillaXx*
> 
> Pretty low.
> 
> I couldn't tell you my VRM temp after long usage.
> 
> But after 10-15min of Shadow of Mordor gameplay, it didn't break 50c.
> 
> It has a full sized block, that has good VRM cooling.
> 
> Temps are not going to be a problem.
> 
> You have a nice card though. Mine isn't quite as lucky.


If you like MSI Afterburner, try this:

First make another shortcut to MSI Afterburner on the desktop and put this in Properties → Target: "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" /wi6,30,8d,20
Save and run it, then run the first MSI Afterburner shortcut and you will get +200 mV.


----------



## DzillaXx

Quote:


> Originally Posted by *mirzet1976*
> 
> If you like MSI Afterburner, try this:
> 
> First make another shortcut to MSI Afterburner on the desktop and put this in Properties → Target: "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" /wi6,30,8d,20
> Save and run it, then run the first MSI Afterburner shortcut and you will get +200 mV.


I'll try it again. But have done it a bunch of times already. Just doesn't stick.

Maybe I just need to uninstall AB and reinstall it, as I did upgrade an old version to that one.

Because everything else is the same as your AB window. Same Driver. So IDK... Trixx works fine, so I know the card can push higher volts.


----------



## kizwan

Quote:


> Originally Posted by *DzillaXx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mirzet1976*
> 
> If you like MSI Afterburner, try this:
> 
> First make another shortcut to MSI Afterburner on the desktop and put this in Properties → Target: "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" /wi6,30,8d,20
> Save and run it, then run the first MSI Afterburner shortcut and you will get +200 mV.
> 
> 
> 
> 
> 
> I'll try it again. But have done it a bunch of times already. Just doesn't stick.
> 
> Maybe I just need to uninstall AB and reinstall it, as I did upgrade an old version to that one.
> 
> Because everything else is the same as your AB window. Same Driver. So IDK... Trixx works fine, so I know the card can push higher volts.

When you did the above trick, did you touch the voltage slider?


----------



## DzillaXx

Quote:


> Originally Posted by *DzillaXx*
> 
> I'll try it again. But have done it a bunch of times already. Just doesn't stick.
> 
> Maybe I just need to uninstall AB and reinstall it, as I did upgrade an old version to that one.
> 
> Because everything else is the same as your AB window. Same Driver. So IDK... Trixx works fine, so I know the card can push higher volts.


Ok Update.

After uninstalling, and reinstalling. It now works.









Also, now that I have Aux control, 1250 works fine @ +200 mV

So one more question: how much Aux should I give my card? Memory also seems fine @ 1600 (Hynix), might just max it out. Right now my Aux is +19; anyone know if that is in the ballpark of where it should be?

Quote:


> Originally Posted by *kizwan*
> 
> When you did the above trick, did you touch the voltage slider?


I already had my overclock set. It wouldn't move to 200. Now it does, but if I mess with the slider now it will go back down to 100. But using the shortcut fixes it.


----------



## DzillaXx

Quote:


> Originally Posted by *DzillaXx*
> 
> Ok Update.
> 
> After uninstalling, and reinstalling. It now works.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, now that I have Aux control, 1250 works fine @ +200 mV
> 
> So one more question: how much Aux should I give my card? Memory also seems fine @ 1600 (Hynix), might just max it out. Right now my Aux is +19; anyone know if that is in the ballpark of where it should be?
> I already had my overclock set. It wouldn't move to 200. Now it does, but if I mess with the slider now it will go back down to 100. But using the shortcut fixes it.


is it going to drop back to +100 mV after every reboot?


----------



## tsm106

Yes. It's a manual command that you have to issue. Since it's a per-use command, it's better suited for benching than 24/7 use. Though I suppose you could create a Startup shortcut to issue the command on each boot.
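The startup idea above could be sketched as a small Windows batch file dropped into the Startup folder (`shell:startup`). The Afterburner path and the `/wi6,30,8d,20` switch come from mirzet1976's post earlier in the thread; the file name and the double launch are assumptions, so treat this as a sketch rather than a tested recipe:

```
@echo off
rem ab-unlock.bat (hypothetical name) - place in the shell:startup folder.
rem The first launch issues the voltage-unlock switch from the earlier post;
rem the second starts Afterburner normally with the +200 mV range available.
start "" "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" /wi6,30,8d,20
start "" "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe"
```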


----------



## Widde

Isn't it time for another 3D fanboy overclocking competition soon?


----------



## mfknjadagr8

Add me please


----------



## mfknjadagr8

Ok, so after installing both R9 290s (reference) I got my first black screen within 10 minutes of gaming, yay







Anyhow, both have Elpida memory and idle around 50°C with the stock crappy cooler set to 50% fan speed in Afterburner. I've got two waterblocks on order; hopefully they will help tame them. After running one loop of Heaven at 1920x1080 on the Ultra HD preset, temps hit 79°C with the fan set to 60%, which was pretty loud. This was the result; seems pretty low, but I've never run Valley before so I don't have a baseline. This is using both of the new Omega drivers, GPU and chipset.


----------



## Arizonian

Quote:


> Originally Posted by *MU4L*
> 
> Thanks!
> 
> I haven't built my computer yet, unfortunately, so here's a pic of it with my OCN name on a piece of paper:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> *Just kidding.*
> 
> Here's the real pic:
> 
> 
> Spoiler: Warning: Spoiler!


I've been getting more lenient on submissions as of late and would have accepted the first one.









Congrats - added








Quote:


> Originally Posted by *mfknjadagr8*
> 
> Add me please
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









_I put 'em down as Sapphires. PM me and I'll make a correction if they're not._


----------



## HOMECINEMA-PC

If I had a dollar for every FX chip owner that said 'My score is really low' I'd have a 5960X by now


----------



## Spectre-

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> If I had a dollar for every FX chip owner that said 'My score is really low' I'd have a 5960X by now


+1

and i would own the car i really want


----------



## MrWhiteRX7

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> If I had a dollar for every FX chip owner that said 'My score is really low' I'd have a 5960X by now


----------



## kizwan

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Ok, so after installing both R9 290s (reference) I got my first black screen within 10 minutes of gaming, yay
> 
> 
> 
> 
> 
> 
> 
> Anyhow, both have Elpida memory and idle around 50°C with the stock crappy cooler set to 50% fan speed in Afterburner. I've got two waterblocks on order; hopefully they will help tame them. After running one loop of Heaven at 1920x1080 on the Ultra HD preset, temps hit 79°C with the fan set to 60%, which was pretty loud. This was the result
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Seems pretty low, but I've never run Valley before so I don't have a baseline. This is using both of the new Omega drivers, GPU and chipset.


You probably forgot to enable Crossfire.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Spectre-*
> 
> +1
> 
> and i would own the car i really want


Smexy


----------



## Spectre-

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Smexy


indeed


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Spectre-*
> 
> +1
> 
> and i would own the car i really want


Already have mine


----------



## jagdtigger

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Already have mine
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Nice







I only have these two:









Spoiler: Warning: Spoiler!


----------



## Spectre-

Quote:


> Originally Posted by *jagdtigger*
> 
> Nice
> 
> 
> 
> 
> 
> 
> 
> 
> I only have these two:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


jelly


----------



## Alexbo1101

I think you missed one, Arizonian









Added a second 290X to my pc.
Quote:


> Originally Posted by *Arizonian*
> 
> -snip-
> 
> Again - anyone with new 290 / 290X please post submission and I'll be glad to add you to roster.


Quote:


> Originally Posted by *Alexbo1101*
> 
> -snip-
> 
> First off, some proof for you @Arizonian:
> 
> 
> -snip-


----------



## mfknjadagr8

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> If I had a dollar for every FX chip owner that said 'My score is really low' I'd have a 5960X by now


I'm not comparing scores with Intel, I'm comparing FX owners' scores at similar clocks... I don't care if your Intel runs double the scores mine does; I'm just looking to get the scores and performance I should for what I have. If I had a nickel for every time I've read 'Intel is better than AMD' when the question had nothing to do with Intel, I'd be twice as rich


----------



## rdr09

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I'm not comparing scores with Intel, I'm comparing FX owners' scores at similar clocks... I don't care if your Intel runs double the scores mine does; I'm just looking to get the scores and performance I should for what I have. If I had a nickel for every time I've read 'Intel is better than AMD' when the question had nothing to do with Intel, I'd be twice as rich


intel or not . . . your score shows only one card is working. go over post #1 (i think) and there are instructions on how to set up crossfire in valley . . .

http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0

labeled: AMD tweaks. But disable and re-enable crossfire prior to running the bench as well.


----------



## mfknjadagr8

Quote:


> Originally Posted by *rdr09*
> 
> intel or not . . . your score shows only one card is working. go over post #1 (i think) and there are instructions on how to set up crossfire in valley . . .
> 
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0


Thanks, I will try it when I get home; I must have overlooked that the first time. I did find that both of my cards aren't unlockable, and I think there is an issue with my driver install, or perhaps a bug, because I got the error 'package install failed' but the report showed everything installed. However, I see no CrossFire profiles in Control Center. I know they are there, because trying to create a new one says it would overwrite a preset...


----------



## stren

BTW if any 290 owners want waterblocks - I have seven for sale (and some 290s too) in my fs thread.


----------



## supermiguel

Can the modded Asus PT1 BIOS be flashed on any brand of 290X? Is that still the best BIOS for a water-cooled 290X?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I'm not comparing scores with Intel, I'm comparing FX owners' scores at similar clocks... I don't care if your Intel runs double the scores mine does; I'm just looking to get the scores and performance I should for what I have. If I had a nickel for every time I've read 'Intel is better than AMD' when the question had nothing to do with Intel, I'd be twice as rich


I knew what you were asking; I always hear the same thing. It's why I don't run AM3+ any more.
If you want higher scores on benchmarks, it's the physics side of things that needs addressing: tighter RAM timings, higher clock speeds. So get over yourself









You should re-install both cards again; this has helped Windows recognize cards it didn't previously for me in the past








Quote:


> Originally Posted by *rdr09*
> 
> intel or not . . . your score shows only one card is working. go over post #1 (i think) and there are instructions on how to set up crossfire in valley . . .
> 
> http://www.overclock.net/t/1360884/official-top-30-unigine-valley-benchmark-1-0
> 
> labeled: AMD tweaks. But disable and re-enable crossfire prior to running the bench as well.
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> You probably forgot to enable Crossfire.

Crossfire enabling helps too









Quote:


> Originally Posted by *supermiguel*
> 
> Can the modded Asus PT1 BIOS be flashed on any brand of 290X? Is that still the best BIOS for a water-cooled 290X?


Yes it can, or a 290. If you're really lucky that BIOS *could unlock* your 290 into a 290X.
My belief is that the PT1T BIOS is the better w/c BIOS. For my setup it is, anyway


----------



## supermiguel

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I knew what you were asking; I always hear the same thing. It's why I don't run AM3+ any more.
> If you want higher scores on benchmarks, it's the physics side of things that needs addressing: tighter RAM timings, higher clock speeds. So get over yourself
> 
> 
> 
> 
> 
> 
> 
> 
> Crossfire enabling helps too
> 
> 
> 
> 
> 
> 
> 
> 
> Yes it can, or a 290. If you're really lucky that BIOS *could unlock* your 290 into a 290X.
> My belief is that the PT1T BIOS is the better w/c BIOS. For my setup it is, anyway


Both of mine are 290Xs, I'm just trying to overclock them... I'm using MSI AB at +200 mV and temps are still good, so if I want more can I do it with MSI AB? It seems like most of you guys mod the BIOS


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *supermiguel*
> 
> Both of mine are 290Xs, I'm just trying to overclock them... I'm using MSI AB at +200 mV and temps are still good, so if I want more can I do it with MSI AB? It seems like most of you guys mod the BIOS


Shamino's (I think that's how you spell it) *PT1T has vdroop*; PT1 BIOS and Asus GPU Tweak. I can run past 1.55 vcore with that.


----------



## Spectre-

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Shamino's (I think that's how you spell it) *PT1T has vdroop*; PT1 BIOS and Asus GPU Tweak. I can run past 1.55 vcore with that.


What's the max volts you have hit?

i got my 290X to 1.55 (1.48 with droop), did 1250/1600 (avg.) on most benches


----------



## mfknjadagr8

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I knew what you were asking; I always hear the same thing. It's why I don't run AM3+ any more.
> If you want higher scores on benchmarks, it's the physics side of things that needs addressing: tighter RAM timings, higher clock speeds. So get over yourself
> 
> 
> 
> 
> 
> 
> 
> 
> You should re-install both cards again; this has helped Windows recognize cards it didn't previously for me in the past
> 
> 
> 
> 
> 
> 
> 
> 
> Crossfire enabling helps too :happysmile:


Yeah, I'm not conceited at all, but thanks for that... CrossFire is enabled. I'm not necessarily looking for higher benchmark scores, simply looking to ensure I'm getting the performance I should at the clocks I'm running, on both sides. I'm going to try a reseat and perhaps a DDU uninstall and reinstall, in safe mode this time. There are other things to work on too; I won't give up. I'll have invested nearly 1k once the watercooling parts get here, so I believe I'll get it sorted


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Yeah, I'm not conceited at all, but thanks for that... CrossFire is enabled. I'm not necessarily looking for higher benchmark scores, simply looking to ensure I'm getting the performance I should at the clocks I'm running, on both sides. I'm going to try a reseat and perhaps a DDU uninstall and reinstall, in safe mode this time. There are other things to work on too; I won't give up. I'll have invested nearly 1k once the watercooling parts get here, so I believe I'll get it sorted


Get back to us for more brainstorming if no luck with re-seat








Quote:


> Originally Posted by *Spectre-*
> 
> What's the max volts you have hit?
> 
> i got my 290X to 1.55 (1.48 with droop), did 1250/1600 (avg.) on most benches


Well, you know my highest 290 clock was 1360 MHz, so it had to be somewhere past 1.52 vcore with droop.
But I can't honestly say; it's been a while and I've lost a crapload of screenies with that info


----------



## mfknjadagr8

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Get back to us for more brainstorming if no luck with re-seat


Well, disabling ULPS stopped the black screening, at least. I'm pretty sure it was black screening within a couple of minutes before, and now I played two hours of Metro Last Light on max settings at 1080p... no black screens, but not very good performance. Going to test each card separately and work with drivers tomorrow. The reseat didn't help, but it was worth a shot. I did notice steady clocks, and temps held pretty steady: 94 max and mostly around 88, with the fan at 60 percent.


----------



## pshootr

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Well, disabling ULPS stopped the black screening, at least. I'm pretty sure it was black screening within a couple of minutes before, and now I played two hours of Metro Last Light on max settings at 1080p... no black screens, but not very good performance. Going to test each card separately and work with drivers tomorrow. The reseat didn't help, but it was worth a shot. I did notice steady clocks, and temps held pretty steady: 94 max and mostly around 88, with the fan at 60 percent.


Glad to hear that disabling ULPS worked







You may want to run your fans higher than 60% until you get your blocks, 94C is pretty toasty. I know that AMD says 90C is ok, but It is still scary to me.
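Since disabling ULPS keeps coming up as the black-screen fix, for reference: it is usually toggled through the registry. A hedged sketch of the `.reg` fragment commonly circulated for this (the class-key index `0000`/`0001`/... varies per machine, so check which subkeys actually describe your Radeon before importing, and back up the registry first):

```reg
Windows Registry Editor Version 5.00

; Set EnableUlps to 0 under each display-class subkey that lists your Radeon.
; {4d36e968-e325-11ce-bfc1-08002be10318} is the standard Display adapters class;
; the \0000 index below is an example and may differ on your system.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Tools like MSI Afterburner and TriXX expose the same switch in their settings, which is the safer route if you'd rather not edit the registry by hand.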


----------



## MU4L

Quote:


> Originally Posted by *Arizonian*
> 
> I've been getting more lenient on submissions as of late and would have accepted the first one.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added


Haha, I wasted time taking the real picture!









Thanks for adding me!


----------



## ChronoBodi

Wait guys, does Crossfire work in windowed mode?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *ChronoBodi*
> 
> Wait guys, does Crossfire work in windowed mode?


No... unless it's a Mantle title I think.


----------



## ChronoBodi

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> No... unless it's a Mantle title I think.


Never? Not to incite some stupid fanboy wars, but Nvidia SLI works with windowed mode....

Not that it matters to me since I play fullscreen, but if AMD has it working in Mantle, then they will have it working in DX12 I guess.

What's the full list of Mantle games?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Well, disabling ULPS stopped the black screening, at least. I'm pretty sure it was black screening within a couple of minutes before, and now I played two hours of Metro Last Light on max settings at 1080p... no black screens, but not very good performance. Going to test each card separately and work with drivers tomorrow. The reseat didn't help, but it was worth a shot. I did notice steady clocks, and temps held pretty steady: 94 max and mostly around 88, with the fan at 60 percent.


Poor hardware



If so no wonder performance is down as you mentioned








or is that 94°F ??

Toasty VRMs

Don't be shy and crank that fan to 100%

Go water ASAP and have 50C shaved off load temps









* Just read pshootrs post *


----------



## Blameless

Black screen behavior has changed with recent driver releases.

I used to run the PT1 BIOS to get around them, as well as to eliminate throttling, but as of the Omega driver, or maybe slightly earlier, the PT1 BIOS had become a source of idle black screens rather than a fix. Essentially, it appears that the PT1 BIOS would, for lack of any low-power states, cause the GPU to overheat during sleep to the point where it would crash while asleep.

Reverting to a more standard BIOS (the original AMD review sample, in this case) seems to have fixed this, and I currently have zero issues resuming from protracted sleep periods with my reference 290 (Elpida memory) flashed to 290X, and a +50% power limit is all I really need to keep the card from throttling in any scenario the reference cooler can actually cool. I do need about 13 mV more for unconditional stability than I needed with the PT1 BIOS, however.


----------



## HOMECINEMA-PC

@megaman Wife's gone away ............. that's a bummer









Time to QUAD


----------



## Damianpl

Hey there!

My R9 290x:

Asus R9290X-4GD5
Cooling: Raijintek Morpheus with 2x 120mm Enermax fans

Valid. link http://www.techpowerup.com/gpuz/details.php?id=f5qsw





Please let me know if I'm missing anything!


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mfknjadagr8*
> 
> Well, disabling ULPS stopped the black screening, at least. I'm pretty sure it was black screening within a couple of minutes before, and now I played two hours of Metro Last Light on max settings at 1080p... no black screens, but not very good performance. Going to test each card separately and work with drivers tomorrow. The reseat didn't help, but it was worth a shot. I did notice steady clocks, and temps held pretty steady: 94 max and mostly around 88, with the fan at 60 percent.
> 
> 
> 
> Poor hardware
> 
> 
> 
> If so no wonder performance is down as you mentioned
> 
> 
> 
> 
> 
> 
> 
> 
> or is that 94°F ??
> 
> Toasty VRMs
> 
> Don't be shy and crank that fan to 100%
> 
> Go water ASAP and have 50C shaved off load temps
> 
> 
> 
> 
> 
> 
> 
> 
> 
> * Just read pshootrs post *

Hmmm... the CPU shouldn't affect the Valley score that badly, though. I doubt Crossfire is enabled, because the score is what you'd get with a single 290.


----------



## rdr09

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Well, disabling ULPS stopped the black screening, at least. I'm pretty sure it was black screening within a couple of minutes before, and now I played two hours of Metro Last Light on max settings at 1080p... no black screens, but not very good performance. Going to test each card separately and work with drivers tomorrow. The reseat didn't help, but it was worth a shot. I did notice steady clocks, and temps held pretty steady: 94 max and mostly around 88, with the fan at 60 percent.


94? here is what 94 will do to your card (one in the left where the secondary card was on air) . . .



right side with water.


----------



## mfknjadagr8

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Poor hardware
> 
> 
> 
> If so no wonder performance is down as you mentioned
> 
> 
> 
> 
> 
> 
> 
> 
> or is that 94°F ??
> 
> Toasty VRMs
> 
> Don't be shy and crank that fan to 100%
> 
> Go water ASAP and have 50C shaved off load temps
> 
> 
> 
> 
> 
> 
> 
> 
> 
> * Just read pshootrs post *


I have two Komodo blocks, an extra rad, and an MCP50X coming... the 94C was a spike; it ran around 88 most of the time. The VRMs according to HW64 showed less than 90 on all, and under 70 on VRM2... most of the time VRM1 was around 82 and VRM2 averaged 68. I normally don't care about noise, but at 100 percent those suckers scream... I probably will do so anyway... should only be 3 days or so


----------



## GOLDDUBBY

Can someone show me a frstrk extreme with 290x ?

:edit: Please =^_^=


----------



## mirzet1976

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> Can someone show me a frstrk extreme with 290x ?
> 
> :edit: Please =^_^=


here http://www.overclock.net/t/1443196/fire-strike-extreme-top-30


----------



## supermiguel

any reason to use the Modded Asus bios instead of MSIAB now that it allows you to go higher than 200mV?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *mirzet1976*
> 
> Can someone show me a frstrk extreme with 290x ?
> 
> :edit: Please =^_^=
> 
> here http://www.overclock.net/t/1443196/fire-strike-extreme-top-30


This will do ya. I could have gone a higher clock speed

http://www.3dmark.com/fs/2056001


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mirzet1976*
> 
> Quote:
> 
> 
> 
> Originally Posted by *GOLDDUBBY*
> 
> Can someone show me a frstrk extreme with 290x ?
> 
> :edit: Please =^_^=
> 
> 
> 
> here http://www.overclock.net/t/1443196/fire-strike-extreme-top-30
> 
> 
> This will do ya . I could of gone higher clock speed
> 
> http://www.3dmark.com/fs/2056001


----------



## ChronoBodi

So stock-volts-wise, with the power limit at +30, core 1100 and memory 1400 is fine.

Now I try core 1150 and memory 1500 with power limit +50, and it's not stable.

The option for adjusting core voltage is locked in Afterburner, but to be fair the top GPU is always at 80C load with a fan speed of 90%, while the bottom GPU is 10C cooler...

I guess you can't help it with two open-air GPUs in a full-tower chassis. Yes, maybe blowers would be better, but there was no blower 8GB option, nor were there any blowers besides the crappy reference AMD one.

Then again, 80C full load is a lot preferable to a 95C load from the reference AMD blower.


----------



## fasttracker440

Hey all, I have had a 290X for a while now and finally have started playing around with it. Currently running 2 cards; numbers 3 and 4 are waiting for blocks. They are Sapphire R9 290X Tri-X OC cards. I have them stable at 1210 core and 1550 ram with the stock BIOS. I have a feeling I could get a lot more out of them if I could push some more volts to them, but I am unsure what BIOS to use for these cards. Can I just flash the BIOSes stated at the start of this thread, or do I have to find modded ones for my specific brand? Also, after all that is sorted, is there a BIOS mod tool like for the 600 series NVIDIA cards, to just slam your stable OC into a BIOS and forgo the 3rd party program? This is probably the best bench I will get without artifacting.


----------



## supermiguel

What's the most graphics-intensive game (in terms of requirements) out there? Like, if I wanted to push my 290X, what game should I be trying?


----------



## Kriant

Quote:


> Originally Posted by *supermiguel*
> 
> What's the most graphics-intensive game (in terms of requirements) out there? Like, if I wanted to push my 290X, what game should I be trying?


Shadow of Mordor on max.
Metro Last Light on max
Crysis 3
Far Cry 4
Dying Light
Dragon Age: Inquisition

They will look at your vram, and they will consume it like butter. Also those games will load the card up to 100%.


----------



## Agent Smith1984

Quote:


> Originally Posted by *supermiguel*
> 
> What's the most graphics-intensive game (in terms of requirements) out there? Like, if I wanted to push my 290X, what game should I be trying?


There are quite a few....

Crysis 3 is always a good one.
Watch Dogs is good
Shadows of Mordor can give it a good workout
Metro Last Light is pretty graphic intensive too


----------



## Alexbo1101

Well, just tried giving the cards a spin in Valley; I guess this is an okay score since the cards are only clocked at 1040/1300 (coupled with a 3570K at 4500).



I'm not the most knowledgeable person with benchmarks, is this average or is it completely off the mark?


----------



## supermiguel

Quote:


> Originally Posted by *Kriant*
> 
> Shadow of Mordor on max.
> Metro Last Light on max
> Crysis 3
> Far Cry 4
> Dying Light
> Dragon Age: Inquisition
> 
> They will look at your vram, and they will consume it like butter. Also those games will load the card up to 100%.


Quote:


> Originally Posted by *Agent Smith1984*
> 
> There are quite a few....
> 
> Crysis 3 is always a good one.
> Watch Dogs is good
> Shadows of Mordor can give it a good workout
> Metro Last Light is pretty graphic intensive too


Ya also just got a new 1440p monitor and wanted to really test it... see what it can do with a good looking game


----------



## Devildog83

Just bought a 4670K; is that a good match for a 290 flashed to a 290X?


----------



## Dt_Freak1

http://www.techpowerup.com/gpuz/details.php?id=ep7e6
Gigabyte R9 290x Windforce
stock 3 fan gigabyte cooler


----------



## LandonAaron

When can we expect a new Catalyst Driver? The last release was three months ago and several big games have come out in that time. Specifically hoping for a driver for Evolve and Dying Light. Does anyone know if crossfire works with these games?


----------



## chiknnwatrmln

Considering we don't even have a FC4 CF patch it's gonna be a while...

AMD has been pretty disappointing in their driver support imo. Omega just straight up does not work on my PC so I'm still on 14.9.


----------



## pshootr

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Considering we don't even have a FC4 CF patch it's gonna be a while...
> 
> AMD has been pretty disappointing in their driver support imo. Omega just straight up does not work on my PC so I'm still on 14.9.


It's not Omega; it's your PC. You probably have old driver files left over, conflicting with some of the Omega files.

Even after using DDU, I ended up finding ATI/AMD stuff in the registry while doing a manual cleaning.


----------



## supermiguel

any good oc guide for the 290x?


----------



## pshootr

Quote:


> Originally Posted by *supermiguel*
> 
> any good oc guide for the 290x?


I'm really not sure, but it's pretty simple. Raise your core 25-50 MHz at a time until you start to see artifacts in games or Heaven Benchmark. Then raise the voltage 10-20 mV at a time until the artifacts go away. Rinse and repeat until your temps are at max, or until you can no longer run without artifacts. (When you get a rough idea of your OC you can go up/down on frequency/voltage in smaller increments for finer tuning.)

Same thing for memory, but find your desired core speed first, then play with memory.

Try to keep GPU core and VRMs below 90C. Even better would be 70ish core and 80ish VRMs.

You can use TriXX or MSI Afterburner to OC with. TriXX will be less intimidating because it has far fewer settings.
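That raise-then-stabilize procedure can be sketched as a small loop. This is purely illustrative, not an OC tool: `is_stable` stands in for your own artifact test (a game or Heaven run), and every number below is made up rather than a real limit for any card.

```python
def find_max_stable(is_stable, core=1000, volt_mv=0,
                    volt_max_mv=100, step_mhz=25, step_mv=10):
    """Raise the core one step, then raise voltage until the artifact
    test passes; stop when the voltage headroom runs out."""
    while True:
        trial = core + step_mhz
        v = volt_mv
        while not is_stable(trial, v):
            if v + step_mv > volt_max_mv:
                return core, volt_mv  # last known-stable pair
            v += step_mv
        core, volt_mv = trial, v

# Toy stability model: each MHz over 1000 needs ~0.5 mV of offset.
stable = lambda mhz, mv: mv >= (mhz - 1000) // 2
print(find_max_stable(stable))  # -> (1200, 100)
```

The finer-tuning pass pshootr mentions would just rerun the same loop around the returned point with smaller steps.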


----------



## supermiguel

Quote:


> Originally Posted by *pshootr*
> 
> I'm really not sure, but it's pretty simple. Raise your core 25-50 MHz at a time until you start to see artifacts in games or Heaven Benchmark. Then raise the voltage 10-20 mV at a time until the artifacts go away. Rinse and repeat until your temps are at max, or until you can no longer run without artifacts. (When you get a rough idea of your OC you can go up/down on frequency/voltage in smaller increments for finer tuning.)
> 
> Same thing for memory, but find your desired core speed first, then play with memory.
> 
> Try to keep GPU core and VRMs below 90C. Even better would be 70ish core and 80ish VRMs.
> 
> You can use TriXX or MSI Afterburner to OC with. TriXX will be less intimidating because it has far fewer settings.


So right now I have my core voltage in MSI Afterburner at +100, power limit at +50, core 1135, memory clock at 1250 (stock)... and temps at load are around 39C. When I try 1140 I get artifacts. Now, since my temp is low, how much more can I go up? Like, what is the red zone for these cards?


----------



## Red1776

Quote:


> Originally Posted by *supermiguel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Kriant*
> 
> Shadow of Mordor on max.
> Metro Last Light on max
> Crysis 3
> Far Cry 4
> Dying Light
> Dragon Age: Inquisition
> 
> They will look at your vram, and they will consume it like butter. Also those games will load the card up to 100%.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> There are quite a few....
> 
> Crysis 3 is always a good one.
> Watch Dogs is good
> Shadows of Mordor can give it a good workout
> Metro Last Light is pretty graphic intensive too
> 
> 
> Ya also just got a new 1440p monitor and wanted to really test it... see what it can do with a good looking game

Run the train scene in Crysis Warhead, from out of the tunnel with the helicopter attacks, all the way into the jungle and the alien attacks as the train stops

Metro LL



Metro 2033



Crysis 3



Witcher 2 (on *Uber* settings). The toughest to date


----------



## rdr09

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Considering we don't even have a FC4 CF patch it's gonna be a while...
> 
> AMD has been pretty disappointing in their driver support imo. Omega just straight up does not work on my PC so I'm still on 14.9.


Omega works on my two 290s; took like 8 mins to install. Others need DDU. I've never used it, though.


----------



## pshootr

Quote:


> Originally Posted by *supermiguel*
> 
> So right now I have my core voltage in MSI Afterburner at +100, power limit at +50, core 1135, memory clock at 1250 (stock)... and temps at load are around 39C. When I try 1140 I get artifacts. Now, since my temp is low, how much more can I go up? Like, what is the red zone for these cards?


I forgot to mention the power limit.
















If you are artifacting at 1140 core with +100 mV, then you likely need to lower your core frequency a little at a time until you stop artifacting. Or:

TriXX will let you go up to +200 mV. Afterburner will too if you edit the command line of your shortcut as described in this post.

Your load temp is wicked low; are you sure you're looking at the right temp? What kind of cooling do you have on your card?

Some decent clocking cards can hit 1200/1500 and higher with enough voltage/cooling. Someone with more experience than I have can tighten this up for me though; I am kind of a newb


----------



## supermiguel

Quote:


> Originally Posted by *pshootr*
> 
> I forgot to mention the power limit.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you are artifacting at 1140 core with +100 mV, then you likely need to lower your core frequency a little at a time until you stop artifacting. Or:
> 
> TriXX will let you go up to +200 mV. Afterburner will too if you edit the command line of your shortcut as described in this post.
> 
> Your load temp is wicked low; are you sure you're looking at the right temp? What kind of cooling do you have on your card?
> 
> Some decent clocking cards can hit 1200/1500 and higher with enough voltage/cooling. Someone with more experience than I have can tighten this up for me though; I am kind of a newb


A few Monsta rads... so what's the max voltage these cards can handle?


----------



## pshootr

Quote:


> Originally Posted by *supermiguel*
> 
> A few Monsta rads... so what's the max voltage these cards can handle?


I'm not sure how much voltage is too much. But with your cooling, I would think that +200mv should be fine.

Edit:

I am currently running my 290 at 1150/1500 with +75mv. It will do 1200/1600 with more voltage, but I find my current clocks to be a nice balance being that I am on air.


----------



## supermiguel

Yeah, even with +400, running at 1.344 V, it's still giving me issues at 1150.

Now it seems like it gives me more issues if CF is enabled; should I be disabling it to OC?


----------



## pshootr

Quote:


> Originally Posted by *supermiguel*
> 
> Yeah, even with +400, running at 1.344 V, it's still giving me issues at 1150.
> 
> Now it seems like it gives me more issues if CF is enabled; should I be disabling it to OC?


I have not run CF yet myself. Maybe you should try finding out the max OC of each of your cards. One may be a better clocker than the other, and therefore holding your OC back in CF.


----------



## Arizonian

Quote:


> Originally Posted by *Alexbo1101*
> 
> Think you missed one Arizonian
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Added a second 290X to my pc.


I did. Thank you for your patience.

Congrats - updated
















Quote:


> Originally Posted by *Dt_Freak1*
> 
> http://www.techpowerup.com/gpuz/details.php?id=ep7e6
> Gigabyte R9 290x Windforce
> stock 3 fan gigabyte cooler


Congrats - added


----------



## chiknnwatrmln

Quote:


> Originally Posted by *pshootr*
> 
> It's not Omega; it's your PC. You probably have old driver files left over, conflicting with some of the Omega files.
> 
> Even after using DDU, I ended up finding ATI/AMD stuff in the registry while doing a manual cleaning.


No leftover files, I tried a fresh Win7 install and it doesn't work for me.

I made a ticket with AMD but they kept asking trivial things like "is your card plugged in?" etc.

I'm not sure what it is about my PC and Omega but they don't play nicely together.


----------



## pshootr

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> No leftover files, I tried a fresh Win7 install and it doesn't work for me.
> 
> I made a ticket with AMD but they kept asking trivial things like "is your card plugged in?" etc.
> 
> I'm not sure what it is about my PC and Omega but they don't play nicely together.


Wow, that is strange. Are you using the chipset drivers from AMD, or just Windows drivers? (Is your card plugged in?) LOL









Jokes aside.. I'm not sure what to think. You must be frustrated (I would be). What exactly happens when you try and install Omega?


----------



## tsm106

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pshootr*
> 
> It's not Omega; it's your PC. You probably have old driver files left over, conflicting with some of the Omega files.
> 
> Even after using DDU, I ended up finding ATI/AMD stuff in the registry while doing a manual cleaning.
> 
> 
> 
> No leftover files, I tried a fresh Win7 install and it doesn't work for me.
> 
> I made a ticket with AMD but they kept asking trivial things like "is your card plugged in?" etc.
> 
> I'm not sure what it is about my PC and Omega but they don't play nicely together.

PM the rep in the drivers forum, Warsam... hopefully he can expedite your issue. There are rare cases where your specific setup falls outside the standard deviation, i.e. the range of normal, and the drivers just can't play nice with your rig.


----------



## iCrap

I put a third 290 on a riser, but it wouldn't go into Crossfire mode








yes, i know its ghetto.


----------



## pshootr

Quote:


> Originally Posted by *iCrap*
> 
> I put a third 290 on a riser, but it wouldn't go into Crossfire mode
> 
> 
> 
> 
> 
> 
> 
> 
> yes, i know its ghetto.


I think that anti-static bag may be conductive. I would use a cardboard box.


----------



## iCrap

Quote:


> Originally Posted by *pshootr*
> 
> I think that anti-static bag may be conductive. I would use a cardboard box.


Yeah, I swapped it out for cardboard after that pic.
Regardless, it wouldn't go into Crossfire, so I just took it out. It only showed up as a disabled adapter.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *pshootr*
> 
> Wow, that is strange. Are you using the chipset drivers from AMD, or just Windows drivers? (Is your card plugged in?) LOL
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Jokes aside.. I'm not sure what to think. You must be frustrated (I would be). What exactly happens when you try and install Omega?


I use chipset drivers from ASUS for my motherboard; I get the video drivers directly from the AMD site.

With Omega I get high CCC CPU usage (a constant 10-15%) and I can't change any settings. Every time I click a drop-down menu, it stays open for about a second, then closes itself and reverts to the previous setting. And when I used to run triple screens, Eyefinity would bug out and I couldn't re-enable it because of the menu bug. Super annoying. Good thing Omega doesn't have any huge difference from 14.9, I guess.
Quote:


> Originally Posted by *tsm106*
> 
> PM the rep in the drivers forum, Warsam... hopefully he can expedite your issue. There are rare cases when your specific setup falls outside the standard deviation, ie. the range of normal, and the drivers just can't place nice with your rig.


I'll give that a shot, thanks.


----------



## Mega Man

You can use a riser, but I don't think you can use a USB 3.0 converter


----------



## Chopper1591

Still happy with my first water-cooled GPU.
Haven't tried any extreme overclocking with voltage mods, though; stock voltage settings do the trick in games, I guess.

1100/1300 with +0
1200/1500 with +100
1275/1675 with +200

Only saw some weird stuff yesterday, playing Dying Light (I know the optimization is horrid) and Hitman: Absolution.

GPU usage seems to be all over the place, monitored with AB.
Usage changes from 100% to as low as 25ish%, in a zig-zag pattern.

Is that normal? FPS seems to be pretty solid.
The card also is not throttling, I guess; the core clock isn't changing more than 10-15 MHz.

CPU usage is also fluctuating, so I don't think it is a bottleneck. The FX-8320 is clocked at 4.8, FYI.


----------



## Dan848

Chopper,

Different games use the GPU and CPU differently, although MOST modern games are becoming more GPU dependent. For smooth game play it is a good idea to have a good anti-virus that auto-scans, and manually run it once in a while to make sure you have no virus, malware and other trash. I use Bitdefender, one of the top names, however, I am not trying to push it on anyone, this is not a post to sell any name brand.

As for your GPU and CPU fluctuating, that is normal for many games. As long as your GPU stays cool enough not to throttle you are fine. I tend to drop my core and memory clocks a little below maximum for long life, no use blowing up an expensive piece of hardware for 1 or 2 fps.

Also, backing down core and memory clocks often results in more stable game play. The same holds true for system RAM and the CPU if the CPU is overclocked. Overclocking the CPU is highly dependent upon a very good power supply, good motherboard, at least CAS 9 1866 RAM [for a high and stable OC], and the luck of the draw in getting a good CPU. Of course you will need a high quality power supply and good motherboard for overclocking the GPU for long life.

Keep your system clean of viruses and clean out the crap that Windows piles up, some of it hidden. I use CCleaner Professional [small fee for lifetime updates and full functions]. It gets rid of hidden files, including temporary, that Windows keeps even if you delete files including cookies, using Windows own software.

A note about CCleaner here, enter options, Settings, Secure file deletion, to fully scrub all left over files and copies of cookies [yes, Windows keeps your cookies even if you delete them using Windows software]. Also, CCleaner Professional is the only program I have used that is safe for cleaning Windows Registry.

Many people overlook cleaning out the inside of their computer, especially heatsinks and chips, with compressed air. Dust bunnies cause heat which in turn causes chips and the power supply to fail sooner than normal. Compressed air can be purchased from many places, newegg, Wal-Mart, and many other sources.

This has been a rather long post, however, your questions regarding irregularities can have many root problems.


----------



## HOMECINEMA-PC

Yep a wall of compressed air that post


----------



## Chopper1591

Quote:


> Originally Posted by *Dan848*
> 
> Chopper,
> 
> Different games use the GPU and CPU differently, although MOST modern games are becoming more GPU dependent. For smooth game play it is a good idea to have a good anti-virus that auto-scans, and manually run it once in a while to make sure you have no virus, malware and other trash. I use Bitdefender, one of the top names, however, I am not trying to push it on anyone, this is not a post to sell any name brand.
> 
> As for your GPU and CPU fluctuating, that is normal for many games. As long as your GPU stays cool enough not to throttle you are fine. I tend to drop my core and memory clocks a little below maximum for long life, no use blowing up an expensive piece of hardware for 1 or 2 fps.
> 
> Also, backing down core and memory clocks often results in more stable game play. The same holds true for system RAM and the CPU if the CPU is overclocked. Overclocking the CPU is highly dependent upon a very good power supply, good motherboard, at least CAS 9 1866 RAM [for a high and stable OC], and the luck of the draw in getting a good CPU. Of course you will need a high quality power supply and good motherboard for overclocking the GPU for long life.
> 
> Keep your system clean of viruses and clean out the crap that Windows piles up, some of it hidden. I use CCleaner Professional [small fee for lifetime updates and full functions]. It gets rid of hidden files, including temporary, that Windows keeps even if you delete files including cookies, using Windows own software.
> 
> A note about CCleaner here, enter options, Settings, Secure file deletion, to fully scrub all left over files and copies of cookies [yes, Windows keeps your cookies even if you delete them using Windows software]. Also, CCleaner Professional is the only program I have used that is safe for cleaning Windows Registry.
> 
> Many people overlook cleaning out the inside of their computer, especially heatsinks and chips, with compressed air. Dust bunnies cause heat which in turn causes chips and the power supply to fail sooner than normal. Compressed air can be purchased from many places, newegg, Wal-Mart, and many other sources.
> 
> This has been a rather long post, however, your questions regarding irregularities can have many root problems.


Wow, what a post.
80% of it looks to be off-topic for my post, though. No offense; you are just trying to help out, I can tell.

People here should get badges or something to show their experience level.
You clearly try to tell me basic stuff which I didn't ask for at all.

Nonetheless I will still try to answer as completely as possible.


I have Bitdefender, been using it for 3 years or something now. I keep switching between Bitdefender and Kaspersky. So yes, I know the importance of a virus scanner... although I have had my time where I didn't use one at all because it does slow your system. And Kaspersky in particular kept bugging me and blocked some games from working properly somehow.
Where did we begin to mention the GPU and/or CPU overclock? I wasn't saying I think either of my overclocks is unstable, right? My GPU stays around 41C core and 35ish on the VRMs with the 1275 overclock. Should be good for longevity, right?
Secondly, my CPU is clocked at 4.8 like I said, also with proper cooling, keeping it below 50, more around 45, when gaming. It passed 20 runs of IBT AVX on very high and a few hours of P95 Blend, so I call it stable "enough". I'm too lazy to test overnight.







, and I don't think it is causing the problems in the games.
You also mentioned the ram thing... if you did open my rig in my sig you'd have seen that I have a set of TridentX 2400 c9, which is currently running at ~2140 8-10-10-1T. Paired with a 2670 cpu-nb, so bandwidth should be plenty, right?
Last but not least: I really tend to stay away from anything that cleans my registry. I don't know how much experience you have with them, but I have had my share of problems from using registry cleaners. Multiple times I have had issues leading to eventually re-installing my OS. So thanks, but no thanks. I do clean my trash every now and then, though, but I don't touch the registry any more. If you look on the web you'll see a fair share of "advanced" users advising to avoid them too.
I must admit though, you have some talent in advertising.
Ever thought about going into sales?








I mean, really.

Thanks for taking time to give me the long reply though, but I have to say it wasn't really helping. Sorry.

* edit:
Oh yeah, you also posted about the psu.
Again, my sig states which one I have: a Corsair HX750 unit, which is 80+ Gold and proper.
To my knowledge that unit is good enough to power my system. Correct me if I'm wrong.


----------



## Devildog83

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yep a wall of compressed air that post


LOL







You're pretty punny man.


----------



## fishingfanatic

Just buy a cheap compressor/tank combo. I got one for $100 a few years back. We added up the cost of buying the cans, paid for itself in less than a yr.

I put on a small copper pipe tip, which I pinched, to blow out the dust bunnies... I use it regularly.

FF







.


----------



## Waleh

Ladies and gents, how is a single 290x performing on 1440p these days? Does it have enough power to max games? I want an OCN perspective on this


----------



## DeviousAddict

Quote:


> Originally Posted by *Waleh*
> 
> Ladies and gents, how is a single 290x performing on 1440p these days? Does it have enough power to max games? I want an OCN perspective on this


I'm currently running a 3440x1440 monitor. In Dying Light with everything on max except textures on medium (zero difference from high except a massive fps loss), I sit at around 50-60 fps with occasional dips to 35 in heavy zombie scenes in daylight.


----------



## Chopper1591

Quote:


> Originally Posted by *Waleh*
> 
> Ladies and gents, how is a single 290x performing on 1440p these days? Does it have enough power to max games? I want an OCN perspective on this


Really comes down to what you use it for.
Also, what is your goal? 60fps? Do you need AA?

Overall, I think yes.
Quote:


> Originally Posted by *DeviousAddict*
> 
> I'm currently running a 3440x1440 monitor. In dying light with everything on max except textures on medium (0 difference to high except massive fps loss), I sit at around 50-60fps occasional dips to 35 in heavy zombie scenes in daylight.


Ehm...
How?

I have very bad performance in that game.
Even with "only" 1080p.
It never goes near 60 fps, more like 40ish average.

That is with my 290(non x) @ 1275 core.


----------



## DeviousAddict

I've got the 8GB Vapor-X 290X. Dying Light uses around 4.5GB of the available VRAM.
Other than watercooling the card, it's at stock clocks.
With high textures on I see around 5GB VRAM usage and my fps drops to 25!
When I'm back on the PC at the weekend I'll post some screenshots.


----------



## LandonAaron

Quote:


> Originally Posted by *fishingfanatic*
> 
> Just buy a cheap compressor/tank combo. I got one for $100 a few years back. We added up the cost of buying the cans, paid for itself in less than a yr.
> 
> I put a small copper pipe tip which I pinched to blow out the dust bunnies,...Use it regularly
> 
> FF
> 
> 
> 
> 
> 
> 
> 
> .


Yeah, when I lived with my parents I would use my dad's air compressor/tank, and it worked about 1000x better than Duster for cleaning out the computer. I started shopping around for one of my own recently, and from what I read it seems you really need to spend at least $100 to get one with enough pressure and a large enough tank to work well for dusting.

Then I came across this: http://www.amazon.com/Metro-Vacuum-ED500-500-Watt-Electric/dp/B001J4ZOAW/ref=sr_1_1?ie=UTF8&qid=1425573425&sr=8-1&keywords=datavac

It's only $60, and if you watch the video from the first review posted there on Amazon it looks pretty powerful, probably even more powerful than a good air compressor.


----------



## LandonAaron

Quote:


> Originally Posted by *DeviousAddict*
> 
> I've got the 8gb VaporX 290x. Dying light uses around 4.5gb of the available vram.
> Other than watercooling the card, it's at stock clocks.
> With high textures on i see around 5gb vram usage amd my fps drops to 25!
> When I'm back on pc at the weekend I'll post some screen shots


What resolution and view distance are you running at? I am playing at 2560x1440 with a single 4GB R9 290X, and with high textures I stay at about 3.5GB of VRAM and the lowest my fps drops to is 50. But I have the view distance turned down to about 40% or so.

I can't figure this game out. The first time I played, my card was at a constant 100% utilization, my GPU core clock never dropped from the 1125MHz I have it overclocked to, and my fps was in the 70s and 80s. The next time I played, my GPU utilization kept fluctuating between 60-80%, my core clock fluctuated and rarely hit the 1125MHz I have it set to, and my fps was in the 50s and 60s.

I have tried using the Mod Manager to play in borderless fullscreen mode, and tried using RadeonPro to force vsync instead of using the in-game option, but neither seems to help.

I have my monitor overclocked to 96Hz, and since my fps is always below that, vsync on or off doesn't seem to have any effect; I am never running faster than the refresh rate anyway.

It's frustrating that my refresh rate is 96Hz and I am getting 50 or 60 fps, yet the card is only 80% utilized. It's like it doesn't realize it has more work to do.

Edit: It's not a CPU bottleneck either, as I have an i7-4790K overclocked to 4.6GHz, and none of the cores are ever fully utilized during play.
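For what it's worth, this is exactly the kind of thing a monitoring log can settle. Here's a rough Python sketch that flags "low fps while the GPU isn't pegged" moments; the CSV column names are hypothetical and would need adjusting to whatever your monitoring tool (Afterburner's logging feature, for instance) actually exports:

```python
# Sketch: spotting "low fps but GPU not fully loaded" samples in a
# hardware-monitor log. The CSV layout here is hypothetical; rename the
# columns to match what your tool really writes out.
import csv
import io

def underutilized_samples(log_text, fps_floor=60, usage_ceiling=90):
    """Return rows where fps is below fps_floor while GPU usage (%) is
    also below usage_ceiling: a hint the bottleneck is elsewhere
    (engine, a single CPU thread, vsync back-pressure)."""
    rows = csv.DictReader(io.StringIO(log_text))
    return [r for r in rows
            if float(r["fps"]) < fps_floor
            and float(r["gpu_usage"]) < usage_ceiling]

# Made-up sample log for illustration
log = """fps,gpu_usage
75,99
52,78
58,97
49,81
"""
flagged = underutilized_samples(log)  # the 52/78 and 49/81 rows
```

If a big share of your low-fps samples show the card well under 100%, the game engine (or one maxed-out CPU thread) is the more likely suspect than the GPU itself.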


----------



## LandonAaron

Quote:


> Originally Posted by *Waleh*
> 
> Ladies and gents, how is a single 290x performing on 1440p these days? Does it have enough power to max games? I want an OCN perspective on this


I just upgraded to 1440p about 2 weeks ago. In that time I have played Serious Sam 3, Hard Reset, Shadow Warrior, and Dying Light. The 290X seems to have enough power to max most of these games at 60 FPS with moderate AA: 2xMSAA, or post-processing AA like FXAA or SMAA. Though Hard Reset and Shadow Warrior seem to suffer from some pretty bad micro-stutter regardless of settings, resolution, FPS, etc. But I think that is just how those games are coded and not anything to do with the card.


----------



## YellowBlackGod

I simply love this card, which has been proven to be STILL the BEST out there today.


----------



## taem

Quote:


> Originally Posted by *Waleh*
> 
> Ladies and gents, how is a single 290x performing on 1440p these days? Does it have enough power to max games? I want an OCN perspective on this


I have a single 290 with a small oc (1130 core) outputting to a 2560x1440.

If you're looking for 60fps constant, like I do, for recent games you cannot max, no. But you can get close to max on titles like Tomb Raider, Bioshock Infinite, Hitman Absolution, Sleeping Dogs.

Titles like Metro LL and Crysis 3 you can't get close to max, you have to dial back quite a bit.

But I think you'd be quite pleased with a 290x and a 1440p.

Quote:


> Originally Posted by *DeviousAddict*
> 
> I'm currently running a 3440x1440 monitor. In dying light with everything on max except textures on medium (0 difference to high except massive fps loss), I sit at around 50-60fps occasional dips to 35 in heavy zombie scenes in daylight.


You using a 34UM95 by any chance? Seriously interested in that screen.


----------



## Kriant

1440p + max settings in current/new titles on one card? No, not with comfortable fps anyway. With 2-3 cards? Sure, up to 4k.


----------



## Agent Smith1984

Quote:


> Originally Posted by *taem*
> 
> I have a single 290 with a small oc (1130 core) outputting to a 2560x1440.
> 
> If you're looking for 60fps constant, like I do, for recent games you cannot max, no. But you can get close to max on titles like Tomb Raider, Bioshock Infinite, Hitman Absolution, Sleeping Dogs.
> 
> Titles like Metro LL and Crysis 3 you can't get close to max, you have to dial back quite a bit.


I agree with all of that....

Even with my 290 at 1150/1600, 1080P is the highest resolution for absolute max settings. Now if you go to a custom mix of high textures, 4xAA, some very high settings, etc.... that's when you can really dial in a nice looking image, and still get great framerates.

I don't have a 1440p monitor to play with personally, so I just max everything and run at 1080P....


----------



## Waleh

Ah ok, got it. So it looks like AAA games can't be completely maxed at 60 FPS, which I guess is expected. But then again, at 1440p I'm not sure you necessarily need to max everything anyway, as the image already looks amazing. I'm planning my next upgrade in the summer, but with the 3xx series and Skylake on the way, I'm thinking I should wait for the newest tech! The 290X does look like a very nice card though.


----------



## fishingfanatic

The good thing about the compressor/tank is that it can be used for more than just that. As I said, I pinched a copper line (3/16" tubing), so it works very well. Then on occasion I use it to pump up the tires on the vehicles...

It does look like a handy little thing though, doesn't it? If you get it, let me know how it works.

Oh, and once there's hardly anything left in the tank I point it at the dog and she goes crazy... lol









FF


----------



## taem

Quote:


> Originally Posted by *Waleh*
> 
> Ah ok, got it. So it looks like AAA games can't be completely maxed at 60 FPS which I guess is expected. But then again, at 1440p I'm not sure you necessarily need to max everything anyway as the image already looks amazing. I'm planning my next upgrade in the summer but with the 3xx series and skylake on the way, I'm thinking that I should wait for the newest tech! The 290x though does look like a very nice card


I haven't used anti-aliasing since I went to 1440p. And most settings don't carry that big a performance hit, I find; it's just a couple of settings you dial back, and you can max most sliders.

If you're still on a 6850 I'd say it's totally worth it to pick up a 290, they are so cheap these days, you can get a new one off Newegg for $250-270.

And is Skylake going to do anything for gaming? The big features they're touting are biometric logins and wireless charging.

Me, I'm debating whether to hold out for 3xx or get another 290. I did crossfire briefly, but I returned the second card for the $550 and I'm glad I did; I didn't really need it for 1440p gaming, and they're half the price now. I'm thinking I might skip 3xx at release because I hear it's going to be 4GB VRAM (though faster VRAM). If that's the case, 290 crossfire is plenty good enough to keep me going till the 8GB 3xxs release, and I'm hearing there is an architectural reason why breaking the 4GB barrier might be tough for 3xx.


----------



## chiknnwatrmln

I'm crossfiring a 290 and a 290X at 1440p. I agree that 4GB is kind of a limit for some games; in BF4, for example, I hit 4GB on slightly less than ultra settings. However, I have more than enough sheer GPU power to push almost every game maxed out at a constant 60 FPS.

I'm probably not going to upgrade until the series after the 3xx; I don't want to drop huge money on two more GPUs, blocks, and backplates any time soon. Hopefully I'll be out of school and working by then, and will have the cash for a whole new rig.

I see no reason to upgrade the CPU any time soon. My 3770k is still running strong, and unless you need an i7 you should be fine with your 4670k for a while. CPU performance has kinda stagnated. It doesn't help that AMD is not focusing on enthusiast-level CPUs either, so Intel can kinda just do whatever they want.


----------



## chickensloth

Just got a Club3D Royal Ace R9 290, and it rocks. Got the core to 1111 with a 50mV offset and the RAM to 1444, power limit set to +50%. She stays below 75C under an hour of Heaven, and the VRMs stay about the same temp. I can't claim experience with other cards, but I really like the way the Club3D looks in my HAF XB from the top down. Rocking good card! I managed to push it to 1170/1490, but the heat was too much for my comfort (still under 90C though).


----------



## taem

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I'm crossfiring a 290 and a 290x at 1440p. I agree that 4GB is kind of a limit for some games, in BF4 for example I hit 4GB on slightly less than ultra settings. However I have more than enough sheer GPU power to push almost every game maxed out at 60 FPS constant.
> 
> I'm probably not going to upgrade until the series after the 3xx, I don't want to drop huge money on two more GPU's, blocks, and backplates any time soon. Hopefully I'll be out of school and working by then, and will have the cash for a whole new rig..
> 
> I see no reason to upgrade CPU any time soon. My 3770k is still running strong, and unless you need an i7 you should be fine with your 4670k for a while. CPU performance has kinda stagnated. It doesn't help that AMD is not focusing on enthusiast level CPU's either, so Intel can kinda just do whatever they want.


Yeah, I totally agree with you. The 290 series is so good that even with a single 290 I'm not feeling any urge to upgrade. And if I'm going to upgrade a 290, which I can crossfire for cheap these days, I want to hold out for more VRAM. Even with a beefy processing boost on the 3xx, you'd think a 4GB 3xx is going to be limiting for 4K.

And CPU-side, I've never found my 4670k to be a limit in any way. I'm pretty sure I'll pass on Broadwell and Skylake on the desktop. Might get a new laptop with Skylake though.

Quote:


> Originally Posted by *chickensloth*
> 
> Just got a club3d royal ace r9 290, and it rocks. Got the core to 1111 with 50mv offset and ram to 1444. Power limit set to +50%. She stays below 75c under an hour of Heaven, and the vrms stay about the same temp. I can't vouch for having experience with other cards, but I really like way the club3d looks in my HAF XB from the top down. Rocking good card! I managed to push it to 1170/1490, but the heat was too much for my comfort (stil under 90c though)


That looks like the king of the 290x coolers, same thing as the pcs+ but BIGGER. Also uses the same ek block as the revised pcs+.


----------



## LandonAaron

I keep debating with myself about going crossfire, but I just can't bring myself to commit to a "solution" that is only going to work in some games, never knowing whether they are going to implement a profile for the game you're interested in or not.


----------



## Agonist

Quote:


> Originally Posted by *LandonAaron*
> 
> I keep debating with myself about going crossfire, but I just can't bring myself to commit to "solution" that is only going to work on some games, and never knowing if they are going to implement a profile for the game your interested in or not.


That's where I am at right now. In 2013 I had HD 5770 crossfire. Never had an issue. In 2014 I had HD 7850 crossfire and HD 7950 crossfire.
I never had a problem with crossfire profiles for released games. The only ones that didn't work were Project CARS and Assetto Corsa.

With no new driver since the first Omega release, I'm really iffy about getting a second R9 290.
I run Eyefinity at 4264x1024, and in most games a single R9 290 is fine, but I want all the eye candy I can get in Assetto Corsa, Project CARS and Grand Theft Auto 5 when it releases.


----------



## tsm106

Quote:


> Originally Posted by *LandonAaron*
> 
> I keep debating with myself about going crossfire, but I just can't bring myself to commit to "solution" that is only going to work on some games, and never knowing if they are going to implement a profile for the game your interested in or not.


Do you drive a truck or suv?


----------



## LandonAaron

Quote:


> Originally Posted by *tsm106*
> 
> Do you drive a truck or suv?


??? No


----------



## tsm106

Quote:


> Originally Posted by *LandonAaron*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Do you drive a truck or suv?
> 
> 
> 
> ??? No
Click to expand...

No matter, it works both ways. The point is that a lot of devices and things in life are not useful all the time, or you don't use them all the time. Realistically that's not a good way to judge utility.


----------



## chickensloth

Quote:

> Originally Posted by *taem*
> 
> That looks like the king of the 290x coolers, same thing as the pcs+ but BIGGER. Also uses the same ek block as the revised pcs+.

Uhh, it's better than king--it's the ace







--now my ambient is 18-19C, which helps. I feel like I took a gamble and won


----------



## Jyve

I have that datavac duster. Been using it for years. More than paid for itself by not buying canned air!

Works wonders. More powerful than canned air. Gets a little warm with extended use.


----------



## Agonist

Quote:


> Originally Posted by *Jyve*
> 
> I have that datavac duster. Been using it for years. More than paid for itself by not buying canned air!
> 
> Works wonders. More powerful than canned air. Gets a little warm with extended use.


I need to pick that thing up then.
I go through a lot of air duster, 3 cans a month sadly. I have 5 computers to clean every month, plus other projects.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *LandonAaron*
> 
> I keep debating with myself about going crossfire, but I just can't bring myself to commit to "solution" that is only going to work on some games, and never knowing if they are going to implement a profile for the game your interested in or not.


I have gotten CF to work on every game I play except STALKER.

Fallout, Skyrim, Half-Life, etc etc can all use CF if you force certain profiles


----------



## Agonist

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I have gotten CF to work on every game I play except STALKER.
> 
> Fallout, Skyrim, Half-Life, etc etc can all use CF if you force certain profiles


I could never get Project CARS to work with a forced profile. It ran like a dog with a single 7950 in Eyefinity on 3 monitors.
The first BF Hardline beta worked with the BF4 profile on my 7850 crossfire.


----------



## supermiguel

So I'm trying to overclock 2 290Xs. Basically I have crossfire enabled and I'm using MSI Afterburner. I can run 1145 fine with +50mV, but when I try to go higher, even with +400mV (running at 1.344V), it still gives me issues at 1150.

I have both cards watercooled, and at 1.344V (+400mV) my loaded temps are around 40C... So I'm just wondering what to do next. Give it more voltage, +500/600? Or load the ASUS BIOS on both of them? Also, earlier someone mentioned the position of the switch on the cards; does this really matter for OC?


----------



## chiknnwatrmln

Dude... easy. +400mV is a crap-ton. If the card can't do 1150 on that much, it's just not gonna do 1150.


----------



## LandonAaron

Quote:


> Originally Posted by *LandonAaron*
> 
> I keep debating with myself about going crossfire, but I just can't bring myself to commit to "solution" that is only going to work on some games, and never knowing if they are going to implement a profile for the game your interested in or not.


Quote:


> Originally Posted by *tsm106*
> 
> Do you drive a truck or suv?


Quote:


> Originally Posted by *LandonAaron*
> 
> ??? No


Quote:


> Originally Posted by *tsm106*
> 
> No matter, works both ways. Point is lot of devices/things in life are not useful or you don't use all the time. Realistically it's not good way to judge utility.


I see what you're saying, but I think it's more useful to judge a computer by its worst-case performance scenario instead of its best-case one. Kind of similar to rating a game's performance by the lowest recorded FPS instead of the average or highest recorded FPS.
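That "judge by the worst case" idea is easy to put numbers on. A rough Python sketch; the frametime list below is made up for illustration, and real logs (Afterburner, FRAPS, etc.) would need their own parsing:

```python
# Sketch: comparing average vs worst-case FPS for a run.
# Frame times are in milliseconds, one value per rendered frame;
# the sample data below is invented for illustration.

def fps_stats(frametimes_ms):
    """Return (average fps, minimum fps, 1% low fps) for a run."""
    fps = sorted(1000.0 / t for t in frametimes_ms)
    avg = sum(fps) / len(fps)
    minimum = fps[0]
    # "1% low": average of the slowest 1% of frames (at least one frame)
    n = max(1, len(fps) // 100)
    one_percent_low = sum(fps[:n]) / n
    return avg, minimum, one_percent_low

# Mostly 16.7 ms frames (~60 fps) with a few 33.3 ms stutters (~30 fps)
times = [16.7] * 97 + [33.3] * 3
avg, lo, p1 = fps_stats(times)  # avg ~59 fps, but the worst case is ~30
```

The average here looks like a locked 60, while the minimum tells the real smoothness story, which is the whole argument for worst-case judging.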


----------



## Mega Man

Not really; tsm did a great job.
CFX will do what you need, and when you really need it, it does far better; it's extremely worth it.

I will not stop doing quadfire builds now.
You may say it's "wasteful because in 2 games quad does not work,"

and I say great, because in 300 it does fine.


----------



## taem

Quote:


> Originally Posted by *tsm106*
> 
> No matter, works both ways. Point is lot of devices/things in life are not useful or you don't use all the time. Realistically it's not good way to judge utility.


Yeah, I've come around to that view. At first, when I tried crossfire, I was annoyed at all the titles that didn't support it. But then a lot of them do. With the ones that don't, I'm no worse off, and with the ones that do, I'm a lot better off. At the time I tried crossfire the Tri-X OC was $550, so I returned it. But now that they're half the price and the block I want is available, I'll go back to crossfire pretty soon.

I don't get why multi-GPU support is still so case-by-case though. In this day and age it should work out of the box on all titles.

Quote:


> Originally Posted by *LandonAaron*
> 
> I see what your saying, but I think its more useful to judge a computer by its worst case performance scenario, instead of its best case performance scenario. Kind of similar to rating a games performance based on the lowest recorded FPS, instead of average or highest recorded FPS.


But the worst-case performance scenario for crossfire is what you have right now: single-card mode. The choice is to stay there at low fps all the time, or enjoy higher fps not all of the time, but a lot of the time.


----------



## Red1776

Quote:


> Originally Posted by *LandonAaron*
> 
> Quote:
> 
> 
> 
> Originally Posted by *LandonAaron*
> 
> I keep debating with myself about going crossfire, but I just can't bring myself to commit to "solution" that is only going to work on some games, and never knowing if they are going to implement a profile for the game your interested in or not.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Do you drive a truck or suv?
> 
> Click to expand...
> 
> Quote:
> 
> 
> 
> Originally Posted by *LandonAaron*
> 
> ??? No
> 
> Click to expand...
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> No matter, works both ways. Point is lot of devices/things in life are not useful or you don't use all the time. Realistically it's not good way to judge utility.
> 
> Click to expand...
> 
> I see what your saying, but I think its more useful to judge a computer by its worst case performance scenario, instead of its best case performance scenario. Kind of similar to rating a games performance based on the lowest recorded FPS, instead of average or highest recorded FPS.
Click to expand...

+1 +1 +1 hehe


----------



## mfknjadagr8

Interesting to see what I find when I pop the coolers off... the bottom card idles and runs about 15C higher, even on 100 percent fan. This is with them tested separately. I've noticed crossfire simply isn't working well, period: during gaming or Valley with CFX enabled, the second card is barely being utilized, with little spikes up to 10 or 30 percent once every 30 seconds or so, even when it's struggling in Last Light to keep 60 frames. I'm thinking this might just be the driver at this point. They are getting solid power, and both work equally well separately (albeit with the temp difference), just not together. One thing I noticed in GPU-Z: the BIOS versions are slightly different. Once I re-enable crossfire and plug power back into the first card, I will post what the difference is. Could a difference in BIOSes be the reason it's not crossfiring well? I understand at 1080p it wouldn't need both constantly, but when I'm running at 3200 downscaled to 1920x1080, I would think it would need to run both pretty hard...
That was running one loop of Valley. Temps were under 70 most of the time; the bottom card was around 85 even though it was just chilling ;0

Ok, so here are the BIOS versions:

card 1 bios

card 2 bios

The second card idles around 65 to 67C and hits max pretty easily. When gaming with 100 percent fan it doesn't break the max, but that will be remedied when the water blocks arrive; tracking says Tuesday.


----------



## supermiguel

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I have gotten CF to work on every game I play except STALKER.
> 
> Fallout, Skyrim, Half-Life, etc etc can all use CF if you force certain profiles


How can you tell if CF is being used correctly?


----------



## Red1776

Quote:


> Originally Posted by *supermiguel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *chiknnwatrmln*
> 
> I have gotten CF to work on every game I play except STALKER.
> 
> Fallout, Skyrim, Half-Life, etc etc can all use CF if you force certain profiles
> 
> 
> 
> how can you tell if CF is being used correctly?
Click to expand...

CF works with STALKER...at least quad does.


----------



## GOLDDUBBY

Quote:


> Originally Posted by *Chopper1591*
> 
> Wow, what a post.
> 80% looks to be off-topic on my post though. No offense, you are just trying to help out, I can tell that.
> 
> People here should get badges or something to show their experience level.
> You clearly try to tell me basic stuff which I didn't ask for at all.
> 
> Nonetheless I will still try to answer as completely as possible.
> 
> 
> I have Bitdefender, been using it for 3 years or something now. I keep switching between Bitdefender and Kaspersky. So yes, I know the importance of a virus scanner... although I have had my time where I didn't use one at all because it does slow your system. And Kaspersky in particular kept bugging me and blocked some games from working properly somehow.
> Where did we begin to mention the gpu and/or cpu overclock? I wasn't talking about thinking that either of my overclocks are unstable, right? My gpu stays around 41c core and 35ish on the vrm's with the 1275 overclock. Should be good for longevity, right?
> Secondly, my cpu is clocked at 4.8 like I said, also with proper cooling. Keeping it below 50, more around 45, when gaming. It passed 20 runs of IBT-avx very-high and a few hours of p95 blend, so I call it stable "enough". I'm lazy to test overnight.
> 
> 
> 
> 
> 
> 
> 
> , and I don't think it is causing the problems in the games.
> You also mentioned the ram thing... if you did open my rig in my sig you'd have seen that I have a set of TridentX 2400 c9, which is currently running at ~2140 8-10-10-1T. Paired with a 2670 cpu-nb, so bandwidth should be plenty, right?
> Last but not least. I really tend to stay away from anything that cleans my registry. I don't know how much experience you have with them, but I have had my share of problems from using registry cleaners. Multiple times have had issues leading to eventually re-installing my OS. So thanks, but no thanks. I do clean my trash every now and then though. But I don't touch the registry no more. If you look on the web you see a fair share of "advanced" users advising to avoid them too.
> I must admit though, you have some talent in advertising.
> Ever thought about going into sales?
> 
> 
> 
> 
> 
> 
> 
> 
> I mean, really.
> 
> Thanks for taking time to give me the long reply though, but I have to say it wasn't really helping. Sorry.
> 
> * edit:
> Oh yeah, you also posted about the psu.
> Again, in my sig it states which one I have. I have a Corsair hx750 unit, which is 80+ gold and proper.
> To my knowledge that unit is good enough to power my system. Correct my if I'm wrong.


Best is to not have an antivirus at all. They tend to corrupt files, and the scans steal an insane amount of performance.

Avast's full suite steals more than 50% of my internet bandwidth.

As a gamer, GGP (GameGuard Personal) is the only antivirus/firewall worth having, if any. Unfortunately I think GGP stopped making an English version, not sure.

I've had Kaspersky, Avast, NOD32, Bitdefender, Avira, Norton, and a couple more, but they all tend to corrupt files, along with having a massive negative impact on performance. More so than most viruses and trojans, to be honest. Some unwanted stuff can be annoying though, especially grime and adware. Unfortunately those tend to slip through the AV anyway... so what's it worth?

At least I haven't found a really good AV/firewall suite that causes fewer issues than the viruses it's meant to protect against. I'd rather do a monthly clean install of Windows and use browser add-on antivirus apps.


----------



## pshootr

Quote:


> Originally Posted by *GOLDDUBBY*
> 
> Best is to not have a anti virus. They tend to corrupt files and the scans steal an insane amount of performance.
> 
> Avast full suit steals more than 50% bandwidth of my internet.
> 
> As a Gamer, ggp (game guard personal) is the only antivirus, firewall, worth having. If any. Unfortunately I think ggp stopped making an english version, not sure.
> 
> I've had kapersky, avast, nod32, bitdefender, avira, norton, and a couple more, but they all tend to cause corruption to files, along with massive negative impact on performance. More so than most viruses and trojans to be honest. Some unwanted stuff can be annoying tho, especially grime and add-ware. Unfortunately those tend to slip through the a/v anyways .. so what's the worth?
> 
> At least I haven't found a really good a/v firewall suit that cause less issues than the viruses they are meant to protect against. I rather do a monthly clean install of windows and use browser add on antivirus apps.


I think you need to come back down to earth hehe (a clean install every month?). I have been running AV for many, many years, and it has never corrupted my files. A modern computer with decent hardware has plenty of resources/speed for AV, and the Windows firewall seems to work well for me. I have had the same Windows install for over 3 years.

Edit:

A clean install is very rarely needed as long as you have the patience to maintain your PC and work through any issues as they arise.


----------



## iCrap

I just noticed this.... 1337 r9 290? Is that some sort of easter egg? lol.


----------



## DeviousAddict

@taem
I've got an LG 34UC87, the curved 21:9.
It is the best monitor I've ever owned. The colours and blacks are so good it took me a while to get used to it in games (a tiny bit of settings changes to calibrate colours etc.)

@LandonAaron
I'm running 3440x1440. Dying Light was massively under-optimised when it first came out, and pretty CPU-heavy. Even though they have released an optimization patch, it is still hard-hitting on a system.
I do only see 80-85% usage on my GPU even though I'm only getting 50-60 fps, so I can only guess the rest of the horsepower comes from the CPU.


----------



## Native89

Hi all. I have a 290x Vapor X coming in the mail and am just doing the usual research while I wait.
Was just wondering how the newer Ubisoft titles are doing on the 290x, specifically AssCreed: Unity and FC4.
The only reason I ask is because most of the videos and reviews I find are from before the patches started rolling out.
Coming from a GTX 760, I expect a decent jump in fps, but just wanted to gauge the situation ATM.

Also, I'll be running it on an EVGA 650G. Cutting it a little close I know.
Would it be advisable to overclock on this PSU? Even a mild one would suffice; maybe without touching voltage?
Rest of the system would be an i5 4690 (non K), 8gb 1600 RAM, 4 fans, an SSD and 2 HDD's.

Thanks for the wealth of information in this thread and I look forward to becoming an owner.


----------



## Chopper1591

Quote:


> Originally Posted by *chickensloth*
> 
> Uhh, it's better than king--it's the ace
> 
> --now my ambient is 18-19c which helps. I feel like I took a gamble, and won

It is a decent cooler.
But the best? I don't know.

The tri-x seems to be cooler and more silent, but only by a very small margin.
source

Nonetheless it is a nice card, the backplate really helps for aesthetics IMO.
Quote:


> Originally Posted by *Agonist*
> 
> I need to pick that thing up then.
> I got through alot of air duster. 3 cans a month sadly. I have 5 computers to clean every month, plus other projects.


Wow.
Why do you clean so often? Do you live in the Sahara?









I only dust every few months...
Most dust is only on the top rad, which is outside the case.

Funny to see the topic being on the Metro DataVac.
I had my eye on one of those a few years ago already. Really tempted to buy one now.
Sadly they are priced like 95 euros here. Pretty steep.
Quote:


> Originally Posted by *supermiguel*
> 
> So im trying to overclock 2 290x, basically i have crossfire enabled, and im using msiab and i can run fine with +50 1145 but when i try to go higher even with +400mV running at 1.344V still giving me issues at 1150
> 
> 
> 
> 
> 
> 
> 
> I have both cards watercooled, at 1.344(+400mV) my loaded temps are around 40C... So just wondering what to do next give it more voltage? +500/600 or load the asus bios on both of them... Also earlier someone mentioned the possition of the switch on the cards does this really matter for oc???


Something is really wrong here.
What is your stock VID?

1.344v with +400mv is just plain mad. My card with +200 is close to 1.4v.

About the BIOS switches: can't hurt to try...
Quote:


> Originally Posted by *mfknjadagr8*
> 
> Intresting to see what i find when i pop the coolers off...the bottom card idles and runs about 15c higher even on 100 percent fan.. this is with them tested seperately... i noticed crossfire simply isnt working well period... during gaming or valley with cfx enabled it is barely being utilized..little spikes up to 10 percent to 30 percent once every 30 seconds or so...this is even when its struggling on last light to keep 60 frames...im thinking this might just be the driver at this point.. they are getting solid power and both work equally well (albiet the temp differences) seperately just not together...one thing i noticed... in gpu z the bios versions are slightly different once i reenable crossfire and plug power back into the first card i will post what the difference is... could a difference in bios's be the reason its not crossfiring well? I understand on 1080p it wouldnt need to use it constantly but when im running it at 3200 downscaled to the 1920 x 1080 i would think it would need to run both pretty well...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> That was running one loop of valley...temps were under 70 most of the time...on bottom card around 85 even though it was just chilling ;0
> 
> Ok so here are bios versions
> 
> card 1 bios
> 
> card 2 bios
> 
> Second card idles around 65 to 67c and hits max pretty easily...when gaming though with 100 percent fan it doesnt break the max... but that will be remedied when water blocks come... tracking says tuesday...


Out of curiosity, how many rads are you going to use to cool the gpu's?
Is it going to be a single loop, for both the cpu and gpu's?

I ask because my UT60 360 and XTC 140 are barely enough to cool my 8320 and single 290 Tri-X.
Temps are fine, but to keep it below a 10C delta I need to ramp the fans up when I clock my card to 1275.
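As a rough sanity check on rad sizing like this, the rule of thumb can be sketched in a few lines. The ~100W-per-120mm figure and the component wattages below are assumptions for illustration, not measurements:

```python
# Rough radiator-sizing sanity check. Rule-of-thumb assumption: ~100 W of heat
# shed per 120 mm of radiator at moderate fan speed for a ~10C water-to-air
# delta. Real capacity varies with rad thickness and fan RPM.

def rad_capacity_watts(total_rad_mm, watts_per_120mm=100.0):
    """Estimate how much heat a loop's radiators can dissipate."""
    return total_rad_mm / 120.0 * watts_per_120mm

# The loop above: a 360 mm UT60 plus a 140 mm XTC
capacity = rad_capacity_watts(360 + 140)
load = 220 + 300  # overclocked 8320 + overclocked 290, rough guesses

print(f"capacity ~{capacity:.0f} W vs load ~{load} W")
print("within rule-of-thumb headroom" if capacity >= load else
      "expect a higher delta-T or faster fans")
```

With those guessed loads the estimate lands on the wrong side of the line, which matches the experience of having to ramp the fans to hold a 10C delta when overclocked.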
Quote:


> Originally Posted by *GOLDDUBBY*
> 
> Best is to not have a anti virus. They tend to corrupt files and the scans steal an insane amount of performance.
> 
> Avast full suit steals more than 50% bandwidth of my internet.
> 
> As a Gamer, ggp (game guard personal) is the only antivirus, firewall, worth having. If any. Unfortunately I think ggp stopped making an english version, not sure.
> 
> I've had kapersky, avast, nod32, bitdefender, avira, norton, and a couple more, but they all tend to cause corruption to files, along with massive negative impact on performance. More so than most viruses and trojans to be honest. Some unwanted stuff can be annoying tho, especially grime and add-ware. Unfortunately those tend to slip through the a/v anyways .. so what's the worth?
> 
> At least I haven't found a really good a/v firewall suit that cause less issues than the viruses they are meant to protect against. I rather do a monthly clean install of windows and use browser add on antivirus apps.


Ehmm..
No?

I don't know where you got that information, but I find it rather false.
Sure, it can slow down your system. But you are overstating it really badly.

And I must say my current Bitdefender blocks a decent amount of crap. Paired with Ad-Aware it does a decent job.
Yeah, you can have stuff slipping by. But IMO that is more because of the way you use your system. Skip the porn, man. *joking*

But I do agree with pshootr:
Fixing the issues is the way to go. Re-installing at every issue is what I did when I was like 10 years old.
Repairing is where you get to work on your skills.









I would find it a PITA to re-install so often.

If you have file corruption that often, I would advise you to check on your overclock, if you have any, or overall component stability. System instability can cause file corruption, that is a fact.

And I do have my share of experience with AV software; the list is pretty long. But that was not the subject here.

My favs ATM are Bitdefender and Kaspersky. I have tended to avoid Norton the last few years.
Quote:


> Originally Posted by *pshootr*
> 
> I think you need to come back down to earth hehe (clean install every month). I have been running AV for many many years, and it has never corrupted my files. A modern computer with decent hardware will have plenty of resources/speed for AV, and windows firewall seems to work well for me. I have had the same windows install for over 3 years.
> 
> Edit:
> 
> A clean install is very rarely needed as long as you have patience to maintenance your pc, and work through any issues as they arise.


Nailed it.

Fresh install every month is just plain mad.
I hate it when I even have to do it after "only" 1 year of usage. I have a lot of stuff on the system, and I mean a LOT.

Fixing the issues as they arise is the way to go.
Re-install is for the ignorant.








Quote:


> Originally Posted by *iCrap*
> 
> I just noticed this.... 1337 r9 290? Is that some sort of easter egg? lol.
> 
> 
> Spoiler: Warning: Spoiler!


Hehe, nice find.

Now if only it was also a L33T clocker.
Quote:


> Originally Posted by *Native89*
> 
> Hi all. I have a 290x Vapor X coming in the mail and am just doing the usual research while I wait.
> Was just wondering how the newer Ubisoft titles are doing on the 290x, specifically AssCreed: Unity and FC4.
> The only reason I ask is because most of the videos and reviews I find are from before the patches started rolling out.
> Coming from a GTX 760, I expect a decent jump in fps, but just wanted to gauge the situation ATM.
> 
> Also, I'll be running it on an EVGA 650G. Cutting it a little close I know.
> Would it be advisable to overclock on this PSU? Even a mild one would suffice; maybe without touching voltage?
> Rest of the system would be an i5 4690 (non K), 8gb 1600 RAM, 4 fans, an SSD and 2 HDD's.
> 
> Thanks for the wealth of information in this thread and I look forward to becoming an owner.


Congrats on the buy, firstly.

On my 290 tri-x:
Far Cry 4 runs okay. The patches did increase performance to a degree, although I must say I haven't played it in a month or so, so maybe it is even better now.
Creed Unity, not so much. Still a pile of crap if you ask me.









I think you will manage on that PSU of yours.
Personally I wouldn't touch overclocking on the card. These get real power hungry when overclocked IMO.

Best scenario would be to grab a Kill-A-Watt or something similar. Then you can actually see the power usage and decide whether to overclock or not.
Also, are you clocking that i5?

For comparison:
I have an HX750, which is also 80+ Gold, and I think it is barely enough to power my system when I overclock both the card (1275 with overvolting close to 1.4v) and the CPU to 4.8-5.0GHz.
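The back-of-envelope PSU math here can be sketched like this. Every wattage below is an assumed figure for illustration; an actual Kill-A-Watt reading is far more trustworthy:

```python
# Back-of-envelope PSU headroom check. The 80% load ceiling is a common
# conservative guideline, not a vendor spec; component draws are rough guesses.

def psu_headroom(psu_watts, component_watts, load_ceiling=0.8):
    """Watts left before the PSU passes a conservative fraction of its rating."""
    return psu_watts * load_ceiling - sum(component_watts)

# EVGA 650G feeding a stock i5-4690, a stock 290X, plus drives/fans/board
parts = [84,   # i5-4690 TDP
         290,  # R9 290X gaming load at stock clocks (assumed)
         50]   # SSD, HDDs, fans, motherboard overhead (assumed)
print(f"~{psu_headroom(650, parts):.0f} W of headroom at stock")
```

At stock that leaves some margin; an overclocked, overvolted 290X can pull much more, which is why measuring first is the sensible order.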


----------



## Native89

Quote:


> Originally Posted by *Chopper1591*
> 
> On my 290 tri-x:
> Far Cry 4, runs okay. Patches did increase the performance to a degree, although I must say I haven't played it in a month or so. So maybe it is even better now.
> Creed Unity, not so. Still a pile of crap if you ask me.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think you will manage on that psu of yours.
> Personally I wouldn't touch overclocking on the card. These get real power hungry when overclocked IMO.
> 
> Best scenario would be to grab an kill-a-watt or something similar. Then you can actually see the power usage and decide on whether to overclock or not.
> Also, are you clocking that i5?
> 
> For comparison:
> I have an hx750, which is also an 80+ gold, and I think it is barely enough to power my system when I overclock both the card(1275 with overvolting close to 1.4v) and the cpu to 4.8-5.0ghz.


Thanks for the advice. I actually had a kill-a-watt sitting in my Amazon cart. Gonna get it ASAP.
My i5 can't overclock (not unlocked) so I'll debate overclocking the 290x pending my kill-a-watt purchase.

Sucks about the Ubisoft games, but I think I can live without them.
Main reason for buying this card was to get ready for GTAV and a monitor upgrade (900p to 1080p).


----------



## Vici0us

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Quote:
> 
> 
> 
> Originally Posted by *LandonAaron*
> 
> I keep debating with myself about going crossfire, but I just can't bring myself to commit to "solution" that is only going to work on some games, and never knowing if they are going to implement a profile for the game your interested in or not.
> 
> 
> 
> I have gotten CF to work on every game I play except STALKER.
> 
> Fallout, Skyrim, Half-Life, etc etc can all use CF if you force certain profiles
Click to expand...

Crossfire works pretty well; it's definitely worth it, especially at the price the cards are going for now.


----------



## Chopper1591

Quote:


> Originally Posted by *Native89*
> 
> Thanks for the advice. I actually had a kill-a-watt sitting in my Amazon cart. Gonna get it ASAP.
> My i5 can't overclock (not unlocked) so I'll debate overclocking the 290x pending my kill-a-watt purchase.
> 
> Sucks about the Ubisoft games, but I think I can live without them.
> Main reason for buying this card was to get ready for GTAV and a monitor upgrade (900p to 1080p).


Exactly.
My 290 tri-x, plus water loop including gpu now, is primarily for GTA V.








Sure, it is nice to also play other games with pretty decent settings.

I am interested to see the power consumption once you get it.
Be sure to report idle watts also.

Ubisoft is doing really badly lately IMO. It's a shame; they make nice games.
You can always grab Far Cry 4 if the price is right. You will enjoy it anyway.
Creed.... we will have to see what future patches bring, including AMD's own.

Smart move, upgrading that monitor.
Why didn't you go with 1440p while you were at it?
I am looking to go that route. Sadly decent screens are priced pretty steep here, at least the screens I want (1440p 144Hz or 120Hz).


----------



## fragamemnon

Quote:


> Originally Posted by *iCrap*
> 
> I just noticed this.... 1337 r9 290? Is that some sort of easter egg? lol.
> 
> 
> Spoiler: 1337


It's the manufacture date: Year '13, Week 37.

One of mine is 1338.









The funny thing, and the real easter egg (probably, though I'd rather think of it as a massive coincidence), is that a lot of 1337-manufactured 290s are unlockable to X.
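For anyone else checking their sticker, the date-code reading described above ("1337" = year '13, week 37) is easy to decode. The YYWW interpretation is taken from this post, not from vendor documentation:

```python
# Decode a 4-digit 'YYWW' manufacture code, e.g. "1337" -> year 2013, week 37.
from datetime import date

def decode_date_code(code):
    """Return the Monday of ISO week WW in year 20YY."""
    year, week = 2000 + int(code[:2]), int(code[2:])
    return date.fromisocalendar(year, week, 1)  # Python 3.8+

print(decode_date_code("1337"))  # -> 2013-09-09
print(decode_date_code("1338"))  # -> 2013-09-16
```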


----------



## supermiguel

Quote:


> Originally Posted by *Chopper1591*
> 
> Something is really wrong here.
> What is your stock VID?
> 
> 1.344v with +400mv is just plain mad. My card with +200 is close to 1.4v.
> 
> About the bios switches, can't hurt to try....


I have to check what my VID is when I get home, but the cards are the MSI 290X GAMING 4G.


----------



## rdr09

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Intresting to see what i find when i pop the coolers off...the bottom card idles and runs about 15c higher even on 100 percent fan.. this is with them tested seperately... i noticed crossfire simply isnt working well period... during gaming or valley with cfx enabled it is barely being utilized..little spikes up to 10 percent to 30 percent once every 30 seconds or so...this is even when its struggling on last light to keep 60 frames...im thinking this might just be the driver at this point.. they are getting solid power and both work equally well (albiet the temp differences) seperately just not together...one thing i noticed... in gpu z the bios versions are slightly different once i reenable crossfire and plug power back into the first card i will post what the difference is... could a difference in bios's be the reason its not crossfiring well? I understand on 1080p it wouldnt need to use it constantly but when im running it at 3200 downscaled to the 1920 x 1080 i would think it would need to run both pretty well...
> That was running one loop of valley...temps were under 70 most of the time...on bottom card around 85 even though it was just chilling ;0
> 
> Ok so here are bios versions
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> card 1 bios
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> card 2 bios
> 
> Second card idles around 65 to 67c and hits max pretty easily...when gaming though with 100 percent fan it doesnt break the max... but that will be remedied when water blocks come... tracking says tuesday...


If your second card idles that high, then it is definitely throttling when loaded. Apps may not be measuring or indicating temps accurately; it could be reaching 94 or higher.

My GPUs are both used 100% in Valley at 4K. IIRC, even when I had 1080p.

Since they'll be watered soon - no worries. Also, do not mix apps like GPU-Z, CCC, AB or TriXX.


----------



## Alexbo1101

Is it just me or is Afterburner screwing around?

Whenever I set the power limit, apply it, and then change to another GPU, it resets the power limit back to +0%.

Is this normal or am I doing something wrong?


----------



## fragamemnon

Quote:


> Originally Posted by *Alexbo1101*
> 
> Is it just me or is afterburner screwing around?
> 
> Whenever i set power limit, apply it and then change to another gpu it resets the powerlimit back to +0% ?
> 
> Is this normal or am I doing something wrong?


Disable Overdrive from CCC.


----------



## Alexbo1101

Quote:


> Originally Posted by *fragamemnon*
> 
> Disable Overdrive from CCC.


Can't disable it more than it already is


----------



## mfknjadagr8

@chopper I will have 720mm of rad with the new one... single loop, mcp30 and MCP50X pumps, and an Alphacool 100mm reservoir.


----------



## Chopper1591

Quote:


> Originally Posted by *supermiguel*
> 
> i have to check what my VID is when i get home, but the card are the MSI 290X GAMING 4G


Ok.
Did you really post 10 480 Monsta rads?








Quote:


> Originally Posted by *mfknjadagr8*
> 
> @chopper I will have 720 mm of rad with the new one...single loop mcp30 and mcp50x pumps and a alphacool 100mm reservoir


720mm should manage.
I consider that to be about the minimum, I guess. But I like my stuff quiet.


----------



## supermiguel

Quote:


> Originally Posted by *Chopper1591*
> 
> Ok.
> Did you really post: 10 480 monsta rads?


Lol, I did.

But then I realized the question wasn't for me. But yeah, single loop of 10 Monsta 480s.

80 fans


----------



## taem

Quote:


> Originally Posted by *Native89*
> 
> Thanks for the advice. I actually had a kill-a-watt sitting in my Amazon cart. Gonna get it ASAP.
> My i5 can't overclock (not unlocked) so I'll debate overclocking the 290x pending my kill-a-watt purchase


The Kill-A-Watt is OK, but the Belkin Conserve Insight has a small display on a cable so you can put it next to your screen. It's usually cheaper. Just wait till Amazon has it in stock.

Quote:


> Originally Posted by *Chopper1591*
> 
> It is a decent cooler.
> But the best? I don't know.
> 
> The tri-x seems to be cooler and more silent, but only by a very small margin


Both are among the best coolers on 290X cards, along with the Lightning. But the PCS+ is better; I had both. The Tri-X is on a less aggressive fan curve by default, and at the same fan % the Tri-X is noticeably louder. And the PCS+ gets slightly better temps.

And the Club3D is basically the same card as the PCS+ but with an even bigger heatsink.


----------



## fragamemnon

Quote:


> Originally Posted by *Alexbo1101*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fragamemnon*
> 
> Disable Overdrive from CCC.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can't disable it more than it already is
Click to expand...

Then there is one more odd quirk, which I found once. It had me wondering for a couple of days, even though I tried disabling/enabling it.

Enable Overdrive, set OC values and power target via 3rd party software, then set power target again from within Overdrive (without touching the clocks offset).
Repeat for second GPU.

And let me know if it works.


----------



## Alexbo1101

Quote:


> Originally Posted by *fragamemnon*
> 
> Then there is one more odd quirk, which I found once. It had me wondering for a couple of days, even though I tried disabling/enabling it.
> 
> Enable Overdrive, set OC values and power target via 3rd party software, then set power target again from within Overdrive (without touching the clocks offset).
> Repeat for second GPU.
> 
> And let me know if it works.


Yeah, I found the culprit: AB's 'unofficial overclocking mode' (without PowerPlay). After I switched it over to 'extend official overclocking limits', it works like a charm.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Chopper1591*
> 
> Ok.
> Did you really post: 10 480 monsta rads?
> 
> 
> 
> 
> 
> 
> 
> 
> 720mm should manage.
> I do consider that to be the minimum I guess. But I like my stuff quiet.


Well, all my rads are in push and pull respectively, so I run my fans at max. But even though I'm not scared to run at max, the full speed on the reference blowers is too loud even for me. Can't overstate how happy watercooling them will make me... I'm hoping with water blocks I can properly attach my res again and not have it strip-magneted where it was mounted without mounts... if not, it will be relocated...


----------



## supermiguel

Quote:


> Originally Posted by *supermiguel*
> 
> i have to check what my VID is when i get home, but the card are the MSI 290X GAMING 4G


Quote:


> Originally Posted by *Chopper1591*
> 
> Ok.


I have this same card so should be the same:


----------



## mfknjadagr8

I'm thinking when I get home I'm gonna take the bottom 290 out and see what the paste looks like.. willing to bet it's either hardened or way over-applied... it's not gonna be easy though, as my water line from the CPU was shortened so I barely got it in there.

I also noticed that with the backplates on both of mine I had to tweak the card to get it to line up with the case; never had that issue before.


----------



## iCrap

Quote:


> Originally Posted by *fragamemnon*
> 
> It's the manufacture date: Year '13, Week 37.
> 
> One of mine is 1338.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The funny thing and real easter egg (probably, but I rather think of it as a massive coincidence) *is that a lot of 1337-manufactured 290s are unlockable to X*.


Not this one!


----------



## Ghostdragon445

Is it actually worth changing the thermal paste on a Vapor-X card, or is it usually pretty good? I have good temps but was wondering what the general consensus is on this topic.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Ghostdragon445*
> 
> is it actually worth changing the thermal paste on a vapor-x card or is it usually pretty good, i have good temps but was wondering what the general consensus is on this topic.


If you have good temps it's not worth it IMO... you could potentially get worse temps than you have now. Not doubting your abilities, but in general, if it isn't experiencing poor temps I wouldn't mess with a good thing.


----------



## Ghostdragon445

Quote:


> Originally Posted by *mfknjadagr8*
> 
> if you have good temps it's not worth it imo...you potentially could get worse temps than you have now..not doubting your abilities but in general if it isn't experiencing poor temps I wouldn't mess with a good thing


Ok, I can do it easily enough; just wanted to know if it was recommended/worth it.


----------



## tsm106

Quote:


> Originally Posted by *Ghostdragon445*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mfknjadagr8*
> 
> if you have good temps it's not worth it imo...you potentially could get worse temps than you have now..not doubting your abilities but in general if it isn't experiencing poor temps I wouldn't mess with a good thing
> 
> 
> 
> Ok, I can do it easily enough, just wanted to know if it was recommended/worth it
Click to expand...

Make sure you really want to do it, because you technically void Sapphire's warranty. After which you have to lie and pretend you never did it if it ever comes up.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Ghostdragon445*
> 
> Ok, I can do it easily enough, just wanted to know if it was recommended/worth it


Differences in thermal pastes are 4C at best... unless it was a bad paste job, which apparently yours wasn't... for instance, my second card idled at 65, so yeah, definitely worth it for me.


----------



## Waleh

Is anyone using the Gigabyte Windforce edition or the Asus DirectCU II version? These two are the cheapest ones I can get in Canada. They both seem pretty solid, but is there anything I should know about them?

http://www.ncix.com/detail/gigabyte-radeon-r9-290x-oc-95-93519-1423.htm?affiliateid=7474144

http://www.ncix.com/detail/asus-radeon-r9-290x-directcu-48-93996-1423.htm?affiliateid=7474144


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I'm thinking when I get home I'm gonna take the bottom 290 out and see what the paste looks like.. willing to bet it's either hard or way overapplied...it's not gonna be easy though as my water line from cpu was shortened so I barely got it in there
> 
> 
> 
> 
> 
> 
> 
> I also noticed the backplate for both of mine I had too tweak the card to get it to line up with case never had that issue before
> Quote:
> 
> 
> 
> Originally Posted by *mfknjadagr8*
> 
> if you have good temps it's not worth it imo...you potentially could get worse temps than you have now..not doubting your abilities but in general if it isn't experiencing poor temps I wouldn't mess with a good thing
Click to expand...

First you say you're gonna check the paste, then you tell another bloke who's thinking the same thing not to?
But you're right in say'n not to though.
Leave it be; you're going to watercooling anyways.








Quote:


> Originally Posted by *Ghostdragon445*
> 
> is it actually worth changing the thermal paste on a vapor-x card or is it usually pretty good, i have good temps but was wondering what the general consensus is on this topic.


Only if you're going to block it.

If it ain't broke it don't need fixing.

Quote:


> Originally Posted by *Devildog83*
> 
> LOL
> 
> 
> 
> 
> 
> 
> 
> You're pretty punny man.


----------



## Ghostdragon445

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> First you say your gonna check the paste , then you tell another bloke who's thinking the same thing , no to ???
> Leave it be your going to watercooling anyways
> Only if your going to block it
> If it aint broke it don't need fixing .


No, I asked advice on something and he replied that if your card is fine, don't bother, because it's not worth it down the road.

His is a BAD thermal paste job and he needs to change it to get better performance.... you just misread the entire convo.


----------



## Ghostdragon445

Quote:


> Originally Posted by *tsm106*
> 
> Make sure you really want to do it because you technically void Sapphires warranty. After which you have to lie and pretend you never did it if it ever comes up.


Ah, I will keep that in mind before I open the card up; didn't know that about Sapphire cards.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Ghostdragon445*
> 
> No i asked advice on a something and he replied with if your card is fine dont bother because its not worth it down the road.
> 
> His is a BAD thermal paste job and he needs to change it to get better performance.... you just miss read the entire convo


No, he's going to watercooling anyways. Reference coolers are crap, new paste or not.

Your cooler is a good 'un so it's fine unless you're gonna block it.

Misread that.


----------



## mfknjadagr8

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> No , hes going to water cooling anyways . Reference coolers are crap new paste or not .
> 
> Your cooler is a good un so its fine unless your gonna block it
> 
> Miss read that


lol, yes, but for 20 minutes of my time I'll have a little better temps so it doesn't overheat... it idles around 65, which is indicative of shoddy paste or a horribly crowned heatsink... even if it's the latter, I'll know and can either repaste or leave it out until Tuesday...

Why would I recommend repasting to someone with good temps?

Plus, my card is second hand so I have nothing to lose, considering I will be removing it all anyway... I'm actually about to go take it out now... if I can get it back out without removing the top card or the CPU-to-second-rad hose... I plan to tear down the loop once lol


----------



## mfknjadagr8

Ok, so I found the issues; I'll let the pictures speak for themselves...


The thermal paste broke apart like chalk... sadly the memory pads were partially destroyed... I didn't even have to pry to get the assembly off; it almost fell apart.


----------



## pshootr

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Ok so I found the issues ill let the pictures speak for themselves...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The thermal broke apart like chalk...sadly the memory pads were partially destroyed...I didn't even have to pry to.get the assembly off it almost fell apart


Looks like the typical thermal compound found on most stock parts. For $300.00+ you would think they might use some better/longer-lasting TIM.









I don't think I would mind spending 50 more for a card with better TIM.


----------



## mfknjadagr8

Well, this thing's seen some heat... I thought I had more thermal compound on there... look closer.


Spoiler: Warning: Spoiler!


Core is nice and clean though.


EDIT: after the reinstall... idle temp is WAY better...

And the Valley score - Crossfire is actually working now.


----------



## pshootr

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Well this things seen some heat...I thought I had more thermal compound on there...look closer
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> core is nice and clean though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: after the reinstall... idle temp is WAY better...
> 
> 
> and valley score crossfire is acutally working now


Awesome man, glad you got them running cooler. And good to see your score coming out proper.


----------



## Buehlar

Quote:


> Originally Posted by *Waleh*
> 
> Is anyone using the Gigabyte windforce edition or the Asus DIrectCU II version? These two are the cheapest ones I can get in Canada. They both seem pretty solid but is there anything I should know about them?
> 
> http://www.ncix.com/detail/gigabyte-radeon-r9-290x-oc-95-93519-1423.htm?affiliateid=7474144
> 
> http://www.ncix.com/detail/asus-radeon-r9-290x-directcu-48-93996-1423.htm?affiliateid=7474144


Here are reviews for them:

Gigabyte R9 290X Windforce OC

ASUS R9 290X DCII

Both are pretty solid performing cards; however, I prefer the build quality of the DCII from ASUS, which is usually more expensive than the other brands.

The Gigabyte may run slightly cooler but may also be a bit louder with its triple fans.

I'm running a pair of 290X DCIIs under water and have been very satisfied with their performance.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Ghostdragon445*
> 
> Ah i will keep that in mind before i open the card up, didnt know that about sapphire cards.


Yeah, @tsm106 *had* Sapphire cards, found out that nugget of info to do with the warranty, and promptly sold them.

Quote:


> Originally Posted by *mfknjadagr8*
> 
> lol yes but for 20 minutes of my time ill have a little better temps so it doesnt overheat...it idles around 65 which is indicative of shoddy paste or horribly crowned heatsink...even if its the latter ill know and can either repaste or leave it out until Tuesday...
> 
> 
> 
> 
> 
> 
> 
> why would I recommend repasting to someone with good temps
> 
> 
> 
> 
> 
> 
> 
> plus my card is second hand so I have nothing to.lose considering I will be removing it all anyway...I'm actually about to go take it out now...if I can get it back out without removing the top card or the cpu to second rad hose...I plan to tear down the loop once lol
> Quote:
> 
> 
> 
> Originally Posted by *mfknjadagr8*
> 
> Well this things seen some heat...I thought I had more thermal compound on there...look closer
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> core is nice and clean though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: after the reinstall... idle temp is WAY better...
> 
> 
> and valley score crossfire is acutally working now
Click to expand...

Well, that's a good result. Nice temp drop.


I am a bit lazy, so what I did to sort temps is I blocked 'em without bothering to do a re-seat or a re-pad.


I tend to say not to do it in case of complications.
Like if you tighten the cooler too hard on the GPU die, you can crack the GPU.


----------



## mfknjadagr8

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yeah @tsm106 *had* Sapphire cards , found out that nugget of info to do with warranty and promptly sold them .
> Well that's a good result . Nice temp drop
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am a bit lazy so what I did to sort temps is I blocked em without bothering to do a re-seat with or a re-pad
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I tend to say not to do it in case of complications .
> Like if you tighten the cooler to hard on the gpu die you can crack the gpu .


Now they run about the same temps under heavy load.. and whoever called thermal throttling as my possible Crossfire issue was right on the money.. Crossfire instantly worked upon boot... I have the Komodo blocks coming on Tuesday, so... it'll be a lot better; then I can start overclocking these beasts... one nice thing is the Komodo blocks come pre-padded... so easier for me ;0


----------



## Red1776

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mfknjadagr8*
> 
> I'm thinking when I get home I'm gonna take the bottom 290 out and see what the paste looks like.. willing to bet it's either hard or way overapplied...it's not gonna be easy though as my water line from cpu was shortened so I barely got it in there
> 
> 
> 
> 
> 
> 
> 
> I also noticed the backplate for both of mine I had too tweak the card to get it to line up with case never had that issue before
> Quote:
> 
> 
> 
> Originally Posted by *mfknjadagr8*
> 
> if you have good temps it's not worth it imo...you potentially could get worse temps than you have now..not doubting your abilities but in general if it isn't experiencing poor temps I wouldn't mess with a good thing
> 
> 
> 
> 
> 
> Click to expand...
> 
> First you say your gonna check the paste , then you tell another bloke who's thinking the same thing , no to
> 
> 
> 
> 
> 
> 
> 
> ?
> But your right in say'n not to though
> Leave it be anyways your going to watercooling
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ghostdragon445*
> 
> is it actually worth changing the thermal paste on a vapor-x card or is it usually pretty good, i have good temps but was wondering what the general consensus is on this topic.
> 
> Click to expand...
> 
> Only if your going to block it
> 
> 
> 
> 
> 
> 
> 
> 
> If it aint broke it don't need fixing .
> 
> Quote:
> 
> 
> 
> Originally Posted by *Devildog83*
> 
> LOL
> 
> 
> 
> 
> 
> 
> 
> You're pretty punny man.
> 
> Click to expand...
Click to expand...

The first thing I do is always change the thermal compound on my GPUs. As done from the factory it is almost always sloppy, hideous, vapid, slipshod, horrid, incompetent, messy, misaligned, ...oh dear... I've gone too far.

Anyway, I replace it with IC Diamond (that will start an argument) and always get nice improvements in load temps.

Good luck 

Greg


----------



## kizwan

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Well this things seen some heat...I thought I had more thermal compound on there...look closer
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> core is nice and clean though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: after the reinstall... idle temp is WAY better...
> 
> 
> and valley score, crossfire is actually working now


That's the score you get with Crossfire? Clocks?
Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ghostdragon445*
> 
> Ah i will keep that in mind before i open the card up, didnt know that about sapphire cards.
> 
> 
> 
> Yeah @tsm106 *had* Sapphire cards , found out that nugget of info to do with warranty and promptly sold them .
> 
> Quote:
> 
> 
> 
> Originally Posted by *mfknjadagr8*
> 
> lol yes but for 20 minutes of my time ill have a little better temps so it doesnt overheat...it idles around 65 which is indicative of shoddy paste or horribly crowned heatsink...even if its the latter ill know and can either repaste or leave it out until Tuesday...
> 
> 
> 
> 
> 
> 
> 
> why would I recommend repasting to someone with good temps
> 
> 
> 
> 
> 
> 
> 
> plus my card is second hand so I have nothing to.lose considering I will be removing it all anyway...I'm actually about to go take it out now...if I can get it back out without removing the top card or the cpu to second rad hose...I plan to tear down the loop once lol
> Quote:
> 
> 
> 
> Originally Posted by *mfknjadagr8*
> 
> Well this things seen some heat...I thought I had more thermal compound on there...look closer
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> core is nice and clean though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> EDIT: after the reinstall... idle temp is WAY better...
> 
> 
> and valley score, crossfire is actually working now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well that's a good result . Nice temp drop
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am a bit lazy so what I did to sort temps is I blocked em without bothering to do a re-seat with or a re-pad
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I tend to say not to do it in case of complications .
> Like if you tighten the cooler to hard on the gpu die you can crack the gpu .

You have water chillers, so a re-seat or re-pad is kind of redundant.








Quote:


> Originally Posted by *Red1776*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mfknjadagr8*
> 
> I'm thinking when I get home I'm gonna take the bottom 290 out and see what the paste looks like.. willing to bet it's either hard or way overapplied...it's not gonna be easy though as my water line from cpu was shortened so I barely got it in there
> 
> 
> 
> 
> 
> 
> 
> I also noticed the backplate for both of mine I had too tweak the card to get it to line up with case never had that issue before
> Quote:
> 
> 
> 
> Originally Posted by *mfknjadagr8*
> 
> if you have good temps it's not worth it imo...you potentially could get worse temps than you have now..not doubting your abilities but in general if it isn't experiencing poor temps I wouldn't mess with a good thing
> 
> 
> 
> 
> 
> 
> First you say you're gonna check the paste, then you tell another bloke who's thinking the same thing not to?
> 
> But you're right in say'n not to though.
> 
> Leave it be anyways, you're going to watercooling.
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ghostdragon445*
> 
> is it actually worth changing the thermal paste on a vapor-x card or is it usually pretty good, i have good temps but was wondering what the general consensus is on this topic.
> 
> 
> Only if you're going to block it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If it aint broke it don't need fixing .
> 
> Quote:
> 
> 
> 
> Originally Posted by *Devildog83*
> 
> LOL
> 
> 
> 
> 
> 
> 
> 
> You're pretty punny man.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The first thing I do is always change the thermal compound on my GPUs. The factory application is almost always sloppy, hideous, vapid, slipshod, horrid, incompetent, messy, misaligned, ...oh dear...I've gone too far.
> anyway I replace it with IC Diamond (that will start an argument)
> 
> 
> 
> 
> 
> 
> 
> and always get nice improvements in load temps.
> Good luck
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Greg

No argument here.







IC Diamond is one of the best TIMs.


----------



## Arizonian

Quote:


> Originally Posted by *iCrap*
> 
> I put a third 290 on a riser but it wouldnt go into crossfire mode
> 
> 
> 
> 
> 
> 
> 
> 
> yes, i know its ghetto.
> 
> 
> Spoiler: Warning: Spoiler!


You know I went to update you but didn't see you.









You're added - congrats















Quote:


> Originally Posted by *LandonAaron*
> 
> Yeah, when I lived with my parents I would use my dad's air compressor/tank, and it worked about 1000x better than Duster for cleaning out the computer. I started shopping around for one of my own recently, and from what I read it seems you really need to spend at least $100 to get one with enough pressure and a large enough tank to work well for dusting.
> 
> Then I came across this,
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> ,
> 
> 
> http://www.amazon.com/Metro-Vacuum-ED500-500-Watt-Electric/dp/B001J4ZOAW/ref=sr_1_1?ie=UTF8&qid=1425573425&sr=8-1&keywords=datavac
> 
> It's only $60, and if you watch the video from the first review posted there on Amazon, it looks pretty powerful, probably even more powerful than a good air compressor.


Quote:


> Originally Posted by *Jyve*
> 
> I have that datavac duster. Been using it for years. More than paid for itself by not buying canned air!
> 
> Works wonders. More powerful than canned air. Gets a little warm with extended use.


Just picked this up last weekend with a box of goodies and will agree it's worth the money. Works much, much better than cans. Strong force of dry air that gets even the finest of dust off. Loud as a vacuum cleaner, but after you use it you'll care less. I'm sorry I didn't fork out the $60 sooner, as I've overspent that by far on cans as well.


----------



## Red1776

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *iCrap*
> 
> I put a third 290 on a riser but it wouldnt go into crossfire mode
> 
> 
> 
> 
> 
> 
> 
> 
> yes, i know its ghetto.
> 
> If you purchased one of those $5 risers on ebay, that may be the reason. I put a fourth HD 5870 on an ASUS CIVF
> 
> 
> 
> You can see the blue foil where the arrow is pointing. What you need is one of these:
> 
> 
> 
> They are from a company called adexelec.com. They are comparatively expensive at $60.00, but work well and can be hand-made to any length.
> 
> You can find them here : http://www.adexelec.com/pciexp.htm
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You know I went to update you but didn't see you.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You're added - congrats
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *LandonAaron*
> 
> Yeah, when I lived with my parents I would use my dads air compressor/tank, and it worked about 1000x better than Duster for cleaning out the computer. I started shopping around for one of my own recently, and from what I read it seems you will really need to spend at least $100 to get one with enough pressure and a large enough tank to work well for dusting.
> 
> Then I came across this,
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> ,
> 
> 
> http://www.amazon.com/Metro-Vacuum-ED500-500-Watt-Electric/dp/B001J4ZOAW/ref=sr_1_1?ie=UTF8&qid=1425573425&sr=8-1&keywords=datavac
> 
> Its only $60, and if you watch the video from the first review posted there on amazon it looks like its pretty powerful, probably even more powerful than a good air compressor.
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jyve*
> 
> I have that datavac duster. Been using it for years. More than paid for itself by not buying canned air!
> 
> Works wonders. More powerful than canned air. Gets a little warm with extended use.
> 
> 
> Just picked this up last weekend with a box of goodies and will agree it's worth the money. Works much much better than cans. Strong force of dry air that gets even the finest of dust off. Loud as a vacuum cleaner but after you use it you will care less. I'm sorry I didn't fork out the $60 sooner as I've over spent that by far as well.

*Quoted by kizwan:*

*That's great: IC Diamond usually draws a harsh response. I have had the best results with IC (including Indigo Extreme), within .02C*

*Hi* *Arizonian,*

*I was looking at your title tag line and added a flux capacitor a couple years ago to my 4x 7970 quad build *



*It took an extra 240mm to keep temps down, but the performance boost was substantial. I realize mine is the 1.51 Gigawatt Flux Capacitor v2.0, but still, I would like to join*

*I would like to join the club please.*

*Thanks AZ *


----------



## mirzet1976

I have a problem with MSI Afterburner 4.1.0: it won't go over 1235MHz core clock or 1625MHz memory clock, while I don't have that problem in TRiXX, where I can do 1400MHz core and 1825MHz memory. How can this be solved so MSI Afterburner goes over the 1235MHz limit?


----------



## rdr09

Quote:


> Originally Posted by *mirzet1976*
> 
> I have a problem with MSI Afterburner 4.1.0: it won't go over 1235MHz core clock or 1625MHz memory clock, while I don't have that problem in TRiXX, where I can do 1400MHz core and 1825MHz memory. How can this be solved so MSI Afterburner goes over the 1235MHz limit?


1400? you sure you have a hawaii not a 970 or 980?

anyways, if you are trying to get more volts using AB, check out post one by Arizonian. Scroll down to "How to give more volts using AB . . . ." by sugarhell.

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club


----------



## mirzet1976

Quote:


> Originally Posted by *rdr09*
> 
> 1400? you sure you have a hawaii not a 970 or 980?
> 
> anyways, if you are trying to get more volts using AB, check out post one by Arizonian. Scroll down to "How to give more volts using AB . . . ." by sugarhell.
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club


not that, but the slider does not go over 1235, see picture
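For reference, the commonly cited way to raise Afterburner's stock clock-offset limits on AMD cards is enabling unofficial overclocking in MSIAfterburner.cfg. Treat this as a sketch: key names have varied between Afterburner versions, the feature is explicitly unsupported, and you should back the file up first (edit it while Afterburner is closed):

```ini
; MSIAfterburner.cfg (in the Afterburner install folder), edit with the app closed
[ATIADLHAL]
; Paste the confirmation sentence the config file itself documents; an empty value keeps the stock limits
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
; 0 = official limits, 1 = unofficial overclocking with PowerPlay support, 2 = without PowerPlay
UnofficialOverclockingMode = 1
```

Newer Afterburner builds expose a similar "extend official overclocking limits" option in the settings UI, which edits the same thing for you.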


----------



## mfknjadagr8

I'm having an issue with Afterburner force-closing while it's open on my second monitor while gaming... after it crashes, the fan profile etc. stays active though, kind of odd...


----------



## godiegogo214

Hey guys whats up

Looking for advice... I have 2 MSI R9 290X cards in Crossfire, and my temps are CPU 32C idle, GPU1 38C idle, GPU2 31C idle (on the socket). When I game, GPU1 rises to 61C and GPU2 to 40-something, so my computer starts throwing "GPU1 is hot" warnings. Any thoughts on how I can make these cards run cooler?

First time i do dual gpus or anything like that....its incredible how hot these cards get

pc on my profile

Thank you for the help


----------



## mfknjadagr8

Quote:


> Originally Posted by *godiegogo214*
> 
> Hey guys whats up
> 
> Looking for advice.....I have 2 MSI R9 290X cards on crossfire on my machine and my machine temps are (CPU 32C Idle GPU1 38C idle GPU2 31C idle on the socket) when i game my GPU1 raises to 61C and GPU2 40 something) so my computer will start throughing GPU1 is hot warnings any thoughts on how i can make these cards colder?
> 
> First time i do dual gpus or anything like that....its incredible how hot these cards get
> 
> pc on my profile
> 
> Thank you for the help


60c shouldn't be throttling unless your vrm temps are high...you can aim fans across them or go water


----------



## godiegogo214

How do I know if the card is throttling?


----------



## mfknjadagr8

Quote:


> Originally Posted by *godiegogo214*
> 
> How do I know if the card is throttling?


Use a monitoring program to check temps, and watch or log the core speed and temperature to see if the core speed drops under load or fluctuates quickly. Throttling is when the GPU can't handle the load, either from too much heat or voltage; the card throttles when you exceed its temperature or power envelope, to protect itself from damage. (Mostly) when you throttle, the clocks will drop, normally extremely steeply, to let the card cool down or run with less power, depending on the reason it is throttling.
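The log-and-look approach above can be sketched in a few lines. This is a generic illustration, not tied to any particular monitoring tool's log format; the sample numbers and the 100MHz drop threshold are made up:

```python
# Given logged (time_s, core_mhz, temp_c) samples taken under load, flag
# throttling as a steep clock drop between consecutive samples.

def find_throttle_events(samples, min_drop_mhz=100):
    """Return (time_s, from_mhz, to_mhz) for each steep clock drop."""
    events = []
    for (t0, mhz0, _), (t1, mhz1, _) in zip(samples, samples[1:]):
        if mhz0 - mhz1 >= min_drop_mhz:
            events.append((t1, mhz0, mhz1))
    return events

# Example log: card holds ~1000 MHz, then plunges as temps climb.
log = [
    (0, 1000, 70), (1, 1000, 78), (2, 1000, 86),
    (3, 727, 94),   # steep drop -> throttling
    (4, 1000, 88), (5, 1000, 89),
]
print(find_throttle_events(log))  # -> [(3, 1000, 727)]
```

A steady clock under load with stable frame times means no throttling; repeated entries like the one above line up with the choppiness people describe.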


----------



## godiegogo214

@mfknjadagr8 I will do that... Heaven DX11 on ultra should be able to stress it

Thanks for the tip


----------



## Kriant

I found the new torture test for my OCing - Zombie Army Trilogy on max settings. Oh, that game sends unstable OCs freezing and breaks drivers so badly I have to clean the registry and reinstall them 0_o.


----------



## ayazam

hello guys,

right now I have a HIS R9 290 IceQ X2, and I wonder: is it worth it to CF with another 290? I'm still playing games at 1080p and happy with it. I can get a 2nd-hand 290 for ~$230 or a new card for ~$300; of course both have aftermarket coolers just like mine, but from a different manufacturer

what's your opinion?


----------



## taem

Quote:


> Originally Posted by *ayazam*
> 
> hello guys,
> 
> right now i have HIS R9 290 IceQ X2, i wonder is it worth it to CF with another 290? im still playing games in 1080p and happy with it, i can get a 2nd hand 290 for $230~ or a new card for $300~, of course both have aftermarket cooler just like mine, but different manufacture
> 
> what's your opinion?


I would not crossfire 290 for 1080p.

$300 for a new 290 is not a good price these days, neither is $230 for a used. Those are not prices you have to jump on.


----------



## Maintenance Bot

Add me please when you got time. Thanks.

http://www.techpowerup.com/gpuz/details.php?id=gemde

2-Sapphire R9-290x's


----------



## fishingfanatic

Put a 75 cfm or larger fan behind the gpu to blow air across it and the vrms. Remove an extra expansion slot plate or 2 for better flow if necessary.

Worked for a pair of 5970s in a CM Cosmos 1st gen. From 105C to 86 with a 120mm 90 CFM fan, low noise!!!

Hope that helps !

FF


----------



## iCrap

Crossfire seriously sucks, or I'm just doing it wrong. Some of my games are stuttering hard and most are getting little to no boost at all with 2 cards.








I mean no way my CPU is bottlenecking or causing an issue... it's a freaking 6 core @ 4.5ghz...


----------



## mfknjadagr8

what games are you playing?... i had issues with my crossfire not working at all... turns out it was thermal throttling in my case... now its butter smooth on the ones ive played it one since i repasted the offender....i was getting 0 usage or at most 3 to 15 percent spikes.. now im getting nice even usage between the cards on everything ive tried so far... from bf4 to metro last light... i would check temps and see if you are getting throttling in the clocks of the gpus... if either of your cards throttle it will be horribly choppy...and fps will never be constant


----------



## kizwan

Quote:


> Originally Posted by *iCrap*
> 
> Crossfire seriously sucks or i'm just doing it wrong. some of my games are stuttering hard and most are getting little to no boost at all with 2 cards
> 
> 
> 
> 
> 
> 
> 
> 
> I mean no way my CPU is bottlenecking or causing an issue... it's a freaking 6 core @ 4.5ghz...


Which games have the stuttering problem? Did you check GPU usage?


----------



## iCrap

Quote:


> Originally Posted by *kizwan*
> 
> Which games have the stuttering problem? Did you check GPU usage?


My GPU usage is weird. It's zero on card 1 and 100 on card 2.
i made a post here
http://www.overclock.net/t/1545174/crossfire-struggles/0_80#post_23637121
Quote:


> Originally Posted by *mfknjadagr8*
> 
> what games are you playing?... i had issues with my crossfire not working at all... turns out it was thermal throttling in my case... now its butter smooth on the ones ive played it one since i repasted the offender....i was getting 0 usage or at most 3 to 15 percent spikes.. now im getting nice even usage between the cards on everything ive tried so far... from bf4 to metro last light... i would check temps and see if you are getting throttling in the clocks of the gpus... if either of your cards throttle it will be horribly choppy...and fps will never be constant


mine are on water so temps are like 45c. I had the issues you were describing while the 2nd card was on air.


----------



## kizwan

Quote:


> Originally Posted by *iCrap*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Which games have the stuttering problem? Did you check GPU usage?
> 
> 
> 
> My GPU usage is weird. It's zero on card 1 and 100 on card 2.
> i made a post here
> http://www.overclock.net/t/1545174/crossfire-struggles/0_80#post_23637121

Did you already try re-installing the driver? Uninstall using DDU if necessary. Are both GPUs detected properly, e.g. both running at x16? Try running the GPU-Z render test for each card. You can also disable Crossfire & test games with a single GPU only.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> " SNIP "
> *You have water chillers, re-seat or re-pad kind of redundant.
> 
> 
> 
> 
> 
> 
> 
> *
> No argument here.
> 
> 
> 
> 
> 
> 
> 
> IC Diamond is one of the best TIMs.


This is true









Its the best thing I did cooling wise using chiller(s) on both CPU / GPU









IC Diamond is on all three of my 290's

Quote:


> Originally Posted by *rdr09*
> 
> *1400? you sure you have a hawaii not a 970 or 980?*
> 
> anyways, if you are trying to get more volts using AB, check out post one by Arizonian. Scroll down to "How to give more volts using AB . . . ." by sugarhell.
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club


LooooL 1400mhz


----------



## Red1776

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> " SNIP "
> *You have water chillers, re-seat or re-pad kind of redundant.
> 
> 
> 
> 
> 
> 
> 
> *
> No argument here.
> 
> 
> 
> 
> 
> 
> 
> IC Diamond is one of the best TIMs.
> 
> 
> 
> This is true
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Its the best thing I did cooling wise using chiller(s) on both CPU / GPU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> IC Diamond is on all three of my 290's
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> *1400? you sure you have a hawaii not a 970 or 980?*
> 
> anyways, if you are trying to get more volts using AB, check out post one by Arizonian. Scroll down to "How to give more volts using AB . . . ." by sugarhell.
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club
> 
> 
> LooooL 1400mhz

GOOD MAN SEEING PAST ALL OF THE IC DIAMOND subterfuge and all of the querimonious buffoons. It is the best compound in a tube; I use it on all of my GPUs and CPUs, and with the same thermal result as the Indigo Extreme.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Red1776*
> 
> GOOD MAN SEEING PAST ALL OF THE IC DIAMOND subterfuge and all off the querimonious buffoons. it is the best compound in a tube, I use it on all of my GPU's and CPU;s and to the sane same thermal result as the indigo extreme.


I got one application of Indigo to flow right out of three goes .
But it dropped Sandybee idle temps by 5c


----------



## mfknjadagr8

Quote:


> Originally Posted by *Red1776*
> 
> GOOD MAN SEEING PAST ALL OF THE IC DIAMOND subterfuge and all off the querimonious buffoons. it is the best compound in a tube, I use it on all of my GPU's and CPU;s and to the sane same thermal result as the indigo extreme.


any of you guys used the cool lab pad for the CPU? It looks easy, and from what I've read it performs pretty well; just wondering if anyone had used it


----------



## Chopper1591

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Ok so I found the issues ill let the pictures speak for themselves...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> The thermal compound broke apart like chalk... sadly the memory pads were partially destroyed... I didn't even have to pry to get the assembly off; it almost fell apart


Wow.
Your card has one of the worst TIM applications I have seen so far.
Looks like it was dried up completely.

Did you re-apply with those crappy mem pads?

Interested to see the difference when your blocks come in the mail.
I think your difference will be even larger than mine was.

Showed it before, but will do it again:
same oc settings, 100% fan vs water, ~15 minutes of valley.


Spoiler: Warning: Spoiler!







Quote:


> Originally Posted by *pshootr*
> 
> Looks like the typical thermal compound found on most stock parts. For $300.00+ you would think they might use some better/longer-lasting TIM.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't think I would mind spending 50
> 
> 
> 
> 
> 
> 
> 
> more for a card with better TIM


I do agree on your point.
But I must say the paste on my tri-x was applied pretty good actually.

The TIM was more than a year old and I could still wipe it off with a paper cloth pretty easily. Not dried at all.
Quote:


> Originally Posted by *mfknjadagr8*
> 
> any of you guys used the cool lab pad for the cpu? It looks easy and from what I've read performs pretty good just wondering if anyone had used it


Not used it but it seems to perform very good.
Although I did read that it is a PITA to remove it. Personally I wouldn't use it, I also think it is too expensive for "just TIM".


----------



## taem

Question:

If you ever want to put a card with a block on it back on the stock air cooler, how do you know what thickness pads to use for the vrms and mem? My card is a Powercolor PCS+ 290, original version with reference pcb.


----------



## tsm106

Quote:


> Originally Posted by *taem*
> 
> Question:
> 
> If you ever want to put a card with a block on it back on the stock air cooler, how do you know what thickness pads to use for the vrms and mem? My card is a Powercolor PCS+ 290, original version with reference pcb.


Trick question?

Save the stock cooler with pads, wrap it in plastic. When it comes time to slap the stock cooler on, it's in like-new condition, or whatever it was when you took it off originally. All the lil bits of pad that came off the cooler and stuck to the PCB, remove them with a razor and put them back where they go, on the cooler, etc. Anyways, the stock pads are not like the EK/aftermarket pads because they are somewhat specific to the brand. Some AIBs use thick spongy pads that smush down flat, some use thin fabric pads, etc. Worst case, you'd have to figure it out on a case-by-case basis. In other words, look at your stock cooler pads and determine it yourself. Or do as I wrote above: scrape all the bits off, put them back on the stock cooler, and bag it for that rainy day.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Chopper1591*
> 
> Wow.
> Your card is one of the worst TIM applications I have seen so far.
> Looks like it was dried up completely.
> 
> Did you re-apply with those crappy mem pads?
> 
> Interested to see the difference when your blocks come in the mail.
> I think your difference will be even larger than mine was.
> 
> Showed it before, but will do it again:
> same oc settings, 100% fan vs water, ~15 minutes of valley.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I do agree on your point.
> But I must say the paste on my tri-x was applied pretty good actually.
> 
> The TIM was more than a year old and I could still wipe it off with papercloth pretty easy. Not dried at all.
> Not used it but it seems to perform very good.
> Although I did read that it is a PITA to remove it. Personally I wouldn't use it, I also think it is too expensive for "just TIM".


yes, I did put it back with the crappy pads, as they were already that way before; I didn't even have to pry the PCB or the cooler, it literally came off with no pressure... I don't have pads and wouldn't waste them if I did; blocks come in two more days... but I did ensure the pieces were aligned exactly the way they came off... thankfully it was just the memory pads and not the vrms







but yes it was way overpasted and dry...when I made that prediction I didn't know it would be both lol


----------



## Arizonian

Quote:


> Originally Posted by *Red1776*
> 
> *Quoted by kizwan:*
> *That's great ": IC Diamond Usually draws a harsh response. I have had the best results with IC (including Indigo Extreme) within .02c*
> 
> *Hi* *Arizonian,*
> *I was looking at your title tag line and added a flux capacitor a couple years ago to my 4x 7970 quad build *
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> *It took an extra 240mm to keep temps down, but the performance boost was substantial. I realize mine is the 1.51 *
> Gigawatt Flux Capacitor
> v2.0, but still, I would like to join
> *I would like to join the club please.*
> *Thanks AZ *


"The 1.21 Gigawatt Flux Capacitor Club" is pretty exclusive.







It's a hidden thread.









Congrats on the quad - added
















Quote:


> Originally Posted by *Maintenance Bot*
> 
> Add me please when you got time. Thanks.
> 
> http://www.techpowerup.com/gpuz/details.php?id=gemde
> 
> 2-Sapphire R9-290x's
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Looks real nice - congrats - added


----------



## BuildTestRepeat

And im back!!!

After sending back my Powercolor PCS+ R9 290 for being a bad card i decided to give AMD one more shot. Wanted to wait for the 300 series cards but couldn't resist this when i saw it on craigslist.

Picked up a Sapphire reference R9 290 locally with a GELID Cooler on it. VRM and Memory chips all have heatsinks.

Current stable OC :

StableOC.JPG 96k .JPG file


Card Porn









SapphireR9290GELID 1975k .jpg file


SapphireR9290GELIDHeatsinks 2025k .jpg file


R9290VRMHeatsinks 1574k .jpg file


----------



## dallas1990

I'm having some problem with MSI Afterburner: it's not picking up my fan speed or anything for my 2 R9 290Xs. I tried re-installing it and it didn't work, and I've tried TriXX and still nothing, but the fans are spinning. Any ideas on how to fix this?


----------



## godiegogo214

Are these temps normal??



this was running one benchmark of Heaven DX11 on ultra settings

I don't know anything about MSI Afterburner... this is the first time I've dealt with high-end cards like this. Also a quick question: when you have Crossfire on, aren't the 2 cards supposed to help each other? Why do I see so much load on the primary card but barely any on the secondary (about 80% load on primary, 30-40% on secondary)?

I'm just still surprised at how hot these cards get... seriously considering just water cooling the whole computer


----------



## dallas1990

I'd say 86 isn't too bad with a benchmark program, and that's still cooler than the reference cards with their 95. But I'm new with AMD cards so don't take my word.


----------



## godiegogo214

Quote:


> Originally Posted by *dallas1990*
> 
> i'd say 86 isn't to bad with a benchmark program. and thats still cooler then the refrence cards with there 95* but im new with amd cards so dont take my word


I've seen it go to 90C playing The Witcher on high settings... the socket gets to like 63 and AI tools throws a warning (Asus Sabertooth 990FX R2 board)


----------



## dallas1990

I have the same motherboard as well. I tend to ignore AI tools because the temperature it tells you is the socket, not the actual CPU. So I keep an eye on temps with HWMonitor.


----------



## godiegogo214

Quote:


> Originally Posted by *dallas1990*
> 
> I have the same motherboard as well. I tend to ignore Ai tools cause of the temperature tells you the socket not the actual cpu. So I keep an eye on temps with Hwmonitor


so AI tools is Asus' way of being extra cautious, I guess we can say


----------



## Arizonian

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> And im back!!!
> 
> After sending back my Powercolor PCS+ R9 290 for being a bad card i decided to give AMD one more shot. Wanted to wait for the 300 series cards but couldn't resist this when i saw it on craigslist.
> 
> Picked up a Sapphire reference R9 290 locally with a GELID Cooler on it. VRM and Memory chips all have heatsinks.
> 
> Current stable OC :
> 
> StableOC.JPG 96k .JPG file
> 
> 
> Card Porn
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> SapphireR9290GELID 1975k .jpg file
> 
> 
> SapphireR9290GELIDHeatsinks 2025k .jpg file
> 
> 
> R9290VRMHeatsinks 1574k .jpg file


Congrats - added


----------



## Chopper1591

Quote:


> Originally Posted by *mfknjadagr8*
> 
> yes I did put it back with the crappy pads as they were already that way before because I didn't even have to pry the pcb or the cooler it literally came off with no pressure...I don't have pads and wouldn't waste them if I did blocks come in two more days...but I did ensure the pieces were aligned exactly the way they came off...thankfully it was just the memory pads and not the vrms
> 
> 
> 
> 
> 
> 
> 
> but yes it was way overpasted and dry...when I made that prediction I didn't know it would be both lol


9 out of 10 cards are overpasted.
It looks like it either had bad contact with the die from the start, or it heated up for a prolonged time somehow, causing the paste to dry up so badly.

Hehe, yeah.
Vrm pads are way more important.


----------



## taem

So I took the stock cooler off my powercolor pcs+ 290 and I see these things around the die



close up



two more behind vrms



I'm guessing those have to come off. They're pretty tall; I don't think there's any way the block goes on with those there.

Now I'm wondering how I'd ever get the stock cooler back on if I ever need to. Guess this goes in the trash instead of on eBay when I upgrade. VRM2 pads are all torn up anyway.


----------



## Chopper1591

Quote:


> Originally Posted by *taem*
> 
> So I took the stock cooler off my powercolor pcs+ 290 and I see these things around the die
> 
> 
> 
> close up
> 
> 
> 
> two more behind vrms
> 
> 
> 
> I'm guessing those have to come off. they're pretty tall I don't think there's any way the block goes on with those there.
> 
> Now I'm wondering how I ever get the stock cooler back on if I ever need to. Guess this goes in trash instead of eBay when I upgrade. Vrm2 pads are all torn up anyway.


Seen it before.
But not so many.

Why don't you store them with the stock cooler?
Some double-sided tape will work fine when you need to remount the cooler to sell the card.

And no, the block will most likely not fit if you leave the rubber blocks where they are.


----------



## DzillaXx

Quote:


> Originally Posted by *taem*
> 
> So I took the stock cooler off my powercolor pcs+ 290 and I see these things around the die
> 
> 
> 
> close up
> 
> 
> 
> two more behind vrms
> 
> 
> 
> I'm guessing those have to come off. they're pretty tall I don't think there's any way the block goes on with those there.
> 
> Now I'm wondering how I ever get the stock cooler back on if I ever need to. Guess this goes in trash instead of eBay when I upgrade. Vrm2 pads are all torn up anyway.


I have the same PCB (LF R29F); the EK blocks don't interfere, so no need to remove them. (If you had the EK block in hand, you would also see they don't block anything.)

I'm happy with mine, overclocks decently.

Also you can buy VRM pads for cheap online. So don't toss anything out!!!!
Also depending on the Block you get, they will probably give you extra pads. I know EK did.


----------



## taem

Quote:


> Originally Posted by *DzillaXx*
> 
> I have the same PCB(LF R29F), The EK blocks don't interfere. No need to remove them. (If you had the EK Block in hand, you would also see they don't block anything)
> 
> I'm happy with mine, overclocks decently.
> 
> Also you can buy VRM pads for cheap online. So don't toss anything out!!!!
> Also depending on the Block you get, they will probably give you extra pads. I know EK did.


How do I know if the block is making good contact, other than trying it and getting horrible temps? I did not get extra pads with the block; it's Watercool, they're German, they don't factor in human error.

As for aftermarket pads, do you know what thickness?

I also mangled the plastic arrow things that clip the VRM heatsinks on lol.

Just 4 screws holding that huge heatsink assembly, metal shroud and fans to the PCB. And a very small plate. Was surprised.

There was enough paste to treat 20 cards. Should have taken a pic; the die area looked like cement was poured onto it.


----------



## DzillaXx

Quote:


> Originally Posted by *taem*
> 
> How do I know if the block is making good contact, other than trying it and getting horrible temps? I did not get extra pads with the block; it's Watercool, they're German, they don't factor in human error.
> 
> As for aftermarket pads, do you know what thickness?
> 
> I also mangled the plastic arrow things that clip the VRM heatsinks on lol.
> 
> Just 4 screws holding that huge heatsink assembly, metal shroud and fans to the PCB. And a very small plate. Was surprised.
> 
> There was enough paste to treat 20 cards. Should have taken a pic; the die area looked like cement was poured onto it.


Yeah, most of the card is heatsink. The backplate is useless, as it can't be used with a block.

For my Card, it was etched out where all those things would go. I get great temps, and the block fit like a glove.

Granted I used the EK 290x block.

The VRM pads that come with the waterblock are around the same thickness as what you would find on the stock cooler. The memory requires the thin pads. You will probably have enough left over with the block you buy.


----------



## tsm106

^^Guys. Waterblocks purchased new come with matching pads. That's the way it is. You don't use those pads with your stock cooler. Don't imply that either, it gets confusing for beginners. When you take off the stock cooler, transfer any stuck pad/pad pieces etc over to the stock cooler and bag it/store it whatever.

The new ek block will have pads, they even come pre-cut nowadays. If you bought a used block, go to ek's site and download the manual which will have the pad specs. Go to ppcs, buy some pads, they even sell replacement ek pads.

Do not mix the ek pads and stock cooler pads. They are different designs and made for different compression forces.



Easy. Take them off and stick them back onto the stock cooler so you don't lose them. They serve no purpose on a fullcover waterblock.


----------



## kizwan

Quote:


> Originally Posted by *godiegogo214*
> 
> Are these temps normal??
> 
> 
> 
> This was running one benchmark of Heaven DX11 on ultra settings
> 
> I don't know anything about MSI Afterburner... this is the first time I've dealt with high-end cards like this... also, quick question: when you have Crossfire on, are the 2 cards not supposed to help each other? Why do I see so much load on the primary card but barely any load on the secondary card? (About 80% load on primary, 30-40% load on secondary.)
> 
> I'm just still surprised at how hot these cards get... seriously considering just water cooling the whole computer


Yeah, temps can go up to that, so it's pretty normal. It's a good idea to run your card with a custom fan profile.

Both cards should be working, if Crossfire is enabled, when running Heaven.


----------



## dallas1990

Anyone have an idea what's causing my problem? I've tried installing Asus GPU Tweak and reinstalling AMD drivers, with no luck. I can't seem to control my GPU fan speed, and GPU Tweak says something about a "psad fan overdrive 5" failure.

This is what I see in MSI AB


----------



## Devildog83

Quote:


> Originally Posted by *dallas1990*
> 
> Anyone have an idea what's causing my problem? I've tried installing Asus GPU Tweak and reinstalling AMD drivers, with no luck. I can't seem to control my GPU fan speed, and GPU Tweak says something about a "psad fan overdrive 5" failure.
> 
> This is what I see in MSI AB


Did you set up a fan profile? If not, click on the tools tab, the one that looks like a gear, and set up a profile and save. After that, make sure you tick the options that say start with Windows and start minimized, and when you turn your PC on it will load automatically. Also, after you change anything, if you want to keep it, click the check mark in the main UI.


----------



## dallas1990

There's no fan tab in the settings menu though it's like it was deleted or something


----------



## mfknjadagr8

Quote:


> Originally Posted by *Devildog83*
> 
> Did you set up a fan profile? If not, click on the tools tab, the one that looks like a gear, and set up a profile and save. After that, make sure you tick the options that say start with Windows and start minimized, and when you turn your PC on it will load automatically. Also, after you change anything, if you want to keep it, click the check mark in the main UI.


This may help... on my 290s I have to maintain at least 50 percent fan speed or I get black screens in gaming... but that's with the stock blower... if I forget to change my profile to my gaming one it will black screen within a half hour... even if temps aren't that high... I feel like it's the drastic temperature spikes causing the black screens, because the fan doesn't spin up high enough or adjust quickly enough for heat spikes... at least that's what it seems like on mine
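To make the fan-profile discussion above concrete, here is a small illustrative sketch of what a custom curve with a minimum-speed floor does: linear interpolation between (temperature, fan%) points, clamped to a floor like the 50% minimum mentioned above. The curve points and the floor value here are made-up examples, not settings taken from Afterburner.

```python
# Illustrative fan curve: linear interpolation between (deg C, fan %)
# points, with a minimum floor. Values below are assumptions for the sketch.
CURVE = [(40, 30), (60, 50), (75, 70), (85, 100)]
MIN_FAN = 50  # floor, e.g. to avoid slow spin-up on sudden heat spikes


def fan_speed(temp_c, curve=CURVE, floor=MIN_FAN):
    """Return fan duty (%) for a given GPU temperature."""
    if temp_c <= curve[0][0]:
        pct = curve[0][1]
    elif temp_c >= curve[-1][0]:
        pct = curve[-1][1]
    else:
        # find the segment containing temp_c and interpolate linearly
        for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
            if t0 <= temp_c <= t1:
                pct = f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
                break
    return max(pct, floor)
```

The floor is the interesting part: even at idle temperatures the fan never drops below 50%, which is exactly the workaround described for heat-spike black screens.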


----------



## kizwan

Quote:


> Originally Posted by *dallas1990*
> 
> Anyone have an idea what's causing my problem? I've tried installing Asus GPU Tweak and reinstalling AMD drivers, with no luck. I can't seem to control my GPU fan speed, and GPU Tweak says something about a "psad fan overdrive 5" failure.
> 
> This is what I see in MSI AB


With the little information you gave us, I'm going to guess you have an Asus card. I think you will need to use GPU Tweak to set the fan profile. What is the exact failure message GPU Tweak throws?


----------



## Dt_Freak1

Has anyone successfully done a volt-modded BIOS for a Gigabyte Radeon R9 290X?


----------



## dallas1990

I actually have two Sapphire 290X Vapor-X OC 8GB cards in CF. MSI AB, GPU Tweak and Sapphire's program won't let me change the fan curve or speed. The Asus fail code is pADL_Overdrive5_FanSpeed_Get


----------



## taem

Quote:


> Originally Posted by *tsm106*
> 
> ^^Guys. Waterblocks purchased new come with matching pads. That's the way it is. You don't use those pads with your stock cooler. Don't imply that either, it gets confusing for beginners. When you take off the stock cooler, transfer any stuck pad/pad pieces etc over to the stock cooler and bag it/store it whatever.
> 
> The new ek block will have pads, they even come pre-cut nowadays. If you bought a used block, go to ek's site and download the manual which will have the pad specs. Go to ppcs, buy some pads, they even sell replacement ek pads.
> 
> Do not mix the ek pads and stock cooler pads. They are different designs


Yeah, I knew that, and that other guy probably did too. I was just being dense. With the card that I and that other guy have, the VRM pads are independent of the heatsink and plate, so you don't need a specific thickness to get contact without messing up the contact between the heatsink and the die.

Here you see how small the plate is and how it does not interact with the VRMs at all.



Here are the VRM pads and heatsinks. I can use any thickness I want and secure them to the PCB with a nut and screw. I think that's what that guy meant.



I like this design.

Quote:


> Easy. Take them off and stick them back onto the stock cooler so you don't lose them. They serve no purpose on a fullcover waterblock.


Yeah. They're there because the heatsink attaches with only 4 screws; they provide stability. So I'm pretty sure they don't have to be positioned exactly.

I was just being stupid, thinking only in terms of getting it back to factory state.


----------



## godiegogo214

Quote:


> Originally Posted by *kizwan*
> 
> Yeah, temps can go up to that, so it's pretty normal. It's a good idea to run your card with a custom fan profile.
> 
> Both cards should be working, if Crossfire is enabled, when running Heaven.


Thank you for the reply. When I get home I'll post a screenshot of the core load on the GPUs. Any ideas how to fix this? Sorry, it's the first time I've done Crossfire. Thanks for the help


----------



## taem

I've had a 290 since release but I don't think I was ever added -- can you add me?

Heatkiller on.





Sex. Crappy phone pix do not do it justice. I'm pretty sure I messed up the paste job tho.


----------



## godiegogo214

So it is clear that my pc is not using the second card for some reason

This is so frustrating









i'm using driver version 9.14.10.01080, Catalyst 14.12
Bios version 015.043.000.004

this is the info from both cards

Also, I don't know if it applies, but I have an 850 Watt PSU


----------



## Roaches

Quote:


> Originally Posted by *godiegogo214*
> 
> 
> 
> So it is clear that my pc is not using the second card for some reason
> 
> This is so frustrating
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i'm using driver version 9.14.10.01080, Catalyst 14.12
> Bios version 015.043.000.004
> 
> this is the info from both cards
> 
> Also, I don't know if it applies, but I have an 850 Watt PSU


Ain't PCI-E 3.0 required to run Crossfire from XDMA on Hawaii cards? I could be wrong though.


----------



## tsm106

Quote:


> Originally Posted by *godiegogo214*
> 
> 
> 
> So it is clear that my pc is not using the second card for some reason
> 
> This is so frustrating
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i'm using driver version 9.14.10.01080, Catalyst 14.12
> Bios version 015.043.000.004
> 
> this is the info from both cards
> 
> Also, I don't know if it applies, but I have an 850 Watt PSU


Unless you use Mantle, Crossfire only works in fullscreen mode, i.e. not whilst on the desktop.


----------



## godiegogo214

I think you're right on the PCI Express 3.0... so far I'm reading and it keeps saying 3.0. Can someone else please confirm?

I'm telling you, I'm never buying any more stuff from TigerDirect... unbelievable


----------



## tsm106

Quote:


> Originally Posted by *godiegogo214*
> 
> I think you're right on the PCI Express 3.0... so far I'm reading and it keeps saying 3.0. Can someone else please confirm?
> 
> I'm telling you, I'm never buying any more stuff from TigerDirect... unbelievable


Wrong.


----------



## MrWhiteRX7

Definitely don't need PCIe 3.0, or else all the AMD FX CPU owners would be uber sad haha.


----------



## godiegogo214

Yeah, I see it now... I was just borderline raging without knowing... but now when I ran Heaven DX11 in fullscreen mode I saw the GPU clocks spike to 1030 and the usage was up there... up and down but looked uniform... frames were 130 or so and max frames was 218... I did custom with fullscreen and high settings


----------



## tsm106

Quote:


> Originally Posted by *godiegogo214*
> 
> Yeah, I see it now... I was just borderline raging without knowing... but now when I ran Heaven DX11 in fullscreen mode I saw the GPU clocks spike to 1030 and the usage was up there... up and down but looked uniform... frames were 130 or so and max frames was 218... I did custom with fullscreen and high settings


AMD gpu and Afterburner 101:

http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40

Run Heaven or Valley according to the bench threads in the benchmark subforums, so you can later compare and see if you've got your rig set up right. Running at random settings won't help you in that respect.


----------



## mfknjadagr8

Quote:


> Originally Posted by *tsm106*
> 
> Unless you use Mantle, Crossfire only works in fullscreen mode, i.e. not whilst on the desktop.


That explains why windowed mode doesn't use the second card... are there Mantle settings of any kind? I see choosing it as an option instead of DX on supported titles; is that all? Don't get me wrong, it helps quite a bit


----------



## tsm106

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Unless you use Mantle, Crossfire only works in fullscreen mode, i.e. not whilst on the desktop.
> 
> 
> 
> That explains why windowed mode doesn't use the second card... are there Mantle settings of any kind? I see choosing it as an option instead of DX on supported titles; is that all? Don't get me wrong, it helps quite a bit

Mantle is used only in games that support the Mantle API, like BF4, Civ: BE, and other games.


----------



## mfknjadagr8

Quote:


> Originally Posted by *tsm106*
> 
> Mantle is used only in games that support the Mantle API, like BF4, Civ: BE, and other games.


that's what I was guessing it was...just your wording of unless you have mantle threw me off I thought I might be missing out on something


----------



## kizwan

Quote:


> Originally Posted by *godiegogo214*
> 
> 
> 
> So it is clear that my pc is not using the second card for some reason
> 
> This is so frustrating
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i'm using driver version 9.14.10.01080, Catalyst 14.12
> Bios version 015.043.000.004
> 
> this is the info from both cards
> 
> Also, I don't know if it applies, but I have an 850 Watt PSU


Yeah, like tsm said, make sure you're running in fullscreen mode. With the DX11 API, Crossfire only works in fullscreen mode. Only with the Mantle API does Crossfire work in both windowed & fullscreen mode.

It's a long shot, but did you try ticking the _"Enable AMD CrossFireX for applications that have no associated ..."_ option? I don't have that option because I use Win 7.

You can test both cards individually to make sure both cards are working. Disable Crossfire & connect the monitor to the GPU that you want to test.


----------



## Bertovzki

OK, now I am about to mount my Aquacomputer Hawaii block on my R9 290X Tri-X OC.

I have a Coolermaster X1 Extreme Fusion. Is everyone here a fan of the pea method? I don't like how it doesn't spread across the surface very well. I have not built a system for many years now, and the last system I bought off my old man was already done, so I've not had to do much in the way of thermal paste. I know all the pros and cons and dos and don'ts; curious as to any opinions or feedback. Also, I've never used thermal pads before; I will just be using the Aquacomputer pads with the block.

Anything I should know about this block? I'm going to do it tomorrow, and my CPU too. Not sure whether to use the Gelid that came with my EK Supremacy or not; they are both good pastes from what I read.
I also have Arctic Silver 5


----------



## tsm106

I use pea. Mount and remount a couple times to confirm you get a good spread and a repeatable spread.


----------



## kizwan

Quote:


> Originally Posted by *Bertovzki*
> 
> OK, now I am about to mount my Aquacomputer Hawaii block on my R9 290X Tri-X OC.
> 
> I have a Coolermaster X1 Extreme Fusion. Is everyone here a fan of the pea method? I don't like how it doesn't spread across the surface very well. I have not built a system for many years now, and the last system I bought off my old man was already done, so I've not had to do much in the way of thermal paste. I know all the pros and cons and dos and don'ts; curious as to any opinions or feedback. Also, I've never used thermal pads before; I will just be using the Aquacomputer pads with the block.
> 
> Anything I should know about this block? I'm going to do it tomorrow, and my CPU too. Not sure whether to use the Gelid that came with my EK Supremacy or not; they are both good pastes from what I read.
> I also have Arctic Silver 5


I use an "X" or "+" for TIM. I just make sure I don't put on too much. My way is, when applying TIM, I press the block down to spread the TIM & then lift the block to check the spread. If I'm satisfied with the spread, which I usually am, I just re-mount the block. Anyway, it doesn't matter which method you use, just check the spread.

When mounting the block, just make sure to follow the included instruction manual. A common mistake is applying thermal pads on the memory.


----------



## Bertovzki

@tsm106 @kizwan, interesting you both say check the spread. This is what I would ideally always like to do, but I thought it would be the last thing you would want to do, as removing and re-seating the block could make air pockets? No? Obviously no problems? Well, that would make it easier.

Yeah, I understand less is best too, as long as you have good spread. You so often see in other dudes' examples or vids that the pea method just leaves a round blob in the middle and it doesn't make it very far towards the edges at all. I would have thought it is splitting hairs a bit, as long as you don't have a dry spot in the middle or an air pocket.

I know most of the heat is immediately over the die and not over the whole surface, but I do like the idea of it covering the whole surface.

Any minor bubbles should dissipate not long after heating up and burning in?


----------



## kizwan

The proper way is, once you get a good spread, just clean the TIM off & re-apply it the same way. I sometimes do this, but most of the time I'm too lazy; I just re-mount the block carefully. So far this method has worked OK for me.

That's why I use an "X" or "+": I'll get a nice square spread.

If you put on too much TIM then an air pocket may form when you re-mount the block. If you just apply the right amount of TIM, I wouldn't worry about it.


----------



## Bertovzki

Ok thanks guys , I quite like the idea of "X" or Star


----------



## tsm106

The reason for checking the spread is to confirm, one, that you got a proper spread and, two, that the pea is big enough to cover the die. If you don't get full coverage on the die, apply a bigger pea next time, and so on and so forth.


----------



## taem

About spread: I use a pea on the CPU because the die is smaller than the heat spreader and it's easy to get good coverage.

But doesn't a GPU die extend to the corners? That's why I've always gone X on GPUs.

EK recommends an X and a + for a union-jack style application. It really calls for a thin paste like Gelid GC Extreme doing it this way, though.


----------



## BuildTestRepeat

I'm surprised I was not bombarded with questions on the GELID cooler haha









I plan to strap my Corsair H40 to it soon using the red mod zip-tie technique, since the VRM and memory chips already have heatsinks. Will post temp results and max OC.


----------



## dallas1990

Has anyone's PC restarted randomly during a game? It's like my GPUs overheat and my computer just black-screens and then restarts


----------



## Kriant

Quote:


> Originally Posted by *tsm106*
> 
> The reason for checking the spread is to confirm, one, that you got a proper spread and, two, that the pea is big enough to cover the die. If you don't get full coverage on the die, apply a bigger pea next time, and so on and so forth.


That. I always double-check. Though once I forgot to put TP (thermal paste) on when replacing my WB, and then I was scratching my head wondering why one card was going around 70ish C while the others were under 40


----------



## thrgk

I have 4 290Xs under water and I'm debating on getting a 1440p FreeSync monitor, a 4K FreeSync monitor (to be released soon from Samsung), or just a 4K 60Hz monitor. I play mainly BF4 and FPS-style games.

Do you think 4 cards would do well at 4K with FreeSync, or would my fps be so low it wouldn't make sense? Would going regular 4K be a better move? Though for fast-paced games I thought it might be a hindrance. Is 1440p with FreeSync the best option, or could my rig handle 4K with FreeSync? I thought FreeSync was like 144Hz, and if my rig can't hold 144 fps at 4K res then there is no point?

thanks
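As a rough sanity check on the 1440p-vs-4K part of that question, the pixel math alone is telling (this is pure arithmetic; real scaling depends on the game and on Crossfire efficiency):

```python
# Pixel-throughput comparison between the two resolutions under discussion.
pixels_4k = 3840 * 2160     # 8,294,400 pixels per frame
pixels_1440p = 2560 * 1440  # 3,686,400 pixels per frame

ratio = pixels_4k / pixels_1440p  # 4K pushes 2.25x the pixels per frame

# So, very roughly, a rig that manages ~144 fps at 1440p could be expected
# to land somewhere near 144 / 2.25 = 64 fps at 4K, all else being equal.
expected_4k_fps = 144 / ratio
```

This is why "hold 144 fps at 4K" is a much taller order than 144 fps at 1440p, even for four cards.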


----------



## LandonAaron

The pea method never works for me. The TIM never reaches the corners, unless I start using beans, and that's a whole different method. I just spread the TIM as evenly as I can.

A few days ago a couple of people on this thread mentioned a TIM called IC Diamond. I have been using Prolimatech PK-3, and though it has given me better temps than the standard Arctic Silver 5, the stuff is seriously hard to apply. It comes out really hard, and you don't spread it so much as smush it into place. It comes with a little spatula to help with this, but the paste is twice as likely to cling to the spatula as to the block. Anyway, it's a pain, and I don't wish to ever use it again. I was considering Gelid as that seems very popular, but I kinda want something a little more exotic, like one of those metal pastes or DIAMONDS!!! After seeing the recommendations here I did a little research and found a thread where people claimed that using the TIM would damage your components due to the hardness of diamonds. Is there any truth to that? Do you have to take any special precautions when using it?


----------



## LandonAaron

Quote:


> Originally Posted by *thrgk*
> 
> I have 4 290Xs under water and I'm debating on getting a 1440p FreeSync monitor, a 4K FreeSync monitor (to be released soon from Samsung), or just a 4K 60Hz monitor. I play mainly BF4 and FPS-style games.
> 
> Do you think 4 cards would do well at 4K with FreeSync, or would my fps be so low it wouldn't make sense? Would going regular 4K be a better move? Though for fast-paced games I thought it might be a hindrance. Is 1440p with FreeSync the best option, or could my rig handle 4K with FreeSync? I thought FreeSync was like 144Hz, and if my rig can't hold 144 fps at 4K res then there is no point?
> 
> thanks


I think you're confusing FreeSync with high-refresh-rate monitors. As far as I know there are no FreeSync monitors on the market yet, though there should be some this year, hopefully in the coming months. Like G-Sync, the technology will likely be found on high-refresh-rate monitors, but there is nothing preventing it from being put on a regular 60Hz monitor either. Getting a high-refresh-rate monitor always beats a low-refresh-rate monitor, all things being equal, because a high-refresh-rate monitor can run at low refresh rates as well, but a low-refresh-rate monitor won't be able to run high refresh rates.

As far as 4K + FreeSync vs. just 4K, you would want the one with FreeSync if similarly priced. FreeSync is going to give you a better, smoother experience regardless of the refresh rate or resolution you choose to run at. When it comes to 4K there may be little reason to get a monitor with > 60Hz refresh rate, as it will be hard for any computer to push the FPS needed to take advantage of it. I think 4 cards would be sufficient for a good gaming experience though. But if it were me I would just get a 1440p monitor and 2 cards. If you're in no hurry and have the cash to afford a $700 monitor, wait for the FreeSync versions. Otherwise you can get a 1440p Korean monitor for $325-$350 that can be overclocked from 60Hz to somewhere between 96Hz and 120Hz.

Another thing to take into consideration when going 4K > 60Hz is the card you're using. I am pretty sure HDMI can only do 4K up to 60Hz, DVI can't do 4K, and only DisplayPort can do 4K at 120Hz. I may be wrong, but off the top of my head that's how I think it works.
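To put rough numbers on those interface claims, here is an uncompressed data-rate check. The required rates below count pixel data only (blanking overhead is ignored, so real requirements are somewhat higher), and the link capacities are approximate effective payload rates for the era's standards after 8b/10b encoding overhead:

```python
# Required uncompressed video data rate, pixel data only (no blanking).
def gbit_per_s(width, height, hz, bpp=24):
    """Gbit/s needed for the given mode at bpp bits per pixel."""
    return width * height * hz * bpp / 1e9

four_k_60 = gbit_per_s(3840, 2160, 60)    # ~11.9 Gbit/s
four_k_120 = gbit_per_s(3840, 2160, 120)  # ~23.9 Gbit/s

# Approximate effective payload capacities (Gbit/s, after 8b/10b overhead):
HDMI_1_4 = 8.16   # not enough even for 4K60 at 24-bit color
HDMI_2_0 = 14.40  # enough for 4K60
DP_1_2 = 17.28    # enough for 4K60, but NOT for 4K120 at full 24-bit
# (4K120 at 24-bit needs DisplayPort 1.3 or later.)
```

So the post above is right that HDMI of the time topped out at 4K60 (and HDMI 1.4 below that), but note that even DisplayPort 1.2 falls short of 4K at 120Hz with full 24-bit color.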


----------



## MrWhiteRX7

Quote:


> Originally Posted by *LandonAaron*
> 
> The pea method never works for me. The TIM never reaches the corners, unless I start using beans, and that's a whole different method. I just spread the TIM as evenly as I can.
> 
> A few days ago a couple of people on this thread mentioned a TIM called IC Diamond. I have been using Prolimatech PK-3, and though it has given me better temps than the standard Arctic Silver 5, the stuff is seriously hard to apply. It comes out really hard, and you don't spread it so much as smush it into place. It comes with a little spatula to help with this, but the paste is twice as likely to cling to the spatula as to the block. Anyway, it's a pain, and I don't wish to ever use it again. I was considering Gelid as that seems very popular, but I kinda want something a little more exotic, like one of those metal pastes or DIAMONDS!!! After seeing the recommendations here I did a little research and found a thread where people claimed that using the TIM would damage your components due to the hardness of diamonds. Is there any truth to that? Do you have to take any special precautions when using it?


Personally I love IC Diamond and use it on my GPUs and CPU







Never had an issue even after repasting. My 2600K has a good few years on it with IC Diamond and no scars... HOWEVER, you will see people stating the exact opposite, so honestly just research the hell out of it and come up with a conclusion, or maybe flip a coin? This alone can turn into quite a debate


----------



## Bertovzki

Thanks all for the opinions on thermal paste application, good to hear some local feedback, there are a lot of different opinions

This is the best method though


----------



## battleaxe

Quote:


> Originally Posted by *Bertovzki*
> 
> Thanks all for the opinions on thermal paste application, good to hear some local feedback, there are a lot of different opinions
> 
> This is the best method though


LOL

gonna use that method next time.


----------



## mfknjadagr8

Quote:


> Originally Posted by *dallas1990*
> 
> Has anyone's PC restarted randomly during a game? It's like my GPUs overheat and my computer just black-screens and then restarts


I was having black screens during gaming. Turning off ULPS helped with it some, but I got to watching my temps, as my Crossfire wasn't working properly and one of the cards was overheating... I repasted it and turned the fans up a bit and no more black screens so far. If you have a card you just bought I wouldn't recommend repasting, as removing the heatsink voids the warranty on most... but I would monitor the temps while gaming and see if they are getting high... my temps weren't extremely high, but it was making Crossfire not work properly and black-screening with temps around 90...


----------



## kizwan

Quote:


> Originally Posted by *LandonAaron*
> 
> The pea method never works for me. The TIM never reaches the corners, unless I start using beans, and that's a whole different method. I just spread the TIM as evenly as I can.
> 
> A few days ago a couple of people on this thread mentioned a TIM called IC Diamond. I have been using Prolimatech PK-3, and though it has given me better temps than the standard Arctic Silver 5, the stuff is seriously hard to apply. It comes out really hard, and you don't spread it so much as smush it into place. It comes with a little spatula to help with this, but the paste is twice as likely to cling to the spatula as to the block. Anyway, it's a pain, and I don't wish to ever use it again. I was considering Gelid as that seems very popular, but I kinda want something a little more exotic, like one of those metal pastes or DIAMONDS!!! After seeing the recommendations here I did a little research and found a thread where *people claimed that using the TIM would damage your components due to the hardness of diamonds. Is there any truth to that?* Do you have to take any special precautions when using it?


Not true at all. I'm pretty sure the people who claimed this have never used it.

When you remove the paste without a proper solvent, it can scratch the IHS or the die surface. That's all; it doesn't cause any damage.


----------



## dallas1990

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I was having black screens during gaming. Turning off ULPS helped with it some, but I got to watching my temps, as my Crossfire wasn't working properly and one of the cards was overheating... I repasted it and turned the fans up a bit and no more black screens so far. If you have a card you just bought I wouldn't recommend repasting, as removing the heatsink voids the warranty on most... but I would monitor the temps while gaming and see if they are getting high... my temps weren't extremely high, but it was making Crossfire not work properly and black-screening with temps around 90...


ULPS? Mine don't even hit 78-80 degrees. I thought it might have been my CPU overclock, so I turned it back to stock; so far so good. A 1000W gold-rated PSU should produce enough power for 2 Sapphire 290X Vapor-X cards in CF and a 4.8GHz AMD FX-8320, I thought


----------



## mfknjadagr8

Quote:


> Originally Posted by *dallas1990*
> 
> ULPS? Mine don't even hit 78-80 degrees. I thought it might have been my CPU overclock, so I turned it back to stock; so far so good. A 1000W gold-rated PSU should produce enough power for 2 Sapphire 290X Vapor-X cards in CF and a 4.8GHz AMD FX-8320, I thought


It might be borderline if you need a lot of volts for that overclock... also it depends a lot on the make and model of power supply... just because it says 1000W doesn't mean it will provide clean enough, constant peak power... I'm not saying that's your problem, but you should PM @shilka about your PSU and ask... I'm not very good at the power-delivery side of PSUs, but him and a couple of others know damn near everything there is to know about PSUs... ULPS is what AMD GPUs use to lower the clocks and voltages when idle... this apparently can cause a lot of issues with clocks and speeds changing, causing the black screens... it helped mine a lot... I was black-screening 2 or 3 minutes into an intensive game every time... with it off it went down to once every few hours... changing the fan profile to up the speeds has eliminated it altogether... so far anyhow


----------



## dallas1990

I will, but my PSU is an EVGA SuperNOVA 1000 G2 and the volts are 1.41


----------



## mfknjadagr8

Quote:


> Originally Posted by *dallas1990*
> 
> I will, but my PSU is an EVGA SuperNOVA 1000 G2 and the volts are 1.41


It should be good enough; I think I remember him endorsing that model, and those volts are pretty low... are you sure your CPU overclock is fully stable? If dropping the CPU back to stock is actually helping, the issue could have been a not-fully-stable overclock... you don't notice minor instability until it's stressed for a long time or under variable conditions... IF it is your CPU overclock, jump on over to http://www.overclock.net/t/1318995/official-fx-8320-fx-8350-vishera-owners-club/0_30

and they can help you sort out why your overclock wasn't stable


----------



## dallas1990

Yea it's stable but I'll redo it. I'm new with amd cards so idk if they could be the culprit since I just got them Saturday


----------



## Arizonian

Quote:


> Originally Posted by *taem*
> 
> I've had a 290 since release but I don't think I was ever added -- can you add me?
> 
> Heatkiller on.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sex. Crappy phone pix do not do it justice. I'm pretty sure I messed up the paste job tho.


You're #279 out of 546 members.









It's hard to find on that list, but start somewhere close to the middle; you're bound to find it.


----------



## DeviousAddict

Quote:


> Originally Posted by *dallas1990*
> 
> I actually have two Sapphire 290X Vapor-X OC 8GB cards in CF. MSI AB, GPU Tweak and Sapphire's program won't let me change the fan curve or speed. The Asus fail code is pADL_Overdrive5_FanSpeed_Get


I've got the same cards. You need to flick the switch on the side of the card to put the fans in manual mode. They default to auto, which is controlled by the GPU. Only the middle fan spins until it goes over 65C, then all the fans spin


----------



## dallas1990

Actually my problem was that and the amd driver. I re-download it and install it and it's fixed







but thank you


----------



## Alexbo1101

Going to sell my 4K monitor; replacing it will be a BenQ XL2730Z. Just hope my two 290Xs can push 144 fps at that resolution.

In other news; my 1300w PSU just arrived, time to OC!


----------



## dallas1990

Lol, I'm planning on a 1600W from EVGA. Best to just overkill it and be future-proof


----------



## fishingfanatic

For me, I dab, then spread with a plastic utility knife blade (the kind for scraping paint on delicate surfaces). A bag of 20 or so is pretty cheap.

I would use the Gelid, especially if it's the GC Extreme. Works really well!!

Of course you could always use Nutella. Just keep an eye on the expiry date... lol. Check it out: CoolerMaster tried it and it worked pretty darn well!!!









50 C

Go figure.









http://www.geek.com/chips/coolermaster-demonstrates-how-to-use-nutella-as-thermal-paste-1615280/

FF


----------



## Crouch

GPU-Z Validation


Spoiler: Warning: Spoiler!



Here are some pics:


----------



## MOSER91

Are the Sapphire r9 290x 8gb Tri-X OC cards reference design or custom?


----------



## taem

Quote:


> Originally Posted by *Arizonian*
> 
> You're #279 out of 546 members.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's hard to find on that list, but start somewhere close to the middle and you're bound to find it.


Oh lol. Sorry.


----------



## dallas1990

Quote:


> Originally Posted by *MOSER91*
> 
> Are the Sapphire r9 290x 8gb Tri-X OC cards reference design or custom?


custom


----------



## xTALBITx

I'm hoping someone here can give me some advice. I'm running dual Asus R9 290s in a serial EK water loop (and Crossfire), and my top card gets quite a bit hotter than the lower one.
During gaming, or while running the Heaven benchmark, the top card usually peaks around 76-81°C, while the lower card (which is being used just as much!) tops out at 55°C.
With them in a serial loop, you'd think the bottom card, receiving the warmed fluid from the top one, would be the hotter of the two. Not so!

PC specs: i7 4770K @ 4.3 GHz with 1.36 V
Dual Asus R9 290 (stock clocks!)
XSPC Photon 270 pump/res
Rigid 5/8" tubing
Bitspower fittings (I used 90s and no bends; I know it adds some restriction)
All EK-WB: EK Supremacy on the CPU, and full-cover EK R9 blocks on the GPUs
Flow is: pump, CPU, R9, R9 (serial), 2-bay XSPC rad, 3-bay XSPC rad, pump.

Things I have tried: switching the two cards around (the original "top" card stayed hot, even in the bottom position!).
Re-seating the EK block on the card (it was seated flawlessly!).
Stopped running three screens (was running 5760x1080); temps do not drop when using only one screen (at least not enough to worry about).
Disabling Crossfire and using only the top (hot) card; temps stay the same or creep up slightly.

I was told that the top (hot) block might have a clog in it. With them being in serial, wouldn't a clog in the first card affect flow to the second card? But how can I check that? If I crack open the block, it voids the warranty!! Can I just send it back??
Also, this block was purchased from the now-dead FrozenCPU in New York!!

HELP!! I've spent way too much on this PC for it to be running this hot!!


----------



## dallas1990

Does anyone know Sapphire's warranty policy on water cooling? I'd hate to void my warranty


----------



## DzillaXx

Quote:


> Originally Posted by *xTALBITx*
> 
> I'm hoping someone here can give me some advice. I'm running dual Asus R9 290s in a serial EK water loop (and Crossfire), and my top card gets quite a bit hotter than the lower one.
> During gaming, or while running the Heaven benchmark, the top card usually peaks around 76-81°C, while the lower card (which is being used just as much!) tops out at 55°C.
> With them in a serial loop, you'd think the bottom card, receiving the warmed fluid from the top one, would be the hotter of the two. Not so!
> 
> PC specs: i7 4770K @ 4.3 GHz with 1.36 V
> Dual Asus R9 290 (stock clocks!)
> XSPC Photon 270 pump/res
> Rigid 5/8" tubing
> Bitspower fittings (I used 90s and no bends; I know it adds some restriction)
> All EK-WB: EK Supremacy on the CPU, and full-cover EK R9 blocks on the GPUs
> Flow is: pump, CPU, R9, R9 (serial), 2-bay XSPC rad, 3-bay XSPC rad, pump.
> 
> Things I have tried: switching the two cards around (the original "top" card stayed hot, even in the bottom position!).
> Re-seating the EK block on the card (it was seated flawlessly!).
> Stopped running three screens (was running 5760x1080); temps do not drop when using only one screen (at least not enough to worry about).
> Disabling Crossfire and using only the top (hot) card; temps stay the same or creep up slightly.
> 
> I was told that the top (hot) block might have a clog in it. With them being in serial, wouldn't a clog in the first card affect flow to the second card? But how can I check that? If I crack open the block, it voids the warranty!! Can I just send it back??
> Also, this block was purchased from the now-dead FrozenCPU in New York!!
> 
> HELP!! I've spent way too much on this PC for it to be running this hot!!


Bad TIM job


----------



## chiknnwatrmln

Quote:


> Originally Posted by *xTALBITx*
> 
> I'm hoping someone here can give me some advice. I'm running dual Asus R9 290s in a serial EK water loop (and Crossfire), and my top card gets quite a bit hotter than the lower one.
> During gaming, or while running the Heaven benchmark, the top card usually peaks around 76-81°C, while the lower card (which is being used just as much!) tops out at 55°C.
> With them in a serial loop, you'd think the bottom card, receiving the warmed fluid from the top one, would be the hotter of the two. Not so!
> 
> PC specs: i7 4770K @ 4.3 GHz with 1.36 V
> Dual Asus R9 290 (stock clocks!)
> XSPC Photon 270 pump/res
> Rigid 5/8" tubing
> Bitspower fittings (I used 90s and no bends; I know it adds some restriction)
> All EK-WB: EK Supremacy on the CPU, and full-cover EK R9 blocks on the GPUs
> Flow is: pump, CPU, R9, R9 (serial), 2-bay XSPC rad, 3-bay XSPC rad, pump.
> 
> Things I have tried: switching the two cards around (the original "top" card stayed hot, even in the bottom position!).
> Re-seating the EK block on the card (it was seated flawlessly!).
> Stopped running three screens (was running 5760x1080); temps do not drop when using only one screen (at least not enough to worry about).
> Disabling Crossfire and using only the top (hot) card; temps stay the same or creep up slightly.
> 
> I was told that the top (hot) block might have a clog in it. With them being in serial, wouldn't a clog in the first card affect flow to the second card? But how can I check that? If I crack open the block, it voids the warranty!! Can I just send it back??
> Also, this block was purchased from the now-dead FrozenCPU in New York!!
> 
> HELP!! I've spent way too much on this PC for it to be running this hot!!


Either a poor TIM application or a clogged block.

Pull out the GPU, pull off the block, and check. If the TIM looks bad, replace it and see if that fixes it. If that doesn't work, open up the block and take a look. The blocks are easy to open with an Allen wrench; just keep track of all the parts and it will be easy to put back together. You can even dismantle the block without pulling it off the GPU, IIRC.

If any hair or whatever gets into your loop, the GPU block is the first thing it will be caught in. If you see some debris in there, just use a toothbrush to clean it out. You can also wash it with isopropyl alcohol, but make sure to rinse it well and then rinse it again with DI/distilled water before putting it back in your loop.


----------



## xTALBITx

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Either a poor TIM application or a clogged block.
> 
> Pull out the GPU, pull off the block an check. If the TIM looks bad, then replace it and see if that fixed it. If that doesn't work then open up the block and take a look. The blocks are easy to open up with an allen wrench, just keep track of all the parts and it will be easy to throw back together. You can even dismantle the block without pulling it off the GPU, iirc.
> 
> If any hair or whatever goes into your loop, the GPU block is the first thing it will be caught in. If you see some debris in there just use a toothbrush to clean it out. You can also wash it with iso alcohol, but make sure to rinse it well and then rinse it again with DI/distilled water before putting it back in your loop.


Would opening up the block void the warranty, though? (Not that I'm that opposed to doing so...)
And the TIM is flawless. (LOL, sad, I had to Google "TIM"; I'd never heard it referred to that way and wasn't sure what you were talking about!)

If this is what it's going to take, then so be it! For the money I have invested (pushing the $5000 mark now!), this temp is not good enough!

Thanks for the help, guys. I might tear it down this weekend. Will post back with results!


----------



## chiknnwatrmln

I think it does; honestly, you don't really need a warranty, though. A water block is literally just a hunk of copper with a piece of plastic screwed on and a couple of O-rings. The only thing that could really break is an O-ring, and those are incredibly cheap to replace.

Just be careful not to over-tighten, or else you can crack the plastic. Make it tight enough that you know it's secure, but don't hammer the thing down.


----------



## fishingfanatic

Yeah, I bought a bunch of O-rings, just in case. Very cheap, just like the man said, and it's not as though they take up much room.

















FF


----------



## Kaltenbrunner

What's the deal with Elpida memory vs. Hynix and the black-screen crashes that happen to some? Is that temperature-related, and so more common when Crossfiring?


----------



## pshootr

I was having an issue where Flash would cause my PC to freeze, and disabling the C6 state in the BIOS (so far) seems to have fixed it! All this time I was blaming it on the power-state switching of the GPU.









Needless to say, if you ever have issues with Flash causing your PC to freeze, try disabling C6-State in your BIOS (I have since enabled HA again).


----------



## Kittencake

Quote:


> Originally Posted by *pshootr*
> 
> I was having an issue where Flash would cause my PC to freeze, and disabling the C6 state in the BIOS (so far) seems to have fixed it! All this time I was blaming it on the power-state switching of the GPU.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Needless to say, if you ever have issues with Flash causing your PC to freeze, try disabling C6-State in your BIOS (I have since enabled HA again).


I never knew this... my lag spikes with Flash seem to have stopped with this. Who knew?


----------



## pshootr

Quote:


> Originally Posted by *Kittencake*
> 
> I never knew this .. my lag spikes with flash seemed to stop with this .. who knew


It seems that C6 (a CPU power-saving feature) was starving the GPU, so to speak. Hopefully my issue and yours will remain a thing of the past as a result of disabling it (fingers crossed).


----------



## kizwan

Quote:


> Originally Posted by *xTALBITx*
> 
> I'm hoping someone here can give me some advice. I'm running dual Asus R9 290s in a serial EK water loop (and Crossfire), and my top card gets quite a bit hotter than the lower one.
> During gaming, or while running the Heaven benchmark, the top card usually peaks around 76-81°C, while the lower card (which is being used just as much!) tops out at 55°C.
> With them in a serial loop, you'd think the bottom card, receiving the warmed fluid from the top one, would be the hotter of the two. Not so!
> 
> PC specs: i7 4770K @ 4.3 GHz with 1.36 V
> Dual Asus R9 290 (stock clocks!)
> XSPC Photon 270 pump/res
> Rigid 5/8" tubing
> Bitspower fittings (I used 90s and no bends; I know it adds some restriction)
> All EK-WB: EK Supremacy on the CPU, and full-cover EK R9 blocks on the GPUs
> Flow is: pump, CPU, R9, R9 (serial), 2-bay XSPC rad, 3-bay XSPC rad, pump.
> 
> Things I have tried: switching the two cards around (the original "top" card stayed hot, even in the bottom position!).
> Re-seating the EK block on the card (it was seated flawlessly!).
> Stopped running three screens (was running 5760x1080); temps do not drop when using only one screen (at least not enough to worry about).
> Disabling Crossfire and using only the top (hot) card; temps stay the same or creep up slightly.
> 
> I was told that the top (hot) block might have a clog in it. With them being in serial, wouldn't a clog in the first card affect flow to the second card? But how can I check that? If I crack open the block, it voids the warranty!! Can I just send it back??
> Also, this block was purchased from the now-dead FrozenCPU in New York!!
> 
> HELP!! I've spent way too much on this PC for it to be running this hot!!


I'm guessing the "hot" block may have a defect. It won't affect the next block, because the flow will return to normal after passing through the problematic block.


----------



## cplifj

Quote:


> Originally Posted by *pshootr*
> 
> I was having an issue where Flash would cause my PC to freeze, and disabling the C6 state in the BIOS (so far) seems to have fixed it! All this time I was blaming it on the power-state switching of the GPU.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Needless to say, if you ever have issues with Flash causing your PC to freeze, try disabling C6-State in your BIOS (I have since enabled HA again).


Actually, that shouldn't make any frikkin' sense at all.

The GPU is not the CPU. It seems more like a Flash bug, since Flash is one of the worst pieces of crapware out there.


----------



## xTALBITx

Quote:


> Originally Posted by *kizwan*
> 
> I'm guessing the "hot" block may have a defect. It won't affect the next block, because the flow will return to normal after passing through the problematic block.


And THIS is what worries me. If I crack it open and it turns out it IS a defect, my warranty is done!

So, to tear it apart and look for a clog, or not to? Anyone else got an opinion? So far it's 1-1, clogged vs. defective.


----------



## pshootr

Quote:


> Originally Posted by *cplifj*
> 
> actually that should not make any frikin sense at all.
> 
> gpu is not cpu. seems more like it is a flash-bug. since flash is one the worst pieces of crapware out there.


GPU is not CPU, errm, OK... Another member here noticed that Flash was freezing when he was using a lower LLC setting than usual (as an experiment), which demonstrates that the GPU is not the only thing that can make Flash unstable. And many, many websites use Flash; even OCN uses it to broadcast video at times, I believe. So without being able to use this, quote, "crapware", you become very limited in what you can do on the net.


----------



## kizwan

Quote:


> Originally Posted by *xTALBITx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I'm guessing the "hot" block may have a defect. It won't affect the next block, because the flow will return to normal after passing through the problematic block.
> 
> 
> 
> And THIS is what worries me. And if I crack it open, and it turns out it IS a defect, my warranty is Done!
> 
> So, to tear it apart and look for a Clog/block, or Not to!! Anyone else got an opinion. So far, it's 1-1. Clogged - Defective
Click to expand...

I don't think so. If the block is defective, you can take a picture and email it to EK. I'm pretty sure EK will replace it for you. Better to ask in the OCN Water Cooling Club thread.


----------



## Mega Man

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *xTALBITx*
> 
> I'm hoping someone here can give me some advice. I'm running dual Asus R9 290s in a serial EK water loop (and Crossfire), and my top card gets quite a bit hotter than the lower one.
> During gaming, or while running the Heaven benchmark, the top card usually peaks around 76-81°C, while the lower card (which is being used just as much!) tops out at 55°C.
> With them in a serial loop, you'd think the bottom card, receiving the warmed fluid from the top one, would be the hotter of the two. Not so!
> 
> PC specs: i7 4770K @ 4.3 GHz with 1.36 V
> Dual Asus R9 290 (stock clocks!)
> XSPC Photon 270 pump/res
> Rigid 5/8" tubing
> Bitspower fittings (I used 90s and no bends; I know it adds some restriction)
> All EK-WB: EK Supremacy on the CPU, and full-cover EK R9 blocks on the GPUs
> Flow is: pump, CPU, R9, R9 (serial), 2-bay XSPC rad, 3-bay XSPC rad, pump.
> 
> Things I have tried: switching the two cards around (the original "top" card stayed hot, even in the bottom position!).
> Re-seating the EK block on the card (it was seated flawlessly!).
> Stopped running three screens (was running 5760x1080); temps do not drop when using only one screen (at least not enough to worry about).
> Disabling Crossfire and using only the top (hot) card; temps stay the same or creep up slightly.
> 
> I was told that the top (hot) block might have a clog in it. With them being in serial, wouldn't a clog in the first card affect flow to the second card? But how can I check that? If I crack open the block, it voids the warranty!! Can I just send it back??
> Also, this block was purchased from the now-dead FrozenCPU in New York!!
> 
> HELP!! I've spent way too much on this PC for it to be running this hot!!
> 
> 
> 
> I'm guessing the "hot" block may have a defect. It won't affect the next block, because the flow will return to normal after passing through the problematic block.
Click to expand...

Two easy things:

1. It sounds like a mounting problem.

2. Clogs are easy to find, although I don't think you have one.

Make a hose (I'm on mobile; going to assume part of my post got eaten): use your tubing with a fitting on one end (a barb with a good pipe clamp, or a compression fitting); on the other end, put a hose-repair kit (female; I am talking about a garden hose).

Hook it up to your water heater and flush it through (or a garden hose; I prefer using warm water).
Then reverse it.

It will tell you if there is a restriction.


----------



## fishingfanatic

If you're not certain, add a couple of drops of food colouring and see if it instantly changes. Drain the water and check for a defect. Flush it, then refill with new water.

No big deal. Red or blue would be easiest to detect.

FF


----------



## JDC2389

I have a question I'm sure someone here can answer. I have a HIS R9 290 with the aftermarket IceQ cooler. I'm pretty freaking happy with it, but the VRM1 temps are not OK: 105°C is too high, and that's with only a slight 44 mV bump. Would any of the VRM heatsink or RAM-sink options fit under the IceQ cooler? The card is fine on air; the VRMs just need better heatsinks on them directly, IMO.


----------



## cplifj

Quote:


> Originally Posted by *pshootr*
> 
> GPU is not CPU, errm ok... Another member here noticed that Flash was freezing when he was using a lower LLC setting than usual (as an experiment), so this demonstrates that the GPU is not the only thing which can cause Flash to become unstable. And many many websites use Flash, even OCN uses it to broadcast video at times I believe. So without being able to use this "quote" crapware, you become very limited as to what you can do on the net.


Yes indeed, very limited. And what does Adobe do today?

They bring out an update to Flash that fixes a lot of known bugs, bugs we knew were there and that could be used to compromise any computer system that uses Flash.

(Now they've fixed some bugs, but I'm sure they will have left a backdoor somewhere... trust and blind faith are your worst enemies.)


----------



## BradleyW

Are there any 290X Crossfire owners around who own a copy of Dying Light? I have a couple of questions about the fps seen at the main screen before you launch into the actual game (where the guy is standing overlooking the city).

Thank you.


----------



## Maintenance Bot

Quote:


> Originally Posted by *BradleyW*
> 
> Are there any 290X Crossfire owners around who own a copy of Dying Light? I have a couple of questions about the fps seen at the main screen before you launch into the actual game (where the guy is standing overlooking the city).
> 
> Thank you.


Yes, I own Dying Light and I have a 290X Crossfire setup. I can check or compare something for you; I will be back home in about four hours, though.


----------



## BradleyW

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Yes, I own Dying Light and I have 290x crossfire setup. I can check or compare something for you, I will be back home in about 4 hours though.


Excellent thank you.

Here is the test location:


Spoiler: Warning: Spoiler!







Here are the settings:


Spoiler: Warning: Spoiler!








Could you check your fps in the test location for me please, for both single and multi GPU? You'll need to force HMA.exe for the CFX profile in CCC.

Thank you. +1.


----------



## Agent Smith1984

Probably going to be doing this:
http://forums.pcper.com/showthread.php?481802-Removing-the-heatsink-and-painting-your-TRI-X-R9-290

Because I just ordered this:
http://www.newegg.com/Product/Product.aspx?Item=N82E16813157577

Please don't flame the board; I get it, it's an ASRock, but read the specs, look at the caps and power-phase info, and notice that it will let me do 8x/8x CF, which is in the plans soon as well. I am being optimistic... I got it for $76.99 after rebate and promo code. Also looking at adding an 8310 or 8320E in the next month or two.









That board is new and built for the 220W chips, so overclocking the 8 series should be pretty successful.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Probably going to be doing this:
> http://forums.pcper.com/showthread.php?481802-Removing-the-heatsink-and-painting-your-TRI-X-R9-290
> 
> Cause I just ordered this:
> http://www.newegg.com/Product/Product.aspx?Item=N82E16813157577
> 
> Please no flaming the board, I get it... it's an ASRock, but read the specs, look at the caps and power phase info, and notice the fact that it will let me do 8x/8x CF, which is in the plans soon also. I am being optimistic.... I just got it for $76.99 after rebate and promo code. Also looking at adding an 8310 or 8320E in the next month or two.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That board is new and built for the 220W chips, so overclocking the 8 series should be pretty successful.


Pretty smexxy, that. If they sold those in red, I'd bet they'd sell even more. Never loved the yellow myself either.


----------



## supermi

Hey guys,

I am trying AMD for the first time since the 4870 days. I actually picked up a pair of PCS+ 8GB 290Xs and a pair of 980 Strix to see which I like for 3K and 4K gaming.

I am having an issue with the 290Xs: whenever I raise the voltage past around +75 mV, I get an effect under load where the screen flashes blank frames for a few seconds, then comes back. This is most evident in Firestrike Ultra but can also be seen in Valley. It seems to happen at something like +100 mV at stock clocks, or with enough core and memory OC on top of less added voltage.

Is this a normal current protection? The clocks are not throttling, so I am not sure whether this is normal or possibly a defect.
I'd love some feedback.

What kind of voltage and clocks can I shoot for? I am on air with the cooler at 100% at the moment, and will use some universal blocks once I can find my AMD adapters.









And hello, RED team. I hope you can help me keep them!


----------



## MrWhiteRX7

Quote:


> Originally Posted by *BradleyW*
> 
> Excellent thank you.
> 
> Here is the test location:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Here are the settings:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Could you check your fps in the test location for me please, for both single and multi GPU? You'll need to force HMA.exe for the CFX profile in CCC.
> 
> Thank you. +1.


That view distance setting is a nice strain on the CPU; not sure why you want it that far out. Either way, I can test that too when I get home.


----------



## Maintenance Bot

Quote:


> Originally Posted by *BradleyW*
> 
> Excellent thank you.
> 
> Here is the test location:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Here are the settings:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Could you check your fps in the test location for me please, for both single and multi GPU? You'll need to force HMA.exe for the CFX profile in CCC.
> 
> Thank you. +1.


Hey thanks. Here you go @BradleyW

Crossfire off: 1030 MHz


Crossfire on: 1030 MHz, 1030 MHz


----------



## zealord

Has anyone played The Evil Within DLC, The Assignment, yet? That game is somehow heavily stressing my card. In other games I get around 75°C under full load, but in that game I hit 82°C and my GPU fan goes bonkers









R9 290X Tri-X


----------



## BradleyW

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Hey thanks. Here you go @BradleyW
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Crossfire off. 1030mhz
> 
> 
> Crossfire on. 1030mhz, 1030mhz


Thank you for this. Another +1.

I ran the test also.
My results are a little "odd".

CFX OFF = 69 FPS
CFX ON = 109 to 114 FPS (A bit jumpy)

1030/1250.

It would seem I'm 2fps behind on a single GPU compared to your result of 71.
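For what it's worth, those numbers work out to fairly typical Crossfire scaling. A minimal sketch of the arithmetic, using the FPS figures quoted above (the function name is just illustrative, not from any tool):

```python
# Rough Crossfire scaling check using the FPS figures quoted above.
# Scaling efficiency = extra FPS gained from the second GPU,
# relative to a perfect 2x doubling of the single-GPU result.

def cfx_scaling(single_fps: float, cfx_fps: float, gpus: int = 2) -> float:
    """Return scaling efficiency as a fraction of ideal N-GPU scaling."""
    ideal = single_fps * gpus
    return (cfx_fps - single_fps) / (ideal - single_fps)

single = 69.0
for cfx in (109.0, 114.0):
    eff = cfx_scaling(single, cfx)
    print(f"{single:.0f} -> {cfx:.0f} FPS: {eff:.0%} of ideal 2-way scaling")
# 69 -> 109 FPS is about 58% of ideal, 69 -> 114 FPS about 65%
```

So the "jumpy" 109-114 FPS range corresponds to roughly 58-65% scaling from the second card in that scene.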


----------



## DzillaXx

Quote:


> Originally Posted by *BradleyW*
> 
> Thank you for this. Another +1.
> 
> I ran the test also.
> My results are a little "odd".
> 
> CFX OFF = 69 FPS
> CFX ON = 109 to 114 FPS (A bit jumpy)
> 
> 1030/1250.
> 
> It would seem I'm 2fps behind on a single GPU compared to your result of 71.


Make sure you turn up your PowerTune power limit; might as well max it at +50%, IMO.

Better drivers will come along to smooth things out.


----------



## Maintenance Bot

Quote:


> Originally Posted by *BradleyW*
> 
> Thank you for this. Another +1.
> 
> I ran the test also.
> My results are a little "odd".
> 
> CFX OFF = 69 FPS
> CFX ON = 109 to 114 FPS (A bit jumpy)
> 
> 1030/1250.


Hey, thanks again. My memory was at 1375 as well.
I wonder if Dying Light will be updated next week alongside the 15.2 beta driver?


----------



## BradleyW

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Hey thanks again. My memory was at 1375 also.
> I wonder if Dying Light will be updated next week with the 15.2 beta driver?


It will. I can confirm both single and mGPU improvements.

Edit: I just retested at 1030/1375 and I now get 71fps like you. First game I've seen that benefits from VRAM overclocking on these 290X's.


----------



## Maintenance Bot

Quote:


> Originally Posted by *BradleyW*
> 
> It will. I can confirm both single and mGPU improvements.
> 
> Edit: I just retested at 1030/1375 and I now get 71fps like you. First game I've seen that benefits from VRAM overclocking on these 290X's.


Sweet, good deal, looking forward to it.


----------



## kizwan

Quote:


> Originally Posted by *supermi*
> 
> Hey guys,
> 
> I am trying AMD for the first time since the 4870 days. I actually picked up a pair of PCS+ 8GB 290Xs and a pair of 980 Strix to see which I like for 3K and 4K gaming.
> 
> I am having an issue with the 290Xs: whenever I raise the voltage past around +75 mV, I get an effect under load where the screen flashes blank frames for a few seconds, then comes back. This is most evident in Firestrike Ultra but can also be seen in Valley. It seems to happen at something like +100 mV at stock clocks, or with enough core and memory OC on top of less added voltage.
> 
> Is this a normal current protection? The clocks are not throttling, so I am not sure whether this is normal or possibly a defect.
> I'd love some feedback.
> 
> What kind of voltage and clocks can I shoot for? I am on air with the cooler at 100% at the moment, and will use some universal blocks once I can find my AMD adapters.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And hello, RED team. I hope you can help me keep them!


I don't get that with my cards. Pretty sure that's not normal. Did you increase the power limit to +50%?


----------



## supermi

Quote:


> Originally Posted by *kizwan*
> 
> I don't get that with my cards. Pretty sure that's not normal. Did you increase the power limit to +50%?


Yup, I increased the power limit to +50%.


----------



## tsm106

Quote:


> Originally Posted by *supermi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I don't get that with my cards. Pretty sure that's not normal. Did you increase the power limit to +50%?
> 
> 
> 
> Yup, I increased the power limit to +50%.
Click to expand...

What port are you using, DP or HDMI?

It reads like the IMC is hitting a wall, aka black screens. You could try flip-flopping the cards and see if the other card is more resilient to black-screening. BTW, HDMI is more resistant to black screens, but the downside is no 4K@60Hz.


----------



## supermi

I used 4K 60 Hz over DP, and three 1080p monitors in portrait surround (two on DVI and one on a DP-to-dual-link active converter).

It happened when voltage was applied... It did something similar on the desktop when I first installed the cards, so I switched them, and then it only did it under load... I have shut off the one suspect card and it still happened.

Though it seems to happen only with extra voltage and in certain circumstances


----------



## tsm106

Quote:


> Originally Posted by *supermi*
> 
> I used 4K 60 Hz over DP, and three 1080p monitors in portrait surround (two on DVI and one on a DP-to-dual-link active converter).
> 
> It happened when voltage was applied... It did something similar on the desktop when I first installed the cards, so I switched them, and then it only did it under load... I have shut off the one suspect card and it still happened.
> 
> Though it seems to happen only with extra voltage and in certain circumstances


You can confirm it by running FSU (Firestrike Ultra) over HDMI. If it doesn't black-screen, then you know it's the IMC/port issue; some IMCs are better than others. If it doesn't black-screen, flip-flop the cards and try the other one.


----------



## supermi

Quote:


> Originally Posted by *tsm106*
> 
> You can confirm it by running FSU (Firestrike Ultra) over HDMI. If it doesn't black-screen, then you know it's the IMC/port issue; some IMCs are better than others. If it doesn't black-screen, flip-flop the cards and try the other one.


Thanks, I will take a look, though I find it odd that it would blank both the DVI and the DP outputs in Eyefinity as one; but I suppose Eyefinity makes them one monitor anyway.
Also, why would extra core voltage cause this? Any thoughts on that?


----------



## tsm106

Quote:


> Originally Posted by *supermi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> You can confirm it by running FSU (Firestrike Ultra) over HDMI. If it doesn't black-screen, then you know it's the IMC/port issue; some IMCs are better than others. If it doesn't black-screen, flip-flop the cards and try the other one.
> 
> 
> 
> Thanks I will take a look, though I find it odd that it would make both the DVI along with the DP blank in eyefinity as 1 but I suppose that eyefinity makes it 1 monitor anyway.
> Also why would extra core voltage cause this? any thoughts on that
Click to expand...

We don't really know; we have the shared experiences and observations. HDMI is very resistant to black screens while the other ports are not. Some cards are beasts and suffer very little black-screening, and some are magnets for it. Generally speaking, the cards that are resistant are great overclockers, though you'd never know whether a black-screening card could clock higher, because you can't see past the black screens. Something like that... so the assumption is that it's tied to the quality of the IMC.


----------



## LandonAaron

I noticed in Dying Light that if I just load up the game, I get about 70% GPU utilization and around 48 FPS with vsync on (monitor refresh rate = 96 Hz at 2560x1440). If I tab out to the desktop, switch to 60 Hz, switch back to 96 Hz, then tab back into the game, I get 100% GPU utilization and 70 FPS. This is with everything maxed but view distance set to about 25%. (Single R9 290X + i7-4790K.)
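Worth noting: 48 is exactly half of 96, which is the classic double-buffered vsync pattern. When a frame misses the refresh deadline, the rate drops to an integer divisor of the refresh rate rather than degrading smoothly. A small sketch of that arithmetic (assuming double buffering, which the post doesn't confirm):

```python
# With double-buffered vsync, a frame that misses one refresh interval
# waits for the next one, so the frame rate drops to an integer divisor
# of the refresh rate (96 -> 48 -> 32 -> ...), not a smooth in-between value.

def vsync_fps(refresh_hz: float, frame_time_ms: float) -> float:
    """Effective FPS under double-buffered vsync for a given render time."""
    interval_ms = 1000.0 / refresh_hz
    # Number of whole refresh intervals each frame occupies (at least 1).
    intervals = max(1, -(-frame_time_ms // interval_ms))  # ceiling division
    return refresh_hz / intervals

# A 96 Hz refresh gives a ~10.4 ms budget per frame.
print(vsync_fps(96, 10.0))  # within budget -> 96.0
print(vsync_fps(96, 11.0))  # misses one interval -> 48.0
```

That would explain the hard 48 FPS cap at low GPU utilization; the refresh-rate toggle presumably kicks the game out of whatever state was making frames miss the 10.4 ms deadline.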


----------



## supermi

Quote:


> Originally Posted by *tsm106*
> 
> We don't really know; we have the shared experiences and observations. HDMI is very resistant to black screens while the other ports are not. Some cards are beasts and suffer very little black-screening, and some are magnets for it. Generally speaking, the cards that are resistant are great overclockers, though you'd never know whether a black-screening card could clock higher, because you can't see past the black screens. Something like that... so the assumption is that it's tied to the quality of the IMC.


Well, I think you're right.

I am on my Eyefinity setup, with the DP only driving a single 120 Hz 1080p monitor and the other two off the DVI. I pushed the voltage up to +73 and am at 1150 MHz core and 1500 VRAM with no issue. This is good if I keep this setup, but I am worried about having issues if I get either a 120 Hz 1440p monitor or a 4K 60 Hz one ... hmm, I wonder if I could RMA at that time? Or just stick with Nvidia. I am liking these cards more now that they are not blanking out on me!

BTW, I have a bunch of MCW82 universal blocks. If I go with the 980 Strix I have, I can easily mount them, but I am not sure what mounting plate I need for the 290X. Anybody here have an idea?


----------



## Arizonian

Quote:


> Originally Posted by *Crouch*
> 
> GPU-Z Validation
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Here are some pics:


Congrats - added


----------



## LandonAaron

My card either blanks or gets double vision if I run at 1440p with voltage > +100. At 1080p I can run 1200/1600 at +200 mV, but at 1440p I top out at 1150/1500 at +100 mV. I don't know why, but voltage seems to be the main thing that sends it blanking.


----------



## JDC2389

Monitor the VRM temps; my VRMs heat up a lot more when running a higher resolution. With VSR 3200x1800 in Dragon Ball Xenoverse at a +44 mV bump, 1100/1400, and a very aggressive fan profile, VRM1 is a little under 100°C.


----------



## supermi

I had another blank at +100 mV, 1175 core and 1500 VRAM.

I was using Eyefinity so I was able to go further, but it still happened.

I was gaming at 1150/1500 at +50 mV for hours with no issue; it seems to happen more in Firestrike Ultra than in other areas.

VRMs in Firestrike are usually in the 50s; in a long BF4 session I saw the VRM on the hot card hit the low 80s with no issues.

Funny enough, in BF4, DX11 was giving me a lot more FPS than Mantle. In Afterburner it was showing my GPU 2's clocks throttling and GPU usage very low and erratic for both, while DX11 was pretty perfect. (Yes, I did restart after changing the setting.)

So far Mantle seems a bust, and those blanking screens make it unlikely I can OC as high as I'd like ... It does feel like a budget card as opposed to a good deal at the moment.
Plus:
Titanfall crossfire was horrid.
SOM did benefit from the 8 GB per card, but the frametimes were not great, and in certain situations the lighting really FLICKERS!
So far, more time problem solving than playing.

I will try my Strix 980s again tomorrow for a few days, then come back to these 290Xs, hopefully with universal blocks if I can get my MCW82s on them.


----------



## kizwan

That sucks. I wouldn't be happy either if it happened to mine.


----------



## rdr09

Quote:


> Originally Posted by *supermi*
> 
> I had another blank at +100 mV, 1175 core and 1500 VRAM.
>
> I was using Eyefinity so I was able to go further, but it still happened.
>
> I was gaming at 1150/1500 at +50 mV for hours with no issue; it seems to happen more in Firestrike Ultra than in other areas.
>
> VRMs in Firestrike are usually in the 50s; in a long BF4 session I saw the VRM on the hot card hit the low 80s with no issues.
>
> Funny enough, in BF4, DX11 was giving me a lot more FPS than Mantle. In Afterburner it was showing my GPU 2's clocks throttling and GPU usage very low and erratic for both, while DX11 was pretty perfect. (Yes, I did restart after changing the setting.)
>
> So far Mantle seems a bust, and those blanking screens make it unlikely I can OC as high as I'd like ... It does feel like a budget card as opposed to a good deal at the moment.
> Plus:
> Titanfall crossfire was horrid.
> SOM did benefit from the 8 GB per card, but the frametimes were not great, and in certain situations the lighting really FLICKERS!
> So far, more time problem solving than playing.
>
> I will try my Strix 980s again tomorrow for a few days, then come back to these 290Xs, hopefully with universal blocks if I can get my MCW82s on them.


You still need to restart when switching between DX11 and Mantle? I don't anymore. Also, I play Titanfall with two 290s at 4K with no issues.


----------



## supermi

Quote:


> Originally Posted by *rdr09*
> 
> You still need to restart when switching between DX11 and Mantle? I don't anymore. Also, I play Titanfall with two 290s at 4K with no issues.


If by no issues you mean poor GPU usage, then yes it is.

As for Mantle, performance seems half that of DX11, so restart or not, it is utter trash in BF4 and crossfire...

I was so ready to go AMD, my goodness. I guess I will give them one more try if the drivers do launch on the 19th.

If I am missing something, feel free to share the info. I hope it is user error or something to that effect!


----------



## rdr09

Quote:


> Originally Posted by *supermi*
> 
> If by no issues you mean poor GPU usage, then yes it is.
>
> As for Mantle, performance seems half that of DX11, so restart or not, it is utter trash in BF4 and crossfire...
>
> I was so ready to go AMD, my goodness. I guess I will give them one more try if the drivers do launch on the 19th.
>
> If I am missing something, feel free to share the info. I hope it is user error or something to that effect!


No issues means no issues. Poor GPU usage is an issue.









BTW, Mantle and AB don't mix. The same is true with Fraps and Mantle; it will cause hitching.

Is it a clean install of the OS? If you had Nvidia in the same system in the past, then I suggest this . . .

http://www.overclock.net/t/1150443/how-to-remove-your-nvidia-gpu-drivers

this, too . . .

http://www.overclock.net/t/1265543/the-amd-how-to-thread


----------



## tsm106

Dude, if you get BS while overclocking, it's not a symptom of a defect; it's because you are overclocking. You're nearing the limits of that chip. You shouldn't act surprised. Either improve the cooling or accept it. Now if you BS at stock, then you have a defect issue. Btw, titanfail runs like ass for me too, or maybe that's just what it seems like imo. It scales ok but it just doesn't feel right; the engine runs like ass.

On the other hand, I fire up Grid AS and am reminded how awesome a well-coded game is. I get perfect scaling in quad, 98-99% pegged throughout. Also, Mantle BF4 runs pretty great for me.


----------



## DzillaXx

Quote:


> Originally Posted by *tsm106*
> 
> *Dude, if you get BS while overclocking, it's not a symptom of a defect*; it's because you are overclocking. You're nearing the limits of that chip. You shouldn't act surprised. Either improve the cooling or accept it. Now if you BS at stock, then you have a defect issue. Btw, titanfail runs like ass for me too, or maybe that's just what it seems like imo. It scales ok but it just doesn't feel right; the engine runs like ass.
>
> On the other hand, I fire up Grid AS and am reminded how awesome a well-coded game is. I get perfect scaling in quad, 98-99% pegged throughout. Also, Mantle BF4 runs pretty great for me.


Couldn't it be a sign of bad or crappy memory too?

Seeing that upping the Aux Voltage can sometimes keep that from happening.

Though I have yet to see the 290's Black Screen problem myself.


----------



## tsm106

Memory limits manifest as lost fps; then, if you continue pushing, you get artifacts. It's due to the error correction: it will re-render frames until they are good, and this re-rendering equates to lost performance, i.e. fps. That's how you know you've gone too far.


----------



## leetmode

Hey guys, I'm running a pair of Tri-X 290Xs and managed to overclock the core to 1100 MHz without touching voltage. Now I'm wondering how much I should up the voltage in order to increase the core clock even more, and maybe the memory clock as well (I'm at 1250 now but can't put it any higher without experiencing problems). Also, is there a certain voltage limit that I shouldn't pass for safety reasons? Both cards are air cooled at the moment.


----------



## disintegratorx

Quote:


> Originally Posted by *leetmode*
> 
> Hey guys, I'm running a pair of Tri-X 290Xs and managed to overclock the core to 1100 MHz without touching voltage. Now I'm wondering how much I should up the voltage in order to increase the core clock even more, and maybe the memory clock as well (I'm at 1250 now but can't put it any higher without experiencing problems). Also, is there a certain voltage limit that I shouldn't pass for safety reasons? Both cards are air cooled at the moment.


From my experience, I couldn't even get my card to clocks that high, but just curious, what voltages are you at right now? Also, what app are you using to OC?
I had mine running stable at around 1226/1553, and that was the highest I could go without side effects, using MSI AB with +83 on the voltage slider and about +150 with the batch-file add-in for MSI. It worked great at those clocks, but any more on the mem side and I'd get a drop in FPS, and crashing above 1226 on the core side.
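For reference, the extended voltage/clock limits in MSI Afterburner of that era were usually enabled through the unofficial overclocking switch in `MSIAfterburner.cfg` rather than a batch file. A sketch of the commonly cited settings follows; the exact key names can vary between Afterburner versions, so treat this as an assumption about the general mechanism, not the poster's exact method:

```ini
; MSIAfterburner.cfg - extend AMD overclocking limits
; Close Afterburner before editing this file.
[ATIADLHAL]
; 1 = unofficial overclocking with PowerPlay support, 2 = without PowerPlay
UnofficialOverclockingMode = 1
; The EULA line must be typed out in full to confirm you accept the risk
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
```

After restarting Afterburner, the core clock and voltage sliders pick up the wider ranges; overclocks applied this way bypass the driver's normal limits, so the usual stability and VRM-temperature caveats apply.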

Also, I'm just checking in, reading a few posts. I'm about to get my system back up and running again soon, so I just thought I'd chime in and say what's up. New games coming out, and lots of them. Looking forward to that! And also getting in some alone time with some BF4, lol. I miss it! And I haven't played the last expansion yet. Ohh yeah!..


----------



## MrWhiteRX7

Mantle in BF4 with all 3 GPUs runs beautifully for me as well. Just wanted to put that out there; not sure why Mantle would be giving you half the FPS. Make sure you aren't maxing out MSAA. I think there's still a mild memory leak, maybe? I just use a higher res.


----------



## supermi

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Mantle in BF4 with all 3 GPUs runs beautifully for me as well. Just wanted to put that out there; not sure why Mantle would be giving you half the FPS. Make sure you aren't maxing out MSAA. I think there's still a mild memory leak, maybe? I just use a higher res.


No, it's not that. The 8 GB per card is not filled; it's GPU usage and clock speed that are not steady.

I will check drivers and a few other things, as it seems others are not having that issue at all.

Quote:


> Originally Posted by *tsm106*
> 
> Dude, if you get BS while overclocking, it's not a symptom of a defect; it's because you are overclocking. You're nearing the limits of that chip. You shouldn't act surprised. Either improve the cooling or accept it. Now if you BS at stock, then you have a defect issue. Btw, titanfail runs like ass for me too, or maybe that's just what it seems like imo. It scales ok but it just doesn't feel right; the engine runs like ass.
>
> On the other hand, I fire up Grid AS and am reminded how awesome a well-coded game is. I get perfect scaling in quad, 98-99% pegged throughout. Also, Mantle BF4 runs pretty great for me.


This is not a black screen from overclocking, bro. It is an intermittent blank screen; I had it on the desktop with one card as the first GPU, and with the other card as the first GPU only with added voltage/power draw.

This seems related to the DP. Whether or not it is a defect, others have issues with the DP on the 2** series, so it could be a mild design defect.

It happens at stock with one and with added voltage on the other.

And dude, read what I wrote and look at my sig. This is not an overclocking or cooling problem.

As for Titanfall, it is simply a scaling issue for me... It does not scale at all, is the issue.

My goodness, Mantle works for others, not me; Titanfall works for a few and not others, but in different ways!

In addition to normal artifacts, crashes, and error-correcting slowdowns, you get blinking screens at certain resolutions/voltages, but only on some cards... Then you get black screens, luckily not on mine... There are so many issues that people trying to help get them mixed up.

Seriously, this is in 2 days with these cards. Is this what you all have been going through, and 4 months with zero updates!??


----------



## MrWhiteRX7

Quote:


> Originally Posted by *supermi*
> 
> No, it's not that. The 8 GB per card is not filled; it's GPU usage and clock speed that are not steady.
>
> I will check drivers and a few other things, as it seems others are not having that issue at all.


Oops, sorry. I will say there were a couple of beta drivers that ruined Mantle and multi-GPU for me, but the Omegas are running perfect. No stutters, hitches, or really any variance in my frametimes (locked above refresh at 100 Hz). Native 1440p, running the game at 125% resolution on ultra (no AA). Even 64-man Shanghai or Sunken Dragon, no dips.







It's actually quite amazing how well the game runs now!


----------



## supermi

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Oops, sorry. I will say there were a couple of beta drivers that ruined Mantle and multi-GPU for me, but the Omegas are running perfect. No stutters, hitches, or really any variance in my frametimes (locked above refresh at 100 Hz). Native 1440p, running the game at 125% resolution on ultra (no AA). Even 64-man Shanghai or Sunken Dragon, no dips.
> 
> 
> 
> 
> 
> 
> 
> It's actually quite amazing how well the game runs now!


I am still hoping I can figure it out. I expected more in SOM as well (not Mantle, I know).

I will share an update if I can sort it out.


----------



## tsm106

Quote:


> Originally Posted by *supermi*
> 
> This seems related to the DP. Whether or not it is a defect, others have issues with the DP on the 2** series, so it could be a mild design defect.
>
> Seriously, this is in 2 days with these cards. Is this what you all have been going through, and 4 months with zero updates!??


Did you test on hdmi?

DP is finicky at 4K. It's finicky on differing platforms as well. You've got a few/lot of minor but aggravating issues, and it will only get worse if you can't isolate them. Another factor is how you configure the cards in regard to overclocking, and the overclocking apps, which can have a major impact.

I don't have issues overall, and none of the flakiness you are experiencing, though I'm not using 4K through DP; I am running 144 Hz @ 5760. I set my cards up in a specific way that works for me. I've written a guide on my methodology; it was linked to you by rdr. You might want to take a look at it.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *supermi*
> 
> I am still hoping I can figure it out. I expected more in SOM as well (not Mantle, I know).
>
> I will share an update if I can sort it out.


Oh man, I played SOM from release until maybe a week after. I literally couldn't stop and just played the hell out of it till the end, and did a bunch of side stuff, but haven't touched it since. At that time I remember using an Assassin's Creed xfire profile? Only two GPUs though. With three it just did weird things, lol. It was very smooth, though not consistently at my refresh, so I believe I locked it down a tad under. GPU utilization was ehhhh.

No clue if it's since gotten better or not. Please let us know!


----------



## tsm106

Something to keep in mind with Gameworks titles is that most of them obfuscate the crossfire code, preventing AMD from optimizing. It's par for the course with GW titles.


----------



## supermi

Quote:


> Originally Posted by *tsm106*
> 
> Something to keep in mind with Gameworks titles is that most of them obfuscate the crossfire code, preventing AMD from optimizing. It's par for the course with GW titles.


That is part of why I wanna go AMD. I like the experience of NVIDIA, but not at the price of harming the industry for us consumers long term!

I'm gonna gut the drivers and clean the registry and try again... If needed, a fresh OS install. BUT, I hate leaving the SS Phase running while redownloading all my games ...

At least I'm taking the CPU out of the equation in comparing cards.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *tsm106*
> 
> Something to keep in mind with Gameworks titles is that most of them obfuscate the crossfire code, preventing AMD from optimizing. It's par for the course with GW titles.


The sad truth... It's pretty funny, though, when I get higher FPS in those games than a good bit of my Nvidia-using friends







I'm just attacking it with brute force though I guess LOL


----------



## leetmode

Hey guys I desperately need some help.

Long story short, I made a horrible decision and bought the game Titanfall, and it pretty much messed up my gaming setup. I had to disable crossfire in order to get Titanfall to run correctly, but when I tried to run other games with it back on, crossfire no longer worked properly. I'm trying to get everything back into working order, but I am having a difficult time doing so.

I am running two 290Xs with the 14.12 Omega drivers, and I want to uninstall and reinstall them. But now, for some reason, I cannot for the life of me find the option to uninstall them in "Uninstall or change a program" or with Revo Uninstaller. I can't find anything labeled "ATI", "AMD", "CCC", or absolutely anything even referring to the drivers. When I try to run the installation file, it only gives me the option to install, not uninstall. All these problems started after I tried installing and playing Titanfall; prior to this, everything was perfect.

Does anyone have any ideas on how I can remove the drivers at this point?


----------



## battleaxe

Quote:


> Originally Posted by *leetmode*
> 
> Hey guys I desperately need some help.
> 
> Long story short, I made a horrible decision and bought the game Titanfall, and it pretty much messed up my gaming setup. I had to disable crossfire in order to get Titanfall to run correctly, but when I tried to run other games with it back on, crossfire no longer worked properly. I'm trying to get everything back into working order, but I am having a difficult time doing so.
>
> I am running two 290Xs with the 14.12 Omega drivers, and I want to uninstall and reinstall them. But now, for some reason, I cannot for the life of me find the option to uninstall them in "Uninstall or change a program" or with Revo Uninstaller. I can't find anything labeled "ATI", "AMD", "CCC", or absolutely anything even referring to the drivers. When I try to run the installation file, it only gives me the option to install, not uninstall. All these problems started after I tried installing and playing Titanfall; prior to this, everything was perfect.
>
> Does anyone have any ideas on how I can remove the drivers at this point?


Use DDU (Display Driver Uninstaller). I and lots of others here use it. Flawless every time for me.


----------



## leetmode

Quote:


> Originally Posted by *battleaxe*
> 
> Use DDU (Display Driver Uninstaller). I and lots of others here use it. Flawless every time for me.


Worked perfectly, thanks so much, rep'd!


----------



## pengs

I'm going CF with two 290Xs. What drivers should I use out of the gate?


----------



## dade_kash_xD

Is there any way to run 4K on 2x R9 290 crossfire? I know there have been mentions of using DisplayPort → HDMI 2.0 active adapters, but I can't find any. I just bought a 4K Vizio P552ui for $799 at TigerDirect and would love to use the 1080p 120 Hz or 4K 60 Hz feature with my current setup.


----------



## tsm106

Quote:


> Originally Posted by *pengs*
> 
> I'm going CF with two 290Xs. What drivers should I use out of the gate?


The latest Omega 14.12 drivers.

Quote:


> Originally Posted by *dade_kash_xD*
> 
> Is there any way to run 4K on 2x R9 290 crossfire? I know there have been mentions of using *DisplayPort → HDMI 2.0 active adapters*, but I can't find any. I just bought a 4K Vizio P552ui for $799 at TigerDirect and would love to use the 1080p 120 Hz or 4K 60 Hz feature with my current setup.


Not possible. There is no such adapter. Only the Panasonic 4k 65" has built in DP.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *tsm106*
> 
> The latest Omega 14.12 drivers.
> Not possible. There is no such adapter. Only the Panasonic 4k 65" has built in DP.


Still on Catalyst 14.9 ......

I is lazy









Also Sammy 28" UHD has DP . But I wish it had 3 DP's .

Done my fair share of Benchmarking









@alancsalt
That's BS , Black Screen benchmarking BTW not what you edited .... honest









From now on I will put a ' / ' between the B and the S

You thought I was swearing


----------



## supermi

I cleared out the registry, etc., and re-installed the Omega drivers afterwards.

SOM has exactly the same horrible frametime variance.
Titanfall is STILL half the FPS it should be. I can force friendly AFR and I get 120 FPS SOLID at MSAA x2 and 100
BF4 is the same as before at 1080p portrait surround: DX11 100% scaling, 98-160 FPS, usually 110-130; Mantle 70-80 FPS, same settings, same map, SAME MATCH.

Also, to whoever says they do not restart after choosing Mantle in BF4: you need to, or the API does not switch and you get no change in FPS.

I had MSI Afterburner turned off for the test and relied on the in-game FPS counter ...

This was after clearing out all old Nvidia and AMD junk and freshly re-installing the Omega drivers on Windows 8.1. I suppose I could do a full driver reinstall, but I bet the results would be the same, and I can also bet that if I leave the AMD drivers on there and put the 980s back in, they will work just the same as before.

I would love any ideas, but I think I will put these cards in a box and wait till the 19th. If AMD actually releases new drivers, I will try again; if not, I'm returning these. Also, I was really going to get the 390X, especially with the 8 GB RAM it looks to have, but seeing this, I think I might pay whatever NVIDIA wants for a product that works and can also scale smoothly if I use two of them.








and







I wanted to use AMD, and am still gonna try, but they are making it difficult to give them $$$


----------



## tsm106

Quote:


> Originally Posted by *supermi*
> 
> 
> 
> 
> 
> 
> 
> 
> and
> 
> 
> 
> 
> 
> 
> 
> I wanted to use AMD, and am still gonna try, but they are making it difficult to give them $$$


PPL have given advice, I've given advice but really I don't think you've read any of it.


----------



## supermi

Quote:


> Originally Posted by *tsm106*
> 
> PPL have given advice, I've given advice but really I don't think you've read any of it.


WTH

I did; it did not change anything.

I do thank you for your replies, of course, but they are basic and already the things I do ... including cleaning the registry.

This has nothing to do with overclocking, Afterburner, etc. This has to do with crossfire itself.

If you have screenshots of near-constant ~100% crossfire GPU usage at 4K in Titanfall or SOM, or some proof of good frametimes, please share, 'cause all I'm hearing from you is "no issues, works perfectly."

I know what smooth gaming is, and it works fine...

Mantle is not working well. I have no idea why; perhaps my CPU OC negates any copy overhead for the DX11 pathway, and it pulls ahead of Mantle?

Your answers have been judgmental, basic, and only applicable to helping me in the most tangential of ways.

If I have cleaned drivers in the way you mentioned, followed instructions for installing the Omega drivers, and in Mantle's case followed instructions on switching APIs, I would love specifics on the dumb mistake(s) I am making.


----------



## rdr09

Quote:


> Originally Posted by *supermi*
> 
> WTH
>
> I did; it did not change anything.
>
> I do thank you for your replies, of course, but they are basic and already the things I do ... including cleaning the registry.
>
> This has nothing to do with overclocking, Afterburner, etc. This has to do with crossfire itself.
>
> If you have screenshots of near-constant ~100% crossfire GPU usage at 4K in Titanfall or SOM, or some proof of good frametimes, please share, 'cause all I'm hearing from you is "no issues, works perfectly."
>
> I know what smooth gaming is, and it works fine...
>
> Mantle is not working well. I have no idea why; perhaps my CPU OC negates any copy overhead for the DX11 pathway, and it pulls ahead of Mantle?
>
> Your answers have been judgmental, basic, and only applicable to helping me in the most tangential of ways.
>
> If I have cleaned drivers in the way you mentioned, followed instructions for installing the Omega drivers, and in Mantle's case followed instructions on switching APIs, I would love specifics on the dumb mistake(s) I am making.


Stick with the 980s. Either those or Xbox.


----------



## tsm106

Rdr only posted my guide to setting up AMD GPUs 101. I don't think you even read his post, shrugs.


----------



## supermi

I did, and the guides.

Xbox, bro? Wow. If your feelings are hurt due to the drivers not working in a reasonable manner, no need to insult. LOL, XBOX, EWE.

I am totally sure my old drivers are gone, registry clean, etc. I will recheck the install guide (which I already read) to see if there is something I missed or skipped or otherwise messed up.

I would prefer that I am messing something up, of course, but so far, from my perspective, it seems like issues with crossfire.


----------



## HOMECINEMA-PC

You could run an older driver as I do, but I think it's the games you're running that don't like CF. That's a possibility.


----------



## supermi

The only thing I can see is that I am using the unofficial overclocking options in AB.

Though I get the same issues at stock with Afterburner off.

I uninstalled drivers as prescribed... and installed the new Omega.

What would I be missing that would cause Titanfall to have such poor GPU usage unless I force friendly AFR?

Or Mordor to have flickering in some lighting, or high variance in frametimes?

Again, all this was the same before and after doing the drivers the right way, with AB on or off, at stock and overclocked.

For BF4, all I did was choose Mantle and restart the game; then FPS was low. If I logged GPU usage and core clock, they were perfect in DX11 but fluctuating badly in Mantle.

As for enabling Mantle, it is clear that if I do not restart BF4, the new API is not initialized, so that step is needed.

If you can tell me a likely culprit, or a group of them, great, and thanks.

But is it possible that Titanfall and crossfire are not working great, and that Mantle and BF4 at high res in crossfire are not working well? Etc.?


----------



## supermi

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> You could run an older driver as I do, but I think it's the games you're running that don't like CF. That's a possibility.


I may try them, but I think I will get some days in on the 980s and wait for the new drivers, the 19th, right?









Plus it gives me time to get my uni blocks on the 290Xs and see if that gives any better overclocks, hehehe!


----------



## HOMECINEMA-PC

I'd be full-blocking those cards if you wanna clock 'em


----------



## alancsalt

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> The latest Omega 14.12 drivers.
> Not possible. There is no such adapter. Only the Panasonic 4k 65" has built in DP.
> 
> 
> 
> Still on Catalyst 14.9 ......
> 
> I is lazy
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also Sammy 28" UHD has DP . But I wish it had 3 DP's .
> 
> Done my fair share of Benchmarking
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @alancsalt
> That's BS , Black Screen benchmarking BTW not what you edited .... honest
> 
> 
> 
> 
> 
> 
> 
> 
> 
> From now on I will put a ' / ' between the B and the S
> 
> You thought I was swearing

Please just write Black Screen or Blue Screen.. otherwise you may well be misunderstood ....


----------



## supermi

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Id be full blocking those cards if you wanna clock em


Yeah, I had my Titans full-blocked and ran 1300 MHz daily!!! 1.38 V @ load.

Unless I got some real cheap blocks, it would not be worth it for me to block these cards; the Titan X or 390X would be ... A $100-plus block on a $300-350 card for an extra 50-100 MHz? No thanks.

I was hoping a uni block could put me around 1200 MHz ...

I gotta say, in BF4 DX11, these cards at 1125 MHz are surprisingly keeping up with the 980s @ 1450 MHz ...
I see potential in them, hence my frustration.


----------



## supermi

Quote:


> Originally Posted by *alancsalt*
> 
> Please just write Black Screen or Blue Screen.. otherwise you may well be misunderstood ....


Yeah, that was silly of me, sorry. But before now I never witnessed such a thing and did/do not know the correct name for it.

Yeah, just the screen blinking in and out, everything still running fine!

Neither black nor blue, actually; "blank and normal picture," intermittently.

The only black screen I had was when I accidentally lowered voltage on the desktop.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *alancsalt*
> 
> Please just write Black Screen or Blue Screen.. otherwise you may well be misunderstood ....


Yep that's my life story ..... misunderstood









10 / 4 Saltydog









Quote:


> Originally Posted by *supermi*
> 
> Yeah, I had my Titans full-blocked and ran 1300 MHz daily!!! 1.38 V @ load.
>
> Unless I got some real cheap blocks, it would not be worth it for me to block these cards; the Titan X or 390X would be ... A $100-plus block on a $300-350 card for an extra 50-100 MHz? No thanks.
>
> I was hoping a uni block could put me around 1200 MHz ...
>
> I gotta say, in BF4 DX11, these cards at 1125 MHz are surprisingly keeping up with the 980s @ 1450 MHz ...
> I see potential in them, hence my frustration.


Another thing is to ditch AB and flash the 290s to Shamino's (I think that's how you spell it) Asus PT1T BIOS. This allows for unlocked voltages with vdroop on the core, and run it in line with Asus GPU Tweak.

Works for me


----------



## rdr09

Quote:


> Originally Posted by *supermi*
> 
> I did and the guides.
> 
> Xbox bro wow,if tour feelings are hurt due to the drivers not working in a reasonable manner no need to insult LOL XBOX EWE.
> 
> I am totally sure my old drivers are gone registry clean etc. I will recheck the install guide (which I already read) to see if there is something I missed or skipped or otherwise messed up.
> 
> I would prefer I am messing something up of course but so far from my perspective it seems like issues with crossfire.


Only suggested that 'cause it's PnP. I do get tearing in Titanfall.


----------



## kizwan

Quote:


> Originally Posted by *supermi*
> 
> ... that mantle and bf4 at high res and crossfire are not working well? Etc?


Nope, Mantle and BF4 are working great at 4K in crossfire. GPU usage is near 100% most of the time.


----------



## taem

Anyone know what the difference is between the two different models of the Sapphire Tri-X OC 290?

100362-SR2

100362-3L

As far as I can tell they are identical. The 3L is $240 at Newegg right now and I'm tempted, but I don't want to find out it's a revised PCB that can't take reference blocks or something.


----------



## supermi

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 10 / 4 Saltydog
> 
> 
> 
> 
> 
> 
> 
> 
> Another thing is to ditch AB and flash the 290s to Shamino's (I think that's how you spell it) Asus PT1T BIOS. This allows for unlocked voltages with vdroop on the core, and run it in line with Asus GPU Tweak.
> 
> Works for me


Is that 290-only, or is there a 290X version as well (I imagine there is)? I am not maxed out on voltage now. Does that BIOS add any more stability?

Quote:


> Originally Posted by *rdr09*
> 
> Only suggested that 'cause it's PnP. I do get tearing in Titanfall.


Sorry "PnP"?

I might try it on 4k perhaps the eyefinity causes issues.
As I said if I force friendly AFR, I get better FPS than the titans could give ... But it messes up the menu's. I stumped in that game.
Quote:


> Originally Posted by *kizwan*
> 
> Nope, Mantle and BF4 are working great at 4K in crossfire. GPU usage is near 100% most of the time.


So you just switch Mantle on and that's it? Any setting in the drivers to enable Mantle?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *taem*
> 
> Anyone know what the difference is between the two different models of the Sapphire Tri-X OC 290?
> 
> 100362-SR2
> 
> 100362-3L
> 
> As far as I can tell they are identical. The 3L is $240 at Newegg right now and I'm tempted but I don't want to find out its a revised pcb that can't take reference blocks or something.


I would be very cautious about that. Prolly one of the cards has taller capacitors, making it unsuitable for whatever block you have ........ Watch out for Murphy here


----------



## taem

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I would be very cautious about that . Prolly one of the cards has taller capacitors making it unsuitable for whatever block you have ........ Watchout for Murphy here


Yeah Powercolor did that to me. Luckily I was able to return the card.

Went to Sapphire's website; the 3L is the new edition with beefed-up power delivery.

Should I assume it's no longer a reference PCB? The model simply does not exist on compatibility charts. It doesn't say custom - it's just not listed.


----------



## ChronoBodi

Quote:


> Originally Posted by *supermi*
> 
> I cleared out the registry etc. and re-installed the Omega drivers afterwards.
> 
> SOM has exactly the same horrible frame-time variance.
> Titanfall is STILL 1/2 the FPS it should be. I can force friendly AFR and I get 120fps SOLID at MSAA x2 and 100
> BF4 same as before at 1080p portrait surround: DX11 100% scaling 98-160fps, usually 110-130; Mantle 70-80fps, same settings, same map, SAME MATCH.
> 
> Also to whoever says they do not restart after choosing mantle in BF4, you need to or the API does not switch and you have no change in FPS.
> 
> I had MSI Afterburner turned off for the test and relied on the in-game FPS counter ...
> 
> This was after clearing out all old NVIDIA and AMD junk and freshly re-installing the Omega drivers on Windows 8.1. I suppose I could do a full driver reinstall, but I bet the results will be the same, and I can also bet that if I leave the AMD drivers on there and put the 980's back in they will work just the same as before.
> 
> I would love any ideas, but I think I will put these cards in a box and wait till the 19th; if AMD actually releases new drivers I will try again, if not I'm returning these. Also I was really going to get the 390X, especially with the 8GB RAM it looks to have, but now seeing this I think I might pay whatever NVIDIA wants for a product that works and can also scale smoothly if I use 2 of them.
> 
> 
> 
> 
> 
> 
> 
> 
> and
> 
> 
> 
> 
> 
> 
> 
> I wanted to use AMD and am still gonna try but they are making it difficult to give them $$$


I have no idea lol, SoM is flawless on my rig, same Xfire 290Xs and all. Honestly every rig is different, so, no idea what's bonkers in your case.


----------



## supermi

Quote:


> Originally Posted by *ChronoBodi*
> 
> I have no idea lol, SoM is flawless on my rig, same Xfire 290Xs and all. Honestly every rig is different, so, no idea what's bonkers in your case.


I am taking a break and will see what I can do









Could you run the benchmark at 4K ultra, ultra textures, AO on ultra as well, and tell me the results?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *supermi*
> 
> *That's 290 only or a 290x version as well (imagine there is). I am not maxed out on voltage now. Does that bios add any more stability? Sorry "PnP"?*
> 
> I might try it on 4k perhaps the eyefinity causes issues.
> 
> As I said if I force friendly AFR, I get better FPS than the titans could give ... But it messes up the menu's. I stumped in that game.
> 
> So you just switch mantle on and that's it, any setting in drivers to enable mantle?


290 / 290X doesn't matter. Sometimes this BIOS can unlock a 290 into a 290X if you have the right 290's .. "PnP" = plug and play.

I have no dramas with Eyefinity


----------



## tsm106

You think he should be flashing bios' at this point, especially on an 8gb card?


----------



## supermi

Quote:


> Originally Posted by *tsm106*
> 
> You think he should be flashing bios' at this point, especially on an 8gb card?


If I do flash it will be a compatible BIOS... a 4GB BIOS for an 8GB card might be bad - different memory timings etc.
I might start with a fresh OS; this one I have has been through lots of overclocking adventures.

I doubt the BIOS is causing my issues. Either the actual drivers, or an OS in need of a clean install, is my guess


----------



## pshootr

I have been using AB with my Tri-X OC and disabling AMD Graphics OverDrive, but AMD Graphics OverDrive keeps getting enabled again. Is this normal, and is it because of AB?


----------



## ZealotKi11er

Quote:


> Originally Posted by *pshootr*
> 
> I have been using AB with my Tri-X OC, and disabling AMD Graphics OD but AMD Graphics OD keeps being Enabled again. Is this normal, and is it because of AB?


Go in CCC > Preferences > Restore to Factory Defaults. That should keep OverDrive off.


----------



## pshootr

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Go in CCC > Preferences > Restore to Factory Defaults. That should keep OverDrive off.


Ok I will try that, thanks a lot for the tip.


----------



## kizwan

Quote:


> Originally Posted by *supermi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Nope, Mantle & BF4 working great at 4k in Crossfire. GPU usage will always near 100% most of the time.
> 
> 
> 
> So you just switch mantle on and that's it, any setting in drivers to enable mantle?

Yes, just switch. Did you save a screenshot of the GPU usage with BF4?


----------



## supermi

Quote:


> Originally Posted by *kizwan*
> 
> Yes, just switch. Did you save screenshot of gpu usage with bf4?


Did not; it was pretty bad though. One would be at like 75%, the other 25%, etc., never really totaling more than 100% combined... but only in Mantle; BF4 DX11 was normal, 97-99%, with some dips in the 64-player mayhem!

Plus my core in DX11 BF4 WAS stuck at whatever it was set to, whether stock 1050 or 1150 OC.
In Mantle the GPU clocks would fluctuate around the 1000MHz mark?!?

Titanfall was similar to BF4 Mantle, but it is DX11: one GPU at 75%, the other 25%, or both at 40%, etc. Forcing friendly AFR pegged both at 99-100%


----------



## Yuniver

Just got it installed yesterday.









Sapphire R9 290 Tri-X

http://www.techpowerup.com/gpuz/details.php?id=hhdpp


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *tsm106*
> 
> You think he should be flashing bios' at this point, especially on an 8gb card?


Put that way ..... NO
Missed that tidbit of 8GB card info









Quote:


> Originally Posted by *supermi*
> 
> If I do flash it will be a compatable bios... 4gb bios fpr 8gb card might be bad different memory timings etc.
> I might start with a fresh os this ome I have has been through lots of overclocking adventures.
> 
> I doubt the bios is causing my issues. Either actual drivers or an OS in need of a clean imstal my guess


Not a good idea - a 4GB BIOS on an 8GB card will brick it ..... but it is easily reversible


----------



## supermi

Ok so I had about 2 hours with the 980's and got bored.

I did the driver change LETTER FOR LETTER!

NOW..... drumroll please......

the same results as before ... Titanfall with the provided profile in the Omega drivers is like 25%-75% on each GPU, never a total of 100% between the 2 ... no change at all ... this is in no way a mistake with the uninstall or install of drivers, or anything of the sort. I am sure if I force friendly AFR I will get the flickering and 100% GPU usage again ...

Shadow of Mordor is the same ... as in OK ... better than the 980's, about the same as the Titans were...

I am about to try BF4 and am expecting the same, TBH - that is, Mantle at about 1/2 the performance of DX11 (not confirmed this time yet)

One of the cards has a DP out which causes the blinking screen even on the desktop; the other does not ... I do wonder if that could be an issue here? Though that card has no monitors plugged in, and I am having no unexpected crashing - unless I OC too far with too little volts there is no crashing ...
Perhaps I should RMA that other card with the strange DP?

Otherwise I suppose I need to wait for new drivers, or roll back to older drivers and see if things change ... I am not excited about a whole OS re-install, as so far there has been no change no matter what I have tried... In short, I have correctly applied all the suggestions given to me, 100% to the letter! If there are any ideas I WANT THEM guys!

Thanks!









UPDATE:

Mantle worked FINE on 4K monitor after BF4 restart every time ... FPS were about the same as Dx11 but no worse







I think the 5.1GHz hex-core and the 4K resolution/settings meant the CPU was not much of a bottleneck for Mantle to overcome.

However:

Back to Eyefinity, and Mantle is doing the same strange stuff as before ... so it is something with my Eyefinity...

Titanfall etc are acting the same as before.


----------



## Wezzor

Where can I find the Saturation option in CCC?


----------



## pshootr

Is there a way to show FPS in BF4 with RTSS while using Mantle?


----------



## tsm106

Use the internal counter in bf4?


----------



## ZealotKi11er

Quote:


> Originally Posted by *pshootr*
> 
> Is there a way to show FPS in BF4 with RTSS while using Mantle?


During the game press ~ and then type perfoverlay.drawfps 1.
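If you'd rather not retype console commands every session, BF4 also reads them at launch from a `user.cfg` file placed in the game's install folder - a common community tip rather than anything official, so treat this as a sketch. `PerfOverlay.DrawGraph` is the commonly cited command for the CPU/GPU frame-time graph:

```
PerfOverlay.DrawFps 1
PerfOverlay.DrawGraph 1
```

Delete the file (or set the values back to 0) to turn the overlays off again.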


----------



## pshootr

Quote:


> Originally Posted by *tsm106*
> 
> Use the internal counter in bf4?


Quote:


> Originally Posted by *ZealotKi11er*
> 
> During the game press ~ and then type perfoverlay.drawfps 1.


Thank you both for the reply. "perfoverlay.drawfps 1" worked, but the FPS is shown in the upper right instead of the upper left, which I prefer. The numbers also appear large and jagged compared to RTSS. I assume RTSS cannot show FPS while using Mantle?

Perhaps I can get the same frame rate with DX11 so I can use RTSS for the FPS counter; I will have to check. Is there a way to tell BF4 to show FPS in the upper-left corner?

PS:

Disabling C6 State for CPU solved my freezing issue with Flash! Thanks for trying to help me tsm106.









And restoring defaults on CCC stopped AMD Graphics OD, thanks Zealotki11er.


----------



## tsm106

There's a graph for comparing CPU/GPU load. It's good for comparing the relative loads. I never remember what the command is though; I don't play BF4 much anymore, so no need to remember.


----------



## pshootr

Quote:


> Originally Posted by *tsm106*
> 
> There's a graph for comparing cpu/gpu load. It's good to compare the relative loads. I never remember what the command is though, don't play bf4 too much anymore to need to remember.


Ok thanks I will check it out. I guess this is to show the effect Mantle has on CPU/GPU usage..

+rep to both of you


----------



## ZealotKi11er

Quote:


> Originally Posted by *pshootr*
> 
> Ok thanks I will check it out. I guess this is to show the effect Mantle has on CPU/GPU usage..
> 
> +rep to both of you


Personally I never had a good experience with Mantle in BF4 using an Intel CPU. When I was using my R9 290X with an Athlon X4 @ 3.4GHz, Mantle made a huge difference. Going 4K, Mantle did not really work well with super-high vRAM usage.


----------



## pshootr

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Personally i never had good experience with Mantle in BF4 using Intel CPU. When i was using my R9 290X with Athlon X4 @ 3.4GHz Mantle made a huge difference. Going 4K Mantle did not really worth well with super high vRAM usage.


I guess I will do some testing and see if Mantle is worth it with my CPU.


----------



## BuildTestRepeat

Sad sad day :'(. Strapped my Corsair H40 to my R9 290 that previously had a GELID cooler, was running a stress test and heard a pop sound. The whole PC shut off and restarted. Motherboard VGA LED staying lit, and other GPUs are working in the slot. I was literally watching the temps when it popped: 47C on the core, VRM 1 60C, VRM 2 somewhere in the 40C range. Took the cooler off and got the burned electronics smell all of us fear. Damn....


----------



## supermi

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Sad sad day :'(. Strapped my corsair h40 to my r9 290 that previously had a GELID cooler , was running a stress test and heard a pop sound. Whole pc shut off and restarted. Motherboard vga led staying lit and other gpus are working in the slot. I was literallly watching the temps when it popped. Was 47C on the core and VRM 1 60C VRM 2 somewhere in the 40C range. Took the cooler off and got the burned electeonics smell all of us fear. Damn....


Can you see any burn marks?


----------



## BuildTestRepeat

No i cant, looked at the gpu both sides with no cooler on it.


----------



## supermi

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> No i cant, looked at the gpu both sides with no cooler on it.


Damn, coincidence with or caused by your cooler change?

Gonna RMA?


----------



## tsm106

Another clc death... rip. My condolences.


----------



## Yuniver

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Sad sad day :'(. Strapped my corsair h40 to my r9 290 that previously had a GELID cooler , was running a stress test and heard a pop sound. Whole pc shut off and restarted. Motherboard vga led staying lit and other gpus are working in the slot. I was literallly watching the temps when it popped. Was 47C on the core and VRM 1 60C VRM 2 somewhere in the 40C range. Took the cooler off and got the burned electeonics smell all of us fear. Damn....


This makes me sad for you my friend.

R.I.P.


----------



## Aussiejuggalo

Finally got my 290 underwater









Ran a 720p Catzilla (I'm too poor to buy the full thing) to make sure it was getting cooled right







Gotta say 51° on the core is better than the 90° I was getting, and the VRMs are about 30 - 40° cooler too









I am a bit worried about the droop at the end of the card tho; the cables are loose but it's still being pulled down for some reason











(don't mind the crappy cables







)


----------



## Yuniver

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Finally got my 290 underwater
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am a bit worried about the droop at the end of the card tho, the cables are loose but its still being pulled down for some reason
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (don't mind the crappy cables
> 
> 
> 
> 
> 
> 
> 
> )


Is this picture a straight on shot or was it angled a bit?


----------



## Aussiejuggalo

Quote:


> Originally Posted by *Yuniver*
> 
> Is this picture a straight on shot or was it angled a bit?


Angled slightly, but as straight as I could get it; the droop's about 5 - 10°

I should also ask, how are the new 14.12 drivers? Anyone had problems with them?


----------



## songtinh788

I just bought an ASUS R9 290 reference card last week to run CrossFire with my old ASUS R9 290X DirectCU II. The cards run well with my system and games, but I've noticed some annoying problems:
- Sometimes my computer freezes while sleeping; the monitor can't wake up and I can't toggle the Num Lock light on my keyboard, so I must press hard reset
- In-game, I often get a frozen screen for about 5-10 seconds, then it returns to normal

Can anyone help?


----------



## Arizonian

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Sad sad day :'(. Strapped my corsair h40 to my r9 290 that previously had a GELID cooler , was running a stress test and heard a pop sound. Whole pc shut off and restarted. Motherboard vga led staying lit and other gpus are working in the slot. I was literallly watching the temps when it popped. Was 47C on the core and VRM 1 60C VRM 2 somewhere in the 40C range. Took the cooler off and *got the burned electeonics smell all of us fear*. Damn....


OUCH








Quote:


> Originally Posted by *Yuniver*
> 
> Just got it installed yesterday.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sapphire R9 290 Tri-X
> 
> http://www.techpowerup.com/gpuz/details.php?id=hhdpp


Congrats - added









Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Finally got my 290 underwater
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ran a 720p Catzilla (I'm to poor to buy the full thing) to make sure it was getting cooled right
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Gotta say 51° on the core is better than the 90° I was getting and the VRMs are about 30 - 40° cooler to
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am a bit worried about the droop at the end of the card tho, the cables are loose but its still being pulled down for some reason
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> (don't mind the crappy cables
> 
> 
> 
> 
> 
> 
> 
> 
> )


Congrats - looks great - updated


----------



## aaroc

What software or hardware do you use to capture videos of games or the desktop with your R9 290X, single or CF?

For desktop screenshots I use Greenshot. For games, what do you recommend?


----------



## Aussiejuggalo

For games I use Fraps for screenshots or short videos and for longer videos or whenever I'm on Teamspeak or Skype I use Dxtory


----------



## pshootr

"Action" is pretty good for recording games.


----------



## LolCakeLazors

Forgot to join the club.

http://www.techpowerup.com/gpuz/details.php?id=zp5nv

MSI Lightning 290X on Stock Cooler


----------



## BuildTestRepeat

Quote:


> Originally Posted by *supermi*
> 
> Damn, coincidence with or caused by your cooler change?
> 
> Gonna RMA?


It is a Sapphire reference card which I've literally had for less than a few weeks. I bought it used. I contacted the guy to see if he was the original purchaser and could get me a receipt, which I must have to RMA. I just can't catch a break with AMD cards lol.

Quote:


> Originally Posted by *tsm106*
> 
> Another clc death... rip. My condolences.


Quote:


> Originally Posted by *Yuniver*
> 
> This makes me sad for you my friend.
> 
> R.I.P.


Thank you all. Of course this happens 2 days before my Korean 1440p monitor gets here :'(


----------



## Yuniver

You just have bad luck









I have had no major problems with any of my AMD cards. 7770, 7850, and now the r9 290. The only thing, and it was more of an annoyance than anything, was my 7850 had an overheating issue that caused it to shut down even with the fan speed set at 100%.


----------



## BuildTestRepeat

Quote:


> Originally Posted by *Yuniver*
> 
> You just have bad luck
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have had no major problems with any of my AMD cards. 7770, 7850, and now the r9 290. The only thing, and it was more of an annoyance than anything, was my 7850 had an overheating issue that caused it to shut down even with the fan speed set at 100%.


This is my first pc part to ever actually fully fail. My last r9 290 was defective.

To add to the sucky situation, it looks like Sapphire's warranty is void if the cooler is removed/replaced. Damn.... the guy just sent me a receipt too. The card was bought 11/06/13 so it would still be under warranty. Ugh


----------



## Crouch

Guys, I'm thinking to cool my 290X using the Kraken G10 + H60. Is it a good combination or do I have better options ?!


----------



## godiegogo214

Can anyone confirm that this block will fit this card perfectly?

http://www.aquatuning.us/water-cooling/gpu-water-blocks/gpu-full-cover/17508/alphacool-nexxxos-gpx-ati-r9-290x-and-290-m01-incl.-backplate-black?c=6470

card image


----------



## supermi

Well I know part of my issue.

Ready for this.... The box says 8GB for both cards, but the PCB and GPU-Z say one of 'em is 4GB ... What!


----------



## Yuniver

Quote:


> Originally Posted by *supermi*
> 
> Well I know part of my issue.
> 
> Ready for this.... Box says 8gb for both cards but PCB and gpuz say one of em is 4gb ... What!


Downclocked maybe?


----------



## supermi

Quote:


> Originally Posted by *Yuniver*
> 
> Downclocked maybe?


I am not sure I am getting you ... one is 8GB, one is 4GB; they are both supposed to be 8GB ... in games like SOM that use lots of VRAM I think it is giving me some issues ...

The 4GB card seems flaky anyway ... neither card downclocks from the clockspeeds I set while gaming.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *supermi*
> 
> Well I know part of my issue.
> 
> Ready for this.... Box says 8gb for both cards but PCB and gpuz say one of em is 4gb ... What!


WHAT!?!?!?!?









Yes that can definitely cause some conflict!


----------



## Yuniver

Quote:


> Originally Posted by *supermi*
> 
> I am not sure I am getting you ... one is 8gb one is 4gb they are both supposed to be 8gb ... in games like SOM that use lots of VRAM I think it is giving me some issues ...
> 
> the 4gb card seems flaky anyway ... neither card downclocks from the clockspeeds I set while gaming.


Sorry, it's late and I misread the post. I'm guessing you're x-fire? Have you tested the cards by themselves to see if there's an underlying problem? If the 4gb card is indeed 4gb and it's supposed to be 8gb, then you can RMA. I would troubleshoot first.

EDIT: After reading your post again, I can see why you were all like


----------



## supermi

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> WHAT!?!?!?!?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes that can definitely cause some conflict!


That 4GB card is also the one that flakes out and blinks the screen on and off while on the desktop at 4K ... it is not staying here with me ... I contacted Powercolor to see if they can take care of me quick, as Newegg has no more to exchange, so all I would get is a refund and then likely Titan Z (small chance of keeping the 980 Strix SLI), but that darn 4GB of VRAM (boo) ...

On one hand this is annoying; on the other it helps make my decision....
With this, and with the wait for the 390X and no real word on it yet, my $$ is likely going to the Titan X, though I have been in denial about that.
Well, I will see if Powercolor makes me wanna stay with them through amazing customer support, or if AMD gives us a scoop on the 390X 8GB version prior to the Titan X going on sale ... we shall see, we shall see


----------



## tsm106

Quote:


> Originally Posted by *supermi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MrWhiteRX7*
> 
> WHAT!?!?!?!?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yes that can definitely cause some conflict!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> that 4gb is also the card that flakes out and blinks the screen on and off while on desktop at 4K ... it is not staying here with me ... I contacted Powercolor to see if they can take care of me qiuick as Newegg has no more to exchange so all I would get is a refund and then Likely Titan Z (small chance of keeping the 980 strix SLI) but that darn 4gb of vram (boo) ...
> 
> On one had this is annoying on the other it helps make my decision....
> with this and with the wait for the 390x and no real word on it yet my $$ is likely going to the Titan x though i have been in denial about that .
> Well I will see if Powercolor makes me wanna stay with them thru amazing customer support or if AMD gives us a scoop on the 390x 8gb version prior to Titan X being live for sale ... we shall see wee shall see

That doesn't really mean it is 4GB. GPU-Z is not always accurate and is sometimes, a lot of the time, buggy. *That said, do you have ULPS disabled?*

If ULPS is on, which it is by default, GPU-Z has a hard time getting accurate reads from the 2nd card.
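For anyone hunting for the switch: ULPS is toggled via the registry. A commonly shared recipe (the exact 4-digit subkey varies per adapter, so treat this as a sketch, not a guaranteed path) is to find every `EnableUlps` value under the display-adapter class key, set each one to 0, and reboot:

```
Windows Registry Editor Version 5.00

; Example only - the subkey (0000, 0001, ...) differs per system.
; Search the class key for every "EnableUlps" and zero each one.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Tools like MSI Afterburner also expose a ULPS toggle that flips the same values.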


----------



## supermi

Quote:


> Originally Posted by *tsm106*
> 
> That doesn't really mean it is 4gb. GPUZ is not always accurate and sometimes, a lot of times buggy. *That said, do you have ULPS disabled?*
> 
> If ULPS is on, which it is by default GPUZ has a hard time getting accurate reads from the 2nd card.


It has been disabled from the start, the PCB is marked as 4gb...
Quote:


> Originally Posted by *Yuniver*
> 
> Sorry, it's late and I misread the post. I'm guessing you're x-fire? Have you tested the cards by themselves to see if there's an underlying problem? If the 4gb card is indeed 4gb and it's supposed to be 8gb, then you can RMA. I would troubleshoot first.
> 
> EDIT: After reading your post again, I can see why you were all like


I have tested; that is how I know the 4GB card won't drive 4K from DP, so I stopped testing it there as there was no point. The other card will blink out in a similar fashion, but only with added voltage (not to my liking, but better than the other one).

So it is clear there is something up with the 4GB card, but with the BIOS and the PCB saying 4GB I am 100% sure it is that... the other card is marked 8GB on the PCB and in BIOS, and they have different BIOS versions...

To be honest the 8GB card is OK but not amazing, and unless I am swayed by an early and super above-and-beyond email from Powercolor fairly early tomorrow, I will pack these up and ship them out








oh well right..


----------



## Yuniver

I would return them anyway TBH


----------



## tsm106

Quote:


> Originally Posted by *supermi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> That doesn't really mean it is 4gb. GPUZ is not always accurate and sometimes, a lot of times buggy. *That said, do you have ULPS disabled?*
> 
> If ULPS is on, which it is by default GPUZ has a hard time getting accurate reads from the 2nd card.
> 
> 
> 
> It has been disabled from the start, the PCB is marked as 4gb...

Whoa! Newegg huh? Chuckles, not on your behalf though, shakes head. Why am I not surprised at Newegg anymore?


----------



## supermi

Quote:


> Originally Posted by *tsm106*
> 
> Whoa! Newegg huh? Chuckles, not on your behalf though, shakes head. Why am I not surprised at Newegg anymore?


Both boxes say 8GB, and the SNs on the cards match the boxes. I just assumed the boxes were right, but someone somewhere made a little mistake. A 4GB mistake.

It makes more sense now why others were having a smoother experience in SOM; I mean, the drivers and game engine must have been very confused... The benchmark was giving 7fps lows and 350fps highs ... It finally makes sense why nothing I did helped LOL

YOU CAN CHUCKLE. I am happy I saw the 4GB on the PCB, but I am laughing at myself for taking so long to catch it.

AMD was making a joke along the lines of the 970's hahaha


----------



## tsm106

Your cards have backplates right? Are the serial number stickers on the backplates?


----------



## supermi

Quote:


> Originally Posted by *tsm106*
> 
> Your cards have backplates right? Are the serial number stickers on the backplates?


Yes they are, and there's a sticker on the PCB under the heatsink with the last 4 digits of the SN.


----------



## Bertovzki

I have started my build and put the 290X Tri-X OC on the Hawaii block. Will put the build log together soon; I have done quite a few mods to the 750D.
Case fully lined with matte black acrylic (paper still on) for cable management


Spoiler: Warning: Spoiler!


----------



## BuildTestRepeat

Quote:


> Originally Posted by *tsm106*
> 
> Another clc death... rip. My condolences.


I am curious to know why my card actually failed. From your reply this is a common occurrence? I took my time and double-checked everything. I know the GPU didn't get too warm. Puzzled over here. This is a pic of the card before it went in. I actually played a bit of BF4 earlier in the day and it did fine.


----------



## tsm106

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Another clc death... rip. My condolences.
> 
> 
> 
> I am curious to know why my card actually failed. From your reply this is a common occurance? I took my time and double checked everything. I know the gpu didnt get too warm. Puzzled Over here. This is a pic of the card before it went in. I actually played a bit of bf4 earlier in the day and it did fine.

Yes, CLC deaths happen more often than full-cover block deaths. There are no statistics other than my long experience on this forum; I've kept some track of the cases I've come across. How or why it happens more frequently is not easy to pinpoint. There are a lot more variables and potential points of failure: lots of assembly, which creates many chances for mistakes; very low material density (the heatsinks and VRM kit have very little mass); and that low material density plus the lack of direct cooling leads to higher chances of elevated temps for longer durations.

Without seeing your PCB up close I don't know what caused the card's death, but one thing is for certain: the card was not happy.


----------



## BuildTestRepeat

Quote:


> Originally Posted by *tsm106*
> 
> Yes, clc deaths happen more often than fullcover block deaths. There are no statistics other than my long experience on this forum. I've kept some track of the cases that I've come across. How or why it happens more frequently is not easy to pin point. There are a heck of a lot more variables and potential points of failure like lots of assembly which creates many chances for mistake, very low material density (heatsinks and vrm kit have very low density), and adding the low material density and the lack of direct cooling leads to higher chances of elevated temps for longer duration.
> 
> Without seeing your pcb up close I don't know what cause the cards death, but one thing is for certain. The card was not happy.


I'm gonna snap some real good PCB pics tonight. This card had a Gelid cooler on it running well before the CLC. "If it ain't broke don't fix it" has me kicking myself right now lol.

I got a damn good OC on this card with the Gelid: +25mV core, 50% power increase, 1150MHz core, 1375 mem stable. Wasn't OC'd when it popped


----------



## battleaxe

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Im gonna snap some real good pcb Pics tonight. This card had a gelid cooler on it running good before the clc. "If it aint broke dont fix it" has me kicking myself right now lol.
> 
> I got a damn good oc on this card with the gelid. +25mv core, 50% power increase, 1150mhz core 1375 mem stable. Wasnt OC'd when it popped


Double-check your TIM. I've had one card go black and make smelly odors after being on load, etc... but when I checked the block seating, the corners of the GPU were not seated properly on the block. I redid the TIM, making sure it reached all four corners, and the card ended up working just fine after I was sure it was dead... If you cannot see any obvious failure, there's an outside chance that it did the same thing. Just an idea to try out.


----------



## BuildTestRepeat

Quote:


> Originally Posted by *battleaxe*
> 
> Double check your TIM. I've had one card go black, make smelly odors after being on load etc... but then when I checked the block seating the corners of the GPU were not seated properly on the block. I redid the TIM making sure it reached all four corners and ended up working just fine after I was sure it was dead... If you cannot see any obvious failure there's a outside chance that it did the same thing. Just an idea to try out.


Tonight i will be messing with the card. Will post back. Praying for the best!


----------



## Devildog83

stock


----------



## Aussiejuggalo

It still surprises the hell out of me how well these 290's respond to watercooling









Ran Valley to see how high the temps get and well... the core gained 10°; ambient is 28.8°



Cant complain about the score either



The Valley temp sensor made me laugh tho











What's a good test to really push the heat?

Also, the latest drivers, 14.12 - are they worth going to?


----------



## taem

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Whats a good test to really push the heat?


I use the Metro Last Light bench to test heat. I've never encountered an in-game scenario that generates more heat than a loop of this bench maxed out. It's not much good as a stability test, as it seems forgiving of unstable clocks.

You have to own Metro LL but everyone should anyway, one of the best games ever.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *taem*
> 
> I use Metro Last Light bench to test heat. I've never encountered an in game scenario that generates more heat that a loop of this bench maxed out. Not much good as a stability test as it seems forgiving of unstable clocks.
> 
> You have to own Metro LL but everyone should anyway, one of the best games ever.


Cool, yeah, I got LL somewhere

How hard are these things to clock and is it really worth it?


----------



## Devildog83

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Aussiejuggalo*
> 
> It still surprises the hell out of me how well these 290's respond to watercooling
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ran Valley to see how high temps get and well... The core gained 10°, ambient is 28.8°
> 
> 
> 
> Can't complain about the score either
> 
> 
> 
> The Valley temp sensor made me laugh tho
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What's a good test to really push the heat?
> 
> Also, are the latest drivers (14.12) worth moving to?






Not on water yet but I will be. Just got a new 4670K so I am comparing benches. Got this with my 290 -


----------



## HOMECINEMA-PC

CLC loops on 290's = meltdown.
If you're gonna pull apart 290's, please for gawd's sake PUT A FULL W/BLOCK ON IT








But if you hate money so much, by all means CLC it


----------



## Devildog83

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> CLC loops on 290's = meltdown.
> If you're gonna pull apart 290's, please for gawd's sake PUT A FULL W/BLOCK ON IT
> 
> 
> 
> 
> 
> 
> 
> 
> But if you hate money so much , by all means CLC it


You're durn tootin'!!! Mine is going into my CPU loop with an added rad and a full cover block. I hit 78C with my reference card in Valley but it's freakin' loud. I cannot wait for water. With just a CPU in my loop the fans never even spin up and are silent.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Devildog83*
> 
> You're durn tootin'!!! Mine is going into my CPU loop with an added rad and a full cover block. I hit 78C with my reference card in Valley but it's freakin' loud. I cannot wait for water. With just a CPU in my loop the fans never even spin up and are silent.


That will change with a 290 in there.
You should lose 40C odd with a waterblock








I run dual loops to keep the extra heat generated away from my 4960X








Yes I'm *durn tootin







*


----------



## Devildog83

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> That will change with a 290 in there.
> You should lose 40C odd with a waterblock
> 
> 
> 
> 
> 
> 
> 
> 
> I run dual loops to keep the extra heat generated away from my 4960X
> 
> 
> 
> 
> 
> 
> 
> 
> Yes I'm *durn tootin
> 
> 
> 
> 
> 
> 
> 
> *


My measly little 4670K only got up to 54C; it doesn't generate much heat.








My new Asus board is on its way and the GPU block is next.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Devildog83*
> 
> My measly little 4670K only got up to 54C; it doesn't generate much heat.
> 
> 
> 
> 
> 
> 
> 
> 
> My new Asus board is on its way and the GPU block is next.


Well, sorry there, I'm used to runnin' hexcores, so they do generate a bit more heat


----------



## Devildog83

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Well, sorry there, I'm used to runnin' hexcores, so they do generate a bit more heat


Yeah but your 290's are gonna create some heat.


----------



## BuildTestRepeat

I found the problem. The GPU surface literally cracked!!! :'( Explains the pop sound I heard...


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Devildog83*
> 
> Yeah but your 290's are gonna create some heat.


With the chiller on and running a separate loop: 30C - 32C full load. No chiller: 40C - 44C full load. The chiller has a 4L res









Quote:


> Originally Posted by *BuildTestRepeat*
> 
> I found the problem. The GPU surface literally cracked!!! :'( Explains the pop sound I heard...


Over-tightening... sorry for your loss


----------



## boot318

Finger-tight is all you need. I've been using the "Red Mod" for my past 5 GPUs and never had that issue. I kinda feel lucky now. Sorry for your loss.









Maybe you can RMA it if it's within 30 days.


----------



## BuildTestRepeat

Quote:


> Originally Posted by *boot318*
> 
> Finger-tight is all you need. I've been using the "Red Mod" for my past 5 GPUs and never had that issue. I kinda feel lucky now. Sorry for your loss.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Maybe you can RMA it if it's within 30 days.


No RMA from Sapphire; no support for modders like us, otherwise the card would still be in its warranty period. I only zipped it on with my fingers. I always learn the hard way -_- lol.


----------



## HOMECINEMA-PC

No warranty for him.
Ya can't get away with that borkedness.
The supplier will know when furphies are being told


----------



## tsm106

That's like wrapping your car around a pole then asking for warranty service.


----------



## kizwan

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> I found the problem. The GPU surface literally cracked!!! :'( explains the pop sound i heard ...


Ouch!


----------



## BuildTestRepeat

Hey, I'm just mad at myself. That card kicked ass. It's gonna be a bit before I can pick up a new GPU. I literally just got this a few weeks ago. To add insult to injury, my 1440p monitor will be here tomorrow, and I have no GPU to push BF4. All I can do is laugh at the irony.


----------



## tsm106

390x wce soon...

Don't be hard on yourself, we all break stuff sooner or later.


----------



## DividebyZERO

Quote:


> Originally Posted by *tsm106*
> 
> *390x* *wce soon..*.
> 
> Don't be hard on yourself, we all break stuff sooner or later.


Ooooooohhhhhh YEAH!!!!!!


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *tsm106*
> 
> That's like wrapping your car around a pole then asking for warranty service.


Then local council bills you for the pole









Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Hey, I'm just mad at myself. That card kicked ass. It's gonna be a bit before I can pick up a new GPU. I literally just got this a few weeks ago. To add insult to injury, my 1440p monitor will be here tomorrow, and I have no GPU to push BF4. All I can do is laugh at the irony.
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> 390x wce soon...
> 
> Don't be hard on yourself, we all break stuff sooner or later.

Correct, Murphy can appear at any time

I broke a $250,000 forklift once and never heard the end of it (that was last century)


----------



## Vexzarium

i5-4690k + 290x Lightning



3DMark Fire Strike: http://www.3dmark.com/3dm/6270329


----------



## LolCakeLazors

I have almost the same parts as you and I get substantially weaker Heaven benchmark scores.

http://i.imgur.com/4duYrbF.png


----------



## Vexzarium

Quote:


> Originally Posted by *LolCakeLazors*
> 
> I have almost the same parts as you and I get substantially weaker Heaven benchmark scores.
> 
> http://i.imgur.com/4duYrbF.png


If you're referring to me, I have not run Heaven recently. But when I first built this rig, I ran around the same; I believe it was 64fps average. I'll install it and run it again.


----------



## Mega Man

sorry man
Quote:


> Originally Posted by *tsm106*
> 
> I don't know what caused the card's death, but one thing is for certain. The card was not happy.


Quote:


> Originally Posted by *tsm106*
> 
> That's like wrapping your car around a pole then asking for warranty service.


Quote:


> Originally Posted by *tsm106*
> 
> Dont be hard on yourself, we all break stuff sooner or later.


this is in the sig !


----------



## Arizonian

Quote:


> Originally Posted by *LolCakeLazors*
> 
> Forgot to join the club.
> 
> http://www.techpowerup.com/gpuz/details.php?id=zp5nv
> 
> MSI Lightning 290X on Stock Cooler


Congrats - added








Quote:


> Originally Posted by *Vexzarium*
> 
> i5-4690k + 290x Lightning
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Vexzarium

Quote:


> Originally Posted by *LolCakeLazors*
> 
> I have almost the same parts as you and I get substantially weaker Heaven benchmark scores.
> 
> http://i.imgur.com/4duYrbF.png


So, yeah, I have A.D.D. and totally got sidetracked. It took me forever to get around to re-running Heaven.


----------



## BuildTestRepeat

Considering this is the first part I've ever broken due to my own error, I know I'm doing OK haha. At least I still have this nice Gelid cooler and the memory/VRM heatsinks lol.


----------



## DzillaXx

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Considering this is the first part I've ever broken due to my own error, I know I'm doing OK haha. At least I still have this nice Gelid cooler and the memory/VRM heatsinks lol.


Personally I question what made the die crack.

Did the thermal paste not spread out to that side of the die?

Because your cooler didn't look like it had been over-tightened. You can get zip ties pretty tight, but IMO not that tight.

Perhaps something like a grain of sand got stuck under the block when you were installing it.

290's are pretty cheap these days, you could always pick up a ref for cheap. Hell even a 7950 will push 1440p decently. For around 100 bucks you can ride one of those till newer cards drop.


----------



## tsm106

Quote:


> Originally Posted by *DzillaXx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BuildTestRepeat*
> 
> Considering this is the first part I've ever broken due to my own error, I know I'm doing OK haha. At least I still have this nice Gelid cooler and the memory/VRM heatsinks lol.
> 
> 
> 
> Personally I question what made the die crack.
> 
> Did the thermal paste not spread out to that side of the die?
> 
> Because your cooler didn't look like it would have been over tightened. Though you can get zip ties pretty tight, but IMO not that tight.
> 
> Perhaps something like a grain of sand or something got stuck under the block when you were installing.

Cracking the core like that is the result of imbalanced compression. Where is a "grain of sand" going to come from? What is he doing, assembling the CLC at the beach? You break stuff, you own up to it, so I don't know why you are making excuses for him when he isn't making any.


----------



## MrWhiteRX7

WCCF reported that attaching CLC's on the beach was absolutely the best scenario... Brian Williams invented the beach just for that reason!


----------



## DividebyZERO

One time we had a customer who lived right on the beach. He had his PC right in front of the sliding glass door, which he kept open all the time. When he brought his PC in because it wouldn't boot, we looked and found that everything in the back and inside the PC had literally rusted and corroded. I didn't see sand, but I'm sure some was in there somewhere.

I think it would be fun to assemble a GPU and block on the beach. Just keep the banana hammocks away.


----------



## LolCakeLazors

Quote:


> Originally Posted by *Vexzarium*
> 
> So, yeah I have A.D.D. and totally got side tracked. Took me forever to get around to re-running Heaven.
> 
> I updated my original post but here's the Heaven screenshot for ya!


Just curious, what did you do to your computer to get to your current score?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *DividebyZERO*
> 
> One time we had a customer who lived right on the beach. He had his PC right in front of the sliding glass door, which he kept open all the time. When he brought his PC in because it wouldn't boot, we looked and found that everything in the back and inside the PC had literally rusted and corroded. I didn't see sand, but I'm sure some was in there somewhere.
> 
> I think it would be fun to assemble a GPU and block on the beach. Just keep the banana hammocks away.


LOL!!!!


----------



## tsm106

Quote:


> Originally Posted by *LolCakeLazors*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vexzarium*
> 
> So, yeah I have A.D.D. and totally got side tracked. Took me forever to get around to re-running Heaven.
> 
> I updated my original post but here's the Heaven screenshot for ya!
> 
> 
> 
> Just curious, what did you do to your computer to get to your current score?

With Heaven you cannot take any HTML submission at face value. HTML outputs are meaningless on AMD cards vs. a screenshot of the cobbles with the FPS showing.

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *DividebyZERO*
> 
> One time we had a customer who lived right on the beach. He had his PC right in front of the sliding glass door, which he kept open all the time. When he brought his PC in because it wouldn't boot, we looked and found that everything in the back and inside the PC had literally rusted and corroded. I didn't see sand, but I'm sure some was in there somewhere.
> 
> I think it would be fun to assemble a GPU and block on the beach. Just keep the banana hammocks away.
> 
> 
> 
> LOL!!!!

I once had a client who ran his servers in his garage. I was against it, but that was the only place he had; it was his man cave. The rigs eventually died after a few years from corrosion, and I'd just build new ones to replace them. He wasn't as lucky as to be on the waterfront, but a few miles off. Outside atmosphere is bad on computers, yeap.


----------



## Vexzarium

Quote:


> Originally Posted by *tsm106*
> 
> With Heaven you cannot take any HTML submission at face value. HTML outputs are meaningless on AMD cards vs. a screenshot of the cobbles with the FPS showing.


That is the most interesting theory I've ever read... can the HTML be modified in some way? I'll re-run, give me 20 minutes or so.


----------



## tsm106

Quote:


> Originally Posted by *Vexzarium*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> With Heaven you cannot take any HTML submission at face value. HTML outputs are meaningless on AMD cards vs. a screenshot of the cobbles with the FPS showing.
> 
> 
> 
> That is the most interesting theory I've ever read... can the HTML be modified in some way? I'll re-run, give me 20 minutes or so.

It's not a theory, it's fact. Html subs are meaningless. I'm not singling you out, just stating a fact.


----------



## Vexzarium

Quote:


> Originally Posted by *tsm106*
> 
> It's not a theory, it's fact. Html subs are meaningless. I'm not singling you out, just stating a fact.


Show me the "fact". Until then, it is a theory.

You miss out on your breakfast this morning? Didn't get your coffee yet? Need a Snickers?


----------



## tsm106

Quote:


> Originally Posted by *Vexzarium*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> It's not a theory, it's fact. Html subs are meaningless. I'm not singling you out, just stating a fact.
> 
> 
> 
> Show me the "fact". Until then, it is a theory.
> 
> You miss out on your breakfast this morning? Didn't get your coffee yet? Need a Snickers?

A donkey can cheat at Heaven. That's why no one accepts HTML subs: with Heaven you can see the cheating easily. No bench contest will accept HTML subs; you must post the screen over the cobbles. The fact that you don't know this and take offense doesn't look good for you.


----------



## Vexzarium

I'm a hairy, bearded, semi-fat man... looking good is not a concern of mine.

I'm simply posting my benchmarks to share with this community, didn't realize they were contest submissions. My CPU & GPU would likely be under water if it were a contest, right?

Regardless, I'm redownloading the benchmarks now, re-firing the overclocks (I normally run everything vanilla, as the games I play don't need the OC), and then re-running them.


----------



## sugarhell

HTML Heaven is stupid. Screenshot from the in-game engine with the score; otherwise you can cheat on every single aspect.


----------



## Vexzarium

How do you cheat? I clicked save from the benchmark and followed the Chrome link. I know nothing of editing HTML... if that is how you'd accomplish modifying the scores.


----------



## tsm106

Quote:


> Originally Posted by *Vexzarium*
> 
> How do you cheat? I clicked save from the benchmark and followed the Chrome link. I know nothing of editing HTML... if that is how you'd accomplish modifying the scores.


Here's where you want to ask.

http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores/0_40


----------



## Vexzarium

So you can get a massive score if tessellation is turned off? I did not realize this. As in, I did not realize you could turn it off and it would still be considered "Extreme".

So I changed the "Friendly Name" of my GPU in the Registry from "AMD Radeon R9 200 Series" to "Radeon R9-290x Lightning". Is this cheating as well? It is still listed as "AMD Radeon R9 200 Series" in Catalyst.

I do this because I run ENB for Skyrim and it annoys me at boot that the game always says "AMD Radeon R9 200 Series". That, and ESO crashes a lot, and I wanted the crash reports to be somewhat more telling than "He might have a 270 or a 295x2". You never know if they actually pay attention to the VRAM.
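For anyone curious how that rename is done: the name most tools and games display generally comes from the `DriverDesc` value under the display-adapter class key in the registry. A sketch as a .reg fragment follows; the class GUID below is the standard display-adapter setup class, but the `0000` instance index varies per system and per card, so check which subkey actually matches your card (and back up the key) before importing anything like this:

```reg
Windows Registry Editor Version 5.00

; Display-adapter class key; the 0000 instance index varies per system,
; so verify which subkey belongs to your card first.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"DriverDesc"="Radeon R9-290x Lightning"
```

Note that a driver reinstall will typically recreate this value, so the rename has to be redone after driver updates.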


----------



## DividebyZERO

Quote:


> Originally Posted by *Vexzarium*
> 
> So you can get a massive score if Tessellation is turned off? I did not realize this. As in, I did not realize you could turn it off and it still be considered "Extreme".
> 
> So I changed the "Friendly Name" of my GPU in the Registry from "AMD Radeon R9 200 Series" to "Radeon R9-290x Lightning". Is this cheating as well? It is still listed as "AMD Radeon R9 200 Series" in the Catalyst.
> 
> I do this because I run ENB for Skyrim and it annoys me at the boot of the game that it would always say "AMD Radeon R9 200 Series". That and ESO crashes a lot and wanted the crash reports to be somewhat more telling than: "He might have a 270 or a 295x2". You never know if they actually pay attention to the VRAM.


It's a big reason why they want a screenshot of the beginning with the stone walk/stairs: people have cheated by using CCC to turn off tess. Sadly there are people who will cheat at anything, from taxes to the Heaven benchmark. It's usually obvious when it happens, but sometimes it's very subtle.


----------



## mfknjadagr8

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> No RMA from Sapphire; no support for modders like us, otherwise the card would still be in its warranty period. I only zipped it on with my fingers. I always learn the hard way -_- lol.


next time leave the grip of Kong in the other hand


----------



## BuildTestRepeat

Quote:


> Originally Posted by *DzillaXx*
> 
> Personally I question what made the die crack.
> 
> Did the thermal paste not spread out to that side of the die?
> 
> Because your cooler didn't look like it would have been over tightened. Though you can get zip ties pretty tight, but IMO not that tight.
> 
> Perhaps something like a grain of sand or something got stuck under the block when you were installing.
> 
> 290's are pretty cheap these days, you could always pick up a ref for cheap. Hell even a 7950 will push 1440p decently. For around 100 bucks you can ride one of those till newer cards drop.


Yes, the paste was spread evenly and the GPU was sparkling clean. I mostly play BF4 and wanted a card for 100fps avg; it seems to be one of the most GPU-intensive games out. My Korean-panel 1440p monitor will be here when I get home from work today, and I'm hoping I can OC it to 96Hz as many have had luck doing. That's why I got the R9 290. Being too tight is the only logical explanation.

Quote:


> Originally Posted by *tsm106*
> 
> Cracking the core like that is a result of imbalanced compression. Where is a "grain of sand" going to come from? What is he doing, assembling the clc at the beach? You break stuff, own up to it so I don't know why you are making excuses for him when he isn't making any.


If only I lived near a damn beach.









Quote:


> Originally Posted by *MrWhiteRX7*
> 
> WCCF reported that attaching CLC's on the beach was absolutely the best scenario... Brian Williams invented the beach just for that reason!


Omg hahahaha


----------



## mfknjadagr8

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Yes, the paste was spread evenly and the GPU was sparkling clean. I mostly play BF4 and wanted a card for 100fps avg; it seems to be one of the most GPU-intensive games out. My Korean-panel 1440p monitor will be here when I get home from work today, and I'm hoping I can OC it to 96Hz as many have had luck doing. That's why I got the R9 290. Being too tight is the only logical explanation.
> If only I lived near a damn beach.
> 
> 
> 
> 
> 
> 
> 
> 
> Omg hahahaha


lutro0 still had some XFX 290 refs for sale; I bought two and they run great, although I had to repaste one of them, but you would do that with a water block install anyway... I get good FPS with resolution scale at 180 on 1920x1080, so I would assume you could do similar at that res... Just wondering where you bought your zip ties; there's a trend of getting crappy batches even when I buy the "heavy duty" ones lol


----------



## Vexzarium

Okay, here you have it. Benchmarks re-done:

EDIT: Updated on a later post: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/35730#post_23680544


----------



## LandonAaron

Picked up a VisionTek CryoVenom R9 290 (with an EK waterblock) to crossfire with my R9 290X, and a matching EK waterblock for it. I'm interested in seeing if I can BIOS flash my 290 into a 290X. Where can I find more info on the correct BIOS to use and the procedure for doing it?


----------



## Devildog83

Quote:


> Originally Posted by *tsm106*
> 
> A donkey can cheat at Heaven. That's why no one accepts html subs, because you can see the cheating easy with heaven. No bench contest will accept html subs. You must post the screen over the cobbles. The fact that you don't know this and take offense, it doesn't look good for you.


I think maybe we are taking this a bit too seriously. I have never posted a screenshot of Heaven or Valley and no one has ever accused me of cheating. If you're in some kind of actual competition I could see it, but the guy was just posting scores to show results and get opinions, not trying to cheat at a competition. I don't see why it doesn't look good for him; he ain't trying to win a prize, just playing with his hardware.


----------



## Vexzarium

Quote:


> Originally Posted by *Devildog83*
> 
> I think maybe we are taking this a bit too serious. I have never posted a screen shot of heaven or Valley and no one has ever accused me of cheating. If your in some kind of actual competition I could see it but the guy was just posting score to show results and get opinions not trying to cheat at a competition. I don't see why it doesn't look good for him, he ain't trying to win a prize but just playing with his hardware.


Not really a big deal anyway. I ended up scoring higher on both benchmarks. But I agree, was never intending to compete.


----------



## Vexzarium

Quote:


> Originally Posted by *LolCakeLazors*
> 
> Just curious, what did you do to your computer to get to your current score?


Nothing special, just a stable overclock of the GPU and CPU with 8 case fans, 4in/4out.


----------



## LandonAaron

Quote:


> Originally Posted by *tsm106*
> 
> 390x wce soon...
> 
> Dont be hard on yourself, we all break stuff sooner or later.


Quote:


> Originally Posted by *DividebyZERO*
> 
> Ooooooohhhhhh YEAH!!!!!!


List of things I have broken since getting into enthusiast PCs: a motherboard (I knocked a capacitor off with a screwdriver); a Titanium Fatal1ty sound card (I was making some sort of ghetto fan bracket out of a paperclip and the paperclip ended up against the sound card, shorting it out); 3 hard drives in one fell swoop when I plugged in a fan controller with a modular SATA power cable I had forgotten I'd experimented with, to see how hard it would be to de-pin for a sleeve job; another motherboard, trying to get a CPU cooler backplate off that was attached with adhesive tape; and a GTX 580 (I plugged it into the TV in the living room with a 30 ft. HDMI cable and it went up in smoke; turns out the living room outlet had the ground and positive terminals reversed, and the outlet for the computer didn't have its ground connected at all). That was one month after cancelling my renters insurance. I'm sure there is more I am forgetting.

It's kind of funny now, though none of these incidents made me laugh at the time.


----------



## taem

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Cool yeah I got LL some where
> 
> How hard are these things to clock and is it really worth it?


Overclocking is easy, just move sliders until your system crashes or your screen goes weird.

As to whether it's worth it, well... On benchmarks it is.

Here is 1000 core, 1300 mem




http://www.3dmark.com/fs/4327432

And here is 1180 core, 1500 mem




http://www.3dmark.com/fs/4331038

So the graphics subscore of Firestrike goes from 10832 at 1000 core / 1300 mem to 13059 at 1180 core / 1500 mem. It's a pretty impressive jump.
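As a quick sanity check of those numbers (a throwaway snippet, not from the thread), the relative gains work out like this:

```python
# Relative gains from the two Firestrike runs quoted above.
core = (1000, 1180)      # MHz, before/after
mem = (1300, 1500)       # MHz, before/after
score = (10832, 13059)   # Firestrike graphics subscore, before/after

def pct_gain(before, after):
    """Percentage increase from before to after."""
    return (after - before) / before * 100

print(f"core clock: +{pct_gain(*core):.1f}%")   # +18.0%
print(f"mem clock:  +{pct_gain(*mem):.1f}%")    # +15.4%
print(f"score:      +{pct_gain(*score):.1f}%")  # +20.6%
```

In this run the score scaled slightly better than the core clock, which is why the overclock looks so good on benchmarks.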

In game, though? The effects are far less noticeable. Me, I game at a constant 60fps at 1440p. To get that I have to lower settings on current titles, even less demanding ones like BioShock Infinite. Most settings don't carry much of a performance hit; there are just a few you have to dial back. And on those, going from 1000 core to 1180 core doesn't really make a difference in many titles: the overclock isn't enough to let me raise a demanding setting beyond what I can get with the lower clock.

If I had G-Sync or FreeSync, or played without V-Sync on, then every clock would count, so yeah, it'd be totally worth it. But with my current setup, not really; there's no real difference between the 1070 core I can run with +0mV adjust and the 1180 core I can run with +100mV adjust, I just use more watts with no change in gameplay.


----------



## tsm106

Quote:


> Originally Posted by *taem*
> 
> Overclocking is easy, just move sliders until your system crashes or your screen goes weird.


It's easy for a lil while until you get to the upper end, then it's not so easy anymore.


----------



## taem

Quote:


> Originally Posted by *tsm106*
> 
> It's easy for a lil while until you get to the upper end, then it's not so easy anymore.


How so? The mechanics remain easy.


----------



## tsm106

Quote:


> Originally Posted by *taem*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> It's easy for a lil while until you get to the upper end, then it's not so easy anymore.
> 
> 
> 
> How so? The mechanics remain easy.

At the upper end you get into nuances, and that's not mentioning the cooling required. Going back to the nuances: you get awfully close to the edge of death, and knowing where and when to push is critical, otherwise you fry more cards than not. That said, each benchmark is different. Some like high vram clocks, some do not, even within the same family of benches. For example, Firestrike in quadfire prefers lower memory clocks vs high, while Firestrike Ultra wants massive clocks. I shouldn't be sharing this info though lol, these are tricks of the trade.


----------



## BuildTestRepeat

Quote:


> Originally Posted by *mfknjadagr8*
> 
> next time leave the grip of Kong in the other hand


What had me puzzled is that the cooler could actually move a tiny bit
Quote:


> Originally Posted by *tsm106*
> 
> It's easy for a lil while until you get to the upper end, then it's not so easy anymore.


More time than anything, and the patience to let stress tests run for quite a while. Although usually, if an OC is unstable it will fail within 5 minutes or less. Once you think you have a stable OC, let it run for a few hours. The longer the better.


----------



## DividebyZERO

Quote:


> Originally Posted by *LandonAaron*
> 
> List of things I have broken since getting into enthusiast PCs: a motherboard (I knocked a capacitor off with a screwdriver); a Titanium Fatal1ty sound card (I was making some sort of ghetto fan bracket out of a paperclip and the paperclip ended up against the sound card, shorting it out); 3 hard drives in one fell swoop when I plugged in a fan controller with a modular SATA power cable I had forgotten I'd experimented with, to see how hard it would be to de-pin for a sleeve job; another motherboard, trying to get a CPU cooler backplate off that was attached with adhesive tape; and a GTX 580 (I plugged it into the TV in the living room with a 30 ft. HDMI cable and it went up in smoke; turns out the living room outlet had the ground and positive terminals reversed, and the outlet for the computer didn't have its ground connected at all). That was one month after cancelling my renters insurance. I'm sure there is more I am forgetting.
> 
> It's kind of funny now, though none of these incidents made me laugh at the time.


Sad to admit, but years ago (roughly 15 yrs) I remember one time I got mad at my PC, which I think was running Windows ME and had BSOD'd on me one too many times. So I did what any reasonable 20-year-old with a short-fuse temper did... I punched the computer right in the front middle panel, you know, where the hard drive mounts used to be... needless to say it killed the HDD. I think it was like a 2GB Seagate or something.

Yeah, not proud of that moment. However, the good news is that today we have SSDs, and they don't mind a good punch. Guess I was born a couple decades too early...


----------



## tsm106

Does this look familiar? Someone I know did this. It's been my avy for a while now. Such awesome anger to create this masterpiece.


----------



## DividebyZERO

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> Does this look familiar? Someone I know did this. It's been my avy for a while now. Such awesome anger to create this masterpiece.


I've seen this before and knew it was something to do with you. However, I always saw the small version of it and thought you had done some sort of Frankenstein VRM board soldering. I'd never seen it in large form, so I couldn't see it was broken clean in half.

So now I need to know: did you throw it like a javelin, or did you perform a back-breaker elbow drop piledriver on it?


----------



## tsm106

lol it wasn't me, it was Ranger. When he found out Asus would not warranty it, he got so angry that he snapped it over his knee. I gotta admit though, that took some balls; and granted, the Matrix is a pretty crap card.


----------



## DividebyZERO

Quote:


> Originally Posted by *tsm106*
> 
> lol it wasn't me it was Ranger. When he found out Asus would not warranty it he got so angry that he snapped it over the knee. I gotta admit though that took some balls and granted the Matrix is a pretty crap card.


haha dang... I'm guessing it was already dead then?


----------



## LolCakeLazors

Quote:


> Originally Posted by *Vexzarium*
> 
> Nothing special, just a stable overclock of the GPU and CPU with 8 case fans, 4in/4out.


Still stuck on that stock CPU cooler since I'm planning to get a H240X in the future. For now, I'm keeping my 290X at 1100/1400 due to temperature concerns (1200/1400 reaches 80 degrees at full load).


----------



## sugarhell

Quote:


> Originally Posted by *DividebyZERO*
> 
> haha dang...i'm guessing then it was already dead?


Nope. It wasn't


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *DzillaXx*
> 
> Personally I question what made the die crack.
> 
> Did the thermal paste not spread out to that side of the die?
> 
> Because your cooler didn't look like it would have been over tightened. Though you can get zip ties pretty tight, but IMO not that tight.
> 
> *Perhaps something like a grain of sand or something got stuck under the block when you were installing.*
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Cracking the core like that is a result of imbalanced compression. *Where is a "grain of sand" going to come from? What is he doing, assembling the clc at the beach?* You break stuff, own up to it so I don't know why you are making excuses for him when he isn't making any.
Click to expand...



Quote:


> Originally Posted by *MrWhiteRX7*
> 
> *WCCF reported that attaching CLC's on the beach was absolutely the best scenario* ... Brian Williams invented the beach just for that reason!
> Quote:
> 
> 
> 
> Originally Posted by *DividebyZERO*
> 
> One time we had a customer who lived right on the beach. He has his pc right in front of the sliding glass door. He kept it open all the time. When he brought in his pc because it wouldn't boot. *We looked and it had literally rusted and corroded everything in the back, and inside the pc. I didn't see sand but im sure some was in there somewhere.
> 
> I think it would be fun to assemble a gpu and block on the beach. Just keep the banana hammocks away.*
Click to expand...



Quote:


> Originally Posted by *Vexzarium*
> 
> Show me the "fact". Until then, it is a theory.
> 
> You miss out on your breakfast this morning? Didn't get your coffee yet? Need a Snickers?


Ohh very snappy









I need a Kleenex after looooooling at all of you ........ who cares about heaven anyways


----------



## pshootr

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> Does this look familiar? Someone I know did this. It's been my avy for a while now. Such awesome anger to create this masterpiece.


I always thought that avatar looked like an Electric Chair


----------



## Mega Man

Quote:


> Originally Posted by *ChronoBodi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *supermi*
> 
> I cleared out the registry etc. and re-installed the Omega drivers afterwards.
>
> SoM has exactly the same horrible frame time variance.
> Titanfall is STILL 1/2 the FPS it should be. I can force friendly AFR and I get 120fps SOLID at MSAA x2 and 100
> BF4: same as before at 1080p portrait surround, DX11, 100% scaling, 98-160fps (usually 110-130); Mantle 70-80fps, same settings, same map, SAME MATCH.
>
> Also, to whoever says they do not restart after choosing Mantle in BF4: you need to, or the API does not switch and you have no change in FPS.
>
> I had MSI Afterburner turned off for the test and relied on the in-game FPS counter ...
>
> This was after clearing out all old Nvidia and AMD junk and freshly re-installing the Omega drivers on Windows 8.1. I suppose I could do a full driver reinstall, but I bet the results will be the same, and I can also bet that if I leave the AMD drivers on there and put the 980's back in they will work just the same as before.
>
> I would love any ideas, but I think I will put these cards in a box and wait till the 19th. If AMD actually releases new drivers I will try again; if not, I'm returning these. Also, I was really going to get the 390X, especially with the 8GB RAM it looks to have, but now seeing this I think I might pay whatever Nvidia wants for a product that works and can also scale smoothly if I use 2 of them.
> 
> 
> 
> 
> 
> 
> 
> 
> and
> 
> 
> 
> 
> 
> 
> 
> I wanted to use AMD and am still gonna try but they are making it difficult to give them $$$
> 
> 
> 
> I have no idea lol, SoM is flawless on my rig, same Xfire 290Xs and all. Honestly every rig is different, so, no idea what's bonkers in your case.
Click to expand...

the best part is it was, what, ( at the time ) a $600-700 GPU? ( in the US ) he is in the UK and it was pricier .....

although I don't miss his temper, I miss him. He was a good guy


----------



## BuildTestRepeat

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> Does this look familiar? Someone I know did this. It's been my avy for a while now. Such awesome anger to create this masterpiece.


I about wanted to do this to my 290 after I found the die cracked, lmao! But then I remembered I needed to salvage the VRM and mem heatsinks, bahahaha


----------



## chiknnwatrmln

Only thing PC related I've broken in a rage was my old $20 mouse. It quit working sometimes, I had enough of it and swung it around by the cord a few times and slammed it into my bedpost. It exploded into about 100 pieces and was so satisfying I couldn't even stay mad. When I moved out of the dorm at the end of the year I found several pieces still behind desks and under beds.

There's been times I've wanted to throw my PC out the window or drown it in a bathtub, but I always think about how much money I've invested/wasted on the damn thing before I do any of that..


----------



## Harry604

I bought a 290X OC WindForce on Craigslist.

I checked on the seller's computer and it showed 2816 shaders,

but when I get home and check it says 2560 shaders.

I ran HawaiiInfo and it says 8010005.

The number on the PCB says GV-R929XOC-4GD,

but when I run the serial it comes up as GV-R929OC-GD.


----------



## kizwan

Quote:


> Originally Posted by *Harry604*
> 
> i bought a 290x oc windforce on craigslist
> 
> i checked on the buyers computer and it showed 2816 shaders
> 
> i get home and check and it says 2560 shaders
> 
> i ran hawaiinfo and it says 8010005
> 
> the number on the pcb says gv-r929xoc-4gd
> 
> but i run the serial and it comes up as gv-r929oc-gd


Is this your first 290 card? I mean, is there not another 290 card in your computer? Post a screenshot of GPU-Z over here.


----------



## supermi

Quote:


> Originally Posted by *Harry604*
> 
> i bought a 290x oc windforce on craigslist
> 
> i checked on the buyers computer and it showed 2816 shaders
> 
> i get home and check and it says 2560 shaders
> 
> i ran hawaiinfo and it says 8010005
> 
> the number on the pcb says gv-r929xoc-4gd
> 
> but i run the serial and it comes up as gv-r929oc-gd


seeing as an open-box 8GB 290X turned out to be a 4GB ... either there are some lazy factory workers, or some sneaky people are swapping these things and then reselling/returning them


----------



## Vexzarium

Quote:


> Originally Posted by *supermi*
> 
> being that an open box 8gb 290x turned out to be a 4gb ... there are some lazy factory workers or some sneaky people doing these things then resell /return


I vote someone wanted the 8gb edition and swapped their 4gb out for it and returned it.


----------



## gordesky1

I'd like to join the club: MSI 290X Lightning, stock cooler, 1200/1600.

http://www.techpowerup.com/gpuz/details.php?id=5npuz


----------



## Vexzarium

Quote:


> Originally Posted by *gordesky1*
> 
> Like to join the club msi 290x lightning stock cooler. 1200/1600
> 
> http://www.techpowerup.com/gpuz/details.php?id=5npuz


Welcome, fellow Lightning owner.









I'm actually making my card "mad" right now according to my girlfriend... I'm pushing the overclock a bit to get better benchmarks.


----------



## Mega Man

Quote:


> Originally Posted by *gordesky1*
> 
> stock cooler.


welcome

I'm sorry

here's a fix

http://www.ekwb.com/news/477/19/EK-releases-MSI-R9-290X-Lightning-Full-Cover-water-block/

( had to, sorry )


----------



## Vexzarium

So I've pushed my 290X Lightning to its very limit, then had to back it down a bit... I actually saw some heat out of it for once... this is the highest I've ever seen it run. It peaked at 73c. These are the very highest scores I can get:




I'm going to give Fire Strike another go.

EDIT: Broke the 11k barrier with a 4690k:

New Fire Strike: http://www.3dmark.com/3dm/6270329


----------



## LolCakeLazors

Yours peaks at 73 degrees at 1200+, but mine peaks at 73 at 1150. Well, I guess this is what I get for picking silence over cooling on my case.


----------



## Vexzarium

I have 8 case fans and it is dead silent. Not quite as silent as a Fractal R5, but way more airflow.

Take those vent covers out of the top and put some Noctua or Phanteks fans up there. The Phanteks will match the R5 perfectly.

Do you have both front intake slots filled? Are there any bottom intake slots?

Load it down with high quality fans and it will cool your case and not have to spool up high to do it. Even if my Phanteks SP Fans get up to high RPM, they're still really quiet.

Add a fan hub or fan controller. My case comes with this: http://www.newegg.com/Product/Product.aspx?Item=N82E16811984004

It plugs into one of my PWM fan headers on my mobo, then you plug all of your 3-pin fans into the hub. All the fans are then regulated by PWM.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Harry604*
> 
> i bought a 290x oc windforce on craigslist
> 
> i checked on the buyers computer and it showed 2816 shaders
> 
> i get home and check and it says 2560 shaders
> 
> i ran hawaiinfo and it says 8010005
> 
> the number on the pcb says gv-r929xoc-4gd
> 
> but i run the serial and it comes up as gv-r929oc-gd


Flick the card's BIOS switch over?


----------



## LolCakeLazors

Quote:


> Originally Posted by *Vexzarium*
> 
> I have 8 case fans and it is dead silent. Not quite as silent as a Fractal R5, but way more airflow.
> 
> Take those vent covers out of the top and put some Noctua or Phanteks fans up there. The Phanteks will match the R5 perfectly.
> 
> Do you have both front intake slots filled? Are their any bottom intake slots?
> 
> Load it down with high quality fans and it will cool your case and not have to spool up high to do it. Even if my Phanteks SP Fans get up to high RPM, they're still really quiet.
> 
> Add a fan hub or fan controller. My case comes with this: http://www.newegg.com/Product/Product.aspx?Item=N82E16811984004
> 
> Which plugs into one of my PWM fan headers on my mobo then you plug in all of your 3 pin fans to the hub. All the fans are now moderated by PWM.


I got 2 Phanteks up front, 1 FD fan at the bottom, 1 at the exhaust, all plugged into a 3-pin fan controller that spins them at max. Does having fans at the top really make a big difference?


----------



## Vexzarium

Quote:


> Originally Posted by *LolCakeLazors*
> 
> I got 2 Phanteks up front, 1 FD fan at the bottom, 1 at the exhaust, all plugged into a 3 pin fan controller that spins them at max. Do having fans at the top really make a big difference?


The one at the top towards the farthest back point, absolutely. Personally, the top exhaust near the front is the least significant for me, as I do not use multiple storage drives; a 240GB SSD is plenty for me. But the two that sit right above the mobo are pretty damn good at forcing the hot air out of your case. Remember, heat rises. Take a look at that Phanteks fan hub, it is fantastic. Literally plug it into a 4-pin chassis fan header on your mobo, then plug it into a SATA power lead... then plug all of your fans, except the CPU cooler fan, into the fan hub. You won't regret it, and your case will still be as silent as it was with the vent covers on top.

With the stock Phanteks Enthoo Pro, you get 1 140mm exhaust and a 200mm intake. The first thing I added was 3 top exhausts, and my temps went down by about 5-10c depending on the circumstance. For example, with the stock fan configuration I'd idle at 33c for the CPU and 36c for the GPU. Now both idle at 27c. Granted, I have taken the stock 200mm intake out and replaced it with 2 140's, also added 2 120mm bottom intakes, and have three additional top 140's for exhaust on top of the stock rear 140.


----------



## BuildTestRepeat

Has anyone had experience with Sapphire RMA? From looking around, some say they put a stock cooler back on and sent it in without issues. Hmmmmm. I feel finding a stock Sapphire R9 290 cooler will be the challenge.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Anyone had experience with Sapphire RMA? After looking around some say they put a stock cooler on and sent it back without issues. Hmmmmm. I feel finding a stock r9 290 sapphire cooler will be the challenge.


I might still have all of mine lol. I'll check when I get home


----------



## kaspar737

Could anyone comment on how well VRM is cooled on Tri-X 290?


----------



## taem

Quote:


> Originally Posted by *kaspar737*
> 
> Could anyone comment on how well VRM is cooled on Tri-X 290?


*Extremely* well. I had it as the top card in xfire in an air cooled Define R4 (gpu #1; gpu #2 is a Powercolor PCS+), and the temps playing Far Cry 3 with ambient around 22c looked this good



and on a Metro LL bench loop heat test



I'm surprised how well the vrm is cooled since the base plate covers it fully, with thermal pad in between, so the cooler fans blow on a sheet of metal. On the PCS+, the vrms have a heat sink + thermal pad, and it's left open with no base plate over it so the fans blow over aluminum heat sinks.

The Tri-X is the best 290 you can get imho. PCS+ and Club3D Royal Ace cool better and the fans are quieter but they are gigantic. Not that the Tri-X is small. But PCS+ and Royal Ace are ridiculous.


----------



## kaspar737

Quote:


> Originally Posted by *taem*
> 
> *Extremely* well. I had it as the top card in xfire in an air cooled Define R4 (gpu #1; gpu #2 is a Powercolor PCS+), and the temps playing Far Cry 3 with ambient around 22c looked this good
> 
> 
> 
> and on a Metro LL bench loop heat test
> 
> 
> 
> I'm surprised how well the vrm is cooled since the base plate covers it fully, with thermal pad in between, so the cooler fans blow on a sheet of metal. On the PCS+, the vrms have a heat sink + thermal pad, and it's left open with no base plate over it so the fans blow over aluminum heat sinks.
> 
> The Tri-X is the best 290 you can get imho. PCS+ and Club3D Royal Ace cool better and the fans are quieter but they are gigantic. Not that the Tri-X is small. But PCS+ and Royal Ace are ridiculous.


Shouldn't Tri-X be one of the quietest? I remember one review said that the Powercolor is very loud.


----------



## taem

Quote:


> Originally Posted by *kaspar737*
> 
> Shouldn't Tri-X be one of the quietest? I remember one review said that the Powercolor is very loud.


The Powercolor PCS+ isn't quiet, but it's not too bad. The tonal quality of the fans is also better than the Tri-X, which is higher in pitch.

The default BIOS caps the fan speed on the Tri-X at 60% ( at least at the temps I was getting ) while the PCS+ goes to 80%. At the same % fan speed, the PCS+ is far less noticeable.

The Tri-X shroud is also part plastic, and if you grip it a bit tight while installing, it can make the fans get really loud, and it's hard to readjust back to normal.


----------



## rv8000

Very sad day for me today, bad luck getting a defective panel.

http://www.overclock.net/t/1546860/first-hands-on-experience-with-benq-xl2730z


----------



## BuildTestRepeat

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I might still have all of mine lol. I'll check when I get home


Awesome. I appreciate it!


----------



## narmour

I'm looking for a quick bit of advice, guys. I know I could find it in this thread, but it's massive!

I have a reference Asus R9 290X which I'm putting under water this weekend. What's the score with overclocking it, in terms of max voltages etc.? And do I need to flash the BIOS?


----------



## LandonAaron

Quote:


> Originally Posted by *narmour*
> 
> I'm looking for a quick bit of advice guys - I know I could find it in this thread; bit it's massive!
> 
> I have a reference Asus r9 290x which I'm putting under water this weekend - what's the score with overclocking it? in terms of max voltages etc and do I need to flash the BIOS?


No need to flash unless you want the card to never throttle down, or want to give it massive voltage. Sapphire Trixx is a good program for OC, as it will allow you to give the card +200mV vs. the +100mV using MSI AB. However, a lot of cards have problems with the screen blanking when giving the card more than +100mV, especially when running >1080p. As far as overclocking, you can try shooting for 1200/1600 or 1100/1500 out of the gate with +100mV and +50 power, then just work your way up or down from there. Heaven doesn't work for me for testing OC though. I can overclock very high with Heaven, but as soon as I load up a real game I get green lines and double vision. So I say just test with an intensive game like Dying Light or BF4. You can disable and re-enable the monitor, or change resolutions on the monitor, to reload video without having to close out of the game after an unsuccessful OC. You can play with Aux voltage too if you like, though what effect it has, if any, is less certain. Someone recently claimed that it helps with screen blanking though.
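The "pick a starting point, then work your way up or down" approach above can be sketched as a simple search loop. This is purely illustrative: the starting clocks are the ones suggested in the post, but the 25 MHz step, the function and parameter names are made up, and `is_stable` is a placeholder for a real stability test (an intensive game, per the post, not Heaven).

```python
# Toy sketch of incremental overclock testing: step the core clock up
# from a known-good point until the stability test fails, then keep the
# last passing step. Names and step size are illustrative assumptions.

def oc_search(start_core, start_mem, is_stable, core_step=25, max_core=1300):
    """Walk the core clock up from start_core until the stability test
    fails; return the highest core clock that passed, or None."""
    core = start_core
    best = None
    while core <= max_core:
        if is_stable(core, start_mem):
            best = core        # this step passed; try a higher one
            core += core_step
        else:
            break              # first failure: stop, keep the last pass
    return best

# Pretend the card happens to be game-stable up to 1180 MHz core:
print(oc_search(1100, 1500, lambda core, mem: core <= 1180))  # -> 1175
```

In practice the "test" is you playing BF4 for a while at each step, so the loop runs over days, not milliseconds, but the shape of the search is the same.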


----------



## Mega Man

Quote:


> Originally Posted by *Vexzarium*
> 
> Quote:
> 
> 
> 
> Originally Posted by *LolCakeLazors*
> 
> I got 2 Phanteks up front, 1 FD fan at the bottom, 1 at the exhaust, all plugged into a 3 pin fan controller that spins them at max. Do having fans at the top really make a big difference?
> 
> 
> 
> The one at the top towards the farthest back point, absolutely. Personally, the top exhaust near the front is the least significant for me as I do not use multiple storage drives, 240gb SSD is plenty for me. But the two that sit right above the mobo are pretty damn good at forcing the hot air out of your case. *Remember, heat rises*. Take a look at that Phanteks Fan Hub, it is fantastic. Literally plug that into a 4 pin chassis fan header on your mobo, then plug it into a SATA power lead... then plug all of your fans, not the cpu cooler fan, into the fan hub. You won't regret it and your case will still be as silent as it was with the vent covers on top.
> 
> With the stock Phanteks Enthoo Pro, you get 1 140mm exhaust and a 200mm intake. The first thing I added was 3 top exhausts and my temps went down by about 5-10c depending on the circumstance. For example, with the stock fan configuration, I'd be idle at 33c for the CPU and 36c for the GPU. Now both are idle at 27c. Granted, I have taken the stock 200mm intake out and replaced it with 2 140's. Also added 2 120 mm bottom intakes. And have three additional top 140's for exhaust on top of the stock rear 140 for exhaust.
Click to expand...

First @LolCakeLazors

it really depends. It is your case and your components; however, more airflow is usually better. That said, static pressure will help more than CFM in most cases. There are too many component combos to test for anyone to say "do this"; test it for yourself.

some people prefer negative pressure ( more air out than in ), most prefer positive pressure ( more in than out ),
and some equal pressure.

please note, just because I only use 1 fan as exhaust does not mean that is a bad thing ( 24 in, 1 out. And yes, twenty-four ). I use an extremely powerful fan as an exhaust and ramp it up/down aggressively as needed.

it works best for me. Find out what works for you.

if you do have positive pressure, air will flow out of anything with a gap.

so don't be afraid of more in than out. Use good fans ( again, static pressure is more important, but really you need a P-Q chart to know, which 99% of fans don't publish ).

another thing to look at, to answer your question, is whether you want silence or performance.

we can guide you, but we can't say "do it, it's better". If they do, they lie.
Second @Vexzarium

as to the bold/underline/italics:

please NEVER use this statement when it comes to PCs.

UNLESS you run a 100% passive system, it is just not true.

In a PC with active cooling, heat does not rise, fall, or otherwise; it goes where you push it. The fans make the heat go where they point, period. If you EVER have heat rising in your case, you need better airflow ( again, unless doing a 100% passive build ).

There will never be truth to this statement in a PC otherwise.
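The positive/negative/equal pressure idea being debated here boils down to comparing total intake airflow against total exhaust airflow. A rough sketch, with the caveat that the CFM figures below are invented examples and rated free-air CFM ignores filters, grill restriction, and actual fan speed, so this is only a first guess, not a substitute for the tissue test:

```python
# Rough pressure-balance estimate: sum rated intake CFM vs exhaust CFM.
# All numbers here are hypothetical; real balance depends on restriction
# and fan curves, which rated free-air figures do not capture.

def pressure_balance(intake_cfm, exhaust_cfm, tolerance=5.0):
    """Classify a fan configuration by the difference between total
    intake and total exhaust airflow (both lists of CFM ratings)."""
    delta = sum(intake_cfm) - sum(exhaust_cfm)
    if delta > tolerance:
        return "positive"
    if delta < -tolerance:
        return "negative"
    return "roughly equal"

# e.g. two ~82 CFM intakes vs a single ~82 CFM exhaust leans positive:
print(pressure_balance([82, 82], [82]))  # -> positive
```

The tissue-at-a-vent test described later in the thread remains the honest empirical check; this is just the back-of-envelope version.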


----------



## MrWhiteRX7

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Awesome. I appreciate it!


Woot! Gotcha covered







I knew being a GPU hoarder would one day help someone lol.. I'll send it to you for the cost of shipping. Just PM me if you need it. I have no use for it.


----------



## BuildTestRepeat

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Woot! Gotcha covered
> 
> 
> 
> 
> 
> 
> 
> I knew being a gpu hoarder would one day help someone lol.. I'll send it to you for the cost of shipping. Just PM me if you need it. I have no use of it.


PM sent. Can't say thanks enough, my friend. From one PC nerd to another


----------



## MrWhiteRX7

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Pm sent, Cant say thanks enough my friend. From one PC nerd to another


Just payin' it forward! Love this community!









Check your messages, I've left a couple responses lol.. you just haaaad to bring up cars!


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Awesome. I appreciate it!


When the sapphire techs take off that cooler and see the shattered GPU die ....... no warranty for you ..... But good luck with that


----------



## mus1mus




----------



## pdasterly

That chuckled me


----------



## aaroc

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> Does this look familiar? Someone I know did this. It's been my avy for a while now. Such awesome anger to create this masterpiece.


I thought it was a kid's chair; at 1440p, and on my cellphone, the avatars are tiny, so I didn't realize it was a bent GPU.


----------



## pdasterly

Looks like someone purchased gpu with no warranty


----------



## rt123

The original 970 Video was much much funnier.


----------



## Mega Man

i still wanna see the video of your avatar


----------



## BuildTestRepeat

That video made my night. And i confess i used to be a fan of nvidia lol. I took this series off for sure. The teeth killed me.


----------



## HOMECINEMA-PC

Being an engineer, you'd think he could afford a top denture at least


----------



## rt123

Quote:


> Originally Posted by *Mega Man*
> 
> i still wanna see the video of your avatar


Well here you go




Quote:


> Originally Posted by *BuildTestRepeat*
> 
> That video made my night. And i confess i used to be a fan of nvidia lol. I took this series off for sure. The teeth killed me.


The original spoof is even better




Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Being a developer you'd think he can afford a top denture at least


This is just stuff dubbed over, like the Hitler parodies on YouTube. He doesn't work for Nvidia.


----------



## Mega Man

thanks so much, I knew he sucked on it .... that was awesome!


----------



## rt123

No Problem.
Not the first time someone has asked me for a video of the avatar.


----------



## mus1mus

Quote:


> Originally Posted by *rt123*
> 
> This is just stuff dubbed over, like the Hitler parodys on youtube. He doesn't work for Nvidia.


Nope, he does work against nVidia.

That vid though!


----------



## mfknjadagr8

Quote:


> Originally Posted by *Mega Man*
> 
> thanks so much i knew he sucked on it .... that was awesome !~


Yeah, that was funny, but seeing the avatar I thought it was something more in your area there, Megaman


----------



## Vexzarium

Quote:


> Originally Posted by *Mega Man*
> 
> Second @Vexzarium
> 
> as to the bold/underline/italics
> 
> please NEVER use this statement when it comes to pcs,
> 
> UNLESS you run a 100% passive system it is just not true.
> 
> in a pc with active cooling. heat does not rise, fall or other wise it goes where you push it, the fans make the heat go where they point. period, if you EVER have heat rising in your case you need better airflow ( again unless doing 100% passive build )
> 
> there will never be truth to this statement in a pc otherwise.


So if I have 2x140mm SP front & 2x120mm SP front/bottom for intakes, then 3x140mm SP top & 1x140mm AF back for exhausts, what would this configuration be considered?

And I agree, I likely should not have just said "this will work for everyone". But as you said, it's easier to suggest "more fans!", as more airflow is usually going to help.


----------



## Mega Man

Quote:


> Originally Posted by *Vexzarium*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> Second @Vexzarium
> 
> as to the bold/underline/italics
> 
> please NEVER use this statement when it comes to pcs,
> 
> UNLESS you run a 100% passive system it is just not true.
> 
> in a pc with active cooling. heat does not rise, fall or other wise it goes where you push it, the fans make the heat go where they point. period, if you EVER have heat rising in your case you need better airflow ( again unless doing 100% passive build )
> 
> there will never be truth to this statement in a pc otherwise.
> 
> 
> 
> So if I have 2x140mm SP front & 2x120mm SP front/bottom for intakes. Then 3x140mm SP top & 1x140mm AF back for ehausts. What would this configuration be considered?
> 
> And I agree, I likely should not have just said "This will work for everyone". But as you said, it's just easier to suggest "more fans!", as more airflow is usually going to help.
Click to expand...

I dunno, you're mixing fans, and without a lot of variables I cannot give answers...
Anyway, it seems to work for you, and that is what matters.
An easy way to test is to find a vent on your case without a fan, hold a Kleenex up to it, and see if the Kleenex pulls toward the case ( negative pressure ), pushes away ( positive ), or neither ( equal ).

I know I come off abrasive to most people, and if you feel that way I am sorry.

I just state facts. No emotion behind it, and I hope you don't take it any other way.


----------



## LandonAaron

Yes, Kleenex, a receipt, or cellophane all work well for testing positive or negative pressure. On my case I have an 80mm fan on top of my power supply pulling in air through the PCI slots. If I have it turned to low and I turn my two front intake fans to high, it will stop spinning, as it can't overcome the positive pressure of the case. Turn up the rear exhaust and it starts spinning again. I get a kick out of it every time.

I think the general rule though is that negative pressure gives the best cooling but the most dust, while positive pressure combined with good filters will give you slightly worse cooling but much less dust. I prefer positive pressure for this reason, but aim for just slightly positive: 3x120 + 1x80 in, 1x120 + 1x200 out. I want to see this 24 in, 1 out configuration though.


----------



## Native89

EVGA 650G out. Seasonic X850 in.
Got my VaporX 290 in about a week ago and did a slight overclock, but did not touch the voltage.
Hopefully I can push the core a little further with more voltage. The memory however (Hynix), does not want to budge from 1400.
Coming from a GTX760 the 290 is a nice bump in performance and runs a little cooler to boot.


http://www.techpowerup.com/gpuz/details.php?id=8s2eu


----------



## Vexzarium

Quote:


> Originally Posted by *Mega Man*
> 
> I dunno your mixing fans. And without a lot of variables I can not give answers...
> Any way it seems to work for you and that is what matters
> An easy way to test is find a vent on your case without a fan and take a kleenex and see if the kleenex pulls toward the case ( negative pressure ) or away ( positive ) or neither (equal)
> 
> I know I come off abrasive to most people and if you feel that way I am sorry
> 
> I just state facts. No emotion behind it and i hope you don't take it any other way.


The only vents I have with no fans are the spots under my GPU, and just to the right of that is another vent. The PCI slots under the GPU pulled the tissue in pretty well, while the vent to the right only slightly pulled it in.

And when you're "abrasive" with a bluntly honest person, the blunt person tends to not take much offense.


----------



## Vexzarium

Quote:


> Originally Posted by *Native89*
> 
> EVGA 650G out. Seasonic X850 in.
> Got my VaporX 290 in about a week ago and did a slight overclock, but did not touch the voltage.
> Hopefully I can push the core a little further with more voltage. The memory however (Hynix), does not want to budge from 1400.
> Coming from a GTX760 the 290 is a nice bump in performance and runs a little cooler to boot.
> 
> 
> http://www.techpowerup.com/gpuz/details.php?id=8s2eu


Without a doubt, the best stock cooling for the 290/290X is the Vapor-X and the Tri-Frozr (AKA the MSI Lightning).


----------



## BuildTestRepeat

It's actually under warranty, but Sapphire does not cover cooler removal, thermal paste replacement, etc.... -_-

Guess who will never buy another Sapphire! lol.


----------



## Mega Man

Quote:


> Originally Posted by *LandonAaron*
> 
> Yes kleenex, receipt or celephane all works well for testing positive or negative pressure. On my case I have an 80mm fan on top my power supply pulling in air through the pci slots. If I have it turned to low and I turn my two front intake fans to high it will stop spinning as it can't overcome the positve pressure of the case. Turn up rear exhaust and it starts spinning again. I get a kick out of it every time.
> 
> I think the general rule though is that negative pressure gives the best cooling but the most dust, while positive pressure combined with good filters will give you slightly worse cooling but much less dust. I prefer positive pressure for this reason but aim for just slightly positive. 3x120 + 1x80 in, 1x120 + 1×200 out. I want to see this 24 in 1 out configuration though.


Just look at a TH10 ( Case Labs )....

I have 3x Monsta 480, 1x UT60 480, 1x XT45 480.


----------



## Vexzarium

Okay, so I flipped my front-most top exhaust fan around and it is now an intake. The tissue is now being pushed away from the vents, but only slightly. So basically, I've opted for a slightly-positive approach after Mega Man led me to start researching the matter online. And my case (Phanteks Enthoo Pro) is loaded with dust filters, so all is well.

"REP+" given to the two who assisted on this matter.


----------



## tsm106

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Its actually under warranty, but sapphire does not cover cooler removal, thermal paste replacement etc.... -_-
> 
> Guess who will never buy another sapphire! lol.


http://www.overclock.net/t/1444881/sapphire-after-market-coolers-and-warranty#post_21247787

Quote:


> Originally Posted by *Vexzarium*
> 
> Okay, so I flipped my front most top Exhaust fan around and it is now an Intake. The tissue is now being pushed away from the vents, but only slightly. So basically, I've opted for a slight-positive approach after Mega Man led me to start researching the matter online. And my case(Phanteks Enthoo Pro) is loaded with dust filters, so all is well.
> 
> "REP+" given to the two whom assisted on this matter.


Positive vs Negative the never ending debate...


----------



## Agent Smith1984

Quote:


> Originally Posted by *kaspar737*
> 
> Could anyone comment on how well VRM is cooled on Tri-X 290?


They do pretty decently at stock settings, but the second you break a 100mV offset on voltage, you will see VRM2 get into the 80s (which is still plenty acceptable).

Once you break 150mV+, you will be in the mid 90s (getting into the danger zone).

And at 200mV I saw VRM2 hit 102c, which is what finally led to instability above 1160MHz.... My card is not the best clocker though, so you may not need a ton of voltage to get the OC you are looking for.

I run 1100/1500 at +100mV with no temp issues at all. (73c max core/81c max)
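The voltage-offset-to-VRM2-temperature bands reported above can be condensed into a toy lookup. These thresholds come from one Tri-X 290 in one post, so treat this as an anecdote-derived rule of thumb rather than a spec; the function name and the band strings are invented.

```python
# Toy lookup of the reported VRM2 temp bands on a Tri-X 290 at rising
# core voltage offsets. Thresholds are one user's anecdote, not a spec.

def vrm2_band(offset_mv):
    """Map a core voltage offset (in mV) to the reported VRM2 temp band."""
    if offset_mv <= 100:
        return "80s C (still acceptable)"
    if offset_mv <= 150:
        return "mid 90s C (danger zone)"
    return "100C+ (expect instability)"

print(vrm2_band(100))  # -> 80s C (still acceptable)
print(vrm2_band(200))  # -> 100C+ (expect instability)
```

Your card's VRM temps will vary with case airflow and ambient, so log them in your monitoring OSD rather than trusting any fixed table.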


----------



## tsm106

Open fan coolers give up some VRM cooling. Mount-on heatsinks have less material vs. the full body frame of the reference cooler. Then again, the ref cooler is louder than a banshee at full wail.


----------



## Vexzarium

I really should have paid attention to my 290X Lightning's VRM temps when I hammered it for Valley, Heaven, and Fire Strike. I had the core voltage at +200, memory voltage at +100, Aux voltage at +50, power limit at +50, core clock at 1225, and memory clock at 1575, and peaked at 73c GPU temps... but I forgot to add the VRM temp to the damn OSD. And I do not feel like re-downloading the benchmarks to run them again to check my VRM temps.


----------



## tsm106

Quote:


> Originally Posted by *Vexzarium*
> 
> I really should have paid attention to my 290x Lightning VRM temps when I hammered it for Valley, Heaven. and Fire Strike. I had the Core Voltage at 200. Memory Voltage at 100. Aux Voltage at 50. Power Limit at 50. Core Clock at 1225. Memory Clock at 1575. And peaked at 73c GPU temps... but I forgot to add the VRM temp to the damn OSD. And I do not feel like re-downloading the benchmarks to run them again to check my VRM temps.


Nah, the Lightning is one of the few that can run lower VRM temps vs. core temp, albeit with three plugs connected. That is the point of its stupidly high phase count: spread the load out, thereby reducing VRM temps. With a full-cover block, even at +300mV the VRM temps hardly break 50c, and that's on the stock crap EK pads.


----------



## rt123

Quote:


> Originally Posted by *Vexzarium*
> 
> I really should have paid attention to my 290x Lightning VRM temps when I hammered it for Valley, Heaven. and Fire Strike. I had the Core Voltage at 200. Memory Voltage at 100. Aux Voltage at 50. Power Limit at 50. Core Clock at 1225. Memory Clock at 1575. And peaked at 73c GPU temps... but I forgot to add the VRM temp to the damn OSD. And I do not feel like re-downloading the benchmarks to run them again to check my VRM temps.


Ease up on the Aux voltage; it doesn't help with the overclock. Beyond maybe +5 to +10mV you are just generating unnecessary heat.

Also try to cut down on the mem voltage; rarely have I needed to use all 100mV.

Your VRM temps prolly didn't go over 70C, it's a Lightning after all.
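
For a rough sense of why big offsets cook the VRMs: dynamic power scales roughly with voltage squared times clock. A quick sketch of that scaling; the ~1.20 V / 1000 MHz stock figures below are assumptions for illustration, not this card's actual values:

```python
# Back-of-the-envelope dynamic power scaling: P ~ C * V^2 * f.
# The stock voltage/clock below are assumed values, for illustration only.

def relative_power(v_old: float, v_new: float, f_old: float, f_new: float) -> float:
    """Relative dynamic power of a new voltage/clock vs. the old."""
    return (v_new / v_old) ** 2 * (f_new / f_old)

stock_v, stock_f = 1.20, 1000  # assumed stock ~1.20 V core, 1000 MHz

for offset_mv, clock in [(100, 1100), (150, 1150), (200, 1250)]:
    ratio = relative_power(stock_v, stock_v + offset_mv / 1000, stock_f, clock)
    print(f"+{offset_mv} mV @ {clock} MHz -> ~{ratio:.2f}x stock dynamic power")
```

Even before leakage (which itself rises with temperature), +200mV at 1250MHz lands around 1.7x stock dynamic power under these assumptions, which is why VRM2 climbs so fast past +150mV.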


----------



## Vexzarium

Quote:


> Originally Posted by *tsm106*
> 
> Nah, the lightning is one of the few that can run lower vrm temps vs core temp, albeit with three plugs connected. That is the point of its stupid high phase count, spread the load out thereby reducing vrm temps. With a fullcover block even at +300mv the vrm temps hardly break 50c and that's on the stock crap EK pads.


Very good to know. I've debated removing the 3rd PCI-E plug many times, but for some reason I've just left them all in, even though I hardly run it overclocked with the games I play.
Quote:


> Originally Posted by *rt123*
> 
> Laydown on the Aux Voltage, it doesn't help with overclock, maybe +5 to +10mv above that you are just generating unnecessary heat.
> 
> Also try to cutdown on the Mem voltage, rarely have I needed to use all 100Mv.
> 
> You VRM temps prolly didn't go over 70C, its a Lightning afterall.


Those MSI Afterburner settings were just for bashing its face in for the three benchmarks; I'd never actually run it like that. My girlfriend was staring at the case window half expecting the GPU to reach out and punch me in the face, she claimed I was "making it mad". LoL


----------



## rt123

Quote:


> Originally Posted by *Vexzarium*
> 
> Those MSi Afterburner settings were just for bashing it's face in for the three benchmarks, I'd never actually run it like that.


Good to know.
Although you can prolly push the core overclock further if you remove the unnecessary heat generated by the Aux voltage. Just a suggestion.


----------



## Vexzarium

Quote:


> Originally Posted by *rt123*
> 
> Good to know.
> Although you can prolly push the core overclock further if you remove the unnecessary heat generated by the Aux voltage. Just a suggestion.


This would have been valuable information before I ran the three benchmarks.


I think I'm pretty satisfied with all the scores as is, though. With the settings I mentioned above, I was within 10MHz on the core clock of artifacting. I'm really not inclined to kick this GPU's ass any further... it might actually punch me next time.


----------



## rt123

Glad you had fun overclocking it.


----------



## tsm106

Can it pass an FSU run at +200mV at 1250 core? Leave mem low; no need for any aux/mem volts. This is my litmus test.


----------



## Vexzarium

Quote:


> Originally Posted by *tsm106*
> 
> Can it pass a FSU run at 200mv at 1250 core. Leave mem low and no need for any aux/mem volts. This is my litmus test.


What is "FSU"? Fire Strike Ultra? I'd have to pay for that, yes? And I only have a 1920x1080 monitor.


But if it's free, I'd be inclined to test it for stability.


----------



## tsm106

Download 3DMark 13. It's free, but you have to run the demo. Set your GPU fans to over 85%, btw.


----------



## Vexzarium

Quote:


> Originally Posted by *tsm106*
> 
> Download 3dmark 13. It's free but you have to run the demo. Set your gpu fans to over 85% btw.


Damn you.


Downloading now.

That is 4 benchmarks you've convinced me to re-do now.


You were right about my Heaven Benchmark screenshot needing to not be HTML. Which led to me pounding my GPU to prove I was not cheating... which led to a massively better score. Which led to me wanting to re-do Valley for a proper screenshot and a better score... then I was like, hmm wonder if I can do better than 10700 on Fire Strike, so I re-ran that and got 11200...


----------



## tsm106

Quote:


> Originally Posted by *Vexzarium*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Download 3dmark 13. It's free but you have to run the demo. Set your gpu fans to over 85% btw.
> 
> 
> 
> Damn you.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Downloading now.
> 
> That is 4 benchmarks you've convinced me to re-do now.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You were right about my Heaven Benchmark screenshot needing to not be HTML. Which led to me pounding my GPU to prove I was not cheating... which led to a massively better score. Which led to me wanting to re-do Valley for a proper screenshot and a better score... then I was like, hmm wonder if I can do better than 10700 on Fire Strike, so I re-ran that and got 11200...
Click to expand...

The HTML sub in regard to Heaven, it's just matter-of-fact. It had nothing to do with implying cheats. If it's in HTML, I don't even bother looking at it, hehe. And as for benching, it looks like you have caught the bug.


----------



## Vexzarium

Quote:


> Originally Posted by *tsm106*
> 
> The html sub in regard to Heaven, it's just matter fact. It had nothing to do with implying cheats. If it's in html, I don't even bother looking at it hehe. And for benching, it loks like you have caught the bug.


I think I have caught the bug... Overclocking and benching a card like the 290x Lightning is a blast. This card makes it easy, and it seems to like it rough...

I mean, it's an absolute maniac.


----------



## Vexzarium

The option to launch Fire Strike Ultra is grayed out.


----------



## tsm106

Quote:


> Originally Posted by *Vexzarium*
> 
> The option to launch Fire Strike Ultra is grayed out


Oh, sorry, I guess it's locked till u pay. I haven't seen it on sale in a while, but I don't really look. It's been as low as 5 bucks though. Run the FSE (Extreme); it's not as hard but still works.


----------



## Vexzarium

Quote:


> Originally Posted by *tsm106*
> 
> Oh sorry I guess it's locked till u pay. I haven't seen it on sale in a while but I don't really look. It's been as low as 5 bucks though. Run the FSE (extreme) its not as hard but still works.


Extreme is grayed out as well. You have to pay $30 for 3dMark Advanced.


I don't think my "bug" is big enough to pay for a benchmark.


----------



## tsm106

Quote:


> Originally Posted by *Vexzarium*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Oh sorry I guess it's locked till u pay. I haven't seen it on sale in a while but I don't really look. It's been as low as 5 bucks though. Run the FSE (extreme) its not as hard but still works.
> 
> 
> 
> Extreme is grayed out as well. You have to pay $30 for 3dMark Advanced.
> 
> 
> 
> 
> 
> 
> 
> .
> 
> I don't think my "bug" is big enough to pay for a benchmark.
Click to expand...

You're in the US. Pm me, I might have an extra. You have steam right?


----------



## Vexzarium

Quote:


> Originally Posted by *tsm106*
> 
> You're in the US. Pm me, I might have an extra. You have steam right?


I hardly use Steam, but yes, I have it to launch my modded Skyrim.


----------



## rt123

New Drivers & FreeSync are here guys
http://www.guru3d.com/files-details/amd-catalyst-15-x-download.html

Hype...


Edit: Wrong link. Looks like the driver was pulled due to some issues & will be re-uploaded later today.


----------



## Vexzarium

Quote:


> Originally Posted by *rt123*
> 
> New Drivers & FreeSync are here guys
> http://www.guru3d.com/files-details/amd-catalyst-15-x-download.html
> 
> Hype...


I smell 390X... and my wallet is already hiding under the bed because I'm about to perform liposuction on it.

I'm already shopping for a different mobo; I love mine but it has its limitations (*cough cough* onboard sound). Also contemplating X99 with a 5930K and 16GB of DDR4... figure I can make a bit back from my Z97, 2x8GB DDR3, and 4690K.


----------



## rt123

Quote:


> Originally Posted by *Vexzarium*
> 
> I smell 390x... And my wallet is already hiding under the bed because I'm about to perform liposuction on it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm already shopping for a different mobo, I love mine but it has it's limitations(*cough cough* onboard sound). Also contemplating x99 with a 5930k and 16gb of DDR4... Figure I can make a bit back from my z97, 2x8 ddr3 and 4690k.


I have been ready for a 390X Lightning for a while.


----------



## tsm106

Six months after official release. I'm not in a HURRY. Just paid off all the credit cards... hum di dum di dum.


----------



## rt123

I don't wanna wait 6 months this time. MSI, please.

Maybe it will come sooner this time since there is no mining craze.


----------



## Vexzarium

Quote:


> Originally Posted by *rt123*
> 
> I don't wanna wait 6months this time. MSi please.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Maybe it will come sooner this time since there is no mining craze.


I suppose I should note that unless the 390X reference is cooled like the 295X2 reference, I'll be waiting for an offering from MSI as well. My rig is near dead silent when gaming and I do not want any turbines changing that.


----------



## Devildog83

Quote:


> Originally Posted by *tsm106*
> 
> Six months later after official release. I'm not in a HURRY. Just paid off all the credit cards... hum di dum di dum.


I guess the card aficionado title extends to credit cards too - SWEET !!!


----------



## LandonAaron

I have a bad habit of underestimating my needs. I recently purchased an EVGA 1000W G2 PSU, thinking that this would be plenty of power for my current system, probably enough if I got 2-way CrossFire. Well, I just purchased an R9 290 to pair with my R9 290X, and now I am unsure of whether the G2 1000W is going to be enough power.

Using this PSU calculator http://www.extreme.outervision.com/psucalculatorlite.jsp I calculate that I would need an 872W PSU for CrossFire with my rig. But this is only if I set the capacitor aging to 0. If I bump it up to 20%, it says I need a 1037W PSU. Also, the calculator takes the CPU overclock into consideration but not the GPUs' overclock. So my question is: how many more watts does, say, a +100mV, 1150/1500 overclock pull vs. running stock?

My rig is an i7-4790K @ 4.6GHz/1.31V, 2 x 8GB DDR3 2400MHz RAM at 1.65V, 13 fans, an XSPC X20 pump, and an MCP50X pump. Should I just go ahead and eBay this PSU and get a 1300W unit to be safe? What is the worst that will happen if I go over my PSU's capabilities? Will it just shut down, or will it die trying, so to speak?
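
The calculator's arithmetic is easy to reproduce by hand: sum estimated component draws, derate the PSU for capacitor aging, and compare. A minimal sketch of that estimate; every wattage below is a rough assumption for this rig, not a measured draw:

```python
# Rough PSU headroom check: sum estimated DC draws, derate the PSU for
# capacitor aging, and compare. All component wattages are assumptions.

def psu_headroom(psu_watts: float, loads: dict, aging_pct: float = 0) -> tuple:
    """Return (total estimated draw, derated PSU capacity, fits?)."""
    total = sum(loads.values())
    derated = psu_watts * (1 - aging_pct / 100)
    return total, derated, total <= derated

# Hypothetical estimates for the rig above (4790K @ 4.6 GHz, OC'd 290X + 290,
# pumps, fans, drives); the GPU figures assume ~+40-60 W each for the overclock.
loads = {
    "cpu_oc": 140,
    "gpu_290x_oc": 300,
    "gpu_290_oc": 290,
    "mobo_ram_drives": 60,
    "pumps_and_fans": 70,
}

for aging in (0, 20):
    total, derated, ok = psu_headroom(1000, loads, aging_pct=aging)
    print(f"{aging}% aging: {total} W draw vs {derated:.0f} W capacity "
          f"-> {'OK' if ok else 'tight'}")
```

With these (guessed) numbers the 1000W unit is fine at face value and only gets borderline with an aggressive 20% aging derate, which mirrors the calculator's 872W/1037W answers. As for exceeding capacity: a quality unit with working over-power/over-current protection should simply shut down rather than die trying, though that is not something to rely on routinely.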


----------



## velocityx

Dunno, but I run an i7 DC with 290 CF off of an 850W XFX/Seasonic PSU just fine.


----------



## chiknnwatrmln

1000 watts is more than enough, unless you go balls-to-the-wall. I have an AX860 with a 290x, 290, 3770k, three drives, a sound card, water pump, and over ten fans. Running each card at 1165MHz core I pull around 800 watts from the plug.


----------



## Agent Smith1984

Let's play a game of over/under......

Days before my overclocked Vishera and overclocked 290 explode my 850W Raidmax PSU during a Fire Strike combined test: 27

I say over...


----------



## kizwan

Quote:


> Originally Posted by *LandonAaron*
> 
> I have a bad habit of under estimating my needs. I recently purchased an Evga 1000w G2 PSU, thinking that this would be plenty of power for my current system probably enough if I got 2 way crossfire. Well I just purchased an R9 290 to pair with my R9 290X, and now I am unsure of whether the G2 1000W is going to be enough power.
> 
> Using this PSU calculator http://www.extreme.outervision.com/psucalculatorlite.jsp I calculate that I would need an 872w PSU for crossfire with my rig. But this only if I set the capacitor aging to 0. If I bump it up to 20% it says I need a 1037w PSU. Also the calculator takes CPU overclock into consideration but not the GPUs' overclock. So my question is how many more watts does say +100mV, 1150/1500 overclock pull vs. running stock?
> 
> My rig is an i7-4790K @ 4.6Ghz/1.31V, 2 X 8 GB DDR3 2400mhz Ram at 1.65V, 13 fans, an XSPC X20 pump, and an MCP50X pump. Should I just go ahead and Ebay this PSU and get a 1300w unit to be safe? What is the worst that will happen if I go over my PSU capabilites? Will it just shut down or will it die trying so to speak?


Should be fine. I think my CPU draws more power than yours & I have no problem here.


----------



## Vexzarium

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Let's play a game of over/under......
> 
> Days before my overlocked vishera and overlcoked 290 explode my 850W Raidmax PSU during a Firestrike combined test: 27
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I say over...


Raidmax PSU? I say you have 10 days or less.

I shouldn't say that, really. I picked this Rosewill PSU because it was well reviewed on JonnyGURU.com and it has really shown why. I was of the mind that a Rosewill PSU would take a dump on me in 10 days or less. And I've beaten this thing into the ground with benchmarks. Never even heard this thing make a peep.

I guess Rosewill knows it's a good PSU as well. Apparently it once had a 5 year warranty. Now it has a 7 year warranty.

(http://www.jonnyguru.com/modules.php?name=NDReviews&op=Story&reid=266)


----------



## rdr09

Leak test . . .



hopefully the new driver is out by the time i get done with everything. lol

edit: are my connections to the bridge correct? thanks.


----------



## Vexzarium

Quote:


> Originally Posted by *rdr09*
> 
> Leak test . . .
> 
> 
> 
> hopefully the new driver is out by the time i get done with everything. lol
> 
> edit: are my connections to the bridge correct? thanks.


nice flashlight...


----------



## mAs81

Quote:


> Originally Posted by *rdr09*
> 
> Leak test . . .
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> hopefully the new driver is out by the time i get done with everything. lol
> 
> edit: are my connections to the bridge correct? thanks.


Looking good


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Leak test . . .
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> hopefully the new driver is out by the time i get done with everything. lol
> 
> *edit: are my connections to the bridge correct?* thanks.


Looks fine for serial.


----------



## BuildTestRepeat

Does anyone know where i could get the all screws to mount a reference cooler?


----------



## battleaxe

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Does anyone know where i could get the all screws to mount a reference cooler?


Ace Hardware or some other hardware store. You might have to order them, but once you figure out the threads (I think there are only two different threads) you could order the correct lengths. If memory serves, there are two different length screws: one about 1/4" long and a few others a bit shorter. I can't remember, though. Shouldn't be too hard to figure out if you find a store with lots of screw types.

Fastenal is another company that sells lots of odd-sized screws, or can order almost anything too.


----------



## Klocek001

Is the 290 Tri-X +25mV out of the box, or am I getting a wrong reading?


----------



## LandonAaron

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Does anyone know where i could get the all screws to mount a reference cooler?


I would ask nicely in the OCN water cooling club guide/thread. You won't find those at a hardware store. The smallest screw Lowe's and Home Depot sell is still larger than every screw in my rig. The screws for the cooler are some of the smallest screws you will find. Maybe an eyeglass repair kit or watch repair kit might have the size screws you need, but I think your best bet is going to be finding someone who doesn't need theirs anymore. If you're still trying to do an RMA, though, I think it's a lost cause.


----------



## BuildTestRepeat

Quote:


> Originally Posted by *LandonAaron*
> 
> I would ask nicely in the OCN water cooling club guide/thread. You won't find those at a hardware store. The smallest screw lowes and home depot sell is still larger than every screw in my rig. The screws for the cooler are some of the smallest screws you will find. Maybe a eyeglass repair kit, or watch repair kit might have the size screws you need, but I think your best bet is going to be finding someone who doesn't need theirs anymore. If your still trying to do an RMA though I think its a lost cause.


Will do. Just to clarify, I'm looking for the screws for the I/O plate and the ones used to attach the cooler to the card. MrWhiteRX7 very generously said he would send me a reference cooler he had no use for, to try and RMA my card.


----------



## tsm106

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Quote:
> 
> 
> 
> Originally Posted by *LandonAaron*
> 
> I would ask nicely in the OCN water cooling club guide/thread. You won't find those at a hardware store. The smallest screw lowes and home depot sell is still larger than every screw in my rig. The screws for the cooler are some of the smallest screws you will find. Maybe a eyeglass repair kit, or watch repair kit might have the size screws you need, but I think your best bet is going to be finding someone who doesn't need theirs anymore. If your still trying to do an RMA though I think its a lost cause.
> 
> 
> 
> Will do, just to clarify im looking for the screws for the IO plate and ones used to attached the cooler to the card. MrWhiteRX7 very generously said he would send me a reference cooler he had no use for to try and RMA my card.
Click to expand...

Dude, you broke it and you're going to RMA it? You're going to waste the effort and resources to ship this cooler around, then ship the card, and you think no one is going to notice the cracked core? As Homecinema said, you're wasting your time; really, man up, dude.


----------



## pengs

Hey, I've got 2 290Xs and Afterburner 3.0.0 Final. I've unlocked voltage monitoring and control in AB and restarted AB, but it doesn't give me any voltage readings or control; fan speed and usage monitoring is also either rounded or delayed (doesn't seem correct all the time), and the 2nd GPU reads 0C at desktop.
*Turned ULPS off in AB and enabled unified GPU usage monitoring (also in AB) and all is good. Updated from AB 3.0.0 Final to 4.1.0 Final, which seems stable and gave me a voltage reading and a slider.*

Should I go to a newer version of AB, or is there something in the CCC I need to tick? I've read about disabling ULPS; maybe that would take care of the monitoring problem.
Can someone recommend an Afterburner version which is stable?
-
BTW, I'm trying to set up a CF profile for some games like Cities: Skylines and have used 1x1, AFR, and default, but none of them seem to work. Is AFR the most common type of frame rendering to use, or am I doing something wrong?
*Now that I can see the correct voltage readings and usage I can confirm CF easily.*


----------



## BuildTestRepeat

Quote:


> Originally Posted by *tsm106*
> 
> Dude, you broke it and you're going to rma it? You're going waste the effort, resources to ship this cooler around, then ship card, and you think no one is going to notice the cracked core? As Homecinema said, you're wasting your time and really man up dude.


Clearly I already have. After reading around, I found a few cases of people sending theirs back with luck. Is your glass half full or half empty, my friend? Those "resources" are, what, a whopping $5-10?


----------



## rdr09

Quote:


> Originally Posted by *Vexzarium*
> 
> nice flashlight...


Quote:


> Originally Posted by *mAs81*
> 
> Looking good


Thanks.

Quote:


> Originally Posted by *tsm106*
> 
> Looks fine for serial.


Thanks, tsm. It runs. I'll get two more female QDCs and get ready for the 300 series.


----------



## tsm106

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Dude, you broke it and you're going to rma it? You're going waste the effort, resources to ship this cooler around, then ship card, and you think no one is going to notice the cracked core? As Homecinema said, you're wasting your time and really man up dude.
> 
> 
> 
> Clearly i already have. After reading around i found a few cases of people sending theirs back with luck. Is your glass half full? Or half empty my friend. Those "resources" are what a whopping $5-10?
Click to expand...

You do realize you're talking about fraud on a forum right? How would you feel if you sold a gpu, buyer cracks the die then tries to say it came that way? Is that not fraud?


----------



## DividebyZERO

I was curious if there are any BIOSes outside PT1/PT2 that disable power-saving/ULPS-like settings on the reference 290X/290. I want to try some testing with different BIOSes and X58/SR-2 boards that have NF200 PLX chips. Just for science, and the fact that I have a friend who put 290s in his X58 board with no ill effects, and his does not have the NF200.


----------



## BuildTestRepeat

Quote:


> Originally Posted by *tsm106*
> 
> You do realize you're talking about fraud on a forum right? How would you feel if you sold a gpu, buyer cracks the die then tries to say it came that way? Is that not fraud?


First off, calm down, man. I'm not angry here. Also not here to argue and cover up people who actually need help. If the manufacturer wants to RMA my card, it's their choice. If they send it back saying no, I won't be bummed. There is nothing fraudulent about leaving the decision up to them; I'm obviously not going to send my GELID cooler to them. Nothing personal, as we all have the same passion here, but maybe I don't make big bucks and break my GPUs in half when they break, like yourself?


----------



## mfknjadagr8

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> First off , calm down man. Im not angry here. Also not here to argue and cover up people who actually need help. If the manufacturer wants to RMA my card its their choice. If they send it back saying no i wont be bummed. There is nothing fraudulent about leaving the decision up to them, I'm obviously not going to send my GELID cooler to them? Nothing personal , as we all have the same passion here, But maybe i don't make big bucks and break my GPU's in half when they break like yourself?


lol, it's not about being poor... it's the simple fact that you admit damaging it yet want the company to fix the damage you caused... RMA is for a product that is defective, not one that was damaged by you, but one that, say, missed something in quality control or had a part fail within normal operating procedures... anyone who's ever had a legitimate RMA denied can thank all the people who haven't taken responsibility for their own actions... this is why RMAs are getting harder to do... imo if you overclock and it fails, it's your fault... if you knock off a cap, it's your fault... if you modify it in any way and it fails, it's your fault... personal responsibility is at an all-time low and it's getting worse... rant end


----------



## rt123

New Drivers

http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx#


----------



## rdr09

Quote:


> Originally Posted by *rt123*
> 
> New Drivers
> 
> http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx#


+rep. I installed it, but it is still saying 14.5 in FS. GPU-Z says 15.3, though. Slight difference in graphics scores from an older driver . . .

http://www.3dmark.com/compare/fs/4360640/fs/3110083

Could be the system.


----------



## supermi

Quote:


> Originally Posted by *rt123*
> 
> New Drivers
> 
> http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx#


Are the actual drivers delayed from yesterday?
I am about to pack my 290Xs up to return, but would like to give the new drivers a whirl and see if they can woo me, LOL.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rt123*
> 
> New Drivers
> 
> http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx#
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> +rep.i installed it but it is still saying 14.5 in FS. GPUZ say 15.3, though. Slight difference in graphics scores from an older driver . . .
> 
> http://www.3dmark.com/compare/fs/4360640/fs/3110083
> 
> could the system.
Click to expand...

14.502 is the driver version and 15.3 is the catalyst version. Needless to say they're both correct.


----------



## rt123

Quote:


> Originally Posted by *supermi*
> 
> are the the actual drivers delayed from yesterday?
> I am about to pack my 290x's up to return but would like to give the new drivers a whirl and see if they can woo me LOL


Yes sir.

The official thread from AMD Rep.

http://www.overclock.net/t/1547184/amd-catalyst-15-3-beta-driver-for-windows-os


----------



## GoLDii3

Anyone ever got a faulty card with blackscreen? I'm just asking because I recently bought an Amazon Warehouse Deals one. The R9 290 was like new; I could use 2D clocks without problems, but as soon as I launched some 3D game the screen turned black, though I could still hear the audio of the game in the background.

I could not find a fix, so I RMA'd it and got my money back.

Now I'm tempted to buy a R9 290 Tri-X for 235 euros used, but is it worth it?

I currently have a R9 280X that is stable at 1145MHz on core, at least on Dying Light. I am not looking for an upgrade out of necessity (although I would like to average at least 50 fps on FC4 instead of 35-45), but only as a future-proofing upgrade. I could sell my 280X for 150 euros and spend 85 euros more for the R9 290.


----------



## ZealotKi11er

I just installed the 15.3 Beta drivers and my computer instantly restarts when I run FC4. I can't really test the Omega driver because with CFX off the game runs fine, aka not full-screen mode. Never ever had a problem like this before.


----------



## Vexzarium

Quote:


> Originally Posted by *GoLDii3*
> 
> Anyone ever got a faulty card with blackscreen? Im just asking because i recently bought an amazon warehouse deals one. The R9 290 was like new,i could use 2D clocks without problems but as soon i launched some 3D game the screen turned black,but i still could hear the audio of the game in the background.
> 
> I could not find a fix so i RMA'd it and i got my money back.
> 
> Now im tempted to buy a R9 290 Tri-X for 235 euros used but is it worth?
> 
> I currently have a R9 280X that is stable at 1145 MHz on core,atleast on Dying Light. I am not looking for a upgrade because of necessity (altough i would like to average atleast 50 fps on FC4 instead of 35-45) but only futureproof upgrade. I could sell my 280X for 150 euros and spend 85 euros more for the R9 290.


How much is the 290X Lightning in your currency?

If you can get your hands on one, it would be well worth the upgrade from a 280X.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> 14.502 is the driver version and 15.3 is the catalyst version. Needless to say they're both correct.


I forgot. That's right.


----------



## Vexzarium

Installed the new beta driver... all is well; reset some of the MSI AB settings that require it when a new driver is installed. Seems to be smooth as butta.


----------



## Mega Man

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BuildTestRepeat*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Dude, you broke it and you're going to rma it? You're going waste the effort, resources to ship this cooler around, then ship card, and you think no one is going to notice the cracked core? As Homecinema said, you're wasting your time and really man up dude.
> 
> 
> 
> Clearly i already have. After reading around i found a few cases of people sending theirs back with luck. Is your glass half full? Or half empty my friend. Those "resources" are what a whopping $5-10?
> 
> 
> 
> 
> 
> 
> 
> 
> Click to expand...
> 
> You do realize you're talking about fraud on a forum right? How would you feel if you sold a gpu, buyer cracks the die then tries to say it came that way? Is that not fraud?
Click to expand...

agreed !
Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> You do realize you're talking about fraud on a forum right? How would you feel if you sold a gpu, buyer cracks the die then tries to say it came that way? Is that not fraud?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> First off , calm down man. Im not angry here. Also not here to argue and cover up people who actually need help. If the manufacturer wants to RMA my card its their choice. If they send it back saying no i wont be bummed. There is nothing fraudulent about leaving the decision up to them, I'm obviously not going to send my GELID cooler to them? Nothing personal , as we all have the same passion here, But maybe i don't make big bucks and break my GPU's in half when they break like yourself?
Click to expand...

I think you should read the warranty guarantee!

I bet you also complained about how expensive PC parts are; well, a big reason is warranty fraud.

The reason people don't like it is... well, we pay for that.

It isn't about "their choice"; it is about them agreeing to warranty something for manufacturing defects, and that warranty then being abused.
Quote:


> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BuildTestRepeat*
> 
> First off , calm down man. Im not angry here. Also not here to argue and cover up people who actually need help. If the manufacturer wants to RMA my card its their choice. If they send it back saying no i wont be bummed. There is nothing fraudulent about leaving the decision up to them, I'm obviously not going to send my GELID cooler to them? Nothing personal , as we all have the same passion here, But maybe i don't make big bucks and break my GPU's in half when they break like yourself?
> 
> 
> 
> lol it's not about being poor...it's the simple fact that you admit damaging it yet want the company to fix the damage you caused...rma is for a product that is defective not one that was damaged by you but one that say missed something in quality control or had a part fail within normal operating procedures...anyone who's ever had a legitimate rma denied can thank all the people who.haven't taken responsibility for thier own actions...this is why rmas are getting harder to do...imo if you overclock and it fails it's your fault...if you knock off a cap it's your fault...if you modify it in any way and it fails it's your fault...personal responsibility is at an all time low and it's getting worse....rant end
Click to expand...

well said


----------



## alancsalt

Quote:


> Originally Posted by *BuildTestRepeat*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> You do realize you're talking about fraud on a forum right? How would you feel if you sold a gpu, buyer cracks the die then tries to say it came that way? Is that not fraud?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> First off , calm down man. Im not angry here. Also not here to argue and cover up people who actually need help. If the manufacturer wants to RMA my card its their choice. If they send it back saying no i wont be bummed. There is nothing fraudulent about leaving the decision up to them, I'm obviously not going to send my GELID cooler to them? Nothing personal , as we all have the same passion here, But maybe i don't make big bucks and break my GPU's in half when they break like yourself?
Click to expand...

He didn't break it in half. That is the broken card of another member that broke their graphics card in a fit of anger.

tsm106 just uses it as an instructive avatar... obviously it wasn't rma eligible..


----------



## BuildTestRepeat

Quote:


> Originally Posted by *Mega Man*
> 
> agreed !
> i think you should read the warranty guarantee !
> 
> i bet you also complained about how expensive pc parts are, well a big reason is warranty fraud
> 
> the reason people dont like it is... well we pay for that.
> 
> it isnt about " their choice" it is about them agreeing to warranty something for manufacture defects and the warranty is then abused
> well said


Honestly, some good points here. I've never had to RMA anything. Been lucky, I guess. Never thought about a person who gets denied RMA and so on. Just for the record, it's still in my cabinet lol.
Quote:


> Originally Posted by *alancsalt*
> 
> He didn't break it in half. That is the broken card of another member that broke their graphics card in a fit of anger.
> 
> tsm106 just uses it as an instructive avatar... obviously it wasn't rma eligible..


Ah, I saw the pic posted a few pages back and got people mixed up.


----------



## taem

I would never RMA under false pretenses, but just to play devil's advocate: how many times do some of these GPU makers refuse to honor warranties for fake reasons? I've read a lot of complaints about such shady practices here over the years.

Me, it's why I never buy MSI anymore. Feel free to cheat those guys all you want; they screwed me nicely.


----------



## Vexzarium

Quote:


> Originally Posted by *taem*
> 
> I would never RMA under any false pretenses but just to play devils advocate, how many times do some of these gpu makers refuse to honor warranties for fake reasons? I've read a lot of complaints about such shady practices here over the years.
> 
> Me, it's why I never buy msi anymore. Feel free to cheat those guys all you want, they screwed me nicely.


Please, do tell? I own an MSI 290X Lightning, and I'd hate to be stuck with my fried GPU as an expensive drink coaster. Sort of like Tek Syndicate.






I mean, mine is in perfect condition, but I'm less likely to push it, or OC it at all, if I'm not likely to be able to RMA it should a defect rear its head.


----------



## taem

Quote:


> Originally Posted by *Vexzarium*
> 
> Please, do tell? I own an MSI 290X Lightning, and I'd hate to be stuck with my fried GPU as an expensive drink coaster. Sort of like Tek Syndicate.
> 
> 
> 
> 
> 
> 
> I mean, mine is in perfect condition, but I'm less likely to push it, or OC it at all, if I'm not likely to be able to RMA it should a defect rear its head.


My friend, I scoff at your expensive coaster; I was stuck with a much more expensive doorstop that looked like a laptop.

But actually, when the Lightning hit $350 I was like, maybe it's time to forgive and forget and give MSI another chance. What a deal that is. I didn't, though.


----------



## Vexzarium

Quote:


> Originally Posted by *taem*
> 
> My friend, I scoff at your expensive coaster, I was stuck with a much more expensive doorstop that looked like a laptop.
> 
> But actually when the Lightning hit $350 I was like, maybe it's time to forgive and forget and give msi another chance. What a deal that is. I didn't though.


Well, my 290x is not fried. I was just in fear of frying it.


----------



## chronicfx

Has anyone tried crossfire with Far Cry 4 on the new 15.3 beta? Is it smooth? If not, I'll stay on Omega and continue Dragon Age.


----------



## chronicfx

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I just installed the 15.3 beta drivers and my computer instantly restarts when I run FC4. I can't really test the Omega driver because with CFX off the game runs fine, aka not in full-screen mode. Never ever had a problem like this before.


Seen. I hope this is not common.


----------



## Vexzarium

EDIT: Wrong Location for inquiry.


----------



## Pandora's Box

Quote:


> Originally Posted by *chronicfx*
> 
> Anyone tried crossfire with Far Cry 4 and the new beta 15.3? Is it smooth, if not I will stay at omega and continue Dragon age.


Just tested it on my 295X2: Ultra preset @ 3440x1440. The game ran incredibly smooth, 45-65 fps (I cap my fps at 65 in all my games).

The amazing part? Assassin's Creed Unity is running incredibly smooth too, on the Ultra preset at 3440x1440.


----------



## Vexzarium

Quote:


> Originally Posted by *Pandora's Box*
> 
> Just tested it on my 295X2. Ultra preset @ 3440x1440. Game ran incredibly smooth. 45-65fps (I cap my fps at 65 in all my games)
> 
> The amazing part? Assassins Creed Unity is running incredibly smooth too on Ultra preset at 3440x1440.


Sounds like the 295X2 is a great bargain at its current price. Looks like I can pick one up for $600, and based on what you've said, it sounds like it stomps the 980?


----------



## Pandora's Box

Quote:


> Originally Posted by *Vexzarium*
> 
> Sounds like the 295X2 is a great bargain at its current price. Looks like I can pick one up for $600, and based on what you've said, it sounds like it stomps the 980?


Destroys the 980 and takes on the Titan X head-on (when crossfire works well, that is). It's a fantastic GPU imo. Runs cool and quiet. You get the benefit of crossfire without the hassle of two physical cards. All the heat is dumped outside the case, so you don't have to worry about heat from the second card raising the temp of the first. Just be sure you have the power to run it.

http://www.overclock.net/t/1483021/official-amd-r9-295x2-owners-club/4850_50#post_23654185

That post has a list of certified and tested PSUs.


----------



## Vexzarium

Quote:


> Originally Posted by *Pandora's Box*
> 
> destroys the 980, takes on the Titan X head on. (When crossfire works well that is). It's a fantastic GPU imo. Runs cool and quiet. Get the benefit of crossfire without the hassle of 2 physical cards. All the heat is dumped outside of the case, don't have to worry about heat from the second card raising the temp of the first card. Just be sure you have the power to run it.
> 
> http://www.overclock.net/t/1483021/official-amd-r9-295x2-owners-club/4850_50#post_23654185
> 
> That post has a list of certified and tested PSU's.


I've been wanting one of the 295x2's. So here's my question for you:

This GPU has 8GB of VRAM, correct? So when DX12 comes, will it have 16GB of VRAM? Meaning, is it only 8GB due to it technically being CF'd 290Xs and the current limitations of DX11?


----------



## Pandora's Box

Quote:


> Originally Posted by *Vexzarium*
> 
> I've been wanting one of the 295x2's. So here's my question for you:
> 
> This GPU has 8gb of VRAM, correct? So when DX12 comes, will it have 16gb of VRAM? Meaning, is it only 8gb due to it technically being CF 290x's and the current limitations of DX11?


4GB under DirectX 11; what DX12 will allow is still a gray area. The card has 4GB per GPU, though.


----------



## Vexzarium

Quote:


> Originally Posted by *Pandora's Box*
> 
> 4GB under Directx11, what DX12 will allow is still a gray area. The card has 4gb per gpu though


I thought they put two 8GB-edition 290Xs in there?

https://pcpartpicker.com/part/xfx-video-card-r9295x8qfa


----------



## Pandora's Box

Quote:


> Originally Posted by *Vexzarium*
> 
> I thought they put 2 8gb edition 290x's in there?
> 
> https://pcpartpicker.com/part/xfx-video-card-r9295x8qfa


Marketing. Technically, yes, there is 8GB on the card, but it's 4GB per GPU.


----------



## Vexzarium

Quote:


> Originally Posted by *Pandora's Box*
> 
> marketing. technically yes there is 8GB on the card, but 4GB per gpu


So you'd assume it will be an 8GB card with DX12. That would be a great deal at $600.


----------



## Pandora's Box

Quote:


> Originally Posted by *Vexzarium*
> 
> So you'd assume it will be an 8gb card with DX12. That would be a great deal at $600.


What happens with DirectX12 is all up in the air at this point. I highly doubt we will be able to merge both pools of memory into one big pool.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Vexzarium*
> 
> So you'd assume it will be an 8gb card with DX12. That would be a great deal at $600.


Not really. Just think of it as 4GB even in DX12. It's a good card for 1440p and under. For 4K you have to look for 6GB or 8GB now.


----------



## chronicfx

Quote:


> Originally Posted by *Pandora's Box*
> 
> Just tested it on my 295X2. Ultra preset @ 3440x1440. Game ran incredibly smooth. 45-65fps (I cap my fps at 65 in all my games)
> 
> The amazing part? Assassins Creed Unity is running incredibly smooth too on Ultra preset at 3440x1440.


I have a UM95 too, just have not updated, but damn, I may trade it in for the UM67... Glad to hear it works well.


----------



## Pandora's Box

Quote:


> Originally Posted by *chronicfx*
> 
> I have a UM95 too, just have not updated, but damn, I may trade it in for the UM67... Glad to hear it works well.


That's the one that has FreeSync support, right? Nice. I'd hold out for AMD to fix FreeSync not working with crossfire first, though.









I'm sticking with my UM95; what FreeSync fixes doesn't bug me enough to drop that kind of cash on a monitor for it. Using a frame rate limiter solves the issue 95% of the time anyway.


----------



## chronicfx

Quote:


> Originally Posted by *Pandora's Box*
> 
> Thats the one that has freesync support, right? Nice. I'd hold out for AMD to fix FreeSync not working with crossfire first though
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm sticking with my UM95, what freesync fixes doesn't bug me enough to drop that kind of cash on a monitor for it. using a frame rate limiter solves the issue 95% of the time anyway


I have that 3-year Microcenter warranty on it, so basically I could just walk back in and say I want my $899 back, if I am not mistaken. Then maybe I could even go with the triple 29-inch monitor setup. Still debating, but yes, after crossfire is fixed.


----------



## Vexzarium

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Not really. Just think of it as 4GB even in DX12. It's a good card for 1440p and under. For 4K you have to look for 6GB or 8GB now.


Well, my 290X Lightning on my 1080p monitor is still looking sharp. It should last until the 390X is out and a good 4K monitor is on sale... if you catch my drift.


----------



## Arizonian

Quote:


> Originally Posted by *gordesky1*
> 
> I'd like to join the club: MSI 290X Lightning, stock cooler, 1200/1600.
> 
> http://www.techpowerup.com/gpuz/details.php?id=5npuz


Congrats - added









That's 3 Lightnings added to the list in a row.
Quote:


> Originally Posted by *Native89*
> 
> EVGA 650G out. Seasonic X850 in.
> Got my VaporX 290 in about a week ago and did a slight overclock, but did not touch the voltage.
> Hopefully I can push the core a little further with more voltage. The memory however (Hynix), does not want to budge from 1400.
> Coming from a GTX760 the 290 is a nice bump in performance and runs a little cooler to boot.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/details.php?id=8s2eu


Congrats - added


----------



## Vexzarium

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's 3 Lightnings added to list in a row.
> Congrats - added


Hey, with the price of the Lightning coming down as it has, it's the clear choice if you're in the market for a sub-$400 GPU, including Nvidia's offerings.


----------



## Maintenance Bot

Can I maybe get an opinion from some of you on a small issue? I have a 290X crossfire setup I put together a few weeks back, and I notice the memory clock is always stuck at 1375MHz for GPU #1 at idle, but GPU #2's memory clock at idle is 150MHz. Is that normal behavior? Everything seems to operate ok though.


----------



## tsm106

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Can I maybe get an opinion from some of you on a small issue? I have a 290X crossfire setup I put together a few weeks back, and I notice the memory clock is always stuck at 1375MHz for GPU #1 at idle, but GPU #2's memory clock at idle is 150MHz. Is that normal behavior? Everything seems to operate ok though.
> 
> 
> Spoiler: Warning: Spoiler!


Try this:

http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40


----------



## GoLDii3

Quote:


> Originally Posted by *GoLDii3*
> 
> Has anyone ever got a faulty card with black screens? I'm just asking because I recently bought an Amazon Warehouse Deals one. The R9 290 was like new; I could use 2D clocks without problems, but as soon as I launched some 3D game the screen turned black, though I could still hear the audio of the game in the background.
> 
> I could not find a fix, so I RMA'd it and got my money back.
> 
> Now I'm tempted to buy a used R9 290 Tri-X for 235 euros, but is it worth it?
> 
> I currently have an R9 280X that is stable at 1145 MHz on the core, at least in Dying Light. I am not looking for an upgrade out of necessity (although I would like to average at least 50 fps in FC4 instead of 35-45), only as a future-proofing upgrade. I could sell my 280X for 150 euros and spend 85 euros more for the R9 290.


Anyone?


----------



## rv8000

I can't for the life of me figure out why Afterburner won't apply my OC at startup. I have the profile saved in AB and "apply OC at startup" checked, and sure enough AB displays the OC in the interface, but in games it will only run at default clocks until I manually hit apply. This wasn't an issue with my 970, or on any of my other 290s back when I was on Windows 7 (now 8.1). Ideas?


----------



## rdr09

Quote:


> Originally Posted by *GoLDii3*
> 
> anyone¿?


Just wait for the 300 series. Prices for the 200 series will surely come down even more, or prices of the 300 series might be reasonable enough to get one of those instead.

edit: anyways, difference between 4K and 1440 (100% scale both) . . .


----------



## Maintenance Bot

Quote:


> Originally Posted by *tsm106*
> 
> Try this:
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40


Thanks for the link.


----------



## velocityx

Quote:


> Originally Posted by *rdr09*
> 
> I saw in another thread that even BF4 runs like crap on your rig. Runs fine here.
> 
> Anyway, the Titan X is on sale. 12GB.


BF4 runs butter smooth now. Omega fixed everything. It's the smoothest game I've seen, to be honest.

Was hoping for the same with Hardline.


----------



## Vexzarium

Likely a client issue, not your driver or card. For example, ESO runs like crap on my rig when maxed out... yet my GPU is hardly being used, and my 4690K is taking a beating on one core. Clearly a bottleneck. But that is not due to my hardware; it's poor optimization by the client.

If you're hunting for a new card... yeah, the Titan X looks fun, and hot, and loud... but it performs. I'd still wait for the 390X.


----------



## rdr09

Quote:


> Originally Posted by *velocityx*
> 
> BF4 runs butter smooth now. Omega fixed everything. It's the smoothest game I've seen, to be honest.
> 
> Was hoping for the same with Hardline.


I read Hardline does not support Mantle. I can't play anything like BF4 using DX11 after Mantle.


----------



## velocityx

Quote:


> Originally Posted by *rdr09*
> 
> I read Hardline does not support Mantle. That's a big fubar. I can't play anything like BF4 using DX11 after Mantle.


Hmm, I'm running Mantle with Hardline; it's in the same place where it is in BF4.


----------



## BradleyW

Quote:


> Originally Posted by *rdr09*
> 
> I read Hardline does not support Mantle. That's a big fubar. I can't play anything like BF4 using DX11 after Mantle.


What? No Mantle on this title? Oh dear....I'm not sure about buying it now.


----------



## rdr09

Quote:


> Originally Posted by *velocityx*
> 
> hmm Im running mantle with hardline, its in the same place where it is in bf4.


Then I read wrong. My bad.


----------



## velocityx

I noticed a bug with 15.3: it sets my GPUs in CCC Overdrive to -50, 625MHz mem and 500MHz core.

Probably gonna go back to Omega.


----------



## ZealotKi11er

Is there any way to take 15.3 CF profiles and use them with Omega?


----------



## pshootr

DX11 only gives me like 6 fps less than Mantle. Either way, BF4 is smooth as butter on my rig, even maxed out with AA.


----------



## taem

Quote:


> Originally Posted by *rv8000*
> 
> I can't for the life of me figure out why Afterburner won't apply my OC at startup. I have the profile saved in AB and "apply OC at startup" checked, and sure enough AB displays the OC in the interface, but in games it will only run at default clocks until I manually hit apply. This wasn't an issue with my 970, or on any of my other 290s back when I was on Windows 7 (now 8.1). Ideas?


This can happen when Catalyst Control Center loads after your OC app; CCC will reset your clocks to stock.

Either use Task Scheduler to delay the launch of AB until CCC loads (a time delay works too), or make a batch file that launches AB after CCC loads and put it in your startup folder.


----------



## rv8000

Quote:


> Originally Posted by *taem*
> 
> This can happen when Catalyst Control Center loads after your OC app; CCC will reset your clocks to stock.
> 
> Either use Task Scheduler to delay the launch of AB until CCC loads (a time delay works too), or make a batch file that launches AB after CCC loads and put it in your startup folder.


Thank you +!


----------



## MrWhiteRX7

BF Hardline ran smooth for me even in DX11... I just hate that game. EA actually refunded me


----------



## DividebyZERO

Can anyone check Heaven 4.0 performance with the new 15.3 beta drivers? It feels like I got a decent boost, but I wanted to confirm I'm not doing a noob maneuver and missing something.


----------



## rdr09

Quote:


> Originally Posted by *velocityx*
> 
> I noticed a bug with 15.3 It puts my GPU's in CCC overdrive on -50. 625mem and 500mhz core.
> 
> prolly gonna go back to omega.


Wait, are your 290s reference? Here is what my GPU-Z shows . . .

It's been that way since the 290 launched, both when I crossfired using the previous driver and now.


----------



## Vexzarium

Quote:


> Originally Posted by *DividebyZERO*
> 
> Can anyone check Heaven 4.0 performance with new 15.3 beta drivers? It feels like i got a decent boost but wanted to confirm i am not doing a noob manuever and missing something.


Share a screenshot.


----------



## taem

Well, dang it, my 290 is totally stable in desktop and games at 1200 core.

Problem is, at that clock, the screen stays black about a third of the time when I restart or wake from sleep.

If I can get to the desktop I'm good. As long as I don't get that black screen when restarting or waking, I've never had a glitch.

So now my choice is: use the 1200 clock, but manually apply it and restore the stock clock before I go into sleep mode; or clock down to 1160 so I can use sleep mode; or just keep the PC on all the time and eat the 170W idle power usage.

I *never* get good clockers, CPU or GPU









Quote:


> Originally Posted by *rv8000*
> 
> Thank you +!


Np. Here is the batch file I use for Trixx
Quote:


> @echo off
> title TRIXX
> cd /
> echo Waiting for Catalyst Control Center
> :start
> tasklist /fi "imagename eq CCC.exe" | find ":" > nul
> if errorlevel 1 (echo Loading Sapphire TRIXX) else (goto start)
> timeout /t 10 /nobreak
> start "" "C:\Program Files (x86)\Sapphire TRIXX\TRIXX.exe" -s
> exit


Set Trixx to auto-start and restore clocks, but disable Trixx in Task Scheduler after you save those settings, then put this in your startup folder (C:\Users\\AppData\Roaming\Microsoft\Windows\Start Menu\Programs\Startup) and you're good to go.

Just substitute AB for Trixx to use that app.
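To save others the substitution, here is a minimal sketch of the same batch file adapted for Afterburner. The install path and executable name are assumptions (adjust for your own install), and Trixx's `-s` switch is dropped since Afterburner's command-line switches may differ:

```bat
@echo off
title AFTERBURNER
echo Waiting for Catalyst Control Center
:start
rem When CCC.exe is not running, tasklist prints "INFO: No tasks..." which contains ":"
tasklist /fi "imagename eq CCC.exe" | find ":" > nul
rem errorlevel 1 means find saw no ":", i.e. CCC.exe showed up in the task list
if errorlevel 1 (echo Loading MSI Afterburner) else (goto start)
timeout /t 10 /nobreak
start "" "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe"
exit
```

Same deal as the Trixx version: set AB to restore clocks at startup, disable its own autostart entry, and drop this in the startup folder.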


----------



## DividebyZERO

Quote:


> Originally Posted by *Vexzarium*
> 
> Share a screenshot.



4x 290X at stock; CPU: 2x X5650 @ 4.4GHz

1080p, 8xAA, tess: extreme


----------



## Vexzarium

Quote:


> Originally Posted by *DividebyZERO*
> 
> 
> 4x290x - stock, cpu 4.4ghz x2 x5650
> 
> 1080p, 8xAA, tess: extreme


Oh my... That's a nasty pig you have there.


----------



## cokker

Found some pictures of the MSI 290X Gaming Rev. 2 naked; no water block for me, lol.

http://www.pcgameware.co.uk/blogs/cosmos-ii-gaming-build-part-3/

Card on the left is a Rev.2, right is a Rev.1.


----------



## cephelix

Quote:


> Originally Posted by *cokker*
> 
> Found some pictures of the MSI 290x Gaming Rev.2 naked, no water block for me lol.
> 
> http://www.pcgameware.co.uk/blogs/cosmos-ii-gaming-build-part-3/
> 
> Card on the left is a Rev.2, right is a Rev.1.


Take a look here. If yours looks anything like mine, then an EK full-cover waterblock won't fit without modding the top right corner of the plate.


----------



## cokker

Quote:


> Originally Posted by *cephelix*
> 
> Take a look here. If yours looks anything like mine, then an EK WB fullcover wont fit without modding the top right corner of the plate


Thanks for that. Looks like a lot of random hot points dotted around the card. I'm tempted to leave the stock cooler on and beef up the fans with two or three 92mm PWM fans.

Just out of curiosity, does your "VRM 2" sensor work? Mine was stuck on 82c out of the box, but since overclocking the memory to 1375 it's been stuck on 61c.


----------



## cephelix

Quote:


> Originally Posted by *cokker*
> 
> Thanks for that, looks like a lot of random hot points dotted around the card. I'm tempted to leave the stock cooler on and beef up the fans with 2 or 3 92mm PWM fans.
> 
> Just out of curiosity does your "VRM 2" sensor work? Mine was stuck on 82c out of the box but since overclocking the memory to 1375 it's stuck on 61c.


It does... I've had no problems with the card since day one.


----------



## virpz

Quote:


> Originally Posted by *cokker*
> 
> Thanks for that, looks like a lot of random hot points dotted around the card. I'm tempted to leave the stock cooler on and beef up the fans with 2 or 3 92mm PWM fans.
> 
> Just out of curiosity does your "VRM 2" sensor work? Mine was stuck on 82c out of the box but since overclocking the memory to 1375 it's stuck on 61c.


It could be that the readings on your VRM2 are inverted. If that's the case, then a lower temperature on the VRM2 sensor means a higher real VRM2 temp.


----------



## cokker

I'm sure it'll be fine, I don't have any weird behavior while gaming. I just messed around with the memory overclock again and got it stuck on 69c lol.


----------



## YellowBlackGod

Since our beloved 290X = 380X = high-end gaming, we welcome DirectX 12 on our card, which is as future-proof as Crysis 1 was when it released.

http://www.guru3d.com/news-story/amd-and-nvidia-prep-for-next-gen-directx-12.html


----------



## mAs81

Quote:


> For AMD starting at Radeon HD 7000 and above the API is supported.


That is really good to know , I'm not economically ready to jump on the 3XX wagon


----------



## YellowBlackGod

Quote:


> Originally Posted by *mAs81*
> 
> That is really good to know , I'm not economically ready to jump on the 3XX wagon


Quite honestly, if you have a good custom model of the 290X, which is "AMD's 980", you don't need a new card at all. Tell you what: skip the 3xx series and upgrade when the 4xx series comes out with PCIe 4.0 and a newer revision of HBM.

http://wccftech.com/amd-r9-400-series-gpus-codenamed-arctic-islands/

http://www.tweaktown.com/news/43215/arctic-islands-purported-codename-amds-radeon-r9-400-series/index.html


----------



## mAs81

Quote:


> Originally Posted by *YellowBlackGod*
> 
> Quite honestly, if you have a good custom model of the 290X, which is "AMD's 980", you don't need a new card at all. Tell you what: skip the 3xx series and upgrade when the 4xx series comes out with PCIe 4.0 and a newer revision of HBM.
> 
> http://wccftech.com/amd-r9-400-series-gpus-codenamed-arctic-islands/
> 
> http://www.tweaktown.com/news/43215/arctic-islands-purported-codename-amds-radeon-r9-400-series/index.html


I'm more than pleased with my VaporX 290 at 1440p (now that I got it working again, that is), and, to my knowledge, it is unlockable to a 290X, so I'm not too anxious to change atm.


----------



## dallas1990

I love my two 290X Vapor-X 8GB versions, though they're a bit louder than the Nvidia cards I'm used to. I would watercool them, but I love the air cooler design too much to take it off. Plus I'm getting a new mobo and CPU; it's kind of a pain to work on my system with a custom loop.


----------



## rdr09

Quote:


> Originally Posted by *dallas1990*
> 
> I love my two 290X Vapor-X 8GB versions, though they're a bit louder than the Nvidia cards I'm used to. I would watercool them, but I love the air cooler design too much to take it off. Plus I'm getting a new mobo and CPU; it's kind of a pain to work on my system with a custom loop.


Use QDCs. Got it from following tsm's posts too much, lol.



R9 300 series ready.


----------



## Kriant

yeah, QDCs are a must.


----------



## Maintenance Bot

Quote:


> Originally Posted by *dallas1990*
> 
> I love my 2 290x vapor-x 8gb versions, as well a bit louder then nvidia cards from what I'm used to. I would water cool them but I love the air cooler design to much to take it off. Plus getting a new mobo and cpu. Kind of a pain to work on my system with a custom loop


I have a pair of those same cards also. They're watercooled now, but those stock coolers on them are really, really good.


----------



## dallas1990

Yeah, I'm just used to my watercooled EVGA 780 Ti, which was quiet. Took me an hour to get used to it lol.


----------



## LandonAaron

Hey guys, I need some help.

I recently tried to add an R9 290 to my system to crossfire with my R9 290X; unfortunately I am having a major issue. Every time I try to use my computer, the screen will suddenly go black and the computer will become unresponsive. This happens after about 2 or 3 minutes of use, every time. It will even occur in the BIOS, so I don't think it is a driver issue.

I know both cards work fine, because I have used the 290x for several months now, and I ran with just the 290 in my system for one week while waiting on the water block for the 290x to arrive, and had no issues. The problem seems to be only when both cards are installed.

I don't think it is a PSU issue, as I have an EVGA G2 1000w PSU, and the black screen will occur even in the BIOS when the card should have 0 load and be drawing very little power. It will also occur on the desktop when idling.

In the BIOS I have the PCIe configuration set to Auto. It's a big hassle installing and uninstalling the cards as they are on full-cover blocks, so each time I have to redo the tubing, leak test, etc.

The cards are installed in PCIe slots 1 and 2 on my Asus Maximus VII Hero motherboard.

Any suggestions appreciated.


----------



## DividebyZERO

Does your mobo allow you to disable PCIe slots? Can you use GPU-Z or a temp monitoring program? Maybe something with the GPU blocks' mounting? Have you verified they are in the slots and locked? Silly things, but trying to help.


----------



## LandonAaron

Quote:


> Originally Posted by *DividebyZERO*
> 
> Does your mb allow you to disable pcie slots? Can you use gpuz or a temp monitor program? Maybe something with the gpu blocks mounting? Have you verified they are in the slots and locked? Silly things but trying to help.


Yes, my motherboard BIOS lets me configure the PCIe slot assignments and speeds manually. My next step is to take both blocks off, check for leaks/water on the cards, carefully remount, manually assign PCIe slots in the BIOS and try again. If that doesn't work, the next step will be to try the cards one at a time to see if I can narrow the problem down to an individual card. I am really hoping I don't have to go that far, as the terminal ports on one of the full-cover blocks (EK FC-290) are damaged, and I had to use threadlocker and liquid rubber sealant to stop them from leaking. I have the two terminal ports of the blocks connected with tubing, and I really don't want to have to take that apart and redo it for single-card use, as I don't know if I will be able to get it sealed up against leaks again if I take it apart.

Also, Performance PCs doesn't sell the black FC terminal port; they only sell the clear plexi one. So my only chance of getting a replacement is through EK, which would be slow and expensive (I would have to go through customer service, as it isn't offered on their online store), or through a member. One guy on the EK thread said he could give me one, but he has been silent recently.


----------



## LandonAaron

I just uninstalled CCC and all AMD software in preparation for reinstalling the cards, and I had a thought. I have a QNIX 2710 monitor, and I had used AMD Pixel Patcher and Custom Resolution Utility to patch the driver and create a custom detailed resolution to overclock my monitor. I am thinking that maybe that is what was causing my black screen issue. However, I was getting this issue in the BIOS as well. None of the drivers installed in Windows affect the card while you're in the BIOS, right? The only thing loaded at that point is the card's BIOS, which is unchanged.

Also, does it matter how you have the BIOS switches on your cards set when crossfiring or using multiple cards?


----------



## Gobigorgohome

Some problems here, kind of.

I sold three of my cards to the same person and he is experiencing some black screen-issues.

This is what he wrote to me:
"I have mounted all the cards, but I do not get a picture on my monitors, neither over DVI nor HDMI, not even in the BIOS. My computer boots like normal and I can see the cards over remote desktop." What do you guys think? I never used DVI/HDMI when I had those cards.


----------



## kizwan

Quote:


> Originally Posted by *LandonAaron*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DividebyZERO*
> 
> Does your mb allow you to disable pcie slots? Can you use gpuz or a temp monitor program? Maybe something with the gpu blocks mounting? Have you verified they are in the slots and locked? Silly things but trying to help.
> 
> 
> 
> Yes, my motherboard BIOS lets me configure the PCIe slot assignments and speeds manually in the BIOS. My next step is take both blocks off, check for leaks/water on the card, carefully remount, manually assign PCIe slots in BIOS and try again. If that doesn't work then next step will be to try the cards one at a time to see if I can narrow the problem down to an individual card. I am really hoping I don't have to go that far, as one of the full cover blocks (EK FC-290) terminal ports are damaged, and I had to use threadlocker and liquid rubber sealant to stop it from leaking. I have the two terminal port things for each block connected with tubing, and I really don't want to have to take that thing apart and redo it for single card use, as I don't know if I will be able to get it sealed up again against leaks if I take it apart.
> 
> Also perofrmance pc's doesn sell the black FC Terminal port, they only sell the clear plexi one. So my only chance of getting a replacement is through EK, which would be slow and expensive (would have to go through customer service as it isn't offered on their online store), or though a member. One guy on the EK thread said he could give me one, but then he has been silent recently.
Click to expand...

He meant PCIe switches on the motherboard that allow you to disable individual PCIe slots. Your motherboard doesn't have those, though.

BTW, you don't need to remove the cards. You can test them one by one by disabling Crossfire in CCC and connecting the monitor to the card you want to test.


----------



## LandonAaron

I figured out the problem: one card was overheating. I took the waterblock off and discovered the thermal paste on the die isn't making contact with the waterblock, and the VRAM thermal pads aren't making contact either. The only place it is making contact is the VRM thermal pad. The standoffs on the block are just too tall. Not sure what can be done about this beyond grinding down the standoffs or using thicker thermal pads and a shim. Bought it off eBay.


----------



## DividebyZERO

Quote:


> Originally Posted by *LandonAaron*
> 
> I figured out the problem: one card was overheating. I took the waterblock off and discovered the thermal paste on the die isn't making contact with the waterblock, and the VRAM thermal pads aren't making contact either. The only place it is making contact is the VRM thermal pad. The standoffs are just too tall on the block. Not sure what can be done about this beyond grinding down the standoffs or using thicker thermal pads and a shim. Bought it off eBay.


Makes sense, because I have my water pumps and fans on a separate power supply, and twice I forgot to turn on the pumps and booted to Windows, then black screen lol... I need to hook all my PSUs up on adapters to save me from these idiotic moments.


----------



## ChronoBodi

Why am I getting random millisecond-long spikes of like 135C in what are otherwise normal load temps of 75-80C?

This is from Battlefield Hardline in Mantle, Crossfire enabled.


----------



## pengs

Quote:


> Originally Posted by *ChronoBodi*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Why am i getting random millisecond-long spikes of like, 135c in what is otherwise normal load temps of 75-80c?
> 
> This is from Battlefield Hardline in Mantle, Crossfire enabled.


Driver version maybe, or ULPS. If it's just happening in BF Hardline, it's probably a Mantle thing.


----------



## Mega Man

Quote:


> Originally Posted by *YellowBlackGod*
> 
> Since our beloved 290X = 380X = high end gaming, we welcome DirectX 12 to our card, which is as futureproof as Crysis 1 was when it was released
> 
> 
> 
> 
> 
> 
> 
> .
> 
> http://www.guru3d.com/news-story/amd-and-nvidia-prep-for-next-gen-directx-12.html


Meh, I love the "corporate greed" line when Windows 10 will be free at release if you have Win 7 or 8 >.>


----------



## LandonAaron

Quote:


> Originally Posted by *DividebyZERO*
> 
> makes sense because I have my water pumps and fans on a sep power supply and twice I forgot to turn on the pumps and booted to windows then black screen lol... I need to hook all my psus up on adapters to save the idiotic moments


They make power strips where you can have one device as a master and all the others connected as controlled. When the master device is powered on, all the other components connected get power too. When you turn off the master device, all other components have their power completely cut off. I use it for my monitors and speakers, and in the living room for the subwoofer and amps. Really handy. Just search for "green power strip" or "controlled power strip" on Amazon.


----------



## tsm106

Quote:


> Originally Posted by *ChronoBodi*
> 
> 
> 
> Why am i getting random millisecond-long spikes of like, 135c in what is otherwise normal load temps of 75-80c?
> 
> This is from Battlefield Hardline in Mantle, Crossfire enabled.


One, don't use GPU-Z, and two, if you use AB, check unified usage monitoring. You can also try my guide below.

http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40


----------



## chiknnwatrmln

Finally new drivers that work for me!

I updated to the newest Beta drivers and I don't have the weird issue I was having with 14.12. CCC usage is normal (0-1%) and I can actually use the drop down menus!

How do I enable and use VSR?

Never mind, I got it, but the highest resolution I can go to for either monitor (1080p and 1440p) is 3200x1800. How do I do 4x my resolution?


----------



## BradleyW

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Finally new drivers that work for me!
> 
> I updated to the newest Beta drivers and I don't have the weird issue I was having with 14.12. CCC usage is normal (0-1%) and I can actually use the drop down menus!
> 
> How do I enable and use VSR?
> 
> Nevermind, I got it but the highest resolution I can go for either monitor (1080p and 1440p) is 3200x1800. *How do I do 4x my resolution?*


You can't.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *BradleyW*
> 
> You can't.


That's lame. Why would I want to supersample a resolution that doesn't fit nicely onto my screen?

I guess I can keep my desktop res at 1440p for both monitors, that way windows actually line up from one to the other. Just kinda disappointed that I can't do any meaningful downsampling. I mean, 3200x1800 isn't that big of a step up from 2560x1440.


----------



## Vexzarium

I'm working on a parts list for my GF's rig on PCPartPicker.com, and I have to ask... Is it just my luck or are the 290x's seemingly running low? The Lightning, which is the one she wants, is nowhere to be found! I mean, have people finally seen the light, especially with the price cuts, and moved over to Radeon? Or have they been discontinued in favor of the 300 series?

Either way, I think the i5-4590 + Sapphire 280x Tri-X would be plenty for her for The Elder Scrolls Online @1080p. My 4690k + 290x Lightning is flat out overkill for ESO.

If you're curious, take a look at the 4 "Pandora" builds:

https://pcpartpicker.com/user/Vexzarium/saved/#view=3BghP6


----------



## Mega Man

brand new http://www.ebay.com/itm/NEW-R9-290X-LIGHTNING-LE-AMD-Radeon-R9-290X-512-bits-4Gb-GDDR5-Tri-FROZR-3-fan-/351333933562?pt=LH_DefaultDomain_0&hash=item51cd2261fa

XD

tbh most everything is coming OOS recently ( several mobos, gpus, etc )


----------



## Serandur

Quote:


> Originally Posted by *Vexzarium*
> 
> I'm working on a parts list for my GF's rig on PCPartPicker.com, and I have to ask... Is it just my luck or are the 290x's seemingly running low? The Lightning, which is the one she wants, is nowhere to be found! I mean, have people finally seen the light, especially with the price cuts, and moved over to Radeon? Or have they been discontinued in favor of the 300 series?
> 
> Either way, I think the i5-4590 + Sapphire 280x Tri-X would be plenty for her for The Elder Scrolls Online @1080p. My 4690k + 290x Lightning is flat out overkill for ESO.
> 
> If you're curious, take a look at the 4 "Pandora" builds:
> 
> https://pcpartpicker.com/user/Vexzarium/saved/#view=3BghP6


I've been watching the Lightning on Newegg for 2 weeks now too; it's been out of stock every time I check. I even called MSI tech support to find out if it's been discontinued; they told me it's not, though they don't know for sure. I might just ask Newegg tomorrow if they have any idea about incoming stock. The Lightning 290X at the price it was going for is such a steal and one of the two best 290X models in my opinion; I really hope they didn't discontinue it already.


----------



## LandonAaron

At long last, victory is mine. Ran into a lot of difficulties getting these blocks installed: two leaky ports on one FC Terminal, two cracked bolt holes in the plexi block, a leaky 90 degree snake fitting, and an overheating card from lack of GPU die contact on the top card. But with some trial and error, thread locker, epoxy, and help from the good folks here at OCN, victory is finally mine. First time rocking a multi-GPU setup since my pair of GTX 260's back in the day.

I also delidded my processor at the same time I was installing these cards and blocks. The delid resulted in a small crack in the chip's PCB, so I initially misdiagnosed the overheating problem on the GPU as a damaged CPU, and ran out and purchased two more chips, only to find out that the issue was actually with the video card, not the CPU. I think I really need to learn how to slow down and be more careful.

Anyway, pictured below is a Sapphire R9 290x with an EK Full Cover Nickel and Acrylic Block, and a VisionTek CryoVenom R9 290 with an EK Plexi and Nickel Block. I've already joined the club, but I would like a tally mark placed next to my name.


----------



## DividebyZERO

Quote:


> Originally Posted by *LandonAaron*
> 
> At long last, victory is mine. Ran into a lot of difficulties getting these blocks installed: two leaky ports on one FC Terminal, two cracked bolt holes in the plexi block, a leaky 90 degree snake fitting, and an overheating card from lack of GPU die contact on the top card. But with some trial and error, thread locker, epoxy, and help from the good folks here at OCN, victory is finally mine. First time rocking a multi-GPU setup since my pair of GTX 260's back in the day.
> 
> I also delidded my processor at the same time I was installing these cards and blocks. The delid resulted in a small crack in the chip's PCB, so I initially misdiagnosed the overheating problem on the GPU as a damaged CPU, and ran out and purchased two more chips, only to find out that the issue was actually with the video card, not the CPU. I think I really need to learn how to slow down and be more careful.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway, pictured below is a Sapphire R9 290x with an EK Full Cover Nickel and Acrylic Block, and a VisionTek CryoVenom R9 290 with an EK Plexi and Nickel Block. I've already joined the club, but I would like a tally mark placed next to my name.


grats!!


----------



## MOSER91

Hey guys, over 2 weeks ago I threw my R9 290x in my main rig and tried OC'ing it to 1100MHz +100mV. I started playing Crysis 3 for like 5 minutes or less and it crashed; the screen went black and the card's fan was spinning at like 100%. Now it displays nothing but a black screen. I'm thinking I killed the card, what do you guys think? I had run the card a while ago at 1130MHz stable.


----------



## aaroc

Can I use the EK FC Terminal quad semi-parallel upside down? I have 3 EK-FC R9-290X blocks and 1 EK-FC R9-290X (rev 2) on 3 Sapphire and 1 HIS R9 290X cards.

I think it will be no problem. I want to do this to have the in and out ports reversed, and that position suits my needs better. Thanks!!!!!


----------



## Ized

Any ideas where I could acquire another reference cooler (UK)? Nothing on eBay any time I have looked.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Ized*
> 
> Any ideas where I could acquire another reference cooler (UK)? Nothing on eBay any time I have looked.


for the 290 or 290x?


----------



## DividebyZERO

Quote:


> Originally Posted by *MOSER91*
> 
> Hey guys, over 2 weeks ago I threw my R9 290x in my main rig and tried OC'ing it to 1100MHz +100mV. I started playing Crysis 3 for like 5 minutes or less and it crashed; the screen went black and the card's fan was spinning at like 100%. Now it displays nothing but a black screen. I'm thinking I killed the card, what do you guys think? I had run the card a while ago at 1130MHz stable.


Did it crash and the fans spin up to 100% at any time before? I know someone who's running into this issue, but he just resets and it's good.


----------



## MOSER91

Quote:


> Originally Posted by *DividebyZERO*
> 
> Did it crash and the fans spin up to 100% at any time before? I know someone who's running into this issue, but he just resets and it's good.


Yes it did that a couple times, but now it won't do anything.


----------



## LandonAaron

Quote:


> Originally Posted by *aaroc*
> 
> Can I use the Ek fc terminal quad semi-p upside down? Have 3 ek-fc r9-290x and 1 ek-fc r9-290x (rev 2) on 3 saphire and 1 his r9 290x.
> 
> I think it will be no problem. I want to do this to have the in and out ports reversed and that position suits my needs better. Thanks!!!!!


You can put the FC Terminals on upside down.
Quote:


> Originally Posted by *MOSER91*
> 
> Hey guys, over 2 weeks ago I threw my R9 290x in my main rig and tried OC'ing it to 1100MHz +100mV. I started playing Crysis 3 for like 5 minutes or less and it crashed; the screen went black and the card's fan was spinning at like 100%. Now it displays nothing but a black screen. I'm thinking I killed the card, what do you guys think? I had run the card a while ago at 1130MHz stable.


It sounds like it was overheating to me. My card was just having an overheating problem when I installed a waterblock on it that wasn't making good contact. I could boot up and use the computer for about 5 minutes and then the screen would just go black. The fact that the fan is spinning up is a good indication of heat, and the other symptoms are in line. It would also make sense that it would happen quickly after being put under a load. Now that the card is completely dead, it may not make sense to try and repair it, but you may want to try reseating the cooler and making sure it is making good contact with the card. But if you can RMA, I would just try that before you take the card apart and possibly void its warranty.


----------



## DividebyZERO

Quote:


> Originally Posted by *MOSER91*
> 
> Yes it did that a couple times, but now it won't do anything.


Were you able to game on it before this last crash? My buddy is able to game on his, and his crashes like that when idle or sometimes in the browser.


----------



## MOSER91

Quote:


> Originally Posted by *DividebyZERO*
> 
> were you able to game on it before this last crash? My buddy is able to game on his and his crashes like that when idle or sometimes in browser.


Man, the last time I gamed on them was a couple months ago; I had 2 crossfired in a watercooled system. But I took it apart and had them lying around for a while.
It's ok though, I bought 2 more like 3 weeks ago.


----------



## mfknjadagr8

Quote:


> Originally Posted by *MOSER91*
> 
> Man, the last time I gamed on them was a couple months ago; I had 2 crossfired in a watercooled system. But I took it apart and had them lying around for a while.
> It's ok though, I bought 2 more like 3 weeks ago.


I was getting black screens all over the place... I disabled ULPS and it got better, but it still happened. Turns out it was due to heat... repasted, no more black screens... now they are under water and running 45C max.


----------



## Ized

Quote:


> Originally Posted by *mfknjadagr8*
> 
> for the 290 or 290x?


I assumed the reference heatsink/cooler combo would be the same for both the 290/290X? It's for a 290.


----------



## Pandora's Box

Quote:


> Originally Posted by *Ized*
> 
> I assumed the reference heatsink/cooler combo would be the same for both 290/290X? Its for a 290.


You might have some luck contacting AMD UK


----------



## LandonAaron

Quote:


> Originally Posted by *MOSER91*
> 
> Man the last time I gamed on them was a couple months ago, I had 2 crossfired in a watercooled system. But I took it apart and had them lying around for while.
> It's ok though I bought 2 more like 3 weeks ago.


I was using an EK full cover waterblock when I had my overheating problem. The problem turned out to be some black spacer/protective pad things on the four holes around the GPU die. Removing those allowed the waterblock to sit down lower and make contact with the GPU die and fixed the overheating problem. Do you know what sort of temps you were getting on your cards when you were having the problem?

Also, when you go to mount the block, a quick way to tell if it is making good contact is that when you set it on, it should stick. Meaning even without any screws, the card and block should kind of stick together. If the block is freely sliding, there is no contact between the two. Ideally though, you should apply thermal paste, set the block, check how the paste spread, then clean and reapply. I didn't do this though, as I was almost out of thermal paste and didn't want to waste any, but it came back to bite me in the a.


----------



## LandonAaron

Quote:


> Originally Posted by *Ized*
> 
> Any ideas where I could acquire another reference cooler (UK)? Nothing on eBay any time I have looked.


Go to the watercooling thread at overclockers.co.uk and ask around. I'm sure someone has a spare they could sell you.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Ized*
> 
> I assumed the reference heatsink/cooler combo would be the same for both 290/290X? Its for a 290.


I've got two from my XFX 290 references; I'd be willing to send you one, but you might not want to pay the shipping since I'm in the US... they are loud when running over 50 percent.


----------



## LandonAaron

I have an MCP50X pump that is SATA powered and has a PWM control cable. I have the PWM cable connected to my R9 290's fan header using an adapter cable. When I first installed the driver (15.3 Beta) last night, I was able to adjust the pump speed via MSI AB, but now no matter what I have the fan speed set to in MSI AB, the pump runs at 100%. MSI AB is correctly reporting the pump's RPM, but no matter what the GPU fan speed % is set to, the pump is still chugging away at 4000 RPM (100%).

Should I just try reinstalling the driver to see if I can get it working again? I'm always worried about screwing up my drivers, as it seems people are always having problems with their video card drivers, so I don't really like excessively uninstalling and reinstalling drivers.

Any other suggestions on getting this working again?


----------



## Nashud

Guys, I really need help, I'm desperate. I don't know anyone in real life who could help me with this issue.

In september 2014, I got the following configuration:

G1.Sniper Z97
Intel Core i5 4670k (not overclocked yet, stock cooler)
MSI Radeon R9 290x 4GB OC Gaming
2x4GB Patriot Viper
1TB WD Blue
Thermaltake Smart 730W Modular
Aerocool GT Black Edition

The graphics card is overheating and making really strange noises. After I've been in ANY game (Witcher 1, WoW, LoL, Crysis 1, Oblivion, Skyrim, Assassin's Creed 1, RIFT, etc.) for about an hour or so, I start experiencing barely visible horizontal waves on screen, and right after that the card gets to about 85-90C (185-194F). It's worth mentioning that I'm using an HDMI to VGA signal converter because I have a very old monitor, a Samsung SyncMaster 920nw, 1440x900. Also, my computer case has been open a bit on the side since I got the configuration; I thought that might cool my card a bit before I get a new, larger case.

I also tried turning VSync on in Catalyst and ATI Tray Tools, but I get the same results: 90C, a VERY disturbing noise, and horizontal waves. And yeah, the moment I alt-tab to the desktop, the card goes silent, and as soon as I've tabbed back into the game, it starts going crazy again. Also, my graphics driver is up to date.

The card has warranty, but before I use it I want to try everything possible, because I'll be needing the PC for at least a couple of months. What would you do? I'm really frustrated about this.


----------



## ChronoBodi

Quote:


> Originally Posted by *tsm106*
> 
> One, don't use gpuz, and two if you use AB check unified usage monitoring. You can also try my guide below.
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40


May I ask why not GPU-Z? This never occurred in any other game anyway.


----------



## Mega Man

Quote:


> Originally Posted by *LandonAaron*
> 
> I have an MCP50X pump that is sata powered and has a PWM control cable. I have the PWM cable connected it to my R9 290's fan header using an adapter cable. When I first installed the Driver (15.3 Beta) last night, I was able to adjust the pump speed via MSI AB, but now no matter what I have the fan speed set to in MSI AB the pump runs at 100%. MSIAB is correctly reporting the Pump's RPM, but no matter what the GPU Fan Speed % is set to the pump is still chugging away at 4000 RPM (100%).
> 
> Should I just try reinstalling the driver to see if I can get it working again? I'm always worried about screwing up my drivers, as it seems people are always having problems with their video card drivers, so I don't really like excessively uninstalling and reinstalling drivers.
> 
> Any other suggestions on getting this working again?


Could be several things; yes, I would try reinstalling AB.


----------



## rdr09

Quote:


> Originally Posted by *LandonAaron*
> 
> I have an MCP50X pump that is sata powered and has a PWM control cable. I have the PWM cable connected it to my R9 290's fan header using an adapter cable. When I first installed the Driver (15.3 Beta) last night, I was able to adjust the pump speed via MSI AB, but now no matter what I have the fan speed set to in MSI AB the pump runs at 100%. MSIAB is correctly reporting the Pump's RPM, but no matter what the GPU Fan Speed % is set to the pump is still chugging away at 4000 RPM (100%).
> 
> Should I just try reinstalling the driver to see if I can get it working again? I'm always worried about screwing up my drivers, as it seems people are always having problems with their video card drivers, so I don't really like excessively uninstalling and reinstalling drivers.
> 
> Any other suggestions on getting this working again?


Try the motherboard fan header; could be a hardware issue.

I had a reference 7950 and I replaced the fan assembly with a 3 fan cooler from a 7970 . . . the fan controller borked and kept the fans at 100%.


----------



## LandonAaron

Quote:


> Originally Posted by *rdr09*
> 
> try the motherboard fan header. could be hardware issue.
> 
> i had a reference 7950 and i replaced the fan assembly with a 3 fan cooler from a 7970 . . . the fan controller bork and kept the fans at 100%.


I went to move it to a motherboard header and discovered the cable was loose in the plug. Reseated it and now it's working properly.

On another note, what sort of performance scaling are y'all seeing with Crossfire in Dying Light? I had an R9 290x and just added an R9 290 to my system. Crossfire is enabled, and both cards show full 99% utilization in game, but my performance has barely increased, if at all. I am kind of doubtful whether the second card is really doing anything even though it shows 99% utilization, because the card idles at about 34 degrees, and in game it was still just 34 degrees (I think it may have hit 36 degrees, but that's all). The primary card, on the other hand, was running at 43 degrees and peaked at 47. Is there something else besides just enabling Crossfire under Performance in CCC that I need to do to utilize the second card better?

Here are my MSI AB settings:


----------



## virpz

Quote:


> Originally Posted by *LandonAaron*
> 
> I have an MCP50X pump that is sata powered and has a PWM control cable. I connected it to my R9 290
> I went to move it to a motherboard header and discovered the cable was loose in the plug. Reseated and now its working properly.
> 
> On another note, what sort of performance scaling are yall seeing with crossfire in Dying Light? I had an R9 290x, and just added an R9 290 to my system. Crossfire is enabled, and both cards show full 99% utilization in game but my performance has barely increased if at all. I am kind of doubtful of whether the second card is really doing anything even though it shows 99% utilization, because the card idles at about 34 degrees, and in game it was still just 34 degrees I think it may have hit 36 degrees but thats all. The primary card on the other hand was running at 43 degrees, and peaked at 47. Is there something else besides just enabling crossfire under performance in CCC that I need to do to utilize the second card better?
> 
> Here is my MSIAB settings:


Dying Light = Nvidia-sponsored (coded) game


----------



## Silent Scone

Quote:


> Originally Posted by *virpz*
> 
> Dying light=nvidia sponsored ( coded ) game


Dying Light is a sponsored title, that is all. There are no GameWorks libraries in the game. So what you are implying is incorrect; you're just due a new driver.


----------



## Ized

Thanks for the ideas about where to source a stock cooler, guys.

The UK Overclockers site sounds like the cheapest bet.
Quote:


> Originally Posted by *mfknjadagr8*
> 
> I've got two from my xfx 290 references id be willing to send you one but you might not want to pay the shipping since I'm in the US...they are loud when running over 50 percent


Thanks for the offer mfknjadagr8; even if you gave me the unit for free, the postage alone would probably be far too high.

I wanted to dismantle a reference cooler for the bulky black heatsink part and combine it with my closed loop cooler; I haven't seen many other options to keep the VRMs in check.


----------



## gavstapo

Anyone running Hardline smoothly with Crossfire 290x? Could you please post your CCC settings. Thanks


----------



## Maintenance Bot

Quote:


> Originally Posted by *gavstapo*
> 
> Anyone running Hardline smoothly with Crossfire 290x? Could you please post your CCC settings. Thanks


Hardline ran really badly on my Crossfire 290x setup until I disabled the AMD Gaming Evolved client; now it runs perfectly.
Currently running stock: 1030 core, 1375 memory.


----------



## gavstapo

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Hardline ran really bad on my Crossfire 290x until I disabled AMD Gaming Evolved Client, now it runs perfect.
> Currently running stock, 1030 core, 1375 memory.


Happy to hear. Could you please share your CCC 3D Application Settings, screenshot plz


----------



## Vexzarium

Quote:


> Originally Posted by *Maintenance Bot*
> 
> Hardline ran really bad on my Crossfire 290x until I disabled AMD Gaming Evolved Client, now it runs perfect.
> Currently running stock, 1030 core, 1375 memory.


I honestly didn't think anyone installed Raptr... It seemed like such a mess when I tried it a few months ago.


----------



## rdr09

Quote:


> Originally Posted by *Vexzarium*
> 
> I honestly didn't think anyone installed Raptr... It seemed like such a mess when I tried it a few months ago.


After installing the CCC driver and just before reboot . . . I go to msconfig to uncheck CCC and Raptr in the Startup tab.

I did use Raptr to take screenshots in 4K: 1.5MB file size compared to 28MB using Print Screen in Windows.
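Side note on those file sizes: the gap between the 1.5MB Raptr screenshots and the 28MB Print Screen ones is just compression vs. raw bitmap data. A rough back-of-the-envelope sketch (assuming a 32-bit 3840x2160 capture; the exact bit depth is an assumption, since the post doesn't say how the image was saved):

```python
# Rough sketch: why an uncompressed 4K screenshot is so large.
# Assumes a 3840x2160 capture at 32 bits (4 bytes) per pixel - an assumption,
# not something the original post specifies.
width, height, bytes_per_pixel = 3840, 2160, 4

raw_bytes = width * height * bytes_per_pixel
raw_mib = raw_bytes / (1024 * 1024)
print(f"Uncompressed: {raw_mib:.1f} MiB")  # ~31.6 MiB, in the ballpark of the 28MB figure

# A PNG of a typical game frame compresses to a small fraction of that,
# which is why the Raptr screenshots come out around 1.5MB.
```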


----------



## Vexzarium

Quote:


> Originally Posted by *rdr09*
> 
> after installing CCC driver and just before reboot . . . i go to msconfig to uncheck CCC and raptr in Startup tab.
> 
> i did use raptr to take screenshots in 4K. 1.5MB size compared to 28MB using printscreen in windows.


I'm still a bit old school. I use MSi Afterburner for all my capture needs. When I install a new driver, I select Custom and always uncheck the Gaming Evolved App.


----------



## rdr09

Quote:


> Originally Posted by *Vexzarium*
> 
> I'm still a bit old school. I use MSi Afterburner for all my capture needs. When I install a new driver, I select Custom and always uncheck the Gaming Evolved App.


In my case, AB causes hitching in BF4 using Mantle; Raptr did not cause that. I was not using any other feature while Raptr was running in the background, though, just taking screenshots when needed. I was planning on using the record feature but never really did try it.


----------



## LandonAaron

Quote:


> Originally Posted by *Vexzarium*
> 
> I honestly didn't think anyone installed Raptr... It seemed like such a mess when I tried it a few months ago.


I like Raptr just to keep track of all the games I have installed at one time. I think they are on the right track with the app, though I would much rather see them put the Radeon Pro guy back to work on that app.


----------



## LandonAaron

It seems the 15.3 Beta driver doesn't have the option: "Enable AMD CrossFireX for applications that have no associated profile". Hmm, wonder if I should roll back for Dying Light Crossfire.


----------



## kizwan

Quote:


> Originally Posted by *LandonAaron*
> 
> I have an MCP50X pump that is sata powered and has a PWM control cable. I connected it to my R9 290
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> try the motherboard fan header. could be hardware issue.
> 
> i had a reference 7950 and i replaced the fan assembly with a 3 fan cooler from a 7970 . . . the fan controller bork and kept the fans at 100%.
> 
> 
> 
> I went to move it to a motherboard header and discovered the cable was loose in the plug. Reseated and now its working properly.
> 
> On another note, what sort of performance scaling are yall seeing with crossfire in Dying Light? I had an R9 290x, and just added an R9 290 to my system. Crossfire is enabled, and both cards show full 99% utilization in game but my performance has barely increased if at all. I am kind of doubtful of whether the second card is really doing anything even though it shows 99% utilization, because the card idles at about 34 degrees, and in game it was still just 34 degrees I think it may have hit 36 degrees but thats all. The primary card on the other hand was running at 43 degrees, and peaked at 47. Is there something else besides just enabling crossfire under performance in CCC that I need to do to utilize the second card better?
> 
> Here is my MSIAB settings:
> 
> 
> Spoiler: Warning: Spoiler!

Crossfire profile for Dying Light is currently disabled.

Regarding PWM control, try connecting a PWM fan there and see whether you can control the fan speed. Probably the PWM control on the card has gone kaput. Oops!


----------



## LandonAaron

Quote:


> Originally Posted by *kizwan*
> 
> Crossfire profile for Dying Light is currently disabled.
> 
> Regarding PWM control, try connect a PWM fan there. See whether you can control the fan speed. Probably PWM control on the card gone kaput. Ooops!


Yeah, I guess I should have Googled first. I found one of TSM's posts on it. I was able to get Crossfire working by creating a profile for Dying Light and selecting "optimize 1x1 mode" under Crossfire Mode, but this caused post-processing effects like blood and water in the player's eyes to flicker. Also, the game crashed after about 10 minutes; don't know what's up with that. However, it did significantly improve my FPS. I was getting 50-70 FPS at 1440p, 96Hz, vsync on. With Crossfire enabled I was getting 96 FPS, vsync on, but it made the game pretty stuttery despite the higher FPS. I think the problem was having FPS floating around the refresh rate, going over and then under it. Raising view distance to 100% brought my FPS down to 80, smoothed out the gameplay, and got rid of most of the stutter.

Unfortunately, the flicker and the game crashing mean I will probably just stick to using one card right now, but that's no big deal; a single R9 290x is plenty to play this game at 1440p IMHO. I still think that because of issues like this, a single GPU solution is usually better than a dual GPU setup. (If I could do it all over again, I probably would just get a GTX 980, despite it being theoretically less powerful than 2 x 290's.)

All in all though, I have been extremely pleased since switching to team red. I spent years with Nvidia cards (2 GTX 260's in SLI, GTX 480, GTX 580, GTX 770), and I never had as consistently smooth an experience as I have had with the 290X. It's a beast, and when I think of how much I paid for it...


----------



## thrgk

Any improvements from the new beta drivers?


----------



## Maintenance Bot

Quote:


> Originally Posted by *gavstapo*
> 
> Happy to hear. Could you please share your CCC 3D Application Settings, screenshot plz



Quote:


> Originally Posted by *Vexzarium*
> 
> I honestly didn't think anyone installed Raptr... It seemed like such a mess when I tried it a few months ago.


I didn't even know it was installed and running lol.


----------



## pshootr

Quote:


> Originally Posted by *Maintenance Bot*
> 
> 
> I didn't even know it was installed and running lol.


Ya, they are sneaky with the install. Welcome to the new internet.


----------



## velocityx

Quote:


> Originally Posted by *rdr09*
> 
> after installing CCC driver and just before reboot . . . i go to msconfig to uncheck CCC and raptr in Startup tab.
> 
> i did use raptr to take screenshots in 4K. 1.5MB size compared to 28MB using printscreen in windows.


Just curious, why are you disabling the startup of CCC? Doesn't it, like, disable Crossfire when not running? Or maybe it doesn't do anything.


----------



## rdr09

Quote:


> Originally Posted by *velocityx*
> 
> Just curious, why are you disabling the startup of CCC? Doesn't it, like, disable crossfire when not running? Or maybe it doesn't do anything.


you can still open it like other apps after reboot. just don't want it to be part of the boot . . . cause we all know it is slow to open unless you use an SSD. i never really use CCC other than for enabling crossfire or adjusting scaling if need be.


----------



## YellowBlackGod

Guys, take a look at this. How much potential this awesome GPU has! DirectX 12 and Mantle bring it out.







I will stick with this card till PCI-E 4.0 is available.

http://wccftech.com/amd-r9-290x-fast-titan-dx12-enabled-3dmark-33-faster-gtx-980/


----------



## Native89

Quote:


> Originally Posted by *YellowBlackGod*
> 
> Guys, take a look at this. How much potential this awesome GPU has! DirectX 12 and Mantle bring it out.
> 
> 
> 
> 
> 
> 
> 
> I will stick with this card till PCI-E 4.0 is available.
> 
> http://wccftech.com/amd-r9-290x-fast-titan-dx12-enabled-3dmark-33-faster-gtx-980/


While I do believe that DX12 will probably bring about performance improvements to both camps, I'll hold judgement until a driver is officially released.
Also, WCCF articles should be taken with a HUGE grain of salt.


----------



## Agent Smith1984

If the 290X is hanging with the 980 in DX12, you have to believe it's because of the superior memory bandwidth on the AMD card, right?
I noticed that Mantle ALWAYS uses a lot more VRAM than DX11, which further supports the theory....

Thoughts?


----------



## Forceman

It's an API test, not a GPU test. The relative position of the GPUs is irrelevant. The only thing it really shows is how abysmal the CPU overhead is in AMD's DX11 drivers, considering the 750Ti/3470 in my HTPC doubles my 290X/4790K score there.


----------



## kizwan

API test

720p


1080p


1440p


4K


----------



## thrgk

For 4K 60 Hz, what cable do I need: just DP to DP, or mini-DP to DP? Can someone list a good one? I want to make sure I get a good cable.


----------



## ChronoBodi

Quote:


> Originally Posted by *thrgk*
> 
> For 4K 60 Hz, what cable do I need: just DP to DP, or mini-DP to DP? Can someone list a good one? I want to make sure I get a good cable.


Just get this one. No issues at all.

http://www.amazon.com/Cable-Matters-Gold-Plated-DisplayPort/dp/B005H3Q59U


----------



## LandonAaron

Quote:


> Originally Posted by *thrgk*
> 
> For 4K 60 Hz, what cable do I need: just DP to DP, or mini-DP to DP? Can someone list a good one? I want to make sure I get a good cable.


DisplayPort is the correct cable to use. The R9 290/X has a regular DisplayPort; I assume your display device does too. A cable is a cable; don't fall into the overpriced-cable trap. Tripp Lite and Monoprice both make good cables. When I want the best, I get Tripp Lite.


----------



## thrgk

this would be ok? http://www.amazon.com/Cable-Matters-Gold-Plated-DisplayPort/dp/B005H3Q59U


----------



## tsm106

Quote:


> Originally Posted by *thrgk*
> 
> this would be ok? http://www.amazon.com/Cable-Matters-Gold-Plated-DisplayPort/dp/B005H3Q59U


DisplayPort is a low-voltage interface, so you don't want to get the cheapest cable you can find. They are NOT all the same, which matters especially on runs longer than 6 m. You can use thin-gauge cables on short runs, but for longer ones look for the thicker-gauge cables. Most places sell cheap junk. If you're driving a 4K panel, get the highest-quality cable you can.

I'm using the Monoprice Premium cables but they are short runs. Longer than 10ft and you may want to look at the Accel Ultra cables.
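As a side note on why a single DisplayPort cable can carry 4K at 60 Hz at all, it comes down to simple bandwidth arithmetic. A back-of-the-envelope check with nominal DP 1.2 figures (cable quality still matters for signal integrity, as noted above; blanking overhead adds a few percent and is ignored here):

```python
# Back-of-envelope DisplayPort bandwidth check for 4K @ 60 Hz.
# Nominal DP 1.2 figures; ignores blanking overhead.

def pixel_rate_gbps(width, height, hz, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s for a given mode."""
    return width * height * hz * bits_per_pixel / 1e9

# DP 1.2 HBR2: 4 lanes x 5.4 Gbit/s, with 8b/10b encoding leaving 80% for payload
dp12_payload_gbps = 4 * 5.4 * 0.8          # 17.28 Gbit/s usable

needed = pixel_rate_gbps(3840, 2160, 60)   # ~11.94 Gbit/s of pixel data
print(needed, dp12_payload_gbps, needed < dp12_payload_gbps)
```

So 4K60 at 24-bit color fits within DP 1.2's payload budget; the cable debate above is about whether a given cable can actually sustain the HBR2 signaling rate without link errors, not about the spec itself.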


----------



## LandonAaron

Has anyone had luck using MSI Afterburner's 2D and 3D profiles? MSI AB keeps applying my 3D profile when I am on the desktop.


----------



## thrgk

Yeah, I only need a 3- or 6-foot one, so don't buy the one I listed? Grab a Monoprice one? Got a link?


----------



## tsm106

Quote:


> Originally Posted by *ChronoBodi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> One, don't use GPU-Z, and two, if you use AB, check unified usage monitoring. You can also try my guide below.
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40
> 
> 
> 
> May I ask why not GPU-Z? This never occurred in any other game anyway.
Click to expand...

GPU-Z doesn't support unified memory monitoring, IIRC.

Quote:


> Originally Posted by *LandonAaron*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Crossfire profile for Dying Light is currently disabled.
> 
> Regarding PWM control, try connecting a PWM fan there and see whether you can control the fan speed. Probably the PWM control on the card has gone kaput. Ooops!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, I guess I should have googled first. I found one of TSM's posts on it*. I was able to get crossfire working by creating a profile for Dying Light and selecting "optimize 1x1 mode" under Crossfire Mode, but this caused post-processing effects like blood and water in the player's eyes to flicker.* The game also crashed after about 10 minutes; don't know what's up with that. However, it did significantly improve my FPS. I was getting 50-70 FPS at 1440p, 96 Hz, vsync on. With crossfire enabled I was getting 96 FPS, vsync on, but it made the game pretty stuttery despite the higher FPS. I think the problem was the FPS floating around the refresh rate, going over and then under it. Raising view distance to 100% brought my FPS down to 80, which smoothed out the gameplay and got rid of most of the stutter.
> 
> Unfortunately, the flicker and the crashing mean I will probably just stick to using one card for now, but that's no big deal; a single R9 290X is plenty to play this game at 1440p, IMHO. I still think that because of issues like this, a single-GPU solution is usually better than a dual-GPU setup. (If I could do it all over again, I would probably just get a GTX 980, despite it being theoretically less powerful than 2 x 290s.)
> 
> All in all, though, I have been extremely pleased since switching to team red. I spent years with Nvidia cards (2 GTX 260s in SLI, GTX 480, GTX 580, GTX 770), and I never had as consistently smooth an experience as I have had with the 290X. It's a beast, and when I think of how much I paid for it...
Click to expand...

Turn off the GameWorks settings. Manage your VRAM usage! Don't let VRAM usage get over 80% on average.


----------



## thrgk

http://www.monoprice.com/Product?c_id=102&cp_id=10246&cs_id=1024601&p_id=10582&seq=1&format=2 ?


----------



## tsm106

Quote:


> Originally Posted by *LandonAaron*
> 
> Has anyone had luck using MSI Afterburner's 2D and 3D profiles? MSI AB keeps applying my 3D profile when I am on the desktop.


Read my guide, especially the part on the use of RTSS. AB generally does what it's told to do, barring the occasional bug. A 3D profile kicking in on the desktop is typical of a rogue app.

http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40

Quote:


> Originally Posted by *thrgk*
> 
> http://www.monoprice.com/Product?c_id=102&cp_id=10246&cs_id=1024601&p_id=10582&seq=1&format=2 ?


Yeah, those are the cables I use, but then again I'm not using 4K panels. If you are, you want the best cables instead of wasting time on cables that give you link problems, which are more prevalent on cheap ones. BTW, you can get those off Amazon too.


----------



## thrgk

Which cable would you recommend as the best for 4K at 6 ft?


----------



## tsm106

I would get the Accel Ultra off of Amazon.


----------



## boot318

Did anyone ever change the VRM thermal pad under the Asus DirectCU 290X cooler? The VRM is almost impossible to cool: a 120mm fan over it and it still wants to go to 90C with +40mV and a 25% power limit. Oh, I'm using the "Red Mod" and I haven't seen the core over 55C. I'll probably have to keep it stock for its life.....









Don't buy the Asus DirectCU 290 SERIES!!!!!


----------



## Agent Smith1984

Quote:


> Originally Posted by *boot318*
> 
> Did anyone ever change the VRM thermal pad under the Asus DirectCU 290X cooler? The VRM is almost impossible to cool: a 120mm fan over it and it still wants to go to 90C with +40mV and a 25% power limit. Oh, I'm using the "Red Mod" and I haven't seen the core over 55C. I'll probably have to keep it stock for its life.....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Don't buy the Asus DirectCU 290 SERIES!!!!!


Most of us are aware of this, actually.... The main problem is that Asus screwed up and put the 780 cooler on it, and only 3 of 5 heat pipes even contact the GPU (if I recall correctly)...


----------



## thrgk

Have people noticed the 15.3 betas being better than the Omega ones?


----------



## Mega Man

Quote:


> Originally Posted by *boot318*
> 
> Did anyone ever change the VRM thermal pad under the Asus DirectCU 290X cooler? The VRM is almost impossible to cool: a 120mm fan over it and it still wants to go to 90C with +40mV and a 25% power limit. Oh, I'm using the "Red Mod" and I haven't seen the core over 55C. I'll probably have to keep it stock for its life.....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Don't buy the Asus DirectCU 290 SERIES!!!!!


The size of the fan means little; static pressure is more important, though actual pressure and rated pressure are two different things.

Odds are the temp in your case is high, or there are other issues (heatsinks not making good contact, etc.). I only buy reference cards, but IIRC EK makes a block for it. I suggest real water cooling.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> Quote:
> 
> 
> 
> Originally Posted by *boot318*
> 
> Did anyone ever change the VRM thermal pad under the Asus DirectCU 290X cooler? The VRM is almost impossible to cool: a 120mm fan over it and it still wants to go to 90C with +40mV and a 25% power limit. Oh, I'm using the "Red Mod" and I haven't seen the core over 55C. I'll probably have to keep it stock for its life.....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Don't buy the Asus DirectCU 290 SERIES!!!!!
> 
> 
> 
> Most of us are aware of this, actually.... The main problem is that Asus screwed up and put the 780 cooler on it, and only 3 of 5 heat pipes even contact the GPU (if I recall correctly)...
Click to expand...

He ghetto-rigged it with a water AIO, aka the "Red Mod".


----------



## BradleyW

Quote:


> Originally Posted by *thrgk*
> 
> Have people noticed the 15.3 betas being better than the Omega ones?


I can't tell them apart.


----------



## pengs

Quote:


> Originally Posted by *velocityx*
> 
> Just curious, why are you disabling the startup of CCC? Doesn't it, like, disable crossfire when not running? Or maybe it doesn't do anything.


No, once it's enabled it stays enabled through restarts, etc.


----------



## boot318

Quote:


> Originally Posted by *Agent Smith1984*
> 
> most of us are aware of this actually.... The main problem is that Asus screwed up and put the 780 cooler on it,and only 3of 5 heat pipes even contact the gpu (if I recall correctly)...


My GPU temps are perfect (under 55C most of the time); it's just my VRM temps that I'm worried about. I knew that about the cooler, but I didn't care since I was never going to use it.

Quote:


> Originally Posted by *Mega Man*
> 
> The size of the fan means little; static pressure is more important, though actual pressure and rated pressure are two different things.


I was using a Corsair SP120. Now I'm just using the 120mm Noctua fan that comes with the D14. There really is no difference in temps, but the noise level is more to my liking.

For me to get 90C on VRM1 I'm running games at a framerate (200+) that I'll never actually play at. I usually cap my games at 90 unless it's an FPS, but I wanted to see the maximum temp.




Is the thermal pad on the VRM worth changing? I have some pads that are thicker than the stock one, but I don't know if that is a good or bad thing. lol
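On the thicker-pad question: for conduction through a pad, thermal resistance is thickness divided by (conductivity x area), so a thicker pad is worse at the same conductivity, but a high-conductivity pad can be thicker and still win. A rough sketch with made-up but plausible dimensions (not the actual DC2 pad specs):

```python
# Conductive thermal resistance of a pad: R = t / (k * A), in K/W.
# All dimensions below are hypothetical, not the actual DC2 pad spec.

def pad_resistance(thickness_mm, k_w_per_mk, area_mm2):
    """Thermal resistance of a pad from thickness, conductivity, and area."""
    t = thickness_mm / 1000.0        # convert mm -> m
    a = area_mm2 / 1e6               # convert mm^2 -> m^2
    return t / (k_w_per_mk * a)

AREA = 200.0  # mm^2, e.g. a 20 x 10 mm VRM strip (assumed)

stock = pad_resistance(1.0, 2.0, AREA)    # 1.0 mm stock-grade pad (~2 W/mK)
thick = pad_resistance(1.5, 2.0, AREA)    # thicker pad, same material: worse
fuji  = pad_resistance(1.5, 17.0, AREA)   # thicker but 17 W/mK: far better
print(stock, thick, fuji)
```

The caveat from this thread still applies: if a thin pad doesn't actually make contact with the VRMs, its effective resistance is far worse than any of these numbers, so a thicker pad that makes contact can still be the right call.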


----------



## joeh4384

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Most of us are aware of this, actually.... The main problem is that Asus screwed up and put the 780 cooler on it, and only 3 of 5 heat pipes even contact the GPU (if I recall correctly)...


I had an Asus 780 Ti for a while and the VRM cooling was terrible on that too.


----------



## Bertovzki

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Most of us are aware of this, actually.... The main problem is that Asus screwed up and put the 780 cooler on it, and only 3 of 5 heat pipes even contact the GPU (if I recall correctly)...


That's like Ferrari saying, "Sorry, we put 10-speed tyres on it, we stuffed up." How do you make a mistake that dumb? They know what card they are designing.


----------



## LolCakeLazors

Glad I have a Lightning so that I don't really have heat problems... although it could be better if I had more fans in my case (still waiting for the paycheck).

Still wondering when they're going to restock the Lightning. It's been out of stock for a while.


----------



## Buehlar

Quote:


> Originally Posted by *boot318*
> 
> My GPU temps are perfect (under 55C most of the time); it's just my VRM temps that I'm worried about. I knew that about the cooler, but I didn't care since I was never going to use it.
> 
> I was using a Corsair SP120. Now I'm just using the 120mm Noctua fan that comes with the D14. There really is no difference in temps, but the noise level is more to my liking.
> 
> For me to get 90C on VRM1 I'm running games at a framerate (200+) that I'll never actually play at. I usually cap my games at 90 unless it's an FPS, but I wanted to see the maximum temp.
> 
> *Is the thermal pad on the VRM worth changing?* I have some pads that are thicker than the stock one, but I don't know if that is a good or bad thing. lol


Absolutely! Those are crap... replacing them with anything else will probably help, but go with Fujipoly if you want the best results. Also put it under an FC waterblock if the noise bothers you... trust me, you'll be glad you did.

My VRM temps rarely reach 45C with Fuji pads and FC blocks.


----------



## LandonAaron

Quote:


> Originally Posted by *Buehlar*
> 
> Absolutely! Those are crap... replacing them with anything else will probably help, but go with Fujipoly if you want the best results. Also put it under an FC waterblock if the noise bothers you... trust me, you'll be glad you did.
> 
> My VRM temps rarely reach 45C with Fuji pads and FC blocks.


Where did you buy your fujipads from and which model specifically did you get?


----------



## Buehlar

Quote:


> Originally Posted by *LandonAaron*
> 
> Where did you buy your fujipads from and which model specifically did you get?


I got them from FrozenCPU, but they have since shut down. You can find them on ebay sometimes.

Scroll down to my post linked below from my build log for the sizes I needed for the 290X DC2. Note that these are the sizes needed for the EK waterblocks; if you're just replacing them on the stock cooler you might need a different size.

http://www.overclock.net/t/1478110/build-mod-log-mid-lif-cry-sys-dimastech-easy-xl-the-ultamate-wet-bench/240#post_23058927


----------



## taem

Quote:


> Originally Posted by *LandonAaron*
> 
> Where did you buy your fujipads from and which model specifically did you get?


Performance PCs is now selling Sarcon GR-m High Extreme pads, which are the same thing. I have a thread about this in the water cooling forum ("where can I get fujipoly ultra extreme pads").

I just put them on and the results weren't that impressive for me. I was getting good temps to begin with, though: 55-60C max in the most demanding scenarios. Temps only dropped a few degrees with the 17 W/mK pads for me.

It's cheap though, a 100x15 strip is $15 and can cover 2-3 cards.

Edit: oh, and for the thickness and size you need, Roboyto has an excellent thread on this: http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures/0_50

It tells you everything you need to know about using these pads on 290X cards.


----------



## fyzzz

I'm planning to watercool my Asus R9 290 DC2, but the DC2 waterblock is hard to find, so I'm wondering if the reference block will fit. EK's website doesn't say whether it fits, but I'm curious anyway, since the reference block is much easier to find.


----------



## taem

Quote:


> Originally Posted by *fyzzz*
> 
> I'm planning to watercool my Asus R9 290 DC2, but the DC2 waterblock is hard to find, so I'm wondering if the reference block will fit. EK's website doesn't say whether it fits, but I'm curious anyway, since the reference block is much easier to find.


No. Reference blocks will not fit that card.


----------



## markoj901

Sapphire Dual-X R9 280 overclock stable


----------



## tsm106

Quote:


> Originally Posted by *markoj901*
> 
> 
> 
> Sapphire Dual-X R9 280 overclock stable


Your first post here, with that? You're off to a great start here, might I add, lol.


----------



## boot318

Quote:


> Originally Posted by *markoj901*
> 
> 
> 
> Sapphire Dual-X R9 280 overclock stable


Wrong thread, and I highly doubt that is stable.


----------



## fyzzz

Quote:


> Originally Posted by *taem*
> 
> No. Reference blocks will not fit that card.


Ok, thanks for the info. I was just wondering, and I think I know where I can get the DC2 block.


----------



## boot318

Quote:


> Originally Posted by *Buehlar*
> 
> Absolutely! Those are crap... replacing them with anything else will probably help, but go with Fujipoly if you want the best results. Also put it under an FC waterblock if the noise bothers you... trust me, you'll be glad you did.
> 
> My VRM temps rarely reach 45C with Fuji pads and FC blocks.


After taking the VRM heatsink off, I found that my thermal pad barely made contact with the VRMs. No wonder temps were so bad. I put on the thicker ones I had lying around and temps dropped from 90C to 77C. Thanks for the info. +rep

Quote:


> Originally Posted by *Buehlar*
> 
> I got them from FrozenCPU, but they have since shut down. You can find them on ebay sometimes.
> 
> Scroll down to my post linked below from my build log for the sizes I needed for the 290X DC2. Note that these are the sizes needed for the EK waterblocks; if you're just replacing them on the stock cooler you might need a different size.
> 
> http://www.overclock.net/t/1478110/build-mod-log-mid-lif-cry-sys-dimastech-easy-xl-the-ultamate-wet-bench/240#post_23058927


Quote:


> Originally Posted by *taem*
> 
> Performance PCs is now selling Sarcon GR-m High Extreme pads, which are the same thing. I have a thread about this in the water cooling forum ("where can I get fujipoly ultra extreme pads").
> 
> I just put them on and the results weren't that impressive for me. I was getting good temps to begin with, though: 55-60C max in the most demanding scenarios. Temps only dropped a few degrees with the 17 W/mK pads for me.
> 
> It's cheap though, a 100x15 strip is $15 and can cover 2-3 cards.
> 
> Edit: oh, and for the thickness and size you need, Roboyto has an excellent thread on this: http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures/0_50
> 
> It tells you everything you need to know about using these pads on 290X cards.


+rep


----------



## taem

Quote:


> Originally Posted by *fyzzz*
> 
> Ok, thanks for the info. I was just wondering, and I think I know where I can get the DC2 block.


You can order direct from EK. I just took a quick look; they have the short nickel one (i.e., it doesn't cover the whole PCB) in stock.

http://www.ekwb.com/shop/ek-fc-r9-290x-dcii-nickel.html

Personally I prefer shorter blocks like this one that leave the ends of the PCB exposed, and this is the block I would get if getting an EK block.


----------



## DividebyZERO

So I was trying to test whether AMD 290X triple-4K Eyefinity could compare to Nvidia triple-4K Surround. I hadn't seen much of anything to compare against until I saw the Titan X review thread. Apparently TPU did a 4K-surround benchmark; simple as it may be, it's something I can try to compare with. I've only run one test so far and it has me wondering what to think. The Titan X is obviously a fast card and meant to be. However, a GTX 980 is somewhere around a 290X? This isn't an apples-to-apples comparison, but it's something tangible.

Source for TPU's bench

CPU: Intel Core i7 5820K processor w/Corsair H110 cooler
GIGABYTE X99 Gaming G1 WiFi
16GB Corsair Vengeance 2666MHz DDR4
Windows 7 Ultimate x64



My setup
EVGA SR2 - PCIE 2.0
R9 290x Reference x2 CF
AMD 15.3 beta drivers
Intel x5650 xeon OC 4.0/4.4
24GB DDR3, @1600mhz
Win7 Pro 64bit



My setup is considerably older, but the resolution should make the test more GPU-dependent. Even so, there should be a small percentage increase from PCIe 3.0, and possibly from a newer, faster CPU/RAM/platform. I could run the CPUs at stock, but I doubt it would make much difference at that resolution. I don't think I missed anything, but correct me if I did. Anyway, enjoy it for what it is.
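Since the two test beds differ, one way to keep the comparison honest is to reduce each result to a percentage gap rather than raw FPS. A trivial helper (the numbers below are placeholders, not the actual TPU or SR-2 results):

```python
# Express one benchmark result relative to another as a percent difference.
# Example numbers are placeholders, not actual TPU / SR-2 scores.

def pct_faster(a_fps, b_fps):
    """How much faster result a is than result b, in percent."""
    return (a_fps - b_fps) / b_fps * 100.0

print(pct_faster(48.0, 40.0))   # a is 20% faster than b
```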


----------



## DividebyZERO

Re my post above: I just realized they don't specify how they tested. I assumed they used the in-game benchmark tool, but they could have simply done in-game testing instead; who knows.

EDIT: they do have the benchmark running in the surround picture of the monitors, though.


----------



## Buehlar

Quote:


> Originally Posted by *fyzzz*
> 
> Ok thanks for the info. I was just wondering and i think i know where i can get the dc2 block


Quote:


> Originally Posted by *taem*
> 
> You can order direct from EK. I just took a quick look, they have the short (ie doesn't cover the whole pcb) nickel in stock.
> 
> http://www.ekwb.com/shop/ek-fc-r9-290x-dcii-nickel.html
> 
> Personally I prefer the shorter blocks like this one that leaves the ends of the pcb exposed and this is the block I would get if getting an EK block.


Performance PCs has the Acetal FC blocks in stock.
It covers the full length of the PCB

http://www.performance-pcs.com/ek-fc-r9-290x-dcii-acetal-nickel.html


----------



## Nevk

http://www.chiphell.com/thread-1263323-1-1.html

"Customer R9 290X from Chiphell"


----------



## fyzzz

Quote:


> Originally Posted by *Buehlar*
> 
> Performance PCs has the Acetal FC blocks in stock.
> It covers the full length of the PCB
> 
> http://www.performance-pcs.com/ek-fc-r9-290x-dcii-acetal-nickel.html


Thanks for the info. The problem is where I live (the Åland Islands): EK's website says they will ship to the Åland Islands, but when you try to order something it just says "Sorry, no quotes are available for this order at this time." (Sorry for the bad English; it is not my main language.)


----------



## skline00

Any chance the EK 290 waterblocks will work on the upcoming 390?


----------



## rt123

Quote:


> Originally Posted by *skline00*
> 
> Any chance the EK 290 waterblocks will work on the upcoming 390?


Will let you know as soon as we get in our 390s.


----------



## tsm106

Quote:


> Originally Posted by *rt123*
> 
> Quote:
> 
> 
> 
> Originally Posted by *skline00*
> 
> Any chance the EK 290 waterblocks will work on the upcoming 390?
> 
> 
> 
> Will let you know as soon as we get in our 390s.
Click to expand...

Keep dreaming, it's a new generation. The 390X doesn't even use GDDR memory, you know... And I'm sure there's a good possibility it will be harder to make blocks for, thus upping the price.


----------



## rt123

Quote:


> Originally Posted by *tsm106*
> 
> Keep dreaming, it's a new generation. The 390X doesn't even use GDDR memory, you know... And I'm sure there's a good possibility it will be harder to make blocks for, thus upping the price.


Oh I know.
My post was meant to be sarcastic.


----------



## tsm106

You need over-the-top sarcasm for it to come across without icons, like here.









http://www.overclock.net/t/1548861/toms-amd-details-asynchronous-shaders-in-directx-12-promises-performance-gains/0_40#post_23738027


----------



## DividebyZERO

No sarcasm allowed.


----------



## thrgk

What drivers do you guys think are best for the 290X? I've got 2 Hynix and 2 Elpida cards on the 15.3 beta, but when I do +65 on core and aux voltage and put the core to 1100 and mem to 1325, I get stalls in BF4.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *thrgk*
> 
> What drivers do you guys think are best for the 290X? I've got 2 Hynix and 2 Elpida cards on the 15.3 beta, but when I do +65 on core and aux voltage and put the core to 1100 and mem to 1325, I get stalls in BF4.


Add more volts. The 15.3s are awesome, but if you're on just a single 290X then 14.12 is fine too.


----------



## thrgk

Should a 290X need +100 on core and aux voltage to get 1100/1325? Seems high.


----------



## tsm106

Quote:


> Originally Posted by *thrgk*
> 
> What drivers do you guys think are best for the 290X? I've got 2 Hynix and 2 Elpida cards on the 15.3 beta, but when I do +65 on core and aux voltage and put the core to 1100 and mem to 1325, *I get stalls in BF4*


Are your GPUs actually stable at those clocks? Is your CPU? And finally, are your CPU and GPU stable together at those clocks in BF4? The fact that it crashes/hangs means they are not. BF4 is tougher than any bench I've used in regards to the CPU's IMC and the PCIe controller. I can pass the Prime95 blend test all day long with lowered VTT and VCCSA, but 5 minutes in BF4 and it hangs, roflmao. I can bench and throw down quad Firestrike Ultra runs at 1300/1650MHz with low VTT/VCCSA, but 5 mins of BF4 = crash. A lot of crashes are not what people may think they are, IMO. Just saying...


----------



## LolCakeLazors

So I was asking for water cooling advice on HardForum and someone said that Nickel + EK is a bad idea. Has this problem been resolved? I don't want to buy a Lightning 290X block from EK and find out it's corroding.


----------



## rdr09

Quote:


> Originally Posted by *LolCakeLazors*
> 
> So I was asking for water cooling advice on HardForum and someone said that Nickel + EK is a bad idea. Has this problem been resolved? I don't want to buy a Lightning 290X block from EK and find out it's corroding.


don't worry about it. i have both Ni and Cu in my system without an issue. Go for it.

anyway, some Titanfall 4K . . .


Spoiler: Warning: Spoiler!


----------



## tsm106

Quote:


> Originally Posted by *LolCakeLazors*
> 
> So I was asking for water cooling advice on HardForum and someone said that Nickel + EK is a bad idea. Has this problem been resolved? I don't want to buy a Lightning 290X block from EK and find out it's corroding.


Corrosion is a fact of life and not related to any one brand. If you're worried about corrosion pay close attention to your liquid and use a good coolant or other corrosion inhibitor.

http://www.overclock.net/t/1159288/koolance-370-corrosion
http://www.overclock.net/t/1287741/corrosion-koolance-drive-bay
http://www.overclock.net/t/1482548/corroded-xspc-water-block
http://www.overclock.net/t/1200658/aquagrafx-gtx-480-what-is


----------



## LolCakeLazors

That's what I thought. Thanks for the reassurance.


----------



## taem

Quote:


> Originally Posted by *tsm106*
> 
> Corrosion is a fact of life and not related to any one brand. If you're worried about corrosion pay close attention to your liquid and use a good coolant or other corrosion inhibitor.


I don't know man. I just put together my first water cooled build, and in preparation I did a metric ton of reading. And when it comes to nickel related problems, some brands keep coming up. Some brands never do.


----------



## chiknnwatrmln

Copper and nickel will corrode over time in a loop; it's just the way it is. Some blocks are worse than others, but the most important thing is proper coolant and maintenance.

I use DI water and wash out my loop every six months with clean water.


----------



## tsm106

Quote:


> Originally Posted by *taem*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Corrosion is a fact of life and not related to any one brand. If you're worried about corrosion pay close attention to your liquid and use a good coolant or other corrosion inhibitor.
> 
> 
> 
> I don't know man. I just put together my first water cooled build, and in preparation I did a metric ton of reading. And when it comes to nickel related problems, some brands keep coming up. Some brands never do.
Click to expand...

Some brands also sell at a 100:1 ratio, and with that many more sold, you get that many more noobs crying about it.


----------



## taem

Quote:


> Originally Posted by *tsm106*
> 
> Some brands also sell at a 100:1 ratio, and with that many more sold, you get that many more noobs crying about it.


Sure. It's why I didn't name any names. I'm not going to pretend anyone really knows for sure what's going on here, and I don't want to malign anyone unfairly.

But I think you're dismissing the point too blithely. The brand that has the most complaints about nickel plating issues in recent years is *not* the sales leader. The sales leader, the one that probably sells as much as the rest of the brands put together, comes in second. And some of the brands that never get complained about are not obscure boutique firms; while they might not sell as many units, they are well established, broadly supported brands which perhaps not coincidentally happen to enjoy reputations for having the best build quality in the industry. I contacted one of these firms about mixing copper, nickel and silver -- I was told, mix whatever you want other than ALU, use whatever coolant you want, our nickel will not flake. The brands that get complained about, will not say that to you. For whatever that's worth.

Also, the internet being what it is, you don't have to sell a lot of units to generate complaint threads. Name a product, google it, and you will find someone started a thread somewhere about how product X leaked/flaked/chipped/whatever. I used fittings that come from a small boutique outfit that gets outsold by more than 1000 to 1; there are complaint threads about them.

But ultimately here was my take on the issue. One, whatever is going on with mixing nickel and copper, you can just avoid the issue altogether by going with copper. You can't avoid copper, because any rad you buy will most likely be copper. But you don't have to buy nickel. Not that copper is without its own issues. But you can cross this particular issue off your list. That was the view espoused by martinm in one of the threads, and he's the complete opposite of a "noob."

Two, unless you're putting together a thematic build and need a particular block, or found a stellar bargain, you don't have to buy a block from a brand that has a lot of internet complaint threads about it. You can just go with a brand that doesn't get any complaints. It's no guarantee, and it's probably an unnecessary precaution. But it can't hurt, and it might help.


----------



## aaroc

Each R9 290X needs two PCIe power connectors (8+6). Is it safe to use the cables from the PSU that come with one 8-pin and one 6-pin PCIe connector each?
I will connect 3 R9 290Xs to one Corsair AX1200i, and 1 R9 290X plus the FX 9590, Giga UD7, pump, SSDs, and fans to another AX1200i. Using a single cable for each PCIe connector would take a hell of a lot of space. I'm almost finished getting the bubbles out of the GPU waterblocks, and it's been almost 4 days without a leak :-D

Please see F2004 on my sig.
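On the connector question above, a useful sanity check is the nominal PCIe power budget: 75 W from the slot, 75 W from a 6-pin, and 150 W from an 8-pin, so a card on 8+6 can draw up to roughly 300 W within spec. A quick budget sketch for the three-card AX1200i (spec maximums, not measured draw; whether one PSU cable can safely feed both plugs depends on that cable's rating, so check Corsair's own documentation):

```python
# Nominal PCIe power budget sanity check (spec maximums, not measured draw).
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

per_card_max = SLOT_W + SIX_PIN_W + EIGHT_PIN_W   # 300 W within spec per card
cards = 3
psu_w = 1200                                       # Corsair AX1200i rating

gpu_budget = cards * per_card_max                  # 900 W worst case for the GPUs
headroom = psu_w - gpu_budget                      # what's left for everything else
print(per_card_max, gpu_budget, headroom)
```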


----------



## tsm106

Quote:


> Originally Posted by *taem*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Some brands also sell at a 100:1 ratio, and with that many more sold, you get that many more noobs crying about it.
> 
> 
> 
> Sure. It's why I didn't name any names. I'm not going to pretend anyone really knows for sure what's going on here, and I don't want to malign anyone unfairly.
> 
> But I think you're dismissing the point too blithely. The brand that has the most complaints about nickel plating issues in recent years is *not* the sales leader. The sales leader, the one that probably sells as much as the rest of the brands put together, comes in second. And some of the brands that never get complained about are not obscure boutique firms; while they might not sell as many units, they are well established, broadly supported brands which perhaps not coincidentally happen to enjoy reputations for having the best build quality in the industry. I contacted one of these firms about mixing copper, nickel and silver -- I was told, mix whatever you want other than ALU, use whatever coolant you want, our nickel will not flake. The brands that get complained about, will not say that to you. For whatever that's worth.
> 
> Also, the internet being what it is, you don't have to sell a lot of units to generate complaint threads. Name a product, google it, you will find someone started a thread somewhere about how product X leaker/flaked/chipped/whatever. I used fittings that do come from a small boutique outfit that get outsold by more than 1000 to 1; there are complaint threads about them.
> 
> But ultimately here was my take on the issue. One, whatever is going on with mixing nickel and copper, you can just avoid the issue altogether by going with copper. You can't avoid copper, because any rad you buy will most likely be copper. But you don't have to buy nickel. Not that copper is without its own issues. But you can cross this particular issue off your list. That was the view espoused by martinm in one of the threads, and he's the complete opposite of a "noob."
> 
> Two, unless you're putting together a thematic build and need a particular block, or found a stellar bargain, you don't have to buy a block from a brand that has a lot of internet complaint threads about it. You can just go with a brand that doesn't get any complaints. It's no guarantee, and it's probably an unnecessary precaution. But it can't hurt, and it might help.
Click to expand...

?????

Everything corrodes! Sorry, but I don't have to be sorry, because I don't give a crap if it's silver, nickel, copper, whatever: it's gonna corrode. I've used all brands and they all corrode, because we're dealing with science here. Pssh, you know...

COPPER CORRODES TOO! Omg, look, an Aquagrafx block in pure copper, and it's corroded! Did I ever make a thread about it? No. Why? Because I've been doing this long enough to not blow a gasket over it.



Totally blowing a gasket below...

http://www.overclock.net/t/1299223/two-ek-fc580-acetal-nickel-sold


----------



## Buehlar

Quote:


> Originally Posted by *tsm106*
> 
> ?????
> 
> Everything corrodes! Sorry but I don't have to be sorry because I don't give a crap if its silver, nickel, copper, whatever, it's gonna corrode. I've used all brands and they all corrode because we're dealing with science here. Pssh, you know...
> 
> COPPER CORRODES TOO! Omg, look an Aquagrafx block in pure copper, and it's corroded! Did I ever make a thread about it? No. Why? *Because I've been doing this long enough to not blow a gasket over it.*
> 
> 
> 
> Totally blowing a gasket below...
> 
> http://www.overclock.net/t/1299223/two-ek-fc580-acetal-nickel-sold


Then why are you blowing your gasket now?


----------



## tsm106

Quote:


> Originally Posted by *Buehlar*
> 
> Then why are you blowing your gasket now?


You think I'm pissed off?


----------



## Buehlar

Quote:


> Originally Posted by *tsm106*
> 
> You think I'm pissed off?


No, I think you're very knowledgeable and give good advice; for the most part you've been pretty helpful. I've probably read every post you've made in this thread over the last ~6 months.

However, I honestly think you just enjoy being a smart___ to anyone who doesn't agree with you.


----------



## DividebyZERO

Quote:


> Originally Posted by *tsm106*
> 
> You think I'm pissed off?


I blame your avatar. It promotes the impression that you are a video card abuser when you blow a gasket. /joke troll (yes, I know it's not yours)

Seriously though, that's one nasty corroded copper block... I probably shouldn't admit this, but I've been using distilled water and a small amount of auto radiator antifreeze, and the only thing I noticed was some plasticizer in my reservoir and some black paint chips on my first fill-up. However, I have flushed it 4 times since I started with my brand new R9 290s about a year ago... can't really complain when I check my GPU blocks; they're clean as a whistle...


----------



## taem

Quote:


> Originally Posted by *tsm106*
> 
> ?????
> 
> Everything corrodes! Sorry but I don't have to be sorry because I don't give a crap if its silver, nickel, copper, whatever, it's gonna corrode. I've used all brands and they all corrode because we're dealing with science here. Pssh, you know...
> 
> COPPER CORRODES TOO! Omg, look an Aquagrafx block in pure copper, and it's corroded! Did I ever make a thread about it? No. Why? Because I've been doing this long enough to not blow a gasket over it.
> 
> 
> 
> Totally blowing a gasket below...
> 
> http://www.overclock.net/t/1299223/two-ek-fc580-acetal-nickel-sold


? It's an odd reaction to what I posted. Yeah corrosion happens, but there is a question of degree and rapidity, and whether some brands are better than others. And the issue was whether EK nickel plating should be avoided. Flaking of poor nickel plating is its own sub issue with its own considerations.

Although I should note, since the card in question is a Lightning, EK nickel plating can't be avoided lol. Afaik that's the only block for this card. That being the case I wouldn't hesitate to buy the block. I might avoid silver coils though. There is quite a bit of chatter about silver dramatically exacerbating nickel plate flaking.


----------



## Mega Man

Quote:


> Originally Posted by *LolCakeLazors*
> 
> So I was asking for water cooling advice on HardForum and someone said that Nickel + EK is a bad idea. Has this problem been resolved? I don't want to buy a Lightning 290X block from EK and find out it's corroding.


A long time ago they had flaking problems; people just hold onto grudges. Also, nickel and silver don't get along well.


----------



## kizwan

Quote:


> Originally Posted by *LolCakeLazors*
> 
> So I was asking for water cooling advice on HardForum and someone said that Nickel + EK is a bad idea. Has this problem been resolved? I don't want to buy a Lightning 290X block from EK and find out it's corroding.


The idea of nickel plating is to protect the copper from corrosion; it's just a thin layer of nickel. If the plating wasn't done properly it can flake, and the only issue when it flakes is that the copper is now exposed. It won't affect performance. Sure, aesthetically the inside of the block will look ugly, but it doesn't make sense to avoid nickel blocks altogether.


----------



## taem

Quote:


> Originally Posted by *kizwan*
> 
> The idea of nickel plating is for protecting copper from corrosion. It's just a thin layer of nickel. If the plating was done not properly, it can cause flaking & the only issue when it's flaking is the copper now expose. It won't affect performance. Sure aesthetically, internally the block will look ugly. It doesn't make sense to avoid nickel blocks altogether.


Are you sure that nickel plating flaking to the point where copper becomes exposed is not a problem? Because in my reading I came across numerous threads discussing how this creates (forget the term) a galvanic cell, something that's not good. I don't understand why copper that was nickel plated but lost some of its plating would cause an issue where bare copper doesn't, but I read this a lot.

As to it not making sense to avoid nickel altogether, why not? Again, if there is an aesthetic consideration, or price consideration, I wouldn't hesitate to get nickel. But everything else being equal, why not avoid mixing nickel and copper? Because there's this
Quote:


> Originally Posted by *Mega Man*
> 
> also nickel and silver dont get along well


If you avoid nickel in the first place, you also have the option of kill coils rather than using liquid biocide drops.

But again, if you have a custom-PCB GPU, then an EK nickel plated block is likely your only option. Other than that, though: your rads will be copper, and you can get copper CPU and GPU blocks. This was martinm's view, that he personally goes all copper for his blocks just to not deal with any potential nickel + copper issues that may or may not exist, and that totally makes sense to me. But then, I prefer the look of copper.


----------



## Buehlar

The demand is simply not as great for non-reference GPU waterblocks, so your options may be limited on the available finish. Something to keep in mind when purchasing hardware.


----------



## kizwan

Quote:


> Originally Posted by *taem*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> The idea of nickel plating is for protecting copper from corrosion. It's just a thin layer of nickel. If the plating was done not properly, it can cause flaking & the only issue when it's flaking is the copper now expose. It won't affect performance. Sure aesthetically, internally the block will look ugly. It doesn't make sense to avoid nickel blocks altogether.
> 
> 
> 
> Are you sure that nickel plating flaking to the point where copper becomes exposed is not a problem? Because in my reading I came across numerous threads discussing how this creates, forget the term, a galvanic cell? something that's not good. I don't understand why copper that was nickel plated but lost some nickel plating would cause an issue where bare copper doesn't, but I read this a lot.
> 
> As to it not making sense to avoid nickel altogether, why not? Again, if there is an aesthetic consideration, or price consideration, I wouldn't hesitate to get nickel. But everything else being equal, why not avoid mixing nickel and copper? Because there's this
Click to expand...

Galvanic corrosion can happen if you're mixing metals in your loop, like copper & aluminium. There will be no problem with nickel & copper. There is a table which shows which metals, when mixed, can cause galvanic corrosion; if I can find that table again, I'll share it here. A nickel block isn't solid nickel: it's a copper block plated with nickel, and the nickel layer is there to prevent corrosion.
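As a stand-in until the real table turns up, the idea behind it can be sketched as a toy lookup. The potential values below are rough galvanic-series figures (volts, seawater conditions) assembled purely for illustration; treat every number, and the metal list itself, as an assumption. The larger the gap between two metals, the stronger the corrosion driver when they share an electrolyte:

```python
# Toy galvanic-series lookup. Values are approximate and for
# illustration only, NOT measured data for a PC loop.
POTENTIAL_V = {
    "silver":    -0.15,
    "nickel":    -0.20,
    "copper":    -0.35,
    "brass":     -0.40,
    "aluminium": -0.90,
}

def galvanic_gap(metal_a, metal_b):
    """Absolute potential difference between two metals, in volts."""
    return round(abs(POTENTIAL_V[metal_a] - POTENTIAL_V[metal_b]), 2)

# Copper + nickel is a small gap; copper + aluminium is a big one,
# which matches the usual advice to keep aluminium out of the loop.
print(galvanic_gap("copper", "nickel"))     # 0.15
print(galvanic_gap("copper", "aluminium"))  # 0.55
```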


----------



## tsm106

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *taem*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> The idea of nickel plating is for protecting copper from corrosion. It's just a thin layer of nickel. If the plating was done not properly, it can cause flaking & the only issue when it's flaking is the copper now expose. It won't affect performance. Sure aesthetically, internally the block will look ugly. It doesn't make sense to avoid nickel blocks altogether.
> 
> 
> 
> Are you sure that nickel plating flaking to the point where copper becomes exposed is not a problem? Because in my reading I came across numerous threads discussing how this creates, forget the term, a galvanic cell? something that's not good. I don't understand why copper that was nickel plated but lost some nickel plating would cause an issue where bare copper doesn't, but I read this a lot.
> 
> As to it not making sense to avoid nickel altogether, why not? Again, if there is an aesthetic consideration, or price consideration, I wouldn't hesitate to get nickel. But everything else being equal, why not avoid mixing nickel and copper? Because there's this
> 
> Click to expand...
> 
> Galvanic corrosion can happen if you're mixing metal in your loop like copper & aluminium. There will be no problem with nickel & copper. There is a table which show which metal when mixed can cause galvanic corrosion. If I can find that table again, I'll share it here. The nickel block is not the whole block is nickel. It's copper block plated with nickel & the nickel layer is to prevent corrosion.
Click to expand...

Nah, it can happen, just like it did with the EK nickel fiasco. It's been a long, long time since the massive uber EK fiasco thread, so I am regurgitating this from memory. The EK issue was somewhat of a comedy of errors. In a nutshell, EK had continuing QA issues with cleaning/prepping blocks before plating (or at least this was what they admitted to, iirc). This left the nickel plating porous. At the same time, a silver killcoil plus distilled had somehow become the number one recommended fluid combo. The porosity combined with the silver ions led to galvanic corrosion. Three things were needed: two dissimilar metals, direct contact, and the charged water to complete the loop; voila, galvanic corrosion. The porous plating meant that the nickel and copper were in contact with each other and with the charged liquid. The nickel wasn't actually flaking; it was the copper underneath the porous nickel that was corroding away, and then the nickel had nothing left to adhere to. If the nickel plating had been uniform, the copper would never have been exposed, and then there would have been no cell and no corrosion, in an ideal world. I think EK is still scratching their heads, because they moved to electroless nickel plating instead of fixing the supposed QA problem. However, that wasn't the end, because interestingly their blocks still had corrosion issues. You have some "EN" blocks corroding and some not corroding. It can't be both ways, can it? I wonder...

People blamed the silver, they blamed PT Nuke, they blamed their moms and dogs and cats, lol. Who knows? What I do know is that nickel can and will corrode if not protected with a corrosion inhibitor, which a killcoil does not provide; in fact, the way a killcoil works, by flooding your water with silver ions, may well exacerbate the issue. Some people still swear by a killcoil and distilled. Nowadays I just use a coolant mix. The minute difference in temps is fractional compared to what you gain in protection.
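The "three things needed" part of the story can be restated as a tiny predicate. A sketch only, with illustrative names; it encodes the rule, not any chemistry:

```python
# Galvanic corrosion needs: dissimilar metals, electrical contact,
# and an electrolyte. All three at once, or no cell forms.

def galvanic_risk(metals, in_contact, has_electrolyte):
    """True only when all three conditions for a galvanic cell are met."""
    dissimilar = len(set(metals)) > 1
    return dissimilar and in_contact and has_electrolyte

# Porous plating: copper under the nickel touches the coolant too.
print(galvanic_risk({"nickel", "copper"}, True, True))   # True
# Uniform plating: only one metal sees the liquid, so no cell.
print(galvanic_risk({"nickel"}, True, True))             # False
```

Which is why the porosity mattered so much in the EK story: uniform plating removes the "dissimilar metals in the liquid" condition entirely.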


----------



## LandonAaron

Quote:


> Originally Posted by *LolCakeLazors*
> 
> So I was asking for water cooling advice on HardForum and someone said that Nickel + EK is a bad idea. Has this problem been resolved? I don't want to buy a Lightning 290X block from EK and find out it's corroding.


I bought one used nickel EK 290 block off of eBay and one used copper EK 290 block off of eBay. The copper one looks brand new and the nickel one is fairly corroded. I have no idea how long either one was in use, or what else was in the loops with those blocks, so take from that what you will.


----------



## LolCakeLazors

Quote:


> Originally Posted by *LandonAaron*
> 
> I bought one used nickel ek 290 block off of eBay and one used copper ek 290 block off of ebay. The copper one looks brand new and the nickel one is fairly corroded. I have no idea how long either one was in use for or what else was in the loops with those blocks so take from that what you will.


I mean I obviously can't avoid the nickel in the 290X Lightning block since only EK makes the block with nickel. Looks like I'm using copper in my Supremacy EVO though.


----------



## tsm106

Quote:


> Originally Posted by *LolCakeLazors*
> 
> Quote:
> 
> 
> 
> Originally Posted by *LandonAaron*
> 
> I bought one used nickel ek 290 block off of eBay and one used copper ek 290 block off of ebay. The copper one looks brand new and the nickel one is fairly corroded. I have no idea how long either one was in use for or what else was in the loops with those blocks so take from that what you will.
> 
> 
> 
> I mean I obviously can't avoid the nickel in the 290X Lightning block since only EK makes the block with nickel. Looks like I'm using copper in my Supremacy EVO though.
Click to expand...

Nah, go nickel. It's more better (lol, grammar police) for using CLU. Copper gets stained by CLU over time; nickel is much more resistant to it. Also, brass tacks time: nickel plating is for looks on the outside of the block. These blocks are not plated to fight internal corrosion, so really, get that idea out of your minds. They are plated for the chrome bling on the outside. I would use CLU on everything if I could...


----------



## LandonAaron

Quote:


> Originally Posted by *taem*
> 
> ? It's an odd reaction to what I posted. Yeah corrosion happens, but there is a question of degree and rapidity, and whether some brands are better than others. And the issue was whether EK nickel plating should be avoided. Flaking of poor nickel plating is its own sub issue with its own considerations.
> 
> Although I should note, since the card in question is a Lightning, EK nickel plating can't be avoided lol. Afaik that's the only block for this card. That being the case I wouldn't hesitate to buy the block. I might avoid silver coils though. There is quite a bit of chatter about silver dramatically exacerbating nickel plate flaking.


So that's what was going on with my kill coil. I had been running with a kill coil for about 6 months. Then I added the nickel block, and a week later the coil had become completely covered in a thin layer of silky black stuff. I just removed it without looking into it too much. Good to know though.


----------



## LandonAaron

Yeah, I just bought the Supremacy EVO nickel block. It's too BA looking for me to care about some possible corrosion issue. Also, my corroded nickel block cools my card just fine, so...


----------



## tsm106

Quote:


> Originally Posted by *LandonAaron*
> 
> Yeah I just bought the supremacy Evo nice block. It's too BA looking for me to care about some possible corrosion issue. *Also my corroded nickle blcok cools my card just fine so*...


True that. It's not a performance issue, it's aesthetics.


----------



## LandonAaron

Tsm, what coolant mix do you recommend? I just got into custom loops 6 months ago and just added nickel to the mix about 2 weeks ago. Everything I read suggested using distilled water, and the price makes it an attractive option, especially if you like to make changes often. But considering I am now going to be rocking two nickel blocks, I would like to protect them the best I can. You mentioned a difference in temps too; what are we talking, in terms of CPU degrees under load?


----------



## Mega Man

Quote:


> Originally Posted by *taem*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> The idea of nickel plating is for protecting copper from corrosion. It's just a thin layer of nickel. If the plating was done not properly, it can cause flaking & the only issue when it's flaking is the copper now expose. It won't affect performance. Sure aesthetically, internally the block will look ugly. It doesn't make sense to avoid nickel blocks altogether.
> 
> 
> 
> Are you sure that nickel plating flaking to the point where copper becomes exposed is not a problem? Because in my reading I came across numerous threads discussing how this creates, forget the term, a galvanic cell? something that's not good. I don't understand why copper that was nickel plated but lost some nickel plating would cause an issue where bare copper doesn't, but I read this a lot.
> 
> As to it not making sense to avoid nickel altogether, why not? Again, if there is an aesthetic consideration, or price consideration, I wouldn't hesitate to get nickel. But everything else being equal, why not avoid mixing nickel and copper? Because there's this
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> also nickel and silver dont get along well
> 
> Click to expand...
> 
> if you avoid the nickel in the first place you also have the option of kill coils rather than using liquid drops.
> 
> But again, if you have a custom pcb gpu then an EK nickel plated block is likely your only option. Other than that though -- your rads will be copper, and you can get copper cpu and gpu blocks. This was martinm's view, that he personally goes all copper for his blocks just to not deal with any potential nickel + copper issues that may or may not exist, and that totally makes sense to me. But then I prefer the look of copper.
Click to expand...

https://martinsliquidlab.wordpress.com/2012/01/24/corrosion-explored/
Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *taem*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> The idea of nickel plating is for protecting copper from corrosion. It's just a thin layer of nickel. If the plating was done not properly, it can cause flaking & the only issue when it's flaking is the copper now expose. It won't affect performance. Sure aesthetically, internally the block will look ugly. It doesn't make sense to avoid nickel blocks altogether.
> 
> 
> 
> Are you sure that nickel plating flaking to the point where copper becomes exposed is not a problem? Because in my reading I came across numerous threads discussing how this creates, forget the term, a galvanic cell? something that's not good. I don't understand why copper that was nickel plated but lost some nickel plating would cause an issue where bare copper doesn't, but I read this a lot.
> 
> As to it not making sense to avoid nickel altogether, why not? Again, if there is an aesthetic consideration, or price consideration, I wouldn't hesitate to get nickel. But everything else being equal, why not avoid mixing nickel and copper? Because there's this
> 
> Click to expand...
> 
> Galvanic corrosion can happen if you're mixing metal in your loop like copper & aluminium. There will be no problem with nickel & copper. There is a table which show which metal when mixed can cause galvanic corrosion. If I can find that table again, I'll share it here. The nickel block is not the whole block is nickel. It's copper block plated with nickel & the nickel layer is to prevent corrosion.
Click to expand...

see the above link

Besides that, nickel does have some galvanic corrosion potential.

That is one of the reasons I prefer Swiftech:

100% chrome, i.e. chromium.


----------



## taem

Quote:


> Originally Posted by *tsm106*
> 
> Nah, it can happen just like it did with the ek nickel fiasco. It's been a long long time since the massive uber ek fiasco thread so I am regurgitating this from memory. The ek issue was somewhat of a comedy of errors. Essentially or in a nutshell ek had continuing QA issues with cleaning blocks/prep before plating (or at least this was what they admitted to iirc). This left the nickel plating porous. At the same time silver killcoil and distilled had somehow become the number one recommended fluid combo. The porosity combined with the silver ions led to galvanic corrosion. There were three things needed, two dissimilar metals in direct contact, and the charged water to make a loop, voila galvanic corrosion. The porous plating meant that nickel and copper were in contact to each other and the charged liquid. The nickel wasn't actually flaking, it was the copper underneath the porous nickel that was corroding away and then the nickel had nothing left to adhere to. If the nickel plating was uniform, the copper would have never been exposed, then there would not have been an arc and no corrosion, in an ideal world. I think ek is still scratching their heads because they moved to electroless nickel plating instead of fixing the supposed QA problem. However that wasn't the end because their blocks still had corrosion issues interestingly. You have some "EN" blocks corroding and some not corroding. It can't be both ways can it? I wonder...
> 
> People blamed the silver, they blamed ptnuke, they blamed their moms and dogs and cats lol. Who knows? What I do know is nickel can and will corrode if not protected with a corrosion inhibitor which a killcoil does not do, in fact the way a killcoil works by flooding your water with neg charged ions may/could exacerbate the issue. Some ppl still swear by a killcoil and distilled. Nowadays I just use a coolant mix. The minute difference in temps is fractional compared to what you gain in protection.


Right. As I understand it, the problem was that insufficient preparation of the copper surface resulted in improper adherence of the nickel plating on those EK blocks. This particular episode is well documented. But from there it gets murkier, because there are discussions where folks say their nickel plating is flaking even though they didn't use silver. I'm still not sure what to make of that. It's chemical fact that silver will rob nickel of electrons, so you can understand the problem there. But does/did EK nickel plating ever flake without silver being used? There are guys out there who say yes. And in recent years there are more complaints about Koolance nickel plating than EK.

But afaik absent nickel plating in your system, you won't get the level of corrosion you get if nickel plating on copper flakes off and creates a galvanic cell where silver ion charged coolant interacts with nickel and copper at the same time. If you're just using copper and silver, I'm not sure a corrosion inhibitor is necessary; plenty of knowledgeable voices out there say it's not. And for sure, there are lots of folks out there that claim PT Nuke and corrosion inhibitors present problems of their own. I read thread after thread about all of them, from Liquid Utopia to Feser Base to inhibitors you get from auto parts stores to straight up pure ethylene glycol -- with every single option, there are guys that say that causes problems in your loop.

I'm not sure what to make of it all. For now I'm just using silver and all copper blocks.

But going back to the original question that launched it all -- by all means get the EK Lightning block. But like I said earlier, and Mega Man also said, don't use silver kill coils with it.

Quote:


> Originally Posted by *Mega Man*
> 
> https://martinsliquidlab.wordpress.com/2012/01/24/corrosion-explored/


I read that during my research. And here's the thing -- I mentioned this earlier, but martinm said in an Xtremesystems thread that he uses only copper blocks and does not mix nickel and copper.

Quote:


> that is one of the reasons i prefer swiftech
> 
> 100% chrome, ie chromium


One of the lessons I've learned from my first build: if/when I upgrade parts, I will go with chrome-plated brass fittings. Not only for the corrosion issue, but because paint flakes off.

Quote:


> Originally Posted by *LandonAaron*
> 
> So that's what was going on with my kill coil. I had been running with a kill coil for about 6 months. Then I added the nickle block and a week later the coil had become completely covered in a thin layer of silky black stuff. I just removed it without looking into it too much. Good to know though.


That's silver oxide. Have you looked inside that nickel block yet?


http://www.overclock.net/t/1223665/koolances-policy-on-nickel-silver-and-flaking-corrosion-whatever-you-call-the-nickel-coming-off/0_100


----------



## LandonAaron

Quote:


> Originally Posted by *taem*
> 
> Right. As I understand it the problem was insufficient preparation of the copper surface resulted in improper adherence of the nickel plating with those EK blocks. This particular episode is well documented. But from there it gets murkier, because there are discussions where folks say their nickel plating is flaking even though they didn't use silver. I'm still not sure what to make of that. It's chemical fact that silver will rob nickel of electrons, so you can understand the problem there. But does/did EK nickel plating ever flake without silver being used? There are guys out there that say yes. And in recent years there are more complaints about Koolance nickel plating than EK.
> 
> But afaik absent nickel plating in your system, you won't get the level of corrosion you get if nickel plating on copper flakes off and creates a galvanic cell where silver ion charged coolant interacts with nickel and copper at the same time. If you're just using copper and silver, I'm not sure a corrosion inhibitor is necessary; plenty of knowledgeable voices out there say it's not. And for sure, there are lots of folks out there that claim PT Nuke and corrosion inhibitors present problems of their own. I read thread after thread about all of them, from Liquid Utopia to Feser Base to inhibitors you get from auto parts stores to straight up pure ethylene glycol -- with every single option, there are guys that say that causes problems in your loop.
> 
> I'm not sure what to make of it all. For now I'm just using silver and all copper blocks.
> 
> But going back to the original question that launched it all -- by all means get the EK Lightning block. But like I said earlier, and Mega Man also said, don't use silver kill coils with it.
> I read that during my research. And here's the thing -- I mentioned this earlier, but martinm said in an Xtremesystems thread that he uses only copper blocks and does not mix nickel and copper.
> One of the lessons I've learned from my first build, if/when I upgrade parts, I will go with chrome plated brass fittings. Not only for the corrosion issue, but because paint flakes off.
> That's silver oxide. Have you looked inside that nickel block yet?
> 
> 
> http://www.overclock.net/t/1223665/koolances-policy-on-nickel-silver-and-flaking-corrosion-whatever-you-call-the-nickel-coming-off/0_100


No, I only ran it with the kill coil for a week before I noticed the issue. It was a used block that was already corroded to start with, so whatever. I doubt anything too bad will come of it, though I think I will switch to using an anti-corrosive coolant now. What's funny is that I asked what could be going on with my kill coil in the EK thread and got no responses.


----------



## kizwan

Quote:


> Originally Posted by *Mega Man*
> 
> see the above link
> 
> besides that nickel does have some galvanic corrosion potential
> 
> that is one of the reasons i prefer swiftech
> 
> 100% chrome, ie chromium


Sure, some galvanic corrosion potential. For sure, with a bad plating process plus certain conditions, things can turn bad. Well, I guess you're right that it's better to go the safe route & avoid nickel altogether. I always use coolant that contains a corrosion inhibitor for good measure, and I don't use a silver killcoil. I used a nickel plated block before & now use a copper block, but I won't completely say no to nickel plated blocks.


----------



## long99x

Hello everyone, I have this problem.

I have two R9 290 DC2s in CrossFire on driver 14.12. Today when I turned on my computer there was no audio from the card to my monitor. I use a DisplayPort cable, and my monitor is an LG 29UM65 (it has built-in speakers). Playback Devices still shows the monitor speakers, and Device Manager still shows AMD HD Audio. I tried reinstalling the driver, but nothing changed: no audio output over DisplayPort (yesterday the monitor speakers worked fine).

Anyone who knows this problem, please help me.

Thanks


----------



## LandonAaron

Quote:


> Originally Posted by *long99x*
> 
> Hello everyone,I have this problem
> 
> I have r9 290 dc2 in crossfire driver 14.12, today I turn on my computer, I use displayport cable and my monitor is LG 29UM65(have speaker inside) but I have no audio from my card to my monitor, in playback device it still show my monitor speaker and in device manager still have AMD HD Audio, I try to re-install driver but nothing change, no audio output from displayport (yesterday monitor speaker work fine)
> 
> Anyone know this problem help me please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks


Can you test audio over HDMI to the monitor? Have you tried connecting anything else to the monitor to check that the monitor speakers still work? You need to start testing devices one at a time to figure out which one the problem is with.

Also, why are you using your monitor's speakers? They have to be pretty terrible, right? Get some decent speakers and connect them directly to your motherboard's sound output. Logitech makes some good speakers on the cheap.


----------



## rdr09

Quote:


> Originally Posted by *long99x*
> 
> Hello everyone,I have this problem
> 
> I have r9 290 dc2 in crossfire driver 14.12, today I turn on my computer, I use displayport cable and my monitor is LG 29UM65(have speaker inside) but I have no audio from my card to my monitor, in playback device it still show my monitor speaker and in device manager still have AMD HD Audio, I try to re-install driver but nothing change, no audio output from displayport (yesterday monitor speaker work fine)
> 
> Anyone know this problem help me please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks


Try a Windows system restore from the last time your audio was working, and turn off Windows automatic updates if they're on. Most probably an update screwed it up.


----------



## boot318

Quote:


> Originally Posted by *long99x*
> 
> Hello everyone,I have this problem
> 
> I have r9 290 dc2 in crossfire driver 14.12, today I turn on my computer, I use displayport cable and my monitor is LG 29UM65(have speaker inside) but I have no audio from my card to my monitor, in playback device it still show my monitor speaker and in device manager still have AMD HD Audio, I try to re-install driver but nothing change, no audio output from displayport (yesterday monitor speaker work fine)
> 
> Anyone know this problem help me please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks


Are your monitor speakers set as "Default" in Playback Devices? You probably already did this, but I still had to ask.


----------



## Harry604

I have a 290X DirectCU II.

With the stock heatsink and +100 mV on the core I could only get 1115 MHz stable in BF4.

I put an H100i on the card, and a fan on the VRMs (it comes with a VRM heatsink).

Now I'm at 1190 MHz, stable for 1 hour in BF4, at the same +100 mV on the core.

How is it possible that just water-cooling the card does this?

57°C on the core.

VRM1 is 71.
VRM2 is 65.

I'm still adding 5 MHz at a time and no crashing yet.

Haven't touched aux voltage or memory clock yet.


----------



## ZealotKi11er

I have been trying to get FC4 not to crash with 15.3. I can now play about 5 mins before the system reboots. Has anyone had this problem with FC4?


----------



## boot318

Quote:


> Originally Posted by *Harry604*
> 
> I have a 290x direct cuii
> 
> with the stock heatsink and 100mv on core I can only get 1115 mhz stable in bf4
> 
> i put a h100i on the card and fan on vrm comes with vrm heatsink
> 
> im at 1190 mhz stable 1hr in bf4 at same 100mv on core
> 
> how is it possible by putting a waterblock to cool card
> 
> 57c on core
> 
> vrm 1 is 71
> vrm 2 is 65
> 
> im still adding 5mhz and no crashing yet
> 
> havent touched aux voltage or memory clock yet


I had the same experience with my DirectCU. The card would artifact like crazy over 1125 (+100 mV and red mod). I changed the thermal pad on my VRM heatsink, and now I can get 1185 on the core. The previous high on the VRMs was 90°C, but after changing the pad it is 78°C. If I can tolerate more fan noise, I can get it around 70°C. I tried 1220 on the core, but the game I was playing crashed 10 minutes in. It didn't artifact like it usually would, though.

I guess the Asus DirectCU R9 290 series is just VRM-temp sensitive.


----------



## ZealotKi11er

Someone really needs to test FC4 for me, because I am literally going mad testing this game with CF. The damn thing still does not work. What is going on? MSAA is broken, GameWorks is broken, CF is broken, and the other AA modes don't work with CF.


----------



## Pandora's Box

It is better than it has been, but it's still far from optimal.


----------



## Buehlar

I plan on testing the beta over this weekend...will post the results when done.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Pandora's Box*
> 
> 
> 
> It is better than it has been, but by far it is not optimal at all.


So you have the same problem. CF just seems to come on and off, going from 50-60 fps to 90-120 fps.


----------



## Maintenance Bot

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Someone really need to test FC4 for me because i am literary going mad testing this game with CF. The dam thing still does not work. What the hell is going on. MSAA is broken, GW is broken., CF is Broken, The other AA does not work with CF. What the hell is going on.


Same here. 1080p, CF enabled, Ultra settings with SSAO, 2x MSAA, and god rays off. It was not very smooth at all. Typical contaminated CrapWorks title.


----------



## Pandora's Box

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So you have the same problem. CF just seem to come on and off going from 50-60 fps to 90-120 fps.


I go from 50% usage to 99% usage, so CrossFire is on, but the usage is all over the place.


----------



## ZealotKi11er

I am going to uninstall FC4 and play it if I get a 390X, or play it after two years like I did with FC3.


----------



## Pandora's Box

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am going to uninstall FC4 and play it if i get 390X or play it after 2 years like i did for FC3.


I haven't played it since it came out. With each patch and each AMD driver release I test it, and I get either no CrossFire scaling or crappy scaling.


----------



## long99x

Quote:


> Originally Posted by *LandonAaron*
> 
> Can you test audio over HDMI to the monitor? Have you tried connecting anything else to the monitor to check that the monitor speakers are still working? Need to just start testing devices one at a time to figure out which one the problem is with the.
> 
> Also, why are you using your monitors speakers? They have to be pretty terrible right? Get some decent speakers and connect directly to your motherboards sound card. Logitech makes some good speakers on the cheap.


Yes, I tested my monitor speakers; they work fine.
Quote:


> Originally Posted by *rdr09*
> 
> Try windows recovery from the last time your audio was working and turn off windows auto update if on. most prolly an update screwed it up.


Thanks, I will try
Quote:


> Originally Posted by *boot318*
> 
> Is your monitor speakers "Set Default" in playback devices? I know you probably do but still had to ask.


Already did this.


----------



## Buehlar

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am going to uninstall FC4 and play it if i get 390X or play it after 2 years like i did for FC3.


Quote:


> Originally Posted by *Pandora's Box*
> 
> I haven't played it since it came out. Each patch and each AMD driver release I test it and get either no crossfire scaling, or crappy scaling.


I don't get it, guys. I see a lot of people complaining that the game just doesn't run, crap settings, stuttering, etc. I don't suffer those problems; it runs a hell of a lot better than FC3 for me.
If I fire up FC3, it'll crash within an hour or two.

What res & hertz are y'all trying to play at?

I've played through the entire game at max settings with the 14.12 Omega drivers @ 5890x1080 (on 60 Hz monitors) and found it ran rather smooth. It was quite enjoyable considering I played without full CrossFire support; there was the occasional FPS dip, but it averaged 50-60 fps throughout.

What problems are you guys having other than "no scaling"? Or is it just the "1st world" problems that plague y'all?









It's a really fun game; quit ya griping and play it!







It will eventually have scaling fixed, and we can revisit it in its full glory in CrossFire with endless FPS... (or with a 390)

Anyways...I plan on trying beta at various settings mentioned here over the weekend.


----------



## Pandora's Box

Trying to play it at 3440x1440, and usage dropping to 50% per GPU doesn't do the FPS any favors (35-40).


----------



## Buehlar

Quote:


> Originally Posted by *Pandora's Box*
> 
> Trying to play it at 3440x1440 and the usage goes to 50% per GPU doesn't do so well for the FPS. (35-40)


What's your single-card FPS at the same res & settings?

Edit...
Oh, I noticed you're running a 295X2, so I don't think you can turn off CrossFire. You're probably also dealing with a whole other set of issues with that monstrosity of a card.


----------



## LandonAaron

There is a long-running FC4 thread here at OCN; I would check it out. It's been a while since I've kept up with it, but the CrossFire and SLI problems are well documented.


----------



## mojobear

hey guys,

I am setting up 1440p in Eyefinity and was puzzling over the inputs/outputs on the R9 290s.

How the heck did Linus run this setup?






With the LG 34UC97 x3... you'd use either DP, HDMI, or Thunderbolt...

I am very confused. Any enlightenment?


----------



## Buehlar

Quote:


> Originally Posted by *LandonAaron*
> 
> There is a long runnin g FC4 thread here at OCN I would check it out. It's been a while since I've kept up with it but the crossfire and sli problems are well documented.


Yep, there is also a few tweaks for AMD on the 1st post
http://www.overclock.net/t/1520430/official-far-cry-4-information-discussion-thread
Quote:


> Originally Posted by *mojobear*
> 
> hey guys,
> 
> I am setting up 1440p in eyefinity and was puzzling over In/outs for the R9 290s.
> 
> How the heck did Linus run this setup?
> 
> 
> 
> 
> 
> 
> With LG 34UC97 x 3....you either use DP, HDMI, or TB....
> 
> I am very confused. Any enlightenment?


Several configs are possible, depending on which monitors you have and their available input ports.

All monitors will be connected to the primary GPU only.

If all your monitors have DP, then I recommend getting a DP hub for optimal performance.

Other configs that work:
1st monitor via HDMI
2nd monitor via DVI
3rd monitor via DP

or

1st monitor via HDMI
2nd monitor via DVI
3rd monitor via DVI

or

1st monitor via DVI
2nd monitor via DVI
3rd monitor via DP


----------



## DeviousAddict

I've currently got a Sapphire Vapor-X R9 290X 8 GB; if I get an MSI R9 290X 8 GB for CrossFire, will they work together OK?

MSI - http://www.overclockers.co.uk/showproduct.php?prodid=GX-269-MS
My current Sapphire - http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&pid=2394&psn=&lid=1&leg=0

The reason I want to get the MSI over another Sapphire is price; the MSI one is £50 cheaper than the Sapphire.


----------



## Buehlar

Quote:


> Originally Posted by *DeviousAddict*
> 
> I've currently got a Sapphire vapor-x R9 290X 8gb, if i get an MSI R9 290X 8gb for X0fire will the work together ok?
> 
> MSI - http://www.overclockers.co.uk/showproduct.php?prodid=GX-269-MS
> My current Sapphire - http://www.sapphiretech.com/presentation/product/?cid=1&gid=3&sgid=1227&pid=2394&psn=&lid=1&leg=0
> 
> The reason i want to get the MSI over another Sapphire is price, the MSI one is £50 cheaper than the Sapphire.


No problem


----------



## DeviousAddict

Quote:


> Originally Posted by *Buehlar*
> 
> No problem


Awesome, thank you









Edit: MSI version ordered, it'll be in tomorrow morning and fitted by lunch


----------



## Buehlar

Quote:


> Originally Posted by *DeviousAddict*
> 
> Awesome, thank you
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: MSI version ordered, it'll be in tomorrow morning and fitted by lunch


Sweet!
Don't forget to post pics!


----------



## ZealotKi11er

Quote:


> Originally Posted by *Buehlar*
> 
> I don't get it guys. I see a lot of people complaining either the game just don't run...crap settings.... stuttering etc... I don't suffer those problems. Runs hell-of-a-lot better than FC3 for me.
> If I fire up FC3 it'll crash within an hour or two.
> 
> What res & hertz are y'all trying to play at?
> 
> I've played through the entire game max settings with 14.12 Omega drivers @ 5890x1080 on (60hz monitors) and found it ran rather smooth and the was quite enjoyable considering I played without full crossfire support and the occasional fps dip but averaged 50~60 fps throughout.
> 
> What problems are you guys having other than "no scaling"? Or is it just the "1st world" problems that plagues y'all?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's a really fun game, quit ya griping and play it!
> 
> 
> 
> 
> 
> 
> 
> It will eventually have scaling fixed and we can revisit it in all its full glory in crossfire with endless FPS ...(or with a 390)
> 
> Anyways...I plan on trying beta at various settings mentioned here over the weekend.


I played the game fine with a single 290X @ 1440p Ultra, but I want to get more than 60 fps.


----------



## Buehlar

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I played the game fine with a single 290X @ 1440p Ultra but i want to get more then 60 fps.


Oh, is that all? Well, we're in the same club then.








We suffer from some serious 1st world problems


----------



## ZealotKi11er

Quote:


> Originally Posted by *Buehlar*
> 
> Oh is that all? well we're in the same club then
> 
> 
> 
> 
> 
> 
> 
> 
> We suffer from some serious 1st world problems


Yeah. FC4 is the only game I have been trying to play for the past 4 months. It should not take this long to fix the game. It's not a big problem, but it's frustrating.


----------



## Agent Smith1984

What exactly are the issues you are having with it, besides getting less than 60FPS?


----------



## LandonAaron

So I was looking through the OCN marketplace to see if there was anything interesting for sale and came across this: http://www.overclock.net/t/1533788/40x-amd-reference-r9-290s-xfx-sapphire-and-asus

The guy is selling 290s he used for mining for $200. That is how I bought my 290X: for $200 from a miner off of eBay, and it has treated me well. If anyone is still on the fence about jumping to a 290, or just wants a second card, this is a great way to do it. I am not affiliated with Lutro0 in any way, just thought I would share.


----------



## LandonAaron

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah. FC4 is the only game i have been trying to play for the past 4 months. It should not take this long to fix the game. It's not a problem but it's frustrating.


Yeah, I got frustrated with FC4's performance and didn't play it as much as I would have liked. I just quickly ran through the campaign and haven't messed with it since. I was running a single 290X at 1080p 60 Hz at the time, and it wasn't the FPS that was the problem; it was persistent microstutter. I have yet to try it again since I upgraded to CrossFire and 1440p 96 Hz. I was hoping the extra power and the higher refresh rate might alleviate some of that stutter, but I kind of doubt anything will help with that game's problems.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What exactly are the issues you are having with it, besides getting less than 60FPS?


CF. I want more than 50-60 fps, and I want to use AA.


----------



## Agent Smith1984

Well, I didn't play it a whole lot, but my son has been playing it on his X4 and 7950 with very high settings (not Ultra, no AA) and it runs between 45 and 60 FPS. It seems to run pretty well, in my opinion.

Maybe a hitch here and there, but I think that's because his CPU is a bit bogged down.


----------



## mfknjadagr8

Quote:


> Originally Posted by *LandonAaron*
> 
> So I was looking through the OCN marketplace to see if there was anything interesting for sale and came across this: http://www.overclock.net/t/1533788/40x-amd-reference-r9-290s-xfx-sapphire-and-asus
> 
> Guy is selling 290's he used for mining for $200. That is how I bought my 290x. For $200 from a miner off of ebay and it has treated me well. If anyone is still on the fence about jumping to 290 or just wanting a second card this is a great way to do it. I am not affiliated with the Lutro0 in anyway just thought I would share.


I bought my two through him; they needed a repaste, but they're going strong under water now.

Can you add me to your list?


----------



## FastEddieNYC

With prices for the 290X hovering around $300, I could not resist and purchased the XFX 290X DD 4 GB. Nice performance increase from my Asus 270X. For once I think I got lucky with a good chip; the card will run 1125 MHz core with no voltage increase. I'm going to wait until I receive and install my Aqua Computer kryographics full-cover waterblock and backplate before cranking up the volts. Unfortunately, because of the demise of FrozenCPU and the awful experiences I've had with Performance-PCs in the past (and their poor/out-of-stock inventory), I ended up ordering from Aqua Computer in Germany. They even had the active backplate in stock.



The 290X makes my old 270X look puny in comparison.


----------



## gatygun

Well, I upgraded from a 580 Ti 1.5 GB to this beast of a card, a 290 Tri-X 4 GB, and I must say AC Unity plays so damn solid. I was amazed.

I'm getting ~50 fps on Ultra at 1080p; with soft shadows set to High (one step lower in the options), a solid 40 fps downsampled from 1440p; and it's even playable with HBAO+ down a notch at its max resolution of 1800p at 30 fps (it does require a tiny bit of overclocking, though).

Too bad the card comes with horrible black screens of death, and the issues with Raptr make getting it to even remotely work a hellish experience.

I hope it can be fixed simply, but I highly doubt it will work out like that.

Still, a 10109 Fire Strike score is pretty darn good for the card. I'm a bit worried about this card's future-proofness, though; the 4 GB already seems to be a limit when you go 1800p + 4x MSAA. The game hits its VRAM wall directly. That doesn't bode well if I want to CrossFire it in the future.


----------



## gatygun

If you've got them in CrossFire, can you do some benchmarks where you crank the settings up to the point where it hits a 30 fps playable framerate? I want to see the VRAM usage then (think 1800p + MSAA + high settings in Unity).

Thanks


----------



## diedo

Guys, I'm looking for a pre-configured MSI Afterburner profile for a stock AMD R9 290X. The weather here is hot and I have no AC here... thanks.

The card I have is in my sig!


----------



## Buehlar

Quote:


> Originally Posted by *gatygun*
> 
> Well upgraded from a 580 ti 1,5gb > this beast of a card 290 tri-x 4gb. And i must say ac unity is playing so god dam solid. I was amazed.
> 
> Getting like ~50 fps on ultra at 1080p, getting with soft shadows put to high ( 1 point lower in the opinions ) a solid 40 fps on 1400p downsampled, it's even playable with hboa+ down a notch on it's max resolution 1800p at 30 fps. ( it does require a tiny bit of clocking upwards tho ).
> 
> To bad the card camps with horrible black screen of deaths and issue's with raptr to get it even remotely to work is a hellish experience.
> 
> I hope it can be fixed simple, but i highly doubt it's gona work like that.
> 
> Still 10109 fire strike score is pretty darn good for the card. *I'm a bit worried about the future proofness of this card tho, the 4gb seems to be already hit when you go 1800p + 4x msaa. The game hits its v-ram wall directly. Doesn't bode well if i want to crossfire it in the future*.


In the near future, an R9 290 CrossFire rig will be able to utilize all 8 GB of VRAM with DirectX 12 support. Sounds future-proof to me.


----------



## supermi

Well, I have been looking for a pair, or even one, 8 GB 290X at a decent used price; no go so far... I might just get some 290s locally. If I do, I'm wondering whether to get 2 or 3 of them.
How is 3-way scaling going these days? I found 2-way to be less than great for actual gaming, but if 3-way is the same, I'm not sure it's worth it.
Quote:


> Originally Posted by *Buehlar*
> 
> In the near future an r9 290 crossfire rig will utilize all 8GB VRAM with direct12 support. Sounds future proof to me


That is up to the developer to implement; I would NOT make purchasing choices based on that assumption.


----------



## LandonAaron

Quote:


> Originally Posted by *gatygun*
> 
> Well upgraded from a 580 ti 1,5gb > this beast of a card 290 tri-x 4gb. And i must say ac unity is playing so god dam solid. I was amazed.
> 
> Getting like ~50 fps on ultra at 1080p, getting with soft shadows put to high ( 1 point lower in the opinions ) a solid 40 fps on 1400p downsampled, it's even playable with hboa+ down a notch on it's max resolution 1800p at 30 fps. ( it does require a tiny bit of clocking upwards tho ).
> 
> To bad the card camps with horrible black screen of deaths and issue's with raptr to get it even remotely to work is a hellish experience.
> 
> I hope it can be fixed simple, but i highly doubt it's gona work like that.
> 
> Still 10109 fire strike score is pretty darn good for the card. I'm a bit worried about the future proofness of this card tho, the 4gb seems to be already hit when you go 1800p + 4x msaa. The game hits its v-ram wall directly. Doesn't bode well if i want to crossfire it in the future.


What is the voltage set to on your card? I only get black screens if I go over +100 on the voltage.


----------



## icmacdon

Hello
To join this club do I post my info in a post?
Regards


----------



## Arizonian

Quote:


> Originally Posted by *icmacdon*
> 
> Hello
> To join this club do I post my info in a post?
> Regards


Yes, just drop a GPU-Z validation link, a screenshot open to the validation tab, or a picture with your OCN name on it.









I use the post as proof in the members list.


----------



## gatygun

Well, I've got some experience with "future technology" and current graphics cards, gained throughout the years.

Either:

1) the card doesn't offer the feature in its fullest form, and therefore it's a crap solution;
2) the tech takes 2-3 years after it's built for two games to really support it, and by that time your GPUs are outdated;
3) it works, but in a limited way ("advice: you are better off buying a new-gen card").

The same goes for the new memory architectures. It's great that they're moving forward, but how many games will utilize it if only one flagship card is going to use it? It will probably be used to brute-force current memory solutions, especially when most games are console-focused.

I was thinking about buying a used 8 GB version, but that one still went for 330 euros. I couldn't pull the trigger on it, as the 8 GB would be absolutely useless without CrossFire anyway; the only two titles I could really push over 4 GB were at 4x MSAA + 1800p, and that MSAA would kill my performance anyway.

I even doubt 2x CrossFire would be able to pull it off at a stable 30 fps.
Quote:


> Originally Posted by *LandonAaron*
> 
> What is the voltage set to on your card. I only get black screen if I go over +100 on the voltage.


At +0 and +25 it crashes. At +100 it also black-screened a few times, but it seems to be stable now. No clue for how long, though.

gpu-z:

http://www.techpowerup.com/gpuz/details.php?id=phse

http://gpuz.techpowerup.com/15/04/02/gcu.png

Manufacturer: Sapphire
Brand: 290 tri-x r9 290 oc
stock tri-x fan


----------



## Buehlar

Quote:


> Originally Posted by *supermi*
> 
> Well I have been looking for a pair of or even 1 8gb 290x for a decent used price, no go so far ... I might just get some 290's locally.... if I do I am wondering 2 or 3 of them.
> How is 3 way scaling going on, I found 2 way to be less than great for actual gaming but if 3 way is the same but
> That is up to the developer to implement, I would NOT make purchasing choices based on that assumption.


You have a point; however, AMD is already optimizing games at the hardware level, and I can't see them not moving forward with it.

I doubt AAA titles will ignore it; they'll want to take advantage. And in all honesty, the game developers that always get by selling poorly optimized games and choose not to adopt and support current technology should die off anyway, IMO.

It would be great for the industry. At least we won't be griping about wasting our money on their games anymore.


----------



## ZealotKi11er

I am thinking of selling my 290 and sticking to a single 290X, as I really don't have a use for it right now. What do you guys think? I am planning to play W3 and GTA, and I game @ 1440p. Since I got this second card last summer, I have not played a game where I really needed two cards, apart from Crysis 3 and BF4 @ 150% resolution scaling.


----------



## Arizonian

Quote:


> Originally Posted by *FastEddieNYC*
> 
> With the Prices for the 290X hovering around $300 I could not resist and purchased the XFX290X DD 4GB. Nice performance increase from my Asus 270X. For once I think I got lucky with a good chip. The card will run 1125Mhz Core with no voltage increase. I'm going to wait until I Receive and install my Aquacomputer Kryographics Full cover Waterblock and backplate before cranking up the volts. Unfortunately because of the demise of FrozenCpu and the awful experience I've had with Performance PCS in the past and their poor/out stock levels I ended up ordering from Aquacomputer in Germany. They even had the Active Backplate in stock.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> The 290X makes my old 270X look puny in comparrison.


Congrats - added








Quote:


> Originally Posted by *gatygun*
> 
> gpu-z:
> 
> http://www.techpowerup.com/gpuz/details.php?id=phse
> 
> http://gpuz.techpowerup.com/15/04/02/gcu.png
> 
> Manufacturer: Sapphire
> Brand: 290 tri-x r9 290 oc
> stock tri-x fan


Congrats - added


----------



## ZealotKi11er

Got to love BF4 scaling. Mantle is working much better right now: ~3.3 GB VRAM usage @ 1440p with 150% resolution scaling. I am getting a solid 60 fps with V-Sync on and 65-85 fps with it off, with all settings at Ultra and 0x AA. Got to love how well CF works in this game.


----------



## aaroc

Quote:


> Originally Posted by *Buehlar*
> 
> Yep, there is also a few tweaks for AMD on the 1st post
> http://www.overclock.net/t/1520430/official-far-cry-4-information-discussion-thread
> Several configs are possible depending on what monitors you have and their input ports available.
> 
> All monitors will be connected to the primary GPU only.
> 
> If all your monitors have DP then I recommend getting a DP hub for optimal performance.
> 
> Other configs that work.
> 1st monitor via HDMI
> 2nd monitor via DVI
> 3rd monitor via DP
> 
> or
> 
> 1st monitor via HDMI
> 2nd monitor via DVI
> 3rd monitor via DVI
> 
> or
> 
> 1st monitor via DVI
> 2nd monitor via DVI
> 3rd monitor via DP


If you are using 1440p monitors, you can drive only two monitors at that resolution with a DP hub; that's the reason I didn't buy one. I have three 1440p Samsung monitors, and I have them connected with 2x dual-link DVI and 1x HDMI. Luckily the HDMI cable can drive the 1440p monitor without problems. I couldn't find a 5-meter DP cable in my country, so I tested the 5-meter HDMI cable that I had connected to my projector, and it worked.


----------



## Buehlar

Quote:


> Originally Posted by *aaroc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Buehlar*
> 
> Yep, there is also a few tweaks for AMD on the 1st post
> http://www.overclock.net/t/1520430/official-far-cry-4-information-discussion-thread
> Several configs are possible depending on what monitors you have and their input ports available.
> 
> All monitors will be connected to the primary GPU only.
> 
> If all your monitors have DP then I recommend getting a DP hub for optimal performance.
> 
> Other configs that work.
> 1st monitor via HDMI
> 2nd monitor via DVI
> 3rd monitor via DP
> 
> or
> 
> 1st monitor via HDMI
> 2nd monitor via DVI
> 3rd monitor via DVI
> 
> or
> 
> 1st monitor via DVI
> 2nd monitor via DVI
> 3rd monitor via DP
> 
> 
> 
> If you are using 1440p monitors you can drive only two monitors at that resolution with a DP Hub. that the reason I didnt buy it. I have 3 1440p samsung monitors and I have them connectetd with 2 x DVI DL and 1 HDMI. Luckily the HDMI cable can drive the 1440p monitor without problems, I couldnt find a 5 meter DP cable in my country, so I tested the 5 meter hdmi cable that I had connected to my proyector and it worked

Yep. That's why I said it depends on the monitors.









Not many people will attempt to run 3x1 1440p monitors. That's pushing some crazy pixels.









How is the maxed out gaming performance?


----------



## Mega Man

Meh, AMD can drive it; NVIDIA... not so much.


----------



## Buehlar

Quote:


> Originally Posted by *Mega Man*
> 
> meh amd can drive it, nvidia.... not so much


Well we all know you're an AMD diehard if there ever was one!








But I do agree that CrossFire and Eyefinity are quite a few steps ahead of SLI and multi-display.


----------



## DeviousAddict

Well, my card should have been delivered by now, but it seems that because of the Easter bank holiday weekend they changed the date to Tuesday and didn't tell me until just now.








I've had to change it to next Friday though because I work away from home, so I've got a whole week to wait for my 2nd gpu!
(1st world problems







)


----------



## battleaxe

So I read a few pages back that CrossFired 290s can access all 8 GB across two 4 GB cards? Is this right, or am I misunderstanding something? I'm speaking about when DX12 comes out, of course. Maybe I misunderstood it?


----------



## Buehlar

Quote:


> Originally Posted by *battleaxe*
> 
> So I read a few pages back that Xfire 290's can access all 8 GB on 4GB cards when in Xfire? Is this right or am I misunderstanding something? I'm speaking about when DX12 comes out of course. Maybe I misunderstood it?


Yes, stackable VRAM is possible with the Mantle API and will also be possible with DirectX 12.
Once developers adopt these APIs, they'll have the "ability" to optimize their games to utilize all available VRAM in CrossFire/SLI configurations.

Read more here


----------



## tsm106

Quote:


> Originally Posted by *Buehlar*
> 
> Quote:
> 
> 
> 
> Originally Posted by *battleaxe*
> 
> So I read a few pages back that Xfire 290's can access all 8 GB on 4GB cards when in Xfire? Is this right or am I misunderstanding something? I'm speaking about when DX12 comes out of course. Maybe I misunderstood it?
> 
> 
> 
> Yes, stackable vRAM is possible with the Mantle API and will also be with Directx 12.
> Once developers adopt these APIs they'll have the "ability" to optimize their games to utilize all available vRAM in crossfire/SLI configurations.
> 
> Read more here

I don't know why all the news is gaga over split-frame rendering. There are only a few cases where max (or inflated) FPS is not the goal and lower lag and stutter matter more, like in RTS/RPG games. One example is Civ: BE, where higher minimum framerates matter more than maximum framerates; in other words, it's not an FPS-driven game, so it wouldn't matter anyway. Also, a developer cannot just decide to use SFR; they have to design the engine with it in mind.

http://techreport.com/blog/27258/civ-beyond-earth-with-mantle-aims-to-end-multi-gpu-microstuttering

Quote:


> Now, AMD and Firaxis have gone one better, throwing out AFR and implementing split-frame rendering (SFR) instead in the Mantle version of Beyond Earth.


----------



## battleaxe

Quote:


> Originally Posted by *tsm106*
> 
> I don't know why all the news is gah gah over split frame rendering. There's only a few instances where max or inflated FPS is not the goal but lower lag and low stutter is more important, like in rts/rpg games. One example is Civ BE, where higher minimum framerates is better than maximum framerates, in other words it's not a FPS game so it wouldn't matter anyways. Also a developer cannot just decide to use SFR, they have to design the engine with it in mind.
> 
> http://techreport.com/blog/27258/civ-beyond-earth-with-mantle-aims-to-end-multi-gpu-microstuttering


Is that also true for 4K resolution? Seems the extra memory would be helpful?


----------



## gatygun

Well, guess I killed my card already, haha.

The thing keeps popping a black screen on me out of nowhere. I reinstalled Windows, clocked everything down, etc. No result. Maybe my power supply can't handle it; no clue about that.

Ah well, got some testing to do tomorrow.


----------



## mfknjadagr8

Quote:


> Originally Posted by *gatygun*
> 
> Well guess i killed my card already haha.
> 
> Thing keeps popping a black screen on me out of nowhere. Reinstalled windows , clocked everything down etc. No result. Maybe my power supply doesn't manage. No clue about that.
> 
> Ah well got some testing to do tommorow.


Disabled ULPS? How are temps? You aren't alone; I finally solved mine by disabling ULPS and repasting, then I went to water cooling. I haven't been overclocking though, so I may run into the issue again, but so far so good. Under water, temps never break 55°C on the cores, and the VRMs so far don't break 60°C.


----------



## gatygun

Quote:


> Originally Posted by *mfknjadagr8*
> 
> disabled ulps? How are temps?


Gotta try ULPS.

Screenshot of it:

Basically, stock CPU settings = 60°C with hyperthreading on.
The 290 is about 67°C max on the GPU and 69°C max on VRM 1.

This is on a 100% AIDA64 run.

Some lines appear in the middle of my screen and then the driver crashes, or a black screen pops up.


----------



## mfknjadagr8

Quote:


> Originally Posted by *gatygun*
> 
> Gotta try ulps.
> 
> Screenshot of it:
> 
> 
> 
> basically stock cpu settings = 60c with hyperthreading on
> 290 gpu is about 67c gpu max and 69c vram 1 max
> 
> This is on a 100% aida64 run.
> 
> Some lines come in my screen ( the middle and just boom the driver crashes ) or a black screen pops up.


It's my understanding that ULPS tries to drop the clocks/voltages to save power when they're not needed; however, it often does so at improper times under load. I wasn't experiencing black screens except during 3D applications (i.e. fullscreen gaming). Disabling ULPS helped tremendously, but my temps still weren't good and I still got black screens occasionally (so I repasted, and the black screens were gone and temps were much lower). Before I waterblocked them, if I forgot to ramp up the fans I would still get black screens, but I had stock reference blowers as well. I also never experienced the lines or an outright driver crash with no black screen; perhaps someone else here can chime in on that.
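For reference, disabling ULPS is usually done by zeroing the `EnableUlps` registry value under the display-adapter class keys. A PowerShell sketch of how that might look (assumption: the Catalyst-era key layout below; it can vary by driver version, so back up the registry first and treat this as a guide, not gospel):

```shell
# Run in an elevated PowerShell, then reboot. 0 = ULPS disabled, 1 = enabled.
# {4d36e968-...} is the standard display-adapter device class key.
$base = 'HKLM:\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}'
Get-ChildItem $base -ErrorAction SilentlyContinue | ForEach-Object {
    $props = Get-ItemProperty $_.PSPath -ErrorAction SilentlyContinue
    if ($null -ne $props.EnableUlps) {
        Set-ItemProperty $_.PSPath -Name EnableUlps -Value 0
    }
}
```

Tools like MSI Afterburner expose the same toggle in their settings if you'd rather not touch the registry by hand.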


----------



## th3illusiveman

Quote:


> Originally Posted by *Harry604*
> 
> I have a 290x direct cuii
> 
> with the stock heatsink and 100mv on core I can only get 1115 mhz stable in bf4
> 
> i put a h100i on the card and fan on vrm comes with vrm heatsink
> 
> im at 1190 mhz stable 1hr in bf4 at same 100mv on core
> 
> how is it possible by putting a waterblock to cool card
> 
> 57c on core
> 
> vrm 1 is 71
> vrm 2 is 65
> 
> im still adding 5mhz and no crashing yet
> 
> havent touched aux voltage or memory clock yet


Cooling AMD GPUs well lets them overclock very, very well even at the same voltage. Hawaii leaks more power as it heats up, so lower temperatures mean the same voltage goes further.


----------



## gatygun

Quote:


> Originally Posted by *mfknjadagr8*
> 
> it's my understanding that ulps tries to drop the clocks/voltages to save power when it's not needed however it often does so at improper times under load...I wasn't experiencing black screens except during 3d applications (ie fullscreen gaming) disabling ulps helped tremendously but my temps still weren't good and I still got black screens occasionally...(so I repasted and black screens were gone and tennis were much lower) before I waterblocked them if I forgot to ramp up the fans I would still get black screens but I had stock reference blowers as well...I also never experienced the lines or an outright driver crash with no black screen...perhaps someone else here can chime in on that


This card hardly gets hot, really; it's running at like 35% of its max fan speed and barely heating up at all. It does heat up the CPU when the fans push more heat in its direction.

Disabled ULPS: no effect, same problem.
Enabled unofficial overclocking mode without PowerPlay support: no effect, same issue.
Reset display mode on applying unofficial overclock, then disabled it: no effect, same issue.

I also ran MemTest64 for a while and there were no issues there.

So basically there is not much I can still do.

I honestly wonder if my PSU is the issue, though. I have a Zalman ZM1000-HP with 18/28A rails (not the Plus one). I have one 8-pin cable coming directly out of the PSU (no clue what amps it has or how stable that is) and one 6-pin I connect on a different rail.

Maybe my power supply just doesn't take the beating anymore. The thing is 1000W but 7-8 years old.

No clue how to calculate this, though.
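For what it's worth, the per-rail math is just P = V × I. A rough sanity check using the 18A/28A figures posted above (purely illustrative; real rails share a common OCP budget, so this is a ceiling, not a guarantee):

```python
# Rough rail-wattage check for a multi-rail PSU: P = V * I per 12 V rail.
def rail_watts(volts: float, amps: float) -> float:
    """Maximum power a single rail can deliver at its rated current."""
    return volts * amps

small = rail_watts(12.0, 18.0)  # the 18 A rail
big   = rail_watts(12.0, 28.0)  # the 28 A rail

# A reference R9 290 can pull close to 300 W under FurMark, so a single
# 18 A rail (216 W) is marginal on its own; splitting the two PCIe plugs
# across two rails is what keeps each rail inside its limit.
print(small, big, small + big)  # 216.0 336.0 552.0
```

The card also draws up to 75W through the PCIe slot itself, which comes off the motherboard's 12V feed rather than the PCIe plugs.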


----------



## Forceman

Have you tried bumping the core voltage a notch? Some cards seem to have poorer memory controllers and give driver errors and crashes even at stock. A slight core bump might help.


----------



## Agent Smith1984

Does anyone have any clue why my 290 would just stay pegged at 100% utilization while idling in Windows? Just noticed the idle temp was 55°C, then opened HWMonitor and saw the usage.

Nothing is running that would use the GPU...


----------



## davidm71

I was getting random black screens and serious flickering depending on the driver installed and the only fix I had was to rma the card. I suggest you do the same. The 290Xs are plagued by this fault and no bump in voltage is going to fix it!


----------



## GhosX

My tips for a black screen after first installing the driver + Catalyst (I am still using the 13.12 driver).
I am sorry if someone has posted this before me (I didn't read ALL the thread pages).

Push the restart button on the PC when the display black-screens.

Enter Safe Mode (press the F8 key).

Disable Catalyst at startup (I did it with CCleaner):
http://filehippo.com/download_ccleaner

Disable the Catalyst desktop right-click menu:
http://www.howtogeek.com/howto/windows-vista/remove-ati-catalyst-control-center-from-the-desktop-right-click-menu/

Restart.

I cannot prove it with my own screenshot right now, because I am on vacation and using a notebook.

Good luck.


----------



## gatygun

Quote:


> Originally Posted by *GhosX*
> 
> My tips for black screen display after first installing driver + catalyst. (i am still using 13.12 driver)
> I am sorry if someone have told this before me (i didn't read ALL thread pages)
> 
> Push restart button at PC when display black screens
> 
> Enter safe mode ( press F8 key )
> 
> Disable startup catalyst (i did with CCleaner)
> http://filehippo.com/download_ccleaner
> 
> Disable Catalyst Desktop Right-Click Menu
> http://www.howtogeek.com/howto/windows-vista/remove-ati-catalyst-control-center-from-the-desktop-right-click-menu/
> 
> Restart
> 
> i cannot prove with my own screenshot right now, because i am on vacation and use notebook.
> 
> good luck


I installed just the 14.12 driver without the Catalyst suite (I had 15.3), and the same problems existed on a fresh Windows install. So that's not going to fix it.

Quote:


> Originally Posted by *davidm71*
> 
> I was getting random black screens and serious flickering depending on the driver installed and the only fix I had was to rma the card. I suggest you do the same. The 290Xs are plagued by this fault and no bump in voltage is going to fix it!


I will test a bit more before I start the RMA process, because I'll probably have no GPU for weeks if not months. RMA'ing cards here is hell.

Quote:


> Originally Posted by *Forceman*
> 
> Have you tried bumping the core voltage a notch? Some cards seem to have poorer memory controllers and give driver errors and crashes even at stock. A slight core bump might help.


Yeah, I did, and the problem stayed.

Update:

Basically, the card crashes non-stop whatever I do. I have the feeling it could be my PSU just not giving enough power on the 18A lines. So I drastically underclocked my video card and ran FurMark to reduce the power needed: instead of the 21A it drew at times, it now only used 13.8A. And after 13 hours of stress testing, the card didn't crash.

I use clocks of 670 on the core and 900 on the memory; I also undervolt my card by -44 on core voltage and -22 on the power limit.

But maybe it's the VRAM causing issues, simply from having a corrupt module on it; who knows, as FurMark hardly pushes the VRAM.

So I'll probably do some more testing.

Too bad I can't use AIDA64 for underclocking, as it always forces the clocks back to stock.


----------



## davidm71

Do a cross-ship using your credit card. That's what I did. Trust me, it's the only way.


----------



## kzone75

A little late to the party, but I bought a VTX3D R9 290 "X-Edition" today.









GPU-Z


----------



## cokker

Quote:


> Originally Posted by *gatygun*
> 
> Basically the card crashes non stop whatever i do. I got the feeling that it could be my psu that just doesn't give enough power on the 18a lines.


Yeah, depending on how many 18A rails you have and how you have them configured, it could be a problem.

I've never been a fan of mid-range (~500W) multi-rail PSUs.


----------



## gatygun

Quote:


> Originally Posted by *davidm71*
> 
> Do a cross ship using your credit card. That's what i did. Trust me it's the only way.


Nobody has credit cards here though, so I will probably have to wait a long time. I will probably underclock it and just replace it in a few months with an 8GB version or the next card, as I already hit 4GB in AC Unity / Mordor at times on ultra.
Quote:


> Originally Posted by *cokker*
> 
> Yeah, depending on how many 18a rails you have and how you have them configured could be a problem.
> 
> I've never been a fan of mid range (~500w) wattage multi-rail PSU's.


Well, this is my PSU:

The image really isn't clear to me; I have no clue what I have it linked to. One cable is plugged in 6-pin into a slot on the power supply, and one 8-pin pretty much comes directly out of the power supply. The connector on the PSU (6-pin) says PCI-E3, but the diagram says it's ODD/HDD, while those two things are indicated on other plugs. So the question is whether it's connected to an 18A or 28A line. Even then, I have no clue how much the 8-pin gives me.

Anyway, I tested a bit more today and found the following results:

MemTest64: no issues; the stock CPU clock also works like a charm and hits 45°C max stressed.

GPU department (core voltage, power limit, core clock, memory clock):
-44, -22, 670, 904: super stable, no crashes, nothing (20 hours long).
+50, +50, 670, 1000: stable, no real crashes or anything (6-7 hours, no issues).
+50, +50, 670, 1300: unstable; the driver crashed, heat somehow went up massively, and a black screen appeared pretty much 6 minutes after I changed it.
+50, +50, 1000, 904: completely stable, no crashes at all after a few hours of Mordor.

Will try a bit more tomorrow.

I also noticed that my card at times draws 290 watts. Why would they put 1x 8-pin and 1x 6-pin on the card when the limit is about 300 watts? Why not go straight for 2x 8-pin (I see they fixed that with the 8GB versions)? Seems it would be a lot more stable that way.
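For reference, that ~300W figure comes from the PCIe power-delivery budgets (75W from the slot, 75W per 6-pin, 150W per 8-pin, per the published PCI-SIG limits). A quick sketch of the two layouts being compared:

```python
# PCIe power-delivery spec budgets (widely published PCI-SIG figures).
SLOT_W    = 75   # x16 slot through the motherboard
SIX_PIN   = 75   # 6-pin PCIe plug
EIGHT_PIN = 150  # 8-pin PCIe plug

budget_6_plus_8 = SLOT_W + SIX_PIN + EIGHT_PIN    # reference 290/290X layout
budget_8_plus_8 = SLOT_W + EIGHT_PIN + EIGHT_PIN  # 2x 8-pin layout

# A card peaking at ~290 W sits right under the 6+8-pin spec budget,
# which is why the 2x 8-pin layout looks more comfortable on paper.
print(budget_6_plus_8, budget_8_plus_8)  # 300 375
```

As noted elsewhere in the thread, the physical wires can carry well beyond these spec numbers; the budget is a design limit, not a hard electrical one.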


----------



## davidm71

I'm just saying don't let the warranty run out.


----------



## tsm106

Quote:


> Originally Posted by *gatygun*
> 
> 
> 
> I also figured out that my card uses at times 290 watt, why would they push 1x 8 pins and 1x 6 pins on the card? when the limit is like 300 watt. Why not go directly the 2x 8 pins route ( i see they fixed that with the 8gb versions ). *Seems to be a lot more stable that way.*


It doesn't have much to do with the cable pin arrangement. The cables themselves can deliver more than their rated power; a helluva lot more.

Judging by your PSU pic above, it's an older PSU. Look at how much power is dedicated to dead rails: the +5V/+3.3V have 250W dedicated to them, lol. Those rails are barely used anymore, since CPUs moved to 12V a looong time ago. Linked below is a typical current PSU; look at the wattage dedicated to the 3.3V/5V sub-rails, it's less than half of your PSU's. There's also more transparency with the top PSUs: notice the G2 1000 is actually a 1000W PSU. The main 12V output delivers 999.6W, or 1kW.

For ex.



As for your issue. It would be really helpful if you could correlate what plugs you used to their rail number/designations. Which rail or rails are you using to connect to your gpu?

You also should read this thread for reference.

http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40


----------



## gatygun

Quote:


> Originally Posted by *tsm106*
> 
> It doesn't have much to do with the cable pin arrangement. The cables themselves can deliver more than the rated power, a helluvalot more.
> 
> Judging by your psu pic above, it's an older psu. Look at how much power is dedicated to dead rails, the +5v/+3.3v have 250w dedicated to them lol. Those rails are barely used anymore since cpus moved to 12v a looong time ago. Linked below is a typical current psu, look at the wattage dedicated to the sub rails 3.3v/5v, it's less than half of your psu. There's also more transparency with the top psus, notice the G2 1000 is actually a 1000w psu. The main 12v output actually delivers 999.6w or 1kw.
> 
> For ex.
> 
> 
> 
> As for your issue. It would be really helpful if you could correlate what plugs you used to their rail number/designations. Which rail or rails are you using to connect to your gpu?
> 
> You also should read this thread for reference.
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40


Thanks for the link. I will read it tommorow morning

This is how i got my psu connected.


----------



## chronicfx

Eyefinity guys: looking for a recommendation. Current setup is a 4790K with triple 290X cards and 16GB of RAM at 2400MHz. The monitors I have are an LG 34UM95 (3440x1440) and a Yamakasi Catleap (DVI only). I want to run the Catleap as the left monitor and have the LG in the middle because of its 21:9 aspect. I do not have a third monitor and would like suggestions that cost less than $400. Also, the LG uses a DisplayPort connection and the Catleap is DVI-D, so my cards being reference leaves an HDMI and the other DVI-D port available. I just bought a G27 and would like to use Eyefinity for racing games and "easy on the GPU" games such as Dragon Age and Skyrim, and I will break Eyefinity and use the LG for the Far Crys, Witchers, and Crysis games that are not possible at that res. What would you guys do for a third monitor, and what connection type should I use?

Here are just the two together, but the captions still run across the bezel. Can you say annoying?


----------



## tsm106

Quote:


> Originally Posted by *gatygun*
> 
> Thanks for the link. I will read it tommorow morning
> 
> This is how i got my psu connected.


That's good, you're using two rails, so in any combination your GPU should have enough power. I would look at your configuration and validate your GPU clocks, and don't forget your CPU too, as that can lead to instability as well.

Quote:


> Originally Posted by *chronicfx*
> 
> Eyefinity guys: Looking for a recommendation. Current setup is 4790k with triple 290x cards and 16gb ram at 2400mhz. Monitors I have are lg 34um95 (3440x1440) and a yamakasi catleap (dvi only). I want to run the catleap as left monitor and have the lg in the middle because of 21:9 aspect. I do not have a third monitor and would like suggestions that cost less than $400. *Also the lg uses a display port connection*, the catleap is dvi-d. So my cards being reference leaves an hdmi and the other dvi-d port available. I just bought a g27 and would like to use eyefinity for racing games and "easy on yhe gpu games" such as dragon age and skyrim etc. and I will break eyefinity and use the lg for the far cry's, witchers, and crysis games that are not possible at that res.What would you guys do for a third monitor and what connection type should i use?
> 
> Here is just the two together but the captions still run across the bezel.. Can you say annoying?


Get another catleap or matching panel and throw it on the last dvi? That would make a crazy super wide setup.


----------



## chronicfx

Quote:


> Originally Posted by *tsm106*
> 
> That's good your using two rails, so in any combination your gpu should have enough power. I would look at your configuration and validate your gpu clocks and don't forget your cpu too as that can lead to instability as well.
> Get another catleap or matching panel and throw it on the last dvi? That would make a crazy super wide setup.


Sorry for the link. Not sure of the rules on that. Are these Jupiters any good?

http://www.amazon.com/YAMAKASI-Q270-JUPITER-2560x1440-Computer/dp/B00KX1FZEW/ref=pd_sxp_grid_pt_0_2

Never mind, found a Catleap. Is the input lag on the Multis a real problem?

http://www.amazon.com/YAMAKASI-CATLEAP-Q270-LED-MULTI/dp/B00Q8GVVY4/ref=pd_sxp_grid_pt_0_2


----------



## tsm106

Quote:


> Originally Posted by *chronicfx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> That's good your using two rails, so in any combination your gpu should have enough power. I would look at your configuration and validate your gpu clocks and don't forget your cpu too as that can lead to instability as well.
> Get another catleap or matching panel and throw it on the last dvi? That would make a crazy super wide setup.
> 
> 
> 
> Sorry for the link. Not sure of the rules on that. Are these Jupiters any good?
> 
> http://www.amazon.com/YAMAKASI-Q270-JUPITER-2560x1440-Computer/dp/B00KX1FZEW/ref=pd_sxp_grid_pt_0_2
> 
> *nevermind found a catleap. Is the input lag on the multi's a real problem?*
> 
> http://www.amazon.com/YAMAKASI-CATLEAP-Q270-LED-MULTI/dp/B00Q8GVVY4/ref=pd_sxp_grid_pt_0_2

Hmm, no more than a single panel, but you want to match the panels' specs as closely as possible. If the lag difference is too great, it will become visible between them and grate on your nerves over time, driving you crazy, hehe, or something like that.


----------



## chronicfx

Quote:


> Originally Posted by *tsm106*
> 
> Hmm, not anymore than single but you want to match the panels specs as closely as possible. If the lag is too great between them it will become a visible between the them and will grate on your nerves over time driving you crazy hehe or something like that.


My other Catleap is DVI-only with no scaler; the Multi (multiple-input) model has the scaler, and some say it adds input lag.

Is it OK to get a Catleap with DVI-D only, run the LG off the DisplayPort connection, and use the pair of DVI ports for the two Yamakasi Catleaps?


----------



## tsm106

That's how I would do it. As long as the difference in lag between the three panels isn't great enough to be obvious, I would say you're OK. Btw, you're gonna be pushing over 12 million pixels!
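A quick sanity check on that pixel count, assuming the three panels discussed above (two 2560x1440 Catleaps flanking the 3440x1440 LG):

```python
# Total pixels pushed per refresh across the assumed three-panel setup.
panels = [
    (2560, 1440),  # Yamakasi Catleap (left)
    (3440, 1440),  # LG 34UM95 (center)
    (2560, 1440),  # second Catleap (right)
]
total = sum(w * h for w, h in panels)
print(total, round(total / 1e6, 1))  # 12326400 pixels, ~12.3 megapixels
```

For comparison, a single 4K panel is about 8.3 megapixels, so this setup is roughly 1.5x 4K per frame.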


----------



## chronicfx

Quote:


> Originally Posted by *tsm106*
> 
> That's how I would do it. As long as the difference in lag between the three panels isn't too great, that it makes it obvious then I would say you're ok. Btw, you're gonna be pushing over 12m pixels!


12.3









Trigger pulled on the perfect pixel multi I linked before


----------



## DividebyZERO

Quote:


> Originally Posted by *chronicfx*
> 
> 12.3
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Trigger pulled on the perfect pixel multi I linked before


These cards can really push well if the game uses them properly.

Some sloppy 720p camera footage of Seiki 39" x3 in Eyefinity, triple-4K @ 30Hz gameplay. The quality is bad on a cell phone for now; I'm looking for a decent camera that will hopefully do 1080/60fps or the like.

Sniper Elite 3 - 6480x3840 - ultra preset; settings are shown in the video around 35 seconds.

I'm working with a new 4K GoPro cam, so maybe I can do better vids. My cell phone kept auto-focusing and washing out scope shots. This was just to demo Hawaii on triple 4K, though, and show it maintaining playable fps.


----------



## Mega Man

I still want to see Crysis 3 with that. ....


----------



## naved777

My card is auto-downclocking heavily when overclocked. Is it due to higher temps? (94°C)


----------



## Hattifnatten

If you can't keep the card below the power-target, temperature-target or fanspeed-target, it will throttle.
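As a toy illustration (not the actual PowerTune firmware logic), the throttling decision is roughly "exceed any target, lose clocks." The default targets below are assumptions for a reference 290X (94°C temp target and the quiet-mode fan cap are the commonly cited figures; the power number is approximate):

```python
# Toy sketch of target-based throttling; real hardware adjusts continuously.
def next_clock(current_mhz, power_w, temp_c, fan_pct,
               power_target=208, temp_target=94, fan_target=47):
    """Step the clock down if any target is exceeded, else hold it."""
    if power_w > power_target or temp_c > temp_target or fan_pct > fan_target:
        return max(300, current_mhz - 50)  # step toward the idle floor
    return current_mhz

print(next_clock(1000, power_w=220, temp_c=90, fan_pct=40))  # throttles to 950
print(next_clock(1000, power_w=180, temp_c=90, fan_pct=40))  # holds 1000
```

Raising the power limit and fan cap (or cooling the card better) is what keeps an overclock from being eaten by these targets.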


----------



## mfknjadagr8

Quote:


> Originally Posted by *naved777*
> 
> My card is auto down clocking heavily when Overclocked. Is it due to higher temps? (94)


Yeah bro, I was getting horrible throttling at 87°C. But I also have a question: when I am playing Advanced Warfare, I'm getting an issue where the GPU usage drops to 0 for a second or so, then back up to 100 percent, causing massive gameplay issues. I have temps below 60°C at all times, and the clocks don't drop, just the usage. I have CrossFire disabled for this game because it runs horribly, and for some reason I can't see profiles in my Catalyst to choose another to try. Anyhow, anyone have a clue what could be causing this? I'm on the 15.3 beta; I just did a fresh removal and install of the driver in Safe Mode using DDU, then ran the regular installer under Safe Mode.


----------



## gatygun

Quote:


> Originally Posted by *DividebyZERO*
> 
> these cards can really push well if the game uses it properly
> 
> Some sloppy camera 720p footage of Seiki 39"x3 in eyefinity triple 4k @ 30hz game play. Quality is bad on cell phone for now looking for a decent camera now thats hopefully 1080/60fps or the like.
> 
> sniper elite 3 - 6480x3840 - ultra preset in the video around 35 seconds
> 
> 
> 
> Im working with a new 4k gopro cam so maybe I can do better vids. My cell phone kept auto focusing and washing out scope shots. This was just to demo hawaii though on triple 4k and it maintaining playable fps.


Looks amazing.
Quote:


> Originally Posted by *tsm106*
> 
> That's good your using two rails, so in any combination your gpu should have enough power. I would look at your configuration and validate your gpu clocks and don't forget your cpu too as that can lead to instability as well.


My CPU runs stock; it's been stress-tested 24/7 with my old 580 in the rig without any failures at all, and the heat is also really good, only 45°C max atm. The memory is stress-tested, so it really is the GPU department.

I'm testing a bit more to see what happens.

Thanks for your reaction.


----------



## DividebyZERO

Quote:


> Originally Posted by *Mega Man*
> 
> I still want to see Crysis 3 with that. ....


I will see what I can do.


----------



## Mega Man

Ty


----------



## chronicfx

Quote:


> Originally Posted by *DividebyZERO*
> 
> these cards can really push well if the game uses it properly
> 
> Some sloppy camera 720p footage of Seiki 39"x3 in eyefinity triple 4k @ 30hz game play. Quality is bad on cell phone for now looking for a decent camera now thats hopefully 1080/60fps or the like.
> 
> sniper elite 3 - 6480x3840 - ultra preset in the video around 35 seconds
> 
> 
> 
> Im working with a new 4k gopro cam so maybe I can do better vids. My cell phone kept auto focusing and washing out scope shots. This was just to demo hawaii though on triple 4k and it maintaining playable fps.


Looks great! I will have a 21:9 in the center so portrait is out. I will show you how it looks next week. Estimate is 9th to 14th on delivery.


----------



## iAxX

Hi,

I'd like to join the 290x cool kids club

Here's the proof.


----------



## DividebyZERO

A little more triple 4k eyefinity love
Quote:


> Mega, that Crysis 3 you requested: I have it. Forgive my total noob self with this camera; right now I'm trying to learn how to use it. Crysis 3 has major performance issues for me, and it's not restricted to resolution; I think the game has a bug or something. Either way, I did a short 4K video test for you. It's probably going to take YouTube a while to process it. I will do better videos as I get more into this and learn how to do things like get rid of the camera fisheye, etc. Its 4K in low light is very bad on quality, so either way it's the best I can do for now.
> 
> Crysis 3 - triple 4K - settings are around 4 minutes; I even tested Very High, without AA of course. The indoor lighting is bad, so you might want to skip ahead until outside.
> 
> It's going to take a while before HD/4K becomes available.
> 
> 
> 
> 
> Lords of the Fallen - test video, triple 4K - accidentally ran this one on tri-fire and didn't realize until later. However, it's just a test video and I forgot to show the resolution in it, but it's definitely 4Kx3. Also, I suck at this game, as per the gameplay, lol.


----------



## gatygun

Looks good. How much VRAM does it use, though? Triple 4K is an insane resolution.


----------



## DividebyZERO

Not sure. Usually, though, if I hit the limit I get 1 fps or a crash. That said, I can try to use the AB overlay or something to find out.


----------



## hyujmn

Hey, gents.

What's the best way to mount a fan onto the VRM area? I'm using a Thermalright cooler, but unfortunately it doesn't have any cooling for the VRMs, and in CrossFire the top card is getting super toasty on the VRMs, even with the Gelid VRM cooler.

Is a 92mm fan the only one I can probably put on there?


----------



## dallas1990

Quote:


> Originally Posted by *hyujmn*
> 
> Hey, gents.
> 
> Whats the best way to mount a fan onto the VRM area? I'm using a Thermalright cooler but unfortunately it doesn't have any cooling for the VRMs and in Crossfire, the top card is getting super toasty on the VRMs, even with the Gelid VRM cooler.
> 
> Is a 92mm fan the only one I can probably put on there?


This is how I did it. Yes that's a bottle cap lol. Well if my phone would let me upload the picture


----------



## Mega Man

easiest way?

buy a waterblock !


----------



## gatygun

I wish they would make a 295X2 with 16GB of VRAM (8 for each GPU) with closed-loop water cooling. I would buy it in a heartbeat.


----------



## Mega Man

Agreed, excluding the AIO. Just put a high-quality WB on it (EK, Swiftech, etc.).

1. Ase-troll should be banned from life.
2. Crappy AIO perf.


----------



## hyujmn

I don't think you know how Xfire works. It would be 8GB effective, not 16, since the memory is mirrored across the two GPUs.


----------



## ZealotKi11er

Is there a place to find the BIOS for the VisionTek CryoVenom® R9 290? It's clocked at 1175MHz/1450MHz out of the box.


----------



## Spectre-

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Is there a place to find the BIOS for VisionTek CryoVenom® R9 290? It's clocked 1175MHz/1450Mhz out of the BOX.


try this- http://www.techpowerup.com/gpudb/2460/radeon-r9-290x.html


----------



## Renton577

http://www.techpowerup.com/gpuz/details.php?id=armkp

Just traded in my GTX 970 for an R9 290X and am getting way better performance; best choice ever. I have a Sapphire R9 290X with the stock cooler; here is a link to my card on Newegg:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814202058
It was in the back at the place I got it from; the guy said it had been there for a while.


----------



## Arizonian

Quote:


> Originally Posted by *iAxX*
> 
> Hi,
> 
> I'd like to join the 290x cool kids club
> 
> Here's the proof.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added








Quote:


> Originally Posted by *Renton577*
> 
> http://www.techpowerup.com/gpuz/details.php?id=armkp
> 
> Just traded in my GTX 970 for a R9 290x and am getting way better performance best choice ever. I have a Sapphire R9 290x with a stock cooler, here is a link to my card on newegg
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202058
> was in the back at the place I got it from, the guy said it had been there for a while.


Congrats - added


----------



## gatygun

Quote:


> Originally Posted by *Renton577*
> 
> http://www.techpowerup.com/gpuz/details.php?id=armkp
> 
> Just traded in my GTX 970 for a R9 290x and am getting way better performance best choice ever. I have a Sapphire R9 290x with a stock cooler, here is a link to my card on newegg
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202058
> was in the back at the place I got it from, the guy said it had been there for a while.


Gz. That 4GB of 512-bit RAM is going to be far more future-proof than the 3.5GB mess on the Nvidia side. That's the main reason I went with AMD this time. Especially in Mordor / AC Unity / Watch Dogs, that extra 500MB of VRAM gets used a lot.

Sapphire is now contacting me to replace my card, so that's a good thing too.
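On the bandwidth side, peak throughput is just bus width times effective data rate. A rough comparison (assumed reference clocks: 290X at 512-bit / 5.0 Gbps, GTX 970's full partition at 256-bit / 7.0 Gbps):

```python
# Peak memory bandwidth = bus width in bytes * effective data rate in Gbps.
def bandwidth_gbs(bus_bits: int, gbps: float) -> float:
    return bus_bits / 8 * gbps

print(bandwidth_gbs(512, 5.0))  # 290X: 320.0 GB/s
print(bandwidth_gbs(256, 7.0))  # 970 (fast 3.5 GB partition): 224.0 GB/s
```

The 970's last 0.5GB segment runs far slower than the main partition, which is the "3.5GB mess" being referenced.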


----------



## gatygun

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Is there a place to find the BIOS for VisionTek CryoVenom® R9 290? It's clocked 1175MHz/1450Mhz out of the BOX.


I wish they had built a closed-loop water version of it. Perfect clocks.


----------



## icmacdon

Here is my GPUZ screen shot

Capture.PNG 34k .PNG file


----------



## LandonAaron

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Is there a place to find the BIOS for VisionTek CryoVenom® R9 290? It's clocked 1175MHz/1450Mhz out of the BOX.


I have one. Want me to just save my BIOS via GPUz and send it to you?

Edit: Here it is: http://1drv.ms/1aG34v6


----------



## faizreds

Can I join the club?



Will post the gpu-z screenshot when I finish building my pc.


----------



## gatygun

Gratz on your card mate, it's a beast.


----------



## DeviousAddict

Picked up my extra card today.

Also, this is the box for the one I have currently fitted.

I'm fitting it now, so if you need a GPU-Z screenshot I shall post that later.

Edit: forgot to say what they are; both are R9 290X 8GB editions. One is the Sapphire Vapor-X, the other the MSI Gaming Twin Frozr IV; both are OC editions.

2nd Edit: GPU-Z validation screenshots


----------



## Renton577

Does anyone know where to get a cooler for the 290X like the Tri-X has? Mine is pretty good; it's just so loud when it gets under load.


----------



## tsm106

Quote:


> Originally Posted by *Renton577*
> 
> Does anyone know where to get a cooler for the 290x like the trix has? Mine is pretty good its just so loud when it gets under load.


The only retail alternative is the Arctic Accelero.


----------



## Renton577

Quote:


> Originally Posted by *tsm106*
> 
> The only retail alternative is artic accelero.


Cool, thanks, I'll look into that one then. Anything's better than having a jet in my room every time I play games, haha.


----------



## LandonAaron

Quote:


> Originally Posted by *gatygun*
> 
> Gratz on your card mate, it's a beast.


Thanks. It was an eBay find. I don't think many of them were made, but they are basically just Sapphire 290s with an EK block and a custom logo. It overclocks really well though, better than my 290X even.


----------



## gatygun

Quote:


> Originally Posted by *Renton577*
> 
> Cool thanks I'll look into that one then, anythings better than having a jet in my room every time I play games haha


Just play battlefield and fly in a jet. 4D ftw, Jet engine like never before.

haha.

Quote:


> Originally Posted by *LandonAaron*
> 
> Thanks. It was an Ebay find. I don't think many of them were made. But they are basically just Saphire 290's and an EK block with a custom logo. It overclocks really well though, better than my 290x even.


Yeah, I know; I bought one myself a week ago. The thing is amazing; plays games like a champ.

Anyway, an update on my situation:

After black screens / crashes at stock settings, I contacted Sapphire about a solution. They asked me to record the crashes / black screens. So I turned my PC on today, and Windows had an issue with its C drive (the SSD Windows 8.1 is on). It scanned and repaired the C drive before starting Windows.

After that, I cranked my settings up from 670 core and 904 memory to 1000 core and 1300 memory in order to record the crashing. But nothing happened. Rock solid and stable, and the card also stayed really cool (no idea how this is possible).

So after playing hours of BF3 / Crysis 3 / Mordor, I decided to clock my CPU back up to 4GHz from the 2.9 stock. And again, no issues. Temps are always fine, and there are no crashes, nothing odd happening at all. Played the games for hours again (god damn, the framerate is so smooth).

I started to wonder what would happen if I upclocked my GPU again, from 1000 to 1150 core and from 1300 to 1475 memory (the only stable settings I have found at this moment), with +100 and +50 volts. (Above 1500 memory I get artifacts, and above 1150 core I get checkerboard patterns, so this is my limit.)

10,009 3DMark points again in Fire Strike, and the framerate is smooth as hell compared to the 2.9GHz CPU + 670 core.

I have no clue what happened, really.

I installed Windows 7, and that one made the GPU crash just as hard as Windows 8, but now suddenly in Windows 8 there are absolutely zero issues at all.

I honestly have no freaking idea how this is possible, lol, as I did nothing. Maybe my hard drive was corrupt after all. But I only installed Windows 8 a few days ago.

It's just weird.


----------



## ZealotKi11er

Quote:


> Originally Posted by *LandonAaron*
> 
> I have one. Want me to just save my BIOS via GPUz and send it to you?
> 
> Edit: Here it is: http://1drv.ms/1aG34v6


Is it stock 1175 or you have to OC?


----------



## mfknjadagr8

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Is it stock 1175 or you have to OC?


They test each card for a high, stable overclock before it's sent out... you might not get that good of an overclock from them... but it will be stable.


----------



## LandonAaron

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Is it stock 1175 or you have to OC?


They test and overclock them at the factory and it comes with a document like this:



The card default clocks in the BIOS are 975 core 1250 memory.

I found their 1175MHz core overclock to be spot on, but I was able to go higher on the memory, up to 1600MHz. This is at 1080p 60Hz. Both this card and my 290X behave a little differently at 1440p 96Hz. I have to reduce the overclock on both cards to 1100MHz core / 1500MHz memory at 1440p or I get black screens.


----------



## Forsaken1

Thanks Arizonian for easy to find info on front page.

New toyz to play with.

How can you go wrong for $300 ish AR each?
Waiting on blocks to go H2O.


----------



## ZealotKi11er

Quote:


> Originally Posted by *LandonAaron*
> 
> They test and overclock them at the factory and it comes with a document like this:
> 
> 
> 
> The card default clocks in the BIOS are 975 core 1250 memory.
> 
> I found their 1175MHz core overclock to be spot on, but I was able to go higher on the memory, up to 1600MHz. This is at 1080p 60Hz. Both this card and my 290X behave a little differently at 1440p 96Hz. I have to reduce the overclock on both cards to 1100MHz core / 1500MHz memory at 1440p or I get black screens.


For 1175Mhz OC did you have to increase voltage?


----------



## Mega Man

Quote:


> Originally Posted by *Renton577*
> 
> Does anyone know where to get a cooler for the 290x like the trix has? Mine is pretty good its just so loud when it gets under load.


i can help with the so loud at load issue

http://www.coolingconfigurator.com/


----------



## hyujmn

I ended up twisty-tying a 100mm SilenX fan to the VRM cooler and holy jebus, I didn't think it would make THAT big of a difference with the Gelid VRM heatsinks.

After 30 min of Metro 2033 at stock, VRM1 was at 91C. With a small fan at medium speed (inaudible), I was able to overclock and it hovered at 71C. Absolutely crazy.


----------



## tsm106

Quote:


> Originally Posted by *hyujmn*
> 
> After 30 min of Metro 2033 at stock, VRM1 was at 91C. With a small fan at medium speed (inaudible), I was able to overclock and it hovered at 71C. Absolutely crazy.


Nah, that's expected: active air cooling vs ambient cooling, right?


----------



## Arizonian

Quote:


> Originally Posted by *icmacdon*
> 
> Here is my GPUZ screen shot
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Capture.PNG 34k .PNG file


Congrats - added









See you're new to OCN...welcome aboard.








Quote:


> Originally Posted by *faizreds*
> 
> Can I join the club?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Will post the gpu-z screenshot when I finish building my pc.


Congrats - added









Got all I needed but be sure to share your OC results.








Quote:


> Originally Posted by *Forsaken1*
> 
> Thanks Arizonian for easy to find info on front page.
> 
> New toyz to play with.
> 
> How can you go wrong for $300 ish AR each?
> Waiting on blocks to go H2O.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - sweet - added








Quote:


> Originally Posted by *DeviousAddict*
> 
> Picked up my extra card today
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Also this is the box for the one I have currently fitted
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I'm fitting it now, so if you need a GPU-Z screenshot I shall post that later.
> 
> Edit: forgot to say what they are: both are R9 290X 8GB editions. One is the Sapphire Vapor-X, the other the MSI Gaming Twin Frozr IV; both are OC editions.
> 
> 2nd Edit: GPU-Z validation screen shots
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## faizreds

You need to fix my card info. It's an R9 290 Tri-X, not an R9 290X.
Thanks.


----------



## fyzzz

I just broke my previous record on firestrike







http://www.3dmark.com/3dm/6555662


----------



## DeviousAddict

Quote:


> Originally Posted by *fyzzz*
> 
> I just broke my previous record on firestrike
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/6555662


Good score for a single card, very impressive!


----------



## fyzzz

Quote:


> Originally Posted by *DeviousAddict*
> 
> Good score for a single card, very impressive!


Thanks! I think I can squeeze maybe a little more out of it, but I'll leave that for another time.


----------



## fyzzz

Would be nice to hit 14,000 in GPU score.


----------



## DeviousAddict

I keep getting that 'Brawn' achievement on the side too. I really need to upgrade my CPU. I should probably sell some of my old hardware to fund it.


----------



## fyzzz

Quote:


> Originally Posted by *DeviousAddict*
> 
> I keep getting that 'Brawn' achievement on the side too. I really need to upgrade my CPU. I should probably sell some of my old hardware to fund it.


What cpu do you have?


----------



## Agent Smith1984

Quote:


> Originally Posted by *LandonAaron*
> 
> They test and overclock them at the factory and it comes with a document like this:
> 
> 
> 
> The card default clocks in the BIOS are 975 core 1250 memory.
> 
> I found their 1175MHz core overclock to be spot on, but I was able to go higher on the memory, up to 1600MHz. This is at 1080p 60Hz. Both this card and my 290X behave a little differently at 1440p 96Hz. I have to reduce the overclock on both cards to 1100MHz core / 1500MHz memory at 1440p or I get black screens.


Is that with stock voltage and only 30% power limit?

That thing is a damn good clocker if that's the case...


----------



## DeviousAddict

I've got the i7 3820 on the X79 LGA2011 socket.
I can't afford to upgrade my motherboard as well, so I was thinking of upgrading to the 4930K or 5930K. I know my mobo will work with the 4930K, but I'm not sure if it's compatible with the 5930K.


----------



## Agent Smith1984

Will work with the 4930, but not the 5930


----------



## DeviousAddict

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Will work with the 4930, but not the 5930


Is the 4930K a decent jump from my 3820?
I know that the 3820 doesn't support PCIe 3.0 yet the 4930K does; will that make a noticeable improvement on my scores?


----------



## Agent Smith1984

Quote:


> Originally Posted by *DeviousAddict*
> 
> Is the 4930k a decent jump from my 3820?
> I know that the 3820 doesn't support PCIe 3.0 yet the 4930 does, will that make a noticeable improvement on my scores?


PCIe 3.0 won't get you much....

The CPU you have is actually a great CPU though.... if you can overclock it a little, it really does well.

My brother had a 3820, so I have some hands-on experience with it









He ran his at a mild 4.5GHz (locked multi, turbo mode active, and a few MHz bump of the base clock).
I've seen them do 5GHz on several occasions.
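
The locked-multi approach he describes is just arithmetic: core clock = base clock (BCLK) x multiplier. A quick sketch; the 104.7MHz BCLK and x43 multiplier below are illustrative numbers, not readings from his rig:

```python
# Effective CPU clock on a multiplier-capped chip: BCLK x multiplier.
# Since the 3820's multiplier range is limited, the base clock gets bumped instead.
def effective_clock_mhz(bclk_mhz: float, multiplier: int) -> float:
    """Return the resulting core clock in MHz."""
    return bclk_mhz * multiplier

print(effective_clock_mhz(100.0, 43))  # stock 100MHz BCLK -> 4300.0
print(effective_clock_mhz(104.7, 43))  # ~4.7% BCLK bump -> roughly 4502, i.e. ~4.5GHz
```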

That's a really good chip you've already got there! I'd put the cash towards a second, or upgraded GPU









If you do feel the need to upgrade your CPU though, the 4930K is definitely an upgrade: 2 more cores, 4 more threads, and roughly a 10% IPC improvement. However, it may not overclock as well as the 3820, even with the unlocked multi.

I'd personally ride out the 3820 until you are going to do a total overhaul....


----------



## DeviousAddict

Quote:


> Originally Posted by *Agent Smith1984*
> 
> PCIe 3.0 won't get you much....
> 
> The CPU you have is actually a great CPU though.... if you can overclock it a little, it really really does well.
> 
> My brother had a 3820, so I have some hands on with it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> He ran his at a mild 4.5GHz (locked multi, turbo mode active, and a few MHz bump of the base clock).
> I've seen them do 5GHz on several occasions.
> 
> That's a really good chip you've already got there! I'd put the cash towards a second, or upgraded GPU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you do feel the need to upgrade your CPU though, the 4930K is definitely an upgrade: 2 more cores, 4 more threads, and roughly a 10% IPC improvement. However, it may not overclock as well as the 3820, even with the unlocked multi.
> 
> I'd personally ride out the 3820 until you are going to do a total overhaul....


Cheers for the advice. I'm not very good when it comes to overclocking at all though; I just leave OC Genie to do it for me (currently at 4GHz day to day).

I've already got 2 cards (posted them in the thread yesterday)









It'd be great if there were any UK-based OCN members near me (Bristol) that could come round and overclock my CPU and GPUs for me. I'd pay with beer and food, and even cash if need be.








I just don't have the patience to sit there going back and forth, upping clocks a little at a time. I've got the MSI control panel that's supposed to let you do it from the desktop, but as soon as I restart the system the clocks go back to normal (even after saving them etc.).


----------



## Agent Smith1984

Quote:


> Originally Posted by *DeviousAddict*
> 
> Cheers for the advice. I'm not very good when it comes to overclocking at all though; I just leave OC Genie to do it for me (currently at 4GHz day to day).
> 
> I've already got 2 cards (posted them in the thread yesterday)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It'd be great if there were any UK-based OCN members near me (Bristol) that could come round and overclock my CPU and GPUs for me. I'd pay with beer and food, and even cash if need be.
> 
> 
> 
> 
> 
> 
> 
> 
> I just don't have the patience to sit there going back and forth, upping clocks a little at a time. I've got the MSI control panel that's supposed to let you do it from the desktop, but as soon as I restart the system the clocks go back to normal (even after saving them etc.).


I understand...

Well, if you just leave the CPU itself at the stock 3.6, and have turbo on, it will ramp up to 4.3GHz anyways....

Are you running 6 RAM sticks for triple channel memory?
That helps a little too....


----------



## fyzzz

I want to watercool my 290 but I don't have the money to do it. It already does around 1240MHz at +200mV on air (Raijintek Morpheus with Noctua NF-F12s).


----------



## DeviousAddict

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I understand...
> 
> Well, if you just leave the CPU itself at the stock 3.6, and have turbo on, it will ramp up to 4.3GHz anyways....
> 
> Are you running 6 RAM sticks for triple channel memory?
> That helps a little too....


I've got 4 sticks (4x4GB); my board has quad channel memory.










This is my latest Fire Strike run http://www.3dmark.com/fs/4526595 (score of 14637) with my CPU at stock. I'll run it again with OC Genie on, which should take my CPU up to 4GHz.
Looking at the similar builds on the 3DMark website, I should be in the 16,000-point range.


----------



## Agent Smith1984

Yeah, it's because the others are hitting 4.3GHz and getting better physics and combined scores.









I wouldn't use the OC Genie.....

My brother's CPU score was almost 11k points if I remember correctly.


----------



## LandonAaron

Quote:


> Originally Posted by *fyzzz*
> 
> I want to watercool my 290 but i don't have the money to do it. It already does around 1240 mhz +200mv on air (raijintek morpheus with noctua nf-f12's)


Perfect opportunity for the Red Mod. Find a cheap AIO off eBay, zip tie that sucker on there, get the Gelid Icy Vision VRM heatsink and some PCI slot fans, and you're golden. The Icy Vision VRM heatsink is less than $10 I think, and PCI slot fans can be found for less than $10, so the most expensive part will be finding a good AIO. You can also use the NZXT mounting bracket thing, but it's like $35.


----------



## LandonAaron

Quote:


> Originally Posted by *ZealotKi11er*
> 
> For 1175Mhz OC did you have to increase voltage?


Yes; I set the voltage to +100. It may be able to go lower, but I didn't experiment with it too much. I typically run at 1100MHz with +75mV.


----------



## ZealotKi11er

Quote:


> Originally Posted by *LandonAaron*
> 
> Yes. Set voltage to +100. It may be able to go lower, but I didn't experiment with it too much. I typically run at 1100mhz with +75mv.


Oh OK. That's what I already do to get 1175MHz.


----------



## kizwan

Quote:


> Originally Posted by *DeviousAddict*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> Will work with the 4930, but not the 5930
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is the 4930k a decent jump from my 3820?
> *I know that the 3820 doesn't support pci-e 3.0 yet* the 4930 does, will that make a noticeable improvement on my scores?
Click to expand...

Say what? The 3820 does support PCIe 3.0.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *DeviousAddict*
> 
> Cheers for the Adivice, I'm not very good when it comes to overclocking at all though. I just leave OC Genie to do it for me (currently at 4ghz day to day)
> 
> I've already got 2 cards (posted them in the thread yesterday)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It' be great of there were any OCN members UK bound that lived near me (Bristol) that could come round and overclock my CPU and GPU's for me. I'd pay with beer and food, and even cash if need be
> 
> 
> 
> 
> 
> 
> 
> 
> I just don't have the patience to sit there going back and forth upping clocks a little at a time, even though i've got the MSI control panel that suppsoe to let you do it from the desktop but as soon as i restart the system they go back to normal (even after saving it etc)
> 
> 
> 
> 
> 
> 
> 
> I understand...
> 
> Well, if you just *leave the CPU itself at the stock 3.6, and have turbo on, it will ramp up to 4.3GHz anyways....*
> 
> Are you running 6 RAM sticks for tripple channel memory?
> That helps a little too....
Click to expand...

Not true though; at stock it will turbo boost to 3.8GHz.


----------



## Agent Smith1984

That's strange, because his turbo'd to 4.3GHz every time on default settings. It was initially in a lenovo rig, so I know it was stock.


----------



## DeviousAddict

@kizwan

There is a table here that says it doesn't support PCIe 3.0: it says 40 (PCIe 2.0), whereas the 4930K column says 40 (PCIe 3.0). I'm assuming the 40 refers to how many lanes it provides.

If I'm wrong no worries, I obviously just found duff info.


----------



## tsm106

Quote:


> Originally Posted by *DeviousAddict*
> 
> @kizwan
> 
> There is a table here that says it doesn't support PCIe 3.0, it says 40 (PCIe 2.0) where as the 4930k column says 40 (PCIe 3.0). I'm assuming the 40 is referring to how many lanes it uses.
> 
> If I'm wrong no worries, I obviously just found duff info.


Nah, SB-E is PCIe 3.0, but it wasn't officially listed because it didn't support all GPU chipsets. Basically NVIDIA couldn't get validated in time but AMD is validated, i.e. Tahiti = fully supported and Kepler = not supported "officially". Google is your friend for the specifics, i.e. the NVIDIA PCIe 3.0 driver hack.


----------



## kizwan

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's strange, because his turbo'd to 4.3GHz every time on default settings. It was initially in a lenovo rig, so I know it was stock.


Definitely not default. The Lenovo BIOS may have been designed to auto-overclock the CPU to the max multi, but the max turbo boost at stock clocks is 3.8GHz.
Quote:


> Originally Posted by *DeviousAddict*
> 
> @kizwan
> 
> There is a table here that says it doesn't support PCIe 3.0, it says 40 (PCIe 2.0) where as the 4930k column says 40 (PCIe 3.0). I'm assuming the 40 is referring to how many lanes it uses.
> 
> If I'm wrong no worries, I obviously just found duff info.


So, you're running your card at PCIe 2.0?


----------



## DeviousAddict

@kizwan

I never changed any BIOS settings, but I just had a look and PCIe 3.0 was enabled


----------



## gatygun

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's strange, because his turbo'd to 4.3GHz every time on default settings. It was initially in a lenovo rig, so I know it was stock.


Doesn't turbo boost only increase the first CPU core to a higher clock speed for single-core applications, while the other cores just stay at stock? This is what my 870 basically does (it's an older CPU though). All the other cores and threads are basically doing nothing more.

OCing the CPU myself, even just to the turbo boost clocks, gives me a massive jump in performance in general.


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> I just broke my previous record on firestrike
> 
> 
> 
> 
> 
> 
> 
> : http://www.3dmark.com/3dm/6555662


Nice one. Very good score for the clock as well. Guess that CPU is pushing things a tad better.

http://www.3dmark.com/fs/4046538
@1140 / 1625 IIRC

This one's from an AMD rig.
http://www.3dmark.com/fs/4536219


----------



## kizwan

Quote:


> Originally Posted by *DeviousAddict*
> 
> @kizwan
> 
> I never changed any bios settings but just had a look and Pcie3.0 was enabled


Like tsm106 said, PCIe 3.0 was not officially supported, but the chip itself is capable of running the PCIe links at 8 gigatransfers per second (GT/s), which is the PCIe 3.0 bit rate. The only thing left was for the motherboard manufacturers to make the PCIe ports on X79 boards electrically support that speed, which they all did if I'm not mistaken.

Here is a screenshot of the LGA2011 datasheet.


Quote:


> Originally Posted by *gatygun*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> That's strange, because his turbo'd to 4.3GHz every time on default settings. It was initially in a lenovo rig, so I know it was stock.
> 
> 
> 
> 
> 
> 
> 
> Does turbo boost not only increase the first cpu core to a higher clock speed for single core applications? the other cores will just stay on the same stock solution? This is what my 870 basically does ( it's a older cpu tho ). All the other cores and threads are basically doing nothing more.
> 
> Ocing the cpu myself even towards the turbo boost settings gives me a massive jump forwards on performance in general.
Click to expand...

It depends on the turbo ratios; it's either two cores or one core. Basically that is how turbo boost works at stock settings. Any CPU that can be overclocked via the multiplier can override this.
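
That "ratio per number of active cores" behaviour can be pictured as a simple lookup table, which is roughly how turbo ratio limits work. A minimal sketch with made-up multipliers and a 100MHz base clock (the real table is per-CPU model and lives in the BIOS/MSRs):

```python
# Toy model of turbo boost ratio limits: the allowed multiplier depends on
# how many cores are active. The multipliers below are hypothetical.
TURBO_RATIOS = {1: 43, 2: 43, 3: 40, 4: 39}  # active cores -> max multiplier
BCLK_MHZ = 100.0

def max_turbo_mhz(active_cores: int) -> float:
    """Max boost clock (MHz) for a given number of active cores."""
    return BCLK_MHZ * TURBO_RATIOS[active_cores]

print(max_turbo_mhz(1))  # one core active -> 4300.0
print(max_turbo_mhz(4))  # all four cores active -> 3900.0
```

Overclocking via the multiplier on unlocked chips effectively replaces that table with whatever you set.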


----------



## cephelix

Don't know if this is the appropriate thread to post in, but here goes.
I'm currently running a 290 at stock and recently tried supersampling at 150% in Shadow of Mordor with everything maxed except for textures. The monitor is 1200p 60Hz. I get framerates of about 50fps with vsync on. Wondering if the performance hit from supersampling is worth it or not?
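
For a rough sense of the load: if that 150% slider scales each axis (common for resolution-scale sliders, though worth double-checking how Mordor implements it), a 1920x1200 monitor renders at 2880x1800, i.e. 2.25x the pixels. A quick sketch:

```python
# Estimate the render resolution and extra pixel load from a supersampling /
# resolution-scale slider, assuming the percentage applies per axis
# (game-dependent; some sliders scale total pixel count instead).
def supersampled(width: int, height: int, scale_pct: int):
    w = width * scale_pct // 100
    h = height * scale_pct // 100
    return w, h, (w * h) / (width * height)

w, h, ratio = supersampled(1920, 1200, 150)
print(w, h, ratio)  # 2880 1800 2.25
```

So ~50fps instead of a vsynced 60 for 2.25x the pixels is arguably a decent trade; whether it's worth it depends on how much you value the extra sharpness over a locked 60.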


----------



## tsm106

Quote:


> Originally Posted by *DeviousAddict*
> 
> @kizwan
> 
> I never changed any bios settings but just had a look and Pcie3.0 was enabled


That's the way it's supposed to be; AMD had no issues validating on PCIe 3.0.


----------



## Mega Man

in other news :/
Quote:


> Originally Posted by *Mega Man*
> 
> anyone else find this ironic ??
> 
> http://www.ign.com/articles/2014/11/06/the-witcher-3-will-receive-16-dlc-packages-free-across-all-platforms
> 
> http://www.gog.com/game/the_witcher_3_wild_hunt_expansion_pass?utm_source=newsletter&utm_medium=email&utm_content=game_subject&utm_campaign=w3Expansion


----------



## mfknjadagr8

Quote:


> Originally Posted by *Mega Man*
> 
> in other news :/


Reading the article, it states the free DLC is small content released every week that shouldn't be paid content... the second article states it will be at least two episodes, one of which is 10 hours... that's pretty impressive for a paid DLC... I've had a lot of DLCs that were supposed to be so long and great and were garbage...


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> Nice one. Very good score for the clock as well. Guess that CPU is pushing things a tad better.
> 
> http://www.3dmark.com/fs/4046538
> @1140 / 1625 IIRC
> 
> This one's from an AMD rig.
> http://www.3dmark.com/fs/4536219


3DMark actually shows the wrong clock speeds. It says my card was clocked at 1190/1500, but during the test it was at 1240/1500.


----------



## fyzzz

Managed to get a few points more in graphics, but my physics score went down a little, so the previous score was better overall: http://www.3dmark.com/fs/4538620 (290 @ 1255/1600, 4690K @ 4.9GHz)


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Mega Man*
> 
> in other news :/


I'd argue there's a difference between the dlc they're giving away and the full on expansion packs coming months and months down the road that actually add new areas and literally expand the game and quests. It's not like they're done now and they are just holding them back. They're still in development. If they gave away an extra 30 hours of the game months down the road that would be incredible, but I don't expect that when the main game is already anywhere from 60 - 200 hours depending on how much you want to do. Just an opinion.


----------



## gatygun

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> I'd argue there's a difference between the dlc they're giving away and the full on expansion packs coming months and months down the road that actually add new areas and literally expand the game and quests. It's not like they're done now and they are just holding them back. They're still in development. If they gave away an extra 30 hours of the game months down the road that would be incredible, but I don't expect that when the main game is already anywhere from 60 - 200 hours depending on how much you want to do. Just an opinion.


I personally don't play games for the story, but for the visuals. I will buy it and roam around in the lands to see how my fps holds up. After I've got a good idea what to expect I will probably just quit it entirely.

The Witcher 2 was kind of a drag, with endless cutscenes and running from one boxed environment to the other (when they advertised it as open world, from what I remember back then).

It also doesn't help that The Witcher 3 is developed multi-platform with consoles in mind. This will hold back visual advances for sure, sadly, something I highly praised The Witcher 2 for not doing.

I hope they start The Witcher 4's development faster than last time and put their focus back on the PC platform.

Because currently they are moving in the wrong direction.


----------



## Agent Smith1984

Deal alert....
http://www.ebay.com/itm/Ref-Board-ATI-R9-290x-4GB-WITH-Water-Cooling-Kit-HG10-Bracket-and-H80i-Cooler-/151640137760?pt=LH_DefaultDomain_0&hash=item234e74e020

A little over an hour left....

If I had the cash free, I'd scoop it up myself....


----------



## gatygun

I've had bad experiences with auctions. Mostly they bump up in the last 5 minutes towards the price they go for new.


----------



## tsm106

seller name: horrorful, rating 50%
returns: none accepted

sounds like a great deal.


----------



## Agent Smith1984

I always just wait until an hour or so is left, place the max price I'm willing to pay, and let it pass if it goes for more...

That setup is easily worth 3 bills though, but anything more than that would be on the high side...

The NEW version of the Sapphire 290X is on Newegg right now for $280 after rebate, and that thing runs pretty cool for air.....

Sucks that prices have gone back up though, cause they were going for sub-$200 used not too long ago...


----------



## Agent Smith1984

Quote:


> Originally Posted by *tsm106*
> 
> seller name: horrorful, rating 50%
> returns: none accepted
> 
> sounds like a great deal.


ROFL,

I didn't even see that, haha

I never worry about that stuff though. PayPal ALWAYS gives you a refund when you open a dispute.
I know this from being a buyer and seller of goods on eBay.

But good point either way!


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> ROFL,
> 
> I didn't even see that, haha
> 
> I never worry about that stuff though. PayPal ALWAYS gives you a refund when you open a dispute.
> I know this from being a buyer and seller of goods on eBay.
> 
> But good point either way!


You can always count on some people to offer a great point and be totally unbiased.









... sorry one too many glasses of wine made me do it.

... I mean bottles


----------



## DividebyZERO

Quote:


> Originally Posted by *tsm106*
> 
> seller name: horrorful, rating 50%
> returns: none accepted
> 
> sounds like a great deal.


Quote:


> Bad packaging, Monitor arrived in pieces, Horrorful sent me a horror packaging
> Apr 08, 2015


Hahaha, a monitor in pieces, how does that happen? I love the comment.

Found it, this is how it was packaged


----------



## ZealotKi11er

So in Windows 7 x64, 290X + 290 @ 1225/1500 I scored 25538 GPU in 3DMark.
Now in Windows 8 x64, 290 + 290 @ 1175/1500 I scored 25590 GPU in 3DMark.

Not a bad improvement with a lower OC and one card going from a 290X to a 290.


----------



## Mega Man

Quote:


> Originally Posted by *gatygun*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MrWhiteRX7*
> 
> I'd argue there's a difference between the dlc they're giving away and the full on expansion packs coming months and months down the road that actually add new areas and literally expand the game and quests. It's not like they're done now and they are just holding them back. They're still in development. If they gave away an extra 30 hours of the game months down the road that would be incredible, but I don't expect that when the main game is already anywhere from 60 - 200 hours depending on how much you want to do. Just an opinion.
> 
> 
> 
> I personally don't play games for the story, but for the visuals. I will buy it and roam around in the lands to see how my fps holds up. After I've got a good idea what to expect I will probably just quit it entirely.
> 
> The Witcher 2 was kind of a drag, with endless cutscenes and running from one boxed environment to the other (when they advertised it as open world, from what I remember back then).
> 
> It also doesn't help that The Witcher 3 is developed multi-platform with consoles in mind. This will hold back visual advances for sure, sadly, something I highly praised The Witcher 2 for not doing.
> 
> I hope they start The Witcher 4's development faster than last time and put their focus back on the PC platform.
> 
> Because currently they are moving in the wrong direction.
Click to expand...

Meh, story > graphics every time. I like the long cutscenes.


----------



## Agent Smith1984

I actually had an old 17" CRT.... it was an awesome target for my .22LR AR-15! Lol


----------



## mfknjadagr8

Play for the visuals..... I can honestly say I've never heard anyone say that... especially considering cutscenes are normally the most spectacular part... well, hadn't until now


----------



## kizwan

I play for the story too. The only time I skip cutscenes is when I've already played the game before and just want to finish that part quick.


----------



## ZealotKi11er

Quote:


> Originally Posted by *kizwan*
> 
> I play for the story too. The only time I skip cutscenes is when I already play the games before & just want to finish that part quick.


I play for immersion.


----------



## Mega Man

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Play for the visuals..... I can honestly say I've never heard anyone say that... especially considering cutscenes are normally the most spectacular part... well, hadn't until now


Most seem to agree, or indie games would die out quick.


----------



## gatygun

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Play for the visuals..... I can honestly say I've never heard anyone say that... especially considering cutscenes are normally the most spectacular part... well, hadn't until now


I spend more time in the ini files / visual configuration screens than I actually spend playing the game.
Crysis 3 is a good example: the first part of the game is incredibly visually taxing, so there's no need for me to progress; I'll just stay there and not advance.
The Witcher 2: I stopped at the first village, only walked around fine-tuning settings until I knew what to expect, then quit.
Crysis 1: stopped at the beach and never went further.
Oblivion: hit the forest, never progressed.
Skyrim: the first city was the end of my playthrough.
AC Unity: I also didn't progress past the point where you get control over the city.

The moment I can't skip cutscenes, like in Far Cry 4, I will simply not progress at all. I just hate sitting through a B-tier movie I have no interest in. I never went further than the room where you start; that pushed enough visual stuff forward for me to play around with.
Watch Dogs: played until I could drive around in the city. (I hate it when you have those forced tutorials.)

Sometimes I do play games through because they are just fun experiences, like Shadow of Mordor; the combat is just great. I just log into the game, kill a few orcs here and there, and quit again.

But yeah, that's just me


----------



## mfknjadagr8

Quote:


> Originally Posted by *gatygun*
> 
> I spend more time in the ini files / visual configuration screens than I actually spend playing the game.
> Crysis 3 is a good example: the first part of the game is incredibly visually taxing, so there's no need for me to progress; I'll just stay there and not advance.
> The Witcher 2: I stopped at the first village, only walked around fine-tuning settings until I knew what to expect, then quit.
> Crysis 1: stopped at the beach and never went further.
> Oblivion: hit the forest, never progressed.
> Skyrim: the first city was the end of my playthrough.
> AC Unity: I also didn't progress past the point where you get control over the city.
> 
> The moment I can't skip cutscenes, like in Far Cry 4, I will simply not progress at all. I just hate sitting through a B-tier movie I have no interest in. I never went further than the room where you start; that pushed enough visual stuff forward for me to play around with.
> Watch Dogs: played until I could drive around in the city. (I hate it when you have those forced tutorials.)
> 
> Sometimes I do play games through because they are just fun experiences, like Shadow of Mordor; the combat is just great. I just log into the game, kill a few orcs here and there, and quit again.
> 
> But yeah, that's just me


Honestly that sounds more like ADD or ADHD, but in some instances I can't say much. For instance, I spent about 3 days getting 200 mods to play nice together in Skyrim... played it about half an hour, haven't touched it since, lol.


----------



## jBspy

Hello,

I own an XFX R9 290X Double Dissipation edition card, and the box says "UEFI BIOS ready". When I try to turn on Secure Boot in the BIOS, after a few seconds a message comes up: "The graphics card is not UEFI supported." I also tried switching between the two BIOSes with no luck. I flashed it with a new BIOS, but that didn't help either.

Has anyone run into this problem before?
If someone owns this card, could you post its BIOS here?

My BIOS version number is: 015.041.000.001.003746


----------



## LandonAaron

Quote:


> Originally Posted by *mfknjadagr8*
> 
> honestly that sounds more like ADD or ADHD, but in some instances I can't say much. For instance, I spent about 3 days getting 200 mods to play nice together on Skyrim... played it about a half hour, haven't touched it since. lol


Yep, I did the full STEP mod setup on Skyrim. It took about two weeks to get it all together, and I did play for about three or four weeks. Fallout 3 and New Vegas, on the other hand, I modded into oblivion and played even longer. I had a car mod in Fallout: New Vegas that was pretty lore-compatible, and it was just awesome being able to drive around the Mojave wasteland. Unfortunately, every two or three saves my game would get corrupted so that the cars would fall through the earth, so I was constantly reloading saves. But it was just too good a mod to give up, lol. Here's hoping the rumor mill is true and Bethesda announces Fallout 4 this year.


----------



## mfknjadagr8

Quote:


> Originally Posted by *LandonAaron*
> 
> Yep, I did the full STEP mod setup on Skyrim. It took about two weeks to get it all together, and I did play for about three or four weeks. Fallout 3 and New Vegas, on the other hand, I modded into oblivion and played even longer. I had a car mod in Fallout: New Vegas that was pretty lore-compatible, and it was just awesome being able to drive around the Mojave wasteland. Unfortunately, every two or three saves my game would get corrupted so that the cars would fall through the earth, so I was constantly reloading saves. But it was just too good a mod to give up, lol. Here's hoping the rumor mill is true and Bethesda announces Fallout 4 this year.


agreed, and hopefully it's optimized properly for both camps... I started modding back in Morrowind so I can do it pretty quickly now... btw, your corrupted saves were likely a mod conflict... the new way is extremely easy compared to the old days of manually placing every file, when the only way to tweak settings was the ini files... and don't get me started on mod conflicts before all the great tools came out... you used to have to manually merge ini changes line by line, and merging mods wasn't an option to bypass the engine's ESP loading limit... but I'm thankful for all of that; it gives me the skills to troubleshoot and know almost instantly which mod is causing a problem... the hardest part is when you spend a few solid days modding, get everything as perfect as it can be, and the next day or even hours later they release a patch that breaks the mods' functionality...


----------



## Zeke1001

Hello, can I join?







(Sapphire R9 290 Tri-x)



GPU-Z:


----------



## Roy360

I think I finally figured out the reason for my poor performance.
Each Red Rectangle is a single slot Watercooled R9 290

Am I allowed to do this:



I'm using a PCIe riser to connect my sound card.

My CPU has enough PCIE 3.0 lanes but maybe my motherboard doesn't?

CPU : i7 3820
Motherboard: ASUS P9X79 Deluxe. (maybe it's time to get a Rampage)


----------



## tsm106

Quote:


> Originally Posted by *Roy360*
> 
> I think I finally figured out the reason for my poor performance.
> Each Red Rectangle is a single slot Watercooled R9 290
> 
> Am I allowed to do this:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I'm using a PCIe riser to connect my sound card.
> 
> *My CPU has enough PCIE 3.0 lanes but maybe my motherboard doesn't?*
> 
> CPU : i7 3820
> Motherboard: ASUS P9X79 Deluxe. (maybe it's time to get a Rampage)


Nah, the mobo can actually do quad if you had single-slot cards. What problems ail your rig?


----------



## Roy360

Quote:


> Originally Posted by *tsm106*
> 
> Nah, the mobo can actually do quad if you had single-slot cards. What problems ail your rig?


My 3DMark score is only 18k. When I run the Heaven benchmark at 5760x1080, my machine struggles to get over 30fps. I know my temps are fine; I've got way more radiator space than I need, and temps never exceed 60 degrees. I'm using MSI Afterburner to clock each card individually at 1000/1260.

I've re-installed my drivers several times to no avail.

Cards: 2 reference cards + ASUS DirectCU
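For a rough sense of the load at that resolution, here's a quick pixel-count sanity check (plain arithmetic, nothing card-specific):

```python
# Eyefinity 5760x1080 vs a single 1080p panel: GPU fill/shading work per frame
# scales roughly with pixel count.
single_1080p = 1920 * 1080
eyefinity = 5760 * 1080

ratio = eyefinity / single_1080p
print(ratio)  # 3.0
```

So each frame is roughly three times the work of a single 1080p screen, which is why multi-GPU scaling matters so much here.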


----------



## ZealotKi11er

Quote:


> Originally Posted by *Roy360*
> 
> My 3DMark score is only 18k. When I run the Heaven benchmark at 5760x1080, my machine struggles to get over 30fps. I know my temps are fine; I've got way more radiator space than I need, and temps never exceed 60 degrees. I'm using MSI Afterburner to clock each card individually at 1000/1260.
> 
> I've re-installed my drivers several times to no avail.
> 
> Cards: 2 reference cards + ASUS DirectCU


What is the GPU score?


----------



## tsm106

Quote:


> Originally Posted by *Roy360*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Nah, the mobo can actually do quad if you had single-slot cards. What problems ail your rig?
> 
> 
> 
> My 3DMark score is only 18k. When I run the Heaven benchmark at 5760x1080, my machine struggles to get over 30fps. I know my temps are fine; I've got way more radiator space than I need, and temps never exceed 60 degrees. I'm using MSI Afterburner to clock each card individually at 1000/1260.
> 
> I've re-installed my drivers several times to no avail.
> 
> Cards: 2 reference cards + ASUS DirectCU
Click to expand...

Imo most issues start with the driver. Then how you overclock and what app you use to overclock. Then it's down to specifics; with Heaven, for example, you need to create a 1x1 optimized profile to get any scaling in the bench.

For the driver, I use DDU to clean, then the latest driver or beta. Then how you OC, and the app you use, are very crucial. You are using an Asus DC2, so you have a good chance of problems: DC2 cards used to be limited to GPU Tweak, but some cards are supported under AB. Is your DC2 supported under AB? Hmm, I haven't run Heaven in years, but I do run the same resolution. Gimme a few minutes and I can do a run at your GPU clocks. I run my CPU at 4.6 daily, btw.

http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40



----------



## Vici0us

Quote:


> Originally Posted by *Roy360*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Nah, the mobo can actually do quad if you had single-slot cards. What problems ail your rig?
> 
> 
> 
> My 3DMark score is only 18k. When I run the Heaven benchmark at 5760x1080, my machine struggles to get over 30fps. I know my temps are fine; I've got way more radiator space than I need, and temps never exceed 60 degrees. I'm using MSI Afterburner to clock each card individually at 1000/1260.
> 
> I've re-installed my drivers several times to no avail.
> 
> Cards: 2 reference cards + ASUS DirectCU
Click to expand...

Does your signature say you only have 4GB of RAM? That's a problem, if that's the case. One of your PCI-E slots could also be bad; did you test each card individually?


----------



## Stag1

CROSSFIRE


----------



## Roy360

Quote:


> Originally Posted by *Vici0us*
> 
> Does your signature say you only have 4GB of RAM? That's a problem, if that's the case. One of your PCI-E slots could also be bad; did you test each card individually?


Sorry, that's out of date. It's at 40GB now. My previous tests were done at 8GB.

All the cards were tested before being watercooled. The ASUS card's memory is what's holding them all back; otherwise I'd overclock them a lot higher.

PCIe slots... well, I didn't test those. But I got the exact same scores on a 5.0GHz Intel 3570K on a Maximus board. The cards ran at 8/4/4 in that setup, which is why I upgraded to the Deluxe (but now I'm thinking I should have gotten a Maximus again).
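A quick lane-budget sketch, assuming the i7-3820 (Sandy Bridge-E) exposes 40 PCIe 3.0 lanes from the CPU; the slot widths below are illustrative, not read from the P9X79 manual:

```python
# Hypothetical PCIe lane budget: three GPUs plus a x1 riser for a sound card.
cpu_lanes = 40  # assumed for Sandy Bridge-E

devices = {"gpu1": 16, "gpu2": 8, "gpu3": 8, "sound_riser": 1}
used = sum(devices.values())

print(f"{used}/{cpu_lanes} lanes used")  # 33/40
```

If a board drops a slot to x4, that's the board's slot wiring or switch configuration, not the CPU running out of lanes.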

Quote:


> Originally Posted by *tsm106*
> 
> Imo most issues start with the driver. Then how you overclock and what app you use to overclock. Then it's down to specifics; with Heaven, for example, you need to create a 1x1 optimized profile to get any scaling in the bench.
> 
> For the driver, I use DDU to clean, then the latest driver or beta. Then how you OC, and the app you use, are very crucial. You are using an Asus DC2, so you have a good chance of problems: DC2 cards used to be limited to GPU Tweak, but some cards are supported under AB. Is your DC2 supported under AB? Hmm, I haven't run Heaven in years, but I do run the same resolution. Gimme a few minutes and I can do a run at your GPU clocks. I run my CPU at 4.6 daily, btw.
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40
> 


Well, my CPU is locked, so the best I can do is 4.3GHz if I don't want to lose out on power saving options. I've already used DDU in safe mode to remove my drivers and reinstall. Somehow that caused my 3rd card to no longer be detected, so I'll have to work on getting that fixed.

What settings are you using for that Heaven run?

I didn't realize R9 290 DirectCU cards required GPU Tweak. I know the older cards did, but I thought after the R9 290 they removed that limitation. (I'll install it now)
Quote:


> Originally Posted by *ZealotKi11er*
> 
> What is the GPU score?


What program?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Roy360*
> 
> Sorry, that's out of date. It's at 40GB now. My previous tests were done at 8GB.
> 
> All the cards were tested before being watercooled. The ASUS card's memory is what's holding them all back; otherwise I'd overclock them a lot higher.
> 
> PCIe slots... well, I didn't test those. But I got the exact same scores on a 5.0GHz Intel 3570K on a Maximus board. The cards ran at 8/4/4 in that setup, which is why I upgraded to the Deluxe (but now I'm thinking I should have gotten a Maximus again).
> Well, my CPU is locked, so the best I can do is 4.3GHz if I don't want to lose out on power saving options. I've already used DDU in safe mode to remove my drivers and reinstall. Somehow that caused my 3rd card to no longer be detected, so I'll have to work on getting that fixed.
> 
> What settings are you using for that Heaven run?
> 
> I didn't realize R9 290 DirectCU cards required GPU Tweak. I know the older cards did, but I thought after the R9 290 they removed that limitation. (I'll install it now)
> What program?


3DMark FireStrike.


----------



## kizwan

Quote:


> Originally Posted by *Roy360*
> 
> Well, my CPU is locked, so the best I can do is 4.3GHz if I don't want to lose out on power saving options. I've already used DDU in safe mode to remove my drivers and reinstall. Somehow that caused my 3rd card to no longer be detected, so I'll have to work on getting that fixed.
> 
> What settings are you using for that Heaven run?
> 
> I didn't realize R9 290 DirectCU cards required GPU Tweak. I know the older cards did, but I thought after the R9 290 they removed that limitation. (I'll install it now)


You don't lose the power saving options if you go above 4.3GHz. We already covered this before.

Well, even at 4.3GHz the difference shouldn't be that huge. You should get around what tsm106 got. Probably a little less, because I believe tsm106 has a 290X.


----------



## tsm106

Quote:


> Originally Posted by *kizwan*
> 
> You don't lose the power saving options if you go above 4.3GHz. We already covered this before.
> 
> Well, even at 4.3GHz the difference shouldn't be that huge. You should get around what tsm106 got. Probably a little less, because I believe tsm106 has a 290X.


He loses the voltage drop at idle if he clocks above 4.3, i.e. offset voltage does not work with a 125 strap or anything over the stock strap (100). And yeah, regardless of his 4.3 OC he should be around 60fps or something similar.


----------



## Roy360

Quote:


> Originally Posted by *tsm106*
> 
> He loses the voltage drop at idle if he clocks above 4.3, i.e. offset voltage does not work with a 125 strap or anything over the stock strap (100). And yeah, regardless of his 4.3 OC he should be around 60fps or something similar.


I'm going to try reformatting my computer to Windows 10 or 8.

For some reason, I can't get my 3rd GPU to be detected at all anymore. The ASUS is detected, but one of the reference cards (2nd or 3rd) isn't anymore.
Usually reinstalling gets it back; I must have lost it when I upgraded to 14.100.00.


----------



## zealord

So guys, GTA V benchmarks next week. How do you think the R9 290X will compare to the 980 / 780 Ti?

I've heard that the Nvidia GTA V-ready driver is already finished. No word from AMD so far.









I expect AMD cards to underperform a bit at launch, but it won't be that big a deal for most people, because I think GTA V will run pretty well on most cards at sub-4K resolutions. I hope I can crank everything up to Ultra and enjoy a rock-stable 60 fps at 1080p.


----------



## ZealotKi11er

Quote:


> Originally Posted by *zealord*
> 
> So guys, GTA V benchmarks next week. How do you think the R9 290X will compare to the 980 / 780 Ti?
> 
> I've heard that the Nvidia GTA V-ready driver is already finished. No word from AMD so far.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I expect AMD cards to underperform a bit at launch, but it won't be that big a deal for most people, because I think GTA V will run pretty well on most cards at sub-4K resolutions. I hope I can crank everything up to Ultra and enjoy a rock-stable 60 fps at 1080p.


It'll probably run much better on Nvidia cards, because GTA is very CPU-limited and AMD's DX11 driver sucks. They might do something about it, though, since GTA is such a big launch.


----------



## kizwan

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> You don't lose the power saving options if you go above 4.3GHz. We already covered this before.
> 
> Well, even at 4.3GHz the difference shouldn't be that huge. You should get around what tsm106 got. Probably a little less, because I believe tsm106 has a 290X.
> 
> 
> 
> 
> 
> 
> 
> He loses the voltage drop at idle if he clocks above 4.3, i.e. offset voltage does not work with a 125 strap or anything over the stock strap (100). And yeah, regardless of his 4.3 OC he should be around 60fps or something similar.
Click to expand...

I'm aware of that. That is not the only power saving feature available.
Quote:


> Originally Posted by *Roy360*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> He loses the voltage drop at idle if he clocks above 4.3, i.e. offset voltage does not work with a 125 strap or anything over the stock strap (100). And yeah, regardless of his 4.3 OC he should be around 60fps or something similar.
> 
> 
> 
> 
> 
> 
> I'm going to try reformatting my computer to Windows 10 or 8.
> 
> For some reason, I can't get my 3rd GPU to be detected at all anymore. The ASUS is detected, but one of the reference cards (2nd or 3rd) isn't anymore.
> Usually reinstalling gets it back; I must have lost it when I upgraded to 14.100.00.
Click to expand...

Happened to me before. Try removing and re-seating the card.


----------



## stl drifter

Hey guys, I currently have an Asus R9 290 reference card in my rig. I'm also building a 2011-3 rig with a 5930K processor. I found a good deal on two PowerColor reference R9 290Xs. The issue is I know very little about this manufacturer; would you guys recommend them or not? I already have full waterblocks and backplates.


----------



## kizwan

PowerColor & Sapphire are my top choices when I want ATI/AMD cards. I had a PowerColor before, from the previous GPU gen. Not a bad card.


----------



## ConnorMcLeod

Hi,

I'm using MSI Afterburner in combination with a Sapphire 290 Tri-X, and in Afterburner 4.1.0 there's no place to set 2D and 3D profiles (in the profiles section there is only the hotkey feature).
I haven't found any way to enable it in the .cfg file, unless I'm blind.
Any ideas?

Second question: is there any way to edit the BIOS so I don't need Afterburner?


----------



## pengs

Quote:


> Originally Posted by *zealord*
> 
> So guys, GTA V benchmarks next week. How do you think the R9 290X will compare to the 980 / 780 Ti?
> 
> I've heard that the Nvidia GTA V-ready driver is already finished. No word from AMD so far.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I expect AMD cards to underperform a bit at launch, but it won't be that big a deal for most people, because I think GTA V will run pretty well on most cards at sub-4K resolutions. I hope I can crank everything up to Ultra and enjoy a rock-stable 60 fps at 1080p.


This is what I'm wondering. I bet there will be a beta quite quickly.


----------



## chronicfx

Here it is. I can't even fit it in a single picture from my iPhone because it's so huge. I tried Dragon Age: Inquisition on the beta 15.3 driver; I can stare straight ahead at 8560x1440, but the second I turn, the game crashes to desktop every time, whether on ultra or high settings. I turned AA off too. GRID Autosport works great; it feels like I am staring through a windshield. I did a DDU and reinstalled back to the Omega driver to see if Dragon Age would work, and it has the same issue. I had another issue where, if I use both DVI ports plus the DP port for the third monitor, the computer just blinks the three monitors endlessly during boot and never gets to the desktop; switching the Catleap Multi to an HDMI cable solved this, although I am using three input types now, which people say is not ideal (I dunno). So far that is my two hours spent with Eyefinity last night. The rest of my day will be filled with entertaining my toddlers, and hopefully I will get some more in after they sleep tonight if I don't go out. Thanks for the help so far. Any ideas on the Dragon Age issue? I should probably disable ULPS, but since the monitors are all hooked to the top card, does it matter?


----------



## tsm106

Quote:


> Originally Posted by *chronicfx*
> 
> 
> 
> Here it is. I can't even fit it in a single picture from my iPhone because it's so huge. I tried Dragon Age: Inquisition on the beta 15.3 driver; I can stare straight ahead at 8560x1440, but the second I turn, the game crashes to desktop every time, whether on ultra or high settings. I turned AA off too. GRID Autosport works great; it feels like I am staring through a windshield. I did a DDU and reinstalled back to the Omega driver to see if Dragon Age would work, and it has the same issue. I had another issue where, if I use both DVI ports plus the DP port for the third monitor, the computer just blinks the three monitors endlessly during boot and never gets to the desktop; switching the Catleap Multi to an HDMI cable solved this, although I am using three input types now, which people say is not ideal (I dunno). So far that is my two hours spent with Eyefinity last night. The rest of my day will be filled with entertaining my toddlers, and hopefully I will get some more in after they sleep tonight if I don't go out. Thanks for the help so far. Any ideas on the Dragon Age issue? I should probably disable ULPS, but since the monitors are all hooked to the top card, does it matter?


Screen crawl. That's a lot of pixels... you will probably be VRAM-limited in any high-detail-texture area, even at lowered settings. DA in Mantle is bugged in quad and maybe tri, though I haven't confirmed it in tri. Panels blinking is a sync issue; I would verify your panels work natively too. It would be smarter to disable ULPS and reduce complexity for your rig.


----------



## chronicfx

Quote:


> Originally Posted by *tsm106*
> 
> Screen crawl. That's a lot of pixels... you will probably be VRAM-limited in any high-detail-texture area, even at lowered settings. DA in Mantle is bugged in quad and maybe tri, though I haven't confirmed it in tri. Panels blinking is a sync issue; I would verify your panels work natively too. It would be smarter to disable ULPS and reduce complexity for your rig.


I use DX because I noticed pretty fast that Mantle sucked in Dragon Age, with flickering. I've been playing the game on the LG screen (3440x1440) in tri-fire for a good 25 hours, so I think it's Eyefinity-related and less to do with tri-fire. I'm also trying to get a 6400x1080 resolution (1920x1080 + 2560x1080 + 1920x1080) to lower the VRAM usage, but it never gives me the option for that res.
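The target group size is simple arithmetic: in a side-by-side Eyefinity group the widths add up and all panels must share one common height. A quick sketch, using the panel modes from the post:

```python
# Side-by-side Eyefinity group: total width = sum of widths, one common height.
panels = [(1920, 1080), (2560, 1080), (1920, 1080)]

width = sum(w for w, _ in panels)
height = panels[0][1]
assert all(h == height for _, h in panels)  # mixed heights won't group

print(f"{width}x{height}")  # 6400x1080
```

So 6400x1080 is only offered when all three panels actually expose a 1080-high mode to the driver; if one panel reports something slightly different, the group resolution changes too.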


----------



## chronicfx

Just to add, all of the monitors work fine when I break eyefinity.


----------



## tsm106

Test a dual-DVI Eyefinity set.


----------



## Gabkicks

Hmm, I wonder if GTA V will have good CrossFire support at launch, or be buggy as hell like IV. Current-gen consoles are closer than ever to PCs, so... it should be okay, right guys?!


----------



## gatygun

Quote:


> Originally Posted by *zealord*
> 
> So guys, GTA V benchmarks next week. How do you think the R9 290X will compare to the 980 / 780 Ti?
> 
> I've heard that the Nvidia GTA V-ready driver is already finished. No word from AMD so far.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I expect AMD cards to underperform a bit at launch, but it won't be that big a deal for most people, because I think GTA V will run pretty well on most cards at sub-4K resolutions. I hope I can crank everything up to Ultra and enjoy a rock-stable 60 fps at 1080p.


AMD needs to be more open about their driver development anyway. But yeah, who knows what's gonna happen. I don't think GTA IV is really that taxing in general; it remains a last-gen console port with additions.


----------



## chronicfx

Quote:


> Originally Posted by *tsm106*
> 
> Test a dual dvi eyefinity set.


Just the two? Or with the DP monitor in between? I originally had both Catleaps on DVI, and it worked in Windows, but they would not work after a restart; they would blink in and out of power-saving mode. So I switched one Catleap to HDMI and now it restarts into Eyefinity. Maybe I should break Eyefinity before shutdowns?


----------



## tsm106

Ya, you have a sync problem. Confirm the Catleaps work on their own in a two-panel Eyefinity set. It's to isolate them from the DP, to see where and when the sync issue happens.


----------



## chronicfx

Quote:


> Originally Posted by *tsm106*
> 
> Ya, you have a sync problem. Confirm the Catleaps work on their own in a two-panel Eyefinity set. It's to isolate them from the DP, to see where and when the sync issue happens.


I will try it, thanks.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gabkicks*
> 
> Hmm, I wonder if GTA V will have good CrossFire support at launch, or be buggy as hell like IV. Current-gen consoles are closer than ever to PCs, so... it should be okay, right guys?!


The game will probably be more CPU-dependent until you go 4K. AMD will suck at GTA V for sure because of the horrible CPU overhead with DX11.


----------



## Gdourado

Today I have the chance to get an Asus 290X. The model is the R9290X-4GD5.

The card is a 290X with the reference blower cooler, but it is brand new and the price is just 220 euros.

To give some perspective, the cheapest 290X I can get otherwise is a VTX3D one for 290 euros.
The cheapest 970 is a Gainward blower-style cooler for 295 euros.
What do you think of the deal?
Is it worth it?

Cheers!


----------



## ZealotKi11er

Depends if you can deal with the stock cooler. Maybe you can watercool it.


----------



## Gdourado

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Depends if you can deal with stock cooler. Maybe you can watercool it.


If I get that card, it will be to keep with the stock cooler.
If I went to water it, for the money I might as well go for a custom-cooled 290X.

How bad is the stock cooler?
Both in throttling and noise?

Cheers


----------



## th3illusiveman

Quote:


> Originally Posted by *Gdourado*
> 
> If I get that card, it will be to keep with the stock cooler.
> If I went to water it, for the money I might as well go for a custom-cooled 290X.
> 
> How bad is the stock cooler?
> Both in throttling and noise?
> 
> Cheers


it's one of the worst. I would not recommend it at all.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gdourado*
> 
> If I get that card, it will be to keep with the stock cooler.
> If I went to water it, for the money I might as well go for a custom-cooled 290X.
> 
> How bad is the stock cooler?
> Both in throttling and noise?
> 
> Cheers


At 55% fan speed my card did not throttle. It all depends on where you live, really. If you don't mind the noise, the cooler can handle the card just fine, even with an OC.


----------



## Gdourado

Quote:


> Originally Posted by *ZealotKi11er*
> 
> At 55% fan speed my card did not throttle. It all depends on where you live, really. If you don't mind the noise, the cooler can handle the card just fine, even with an OC.


I live in Lisboa, Portugal.
In the summer, it can get pretty hot.
How is the noise at 55%?
Did you have any special airflow in the case?
I have a HAF XB.

Cheers!


----------



## Gdourado

Quote:


> Originally Posted by *th3illusiveman*
> 
> it's one of the worst. I would not recommend it at all.


Not even for 220 euros?


----------



## aaroc

If you play with headphones you won't notice the noise. Without headphones it depends on the game: in racing games, or an FPS with war noise like Battlefield 3/4, it will just be background noise. In an FPS like Alan Wake or L4D2, I think it would ruin the mood without headphones. When not gaming the fan makes normal/no noise, except when YouTube videos in Flash Player go nuts. Where I live it's hot in summer and we don't have AC; the noise was the same in summer and winter.
Last year, when playing with 3 R9 290s in tri-fire at normal fan speed in summer, my wife asked if I was cleaning the room with a new vacuum cleaner.









I think I would still buy an R9 290X today if it were cheap, if I didn't already have 4.


----------



## mfknjadagr8

Quote:


> Originally Posted by *ZealotKi11er*
> 
> At 55% fan speed my card did not throttle. It all depends on where you live, really. If you don't mind the noise, the cooler can handle the card just fine, even with an OC.


mine required at least 60 percent to not black screen... I kept it at 65, which sounded like the fan motor in my car... but it stayed at decent temps after a repaste


----------



## ZealotKi11er

Quote:


> Originally Posted by *mfknjadagr8*
> 
> mine required at least 60 percent to not black screen... I kept it at 65, which sounded like the fan motor in my car... but it stayed at decent temps after a repaste


Mine is a 290 unlocked to a 290X, and when I got it, it was November, so room temps were ~20C compared to 26-28C during summer. I only played BF4 back then and was able to get 1150MHz with +50mV and 20% power. For me it was quieter than the reference HD 7970. When I played Dota 2, for example, the card would downclock to keep cool and would never go above 37% fan, which made it quiet.


----------



## tsm106

Quote:


> Originally Posted by *Gdourado*
> 
> Quote:
> 
> 
> 
> Originally Posted by *th3illusiveman*
> 
> it's one of the worst. I would not recommend it at all.
> 
> 
> 
> Not even for the 220 euros?
Click to expand...

Reference cards are far from ideal in a hot climate. Is it a good idea to buy into a problem even if it's a bargain?


----------



## sonoma

I have been having issues with my CrossFire Windforce 290 setup.
Can someone review my score and tell me if everything looks okay?

http://www.3dmark.com/3dm/6601146


----------



## blackhole2013

I just sold my Asus 780 DirectCU and bought an MSI 290X 8GB; it's in the mail now. Please tell me I made the right choice and this card will outperform my 780, which has been clocked at 1250MHz since I got it...


----------



## muhd86

Will the R9 290X be fully DX12 compliant, or do we need to get the 390X for full DX12 support? Because the future is DX12 and above.


----------



## paconan

Quote:


> Originally Posted by *sonoma*
> 
> I have been having issues with my CrossFire Windforce 290 setup.
> Can someone review my score and tell me if everything looks okay?
> 
> http://www.3dmark.com/3dm/6601146


Your score looks right; it's about the same as mine.

Intel Core i7 2600K @ 5.0GHz + CrossFire 290 OC Tri-X (1000 core / 1300 VRAM)
http://www.3dmark.com/fs/4403930

Intel Core i7 2600K @ 5.2GHz + CrossFire 290 OC Tri-X (1200 core / 1475 VRAM)
http://www.3dmark.com/fs/4281982

my build

I think you need to overclock your CPU.

Sorry for my English.


----------



## gatygun

Quote:


> Originally Posted by *blackhole2013*
> 
> I just sold my Asus 780 DirectCU and bought an MSI 290X 8GB; it's in the mail now. Please tell me I made the right choice and this card will outperform my 780, which has been clocked at 1250MHz since I got it...


The 290X is faster than a 780 but a bit slower than a 780 Ti. It's all about the VRAM, though; you won't have any bottlenecks when it comes to that.

Quote:


> Originally Posted by *muhd86*
> 
> Will the R9 290X be fully DX12 compliant, or do we need to get the 390X for full DX12 support? Because the future is DX12 and above.


A full DX12 card is a card that is built after DX12 is released.
The AMD 200 series and Nvidia 900 series are DX12 compatible, that's it. They will work with it, but to what extent nobody knows.


----------



## gatygun

Quote:


> Originally Posted by *Gdourado*
> 
> Not even for the 220 euros?


I could buy a 290X stock version for 140 euros and still went for a 290 Tri-X. The sound of the stock coolers, especially in hot climates, is horrendous. It's all fun having extra performance for cheap, but in the end you also need to play with it for hours and hours. After my 580, which would make an aircraft sound in the summer with 90C on the GPU: NEVER AGAIN.


----------






## chronicfx

Quote:


> Originally Posted by *tsm106*
> 
> Reference cards are far from ideal in a hot climate. Is it a good idea to buy into a problem even if it's a bargain?


Mine are loud, but to be honest I set my temp targets to 82C and max allowable fan speed to 100%, and my cards do not have temp issues at all in tri-fire. When I use the default Afterburner settings (I only sometimes care to monitor framerates when playing with settings), my cards normally top out in the 70s and the VRMs in the 50s.


----------



## chronicfx

Quote:


> Originally Posted by *tsm106*
> 
> Ya you have a synch problem. Confirm the catleaps work on their own in a two panel eyefinity set. It's to isolate them from the dp so see where and when the synch issue happens.


The Catleaps do sync on their own in Eyefinity without any issues. When the DisplayPort LG is added in, they will not wake from sleep without powering one of the monitors off and on, which is not a big deal for me, just less convenient than a mouse wiggle. How can I get 6400x1080 as a resolution option? I want to see if it helps the Dragon Age crashing. I tried one of the options the game offered (6044x1044) and it crashed to desktop when I tried to move my character.


----------



## DividebyZERO

Pretty sure DA:I will run almost any resolution you want. I run it in triple-4K Eyefinity and it didn't crash, so I'm not sure why yours is crashing, but I doubt it's the resolution. I would try setting the GPUs to stock and testing.


----------



## tsm106

Quote:


> Originally Posted by *chronicfx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Ya you have a synch problem. Confirm the catleaps work on their own in a two panel eyefinity set. It's to isolate them from the dp so see where and when the synch issue happens.
> 
> 
> 
> The Catleaps do sync on their own in Eyefinity without any issues. When the DisplayPort LG is added in, they will not wake from sleep without powering one of the monitors off and on, which is not a big deal for me, just less convenient than a mouse wiggle. How can I get 6400x1080 as a resolution option? I want to see if it helps the Dragon Age crashing. I tried one of the options the game offered (6044x1044) and it crashed to desktop when I tried to move my character.
Click to expand...

Time to list out your specs and settings. DA:I is just as harsh on the cpu/imc as BF4 is. Can you run BF4 for very long?


----------



## chronicfx

Quote:


> Originally Posted by *tsm106*
> 
> Time to list out your specs and settings. DA:I is just as harsh on the cpu/imc as BF4 is. Can you run BF4 for very long?


Have not tried BF4; I don't really play multiplayer games. GPU clocks are always stock.


----------



## chronicfx

Quote:


> Originally Posted by *DividebyZERO*
> 
> Pretty sure DAI will run almost any resolution you want. I run it triple 4k eyefinity and it didn't crash. So I am not sure why yours is crashing but I doubt its the resolution. I would try setting gpus to stock and testing?


What driver? I have never overclocked the GPUs. I have a 1300W PSU and three cards overclocked would push on that. Stock they do 1046W at the PSU plug by my Kill A Watt.
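One detail worth noting about the figures above: the Kill A Watt measures AC draw at the wall, while a PSU's 1300W rating refers to DC output, so the actual load on the PSU is lower than the wall reading. A quick sketch of that arithmetic (the ~90% efficiency figure is an assumption for illustration, not a measurement of this PSU):

```python
# Rough PSU headroom check. The 0.90 efficiency is a hypothetical
# figure; real efficiency depends on the PSU model and load point.
wall_draw_w = 1046      # Kill A Watt reading at the plug (AC)
psu_rating_w = 1300     # PSU's rated DC output
efficiency = 0.90       # assumed AC-to-DC conversion efficiency

dc_load_w = wall_draw_w * efficiency     # watts actually delivered to the parts
headroom_w = psu_rating_w - dc_load_w    # margin left under the rating

print(f"DC load ~{dc_load_w:.0f} W, headroom ~{headroom_w:.0f} W")
```

Under that assumption the trifire setup sits meaningfully below the 1300W rating at stock, which is why overclocking all three cards is what would start to push on it.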


----------



## tsm106

Quote:


> Originally Posted by *chronicfx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Time to list out your specs and settings. DA:I is just as harsh on the cpu/imc as BF4 is. Can you run BF4 for very long?
> 
> 
> 
> Have not tried bf4. i do not do multiplayer games really. gpu clock are stock always.
Click to expand...

You wrote that you can run Grid AS with no issues, but DA:I crashes. That sounds like you are not stable in DA:I. DA uses the Frostbite engine, which is stupid harsh on the IMC and its relation to the PCIe controller etc. It's a common misconception that stability is proven just by stock GPU clocks.


----------



## chronicfx

Quote:


> Originally Posted by *tsm106*
> 
> You wrote that you can run Grid AS with no issues, but DA:I crashes. That sounds like you are not stable in DA:I. DA uses the Frostbite engine, which is stupid harsh on the IMC and its relation to the PCIe controller etc. It's a common misconception that stability is proven just by stock GPU clocks.


Next step?

Edit: You know what, I never pushed my power limit up. I will try that next. Would that help with crashes to desktop, or just with downclocking (which I have never seen with these cards in over a year)? I will try +25% per card to start.


----------



## DividebyZERO

Quote:


> Originally Posted by *chronicfx*
> 
> What driver? I have never overclocked the gpu's. I have a 1300w psu and three cards overclocked would push on that. Stock they do 1046w at the psu plug by my killawatt.


Just tested quad CF 290X on 15.3 in DAI at 6480x3840; no issues that I observed, though I only played about 20 minutes. Maybe try a single display first to see if it crashes; that would eliminate that part. Maybe also try stock clocks on everything, CPU and GPU, and test.


----------



## gatygun

Well, after some testing I found out that my GPU was heating up my CPU big time: the three fans on the 290 Tri-X just blow the heat out of the sides towards the top of the card, where it gets stuck and heats up my CPU cooler.

My CPU cooler did get a bit too hot lately (1.4V on an i7 870 for 4GHz via "OC Genie", Hyper-Threading enabled). 77°C needed to be the absolute max, but when the card had been running for a while the CPU would move up towards 77°C at only 50% usage, and even towards the 90s at 100% usage (which luckily no game really pushes; it mostly hangs at 60%).

My GPU heated up towards 75°C with 46% fan speed on top of it, and the v-ram moved towards 91-92°C during really taxing, long play sessions.

I found this too hot, so I switched the card to the second PCI-E slot of my motherboard and found that I had room between the card and the CPU cooler to put another cooler (an old Cooler Master) in between the card and the Noctua D14. That way all the heat that builds up there gets pushed directly out of the case through a push-pull solution.

After turning my PC on I got a strange x8 speed on the PCI-E bus (kinda weird as it's an x16 slot; still got to figure that out), but the 3DMark score is 10k, exactly like my old x16 score, so I don't see much of an issue there.

*My temps changed from:*

The CPU:

*CPU (1.4V i7 870, Hyper-Threading on, Noctua D14)*
from:
~91°C, 4 cores / 4 threads (100% usage)
to:
~77°C, 4 cores / 4 threads (100% usage)

As it never hits 100%, it now hovers around 67°C in games like Mordor, where it was 80°C before. So I'm happy with that for sure; a big improvement, and it makes things nice and stable even with the ridiculous overvolt that OC Genie puts on the CPU to get it stable at 4GHz.

Now for the GPU:

*GPU (1100/1450, +100mV, +50% power limit)*
from:
76°C core / 92°C v-ram1 with 46% fan speed
to:
72°C core / 84°C v-ram1 with 40% fan speed

All in all, a big change in temps for sure. Before, I had to clock the v-ram down, and sometimes the v-core, to keep things stable through the heat, which is now completely solved. I also couldn't push the fan on the 290 Tri-X any harder, as the CPU would then heat up to unacceptable levels, which is now a non-issue as well.

This will be stable even on hot days for sure.

Fan layout: 2 fans top, 3 fans at the CPU, 2 fans front, 1 fan on the floor, 1 above the video card, and 3 fans on the video card.

Here's a picture of my new solution:


----------



## chronicfx

Quote:


> Originally Posted by *DividebyZERO*
> 
> just tested quad CF 290x on 15.3 in DAI 6480x3840 no issues that i observed, only played about 20 minutes. Maybe try a single display first see if it crashes. That would eliminate that part, maybe try stock clocks on everything cpu/gpu and test.


OK, I will set the CPU to stock. I am 25 hours into the game playing at 3440x1440. The crashing started after I went Eyefinity.


----------



## tsm106

Quote:


> Originally Posted by *chronicfx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DividebyZERO*
> 
> just tested quad CF 290x on 15.3 in DAI 6480x3840 no issues that i observed, only played about 20 minutes. Maybe try a single display first see if it crashes. That would eliminate that part, maybe try stock clocks on everything cpu/gpu and test.
> 
> 
> 
> Ok i will stock the cpu. I am 25 hours into the game playing 3440x1440. The crashing started *after I went eyefinity.*
Click to expand...

Crossfire and eyefinity obviously puts a large load on the cpu to gpu relationship. If you changed nothing, I would bet that you'd crash in BF4 as well.


----------



## chronicfx

Quote:


> Originally Posted by *DividebyZERO*
> 
> just tested quad CF 290x on 15.3 in DAI 6480x3840 no issues that i observed, only played about 20 minutes. Maybe try a single display first see if it crashes. That would eliminate that part, maybe try stock clocks on everything cpu/gpu and test.


Quote:


> Originally Posted by *tsm106*
> 
> Crossfire and eyefinity obviously puts a large load on the cpu to gpu relationship. If you changed nothing, I would bet that you'd crash in BF4 as well.


I will list my ideas to try; give me an order to try them in.

- CPU is a 4790K @ 4.7GHz with 44x cache; I will back the cache ratio down to 42x.

- CPU to stock.

- RAM is 2400MHz with 4x4GB sticks needing 1.68V. I can set the RAM back to 1333.

- Up the power limit from 0 to +25% for all GPUs.


----------



## cephelix

Ok guys, another question. Hope someone with an MSI R9 290 Gaming 4G, or who knows its specs, can help.
I recently changed out the thermal paste and pads on my GPU to Gelid GC Extreme and Fujipoly Ultra Extreme (1.0mm) and am experiencing high temps in DAI and Valley benchmark.
Ambients are about 25 degrees, but the core reaches 82-84 degrees with VRM1 reaching about 79 degrees. These are on stock clocks, by the way.
Temps are way too high compared to the last benching session I ran a while back, when the card was new: I got 60 degrees on the core and 79 degrees on VRM1.
Does it seem like a contact issue or an airflow issue here? If it helps, I have 6 x AP-15 running at about 1200rpm as intakes and 1 as an exhaust, plus 2 x BitFenix Spectre Pro LEDs (120mm) pointed perpendicular to the card.

If it is a contact issue, would it be better to use a 0.5mm thermal pad? Does anyone know how thick the stock thermal pads on VRM1 for this card are? Thank you for any help provided.


----------



## zealord

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *gatygun*
> 
> Well, after some testing I found out that my GPU was heating up my CPU big time: the three fans on the 290 Tri-X just blow the heat out of the sides towards the top of the card, where it gets stuck and heats up my CPU cooler.
>
> My CPU cooler did get a bit too hot lately (1.4V on an i7 870 for 4GHz via "OC Genie", Hyper-Threading enabled). 77°C needed to be the absolute max, but when the card had been running for a while the CPU would move up towards 77°C at only 50% usage, and even towards the 90s at 100% usage (which luckily no game really pushes; it mostly hangs at 60%).
>
> My GPU heated up towards 75°C with 46% fan speed on top of it, and the v-ram moved towards 91-92°C during really taxing, long play sessions.
>
> I found this too hot, so I switched the card to the second PCI-E slot of my motherboard and found that I had room between the card and the CPU cooler to put another cooler (an old Cooler Master) in between the card and the Noctua D14. That way all the heat that builds up there gets pushed directly out of the case through a push-pull solution.
>
> After turning my PC on I got a strange x8 speed on the PCI-E bus (kinda weird as it's an x16 slot; still got to figure that out), but the 3DMark score is 10k, exactly like my old x16 score, so I don't see much of an issue there.
>
> *My temps changed from:*
>
> The CPU:
>
> *CPU (1.4V i7 870, Hyper-Threading on, Noctua D14)*
> from:
> ~91°C, 4 cores / 4 threads (100% usage)
> to:
> ~77°C, 4 cores / 4 threads (100% usage)
>
> As it never hits 100%, it now hovers around 67°C in games like Mordor, where it was 80°C before. So I'm happy with that for sure; a big improvement, and it makes things nice and stable even with the ridiculous overvolt that OC Genie puts on the CPU to get it stable at 4GHz.
>
> Now for the GPU:
>
> *GPU (1100/1450, +100mV, +50% power limit)*
> from:
> 76°C core / 92°C v-ram1 with 46% fan speed
> to:
> 72°C core / 84°C v-ram1 with 40% fan speed
>
> All in all, a big change in temps for sure. Before, I had to clock the v-ram down, and sometimes the v-core, to keep things stable through the heat, which is now completely solved. I also couldn't push the fan on the 290 Tri-X any harder, as the CPU would then heat up to unacceptable levels, which is now a non-issue as well.
>
> This will be stable even on hot days for sure.
>
> Fan layout: 2 fans top, 3 fans at the CPU, 2 fans front, 1 fan on the floor, 1 above the video card, and 3 fans on the video card.
>
> Here's a picture of my new solution:






My Sapphire Tri-X VRM1 was at 115°C at some point during benchmarking. I noticed it after I let it run for an hour.









In gaming it gets to 85°C or something after I manually set the fan curve to be a bit more generous.


----------



## Arizonian

Quote:


> Originally Posted by *faizreds*
> 
> You need to fix my card info. It is r9 290 tri-x. Not r9 290x.
> Thanks.










ooops....fixed.
Quote:


> Originally Posted by *Zeke1001*
> 
> Hello, can I join?
> 
> 
> 
> 
> 
> 
> 
> (Sapphire R9 290 Tri-x)
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> GPU-Z:
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## kizwan

Quote:


> Originally Posted by *cephelix*
> 
> Ok guys, another question. Hope someone with an MSI R9 290 Gaming 4G or know specs about it can help.
> Recently changed out the thermal paste and pads on my gpu to Gelid GC Extreme and Fujipoly Ultra Extreme (1.0mm) and experiencing high temps in DAI and Valley Benchmark.
> Ambients are about 25 degrees but core reaches 82-84 degrees with VRM1 reaching about 79 degrees. These are on stock clocks by the way.
> Temps are way too high from the last benching session I ran a while back, when the card was new, I got 60 degrees on core and 79 degrees on VRM1.
> Does it seem like a contact issue or airflow issue here? If it helps, I have 6 x AP-15 running at about 1200rpm as intakes and 1 as an exhaust. Plus 2 x bitfenix spectre pro leds(120mm) pointed perpendicular to the card.
> 
> If it is a contact issue, would it be better that I use a 0.5mm thermal pad then? Does anyone know how thick the stock thermal pads on VRM1 for the card is? Thank you for any help provided.


If you didn't change the fan arrangement, you can rule out airflow and it's likely poor contact. If you did change the fans, I think you'll need more than one exhaust fan. Also make sure the two fans pointed toward the GPU are getting fresh air.


----------



## cephelix

Quote:


> Originally Posted by *kizwan*
> 
> If you didn't change the fans arrangement, you can rule out airflow & it's likely poor contact. If you did changed the fans, I think you'll need more than one exhaust fan. Also make sure the two fans pointed toward to the gpu are getting fresh air.


Well, the BitFenixes are pushing air from the front intake AP-15s. I tried with the BitFenixes on low, med, high, and off, and it doesn't change temps at all. I will have to take apart my card again later and probably use thinner stacked thermal pads.


----------



## MrKZ

Just got my Asus DCU2 R9 290X. Got it at a local store's Easter sale at the price of a 290 non-X, so it was a great deal.

Quite happy with its performance ^^ .

But I have some questions. I don't know if they were already answered or not, but I hope somebody will answer:

1. Is it normal for the VRM to go up to almost 100°C at stock settings? I had to set up a custom fan curve and now it won't go over 85°C, but as soon as I touch the voltage it goes to 100°C again (tried a +50mV offset for 1100MHz on the core).
2. I'm thinking of replacing the thermal pad under the VRM heatsink, but I saw there is a white sticker on one of the screws. It doesn't say anything on it; it's just white with a red circle. Is it a warranty sticker? If so, is there any way to remove it carefully and then put it back?
3. Is it normal for the card @ 1100MHz to start artifacting when the VRM hits ~95°C, no matter how high I go with the voltage? If I keep it under that temp it's rock stable; anything over and I start to get random lines coming from 3D objects. And it's not because of the GPU temp, that one is around 75-80°C.

Thanks
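For anyone unfamiliar with what a "custom fan curve" like the one described actually is: it's just a set of (temperature, fan %) points that the tuning utility interpolates between. A minimal sketch, with entirely hypothetical points (not the curve from the post above):

```python
# Piecewise-linear fan curve: (temp in °C, fan duty in %) breakpoints.
# The points are illustrative only; pick your own for your card.
CURVE = [(40, 20), (60, 40), (75, 60), (85, 85), (95, 100)]

def fan_speed(temp_c):
    """Fan duty (%) for a given temperature, clamped at both ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            # linear interpolation between the two neighbouring points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]  # above the last point: pin at max
```

Tools like Afterburner or Trixx apply roughly this kind of mapping every time they poll the GPU temperature; making the curve steeper near the top is what keeps the VRM from running away under voltage.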


----------



## nX3NTY

I'm reading through posts in the 4GB GTX 960 thread and stumbled upon a good one. Can anyone confirm what he says is true? I'm itching to buy an R9 290/X if it is.


----------



## rdr09

Quote:


> Originally Posted by *nX3NTY*
> 
> I'm reading through post in 4GB GTX 960 and stumbled upon a good post. Can anyone confirm what he say is true? I'm itching to buy R9 290/X if it's true.


most likely you don't need to oc your cpu with the GTX 960. with the 290 . . . you must.


----------



## nX3NTY

Quote:


> Originally Posted by *rdr09*
> 
> most likely you don't need to oc your cpu with the GTX 960. with the 290 . . . you must.


That doesn't answer my question at all. If anyone missed it, here is the post I quoted. I just want to know: is this real?
Quote:


> Power efficiency is an oft-used negative against the large-die Hawaii chips, but I've been playing with powertune settings and Furmark recently as an experiment to fit a "hot and noisy" AMD card into an SFF with limited cooling.
> 
> Actually, I stand by an earlier post I made that says I think AMD pushed Hawaii silicon too far.
> With both GPU-Z and Furmark able to report power consumptions, I can see a 100W reduction in power consumption on 290X cards for as little as 5% performance loss.
> 
> If you have a Hawaii card, I urge you to crank power limits down in the overdrive tab of CCC and see what the resulting clockspeed is under full load. Even in a worst-case scenario, I'm seeing a typical clockspeed of 850MHz with the slider all the way to the left at -50%
> 
> That means that Hawaii (the two samples I personally own, at least) can run at 850+MHz on only 145W (half the 290W TDP). As mentioned, that's a worst-case scenario using a power-virus like Furmark. Under real gaming situations (I was messing around with Alien Isolation on 1440p ultra settings) the clocks averaged about 925MHz yet my PC was inaudible; Fans that normally hum along at 55% were barely spinning at 30% during my gameplay.
> 
> As Nvidia has proved, you can make a 28nm chip run efficiently. I think the design of Hawaii holds up very well under vastly reduced power constraints - AMD just pushed it outside its comfort zone in order to get the most out of it.
> 
> In saying that, the "underpowered" 290X is around the same performance as my GTX970 and also the same cost - significantly higher than a GTX960 4GB. I don't know if die-harvested 290 cards deal with power limit caps like the cherry-picked 290X cards.
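The numbers in that post can be sanity-checked with a quick sketch (assumptions: the 290W board-power figure, PowerTune's -50%..+50% slider scaling the cap linearly, and the 1000MHz/850MHz clocks the poster reports). It also exposes the part that needs benchmarks: a -50% cap costs ~15% in clock, so the claimed 5% performance loss would require fps to scale well below 1:1 with clock.

```python
# Sketch of the quoted claim. Assumptions, not measurements: 290 W board
# power, PowerTune limit scaling the cap linearly, clocks as reported.
TDP_W = 290.0

def power_cap(limit_pct):
    """Board power cap at a PowerTune limit of -50..+50 (%)."""
    return TDP_W * (1.0 + limit_pct / 100.0)

stock_clock_mhz = 1000   # reference 290X "up to" clock
capped_clock_mhz = 850   # clock the poster reports at -50%

cap_w = power_cap(-50)                                 # 145.0 W
clock_loss = 1.0 - capped_clock_mhz / stock_clock_mhz  # 0.15, i.e. 15%

print(f"cap {cap_w:.0f} W, clock down {clock_loss:.0%}")
```

Whether that 15% clock cut really shows up as only ~5% fewer fps is game-dependent, which is why running actual benchmarks (as suggested later in the thread) is the only way to confirm it.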


----------



## gatygun

Quote:


> Originally Posted by *MrKZ*
> 
> Just got my Asus DCU2 R9 290X. Got it at a local store's Easter sale at the price of a 290 non-X, so it was a great deal.
> 
> Quite happy with it's performance ^^ .
> 
> But i have some questions.I dont know if they were already answered or not, but i hope somebody will answer :
> 
> 
> Is it normal for the VRM to go up to almost 100C at stock settings? I had to set up a custom fan curve and now it wont go over 85, but as soon as i touch the voltage it goes to 100 again (tried +50mV offset for 1100mhz on core)
> I'm thinking of replacing the thermal pad under the VRM heatsink, but i saw there is a white sticker on one of the screws.It doesn't say anything on it. It's just white with a red circle on it. Is it a warranty sticker? If so, is there any way to remove it carefully then put it back?
> Is it normal for the card @ 1100mhz to start artifacting when vrm hits ~95C, no matter how high i go with the voltage? Cause if i keep It under that temp is rock stable. Anything over and i start to get random lines coming from 3d objects. And it's not because of the GPU temp, that one is around 75-80C.
> Thanks


1) It should be like 93°C or something at stock; maybe your airflow is kinda bad (if you've got the stock cooler on it).
2) Removing or changing stuff will void the warranty on it. There are reports of cards still shipping with the plastic on the v-ram because it was forgotten to be taken off, so maybe you've got that issue.
3) Some say the max temp for stable v-ram is 95°C, others say it's 105°C. I would keep it under 90°C personally. My card always runs about 10-15°C hotter on the v-ram than on the core.


----------



## rdr09

Quote:


> Originally Posted by *nX3NTY*
> 
> That doesn't answer my question at all. If anyone missed it here is the post I quoted. I just want to know is this real?


Too long to read - my bad and I'm sorry. One thing is for sure . . . your PSU will handle a 290 just fine, even OC'ed. Now, depending on your resolution . . . the GTX 960 might suit you just fine while saving the planet at the same time.

I'll let others chime in. GL with your choice.


----------



## tsm106

Quote:


> Originally Posted by *nX3NTY*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> most likely you don't need to oc your cpu with the GTX 960. with the 290 . . . you must.
> 
> 
> 
> That doesn't answer my question at all. If anyone missed it here is the post I quoted. I just want to know is this real?
> Quote:
> 
> 
> 
> Power efficiency is an oft-used negative against the large-die Hawaii chips, but I've been playing with powertune settings and Furmark recently as an experiment to fit a "hot and noisy" AMD card into an SFF with limited cooling.
> 
> Actually, I stand by an earlier post I made that says I think AMD pushed Hawaii silicon too far.
> With both GPU-Z and Furmark able to report power consumptions, I can see a 100W reduction in power consumption on 290X cards for as little as 5% performance loss.
> 
> If you have a Hawaii card, I urge you to crank power limits down in the overdrive tab of CCC and see what the resulting clockspeed is under full load. Even in a worst-case scenario, I'm seeing a typical clockspeed of 850MHz with the slider all the way to the left at -50%
> 
> That means that Hawaii (the two samples I personally own, at least) can run at 850+MHz on only 145W (half the 290W TDP). As mentioned, that's a worst-case scenario using a power-virus like Furmark. Under real gaming situations (I was messing around with Alien Isolation on 1440p ultra settings) the clocks averaged about 925MHz yet my PC was inaudible; Fans that normally hum along at 55% were barely spinning at 30% during my gameplay.
> 
> As Nvidia has proved, you can make a 28nm chip run efficiently. I think the design of Hawaii holds up very well under vastly reduced power constraints - AMD just pushed it outside its comfort zone in order to get the most out of it.
> 
> In saying that, the "underpowered" 290X is around the same performance as my GTX970 and also the same cost - significantly higher than a GTX960 4GB. I don't know if die-harvested 290 cards deal with power limit caps like the cherry-picked 290X cards.
> 
> Click to expand...
Click to expand...

Nvidia has proved what? What they have done is use color compression schemes and boost gating (a form of power on/off), and most dramatically, gutted their GPUs to lower power consumption. Also, reviewers boasted about these low-TDP reference designs as a barometer, all the while testing 3rd-party custom designs which threw the reference TDP numbers out.

*I can see a 100W reduction in power consumption on 290X cards for as little as 5% performance loss.*

Did you bother to run benches to confirm this gibberish?


----------



## LandonAaron

Quote:


> Originally Posted by *nX3NTY*
> 
> I'm reading through post in 4GB GTX 960 and stumbled upon a good post. Can anyone confirm what he say is true? I'm itching to buy R9 290/X if it's true.


You're not going to save any significant money on your power bill by choosing a lower-TDP graphics card unless you're mining or folding 24/7 with multiple cards. Just get the faster card and call it a day. I agree with TSM. Nvidia is focusing on all the wrong things, trying to make their cards as efficient as possible by sacrificing performance. I save electricity by buying CFL light bulbs and using blackout curtains in the summer. I don't need to save minuscule amounts of electricity with performance gaming cards. I buy those to go fast.


----------



## MrKZ

Quote:


> Originally Posted by *gatygun*
> 
> 1) It should be like 93c or something on stock, maybe your airflow is kinda bad. ( if you got a stock cooler on it )
> 2) Removing stuff or changing stuff will void the warranty on it. There are reports of cards coming still with plastic on the v-ram memory's as it was forgotten to be taken off. so maybe you got that issue.
> 3) Some say the max temp for a stable v-ram is 95 others say its 105. I would keep it under the 90 personally. My card always yields about 10-15c more temps on the v-ram in comparison towards the core.


Thank you for the reply, but I'm talking about the VRM, not the v-ram (memory)


----------



## gatygun

Quote:


> Originally Posted by *nX3NTY*
> 
> That doesn't answer my question at all. If anyone missed it here is the post I quoted. I just want to know is this real?


In the end it's all about what you want. I underclocked my 580 because I found the heat too much for it on the stock cooler. I dealt with a 10% performance loss, especially in the summer, but I found it good enough.

Also, his fan profile and heat depend on his airflow. If he has issues at stock with 55% fan speed, his case is probably badly ventilated, or maybe he has a reference version, because my Tri-X never goes past 40% (42% max if it does), and that's with a 10% overclock plus +100mV and a +50% power limit increase.

I underclocked my card a while back because it was unstable towards 6
Quote:


> Originally Posted by *nX3NTY*
> 
> That doesn't answer my question at all. If anyone missed it here is the post I quoted. I just want to know is this real?


Here, I made a video of me playing around with voltage. It's kinda blurry, but oh well, you can see it.

Installing Alien Isolation atm to see if I can replicate his experience exactly, as Furmark really doesn't test much at all for GPU stability and results, in my experience.


----------



## cephelix

Quote:


> Originally Posted by *MrKZ*
> 
> Just got my Asus DCU2 R9 290X. Got it at a local store's Easter sale at the price of a 290 non-X, so it was a great deal.
> 
> Quite happy with it's performance ^^ .
> 
> But i have some questions.I dont know if they were already answered or not, but i hope somebody will answer :
> 
> 
> Is it normal for the VRM to go up to almost 100C at stock settings? I had to set up a custom fan curve and now it wont go over 85, but as soon as i touch the voltage it goes to 100 again (tried +50mV offset for 1100mhz on core)
> I'm thinking of replacing the thermal pad under the VRM heatsink, but i saw there is a white sticker on one of the screws.It doesn't say anything on it. It's just white with a red circle on it. Is it a warranty sticker? If so, is there any way to remove it carefully then put it back?
> Is it normal for the card @ 1100mhz to start artifacting when vrm hits ~95C, no matter how high i go with the voltage? Cause if i keep It under that temp is rock stable. Anything over and i start to get random lines coming from 3d objects. And it's not because of the GPU temp, that one is around 75-80C.
> Thanks


The Asus DCU cards are the worst in terms of cooling for the R9 290(X) series. The VRM can handle temps up to 120°C, but of course the lower the better. As for the sticker over the screw, that is a warranty sticker. If you want to remove it, you could just run the card in games or use a hairdryer to heat it up, then quickly remove the sticker with a pair of fine-tipped tweezers. That's what I did to mine, anyway. Can't answer your 3rd question since mine is at stock.

On to my problem: I adjusted the thermal pads and applied CLU between the heatsink and die, and my temps have gone down 10°C to 73°C in a 25°C ambient. Will play around with airflow during the weekend and see how much lower I can get it. Still have yet to OC my card.


----------



## pcrevolution

Can I join too please?










RMA'd my old HD7970 and got an R9 290X. Reference, too!

I know not everyone has a thing for blower designs, and admittedly Nvidia's blower designs have better build quality, but I like having the air exhausted out the rear I/O.

Added an Aquacomputer backplate to my 290X too:



P.S. Not an Nvidia fanboy, but green just happens to be my favourite colour.

I'm waiting for my waterblocks and rads to arrive before I can ditch that intel stock cooler.


----------



## Mega Man

Well congrats!

Imo amd ref greater than nvidia


----------



## tsm106

Quote:


> Originally Posted by *Mega Man*
> 
> Imo amd ref greater than nvidia


I think he just meant the ref cooler and not the ref pcb design, in which case you'd be right. AMD pcb > NV pcb.


----------



## pcrevolution

Quote:


> Originally Posted by *tsm106*
> 
> I think he just meant the ref cooler and not the ref pcb design, in which case you'd be right. AMD pcb > NV pcb.


Oh yeah. I was referring to the reference cooler. Hahaha.


----------



## boot318

Quote:


> Originally Posted by *MrKZ*
> 
> Just got my Asus DCU2 R9 290X. Got it at a local store's Easter sale at the price of a 290 non-X, so it was a great deal.
> 
> Quite happy with it's performance ^^ .
> 
> But i have some questions.I dont know if they were already answered or not, but i hope somebody will answer :
> 
> 
> Is it normal for the VRM to go up to almost 100C at stock settings? I had to set up a custom fan curve and now it wont go over 85, but as soon as i touch the voltage it goes to 100 again (tried +50mV offset for 1100mhz on core)
> I'm thinking of replacing the thermal pad under the VRM heatsink, but i saw there is a white sticker on one of the screws.It doesn't say anything on it. It's just white with a red circle on it. Is it a warranty sticker? If so, is there any way to remove it carefully then put it back?
> Is it normal for the card @ 1100mhz to start artifacting when vrm hits ~95C, no matter how high i go with the voltage? Cause if i keep It under that temp is rock stable. Anything over and i start to get random lines coming from 3d objects. And it's not because of the GPU temp, that one is around 75-80C.
> Thanks


I have the same card as you, and my thermal pad under the VRM wasn't thick enough to make proper contact with the heatsink/VRM. I changed the thermal pad to something I had lying around and saw a 13°C drop in temps (or more if I can deal with the fan noise). My card starts to artifact in the 85-87°C range on the VRM. I am using the "RED MOD", so core temps were never a problem.

My heatsink sticker on the screws wasn't normal. It was more like a brittle sticker; it was breaking away like clay when I started to remove it. For me, I had to get my VRM temps under control before I could truly start overclocking my GPU. I could barely get 1140 with +100mV before improving the VRM temps. Now I can do 1190 on the core with the same volts.

P.S. The VRM and memory VRM are under that heatsink also.


----------



## gatygun

Quote:


> Originally Posted by *nX3NTY*
> 
> That doesn't answer my question at all. If anyone missed it here is the post I quoted. I just want to know is this real?


Tested it out with overclock / stock / underclock while running Alien Isolation at 1080p maxed settings. Here are the results:

*OC: 1145/1450*


Spoiler: Warning: Spoiler!









*Stock: 1000/1300*


Spoiler: Warning: Spoiler!









*Underclock: 925/1300*

Couldn't get the volts any lower without reducing the memory clock or other measures, which I didn't hear him mention, but it's close, I guess.


Spoiler: Warning: Spoiler!









*Underclock: 850/1250*


Spoiler: Warning: Spoiler!








Quote:


> Originally Posted by *MrKZ*
> 
> Thank you for the reply, but I'm talking about the VRM, not the v-ram (memory)


I meant the VRM, my bad.


----------



## pengs

Yeah, if you're looking at GPU-Z it's the _VRM_ temp, not V-ram. I was wondering how people were getting video memory temperatures.


----------



## ghabhaducha

Hey guys, I saw this on reddit, and hopefully it's true. Have there been any leaks of the driver yet?
Quote:


> Originally Posted by *[email protected] (twitter)*
> 
> __ https://twitter.com/i/web/status/587661043125522432
> AMD GTA V drivers coming today, thanks for your patience everyone.


Pretty stoked for GTA V for PC!!!


----------



## tsm106

It's been on Roy's twitter for a couple hours now.


----------



## gatygun

Nice, was already hoping for a driver update.


----------



## ghabhaducha

Quote:


> Originally Posted by *tsm106*
> 
> It's been on Roy's twitter for a couple hours now.


Ah I see, ya I just saw it. I was hoping the driver download link would have been available somewhere, haha.

So are you gonna get GTA V bro? We should make an OCN crew or something, that would be sick.


----------



## Alexbo1101

Quote:


> Originally Posted by *ghabhaducha*
> 
> Ah I see, ya I just saw it. I was hoping the driver download link would have been available somewhere, haha.
> 
> So are you gonna get GTA V bro? We should make an OCN crew or something, that would be sick.


http://www.overclock.net/t/1513587/gtav-pc-overclock-net-group/0_20


----------



## tsm106

Quote:


> Originally Posted by *ghabhaducha*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> It's been on Roy's twitter for a couple hours now.
> 
> 
> 
> Ah I see, ya I just saw it. I was hoping the driver download link would have been available somewhere, haha.
> 
> So are you gonna get GTA V bro? We should make an OCN crew or something, that would be sick.
Click to expand...

http://www.overclock.net/t/1549443/gmg-gta-v-25-off-45/0_40#post_23765740


----------



## ghabhaducha

Quote:


> Originally Posted by *Alexbo1101*
> 
> http://www.overclock.net/t/1513587/gtav-pc-overclock-net-group/0_20


Aww sweeeettt...ya I saw the MAX SETTINGS group a bit late on reddit r/GrandTheftAutoV_PC, but I was able to join MAX SETTINGS 2 (was the 1000th member incidentally..lmao). Then I thought, hmm I wonder if OCN has a crew. From what I gathered on xbox360, heists are sick, but it helps TREMENDOUSLY to do them with people who you know, or are serious players.


----------



## ghabhaducha

Quote:


> Originally Posted by *tsm106*
> 
> http://www.overclock.net/t/1549443/gmg-gta-v-25-off-45/0_40#post_23765740


Noice, Noice, NOICE! I can't wait bro, we gotta play sometime. I often read your posts regarding multi-gpu set ups, so please share how it runs on your setup!


----------



## tsm106

Will do in 3 hours.


----------



## tsm106

Quote:


> Highlights of AMD Catalyst™ 15.4 Beta Windows Driver
> 
> New Crossfire Profiles for:
> 
> *Grand Theft Auto V*
> *Dying Light**
> Galactic Civilizations III
> Metal Gear Solid V: Ground Zeroes
> Mortal Kombat X**
> Sleeping Dogs : Definitive Edition
> 
> Crossfire Profiles updates for:
> 
> Battlefield Hardline
> *Far Cry 4
> Middle Earth: Shadow of Mordor*
> Sniper Elite 3


----------



## Mega Man

So does that mean Mantle fixes...?


----------



## ghabhaducha

Quote:


> Originally Posted by *tsm106*


Thanks, downloaded and installing


----------



## tsm106

Quote:


> Originally Posted by *Mega Man*
> 
> So does that mean mantle fixes. ..


Mantle versions unchanged, though it looks like a bf4 mem leak is addressed.


----------



## Harry604

BF4 hacks don't work with Mantle, only DX11.


----------



## LandonAaron

Anyone tested the new Dying Light crossfire profile yet?


----------



## LandonAaron

Quote:


> Originally Posted by *Harry604*
> 
> bf4 hacks dont work with mantle only dx 11


You mean cheats?


----------



## Harry604

Quote:


> Originally Posted by *LandonAaron*
> 
> You mean cheats?


yes


----------



## ZealotKi11er

Quote:


> Originally Posted by *tsm106*
> 
> Mantle versions unchanged, though it looks like a bf4 mem leak is addressed.


I got that memory leak in BF4 two days ago and did not know what to do, lol. My computer started slowing down and I was not sure what the problem was until I saw 15.5GB of RAM used, lol.


----------



## LandonAaron

Pre-loading GTA V. I am on 5.7 of 59GB, lol! That has to be some kind of record.


----------



## tsm106

Ran around the BF4 test range causing some mayhem in Mantle; no problems so far. Good sign. 34 mins to go on GTA V...


----------



## boot318

He never said he hacks. It was kind of suggestive, but he never said it.


----------



## mfknjadagr8

Hacks are easily the biggest pitfall in multiplayer gaming today. I do find it satisfying to get a headshot on a noob cheater. It's sad, but it's rampant because people don't want to work for anything, and not just in PC gaming. I think all multiplayer cheats should come with a complimentary hard-drive-formatting virus. Maybe it's just me, but I feel if I can't achieve something, I should learn to do it better or admit defeat.


----------



## Harry604

Quote:


> Originally Posted by *boot318*
> 
> He never said he hacks. It was kinda suggestive, but never said.


Thank you. Not once did I say I personally hack.


----------



## DividebyZERO

Can't we all just get along?


----------



## sinnedone

Does anyone have the stock I/O shield off a reference Sapphire 290/290X? The stock smoked-chrome one?









I had one of my cards sent in for RMA, and they sent back a Tri-X instead of a reference card, and its shield doesn't match the reference ones.


----------



## fishingfanatic

Agreed! We're all enthusiasts; let's celebrate that. It may give us pause b4 lashing out. That said, I SUCK!!!

JK!!!!!!lolz

FF


----------



## Mega Man

No I don't, sorry. I thought I was the only one that looked for those.

You can buy them from eBay, but they are third-party reproductions. They looked like normal ones to me, though. I plan on painting them, but I am keeping the OEM ones.


----------



## Maintenance Bot

Quote:


> Originally Posted by *LandonAaron*
> 
> Anyone tested the new Dying Light crossfire profile yet?


Yes, and it is not working well here. Each GPU sits at 45 to 50% usage. The game seems glacially paced and slow with CFX enabled. It runs well on one GPU though.


----------



## tsm106

Eyefinity support and very good multi-GPU usage out of the box. GTA V is the real deal.


----------



## BradleyW

Quote:


> Originally Posted by *tsm106*
> 
> Eyefinity support and very good multi gpu usage out of the box. Gta V is the real deal.


Is the 3930K getting good usage?


----------



## chronicfx

I got Dragon Age working at 8560x1440. I updated my chipset drivers as a "what if" and everything has worked since. Just played about an hour and a half straight with no CTD. I am on custom settings, as ultra leaves a couple of things lowered to high; I have everything pushed up to ultra where possible, but AA at 0x. Getting 35-45 FPS, very smooth, and it never dropped under 30. Could not be happier with this setup!

Edit: Very happy to report it was not my overclock


----------



## Widde

Got tons of stuttering in GTA V with the 14.12s, but with 15.4 it's all gone and Crossfire is working.


----------



## zealord

It is so hard finding reports from people on how GTA V is running on a 290 or 290X. I just browsed through 20 pages on another forum for GTA V performance and out of 300 posts roughly 295 people are using either a GTX 970 or GTX 980.

Someone said he was getting like 45 fps on 1080p with a R9 290, that can't be, right?

quotes :
Quote:


> So are all of you sporting 60 fps just playing the intro? As soon as I get in the city my 32 gigs of ram intel 5820k 6 core with dual 290x goes from 60 to single digits...


Quote:


> 3570K
> R9 290
> 16 GB Ram
> 
> With every setting (aside from the experimental sliders) set as high as I can, I'm getting an almost perfect 30fps in the benchmark. This is completely stock so I'm going to see how much overclocking both the CPU and GPU improves things.


That sounds horrible


----------



## LandonAaron

Well, I have been trying to download GTA V for the past 8 hours and just now hit the halfway mark. At 60 gigs it must be using some very high-resolution textures. I really hope these Crossfire reports are true; I'm anxious to put this second card to work.


----------



## th3illusiveman

Anyone else have a weird shadow bug in GTA 5? Like when you move the camera around 360 degrees, there are some angles where it clips away and disappears.


----------



## Yvese

For those interested, getting about 30-80 fps on my setup in GTA 5 max settings ( MSAA at x2 ). The 30 fps is mainly outside while driving at daytime. I'd definitely be at consistent 60 fps if I was at 1080p. Not sure how 970/980s perform at my res but I wouldn't be surprised if this game is optimized better on Nvidia hardware.
Quote:


> Originally Posted by *th3illusiveman*
> 
> anyone else have a weird shadow bug for GTA5? Like when you move the camera around 360 degress there are some angles where it clips away and disappears.


Yea I'm getting that as well. Using latest 15.4 drivers.


----------



## th3illusiveman

Updating from Omega to 15.4 seems to have fixed that issue. I also raised my CPU speed to 4.5GHz, and it seems doing both these things made the game more playable at everything very high, 1080p with 4x MSAA, hovering between 50 and 70 FPS (median around 60) with the advanced graphics slider turned down a bit but the options turned on.

Very enjoyable. I can say that this is a good port. Some minor issues (like every game), but an absolute masterpiece compared to the POS that was/is GTA 4.

I can tell Friday will be a very good day for me


----------



## gatygun

Quote:


> Originally Posted by *zealord*
> 
> It is so hard finding reports from people on how GTA V is running on a 290 or 290X. I just browsed through 20 pages on another forum for GTA V performance and out of 300 posts roughly 295 people are using either a GTX 970 or GTX 980.
> 
> Someone said he was getting like 45 fps on 1080p with a R9 290, that can't be, right?
> 
> quotes :
> 
> That sounds horrible




Either AMD did absolutely nothing with their drivers for GTA 5, or they limited AMD GPUs' performance on purpose.

Because the FPS is horribly bad on AMD cards; a 680 performs on par with a 290, just lol.

Also this is high:


----------



## gatygun

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Hacks are easily the biggest pitfall in multiplayer gaming today....I do find it satisfying to get a headshot on a noobie cheater....it's sad but its rampant because people don't want to work for anything not just pc gaming.....I think all multiplayer cheats should come with a complimentary hard drive formatting virus....maybe it's just me but I feel if I can't achieve something I should learn to do it better or admit defeat


There is an easy fix, and it has already been used in multiple games to deal with hackers: when they get caught, they get pushed into servers that only contain other hackers.


----------



## rdr09

Titan X for 1080p? It's getting the same fps as a GPU half its price.


----------



## thrgk

Is it OK to run my 290Xs with the +200mV mod? They are all water cooled, and hopefully I'd be 1150MHz stable in BF4 with 200mV more.


----------



## Wezzor

Quote:


> Originally Posted by *gatygun*
> 
> 
> 
> Either AMD did absolutely nothing with their drivers for GTA 5, or they limited AMD GPUs' performance on purpose.
> 
> Because the FPS is horribly bad on AMD cards; a 680 performs on par with a 290, just lol.
> 
> Also this is high:


Is that polish site even trustworthy?


----------



## velocityx

Quote:


> Originally Posted by *rdr09*
> 
> Titan X for 1080? getting same fps as the gpu half its price.


It just shows that the CPU is the bottleneck in this game, and that AMD's drivers are a CPU hog compared to Nvidia's, which use the CPU much better.


----------



## Mega Man

Sniff...sniff I smell fanboi.

Is it possible that with more raw power, and without Nvidia-style compression, you need the CPU more?


----------



## rdr09

Quote:


> Originally Posted by *velocityx*
> 
> it just shows that CPU is the bottleneck in this game. and that AMD drivers are a cpu hog in comparison to nvidia drivers which use cpu much better.


half the price . . .


----------



## DarthElvis

Quote:


> Originally Posted by *Wezzor*
> 
> Is that polish site even trustworthy?


No, it's not. One of the most anti-AMD sites I have seen. I would trust Chinese anti-virus before I trusted that site.


----------



## gatygun

I have no clue; I just found them and posted them.


----------



## tsm106

Quote:


> Originally Posted by *gatygun*
> 
> Quote:
> 
> 
> 
> Originally Posted by *zealord*
> 
> It is so hard finding reports from people on how GTA V is running on a 290 or 290X. I just browsed through 20 pages on another forum for GTA V performance and out of 300 posts roughly 295 people are using either a GTX 970 or GTX 980.
> 
> Someone said he was getting like 45 fps on 1080p with a R9 290, that can't be, right?
> 
> quotes :
> 
> That sounds horrible
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Either AMD did absolutely nothing with their drivers for GTA 5, or they limited AMD GPUs' performance on purpose.
> 
> Because the FPS is horribly bad on AMD cards; a 680 performs on par with a 290, just lol.
> 
> Also this is high:
Click to expand...

To say that site is heavily biased is an understatement.


----------



## Yvese

Yea, those benchmarks are just wrong. I'm getting that average fps at 1440p...


----------



## LandonAaron

If I watch embedded YouTube videos they flicker; if I watch a video on YouTube itself it plays fine. The embedded videos flicker in both normal and fullscreen mode. What could be causing this? Running Crossfire, ULPS disabled, 15.4 drivers; it happened with 15.3 as well.


----------



## ZealotKi11er

At 1080p Nvidia has always been better than AMD, and we all know AMD has problems with CPU-intensive games in DX11.


----------



## Mega Man

Quote:


> Originally Posted by *ZealotKi11er*
> 
> @ 1080p Nvidia has always been better then AMD and we all know AMD has problem with CPU intense games with DX11.


Not really


----------



## joeh4384

Guru3D's review has the 290X doing much better. It looks like this game plays pretty well on both sides.

http://www.guru3d.com/articles_pages/gta_v_pc_graphics_performance_review,5.html


----------



## chronicfx

22% downloaded, can't wait to try GTA V out at 8560x1440. If it gets 45 FPS at 4K with 290X CF, I am hoping and praying to get that at my res: 25% more pixels - 25% more GPU horsepower (given less-than-100% scaling) = same FPS, barring VRAM tanking. Let's hope that AMD bandwidth keeps up with 12.3 megapixels!
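As an aside, the raw pixel arithmetic in estimates like this is easy to sanity-check. A quick sketch using the two resolutions mentioned above (note the Eyefinity setup actually carries closer to 49% more pixels than 4K, not 25%):

```python
# Pixel-count comparison: 8560x1440 (bezel-compensated triple-1440p
# Eyefinity) vs. 3840x2160 ("4K" UHD).
eyefinity_px = 8560 * 1440
uhd_px = 3840 * 2160

print(eyefinity_px / 1_000_000)         # -> 12.3264 megapixels, matching the "12.3" figure
print(round(eyefinity_px / uhd_px, 2))  # -> 1.49, i.e. ~49% more pixels than 4K
```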


----------



## Jiiks

Had to RMA my Tri-X (it suddenly started black-screening on boot); getting a Vapor-X replacement


----------



## cephelix

Quote:


> Originally Posted by *Jiiks*
> 
> Had to RMA my TRI-X(suddenly started blackscreening on boot), getting a Vapor-X replacement


That's a good replacement! I've been wondering if I want to swap out my MSI 290 Gaming for a Tri-X or Vapor-X 290X, but the performance gains at 1200p don't seem worth it to me. Anyway, I'm waiting for the 300 series to launch before I decide if I want to change anything.
The good thing now is I'm maxing out all settings in DA:I at 1200p, except MSAA which I turned off, and I still get about 56-60fps with vsync on.


----------



## jura11

Hi guys

Here is my validation

http://www.techpowerup.com/gpuz/details.php?id=3nvq4

Card is Sapphire R9 290 Tri-X

Cooling is stock

Thanks,Jura


----------



## Arizonian

Quote:


> Originally Posted by *MrKZ*
> 
> Just got my Asus DCU2 R9 290X. Got it at a local store's Easter sale at the price of a 290 non-X, so it was a great deal.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Quite happy with its performance ^^ .
> 
> But I have some questions. I don't know if they were already answered or not, but I hope somebody will answer:
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> Is it normal for the VRM to go up to almost 100C at stock settings? I had to set up a custom fan curve and now it won't go over 85C, but as soon as I touch the voltage it goes to 100C again (tried a +50mV offset for 1100MHz on the core).
> I'm thinking of replacing the thermal pad under the VRM heatsink, but I saw there is a white sticker on one of the screws. It doesn't say anything on it; it's just white with a red circle on it. Is it a warranty sticker? If so, is there any way to remove it carefully and then put it back?
> Is it normal for the card @ 1100MHz to start artifacting when the VRM hits ~95C, no matter how high I go with the voltage? Because if I keep it under that temp it is rock stable. Anything over and I start to get random lines coming from 3D objects. And it's not because of the GPU temp; that one is around 75-80C.
> 
> 
> 
> 
> Thanks


Congrats - added








Quote:


> Originally Posted by *pcrevolution*
> 
> Can I join too please?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> RMA-ed my old HD 7970 and got an R9 290X. Reference too!
> 
> I know not everyone has a thing for blower designs and admittedly Nvidia's blower designs have better build quality. But I like having the air exhausted out the rear I/O.
> 
> Added an Aquacomputer backplate to my 290X too:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> P.S. Not an Nvidia fanboy, but green just happens to be my favourite colour.
> 
> I'm waiting for my waterblocks and rads to arrive before I can ditch that intel stock cooler.


Congrats - added















Quote:


> Originally Posted by *Mega Man*
> 
> Well congrats!
> 
> IMO AMD ref is greater than Nvidia's


IMO too.
Quote:


> Originally Posted by *Jiiks*
> 
> Had to RMA my TRI-X(suddenly started blackscreening on boot), getting a Vapor-X replacement


Sorry to hear about your Tri-X. Nice to hear you got a Vapor-X replacement. WTG Sapphire.
Quote:


> Originally Posted by *jura11*
> 
> Hi guys
> 
> Here is my validation
> 
> http://www.techpowerup.com/gpuz/details.php?id=3nvq4
> 
> Card is Sapphire R9 290 Tri-X
> 
> Cooling is stock
> 
> Thanks,Jura


Congrats - added


----------



## chronicfx

So I was cruising along at 4.6MB/s all the way to 26% on my GTA V download, then I went to play soccer for two hours and came back to 30% and 200KB/s... lol, night ruined.


----------



## Alexbo1101

Okay, I'm having some annoying problems in GTA V: it's stuttering like crazy, and the world is slow to render when driving semi-fast cars.

Any ideas? (This is medium/high settings on the sig rig, minus the CPU OC; GTA was not happy with it.)


----------



## Mega Man

Quote:


> Originally Posted by *y4h3ll*
> 
> Called AMD tech support and they told me to change the predefined settings to Optimize 1x1, and it finally worked. Anyone having the same issue: just change your Crossfire settings to Optimize 1x1; that fixed my application issue. This is a known issue for Win 8.1; that's why it wouldn't load f-s.


This is why, IMO, the 290/290X/295X2 should be in one thread.

Many, however, stated there are no problems with performance.


----------



## Alexbo1101

Quote:


> Originally Posted by *Mega Man*
> 
> this is why imo 290/290x/295x2s should be in one thread
> 
> many however stated there is no problems with performance


Was that quote directed at me, Mega Man?


----------



## Mega Man

I hope it helps, but no, my post was not directed at you.

I won't be able to help you with GTA though. When I was in high school a buddy of mine screwed me over:

he just spent hours, and I mean hours, shooting at cars and "racing" them,

and after that he would get 5 stars (or whatever) and use the code to clear it.

I honestly did not know that game had a plot for 2-3 years... (GTA 3), so yeah, to this day I can't play it.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Mega Man*
> 
> i hope it helps, but yes, my writing was not.
> 
> i wont be able to help you. when i was in high school a buddy of mine screwed me
> 
> he just spent hours, and i mean hours shooting at cars and "racing" them
> 
> then after that, he would get 5 stars ( or w.e. ) and use the code to clear it.
> 
> i honestly did not know that game had a plot for ... 2-3 years.... ( GTA 3 ) so yea .... to this day i cant play it


You should pick it up; it's come a long way since then.


----------



## long99x

I need some advice.

Should I change my 2x Asus 290 DC2 for 2x Sapphire 290 Tri-X?

Thanks


----------



## rdr09

Quote:


> Originally Posted by *long99x*
> 
> I need a advice
> 
> Should I change my 2xAsus 290DC2 to 2xSapphire 290 TriX ?
> 
> Thanks


Yes, if it won't cost you much. One DC2 is hard to cool; can't imagine two.


----------



## Agent Smith1984

Okay, this really is a deal alert this time...

Refurb Tri-x OC For $215 with promo code....
http://www.newegg.com/Product/Product.aspx?Item=N82E16814202146&cm_re=r9_290-_-14-202-146-_-Product

Crossfire? Yes, it is time.....


----------



## jura11

I have a Tri-X 290; with the fan at 20% I see 35C at idle, 60-70C during 3DMark Fire Strike, and around 50C during LuxRender rendering.

I haven't tried any newer games as I'm not much of a gamer; if Assetto Corsa counts as a newer game, there I see 55-65C temps.

Hope this helps there

Thanks,Jura


----------



## long99x

Quote:


> Originally Posted by *rdr09*
> 
> yes, if it won't cost you much. one dc2 is hard to cool. can't imagine 2.


Quote:


> Originally Posted by *Agent Smith1984*
> 
> Okay, this really is a deal alert this time...
> 
> Refurb Tri-x OC For $215 with promo code....
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202146&cm_re=r9_290-_-14-202-146-_-Product
> 
> Crossfire? Yes, it is time.....


Quote:


> Originally Posted by *jura11*
> 
> I've Tri-X 290 which with 20% fan I see 35C on idle,during the 3D Mark Firestrike I will see 60-70C or during the LuxRender rendering I see around 50C.
> 
> Not tried to game any newer game as I'm not gaming,if I can count as newer game Asseto Corsa and there I can see 55-65C temps
> 
> Hope this helps there
> 
> Thanks,Jura


Thanks.
I will trade my cards with my friend.


----------



## jura11

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Okay, this really is a deal alert this time...
> 
> Refurb Tri-x OC For $215 with promo code....
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202146&cm_re=r9_290-_-14-202-146-_-Product
> 
> Crossfire? Yes, it is time.....


This is the one I've got; very happy with it, but the Vapor-X may still be better and a bit cooler too.

Thanks,Jura


----------



## ikem

Just ordered my MSI R9 290X 8GB.

Should be a good upgrade from a 7870... right?


----------



## Agent Smith1984

Uhhh.... YES!

Drastic upgrade!

Good for you sir!


----------



## Mega Man

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> i hope it helps, but yes, my writing was not.
> 
> i wont be able to help you. when i was in high school a buddy of mine screwed me
> 
> he just spent hours, and i mean hours shooting at cars and "racing" them
> 
> then after that, he would get 5 stars ( or w.e. ) and use the code to clear it.
> 
> i honestly did not know that game had a plot for ... 2-3 years.... ( GTA 3 ) so yea .... to this day i cant play it
> 
> 
> 
> you should pick it up it's came a long way...since then
Click to expand...

I tried 4, and the flashbacks came *starts twitching*, soon it was just like that *foams at the mouth*.

Yea, ruined.


----------



## Gobigorgohome

If I Crossfire an MSI Lightning R9 290X 4GB with a Sapphire Vapor-X 8GB, how much VRAM do I have then? 4GB or 8GB?

Also, could I Crossfire the R9 290X with a 280X?


----------



## Mega Man

At present, 4GB.
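For anyone wondering why the answer is the smaller number: with alternate-frame rendering each GPU keeps its own full copy of the frame data, so the usable pool is the minimum across cards, not the sum. A minimal sketch (the card names are just the two from the question above):

```python
# Effective VRAM of an AFR CrossFire pair is the minimum per-card amount,
# because textures and buffers are mirrored on every GPU rather than pooled.
def effective_vram_gb(cards: dict) -> int:
    return min(cards.values())

cards = {"MSI Lightning R9 290X": 4, "Sapphire Vapor-X R9 290X": 8}
print(effective_vram_gb(cards))  # prints 4
```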


----------



## cephelix

Ok guys, finally got around to OC-ing my MSI R9 290 Gaming. Since I forgot to take screenshots, I won't be able to show all the pretty little graphs and numbers associated with my quick-and-dirty OC. I would need help on how to proceed with this.

Currently the OC, though unstable, stands at:
Voltage: +44
Power Limit: +50%
Core Clock: 1100MHz
Memory Clock: 1400MHz

Temps are all in check, with a max of 78-80C. Looking at AB, loads and clocks are stable, so there isn't any throttling happening. I ran Valley on ultra, vsync off, 1200p and didn't see any artifacts, but in DA:I I get intermittent flickers of black.
Any tips on how to proceed? Should I drop memory clocks to stock and test in-game first? Or would it be a case of insufficient voltage?


----------



## jura11

Quote:


> Originally Posted by *cephelix*
> 
> Ok guys, finally got around to OC-ing my MSI R9 290 Gaming. Since I forgot to take screenshots, I won't be able to show all the pretty little graphs abd numbers associated with my quick dirty OC. Would need help on how to proceed with this.
> 
> Currently the OC,though unstable,stands at;
> Voltage: +44
> Power Limit: +50%
> Core Clock: 1100MHz
> Memory Clock: 1400MHz
> 
> Temps are all in check, with a max of 78-80C. Looking at AB, loads and clocks are stabe so there isn't any throttling happening. Ran Valley on ultra,vsync off, 1200p and didn't see any artifacts but in DA:I, i get intermittent flickers of black.
> Any tips on how to proceed? Should I drop memory clocks to stock and test ingame first?? Or would it be a case of insufficient voltage?


Hi there

Not sure, but I've got a Sapphire R9 290 Tri-X and here are my OC settings:

GPU Clock: 1150MHz
Mem Clock: 1500MHz
VDDC Offset: 156

My temps are pretty low; idle is around 42C right now at 20% fan. I've tested this in 3DMark and everything is fine. I haven't tried any games, but in all the benches I've tried everything seems stable, with no artifacts etc.

Hope this helps

Thanks,Jura


----------



## cephelix

Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> Not sure,but I've got Sapphire R9 290 Tri-X and here are my OC settings:
> 
> GPU Clock: 1150mhz
> Mem clock: 1500mhz
> VDDC Offset: 156
> 
> My temps are pretty low,idle is around 42C right now at 20%,have this tested in 3D Mark and everything is fine,didn't tried in any games,but in all benches which I've tried everything seems is stable and no artifacts etc
> 
> Hope this helps
> 
> Thanks,Jura


Thanks for that... my memory doesn't seem to want to go past 1500 on stock volts, and I don't want to add more since my load temps are already 80C. I will do more testing tonight and over the weekend. The max core temp the 290 can handle is about 95C before it starts throttling, right? So there shouldn't be any issue if I set an arbitrary limit for my load temps, say 85C? Luckily for me, my VRM1 temps are only 75C under load.
I would most likely have to add more volts. How does my card rank so far among other 290s? I see some getting 1200MHz on core...


----------



## pcrevolution

Quote:


> Originally Posted by *cephelix*
> 
> Thanks for that....my memory doesnt seem to want to go more then 1500 on stock volts.don't want to add more since my load temps are already 80C. Will do more testing tonight and over the weekend. The max core temps the 290 can handle is about 95C before it start throttling right? So there shouldn't be any issue if i set an arbitrary limit to my load temps, say 85C? Lucky for me my VRAM1 temps are only 75C on load.
> Would most likely have to add more volts. How does my card rank so far among the other 290s??i see some getting 1200MHz on core...


IIRC GDDR5 has automatic error correction, so you'll reach a point where no matter how much you bump the memory speed, you won't gain any performance. Yours is a stability issue, but it's perhaps something to bear in mind.


----------



## cephelix

Quote:


> Originally Posted by *pcrevolution*
> 
> IIRC GDDR5 has automatic error correction. You'll reach a point whereby no matter how much you bump the memory speed, you won't gain any performance. Although yours is a stability issue, but perhaps something to bear in mind.


Yeah, I noticed that. At this point I'd rather save the volts to increase my core clock rather than memory since, as has already been said in this thread, OC-ing memory has very little performance gain. Also, this is the first time I'm seriously OC-ing any GPU, so I would like to ask a noob question: assuming that all my temps are under control, without unlocking voltages, even if I max out the voltage slider in AB it shouldn't damage the card, right? I seem to remember reading somewhere that the actual voltage added is quite small even when the slider is maxed out. Do correct me if I'm wrong though.


----------



## jura11

Quote:


> Originally Posted by *cephelix*
> 
> Thanks for that....my memory doesnt seem to want to go more then 1500 on stock volts.don't want to add more since my load temps are already 80C. Will do more testing tonight and over the weekend. The max core temps the 290 can handle is about 95C before it start throttling right? So there shouldn't be any issue if i set an arbitrary limit to my load temps, say 85C? Lucky for me my VRAM1 temps are only 75C on load.
> Would most likely have to add more volts. How does my card rank so far among the other 290s??i see some getting 1200MHz on core...


Hi there

I've been using Sapphire's TRIXX software, which will allow you to tweak a bit more voltage too. I've also tried Afterburner, but with it I've been unable to get more volts.

Not sure, but it seems MSI cards don't like too many more MHz on the memory; from what I've seen, 1400-1450MHz is pretty normal on those cards, with 1130-1150MHz GPU clocks. Did you try adjusting the fans too?

Please have a look at this:

http://www.overclockers.com/msi-r290-gaming-4g-video-card-review/

Did you also try the Memory Info tool which is posted somewhere over here?

I know my Sapphire R9 290 Tri-X has Hynix RAM

Hope this helps

Thanks,Jura


----------



## Aussiejuggalo

Anyone else have their driver crash in GTA V?

I was watching the temps and VRAM usage; it popped up with a RAM warning (the stupid thing was showing 3386MB in GPU-Z), then about a minute later the driver locked up and crashed.


----------



## cephelix

Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> I've been using Sapphire supplied SW TRIXX which will allow you to tweak bit more voltage too,I've tried too Afterburner,but with this I've been unable to get more volts
> 
> Not sure,but seems MSI doesn't like too much more MHZ on memory,what I've seen 1400-1450mhz is pretty normal on those cards,1130-1150mhz GPU clocks and fans did you tried to adjust too ?
> 
> Please have look on this
> 
> http://www.overclockers.com/msi-r290-gaming-4g-video-card-review/
> 
> Did you tried too Memory Info which is posted somewhere over here ?
> 
> I know my Sapphire R9 290 Tri-X have Hynix RAM
> 
> Hope this helps
> 
> Thanks,Jura


My memory is also Hynix. I totally forgot about TRIXX; I'll have to try that once I've somewhat stabilised my current settings and see if I can push my card further. I don't know whether it's just my card or not, but only after multiple re-seating attempts and switching to CLU instead of normal thermal paste did I finally get my temps under control. I've already adjusted the fans; they're running at 70-80% at my current load temps. Not so much an issue since I game with headphones on, but I will try to lower them and see how much they affect temps.


----------



## pcrevolution

Quote:


> Originally Posted by *cephelix*
> 
> Yeah, i noticed that. At this point i'd rather save the volts to increase my core clock than memory since,as has already been said on this thread, oc-ing memory has very little performance gains. also the first time i'm seriously oc-ing any gpu so i would like to ask a noob qn. Assuming that all my temps are under control, without unlocking voltages, even if i max out the voltage slider in AB, it shouldn't damage the card right? i seem to remember reading somewhere that the actual voltage added is quite a small amount even when the slider is maxed out. Do correct me if i'm wrong though.


IMO, provided you don't force constant voltage: if you max out the slider, you are looking at +100mV, right? That should put you in the high 1.2x volts, still under 1.3V. I think that's a comfortable range for voltage as long as your cooling can manage it.

My 2c..
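The arithmetic behind that estimate, assuming a typical Hawaii stock load voltage of about 1.18V (the exact VID varies per card, so treat the number as illustrative):

```python
stock_vddc = 1.18       # volts; assumed typical stock load VID, varies per card
slider_cap = 0.100      # +100mV, the usual Afterburner limit without unlocking
peak = round(stock_vddc + slider_cap, 2)
print(peak, peak < 1.3)  # prints: 1.28 True -> stays under the ~1.3V comfort ceiling
```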


----------



## cephelix

Quote:


> Originally Posted by *pcrevolution*
> 
> IMO provided you don't force constant voltage, I think if you max out, you are looking at 100mv right? That should put you in the high 1.2x volts. Still under 1.3v. I think that's a comfortable range for voltage as long as your cooling can manage it.
> 
> My 2c..


That's what I thought. And yeah, I think I'm almost at the limit of what the non-reference cooler can handle. I don't force constant voltage though. Would disabling ULPS have any effect on just a single card? I know it's recommended for CFX setups.


----------



## pcrevolution

I don't think it affects single-card setups though.

I only had to check that box when I was using my old HD 7970 CFX setup.


----------



## cephelix

Quote:


> Originally Posted by *pcrevolution*
> 
> Don't think it affects single card set up though.
> 
> I only had to check that box when I was using my old HD7970 CFX set up.


Thanks for that... Hey! another Singaporean...


----------



## jura11

Quote:


> Originally Posted by *cephelix*
> 
> My memory is also Hynix. I totally forgot about Trixx. I would have to try that when i've somewhat stabilised my current settings and see if i can push my card further. I don't know whether it's just my card or not but even after multiple re-seating attempts and switching to CLU instead of just normal thermal pasts, i finally got my temps under control. I've already adjusted fans. They're running at 70 - 80% at my current load temps. Not so much an issue since I game with headphones on but will try to lower them and see how much they affect temps.


I know Hynix RAM usually overclocks better than Samsung, but sometimes Samsung does better; it's really hard to say why. A friend has an R9 290 with Samsung memory and can overclock his card much higher than mine.

Yes, TRIXX is a great utility.

I'm not sure why you're having issues with temps. I previously had a Gainward GTX 560 Ti on which I replaced the thermal paste every 3 months, as it seemed to dry out, and I didn't even play games on that card; I only used it for CUDA rendering. In one render I saw temps close to 100C.

About the fans, I'm not sure; mine are set to automatic, and in some LuxRender renders I can hear them as they sometimes run as high as 50-60%.

What thermal paste are you using?

Thanks, Jura


----------



## pcrevolution

Yeah SG!

I think many things really come down to the silicon lottery. Memory chips and their mileage also vary between batches, although certain manufacturers have always produced chips with certain tendencies.


----------



## cephelix

Quote:


> Originally Posted by *jura11*
> 
> I know Hynix RAM usually overclocks better than Samsung, but sometimes Samsung does better; it's really hard to say why. A friend has an R9 290 with Samsung memory and can overclock his card much higher than mine.
> 
> Yes, TRIXX is a great utility.
> 
> I'm not sure why you're having issues with temps. I previously had a Gainward GTX 560 Ti on which I replaced the thermal paste every 3 months, as it seemed to dry out, and I didn't even play games on that card; I only used it for CUDA rendering. In one render I saw temps close to 100C.
> 
> About the fans, I'm not sure; mine are set to automatic, and in some LuxRender renders I can hear them as they sometimes run as high as 50-60%.
> 
> What thermal paste are you using?
> 
> Thanks, Jura


I have no idea why my temps aren't that great. When I first got the card and tried Valley, core temps never exceeded 60C at stock, even with less aggressive fan profiles. Currently I'm using Coollaboratory Liquid Ultra, a liquid-metal paste, which gives me the best temps.
Quote:


> Originally Posted by *pcrevolution*
> 
> Yeah SG!
> 
> I think many things are really dependent on silicon lottery. Memory chips and their mileage also vary with batches although certain manufacturers have always produced chips with a certain tendency.


Yeah, it is the silicon lottery, with which I have mediocre luck; not the worst, but nothing to get excited over. I was tempted to get another 290 for CrossFire, but looking at my temps now, I don't think my case can handle it.


----------



## kizwan

Quote:


> Originally Posted by *cephelix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *jura11*
> 
> Hi there
> 
> Not sure, but I've got a Sapphire R9 290 Tri-X and here are my OC settings:
> 
> GPU clock: 1150MHz
> Mem clock: 1500MHz
> VDDC offset: 156
> 
> My temps are pretty low; idle is around 42C right now at 20% fan. I've tested this in 3DMark and everything is fine. I haven't tried any games, but in all the benches I've run everything seems stable, with no artifacts, etc.
> 
> Hope this helps
> 
> Thanks,Jura
> 
> 
> 
> Thanks for that... my memory doesn't seem to want to go past 1500 on stock volts, and I don't want to add more since my load temps are already 80C. Will do more testing tonight and over the weekend. The max core temp the 290 can handle is about 95C before it starts throttling, right? So there shouldn't be any issue if I set an arbitrary limit for my load temps, say 85C? Luckily my VRAM1 temps are only 75C under load.
> I'd most likely have to add more volts. How does my card rank so far among the other 290s? I see some hitting 1200MHz on the core...
Click to expand...

If you lower the target temp limit to 85C, your card will start throttling earlier. Leave it at 95C; what you need to do is increase the fan speed.
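As a rough illustration of why a lower target bites earlier, here is a toy PowerTune-style throttle model; the boost clock, step size, and one-step-per-degree rate are invented for the sketch, not AMD's actual control loop.

```python
# Toy model: once GPU temp exceeds the target, drop the clock in steps.
def effective_clock(temp_c, target_c, boost_mhz=1000, step_mhz=50):
    """Illustrative throttle: roughly one 50 MHz step per degree over target."""
    if temp_c <= target_c:
        return boost_mhz
    steps_over = int(temp_c - target_c)
    return max(300, boost_mhz - step_mhz * steps_over)

# Same 90 C load: a 95 C target keeps full boost, an 85 C target throttles.
print(effective_clock(90, 95))  # 1000
print(effective_clock(90, 85))  # 750
```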
Quote:


> Originally Posted by *cephelix*
> 
> I have no idea why my temps aren't that great. When I first got the card and tried Valley, core temps never exceeded 60C at stock, even with less aggressive fan profiles. Currently I'm using Coollaboratory Liquid Ultra, a liquid-metal paste, which gives me the best temps.


I'm confused. Did you get good temps with CLU?


----------



## cephelix

Quote:


> Originally Posted by *kizwan*
> 
> If you lower target temp limit to 85C, your card will start throttling earlier. Leave it to 95C. What you need to do is increase the fan speed.


Oh, I don't mean changing the card's target temp via software or anything... it's just a value I have in my head...


----------



## DividebyZERO

Well, if you're wondering what quad 4K Eyefinity in landscape looks like, look no further. LOL (I'm sure this will get overlooked, but hell, someone has to do it for funsies!)







The refresh rate is 30Hz on 3 of the screens; I'm testing a 60Hz 4K panel and just wanted to run this to test them all on the 290X, btw.


----------



## kizwan

Quote:


> Originally Posted by *DividebyZERO*
> 
> Well, if you're wondering what quad 4K Eyefinity in landscape looks like, look no further. LOL (I'm sure this will get overlooked, but hell, someone has to do it for funsies!)
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> The refresh rate is 30Hz on 3 of the screens; I'm testing a 60Hz 4K panel and just wanted to run this to test them all on the 290X, btw.


Yeah, I couldn't help but overlook your post. What is it, exactly? Haha, just kidding. Nice!







I think I missed this info but what monitors do you have there?


----------



## cokker

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Anyone else had their driver crash on GTA V?
> 
> I was watching the temps and VRAM usage; it popped up with a RAM warning (the stupid thing was showing 3386MB in GPU-Z), then about a minute later the driver locked up and crashed.


An unstable OC caused a crash after an hour or so for me. Alt-tabbing also causes a memory leak; I've seen my system usage hit 7GB after a while.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *cokker*
> 
> An unstable OC caused a crash after an hour or so for me. Alt-tabbing also causes a memory leak; I've seen my system usage hit 7GB after a while.


My problem ended up being my 2GB pagefile; the game eats that damn thing. I did notice it doesn't like it when you alt-tab, even when playing windowed.

I'm going to try the beta drivers tomorrow, though, and see how it goes; hopefully it'll be more stable and give me more FPS.


----------



## cephelix

Finally, mostly done with my OC. For now these are my settings; if they cause weird stuff to happen in game, I'll up the voltage a bit more. Tested with the Heaven benchmark and a couple of hours of DAI at Ultra (MSAA off) at 1200p.




Temps reach a max of 85C in long gaming sessions, though, and there's no way for me to reduce that unless I go water. The fan profile is already quite aggressive: a 10% increase in speed per 10C rise in temperature.
Any help would be greatly appreciated.

Had to uninstall MSI AB, though. It was giving me a weird graphical glitch in DAI, but when I swapped over to TriXX and entered the same numbers, I got no glitches.
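The fan profile described above (roughly +10% speed per +10C) is just a linear curve. A sketch with assumed anchor points, since the post only gives the slope, not the starting temperature or floor speed:

```python
# Linear fan curve: +1% per degree C (i.e. +10% per 10 C) above a base point.
# base_temp and base_pct are assumptions; the post only specifies the slope.
def fan_speed_pct(temp_c, base_temp=40.0, base_pct=30.0):
    """Fan duty in percent, clamped between the floor and 100%."""
    pct = base_pct + (temp_c - base_temp)
    return max(base_pct, min(100.0, pct))

print(fan_speed_pct(80))   # 70.0 -- aggressive, but not yet maxed, at 80 C
print(fan_speed_pct(120))  # 100.0 -- clamped at full speed
```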


----------



## pcrevolution

Quote:


> Originally Posted by *cephelix*
> 
> Finally, mostly done with my OC. For now these are my settings; if they cause weird stuff to happen in game, I'll up the voltage a bit more. Tested with the Heaven benchmark and a couple of hours of DAI at Ultra (MSAA off) at 1200p.
> 
> 
> 
> 
> Temps reach a max of 85C in long gaming sessions, though, and there's no way for me to reduce that unless I go water. The fan profile is already quite aggressive: a 10% increase in speed per 10C rise in temperature.
> Any help would be greatly appreciated.
> 
> Had to uninstall MSI AB, though. It was giving me a weird graphical glitch in DAI, but when I swapped over to TriXX and entered the same numbers, I got no glitches.


1100MHz at 1.1V is pretty good already, IMO.


----------



## cephelix

Quote:


> Originally Posted by *pcrevolution*
> 
> 1100MHz at 1.1V is pretty good already, IMO.


Good to know... Then again, this was tested at 26C ambient, and my case sounded like a very soft vacuum cleaner.


----------



## DividebyZERO

Quote:


> Originally Posted by *kizwan*
> 
> Yeah, I couldn't help but overlook your post. What is it, exactly? Haha, just kidding. Nice!
> 
> 
> 
> 
> 
> 
> 
> I think I missed this info but what monitors do you have there?


Three Seiki 39-inch panels, and I'm testing a Crossover 44K 40-inch. I figured, why not do 4 just for fun? The fact that it ran, and ran okay, surprised me.


----------



## BradleyW

Just tried GTA V. CFX gives me no fps advantage, with intense stuttering. The mouse is also too laggy. Completely unplayable; the worst-performing game I've played to date.

1080p, FXAA, all settings high, advanced settings off, tessellation off. VRAM under 3GB.


----------



## rdr09

Quote:


> Originally Posted by *BradleyW*
> 
> Just tried GTA V. CFX gives me no fps advantage, with intense stuttering. The mouse is also too laggy. Completely unplayable; the worst-performing game I've played to date.
> 
> 1080p, FXAA, all settings high, advanced settings off, tessellation off. VRAM under 3GB.


Others are having a grand time. Try this . . .

http://www.overclock.net/t/988215/how-to-remove-your-amd-ati-gpu-drivers

My bad, you made that. lol.

I've never used it; I just install over the old drivers. Been doing that since Omega; takes like 10 minutes.


----------



## LandonAaron

Quote:


> Originally Posted by *BradleyW*
> 
> Just tried GTA V. CFX gives me no fps advantage with intense stuttering. Mouse is also too laggy. Completely unplayable. Worst performing game I've played to date.
> 
> 1080p, FXAA, all settings high, advanced settings off, tessellation off. VRAM under 3GB.


That's strange. I get basically exactly double the fps with CrossFire enabled. The mouse does cause stuttering, though: basically, if you turn faster than you could with a controller, it stutters; beyond that, performance and fps are good.


----------



## BradleyW

Quote:


> Originally Posted by *rdr09*
> 
> Others are having a grand time. Try this . . .
> 
> http://www.overclock.net/t/988215/how-to-remove-your-amd-ati-gpu-drivers
> 
> My bad, you made that. lol.
> 
> I've never used it; I just install over the old drivers. Been doing that since Omega; takes like 10 minutes.


All 6 cores are pegged at 99% if I turn the advanced distance features on, so I guess it's time to move up to a 5960X.


----------



## rdr09

Quote:


> Originally Posted by *BradleyW*
> 
> All 6 cores are pegged at 99% if I turn the advanced distance features on, so I guess it's time to move up to a 5960X.


here is carlo with just a 3770K . . .






I never had issues with my i7 SB and two 290s. The only reason I built an X99 is for the 300 series. Might be your VRAM.


----------



## BradleyW

Quote:


> Originally Posted by *rdr09*
> 
> here is carlo with just a 3770K . . .
> 
> 
> 
> 
> 
> 
> I never had issues with my i7 SB and two 290s. The only reason I built an X99 is for the 300 series. Might be your VRAM.


VRAM isn't the issue because I'm not reaching the 4096MB limit. Turning those advanced sliders up halfway puts my fps down to 40; all the way reduces it to around 25. At that point the CPU is at 99% usage and I have hardly any GPU usage. This game is tremendously CPU-intensive for me. And remember, I'm at 1080p.


----------



## gatygun

After playing some 5K Skylines, I found my CPU was starting to hit meltdown temps, so I had to change the paste under it and clean it out a bit.

I decided to turn the coolers around so they now soak up the heat from the GPU and push it towards the top of the case. This works surprisingly well for my 290 Tri-X: at a 1145/1450 OC I now get a max temp of 67C (5 degrees cooler) and the fan spins 4% slower on top of that. My CPU temps also dropped hugely, from 90C down to 70C at max.

Here's a pic of it.


Quote:


> Originally Posted by *cephelix*
> 
> Finally, mostly done with my OC. For now this would be my settings and if it causes weird stuff to happen in game, then I'd up the voltage a bit more. Tested with Heaven benchmark and a couple of hours of DAI at UIltra except for MSAA(off) at 1200p
> 
> 
> 
> 
> Temps reach a max of 85C in long gaming sessions though. and there is no way for me to reduce that unless I go water. Fan profile is already quite aggressive, 10% increase in rpm per 10C temp increase.
> Any help would be greatly appreciated.
> 
> Has to uninstall MSI AB though. It was giving me a weird graphical glitch in DAI, but when I swapped over to TriXX and input the same numbers, I got no glitches.


That's a solid clock for just +50, though. I'm only 45MHz higher on the core and that requires me to push +100.


----------



## kzone75

What thermal paste are you guys recommending for GPUs these days? I'd like to replace it on my R9 290.


----------



## thrgk

So is it OK to run +200mV on four 290Xs, all water cooled?


----------



## DividebyZERO

Quote:


> Originally Posted by *thrgk*
> 
> So is it OK to run +200mV on four 290Xs, all water cooled?


If you have enough PSU, sure. I've done it, but my PSU was tapped out depending on the bench/test. As soon as I can find the time I'll split across two PSUs.
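A back-of-envelope check of the PSU question; the per-card draw at +200mV and the rest-of-system figure are assumptions for illustration, not measurements of any particular rig.

```python
# Rough PSU sizing for heavily over-volted 290Xs under water.
CARD_W = 375            # assumed per-card draw at +200 mV under heavy load
REST_OF_SYSTEM_W = 300  # assumed CPU, board, drives, pumps
HEADROOM = 0.85         # keep sustained load at or below 85% of the rating

def min_psu_watts(cards):
    """Smallest combined PSU rating that keeps the load inside the margin."""
    total_w = cards * CARD_W + REST_OF_SYSTEM_W
    return int(total_w / HEADROOM)

print(min_psu_watts(4))  # 2117 -- why splitting across two PSUs makes sense
```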


----------



## thrgk

Yeah, I've got 2 PSUs: a 750W for two of the 290Xs, and a 1200W for the rest of my goods plus the other two 290Xs.


----------



## jura11

Quote:


> Originally Posted by *gatygun*
> 
> After playing some 5K Skylines, I found my CPU was starting to hit meltdown temps, so I had to change the paste under it and clean it out a bit.
> 
> I decided to turn the coolers around so they now soak up the heat from the GPU and push it towards the top of the case. This works surprisingly well for my 290 Tri-X: at a 1145/1450 OC I now get a max temp of 67C (5 degrees cooler) and the fan spins 4% slower on top of that. My CPU temps also dropped hugely, from 90C down to 70C at max.
> 
> Here's a pic of it.
> 
> 
> That's a solid clock for just +50, though. I'm only 45MHz higher on the core and that requires me to push +100.


Hi there

I would change the orientation of the CPU cooler. I once had mine like yours and it heated the CPU unnecessarily; I changed the orientation towards the case exhaust and swapped the exhaust fan for a Thermalright TY-147, and my temps dropped on both the GPU and CPU. That's my view; everyone can have a different opinion.









Thanks,Jura


----------



## pcrevolution

Quote:


> Originally Posted by *kzone75*
> 
> What thermal paste are you guys recommending for GPUs these days? I'd like to replace it on my R9 290.


I usually stick with AC MX-4 among non-conductive TIMs.

CLU has its benefits, but it's electrically conductive and incompatible with aluminium.


----------



## pcrevolution

Quote:


> Originally Posted by *thrgk*
> 
> so is it ok to run at +200mv on 4 290x's all water cooled?


Over-volting can speed up electromigration, but given that you run them under water, the lower temperatures mitigate it somewhat.
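The temperature argument can be made concrete with Black's equation for electromigration lifetime (MTTF proportional to J^-n * exp(Ea/kT)); the activation energy below is a textbook-style assumption, not a value for Hawaii silicon, so the multiplier is only indicative.

```python
import math

# Relative electromigration lifetime from Black's equation, holding current
# density fixed and varying only junction temperature.
K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant, eV/K
ACTIVATION_EV = 0.7        # assumed activation energy for the interconnect

def relative_mttf(hot_c, cool_c):
    """Lifetime multiplier gained by running at cool_c instead of hot_c."""
    t_hot, t_cool = hot_c + 273.15, cool_c + 273.15
    return math.exp(ACTIVATION_EV / K_BOLTZMANN_EV * (1 / t_cool - 1 / t_hot))

# Water at ~55 C vs air at ~90 C: roughly an order of magnitude longer life.
print(round(relative_mttf(90.0, 55.0), 1))
```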


----------



## cephelix

Quote:


> Originally Posted by *gatygun*
> 
> After playing some 5K Skylines, I found my CPU was starting to hit meltdown temps, so I had to change the paste under it and clean it out a bit.
> 
> I decided to turn the coolers around so they now soak up the heat from the GPU and push it towards the top of the case. This works surprisingly well for my 290 Tri-X: at a 1145/1450 OC I now get a max temp of 67C (5 degrees cooler) and the fan spins 4% slower on top of that. My CPU temps also dropped hugely, from 90C down to 70C at max.
> 
> Here's a pic of it.
> 
> 
> That's a solid clock for just +50, though. I'm only 45MHz higher on the core and that requires me to push +100.


I don't think I can handle another +50 since my temps are already maxing out. What are your temps like? It seems stable; I haven't tested it in other games besides DAI and Heaven/Valley, and I get no artifacts. Maybe I'll reinstall AC Black Flag and see how it does. I mainly play RPGs, so those are the games I'll be testing with.

Quote:


> Originally Posted by *kzone75*
> 
> What thermal paste are you guys recommending for GPUs these days? I'd like to replace it on my R9 290.


I went for broke and just used CLU between the die and heatsink; it brought my temps down 10C or thereabouts. I also strapped an exhaust fan to the unused PCIe slots under the GPU, which initially brought temps down another 5C and made my case and card heat up much more slowly than before.
Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> I would change the orientation of the CPU cooler. I once had mine like yours and it heated the CPU unnecessarily; I changed the orientation towards the case exhaust and swapped the exhaust fan for a Thermalright TY-147, and my temps dropped on both the GPU and CPU. That's my view; everyone can have a different opinion.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks,Jura


Agreed. Could it be the initial mounting pressure? What I find helpful is an exhaust fan with higher CFM/static pressure than the ones on the heatsink, set to a steeper rpm curve, to help exhaust the heated air coming off the GPU. But it really depends on his setup.

On a side note, I just got a random shutdown with the source listed as "thermald" and no event ID. Anyone know what it is?


----------



## kizwan

It appears to be part of System Information Viewer from Gigabyte. The system utilities that come with motherboards are usually useless junk and a nuisance; uninstalling it would be a good idea.


----------



## cephelix

Quote:


> Originally Posted by *kizwan*
> 
> It appears to be part of System Information Viewer from Gigabyte. The system utilities that come with motherboards are usually useless junk and a nuisance; uninstalling it would be a good idea.


I agree. But it's so helpful to be able to control my fans from SIV; that's mainly what I use it for. This is the first time I've seen this error, though. As long as it's not down to me messing around with my system, it doesn't really pose a problem; if it BSODs again, I'll uninstall it. But SpeedFan sucksssss.


----------



## jura11

Quote:


> Originally Posted by *cephelix*
> 
> Finally, mostly done with my OC. For now these are my settings; if they cause weird stuff to happen in game, I'll up the voltage a bit more. Tested with the Heaven benchmark and a couple of hours of DAI at Ultra (MSAA off) at 1200p.
> 
> 
> 
> 
> Temps reach a max of 85C in long gaming sessions, though, and there's no way for me to reduce that unless I go water. The fan profile is already quite aggressive: a 10% increase in speed per 10C rise in temperature.
> Any help would be greatly appreciated.
> 
> Had to uninstall MSI AB, though. It was giving me a weird graphical glitch in DAI, but when I swapped over to TriXX and entered the same numbers, I got no glitches.


Hi there

Pretty good overclocking results too. I'm attaching my results; they're lower than yours, that's for sure, but the temperatures are lower too. My highest temp was 75C during the Unigine benchmark. I'm not sure if that's down to the older drivers (I'm using older drivers because newer ones give me issues in 3D software); my fans were at 42%.

I've uninstalled MSI AB too; I tried it only once and had a few issues with it as well.

Here are the results



Thanks,Jura


----------



## cephelix

Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> Pretty good overclocking results too. I'm attaching my results; they're lower than yours, that's for sure, but the temperatures are lower too. My highest temp was 75C during the Unigine benchmark. I'm not sure if that's down to the older drivers (I'm using older drivers because newer ones give me issues in 3D software); my fans were at 42%.
> 
> I've uninstalled MSI AB too; I tried it only once and had a few issues with it as well.
> 
> Here are the results
> 
> 
> 
> Thanks,Jura


Your temps are indeed lower... much, much lower. 75C is what I get at stock clocks under load. If only I could push 50 more without having to increase voltages; sadly, at my current voltage I can only add another 10MHz to the core clock before things start going downhill...


----------



## LandonAaron

Quote:


> Originally Posted by *kzone75*
> 
> What thermal paste are you guys recommending for GPUs these days? I'd like to replace it on my R9 290.


I like IC Diamond, Prolimatech PK-3, or Gelid GC-Extreme. Gelid is probably the easiest to apply, as it's much thinner than the other two, though I think the others may give a degree or two better temps. IC Diamond tends to leave marks/stains on blocks, but it has the best temps and is non-conductive. Remember that on direct-die applications, like AMD GPUs, you need total coverage of the die, unlike putting thermal paste on an IHS where it's okay for the corners to be bare. So if you use the rice/pea method, double-check that you're using enough for it to spread to the edges.


----------



## jura11

Quote:


> Originally Posted by *cephelix*
> 
> Your temps are indeed lower... much, much lower. 75C is what I get at stock clocks under load. If only I could push 50 more without having to increase voltages; sadly, at my current voltage I can only add another 10MHz to the core clock before things start going downhill...


Hi there

A lot depends on the case you have, and on cable management too. A friend of mine recently had the same issue with higher GPU temps and his cable management was very poor (he did a very poor job), but that came down to his case.

I see you have an Obsidian, which should have nice airflow. I've got a HAF-X; it's a good old case that has proven itself in my numerous builds, although I'd love a Lian Li.

Plus it depends on other factors too: ambient temperature, etc.

Thanks,Jura


----------



## cephelix

Quote:


> Originally Posted by *LandonAaron*
> 
> I like IC Diamond, Prolimatech PK-3, or Gelid GC-Extreme. Gelid is probably the easiest to apply, as it's much thinner than the other two, though I think the others may give a degree or two better temps. IC Diamond tends to leave marks/stains on blocks, but it has the best temps and is non-conductive. Remember that on direct-die applications, like AMD GPUs, you need total coverage of the die, unlike putting thermal paste on an IHS where it's okay for the corners to be bare. So if you use the rice/pea method, double-check that you're using enough for it to spread to the edges.


Yep, CLU is conductive, so if you do want to use it you have to be very careful. I applied it using the same method others use when applying it to a CPU. Since I'll most likely be keeping the card, now that it maxes out my games, I'm just going to do whatever I want to it.
Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> A lot depends on the case you have, and on cable management too. A friend of mine recently had the same issue with higher GPU temps and his cable management was very poor (he did a very poor job), but that came down to his case.
> 
> I see you have an Obsidian, which should have nice airflow. I've got a HAF-X; it's a good old case that has proven itself in my numerous builds, although I'd love a Lian Li.
> 
> Plus it depends on other factors too: ambient temperature, etc.
> 
> Thanks,Jura


Cable management is good; barely any wires hanging about, all tucked away in the back. I'll take pics when I have the time. Ambients are about 26C. I do find the case has a stagnant area of air around the GPU, hence the addition of the PCIe-slot exhaust.


----------



## jura11

Quote:


> Originally Posted by *cephelix*
> 
> Yep, CLU is conductive, so if you do want to use it you have to be very careful. I applied it using the same method others use when applying it to a CPU. Since I'll most likely be keeping the card, now that it maxes out my games, I'm just going to do whatever I want to it.
> Cable management is good; barely any wires hanging about, all tucked away in the back. I'll take pics when I have the time. Ambients are about 26C. I do find the case has a stagnant area of air around the GPU, hence the addition of the PCIe-slot exhaust.


Hi there

I've personally only used Shin-Etsu X-23-7783D on CPUs or GPUs in the past, but I've also tried the AC MX series, which in my case didn't give the best results.
I would say this could be down to higher ambient temperatures, or maybe the thermal paste; it's hard to say, as no two GPU coolers are the same. I'll speak with my friend; he has the same GPU as you and definitely gets lower temps on his MSI.

Thanks,Jura


----------



## cephelix

Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> I've personally only used Shin-Etsu X-23-7783D on CPUs or GPUs in the past, but I've also tried the AC MX series, which in my case didn't give the best results.
> I would say this could be down to higher ambient temperatures, or maybe the thermal paste; it's hard to say, as no two GPU coolers are the same. I'll speak with my friend; he has the same GPU as you and definitely gets lower temps on his MSI.
> 
> Thanks,Jura


I've seen good reviews of Shin-Etsu thermal pastes, but I decided on GC Extreme instead since it came highly recommended by the good folks on OCN. That, and CLU. I must say, though, since joining OCN I've done a lot more to my current build than I ever did to my old one. Currently on the hunt for more Gentle Typhoons...

I appreciate you going out of your way to help me find temps for my card.








I find the temps on my card weird as well. They went up about 10-15 degrees at load at stock compared to when the card was first purchased. Granted, I have messed about with it since then, changing pastes and pads.


----------



## Tonza

Does anyone know if the Accelero Xtreme IV can be installed on a DCII card with the stock Asus backplate and VRM cooling (just using the mem heatsinks from the package)? If not, I'm going to pass and probably RMA my card, since it gets over 90C in gaming even at stock clocks and 60% fan speed, which sounds like a jet engine and a toaster combined (it shouldn't be this bad, right?). The case is an Arc Mini R2, which has more than enough airflow; all case fans are AF120/AF140s from Corsair.


----------



## mAs81

Quote:


> Originally Posted by *Tonza*
> 
> Does anyone know if the Accelero Xtreme IV can be installed on a DCII card with the stock Asus backplate and VRM cooling (just using the mem heatsinks from the package)? If not, I'm going to pass and probably RMA my card, since it gets over 90C in gaming even at stock clocks and 60% fan speed, which sounds like a jet engine and a toaster combined (it shouldn't be this bad, right?). The case is an Arc Mini R2, which has more than enough airflow; all case fans are AF120/AF140s from Corsair.


Taken from their site :



So I'm guessing no. I don't know if you can fit it with some modding or something; I have never used it before...


----------



## jura11

Quote:


> Originally Posted by *cephelix*
> 
> I've seen good reviews of Shin-Etsu thermal pastes, but I decided on GC Extreme instead since it came highly recommended by the good folks on OCN. That, and CLU. I must say, though, since joining OCN I've done a lot more to my current build than I ever did to my old one. Currently on the hunt for more Gentle Typhoons...
> 
> I appreciate you going out of your way to help me find temps for my card.
> 
> 
> 
> 
> 
> 
> 
> 
> I find the temps on my card weird as well. They went up about 10-15 degrees at load at stock compared to when the card was first purchased. Granted, I have messed about with it since then, changing pastes and pads.


Hi there

I went with Shin-Etsu after trying a few pastes that gave me higher temps. Right now I'm happy with Shin-Etsu; maybe I'll try a better paste later on.

Yes, modding is sometimes very enjoyable and sometimes brings more and more problems, like with everything; once you start modifying, you want to do a bit more and more. I wanted to go with water cooling, but I'm still a bit scared to run it. I have run a Corsair Hydro, which is an AIO, but it was so loud that I switched to a Thermalright HR02 Macho B/W and I'm very happy.

My friend sent me a message; his temperatures are worse than yours: 94C while playing Sleeping Dogs, and around 91C in Unigine too. I think this MSI has a flawed fan design. He'll be putting on new paste this weekend; I think he'll be using Shin-Etsu.

Something is wrong; I would try to RMA that card and go with a Sapphire R9 290 Tri-X OC or the Vapor-X series. As you said, your temps are weird too. Did you check that the fans are spinning, etc.?

Thanks,Jura


----------



## Tonza

Quote:


> Originally Posted by *mAs81*
> 
> Taken from their site :
> 
> 
> 
> So,I'm guessing no , don't know if you can fit it tho with some modding or something:I have never used it before..


Yeah, I know, but I have seen a picture of a 780 Ti DCII with an Accelero Xtreme III installed along with the backplate :/ So confusing.


----------



## cephelix

Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> I went with Shin-Etsu after trying a few pastes that gave me higher temps. Right now I'm happy with Shin-Etsu; maybe I'll try a better paste later on.
> 
> Yes, modding is sometimes very enjoyable and sometimes brings more and more problems, like with everything; once you start modifying, you want to do a bit more and more. I wanted to go with water cooling, but I'm still a bit scared to run it. I have run a Corsair Hydro, which is an AIO, but it was so loud that I switched to a Thermalright HR02 Macho B/W and I'm very happy.
> 
> My friend sent me a message; his temperatures are worse than yours: 94C while playing Sleeping Dogs, and around 91C in Unigine too. I think this MSI has a flawed fan design. He'll be putting on new paste this weekend; I think he'll be using Shin-Etsu.
> 
> Something is wrong; I would try to RMA that card and go with a Sapphire R9 290 Tri-X OC or the Vapor-X series. As you said, your temps are weird too. Did you check that the fans are spinning, etc.?
> 
> Thanks,Jura


Well, if you don't mind staining your heatsink, I do suggest CLU. Just be sure to insulate the components (VRMs?) around the die area first; I coated mine with a thin layer of nail polish. BAM, a 10C drop at load. I don't think I'd achieve better cooling unless I swapped the cooler for something like the AC Xtreme IV. I've already tried water, and I got so paranoid about pump failure and leaks that I decided to go back to air and be done with it; so I went out, bought the mother of all CPU coolers, and added high-static-pressure fans to my case, as well as making sure I have proper airflow through it.
Can't be arsed to RMA the card, tbh. It performs to spec: the fans all run, and at 80C they run at 80%. If the stock fans ever crap out, I'll strap two 120mm fans to it... Nothing wrong with it. What's weird is yours. Is it normal for the Tri-X to run at those temps? I know the Tri-X has one of the best cooling solutions for the 290.


----------



## mAs81

Quote:


> Originally Posted by *Tonza*
> 
> Yeah, I know, but I have seen a picture of a 780 Ti DCII with an Accelero Xtreme III installed along with the backplate :/ So confusing.


Well, apparently the AC Xtreme III is compatible with the 780 Ti according to their site, and I have seen pictures of it too, I think, but I still believe the extended-PCB restriction applies in the case of your GPU...


Spoiler: Warning: Spoiler!


----------



## rdr09

Quote:


> Originally Posted by *DividebyZERO*
> 
> Well, if you're wondering what quad 4K Eyefinity in landscape looks like, look no further. LOL (I'm sure this will get overlooked, but hell, someone has to do it for funsies!)
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The refresh rate is 30Hz on 3 of the screens; I'm testing a 60Hz 4K panel and just wanted to run this to test them all on the 290X, btw.


Nice. I only have one and I love it.


----------



## jura11

Quote:


> Originally Posted by *cephelix*
> 
> Well, if you don't mind staining your heatsink, I do suggest CLU. Just be sure to insulate the components (VRMs?) around the die area first. I coated mine with a thin layer of nail polish. BAM! 10C drop on load. I don't think I'd be able to achieve better cooling unless I swap the cooler for something like the AC Xtreme IV. I've already tried water, and I got so paranoid about pump failure and leaks that I decided to just go back to air and be done with it... so I went out and bought the mother of all CPU coolers, added high static pressure fans to my case, and made sure I have proper airflow through the case.
> Cannot be arsed to RMA the card, tbh. It performs as per spec, all the fans run, and at 80C my fans run at 80%. If the stock fans ever decide to crap out I'll be strapping 2x120mm fans to it... Nothing wrong with it. What's weird is yours. Is it normal for the Tri-X to be running at those temps? I know the Tri-X has one of the best cooling solutions for the 290.


Hi there

I will be swapping the thermal paste in the near future. My temps are already pretty good; what is slowing me down is my motherboard. It's a P6T SE and it's not the best (OC is sometimes more difficult with higher BCLK), but I'm still very happy, because I can use a Xeon CPU on this MB.

Tell me about being paranoid; I've seen what happened to a friend's PC after his pump failed. I almost bought a full XSPC kit but backed out at the last minute. Yes, the Noctua is the best CPU cooler and I wanted to get one too, but when I looked they were all out of stock and I needed one I could have the next day; this one had good reviews, so I went with it.

You will see how you get on with yours, and good luck. Did you try having a look at the backplate to see if it's tightened?

Please have a look at this thread:

http://www.techpowerup.com/forums/threads/msi-r9-290-oc-temps-at-94c-while-gaming.201665/

Not sure about mine. I've had this Tri-X for around 6 weeks, and I use this GPU mostly in LuxRender or other OpenCL renderers, and now with V-Ray too, where I've gained a lot. Temps in those renderers are pretty low, around 65C, with the GPU usually running at 100%. I've played a few games like Assetto Corsa, rFactor, and Live for Speed, which I play most, and temps have always been good there too, around 65-70C. In 3DMark Fire Strike, no issues at all at 75C. Not sure why, but yes, I think this Sapphire has good temps.

A friend has the Asus DCII and his temps are pretty high too (90-94C); he RMA'd the card and got a Vapor-X as a replacement.

Thanks, Jura


----------



## blackhole2013

Can anybody tell me the optimal settings for GTA V with my 290X 8GB at 1150/1450, or where to find the settings? I don't want to play till I get the best quality from my card...


----------



## cephelix

Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> I will be swapping the thermal paste in the near future. My temps are already pretty good; what is slowing me down is my motherboard. It's a P6T SE and it's not the best (OC is sometimes more difficult with higher BCLK), but I'm still very happy, because I can use a Xeon CPU on this MB.
> 
> Tell me about being paranoid; I've seen what happened to a friend's PC after his pump failed. I almost bought a full XSPC kit but backed out at the last minute. Yes, the Noctua is the best CPU cooler and I wanted to get one too, but when I looked they were all out of stock and I needed one I could have the next day; this one had good reviews, so I went with it.
> 
> You will see how you get on with yours, and good luck. Did you try having a look at the backplate to see if it's tightened?
> 
> Please have a look at this thread:
> 
> http://www.techpowerup.com/forums/threads/msi-r9-290-oc-temps-at-94c-while-gaming.201665/
> 
> Not sure about mine. I've had this Tri-X for around 6 weeks, and I use this GPU mostly in LuxRender or other OpenCL renderers, and now with V-Ray too, where I've gained a lot. Temps in those renderers are pretty low, around 65C, with the GPU usually running at 100%. I've played a few games like Assetto Corsa, rFactor, and Live for Speed, which I play most, and temps have always been good there too, around 65-70C. In 3DMark Fire Strike, no issues at all at 75C. Not sure why, but yes, I think this Sapphire has good temps.
> 
> A friend has the Asus DCII and his temps are pretty high too (90-94C); he RMA'd the card and got a Vapor-X as a replacement.
> 
> Thanks, Jura


Yeah, the backplate screws are all tightened. If I'm truly unhappy with mine I'll probably go with the 390(X), depending on reviews, but for now I'm a happy camper. I've already spent quite a bit of cash on my setup; now it's just about tweaking and enjoying it. It does everything I need it to do and then some. I'm only gaming at 1200p/60Hz, which is not so demanding. I was tempted to get the 290X Vapor-X, but it's not sold locally and I would have to purchase from overseas; with the currency conversion it comes up to about SGD405, which is just $5 over the limit for tax-free overseas purchases.

The Noctuas are great, but your cooler is no slouch either; I just can't seem to find any of the other top coolers besides Noctua here. I might be changing the fans for Thermalright's TY-147, though. And I just placed an order for 4 x Gentle Typhoon 1850rpm... today is a good day...

Guys, I have an idea and would like opinions on whether it'd be useful or just a waste of time. My MSI R9 290 Gaming runs at about 85C on load, overclocked, and the backplate area right under the die has a cutout. In an attempt to reduce core temps further, would copper heatsinks and thermal pads placed in that area and secured provide a useful reduction in temps? Do take note that I have all the materials on hand and would not need to purchase anything for this if it's worth considering.


----------



## blackhole2013

Question: why does my 290X's usage while playing games jump up and down, like 100%, then 10%, then 100%?


----------



## cephelix

Quote:


> Originally Posted by *blackhole2013*
> 
> Question: why does my 290X's usage while playing games jump up and down, like 100%, then 10%, then 100%?


What games? What are your load temps like?


----------



## kizwan

Quote:


> Originally Posted by *jura11*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cephelix*
> 
> Finally, mostly done with my OC. For now these would be my settings, and if they cause weird stuff to happen in game, I'd up the voltage a bit more. Tested with the Heaven benchmark and a couple of hours of DAI at Ultra, except for MSAA (off), at 1200p.
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Temps reach a max of 85C in long gaming sessions, though, and there is no way for me to reduce that unless I go water. The fan profile is already quite aggressive: a 10% increase in rpm per 10C temp increase.
> Any help would be greatly appreciated.
> 
> Had to uninstall MSI AB, though. It was giving me a weird graphical glitch in DAI, but when I swapped over to TriXX and input the same numbers, I got no glitches.
> 
> 
> 
> Hi there
> 
> Pretty good results on overclocking too. I've attached my results; they're lower than yours, that's for sure, but the temperatures are lower too. My highest temp has been 75C during the Unigine benchmark. I'm not sure if this is down to the older drivers (I'm using older drivers because the newer ones give me issues in 3D SW); my fans have been at 42%.
> 
> I've uninstalled MSI AB too; I tried it only once and got a few issues with that SW as well.
> 
> Here are results
> 
> 
> 
> Thanks, Jura

Looks good. This is my score with a single card at 1000/1300 @ 1080p.


Quote:


> Originally Posted by *jura11*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cephelix*
> 
> Yeap, CLU is conductive, so if you do want to use it you have to be very careful. I applied it using the same method others use when applying it to the CPU. Since I will most likely be keeping the card, now that it maxes out my games, I'm just going to do whatever I want to it.
> Cable management is good. Barely any wires hanging about; all tucked away in the back. Will take pics when I have the time. Ambients are about 26C. I do find the case has a stagnant area of air around the GPU, hence the addition of the PCIe slot exhaust.
> 
> 
> 
> Hi there
> 
> I personally have only used Shin-Etsu X-23-7783D on the CPU or GPU in the past, but I also tried the AC MX series, which in my case didn't have the best results.
> I would say this can be due to higher ambient temperatures and maybe the thermal paste; hard to say, as every GPU cooler is not the same. I will speak with my friend; he has the same GPU as you and definitely has lower temps on his MSI.
> 
> Thanks, Jura

I use Shin-Etsu G751. I also tried MX4 before and didn't like the result either. Shin-Etsu TIMs are pretty good.

I'm not a fan of CLU or any electrically conductive TIMs. Other than the Shin-Etsu TIMs, IC Diamond is one of my favourites.


----------



## cephelix

Quote:


> Originally Posted by *kizwan*
> 
> Looks good. This is my score with a single card at 1000/1300 @ 1080p.
> 
> I use Shin-Etsu G751. I also tried MX4 before and didn't like the result either. Shin-Etsu TIMs are pretty good.
> 
> I'm not a fan of CLU or any electrically conductive TIMs. Other than the Shin-Etsu TIMs, IC Diamond is one of my favourites.


Very close... I find a difference of about 2fps between the stock memory clock and 1500MHz... I was kind of apprehensive about conductive TIM as well, which is why I was extra careful. I coated the VRMs in nail polish before I applied CLU, just in case any small beads got away from me, and I wiped the whole area down with alcohol afterwards and checked and rechecked to see if I missed a spot.


----------



## blackhole2013

Quote:


> Originally Posted by *cephelix*
> 
> What games? What are your load temps like?


GTA V; temps max 75C at 1150/1475.


----------



## kizwan

Quote:


> Originally Posted by *cephelix*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Looks good. This is my score with a single card at 1000/1300 @ 1080p.
> 
> I use Shin-Etsu G751. I also tried MX4 before and didn't like the result either. Shin-Etsu TIMs are pretty good.
> 
> I'm not a fan of CLU or any electrically conductive TIMs. Other than the Shin-Etsu TIMs, IC Diamond is one of my favourites.
> 
> 
> 
> 
> 
> 
> Very close... I find a difference of about 2fps between the stock memory clock and 1500MHz... I was kind of apprehensive about conductive TIM as well, which is why I was extra careful. I coated the VRMs in nail polish before I applied CLU, just in case any small beads got away from me, and I wiped the whole area down with alcohol afterwards and checked and rechecked to see if I missed a spot.

Single 290 1100/1500

1920x1200


1440p


My card usually needs less voltage in Unigine than in 3DMark or games.


GTA V is downloading. Why is the file so big? It's going to take a while.


----------



## cephelix

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *kizwan*
> 
> Single 290 1100/1500
> 
> 1920x1200
> 
> 
> 1440p
> 
> 
> My card usually needs less voltage in Unigine than in 3DMark or games.
> 
> 
> GTA V is downloading. Why is the file so big? It's going to take a while.





Usch, FPS really takes a hit at 1440p... The only reason I'm not using 3DMark is that the test takes too long... I may use it sometime in the future, but since the games I play run fine at those voltages I don't bother too much; I just keep an eye out for artifacts and the like.


----------



## kizwan

Quote:


> Originally Posted by *cephelix*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Single 290 1100/1500
> 
> 1920x1200
> 
> 
> 1440p
> 
> 
> My card usually needs less voltage in Unigine than in 3DMark or games.
> 
> 
> GTA V is downloading. Why is the file so big? It's going to take a while.
> 
> 
> 
> 
> 
> 
> Usch, FPS really takes a hit at 1440p... The only reason I'm not using 3DMark is that the test takes too long... I may use it sometime in the future, but since the games I play run fine at those voltages I don't bother too much; I just keep an eye out for artifacts and the like.

This is when crossfire comes into play.


----------



## cephelix

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *kizwan*
> 
> This is when crossfire comes into play.






Oooo... but how are the temps on air, though? The case must heat up really fast.


----------



## kizwan

Quote:


> Originally Posted by *cephelix*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> This is when crossfire comes into play.
> 
> 
> 
> 
> 
> 
> 
> 
> Oooo... but how are the temps on air, though? The case must heat up really fast.

Mine are underwater. I ran the benchmark at 36C ambient. The temp in the case was approx ~40C. I didn't turn on the A/C.


----------



## cephelix

Quote:


> Originally Posted by *kizwan*
> 
> Mine are underwater. I ran the benchmark at 36C ambient. The temp in the case was approx ~40C. I didn't turn on the A/C.


36C? Are you in hell? Or at least somewhere near there?








Underwater explains the temps... I do wonder, if I had crossfire, what the temps on air would be like.


----------



## kizwan

Quote:


> Originally Posted by *cephelix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Mine are underwater. I ran the benchmark at 36C ambient. The temp in the case was approx ~40C. I didn't turn on the A/C.
> 
> 
> 
> 
> 
> 36C? Are you in hell? Or at least somewhere near there?
> 
> 
> 
> 
> 
> 
> 
> 
> Underwater explains the temps... I do wonder, if I had crossfire, what the temps on air would be like.

Haha. I live in Malaysia. The living room where I put my computer gets direct sunlight from morning to afternoon, so it gets pretty hot, and the A/C can only bring the room temp down to 30C at best during the daytime.

I can't put my computer in the bedroom. Phobia, feels isolated, walls closing in, all that crap.


----------



## cephelix

Quote:


> Originally Posted by *kizwan*
> 
> Haha. I live in Malaysia. The living room where I put my computer gets direct sunlight from morning to afternoon, so it gets pretty hot, and the A/C can only bring the room temp down to 30C at best during the daytime.
> 
> I can't put my computer in the bedroom. Phobia, feels isolated, walls closing in, all that crap.


Ohh, hello neighbour!! Do you not put reflective solar film on the windows, or thick curtains to block out the sun?? Or does the heat radiate from the walls as well??


----------



## kizwan

Quote:


> Originally Posted by *cephelix*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Haha. I live in Malaysia. The living room where I put my computer gets direct sunlight from morning to afternoon, so it gets pretty hot, and the A/C can only bring the room temp down to 30C at best during the daytime.
> 
> I can't put my computer in the bedroom. Phobia, feels isolated, walls closing in, all that crap.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ohh, hello neighbour!! Do you not put reflective solar film on the windows, or thick curtains to block out the sun?? Or does the heat radiate from the walls as well??

The heat radiates from the roof. Zinc/aluminium roofing. The ceiling doesn't help much.


----------



## cephelix

Quote:


> Originally Posted by *kizwan*
> 
> The heat radiates from the roof. Zinc/aluminium roofing. The ceiling doesn't help much.


Ahh, that's a bummer. Nothing can be done unless you isolate and insulate the whole roof area, I suppose... but even so, great temps.


----------



## cokker

Quote:


> Originally Posted by *blackhole2013*
> 
> Question: why does my 290X's usage while playing games jump up and down, like 100%, then 10%, then 100%?


I'm guessing it's MSI Afterburner misreading. I have a similar experience with Source-based games; it doesn't seem to affect performance.

On the subject of GTA V, I'm getting great results. I'm still using the 15.3 drivers with everything maxed (even in the advanced settings, no MSAA); FPS is around 35 (normally when racing in town) up to a locked 59. AMD CHS (shadows) seems to be a bit of an FPS hit, so I left that on Sharp.

Had to lower my core clock to 1050 due to a couple of crashes; stock cooling isn't quite enough to allow me to add +100mV (85C+). I seem to get pretty bad vdroop on the core, and anything under +100mV seems to drop to stock voltage, causing problems; with the added voltage I get a minimum of 1.2V and a max of 1.3V.
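A rough way to picture the vdroop being described: the voltage the core actually sees under load sags below the requested set point, so a small software offset can droop all the way back to (or below) stock. A minimal sketch with made-up numbers, not measurements from any real card:

```python
# Hypothetical illustration of vdroop on an overclocked core. The stock
# voltage and droop figures below are examples only, not measured values.

STOCK_MV = 1200   # example stock vcore, in millivolts
DROOP_MV = 100    # example sag under heavy load, in millivolts

def load_mv(offset_mv):
    """Core voltage (mV) actually seen under load for a software offset."""
    return STOCK_MV + offset_mv - DROOP_MV

print(load_mv(100))  # 1200 -- a +100 mV offset only just holds stock voltage
print(load_mv(50))   # 1150 -- droops below stock, which can cause crashes
```

This is why a seemingly generous offset can still leave the card at effectively stock voltage once the load hits.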


----------



## pcrevolution

Quote:


> Originally Posted by *blackhole2013*
> 
> Question: why does my 290X's usage while playing games jump up and down, like 100%, then 10%, then 100%?


I think it could be due to AMD's PowerTune with boost. Akin to Nvidia's boost.

The card is constantly trying to keep within the thermal and power limits you set in CCC while squeezing maximum performance.

I.e., the latest cards don't run at constant core clock speeds; rather, they are dynamic.
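The dynamic behaviour described above can be pictured as a simple feedback loop: each tick the card nudges the clock up toward its ceiling unless the modelled power draw exceeds the configured limit, in which case it backs off. This is only a toy sketch with invented constants, not AMD's actual PowerTune algorithm:

```python
# Toy model of a PowerTune-style boost loop. All constants are invented
# for illustration; the real algorithm lives in firmware and the driver.

IDLE_CLOCK = 300     # MHz floor in this toy model
MAX_CLOCK = 1000     # MHz boost ceiling
STEP = 10            # MHz adjustment per tick

def step_clock(clock, power_draw, power_limit):
    """Return the next core clock given current power draw vs. the limit."""
    if power_draw > power_limit:
        return max(clock - STEP, IDLE_CLOCK)   # throttle down
    return min(clock + STEP, MAX_CLOCK)        # otherwise boost up

clock = IDLE_CLOCK
history = []
for _ in range(100):
    power_draw = clock * 0.25          # watts; fabricated draw-vs-clock model
    clock = step_clock(clock, power_draw, power_limit=230)
    history.append(clock)

# The clock climbs, then oscillates just under the point where the
# modelled draw would exceed the 230 W limit -- hence "dynamic" clocks.
```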


----------



## nX3NTY

Just wanted to ask you guys: is it worth spending extra for a 290X instead of a 290? The cost difference between the two is 100USD, because there are plenty of used R9 290s while the 290X is brand new. I only game at 1920x1080.


----------



## gatygun

Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> I would change the orientation of the CPU cooler. I once had it like you have it, and it unnecessarily heated the CPU, so I would point it towards the exhaust of the case. I've done that and changed the exhaust fan for a Thermalright TY-147, and my temps dropped on both the GPU and CPU. This is my view; everyone can have a different opinion.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks, Jura


I had it the other way around, but some dude figured out that pushing heat from the GPU right through the CPU cooler towards the top gives better temps, as hot air always wants to move upwards. I tested this and it works surprisingly well. The temps are a lot cooler than the other way around ("a lot" being ~4C on the GPU and 8C on the CPU cooler).
Quote:


> Originally Posted by *cephelix*
> 
> I don't think I can handle another +50 since my temps are already maxing out. What are your temps like? It looks stable. I haven't tested it in games other than DAI and Heaven/Valley, and I get no artifacts. Maybe I'll reinstall AC Black Flag and see how it does. I mainly play RPGs, so those are the games I'll be testing with.
> I went for broke and just used CLU between the die and heatsink. It brought my temps down 10C or thereabouts. I also strapped an exhaust fan to the unused PCIe slots under the GPU, which brought temps down another 5C initially and resulted in my case and card heating up much slower than before.
> Agreed. Could it be initial mounting pressure? What I find helpful is an exhaust fan with higher cfm/static pressure than the ones on the heatsink, set to a steeper rpm curve, to help exhaust heated air coming from the GPU. But it really depends on his setup.
> 
> On a side note, I just got a random shutdown with the source being "thermald" and no event ID. Anyone know what it is?


I wouldn't test with Heaven, Valley, or 3DMark to see if your clock is stable; most of the time those benchmarks don't heat your PC up enough to show whether temps stay acceptable or other wonky problems come up. What I found works really well to check if your system is stable is Cities: Skylines (a massive city), building/playing for multiple hours at 5K resolution (important: crank it up to 250% with the dynamic resolution mod, as otherwise your GPU will just idle).
Another good game is Battlefield 3/4 (I personally use 3) at 3200x1800 ultra settings, playing for multiple hours on big conquest maps. This heats up your CPU, which also affects your GPU temps.

Once you can do that perfectly well for a day or so, you can say it's stable.

My temps, 290 Tri-X at 1145/1450:

@100% usage in GPU-Z after hours of playing Skylines and BF3 (also 100% usage of my CPU, heavily OC'ed at 1.44V with hyperthreading on, which adds more heat):
66C core avg (max 70C)
72C VRM1 avg (max 77C)
~52% fan speed (45% minimum, 55% highest, 50% avg)

I do push an aggressive fan profile atm, because the stock fan profile is terrible. It will let your VRM1 heat up way too much before the fans kick in, which makes things far more wonky and unstable than they ever need to be. The core temp therefore isn't the interesting thing to base your fan profile on, which is what MSI Afterburner and the stock Tri-X controller do, but as there is no other option you have to work with it.

The fan speed can be brought down towards 40-45% at the cost of about 10C more on the VRMs and 4-5C on the core, but I'd rather have it cooler atm since I'm OCing my CPU.

I also run an extra CPU fan on top of the GPU to push extra heat out of the case, as the fans on the Tri-X push the heat towards the side and upwards, to make sure no additional heat builds up anywhere.

I'm not a big fan of repasting GPU coolers within warranty; I'd rather keep everything as stock as possible until the warranty runs out. That's just personal, though; I don't see much need to lower the temps on my 290 atm.

PS: Black Flag is a terribly optimized game; it runs worse than Unity here. I'm not touching it anymore after playing it for a few hours.
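An aggressive profile like the one described is ultimately just a steeper temperature-to-fan-speed mapping than the stock curve. Here is a minimal piecewise-linear sketch; the curve points are hypothetical, chosen only to illustrate ramping the fans up well before the hardware gets hot:

```python
# Piecewise-linear fan curve. The (core temp C, fan duty %) points are
# hypothetical examples of an "aggressive" profile, not Tri-X defaults.

CURVE = [(0, 20), (50, 35), (60, 45), (70, 55), (80, 75), (90, 100)]

def fan_duty(temp_c):
    """Linearly interpolate fan duty (%) for a given core temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return CURVE[-1][1]   # pin at 100% past the last point

print(fan_duty(66))  # 51.0 -- already ramping hard in the mid-60s
```

Tools like MSI Afterburner key this curve off the core temperature only; as noted above, that is exactly the limitation when the VRMs are the component you actually want to keep cool.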


----------



## gatygun

Quote:


> Originally Posted by *nX3NTY*
> 
> Just wanted to ask you guys: is it worth spending extra for a 290X instead of a 290? The cost difference between the two is 100USD, because there are plenty of used R9 290s while the 290X is brand new. I only game at 1920x1080.


Personally I don't think it's worth it. The only really interesting 290X is the 8GB version, but that one just costs too much over a 290. But that's just my opinion.


----------



## pcrevolution

Quote:


> Originally Posted by *gatygun*
> 
> Personally I don't think it's worth it. The only really interesting 290X is the 8GB version, but that one just costs too much over a 290. But that's just my opinion.


Sorry, but I beg to differ wrt an 8GB 290X. Even if you say that 8GB is useful for crossfire gaming at 4K and above, I still believe the limiting factor is the GPU core, not the memory buffer.

And as far as rumors go, I believe AMD is moving towards not requiring the cards to keep identical copies of the memory buffer in crossfire. Therefore 4GB + 4GB instead of a mirrored 4GB is a possibility in the future.


----------



## Forceman

Quote:


> Originally Posted by *nX3NTY*
> 
> Just wanted to ask you guys: is it worth spending extra for a 290X instead of a 290? The cost difference between the two is 100USD, because there are plenty of used R9 290s while the 290X is brand new. I only game at 1920x1080.


No, the performance difference is very small, certainly not worth $100. Plus you might get lucky with an unlockable 290.


----------



## rdr09

Quote:


> Originally Posted by *pcrevolution*
> 
> Sorry, but I beg to differ wrt an 8GB 290X. Even if you say that 8GB is useful for crossfire gaming at 4K and above, I still believe the limiting factor is the GPU core, not the memory buffer.
> 
> And as far as rumors go, I believe AMD is moving towards not requiring the cards to keep identical copies of the memory buffer in crossfire. Therefore 4GB + 4GB instead of a mirrored 4GB is a possibility in the future.


That stacking of VRAM would extend the life of pretty much every GPU, unless it only applies to some.

With regard to the choice between the 290 and 290X: yes, the difference at this point is minimal now that the 300 series is almost here, especially if the price difference is $100 or more. I'm not sure how long the warranty is for every brand, but I know PowerColor's is not transferable.


----------



## nX3NTY

Quote:


> Originally Posted by *gatygun*
> 
> Personally I don't think it's worth it. The only really interesting 290X is the 8GB version, but that one just costs too much over a 290. But that's just my opinion.


Quote:


> Originally Posted by *Forceman*
> 
> No, the performance difference is very small, certainly not worth $100. Plus you might get lucky with an unlockable 290.


Thanks both of you







I'm gonna buy a reference 290 to save money; I already have an unused Arctic Twin Turbo II if the reference cooler is too noisy.

And thanks a lot, gatygun, for the tests of clock speed, voltage, and PowerTune you did earlier. You've been very helpful. The reason I want to lower power is not to save electricity or anything like that; it's that I had an MSI R9 290 Gaming before (sold it and bought a GTX 970, then sold that because of the 3.5GB fiasco), and it made my Seasonic 620W PSU sweat (noisy fan under load), so I'm lowering the power target to reduce PSU noise.
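Lowering the PowerTune limit trims the card's board power roughly proportionally, which is why it can quiet a borderline PSU. A back-of-the-envelope sketch; the wattages are assumptions for illustration, not measured figures for these exact parts:

```python
# Rough PSU headroom estimate. All wattages below are assumptions --
# check the real draw of your own card and CPU.

GPU_BOARD_POWER = 275   # watts, ballpark for a reference R9 290 under load
REST_OF_SYSTEM = 150    # watts, CPU + board + drives (assumed)
PSU_CAPACITY = 620      # watts, the Seasonic unit in question

def system_draw(power_target_pct):
    """Estimated DC-side draw (W) at a given PowerTune offset in percent."""
    gpu = GPU_BOARD_POWER * (100 + power_target_pct) // 100
    return gpu + REST_OF_SYSTEM

print(system_draw(0))    # 425 -- about 69% of the 620 W unit
print(system_draw(-20))  # 370 -- a -20% power target buys real fan headroom
```

Many PSU fans ramp sharply past roughly half to two-thirds load, so shaving even 50W off the peak can move the unit back into its quiet range.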


----------



## DividebyZERO

Quote:


> Originally Posted by *nX3NTY*
> 
> Thanks both of you
> 
> 
> 
> 
> 
> 
> 
> I'm gonna buy a reference 290 to save money; I already have an unused Arctic Twin Turbo II if the reference cooler is too noisy.
> 
> And thanks a lot, gatygun, for the tests of clock speed, voltage, and PowerTune you did earlier. You've been very helpful. The reason I want to lower power is not to save electricity or anything like that; it's that I had an MSI R9 290 Gaming before (sold it and bought a GTX 970, then sold that because of the 3.5GB fiasco), and it made my Seasonic 620W PSU sweat (noisy fan under load), so I'm lowering the power target to reduce PSU noise.


There is a member here selling reference 290s on the OCN marketplace. They seem pretty cheap.


----------



## cephelix

Quote:


> Originally Posted by *gatygun*
> 
> I had it the other way around, but some dude figured out that pushing heat from the GPU right through the CPU cooler towards the top gives better temps, as hot air always wants to move upwards. I tested this and it works surprisingly well. The temps are a lot cooler than the other way around ("a lot" being ~4C on the GPU and 8C on the CPU cooler).
> I wouldn't test with Heaven, Valley, or 3DMark to see if your clock is stable; most of the time those benchmarks don't heat your PC up enough to show whether temps stay acceptable or other wonky problems come up. What I found works really well to check if your system is stable is Cities: Skylines (a massive city), building/playing for multiple hours at 5K resolution (important: crank it up to 250% with the dynamic resolution mod, as otherwise your GPU will just idle).
> Another good game is Battlefield 3/4 (I personally use 3) at 3200x1800 ultra settings, playing for multiple hours on big conquest maps. This heats up your CPU, which also affects your GPU temps.
> 
> Once you can do that perfectly well for a day or so, you can say it's stable.
> 
> My temps, 290 Tri-X at 1145/1450:
> 
> @100% usage in GPU-Z after hours of playing Skylines and BF3 (also 100% usage of my CPU, heavily OC'ed at 1.44V with hyperthreading on, which adds more heat):
> 66C core avg (max 70C)
> 72C VRM1 avg (max 77C)
> ~52% fan speed (45% minimum, 55% highest, 50% avg)
> 
> I do push an aggressive fan profile atm, because the stock fan profile is terrible. It will let your VRM1 heat up way too much before the fans kick in, which makes things far more wonky and unstable than they ever need to be. The core temp therefore isn't the interesting thing to base your fan profile on, which is what MSI Afterburner and the stock Tri-X controller do, but as there is no other option you have to work with it.
> 
> The fan speed can be brought down towards 40-45% at the cost of about 10C more on the VRMs and 4-5C on the core, but I'd rather have it cooler atm since I'm OCing my CPU.
> 
> I also run an extra CPU fan on top of the GPU to push extra heat out of the case, as the fans on the Tri-X push the heat towards the side and upwards, to make sure no additional heat builds up anywhere.
> 
> I'm not a big fan of repasting GPU coolers within warranty; I'd rather keep everything as stock as possible until the warranty runs out. That's just personal, though; I don't see much need to lower the temps on my 290 atm.
> 
> PS: Black Flag is a terribly optimized game; it runs worse than Unity here. I'm not touching it anymore after playing it for a few hours.


Well, for me, Heaven and Valley do give accurate gaming temps... though the stability aspect is a bit iffy. But once I used TriXX instead of Afterburner, all was good. I haven't OC'ed my CPU yet, only the GPU so far, and that's why I have my CPU cooler exhausting out the back instead of the top. I have enough airflow through the case now that there's no area of stagnant air. I do agree with your method of testing, though, since it is more thorough than any of mine... I will tweak my settings again once everything is dialled in; it's just the GPU for now.


----------



## nX3NTY

Quote:


> Originally Posted by *DividebyZERO*
> 
> There is a member here selling reference 290s on the OCN marketplace. They seem pretty cheap.


It's tempting, but the thing is, I live in Malaysia and I don't have PayPal.







The price is VERY cheap though... a lot cheaper than any card I've found locally.


----------



## gatygun

Quote:


> Originally Posted by *pcrevolution*
> 
> Sorry, but I beg to differ wrt an 8GB 290X. Even if you say that 8GB is useful for crossfire gaming at 4K and above, I still believe the limiting factor is the GPU core, not the memory buffer.
> 
> And as far as rumors go, I believe AMD is moving towards not requiring the cards to keep identical copies of the memory buffer in crossfire. Therefore 4GB + 4GB instead of a mirrored 4GB is a possibility in the future.


8GB removes the 4GB barrier, even if you only ever use 4.3GB. The reason people bought 3GB 580s was not to use the full 3GB of VRAM, but to remove the 1.5GB barrier that held the cards back; the same goes for 4GB 680s. Lots of people complained about 780s not having enough VRAM when they bought them, as new games pushed over 3GB of VRAM at 1080p soon after its release.

Games are perfectly playable on ultra with a single 290, let alone 2x 290X. Pushing past the 4GB barrier on a 290X isn't hard to achieve at good framerates, especially when titles demand more and more VRAM these days.

The increase in performance of the 290X over the 290 is minimal to say the least; unless you like spending money for hardly any gain, it's a bad choice in my opinion (unless it's cheap enough).

The only reason to get a 290X over a 290 is therefore, in my opinion, if you go for an 8GB model, just to remove the 4GB barrier of the 290. But you will have to pay big time for it on top, which was another reason for me not to bother with it.

All in all, the 290 was the most logical choice for me for price/performance.

About 4+4 = 8GB of VRAM in the future: we will first have to see it. Mantle is already theoretically capable of doing so, yet nobody has bothered even demonstrating it on any level. There will probably be drawbacks, and whether it will work smoothly on older hardware, or be implemented at all, is another question. Possible, yes; actually something that's going to happen anytime soon, I highly doubt it.

Of course this is just speculation. But I have a hard time believing it's going to be anything relevant in the coming years, or even reach "older hardware" when the time comes.

But we will see.


----------



## mus1mus

Quote:


> Originally Posted by *kizwan*
> 
> Looks good. This is my score with a single card at 1000/1300 @ 1080p.
> 
> I use Shin-Etsu G751. I also tried MX4 before and didn't like the result either. Shin-Etsu TIMs are pretty good.
> 
> I'm not a fan of CLU or any electrically conductive TIMs. Other than the Shin-Etsu TIMs, IC Diamond is one of my favourites.
> Quote:
> 
> 
> 
> Originally Posted by *cephelix*
> 
> Very close... I find a difference of about 2fps between the stock memory clock and 1500MHz... I was kind of apprehensive about conductive TIM as well, which is why I was extra careful. I coated the VRMs in nail polish before I applied CLU, just in case any small beads got away from me, and I wiped the whole area down with alcohol afterwards and checked and rechecked to see if I missed a spot.

Not so fast












Memory Clock helped a lot in these runs.








1220/1700


----------



## Devildog83

XSPC Razor or EK block for my 290? What do y'all think?


----------



## LandonAaron

Quote:


> Originally Posted by *Devildog83*
> 
> XSPC Razor or EK block for my 290? What do ya'll think?


I think either would be fine. The main things will be price and aesthetics. EK is probably the most popular block. But why not Bitspower? You already have the backplate, and their blocks are nice looking.


----------



## Devildog83

Quote:


> Originally Posted by *LandonAaron*
> 
> I think either would be fine. Main thing will be price and aesthetics. EK is probably the most popular block. But why not bitspower? You already have the backplate and their blocks are nice looking.


Actually it's an EK backplate, but my CPU block is the XSPC Raystorm. The Raystorm is more expensive but, like the Razor, it's a full copper block. If I get the Razor I might have to get an XSPC backplate, and if I get the EK block it will not be full copper. Tough decision. I will check into the Bitspower block.


----------



## LandonAaron

Quote:


> Originally Posted by *Devildog83*
> 
> Actually it's an EK backplate but my CPU block is the XSPC Raystorm. It is more expensive but like the Razor it's a full copper block. If I get the Razor I might have to get a XSPC back plate and if I get the EK block it will not be full copper. Tough decision. I will check into the bitspower block.


EK makes a full copper block. Well, copper/plexi with some sort of small metal plate over the VRM area that I'm not sure what it's made of, but mainly copper.

http://www.performance-pcs.com/ek-msi-gigabyte-radeon-r9-290x-vga-liquid-cooling-block.html


----------



## Devildog83

Quote:


> Originally Posted by *LandonAaron*
> 
> EK makes a full copper block. Or well copper/plexi with some sort of small metal plate over the VRM area that I am not sure what it is make of, but mainly copper.
> 
> http://www.performance-pcs.com/ek-msi-gigabyte-radeon-r9-290x-vga-liquid-cooling-block.html


That's a nice one, and it's cheaper than the Razor, but I'm looking for full cover. I will ponder some more. Bitspower has a nice one too, but it's not copper and costs just as much as the Razor. I might just go Razor to match my Raystorm; I just hope I don't have to do another backplate.

Thanks for the help.


----------



## LandonAaron

Has anyone tested different crossfire settings in GTA V? Frame pacing etc. Wondering if there are better settings to use than the default crossfire profile.


----------



## BradleyW

Quote:


> Originally Posted by *LandonAaron*
> 
> Has anyone tested different crossfire settings in GTA V? Frame pacing etc. Wondering if there are better settings to use than the default crossfire profile.


I heard Optimize 1x1 works better. The advanced graphics settings seem to cause intense CPU usage and massive constant stuttering when driving, and it's worse with CFX enabled.


----------



## Native89

Got a question regarding power supplies.

So I have a Seasonic X850 (single 12V rail and all that) powering my computer.
I was running two separate VGA cables to my 290, but did not like the bulk of the stock connectors (each cable has two 8-pin connectors).
Now I'm wondering if it would be safe/sufficient to power my 290 with a single cable?
I'm running a mild overclock, 1100/1300 +63mV, and haven't noticed any issues thus far, but I don't know if there would be a problem in the future.

The only other solution I can think of is to get some custom sleeved cables, and I don't think I can justify the price for them. Thanks all.


----------



## Mega Man

It depends on how the cables are run.

The real weak points aren't the wires themselves but the crimps and the pin-to-pin contacts where the connectors mate.

There are several models.

If you have one like this http://www.newegg.com/Product/Product.aspx?Item=N82E16817151102 where the PCIe wires go into one connector, come out of that connector, and into another,
i.e. like this (just an example)

(image taken from http://old-macs.com/pages/Cords_Cables/PCIe_Power/PCIePwr6P18GY/Pic2.jpeg)
I would not recommend it.

If you have one like this, you will be fine:

http://www.newegg.com/Product/Product.aspx?Item=N82E16817151110

Two separate wire runs coming from the connector that goes into the PSU,
i.e. this

(image taken from http://www.moddiy.com/product_images/m/603/T1trW7XmVhXXXMSIcU_013801__15865_zoom.jpg)
attempting to find better pics


----------



## BradleyW

Quote:


> Originally Posted by *Native89*
> 
> Got a question regarding power supplies.
> 
> So I have a Seasonic X850 (single 12V rail and all that) powering my computer.
> I was running two seperate VGA cables to my 290, but did not like the bulk of the stock connectors (each cable has 2 8pin connectors).
> Now I'm wondering if it would be safe/sufficient to power my 290 with a single cable?
> Running a mild overclock, 1100/1300 +63mv, and haven't noticed any issues thus far. I do not know if there would be a problem in the future though.
> 
> Only other solution I can think of is to get some custom sleeved cables and I don't think I can justify the price for them. Thanks all.


Use 2 separate cables. Don't run off one cable. I did, and it fried my motherboard and PSU. When the GPU requires more power, the PSU sends more power through the 8-pin EPS and the 24-pin motherboard connector. It's a known fault with all those Seasonic Platinum units (or rather, their cables); a manager from their RMA department in Germany told me this was the cause. The Seasonic Gold units use much better cables.

In other words, using 2 cables instead of one ensures the hardware is safe from even the minimal chance of overloading the wires.
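To put rough numbers on the overload risk, here's a back-of-envelope sketch. None of these figures are from the thread: the ~300 W card draw, the 75 W slot contribution, and the three 12 V conductors per cable run are all assumptions for illustration, not measured values.

```python
# Rough estimate of current per 12 V conductor when powering an overclocked
# 290/290X from one daisy-chained PCIe cable versus two separate cables.
# Assumptions (illustrative only): ~300 W board draw, 75 W supplied by the
# PCIe slot, three 12 V conductors per cable run back to the PSU.

CARD_DRAW_W = 300.0      # assumed worst-case board power, overclocked
SLOT_SUPPLY_W = 75.0     # PCIe slot contribution per spec
RAIL_V = 12.0            # 12 V rail
WIRES_PER_CABLE = 3      # typical 12 V conductors in one PCIe cable run

def amps_per_wire(cables: int) -> float:
    """Current each 12 V conductor carries for a given number of cable runs."""
    connector_power = CARD_DRAW_W - SLOT_SUPPLY_W   # power the connectors must supply
    total_wires = cables * WIRES_PER_CABLE
    return connector_power / RAIL_V / total_wires

print(f"one cable : {amps_per_wire(1):.1f} A per wire")
print(f"two cables: {amps_per_wire(2):.1f} A per wire")
```

Under these assumptions a second cable roughly halves the current per conductor, which is the whole argument for not daisy-chaining both 8-pins off one run.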


----------



## cephelix

Quote:


> Originally Posted by *BradleyW*
> 
> Use 2 seperate cables. Don't run off one cable. I did and it fried my motherboard and PSU. When GPU requires more power, the PSU sends more power through the 8 pin EPS and the 24 pin MB connector. It's a known fault with all those Seasonic platinum units (Or should I say the cables). A manager from their RMA dept in Germany told me that this was the cause. Seasonic gold units use much better cables.
> 
> In other words, using 2 cables instead of one will ensure the hardware is safe from the minimal chance of overloading the wires.


I didn't know that. I thought it would be a standard wire config, like the 2nd pic @Mega Man showed, a 6+2 to 12-pin cable.


----------



## DividebyZERO

I ran GTX 480 quad SLI once upon a time, using 2 power supplies, and I didn't use the supplemental PCIe power connector on the mainboard. I ended up with melted 12V wires and a melted 24-pin ATX connector. Luckily I found it rather quickly, after a crash during benching. I had to replace the modular 24-pin cable, and it took a while to remove the melted plastic from the mobo's 24-pin connector. Everything lived, and I still use the mainboard and PSU to date.

I learned my lesson, and every quad setup since has had zero melted wires.


----------



## Mega Man

If yours is the second option (12-pin connector to 2x 8-pin), yes, you can.

I do, and I push some insane volts through it.


----------



## Bertovzki

Quote:


> Originally Posted by *Mega Man*
> 
> if yours is the second option ( 12pin connector to 2x8 pin ) yes you can
> 
> i do and i push some insane volts through it


I've always been confused about this with my new R9 290X Tri-X OC. When I look at standard PCIe cables versus what my GPU has on it, the number of pins is greater on my GPU. I can't remember exactly (not at home at the moment), but 2x 8-pin from memory. So the above-mentioned cable (12-pin to 2x 8-pin) is the one I should get? I will most probably be getting custom cables made.


----------



## Native89

Quote:


> Originally Posted by *BradleyW*
> 
> Use 2 seperate cables. Don't run off one cable. I did and it fried my motherboard and PSU. When GPU requires more power, the PSU sends more power through the 8 pin EPS and the 24 pin MB connector. It's a known fault with all those Seasonic platinum units (Or should I say the cables). A manager from their RMA dept in Germany told me that this was the cause. Seasonic gold units use much better cables.
> 
> In other words, using 2 cables instead of one will ensure the hardware is safe from the minimal chance of overloading the wires.


Hot damn. Guess I'm gonna go switch them out right now.
Next upgrade: sleeved cables.

Thanks for the responses.


----------



## chronicfx

Quote:


> Originally Posted by *BradleyW*
> 
> Just tried GTA V. CFX gives me no fps advantage with intense stuttering. Mouse is also too laggy. Completely unplayable. Worst performing game I've played to date.
> 
> 1080p, FXAA, all settings high, advanced settings off, tessellation off. VRAM under 3GB.


I am butter smooth at 8560x1440 on very high in trifire. My only complaint is that the game's "mission cues" are off the top of the screen for some reason. I am not a seasoned Eyefinity veteran, having only got my third monitor last week. Anyone know a remedy?


----------



## Mega Man

How far off?

I'd check the game options first, as there may be an adjustment there.
Quote:


> Originally Posted by *Bertovzki*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> if yours is the second option ( 12pin connector to 2x8 pin ) yes you can
> 
> i do and i push some insane volts through it
> 
> 
> 
> Iv always been confused about this with my new R9 290X Tri-X OC , when i look at standard PCIE cables and what my GPU has on it , the numbers of pins are greater on my GPU , cant remember , not home at moe , but 2 x 8 pin from memory , so this above mentioned cable is the one i should get ( 12 pin to 2 x 8 pin ), i will be getting custom cables made , most probably ?

What PSU do you have? That cable is for a Seasonic X series.


----------



## Bertovzki

Quote:


> Originally Posted by *Mega Man*
> 
> what psu do you have ? that is for a seasonic x series


I see. I have the Cooler Master V1000; I haven't even looked at cables yet, I'm not at that stage.


----------



## chronicfx

I did look in the options for something; I can see the bottom line of the text, but the lines above are offscreen.

Edit: found it! Screen safe zone is the option for this.


----------



## kizwan

Quote:


> Originally Posted by *Devildog83*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *LandonAaron*
> 
> I think either would be fine. Main thing will be price and aesthetics. EK is probably the most popular block. But why not bitspower? You already have the backplate and their blocks are nice looking.
> 
> 
> 
> 
> 
> 
> 
> Actually it's an EK backplate but my CPU block is the XSPC Raystorm. It is more expensive but like the Razor it's a full copper block. If I get the Razor I might have to get a XSPC back plate and if I get the EK block it will not be full copper. Tough decision. I will check into the bitspower block.

You probably overlooked this.
http://www.performance-pcs.com/ek-msi-gigabyte-radeon-r9-290x-vga-liquid-cooling-block-acetal.html

Regarding the EK block not being full copper: the XSPC isn't full copper either, if I understand you correctly. The XSPC block also has a stainless steel top for the VRM water channel.

There's also no visual difference in copper block coverage between the two blocks.

XSPC copper block


EK copper block


This is of course for the reference design card.

Quote:


> Originally Posted by *Native89*
> 
> Got a question regarding power supplies.
> 
> So I have a Seasonic X850 (single 12V rail and all that) powering my computer.
> I was running two seperate VGA cables to my 290, but did not like the bulk of the stock connectors (each cable has 2 8pin connectors).
> Now I'm wondering if it would be safe/sufficient to power my 290 with a single cable?
> Running a mild overclock, 1100/1300 +63mv, and haven't noticed any issues thus far. I do not know if there would be a problem in the future though.
> 
> Only other solution I can think of is to get some custom sleeved cables and I don't think I can justify the price for them. Thanks all.


If your PSU came with a 12-pin to 2x (6+2-pin) cable, it counts as two separate PCIe cables connected to the PSU, so you should be fine.


----------



## Ramzinho

I've decided not to wait for the 390X; I've been waiting since October. I'm going to give my 290X and my Seasonic M12II to my wife. I got myself an EVGA SuperNOVA 850 GS, and I'll be working on two projects: my new rig, Beast, in an NZXT H440 with PETG tubing, powder-coated rads, a sleeved PSU, and a lot of work, and her rig in a Thermaltake Core V51.

The way I look at it, I got one 290X for $200 and I can probably fish for another one for around $250. VisionTek blocks are $50, so I'd end up with dual 290Xs for $550 with blocks. From the looks of it, that will be 10-15% slower than a 390X for the same price, and if DX12 delivers on its promises, I think I'm done for a couple more years.

I hope I made the right decision money-wise. If the 390X comes in anything below $500 I'll be sad, but if it's anything above $600, then I think my money is put where it's worth.
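For what it's worth, the cost math in the post above checks out; here's a quick sketch using the ballpark prices quoted there (the poster's own figures, not market data):

```python
# Cost sketch for the dual-290X plan, using the figures quoted in the post:
# one used 290X already bought, a second one fished for later, and a
# VisionTek water block per card.

card_1 = 200   # first used 290X, $
card_2 = 250   # estimated price for the second card, $
block = 50     # water block, per card, $

total = card_1 + card_2 + 2 * block
print(total)   # 550: dual 290X under water for $550 all-in
```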


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ramzinho*
> 
> I've decided to not wait for the 390X... I've been waiting since October. i'm going to give my 290X and My Seasonic M12II to my wife. got myself a EVGA supernova 850GS, i will be working on a double projects. my new rig Beast in a nzxt H440 that will have petg tubing, powdercoated rads, sleeved psu and a lot of work. . and her Rig in a ThermalTake Core V51.
> 
> The way i looked at it is that i got one 290X for 200$ and i can probably fish for another one for around 250.. visiontek blocks are 50$ ... so i will ended up with dual 290X for 550 with blocks. and from the looks of it. it will be 10~15% slower a 390X for same price. and if DX12 delivers its promises. i think i'm done for couple more years.
> 
> I hope i made the right decision. money wise. if the 390X comes anything below 500$ i'll be sad. but if it's anything above 600... Then i think my money is put where it is worth.


The more new games you want to play the worse CFX is. If you are a Day 1 gamer guy you are not going to have a fun time.


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The more new games you want to play the worse CFX is. If you are a Day 1 gamer guy you are not going to have a fun time.


Especially in Nvidia-optimized games.


----------



## Ramzinho

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The more new games you want to play the worse CFX is. If you are a Day 1 gamer guy you are not going to have a fun time.


Quote:


> Originally Posted by *rdr09*
> 
> especially nvidia optimized game.


Not a Day 1 gamer... I've got enough of a backlog I wanna play.


----------



## rdr09

Quote:


> Originally Posted by *Ramzinho*
> 
> Not a Day 1 Gamer... i've enough backlog i wanna play.


Carlo is making me want to buy . . .


----------



## Ramzinho

Quote:


> Originally Posted by *rdr09*
> 
> carlo is making me want to buy . . .


I know, everybody loves GTA, but man, $60 isn't a good investment in a game for me right now. I blew all my savings finishing the builds I'm doing; I need to recover for at least 3-4 months. And I've seen you defend a lot of CF 290X builds before







.. so why give me second thoughts, mate?


----------



## rdr09

Quote:


> Originally Posted by *Ramzinho*
> 
> I know.. everybody loves GTA. but man 60$ isn't a good investment for me in a game now.. i blew all my savings on finishing the builds i'm doing. need a recovery .. at least 3/4 months


Lol, same here. Now, if it's like BF4, something you'd play for over a year, then yeah. In a few weeks the price will be half.


----------



## Ramzinho

BTW, I am still looking for a second GPU, so if any of you guys are selling a 290X, let me know.


----------



## kizwan

Quote:


> Originally Posted by *Ramzinho*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> carlo is making me want to buy . . .
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I know.. everybody loves GTA. but man 60$ isn't a good investment for me in a game now.. i blew all my savings on finishing the builds i'm doing. need a recovery .. at least 3/4 months

I bought GTA V for less than that on Steam.


----------



## Ramzinho

Quote:


> Originally Posted by *kizwan*
> 
> I bought GTA V less than that at steam.


Some countries have lower prices due to their economy. For me, Steam has set us up with the American storefront. While it's unfair, since I make 1/5 of what my peers in the States make, it's also good, because if I used the European one I'd be in trouble; exchange rates favor the USD over the Euro for me.


----------



## Waleh

Quote:


> Originally Posted by *Ramzinho*
> 
> I've decided to not wait for the 390X... I've been waiting since October. i'm going to give my 290X and My Seasonic M12II to my wife. got myself a EVGA supernova 850GS, i will be working on a double projects. my new rig Beast in a nzxt H440 that will have petg tubing, powdercoated rads, sleeved psu and a lot of work. . and her Rig in a ThermalTake Core V51.
> 
> The way i looked at it is that i got one 290X for 200$ and i can probably fish for another one for around 250.. visiontek blocks are 50$ ... so i will ended up with dual 290X for 550 with blocks. and from the looks of it. it will be 10~15% slower a 390X for same price. and if DX12 delivers its promises. i think i'm done for couple more years.
> 
> I hope i made the right decision. money wise. if the 390X comes anything below 500$ i'll be sad. but if it's anything above 600... Then i think my money is put where it is worth.


I also wanted to wait for the 300 series to come out since I'm building a new rig in the near future but with the lack of any firm details being released, I'm almost settled on just getting a 290x. I'm planning on buying a 1440p monitor (Asus PB278Q) and I wanted the R9 380x or R9 390 (non-x) for that added performance but at this point who knows when that'll be. Based on the benchmarks I've seen, a single 290x still does pretty well on 1440p. I was looking specifically at the Tri-X OC version. I just don't understand what is taking AMD so long for this new line of GPUs. I swear it was supposed to be out by now.


----------



## Ramzinho

Quote:


> Originally Posted by *Waleh*
> 
> I also wanted to wait for the 300 series to come out since I'm building a new rig in the near future but with the lack of any firm details being released, I'm almost settled on just getting a 290x. I'm planning on buying a 1440p monitor (Asus PB278Q) and I wanted the R9 380x or R9 390 (non-x) for that added performance but at this point who knows when that'll be. Based on the benchmarks I've seen, a single 290x still does pretty well on 1440p. I was looking specifically at the Tri-X OC version. I just don't understand what is taking AMD so long for this new line of GPUs. I swear it was supposed to be out by now.


For me it's more than that: I've got two rigs to upgrade and I'm bored. I haven't installed new hardware in my PC since last October; it's just a hobby and I need to do something. In the past two years GPUs haven't delivered jaw-dropping performance gains like they used to, and I'm pretty sure the 390X and the upcoming Nvidia GPU will be 20% faster than what we have now at most. So I'm set with dual GPUs for a single 1440p monitor; I'm golden.


----------



## Waleh

Quote:


> Originally Posted by *Ramzinho*
> 
> for me it's more than that.. i've two rig to upgrade and i'm bored. i've not installed a new HW in my pc since last October. it's just a hobby and i need to do something. and in the past two years GPUs hasn't come in a jaw dropping performance like they used to. and i'm pretty sure that the 390X and the Upcoming Nvidia GPU will be like 20% faster than what we have now at most. so i'm set with dual GPUs for a single 1440p.. i'm golden.


You think you have the upgrade itch? I'm going from a Q6600 to a 4790k







I definitely need to upgrade. I'm currently running bf4 at a solid 15-30 FPS on the lowest settings with constant stuttering on my 6850 and this is at 720p! I can't even imagine what 1440p will look like.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Waleh*
> 
> You think you have the upgrade itch? I'm going from a Q6600 to a 4790k
> 
> 
> 
> 
> 
> 
> 
> 
> I definitely need to upgrade. I'm currently running bf4 at a solid 15-30 FPS on the lowest settings with constant stuttering on my 6850 and this is at 720p! I can't even imagine what 1440p will look like.


I upgraded from the Q6600 to this 8320 clocked at 4.8... night and day difference... your Intel should perform even better... prepare your body when you fire it up.


----------



## zealord

Quote:


> Originally Posted by *Ramzinho*
> 
> for me it's more than that.. i've two rig to upgrade and i'm bored. i've not installed a new HW in my pc since last October. it's just a hobby and i need to do something. and in the past two years GPUs hasn't come in a jaw dropping performance like they used to. and i'm pretty sure that the 390X and the Upcoming Nvidia GPU will be like 20% faster than what we have now at most. so i'm set with dual GPUs for a single 1440p.. i'm golden.


You mean 20% faster than a 980/290X? That would be a big disappointment for me, and would mean slower than the Titan X.


----------



## Ramzinho

Quote:


> Originally Posted by *zealord*
> 
> you mean 20% faster than a 980/290X? That would be a big disappointment for me and means slower than the Titan X


Well, single card to single card, I don't think the upcoming GPUs will make that big a difference. 2x 290X is now head to head with the Titan in some cases and a bit behind in others, not to mention the promise of DX12. That should give me solid performance for some time, as I'm never going to go 4K.


----------



## Faster_is_better

Oh look who's in here?









Anyone know where I can download a BIOS editing tool for the R9 290? I mainly just want to see which BIOS I have on my 2nd, used card. Supposedly it has PT1, but I haven't found any way to determine that yet. Here is the BIOS version, 015.041.000.001.003746, and its default clock is 1000MHz, so that isn't a typical reference value.


----------



## kaspar737

I'm probably going to get flamed for this post, but how is GTA V running for you? Any kind of stutter/bugs? How's the framerate?


----------



## mAs81

Quote:


> Originally Posted by *Faster_is_better*
> 
> Anyone know where I can download a BIOS editing tool for r9 290? I mainly just want to see which BIOS I have on my 2nd used card. Supposedly it has PT1 but I haven't found any way to determine that yet. Here is the BIOS version, 015.041.000.001.003746, and its default clock is 1000mhz so that isn't a typical reference value.


Other than VBE7 or a hex editor, I personally don't know of any other BIOS editing programs.

According to TechPowerUp, your card (the XFX model, right?) may be unlockable to a 290X, given the BIOS version and clocks.


Spoiler: Warning: Spoiler!


----------



## zealord

Quote:


> Originally Posted by *kaspar737*
> 
> I'm probably going to get flamed for this post, but how is GTA V running for you? Any kind of stutter/bugs? How's the framerate?


It runs great for me. Great scalability, and the game is very smooth. There are no big framerate drops, and even when the framerate is sub-60 at times, it still feels smooth.
The game has so many graphics options that even budget cards can run it quite well, but a Titan X can also easily be brought to its knees by the advanced graphics options.

(I have a 2500K @ 4.5GHz and a 290X, at 1080p)


----------



## Ramzinho

I just wanted to ask: if I run my 3770 at around 4.4, would I face any issues with CrossFire?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ramzinho*
> 
> i just wanted to ask.. if i run my 3770 at like 4.4 .. i would face any issues with the X fire?


I have my 3770K @ 4.6GHz. Even at 5GHz it would not make much difference if the game is CPU-bound, so you should be fine. Think of CFX as more about going higher res and higher settings than higher fps; when you go for higher fps, the CPU has to keep up too.


----------



## BradleyW

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I have my 3770K @ 4.6GHz. Even if it was 5GHz it would not make much difference if the game is CPU bound so you should be fine. Think of CFX more about going higher res, higher settings then higher fps. When you go higher fps CPU has to keep up too.


Exactly this.


----------



## zealord

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I have my 3770K @ 4.6GHz. Even if it was 5GHz it would not make much difference if the game is CPU bound so you should be fine. Think of CFX more about going higher res, higher settings then higher fps. When you go higher fps CPU has to keep up too.


Good Post

Whenever I notice that the CPU is a limiting factor for me I just crank up the settings and try to get 100% GPU usage.

I wish it were easier deciding on a new CPU, but the progress in the last couple of years has been so small that no CPU seems a worthy upgrade from my 2500K. I don't even know what kind of CPUs AMD has, but I guess the 8350 is even weaker than my 2500K.


----------



## ZealotKi11er

Quote:


> Originally Posted by *zealord*
> 
> Good Post
> 
> Whenever I notice that the CPU is a limiting factor for me I just crank up the settings and try to get 100% GPU usage.
> 
> I wish it was easier deciding on a new CPU, but the progress in the last couple of years have been so small that no CPU seems a worthy upgrade from my 2500K and I don't even know what kind of CPUs AMD has, but I guess the 8350 is even weaker than my 2500K


DX12 should help, so we won't have to upgrade CPUs for another 2-3 years.


----------



## Ramzinho

I was hoping the dual 290Xs would benefit from that as well.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ramzinho*
> 
> i was hoping the dual 290X would benefit from that as well.


The thing is, as DX12 comes out, more and more people will go 4K. The problem with 4K is that it's all GPU-bound territory, so we will not see much DX12 benefit @ 4K.


----------



## Ramzinho

I'm at 1440p and I'm happy with it... not even thinking about 4K...


----------



## ZealotKi11er

Quote:


> Originally Posted by *Ramzinho*
> 
> I'm at 1440p and i'm happy with it.. not even thinking about 4k...


I am @ 1440p too and have been thinking a lot about a 40" 4K panel. In BF4 I already run 60+ fps at 1440p with 150% resolution scaling, and the game just looks a lot better. Yeah, 1440p is still amazing, but you never want to stop going up. My U2711 will be 4 years old in a month, and there are still people using 1080p.


----------



## mAs81

I just got my 1440p U2713HM a few months ago, and gaming on it with my 290 was a big change for me, coming from 1080p. I'm not thinking 4K for now; rather, I'd like to get another one (or two) 1440p monitors in the future, should prices come down a bit.


----------



## pengs

Need some advice here about AB and CF...

Got two 290Xs, and Afterburner 4.1.0 w/ RTSS reports them both boosting to 1040MHz (max boost) 100% of the time when using CrossFire. I know that's not correct, because the voltage is actively adjusting. It seems that when I make a large change to the graphics in-game, something resets (maybe the driver) and then gives me an accurate boost reading for both GPUs. This bug is not limited to a specific game; it happens in all of them.

I turned off ULPS a long time ago, which let the second card report 'more' accurately and kept it from (basically) shutting off in 2D mode/the Windows environment, so this isn't a ULPS issue.
I have "Enable unified GPU usage monitoring" ticked; I wonder if that has something to do with it, but I think I've tried without it too.

Anybody with any idea?
-
Oh, to add to this: the GPU usage is always symmetrical, and the VRAM is falsely stacked,
i.e. GPU1 56%, GPU2 56%;
89%/89%, 38%/38%, etc. This also doesn't look correct... well, not coming from an SLI setup; unless scaling is really that good over PCIe, but there's absolutely zero fluctuation. I don't believe it's correct.


----------



## Faster_is_better

Quote:


> Originally Posted by *mAs81*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Faster_is_better*
> 
> Anyone know where I can download a BIOS editing tool for r9 290? I mainly just want to see which BIOS I have on my 2nd used card. Supposedly it has PT1 but I haven't found any way to determine that yet. Here is the BIOS version, 015.041.000.001.003746, and its default clock is 1000mhz so that isn't a typical reference value.
> 
> 
> 
> Other than VBE7 or HEX Editor , personally I don't know any other BIOS editing progs..
> 
> According to techpowerup , your card (the XFX model , right? ) maybe unlocked to a 290X ,given the BIOS version and clocks
> 
> 
> Spoiler: Warning: Spoiler!

Yes, the XFX is the one I wanted to check. I'm pretty sure it was reflashed with PT1, according to the past owner, and I already checked with the 290 unlocking tools; neither card has the right IDs to be unlockable.

Hmm, I looked at VBE7 earlier, but it seemed to be for older cards only. I thought maybe the page was just out of date and it still works with newer cards, but no, it is clearly only for 7xxx series cards.

So there's no easy way to tell whether a BIOS is PT1 or reference through something like GPU-Z?


----------



## chronicfx

Quote:


> Originally Posted by *kaspar737*
> 
> I'm probably going to get flamed for this post, but how is GTA V running for you? Any kind of stutter/bugs? How's the framerate?


I'm not using any framerate monitoring, but I can tell you I have played a couple of missions at 8560x1440 with three 290Xs and there is nothing slow or jittery. I have everything under Graphics pushed all the way up, and the view distance under Advanced Graphics pushed up while respecting my 12GB limit. I'm not using any AA, but I haven't noticed a need to.


----------



## BradleyW

Quote:


> Originally Posted by *chronicfx*
> 
> Not using any framerate monitoring i can tell you I have played a couple missions at 8560x1440 with three 290x and there is nothing slow or jittery. I have everything under graphics pushed all the way up and the view distance in advanced graphics I am pushing up to but respecting my 12gb limit not using any AA but i havent noticed a need to.


Not even a single hitch when driving, like the rest of us get?


----------



## DividebyZERO

I don't think he is getting the same stuttering effect because he is running a higher resolution. Correct me if I'm wrong, but when I play at lower resolutions, frame dips are way easier to spot. While he may get dips in fps, they're probably not as noticeable.


----------



## BradleyW

Quote:


> Originally Posted by *DividebyZERO*
> 
> I don't think he is having the same stuttering effect because he is running a higher resolution. Correct me if I am wrong but when I play in lower resolutions frame dips are way easier to spot. While he may get dips in fps its probably not noticeable so much.


I agree with this. I've experienced this myself on other games.


----------



## chronicfx

I will run Afterburner behind it later to make sure, but from a gaming standpoint I can say I never once paid attention to how the computer was running versus playing the game. Sometimes that hitching is textures streaming from the drive too; I have 16GB of RAM and six SSDs in RAID 0. I didn't have the problems with Watch Dogs everyone was talking about either.


----------



## DividebyZERO

Quote:


> Originally Posted by *chronicfx*
> 
> I will run afterburner behind it later to make sure but from a gaming standpoint I can say I never once paid attention to how the computer was running vs. playing the game. Sometimes that hitching is texture streams loading from the drive too, i have 16gb ram and six ssd in raid0. I didn't have the problems with watchdogs everyone was talking about either.


Man, if you're enjoying it don't let an OSD ruin your party. It's easy to get wrapped up in the OSD numbers game and end up missing the fun in the game.


----------



## ZealotKi11er

Quote:


> Originally Posted by *DividebyZERO*
> 
> Man if your enjoying it don't let an OSD ruin your party. It's easy to get wrapped up in the OSD numbers game and end up missing the fun in the game.


Always check the numbers to make sure nothing is wrong.


----------



## chronicfx

Here is the Vespucci Beach motorcycle repo mission for Simeon. This is at 8560x1440 with trifire 290X. Monitors are an LG 34UM95 in the center with a Catleap Q270 on each side. Right click and open in a new tab for the full size image.


----------



## mAs81

Quote:


> Originally Posted by *Faster_is_better*
> 
> There is no easy way to tell if BIOS is PT1 or reference or ? through say something like GPU-z?


Of course you can see the BIOS version of your card in GPU-Z, but I don't think you'll be able to be sure whether your BIOS is PT1 or not. Since it's different from the stock BIOS I'd say it is indeed flashed with PT1 as the previous owner said, but it never hurts to ask someone in the R9 290 -> 290X Unlock Thread.

My 290 is unlocked but I'm not comfortable with flashing it at the moment. I already had some problems with the fan controller on the PCB, and I'm kind of worried... for now at least.


----------



## cephelix

Update to my GPU temps! It seems like I have an airflow problem in my case on the GPU side. If you guys remember, at 1100/1500 I was hitting 85C with 26C ambients even with a fan strapped to the unused PCIe slots as additional exhaust. Well, last Friday I jumped on 4 Gentle Typhoon 1850rpm fans. Apparently a local distributor had them in stock. Replacing my Silverstone AP121L with one Gentle Typhoon brought my temps down significantly! At load I now run a max temp in the low 70s. Will try to take a screenshot when I get back. So I guess I now have sufficient thermal headroom to overclock some more! Woohoo!


----------



## jura11

Quote:


> Originally Posted by *cephelix*
> 
> Update to my GPU temps! It seems like I have an airflow problem in my case on the GPU side. If you guys remember, at 1100/1500 I was hitting 85C with 26C ambients even with a fan strapped to the unused PCIe slots as additional exhaust. Well, last Friday I jumped on 4 Gentle Typhoon 1850rpm fans. Apparently a local distributor had them in stock. Replacing my Silverstone AP121L with one Gentle Typhoon brought my temps down significantly! At load I now run a max temp in the low 70s. Will try to take a screenshot when I get back. So I guess I now have sufficient thermal headroom to overclock some more! Woohoo!


That's awesome news there! I'm thinking of replacing mine with some better ones too. How loud are those Gentle Typhoons? I'd like to replace the fans in my case as well.

But at least those are great temperatures at load.

Thanks, Jura


----------



## cephelix

Quote:


> Originally Posted by *jura11*
> 
> That's awesome news there! I'm thinking of replacing mine with some better ones too. How loud are those Gentle Typhoons? I'd like to replace the fans in my case as well.
> 
> But at least those are great temperatures at load.
> 
> Thanks, Jura


Well, their noise comes from air rushing through. I have no qualms with that. I knew they were good, but I was surprised at just how good. I thought the Silverstone Air Penetrators were doing a good enough job, but these guys just blew the roof off the building. For normal usage I have them connected to a fan controller and they are silent. Practically the only time I set them to max RPM is when I'm gaming or overclocking, and even then they are audible but not irritating.


----------



## jura11

Quote:


> Originally Posted by *cephelix*
> 
> Well, their noise comes from air rushing through. I have no qualms with that. I knew they were good, but I was surprised at just how good. I thought the Silverstone Air Penetrators were doing a good enough job, but these guys just blew the roof off the building. For normal usage I have them connected to a fan controller and they are silent. Practically the only time I set them to max RPM is when I'm gaming or overclocking, and even then they are audible but not irritating.


Bugger, sadly they're not 230mm, which is what I need for my HAF X. I need one 140mm at the front and a 230mm on top too. I think I'll replace all the fans on my HAF X for sure; the task will be finding the right ones.

But off topic: yesterday I re-applied Shin-Etsu on the X5670 and temps at 4.2GHz are now at 60C during rendering (previously in the high 70s). I suspect I previously applied less than I thought, but I'm very happy.

Yes, I've got a fan controller too, a Scythe Kaze Master II, and I've used it with a Scythe Kaze Jyuni 1900RPM Slip Stream 120mm fan for the GPU; at full revs it's very, very loud, louder than my GPU.

But still great temps there :thumb:

Thanks, Jura


----------



## pcrevolution

Quote:


> Originally Posted by *cephelix*
> 
> Well, their noise comes from air rushing through. I have no qualms with that. I knew they were good, but I was surprised at just how good. I thought the Silverstone Air Penetrators were doing a good enough job, but these guys just blew the roof off the building. For normal usage I have them connected to a fan controller and they are silent. Practically the only time I set them to max RPM is when I'm gaming or overclocking, and even then they are audible but not irritating.


Does show that it is equally important that you have good exhaust fans. I use the AP-15s also.


----------



## cephelix

Quote:


> Originally Posted by *jura11*
> 
> Bugger, sadly they're not 230mm, which is what I need for my HAF X. I need one 140mm at the front and a 230mm on top too. I think I'll replace all the fans on my HAF X for sure; the task will be finding the right ones.
> 
> But off topic: yesterday I re-applied Shin-Etsu on the X5670 and temps at 4.2GHz are now at 60C during rendering (previously in the high 70s). I suspect I previously applied less than I thought, but I'm very happy.
> 
> Yes, I've got a fan controller too, a Scythe Kaze Master II, and I've used it with a Scythe Kaze Jyuni 1900RPM Slip Stream 120mm fan for the GPU; at full revs it's very, very loud, louder than my GPU.
> 
> But still great temps there :thumb:
> 
> Thanks, Jura


Nice temps there. Yeah, they are all 120mm. Is there no way for you to mount 120mm fans in all the slots in your case? If you could, I highly recommend them.
Quote:


> Originally Posted by *pcrevolution*
> 
> Does show that it is equally important that you have good exhaust fans. I use the AP-15s also.


Yeah... for me exhaust fans were more of an afterthought, until recently of course.

The 4 I got were from Lazada, with warranty fulfilled by Tech Dynamic, in case you're looking for more. I couldn't pass up the chance and grabbed them last Friday even though I technically didn't need any more fans. But it's more of a "just in case" scenario.


----------



## pcrevolution

Quote:


> Originally Posted by *cephelix*
> 
> Nice temps there. Yeah, they are all 120mm. Is there no way for you to mount 120mm fans in all the slots in your case? If you could, I highly recommend them.
> Yeah... for me exhaust fans were more of an afterthought, until recently of course
> 
> 
> 
> 
> 
> 
> 
> 
> The 4 I got were from Lazada, with warranty fulfilled by Tech Dynamic, in case you're looking for more. I couldn't pass up the chance and grabbed them last Friday even though I technically didn't need any more fans. But it's more of a "just in case" scenario.


Ball bearing fans last so much longer compared to sleeve bearings. It's S$25 well spent. Sure, Corsair fans look great. But sleeve bearings? I'll pass.


----------



## cephelix

Quote:


> Originally Posted by *pcrevolution*
> 
> Ball bearing fans last so much longer compared to sleeve bearings. Its S$25 well spent. Sure Corsair fans look great. But sleeve bearing? I'll pass.


Agreed. I finally got my AP-15 that I thought was spoiled working again; turns out I plugged the wires into the wrong holes, lol.
I don't understand the popularity of the Corsair fans though, but I suppose it's down to personal preference.
Anyways, my temps are now in check. At my current OC, VRM1 doesn't pass 60C at load; real happy about that. How much more OC do you guys think I can squeeze out of my card? I'll probably try it at load today without the air conditioning on and see what temps I get...


----------



## disintegratorx

A question for the pros here, although I might be 'barking up the wrong tree', so to speak. Does anybody know how to use a program called Proxifier, or a program like it, so I can get on Steam and Origin through a device with a generated proxy, with no real directions for an IP or a related port number? If so, I'd be glad to throw you a rep point for giving me a quick rundown, but please, if you don't know then don't bother speculating, because I've been experimenting with this program for a bit already and I'm just missing the final part that I need. Any help would be greatly appreciated.
Thanks.


----------



## Agent Smith1984

So.... I am adding a 290 Tri-X for CF this week.

I figure this....

I have $160 in my current 290 off of CL last October when the miners were just finishing up their liquidations...
I add a refurb 290 from Newegg for around $225, and I have $385 in what will essentially perform like a $1k Titan X in cases where CF is supported, and I will also have the availability of 8GB VRAM once we see DX12 roll out. And even with 4GB, I am plenty covered at the meager 1080p I am using now, and should be fine for 4K once I make a TV upgrade towards Christmas time.

To me, the value is definitely there, despite dealing with the high power needs of the cards.....
I mean, 20-25k graphics points in FireStrike for less than $400 seems like a hands down no-brainer!

Agreed?


----------



## Mattbag

How do these compare head to head with a 7970? I don't know whether to sell my 7970 and buy a 290, or buy another 7970; it should cost roughly the same. Because I can always add another 290 down the road, right?


----------



## ikem

Quote:


> Originally Posted by *pcrevolution*
> 
> Sorry but I beg to differ wrt a 8GB 290x. If you were to say that 8GB is useful for crossfire gaming @ 4K and above, I still believe that the limiting factor is the GPU core, not the memory buffer.
> 
> And as far as rumors go, I believe AMD is moving towards not requiring the cards to have the same copy of the memory buffers in crossfire. Therefore 4GB + 4GB instead of a mirrored 4GB is a possibility in the future.


My single 290X 8GB is using 5GB of VRAM @ 2560x1080 in GTA V.


----------



## Agent Smith1984

Quote:


> Originally Posted by *ikem*
> 
> my single 290x 8gb is using 5gb of VRAM @ 2560x1080 in GTA V.


With what settings, and what kind of FPS are you seeing?


----------



## ikem

Quote:


> Originally Posted by *Agent Smith1984*
> 
> With what settings, and what kind of FPS are you seeing?


All maxed but the grass, which is set to high. 50-60 fps.


----------



## Agent Smith1984

Quote:


> Originally Posted by *ikem*
> 
> all maxed, but the grass. It is set to high. 50-60fps


Well, based on the gameplay I have seen, this game looks amazing even on high/very high settings, so it's just a testament to how well Hawaii is still holding up.

I am thinking 290s in Crossfire with an overclock should do very well.

I am planning on going to 16GB RAM soon, so I will probably move my pagefile to a RAMdisk in case my VRAM fills, or just cut the pagefile off altogether. I have seen this remedy VRAM stutters before. No more texture swapping from the drives, RAM only....


----------



## ikem

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well based on the gameplay I have seen, this game looks amazing, even on high/very high settings, so it's just a testament to how well the hawaii is still holding up.
> 
> I am thinking 290's in Crossfire with an overclock should do very well.
> 
> I am planning on going to 16GB RAM soon, so I will probably move my pagefile to a RAMdisk is case my VRAM fills, or just cut the pagefile off altogether. I have seen this remedy VRAM stutters before. No more texture swapping from the drives, RAM only....


Yea, I would like to crossfire this 8GB version. I would be set for a long while. Coming from a 7870 2GB, the performance increase across the board is nice.

EDIT: You can't spill VRAM over to regular RAM. When VRAM fills, the game just crashes. That's what was happening with my 7870.


----------



## Agent Smith1984

Quote:


> Originally Posted by *ikem*
> 
> yea, i would like to crossfire this 8gb version. I would be set for a long while. Coming from a 7870 2gb, the performance increase across the board is nice.


Well, at least with two xeons you aren't dealing with any kind of CPU bottlenecks


----------



## DividebyZERO

Which brings up a good question: how many threads does GTA V use? I'm curious because I have older dual Xeons and want to see them used if possible.


----------



## ikem

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Well, at least with two xeons you aren't dealing with any kind of CPU bottlenecks


Quote:


> Originally Posted by *DividebyZERO*
> 
> Which brings up a good question, how many threads does gtav use? Im curious cause I have older dual xeons and want to see them used if possible.


More than you would think. They only turbo to 2.4GHz, but GTA V uses 16 threads, so it is fine. It's the games that are not multi-threaded that choke at the CPU.

EDIT: Also a little sad that MSI doesn't have metal shrouds on their cards anymore... it's what I really liked about their cards... well, that and the cooler.


----------



## Agent Smith1984

GTA uses 16 threads? Like, solidly uses them, or high usage on 4-6, and then little bumps on the others?

If GTA is threading that well, then it should work nicely with my CPU.


----------



## ikem

Quote:


> Originally Posted by *Agent Smith1984*
> 
> GTA uses 16 threads? Like, solidly uses them, or high usage on 4-6, and then little bumps on the others?
> 
> If GTA is threading that well, then it should work nicely with my CPU.


Like 30-50% across all 16 threads.
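If you want to check how evenly a game spreads load across your own threads (like the 30-50% across 16 threads above), here's a rough sketch. Collecting the snapshots assumes the third-party psutil package (`psutil.cpu_percent(percpu=True)` is its real API); the averaging helper is plain Python:

```python
import time

def average_per_core(samples):
    """Average a list of per-core usage snapshots (lists of percentages)."""
    cores = len(samples[0])
    return [sum(s[i] for s in samples) / len(samples) for i in range(cores)]

def sample_cores(seconds=5, interval=1.0):
    """Sample per-logical-core usage while the game runs.
    Needs psutil (pip install psutil)."""
    import psutil
    psutil.cpu_percent(percpu=True)  # first call just primes the counters
    snapshots = []
    for _ in range(int(seconds / interval)):
        time.sleep(interval)
        snapshots.append(psutil.cpu_percent(percpu=True))
    return average_per_core(snapshots)

print(average_per_core([[10, 50], [30, 70]]))  # [20.0, 60.0]
```

Broadly even percentages across all cores suggest a well-threaded game; a couple of pinned cores with the rest idle points at a per-thread CPU limit.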


----------



## Agent Smith1984

http://www.tweaktown.com/tweakipedia/83/grand-theft-auto-benchmarked-1080p-1440p-4k/index.html

I find these results to be AWESOME!!!

Only downside is that it looks like this game has no love for CF or SLI....

Single card does better on team green and team red.....

Anybody running dual graphics should take note of these numbers and disable one of their GPU's.

Edit:
Of course this completely contradicts all of that...
http://www.techspot.com/review/991-gta-5-pc-benchmarks/page4.html

What gives?


----------



## LandonAaron

Quote:


> Originally Posted by *Agent Smith1984*
> 
> http://www.tweaktown.com/tweakipedia/83/grand-theft-auto-benchmarked-1080p-1440p-4k/index.html
> 
> I find these results to be AWESOME!!!
> 
> Only downside is that it looks like this game has no love for CF or SLI....
> 
> Single card does better on team green and team red.....
> 
> Anybody running dual graphics should take note of these numbers and disable one of their GPU's.
> 
> Edit:
> Of course this completedly contradicts all of that...
> http://www.techspot.com/review/991-gta-5-pc-benchmarks/page4.html
> 
> What gives?


I think they messed something up with their tests. When I run one 290x I get about 50 FPS, with Very High settings at 1440p. When I run 290x + 290, I get 90+ FPS.


----------



## gatygun

Well, patch 2 gave me a huge performance boost on my 290 in GTA 5. I was first hitting sub-40 fps, sometimes close to 30, when I would push into that dirt road around the lake in the hills; now it's rock solid 60+, mostly 70-90 fps.


----------



## LandonAaron

I just came across a guy selling a 290 with no stock cooler, card only, on Craigslist for $100. Says he is selling it because he had constant driver problems with it and is tired of messing with it. I asked him specifically what problems he had, and he says it black screens on him a lot.

I have an extra stock cooler from water cooling my cards, and I have a single R9 290X in my fiancée's computer, so this would make a good upgrade for her. But I don't know, it sounds more like a hardware issue than a driver issue. I mean, my cards black screen too, but only when overclocking them too high. If that is happening at stock on this card, who knows if it can be made stable. What do you guys think, should I risk it?


----------



## ikem

Quote:


> Originally Posted by *gatygun*
> 
> Well patch 2 gave me a huge performance boost on my 290 on gta 5. I first hitted like sub 40 fps sometimes close to 30 when i would push myself into that dirt road around the lake in the hills, now it's a rock solid 60+ mostly 70-90 fps.


was that just released? patch 2


----------



## LandonAaron

I just got a patch last night for GTA V. I don't know if it was the first or second, but I too noticed a performance boost. Not so much in terms of FPS, but less stuttering. It pretty much eliminated all stuttering on foot; now it only stutters when driving fast.


----------



## gatygun

Quote:


> Originally Posted by *ikem*
> 
> was that just released? patch 2


No clue exactly; I think it's 1.02 or something, it's the second patch that downloaded for me. I tested it a bit further though, and the game keeps crashing now with this patch. The dirt roads and forest all got a massive performance boost out of the patch; normally it would drop to the 30s/40s when you would race through, but now it stays at 60+. Still, sometimes with the headlights the framerate drops as low as 42 fps in weird places.

Kind of a mess of a game atm, I guess.


----------



## ikem

Quote:


> Originally Posted by *gatygun*
> 
> No clue exactly i think its 1.02 or something, its the second patch that downloaded for me. It tested it a bit further tho. but the game keeps crashing now with this patch. The dirt roads and forest all got a massive performance boost out of the patch. normally it would drop to 30/40's when you would race through it, but now it stays on 60+. Still sometimes with the head lights framerate drops as low as 42 fps at weird places.
> 
> Kinda a mess of a game atm i guess.


k. I will test when i get home.


----------



## kizwan

Quote:


> Originally Posted by *LandonAaron*
> 
> I just came across a guy selling a 290 with no stock cooler, card only. on craigslist for $100. Says he is selling it because he had constant driver problems with it and is tired of messing with it. I asked him specifically what problems he had, and he says it black screens on him a lot.
> 
> I have an extra stock cooler for water cooling my cards, and I have a single R290x in my fiance's computer so this would make a good upgrade for her, but I don't know, sounds more like a hardware issue than a driver issue. I mean my cards black screen too, but only when overclocking them too high. If that is happening at stock on this card, who knows if it can be made to be stable. What do you guys think, should I risk it?


I wouldn't risk it for $100. Unless the black screen can be cured by increasing the power limit or voltage, I won't. Black screens can be caused by many things, from overheating memory and overheating VRMs to voltage that's too low. If the card does indeed black screen at stock and it can't be cured by increasing the power limit, increasing voltage, better cooling, or disabling power saving features (C-States, Windows power option set to High Performance, sleep/hibernate), then you'll lose $100. Unless you can RMA it...


----------



## mfknjadagr8

Quote:


> Originally Posted by *kizwan*
> 
> I wouldn't risk it for $100. Unless the black screen can be cured by increasing power limit or voltage, I won't. Black screen can be caused by many things from overheating memory, overheating vrms to voltage too low. If the card is indeed black screen at stock & it can't be cured by increasing the power limit or increasing voltage or better cooling or disabling power saving feature (C-States, windows power option to high performance, sleep/hibernate), then you'll loose $100. Unless if you can RMA it...


It is a gamble. Mine was black screening due to thermals, because my paste was old and worn when I got it, but chances are that person had already tried that. Perhaps he was trying to crossfire it and didn't disable ULPS, or doesn't have a clue how to properly TIM and mount the blocks. But it's a risk... then again, everything you buy from Craigslist is a risk, so :0


----------



## DividebyZERO

I wonder if Nvidia could do this?


Spoiler: Warning: Spoiler!



So I'd like to share this about the Crossover 44K, although it's potentially specific to AMD GPUs. I decided to play with the PBP 4-way option on the 44K, so I hooked up 4 inputs via my 290X quadfire. Thanks to AMD's VSR I was able to up the resolution per input and achieved
[email protected] 2x2 Eyefinity. I can also do 3840x2160 or a wide range of resolutions if I so choose. While it's super demanding on games at [email protected], the scaling is actually not too bad considering the resolution I'm pushing through the Crossover.

I'll post some screenshots to show what I mean exactly. I can also post some pictures from my phone, but it's not the best for photos. Anyways, enjoy.



Game screenshot tests (you can right click and open in a new tab for full size.)



Cell phone pics (obviously a cell phone doesn't do it justice, but it's all I've got)




As far as screen sync goes, it's perfectly seamless if vsync is on. With no vsync and high FPS there is a slight sync line variance from the top half to the bottom half in the middle of the screen - if FPS is over 60 with no vsync, that is.

I might just use this instead of my other Eyefinity setup. I do lose a lot of screen size from my 3x 39-inch Seikis, but I get almost triple 4K resolution on a single [email protected] And as for gameplay it seems really good, although I'll need to test it more.










P.S. I forgot to say, I think this will be worth checking out in Windows 10, as Win7 has crappy UI scaling at super high resolutions.



In case anyone is confused, this is one 40-inch 4K monitor with 4 inputs split across the panel. As illustrated in the photos, there are NO BEZELS.


----------



## gatygun

Quote:


> Originally Posted by *LandonAaron*
> 
> I just came across a guy selling a 290 with no stock cooler, card only. on craigslist for $100. Says he is selling it because he had constant driver problems with it and is tired of messing with it. I asked him specifically what problems he had, and he says it black screens on him a lot.
> 
> I have an extra stock cooler for water cooling my cards, and I have a single R290x in my fiance's computer so this would make a good upgrade for her, but I don't know, sounds more like a hardware issue than a driver issue. I mean my cards black screen too, but only when overclocking them too high. If that is happening at stock on this card, who knows if it can be made to be stable. What do you guys think, should I risk it?


Everything can have happened with the card tho. I personally find the 290 series to unstable to buy without warranty. I would surely spend a little bit more and have the warranty there. Or else you basically bought a 100 euro brick.


----------



## kizwan

Quote:


> Originally Posted by *DividebyZERO*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I wonder if Nvidia could do this?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> So i'd like to share this about the crossover 44k, although its potentially specific to AMD GPU's most likely. So i decided to play with the PBP 4 way option on the 44k. And so i hooked up 4 inputs, via my 290x quadfire. Thanks to AMD's VSR i was able to up the resolution per input and achieved
> [email protected] 2x2 eyefinity. I can also do 3840x2160 or a wide range of resolutions if i so choose. While its super demanding on games at [email protected] the scaling is actually not to bad considering the resolution im pushing through the crossover.
> 
> Ill post some screenshots to show what i mean exactly. i can also post some pictures from my phone but its not the best for photos. Anyways enjoy.
> 
> 
> 
> Game screnshot tests (you can right click and open in new tab for full size.)
> 
> 
> 
> Cell phone pics(obviously doesnt do it justice with a cell phone but its all i got)
> 
> 
> 
> 
> As far as scren synch goes, its perfectly seamless if vsynch is on. With no vsynch and high FPS there is a slight synch line variance from top half to bottom half. middle of the screen - if fps is over 60 with no vsynch that is.
> 
> I might just use this instead of my other eyefinity setup, i do loose alot of screen size from my 3x39inchseiki's but i get almost triple 4k resolution on a single [email protected] And as for gameplay it seems really good, although ill need to test it more.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> p.s. i forgot to say i think this will be worth checking out in windows 10, as Win7 has crappy UI scaling in super high resolutions.
> 
> 
> 
> 
> 
> in case anyone is confused, this is one 40 inch 4k monitor with 4 inputs split across the panel. As illustrated in photos there are NO BEZELS.


Can you explain more about this? I'm one of the confused; possibly I'm the only one that's confused.







All I know is some 4K monitors are actually two tiled 2K (1920x2160) displays, side by side. When you said 4 inputs split across the panel, what I understand is that your 40-inch monitor actually has four 1080p displays in it, basically in a 2x2 arrangement. Have I got this right?

I still don't understand "4 inputs". Did you mean you use four cables?


----------



## chronicfx

What I don't get is how a monitor can "simulate" more pixels than it actually comes with out of the box.


----------



## DividebyZERO

Quote:


> Originally Posted by *kizwan*
> 
> Can you explain more on this? I'm one of the confused, possibly I'm the only one that confused.
> 
> 
> 
> 
> 
> 
> 
> All I know some 4K monitors actually tiled two 2K (1920x2160) displays, side by side. When you said 4 inputs split across the panel, what I understand is that your 40 inch monitor is actually have four 1080p displays in it, basically the arrangement is 2x2. Am I got this right?
> 
> I'm still don't understand "4 inputs". Did you mean you use four cables?


Yes, 4 cables, since the monitor has 5 inputs total. Each is 1080p, but I enabled VSR on each, up to 3800x1800.

Quote:


> Originally Posted by *chronicfx*
> 
> What i don't get is how a monitor can "simulate" more pixels than it actually comes with out of the box?


I didn't say it was actually displaying more pixels (at the panel); it is still a 4K panel. The scaling is rather decent for the higher resolution and acceptable to me. I am thinking it will do better in Windows 10 for UI scaling.

EDIT: maybe this will help - here are the inputs and the OSD menu for PBP
Quote:


> Originally Posted by *mars2k*
> 
> OK, so 4 connections into 1 panel? Which ones? Or is this 4 connections into 2 panels, or 2 per panel?
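For the curious, the "almost triple 4K" claim earlier checks out if you count rendered pixels for the 2x2 VSR grid against a single 4K frame (figures taken from the posts above):

```python
# 2x2 PBP grid: four 1080p inputs, each upscaled via VSR to 3800x1800
# (the panel itself stays 4K; these are rendered pixels, not physical ones).
per_input_w, per_input_h = 3800, 1800
grid_w, grid_h = 2, 2

total_w = per_input_w * grid_w      # 7600
total_h = per_input_h * grid_h      # 3600
total_px = total_w * total_h        # 27,360,000 rendered pixels

uhd_px = 3840 * 2160                # 8,294,400 pixels in one 4K frame
print(total_w, total_h)             # 7600 3600
print(round(total_px / uhd_px, 2))  # 3.3
```

So the GPUs are being asked for roughly 3.3x the pixels of a single 4K frame, which is why quadfire scaling matters so much here.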


----------



## mfknjadagr8

Quote:


> Originally Posted by *DividebyZERO*
> 
> yes, 4 cables since the monitor has 5 total, each is 1080, but I enabled VSR for each.to 3800x1800
> I didnt say it was actually displaying more pixels(at the panel), it is still a 4k panel. The scaling is rather decent for the higher resolution and acceptable to me. I am thinking it will do better in windows 10 for ui scaling.
> 
> EDIT maybe this will help - here are the imputs and the OSD menu for PBP


That's nice... my dual 290s would cry mercy though, lol.


----------



## Agent Smith1984

The more benches I see on GTA V, the more I realize that the GTX 9 series was a high priced sham for people using 1080p.....

The 290X performs right with the 980 at resolutions over 1440p....

What's crazy is.... two $250 R9 290s will outperform the $1,000 green giant X right now.... Yeah, it uses a lot more power doing it, but still....

The very second Newegg gets those $230 Tri-X refurbs back in stock (if they ever do), I am landing on planet crossfire and never looking back.


----------



## Ramzinho

Quote:


> Originally Posted by *Agent Smith1984*
> 
> The more benches I see on GTA V, the more I realize that GTX 9 series was a high priced sham for people people using 1080P.....
> 
> 290x performs right with the 980 in resolutions over 1440P....
> 
> What's crazy is.... 2) $250 R9 290's will outperform the $1,000 green giant X right now.... Yeah, it uses a lot more power doing it, but still....
> 
> The very second Newegg gets those $230 Tri-x refurbs back in stock (if they ever do), and I am landing on planet crossfire, and never looking back


I'm going to get two 290Xs, only because I believe in DX12 and for price/performance. But I have to say, as drivers mature the Titan will rise a bit or a lot more depending on the drivers. For now, of course, the 290X is king of the hill, and in my modest opinion it's one of the strongest GPUs ever made in the history of gaming.


----------



## Batpimp

Hey guys,

got my 2nd R9 290, a PowerColor PCS+, yesterday.

Ran BF4 to see what temps were gonna be like, and yesterday was hotter than normal here. Will you guys please confirm whether my temps are normal or not?

Ambient temp was 71F/21.6C

CPU after about 20 min was 46C/114.8F

GPU #1
56C

GPU #2
66-69C

I have a phanteks Enthoo Luxe Case.

I am planning on getting some 140mm fans.
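Side note: the Fahrenheit/Celsius figures in the post convert with the standard formulas, which is a quick way to sanity-check a mixed-unit report like this:

```python
# Standard temperature conversions, checked against the figures above.
def f_to_c(f):
    return (f - 32) * 5 / 9

def c_to_f(c):
    return c * 9 / 5 + 32

print(round(f_to_c(71), 1))   # 21.7 (the post truncates this to 21.6)
print(round(c_to_f(46), 1))   # 114.8
```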


----------



## Ramzinho

That's perfectly fine; the second GPU always gets a lot hotter. If you add more airflow to the case it will run perfectly fine... but even with these temps, you are right on the mark.


----------



## nX3NTY

Quote:


> Originally Posted by *Ramzinho*
> 
> i'm going to get two 290Xs only because i believe in DX12 and for price performance. but i've to say as drivers matures the titan will rise a bit or a lot more depending on the drivers. but for now of course the 290X is the king of the hill and in my modest opinion it's one of the strongest GPUs ever made in the history of gaming.


NV cards are way overpriced for what you get. Yes, the power consumption is lower, but this is OVERCLOCK.net







In my country, the most expensive GTX 960 costs very close to a Twin Frozr 290X, and the cheapest GTX 970 costs nearly the same as a 290X Lightning. Simply mind boggling. Agreed that the 290 cards are some of the strongest GPUs ever made, because of the 512-bit memory bus. Screw memory compression, massive bandwidth is what we want









I bought a 290X instead of a 290. I did own a 290 before (Twin Frozr), but for some odd reason this new card (290X Tri-X OC) doesn't make my PSU scream when playing taxing games compared to my previous 290. Any reason why this is happening? I thought the 290X uses more power than the 290?


----------



## mfknjadagr8

Quote:


> Originally Posted by *Batpimp*
> 
> hey guys.
> 
> got my 2nd r9 290 Powercolors PCS+ yesterday.
> 
> Ran BF4 to see what temps were gonna be like and yesterday was hotter than normal here. Will you guys please confirm that my temps are normal or not?
> 
> Ambient temp was 71F/21.6C
> 
> CPU after about 20 min was 46c/114.8F
> 
> GPU #1
> 56C
> 
> GPU #2
> 66-69C
> 
> I have a phanteks Enthoo Luxe Case.
> 
> I am planning on getting some 140mm fans.


You really should fill out Rig Builder instead of listing items, as your motherboard and such aren't listed. Temps seem OK. You might be able to drop the CPU temp a little depending on BIOS settings, but those GPU temps on air are right on. A fan to disperse that air will help the bottom card's temps come down; the air between the cards is pulled through the cooler on the second card, so a fan blowing in between should break up the heat a bit. It won't run as cool as the top card, but you should lessen the gap quite a bit, and it's been shown here a lot that a cooler card is a happier and better running card.


----------



## Ramzinho

Man, I'm in Egypt... the land of overpriced PC hardware, excessive taxes and a lot of bull. I just want to say that, yes, I know what you mean, and I didn't mean to defend Nvidia. I'm just saying it has its fans, and neither Nvidia nor AMD cards give their optimum performance when they are just released. I would say the Titan X will perform a lot better once the drivers mature. And for me, the 290X is the best GPU, price-to-performance wise, ever made.


----------



## Batpimp

Okay, great. Now I have to decide which fans to get.

I hope to drop 5-10 degrees once I get my fans set up correctly.

What is a dangerous temp level in Celsius? 85+?

Now another question.

I am upgrading my PC, and I believe the FX-8150 Bulldozer is my bottleneck. Any way to verify? I read that I should get GPU-Z and check the load on each GPU.

If the load is low, then my CPU is the bottleneck; if the load is high, then the GPUs are the bottleneck. Is that a correct statement?

One final question: does MSI Afterburner work with any card setup? I want to control the fan profiles as mentioned in previous posts.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Batpimp*
> 
> okay great Yes now i have to decide which fans to get.
> 
> I hope to drop 5-10 degrees once i get my fans setup correctly.
> 
> What is a dangerous temp level in Celcius. 85+?
> 
> Now another question.
> 
> I am upgrading my pc and i belive the 8150 FX bulldozer is my bottleneck. Anyway to verify? I read that i should get GPUZ and check the Loads of each GPU.
> 
> If the load is low then my CPU is the bottleneck. If the load is high then the GPUs are the bottleneck. Is that a correct statement?
> 
> One final question, Is MSI afterburner for any card setup? I wanted to control the fan profiles as mentioned in previous posts.



Max safe for the CPU is 62C; for the GPUs I would say 85C is about the max I would want. Theoretically the cards can probably handle near 100C without damage, but in my experience they throttle very hard around 90C; at 90, my second crossfire card wouldn't even get utilized.
2.) No, full load doesn't always mean there's a bottleneck; some games are CPU-heavy, some are GPU-heavy. Your 8150 will perform very well even next to the 8350 and such. I'd need to know more about your motherboard and power supply before recommending further on a CPU overclock, though. Also, some games don't utilize crossfire very well, so that can factor in. My 8350 is clocked to 4.8 and I have two 290s; I see 100 percent CPU usage occasionally, but not all that often. BF4 is a good example of multiple cores working together properly; more developers should follow that lead. IMO, the days of using one or two cores should be over already.
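To put that load-check idea in concrete terms, here is a rough sketch of the heuristic (the usage samples are made up for illustration; in practice you'd log them with GPU-Z or Afterburner, and as noted above, high load alone doesn't rule out a CPU limit in CPU-heavy games):

```python
# Rough heuristic for the GPU-load bottleneck check discussed above.
# Sample values are hypothetical, not real GPU-Z logs.
def likely_cpu_bound(gpu_load_samples, threshold=90.0):
    """If average GPU load sits well below ~100%, the CPU is probably the limiter."""
    avg = sum(gpu_load_samples) / len(gpu_load_samples)
    return avg < threshold

print(likely_cpu_bound([55, 60, 58, 62]))  # GPUs starved -> likely CPU-bound: True
print(likely_cpu_bound([97, 99, 98, 96]))  # GPUs pegged -> likely GPU-bound: False
```

Treat it as a first-pass check only; per-core CPU usage and frame times tell the fuller story.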


----------



## Batpimp

Thanks for the quick response.

This website just hangs EVERY TIME I try to update my RigBuilder.

My avatar selection just hangs too.

It is quite infuriating. I can only imagine it's my crappy work PC, but I have zero issues on any other website. I tried both Chrome and IE, to no avail.

Here is my PCpartpicker link and the items i have.

http://pcpartpicker.com/b/W2GfrH

ASUS Sabertooth 990FX Rev 1

AMD FX-8150 OC'd to 4.4GHz

G.Skill DDR3 2400MHz

EVGA 850W G2

Samsung 850 EVO 500GB

R9 290 PowerColor PCS+

Swiftech 240x liquid cooling setup

Phanteks Enthoo Luxe


----------



## mfknjadagr8

Quote:


> Originally Posted by *Batpimp*
> 
> thanks for the quick response.
> 
> this website just hangs EVERYTIME i try to update my RIGBUILDER.
> 
> my avatar selection just hangs too.
> 
> It is quite infuriating. I can only imagine its my crappy work pc but i have zero issues on any other websites. I just tried to use Chrome and IE but to no avail.
> 
> Here is my PCpartpicker link and the items i have.
> 
> http://pcpartpicker.com/b/W2GfrH
> 
> Asus Sabretooth 990fx Rev 1
> 
> AMD FX8150 OC'ed to 4.4ghz
> 
> Gskill DDR3 2400mhz
> 
> EVGA 850w G2
> 
> Samsung 850 EVO 500gig
> 
> R9 290 Powercolor PCS+
> 
> Swiftech 240x Liquid Cooling Setup
> 
> Phanteks Enthoo Luxe


The PCS+ is a good card. You could push your overclock higher with that cooler; if you need help, head over to the 8320 thread:

http://www.overclock.net/t/1318995/official-fx-8320-fx-8350-vishera-owners-club/48750#post_23820382


----------



## Cyber Locc

Quote:


> Originally Posted by *nX3NTY*
> 
> NV cards are way overpriced for what it's worth. Yes the power consumption is lower but this is OVERCLOCK.net
> 
> 
> 
> 
> 
> 
> 
> At my country, the most expensive GTX 960 costs very close to a TwinForzr 290X, and the cheapest GTX 970 costs nearly the same as 290X LIghtning. Simply mind boggling. Agreed about 290 cards are one of the strongest GPU ever made, because of it's 512-bit memory. Screw memory compression, massive bandwidth is what we want
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I bought 290X instead of 290, I do own 290 before (TwinFrozr) but for some odd reason this new card (290X Tri-X OC) doesn't make my PSU scream when playing taxing games compared to my previous 290, any reason why this is happening? I thought 290X uses more power than 290?


A 290X uses slightly more power, though not much; something may have been wrong with your old card.

However, it could also be differences in clock speeds or other variables; neither card is reference, so neither uses reference voltages. Also, has anything else changed in your system since then? Are you playing the same games at the same resolution? Is the power supply fan failing? I assume that's what you mean by "scream." There are a lot of variables; I would definitely look into it.
Quote:


> Originally Posted by *mfknjadagr8*
> 
> pcs+ is a good card...You could push your overclock higher with that cooler...if you need help head on over to the 8320 thread..
> 
> http://www.overclock.net/t/1318995/official-fx-8320-fx-8350-vishera-owners-club/48750#post_23820382


He could, except he has an 850W power supply, so no, he can't. I just upgraded my PSU for that reason (and I'm about to add a third card). His AMD processor may pull less wattage than my i7 4820K; however, with my system, if I went over 10% power target and then played a game, the PC shut down. You have to keep in mind these cards pull 250W each under full load at stock; an 850W isn't enough if he wants to overclock.

EDIT: Scratch that, dude, that CPU can pull 600W by itself, what in the holy hell. That PSU is not enough at all, period; get a bigger one. http://www.bit-tech.net/hardware/cpus/2011/10/12/amd-fx-8150-review/10 says that at stock full load the CPU pulls 244W, so with everything at stock you're pulling around 750W (at full load) for just the CPU and cards. You need a bigger PSU, Batpimp.

Your bottleneck is that you don't have enough power for your hardware lol. Even if you can get Battlefield or some other game to max out your CPU/GPUs, your PC will shut off.


----------



## Batpimp

It's at 4.4GHz, not 4.8GHz as per that graph.

I used this to calculate whether I could run two GPUs with the CPU overclocked:

http://www.extreme.outervision.com/psucalculatorlite.jsp

It seems to be working; I'm getting no crashes, but for some reason I cannot get the FPS gains I'm looking for.


----------



## Cyber Locc

Quote:


> Originally Posted by *Batpimp*
> 
> its at 4.4ghz not 4.8ghz as per that graph.
> 
> i used this to calculate if i could run 2 gpus and the cpu overclocked
> 
> http://www.extreme.outervision.com/psucalculatorlite.jsp
> 
> it seems to be working im getting no issues as far as crashing for some reason though i cannot get the FPS gains im looking for


That graph puts 3.6GHz at 250W and 4.8GHz at 600W. That's a difference of 350W; you're in the middle, so let's make an educated guess of 175W added for your overclock. That's 425W for your system and CPU, then 500W more for your GPUs: 925W. Are your cards overclocked at all? I'm telling you firsthand that is not enough PSU if you're overclocking your GPUs/CPU. My i7 uses way less power and was shutting down.
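For what it's worth, the same back-of-envelope math done as a straight linear interpolation instead of a midpoint guess (the 250W/600W points are from the bit-tech graph linked above; these are rough estimates, not measurements):

```python
# Linear interpolation of full-load CPU power between the two quoted data points.
def cpu_watts(ghz, lo=(3.6, 250.0), hi=(4.8, 600.0)):
    (g0, w0), (g1, w1) = lo, hi
    return w0 + (ghz - g0) * (w1 - w0) / (g1 - g0)

cpu = cpu_watts(4.4)             # ~483 W at 4.4 GHz, worse than the 425 W guess
total = cpu + 2 * 250.0          # plus two R9 290s at ~250 W each under full load
print(round(cpu), round(total))  # 483 983 -- well past an 850 W unit
```

Either way you slice it, the estimate lands above 850W once both cards are loaded.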

What gains are you getting? Did you completely uninstall your drivers and reinstall after adding crossfire? You have to do that: use the DDU uninstaller, then reinstall. Is crossfire on? (Stupid question, I know, but check the simple stuff first lol.) Have you gotten MSI Afterburner and confirmed that both GPUs are being used?

What's your CPU voltage at? I also think the Extreme calc is wrong in this case, as it says at stock CPU you need 678W total, which would barely power the dual 290s.


----------



## Batpimp

Not overclocking the GPUs, just the CPU.

Yes, assuming that graph is correct and accurate, I might end up returning the card.

Yes, I completely uninstalled the drivers and reinstalled the newest ones. This is really odd.

I am going to tune down the CPU overclock and see if you are right.

I might have to actually unplug the 2nd card and physically reinstall it. I'm not sure what's going on.

Will MSI Afterburner work with my PowerColor card?


----------



## Cyber Locc

Quote:


> Originally Posted by *Batpimp*
> 
> not overclocking the GPUS just the CPU
> 
> Yes assuming that graph is correct and accurate i might end up returning the card.
> 
> yes i completely un installed drivers and re installed the newest drivers. this is really odd.
> 
> i am going to tune down the CPU overclock and see if you are right.
> 
> I might have to actually unplug the 2nd card and reinstall it physically again. im not sure whats going on.
> 
> Will MSI afterburner work with my Powercolor card?



It shouldn't cause the issues you're mentioning; if the PSU were the problem, it would straight up shut down. Yes, Afterburner will work with any card. Once you install it, click Monitor and make sure that both cards are getting load. Also, as long as you don't overclock your cards you should be okay; you're pushing it close, but it should work. What wattage did the PSU calc recommend? Keep in mind that most people will suggest a 1kW PSU for a 295X2, which uses less power than your crossfire setup, so yeah.

How much of an FPS gain are you getting? My 290s damn near double my FPS with crossfire activated (in benches, which is the only time I actually watch my FPS).


----------



## LandonAaron

So I was just playing GTA V when the game crashed with a DirectX driver crash. I clicked OK and closed out the game to get to the desktop, and a few moments later the computer BSOD'ed on me with a 116 code. I looked this code up and it's a video card hardware error. I had the cards OC'd to 1075/1450 when this happened. For all other games I OC at 1100/1500, but GTA for some reason seems overly sensitive, and at that setting I get black screens. I guess it's just an unstable GPU overclock, but it's strange, because I wasn't getting any artifacting or black screens, and I don't ever recall BSODing from a GPU OC.


----------



## Batpimp

The PSU, or the PC for that matter, is not shutting down. Nor are any games or programs.

Nothing is BSODing, black-screening, etc.

The calc said 745W, assuming 90% load, the CPU OC'd to 4.4GHz, and the two R9 290s; it recommended an 850, so I got the G2. I was leaning towards the 1050, but so far that calc seems to be right on the money.

When I play BF4 in crossfire, the GPU load % kicks up and down quickly.
GPU-Z made a log; I'll attach it.

GPU-ZSensorLog.txt 207k .txt file


I am going to assume it's something simple I'm missing, maybe the BIOS version for this mobo, since I've never updated it.
I have not used the disc that came with the card; maybe I need to use it. I went to AMD and just made sure my drivers were up to date.
I uninstalled the drivers and reinstalled, but nothing changed.

My Valley and Heaven scores don't seem to improve. Thanks for your help anyhow. Any other advice, no matter how stupid, please post it here.

Things I have done, off the top of my head:

1. Enabled crossfire.
2. Made sure that games are in fullscreen, as per AMD instructions.
3. Used GPU-Z to confirm the cards are in crossfire. (Strangely, when the cards are idling, one of them dips down to PCIe 1.1 instead of 2.0. From further reading this seemed normal, but is it?)
4. Uninstalled the old driver and reinstalled. (I used Uninstall Programs from Windows, not a dedicated tool.)


----------



## mfknjadagr8

Quote:


> Originally Posted by *Batpimp*
> 
> The PSU or PC for that matter is not shutting down. Nor are any games or programs or anything.
> 
> Nothing is BSOD, Blackscreening etc
> 
> The calc said 745w. It is assuming 90% load and OC of CPU to 4.4ghz and the two R9 290s it recommended a 850 so i got the G2. I was leaning towards the 1050 but so far that calc seems to be right on the money
> 
> When i play bf4 in crossfire the GPU load % kick up and down quick
> GPUZ made a log. ill attach it.
> 
> GPU-ZSensorLog.txt 207k .txt file
> 
> 
> I am going to assume it is something simple im missing maybe my BIOS version for this mobo..since ive never updated it.
> I have not used the disc that came with the card maybe i need to use it. i went to amd and just made sure my drivers were up to date.
> I uninstalled drivers re installed but nothing changed.
> 
> My valley and heaven scores dont seem to improve. Thanks for your help anyhow. Any other advice, no matter how stupid, please put it on here.
> 
> Things i have done off the top of my head.
> 
> 1. enabled crossfire
> 2. made sure that games are in fullscreen as per AMD instructions.
> 3. used GPUZ to confirm the cards are in crossfire. (strangely when the cards are idling one of them dips down to PCIe 1.1 instead of 2.0). In reading further this seemed normal but is it?
> 4. uninstalled old driver and re installed again. (i used uninstall programs from windows not a software program)


I would definitely check for a BIOS update; it can't hurt, as most Saber guys run the latest BIOS. Try removing the driver with DDU in safe mode (DDU will prompt to restart):

http://www.wagnardmobile.com/DDU/

The base uninstall program leaves things behind.
Also, since you're using crossfire, disable ULPS; this helps a lot with crossfire issues.

You can also use HWiNFO64 or AB or whatever to monitor usage, to make sure it's utilizing your second card, and to log, to make sure your GPU core or VRMs aren't overheating while in fullscreen.


----------



## Gobigorgohome

I need more GPU power than my Lightning R9 290X can deliver, so I'm thinking of adding an R9 295X2 to the system and running the R9 290X with the R9 295X2. Is that okay? I heard some talk in here earlier about some problems... the R9 295X2 is about the same price as two second-hand R9 290Xs now.

Tri-fire would be awesome again, but if there are problems with running the R9 290X alongside the R9 295X2, I think I'll stop thinking about it.


----------



## gatygun

I'm also thinking about adding a second 290 Tri-X; I just think my i7 870 would have a heart attack then.


----------



## DividebyZERO

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I need more GPU-power than my Lightning R9 290X are capable of handling, thinking of adding a R9 295X2 to the system, to run the R9 290X with the R9 295X2, is that okay? I heard some talk in here earlier about some problems ... the R9 295X2 is about the same price as two second-handed R9 290X's now.
> 
> 
> 
> 
> 
> 
> 
> Tri-fire would be awesome again, but if there is some problems related to use the R9 290X with the R9 295X2 I think I will stop think about it.


It might be best to ask this in the 295X2 owners thread; I believe some are running that configuration.


----------



## Cyber Locc

Quote:


> Originally Posted by *Batpimp*
> 
> The PSU or PC for that matter is not shutting down. Nor are any games or programs or anything.
> 
> Nothing is BSOD, Blackscreening etc
> 
> The calc said 745w. It is assuming 90% load and OC of CPU to 4.4ghz and the two R9 290s it recommended a 850 so i got the G2. I was leaning towards the 1050 but so far that calc seems to be right on the money
> 
> When i play bf4 in crossfire the GPU load % kick up and down quick
> GPUZ made a log. ill attach it.
> 
> GPU-ZSensorLog.txt 207k .txt file
> 
> 
> I am going to assume it is something simple im missing maybe my BIOS version for this mobo..since ive never updated it.
> I have not used the disc that came with the card maybe i need to use it. i went to amd and just made sure my drivers were up to date.
> I uninstalled drivers re installed but nothing changed.
> 
> My valley and heaven scores dont seem to improve. Thanks for your help anyhow. Any other advice, no matter how stupid, please put it on here.
> 
> Things i have done off the top of my head.
> 
> 1. enabled crossfire
> 2. made sure that games are in fullscreen as per AMD instructions.
> 3. used GPUZ to confirm the cards are in crossfire. (strangely when the cards are idling one of them dips down to PCIe 1.1 instead of 2.0). In reading further this seemed normal but is it?
> 4. uninstalled old driver and re installed again. (i used uninstall programs from windows not a software program)


Mfk made some good points as well. As I said, I don't know much about AMD CPUs; however, my rig was fine with an 850W as long as I didn't overclock my GPUs (or at least didn't mess with the power limit), so you should be okay if you don't OC your GPUs.

What is the FPS gain in Heaven? Also, can you swap your cards around and see if the usage is the same?

Yes, that is usual; with ULPS disabled, I don't think that will happen. That's the second card sleeping.

It could be a driver issue, as said, because when I first put the second card in I had similar issues that were fixed by DDU in safe mode, then a CCleaner run, then a reinstall.

Also, from that GPU-Z readout it looks fine, but I don't really know how to read it lol. Get Afterburner if you haven't yet and use its monitor. Stretch the window wide (you can stretch it outside the screen). Game for a minute or run Heaven, then use the Snipping Tool to get a PNG of the clock graphs, then one of the temps, and post those please.


----------



## fishingfanatic

Personally I would've gone for the 1050, as to me that's putting the PSU under a heavy load; there's little room for leeway with the 850.

It might be fine, but running the PSU that close to max will shorten its lifespan, IMHO.

I don't recall exactly, but generally the PSU should be running at around 65-75% of its rating under load. So the 1050 would've allowed 300W of leeway, and with OCing there's plenty left for variance.
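To put numbers on that rule of thumb (using the 745W draw the calculator estimated earlier in the thread; the 65-75% band is just the guideline above, not a spec):

```python
# How hard each PSU would be working at the estimated 745 W system draw.
def load_pct(draw_w, rated_w):
    return 100.0 * draw_w / rated_w

for rated in (850, 1050):
    print(rated, round(load_pct(745, rated), 1))  # 850 -> 87.6, 1050 -> 71.0
```

So at the estimated draw, the 850 sits well above the suggested band while the 1050 lands inside it.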

Perhaps a psu expert can further elaborate on this. There are a few on here, and some excellent psu reviews and breakdowns as well.







There are some PSUs they recommend staying away from.

FF









Here's something I found doing a quick search:

http://www.overclock.net/t/183810/faq-recommended-power-supplies


----------



## Batpimp

Hey, thanks again for the response, Cyber. I know you don't have to help me, and you are.

Once I get home today I'll be running more tests and doing what you're telling me. This is my first crossfire setup, so I'm sure I'm not doing something totally correct. I don't want to give up on trying to make it work either. I just hope I didn't spend 200 bucks for nothing lol.

Your link to bit-tech was helpful; I had never seen that site before. For right now I'm stuck at work for another six-ish hours.

I am planning on using the DDU uninstaller you guys are mentioning, physically removing the 2nd card, and starting fresh.

Assuming I start over, here is what I plan to do:

1. Uninstall drivers.
2. Shut down.
3. Remove the 2nd video card.
4. Boot up with the 1st video card only.
5. Install the most recent non-beta AMD drivers, which are the Omega drivers.
6. Run a match of BF4.
7. Run a Heaven and Valley test.
8. Shut down.
9. Install the 2nd card.
10. Boot up.
11. Activate crossfire.
12. Run BF4, Heaven, and Valley (in fullscreen, using Mantle, as per AMD instructions).
13. Report my findings here.

Does that seem like the correct order of steps?


----------



## chronicfx

Quote:


> Originally Posted by *Gobigorgohome*
> 
> I need more GPU-power than my Lightning R9 290X are capable of handling, thinking of adding a R9 295X2 to the system, to run the R9 290X with the R9 295X2, is that okay? I heard some talk in here earlier about some problems ... the R9 295X2 is about the same price as two second-handed R9 290X's now.
> 
> 
> 
> 
> 
> 
> 
> Tri-fire would be awesome again, but if there is some problems related to use the R9 290X with the R9 295X2 I think I will stop think about it.


I used to run a 7990 with a 7970. Check to make sure but I would bet it works.


----------



## gertruude

I did some FurMark testing on my 290 and the VRMs went right up to 105C.

Is this normal? It's scaring me lol


----------



## Widde

Is there a way to make AMD cards downsample from, like, 1440p? I've tried VSR, and the only way that works for me is to change the resolution on the desktop (which makes it blurry) so the games show the other resolution options. In WoW and Diablo 3 it seems to work, though. GeDoSaTo doesn't work either (GTA V).


----------



## mfknjadagr8

Quote:


> Originally Posted by *Batpimp*
> 
> hey thanks again for the response cyber. I know you don't have to help me and you are.
> 
> Once i get home today ill be running more tests and doing what you are telling me. This is my 1st crossfire setup so I am sure i am not doing something totally correct. i don't want to give up on trying to make it worth either. I just hope i didn't spend 200 bucks for nothing lol.
> 
> Your link to bittech was helpful i had never seen that site before. For right now im stuck at work for another 6 ish hours.
> 
> I am planning on using the DD uninstaller you guys are mentioning. physically removing the 2nd card and starting fresh.
> 
> Assuming i start over. here is what i plan to do.
> 
> 1. Uninstall drivers.
> 2. shut down
> 3. remove 2nd video card.
> 4. boot up with 1st video card only.
> 5. install the most recent NON beta drivers for AMD which are omega drivers.
> 6. run a match of bf4
> 7. run a heaven and valley test.
> 8. shut down
> 9. install 2nd card
> 10. boot up
> 11. activate crossfire
> 12. run bf4 and heaven and valley (in full screen using mantle as per AMD instructions)
> 13. report my findings here.
> 
> Does that seem like the correct steps to do in that order?


When you run DDU, be sure to let it restart in safe mode when it asks; this allows all files to be removed, even the ones normally in use when the regular uninstaller runs. I would 100 percent disable ULPS BEFORE you start your crossfire runs. You should be getting decent gains from the second card, especially in BF4; I run my 290s at max settings with 200 percent resolution scale at 1920x1080 and get around 100 FPS most of the time.


----------



## Batpimp

Quote:


> Originally Posted by *mfknjadagr8*
> 
> when you run ddu be sure when it asks to restart in safe mode you let it...this allows all files to be removed even the ones normally in use when the regular uninstaller runs...I would 100 percent disable ulps BEFORE you start your crossfire runs...You should be getting decent gains from the second card especially in bf4....I run my 290s max settings with 200 percent res scale 1920 x 1080 with around 100fps most of the time...


Okay, great, thanks for the tip.

Since you have crossfire 290s, can you say you ever get a consistent 120 FPS with all settings low at 1080p? I never expected to run the game with everything on ultra at 200 FPS, but I was expecting to get 120 FPS no problem on low settings, especially since a single card was giving me 90 FPS.


----------



## Roboyto

Quote:


> Originally Posted by *gertruude*
> 
> i did some furmark testing o n my 290 and vrms went right up to 105C
> 
> is this normal?.....its scaring me lol


FurMark = bad. Use some other benchmark(s) to see how the card performs.

If you're air-cooled and you have overclocked/overvolted the card, then this can happen very easily.

The VRMs are rated to handle temperatures like this, but no one here would suggest it if you value your card lasting a while; the 80s are about where you want to be for daily extended use.

See the "290(X) Need to Know" in my sig for additional information.


----------



## LandonAaron

Quote:


> Originally Posted by *Widde*
> 
> Is there a way to make amd cards downscale from like 1440p? I've tried VSR and the only way that works for me is to change the resolution on the desktop (Makes it blurry) so the games show the other resolution options, In wow and diablo 3 it seems to work though. Gedosato doesnt work either (Gta V)


If you just enable VSR in CCC you can set your desktop resolution to whatever you want. Keep it at 1080p, then in game change the resolution to 1440p, and it will be downsampled to 1080p. I did this a lot before upgrading my monitor to a 1440p one, so I could get an idea of what sort of performance to expect.


----------



## gatygun

Quote:


> Originally Posted by *Widde*
> 
> Is there a way to make amd cards downscale from like 1440p? I've tried VSR and the only way that works for me is to change the resolution on the desktop (Makes it blurry) so the games show the other resolution options, In wow and diablo 3 it seems to work though. Gedosato doesnt work either (Gta V)


Right-click on the desktop > AMD Catalyst Control Center > My Digital Flat-Panels > Properties > enable GPU up-scaling and enable downscaling.

Apply.

Now higher resolutions will show up in games, and your desktop will just stay at your original resolution.

It works in all the new games I've tried, which is a lot; even GTA 5 has no issue with it.


----------



## Widde

Quote:


> Originally Posted by *LandonAaron*
> 
> If you just click enable VSR in CCC you can set your desktop resolution to whatever you want, just keep it at 1080p, and then in game change the resolution to 1440p, and then it will be downsampled to 1080p. I did this alot before upgrading my monitor to a 1440p one so I could get an idea of what sort of performance to expect.


The problem is that the in-game resolutions don't show up; 1920x1080 is the max as usual. It only seems to work in Diablo 3.

CoD: Advanced Warfare doesn't show them, Hitman: Absolution doesn't show them, GTA V doesn't show them.









http://piclair.com/w3wnx pic of the settings


----------



## Cyber Locc

Quote:


> Originally Posted by *Batpimp*
> 
> hey thanks again for the response cyber. I know you don't have to help me and you are.
> 
> Once i get home today ill be running more tests and doing what you are telling me. This is my 1st crossfire setup so I am sure i am not doing something totally correct. i don't want to give up on trying to make it worth either. I just hope i didn't spend 200 bucks for nothing lol.
> 
> Your link to bittech was helpful i had never seen that site before. For right now im stuck at work for another 6 ish hours.
> 
> I am planning on using the DD uninstaller you guys are mentioning. physically removing the 2nd card and starting fresh.
> 
> Assuming i start over. here is what i plan to do.
> 
> 1. Uninstall drivers.
> 2. shut down
> 3. remove 2nd video card.
> 4. boot up with 1st video card only.
> 5. install the most recent NON beta drivers for AMD which are omega drivers.
> 6. run a match of bf4
> 7. run a heaven and valley test.
> 8. shut down
> 9. install 2nd card
> 10. boot up
> 11. activate crossfire
> 12. run bf4 and heaven and valley (in full screen using mantle as per AMD instructions)
> 13. report my findings here.
> 
> Does that seem like the correct steps to do in that order?


NP man that's why we are here








Yes, that looks good, and as MF said, run DDU in safe mode, after which get CCleaner and run a temp-file cleanup and a registry cleanup before reinstalling the drivers.

If you follow that order, you need to loop back to step 1 when you install the second card. AMD drivers for one card are different from drivers for crossfire; the installer decides which you need by checking whether crossfire is present, so you need to reinstall fresh drivers after adding the crossfired card. This is what threw me for a loop: these were my first AMD cards in a very long time, and with Nvidia I never needed to do this. The result was that my second card wasn't working correctly until I found this out, and it could very well be your issue as well.

There is an easier way to do it. Do this:

1. Run DDU in safe mode to remove the drivers.
2. Shut down.
3. Take out the second GPU and clean the contact fingers with rubbing alcohol, then re-seat the card. (Also ensure you are using the two tan PCIe slots, not the black one; the black one is x4 only.)
4. Run CCleaner for temp files and the registry cleaner (run the reg cleaner a few times until zero errors are found, at least twice in a row). (It's free from piriform.com if you don't already have it.)
5. Install the Omega drivers.
6. Check if crossfire is enabled; if it is, disable it (you're now only using one GPU).
7. Run Heaven and record the results.
8. Enable crossfire; in MSI AB, disable ULPS and restart.
9. Run Heaven with MSI Afterburner's graph stretched very wide (you can stretch it outside the screen; we just want the GPU usage graph). Record the results and snip the Heaven GPU graphs. (NOTE: check your CPU usage graphs as well; if your CPU is hitting 100% load, snip that graph too. Same with temps: if they're high, snip those as well.)
10. Upload the Heaven results and the snips of the AB graphs here.


----------



## gatygun

Quote:


> Originally Posted by *Widde*
> 
> The problem is that the ingame resolutions doesnt show up, 1920x1080 is the max as usual. Only seem to work in diablo 3
> 
> CoD advanced warfare doesnt show up, Hitman absolution doesnt show up, Gta V doenst show up
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://piclair.com/w3wnx pic of the settings


It works in every game for me.

Here's a video of me using it on a 1080p screen with my Windows desktop at 1080p (putting it at 3200x1800 and back to 1080p):




You probably didn't enable the option in the control screen as I said earlier. If you did and nothing happens, try removing your drivers with that driver-uninstall tool and reinstalling the newest beta drivers, 15.4; it's probably a driver issue then.

I have zero issues in any game with it. Works like a charm.


----------



## Batpimp

Quote:


> Originally Posted by *cyberlocc*
> 
> NP man that's why we are here
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Yes, that looks good. As MF said, run DDU in safe mode, then get CCleaner and run a temp-file cleanup and a registry cleanup before reinstalling drivers.
> 
> If you follow that order, you need to loop back to step 1 when you install the second card. AMD drivers for one card are different from drivers for CrossFire; the installer decides which drivers you need by checking whether CrossFire is present, so you need to reinstall fresh drivers after adding the second card. This is what threw me for a loop. I didn't do this, as these were my first AMD cards in a very long time and with Nvidia I never needed to, and the result was that my second card wasn't working correctly. This could very well be your issue as well.
> 
> There is an easier way to do it:
> 
> 1. DDU in safe mode to remove the drivers.
> 2. Shut down.
> 3. Take out the second GPU and clean the fingers with rubbing alcohol, then reseat the card. (Also ensure you are using the two tan PCIe slots, not the black one; the black one is x4 only.)
> 4. Run CCleaner for temp files and the registry cleaner (run the reg cleaner until zero errors are found at least twice in a row). (It's free from piriform.com if you don't already have it.)
> 5. Install the Omega drivers.
> 6. Check if CrossFire is enabled; if it is, disable it (you're now only using 1 GPU).
> 7. Run Heaven and record the results.
> 8. Enable CrossFire, disable ULPS in MSI AB, and restart.
> 9. Run Heaven with MSI Afterburner's graph stretched wide (we just want the GPU usage graph); record the results and snip the Heaven GPU graphs. (NOTE: check your CPU usage graphs as well; if your CPU is hitting 100% load, snip that graph too. Same with temps if they're high.)
> 10. Upload the Heaven results and snips of the AB graphs here.


Okay, I did what you said, and from this first set of tests it seems to have worked!


















*Please take note: in the MSI AB pics, the drop-off is when the test ended and I just left the room for a bit.*


----------



## Widde

Quote:


> Originally Posted by *gatygun*
> 
> Works in every game for me:
> 
> Video of me using it on a 1080p screen, with my Windows desktop at 1080p (putting it at 3200x1800 and back to 1080p):
> 
> 
> 
> 
> You probably didn't enable the option in the control panel, as I said earlier. If you did and nothing happens, try removing your drivers with that driver-uninstall tool (DDU) and reinstalling the newest 15.4 beta drivers. It's probably a driver issue then.
> 
> I have zero issues in any game with it. Works like a charm.


I have it enabled in CCC, and it didn't work on the Omega drivers either; I'm on the 15.4 beta drivers now. Does the scaling mode matter?


----------



## Cyber Locc

Quote:


> Originally Posted by *Batpimp*
> 
> Okay, I did what you said, and from this first set of tests it seems to have worked!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Please take note: in the MSI AB pics, the drop-off is when the test ended and I just left the room for a bit.*


Okay, I may have some bad news. It's been a long time since I've run Heaven, and I never run it at High with no AA like you did, but my maxed-settings benches get more FPS than you did with two cards: I get about 69 FPS from one card and 134 from CrossFire. So unless the Omega drivers have destroyed CrossFire scaling, something is still wrong.

Try running Heaven at 1080p with maxed settings (MSAA x4, Tessellation Extreme, quality Ultra) and report back the results. I think your CPU, or something else, may be bottlenecking you.

Also, maybe run the High test again and check your CPU graph; post all 8 cores here. If you're happy with the FPS then that's all that matters, but I'm pretty sure you're getting bottlenecked.
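For a quick sanity check, those numbers can be turned into a scaling percentage. A minimal sketch (the 69/134 FPS figures are the ones quoted above; the function name is just for illustration):

```python
def crossfire_scaling(single_fps: float, dual_fps: float) -> float:
    """Return the FPS gain from the second card as a % of one card's output."""
    gain = dual_fps - single_fps      # extra FPS the second card contributes
    return 100.0 * gain / single_fps  # 100% would be perfect scaling

# Healthy scaling, as in the numbers above: roughly 94%.
print(round(crossfire_scaling(69, 134)))  # -> 94
```

Anything far below that (like the ~40% gain discussed here) points at something other than the cards themselves.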


----------



## Batpimp

Quote:


> Originally Posted by *cyberlocc*
> 
> Okay, I may have some bad news. It's been a long time since I've run Heaven, and I never run it at High with no AA like you did, but my maxed-settings benches get more FPS than you did with two cards: I get about 69 FPS from one card and 134 from CrossFire. So unless the Omega drivers have destroyed CrossFire scaling, something is still wrong.
> 
> Try running Heaven at 1080p with maxed settings (MSAA x4, Tessellation Extreme, quality Ultra) and report back the results. I think your CPU, or something else, may be bottlenecking you.
> 
> Also, maybe run the High test again and check your CPU graph; post all 8 cores here. If you're happy with the FPS then that's all that matters, but I'm pretty sure you're getting bottlenecked.


Yup, I've just been playing some BF4 and there's not even a 10 FPS gain; other, older games I play gain even less. It tested better, but I'm not actually gaining anything. I'm thinking it's my 2011-2012-ish FX-8150. I think I'm going to return this GPU and invest in upgrading to Intel. With Win 10 coming out soon, though, I wonder if I should hold out and try to make my AMD setup work.

I literally read 30-50 pages of forum posts saying the same thing, and then I wondered: why is this so hard?


----------



## Cyber Locc

Quote:


> Originally Posted by *Batpimp*
> 
> Yup, I've just been playing some BF4 and there's not even a 10 FPS gain; other, older games I play gain even less. It tested better, but I'm not actually gaining anything. I'm thinking it's my 2011-2012-ish FX-8150. I think I'm going to return this GPU and invest in upgrading to Intel. With Win 10 coming out soon, though, I wonder if I should hold out and try to make my AMD setup work.
> 
> I literally read 30-50 pages of forum posts saying the same thing, and then I wondered: why is this so hard?


Well, BF4 is an extremely CPU-heavy game, so that makes sense. Tell you what: do the graph thing again while playing BF4 and see what your CPU is doing; it's probably maxed out, lol. Don't get me wrong, from what I know AMD CPUs can be good for some uses, but gaming isn't one of them, lol, but I dunno.

Since we keep coming back to BF4, that tells me it's probably the CPU, as that game is very CPU-heavy and favors a highly clocked 6-8 core i7.


----------



## Batpimp

Quote:


> Originally Posted by *cyberlocc*
> 
> Well, BF4 is an extremely CPU-heavy game, so that makes sense. Tell you what: do the graph thing again while playing BF4 and see what your CPU is doing; it's probably maxed out, lol. Don't get me wrong, from what I know AMD CPUs can be good for some uses, but gaming isn't one of them, lol, but I dunno.
> 
> Since we keep coming back to BF4, that tells me it's probably the CPU, as that game is very CPU-heavy and favors a highly clocked 6-8 core i7.


Yup, I read some other forum posts and some guy with a 4770K was being bottlenecked:
http://www.overclock.net/t/1440600/r9-290-having-problem-with-crossfire/360

I'll try to update a graph for you.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Batpimp*
> 
> Yup, I read some other forum posts and some guy with a 4770K was being bottlenecked:
> http://www.overclock.net/t/1440600/r9-290-having-problem-with-crossfire/360
> 
> I'll try to update a graph for you.


Tell that to the guys running AMD with 4 cards and seeing proper scaling and no bottlenecking from the CPU. The 8150 isn't quite as good as the 8300 and up, but it's not bad by any means; I could see the CPU being an issue if you had an older quad. I see proper scaling at 4.0 GHz on this 8320, so I dunno. Although if you go to Intel you will get better FPS anyhow, and it supports quad-channel RAM in some cases and higher IMC clocks, so it's not really a losing move going that route, except for the expense.


----------



## Raephen

Quote:


> Originally Posted by *gertruude*
> 
> i did some furmark testing o n my 290 and vrms went right up to 105C
> 
> is this normal?.....its scaring me lol


For FurMark, completely normal. It's a ridiculous worst-case-scenario test. I know because I've used it on my 290, and its only real use seems to be frying eggs on your GPU.


----------



## gatygun

Quote:


> Originally Posted by *Widde*
> 
> Have it enabled in CCC, And it didnt work on the omega drivers and I'm on 15.4 beta drivers, Does the scaling mode matter?


Well, when I open CCC and go to my display settings, CCC somehow opens the profile of my second screen (which I don't game on) first.

This means that if I enable those features, as you did, it won't work on my main screen; it's only enabled for my second screen, where I browse the internet and watch movies.

Make sure you select your main screen (at the red circle).

I didn't notice this when I first tried it out; maybe you overlooked it too, like I did when I started messing with it.

Here's a screenshot of what I mean (it's in Dutch, but you get the idea).



If you don't have more screens connected, then I'd advise you to completely clean the drivers from Windows, reinstall the newest ones (15.4), and see if it works then (use the DDU tool, I believe it's called). If that doesn't fix it, then I have no clue anymore.


----------



## nX3NTY

Guys, I get a very high idle temperature on a single monitor, around 54-56C. Is this normal even for the Tri-X cooler? At load it reaches a maximum of around 80C in GTA V.


----------



## cephelix

Quote:


> Originally Posted by *nX3NTY*
> 
> Guys I get a very high idle temperature on single monitor, around 54-56C. Is this normal even for Tri-X cooler? At load it reaches maximum of around 80C in GTA V


That is high. What's your ambient like? Mine, granted it's an MSI Twin Frozr, currently idles at 43C.


----------



## Wezzor

Quote:


> Originally Posted by *nX3NTY*
> 
> Guys I get a very high idle temperature on single monitor, around 54-56C. Is this normal even for Tri-X cooler? At load it reaches maximum of around 80C in GTA V


Damn, that's a lot. I'm getting 35C idle and 65-70C during gaming. My card is even overclocked and I'm not using any custom fan configuration either.


----------



## faizreds

Malaysia is a hot country.


----------



## gatygun

40C idle here, ~310 core / 1450 mem clock, 31% fan speed, Tri-X.


----------



## nX3NTY

Quote:


> Originally Posted by *cephelix*
> 
> That is high. What's your ambient like? Mine, granted it's an MSI Twin Frozr, currently idles at 43C.


Quote:


> Originally Posted by *Wezzor*
> 
> Damn, that's a lot. I'm getting 35C idle and 65-70C during gaming. My card is even overclocked and I'm not using any custom fan configuration either.


Ambient is around 35C, and sometimes I use an air conditioner to reduce room temperature. Something's not quite right here; even at 100% fan speed I'm still getting 53C, even with the case open! I just repasted with Arctic Silver. This is weird.


----------



## moorhen2

Quote:


> Originally Posted by *gatygun*
> 
> 40C idle here, ~310 core / 1450 mem clock, 31% fan speed, Tri-X.


310 core, hmmm.


----------



## Widde

Quote:


> Originally Posted by *gatygun*
> 
> Well, when I open CCC and go to my display settings, CCC somehow opens the profile of my second screen (which I don't game on) first.
> 
> This means that if I enable those features, as you did, it won't work on my main screen; it's only enabled for my second screen, where I browse the internet and watch movies.
> 
> Make sure you select your main screen (at the red circle).
> 
> I didn't notice this when I first tried it out; maybe you overlooked it too, like I did when I started messing with it.
> 
> Here's a screenshot of what I mean (it's in Dutch, but you get the idea).
> 
> 
> 
> If you don't have more screens connected, then I'd advise you to completely clean the drivers from Windows, reinstall the newest ones (15.4), and see if it works then (use the DDU tool, I believe it's called). If that doesn't fix it, then I have no clue anymore.


Yeah, I also noticed it took the wrong screen, but I've enabled it on both screens. Gonna try disabling it on my 2nd monitor.


----------



## Wezzor

Quote:


> Originally Posted by *nX3NTY*
> 
> Ambient is around 35C, and sometimes I use an air conditioner to reduce room temperature. Something's not quite right here; even at 100% fan speed I'm still getting 53C, even with the case open! I just repasted with Arctic Silver. This is weird.


Try contacting @tsm106. I once had a really weird problem too, but he helped me solve it without a problem.


----------



## nX3NTY

Quote:


> Originally Posted by *Wezzor*
> 
> Try contacting @tsm106. I once had a really weird problem too, but he helped me solve it without a problem.


Thanks, I already PM'd him regarding the issue; awaiting his reply.


----------



## nX3NTY

I partially fixed the problem: I flashed to the Vapor-X BIOS and now it idles at around 45-46C, because the memory clock now stays at 150MHz at idle instead of the 1300MHz it was at before.

EDIT: That did the trick. Load temperature now is around 74-75C. Stock everything, which is nice.


----------



## Wezzor

Quote:


> Originally Posted by *nX3NTY*
> 
> I partially fixed the problem: I flashed to the Vapor-X BIOS and now it idles at around 45-46C, because the memory clock now stays at 150MHz at idle instead of the 1300MHz it was at before.
> 
> EDIT: That did the trick. Load temperature now is around 74-75C. Stock everything, which is nice.


Nice to hear!


----------



## Batpimp

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Tell that to the guys running AMD with 4 cards and seeing proper scaling and no bottlenecking from the CPU. The 8150 isn't quite as good as the 8300 and up, but it's not bad by any means; I could see the CPU being an issue if you had an older quad. I see proper scaling at 4.0 GHz on this 8320, so I dunno. Although if you go to Intel you will get better FPS anyhow, and it supports quad-channel RAM in some cases and higher IMC clocks, so it's not really a losing move going that route, except for the expense.


Alright.

Besides everything you said, do you have any solution for me? I'm willing to try anything and not give up, but NOTHING is working.


----------



## Cyber Locc

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Tell that to the guys running AMD with 4 cards and seeing proper scaling and no bottlenecking from the CPU. The 8150 isn't quite as good as the 8300 and up, but it's not bad by any means; I could see the CPU being an issue if you had an older quad. I see proper scaling at 4.0 GHz on this 8320, so I dunno. Although if you go to Intel you will get better FPS anyhow, and it supports quad-channel RAM in some cases and higher IMC clocks, so it's not really a losing move going that route, except for the expense.


I actually looked into this before telling him that, thinking the same. The 8350 is a Piledriver CPU, right? Yeah, it's a lot stronger. Also, the guys I've seen running quadfire aren't doing it with 290s, nor with a single 1080p screen; the few I've seen are running very high resolutions like 3x1440p. I'm also not necessarily blaming the CPU; that's just an assumption, and to know for sure we need the CPU graphs. However, his first test does show CPU usage at 70%+ on 2 cores during a single-card Heaven run on High, while my i7 barely hits 20-25% load during Heaven at Ultra, maxed.

Regardless of whether it's his CPU or not, something is wrong: he's only getting a 10 FPS gain in BF4 and about a 40% gain in Heaven, and that isn't right. His cards are loading correctly, so the only possibility in my mind is a CPU bottleneck.


----------



## Batpimp

Quote:


> Originally Posted by *cyberlocc*
> 
> Well, BF4 is an extremely CPU-heavy game, so that makes sense. Tell you what: do the graph thing again while playing BF4 and see what your CPU is doing; it's probably maxed out, lol. Don't get me wrong, from what I know AMD CPUs can be good for some uses, but gaming isn't one of them, lol, but I dunno.
> 
> Since we keep coming back to BF4, that tells me it's probably the CPU, as that game is very CPU-heavy and favors a highly clocked 6-8 core i7.


I'm not completely sure what the bottleneck is, if there even is one, but something is wrong. Aside from doing a complete reinstall of Win 8.1, I've run out of ideas; I tried everything you said, and what others have said. I would really like this to work, but what to do, what to do?


----------



## Batpimp

Quote:


> Originally Posted by *cyberlocc*
> 
> I actually looked into this before telling him that, thinking the same. The 8350 is a Piledriver CPU, right? Yeah, it's a lot stronger. Also, the guys I've seen running quadfire aren't doing it with 290s, nor with a single 1080p screen; the few I've seen are running very high resolutions like 3x1440p. I'm also not necessarily blaming the CPU; that's just an assumption, and to know for sure we need the CPU graphs. However, his first test does show CPU usage at 70%+ on 2 cores during a single-card Heaven run on High, while my i7 barely hits 20-25% load during Heaven at Ultra, maxed.
> 
> Regardless of whether it's his CPU or not, something is wrong: he's only getting a 10 FPS gain in BF4 and about a 40% gain in Heaven, and that isn't right. His cards are loading correctly, so the only possibility in my mind is a CPU bottleneck.


I'm going to assume it's the CPU. I guess I don't need 144 FPS, which was my goal in BF4 and similar games; I'll have to be happy with 90 FPS. I'm going to invest in an almost-top-of-the-line Intel build and see where it goes from there; maybe move to Nvidia and sell my 290.


----------



## Cyber Locc

Quote:


> Originally Posted by *Batpimp*
> 
> I'm going to assume it's the CPU. I guess I don't need 144 FPS, which was my goal in BF4 and similar games; I'll have to be happy with 90 FPS. I'm going to invest in an almost-top-of-the-line Intel build and see where it goes from there; maybe move to Nvidia and sell my 290.


Did you check your CPU usage in the graph? Let me see what I can turn up. Also, try what I suggested when you get a chance: run Heaven at maxed settings, as your FPS will drop quite a lot.

Also try the reverse: see if lowering your settings in BF4 raises the FPS. If it doesn't, your CPU is the bottleneck; if it does, we're back at square one.
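That settings test boils down to a simple decision rule; a toy sketch with made-up FPS numbers (nothing here is measured, and the 5% tolerance is just an illustrative threshold):

```python
def likely_bottleneck(fps_maxed: float, fps_minimum: float,
                      tolerance: float = 0.05) -> str:
    """If FPS barely moves when graphics settings drop, the GPU wasn't the limit."""
    if fps_minimum <= fps_maxed * (1 + tolerance):
        return "CPU (or something other than the GPU)"
    return "GPU"

# Illustrative only: 90 FPS maxed vs 92 FPS at minimum settings.
print(likely_bottleneck(90, 92))  # -> CPU (or something other than the GPU)
```

The idea: lowering settings only removes GPU work, so if the frame rate stays pinned, the limit sits elsewhere in the system.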


----------



## zealord

Just a random question: does anyone run a 290X with a custom fan profile from MSI Afterburner?

My card goes to 50% fan speed on boot-up, but after I go into the options it goes back to 20%. No idea why this is happening; it's so random that I'm absolutely lost. It's not that bad, and it's probably just a bug with Afterburner.

Tri-X 290X, newest AMD driver, newest MSI Afterburner version.


----------



## Batpimp

Quote:


> Originally Posted by *cyberlocc*
> 
> Did you check your CPU usage in the graph? Let me see what I can turn up. Also try what I suggested when you get a chance run heaven at maxxed settings as your fps will drop quite a lot.


Alright, once I get home I'll run more tests.

I was running BF4 with the console command perfoverlay.drawgraph 1; the CPU was running at about 45% most of the time, and the GPUs about the same.

I'll try the Heaven tests and anything else tonight.


----------



## Wezzor

Quote:


> Originally Posted by *zealord*
> 
> Just a random question: does anyone run a 290X with a custom fan profile from MSI Afterburner?
> 
> My card goes to 50% fan speed on boot-up, but after I go into the options it goes back to 20%. No idea why this is happening; it's so random that I'm absolutely lost. It's not that bad, and it's probably just a bug with Afterburner.
> 
> Tri-X 290X, newest AMD driver, newest MSI Afterburner version.


I've got the same problem here. Sometimes, though very rarely, it goes up to 100% during boot-up.


----------



## zealord

Quote:


> Originally Posted by *Wezzor*
> 
> I've got the same problem here. Sometimes, though very rarely, it goes up to 100% during boot-up.


Lol, same for me, but when it happens on boot-up it's just 3-4 seconds at 100% (really damn loud, and it happens maybe once a month).

If it's at 50% once Afterburner has launched in Windows, then it won't stop unless I manually change it in the Afterburner settings.


----------



## MojoW

Quote:


> Originally Posted by *Wezzor*
> 
> I've got the same problem here. Sometimes, though very rarely, it goes up to 100% during boot-up.


Quote:


> Originally Posted by *zealord*
> 
> Lol, same for me, but when it happens on boot-up it's just 3-4 seconds at 100% (really damn loud, and it happens maybe once a month).
> 
> If it's at 50% once Afterburner has launched in Windows, then it won't stop unless I manually change it in the Afterburner settings.


Do you guys have a start-up delay on your MSI Afterburner?
Mine always starts after CCC has started, and I never have that problem.


----------



## Wezzor

Quote:


> Originally Posted by *MojoW*
> 
> Do you guys have a start-up delay on your MSI Afterburner?
> Mine always starts after CCC has started, and I never have that problem.


MSI Afterburner starts before CCC for me, at least (by around 4-5 seconds).


----------



## zealord

Quote:


> Originally Posted by *MojoW*
> 
> Do you guys have a start-up delay on your MSI Afterburner?
> Mine always starts after CCC has started, and I never have that problem.


I might look into that. Thanks.


----------



## Gobigorgohome

Which models of the R9 290X get temps under 70C with their original air cooler? I know the MSI Lightning R9 290X does around 65-68C and the Sapphire R9 290X Vapor-X is supposed to do 60C; how does the Gigabyte Windforce 3X R9 290X perform? I'm not looking to overclock.


----------



## nX3NTY

Quote:


> Originally Posted by *Wezzor*
> 
> Nice to hear!


Although in-game performance is good, I noticed I got lower benchmark scores; my 3DMark11 X score dropped drastically, from around X46xx to X37xx.

Now I'm puzzled.


----------



## Batpimp

@cyberlocc

Here are some AB graphs.

I read on the AMD forums that downclocking worked for someone, so I set my CPU back to stock everything; NOTHING OC'd.

FX-8150 at 3.6GHz.

This is about 20 minutes of BF4 in CrossFire and in a single-GPU setup.

Catalyst 14.12.

Let me know what you think.


----------



## Cyber Locc

Quote:


> Originally Posted by *Batpimp*
> 
> @cyberlocc
> 
> Here are some AB graphs.
> 
> I read on the AMD forums that downclocking worked for someone, so I set my CPU back to stock everything; NOTHING OC'd.
> 
> FX-8150 at 3.6GHz.
> 
> This is about 20 minutes of BF4 in CrossFire and in a single-GPU setup.
> 
> Catalyst 14.12.
> 
> Let me know what you think.


Hmm, it doesn't look like it's bottlenecking. Did downclocking help?
Try lowering the BF4 settings and see if you still hit the same FPS ceiling.


----------



## Batpimp

It's at 1080p fullscreen, V-sync on, every setting at minimum.

I also tried running Evolve and got no FPS increase.

I also swapped the positions of the cards in the PCIe slots.

I ran FurMark and had no crashes.

No gains of any kind; in fact, BF4 lost at least 10 FPS.


----------



## Batpimp

FurMark

of the 2nd (new) card. I think I overwrote the 1st card's result, but they were almost identical.


----------



## rv8000

Quote:


> Originally Posted by *Gobigorgohome*
> 
> Which models of the R9 290X get temps under 70C with their original air cooler? I know the MSI Lightning R9 290X does around 65-68C and the Sapphire R9 290X Vapor-X is supposed to do 60C; how does the Gigabyte Windforce 3X R9 290X perform? I'm not looking to overclock.


Delta temperatures are more important than max load temperatures. A card could have a 65C load temp in a review, but their ambient could be 16C; if your room ambient is 21C or more, your temps will be above 70C depending on load. Your case ambient is also a factor: if airflow is poor and the case is small, you will have very high case ambients (well above room ambient) and load temps will suffer quickly. So the quality of the cooler is just one piece of the puzzle, and you're not necessarily going to get the same temps as reviewers or other users. Some of the best-cooled stock cards appear to be the Lightning, Vapor-X, PCS+, and Tri-X (not sure about the Windforce). I own a Lightning, a fantastic card, but above 70% fan speed the fans are annoying as all heck if the rest of your system is tuned to be very quiet.

Just for some more comparison info: I have an S340 with an NH-D15 sandwiched right next to the card, 2x140mm 1200 RPM intakes, a 120mm 1200 RPM exhaust, a ~21C room ambient, and the Lightning at stock clocks; temps can climb to 79C within 15 minutes on the stock fan profile. Note I wouldn't really call this an air-cooling-friendly case for a high-performance system.
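The delta-temperature point can be sketched in a few lines; the 65C/16C review figures and 21C room are the ones used above, and the function itself is purely illustrative:

```python
def expected_load_temp(review_load_c: float, review_ambient_c: float,
                       your_ambient_c: float) -> float:
    """Carry a reviewed load temp over to your room by keeping the delta fixed."""
    delta = review_load_c - review_ambient_c  # cooler's rough ambient-independent delta
    return your_ambient_c + delta

# 65C load in a 16C review room (a 49C delta) becomes ~70C in a 21C room,
# before any extra case ambient is added on top.
print(expected_load_temp(65, 16, 21))  # -> 70
```

Case ambient stacks on top of room ambient the same way, which is why a small, poorly ventilated case pushes load temps up so quickly.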


----------



## LandonAaron

Is it normal for two cards in crossfire to have identical gpu usage?


----------



## Batpimp

Quote:


> Originally Posted by *LandonAaron*
> 
> Is it normal for two cards in crossfire to have identical gpu usage?


pics?


----------



## kizwan

Quote:


> Originally Posted by *LandonAaron*
> 
> So I was just playing GTA V when the game crashed with a DirectX driver crash. I clicked OK and closed out of the game to the desktop, and a few moments later the computer BSOD'd on me with code 116. I looked this code up and it's a video-card hardware error. I had the cards OC'd to 1075/1450 when this happened; for all other games I OC at 1100/1500, but GTA for some reason seems overly sensitive, and at that setting I get black screens. I guess it's just an unstable GPU overclock, but it's strange, because I wasn't getting any artifacting or black screens, and I don't ever recall BSODing from a GPU OC.


I've gotten a BSOD from a GPU OC before, and mine also showed no artifacts or black screens beforehand. I did a few tests and they all pointed toward heat: more toward heat than the temperature readings, because core and VRM temps remained low but my case had bad airflow at the time of testing.

Basically, a crash isn't necessarily from unstable clocks. I got a BSOD 0x116 on 10 Jun 2014 and it also pointed toward a DX error; I was testing BF4 that day.
Quote:


> Originally Posted by *Batpimp*
> 
> The PSU, or the PC for that matter, is not shutting down. Nor are any games or programs.
> 
> Nothing is BSODing, black-screening, etc.
> 
> The calc said 745W; it assumes 90% load with the CPU OC'd to 4.4GHz and the two R9 290s. It recommended an 850W unit, so I got the G2. I was leaning toward the 1050, but so far that calc seems to be right on the money.
> 
> When I play BF4 in CrossFire, the GPU load % kicks up and down quickly.
> GPU-Z made a log; I'll attach it.
> 
> GPU-ZSensorLog.txt 207k .txt file
> 
> 
> I'm going to assume it's something simple I'm missing, maybe the BIOS version for this mobo, since I've never updated it.
> I haven't used the disc that came with the card; maybe I need to. I went to AMD and just made sure my drivers were up to date.
> I uninstalled the drivers and reinstalled, but nothing changed.
> 
> My Valley and Heaven scores don't seem to improve. Thanks for your help anyhow. Any other advice, no matter how stupid, please post it here.
> 
> Things I have done, off the top of my head:
> 
> 1. Enabled CrossFire.
> 2. Made sure that games are in fullscreen, as per AMD instructions.
> 3. Used GPU-Z to confirm the cards are in CrossFire. (Strangely, when the cards are idling, one of them dips down to PCIe 1.1 instead of 2.0; from further reading this seems normal, but is it?)
> 4. Uninstalled the old driver and reinstalled it. (I used Uninstall Programs from Windows, not a software tool.)


I still suspect your PSU is making your cards underperform.

BTW:
3. It's normal for the PCIe link to drop to v1.1 when idling.
Quote:


> Originally Posted by *gertruude*
> 
> i did some furmark testing o n my 290 and vrms went right up to 105C
> 
> is this normal?.....its scaring me lol


Don't ever use FurMark. If you're making a homemade cooler and want to test it out, FurMark might be suitable.
Quote:


> Originally Posted by *Wezzor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *nX3NTY*
> 
> Guys I get a very high idle temperature on single monitor, around 54-56C. Is this normal even for Tri-X cooler? At load it reaches maximum of around 80C in GTA V
> 
> 
> 
> Damn, that's a lot. I'm getting 35C idle and 65-70C during gaming. My card is even overclocked and I'm not using any custom fan configuration either.

Ambient in Malaysia is quite high; room temperature can go up to 36C. The hot season nowadays makes it feel a lot higher, though.
Quote:


> Originally Posted by *nX3NTY*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cephelix*
> 
> That is high. What's your ambient like? Mine, granted it's an MSI Twin Frozr, currently idles at 43C.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Wezzor*
> 
> Damn, that's a lot. I'm getting 35C idle and 65-70C during gaming. My card is even overclocked and I'm not using any custom fan configuration either.
> 
> 
> Ambient is around 35C, and sometimes I use an air conditioner to reduce room temperature. Something's not quite right here; even at 100% fan speed I'm still getting 53C, even with the case open! I just repasted with Arctic Silver. This is weird.

I'm pretty sure something is using the GPU. Even mine, when it was on the stock cooler, idled in the 40s Celsius. That said, it's not surprising your card idles that high, especially in a high ambient.
Quote:


> Originally Posted by *moorhen2*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gatygun*
> 
> 40C idle here, ~310 core / 1450 mem clock, 31% fan speed, Tri-X.
> 
> 
> 
> 310 core, hmmm.

Yeah, at idle the core clock can go down to 300MHz.
Quote:


> Originally Posted by *nX3NTY*
> 
> I partially fixed the problem: I flashed to the Vapor-X BIOS and now it idles at around 45-46C, because the memory clock now stays at 150MHz at idle instead of the 1300MHz it was at before.
> 
> EDIT: That did the trick. Load temperature now is around 74-75C. Stock everything, which is nice.


My primary GPU (in CrossFire) fluctuates from 300 to 1300MHz all the time.
Quote:


> Originally Posted by *Batpimp*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *cyberlocc*
> 
> Well, BF4 is an extremely CPU-heavy game, so that makes sense. Tell you what: do the graph thing again while playing BF4 and see what your CPU is doing; it's probably maxed out, lol. Don't get me wrong, from what I know AMD CPUs can be good for some uses, but gaming isn't one of them, lol, but I dunno.
> 
> Since we keep coming back to BF4, that tells me it's probably the CPU, as that game is very CPU-heavy and favors a highly clocked 6-8 core i7.
> 
> 
> 
> 
> 
> 
> 
> I'm not completely sure what the bottleneck is, if there even is one, but something is wrong. Aside from doing a complete reinstall of Win 8.1, I've run out of ideas; I tried everything you said, and what others have said. I would really like this to work, but what to do, what to do?

Can you run Heaven again with the Custom preset, Ultra quality, and Extreme tessellation, at 1000MHz on the core and 1300MHz on the memory?
Quote:


> Originally Posted by *LandonAaron*
> 
> Is it normal for two cards in crossfire to have identical gpu usage?


If you enabled unified GPU usage, then yeah, both graphs will look almost identical. I don't enable unified GPU usage because it's only cosmetic.


----------



## Batpimp

Quote:


> Can you run Heaven again with preset Custom, quality Ultra & Tessellation Extreme at 1000MHz on the core & 1300MHz on the memory?




These are stock CPU settings, no overclocking.

@ 4.4GHz, still CrossFire.



I think I'm starting to agree with this guy's analysis: R9 290s are NOT meant for 144 FPS at 1080p; they're meant for 1440p or 4K multi-monitor setups, which I don't give a crap about.

http://forums.guru3d.com/showthread.php?t=387269


----------



## kizwan

Quote:


> Originally Posted by *Batpimp*
> 
> Quote:
> 
> 
> 
> Can you run Heaven again with preset Custom, quality Ultra & Tessellation Extreme at 1000MHz on the core & 1300MHz on the memory?
> 
> 
> 
> 
> 
> These are stock CPU settings, no overclocking.

How about a single card? You just need to disable CrossFire in CCC.


----------



## Batpimp

Here's your 1-card test,
@kizwan


----------



## passey

Just swapped my 980 for two 290X 8GB cards in CrossFire. Loving not having to worry about the VRAM limit at 4K.

Both have NZXT Kraken G10s and Corsair H55s installed, so temps stay under 60C at load.


----------



## Mega Man

Quote:


> Originally Posted by *Batpimp*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cyberlocc*
> 
> Did you check your CPU usage in the graph? Let me see what I can turn up. Also try what I suggested when you get a chance run heaven at maxxed settings as your fps will drop quite a lot.
> 
> 
> 
> Alright, once I get home I'll run more tests.
> 
> I was running BF4 with the console command perfoverlay.drawgraph 1; the CPU was running at about 45% most of the time, and the GPUs about the same.
> 
> I'll try the Heaven tests and anything else tonight.

Wait, what did I miss?
Quote:


> Originally Posted by *Batpimp*
> 
> The PSU or PC for that matter is not shutting down. Nor are any games or programs or anything.
> 
> Nothing is BSOD, Blackscreening etc
> 
> The calc said 745w. It is assuming 90% load and OC of CPU to 4.4ghz and the two R9 290s it recommended a 850 so i got the G2. I was leaning towards the 1050 but so far that calc seems to be right on the money
> 
> When i play bf4 in crossfire the GPU load % kick up and down quick
> GPUZ made a log. ill attach it.
> 
> GPU-ZSensorLog.txt 207k .txt file
> 
> 
> I am going to assume it is something simple im missing maybe my BIOS version for this mobo..since ive never updated it.
> I have not used the disc that came with the card maybe i need to use it. i went to amd and just made sure my drivers were up to date.
> I uninstalled drivers re installed but nothing changed.
> 
> My valley and heaven scores dont seem to improve. Thanks for your help anyhow. Any other advice, no matter how stupid, please put it on here.
> 
> Things i have done off the top of my head.
> 
> 1. enabled crossfire
> 2. made sure that games are in fullscreen as per AMD instructions.
> 3. used GPUZ to confirm the cards are in crossfire. (strangely when the cards are idling one of them dips down to PCIe 1.1 instead of 2.0). In reading further this seemed normal but is it?
> 4. uninstalled old driver and re installed again. (i used uninstall programs from windows not a software program)


I would recommend RigBuilder; I need to know your full rig specs, PSU / number of cards.

Never, never, never trust online PSU calculators, but yes, underpowering your cards can cause negative performance without a freeze/restart/BSOD.

Sorry I have not kept up, but work and home life have been crazy, too much crap blowing up at the same time for me to worry about the forums.

What settings are you using graphically?


----------



## Batpimp

Quote:


> Originally Posted by *Mega Man*
> 
> wait what did i miss?/
> i would recommend rigbuilder, i need to know your full rig specs, psus/ number of cards
> 
> never never never trust online psu calculators but yes underpowering your cards can cause neg performance and not a freeze/restart/bsod
> 
> sorry i have not kept up but work and home life been crazy too much crap blowing up at the same time for me to worry about the forums
> 
> what settings are you using graphically ??


Just look back a few pages, you will see it all... time for me to sleep, it's 1 AM. Peace!


----------



## Mega Man

OK, well, when you want help from someone who has quadfire and AMD, let me know with the answers.


----------



## kizwan

Quote:


> Originally Posted by *Batpimp*
> 
> your 1 card test
> @kizwan


Is that with everything default in CCC? Your single card score is higher than it should be & your crossfire score is lower than it should be.

EDIT: I saw your edit. The crossfire score is also slightly higher than it should be. If everything is default in CCC, then your cards are running fine... at least in Heaven.


----------



## Gobigorgohome

Quote:


> Originally Posted by *rv8000*
> 
> Delta temperatures are more important than max load temperatures. A card could have a 65c load temp in a review, but their ambient temp could be 16c, and if for instance your ambient room temp is 21c or more your temps will be above 70c depending on loading. Your case ambient will also be a factor in the end, if airflow is poor and the capacity of the case is small you will have very high ambient case temps (well above room ambients) and load temps will begin to suffer very quickly. So the quality of the cooler on a card is just one piece of the puzzle, and you're not necessarily going to get the same temps as reviewers and or other users. Some of the best cooled stock cards appear to be the Lightning, Vapor-X, PCS+, and Tri-X (not sure about the windforce). I own a lightning, fantastic card, but above 70% the fans are annoying as all heck if the rest of your system is tuned to be very quiet.
> 
> Just for some more comparison info I have an S340 with an NH D15 sandwhiched right next to it 2x140mm 1200 intakes and a 120mm 1200 rpm exhaust with a ~21c room ambient and the lightning at stock clocks, temps can climb to 79c within 15mins with the stock fan profile. Note I wouldn't really call this an air cooling friendly case for a high performance system.


I actually have a 65C load temperature on my Lightning, so that is no "review" temperature... ambient is irrelevant to some extent (normal room temperature is around 20C as far as I know); I have between 18-20C. I have the LD Cooling PC-V8 with 16 fans at around 700-900 rpm, and the exhaust fan is sitting at around 1500 rpm. The case is pretty cool inside.

From tests on YouTube, Tek Syndicate had the Vapor-X at 60C at load (I guess it would have been around the same temperature for me), and the Lightning card I already have sits at around 65C at full load (I have never seen it above 65C in the few months that I have had it) bone stock with the auto fan profile. I actually thought of getting an Asus DCUII OC R9 290 (not X) to add some GPU power, they seem to be reasonably cool (even though the cooler is... bad).


----------



## rdr09

Quote:


> Originally Posted by *cyberlocc*
> 
> Hmm doesn't look like its bottlenecking did down clocking work?
> Try to lower the BF4 settings and see if you still hit the same fps roof.


It seems like it to me. Look at the max usage the CPU cores were getting; those are high considering the load per core.

@Batpimp, I'd say... an OC is in order, or get an 8300-series chip and OC it.


----------



## LandonAaron

Quote:


> Originally Posted by *kizwan*
> 
> If you enabled unified gpu usage, yeah both graphs will look almost identical. I don't enable unified gpu usage because it's only cosmetic.


Alright, thanks. Yeah, I had that enabled. I never knew exactly what that setting did; I just read in a crossfire setup guide to enable it. If that's all it does, I would rather see what each card is actually doing than a combined graph of the two.


----------



## Cyber Locc

Quote:


> Originally Posted by *kizwan*
> 
> is that with everything default in CCC? Your single card score higher than it should be & your crossfire score is lower than it should be.
> 
> EDIT: I saw your edit. Crossfire score also slightly higher than it should be. If everything is default in CCC, then your cards running fine....at least in Heaven.


They are this time.* When he ran Heaven a little while back it was on High with no AA or tess, and he got 98 fps single card and 130 crossfire; that isn't right. He is also getting only a 10 fps gain in BF4 with crossfire on.

OP, based off of that I'm going to go with CPU bottleneck: your Heaven maxed scores are where they should be and your High scores aren't. Try lowering the settings in BF4 and see if your fps gain goes up.

Megaman, his setup is 2 R9 290s in crossfire, a Sabertooth 990FX board and an FX-8150 @ 4.4GHz that appears to be bottlenecking; all signs lead that way anyway.

OP, if you got a Piledriver it may make a difference, as that's what all the quadfire people are running. However, again, all the quadfire people I have seen are running 7970s and 3-monitor setups (Megaman included); that's kind of a different beast than his 290s with 1 monitor and his 8150 Bulldozer.


----------



## mfknjadagr8

Quote:


> Originally Posted by *cyberlocc*
> 
> They are this time* when he ran heaven a little while back it was on high with no AA or Tess and he got a 98 single card and 130 crossfire that isn't right. He is also getting only 10 fps gain in bf4 with crossfire on.
> 
> Op based off of that I'm going to go with CPU bottleneck your heaven maxed scores are where the should be and your high's aren't. Try lowering the settings in bf4 and see if your fps gain goes up.
> 
> Megaman his setup is 2 r9 290s in crossfire a sabertooth 990fx board and a fx8150 @4.4ghz that appears to be bottlenecking all signs lead that way anyway.
> 
> Op if you got a piledriver it may make a difference as that's what all the quadfire people are running. However again all the quadfire people I have seen are running 7970s and 3 monitor setups (megaman inclucded) that's kinda a different beast than his 290s and 1 monitor and his 8150 bulldozer.


If I remember right, red was running two 295X2s at one point with a 9750... agent is running a 290 with an 8150 and it's keeping up with my Vishera until I hit over 4.6...


----------



## Cyber Locc

Quote:


> Originally Posted by *mfknjadagr8*
> 
> if I remember right red was running two 295x2s at one point with 9750...agent is running a 290 with an 8150 and it's keeping up with my vishera until I hit over 4.6....


IDK, like I said I'm not familiar with AMD CPUs personally; you are and so is Megaman, so your help here is definitely needed.

However, all signs point to a bottlenecking CPU, IMO. What else could cause his fps to hit a roof and not scale right at lower settings, but work fine maxed out?


----------



## Cyber Locc

Quote:


> Originally Posted by *Mega Man*
> 
> wait what did i miss?/
> i would recommend rigbuilder, i need to know your full rig specs, psus/ number of cards
> 
> never never never trust online psu calculators but yes underpowering your cards can cause neg performance and not a freeze/restart/bsod
> 
> sorry i have not kept up but work and home life been crazy too much crap blowing up at the same time for me to worry about the forums
> 
> what settings are you using graphically ??


I told him the same thing







His rig I posted in the post above; however, I left out his PSU, which is an EVGA G2 850W, so it is a very good power supply.

His High settings, no AA, no tess run in Heaven is getting low scores (98 single - 130 CF); his BF4 (he didn't say settings) is showing only a 10 fps gain with crossfire. His Heaven maxed scores (tess Extreme, AA x4, Ultra) show proper fps scaling, 66 single - 112 CF. To me that screams CPU bottleneck, and we have exhausted all options at this point, so any ideas you have would be great.

I don't think it's a power supply issue, as he is getting correct scaling in Heaven at max but not at High.
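To put rough numbers on that scaling (just dividing the crossfire fps by the single-card fps from the runs above, nothing fancier):

```python
def cf_scaling(single_fps, crossfire_fps):
    # 2.0 would be perfect two-card scaling; ~1.7-1.9x is typical when healthy.
    return crossfire_fps / single_fps

print(round(cf_scaling(98, 130), 2))  # high settings: ~1.33x, poor scaling
print(round(cf_scaling(66, 112), 2))  # maxed settings: ~1.7x, healthy scaling
```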


----------



## mfknjadagr8

Quote:


> Originally Posted by *cyberlocc*
> 
> IDK like I said I'm not familiar with amd cpus personally you are and so is megaman so your help here is defiantly needed.
> 
> However all signs point to a bottle necking CPU IMO. What else could cause his fps to hit a roof and not scale right at lower settings but work fine maxxed out?


Lower settings should make less work for all components involved... which is the strange thing... when I had my GTX 760 it performed better at 1080p than at 720p... which honestly doesn't make sense to me... 290s handle higher resolutions better, but I'm not sure how the scaling is lower... I would think if his CPU was the bottleneck he would be seeing more usage from it trying to keep up with the graphical demand... I am going to do some testing this afternoon with my CPU clocked to 4.0 and 4.4 and see how it affects GTA V since I picked it up last night... perhaps Megaman can chime in to better clarify if my thoughts are correct...
As it so happens, agent purchased a second 290, so very soon we will be comparing tests to see if in fact the 8150 is going to affect scaling and fps in CFX.


----------



## Cyber Locc

Quote:


> Originally Posted by *mfknjadagr8*
> 
> lower settings should make less work for all components involved...which is the strange thing....when I had my 760gtx it performed better at @1080p than @ 720....which honestly doesn't make sense to me...290s handle higher resolution better but not sure how the scaling is lower...I would think if his cpu was the bottleneck he would be seeing more usage from it trying to keep up with the graphical demand...I am going to do some testing this afternoon with my cpu clocked to 4.0 and 4.4 and see how it affects gtaV since I picked it up last night...perhaps megaman can chime in to better clarify if my thoughts are correct...


Lower settings make your cards work less hard, while your CPU works harder as your fps goes up.

Here's the easiest way to check for bottlenecks:

If lowering the graphics settings has no effect on frame rates, then the bottleneck is your CPU.
If lowering the graphics settings increases the frame rate, then your GPU is reaching its upper limits.
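That two-line check is easy to jot down as a throwaway script if anyone wants to compare their own runs (the fps numbers below are made up, just to show the logic):

```python
def diagnose_bottleneck(fps_low_settings, fps_high_settings, tolerance=0.05):
    """Rough CPU-vs-GPU bottleneck check from two benchmark runs of the
    same scene: one with graphics settings turned down, one maxed out."""
    # If lowering the settings barely moves the frame rate, the GPU had
    # headroom all along -> the CPU is the limiting factor.
    if fps_low_settings <= fps_high_settings * (1 + tolerance):
        return "CPU-bound"
    # If lowering the settings clearly raises fps, the GPU was the limit.
    return "GPU-bound"

print(diagnose_bottleneck(112, 110))  # barely moves -> CPU-bound
print(diagnose_bottleneck(170, 112))  # big jump -> GPU-bound
```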

In his case, albeit we did it backwards:

High settings FPS - 98S - 130CF
Maxed settings - 66S - 112CF

I bet that when he runs one setting level down we will see numbers like 120S - 140CF. His single card results are fine across the board; the issue comes with crossfire. So his 8150 will not bottleneck one 290/X, but it will bottleneck two at lower settings. That is a problem, since from what I am hearing the OP doesn't want the best graphics ever, he wants 144fps for his 144Hz monitor. Two 290s should easily be able to accommodate that @ 1080p, but they aren't in his setup.

We also have to consider that his BF4 fps only went up by 10 when enabling crossfire. BF4 is way CPU-heavy, so if his CPU can't keep up, then enabling crossfire will do barely anything, like raise his fps by 10 (which is what is happening).

Also, to clarify: if you were getting more performance at 1080p than at 720p, then your CPU is bottlenecking your system (at least at 720p it is).

Also, for Megaman to catch up: we had him re-install his cards physically and clean the fingers, and we also had him DDU in safe mode then re-install fresh Omega drivers. His temps in all tests have been under 80 and his cards are loading correctly.


----------



## Batpimp

Hey guys, I'm awake.

i would recommend rigbuilder, i need to know your full rig specs, psus/ number of cards

never never never trust online psu calculators but yes underpowering your cards can cause neg performance and not a freeze/restart/bsod

sorry i have not kept up but work and home life been crazy too much crap blowing up at the same time for me to worry about the forums

what settings are you using graphically ??

cpu setup

For @megaman

Asus Sabertooth 990FX Rev 1, BIOS 1102

AMD FX-8150 OC'ed to 4.4GHz. I also tried it at stock settings and 4.0GHz (no change in fps in games, only benchmarks)

G.Skill DDR3 2400MHz (running at 1866MHz)

EVGA 850W G2 (no issues: BSOD, blackscreen, crashing, overheating, nothing)

Samsung 850 EVO 500GB

R9 290 PowerColor PCS+ (two). I've tried 13.12, Omega drivers, beta 15 drivers. No changes in games, only benchmarks

Swiftech 240X liquid cooling setup

I ran DDU twice, with registry cleaner and CCleaner

I have flipped the card positions in the PCIe lanes (thus connecting my 144Hz monitor to the new card in the DVI slot).
ULPS disabled.
Power management on high performance.
Win 8.1 with all the newest drivers/patches installed as of Tuesday.

I only get gains in benchmarks; not in a single game have I gotten an FPS increase in crossfire.


----------



## Cyber Locc

Quote:


> Originally Posted by *Batpimp*
> 
> hey guys im awake.
> 
> i would recommend rigbuilder, i need to know your full rig specs, psus/ number of cards
> 
> never never never trust online psu calculators but yes underpowering your cards can cause neg performance and not a freeze/restart/bsod
> 
> sorry i have not kept up but work and home life been crazy too much crap blowing up at the same time for me to worry about the forums
> 
> what settings are you using graphically ??
> 
> cpu setup
> 
> For @megaman
> 
> Asus Sabretooth 990fx Rev 1 bios 1102
> 
> AMD FX8150 OC'ed to 4.4ghz I also tried it at stock settings and 4.0ghz (no change in fps in games only benchmarks)
> 
> Gskill DDR3 2400mhz (running at 1866mhz
> 
> EVGA 850w G2 (no issues bsod, blackscreen, crashing, overheating nothing)
> 
> Samsung 850 EVO 500gig
> 
> R9 290 Powercolor PCS+ (two) Ive tried 13.12, omega drivers, beta 15 drivers. No changes in games only benchmarks
> 
> Swiftech 240x Liquid Cooling Setup
> 
> I ran DDU twice. With registry cleaner, CCcleaner
> 
> I have flipped the card positions in PCI lanes (thus connecting my 144mhz monitor to the new card in the DVI slot.
> ULPS Disabled.
> Power management on high performance
> Win 8.1 with all the newest drivers/patches installed as of Tuesday.
> 
> I only get gains in benchmarks not in a single game have I gotten a FPS increase in Crossfire


That makes sense, because Heaven uses barely any CPU; it's just like watching a movie, nothing changes due to your actions as in a game. We will see what Megaman thinks, he knows more about AMD than me, however I'm with rdr09: it's a CPU bottleneck.


----------



## kizwan

I disagree. I don't think it's a CPU bottleneck. No gain in fps doesn't sound like a bottleneck to me. I bet it's the PSU.
Quote:


> Originally Posted by *Batpimp*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> hey guys im awake.
> 
> i would recommend rigbuilder, i need to know your full rig specs, psus/ number of cards
> 
> never never never trust online psu calculators but yes underpowering your cards can cause neg performance and not a freeze/restart/bsod
> 
> 
> sorry i have not kept up but work and home life been crazy too much crap blowing up at the same time for me to worry about the forums
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> what settings are you using graphically ??
> 
> cpu setup
> 
> For @megaman
> 
> Asus Sabretooth 990fx Rev 1 bios 1102
> 
> AMD FX8150 OC'ed to 4.4ghz I also tried it at stock settings and 4.0ghz (no change in fps in games only benchmarks)
> 
> Gskill DDR3 2400mhz (running at 1866mhz
> 
> EVGA 850w G2 (no issues bsod, blackscreen, crashing, overheating nothing)
> 
> Samsung 850 EVO 500gig
> 
> R9 290 Powercolor PCS+ (two) Ive tried 13.12, omega drivers, beta 15 drivers. No changes in games only benchmarks
> 
> Swiftech 240x Liquid Cooling Setup
> 
> I ran DDU twice. With registry cleaner, CCcleaner
> 
> I have flipped the card positions in PCI lanes (thus connecting my 144mhz monitor to the new card in the DVI slot.
> ULPS Disabled.
> Power management on high performance
> Win 8.1 with all the newest drivers/patches installed as of Tuesday.
> 
> I only get gains in benchmarks not in a single game have I gotten a FPS increase in Crossfire


Well, everyone has a crazy life too. It doesn't take long to fill out RigBuilder. Oh well, good luck!


----------



## Cyber Locc

Quote:


> Originally Posted by *kizwan*
> 
> I disagree. I don't think it's CPU bottleneck. No gain on fps doesn't sound like bottleneck. I bet it's PSU.
> Well everyone also have crazy life too. It doesn't take long time to fill out rigbuilder. Oh well, good luck!


The key is no gain in fps in games, of which the only game he has mentioned is BF4. Heaven sees a gain, and more so at higher settings, which rules out the PSU. It's like you said yourself: in his maxed settings Heaven run his fps was on point, however when he lowers the settings is when the issue presents itself; that's not a PSU issue.

And why do you say no gain in FPS doesn't sound like a CPU bottleneck? That's the definition of a CPU bottleneck, lol. More GPU horsepower with no fps gains = CPU bottleneck.

He didn't say he had a crazy life, Megaman said that; he was quoting Megaman but the quote didn't work correctly. He has some rigs in RigBuilder but said that he can't get it to work properly (I think update), which I can understand, as I have had issues with it myself in the past.

Found this in my searching: an 8150 bottlenecks three 6970s; if it bottlenecks those, it damn sure is bottlenecking 2x 290s. http://www.tweaktown.com/articles/4353/amd_fx_8150_vs_intel_i7_2600k_crossfirex_hd_6970_x3_head_to_head/index14.html


----------



## Batpimp

I don't think it's a CPU bottleneck.

When I run the BF4 FPS performance graph or AB, the CPU never goes past 65%-ish usage and the GPUs are at 45% peak each.
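If you'd rather have numbers than eyeball the AB graph, here's a quick sketch that averages a sensor log; the column name below is an assumption, so check the header row of whatever log your tool actually writes:

```python
import csv

def load_stats(path, column="GPU Load [%]"):
    """Return (average, peak) for one column of a CSV-style sensor log."""
    with open(path, newline="") as f:
        reader = csv.DictReader(f, skipinitialspace=True)
        samples = [float(row[column]) for row in reader
                   if (row.get(column) or "").strip()]
    return sum(samples) / len(samples), max(samples)

# avg, peak = load_stats("GPU-ZSensorLog.txt")
```

A sustained average well below 90% on both cards while fps is capped is the same CPU-limit symptom, just without the guesswork.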

I also tried Evolve, and with a single card I was getting 90-ish FPS with low settings during an entire match, and at worst about 60fps on medium settings with a single card, usually around 75-80.

Ran benchmarks again for my own entertainment and the cards just don't push 1080p FPS high. In fact, in BF4 I sometimes lost FPS, and in Evolve I never went past single-card performance. Don't get me wrong, for single-card performance it's great, but what I want is 144fps. Gonna move to Intel and Nvidia ASAP, probably the Nvidia first.

I'll see how Win 10 / DX12 affects my 8150's performance. If it handles the same, I'll wait until later this year to drop some cash on an i7 or something. For now I have the cash for two 970s or a 980.

I am convinced of what this person says. Everyone who is getting 144fps in BF4 and other games is running 295s or GTX 970s/980s (or pairs):
http://forums.guru3d.com/showthread.php?t=387269

Paul's Hardware replied to me today too.

Batpimpn ‏@Batpimpn, 10 hours ago:
@paulhardware i wanna run bf4 and similar fps games 144fps. Assuming i have a great cpu what video cards should ibuy?

"Paul's Hardware ‏@paulhardware, 2 hours ago:
@Batpimpn For that you probably want a GTX 980, GTX 970, or two of either of those, or a 290X, or an R9 295X2"

I bought the wrong cards for the wrong purpose. The purpose of R9 290s is to handle 1440p and 4K multi-monitor setups.

What I needed was the 290X or 295X2. Since I am not partial to fanboyism, I'm going to sell both these cards and just get a 980, or maybe two 970s in SLI.

TL;DR
1080p performance doesn't increase, but 1440p remains at 90-ish fps or more in BF4 with a single card. So I lose no FPS by using VSR and moving up to 1440p.

970, 980, 295X2 or 290X are for 1080p 144fps.


----------



## Mega Man

Quote:


> Originally Posted by *cyberlocc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> is that with everything default in CCC? Your single card score higher than it should be & your crossfire score is lower than it should be.
> 
> EDIT: I saw your edit. Crossfire score also slightly higher than it should be. If everything is default in CCC, then your cards running fine....at least in Heaven.
> 
> 
> 
> They are this time* when he ran heaven a little while back it was on high with no AA or Tess and he got a 98 single card and 130 crossfire that isn't right. He is also getting only 10 fps gain in bf4 with crossfire on.
> 
> Op based off of that I'm going to go with CPU bottleneck your heaven maxed scores are where the should be and your high's aren't. Try lowering the settings in bf4 and see if your fps gain goes up.
> 
> Megaman his setup is 2 r9 290s in crossfire a sabertooth 990fx board and a fx8150 @4.4ghz that appears to be bottlenecking all signs lead that way anyway.
> 
> Op if you got a piledriver it may make a difference as that's what all the quadfire people are running. However again all the quadfire people I have seen are running 7970s and 3 monitor setups (megaman inclucded) that's kinda a different beast than his 290s and 1 monitor and his 8150 bulldozer.
Click to expand...

First, 850W is low. It is a great PSU, but the lower-power 295X2 pulls 500W at stock, and with a high-end FX, 300+W is normal.

2x 290X and that CPU = 1kW PSU minimum.
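Back-of-the-envelope version of that math (every wattage here is a ball-park assumption, not a measurement; plug in your own numbers):

```python
# Rough PSU budget for 2x 290 + an overclocked FX chip.
components = {
    "2x R9 290 (assume ~275W peak each)": 2 * 275,
    "FX-8150 overclocked (assume ~300W)": 300,
    "board / RAM / SSD / fans / pump":    100,
}
draw = sum(components.values())
headroom = 1.1  # keep sustained load comfortably under the PSU rating
print(draw, "W estimated draw ->", round(draw * headroom), "W PSU recommended")
```

Which lands right around the 1kW minimum above.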
Quote:


> Originally Posted by *Batpimp*
> 
> hey guys im awake.
> 
> i would recommend rigbuilder, i need to know your full rig specs, psus/ number of cards
> 
> never never never trust online psu calculators but yes underpowering your cards can cause neg performance and not a freeze/restart/bsod
> 
> sorry i have not kept up but work and home life been crazy too much crap blowing up at the same time for me to worry about the forums
> 
> what settings are you using graphically ??
> 
> cpu setup
> 
> For @megaman
> 
> Asus Sabretooth 990fx Rev 1 bios 1102
> 
> AMD FX8150 OC'ed to 4.4ghz I also tried it at stock settings and 4.0ghz (no change in fps in games only benchmarks)
> 
> Gskill DDR3 2400mhz (running at 1866mhz
> 
> EVGA 850w G2 (no issues bsod, blackscreen, crashing, overheating nothing)
> 
> Samsung 850 EVO 500gig
> 
> R9 290 Powercolor PCS+ (two) Ive tried 13.12, omega drivers, beta 15 drivers. No changes in games only benchmarks
> 
> Swiftech 240x Liquid Cooling Setup
> 
> I ran DDU twice. With registry cleaner, CCcleaner
> 
> I have flipped the card positions in PCI lanes (thus connecting my 144mhz monitor to the new card in the DVI slot.
> ULPS Disabled.
> Power management on high performance
> Win 8.1 with all the newest drivers/patches installed as of Tuesday.
> 
> I only get gains in benchmarks not in a single game have I gotten a FPS increase in Crossfire


BF4 is a very poor game to judge GPU performance. BF4 uses Frostbite, IIRC; either way, the game loves fast RAM, the faster the better. 1866 is not enough; you want 2000+, preferably 2400. I would recommend using another game.

My setup is easy to change to a 290 and 1080p. ATM, though, I would either have to use the CVF-Z or the UD7 on air.

I would highly recommend, rather than changing to Intel, trying an 83xx. I know several who have switched and say it is really not worth it. I own both and I agree (3930K and 4790K, with several 8350s).


----------



## kizwan

Quote:


> Originally Posted by *Batpimp*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> i dont think its a CPU bottleneck
> 
> When i run BF4 FPS Performance graph or AB..the cpu is never going past 65% ish in usage and the GPUS are at 45% peak each
> 
> I also tried evolve and with a single card i was getting 90ish FPS with low settings during an entire match and at worst about 60fps on medium settings with a single card. usualy arround 75-80
> 
> Ran benchmarks agani for my own entertainment and the cards just dont push 1080p FPS high. In fact for bf4 i lost FPS sometimes and for evolve i never went past single card performance. Dont get me wrong for single card performance its great but what i want is 144fps. Gonna move to intel and Nvidia asap. probably the Nvidia first.
> 
> Ill see how win 10 dx12 affects my 8150 performance. If it handles the same ill wait until later this year to drop some cash on a i7 or something. For now i have the cash for two 970s or a 980.
> 
> I am convinced of what this person says. Everyone whom is getting 144fps in bf4 and other games are running 295's or Gtx 970 980 (or pairs)
> http://forums.guru3d.com/showthread.php?t=387269
> 
> paulshardware replied to me today too.
> 
> Batpimpn ‏@Batpimpn 10h10 hours ago
> @paulhardware i wanna run bf4 and similar fps games 144fps. Assuming i have a great cpu what video cards should ibuy?
> 
> "Paul's Hardware ‏@paulhardware 2h2 hours ago
> @Batpimpn For that you probably want a GTX 980, GTX 970, or two of either of those, or a 290X, or an R9 295X2"
> 
> 
> i bought the wrong cards for the wrong purpose. The purpose of r9 290s is to handle 1440p and 4k multimonitor setups.
> 
> What i needed was the 290x or 295x2. Since i am not partial to fanboyism im going to sell both these cards and just get a 980 or maybe two 970 sli.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> TL;DR
> 1080p performance doesnt increase but 1440p remains at 90ish fps or more in bf4 with a single card
> . So i lose no FPS by using VSR and moving up to 1440p
> 
> 970,980 or 295x2 or 290x are for 1080p 144fps


If you can't get 290s to work, I honestly don't think you can get a 290X or 295X2 working either. Yeah, you probably have better luck with a 980 or 970.


----------



## Cyber Locc

Quote:


> Originally Posted by *Mega Man*
> 
> First 850w is low. It is a great psu. But the lower power 295 x 2 pulls at stock 500w. With high end fx 300+w is normal.
> 
> 2x290x and that cpu =1kw psu min
> BF4 is a very poor game to judge gpu perf. BF4 uses frostbite iirc. Either way the game loves fast ram. Faster the better 1866 is not enough. You want 2000+ perf 2400 I would recomend using another game.
> 
> My set up is easy to change to a 290 and 1080p. Atm though I would either have to use cvfz or ud7 on air
> 
> I would highly recommend rather then changing to Intel try a 83xx. I know several who have and say it is really not worth it. I own both and I agree. (3930k and 4970k with several 8350s. )


I agree; I mentioned the power supply being too small back in his very first post, however that doesn't seem to be the issue here.

He isn't basing this just off of BF4; his Heaven scores are also not right (and he said other games are showing the same, he just never said which games those were).

I'm sure your setup is easy to change, and I'm not insulting it or anything; I'm simply stating that, from all evidence, an 8150 is going to bottleneck 290 crossfire. I do not think an 8350 would have that same issue, from what I have read.

But yes, Batpimp, I agree with Megaman: get an 8350.


----------



## Batpimp

Okay, so get a 1kW and an 8350?

I can try going to my nearby Fry's Electronics or Best Buy or something, get a 1kW PSU and see if it makes any difference.

If that fails, I'll try to get an 8350. If that fails, I'll try both.

I do not see how it can be a PSU issue, since I'm getting no symptoms. What makes you believe it's the PSU? The CPU I could more easily believe, but then again, the CPU usage in any game I play is not showing me any CPU bottleneck indications.


----------



## Cyber Locc

Quote:


> Originally Posted by *Batpimp*
> 
> okay so get a 1kw and an 8350?
> 
> i can try going to my nearby frys electronics or bestbuy or something and get 1kw PSU and see if makes any difference.
> 
> if that fails ill try to get a 8350. if that fails ill try both.
> 
> I do not see how it can be a PSU issues since im getting no symptoms. What makes you believe its PSU? The CPU I could more believe but then again the CPU usage in any game i play is not showing me any CPU bottleneck indications.


I personally don't think that your PSU is causing this particular issue. That said, I would recommend at least 1000W as well. However, to fix this issue, I think an 8350, or really the best CPU you can get with your budget that fits your board, is the answer.

So really, IMHO, I would just upgrade both. However, your approach does make sense: buy a power supply (you picked a great power supply, just the wrong wattage; grab an EVGA G2 1000W). If that doesn't fix your issue, keep the power supply and buy a better CPU.

Now, if you want to do it in reverse, that would be fine as well. I don't think your PSU is causing this issue, but I do think you need a bigger one. Will your current one work? Yes, but it will degrade quickly and stop working sooner, so I would definitely look into upgrading it soonish.


----------



## Batpimp

Yup, if only I had an 8350, then I would know.

Okay, the Fry's Electronics near my house has the 8350 for sale. I'll buy it, try it, and if it works then we'll know that much at least. I really hope that's all it is.


----------



## Cyber Locc

Quote:


> Originally Posted by *Batpimp*
> 
> i dont think its a CPU bottleneck
> 
> When i run BF4 FPS Performance graph or AB..the cpu is never going past 65% ish in usage and the GPUS are at 45% peak each
> 
> I also tried evolve and with a single card i was getting 90ish FPS with low settings during an entire match and at worst about 60fps on medium settings with a single card. usualy arround 75-80
> 
> Ran benchmarks agani for my own entertainment and the cards just dont push 1080p FPS high. In fact for bf4 i lost FPS sometimes and for evolve i never went past single card performance. Dont get me wrong for single card performance its great but what i want is 144fps. Gonna move to intel and Nvidia asap. probably the Nvidia first.
> 
> Ill see how win 10 dx12 affects my 8150 performance. If it handles the same ill wait until later this year to drop some cash on a i7 or something. For now i have the cash for two 970s or a 980.
> 
> I am convinced of what this person says. Everyone whom is getting 144fps in bf4 and other games are running 295's or Gtx 970 980 (or pairs)
> http://forums.guru3d.com/showthread.php?t=387269
> 
> paulshardware replied to me today too.
> 
> Batpimpn ‏@Batpimpn 10h10 hours ago
> @paulhardware i wanna run bf4 and similar fps games 144fps. Assuming i have a great cpu what video cards should ibuy?
> 
> "Paul's Hardware ‏@paulhardware 2h2 hours ago
> @Batpimpn For that you probably want a GTX 980, GTX 970, or two of either of those, or a 290X, or an R9 295X2"
> 
> i bought the wrong cards for the wrong purpose. The purpose of r9 290s is to handle 1440p and 4k multimonitor setups.
> 
> What i needed was the 290x or 295x2. Since i am not partial to fanboyism im going to sell both these cards and just get a 980 or maybe two 970 sli.
> 
> TL;DR
> 1080p performance doesnt increase but 1440p remains at 90ish fps or more in bf4 with a single card
> . So i lose no FPS by using VSR and moving up to 1440p
> 
> 970,980 or 295x2 or 290x are for 1080p 144fps


Quote:


> Originally Posted by *kizwan*
> 
> If you can't get 290s to work, I honestly don't think you can get 290x or 295x2 working too. Yeah, you probably have better luck with 980 or 970.


I missed this post before, so let me clear this up.

A 290 and a 290X are virtually the same card; the difference is that one has more shaders, and that's about it. We're talking a 3% performance difference at most, clock for clock. 290s and 290Xs are not for different resolutions; they have the same everything. Chips whose shaders didn't pass testing had some cut off and became 290s, and they use the same drivers and everything as 290Xs. As for the 295X2, that is two R9 290X GPUs on one PCB, the same thing just slightly detuned to conserve power and fit on one board.

You have just clarified the issue even more: the FPS staying the same at 1440p is because your CPU is limiting your frames, not your cards, period; anyone who says different is wrong. I have the same cards and I get 170+ FPS in Heaven to your 130, and well over 120 FPS in BF4 as well. The difference (other than that I have faster RAM, which Megaman says makes a small impact) is that I have an i7 4820K.

Switching to 970s or 980s isn't going to do anything, because your FPS is being limited by your CPU. I have tried to show you this casually and nicely, in a way that wouldn't offend anyone who has that CPU. However, at this point it's time to call a spade a spade: that CPU isn't worth the silicon it's cut on. Look at the TweakTown review I linked you, which ran that CPU with the same cards as an i7: the 8150 is getting 40-50+ FPS less than the i7 system with everything the same except the CPUs and boards. With that said, maybe it's not a bottleneck per se (although that's what I would call a BN), but more that the CPU just can't perform in a gaming environment. I'm all for anyone trying to save a little cash, but 40-50 FPS is no small difference; Bulldozer is junk, for gaming at least.

Now from what I have seen, Piledriver seems to be good; Megaman has spoken its praises and I trust his opinion, along with the other reviews I have read. It's very plain to see that CPU is the issue; google problems with it and you will see many people say the same. I'm sorry to be the bearer of bad news and to be so brash about it, but continuing to dance politely will have you updating your whole rig to fix an issue caused by a crap CPU, and I don't want to see that happen.

Grab an 8350 and see if it works, and when it does I'll be here waiting to hear the great news. Good luck.









----------



## Mega Man

1. I didn't say it is your PSU; I said I would have recommended a 1kW, and that it could affect your score.

2. I would recommend trying an 83xx first, before switching to Intel. If you do this, be sure to take advantage of the 2400 RAM you have.

You are free to do what you want


----------



## Batpimp

I ain't doubting you guys; I just want to know how you reached your conclusion. We have exhausted all options, and like I said above, I'm inclined to suspect the CPU before the PSU. I hope you don't feel I was defending the 8150. I stated a few pages ago that I know it's a 2011-2012-ish CPU, so I have no doubt it's not the best; it was never even that good when it came to market. For me it was free, so not a big deal.

The Fry's by my house, as I updated my last post to say, has an 8350 in stock. I'll make the trip, install it, and see what happens. As always, thanks for your help.

1. Mega, I wanted to know why you thought so, since there were no symptoms that I saw. If you are just saying that in general a 1kW PSU would be better to have, because that would rule out the PSU as a possibility, I understand that reasoning.

2. I will still probably end up switching to Intel unless DX12 and Win 10 give the 8150/8350 a big bump in life again. I don't think I want to wait for 2016 for the Zen architecture to MAYBE hit the market on false hopes, when I can get something now that I know will work.

The reasonable thing, though, is to do what you said: get an 8350 and test it. If it's good, I'll return it to Fry's and just find it cheaper; no way I'm paying full price for that thing haha. Yeah, way before I do a system swap to Intel I will first run these more reasonable tests to see what it is. Hopefully that will help others in the future.

3. My RAM is rated at 2400MHz, but my mobo is designed for 1866 stock. Do you have somewhere I can look to see how I can OC it? My mobo, if you forgot, is the Sabrekitty 990FX rev 1.0, BIOS 1102.


----------



## Cyber Locc

Quote:


> Originally Posted by *Batpimp*
> 
> yup if only i had a 8350 then i would know.
> 
> okay the frys electronics near my house has the 8350 for sale. ill buy it try it and if it works then we'll know that much at least. I really hope thats all it is.


Nice. Report back your findings; I want to know as well.









Quote:


> Originally Posted by *Mega Man*
> 
> 1 I didn't say it is your psu I said I would of recommended a 1kw. And that it could effect your score.
> 
> 2 I would recomend trying a 83xx first. Before switching to Intel if you do this be sure to take advantage of the 2400 ram you have
> 
> You are free to do what you want


I agree with Megaman 100%, and I also think there is no reason to go Intel given the cost it would take to replace a CPU and board. If Piledriver (or any CPU that fits your board) can come even close, then that is your best bet for sure. The same goes for switching to 970s: that is a sidegrade, they will perform about the same, and that's a waste of money.

Also, I didn't know that made a difference for BF4. How much of a difference does it make, Mega? That is cool to know; now I'm glad that I run 2400MHz RAM. I never really thought there was a point aside from the coolness factor.


----------



## Cyber Locc

Quote:


> Originally Posted by *Batpimp*
> 
> I aint doubting you guys just want to know how you reached your conclusion. we have exhausted all options and like i said above im inclined it is the CPU before the PSU. I hope you don't feel i was defending the 8150. i stated a few pages ago i know its a 2011-2012 ish cpu so i have no doubt its not the best it was never even that good when it came to market. For me it was free so not a big deal.
> 
> The frys by my house as i updated my last post says they have a 8350 in stock. ill make the trip and install it and see what happens. As always thanks for your help.
> 
> 1. mega i wanted to know why you thought so since there was no symptoms that i saw. If you are just saying in general you believe a 1k PSU would be better to have cause that would rule out the PSU as a possibility i understand that reasoning.
> 
> 2. i will still probably end up switching to intel unless dx12 and win 10 just give the 8150/8350 a big bump in life again. I dont think i want to wait for 2016 for the zen architecture to MAYBE hit the market on false hopes when i can get something now thati know will work.
> 
> the reasonable thing though is todo what you said. Get a 8350 test it. if its good ill return it to frys and just find it cheaper. no way im paying full price for that thing haha. Yeah way before i do a system swap to Intel i will first run these more reasonable tests to see what it is. hopefully that will help others in the future.
> 
> 3. my ram is rated at 2400mhz. But my mobo is designed for 1866 stock. Do you have somewhere i can look to see how i can OC it? my mobo if you forget is Sabrekitty 990fx rev 1.0 bios 1102


I didn't think you were doubting me or anything; I was just trying to save you from spending money unnecessarily. What helped me come to the conclusion was the results of your tests: the fact that raising the settings and raising the resolution improved scaling points to a CPU bottleneck.

Also, like I mentioned before, this helped me make a firm conclusion.

A quote about the 8150 from TweakTown:
"If some of the numbers looked like 300 FPS on the 2600k and 200 FPS on the FX-8150, we'd say "Sure, there's a big difference, but the numbers are so high it's irrelevant really".

The problem is, when we see a wall in Mafia II at 73 FPS verses 123 FPS on the 2600k, even 87 FPS under Metro 2033 verses 108 FPS - as we see a push for these 1920 x 1080 120Hz monitors, with V-Sync on, the game play is just so smooth at 120 FPS. Any gamer that uses a 120Hz monitor will tell you that there's a clear difference over 60Hz ones and that's not good for AMD."

http://www.tweaktown.com/articles/4353/amd_fx_8150_vs_intel_i7_2600k_crossfirex_hd_6970_x3_head_to_head/index14.html

Edit: I would like to point out that the above FPS numbers are for the higher resolution; in the 1920x1200 tests you see the Intel numbers rise while the AMD numbers stay the same. Mafia II sees 73 on both, where the Intel hits 123 and 119. That is a CPU bottleneck, and that's with old cards that were released around the same time as the CPU.

That is a 30-40% performance difference between the 8150 and the 2600K. That review is also from the chip's release; those games are old now, and so are the 6970s they used (trifire). The CPU bottlenecked those cards something bad, and the games you're playing are more demanding and the cards you're using are stronger, which just makes this situation even more apparent. Like I said, I do prefer Intel, but I don't dislike AMD, and I'm not being a fanboy or trying to rag on them in any way. That said, your CPU should never have been released, IMO; that is a major difference in performance, and new, the chips differed in price by 35 dollars. I am seriously sickened by this, TBH, and feel very bad for you and the situation you are in; this is absurd.

TBH, no offense to AMD, as I love my 290s, but if there was ever a chance of me buying an AMD CPU, it just went out the window with what I have found out helping you.
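To put that scaling logic in code, here is a toy model with made-up numbers (not measurements from either rig): the delivered frame rate is capped by whichever of the CPU or the GPUs is slower, which is why adding a card or raising the resolution changes nothing once the CPU is the limit.

```python
# Toy bottleneck model: frame rate = min(CPU cap, GPU cap).
# All numbers are illustrative, not benchmarks.

def fps(cpu_cap_fps, gpu_cap_fps_1080p, res_scale, num_gpus=1):
    """CPU cap is resolution-independent; GPU throughput divides by
    the pixel-count scale factor and (ideally) multiplies by the
    number of GPUs."""
    gpu_cap = gpu_cap_fps_1080p * num_gpus / res_scale
    return min(cpu_cap_fps, gpu_cap)

# A CPU that can only prepare ~130 frames/s:
one_card_1080p  = fps(130, 120, 1.0, num_gpus=1)   # GPU-bound: 120
two_cards_1080p = fps(130, 120, 1.0, num_gpus=2)   # CPU-bound: 130
two_cards_1440p = fps(130, 120, 1.78, num_gpus=2)  # still capped at 130
```

With these toy numbers, the second card adds almost nothing at 1080p, and stepping up to 1440p via VSR looks "free": exactly the symptom described above.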


----------



## Batpimp

Quote:


> Originally Posted by *cyberlocc*
> 
> I didn't think you were doubting me or nothing was just tying to save you from spending money unnecessarily. What helped me come to the conclusion was the results of your tests the fact that raising the settings and raising the RES increased scaling presents a CPU bottleneck.
> 
> Also like I mentioned before this also helped me make a firm conclusion.
> 
> Quote about the 8150 from tweaktown
> "If some of the numbers looked like 300 FPS on the 2600k and 200 FPS on the FX-8150, we'd say "Sure, there's a big difference, but the numbers are so high it's irrelevant really".
> 
> The problem is, when we see a wall in Mafia II at 73 FPS verses 123 FPS on the 2600k, even 87 FPS under Metro 2033 verses 108 FPS - as we see a push for these 1920 x 1080 120Hz monitors, with V-Sync on, the game play is just so smooth at 120 FPS. Any gamer that uses a 120Hz monitor will tell you that there's a clear difference over 60Hz ones and that's not good for AMD."
> 
> http://www.tweaktown.com/articles/4353/amd_fx_8150_vs_intel_i7_2600k_crossfirex_hd_6970_x3_head_to_head/index14.html
> 
> That is a 30-40% performance diffrence between the 8150 and the 2600k. That review is also from release of the chip those games are old now and so are the 6970s they used (trifire). The cpu bottle-necked those cards something bad and the games your playing are more demanding and the cards your using are stronger which just leads to this situation being even more apparent. Like I said I do pefer intel but I don't dislike AMD and I'm not being a fanboy or trying to rag on them in anyway. That said your CPU should have never even been released IMO that is a major difference in performance and the chips new had a difference in price of 35 dollars. I am seriously sickened by this tbh and fell very bad for you and the situation you are in this is absurd.
> 
> TBH no offense to AMD as I love my 290s but if there was ever a chance of me buying an AMD CPU it just went out the window from what I have found out helping you.


Haha, don't feel too bad. My friend bought this system new a few years ago for like 2500-3k?

He ran into money problems and I got his whole system like 2 months later for 1200 bucks. Obviously I have recently upgraded, but almost every time I upgrade, my buddy Nick buys the old parts from me, so I haven't even spent 600 bucks.

If I end up getting the 8350, he will probably buy my 8150, since he's running a 6300-series AMD.

I would go to Fry's today, but the drive alone is like 40 minutes AND on a Friday the traffic would be unreal. I will just chillax tonight, get up early, and hopefully try it tomorrow.

Thanks for all your help. Now let's see what this 8350 can do.


----------



## Mega Man

Well, you can see here


Spoiler: Warning: Spoiler!







that the R2.0 had the same RAM support,

yet I still ran 2400 without issue.

If you don't have the divider, you can OC the FSB; I can help with that.

As for not owning AMD CPUs,

well, I much prefer my AMDs to any Intel, and again, I OWN a 3930K, a 4790K, and 8350s.

The AMDs respond much faster and smoother, especially in day-to-day use.


----------



## Batpimp

Quote:


> Originally Posted by *Mega Man*
> 
> well you can see here
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> that the r2.0 had the same ram support
> 
> but yet i ran without issue 2400
> 
> if you dont have the divider you can oc fsb i can help with that
> 
> as to not owning amd cpus,
> 
> well i much prefer my amds to any intel, and again i OWN 3930k, 4970k and 8350s
> 
> amds respond much faster and smother esp in day to day


Hey, I'm not sure what you mean by the divider; I don't know a thing about RAM.
Here's my RAM:

http://www.newegg.com/Product/Product.aspx?Item=N82E16820231589&nm_mc=AFC-C8Junction&cm_mmc=AFC-C8Junction-_-na-_-na-_-na&cm_sp=&AID=10446076&PID=3938566&SID=

Is there anything special I need to do to make it work, or is it simply picking the right option in the BIOS?


----------



## Mega Man

Right option in the BIOS.


----------



## gatygun

Quote:


> Originally Posted by *Batpimp*
> 
> haha dont feel too bad. My friend bought this system new few years ago for like 2500-3k?
> 
> He ran into money problems and i got his whole system like 2 months later for 1200 bucks
> Obviously i have recently upgraded but almost every time i upgrade my buddy, nick, buys the oldpats from me. so i havent even spend 600 bucks.
> 
> If i end up getting the 8350 he will probably buy my 8150 since hes running a 6300series amd.
> 
> I would go today to frys but the drive alone is like 40 minutes AND on a friday the traffic would be unreal. I will just chillax tonight and get upearly and try it hopefully.
> 
> thanks for all your help. now lets see what this 8350 can do.


I've read most of your posts, so I'll add my take on your setup.

*Keep in mind that you can't judge performance from these numbers with perfect accuracy: most Titan/980 results are from extremely OC'd cards, while the 290 results are mostly standard stock versions. Keep that in perspective, but it showcases perfectly well why you've got the issues you have now.*

*3DMark Firestrike (highest scores):*

I chose Firestrike, and not Extreme or any other more taxing preset, because those won't give you numbers that translate to a lower-resolution (aka 1080p) experience on higher-performance hardware.

AMD FX-8150:

1x 290 = 8,455 points

2x 290 Crossfire = 11,341 points

3x 290 Crossfire = 9,806 points

1x Titan X = 10,940 points

1x 980 = 10,315 points

2x 980 SLI = 11,689 points

AMD FX-8350:

1x 290 = unknown

2x 290 Crossfire = 13,520 points

3x 290 Crossfire = 14,077 points

1x Titan X = 12,049 points

2x Titan X = 14,665 points

3x Titan X = 14,534 points

1x 980 = unknown

2x 980 SLI = 14,323 points

Intel i7-5960X (Intel's top model):

1x 290 = 10,050 points

2x 290 Crossfire = 20,926 points

3x 290 Crossfire = 23,226 points

4x 290 Crossfire = 29,281 points

1x Titan X = 24,222 points (extreme OC)

2x Titan X = 35,106 points (extreme OC)

3x Titan X = unknown

4x Titan X = 43,380 points

1x 980 = unknown

2x 980 SLI = 31,589 points

3x 980 SLI = 35,764 points

4x 980 SLI = 41,098 points

*Conclusion:*

The *FX-8150* caps out around *11,400 points*.

(A normal 290 will mostly get you around 8,900 points; one OC'd to 1145 core and 1450 mem will give you 10,108 points, to give you an idea.)

The *FX-8350* caps out around *14,500 points*.

_18,000 points is what two stock 290s should score, so yes, you will still get bottlenecked even at stock, let alone with OC'd versions, which will move up toward 20k._

The *Intel i7-5960X* caps out around *43,000 points*.

_Keep in mind that this CPU is the absolute top you can get at the moment (to my knowledge); it basically has no bottleneck in the CPU department._

*My opinion on your problem:*

A first-degree CPU bottleneck: your CPU simply can't push both cards. Also keep in mind that in most games (not Mantle games) AMD GPUs carry a larger DX11 API overhead. This especially hits at 1080p; the cards will perform better at higher resolutions, where the CPU overhead is less visible. (Mantle and DX12 will cure this.)

As you want to play BF4 at 144 FPS with 290s in Crossfire, that should be perfectly possible, as you can see below that the low is 145 FPS on 2x 290 Crossfire. But your CPU is holding you back: the 2x Titan X SLI setup on the 3960X he tested BF4 with hit a CPU wall of 24,884 points (the Titans should perform around 35,000 points), whereas 2x 290s are 21,000 absolute max, so 18,000-point stock versions would not have bottlenecked his test setup, which let his 290s push full performance at 1080p.



I hope this helps you.
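One quick way to read the table above is multi-GPU scaling efficiency: the multi-card score divided by n times the single-card score on the same CPU. A rough sketch only, since the quoted scores come from different submissions with different card clocks:

```python
# Scaling efficiency = multi-GPU score / (n * single-GPU score),
# using the Firestrike scores quoted above. Different submissions,
# so card clocks vary -- treat this as a rough indicator only.
def scaling_efficiency(multi_score, single_score, n):
    return multi_score / (n * single_score)

# FX-8150: a second 290 adds almost nothing (CPU-limited)
fx8150_2x290 = scaling_efficiency(11341, 8455, 2)   # ~0.67
# i7-5960X: two 290s scale essentially perfectly
i5960x_2x290 = scaling_efficiency(20926, 10050, 2)  # ~1.04
```

Anything near 1.0 means the cards are doing the limiting; well under 1.0 on the same cards means the CPU is.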


----------



## Mega Man

First, 3DMark (whichever version) not only heavily favors Intel/Nvidia, but also, considering those are overclocked results, the numbers really are not that far off.


----------



## fyzzz

I have been trying to overvolt with +300mV because I want to see if I can push my overclock higher. I have tried to change some things in MSI AB, but it didn't work. GPU Tweak allows up to 1400mV, but that doesn't give more voltage than +200mV in Trixx does.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> I have been trying to be able to overvolt with +300mv because i want to try if i can push my overclock higher then. I have tried to change som things in msi ab but it didn't work. Gputweak allows up to 1400 mv but that doesn't give more voltage than +200mv in trixx.


It's so cheap you are not afraid to break it, huh?

Anyway, have you tried this (post #14029)?

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/14020#post_21574982

Before you do, try lowering your mem to 1500 and you might hit 1300 on the core. You are maxing out the power limit, right?


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> it's so cheap you are not afraid to break it, huh?
> 
> anyway, have you tried this (post # 14029) . . .
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/14020#post_21574982
> 
> before you do, try lowering your mem to 1500 and you might hit 1300 on the core. you are maxing out the Power Limit, right?


I have tried the MSI AB thing but it didn't work, and yes, I always put the power limit to max. My card can do around 1240 with +200 in Trixx.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> I have tried the msi ab thing but it didn't work and yes i always put the power limit to max. My card can do around 1240 with +200 in trixx


I assume it's watered. How about the PT1 BIOS? You've got to have water to use it, though.

I've seen 1.41V with +200 in Trixx.


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> i assume its watered. how about PT1 Bios? you got to have water to use it, though.
> 
> i've seen 1.41v with +200 in Trixx.


It is not under water, but I have a Raijintek Morpheus with 2 NF-F12s on it. When I benchmark with +200, the voltage goes up to around 1.3 and the temperature to around 65. I've tried the PT1 BIOS, but I don't like it; it's too scary.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> It is not under water but i have a raijintek morpheus with 2 nf-f12's on it. When i benchmark with +200 the volt goes up to around 1.3 and temperture around 65. Iv'e tried the PT1 bios but i don't like it it's too scary


I see. Have you checked the VRM temps, especially VRM1? They need to be cooled as much as the core. A full block does a nice job, 'cause they end up cooler than the core when watered.

If it is just for benching purposes, then sure, short bursts of higher than +200 will work, but not for 24/7 use.

Edit: also, I'm not sure what PSU you have. That could limit a higher OC as well.


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> i see. have you checked the vrm temps, especially vrm1? they need to be cooled as much as the core. full block does a nice job 'cause they become cooler than the core when watered.
> 
> if it is just for benching purposes, then sure. short burst of higher than 200+ will work but not for 7/24 use.
> 
> edit: also, not sure what psu you have. that could limit higher oc as well.


VRM2 is just fine, and I have heatsinks on it. VRM1, though, is another story: I am using the stock VRM1 heatsink that came with my card, because I have the DCII version, so the VRM1 heatsink that came with the cooler didn't fit. I am looking to fix this one day. The PSU I have is a Corsair RM750.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> The vrm 2 is just fine and i have heatsinks on it. The vrm 1 tho is another story. I am using the stock heatsink on vrm 1 that came with my card, i have the dc2 version so the vrm 1 heatsink that came with the cooler didn't fit. I am looking to fix this one day. The psu i have is a corsair rm 750


Yeah, VRM1 is a challenge to cool on air. I've benched both my 290s using +200 with Trixx on a cheap 700W PSU (each) and never had a problem. Not sure about that PSU . . .

http://www.overclock.net/t/1455892/why-you-might-not-want-to-buy-a-corsair-rm-psu

Not saying you have to buy a new one; I would just stick to +200 at most.


----------



## Gobigorgohome

Impulse buy.

Got an Asus DirectCU II R9 290X, brand new, for a second-hand price.


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> yah, VRM1 is a challenge to cool on air. i've benched both my 290s using +200 with Trixx with a cheap 700W psu (each) and never had a problem. not sure about that psu . . .
> 
> http://www.overclock.net/t/1455892/why-you-might-not-want-to-buy-a-corsair-rm-psu
> 
> not saying you have to buy a new one. i would just stick to +200 at most.


Yeah, I know the RM PSU might not have been the best choice, but I didn't have much to choose from when I built my computer. If I get my hands on a second 290, I'll need to upgrade the PSU anyway.


----------



## Agent Smith1984

You guys seen this?!
http://www.tweaktown.com/tweakipedia/56/amd-fx-8350-powering-gtx-780-sli-vs-gtx-980-sli-at-4k/index.html

And then this:


Uhhhhhh....... That is incredible!

Don't sell the GPUs and waste money on a CPU, etc... Get a 4K monitor and call it a day! lol


----------



## Batpimp

Quote:


> Originally Posted by *gatygun*
> 
> Readed most of your post. I will add my feel towards the mix about your setup.
> 
> *Keep in mind that you can't judge performance from the numbers i tell you perfectly well, most titan/980 results are of extreme oc'ed editions while the 290's are standard stock versions most of the time, keep it real but it showcases why your got the issue's you have now perfectly well*
> 
> *3DMark Firestrike (highest scores):*
> 
> I choose firestrike and not extreme or any other taxing one as they won't give you numbers that would equal towards a lower resolution experience ( aka 1080p ) on higher performance hardware.
> 
> AMD FX-8150:
> 
> 1x 290 = 8455 points
> 
> 2x 290's crossfire = 11.341 points
> 
> 3x 290's crossfire = 9.806 points
> 
> 1x Titan x = 10.940 points
> 
> 1x 980 = 10.315 points
> 
> 2x 980 sli = 11.689 points
> 
> AMD FX-8350
> 
> 1x 290 = unknown points
> 
> 2x 290's crossfire = 13.520 points
> 
> 3x 290's crossfire = 14.077 points
> 
> 1x titan x = 12.049 points
> 
> 2x titan x = 14.665 points
> 
> 3x titan x = 14.534 points
> 
> 1x 980 gtx = unknown points
> 
> 2x 980 gtx = 14.323 points
> 
> INTEL I7-5960x ( intel top model )
> 
> 1x 290 = 10.050 points
> 
> 2x 290's crossfire = 20.926 points
> 
> 3x 290's crossfire = 23.226 points
> 
> 4x 290's crossfire = 29.281 points.
> 
> 1x titan x = 24.222 points ( extreme oc'ed
> 
> 2x titan x = 35.106 points ( extreme oc'ed )
> 
> 3x titan x = unknown points
> 
> 4x titan x = 43.380 points
> 
> 1x 980 gtx = unknown points
> 
> 2x 980 gtx = 31.589 points
> 
> 3x 980 gtx = 35.764 points
> 
> 4x 980 gtx = 41.098 points
> 
> *Conclusion findings.*
> 
> *FX-8150* caps out around *11.400 points*
> 
> ( a normal 290 will mostly get you around the 8900 points, a oced on 1145 core and 1450 mem will give you 10.108 points to get a idea ).
> 
> *FX-8350* caps out around *14.500 points*,
> 
> _18.000 points is for stock 290's x2, so yea you will still get bottlenecked even with stock let alone with oced version which will move towards up 20k_
> 
> *INTEL I7-5960x* caps out around *43.000 points*
> 
> _Keep in mind that this cpu is the absolute top you can get atm ( from my knowledge ), it basically features no bottleneck cpu depatment._
> 
> *My opinion on your problem:*
> 
> Cpu bottleneck first degree issue. Your cpu simple can't push both cards forwards. Also keep in mind that in most games ( not mantle games ) amd gpu's will have a larger dx11 api overhead this specially hits 1080p resolution, the cards will perform better on higher resolutions as cpu overhead will be lesser visible. ( mantle and dx12 will cure this ).
> 
> As you want to play bf4 on 144 fps with 290's crossfire, that should be perfectly possible as you can see below that the low is 145 fps on 2x 290's crossfire. But your cpu is holding you back as 2x titan x sli on his 3960x cpu where he tested bf4 with make his cpu hit a wall of 24.884 points ( titans shoudl perform around 35.000 points ). 2x 290's = 21.000 absolute max so 18.000 stock versions would not have bottlenecked his test setup, which makes his 290's push full performance on the 1080p resolution.
> 
> 
> 
> I hope this helps you.


Alright, I read your post a few times to make sense of it.

8150 max scores according to this graph are 11,400*

8350 max score with R9 290 Crossfire is 14,000

I don't see/understand this comment: "*18.000* points is for stock 290's x2, so yea you will still get bottlenecked even with _*(with what, the 8350/8150, or both?)*_ stock let alone with oced version which will move towards up 20k" _*(I don't see where either Intel hits around 20k; did you mean Intel here?)*_


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You guys seen this?!
> http://www.tweaktown.com/tweakipedia/56/amd-fx-8350-powering-gtx-780-sli-vs-gtx-980-sli-at-4k/index.html
> 
> And then this:
> 
> 
> Uhhhhhh....... That is incredible!
> 
> Don't sell gpu's, and waste money on cpu,etc... Get a4k monitor, call it a day! lol


I've had my 4K for almost half a year with 2 290s at stock. Here is BF4, Operation Locker 64-player MP, using Medium and no AA . . .



Mantle of course. My bad, that's DX11; Fraps and Mantle don't mix.


----------



## Batpimp

Quote:


> Originally Posted by *rdr09*
> 
> i've had my 4K for almost half a year with 2 290s at stock. here is BF4 opn'n locker MP64 using Medium and no AA . . .
> 
> 
> 
> Mantle of course. my bad. that's DX11. fraps and mantle don't mix.


cpu?


----------



## rdr09

Quote:


> Originally Posted by *Batpimp*
> 
> cpu?


An i7 SB at 4.5GHz. That's it for that CPU; I think it can handle a single 390, but not 2.


----------



## Agent Smith1984

It's just kind of funny how the 8350 beat a 4930K at 4K, and 290X Crossfire beat 980 SLI at 4K.....

It's like, "damn, all I gotta do now is get a 4K TV, and I'm high-end gaming as well as anybody."

What's more, I was planning on going 4K soon, so this info is great news for me. My second 290 Tri-X is arriving Tuesday.... I think this 8300 at 4.9GHz will handle these girls just fine.


----------



## Batpimp

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It's just kind of funny how the 8350 best a 4930k in 4k, and the 290x crossfire beat 980 sli in 4k.....
> 
> It's like, "damn, all i gotta do now is get a 4tv, and I'm high end gaming well as anybody"
> 
> What's more is, I was planning on going 4k soon, so this info is great news for me. My second 290 tri-x is arriving Tuesday.... I think this 8300 at 4.9ghz will handle these girls just fine


Let us know if you get a big FPS increase. The 8150 I have is not making it happen: I get 90 FPS fine with 1 card, but sometimes I dip to 65 FPS in BF4 with two cards. Not sure what the reason for that is.


----------



## gatygun

Quote:


> Originally Posted by *Batpimp*
> 
> alright i read your post a few times to make sense of it.
> 
> 8150 max scores according to this graph are 11400*
> 
> 8350 max scores with r 290 crossfire is 14,000
> 
> I dont see/understand this comment "*18.000* points is for stock 290's x2, so yea you will still get bottlenecked even with _*(with what the 8350/8150 or both?)*_ stock let alone with oced version which will move towards up 20k" _*(i dont see where either intel hits around 20k did you mean intel here?)*_


On the DX11 API:

An 8150 will bottleneck your 290s at 950 core clock each (think ~60% max usage on both GPUs, so GPU 1 at 60% and GPU 2 at 60%).

An 8350 will bottleneck your 290s at 950 core clock each (think ~80% max usage on both GPUs).

The 20-24k+ (whatever it was) number I used was from an Intel i7 3960X EE @ 4.7GHz. It pushes about 24,000 points; that is what was used in that benchmark, with 1866MHz memory.

That system was built to prevent any bottleneck from happening and to tax the GPUs they tested to the maximum.

AMD's DX11 drivers are worse than Nvidia's at 1080p, so heavy CPU-taxing games like BF4 will show worse performance than on Nvidia.

But BF4 uses Mantle.

On Mantle:

This means that BF4 will get a boost from Mantle; to see to what extent, I needed to pull another benchmark and combine them.

This benchmark shows about an 18% performance difference between DX11 and Mantle in BF4.



14,500 increased by ~17% = almost 17,000 points (a bit lower, actually).

While that would still bottleneck the 290s, it is a lot less than an 8150, which would hang at ~13,000 points (a bit lower, actually).

2x 290s in Crossfire at 950 core clock will push about 17,500 points, at 1000 core about 18,000+, and at 1200 core about 21,000+ (it's actually higher, as memory also gets factored in).

Conclusion:

Both the 8150 and the 8350 will bottleneck your 290s, even with Mantle on in Battlefield 4. While stock-clocked 290s will still be bottlenecked with an 8350, it's by far better than an 8150.

If you really don't want any bottleneck and want to use those 290s at full speed in DX11 titles, even with Mantle, neither CPU will work.

P.S. the CPUs used to push these points are heavily OC'd; think around a 5.1GHz overclock.


----------



## Agent Smith1984

The term "bottleneck" is beginning to drive me nuts, lol!

It's always conditional guys....

If you get 60% CPU usage, and 60% usage in each GPU, that just means you aren't doing anything that is demanding the cards or CPU to work harder....

Either that or the 60% CPU usage is not indicative of individual core load, and certain threads used to run the game are maxed out, and unable to push the GPU'S harder. You could call that a bottleneck, but in reality, it's just poor game development.....

There can also be driver-overhead issues, or high-resolution scaling problems, which can be controlled with a resolution-timer app; some people use one in Crysis 3 for multi-GPU setups. I saw one guy using 60% of an OCed i7-3770K at 4.4GHz and only getting 55% GPU usage in game with his four 780s. He turned resolution timing down to minimum and got 95% CPU usage and 80% GPU usage.

Either way.... yes, an 8150 can pigeonhole two cards, in some cases much worse than others. But all in all, if you are pushing high graphics settings and your CPU isn't pegged out, you should see an improvement from one card to two. The higher the resolution and settings, the better the scaling will be....


----------



## aaroc

For the user with the BF4 problem: please search the internet for the reviews that show the effects of RAM speed. The most affected game was BF4, and the performance increase was big going from 1866 to 2400; other games were less affected or not at all. After reading more than one review showing the same thing, I ordered 2400 RAM. I have no problems playing games on 3x 2560x1440 monitors with an AMD CPU and GPUs.
Megaman knows a lot and helps everybody, as do almost all other users in this thread.


----------



## Roboyto

Quote:


> Originally Posted by *aaroc*
> 
> for the user with problem with BF4, please search the internerd for the reviews that show the effects of ram speed. The most affected one was BF4 and the increase in perf was big going from 1866 to 2400. Other games were less or not affected. After reading more than one review showing the same I ordered 2400 Ram. I have not problems playing games with 3x 2560x1440 monitors with an AMD CPU and GPUs.
> Megaman knows a lot and helps everybody as almost all other users in this thread


I concur. I am not a BF4 player, but this was shown way back in January of last year in @Gunderman456 build log. Take a look through his thread if you have questions regarding Xfire of our cards because he covered just about everything!

http://www.overclock.net/t/1442038/build-log-the-hawaiian-heat-wave/120_20#post_21628080

Here's the post regarding RAM speeds and BF4 with a 30% boost in performance from 1333 to 2400.
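For a rough sense of scale: the peak-bandwidth jump from 1333 to 2400 is much larger than the observed fps gain, which is why BF4 stands out as unusually memory-sensitive. A quick hedged sketch (the 30% figure is the one quoted above; the bandwidth ratio is just the frequency ratio, ignoring timings):

```python
# Compare theoretical DDR3 bandwidth scaling with the observed BF4 fps gain.
slow_mhz, fast_mhz = 1333, 2400
bandwidth_scale = fast_mhz / slow_mhz   # ~1.80x peak bandwidth (timings ignored)
observed_fps_gain = 0.30                # ~30% fps, from the linked build log

print(f"bandwidth: {bandwidth_scale:.2f}x, fps: {1 + observed_fps_gain:.2f}x")
# → bandwidth: 1.80x, fps: 1.30x
```

So even in the most affected game, fps captures only part of the raw bandwidth increase.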


----------



## rdr09

Quote:


> Originally Posted by *Roboyto*
> 
> I concur. I am not a BF4 player, but this was shown way back in January of last year in @Gunderman456
> build log. Take a look through his thread if you have questions regarding Xfire of our cards because he covered just about everything!
> 
> http://www.overclock.net/t/1442038/build-log-the-hawaiian-heat-wave/120_20#post_21628080
> 
> Here's the post regarding RAM speeds and BF4 with a 30% boost in performance from 1333 to 2400.


And post #252 in pg. 26.


----------



## mfknjadagr8

Ok, so I just experienced an issue I haven't seen before. First the scenario, then a question, then my thoughts.

I started my computer and everything booted fine, into Windows quickly. As soon as I hit the login screen, I noticed the LED on my waterblock goes out. So I check connections, seating, etc.: all good. I restart; LEDs on the card working; hit the Windows login screen again and bam, off like a switch. Once everything loads in Windows, the LED comes back on, but in Afterburner there's no core clock or temp monitoring for the card. So I shut down again, this time removing power from the top card and connecting the monitor to the second card: boots fine, GTA 5 runs fine. Reconfigure as it was originally, same issue, so I remove power from the second card and reboot again: top card working fine on its own.

Now the question: can enabling/disabling crossfire break the driver or the registry entries associated with it? I have done this a lot the last few weeks, because for some reason the zombie portion of Advanced Warfare runs HORRIBLY with crossfire on. Or could this simply be a bad shutdown / Windows issue? I'm thinking it might be a driver issue, so I'm going to reinstall them AGAIN and get a good USB drive tomorrow in case I need a new Windows install, since I don't have an optical drive anymore. Anyone experienced this?


----------



## kizwan

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Ok so i just experienced an issue i havent seen before....first the scenario then a question then my thoughts.. I started my computer everything booting fine.... boots into windows quickly.. as soon as i hit the login screen i notice the led on my waterblock goes out... im like wth.. so i check connections seating etc.. all good... so i restart.... leds on card working... hit windows login screen again bam... off like a switch... so i load into windows... and after everything loads the led comes back on... so i pull up afterburner... no core clock or temp monitoring for the card.... so i shutdown again... start computer again this time.. i remove power from the top card.. connect monitor to the second card... boots fine load up gta 5 runs fine.... shutdown... reconfigure as it was originally same issue upon starting so i remove power from second card reboot again... top card working fine on its own....so... now the question is... can enabling/disabling crossfire bork the driver or the registry entries associated with it... i have done this a lot last few weeks because for some reason crossfire on zombie portion of advanced warfare runs HORRIBLE with crossfire on....or could this simply be a bad shutdown and windows issue...Im thinking it might be driver issue... so im going to reinstall them AGAIN and get a good usb drive tommorrow for in case i need a new windows install... since i dont have an optical drive anymore... anyone experienced this?


With crossfire, the LED on the secondary card will light up when idling if ULPS is enabled. Monitoring software won't be able to read correct info from that card. Is this the LED you're referring to?


----------



## mfknjadagr8

Quote:


> Originally Posted by *kizwan*
> 
> With crossfire, LED on secondary card will light up when idling if ULPS enabled. Monitoring software won't be able to read correct info from this card. Is this the LED you're referring to?


Update: reinstalling the drivers and re-disabling ULPS seems to have fixed the issue. However, the LED I was referring to was the one on the Komodo block going off when the Windows login screen came up, with the card subsequently not being used or recognized by monitoring programs or anything else, as though it weren't seated (that LED stays lit as long as the motherboard is powering the card, even if it's not connected to the power supply). I noticed it would come back on periodically, and when it was off the card would display the green LED you were referring to, as though ULPS were enabled, even though all the registry keys were 0s. The second card was also unusable in any application. Glad it wasn't hardware, but it was very odd...


----------



## Mega Man

ULPS; the Komodo LED is powered by the fan header, and ULPS shuts off the fan.


----------



## nX3NTY

I just wanted to report something: I've pinpointed why my cards won't drop to the 150MHz idle memory clock no matter what VBIOS I use. The reason is monitor overclocking. If I raise my refresh rate past a certain point (in my case above 70Hz), the memory clock stays at the 3D clock even at idle. Staying at 70Hz lets it stick to the idle clock.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Mega Man*
> 
> upls, the komodo led is powered by the fan, upls shuts off the fan


Yeah, I thought about that, but ULPS wouldn't have prevented the card from being utilized at all, and all the ULPS values in the registry were set to 0; that's why it puzzled me. I'm guessing it was likely a combination of ULPS running despite the settings and the driver not utilizing the card. It's working now, so that's what matters. Sitting here now, I don't remember putting the mounting screws back in the backplate... oh well, I'm off in 30 minutes. But my fiancée did play GTA 5 all night with the card held by just the bus and the joining fitting, and the fitting was a Swiftech adjustable, so it's not as secure as two compression fittings and tube, or a bridge... oops, lol.


----------



## tsm106

Quote:


> Originally Posted by *nX3NTY*
> 
> I just wanted to report something, I already pinpointed why my cards won't do 150MHz idle memory clock no matter what VBIOS I use. The reason is monitor overclocking, it seems like if I raised my refresh rate past certain rate (mine being above 70Hz) the memory clock will stay at 3D clock even at idle. Staying at 70Hz makes it stick to idle clock.




You can use up to 120Hz if your AB is set up for that and still get low idle clocks. At 144Hz it forces 3D memory clocks.


----------



## Cyber Locc

Quote:


> Originally Posted by *Batpimp*
> 
> okay i did what you said and from this first set of tests it seems to have worked!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Please take note. In the MSI AB pics the drop off is when the test ended and i just left the room for a bit.*


Okay, so like I thought, your Heaven scores are way low for crossfire. My loop was down when I was helping you before; it's up now.








Here's my Heaven result at stock clocks.


As you can see, there's a huge difference. That is with a 4820K at 4.8GHz.


----------



## MerkageTurk

Just ordered a 290


----------



## DividebyZERO

Quote:


> Originally Posted by *MerkageTurk*
> 
> Just ordered a 290


I thought you weren't happy with the graphics quality on your 280X?

Edit: NM, I see you got it resolved.


----------



## Cyber Locc

Hey guys, I have a mod in mind that I'd like some input on. I want to run the PT1 BIOS for benching etc. However, no 2D clocks on 3 cards is a deal breaker. That wouldn't normally be an issue, since the 290 has the wonderful BIOS switch, but they didn't really think through the location: my BIOS switches are covered by my EK terminal, so I need to design a switch extender that puts a switch outside the terminal block. Any ideas of a way I could do this? It ain't going to be easy, that I am sure of, lol.

Also, for anyone running the PT1 BIOS: how much does the wattage increase at idle? AIDA numbers, or a before/after difference with a Kill A Watt. If it's only 100W or so more for a few cards, then no problem.


----------



## nX3NTY

Quote:


> Originally Posted by *cyberlocc*
> 
> Hey guys so I have a mod on my mind that would like some idea or input on. I want to run PT1 bios for benching ect, However no 2d clocks on 3 cards that's a deal breaker. Now that isn't an issue as the 290 has the wonderful bios switch however they didn't really think through the location. My bios switches are covered up by my EK terminal so I need to design a switch extender that will put a switch outside of the terminal block. Any ideas of a way I could do this it aint going to be easy that I am sure off lol.
> 
> Also anyone that has PT1 bios how much is the wattage increased when idle? Aida numbers or if you have a difference with a killowatt if its only 100w or so watts more for a few cards than no problem.


Why the fuss? You can make a custom profile in AB, TriXX, or whatever for it to run at the 2D clock: just downclock and set the VDDC offset to a negative value to simulate the 2D voltage, and voila!







I tried it yesterday with the PT1 BIOS when I was trying different BIOSes for fun, and it runs great. If I want to game, I simply reset it back to default. Just my two cents.


----------



## Cyber Locc

Quote:


> Originally Posted by *nX3NTY*
> 
> Why the fuss? You can made a custom profile in AB, Trixx or whatever for it to run at 2D clock, just downclock and put the VDDC offset to minus value to simulate 2D voltage and voila!
> 
> 
> 
> 
> 
> 
> 
> I try it yesterday with PT1 BIOS when I was trying different BIOS for fun, it runs great. If I want to game I simply reset it back to default. Just my


I was under the impression that PT1 wouldn't allow clocks lower than 1000/1250. Are you saying it will? I would love to do what you suggest, but even at stock voltage, 1000/1250 is a whole lot of wattage, unless I'm wrong and it isn't, which is what I'm trying to figure out.


----------



## nX3NTY

Quote:


> Originally Posted by *cyberlocc*
> 
> I was under the impression that pt1 wouldn't allow clocks lower than 1000-1250 are you saying it will? I would love to do your suggestion but even at stock voltage 1000/1250 is a whole lot of wattage unless I'm wrong and it isn't which is what I'm trying to figure out.


It does allow lower clocks; I managed around 300MHz core and 300MHz memory with a -200 VDDC offset in TriXX. Stable in Windows doing stuff like watching videos, browsing, Flash video, etc. No ill effects.


----------



## Cyber Locc

Quote:


> Originally Posted by *nX3NTY*
> 
> It did allow lower clocks, I managed around 300MHz core and 300MHz memory with -200 VDDC offset in Trixx. Stable in Windows doing stuff like watching videos, browsing, watching flash videos etc. No ill effect


Well, now that I can live with. I wonder why people say it doesn't allow lower clocks. I'll have to give it a flash tonight and see what I can do.

However, did you check the clocks in GPU-Z or similar to make sure they were actually at 300?


----------



## MerkageTurk

My 280X had black-screen driver crashes and recoveries every time I ran a 3D application,

even after several reinstalls of Windows,

so I returned it to Amazon and am waiting for my Vapor-X 290X.

Recommended by rick cooper, as he said the issue I described is the VRMs running really hot.


----------



## zealord

Quote:


> Originally Posted by *MerkageTurk*
> 
> My 280x had black screen driver crashes and recovering everytime running a 3d application,
> 
> Even after several reinstall of windows,
> 
> So returned to amazon and waiting for my vapor x 290x


That whole black-screen thing is really weird with AMD cards. I had it on my 290X on every boot after a couple of minutes; then I swapped both GPU PCIe cables and now it works like a charm.


----------



## MerkageTurk

It runs fine on the desktop, but when running 3D applications it crashes and recovers.

Plus I have a 1200x, which should be enough for a 280X.


----------



## Gobigorgohome

How do two R9 290Xs in crossfire handle GTA V @ 4K with high settings? Also, Assassin's Creed Unity @ 4K with high settings. Stock settings on the cards.


----------



## zealord

Quote:


> Originally Posted by *Gobigorgohome*
> 
> How does two R9 290X in crossfire handle GTA V @ 4K with high settings? Also, Assasin Creed Unity @ 4K with high settings. Stock settings on cards.


Pretty good, I'd say. But if I might ask, why stock settings on a 290X Lightning? That card is meant for overclocking.









GTA V has a lot of settings ranging from normal to ultra, and many sliders for density etc. Maxing out GTA V at 4K would be impossible on 2x 290X, but a reasonable mix of high settings should be easily doable.


----------



## LandonAaron

Quote:


> Originally Posted by *tsm106*
> 
> 
> 
> You can use up to 120hz if your AB is setup for that and still get low idle clocks. At 144hz it forces 3D memory clocks.


Is it normal when using crossfire for the first card's memory clock not to downclock when on the desktop?


----------



## BradleyW

Quote:


> Originally Posted by *LandonAaron*
> 
> Is it normal when using crossfire for the first cards memory clock not to downclock when on desktop?


This can happen if you have an overclocked monitor or a monitor set at 144Hz.


----------



## LandonAaron

Quote:


> Originally Posted by *BradleyW*
> 
> This can happen if you have an overclocked monitor or a monitor set at 144Hz.


Yes, I have one of those Korean 1440p overclockable monitors, and it's generally set to 96Hz, but I just tested at 60Hz and the memory clock is still at 1250 on the top card.


----------



## rdr09

Quote:


> Originally Posted by *LandonAaron*
> 
> Is it normal when using crossfire for the first cards memory clock not to downclock when on desktop?


No, that's not normal. Check what services you're running in the background.


----------



## Gobigorgohome

Quote:


> Originally Posted by *zealord*
> 
> Pretty good I'd say, but if I might ask why stock settings on 290X Lightning? That card is meant for overclocking
> 
> 
> 
> 
> 
> 
> 
> 
> 
> GTA V has a lot of settings ranging from normal to ultra and many sliders for density etc. Maxxing out GTA V would be impossible on 4K with 2x290X, but at reasonable settings with a mix of high settings should be easily doable.


Coming from quadfire R9 290X on water, there's no reason to overclock: the percentage FPS increase is so little that it's pretty much unnoticeable. I had a better FPS average by leaving the cards at stock and overclocking the CPU. Also taking into account the increased temperature and power draw, for me it's not worth it.









Good to hear it would do okay with two R9 290Xs; I'm not planning on going higher than "high" settings anyway.


----------



## Phaster89

after 40 minutes of offline gameplay:

gpu clock - 977

mem clock - 1250

power limit - 0%

stock core voltage, air cooled msi r9 290 4g gaming

on 15.4 drivers

gpu temp - 82ºC

gpu usage - always above 70

fan speed - 80%

memory usage - 3223 roughly

cpu temp - 60ºC

cpu usage - above 70%

ram usage - 6gb

Game settings are the following, after some fiddling to keep closer to 60fps:








I have some drops below 40fps; is that supposed to happen on my sig rig?


----------



## nlgPRO

Will this pic suffice?


That model is the SAPPHIRE VAPOR-X R9 290X 8GB GDDR5 PCI-E TRI-X (UEFI), and it's running on stock cooling.


----------



## Mega Man

Just an FYI: don't attach it (the paperclip); use the little picture icon instead,

this one:


----------



## GMcDougal

I'm now a (hopefully) proud owner of an MSI 290X Gaming 4G. Anyone have this card, and if so, how do you like it? Thanks.


----------



## Durtmagurt

Hi, I'm debating between adding a Swiftech Komodo for my R9 290 to my H220-X, or buying another R9 290 and adding the GPU blocks later.

Swiftech komodo & everything needed to install. $250

Used r9 290. $180

Any advice is very much appreciated.


----------



## Mega Man

My advice is "12", as you gave us zero info to help you.

What are you doing? What monitors are you running, etc.?

Please fill out Rigbuilder (see the link in my sig).


----------



## Durtmagurt

Sorry, I believe I've updated my profile.
If I missed something, the specs are:
Mobo - Asus M5A99FX Pro 2.0
CPU - FX-8350 @ 4.5GHz
GPU - Sapphire R9 290
RAM - 16GB Corsair Vengeance 1866MHz
PSU - Corsair AX1200
Liquid cooling - Swiftech H220-X
Case - Corsair 800D
Monitor - LG 25" ultrawide and an Asus 23" side monitor


----------



## Durtmagurt

I'm also gaming, but building my first PC I've found myself wanting to make it look the best I can. And I've found myself wanting to watercool everything. Lol


----------



## Mega Man

You are not pushing too many pixels (assuming you only game on one monitor),

so either will be an upgrade.

Do you want silence, or better graphics?

I assume you are not using both when gaming (although some do).

At this point I would say either will be better, but neither presents a 100% difference.


----------



## cephelix

Quote:


> Originally Posted by *GMcDougal*
> 
> Im now a (hopefully) proud owner of an msi 290x gaming 4g. Anyone have this card and if so, how do they like it? Thanks


Same card as mine, though I have the non-X version. It does its job, what can I say..
The most important thing to remember about the 290s is cooling them well, whether on air or water. I had mine on water previously but went back to air. I thought I knew all there was to know about air cooling; I was sadly mistaken. The thing about non-reference coolers is that they spew hot air back into the case, and if you don't have an exhaust fan near the card, it will re-ingest its own exhausted air, limiting your overclocks and increasing VRM and core temps. What I did to circumvent this, and you may try it if you feel your card is running hot like mine was, is to zip-tie an exhaust fan to the unused PCIe slots directly under the card. I saw a dramatic improvement in core temps: stock 925/1250 with no exhaust fan was 85C under load; 1100/1500/+50mV with a Silverstone AP-121L was still 85C; 1100/1500/+50mV with a Gentle Typhoon 1850rpm dropped to 75C.
Quote:


> Originally Posted by *Mega Man*
> 
> my advice is "12" as you gave us zero info to help you,
> 
> what are you doing, what monitors are you running ect !~
> 
> please fill out rigbuilder ( see the link in my sig )


dude, be nice, he's new here








Quote:


> Originally Posted by *Durtmagurt*
> 
> Sorry i believe ive updated my profile.
> If i missed something the specs are.
> Mobo- Asus m5a99fx pro2.0
> Cpu- fx 8350 4.5ghz
> Gpu- sapphire r9 290
> Ram- 16gigs corsair vengeance 1866ghz
> Psu- corsair ax 1200
> Liquid cooling- swiftech h220x
> Case- corsair 800D
> Monitor- lg 25" ultrawide and a asus 23" side monitor


Quote:


> Originally Posted by *GMcDougal*
> 
> Im now a (hopefully) proud owner of an msi 290x gaming 4g. Anyone have this card and if so, how do they like it? Thanks


Quote:


> Originally Posted by *Mega Man*
> 
> my advice is "12" as you gave us zero info to help you,
> 
> what are you doing, what monitors are you running ect !~
> 
> please fill out rigbuilder ( see the link in my sig )


Quote:


> Originally Posted by *Durtmagurt*
> 
> Im also gaming but building my first pc ive found my self wanting to make it look the best i can make it look. And ive found myself wanting to water cool everything. Lol


If you can afford it, I would suggest watercooling. Even if it doesn't look super nice, at least you know your temps are in check and will be significantly lower than on air. As stated above, I went from air to water and back to air: I kept getting paranoid that my Swiftech H220 pump would fail at some point, or that my tubes would leak for no apparent reason even with proper installation, and take all my other gear with it in a spectacular but hellish puff of smoke. So I tore down my loop, bought an NH-D15 for my CPU, made sure airflow through my case was sufficient, and called it a day. I'm sure my paranoia is unfounded, as you can see from the many watercooled builds here on OCN, but I just never had peace of mind when my rig was under water. The one thing I was super happy about was the temps, though.


----------



## Mega Man

i was nice, and i am

i am also blunt


----------



## cephelix

Quote:


> Originally Posted by *Mega Man*
> 
> i was nice, and i am
> 
> i am also blunt


Lol, that was the word I was looking for.









I was tempted to get the Vapor-X 290X 8GB at one point, but monitoring my VRAM usage, in DAI it's only about 2GB; I'll have to take a look at SoM usage again, though. I need more GPU processing power at this point than VRAM.


----------



## Durtmagurt

That was a mistake on my part; I should have thought about what I was asking before I posted. Lol, how could you guys help if you had no clue what my build currently is? But nonetheless, I want better graphics, and I think going with the watercooled option is best, then adding another card in the future. I'll probably overclock just a tad and call it done until I can afford another. That way I get better looks and a slight increase in performance. Thanks for the help, guys.

And besides, tbh my 290 is doing great, especially for the games I play (D3, League, Counter-Strike, etc.), so better graphics would only get me so far, haha.


----------



## nX3NTY

Quote:


> Originally Posted by *cyberlocc*
> 
> Well now that I can live with. I wonder why people say it doesn't allow lower clocks. I'll have to give it a flash tonight and see what I can do.
> 
> However did you check the clocks in gpuz or similar to make sure they were actually at 300?


Yep, I did monitor with GPU-Z; it ran at the specified speed and voltage.


----------



## Durtmagurt

Has anyone tried the swiftech komodo GPU water block?


----------



## Cyber Locc

Quote:


> Originally Posted by *nX3NTY*
> 
> Yep I did monitor with GPU-Z, it run at specified speed and voltage.


Cool, about to flash now, will get back.









Hmm, no cigar; the lowest I can go is 800MHz on the core. Is that a software limitation? Any way to make the slider go lower?

Okay, so I grabbed TriXX and you're right, TriXX will do it. TriXX will not, however, allow me to overclock more than Afterburner. The whole point of PT1 is no voltage limits (with GPU Tweak), whereas TriXX caps me at 1.3V.

Edit again: TriXX will not allow downclocking at all, even with no VDDC drop. Are you sure you were running the PT1 BIOS?

I also ran some numbers for wattage. Running 3 cards with PT1, where I live, with my power prices, comes to about 35 dollars a month (if just idling 24/7). PT1 is cool, but not that cool.
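For anyone wanting to redo that monthly-cost math with their own numbers, here's the shape of it (every input below is an assumed placeholder for illustration, not a measured draw):

```python
# Monthly cost of extra idle draw from running the PT1 BIOS 24/7.
# The per-card wattage and electricity rate are assumed placeholders;
# plug in your own Kill A Watt reading and local rate.
extra_watts_per_card = 160
cards = 3
rate_usd_per_kwh = 0.10
hours_per_month = 24 * 30

extra_kwh = extra_watts_per_card * cards / 1000 * hours_per_month
print(f"${extra_kwh * rate_usd_per_kwh:.2f} per month")  # → $34.56 per month
```

With those placeholder inputs it lands right around the ~$35/month figure above.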

I'm just going back to my switch idea


----------



## Cyber Locc

I'm also having the issue of my mem clock staying at 3D speeds in 2D on both my cards.


----------



## mus1mus

I have no issues with both my GPU and CPU on static Performance mode.









Especially if


----------



## kizwan

This is killing me.



I've already run out of quota for this month.


----------



## long99x

I plan to buy a Kraken G10 and an H55 to cool my Asus R9 290 DC2. Will it fit?

Thanks.


----------



## nX3NTY

Quote:


> Originally Posted by *cyberlocc*
> 
> Cool bout to flash now will get back
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hmm ya no cigar lowest i can go is 800mhz on the core? Is that a software limitation anyway to change the slider to go down further?
> 
> Okay so I grabbed trixx and your right trixx will do it. Trixx will not however allow me to overclock more than afterburner thats the whole point of pt1 is no voltage limits (w/GpuTweak) where trixx limits me at 1.3.
> 
> Edit again: trixx will not allow downclocking at all. Even with no vddc drop are you sure you were running pt1 bios?
> 
> I also got to run some numbers for wattage. Running 3 cards with pt1 where I live with my power prices is about 35 dollars a month (if just idling 24/7) pt1 is cool but not that cool.
> 
> I'm just going back to my switch idea


Yes, I'm sure I ran the PT1 BIOS, because it runs at 3D speeds. And yes, I managed to downclock it: GPU-Z showed lower temperatures, speeds, and voltage. I no longer use PT1 because I wanted to save energy.


----------



## Sempre

Sadly I got a stripped screw while trying to remove the stock cooler. I tried the rubber-band method and super glue, but it still won't budge. Next I'll try to drill into and through the screw. Anyone think that's a good/bad idea?



This is the bit I'll be using:


----------



## moorhen2

Well, it looks like I can't run 3 Sapphire Vapor-X 290Xs on my mobo, not without custom waterblocks. That sucks.


----------



## kizwan

Quote:


> Originally Posted by *moorhen2*
> 
> Well it looks like I cant run 3 Sapphire vapor-x 290x's on my mobo, not without custom waterblocks, that sucks.


You can use a PCIe riser extender.


----------



## Devildog83

Quote:


> Originally Posted by *Sempre*
> 
> Sadly i got a stripped screw while trying to remove the stock cooler. I tried the rubber band method and super glue but it still wont budge. Next ill try do drill into and through the screw. Anyone thinks its a good/bad idea?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> This is the bit ill be using:


Get one of those screw extractors that digs in and backs the stripped screw out before you try to just drill it out, and for heaven's sake be careful.

http://www.lowes.com/pd_299377-41877-8501P_4294607735__?productId=3073145&Ns=p_product_qty_sales_dollar|1&pl=1&currentURL=%3FNs%3Dp_product_qty_sales_dollar|1&facetInfo=


----------



## moorhen2

Quote:


> Originally Posted by *kizwan*
> 
> You can use PCIe riser extender.


Thanks for the reply, please tell me more.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Durtmagurt*
> 
> Has anyone tried the swiftech komodo GPU water block?


I own two. They are exceptionally built; my core temps never break 60C under any circumstances, and the VRMs are the same. The only issue I encountered was that the three screws that go through the backplate on the end opposite the I/O plate were a little hard to thread through; note this may have been my card...


----------



## Sempre

Quote:


> Originally Posted by *Devildog83*
> 
> Get one of those dig in and back out stripped screws before you try to just drill it out and for heavens sake be careful.
> 
> http://www.lowes.com/pd_299377-41877-8501P_4294607735__?productId=3073145&Ns=p_product_qty_sales_dollar|1&pl=1&currentURL=%3FNs%3Dp_product_qty_sales_dollar|1&facetInfo=


I went to a nearby hardware store, but they didn't have an extractor small enough for the GPU screw. I'll buy one online, which will take a week at minimum to arrive.

I just want to know if it's OK to drill through it, instead of waiting a week doing nothing.


----------



## Durtmagurt

Did you have to replace the VRM thermal pads as well? Also, would you recommend a different GPU block over the Komodo?


----------



## nX3NTY

Quote:


> Originally Posted by *moorhen2*
> 
> Thanks for the reply, please tell me more.


The riser extender he mentioned is one of these

http://www.amazon.com/PCI-E-Express-Extender-Flexible-Extension/dp/B008BZBFTG


----------



## Durtmagurt

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I own two...they are exceptionally built...my core temps never break 60c under and circumstances and vrms are the same...the only issue I encountered was the three screws that go through the backplate on the opposite end from the io plate were a little hard to thread through...note this may have been my card...


Did you have to replace the VRM thermal pads as well? Also, would you recommend a different GPU block over the Komodo?


----------



## Agent Smith1984

UPS is delivering my second 290 Tri-X today....

Got it for $230 in the very last batch of refurbs Newegg offered. They no longer carry them, and they sold out within a few hours of hitting last Friday.

Are there any pro tips on here for the initial things to look out for when setting up and overclocking two of these in crossfire?

I have some questions, if anyone can chime in:

Will these likely default to 8x/8x?
Do I need to disable ULPS in trixx for overclocking (heard this affects CF much more than single card)?
When overclocking, will I see the cards separately in the drop-down list at the top of TriXX, so I can clock them individually?
Are there any CCC settings besides "enable dual graphics" that I need to be aware of that could affect things?
In games (not sure which ones are affected yet) that see a performance hit with crossfire enabled, is there a way to set profiles to run a single card on demand, or will I always be better off manually turning it off and on?
Does changing to dual graphics require any restarting or is it "on the fly"?
And lastly, is it safe to say, that I may as well go up to 15.4 (currently using 14.12 omega) if I am planning on playing GTA V?

Thanks guys


----------



## mfknjadagr8

Quote:


> Originally Posted by *Durtmagurt*
> 
> Did you have to replace the vrm thermal pads as well? Also would you recommend a different GPU block over the komodo?


The Aqua Computer kryographics block performs a little better (a few degrees), but costs as much without a backplate as the Komodo does with one. The Komodo block comes with thermal pads already in place and is machined to tight tolerances, so it doesn't use a thermal pad for the backside of the VRMs; you use thermal paste there instead. It was very easy to install. I haven't used a lot of 290 blocks, but I can attest to the quality of the Swiftech block. I wouldn't say it's the best, because I haven't used the others, but I can say it looks good and performs on par with the other top-tier blocks. The LED is a nice touch as well.


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> I have no issues with both my GPU and CPU on static Performance mode.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Especially if


Oh my.... that GPU OC!! NICE


----------



## mfknjadagr8

Quote:


> Originally Posted by *Agent Smith1984*
> 
> UPS is delivering my second 290 Tri-X today....
> 
> Got it for $230 in the very last batch of refurbs Newegg offered. They no longer carry them, and they sold out within a few hours when they hit last Friday.
> 
> Are there any pro tips on here for the initial things to look out for when setting up and overclocking two of these in CrossFire?
> 
> I have some questions, if anyone can chime in:
> 
> Will these likely default to 8x/8x?
> Do I need to disable ULPS in TriXX for overclocking (I heard this affects CF much more than a single card)?
> When overclocking, will I see the cards separately in the drop-down list at the top of TriXX, so I can clock them individually?
> Are there any CCC settings besides "enable dual graphics" that I need to be aware of that could affect things?
> In games (not sure which ones are affected yet) that see a performance hit with CrossFire enabled, is there a way to set profiles to run a single card on demand, or will I always be better off turning it off and on manually?
> Does changing to dual graphics require any restarting, or is it "on the fly"?
> And lastly, is it safe to say that I may as well go up to 15.4 (currently using 14.12 Omega) if I am planning on playing GTA V?
> 
> Thanks guys


15.4 had an improved CrossFire profile for GTA V.


----------



## kizwan

Quote:


> Originally Posted by *Agent Smith1984*
> 
> UPS is delivering my second 290 Tri-X today....
> 
> Got it for $230 in the very last batch of refurbs Newegg offered. They no longer carry them, and they sold out within a few hours when they hit last Friday.
> 
> Are there any pro tips on here for the initial things to look out for when setting up and overclocking two of these in CrossFire?
> 
> I have some questions, if anyone can chime in:
> 
> Will these likely default to 8x/8x?
> Do I need to disable ULPS in TriXX for overclocking (I heard this affects CF much more than a single card)?
> When overclocking, will I see the cards separately in the drop-down list at the top of TriXX, so I can clock them individually?
> Are there any CCC settings besides "enable dual graphics" that I need to be aware of that could affect things?
> In games (not sure which ones are affected yet) that see a performance hit with CrossFire enabled, is there a way to set profiles to run a single card on demand, or will I always be better off turning it off and on manually?
> Does changing to dual graphics require any restarting, or is it "on the fly"?
> And lastly, is it safe to say that I may as well go up to 15.4 (currently using 14.12 Omega) if I am planning on playing GTA V?
> 
> Thanks guys



Depends on your CPU & motherboard. Mine can do x16/x16.
With CF it's recommended to disable ULPS. I like to disable ULPS using MSI AB. Left enabled, ULPS can prevent your cards from overvolting properly.
Yes, you can select each card individually. You should be able to overclock the cards individually.
You just need to enable CrossFire in CCC. It should be enabled automatically, though. Other settings are more for fine-tuning, on a game-to-game basis.
I don't know. Never tried it before.
Enabling/disabling CrossFire on the fly: yes, you can.
I can't say.
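For reference, disabling ULPS by hand comes down to flipping a registry DWORD, which is what tools like TriXX and Afterburner do under the hood. A rough sketch, not taken from this thread: the keys live under the display-adapter device class GUID, the two-card count and the write itself are assumptions, and the write only runs on Windows with admin rights.

```python
# Hedged sketch: ULPS parks the secondary card in CrossFire, and overclocking
# tools disable it by writing EnableUlps = 0 under each display adapter's
# class key. This builds the key paths; the write runs on Windows only.
import sys

# Display-adapter device class GUID used by Windows
ADAPTER_CLASS = (r"SYSTEM\CurrentControlSet\Control\Class"
                 r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

def ulps_key_paths(num_adapters):
    """Registry subkey path for each adapter instance (0000, 0001, ...)."""
    return [ADAPTER_CLASS + "\\" + format(i, "04d") for i in range(num_adapters)]

if sys.platform == "win32":
    import winreg
    for path in ulps_key_paths(2):  # assumed: two cards in CrossFire
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path, 0,
                            winreg.KEY_SET_VALUE) as key:
            winreg.SetValueEx(key, "EnableUlps", 0, winreg.REG_DWORD, 0)
```

A driver reinstall recreates the value, so most people just let their overclocking tool reapply it.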


----------



## Agent Smith1984

Quote:


> Originally Posted by *kizwan*
> 
> 
> depends on your cpu & motherboard. Mine can do x16-x16.
> with cf it's recommended to disable ulps. I like to disable ULPS using msi ab. It can prevent your cards from overvolting properly.
> yes, you can select each cards individually. You should be able to overclocks the cards individually.
> just need to enable crossfire in ccc. it should be automatically enabled though. Other settings are more for fine tuning, game to game basis.
> I don't know. Never try it before.
> enabling/disabling crossfire on the fly yes you can
> I can't say.


Awesome, thanks for the help.

And on that last one... DUDE! You and everybody else!! I am just going to order the disc.... I only have DSL with a max DL of around 270k, and we are "unplugged" from cable/satellite at my house and only use the internet for television. My kids and wife watch way too much Hulu and Netflix for me to download a 60GB game. Just DLing BF4 was a 4-day venture....
I can get it in the mail quicker than I can online.... I can't wait to play though.
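As a back-of-envelope check on those numbers (a sketch, assuming "270k" means roughly 270 kilobytes per second sustained):

```python
# Rough download-time math for a ~60 GB game over a ~270 kB/s DSL link.
def download_days(size_gb, rate_kb_per_s):
    seconds = size_gb * 1e9 / (rate_kb_per_s * 1e3)
    return seconds / 86400  # seconds per day

print(round(download_days(60, 270), 1))  # about 2.6 days at full, uninterrupted speed
```

Real-world stalls and shared usage easily stretch that past the 4 days BF4 took, so ordering the disc is a reasonable call.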


----------



## LandonAaron

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Awesome, thanks for the help.
> 
> And on that last one... DUDE! You and everybody else!! I am just going to order the disc.... I only have DSL with a max DL of around 270k, and we are "unplugged" from cable/satellite at my house and only use the internet for television. My kids and wife watch way too much Hulu and Netflix for me to download a 60GB game. Just DLing BF4 was a 4-day venture....
> I can get it in the mail quicker than I can online.... I can't wait to play though.


If you get the disc you will still need a 5 GB day 1 patch to get started.


----------



## Cyber Locc

Quote:


> Originally Posted by *nX3NTY*
> 
> Yes, I'm sure I run the PT1 BIOS because it runs at 3D speeds. And yes, I managed to downclock it; it runs at lower temperature, speeds and voltage per GPU-Z. I no longer use the PT1 because I wanted to save energy.


Well, can you link me the one you used? If I try to lower the clocks to even 999/1250, my PC crashes. And again, everything I have read says that's normal and PT1 cannot go lower than 1000 on the core. So you must have a modded PT1 that allows 2D clocks. I want that BIOS.









----------



## gatygun

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Awesome, thanks for the help.
> 
> And on that last one... DUDE! You and everybody else!! I am just going to order the disc.... I only have DSL with a max DL of around 270k, and we are "unplugged" from cable/satellite at my house and only use the internet for television. My kids and wife watch way too much Hulu and Netflix for me to download a 60GB game. Just DLing BF4 was a 4-day venture....
> I can get it in the mail quicker than I can online.... I can't wait to play though.


Can you not just file a complaint with your carrier that you can't download games anymore because their infrastructure is outdated?


----------



## Cyber Locc

Quote:


> Originally Posted by *Durtmagurt*
> 
> Did you have to replace the VRM thermal pads as well? Also, would you recommend a different GPU block over the Komodo?


The Komodo is very good. Is there something better? Well, the answer to that depends on numerous factors. There was extensive testing done on all the 290X blocks, which I can't seem to find now, but I found a summary article of the testing.

It shows pretty much the same, just quicker and with a little less info: http://www.xtremerigs.net/2014/08/11/r9-290x-gpu-waterblock-roundup/

That shows the Komodo has the best core temp of every block available. On the flip side, it has the worst VRM temp of every block available, so which is most important to you? Is having a good core temp and a bad VRM temp ideal? Well, no. Your VRMs getting hot may cause your card to throttle, and 70c for those is pretty darn hot at stock; if you plan to feed in more voltage, you may very well hit throttling speeds due to the VRM temp.

The AC block does very well on the core and also on the VRMs, but it has other faults.
The AC block is also very restrictive (everything they make is restrictive lol). That said, if you plan on running 2 or more plus a restrictive CPU block, then a single D5 isn't going to cut the mustard; you will need at least an MCP35X at full speed (which is loud), and 2 would be better.

Both of those are good blocks, and so is the EK block (which is pretty much a jack of all trades: low restriction, good core temps, and good VRM temps; however, it is a master of none). So you see, asking which block is best simply doesn't work, as they all have pros and cons, different needs, and different prices. You need to decide what you're looking for in a block and what needs you have to figure out which is best for you. However, if I had to pick a best block it would be the EK, simply because it does well in all areas, so it's pretty much the most even contender.

And to whoever said they have a Komodo and 60c on the core: how many rads have you got? 60c on the core is high; my cards never pass 45c under full load with a huge overclock, and I have two in series.

Edit: I looked







That 220X doesn't have enough pump to push through all that stuff; that's why your temps are high (plus maybe a lack of raddage, but running your fans fast will make up for that). The most I'd stick on a 220X is a single low-heat GPU; once you need to add rads, it's time to get a real loop.









----------



## mfknjadagr8

Quote:


> Originally Posted by *cyberlocc*
> 
> The Komodo is very good. Is there something better? Well, the answer to that depends on numerous factors. There was extensive testing done on all the 290X blocks, which I can't seem to find now, but I found a summary article of the testing.
> 
> It shows pretty much the same, just quicker and with a little less info: http://www.xtremerigs.net/2014/08/11/r9-290x-gpu-waterblock-roundup/
> 
> That shows the Komodo has the best core temp of every block available. On the flip side, it has the worst VRM temp of every block available, so which is most important to you? Is having a good core temp and a bad VRM temp ideal? Well, no. Your VRMs getting hot may cause your card to throttle, and 70c for those is pretty darn hot at stock; if you plan to feed in more voltage, you may very well hit throttling speeds due to the VRM temp.
> 
> The AC block does very well on the core and also on the VRMs, but it has other faults.
> The AC block is also very restrictive (everything they make is restrictive lol). That said, if you plan on running 2 or more plus a restrictive CPU block, then a single D5 isn't going to cut the mustard; you will need at least an MCP35X at full speed (which is loud), and 2 would be better.
> 
> Both of those are good blocks, and so is the EK block (which is pretty much a jack of all trades: low restriction, good core temps, and good VRM temps; however, it is a master of none). So you see, asking which block is best simply doesn't work, as they all have pros and cons, different needs, and different prices. You need to decide what you're looking for in a block and what needs you have to figure out which is best for you. However, if I had to pick a best block it would be the EK, simply because it does well in all areas, so it's pretty much the most even contender.
> 
> And to whoever said they have a Komodo and 60c on the core: how many rads have you got? 60c on the core is high; my cards never pass 45c under full load with a huge overclock, and I have two in series.
> 
> Edit: I looked
> 
> 
> 
> 
> 
> 
> 
> That 220X doesn't have enough pump to push through all that stuff; that's why your temps are high (plus maybe a lack of raddage, but running your fans fast will make up for that). The most I'd stick on a 220X is a single low-heat GPU; once you need to add rads, it's time to get a real loop.
> 
> 
> 
> 
> 
> 
> 


Yes, it did have enough power to push it... I'm pushing both 290s, an AMD CPU at 4.8 with 1.512 volts, 720mm of rad space and a res... granted, it's running full tilt, and I wouldn't recommend that much on one, but mine is doing well... also, on the Komodo block, I spoke with Bryan about it and he told me they couldn't reproduce the results the reviewer had... and assured me temps would be lower... in my experience the VRMs have never broken 65c... 1 or 2... I found a few reviews done by forum members that supported Bryan's statement, so I took the plunge, and lo and behold: 70 over delta would've been 100+ in my case, which isn't the case... also, my H220X is a real loop; I haven't gotten around to installing my second pump (MCP50X), but I have two 220s and a 280 rad... I know temps will improve with the second pump, and I'm aware it's definitely too much for the pump, but given all that it's running well, all things considered







I honestly expected adding the 290s to overheat the CPU, but surprisingly it trooped on... I added the third rad with the two GPU blocks, and even being under-pumped it still dropped GPU temps: 25c on one and 30c on the other... and around 5c on average for the CPU


----------



## MerkageTurk

Wow, just got my 290X Vapor-X, recommended by rickcooperjr, and it's amazing: no more issues or flickers.


----------



## Cyber Locc

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Yes, it did have enough power to push it... I'm pushing both 290s, an AMD CPU at 4.8 with 1.512 volts, 720mm of rad space and a res... granted, it's running full tilt, and I wouldn't recommend that much on one, but mine is doing well... also, on the Komodo block, I spoke with Bryan about it and he told me they couldn't reproduce the results the reviewer had... and assured me temps would be lower... in my experience the VRMs have never broken 65c... 1 or 2... I found a few reviews done by forum members that supported Bryan's statement, so I took the plunge, and lo and behold: 70 over delta would've been 100+ in my case, which isn't the case... also, my H220X is a real loop; I haven't gotten around to installing my second pump (MCP50X), but I have two 220s and a 280 rad... I know temps will improve with the second pump, and I'm aware it's definitely too much for the pump, but given all that it's running well, all things considered
> 
> 
> 
> 
> 
> 
> 
> I honestly expected adding the 290s to overheat the CPU, but surprisingly it trooped on... I added the third rad with the two GPU blocks, and even being under-pumped it still dropped GPU temps: 25c on one and 30c on the other... and around 5c on average for the CPU


I didn't mean it won't work; I meant you aren't getting the suggested 1 GPM flow rate. And no, a 220X is an AIO, not a real loop. It is a very, very good and expandable AIO, but at the end of the day it is an AIO.

65c and 70c are not far off from one another; that's only a 5-degree difference, and that difference can easily be explained by the fact that you have 290s, not the 290Xs used in the test. I trust Stren's review, and it has been linked here many times, including by Martin and others who are gurus. Stren didn't say it will constantly sit at 70c; that's most likely the top temp it hit, and that is at stock. When you start getting into high overclocks, that temp goes way up.

At any rate, even saying 65c max for the VRMs still puts it as one of the worst-performing blocks as far as VRMs go.

The EK result can also be improved (maybe the Swiftech can too?) by putting thermal tape on the backplate where the rear VRMs are. Like I said, that doesn't make it a bad block; it's just that if you plan to run the PT3 BIOS and overclock like crazy, the Swiftech block may throttle.

But at the end of the day, truthfully, they're all pretty much in line. It really comes down to what you want/like; then you have to gauge whether it really matters if one performs 1c better than another, especially considering the loop. Do you really need the best-performing block on the market, or the one you want? At the end of the day, get the one you want, as the differences are slim, and if you don't have a really, really good loop it won't even matter.

Personally, I can't stand the looks of the Komodo; the backplate (well, the gaping hole in it) and the lack of a terminal adapter are the deal killers for me.

While on the subject of backplates, that's another con for the Swiftech. I have had backplates save my cards from leaks a few times; the Swiftech is worthless in that regard, as the gaping hole happens to be right where the water will most likely spill.


----------



## mfknjadagr8

Quote:


> Originally Posted by *cyberlocc*
> 
> I didn't mean it won't work; I meant you aren't getting the suggested 1 GPM flow rate. And no, a 220X is an AIO, not a real loop. It is a very, very good and expandable AIO, but at the end of the day it is an AIO.
> 
> 65c and 70c are not far off from one another; that's only a 5-degree difference, and that difference can easily be explained by the fact that you have 290s, not the 290Xs used in the test. I trust Stren's review, and it has been linked here many times, including by Martin and others who are gurus. Stren didn't say it will constantly sit at 70c; that's most likely the top temp it hit, and that is at stock. When you start getting into high overclocks, that temp goes way up.
> 
> At any rate, even saying 65c max for the VRMs still puts it as one of the worst-performing blocks as far as VRMs go.
> 
> The EK result can also be improved (maybe the Swiftech can too?) by putting thermal tape on the backplate where the rear VRMs are. Like I said, that doesn't make it a bad block; it's just that if you plan to run the PT3 BIOS and overclock like crazy, the Swiftech block will throttle.


I have an MCP50X pump waiting to be installed... the review I read stated 72c over delta, not 72c absolute... yeah, if I was planning that, I would probably have gone AC with the backplate... but yeah, as I stated, the Swiftech block is machined to very tight tolerances, so it uses TIM, not a pad, to cool the VRMs on the back... I think a pad, no matter how thin, would probably be worse...

If the parts were separate, would a flagship block and a 220mm rad with a decent (not amazing, but capable) pump and a reservoir not be a custom loop? Albeit an entry-level custom loop, but one nonetheless...
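A note on the "over delta" figures being argued about here: review temps quoted as delta are relative to coolant (or ambient), so the absolute temperature depends on the loop. A trivial sketch of the conversion; the 30 C coolant temperature is an assumed example, not a figure from the thread:

```python
# Delta-over-coolant temps convert to absolute by adding the coolant temp.
def absolute_temp_c(coolant_c, delta_c):
    return coolant_c + delta_c

# The review's 72 C VRM delta on an assumed 30 C loop:
print(absolute_temp_c(30, 72))  # 102
```

This is why the same block can read "72" in one report and "65" in another: a cooler loop shifts every absolute number down, while the delta stays a property of the block.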


----------



## tsm106

That waterblock review is full of horse poop.

Btw, the EK backplate already comes with pads and is in fact designed to sandwich the VRMs and GPU die.

Look at the numbers: only the AC block looks good (fishy?). Then you look at the figures, 72c over delta and other gibberish lol. When I ran triple EK-blocked reference 290Xs, they were the fastest on the HoF for Firestrike for like 2 months from release. On stock pads, my VRMs never went past 65c, and that was at +300mv at 1300/1600MHz. Then I look at this review, facepalm.


----------



## FastEddieNYC

I'm currently using the Aquacomputer kryographics block and backplate on my 290X. Load temps running 1175/6000 +150mv are 43c core, 35c VRM1. The memory uses TIM instead of tape, and the thermal tape that is used is the thinnest I've seen. I was planning on using the Komodo block, but the VRM cooling is awful. Why use a full cover and not cool the VRMs?
My 220X pump easily supplies plenty of flow for my loop, which includes a Coolgate 360 rad. I've run the pump at 2400rpm with no increase in temps. I have an MCP50X pump but will wait until I add a second 290X to install it.


----------



## Ramzinho

Quote:


> Originally Posted by *Sempre*
> 
> Sadly, I got a stripped screw while trying to remove the stock cooler. I tried the rubber band method and super glue, but it still won't budge. Next I'll try to drill into and through the screw. Does anyone think it's a good/bad idea?
> 
> 
> 
> This is the bit I'll be using:


Try grabbing the screw head with pliers.


----------



## tsm106

Quote:


> Originally Posted by *Ramzinho*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sempre*
> 
> Sadly, I got a stripped screw while trying to remove the stock cooler. I tried the rubber band method and super glue, but it still won't budge. Next I'll try to drill into and through the screw. Does anyone think it's a good/bad idea?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> This is the bit I'll be using:
> 
> 
> 
> 
> 
> 
> 
> Try grabbing the screw head with pliers.

I wanted to avoid enlarging the existing threads, which is what you will want to do with a stock cooler, I would think. Drilling the screw out was out of the question for me, so I used a reversing bit. You can get extraction kits cheap; they come with a bunch of different bits.


----------



## Cyber Locc

Quote:


> Originally Posted by *tsm106*
> 
> That waterblock review is full of horse poop.
> 
> Btw, the EK backplate already comes with pads and is in fact designed to sandwich the VRMs and GPU die.
> 
> Look at the numbers: only the AC block looks good (fishy?). Then you look at the figures, 72c over delta and other gibberish lol. When I ran triple EK-blocked reference 290Xs, they were the fastest on the HoF for Firestrike for like 2 months from release. On stock pads, my VRMs never went past 65c, and that was at +300mv at 1300/1600MHz. Then I look at this review, facepalm.


I guess it did come with pads.







I could have sworn it didn't, but I looked at the instructions and they said it did. At any rate, I just swapped the default block pads for VRM 1 and 2, used Fujipoly Extremes, then used pads under the backplate. With +300 my VRMs stay around 60c.

You're right, the temps are given as delta, on an overclocked card, in Furmark. Every big-name watercooling tester is in that thread; Skinnee and Martin both endorse it, which they don't do lightly, so I'm going to say that review is pretty spot on. Consider what you said, TSM: 65c max in Firestrike with +300mv; he is showing about 65c on the main VRM in Furmark with an overclock of 1150 on the core at +100. How are you measuring the VRM temps? There are multiple VRMs on the card with variances in heat; is your graph showing the temp of only the main VRM, or of both? Another factor in VRM temps would be the case used, or whether it's open air. The reason VRM cooling improves with a backplate done properly is that it turns into one big flat heatsink, and different airflow over it will yield different results.

Also, your statement assumes that he has a "good loop". By all rights MFK's block should do better on the core than my EKs, yet my cards don't go over 40c at stock where his are over 60c; that's a 20c difference. Now, by your logic I should say "see, EK blocks are the best, I get 20c less than MFK", but that is horse poop, because not all else is equal. You're looking at it too exactly; you can't say "he says I should get X temp and I don't, so he is wrong".
Most of the tests, just like my and MFK's examples, depend on the loop. Now, Stren does have an EX560 rad, a big nice rad, so IDK. However, I do know that you can't look at it the way you are, saying that because your temps were X, his temps of Y are wrong.

Also, the AC VRM temps are good because he uses the active backplate, which is essentially watercooled. And if you read the thread, his suggestion of which block to get is actually the EK or the AC.

Also, are you hinting at him being biased? Stren always does waterblock reviews that are backed by major players in watercooling; he's pretty highly regarded as being accurate in this field, so what horse poop do you see? He didn't just wake up and say "today I'm going to review waterblocks"; he has been doing this for a while and is pretty credible. However, like MFK said, Bryan disputed this and said he can't recreate it, so what's the option: take Swiftech's word over his? They have skin in the game; he doesn't. Also, browse the thread: there's a whole bunch of people doubting his results, as you are now, and he retests the cards for them, sometimes more than once. The facts are the facts: the Swiftech, while good on core temps, is weak on VRMs, and the AC block is the king with the EK as the runner-up.

BTW found the original thread - http://www.xtremesystems.org/forums/showthread.php?288109-Stren-s-R9-290-290x-Water-Block-Testing&p=5236507&viewfull=1#post5236507

However, MFK said it is machined to sit on the VRMs, so now the question comes to mind: could his backplate have issues, and did he use thermal paste? Also, in response to MFK's question about metal contact vs a thermal pad: it should be better if you used thermal paste; if you didn't, well, put your heatsink on your CPU with no paste and watch it cook. And I know Swiftech says tight tolerances, yada yada, but no matter how tight they are, they don't stop metal from having pits, which is the reason for thermal paste. If they can make a tight-tolerance metal heatsink, I can't wait to see their TIM-less CPU heatsink. That said, it kind of gives the benefit to Swiftech's block. He doesn't mention using TIM on the backplate; do the instructions say to do that? They should.

At any rate, MFK said his VRMs hit 65c stock, which isn't good for a waterblock, as you yourself said 65c with +300mv.

However, once again we find ourselves at the same conclusion: all the blocks are fairly consistent, so just pick the one you like and be done with it.









----------



## Gobigorgohome

It seems like I have a weird problem with the new card that I bought. I put the card into the third slot on my Asus RIVBE (the one for doing x16/x16); for the first 5 boots I got no picture at all over DisplayPort, then it showed "start windows normally" or "repair windows". I tried to start it normally first (twice) and it did not work; I tried to repair, and the screen went black the minute the notification about "installing new device" showed up. I guess this is the well-known "black-screen" issue; I have never seen it before.

I really do not want to take out my Lightning and test the card in the first slot, even though that might be the only way. Is there a "simpler" way to get around this problem?

My rig and the Lightning is solid, I have never had a problem with the system or this card.

I use the 14.12 driver; I used the same driver with quadfire R9 290X and never had a problem. I have also used it with trifire and crossfire before (with different cards).


----------



## Dooderek

Crossfire'd 290x Visiontek waterblock (EK)

Originally my H100i AIO + HG A10 bracket worked well to keep GPU temps down during overclocks, but I noticed the VRM temps were getting silly high (90 deg C). I realized this was happening because the fan profile was not matched to the lower temps from the GPU being watercooled: essentially, the fan sat at 20% duty cycle thinking everything was OK. I then scaled my fan profile to the new load temps and the VRMs were being cooled again (to an air-cooling extent, ~80 deg C). Unfortunately, to get the VRM temps to a comfortable level I had to crank the fan duty cycle quite a lot, which was just too noisy for me (and that says a lot, considering I have my case/rad fans cranked at max).

More than happy with my EK blocks; now @1100MHz I see peak VRM temps of 60 deg C.


----------



## mfknjadagr8

Quote:


> Originally Posted by *cyberlocc*
> 
> I guess it did come with pads.
> 
> 
> 
> 
> 
> 
> 
> I could have sworn it didn't, but I looked at the instructions and they said it did. At any rate, I just swapped the default block pads for VRM 1 and 2, used Fujipoly Extremes, then used pads under the backplate. With +300 my VRMs stay around 60c.
> 
> You're right, the temps are given as delta, on an overclocked card, in Furmark. Every big-name watercooling tester is in that thread; Skinnee and Martin both endorse it, which they don't do lightly, so I'm going to say that review is pretty spot on. Consider what you said, TSM: 65c max in Firestrike with +300mv; he is showing about 65c on the main VRM in Furmark with an overclock of 1150 on the core at +100. How are you measuring the VRM temps? There are multiple VRMs on the card with variances in heat; is your graph showing the temp of only the main VRM, or of both? Another factor in VRM temps would be the case used, or whether it's open air. The reason VRM cooling improves with a backplate done properly is that it turns into one big flat heatsink, and different airflow over it will yield different results.
> 
> Also, your statement assumes that he has a "good loop". By all rights MFK's block should do better on the core than my EKs, yet my cards don't go over 40c at stock where his are over 60c; that's a 20c difference. Now, by your logic I should say "see, EK blocks are the best, I get 20c less than MFK", but that is horse poop, because not all else is equal. You're looking at it too exactly; you can't say "he says I should get X temp and I don't, so he is wrong".
> Most of the tests, just like my and MFK's examples, depend on the loop. Now, Stren does have an EX560 rad, a big nice rad, so IDK. However, I do know that you can't look at it the way you are, saying that because your temps were X, his temps of Y are wrong.
> 
> Also, the AC VRM temps are good because he uses the active backplate, which is essentially watercooled. And if you read the thread, his suggestion of which block to get is actually the EK or the AC.
> 
> Also, are you hinting at him being biased? Stren always does waterblock reviews that are backed by major players in watercooling; he's pretty highly regarded as being accurate in this field, so what horse poop do you see? He didn't just wake up and say "today I'm going to review waterblocks"; he has been doing this for a while and is pretty credible. However, like MFK said, Bryan disputed this and said he can't recreate it, so what's the option: take Swiftech's word over his? They have skin in the game; he doesn't. Also, browse the thread: there's a whole bunch of people doubting his results, as you are now, and he retests the cards for them, sometimes more than once. The facts are the facts: the Swiftech, while good on core temps, is weak on VRMs, and the AC block is the king with the EK as the runner-up.
> 
> BTW found the original thread - http://www.xtremesystems.org/forums/showthread.php?288109-Stren-s-R9-290-290x-Water-Block-Testing&p=5236507&viewfull=1#post5236507
> 
> However, MFK said it is machined to sit on the VRMs, so now the question comes to mind: could his backplate have issues, and did he use thermal paste? Also, in response to MFK's question about metal contact vs a thermal pad: it should be better if you used thermal paste; if you didn't, well, put your heatsink on your CPU with no paste and watch it cook. And I know Swiftech says tight tolerances, yada yada, but no matter how tight they are, they don't stop metal from having pits, which is the reason for thermal paste. If they can make a tight-tolerance metal heatsink, I can't wait to see their TIM-less CPU heatsink. That said, it kind of gives the benefit to Swiftech's block. He doesn't mention using TIM on the backplate; do the instructions say to do that? They should.
> 
> At any rate, MFK said his VRMs hit 65c stock, which isn't good for a waterblock, as you yourself said 65c with +300mv.
> 
> However, once again we find ourselves at the same conclusion: all the blocks are fairly consistent, so just pick the one you like and be done with it.
> 
> 
> 
> 
> 
> 
> 


Yeah, the instructions say to paste the VRMs on the backside... I didn't just take Bryan's word for it; I found another review on another forum stating the VRMs were much better than 72 over delta... since my cards were hitting the 90s with the reference blower, 65 is a big improvement... but I'm not done, so things might change with better flow and adding a couple more fans... interestingly enough, the core on the top card is higher by 5c... might be my mount, or the result of the backplate doing its job but a lack of airflow between them, but I will sort out the issue and remedy it to the best of my ability in my never-ending search for moar powah... once I've done all I can to improve temps with the current setup I will start overclocking them and see what happens... if in fact the Komodo block could get the lower VRM temps, in my opinion it would win in spades... it has the looks and the core performance; no one disputes that. Just need to get VRM temps lower... but then again, you never know; I might overclock the cards and the VRM temps not rise a huge amount... we will see... until then I still proclaim the AC block the winner... I might have gone with it, but the palm tree is the most hideous thing (joking here, but it is bad)


----------



## Batpimp

Well, finally got it working how I need.

I upped my default RAM speed to 2400MHz and BF4 is finally seeing the gains I needed.

I had to do EVERY little piece of advice though until just now i did that and turned on frame pacing in catalyst.

thanks everyone whom helped me.


----------



## Cyber Locc

Quote:


> Originally Posted by *mfknjadagr8*
> 
> yeah, the instructions tell you to paste the VRMs on the back side. I didn't just take Bryan's word for it either; I found another review on a different forum that put the VRMs much better than 72 over delta. Since my cards were hitting the 90s with the reference blower, 65 is a big improvement, but I'm not done, so things might change with better flow and a couple more fans added. Interestingly enough, the core on the top card is higher by 5c. It might be my mount, or the backplate doing its job but with a lack of airflow between the cards; either way I will sort out the issue and remedy it to the best of my ability in my never ending search for moar powah. Once I've done all I can to improve temps with the current setup, I will start overclocking them and see what happens. If the Komodo block could in fact get the lower VRM temps, in my opinion it would win in spades: it has the looks and the core performance, no one disputes that, it just needs lower VRM temps. But then again, you never know; I might overclock the cards and the VRM temps might not rise a huge amount. We will see. Until then I still proclaim the AC block the winner. I might have gone with it, but the palm tree is the most hideous thing (joking here, but it is bad).


Well, see, that's the thing: they don't know his loop. What they get over delta and what he gets over delta are completely different. However, yes, in this particular case I'm leaning towards him not having put paste on the backplate, because he does mention putting pads on the other backplates but says nothing about paste on the Swiftech.

That could very well make a very large difference, and could be why his test is showing 72 and users aren't able to recreate it.

And yes, I agree it's a very good block, and if it fits your style, go for it. I love the way it looks, and if it didn't have a big cutout in the backplate I may have gone with it myself.
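To put some numbers on the delta point (all the temperatures below are hypothetical, just to show why comparing absolute VRM temps across different loops is meaningless):

```python
def delta_over_coolant(component_c, coolant_c):
    """Temperature rise of a component above the loop's coolant temperature."""
    return component_c - coolant_c

# Hypothetical readings: a reviewer's warm loop vs a user's cooler one.
reviewer_delta = delta_over_coolant(72, 38)  # reviewer: 72c VRM, 38c coolant
user_delta = delta_over_coolant(65, 31)      # user: 65c VRM, 31c coolant

# Same 34c rise over coolant: the block performs identically,
# even though the absolute temps differ by 7c.
assert reviewer_delta == user_delta == 34
```

Same block, different loops, different absolute temps, which is why a review's 72 and a user's 65 can both be "right".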


----------



## Cyber Locc

Okay guys, since PT1 isn't going to work for me, I need to figure out how to fix this downclocking. When I run Heaven or Valley or anything, my card downclocks every few minutes; it will drop anywhere from 1MHz all the way down to 500MHz, and it does that both at stock and overclocked. It did it with the original BIOS and it does it with the ASUS BIOS. It does it with both my cards, and I'm on a fresh Windows 8.1 install now. I have tried different overclocking software; I have tried every single thing I can think of. It's not my temps, because my core has never exceeded 45 (and only gets that high at 1.4v), and my VRMs never exceeded 48c either (again, only at 1.4v).

Is it normal for this thing to throttle like this? It even does it when GPU usage is 100%. Between this and my sound issues I'm really getting sick of AMD...
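If it helps anyone eyeballing their logs, here's a quick sketch for picking distinct throttle events out of a list of logged core clocks (the sample log is made up to resemble what I'm describing):

```python
def count_throttle_events(clock_samples_mhz, target_mhz, tolerance_mhz=2):
    """Return the clock value at the start of each distinct dip below
    (target - tolerance) in a list of logged core clocks."""
    threshold = target_mhz - tolerance_mhz
    events = []
    in_dip = False
    for sample in clock_samples_mhz:
        if sample < threshold and not in_dip:
            # New dip started: record how far it fell.
            events.append(sample)
            in_dip = True
        elif sample >= threshold:
            # Back to target: ready to detect the next dip.
            in_dip = False
    return events

# Made-up log resembling the behaviour above: tiny dips, a mid one, a big one.
log = [1000, 999, 1000, 997, 1000, 800, 1000, 1000, 600, 1000]
print(count_throttle_events(log, target_mhz=1000))  # [997, 800, 600]
```

A dip of a couple of MHz is normal clock-management jitter; it's the repeated drops to 500-800MHz at 100% usage that are worth counting.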


----------



## Arizonian

Quote:


> Originally Posted by *Dooderek*
> 
> Crossfire'd 290x Visiontek waterblock (EK)
> 
> Originally my H100i AIO + HG A10 Bracket worked well to keep gpu temps down by during overclocks, but i've noticed the VRM temps were just getting silly high (90 deg C). I realized that the reason this was happening was because the fan profile was not matched to the lower temps from the GPU being watercool'd. So essentially the fan was at 20% DC thinking everything was ok. I then scaled my fan profile with the new load temp and the VRM's were now being cooled (to an air cooling extent/ ~ 80 deg C). Unfortunately to get the VRM temps to a comfortable level I had to crank the fan duty cycle quite alot which was just too noisy for me (says alot considering I have my case/rad fans cranked at max).
> 
> More than happy with my EK blocks, now @1100MHz I see peak VRM temps of 60 deg C.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added to roster


----------



## Mega Man

Quote:


> Originally Posted by *Batpimp*
> 
> well, finally got it working how I need.
> 
> I upped my default RAM speed to 2400MHz and BF4 is finally seeing the gains I needed.
> 
> I had to follow EVERY little piece of advice though; it wasn't until just now, when I did all that and turned on frame pacing in Catalyst, that it worked.
> 
> thanks to everyone who helped me.


no one believes me !~

really, 1333/1600 is too slow nowadays; 2400+ is worth it even for gaming


----------



## th3illusiveman

Why can't my 290X go over 1100MHz without artifacting? +100mv, +50 power limit, 1400MHz on the memory, and I keep it under 75c core temp, yet it refuses. I thought Tri-X cards came with the reference PCB (which are usually the best). GTA5 demands more power for very high shadows.


----------



## mfknjadagr8

Quote:


> Originally Posted by *cyberlocc*
> 
> Okay guys, since PT1 isn't going to work for me, I need to figure out how to fix this downclocking. When I run Heaven or Valley or anything, my card downclocks every few minutes; it will drop anywhere from 1MHz all the way down to 500MHz, and it does that both at stock and overclocked. It did it with the original BIOS and it does it with the ASUS BIOS. It does it with both my cards, and I'm on a fresh Windows 8.1 install now. I have tried different overclocking software; I have tried every single thing I can think of. It's not my temps, because my core has never exceeded 45 (and only gets that high at 1.4v), and my VRMs never exceeded 48c either (again, only at 1.4v).
> 
> Is it normal for this thing to throttle like this? It even does it when GPU usage is 100%. Between this and my sound issues I'm really getting sick of AMD...


that's kind of odd; you normally get throttling from hitting power limits or heat, and I don't think you are doing either. I'm having an issue lately as well: I don't see actual throttling of core or memory, but my usage zeroes out occasionally, causing stutter. Sometimes it's one card or the other, and sometimes it is both, and it's always during graphically intensive scenes. FPS is great until it happens. Last night GTA V was running around 110fps, core temps 52 and 57 respectively, VRMs on card one 55 and 50. Cards are still stock with presumably stock BIOS...


----------



## BradleyW

Quote:


> Originally Posted by *Mega Man*
> 
> no one believes me !~
> 
> really, 1333/1600 is too slow nowadays; 2400+ is worth it even for gaming


I believe you. Just look at my sig.


----------



## Agent Smith1984

I played Crysis 3 for the first time on my newly Crossfired tri-x cards last night.....

WOW

Ran both cards stock just to make sure all was well....
GPU-Z showed full usage on both cards, and core clock stayed pegged at 1000 on each card.

My original card (has had a repaste with MX-2) ran at 72c core, 75c VRM1, and the second card hit 82c core, 89c VRM1.
Those temps are all well within safe limits, so again, you gotta hand it to Sapphire for their cooler. Especially for maintaining reasonable temps on the second card....

I didn't play with overclocking too much, but I did run unigine Heaven at 1100/1400 on each card just to see.
With heaven at max settings 1080p, I got 2600 points, which is up quite a bit from the 1450 I was getting with one card at 1100/1500.

I am reloading BF4 tonight and will see how that does next.

Really impressed with 2x290's so far....
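For anyone curious, the CrossFire scaling implied by those Heaven scores works out like this (scores taken from the post above; the memory clocks differ slightly, 1400 vs 1500, so it's only a rough comparison):

```python
single_card_score = 1450   # one 290 at 1100/1500, from the post above
crossfire_score = 2600     # two 290s at 1100/1400

scaling = crossfire_score / single_card_score
gain_percent = (scaling - 1) * 100
print(f"{scaling:.2f}x, i.e. ~{gain_percent:.0f}% gain from the second card")
```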


----------



## kizwan

Quote:


> Originally Posted by *Gobigorgohome*
> 
> It seems like I have a weird problem with the new card that I bought. I put the card into the third slot on my Asus RIVBE (the one for doing x16/x16), first 5 boots I get no picture at all with Displayport, then it shows the "start windows normally" or "repair windows". Tried to start it normally first (twice) and did not work, tried to repair and the screen was going black the minute the notification about "installing new device" showed up. I guess this is the well-known "black-screen" issue, I have never seen this before.
> 
> I really do not want to take out my Lightning and test the card in the first slot, even though that might be the only way. Is there a "simpler" way to get around this problem?
> 
> My rig and the Lightning is solid, I have never had a problem with the system or this card.
> 
> I use 14.12 driver, I used the same driver with quadfire R9 290X and never had a problem. I have also used it with trifire and crossfire before (with different cards).


I don't think that is the black screen issue everyone else is experiencing.

Anyway, is there any reason why you want to test the card in the first slot? Why not test it in the slot it currently occupies? Considering you have already run tri- and quad-fire before, the slot shouldn't be bad, if a bad slot is what you suspect.


----------



## Gobigorgohome

Quote:


> Originally Posted by *kizwan*
> 
> I don't think that is the black screen issue everyone else is experiencing.
> 
> Anyway, is there any reason why you want to test the card in the first slot? Why not test it in the slot it currently occupies? Considering you have already run tri- and quad-fire before, the slot shouldn't be bad, if a bad slot is what you suspect.


I will try uninstalling the drivers, putting in the second card and powering it on. If that does not work I will RMA it; everything else is working fine.

I suspect that the second card is bad: it has power, the lights on the mobo tell me there are cards in PCI-E slots 1 and 3 (which is right), the PSU is good, the DisplayPort cable is good, and there is no overclocking in the picture.


----------



## Dooderek

Attached below is my GPU-Z/HW monitor temps with the waterblocks.

Current ambient temp is 81 F / 27 C. Normally I have the A/C on but I decided to leave it off to see what temps I would net.

This was after benchmarking GTA V with everything on max (ignore suggested limits off). My crossfired 290Xs had no issues with this (65-70 avg FPS, V-sync on); however, when turning on the advanced graphics and maxing those out, it had a fair bit of frame issues, dipping to the ~40-50 avg FPS range with V-sync on. I then ran 15 minutes of Valley Extreme HD followed by a benchmark.

As you can see the water blocks are ultra effective with taking care of those pesky VRM temps!

Currently I have them clocked at 1100/1350 with a +50% power limit / +100 mV (.1V)


----------



## FastEddieNYC

Quote:


> Originally Posted by *th3illusiveman*
> 
> Why can't my 290X go over 1100Mhz without artifacting. +100mv, +50 power limit, 1400Mhz on the memory and and i keep it under 75c core temp yet i refuses. I thought Tri-X cards came with reference PCB (which are usually the best) GTA5 demands more power for very high shadows.


It is commonly called the silicon lottery. Some chips OC well and some don't. I had one 290X that could do 1120/1600 at stock voltage and another that required +150mv to run 1120.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Dooderek*
> 
> Attached below is my GPU-Z/HW monitor temps with the waterblocks.
> 
> Current ambient temp is 81 F / 27 C. Normally I have the A/C on but I decided to leave it off to see what temps I would net.
> 
> This was after benchmarking GTA V with everything on max (ignore suggested limits off). My crossfired 290Xs had no issues with this (65-70 avg FPS, V-sync on); however, when turning on the advanced graphics and maxing those out, it had a fair bit of frame issues, dipping to the ~40-50 avg FPS range with V-sync on. I then ran 15 minutes of Valley Extreme HD followed by a benchmark.
> 
> As you can see the water blocks are ultra effective with taking care of those pesky VRM temps!
> 
> Currently I have them clocked at 1100/1350 with a +50% power limit / +100 mV (.1V)


Looks good!
What are you using to OC the card?


----------



## Cyber Locc

Quote:


> Originally Posted by *mfknjadagr8*
> 
> that's kind of odd; you normally get throttling from hitting power limits or heat, and I don't think you are doing either. I'm having an issue lately as well: I don't see actual throttling of core or memory, but my usage zeroes out occasionally, causing stutter. Sometimes it's one card or the other, and sometimes it is both, and it's always during graphically intensive scenes. FPS is great until it happens. Last night GTA V was running around 110fps, core temps 52 and 57 respectively, VRMs on card one 55 and 50. Cards are still stock with presumably stock BIOS...


IKR, it is very weird. I tried both my cards individually (my board has switches to shut PCIe lanes off) and they both do it (they are both reference, but one is XFX and one is Sapphire). Usually it is a small drop: when testing at 1000MHz core, it will spike down to 997-999 four or five times, there will also be 2-3 larger spikes down to 800 or so, and one really big one where my usage drops to 0 for some reason and my clocks drop to 600 for a split second. That is about the gist of it for a Heaven bench. The only time it's tied to less than full usage is during the big drop; for the others, usage is 100% when it downclocks, although pretty much every time there was a usage spike just before. So usage drops to 90%, goes back to 100%, and about a second later the clocks drop and then come back up. When I overclock, the spikes increase in severity and frequency.

However, searching last night for Heaven results, I am getting the fps I should: Heaven ultra, extreme tess, 8x AA gives me 53-54fps for the final score at stock and 63fps overclocked.

Edit: When I say stock, I mean a 290 with the ASUS 290X BIOS, so 1000MHz/1250MHz.

This was happening before flashing that BIOS too, with their stock BIOS. I thought my PSU was too small (which it was) and that was causing it, but apparently not. Either way, I now have a G2 1600W, so that's definitely not the issue.

Quote:


> Originally Posted by *Mega Man*
> 
> no one believes me !~
> 
> really, 1333/1600 is too slow nowadays; 2400+ is worth it even for gaming


I believed you, and that's cool to know; glad I went with 2400MHz RAM.

However, through my own stupidity I just broke one of my RAM sticks. Should I buy faster RAM now that I need to replace all of it?


----------



## YellowBlackGod

Guys, did you see this Test?

http://www.tweaktown.com/tweakipedia/79/amd-radeon-r9-290x-8gb-cf-vs-titan-sli-gtx-980-4k/index.html

We still have the best GPU around, and with DX12 coming it will rock like a newly released GPU!









Take a look at the comments in the test results section and see which card is the winner!


----------



## Gobigorgohome

It would seem that the Asus DCUII R9 290X I got is DOA or bad in some other way.

Tried to fix it:
-Uninstalled the drivers with only the Lightning inside the machine
-Installed the Asus R9 290X into the third PCI-E slot
-Fired up the machine, installed drivers
-Started Tomb Raider --> BSOD
-Machine would not restart

Pulled the card and tried with the Lightning, and it fired up right away, no problem. The Asus R9 290X is going back for sure.


----------



## Dooderek

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Looks good!
> What are you using to OC the card?


Thanks!

Using AB currently.


----------



## Bertovzki

Excuse my ignorance, but what is the other power input next to the 6+2 pin on the R9 290X Tri-X OC? My PSU (a V1000) has a 6+2 pin cable; do I ignore the other 6-pin plug next to it, or should that be powered by another 6-pin PSU cable?


----------



## kizwan

Quote:


> Originally Posted by *Bertovzki*
> 
> Excuse my ignorance, but what is the other power input next to the 6+2 pin on the R9 290X Tri-X OC? My PSU (a V1000) has a 6+2 pin cable; do I ignore the other 6-pin plug next to it, or should that be powered by another 6-pin PSU cable?


You should connect another 6-pin PCIe power cable to that 6-pin plug.


----------



## Bertovzki

Quote:


> Originally Posted by *kizwan*
> 
> You should connect another 6-pin PCIe power cable to that 6-pin plug.


OK thanks...that is helpful ..I'm cutting all my cable holes in my acrylic mobo sheet


----------



## mfknjadagr8

Quote:


> Originally Posted by *Bertovzki*
> 
> OK thanks...that is helpful ..I'm cutting all my cable holes in my acrylic mobo sheet


without the second plug it won't get enough power


----------



## killer8297

Is anyone else having frame issues with GTA V? I have it at default settings except for 2x AA @ 1080p, but I'm getting slower than expected frames, around the 50-60s, sometimes dropping even lower than that. I've seen other people in the 80s range with this card; help?


----------



## Batpimp

make sure you are running the latest driver and read other people's reports; that game JUST came out, and there are lots of problems right now.


----------



## killer8297

I performed a clean install of the latest beta driver.
My school computer with a GTX 780 was able to outperform my R9 290X, at 1920x1200 vs 1080p no less.
I also see other R9 290X systems outperforming mine, so I'm trying to figure out why.


----------



## long99x

Quote:


> Originally Posted by *long99x*
> 
> I plan buy kraken g10 and H55 to cool my Asus r9 290 dc2, will it fit ?
> 
> thanks


anyone help pls


----------



## boot318

Quote:


> Originally Posted by *long99x*
> 
> anyone help pls


I ghetto/Red Modded my 290X DC2. It will fit. I don't know if using a Kraken with the G10 would block the last memory chip by the PCI slot though. That is only a concern if you plan on putting heatsinks on it (the others will fit with heatsinks on for sure); having the fan blowing on it will be good enough with the G10 if it does block it.

The memory chip in question looks questionable with the G10, but it may fit with a low-profile heatsink on the memory.


My "Red Mod"


----------



## DaUn3rD0g

Hi guys I have a quick question regarding vrm2 temps on my 290x.

I bought the msi r9 290x le gaming and had a few problems to start with due to random black screens and lock ups, this mostly sorted itself following an update to my motherboard bios.

However I'm still experiencing the occasional crash under load, and I'm putting it down to vrm overheating.

Gpuz reports vrm2 at around 76c while idling in Windows, I can't remember the core or vrm1 temps but they seemed reasonable.

I've yet to take a reading under load, only noticed late last night after a crash while watching a film and immediately strapped 2 120mm fans to the card to help cool it (1 blowing towards the cards fans and 1 sucking air away from the backplate), the 76c reading is with these fans in place. .

This card is at stock currently.

I have an alphacool nexxxos gpx r9 290x m02 "hybrid" block on order that should be here by next week, I realise it's not the best for vrm cooling being not a proper full cover block, but is the only one compatible with the msi gaming pcb without modification.

Am I right in thinking these temps are too high, or am I just being paranoid?


----------



## long99x

Quote:


> Originally Posted by *boot318*
> 
> I Ghetto/Red Mod my 290x DC2. It will fit. I don't know if using a Kraken w G10 would block the last memory chip by PCI slot though. This is only a concern if you plan on putting heatsinks on it (the others will fit with heatsink on for sure). Having the fan blowing on it will be good enough with the G10 if it does block it.
> 
> Memory chip in question. Looks questionable with G10 but it may fit with low profile heatsink on memory.
> 
> 
> My "Red Mod"


thank you, I will check this


----------



## rdr09

Quote:


> Originally Posted by *DaUn3rD0g*
> 
> Hi guys I have a quick question regarding vrm2 temps on my 290x.
> 
> I bought the msi r9 290x le gaming and had a few problems to start with due to random black screens and lock ups, this mostly sorted itself following an update to my motherboard bios.
> 
> However I'm still experiencing the occasional crash under load, and I'm putting it down to vrm overheating.
> 
> Gpuz reports vrm2 at around 76c while idling in Windows, I can't remember the core or vrm1 temps but they seemed reasonable.
> 
> I've yet to take a reading under load, only noticed late last night after a crash while watching a film and immediately strapped 2 120mm fans to the card to help cool it (1 blowing towards the cards fans and 1 sucking air away from the backplate), the 76c reading is with these fans in place. .
> 
> This card is at stock currently.
> 
> I have an alphacool nexxxos gpx r9 290x m02 "hybrid" block on order that should be here by next week, I realise it's not the best for vrm cooling being not a proper full cover block, but is the only one compatible with the msi gaming pcb without modification.
> 
> Am I right in thinking these temps are too high, or am I just being paranoid?


hell yah, that's high; that's a load temp. I have not seen the PCB of the Gaming, but VRM2 on a reference model is located in the highlighted area as shown . . .



Check to make sure the heatsink is touching that area if possible. I'd take it apart to see if the pads are even compressed. The Gaming is next to the DC2, imo, for worst in cooling.


----------



## DaUn3rD0g

I thought so.

Just had a play around with it, switched my top 240mm rad to exhaust (pull), and put a pull fan on my front 240mm rad (half is push/pull, other half is push only due to fittings blocking one mount) all as intake.

Increased the rpm on my rear exhaust and the front intake fan in line with the card.

VRM2 now reads 62c, but the value didn't change while running Heaven, so I don't know if it's giving a false reading.

After a benchmark run, on clicking exit, it crashed with the old "driver stopped responding but has recovered" message that ended in a locked black screen, so I just tried installing the latest beta driver to see if that makes any difference.

EDIT: now it won't even load heaven without black screening, any ideas?

I also forgot to mention I also increased the fan profile in afterburner to one much more aggressive, but obviously that hasn't helped.


----------



## rdr09

Quote:


> Originally Posted by *killer8297*
> 
> I performed a clean install of the latest beta driver.
> My school computer with a GTX 780 was able to outperform my R9 290X, at 1920x1200 vs 1080p no less.
> I also see other R9 290X systems outperforming mine, so I'm trying to figure out why.


an OC'd 780 will surely surpass a stock 290X. Run a 3DMark bench like Fire Strike and post your results here to compare.
Quote:


> Originally Posted by *DaUn3rD0g*
> 
> I thought so.
> 
> Just had a play around with it, switched my top 240mm rad to exhaust (pull), and put a pull fan on my front 240mm rad (half is push/pull, other half is push only due to fittings blocking one mount) all as intake.
> 
> Increased the rpm on my rear exhaust and the front intake fan in line with the card.
> 
> Vrm2 now reads 62c but this value didn't change running Heaven so I don't know if it's giving false reading.
> 
> After a benchmark run, on clicking exit, it crashed with the old "driver stopped responding but recovered" message that ended in a locked black screen, so just tried installing the latest beta driver to see if that makes any difference.
> 
> EDIT: now it won't even load heaven without black screening, any ideas?
> 
> I also forgot to mention I also increased the fan profile in afterburner to one much more aggressive, but obviously that hasn't helped.


watercooled? full block?


----------



## DaUn3rD0g

The card is currently air cooled by the MSI Twin Frozr IV; I have an Alphacool NexXxoS GPX on order.

My CPU is on a custom loop; I was waiting on the funds for this GPU before adding a GPU block to the loop (it wasn't worth it for my old 260X).

EDIT: upped the power limit to +50, the aux voltage by +50, and downclocked the memory to 1200MHz.

It lost 2fps in the Heaven benchmark, but at least it ran and completed; it also completed a run of the Bioshock Infinite benchmark, which it couldn't do before.

Now I have another question: is there any software to control my case/rad fans (all PWM, linked to the motherboard) based on GPU temperature?

I ask because I want my computer quiet during desktop work/films, but don't mind if it ramps up while gaming.

Unfortunately my ASRock motherboard's A-Tuning only allows me to change fan speeds automatically based on CPU temperature, which doesn't fluctuate enough during GPU intensive tasks to put more airflow through the case when the GPU needs it.


----------



## rdr09

Quote:


> Originally Posted by *DaUn3rD0g*
> 
> Card is currently air cooled by msi twinfozer iv, have an alphacool nexxxos gpx on order.
> 
> My cpu is custom loop, I was waiting on the funds for this gpu before adding in a gpu block to the loop (wasn't worth it for my old 260x)
> 
> EDIT: upped the power limit to +50, the aux voltage by +50, and downclocked the memory to 1200mhz.
> 
> It's lost 2fps in the Heaven benchmark but at least it ran and completed, also completed a run of bioshock infinitebbenchmark which it couldn't do before.
> 
> Now I have another question, is there any software to control my case / rad fans (all pwm linked to motherboard) based on gpu temperature?
> 
> I ask because I want my computer quiet during desktop work/films but don't mind if it ramps up while gaming.
> 
> Unfortunately my asrock motherboard a tuning only allows me to change fan speeds automatically based on cpu temperature, which doesn't fluctuate enough during gpu intensive tasks to get it to put more airflow through the case when the gpu needs it.


see if you can set the fans to silent mode in the motherboard's BIOS, normally found under HW Monitor. If that does not work . . . you can try the SpeedFan app, but SpeedFan can conflict with other apps like Afterburner or even CCC.
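If neither works, the logic you want is just a fan curve keyed to GPU temperature instead of CPU temperature. Here's a minimal sketch of the mapping (the breakpoints are made up; tune to taste, and you'd still need a tool like SpeedFan or a hardware controller to actually apply the duty cycle to the headers):

```python
# Piecewise-linear fan curve: (gpu_temp_c, duty_percent) breakpoints.
CURVE = [(30, 20), (50, 35), (65, 60), (80, 100)]

def duty_for_temp(temp_c, curve=CURVE):
    """Interpolate a PWM duty cycle (%) from a GPU temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the curve: minimum duty
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # above the curve: full speed
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # Linear interpolation between the two nearest breakpoints.
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(duty_for_temp(40))   # 27.5 - quiet on the desktop
print(duty_for_temp(70))   # ~73.3 - ramps up under gaming load
```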


----------



## Widde

I've just switched to a 4770K from my 3570K and my 3DMark scores feel low. Fresh OS install and freshly installed 15.4 betas, CPU at 4.0GHz, 16GB 1866 Vengeance Pro, two R9 290s at 1100/1350. Or is this in order? Either this is my 5th CPU that's a complete dud, or I'm doing something wrong.

Never OC'd on Z87 or Z97 boards; on a Sabertooth Z97 now, had Z77 before.

http://www.3dmark.com/fs/4687457


----------



## DaUn3rD0g

Was just trying SpeedFan and wow, is it complicated to figure out lol!

I may have to just stick to manually adjusting fans as needed for now. My waterblock should be here by the 5th, as watercoolinguk are in the process of processing the payment they took yesterday during "security checks"... I really wish that company would be a bit more honest and just say "your order may or may not take 3-5 days, as it depends on how many chores my mum makes me do and how long it takes me to order in your goods for resending back out to you."

To be fair it's the cheapest place in the UK to buy watercooling parts, but their website's claims of a 3-5 day delivery time, or of actually stocking anything themselves, are clearly a lie; Google Maps even shows the address to be a residential property!

But that's off topic. For now I seem to have mostly resolved the instability issues, even if it requires manual faffing every time I want to play a game.


----------



## kizwan

Quote:


> Originally Posted by *Widde*
> 
> I've just switched to a 4770k from my 3570k and my 3dmark scores feels low, Fresh os install and freshly installed 15.4 betas, Cpu at 4.0ghz, 16gb 1866 vengeance pro, 2 R9 290s at 1100/1350 or is this in order? And yes my 5th cpu that's a complete dud, or I'm doing something wrong
> 
> 
> 
> 
> 
> 
> 
> Never ocd on z87 or z97 boards, On a sabertooth Z97 now, Had Z77 before
> 
> http://www.3dmark.com/fs/4687457


Why do you think the scores are low?


----------



## Widde

Quote:


> Originally Posted by *kizwan*
> 
> Why do you think the scores are low?


It just "felt" low; I was probably expecting too much ^^


----------



## MerkageTurk

I had driver crashes ("driver stopped responding and has recovered"); it seems the XFX 280X DD Black does not have a good cooler and uses very low quality components. I returned it and got a Vapor-X 290X as recommended by rickcooper.

Now VRM1 is 26c,

VRM2 32c.


----------



## Dooderek

Quote:


> Originally Posted by *killer8297*
> 
> Is anyone else having frame issues with GTA V? I have it at default settings except for 2x AA @ 1080p, but I'm getting slower than expected frames, around the 50-60s, sometimes dropping even lower than that. I've seen other people in the 80s range with this card; help?


From personal experience, as batpimp stated, the 15.4 beta drivers from AMD help tremendously with the fps dips. Also, I've noticed that the 290X (possibly AMD cards in general) does not like any of the "advanced graphics" options being turned on. On a single 290X, MSAA x2 was the highest I could go while maintaining a nice average fps of 50-60; anything higher and it started to dip below the 40s at times.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Dooderek*
> 
> From personal experience, as batpimp stated, the 15.4 beta drivers from AMD help tremendously with the fps dips. Also, I've noticed that the 290X (possibly AMD cards in general) does not like any of the "advanced graphics" options being turned on. On a single 290X, MSAA x2 was the highest I could go while maintaining a nice average fps of 50-60; anything higher and it started to dip below the 40s at times.


Those advanced graphics have a lot to do with your cpu.


----------



## Phaster89

Quote:


> Originally Posted by *killer8297*
> 
> Is anyone else having frame issues with GTA V? I have it at default settings except for 2x AA @ 1080p, but I'm getting slower than expected frames, around the 50-60s, sometimes dropping even lower than that. I've seen other people in the 80s range with this card; help?


check my post, i'm having the same issues


----------



## mtrai

Back with another question: I'm trying to track down the current BIOS for this card, but am a bit stumped. My BIOS string only matches a PowerColor 290, while I do indeed own a PowerColor PCS+ R9 290X. I'm having black screen issues on Windows boot. So far PC has sent me the wrong updated BIOS twice, and now it has been referred to R&D in Taiwan.

See here in my GPUZ validation. http://www.techpowerup.com/gpuz/details.php?id=w9pkq

My string matches this card powercolor card http://www.techpowerup.com/vgabios/158255/powercolor-r9290-4096-140512.html

I have tried a number of vBIOSes that do not work. With most I can't install the driver in Windows; with the few I could, I would get error code 43 in the device manager.

Here is the Fire Strike Ultra run I just did. It seems my temps are pretty low for the run. PS: the VDDC does not need to be that high, other than that it is the only way for me to prevent black screening at this time.



Anyhow also here is a pic of the hex dump of my original bios vs the corrected one for "my card"



The odd thing is my original BIOS is only listed for the 290; however, in the dump it has the correct name after the part highlighted in red.

Help.


----------



## anteerror

Ok, I am at a loss; hopefully someone can help me. I cannot get the option to set my desktop to 2160 resolution on my R295x. My TV is the Vizio P502ui, connected using a mini DisplayPort to HDMI adapter.

It maxes out at 3200x1800 no matter what refresh rate I choose. I know I am doing something stupid, but I'm just stuck. Is it possible my cord is not HDMI 2.0? (It's not really an old cord.)


----------



## MojoW

Quote:


> Originally Posted by *anteerror*
> 
> Ok, I am at a loss; hopefully someone can help me. I cannot get the option to set my desktop to 2160 resolution on my R295x. My TV is the Vizio P502ui, connected using a mini DisplayPort to HDMI adapter.
> 
> It maxes out at 3200x1800 no matter what refresh rate I choose. I know I am doing something stupid, but I'm just stuck. Is it possible my cord is not HDMI 2.0? (It's not really an old cord.)




Make sure both checkboxes are unticked!
These cards scale to 3200x1800 max.

Edit: But over HDMI you will only get 3840x2160 @ 30Hz,
as the HDMI port on the GPU is 1.4, not 2.0.
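The 30Hz limit falls straight out of the numbers: HDMI 1.4 tops out at a 340MHz TMDS clock, and standard CEA-861 4K timings (4400x2250 total, including blanking) only fit under that at 30Hz:

```python
HDMI_1_4_MAX_TMDS_MHZ = 340  # HDMI 1.4 spec limit on the TMDS clock

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock for a video mode, using totals that include blanking."""
    return h_total * v_total * refresh_hz / 1e6

# Standard CEA-861 4K timings: 4400 x 2250 total.
uhd_30 = pixel_clock_mhz(4400, 2250, 30)   # 297.0 MHz - fits under 340
uhd_60 = pixel_clock_mhz(4400, 2250, 60)   # 594.0 MHz - needs HDMI 2.0

print(uhd_30 <= HDMI_1_4_MAX_TMDS_MHZ)  # True
print(uhd_60 <= HDMI_1_4_MAX_TMDS_MHZ)  # False
```

That's why DisplayPort (or an active DP-to-HDMI-2.0 adapter) is the way to 4K@60 on these cards.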


----------



## anteerror

Hmm, when unchecked my max selection is 1080.

This is depressing; I thought this was a powerful card capable of driving 4K. So can I use a mini DisplayPort to HDMI adapter to get 60Hz at 2160 at some point, or should I just take the TV back?


----------



## MojoW

Quote:


> Originally Posted by *anteerror*
> 
> Hmm, when unchecked my max selection is 1080.
> 
> This is depressing; I thought this was a powerful card capable of driving 4K. So can I use a mini DisplayPort to HDMI adapter to get 60 Hz at 2160 at some point, or should I just take the TV back?


Are you sure you have a 4K TV?
Maybe you have a 4K-ready TV with an internal scaler?
If so you will need to add the resolution manually with CRU.
What is the brand and type?
It sounds like VSR (Virtual Super Resolution) was enabled, effectively giving you a max resolution of 3200x1800.
Sounds like a normal 1080p TV to me if that is what happens.


----------



## mtrai

Quote:


> Originally Posted by *MojoW*
> 
> Are you sure you have a 4K TV?
> Maybe you have a 4K-ready TV with an internal scaler?
> If so you will need to add the resolution manually with CRU.
> What is the brand and type?
> It sounds like VSR (Virtual Super Resolution) was enabled, effectively giving you a max resolution of 3200x1800.
> Sounds like a normal 1080p TV to me if that is what happens.


What he said. While in theory it is possible, it would require some registry tweaks. Before I got this R9 290X I was working with some people over at Guru3D, and I got VSR enabled and working on my lowly HD 7770.

Here is one link. Just remember, VSR is not yet fully implemented, at least to the public, but the bits are there in certain driver and CCC combinations. What I am saying is VSR is not available in all Control Centers.

http://forums.guru3d.com/showthread.php?t=397924 In there you will find all the info you need on 4K and VSR.


----------



## MojoW

Quote:


> Originally Posted by *mtrai*
> 
> What he said. While in theory it is possible, it would require some registry tweaks. Before I got this R9 290X I was working with some people over at Guru3D, and I got VSR enabled and working on my lowly HD 7770.
> 
> Here is one link. Just remember, VSR is not yet fully implemented, at least to the public, but the bits are there in certain driver and CCC combinations. What I am saying is VSR is not available in all Control Centers.
> 
> http://forums.guru3d.com/showthread.php?t=397924 In there you will find all the info you need on 4K and VSR.


To add on this.
Beware if you are using crossfire and the 4k vsr mod, because half of the screen will be artifacting in games.
It works for single gpu users though.


----------



## mtrai

Quote:


> Originally Posted by *MojoW*
> 
> To add on this.
> Beware if you are using crossfire and the 4k vsr mod, because half of the screen will be artifacting in games.
> It works for single gpu users though.


Read the entire thread... and use at your own risk.

As I said, VSR is not fully implemented yet; it is still a work in progress on AMD's part. Some creative tech-minded people over there found some things and shared them, and more people got on board to help. This caused AMDMatt (the AMD rep there) to take our findings to higher-ups, and now AMD has come back and said it will be implemented on additional older cards that they previously stated could not support it due to hardware limitations. HAHA


----------



## mtrai

Does anyone happen to have a BIOS for the PCS+ R9 290X matching these 2 criteria: part number 113-c671-1101-x672, and the BIOS string ending with HJW?

As I have reported, I got a new PowerColor R9 290X and am having the dreaded black screen on boot. So far PowerColor has sent me the incorrect fixed BIOS twice, two different ones. Now they are sending all my info, including my original BIOS, to R&D in Taiwan to investigate and possibly build or provide a fixed BIOS.

I am not sure how long this is going to take, but the black screen on booting is starting to wear on me.

/edit TechPowerUp lists the PN 113-c671-1101-x672 as belonging to a non-X R9 290.

The HJW string only exists on 2 non-X 290s and 2 R9 290Xs.

/edit The issue I have with using other BIOSes is a memory issue, as I have BFR memory and almost all the BIOSes are written for AFR.
Interestingly, my original BIOS has both Hynix and Samsung memory timings etc. in it, not Elpida.


----------



## MojoW

Quote:


> Originally Posted by *mtrai*
> 
> Read the entire thread..and use at your own risk.
> 
> As I said VSR is not fully implemented yet, it is still a work in progress on AMDs part...just some creative tech minded people over there found some things, and shared, and more people got onboard to help. This caused AMDMatt ( amd rep there to take our findings to higher ups) and now AMD has come back said it will be implemented on additional older cards that they previously stated they could not due to hard ware limitations. HAHA


I know, as I am a member of G3D under the same name.
I posted the first VSR driver mod made by asder here on OCN.
And if I recall correctly, those VSR mods are based on his findings with that driver mod.


----------



## mtrai

Quote:


> Originally Posted by *MojoW*
> 
> I know, as I am a member of G3D under the same name.
> I posted the first VSR driver mod made by asder here on OCN.
> And if I recall correctly, those VSR mods are based on his findings with that driver mod.


Same name there... it was all interesting. As I have no desire to actually run VSR on an HD 7770, I worked on my card as a proof of concept and provided it to assist.


----------



## anteerror

Quote:


> Originally Posted by *MojoW*
> 
> Are you sure you have a 4k TV?
> Maybe you have a 4k ready TV with a internal scaler?
> If so you will need to add the resolution manually with CRU.
> What is the brand and type?
> It sounds as VSR(virtual super resolution) was enabled, effectively giving you a max resolution of 3200x1800.
> Sounds like a normal 1080p TV to me if that is what happens.


I am pretty sure it is a 4K TV. It is this one:

http://www.cnet.com/products/vizio-p502ui-b1/


----------



## MojoW

Quote:


> Originally Posted by *anteerror*
> 
> I am pretty sure it is a 4K TV. It is this one:
> 
> http://www.cnet.com/products/vizio-p502ui-b1/


Source
Quote:


> With 4x the clarity of 1080p Full HD, the VIZIO P-Series 4K TV is your crystal-clear window to a brand-new entertainment experience. Stream UHD from popular apps like Netflix®.* Watch HD shows in beautiful UHD picture quality with *VIZIO's Spatial Scaling Engine*. And enjoy UHD playback from next generation cable/satellite receivers, Blu-ray players and more, using the latest HDMI standard.


That is straight from your product page.
As I said before, it does have a scaler to produce the 4K image.
That shouldn't be a problem, as you can use CRU (Custom Resolution Utility) to create the 3840x2160 resolution.
And the TV will scale it accordingly.


----------



## Agent Smith1984

Quote:


> Originally Posted by *MojoW*
> 
> Source
> That is straight from your product page.
> As I said before, it does have a scaler to produce the 4K image.
> That shouldn't be a problem, as you can use CRU (Custom Resolution Utility) to create the 3840x2160 resolution.
> And the TV will scale it accordingly.


Actually, what that means is that it scales 1080p images up to 4K to match the resolution of the panel.
It is, in fact, a 4K TV. All 4K TVs use scaling processors so that the 720p and 1080p broadcast resolutions being sent to them don't look like crap.


----------



## LandonAaron

Quote:


> Originally Posted by *anteerror*
> 
> I Am pretty sure it is a 4K TV, It is this one:
> 
> http://www.cnet.com/products/vizio-p502ui-b1/


From the review of your TV:
Quote:


> Connectivity: Of the five HDMI, four are version 1.4, meaning they can handle 1080p and 4K sources up to 30Hz. The fifth is HDMI 2.0 compatible, meaning it can also handle 4K sources up to 60Hz. In addition, this fifth input can accept 1080p signals up to 120Hz, although currently the only sources capable of outputting such signals are PCs.


Make sure you have the cable plugged into the correct input.

On a side note, I wish I had waited a year to purchase my TV. I got a 1080p 60" edge-lit Vizio a year ago for basically the same price as that 4K 55" Vizio with local dimming. That seems like a heck of a deal. NM, just double-checked the article. Apparently the 55" is an IPS panel, and the 60" and up is a VA panel with local dimming. Not sure what panel type mine is; I just know it is edge-lit LCD.


----------



## ArchDevil

Add me








Gigabyte R9 290X Windforce 3


----------



## tsm106

Quote:


> Originally Posted by *anteerror*
> 
> Ok I am at a loss, hopefully someone can help me. I can not get the option to set my desktop to 2160 resolution on my R295x. My TV is the Vizio P502ui, using a mini display port to HDMI
> 
> It maxes at 3200x1800, no mater what MHz I choose. I know i am doing something stupid but I'm just stuck. Is it possiple my cord is not hdmi 2.0? (its not really an old cord)


There's no adapter product that converts DP to HDMI 2.0, because it doesn't exist yet; it's only recently that the conversion chips were released. In fact it's counterproductive to use an adapter anyway, since *you can run 4K@30Hz over HDMI 1.4*. And yeah, your card is not HDMI 2.0, and for that matter no AMD card out right now supports HDMI 2.0.

"I spoke with Steve Venuti, president of HDMI Licensing, the organization that licenses the HDMI spec to manufacturers. *He told me that the required chips with HDMI 2.0 at 18 Gbps and HDCP 2.2* have been selling for the last six to eight months, and these *are now in shipping products*, including various Panasonic devices and Samsung TVs. I had thought it would have taken longer than that to integrate the new chips into products."

These new products, such as receivers with true HDMI 2.0 ports and chips, are only just making it to retail, and in very limited supply.


----------



## anteerror

Quote:


> Originally Posted by *tsm106*
> 
> There's no adapter product that converts DP to HDMI 2.0, because it doesn't exist yet; it's only recently that the conversion chips were released. In fact it's counterproductive to use an adapter anyway, since *you can run 4K@30Hz over HDMI 1.4*. And yeah, your card is not HDMI 2.0, and for that matter no AMD card out right now supports HDMI 2.0.
> 
> "I spoke with Steve Venuti, president of HDMI Licensing, the organization that licenses the HDMI spec to manufacturers. *He told me that the required chips with HDMI 2.0 at 18 Gbps and HDCP 2.2* have been selling for the last six to eight months, and these *are now in shipping products*, including various Panasonic devices and Samsung TVs. I had thought it would have taken longer than that to integrate the new chips into products."
> 
> These new products, such as receivers with true HDMI 2.0 ports and chips, are only just making it to retail, and in very limited supply.


Yep, sadly I figured I was SOL. What annoys me is I cannot even get the graphics card to push out 4K even over HDMI 1.4; it simply refuses to go above 1080. I'm returning the TV and getting one that has a DisplayPort input; this is just too annoying to put up with. That should hopefully work right out of the box ;/

What DP standard does this card have, so I don't go through this again? Can anyone recommend a 4K TV with a DP that will match up?


----------



## tsm106

You can look at the Crossover 44K 40in 4K monitor, or the Panasonic 65in 4K (the only TV with DP).


----------



## anteerror

Wow, that's it, huh? Only two, and that Crossover seems sold out everywhere.

-Meh, found some used and overpriced on Amazon. Such is life I guess.


----------



## Cyber Locc

Quote:


> Originally Posted by *MojoW*
> 
> 
> Edit: But with HDMI you will only get 3840x2160 @ 30Hz.
> As the HDMI on the GPU is 1.4 and not 2.0.


Just wanted to point out: "But with HDMI you will only get 3840x2160 @ 30Hz, as the HDMI on the GPU is 1.4 and not 2.0" sounds like it is in reference to VSR? If so, that is false. VSR doesn't care what cable you have; that's not how it works, lol. You can run 4K downsampling over a DVI cable at 60Hz, or even 120Hz if you can push that, because the data passed to the monitor is still 1080p, not 4K.

And really, in that regard downsampling can be better than real 4K for some needs, as you can achieve similar 4K quality at 120fps if you have 4 Titan Xs or something that can actually push that, lol (not sure even they can, as downsampling from 4K takes more GPU power than straight 4K).
Quote:


> Originally Posted by *tsm106*
> 
> These new products such as receivers with true hdmi 2.0 ports and chips are only just making it to retail and in very limited supply.


Umm, how old is that quote? I just did a search for HDMI 2.0 now that I know there's an adapter coming this month. HDMI 2.0 is available on a whole lot of TVs on Amazon, even ones that are 2014 Samsungs etc., and even the 400 dollar Sceptre 50 inch has HDMI 2.0. The first one came out in the middle of last year, and as of now most of the 4K TVs that have come out since then have HDMI 2.0; they're not hard to find at all. And while I see you said "true", I read reviews of a lot of said TVs and they all allow 4K@60Hz over HDMI (confirmed); that's "true" enough for me.
Quote:


> Originally Posted by *anteerror*
> 
> Yep, Sadly I figured i was SoL, what annoys me is I cannot even get the GFX card to push out to 4k even on the hdmi 1.4, It just simply refuses to go above 1080. I'm returning it and getting one that has a Display port input, this is just too annoying to put up with. That should hopefully work right out of the box ;/
> 
> What DP standard does this Card have so I don't go thru this again? Anyone recommend a 4K tv with a DP that will match up?


Okay, I'm confused about what you're trying to do, lol. You say it maxes at 1800: well, turn VSR off; it does nothing for you. You already have a 4K TV/monitor, so VSR is worthless to you. VSR renders at a higher resolution and then downsamples to 1080p to work on non-4K monitors, like a faux 4K or 1440 etc. You should be using straight-up hardware 4K; you already have 4K.

As far as it only showing 1080p, that would be your adapter's fault. 90% of those adapters do not support more than 1080p, and half of the ones that say they do, don't; read the reviews.

As far as your question about HDMI 2.0: yes, Bizlink made an adapter. It was at CES 2015, and I have found a few people who say, after talking to their marketing techs, that it will be released in May. It pushes 4K at 60Hz over HDMI 2.0. Here's an article about it (granted, in French, but just use translate): http://www.hardware.fr/news/14040/ces-adaptateur-displayport-1-2-hdmi-2-0.html. It will be expensive, and I think that's your current issue: a good adapter that will actually push 4K is going to cost upwards of 50 bucks, and this one being so new and the only one at launch, I wouldn't be surprised at 100 or so. You can't just buy a 10 dollar adapter off Amazon and use it to push 4K; it just doesn't work that way.


----------



## fishingfanatic

I can tell you this about the Panasonic 4K 65" TV: it's an awesome TV!!!

It only has one 4K DP port though.

With a surround system I only need the one port anyway as everything is hooked up that way.

FF


----------



## anteerror

Quote:


> Originally Posted by *cyberlocc*
> 
> Just wanted to point out: "But with HDMI you will only get 3840x2160 @ 30Hz, as the HDMI on the GPU is 1.4 and not 2.0" sounds like it is in reference to VSR? If so, that is false. VSR doesn't care what cable you have; that's not how it works, lol. You can run 4K downsampling over a DVI cable at 60Hz, or even 120Hz if you can push that, because the data passed to the monitor is still 1080p, not 4K.
> 
> As far as it only showing 1080p, that would be your adapter's fault. 90% of those adapters do not support more than 1080p, and half of the ones that say they do, don't; read the reviews.


Hmm, I will check out a new adapter and see if that fixes my issue. VSR was one of the first things I turned off, and I can't even get the TV above 1080 with CRU. Sadly my graphics card lacks HDMI 2.0 support; that's why I was looking at monitors with a DP. With only two of them out there, one nearly unavailable and the other too big for my blood, I feel like I need to either just game in SD for a while longer and wait for more to come out, or sell my 295X2 and just get a 980.

Lame. I wish this thing just pushed 2.0.


----------



## Cyber Locc

I would just wait for the adapter, to be honest. You can grab an adapter that can push 4K and use that at 30Hz, or lower the res for 60Hz (I think that TV pushes up to 120Hz). Buying a 980 is also an option; that said, you will need two, as one isn't enough, and even two barely beat a 295X2 and actually lose to it in some games, so that is a major waste of cash to blow on a sidegrade. I would just hold out for the adapter, tbh. Good luck with whatever option you choose, and switching TVs is a bad idea anyway, as your TV is better than both of those.


----------



## GMcDougal

Proud owner of an MSI 290X Gaming 4G. Here's the beast in action!


----------



## Arizonian

Quote:


> Originally Posted by *mtrai*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Back with another question, trying to track down the correct BIOS for this card, but a bit stumped here. My BIOS string only matches a PowerColor 290, while I do indeed own a PowerColor PCS+ R9 290X. Having black screen issues on Windows boot. So far PC has sent me the wrong updated BIOS twice, and now it has been referred to R&D in Taiwan.
> 
> 
> See here in my GPUZ validation. http://www.techpowerup.com/gpuz/details.php?id=w9pkq
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> My BIOS string matches this PowerColor card http://www.techpowerup.com/vgabios/158255/powercolor-r9290-4096-140512.html
> 
> I have tried a number of vBIOSes that do not work. With most of them I can't install the driver in Windows, and on the few I could, I would get error code 43 in Device Manager.
> 
> Here is the Firestrike Ultra run I just did. My temps seem pretty low for the run. PS: the VDDC does not need to be that high, other than it is the only way for me to prevent blackscreening at this time.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Anyhow, here is also a pic of the hex dump of my original BIOS vs the corrected one for "my card":
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> The odd thing is my original BIOS is only listed for the 290; however, in the dump it has the correct name after the part highlighted in red.
> 
> Help.


Congrats - added








Quote:


> Originally Posted by *ArchDevil*
> 
> Add me
> 
> 
> 
> 
> 
> 
> 
> 
> Gigabyte R9 290X Windforce 3


Congrats - added








Quote:


> Originally Posted by *GMcDougal*
> 
> Proud owner of an MSI 290x Gaming 4g. Here's the beast in action!
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Banda

Finally I got an R9 290X!



http://www.techpowerup.com/gpuz/details.php?id=epahe


----------



## pcrevolution

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added


Hi sir please add me as well!



Happy owner of a 290X with an AC backplate, waiting for summer to complete a custom loop build.


----------



## Misterstrategy

I am looking for a video card that is UEFI compatible, so as to work with the Asus X99-E WS and Intel SSD 750. Can someone please tell me if the Asus R9 290X DC2 4GD5 is UEFI compatible?


----------



## DaUn3rD0g

Quote:


> Originally Posted by *Misterstrategy*
> 
> I am looking for a video card that is UEFI compatible, so as to work with the Asus X99-E WS and Intel SSD 750. Can someone please tell me if the Asus R9 290X DC2 4GD5 is UEFI compatible?


What do you mean by UEFI compatible?

Do you mean the fast boot option, or just one that will work with a UEFI motherboard?

If you mean the former, then I would imagine it does. I have an MSI 290X LE Gaming that uses the dual BIOS option, one for UEFI (the default BIOS) and another for compatibility with legacy BIOS. Being an Asus flagship card (on the AMD side), I can't see them not giving it UEFI fast boot; I would imagine that if anything they'd drop the support for legacy BIOS in favour of UEFI.

If you mean the latter, any card will work with your motherboard, even an older GPU made for legacy BIOS. You just won't get the fast boot, but depending on your system that might not matter; mine cold boots to the Windows 8.1 login screen quicker than my monitor can wake from standby even with my card set to legacy, but that's just due to having an SSD for the boot drive.

Having said that, though, my older XFX 260X shows a corrupted display in my UEFI BIOS on an ASRock H97 Pro4, forcing me to remove it and use the iGPU on my 4690K if I needed to do anything in the BIOS, so I get why you might ask about compatibility. But it still worked fine outside of the BIOS, and there are no problems at all with my 290X set to either BIOS.


----------



## mfknjadagr8

So I've seen people run their tubing through two GPUs and back out the opposite port on the top GPU... while I like the look of this, wouldn't that make the water kind of stagnate in the GPU blocks and increase temps?


----------



## rdr09

Quote:


> Originally Posted by *mfknjadagr8*
> 
> So I've seen people run their tubing through two GPUs and back out the opposite port on the top GPU... while I like the look of this, wouldn't that make the water kind of stagnate in the GPU blocks and increase temps?


The last GPU in the loop will see higher temps, 1-2°C at most.
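That 1-2°C figure checks out with a quick energy-balance estimate. The 300 W heat load and 1 GPM flow below are assumed typical values, not measured ones:

```python
# Rough estimate of coolant temperature rise across one GPU block.
# Assumes ~300 W dumped into the water and ~1 GPM loop flow (both
# assumptions); the rise is Q = m_dot * c_p * dT rearranged for dT.

WATER_CP = 4186        # specific heat of water, J/(kg*K)
GPM_TO_KG_S = 0.0631   # 1 US gal/min of water is ~0.0631 kg/s

def delta_t(watts, flow_gpm):
    """Steady-state water temperature rise through a block, in °C."""
    return watts / (flow_gpm * GPM_TO_KG_S * WATER_CP)

print(f"{delta_t(300, 1.0):.2f} C")  # prints about 1.14 C
```

So even with a whole GPU's heat in the water, the second block only sees water about a degree warmer than the first.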


----------



## mfknjadagr8

Quote:


> Originally Posted by *rdr09*
> 
> The last GPU in the loop will see higher temps, 1-2°C at most.


that's pretty negligible really...


----------



## Mega Man

Quote:


> Originally Posted by *mfknjadagr8*
> 
> So I've seen people run their tubing through two GPUs and back out the opposite port on the top GPU... while I like the look of this, wouldn't that make the water kind of stagnate in the GPU blocks and increase temps?


depends. i have seen mistakes ( out/in the same port )

but it depends on the setup,

serial vs parallel

swiftech has the best write up imo on serial vs parallel

http://www.swiftech.com/ApogeeHD.aspx#tab1

part way down the page

and fyi usually order does not matter


----------



## rdr09

Quote:


> Originally Posted by *mfknjadagr8*
> 
> that's pretty *negligible* really...


Exactly, for real.

Here were my old setup temps in BF4 MP . . .



Only one 120 rad per block. Now, this is assuming that the blocks are the same brand and the pads and paste are the same as well.

Here is the new setup in the same game, with one 240 and one 360 rad . . .



VRMs may show a bigger difference, but just a bit.


----------



## Dooderek

Quote:


> Originally Posted by *mfknjadagr8*
> 
> So I've seen people run their tubing through two GPUs and back out the opposite port on the top GPU... while I like the look of this, wouldn't that make the water kind of stagnate in the GPU blocks and increase temps?


No sir, the GPUs will always be hotter than the water, and because water has such a high heat capacity it will keep carrying heat away even after it has already cooled the first GPU. If your water is somehow hotter than the GPU, then you have much bigger problems at hand.


----------



## ShinGoutetsu

Can I join? Sapphire R9 290X Vapor-X 4gb OC crossfire with ek full cover blocks and backplates









http://s31.photobucket.com/user/gou...3-4A09-9743-FE2A307CA2F0_zps0bhntrjk.jpg.html
http://s31.photobucket.com/user/gou...1-476C-B6D0-6D5646D0685F_zpsacgdfduy.jpg.html


----------



## DaUn3rD0g

Quote:


> Originally Posted by *mfknjadagr8*
> 
> So I've seen people run their tubing through two GPUs and back out the opposite port on the top GPU... while I like the look of this, wouldn't that make the water kind of stagnate in the GPU blocks and increase temps?


Do you mean where the tube is in a "U" shape, going in on one side, travelling down (or up) through both blocks, and exiting out of the first block on the same side it started on?

I had thought about this previously: as water travels the path of least resistance, wouldn't it just come through the first block and ignore the next one along? But because the path for the water inside the blocks is smaller and more restrictive than the tubing supplying the water, the pressure will mostly even out.

I figure the flow in the last block could be a little lower than the first, particularly in the case of 4-way setups using blocks with low restriction/high flow; I imagine that's what causes the temperature variation more than the coolant's ability to absorb heat.

It would be interesting to see if anyone has ever tried taking flow readings between blocks in this kind of configuration, although I would guess that the variation would be too small to accurately measure.

On the subject of flow and restriction while I think of it (I realise it's the wrong place to ask, but there are a lot of knowledgeable watercoolers in here), does anyone know why high flow / low restriction is considered to be better than low flow / high restriction?

I ask because, while I understand the logic that getting more cool water to a component and taking the heat away quicker should increase the cooling capacity, in reviews it's actually the blocks with high restriction that come out on top.

Take the Alphacool NexXxoS GPX R9 290 block for example: it has the highest restriction of any GPU block, yet offers the best core cooling.

It's a similar story with CPU blocks. Yet everyone says what a bad thing restriction is, and in theory it should be, so does anyone with an understanding of thermodynamics want to explain why it's not in practice? Or link me a thread that does explain it?


----------



## Cyber Locc

Quote:


> Originally Posted by *DaUn3rD0g*
> 
> Do you mean where the tube is in a "U" shape, going in on one side, travelling down (or up) through both blocks, and exiting out of the first block on the same side it started on?
> 
> I had thought about this previously: as water travels the path of least resistance, wouldn't it just come through the first block and ignore the next one along? But because the path for the water inside the blocks is smaller and more restrictive than the tubing supplying the water, the pressure will mostly even out.
> 
> I figure the flow in the last block could be a little lower than the first, particularly in the case of 4-way setups using blocks with low restriction/high flow; I imagine that's what causes the temperature variation more than the coolant's ability to absorb heat.
> 
> It would be interesting to see if anyone has ever tried taking flow readings between blocks in this kind of configuration, although I would guess that the variation would be too small to accurately measure.
> 
> On the subject of flow and restriction while I think of it (I realise it's the wrong place to ask, but there are a lot of knowledgeable watercoolers in here), does anyone know why high flow / low restriction is considered to be better than low flow / high restriction?
> 
> I ask because, while I understand the logic that getting more cool water to a component and taking the heat away quicker should increase the cooling capacity, in reviews it's actually the blocks with high restriction that come out on top.
> 
> Take the Alphacool NexXxoS GPX R9 290 block for example: it has the highest restriction of any GPU block, yet offers the best core cooling.
> 
> It's a similar story with CPU blocks. Yet everyone says what a bad thing restriction is, and in theory it should be, so does anyone with an understanding of thermodynamics want to explain why it's not in practice? Or link me a thread that does explain it?


It just depends on your loop setup and the block's construction. Usually high restriction means more fins and leads to better temps; this is especially true of CPU blocks. However, it becomes harder to achieve the flow rate needed for good cooling, which makes it a wash.

Also, for the record, the Swiftech Komodo has the best core cooling of the R9 blocks, followed by the Alphacool NexXxoS GPX R9 290. Furthermore, the Alphacool block does well on the core because it does horribly on everything else: the only copper contact point is the core, and the rest is black painted. The backplate is also like a heatsink and causes it not to fit most boards. It is also a hybrid block, as it is an air heatsink as well as a block, but it does well on the GPU and horribly on everything else. That is the block that everyone will tell you not to buy, lol. Their idea was solid and I give them props, but the performance simply isn't there; they need to take that back to the woodshed and start over. Also the block is aluminum; I'm not sure if that comes in contact with the water, though I assume it does. There is a reason the Ocool blocks are the cheapest, lol.

Which they did, and came out with the other one with multiple contact points but not a solid slab; yeah, that's the worst performing block of all of them, lol. Don't get me wrong, it's cool they're trying to innovate, and they have the rad market on lock. However, their blocks are junk, sorry to say.


----------



## rdr09

Quote:


> Originally Posted by *ShinGoutetsu*
> 
> Can I join? Sapphire R9 290X Vapor-X 4gb OC crossfire with ek full cover blocks and backplates
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://s31.photobucket.com/user/gou...3-4A09-9743-FE2A307CA2F0_zps0bhntrjk.jpg.html
> http://s31.photobucket.com/user/gou...1-476C-B6D0-6D5646D0685F_zpsacgdfduy.jpg.html


Nice and clean. Lovely.


----------



## Cyber Locc

Quote:


> Originally Posted by *ShinGoutetsu*
> 
> Can I join? Sapphire R9 290X Vapor-X 4gb OC crossfire with ek full cover blocks and backplates
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://s31.photobucket.com/user/gou...3-4A09-9743-FE2A307CA2F0_zps0bhntrjk.jpg.html
> http://s31.photobucket.com/user/gou...1-476C-B6D0-6D5646D0685F_zpsacgdfduy.jpg.html


Very nice, loving the ColdZero parts; it looks a lot like my build (mine's acrylic though). I also know how much of a pain in the A$$ it is to route those mid-plate tubes. I literally had to cut the rad supports off in the power supply area to be able to work down there, because I have 4 rads, so the power supply (G2 1600W, huge...) has to be installed after the loop.


----------



## ShinGoutetsu

Quote:


> Originally Posted by *rdr09*
> 
> Nice and clean. Lovely.


Thanks, my first water-cooled build. It was daunting, but I took my time and I'm happy with the results.
Quote:


> Originally Posted by *cyberlocc*
> 
> Very nice, loving the ColdZero parts; it looks a lot like my build (mine's acrylic though). I also know how much of a pain in the A$$ it is to route those mid-plate tubes. I literally had to cut the rad supports off in the power supply area to be able to work down there, because I have 4 rads, so the power supply (G2 1600W, huge...) has to be installed after the loop.


I only have the one 480 rad and the dual pumps in the bottom, so it's not too bad, but I wanted the tubing pass-throughs to be an equal distance from the motherboard tray for aesthetics, lol. There's another 480 up top. I do love the ColdZero parts; going to get the ROG front fan grill next.


----------



## mojobear

You know what all these watercooling performance graphs are missing? Error bars. They need to do replicates when the delta between each block is like 1-2°C. I mean... really? The variation between applications of thermal paste alone is, I'm guessing, at least +/- 1°C. I'm talking about their graphs of CPU blocks.
Quote:


> Originally Posted by *Mega Man*
> 
> depends. i have seen mistakes ( out/in the same port )
> 
> but it depends on the setup,
> 
> serial vs parallel
> 
> swiftech has the best write up imo on serial vs parallel
> 
> http://www.swiftech.com/ApogeeHD.aspx#tab1
> 
> part way down the page
> 
> and fyi usually order does not matter


----------



## Mega Man

Yeah, well, that is why I said loop order does not matter.


----------



## DaUn3rD0g

Quote:


> Originally Posted by *cyberlocc*
> 
> It just depends on your loop setup and the block's construction. Usually high restriction means more fins, which leads to better temps; this is especially true of CPU blocks. However, it becomes harder to achieve the flow rate needed for good cooling, which makes it a wash.
> 
> Also, for the record, the Swiftech Komodo has the best core cooling of the R9 blocks, followed by the Alphacool NexXxoS GPX R9 290. The Alphacool block does well on the core partly because it does horribly on everything else: the only copper contact point is the core, and the rest is painted black. The backplate also acts like a heatsink and keeps it from fitting most boards. It's a hybrid block too, an air heatsink as well as a water block, but it does well on the GPU and horribly on everything else. That is the block everyone will tell you not to buy lol. Their idea was solid and I give them props, but the performance simply isn't there; they need to take it back to the woodshed and start over. Also, part of the block is aluminum; I'm not sure if that comes in contact with the water, though I assume it does. There is a reason the Alphacool blocks are the cheapest lol.
> 
> Which they did, and came out with the other one with multiple contact points instead of a solid slab; yeah, that's the worst-performing block of all of them lol. Don't get me wrong, it's cool they're trying to innovate, and they have the rad market on lock, but their blocks are junk, sorry to say.


Thanks for your reply, that does make sense!

I was referring to Stren's waterblock roundup when talking about the Alphacool block; I can't remember if the Komodo was included or not.

I know the Alphacool blocks are "hybrid", if you can call it that; more like a universal block with the necessary passive heatsinks attached. They also look butt ugly compared to other blocks, in my opinion.

The compatibility issue has been blown out of proportion though, as it's only a problem if the card is installed in the first PCI-e slot, and that only really affects motherboards with slots spaced for 3- or 4-card SLI/CrossFire. Realistically, the kind of people planning a multi-card build that requires a high-end board like that would have a budget big enough for true full-cover blocks. People looking to buy cheap waterblocks like the Alphacool will most likely have a cheaper motherboard as well, and those tend to put the graphics card in the second slot with a PCI-e x1 in the first, eliminating the clearance issue. It's a budget block designed for a budget audience; that's not a bad thing really, unless you start hoping for it to be a premium block for premium buyers at a cut-down price (that just doesn't happen in PC land lol!).

I am one of those budget people and have an Alphacool GPX arriving on Tuesday, but I actually bought that one because it's the only full-cover (ish) block that fits over the enlarged capacitors on an MSI 290X Gaming without having to butcher your shiny new block, which, given the price of waterblocks, is not something someone on a budget as limited as mine would be willing to do. I would've got an EK block if they would just lengthen/widen the capacitor cutout they already have so it would fit, but they have no plans to do so and neither does anyone else, so my only options were the Alphacool or making my card look like an old Nvidia MX200 by buying a universal block and those silly stick-on heatsinks.

I wasn't aware that the aluminium parts on the Alphacool came in contact with the water though; I will need to check that, because it won't be going in my loop if they do! As far as I was aware, the actual waterblock part was acetal with a copper base and internal fins; it would be a major design flaw on Alphacool's part if that's the case.

EDIT: False alarm. Stren took the block apart in his review, and the actual waterblock part is indeed plastic and copper (plated copper, to be precise); the heatsink and backplate are made of aluminium, but that's standard and not an issue.

That's a relief, I almost panic cancelled my order!

I know what to expect with regards to less-than-perfect VRM temps, but I will post temps when it's installed in case anyone else with an MSI Gaming wants another option over universal blocks or modified full covers.
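On the restriction-vs-flow tradeoff cyberlocc describes in the quote above, a rough energy-balance check shows why extra flow quickly stops mattering for the coolant itself. The 300 W load and the flow rates below are example figures I've picked for illustration, not measurements from any specific card or loop:

```python
# Bulk coolant temperature rise across a loop: Q = m_dot * c_p * dT.
# The 300 W heat load and the flow rates are example figures only.
CP_WATER = 4186.0   # specific heat of water, J/(kg*K)
RHO      = 1000.0   # density of water, kg/m^3
HEAT_W   = 300.0    # heat dumped into the loop, watts

def bulk_dt(litres_per_hour):
    # convert L/h to kg/s, then solve the energy balance for dT
    m_dot = RHO * (litres_per_hour / 1000.0) / 3600.0
    return HEAT_W / (m_dot * CP_WATER)

for lph in (60, 120, 240):
    print(f"{lph:3d} L/h -> bulk coolant rise {bulk_dt(lph):.2f} C")
```

Even at a sluggish 60 L/h the water itself only rises a few degrees end to end; beyond that, more flow mainly improves convection inside the block's fins, which is why a restrictive, finely-finned block can still win overall despite choking the loop.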


----------



## tsm106

Quote:


> Originally Posted by *mojobear*
> 
> You know what all these watercooling performance graphs are missing? Error bars. They need to do replicates when the delta between blocks is only 1-2C. I mean... really? The variation from applying thermal paste alone is, I'm guessing, at least +/- 1C. I'm talking about their graphs of CPU blocks here.
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> depends. i have seen mistakes ( out/in the same port )
> 
> but it depends on the setup,
> 
> serial vs parallel
> 
> swiftech has the best write up imo on serial vs parallel
> 
> http://www.swiftech.com/ApogeeHD.aspx#tab1
> 
> part way down the page
> 
> and fyi usually order does not matter
Click to expand...

^^Read tests by Martin, he does a section on mounting and rates the blocks on ease and consistency.

Serial > Parallel. Then it comes to the haves and have nots in terms of head pressure. Those w/o enough pump must resort to parallel. I'd rather add another pump and double up on redundancy than throw in the towel and go parallel any day.
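The head-pressure point can be sketched with a toy operating-point model: assume a quadratic pump curve and quadratic block restriction, and compare the flow each block actually sees. Every coefficient below is invented for illustration, not a real D5 or block spec:

```python
# Toy serial-vs-parallel flow comparison for two identical GPU blocks.
# All coefficients are made up; real pump curves and restriction data
# come from manufacturer charts, not from this sketch.
HMAX = 4.0    # assumed pump head at zero flow (arbitrary units)
KP   = 0.02   # assumed pump-curve droop coefficient
RB   = 0.03   # assumed restriction coefficient of one block

def operating_flow(r_system):
    # operating point: pump head HMAX - KP*Q^2 equals loop drop r_system*Q^2
    return (HMAX / (KP + r_system)) ** 0.5

q_serial = operating_flow(2 * RB)   # series: block pressure drops add (2R)
q_par    = operating_flow(RB / 4)   # parallel: each branch carries Q/2 (R/4)

per_block_serial = q_serial         # every block sees the full loop flow
per_block_par    = q_par / 2        # flow splits across the two branches
```

With these made-up numbers the parallel loop moves more total water, yet each block sees less flow than it would in serial. The real-world tradeoff is that a weak pump may not push enough flow through the high series restriction at all, at which point parallel's lower total restriction becomes the fallback, which is exactly the haves/have-nots split.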


----------



## mojobear

Quote:


> Originally Posted by *tsm106*
> 
> ^^Read tests by Martin, he does a section on mounting and rates the blocks on ease and consistency.
> 
> Serial > Parallel. Then it comes to the haves and have nots in terms of head pressure. Those w/o enough pump must resort to parallel. I'd rather add another pump and double up on redundancy than throw in the towel and go parallel any day.


Yah, I have read Martin's stuff. For watercooling there is no better read for now. But it would be very nice, if time-consuming, for people to remount in triplicate and get at least some simple statistics to say whether there is a real difference between products. For the average consumer, or even Martin, I don't think that's reasonable. But if Swiftech puts their CPU block up against competitors on their website and shows theirs cooling better by 1-2C, then by god they should at least add error bars and do a simple t-test haha.
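The replicates-and-t-test idea can be made concrete in a few lines of Python. The temperatures below are invented purely for illustration (three hypothetical remounts per block); the t statistic just formalizes whether a 1-2C delta stands out from mount-to-mount noise:

```python
# Welch's t statistic for two small sets of replicate block temps.
# All temperature values here are made up for illustration only.
from statistics import mean, stdev

block_a = [42.1, 43.0, 42.5]   # hypothetical block A, 3 remounts (C)
block_b = [43.8, 44.2, 43.5]   # hypothetical block B, 3 remounts (C)

def welch_t(x, y):
    """Welch's t statistic for two independent samples (unequal variances)."""
    vx, vy = stdev(x) ** 2, stdev(y) ** 2
    return (mean(x) - mean(y)) / (vx / len(x) + vy / len(y)) ** 0.5

t = welch_t(block_a, block_b)
print(f"A: {mean(block_a):.1f} C, B: {mean(block_b):.1f} C, t = {t:.2f}")
```

Roughly speaking, for samples this small a |t| well above ~2 suggests the gap is bigger than mounting noise, while a |t| near or below 1 means the review's 1-2C bar chart is within the noise of reapplying paste.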


----------



## Mega Man

Unless I misunderstood him, he was asking about GPU waterblocks (2 in SLI/CFX) with both the in and out on the same side of the bridge. I just wanted to explain how it works. I have not seen a better example of both than the Swiftech page. I was not recommending either, however.


----------



## Jflisk

What are we saying here, that serial is better than parallel? I have parallel and 2x D5s. By what, 1-2C?

Also, not to throw the forum off: I'm looking for CPU TIM suggestions. Looks like I might be due. That's for Mega Man and tsm106, I value your advice. Thanks.


----------



## Mega Man

Generally serial is better for temps. But, as with everything, it depends on what you are looking for.


----------



## mojobear

Quote:


> Originally Posted by *Mega Man*
> 
> Unless I miss understood him he was asking about a gpu waterblock ( 2 in sli/cfx) with both the in and out on same side of bridge.i just wanted to explain how it works. I have not seen a better example of both then the swiftech page. I was not recommending either however


Haha, no worries. I was just on a misguided rant. It just grinds my gears that reviews nowadays don't take variation into account. I will stop now.


----------



## numbrs

Spoiler: !!!



Quote:


> Originally Posted by *Arizonian*
> 
> [Official] AMD R9 290X / 290 Owners Club
> 
> Enthusiast GPU
> 
> 
> 
> *R9 290X & 290 Club Roster*
> 
> *To be added on the member list please submit the following in your post*
> *
> 1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
> 2. Manufacturer & Brand - Please don't make me guess.
> 3. Cooling - Stock, Aftermarket or 3rd Party Water*
> _Please note if you do not provide manufacturer, brand or cooling - by default Sapphire Stock Cooling will be chosen for you._
> 
> https://docs.google.com/spreadsheet/pub?hl=en&hl=en&key=0AgQn1aK9PRbrdENJMWhJcEJBOHg1eENKbmNvY2Q3X0E&single=true&gid=0&output=html&widget=true
> 
> *AMD Gaming Facebook*
> *AMD Radeon Twitter*
> *AMD YouTube*
> 
> *Club Signature (Use your specific GPU proudly)*
> 
> 
> 
> 
> 
> 
> 
> 
> *[Official] AMD R9 290X Owners Club*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Code:
> 
> 
> 
> Code:
> 
> 
> [CENTER] [URL=http://www.overclock.net/t/1436497/amd-r9-290x-290-owners-club#post_21044503][B] :clock:[Official] AMD R9 290X Owners Club :clock: [/B][/URL] [/CENTER]










*[Official] AMD R9 290 Owners Club*









Code:



Code:


[CENTER] [URL=http://www.overclock.net/t/1436497/amd-r9-290x-290-owners-club#post_21044503][B] :clock:[Official] AMD R9 290 Owners Club :clock: [/B][/URL] [/CENTER]

*StarYoshi Edit: The thread is official now. Go bonkers and bask in the wonderment this entails.



*AMD Drivers Page*

*Latest Catalyst Software Suite*

*How-To Uninstall AMD Catalyst™ Drivers From A Windows® Based System*

*How-To Install AMD Catalyst™ Drivers For A Windows® Based System*

*Introducing the* *Catalyst™ Omega Driver* *Windows®
*
*AMD Catalys Display Driver 14.4 WHQL Windows®*

*AMD Beta Driver History and link to latest beta*

*
AMD Catalyst™ Driver 14.12 Windows®*


Spoiler: Warning: Spoiler!



*Highlights of AMD Catalyst™ 14.11.2 Windows® Beta Driver Performance Improvements*
◾Dragon Age: Inquisition performance optimizations
- Up to 5% performance increase over Catalyst™ 14.11.1 beta in single GPU scenarios with Anti-Aliasing enabled.
- Optimized AMD CrossFire™ profile
◾Far Cry 4 performance optimizations
- Up to 50% performance increase over Catalyst™ 14.11.1 beta in single GPU scenarios with Anti-Aliasing enabled.

*Important Notes*
◾All previous versions of AMD Catalyst™ drivers must be completely uninstalled to receive full benefits of this driver. Please make sure to completely uninstall all previous AMD Catalyst™ display drivers before installing the 14.11.2 Beta.
◾The AMD CrossFire™ profile for Far Cry 4 is currently disabled in this driver while AMD works with Ubisoft to investigate an issue where AMD CrossFire™ configurations are not performing as intended. An update is expected on this issue in the near future through an updated game patch or an AMD driver posting.

*Resolved Issues*
◾[409235]: Small chance of intermittent screen tearing or corruption in Call of Duty®: Advanced Warfare on high settings 4K resolution in AMD CrossFire™ mode.
◾[408892]: World of Warcraft can sometimes exhibit corruption when using CMAA in AMD CrossFire™ configurations.
◾[407431]: Minecraft sometimes produces corruption when changing video settings in windowed mode.
◾[407338]: XDMA Quad CrossFire™ configurations in portrait Eyefinity modes sometimes display tearing or stuttering.

*Known Issues*
◾[408723]: System can sometimes hang when upgrading to Catalyst™ 14.11.2 from Catalyst™14.7 in AMD CrossFire™ configurations. As a workaround please completely uninstall previous Catalyst™ software versions before installing Catalyst™14.11.2 beta.
◾[409518]: Slight performance drops in FIFA 2015 on AMD CrossFire™ configurations.
◾[409502]: Occasional flickering sometimes observed while playing FIFA 2015 in AMD Dual Graphics configurations.
◾[409638]: Slight Battlefield 4 performance drop on AMD Radeon™ R9 290X in AMD CrossFire™ configuration.
◾[409628]: AMD Radeon™ R9 285 intermittently hangs in Hitman Absolution on new game start.
◾[408484]: AMD Radeon™ R9 285 can sometimes exhibit flickering in Assassins Creed Unity.
◾[409613]: Assassins Creed Unity can sometimes experience frame stutter on some AMD CrossFire™ configurations.
◾[408706]: Call of Duty: Advanced Warfare intermittent black screen when loading game in Quad AMD Crossfire™ configurations.
◾[409600]: Civilization: Beyond Earth mantle users in AMD CrossFire™ configurations may sometimes experience an issue where they cannot change their game resolution. As a work around please use "Enable MGPU=1" in the games configuration .ini file.

AMD is currently working with BioWare to resolve the following issues:
◾Flickering is sometimes observed in Dragon Age: Inquisition on a limited number of surfaces in AMD CrossFire™ configurations.

AMD is currently working with Ubisoft to resolve the following issues:
◾Uneven hair corruption sometimes observed in Assassins Creed Unity when applying "ultra" game settings.
◾Flickering occasionally observed between windows on walls in Assassins Creed Unity.
◾Windows/Doors intermittently flash with black textures in Assassins Creed Unity.
◾Assassins Creed Unity occasionally exits to desktop when "ultra" game settings are applied.



*AMD Catalyst™ Driver 14.9 Windows®*


Spoiler: Driver Info!



*Highlights of AMD Catalyst™ 14.9 Windows Driver*
◾ Support for the AMD Radeon R9 28
◾ Performance improvements (comparing AMD Catalyst 14.9 vs. AMD Catalyst 14.4)
◾3DMark Sky Diver improvements
◾AMD A4 6300 - improves up to 4%
◾Enables AMD Dual Graphics / AMD CrossFire support
◾3DMark Fire Strike
◾AMD Radeon R9 290 Series - improves up to 5% in Performance Preset
◾3DMark11
◾AMD Radeon R9 290 Series / R9 270 Series - improves up to 4% in Entry and Performance Preset
◾BioShock Infinite
◾AMD Radeon R9 290 Series - 1920x1080 - improves up to 5%
◾Company of Heroes 2
◾AMD Radeon R9 290 Series - improves up to 8%
◾Crysis 3
◾AMD Radeon R9 290 Series / R9 270 Series - improves up to 10%
◾Grid Auto Sport
◾AMD CrossFire profile
◾Murdered Soul Suspect
◾AMD Radeon R9 290X (2560x1440, 4x MSAA, 16x AF) - improves up to 50%
◾AMD Radeon R9 290 Series / R9 270 Series - improves up to 6%
◾CrossFire configurations improve scaling up to 75%
◾Plants vs. Zombies (Direct3D performance improvements)
◾AMD Radeon R9 290X - 1920x1080 Ultra - improves up to 11%
◾AMD Radeon R9290X - 2560x1600 Ultra - improves up to 15%
◾AMD Radeon R9290X CrossFire configuration (3840x2160 Ultra) - 92% scaling
◾Batman Arkham Origins:
◾AMD Radeon R9 290X (4x MSAA) - improves up to 20%
◾CrossFire configurations see up to a 70% gain in scaling
◾Wildstar
◾Power Xpress profile
◾Performance improvements to improve smoothness of application
◾Performance improves up to 30% on the AMD Radeon R9 and R7 Series of products for both single GPU and Multi-GPU configurations
◾Tomb Raider
◾AMD Radeon R9 290 Series - improves up to 5%
◾Watch Dogs
◾AMD Radeon R9 290 Series / R9 270 Series - improves up to 9%
◾AMD CrossFire - Frame pacing improvement
◾Improved CrossFire performance - up to 20%
◾Assassin's Creed IV
◾Improves CrossFire scaling (3840x2160 High Settings) up to 93% (CrossFire scaling improvement of 25% compared to AMD Catalyst 14.4)
◾Lichdom
◾Improves performance for single GPU and Multi-GPU configurations
◾Star Craft II
◾AMD Radeon R9 290X (2560x1440, AA, 16x AF) - improves up to 20%
◾ AMD Eyefinity enhancements
◾Mixed Resolution Support
◾A new architecture providing brand new capabilities
◾Display groups can be created with monitors of different resolution (including difference sizes and shapes)
◾Users have a choice of how surface is created over the display group
◾Fill - legacy mode, best for identical monitors
◾Fit - create the Eyefinity surface using best available rectangular area with attached displays
◾Expand - create a virtual Eyefinity surface using desktops as viewports onto the surface
◾Eyefinity Display Alignment
◾Enables control over alignment between adjacent monitors
◾One-Click Setup
◾Driver detects layout of extended desktop
◾Can create Eyefinity display group using this layout in one click!
◾ New user controls for video color and display settings
◾Greater control over Video Color Management:
◾Controls have been expanded from a single slider for controlling Boost and Hue to per color axis
◾Color depth control for Digital Flat Panels (available on supported HDMI and DP displays)
◾Allows users to select different color depths per resolution and display
◾ AMD Mantle enhancements
◾Mantle now supports AMD Mobile products with Enduro technology
◾Battlefield 4: AMD Radeon HD 8970M (1366x768; high settings) - 21% gain
◾Thief: AMD Radeon HD 8970M (1920x1080; high settings) - 14% gain
◾Star Swarm: AMD Radeon HD 8970M (1920x1080; medium settings) - 274% gain
◾Enables support for Multi-GPU configurations with Thief (requires the latest Thief update)
◾ AMD AM1 JPEG decoding acceleration
◾JPEG decoding acceleration was first enabled on the A10 APU Series in AMD Catalyst 14.1 beta, and has now been extended to the AMD AM1 Platform
◾Provides fast JPEG decompression
◾Provides Power Efficiency for JPEG decompression

*Resolved Issues*
◾60Hz SST flickering has been identified as an issue with non-standard display timings exhibited by the AOC U2868PQU panel on certain AMD Radeon™ graphics cards. A software workaround has been implemented in the AMD Catalyst 14.9 driver to resolve the display timing issues with this display
◾Users seeing flickering issues in 60Hz SST mode are further encouraged to obtain newer display firmware from their monitor vendor that will resolve flickering at its origin.
◾Users are additionally advised to utilize DisplayPort-certified cables to ensure the integrity of the DisplayPort data connection.
◾4K panel flickering issues found on the AMD Radeon R9 290 Series and AMD Radeon HD 7800 Series
◾Screen tearing observed on AMD CrossFire systems with Eyefinity portrait display configurations
◾Instability issues for Grid Autosport when running in 2x1 or 1x2 Eyefinity configurations
◾Geometry corruption in State of Decay

*Known Issues*
◾[404829]: Horizontal flashing lines on second screen in a clone mode with V-Sync on using AMD Mobility Graphics with Switchable Intel Graphics
◾[404508]: Display takes a long time to redraw the screen after an S4 cycle
◾[405432]: Mantle driver will TDR when running Star Swarm on the loading screen
◾[404660]: GPU gets stuck in a low power state after it was previously stressed to max power
◾[403032]: Severe flickering observed on default launch of SimCity 4
◾[403449]: Playing any media sample in full screen in the 2x1/1x2 Fill SLS configuration leads to TDR
◾[400573]: Intermittent application hang observed while launching Aliens vs. Predator
◾[400693]: While running performance test, crash is observed with fault module atidxx32.dll
◾[401386]: Severe corruption and flashing light observed with specific game settings in Grid 2
◾[401289]: Flashing lights in Batman Arkham Origins game main menu



*AMD Catalyst™ 14.7 RC Driver for Windows® Operating System*


Spoiler: Driver Info!



Last Updates: 7/10/2014
*Feature Highlights of the AMD Catalyst™ 14.7 RC Driver for Windows
*◾Includes all improvements found in the AMD Catalyst™ 14.6 RC driver
◾AMD CrossFire™ and AMD Radeon™ Dual Graphics profile update for Plants vs. Zombies 
◾Assassin's Creed IV - improved CrossFire scaling (3840x2160 High Settings) up to 93%
◾Collaboration with AOC has identified non-standard display timings as the root cause of 60Hz SST flickering exhibited by the AOC U2868PQU panel on certain AMD Radeon™ graphics cards. A software workaround has been implemented in AMD Catalyst 14.7 RC driver to resolve the display timing issues with this display.
◾Users are further encouraged to obtain newer display firmware from AOC that will resolve flickering at its origin.
◾Users are additionally advised to utilize DisplayPort-certified cables to ensure the integrity of the DisplayPort data connection. 
*Feature Highlights of the AMD Catalyst™ 14.6 RC Driver for Windows*
◾Plants vs. Zombies (Direct3D performance improvements):
◾AMD Radeon R9 290X - 1920x1080 Ultra - improves up to 11%
◾AMD Radeon R9 290X - 2560x1600 Ultra - improves up to 15%
◾AMD Radeon R9 290X CrossFire configuration (3840x2160 Ultra) - 92% scaling
◾3DMark Sky Diver improvements:
◾AMD A4-6300 - improves up to 4%
◾Enables AMD Dual Graphics/AMD CrossFire support
◾Grid Auto Sport:
◾AMD CrossFire profile
◾Wildstar:
◾Power Xpress profile
◾Performance improvements to improve smoothness of application
◾Performance improves up to 24% at 2560x1600 on the AMD Radeon R9 and R7 Series of products for both single GPU and multi-GPU configurations.
◾Watch Dogs:
◾AMD CrossFire - Frame pacing improvements
◾Battlefield Hardline Beta:
◾AMD CrossFire profile
*Known Issues*
◾Running Watch Dogs with a R9 280X CrossFire configuration may result in the application running in CrossFire software compositing mode
◾Enabling Temporal SMAA in a CrossFire configuration when playing Watch Dogs will result in flickering
◾AMD CrossFire™ configurations with AMD Eyefinity enabled will see instability with BattleField 4 or Thief when running Mantle
◾Catalyst™ Install Manager text is covered by Express/Custom radio button text
◾Express Uninstall does not remove C:\Program Files\(AMD or ATI) folder



*AMD Catalyst™ 14.6 RC Driver for Windows® Operating System*


Spoiler: Driver Info!



Last Updates: 6/23/2014

*◾Starting with AMD Catalyst 14.6 Beta, AMD will no longer support Windows 8.0 (and the WDDM 1.2 driver)
*◾Windows 8.0 users should upgrade (for Free) to Windows 8.1 to take advantage of the new features found in the AMD Catalyst 14.6 Beta
◾AMD Catalyst 14.4 will remain available for users who wish to remain on Windows 8
*◾A future AMD Catalyst release will allow for the WDDM 1.1 (Windows 7 driver) to be installed under Windows 8.0 for those users unable to upgrade to Windows 8.1

Feature Highlights of The AMD Catalyst™ 14.6 RC Driver for Windows
◾Plants vs. Zombies (Direct3D performance improvements):
◾AMD Radeon R9 290X - 1920x1080 Ultra - improves up to 11%
◾AMD Radeon R9290X - 2560x1600 Ultra - improves up to 15%
◾AMD Radeon R9290X CrossFire configuration (3840x2160 Ultra) - 92% scaling
◾3DMark Sky Diver improvements:
◾AMD A4 6300 - improves up to 4%
◾Enables AMD Dual Graphics / AMD CrossFire support
◾Grid Auto Sport:
◾AMD CrossFire profile
◾Wildstar:
◾Power Xpress profile
◾Performance improvements to improve smoothness of application
◾Watch Dogs:
◾AMD CrossFire - Frame pacing improvements
◾Battlefield Hardline Beta:
◾AMD CrossFire profile
Known Issues
◾Running Watch Dogs with a R9 280X CrossFire configuration may result in the application running in CrossFire software compositing mode
◾Enabling Temporal SMAA in a CrossFire configuration when playing Watch Dogs will result in Flickering
◾AMD CrossFire configurations with Eyefinity enabled will see application stability with BattleField 4 or Thief when running Mantle
◾Catalyst Install Manager text is covered by Express/Custom radio button text
◾Express Uninstall does not remove C:\Program Files\(AMD or ATI) folder*



*AMD Catalyst™ 14.6 Beta Driver for Windows®*


Spoiler: Driver Info!



*Feature Highlights of The AMD Catalyst 14.6 Beta Driver for Windows*
*◾Performance improvements*
◾Watch Dogs performance improvements
◾AMD Radeon R9 290X - 1920x1080 4x MSAA - improves up to 25%
◾AMD Radeon R9290X - 2560x1600 4x MSAA - improves up to 28%
◾AMD Radeon R9290X CrossFire configuration (3840x2160 Ultra settings, MSAA = 4X) - 92% scaling
◾Murdered Soul Suspect performance improvements
◾AMD Radeon R9 290X - 2560x1600 4x MSAA - improves up to 16%
◾AMD Radeon R9290X CrossFire configuration (3840x2160 Ultra settings, MSAA = 4X) - 93% scaling
◾AMD Eyefinity enhancements
◾Mixed Resolution Support
◾A new architecture providing brand new capabilities
◾Display groups can be created with monitors of different resolution (including difference sizes and shapes)
◾Users have a choice of how surface is created over the display group
◾Fill - legacy mode, best for identical monitors
◾Fit - create the Eyefinity surface using best available rectangular area with attached displays.
◾Expand - create a virtual Eyefinity surface using desktops as viewports onto the surface.
◾Eyefinity Display Alignment
◾Enables control over alignment between adjacent monitors
◾One-Click Setup
◾Driver detects layout of extended desktops
◾Can create Eyefinity display group using this layout in one click!
◾New user controls for video color and display settings
◾Greater control over Video Color Management:
◾Controls have been expanded from a single slider for controlling Boost and Hue to per color axis
◾Color depth control for Digital Flat Panels (available on supported HDMI and DP displays)
◾Allows users to select different color depths per resolution and display

◾AMD Mantle enhancements
◾Mantle now supports AMD Mobile products with Enduro technology
◾Battlefield 4: AMD Radeon HD 8970M (1366x768; high settings) - 21% gain
◾Thief: AMD Radeon HD 8970M (1920x1080; high settings) - 14% gain
◾Star Swarm: AMD Radeon HD 8970M (1920x1080; medium settings) - 274% gain
◾Enables support for Multi-GPU configurations with Thief (requires the latest Thief update)
◾AMD AM1 JPEG decoding acceleration
◾JPEG decoding acceleration was first enabled on the A10 APU Series in AMD Catalyst 14.1 beta, and has now been extended to the AMD AM1 Platform
◾Provides fast JPEG decompression
◾Provides Power Efficiency for JPEG decompression

Known Issues
Running Watch Dogs with a R9 280X CrossFire configuration may result in the application running in CrossFire software compositing mode
◾Enabling Temporal SMAA in a CrossFire configuration when playing Watch Dogs will result in Flickering
◾AMD CrossFire configurations with Eyefinity enabled will see application stability with BattleField 4 or Thief when running Mantle
◾Catalyst Install Manager text is covered by Express/Custom radio button text
◾Express Uninstall does not remove C:\Program Files\(AMD or ATI) folder



*AMD Catalyst™ 14.4 Beta V1.0 Driver for Windows® Operating System*


Spoiler: Driver Info!



Feature Highlights of The AMD Catalyst™ 14.4 Release Candidate Driver for Windows
◾Support for the AMD Radeon™ R9 295X
◾CrossFire™ fixes enhancements:
◾Crysis 3 - frame pacing improvements
◾Far Cry 3 - 3 and 4 GPU performance improvements at high quality settings, high resolution settings
◾Anno 2070 - Improved CrossFire scaling up to 34%
◾Titanfall - Resolved in game flickering with CrossFire enabled
◾Metro Last Light - Improved Crossfire scaling up to 10%
◾Eyefinity 3x1 (with three 4K panels) no longer cuts off portions of the application
◾Stuttering has been improved in certain applications when selecting mid-Eyefinity resolutions with V-sync Enabled
◾Full support for OpenGL 4.4
◾OpenGL 4.4 supports the following extensions:
◾ARB_buffer_storage
◾ARB_enhanced_layouts
◾ARB_query_buffer_object
◾ARB_clear_texture
◾ARB_texture_mirror_clamp_to_edge
◾ARB_texture_stencil8
◾ARB_vertex_type_10f_11f_11f_rev
◾ARB_multi_bind
◾ARB_bindless_texture
◾ARB_spare_texture
◾ARB_seamless_cubemap_per_texture
◾ARB_indirect_parameters
◾ARB_compute_variable_group_size
◾ARB_shader_draw_parameters
◾ARB_shader_group_vote
◾Mantle beta driver improvements:
◾BattleField 4: Performance slowdown is no longer seen when performing a task switch/Alt-tab
◾BattleField 4: Fuzzy images when playing in rotated SLS resolution with an A10 Kaveri system



*AMD Catalyst™ 14.3 Beta V1.0 Driver for Windows® Operating System*


Spoiler: Driver Info!



*Feature Highlights of The AMD Catalyst™ 14.3 Beta V1.0 Driver for Windows®*
◾Thief:
◾AMD Mantle and AMD True Audio support
◾Improves stuttering observed in CrossFire mode
◾Call of Duty: Ghosts: QUAD CrossFire profile update - improves level load times
◾Audio issues observed when using CrossFire configurations (and V-sync enabled) have been resolved
◾BattleField 4: V-sync issues observed on CrossFire configurations (with Mantle enabled) have been resolved

*Known Issues*
◾Intermittent driver stability issues when installing/un-installing on Desktop Kaveri platforms that support AMD Enduro technology under Windows 8.1. Please disable Enduro support to resolve the issue
◾Secondary GPUs do not enter low power state on CrossFire configurations; this issue will be addressed in the next AMD Catalyst beta release
◾Thief (DirectX): Lighting flickers on CrossFire configurations only after CrossFire has been enabled then disabled; this issue will be addressed in the next AMD Catalyst beta release
◾Battlefield 4 (DirectX): Quad CrossFire configurations with Eyefinity Display configurations suffer slowdowns and stability issues
◾Titanfall: Flickering occurs under AMD CrossFire configurations



*AMD Catalyst™ 14.2 Beta V1.3 Driver for Windows®
*


Spoiler: Driver Info!



*Feature Highlights of The AMD Catalyst™ 14.2 Beta V1.3 Driver for Windows®
*◾Thief: Crossfire Profile update and performance improvements for single GPU configurations
◾Mantle: Multi-GPU configurations (up to 4 GPUs) running Battlefield 4 are now supported
◾Frame Pacing for Dual Graphics and non-XDMA configurations above 2560x1600 are now supported with Battlefield 3 and Battlefield 4
◾Dual graphics DirectX 9 application issues have been resolved
◾Minecraft: Missing textures have been resolved
◾3D applications no longer see intermittent hangs or application crashes
◾Resolves corruption issues seen in X-plane.

*Known Issues*
◾Notebooks based on AMD Enduro or PowerXpress™ technologies are currently not supported by the Mantle codepath
◾Thief does not render the right eye when CrossFire and Stereo 3D is enabled



*AMD Catalyst™ Display Driver 14.1 beta for Windows®*


Spoiler: Driver Info!



Feature Highlights of The AMD Catalyst™ 14.1 Beta Driver for Windows

Support for the following new AMD Desktop APU (Accelerated Processors) products:
◾AMD A10-7850K
◾AMD A10-7700K

*Mantle Beta driver*

◾AMD's Mantle is a groundbreaking graphics API that promises to transform the world of game development to help bring better, faster games to the PC

◾Performance gain of up to 45%(versus the DirectX version) for Battlefield 4 on the R9 290 Series

◾Performance gain of up to 200% (versus the DirectX version) for Star Swarm on the R9 290 Series

◾AMD Catalyst 14.1 Beta must be used in conjunction with versions of these applications that support Mantle
◾It is expected that these applications will have future updates to support additional AMD Mantle features
◾AMD Mantle Beta driver is currently supported on:
◾AMD Radeon™ R9 Series GPUs
◾AMD Radeon™ R7 Series GPUs
◾AMD Radeon™ HD 7000 Series GPUs
◾AMD Radeon™ HD 8000 Series GPUs
◾AMD A10-7000 Series and AMD A8-7000 Series APUs
◾ For additional details please see the AMD Mantle Technology FAQ on amd.com

◾Enhanced AMD CrossFire frame pacing - Support for 4K panel and Eyefinity non-XDMA CrossFire solutions (including the AMD Radeon R9 280, 270 Series, 7900 Series, 7800 Series) and Dual Graphics configurations

◾ Frame pacing ensures that frames rendered across multiple GPUs in an AMD CrossFire configuration will be displayed at an even and regular pace
◾Supported on 4K panels and Eyefinity configurations
◾Supported on AMD Dual Graphics configurations
◾Supported on DirectX® 10 and DirectX 11 applications

Resolved issue highlights of AMD Catalyst 14.1 Beta 
◾Resolves ground texture flickering seen in Total War: Rome 2 with high settings (and below) set in game

◾Resolves flickering texture corruption when playing Call of Duty: Ghosts (multi-player) in the space station level

Resolved Issues

◾Ground texture flickering seen in Total War: Rome 2 with high settings (and below) set in game
◾Flickering texture corruption when playing Call of Duty: Ghosts (multi-player) in the space station level
◾Blu-ray playback using PowerDVD black screen on extended mode
◾Streaming VUDU HD/HDX content on Sharp PN-K321 (DP) causes the right-side half to flicker in and out
◾Black screen happened after wake up the monitor
◾Full screen issue at rotation in DX9 mode
◾Video window black screen when using Samsung Kies to play video
◾Crysis2 negative scaling in outdoor scene
◾Crysis2 has insufficient CrossFire scaling in some scene
◾Red Faction: The game has no or negative crossfire scaling with DX9 and DX11

◾Age of Conan has corruption and performance issues with crossfire enabled

◾Company of Heroes shadows are corrupted when using crossfire

◾Resident Evil 5's performance is unstable when the display mode is set to windowed mode

◾Total War: Shogun 2 flickering menu/text

◾Frame rate drop when disabling post-processing in 3DMark06

◾Negative Crossfire scaling with game "The Secret World" in DX11 mode
◾F1 2012 Crashes to desktop

◾Tomb Raider Hair Simulation Stutters on CFX

◾Negative CrossFire scaling experienced in Call of Duty

◾Battlefield 3 performance drop on Haswell systems

◾Choppy video playback on 4k Video
◾VSync ON Tearing with 2x1 Eyefinity SLS CrossFire

◾Far Cry 3 - Game flickering while changing resolutions

◾Display corruption and BSOD occurs when extending a display after disabling Multiple GPU SLS array

◾Flickering seen when enabling three 4Kx2K panels at the same time

◾No Video, just a black screen when setting Chrome to run in "High Performance" when playing certain video clips

◾Image corruption in StarCraft

Known Issues

◾Mantle performance for the AMD Radeon™ HD 7000/HD 8000 Series GPUs and AMD Radeon™ R9 280X and R9 270X GPUs will be optimized for Battlefield 4™ in future AMD Catalyst™ releases. These products will see limited gains in Battlefield 4™ and AMD is currently investigating optimizations for them.
◾Multi-GPU support under DirectX® and Mantle will be added to StarSwarm in a future application patch
◾Intermittent stuttering or stability issues may occur when utilizing Mantle with AMD CrossFire™ technology in Battlefield 4™ - AMD recommends using the DirectX code path when playing Battlefield 4 with multiple GPUs. A future AMD Catalyst release will resolve these issues
◾Notebooks based on AMD Enduro or PowerXpress™ technologies are currently not supported by the Mantle codepath in Battlefield 4™
◾AMD Eyefinity configurations utilizing portrait display orientations are currently not supported by the Mantle codepath in Battlefield 4™
◾AMD Eyefinity technology is not currently supported in the Star Swarm application
◾AMD testing for the AMD Catalyst™ 14.1 Beta Mantle driver has been concentrated on the following products: AMD Radeon™ R9 290X, R9 290, R9 280, R9 270, R7 260X, R7 260, HD 7000 Series, HD 8000 Series, A10-7850K and A10-7700K. Future AMD Catalyst™ releases will include full test coverage for all AMD products supported by Mantle.
◾Graphics hardware in the AMD A10-7850K and A10-7700K may override the presence of a discrete GPU under the Mantle code path in Battlefield 4™
◾Frame Pacing for Dual Graphics and non-XDMA configurations above 2560x1600 do not currently work with Battlefield 3 and Battlefield 4. An upcoming release will enable support
◾DX9 Dual graphics is not supported in AMD Catalyst 14.1 Beta. An upcoming release will enable support



*AMD Catalyst™ 13.11 Beta 9.5 for Windows®*


Spoiler: Driver Info!



*Resolves the issue of AMD Overdrive missing in the AMD Catalyst Control Center for the AMD Radeon™ R9 290 Series graphics cards
Resolves intermittent flickering seen on some AMD Radeon R9 270x graphics cards
Resolves graphics corruption seen in Starcraft®
Improves frame pacing results in AMD Quad CrossFire™ configurations for the following: Hitman: Absolution, and Total War™: Rome 2*



*AMD Catalyst™ 13.11 Beta 9.4 Driver for Windows®*


Spoiler: Driver Info!



*May resolve intermittent black screens or display loss observed on some AMD Radeon™ R9 290X and AMD Radeon R9 290 graphics cards
Improves AMD CrossFire™ scaling in the multi-player portion of Call of Duty®: Ghosts

AMD Enduro Technology Profile updates:
XCOM: Enemy Unknown
Need for Speed Rivals*



*AMD Catalyst™ 13.11 Beta 9.2 for Windows®*


Spoiler: Driver Info!



*Call of Duty®: Ghosts - Improves anti-aliasing performance, and updates the AMD CrossFire™ profile*
*AMD Radeon™ R9 290 Series - PowerTune update to reduce variance of fan speed / RPM
Resolves intermittent crashes seen in legacy DirectX® 9 applications*



*AMD Catalyst™ 13.11 Beta 8 for Windows® *


Spoiler: Driver Info!



*Resolves intermittent crashes experienced with Battlefield 4 on Windows 8 based systems*



*AMD Catalyst™ 13.11 Beta 7 Driver for Windows®*


Spoiler: Driver Info!



*Increases AMD CrossFire™ scaling up to an additional 20% for Battlefield 4*



*AMD Catalyst™ 13.11 Beta 6 Driver for Windows®*


Spoiler: Driver Info!



Includes support for the new products:
*AMD Radeon™ R9 290X
AMD Radeon R9 290*

Performance improvements
* Batman: Arkham Origins - improves performance up to 35% with MSAA 8x enabled
Total War™: Rome 2 - improves performance up to 10%
Battlefield 3 - improves performance up to 10%
GRID 2 - improves performance up to 8.5%
DiRT Showdown - improves performance up to 10%
Formula 1™ 2013 - improves performance up to 8%
DiRT 3 - improves performance up to 7%
Sleeping Dogs - improves performance up to 5%
Automatic AMD Eyefinity Configuration 
Automatic "plug and play" configuration of supported Ultra HD/4K tiled displays*










*USEFUL SOFTWARE & INFO SECTION*









*Nvidia & AMD Drivers Un-install Utility*

*RadeonPro*

*Atiman Uninstaller v.7.0.2.msi*

*TechPowerUp GPU-Z*

*MSI Afterburner*

*Sapphire TriXX*

*ASUS GPU Tweak*

*OCCT*

*Fraps*

*Unigine Valley Benchmark*

*Futuremark 3DMark Benchmark*

*Memory Info Test*

*TechPowerUP Video BIOS Collection 290X*

*TechPowerUP Video BIOS Collection 290*

*TechPowerUp - BIOS Flashing - ATI Winflash 2.6.7*

*Stop Down Clocking by OCN member Sgt. Bilko*



Spoiler: Stop Down Clocking Instructions Using AB Beta 17



Using AB 3.0.0 Beta 17: once you have applied your overclock, you must go back into CCC and reset the Power Limit to +50% again. Even though AB says it's 50%, it is NOT 50% in CCC.

However, IF CCC is disabled, DO NOT ENABLE OVERDRIVE in CCC and your AB settings will work.

NOTE: Whenever you set a new clock and your screen flashes, you need to bump the power limit back up in CCC as well, as it resets every time the driver does.



*How to Uninstall GPU Drivers by OCN member BradleyW*









*How To Uninstall Your ATI GPU Drivers*

*How To Uninstall Your NVIDIA GPU Drivers*

*Unofficial OC Method by OCN member tsm106*








Use this ONLY if you need to.


*How much PSU is required to run a 290X by OCN member SonDa5*

*290X PSU Power Output Tests*

*How to give more volts on MSI Afterburner by OCN member sugarhell*











Spoiler: ADD more volts to MSI AB Guide



*Source*

Just use /wi4,30,8d,10 for +100 mV. The last parameter is a hexadecimal offset in 6.25 mV steps: 0x10 = 16 in decimal, and 16 × 6.25 = 100 mV. For +50 mV you need 8; for +200 mV you need 20 (0x20 = 32 in decimal, and 32 × 6.25 = 200 mV).

The easy way to make the change:

Create a txt file on the desktop and write:
CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /wi4,30,8d,10

then save it as a .bat file. Every time you run this bat file, MSI Afterburner will start with +100 mV.

For +50 mV: 8
For +100 mV: 10
For +125 mV: 14
For +150 mV: 18
For +175 mV: 1C
For +200 mV: 20

I wouldn't go over this point because:
1) You are close to leaving the sweet spot of the reference PCB's VRM efficiency.
2) These commands add 200 mV on top of the +100 mV offset available through the AB GUI, which means 300 mV total.

By default the /wi command applies to the current GPU only, so if you have 2 or more GPUs you must use the /sg command. The command line then looks something like:
ex: MSIAfterburner.exe /sg0 /wi6,30,8d,10 /sg1 /wi6,30,8d,10
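The hex arithmetic in the guide above is easy to get wrong, so here is a short sanity-check script (an illustration added for this thread, not part of the original guide; the 6.25 mV step size is the one the guide describes) that converts a desired offset in mV into the hexadecimal value the /wi switch expects:

```python
# Convert a desired voltage offset (mV) into the hexadecimal step value
# used in MSI Afterburner's /wi switch, assuming the 6.25 mV per step
# described in the guide above.
STEP_MV = 6.25

def wi_hex(offset_mv):
    steps = offset_mv / STEP_MV
    if steps != int(steps):
        raise ValueError("offset must be a multiple of 6.25 mV")
    return format(int(steps), "X")  # uppercase hex, no 0x prefix

# Reproduce the table from the guide:
for mv in (50, 100, 125, 150, 175, 200):
    print(f"+{mv} mV -> {wi_hex(mv)}")
```

Running it reproduces the guide's table (+100 mV → 10, +175 mV → 1C, and so on), which confirms the listed values are consistent with the 6.25 mV step.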



*Need a good PSU? Check out the Comparison Threads by OCN member shilka*









*700-750 watts comparison thread*
*1000-1050 watts comparison thread*
*1200-1350 watts comparison thread*

*The R9 290(X) "Need to Know" by OCN member Roboyto*









*The R9 290(X) "Need to Know" Post*
*Omega Drivers Comparison by OCN member Roboyto*









*Omega Drivers Comparison*

*Sapphire R9 290 + Arctic Accelero Xtreme IV + VRM mod by OCN member Yuriewitsch*









*Sapphire R9 290 + Arctic Accelero Xtreme IV + VRM mod*

*Random Crashes Playing Fallout Fix by OCN member Alancsalt*










*Water Blocks and Back Plates*

*R9 290X & 290*

*EKWB - Water Block* - *LINK*
*EKWB - Back Plate* - *LINK* (not a standalone)

*VID-AR290X Koolance Water Block (AMD Radeon R9 290, 290X)* - *LINK*

*XSPC - Razor R9 290X / 290 Water Block* - *LINK*
*XSPC - Razor R9 290X / 290 Backplate* - *LINK*

*Swiftech - KOMODO-R9-LE R9 290X / 290 Backplate* - *LINK*

*Aqua Computer - Kryographics Black Edition* - *LINK*
*Aqua Computer - Kryographics Black Edition / Nickel Plated* - *LINK*
*Aqua Computer - Kryographics Acrylic Glass Edition* - *LINK*
*Aqua Computer - Kryographics Acrylic Glass Edition / Nickel Plated* - *LINK*

*Aqua Computer - Kryographics Backplate, passive* - *LINK*
*Aqua Computer - Kryographics Backplate, active XCS* - *LINK*

*OCN Benchmark Links*

Got a good benchmark? Take pride in your GPU scores and feel free to participate representing the 290 and 290X. Remember to read the first page of each thread and follow its rules when benching, so your submitted scores count as valid.









*Single GPU Firestrike Top 30*

*Top 30 3DMark 13 Fire Strike Scores in Crossfire / SLI*
*Firestrike Extreme Top 30*

*Top 30 3DMark11 Scores for Single/Dual/Tri/Quad*

*[OFFICIAL]--- Top 30 --- Unigine 'Valley' Benchmark 1.0*

*Catzilla Top 30 Thread*

*OCN / AMD Related Sites*

*MSI R9 290X Lightning Thread*

*290/290X Black Screen Poll*

*xXCrossXFire ClubXx --Because one's not enough *

*The R9 290 -> 290X Unlock Thread*

*Official AMD R9 295X2 Owners Club*

*XFX Black/Double Dissipation Club!*



Hello,

MSI R9 290x Lightning Edition, STOCK



Thanks


----------



## maynard14

Hi guys, I've been thinking of selling my 290X reference edition and buying a new GTX 970. Is it worth it? My 290X is working great, but sometimes it black-screens my PC. I don't know if the problem is my PSU or the PCIe extension I'm using, so I kinda want to get rid of the 290X. Is it worth it? I'm using a 1080p 120 Hz monitor.


----------



## numbrs

Quote:


> Originally Posted by *maynard14*
> 
> Hi guys, I've been thinking of selling my 290X reference edition and buying a new GTX 970. Is it worth it? My 290X is working great, but sometimes it black-screens my PC. I don't know if the problem is my PSU or the PCIe extension I'm using, so I kinda want to get rid of the 290X. Is it worth it? I'm using a 1080p 120 Hz monitor.


It depends what you're using it for. The 290X is great for Folding@Home, but I think a 970 would be a step backwards, because I'm pretty sure the 290X you own has a 512-bit bus and 4 GB of GDDR5. I think a decision between those cards is mostly personal opinion, but the 290X is considered an enthusiast-grade card.

Have you overclocked it at all?

What PSU are you using for your build? And is the black screen your computer crashing, or more of a "display crash", where the screen fixes itself after you let it refresh?

Try turning off any overclocks, and check the GPU temperature through MSI Afterburner when you try to re-create the crash. Also, make sure you are up to date on all the CCC (Catalyst Control Center) drivers, and Google around for black-screen issues with the reference cards.

TL;DR I wouldn't return the card unless you get 85-90% of the money back, let alone a full refund. Personally I think a 970 would be a downgrade, but it might save you from more hassle in the future.

Regards,


----------



## Tonza

Decided to buy an Accelero Xtreme IV and try for myself whether I could install it on a DCII 290 card. Yes, it works, and it works flawlessly (I did not use the poor VRM backplate cooling, since it doesn't fit the DCII PCB; I used the original VRM heatsink, left the memory and VRM1 without any heatsinks, as they are stock on this card, and kept the stock DCII backplate).

Anyway, here are some tests I made in a one-hour GTA V session:

*DCII stock cooling @ 55% jet-engine speed with -38 mV and stock clocks (yes, the thing needed to be undervolted to avoid throttling; absolutely crappy cooling)*

Core: around 86-90C (it went to 94C and started to throttle at stock voltages)

VRMs were around 70-80C, which was fine. I don't have exact readings, but the thing went crazy hot, and I reseated it a couple of times with Phobya paste (the problem is that only 2 of the 4 heatpipes make proper contact with the 290 core, GG Asus). The DCII cooler also sounded like a jet engine.

*Accelero Xtreme IV @ 100% fan speed with -38 mV and stock clocks (wanted to see how it performs first without any OC)*

Core: 65C
VRM1: 64C
VRM2: 54C

Now for the fun part.

*Accelero Xtreme IV with +50 mV and 1125/1400 clocks*

Core: hovers between 69-70C

VRM1: 77C
VRM2: 62C

An absolutely insane difference. Also, now I can play without headphones, and my face doesn't flap from the wind force coming off the DCII cooler. Tomorrow I'm going to install heatsinks on the memory and VRM1 when the thermal glue arrives (I have a package of heatsinks from an old Alpenföhn Peter cooler which I installed on a GTX 580 back in the day), though I don't think it's even necessary.

Arctic states that the Xtreme IV is not compatible with the 290 DCII card, but that is BS; you just need steady hands, and to install it without the spacers and without the poor VRM heatsink (which won't fit).


----------



## Mbbx

Hi
I have been trying to overclock my quadfire 290Xs in Afterburner, but can only get 100 mV extra.

I have 4x MSI 290X Gaming and dual PSUs, water cooled.

I am using version 4.1 of AB (Afterburner).

Do I have to do a BIOS change to get more voltage? I can only get 1200 MHz on the GPU and 1290 MHz on the memory at +100 mV.

any ideas?


----------



## numbrs

Quote:


> Originally Posted by *Mbbx*
> 
> Hi
> I have been trying to overclock my quadfire 290Xs in Afterburner, but can only get 100 mV extra.
> 
> I have 4x MSI 290X Gaming and dual PSUs, water cooled.
> 
> I am using version 4.1 of AB (Afterburner).
> 
> Do I have to do a BIOS change to get more voltage? I can only get 1200 MHz on the GPU and 1290 MHz on the memory at +100 mV.
> 
> any ideas?


You have to go into the settings. Look at the bottom of the front page of the thread; it has an "unlock to +200mv" guide.


----------



## numbrs

Quote:


> Originally Posted by *Mbbx*
> 
> Hi
> I have been trying to overclock my quadfire 290Xs in Afterburner, but can only get 100 mV extra.
> 
> I have 4x MSI 290X Gaming and dual PSUs, water cooled.
> 
> I am using version 4.1 of AB (Afterburner).
> 
> Do I have to do a BIOS change to get more voltage? I can only get 1200 MHz on the GPU and 1290 MHz on the memory at +100 mV.
> 
> any ideas?


Here:


Spoiler: Unlocking:



Just use /wi4,30,8d,10 for +100 mV. The last parameter is a hexadecimal offset in 6.25 mV steps: 0x10 = 16 in decimal, and 16 × 6.25 = 100 mV. For +50 mV you need 8; for +200 mV you need 20 (0x20 = 32 in decimal, and 32 × 6.25 = 200 mV).

The easy way to make the change:

Create a txt file on the desktop and write:
CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /wi4,30,8d,10

then save it as a .bat file. Every time you run this bat file, MSI Afterburner will start with +100 mV.

For +50 mV: 8
For +100 mV: 10
For +125 mV: 14
For +150 mV: 18
For +175 mV: 1C
For +200 mV: 20

I wouldn't go over this point because:
1) You are close to leaving the sweet spot of the reference PCB's VRM efficiency.
2) These commands add 200 mV on top of the +100 mV offset available through the AB GUI, which means 300 mV total.

By default the /wi command applies to the current GPU only, so if you have 2 or more GPUs you must use the /sg command. The command line then looks something like:
ex: MSIAfterburner.exe /sg0 /wi6,30,8d,10 /sg1 /wi6,30,8d,10


----------



## maynard14

Quote:


> Originally Posted by *numbrs*
> 
> It depends what you're using it for. The 290X is great for Folding@Home, but I think a 970 would be a step backwards, because I'm pretty sure the 290X you own has a 512-bit bus and 4 GB of GDDR5. I think a decision between those cards is mostly personal opinion, but the 290X is considered an enthusiast-grade card.
> 
> Have you overclocked it at all?
> 
> What PSU are you using for your build? And is the black screen your computer crashing, or more of a "display crash", where the screen fixes itself after you let it refresh?
> 
> Try turning off any overclocks, and check the GPU temperature through MSI Afterburner when you try to re-create the crash. Also, make sure you are up to date on all the CCC (Catalyst Control Center) drivers, and Google around for black-screen issues with the reference cards.
> 
> TL;DR I wouldn't return the card unless you get 85-90% of the money back, let alone a full refund. Personally I think a 970 would be a downgrade, but it might save you from more hassle in the future.
> 
> Regards,


Hi bro, I'm using the card as-is, I mean no overclock and no mods done, except that I'm using an NZXT G10 and Corsair H105 with VRM heatsinks from Gelid. It is running great now that I changed my PCIe 8-pin and 6-pin, but before that I would get a black screen on my monitor while gaming. It would crash, I couldn't move my mouse, and there was no sound, so I had to manually turn it off and restart. But I think I've fixed it now, because I think the PCIe sleeved extension was the problem.

Also, my driver is up to date (latest 15.4 beta) and I'm using a Corsair TX750M PSU.

I see, so the 970 is a downgrade from the 290X. I might as well keep it as-is. I never did OC my 290X because it's really fast and plenty for my games; it just came to my mind to switch to Nvidia because of my crashes while gaming. Temps are great, btw: 58C on the GPU core, 61C on VRM2 and 76C on VRM1.

Thanks again, I'll just keep my 290X.


----------



## Mbbx

Hi Numbers

Thanks for the reply.

Although I have managed to get it to work on all 4 cards, I find it always crashes at +175 mV and +200 mV, even at the same overclock I achieved at +100 mV!

Has anyone else found this?


----------



## Agent Smith1984

If you like benchmarking as much as or more than gaming, or if you play at 1080p and never plan to go dual-GPU, then the 970 is still a VERY good option....

I think of the GTX 970 as a 240hp/165lbs I-4 in a 2500lbs Honda Civic.
I think of the 290x as a 300HP/265lbs V6 in a 3000lbs sedan.

You can see where they both have strengths and weaknesses.....

But if you already have the 290, then going to a 970 would make no sense at all. It's a sidegrade, unless you only look at Firestrike results..


----------



## boot318

Quote:


> Originally Posted by *Tonza*
> 
> Decided to buy an Accelero Xtreme IV and try for myself whether I could install it on a DCII 290 card. Yes, it works, and it works flawlessly (I did not use the poor VRM backplate cooling, since it doesn't fit the DCII PCB; I used the original VRM heatsink, left the memory and VRM1 without any heatsinks, as they are stock on this card, and kept the stock DCII backplate).
> 
> Anyway, here are some tests I made in a one-hour GTA V session:
> 
> *DCII stock cooling @ 55% jet-engine speed with -38 mV and stock clocks (yes, the thing needed to be undervolted to avoid throttling; absolutely crappy cooling)*
> 
> Core: around 86-90C (it went to 94C and started to throttle at stock voltages)
> 
> VRMs were around 70-80C, which was fine. I don't have exact readings, but the thing went crazy hot, and I reseated it a couple of times with Phobya paste (the problem is that only 2 of the 4 heatpipes make proper contact with the 290 core, GG Asus). The DCII cooler also sounded like a jet engine.
> 
> *Accelero Xtreme IV @ 100% fan speed with -38 mV and stock clocks (wanted to see how it performs first without any OC)*
> 
> Core: 65C
> VRM1: 64C
> VRM2: 54C
> 
> Now for the fun part.
> 
> *Accelero Xtreme IV with +50 mV and 1125/1400 clocks*
> 
> Core: hovers between 69-70C
> 
> VRM1: 77C
> VRM2: 62C
> 
> An absolutely insane difference. Also, now I can play without headphones, and my face doesn't flap from the wind force coming off the DCII cooler. Tomorrow I'm going to install heatsinks on the memory and VRM1 when the thermal glue arrives (I have a package of heatsinks from an old Alpenföhn Peter cooler which I installed on a GTX 580 back in the day), though I don't think it's even necessary.
> 
> Arctic states that the Xtreme IV is not compatible with the 290 DCII card, but that is BS; you just need steady hands, and to install it without the spacers and without the poor VRM heatsink (which won't fit).


If you use good-quality thermal pads with the stock DCII VRM heatsink, it will be alright. I've seen a 15C difference in VRM temps by using some stuff I had laying around the house versus the stock thermal pads. It can only get better with high-end thermal pads.


----------



## numbrs

Quote:


> Originally Posted by *maynard14*
> 
> Hi bro, I'm using the card as-is, I mean no overclock and no mods done, except that I'm using an NZXT G10 and Corsair H105 with VRM heatsinks from Gelid. It is running great now that I changed my PCIe 8-pin and 6-pin, but before that I would get a black screen on my monitor while gaming. It would crash, I couldn't move my mouse, and there was no sound, so I had to manually turn it off and restart. But I think I've fixed it now, because I think the PCIe sleeved extension was the problem.
> 
> Also, my driver is up to date (latest 15.4 beta) and I'm using a Corsair TX750M PSU.
> 
> I see, so the 970 is a downgrade from the 290X. I might as well keep it as-is. I never did OC my 290X because it's really fast and plenty for my games; it just came to my mind to switch to Nvidia because of my crashes while gaming. Temps are great, btw: 58C on the GPU core, 61C on VRM2 and 76C on VRM1.
> 
> Thanks again, I'll just keep my 290X.


Glad to hear you figured it out.
Have a nice day


----------



## numbrs

Quote:


> Originally Posted by *Mbbx*
> 
> Hi Numbers
> 
> Thanks for the reply.
> 
> Although I have managed to get it to work on all 4 cards, I find it always crashes at +175 mV and +200 mV, even at the same overclock I achieved at +100 mV!
> 
> Has anyone else found this?


I don't really know, I've never done CrossFireX, but I'm sure it's buried somewhere in Afterburner.

Have you changed any startup options?

Perhaps Google some similar setups and ask around as to how people got them all to work in tandem.


----------



## LandonAaron

Quote:


> Originally Posted by *Mbbx*
> 
> Hi Numbers
> 
> Thanks for the reply.
> 
> Although I have managed to get it to work on all 4 cards, I find it always crashes at +175 mV and +200 mV, even at the same overclock I achieved at +100 mV!
> 
> Has anyone else found this?


At 1080p 60 Hz I can go all the way up to +200 mV no problem and overclock up to 1200/1600. But for some reason at 1440p 96 Hz I can't go over +100 mV without black-screening. I think this may be related to my monitor, which is one of the Korean overclockable monitors, a 27" Qnix to be exact. Anyway, yeah, it's kind of strange to think that the card could be less stable with more voltage, but that is definitely the case for me, and I have found +75 mV at 1100/1500 to be the optimal OC for my cards.


----------



## pengs

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I think of the GTX 970 as a 240hp/165lbs I-4 in a 2500lbs Honda Civic
> I think of the 290x as a 300HP/265lbs V6 in a 3000lbs sedan.
> 
> You can see where they both have strengths and weaknesses.....


Good analogy even if it breaks the cardinal computer analogy laws
















Torque vs. turbo is how I view it; Hawaii is definitely not bandwidth-limited or cut down in any areas that could cause core starvation, even if the shaders happen to be slightly slower clock-for-clock (which I'm pretty certain we can't gauge for sure, given that the two architectures are so different). But everything is governed by physics at the end of the day.


----------



## Dooderek

Quote:


> Originally Posted by *pengs*
> 
> Good analogy even if it breaks the cardinal computer analogy laws
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Torque vs. turbo is how I view it; Hawaii is definitely not bandwidth-limited or cut down in any areas that could cause core starvation, even if the shaders happen to be slightly slower clock-for-clock (which I'm pretty certain we can't gauge for sure, given that the two architectures are so different). But everything is governed by physics at the end of the day.


Ahhhhh, sorry, I am a car guy and I cannot resist!

Do you mean naturally aspirated vs. turbo? Torque will always accompany forced induction.


----------



## maynard14

Quote:


> Originally Posted by *numbrs*
> 
> Glad to hear you figured it out. Rep me? Lololol.
> 
> Have a nice day


ahaha yeah bro, thanks again


----------



## maynard14

Quote:


> Originally Posted by *Agent Smith1984*
> 
> If you like benchmarking as much or more than gaming, or if you play at 1080 and never plan to go dual GPU, then the 970 is still a VERY good option....
> 
> I think of the GTX 970 as a 240hp/165lbs I-4 in a 2500lbs Honda Civic
> I think of the 290x as a 300HP/265lbs V6 in a 3000lbs sedan.
> 
> You can see where they both have strengths and weaknesses.....
> 
> But if you already have the 290, then going 970 would make no sense at all. It's a sidegrade, unless you just look at Firestrike results..


Yeah, I'm playing on a 1080p 120 Hz monitor, but I realize it's just a sidegrade. And I only play for 2 hours a day, sometimes only 1 hour, haha, so I think the power bill is not a problem. But I do plan to CF my 290X in the future with no overclock, so I think the 750 W PSU from Corsair is enough.


----------



## LandonAaron

Is there a setting in CCC to keep the second monitor active while a full screen 3d application (game) is running?


----------



## Cyber Locc

Quote:


> Originally Posted by *Mbbx*
> 
> Hi Numbers
> 
> Thanks for the reply.
> 
> Although I have managed to get it to work on all 4 cards, I find it always crashes at +175 mV and +200 mV, even at the same overclock I achieved at +100 mV!
> 
> Has anyone else found this?


Yes I have, I think it's common. One of my cards can get 1275 on the core, but the other one won't go past 1150 no matter how much voltage I give it. I have given it up to 1.4 V (using the Asus BIOS) and it still won't go over 1150; it just will not happen at all.
So if I overclock my cards together, I can't get over 1150. If you're overclocking all 4 cards at once (which you have to, unless they're removed or have their lanes shut off), then it could be that one card has hit its limits.

However, I think your issue is not the same, as you said you have 4 cards. 4 cards at +200 mV is something like 1,500 W just for the GPUs, so unless you have two high-end 1,000 W PSUs or better, that ain't going to happen, bud. That could be causing your crashing, as the system might not be able to supply the full wattage at those clocks, or it could just be one card holding you back.

Try using one card at a time and overclocking them. If they're on air, just pull them out; if they're on water and your board has PCIe switches, then use those.
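The rough wattage claim above can be sanity-checked with the usual dynamic-power rule of thumb (power scales roughly with clock times voltage squared). The stock figures below (~290 W board power at ~1.18 V and 1000 MHz for a reference 290X) are ballpark assumptions for illustration, not measured values:

```python
# Rough estimate of per-card and total power draw for overvolted 290Xs.
# Dynamic power scales approximately with frequency * voltage^2.
# The stock numbers are ballpark assumptions for a reference 290X.
STOCK_POWER_W = 290.0   # approximate board power at stock (assumption)
STOCK_VOLTS = 1.18      # approximate stock core voltage (assumption)
STOCK_CLOCK_MHZ = 1000.0

def estimated_power(volts, clock_mhz):
    scale = (clock_mhz / STOCK_CLOCK_MHZ) * (volts / STOCK_VOLTS) ** 2
    return STOCK_POWER_W * scale

# +200 mV at 1200 MHz on each of four cards:
per_card = estimated_power(STOCK_VOLTS + 0.200, 1200)
print(f"per card: {per_card:.0f} W, four cards: {4 * per_card:.0f} W")
```

Even with these rough inputs, four heavily overvolted cards land well past 1,500 W for the GPUs alone, which supports the point about needing two very capable PSUs.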


----------



## Arizonian

Quote:


> Originally Posted by *Banda*
> 
> Finally I got a r9 290x!
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/details.php?id=epahe


Congrats - added








Quote:


> Originally Posted by *pcrevolution*
> 
> Hi sir please add me as well!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Happy owner of a 290x with an AC backplate waiting for summer to complete a custom loop build.


Added you seven members ago. I took your word for it and it's listed under water. But it doesn't get you out of showing us a pic when you do.









Nice PSU cables btw.
Quote:


> Originally Posted by *ShinGoutetsu*
> 
> Can I join? Sapphire R9 290X Vapor-X 4gb OC crossfire with ek full cover blocks and backplates
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s31.photobucket.com/user/gou...3-4A09-9743-FE2A307CA2F0_zps0bhntrjk.jpg.html
> http://s31.photobucket.com/user/gou...1-476C-B6D0-6D5646D0685F_zpsacgdfduy.jpg.html


Congrats - added
















And as far as the work in your rig.....







Real clean man.
Quote:


> Originally Posted by *numbrs*
> 
> Hello,
> 
> MSI R9 290x Lightning Edition, STOCK
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Thanks


Congrats - added


----------



## mfknjadagr8

OK, so our discussion about blocks got me wondering what actual VRM temps I'm getting, so I fired up GTA V on max settings, aside from the advanced graphics, and the AA option is only at 2x; however, I'm downsampling from a 3100 resolution...

One thing I noticed is that ULPS seems to have turned itself back on... AGAIN, even though the registry values are still at 0.... this is kind of annoying.....


----------



## pengs

Quote:


> Originally Posted by *Dooderek*
> 
> Ahhhhh sorry I am a car guy and i cannot resist!
> 
> Do you mean Naturally Aspirated vs Turbo? Torque will always accompany forced induction.


Ah, yeah, that must be it. I'm not much into cars other than driving them.







I think a better analogy might have been a V8 vs. a V6 with a supercharger, but I'm venturing into uncharted waters.


----------



## rdr09

Quote:


> Originally Posted by *mfknjadagr8*
> 
> OK, so our discussion about blocks got me wondering what actual VRM temps I'm getting, so I fired up GTA V on max settings, aside from the advanced graphics, and the AA option is only at 2x; however, I'm downsampling from a 3100 resolution...
> 
> One thing I noticed is that ULPS seems to have turned itself back on... AGAIN, even though the registry values are still at 0.... this is kind of annoying.....


Why do you say that? They seem to be loading evenly (see the Maximum readings).


----------



## rdr09

Quote:


> Originally Posted by *Mbbx*
> 
> Hi Numbers
> 
> Thanks for the reply.
> 
> Although I have managed to get it to work on all 4 cards, I find it always crashes at +175 mV and +200 mV, even at the same overclock I achieved at +100 mV!
> 
> Has anyone else found this?


Prolly the PSU(s)?


----------



## mfknjadagr8

Quote:


> Originally Posted by *rdr09*
> 
> why you say that? they seem to be loading evenly (see Maximum readings).


Look at my values: after idle it dropped the clocks on both cards to minimums... I was under the impression the cards didn't power-save like that with ULPS off... they didn't the first time I disabled it... also, VRM temps are much better than we discussed here before... also, the core temps on the CPU happened when closing Call of Duty to open GTA V right after starting... it hovered from 50 to 53 max while gaming.


----------



## Jflisk

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Look at my values: after idle it dropped the clocks on both cards to minimums... I was under the impression the cards didn't power-save like that with ULPS off... they didn't the first time I disabled it... also, VRM temps are much better than we discussed here before... also, the core temps on the CPU happened when closing Call of Duty to open GTA V right after starting... it hovered from 50 to 53 max while gaming.


Looks like 1 GPU under water by the temps. When I use ULPS (I have 3 cards), it keeps the other 2 cards on all the time; at least, that is why I disable it.
I checked my system: the HWinfo you posted looks right with ULPS off.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Jflisk*
> 
> Looks like 1 GPU under water by the temps. When I use ULPS (I have 3 cards), it keeps the other 2 cards on all the time; at least, that is why I disable it.
> I checked my system: the HWinfo you posted looks right with ULPS off.


ULPS is supposed to lower the clocks and power draw when they're not needed, so why would it lower them with it off?... This is two cards under water; I just moved several of the values around to get them on the same screen to monitor while gaming... When I disabled ULPS the first time, it forced the GPUs to stay at the reference clocks (I haven't bothered to overclock until I install a second pump) for both core and memory... then one day I noticed the second card wasn't working... and the clocks were resetting to the ULPS clocks again... a wipe and reinstall of the drivers fixed this, and after re-disabling ULPS the clocks stayed again... now this is happening for the third time, only this time the second card is working OK... both times the ULPS values in the registry were still at 0, and "disable ULPS" in AB is checked just for good measure... it doesn't make sense.
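For reference, the registry tweak being discussed is usually applied by setting the EnableUlps DWORD to 0 under each GPU's entry in the display-adapter class key. Treat the fragment below as a sketch of the idea rather than a paste-ready file: the four-digit subkey numbers (0000, 0001, ...) vary per system, so you need to check which ones actually hold your AMD driver entries.

```reg
Windows Registry Editor Version 5.00

; Disable ULPS for each AMD GPU instance under the display-adapter class key.
; The 0000/0001 subkey numbers vary per system - verify which subkeys contain
; your AMD entries before applying.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0001]
"EnableUlps"=dword:00000000
```

Driver reinstalls recreate these keys with defaults, which would explain the value appearing to "turn itself back on" after a driver update.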


----------



## Jflisk

Quote:


> Originally Posted by *mfknjadagr8*
> 
> ULPS is supposed to lower the clocks and power draw when they're not needed, so why would it lower them with it off?... This is two cards under water; I just moved several of the values around to get them on the same screen to monitor while gaming... When I disabled ULPS the first time, it forced the GPUs to stay at the reference clocks (I haven't bothered to overclock until I install a second pump) for both core and memory... then one day I noticed the second card wasn't working... and the clocks were resetting to the ULPS clocks again... a wipe and reinstall of the drivers fixed this, and after re-disabling ULPS the clocks stayed again... now this is happening for the third time, only this time the second card is working OK... both times the ULPS values in the registry were still at 0, and "disable ULPS" in AB is checked just for good measure... it doesn't make sense.


Never really looked at it till you mentioned it . But my clocks move from 300 up to 1050 GPU clock- Memory 150 to 1250 for the stages of usage.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Dooderek*
> 
> Ahhhhh sorry I am a car guy and i cannot resist!
> 
> Do you mean Naturally Aspirated vs Turbo? Torque will always accompany forced induction.


I was thinking the 970 is more of a high-HP, low-torque NA 4-cylinder motor..... the 290 is a larger, less efficient V6 that is carrying more weight but makes a lot more torque....


----------



## PsYcHo29388

I'm gonna be getting a 290 sometime today or this week. What brand would you guys recommend for just running at stock?


----------



## tsm106

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jflisk*
> 
> Looks like 1 GPU under water by the temps. When I use ULPS with my 3 cards, it keeps the other 2 cards on all the time. At least that is why I disable it.
> Checked my system; the HWiNFO you posted looks right with ULPS off.
> 
> 
> 
> ULPS is supposed to lower the clocks and power draw when not needed, so why would it lower them with it off...this is two cards under water, I just moved several of the values to get them on the same screen to monitor while gaming...when I disabled ULPS the first time it forced the GPUs to stay at the reference clocks (haven't bothered to overclock until I add a second pump) for both core and memory...then one day I noticed the second card wasn't working...and clocks were resetting to the ULPS clocks again...a wipe and reinstall of drivers fixed this, and after re-disabling ULPS the clocks stayed again...now this is happening for the third time, only this time the second card is working OK...both times the ULPS values in the registry were still 0, and the disable ULPS box in AB is checked just for good measure...doesn't make sense

It's because your definition of ULPS is incorrect. You're attributing things to ULPS that it is not. ULPS puts slave cards in crossfire to sleep. That is all. Clocks going up and down or not to idle are a whole other bag of beans.


----------



## Agent Smith1984

Quote:


> Originally Posted by *PsYcHo29388*
> 
> I'm gonna be getting a 290 sometime today or this week. What brand would you guys recommend for just running at stock?


Sapphire Tri-x 290....

Reasons?

1. Much quieter under load than reference design
2. The price is great right now, especially after rebates
3. The cooler will keep the core around 67-70°C under max load, versus the 90°C+ a reference card will hit.
4. I use two, and love them.....


----------



## PsYcHo29388

I missed the sale on that one...it's now $272 after the rebate.

damnit.


----------



## pengs

Quote:


> Originally Posted by *PsYcHo29388*
> 
> I missed the sale on that one...it's now $272 after the rebate.
> 
> damnit.


Yeah stock is running out I believe. Amazon has the 290 nailed at $300 and the 290X at $289.


----------



## mfknjadagr8

Quote:


> Originally Posted by *tsm106*
> 
> It's because your definition of ULPS is incorrect. You're attributing things to ULPS that it is not. ULPS puts slave cards in crossfire to sleep. That is all. Clocks going up and down or not to idle are a whole other bag of beans.


ok so what does determine the clocks staying persistent? I would like my clocks to always be at the set value, or within a small margin due to some voltage drift.....I don't care about power consumption at idle, or at all for that matter


----------



## rdr09

Quote:


> Originally Posted by *mfknjadagr8*
> 
> ok so what does determine the clocks staying persistent? I would like my clocks to always be at the set value, or within a small margin due to some voltage drift.....I don't care about power consumption at idle, or at all for that matter


resolution, game load, CPU, etc. i would not want my 290s at load even when just browsing, unless i missed something.

if you have a 1080p monitor, then it may be hard to load those 290s. when i had 1080p in BF4 MP, i had to raise the resolution scale to at least 135% to load the GPUs.


----------



## mfknjadagr8

Quote:


> Originally Posted by *rdr09*
> 
> resolution, game load, CPU, etc. i would not want my 290s at load even when just browsing, unless i missed something.


I very rarely just browse...most of the time I turn on my computer to game...that's about it, so persistent clocks is what I've always done, but these are my first AMD cards since my X1300 or the 4800 (can't remember which one I upgraded from/to), so I'm not sure how to force this to occur, or if I even can...are most of you guys using RadeonPro to force your crossfire profiles and such?

I've noticed that anytime I get stutter, it's because utilization drops to zero on one card or the other for no apparent reason, sometimes when they are running 30% load, sometimes when they are in the upper 80s...temps are not high on anything...no throttle on any clocks and no drops in voltages that I can see...but I would love to figure this one out...I first thought throttle, but clocks are slammed at reference speeds the whole time, and the drop to zero lasts half a second or less, just enough to stutter like a bastard...then it recovers and tears it up again... I also downsample almost all my games from 3100x2150 or whatever it is (can't remember off the top of my head), but I also make sure I'm not hitting the VRAM limit...most I've seen so far is 3.542 or close


----------



## rdr09

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I very rarely just browse...most of the time I turn on my computer to game...that's about it, so persistent clocks is what I've always done, but these are my first AMD cards since my X1300 or the 4800 (can't remember which one I upgraded from/to), so I'm not sure how to force this to occur, or if I even can...are most of you guys using RadeonPro to force your crossfire profiles and such?


do you have 1080?


----------



## mfknjadagr8

Quote:


> Originally Posted by *rdr09*
> 
> do you have 1080?


yes I edited my post


----------



## rdr09

Quote:


> Originally Posted by *mfknjadagr8*
> 
> yes I edited my post


which games are you having issues with? next time you play a game . . . do you mind running 2 instances of GPU-Z in the background, like so . . .



see the drop-downs? set the temps to max. make sure other apps such as Afterburner are closed. also, in CCC, make sure Overdrive is disabled.

edit: how much OC do you have on your CPU?
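Alongside watching the sensor windows live, GPU-Z can log its sensors to a CSV ("Log to file" on the sensors tab), which makes those sub-second drops easy to find after a session. A rough sketch of scanning such a log for samples where GPU load falls to zero between otherwise-busy samples (the column name is an assumption; match it to whatever header your GPU-Z version actually writes):

```python
import csv, io

def find_load_drops(csv_text, load_col="GPU Load [%]", busy_threshold=20):
    """Return row indices where load hits 0 while the neighbouring samples were busy."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    loads = [float(r[load_col]) for r in rows]
    drops = []
    for i in range(1, len(loads) - 1):
        if loads[i] == 0 and loads[i - 1] >= busy_threshold and loads[i + 1] >= busy_threshold:
            drops.append(i)
    return drops

# Tiny synthetic log standing in for a real GPU-Z capture.
sample = """Date,GPU Load [%]
10:00:01,72
10:00:02,70
10:00:03,0
10:00:04,68
"""
print(find_load_drops(sample))  # -> [2]
```

Running one log per card and comparing the flagged timestamps would show whether the cards drop together or alternate, which is useful when arguing driver vs. throttling.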


----------



## mfknjadagr8

Quote:


> Originally Posted by *rdr09*
> 
> which games are you having issues with? next time you play a game . . . do you mind running 2 instances of GPU-Z in the background, like so . . .
> 
> 
> 
> see the drop-downs? set the temps to max. make sure other apps such as Afterburner are closed. also, in CCC, make sure Overdrive is disabled.
> 
> edit: how much OC do you have on your CPU?


I have my overclock at 4.8, rock steady in everything I can throw at it: 19 hours of Prime, IBT AVX very high 30 runs...the games I've noticed it in are Advanced Warfare (not a good test of anything, as the crossfire profile makes it run horribly) and GTA 5...I never noticed this in BF4...but yeah, I can run that...I also don't use Overdrive


----------



## MerkageTurk

Hello, fellow 290X owners,

I am new to AMD and was wondering how to overclock a Vapor-X 290X.


----------



## By-Tor

I just replaced my pair of 7950s with a PowerColor R9 290X LCS and love it. Hoping to add a second card soon.

Please add me...


----------



## tsm106

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> It's because your definition of ULPS is incorrect. You're attributing things to ULPS that it is not. ULPS puts slave cards in crossfire to sleep. That is all. Clocks going up and down or not to idle are a whole other bag of beans.
> 
> 
> 
> ok so what does determine the clocks staying persistent? I would like my clocks to always be at the set value, or within a small margin due to some voltage drift.....I don't care about power consumption at idle, or at all for that matter

I don't understand what clocks have to do with voltage drift? You mean droop? Clocks to be at what setting? If your gpu usage is going to zero, you most likely do not have unified gpu monitoring enabled. It's too complicated to explain w/o writing an essay on how to control clocks because it touches upon other fundamentals. In the spoilered sections, I explain in more detail how each aspect works. It's all interrelated. Btw, we really disable ULPS because it interferes with ocing apps, especially with unofficial mode. Otherwise, I'd never disable it lol.

http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40


----------



## kizwan

Quote:


> Originally Posted by *mfknjadagr8*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> resolution, game load, CPU, etc. i would not want my 290s at load even when just browsing, unless i missed something.
> 
> 
> 
> 
> 
> 
> I very rarely just browse...most of the time I turn on my computer to game...that's about it, so persistent clocks is what I've always done, but these are my first AMD cards since my X1300 or the 4800 (can't remember which one I upgraded from/to), so I'm not sure how to force this to occur, or if I even can...are most of you guys using RadeonPro to force your crossfire profiles and such?
> 
> *I've noticed that anytime I get stutter, it's because utilization drops to zero on one card or the other for no apparent reason, sometimes when they are running 30% load, sometimes when they are in the upper 80s*...temps are not high on anything...no throttle on any clocks and no drops in voltages that I can see...but I would love to figure this one out...I first thought throttle, but clocks are slammed at reference speeds the whole time, and the drop to zero lasts half a second or less, just enough to stutter like a bastard...then it recovers and tears it up again... I also downsample almost all my games from 3100x2150 or whatever it is (can't remember off the top of my head), but I also make sure I'm not hitting the VRAM limit...most I've seen so far is 3.542 or close

Monitor CPU usage too. At 1080p, the CPU can easily bottleneck when running multiple 290s. When GPU usage drops and CPU usage spikes, the CPU is bottlenecking. Use VSR or increase the resolution scale to counter this.


----------



## Cyber Locc

Quote:


> Originally Posted by *mfknjadagr8*
> 
> ok so what does determine the clocks staying persistent? I would like my clocks to always be at the setting or within a small margin due to some voltage drift.....I don't care about power consumption during idle our at ask for that matter


Are your clocks fluctuating, or is it something else? I'm seeing you say different things: you ask here how to keep clocks persistent, but earlier you said no throttle on any clocks and no drops in voltages that you can see.

That said, if your clocks are dropping and you find a fix, let me know, because I'm having the same issue. I have also tried everything everyone has told you and then some. Disabling PowerTune has no effect on it, following tsm106's guide no effect, following the stop-downclocking guide in the OP no effect. As for the other suggestion of it being a resolution issue, I thought that at first, borrowed a 4K monitor and used 1 card, and it still downclocks. I have not a clue what to do or why it does this. It should not be at 800-1000MHz half the time while benching, yet it is: overclocked to 1200MHz it only hits 1200MHz for a few seconds a few times during a Heaven bench, and hits everywhere in between, when using Afterburner. If I use GPU Tweak it negates the issue somewhat: it stays at the overclock most of the time and drops less often, but it still drops pretty frequently. And my temps are all under 50; even the VRMs barely get over 50°C, so there is absolutely no reason for this. I have also tried 1 card at a time and crossfire, and I have even moved OSes (7 to 8.1) hoping to stop it; still nope. And my 4820K is plenty strong, so a bottleneck it is not.


----------



## ajarocena

Is my Seasonic M2II 650W PSU enough to run a Sapphire R9 290X Tri-X? (The spec sheet asks for at least 750W.)
My processor runs at stock speed, but I plan to overclock it someday.
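As a rough sanity check while waiting for the full rig details: add up the big consumers and compare against the PSU rating. The draw figures below are ballpark estimates for illustration, not measured numbers or official specs:

```python
# Ballpark power budget for a single stock R9 290X system.
# All wattages are rough estimates, not manufacturer specifications.
draws = {
    "R9 290X (stock, peak gaming)": 290,
    "quad-core CPU (stock)": 95,
    "motherboard/RAM/drives/fans": 75,
}

total = sum(draws.values())
headroom = 650 - total  # against the Seasonic 650W rating
print(total, headroom)  # -> 460 190
```

On paper a quality 650W unit has room at stock; the 750W figure on the box builds in margin for overclocking (which can add well over 100W on a 290X) and for cheaper supplies that can't sustain their label on the 12V rail.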


----------



## mfknjadagr8

Quote:


> Originally Posted by *cyberlocc*
> 
> Are your clocks fluctuating, or is it something else? I'm seeing you say different things: you ask here how to keep clocks persistent, but earlier you said no throttle on any clocks and no drops in voltages that you can see.
> 
> That said, if your clocks are dropping and you find a fix, let me know, because I'm having the same issue. I have also tried everything everyone has told you and then some. Disabling PowerTune has no effect on it, following tsm106's guide no effect, following the stop-downclocking guide in the OP no effect. As for the other suggestion of it being a resolution issue, I thought that at first, borrowed a 4K monitor and used 1 card, and it still downclocks. I have not a clue what to do or why it does this. It should not be at 800-1000MHz half the time while benching, yet it is: overclocked to 1200MHz it only hits 1200MHz for a few seconds a few times during a Heaven bench, and hits everywhere in between, when using Afterburner. If I use GPU Tweak it negates the issue somewhat: it stays at the overclock most of the time and drops less often, but it still drops pretty frequently. And my temps are all under 50; even the VRMs barely get over 50°C, so there is absolutely no reason for this. I have also tried 1 card at a time and crossfire, and I have even moved OSes (7 to 8.1) hoping to stop it; still nope. And my 4820K is plenty strong, so a bottleneck it is not.


you are reading two separate scenarios as one...my clocks drop only at idle. I would like to keep them persistent at the stock speeds of 947 and 1250, or whatever the defaults are, at all times, not just under load...under load I see no clock fluctuations, not even a MHz...


----------



## kizwan

Quote:


> Originally Posted by *ajarocena*
> 
> Is my Seasonic M2II 650W PSU enough to run a Sapphire R9 290X Tri-X? (The spec sheet asks for at least 750W.)
> My processor runs at stock speed, but I plan to overclock it someday.


Fill in Rigbuilder & attach it to your signature. Let's see what you've got.


----------



## mfknjadagr8

Quote:


> Originally Posted by *tsm106*
> 
> I don't understand what clocks have to do with voltage drift? You mean droop? Clocks to be at what setting? If your gpu usage is going to zero, you most likely do not have unified gpu monitoring enabled. It's too complicated to explain w/o writing an essay on how to control clocks because it touches upon other fundamentals. In the spoilered sections, I explain in more detail how each aspect works. It's all interrelated. Btw, we really disable ULPS because it interferes with ocing apps, especially with unofficial mode. Otherwise, I'd never disable it lol.
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40


correct, I do not have unified GPU monitoring enabled...yes, I mean droop, not drift...I would like my clocks to stay persistent at what they are set to at all times (for this scenario it's stock clocks), i.e. no drop at idle...disabling ULPS all but stopped my black screens in crossfire mode as long as temps were in check...but they are watered now so no issue...I will read that when I get a few minutes, thanks


----------



## mfknjadagr8

Quote:


> Originally Posted by *kizwan*
> 
> Monitor CPU usage too. At 1080p, CPU can easily bottleneck when running multiple 290s. When the gpu usage drops & cpu usage spiking, then CPU is bottlenecking. Use VSR or increase resolution scale to counter this.


already downsampling from 3100 res to 1080...also, CPU usage is rarely above 65 percent on all cores...aside from Call of Duty and its bull crap when loading maps


----------



## Cyber Locc

Quote:


> Originally Posted by *mfknjadagr8*
> 
> correct, I do not have unified GPU monitoring enabled...yes, I mean droop, not drift...I would like my clocks to stay persistent at what they are set to at all times (for this scenario it's stock clocks), i.e. no drop at idle...disabling ULPS all but stopped my black screens in crossfire mode as long as temps were in check...but they are watered now so no issue...I will read that when I get a few minutes, thanks


Okay, so you don't want your clocks to drop to 2D, just to stay at whatever you set them? Easy peasy: flash the PT1 BIOS and they will not move. The minimum clocks for PT1 are 1000/1250, however, and they will always stay at what you set them. Just a heads up, though: this will suck a lot of electricity.


----------



## tsm106

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I don't understand what clocks have to do with voltage drift? You mean droop? Clocks to be at what setting? If your gpu usage is going to zero, you most likely do not have unified gpu monitoring enabled. It's too complicated to explain w/o writing an essay on how to control clocks because it touches upon other fundamentals. In the spoilered sections, I explain in more detail how each aspect works. It's all interrelated. Btw, we really disable ULPS because it interferes with ocing apps, especially with unofficial mode. Otherwise, I'd never disable it lol.
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40
> 
> 
> 
> correct, I do not have unified GPU monitoring enabled...yes, I mean droop, not drift...I would like my clocks to stay persistent at what they are set to at all times (for this scenario it's stock clocks), i.e. no drop at idle...*disabling ULPS all but stopped my black screens in crossfire* mode as long as temps were in check...but they are watered now so no issue...I will read that when I get a few minutes, thanks

You're suffering from black screens. Is that why you want to run 3D clocks on the desktop? That's a pretty serious band-aid, you know.


----------



## Batpimp

i bought two of the R9 290 PowerColor PCS+.

Very happy.

The 1st was 200 bucks,

the 2nd was 235 after MIR and a 15-dollar PayPal CC cash back.


----------



## LandonAaron

Quote:


> Originally Posted by *Batpimp*
> 
> 
> 
> 
> 
> 
> i bought two of the r9 290 Power color PCS+
> 
> Very happy.
> 
> The 1st was 200 bucks
> 
> the 2nd was 235 after MIR and 15 dollar paypal CC cash back.


It is a very good time to be a PC gamer. Two years ago I would never have dreamed of being able to get this kind of horsepower for so cheap.


----------



## mfknjadagr8

Quote:


> Originally Posted by *tsm106*
> 
> You're suffering from black screens. Is that why you want to run 3D clocks on the desktop? That's a pretty serious band-aid, you know.


I am no longer suffering from black screens...disabling ULPS helped with most of it, and going under water stopped it completely...the only issue I'm experiencing now is the usage dropping to 0 randomly, causing massive stutter. It's not just one card, though they rarely do it at the same time, and it causes massive frame drops...which is extremely annoying from $700 worth of cards and blocks







at first, the only game I noticed it in was Advanced Warfare, so I chalked it up to typical Call of Duty...but when I started playing GTA 5 I noticed it there as well...it's not like it hits 100% then drops; it never breaks 75 percent usage on either card, and the CPU doesn't break 70%...no clock changes or throttling that I can see in voltages or clocks...I could understand if I was bottlenecking, but I can't see anything that points to a bottleneck or throttle...that's why I don't understand...but tonight I will do as suggested and game with the GPU-Zs open and see what it looks like


----------



## DaUn3rD0g

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I am no longer suffering from black screens...disabling ULPS helped with most of it, and going under water stopped it completely...the only issue I'm experiencing now is the usage dropping to 0 randomly, causing massive stutter. It's not just one card, though they rarely do it at the same time, and it causes massive frame drops...which is extremely annoying from $700 worth of cards and blocks
> 
> 
> 
> 
> 
> 
> 
> at first, the only game I noticed it in was Advanced Warfare, so I chalked it up to typical Call of Duty...but when I started playing GTA 5 I noticed it there as well...it's not like it hits 100% then drops; it never breaks 75 percent usage on either card, and the CPU doesn't break 70%...no clock changes or throttling that I can see in voltages or clocks...I could understand if I was bottlenecking, but I can't see anything that points to a bottleneck or throttle...that's why I don't understand...but tonight I will do as suggested and game with the GPU-Zs open and see what it looks like


Could I ask what motherboard you have and if the BIOS is up to date on it?

I just got a new MSI 290X Gaming and I've had all sorts of issues with it; random black screens, even at idle, were probably the worst of the lot.

Anyway, I researched loads of different fixes that work for different people, but the only one that fixed it completely was updating my motherboard BIOS; I haven't had a single black screen since.

I don't know how or why it works, but it did for me; possibly the bridgeless crossfire thing? Mine's only a single card, though, so I don't know if that would even matter.

EDIT: never mind, I missed that you said it's gone completely now under water.


----------



## mfknjadagr8

Quote:


> Originally Posted by *DaUn3rD0g*
> 
> Could I ask what motherboard you have and if the BIOS is up to date on it?
> 
> I just got a new MSI 290X Gaming and I've had all sorts of issues with it; random black screens, even at idle, were probably the worst of the lot.
> 
> Anyway, I researched loads of different fixes that work for different people, but the only one that fixed it completely was updating my motherboard BIOS; I haven't had a single black screen since.
> 
> I don't know how or why it works, but it did for me; possibly the bridgeless crossfire thing? Mine's only a single card, though, so I don't know if that would even matter.
> 
> EDIT: never mind, I missed that you said it's gone completely now under water.


yes, it is, and I'm running a Sabertooth R2.0 with the latest BIOS...


----------



## LandonAaron

Quote:


> Originally Posted by *mfknjadagr8*
> 
> yes, it is, and I'm running a Sabertooth R2.0 with the latest BIOS...


Check your DPC latency with the DPC Latency Checker tool (use Google, I don't have the link). Anyway, I started having audio problems recently and discovered my DPC latency was going through the roof during gaming (>8000µs). I narrowed the cause down to either Asus's AI Suite or something with my video drivers. Changing the refresh rate from 60 to 96 to 60 will fix it. Also, having AI Suite open causes it. Anyway, whenever I have that straightened out, it fixes the audio and also reduces the stuttering in game a lot. I was also having one of my cards suddenly drop to 0% usage during gaming.


----------



## Cyber Locc

Quote:


> Originally Posted by *LandonAaron*
> 
> Check your DPC latency with the DPC Latency Checker tool (use Google, I don't have the link). Anyway, I started having audio problems recently and discovered my DPC latency was going through the roof during gaming (>8000µs). I narrowed the cause down to either Asus's AI Suite or something with my video drivers. Changing the refresh rate from 60 to 96 to 60 will fix it. Also, having AI Suite open causes it. Anyway, whenever I have that straightened out, it fixes the audio and also reduces the stuttering in game a lot. I was also having one of my cards suddenly drop to 0% usage during gaming.


I don't think that has anything to do with his issues.

However, I can assist you with this issue; we should really have it stickied on the front page for other people who hit it. The issue is caused by the graphics driver. Me and some other fine folks, both here and on a few other forums, helped AMD recreate it and figured out somewhat why it happens. However, that was almost a year ago, so it's safe to say a fix might not happen; here is how you can work around it.

For this issue to present, certain factors need to be in place:

1. Vsync must be enabled and in use (this can even count for desktop use of vsync). The issue presents with all flavors of vsync: double buffer, triple buffer, etc.

2. Crossfire must be enabled and both cards have to be utilized. (GPU-Z polling can also fill this requirement, as can having ULPS disabled, as long as crossfire is enabled.)

3. You must have a multi-monitor configuration with more than 1 screen on and active. This does not apply to users of Eyefinity, as all screens are then seen as one, I guess; at any rate, the issue doesn't present if all your displays are being used by Eyefinity. It does not matter what ports you use: if there is more than one screen turned on and active, you will have this issue.

Note: I do not know if this applies to users with USB-connected monitors, as no one from our discussions had one or wanted to buy one just to find out. If you or anyone else has such a monitor and could test that for me, I would be much obliged.

The fixes:

1. Disable or turn off all screens but 1 whenever using crossfire and vsync together.

2. Don't use vsync of any kind. This includes adaptive vsync, as the issue will still be present whenever vsync turns on. This is somewhat more difficult than the above fix, as some games have vsync on by default and disabling it can alter the game a lot (e.g. Skyrim). You may still use a frame limiter; however, trying to use both vsync and a limiter at 59 FPS will not work.

3. Buy new GPUs, as it honestly doesn't seem that AMD cares about fixing this issue. This late in the GPUs' life, and with the small pool of users it affects, it doesn't appear to be high on their priority list. If you think about it, only about 5-10% of owners of these cards run crossfire; of those, maybe 50% have more than 1 screen, and of those, probably another 5% aren't using Eyefinity, so you can see how quickly that pool shrinks. I would personally estimate fewer than 1000 users fit the requirements for this issue, which would explain the low amount of info and complaints about it.

4. Deal with the sound-crackling issues until, hopefully, there is a fix.

From what I have found (and I have been in pretty much every thread about this issue over the past year, including a bunch with AMD reps and techs), it only affects R9 290s, 290Xs, and 295X2s. I'm not saying it's not in other cards, only that I have not seen the issue recreated on other cards. I have, however, seen it recreated on pretty much every 290/295, including custom PCBs; it seems to be a driver issue and to have nothing to do with the card itself.

Spotting the issue is easy: as you have mentioned, you will get sound crackles that are pretty annoying, and with a latency checker you will see issues with dxgkrnl.sys, which is your DirectX kernel.

Hope I helped; sorry to give you the bad news.

Also, about your solution of "changing the refresh rate from 60 to 96 to 60 will fix it": can you elaborate? I assume it works because doing so while gaming breaks vsync, which is a source of the issue. This is the first I have heard of that fix, though, so please tell me more.
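For quick reference, the trigger conditions listed above boil down to a simple predicate, handy when talking someone through ruling the bug in or out (the function and argument names here are made up for the sketch, not anything official):

```python
def dpc_bug_possible(vsync_on, crossfire_on, active_screens, eyefinity=False):
    """True when all three reported trigger conditions are met:
    vsync in use, crossfire active, and multiple independent displays."""
    multi_display = active_screens > 1 and not eyefinity  # Eyefinity counts as one screen
    return vsync_on and crossfire_on and multi_display

print(dpc_bug_possible(True, True, 2))                   # -> True
print(dpc_bug_possible(True, True, 3, eyefinity=True))   # -> False
print(dpc_bug_possible(False, True, 2))                  # -> False
```

Negating any single input is exactly what fixes 1 and 2 above do: drop to one active screen, or turn vsync off.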


----------



## LandonAaron

Quote:


> Originally Posted by *cyberlocc*
> 
> I don't think that has anything to do with his issues.
> 
> However, I can assist you with this issue; we should really have it stickied on the front page for other people who hit it. The issue is caused by the graphics driver. Me and some other fine folks, both here and on a few other forums, helped AMD recreate it and figured out somewhat why it happens. However, that was almost a year ago, so it's safe to say a fix might not happen; here is how you can work around it.
> 
> For this issue to present, certain factors need to be in place:
> 
> 1. Vsync must be enabled and in use (this can even count for desktop use of vsync). The issue presents with all flavors of vsync: double buffer, triple buffer, etc.
> 
> 2. Crossfire must be enabled and both cards have to be utilized. (GPU-Z polling can also fill this requirement, as can having ULPS disabled, as long as crossfire is enabled.)
> 
> 3. You must have a multi-monitor configuration with more than 1 screen on and active. This does not apply to users of Eyefinity, as all screens are then seen as one, I guess; at any rate, the issue doesn't present if all your displays are being used by Eyefinity. It does not matter what ports you use: if there is more than one screen turned on and active, you will have this issue.
> 
> Note: I do not know if this applies to users with USB-connected monitors, as no one from our discussions had one or wanted to buy one just to find out. If you or anyone else has such a monitor and could test that for me, I would be much obliged.
> 
> The fixes:
> 
> 1. Disable or turn off all screens but 1 whenever using crossfire and vsync together.
> 
> 2. Don't use vsync of any kind. This includes adaptive vsync, as the issue will still be present whenever vsync turns on. This is somewhat more difficult than the above fix, as some games have vsync on by default and disabling it can alter the game a lot (e.g. Skyrim). You may still use a frame limiter; however, trying to use both vsync and a limiter at 59 FPS will not work.
> 
> 3. Buy new GPUs, as it honestly doesn't seem that AMD cares about fixing this issue. This late in the GPUs' life, and with the small pool of users it affects, it doesn't appear to be high on their priority list. If you think about it, only about 5-10% of owners of these cards run crossfire; of those, maybe 50% have more than 1 screen, and of those, probably another 5% aren't using Eyefinity, so you can see how quickly that pool shrinks. I would personally estimate fewer than 1000 users fit the requirements for this issue, which would explain the low amount of info and complaints about it.
> 
> 4. Deal with the sound-crackling issues until, hopefully, there is a fix.
> 
> From what I have found (and I have been in pretty much every thread about this issue over the past year, including a bunch with AMD reps and techs), it only affects R9 290s, 290Xs, and 295X2s. I'm not saying it's not in other cards, only that I have not seen the issue recreated on other cards. I have, however, seen it recreated on pretty much every 290/295, including custom PCBs; it seems to be a driver issue and to have nothing to do with the card itself.
> 
> Spotting the issue is easy: as you have mentioned, you will get sound crackles that are pretty annoying, and with a latency checker you will see issues with dxgkrnl.sys, which is your DirectX kernel.
> 
> Hope I helped; sorry to give you the bad news.
> 
> Also, about your solution of "changing the refresh rate from 60 to 96 to 60 will fix it": can you elaborate? I assume it works because doing so while gaming breaks vsync, which is a source of the issue. This is the first I have heard of that fix, though, so please tell me more.


Dude! Thank you! I have been losing my mind over this issue and have been contemplating a fresh Windows install, as I couldn't figure out the cause. I fit the criteria to a T: I always use vsync, I use crossfire, and I game on a 1440p monitor with a 2nd 1080p monitor that I use for productivity and monitoring MSI AB while gaming.

I was playing GTA V the other day, monitoring the DPC latency tool, alt-tabbing out of the game and closing background processes and applications to see if anything would lower the DPC latency, then switching back to the game to see if it helped. I use DisplayFusion for my multi-monitor setup, and it basically always keeps the 2nd monitor active even while in game, so I can monitor stuff on there like MSI AB. Anyway, I closed it, and when I switched back to GTA V the 2nd screen went dark/inactive. I listened for the telltale sign of audio popping and didn't hear anything, so I alt-tabbed out to check DPC latency and it was reading low. I played the rest of the night like that, believing DisplayFusion to be the cause.

I tried the same thing last night, but now, even though DisplayFusion was closed, the 2nd monitor stayed active and DPC latency stayed high. What I figured out I could do was change some graphics setting in game, and eventually the 2nd monitor would go inactive and DPC latency would go low. The setting which most consistently worked was changing the output monitor in game from 1 to 2, then back to 1 again. But sometimes that wouldn't work and the 2nd monitor would stay active. I tried changing the refresh rate setting in game from 60 to 96 and back to 60 again; though the 2nd monitor stayed active, the DPC latency fell back to normal. Note that the monitor's actual refresh rate stayed at 96; I only changed the in-game refresh rate setting. I had just been turning the in-game setting down to 60 to help alleviate some stuttering in GTA V. I only tried this one time, and just kept playing that way since it was working.

I will try it again tonight to see if it is consistent. Most likely I will end up using the solution 1 you posted above just disabling the 2nd monitor. Stinks that I have to do this, as I like monitoring everything on there but I am just glad to know what is going on, and that there is a real solution, kind of. Lol. Thanks.

Edit: I should note that I have an overclockable Korean monitor, and use Display Driver Patcher and CRU to create the custom overclocked resolution / refresh rate. Maybe the changing refresh rates and its effect of lowering the DPC latency back to normal has something to do with that. Honestly I suspected that it may have been the cause, and my next troubleshooting step was to remove it and reinstall the AMD drivers, but that doesn't seem to be the case now.


----------



## rdr09

Quote:


> Originally Posted by *LandonAaron*
> 
> Dude! Thank you! I have been losing my mind over this issue and have been contemplating a fresh Windows install as I couldn't figure out the cause. I fit the criteria to a T. I always use Vsync. I use CrossFire, and I game on a 1440p monitor, and have a 2nd 1080p monitor that I use for productivity and monitoring MSI AB while gaming.
> 
> I was playing GTA V the other day, monitoring DPC latency tool, and alt tabbing out of the game and closing background processes and applications trying to see if anything would lower the DPC latency, and switching back to the game to see if it helped. Anyway I use DisplayFusion for my multi-monitor setup, and it basically always keeps the 2nd monitor active even while in game, so I can monitor stuff on there like MSIAB. Anyway I closed that and when I switched back to GTAV the 2nd screen went dark/inactive. Anyway I listened for the tell tale sign of audio popping and didn't hear anything, so alt tabbed out to check DPC latency and it was reading low. I played the rest of the night like that believing DisplayFusion to be the cause. Anyway I tried the same thing last night, but now even though display fusion is closed the 2nd monitor stayed active and DPC Latency stayed high. What I figured out I could do was change some graphics setting in game and eventually the 2nd monitor would go inactive and DPC latency would go low. The setting which most consistently worked was changing the output monitor in game from 1 to 2, then back to 1 again. But sometimes that wouldn't work and the 2nd monitor would stay active. I tried changing the refresh rate setting in game from 60 to 96 back to 60 again, though the 2nd monitor stayed active the DPC latency fell back to normal. Note the monitors actual refresh rate stayed at 96, I only changed the in game setting refresh rate. I have just been turning the in game setting down to 60 to help alleviate some stuttering in GTAV. I only tried this one time, and just kept playing that way since it was working.
> 
> I will try it again tonight to see if it is consistent. Most likely I will end up using the solution 1 you posted above just disabling the 2nd monitor. Stinks that I have to do this, as I like monitoring everything on there but I am just glad to know what is going on, and that there is a real solution, kind of. Lol. Thanks.
> 
> Edit: I should note that I have an overclockable Korean monitor, and use Display Driver Patcher and CRU to create the custom overclocked resolution / refresh rate. Maybe the changing refresh rates and its effect of lowering the DPC latency back to normal has something to do with that. Honestly I suspected that it may have been the cause, and my next troubleshooting step was to remove it and reinstall the AMD drivers, but that doesn't seem to be the case now.


For your purpose of using the second monitor . . . would the iGPU work instead?


----------



## Cyber Locc

Quote:


> Originally Posted by *LandonAaron*
> 
> Dude! Thank you! I have been losing my mind over this issue and have been contemplating a fresh Windows install as I couldn't figure out the cause. I fit the criteria to a T. I always use Vsync. I use CrossFire, and I game on a 1440p monitor, and have a 2nd 1080p monitor that I use for productivity and monitoring MSI AB while gaming.
> 
> I was playing GTA V the other day, monitoring DPC latency tool, and alt tabbing out of the game and closing background processes and applications trying to see if anything would lower the DPC latency, and switching back to the game to see if it helped. Anyway I use DisplayFusion for my multi-monitor setup, and it basically always keeps the 2nd monitor active even while in game, so I can monitor stuff on there like MSIAB. Anyway I closed that and when I switched back to GTAV the 2nd screen went dark/inactive. Anyway I listened for the tell tale sign of audio popping and didn't hear anything, so alt tabbed out to check DPC latency and it was reading low. I played the rest of the night like that believing DisplayFusion to be the cause. Anyway I tried the same thing last night, but now even though display fusion is closed the 2nd monitor stayed active and DPC Latency stayed high. What I figured out I could do was change some graphics setting in game and eventually the 2nd monitor would go inactive and DPC latency would go low. The setting which most consistently worked was changing the output monitor in game from 1 to 2, then back to 1 again. But sometimes that wouldn't work and the 2nd monitor would stay active. I tried changing the refresh rate setting in game from 60 to 96 back to 60 again, though the 2nd monitor stayed active the DPC latency fell back to normal. Note the monitors actual refresh rate stayed at 96, I only changed the in game setting refresh rate. I have just been turning the in game setting down to 60 to help alleviate some stuttering in GTAV. I only tried this one time, and just kept playing that way since it was working.
> 
> I will try it again tonight to see if it is consistent. Most likely I will end up using the solution 1 you posted above just disabling the 2nd monitor. Stinks that I have to do this, as I like monitoring everything on there but I am just glad to know what is going on, and that there is a real solution, kind of. Lol. Thanks.
> 
> Edit: I should note that I have an overclockable Korean monitor, and use Display Driver Patcher and CRU to create the custom overclocked resolution / refresh rate. Maybe the changing refresh rates and its effect of lowering the DPC latency back to normal has something to do with that. Honestly I suspected that it may have been the cause, and my next troubleshooting step was to remove it and reinstall the AMD drivers, but that doesn't seem to be the case now.


Glad I could help. Yes, option 1 is what I do as well. I tried to list them from easiest to deal with to hardest to swallow.







Quote:


> Originally Posted by *rdr09*
> 
> For your purpose of using the second monitor . . . would the igpu work instead?


Do you think using the iGPU would solve it? I am curious myself whether an iGPU or USB monitor would suffer the same fate.

Your purpose of temp monitoring is also my biggest gripe with the solution. I have 4 screens, though the 2 on the sides I don't mind turning off, as they are for when I'm doing real work; my top screen I use to monitor temps and to Google stuff and read. I am considering swapping that monitor for a TV hooked to another PC just for media and Googling for guides etc., and I wanted to try one of those little LCD temp screens. They run off USB and can show AIDA64 info. Of course 1 won't be enough for me; I was thinking 3 or 4, and building an acrylic plate to hold them under my monitor. With 3 or 4 of them I should be able to get all the data I could want: 1 for CPU, 1 for GPUs, 1 for loop info (and other temps), and 1 for whatever else.

EDIT: it's called the LCD Sys Info from GOverlay. My biggest gripe with it is its size: it's only a 2.5 in LCD, which is, well, tiny lol. I would like one 3-4 inches tall and at least 10-12 inches long.


----------



## ET900

Hey guys. I'm really hoping you can help me out here. I've had two new 290X's in the past week, and they've both been a lot of trouble.







Both of them are "Gigabyte Windforce 290X OC (GV-R929XOC-4GD)" models. The first one was artifacting so I sent it back. The second one works, other than the fact that it downclocks to half speed when it's not even getting too hot (please see my thread here: http://www.overclock.net/t/1554244/gigabyte-windforce-290x-oc-gv-r929xoc-4gd-throttling-core-clock-to-half-speed-at-no-higher-than-81c). So, I'm thinking I'm probably gonna send this one back to Amazon like the last one.

I have to get my replacement gpu on Amazon.co.uk. I would like another 290X, so which do you guys think is the best one out of the lot? I know a lot of people say the Sapphire Tri-X model. What do you think? I just want one that'll stay really cool with the cooler it comes with and won't throttle down when it's not even getting that hot!

I'd really appreciate it if you guys could have a look here and recommend the best choice: http://www.amazon.co.uk/s/ref=nb_sb_noss_2?url=search-alias%3Daps&field-keywords=290X&rh=i%3Aaps%2Ck%3A290X

Thanks a lot


----------



## LandonAaron

Quote:


> Originally Posted by *rdr09*
> 
> For your purpose of using the second monitor . . . would the igpu work instead?


That's an excellent idea; I have a 4790K, so I'll try that. I tested again on GTA V and was kind of able to recreate "fixing" the issue, but just changing from 60 to 96 to 60 Hz didn't do it. I had to keep changing that, switching the output monitor, and alt-tabbing to Windows, and then eventually the DPC went back to normal. It's like you have to break the bug. But now I can change any setting and alt-tab to Windows and it isn't coming back, though I bet if I exited and restarted it would.


----------



## numbrs

Quote:


> Originally Posted by *ET900*
> 
> Hey guys. I'm really hoping you can help me out here.. I've had two new 290X's in the past week, and they've both been a lot of trouble
> 
> 
> 
> 
> 
> 
> 
> Both of them are "Gigabyte Windforce 290X OC (GV-R929XOC-4GD)" models. The first one was artifacting so I sent it back. The second one works, other than the fact that it downclocks to half speed when it's not even getting too hot (please see my thread here: http://www.overclock.net/t/1554244/gigabyte-windforce-290x-oc-gv-r929xoc-4gd-throttling-core-clock-to-half-speed-at-no-higher-than-81c). So, I'm thinking I'm probably gonna send this one back to Amazon like the last one.
> 
> I have to get my replacement gpu on Amazon.co.uk. I would like another 290X, so which do you guys think is the best one out of the lot? I know a lot of people say the Sapphire Tri-X model. What do you think? I just want one that'll stay really cool with the cooler it comes with and won't throttle down when it's not even getting that hot!
> 
> I'd really appreciate it if you guys could have a look here and recommend the best choice: http://www.amazon.co.uk/s/ref=nb_sb_noss_2?url=search-alias%3Daps&field-keywords=290X&rh=i%3Aaps%2Ck%3A290X
> 
> Thanks a lot


MSI 290x Lightning!

LOL

Seriously though, it's a great card; it runs amazingly even on air. I have a short review on it you might like to check out.









http://www.overclock.net/products/msi-graphics-card-r9-290x-lightning/reviews/7122

Regards,


----------



## ET900

Quote:


> Originally Posted by *numbrs*
> 
> MSI 290x Lightning!
> 
> LOL
> 
> Seriously though its a great card, runs amazing even on air. I have a short review on it you might like to check out
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.overclock.net/products/msi-graphics-card-r9-290x-lightning/reviews/7122
> 
> Regards,


I will check out your review out of interest, thanks








I've heard these are great cards too. Thing is, they cost £351 on amazon! (http://www.amazon.co.uk/MSI-R9-290X-Lightning-TriFrozr/dp/B00ITHHMM0/ref=sr_1_1?ie=UTF8&qid=1430966650&sr=8-1&keywords=msi+lightning+290x).
That's a bit too much for me :/ I have been eyeing up the Sapphire Vapor-X though: http://www.amazon.co.uk/Sapphire-Vapor-X-Graphics-Express-Display/dp/B00K14AUVY/ref=sr_1_1?ie=UTF8&qid=1430966668&sr=8-1&keywords=290x+vapor
What you reckon? Thanks


----------



## numbrs

Quote:


> Originally Posted by *ET900*
> 
> I will checkout your review out of interest, thanks
> 
> 
> 
> 
> 
> 
> 
> I've heard these are great cards too. Thing is, they cost £351 on amazon! (http://www.amazon.co.uk/MSI-R9-290X-Lightning-TriFrozr/dp/B00ITHHMM0/ref=sr_1_1?ie=UTF8&qid=1430966650&sr=8-1&keywords=msi+lightning+290x).
> That's a bit too much for me :/ I have been eyeing up the Sapphire Vapor-X though: http://www.amazon.co.uk/Sapphire-Vapor-X-Graphics-Express-Display/dp/B00K14AUVY/ref=sr_1_1?ie=UTF8&qid=1430966668&sr=8-1&keywords=290x+vapor
> What you reckon? Thanks


Are you dead set on a 290x?


----------



## ET900

Quote:


> Originally Posted by *numbrs*
> 
> Are you dead set on a 290x?


pretty much, yeh!


----------



## numbrs

Quote:


> Originally Posted by *ET900*
> 
> I will checkout your review out of interest, thanks
> 
> 
> 
> 
> 
> 
> 
> I've heard these are great cards too. Thing is, they cost £351 on amazon! (http://www.amazon.co.uk/MSI-R9-290X-Lightning-TriFrozr/dp/B00ITHHMM0/ref=sr_1_1?ie=UTF8&qid=1430966650&sr=8-1&keywords=msi+lightning+290x).
> That's a bit too much for me :/ I have been eyeing up the Sapphire Vapor-X though: http://www.amazon.co.uk/Sapphire-Vapor-X-Graphics-Express-Display/dp/B00K14AUVY/ref=sr_1_1?ie=UTF8&qid=1430966668&sr=8-1&keywords=290x+vapor
> What you reckon? Thanks


http://www.amazon.co.uk/Gigabyte-WINDFORCE-Graphics-PCI-E-GDDR5/dp/B00HS866AK/ref=pd_sim_sbs_computers_5?ie=UTF8&refRID=05W53C1GSVDT1F84ZM6C

This looks okay, but I would do some research on it first, as I've heard of some issues with the Gigabyte 290X.

A 980 is also an option, but I don't like the hidden power consumption of Nvidia cards.


----------



## ET900

Quote:


> Originally Posted by *numbrs*
> 
> http://www.amazon.co.uk/Gigabyte-WINDFORCE-Graphics-PCI-E-GDDR5/dp/B00HS866AK/ref=pd_sim_sbs_computers_5?ie=UTF8&refRID=05W53C1GSVDT1F84ZM6C
> 
> This looks okay. But i would do some research on it first, as iv'e heard of some issues with the Gigabyte 290x.
> 
> A 980 is also an option but i don't like the hidden power consumption of Nvidia cards.


That Gigabyte Windforce is actually the 290X I've got. It's my second one in a week. I sent the first one back because it was artifacting, and the second one that I now have keeps throttling the core down to half speed when the highest temp reached (which is on VRM1) is 81°C. Even when the temp drops far lower, to about 50-60°C, it still keeps the card heavily throttled. It's terrible. I definitely don't want another of these! I've been looking around tonight, and it seems that the Vapor-X is considered the second best 290X after the MSI Lightning. You think it's a good choice?

I don't really want to get an Nvidia card. I am pretty set on the 290X really. I'm just looking for a good model to get which is under £300 on Amazon. Thanks


----------



## numbrs

Quote:


> Originally Posted by *ET900*
> 
> That Gigabyte Windforce is actually the 290X I've got. It's my second one in a week. I sent the first one back because it was artifacting, and the second one that i now have keeps throttling the core down to half speed when the highest temp reached (which is on VRM1) is 81c. Even when the temp drops down far lower to about 50-60c, it still keeps the card heavily throttled. It's terrible. I definitely don't want another of these! I've been looking around tonight, and it seems that the Vapor-X is considered the second best 290X after the Msi Lightning. You think it's a good choice?
> 
> I don't really want to get an Nvidia card. I am pretty set on the 290X really. I'm just looking for a good model to get which is under £300 on Amazon. Thanks


Too bad you don't have the cash for the Lightning. It's like a Titan with a third of the VRAM, but also a fifth of the cost. Awesome cooling power: folding 24/7 = 65 degrees Celsius. I would wait a day or two before you decide, and we can both compile some research on the best models in your price range. I haven't read up on the Vapor-X, but I'm assuming that as long as it has the cooling power you need, it would be great. BTW, the Lightning is like 3-3.5 PCI case slots, and it also comes with multimeter I/O for precise overclocking.

Hope this could help a little


----------



## mfknjadagr8

So here's the GPU-Z that was requested. This was after about 4 hours of gaming with an ambient of nearly 32°C, as my fiancée is sick and so the temp is around 90°F in here.


----------



## Klocek001

Am I on the safe side running +69 mV on a 290 Tri-X? Not for benching, for regular gaming. Since I got my 1440p monitor I've realized the need for more power, and my card isn't the best overclocker you'll see; it needs +69 mV for a stable 1100/1500. I'm running 75% fan in OC mode, because 1440p without any sort of frame limit can really turn up the heat.
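For a rough feel for what +69 mV means thermally: dynamic power scales approximately with clock times voltage squared. A back-of-the-envelope sketch (the ~1.20 V stock core voltage and 1000 MHz stock clock below are assumptions; actual values vary card to card):

```python
# Rough CMOS dynamic-power scaling: P is proportional to f * V^2.
# Stock values are assumptions for a reference-clocked 290;
# treat this as an order-of-magnitude estimate, not a measurement.
stock_v, stock_f = 1.200, 1000       # volts, MHz (assumed stock)
oc_v, oc_f = stock_v + 0.069, 1100   # +69 mV offset, 1100 MHz target

ratio = (oc_f / stock_f) * (oc_v / stock_v) ** 2
print(f"core power up roughly {(ratio - 1) * 100:.0f}%")  # roughly 23%
```

A ~23% bump in core heat output is consistent with needing 75% fan at uncapped 1440p.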


----------



## cephelix

Quote:


> Originally Posted by *Klocek001*
> 
> am I on the safe side running +69mv on 290 trix ? Not for benching, for regular gaming. Since I got my 1440p I've realized the need for more power, and my card isn't the best overclocker you'll see, needs +69mv for stable 1100/1500. Running 75% fan in the OC mode cause 1440p without any sort of frame limit can really turn the heat up.


For the voltage you are running, I'd say you are on the safe side... what temps are you seeing on core and VRMs though? Anything under 95°C for core and 120°C for VRMs should be fine. If you are reaching thermal limits though, keep an eye out, since your card may throttle down if it gets too hot, thus affecting performance.
Quote:


> Originally Posted by *mfknjadagr8*
> 
> So heres the gpuz that was requested this was after about 4 hours of gaming with ambient of nearly 32c.. as my fiance is sick and so temps are around 90 degrees in here




For 32°C ambients I would say your temps are really good!


----------



## Klocek001

No, no, the hottest I've seen was 76°C on the core and that was it.


----------



## cephelix

Quote:


> Originally Posted by *Klocek001*
> 
> no,no the hottest ive seen was 76 on the core and that was it.


You're safe then. You could try OCing some more if you wanted to.


----------



## Klocek001

When I was on my old 1080p monitor I was using Vsync too, so the lower resolution combined with the frame cap would result in temperatures around 60 degrees even with the OC. Now on 1440p and running no frame cap, it gets to 75.
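That tracks with the raw pixel math: 1440p pushes about 78% more pixels per frame than 1080p, and removing the frame cap lets the GPU render as many frames as it can on top of that. A quick check:

```python
# Pixels per frame at each resolution
px_1080p = 1920 * 1080   # 2,073,600
px_1440p = 2560 * 1440   # 3,686,400

print(f"{px_1440p / px_1080p:.2f}x the pixels per frame")  # 1.78x
```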


----------



## cephelix

Quote:


> Originally Posted by *Klocek001*
> 
> when I was on my old 1080p I was using vsync too, so lower resolution combined with frame cap would result in temperatures around 60 degrees even with OC. Now on 1440p and running no frame cap it gets to 75.


Still, that's quite good. At 1200p, OCed to 1100/1500/+50 mV, Vsync on so 60 fps, ambients about 30°C, I still get about 80°C on the core. And that is with CLU between die and cooler.


----------



## Arizonian

Quote:


> Originally Posted by *By-Tor*
> 
> I just replaced my pair of 7950's with a Powercolor R9 290X LCS and love it. Hoping to add a second card soon..
> 
> Please add me...
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## kizwan

Quote:


> Originally Posted by *mfknjadagr8*
> 
> So heres the gpuz that was requested this was after about 4 hours of gaming with ambient of nearly 32c.. as my fiance is sick and so temps are around 90 degrees in here


That's pretty bad GPU usage. Am I correct that you get this usage with all games you've tried except BF4? Did the drops happen from the start, or only after you'd been playing for a while? If the latter, I'm interested to see MSI AB graphs. Just select everything in the Monitoring tab (settings).


----------



## mfknjadagr8

Quote:


> Originally Posted by *kizwan*
> 
> That's pretty bad gpu usage. I'm I correct that you get these usage with all games you tried except bf4? Did the drops happen from the start you played the games or after a while you played the games? If the latter, I'm interested to see MSI AB graphs. Just select everything in Monitoring tab (settings).


This is what I was getting at... drops to zero for no reason... it's not overheating, it's not drooping voltage... odd... I'll use AB next time. But yeah, it's from the start. Usually I have ambient around 22 or so and temps are better... but the only two games I don't notice this in are Black Ops 2 and BF4.


----------



## ET900

Quote:


> Originally Posted by *numbrs*
> 
> Too bad you don't have the cash for Lightning. Its like a Titan with a third of VRAM, but also a 1/5th of the cost. Awesome cooling power. Folding 24/7 = 65 degrees Celsius. I would wait a day or two before you decide and we can both compile some research on the best models in your price range. I havn't read up on the Vapor-X but im assuming as long as it has the cooling power you need it would be great. Btw the Lightning is like 3-3.5 PCI case slots, and It also comes with multimeter IO for precise overclocking.
> 
> Hope this could help a little


Yeah, it definitely sounds awesome! Definitely a bit pricey too.

I think I'm gonna order today. I did quite a lot of looking around last night, and the Vapor-X seems to be the coolest-running 290X a lot of sites have tested. That's what I'm looking for really. Thanks.









Edit: Just ordered the Vapor-X. Wish me luck, and thanks a lot for your help, bud







If I get any troubles, I'll be back! Haha


----------



## elgreco14

I just bought the MSI R9 290X Gaming 4G from someone else, but I'm having some trouble with it. In his PC it hit a max of 68°C, but in my PC 86°C max after 30 mins of BF3. Before that it went up to 92°C, but I had forgotten to turn the front fan on.

I also did some other tests. In 3DMark Fire Strike I got 9600 the first time and 9900 the second. In Battlefield 3 I did some FPS tests, playing 10 minutes each time: 112 fps the first time, 120 the second. The GPU usage of the card was at 100% most of the time; sometimes it dropped to 80% or a bit lower, but I didn't notice any dips in fps.

Are this usage and fps not normal? Or am I a bit paranoid because I bought it second hand?

And how can I fix the temps? I read somewhere: new cooling paste.

Thx in advance


----------



## rdr09

Quote:


> Originally Posted by *mfknjadagr8*
> 
> this is what I was getting at...drops to zero for no reason...it's not overheated it's not drooping voltage...odd...I'll use ab next time..but yeah it's from the start usually I have ambient around 22 or so and temps are better...but the only two games I don't notice this in are black ops 2 and bf4


The usage, as K said, is weird. You have no other programs running in the background, right?

I agree, use AB to monitor next time. Just AB.
Quote:


> Originally Posted by *elgreco14*
> 
> I Just bought the msi r9 290x gaming 4g from someone else. But i have some trouble with it. In his pc it had max 68 degrees temp, but in my pc 86 max after 30 mins of bf3. Before it went up to 92, but i forgot to turn the front fan on.
> 
> I did also some other tests. 3dmark firestrike i got 9600 first time and second time 9900. In battlefield 3 I did some fps tests. Played each time 10 minutes first time 112 second time 120. The gpu usage of the cars was mostly all the time 100% some time it droppen to 80% or a bit loper but i didnt noticed any dips in fps.
> 
> Is this usage and fps not normal? Or am i bit paranoia because i bought it second hand?
> 
> And how can i fix the temps? I read somewhere New cooling paste.
> 
> Thx in advance


just make sure front fan is on.


----------



## numbrs

Quote:


> Originally Posted by *ET900*
> 
> Yeh. It definitelty sounds awesome! Defintiely a bit pricey too
> 
> 
> 
> 
> 
> 
> 
> I think I'm gonna order today. I did quite a lot of looking around lastnight, and the Vapor-X seems to be the coolest 290X a lot of sites have tested. That's what I'm looking for really. Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Just ordered the Vapor-X. Wish me luck, and thanks a lot for your help, bud
> 
> 
> 
> 
> 
> 
> 
> If I get any troubles, I'll be back! Haha


Awesome, send me pictures when it arrives!


----------



## fyzzz

I have been overclocking today and got some pretty good results:

I am pretty happy with the results and got the MSI AB voltage mod thing to work today. Everything is cooled by AIR. NOTE: I'm pretty sure 146°C must be a bug.


----------



## ET900

Quote:


> Originally Posted by *numbrs*
> 
> Awesome, send me pictures when it arrives!


Ok bud. I'll try remember to come back here with some shots


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> I have been overclocking today and got some pretty good results:
> 
> I am pretty happy with the results and got the msi ab voltage mod thing to work today. Everything is cooled by AIR. NOTE: im pretty sure 146c must be a bug


Great job. You got a nice 290 . . . it's like my 290s.

Edit: 1300 would easily be possible under water.


----------



## fyzzz

Hmm, maybe I should save up for some watercooling parts for my 290 and then flash the PT1 BIOS on it... tempting thought.


----------



## rdr09

Quote:


> Originally Posted by *LandonAaron*
> 
> Thats an excellent idea I have a 4790k so I'll try that. I tested again on GTAV and was kind of able to recreate "fixing" the issues but just changing from 60 to 96 to 60hz didn't do it. I just had to keep changing that, switching the output monitor, and alt tabbing to windows and then eventually the DPC went back to normal. Its like you have to break the bug. But now I can change any setting alt tab to windows and it isn't coming back, but I bet if I exited and restarted it would.


Please try it and report back. Thanks.

Quote:


> Originally Posted by *fyzzz*
> 
> Hmm maybe i should save up for some watercooling parts for my 290 and then flash pt1 bios on it....tempting thought


Not sure how much you added, but you are hitting 1.39 V already, and I think that is with droop. With water . . . it might need less voltage and reach higher clocks. No need for PT1 in your case.


----------



## mfknjadagr8

Quote:


> Originally Posted by *rdr09*
> 
> the usage as K said are weird. you have no other programs running in the background, right?
> 
> i agree, use AB to monitor next time. just AB.
> just make sure front fan is on.


Nope, I always turn everything off and kill unneeded processes... the only things that run when I game are Samsung Magician, the Xbox driver for the controller, and my headset driver... and needed processes for Windows, of course.


----------



## rdr09

Quote:


> Originally Posted by *mfknjadagr8*
> 
> nope I.always turn everything of and kill unneeded processes...the only thing that runs when i game is Samsung magician, xbox driver for controller, and my headset driver...and needed procs for Windows of course


I used to have an Xbox controller to play Batman, and it was causing issues with other games even when not in use. I unplugged it and that fixed it. I can't recall exactly which game was affected, but it was the controller causing it. Might not be the case here, though.


----------



## mfknjadagr8

Quote:


> Originally Posted by *rdr09*
> 
> i used to have an Xbox controller to play Batman and it was causing issues with other games even when not used. i unplugged it and fixed it. can't recall exactly which game got affected but it was the controller that was causing it. might not be the case here, though.


I'm thinking not, because I unplug the controller/receiver when playing everything but GTA 5.


----------



## rdr09

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I'm thinking not because I unplug the controller/receiver when playing all but gta5


could you run this bench and post your results here (link) . . .

http://www.techpowerup.com/downloads/2483/futuremark-3dmark-2013-v1-5-893/


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> Great job. You got a nice 290 . . . it's like my 290s.
> 
> Edit: 1300 would easily be possible watered.


I know water works wonders on these cards, but I still find it hard to believe it could reach 1300 MHz on water. The 1270 MHz overclock wasn't even stable; it went through Fire Strike but with many artifacts. I think it's stable around 1230-1240 MHz with +200 mV.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> I know water works wonders on these cards, but i have still hard to believe that it could reach 1300 mhz on water. The 1270 mhz overclock wasn't stable even. It went through firestrike but with many artifacts. I think it gets stable around 1230-1240 mhz. with 200 mv.


I never bench on air, but when benching . . . stability is not really a factor. Now, if you are looking for an OC for a game, then that needs stability.

1200 core stable is something to write home about with these cards. 1100 is already faster than a stock reference 290X.

You raised the power limit to max, right?
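On the 1100-vs-stock-290X point, the shader math roughly backs it up. Assuming (crudely) that throughput scales with stream processors times clock, a 290 at 1100 MHz exactly matches a 290X at its rated 1000 MHz on paper; and since reference 290X coolers often throttle below 1000 MHz under load, the overclocked 290 comes out ahead in practice:

```python
# Theoretical shader throughput as stream processors x core clock (MHz).
# A crude proxy -- it ignores ROPs, memory, and throttling --
# but it shows why an 1100 MHz 290 keeps pace with a reference 290X.
sp_290, sp_290x = 2560, 2816   # stream processor counts (public specs)
t_290_oc = sp_290 * 1100       # 290 overclocked to 1100 MHz
t_290x = sp_290x * 1000        # reference 290X rated boost clock

print(t_290_oc / t_290x)  # 1.0 -- dead even on paper
```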


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> i never bench on air but when benching . . . stability is not really a factor. now, if you are looking for an oc for a game, then that needs stability.
> 
> 1200 core stable is something to write home about with these cars. 1100 is already faster than a stock reference 290X.
> 
> you raised the power limit to max, right?


Yes, I always raise the power limit to max. I think my card is almost stable at 1200 MHz with +100 mV. On stock voltage it can do around 1100 MHz.


----------



## mfknjadagr8

Quote:


> Originally Posted by *rdr09*
> 
> could you run this bench and post your results here (link) . . .
> 
> http://www.techpowerup.com/downloads/2483/futuremark-3dmark-2013-v1-5-893/


I assume you mean firestrike...yeah ill run it as well but might be later tonight because today is cleaning day lol


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> Yes i always raise the power limit to max. I think my card is almost stable on 1200 MHz with 100mv. On stock voltage it can do around 1100 mhz.


Without knowing the actual temp of the core . . . we can't really say for sure what your card is capable of. Like I said, I never OCed on air, but of my cards on water, one can do 1170 without touching voltage and the other 1150. Outside of benchies, I play at stock.

Use another app like HWiNFO64 (free version). Turn off the other apps.
Quote:


> Originally Posted by *mfknjadagr8*
> 
> I assume you mean firestrike...yeah ill run it as well but might be later tonight because today is cleaning day lol


I've got to clean too, lol. The best time to run benchies is when cleaning the house or whatever; it's boring.


----------



## airisom2

I should have posted this when Hawaii first came out:




It fits so perfectly it's scary.


----------



## numbrs

Quote:


> Originally Posted by *airisom2*
> 
> I should have posted this when Hawaii first came out:
> 
> 
> 
> 
> It fits so perfectly it's scary.


LOL


----------



## tsm106

Quote:


> Originally Posted by *fyzzz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> Great job. You got a nice 290 . . . it's like my 290s.
> 
> Edit: 1300 would easily be possible watered.
> 
> 
> 
> I know water works wonders on these cards, but *i have still hard to believe that it could reach 1300 mhz on water.* The 1270 mhz overclock wasn't stable even. It went through firestrike but with many artifacts. I think it gets stable around 1230-1240 mhz. with 200 mv.
Click to expand...

Unpossible? I have a couple of runs with four cards around 1295-1300, but I'd have to dig them out of my submissions.

http://www.3dmark.com/fs/3758613


----------



## mfknjadagr8

As requested.. its a little low but heres the firestrike
http://www.3dmark.com/3dm/6891592


----------



## FastEddieNYC

Some chips OC better than others. It's commonly called the silicon lottery. I've seen a few 1300-1350 overclocks.


----------



## kizwan

Quote:


> Originally Posted by *mfknjadagr8*
> 
> As requested.. its a little low but heres the firestrike
> http://www.3dmark.com/3dm/6891592


Scores look alright to me.

Comparing to my old result.
http://www.3dmark.com/compare/fs/4770181/fs/1363682


----------



## mfknjadagr8

Ok so a run with afterburner only running... this was gta 5 again for a few hours with the second monitor turned off (i don't run it in eyefinity), with resolution downsampled from 3100 and settings on max aside from aa at 2x and all the advanced options turned off...

maxes are showing 100 but on the graph i rarely see any touch 100... i didn't watch it the whole time... but one thing to note, this time i did the flight school training instead so it was running the cards a little harder.. the top one actually broke 60c on core for a little bit...

And another one after my fiance turned up the heat, sweating me out of here


----------



## JourneymanMike

http://www.3dmark.com/fs/4738336


----------



## JourneymanMike

Just a little tweek on the 290X's

http://www.3dmark.com/3dm/6893000?


----------



## xKrNMBoYx

Finally tried OCing my MSI reference 290. So far 143MHz with no change to stock voltage/power limit w/ Afterburner. Seems stable for multiple hours in various games. I wonder how much further I can go before I need to raise the voltage.


----------



## rdr09

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Ok so a run with afterburner only running... this was gta 5 again for a few hours with the second monitor turned off (i don't run it in eyefinity), with resolution downsampled from 3100 and settings on max aside from aa at 2x and all the advanced options turned off...
> 
> maxes are showing 100 but on the graph i rarely see any touch 100... i didn't watch it the whole time... but one thing to note, this time i did the flight school training instead so it was running the cards a little harder.. the top one actually broke 60c on core for a little bit...
> 
> And another one after my fiance turned up the heat, sweating me out of here


i saw your FS score and it looks normal for stock. combined score is low which indicates cpu is holding the gpus back a bit.

i see that one cpu core has pretty high usage. i read gta5 is multi-threaded. it would be interesting to see how the rest of the cores are doing.


----------



## mfknjadagr8

Quote:


> Originally Posted by *rdr09*
> 
> i saw your FS score and it looks normal for stock. combined score is low which indicates cpu is holding the gpus back a bit.
> 
> i see that one cpu core has pretty high usage. i read gta5 is multi-threaded. it would be interesting to see how the rest of the cores are doing.


the combined score will always be low for amd processors...the test itself is unfair to them....I'll check out the core usages when I get a chance...


----------



## fyzzz

I have noticed an interesting thing when overclocking. The graphics card sometimes becomes more stable with less voltage. I'm guessing it has to do with the heat: when i apply around 250 mv the gpu core goes up to about 70c and the vrm 100+. So watercooling could possibly help a lot. Also i have another question: would an xt45 240 radiator be sufficient for an r9 290 only (i want to achieve high overclocks)?


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> I have noticed an interesting thing when overclocking. The graphics card sometimes becomes more stable with less voltage. I'm guessing it has to do with the heat: when i apply around 250 mv the gpu core goes up to about 70c and the vrm 100+. So watercooling could possibly help a lot. Also i have another question: would an xt45 240 radiator be sufficient for an r9 290 only (i want to achieve high overclocks)?


i've always listened to tsm when it comes to oc'ing hawaii. he knows more about these things than anyone i know here on ocn.

yes, water will help a lot if you are benching, and a 240 rad is good enough. now, if you ever plan on adding stuff to your loop, i think a 240 is a good start and add a 360 maybe, depending on your pump and case. one thing i learned from tsm is to go big on rad space right off the bat.

i started with a cooler master 912 and tried to squeeze 3 120mm rads in there to cool the cpu and 2 290s. i saw 60C in BF4, which was not ideal. i now have an Air 540 with 240 and 360 rads to cool the 290s and an i7 5820. better temps.

here was my highest oc on one of my 290s using an old driver . . .

http://www.3dmark.com/3dm/2098310

that was cooled by water and weather.


----------



## LandonAaron

Quote:


> Originally Posted by *fyzzz*
> 
> I have noticed an interesting thing when overclocking. The graphics card sometimes becomes more stable with less voltage. I'm guessing it has to do with the heat: when i apply around 250 mv the gpu core goes up to about 70c and the vrm 100+. So watercooling could possibly help a lot. Also i have another question: would an xt45 240 radiator be sufficient for an r9 290 only (i want to achieve high overclocks)?


What are you using to apply +250mV? The highest Trixx lets you apply is +200mV. Also, what resolution are you using?


----------



## thrgk

Are both DVI ports dual-link? I'm using a 144Hz monitor with a DVI cable but CCC only goes up to 60Hz?


----------



## fyzzz

Quote:


> Originally Posted by *LandonAaron*
> 
> What are you using to apply +250mV? The highest Trixx lets you apply is +200mV. Also, what resolution are you using?


I am using the msi volt unlock thing, i don't know what to call it. I have a 1080p monitor and sometimes play at 1440p upscaled


----------



## LandonAaron

Quote:


> Originally Posted by *fyzzz*
> 
> I am using the msi volt unlock thing, i don't know what to call it. I have a 1080p monitor and sometimes plays at 1440p upscaled


Is it an MSI card? I just haven't ever been able to get MSIAB to give me more than +100mV on these cards regardless of the settings. Anyway, on my cards I can go all the way up to +200mV fine at 1080p 60hz, but at 1440p 96hz the max voltage I can give them and remain stable is +100mV, with really +75mV being the most stable.


----------



## fyzzz

Quote:


> Originally Posted by *LandonAaron*
> 
> Is it an MSI card? I just haven't ever been able to get MSIAB to give me more than +100mV on these cards regardless of the settings. Anyway, on my cards I can go all the way up to +200mV fine at 1080p 60hz, but at 1440p 96hz the max voltage I can give them and remain stable is +100mV, with really +75mV being the most stable.


Nope, it is not an msi card, it is an asus card. I had trouble getting the overvolt thing to work in the beginning; it was finicky.


----------



## mfknjadagr8

Ok so I won't have time today but...I'm going to try a few things, probably tomorrow...I'll be increasing the power limit to max just to ensure it's not hitting a power limit somehow...even though it's stock clocks I'd like to rule that out...secondly I'll be watching per-core cpu clocks more closely, though I'm doubting this is the issue (unless I have a weak core which didn't show up in prime or fail ibt), and several people have run up to 4 290 cards perfectly fine with 8 and 9 series visheras...that said I'm not completely ruling it out....another thing I am considering is I have my page file set to 16gb as per the old school rule of double the ram...(yeah I know it's probably not needed but old habits die hard), so I'm considering letting Windows manage it and seeing if that changes anything....going to free up some space on my gaming drive and optimize and defrag it....also in a few days I will have the time for a full Windows reinstall...this will eliminate any Windows issues that might be affecting the situation...any other suggestions are welcomed...I will still be using Windows 7 because I'm not a big fan of 8 and refuse to jump on ten so soon...


----------



## Cyber Locc

Quote:


> Originally Posted by *LandonAaron*
> 
> Is it an MSI card? I just haven't ever been able to get MSIAB to give me more than +100mV on these cards regardless of the settings. Anyway, on my cards I can go all the way up to +200mV fine at 1080p 60hz, but at 1440p 96hz the max voltage I can give them and remain stable is +100mV, with really +75mV being the most stable.


It's in the OP - there's a hack for Afterburner to get really as much voltage as you want. I think it still reads +100 but you add a multiplier to it on the back end so it's really much more. Anyway, it's in the OP as a spoiler called "Add more volts to Afterburner"; here it is for you though, to make it easier.

Just use /wi4,30,8d,10 for 100mv. The offset step is 6.25 mv and the argument is in hexadecimal, so 10 hex = 16 dec, and 16 * 6.25 = 100mv. For 50mv you need 8. For 200mv you need 20 (20 hex = 32 dec, so 32 * 6.25 = 200mv).

The easy way to do changes:

Create a txt on the desktop. Write

CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /wi4,30,8d,10

and then save it as a .bat file. Every time you start this bat file MSI AB will start with +100mv.

For 50mv: 8
For 100mv: 10
For 125mv: 14
For 150mv: 18
For 175mv: 1C
For 200mv: 20

I wouldn't go over this point because:
1) You are close to leaving the sweet spot of the ref pcb vrms' efficiency.
2) These commands add 200mv on top of the 100mv offset through the AB gui. That means 300mv.

By default the /wi command applies to the current gpu only. So if you have 2 or more gpus you must use the /sg command. That means the command line is something like this:
ex: MsiAfterburner.exe /sg0 /wi6,30,8d,10 /sg1 /wi6,30,8d,10

Asus GPUTweak will also allow you to go to 1.412v if you flash an Asus bios or have an Asus card. I personally don't like the layout of GPUTweak, but from my experimentation using it seems to solve my throttle issue for the most part. In GPUTweak my clock drops 1-2mhz tops, where in Afterburner it drops 200-300mhz. I don't know why, just pointing it out.
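For anyone who wants to sanity-check that table rather than trust my hex, the 6.25mV-per-step encoding works out like this (a quick Python sketch; the helper name `wi_offset_arg` is just something I made up for illustration, it's not part of Afterburner):

```python
# Convert a desired extra voltage (mV) into the hex step count used as the
# last argument of the /wi command, assuming each step is 6.25 mV.
def wi_offset_arg(extra_mv: float) -> str:
    steps = round(extra_mv / 6.25)   # e.g. 100 mV -> 16 steps
    return format(steps, 'X')        # hexadecimal, e.g. 16 -> '10'

# Reproduces the table above:
for mv in (50, 100, 125, 150, 175, 200):
    print(f"For {mv}mv: {wi_offset_arg(mv)}")
```

Running it prints the same 8 / 10 / 14 / 18 / 1C / 20 values listed above, so the arithmetic in the post checks out.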


----------



## FastEddieNYC

Sapphire Trixx will add +200mV. It works great with my XFX card.


----------



## kizwan

A blast from the past.

I noticed a couple of people talking about voltage, so I'm quoting an old post for your reference. This is for reference cards of course, but it's still a good reference for you guys.
Quote:


> Originally Posted by *jomama22*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> All you guys need is good cooling, at least sub-ambient cooling. If you search in this thread, I'm pretty sure you can find few people running at 1.5V at least (both before & after Vdroop). Not 24/7 of course but for benching.
> 
> 
> 
> Just a fair warning. The actual silicon is not going to be the first thing to go; it will be a cap/vrm right in the middle of the stack on reference cards. 1.45v was enough to pop it under LN2. Not saying they are all going to do this, but because of how these chips are designed, it is very difficult even under ln2 to find one that will keep running for you through an entire bench @ above 1.45v. Went through 11 290x and found that out.
> 
> Remember, with these chips, because the imc is weaker than that of Tahiti, it is much more sensitive to the voltage you are throwing at the chip. Thus, even if the shaders, rops, rasterizers etc are all working properly and allowing higher clocks, a weak or very sensitive imc will ruin it.
> 
> It's not all about hynix vs elpida, it's about how well the imc works in conjunction with the rest of the chip. Sandy bridge-e is a very good parallel in this regard, where a higher core clock would make it much more difficult to attain higher (and stable) memory clocks.
> 
> I had a launch day 290x hit 1357/1749 (6996) with hynix @ a dimm check voltage of 1.43v on water. Anything above that voltage would end up with either black screens, lock-ups or visual artifacts. I had 2 other 290x hit 1340+ as well under the same voltage requirements, though these had elpida and were restricted to 1625 (6500) on the memory clock.
Click to expand...


----------



## Cyber Locc

Quote:


> Originally Posted by *kizwan*
> 
> A blast from the past.
> 
> I noticed a couple of people talking about voltage. I quote an old post for your reference. This is for referenced card of course but it's still good for reference for you guys.


Very nice info to have +Rep. And jeez, I wish I could hit 1350 lol. 1 of my cards is good for 1280 (could probably go higher, didn't really try) but the other is horrid, stuck at 1150 no matter what lol. I may just pick up an XFX DD, swap 'em, put the block on it and its stock cooler on the crappy-clocking sapphire, and throw it in another build.


----------



## LandonAaron

So I tested with my 2nd monitor connected to the iGPU, and that solved my DPC latency issues with crossfire, vsync, and using the 2nd monitor to monitor temps. So for anyone with the same issue, that is a viable workaround.


----------



## Cyber Locc

Quote:


> Originally Posted by *LandonAaron*
> 
> So I tested with my 2nd monitor connected to the iGPU and that solved my DPC latency issues with crossfire, vsync and using 2nd monitor to monitor temps. So for any with the same issue that is a viable work around.


Hmm, I wonder if USB adapters would work then? I had heard earlier today that the new beta supposedly fixes the issue, about to try that soon (I was going to tell ya about it, almost forgot). I sadly don't have any igpu ports, I'm on x79.


----------



## LandonAaron

Quote:


> Originally Posted by *cyberlocc*
> 
> Very nice info to have +Rep. And jeez, I wish I could hit 1350 lol. 1 of my cards is good for 1280 (could probably go higher, didn't really try) but the other is horrid, stuck at 1150 no matter what lol. I may just pick up an XFX DD, swap 'em, put the block on it and its stock cooler on the crappy-clocking sapphire, and throw it in another build.


I miss running my card at 1200. I've mentioned it a couple times, but yeah, at 1080p 60hz my cards can go that high, but for some reason at 1440p 96hz they can't go past 1100. But really that is a by-product of the real issue, which is that the cards black screen when given more than +100mV at that resolution. This is on an overclockable korean monitor with a custom resolution created with CRU. On the 1080p monitor with VSR enabled and the resolution upped to 1440p in game, I can still go to +200mV and 1200mhz. So I don't know where the problem lies or why it behaves the way it does. It's the only time I have ever encountered increased voltage decreasing stability and I don't know how to explain it.


----------



## Cyber Locc

Quote:


> Originally Posted by *LandonAaron*
> 
> So I tested with my 2nd monitor connected to the iGPU and that solved my DPC latency issues with crossfire, vsync and using 2nd monitor to monitor temps. So for any with the same issue that is a viable work around.


Just wanted to update you on the new driver. Not only does it not fix the issue, it makes it worse. Now I get occasional spikes even with Vsync off. This is getting absurd.....

Seriously, I love my 290s and all, and they're great for 4k, but sadly this will be the last time I buy AMD anything. This issue is ridiculous; if it was a month or 5 until a fix that's one thing, but we're nearing 2 years without a fix or even any status of a fix. They straight up don't care and don't even acknowledge it (aside from a few reps, but all they do is acknowledge then brush it off), and that isn't right.

To make things even funnier, I was just reading an article earlier that AMD is through being the "Budget Option" and is going to start making better performing products and putting higher pricing on them. Ya, good luck with that when you can't even fix an issue 2 years later. What a joke. Their performance is fine, on gpus at least; it's their drivers and support that are a joke.

Maybe DX12 will fix the issue, but I'm not holding my breath, as the issue is on their side. Also, before someone comes in and says they're busy, yada yada, they have to update for new games: me and the numerous others that are plagued with this couldn't give a crap about GTA V optimization when we can't play any game with decent audio. It doesn't really matter how optimized a game is if a portion of users can't have audio.


----------



## blackhole2013

I have a PowerColor Devil 13 Dual Core AXR9 290X II 8GBD5, which is basically an r9 295x2. Every time I run crossfire with both cores in GTA 5, the game crashes within ten minutes of playing, but if I turn one of the gpus off and don't run crossfire, the game does not crash. Can anybody tell me what's wrong?


----------



## moorhen2

Here is my Fire Strike effort.

http://www.3dmark.com/3dm/6908076


----------



## Sempre

Quote:


> Originally Posted by *Sempre*
> 
> Sadly i got a stripped screw while trying to remove the stock cooler. I tried the rubber band method and super glue but it still won't budge. Next i'll try to drill into and through the screw. Does anyone think it's a good/bad idea?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> This is the bit ill be using:


I decided to take the safe route and use an extractor. After waiting for a week it arrived, and it was a success. I finally managed to remove that _ONE_ stripped screw and now I can watercool it.

I recorded a video of the removal but I'll post it here later, since uploading to youtube will take a while.

Thank you all for your input @Devildog83 @Ramzinho @tsm106


----------



## LandonAaron

Quote:


> Originally Posted by *Sempre*
> 
> I decided to take the safe route and use an extractor. After waiting for a week it arrived and it was a success. I finally managed to remove that _ONE_ stripped screw and now i can watercool it.
> 
> 
> 
> 
> 
> 
> 
> I recorded a video of the removal but ill post it here later since uploading to youtube will take a while.
> 
> Thank you all for your input @Devildog83 @Ramzinho @tsm106


I had this same problem back in the day on a GTX580. I managed to just drill out the screw but it was a giant pain. How much did the tool cost, and where did you order it from? That would be a handy tool to have around.


----------



## maynard14

hi guys. i have a 290x ref card and a coollaboratory pro thermal paste..

is it a bad idea to put clp on the gpu die and mount the reference cooler of my 290x?


----------



## rdr09

Quote:


> Originally Posted by *maynard14*
> 
> hi guys. i have a 290x ref card and a coollaboratory pro thermal paste..
> 
> is it a bad idea to put clp on the gpu die and mount the reference cooler of my 290x?


you mean . . . Coollaboratory Liquid PRO?

if yes, then it is safe. Careful not to let it overflow past the core. I saw a YouTube video on how to apply it properly. I would put painter's tape or electrical tape around the core first before applying.

edit: Similar . . .


----------



## Ramzinho

Quote:


> Originally Posted by *Sempre*
> 
> I decided to take the safe route and use an extractor. After waiting for a week it arrived and it was a success. I finally managed to remove that _ONE_ stripped screw and now i can watercool it.
> 
> 
> 
> 
> 
> 
> 
> I recorded a video of the removal but ill post it here later since uploading to youtube will take a while.
> 
> Thank you all for your input @Devildog83 @Ramzinho @tsm106


this is probably the best thing I've heard today.. so glad you got your stuff figured out man.. Glad it was solved. enjoy the beasts.


----------



## Sempre

Quote:


> Originally Posted by *LandonAaron*
> 
> I had this same problem back in the day on a GTX580. I managed to just drill out the screw but it was a giant pain. How much did the tool cost and where did you order from. That would be a handy tool to have around.


I got it from amazon for $24:
http://www.amazon.com/dp/B000Q60UOO

The process was really easy it took me seconds.

Quote:


> Originally Posted by *Ramzinho*
> 
> this is probably the best thing I've heard today.. so glad you got your stuff figured out man.. Glad it was solved. enjoy the beasts.


Thanks for the kind words


----------



## maynard14

Quote:


> Originally Posted by *rdr09*
> 
> you mean . . . Coollaboratory Liquid PRO?
> 
> if yes, then yes, it is safe. Careful not to let it overflow past the core. I saw a You Tube video of how to apply it properly. I would put painter's tape or electrical tape around the core first before applying.
> 
> edit: Similar . . .


yes, i already tried it on my delidded 4770k and it works really well.. but im wondering: would applying nail polish to the little capacitors around the die protect them from the clp? or should i just cut electrical tape, put it on the capacitors, leave it there and put the cooler on?


----------



## rdr09

Quote:


> Originally Posted by *maynard14*
> 
> yes i already tried it on my 4770k delided processor and it works really.. but im wondering if applying nail polish to the little capacitors around the die will protect the capacitors from the clp? or just cut electric tape and put it on the capacitors and leave them and put the cooler?


should be fine to leave them on..


----------



## maynard14

Quote:


> Originally Posted by *rdr09*
> 
> should be fine to leave them on..


alright, thank you bro, i will try it later and post the result... hope it will have a massive boost haha


----------



## tsm106

Quote:


> Originally Posted by *maynard14*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> you mean . . . Coollaboratory Liquid PRO?
> 
> if yes, then yes, it is safe. Careful not to let it overflow past the core. I saw a You Tube video of how to apply it properly. I would put painter's tape or electrical tape around the core first before applying.
> 
> edit: Similar . . .
> 
> 
> 
> 
> 
> 
> 
> 
> 
> yes, i already tried it on my delidded 4770k and it works really well.. but im wondering: would applying nail polish to the little capacitors around the die protect them from the clp? or should i just cut electrical tape, put it on the capacitors, leave it there and put the cooler on?
Click to expand...

Don't use CLP or Pro, it is harder to remove, and using q-tips to apply is lame cuz you get cotton strands left behind. It also can harden over time. Instead use Ultra or CLU. And apply it like mentioned, with tape. I used blue tape because it doesn't get gummy when aged. You can remove the tape after, because the CLU isn't going anywhere and you should never apply so much that it runs. If you want, I suppose you can leave the tape there, albeit trimmed down to fit nicely.


----------



## maynard14

Quote:


> Originally Posted by *tsm106*
> 
> Don't use CLP or Pro, it is harder to remove, and using q-tips to apply is lame cuz you get cotton strands left behind. It also can harden over time. Instead use Ultra or CLU. And apply it like mentioned, with tape. I used blue tape because it doesn't get gummy when aged. You can remove the tape after, because the CLU isn't going anywhere and you should never apply so much that it runs. If you want, I suppose you can leave the tape there, albeit trimmed down to fit nicely.


thanks sir, but i only have clp here, i don't want to buy clu

did you try to put the reference cooler back on the 290x with clu, sir? or are you using a full water cooling block?

i see, that's scary if it hardens over time


----------



## cephelix

Quote:


> Originally Posted by *maynard14*
> 
> thanks sir, but i only have clp here, i don't want to buy clu
> 
> did you try to put the reference cooler back on the 290x with clu, sir? or are you using a full water cooling block?
> 
> i see, that's scary if it hardens over time


I have CLU on my stock msi r9 290 4G cooler....works real well.


----------



## maynard14

Quote:


> Originally Posted by *cephelix*
> 
> I have CLU on my stock msi r9 290 4G cooler....works real well.


really sir? it won't eat at the cooler, right? i know the stock cooler is not aluminum? ahmm, what is your load temp while gaming sir, i'm really curious

and did you lap the stock cooler? sorry for all the questions


----------



## MerkageTurk

Also a Physx poll running here and some good info.
http://forums.anandtech.com/showthread.php?t=2430693&page=12

Main reason for low fps


----------



## mfknjadagr8

Ok so after reinstalling Windows...cleaning up the hard drive...and clean installing everything...using the rtss overlay to monitor gta...I'm showing cpu usage across all cores in the mid 60s....gpu usage (unified, crossfire enabled, power limit +25, uom enabled but not yet used, ulps off) at 35% to 40% max seen over an hour of play, downsampled from 2560 x 1440 to 1920 x 1080....fps (with vsync on) stays around 60, but the overlay doesn't show the drops to 0 though the graph confirms them, and the stutter is still there...it's slightly less often but still happens every few minutes...settings are very high except no msaa and no fxaa...also no advanced options on... fps drops are into the high 30s and mid 40s...in between major drops I get slight drops under 60 but it's not bad...still open to suggestions...it's not unplayable but it is annoying and happens far too often..I feel at these settings I should be getting over 100 fps, much less over 60


----------



## AmcieK

Hello guys .

Is this block : http://images.bit-tech.net/content_images/2013/12/water-cooling-amd-s-radeon-r9-290x/kryographics-hawaii-4-1280x1024.jpg
compatible with xfx r9 290 EDFD version ? Or this http://angela.pl/p11328,ek-water-blocks-ek-fc-r9-290x-nickel-rev-2-0.html ?


----------



## Ramzinho

Quote:


> Originally Posted by *AmcieK*
> 
> Hello guys .
> 
> Is this block : http://images.bit-tech.net/content_images/2013/12/water-cooling-amd-s-radeon-r9-290x/kryographics-hawaii-4-1280x1024.jpg
> compatible with xfx r9 290 EDFD version ? Or this http://angela.pl/p11328,ek-water-blocks-ek-fc-r9-290x-nickel-rev-2-0.html ?


http://coolingconfigurator.com/waterblock/3831109869024

Compatible list:
VTX3D Radeon R9 290X X-Edition V2 4GB GDDR5 (VXR9 290X 4GBD5-DHX)
XFX Radeon R9 290 Black Double Dissipation Edition 4GB GDDR5 (R9-290A-EDBD)
XFX Radeon R9 290 Black Edition 4GB GDDR5 (R9-290A-ENBC)
XFX Radeon R9 290 Core Edition 4GB GDDR5 (R9-290A-ENFC)
XFX Radeon R9 290X 4GB GDDR5 (R9-290X-ENFC)
XFX Radeon R9 290X Black Edition 4GB GDDR5 (R9-290X-EDBD)
XFX Radeon R9 290X Double Dissipation Edition 4GB GDDR5 (R9-290X-EDFD)
XFX Radeon R9 290X Double Dissipation Edition 8GB GDDR5 (R9-290X-8DFD)


----------



## JourneymanMike

Quote:


> Originally Posted by *AmcieK*
> 
> Hello guys .
> 
> Is this block : http://images.bit-tech.net/content_images/2013/12/water-cooling-amd-s-radeon-r9-290x/kryographics-hawaii-4-1280x1024.jpg
> compatible with xfx r9 290 EDFD version ? Or this http://angela.pl/p11328,ek-water-blocks-ek-fc-r9-290x-nickel-rev-2-0.html ?


Apparently, it should be a go!

These are mine



Were you interested in the Active Back Plates?


----------



## LandonAaron

What is you guys' opinion on the Morphological Filtering / MLAA option in CCC? Do you turn it on or off for most games? Do you just turn it on when you can't use regular MSAA, or when not everything is anti-aliased through MSAA (like foliage or whatnot), or do you just keep it on all the time?


----------



## ikem

kinda sad when a 290x 8gb looks small... need a few more.


----------



## mfknjadagr8

Quote:


> Originally Posted by *ikem*
> 
> kinda sad when a 290x 8gb looks small... need a few more.


it only looks small beside that massive cooler







clean looking build though


----------



## Dunan

Hey, question: how can the 290X GPUs do 4k gaming if the max resolution is 2560x1600?


----------



## ikem

Quote:


> Originally Posted by *mfknjadagr8*
> 
> it only looks small beside that massive cooler
> 
> 
> 
> 
> 
> 
> 
> clean looking build though


thanks, just waiting on wire and sleeving. Then it will be done.


----------



## JourneymanMike

Quote:


> Originally Posted by *Dunan*
> 
> Hey, question: how can the 290X GPUs do 4k gaming if the max resolution is 2560x1600?


That's a good question since 4K is 3840 X 2160...

I'd like to know that myself...


----------



## tsm106

Quote:


> Originally Posted by *JourneymanMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Dunan*
> 
> Hey, question: how can the 290X GPUs do 4k gaming if the max resolution is 2560x1600?
> 
> 
> 
> That's a good question since 4K is 3840 X 2160...
> 
> I'd like to know that myself...
Click to expand...

The resolution capability depends on the port used and its available bandwidth, and much less on the actual card.


----------



## Dunan

Quote:


> Originally Posted by *tsm106*
> 
> The resolution capability is dependent upon the port used and its available bandwidth and much less to do with the actual card.


In other words, 2560x1600 would only be available through DVI, and 4k through DisplayPort, if I understand that correctly.


----------



## rdr09

Quote:


> Originally Posted by *Dunan*
> 
> In other words, 2560x1600 would only be available through DVI, and 4k through DisplayPort, if I understand that correctly.


dual-link DVI is good up to 2560x1600. DP is good up to 3840x2160.

I use DP for 4K.
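a rough back-of-the-envelope on why, if anyone's curious - compare the raw pixel rate a resolution needs against the ~330 MHz pixel clock dual-link DVI tops out at (this sketch ignores blanking overhead, so real requirements are a bit higher; DP 1.2 has plenty of headroom for 4K60 either way):

```python
# Raw pixel rate (MHz) for a given mode, ignoring blanking intervals.
def pixel_clock_mhz(width, height, refresh_hz):
    return width * height * refresh_hz / 1e6

DL_DVI_LIMIT_MHZ = 330  # approximate dual-link DVI pixel clock ceiling

for name, (w, h) in {"2560x1600": (2560, 1600), "3840x2160": (3840, 2160)}.items():
    pc = pixel_clock_mhz(w, h, 60)
    verdict = "fits dual-link DVI" if pc <= DL_DVI_LIMIT_MHZ else "needs DisplayPort"
    print(f"{name}@60Hz ~ {pc:.0f} MHz -> {verdict}")
```

2560x1600@60 lands around 246 MHz (under the DVI ceiling), while 3840x2160@60 is roughly 498 MHz before blanking - well past what dual-link DVI can carry, which is why 4K goes over DP.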


----------



## JourneymanMike

Quote:


> Originally Posted by *tsm106*
> 
> The resolution capability is dependent upon the port used and its available bandwidth and much less to do with the actual card.


Quote:


> Originally Posted by *Dunan*
> 
> Hey ques
> In other words, 2550x1600 would be only available thru DVI and 4k thru display port then if I understand that correctly.


Thanks guys, I've never used DisplayPort, only DVI & HDMI

+1 for each of you!


----------



## JourneymanMike

Quote:


> Originally Posted by *rdr09*
> 
> DVI is good up to 1600. DP is good up to 2160.
> 
> I use DP for 4K.


Thanks for your input also!

+1


----------



## aaroc

I had a big disappointment with MSI warranty. I bought an original MSI R9 290 with a noisy fan, and it started to show vertical lines when browsing the internet or minimizing windows. I sent it in for warranty (100 USD shipping) and they sent me back an MSI R9 290 4G, not the same model. When opening the very well protected box - wow, how could they send a GPU with several scratches on the metallic cover!!!! I know the warranty says it will be a refurbished GPU, but this one was refurbished in a war zone. And the last bit: the card has the black screen of death issue!!!!! Tested on 4 monitors (DVI, HDMI, DP) and two PCs. Even when running the Windows 7 experience test, the screens go black when testing DX9/DX10 Aero. W7, W8, W8.1, all the same problem. Anything 3D and it black screens after a few seconds. Omega, GTAV drivers, all tested. I have other brands' R9 290Xs and no problem.

How can they send a scratched GPU, untested, as a replacement! And to send it back will cost me another 100USD.


----------



## Offler

Hello guys.

About 6 months ago I purchased Sapphire R9-290x with reference cooler. I lapped the cooler and replaced thermal grease with Arctic Silver 5 (and some time later I figured how to switch graphics it back from performance to normal mode).

Even when cooler mod decreased temperature by 15°C, graphic card in this mode is still reaching critical temperatures in load and starts to throttle.

Most people here solve "what kind of water cooling I should use" but I still try to use stock cooler. My current recommendation is just to customize your fan curve. 20% is quite low. If you set it to 30% it will remove more heat when not loaded too much. The cooler is also able to keep GPU temperature below 90°C, yet it requires fan speed very high above 38% rpm (which last percentage not noisier than rest of PC). Usually about 60-72% depending on ambient temperature.

I also added a few fans in front of the GPU to feed it fresh air from outside.

a) Is there any air cooler no bigger than 2 slots (the same size as the current stock one)?
b) Any other tricks that may improve its efficiency?


----------



## piquadrat

I have a quite serious problem with my Sapphire R9 290:

Sometimes (purely at random) after power-up the automatic fan speed adjustment simply does not work and the cooler runs at low rpm no matter the temperature and GPU load. A reboot (once or twice) usually fixes the problem.
When the rpm regulation is stuck I can't see the actual rpm reading from the card. It looks like the sensor is gone: no diagnostic software (Afterburner, HWiNFO, SpeedFan, GPU-Z) can find it or display the actual fan speed value. The other sensors read OK. Manual rpm adjustment doesn't work either.
I'd appreciate any help.


----------



## LandonAaron

Quote:


> Originally Posted by *Offler*
> 
> Hello guys.
> 
> About 6 months ago I purchased Sapphire R9-290x with reference cooler. I lapped the cooler and replaced thermal grease with Arctic Silver 5 (and some time later I figured how to switch graphics it back from performance to normal mode).
> 
> Even when cooler mod decreased temperature by 15°C, graphic card in this mode is still reaching critical temperatures in load and starts to throttle.
> 
> Most people here solve "what kind of water cooling I should use" but I still try to use stock cooler. My current recommendation is just to customize your fan curve. 20% is quite low. If you set it to 30% it will remove more heat when not loaded too much. The cooler is also able to keep GPU temperature below 90°C, yet it requires fan speed very high above 38% rpm (which last percentage not noisier than rest of PC). Usually about 60-72% depending on ambient temperature.
> 
> I also added few fans in front of GPU to feed gpu with fresh air from outside.
> 
> a) Is there any air cooler, not bigger than 2 slots? (same size as current stock)
> b) Any other tricks which may improve its efficiency?


Check out Arctic's Accelero and Gelid's VGA coolers. They are quite good and I am sure you will find one that fits your needs.

http://www.newegg.com/Product/Product.aspx?Item=N82E16835186097

http://www.newegg.com/Product/Product.aspx?Item=N82E16835426026


----------



## PsYcHo29388

For a 290, is it okay to use one PCIe cable with two 6+2 pin connectors on it, or do you guys recommend I use two separate cables?

I read something about the black screen issues being related to that.


----------



## tsm106

Quote:


> Originally Posted by *PsYcHo29388*
> 
> For a 290, Is it okay to use one PCIE Cable with two 6+2Pin connectors on it or do you guys recommend I use two separate cables?
> 
> I read something about the black screen issues being related to that.


It doesn't matter. I've never found that power cables had much to do with blackscreens. It's more to do with the IMC than anything else.


----------



## PsYcHo29388

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PsYcHo29388*
> 
> For a 290, Is it okay to use one PCIE Cable with two 6+2Pin connectors on it or do you guys recommend I use two separate cables?
> 
> I read something about the black screen issues being related to that.
> 
> 
> 
> It doesn't matter. I've never found that power cables had much to do with blackscreens. It's more to do with the IMC than anything else.
Click to expand...

Did a little more digging around and found this thread.
http://www.overclock.net/t/1441349/290-290x-black-screen-poll/0_30
So, based on the poll, am I less likely to have issues since the BIOS on my mobo is Legacy?


----------



## tsm106

Quote:


> Originally Posted by *PsYcHo29388*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PsYcHo29388*
> 
> For a 290, Is it okay to use one PCIE Cable with two 6+2Pin connectors on it or do you guys recommend I use two separate cables?
> 
> I read something about the black screen issues being related to that.
> 
> 
> 
> It doesn't matter. I've never found that power cables had much to do with blackscreens. It's more to do with the IMC than anything else.
> 
> Click to expand...
> 
> Did a little more digging around and found this thread.
> http://www.overclock.net/t/1441349/290-290x-black-screen-poll/0_30
> So, based on the poll, am I less likely to have issues since the BIOS on my mobo are Legacy?
Click to expand...

Are you suffering from blackscreens? If not, stop worrying.


----------



## mfknjadagr8

Quote:


> Originally Posted by *tsm106*
> 
> Are you suffering from blacksreens? If not stop worrying?


my black screens went away after disabling ULPS and going under water...couldn't be happier about that


----------



## PsYcHo29388

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PsYcHo29388*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PsYcHo29388*
> 
> For a 290, Is it okay to use one PCIE Cable with two 6+2Pin connectors on it or do you guys recommend I use two separate cables?
> 
> I read something about the black screen issues being related to that.
> 
> 
> 
> It doesn't matter. I've never found that power cables had much to do with blackscreens. It's more to do with the IMC than anything else.
> 
> Click to expand...
> 
> Did a little more digging around and found this thread.
> http://www.overclock.net/t/1441349/290-290x-black-screen-poll/0_30
> So, based on the poll, am I less likely to have issues since the BIOS on my mobo are Legacy?
> 
> Click to expand...
> 
> Are you suffering from blacksreens? If not stop worrying?
Click to expand...

I don't actually have the card yet, and if my questions are annoying you, you don't have to answer them.


----------



## tsm106

The point is it's a waste of time to worry about an unknown.


----------



## mfknjadagr8

Quote:


> Originally Posted by *PsYcHo29388*
> 
> I don't actually have the card yet, and if my questions are annoying you, you don't have to answer them.


I surmise my black screens were caused by heat spikes from load changes, and perhaps by ULPS wanting to change states when it shouldn't have been...but I have no proof, that's just my theory...the reason I say this is that upon disabling ULPS my black screens went away, provided I ramped up my fans before gaming...before that, even with fans at 100 percent, I was getting black screens...(you would not have to worry about this with a single card, as ULPS is used in crossfire, tri-fire and quad-fire only)...after going under water I didn't have to worry about fan speed and haven't had a crash or black screen since


----------



## tsm106

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PsYcHo29388*
> 
> I don't actually have the card yet, and if my questions are annoying you, you don't have to answer them.
> 
> 
> 
> I surmise my black screens were caused by heat spikes due to load changes and perhaps ulps wanting to change states when it shouldn't have been...but I have no proof that's my theory...
Click to expand...

ULPS puts slave cards in crossfire to sleep. It doesn't change powerstates. I don't know how many times this has to be explained.


----------



## mfknjadagr8

Quote:


> Originally Posted by *tsm106*
> 
> ULPS puts slave cards in crossfire to sleep. It doesn't change powerstates. I don't know how many times this has to be explained?


Then why is it called Ultra Low Power State? And what would you call putting it to sleep? Does that not go from powered to not powered, which is in fact two different power states?







but seriously, it could have been trying to put the card to sleep and use it at the same time...all I know is that after disabling it my black screens stopped, assuming my temps were in check...nothing else was changed...not a single setting...not a single Windows update...so I'm not disputing what you are saying, just giving my experience with it...


----------



## PsYcHo29388

Quote:


> Originally Posted by *tsm106*
> 
> The point is it's a waste of time to worry about an unknown.


At least now, with the threads I found, I can most likely fix any issues that arise. If I hadn't worried at all I wouldn't have learned anything.


----------



## mfknjadagr8

Quote:


> Originally Posted by *PsYcHo29388*
> 
> At least now with the threads I found I can most likely fix any issues that arise. If I didn't worry at all I wouldn't have learned anything.


yeah, being informed on an issue is nice if it ever happens to you...if it doesn't, then great...you can never be too prepared...but I also wouldn't lose sleep over it







...there are a lot of theories on the causes of the black screens...some of the things that work for some don't work for others, and in some cases someone never gets it fixed and RMAs rather than try more things...I get frustrated pretty easily, but I'm also intrigued and like to find and fix the issue, even if I will die of an aneurysm one day


----------



## tsm106

Quote:


> Originally Posted by *PsYcHo29388*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> The point is it's a waste of time to worry about an unknown.
> 
> 
> 
> At least now with the threads I found I can most likely fix any issues that arise. If I didn't worry at all I wouldn't have learned anything.
Click to expand...

When ppl don't understand something they come up with all sorts of theories, like the pcie cables. This reminds me of the Cayman days, the 6950s unlocking to 6970. At first some thought it had to do with the color of the actual pcie plugs. Gray plugs meant it unlocked, lmao. Obviously that didn't turn out to be the case. It's the same with blackscreens. There are two kinds, one from defect and one user initiated. The problem with the latter is that most of the time ppl don't know enough about how these cards work to realize they are initiating the blackscreens. Configuration is everything when it comes to an enjoyable experience.


----------



## LandonAaron

My black screens are always caused by too high an overclock. How far I can overclock varies from game to game though. Also, lowering the refresh rate then raising it back to normal resets the video output and recovers from a black screen.


----------



## Jflisk

Quote:


> Originally Posted by *PsYcHo29388*
> 
> At least now with the threads I found I can most likely fix any issues that arise. If I didn't worry at all I wouldn't have learned anything.


To answer your question: I use 2 x 6+2 on the same PCI-E cable and do not have problems with black screens. I have 3 x R9 290X connected this way with no problems.

ULPS turns off all but one video card when not in use, such as in crossfire configurations. It causes more problems than not when left on.


----------



## mfknjadagr8

Quote:


> Originally Posted by *tsm106*
> 
> When ppl don't understand something they come up with all sorts of theories, like the pcie cables. This reminds me of the Cayman days, the 6950s unlocking to 6970. At first some thought it had to do with the color of the actual pcie plugs. Gray plugs meant it unlocked, lmao. Obviously that didn't turn out to be the case. It's the same with blackscreens. There are two kinds, one from defect and one user initiated. The problem with the latter is that most of the time ppl don't know enough about how these cards work to realize they are initiating the blackscreens. Configuration is everything when it comes to an enjoyable experience.


ok, so what initiates a black screen on a default card with default settings, drivers installed properly, and power settings always favoring performance, barring defect as the cause? I'm genuinely interested to know







furthermore, how did disabling ULPS consequently stop mine from occurring?


----------



## tsm106

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> When ppl don't understand something they come up with all sorts of theories, like the pcie cables. This reminds me of the Cayman days, the 6950s unlocking to 6970. At first some thought it had to do with the color of the actual pcie plugs. Gray plugs meant it unlocked, lmao. Obviously that didn't turn out to be the case. It's the same with blackscreens. There are two kinds, one from defect and one user initiated. The problem with the latter is that most of the time ppl don't know enough about how these cards work to realize they are initiating the blackscreens. Configuration is everything when it comes to an enjoyable experience.
> 
> 
> 
> ok so what initiates a black screen in a default card with default settings with drivers installed properly and power settings to always favir performance barring defect as the cause...I'm genuinely interested to know
> 
> 
> 
> 
> 
> 
> 
> furthermore how did disabling ulps consequently stop mine from occurring?
Click to expand...

There are fundamentals that you still do not understand even though I've linked you the information and threads to read. For the last time: ULPS is not compatible with unofficial overclock mode. If ULPS is on and you raise clocks beyond the Overdrive limits, it will immediately BSOD. It goes without saying that overclocking apps cannot read the cards while they are sleeping, which causes issues of its own. Further, ULPS is the last powerstate the cards will enter. Again, it "IS" a powerstate; it is not the thing that changes powerstates. The cards have to go into idle modes first. As for your question, maybe the watercooling has something to do with it? Which would mean you had temp issues to begin with. It's hard to say when you still misunderstand what ULPS is.

That said, blackscreens can happen on a stock card due to hw acceleration, especially browser related.
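The ordering I keep describing can be sketched like this. The state names and step logic are illustrative only, not AMD's internal implementation; the one property that matters is that ULPS sits at the bottom of the ladder and is only reached through idle, never straight from a 3D load:

```python
from enum import IntEnum

class PowerState(IntEnum):
    # Ordered from full 3D load down to deepest sleep. A slave card in
    # crossfire can only reach ULPS after it has already dropped to idle.
    LOAD_3D = 3
    LIGHT_LOAD = 2
    IDLE = 1
    ULPS = 0

def next_state(current, gpu_busy):
    """Step one state deeper when idle; jump straight back up under load.

    Illustrative sketch only: the real driver logic is more involved,
    but ULPS is never entered directly from a 3D load.
    """
    if gpu_busy:
        return PowerState.LOAD_3D
    return PowerState(max(current - 1, PowerState.ULPS))
```

Which is why "disabling ULPS fixed my in-game black screens" doesn't add up: under a 3D load the card never gets anywhere near that state.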


----------



## PsYcHo29388

Quote:


> Originally Posted by *mfknjadagr8*
> 
> yeah being informed on an issue is nice if it ever happens to you...if it doesn't then great...you can never be to prepared...but I also wouldn't lose sleep over it
> 
> 
> 
> 
> 
> 
> 
> ...there are a lot of theories on the causes of the black screens...done of the things that work for some don't for others and in some cases someone never gets it fixed and rmas rather than try more things...I get frustrated pretty easily but I'm also intrigued and like to find and fix the issue even if I will die of an aneurysm one day


I definitely won't lose sleep by worrying about the card, as I only got curious about the various issues today and that has been put to rest now.
Quote:


> Originally Posted by *tsm106*
> When ppl don't understand something they come up with all sorts of theories, like the pcie cables. This reminds me of the Cayman days, the 6950s unlocking to 6970. At first some thought it had to do with the color of the actual pcie plugs. Gray plugs meant it unlocked, lmao. Obviously that didn't turn out to be the case. It's the same with blackscreens. There are two kinds, one from defect and one user initiated. The problem with the latter is that most of the time ppl don't know enough about how these cards work to realize they are initiating the blackscreens. Configuration is everything when it comes to an enjoyable experience.


That has to be one of the dumbest theories I have ever heard of. As for the blackscreens, I'll probably be fine. If not, I can try a few fixes and if those don't work I can only assume I received a defective card.
Quote:


> Originally Posted by *Jflisk*
> 
> To answer your question . I use 2 x 6+2 on the same PCI-E cable and do not have problems with black screen. I have 3 x R9 290X connected this way and have no problems.


Thanks for the response. I haven't dealt with PCIe cables like this before, and it made sense to me at first since this card does draw a ton of power.


----------



## kizwan

Quote:


> Originally Posted by *PsYcHo29388*
> 
> At least now with the threads I found I can most likely fix any issues that arise. If I didn't worry at all I wouldn't have learned anything.


Black screens can be caused by many things; overheating memory and insufficient power are two examples. With Hawaii cards, some people have black screen problems when the monitor goes to sleep. Disabling monitor sleep fixed the problem for them, while others fixed it by increasing voltage or the power limit. There are also cases that can't be fixed at all except by RMAing the card. There's no one solution that works for everyone. That's why tsm told you to stop worrying about the unknown.

"Most likely"? No. "Likely"? Yes. Don't rely too much on that thread, because you may be disappointed if your card still ends up black screening.

Basically you were worried, and that's fine, but the one thing you should have learned from posting here is to stop worrying. Or you can go green. The choice is yours.
Quote:


> Originally Posted by *mfknjadagr8*
> 
> ok so what initiates a black screen in a default card with default settings with drivers installed properly and power settings to always favir performance barring defect as the cause...I'm genuinely interested to know
> 
> 
> 
> 
> 
> 
> 
> furthermore how did disabling ulps consequently stop mine from occurring?


Did the LED on the back of the secondary card turn on when you got the black screen, provided the secondary card was being utilized when it happened? Like tsm said, the secondary card can only enter ULPS mode when it's idling at 2D clocks. It's not normal for the clocks to drop to 2D clocks while the card is being utilized.

Mine doesn't have the black screen issue with ULPS enabled or after disabling it.


----------



## mfknjadagr8

Quote:


> Originally Posted by *tsm106*
> 
> There are fundamentals that you still do not understand even though I've linked you the information and threads to read. For the last time ULPS is not compatible with unofficial overclock mode. If ULPS is on and you raise clocks beyond Overdrive limits, it will immediately BSOD. It goes w/o saying that overclock apps cannot read the cards while they are sleeping, which causes issues unto itself. Further ULPS is the last powerstate that the cards will enter. Again it "IS" a powerstate, it is not the thing that changes powerstates. The cards have to go into idle modes first. As far as your question, maybe the watercooling has something to do with it? Which would mean you had temp issues to begin with? It's hard to say when you still misunderstand what ULPS is.
> 
> That said blackscreens can happen on a stock card due to hw acceleration, especially browser related.


at first, yes, I did have temp issues because the paste had broken down; after I repasted, temps were better...but I was still getting black screens even with fans at 100 percent (like a jet with the reference cooler), and the temps never broke 75°C (max spike), which is warm but not horrible...I hadn't overclocked at all at that point and was not using unofficial overclock mode either...also, you say I didn't understand what I read, but I never read anything you linked me about ULPS except the how-to, which states how to disable it and that it prevents cards from sleeping, so again, I'm not illiterate, nor do I lack comprehension...but I see what you are saying now about ULPS: it simply allows the card to enter that state, it doesn't control the actual changing of states. That was the confusion...but on topic, temps were not great but OK...disabling ULPS while still on the stock reference cooler stopped the black screens, unless I didn't ramp up the fans to keep the core cool...they needed about 65 percent to stay cool enough...but I had already planned to put them under water, so at that point I did, and didn't have to worry about fan curves and such...yes, I still have a lot to learn, no disputing that, but my experience with this and your definition of it raise more questions than answers for me, because I only experienced black screens in 3D modes...so it baffles me why disabling ULPS could have fixed this, unless for some reason it was trying to do it while the card wasn't idle...I never once blue screened either


----------



## tsm106

Quote:


> Originally Posted by *mfknjadagr8*
> 
> at first yes I did have temp issues because paste had broken down after I repasted temps were better...but I still was getting black screens even when fans were at 100 percent (like a jet with reference cooler) the temps never broke 75c (max spike)which is warm but not horrible...I hadn't overclocked at all at that point and was not using unofficial overclock mode either...*also you say I didn't understand what I read I never read anything you linked me about ulps* except in the how to it states how to disable and that it prevents cards from sleep so again I'm not illiterate nor do I have a lack of comprehension..but I see what you are saying now about ulps it simply allows the card to enter that state it doesn't control it actually changing states that was the confusion...but on the topic temps were nut great but ok..disabling ulps while still on stock reference cooler stopped the blacks screens unless I didn't ramp up the fans to keep the core cool...they needed about 65 percent asked to keep them cool enough....but I had already planned to put them under water so at that point I did and did not have to worry about fan curves and such...yes I still have a lot to learn no disputing that but my experience with this and your definition if it raise more questions than answers for me in that it makes no sense whatsoever because I only experienced black screens in 3d modes...*so it baffles me why disabling ulps could have fixed this* unless for some reason it was trying to do this while it wasn't idled...I never once blue screened either


ULPS isn't active under a 3D load, so it couldn't have done anything!

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> It's because your definition of ULPS is incorrect. You're attributing things to ULPS that it is not. ULPS puts slave cards in crossfire to sleep. That is all. Clocks going up and down or not to idle are a whole other bag of beans.
> 
> 
> 
> ok so what does determine the clocks staying persistent? I would like my clocks to always be at the setting or within a small margin due to some voltage drift.....I don't care about power consumption during idle our at ask for that matter
> 
> Click to expand...
> 
> I don't understand what clocks have to do with voltage drift? You mean droop? Clocks to be at what setting? If your gpu usage is going to zero, you most likely do not have unified gpu monitoring enabled. It's too complicated to explain w/o writing an essay on how to control clocks because it touches upon other fundamentals. *In the spoilered sections, I explain in more detail how each aspect works.* It's all interrelated. Btw, we really disable ULPS because it interferes with ocing apps, especially with unofficial mode. Otherwise, I'd never disable it lol.
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40
Click to expand...

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Ok so after reinstalling Windows...cleaning up hard drive...and clean installing everything...using the overlay of rtss to monitor gta...I'm showing cpu usage across all cores in mid 60s....gpu usage (unified, crossfire enabled, power limit +25 mv, *uom enabled* but but yet used, ulps off)at 35% to 40% max seen over over hour of play downsampled from 2500 x 1440 to 1920 x 1080....fps (with vsync on stays around 60 but the overlay doesn't show drops to 0 but graph confirms and the stutter is still there...it's slightly less often but still happens every few minutes...settings are very high except no msaa and no fxaa...also no advanced options on... Fps drops are into the high 30s and mid 40s...in between major drops I get slight drops under 60 but it's not bad...still open to suggestions...it's not unplayable but it is annoying and happens far to often..I feel at these settings I should be getting over 100 fps much less over 60


----------



## mfknjadagr8

Quote:


> Originally Posted by *tsm106*
> 
> ULPS isn't active under a 3D load so it couldn't have done anything!


ok, UOM is enabled NOW; it was not months ago when the black screens were occurring...as for ULPS only working at 2D clocks, ok, so that means there are only a few explanations left...1) the remount of the stock cooler with fresh paste helped VRM temps, which for some reason didn't stop the black screens until a second reboot; 2) an unforeseen driver issue resolved itself upon the second reboot; 3) hell if I know...since ULPS is fully ruled out, I'm unsure what changed...I didn't change any other settings, simply disabled ULPS, then ramped up the fans and started the game (Metro Last Light), and no more black screens, except one time when I forgot to increase the fan speeds before gaming


----------



## PsYcHo29388

Quote:


> Originally Posted by *kizwan*
> 
> Black screen can be caused by many things. Overheating memory & insufficient power are example that can cause black screen. With Hawaii card, some people have black screen problem if the monitor go to sleep. Disabling monitor sleep fixed the problem for them while others fixed it by increasing voltage or power limit. There also cases that can't get fixed at all except RMA the card. There's not one solution that worked for all. That's why tsm told you to stop worrying of the unknown.
> 
> "Most likely"? NO! "Likely" Yes! Don't rely too much on that thread because you may disappointed if your card still ended up black screening.
> 
> Basically you worried, that's cool & after posting here, one thing you should learned already which is stop worrying. Or you can go green. The choice is yours.


I had probably less than an hour's worth of worry before I decided "Meh, I'll be fine", and you are right that I shouldn't worry about what I do not know yet. If the card turns out to be faulty and there's nothing I can do, Newegg should have me covered.

To be honest I wanted to go green at first, but to do so I'd have to get a GTX 970, which was out of the budget by a good margin, and since the 290 performs only a little worse I figured I'd go with that.


----------



## the9quad

I bought 5 cards, *3 were faulty*. That should tell you something about 290X quality control at its initial release. I RMA'd cards until I no longer blackscreened; no other changes.


----------



## Ized

Have we established safe voltages? Most of the results on Google are old, dated around launch time, and probably guesses at best. Have any major problems arisen since?

Is, uh, 1.305V okay? Seems pretty high, but the core temp is 70°C, VRM1 71°C, VRM2 76°C, so temps are in check.

1200Mhz on the core and 1480Mhz on the memory.


----------



## kizwan

Quote:


> Originally Posted by *Ized*
> 
> Have we established safe voltages? Most of the results on Google are old, dated around launch time and probably guesses at best. Any major problems arisen since?
> 
> Is uh 1.305V okay? Seems pretty high but Core Temp is 70c VRM1 71c VRM2 76c so temps are in check.
> 
> 1200Mhz on the core and 1480Mhz on the memory.


Below 1.4V, provided temps are in check. Use a DMM to check the voltage; software readings are never fully accurate, coming in either lower or higher than the actual voltage.
Quote:


> Originally Posted by *jomama22*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> All you guys need is good cooling, at least sub-ambient cooling. If you search in this thread, I'm pretty sure you can find few people running at 1.5V at least (both before & after Vdroop). Not 24/7 of course but for benching.
> 
> 
> 
> 
> 
> 
> 
> Just a fair warning. The actual silicon is not going to be the first thing to go, it will be a cap/vrm right in the middle of the stack on reference cards. 1.45v was enough to pop it under LN2. Not saying they are all going to do this, but because of how these chips are designed, it is very difficult even under ln2 to find one that will keep running for you through an entire bench @ above 1.45v. Went through 11 290x and found that out.
> 
> Remember, with these chips, because the IMC is weaker than that of Tahiti, it is much more sensitive to the voltage you are throwing at the chip. Thus, even if the shaders, ROPs, rasterizers etc. are all working properly and allowing higher clocks, a weak or very sensitive IMC will ruin it.
> 
> It's not all about Hynix vs Elpida; it's about how well the IMC works in conjunction with the rest of the chip. Sandy Bridge-E is a very good parallel in this regard, where a higher core clock would make it much more difficult to attain higher (and stable) memory clocks.
> 
> I had a launch day 290X hit 1357/1749 (6996) with Hynix at a DMM-checked voltage of 1.43v on water. Anything above that voltage would end up with either black screens or lock-ups with visual artifacts. I had 2 other 290Xs hit 1340+ as well under the same voltage requirements, though those had Elpida and were restricted to 1625 (6500) on the memory clock.
Click to expand...


----------



## tsm106

^^I've run 1.5v on my trio since release w/o fail. At times I ran even higher voltage too. Btw, base voltage is 1.25v input. Thus adding +200mv will fry cards? For reals? No, no freaking way. As for Jomama, I don't know if he's talking load after droop, because there's no way that simply adding +200mv will pop your vrms given adequate cooling. I would presume after droop, because it would then make sense that he's dumping around 1.6v input. Btw, I used Fuji Extreme pads on my blocks, dunno what he was using. No fried vrms here on ref cards. Here's what those cards did before I got rid of them because of Sapphire's warranty thread lol.



http://www.3dmark.com/fs/1179138
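For anyone following the voltage talk above, the arithmetic (base VID plus software offset gives the set voltage, then droop pulls it down under load) is simple to sanity-check. The droop figure below is purely illustrative; actual droop varies by card, VRM load line and PowerTune settings, so the only trustworthy number is one measured at the card with a DMM:

```python
def load_voltage(base_v, offset_mv, droop_mv):
    """Return (set_voltage, load_voltage) in volts.

    set voltage  = base VID + software offset
    load voltage = set voltage - droop under load
    The droop value is an illustrative assumption, not a spec.
    """
    set_v = base_v + offset_mv / 1000.0
    return set_v, set_v - droop_mv / 1000.0

# e.g. a 1.25 V base VID with a +200 mV offset gives a 1.45 V set voltage
```

Which is the point being made: +200mv on a 1.25v base lands at 1.45v set, and only what's left after droop actually reaches the core.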


----------



## kizwan

After droop I think. I have run many back to back benching with +200mv, no problem here either. Water & Fujis too.


----------



## Ized

Alright, thanks for the voltage information. It makes me feel better about the +200mv runs I did trying to get 1222MHz without artifacts.

Are there voltage test points somewhere on the card? Otherwise I am way too twitchy to be poking random pins and such.









Jumping from 1100MHz to 1200MHz, while not massive, seems to have really helped with my 5-10 FPS dips in GTA 5. I am very happy!

Those Fuji pads sound pretty popular; would they be a step up over MX5 paste on my VRM heatsinks? (I'm on an AIO closed loop thingy with the Gelid Icy Vision VRM "kit".)


----------



## Ha-Nocri

heya ppl, just got m'self a Sapphire Tri-X 290. Have a question: is it normal that the core clock goes to 1GHz while playing a YouTube video, while in Dark Souls 2 it sits at 600-800MHz? lel

Or is it just AMD at work?

also, the memory clock keeps jumping back and forth between 1300 and 150 MHz while in the browser


----------



## Nox95

Quote:


> Originally Posted by *Ha-Nocri*
> 
> heya ppl, just got m'self a sapphire tri-x 290. Have a question. Is it normal that core clock goes to 1GHz while playing a youtube video, while in Dark Souls 2 it is @ 600-800MHz? lel


Yeah, mine does that too. That's hardware acceleration for your YouTube player, I guess. Do you have Vsync enabled whilst playing DS? That might be the reason your card clocks down. Or it is reaching the thermal limit and thus reduces clocks to stay within spec.


----------



## LandonAaron

Quote:


> Originally Posted by *Ha-Nocri*
> 
> heya ppl, just got m'self a sapphire tri-x 290. Have a question. Is it normal that core clock goes to 1GHz while playing a youtube video, while in Dark Souls 2 it is @ 600-800MHz? lel
> 
> Or is it just AMD at work?
> 
> also memory clock keeps jumping back and forth 1300 and 150 MHz while in browser


In my experience, in older, less demanding games where GPU utilization is low (around 40-60%) you will see large downclocks, as the card will only work as hard as it thinks it needs to to keep up with the game. To disable the downclocking, open MSI Afterburner and switch Unofficial Overclocking Mode to "without PowerPlay support"; the card will then just run at whatever OC you have set without downclocking. I always switch it back to disabled once I'm on the desktop.
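The same switch can also be flipped directly in MSIAfterburner.cfg, the way it's commonly documented in Afterburner guides. Treat the section name, keys and values below as things to verify against your own Afterburner version rather than gospel; the EULA string must be typed out in full or the mode stays locked:

```ini
; MSIAfterburner.cfg fragment (as commonly documented; verify for your version)
[ATIADLHAL]
; the confirmation sentence must match exactly to unlock unofficial mode
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
; 0 = disabled, 1 = with PowerPlay support, 2 = without PowerPlay support
UnofficialOverclockingMode = 2
```

Remember tsm's warning earlier in the thread: ULPS is not compatible with unofficial overclock mode, so disable ULPS before enabling this.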


----------



## tsm106

Quote:


> Originally Posted by *Ized*
> 
> Alright thanks for the voltage information. Makes me feel better about the +200mv runs I did trying to get 1222Mhz without artifacts.
> 
> Are there voltage test points somewhere on the card? Otherwise I am way too twitchy to be poking random pins and such
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Jumping from 1100Mhz to 1200Mhz while not massive, seems to have really helped with my 5-10 FPS dips in GTA 5, I am very happy!
> 
> *Those Fuji pads sound pretty popular*, would they be a step up over MX5 paste on my VRM heatsinks? (Im on a AIO closed loop thingy with GELID ICY Vision VRM "kit")


They're somewhat wasted in most applications. I would definitely get them if you are pushing +300mv or higher to your card. If you are only going up to +200mv, like with Trixx, you can skip them. The most your VRMs will hit on a block with stock pads is around 65-70c, which is hardly something to worry about on a full cover. Btw, I would look into getting a heatsink blank and cutting a nice fat VRM block. The Gelid VRM heatsink is not very dense; I didn't like it when I was testing GPU-only. With a heatsink on your VRMs, keeping ample airflow over them is the most important thing imo. Check the link below for a nice VRM mod.

http://www.overclock.net/t/1511914/sapphire-r9-290-arctic-accelero-xtreme-iv-vrm-mod/0_40

Quote:


> Originally Posted by *Ha-Nocri*
> 
> heya ppl, just got m'self a sapphire tri-x 290. Have a question. Is it normal that core clock goes to 1GHz while playing a youtube video, while in Dark Souls 2 it is @ 600-800MHz? lel
> 
> Or is it just AMD at work?
> 
> also memory clock keeps jumping back and forth 1300 and 150 MHz while in browser


That's hw acceleration pushing your card into different powerstates. Read the link below for info on how to control that.

http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40


----------



## Agent Smith1984

Freshly assembled S340 build.....


Yes, that is red paint you see on those trixxies!!


----------



## mAs81

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yes, that is red paint you see on those trixxies!!


They look great man, kudos!!
I'd like to see more photos of them painted - if you have them, that is


----------



## LandonAaron

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 
> Freshly assembled S340 build.....
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Yes, that is red paint you see on those trixxies!!


Very nice. Did you paint the S340 yourself as well?


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Freshly assembled S340 build.....
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Yes, that is red paint you see on those trixxies!!


Lovely.


----------



## Agent Smith1984

Quote:


> Originally Posted by *LandonAaron*
> 
> Very nice. Did you paint the S340 yourself as well?


Thanks guys!

The S340 actually came that way.
Matte black, with red cable plate and intake port up front. Also red power LED.
I love this little case









I will get some pics up of the cards at some point Friday or Saturday.
I gotta pull em back out for some touch up, and will get some shots when I'm done.


----------



## Linxus

Hey guys,

I recently picked up a Sapphire Tri-X R9 290X and overclocked it using the Sapphire TriXX tool. I'm loving this new card so far and it's running GTA 5 @ 1440p wonderfully. I achieved the following settings and the card seems stable:

GPU Clock: 1150
Memory Clock: 1450
VDDC Offset: 93
Power Limit: 25

Running Valley Benchmark 1.0, my GPU temperature hovers around 86ºC; however, my VRM 1 temperature jumps up to 108ºC and hovers around 104-105ºC. The VRM 2 temperature stays around 76ºC. I'm using a custom fan profile with the fan running around 65% at 85ºC. Should I be worried about this high VRM 1 temperature? Are my settings too aggressive? I've noticed that my TriXX options are different than most guides on here. Any input is appreciated.

EDIT: I bumped the fan speed up to a fixed 85% and the VRM 1 temperature is now hovering around 100ºC. I should also mention that I have a NZXT Phantom 410 with (2) intake fans on front, (1) intake on bottom, (1) exhaust on rear, and (2) exhaust on top through the H100i CPU cooler. There is also an exhaust fan on the side of the case.
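As an aside, a custom fan profile like the one described is just piecewise-linear interpolation between (temperature, duty) points. Here is a minimal sketch in Python; the curve points are made up for illustration, not a recommendation for any particular card:

```python
def fan_duty(temp_c, curve):
    """Linearly interpolate fan duty (%) from sorted (temp_c, duty_pct) points."""
    curve = sorted(curve)
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the curve: floor duty
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # above the curve: max duty
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

# Hypothetical curve, loosely shaped like the profile described above
profile = [(40, 20), (60, 40), (75, 55), (85, 65), (95, 100)]
print(fan_duty(85, profile))  # -> 65.0
```

Tools like TriXX and Afterburner evaluate essentially this function against the core temperature sensor every polling interval.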


----------



## pcrevolution

Quote:


> Originally Posted by *Linxus*
> 
> Hey guys,
> 
> I recently picked up a Sapphire Tri-X R9 290X and overclocked it using the Sapphire TriXX tool. I'm loving this new card so far and its running GTA 5 @ 1440p wonderfully. I achieved the following settles and the card seems stable:
> 
> GPU Clock: 1140
> Memory Clock: 1340
> VDDC Offset: 87
> Power Limit: 20
> 
> Running Valley Benchmark 1.0, my GPU temperature hovers around 86ºC; however, my VRM 1 temperature jumps up to 108ºC and hovers around 104-105ºC. The VRM 2 temperature stays around 76ºC. I'm using a custom fan profile with the fan running around 65% at 85ºC. Should I be worried about this high VRM 1 temperature? Are my settings too aggressive? I've noticed that my TriXX options are different than most guides on here. Any input is appreciated.


VRM temps do seem a tad high.

What is your base voltage btw?


----------



## Linxus

Quote:


> Originally Posted by *pcrevolution*
> 
> VRM temps do seem a tad high.
> 
> What is your base voltage btw?


What's the best way to check base voltage? My VDDC on GPU-Z sits around 1.180 - 1.200 V, with a couple peaks at 1.240 V.


----------



## Linxus

Quote:


> Originally Posted by *pcrevolution*
> 
> VRM temps do seem a tad high.
> 
> What is your base voltage btw?


Quote:


> Originally Posted by *Linxus*
> 
> What's the best way to check base voltage? My VDDC on GPU-Z sits around 1.180 - 1.200 V, with a couple peaks at 1.240 V.


Here's a screenshot of my current settings with Valley running in the background.


----------



## tsm106

Quote:


> Originally Posted by *Linxus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *pcrevolution*
> 
> VRM temps do seem a tad high.
> 
> What is your base voltage btw?
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Linxus*
> 
> What's the best way to check base voltage? My VDDC on GPU-Z sits around 1.180 - 1.200 V, with a couple peaks at 1.240 V.
> 
> Click to expand...
> 
> Here's a screenshot of my current settings with Valley running in the background.
Click to expand...

How is your airflow? The temps seem to be in line with your core temp. Open-fan coolers are not the best at VRM cooling, and the Tri-X doesn't have more phases, so it will run similar or more likely higher VRM temps than even a reference card. Your fan duty cycle is at 86%, so you've got it running pretty loud to boot.


----------



## Agent Smith1984

I'm thinking airflow there too, though your fan count suggests that should be fine.
I would switch your side exhaust to an intake to blow on your card though.... that should help quite a bit.

You may also want to apply some new thermal compound.
I have a newer 290 Tri-X OC that runs around 77c core/90c VRM1 with the stock fan profile. My older Tri-X had MX-2 applied roughly 6 months ago and it still never breaks 69c core/81c VRM1.

Those temps are taken from the cards running independently. With both cards installed, I run the card with MX-2 in the top slot, since it sees the most heat from intaking air off the second card.
Under load, the temps between the two are very similar.

I am applying fresh MX-2 to both cards this weekend to shed some heat and have a go at some more overclocking.

These cards are plenty capable of running just fine at 90+c with VRM temps over 100, so you aren't necessarily in danger, but it is always a good thing to reduce temps if you can.


----------



## Ha-Nocri

this is my GPU usage while playing Dark Souls 2:


Core clock is around 720MHz. I get a constant 60fps and gameplay is as smooth as it gets, but my GTX580 used to behave differently. It would have stable usage at, let's say, 30%; it wouldn't fluctuate nearly as much. Is this normal for newer AMD cards?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ha-Nocri*
> 
> this is my GPU usage while playing Dark Souls 2:
> 
> 
> Core clk is around 720MHz. I get constant 60fps and gameplay is as smooth as it gets, but my GTX580 used to behave differently. It would have stable usage at let's say 30%, it wouldnt fluctuate nearly is much. Is this normal for newer AMD cards?


Posted in the wrong section, but that usage looks like you have VSYNC turned on. And you did specify you have "constant 60fps"....

Edit: Read your sig and saw you have a 290X; I thought you were referring to this card as the GTX580....
Core clock at 720MHz sounds strange to me, but I still believe you have vsync on.


----------



## Ha-Nocri

no vsync. The game is capped at 60 fps. I'm using 290 trix


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ha-Nocri*
> 
> no vsync. The game is capped at 60 fps. I'm using 290 trix


Well, the game being capped would explain the usage being all over the place.
The game is probably easy enough for the card to run that it doesn't need full core power/usage to handle 60FPS.
I'm not personally familiar with that title, so I'm not sure.....

Do you have something like Crysis 3 or BF4 to test with?


----------



## tsm106

Quote:


> Originally Posted by *Ha-Nocri*
> 
> no vsync. The game is capped at 60 fps. I'm using 290 trix


Same difference.


----------



## Ha-Nocri

never heard of DS2?







You've been living under a rock?









jking ofc, it's an awesome game. I will have GTA5 later today, so I will test with it.

Also, at a constant 60 fps one would expect usage to be constant, like it was with my 580


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ha-Nocri*
> 
> never herd of DS2?
> 
> 
> 
> 
> 
> 
> 
> You've been living under rock?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> jking ofc, an awesome game. I will have GTA5 later today so will test with it.
> 
> Also, being constant 60 fps one would expect usage to be constant, like it was with my 580


Not really; the 290 is WAY more powerful than the GTX580, so it takes a lot less work to get 60FPS.
I just read a little about the game and saw that it has very few graphics options and can run at 4K on a GTX680, so it's obviously a fairly easy-to-run title.

Let us know how GTA 5 does....
You should see your core clock pinned and usage much higher.

You are also fighting a bit of a CPU bottleneck (I hate using that phrase, but in some cases it really does apply).... that old i5, even at 4.2GHz, will struggle to feed a 290 in many cases, especially when the title isn't very demanding.


----------



## Ha-Nocri

There is really not much difference between i5's from gen 1 to gen 4. Mine is a tad bit faster than a friend's 3570 @ 4GHz, for example. And it is surely faster per core than Bulldozer









If GTA can use more than 4 cores then I might be in trouble on some map areas.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ha-Nocri*
> 
> there is rly not much difference between i5's from gen 1 to gen 4. Mine is a tad bit faster than friends 3570 @4GHz for example. And surely is faster per core than bulldozer
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If GTA can use more than 4 cores then I might be in trouble on some map areas.


I see what you are hinting at, but I'm not using bulldozer









And honestly, the i5 you have at 4.2GHz should be okay in most cases (I somewhat doubt it performs better than a 3570K at 4GHz though), but regardless, what I'm saying is that in an easier-to-run title like DS2, you will see a bottleneck.

I have heard different stories with CPU usage on GTA V, one person says they get usage on 4 cores, another says 8.








I've not looked into it yet to be honest (most of my GTA V research has been on the GPU side of things).


----------



## LandonAaron

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I see what you are hinting at, but I'm not using bulldozer
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And honestly, the i5 you have at 4.2 it should be okay in most cases (I somewhat doubt it performs better than a 3570k at 4GHz though
> 
> 
> 
> 
> 
> 
> 
> ), but regardless, what I'm saying is, in the easier to run title like DS2, you will see a bottleneck.
> 
> I have heard different stories with CPU usage on GTA V, one person says they get usage on 4 cores, another says 8.
> 
> 
> 
> 
> 
> 
> 
> 
> I've not looked into it yet to be honest (most of my GTA V research has been on the GPU side of things).


GTA V was putting a pretty heavy load on my 4790K running 2x 290s; I was seeing CPU usage on all 8 threads >80%. It's the first time a game has actually worked with crossfire since I got my second card.


----------



## Agent Smith1984

Quote:


> Originally Posted by *LandonAaron*
> 
> GTAV was putting pretty heavy duty load on my 4790k running 2x290's, I was seeing CPU usage on all 8 threads >80%. First time a game has actually worked for crossfire since I got my second card.


Good news for me then, cause I will have it soon.

I download about 4-6GB of the game per night while we sleep, then pause it in the mornings. DSL SUCKS








Been doing this for only a few days now, but I plan on leaving her running over the weekend and making everyone go outside to enjoy the weather (self included)!!


----------



## tsm106

Quote:


> Originally Posted by *LandonAaron*
> 
> I was seeing CPU usage on all 8 threads >80%.


If that were true, who needs DX12 or Mantle?


----------



## Ha-Nocri

GTA 5 runs beautifully. Everything maxed out, MSAA x2, every slider to the end. 60fps 90% of the time; the lowest I saw was 45fps. Sometimes the GPU is the bottleneck, sometimes it's the CPU. But at 4GHz this CPU suits this GPU perfectly. I could go to 4.2GHz on the CPU and 1.1GHz on the GPU, and I guess I would always have 60 fps.

But this shows how big an impact the CPU can have in this game: at 2.8GHz I was getting 44 fps in the benchmark; at 3.6GHz I was getting 58 fps. A really huge jump.
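Those two data points line up almost exactly with the clock increase, which is the classic signature of a CPU-bound scenario. A quick check of the ratios, using only the numbers quoted above:

```python
# Benchmark fps at two CPU clocks, from the post above
fps = {2.8: 44, 3.6: 58}

clock_ratio = 3.6 / 2.8          # ~1.29x the CPU clock
fps_ratio = fps[3.6] / fps[2.8]  # ~1.32x the fps

# fps scaled at least 1:1 with clock, so the GPU was waiting on the CPU
print(f"clock +{(clock_ratio - 1) * 100:.0f}%, fps +{(fps_ratio - 1) * 100:.0f}%")
# -> clock +29%, fps +32%
```

If the game were GPU-bound at those settings, raising the CPU clock would barely move the fps at all.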


----------



## Linxus

Quote:


> Originally Posted by *tsm106*
> 
> How is your airflow? The temps seem to be inline with your core temp. Open fan coolers are not the best on vrm cooling and the Tri-X doesn't have more phases so it will run similar or more likely higher vrm temps than a reference card even. You fan DC is at 86% so you've got that running pretty loud to boot.


I think the airflow should be adequate. I edited the original post with some information about my fan and case setup. The fan is very loud at anything above 60%, so I'd like to keep it under that number during daily use.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'm thinking airflow there too, though your fan count suggests that should be fine.
> I would switch your side exhaust to an intake to blow on your card though.... that should help quite a bit.
> 
> You may also want to do some new thermal compound.
> I have a newer 290 tri-x OC that runs around 77c core/90c VRM1 with stock fan profile. My older tri-x has had mx-2 applied roughly 6 months ago and it still never breaks 69c core/81c VRM1.
> 
> Those temps are taken from the cards running independantly. With both cards installed, I run the card with mx-2 in the top since it sees the most heat from intaking air off the second card.
> Under load, the temps between the two are very similar.
> 
> I am applying fresh MX-2 to both cards this weekend to shed some heat and have a go at some more overclocking.
> 
> These cards are plenty capable of running just fine at 90+c with VRM temps over 100, so you aren't neccesarely in danger, but it is always a good thing to reduce temps if you can.


I'll try switching the side exhaust to an intake and see if that helps. If not, I may try reapplying thermal compound like you mentioned. I have some MX-4 left over from the CPU that I can try out. Really appreciate the input though.


----------



## JourneymanMike

Quote:


> Originally Posted by *Linxus*
> 
> *I'll try switching out the side exhaust to an intake and see if that helps out*. If not, I may try reapplying thermal compound like you mentioned. I have some left over MX-4 that I used on the CPU that I can try out. Really appreciate the input though.


That makes perfect sense: all intake, except the rear chassis fan for exhaust...


----------



## mfknjadagr8

Quote:


> Originally Posted by *tsm106*
> 
> If that were true, who needs DX12 or Mantle?


My 8320E is showing 55-75 percent usage on all cores at 4.8GHz, so it does in fact use up to 8, and fairly evenly. I'd actually be interested to see the usage on the newer Intels.

If I could figure out this GPU usage issue it would be running wild. With unified monitoring on, I'm seeing 35% usage across the two cards in game, and getting fluctuations below 60, down to as low as 45. I put the power limit to +50 and I never see a drop in clocks. Fresh-installed Windows and drivers; Afterburner is set up for unified monitoring now, but I haven't changed voltages or clocks, just the power limit.

It's showing 7.5GB of 16GB of pagefile usage, so I assume that means my virtual memory is keeping the RAM fairly loaded. VRAM usage shows around 5.1GB, so 2.5-ish on each card, at 2512x1440 downsampled to 1920x1080, all settings maxed except AA off, vsync on, none of the advanced options on.

I noticed that with vsync off the drops seem more pronounced, which to me doesn't make sense. I'm still getting random usage drops to 0 on one card or the other, rarely both. Clocks are rock steady and temps are in check, but the drops cause pronounced stutter and the fps drops quite a bit below 60.


----------



## pengs

Quote:


> Originally Posted by *Ha-Nocri*
> 
> this is my GPU usage while playing Dark Souls 2:
> 
> 
> Core clk is around 720MHz. I get constant 60fps and gameplay is as smooth as it gets, but my GTX580 used to behave differently. It would have stable usage at let's say 30%, it wouldnt fluctuate nearly is much. Is this normal for newer AMD cards?


Yeah, if you're at a consistent 60fps then AB is not correct. It can be caused by ULPS sometimes, but I think that's more of a CF thing. I fixed mine by updating AB in single-GPU mode, and then further fixed it by turning ULPS off in CF mode.
You may try updating AB if there's a new version, and toying with the 'enable unified GPU usage monitoring' option in AB.

It's not correct anyhow; no real reason to be concerned. It annoyed me quite a bit coming from various SLI setups, and I also couldn't get the usages to separate and show individually in CF mode, but all in all it doesn't really matter.

The dynamic clock speeds are normal and beneficial. Fermi did it, but not to the extent I've seen on the 290X and GCN 1.1: they literally clock incrementally to 1039MHz or 1040MHz, which is a 1MHz step (voltage stepping is similarly incremental). Even Kepler was fairly limited by comparison.
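For anyone hunting for the ULPS switch mentioned above: Afterburner exposes it in its settings, but it ultimately flips a registry value on the display adapter's class key. A hedged sketch of the .reg form follows; the `0000` subkey index is hypothetical, so enumerate the `00xx` subkeys under the class GUID and pick the one(s) matching your Radeon(s) before applying anything:

```
Windows Registry Editor Version 5.00

; Disable ULPS (Ultra Low Power State) for the adapter at subkey 0000.
; The subkey index varies per system -- verify which 00xx entries are your 290s.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Toggling it from Afterburner's settings panel is the safer route, since the tool finds the right subkeys for you; a driver reinstall will typically reset the value either way.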


----------



## rdr09

Difference between Win 10 and Win 7 with 2 290s @ 1200 MHz . . .

http://www.3dmark.com/compare/fs/3155859/fs/3110083


----------



## fyzzz

I actually got to 1300 mhz on AIR today when I had the side panel off next to the window, but the combined test crashed and the Firestrike graphics score got worse, so I didn't save the run.


----------



## Ramzinho

Quote:


> Originally Posted by *rdr09*
> 
> Difference between Win 10 and Win 7 with 2 290s @ 1200 MHz . . .
> 
> http://www.3dmark.com/compare/fs/3155859/fs/3110083


interesting that on WIN 10 maximum boost CPU clock is 0 :S


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> i actually got to 1300 mhz on AIR today when i had the sidepanel off next to the window, but the combined test crashed and the firestrike gpu score got worse so i didn't save the run.


Got to watercool that thing. Keep an eye out for a cheap full-cover waterblock. Maybe lower the VRAM a bit. Here was my highest, at 1290 . . .

http://www.3dmark.com/3dm/4644282?

I'd be selling one of my 290s with a block (the better of the two) with this . . .


Spoiler: Warning: Spoiler!







sorry for the plug.

Quote:


> Originally Posted by *Ramzinho*
> 
> interesting that on WIN 10 maximum boost CPU clock is 0 :S


Yah, Win 10 does not read specs accurately. Not sure if it has changed; that was an old run.


----------



## tsm106

Quote:


> Originally Posted by *fyzzz*
> 
> i actually got to 1300 mhz on AIR today when i had the sidepanel off next to the window, but the combined test crashed and the firestrike gpu score got worse so i didn't save the run.


That's deep into the red zone. I'd be extra careful pushing that far on *air* on any card because of the voltage required to get there. You're pushing the components very far out of their comfort zone.


----------



## fyzzz

Quote:


> Originally Posted by *tsm106*
> 
> That's deep into the red zone. I'd be extra careful pushing that far on *air* on any card because of the voltage required to get there. You're pushing the components very far out of their comfort zone.


I know that I'm kinda risking it, but at least I don't have the stock cooler on. That was also done on cold air from outside (around 5-8 degrees Celsius outside)


----------



## tsm106

Quote:


> Originally Posted by *fyzzz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> That's deep into the red zone. I'd be extra careful pushing that far on *air* on any card because of the voltage required to get there. You're pushing the components very far out of their comfort zone.
> 
> 
> 
> I know that i'm kinda risking it, but i don't atleast have the stock cooler on. That was also done on cold air from outside (around 5-8 degrees celsius outside)
Click to expand...

Oh that makes sense now.


----------



## Agent Smith1984

Anybody ever use OCCT GPU test to check for artifacts, or does it push a little too hard like Furmark/Kombuster does?


----------



## LandonAaron

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Anybody ever use OCCT GPU test to check for artifacts, or does it push a little too hard like Furmark/Kombuster does?


I've only run it a couple of times, but the image it renders is pretty similar, so I'm thinking it probably pushes the card in a similar way. Then again, the GPU-Z PCIe test looks kind of similar and it doesn't push it hard at all.

I would test with the 3DMark Sky Diver test. The part where she walks into the pyramid chamber tends to throw artifacts for me.


----------



## FastEddieNYC

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Anybody ever use OCCT GPU test to check for artifacts, or does it push a little too hard like Furmark/Kombuster does?


After I installed my Kryographics block I tested with a few different programs and found that looping Firestrike works best for artifact testing. I looped both Valley and Heaven with no issues; 30 seconds of Firestrike produced artifacts. I recommend not using Furmark or any test that uses unrealistic loads you will never see when gaming.


----------



## tsm106

Quote:


> Originally Posted by *FastEddieNYC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> Anybody ever use OCCT GPU test to check for artifacts, or does it push a little too hard like Furmark/Kombuster does?
> 
> 
> 
> After I installed my Kyrographics block I tested with a few different programs and found that looping firestrike works best for artifact testing. I looped both valley and heaven, no issues. 30 seconds of firestrike produced artifacts. I recommend not using fur mark or any test that uses unrealistic loads that you will never see when gaming.
Click to expand...

Furmark doesn't create an unrealistic load though; rather, it works the card in a way that creates excess heat. That is actually good for testing your cooling, but obviously not for stability. Looping benchmarks, whether Unigine or 3DMark, has never worked well for me, especially with crossfire. I find BF4 to be the hardest to validate in. That said, I do test with a host of benches and finish with BF4.


----------



## Jflisk

I have used it and it seems to be less intense than Furmark, but on the same principle.
Quote:


> Originally Posted by *tsm106*
> 
> Furmark doesn't create an unrealistic load though, instead it works the card in a way that creates excess heat. That is actually good to test your cooling but not for stability obviously. Looping benchmarks whether unigine or 3dmark has never worked good for me, especially with crossfire. I find BF4 to be the hardest to validate in. That said, I do test with a host of benches and finish with BF4.


Crysis 2 or 3 seems to push hardware to the max (temps, video, CPU). Second runner-up: BioShock Infinite. Thoughts?


----------



## PsYcHo29388

Guess it's time for me to join.

XFX R9 290 Double D




http://www.techpowerup.com/gpuz/details.php?id=u7kkc


----------



## gatygun

Quote:


> Originally Posted by *Linxus*
> 
> Hey guys,
> 
> I recently picked up a Sapphire Tri-X R9 290X and overclocked it using the Sapphire TriXX tool. I'm loving this new card so far and its running GTA 5 @ 1440p wonderfully. I achieved the following settles and the card seems stable:
> 
> GPU Clock: 1150
> Memory Clock: 1450
> VDDC Offset: 93
> Power Limit: 25
> 
> Running Valley Benchmark 1.0, my GPU temperature hovers around 86ºC; however, my VRM 1 temperature jumps up to 108ºC and hovers around 104-105ºC. The VRM 2 temperature stays around 76ºC. I'm using a custom fan profile with the fan running around 65% at 85ºC. Should I be worried about this high VRM 1 temperature? Are my settings too aggressive? I've noticed that my TriXX options are different than most guides on here. Any input is appreciated.
> 
> EDIT: I bumped the fan speed up to a fixed 85% and the VRM 1 temperature is now hovering around 100ºC. I should also mentioned that I have a NZXT Phantom 410 with (2) intake fans on front, (1) intake on bottom, (1) exhaust on rear, and (2) exhaust on top through H100i CPU cooler. There is also an exhaust fan on the side of the case.


Seems really high to me. I played Dying Light for about 20 minutes or so to test it out on my Tri-X. While it's hot in my room atm, my temps are:

Tri-x 290 Oc'ed:

Core clock: 1145
Memory clock: 1400
Core voltage: +100
Power limit: +50

Gpu core: 70c max
VRM temp1: 74c max
Vrm temp2: 52c max
Fan speed: 54% max
GPU load: 100%
Memory usage: 3421 mb

The highest I ever saw it go was 80c on VRM1 and 70c on the core, when my room is cooled and I torture the card for hours on end; the fan profile never really gets over 55.

Normally, when it's cooler in my room (95% of the time), I get like 64c core max, 75c VRM1 max, and 50% fan profile max.

I do use my own fan profile though, because I find that the stock fan profile waits a bit too long before increasing fan speed.


----------



## tsm106

Quote:


> Originally Posted by *gatygun*
> 
> Seems really high in my vision. I played dying light for about 20 minutes or so to test it out on my tri-x.


That's because he's running a benchmark and you are running a game.


----------



## gatygun

Quote:


> Originally Posted by *tsm106*
> 
> That's because he's running a benchmark and you are running a game.


Ran the Valley benchmark for 30 minutes at 1145/1450 (slightly higher clock on memory this time):
71c max, 55% fan profile, and 77c VRM1. Not much of a difference there, and my system is air-cooled with a CPU that heats it up even more.

His temps are extremely high; I push an even higher voltage. He should check his airflow, and if nothing is wrong with it, he should look at his card to see if something is wrong with it. Maybe replace the paste or reseat the cooler, because the temps are way too high. The dude runs it at 100c VRM1 with an 85% fan profile. Something is wrong.


----------



## Joe-Gamer

Off the current topic a bit, but does anyone know good settings for Project CARS with R9 290s in CF?
On-topic bit: I'm running 1125MHz core, 1450MHz memory, +25mv. Top card is an MSI Gaming (watercooled, custom loop), second card is a Vapor-X. 45c (wc) and 74c (Vapor-X) max temps.


----------



## fyzzz

I finally got through the whole firestrike benchmark at 1300/1300 mhz: http://www.3dmark.com/3dm/6994842?


----------



## Joe-Gamer

My 3DMark doesn't accept my results; I used to get around the 14500 mark, now I get 10000. Uninstalling doesn't help, and support doesn't know why








From this http://www.3dmark.com/fs/3613093
to this http://www.3dmark.com/fs/4234637


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> I finally got through the whole firestrike benchmark at 1300/1300 mhz: http://www.3dmark.com/3dm/6994842?


fyzzz, that graphics score looks more like one for a 1200MHz core.

Quote:


> Originally Posted by *Joe-Gamer*
> 
> My 3D mark doesn't accept my results; I used to get around the 14500 mark now I get 10000. Uninstalling doesn't help and support don't know why
> 
> 
> 
> 
> 
> 
> 
> 
> From this http://www.3dmark.com/fs/3613093
> to this http://www.3dmark.com/fs/4234637


The last run was only using one card. Test each card; no need to pull any out, just disable crossfire. I would go as far as unplugging the power from the GPU not being tested.

I suspect a borked driver install. What driver are you using?


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> fyzzz, that graphics score is more like for 1200MHz core.
> the last run was only using one card. test each card. no need to pull out any. just disable crossfire. i would go as far as unplugging the power from the gpu not being tested.
> 
> i suspect a borked driver install. what driver are you using?


I know that the score is low, but you can see in the link that it was clocked to 1300/1300, and I know it was running at 1300 mhz. Maybe it was so unstable that it affected the scores, I don't know


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> I know that the score is low but you can see in the link that it was clocked to 1300/1300 and i know that it was clocked to 1300 mhz. Maybe it was so instable so it affected the scores i don't know


You are right: unstable, and it affected the outcome. Too much voltage maybe, or too high a core. Try 1280 core and 1400 VRAM.

1300 using Win 8 should net a 14000 graphics score.


----------



## Joe-Gamer

I'll give it a go. I'm using the 15.4 beta, but the issue was also present in the last driver I used, 14.12 Omega.


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> you are right. unstable and it affected the outcome. too much voltage maybe or too high a core. try 1280 core and 1400 vram.
> 
> 1300 using Win 8 should net a 14000 graphics score.


I will test some more in a moment. I plan on watercooling my card later, in a few months maybe.


----------



## tsm106

Quote:


> Originally Posted by *gatygun*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> That's because he's running a benchmark and you are running a game.
> 
> 
> 
> Runned valley benchmark for 30 minutes on 1145 / 1450 ( slightly higher clock on memory this time )
> max 71c, 55% fan profile and 77c vrm1. Not much of a difference there. and then my system is aircooled with a cpu that heats up my system even more.
> 
> His temps are extremely high, i push a higher voltage even. He should check out his airflow and if nothing is wrong with it then he should look at it's card if there isn't something wrong with it. Maybe replace paste or reseat it, because the temps are way to high. Dude runs it at 100 vrm1 with 85% fan profile. Something is wrong.
Click to expand...

A repaste will only make a minor change. He's obviously got airflow issues. However, his ambient could play a part too, just like yours could.


----------



## tsm106

Quote:


> Originally Posted by *fyzzz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> you are right. unstable and it affected the outcome. too much voltage maybe or too high a core. try 1280 core and 1400 vram.
> 
> 1300 using Win 8 should net a 14000 graphics score.
> 
> 
> 
> I will test some more in a moment. I plan on watercooling my card later in a few months maybe.
Click to expand...

Is it barely passing too? Time measurement errors mean it's not stable. In most cases no one will accept a run with that tag, FYI.


----------



## mfknjadagr8

Quote:


> Originally Posted by *tsm106*
> 
> IS it barely passing too? Time measurement errors equal not stable. In most cases no one will accept a run with that tag, fyi.


yes, on generic VGA x0... doesn't it say that when the Windows default driver is running?


----------



## tsm106

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> IS it barely passing too? Time measurement errors equal not stable. In most cases no one will accept a run with that tag, fyi.
> 
> 
> 
> yes on genric vga x0....doesn't it say that when Windows default driver is running?
Click to expand...

You're quoting the wrong post chain.


----------



## mfknjadagr8

Quote:


> Originally Posted by *tsm106*
> 
> You're quoting the wrong post chain.


oops, so I did... doh, stupid fingers... I had accidentally clicked yours and thought I'd tapped it off and hit the other post. On this phone/browser it shows a quote button and a number that went away, then came back, but for some reason yours stayed on and the other quote was ticked off... I'm full of fails this week.


----------



## fyzzz

Quote:


> Originally Posted by *tsm106*
> 
> IS it barely passing too? Time measurement errors equal not stable. In most cases no one will accept a run with that tag, fyi.


I think the time measurement error is because I kill the demo in Task Manager; I don't want to watch it, and I think the 1300 MHz clock is too unstable for it too. But hey, I still got through the tests with it.


----------



## Joe-Gamer

Does £200 for the Vapor-X sound like a fair deal to a friend? And might I be able to sell my other R9 290 with the EK block for around £270-280? I want to sell them before they drop in value.


----------



## kizwan

Quote:


> Originally Posted by *fyzzz*
> 
> I think that time measurment error is because i kill the demo in task manager. I don't want to watch it and i think that 1300 mhz clock is to unstable for it too. But hey i still got through the tests with it


Wasn't the time measurement error a Windows 8 thing? The tests finished, but not at 1300 MHz performance. Your card is likely throttling.


----------



## tsm106

^^Nah, it's a windoze 10 thing.


----------



## joeh4384

I just added my 290x to my main rig with my 295x2. I picked up a corsair HG10 and H75 for it. It works great but my VRM temps are pretty high due to the 295x2 running below.


----------



## Cyber Locc

Quote:


> Originally Posted by *Joe-Gamer*
> 
> Does £200 for the vapour x sound like a fair deal to a friend? I might be able to sell my other R9 290 wit the EK block for around £270-80? I want to sell them before they drop in value.


Okay, my answer pertains to the US only, as I don't know the pricing in Europe. That said, if the prices are similar where you live, those prices are outrageous. £200 is 320 US dollars, and here the Vapor-X costs that much new; used, they can be had for 250 or less, and to a friend much less IMO. The same applies to the other question: that's 400 US, and 290s sell used for 180-200 (reference), used blocks maybe 80, so I'd say £200 for the water-cooled card.

Hate to break it to you, but 290s lost their value the day mining took a plunge; the market is flooded with used cards.

I'll be honest, it's hard to understand the other side's pricing. That said, after looking at UK sites those prices seem OK-ish; eBay UK has 290s for £140, so with a block, if it's reference, I'd say £240-250, but I could be completely off base lol. It would help to know where you live for someone to answer, as I think that plays a part.


----------



## rdr09

Quote:


> Originally Posted by *cyberlocc*
> 
> Okay, my answer pertains to the US only, as I don't know the pricing in Europe. That said, if the prices are similar where you live, those prices are outrageous. £200 is 320 US dollars, and here the Vapor-X costs that much new; used, they can be had for 250 or less, and to a friend much less IMO. The same applies to the other question: that's 400 US, and 290s sell used for 180-200 (reference), used blocks maybe 80, so I'd say £200 for the water-cooled card.
> 
> Hate to break it to you, but 290s lost their value the day mining took a plunge; the market is flooded with used cards.
> 
> I'll be honest, it's hard to understand the other side's pricing. That said, after looking at UK sites those prices seem OK-ish; eBay UK has 290s for £140, so with a block, if it's reference, I'd say £240-250, but I could be completely off base lol. It would help to know where you live for someone to answer, as I think that plays a part.


yah, depends on the geo location. i am about to sell one of my 290s with a full block and it will be way less than that but in the US.

this in particular . . .

http://www.3dmark.com/3dm11/8776470


----------



## DaUn3rD0g

Quote:


> Originally Posted by *Joe-Gamer*
> 
> Does £200 for the vapour x sound like a fair deal to a friend? I might be able to sell my other R9 290 wit the EK block for around £270-80? I want to sell them before they drop in value.


Are you talking about a 290 Vapor-X or a 290X Vapor-X?

OCUK have them listed at £275 and £330 respectively, so either way £200 for a used one isn't all that bad when it's coming from a friend and you're sure it's in good condition / looked after.

I think you'll struggle to sell a used 290 for £270+, even with an EK waterblock, especially on eBay or the like to a complete stranger, because it's possible to buy a brand new 290X and EK block for £310 from OCUK (Asus DirectCU 290X £239 + EK full cover £80).

That does depend on which 290 it is, of course, but I know that I personally wouldn't take a used 290 and block over a new 290X and block for a minor £40 saving. That's just my opinion.


----------



## Ha-Nocri

OK, this is driving me crazy: hardware acceleration for videos. It makes my GPU run @ 1GHz. If I turn off HA, my CPU is used instead, which I don't like. I know AMD cards had many problems with playing videos before, drivers crashing etc., but did they have to do this? Can I tell my GPU to go to 500MHz max (for example)? Can I customize it at all?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ha-Nocri*
> 
> ok, this is driving me crazy. Hardware acceleration for videos. It makes my GPU run @1GHz. If I tun off HA my CPU is used, which I don't like. I know AMD cards had many problems with playing videos before, drivers crashing etc.... but did they have to do this? Can I tell my GPU to go to 500MHz max (for example)? Can I customize it at all?


You should be able to use afterburner to create a custom 2d profile.

I still don't understand why this is all a problem though?

If you want HW acceleration, you are going to have to accept GPU usage, if you don't want that, you are going to have to accept CPU usage.....
If the card clocks to 1,000mhz and gets 10-20% load, or clocks to 500MHz and gets 40% load, what is the difference? The power circuitry will adapt the voltage in such a way that the power usage is equal.....
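A rough way to sanity-check this clock-versus-load argument is the usual simplified dynamic-power model, P ≈ C·V²·f·load. The sketch below uses made-up numbers for C, voltage, and load, purely for illustration, not measurements from a 290:

```python
# Toy dynamic-power model: P = C * V^2 * f * load.
# C and the voltage/load figures below are illustrative guesses,
# not measured values for any real card.

def dynamic_power(voltage, clock_mhz, load_fraction, c=1e-9):
    """Very rough switching-power estimate in watts."""
    return c * voltage**2 * clock_mhz * 1e6 * load_fraction

# 1000 MHz at ~20% load vs 500 MHz at ~40% load, same voltage:
high_clock = dynamic_power(1.10, 1000, 0.20)
low_clock = dynamic_power(1.10, 500, 0.40)
print(high_clock, low_clock)  # equal (up to rounding) under this model

# If the low-clock state also runs at a lower voltage, it draws less:
low_v = dynamic_power(0.95, 500, 0.40)
print(low_v < high_clock)
```

Under this model the two states draw the same power at the same voltage, which is the point made above; if the driver also drops the voltage in the low-clock state, the low state comes out ahead, which is what a custom 2D profile is after.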


----------



## Ha-Nocri

I want lower voltage and clock. I'm pretty sure this card doesn't need this much juice to play videos lol


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ha-Nocri*
> 
> I want lower voltage and clock. I'm pretty sure this card doesn't need this much juice to play videos lol


I understand, but what I'm getting at, is that the power usage will all work itself out to be about the same.
I don't use any HW acceleration for streaming at all, and my CPU still stays at 1400MHz with only 2-3% usage most of the time.
I do watch some of my 720p movies with HW acceleration for better upscaling sometimes though..... still not much load....


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I understand, but what I'm getting at, is that the power usage will all work itself out to be about the same.
> I don't use any HW acceleration for streaming at all, and my CPU still stays at 1400MHz with only 2-3% usage most of the time.
> I do watch some of my 720p movies with HW acceleration for better upscaling sometimes though..... still not much load....




Other systems, like mine... the GPUs stay at stock 300 core and 150 vram when watching videos; not sure why, even when HA is enabled. Both my AMD and Intel rigs. I am talking about YT videos; I don't use the PC to watch movies other than those.


----------



## Ha-Nocri

hmm, I unlocked voltage control in MSI AB, and now clk/vol goes to max for 5-10 seconds after a video is started, then goes back to 300MHz/low voltage... might be alright now


----------



## Agent Smith1984

Quote:


> Originally Posted by *rdr09*
> 
> other systems like mine . . . the gpus stay at stock 300 core and 150 vram when watching videos. not sure why even if HA is enabled. Both my AMD and Intel rigs. I am talking about YT videos. i don't use the pc to watch movies other than those.


I haven't even looked at my clocks with HA enabled, to be honest... at least not on my 290s anyway.
I did notice on my 280X that I had a 300MHz state, a 500MHz 2D state, and then a max-clock 3D state.
When streaming, the clock was 300MHz; when watching full HD movies it was still 300MHz; and while playing Angry Birds it was 500MHz... lol

I agree his clocks should not be 1,000MHz, but now I wonder, did he just peek at GPU-Z or something and see them at that speed while he was watching? Because I have noticed that launching GPU-Z will initially clock the card up... Not really much information to go off of...


----------



## Ha-Nocri

I'm watching constantly in AB what is going on with voltage/clock (was watching only clock before)....


----------



## rdr09

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You should be able to use afterburner to create a custom 2d profile.
> 
> I still don't understand why this is all a problem though?
> 
> If you want HW acceleration, you are going to have to accept GPU usage, if you don't want that, you are going to have to accept CPU usage.....
> If the card clocks to 1,000mhz and gets 10-20% load, or clocks to 500MHz and gets 40% load, what is the difference? The power circuitry will adapt the voltage in such a way that the power usage is equal.....


you talking about YT?
Quote:


> Originally Posted by *Ha-Nocri*
> 
> I want lower voltage and clock. I'm pretty sure this card doesn't need this much juice to play videos lol


----------



## Agent Smith1984

"YT" ?


----------



## Forceman

Quote:


> Originally Posted by *Agent Smith1984*
> 
> "YT" ?


Youtube?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Forceman*
> 
> Youtube?


Most of my streaming is Netflix and Hulu, but some YouTube also.
I honestly don't pay much attention to anything going on with the hardware when doing the light-duty stuff.

I only looked into the HW acceleration on/off thing when I was having artifact issues with my old 280X. Then I toyed with it briefly when I got my first 290 and noticed it didn't seem to make a difference, except in actual x264/x265 content when I was playing full definition or attempting to upscale standard HD stuff.
That's been a little while back, and I can't recall my findings, to be honest.


----------



## DaUn3rD0g

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You should be able to use afterburner to create a custom 2d profile.


Can you tell me how to do this in Afterburner please, how do you set a profile for 2D clocks and 3D clocks?

Or do you mean 2 separate profiles that you have to manually select between before starting a 2D or 3D task?

I know that Asus GPU Tweak gives you the option to set clocks for both 2D and 3D clocks under the same profile so that they kick in automatically as required, but have not noticed an option in Afterburner for this without creating 2 separate profiles and manually switching between them.

Have I just missed it somewhere?

I am very new to the whole overclocking thing and only just getting forced into tweaking settings to resolve gpu crashes with my 290x.


----------



## tsm106

Quote:


> Originally Posted by *DaUn3rD0g*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> You should be able to use afterburner to create a custom 2d profile.
> 
> 
> 
> Can you tell me how to do this in Afterburner please, how do you set a profile for 2D clocks and 3D clocks?
> 
> Or do you mean 2 separate profiles that you have to manually select between before starting a 2D or 3D task?
> 
> I know that Asus GPU Tweak gives you the option to set clocks for both 2D and 3D clocks under the same profile so that they kick in automatically as required, but have not noticed an option in Afterburner for this without creating 2 separate profiles and manually switching between them.
> 
> Have I just missed it somewhere?
> 
> I am very new to the whole overclocking thing and only just getting forced into tweaking settings to resolve gpu crashes with my 290x.
Click to expand...

http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40


----------



## DaUn3rD0g

Quote:


> Originally Posted by *tsm106*
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40


Excellent, thank you!

I knew about creating 2 separate profiles, I just hadn't noticed the option in the settings to assign profiles to 2D and 3D, much obliged sir!


----------



## gatygun

Just played The Witcher 3 a bit on ultra; it runs like a charm tho. FPS is like 40-55 constantly on ultra with only hairworks off, HBAO+ on and AA on. If I put hairworks on, it pushes towards 20-40 fps. I'm sure crossfire 290s will be able to hold a stable 30 fps with it on. That's at 1080p tho, on a 290 OC'ed to 1145/1400.

On ultra at 1440p I still get 36-45 fps. I'll probably play at that resolution, as it looks a lot cleaner.


----------



## Ha-Nocri

Nice. So with turning down a few options we should get steady 60 fps.


----------



## cephelix

Quote:


> Originally Posted by *gatygun*
> 
> Just played the witcher 3 a bit on ultra, runs like a charm tho, fps is like 40-55 fps constantly on ultra with only hairworks off and hbao+ on. If i put hairworks on it will push towards 20-40 fps. I'm sure crossfire 290's will be able to push 30fps stable with it on. That's on 1080p tho on a 290 oc'ed 1145/1400.
> 
> Ultra with 1440p i still get 36-45 fps. Probably will play it at that resolution as it looks a lot more clean.


That means I'm practically screwed in terms of fps on my lonely 1100/1450 R9 290 (non-X) at 1200p?? I read in the various Witcher threads that there isn't much difference between medium and ultra in terms of graphical quality.


----------



## gatygun

Quote:


> Originally Posted by *cephelix*
> 
> That means i'm practically screwed in terms of fps on my lonely 1100/1450 r9 290(non x) at 1200p?? Read in the various threads about witcher that there isn't much difference between medium and ultra in terms or graphical quality.


I only tested ultra and low, and I must say the difference really isn't that much in my view. I would advise disabling the blur / LOD stuff and downsampling to get good picture quality tho; it's quite doable on your card while still getting a 30 fps solution with ultra settings, as your card isn't much different from mine. The VRAM / RAM usage is really low, like 3GB of RAM and 2GB of VRAM from what I encountered running around a village and in a forest.

Well a bit more detailed on what i tested so far:

Ultra: ( oc speeds 1145 / 1400 )

Everything Ultra + AA on + HBAO+, Hairworks off, 1080p = lowest i saw 40 fps highest i saw 68 fps ( hovers around the 45 fps mostly )

Everything Ultra + AA on + HBAO+, Hairworks off, 1440p = lowest i saw 35 fps highest i saw 45 fps ( hovers at 38 mostly )

Everything Ultra + AA on + HBAO+, Hairworks (Geralt only), 1080p = lowest i saw 28 fps, highest 47 fps ( hovers around 40, but when Geralt comes close to the screen it tanks towards the high 20s, ~28 ).

Everything Ultra + AA on + HBAO+, Hairworks on ( everything), 1080p = lowest i saw 22 fps highest i saw 40 fps.

Low: ( tested at stock speeds 1000/1300 )

Everything Low + AA off, Hairworks off, 1080p ( watching 2 horses + Geralt in a town ) = hovers at 80 fps ( 60 fps low if you run past the horses ), 100 high.

Everything Low + AA off, Hairworks on ( everything ), 1080p ( watching 2 horses + Geralt in a town ) = ~38 fps ( 34 fps low if you run past the horses ) and 48 fps high.

VRAM usage is incredibly low; it hovers at 2GB at 2560x1440 with everything, including hairworks, maxed. I don't think anybody with a 3GB card will have issues in this department.

Conclusion:

Hairworks on ( everything ) cuts the framerate in half. If you want to play with full hairworks on, you can either drop settings to high + SSAO for a stable 30+ fps, or accept occasional 26-fps lows with ultra settings + SSAO ( and drop AA / depth of field / both blurs / sharpening / chromatic aberration ) for a mostly stable 30+ fps, hovering at 37 fps, on a single OC'ed 290.


----------



## cephelix

Quote:


> Originally Posted by *gatygun*
> 
> Only tested ultra and low, and i must say the difference really isn't that much in my vision. I would advice to disable blur / lod stuff and downsample to get a good picture quality tho, it's good doable on your card and still get a 30fps solution with ultra settings, as your card isn't much different then mine. The v-ram / ram usage is really low like 3gb of ram and 2gb of v-ram from what i encountered running around a village and in a forest.
> 
> Well a bit more detailed on what i tested so far:
> 
> Ultra:
> 
> Everything Ultra + AA on + HBAO+, Hairworks off, 1080p = lowest i saw 40 fps highest i saw 68 fps ( hovers around the 45 fps mostly )
> 
> Everything Ultra + AA on + HBAO+, Hairworks off, 1440p = lowest i saw 35 fps highest i saw 45 fps ( hovers at 38 mostly )
> 
> Everything Ultra + AA on + HBAO+, Hairworks (Geralt only), 1080p = lowest i saw 28 fps, highest 47 fps ( hovers around 40, but when Geralt comes close to the screen it tanks towards the high 20s, ~28 ).
> 
> Everything Ultra + AA on + HBAO+, Hairworks on ( everything), 1080p = lowest i saw 22 fps highest i saw 40 fps.
> 
> Low:
> 
> Everything Low + AA off, Hairworks off, 1080p ( watching 2 horses + Geralt in a town ) = hovers at 80 fps ( 60 fps low if you run past the horses ), 100 high.
> 
> Everything Low + AA off, Hairworks on ( everything ), 1080p ( watching 2 horses + Geralt in a town ) = ~38 fps ( 34 fps low if you run past the horses ) and 48 fps high.
> 
> Conclusion:
> 
> Hairworks on ( everything ) cuts framerate in half.


Hairworks does seem to be a killer, but someone said it doesn't tank framerates so much after the tutorial part. I don't know his system specs though. Thanks for the info, man.


----------



## BradleyW

Installed Windows 10 and the latest AMD drivers from Windows Update. No CFX option in CCC. Both cards found in device manager. Help?


----------



## Cyber Locc

Quote:


> Originally Posted by *gatygun*
> 
> Just played the witcher 3 a bit on ultra, runs like a charm tho, fps is like 40-55 fps constantly on ultra with only hairworks off and hbao+ on + AA on. If i put hairworks on it will push towards 20-40 fps. I'm sure crossfire 290's will be able to push 30fps stable with it on. That's on 1080p tho on a 290 oc'ed 1145/1400.
> 
> Ultra with 1440p i still get 36-45 fps. Probably will play it at that resolution as it looks a lot more clean.


Wait I thought that game didn't come out till tomorrow?


----------



## chronicfx

steam is one day early


----------



## chronicfx

@gatygun I was just using the AFR profile in the first room with Yennefer, looking for the keys or whatever with the senses. With 3 290Xs, Afterburner was saying I had 100% usage on all three cards, and my FPS using the generic high setting was ~85 FPS; on ultra I was getting ~50 FPS. I could not go outside because I am afraid of walking into the light.

You guys who tried the AFR-friendly profile know what I mean. But I think the performance will be there when they fix the driver. In case you are wondering what resolution, I am playing at 8560x1440 with triple monitors: two Catleaps and the new LG in the middle.


----------



## Cyber Locc

Quote:


> Originally Posted by *chronicfx*
> 
> @gatygun I was just using the AFR profile in the first room with Yennefer, looking for the keys or whatever with the senses. With 3 290Xs, Afterburner was saying I had 100% usage on all three cards, and my FPS using the generic high setting was ~85 FPS; on ultra I was getting ~50 FPS. I could not go outside because I am afraid of walking into the light.
> 
> You guys who tried the AFR-friendly profile know what I mean. But I think the performance will be there when they fix the driver. In case you are wondering what resolution, I am playing at 8560x1440 with triple monitors: two Catleaps and the new LG in the middle.


That's not bad FPS for that near-8K resolution on high, so crossfire @ 4K may be doable. And ahhh, forget Steam, it's out now on GOG anyway.

Sadly, no drivers till next week though.


----------



## chronicfx

Quote:


> Originally Posted by *cyberlocc*
> 
> That's not bad FPS for that near-8K resolution on high, so crossfire @ 4K may be doable. And ahhh, forget Steam, it's out now on GOG anyway.
> 
> Sadly, no drivers till next week though.


Yeah.. I have resorted to single screen on the LG for now with one GPU for now.


----------



## Cyber Locc

Quote:


> Originally Posted by *chronicfx*
> 
> Yeah.. I have resorted to single screen on the LG for now with one GPU for now.


Ya, I finally got it installed, tried to run it with crossfire and got like 15 fps; I get more with 1 card, like a lot more *****. People are saying to enable AFR friendly, but I haven't a clue how to do that. I looked where people said to look, but it's not there, so IDK. How do you enable AFR friendly? I'm on 15.4.


----------



## chronicfx

Quote:


> Originally Posted by *cyberlocc*
> 
> Ya I finally got it installed tried to run with crossfire and got like 15fps I get more with 1 card like alot more *****. People are saying to enable AFR friendly but I haven't a clue how to do that. I looked where people said to look but its not there so IDK. How do you enable AFR friendly I'm on 15.4?


In Catalyst, go to Gaming, then 3D Application Settings. You have to make a profile by pressing +Add, then find the Witcher 3 exe and select it. Then AFR will be selectable under frame pacing.


----------



## chronicfx

At least there's that. Fingers crossed for crossfire soon.


----------



## BradleyW

Installed Windows 10 and the latest AMD drivers from Windows Update. No CFX option in CCC. Both cards found in device manager. Help?


----------



## moorhen2

Quote:


> Originally Posted by *BradleyW*
> 
> Installed Windows 10 and the latest AMD drivers from Windows Update. No CFX option in CCC. Both cards found in device manager. Help?


Have a look here, you need this driver version for W10.

http://www.touslesdrivers.com/index.php?v_page=23&v_code=44143


----------



## YellowBlackGod

I also have Windows 10 build 10074 installed, and I am using the WDDM 2.0 driver along with CCC, which installs automatically. It runs pretty well. The problem is that this driver does not offer VSR in CCC. When I cleanly install the latest beta, Windows Update detects this and installs the WDDM driver again, and for the time being it is not possible to uninstall it or prevent the installation in Windows 10; it will be installed no matter what you do. So I use it (if you can't prevent something, just enjoy it), and at least it performs well.


----------



## LandonAaron

Does FreeSync automatically enable Vsync? Also, how will it behave in games that are locked to a particular refresh rate, like Rage, Dark Souls, and Gamebryo-engine games like Fallout and Skyrim, which only play correctly at 60Hz?

I just got thinking about FreeSync after the last couple of games I played didn't have a working Vsync option (even when forced through the driver). "Forced" is really a silly term: I have never found that enabling a feature in the control panel actually forces it on. It is much more like a suggestion, and the feature may or may not be enabled in game.


----------



## BradleyW

Quote:


> Originally Posted by *LandonAaron*
> 
> Does Freesync automatically enable Vsync? Also how will it behave on games that are locked to a particular refresh rate like Rage, Dark Souls, and Gamebryo Engine games like Fallout, and Skyrim that only play correctly at 60hz?
> 
> Just got thinking about Freesync after the last couple of games I have played don't have a working Vsync option, (even when forced through the driver). Which I really think is a silly term. I have never found that enabling a feature in the control panel actually forces it on. It is much more like a suggestion and the feature may or may not be enabled in game.


No, FreeSync does not enable Vsync by default. Games locked at 60Hz will work fine as long as the fps stays within the FreeSync range.
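The range condition above can be sketched as a toy model. The 40-144 Hz window below is only an example; the actual FreeSync window depends on the monitor:

```python
# Toy model of FreeSync behaviour: inside the panel's variable-refresh
# window the refresh rate follows the framerate; outside it the monitor
# falls back to its fixed maximum refresh. Example window, not a spec.

def effective_refresh(fps, window=(40, 144)):
    low, high = window
    if low <= fps <= high:
        return fps   # refresh tracks the framerate (FreeSync active)
    return high      # outside the window: fixed refresh, so you get
                     # tearing or vsync judder depending on settings

print(effective_refresh(60))  # a 60 fps lock sits inside the window
print(effective_refresh(30))  # below the window: fixed refresh again
```

So a game hard-locked to 60 fps behaves fine as long as 60 is inside the panel's window.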


----------



## Ha-Nocri

Quote:


> Originally Posted by *BradleyW*
> 
> Installed Windows 10 and the latest AMD drivers from Windows Update. No CFX option in CCC. Both cards found in device manager. Help?


How is win10 for gaming? And overall work.


----------



## kizwan

Mostly driving, running from the cops.

BTW, after updating to the 15.4 beta I forgot to disable ULPS. The second GPU got stuck at 100% clock & usage, which I think caused the game to crash. The second GPU was stuck at 100% clock & usage after the crash too. But I did have Open Hardware Monitor running in the background, which I closed when running the game after disabling ULPS. I'll investigate this later.

GTA 5 Frametime vs. Framerate


GTA 5 CPU usage


GPU usage is all over the place, pretty much like BF4 when playing at 1080p. Non-unified usage of course.

GTA 5 settings


----------



## StrongForce

Do you have an 8GB 290X??

http://pclab.pl/art57777-24.html Performance is so poor in GTA that I get lows in the 20s, especially paired with an AMD FX 8000-series, ugh. In other words, my config, lol.


----------



## kizwan

Quote:


> Originally Posted by *StrongForce*
> 
> Do you have a 8gb 290x ??
> 
> http://pclab.pl/art57777-24.html performances are so poor in gta I get lows of 20's.. especially paired with an AMD FX 8000's ugh, in other words, my config, lol.


I have a pair of 290 4GB cards in crossfire. The game shows the total VRAM for both cards.


----------



## aaroc

Quote:


> Originally Posted by *kizwan*
> 
> GTA 5 Frametime vs. Framerate
> 
> 
> GTA 5 CPU usage


How do you get those graphs?


----------



## kizwan

Quote:


> Originally Posted by *aaroc*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> GTA 5 Frametime vs. Framerate
> 
> 
> GTA 5 CPU usage
> 
> 
> 
> 
> 
> 
> 
> How do you get those graphs?
Click to expand...

Enable logging to a file in MSI AB (monitoring). Then edit the file a little by removing the *"* characters & saving it with a *.csv* extension. Then create the graph using MS Excel.
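For anyone who would rather script that workflow than use Excel, here is a rough Python sketch. The sample text and column order are made up for illustration; a real Afterburner log's columns depend on which sensors you enable, so adjust the indices to match your own header:

```python
# Rough sketch of kizwan's workflow in Python instead of Excel.
# The column layout below is an assumption -- check your own log header.

import csv
import io

def afterburner_rows(raw_text):
    """Strip the stray double quotes AB writes and parse the CSV rows."""
    cleaned = raw_text.replace('"', '')
    return list(csv.reader(io.StringIO(cleaned)))

# Tiny made-up excerpt standing in for a real monitoring log.
sample = (
    '"timestamp", "framerate", "frametime"\n'
    '"00:01", "58", "17.2"\n'
    '"00:02", "61", "16.4"\n'
)

rows = afterburner_rows(sample)
header = [h.strip() for h in rows[0]]
frametimes = [float(r[2]) for r in rows[1:]]
print(header)      # ['timestamp', 'framerate', 'frametime']
print(frametimes)  # [17.2, 16.4] -- values ready for plotting
```

From there, `frametimes` can go straight into any plotting tool instead of an Excel chart.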


----------



## Megagoth1702

Hey guys,

can you please help me find the most recent BIOS for the Sapphire R9 290 Tri-X OC (1000/1300)? The only R9 290-related BIOS I can find on the Sapphire site is this.

This is my current BIOS. The reason I want to flash a new one is that one of my three fans is going crazy for some reason.

Thanks to all of you in advance!


----------



## mfknjadagr8

Quote:


> Originally Posted by *StrongForce*
> 
> Do you have a 8gb 290x ??
> 
> http://pclab.pl/art57777-24.html performances are so poor in gta I get lows of 20's.. especially paired with an AMD FX 8000's ugh, in other words, my config, lol.


yeah, something's not right there... you shouldn't be seeing 20s unless you are running the advanced graphics options...


----------



## LandonAaron

Well, The Witcher 3 is out, but it's an Nvidia title with no crossfire support, for now anyway. Hopefully AMD will release a driver update soon, as I won't be playing when the best I can hope for is 30-ish FPS.


----------



## Klocek001

in crossfire, does the second card spin its fans at idle?


----------



## kizwan

Quote:


> Originally Posted by *Klocek001*
> 
> in crossfire, does the second card spin the fans in idle ?


Not if ULPS is enabled.


----------



## LandonAaron

Quote:


> Originally Posted by *Klocek001*
> 
> in crossfire, does the second card spin the fans in idle ?


As far as I am aware, all Radeon cards' fans spin all the time, even at idle. You can set the fan curve however you like, but the fans will still spin at least a little. As for setting the curve, both cards use the same fan curve, at least with MSI AB; I haven't found a way to give each card its own. But each card's fans will spin at a different speed depending on that card's temperature, so if one card is at 60 degrees and the other at 30, they will spin at different speeds because they fall on different points of the curve.

Also, you can set a different fixed speed for each card. So you could set one card's fixed fan speed to 70% and the other's to 25% if you like.
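The shared-curve behaviour described above comes down to each card evaluating the same (temperature, fan%) curve at its own temperature. A small sketch with made-up curve points (not Afterburner's defaults):

```python
# One shared fan curve, evaluated per card at that card's temperature.
# The (temp C, fan %) points below are illustrative, not AB defaults.

def fan_speed(temp_c, curve=((30, 20), (60, 45), (80, 75), (94, 100))):
    """Linear interpolation over (temp, fan%) points, clamped at the ends."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]

# Same curve, two cards at different temps -> different fan speeds:
print(fan_speed(30), fan_speed(60))
```

A card sitting at 30C idles at the curve's floor while its 60C sibling runs faster, even though both use the identical curve.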


----------



## Klocek001

Quote:


> Originally Posted by *kizwan*
> 
> No if ULPS is enabled.


Is it enabled by default ?


----------



## tsm106

Quote:


> Originally Posted by *Klocek001*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> No if ULPS is enabled.
> 
> 
> 
> Is it enabled by default ?
Click to expand...

Yes. But there's also ZeroCore at work: when the monitor goes to suspend, the GPU or GPUs enter ZeroCore mode and go to sleep, stopping the fans too.


----------



## Klocek001

Quote:


> Originally Posted by *tsm106*
> 
> Yes. But there's also zerocore at work which when the monitor goes to suspend, the gpu or gpus enter zerocore mode and go to sleep, stopping the fan too.


so if I get a 290 Vapor-X and put it as the top card and use my 290 Tri-X as the bottom card, then while working e.g. in MS Word I'll have just one fan running?


----------



## tsm106

Quote:


> Originally Posted by *Klocek001*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Yes. But there's also zerocore at work which when the monitor goes to suspend, the gpu or gpus enter zerocore mode and go to sleep, stopping the fan too.
> 
> 
> 
> so if I get 290 vaporx and put is as the top card and use my 290 trix as the bottom card then *while working e.g. in MS word I'll have just one fan running ?*
Click to expand...

If you have ULPS enabled and there is no load for X amount of time, the slave card, in this case the Tri-X, will go to sleep.


----------



## kizwan

Quote:


> Originally Posted by *Klocek001*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Yes. But there's also zerocore at work which when the monitor goes to suspend, the gpu or gpus enter zerocore mode and go to sleep, stopping the fan too.
> 
> 
> 
> 
> 
> 
> so if I get 290 vaporx and put is as the top card and use my 290 trix as the bottom card then while working e.g. in MS word I'll have just one fan running ?

The fans on the 290 Tri-X will stop when idling; this includes any task that doesn't support crossfire or doesn't utilize the second card, if ULPS is enabled.


----------



## Ramzinho

Quote:


> Originally Posted by *tsm106*
> 
> If you have ULPS enabled and over time if there is no load after X amount of time, the slave card in this case the trix, will go to sleep.


NOBODY GIVES THAT MAN REP.. it's the OPTIMUM NUMBER


----------



## PsYcHo29388

I don't like how after upgrading to a 290 my minimum framerates in GTA5 are still the same as they were on my 7850.

Does anyone think a driver update can fix this, or is the game just too CPU dependent for that to happen?


----------



## Klocek001

Quote:


> Originally Posted by *PsYcHo29388*
> 
> I don't like how after upgrading to a 290 my minimum framerates in GTA5 are still the same as they were on my 7850.
> 
> Does anyone think a driver update can fix this or is the game just too CPU dependant for that to happen?


I knew you had AMD before I even scrolled to your sig rig specs.
Quote:


> Originally Posted by *kizwan*
> 
> The fans on the 290 Tri-X will stop when idling, this include any tasks that doesn't support crossfire or doesn't utilize the second card if ULPS is enabled.


just btw, I heard you need to have ULPS disabled in AB to get more even load on 2 GPUs in crossfire. Is that true ?


----------



## tsm106

Quote:


> Originally Posted by *Klocek001*
> 
> just btw, I heard you need to have ULPS disabled in AB to get more even load on 2 GPUs in crossfire. Is that true ?


No. Load on GPUs is not directly a result of ULPS. ULPS shuts down slave cards in crossfire when idle, like on the desktop, since crossfire is not active anyway. It does, however, confuse OC apps. On point though, it's usually because of a misconfigured AB config, like not enabling unified monitoring.


----------



## Ha-Nocri

Quote:


> Originally Posted by *PsYcHo29388*
> 
> I don't like how after upgrading to a 290 my minimum framerates in GTA5 are still the same as they were on my 7850.
> 
> Does anyone think a driver update can fix this or is the game just too CPU dependant for that to happen?


What drivers are you using? AMD has a beta release just for GTA5. It greatly improved my experience, as there was no more visual stuttering...

Also, the game is really CPU dependent. I had to OC mine to 4GHz to get a stable 60fps (it goes down in some areas, but rarely). I think you will need a ~4.5GHz OC to match my single-core performance.


----------



## Klocek001

so I don't need AB for CF to work properly if running stock clocks?


----------



## rdr09

Quote:


> Originally Posted by *PsYcHo29388*
> 
> I don't like how after upgrading to a 290 my minimum framerates in GTA5 are still the same as they were on my 7850.
> 
> Does anyone think a driver update can fix this or is the game just too CPU dependant for that to happen?


cpu at stock?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *PsYcHo29388*
> 
> I don't like how after upgrading to a 290 my minimum framerates in GTA5 are still the same as they were on my 7850.
> 
> Does anyone think a driver update can fix this or is the game just too CPU dependant for that to happen?


Shouldn't be... might be that cpu. Get that thing to 4.8ghz or so and that should help.


----------



## PsYcHo29388

Quote:


> Originally Posted by *Ha-Nocri*
> 
> what drivers r u using? AMD have a beta release just for GTA5. Greatly improved my experience as there was no more visual stuttering...
> 
> Also the game is really CPU dependent. I had to OC mine to 4GHz to get stable 60fps (goes down in some areas, but rarely). I think u will need ~4.5GHz OC to match my single-core performance.


Yes, I've been using the 15.4 Beta drivers since GTA5 launched, did a full reinstall of them when I got my 290 as well.
Quote:


> Originally Posted by *MrWhiteRX7*
> Shouldn't be... might be that cpu. Get that thing to 4.8ghz or so and that should help.


That's what I was thinking too; looking at various videos and benchmarks, it seems Intel is just better for this game, which does indeed mean it's CPU dependent in certain areas, as I bog down all the way to 38FPS when driving from the docks back into the city.

How much of a voltage bump would you guys recommend for that speed?
Quote:


> Originally Posted by *rdr09*
> cpu at stock?


Yes, although I doubt overclocking will get me a ridiculous boost like I want (20-25FPS).


----------



## kizwan

So, Custom Resolution Utility (CRU) doesn't work with 290 crossfire??


----------



## rdr09

Quote:


> Originally Posted by *PsYcHo29388*
> 
> Yes, I've been using the 15.4 Beta drivers since GTA5 launched, did a full reinstall of them when I got my 290 as well.
> That's what I was thinking too, looking at various videos and benchmarks it seems Intel is just better for this game which does indeed mean It's CPU dependant in certain areas, as I bog down all the way to 38FPS when driving from the docks back into the city.
> 
> How much of a voltage bump would you guys recommend for that speed?
> Yes, although I doubt overclocking will get me a ridiculous boost like I want (20-25FPS).


your car will wait less.


----------



## Ha-Nocri

Quote:


> Originally Posted by *PsYcHo29388*
> 
> Yes, although I doubt overclocking will get me a ridiculous boost like I want (20-25FPS).


It gave me a huge boost going from stock 2.8GHz to 3.6GHz: 14 fps in the built-in benchmark. I was really shocked. Avg. fps went from 44 to 58.


----------



## the9quad

Quote:


> Originally Posted by *kizwan*
> 
> So, Custom Resolution Utility (CRU) doesn't work with 290 crossfire??


Yes it does, I have used it since day one. I install new drivers, run the pixel patcher, then run CRU. Works every time.


----------



## ShortySmalls

Anyone looking to watercool these beast, i got some EK FC blocks for sale









http://www.overclock.net/t/1554352/2x-ek-290x-nickel-plexi-waterblocks


----------



## LandonAaron

Quote:


> Originally Posted by *Klocek001*
> 
> I knew you had AMD before I unscrolled your sig rig specs.
> just btw, I heard you need to have ULPS disabled in AB to get more even load on 2 GPUs in crossfire. Is that true ?


Quote:


> Originally Posted by *tsm106*
> 
> No. Load on gpus is not directly a result of ULPS. ULPS shuts down slave cards in crossfire when idle, like on the desktop since crossfire is not active anyways. It however confuses oc apps. On point though, its usually because of a misconfigured AB config, like not enabling unified monitoring.


I thought unified monitoring just changed how the GPU usage graph is displayed, making it one combined graph instead of two separate graphs, one for each GPU. What effect could that have on gameplay or GPU load?


----------



## kizwan

Quote:


> Originally Posted by *the9quad*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> So, Custom Resolution Utility (CRU) doesn't work with 290 crossfire??
> 
> 
> 
> Yes it does, I have used it since day one. I install new drivers, run the pixel patcher, then run CRU., Works everytime.

Do you know whether the EDID override fails to work on certain LCD brands/models?


----------



## the9quad

Quote:


> Originally Posted by *kizwan*
> 
> Do you know whether EDID override doesn't work on certain LCD brand/model?


No idea, toastyx is a member on these boards and he is the man who made all that stuff. You could probably ask him, and he could shed some light.









or ask him on his forums

http://www.monitortests.com/forum/


----------



## kizwan

@ToastyX
Quote:


> Originally Posted by *the9quad*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Do you know whether EDID override doesn't work on certain LCD brand/model?
> 
> 
> 
> 
> 
> 
> 
> No idea, toastyx is a member on these boards and he is the man who made all that stuff. You could probably ask him, and he could shed some light.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> or ask him on his forums
> 
> http://www.monitortests.com/forum/

Well I managed to get it working. I can only go up to [email protected] However, the image on the left of the screen becomes jagged, blurry & doubled.


----------



## fyzzz

I just bought a second 290 (XFX) because it was on sale here where I live. I picked it up today and it has Hynix memory!


----------



## Dooderek

Quote:


> Originally Posted by *fyzzz*
> 
> I just bought a second 290 (xfx) because it was on sale here were live. I picked it up today and it has hynix memory!


Nice, my original XFX 290X reference was Elpida.


----------



## Ized

Would anyone be interested in a command line clock + voltage setting tool?

Buggy, I have no doubt, since I can't code; I just wanted to be able to set clocks from the command line.

Code:

        R9290OC.exe -c 999 -m 888 -v 100
        -c set core clock in Mhz
        -m set memory clock in Mhz
        -v set core voltage offset in millivolts.


----------



## ShortySmalls

Both of my 290x i had were lucky enough to have hynix memory chips.


----------



## LandonAaron

Does AMD publish a list of games that support Crossfire?


----------



## rdr09

disregard if repost . . . 15.5 Beta.

http://forums.guru3d.com/showthread.php?t=399245

install at your own risk.


----------



## Ramzinho

Quote:


> Originally Posted by *rdr09*
> 
> disregard if repost . . . 15.5 Beta.
> 
> http://forums.guru3d.com/showthread.php?t=399245
> 
> install at your own risk.


I think this has been posted somewhere, and people report no improvement in Witcher or pCars. And to be honest, if you're an AMD owner, don't even bother buying pCars; it's a PhysX-engine game.


----------



## rdr09

Quote:


> Originally Posted by *Ramzinho*
> 
> i think this has been posted somewhere.. and people report no enhancement at witcher or Pcars. and to be honest. if you are AMD owner.. don't even bother buying Pcars.. it's a physx engine game.


Disregard then. Thanks for the tip. still racing grid with my HD7770 . . .


----------



## Ramzinho

why are u selling ur 290X? prepping for 3XX series or going green?


----------



## rdr09

Quote:


> Originally Posted by *Ramzinho*
> 
> why are u selling ur 290X? prepping for 3XX series or going green?


Moving to the other side of the world. Brownouts are prevalent and the internet is spotty. I want to sell both 290s, but one of the waterblocks has stains and might not look too attractive to sell.


----------



## Ramzinho

Quote:


> Originally Posted by *rdr09*
> 
> moving to the other side of the world. brown outs are prevalent and internet spotty. i want to sell both 290s but one of the waterblocks has stains and might not look too attractive to sell.


Wishing you the best of luck in your move man...







we will miss you


----------



## rdr09

Quote:


> Originally Posted by *Ramzinho*
> 
> Wishing you the best of luck in your move man...
> 
> 
> 
> 
> 
> 
> 
> we will miss you


Thanks, Ram.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> moving to the other side of the world. brown outs are prevalent and internet spotty. i want to sell both 290s but one of the waterblocks has stains and might not look too attractive to sell.


You might enjoy this fs thread. GL on your move btw.


----------



## ShortySmalls

Quote:


> Originally Posted by *Ramzinho*
> 
> why are u selling ur 290X? prepping for 3XX series or going green?


I'm selling mine too soon, already sold one to an IRL buddy but the rest of my rig needs to go soon! Need a laptop so I can take it with me to the Navy instead of a desktop. Look for my thread soon









What's the going rate these days on a 290X? I know this isn't the right section, but a 290/290X club would know just as much as anyone. My card under water would hit 45°C tops at 1250 core / 1650 memory; I feel the memory would go higher, but Afterburner cut me off, the slider was maxed. Hynix memory chips.


----------



## Ramzinho

Quote:


> Originally Posted by *ShortySmalls*
> 
> I'm selling mine too soon, already sold one to an IRL buddy but the rest of my rig needs to go soon! Need a laptop so I can take it with me to the Navy instead of a desktop. Look for my thread soon
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Whats the going rate these days on a 290x? I know this isn't the right section but a 290/290x club would know just as much as anyone. My card under water would hit 45*c tops at 1250 core / 1650 memory, i feel the memory would go higher but afterburner cut me off, slider was maxed. Hynix memory chips.


That's an amazing chip. I would say an easy $240 for the GPU... I got mine for $210 and $220 shipped, so for a nice chip like this I'd say $240~255 shipped would be fine.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> You might enjoy this fs thread. GL on your move btw.


thanks, tsm. learned a lot from you man.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> You might enjoy this fs thread. GL on your move btw.
> 
> 
> 
> thanks, tsm. learned a lot from you man.

lmao, I forgot to actually post the link!

http://www.overclock.net/t/1299223/two-ek-fc580-acetal-nickel-sold


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> lmao, I forgot to actually post the link!
> 
> http://www.overclock.net/t/1299223/two-ek-fc580-acetal-nickel-sold


yup, that's what it looks like. cooling is not affected though.


----------



## Ized

Sharing my command line tool to set overclocks. I was having such horrid luck with Afterburner and TriXX eating my profiles; now my plan is to just create .bat files (see example folder).

Code:

        R9290OC.exe -c 1000 -m 1400 -v 100 -p 50

        -c set core clock in Mhz
        -m set memory clock in Mhz
        -v set core voltage offset in millivolts.
        100 millivolts = 0.100 volt. Valid range is 0-250
        -p set Power Limit percentage 0-50

http://www.mediafire.com/download/2ty83mpwt5dybao/R9290OC_1.rar

https://www.virustotal.com/en/file/a21b29e8c280e31ac420ea99e61fc4950028df7501f218eb0ef249e9d09fe492/analysis/1432343280/

I have only briefly tested it on my own system.
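In case anyone wants to see roughly how those flags map to values, here's a quick hypothetical sketch in Python (this is NOT Ized's actual code; the real tool presumably hands the numbers to AMD's overdrive interface, which is omitted here, so only the documented flag parsing and ranges are shown):

```python
import argparse

def parse_oc_args(argv):
    """Parse overclock settings matching the documented R9290OC flags.

    Sketch only: validates the documented ranges, nothing more.
    """
    p = argparse.ArgumentParser(prog="R9290OC")
    p.add_argument("-c", type=int, metavar="MHZ", help="core clock in MHz")
    p.add_argument("-m", type=int, metavar="MHZ", help="memory clock in MHz")
    p.add_argument("-v", type=int, default=0,
                   help="core voltage offset in millivolts (0-250)")
    p.add_argument("-p", type=int, default=0,
                   help="power limit percentage (0-50)")
    args = p.parse_args(argv)
    if not 0 <= args.v <= 250:
        p.error("-v must be 0-250 (100 mV = 0.100 V)")
    if not 0 <= args.p <= 50:
        p.error("-p must be 0-50")
    return args

args = parse_oc_args(["-c", "1000", "-m", "1400", "-v", "100", "-p", "50"])
print(args.c, args.m, args.v, args.p)
```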


----------



## kizwan

Nice effort @Ized! I didn't try it but rep for effort nonetheless.


----------



## fyzzz

I got this result: http://www.3dmark.com/3dm/7067392? running 1170/1550 on my new R9 290. It seems high


----------



## Agent Smith1984

Quote:


> Originally Posted by *fyzzz*
> 
> I got this result: http://www.3dmark.com/3dm/7067392? running 1170/1550 on my new R9 290. It seems high


Damn, that's nice on the graphics! You gotta bump that cpu up though


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> I got this result: http://www.3dmark.com/3dm/7067392? running 1170/1550 on my new R9 290. It seems high


might be a 290X in disguise.

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Damn, that's nice on the graphics! You gotta bump that cpu up though


AS,

http://www.amazon.com/Philips-BDM4065UC-Resolution-Speakers-DisplayPort/dp/B00SCX78JS


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> might be a 290X in disguise.
> AS,
> 
> http://www.amazon.com/Philips-BDM4065UC-Resolution-Speakers-DisplayPort/dp/B00SCX78JS


I don't think my XFX 290 will unlock. Hawaii info reads: Memory config: 0x500046A9 Hynix
RA1: F8010005 RA2: 00000000
RB1: F8010005 RB2: 00000000
RC1: F8010005 RC2: 00000000
RD1: F8010005 RD2: 00000000


----------



## fyzzz

Is this score: http://www.3dmark.com/3dm/7070139 good for 290 crossfire? I'm running my xfx 290 with my asus 290. So overkill but fun.


----------



## kizwan

Yup, the graphics score is pretty much similar to what I got at the same clocks.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> Is this score:http://www.3dmark.com/3dm/7070139 good for 290 crossfire? I'm running my xfx 290 with my asus 290. So overkill but fun.


your graphics score is about 1000 pts higher but i am comparing it to my system running Win7. Win 8 gives a boost but not that much. don't let go of those. could be your ram.


----------



## Ramzinho

When did Firestrike get this "4K GAMING PC" category, and why is it set to dual 980s by default? OK... I hate these marketing gimmicks.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Ramzinho*
> 
> When did firestrike Has this "4K GAMING PC Category" and why is it set to dual 980 at default? ok... i hate this marketing gimmicks.


I agree, I think it's kinda lame. I'm a little glad that my rig beat that "4K Gaming PC" score, even if it's barely 90 points above it.









I do find it odd though that for 3DM11 the "4K Gaming PC" score seems a lot lower. I barely beat it in Firestrike, yet I'm over 6,000 pts ahead of it on 3DM11.

For anyone curious: http://www.3dmark.com/3dm11/8902300
http://www.3dmark.com/3dm/4569561
These runs were on older drivers, and the Firestrike says it's invalid because I skipped the intro scene.


----------



## tsm106

I can kill that 3dm11 4K gaming score too.









http://www.3dmark.com/3dm11/8134812


----------



## Ramzinho

Quote:


> Originally Posted by *tsm106*
> 
> I can kill that 3dm11 4K gaming score too.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/8134812


THAT OC


----------



## tsm106

I miss those 7970s.









Had four benching at 1340/1850 at their prime.


----------



## Ramzinho

I just hope when I finish my rig I can get my 290Xs to 1100/1400, and I'll be so happy


----------



## fyzzz

http://www.3dmark.com/3dm/7071547 here is a result at 1150/1500. I can't push it much more because of the massive heat problems, and I only have a 750W PSU.


----------



## Ramzinho

Quote:


> Originally Posted by *fyzzz*
> 
> http://www.3dmark.com/3dm/7071547 here is a result at 1150/1500. I can't push it much more because of the massive heat problems and i only got a 750w psu.


how are you getting lower scores than this?

http://www.3dmark.com/3dm/4569561

maybe you are throttling


----------



## cephelix

Quote:


> Originally Posted by *fyzzz*
> 
> http://www.3dmark.com/3dm/7071547 here is a result at 1150/1500. I can't push it much more because of the massive heat problems and i only got a 750w psu.


Those scores








I've been thinking for quite some time whether I should get a second 290


----------



## kizwan

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ramzinho*
> 
> When did firestrike Has this "4K GAMING PC Category" and why is it set to dual 980 at default? ok... i hate this marketing gimmicks.
> 
> 
> 
> 
> 
> 
> I agree, I think it's kinda lame. I'm a little glad that my rig beat that "4K Gaming PC" score, even if it's barely 90 points above it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I do find it odd though that for 3DM11 the "4K Gaming PC" score seems a lot lower. I barely beat it in Firestrike, yet I'm over 6,000 pts ahead of it on 3DM11.
> 
> For anyone curious: http://www.3dmark.com/3dm11/8902300
> http://www.3dmark.com/3dm/4569561
> These runs were on older drivers, and the Firestrike says it's invalid because I skipped the intro scene.

You mean the Demo? I skipped the Demo too, but mine is a valid result. That's not the reason the result is invalid. It's invalid because you're using a beta driver and something is wrong with your motherboard's real-time clock. Google WinTimerTester and test; you should get a 0.9999 to 1.0000 ratio. If not, something is wrong there.
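For anyone curious what that ratio actually is: WinTimerTester times the same interval with two independent clocks and divides one elapsed time by the other. A rough cross-platform sketch of the same idea (the Python clock names here are just stand-ins for QueryPerformanceCounter vs. the wall clock that WinTimerTester compares):

```python
import time

def timer_ratio(duration=0.25):
    # Time one interval with two independent clocks; on a healthy
    # machine the elapsed times agree and the ratio lands near 1.0,
    # which is what WinTimerTester's 0.9999-1.0000 range checks.
    p0, m0 = time.perf_counter(), time.monotonic()
    time.sleep(duration)
    p1, m1 = time.perf_counter(), time.monotonic()
    return (p1 - p0) / (m1 - m0)

print(f"timer ratio: {timer_ratio():.4f}")
```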
Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fyzzz*
> 
> Is this score:http://www.3dmark.com/3dm/7070139 good for 290 crossfire? I'm running my xfx 290 with my asus 290. So overkill but fun.
> 
> 
> 
> your graphics score is about 1000 pts higher but i am comparing it to my system running Win7. Win 8 gives a boost but not that much. don't let go of those. could be your ram.

Really? My graphics score is also in the same range.

http://www.3dmark.com/compare/fs/4897920/fs/4710955


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> http://www.3dmark.com/3dm/7071547 here is a result at 1150/1500. I can't push it much more because of the massive heat problems and i only got a 750w psu.


you beat my 1200 . . .

http://www.3dmark.com/3dm/4550854?

Quote:


> Originally Posted by *kizwan*
> 
> You mean the Demo? I skipped Demo too but mine valid result. That's not the reason why the result is invalid. It's invalid because you're using beta driver & something wrong with your motherboard real time clock. Google for WinTimerTester & test. You should get 0.9999 to 1.0000 ratio. If not something wrong there.
> Really? My graphic score also in the same range.
> 
> http://www.3dmark.com/compare/fs/4897920/fs/4710955


could be my system. fy even beats my 1200 with his 1150.









my only consolation is i can oc higher . . .

http://www.3dmark.com/3dm/4644282?

my second 290 is holding back my primary a bit.


----------



## HOMECINEMA-PC

I remembered how to fix my stupid 124 BSOD that was coming up in nearly all of my games...
Upgraded to the latest driver (not beta)


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I remembered how to fix my stupid 124 bsod that was coming up on nearly all of my games ...........
> Upgraded to latest driver ( not beta )


eh, 124 is commonly related to cpu.


----------



## fyzzz

Crossfire is literally crossFIRE. The temperatures are insane and I may go back to a single 290. In BF4 the top card (Asus DC2) got to around 80c at 88% fan speed, and I run all my case fans at 100%.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> eh, 124 is commonly related to cpu.


I know right









All of them load up now, even on different overclocks, but before, not even at stock settings


----------



## ambientblue

Can somebody help me please? I've been benching this new rig, but I don't understand how my temps are even possible; perhaps they are wrong?

I've got a 5820K @ 4GHz and a 290X @ stock 1GHz core, 1250MHz memory. This is water-cooled and the loop goes PUMP/RES>RAD>GPU>CPU.

Is it possible that my GPU is hitting 93C in a matter of minutes, like 10 mins, but the CPU never passes 48C on any core?? Also, whenever I'm running a benchmark, as soon as it ends I bring up the temps and the GPU is always at 50C or lower immediately, and I've used 3 programs to check temps so far. I'm confused.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> Crossfire is literally crossFIRE. The tempertures are insane and i may go back to a single 290. Under bf4 the top card (asus dc2) got to around 80c at 88% fanspeed and i run all my case fans at 100%.


asus heats up by itself.

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I know right
> 
> 
> 
> 
> 
> 
> 
> 
> 
> All of them load up now even on diff over clocks but before not even at stock settings


knowing you . . . 5GHz is stock for you.lol


----------



## cephelix

Quote:


> Originally Posted by *fyzzz*
> 
> Crossfire is literally crossFIRE. The tempertures are insane and i may go back to a single 290. Under bf4 the top card (asus dc2) got to around 80c at 88% fanspeed and i run all my case fans at 100%.


Did you OC your cards?
Quote:


> Originally Posted by *rdr09*
> 
> asus heats up by itself.


Yeah, but I don't think MSI is any better this time around. Should've gotten the Vapor-X when I made my purchase


----------



## rdr09

Quote:


> Originally Posted by *ambientblue*
> 
> can somebody help me please. Ive been benching this new rig but I don't understand how my temps are even possible, perhaps they are wrong???
> 
> I've got a 5820k @ 4GHZ and 290x @ stock 1ghz core, 1250mhz memory. This is water-cooled and the loop goes PUMP/RES>RAD>GPU>CPU.
> 
> Is it possible that my GPU is hitting 93C in a matter of minutes, like 10 mins, but the CPU is never passing like 48C on any core?? Also, Whenever im running a benchmark as soon as it ends I bring up the temps and the GPU is always at like 50C or lower immediately, and I've used 3 programs to check temps so far. Im confused.


is this with a full waterblock? mounting is definitely messed up. what waterblock?

also, what thermal paste did you use?
Quote:


> Originally Posted by *cephelix*
> 
> Did you OC your cards?
> yeah,but i don't think msi is any better this time around. Should've gotten the vapor x when i made my purchase


yes, the Gaming with only two fans. Even those with three fans are struggling.

edit: scroll down to post # 21896 . . .

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781

it has info about asus.


----------



## ambientblue

Quote:


> Originally Posted by *rdr09*
> 
> is this with a full waterblock? mounting is definitely messed up. what waterblock?
> 
> also, what thermal paste did you use?
> yes, the gaming with only two fans. even with those with three fans are struggling.
> 
> edit: scroll down to post # 21896 . . .
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781
> 
> it has info about asus.


yes full block, bitspower.

I used arctic silver ceramique 2.

I don't think I messed up the mounting at all. The only thing I can think of is maybe too much thermal paste, but it couldn't be making that much of a difference. I don't think I used much more than on the CPU, if any. And it certainly wasn't more than the manufacturer used with the reference heatsink.

I don't get how the GPU could be that hot if the CPU is in the same loop right after it and isn't affected at all. Can't even feel heat from the GPU if I touch the block or anything either :S


----------



## FastEddieNYC

Quote:


> Originally Posted by *ambientblue*
> 
> yes full block, bitspower.
> 
> I used arctic silver ceramique 2.
> 
> I don't think I messed up the mounting at all. Only thing I can think of is maybe too much thermal paste but it couldn't be making that much of a difference. I don't think I used much more than the CPU if any. And it certainly wasn't more than the manufacturer used with the reference heatsink.
> 
> I don't get how the gpu could be that hot if the cpu is in the same loop right after it and isn't affected at all. Cant even feel heat from the GPU if I touch the block or anything either :S


Are you using the backplate also? It helps supply uniform pressure for the block. It is nearly impossible to have a bad mount with a full cover block. It is either not enough pressure between the block/card or there is some foreign matter between the card and block. Your temps should be, depending on the room temp, 28~35°C idle, 40 to 50°C loaded.


----------



## Mega Man

Quote:


> Originally Posted by *fyzzz*
> 
> Crossfire is literally crossFIRE. The tempertures are insane and i may go back to a single 290. Under bf4 the top card (asus dc2) got to around 80c at 88% fanspeed and i run all my case fans at 100%.


How so? All 4 of mine never hit 40 during gaming
Quote:


> Originally Posted by *ambientblue*
> 
> can somebody help me please. Ive been benching this new rig but I don't understand how my temps are even possible, perhaps they are wrong???
> 
> I've got a 5820k @ 4GHZ and 290x @ stock 1ghz core, 1250mhz memory. This is water-cooled and the loop goes PUMP/RES>RAD>GPU>CPU.
> 
> Is it possible that my GPU is hitting 93C in a matter of minutes, like 10 mins, but the CPU is never passing like 48C on any core?? Also, Whenever im running a benchmark as soon as it ends I bring up the temps and the GPU is always at like 50C or lower immediately, and I've used 3 programs to check temps so far. Im confused.


sounds like a mounting issue to me


----------



## tsm106

Quote:


> Originally Posted by *ambientblue*
> 
> I don't get how the gpu could be that hot if the cpu is in the same loop right after it and isn't affected at all. Cant even feel heat from the GPU if I touch the block or anything either :S


With that core temp, it's an obvious mounting issue. Ya can't feel heat because it's probably barely touching.


----------



## Arizonian

Quote:


> Originally Posted by *PsYcHo29388*
> 
> Guess it's time for me to join.
> 
> XFX R9 290 Double D
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/details.php?id=u7kkc


Congrats - added









I have to apologize for missing this one until now. If I've missed anyone else in the past week plz PM me. It's been a busy week


----------



## fyzzz

http://www.3dmark.com/3dm/7083372? here is a run at 1210/1570MHz with my XFX 290. I think it is quite high for that clock; I am almost at my Asus score, and the Asus was clocked around 1270 with a higher memory clock.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> http://www.3dmark.com/3dm/7083372? here is run at 1210/1570 mhz with my xfx 290. I think it is quite high for that clock i am almost at my asus score and the asus was clocked around 1270 and higher memory clock.


yes it is faster. that's my 1260. is it the DD version of XFX? how is the cooling?


----------



## kizwan

false
Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fyzzz*
> 
> http://www.3dmark.com/3dm/7083372? here is run at 1210/1570 mhz with my xfx 290. I think it is quite high for that clock i am almost at my asus score and the asus was clocked around 1270 and higher memory clock.
> 
> 
> 
> yes it is faster. that's my 1260. is it the DD version of XFX? how is the cooling?

Comparing to mine, 13K graphic score @1200/1500.

http://www.3dmark.com/compare/fs/4907617/fs/2889070


----------



## fyzzz

@rdr09 yup, it is the DD version. The original cooler was actually quite good; it kept the card around 70c and was not too loud. Right now I have the Raijintek Morpheus installed and it's working well, but it seems like it's running hotter than my Asus card


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> false
> Comparing to mine, 13K graphic score @1200/1500.
> 
> http://www.3dmark.com/compare/fs/4907617/fs/2889070


fy's 290 is at 1210 scoring 13600 in graphics.

Quote:


> Originally Posted by *fyzzz*
> 
> @rdr09yup it is the dd version. The orignal cooler was actually quite good it kept the card around 70c and was not too loud. Right know i have the raijintek morpheus installed and it's working good,but it seems like it's running hotter than my asus card


must have a newer bios. you did not flash the bios with any other? you do have a nice oc on the VRAM.


----------



## fyzzz

@rdr09 yup, the BIOS is stock and the card has Hynix memory


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> @rdr09yup the bios is stock and the card has hynix memory


keep that if you won't keep crossfire. It's almost as fast as a 290X.


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Comparing to mine, 13K graphic score @1200/1500.
> 
> http://www.3dmark.com/compare/fs/4907617/fs/2889070
> 
> 
> 
> fy's 290 is at 1210 scoring 13600 in graphics.

Yeah, I know. I'm just saying mine got 13,000 at 1200/1500, since you said that score is what you got at 1260. Sure, mine is 600 less than his.

P/S: I don't know how the hell the word "false" got into my previous post. I didn't type it.


----------



## tsm106

http://www.3dmark.com/compare/fs/4908298/fs/4907617

The newer drivers are quick it looks like. Looking at the scores above, it's pretty close noting two things. I have the obvious lead in physics with cpu and fy is making good use of the Ivy/Hasw bias in firestrike. I bet that as soon as a heavier load bench was run like ext or ultra, the 290 would drop behind... well there's also the point that you lose the Ivy gscore bias when not running firestrike og.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> http://www.3dmark.com/compare/fs/4908298/fs/4907617
> 
> The newer drivers are quick it looks like. Looking at the scores above, it's pretty close noting two things. I have the obvious lead in physics with cpu and fy is making good use of the Ivy/Hasw bias in firestrike. I bet that as soon as a heavier load bench was run like ext or ultra, the 290 would drop behind... well there's also the point that you lose the Ivy gscore bias when not running firestrike og.


wow, same oc.


----------



## Candyman315

I have one... what do i need to post to confirm? Do i need bench results?

EDIT: Is this good?



Spoiler: Warning: Spoiler!


----------



## Dooderek

http://www.3dmark.com/fs/4908603

1100/1350

Here is also a compilation of Valley benchmarks showing how CPU frequency and SSDs affect the synthetic benchmark. Also not posted above, but my original 3DMark score was 1100 pts lower, same GPU clock speeds but at 4.1 GHz CPU and an HDD. I believe the major increase in FPS is a result of scene changes where the FPS drops low and skews the average. By bumping the CPU frequency you decrease the time spent changing scenes, resulting in a higher average FPS. I believe the SSD does the same thing.
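The averaging effect described here can be sketched with toy numbers (the frame times below are made up for illustration, not measurements from these runs):

```python
# A handful of long scene-change stalls drags the average FPS down hard,
# even when the rest of the run holds ~60 FPS.
def avg_fps(frame_times_ms):
    return len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

steady = [16.7] * 1000                   # ~60 FPS the whole run
with_stalls = [16.7] * 990 + [500] * 10  # ten 0.5 s scene-change stalls

print(round(avg_fps(steady), 1))       # ~59.9
print(round(avg_fps(with_stalls), 1))  # ~46.4
```

Shortening the stalls (faster CPU, SSD instead of HDD) recovers most of the lost average, which matches the jump described above.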










Oh, and here is a picture of my finished build. Don't mind the arrows; I was using them to demonstrate something.


----------



## PsYcHo29388

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PsYcHo29388*
> 
> Guess it's time for me to join.
> 
> XFX R9 290 Double D
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/details.php?id=u7kkc
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have to apologize for missing this one until now. If I've missed anyone else in the past week plz PM me. It's been a busy week
Click to expand...

Thank you


----------



## ambientblue

Quote:


> Originally Posted by *FastEddieNYC*
> 
> Are you using the backplate also? It helps supply uniform pressure for the block. It is nearly impossible to have a bad mount with a full cover block. It is either not enough pressure between the block/card or there is some foreign matter between the card and block. Your temps should be, depending on the room temp, 28~35c idle, 40 to 50c loaded.


I've used a backplate, yes.

Is it possibly just driver issues?


----------



## Dooderek

Quote:


> Originally Posted by *ambientblue*
> 
> I've used a backplate, yes.
> 
> Is it possibly just driver issues?


No dude, it's hardware related. Do you still get temp spikes? Maybe it was an air bubble or things weren't tight.


----------



## ambientblue

AH yes, thank you guys so much

Weird. I went and checked the screws and they weren't very tight; some were REALLY loose. Not sure how it got that way, since I tightened them all evenly the first time, but clearly that wasn't enough.

Now I'm getting 53C at 100% load, 1000MHz core, 1250MHz memory.

Is there any way the GPU block could have been damaged by the light testing done when it was hitting 90C? I'm guessing not, but I don't know what the nickel-plated copper can handle.


----------



## Dooderek

Quote:


> Originally Posted by *ambientblue*
> 
> AH yes, thank you guys so much
> 
> Weird. I went and checked the screws and they weren't very tight; some were REALLY loose. Not sure how it got that way, since I tightened them all evenly the first time, but clearly that wasn't enough.
> 
> Now I'm getting 53C at 100% load, 1000MHz core, 1250MHz memory.
> 
> Is there any way the GPU block could have been damaged by the light testing done when it was hitting 90C? I'm guessing not, but I don't know what the nickel-plated copper can handle.


No.


----------



## Arizonian

Quote:


> Originally Posted by *Candyman315*
> 
> I have one... what do i need to post to confirm? Do i need bench results?
> 
> EDIT: Is this good?
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Candyman315

Quote:


> Originally Posted by *Arizonian*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Candyman315*
> 
> I have one... what do i need to post to confirm? Do i need bench results?
> 
> EDIT: Is this good?
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
Click to expand...

Heck yeah! Now how do I add that badge?


----------



## chiknnwatrmln

Quote:


> Originally Posted by *kizwan*
> 
> You mean the Demo? I skipped the Demo too, but mine is a valid result. That's not the reason the result is invalid. It's invalid because you're using a beta driver & something is wrong with your motherboard's real-time clock. Google WinTimerTester & test. You should get a 0.9999 to 1.0000 ratio. If not, something is wrong there.
> Really? My graphics score is also in the same range.
> 
> http://www.3dmark.com/compare/fs/4897920/fs/4710955


From the 3D11 site:
Quote:


> Time measuring inaccurate
> 
> This message indicates funny business with the system clock during benchmark run. In most cases, this means that, no, you cannot cheat in 3DMark by adjusting Windows timers during the benchmark run or otherwise tampering with the measurements done by the benchmark. If this message persists and you have not done anything out of the ordinary, it may indicate a hardware issue with the real time clock in your system or the presence of a background program that somehow twists the time-space continuum of your operating system in such a way that this anti-cheat detection is tripped.


Looks like I should contact the guys over at CERN, my rig may have created a black hole.








In all seriousness though, I ran these benches a long, long time ago and haven't had any issues since. I'll look into it, but I think it might have just been a glitch. If it matters, I used Task Manager to kill the intro scene. I didn't get the error when I sat through it.
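For reference, WinTimerTester's ratio is essentially the high-resolution performance counter measured against the wall clock; a rough cross-platform sketch of the same idea:

```python
import time

# Compare the high-resolution counter against the wall clock over a fixed
# interval, similar in spirit to WinTimerTester's 0.9999-1.0000 check.
def timer_ratio(seconds=2.0):
    wall_start, perf_start = time.time(), time.perf_counter()
    time.sleep(seconds)
    wall_elapsed = time.time() - wall_start
    perf_elapsed = time.perf_counter() - perf_start
    return perf_elapsed / wall_elapsed

print(f"{timer_ratio():.4f}")  # a healthy system stays very close to 1.0000
```

A ratio that drifts well away from 1.0 would point at the real-time clock issue 3DMark's message describes.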


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> asus heats up by itself.
> knowing you . . . 5GHz is stock for you.lol


On Sandybee it was, not so much on Ivybee

But [email protected]@1.36vc daily on 4960x is alrighty


----------



## moorhen2

Here's a FireStrike run from me, GPU's at 1100/1475 stock voltages.

http://s572.photobucket.com/user/moorhen2/media/Capture2121.jpg.html

http://www.3dmark.com/3dm/7090027

VDDC Offset of 31 is stock for these cards using the latest version of Trixx.


----------



## fyzzz

Errmm guys, what is up with my 290: http://www.3dmark.com/3dm/7091136? Just look, the scores are insane. During that run the card was clocked to 1230/1600, but still. About the "time measuring inaccurate" thing, I get it every time I close the demo. I don't know why, but I have to close the demo, otherwise VRM 1 will cook itself.


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> On Sandybee it was , not so much on Ivybee
> 
> But [email protected]@1.36vc daily on 4960x is alrighty
> 
> 
> Spoiler: Warning: Spoiler!


with as many gpus as you run . . . you can't leave your cpu at stock.

For those looking to replace the original pads that came with your waterblocks . . .

http://search.store.yahoo.net/svcompucycle/cgi-bin/nsearch?catalog=svcompucycle&query=fujipoly

not extreme or ultra, but they will work just as well.


----------



## ambientblue

Quote:


> Originally Posted by *fyzzz*
> 
> Errmm guys, what is up with my 290: http://www.3dmark.com/3dm/7091136? Just look, the scores are insane. During that run the card was clocked to 1230/1600, but still. About the "time measuring inaccurate" thing, I get it every time I close the demo. I don't know why, but I have to close the demo, otherwise VRM 1 will cook itself.


Wow that score is crazy. What is Pineview Industries btw??


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> asus heats up by itself.
> knowing you . . . 5GHz is stock for you.lol
> 
> 
> 
> 
> 
> 
> On Sandybee it was , not so much on Ivybee
> 
> But [email protected]@1.36vc daily on 4960x is alrighty
Click to expand...



Quote:


> Originally Posted by *fyzzz*
> 
> Errmm guys, what is up with my 290: http://www.3dmark.com/3dm/7091136? Just look, the scores are insane. During that run the card was clocked to 1230/1600, but still. About the "time measuring inaccurate" thing, I get it every time I close the demo. I don't know why, but I have to close the demo, otherwise VRM 1 will cook itself.


Mine is probably the premium version, because I can set it to not run the demo before benching.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> Errmm guys, what is up with my 290: http://www.3dmark.com/3dm/7091136? Just look, the scores are insane. During that run the card was clocked to 1230/1600, but still. About the "time measuring inaccurate" thing, I get it every time I close the demo. I don't know why, but I have to close the demo, otherwise VRM 1 will cook itself.


that ivy/hasw bias is working like tsm said. the only time i hit 14K in graphics was at 1305MHz on the core using Win 10. lol

i would like to see what you get in Extreme, though, and see if ivy lives up . . .

http://www.3dmark.com/3dm/2098310

that's with an older driver.


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> that ivy/has bias is working like tsm said. the only time i hit 14K in graphics was at 1305MHz on the core using Win 10. lol


I'm running Windows 10 here, and today I bumped the CPU up to 4.9 and the GPU a bit higher, and that was the result.


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> that ivy/has bias is working like tsm said. the only time i hit 14K in graphics was at 1305MHz on the core using Win 10. lol
> 
> i would like to see what you get in Extreme, though, and see if ivy lives up . . .
> 
> http://www.3dmark.com/3dm/2098310
> 
> that's with an older driver.


Sadly I can't run Extreme because I only have the basic version of 3DMark.


----------



## chronicfx

I'm using trifire 290x at an eyefinity resolution of 8560x1440. The game runs smooth and stays over 45 FPS at worst without noticeable stutter. But when I enter the war room or use the skull thing to find hidden shards on the landscape, everything occurs in slow motion. The walk to the war table takes at least an entire minute, seemingly at 2-3 FPS; the same thing happens when using the occularium (sp?). But everything else, even other cutscenes, appears normally. I was thinking maybe it was a swap file loading (I run six Crucial M4s in RAID 0, and even though it crushes everything in benchmarks, I sometimes question whether it is better than a single drive for normal operation). Or is it multi-monitor related? Does anyone with AMD crossfire have these issues?


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> Sadly i can't run extreme because i only have the basic version of 3dmark


no problem. but using the same system, your other 290 can't score as high?


----------



## kizwan

Quote:


> Originally Posted by *chronicfx*
> 
> I'm using trifire 290x at an eyefinity resolution of 8560x1440. The game runs smooth and stays over 45 FPS at worst without noticeable stutter. But when I enter the war room or use the skull thing to find hidden shards on the landscape, everything occurs in slow motion. The walk to the war table takes at least an entire minute, seemingly at 2-3 FPS; the same thing happens when using the occularium (sp?). But everything else, even other cutscenes, appears normally. I was thinking maybe it was a swap file loading (I run six Crucial M4s in RAID 0, and even though it crushes everything in benchmarks, I sometimes question whether it is better than a single drive for normal operation). Or is it multi-monitor related? Does anyone with AMD crossfire have these issues?


What game did you play?


----------



## the9quad

Quote:


> Originally Posted by *chronicfx*
> 
> I'm using trifire 290x at an eyefinity resolution of 8560x1440. The game runs smooth and stays over 45 FPS at worst without noticeable stutter. But when I enter the war room or use the skull thing to find hidden shards on the landscape, everything occurs in slow motion. The walk to the war table takes at least an entire minute, seemingly at 2-3 FPS; the same thing happens when using the occularium (sp?). But everything else, even other cutscenes, appears normally. I was thinking maybe it was a swap file loading (I run six Crucial M4s in RAID 0, and even though it crushes everything in benchmarks, I sometimes question whether it is better than a single drive for normal operation). Or is it multi-monitor related? Does anyone with AMD crossfire have these issues?


Same thing on my rig.


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> no problem. but using the same system . . ., your other 290 can't score as high?


I don't think so. My other 290 ran colder, at higher voltages, and clocked higher, but it still wasn't near that score.


----------



## Ized

I've noticed I never really get the same score twice.

One day I will score higher on a lower clock etc.

http://www.3dmark.com/fs/4899315

My extreme scores will range from 5400 to 5850ish.

Also disappointing that half the time 3DMark records the wrong info for my system in terms of CPU/GPU clocks. Glad I only paid $2 or something.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *chronicfx*
> 
> I'm using trifire 290x at an eyefinity resolution of 8560x1440. The game runs smooth and stays over 45 FPS at worst without noticeable stutter. But when I enter the war room or use the skull thing to find hidden shards on the landscape, everything occurs in slow motion. The walk to the war table takes at least an entire minute, seemingly at 2-3 FPS; the same thing happens when using the occularium (sp?). But everything else, even other cutscenes, appears normally. I was thinking maybe it was a swap file loading (I run six Crucial M4s in RAID 0, and even though it crushes everything in benchmarks, I sometimes question whether it is better than a single drive for normal operation). Or is it multi-monitor related? Does anyone with AMD crossfire have these issues?


Try running res @ 7680 or 7800x1440p for good frames


----------



## moorhen2

Same settings, Windows 10.

http://www.3dmark.com/3dm/7093036


----------



## Dooderek

Quote:


> Originally Posted by *ambientblue*
> 
> Wow that score is crazy. What is Pineview Industries btw??


Xfx


----------



## snow cakes

Stock cooling. I can't post the GPU-Z score because this is a brand new build; all my parts will be here Tuesday.


----------



## Jyve

Hey fellas. I have a reference asus 290 running on a corsair hg10/h55 setup. Temps on both the core and vrm are just fine.

I see lots of stellar overclocks here and I'm jealous. Any time I put my vram over 1200 I get the black screen issue. I've messed around with both regular voltage as well as aux voltage to no avail. Unfortunately it has the elpida ram on it.

I've exhausted about all options here and have just resigned myself to how this card runs. Honestly it does just about everything I want it to anyway.

I'm running on a Qnix 1440p monitor (same issue on a 1200p monitor). The black screen problem only happens in 3d, gaming only. No movie watching or surfing. I've disabled ulps, no change.

I figured I'd ask around here as a last resort, otherwise I'll just live with it. I purchased this used off CL so no rma'ing. The guy said he didn't register it, so I suppose I could if I wanted to, but honestly I don't want to go through the hassle of putting the stock cooler back on.

Any ideas? Anything I might have missed?


----------



## Arizonian

Quote:


> Originally Posted by *moorhen2*
> 
> Here's a FireStrike run from me, GPU's at 1100/1475 stock voltages.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s572.photobucket.com/user/moorhen2/media/Capture2121.jpg.html
> 
> 
> 
> http://www.3dmark.com/3dm/7090027
> 
> VDDC Ofsett of 31 is stock for these cards using the latest version of Trixx.


Congrats - added








Quote:


> Originally Posted by *snow cakes*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Stock cooling, i cant post the gpu-z score because this is a brand new build, all my parts will be here tuesday ?


Congrats - added


----------



## hidethecookies

I'd like to join the club.
MSI 290 Gaming 4G w/ Kraken G10 and Corsair H75


----------



## Ized

Quote:


> Originally Posted by *Jyve*
> 
> Hey fellas. I have a reference asus 290 running on a corsair hg10/h55 setup. Temps on both the core and vrm are just fine.
> 
> I see lots of stellar overclock here and I'm jealous. Any time I put my vram over 1200 I get the black screen issue. I've messed around with both regular voltage as well as Aux voltage to no avail. Unfortunately it has the elpida ram on it.
> 
> I've exhausted about all options here and have just resigned myself to how this card runs. Honestly it does just about everything I want it to anyway.
> 
> I'm running on a Qnix 1440p monitor (same issue on a 1200p monitor) . The black screen problem only happens in 3d. Gaming only. No movie watching or surfing. I've disabled ulps, no change.
> 
> I figured I'd ask around here as a last resort, otherwise I'll just live with it. I purchased this used off CL so no rma'ing. Guy said he didn't register it so I suppose I could if I wanted to but honestly I don't want to go through the hassle of putting the stock cooler back on.
> 
> Any ideas? Anything I might have missed?


Are you running a stone-age driver (from the CD?!)? Back at launch my card couldn't do bugger all in terms of memory clocks while gaming. Now it's happy to sit at 1500MHz gaming on stock volts all day.

I guess I have also changed BIOSes a few dozen times since then too - might be worth trying the ASUS GPU Tweak app, since it has a BIOS updater built in.

There is also a fancy unlocked ASUS BIOS in the first post, but since it is a reference card you can find one to suit your needs over on techpowerup - lots to pick from (different volts, clocks, and I guess memory timings?) and no risk in flashing with the dual BIOS switch.


----------



## Jyve

No. I'm running 15.4 right now


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Ized*
> 
> Are you running a stone age driver (From the CD?!)? Back at launch my card couldn't do bugger all in terms of memory clocks while gaming. Now its happy to sit at 1500Mhz gaming on stock volts all day.
> 
> *I guess I have also changed BIOSs a few dozen times since then too - might be worth trying the ASUS GPU Tweak app, since it has a BIOS updater built in.*
> 
> There is also a fancy unlocked ASUS BIOS in the first post, but since it is a reference card you can find one to suit your needs over on techpowerup - lots to pick from (different volts, clocks and I guess memory timings?) and no risk in flashing with the dual BIOS switches.


You can flash any 290 / 290x bios onto them as long as it's not an 8 GB bios. But the best bios I found was the Shimano / ASUS GPU Tweak / unlocked-voltage bios.


----------



## Jyve

I'll give the bios flash a shot and report back


----------



## MIGhunter

I'm running a stock 290x reference card from release. I noticed the other day playing ArcheAge at 1080p that my temps were fluctuating between 90-93c. This is after a complete dust cleaning of the interior of the case. I know the reference cards ran hot, but I thought they throttled at 90c? Anything I should try to get the temps down? That seems excessive.

I'm using a HAF 932 case. My CPU is cooled with a BeQuiet and it's venting to the back of the case (not toward the GPU).


----------



## cephelix

Quote:


> Originally Posted by *MIGhunter*
> 
> I'm running a stock 290x reference card from release. I noticed the other day playing Archeage at 1080p the my temps were fluctuating between 90-93c. This is after a complete dust cleaning of the interior of the case. I know the reference cards ran hot but I thought they throttled at 90c? Anything I should try to do to get the temps down? that seems excessive.
> 
> I'm using a HAF 932 case. My CPU is cooled with a BeQuiet and it's venting to the back of the case (not toward the GPU)


Well, stock coolers don't seem great at handling temps for the 290(x). If you're comfortable with dismantling it you could probably reapply TIM to the die. Or you could also buy aftermarket coolers like the arctic cooling or prolimatech ones if you choose to. How are your VRM temps under load?


----------



## kizwan

Quote:


> Originally Posted by *Jyve*
> 
> Hey fellas. I have a reference asus 290 running on a corsair hg10/h55 setup. Temps on both the core and vrm are just fine.
> 
> I see lots of stellar overclock here and I'm jealous. Any time I put my vram over 1200 I get the black screen issue. I've messed around with both regular voltage as well as Aux voltage to no avail. Unfortunately it has the elpida ram on it.
> 
> I've exhausted about all options here and have just resigned myself to how this card runs. Honestly it does just about everything I want it to anyway.
> 
> I'm running on a Qnix 1440p monitor (same issue on a 1200p monitor) . The black screen problem only happens in 3d. Gaming only. No movie watching or surfing. I've disabled ulps, no change.
> 
> I figured I'd ask around here as a last resort, otherwise I'll just live with it. I purchased this used off CL so no rma'ing. Guy said he didn't register it so I suppose I could if I wanted to but honestly I don't want to go through the hassle of putting the stock cooler back on.
> 
> Any ideas? Anything I might have missed?


Did you increase power limit?
Quote:


> Originally Posted by *MIGhunter*
> 
> I'm running a stock 290x reference card from release. I noticed the other day playing Archeage at 1080p the my temps were fluctuating between 90-93c. This is after a complete dust cleaning of the interior of the case. I know the reference cards ran hot but I thought they throttled at 90c? Anything I should try to do to get the temps down? that seems excessive.
> 
> I'm using a HAF 932 case. My CPU is cooled with a BeQuiet and it's venting to the back of the case (not toward the GPU)


It starts throttling at 95C. You should use a custom fan profile. Use MSI AB to set a custom fan profile.
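A custom fan profile is just a set of (temperature, fan %) points with interpolation between them. A minimal sketch, with illustrative curve points rather than recommended values:

```python
# Piecewise-linear fan curve: (temp C, fan %) points, linearly interpolated.
# The points here are placeholders -- pick your own in MSI Afterburner.
CURVE = ((40, 30), (70, 55), (85, 80), (95, 100))

def fan_percent(temp_c, curve=CURVE):
    if temp_c <= curve[0][0]:
        return float(curve[0][1])
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)
    return float(curve[-1][1])  # pegged at max above the last point

print(fan_percent(90))  # 90.0: halfway between the 85C and 95C points
```

The idea is simply to ramp the fan well before the 95C throttle point instead of letting the stock profile sit quiet into the 90s.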


----------



## Jyve

I increased it as well as decreased it (as per 1 of the many posts I read on the subject)


----------



## MIGhunter

Quote:


> Originally Posted by *cephelix*
> 
> Well, stock coolers don't seem great at handling temps for the 290(x). If you're comfortable with dismantling it you could probably reapply TIM to the die. Or you could also buy aftermarket coolers like the arctic cooling or prolimatech ones if you choose to. How are your VRM temps under load?


Hmm, not sure. How can I check my VRM temps? Is it in CCC somewhere or do I need some other software?


----------



## rdr09

Quote:


> Originally Posted by *MIGhunter*
> 
> hmm, not sure. How can I check my VRM temps, is it in the CCC somewhere or do I need some other software?


you can either use GPUZ or HWINFO64 like this . . .



if you use GPUZ, you may need to use the slider on the side to see the vrm temps. use only one app at a time.


----------



## SPLWF

Quote:


> Originally Posted by *Dooderek*
> 
> http://www.3dmark.com/fs/4908603
> 
> 1100/1350
> 
> Here is also a compilation of valley benchmarks showing how CPU GHz and SSD's affect the synthetic benchmark. Also not posted above, but my original 3D mark score was 1100 pts lower, same gpu clock speeds but @ 4.1 GHz CPU and HDD. I believe the major increase in FPS is a result of scene changes where the FPS will drop low and skew the average. By bumping the CPU frequency you decrease the time changing scenes resulting in more average FPS. I believe the SSD does the same thing.


Your CPU helped you a lot. You also have a 290X; I just have a 290.

CPU: 4.5ghz
GPUs: 1100/1300

http://www.3dmark.com/compare/fs/4908603/fs/4891728


----------



## tsm106

Quote:


> Originally Posted by *Dooderek*
> 
> http://www.3dmark.com/fs/4908603
> 
> 1100/1350
> 
> Here is also a compilation of valley benchmarks showing how CPU GHz and SSD's affect the synthetic benchmark.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Also not posted above, but my original 3D mark score was 1100 pts lower, same gpu clock speeds but @ 4.1 GHz CPU and HDD. I believe the major increase in FPS is a result of scene changes where the FPS will drop low and skew the average. By bumping the CPU frequency you decrease the time changing scenes resulting in more average FPS. I believe the SSD does the same thing.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> oh and here is a picture of my finished build. Dont mind the arrows, I was using it to demonstrate something.


Ya, ancient chinese secret. You can bypass that if you're stable enough and run two benches, or at least let it go through one complete run without recording the bench so it loads the textures into memory - voila, silly high minimums. But yea, this is unigine 101, the oldest trick in the book.


----------



## Wezzor

I wonder when they will release the new driver this week.


----------



## Arizonian

Quote:


> Originally Posted by *Wezzor*
> 
> I wonder when this week they will release the new driver.










A new driver in time to support a new GPU release perhaps


----------



## Wezzor

Quote:


> Originally Posted by *Arizonian*
> 
> 
> 
> 
> 
> 
> 
> 
> A new driver in time to support a new GPU release perhaps


Nah, for The Witcher 3.








[email protected] said that this week they'll release a driver optimised for The Witcher 3.


----------



## Arizonian

Quote:


> Originally Posted by *Wezzor*
> 
> Nah, for The Witcher 3.
> 
> 
> 
> 
> 
> 
> 
> 
> [email protected] said that this week they'll release a driver optimised for The Witcher 3.


True, but he couldn't say even if it were coming, being under NDA. Just wishful thinking out loud.


----------



## gatygun

Isn't the driver already released?

No clue if it's really the official 15.5 driver tho, but here's a link with it

http://forums.guru3d.com/showthread.php?t=399245


----------



## the9quad

Quote:


> Originally Posted by *gatygun*
> 
> Isn't the driver already released?
> 
> No clue if it's really the official 15.5 driver tho, but here's a link with it
> 
> http://forums.guru3d.com/showthread.php?t=399245


No one is sure what that driver is, but it most definitely is not the driver for The Witcher 3 or pCars.


----------



## tsm106

Yea, the witcher focused driver is not out, yet.


----------



## Forceman

Quote:


> Originally Posted by *tsm106*
> 
> Ya, ancient chinese secret. You can bypass that if you're stable enough and run two benches, or at least let it go thru one complete run thru w/o running the bench so it loads the textures in memory, voila silly high minimums. But yea, this is unigine 101 oldest trick in the book.


You can also just hit Enter to cycle through the scenes, without having to let it actually run all the way through them.


----------



## BradleyW

Quote:


> Originally Posted by *the9quad*
> 
> no one is sure what that driver is but it most definitely is not the driver for the witcher 3 or pcars


I believe it's no different from the compiled 15.4 driver package. Just a simple name change before they begin to work on the next driver package.


----------



## ambientblue

http://www.3dmark.com/3dm/7111037



Stock run. 4GHz CPU.


----------



## rdr09

Is this enough for an i7 SB at 4.5GHz, a single stock 290, and a 40W monitor?

http://www.amazon.com/APC-Back-UPS-650VA-230V-BK650EI/dp/B0086GQYUK/ref=sr_1_2?s=electronics&ie=UTF8&qid=1432733810&sr=1-2&keywords=ups+230v

thanks.


----------



## Linxus

Quote:


> Originally Posted by *Linxus*
> 
> Hey guys,
> 
> I recently picked up a Sapphire Tri-X R9 290X and overclocked it using the Sapphire TriXX tool. I'm loving this new card so far and its running GTA 5 @ 1440p wonderfully. I achieved the following settles and the card seems stable:
> 
> GPU Clock: 1150
> Memory Clock: 1450
> VDDC Offset: 93
> Power Limit: 25
> 
> Running Valley Benchmark 1.0, my GPU temperature hovers around 86ºC; however, my VRM 1 temperature jumps up to 108ºC and hovers around 104-105ºC. The VRM 2 temperature stays around 76ºC. I'm using a custom fan profile with the fan running around 65% at 85ºC. Should I be worried about this high VRM 1 temperature? Are my settings too aggressive? I've noticed that my TriXX options are different than most guides on here. Any input is appreciated.
> 
> EDIT: I bumped the fan speed up to a fixed 85% and the VRM 1 temperature is now hovering around 100ºC. I should also mentioned that I have a NZXT Phantom 410 with (2) intake fans on front, (1) intake on bottom, (1) exhaust on rear, and (2) exhaust on top through H100i CPU cooler. There is also an exhaust fan on the side of the case.


Has anyone replaced the thermal paste / pads on the Saphire Tri-X R9 290X? If so, what pads and thickness did you use? I'm thinking I might try this out after reading a few success stories on other pages.


----------



## ShortySmalls

Quote:


> Originally Posted by *rdr09*
> 
> Is this enough for an i7 SB 4.5GHz, Single 290 stock, and 40W monitor?
> 
> http://www.amazon.com/APC-Back-UPS-650VA-230V-BK650EI/dp/B0086GQYUK/ref=sr_1_2?s=electronics&ie=UTF8&qid=1432733810&sr=1-2&keywords=ups+230v
> 
> thanks.


No, I don't think so. That's only 400W, and your monitor alone is using 10% of it. I would recommend at least 1000VA minimum, or else you're going to be overloading it all the time when you launch anything GPU intensive.
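As a rough sanity check (all wattages below are ballpark assumptions for a build like this, not measured figures):

```python
# Ballpark system draw vs. the UPS's rated watt output. A 650VA consumer
# unit is typically rated around 400W; the load figures are assumptions.
loads_w = {
    "R9 290 (stock, gaming load)": 275,
    "i7 Sandy Bridge @ 4.5 GHz": 150,
    "monitor": 40,
    "board / drives / fans": 50,
}
ups_output_w = 400

total_w = sum(loads_w.values())
print(total_w, "W draw vs", ups_output_w, "W UPS")  # 515 W draw vs 400 W UPS
print("overloaded" if total_w > ups_output_w else "fits")  # overloaded
```

Even with generous rounding, the estimated draw clears the 650VA unit's watt rating under gaming load, which is why a 1000VA+ unit is the safer call.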


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Is this enough for an i7 SB 4.5GHz, Single 290 stock, and 40W monitor?
> 
> http://www.amazon.com/APC-Back-UPS-650VA-230V-BK650EI/dp/B0086GQYUK/ref=sr_1_2?s=electronics&ie=UTF8&qid=1432733810&sr=1-2&keywords=ups+230v
> 
> thanks.


Is there a reason why that's a 230v unit? That price, man! Everyday price here:

http://www.amazon.com/APC-BR1500G-Back-UPS-10-outlet-Uninterruptible/dp/B003Y24DEU


----------



## rdr09

Quote:


> Originally Posted by *ShortySmalls*
> 
> No I don't think so, thats only 400w, which your monitor alone is using 10% of it. I would recommend something at least 1000va minimum or else your going to be overloading it all the time when you launch anything gpu intensive.


I'll look some more.
Quote:


> Originally Posted by *tsm106*
> 
> Is there a reason why that's a 230v unit? That price man!. Everyday price here:
> 
> http://www.amazon.com/APC-BR1500G-Back-UPS-10-outlet-Uninterruptible/dp/B003Y24DEU


It's for overseas. I'll check that out, but it has to be 220-240V.

+rep to both.


----------



## tsm106

Oh, then in that case buy it over there. Since you're going 240v, make sure to get a Pure Sine Wave unit too.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> Oh, then in that case buy it over there. Since you're going 240v, make sure to get a Pure Sine Wave unit too.


the place i am going to is remote. nothing to buy. lol. i think there is a company in IL that sells all 220 stuff. thanks.


----------



## tsm106

In the US, 230/240v units get into enterprise gear and the cost blows up.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> In the US, 230/240v units get into enterprise gear and the cost blows up.


how about this . . .

http://www.cdw.com/shop/products/TRIPP-1500VA-INT-UPS-SMART-LCD-230V/3162917.aspx?RecommendedForEDC=3395546&RecoType=RP&cm_sp=Product-_-Session&ProgramIdentifier=3


----------



## FastEddieNYC

Afterburner has just been updated to 4.1.1. It includes a new voltage adjust mode for cards that use custom pcb's. http://www.guru3d.com/news-story/download-msi-afterburner-4-1-1.html


----------



## tsm106

Quote:


> Originally Posted by *FastEddieNYC*
> 
> Afterburner has just been updated to 4.1.1. It includes a new voltage adjust mode for cards that use custom pcb's. http://www.guru3d.com/news-story/download-msi-afterburner-4-1-1.html


Interesting. There's an overall CPU load graph now, plus various filters (for example, we can make it show individual VRAM use per GPU instead of global by applying a rule set), and a DB for custom controllers. I just might jump from 4.0.


----------



## elgreco14

So yeah guys, I'm thinking about sending in an MSI 290X Gaming 4G for RMA. I just bought a second one, and the second one has really great temps: only 70 max in 3DMark and around 75 in games. But the first one was always around 88 degrees, and since I installed the second one it's going up to 94!! I also noticed the new one has a new PCB design.

So should I RMA the MSI card? I bought it from someone else and it's six months old; I still have the proof of purchase but not the box. How is MSI's RMA service?


----------



## Dooderek

Quote:


> Originally Posted by *elgreco14*
> 
> So yeah guys I'm thinking about sending a msi 290x gaming 4g as RMA. I just bought a second one and the second one has really great temps only 70 max in 3dmark and games around 75. But the first one I had was always around 88 degrees but since I have installed a second one it is going up to 94!! I also noticed the new one has a new PCB design.
> 
> So should I rma the msi card? I bought it from someone else and its six months old, but I still have the proof of purchase but not the box. How is MSI's RMA service?


Maybe it's getting choked off from poor airflow? Post a pic of your setup.


----------



## tsm106

Quote:


> Originally Posted by *elgreco14*
> 
> So yeah guys I'm thinking about sending a msi 290x gaming 4g as RMA. I just bought a second one and the second one has really great temps only 70 max in 3dmark and games around 75. But the first one I had was always around 88 degrees but since I have installed a second one it is going up to 94!! I also noticed the new one has a new PCB design.
> 
> *So should I rma the msi card? I bought it from someone else and its six months old, but I still have the proof of purchase but not the box. How is MSI's RMA service?*


I would try repasting it first. If the card was always hot, it could be the paste job or a defect. MSI goes by the serial number, using the date of manufacture and then the invoice date. You can create an RMA online; it's pretty simple, with fairly quick turnaround.


----------



## JourneymanMike

Quote:


> Originally Posted by *tsm106*
> 
> *I would try repasting it first.* If the card was always hot, it could be the paste job or defect. MSI goes by the serial number via the date of manufacturer then invoice date. You can create an rma online, it's pretty simple with fairly quick turn around.


Would this not void the warranty?

Before I went full-blown custom loop, my bottom 290X was 95C and the top was 90C... at full load.

They are Sapphire reference cards.


----------



## aaroc

I have this UPS for 240V; here is a link on Amazon (maybe the 110V model): http://www.amazon.com/APC-BR1500G-Back-UPS-10-outlet-Uninterruptible/dp/B003Y24DEU
It provides up to 895W of protection or something like that. It started to scream with an FX 8350 and three R9 290s, because the system drew between 900 and 1050W (numbers from the Corsair AX1200i in the Corsair Link software). The name has the 1500 number, but the real power rating of the UPS is a lot less: 895W. Always check the real wattage of the UPS. A very good UPS has nice software that tells you how much energy you are using per day/week/month/year, how much you spend on your PC (you can enter the energy price), and how many trees you are cutting down to power your PC. When it's charging it emits a noise like coil whine and gets a bit hot. When providing power during a blackout it gets hot and runs its internal fans. Once per month it runs an auto test; the first time, I didn't know what was happening in my PC.







In the US, APC sells an external battery for more runtime during a blackout. Sadly they don't sell it here. http://www.amazon.com/APC-BR24BPG-Back-UPS-External-Battery/dp/B0047E5B90/ref=pd_bxgy_23_img_y This can be used with the 110V or 240V UPS. But you can use car batteries to get more runtime.

The Tripp Lite is the equivalent.
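For what it's worth, the "1500 in the name but ~895W real" gap comes from the VA rating being apparent power; usable watts are roughly VA times the unit's power factor. A minimal sizing sketch (the ~0.6 power factor and the load figures are illustrative assumptions taken from the post, not APC specs):

```python
# UPS sizing sketch: usable watts = VA rating x power factor.
# The 0.6 power factor and the load numbers below are illustrative
# assumptions, not official APC specifications.

def usable_watts(va_rating: float, power_factor: float) -> float:
    """Real wattage a UPS can deliver, derived from its VA rating."""
    return va_rating * power_factor

def is_overloaded(load_watts: float, va_rating: float, power_factor: float) -> bool:
    """True if the PC's measured draw exceeds the UPS's real capacity."""
    return load_watts > usable_watts(va_rating, power_factor)

# A "1500" model at ~0.6 power factor delivers only around 900W:
print(usable_watts(1500, 0.6))         # ~900 W

# A tri-fire rig drawing 900-1050W at the wall overloads it:
print(is_overloaded(1050, 1500, 0.6))
```

So when sizing, compare your measured wall draw against VA times power factor, not against the number in the model name.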


----------



## elgreco14

Quote:


> Originally Posted by *Dooderek*
> 
> Maybe its getting choked off with airflow? Post a pic of your setup.




So yeah, pretty much stock fans. I am thinking about buying a 120mm fan and mounting it with tie wraps on the HDD cage. But still, the second card is 24-25 degrees cooler. That is not normal..
Quote:


> Originally Posted by *tsm106*
> 
> I would try repasting it first. If the card was always hot, it could be the paste job or defect. MSI goes by the serial number via the date of manufacturer then invoice date. You can create an rma online, it's pretty simple with fairly quick turn around.


I have looked into repasting, but what scares me a bit is that it can void the warranty, and if I mess up I might break my card with a stupid mistake. I have looked into RMA online, and they say you cannot send components in; you have to go through the retailer.


----------



## rdr09

Quote:


> Originally Posted by *elgreco14*
> 
> 
> 
> So yeah pretty much stock fans. I am thinking about buying a 120mm fan and placing it with tie wraps on the hdd case. But still the second card is 24-25 degrees colder. That is not normal..
> I have looked in to repasting but wat scares me a bit is that it can void warranty and if I **** up I maybe break my card by a stupid mistake. I have looked in to rma online and they say you can not send components in, you have to go the retailer.


That second GPU is too close to the PSU, and I think they are also opposing each other with regard to airflow.

But I think the main culprit is DUST.


----------



## rdr09

Quote:


> Originally Posted by *aaroc*
> 
> I have this UPS for 240V here a link in Amazon maybe 110V http://www.amazon.com/APC-BR1500G-Back-UPS-10-outlet-Uninterruptible/dp/B003Y24DEU
> It provides up to 895W protection or something like that. It started to scream with FX 8350 and three R9 290. Because it used between 900 and 1050W, numbers from Corsair AX1200i in Corsair link software. The name has the 1500 number, but the real power of the UPS is a lot less 895W. Always check the real power of the UPS. Very good UPS has nice SW that tells you how much energy are you using per month/week/day/year and how much you spend on your PC (you can enter the energy price) and how many trees you are cutting to power your PC. When its charging it emits a noise like coil whine and its gets a bit hot. When providing power during a blackout its get hot and it runs its internal ventilators. Once per month it make an auto test, I didnt know what was happening in my PC.
> 
> 
> 
> 
> 
> 
> 
> In the US APC sells an external battery to have more time during a black out. sadly they dont sell it here. http://www.amazon.com/APC-BR24BPG-Back-UPS-External-Battery/dp/B0047E5B90/ref=pd_bxgy_23_img_y This can be used with the 110 or 240V UPS. But you can use car batteries to have more time.
> 
> The Tripplite is the equivalent.


+rep. Thank you.


----------



## aaroc

LOL ventilator == fan.


----------



## FuriousPop

Howdy Gents,

Been out of the game for some time now and just wanted to know what are the most stable drivers at the moment?

Just checked mine after quite some time now, and I'm still running 14.4.

Is it worth the trouble to go to 14.12?

Thanks in advance


----------



## JourneymanMike

Quote:


> Originally Posted by *FuriousPop*
> 
> Howdy Gents,
> 
> Been out of the game for some time now and just wanted to know what are the most stable drivers at the moment?
> 
> Just checked mine after quite some time now and im still running 14.4.
> 
> *Is it worth the trouble to go to 14.12?*
> 
> thanks in advanced


I've had real good luck with 14.12; as for the 15.4 beta, I'm trying that out now...

Go ahead, uninstall 14.4 and install 14.12; if you don't like it, go back...

BTW, you could also try the 15.4 beta afterwards... or by then there may be a new driver to use.


----------



## aaroc

Quote:


> Originally Posted by *elgreco14*
> 
> 
> 
> So yeah pretty much stock fans. I am thinking about buying a 120mm fan and placing it with tie wraps on the hdd case. But still the second card is 24-25 degrees colder. That is not normal..
> I have looked in to repasting but wat scares me a bit is that it can void warranty and if I mess up I maybe break my card by a stupid mistake. I have looked in to rma online and they say you can not send components in, you have to go the retailer.


Test by swapping the cards to see if it is position related.


----------



## FuriousPop

thanks heaps, will give 14.12 a go.


----------



## elgreco14

Quote:


> Originally Posted by *rdr09*
> 
> that second gpu is too close to the psu and i think they are also opposing each other with regard to airflow.
> 
> But, i think the main culprit is DUST.


It's the only spot the second one fits, and it doesn't really matter, because that card is only getting 75 degrees max. The dust is no problem, because it's a very thin layer.

Quote:


> Originally Posted by *aaroc*
> 
> Test swapping the cards to see if is position related.


I will try that tomorrow. But I need to find a way to remove the power cables from the upper card, because MSI stupidly put the connectors on upside down.


----------



## tsm106

Quote:


> Originally Posted by *JourneymanMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> *I would try repasting it first.* If the card was always hot, it could be the paste job or defect. MSI goes by the serial number via the date of manufacturer then invoice date. You can create an rma online, it's pretty simple with fairly quick turn around.
> 
> 
> 
> Would this not void the warranty?
> 
> Before I went full blown custom loop, my bottom 290X was 95c and top was 90c... Full load
> 
> They are Sapphire Reference Cards
Click to expand...

No, it depends on the brand. MSI doesn't care what you do as long as the card is returned to stock. Sapphire, however, are sticklers: you tell them you did anything and they get void-happy. With Sapphire, it's a good idea to keep quiet, ya.

Quote:


> Originally Posted by *elgreco14*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I would try repasting it first. If the card was always hot, it could be the paste job or defect. MSI goes by the serial number via the date of manufacturer then invoice date. You can create an rma online, it's pretty simple with fairly quick turn around.
> 
> 
> 
> I have looked in to repasting but wat scares me a bit is that it can void warranty and if I mess up I maybe break my card by a stupid mistake. I have looked in to rma online and they say you can not send components in, you have to go the retailer.
Click to expand...

Looks like you are in the EU somewhere. Check your fine print; it's likely that for the first year you have to go through the retailer, etc. Breaking your card would void any warranty regardless of brand. If you are not confident repasting, all that's left is to RMA. Fill out your location in your user profile; it makes a big difference in the type of advice you get from posts.


----------



## tsm106

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> In the US, 230/240v units get into enterprise gear and the cost blows up.
> 
> 
> 
> how about this . . .
> 
> http://www.cdw.com/shop/products/TRIPP-1500VA-INT-UPS-SMART-LCD-230V/3162917.aspx?RecommendedForEDC=3395546&RecoType=RP&cm_sp=Product-_-Session&ProgramIdentifier=3
Click to expand...

Looks like it fits the bill, but I can't tell if it's a pure sine wave model or not, shrugs.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> Looks like it fits the bill, but I can't tell if it's a pure sine wave model or not, shrugs.


It does. The tech spec says sine wave. aaroc posted something similar but from Amazon; I'll check it out.


----------



## elgreco14

Quote:


> Originally Posted by *tsm106*
> 
> No, it depends on the brand. MSI do not care what you do as long as the card is returned as stock. Sapphire however are sticklers. You tell them you did anything and they get void happy. With sapphire, it's a good idea to keep quiet ya.
> Looks like you are in the EU somewhere. Check your fine print, it's either the first year you have to go thru the retailer etc. Breaking your card regardless would void any warranty no matter the brand. If you are not confident repasting, all that's left is to rma. Fill out your location in your user profile. It makes a big difference in the type of advice you get from posts.


MSI.
Thanks for the tip, I added my location. I contacted the shop today, and they asked if I could send the invoice and S/N so they can send it to MSI.


----------



## Jflisk

This one's a little off topic.
I have 3 x R9 290X and an FX-9590, with dual D5 pumps, fully under water. My question is: all the blocks, including the CPU, should be within 2-3C of each other, correct? My idle is 40C across the board on all GPU dies; my CPU idles at 42-43C. At full tilt, for some reason, the GPUs hit 55C max, and the CPU is where the weirdness comes in: I am getting 57C to 64C. I see the CPU start at 57C, then steadily rise to 59C. Then at times, for maybe 10 seconds, it goes to 64C, then drops to 57-59C and stays there. Keep in mind the CPU is 220W and, from what I have seen, could be 317W at full tilt. It's an XSPC Raystorm water block. Would lapping the CPU block help to get the CPU temps in check? TIM is Gelid Extreme. Thanks in advance.


----------



## ghabhaducha

Quote:


> Originally Posted by *Jflisk*
> 
> This ones a little off topic.
> I have 3 x R9 290X and FX9590. With Dual D5 pumps and it is fully under water. My question is - All the blocks including CPU should be within at least 2-3C of each other correct. My idle is 40C across the board on all dies GPU - My CPU idles at 42-43C. Full tilt for some reason GPUS 55C max and CPU is where the weirdness comes in. I am getting 57C to 64C. I see the CPU start at 57C then steady rise to 59C.Then at times for maybe 10 Seconds it goes to 64C then drops to 57-59C and stays there. Keep in mind the CPU is 220W and from what I have seen could be 317W full tilt. Its a XSPC Raystorm water block. Would lapping the CPU block help to get the CPU temps in check. TIM is Gelid extreme. Thanks in advance


Two words, bro: thermal density. A full Vishera chip has a die area of 315 mm^2, whereas an R9 290x has a die area of 438 mm^2. As you described, your FX-9590 is dumping close to 317W over only 315 mm^2, whereas the R9 290x is 290W over 438 mm^2.

According to Wikipedia, "Heat flux or thermal flux is the rate of heat energy transfer through a given surface, per unit surface. The SI derived unit of heat rate is joule per second, or watt. Heat flux density is the heat rate per unit area. In SI units, heat flux density is measured in [W/m2]."

Assuming all my numbers are right:
FX-9590 = (317W) / (315 mm^2) = (317W) / (0.000315 m^2) = *1,006,349 W/m^2*
R9 290x = (290W) / (438 mm^2) = (290W) / (0.000438 m^2) = *662,100 W/m^2*

As such, the 290x block has to carry a smaller heat flux than your XSPC Raystorm does, and consequently your 290x will have better temps.

In addition, your 290x blocks are touching the die directly, whereas the Raystorm is attached to the IHS, which is in turn connected to the FX-9590 die. Now, I understand there will be other variables, such as flow rate (especially if the 290x's are in parallel), but this simple calculation should help you understand the temperature disparity a bit better. I hope that helped!









Edit~ So during idle the TDPs are much lower on both the CPU and GPUs, and as such the temperature disparity should be smaller.
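The arithmetic above can be sketched as a quick calculation; the die areas and power figures are the post's estimates, not datasheet values:

```python
# Heat flux density: watts dissipated per unit die area (W/m^2).
# Die areas and power figures are the post's estimates, not datasheet values.

def heat_flux_density(power_w: float, area_mm2: float) -> float:
    """Heat flux density in W/m^2, given power in watts and die area in mm^2."""
    area_m2 = area_mm2 * 1e-6  # 1 mm^2 = 1e-6 m^2
    return power_w / area_m2

fx9590 = heat_flux_density(317, 315)   # FX-9590 at full tilt
r9_290x = heat_flux_density(290, 438)  # R9 290X

print(f"FX-9590: {fx9590:,.0f} W/m^2")   # ~1,006,349 W/m^2
print(f"R9 290X: {r9_290x:,.0f} W/m^2")  # ~662,100 W/m^2
print(f"CPU flux is {fx9590 / r9_290x:.2f}x the GPU's")
```

Same coolant, same loop, but the CPU block has to move roughly 1.5x the heat per unit area, before even accounting for the IHS in the CPU's thermal path.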


----------



## kizwan

Quote:


> Originally Posted by *Jflisk*
> 
> This ones a little off topic.
> I have 3 x R9 290X and FX9590. With Dual D5 pumps and it is fully under water. My question is - *All the blocks including CPU should be within at least 2-3C of each other* correct. My idle is 40C across the board on all dies GPU - My CPU idles at 42-43C. Full tilt for some reason GPUS 55C max and CPU is where the weirdness comes in. I am getting 57C to 64C. I see the CPU start at 57C then steady rise to 59C.Then at times for maybe 10 Seconds it goes to 64C then drops to 57-59C and stays there. Keep in mind the CPU is 220W and from what I have seen could be 317W full tilt. Its a XSPC Raystorm water block. Would lapping the CPU block help to get the CPU temps in check. TIM is Gelid extreme. Thanks in advance


Who said that? It's not true at all.

Lapping is only useful if the block is unable to make proper contact with the CPU, e.g. if the block bows too much.


----------



## Jflisk

Quote:


> Originally Posted by *kizwan*
> 
> Who said that? It's not true at all.
> 
> Lapping only useful if the block unable to make proper contact with the CPU, e.g. if the block bow too much.


kizwan - it was a question: would lapping the block help?

Not sure if the block is bowed. The XSPC hold-down for the block always looks bowed.


----------



## LandonAaron

Is there a way to use higher Virtual Super Resolutions than what is available normally? Trying to go back and play some GTA IV, but I can't find a decent AA method for the game. For some reason all the ENBs and Inject SMAA tools cause my game not to launch; something to do with the d3d9.dll. It just won't work for me, and I have given up on that approach. Trying to force supersampling through CCC and RadeonPro doesn't seem to work either, with no discernible difference in quality or performance. The only thing that works is upping the in-game resolution, but the highest I can go is 3200 x 1800. Versus the 2560 x 1440 I am starting from, this is only a 25% increase per axis, and while it helps a bit, I would like to go much higher if possible.
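As a side note, the effective supersampling factor scales with pixel count, not with the per-axis increase, so 3200 x 1800 over 2560 x 1440 actually renders about 1.56x the pixels. A quick sketch (the resolutions are just the ones mentioned above):

```python
# Supersampling factor = rendered pixels / native display pixels.
# A 1.25x bump per axis is 1.25^2 = ~1.56x the pixels.

def ssaa_factor(render, display):
    """Ratio of rendered pixel count to native display pixel count."""
    rw, rh = render
    dw, dh = display
    return (rw * rh) / (dw * dh)

native = (2560, 1440)
print(ssaa_factor((3200, 1800), native))  # 1.5625
print(ssaa_factor((5120, 2880), native))  # 4.0, i.e. true 2x2 supersampling
```

So getting a real 2x2 supersample at 1440p would require rendering at 5120 x 2880, well beyond the 3200 x 1800 cap being hit here.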


----------



## tsm106

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Who said that? It's not true at all.
> 
> Lapping only useful if the block unable to make proper contact with the CPU, e.g. if the block bow too much.
> 
> 
> 
> kizwan - it was a question -would lapping the block help?
> 
> Not sure if the block is bowed. The XSPC hold down for the block always looks bowed.
Click to expand...

You never want to lap a block flat. Blocks are made with a slight bow to make better contact; when the block is mounted, it conforms to the CPU die.


----------



## JourneymanMike

Quote:


> Originally Posted by *Jflisk*
> 
> kizwan - it was a question -would lapping the block help?
> 
> Not sure if the block is bowed. The XSPC hold down for the block always looks bowed.


Uhmmm, you could take the block off and check it with a straightedge?









As far as lower temps? Maybe... From the video info I found, it appears to be concave (lower in the middle).

My EK Supremacy EVO is definitely convex, or high in the middle...

Check it and see...


----------



## tsm106

Quote:


> Originally Posted by *JourneymanMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jflisk*
> 
> kizwan - it was a question -would lapping the block help?
> 
> Not sure if the block is bowed. The XSPC hold down for the block always looks bowed.
> 
> 
> 
> Uhmmm, You could take the block off and check it with a straight edge?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> As far as lower temps? Maybe... From the video info I found it appears to be concave (lower in the middle)
> 
> My EK Supremacy EVO is definitely convex or high in the middle...
> 
> *Lap it and see...*
Click to expand...

That will destroy the block, effectively making it worthless.


----------



## Jflisk

Quote:


> Originally Posted by *tsm106*
> 
> That will destroy the block, effectively making it worthless.


Added some rep for you guys. Thanks for your help


----------



## SPLWF

X-Fire question

GPU1: Reference Sapphire R9 290 with NZXT G10 w/ Corsair H55 + Gelid VRM kit, +88mV on core for 1150MHz, +25 on memory for 1500MHz (Elpida). Runs perfectly stable on all benches.

GPU2: Sapphire Tri-X 290, 1000/1300 stock, OC'd to 1100 on stock core voltage and 1400MHz on stock memory voltage. This is also stable at these clocks.

I matched my ref 290 to the Tri-X at 1100/1400, so both are running the same clocks.

The Valley bench lasts about 15 mins, then Valley freezes; no restart. Max VRM1 temp was under 80C on the Tri-X and under 70C on the ref 290.

The Unigine bench runs perfectly fine.

Fire Strike Extreme is also stable.

The Monster Hunter Online bench crashes in about 30 secs with a full reboot.

Now my question is: if these cards run perfectly fine independently on all these benches but only crash in CrossFire, does that mean the CrossFire OC is not stable in some benches?

I've also been having weird issues running Skyrim with lots of texture mods. When running just one card, my VRAM maxes out at 3600MB. When running CrossFire, VRAM reaches up to 6600MB? Why is this happening? I know if both cards have 4GB of VRAM it's not a total of 8GB; it's still only 4GB. Am I missing something?


----------



## JourneymanMike

Quote:


> Originally Posted by *tsm106*
> 
> That will destroy the block, effectively making it worthless.


Thanks for catching that, I meant to say Check it and see...

It has been corrected...

And just for the heck of it, How will lapping destroy the block? In what manner - you know cause and effect?

I'd like to know...


----------



## JourneymanMike

Quote:


> Originally Posted by *JourneymanMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> That will destroy the block, effectively making it worthless.
> 
> 
> 
> Thanks for catching that, I meant to say Check it and see...
> 
> It has been corrected...
> 
> And just for the heck of it, How will lapping destroy the block? In what manner - you know cause and effect?
> 
> I'd like to know...
Click to expand...

I got it! I think,

It will change the mounting pressure on the CPU...


----------



## sinnedone

Quote:


> Originally Posted by *SPLWF*
> 
> X-Fire question
> 
> GPU1: Reference Sapphire R9 290 with NZXT G10 w/ Corsair H55 + GELID VRM Kit, +88mV on core for 1150mhz, +25 on memory for 1500mhz (eplida). Runs perfectly stable on all benches.
> 
> GPU2: Sapphire Tri-X 290 1000/1300 stock, OC to 1100 stock core voltage and 1400mhz on stock memory voltage. This is also stable at time clocks.
> 
> I matched my Ref 290 to the standards of the Tri-X at 1100/1400, so both are running at same clocks.
> 
> Valley Bench last for about 15mins, then Valley Freezes, did not restart. Max VRM1 temp was under 80c on the Tri-X, and under 70c on Ref 290.
> 
> Unigine Bench runs perfectly fine
> 
> Fire Strike Extreme also stable
> 
> Monster Hunter Online Bench, crashes in about 30secs with full reboot.
> 
> Now my question is: If these cards are running perfectly fine independently on all these benches but only crashes on X-Fire, does that mean X-Fire not stable on some benches?
> 
> I also been having weird issue's running Skyrim with lots of texture mods. When running just One card, my vram maxes out at 3600mb. When running X-Fire vram reaches up to 6600mb? Why is this happening? I know if both cards have 4GB of vram, it's not a total of 8GB, it's still only 4GB. Am I missing something?


From my experience, CrossFire clocks don't match individual-card clocks; they are usually stable 20-30MHz lower.

If you are using Afterburner, I believe it adds the memory usage of both cards, which is why you see that.


----------



## elgreco14

Quote:


> Originally Posted by *aaroc*
> 
> Test swapping the cards to see if is position related.


Yeah, so I swapped the cards. Seems like something with airflow, because the upper card is also getting up to 93. I'm going to buy an extra fan so it targets the upper card. I also have some questions about my performance with CrossFire R9 290X.

3DMark (1 card vs 2 cards) (stock clocks): http://www.3dmark.com/compare/fs/4937865/fs/4930290

Another 3DMark score: http://www.3dmark.com/3dm/7130164
So I played some BF4. With one card, the GPU usage is 100% and I get around 90-100 avg FPS. But with CrossFire I have weird usage spikes, though my FPS is 130-150. On some maps only 90-100. So weird..

But if I change the virtual resolution up to 1600p, my FPS is around 100-130 depending on the map in multiplayer. At this resolution I have somewhat fewer GPU spikes on the same map (the map was empty, I don't know if that matters).

Also having usage spikes in Crysis 3.

So yeah, I have disabled ULPS, changed power-saving settings, and disabled Intel SpeedStep. I also changed the multiplier of my i7 2600 to 42. Is this a CPU bottleneck? Or something else? I'm really getting annoyed by all these problems every time I build something in my PC.







Never having good luck with my game pc


----------



## rdr09

Quote:


> Originally Posted by *elgreco14*
> 
> Yeah so I swapped cards, seems like something with air flow because upper card is also getting up to 93. I'm going to buy an extra fan so it targets at upper card. I have also some questions about my perfomance with crossifre R9 290X.
> 
> 3Dmark (1 card vs 2 cards) (stock clocks): http://www.3dmark.com/compare/fs/4937865/fs/4930290
> 
> So I played some BF4. With one card the gpu usage is 100% and I get around 90-100 avg fps. But with crossfire I have weird usage spikes:  but my fps is 130-150. But some maps only 90-100. So weird..
> 
> But if I change the virtual resolution up to 1600p my fps is around 100-130 depends on the map in multiplayer. With this resolution I have a bit less GPU spikes in the same map (map was empty, i dont know if that matters)
> 
> Also having usage spikes in crysis 3.
> 
> So yeah I have disabled ulps, changed power saving settings and disabled intel speedstep. I also have changed the multiplier to 42 of my i7 2600. Is this a cpu bottleneck? Or something e;se, I'm really getting annoyed by all these problems every time I build something in my pc
> 
> 
> 
> 
> 
> 
> 
> Never having good luck with my game pc


Yah, you're gonna have to OC the CPU higher. But you can't anymore, so maps like Shanghai are harder.


----------



## rv8000

Finally got around to installing a G10 and H55 on my 290x Lightning; I guess I was being a bit too optimistic...

H55 @ full speed (~1450rpm) with 2x B12-2 E-loops; load temp after 20 mins with the GPU pegged at 100% is 67C with a ~27C room ambient. Adding a second fan seemed to drop the delta by 1-2C. I shouldn't have skimped on a CLC with such a tiny rad. Do these temps sound accurate?


----------



## elgreco14

Quote:


> Originally Posted by *rdr09*
> 
> yah, you gonna have to oc the cpu higher. but, you can't anymore. so maps like Shanghai are harder.


Would an i7 3770K be worth the upgrade? Could these spikes also be from a faulty PSU?


----------



## rdr09

Quote:


> Originally Posted by *elgreco14*
> 
> Would a i7 3770k be worth the upgrade? Could these spikes also be from a faulty psu?


This is just my opinion: you just need an unlocked Sandy Bridge, preferably an i7 2700K OC'd to 4.5GHz, unless you have a motherboard that does not support OC'ing.

EDIT: I just checked your motherboard. It is all good, just get an unlocked i7, a good CPU cooler, and OC. A used one will do.


----------



## tsm106

Quote:


> Originally Posted by *JourneymanMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JourneymanMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> That will destroy the block, effectively making it worthless.
> 
> 
> 
> Thanks for catching that, I meant to say Check it and see...
> 
> It has been corrected...
> 
> And just for the heck of it, How will lapping destroy the block? In what manner - you know cause and effect?
> 
> I'd like to know...
> 
> Click to expand...
> 
> I got it! I think,
> 
> It will change the mounting pressure on the CPU...
Click to expand...

Not exactly. Waterblocks are slightly different from CPU air coolers, because the clamping force cannot be applied as close to center as on a cooler, due to the water ports. A CPU waterblock's clamping force is spread around the circumference of the block (around the ports). What happens with a flat block when clamping force is applied is that we end up with not enough pressure at the center of the block. When we add a nice lil bow, it ends up applying equal pressure. I hope that makes sense.


----------



## JourneymanMike

Quote:


> Originally Posted by *tsm106*
> 
> Not exactly. Blocks are slightly different from cpu coolers because the clamping force cannot be applied as close to center as a cooler because of the water ports. A cpu waterblocks clamping force is spread around the circumference of the block (around the ports). And what happens with a flat block when clamping force is applied is that we end up with not enough pressure at the center of the block. *When we add a nice lil bow, it ends up applying equal pressure. I hope that makes sense.*


OK, I had half of it anyway. I wasn't thinking about the location of the inlet & outlet ports...

It makes complete sense.

+1 for the help!


----------



## kizwan

Quote:


> Originally Posted by *elgreco14*
> 
> Quote:
> 
> 
> 
> Originally Posted by *aaroc*
> 
> Test swapping the cards to see if is position related.
> 
> 
> 
> Yeah so I swapped cards, seems like something with air flow because upper card is also getting up to 93. I'm going to buy an extra fan so it targets at upper card. I have also some questions about my perfomance with crossifre R9 290X.
> 
> 3Dmark (1 card vs 2 cards) (stock clocks): http://www.3dmark.com/compare/fs/4937865/fs/4930290
> 
> Another 3Dmark score: http://www.3dmark.com/3dm/7130164
> So I played some BF4. With one card the gpu usage is 100% and I get around 90-100 avg fps. But with crossfire I have weird usage spikes:  but my fps is 130-150. But some maps only 90-100. So weird..
> 
> But if I change the virtual resolution up to 1600p my fps is around 100-130 depends on the map in multiplayer. With this resolution I have a bit less GPU spikes in the same map (map was empty, i dont know if that matters)
> 
> Also having usage spikes in crysis 3.
> 
> So yeah I have disabled ulps, changed power saving settings and disabled intel speedstep. I also have changed the multiplier to 42 of my i7 2600. Is this a cpu bottleneck? Or something e;se, I'm really getting annoyed by all these problems every time I build something in my pc
> 
> 
> 
> 
> 
> 
> 
> Never having good luck with my game pc
Click to expand...

That is not necessarily a CPU bottleneck; it depends on the map too. Some maps are easier on the 290s, and therefore GPU usage spikes a lot like that. Also, on harder maps your FPS will drop a little. Your FPS seems fine to me. I usually play at 4K resolution using the BF4 resolution scale, with FPS around 80-90 average. GPU usage is also almost 100% all the time.

I don't think you have enabled unified GPU usage monitoring in AB. Enable that; it gives a better graphical representation of GPU usage.


----------



## igrease

I can't be bothered to read through 6700 posts, so I am just going to ask my question.

When I first got my 290 back in April, I was having issues overclocking it. To get 1100 on the core, I had to raise the voltage +56 and set the power limit to +50. I recently experimented with overclocking again, and I just slapped 1100 on there with just +50 to power, and I haven't had any BSODs yet. Did the 14.12 driver improve overclocking stability or something?


----------



## Mega Man

Quote:


> Originally Posted by *ghabhaducha*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jflisk*
> 
> This one's a little off topic.
> I have 3 x R9 290X and an FX9590, with dual D5 pumps, fully under water. My question is: all the blocks including the CPU should be within at least 2-3C of each other, correct? My idle is 40C across the board on all GPU dies; my CPU idles at 42-43C. Full tilt, for some reason, the GPUs are 55C max, and the CPU is where the weirdness comes in: I am getting 57C to 64C. I see the CPU start at 57C then steadily rise to 59C. Then at times, for maybe 10 seconds, it goes to 64C, then drops to 57-59C and stays there. Keep in mind the CPU is 220W and from what I have seen could be 317W full tilt. It's an XSPC Raystorm water block. Would lapping the CPU block help to get the CPU temps in check? TIM is Gelid Extreme. Thanks in advance
> 
> 
> 
> Two words bro: thermal density. A full Vishera chip has a surface area of 315 mm^2, whereas an R9 290X has a surface area of 438 mm^2. As you described, your FX-9590 is dumping close to 317W over only 315 mm^2, whereas the R9 290X is 290W over 438 mm^2.
> 
> According to Wikipedia, "Heat flux or thermal flux is the rate of heat energy transfer through a given surface, per unit surface. The SI derived unit of heat rate is joule per second, or watt. Heat flux density is the heat rate per unit area. In SI units, heat flux density is measured in [W/m2]."
> 
> Assuming all my numbers are right:
> FX-9590 = (317W) / (315 mm^2) = (317W) / (0.000315 m^2) = *1,006,349 W/m^2*
> R9 290x = (290W) / (438 mm^2) = (290W) / (0.000438 m^2) = *662,100 W/m^2*
> 
> As such, the 290x block has to carry a smaller heat flux than your XSPC Raystorm, and consequently your 290x will have better temps.
> 
> In addition, your 290x blocks are touching the die directly, whereas the Raystorm is attached to the IHS, which is in turn connected to the FX-9590 die. Now I understand there will be other variables, such as flow rate (especially if the 290x's are in parallel), but this simple calculation should help you understand the temperature disparity a bit better. I hope that helped!
> 
> Edit~ So during idle the TDPs are much lower on both the CPU and GPUs, and as such the temperature disparity should be smaller.
Click to expand...

Absolutely correct, but all this depends on a lot: vcore, LLC, etc.! But besides die size, your pump can also hold you back (personal experience, as it did when I had 3 GPUs and only one pump). Flow is most important on the CPU, not as much on the GPUs.
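For anyone who wants to sanity-check the heat-flux arithmetic quoted above, here is a minimal Python sketch. The wattages and die areas are the ones stated in the post; recomputing shows 290W over 438 mm^2 works out to roughly 662,100 W/m^2.

```python
# Recompute the heat flux densities from the quoted numbers:
# 317 W over 315 mm^2 (FX-9590 full tilt), 290 W over 438 mm^2 (R9 290X).

def heat_flux_density(watts, area_mm2):
    """Heat flux density in W/m^2 (1 mm^2 = 1e-6 m^2)."""
    return watts / (area_mm2 * 1e-6)

fx_9590 = heat_flux_density(317, 315)
r9_290x = heat_flux_density(290, 438)

print(f"FX-9590: {fx_9590:,.0f} W/m^2")   # -> FX-9590: 1,006,349 W/m^2
print(f"R9 290X: {r9_290x:,.0f} W/m^2")   # -> R9 290X: 662,100 W/m^2
print(f"CPU flux is {fx_9590 / r9_290x:.2f}x the GPU flux")
```

Either way the conclusion stands: the CPU block has to move roughly 1.5x the heat flux of each GPU block, so higher CPU temps are expected.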
Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Who said that? It's not true at all.
> 
> Lapping is only useful if the block is unable to make proper contact with the CPU, e.g. if the block bows too much.
> 
> 
> 
> kizwan - it was a question -would lapping the block help?
> 
> Not sure if the block is bowed. The XSPC hold down for the block always looks bowed.
> 
> 
> You never want to lap a block flat. Blocks are made to have a slight bow to make better contact. When the block is mounted it will conform to the cpu die.

Just about to mention this as well, it is very important. I know everyone in the 8xxx club does it, but there is a reason I don't.
Quote:


> Originally Posted by *SPLWF*
> 
> X-Fire question
> 
> GPU1: Reference Sapphire R9 290 with NZXT G10 w/ Corsair H55 + GELID VRM kit, +88mV on core for 1150MHz, +25 on memory for 1500MHz (Elpida). Runs perfectly stable on all benches.
> 
> GPU2: Sapphire Tri-X 290, 1000/1300 stock, OC'd to 1100 on stock core voltage and 1400MHz on stock memory voltage. This is also stable at these clocks.
> 
> I matched my Ref 290 to the standards of the Tri-X at 1100/1400, so both are running at the same clocks.
> 
> Valley Bench lasts for about 15 mins, then Valley freezes; it did not restart. Max VRM1 temp was under 80c on the Tri-X, and under 70c on the Ref 290.
> 
> Unigine Bench runs perfectly fine.
> 
> Fire Strike Extreme is also stable.
> 
> Monster Hunter Online Bench crashes in about 30 secs with a full reboot.
> 
> Now my question is: if these cards run perfectly fine independently on all these benches but only crash in X-Fire, does that mean X-Fire is not stable on some benches?
> 
> I've also been having weird issues running Skyrim with lots of texture mods. When running just one card, my vram maxes out at 3600MB. When running X-Fire, vram reaches up to 6600MB? Why is this happening? I know if both cards have 4GB of vram it's not a total of 8GB, it's still only 4GB. Am I missing something?


You need to divide the ram usage by 2 (or the number of cards you have). tsm brought up that you can now disable that in the newest MSI AB.

As to the rest of the question: unstable drivers, an unstable bench, etc. can cause it. As I have not seen that bench widely used, it can't be discounted.


----------



## elgreco14

Quote:


> Originally Posted by *rdr09*
> 
> This is just my opinion: you just need an unlocked Sandy Bridge, preferably an i7 2700K, and OC to 4.5GHz, unless you have a motherboard that does not support OC'ing.
> 
> EDIT: I just checked your motherboard. It is all good. Just get an unlocked i7, a good CPU cooler, and OC. Just get a used one.


Thanks for the advice. I put my i7 2600 up for sale and someone was willing to exchange it for a 2600K.


Quote:


> Originally Posted by *kizwan*
> 
> That is not necessarily a CPU bottleneck. It depends on the map too. Some maps are easier on the 290's, and therefore GPU usage spikes a lot like that. Also, in harder maps your FPS will drop a little. Your FPS seems fine to me. I usually play @4K resolution using the BF4 resolution scale & FPS is around 80 - 90 average. The GPU usage will also be almost 100% all the time.
> 
> I don't think you enabled unified GPU usage monitoring in AB. Enable that. It'll give a better graphical representation of the GPU usage.


Dude you are a genius


I enabled it and it seems like my GPU usage is around 90% stable, with 2560x1440 virtual res and all maxed out.


----------



## aaroc

Quote:


> Originally Posted by *elgreco14*
> 
> Would an i7 3770K be worth the upgrade? Could these spikes also be from a faulty PSU?


BF4 needs fast RAM and loves 2400; search the internet for the performance gap between RAM speeds.
The heatsinks on your GPUs are like the ones on the XFX Double D: they throw most of the hot air inside your case. I had the same problem, with the highest GPU running hotter. My case, a CM 690 II Advanced, has two fan positions on the door. For this kind of GPU I installed two fans there taking air out. For the original R9 290 heatsink, which throws air out the back of the case, I instead installed two fans throwing air inside the case. With that I managed to lower the hotter GPU by about 10C.
I recommend that you turn your PSU fan to the bottom. You have fans on the top and on the bottom (PSU), so the air gets stuck in the middle where your GPUs are.


----------



## hornedfrog86

Quote:


> Originally Posted by *aaroc*
> 
> BF4 needs fast RAM and loves 2400; search the internet for the performance gap between RAM speeds.
> The heatsinks on your GPUs are like the ones on the XFX Double D: they throw most of the hot air inside your case. I had the same problem, with the highest GPU running hotter. My case, a CM 690 II Advanced, has two fan positions on the door. For this kind of GPU I installed two fans there taking air out. For the original R9 290 heatsink, which throws air out the back of the case, I instead installed two fans throwing air inside the case. With that I managed to lower the hotter GPU by about 10C.
> I recommend that you turn your PSU fan to the bottom. You have fans on the top and on the bottom (PSU), so the air gets stuck in the middle where your GPUs are.


Thank you. Do you cool the back of your CPU socket area on that case with a fan? I like that it has a cutout on that side; I will try to install an FX 9590 in one.


----------



## mfknjadagr8

Interesting observation: last night playing GTA 5 I was still getting the fps drops... so I decided to monitor using RivaTuner, and while the OSD was running the fps drops were quite a bit less frequent and less intense... CPU usage 50 to 70 across all 8 cores, and with unified monitoring on, showing 55 percent max usage on the GPUs...


----------



## elgreco14

Quote:


> Originally Posted by *aaroc*
> 
> BF4 needs fast RAM and loves 2400; search the internet for the performance gap between RAM speeds.
> The heatsinks on your GPUs are like the ones on the XFX Double D: they throw most of the hot air inside your case. I had the same problem, with the highest GPU running hotter. My case, a CM 690 II Advanced, has two fan positions on the door. For this kind of GPU I installed two fans there taking air out. For the original R9 290 heatsink, which throws air out the back of the case, I instead installed two fans throwing air inside the case. With that I managed to lower the hotter GPU by about 10C.
> I recommend that you turn your PSU fan to the bottom. You have fans on the top and on the bottom (PSU), so the air gets stuck in the middle where your GPUs are.


Yeah, the problem is the PSU only fits this way.

But I just bought 2 Noctua fans second hand and installed them like this:

So yeah, in 1080p the upper card is 88 degrees max, but again when I run at 1440p it gets to 94 degrees. Maybe because my CPU is also getting to 65 degrees the air can't get away well? I don't know anymore how to fix this, and I need a solution, because soon my Asus MG279Q is coming, so it has to stop getting so hot at 1440p.


----------



## Ized

Quote:


> Originally Posted by *rv8000*
> 
> Finally got around to installing a G10 and H55 on my 290x Lightning, I guess I was being a bit too optimistic...
> 
> H55 @ full speed (~1450rpm) with 2x B12-2 E-Loops; load temp after 20 mins with the GPU pegged at 100% is 67c with a ~27c room ambient. Adding a second fan seemed to drop the delta by 1-2c. I shouldn't have skimped on a CLC with such a tiny rad.
> 
> Do these temps sound accurate?


Surely need more info to tell, but if it helps here is my info:

I am happy-ish with my Zalman LQ315, cheapo mounting plate and GELID "VRM kit".

Recently replaced the stock fan with a Noctua NF-F12 and I'm seeing better results for sure; that being said, the limiting factor in cooling this thing is still noise levels.

I would be hovering between 60c-75c playing GTA 5 at 1.34v, 1215MHz core, 1498 mem, with the Noctua at 1000RPM, 65% ish. After that the noise levels get rapidly out of control, so I guess that's my sweet spot.

I have had good luck using SpeedFan to manage the GPU pump, VRM fan and rad fan, with all my fan curves set up and reacting to the core temps and such.

At stock clocks, 1.008v, I idle at 35-45c, depending on whether my memory decides to downclock or not... if not, the VRM will be chilling at 60c for no reason, heating up the whole card it seems.

Edit: Must only be about 20c at the most here indoors (UK).


----------



## rv8000

Quote:


> Originally Posted by *Ized*
> 
> Surely need more info to tell, but if it helps here is my info:
> 
> I am happy-ish with my Zalman LQ315, cheapo mounting plate and GELID "VRM kit".
> 
> Recently replaced the stock fan with a Noctua NF-F12 and I'm seeing better results for sure; that being said, the limiting factor in cooling this thing is still noise levels.
> 
> I would be hovering between 60c-75c playing GTA 5 at 1.34v, 1215MHz core, 1498 mem, with the Noctua at 1000RPM, 65% ish. After that the noise levels get rapidly out of control, so I guess that's my sweet spot.
> 
> I have had good luck using SpeedFan to manage the GPU pump, VRM fan and rad fan, with all my fan curves set up and reacting to the core temps and such.
> 
> At stock clocks, 1.008v, I idle at 35-45c, depending on whether my memory decides to downclock or not... if not, the VRM will be chilling at 60c for no reason, heating up the whole card it seems.
> 
> Edit: Must only be about 20c at the most here indoors (UK).


The only other useful info I could provide: the card was stock 1080/1250 with the standard +63mv for the Lightning, H55 at max rpm, 2x B12-2's @ ~1300rpm; radiator limit imo. I'll probably swap it out for a dual 120 CLC at some point, but the upside is my rig is incredibly quiet without having ridiculous temps; my 140mm Noctua CPU fan is actually the loudest thing in my case.


----------



## aaroc

Quote:


> Originally Posted by *hornedfrog86*
> 
> Thank you, do you cool the back of your CPU socket area on that case with a fan. I like that it has a cutout on that side, I will try to install an FX 9590 in one.


Yes, I bought the official CM fan for that, which is super slim. I installed it to throw air inside the tower and it lowered the CPU temp by 2C. The CPU is cooled by a Corsair H100i in push-pull; the push fans are Scythe slims and the pull fans are the ones that came with the H100i. My gaming PC is now on an open bench, but my work PC is in the CM 690 II with an Asus Crosshair V Formula-Z, an FX 9590 and no gaming GPU. I really love the CM 690 II, you can fit a lot of fans inside.
Quote:


> Originally Posted by *elgreco14*
> 
> Yeah, the problem is the PSU only fits this way.
> 
> But I just bought 2 Noctua fans second hand and installed them like this:
> 
> So yeah, in 1080p the upper card is 88 degrees max, but again when I run at 1440p it gets to 94 degrees. Maybe because my CPU is also getting to 65 degrees the air can't get away well? I don't know anymore how to fix this, and I need a solution, because soon my Asus MG279Q is coming, so it has to stop getting so hot at 1440p.


Can you test putting the two new fans throwing air out of the tower, right next to the two GPUs? If this lowers your GPU temps, maybe you can mod your tower door to take these two fans.


----------



## sugarhell

Interesting: the new MSI AB added a third-party voltage controller mode for unlocked voltages. Maybe we can bypass the whole I2C thing for the voltages. I need to check it soon.


----------



## hornedfrog86

Thank you aaroc, it looks like an excellent case that really does work! I wonder how large a power supply you could fit in the bottom?


----------



## aaroc

Quote:


> Originally Posted by *hornedfrog86*
> 
> Thank you aaroc, it looks like an excellent case that really does work ! I wonder how large a power supply you could fit in the bottom?


A Corsair AX1200i fits, but you lose one of the two possible 120mm fans on the bottom.


----------



## SPLWF

Quote:


> Originally Posted by *Mega Man*
> 
> You need to divide the ram usage by 2 (or the number of cards you have). tsm brought up that you can now disable that in the newest MSI AB.
> 
> As to the rest of the question: unstable drivers, an unstable bench, etc. can cause it. As I have not seen that bench widely used, it can't be discounted.


Thanks, what option is that now in MSI AB?


----------



## Mega Man

i d k lol ask tsm sorry


----------



## hidethecookies

Has anyone tried putting ram heat sinks on a back plate for more cooling? I have the MSI Gaming 290 with a g10, h75, and the gelid vrm heat sink. It still has the stock MSI back plate on. I was also wondering if it would run cooler without the plate on because of heat soaking into the ram chips. Any help would be appreciated.


----------



## tsm106

Quote:


> Originally Posted by *Mega Man*
> 
> i d k lol ask tsm sorry


In the correction formula box, type x/y, with y being the number of cards. For example, quad = x/4. It's just a divider in essence. You'll know your math is wrong when the text turns red instead of black.
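The x/y correction is just dividing Afterburner's summed memory counter by the card count. A tiny sketch of the same arithmetic (the function name is illustrative, not an Afterburner API):

```python
# In Crossfire, Afterburner's memory-usage counter reports the sum across
# all cards, so the per-card figure is the reading divided by the card
# count, which is all the x/y correction formula does (y = card count).

def per_card_vram(reported_mb, num_cards):
    return reported_mb / num_cards

# The Skyrim example from earlier in the thread: 6600 MB reported across
# two 4 GB 290s is really about 3300 MB in use per card.
print(per_card_vram(6600, 2))  # -> 3300.0
```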


----------



## Caveat

Hello. I've got 2 Sapphire R9 290Xs with stock coolers in my case, but one of the tubes of the NZXT Kraken X61 is lying on top of the upper graphics card (see pic below). My question: is it possible to get a backplate and screw it on top of the upper card with just the stock cooler? If it's possible I'll probably need longer screws. Any tips?


----------



## hornedfrog86

Quote:


> Originally Posted by *aaroc*
> 
> A Corsair AX1200i fits, but you lose one of the two possible 120mm fans on the bottom.


Thanks!


----------



## Ized

Quote:


> Originally Posted by *rv8000*
> 
> The only other useful info I could provide: the card was stock 1080/1250 with the standard +63mv for the Lightning, H55 at max rpm, 2x B12-2's @ ~1300rpm; radiator limit imo. I'll probably swap it out for a dual 120 CLC at some point, but the upside is my rig is incredibly quiet without having ridiculous temps; my 140mm Noctua CPU fan is actually the loudest thing in my case.


Apologies, it seemed like you were not happy at first.

I'm curious if your radiator fins are cool to the touch while gaming like mine, while what I assume to be the water reservoir on the bottom is extremely toasty? It almost seems like the heat isn't moving to the fins fast enough or something... (I have no watercooling experience.)


----------



## Arizonian

Quote:


> Originally Posted by *hidethecookies*
> 
> like to join the club
> MSI 290 Gaming 4g w/ Kraken g10 and Corsair H75
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


Quote:


> Originally Posted by *ambientblue*
> 
> http://www.3dmark.com/3dm/7111037
> 
> 
> Spoiler: Warning: Spoiler!
> 
> Stock run. 4GHz CPU.


Congrats - added


----------



## Oxen

Hi, I have 2 R9 290 OC Edition cards by Gigabyte. I'd like to join; my first club, I guess.

http://s271.photobucket.com/user/joshua_alexia/media/HGC/11_zpsnwg8fjma.jpg.html

http://s271.photobucket.com/user/joshua_alexia/media/HGC/111_zpsyxdibhco.jpg.html

serial# of top card

http://s271.photobucket.com/user/joshua_alexia/media/HGC/1_zpsciljgo4x.jpg.html

http://s271.photobucket.com/user/joshua_alexia/media/HGC/1111_zpszz75vfsn.jpg.html

They're not turned up or anything, except I did mod the fan profile. Noise doesn't bother me.


----------



## zeppoli

I have two 290's. One of them I can OC to 1100 core and 1500MHz memory using MSI Afterburner without problem. Yet with my other card I can't touch the core clock and can only increase memory 50-75 at most; if I go higher my screen just goes black.

Is this just the way it goes? Some can OC and some cannot? Obviously because of this, when I pair them in crossfire I can't really OC at all.

It has nothing to do with heat, because the second I bring the core or memory up with the other card it instantly black screens.


----------



## kizwan

Quote:


> Originally Posted by *zeppoli*
> 
> I have two 290's. One of them I can OC to 1100 core and 1500MHz memory using MSI Afterburner without problem. Yet with my other card I can't touch the core clock and can only increase memory 50-75 at most; if I go higher my screen just goes black.
> 
> Is this just the way it goes? Some can OC and some cannot? Obviously because of this, when I pair them in crossfire I can't really OC at all.
> 
> It has nothing to do with heat, because the second I bring the core or memory up with the other card it instantly black screens.


If you put the black-screen card as the secondary card (in Crossfire), do you still get a black screen when overclocking in Crossfire?


----------



## Oxen

Afterburner applies the clocks to both cards, I believe, if you're running crossfire. Though I could be super incorrect too. Possibly your PSU is tapped out on available watts?


----------



## FastEddieNYC

Quote:


> Originally Posted by *zeppoli*
> 
> I have two 290's. One of them I can OC to 1100 core and 1500MHz memory using MSI Afterburner without problem. Yet with my other card I can't touch the core clock and can only increase memory 50-75 at most; if I go higher my screen just goes black.
> 
> Is this just the way it goes? Some can OC and some cannot? Obviously because of this, when I pair them in crossfire I can't really OC at all.
> 
> It has nothing to do with heat, because the second I bring the core or memory up with the other card it instantly black screens.


Disable crossfire and set the card that will not OC as the primary. In AB settings, uncheck "apply clock for all cards". Then try to overclock the card; be sure the correct card is selected in AB. If it black screens, then it's the card, but if it works, it's most likely your power supply. You can also try using Sapphire TriXX; I prefer it over Afterburner for Crossfire.


----------



## zeppoli

Quote:


> Originally Posted by *FastEddieNYC*
> 
> Disable crossfire and set the card that will not OC as the primary. In AB settings, uncheck "apply clock for all cards". Then try to overclock the card; be sure the correct card is selected in AB. If it black screens, then it's the card, but if it works, it's most likely your power supply. You can also try using Sapphire TriXX; I prefer it over Afterburner for Crossfire.


Why would it be my PSU?

Card A can OC to maximum levels.

Card B cannot.

They are the exact same card with the exact same PSU (a brand new OCZ 1k watt). Why would my power supply have anything to do with it when it works fine with one card and not the other?

We're talking by themselves, NOT in crossfire here.


----------



## HugoStiglitz96

Just got an MSI R9 290


http://www.techpowerup.com/gpuz/details.php?id=8mn2x


----------



## rv8000

Quote:


> Originally Posted by *Ized*
> 
> Apologies, it seemed like you were not happy at first.
> 
> I'm curious if your radiator fins are cool to the touch while gaming like mine, while what I assume to be the water reservoir on the bottom is extremely toasty? It almost seems like the heat isn't moving to the fins fast enough or something... (I have no watercooling experience.)


I don't either, but I think the reason for that is the pump is actually moving the water too quickly for the radiator to dissipate the heat properly. There's always that sweet spot for pump speed, but the H55 is 3-pin and can only have its pump speed changed by altering the voltage. These small 120mm rads were not designed for 300w of heat; either way, it's good enough for now.


----------



## PsYcHo29388

Found out today my card has Hynix memory, is this the best type of memory for the 290s or does it not matter at all?


----------



## cephelix

Quote:


> Originally Posted by *PsYcHo29388*
> 
> Found out today my card has Hynix memory, is this the best type of memory for the 290s or does it not matter at all?


In the early days it was reported that Hynix memory is better than Elpida: less likely to blackscreen (correct me if I'm wrong on this one) and able to overclock higher.


----------



## andressergio

I adapted an AIO to my R9 290X and it went well, but after some minutes my keyboard fell and I don't know what happened; it seems the GPU is dead, code 62 on both my ASUS using DVI.

I didn't try HDMI yet, but really sad.


----------



## venom55520

Quote:


> Originally Posted by *cephelix*
> 
> In the early days it was reported that Hynix memory is better than Elpida: less likely to blackscreen (correct me if I'm wrong on this one) and able to overclock higher.


I can vouch for this. I bought a used one from here that came with the blackscreen issue that the seller conveniently forgot to tell me about. I RMA'd it to sapphire (who at least for me have great support) and they upgraded me to the TRI-X version with hynix memory for free, I just requested it from them. This card has been ROCK SOLID STABLE. Absolutely not a single problem, and it's way quieter and temps are much better but I assume that's because of the AWESOME TRI-X cooler.


----------



## cephelix

Quote:


> Originally Posted by *andressergio*
> 
> I adapted an AIO to my R9 290X and it went well, but after some minutes my keyboard fell and I don't know what happened; it seems the GPU is dead, code 62 on both my ASUS using DVI.
> 
> I didn't try HDMI yet, but really sad.


Could it be a short somewhere? and did you cool the VRMs as well?
Quote:


> Originally Posted by *venom55520*
> 
> I can vouch for this. I bought a used one from here that came with the blackscreen issue that the seller conveniently forgot to tell me about. I RMA'd it to sapphire (who at least for me have great support) and they upgraded me to the TRI-X version with hynix memory for free, I just requested it from them. This card has been ROCK SOLID STABLE. Absolutely not a single problem, and it's way quieter and temps are much better but I assume that's because of the AWESOME TRI-X cooler.


If I had known about the Tri-X when I was in the market for a 290, I definitely would've gotten it. Have heard only good things about it on this forum


----------



## andressergio

Quote:


> Originally Posted by *cephelix*
> 
> Could it be a short somewhere? and did you cool the VRMs as well?
> If I had known about the Tri-X when I was in the market for a 290, I definitely would've gotten it. Have heard only good things about it on this forum


Yes, a heatsink I put on the VRM fell off, and even though there was a JetFlo pumping brutal air over it, I could see it glowing red and burnt.

Now I don't know if there's a way to repair it; guess I'm done.


----------



## cephelix

Quote:


> Originally Posted by *andressergio*
> 
> Yes, a heatsink I put on the VRM fell off, and even though there was a JetFlo pumping brutal air over it, I could see it glowing red and burnt.
> 
> Now I don't know if there's a way to repair it; guess I'm done.


Ouch. Hopefully someone with more expertise can help you, but for now, my condolences; I feel for your loss. On a side note, it's a good chance to keep an eye out for the 980 Ti or 390X that's not too far off on the horizon.


----------



## Coldblackice

How the **** do we get these ******ed 290's to stop downclocking with the "PowerPlay" garbage? Freaking unreal. Why do they even put this nonsense into these performance cards? Talk about PowerSUCK. It would be one thing if AMD would give us a setting to turn the stupid thing off, but nope. We have to spend hours throwing spaghetti against the wall -- uninstalling, reinstalling, settings on, settings off, etc. -- to hope that something eventually sticks.

Is there any de facto solution? There are a million threads on this and they are all over the place on what to do -- or in other words, no clear solution whatsoever.

Aghhh, I'm sorry, I've just been an AMD user for years, and with each generation, PowerPlay never fails to rear its nasty face. AMD really needs to buy Afterburner and merge it in. It's stupid-insane that we still have to resort to clunky third-party utilities to do what AMD knows we're buying these cards to do.


----------



## gordesky1

So I was thinking of getting the G10 on Amazon, since it's only $24, for my 290X Lightning. Will it fit with the backplate and everything?

I'm going to use my old-version H50, which I know won't fit as-is, but I heard you can just file down the plastic on the H50 and it fits perfectly.

What temp drop should I see, roughly? With all fans maxed on the Lightning I get around 70s with 1200 core / 1650 memory at load; sometimes it will hit 80c.

Pretty much I'd like to get the heat out of my case to improve case temps.

Oh, and does it require a shim?


----------



## PsYcHo29388

Quote:


> Originally Posted by *venom55520*
> 
> Quote:
> 
> 
> 
> Originally Posted by *cephelix*
> 
> In the early days it was reported that Hynix memory is better than Elpida: less likely to blackscreen (correct me if I'm wrong on this one) and able to overclock higher.
> 
> 
> 
> I can vouch for this. I bought a used one from here that came with the blackscreen issue that the seller conveniently forgot to tell me about. I RMA'd it to sapphire (who at least for me have great support) and they upgraded me to the TRI-X version with hynix memory for free, I just requested it from them. This card has been ROCK SOLID STABLE. Absolutely not a single problem, and it's way quieter and temps are much better but I assume that's because of the AWESOME TRI-X cooler.

I was gonna go with Sapphire for my 290 but I looked at the warranty, thought it was rather short, and decided to go with the XFX variant instead. Most 290s on newegg still have rather bad reviews because of black screen issues, which almost made me turn away from getting the card entirely but at the same time the price was too good to pass up so here I am.

I'm thinking maybe, just maybe if and when I get a new motherboard I'll do some mild overclocking, maybe 1050/1350 or just as high as I can go on stock voltage.


----------



## cephelix

Quote:


> Originally Posted by *PsYcHo29388*
> 
> I was gonna go with Sapphire for my 290 but I looked at the warranty, thought it was rather short, and decided to go with the XFX variant instead. Most 290s on newegg still have rather bad reviews because of black screen issues, which almost made me turn away from getting the card entirely but at the same time the price was too good to pass up so here I am.
> 
> I'm thinking maybe, just maybe if and when I get a new motherboard I'll do some mild overclocking, maybe 1050/1350 or just as high as I can go on stock voltage.


Well, I must be one of the lucky ones I suppose. I only got blackscreens when I first installed the card. But I don't think I did anything special and it went away by itself and it's been going smooth ever since. Temps may be a bit off but that's only because I live in a tropical climate. All in all, SGD$600 well spent.


----------



## andressergio

Quote:


> Originally Posted by *cephelix*
> 
> Ouch. Hopefully someone with more expertise could help you. But for now, my condolences. I fell for your loss. On a sidenote, it's a good chance to keep an eye out for the 980ti or 390X that's not too far off in the horizon


Thanks bro, it was my fault. I stuck the copper HS to the VRM and my keyboard fell, so in an attempt to grab it I hit the VGA. It had a fan over them anyway, so it was only a second; I believe it was about to die already...


----------



## HoZy

Quote:


> Originally Posted by *cephelix*
> 
> In the early days it's been reported that the Hynix memories are better than the Elpida ones, less likely to blackscreen(correct me if I'm wrong on this one) and is able to overclock higher than the Elpida ones.


I've got 2 launch reference XFX 290Xs with Elpida, flashed with the Sapphire Tri-X R9 290X OC BIOS, running at 1200 core / 1500 memory (6000MHz effective), +145 VDDC offset, 50% power limit.
They've been that way since 2013 under water (EK blocks) and have never black screened or gone above 65c.

But I might be a lucky one.


----------



## rdr09

Quote:


> Originally Posted by *HoZy*
> 
> I've got 2 launch reference XFX 290Xs with Elpida, flashed with the Sapphire Tri-X R9 290X OC BIOS, running at 1200 core / 1500 memory (6000MHz effective), +145 VDDC offset, 50% power limit.
> They've been that way since 2013 under water (EK blocks) and have never black screened or gone above 65c.
> 
> But I might be a lucky one.


Same reference cards here. One bought at launch and the other last November, used. Benched both at 1290MHz. Zero issues, but I use them at stock and watered.


----------



## cephelix

Quote:


> Originally Posted by *andressergio*
> 
> Thanks bro, it was my fault. I stuck the copper HS to the VRM and my keyboard fell, so in an attempt to grab it I hit the VGA. It had a fan over them anyway, so it was only a second; I believe it was about to die already...


What do you mean your keyboard felt? It's somewhat late here and my mind isn't exactly at 100%.
Quote:


> Originally Posted by *HoZy*
> 
> I've got 2 launch reference XFX 290Xs with Elpida, flashed with the Sapphire Tri-X R9 290X OC BIOS, running at 1200 core / 1500 memory (6000MHz effective), +145 VDDC offset, 50% power limit.
> They've been that way since 2013 under water (EK blocks) and have never black screened or gone above 65c.
> 
> But I might be a lucky one.


Quote:


> Originally Posted by *rdr09*
> 
> Same reference cards here. One bought at launch and the other last November, used. Benched both at 1290MHz. Zero issues, but I use them at stock and watered.


Ahhh, I wish I could afford watercooling my rig. I envy you guys.


----------



## mfknjadagr8

Quote:


> Originally Posted by *cephelix*
> 
> What do you mean your keyboard felt? It's somewhat late here and my mind isn't exactly at 100%.
> 
> Ahhh, I wish I could afford to watercool my rig. I envy you guys.


I think he meant fell... yeah, water cooling is nice, but for the money spent it's hard to swallow. I did most things on the cheap and it was still $1000 so far.


----------



## the9quad

Out of five launch reference 290X's, two of mine had black screens. That should tell you something about how bad they were at launch.


----------



## cephelix

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I think he meant fell... yeah, water cooling is nice, but for the money spent it's hard to swallow. I did most things on the cheap and it was still $1000 so far.


Ahh, that would really be a stroke of bad luck, your keyboard falling on your card...

And yeah, when I had my loop up it cost me around that much as well. But it looked horrendous! To make it look nice I would have had to spend another $600. I really couldn't swallow that and just went back to air. The temps were really awesome, though. That's the part I miss the most.


----------



## Icekilla

Maybe this is the wrong place but...

Is it too late to consider buying a used R9 280X if the price is good?


----------



## cephelix

Quote:


> Originally Posted by *Icekilla*
> 
> Maybe this is the wrong place but...
> 
> Is it too late to consider buying a used R9 280X if the price is good?


Depends on the price, what games you play, and what resolution. If it would meet your requirements for the foreseeable future at the right price, and you could spare the cash, why not...


----------



## Icekilla

Say, $190 used with shipping included. I'd play at 1920 x 1080 and I just want something that is cheap and somewhat future proof. It also helps that it seems like it outperforms the GTX 960, the card I was originally planning to buy.


----------



## cephelix

Quote:


> Originally Posted by *Icekilla*
> 
> Say, $190 used with shipping included. I'd play at 1920 x 1080 and I just want something that is cheap and somewhat future proof. It also helps that it seems like it outperforms the GTX 960, the card I was originally planning to buy.


Doesn't seem bad, and at 1080p it should be enough. Someone is selling a 290X Tri-X for $275; personally I'd take the 290X.


----------



## Icekilla

Sadly it's out of my budget and I'm not sure it'd fit in my case









AMD cards are WAY too long as of late!


----------



## cephelix

Quote:


> Originally Posted by *Icekilla*
> 
> Sadly it's out of my budget and I'm not sure it'd fit in my case
> 
> 
> 
> 
> 
> 
> 
> 
> 
> AMD cards are WAY too long as of late!


Then it seems the 280X it is.


----------



## rv8000

Quote:


> Originally Posted by *gordesky1*
> 
> So I was thinking of getting the G10 on Amazon, since it's only $24, for my 290X Lightning. Will it fit with the backplate and everything?
> 
> I'm going to use my old-version H50, which I know won't fit as-is, but I heard you can just file down the plastic on it and it fits perfectly.
> 
> What temp drop should I see, roughly? With all fans maxed on the Lightning I get around the 70s with 1200 core / 1650 memory at load; sometimes it will hit 80c.
> 
> Mostly I want to get the heat out of my case to improve case temps.
> 
> Oh, and does it require a shim?


I just installed a G10 and H55 on my Lightning last week. You will have to trim the foam on the NZXT plate (roughly half of it off). I also recommend taking the rubber washers off the nuts that fasten the bracket to the card: they stop you from getting a flush mount and good contact, since they tend to peel themselves off the tighter you go anyway. DO NOT over-tighten the bracket if you do this. You can leave the stock backplate on, as well as the front plate cooling the VRM and memory. I don't think you will need a shim, but as the H50 isn't supported I can't say for sure. With a ~26-27c room ambient, at stock 1080/1250 +63mv and two 1300rpm eLoops on the rad, my temp delta is about 40c, loading to 67-68c at 100% GPU usage for 30 minutes.


----------



## Dooderek

Quote:


> Originally Posted by *Caveat*
> 
> Hello. I've got 2 Sapphire R9 290Xs with stock coolers in my case, but one of the tubes of the NZXT Kraken X61 is lying on top of the upper graphics card (see pic below). My question: is it possible to get a backplate and screw it on top of the upper card with just the stock coolers? If it's possible I'll probably need longer screws. Any tips?


Zip tie the lines out of the way


----------



## Jyve

Quote:


> Originally Posted by *Caveat*
> 
> Hello. I've got 2 Sapphire R9 290Xs with stock coolers in my case, but one of the tubes of the NZXT Kraken X61 is lying on top of the upper graphics card (see pic below). My question: is it possible to get a backplate and screw it on top of the upper card with just the stock coolers? If it's possible I'll probably need longer screws. Any tips?


Can't you just rotate the tubes on the block? Push them up and away from the GPU and I'd think they'd not lay on the card.


----------



## SLOWION

Got to play with an XFX Radeon R9 290

















Ran some benchmarks with a G3258 as well ^^^^


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> Same reference here.one bought at launch and the other last November used. Benched both at 1290MHz. Zero issues but I use them at stock and watered.


I have an XFX 290 here as well, flashed to 290X with the original Shimano BIOS.

I run all of my 290's at stock these days at 1000/[email protected] before vdroop, on coolant.

My tri-290 setup has over 5 litres of coolant running through it...

1 card: 1360MHz
2 cards: 1320MHz
3 cards: 1290MHz


----------



## Arizonian

Quote:


> Originally Posted by *Oxen*
> 
> Hi, I have 2 R9 290 OC Edition cards by Gigabyte. I'd like to join my first club, I guess.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s271.photobucket.com/user/joshua_alexia/media/HGC/11_zpsnwg8fjma.jpg.html
> 
> http://s271.photobucket.com/user/joshua_alexia/media/HGC/111_zpsyxdibhco.jpg.html
> 
> 
> 
> serial# of top card
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s271.photobucket.com/user/joshua_alexia/media/HGC/1_zpsciljgo4x.jpg.html
> 
> http://s271.photobucket.com/user/joshua_alexia/media/HGC/1111_zpszz75vfsn.jpg.html
> 
> 
> They're not turned up or anything, except I did mod the fan profile. Noise doesn't bother me.


Congrats - added








Quote:


> Originally Posted by *HugoStiglitz96*
> 
> Just got an MSI R9 290
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/details.php?id=8mn2x


Congrats - added









Wow, if someone had told me this club would have 580 owners come through, totaling 830 cards, I would not have believed them.

I will say, for those of you who purchased them before the mining-craze prices and are still gaming on them: what a value over 19 months.


----------



## Caveat

Quote:


> Originally Posted by *Dooderek*
> 
> Zip tie the lines out of the way


Yes, but where do I attach it? I don't want a rope hanging straight down from the top.
Quote:


> Originally Posted by *Jyve*
> 
> Can't you just rotate the tubes on the block? Push them up and away from the GPU and I'd think they'd not lay on the card.


I tried, but they keep falling back down on top of the card.


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I have a XFX 290 here as well flashed to 290x with the original Shimano bios .
> 
> I run all of my 290's at stock these days at 1000/[email protected] before vdroop on coolant .
> 
> My TRI 290 set up has over 5 ltrs of coolant running through it .......
> 
> 1 card 1360mhz
> 2 cards 1320mhz
> 3 cards 1290mhz


I've seen your benchmarks on those. It's amazing they still run. lol

I don't flash any other BIOS on my 290s, though. One can bench at 1330 core but the other only 1290, both on their original BIOSes. You know the secret, though: keeping them cool.


----------



## rdr09

Thank you >>>> *Arizonian*.


----------



## boredmug

Contemplating buying two 290X cards with EK water blocks to replace my 7950's. The cards are $500 with the blocks and bridge. Or should I just wait until AMD comes out with its new offerings? I'm currently gaming at 5760x1080 but was thinking of purchasing a super-wide 1440p monitor. Probably won't be going 4K anytime soon.


----------



## aaroc

What driver do you recommend for quadfire: the Omega or the latest beta for Witcher 3?


----------



## BradleyW

Quote:


> Originally Posted by *aaroc*
> 
> what driver do you recommend for quadfire the Omega or the latest beta for Witcher 3?


If you play W3 or PCars, go with 15.5. If not, 14.12 all the way!


----------



## Jyve

Quote:


> Originally Posted by *Caveat*
> 
> Yes. But to where i attach it? I dont want a rope straight down from the top.
> i tried, but they keep going down on top of the card


You could try zip tying the block tunes together.


----------



## FastEddieNYC

Quote:


> Originally Posted by *aaroc*
> 
> what driver do you recommend for quadfire the Omega or the latest beta for Witcher 3?


15.5 is optimized for Witcher 3 and Project Cars. You need to run in full screen for multi gpu to work.


----------



## tsm106

Quote:


> Originally Posted by *aaroc*
> 
> what driver do you recommend for *quadfire* the Omega or the latest beta for *Witcher 3*?


lol, do not hold your breath. Wait will be a long freaking time.


----------



## aaroc

Quote:


> Originally Posted by *tsm106*
> 
> lol, do not hold your breath. Wait will be a long freaking time.


wait for what?


----------



## tsm106

Quote:


> Originally Posted by *aaroc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> lol, do not hold your breath. Wait will be a long freaking time.
> 
> 
> 
> wait for what?

I know it's funny right? I thought it was even funnier when someone mentioned you have to be in fullscreen for crossfire to work in witcher 3 lol.


----------



## aaroc

My question was about driver version recommendations, to avoid one with possible problems. I have a quadfire of R9 290Xs. I'm reinstalling my PC right now.


----------



## tsm106

Quote:


> Originally Posted by *aaroc*
> 
> My question was about driver version recommendations, to avoid one with possible problems. I have a quadfire of R9 290Xs. I'm reinstalling my PC right now.


There is no quad support in Witcher 3, much less any crossfire support in that game. That's why you shouldn't hope for it, much less hold your breath waiting for it. It's a GameWorks title! Depending on your viewpoint, one could say it keeps CFX behind an iron curtain.


----------



## aaroc

Quote:


> Originally Posted by *tsm106*
> 
> There is no quad support in witcher 3 much less any crossfire support in that game. That's why you shouldn't hope for it, much less hold your breath waiting for it. It's a gameworks title! Depending on your viewpoint, one could say it prevents cfx behind an iron curtain.


Don't worry, I don't have Witcher 3 and I'm not planning on buying it before a lot of Steam sales in the future.
My reference to Witcher 3 was only because it's the highlight of the latest beta driver. My newest games are GTA V, Project Cars, F1 2015, and Dirt Rally in the near future.


----------



## tsm106

Quote:


> Originally Posted by *aaroc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> There is no quad support in witcher 3 much less any crossfire support in that game. That's why you shouldn't hope for it, much less hold your breath waiting for it. It's a gameworks title! Depending on your viewpoint, one could say it prevents cfx behind an iron curtain.
> 
> 
> 
> dont worry i dont have witcher 3 and im not planning on buying before a lot of steam sales in the future.
> My reference to witcher 3 was because that is the highlight of the latest beta driver. My newest games are gta v, project cars,f1 2015 and dirt rally in the near future.

The beta driver has no CFX for Witcher 3; it only refers to single GPU. That list of games has decent CFX support, and pCars is improving ever so slightly. Dirt will have full support, as will F1 2015, considering it's a Codies game. Thus get the latest driver, whichever that is.


----------



## Caveat

Quote:


> Originally Posted by *Jyve*
> 
> You could try zip tying the block tunes together.


You meant tubes? I tried that before; they keep falling on top of the upper card.


----------



## BradleyW

As for Project Cars, Windows 10 with the leaked Win10 drivers boosts min fps by up to 10, thanks to lower overhead and WDDM v2.0.


----------



## PsYcHo29388

Quote:


> Originally Posted by *BradleyW*
> As for Project Cars, Windows 10 with the leaked Win10 drivers boosts min fps by up to 10, thanks to lower overhead and WDDM v2.0.


Will this overhead fix/patch/update eventually apply to all DX11 games or just certain ones?


----------



## faizreds

Running Heaven benchmark for my Sapphire R9 290 tri-x:

Stock(957/1250):


Overclock(1100/1300):


----------



## Ha-Nocri

A 4fps difference; not worth it to OC. 1300MHz would probably make a noticeable difference, but other than that, not worth it.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wow if someone told me this club would have 580 owners come through totaling 830 cards I would not have believed them.
> 
> I will say for those of you who purchased them prior to the mining craze prices and are still gaming on them, what a value for 19 months.


You're telling me.









I paid good loot for these and laid down more loot to block 'em.

Very impressed overall, considering these are my first water-cooled cards, and they did well to run over 1300mhz for benching.









I've had these three installed continuously for a year now and they have been working flawlessly, with only one driver update recently.
Quote:


> Originally Posted by *rdr09*
> 
> I've seen your benchmarks on those. it's amazing they still run. lol
> 
> I don't flash any other bios on my 290s, though. One can bench to 1330 core but the other only 1290. Both using original bioses. You know the secret, though. keeping them cool.


Yeah, that 1HP chiller is overkill, but its 4-litre res is the key: it lets them run at ambient temp, full load at stock, at 36c-38c before I even turn on the chiller.









Quote:


> Originally Posted by *tsm106*
> 
> lol, do not hold your breath. Wait will be a long freaking time.
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I know it's funny right? I thought it was even funnier when someone mentioned you have to be in fullscreen for crossfire to work in witcher 3 lol.
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> There is no quad support in witcher 3 much less any crossfire support in that game. That's why you shouldn't hope for it, much less hold your breath waiting for it. It's a gameworks title! Depending on your viewpoint, one could say it prevents cfx behind an iron curtain.
> 
> 
> 
> 
> 











I got it the first time LoooL


----------



## BradleyW

Quote:


> Originally Posted by *PsYcHo29388*
> 
> Will this overhead fix/patch/update eventually apply to all DX11 games or just certain ones?


All DX11 games. Optimizing from the root!


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I paid good loot for these and laid down more loot to block 'em.
> 
> Very impressed overall, considering these are my first water-cooled cards, and they did well to run over 1300mhz for benching.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've had these three installed continuously for a year now and they have been working flawlessly, with only one driver update recently.


Hey madman!


----------



## LandonAaron

Quote:


> Originally Posted by *Arizonian*
> 
> Wow if someone told me this club would have 580 owners come through totaling 830 cards I would not have believed them.
> 
> I will say for those of you who purchased them prior to the mining craze prices and are still gaming on them, what a value for 19 months.


What about those of us who purchased after the mining craze had ended, when miners were selling their 290Xs on eBay for ~$225? Sure, we haven't had them for 19 months, but that was the best performance-per-dollar purchase I think I have ever made for my computer.







.


----------



## Dooderek

Second-hand cards FTW. I purchased both my cards second hand and then bought water blocks for them, still less than $600 USD total cost. I have taken them both to 1175MHz but haven't tried any higher.


----------



## BradleyW

I paid £450 for each 290X new + 2 new WB's for £110 each!


----------



## JourneymanMike

Quote:


> Originally Posted by *LandonAaron*
> 
> What about those of us who purchased after the mining craze had ended, when miners were selling their 290Xs on eBay for ~$225? Sure, we haven't had them for 19 months, but that was the *best performance-per-dollar purchase I think I have ever made for my computer*.
> 
> 
> 
> 
> 
> 
> 
> .


That's true!

I purchased 3 Reference 290X's from eBay last year! A great bargain!


----------



## Agent Smith1984

Same here, bought one of my trixxies on craigslist for $180, and the other is a refurb from newegg I got for $230.

$410 total investment for this much horsepower is a friggin' bargain.


----------



## gatygun

Needed to upgrade from a 580; sold that thing for $90 and bought a 290 Tri-X for $200. Best money I ever spent, lol.


----------



## Ramzinho

That's why I'm so glad about my investment: two 290Xs for $220 each, plus two waterblocks + backplates, practically new, for $170.


----------



## BradleyW

I really paid too much for the 290X's + WB's. Around £1100 in total. That's about $1750 right?


----------



## mfknjadagr8

OK, Afterburner question: has anyone had clocks they never set getting stuck, even after removing profiles? I decided to overclock a bit last night and tried a few settings: voltage +10, power limit maxed, 1050 core, 1300 memory. I had a hard freeze, audio still playing but all video frozen, and had to hard reset.

So I rebooted to run them at stock, just to make sure it was the overclock causing the freeze. At idle they looked OK, but the moment I started a 3D application it was showing 850 on core and 1399 on memory. So I uninstalled Afterburner, rebooted, and reinstalled it; started things up and the clocks were still stuck at the same values I never set. I then changed the clocks a few times and they stayed at what I set, whereas before, changing the clocks didn't change what the card was actually running.

So I reset to default settings again and only upped the power limit to max, but I'm still experiencing freezes even at stock. Could I have corrupted the driver in the initial lockup?


----------



## Ramzinho

Quote:


> Originally Posted by *BradleyW*
> 
> I really paid too much for the 290X's + WB's. Around £1100 in total. That's about $1750 right?


Yes... too much... but aren't you still rocking them? And we expect them to keep rocking with Win10's expected performance gains.


----------



## BradleyW

Quote:


> Originally Posted by *Ramzinho*
> 
> Yes... too much... but aren't you still rocking them? And we expect them to keep rocking with Win10's expected performance gains.


That's the idea.


----------



## rdr09

Quote:


> Originally Posted by *BradleyW*
> 
> I really paid too much for the 290X's + WB's. Around £1100 in total. That's about $1750 right?


No wonder it is not easy to let go of them even with all the troubles crossfire is giving you.

Got my second 290 with a full block for $250...

http://www.3dmark.com/3dm/4644282?

Supposedly used for mining. I think the single-GPU Fiji will finally beat it.


----------



## BradleyW

Quote:


> Originally Posted by *rdr09*
> 
> No wonder it is not easy to let go of them even with all the troubles crossfire is giving you.
> 
> Got my second 290 with a fullblock for $250 . . .
> 
> http://www.3dmark.com/3dm/4644282?
> 
> supposedly used for mining. I think Fiji single gpu will finally beat it.


Speaking of CFX, GTA V and Witcher 3 are playing great.









Dying Light, AC Unity, and FC4 are also working brilliantly with the latest drivers.

Sadly, AMD are just too slow to release "good" profiles.


----------



## rdr09

Quote:


> Originally Posted by *BradleyW*
> 
> Speaking of CFX, GTA V and Witcher 3 are playing great.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Dying Light, AC Unity and FC4 are also working brilliant with the latest drivers.
> 
> Sadly, AMD are just too slow to release "good" profiles.


Ironically, I think they are having issues with some GameWorks titles even on a single GPU.


----------



## Agent Smith1984

Maybe the software companies are just holding hands with NVIDIA, rushing launch dates, and not giving AMD a fair chance to make good drivers.
It's not like they can't make good drivers. Every game that has come out in the last few years has had an updated driver within a week or so that fixed most issues and showed performance improvements. I really wonder why they aren't posting drivers before launch dates too, when NVIDIA oftentimes has an offering right before a release.
It only leads me to believe NVIDIA is getting access to something before AMD is... or maybe AMD really is being horrible about it. Who knows?


----------



## BugBash

Guess I should sign up









I got an ASUS R9 290X DirectCU II, and then I read here at OCN that they blow goats








Wow, it gets hot playing GTA V.
I can't live with THIS when summer is just starting! It gets real hot in my place, 30C plus!

So,
Today I pulled the trigger on an EK waterblock from SCAN in the UK








I don't have anything else yet (electric bill!!), but
next month I will set sail on my maiden voyage into the world of watercooling!!

The picture is my Venomous X needing a GOOD clean, with the 290X in the background.


----------



## Dooderek

Quote:


> Originally Posted by *BugBash*
> 
> Guess I should sign up
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I got an ASUS R9 290X DirectCU II, and then I read here at OCN that they blow goats
> 
> 
> 
> 
> 
> 
> 
> 
> Wow, it gets hot playing GTA V.
> I can't live with THIS when summer is just starting! It gets real hot in my place, 30C plus!
> 
> So,
> Today I pulled the trigger on an EK waterblock from SCAN in the UK
> 
> 
> 
> 
> 
> 
> 
> 
> I don't have anything else yet (electric bill!!), but
> next month I will set sail on my maiden voyage into the world of watercooling!!
> 
> The picture is my Venomous X needing a GOOD clean, with the 290X in the background.


Congrats!! WC is the way to go: 45-50c load temps all day long.


----------



## battleaxe

Regarding the cost of these cards and what we paid: miners really got the best deal, at least some of us, since mining paid for our cards outright. At least it paid for mine, anyway. So after mining, my real expense was zero, even though I had to pay over retail to get them initially.


----------



## Agent Smith1984

Yeah, some miners came out pretty good.
But a lot of pure gamers came out pretty good after the mining market crashed, because cards were being liquidated for dirt cheap (more so then than now).
People were terrified of mined cards and were practically giving them away. Like I said, I scored my 290 Tri-X last October for $190.
I had a 280X I got last July for $125, and I got my son's 7950 for $60.

I couldn't care less if these cards were mined on. Many cards were easily RMA'able anyway.


----------



## PsYcHo29388

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PsYcHo29388*
> 
> Will this overhead fix/patch/update eventually apply to all DX11 games or just certain ones?
> 
> 
> 
> All DX11 games. Optimizing from the root!

Awesome. I hope there will eventually be a nice boost in GTA5 and other CPU intensive games. Dropping down to 35 FPS from 60-70 is never fun.


----------



## Ized

Really quite jealous of some of those American prices o_0

Cost me £315 for a reference 290, plus I guess another £120 to cool the damn thing.

So $650.

Well, maybe $649 after I sell my dollar's worth of litecoins


----------



## rdr09

Quote:


> Originally Posted by *PsYcHo29388*
> 
> Awesome. I hope there will eventually be a nice boost in GTA5 and other CPU intensive games. Dropping down to 35 FPS from 60-70 is never fun.


You still have your CPU at stock? I read GTA5 is really CPU intensive.


----------



## cephelix

Whoa, you guys bought them at some cheap prices. I bought mine at 600 SGD, brand new though. And now they're going for half the price.


----------



## icmacdon

I bought 2 x Vapor-X 8GB after coming across from the green side, and frankly I have not been a happy camper, as two of my favourite games, Far Cry 4 and Witcher 3, had/have big crossfire issues. Sorry for being negative, but I think I may need some encouragement from you guys that I have not made a bad decision. I was considering selling them for a Titan X. Please talk me out of it.
Regards, Ian


----------






## JourneymanMike

Quote:


> Originally Posted by *Dooderek*
> 
> Congrats!! WC is the way to go: 45-50c load temps all day long.


I copy that!


----------



## aaroc

Quote:


> Originally Posted by *BradleyW*
> 
> Speaking of CFX, GTA V and Witcher 3 are playing great.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Dying Light, AC Unity and FC4 are also working brilliant with the latest drivers.
> 
> Sadly, AMD are just too slow to release "good" profiles.


How do you select those profiles, or is it auto-detection?

Here my validation 1x R9 290X HIS (Hynix) , 3x Sapphire R9 290X (Elpida):
http://www.techpowerup.com/gpuz/details.php?id=d9fwu
http://www.techpowerup.com/gpuz/details.php?id=x738
http://www.techpowerup.com/gpuz/details.php?id=eeekc
http://www.techpowerup.com/gpuz/details.php?id=5phbp

Quadfire Baby


----------



## rdr09

Quote:


> Originally Posted by *icmacdon*
> 
> I bought 2 x Vapor-X 8GB after coming across from the green side, and frankly I have not been a happy camper, as two of my favourite games, Far Cry 4 and Witcher 3, had/have big crossfire issues. Sorry for being negative, but I think I may need some encouragement from you guys that I have not made a bad decision. I was considering selling them for a Titan X. Please talk me out of it.
> Regards, Ian


If you are coming from Nvidia and playing games optimized for them, then I recommend going back green: not the Titan X but the 980 Ti, similar performance at half the price. Also, if your monitor is 1080p, it is hard to make use of two 290Xs even at 120Hz. Go back green and get a 1440p G-Sync monitor.

Edit: that is just my opinion. Wait for others.


----------



## aaroc

Quote:


> Originally Posted by *icmacdon*
> 
> I bought 2 x Vapor-X 8GB after coming across from the green side, and frankly I have not been a happy camper, as two of my favourite games, Far Cry 4 and Witcher 3, had/have big crossfire issues. Sorry for being negative, but I think I may need some encouragement from you guys that I have not made a bad decision. I was considering selling them for a Titan X. Please talk me out of it.
> Regards, Ian


If you are going that route, check the 980 Ti instead of the Titan X: almost the same performance for 300 USD less.

Please check the latest beta driver and follow the tutorial on how to configure Crossfire for Witcher 3 that is on the beta driver page. I read the release notes and tutorial because I'm reinstalling my PC and choosing which driver to install. I don't have those games.
Quote:


> Originally Posted by *rdr09*
> 
> if your monitor is 1080p, it is hard to make use of two 290Xs even at 120Hz. Go back green and get a 1440p G-Sync monitor.


The R9 290X shines at 1440p or 3x1440p. Racing with 3 monitors is so immersive; there is no going back.


----------



## boredmug

I'm contemplating buying two 290X's with EK full-cover blocks for $500. Seems like a pretty good deal, or should I wait a couple of weeks and see what AMD has to offer? I'm currently running two 7950's under water.


----------



## tsm106

Quote:


> Originally Posted by *boredmug*
> 
> I'm contemplating buying two 290x's with ek full cover blocks for 500. Seems like a pretty good deal. OR should I wait a couple weeks and see what AMD has to offer. I'm currently running two 7950's under water.


That's a pretty damn good deal.


----------



## boredmug

Quote:


> Originally Posted by *tsm106*
> 
> That's a pretty damn good deal.


I know... that's why I'm contemplating it so much, but at the same time, if I wait I might be able to get one uber card for a bit more. The 290X's would plug right into my loop, though.


----------



## tsm106

Quote:


> Originally Posted by *boredmug*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> That's a pretty damn good deal.
> 
> 
> 
> I know. . that's why I'm contemplating it so much but at the same time if I wait I might be able to get one uber card for a bit more. The 290x's would plug right into my loop though.

You can't buy that degree of performance and cooling for 500 bucks on newegg even if your life depended on it right this moment.


----------



## boredmug

The question is, will I be happy with these for the next two years or so, gaming at 5760x1080? I've also been thinking about going to one super-wide 1440p monitor. I've had my 7950's for about 2 1/2 years now and they've served me well, but I've been itching to upgrade. The other issue is I'm running an Antec 850 High Current PSU. Think it'll be enough?


----------



## tsm106

Quote:


> Originally Posted by *boredmug*
> 
> Question is will I be happy with these for the next two years or so gaming at 5760x1080? I've also been thinking about going to one super wide-screen 1440p monitor. I've had my 7950's for about 2 1/2 years now and they've served me well but I've been itching to upgrade. Other issue is I'm running an Antec 850 high current PSU. Think it'll be enough?


Compared to what you have currently, it will be twice the power. And if you move to a single super-wide 1440p, it should be an even easier load to run. PSU-wise, I would say it depends on your CPU and your OC'd state of trim. 850W is on the small side for 290X CFX, but you can get by if you keep the OCing reasonable.
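To put rough numbers on why 850W is tight for 290X crossfire, here is a back-of-envelope power budget. The TDP figures are ballpark assumptions (reference 290X around 290W each, an overclocked quad-core around 150W, and ~75W for the board, drives, and fans), not measured values:

```python
# Rough DC power-budget sketch for a 290X crossfire system.
# All wattages below are assumed ballpark TDPs, not measurements.
def system_load_watts(gpu_count, gpu_tdp=290.0, cpu_tdp=150.0, rest=75.0):
    """Estimate worst-case system draw in watts."""
    return gpu_count * gpu_tdp + cpu_tdp + rest

load = system_load_watts(2)
print(load)              # 805.0 -- uncomfortably close to 850W
print(load / 850 * 100)  # ~94.7% of the PSU's rating at full tilt
```

This is why keeping the overclock modest matters: every extra 50mV on two Hawaii cards eats most of the remaining headroom.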


----------



## boredmug

Quote:


> Originally Posted by *tsm106*
> 
> Compared to what you have currently it will be twice the power. And if you move to a single super wide 1440 it should be and even easier load to run. Psu wise, I would say it depends on your cpu and your oc'd state of trim. 850w is on small side for 290x cfx, but you can get by if you keep the ocing down/reasonable etc.


The 850 has two 40-amp rails. I'm running a [email protected] Yeah, it's an old processor but it still performs pretty decently. I'd rather have the cards and upgrade the CPU/mobo later. I was thinking I might just have to run the cards at stock clocks.

So I'm assuming you are saying jump on the deal?


----------



## boredmug

Another question... I'm currently using two DVI to Mini DisplayPort adapters for Eyefinity. The 290x uses DisplayPort, correct? I'm assuming I'd have to buy one active DisplayPort adapter? I know CrossFire is a little different on these cards; does anyone know if all the monitors still have to be plugged into one card?


----------



## tsm106

Quote:


> Originally Posted by *boredmug*
> 
> Another question. . I'm currently using two dvi to mini display port adapters for eyefinity. 290x uses display port correct? I'm assuming I'd have to buy one active display port adapter? I know crossfire is a little different on these cards does anyone know if all the monitors have to be plugged into one card still?


Nah, you don't have to buy an adapter, but you will have to switch from an mDP port to a full-size DP port. The 290s have two full DVI ports, so you connect two panels via DVI, and the last panel can go HDMI or DP.


----------



## boredmug

Quote:


> Originally Posted by *tsm106*
> 
> Nah, you don't have to buy an adapter, but you will have to switch one mdp port to dp port. The 290s have two full dvi ports so you connect two panels via dvi, and last panel can go hdmi or dp.


Lol. I just found your response to the same question right before you answered. The only thing that's holding me back is the chance of AMD releasing something crazy in the 800 dollar range that will trump this setup. Although I guess I'd be spending more on a water block when one comes out for said card. 500 dollars seems fantastic for the cards and blocks. I've had my eyes on a pair of these cards since the bitcoin crash, when people started dumping them.


----------



## BradleyW

Quote:


> Originally Posted by *tsm106*
> 
> *The beta driver has no cfx for witcher 3*, it only refers to single gpu. That list of games has decent cfx support, and pcars is improving ever so slightly. Dirt will have full support as well as F1 2015 considering its a codies game. Thus get the latest driver, whichever that


No.

See below:
Quote:


> Highlights of AMD Catalyst™ 15.5 Beta Windows Driver
> 
> *Crossfire Profile Update for:
> *The Witcher 3: Wild Hunt*
> 
> Performance Improvements for the following :
> 
> The Witcher 3: Wild Hunt - Up to 10% performance increase on single GPU Radeon R9 and R7 Series graphics products
> 
> Project Cars - Up to 17% performance increase on single GPU Radeon R9 and R7 Series graphics products


Source: http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx


----------



## tsm106

Quote:


> Originally Posted by *BradleyW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> *The beta driver has no cfx for witcher 3*, it only refers to single gpu. That list of games has decent cfx support, and pcars is improving ever so slightly. Dirt will have full support as well as F1 2015 considering its a codies game. Thus get the latest driver, whichever that
> 
> 
> 
> No.
> 
> See below:
> Quote:
> 
> 
> 
> Highlights of AMD Catalyst™ 15.5 Beta Windows Driver
> 
> *Crossfire Profile Update for:
> *The Witcher 3: Wild Hunt*
> 
> Performance Improvements for the following :
> 
> The Witcher 3: Wild Hunt - Up to 10% performance increase on single GPU Radeon R9 and R7 Series graphics products
> 
> Project Cars - Up to 17% performance increase on single GPU Radeon R9 and R7 Series graphics products
> 
> Click to expand...
> 
> Source: http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx
Click to expand...

That was referring to someone asking about quad. Though I suppose I laid the hyperbole on a bit thick regarding 2 cards; 3 and 4 cards run like ASS in Witcher 3.


----------



## BradleyW

Why do the AMD Windows 10 drivers, when installed on Windows 10, not allow CrossFire? If I use the Windows 10 drivers on Windows 8.1, CrossFire works. If I use the latest AMD 15.5 beta drivers for Windows 8.1 on Windows 10, CrossFire works! Can anyone explain why?

In other words:

Win 10 AMD driver + Win 8.1 = CFX.
Win 8.1 AMD driver + Win 10 = CFX.
*Win 10 AMD driver + Win 10 = No CFX.*

Thank you.


----------



## JourneymanMike

Quote:


> Originally Posted by *BradleyW*
> 
> Why do the AMD Windows 10 drivers, when installed on Windows 10, not allow CrossFire? If I use the Windows 10 drivers on Windows 8.1, CrossFire works. If I use the latest AMD 15.5 beta drivers for Windows 8.1 on Windows 10, CrossFire works! Can anyone explain why?
> 
> In other words:
> 
> Win 10 AMD driver + Win 8.1 = CFX.
> Win 8.1 AMD driver + Win 10 = CFX.
> *Win 10 AMD driver + Win 10 = No CFX.*
> 
> Thank you.


You're right...

For Win 10 I use the AMD drivers and the MB drivers from Asus... In fact I use all OEM drivers...

The Win 10 drivers are just not up to snuff! Microsoft is trying to do too much!


----------



## BradleyW

Quote:


> Originally Posted by *JourneymanMike*
> 
> You're right...
> 
> For Win 10 I use the AMD drivers and the MB drivers from Asus... In fact I use all OEM drivers...
> 
> The Win 10 drivers are just not up to snuff! Microsoft is trying to do too much!


At first I was worried that my machine had an issue.


----------



## th3illusiveman

So to get the reduced CPU overhead you need Windows 10? I've been running modded Windows 10 drivers off Guru3D and they seem to improve GTA5 performance quite a bit.

To install Windows 10 before July 29, what do you need?


----------



## JourneymanMike

Quote:


> Originally Posted by *th3illusiveman*
> 
> So to get the reduced CPU overhead you need Windows 10? I've been running modded Windows 10 drivers off Guru3D and they seem to improve GTA5 performance quite a bit.
> 
> To install Windows 10 before July 29, what do you need?


Steve Gates?


----------



## kizwan

Quote:


> Originally Posted by *Ized*
> 
> Really quite jealous of some of those American prices o_0
> 
> Cost me £315 for a reference 290 plus I guess another £120 to cool the damn thing.
> 
> So $650.
> 
> Well, maybe $649 after I sell my dollars worth of litecoins


I paid around £360 for a reference card, two for £720. Block only cost me £78 because I don't need to pay VAT.








Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PsYcHo29388*
> 
> Awesome. I hope there will eventually be a nice boost in GTA5 and other CPU intensive games. Dropping down to 35 FPS from 60-70 is never fun.
> 
> 
> 
> you still have your cpu at stock? I read GTA5 is really cpu intensive.
Click to expand...

CPU utilization is just slightly higher than in BF4.
Quote:


> Originally Posted by *cephelix*
> 
> Whoa, you guys bought them at some cheap prices.bought mine at 600SGD. brand new though. And now they're going for half the price.


Prices in our region are pretty much static and on the high side. Our market isn't big enough to influence the price.


----------



## pengs

Quote:


> Originally Posted by *BradleyW*
> 
> Speaking of CFX, GTA V and Witcher 3 are playing great.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Dying Light, AC Unity and FC4 are also working brilliant with the latest drivers.
> 
> Sadly, AMD are just too slow to release "good" profiles.


Did GTA V get fixed with CF on the latest drivers? I was using the GTA V drivers and encountered a lot of driving stutter (less stutter and more of a halt). In general it was silky smooth until you started driving; the faster you went, the worse it got.

I also couldn't get rid of the post-processing flicker in Dying Light, the blood splatter effect when you are low on health.


----------



## kizwan

Quote:


> Originally Posted by *pengs*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *BradleyW*
> 
> Speaking of CFX, GTA V and Witcher 3 are playing great.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Dying Light, AC Unity and FC4 are also working brilliant with the latest drivers.
> 
> Sadly, AMD are just too slow to release "good" profiles.
> 
> 
> 
> 
> 
> 
> 
> *Did GTA V get fixed with CF on the latest drivers? I was using the GTA V drivers and encountered a lot of driving stutter (less stutter and more of a halt). In general it was silky smooth until you started driving; the faster you went, the worse it got.*
> 
> I also couldn't get rid of the post-processing flicker in Dying Light, the blood splatter effect when you are low on health.
Click to expand...

Already fixed that since 15.4 beta.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> Hey madman!


Hey there mate, how's it going, eh??

Quote:


> Originally Posted by *BradleyW*
> 
> I paid £450 for each 290X new + 2 new WB's for £110 each!


I paid that for my new 290's, but in 'Stralian dollars, plus $140 each shipped for the blocks. But my 3rd 290 is the XFX unlocked-to-290x version, and I got that 2nd hand with a tri-fan cooler for $415AU shipped.

Quote:


> Originally Posted by *Ramzinho*
> 
> That's why I'm so glad about my investment in buying two 290Xs for $220 each, and two waterblocks + backplates, practically new, for $170
> Quote:
> 
> 
> 
> Originally Posted by *boredmug*
> 
> I'm contemplating buying two 290x's with ek full cover blocks for 500. Seems like a pretty good deal. OR should I wait a couple weeks and see what AMD has to offer. I'm currently running two 7950's under water.
Click to expand...

Those are real bang-for-buck deals. Well done, you two.











Quote:


> Originally Posted by *th3illusiveman*
> 
> So to get the reduced CPU overhead you need Windows 10? I've been running modded Windows 10 drivers off Guru3D and they seem to improve GTA5 performance quite a bit.
> 
> To install Windows 10 before July 29, what do you need?


This vvvvv in your inbox .......


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Hey madman!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hey there mate hows it going eh ??
Click to expand...

ok eh.


----------



## HOMECINEMA-PC

Sweeeeeeeet


----------



## By-Tor

Quote:


> Originally Posted by *boredmug*
> 
> I'm contemplating buying two 290x's with ek full cover blocks for 500. Seems like a pretty good deal. OR should I wait a couple weeks and see what AMD has to offer. I'm currently running two 7950's under water.


I'd jump on that deal...

I made the switch from a pair of 7950's on water to a single Powercolor 290x LCS (it came from the factory with an EK full cover block) a couple of months back and am very happy. Picked it up from Newegg as a refurb for $300 and hope they have another at that price so I can add a second card to my loop.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131543


----------



## boredmug

Quote:


> Originally Posted by *By-Tor*
> 
> I'd jump on that deal...
> 
> I made the switch from a pair of 7950's on water to a single Powercolor 290x LCS (came from the factory with a EK full cover block) a couple of months back and am very happy. Picked it up from newegg as a refurb for $300 and hope they have another at that price so I can add a second card to my loop.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814131543


I'm still contemplating the deal. I'm just worried if I buy them I'll regret it when AMD's new cards come out.


----------



## By-Tor

I doubt you could touch even one of the next-gen high-end AMD cards for that price, but I hear what you're saying. I was thinking about waiting also, but when I first saw that Powercolor LCS 290x show up on Newegg when it was released, I wanted one or two. The cards were priced in the $700-$800 range, way out of what I would even think of spending on a video card. So when I saw one of the cards show up on Newegg as a refurb for $300 shipped back on April 28th, I had to jump on it.

Good luck with your choice...


----------



## boredmug

Quote:


> Originally Posted by *boredmug*
> 
> I'm still contemplating the deal. I'm just worried if I buy them I'll regret it when AMD's new cards come out.


Tell me though, how does one 290x compare to two 7950's performance-wise?


----------



## rdr09

Quote:


> Originally Posted by *boredmug*
> 
> Tell me though, how does one 290x compare to two 7950's performance wise?


A 290X is about 30% faster than a 7950.

Two 7950s are about 30% faster than a 290X.
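Taking the two ~30% figures above at face value, a quick back-of-the-envelope check (illustrative numbers only, not benchmark data) shows they are mutually consistent with roughly 85% CrossFire scaling for the 7950 pair:

```python
# Normalize a single 7950 to 1.0 and chain the two ~30% claims.
single_290x = 1.3             # "290X is about 30% faster than a 7950"
cfx_7950 = 1.3 * single_290x  # "two 7950s are about 30% faster than a 290X"

# Implied CrossFire efficiency for the 7950 pair:
scaling = cfx_7950 / 2
print(round(cfx_7950, 2))  # 1.69 -> two 7950s are about 1.69x one 7950
print(int(scaling * 100))  # 84  -> i.e. roughly 85% scaling
```

That implied ~85% scaling is plausible for CrossFire in well-supported titles, so the two rules of thumb hang together.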


----------



## StarGuardianLux

So, I got a 290X from VisionTek.
But it's sitting in the box, doing nothing, waiting till I've got all the parts for my new build.

How much of a step up is it over a 7870?


----------



## BradleyW

Quote:


> Originally Posted by *StarGuardianLux*
> 
> So, I got a 290X from VisionTek.
> But it's sitting in the box, doing nothing, waiting till I've got all the parts for my new build.
> 
> How much of a step up is it over a 7870?


45%.


----------



## cephelix

Quote:


> Originally Posted by *kizwan*
> 
> I paid around £360 for a reference card, two for £720. Block only cost me £78 because I don't need to pay VAT.
> 
> 
> 
> 
> 
> 
> 
> 
> CPU utilization just slightly higher than BF4.
> Price in our region pretty much static & on the high side. Our market not big enough to influence the price.


Yeah, that's a real shame though. I'm wondering why the 290x was about 60 bucks cheaper in Japan than in Singapore when I went, though. Is it distance? Taxes?


----------



## Joe-Gamer

Do you guys think the AMD Fury can compete at a competitive price against the 980 Ti? Fury was meant to be the £1000 Titan X killer, not the £540 980 Ti (the Titan X and 980 Ti are within margin of error of each other).
So is the 390x meant to compete with the 980/970 instead?


----------



## flopper

Quote:


> Originally Posted by *Joe-Gamer*
> 
> Do you guys think that the AMD fury can compete at a competitive price against the 980 Ti? Fury was meant to be the £1000 Titan X killer, not the £540 980 Ti (titan x and 980ti are within margin of error of each other)
> So the 390x is meant to compete with 980/970 instead?


The 390x is for the 980.
Fiji will have at least 2 versions, priced to compete with the 980 Ti and Titan X.
There might be a Pro version at 980 Ti performance, but AMD might put it at $600 or so, which would be the card I'd buy instantly.


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> I paid around £360 for a reference card, two for £720. Block only cost me £78 because I don't need to pay VAT.
> 
> 
> 
> 
> 
> 
> 
> 
> *CPU utilization just slightly higher than BF4.*
> Price in our region pretty much static & on the high side. Our market not big enough to influence the price.


Wow, a single 290 at stock almost maxed out my i7 @ 4.5 (HT off) in BF4 at 1080p with 130% scale.


----------



## Arizonian

Quote:


> Originally Posted by *Joe-Gamer*
> 
> Do you guys think that the AMD fury can compete at a competitive price against the 980 Ti? Fury was meant to be the £1000 Titan X killer, not the £540 980 Ti (titan x and 980ti are within margin of error of each other)
> So the 390x is meant to compete with 980/970 instead?


Off topic here, honestly. Please take this type of discussion to the threads about Fury and the 980 Ti.


----------



## Joe-Gamer

Not that off topic, considering the 390x is an 8GB ramped-up 290x. The thread is very narrowly focused if I can't discuss the refresh of our cards, the 300 series.


----------



## elgreco14

Man, these 290x's suck... Every time in games my top card hits 94 degrees. So I tried the following things:

I switched the cards to see if it was position-related, and it is. I installed two new 140mm exhaust fans (Noctua) to replace my 200mm fan. Still no change, so then I tried repasting my top card. It doesn't reach 94 degrees as quickly as before, but after some minutes of playing it hits 94 degrees again.

Any other ideas?


----------



## BradleyW

Quote:


> Originally Posted by *elgreco14*
> 
> Man, these 290x's suck... Every time in games my top card hits 94 degrees. So I tried the following things:
> 
> I switched the cards to see if it was position-related, and it is. I installed two new 140mm exhaust fans (Noctua) to replace my 200mm fan. Still no change, so then I tried repasting my top card. It doesn't reach 94 degrees as quickly as before, but after some minutes of playing it hits 94 degrees again.
> 
> Any other ideas?


Watercooling. It really is the only option. That's why most 290X owners went water when they purchased 2 or more cards.


----------



## mfknjadagr8

Quote:


> Originally Posted by *elgreco14*
> 
> Man, these 290x's suck... Every time in games my top card hits 94 degrees. So I tried the following things:
> 
> I switched the cards to see if it was position-related, and it is. I installed two new 140mm exhaust fans (Noctua) to replace my 200mm fan. Still no change, so then I tried repasting my top card. It doesn't reach 94 degrees as quickly as before, but after some minutes of playing it hits 94 degrees again.
> 
> Any other ideas?


Is the fan speed set to 100%?


----------



## kizwan

Quote:


> Originally Posted by *cephelix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Price in our region pretty much static & on the high side. Our market not big enough to influence the price.
> 
> 
> 
> Yeah.that's a real shame though. I'm wondering why 290x in japan when i went was about 60 bucks cheaper than in singapore though. Is it distance?*taxes*?
Click to expand...

I think so. But I think it depends on other things too.
Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> *CPU utilization just slightly higher than BF4.*
> 
> 
> 
> Wow. a single 290 stock almost maxed my i7 @ 4.5 HT off in BF4 using 1080 130% scale.
Click to expand...

That is a comparison between BF4 @1080p (DX11) and GTA V @1440p. Usage is slightly higher in GTA V than in BF4.


----------



## PsYcHo29388

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *PsYcHo29388*
> 
> Awesome. I hope there will eventually be a nice boost in GTA5 and other CPU intensive games. Dropping down to 35 FPS from 60-70 is never fun.
> 
> 
> 
> you still have your cpu at stock? I read GTA5 is really cpu intensive.
Click to expand...

Yeah I can't do anything about that either, as my board doesn't seem to be capable of overclocking an 8350. It was great at handling my 960T when I had one though, and back in 2011 that's all that mattered to me.


----------



## Ha-Nocri

Quote:


> Originally Posted by *elgreco14*
> 
> Man, these 290x's suck... Every time in games my top card hits 94 degrees. So I tried the following things:
> 
> I switched the cards to see if it was position-related, and it is. I installed two new 140mm exhaust fans (Noctua) to replace my 200mm fan. Still no change, so then I tried repasting my top card. It doesn't reach 94 degrees as quickly as before, but after some minutes of playing it hits 94 degrees again.
> 
> Any other ideas?


Are your cards right on top of each other, or is there a free slot between them?


----------



## pengs

Quote:


> Originally Posted by *kizwan*
> 
> Already fixed that since 15.4 beta.


That's what I'm running

15.4 Beta /
14.502.1014
Quote:


> Originally Posted by *elgreco14*
> 
> Man, these 290x's suck... Every time in games my top card hits 94 degrees. So I tried the following things:
> 
> I switched the cards to see if it was position-related, and it is. I installed two new 140mm exhaust fans (Noctua) to replace my 200mm fan. Still no change, so then I tried repasting my top card. It doesn't reach 94 degrees as quickly as before, but after some minutes of playing it hits 94 degrees again.
> 
> Any other ideas?


Custom fan profile, especially with the reference design. Use Afterburner: take a few notes on temperature vs. fan speed % every 5°C or so, and up each point by 5-10%.
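That tweak amounts to shifting the whole temperature-to-fan-speed table upward. A minimal sketch, with made-up stock curve points standing in for whatever your own card actually reports in Afterburner:

```python
# Hypothetical stock temp (°C) -> fan speed (%) points noted from Afterburner.
stock_curve = {60: 30, 65: 35, 70: 40, 75: 48, 80: 55, 85: 65, 90: 80}

def bump_curve(curve, extra=10):
    """Raise every fan-speed point by `extra` percentage points, capped at 100%."""
    return {temp: min(100, pct + extra) for temp, pct in curve.items()}

custom_curve = bump_curve(stock_curve, extra=10)
print(custom_curve[85], custom_curve[90])  # 75 90
```

The resulting points would then be entered by hand into Afterburner's custom fan curve editor; trading more noise for lower peak temps on the reference cooler.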


----------



## boredmug

Lol... well, I have been contemplating buying two 290x's with EK full-cover blocks already installed on eBay. I was just messing around with payment options on eBay and I guess maybe I clicked the wrong button, but it looks like I'm the proud owner of two 290x's with EK full-cover blocks. $500 for both cards and blocks. yeeee


----------



## Ramzinho

Quote:


> Originally Posted by *boredmug*
> 
> Lol... well, I have been contemplating buying two 290x's with EK full-cover blocks already installed on eBay. I was just messing around with payment options on eBay and I guess maybe I clicked the wrong button, but it looks like I'm the proud owner of two 290x's with EK full-cover blocks. $500 for both cards and blocks. yeeee


amazing price.. hope they are 100% functional


----------



## boredmug

Quote:


> Originally Posted by *Ramzinho*
> 
> amazing price.. hope they are 100% functional


Yeah, I hear ya. I corresponded with the seller at length about the cards. He says they are about 8 months old and have only been used under water. When asked why he's selling them, he said that he does a lot of video editing and encoding for work and bought a Titan X because of the CUDA cores.


----------



## cephelix

Quote:


> Originally Posted by *kizwan*
> 
> I think so. But I think it depends on other things too.


Yeah, it stinks. A few guys were complaining about prices on the local forum as well. They'd rather buy a card on Amazon than one locally. Oh well, at least the card performs well, even though core temps are 85C under load for me. It plays DA:I real well, even though it's not maxed out settings-wise.


----------



## tsm106

Quote:


> Originally Posted by *boredmug*
> 
> Lol... well, I have been contemplating buying two 290x's with EK full-cover blocks already installed on eBay. I was just messing around with payment options on eBay and I guess maybe I clicked the wrong button, but it looks like I'm the proud owner of two 290x's with EK full-cover blocks. $500 for both cards and blocks. yeeee


It was bound to happen anyways.


----------



## boredmug

Quote:


> Originally Posted by *tsm106*
> 
> It was bound to happen anyways.


Yup... I know what's gonna happen now though. This is going to spur me to take on some credit card debt and upgrade my mobo/CPU and power supply. Oh, and that super wide-screen 34 inch 1440p monitor I've had my eye on. Lol

It's just addictive. I don't really even need the cards. Honestly, most games still play well with my overclocked 7950's, but damn, the full-cover blocks look so much nicer than my EK Supremacy bridge edition GPU-only blocks. A little sad that I'll probably be stuck at stock clocks because of the 850W PSU though.


----------



## Mega Man

Quote:


> Originally Posted by *Mega Man*
> 
> Slightly off topic
> 
> Super excited: the "new" Mega Man, if you will, is available for preorder (I'll let you do some googling as to why, if it interests you). I am super excited!~
> 
> see Mighty No. 9
> 
> http://www.mightyno9.com/


----------



## mush332

Hey guys, I was wondering if this was a good benchmark for my rig.

2 x tri-x r9 290x oc
i5 3570k


----------



## gupsterg

Quote:


> Originally Posted by *mush332*
> 
> hey guys i was wondering if this was a good benchmark for my rig.
> 
> 2 x tri-x r9 290x oc
> i5 3570k


OMG! I thought CF 290X would yield a greater score. Have you checked the 3DMark database to see what others get?

Below is an image from an i5 4690K at 4.4GHz (4.1GHz cache, XMP 2400MHz @ 1T) with an R9 290 Tri-X at 1100MHz GPU / 5900MHz RAM, using the 15.4 beta Catalyst.

What do you get in other benchmarks? Below is Heavensward, DX11, maximum settings, with the CPU/GPU as above.


----------



## elgreco14

Quote:


> Originally Posted by *BradleyW*
> 
> Watercooling. It really is the only option. That's why most 290X owners went water when they purchased 2 or more cards.


Pfff, that's too expensive for me...
Quote:


> Originally Posted by *mfknjadagr8*
> 
> is the fan speed set to a hundred percent?


Yep, to 100%, and I made a custom fan profile.

It throttles back a bit in game and I don't see a performance drop, but won't having a card at 94 degrees for so long damage it?


----------



## JourneymanMike

Quote:


> Originally Posted by *elgreco14*
> 
> .
> 
> Yep, to 100%, and I made a custom fan profile.
> 
> It throttles back a bit in game and I don't see a performance drop, but *won't having a card at 94 degrees for so long damage it*?


I've had three 290X reference cards; they all ran in the low to mid 90s, and they are made to run at these temps. I put them under water, and @ full load they are @ 50C - 55C.


----------



## Lao Tzu

Quote:


> Originally Posted by *mush332*
> 
> 
> hey guys i was wondering if this was a good benchmark for my rig.
> 
> 2 x tri-x r9 290x oc
> i5 3570k


I think it's fine; your graphics score is fine, and the total score is dragged down by the CPU score.

http://www.3dmark.com/fs/4914719


----------



## Dooderek

Quote:


> Originally Posted by *mush332*
> 
> 
> hey guys i was wondering if this was a good benchmark for my rig.
> 
> 2 x tri-x r9 290x oc
> i5 3570k


http://www.3dmark.com/fs/4908603

You could use a CPU overclock


----------



## tsm106

Quote:


> Originally Posted by *JourneymanMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *elgreco14*
> 
> .
> 
> Yep, to 100%, and I made a custom fan profile.
> 
> It throttles back a bit in game and I don't see a performance drop, but *won't having a card at 94 degrees for so long damage it*?
> 
> 
> 
> I've had three 290X Reference cards, they all ran in the low to mid 90's, they are made to run at these temps. I put them under water @ full load they are @ 50c - 55c
Click to expand...

Running ref cards at max temp for long periods will degrade them. This is why we generally avoid old mining cards; they've had their usable lifespan sucked out of them in a year's time. 4-5 years of use in 1 year, etc. Running them under water slows the degradation immensely. I've got 7970s bought at release that still clock to 1400MHz core on water today. They're almost 4 years old now.

Quote:


> Originally Posted by *mush332*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> hey guys i was wondering if this was a good benchmark for my rig.
> 
> 2 x tri-x r9 290x oc
> i5 3570k


It's on the low side on both gscore and physics.


----------



## mush332

I
Quote:


> Originally Posted by *Dooderek*
> 
> http://www.3dmark.com/fs/4908603
> 
> You could use a CPU overclock


I've got an H240-X cooling my CPU. What do you think I can get my speed to?


----------



## kizwan

Quote:


> Originally Posted by *mush332*
> 
> I
> Quote:
> 
> 
> 
> Originally Posted by *Dooderek*
> 
> http://www.3dmark.com/fs/4908603
> 
> You could use a CPU overclock
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've got an H240-X cooling my CPU. What do you think I can get my speed to?
Click to expand...

I think 4.5GHz at least .


----------



## Eudisld15

Hey, does anyone know the size and type of the heatsink mounting screws for the 290x heatsink? I lost a few and need replacements. I can't find them anywhere; they're the 4 screws that hold the heatsink to the GPU.


----------



## pengs

Quote:


> Originally Posted by *elgreco14*
> 
> Yep, to 100%, and I made a custom fan profile.
> 
> It throttles back a bit in game and I don't see a performance drop, but won't having a card at 94 degrees for so long damage it?


It was designed to throttle for protection at that temperature, but at 100% fan speed that's a bit odd. What's the idle temp?


----------



## aaroc

When I bought my first 1440p monitor I decided to upgrade from CFX 7870 to CFX R9 290, because the 7870s were a PowerPoint presentation in any game at 1440p resolution. You will be good with your new R9 290x CFX for wide 1440p.


----------



## BradleyW

Quote:


> Originally Posted by *elgreco14*
> 
> Pfff, that's too expensive for me...
> 
> Yep, to 100%, and I made a custom fan profile.
> 
> It throttles back a bit in game and I don't see a performance drop, but won't having a card at 94 degrees for so long damage it?


Puff all you want. It is the only option that resolves the issue. Either that, or point a load of 3000rpm fans at the cards, or take a card out.


----------



## Jflisk

Quote:


> Originally Posted by *Eudisld15*
> 
> Hey, anyone know the size and type of heatsink mounting screws for the 290x heat sink. Lost a few, need replacements. Can't find them anywhere and they're 4 screws that hold the heatsink to gpu.


You can try these; they look like M3 screws:
http://www.moddiy.com/products/3mm-Video-%7B47%7D-Graphics-Card-Screws.html?gclid=Cj0KEQjw7r-rBRDE_dXtgLz9-e4BEiQATeKG7PZhhOiUNtW3q-E4M524fXqKEev6YDvyPOjCccRcViQaAs2p8P8HAQ

Take an existing screw to Lowe's or Home Depot and try it in their thread tester to make sure they're M3 screws.


----------



## tsm106

Quote:


> Originally Posted by *Jflisk*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Eudisld15*
> 
> Hey, anyone know the size and type of heatsink mounting screws for the 290x heat sink. Lost a few, need replacements. Can't find them anywhere and they're 4 screws that hold the heatsink to gpu.
> 
> 
> 
> You can try these; they look like M3 screws:
> http://www.moddiy.com/products/3mm-Video-%7B47%7D-Graphics-Card-Screws.html?gclid=Cj0KEQjw7r-rBRDE_dXtgLz9-e4BEiQATeKG7PZhhOiUNtW3q-E4M524fXqKEev6YDvyPOjCccRcViQaAs2p8P8HAQ
> 
> Take an existing screw to Lowe's or Home Depot and try it in their thread tester to make sure they're M3 screws.
Click to expand...

You probably won't find the right threading at Home Depot or Lowe's. You'd have to go to McMaster-Carr or the like for M2 and smaller screws, though I think the screws are closer to M1.4 for the reference cooler. Also, it would probably be easier to get this; IIRC it's the same bracket.

http://www.swiftech.com/CrossBracket7900.aspx


----------



## JourneymanMike

Quote:


> Originally Posted by *Eudisld15*
> 
> Hey, anyone know the size and type of heatsink mounting screws for the 290x heat sink. Lost a few, need replacements. Can't find them anywhere and they're 4 screws that hold the heatsink to gpu.


Do you mean these four screws?


----------



## elgreco14

Quote:


> Originally Posted by *BradleyW*
> 
> Puff all you want. It is the only option to resolve the issue. Either that or point a load of 3000rpm fans at the cards or take a card out.


Watercooling is just too expensive and I'd rather use the money for something else. But I think I may have found a solution: using a PCIe riser cable and putting the second card somewhere else in the case.


----------



## mush332

Quote:


> Originally Posted by *kizwan*
> 
> I think 4.5GHz at least .


What's a safe voltage?


----------



## Jflisk

Quote:


> Originally Posted by *tsm106*
> 
> You probably won't find the right threading at Home Depot or Lowe's. You'd have to go to McMaster-Carr or the like for M2 and smaller screws, though I think the screws are closer to M1.4 for the reference cooler. Also, it would probably be easier to get this; IIRC it's the same bracket.
> 
> http://www.swiftech.com/CrossBracket7900.aspx


The above is your best bet, and yes, the M3 is for a backplate, not for mounting the GPU to the cooler, by the looks of things.


----------



## BradleyW

Quote:


> Originally Posted by *elgreco14*
> 
> Watercooling is just too expensive and I'd rather use the money for something else. But I think I may have found a solution: using a PCIe riser cable and putting the second card somewhere else in the case.


Be careful. I've seen my fair share of riser issues, from malfunction to reduced performance.


----------



## Dooderek

Since I finished my build yesterday, I wanted to test out how much heat these rads can handle. I finally broke into the 20k, but at these clock speeds I am starting to artifact. I briefly tried 1200/1375 but it artifacted too much during the intro scene.

1180/1375

http://www.3dmark.com/3dm/7214925


----------



## JourneymanMike

Quote:


> Originally Posted by *Dooderek*
> 
> Since I finished my build yesterday, wanted to test out how much heat these rads can handle. Finally broke into the 20k but at these clock speeds I am starting to artifact. I briefly tried 1200/1375 but it artifact-ed too much during the intro scene.
> 
> 1180/1375
> 
> http://www.3dmark.com/3dm/7214925


Not bad! For the first run...


----------



## Eudisld15

Quote:


> Originally Posted by *JourneymanMike*
> 
> Do you mean these four screws?


Yup


----------



## pengs

Selling my second Tri-X 290X OC here: Hynix memory per Hawaiinfo12, stock 1040/1300, and upper 70s as far as ASIC quality goes (78% IIRC).
Thought I'd let that be known in a proper thread.

With that set in stone...
is there any consensus about a _'safe'_ voltage for Hawaii?
I've seen 1.20v thrown around, 1.25v also. Currently showing 1.130-1.180v in AB, 1.150v in GPU-Z with +50mV. The VDDC is about 1.108 stock and 1.148v with the +50 at load.

Does anyone have a general idea where Hawaii falls when it comes to stock VDDC? I'm trying to keep my finger from pulling the slider to +100.
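To put numbers on it, here's how those readings relate arithmetically. This is just a sketch; the droop interpretation is inferred from the readings above, not from any spec.

```python
# Sketch of how the VDDC readings relate. The droop figure is
# inferred from the observed numbers above, not from a datasheet.
stock_load_vddc = 1.108            # observed stock VDDC under load (V)
offset_mv = 50                     # +50mV slider in Afterburner

load_vddc = stock_load_vddc + offset_mv / 1000.0
print(round(load_vddc, 3))         # 1.158
# The actual reading was ~1.148V under load with +50mV; the ~10mV
# gap would be load-line droop (vdroop) under load.
```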


----------



## Ized

Quote:


> Originally Posted by *elgreco14*
> 
> Watercooling is just too expensive and I'd rather use the money for something else. But I think I may have found a solution: using a PCI riser cable to put the second card somewhere else in the case.


How about a cheapo all-in-one sealed water cooling unit? You can get the barebones bracket (7000 series?) very cheap, and "reconditioned" AIO units pop up all the time too, but I'm not sure I would touch them.


----------



## JourneymanMike

Quote:


> Originally Posted by *Eudisld15*
> 
> Yup


PM on the way...


----------



## MIGhunter

So, I have a reference 290X in my computer now. I just bought a 295X2 because I was going to build a new computer, but that is probably going to take a back burner. Do you guys think I'd be able to run my 290X in my son's computer: 500 watt PSU, ASRock A780FullHD MB and an AMD 7750 Kuma CPU?


----------



## BradleyW

Quote:


> Originally Posted by *MIGhunter*
> 
> So, I have a reference 290X in my computer now. I just bought a 295X2 because I was going to build a new computer, but that is probably going to take a back burner. Do you guys think I'd be able to run my 290X in my son's computer: 500 watt PSU, ASRock A780FullHD MB and an AMD 7750 Kuma CPU?


How many amps on the 12v rail?


----------



## MIGhunter

Quote:


> Originally Posted by *BradleyW*
> 
> How many amps on the 12v rail?


Not sure, I'll have to look at the sticker and hopefully it says. What should I be looking for?
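As a rough back-of-envelope for what to look for on the sticker: add up the big 12V loads, add some headroom, and divide by 12V. All the wattage figures below are assumptions for illustration, not official TDPs; check the actual card and CPU specs against the PSU label.

```python
# Back-of-envelope 12V amperage estimate. Wattages here are
# assumptions for illustration, not official TDPs.
def amps_needed(component_watts, headroom=1.25):
    """Estimate 12V rail amps needed, with ~25% headroom for spikes."""
    total_watts = sum(component_watts) * headroom
    return total_watts / 12.0

# Hypothetical system: ~290W GPU under load, ~95W CPU, ~50W rest.
print(round(amps_needed([290, 95, 50]), 1))  # 45.3
```

If the label's combined 12V amperage is comfortably above that kind of number, the PSU is probably in the right ballpark.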


----------



## Roy360

.


----------



## aaroc

My previous PC iteration had an FX-8350 and 3 R9 290s. At full load it reached up to 1100W according to my UPS (beeping like hell about the overload) and the Corsair AX1200i. It never reset or crashed when overloading the UPS, but I always stopped what I was doing: gaming, benching, whatever. The third time, I just connected my PC directly to the wall and only connected my monitors to the UPS.

Did you connect the extra mobo power connectors? The only hang I had with my current PC incarnation (FX-9590, 4 R9 290Xs and 2 AX1200is) was because I forgot to connect the extra SATA power connector when finishing the rebuild of the PC.


----------



## elgreco14

Quote:


> Originally Posted by *Ized*
> 
> How about a cheapo All in one sealed water cooling unit? You can get the barebones bracket (7000 series?) very cheap, "reconditioned" AIO units popup too all the time but I'm not sure I would touch them.


Any links or examples? I only found the NZXT Kraken G10, and then you can use a CPU cooler to cool it. Is it also possible that my Mugen 3 is blocking the air from being sucked up by my two exhaust fans? When I hold my hand under the Mugen 3, it feels like the air is pushed down onto the 290X.


----------



## Ized

Quote:


> Originally Posted by *elgreco14*
> 
> Any links or examples? I only found the NZXT Kraken G10, and then you can use a CPU cooler to cool it.


This cost me £12 on Amazon UK last year, but I see it is no longer stocked. It was literally 4 nuts + screws along with the bracket, nothing else. It is marketed for the 7000-series cards but it still fits perfectly. I believe a few people have copied the design from a guy on this site, if I remember right.



It could be bought with or without the side support for 92mm fan.

http://chilledpc.co.uk/shop/index.php?route=product/product&filter_name=pulse+modding&sort=p.price&order=DESC&page=3&product_id=1926

http://chilledpc.co.uk/shop/index.php?route=product/product&filter_name=pulse+modding&sort=p.price&order=DESC&page=3&product_id=1927

You need to consider VRM cooling/heatsinks as well, even with the G10 bracket, though at least that one comes with a fan. This is also a DIY VRM option.

People even use zipties/cableties instead of buying a bracket, but that's a little too ghetto even for me.


----------



## kizwan

This was posted in a local forum. I'm posting it here because I want to hear the response here (especially the driver overhead part). @tsm106, care to respond?
Quote:


> ........I think you're confused about something.
> 
> Allow me to elaborate;
> First thing first, you need to differentiate between driver and API.
> Easier if I grab some definition from wikipedia.
> 
> 1) In computing, a device driver (commonly referred to as a driver) is a computer program that operates or controls a particular type of device that is attached to a computer. A driver provides a software interface to hardware devices, enabling operating systems and other computer programs to access hardware functions without needing to know precise details of the hardware being used.
> 
> I was talking about AMD catalyst driver overhead.
> 
> 2) In computer programming, an application programming interface (API) is a set of routines, protocols, and tools for building software applications. An API expresses a software component in terms of its operations, inputs, outputs, and underlying types. An API defines functionalities that are independent of their respective implementations, which allows definitions and implementations to vary without compromising each other. A good API makes it easier to develop a program by providing all the building blocks. A programmer then puts the blocks together.
> 
> Mantle and DirectX are API.
> 
> Now, DirectX 9 - 11 are known to have some API overhead. The purpose of Mantle and the upcoming DirectX 12 is to reduce/optimize the API overhead by making low-level access available. And yes, as you said, Mantle successfully reduces the API CPU overhead.
> 
> Next, the second part is the DRIVER CPU overhead. This is where AMD is at its worst. Compared to the nvidia driver, the AMD Catalyst driver overhead issue in DX9 - 11 games is quite serious.
> You can find more about it at guru3d forum. http://forums.guru3d.com/showthread.php?t=398858
> 
> When AMD released its R9 285 (unofficially known as GCN 1.2), there were some performance issues even when using the Mantle renderer. This was solved after AMD later optimized the R9 285 driver overhead.
> 
> Edit: fixed some grammar and wording mistakes.


----------



## HOMECINEMA-PC

So the driver's API needs moar programming then


----------



## tsm106

Quote:


> Originally Posted by *kizwan*
> 
> This was posted in a local forum. I'm posting it here because I want to hear the response here (especially the driver overhead part). @tsm106, care to respond?


I tend to ignore the driver overhead movement. The Digital Foundry piece is a piece of work in itself, when they don't know the difference between a 4790K and a 4970K. Is it a wonder when most of the cited games are GameWorks titles? Hell, even Firaxis' CPU utilization is bogus. I've tried to hit their usage numbers and it's impossible; their Civ games are multi-threaded, but it's just impossible to get over an average of 30% usage across your cores. *Scratches head* at how they did it. There are similar threads here bemoaning AMD's CPU driver overhead, so you don't really have to go far to find the same propaganda.


----------



## Jflisk

I know this is off base a little bit, but I usually get my best answers here. Is there such a thing as an absolute delta? Like, no matter how many rads you throw at the blocks, they will go no lower (and I don't mean ambient). I basically have enough radiator at this point to cool a Chevy 350, but my FX950 and 3x R9 290X are at 57C full tilt: GPUs 50C, CPU 57C max. I know I still have a little air in the loop. I created a rad box for the bottom of my case; the temps probably will not drop by more than 1-2C when the air settles. Rads: 240 Monsta 80 - 280 - 240 - 120, with dual D5s pushing it along. Thanks in advance.


----------



## tsm106

Quote:


> Originally Posted by *Jflisk*
> 
> I know this is off base a little bit, but I usually get my best answers here. Is there such a thing as an absolute delta? Like, no matter how many rads you throw at the blocks, they will go no lower (and I don't mean ambient). I basically have enough radiator at this point to cool a Chevy 350, but my FX950 and 3x R9 290X are at 57C full tilt: GPUs 50C, CPU 57C max. I know I still have a little air in the loop. I created a rad box for the bottom of my case; the temps probably will not drop by more than 1-2C when the air settles. Rads: 240 Monsta 80 - 280 - 240 - 120, with dual D5s pushing it along. Thanks in advance.


That's not what I would call a lot of radiator; what I run is closer to more than enough. You don't have enough cooling, granted your install is up to snuff. For example, in gaming from stock to a mild OC of 1100/1500, my GPUs don't go over 45C. Typical GPU idle is 33-35C, so I have roughly a 10C delta. And when benching with +300mV and fans at max, with the CPU at 5.0 or higher, my GPUs can hit 60C under a sustained benchmark. Hmm, other things: your pumps are a lil weak for multi-rad and triple-GPU setups. A D5 would not have been my choice for this sort of head pressure load. Also, adding your rads up, that's only 880mm if I am doing the math correctly, which makes it 220mm per block (1 CPU + 3 GPUs). As for delta, getting as close to a 10C delta as possible is ideal, but getting under a 10C delta is not worth the costs.
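The rad arithmetic above can be spelled out like this. The "mm of radiator per block" split is a common forum heuristic for sizing loops, not any kind of spec:

```python
# Rad-per-block arithmetic from the post above. Nominal fan-size mm,
# ignoring radiator thickness; the per-block split is a heuristic.
rads_mm = [240, 280, 240, 120]  # the four rads in the loop
blocks = 1 + 3                  # 1 CPU block + 3 GPU blocks

total = sum(rads_mm)
per_block = total / blocks
print(total, per_block)  # 880 220.0
```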


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> So the driver's API needs moar programming then












Hey madman, I see you're active a lot in the car threads. Fast cards & fast cars really suit you.








Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> This was posted in a local forum. I'm posting it here because I want to hear the response here (especially the driver overhead part). @tsm106, care to respond?
> 
> 
> 
> I tend to ignore the driver overhead movement. The Digital Foundry piece is a piece of work in itself, when they don't know the difference between a 4790K and a 4970K. Is it a wonder when most of the cited games are GameWorks titles? Hell, even Firaxis' CPU utilization is bogus. I've tried to hit their usage numbers and it's impossible; their Civ games are multi-threaded, but it's just impossible to get over an average of 30% usage across your cores. *Scratches head* at how they did it. There are similar threads here bemoaning AMD's CPU driver overhead, so you don't really have to go far to find the same propaganda.
Click to expand...

Thanks. Well, the one I'm scratching my head over is when the poster keeps making statements like "Mantle or DX12 is useless if AMD doesn't fix the driver overhead".


----------



## Sempre

Quote:


> Originally Posted by *tsm106*
> 
> Hmm other things, your pumps are a lil weak for multi rad and triple gpu setups. D5 would not have been my choice for this sort of head pressure load.


So is a DDC pump better for his setup?


----------



## blackhole2013

Can anybody make a custom BIOS for my PowerColor Devil 13 290X2? It's got so much power being held back by AMD PowerTune and boost; I need AMD boost off... I want to release the devil in it...


----------



## BugBash

If you add me to the list, next month I will hopefully be under water!


----------



## Mega Man

Congrats, love seeing cards that are blocked.


----------



## jagdtigger

Can someone upload the thermal pad diagram for the VID-AR290X (rev 1.1) water block? I can't find it and want to order some new pads...


----------



## godiegogo214

I like how they maintained the Asus logo... nice.


----------



## Ized

Quote:


> Originally Posted by *blackhole2013*
> 
> Can anybody make a custom BIOS for my PowerColor Devil 13 290X2? It's got so much power being held back by AMD PowerTune and boost; I need AMD boost off... I want to release the devil in it...


I was experimenting with this in a simple overclocking app that I bodged together; the nearest I got was forcing the 2D clocks and 3D clocks to be the same.

Do any of the fancy overclocking apps allow that? If so give it a try.

(keep an eye on the 2D voltage, in my case it went through the roof - no droop I guess?)


----------



## tsm106

Quote:


> Originally Posted by *Sempre*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Hmm other things, your pumps are a lil weak for multi rad and triple gpu setups. D5 would not have been my choice for this sort of head pressure load.
> 
> 
> 
> So is a DDC pump better for his setup?
Click to expand...

A DDC pump like a 35x2 would have been a better choice because his loop has a lot of restriction from so many rads. His flow rate is probably pretty low for it to reach almost 60C on his GPUs at a mild overvoltage. Each time you add something to the loop, that addition causes restriction, and it all adds up.


----------



## BradleyW

Does anyone know how to stop Windows 10 from downloading AMD drivers through Windows Update?


----------



## Mega Man

Right click and hide the update, each time.


----------



## fyzzz

http://www.3dmark.com/fs/5028162 my xfx 290 at 1130/1450 on stock volts.


----------



## Widde

Is there something wrong with the drivers released for Windows 10 via Windows Update? Idling at 59C at stock speeds and voltage, with 46% fan speed.


----------



## xKrNMBoYx

My 290 crapped out. Just ordered a 290X off of Amazon instead of waiting for the 300-series (as I plan on getting a 400 Series anyway). Time to change ownership


----------



## Mmonge

Quote:


> Originally Posted by *BradleyW*
> 
> Does anyone know how to stop Windows 10 from downloading AMD drivers through Windows Update?


Search for "Change device installation settings".

Mark: "No, let me choose what to do" > "Never install driver software from Windows Update" > "Automatically get the device app and info provided by your device manufacturer".

And you're done.


----------



## Roy360

I was hoping someone here could help me debug my system.

I've got 3 R9 290 cards. 1 ASUS DCII Card, and two reference ones. All 3 of them are flashed with the reference XFX bios.

All three cards are watercooled, and for the time being I'm using Card 1 and Card 2 in crossfire to play Witcher 2 (GoG).

Gaming in 5760x1080 and using WideScreen Fixer.

Roughly 30mins into Witcher 2 there's a quick horizontal line that goes through all 3 screens (might just be screen tearing) and then the video freezes. Audio continues to play, so if I was in a cut scene I can still hear them play. However the PC is no longer responsive. I can't CTRL+ALT+DEL or ALT+F4 or anything. Usually I have to restart.

On the next restart, one of three things will happen:

1) Eyefinity will be disabled, and I will have to re-enable it.
2) AMD drivers are not found and I'll have to re-install them. <<-- Sometimes this re-installation will blue screen and take a few attempts
3) Windows will say "Please wait while we set things up for you", install apps, and eventually load an empty desktop. <-- Restarting after a while usually fixes this

Here are some temps just before the crash:


Note: My case is a heatbox with very poor airflow. From what I've measured, the internal temperature varies from 40-60 degrees; ambient is 20.

It seems I disabled bsod dumps. So I don't have any info on the BSOD, but I've re-enabled it for now.

For now I'm switching the crossfire to cards 1 and 3 (just to rule out insufficient power). I also remembered that my RAM and CPU are overclocked, so I'm going to set them back to stock, but I highly doubt they are the issue since they've been overclocked for months/years now.


----------



## Gabkicks

Anyone playing GTA V with crossfire? Should I force AFR or 1x1 or something? With the default profile I get stutter; running with crossfire disabled is pretty smooth.


----------



## xKrNMBoYx

Trying to decide if I should go 290X Xfire or not. I don't see myself going any higher than 1080P any time soon unless we're talking of VSR. That said do you guys think 290X Xfire should/would last a good few years for 1080P gaming?


----------



## cephelix

Quote:


> Originally Posted by *xKrNMBoYx*
> 
> Trying to decide if I should go 290X Xfire or not. I don't see myself going any higher than 1080P any time soon unless we're talking of VSR. That said do you guys think 290X Xfire should/would last a good few years for 1080P gaming?


I'm in the same boat as you, except with a 290 @ 1200p. Though my decision is made somewhat easier since my OC already puts my card at 85C, and I don't think thermals will allow for the addition of a second card without going to water.


----------



## xKrNMBoYx

Quote:


> Originally Posted by *cephelix*
> 
> i'm on the same boat as you except with 290 @ 1200p. Though my decision is made somewhat easier since my OC already puts my card at 85C and i don't think thermals will allow for the addition of a second card without going to water


Yup, before my 290 crapped out I had it OCed to 1100MHz at stock voltage with no visible artifacts or throttling. With my fan curve the GPU maxed out around 73C in games. Killed my card trying to do cheap water cooling for it.

With the 290X I have to decide whether to use stock cooling or water. Then I need a better PSU, too.


----------



## rdr09

Quote:


> Originally Posted by *Roy360*
> 
> I was hoping someone here could help me debug my system.
> 
> I've got 3 R9 290 cards. 1 ASUS DCII Card, and two reference ones. All 3 of them are flashed with the reference XFX bios.
> 
> All three cards are watercooled, and for the time being I'm using Card 1 and Card 2 in crossfire to play Witcher 2 (GoG).
> 
> Gaming in 5760x1080 and using WideScreen Fixer.
> 
> Roughly 30mins into Witcher 2 there's a quick horizontal line that goes through all 3 screens (might just be screen tearing) and then the video freezes. Audio continues to play, so if I was in a cut scene I can still hear them play. However the PC is no longer responsive. I can't CTRL+ALT+DEL or ALT+F4 or anything. Usually I have to restart.
> 
> On the next restart, one of three things will happen:
> 
> 1) Eyefinity will be disabled, and I will have to re-enable it.
> 2) AMD drivers are not found and I'll have to re-install them. <<-- Sometimes this re-installation will blue screen and take a few attempts
> 3) Windows will say "Please wait while we set things up for you", install apps, and eventually load an empty desktop. <-- Restarting after a while usually fixes this
> 
> Here are some temps just before the crash:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Note: My case is a heatbox. Very poor airflow. From what I've measured the internal heat varies from 40-60 degrees. Ambient is 20.
> 
> It seems I disabled bsod dumps. So I don't have any info on the BSOD, but I've re-enabled it for now.
> 
> For now I'm switching the crossfire to cards 1 and 3. (Just to rule out insufficient power) I also remembered that my RAM and CPU are overclocked, so I'm going to return them back to normal, but I highly doubt they are issue since they've been overclocked for months/years now.


Not sure if it is related, but your third GPU was not doing anything. You may be right: power.

Or, it could be the third GPU. I suggest testing it by itself: disable crossfire, unplug power from the other 2 GPUs, plug the monitor into the third, and test. That is, if AB is indicating the right GPUs.


----------



## cephelix

Quote:


> Originally Posted by *xKrNMBoYx*
> 
> Yup before my 290 crapped out I had it OCed to 1100MHz stock voltage with no visible artifacts, throttling. With my fan curve the GPU maxed out ariund 73C in games. Killed my card trying to do cheap water cooling for it.
> 
> With the 290X I have to decide whether I use stock cooling, or water. Then I need a better PSU too.


How'd it die? Leakage or overheating?
I was thinking of going back to watercooling next year, but the cost... O.M.G, the cost...


----------



## xKrNMBoYx

Quote:


> Originally Posted by *cephelix*
> 
> How'd it die? Leakage or overheating?
> I was thinking of going back to watercooling next year, but the cost... O.M.G, the cost...


Nothing related to watercooling IMO. It was a freak accident where a small chip next to the VRMs burst in sparks/smoke when I tried to turn the PC on. Not sure how that can happen with a proper installation and good contact from thermal pads.

You're right that watercooling is a tad expensive especially when you want top quality and go all out.


----------



## kizwan

Quote:


> Originally Posted by *Gabkicks*
> 
> anyone playing gta V with crossfire? should i force AFR or 1x1 or something? with the default profile, i get stutter. running with crossfire disabled is pretty smooth


I'm playing GTA V with crossfire and I didn't get stutter, but from time to time the frames did slow down a little; not so much that it killed the mood, though. I did not change anything in CCC. Does yours stutter badly?
Quote:


> Originally Posted by *xKrNMBoYx*
> 
> Trying to decide if I should go 290X Xfire or not. I don't see myself going any higher than 1080P any time soon unless we're talking of VSR. That said do you guys think 290X Xfire should/would last a good few years for 1080P gaming?


With a pair of 290/290Xs, you want to move up to 1440p at least. If you're staying at 1080p, it's better to stick with one card.

When I say 1440p & 1080p, I mean the resolution, not the monitor, but I prefer a true 1440p monitor over 1440p VSR.


----------



## boredmug

Quote:


> Originally Posted by *Gabkicks*
> 
> anyone playing gta V with crossfire? should i force AFR or 1x1 or something? with the default profile, i get stutter. running with crossfire disabled is pretty smooth


Try making about a 20 gig pagefile before you do anything with your graphics cards. Sounds stupid, but I bet it will fix your problem. I use crossfire with no issues; the pagefile fixed the problem you are having.


----------



## xKrNMBoYx

Quote:


> Originally Posted by *kizwan*
> 
> I'm playing GTAV with crossfire & I didn't get stutter but from time to time the frames did slow down a little but not too much that can kill the mood. I did not change anything in CCC though. Is yours stutter badly?
> With a pair of 290/290X, you want to move up to 1440p at least. If you're staying at 1080p, better to stick with one card though.
> 
> When I said 1440p & 1080p, I mean the resolution, not the monitor but I prefer true 1440p monitor than 1440p VSR.


I'm sure I'll get a higher resolution monitor in the future but at the moment 1080p native is fine for me. 4K TV or Eyefinity Setup is something I'm a bit interested in. For now VSR will do though.


----------



## semencmoz

let me join the club.


Spoiler: Warning: Spoiler!


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Gabkicks*
> 
> anyone playing gta V with crossfire? should i force AFR or 1x1 or something? with the default profile, i get stutter. running with crossfire disabled is pretty smooth


Jack up your pagefile!


----------



## cephelix

Quote:


> Originally Posted by *semencmoz*
> 
> let me join the club.
> 
> 
> Spoiler: Warning: Spoiler!


How're the temps?
Quote:


> Originally Posted by *xKrNMBoYx*
> 
> Nothing related to watercooling IMO. It was a freak accident where a small chip next to the VRMs burst in sparks/smoke when I tried to turn the PC on. Not sure how that can happen with a proper installation and good contact from thermal pads.
> 
> You're right that watercooling is a tad expensive especially when you want top quality and go all out.


Ouch. Was it not under warranty?


----------



## semencmoz

Quote:


> Originally Posted by *cephelix*
> 
> How're the temps?


With default clocks it's below 75C under heavy load.
With a ~5% overclock it might hit 78-80, so I adjusted fan speeds to cool it down. It didn't seem to be very stable with a 5% overclock and a <50 offset, btw.


----------



## xKrNMBoYx

Quote:


> Originally Posted by *cephelix*
> 
> How're the temps?
> Ouch.was it not under warranty?


No, I bought the 290 second hand, and even if I still had warranty, I removed the warranty sticker while removing the stock cooler. Did not save the sticker, unfortunately. Also, I don't have the original accessories, and the packaging has been modified for other uses.

All is good though. I found someone interested in the GPU for $60, and my 290X will arrive by the end of the week. Maybe a second one the week after.


----------



## kizwan

Quote:


> Originally Posted by *boredmug*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gabkicks*
> 
> anyone playing gta V with crossfire? should i force AFR or 1x1 or something? with the default profile, i get stutter. running with crossfire disabled is pretty smooth
> 
> 
> 
> Try and make about a 20 gig pagefile before you do anything with your graphic cards. Sounds stupid, but I bet it will fix your problem. I use crossfire with no issues. Pagefile fixed the problem you are having for me
Click to expand...

Yeah, mine is already set to the recommended total paging file size, which in my case was 24GB.
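For reference, the "recommended" total paging file size Windows shows is roughly the old 1.5x-installed-RAM rule of thumb (exact behaviour varies by Windows version). A 24GB recommendation would line up with 16GB of RAM, which is an assumption here:

```python
# Classic 1.5x-RAM pagefile rule of thumb. The 16GB RAM figure is
# an assumption for illustration; Windows versions differ in the
# exact recommendation they compute.
def recommended_pagefile_gb(ram_gb, factor=1.5):
    return ram_gb * factor

print(recommended_pagefile_gb(16))  # 24.0
```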


----------



## Arizonian

Quote:


> Originally Posted by *BugBash*
> 
> If you add me to the list, next month I will hopefully be under water!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## cephelix

Quote:


> Originally Posted by *semencmoz*
> 
> With default clocks it's below 75C under heavy load.
> With a ~5% overclock it might hit 78-80, so I adjusted fan speeds to cool it down. It didn't seem to be very stable with a 5% overclock and a <50 offset, btw.


Well, the trick I find with AIB coolers is that you need good exhaust fans going. I strapped a Gentle Typhoon to the unused PCIe slots under the card and it dropped temps quite a bit. If your case has a side panel fan, that should work well. Of course, all this depends on quite a few factors; just something to keep in mind in case you ever want to lower temps some.
Quote:


> Originally Posted by *xKrNMBoYx*
> 
> No I bought the 290 second hand, and even if I still had warranty I removed the warranty sticker while removing the stock cooler. Did not save the sticker unfortunately. Also I dont have original accessories and the packaging have been modified for other uses.
> 
> All is good though. I found someone interested in the GPU for $60 and my 290x will arrive by the end of the week. Maybe a second one the week after


Not too shabby to be able to sell it for $60. I was hoping to purchase the 290X Lightning when I was in the market for a GPU, but prices were too steep and my pockets too shallow.


----------



## red8

Hi, I'm new to this! I've never OC'ed my GPU and I've just gotten it on water. I have an Asus Sabertooth 990FX mobo, FX-8320 CPU and EVGA 705W B2 PSU, all on a MO-RA3 420 Pro rad and a D5 Vario pump. I was wondering if anyone could help walk me through / teach me to OC my MSI 290X BF4 Edition GPU (if any of that matters). I'm pretty wary just because I've never done it, and I don't want to damage it and throw away $300+. So all help is appreciated. Also, I have MSI Afterburner, if that is useful at all?


----------



## Widde

Does the "scale image to full panel size" option work for anyone here in CS:GO? It started refusing to stretch 4:3 to 16:9 out of the blue, and no matter what I do I can't make it work.


----------



## specopsFI

Hi guys! I've been waiting for the 390/390X launch patiently ever since returning my GTX 970 (because of reasons, u know). More and more it's looking like those new cards will actually be Hawaii, still, just with higher clocks and 8GB VRAM. I'm not in need of more than 4GB of memory though and the new cards might just end up being worse bang for my buck than the clearance sale 290/290X's.

So here's the situation. I can get a reasonable deal right now for a Sapphire 290 Tri-X New Edition or an MSI 290 Gaming. I've had the original 290X Tri-X before and it was a nice card, but contrary to general belief, the VRM temps weren't that great. I had to ramp the cooler way up to keep it below 90 degrees with OV/OC. Now the Gaming cards, I've seen a lot of threads of temp problems. Most of those seem to be about poor TIM. In reviews, 290(X) Gaming cards have actually had very nice VRM temps even if the GPU temp / fan noise department isn't as good as Tri-X.

So, given similar pricing, which one would you go for right now? Is the Tri-X New Edition any better regarding the VRM temps than the original Tri-X? I mean it kind of should be since it's a 6 phase design with more mosfets. What memory chips do they come with? Tri-X was known for being a good choice because of the high chance of getting Hynix, but how about the New Edition? Someone in Newegg reviews said they got one with Samsung, but other than that, not much info I can find. And what about the 290 Gaming? There seems to be V1 and V2 versions around. What's the difference and what memory do they use?

My thinking right now is to order one as a backup plan for the chance that the 390/390X won't turn out to be any better of a deal. If the new chips are significant respins and/or they get new features while keeping the cost down, then I won't even take the 290 out of the box but return it.


----------



## cephelix

Quote:


> Originally Posted by *specopsFI*
> 
> Hi guys! I've been waiting for the 390/390X launch patiently ever since returning my GTX 970 (because of reasons, u know). More and more it's looking like those new cards will actually be Hawaii, still, just with higher clocks and 8GB VRAM. I'm not in need of more than 4GB of memory though and the new cards might just end up being worse bang for my buck than the clearance sale 290/290X's.
> 
> So here's the situation. I can get a reasonable deal right now for a Sapphire 290 Tri-X New Edition or an MSI 290 Gaming. I've had the original 290X Tri-X before and it was a nice card, but contrary to general belief, the VRM temps weren't that great. I had to ramp the cooler way up to keep it below 90 degrees with OV/OC. Now the Gaming cards, I've seen a lot of threads of temp problems. Most of those seem to be about poor TIM. In reviews, 290(X) Gaming cards have actually had very nice VRM temps even if the GPU temp / fan noise department isn't as good as Tri-X.
> 
> So, given similar pricing, which one would you go for right now? Is the Tri-X New Edition any better regarding the VRM temps than the original Tri-X? I mean it kind of should be since it's a 6 phase design with more mosfets. What memory chips do they come with? Tri-X was known for being a good choice because of the high chance of getting Hynix, but how about the New Edition? Someone in Newegg reviews said they got one with Samsung, but other than that, not much info I can find. And what about the 290 Gaming? There seems to be V1 and V2 versions around. What's the difference and what memory do they use?
> 
> My thinking right now is to order one as a backup plan for the chance that the 390/390X won't turn out to be any better of a deal. If the new chips are significant respins and/or they get new features while keeping the cost down, then I won't even take the 290 out of the box but return it.


I can try to answer some of your questions regarding the 290 since I own one, though I cannot make comparisons with the Tri-X. VRM temps do get quite high when OC'ed. If I'm not mistaken, mine reached around mid- to high-70s with a 26C ambient (AC) when OC'ed to 1050MHz; core temps were around 60C. This was when I first got my card. Over time, I removed the stock cooler and put the card under water with a universal block. The strange thing is that when I went back to air, my temps were worse even after multiple reseats, going into the mid-80s at stock. I have no idea why. With CLU applied to the die and Fujipoly Extreme to VRM1, temps were brought down to the low-70s for both core and VRM1. The difference between the 2 revisions of the MSI Gaming is the chokes used; I've put up pictures here. As far as I can tell, there is no difference in terms of performance between the revisions, and both use Hynix RAM. The difference only comes into play if you plan to put the cards under water.

The second revision, with the gold chokes and the repositioned capacitor by the connector, means the full-cover EK blocks do not fit without significant modification, detailed here: http://imgur.com/HrEJ2

I find the card really benefits from an exhaust fan nearby, either at the PCIe slot or the side panel, along with quite a bit of fresh, cool air being fed to it.

Hope that helps. If you have any more questions, don't hesitate to ask.


----------



## Roy360

Quote:


> Originally Posted by *rdr09*
> 
> Not sure if it is related, but your third GPU was not doing anything. You may be right - power.
> 
> Or, it could be the third GPU. I suggest testing it by itself: disable CrossFire, unplug power from the other 2 GPUs, plug the monitor into the third and test. That is, if AB is indicating the right GPUs.


GPU 3 was actually off during that test. I tested again this time with GPU 1 and GPU 3, instant crash.

Rolled back to non-beta drivers. Crossfired all 3 cards and this time got several hours of gameplay with no crashes. At one point I did see artifacts, but they went away almost instantly.

The GPUs, CPUs and RAM are fine temperature-wise, but the case itself is burning to the touch. I wonder if the SSD or PSU is getting too hot, although PSU problems would usually result in an instant crash. Are there any programs I can use to monitor the GPU voltages?


----------



## By-Tor

You can use MSI AfterBurner and HWINFO64 to monitor most anything.


----------



## kizwan

Quote:


> Originally Posted by *Roy360*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> Not sure if it is related, but your third GPU was not doing anything. You may be right - power.
> 
> Or, it could be the third GPU. I suggest testing it by itself: disable CrossFire, unplug power from the other 2 GPUs, plug the monitor into the third and test. That is, if AB is indicating the right GPUs.
> 
> 
> 
> GPU 3 was actually off during that test. I tested again this time with GPU 1 and GPU 3, instant crash.
> 
> Rolled back to non beta drivers. Crossfired all 3 cards and this time I got several hours of gameplay with no crashes. At one time I did see artifacts, but they went away almost instantly.
> 
> The GPUs, CPUs and RAM are fine temperature wise, but the case itself is burning to touch. I wonder if the SSD or PSU are getting to hot. Although PSU problems would usually result in an instant crash. Are there any programs I can use to monitor the GPU voltages?
Click to expand...

You need to find a way to lower the temp inside the case. Heat trapped inside the case is never good for stability.


----------



## tsm106

At least fill out your sig rig and take some pics.


----------



## Raephen

Quote:


> Originally Posted by *specopsFI*
> 
> Hi guys! I've been waiting for the 390/390X launch patiently ever since returning my GTX 970 (because of reasons, u know). More and more it's looking like those new cards will actually be Hawaii, still, just with higher clocks and 8GB VRAM. I'm not in need of more than 4GB of memory though and the new cards might just end up being worse bang for my buck than the clearance sale 290/290X's.
> 
> So here's the situation. I can get a reasonable deal right now for a Sapphire 290 Tri-X New Edition or an MSI 290 Gaming. I've had the original 290X Tri-X before and it was a nice card, but contrary to general belief, the VRM temps weren't that great. I had to ramp the cooler way up to keep it below 90 degrees with OV/OC. Now the Gaming cards, I've seen a lot of threads of temp problems. Most of those seem to be about poor TIM. In reviews, 290(X) Gaming cards have actually had very nice VRM temps even if the GPU temp / fan noise department isn't as good as Tri-X.
> 
> So, given similar pricing, which one would you go for right now? Is the Tri-X New Edition any better regarding the VRM temps than the original Tri-X? I mean it kind of should be since it's a 6 phase design with more mosfets. What memory chips do they come with? Tri-X was known for being a good choice because of the high chance of getting Hynix, but how about the New Edition? Someone in Newegg reviews said they got one with Samsung, but other than that, not much info I can find. And what about the 290 Gaming? There seems to be V1 and V2 versions around. What's the difference and what memory do they use?
> 
> My thinking right now is to order one as a backup plan for the chance that the 390/390X won't turn out to be any better of a deal. If the new chips are significant respins and/or they get new features while keeping the cost down, then I won't even take the 290 out of the box but return it.


From what I know and have read, the Tri-X is the shizzle when it comes to the 290(X), so I'd go with one of those (if not planning to watercool). And from the rumors I've seen, you are right: the Rx 3xx line-up seems to be pure rebrands with minor tweaks, plus 8GB on the 390(X).

I'd pick a 290 Tri-X for cheap if I had to choose now (and air-cool).

Dunno about the V1 / V2 thing though...


----------



## HOMECINEMA-PC

Yeah peeps, if you're looking for help, fill out Rig Builder, take some pics and put it in your sig.

http://www.overclock.net/t/1258253/how-to-put-your-rig-in-your-sig/0_20 .

# Not Mind Readers #


----------



## aaroc

Quote:


> Originally Posted by *aaroc*
> 
> How do you select those profiles or its auto detection?
> 
> Here my validation 1x R9 290X HIS (Hynix) , 3x Sapphire R9 290X (Elpida):
> http://www.techpowerup.com/gpuz/details.php?id=d9fwu
> http://www.techpowerup.com/gpuz/details.php?id=x738
> http://www.techpowerup.com/gpuz/details.php?id=eeekc
> http://www.techpowerup.com/gpuz/details.php?id=5phbp
> 
> Quadfire Baby


Dear @Arizonian can you please update my data on the club list.


----------



## Mega Man

congrats ~


----------



## Ha-Nocri

Tried to OC my card for the 1st time. 1125 MHz core, 1400 memory... no voltage added. @1150 core I see artifacting. Don't like adding voltage, but I think this OC is rly good for stock?


----------



## cephelix

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Tried to OC my card for the 1st time. 1125 MHz core, 1400 memory... no voltage added. @1150 core I see artifacting. Don't like adding voltage, but I think this OC is rly good for stock?


That is very good if you have verified that it is stable. What are your core and VRM1 temps like? If they are acceptable, you could start adding voltage in +10mV increments to stabilise 1150MHz.


----------



## Ha-Nocri

Quote:


> Originally Posted by *cephelix*
> 
> That is very good if you have verified that it is stable. What are your core and VRM1 temps like? If they are acceptable, you could start adding voltage in +10mV increments to stabilise 1150MHz.


Ran Valley and 3DMark a few times, then played GTA 5 for 30 mins or so (although on stock CPU clocks, so it was more CPU-bottlenecked).

VRM1 is like 70c max, GPU 75c, but usually <70c


----------



## cephelix

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Run valley and 3d mark a few runs. Then played GTA 5 for 30 mins or so (altho on stock CPU clocks, so was more CPU bottle-necked).
> 
> VRM1 is like 70c max, GPU 75c, but usually <70c


Those temps are perfectly acceptable. Mine runs at 85C when gaming @ 1100/1450/+56mV.
If you don't mind the temps increasing, adding more voltage shouldn't be a problem for you.


----------



## kizwan

Quote:


> Originally Posted by *Raephen*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *specopsFI*
> 
> Hi guys! I've been waiting for the 390/390X launch patiently ever since returning my GTX 970 (because of reasons, u know). More and more it's looking like those new cards will actually be Hawaii, still, just with higher clocks and 8GB VRAM. I'm not in need of more than 4GB of memory though and the new cards might just end up being worse bang for my buck than the clearance sale 290/290X's.
> 
> So here's the situation. I can get a reasonable deal right now for a Sapphire 290 Tri-X New Edition or an MSI 290 Gaming. I've had the original 290X Tri-X before and it was a nice card, but contrary to general belief, the VRM temps weren't that great. I had to ramp the cooler way up to keep it below 90 degrees with OV/OC. Now the Gaming cards, I've seen a lot of threads of temp problems. Most of those seem to be about poor TIM. In reviews, 290(X) Gaming cards have actually had very nice VRM temps even if the GPU temp / fan noise department isn't as good as Tri-X.
> 
> So, given similar pricing, which one would you go for right now? Is the Tri-X New Edition any better regarding the VRM temps than the original Tri-X? I mean it kind of should be since it's a 6 phase design with more mosfets. What memory chips do they come with? Tri-X was known for being a good choice because of the high chance of getting Hynix, but how about the New Edition? Someone in Newegg reviews said they got one with Samsung, but other than that, not much info I can find. And what about the 290 Gaming? There seems to be V1 and V2 versions around. What's the difference and what memory do they use?
> 
> My thinking right now is to order one as a backup plan for the chance that the 390/390X won't turn out to be any better of a deal. If the new chips are significant respins and/or they get new features while keeping the cost down, then I won't even take the 290 out of the box but return it.
> 
> 
> 
> 
> 
> 
> 
> From what I know and have read, the Tri-X is the shizzle when it comes to the 290(X), so I'd go with one of those (if not planning to watercool). And from the rumors I've seen, you are right: the Rx 3xx line-up seems to be pure rebrands with minor tweaks, plus 8GB on the 390(X).
> 
> I'd pick a 290 Tri-x for cheap now if I had to choose now (and air-cool).
> 
> Dunno about the V1 / V2 thing though...
Click to expand...

I stopped following news/rumours about the 390s. First it can do 8GB, then it can only do 4GB, then it really can do 8GB, and then actually it can only do 4GB.

The Tri-X also has two versions, but I could be wrong.



----------



## Circaflex

Quote:


> Originally Posted by *Iniura*
> 
> I took this from OC UK; credit goes to WalderX, who posted it there.
> 
> Maybe this will help all the people having problems flashing their BIOS.
> 
> Following the techpowerup guide does work but it links to the wrong atiflash (older version), and the command to flash is missing the -f switch.
> I have spent a bit of time today finding the correct atiflash and some bios files and have put them all together into a zip file you can grab here:
> 
> http://sdrv.ms/1c9DpWB
> 
> Unzip that and you will have all the correct bits needed so you can follow TPU guide:
> 
> http://www.techpowerup.com/forums/sh...ad.php?t=57750
> 
> You will need an empty USB stick (the HP tool will format it). The win98 boot files are included in my zip file, you need to direct the hp tool to use them.
> 
> Before performing the flash I went into the AMD OverDrive control panel and set it back to defaults and unticked 'Enable Graphics OverDrive'. This meant I was back to stock and minimised risk of any problems when it booted back up with the new bios.
> 
> Once booted into the USB stick with the files copied you can get flashing! Firstly I ran the following command:
> 
> atiflash -s 0 backup.rom
> 
> This gave me a copy of my sapphire bios incase I want to reflash in future, or any problem occurred with flashing the asus bios. I have included this bios image in the zip file above (sapphire.rom).
> 
> You then are ready to flash the Asus bios as follows:
> 
> atiflash -p -f 0 asus.rom
> 
> If you don't use the -f switch you will get an SSID check error and it won't go through with the flashing. If you have multiple cards you just need to run the command again, incrementing the number until all your cards are flashed.
> Reboot, remove USB stick and you're done!
> 
> Now you will need to use Asus GPU Tweak software to do your overclocking, as far as I am aware MSI afterburner doesn't work properly with this card yet - haven't even tried it myself. You can get it direct from Asus here:
> 
> http://support.asus.com/download.asp...pu+tweak&os=30
> 
> I've got mine running at 1150 core & 6000 mem @ 1.350v. Make sure you always have the power target slider set at the max of 150%, to ensure that the card can draw the power it needs when required.
> 
> Hope this guide helps! Will also post this as a new thread in the graphics cards section (if there isn't one already) as I see many people asking how to flash across many forums. The Asus bios rom is from Gibbo so many thanks to him for getting hold of it for us


I understand I am quoting a very old post (please don't kill me), but does anyone have a mirror of this file or a new link? It seems down, as does the OC UK one.
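For reference, the backup-then-flash sequence in the quoted guide boils down to two atiflash commands. A minimal sketch that just builds those command strings, assuming adapter index 0 and the filenames from the guide (atiflash itself only runs from the DOS boot stick, so nothing is executed here):

```python
# The two atiflash steps from the quoted guide, built as command strings.
# ADAPTER is the card index; for multi-GPU setups, repeat the flash
# command with the index incremented for each card.
ADAPTER = 0
backup_cmd = f"atiflash -s {ADAPTER} backup.rom"   # save the current BIOS first
flash_cmd = f"atiflash -p -f {ADAPTER} asus.rom"   # -f bypasses the SSID check

print(backup_cmd)
print(flash_cmd)
```

Always run the backup step and keep the resulting ROM somewhere safe before flashing anything.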


----------



## Arizonian

Quote:


> Originally Posted by *aaroc*
> 
> Dear @Arizonian can you please update my data on the club list.


Congrats - updated


----------



## rt123

TheFiles.zip 2855k .zip file

Quote:


> Originally Posted by *Circaflex*
> 
> I understand I am quoting a very old post, please dont kill me but does anyone have a mirror to this file or a new link? this seems down as does the oc uk one still.


Here you go..


----------



## Circaflex

thank you so much


----------



## kizwan

I think MediaFire is the only file-hosting service that doesn't automatically remove files after a period of time. My files on MediaFire can still be accessed even though I haven't logged in for a couple of years, though my account did get a lot of traffic.


----------



## aaroc

Using the latest 15.5 beta driver and GTA V at 1440p with quadgasm. Selected "reset to defaults" and it set almost everything to Very High in the graphics section at 1080p, so I changed to 1440p. 60fps all the time in the Afterburner OSD, but some serious flickers from time to time; vsync was active. Disabled vsync and ooh la la! More than 100 fps in closed spaces and almost always above 60, smooth. Using the f2004 in my sig with no OC on the CPU or GPUs; CPU and GPU temps below 40C, room temp 19.5C. I'm going to waste so much time with this game...


----------



## Mega Man

But... surely not! You don't say you have gains from going quadfire?!

Seeing as "people" in certain threads have claimed quadfire is near useless and "illogical"!


----------



## pengs

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Tried to OC my card for the 1st time. 1125 MHz core, 1400 memory... no voltage added. @1150 core I see artifacting. Don't like adding voltage, but I think this OC is rly good for stock?


That's really good. Mine takes about +50-60mV to do 1125, but my memory seems to have no problem going to 1600, though my stock VDDC is about 1.11v at load.

I was also a bit conservative when overvolting until I found the absolute highest stock voltage from a bunch of samples, which seems to be about 1.23v. I feel that voltage is within the safe range (most likely very safe) for the 290X at 28nm, since some boards run at it out of the box; AIBs and AMD aren't going to ship a GPU outside the safe voltage threshold, as high RMA rates aren't productive for a company. I didn't feel good about overvolting until I found that my voltage delta was about 120mV from the hottest reference sample.

tl;dr: if you're loaded at 1.1v at stock, you could probably add 50mV easily with your current temps.
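The reasoning above is simple arithmetic on the voltage delta; a quick sketch using the figures in the post (1.11v stock load voltage, 1.23v for the hottest observed stock sample) purely as example numbers:

```python
# Headroom estimate: distance from your own stock load voltage to the
# highest stock voltage observed across reference samples.
STOCK_VDDC = 1.11        # volts, example stock load voltage
MAX_STOCK_SAMPLE = 1.23  # volts, hottest reference sample seen

headroom_mv = round((MAX_STOCK_SAMPLE - STOCK_VDDC) * 1000)
print(f"~{headroom_mv}mV of headroom before exceeding observed stock voltages")
```

With those numbers the delta works out to about 120mV, matching the post; plug in your own card's load voltage instead.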
Quote:


> Originally Posted by *specopsFI*
> 
> So here's the situation. I can get a reasonable deal right now for a Sapphire 290 Tri-X New Edition or an MSI 290 Gaming. I've had the original 290X Tri-X before and it was a nice card, but contrary to general belief, the VRM temps weren't that great. I had to ramp the cooler way up to keep it below 90 degrees with OV/OC. Now the Gaming cards, I've seen a lot of threads of temp problems. Most of those seem to be about poor TIM. In reviews, 290(X) Gaming cards have actually had very nice VRM temps even if the GPU temp / fan noise department isn't as good as Tri-X.
> 
> So, given similar pricing, which one would you go for right now? Is the Tri-X New Edition any better regarding the VRM temps than the original Tri-X? I mean it kind of should be since it's a 6 phase design with more mosfets. What memory chips do they come with? Tri-X was known for being a good choice because of the high chance of getting Hynix, but how about the New Edition? Someone in Newegg reviews said they got one with Samsung, but other than that, not much info I can find. And what about the 290 Gaming? There seems to be V1 and V2 versions around. What's the difference and what memory do they use?


Got a Tri-X 290X (V1, I assume, with the 8+6 pin) and VRM1 is about 83C tops with +93mV, while VRM2 is always 50-60C. That isn't terrible, but I've got a custom fan curve and two 250mm side fans at my disposal. Had two cards, and am selling the second (hint), but both have Hynix. Hynix appears on the Tri-X a bit more often than not, but Sapphire does use Elpida too from what I've seen. Seems to be a 50/50 chance.


----------



## kizwan

Quote:


> Originally Posted by *Mega Man*
> 
> but.... surely not! you dont say you have gains from going quadfire ?~!?
> 
> seeing as "people" in certain threads have claimed quadfire is near useless and "illogical"!~


"People" tends to think too much. My brain don't have space for small issue like that. I prefer filling the space with "what I'm going to eat for dinner?"


----------



## aaroc

Tonight I will test playing with fewer GPUs at 1440p to see the return on investment of each GPU. With the WHQL drivers the game was not playable with one GPU. It's the first game I've seen using 90% total CPU in some situations; maybe it's using all the power of my FX CPU, or something else was running in the background. Going to check more.


----------



## t0m3k

Hi there, I'd like to join!

http://www.techpowerup.com/gpuz/details.php?id=4rwc4
http://www.overclock.net/lists/display/view/id/6196156

Model: XFX R9 290 DD
Cooling: Water (H55 + Kraken G10)


----------



## Gumbi

The Tri-X has good VRM temps (~75C or below) AFAIK. The ones that get high are anomalous.

However, the Vapor-X 290 is by far the best 290 model: very beefy VRM cooling (the VRMs run even cooler than the core), and the core is cooled slightly more efficiently (due to the vapor chamber) than with the Tri-X cooler. A fantastic card overall.


----------



## Ha-Nocri

Quote:


> Originally Posted by *pengs*
> 
> That's really good. Mine takes about +50-60mV to do the 1125 but my memory seems to have no problem going to 1600 though my stock VDDC is about 1.11v at load.
> 
> Also a bit conservative when overvolting until I find the absolute highest stock voltage from a bunch of samples and that seems to be about 1.23v. I somewhat feel that this voltage is within the safe range(most likely very safe) for the 290x at 28nm as some boards are run at that voltage out of the box, that and the AIBs and AMD aren't going to allow a GPU to run outside of the safe voltage threshold as high RMA rates aren't really productive to a company. I didn't feel good about overvolting until I found that my voltage delta was about 120mV from the hottest reference sample.
> 
> tldr, if at stock your loaded at 1.1v you could probably easily add 50mV without any discretion with your current temps.


My core voltage fluctuates under load, going up to 1.18V, which is like a reference 290. It is usually around ~1.14 though. I really need a GPU-intensive game with no CPU bottleneck anywhere to test this OC; I've heard The Witcher 3 might be just that.

I didn't try to push the memory more; I want to make sure the core is stable @ 1125 first.

Btw, how do you know your card isn't stable? What programs do you use to test? Do your drivers crash?


----------



## mfknjadagr8

So what would be the max safe voltage under water? Or is it another "if you can cool it, you can clock it" situation? I have a feeling my bottom card is going to be my limiting factor. When the clocks went wonky it stopped working in CrossFire, but at my mild overclock (1050 core / 1300 mem) it runs a little better. The only thing I changed was setting the power limit to max in Afterburner; perhaps the bottom card would have needed more voltage to run that clock.


----------



## pengs

Quote:


> Originally Posted by *Ha-Nocri*
> 
> My core voltage is fluctuating under load, going up to 1.18V, which is like reference 290. It is usually around ~1.14 tho. Rly need GPU intensive game that has no CPU bottle-neck in any area to test this OC. Heard Witcher 3 might be just that.
> 
> I didn't try to push memory more, want to make sure core is stable @1125 first.
> 
> Btw, how do u know ur card isn't stable? What programs do u use to test? Do your drivers crash?


I play GTA5.

1150 will work on my card at +93mV with no artifacting in benchmarks: multiple runs of Heaven, 3 runs of Valley, Dying Light, Miscreated and all kinds of games. But GTA5 is quite picky.

It's a driver crash most of the time with an unstable core clock, which freezes the game, turns the screen black periodically and then CTDs. If the memory is pushed too high, it's a hard lock and sound loop on my machine. You can confirm a driver crash under Event Viewer > System: check for "Display", and the information will most likely tell you that the "display driver has crashed but recovered successfully", something along those lines.
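If you'd rather not click through Event Viewer after every session, the same check can be scripted against a saved text export of the System log. A sketch under stated assumptions: the sample line below mimics the usual recovery message, but the exact wording and format of your own export may differ, so treat the match strings as adjustable:

```python
# Count display-driver recovery events in a text export of the System log.
def count_driver_recoveries(lines):
    """Return how many log lines mention a recovered display driver."""
    return sum(
        1 for line in lines
        if "display driver" in line.lower() and "recovered" in line.lower()
    )

# Stand-in for a few exported Event Viewer lines (hypothetical formatting).
sample_log = [
    "Information  Display  Display driver amdkmdap stopped responding and has successfully recovered.",
    "Information  Service Control Manager  The service entered the running state.",
]
print(count_driver_recoveries(sample_log))  # prints 1 for this sample
```

In practice you would read the lines from the exported file instead of the inline sample.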


----------



## t0m3k

Hi guys, I mounted a Kraken G10 with a Corsair H55 on my XFX R9 290 DD. The cool thing is that the stock VRAM and VRM cooling bracket fits with the Kraken, so I just removed the stock radiator and put the Kraken with the H55 on.
On air cooling my OC was 1110MHz on the GPU, 1360 on memory and +50 to power, with temps usually about 83 degrees Celsius.
I kept the OC; now temps are about 68 (push-pull with two AF120 Silents), but in OCCT the VRM gets up to 120 degrees. In games it's not going over 80, but anyway, is it safe?
I think I have to put another fan on the other side, but I already have 10 inside and no place to connect one.

Any suggestions for me?

Sent from my D5803 using Tapatalk


----------



## Ized

http://www.techpowerup.com/gpuz/details.php?id=9srqp


Guess I never joined the club properly.

Started life as a reference PowerColor 290 at 975MHz core / 1250MHz memory; it's now a 290X at 1215MHz core / 1498MHz memory with an all-in-one cooler and tiny VRM heatsinks.

I still suffer from horrendous coil whine, which forced me to move to headphones. Quite a few people suggested it would settle down; it did not.

Did anyone else suffer from it and see an improvement?


----------



## cephelix

@t0m3k Gaming at 80C VRM temps is fine. 120C is the absolute limit for VRM functioning though, so I wouldn't push it further.

One option is to replace the stock thermal pad on the VRMs with something better, such as Fujipoly Ultra Extreme; it has better thermal conductivity, around 17W/mK, compared to the others. I got mine from FCPU, but since it closed down you could get it from PPC, though I'm not sure of its exact name there now.


----------



## Durtmagurt

My CrossFire 290s under water.
Love the Swiftech Komodo.
What do you guys think?


----------



## kizwan

Quote:


> Originally Posted by *t0m3k*
> 
> Hi guys, I mounted on my XFX R9 290 DD Kraken G10 with corsair H55. The cool thing is the stock vram and vrm cooling bracket fitted with Kraken.
> So I just removed the radiator and putted Kraken with H55.
> On air cooling my O/C was 1110 MHz on GPU, 1360 on memory and +50 to power and temps usually about 83 degrees Celsius.
> I kept the O/C, now temps are about 68 (push pull with two AF120 silent) but in OCCT vrm is getting up to 120 degrees. In games it's not going over 80 but anyway, is it safe?
> I think there I have to put another fan on the other side but I have already 10 inside and no place to connect
> 
> 
> 
> 
> 
> 
> 
> 
> Any suggestions for me?
> 
> Sent from my D5803 using Tapatalk


My suggestion, don't use OCCT.


----------



## t0m3k

How do you check stability then? And what do you use for overclocking and monitoring? With GPU-Z I miss being able to check temps from my phone (through a browser, for example).
And why can't I set a maximum temperature in CCC? I don't have that option...

Sent from my D5803 using Tapatalk


----------



## kizwan

Use games to check stability. OCCT & Furmark are only useful for getting your card to generate a large amount of heat, and your VRM cooling is less than adequate to handle that kind of heat; it needs active cooling.

I use TRIXX for overclocking & Open Hardware Monitor for monitoring. I also use AB for monitoring if I want to record monitoring data.

I'm not sure about the temp limit setting in CCC, though. It should be there if you enabled OverDrive.


----------



## aaroc

MSI Afterburner has an Android/iOS client and a remote PC client. I've never tried them, since I see the values on my Logitech G510 keyboard LCD, but you should check them out. HWiNFO64 also allows logging all values to a file.
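Since HWiNFO64 can log to CSV, finding your peak VRM temperature after a gaming session is easy to script with the standard library. A sketch with an inline stand-in log; the column name "GPU VRM1 Temperature [C]" is an assumption, so check the header row of your own log:

```python
import csv
import io

def max_column(csv_text, column):
    """Return the highest numeric value logged in the given CSV column."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return max(float(row[column]) for row in reader if row[column])

# Tiny stand-in for a real HWiNFO64 log; real logs have many more columns.
sample_log = """Time,GPU Temperature [C],GPU VRM1 Temperature [C]
12:00:01,62,71
12:00:03,64,74
12:00:05,63,73
"""
print(max_column(sample_log, "GPU VRM1 Temperature [C]"))  # prints 74.0
```

For a real session you would read the log file's contents instead of the inline string.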


----------



## Aaron_Henderson

Quote:


> Originally Posted by *Ized*
> 
> Still suffer from horrendous coilwhine which forced me to move to headphones
> 
> 
> 
> 
> 
> 
> 
> Quite a few people suggested it would settle down, it did not.
> 
> Did anyone else suffer and see a improvement?


My 290X had pretty bad coil whine when I first got it (and I bought it used), but only when frames per second went up into the thousands. It slowly became less and less, and now it doesn't do it at all.


----------



## Jflisk

Quote:


> Originally Posted by *Durtmagurt*
> 
> My xfire 290's under water.
> Love the swiftech komodo.
> What you guys think


Nice machine


----------



## skline00

Quote:


> Originally Posted by *Durtmagurt*
> 
> My xfire 290's under water.
> Love the swiftech komodo.
> What you guys think


Excellent rig. I love my Sapphire Tri-X OC R9 290s in CF with EK blocks in my rig also.


----------



## boredmug

Quote:


> Originally Posted by *Durtmagurt*
> 
> My xfire 290's under water.
> Love the swiftech komodo.
> What you guys think


Nice! My 290x's with ek blocks arrive tomorrow. They will also be residing in a 800d with a 360mm up top and a 240mm rad on the bottom.


----------



## Durtmagurt

Quote:


> Originally Posted by *boredmug*
> 
> Nice! My 290x's with ek blocks arrive tomorrow. They will also be residing in a 800d with a 360mm up top and a 240mm rad on the bottom.


I wanna put a 280 up top and an extra 140 in the back; wish my case was a bit bigger, haha. But for right now my temps are amazing and I can't complain. Hope to see pictures of your setup.


----------



## boredmug

Quote:


> Originally Posted by *Durtmagurt*
> 
> I wanna put a 280 on not and a extra 140 in the back wish my case was a bit bigger haha but for right now my temps are amazing and I can't complain. Hope to see pictures of your setup


I'll post up some pictures tomorrow when I get the new cards in. I've got a pair of blocked 7950's in there right now. Found a good deal on some 290x's with ek blocks that I couldn't pass up.

I ended up cutting the bottom out with a dremel and installing some mesh for the 240 on the bottom. I think a 280 would fit down there. I've contemplated adding a 120mm on the back exhaust fan spot just for the hell of it.

What are your temps like and what fan speeds do you run?


----------



## MOSER91

Here's my PowerColor TurboDuo R9 290x's under water.





And some benchmarks...
http://www.3dmark.com/fs/5047354
http://www.3dmark.com/fs/5047385


----------



## kizwan

Quote:


> Originally Posted by *MOSER91*
> 
> Here's my PowerColor TurboDuo R9 290x's under water.
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> And some benchmarks...
> http://www.3dmark.com/fs/5047354
> http://www.3dmark.com/fs/5047385


It's nice to see a 290/290X under water & in a C70.


----------



## Arizonian

Quote:


> Originally Posted by *t0m3k*
> 
> Hi there, I'd like to join
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://www.techpowerup.com/gpuz/details.php?id=4rwc4
> http://www.overclock.net/lists/display/view/id/6196156
> 
> 
> 
> Model: XFX R9 290 DD
> Cooling: Water (H55 + Kraken G10)


Congrats - added


Quote:


> Originally Posted by *Ized*
> 
> http://www.techpowerup.com/gpuz/details.php?id=9srqp
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Guess I never joined the club properly
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Started life as a reference Powercolor 290 975Mhz core 1250Mhz memory, now a 290X 1215Mhz core 1498Mhz memory with a all in one cooler and tiny VRM heatsinks.
> 
> Still suffer from horrendous coilwhine which forced me to move to headphones
> 
> 
> 
> 
> 
> 
> 
> Quite a few people suggested it would settle down, it did not.
> 
> Did anyone else suffer and see a improvement?


Congrats - added


Quote:


> Originally Posted by *Durtmagurt*
> 
> My xfire 290's under water.
> Love the swiftech komodo.
> What you guys think
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


Quote:


> Originally Posted by *MOSER91*
> 
> Here's my PowerColor TurboDuo R9 290x's under water.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> And some benchmarks...
> http://www.3dmark.com/fs/5047354
> http://www.3dmark.com/fs/5047385


Congrats - added


----------



## Ha-Nocri

Quote:


> Originally Posted by *pengs*
> 
> I play GTA5
> 
> 
> 
> 
> 
> 
> 
> 
> 1150 will work for my card at +93mV for benchmarks with no artifacting, multiple runs of Heaven, 3 runs of Valley, Dying Light, Miscreated and all kinds of games but GTA5 is quite picky.
> 
> It's a driver crash most of the time with an unstable core clock which freezes the game, turns the screen black periodically and then a CTD. If the memory is pushed too high it's a hard lock and sound loop on my machine. You can confirm a driver crash under Event Viewer>System and checking for Display and in the information it will most likely tell you that the 'display driver has crashed but recovered successfully', something along that line.


OC'ed my CPU and used high resolution (downsampling) + MSAA to make sure I'm never CPU-bound... and it is stable @ 1125/1400 at stock voltage. Can't go higher on memory, though. How are people getting 1600MHz? My memory voltage slider is locked.


----------



## Wezzor

Is there any fast and easy solution to get rid of the rattling noise that is coming from the fans on my 290 Tri-X card?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Ha-Nocri*
> 
> OC'ed my CPU, used high-resolution(downsampling) + MSAA, to make sure I'm never CPU bound... and it is stable @ 1125/1400, stock voltage. Can't go higher on memory. *How r ppl getting 1600MHz?* My memory voltage slider is locked.


Different memory chips


Quote:


> Originally Posted by *Wezzor*
> 
> Is there any fast and easy solution to get rid of the rattling noise that is coming from the fans on my 290 Tri-X card?


Pull it out and fix it


----------



## Wezzor

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Different memory chips
> 
> 
> 
> 
> 
> 
> 
> 
> Pull it out and fix it


Yeah, but how exactly? I've already tightened the screws.


----------



## rdr09

Quote:


> Originally Posted by *Wezzor*
> 
> yeh,but how exactly? I've already tighten the screws.


Others with that issue in the past recommended using a piece of rubber: shove it between the PCB and the fan housing. Careful with those tiny electronic critters.


----------



## thrgk

Anyone else selling their 290X? I am going to sell my 4 with blocks; I don't know if I'll go with a 980 Ti or see how the Fury is, but we will see. How do people sell used GPUs, mostly eBay? I've barely gotten any hits so far, and eBay seems risky because a buyer could break the card and then blame me.


----------



## Ized

Quote:


> Originally Posted by *thrgk*
> 
> anyone else selling their 290x? I am going to sell my 4 with blocks, Idk if I will go with 980ti or see how the fury is but we will see. How do people sell used GPU? Mostly ebay or? I've barely gotten any hits so far. Ebay seems dangerous because they could break the card then blame me


I had *really* good results selling 2 CPUs (a 3570K and a 4770K) on Amazon UK; so damn hassle-free in comparison to eBay, and the fees worked out almost exactly the same.

No time limits or initial listing fees, no fees for tweaking your listing, no derpy messages asking if it will work with their next-door neighbour's Xbox. No "winning" without paying, forcing you to relist with even more fees.

*If* it sells, they take the fees from what they get.

They do hold onto the money for 30-odd days though, which is unnerving.

I just undercut the current cheapest price fractionally and dropped it another £1 every day until it sold.


----------



## gatygun

Quote:


> Originally Posted by *Ha-Nocri*
> 
> OC'ed my CPU, used high-resolution(downsampling) + MSAA, to make sure I'm never CPU bound... and it is stable @ 1125/1400, stock voltage. Can't go higher on memory. How r ppl getting 1600MHz? My memory voltage slider is locked.


Get TriXX, push more volts onto the thing.

With +100 I get like 1145 stable max; with +185 I get 2215 stable max on the core. The memory goes from 1300>1400 stable on +100; on +200 it goes to 1600 stable but reduces my performance. 1500 is the sweet spot.


----------



## kizwan

2215 on the core...that is extremely high.


----------



## tsm106

Unpossible!


----------



## Ha-Nocri

Quote:


> Originally Posted by *gatygun*
> 
> Get trixx, push more volts onto the thing.
> 
> With +100 i get like 1145 stable max, with +185 i get 2215 stable max on core. the memory goes from 1300>1400 stable on +100, on +200 it goes to 1600 stable but reduces my performance, 1500 is the sweet spot.


I see that TriXX has up to +200mV, but it still doesn't have memory voltage control for me.

In AB though I get 1200MHz with +100mV. Don't think it's worth it; that's a lot of voltage for 75MHz. Might try 1250 just for the fun of it.

EDIT: can't do 1250MHz even with +200mV; I get a few artifacts here and there. Lowered it to 1245MHz and have seen none so far, so I guess that's my max OC. I could pass some benchmarks with higher clocks and artifacts though...


----------



## boredmug

These just arrived to my house. Time to drain the loop and give the computer some love.


----------



## boredmug

Observations so far... Hotter than my 7950s. For some reason, with Afterburner running, both cards don't clock up in games: GPU 2 does all the work although it stays at 2D clocks, while GPU 1 clocks to 1000MHz and does nothing. They both work correctly without Afterburner running. My little 850W PSU's fan gets going pretty fast when the GPUs are loaded.


----------



## boredmug

Anybody had this problem with Afterburner and 290X CrossFire? I like to monitor temps with Afterburner in game, and the cards do not clock up correctly when Afterburner is running. They work correctly without it. I get all my usage on GPU 2 but at a 2D clock speed; GPU 1 stays at 1000MHz but with zero usage.


----------



## Roy360

Quote:


> Originally Posted by *boredmug*
> 
> Observations so far... Hotter than my 7950s. For some reason, with Afterburner running, both cards don't clock up in games: GPU 2 does all the work although it stays at 2D clocks, while GPU 1 clocks to 1000MHz and does nothing. They both work correctly without Afterburner running. My little 850W PSU's fan gets going pretty fast when the GPUs are loaded.


Have them log in the background and see what happens.
Settings>>Monitoring>>Log history to file
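Once you have a log, a quick way to spot the "high usage but stuck at 2D clocks" pattern is a small script like the sketch below. This is only an illustration: it assumes the log has been exported or trimmed down to a simple CSV with `timestamp`, `gpu`, `usage_pct`, and `core_mhz` columns (Afterburner's native .hml layout differs, so adjust the parsing to match your file).

```python
# Sketch: flag log samples where a GPU shows high usage but the core clock
# never left the 2D range (~300MHz on a 290/290X). Column names are assumed.
import csv
import io

# Stand-in for a trimmed-down Afterburner log export (hypothetical format).
SAMPLE = """timestamp,gpu,usage_pct,core_mhz
12:00:01,1,3,300
12:00:01,2,98,300
12:00:02,2,99,1000
"""

def find_stuck_samples(f, usage_threshold=50, clock_2d=400):
    """Return (timestamp, gpu) pairs with high usage at 2D-level clocks."""
    stuck = []
    for row in csv.DictReader(f):
        if (float(row["usage_pct"]) >= usage_threshold
                and float(row["core_mhz"]) <= clock_2d):
            stuck.append((row["timestamp"], row["gpu"]))
    return stuck

print(find_stuck_samples(io.StringIO(SAMPLE)))  # prints: [('12:00:01', '2')]
```

For a real log you would pass `open("log.csv", newline="")` instead of the `StringIO` sample; any sample the function flags is a moment where a card was doing work while locked at 2D clocks.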


----------



## Roy360

Quote:


> Originally Posted by *tsm106*
> 
> At least fill out your sig rig and take some pics.


Updated.

Since the crash I've installed 14.12 instead, and I haven't experienced a single crash like I did before.

*Although this time CrossFire was disabled... I must have forgotten to enable it. But at least I know the rig functions fine without it.

Well, almost fine... since downgrading to 14.12 (or maybe since flashing a reference BIOS on the ASUS DCII card) my monitors don't wake after (monitor) sleep. Restarting the PC doesn't help; I need to cut the power (switch on the back of the PSU) before the monitors will work again.



^Happened Twice now
Quote:


> Originally Posted by *kizwan*
> 
> You need to find a way to lower the temp inside the case. Heat trapped inside the case is never good for stability.


I freed up a 140mm side vent and keep the front panel open while gaming now. If that doesn't work, I'm going to need to get a sandblaster and start removing the sound-damping material so I can make some vents.


----------



## boredmug

Quote:


> Originally Posted by *Roy360*
> 
> Have them log in the background and see what happens.
> Settings>>Monitoring>>Log history to file


Something is wrong... With Afterburner running I now get 100 percent usage on both cards, but they are stuck at 300MHz. I'm assuming they're working correctly without Afterburner running, because everything isn't choppy as hell, but I have no way of knowing since I can't see usage or clocks.


----------



## tsm106

Quote:


> Originally Posted by *boredmug*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roy360*
> 
> Have them log in the background and see what happens.
> Settings>>Monitoring>>Log history to file
> 
> 
> 
> Something is wrong.. with afterburner running now i get 100 percent usage on both cards but they are stuck at 300mhz. I'm assuming they are working correctly without afterburner running because everything isn't choppy as hell but i have no way of knowing because i can't see usage or clocks.
Click to expand...

Read mah sig thread.


----------



## JourneymanMike

Quote:


> Originally Posted by *boredmug*
> 
> Anybody had this problem with Afterburner and 290x crossfire? I like to monitor temps with afterburner in game and the cards do not clock up correctly when afterburner is running. They work correctly without it. I get all my usage on gpu2 but with a 2d clockspeed. gpu1 stays at 1000mhz but zero usage.


Try using Sapphire TriXX; I've had better luck with it than AB.

Also, a stupid question, have you disabled ULPS?


----------



## boredmug

Quote:


> Originally Posted by *JourneymanMike*
> 
> Try using Sapphire Trixx, I've had better luck with it than AB
> 
> Also, a stupid question, have you disabled ULPS?


ULPS is disabled. The problem now is that the cards are stuck at 2D clocks without TriXX or Afterburner running. I've uninstalled the drivers a couple of times. Trying again.

Mah SIG thread?


----------



## boredmug

Apparently I bought two cards that only run at 300MHz... :-/


----------



## tsm106

Quote:


> Originally Posted by *boredmug*
> 
> Apparently I bought two cards that only run at 300MHz... :-/


No way? Got TeamViewer?


----------



## boredmug

Quote:


> Originally Posted by *tsm106*
> 
> No way? Got teamviewer?


What is TeamViewer? This is weird. I changed the clock settings in Afterburner and now both cards' clocks ramp up... I don't understand this.


----------



## tsm106

Quote:


> Originally Posted by *boredmug*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> No way? Got teamviewer?
> 
> 
> 
> What is teamviewer? This is weird. I changed the clock settings in afterburner and now both cards clocks ramp up.. I don't understand this..
Click to expand...

Follow my guide to the letter. Teamviewer is a remote control suite.


----------



## JourneymanMike

Quote:


> Originally Posted by *tsm106*
> 
> Quote:
> 
> 
> 
> Originally Posted by *boredmug*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> No way? Got teamviewer?
> 
> 
> 
> What is teamviewer? This is weird. I changed the clock settings in afterburner and now both cards clocks ramp up.. I don't understand this..
> 
> Click to expand...
> 
> Follow my guide to the letter. Teamviewer is a remote control suite.
Click to expand...

What Guide? Where is it?


----------



## tsm106

In my signature.

http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40


----------



## kizwan

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Quote:
> 
> 
> 
> Originally Posted by *gatygun*
> 
> Get trixx, push more volts onto the thing.
> 
> With +100 i get like 1145 stable max, with +185 i get 2215 stable max on core. the memory goes from 1300>1400 stable on +100, on +200 it goes to 1600 stable but reduces my performance, 1500 is the sweet spot.
> 
> 
> 
> I see that trixx has upto +200mV, but it still doen't have memory voltage control for me.
> 
> In AB tho I get 1200MHz with +100mV. Don't think it's worth it. Much voltage for 75 MHz. Might try 1250 just for the fun of it.
> 
> EDIT: can't go 1250MHz with +200mV, get a few artifacts here and there. Lowered to 1245MHz and seen none so far. So guess that's my max OC. I could pass some benchmarks with higher clocks and artifacts tho...
Click to expand...

The Tri-X doesn't have memory voltage control. The only card that has it is the Lightning. Even without memory voltage control my cards can handle 1600 on the memory stable. The core voltage actually helps the memory overclock, in case you didn't know.
Quote:


> Originally Posted by *Roy360*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> At least fill out your sig rig and take some pics.
> 
> 
> 
> Updated.
> 
> Since the crash I've installed 14.12 instead , and have not experienced a single crash like I previously did.
> 
> *Although this time crossfire was disabled.. must have forgotten to enabled it... But atleast I know the rig functions fine without it.
> 
> Well almost fine... since downgrading to 14.12 (or maybe since flashing a reference bios on the ASUS DCII card) my monitors don't wake after (monitor) sleep. Restarting the PC doesn't help. I need to cut the power (switch on the back of the PSU) before the monitors will work again.
> 
> 
> 
> ^Happened Twice now
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> You need to find away to lower the temp inside the case. Heat trapping inside the case is always not good for stability.
> 
> Click to expand...
> 
> I freed up a 140mm side vent, and keep the front panel open while gaming now. If that doesn't work, I'm going to need to get a sandblaster and start removing the sound damping material so I can make some vents
Click to expand...

Did you use the sleep function? That BSOD bug check is related to sleep, and since it seems tied to the GPUs, it's likely related to the monitors going to sleep. Basically it comes down to the power cycle: sleep <--> wakeup. It's a good idea to flash the DC2 back to its original BIOS.
Quote:


> Originally Posted by *boredmug*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> No way? Got teamviewer?
> 
> 
> 
> What is teamviewer? This is weird. I changed the clock settings in afterburner and now both cards clocks ramp up.. I don't understand this..
Click to expand...

Weird things sometimes happen.


----------



## boredmug

Quote:


> Originally Posted by *tsm106*
> 
> In my signature.
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread/0_40


Most of this stuff I already do, but thank you. Well, it seems to be working correctly now, other than the fact that it would stick at 2D clocks without changing the default clocks in Afterburner. I'm just running them at stock voltage right now at 1020/1400. I really don't think my PSU is up to snuff; I hear its fan ramp up during games. It's the loudest fan in my computer now. Afterburner shows my core clocks are holding steady, but my usage is all over the place. I never really had this problem with my 7950s. Performance is still better than the 7950s, of course. I do wish these cards didn't get so hot though. I'm seeing mid-50s in a lot of my games. I suppose that's a lot better than with stock air coolers.


----------



## kizwan

Your 850W PSU seems to be running at high load. Even the fan on my 1050W PSU ramps up, and the PSU itself gets warm, when playing games. Mid-50s is pretty much what you'll get in high ambient (>25C) though.
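For a rough sense of why an 850W unit would be working hard with two 290Xs, a back-of-envelope estimate helps. All the wattage figures below are illustrative assumptions for this sketch, not measured numbers for any specific rig:

```python
# Back-of-envelope PSU load estimate for a 2x 290X rig.
# All draw figures are assumptions for illustration only.
gpu_draw = 290    # watts per 290X under load, roughly stock (more with +mV)
cpu_draw = 150    # overclocked quad-core estimate
rest = 75         # motherboard, drives, fans, pump
psu_rating = 850

total = 2 * gpu_draw + cpu_draw + rest
print(f"Estimated load: {total}W ({total / psu_rating:.0%} of an {psu_rating}W PSU)")
# prints: Estimated load: 805W (95% of an 850W PSU)
```

At that kind of utilisation the PSU fan spinning up under gaming load is exactly what you'd expect, and adding overvoltage to either card eats what little headroom is left.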


----------



## pengs

Quote:


> Originally Posted by *Ha-Nocri*
> 
> I see that trixx has upto +200mV, but it still doen't have memory voltage control for me.
> 
> In AB tho I get 1200MHz with +100mV. Don't think it's worth it. Much voltage for 75 MHz. Might try 1250 just for the fun of it.
> 
> EDIT: can't go 1250MHz with +200mV, get a few artifacts here and there. Lowered to 1245MHz and seen none so far. So guess that's my max OC. I could pass some benchmarks with higher clocks and artifacts tho...


Yeah, it scales slightly with core voltage. If you're using AB, though, adding a small amount to the aux voltage can help with memory stability; +6 helped me out. Don't overdo it (you're still on air, aren't you?) and keep an eye on VRM1. I'd try +100 core and +6 or +13 aux at 1450 or 1500.

A little aux can also lower temperatures; +6 lowered mine by about 2C.


----------



## Jflisk

Anyone try the new 15.5 beta driver with BF4 or Hardline yet? I was getting DX crashes every hour or so with the 15.5 driver. Went back to the 15.4.1 driver with no problems; played for 2 hours straight last night.


----------



## Agent Smith1984

If interested:
http://www.overclock.net/t/1559864/sapphire-tri-x-290-4gb


----------



## tsm106

Quote:


> Originally Posted by *Roy360*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> At least fill out your sig rig and take some pics.
> 
> 
> 
> Updated.
> 
> Since the crash I've installed 14.12 instead , and have not experienced a single crash like I previously did.
> 
> *Although this time crossfire was disabled.. must have forgotten to enabled it... But atleast I know the rig functions fine without it.
> 
> Well almost fine... since downgrading to 14.12 (or maybe since flashing a reference bios on the ASUS DCII card) my monitors don't wake after (monitor) sleep. Restarting the PC doesn't help. I need to cut the power (switch on the back of the PSU) before the monitors will work again.
> 
> 
> 
> ^Happened Twice now
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> You need to find away to lower the temp inside the case. Heat trapping inside the case is always not good for stability.
> 
> Click to expand...
> 
> I freed up a 140mm side vent, and keep the front panel open while gaming now. If that doesn't work, I'm going to need to get a sandblaster and start removing the sound damping material so I can make some vents
Click to expand...

Your case is a huge compromise with your rad setup. The two 240s on the bottom are very inefficient because of their orientation. The intake/exhaust layout is far from ideal, with large areas of stagnant air. I would pull the 120 and use that as an intake. Change all the rads to push fresh air into the case; do not use pull orientation on the rads, as you're wasting roughly 1C of delta by using pull. Move the sandwiched 240 to the bottom of the case. I would also look into getting an off-center fitting between your DC2 and ref blocks. In the schematic below you can check the specs and see if it will fit your setup.



http://www.performance-pcs.com/bitspower-matte-black-off-center-block.html#Questions-Answers

As for your crashing and wake-from-sleep issues, have you tested each card individually first? Did you specifically choose to run the DC2 as the main card for a reason, or by chance? You can test them by disabling CrossFire and moving the display plug to each card one at a time. Run stability tests and sleep tests, OC the core then the memory, and see how far each will go before it black screens. The one that goes the farthest before black screening and doesn't have wake-from-sleep issues is the one to run as the main card. Driver versions will come into play for wake-from-sleep too.

Hmm, other things: your PSU is too small, and you don't have enough rads for your heat load. Three GPUs and one CPU... and ONLY 840mm of rad, and the orientation you do have isn't doing you any favors cooling-wise. The goal should be 360mm per block for comfortably cool and quiet.

At this point, an Enthoo Primo could net you 300mm per block by running a 480 top and bottom and a 240 on the side. The easiest/cheapest option would be to get some QDCs, buy a Nova 1080 rad, and run it behind your desk. Look at my sig rig for an example; I run a lot of radiator surface for a reason. Man, I feel like I'm rambling, and there are a lot of little things I wanted to say too, but these are the big ones. You have a major cooling problem, you're also quite undersized on the PSU, and your GPUs are a question mark stability-wise. That's a lot of big-ticket issues to iron out, and not easy either.
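The 360mm-per-block rule of thumb makes the shortfall easy to quantify. A quick sketch for the rig in question, using the figures from the post above (three GPU blocks plus one CPU block against 840mm of installed radiator):

```python
# Radiator sizing check using the ~360mm-of-rad-per-block rule of thumb.
# Figures taken from the discussion: 3 GPU blocks + 1 CPU block, 840mm installed.
blocks = 3 + 1
per_block = 360    # mm of radiator per water block (rule of thumb)
installed = 840    # mm of radiator currently in the loop

needed = blocks * per_block
print(f"Target: {needed}mm, installed: {installed}mm, short by {needed - installed}mm")
# prints: Target: 1440mm, installed: 840mm, short by 600mm
```

Being 600mm short is why the suggestions above all add radiator area (a second 480, or an external 1080) rather than just rearranging fans.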


----------



## By-Tor

I have been wanting to add a second R9 290x to my rig, but a second Powercolor 290x LCS is not that easy to find.
http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=powercolor+r9+290x+lcs&N=-1&isNodeId=1

I have also been looking at the Powercolor Devil 13, dual core, 8gb card to add with my current card for tri fire.
http://www.newegg.com/Product/Product.aspx?Item=N82E16814131584

Anyone have any first-hand experience with either of these cards?

Your thoughts?

Thank you


----------



## tsm106

Quote:


> Originally Posted by *By-Tor*
> 
> I have been wanting to *add a second R9 290x to my rig, but a second Powercolor 290x LCS is not that easy to find.*
> http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&Order=BESTMATCH&Description=powercolor+r9+290x+lcs&N=-1&isNodeId=1
> 
> I have also been looking at the Powercolor Devil 13, dual core, *8gb card to add with my current card for tri fire*.
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814131584
> 
> Anyone have any first hand with either of these cards?
> 
> Your thoughts?
> 
> Thank you


You don't need another LCS; it's just a ref card with a ref block. Any matching ref-style PCB and EK block will mate up fine.

It's a terribad idea to mate it with a Devil 13, as those are not full-cover supported. And I would go three single cards before a dual-GPU card plus one, but on your CPU/mobo two is enough.


----------



## By-Tor

Quote:


> Originally Posted by *tsm106*
> 
> You don't need another LCS, it's just a ref card with ref block. Any matching ref style pcb and ek block will mate up fine.
> 
> It's a terribad idea to mate it with a Devil 13 as those are not fullcover supported. And I would go triple single cards before a dually plus but on your cpu/mb two is enough.


I know I can use any ref card alongside my LCS, but I was really wanting them to match in my new build...

http://www.overclock.net/t/1557043/my-phanteks-enthoo-primo-build

Thanks


----------



## specopsFI

Quote:


> Originally Posted by *cephelix*
> 
> I can try to answer some of your questions regarding the 290 since I own one, though I cannot make comparisons with the Tri-X. VRM temps do get quite high when OC'ed. If I'm not mistaken, mine reached around mid- to high-70s with a 26C ambient (AC) when OC'ed to 1050MHz; core temps were around 60C. This was when I first got my card. Over time, I removed the stock cooler and put the card under water with a universal block. The strange thing is that when I went back to air, my temps were worse even after multiple reseats, going into the mid-80s at stock. I have no idea why. With CLU applied to the die and Fujipoly Extreme on VRM1, temps were brought down to the low-70s for both core and VRM1. The difference between the 2 revisions of the MSI Gaming is the chokes used; I've put up pictures here. As far as I can tell, there is no difference in performance between the revisions, and both use Hynix RAM. The difference only comes into play if you plan to put the cards under water.
> 
> The second revision, with the gold chokes and the position of the capacitor by the connector, means that the full-cover EK blocks do not fit without significant modification, detailed here and
> 
> 
> http://imgur.com/HrEJ2
> 
> .
> 
> I find the card really benefits from an exhaust fan near the card, either at the PCIE slot or side panel along with quite a bit of fresh cool air being fed to it.
> 
> Hope that helps. If you have any more questions, Don't hesitate to ask


Been a bit busy for a few days, but thanks for the info anyway (+rep)

Luckily I pulled the trigger on the Tri-X before I got busy, because the cheap ones (both Tri-X and Gaming versions) are now sold out here. I still have a feeling that this is going to be the better deal for me compared to 390/X but I suppose we'll see next week. At least I'm covered, either way.


----------



## teohenwhy

I have the HIS iPower IceQ X² OC H290QMC4GD Radeon R9 290 card, and I was wondering if it's possible to flash the BIOS to another brand's BIOS to help improve performance?


----------



## Gumbi

Hi guys, it's been 6 months since I bought my Vapor-X 290 and I want to repaste it for a small temp benefit. I have some MX-4.

How liberal/conservative should I be? A tiny bit right in the middle? Or a long grain from top to bottom like on a Haswell (as in, on a Haswell the die itself isn't under the entire IHS, so there's no need for paste on the whole sink, plus the cooler will spread it out anyway)?


----------



## hamzta09

Some dude unboxes a 390X.

Found no relevant thread, so I just posted it here, as it's a rebrand, no?


Spoiler: Warning: Spoiler!


----------



## mirzet1976

Quote:


> Originally Posted by *hamzta09*
> 
> Some dude unboxes a 390X.
> 
> Found no relevant thread, so I just posted it here, as its a rebrand, no?
> 
> 
> Spoiler: Warning: Spoiler!


Lol, this card can't be CrossFired, can't find a CrossFire bridge. Some funny guy?


----------



## hamzta09

Quote:


> Originally Posted by *mirzet1976*
> 
> Lol this card cant be crossfired, cant find crossfire bridge, some funny guy?


Even funnier are the comments... the video is gone now, but:

"Too bad the 290x is the AMD flagship while the 970 is the 4th fastest card in the lineup."


----------



## Hazardz

Quote:


> Originally Posted by *hamzta09*
> 
> Some dude unboxes a 390X.
> 
> Found no relevant thread, so I just posted it here, as its a rebrand, no?
> 
> 
> Spoiler: Warning: Spoiler!


Too bad it's taken down now.
Quote:


> Originally Posted by *mirzet1976*
> 
> Lol this card cant be crossfired, cant find crossfire bridge, some funny guy?


Anybody can make a video these days. It's so easy to spread misinformation.


----------



## gatygun

The dude is selling that 390X on eBay now, as he thought he'd bought the flagship Fury card from AMD. But instead he just bought a rebrand.

Dude has no idea what he's doing.
Quote:


> Originally Posted by *Gumbi*
> 
> Hi guys, it's been 6 months since I bought my Vapor x 290 and I want to repaste it for a small temp benefit. I have some MX4.
> 
> How liberal/conservative should I be? A tiny bit right in the middle? Or a long grain from fop to bottom like on a Haswell (as in, in a Haswell the die itself isn't undsr the entire IHS, so no need for paste on the whole sink, plus the cooler will spread it out anyway).


Just a little bit in the middle, that's it.


----------



## kizwan

That's hilarious. LOL


----------



## tsm106

It's posted all over the news section.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *teohenwhy*
> 
> I have the HIS iPower IceQ X² OC H290QMC4GD Radeon R9 290 card and I was wondering if it's possible for me to flash the bios to another brand's bios to help improve on performance?


Yes, absolutely you can. As long as it's not an 8GB BIOS (and your card isn't an 8GB one) you should be fine.

I run the PT1T Shamino BIOS, unlocked via ASUS GPU Tweak, on my Trifire setup.


----------



## teohenwhy

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yes absolutely you can , as long as its not a 8 Gb bios ( and your card isn't a 8gb ) you should be fine .
> 
> I run the PT1T shamino bios unlocked via Asus GPU Tweek on my Trifire setup


I'm guessing doing so would void my warranty? Would I be able to flash back to my old BIOS if anything were to happen and, say, I had to send the card in for warranty purposes? Also, is there a particular BIOS that's better to flash to?


----------



## jura11

Quote:


> Originally Posted by *Roy360*
> 
> Updated.
> 
> Since the crash I've installed 14.12 instead , and have not experienced a single crash like I previously did.
> 
> *Although this time crossfire was disabled.. must have forgotten to enabled it... But atleast I know the rig functions fine without it.
> 
> Well almost fine... since downgrading to 14.12 (or maybe since flashing a reference bios on the ASUS DCII card) my monitors don't wake after (monitor) sleep. Restarting the PC doesn't help. I need to cut the power (switch on the back of the PSU) before the monitors will work again.
> 
> 
> 
> ^Happened Twice now
> I freed up a 140mm side vent, and keep the front panel open while gaming now. If that doesn't work, I'm going to need to get a sandblaster and start removing the sound damping material so I can make some vents


Hi there

I've got a BSOD twice in a row with an ntoskrnl.exe error (usually it's hal.dll). It's unusual: I got this BSOD when I tried to open a PDF with Adobe Reader XI. I tried a few things, and after trying everything in Safe Mode and opening the same PDF under Safe Boot, there was no BSOD. Strange.

In the end I found what was causing it: the TriXX OC utility from Sapphire. On the next boot-up I disabled TriXX, tried the same PDF again, and got no BSOD.

A question: do you have any OC utility enabled, like Afterburner or TriXX? If yes, can you try disabling it and testing again?

I'm using the same driver as you, 14.12, as with newer drivers I don't get the same performance in 3D software. To date this is only my second BSOD with this GPU, and it always runs at 1150/1500. Temps are around 38C idle with a 15-20% fan profile, and under load I've never seen temps higher than 75C playing GTA V or Project Cars with the profile set at 60%.

Hope this helps

Thanks, Jura


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *teohenwhy*
> 
> I'm guessing doing so would void my warranty ? Would I be able to flash back to my old bios if anything was to happen and say I had to send the card in for warranty purposes or something? Also is their a particular bios that's better to flash to?


Put it on the 2nd position of the card's BIOS switch.


----------



## Mrs Bilko

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Update Please
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 2 XFX R9 290 DD Black Edition's
> 
> 
> 
> 
> 
> Spoiler: More Pics


And now;



CPU-Z & GPU-Z


----------



## kcuestag

Quote:


> Originally Posted by *Mrs Bilko*
> 
> And now;
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> CPU-Z & GPU-Z


Ohh, I thought they were a 290X and a 295X2. Whose 295X2 was that on Twitter?









Nice CrossFireX. I have the same model cooler, but a 290X on one of them; the other is sadly a reference card.


----------



## hamzta09

390X 3dmark vs 970.


----------



## Ha-Nocri

Quote:


> Originally Posted by *hamzta09*
> 
> 
> 
> 
> 
> 
> 390X 3dmark vs 970.


How can they charge $100 more for it than the 970? That's moronic. AMD sometimes, really... I hope this isn't true, for their own sake. I would get a 970 if the price were anywhere near what I paid for my 290.


----------



## hamzta09

Quote:


> Originally Posted by *Ha-Nocri*
> 
> How can they charge 100$ more for it than 970? That's moronic. AMD sometimes rly... hope this is not true for their own sake. I would get a 970 if the price was anywhere near what I paid for my 290


Cus 8GB VRAM!

The 290X with 8GB is priced above the 970 here anyway.


----------



## Ha-Nocri

Quote:


> Originally Posted by *hamzta09*
> 
> Cus 8GB VRAM!
> 
> 290X with 8GB here is priced > 970 anyway.


That card doesn't need 8GB in the first place. If the price is true it's a big fail, and I can see it dropping really fast.


----------



## hamzta09

Quote:


> Originally Posted by *Ha-Nocri*
> 
> That card doesnt need 8GB in the first place. If the price is true it's a big fail and I can see it dropping rly fast.


Why not? Some games use more than 4GB at 1080p.
And it's nice for modding.


----------



## Ha-Nocri

Maybe only something heavily modded. But I can't see that being a reason someone would choose it over the 970. I bet you 90-95% would pick the 970.


----------



## Arizonian

Quote:


> Originally Posted by *boredmug*
> 
> These just arrived to my house. Time to drain the loop and give the computer some love.
> 
> 
> Spoiler: Warning: Spoiler!


Post pics when you get it set up, and join the roster.









Quote:


> Originally Posted by *hamzta09*
> 
> Some dude unboxes a 390X.
> 
> Found no relevant thread, so I just posted it here, as its a rebrand, no?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


There is an on-topic place: http://www.overclock.net/t/1560057/youtube-amd-radeon-r9-390x-next-generation-graphics-card-unboxing-video









Quote:


> Originally Posted by *Mrs Bilko*
> 
> And now;
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> CPU-Z & GPU-Z


I see you inherited Sgt Bilko's XFX's - Congrats - added


----------



## hamzta09

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Maybe only something heavily modded. But I can't see it being a reason someone would choose it over 970. I bet you 90-95% would pick 970.


I have 2x 970s; I'd rather go with the 290X.

The 290X is faster than the 970 by a few % on average.

And it also has a fully usable 4GB of VRAM, unlike the 970.


----------



## BradleyW

Quote:


> Originally Posted by *hamzta09*
> 
> I have 2x 970s, Id rather go with 290X.
> 
> 290X is faster on average by a few % than 970.
> 
> And it also has fully usable 4GB VRAM, unlike the 970.


The 970's 4GB is also fully usable; it's just slower on the 0.5GB partition.


----------



## fyzzz

I think I have finally solved the VRM1 heat problem. The only thing I had to do was put a fan back on the card. Now with +233mV (which results in around 1.3V) it barely reaches 70C when running Fire Strike.


----------



## hamzta09

Quote:


> Originally Posted by *BradleyW*
> 
> 970's 4GB is also fully usable. It is just slower on the 0.5GB partition.


Stuttering/micro-freezing above 3584MB isn't "usable".

So... please.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Mrs Bilko*
> 
> And now;
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> CPU-Z & GPU-Z


Can be used as thermal paste , black wax alternative ( must mix it with WD40 ) ..........


----------



## BradleyW

Quote:


> Originally Posted by *hamzta09*
> 
> Stuttering/microfreezing > 3584 isnt "usable"
> 
> So.. please.


By definition, the remaining 0.5GB is usable in the sense of accessibility, which is my point. I'm not debating the performance problems when allocating the remaining 0.5GB.


----------



## Gumbi

Quote:


> Originally Posted by *BradleyW*
> 
> By definition, the remaining 0.5GB is usable in the sense of accessibility, which is my point. I'm not debating the performance problems when allocating the remaining 0.5GB.


You're splitting hairs. It's not usable in any practical sense.

Much like taking a dump on my dinner table: sure, I could do it, but that doesn't mean it's a good idea.


----------



## Mrs Bilko

Quote:


> Originally Posted by *kcuestag*
> 
> Ohh I thought they were 290X and 295x2, who's 295x2 was that on Twitter?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nice CrossfireX, I have the same model cooler but 290X on one of them, the other is sadly a reference card.


Oh, that's Bilko's, and the one I replied to on Twitter is Keith's. And the 290X is Bilko's too.








Quote:


> Originally Posted by *Arizonian*
> 
> I see you inherited Sgt Bilko's XFX's - Congrats - added


Thank you - I'll add to my signature straight away!


----------



## L337Something

What is the current most stable and highest-performance driver for these cards? The latest one off AMD.com? I usually just let the auto-downloader decide.


----------



## Mrs Bilko

Quote:


> Originally Posted by *L337Something*
> 
> What is the current Most Stable and Highest Performance Driver For these cards the latest one off AMD.com? I usually just let the AUTODOWNLOADER decide.


I am currently running the AMD Catalyst 15.5 Beta with absolutely no issues.

I think their latest non-beta is 14.12; that one was running without any problems as well. I still get the crackling noises with CrossFire enabled. The only workaround I've found so far: right after DDU, when you reinstall the drivers, opt out of the HD Audio driver (if you have the same issue), or disable HD Audio in Device Manager.
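A command-line way to do the same HD Audio disable, in case it saves someone a trip to Device Manager. This assumes you have Microsoft's devcon utility installed, and the VEN_1002 pattern (the AMD/ATI vendor ID) is an educated guess for the HDMI audio function, so confirm the exact hardware ID with the find command first:

```shell
:: List HD Audio devices so you can confirm the AMD/ATI HDMI audio hardware ID
devcon find "HDAUDIO\*"

:: Disable the AMD/ATI (VEN_1002) HDMI audio function; undo later with "devcon enable"
devcon disable "HDAUDIO\FUNC_01&VEN_1002*"
```

Run it from an elevated prompt; the device shows as disabled in Device Manager just as if you'd done it by hand.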


----------



## Mrs Bilko

Quote:


> Originally Posted by *Arizonian*
> 
> I see you inherited Sgt Bilko's XFX's - Congrats - added


Oh wait, mine are Bilko's old cards, so why does he have DD and I don't? (Just curious)


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Gumbi*
> 
> You're splitting hairs. It's not usable in any practical sense.
> 
> *Much like taking a dump on my dinner table, sure, I could do it, but that doesn't mean it's a good idea.*


LooooL ...

Wut ......


----------



## tsm106

Quote:


> Originally Posted by *Mrs Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *L337Something*
> 
> *What is the current Most Stable and Highest Performance Driver For these cards* the latest one off AMD.com? I usually just let the AUTODOWNLOADER decide.
> 
> 
> 
> I am currently running AMD Catalyst Beta 15.5 with absolutely no issues.
> 
> But I think their latest not beta is 14.12
> - that one was running without any problems as well. I still have the crackling noises with crossfire enabled, only workaround I've found so far is just after the DDU and when you reinstall drivers, opt out the HD Audio driver - if you have same issue - Or disable HD audio in device manager.
Click to expand...

This is the new hotness, 15.200.1040.0 June 8.

http://forums.guru3d.com/showthread.php?t=399956


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *tsm106*
> 
> This is the new hotness, 15.200.1040.0 June 8.
> 
> http://forums.guru3d.com/showthread.php?t=399956


WOW, that's an improvement


----------



## fyzzz

Now this is a result that makes more sense than my ridiculous 14400 GPU score: http://www.3dmark.com/fs/5085701 almost 14000 GPU score at 1240/1550, but I have also changed from Windows 10 back to Windows 8.1. (I always get the "time measurement inaccurate" flag when I close the demo in Task Manager.)


----------



## fyzzz

Can somebody donate an XFX R9 290-EDFD BIOS? I forgot to save mine. I have tried different BIOSes today but only get a blue screen. Now I have derped up and flashed the wrong BIOS on both.


----------



## Jflisk

Quote:


> Originally Posted by *fyzzz*
> 
> Can somebody donate an XFX R9 290-EDFD BIOS? I forgot to save mine. I have tried different BIOSes today but only get a blue screen. Now I have derped up and flashed the wrong BIOS on both.


These are the XFX standard BIOS and the XFX DD BIOS; I think you're looking for the DD. They are for the 4 GB versions, pulled from my cards with GPU-Z.

xfxdd.zip 141k .zip file


Also, if these don't work for you, try here:
http://www.overclock.net/t/1331663/xfx-black-double-dissipation-club/330#post_24035783
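For anyone about to flash one of these, here's a rough ATIFlash sequence from an elevated command prompt. It's only a sketch: the adapter index 0 and the ROM file names are placeholders, so check your own card's index with -i first, and always save a backup before flashing.

```shell
:: List adapters and note the index of the card you want to flash
atiflash -i

:: Save the current BIOS first (the step that was skipped here)
atiflash -s 0 backup_stock.rom

:: Program adapter 0 with the new BIOS; add -f only if the subsystem ID check blocks a known-good ROM
atiflash -p 0 xfx_dd.rom
```

Reboot after a successful flash; if the card won't POST, the saved backup_stock.rom is what gets you back.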


----------



## mfknjadagr8

It's odd... upon installing 15.5 I blue-screened 3 times; after using CCleaner and DDU it finally reinstalled without blue-screening. Somehow I keep borking the driver; this will be the third time this month, and the Windows install was fresh. I've never had these kinds of issues with drivers before. I'm not sure how adding +10 to voltage, maxing the power limit, and a mild 1050/1300 overclock could cause this. What else should I be looking at? Temps are below 60 at all times now; VRM max is 62.


----------



## fyzzz

Quote:


> Originally Posted by *Jflisk*
> 
> These are the XFX standard BIOS and the XFX DD BIOS; I think you're looking for the DD. They are for the 4 GB versions, pulled from my cards with GPU-Z.
> 
> xfxdd.zip 141k .zip file
> 
> 
> Also if these do not work for you try here
> http://www.overclock.net/t/1331663/xfx-black-double-dissipation-club/330#post_24035783


Thanks for the help!


----------



## moorhen2

Quote:


> Originally Posted by *fyzzz*
> 
> Thanks for the help!


This is the best place to look for a gpu bios.









http://www.techpowerup.com/vgabios/


----------



## fyzzz

Quote:


> Originally Posted by *moorhen2*
> 
> This is the best place to look for a gpu bios.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/vgabios/


Yeah, I have looked there, but the XFX BIOSes there don't work with my card. In the end, though, I found a Sapphire Tri-X BIOS that works.


----------



## Internet Swag

Should I use Omega (14.12) for a Radeon R9 290?


----------



## JourneymanMike

Quote:


> Originally Posted by *Internet Swag*
> 
> Should I use Omega (14.12) for a Radeon R9 290?


That is the current official driver.

There are also two (I think it's two) betas out there on the AMD website.


----------



## Regnitto

Tried the 15.5 beta driver today, and my OC that was stable on 14.12 (1100/1250, +50 power limit, stock voltage) was unstable in the Heaven benchmark. Went back to 14.12 and it is stable again.


----------



## elgreco14

Guys, how much are two MSI R9 290X Gaming 4G cards worth if you sell them second hand? One card is 6 months old and the other 2.5 weeks; both have the receipt, and the newer one still has the box.

Can anyone tell me how much they are worth? I think you guys know best.

I was thinking 280 for the old one and 330 for the new one (in euros; new ones are 380 online at the moment).


----------



## boredmug

Well, I paid 500 USD for two 290Xs with EK full-cover blocks, but I think I got a pretty rocking deal.


----------



## By-Tor

Quote:


> Originally Posted by *boredmug*
> 
> Well i paid 500 USD for two 290x's with EK full cover blocks but i think i got a pretty rocking deal.


Yeah I would say....


----------



## boredmug

I see them going for around 250 USD used.


----------



## mfknjadagr8

Quote:


> Originally Posted by *By-Tor*
> 
> Yeah I would say....


amazing deal I've got 700 in mine with swiftech komodo blocks...but yeah it's always nice to get a good deal


----------



## boredmug

Quote:


> Originally Posted by *mfknjadagr8*
> 
> amazing deal I've got 700 in mine with swiftech komodo blocks...but yeah it's always nice to get a good deal


Yeah, the only problem is that now I'm shopping for a bigger case and another rad to cool these bastards. A 360mm and a 240mm just don't do the job well enough for me. I don't like the fan speeds I have to run to get close to the temps I had with my 7950s.


----------



## tsm106

Quote:


> Originally Posted by *boredmug*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mfknjadagr8*
> 
> amazing deal I've got 700 in mine with swiftech komodo blocks...but yeah it's always nice to get a good deal
> 
> 
> 
> Yea, only problems is now *I'm shopping for a bigger case* and another rad to cool these bastards. 360mm and 240mm just doesn't do the job well enough for me. I don't like the fan speeds I have to run at to get close to the temps I had with my 7950's
Click to expand...

Check out the Enthoo Primo and take a look at my dual 480 setup in it.


----------



## boredmug

Quote:


> Originally Posted by *tsm106*
> 
> Check out the Enthoo Primo and take a look at my dual 480 setup in it.


Well Fry's has the 900d on sale for 99 dollars. I currently have the 800d so I'm probably gunna go with that.


----------



## tsm106

Well that is cheap.


----------



## boredmug

Quote:


> Originally Posted by *tsm106*
> 
> Well that is cheap.


I KNOW!! I think I paid close to that for my 800D and then chopped it up to make the two rads I have now fit. Either they're trying to get rid of inventory or they screwed up the online price. I don't care, I think I'm gunna jump on it.


----------



## tsm106

I ordered one for pick up. I'm a sucker for cheap cases lol.


----------



## boredmug

Quote:


> Originally Posted by *tsm106*
> 
> I ordered one for pick up. I'm a sucker for cheap cases lol.


Hell yea, its quite a deal. I think I'm gunna go grab mine after work tomorrow. Throw it on the Fry's card.


----------



## LandonAaron

3dMark is on sale for $4.99 right now on the Steam summer sale for those interested.


----------



## fyzzz

Quote:


> Originally Posted by *LandonAaron*
> 
> 3dMark is on sale for $4.99 right now on the Steam summer sale for those interested.


Just bought it


----------



## fyzzz

Now that I have bought 3DMark, I can run Extreme. So here is an Extreme run at 1200/1550: http://www.3dmark.com/3dm/7347559 - is it a good score?


----------



## fyzzz

And one more thing: I can't seem to get over 1.3V, not even with +300mV, lol. So I've been trying to find a BIOS, but everything just seems to go wrong. Why do I want over 1.3V? Just for benchmarking; I can already hit about 1240/1550 before it gets too unstable.


----------



## Ha-Nocri

You can't run Fire Strike on the free version?


----------



## boredmug

Quote:


> Originally Posted by *tsm106*
> 
> I ordered one for pick up. I'm a sucker for cheap cases lol.


Well, I got screwed on that deal. Went up to Fry's today and apparently they didn't update their inventory. All gone, lol. You probably got the last one.


----------



## fyzzz

Quote:


> Originally Posted by *Ha-Nocri*
> 
> u cant run firestrike on the free version?


You can run normal Fire Strike on the free version, but not Extreme and Ultra.


----------



## tsm106

Quote:


> Originally Posted by *boredmug*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> I ordered one for pick up. I'm a sucker for cheap cases lol.
> 
> 
> 
> Well I got screwed on that deal. Went up to Frys today and apparently they didn't update their inventory. All gone. Lol. You probably got the last one.
Click to expand...

Nah, I didn't get it either, no stock. Fry'd screws everyone. They always, always post stuff at silly prices and never deliver.


----------



## boredmug

Quote:


> Originally Posted by *tsm106*
> 
> Nah, I didn't get it either, no stock. Fry'd screws everyone. They always, always post stuff at silly prices and never deliver.


Pissed me off that I hauled ass up there before I had to get my kids, only to be told "sorry, we didn't update our inventory." Now I have another rad on the way with no place to put it except hanging off the back of my 800D, and I'm not sure what kind of brackets to order to get this done.

I've been watching my son play GTA V for about 2 hours with the Corsair SPs running full blast. Cards are at 51 and 56. I'd prefer not to have these fans this loud for these crap temps.


----------



## tsm106

There's lots of options for hanging a rad off the back from swiftech, bitspower, koolance and xspc. Check ppcs in the rad accessories area.


----------



## fyzzz

I think I need a BIOS that supports this memory: 4096 MB GDDR5, Hynix H5GC2H24BFR.


----------



## fyzzz

I have found a Vapor-X BIOS that works, and now I am getting more Fire Strike points.


----------



## rdr09

Quote:


> Originally Posted by *boredmug*
> 
> Pissed me off thatvi hauled ass up there before I had to get my kids to be told sorry, we didn't update our inventory. Now i have another rad on the way with no place to put it except hanging off the back off my 800d. I'm not sure what kind of brackets to order to get this done. .
> 
> I've been watching my son play GTA V for about 2 hours with corsair SP's running full blast.. Cards are at 51 and 56.. I'd prefer to not have these fans so loud for these crap temps


Place it in another room . . .

http://www.overclock.net/t/505713/yes-another-car-radiator-thread-56k-warning

just kidding. what pump are you using?

Edit: don't buy from here. just reference . . .

http://www.frozencpu.com/cat/l3/g30/c637/s1731/list/p1/Liquid_Cooling-PC_Water_Cooling_Radiator_Accessories-Radiator_Mounts-Page1.html


----------



## 4kallday

I decided to paint my R9 290Xs to match my case. What do you all think? The first picture shows the before, the second is a comparison, and the third is how it looks currently:


----------



## fishingfanatic

Looks pretty sweet!

FF


----------



## DDSZ

Looks like our 290(x) can't be flashed to 390(x)?


----------



## tsm106

New driver and another one soon.

http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-300-Series.aspx


----------



## Forceman

Quote:


> Originally Posted by *tsm106*
> 
> New driver and another one soon.
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-300-Series.aspx


Any release notes? Support for frame rate control, specifically?


----------



## tsm106

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> New driver and another one soon.
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-300-Series.aspx
> 
> 
> 
> Any release notes? Support for frame rate control, specifically?
Click to expand...

There was a post somewhere listing the changes, but I can't find it now. IIRC it had variable frame rate control, VSR, etc. Basically the works.


----------



## Forceman

Quote:


> Originally Posted by *tsm106*
> 
> There was a post somewhere but I can't find it now listing changes. Iirc it had variable framerate control, vsr, etc, basically the works.


Sweet


----------



## tsm106

Formatting is terribad.









http://forums.overclockers.co.uk/showpost.php?p=28184021&postcount=11056
Quote:


> AMD is releasing a two-phase driver release with Catalyst 15.15 and 15.20. The 15.15 drivers are available immediately, and the 15.20 drivers are coming a few weeks later.
> 
> The Catalyst 15.15 drivers bring new features like:
> 
> FPS Targeting: the ability to save power by setting a maximum FPS target.
> 
> Virtual Super Resolution: the ability to downscale a game from a high resolution to a lower resolution, giving a user higher-quality textures in game (also known as supersampling).
> 
> Performance optimizations: adding game performance optimizations to the AMD R9 Fury X and other graphics cards.
> 
> Although Catalyst 15.15 drivers will not be available for Windows 10, Catalyst 15.20 will be. The Catalyst 15.20 drivers will bring features like:
> 
> Catalyst Uninstaller: allows users to uninstall their AMD Catalyst drivers cleanly, so new drivers can be installed without incident.
> 
> OpenCL 2.0 optional features: multiple performance and feature additions.
> 
> FreeSync + CrossFire: the ability to run FreeSync dynamic screen refresh technology with multiple GPUs in CrossFire mode.
> 
> Catalyst 15.20 drivers will also have Windows 10 specific features like:
> 
> HEVC (High Efficiency Video Codec): enables quality streaming and 4K experiences.
> 
> DirectX 12: support for the low-level efficient graphics API for Windows 10.
> 
> Windows 10 WHQL (Windows Hardware Quality Lab): AMD's latest driver release (15.20) will be WHQL, and WHQL for Windows 10 as well.


----------



## boredmug

Quote:


> Originally Posted by *rdr09*
> 
> Place it in another room . . .
> 
> http://www.overclock.net/t/505713/yes-another-car-radiator-thread-56k-warning
> 
> just kidding. what pump are you using?
> 
> Edit: don't buy from here. just reference . . .
> 
> http://www.frozencpu.com/cat/l3/g30/c637/s1731/list/p1/Liquid_Cooling-PC_Water_Cooling_Radiator_Accessories-Radiator_Mounts-Page1.html


I'm using a D5. Why do you say not to order from FrozenCPU? I have used them in the past.


----------



## tsm106

Fcpu is dead but their site is still up collecting all your moneys!


----------



## Forceman

Quote:


> Originally Posted by *tsm106*
> 
> Formatting is terribad.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://forums.overclockers.co.uk/showpost.php?p=28184021&postcount=11056


Strange, downloaded and tried to install them and it says they are 14.502.1014, which is the same as the 15.4 betas.


----------



## boredmug

Quote:


> Originally Posted by *tsm106*
> 
> Fcpu is dead but their site is still up collecting all your moneys!


Seriously? So if you place an order they'll take your money and not send anything? WOW. I ordered from performance PC last night. Glad I did.

I feel a little bad about hanging a rad off the back of my 800d but I need to see better numbers on these cards. Not like I stare at the back of my case anyways...


----------



## tsm106

Quote:


> Originally Posted by *Forceman*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Formatting is terribad.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://forums.overclockers.co.uk/showpost.php?p=28184021&postcount=11056
> 
> 
> 
> Strange, downloaded and tried to install them and it says they are *14.502.1014*, which is the same as the 15.4 betas.
Click to expand...

That's the driver packaging number iirc?


----------



## By-Tor

Quote:


> Originally Posted by *boredmug*
> 
> Seriously? So if you place an order they'll take your money and not send anything? WOW. I ordered from performance PC last night. Glad I did.


Yes, FCPU has been dead for some time now. I have placed 6 orders this year with Performance PCs and have had nothing but great service.


----------



## Forceman

Quote:


> Originally Posted by *tsm106*
> 
> That's the driver packaging number iirc?


Not sure, but the installer shows it as the same version I already have. Re-downloading now.

Edit: No dice, same thing. Shows the same version. Weird.


----------



## rdr09

Quote:


> Originally Posted by *tsm106*
> 
> New driver and another one soon.
> 
> http://support.amd.com/en-us/kb-articles/Pages/AMD-Radeon-300-Series.aspx


+rep. took like 3 mins to install. gained about a point in heaven. plays good, though.

Quote:


> Originally Posted by *boredmug*
> 
> I'm using a d5. Why do you say not to order from frozen CPU? I have used them in the past.


maybe the 290X are really much warmer than the 290. D5 should be strong enuf. i use an alphacool p655 with a 240 and 360 rads. my cougar fans are quiet.


----------



## Forceman

Quote:


> Originally Posted by *rdr09*
> 
> +rep. took like 3 mins to install. gained about a point in heaven. plays good, though.
> maybe the 290X are really much warmer than the 290. D5 should be strong enuf. i use an alphacool p655 with a 240 and 360 rads. my cougar fans are quiet.


What does CCC say for the driver version? The long number, not 15.15 or whatever.
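If anyone else is hunting for that long number, one quick way to read it without opening CCC at all (wmic ships with Windows, so no extra tools assumed):

```shell
:: Print the display adapter name and the long driver version number
wmic path win32_videocontroller get name,driverversion
```

It reports the same packaging-style number GPU-Z shows on its main tab, so you can compare installs without digging through CCC's information pages.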


----------



## rdr09

Quote:


> Originally Posted by *Forceman*
> 
> What does CCC say for the driver version? The long number, not 15.15 or whatever.


lol. I don't know how to find it in CCC. Hardly use it. my bad. but, here is GPUZ . . .



Is it the right one?

Edit: Found it . . .


----------



## Forceman

Thanks, that shows a slightly different number than the ones I have installed. Wonder why it says 15.5 instead of 15.15 though.


----------



## rdr09

Quote:


> Originally Posted by *Forceman*
> 
> Thanks, that shows a slightly different number than the ones I have installed. Wonder why it says 15.5 instead of 15.15 though.


You are right. I did not even notice that; I thought it was 15.15.


----------



## tsm106

Just got home. I'll try these new drivers out. Bah, really hate redoing eyefinity each time though.


----------



## boredmug

Quote:


> Originally Posted by *rdr09*
> 
> +rep. took like 3 mins to install. gained about a point in heaven. plays good, though.
> maybe the 290X are really much warmer than the 290. D5 should be strong enuf. i use an alphacool p655 with a 240 and 360 rads. my cougar fans are quiet.


You have 290's in crossfire? What thickness radiators? Also what are your temps like? Oh and your ambient?


----------



## boredmug

Quote:


> Originally Posted by *tsm106*
> 
> Just got home. I'll try these new drivers out. Bah, really hate redoing eyefinity each time though.


I finally sorted those cards out BTW. I went back and looked through the link you sent me and followed it to a T. I had previously installed and uninstalled drivers a million times. Talk about tired of resetting up eyefinity. What resolution do you run eyefinity in?


----------



## BradleyW

Modified 15.15 are up on Guru3D forums to support the 290X and lower.

http://forum.guru3d.com/showthread.php?t=400078&page=2


----------



## tsm106

Quote:


> Originally Posted by *boredmug*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Just got home. I'll try these new drivers out. Bah, really hate redoing eyefinity each time though.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *I finally sorted those cards out BTW. I went back and looked through the link you sent me and followed it to a T.* I had previously installed and uninstalled drivers a million times. Talk about tired of resetting up eyefinity. What resolution do you run eyefinity in?
Click to expand...

Nice. The settings I have in the guide work in that particular order.









I run roughly [email protected]

Btw guys, you have to get the updated driver with inf mod by asder00 in the link above this post.


----------



## rdr09

Quote:


> Originally Posted by *boredmug*
> 
> You have 290's in crossfire? What thickness radiators? Also what are your temps like? Oh and your ambient?


Yes, my 240 is slim but my 360 is 45mm thick. Ambient is about 22C, though. In BF4 my GPU cores hit just above 50, but the VRMs are much cooler and don't see 50. My i7 SB @ 4.5GHz reaches 57C. I ran out of paste, so I only used the paste that came with the EK waterblocks for the CPU.

BTW, my GPU blocks are using Fujipoly for the VRMs and CLU for the core.


----------



## boredmug

Quote:


> Originally Posted by *rdr09*
> 
> Yes, my 240 is slim but my 360 is 45mm thick. Ambient is about 22C, though. In BF4, my GPU cores hit just above 50 but the vrms are much cooler and they don't see 50.. My i7 SB @ 4.5GHz reaches 57C. I run out of paste, so I only used the paste that came with the ek waterblocks for the cpu.
> 
> BTW, my gpu blocks are using Fujipoly for the vrms and CLU for the core.


I have two slim rads. My ambient is more like 25 or a little higher, depending; this room doesn't stay as cool as the other rooms, and it definitely gets warm when gaming.

I don't play BF4, but I've heard GTA 5 is slightly more CPU intensive. I'm also running an SB 2600K @ 4.5. I really haven't been watching the VRMs too much, as I've noticed they stay pretty cool. Just the stock pads that came with the EK blocks, though. None of my paste is super high-end; I ran out of the good stuff a while back and have been using some that came with a watercooling kit.


----------



## JourneymanMike

Quote:


> Originally Posted by *rdr09*
> 
> Yes, my 240 is slim but my 360 is 45mm thick. Ambient is about 22C, though. In BF4, my GPU cores hit just above 50 but the vrms are much cooler and they don't see 50.. My i7 SB @ 4.5GHz reaches 57C. I run out of paste, so I only used the paste that came with the ek waterblocks for the cpu.
> 
> BTW, my gpu blocks are using Fujipoly for the vrms and CLU for the core.


I have an EK Supremacy EVO, it came with Gelid X, which is a very fine TIM

IC Diamond on the NB and GPU... I use Fujipoly 0.5mm pads for the GPU's and MB's block VRM's...


----------



## Forceman

Quote:


> Originally Posted by *BradleyW*
> 
> Modified 15.15 are up on Guru3D forums to support the 290X and lower.
> 
> http://forum.guru3d.com/showthread.php?t=400078&page=2


Surely AMD intends to release these for 2-series cards, right?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *tsm106*
> 
> Btw guys, you have to get the updated driver with inf mod by asder00 in the link above this post.


Any better than the other modded windows 10 driver? or 15.5 beta?


----------



## kizwan

Quote:


> Originally Posted by *boredmug*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> Yes, my 240 is slim but my 360 is 45mm thick. Ambient is about 22C, though. In BF4, my GPU cores hit just above 50 but the vrms are much cooler and they don't see 50.. My i7 SB @ 4.5GHz reaches 57C. I run out of paste, so I only used the paste that came with the ek waterblocks for the cpu.
> 
> BTW, my gpu blocks are using Fujipoly for the vrms and CLU for the core.
> 
> 
> 
> I have two slim rads. My ambient is more 25 or a little higher depending.. This room doesn't stay as cool as the other rooms and it definately gets warm when gaming.
> 
> Don't play BF4 but i've head gta 5 is slightly more cpu intensive. I'm also running SB 2600k @ 4.5. I really haven't been watching vrm's too much as i've noticed they stay pretty cool. Just the stock pads that came with the EK blocks though. None of my paste is super highend. I ran out of good stuff a while back and have been using some that came with a watercooling kit.
Click to expand...

GPU @ 50s Celsius is pretty good already, especially in high ambient. Mine can go up to the low 60s, but keep in mind my ambient is around 32 to 35C.

My i7 can go up to the 70s.


----------



## fyzzz

I'm on air, but my GPU goes up to 54 and the VRMs to around 50 at stock, and my i5 clocked at 4.7 goes up to around 60. Ambient is 20C. (Raijintek Morpheus on the GPU and a Noctua NH-U14S on the CPU.)


----------



## Aussiejuggalo

Weird question, does anyone else get major cap whine on the Beta 15.4 in Minecraft?


----------



## rdr09

Quote:


> Originally Posted by *boredmug*
> 
> I have two slim rads. My ambient is more 25 or a little higher depending.. This room doesn't stay as cool as the other rooms and it definately gets warm when gaming.
> 
> Don't play BF4 but i've head gta 5 is slightly more cpu intensive. I'm also running SB 2600k @ 4.5. I really haven't been watching vrm's too much as i've noticed they stay pretty cool. Just the stock pads that came with the EK blocks though. None of my paste is super highend. I ran out of good stuff a while back and have been using some that came with a watercooling kit.


You got 290Xs, so they run a little warmer, and your ambient is high. Let's compare the render test found in GPU-Z, using fullscreen . . .

You have to open 3 instances and use the drop-down at the bottom of one GPU-Z window to pick the right GPU. Hit the "?" to start the render test. Make sure to set the temp readouts to max. Run the test for 15 minutes to half an hour.
Quote:


> Originally Posted by *JourneymanMike*
> 
> I have an EK Supremacy EVO, it came with Gelid X, which is a very fine TIM
> 
> IC Diamond on the NB and GPU... I use Fujipoly 0.5mm pads for the GPU's and MB's block VRM's...


I was gonna use CLU on the cpu but i chickened out.


----------



## phasedscum

Hi guys, I have an Asus 290X DCU II. I would like to use a copper shim so that all 5 heatpipes in the DCU II heatsink make contact. Has anyone tried this before? What thickness of copper shim is used?


----------



## elgreco14

So I went from CrossFire 290Xs to an R9 295X2. In most games my GPU usage is 90-100%, so no more trouble with that, or with the heat problems. Is it possible to sell my two R9 290X Gaming 4G cards somewhere on this forum?


----------



## By-Tor

Quote:


> Originally Posted by *elgreco14*
> 
> So I went from crossfire 290x to R9 295x2. In most games my gpu usage is 90-100. So no trouble anymore with that and these heat problems. Is it possible to sell my R9 290X gaming 4G (2 pieces) on this forum somewhere?


You can post them here in the Video marketplace, but you must have 35 rep to sell here...

http://www.overclock.net/f/14779/video


----------



## PachAz

Heh, damn, the reference cooler on the R9 290/X is loud. I just tested my second stock R9 290, which will be going into my second spare PC. At 100% load the noise is unbearable and the temp is 94C in Valley. I simply don't understand how people can game with stock R9 290/X cards. This makes me appreciate the waterblock even more. And the heat that the GPU is dumping, oh boy.

Wonder if the R9 390/X will be similar in terms of noise and heat? If so, a waterblock is a must even on those with an aftermarket air cooler! If one doesn't intend to watercool, I can understand why someone would pay more to get an Nvidia card that performs slightly less but runs cooler and quieter.


----------



## tsm106

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Btw guys, you have to get the updated driver with inf mod by asder00 in the link above this post.
> 
> 
> 
> 
> 
> Any better than the other modded windows 10 driver? or 15.5 beta?
Click to expand...

So far so good. It's quick and stable maybe even feels a little bit more rock solid than the 1040 driver.


----------



## mfknjadagr8

Quote:


> Originally Posted by *PachAz*
> 
> Heh, damn the reference cooler on the R9 290/X is loud, I just tested my second stock R9 290 which will be going into my second spare PC. At 100% load the noise is unbarable and the temp is 94C in Valley. I simply dont understand how people can game with stock R9 290/X cards. This makes me appreciate the waterblock even more
> 
> 
> 
> 
> 
> 
> 
> . And the heat that the GPU is dumping, oh boy.
> 
> Wonder if the R9 390/X will be similar in terms of noise and heat? If so, a waterblock is a must even on those with aftermarket air cooler! If one dont intend to watercool I can understand why some one would pay more to get an Nvidia card that performs slightly less but runs cooler and more quiet.


Most cards don't use the reference cooler. Try running Afterburner and ramping up the fan speed to keep thermals in a good spot; it will be unbearable. I had two running at 75 percent fan speed. You may find the paste is bad when you put waterblocks on. If you run around 60 percent fan speed it's not much louder, but above 70 it's loud.


----------



## PachAz

Point is, what's the point of getting an AMD card with the reference cooler? Even if the GPU isn't used to 100% in gaming, the fan will be noisy and temps will be high. Why doesn't AMD make a new reference cooler to begin with? It would be highly beneficial for temps and sound level. The only reason I can see for a reference-cooled GPU is if you will watercool, and then it doesn't matter, other than it might be worth less when selling second hand compared to a reference PCB with a better cooler. Anyway, you get my point. I just feel sorry for the casual gamers buying AMD cards with the reference cooler; you can't even enjoy the gaming experience with all the noise and heat.


----------



## boredmug

Quote:


> Originally Posted by *PachAz*
> 
> Point is, whats the point getting an AMD with reference cooler? Even if the gpu is not used to 100% in gaming the fan will be noisy and temps will be high. Why dont AMD make a new reference cooler to begin with? It would be highly beneficial for temps and sound level. The only reason I can see for a reference cooled GPU is if you will WC and then it doesnt matter other than it might be less worth when selling 2:nd hand compared to a reference PCB with better cooler. Anyways you get my point. I just feel sorry for the casual gamers buying AMD cards with reference cooler, you cant even enjoy the gaming experiance due to all noise and heat.


Honestly, I don't think I'd run air on ANYTHING, but that's just me. I think the point was to get the silicon out there and let other companies iron out the cooling solution. I'm pretty sure AMD doesn't have the R&D budget a company like Nvidia has, so I guess they pass those tasks down the line?


----------



## HoZy

I only purchased my launch reference cards because I knew I'd be going straight to waterblocks when they released. My cards haven't seen temps over 60C in years.

Still wish I'd switched the XFire 290Xs for a 295X when I had the chance, though.

Cheers
Mat


----------



## PachAz

Yeah, a 295X2 takes less space. I'm not really a fan of CrossFire because even though you get more FPS in some games, it feels less smooth compared to one card. I put a lot of emphasis on smoothness while gaming and less on eye candy; that's why I run all games on low settings with textures on high/max to get those 100+ FPS. I will be keeping my R9 290 for a long while because I don't play any games that are bottlenecked by the GPU, and I'm not keen on getting a new waterblock and messing around. Looking at how well the R9 290/X still performs today, I'm not motivated at all to get something new.


----------



## JourneymanMike

Quote:


> Originally Posted by *PachAz*
> 
> Point is, what's the point of getting an AMD card with the reference cooler? Even if the GPU isn't pushed to 100% in gaming, the fan will be noisy and temps will be high. Why doesn't AMD make a new reference cooler to begin with? It would be highly beneficial for temps and sound levels. *The only reason I can see for a reference-cooled GPU is if you're going to watercool, and then it doesn't matter, other than it might be worth less when selling secondhand compared to a reference PCB with a better cooler*. Anyways, you get my point. I just feel sorry for the casual gamers buying AMD cards with the reference cooler; you can't even enjoy the gaming experience due to all the noise and heat.


Plus, reference cards are the first ones that waterblocks are readily available for... and always are...


----------



## HoZy

PachAZ,

I couldn't agree more. The microstutter I get in various games with CrossFire is annoying, and the lack of CrossFire support in a few newer games grinds my gears.

But for gaming at 2560x1440 or 3840x1440, CrossFire was a must really.


----------



## PachAz

I will always prefer a very fast single-GPU card, because it takes up less space in the case and gives you smoother fps. But I think the general trend will push more people to run CrossFire due to 1440p and 4K gaming. Maybe that's why the R9 390/X has 8GB of RAM, which helps at very high resolutions. I have used a 120Hz screen for almost 3 years though, and it is simply too difficult for me to go back to 60Hz and 60 fps. I am an old-school gamer hehe. From the times when an i7 3930K and dual 7970s were the norm to achieve 120+ fps in BF3 at 1080p, driving tanks and jets, pwning. Sadly there are not many left from that era. New-school kids are all about maxing everything at big resolutions.


----------



## HoZy

I wouldn't call the 7970 or BF3 old. That was 2011/2012.


----------



## PachAz

It was a new era that is old now...^^


----------



## pengs

Any performance gain or reason to move from 15.4 or 15.5 to 15.15?


----------



## tsm106

Quote:


> Originally Posted by *HoZy*
> 
> I wouldn't call 7970 or bf3 old, That was 2011/2012.


7970s are ancient now. And I should know, having run one of the fastest quad-7970 setups here since release.


----------



## kizwan

Quote:


> Originally Posted by *PachAz*
> 
> Point is, what's the point of getting an AMD card with the reference cooler? Even if the GPU isn't pushed to 100% in gaming, the fan will be noisy and temps will be high. Why doesn't AMD make a new reference cooler to begin with? It would be highly beneficial for temps and sound levels. The only reason I can see for a reference-cooled GPU is if you're going to watercool, and then it doesn't matter, other than it might be worth less when selling secondhand compared to a reference PCB with a better cooler. Anyways, you get my point. I just feel sorry for the casual gamers buying AMD cards with the reference cooler; you can't even enjoy the gaming experience due to all the noise and heat.


I'm quite surprised there are still people complaining about the reference cooler. It's a bit late, don't you think, since the 390X & Fury are around the corner. Reference coolers are always noisy though. My 5870 reference cooler was also noisy. Reference cards are suited to watercooling & are the first cards that get waterblocks. You should have gotten a non-reference PCB though.
Quote:


> Originally Posted by *HoZy*
> 
> PachAZ,
> 
> I couldn't agree more. The microstutter I get in various games with CrossFire is annoying, and the lack of CrossFire support in a few newer games grinds my gears.
> 
> But for gaming at 2560x1440 or 3840x1440, CrossFire was a must really.


What games are you getting microstutter?


----------



## By-Tor

I don't see any reference coolers on the 390 and 390x cards...

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&N=8000&Order=BESTMATCH&Description=PPSSXSFFGLFAUN

http://www.newegg.com/Product/ProductList.aspx?Submit=ENE&DEPA=0&N=8000&Order=BESTMATCH&Description=PPSSDLPPKMVQUW

All are listed here..
http://promotions.newegg.com/amd/15-2972/index.html


----------



## rdr09

Quote:


> Originally Posted by *PachAz*
> 
> Heh, damn the reference cooler on the R9 290/X is loud. I just tested my second stock R9 290, which will be going into my second spare PC. At 100% load the noise is unbearable and the temp is 94C in Valley. I simply don't understand how people can game with stock R9 290/X cards. This makes me appreciate the waterblock even more
> 
> 
> 
> 
> 
> 
> 
> And the heat that the GPU is dumping, oh boy.
> 
> Wonder if the R9 390/X will be similar in terms of noise and heat? If so, a waterblock is a must even on those with an aftermarket air cooler! If one doesn't intend to watercool, I can understand why someone would pay more to get an Nvidia card that performs slightly worse but runs cooler and quieter.


I only had my reference 290 on air for a week. Back in 2013.


----------



## Jflisk

Air is meant for breathing, not cooling computers


----------



## xKrNMBoYx

Got a question for any Sapphire reference 290X owners: what does the switch do on this particular GPU?



Is it an Uber vs Quiet switch, or is it Sapphire's Legacy vs UEFI switch? From what I know, Sapphire's dual-BIOS switch looks like this.



But I don't think my card has the button/switch with the Sapphire logo. So in my case, is my switch the Uber/Quiet switch or the Sapphire UEFI/Legacy switch?


----------



## tsm106

Reference cards are all the same because, drumroll, they follow the reference spec. On reference cards the switch changes between Quiet and Uber. On custom cards it can do a variety of different things. Google yer card's reviews to confirm what yers does. That button may be for an OC, IIRC.


----------



## sTOrM41

Yeah, I finally managed to edit my 290 BIOS









Here's my 290 PCS+ with 1060/1400 default clocks


----------



## pcrevolution

Finally done with my watercooling loop.
Quote:


> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Added you seven members ago. I took your word for it and it's listed under water. But it doesn't get you out of showing us a pic when you do.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nice PSU cables btw.
> Congrats - added





Spoiler: Warning: Spoiler!







Green just happens to be my favourite colour.







The AsRock Z97E-ITX will go when Skylake is out.



I got a pretty bad batch of GDDR5 from Elpida. It refuses to go above 1300MHz.



It's 30C ambient here so my rads are working extra hard. When running the benchmark the voltage appears to be around 1.188 - 1.213V, but the moment I'm back at the desktop it reverts to 1.28V and the card does not underclock. Does anyone have any idea why?


----------



## Raephen

Quote:


> Originally Posted by *xKrNMBoYx*
> 
> Got a question for any Sapphire reference 290X owners: what does the switch do on this particular GPU?
> 
> 
> 
> Is it an Uber vs Quiet switch, or is it Sapphire's Legacy vs UEFI switch? From what I know, Sapphire's dual-BIOS switch looks like this.
> 
> 
> 
> But I don't think my card has the button/switch with the Sapphire logo. So in my case, is my switch the Uber/Quiet switch or the Sapphire UEFI/Legacy switch?


On my reference Sapphire 290 it's Uber & Quiet.


----------



## Arizonian

Quote:


> Originally Posted by *pcrevolution*
> 
> Finally done with my watercooling loop.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Green just happens to be my favourite colour.
> 
> 
> 
> 
> 
> 
> 
> The AsRock Z97E-ITX will go when Skylake is out.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I got a pretty bad batch of GDDR5 from Elpida. It refuses to go above 1300MHz.
> 
> 
> 
> It's 30C ambient here so my rads are working extra hard. When running the benchmark the voltage appears to be around 1.188 - 1.213V, but the moment I'm back at the desktop it reverts to 1.28V and the card does not underclock. Does anyone have any idea why?


Really nice, tidy work


----------



## kizwan

Quote:


> Originally Posted by *pcrevolution*
> 
> Finally done with my watercooling loop.
> Quote:
> 
> 
> 
> Originally Posted by *Arizonian*
> 
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> Added you seven members ago. I took your word for it and it's listed under water. But it doesn't get you out of showing us a pic when you do.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nice PSU cables btw.
> Congrats - added
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Green just happens to be my favourite colour.
> 
> 
> 
> 
> 
> 
> 
> The AsRock Z97E-ITX will go when Skylake is out.
> 
> 
> 
> I got a pretty bad batch of GDDR5s from Elpida. They refuse to go above 1300Mhz.
> 
> 
> 
> It's a 30C ambient here so my rads are working extra hard.. When running the benchmark it appears that the voltage is around 1.188 - 1.213v but the moment I'm back at desktop it reverts to 1.28v and the card does not underclock. Does anyone have any idea why?

Nice!







I like that colour.


----------



## kaspar737

Anyone tried reducing the power limit and monitoring clocks in games? I know it downclocks hard in Furmark, but maybe it's a different story in games.


----------



## DDSZ

390X BIOS modded by netkas to support 4GB Hynix 1250MHz memory


Also, the guys at Guru3D noticed that the BIOS signature is not being checked now, so you can edit your BIOSes.


----------



## fyzzz

Quote:


> Originally Posted by *DDSZ*
> 
> 390X BIOS modded by netkas to support 4GB Hynix 1250MHz memory
> 
> 
> Also, the guys at Guru3D noticed that the BIOS signature is not being checked now, so you can edit your BIOSes.


Oh, interesting... going to try it immediately


----------



## fyzzz

Quote:


> Originally Posted by *DDSZ*
> 
> 390X BIOS modded by netkas to support 4GB Hynix 1250MHz memory
> 
> 
> Also, the guys at Guru3D noticed that the BIOS signature is not being checked now, so you can edit your BIOSes.


Ohh, interesting, gonna try it immediately. EDIT: Whoops, double post :/


----------



## rdr09

Quote:


> Originally Posted by *DDSZ*
> 
> 390X BIOS modded by netkas to support 4GB Hynix 1250MHz memory
> 
> 
> Also, the guys at Guru3D noticed that the BIOS signature is not being checked now, so you can edit your BIOSes.


What driver are you using?


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> What driver are you using?


I am on Windows 10 so I am using the driver it downloads. I think it's called 15.2 or something. You probably weren't asking me, but I answered anyway.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> I am on Windows 10 so I am using the driver it downloads. I think it's called 15.2 or something. You probably weren't asking me, but I answered anyway.


Thanks. I asked 'cause I read that the driver for the 300 series will not work on the 200 series. Will this BIOS update change that?

My 290s are using Catalyst version 15.4.1 Beta.


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> Thanks. I asked 'cause I read that the driver for the 300 series will not work on the 200 series. Will this BIOS update change that?
> 
> My 290s are using Catalyst version 15.4.1 Beta.


The 290 is still recognized as a 290 after the BIOS flash. So no, I don't think the newest driver will work.


----------



## fyzzz

I get so frustrated when overclocking because my 290 'only' gives around 1.29V with +200mV, and +300mV doesn't change that. I can give it as much voltage as I want and it will not go over 1.3V, and at 1.29V I can hit around 1240/1600MHz. I am almost at my Asus clock but doing it on 0.1V less. Could it be my PSU? I heard that the RM750 isn't so good. GPU-Z sometimes shows drops down to 11.5-11.38V on the 12V rail, but maybe that is a wrong reading, I don't know.


----------



## pcrevolution

Quote:


> Originally Posted by *fyzzz*
> 
> I get so frustrated when overclocking because my 290 'only' gives around 1.29V with +200mV, and +300mV doesn't change that. I can give it as much voltage as I want and it will not go over 1.3V, and at 1.29V I can hit around 1240/1600MHz. I am almost at my Asus clock but doing it on 0.1V less. Could it be my PSU? I heard that the RM750 isn't so good. GPU-Z sometimes shows drops down to 11.5-11.38V on the 12V rail, but maybe that is a wrong reading, I don't know.


Lol, I can't even get mine to go above a 100mV offset in Afterburner. I'm running a reference card.

Did you use the command line arguments?

I would take software readings with a pinch of salt though.


----------



## fyzzz

Quote:


> Originally Posted by *pcrevolution*
> 
> Lol I can't even get mine to go above 100mv offset in afterburner. I'm running a reference card.
> 
> Did you use the command line arguments?
> 
> I would take software readings with a pinch of salt though..


I use Trixx and the MSI volt-mod thing, but mainly Trixx. And yes, I do take the readings with a pinch of salt, but I still want moar mV's hehe.


----------



## Gumbi

Quote:


> Originally Posted by *pcrevolution*
> 
> Lol I can't even get mine to go above 100mv offset in afterburner. I'm running a reference card.
> 
> Did you use the command line arguments?
> 
> I would take software readings with a pinch of salt though..


Trixx gives access to +200mV. However, on air, for 24/7 clocks, +100 is enough I think.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> I get so frustrated when overclocking because my 290 'only' gives around 1.29V with +200mV, and +300mV doesn't change that. I can give it as much voltage as I want and it will not go over 1.3V, and at 1.29V I can hit around 1240/1600MHz. I am almost at my Asus clock but doing it on 0.1V less. Could it be my PSU? I heard that the RM750 isn't so good. GPU-Z sometimes shows drops down to 11.5-11.38V on the 12V rail, but maybe that is a wrong reading, I don't know.


Yeah, that reading is going below the ±5% tolerance on the 12V rail.
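For reference, the ATX spec allows roughly ±5% regulation on the 12V rail, so the acceptable band works out to about 11.4-12.6V. A quick sketch (using the readings quoted above) shows why 11.38V is out of spec while 11.5V is not:

```python
# ATX allows roughly +/-5% regulation on the 12V rail.
NOMINAL = 12.0
TOLERANCE = 0.05

low_limit = NOMINAL * (1 - TOLERANCE)   # ~11.4 V
high_limit = NOMINAL * (1 + TOLERANCE)  # ~12.6 V

def in_spec(reading_v):
    """Return True if a 12V-rail reading falls inside the tolerance band."""
    return low_limit <= reading_v <= high_limit

# Readings reported by GPU-Z in the post above
for v in (11.5, 11.38):
    print(v, "in spec" if in_spec(v) else "OUT of spec")
```

Software sensors can misread, as noted above, so a multimeter on a spare connector is the only way to confirm an actual sag.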


----------



## jonny jon jon

That's strange, as my Asus DirectCU II OC 290X is 1050/1350 stock and has the same pixel rate and 345 bandwidth, but the texel rate is 185. Wonder why it's higher.


----------



## jonny jon jon

Nevermind. Doi. Yours isn't an X. I've had mine clocked to 1200 and 1550 with +200mV, but it crashes anywhere past 1100/1400 depending on the game. Recently downclocked it to 1000 core and 1400 mem, and my temps never go above 67C. Can't wait to CrossFire another as soon as these 390s and Furys start circulating. I'm anxious to see how cheap I can find another Asus, MSI, or Sapphire 290X. I wanna stick with Asus for the theme, but I think the others OC much better, although XFX is the lowest performing one at stock speeds.


----------



## NIK1

I have a Sapphire Tri-X R9 290 that went NFG last night. I was playing a couple of matches of World of Tanks and the computer went click and shut off, no blue screen or anything. I tried to restart, and as soon as I press the start button I hear a faint click and no start. Case fans do not spin at all either, since it is an instant click and no-go. When I unplug the power cables to the 290 the computer starts fine, and I later tried a different graphics card and everything is fine. I bought the 290 in March and put a Swiftech waterblock on it a month later; temps were excellent, 29C idle and 49C max while gaming. Anyone have any idea what could have gone wrong with my 290?


----------



## mfknjadagr8

Quote:


> Originally Posted by *NIK1*
> 
> I have a Sapphire Tri-X R9 290 that went NFG last night. I was playing a couple of matches of World of Tanks and the computer went click and shut off, no blue screen or anything. I tried to restart, and as soon as I press the start button I hear a faint click and no start. Case fans do not spin at all either, since it is an instant click and no-go. When I unplug the power cables to the 290 the computer starts fine, and I later tried a different graphics card and everything is fine. I bought the 290 in March and put a Swiftech waterblock on it a month later; temps were excellent, 29C idle and 49C max while gaming. Anyone have any idea what could have gone wrong with my 290?


vrm...poor quality cap...random power surge from your grid...could be a lot of things...


----------



## pcrevolution

Quote:


> Originally Posted by *NIK1*
> 
> I have a Sapphire Tri-X R9 290 that went NFG last night. I was playing a couple of matches of World of Tanks and the computer went click and shut off, no blue screen or anything. I tried to restart, and as soon as I press the start button I hear a faint click and no start. Case fans do not spin at all either, since it is an instant click and no-go. When I unplug the power cables to the 290 the computer starts fine, and I later tried a different graphics card and everything is fine. I bought the 290 in March and put a Swiftech waterblock on it a month later; temps were excellent, 29C idle and 49C max while gaming. Anyone have any idea what could have gone wrong with my 290?


Sounds like a short somewhere in your rig. Since it works with another video card, most likely it means it's your 290. Any way to test the 290 on another system?


----------



## NIK1

I took the card to my local PC repair shop and they put it in a test PC and it did the same thing: hit the start button and instant click and no-go.


----------



## rdr09

Quote:


> Originally Posted by *NIK1*
> 
> I took the card to my local PC repair shop and they put it in a test PC and it did the same thing: hit the start button and instant click and no-go.


Could be a screw(s) too tight. Could be a short somewhere.









Tried the other BIOS?


----------



## pcrevolution

Quote:


> Originally Posted by *NIK1*
> 
> I took the card to my local pc repair shop and they put it in a test pc and it did the same thing,hit the start button and instant click and no go.


There's a short on the PCB of your 290. Best to do a thorough visual inspection!


----------



## mfknjadagr8

Quote:


> Originally Posted by *rdr09*
> 
> could be a screw (s) too tight. could be a short somewhere.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> tried the other bios?


I can't see a way for the screws to be too tight, but if I had to guess in this case: did you inspect the tops of the cap points like the instructions for the Komodo block stated? Most of them are fine, but it takes only one being too long to cause a short. I would unblock it and look it over well. Shorts usually show as grayish or black/bluish areas on the PCB or backplate. Check all the solder-point traces that come through the back of the PCB and ensure they aren't over 1mm high.


----------



## pcrevolution

When the local PC shop checked the card, was it with the block on?

It's almost certain it's a short. The click sound is akin to a circuit breaker.


----------



## Jflisk

C'mon FURY X


----------



## kizwan

Quote:


> Originally Posted by *mfknjadagr8*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> could be a screw (s) too tight. could be a short somewhere.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> tried the other bios?
> 
> 
> 
> 
> 
> 
> I can't see a way for the screws to be too tight, but if I had to guess in this case: did you inspect the tops of the cap points like the instructions for the Komodo block stated? Most of them are fine, but it takes only one being too long to cause a short. I would unblock it and look it over well. Shorts usually show as grayish or black/bluish areas on the PCB or backplate. Check all the solder-point traces that come through the back of the PCB and ensure they aren't over 1mm high.

I don't think the short is because it's too tight, because if that were true it should have happened right away after installing the block.

@NIK1, did you use CLU on the GPU?


----------



## long99x

Quote:


> Originally Posted by *sTOrM41*
> 
> yeah, i finaly managed to edit my 290 bios
> 
> 
> 
> 
> 
> 
> 
> 
> 
> here my 290 PCS+ with 1060/1400 default clocks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


How did you do that?

Can you tell me?


----------



## mfknjadagr8

Quote:


> Originally Posted by *kizwan*
> 
> I don't think the short is because it's too tight, because if that were true it should have happened right away after installing the block.
> 
> @NIK1, did you use CLU on the GPU?


Well, if it was a short that didn't destroy any components, I would think the GPU would power up and then short, unless it's a constant connection to ground causing it. The leads that come through the PCB could be shorting on a cap at power-on or something. Not saying that's it, but something to look at. Hopefully whatever the issue is, it's easy to remedy. The Swiftech block is designed so the screws can't really be over-tightened from what I've seen, and other components are ruled out already. Let the games begin.


----------



## sTOrM41

Quote:


> Originally Posted by *long99x*
> 
> how did you do that ?
> 
> Can you tell me


I wrote a guide in German:
http://www.hardwareluxx.de/community/f140/r9-290-x-bios-modding-wie-erstelle-ich-mein-eigenes-bios-hex-howto-1077140.html


----------



## jonny jon jon

Take the block off and bake it in the oven for 20-30 min. If a solder joint is the issue, this may fix it. Worked on my EVGA GTX 570 SC... that card ran at 90C plus though.


----------



## broken pixel

Quote:


> Originally Posted by *pcrevolution*
> 
> Finally done with my watercooling loop.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Green just happens to be my favourite colour.
> 
> 
> 
> 
> 
> 
> 
> The AsRock Z97E-ITX will go when Skylake is out.
> 
> 
> 
> I got a pretty bad batch of GDDR5s from Elpida. They refuse to go above 1300Mhz.
> 
> 
> 
> It's a 30C ambient here so my rads are working extra hard.. When running the benchmark it appears that the voltage is around 1.188 - 1.213v but the moment I'm back at desktop it reverts to 1.28v and the card does not underclock. Does anyone have any idea why?


Very nice, very clean setup. What type of tubing is that?


----------



## Ramzinho

I wonder if the 300 series drivers will give us 290X owners a slight boost or not?


----------



## mAs81

Quote:


> Originally Posted by *Ramzinho*
> 
> I wonder if the 300 series drivers will give us 290X owners a slight boost or not?


How do you mean? Driver-wise?


----------



## Ramzinho

Quote:


> Originally Posted by *mAs81*
> 
> How do you mean? Driver-wise?


All the benchmarks showed that the new driver for the 300 series gave them around a 5~10% boost. One site even has a title saying "just because drivers work better". Wonder if this will boost us as well.


----------



## mAs81

Quote:


> Originally Posted by *Ramzinho*
> 
> All the benchmarks showed that the new driver for the 300 series gave them around a 5~10% boost. One site even has a title saying "just because drivers work better". Wonder if this will boost us as well.


Interesting... I hope it does for us 290 owners too


----------



## sinnedone

Quote:


> Originally Posted by *Ramzinho*
> 
> All the benchmarks showed that the new driver for the 300 series gave them around a 5~10% boost. One site even has a title saying "just because drivers work better". Wonder if this will boost us as well.


Probably just 'cause they're clocked 50MHz higher stock.


----------



## Agent Smith1984

I dunno... If AMD worked on driver overhead more, it could help Hawaii even more than clock speeds, though those probably help the 300 series more than anything. My MSI Gaming 390 will be in Monday; can't wait to check it out.


----------



## broken pixel

I flashed my two 290Xs @ 1100/1500MHz with the modded 4GB 390X BIOS and scored over 20k benching with the 15.5.0 drivers modded by asder00. Then I launched BF4 and my frames were within the 60-70 fps range.

I then installed the 15.5 betas and ran Firestrike again and scored less than 20k, but BF4 ran at 100fps synced, as usual.

OT:

I had to put the Mockingbird I have been raising for almost 2 months to transition land due to a fractured leg that would not heal. His first fracture healed after being taped but re-fractured, even on the soft cloth he was sleeping on.

RIP Frank II <3


----------



## faizreds

Is there a way to fix a black screen during boot?
My GPU is an R9 290 Tri-X. No black screen when gaming; it only happens during boot.


----------



## broken pixel

Quote:


> Originally Posted by *faizreds*
> 
> Is there a way to fix a black screen during boot?
> My GPU is an R9 290 Tri-X. No black screen when gaming; it only happens during boot.


What driver are you using? If you are overclocking the GPU, set it to default frequencies. Then:

1. Clean out the previous driver with DDU: http://www.wagnardmobile.com/DDU/ddudownload.htm
2. Install these drivers: http://www.guru3d.com/files-details/amd-catalyst-14-12-whql-omega-download.html

Report back please.


----------



## pcrevolution

Quote:


> Originally Posted by *broken pixel*
> 
> Very nice, very clean setup. What type of tubing is that?


Thank you sir!

Just regular 12mm OD acrylic tubing from e22.biz.

I haven't had the chance of trying PETG yet.


----------



## long99x

Quote:


> Originally Posted by *sTOrM41*
> 
> I wrote a guide in german
> http://www.hardwareluxx.de/community/f140/r9-290-x-bios-modding-wie-erstelle-ich-mein-eigenes-bios-hex-howto-1077140.html


Thanks,
I'll try


----------



## broken pixel

Quote:


> Originally Posted by *pcrevolution*
> 
> Thank you sir!
> 
> Just regular 12mm OD acrylic tubing from e22.biz.
> 
> I haven't had the chance of trying PETG yet.


How do you achieve the straight tubing with those curves?


----------



## faizreds

Quote:


> Originally Posted by *broken pixel*
> 
> What driver are you using? If you are overclocking the GPU, set it to default frequencies. Then:
> 
> 1. Clean out the previous driver with DDU: http://www.wagnardmobile.com/DDU/ddudownload.htm
> 2. Install these drivers: http://www.guru3d.com/files-details/amd-catalyst-14-12-whql-omega-download.html
> 
> Report back please.


I'm using the latest driver. Before this, the black screen happened not only at boot but also when gaming and surfing the internet.
I managed to fix that using MSI AB by setting the power limit to +50 and upping the voltage by +25.
Only the black screen during boot still remains.


----------



## broken pixel

Originally Posted by sTOrM41 View Post

I wrote a guide in german
http://www.hardwareluxx.de/community/f140/r9-290-x-bios-modding-wie-erstelle-ich-mein-eigenes-bios-hex-howto-1077140.html

English by Google?
https://translate.google.com/translate?sl=auto&tl=en&js=y&prev=_t&hl=en&ie=UTF-8&u=http%3A%2F%2Fwww.hardwareluxx.de%2Fcommunity%2Ff140%2Fr9-290-x-bios-modding-wie-erstelle-ich-mein-eigenes-bios-hex-howto-1077140.html&edit-text=&act=url


----------



## broken pixel

Quote:


> Originally Posted by *faizreds*
> 
> I'm using the latest driver. Before this, the black screen happened not only at boot but also when gaming and surfing the internet.
> I managed to fix that using MSI AB by setting the power limit to +50 and upping the voltage by +25.
> Only the black screen during boot still remains.


If the driver can't fix the black screen issue that has plagued the 290s, I would suggest an RMA on the GPU, since you need to add voltage. Unless you are running overclocked settings?


----------



## pcrevolution

Quote:


> Originally Posted by *broken pixel*
> 
> How do you achieve the straight tubing with those curves?


If you look closely at the front rad, the tubing isn't exactly straight, haha. I ran out of tubing unfortunately and had to salvage.

As for bending, I try to keep the heat gun focused on one point of the tubing to keep the bend tight. Then I just used my table surface and its legs as a right-angle bend. A jig would suffice too. But when the acrylic is molten you've just got to work fast.


----------



## Ized

Quote:


> Originally Posted by *broken pixel*
> 
> Originally Posted by sTOrM41 View Post
> 
> I wrote a guide in german
> http://www.hardwareluxx.de/community/f140/r9-290-x-bios-modding-wie-erstelle-ich-mein-eigenes-bios-hex-howto-1077140.html
> 
> English by Google?
> https://translate.google.com/translate?sl=auto&tl=en&js=y&prev=_t&hl=en&ie=UTF-8&u=http%3A%2F%2Fwww.hardwareluxx.de%2Fcommunity%2Ff140%2Fr9-290-x-bios-modding-wie-erstelle-ich-mein-eigenes-bios-hex-howto-1077140.html&edit-text=&act=url


o_0 Not sure why this took so long to be discovered, but better late than never; good stuff. The translation is very readable too.

I hold out hope that the default-voltage black magic is in there somewhere waiting to be tweaked, but yeah, I'm _sure_ that's also "impossible"









There was a dude on a cryptocoin forum pumping out BIOSes with tweaked memory timings a few months after the cards released too. He was suggesting the stock timings were a total mess and great improvements could be made. Kind of lame he never shared the how; we could have been deep into totally custom BIOSes by now.


----------



## kizwan

Quote:


> Originally Posted by *Ized*
> 
> Quote:
> 
> 
> 
> Originally Posted by *broken pixel*
> 
> Originally Posted by sTOrM41 View Post
> 
> I wrote a guide in german
> http://www.hardwareluxx.de/community/f140/r9-290-x-bios-modding-wie-erstelle-ich-mein-eigenes-bios-hex-howto-1077140.html
> 
> English by Google?
> https://translate.google.com/translate?sl=auto&tl=en&js=y&prev=_t&hl=en&ie=UTF-8&u=http%3A%2F%2Fwww.hardwareluxx.de%2Fcommunity%2Ff140%2Fr9-290-x-bios-modding-wie-erstelle-ich-mein-eigenes-bios-hex-howto-1077140.html&edit-text=&act=url
> 
> 
> 
> o_0 not sure why this took so long to be discovered but better late than never, good stuff. Translation is very readable too.
> 
> I hold hope that the default voltage blackmagic is in there somewhere waiting to be tweaked, but yeah im _sure_ that's also "impossible"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> There was a dude on a cryptocoin forum pumping out BIOSes with tweaked memory timings a few months after the cards released too.. He was suggesting it was a total mess and great improvements could be made. Kind of lame he never shared the how, we could have been deep into totally custom BIOSes by now.

The mod was already discovered, but the issue with the 290/290X BIOS is that it has a digital signature. Now the signature is no longer checked.

I think Shammy (PT1/PT3, Shamino's BIOSes) & the guys at the *coin forum know how to digitally sign the modded BIOS but couldn't share it because it's confidential, I'm assuming.


----------



## Ized

Quote:


> Originally Posted by *kizwan*
> 
> The mod was already discovered, but the issue with the 290/290X BIOS is that it has a digital signature. Now the signature is no longer checked.
> 
> I think Shammy (PT1/PT3, Shamino's BIOSes) & the guys at the *coin forum know how to digitally sign the modded BIOS but couldn't share it because it's confidential, I'm assuming.


In the guide he states we re-sign the BIOS (create a new signature) using an old 7000-series tool, aka the procedure is exactly the same as before... It doesn't appear like the check went anywhere or even changed?


----------



## MojoW

Quote:


> Originally Posted by *Ized*
> 
> In the guide he states we re-sign the BIOS (create a new signature) using a old 7000 series tool aka the procedure is exactly the same as before... Doesn't appear like the check went anywhere or even changed?


Then they are just fixing the checksum, like with this method (HD7XX UEFI Patch Tool Beta is the tool).
So it is exactly the same, and we could have been modding our BIOSes since back then.
On another note, I am running the 390X 4GB BIOS with 1050/1350 clocks on my 290 Tri-X and reference 290.
Running great, and the BIOS runs a bit cooler than the Tri-X BIOS I had before.


----------



## Ized

Quote:


> Originally Posted by *MojoW*
> 
> Then they are just fixing the checksum like with this method.(HD7XX UEFI Patch Tool Beta is the tool)
> So it is exactly the same and we could have been modding our bios since back then.


Yes, that's what I was trying to express. Yet we had lots of _gurus_ back then claiming it was impossible; they knew something that we didn't, _sources_ and such.

Turns out it was literally identical; they were talking out of their asses and people just accepted it as fact.

Anyhow, sorry to rant.
Quote:


> Originally Posted by *MojoW*
> 
> On another note i am running the 390x 4gb bios with 1050 1350 clocks on my 290 tri-x and reference 290.
> Running great and the bios runs a bit cooler then the tri-x bios i had before.


Are you able to compare any benchmarks to identically clocked 290X benchmarks you have run recently? I'd be interested to see that. I saw a lot of variation even between 290X BIOSes on the same reference card.

Are you also running amd-15.15-radeon300-series-win8.1-win7-64bit-june15.exe ?


----------



## kizwan

Quote:


> Originally Posted by *Ized*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> The mod already discovered but the issue with 290/290X BIOS is that it have digital signature. Now the signature no longer checked anymore.
> 
> I think Shammy (PT1/PT3 Shamino's BIOSes) & the guys at *coin forum know how to digital signature the modded BIOS but couldn't share it because it's confidential, I'm assuming.
> 
> 
> 
> In the guide he states we re-sign the BIOS (create a new signature) using a old 7000 series tool aka the procedure is exactly the same as before... Doesn't appear like the check went anywhere or even changed?

That tool is for correcting the checksum, not the digital signature.
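For anyone curious what "correcting the checksum" actually involves, here is a minimal Python sketch, assuming a plain single-image legacy ROM. Real Hawaii BIOSes also carry a UEFI image, and the digital signature kizwan mentions is a separate structure that this does not touch.

```python
def fix_vbios_checksum(rom: bytearray) -> bytearray:
    # Legacy option-ROM header: 0x55 0xAA signature, image size at byte 2
    # in 512-byte units (per the PCI option-ROM format).
    assert rom[0] == 0x55 and rom[1] == 0xAA, "missing 0x55AA ROM signature"
    size = rom[2] * 512
    # All bytes of the image must sum to 0 mod 256. A common convention is
    # to adjust the last byte of the image so that they do.
    partial = sum(rom[:size - 1]) & 0xFF
    rom[size - 1] = (0x100 - partial) & 0xFF
    assert sum(rom[:size]) & 0xFF == 0
    return rom
```

So after hex-editing anything inside the image, one byte gets recomputed and the legacy checksum check passes again; the signature is a different beast entirely.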


----------



## MojoW

Quote:


> Originally Posted by *Ized*
> 
> Yes that's what I was trying to express. Yet we had lots of _guru's_ back then claim it was impossible - they knew something that we didn't, _sources_ and such.
> 
> Turns out it was literally identical, they were talking out of their asses and people just accepted it as fact
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyhow sorry to rant.
> Are you able to compare any benchmarks to identically clocked 290x benchmarks you have ran recently? I'd be interested to see that. I saw a lot of variation even between 290x BIOSes on the same reference card.
> 
> Are you also running amd-15.15-radeon300-series-win8.1-win7-64bit-june15.exe ?


I was planning on taking a day tomorrow to do some benches and game runs with the old and new BIOS on the same driver.
I have not run anything recently, so I have nothing to compare them to if I ran them now.
I am using the 15.200.1040.0 drivers, as they are newer than the 15.15 drivers.
And my PCI Revision ID is still 00, while it should be 80 for the original drivers to recognize it as an R9 300 series card.
So I would have to use the modded one, and that's why I have the 1040 drivers.


----------



## skline00

Someone with a smaller case asked if they would have room for a 280mm rad above.

If you want a BIG case, you should see the Thermaltake Core X-9 in my rig below.


----------



## kizwan

Quote:


> Originally Posted by *MojoW*
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ized*
> 
> Yes that's what I was trying to express. Yet we had lots of _guru's_ back then claim it was impossible - they knew something that we didn't, _sources_ and such.
> 
> Turns out it was literally identical, they were talking out of their asses and people just accepted it as fact
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyhow sorry to rant.
> Are you able to compare any benchmarks to identically clocked 290x benchmarks you have ran recently? I'd be interested to see that. I saw a lot of variation even between 290x BIOSes on the same reference card.
> 
> Are you also running amd-15.15-radeon300-series-win8.1-win7-64bit-june15.exe ?
> 
> 
> 
> 
> 
> 
> 
> I was planning on taking a day tomorrow to do some benches and game runs with the old and new bios on the same driver.
> I have not run anything recently so i have nothing to compare them to if i would run them now.
> I am using the 15.200.1040.0 drivers as they are newer then the 15.15 drivers.
> And my PCI Revision ID is still 00 while it should be 80 for the original drivers to recognize it as r9 300 series.
> So i would be using the modded one and that's why i have the 1040 drivers.

You need to change the device ID for your card to be recognized as an R9 300 series card.


----------



## MojoW

Quote:


> Originally Posted by *kizwan*
> 
> You need to change the device ID for your card to recognized as R9 300 series card.


My device ID did change from 0x1002 0x67B0 to 0x1002 0x67B1.
And the MSI 390X BIOS still has the 0x1002 0x67B0 device ID.
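If you want to double-check what vendor/device ID a flashed card is actually presenting to the OS, here is a small sketch reading it from Linux sysfs. The `pci_addr` default is a placeholder for wherever your card sits on the bus, and the lookup table only contains the Hawaii IDs quoted in this thread.

```python
from pathlib import Path

# Hawaii device IDs as discussed in this thread (sysfs reports lowercase hex).
HAWAII_IDS = {
    ("0x1002", "0x67b0"): "Hawaii XT (R9 290X / 390X)",
    ("0x1002", "0x67b1"): "Hawaii PRO (R9 290 / 390)",
}

def identify(pci_addr: str = "0000:01:00.0",
             base: str = "/sys/bus/pci/devices") -> str:
    """Read the PCI vendor/device ID files for a device and name it."""
    dev = Path(base) / pci_addr
    vendor = (dev / "vendor").read_text().strip()
    device = (dev / "device").read_text().strip()
    return HAWAII_IDS.get((vendor, device), "unknown " + vendor + ":" + device)
```

On Windows the same IDs show up in Device Manager as `VEN_1002&DEV_67B0`, which is what the driver INF matches against.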


----------



## fyzzz

I am also running a 390X BIOS, an XFX BIOS to be specific. It's running great, and I have changed the default clocks to 1100/1400 instead of 1050/1500. I use the 15.2 drivers that Windows 10 installs. At the 1100/1400 clocks that I have as default, I get a 12600 GPU score in Fire Strike.


----------



## MojoW

Quote:


> Originally Posted by *fyzzz*
> 
> I am also running a 390x bios, an xfx bios to be specific. It's running great and i have changed the default clocks to 1100/1400 instead of 1050/1500. I use the 15.2 drivers that windows 10 installs. At the 1100/1400 clock that i has as default i get 12600 gpu score in firestrike.


Nice, if that is stable I'm gonna try to hex edit in some higher stock clocks too, as I was being cautious with my 1350 memory clock.

Did you change the 8GB version or the modded 4GB version like I did?
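For the hex-edit idea, here is a sketch of the mechanics, with the big caveat that `offset` is a placeholder you would have to locate yourself in the PowerPlay table with a hex editor, and the BIOS checksum has to be fixed afterwards. AtomBIOS tables typically store clocks little-endian in units of 10 kHz.

```python
def patch_clock(rom: bytearray, offset: int, mhz: int, width: int = 3) -> None:
    """Write a clock value into a BIOS dump at a known offset.

    AtomBIOS convention: clocks in 10 kHz units, little-endian,
    commonly 3 bytes wide in PowerPlay entries.
    """
    value = mhz * 100  # MHz -> 10 kHz units
    rom[offset:offset + width] = value.to_bytes(width, "little")

def read_clock(rom: bytes, offset: int, width: int = 3) -> int:
    """Read a clock value back out, in MHz."""
    return int.from_bytes(rom[offset:offset + width], "little") // 100
```

So 1100 MHz is stored as 110000 (0x01ADB0), which is why you search the dump for `B0 AD 01` rather than for "1100".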


----------



## fyzzz

Quote:


> Originally Posted by *MojoW*
> 
> Nice if that is stable i'm gonna try to hex edit in some higher stock clocks too.
> As i was being cautious with my 1350 memory clock.
> 
> 
> 
> 
> 
> 
> 
> 
> Did you change the 8gb version or the modded 4gb version like i did?


I changed the 4GB one. My card can handle somewhere around 1120 before it gets unstable, so I chose 1100 as stock just to be safe. I have also tried the leaked BIOS earlier; the default clock speed on that one was 1050/1500 and my card did not have a problem with that either. So maybe I can tweak the frequencies a bit more.


----------



## kizwan

Quote:


> Originally Posted by *MojoW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> You need to change the device ID for your card to recognized as R9 300 series card.
> 
> 
> 
> My device id did change from 0x1002 0x67B0 to 0x1002 0x67B1.
> And the msi 390x bios still has the 0x1002 0x67B0 device id.

1002 67B0 is the 290X & 1002 67B1 is the 290. The correct device ID for the 390X should be 1003 6925, right? Where did you get the 390X BIOS?

EDIT: AMD/ATI should be 1002.


----------



## MojoW

Quote:


> Originally Posted by *fyzzz*
> 
> I changed the 4gb one. My card can handle somewere around 1120 before it gets instable, so i choose to have 1100 as stock just to be safe. I have also tried the leaked bios earlier and the default clockspeed on that one was 1050/1500 and my card did not have a problem with that either. So maybe i can tweak the frequencies a bit more.


Neither of mine had a problem with the 1050/1500 one either, without drivers.
But as soon as I tried to install drivers it acted crazy and went to no input; after the restart, no drivers were installed.
So did you install drivers with that one to really test if the BIOS was stable?
Quote:


> Originally Posted by *kizwan*
> 
> 1002 67B0 is 290X & 1002 67B1 is 290. Correct device ID for the 390X should be 1003 6925 right? Where did you get the 390X BIOS?


The TechPowerUp VGA BIOS database.
And the XFX one was leaked; you can find that one on Guru3D.
Quote:


> AMD's upcoming Radeon R9 390X and R9 390 performance-segment graphics cards reportedly feature higher GPU and memory clocks over the products they are a re-branding of, the R9 290X and R9 290, respectively. The 28 nm "Grenada" silicon they are based on, is identical to "Hawaii," down to the last transistor. This has been confirmed by leaked GPU-Z screenshots, *which reveal the device-IDs of the two cards to be identical to those of the R9 290X and R9 290*. Since the Device-IDs are the same, GPU-Z is reading the chip as "Hawaii." The code-name "Grenada" appears in the BIOS version string.


Source: TechPowerUp, but more sites are reporting the same.


----------



## xKrNMBoYx

Quote:


> Originally Posted by *tsm106*
> 
> Reference cards are all the same because, drumroll they follow the ref spec. On ref cards the switch changes between quiet and uber. On custom cards it can do a variety of different things. Google yer card reviews to confirm what yers does. That button maybe for an oc iirc.


Quote:


> Originally Posted by *Raephen*
> 
> On my refference Sapphire 290 it's Uber & Quiet.


Well, contrary to our belief, the switch did have something to do with the UEFI BIOS. When I installed the second 290X today I went to GPU-Z, and one had a checkmark confirming it was UEFI-enabled for fast boot, etc. The second was unchecked. So I flipped the Uber/Quiet switch on the new GPU and booted back up. Both now have the UEFI BIOS checkmark. Not sure if the switch still does Uber/Quiet.


----------



## kizwan

Quote:


> Originally Posted by *xKrNMBoYx*
> 
> Quote:
> 
> 
> 
> Originally Posted by *tsm106*
> 
> Reference cards are all the same because, drumroll they follow the ref spec. On ref cards the switch changes between quiet and uber. On custom cards it can do a variety of different things. Google yer card reviews to confirm what yers does. That button maybe for an oc iirc.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Raephen*
> 
> On my refference Sapphire 290 it's Uber & Quiet.
> 
> 
> Well contrary to our belief the switch did have to do something with UEFI BIOS. When I installed the second 290X today I went to GPU-Z and one had a checkmark confirming it was UEFI enabled for fastboot, etc. The second was unchecked. So I switched the Uber/Quiet Switch on the new GPU and booted back up. Both now have the UEFI BIOS Checkmark. Not sure if it still has Uber/Quiet on the switch.

Not necessarily. Basically, the switch means the card has dual BIOSes. It can be uber/quiet, legacy/UEFI, stock/overclock, or anything.


----------



## fyzzz

Yup, I had the driver installed and ran Fire Strike. Here is the result: http://www.3dmark.com/fs/5130283. As you can see, it was running at 1050/1500.


----------



## MojoW

Quote:


> Originally Posted by *fyzzz*
> 
> Yup i had driver installed and ran firestrike herehttp://www.3dmark.com/fs/5130283 is the result and as you can see it was running at 1050/1500


You have some good cards there.
What are you waiting for? Up those stock memory clocks.


----------



## kizwan

Quote:


> Originally Posted by *MojoW*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> 1002 67B0 is 290X & 1002 67B1 is 290. Correct device ID for the 390X should be 1003 6925 right? Where did you get the 390X BIOS?
> 
> 
> 
> Techpowerup vga bios database
> And the xfx one was leaked, you can find that one on guru3d.
> Quote:
> 
> 
> 
> AMD's upcoming Radeon R9 390X and R9 390 performance-segment graphics cards reportedly feature higher GPU and memory clocks over the products they are a re-branding of, the R9 290X and R9 290, respectively. The 28 nm "Grenada" silicon they are based on, is identical to "Hawaii," down to the last transistor. This has been confirmed by leaked GPU-Z screenshots, *which reveal the device-IDs of the two cards to be identical to those of the R9 290X and R9 290*. Since the Device-IDs are the same, GPU-Z is reading the chip as "Hawaii." The code-name "Grenada" appears in the BIOS version string.
> 
> 
> Source techpowerup
> But more sites are writing the same.

I don't believe in "leaks", but I checked the Tom's Hardware review & yes, the 390/390X use the same device IDs.

Did GPU-Z detect your card as 300 series?


----------



## MojoW

Quote:


> Originally Posted by *kizwan*
> 
> I don't believe in "leak" but I checked tomshardware review site & yes, the 390/390X use same device ID.
> 
> Did GPU-Z detect your card as 300 series?




Nope.

Edit: About the leak, it was a user from HardForum who got the card early for one of his customers or something.
He made the BIOS dump and ran a few benches.


----------



## fyzzz

I did it! http://www.3dmark.com/3dm/7431266 Broke a 13000 GPU score on stock volts. The card was clocked at 1140/1500 on stock voltage and I saw not a single artifact during 3DMark.


----------



## pengs

Quote:


> Originally Posted by *fyzzz*
> 
> i did it! http://www.3dmark.com/3dm/7431266? broke 13000 gpu score on stock volts. Card was clocked at 1140/1500 on stock volt and i saw not a single artifact during 3dmark.


So is that with the 390X BIOS?

You said that it was unstable at 1120/1500 at stock voltage, but after the new BIOS you're able to bench 1140/1500 at stock voltage?

You've got a good clocker from the get-go, though I wonder if the new BIOS raised your default VDDC slightly or something to get you past 1120...


----------



## fyzzz

Quote:


> Originally Posted by *pengs*
> 
> So is that with the 390x bios?
> 
> You said that it was unstable at 1120/1500 at stock voltage but after the new bios your able bench 1140/1500 at stock voltage?
> 
> You've got a good clocker from the get-go though I wonder if the new bios raised your default VDDC slightly or something to get you passed 1120...


I don't have a clue. I was very surprised when it passed.


----------



## rv8000

Quote:


> Originally Posted by *fyzzz*
> 
> I don't have a clue. I was very surprised when it passed.


If you did flash to a 390/390X BIOS, use GPU-Z to monitor your core voltage. I seem to remember one of the 390X reviews stating that the cards actually had a higher stock voltage, which would explain the increased stability on the new BIOS; possibly some other changes with memory settings too, and also the fact that you seem to be running 15.15, which is also showing an increase in GPU score beyond the higher clocks.


----------



## By-Tor

I took advantage of the good steam price on 3DMark and here are some scores...

How are these scores?

Normal


Extreme


Ultra


----------



## pengs

Quote:


> Originally Posted by *fyzzz*
> 
> I don't have a clue. I was very surprised when it passed.


If the core and VRMs are getting a little warmer, that may point to a raised VDDC.

Good clocker, though; I can do 1120 at something like 1.17V, or +50mV. It would be interesting to compare your voltage on the 290X BIOS vs the 390X BIOS.


----------



## fyzzz

Quote:


> Originally Posted by *rv8000*
> 
> If you did flash to a 390/390x bios, use gpu-z to monitor your core voltage. I seem to remember one of the 390x reviews stating that the cards actually had a higher stock voltage which would explain the increased stability on the new bios; possibly some other changes with memory settings and also the fact you seem to be running 15.15 which is also showing an increase on gpu score outside of higher clocks.


The voltage fluctuates between 1.164 and around 1.2, with some spikes up to 1.28. I am running the 15.2 driver that Windows 10 downloads.
UPDATE: Got through with 1160/1500 with a couple of artifacts, and at 1170 the driver crashed.


----------



## rv8000

Quote:


> Originally Posted by *fyzzz*
> 
> The voltage fluctuates between 1.164 and up to around 1.2 and some spikes up to 1.28. I am running the 15.2 driver that windows 10 downloads.


The modded W10 betas are even better than 15.15 in terms of overhead and new D3D version. Do you remember what voltages you possibly spiked to on your stock bios? I remember my 290s and my 290x hit at most ~1.25 (it will differ slightly from card to card) but 1.28 sounds like there might be a small voltage adjustment in the bios.


----------



## fyzzz

Quote:


> Originally Posted by *rv8000*
> 
> The modded W10 betas are even better than 15.15 in terms of overhead and new D3D version. Do you remember what voltages you possibly spiked to on your stock bios? I remember my 290s and my 290x hit at most ~1.25 (it will differ slightly from card to card) but 1.28 sounds like there might be a small voltage adjustment in the bios.


I don't really remember, but I think it was a bit lower actually, yes.


----------



## Mattbag

Just ordered a 290X used. Please tell me it's gonna be a decent upgrade from my 7970 and last me till the 400 series.


----------



## rv8000

Quote:


> Originally Posted by *Mattbag*
> 
> Just ordered a 290x used. Please tell me its gonna be a decent upgrade from my 7970 and last me till the 400 series


Definite upgrade; it should run most games fine maxed @ 1440p, though some will require tweaking. I doubt we will see any games this year or early 2016 that are really going to be heavy enough to give a 290X too much of a hammering.


----------



## Mattbag

Quote:


> Originally Posted by *rv8000*
> 
> Definite upgrade, should run most games fine maxed @ 1440, some will require tweaking. I doubt will see any game this year and early 2016 that are really going to be heavy enough to give a 290x too much of a hammering.


I was considering adding a second 7970, but I was worried about space in my case, power consumption, and the long-term investment. This is essentially a Father's Day gift to me... And I'll pick up the FF14 Heavensward expansion and The Witcher 3.


----------



## rv8000

Quote:


> Originally Posted by *Mattbag*
> 
> I was considering adding a 2nd 7970 but i was worried about space in my case, power consumption and the long term investment. This is essentally a fathers day gift to me... And I'll pick up the FF 14 heavensward expansion and the witcher 3


CFX isn't worth the headache to me; the 290X is roughly 30-40% faster. For The Witcher 3, I currently turn off HairWorks and turn foliage distance down to high from ultra, with the rest on ultra @ 1440p. With a mild OC to 1100/1400 I'm getting an avg of about 52 fps and haven't seen it drop below 40, but I'm also playing through 2 before I really get into it.


----------



## Mattbag

Quote:


> Originally Posted by *rv8000*
> 
> CFX isnt worth the headache to me, 290x is roughly 30-40% faster. For Witcher 3, I currently turn of hairworks and foliage distance down to high from ultra, with the rest on ultra @ 1440p with a mild oc to 1100/1400 I'm getting an avg of about 52fps, haven't seen it drop below 40 but I'm also playing through 2 before I really get into it.


I couldn't really get into number 2; it was kind of a difficult game combat-wise, and I'm hoping 3 is better. Aside from that, the graphics were great, and I know my 7970 can handle it fine, but I want to see how it looks with the 290X and theeeen play The Witcher 3.


----------



## Gumbi

Just downloaded the latest BIOS for my Z87X-D3H; apparently it provides better performance. Will have to see if I can squeeze another 100MHz out of my 4790K (currently at 4.9GHz on air at 1.32V). 290 Vapor-X overclocked to 1150MHz/1550MHz with +81mV. At stock (+25) it does 1100; I have to work a lot for that 1150. Here's how I do in 3DMark; never ran it before, so not sure how good it is.

http://www.3dmark.com/3dm11/9958761


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *faizreds*
> 
> Using the latest driver. Before this,blackscreen happen not only at boot but also when gaming and surf internet.
> Manage to fix it using msi ab by setting power limit to +50 and up the voltage to +25.
> Only black screen during boot still remain.
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> The mod already discovered but the issue with 290/290X BIOS is that it have digital signature. Now the signature no longer checked anymore.
> 
> I think Shammy (PT1/PT3 Shamino's BIOSes) & the guys at *coin forum know how to digital signature the modded BIOS but couldn't share it because it's confidential, I'm assuming.

This ^^^^^^^^^^^^^^

Gigabyte boards do this even on my rubbish X79-UP4. I flashed the ASUS PT1T BIOS to solve this, and my TRI 290s have performed flawlessly for a year now.


----------



## disintegratorx

Hey guys, I need some advice with my R9 290X. I'm trying to figure out the best clocks, along with the best power usage, to set my card at and still get good performance in games like Mortal Kombat X and The Witcher 3. Would anyone know what a healthy OC setting for MSI Afterburner would be, including the voltage setting and power limit? The reason I'm asking is that I think one of my BIOSes may be a little corrupted, and it's the performance one, so I'd really appreciate some more power-efficient settings that still perform. I'm also playing BF4 and trying to find out what a great overclock setting would be generally, so I can then set my CPU to an optimized group of settings. I've had issues with my PC for a bit now because I've been waiting and trying to save up to get the necessary parts to finish this build. I'm actually quite good at building, so if a more experienced person could let me know a good, solid clock and other settings to run for these games, I'd be very grateful, and of course I'll throw some respect points out for your time too. Right now I believe my clocks are set at 1222/1544 with a lot of voltage added. I was just wondering if anyone has found more efficient clocks that run Battlefield, for instance, the best. I am saving up for the Radeon R9 Fury X, but until I can get one, and for usage after I get one, I need this advice from you guys. Thank you very much in advance; I really appreciate your input.


----------



## fyzzz

It is starting to be annoying to overclock my 290. It barely gives any more voltage than stock at +100mV, and beyond that it doesn't give any more. At +150mV it just blackscreens, and I can't restart my computer; I have to shut it off manually with the power button.


----------



## Internet Swag

I'm formatting from Win 7 to Win 8.1 (both 64 bit)

Can I keep the same drivers?


----------



## Ized

Quote:


> Originally Posted by *fyzzz*
> 
> It is starting to be annoying to clock my 290. It barely gives any more voltage than stock at 100mv and beyond that it doesn't give anymore. At 150+ mv it just blackscreen and i can't restart my computer but have to manually shut it off on the power button.


Are your GPU Core and VRM1/VRM2 temps ok? It would be worth setting up GPU-Z or HWiNFO64 to log your temperature/voltage data to disk so you have a snapshot of the moment it switches off. Then we could look at all the data the moment the problem happens.

The few dozen times mine has overheated I also had to flick the switch on my PSU.

What voltage do you see while under load with your +150mv offset?
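If you do roll your own logging instead of using the built-in logging in GPU-Z or HWiNFO64, the key detail is flushing each row to disk so the last readings before a blackscreen survive. A sketch, where `read_sensors` is a stand-in for whatever actually polls the card:

```python
import csv
import time

FIELDS = ["timestamp", "core_temp", "vrm1_temp", "vrm2_temp", "vddc"]

def log_sensors(path, read_sensors, interval=1.0, samples=None):
    """Append timestamped sensor readings to a CSV file.

    read_sensors() must return a dict with the non-timestamp FIELDS;
    samples=None means log until interrupted.
    """
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if f.tell() == 0:  # new file: write the header once
            writer.writeheader()
        n = 0
        while samples is None or n < samples:
            row = dict(read_sensors())
            row["timestamp"] = time.time()
            writer.writerow(row)
            f.flush()  # make sure this row hits disk before any crash
            n += 1
            if samples is None or n < samples:
                time.sleep(interval)
```

After a blackscreen you just open the CSV and look at the final rows: if VRM temps were climbing into the danger zone right before the cutoff, that rules temps in; flat temps point elsewhere (voltage, PSU).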


----------



## fyzzz

Quote:


> Originally Posted by *Ized*
> 
> Are your GPU Core and VRM1/VRM2 temps ok? It would be worth setting up GPU-Z or HWiNFO64 to log your temperature/voltage data to disk so you have a snapshot of the moment it switches off. Then we could look at all the data the moment the problem happens.
> 
> The few dozen times mine has overheated I also had to flick the switch on my PSU.
> 
> What voltage do you see while under load with your +150mv offset?


I don't really have the numbers off the top of my head. VRM 1 is OK, around 60, but VRM 2 goes to around 70. I will come back with results when I have time.


----------



## Gumbi

Quote:


> Originally Posted by *fyzzz*
> 
> I don really have the numbers on top of my head, I will check when I have time. Vrm 1 is ok around 60, but vrm 2 goes to around 70. I will come back with results when I have time.


Check what they max out at under a sustained load. 70 is A-OK. They have very high tolerance limits, but generally people like to keep them under 85 degrees.


----------



## fyzzz

I have run Valley, and at +150mV it blackscreens, but I can run with +100mV. VRM1 max: 64C, VRM2 max: 73C, core: 68C. The voltage with +100mV sits around ~1.25, with some spikes up to 1.3. Maybe it could be my PSU? I have seen it as low as 11.38 sometimes on the 12V line.


----------



## Ized

Why the big gap between offsets anyhow? Narrow that range down as much as possible and keep an eye on the temps.

So +100mV is fine? 101? 102? 149?

With my +160mV offset I spike to over 1.4 in games, so it could still easily be temps at this stage; it's something you could rule out for sure by enabling the logging feature mentioned above.
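The narrowing-down suggested above is just a binary search over the offset between a known-good and a known-bad value; about six stress-test runs cover the whole 100-150mV range. A sketch, where `is_stable` stands in for "set the offset, run Valley, watch for a blackscreen" (each real probe is obviously a manual stress test, not a function call):

```python
def max_stable_offset(good_mv: int, bad_mv: int, is_stable) -> int:
    """Binary-search the highest stable voltage offset.

    good_mv: an offset already known to pass (e.g. 100)
    bad_mv:  an offset already known to fail (e.g. 150)
    is_stable(mv) -> bool: result of a stress test at that offset
    """
    while bad_mv - good_mv > 1:
        mid = (good_mv + bad_mv) // 2
        if is_stable(mid):
            good_mv = mid  # passed: the limit is at or above mid
        else:
            bad_mv = mid   # failed: the limit is below mid
    return good_mv
```

For example, `max_stable_offset(100, 150, is_stable)` first probes 125 and homes in from there, returning the last offset that passed.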


----------



## kizwan

Quote:


> Originally Posted by *fyzzz*
> 
> I have run valley and at 150mv it blackscreen but i can run with 100mv. vrm 1 max: 64c vrm2 max: 73c Core: 68c. The voltage with 100mv sits around ~1.25 and some spikes up to 1.3. Maybe it could be my psu? I have seen it as low as 11.38 sometimes on the 12v line.


Ignore the software +12V reading. It's not accurate.


----------



## rv8000

Quote:


> Originally Posted by *fyzzz*
> 
> I have run valley and at 150mv it blackscreen but i can run with 100mv. vrm 1 max: 64c vrm2 max: 73c Core: 68c. The voltage with 100mv sits around ~1.25 and some spikes up to 1.3. Maybe it could be my psu? I have seen it as low as 11.38 sometimes on the 12v line.


What clocks did you have your memory at?


----------



## fyzzz

Quote:


> Originally Posted by *rv8000*
> 
> What clocks did you have your memory at?


1500


----------



## fyzzz

Hmm, at +130mV I got a blackscreen and +120mV was fine. I have now tried putting a Noctua fan on top of the PSU (I have the PSU fan facing upwards) and it can now run with +130mV. Interesting... but +150mV is still a no-go.


----------



## narmour

I set voltage to +68mV and power limit to +35 (lowered these from +81mV and +50 to get a stable OC at lower voltage).

I'm running 1150/1450 with the above power settings. I can't seem to break past this without artifacts showing. I'm not the most experienced overclocker, but have I reached the limit of the card, or do I need to pump more voltage into it to hit 1200/1500 (ideally what I want to hit!)?


----------



## kizwan

Quote:


> Originally Posted by *narmour*
> 
> I set voltage +68mv and power limit +35 (lowered these from +81mv and +50 to get stable OC/ lower voltage)
> 
> I'm running settings at 1150/1450 with above power settings. I can't seem to break past this without artifacts showing. I'm not the most experienced overclocker but have I reached the limit of the card or do I need to pump more voltage into it to hit 1200/1500 (ideally what I want to hit!).


Lowering the power limit only hides the OC instability; it doesn't fix it. You should leave it at +50 & increase the voltage.


----------



## Forceman

Quote:


> Originally Posted by *narmour*
> 
> I set voltage +68mv and power limit +35 (lowered these from +81mv and +50 to get stable OC/ lower voltage)
> 
> I'm running settings at 1150/1450 with above power settings. I can't seem to break past this without artifacts showing. I'm not the most experienced overclocker but have I reached the limit of the card or do I need to pump more voltage into it to hit 1200/1500 (ideally what I want to hit!).


If you are getting artifacts it's probably the memory that is too high. Since memory overclocking doesn't help much (not nearly as much as core) I would recommend dropping the memory clocks to 1350 or something and then try again for 1200 core.


----------



## ZealotKi11er

Been getting RSODs playing The Witcher 3 recently. My OC is 1100/1500 with +25mV. The cards are getting hot because it's summer, but they're still under water. This always happens after I hit 61C, but even then, 61C is still a very good temp in general. VRM1 and VRM2 temps are about the same, in the mid 50s.


----------



## Forceman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Been getting RSOD playing Witcher 3 recently. My OC is 1100/1500 with +25mV. Cards are getting hot because it's summer but still under Water. This always happens after i hit 61C but even then 61C is still very good temp in general. VRM1 and VRM2 temps are about the same in mid 50s.


You sure it's not CPU? I was getting white screen crashes in BF4 a while back and it turned out to be CPU.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Forceman*
> 
> You sure it's not CPU? I was getting white screen crashes in BF4 a while back and it turned out to be CPU.


These are RED screens. The CPU is 100% stable. It does not happen when the GPUs are in the 50s.


----------



## BradleyW

Quote:


> Originally Posted by *ZealotKi11er*
> 
> These are RED screens. CPU is 100% stable. Does not happen when GPUs are in 50s.


Are you still playing with CFX disabled? It is strange that hitting 61c causes a red screen. These cards can handle 90c. Plug the screen into GPU_2 and test with CFX disabled to rule out a bad card.


----------



## ZealotKi11er

Quote:


> Originally Posted by *BradleyW*
> 
> Are you still playing with CFX disabled? It is strange that hitting 61c causes a red screen. These cards can handle 90c. Plug the screen into GPU_2 and test with CFX disabled to rule out a bad card.


I am playing with CFX. Probably an unstable OC, but I don't know why 61C would cause it; that's a really good temp. Just OCed the cards to 1100/1250 with stock volts and it's fine. Never had a problem with 1500MHz memory, but I did need a +25mV bump, and that can increase temps 3-4C.


----------



## rv8000

Quote:


> Originally Posted by *fyzzz*
> 
> 1500


Quote:


> Originally Posted by *fyzzz*
> 
> hhm at 130 mv i got blacksceen and 120mv was fine. I have now tried to put a noctua fan on top of the psu (i have the psu fan facing upwards) and it can now
> run with 130mv. Intresting...but 150mv is still no go.


1500 on the ram is generally where issues crop up, especially blackscreens. Just find a stable core overclock before touching the memory and then bring the memory up to 1400 and go from there finding stability; 1400 vs 1500 isn't going to offer that much more performance, so it'd be best to get the most out of your core oc first.


----------



## boredmug

Quote:


> Originally Posted by *rv8000*
> 
> 1500 on the ram is generally where issues crop up, especially blackscreens. Just find a stable core overclock before touching the memory and then bring the memory up to 1400 and go from there finding stability; 1400 vs 1500 isn't going to offer that much more performance, so it'd be best to get the most out of your core oc first.


Yup. My cards don't like much over 1400 or so on the memory. Crazy, though, that I could bench 1300/1800 on my old 7950s with stock being 800/1250.


----------



## fyzzz

Quote:


> Originally Posted by *rv8000*
> 
> 1500 on the ram is generally where issues crop up, especially blackscreens. Just find a stable core overclock before touching the memory and then bring the memory up to 1400 and go from there finding stability; 1400 vs 1500 isn't going to offer that much more performance, so it'd be best to get the most out of your core oc first.


Well, it can handle 1500 just fine, even on stock voltage. I have been over 1500MHz on the memory and it has worked just fine. I know the memory clock doesn't help much. I think I will stop clocking and benchmarking it; nothing wants to work and I don't have a clue anymore. It is just too much of a headache.


----------



## MrWhiteRX7

Anyone using the new driver?


----------



## Forceman

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Anyone using the new driver?


Some people are, doesn't seem like it does much besides Batman. There's a thread here:

http://www.overclock.net/t/1561571/amd-amd-15-6-driver-is-out/0_30


----------



## MrWhiteRX7

Quote:


> Originally Posted by *Forceman*
> 
> Some people are, doesn't seem like it does much besides Batman. There's a thread here:
> 
> http://www.overclock.net/t/1561571/amd-amd-15-6-driver-is-out/0_30


Thanks


----------



## Chimera1970

Strange thing happened this evening: I was about to install the new drivers when my card apparently went bad. All the color went away and everything on my screen looked dull grey. Luckily for me, I had an extra card that I had purchased about a year ago; I installed it, and everything went back to normal.

Anybody ever seen this happen before? I've had the card for over a year, so it's out of warranty. Oh well; if anything, it'll open up the possibility of buying a new card.


----------



## randhis

My 290X with the Arctic cooler for the GTX 280 (it is exactly the same as the Arctic Xtreme 3),
using Gelid Icy heatsinks and Arctic heatsinks with thermal paste.

Ambient room temp is 33/34C.
Idle temp is 46/47C.
When playing GTA V at the highest settings it is 82C.

But oddly, when I watch movies the temp goes up to 66C.

Is there a fan speed or memory clock I should set?

I am also unable to find VRAM temps using GPU-Z; is there any other way?
Thank you.


----------



## Arizonian

Quote:


> Originally Posted by *xKrNMBoYx*
> 
> Well contrary to our belief the switch did have to do something with UEFI BIOS. When I installed the second 290X today I went to GPU-Z and one had a checkmark confirming it was UEFI enabled for fastboot, etc. The second was unchecked. So I switched the Uber/Quiet Switch on the new GPU and booted back up. Both now have the UEFI BIOS Checkmark. Not sure if it still has Uber/Quiet on the switch.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated








Quote:


> Originally Posted by *randhis*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> my 290x with arctic cooler gtx280(it is exactly same with arctic extreme 3)
> used gelid icy heatsinks and arctic heatsinks with thermal paste
> 
> ambient room temp is 33/34C
> IDLE TEMP is 46/47C
> when playing gta v highest setting is 82C
> 
> but oddly when i watch movies the temp go up to 66C
> 
> is there a fan speed or memory clock i should set ?
> 
> I am also unable to find vram temps using GPU-Z, is there any other way?
> thank you


Congrats - added


----------



## kizwan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *BradleyW*
> 
> Are you still playing with CFX disabled? It is strange that hitting 61c causes a red screen. These cards can handle 90c. Plug the screen into GPU_2 and test with CFX disabled to rule out a bad card.
> 
> 
> 
> I am playing with CFX. Probably a unstable OC but dont know why 61C would cause it. That's really good temp. Just OCed the card to 1100/1250 with stock volts and it's fine. Never had problem with 1500MHz memory but i did need a +25mV bump can that can increase temps 3-4C.

I have also experienced a similar problem when the core temp exceeds 60C for an extended period of time, and it only happened when overclocked and overvolted at the same time. The only difference is that there is no red screen for me. It is pretty much stable and can go on for hours if the temp is kept below 60C, but once it exceeds that for an extended period of time, boom, crash. So I can't really say it was an unstable overclock. I actually have three log files with me that show the games crashed after the card ran above 60C for an extended period of time. If it's rapidly fluctuating between above and below 60C, it won't crash though.


----------



## disintegratorx

I've found out what my problem was. It was my board not playing nice with my RAM config. These Ivy Bridge CPUs aren't as forgiving with XMP profiles, I think... Gonna have to get a matching set doing 1866MHz, or around there.


----------



## cephelix

@randhis is it just me or is your card sagging real bad?


----------



## fyzzz

I have tried 150 mV and 1250 MHz on the memory today, and I still get the error sound and a black screen.


----------



## Gumbi

Quote:


> Originally Posted by *fyzzz*
> 
> I have tried 150 mV and 1250 MHz on the memory today, and I still get the error sound and a black screen.


Have you tried using some aux voltage? Sounds like you have dud memory tbh. Is it Elpida by any chance?


----------



## fyzzz

Quote:


> Originally Posted by *Gumbi*
> 
> Have you tried using some aux voltage? Sounds like you have dud memory tbh. Is it Elpida by any chance?


I don't think the memory is the problem; I can run 1400-1500 just fine on stock voltage. I have Hynix memory.


----------



## MojoW

Quote:


> Originally Posted by *fyzzz*
> 
> I don't think the memory is the problem; I can run 1400-1500 just fine on stock voltage. I have Hynix memory.


Is this with the 390X BIOS?
If so, that is the reason.
I have tested multiple (the netkas one and the rest I made myself) and they are all unstable.


----------



## fyzzz

Quote:


> Originally Posted by *MojoW*
> 
> Is this with the 390x bios?
> If so, that is the reason.
> As i have tested multiple and they are all unstable.
> The netkas one and the rest i made myself.


I'm starting to think yes. I think I need to go back to the Sapphire BIOS I was running before (I don't have the stock BIOS anymore; I forgot to save it, and no one seems to have an XFX R9 290 DD BIOS that works on my card).


----------



## MojoW

Quote:


> Originally Posted by *fyzzz*
> 
> I'm starting to think that yes. I think i need to go back to the sapphire bios i was running before (i don't have the stock bios anymore, forgot to save it and no one seems to have a xfx r9 290 dd bios that works on my card).


The BIOS switch? Since both of them are the same on your card.

Edit: Check this thread; he has the same card with a BIOS that is not on TechPowerUp.
Hopefully he still has it, or a backup, as he wanted to flash too.


----------



## fyzzz

Quote:


> Originally Posted by *MojoW*
> 
> Bios switch?
> As both of them are the same on your card.
> 
> Edit: Check this thread, he has the same card with a bios that is not on techpowerup.
> Hopefully he still has it or a back up as he wanted to flash too.


Sadly, I messed both BIOSes up. Who has a 290 DD with a BIOS that is not on TechPowerUp?


----------



## MojoW

Quote:


> Originally Posted by *MojoW*
> 
> Bios switch?
> As both of them are the same on your card.
> 
> Edit: Check this thread, he has the same card with a bios that is not on techpowerup.
> Hopefully he still has it or a back up as he wanted to flash too.


Quote:


> Originally Posted by *fyzzz*
> 
> I messed both bioses up sadly. Who has a 290 dd that has bios that is not on techpowerup?


Click on ''this thread'' and you will land in his thread, where you can see the BIOS he is or was using.


----------



## fyzzz

Quote:


> Originally Posted by *MojoW*
> 
> Press on ''this thread'' and you will be on his thread and see the bios he is using or was using.


Welp, no luck there. He has sold his card.


----------



## Gumbi

Quote:


> Originally Posted by *fyzzz*
> 
> I'm starting to think that yes. I think i need to go back to the sapphire bios i was running before (i don't have the stock bios anymore, forgot to save it and no one seems to have a xfx r9 290 dd bios that works on my card).


Maybe the new BIOSes have tweaked memory timings which 290 mem chips can't handle.


----------



## MojoW

Quote:


> Originally Posted by *fyzzz*
> 
> Welp no luck there.He has sold his card.


Too bad, that's really the only one I found. :S
I think you're out of luck and need to use a BIOS from another card.


----------



## fyzzz

Quote:


> Originally Posted by *MojoW*
> 
> Too bad that's really the only one i found.:S
> I think your out of luck and need to use a bios of an other card.


Yup, it is so dumb to forget to save a BIOS. At least I know a Sapphire Tri-X and a Vapor-X BIOS that work.


----------



## fyzzz

Yup, now I can run even with 200 mV and no problem. At 200 mV the voltage sits around 1.28 V.


----------



## MojoW

Quote:


> Originally Posted by *fyzzz*
> 
> Yup now i can run even with 200mv and no problem. At 200mv the voltage sits around 1.28.


Yeah, I knew it after I did some tests with the 390X BIOS.

Gaming: unstable. (GTA would crash almost instantly, and most games showed artifacts that I've never seen before.)
Overclocking: unstable. (Like you've found out.)
Video playback on my second screen, a 47-inch 3D TV, was broken. (MPC-HC as well as VLC didn't work correctly; PotPlayer did, but had artifacts also.)

The only thing that worked great was 3DMark at stock clocks.









Edit: The artifacts were like crazy white dots flickering in and out randomly all over the screen.


----------



## fyzzz

Quote:


> Originally Posted by *MojoW*
> 
> Yeah i knew it after i did some test with the 390x bios.
> 
> Gaming unstable.(GTA would crash almost instantly and most games showed artifacts that i've never seen before)
> Overclocking unstable.(Like you've found out)
> Video playback on my second screen was broken, a 47 inch 3d tv.(MPC-HC aswell as VLC didn't work correctly, Potplayer did but had artifacts also)
> 
> The only thing that worked great was 3dmark on stock clocks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: The artifacts where like crazy white dots flickering in and out randomly all over the screen.


A shame the BIOS didn't work so well, but you had a lot more problems than I did.


----------



## jacking

Quote:


> Originally Posted by *fyzzz*
> 
> I messed both bioses up sadly. Who has a 290 dd that has bios that is not on techpowerup?


I have an XFX 290 EDBD; I will send the BIOS to you. Sorry for my English.


----------



## fyzzz

Quote:


> Originally Posted by *jacking*
> 
> I have an XFX 290 EDBD; I will send the BIOS to you. Sorry for my English.


Yeah, thanks, I can try it. Hope it works. What memory type do you have?


----------



## jacking

Quote:


> Originally Posted by *fyzzz*
> 
> Yeah thanks i can try it. Hope it works. What memory type do you have?


I'm unlucky: Elpida RAM.


----------



## rdr09

Quote:


> Originally Posted by *jacking*
> 
> I'm unlucky: Elpida RAM.


Core is king.


----------



## fyzzz

Quote:


> Originally Posted by *jacking*
> 
> I'm unlucky: Elpida RAM.


Hmm, I have Hynix, but maybe it will work, I don't know. I'm willing to test every XFX BIOS that I can get my hands on.


----------



## jacking

Quote:


> Originally Posted by *fyzzz*
> 
> Hmm i have hynix, but maybe it will work i don't know. I'm willing to test every xfx bios that i can get my hands on.


I'm not at home right now; give me about an hour.


----------



## MojoW

Quote:


> Originally Posted by *fyzzz*
> 
> Hmm i have hynix, but maybe it will work i don't know. I'm willing to test every xfx bios that i can get my hands on.


It could work; a lot of ROMs have support for both memory types.


----------



## narmour

So, a 290X with 390X bios is then a 390X?


----------



## MojoW

Quote:


> Originally Posted by *narmour*
> 
> So, a 290X with 390X bios is then a 390X?


Lol, no. A 290X with the 15.15 drivers is a 390X.


----------



## kizwan

Quote:


> Originally Posted by *jacking*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fyzzz*
> 
> Yeah thanks i can try it. Hope it works. What memory type do you have?
> 
> 
> 
> I'm unlucky: Elpida RAM.

When I see people who have Elpida RAM, I get the urge to tell them that I can overclock mine to 1600MHz, game stable.







But I usually just run my cards at 1000/1300 anyway. I have both Hynix & Elpida cards.


----------



## jacking

Quote:


> Originally Posted by *fyzzz*
> 
> Hmm i have hynix, but maybe it will work i don't know. I'm willing to test every xfx bios that i can get my hands on.


BIOS with 947 MHz core clock and 1250 MHz memory:

xfx290edfdHawaii.zip 99k .zip file

Quote:


> Originally Posted by *kizwan*
> 
> When I see people have Elpida ram, I have the urge to tell you that I can overclock mine to 1600MHz, games stable.
> 
> 
> 
> 
> 
> 
> 
> But I usually just run my cards at 1000/1300 anyway. I have both Hynix & Elpida cards.


----------



## boot318

Anyone running Win10 with a 290X? Twice a day, Win10 keeps messing with my drivers and installing WDDM v2.0 drivers. Does anyone know where Win10 keeps those AMD WDDM v2.0 drivers, or how to stop it from installing them over 15.5/6? I'm getting pretty mad having to use DDU twice a day and reinstall my drivers.


----------



## kizwan

Does anyone know how the Frame Rate Control in CCC works? Is it for limiting FPS? I did try it just now, but FPS still goes over 100. Do I need to enable both options under Frame Rate Control and also enable vsync in games?

Installed the 15.15 beta drivers hoping to test the Frame Rate Target Control, but that option is not available in CCC.
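In case it helps while experimenting: frame-rate caps of this kind generally work by having the render loop wait out the remainder of each frame's time budget. This is only a toy sketch of that idea, not AMD's implementation, and `run_capped` is a made-up name:

```python
import time

def run_capped(frames, target_fps):
    """Simulate a render loop that sleeps off leftover frame time to cap FPS."""
    budget = 1.0 / target_fps            # time allowed per frame, in seconds
    start = time.perf_counter()
    for _ in range(frames):
        t0 = time.perf_counter()
        # ... per-frame render work would happen here ...
        spent = time.perf_counter() - t0
        if spent < budget:
            time.sleep(budget - spent)   # wait out the rest of the frame budget
    return frames / (time.perf_counter() - start)

print(f"effective FPS: {run_capped(60, target_fps=100):.0f}")
```

The measured rate lands a touch under the target because sleeps always overshoot slightly.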


----------



## fyzzz

The BIOS that jacking gave me seems empty? Am I missing something? I get into Windows, but the driver doesn't work and GPU-Z is just empty.


----------



## jacking

Quote:


> Originally Posted by *fyzzz*
> 
> The BIOS that jacking gave me seems empty? Am I missing something? I get into Windows, but the driver doesn't work and GPU-Z is just empty.


I don't know; I saved my BIOS with GPU-Z.


----------



## mfknjadagr8

Quote:


> Originally Posted by *fyzzz*
> 
> Hmm i have hynix, but maybe it will work i don't know. I'm willing to test every xfx bios that i can get my hands on.


I have two XFX 290 reference cards with stock BIOS. I can dump them if you want, but I'm not sure of the versions, as I haven't bothered to check.


----------



## aaroc

My PC is stable, so it's time to play with it. Everything is at stock (the RAM is at 2133 using the XMP profile). What do you recommend to gain a few more FPS from the GPUs: flash a BIOS with higher clocks, or overclock with AB? 1x HIS Hynix (1st GPU), 3x Sapphire Elpida, all 1000/1250. I'm using the latest beta. All watercooled with good temps on core and VRMs.


----------



## fyzzz

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I have two xfx 290 reference cards with stock bios...I can dump if you want but sure of versions as I haven't bothered to check...


Why not? I can test them; I'm willing to try different BIOSes.


----------



## pengs

Quote:


> Originally Posted by *aaroc*
> 
> My PC is stable, so time to play with it. I have all stock (the ram is at 2133 using XMP profile). What do you recommend to *gain a few more FPS* from the GPUs, flash a bios with higher clocks or overclock with AB? *1x HIS Hynix (1rst GPU), 3x Sapphire*










wait for DX12









It's going to be all CPU-bound after the 2nd or 3rd GPU, unless you're running three 4Ks or 6 monitors or something.







But if you're running the modded 15.15 drivers and using Afterburner, you should be able to gain something. Personally I wouldn't muck around with a 390X BIOS, because it could cause instability, which would force you to use AB to give it more voltage anyhow.

15.15 looks like it may have some good DX11 efficiency improvements:
Quote:


> AMD's 15.5
> 
> API overhead test
> 
> DX11 multi 874 308
> DX11 Single 931 772
> Mantle 9 777 904
> 
> Modded 15.15
> 
> DX11 multi 1 120 471
> DX11 single 1 165 430
> Mantle 9 808 061
> http://forums.overclockers.co.uk/showpost.php?p=28195575&postcount=6122
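Turning the quoted API Overhead scores into relative gains (this is just arithmetic on the numbers above; nothing newly measured):

```python
# Draw calls per second from the quoted 3DMark API Overhead runs.
scores = {
    "DX11 multi":  (874_308, 1_120_471),    # (Catalyst 15.5, modded 15.15)
    "DX11 single": (931_772, 1_165_430),
    "Mantle":      (9_777_904, 9_808_061),
}
for api, (old, new) in scores.items():
    gain = 100 * (new - old) / old
    print(f"{api}: {gain:+.1f}%")
# The DX11 paths improve by roughly a quarter; Mantle is essentially unchanged.
```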


----------



## aaroc

Quote:


> Originally Posted by *pengs*
> 
> 
> 
> 
> 
> 
> 
> 
> wait for DX12
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's going to be all CPU after the 2nd or 3rd GPU unless your running 3 4K's or 6 monitors or something
> 
> 
> 
> 
> 
> 
> 
> but if your running the modded 15.15 drivers and using Afterburner you should be able to gain something. Personally I wouldn't muck around with a 390x bios because it could cause instability which will force you to use AB to give it more voltage anyhow.
> 
> 15.15 looks like it may have some good DX11 efficiency improvements


I use 3 x 2560x1440 monitors and play with horizontal Eyefinity when possible; for racing games/sims it is a must. I was asking about using an R9 290X BIOS from another vendor/model with 4GB RAM, not about trying a 390X BIOS.
So you are recommending I try the modified 390X driver with the R9 290X?


----------



## OneB1t

Also, now that the BIOS can be modified, I just hope that someone will start making a BIOS editor for the 290/390 series.


----------



## pengs

Quote:


> Originally Posted by *aaroc*
> 
> I use 3 x 2560x1440 monitors and play with horizontal Eyefinity when possible. For racing games/sims is a must. I was asking about using a R9 290X bios from another vendor/model 4GB ram, not to try a 390X Bios.
> So you are recommending me to try the modified 390X driver to be used with R9 290X?


Yeah, it's got some improvements. It might be worth a try
*Catalyst 15.15 mod* http://forums.guru3d.com/showpost.php?p=5098084&postcount=26

It increased the performance in GTA V for me, which may be due to the increased DX11 draw calls. If that's true, then anything DX11 and CPU-bound should benefit from it.


----------



## MojoW

Quote:


> Originally Posted by *pengs*
> 
> Yeah, it's got some improvements. It might be worth a try
> *Catalyst 15.15 mod* http://forums.guru3d.com/showpost.php?p=5098084&postcount=26
> 
> It increased the performance in GTAV for me which may be due to the increased DX11 draw call. If that's true then anything which is a DX11 and CPU bound should benefit from it.


For me the AMD Catalyst 15.200.1040.0 drivers are smoother, but it could be because I use Crossfire.
I don't know if it would matter for a single-card setup.
And these have the increased DX11 draw calls as well, since they are based on the same driver branch.


----------



## BradleyW

Quote:


> Originally Posted by *MojoW*
> 
> For me the AMD Catalyst 15.200.1040.0 are smoother but it could be because i use crossfire.
> Don't know if it would matter for a single card setup.
> And these have the increased DX11 draw call as well as they are based of this driver branch.


The 1040 driver is better than 15.15 in terms of CFX and draw calls. 15.15, however, contains the tess pipeline optimizations. AMD are withholding these optimizations for their R9 300 cards. Makes me sick to the core.....


----------



## MojoW

Quote:


> Originally Posted by *BradleyW*
> 
> 1040 driver is better than 15.15 in terms of CFX and draw calls. 15.15 however contains the tess pipeline optimizations. AMD are withholding these optimizations for their R9 300 cards. Makes me sick to the core.....


We have always had it bad with tess, so it is indeed sickening that they won't give us AMD supporters the same love as new customers.
Unless we buy a rebrand of the same card we already have, or go Fury.


----------



## Agent Smith1984

Quote:


> Originally Posted by *MojoW*
> 
> We have always had it bad with tess so it is indeed sickening that they won't give us AMD supporters the same love as new customers.
> Unless we buy a rebrand of the same card we allready have or go fury.


I'd bet it's coming soon for you guys.
This is just part of the sales launch for 300 series...


----------



## sinnedone

Quote:


> Originally Posted by *BradleyW*
> 
> AMD are withholding these optimizations for their R9 300 cards. Makes me sick to the core.....


Where did you read about this?


----------



## MrWhiteRX7

Any evidence of the better tess pipeline in 15.15? PCPER specifically tested the drivers and could find no obvious foul play. Just wondering if there's something I have missed.

The 1040 driver overall does seem to be the best. I'm using that combined with the ripped dll file from 15.6 for batman and having no issues


----------



## YellowBlackGod

Has anyone tried to flash an 8GB 290X to a 390X? I have the impression that it should work fine. TechPowerUp already has some BIOS files available. I have never done the procedure. Any brave soul around?


----------



## Meulen92

Hey guys,

Thinking about getting these (http://www.gelidsolutions.com/products/index.php?lid=1&cid=13&id=108) to cool my VRMs.
Does anyone know if they will fit nicely underneath the cooler of the 290X Tri-X OC?

Thanks in advance!


----------



## kizwan

Quote:


> Originally Posted by *MrWhiteRX7*
> 
> Any evidence to the better tess pipeline of the 15.15? PCPER specifically tested the drivers and could find no obvious foul play. Just wondering if there's something I have missed.
> 
> The 1040 driver overall does seem to be the best. I'm using that combined with the ripped dll file from 15.6 for batman and having no issues


I'm interested to know too. I read a claim that AMD optimized the tessellation in 15.15 by reducing the tessellation. How do we know tessellation is reduced?


----------



## Ha-Nocri

I've tried every modded driver and others out there. No performance changes in either GTA V or Witcher 3, so pretty pointless.


----------



## cplifj

I have just one thing to remark:

in every benchmark I have seen since the 290X release, the card is always getting slower and slower, while Nvidia seems to be getting faster.

Meaning, if I look at benchies now, the 290X has worse performance than in the benchies from when it was newly released.

This tells me everything gets manipulated for marketing reasons, and also that AMD was never any good to begin with.

Now the benchies with the Fury pop up and the 290X is way worse than even a vanilla 980. I remember when it used to beat it in benchmarks.... (were they even real benchmarks?)

THIS is what makes AMD a bad product.

Now this Fury X can't even beat a 980Ti, LOL.

Amateurs.


----------



## rdr09

Quote:


> Originally Posted by *cplifj*
> 
> i have just one thing to remark,
> 
> in every benchmark i have seen since 290x release, the card is allways getting slower and slower while nvidia seems to be going faster .
> 
> meaning if i look at benchy's now the 290x has worse performance then in the benchy's when it was new released.
> 
> this tells me everything gets manipulated for marketing reasons, and also that amd never was any good to begin with.
> 
> now the benchy's with the fury pop up and the 290x is way worse then even a 980 vanilla. i remember when it used to beat it in benchmarks....(were they even real benchmarks)
> 
> THIS is what makes AMD a bad product.
> 
> now this fury x can't even beat a 980Ti, LOL.
> 
> amateurs


The 14 driver vs the 13 driver: it has not changed since 14. Hoping 15.15 (the 300-series driver) will deliver . . .

http://www.3dmark.com/compare/fs/1392805/fs/2746418


----------



## kizwan

Quote:


> Originally Posted by *Ha-Nocri*
> 
> I've tried every modded driver and others out there. No performance changes in either GTA V or Witcher 3, so pretty pointless.


What kind of performance issues are you facing with the games? I have no problem with GTA V, but I did turn down some settings, of course. Maybe we can compare settings?
Quote:


> Originally Posted by *cplifj*
> 
> i have just one thing to remark,
> 
> in every benchmark i have seen since 290x release, the card is allways getting slower and slower while nvidia seems to be going faster .
> 
> meaning if i look at benchy's now the 290x has worse performance then in the benchy's when it was new released.
> 
> this tells me everything gets manipulated for marketing reasons, and also that amd never was any good to begin with.
> 
> now the benchy's with the fury pop up and the 290x is way worse then even a 980 vanilla. i remember when it used to beat it in benchmarks....(were they even real benchmarks)
> 
> THIS is what makes AMD a bad product.
> 
> *now this fury x can't even beat a 980Ti, LOL.*
> 
> amateurs


I don't know how you read the reviews, but the Fury X and 980 Ti do trade blows. In some games the 980 Ti beats the Fury X, while in other games the Fury X beats the 980 Ti.

The so-called Titan X killer, a.k.a. the 980 Ti, also loses to the Titan X in some games.


----------



## Ha-Nocri

Quote:


> Originally Posted by *kizwan*
> 
> What kind of performance issue you're facing with the games? I have no problem with GTA V but I did turn down some settings of course. Maybe we can compare settings?


No problems. The game runs smooth as butter. Just wanted to check if some drivers had impact on performance. Why not squeeze the last frame out of it


----------



## kizwan

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> What kind of performance issue you're facing with the games? I have no problem with GTA V but I did turn down some settings of course. Maybe we can compare settings?
> 
> 
> 
> No problems. The game runs smooth as butter. Just wanted to check if some drivers had impact on performance. Why not squeeze the last frame out of it

I see.


----------



## the9quad

You know, I am just going to stick with these cards until the next node. I have my complaints with CFX (when it works, I think it is better than SLI), but the cards overall have been pretty good. I think they will last until we see a smaller process. I hope AMD is still around then. I would probably do AMD again if given the choice.


----------



## LandonAaron

Quote:


> Originally Posted by *kizwan*
> 
> What kind of performance issue you're facing with the games? I have no problem with GTA V but I did turn down some settings of course. Maybe we can compare settings?
> I don't know how you read the reviews but Fury X & 980 Ti does trading blows. In some games 980 Ti beats Fury X while in the other games Fury X beats 980 Ti.
> 
> The so called Titan X killer a.k.a. 980 Ti also loose to Titan X in some games.


But which AMD card are you supposed to buy, the slower 390X with a lot of VRAM, or the fast Fury X with a little VRAM? I feel like AMD has really screwed the pooch with the 4GB VRAM decision. The Fury X is basically as fast as a 980 Ti but with 33% less RAM, and slower than EVGA's factory-overclocked 980 Ti, again with 33% less RAM. Which basically leaves AMD, as always, competing on price alone. Hopefully the card will be competitive on the price front and still make AMD enough money that they feel the need to continue in the GPU game, but I worry about team red.


----------



## Forceman

Quote:


> Originally Posted by *the9quad*
> 
> You know, I am just going to stick with these cards until the next node. I have my complaints with CFX (when it works, I think it is better than SLI), but the cards overall have been pretty good. I think they will last until we see a smaller process. I hope AMD is still around then. I would probably do AMD again if given the choice.


I'm hoping the Fury non-X is the same 5% or so slower than the Fury X that we saw with the 290/290X, and that they have some models with DVI. I really don't want to spend $650 on a GPU.


----------



## LandonAaron

Quote:


> Originally Posted by *Forceman*
> 
> I'm hoping the Fury non-X is the same 5% or so slower than the Fury X that we saw with the 290/290X, and that they have some models with DVI. I really don't want to spend $650 on a GPU.


I think one of the most interesting cards to be released will be the 390 non-X variant. About as fast as a 980; it will beat a 980 in some games and lose in others, but with 8GB of VRAM and a price tag of just $329. That sounds like a heck of a deal to me. That is, if you can take the heat, lol.

I also agree with you about the port selection on the Fury X. It is really strange. From what I read, the HDMI port will not be 2.0, and of course no DVI. Isn't DVI what most people are still using? I have a Korean overclockable monitor with only DVI input; I don't think there is any way I can connect my monitor to that GPU. Seems totally dumb to me. I mean, how many DisplayPorts do we really need? I don't own a single DisplayPort cable; I have never used one. And isn't the whole point of DisplayPort that you can daisy-chain display devices together off of one port? So why do we need three?


----------



## kizwan

Quote:


> Originally Posted by *LandonAaron*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I don't know how you read the reviews but Fury X & 980 Ti does trading blows. In some games 980 Ti beats Fury X while in the other games Fury X beats 980 Ti.
> 
> The so called Titan X killer a.k.a. 980 Ti also loose to Titan X in some games.
> 
> 
> 
> But what AMD card are you suppose to buy, the slower 390X with alot of VRAM or the fast Fury X with a little VRAM? I feel like AMD has really screwed the pooch with the 4 GB VRAM decision. The Fury X is basically as fast as a 980 ti, but with 33% less ram and slower than Evga's factory overclocked 980ti again with 33% less ram. Which basically leaves AMD as always competing on price alone. Hopefully the card will be competive on the price front and still make AMD enough money that they feel the need to continue on in the GPU game but I worry about team red.

Between the two, I prefer the Fury X, since the 390X is just a rebrand. But you have a point there about VRAM. It's not an easy choice, of course: a dilemma between the slower 390X with bigger VRAM and the faster Fury X with smaller VRAM.

I wonder whether we'll see better performance once Windows 10/DX12 is available. If there is any, it will only be useful with DX12-enabled games, of course.


----------



## Agent Smith1984

Quote:


> Originally Posted by *LandonAaron*
> 
> I think one of the most interesting cards to be released will be the 390 non x variant. About as fast as a 980. Will beat a 980 on some games and lose on others, but with 8 GB of VRAM and a price tag of just $329. That sounds like a heck of a deal to me. That is if you can take the heat, lol.
> 
> I aslo agree with you about the port selection on the Fury X. It is really strange. From what I read the HDMI port will not be 2.0, and of course no DVI. Isn't DVI what most people are still using? I have korean overclockable monitor with only DVI input. I don't think there is any way I can connect my monitor to that GPU. Seems totally dumb to me. I mean how many display ports do we really need? I don't own a single display cable. I have never used it. And isn't the whole point of Display port that you can daisy chain display devices together off of one port? So why do we need three?


Just a note for everyone concerned about DVI....

The card comes with a cable set, and DVI out....

Saw this in some pictures I saw in the Fury thread.


----------



## By-Tor

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Just a note for everyone concerned about DVI....
> 
> The card comes with a a cable set, and DVI out....
> 
> Saw this in some pictures I saw in the Fury thread.


Do we know if that cable setup will support Dual-Link DVI for 144Hz monitors?


----------



## cephelix

For me, if I were to upgrade (still undecided), I would go the Fury route. Gaming at 1200p, somehow I run out of processing power before I run out of RAM in the newer titles, e.g. DAI, so 4GB of RAM is a non-issue for me. Am I on the right train of thought, or am I doing something wrong here? I can almost max out some titles, but of course with AA off. On older games the 290 runs like a champ, such as the first two installments of the Batman franchise, but that's expected; at this point they're like 5 years old or so.


----------



## randhis

Quote:


> Originally Posted by *Meulen92*
> 
> Hey guys,
> 
> Thinking about getting these (http://www.gelidsolutions.com/products/index.php?lid=1&cid=13&id=108) to cool my VRM's.
> Does anyone know if these will fit nicely underneath the cooler of the 290X Tri-x OC?
> 
> Thanks in advance!


Hi, this is my card.

I am using the Gelid Icy rev 2.0 with the Arctic Accelero 3, which is almost similar to the Tri-X's cooler. The VRM heatsink is a bit tall; it does come into contact with the Arctic Accelero heatsink fins and bends the card a little bit. I checked my VRM temps using HWiNFO32: VRM1 is showing 55C and VRM2 53C. Not sure how accurate HWiNFO32 is, but if I were you, I would stick with the stock Tri-X; it's an awesome GPU with a proper cooler.


----------



## MrWhiteRX7

Quote:


> Originally Posted by *the9quad*
> 
> You know, I am just going to stick with these cards until the next node. I have my complaints with CFX (when it works, I think it is better than SLI), but the cards overall have been pretty good. I think they will last until we see a smaller process. I hope AMD is still around then. I would do probably do AMD again if given the choice.


I'm right there with ya. I have no desire to upgrade from my trifire setup with what I'm seeing out right now. Next year should be the real deal I hope









Only good thing is I'm really getting my money out of these cards, had them almost since launch!!!! WOOO!


----------



## newbies

Hey guys, I just picked up a 4K monitor, and I'm thinking I should upgrade my GTX 770 along with it. How big of a jump up would a 290X be with current drivers? I'm not looking to play anything super intensive, just want MOBAs/RTS/RPGs to run at 4K60, and my GTX 770 barely handles 1440p60 for a lot of those games. It seems to me like a $270 290X is a way better deal than a $500 GTX 980 right now!? I plan to make a big graphics upgrade late next year if something comes out that can play Star Citizen at 4K60.


----------



## YellowBlackGod

Quote:


> Originally Posted by *newbies*
> 
> Hey guys, I just picked up a 4K monitor, and I'm thinking I should upgrade my GTX 770 along with it. How big of a jump up would a 290x be with current drivers? I'm not looking to play anything super intensive, just want MOBAs/RTS/RPG to run at 4K60, and my GTX770 barely handles 1440P60 for a lot of those games. It seems to me like a $270 290x is a way better deal than a $500 GTX 980 right now!? I plan to make a big graphics upgrade late next year if something comes out that can play Star CItizen at 4K60.


That's what I did back in December when I swapped my reference 780 Ti for a Sapphire R9 290X 8GB. I didn't regret it. My best GPU ever. It was much cheaper, about 150 euros less than a GTX 980. And now with DX12 it will show its true potential. An 8GB model is like buying a 390X, but the 4GB models are great value as well. I would recommend a model from Sapphire or MSI, or any 8GB model you find. Go ahead!


----------



## Agent Smith1984

Quote:


> Originally Posted by *YellowBlackGod*
> 
> That's what I did back in December when I swapped my reference 780 Ti for a Sapphire R9 290X 8GB. I didn't regret it. My best GPU ever. It was much cheaper, about 150 euros less than a GTX 980. And now with DX12 it will show its true potential. An 8GB model is like buying a 390X, but the 4GB models are great value as well. I would recommend a model from Sapphire or MSI, or any 8GB model you find. Go ahead!


I hear you, but it's a common misconception that buying an 8GB 290x is like buying a 390X 8GB...

There are differences. Nothing extreme, but they are apparent.
New hardware features, reduced power, and improved driver support. There is about a 9% difference across the board.

This Fury X review is also great for comparing the 200- and 300-series cards...
https://www.techpowerup.com/reviews/AMD/R9_Fury_X/


----------



## YellowBlackGod

Believe me, they are 99% the same. OK, the memory chips are clocked higher, and that's about it. Even the core speeds are the same. My 290X runs 1060/1400 stock, and OCed at stock volts it does 1110/1520, i.e. 390X speeds. Whatever the 390X supports, the 290X supports as well: same GCN 1.1 architecture, DX12, Mantle, Vulkan, FreeSync. Drivers will be the same by the next update. The late 8GB 290X models are not what was introduced to us in October 2013. These GPUs are twins; it is just that the 390X was born second.


----------



## newbies

Quote:


> Originally Posted by *YellowBlackGod*
> 
> Believe me, they are 99% the same. OK, the memory chips are clocked higher, and that's about it. Even the core speeds are the same. My 290X runs 1060/1400 stock, and OCed at stock volts it does 1110/1520, i.e. 390X speeds. Whatever the 390X supports, the 290X supports as well: same GCN 1.1 architecture, DX12, Mantle, Vulkan, FreeSync. Drivers will be the same by the next update. The late 8GB 290X models are not what was introduced to us in October 2013. These GPUs are twins; it is just that the 390X was born second.


So if I get a 290X and overclock it to 390X stock speeds, it should perform almost exactly the same (once driver support is uniform)?


----------



## YellowBlackGod

Of course. Just take a look at the gaming benchmarks. The maximum difference (stock speeds for both) is, what, 10 fps at most, usually 1-3 fps. Just compare a Sapphire 290X 8GB with the 390X. The only difference is the RAM speed. A bumped-up chip.
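Since the posts above boil the 290X/390X difference down to memory clock, here is a rough sketch of what that clock difference means for peak bandwidth on Hawaii's 512-bit bus. The 1250/1500 MHz figures are the nominal stock real clocks (factory-OC cards like the ones quoted above run higher), and the helper name is just for illustration.

```python
# Rough GDDR5 bandwidth math for Hawaii cards (512-bit bus).
# GDDR5 transfers 4 bits per pin per real clock, so a tool-reported
# clock of 1250 MHz means 5.0 Gbps per pin.
BUS_WIDTH_BITS = 512

def gddr5_bandwidth_gbs(mem_clock_mhz: float) -> float:
    """Peak memory bandwidth in GB/s from the real memory clock in MHz."""
    gbps_per_pin = mem_clock_mhz * 4 / 1000
    return gbps_per_pin * BUS_WIDTH_BITS / 8

print(gddr5_bandwidth_gbs(1250))  # nominal stock 290X: 320.0 GB/s
print(gddr5_bandwidth_gbs(1500))  # nominal stock 390X: 384.0 GB/s
print(gddr5_bandwidth_gbs(1700))  # a 1700 MHz memory OC like fyzzz's
```

So on paper the stock memory bump alone is a 20% bandwidth gain, which lines up with the "only difference is the RAM speed" argument.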


----------



## rdr09

Quote:


> Originally Posted by *newbies*
> 
> Hey guys, I just picked up a 4K monitor, and I'm thinking I should upgrade my GTX 770 along with it. How big of a jump up would a 290x be with current drivers? I'm not looking to play anything super intensive, just want MOBAs/RTS/RPG to run at 4K60, and my GTX770 barely handles 1440P60 for a lot of those games. It seems to me like a $270 290x is a way better deal than a $500 GTX 980 right now!? I plan to make a big graphics upgrade late next year if something comes out that can play Star CItizen at 4K60.


You're gonna need two 290Xs, even compared with Maxwell 970s and 980s. A 980 Ti and above may have to be OC'ed to play nicely with a single 4K monitor.

I suggest going down from 4K to 1440 for now when using just one 290X. I use 2 290s at stock to run 4K.


----------



## Dunan

Quote:


> Originally Posted by *newbies*
> 
> Hey guys, I just picked up a 4K monitor, and I'm thinking I should upgrade my GTX 770 along with it. How big of a jump up would a 290x be with current drivers? I'm not looking to play anything super intensive, just want MOBAs/RTS/RPG to run at 4K60, and my GTX770 barely handles 1440P60 for a lot of those games. It seems to me like a $270 290x is a way better deal than a $500 GTX 980 right now!? I plan to make a big graphics upgrade late next year if something comes out that can play Star CItizen at 4K60.


Star Citizen at 60 FPS @ 4K? This is a Crysis-type title. You're looking at 5 years or so before you're getting 60 frames in Star Citizen.


----------



## newbies

Quote:


> Originally Posted by *rdr09*
> 
> You're gonna need two 290Xs, even compared with Maxwell 970s and 980s. A 980 Ti and above may have to be OC'ed to play nicely with a single 4K monitor.
> 
> I suggest going down from 4K to 1440 for now when using just one 290X. I use 2 290s at stock to run 4K.


Quote:


> Originally Posted by *Dunan*
> 
> Star Citizen at 60 FPS @ 4K? This is a Crysis-type title. You're looking at 5 years or so before you're getting 60 frames in Star Citizen.


5 years? Even for a single card setup I don't think it'll take that long. But regardless, more reason to get a card like the 290X to hold me over until I take the full 4K plunge.

Like I said, I don't play any games that intensive right now. I think my GTX 770 could probably run Heroes of the Storm at 40-50 fps at 4K (getting 65-75 right now at 1440p). I just need that little extra bump to push me back to 60 at 4K, so my main question is: will a single 290X be able to do that?


----------



## rdr09

Quote:


> Originally Posted by *newbies*
> 
> 5 years? Even for a single card setup I don't think it'll take that long. But regardless, more reason to get a card like the 290X to hold me over until I take the full 4K plunge.
> 
> Like I said, I don't play any games that intensive right now, I think my GTX 770 could probably run Heroes of the Storm at 40-50fps at 4K (Getting 65-75 right now at 1440p) I just need that little extra bump to push me back to 60 at 4K, so my main question is will a single 290X be able to do that?


Some games are playable in 4K with a single 290X. It really depends on your settings. Now, if you want a 60 fps minimum in games, then at least two Titan X, 980 Ti, or Fury cards. Just an estimate based on my experience using 4K.

Don't forget the CPU. With the high-end cards you're gonna need a very fast CPU to match.


----------



## ambientblue

Yes it will. To be sure, you could hook up your current rig to the new monitor, see how your 770 does, and then add like 35-40% fps. I went from a GTX 680 (same chip as the 770) to an R9 290X and it's worthwhile, especially for the much better memory bandwidth.
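The "measure first, then add the uplift" suggestion above can be sketched in a couple of lines. The 35-40% figure is ambientblue's ballpark for a 770-to-290X move, not a measured number, and the 40-50 fps inputs are newbies' own guess for his 770 at 4K.

```python
# Back-of-envelope FPS estimate for a GPU upgrade, given a guessed uplift.
def estimated_fps(current_fps: float, uplift_pct: float) -> float:
    """Apply a percentage uplift to a measured frame rate, rounded to 0.1."""
    return round(current_fps * (1 + uplift_pct / 100), 1)

# Pessimistic and optimistic ends of the 770 -> 290X range discussed above:
print(estimated_fps(40, 35))  # 54.0 fps
print(estimated_fps(50, 40))  # 70.0 fps
```

Even the pessimistic end lands short of a locked 60, which is why actually testing the 770 on the new monitor first is the safer move.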


----------



## ambientblue

The CPU is less relevant the higher the resolution you game at. Any i5 since 2nd gen should suffice.


----------



## BradleyW

Quote:


> Originally Posted by *ambientblue*
> 
> CPU is less relevant the higher the resolution you game at. Any i5 since 2nd gen should suffice


Only because fps is typically lower, so you get the illusion that the CPU is less relevant. Lower a setting or two, gain some fps for playability, and you'll see that CPU working hard!


----------



## ambientblue

Exactly, the GPU won't do as well at high resolutions. i5 is enough in most cases


----------



## kizwan

Quote:


> Originally Posted by *newbies*
> 
> Hey guys, I just picked up a 4K monitor, and I'm thinking I should upgrade my GTX 770 along with it. How big of a jump up would a 290x be with current drivers? I'm not looking to play anything super intensive, just want MOBAs/RTS/RPG to run at 4K60, and my GTX770 barely handles 1440P60 for a lot of those games. It seems to me like a $270 290x is a way better deal than a $500 GTX 980 right now!? I plan to make a big graphics upgrade late next year if something comes out that can play Star CItizen at 4K60.


You need at least two 290Xs for 4K. I have two 290s and, without AA, at 4K I can get an average FPS around 70. 290X crossfire should do better.


----------



## fyzzz

I want to try the modded 15.15 drivers but the driver installation just fails. I tried the 15.2.1040 and got a boost in firestrike.


----------



## Ha-Nocri

Quote:


> Originally Posted by *fyzzz*
> 
> I want to try the modded 15.15 drivers but the driver installation just fails. I tried the 15.2.1040 and got a boost in firestrike.


Yeah, me too. But in GTA5 and Witcher 3 I see no improvement.


----------



## fyzzz

Argh, so close to a 14000 GPU score in Fire Strike: http://www.3dmark.com/3dm/7508378?. 1220/1600 seems to be the max I can do. GPU-Z shows around 1.28V at +200mV in Trixx, which to me seems low.


----------



## kizwan

My cards show 1.406V & 1.383V before vdroop (max value reported by GPU-Z) respectively with +200mV.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> Argh, so close to a 14000 GPU score in Fire Strike: http://www.3dmark.com/3dm/7508378?. 1220/1600 seems to be the max I can do. GPU-Z shows around 1.28V at +200mV in Trixx, which to me seems low.


Quote:


> Originally Posted by *kizwan*
> 
> My cards show 1.406V & 1.383V before vdroop (max value reported by GPU-Z) respectively with +200mV.


kiz, you see fy's graphics score? I can never achieve that high a score at those clocks. That 290 is unusual. It is like a 290X.


----------



## peitinhos

Can someone give me some help? I have had an MSI R9 290 Gaming for one month without a problem. Yesterday while gaming I got a BSOD!?
The one thing different is that I did not raise the power limit and was playing at stock! I always played at 1050/1350 with the power limit at +25. Any hints on the problem?


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> Originally Posted by *fyzzz*
> 
> Argh, so close to a 14000 GPU score in Fire Strike: http://www.3dmark.com/3dm/7508378?. 1220/1600 seems to be the max I can do. GPU-Z shows around 1.28V at +200mV in Trixx, which to me seems low.
> 
> Quote:
> 
> Originally Posted by *kizwan*
> 
> My cards show 1.406V & 1.383V before vdroop (max value reported by GPU-Z) respectively with +200mV.
> 
> kiz, you see fy's graphics score? I can never achieve that high a score at those clocks. That 290 is unusual. It is like a 290X.

Yeah, he has a very good card, at lower voltage too. The best my card can do is 13.29K at 1200/1600. It can't go higher than that because it will start to show artifacts.


----------



## fyzzz

http://www.3dmark.com/3dm/7509036? 14000 GPU score. Managed to get to 1240/1600 (far from stable, but hey, I got through) and I ran only one monitor. Max VDDC was 1.414, but it stayed around 1.27 during testing. I also changed from the Vapor-X BIOS to the Tri-X BIOS.


----------



## Gumbi

I'm gonna try to contest some of these with my Vapor-X later.







I have never pushed it to the limit. I need a non-modded version of Trixx, as the current version I have is the one I used to unlock speeds/high voltages for my golden 7950.

My current Trixx OC tool doesn't apply +200mV properly at all and crashes the card. If I can get a game-stable OC of 1150/1550 at +81mV, I should be able to go over 1200 core with +100-200mV.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> http://www.3dmark.com/3dm/7509036? 14000 gpu score, managed to get to 1240/1600 (far from stable, but hey i got through) and i ran only one monitor. Max vddc was 1.414, but it stayed around 1.27 during testing. I also changed from the vapor-x bios to the tri-x bios.


Great job! Fantastic car you have there.


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> Great job! Fantastic car you have there.


car







? Yeah, I'm pretty happy about it. I think I might upgrade the cooling on it too, since there is no need to upgrade the GPU itself. Like a Kraken G10 with a big AIO or something.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> car
> 
> 
> 
> 
> 
> 
> 
> ? Yeah, I'm pretty happy about it. I think I might upgrade the cooling on it too, since there is no need to upgrade the GPU itself. Like a Kraken G10 with a big AIO or something.


Time to custom-watercool the whole system. Even a kit that costs like $175 for the CPU. You might snag a cheap GPU full block like I did for $60. Add the GPU to the loop, check temps once, and forget it.


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> Time to custom-watercool the whole system. Even a kit that costs like $175 for the CPU. You might snag a cheap GPU full block like I did for $60. Add the GPU to the loop, check temps once, and forget it.


Yeah, I have been thinking about getting a custom loop, but it is so expensive where I live.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> Yeah I have been thinking about getting a custom loop but it is so expensive where I live.


Someone bought my XSPC 750 pump/res combo recently for $35. His died after 7 years, he said, and he wants the same one.


----------



## scorpinot

Anyone with an XFX R9 290X & FFXIV Heavensward able to run a quick FPS test for me, in-game at low settings, 1080p, in combat?


----------



## Jflisk

I am thinking about moving to a Fury X from 3x R9 290X in Trifire. Anyone think it's even worth it?


----------



## BradleyW

Quote:


> Originally Posted by *Jflisk*
> 
> I am thinking about moving to a Fury X from 3x R9 290X in Trifire. Anyone think it's even worth it?


Get a 980 Ti. Better TDP and more VRAM. Plus it is quicker, and GameWorks issues will be a thing of the past for you.


----------



## rdr09

Quote:


> Originally Posted by *BradleyW*
> 
> get a 980 Ti. Better TDP and more VRAM. Plus it is quicker and GameWorks issues will be a thing of the past for you.


Better TDP? I thought you meant better TDR.









joke


----------



## kizwan

Quote:


> Originally Posted by *Jflisk*
> 
> I am thinking about moving to a Fury X from 3x R9 290X in Trifire. Anyone think it's even worth it?


What do you think about your 3 x 290X? Were they worth it when you got them? No regrets? If you're satisfied with your 290Xs, then the Fury X will be worth it.

If I'm going to upgrade, I'm thinking 2 x Fury (not Fury X).


----------



## Jflisk

Quote:


> Originally Posted by *kizwan*
> 
> What do you think about your 3 x 290X? Are they worth it when you got them? No regret? If you're satisfied with your 290X's, then Fury X will be worth it.
> 
> If I'm going to upgrade, I'm thinking 2 x Fury (not Fury X),


The three R9 290Xs will tear through anything I throw at them. I have owned them for about a year and have no regrets; they have served me well through many hours of games. I'm just trying to get some heat out of my loop (dump the rad box) and go with one card that comes close, or buy a card now that comes close and add a second one later. I have owned Nvidia before. I don't have problems with AMD or Nvidia (not a red or green guy, just whatever works best). Also trying to go for a one- or two-card solution.


----------



## aaroc

Quote:


> Originally Posted by *Jflisk*
> 
> The three R9 290Xs will tear through anything I throw at them. I have owned them for about a year and have no regrets; they have served me well through many hours of games. I'm just trying to get some heat out of my loop (dump the rad box) and go with one card that comes close, or buy a card now that comes close and add a second one later. I have owned Nvidia before. I don't have problems with AMD or Nvidia (not a red or green guy, just whatever works best). Also trying to go for a one- or two-card solution.


Look at the heat-dissipation part of the various benchmarks and reviews. The 970 is lower, but the 980 and 980 Ti are close to the AMD ones. If you want to keep 3 GPUs, I don't think you will be able to ditch a radiator.


----------



## rdr09

Quote:


> Originally Posted by *Jflisk*
> 
> The three R9 290Xs will tear through anything I throw at them. I have owned them for about a year and have no regrets; they have served me well through many hours of games. I'm just trying to get some heat out of my loop (dump the rad box) and go with one card that comes close, or buy a card now that comes close and add a second one later. I have owned Nvidia before. I don't have problems with AMD or Nvidia (not a red or green guy, just whatever works best). Also trying to go for a one- or two-card solution.


Too soon to tell, but there is news that Trixx will come out with a version that will allow OC'ing the Fury. In the owners thread, based on the Fire Strike graphics score, 2 290s (non-X) are about 40% slower than 2 Fury Xs, both at stock.

Pretty sure AB will have one too.


----------



## fyzzz

I don't even know with my 290: http://www.3dmark.com/3dm/7513263? I have been benchmarking a while now and will keep going to see if I can squeeze a bit more out of it.


----------



## Agent Smith1984

Quote:


> Originally Posted by *fyzzz*
> 
> I don't even know with my 290: http://www.3dmark.com/3dm/7513263? I have been benchmarking a while now and will keep going to see if I can squeeze a bit more out of it.


Very nice!

What clocks?


----------



## fyzzz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Very nice!
> 
> What clocks?


1240/1650


----------



## Agent Smith1984

Quote:


> Originally Posted by *fyzzz*
> 
> 1240/1650










?


----------



## fyzzz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 
> 
> 
> 
> 
> 
> 
> ?


Nope on air but I might watercool it, I will see.


----------



## Agent Smith1984

Quote:


> Originally Posted by *fyzzz*
> 
> Nope on air but I might watercool it, I will see.


Damn, 1240 on air huh?

Which card/cooler do you have?

How much voltage, and/or aux voltage are you running?

I will be pushing this 390 to its limits this weekend after putting on some Noctua TIM.


----------



## fyzzz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Damn, 1240 on air huh?
> 
> Which card/cooler do you have?
> 
> How much voltage, and/or aux voltage are you running?
> 
> I will be pushing this 390 to its limits this weekend after putting on some Noctua TIM.


I'm running an XFX card with a Raijintek Morpheus. I'm benchmarking with Trixx at +200mV. I want to try MSI AB, but I can't get it working.


----------



## Agent Smith1984

Quote:


> Originally Posted by *fyzzz*
> 
> I'm running an XFX card with a Raijintek Morpheus. I'm benchmarking with Trixx at +200mV. I want to try MSI AB, but I can't get it working.


I've never for the life of me been able to get 200mv on AB.

I am going to switch to Trixx for 200mv after I get my new TIM applied.

How are the max VRM and core temps with that Raijintek?


----------



## fyzzz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I've never for the life of me been able to get 200mv on AB.
> 
> I am going to switch to Trixx for 200mv after I get my new TIM applied.
> 
> How are the max VRM and core temps with that Raijintek?


The core can reach 70C and up, and VRM1 goes up to 60-something (I have never seen it reach 70). VRM1 would hit 100 before, until I put a fan at the back of the card; now it has a hard time hitting 70.


----------



## Ized

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I've never for the life of me been able to get 200mv on AB.
> 
> I am going to switch to Trixx for 200mv after I get my new TIM applied.
> 
> How are the max VRM and core temps with that Raijintek?


You may want to try other BIOSes. Afterburner and Trixx are as buggy as each other, really.

My 3D voltage is maxed after an offset of 160ish; any more makes no difference other than to increase the 2D volts, which isn't so useful.
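The "offset maxes out around 160" behaviour described above can be illustrated as a simple clamp: once base voltage plus offset hits the controller's ceiling, extra offset does nothing. The base VID and cap below are invented example numbers, not values read from any real card or tool.

```python
# Illustrative only: why voltage offsets past a certain point stop helping
# when the requested voltage is clamped at a controller ceiling.
BASE_VID_V = 1.211  # hypothetical stock 3D VID
VRM_CAP_V = 1.375   # hypothetical controller ceiling

def requested_voltage(offset_mv: float) -> float:
    """Voltage the OC tool effectively asks for, clamped at the ceiling."""
    return round(min(BASE_VID_V + offset_mv / 1000, VRM_CAP_V), 3)

print(requested_voltage(100))  # 1.311, still under the cap
print(requested_voltage(160))  # 1.371, essentially maxed
print(requested_voltage(200))  # 1.375, clamped: the last 36mV do nothing
```

With numbers like these, a +160mV and a +200mV offset land within a few millivolts of each other, which matches the "any more makes no difference" observation.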


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ized*
> 
> You may want to try other BIOSes. Afterburner and Trixx are as buggy as each other, really.
> 
> My 3D voltage is maxed after an offset of 160ish; any more makes no difference other than to increase the 2D volts, which isn't so useful.


I'm going to dump the 390X BIOS on my card, but I'm pretty wary of trying any of the modified ASUS BIOSes on my 390 since I'm not sure it's reference, and it has 8GB VRAM....

I will see where the voltage caps, and put more information up on the 390/x owners thread.

Thanks


----------



## fyzzz

Quote:


> Originally Posted by *Ized*
> 
> You may want to try other BIOSes. Afterburner and Trixx are as buggy as each other, really.
> 
> My 3D voltage is maxed after an offset of 160ish; any more makes no difference other than to increase the 2D volts, which isn't so useful.


Yeah, I have noticed that too; after like +150mV it doesn't give much more. I can't run GPU Tweak because there is no ASUS BIOS that works with my card.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> Yeah, I have noticed that too; *after like +150mV* it doesn't give much more. I can't run GPU Tweak because there is no ASUS BIOS that works with my card.


please don't. take care of that.


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> please don't. take care of that.


?


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> ?


don't put any more volts in until you watercool it. you'll never find another 290 like that. your 1240 is my 1280.


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> don't put any more volts in until you watercool it. you'll never find another 290 like that. your 1240 is my 1280.

Yeah, I guess it's a must to watercool this 290. It is quite unusual. I'm wondering how much watercooling could do for it. I think I have another type of Hynix memory, not the generic one, and therefore only a few BIOSes work. EDIT: I am on my phone and the quote is a bit tricky.


----------



## davidm71

Is it worth it buying another 290x asus oc2 card for crossfire? I found them used like new at $240..

Thanks.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> Yeah, I guess it's a must to watercool this 290. It is quite unusual. I'm wondering how much watercooling could do for it.

I remember getting mine at launch. Benching is boring and fun at the same time.

Quote:

> Originally Posted by *davidm71*
> 
> Is it worth it buying another 290x asus oc2 card for crossfire? I found them used like new at $240..
> 
> Thanks.

No, just save it for Fury (non-X).


----------



## Ha-Nocri

Quote:


> Originally Posted by *davidm71*
> 
> Is it worth it buying another 290x asus oc2 card for crossfire? I found them used like new at $240..
> 
> Thanks.


Why not, if you are on a higher resolution. When I buy a FreeSync 1440p monitor I will get another used 290 for sure.


----------



## trekxtrider

[Attachment: 290Proof.PNG]


XFX R9 290 4GB

Arctic Xtreme IV with additional V-RAM heat sinks provided by Arctic

[Attachment: FullSizeRender.jpg]


----------



## gatygun

Got a question.

I'm currently looking to buy a new screen, and as I have a 290 Tri-X I want to know if I can connect 5 monitors to it.

I will have two 19-inch monitors, one on each side, an ultra-wide 29-inch in the middle (which I want to connect with 2 different screen connections), plus a 21.5-inch full-HD one on top of it.

If that isn't possible, is at least 4 possible?


----------



## fyzzz

Haha, I love my 290; it just keeps on delivering. http://www.3dmark.com/3dm/7522947?


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> Haha i love my 290 it just keeps on delivering. http://www.3dmark.com/3dm/7522947?


lol. i think you got a 290 1/2 X.

try the other versions - extreme and ultra. i bet it will scale.

edit: also, i think its vram is what makes it special. FS benefits from vram oc.

@gatygun, check the OP for info on how to hook up a multi-monitor setup.


----------



## gatygun

Quote:


> Originally Posted by *fyzzz*
> 
> Haha i love my 290 it just keeps on delivering. http://www.3dmark.com/3dm/7522947?


My e-mail is [email protected], can you send me 500 points, you got enough.

Nice score mate.


----------



## rdr09

Quote:


> Originally Posted by *gatygun*
> 
> My e-mail is [email protected], can you send me 500 points, you got enough.
> 
> Nice score mate.


check post #2 for the multi-monitor setup.


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> lol. i think you got a 290 1/2 X.
> 
> try the other versions - extreme and ultra. i bet it will scale.
> 
> edit: also, i think its vram is what makes it special. FS benefits from vram oc.
> 
> @gatygun, check the OP for info on how to hook up a multi-monitor setup.


I will try Ultra and Extreme soon, and yeah, I think my VRAM OC helps a lot; I ran it at 1700 with no problems. The funny thing is that I also got a higher score when I lowered the core clock from 1240 to 1235.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> I will try Ultra and Extreme soon, and yeah, I think my VRAM OC helps a lot; I ran it at 1700 with no problems. The funny thing is that I also got a higher score when I lowered the core clock from 1240 to 1235.


It was more stable at 1235. But since you are good at tweaking, you might figure out how to increase the core along with the 1700 VRAM. Fantastic car. The only other car as good as yours I found here is Rob's. See post #21896.

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781

His goes to 1700 as well.

edit: here, beat my Extreme . . .

http://www.3dmark.com/3dm/2098310

I got Elpida on both my 290s.


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> it was more stable at 1235. but, since you are good at tweaking, then you might figure out how to increase core along with the 1700 vram. fantastic car. the only other car as good as yours i found here is Rob's. see post # 21896.
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781
> 
> his goes to 1700 as well.
> 
> edit: here beat my extreme . . .
> 
> http://www.3dmark.com/3dm/2098310
> 
> i got elpida on both my 290s.


Here: http://www.3dmark.com/3dm/7525345? You have been beaten by.... 37 GPU score....







Managed to tweak some more and got to 1250/1715.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> Here: http://www.3dmark.com/3dm/7525345? you have been beaten by....37 gpu score....
> 
> 
> 
> 
> 
> 
> 
> . Managed to tweak some more and got to 1250/1715.


1300+ easy on water. Great job!


----------



## HOMECINEMA-PC

Dec 2013 .......

http://www.3dmark.com/fs/1413455



http://hwbot.org/submission/2471657_homecinema_pc_3dmark___fire_strike_extreme_radeon_r9_290_6395_marks


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Dec 2013 .......
> 
> http://www.3dmark.com/fs/1413455
> 
> 
> 
> http://hwbot.org/submission/2471657_homecinema_pc_3dmark___fire_strike_extreme_radeon_r9_290_6395_marks


lol. show fy a pic of your sweet setup.


----------



## fyzzz

Here is my result with the tessellation tweak: http://www.3dmark.com/3dm/7526076?


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> Here is my result with tessellation tweak:http://www.3dmark.com/3dm/7526076?


Beat this . . .

http://www.3dmark.com/3dm/2071170


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> Beat this . . .
> 
> http://www.3dmark.com/3dm/2071170


haha, 9 GPU score over: http://www.3dmark.com/3dm/7526265?


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> haha 9 gpu score over:http://www.3dmark.com/3dm/7526265?


i knew you and your card can do it.


----------



## HOMECINEMA-PC

@fyzzz Ghetto style , Twin-chillputer LooooL


Last week , I haven't touched it in nearly a year ......












Well done with dat 1700 mem clock ............ mem +










And my ride ...


----------



## fyzzz

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> @fyzzz Ghetto style , Twin-chillputer LooooL
> 
> 
> Last week , I haven't touched it in nearly a year ......
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well done with dat 1700 mem clock ............ mem +
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And my ride ...


Cool setup and nice car!


----------



## Ized

So how repeatable are these Fire Strike scores you guys are showing off? Can you run the same test 3 times and get the same general score?

Mine always differ by huge margins. I just use Fire Strike Extreme to test for artifacts now and disregard the scores.









Run 1 5745 http://www.3dmark.com/fs/5232109
Run 2 5894 http://www.3dmark.com/fs/5232157
Run 3 5851 http://www.3dmark.com/fs/5232213
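For what it's worth, the actual run-to-run spread of the three scores listed above is easy to quantify; this is just a sketch over those three numbers from the post.

```python
# Run-to-run spread of the three Fire Strike scores listed above.
scores = [5745, 5894, 5851]

mean = sum(scores) / len(scores)
spread_pct = (max(scores) - min(scores)) / mean * 100
print(f"mean {mean:.0f}, spread {spread_pct:.1f}%")
```

That works out to a mean of 5830 with about a 2.6% spread between the best and worst run, which is the kind of variance 3DMark shows between otherwise identical runs.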


----------



## rdr09

Quote:


> Originally Posted by *Ized*
> 
> So how repeatable are these Firestrike scores you guys are showing off? Can you run the same test 3 times and get the same general score?
> 
> Mine always differ by huge margins. I just use Firestrike extreme to test for artifacts now and disregard the scores
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Run 1 5745 http://www.3dmark.com/fs/5232109
> Run 2 5894 http://www.3dmark.com/fs/5232157
> Run 3 5851 http://www.3dmark.com/fs/5232213


no, you run it and run it until you get the highest score you think you'll get, or till your gpu dies.









sometimes, you have to do a complete reboot and start from scratch. other hardcore benchers (coughHomescough) have other rituals they do that we mortals do not know about.

Also, you should disable the iGPU. It can skew the scores.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *fyzzz*
> 
> Cool setup and nice car!


Thanks man, it's been very reliable, and my Lexy too.









Yeah disable IGPU man
Quote:


> Originally Posted by *fyzzz*
> 
> haha 9 gpu score over:http://www.3dmark.com/3dm/7526265?


No dude, that's the Physics score that's over 9K, not the GPU score.

Good Physics scores are achieved by tight DRAM timings.

Quote:


> Originally Posted by *rdr09*
> 
> no, you run it and run it until you get the highest score you think you'll get, or till your gpu dies.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *sometimes, you have to do a complete reboot and start from scratch. other hardcore benchers (coughHomescough) have other rituals they do that we mortals do not know about.*
> 
> Also, you should disable igpu. it can skew the scores.


Sssshhhhhh you know .........


----------



## Jflisk

Anyone try the Windows 10 preview and have stuttering with multiple GPUs in games?


----------



## Gumbi

Quote:


> Originally Posted by *Ized*
> 
> So how repeatable are these Firestrike scores you guys are showing off? Can you run the same test 3 times and get the same general score?
> 
> Mine always differ by huge margins. I just use Firestrike extreme to test for artifacts now and disregard the scores
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Run 1 5745 http://www.3dmark.com/fs/5232109
> Run 2 5894 http://www.3dmark.com/fs/5232157
> Run 3 5851 http://www.3dmark.com/fs/5232213


Lol? You realise the numbers you gave are within about 2.5% of each other? That's perfectly normal run-to-run variance.


----------



## Sempre

Anyone got an idea on what's going on with my crossfire setup? I've been having usage variation on both GPUs in a couple of games, so I overclocked the CPU to 4.3GHz to rule out a CPU bottleneck. Same thing.

I disabled crossfire and compared the usage
Here is an example on BF4:

Two cards:

There are some gpu usage spikes
Cpu usage ~40-60%

One card:

Slightly better
Cpu usage ~40%

ULPS is disabled. Temp is not an issue since all of them are at 60-70C under load. I'm thinking maybe my 760W PSU isn't supplying enough power; I realize I'm right at the edge considering I have a 4770K and two 290s, although they're at stock voltage. But then the cards don't downclock from 1GHz, so I'm not sure.
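A rough power budget puts the "is 760W enough?" question in numbers. A back-of-the-envelope sketch; the per-component wattages are nominal TDP/board-power ratings (assumed figures, not measurements):

```python
# Rough power budget for a 4770K + two R9 290s at stock voltage.
# TDPs are nominal ratings; real transient draw can exceed them.
loads_w = {
    "i7-4770K": 84,               # Intel rated TDP
    "R9 290 #1": 250,             # approximate board power
    "R9 290 #2": 250,
    "board/RAM/drives/fans": 75,  # rough allowance
}

total = sum(loads_w.values())
psu = 760
headroom_pct = (psu - total) / psu * 100

print(f"estimated load: {total} W of {psu} W ({headroom_pct:.0f}% headroom)")
```

On paper that leaves about 13% headroom at stock, which fits "right at the edge but probably OK"; adding core voltage to either card would erode it quickly.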


----------



## Sunshine323

I recently bought a Radeon R9 290X MSI Lightning. I think it's a great card for a decent price.
It was a real fast card, a great upgrade from my aging GTX 285, but I had to return it for a refund the day after it was delivered because it had terrible coil whine in any 3D application.
The one I got was, I guess, already sent back to them for the same reason; the box looked like it had already been opened once.

I'm considering a different card, maybe the same one from a different store, or a 390X.

What's most important to me is that the new card is fast but doesn't whine.

Maybe you guys can share some of your experiences with the card and whether you ever had coil whine with it.


----------



## bkvamme

Quick question.

How far can I push the voltage on my R9 290X? It's a Gigabyte R9 290X Windforce 3X with an EK waterblock on it. I am stable at 1200MHz core / 1465MHz memory at a +95mV offset, but I am curious how much is safe for a normal, stable clock. VRM 2 temps are getting crazy in Furmark (115C after 2 hours), but perfectly normal in games and such.

Also, the card only has 6+8-pin power delivery, and according to HWiNFO, VRM power in peaks at 350W. How much power can this card handle? Will it automatically stop, or just die?
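On the 6+8-pin side of that question: the PCIe spec budgets 75W for the slot, 75W for a 6-pin, and 150W for an 8-pin, so a 350W peak is already past the nominal 300W total. A quick sketch of that arithmetic:

```python
# Nominal PCIe power-delivery budget for a 6+8-pin card (spec values).
budget_w = {"PCIe slot": 75, "6-pin": 75, "8-pin": 150}

spec_total = sum(budget_w.values())
observed_peak = 350  # HWiNFO "VRM power in" reading from the post above

print(f"spec budget: {spec_total} W, observed peak: {observed_peak} W, "
      f"over spec by {observed_peak - spec_total} W")
```

Connectors and VRMs typically tolerate going over spec, but it shows how far past the nominal budget the card is being pushed.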

3Dmark Score:

Fire strike 11159
Sky diver 27616
Cloud gate 27157
Ice Storm 170418 http://www.3dmark.com/3dm/7499550?
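On the "+95mV offset" arithmetic: the offset adds on top of the card's stock VID, so the actual core voltage works out roughly as below (a sketch; the ~1.25V stock VID is an assumed typical value and varies card to card):

```python
# Offset overvolting: the slider adds to the card's stock VID.
# Stock VID assumed ~1.250 V here; check your own card's load VID.
stock_vid_v = 1.250
offset_mv = 95

actual_v = stock_vid_v + offset_mv / 1000.0
print(f"+{offset_mv} mV offset -> ~{actual_v:.3f} V core")
```

By the same arithmetic a +200mV offset lands around 1.45V, in the same ballpark as the ~1.43V readings reported elsewhere in the thread.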


----------



## fyzzz

I have been tweaking some more today and came to the conclusion that my absolute max overclock is 1260/1726. Now I only need to save up some money for watercooling.


----------



## Ramzinho

I'm so confused. For some reason 3DMark sees my RAM as 1333 despite it reading as 1600 in CPU-Z's memory tab. It also keeps recording my clocks as stock despite being overclocked! Any ideas?

http://www.3dmark.com/3dm/7539068?


----------



## Gumbi

Quote:


> Originally Posted by *fyzzz*
> 
> I have been tweaking some more today and came to the conclusion that my absolute max overclock is 1260/1726. Now i only need to save up some money for watercooling.


That's an insane overclock on air. What are your load core/VRM temps?


----------



## Eltanior

Hi there! I'm looking for the Sapphire R9 290X Vapor-X OC with 4GB Hynix BFR memory BIOS. Can anyone help me with this? I'll be very grateful.


----------



## fyzzz

Quote:


> Originally Posted by *Gumbi*
> 
> That's an insane overclock on air. What are your load core/VRM temps?


Core and VRM go up to around 70C in Firestrike at +250mV, and voltage is around 1.3V. I am on a Raijintek Morpheus also.


----------



## Agent Smith1984

Quote:


> Originally Posted by *fyzzz*
> 
> Core and VRM go up to around 70C in Firestrike at +250mV, and voltage is around 1.3V. I am on a Raijintek Morpheus also.


The 390 must be doing something different in regards to power handling then....

At +200mV I am getting a constant ~1.43V with a few spikes to 1.438V


----------



## blackhole2013

I got the beast PowerColor Devil 13 290X2 running 1130 on both cores, but I can't raise the memory at all above its 1350 stock, which makes me think PowerColor uses stock 290X 1250 memory and just overclocked it to 1350... I can cook an egg in my computer when playing though, lol... I've got GTA V all the way turned up running 60 fps. I love this card even though it's the loudest one out there...


----------



## Agent Smith1984

Quote:


> Originally Posted by *blackhole2013*
> 
> I got the beast powercolor devil 13 290x2 running 1130 on both cores but cant raise the memory at all above its 1350 stock which makes me think that powercolor uses stock 290x 1250 memory and just overclocked it to 1350... I can cook an egg in my computer when playing tho lol ... I got gta v all the way turned up running 60 fps I love this card even tho its the loudest one out there ...


In my experience, 290 vram needs additional core voltage to overclock... Not sure why, but it works for myself and others... Not that the Devil needs any more bandwidth though...


----------



## moorhen2

Quote:


> Originally Posted by *Eltanior*
> 
> Hi there! I'm looking for Sapphire R9 290X Vapor-X OC with 4Gb Hynix BFR memory BIOS. Can anyone help me with this? I'll be very grateful


Here you go

biosfor290x.zip 41k .zip file


----------



## blackhole2013

Quote:


> Originally Posted by *Agent Smith1984*
> 
> In my experience, 290 vram needs additional core voltage to overclock... Not sure why, but it works for myself and others... Not that the Devil needs any more bandwidth though...


So should I turn up the Afterburner aux voltage? Will that help with the RAM overclock?


----------



## Performer81

Quote:


> Originally Posted by *blackhole2013*
> 
> So should I turn up the Afterburner aux voltage? Will that help with the RAM overclock?


No, just GPU voltage. The memory controller sits in the GPU.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *bkvamme*
> 
> Quick question.
> 
> How far can I push the voltage on my R9 290X? It's a Gigabyte R9 290X Windforce 3X with a EK waterblock on. I am stable at 1200MHz/1465MHz Core clock at +95mV offset, but I am curious as to how much is safe to do for a normal, stable clock. VRM 2 temps are getting crazy in furmark (115degC after 2 hours), but perfectly normal in games and such.
> 
> Also, the card only has 6+8pin power delivery, and according to HWInfo, VRM power in peaks out at 350W, how much power can this card handle? Will it automatically stop, or just die?
> 
> 3Dmark Score:
> 
> Fire strike 11159
> Sky diver 27616
> Cloud gate 27157
> Ice Storm170418 http://www.3dmark.com/3dm/7499550?


1. I have had all 3 up to and past 1.5V core
2. Stable 290 overclocks are trial and error. Personally 1000/1250 is plenty for 24/7
3. Don't be stupid and use Furmark, you will fry your VRMs. Redo your pads and get some Fujipoly thermal pads..... you should only see 40C - 50C w/block ........ good grief 115C
4. I've seen 375W outta mine
Quote:


> Originally Posted by *fyzzz*
> 
> I have been tweaking some more today and came to the conclusion that my absolute max overclock is 1260/1726. *Now i only need to save up some money for watercooling.*


Yeah stop cooking that card 

Quote:


> Originally Posted by *Ramzinho*
> 
> I'm so confused. for some reason my 3Dmark sees my ram as 1333 despite being read as 1600 in the CPU-Z memory. also it keeps recording my clocks as stock despite being Overclocked ! any ideas?
> 
> http://www.3dmark.com/3dm/7539068?


It's what 3DMark does, nuthin ya can do


----------



## Ramzinho

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 1. I have had all 3 up to and past 1.5V core
> 2. Stable 290 overclocks are trial and error. Personally 1000/1250 is plenty for 24/7
> 3. Don't be stupid and use Furmark, you will fry your VRMs. Redo your pads and get some Fujipoly thermal pads..... you should only see 40C - 50C w/block ........ good grief 115C
> 4. I've seen 375W outta mine
> 
> [/B]
> 
> Yeah stop cooking that card
> Its what 3D mark does nuthin ya can do


So you advise 1000/1250 for daily use?

Isn't that stock?


----------



## DividebyZERO

Guys, might want to check this out if it hasn't been mentioned:

http://www.overclock.net/t/1561904/mlu-bios-builds-for-290x#post_24085220


----------



## rdr09

Quote:


> Originally Posted by *Sunshine323*
> 
> I recently bought a radeon R9 290x MSI lighting i think its a great card for a decent price.
> It was a real fast card a great upgrade to my aging GT285 but i had to return it for a refund the day after it was delivered because it had some terrible coil whine in any 3d applications.
> The one i got was what i guess already send back to them for the same reason as i did. the box looked like it was already opened once.
> 
> im considering a different card or maybe the same from a different store or get a 390x.
> 
> Whats most important to me is that the new card is fast but dousn't whine
> 
> Maybe you guys can share me some of your experiences with the card and if you ever experienced coilwhine with it.


just get another from the same store, if it is that easy to replace, in case you get another with coil whine. some cards have it, others do not. just luck of the draw.

Quote:


> Originally Posted by *bkvamme*
> 
> Quick question.
> 
> How far can I push the voltage on my R9 290X? It's a Gigabyte R9 290X Windforce 3X with a EK waterblock on. I am stable at 1200MHz/1465MHz Core clock at +95mV offset, but I am curious as to how much is safe to do for a normal, stable clock. VRM 2 temps are getting crazy in furmark (115degC after 2 hours), but perfectly normal in games and such.
> 
> Also, the card only has 6+8pin power delivery, and according to HWInfo, VRM power in peaks out at 350W, how much power can this card handle? Will it automatically stop, or just die?
> 
> 3Dmark Score:
> 
> Fire strike 11159
> Sky diver 27616
> Cloud gate 27157
> Ice Storm170418 http://www.3dmark.com/3dm/7499550?


I agree with Homes. Your VRMs should stay around the 50s; I don't think even Furmark will make them that hot. Did you remove the protective coatings on both sides of the thermal pad?


----------



## Sempre

Quote:


> Originally Posted by *Sempre*
> 
> Anyone got an idea on whats going on with my crossfire setup? I've been having usage variation from both gpus on a couple games so i overclocked the cpu to 4.3 to rule out a cpu bottleneck. Same thing.
> 
> I disabled crossfire and compared the usage
> Here is an example on BF4:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Two cards:
> 
> There are some gpu usage spikes
> Cpu usage ~40-60%
> 
> One card:
> 
> Slightly better
> Cpu usage ~40%
> 
> 
> 
> ULPS is disabled. Temp is not an issue since all of them are at 60-70 c load. I'm thinking maybe my 760watt psu isn't supplying enough power, i do realize im right at the edge considering i have a 4770k and two 290s although they're at stock voltage. But then the cards dont downclock from 1Ghz so im not sure.


Ok, so I tried different drivers. Installed 15.15 and no improvement. Cleaned with DDU and reinstalled the latest official beta, 15.4. Tried switching the BIOS switch on the cards. Disabled all overclocks so everything is at stock. Lastly, I uninstalled Afterburner and started monitoring with HWiNFO and GPU-Z. So far nothing has improved.

I still see some usage spikes with crossfire disabled but gameplay is a lot smoother and less stuttering.

This is a test I did with Valley:

1 card enabled


2 cards enabled


With hwinfo:
1 card

2 cards


----------



## the9quad

Tried the modified drivers and some things just don't work, so I installed the 15.6s instead.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Ramzinho*
> 
> So you advise 1000/1250 for daily use?
> 
> Isn't that stock?


For me, with the Asus PT1T 290X BIOS I run on all of my cards, it is









Quote:


> Originally Posted by *the9quad*
> 
> Tried the *modified drivers* and some things just dont work, so installed the 15.6's instead.


That driver for some reason made my VLC player not display video, first time in many years, but when gaming it was fine. Rolled back and have VLC player back to normal


----------



## bkvamme

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 1. I have had all 3 up to and past 1.5V core
> 2. Stable 290 overclocks are trial and error. Personally 1000/1250 is plenty for 24/7
> 3. Don't be stupid and use Furmark, you will fry your VRMs. Redo your pads and get some Fujipoly thermal pads..... you should only see 40C - 50C w/block ........ good grief 115C
> 4. I've seen 375W outta mine


Okay, thanks, still some headroom there.

Might very well keep the OCed profile on standby in case I need it, as most of the games I play run perfectly fine on ultra at my refresh rate/resolution anyway ([email protected]). Ran Eyefinity, but my triple monitor setup got vetoed by the significant other (said it took up too much space and did not look nice, can you believe it?).

Yeah, I thought that was weird! The other temperature reported by the VRMs is around 40-50 under normal use, and peaked at 57 with an ambient temp of 39C. Will have to redo the pads and try to get hold of some Fujipoly thermal pads. I only need the 1mm ones, yeah?

Will this do?
http://www.ebay.com/itm/Fujipoly-Extreme-1-0mm-11W-mK-Thermal-Pad-100mm-x-15mm-High-quality-Unit-1-/181701212676?pt=LH_DefaultDomain_3&hash=item2a4e3c7a04
Quote:


> Originally Posted by *rdr09*
> 
> i agree with Homes. Your vrms should stay around 50s. i don't think even Furmark will make it that hot. Did you remove the protective coatings on both sides of the thermal pad?


Yes, I certainly believe so. But you can never rule out stupid mistakes like that. Will order up some Fujipoly thermal pads and take it apart to have a look.



How do I know which sensor applies to which VRM? I have two different readings in HWiNFO. Temps will be higher after longer use; I just fired up FM for 5 minutes to show the difference.



Would make it easier to troubleshoot.

I only used FM for testing the cooling capabilities of my loop; 3DM11 showed a lot more artifacts, so I switched over to that to determine whether the OC was good.


----------



## kdawgmaster

Hate to say it, but you can remove me from the list as I no longer have my R9 290X cards.


----------



## Nauticle

So I'm getting my new XFX 290X tomorrow and was wondering what would be the best driver to start with when switching from an Nvidia card. I'm thinking about getting the 15.6 beta since it includes updates for The Witcher 3 while the latest stable one doesn't. Reading through the thread it seems there aren't many issues with the betas, but I'm just curious about stability and whether I should go straight to beta after the switch. Also, I'm switching between Windows 7 and 10, so no idea which driver I should use for either OS (if it even matters). Thanks.


----------



## broken pixel

The beta drivers work fine and I have tested a bunch of them; just be sure to use DDU & CCleaner to clean up all the Nvidia GPU drivers. I tested some of the other drivers with Windows 10 and the ones that come stock with Win10 work the best. I could not get BF4 to start with other AMD drivers on Windows 10, but the ones the OS installs worked fine.


----------



## ptrkhh

As a background, I have a *Silverstone RVZ01* mini-ITX case and *Arctic Accelero Twin Turbo II* GPU cooler.

I recently bought a refurbished Sapphire R9-290 (reference) from a store and it turned out to be defective. The store didn't have any replacement unit, so they 'upgraded' it to an *ASUS Radeon R9-290 DirectCU II OC* model.

Here's the thing: as it turns out, the DirectCU II is not compatible with my RVZ01 since it's too wide. With the GPU installed, I have to remove the power connector on the back of the case. I could get an angled power cable that connects the PSU directly to the wall instead of the back of the case, or I could move the power connector to the bottom similar to the FTZ01 (http://www.silverstonetek.com/images/products/ftz01/ftz01s-34back-top.jpg), although that would require me to cut the case.

However, the cooling performance of the DCU II isn't terribly efficient either. As pointed out in many reviews, only one out of five heatpipes makes full contact with the chip itself. Sure, it's a _DirectCU_, but not for all the heatpipes: two make half contact, and the other two are simply hanging (http://www.techpowerup.com/reviews/ASUS/R9_290X_Direct_Cu_II_OC/images/cooler2.jpg). This leaves the fans running fairly loudly under load.

So, there are several options:

1. *Replace the DirectCU II cooler* with the Twin Turbo II.

2. *Replace the power cable* with an angled one that connects the PSU directly to the wall.

3. *Cut the case and move the power connector* to the bottom, similar to the FTZ01 (http://www.silverstonetek.com/images/products/ftz01/ftz01s-34back-top.jpg)

I have several questions:

1. Which one would you recommend? None of them would cost me a dime.

2. The ASUS card comes with a 5-pin fan connector. How do I adapt it to the Twin Turbo II, which has 3- and 4-pin fan connectors? Or should I just use the Molex adapter and leave the TT2 running at full speed all the time? It's not loud, but still, I would prefer a silent machine.

(About the TT2: while it's not officially supported, I have seen one guy who successfully mounted it on a 290 with quite impressive results http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/10550#post_21365851 )


----------



## Ha-Nocri

Quote:


> Originally Posted by *Sempre*
> 
> Ok so tried different drivers. Installed 15.15 and no improvement. Cleaned with ddu and re installed the latest official beta 15.4. Tried switching the bios switch on the cards. Disabled all overclocks so everything is at stock. Last thing I uninstalled afterburner and started monitoring with hwinfo and gpuz. So far nothing improved.
> 
> I still see some usage spikes with crossfire disabled but gameplay is a lot smoother and less stuttering.
> 
> This is a test i did with Valley:
> 
> 1 card enabled
> 
> 
> 2 cards enabled
> 
> 
> With hwinfo:
> 1 card
> 
> 2 cards


That's normal behavior if you are CPU-bound or frame-capped. My 290 does that all the time. It jumps from 30 to 100% usage when the GPU has headroom, and gameplay is smooth as butter


----------



## rdr09

Quote:


> Originally Posted by *Sempre*
> 
> Ok so tried different drivers. Installed 15.15 and no improvement. Cleaned with ddu and re installed the latest official beta 15.4. Tried switching the bios switch on the cards. Disabled all overclocks so everything is at stock. Last thing I uninstalled afterburner and started monitoring with hwinfo and gpuz. So far nothing improved.
> 
> I still see some usage spikes with crossfire disabled but gameplay is a lot smoother and less stuttering.
> 
> This is a test i did with Valley:
> 
> 1 card enabled
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 2 cards enabled
> 
> 
> With hwinfo:
> 1 card
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 2 cards
> 
> 
> Spoiler: Warning: Spoiler!


Was that running a loop in Valley or just the bench? When you uninstall AB, your ULPS setting resets to default, which is enabled.

here was my Valley run with two 290s . . .



When you say temp is not an issue . . . do you mean temp of the VRMs as well?

Quote:


> Originally Posted by *bkvamme*
> 
> Okay, thanks, still some headroom there.
> 
> Might very well keep the OCed profile on standby incase I need it, as most of the games I play run perfectly fine on ultra with my refreshrate/resolution anyway ([email protected]). Ran Eyefinity, but my triple monitor setup got downvoted by the significant other (said it took up too much space, and did not look nice, can you believe it
> 
> 
> 
> 
> 
> 
> 
> ).
> 
> Yeah, I thought that was weird! The other temperature reported by the VRM are around 40-50 under normal use, and peaked out at 57 with an ambient temp of 39C. Will have to redo the pads, and try to get a hold of some Fujipoly thermal pads. I only need the 1mm ones yeah?
> 
> Will this do?
> http://www.ebay.com/itm/Fujipoly-Extreme-1-0mm-11W-mK-Thermal-Pad-100mm-x-15mm-High-quality-Unit-1-/181701212676?pt=LH_DefaultDomain_3&hash=item2a4e3c7a04
> Yes, I certainly believe so. But you can never rule out stupid mistakes like that. Will order up some Fujipoly thermal pads and take it apart to have a look.
> 
> 
> 
> How do I known which sensor applies to which VRM? I have two different orders in HWiNFO. Temps will be higher after longer use, just had to fire up FM for 5 minutes to show the difference.
> 
> 
> 
> Would make it easier to troubleshoot.
> 
> Only used FM for testing the cooling capabilities of my loop, 3DM11 showed a lot more artifacts, so I switched over to that to determine whether the OC was good.


yah, that is about twice as good as the thermal pad that comes with ek. That figure skipped a part of VRM1. Not sure on your screenie which is the VRM1 or VRM2 temp, but they should be very close in temp and, almost always, lower than the core. See highlighted area in square (VRM1) . . .


----------



## bkvamme

Quote:


> Originally Posted by *rdr09*
> 
> yah, that is about twice as good as the thermal pad that comes with ek. That figure skipped a part of VRM1. Not sure on your screenie which is the VRM1 or VRM2 temp, but they should be very close in temp and, almost always, lower than the core. See highlighted area in square (VRM1) . . .


Ok, thanks. Will order some new thermal pads then, and reseat the pads. Will suck to drain the loop, but hey, bad temps are bad temps


----------



## rdr09

Quote:


> Originally Posted by *bkvamme*
> 
> Ok, thanks. Will order some new thermal pads then, and reseat the pads. Will suck to drain the loop, but hey, bad temps are bad temps


actually, even with the pads that came with the ek block . . . your temp should not go that high. it should be almost equal to the core temp. but, since you are draining and doing all that . . . might as well replace the pads with fuji.

check to make sure the old pads are making contact with the components.


----------



## bkvamme

Quote:


> Originally Posted by *rdr09*
> 
> actually, even with the pads that came with the ek block . . . your temp should not go that high. it should be almost equal to the core temp. but, since you are draining and doing all that . . . might as well replace the pads with fuji.
> 
> check to make sure the old pads are making contact with the components.


Yeah, maybe one of the pads slipped off during installation, or I missed it altogether. Only the one VRM temp sensor is very high; the other is roughly the same as the core temp.


----------



## kizwan

Quote:


> Originally Posted by *Ramzinho*
> 
> I'm so confused. for some reason my 3Dmark sees my ram as 1333 despite being read as 1600 in the CPU-Z memory. also it keeps recording my clocks as stock despite being Overclocked ! any ideas?
> 
> http://www.3dmark.com/3dm/7539068?


Did you set the Windows power option to High Performance? The 3DMark SystemInfo sometimes fails to read the actual (overclocked) clocks. It's normal.

Is your RAM's stock frequency (without XMP) 1333? I think 3DMark SystemInfo only reads the DRAM stock frequency.
Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bkvamme*
> 
> Yes, I certainly believe so. But you can never rule out stupid mistakes like that. Will order up some Fujipoly thermal pads and take it apart to have a look.
> 
> 
> 
> How do I known which sensor applies to which VRM? I have two different orders in HWiNFO. Temps will be higher after longer use, just had to fire up FM for 5 minutes to show the difference.
> 
> 
> 
> Would make it easier to troubleshoot.
> 
> Only used FM for testing the cooling capabilities of my loop, 3DM11 showed a lot more artifacts, so I switched over to that to determine whether the OC was good.
> 
> 
> 
> yah, that is about twise as good as the thermal pad that comes with ek. That figure skipped a part of VRM1. Not sure on your screenie which is VRM1 or VRM2 temp but they should be very close in temp and, almost always, lower than the core. See highlighted area in square (VRM1) . . .
Click to expand...

VRM1 is the long strip near the PCIe power connectors, while VRM2 is the one near the I/O plate.

Actually, with the stock EK pad, VRM1 can be higher than the core. I got mine 6 to 7C above core temp when overclocked & overvolted. VRM2 remains cool.


----------



## Sempre

Quote:


> Originally Posted by *Ha-Nocri*
> 
> That's a normal behavior if you are CPU bound or frame-capped. My 290 does that all the time. Jumps from 30 to 100% usage when GPU has headroom, and game-play is smooth as butter


Yes, I have it capped in some games and gameplay is smooth on one card despite usage spikes, unlike crossfire. So I purposely ran benchmarks and games with extreme graphics settings, which makes it impossible to reach any kind of cap, to rule out a CPU bottleneck.
Quote:


> Originally Posted by *rdr09*
> 
> Was that running a loop in Valley or just the bench? When you uninstall AB your ULPS setting resets to default, which is enabled.
> 
> here was my Valley run with two 290s . . .
> 
> When you say temp is not an issue . . . do you mean temp of the VRMs as well?


I was running a loop. I also switched to the free walk mode and kept moving forward in the same scene while monitoring the usage, because I thought the black-screen intervals between scenes might be causing the usage spikes in the graphs, but that was not the case.

The VRM temps reach the high 80s for the top card and high 70s for the bottom one. Do you think VRM temps are the cause?
The difference between VRM 1 and 2 in each card is about ~5C.

Your GPU1 spikes in crossfire are close to my GPU1 spikes when crossfire is disabled, but gameplay is smooth.

Edit: I just did a long loop with Valley and as you can see the GPU usage is awful. I could also see stuttering during the scenes. No frame caps were being reached.


@rdr09 I read your tip about ULPS, so I'll uninstall AB for the second time and disable ULPS via the registry.
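For anyone following along, disabling ULPS by hand means setting the `EnableUlps` DWORD to 0 under each display-adapter subkey. A minimal sketch (Windows-only; the value name and per-adapter subkey layout come from community guides, so treat them as assumptions, back up the registry first, and reboot afterwards):

```python
import sys

# Display-adapter class key; ULPS is toggled per adapter subkey (0000, 0001, ...).
# Value name per community guides (assumption): EnableUlps, DWORD 0 = disabled.
DISPLAY_CLASS = (r"SYSTEM\CurrentControlSet\Control\Class"
                 r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

def ulps_disable_value():
    """Name/data pair that disables ULPS."""
    return ("EnableUlps", 0)

def disable_ulps(subkey_index="0000"):
    """Write EnableUlps=0 under one adapter subkey. No-op off Windows."""
    if sys.platform != "win32":
        return False
    import winreg  # stdlib, Windows only
    name, data = ulps_disable_value()
    path = DISPLAY_CLASS + "\\" + subkey_index
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, path, 0,
                        winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, name, 0, winreg.REG_DWORD, data)
    return True

print(ulps_disable_value())
```

Repeat for each numbered subkey that belongs to a Radeon card (crossfire setups have several), then reboot.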


----------



## colorfuel

Hi,

I'm planning on upgrading my 7950 to a Sapphire Radeon R9 290X Tri-X OC [New Edition], 4GB GDDR5, and I will probably overclock it to 1100 core / 1500 mem if possible.

I'm also using a 3570K @ 4.5Ghz

Is my 580W be quiet! Straight Power E9 CM 80+Gold going to be enough?

I was not planning on buying Nvidia and the 290 apparently has compatibility issues with OSX (which I run as a second system).

What do you think?


----------



## BradleyW

Does anyone have resource streaming stutters on Witcher 3 with CFX enabled? (Also happens without CFX, but not as bad).


----------



## Gumbi

Quote:


> Originally Posted by *colorfuel*
> 
> Hi,
> 
> I'm planning on upgrading my 7950 to a Sapphire Radeon R9 290X Tri-X OC [New Edition], 4GB GDDR5 and I will probably overclock it to 1100 Core/ 1500 Mem if possible.
> 
> I'm also using a 3570K @ 4.5Ghz
> 
> Is my 580W be quiet! Straight Power E9 CM 80+Gold going to be enough?
> 
> I was not planning on buying Nvidia and the 290 apparently has compatibility issues with OSX (which I run as a second system).
> 
> What do you think?


A 580W Gold PSU is perfect. I dunno why you'd get a 290X when a 290 is less than 5% slower clock for clock and much cheaper.


----------



## colorfuel

Thanks for your answer.

I want to get a 290X because the 290 has issues working with OSX, which I use on a regular basis on my main system. There are workarounds, but it is too much of a hassle.


----------



## Agent Smith1984

Quote:


> Originally Posted by *colorfuel*
> 
> Thanks for your answer.
> 
> I want to get a 290X because the 290 has issues working with OSX which use on a regular basis on my main system. There are workarounds but it is too much of a hassle.


That doesn't make any sense....

Same exact design and architecture, with 256 shaders disabled???

Please elaborate...
Maybe we can help...


----------



## colorfuel

Well, somehow OSX detects your card and applies its drivers, and the drivers it uses are those from an 8XXXcontroller.kext.

I don't completely understand how it works, but the 290X shows up as an 8XXX card and is fully supported by OSX.

The 290, on the other hand, isn't, and it's not automatically recognized, but there are ways to trick OSX into using its drivers to make it seem like you have a 290X instead. But there are people reporting issues, and you will need to edit those kexts every time there is an update to OSX.

Maybe there is a way to flash a 290X BIOS onto the 290 just to trick OSX, but apparently that doesn't work with every 290, or am I wrong?

Also, reinstalling your system would be a hassle.

I wanted to opt for the hassle-free variant and just get a 290X.

I know 5% isn't a big deal and can easily be made up with a little OC.


----------



## Sunshine323

Quote:


> Originally Posted by *rdr09*
> 
> just get another in the same store if it is that easy to replace just in case you get another with coil whine. some cards have it others do not. just luck of the draw.
> i agree with Homes. Your vrms should stay around 50s. i don't think even Furmark will make it that hot. Did you remove the protective coatings on both sides of the thermal pad?


Thank you for your response. I haven't modified anything on the card; it had it right out of the box. Loud buzzing from the card when running any 3D application; in Dying Light the buzzing was the loudest. In Far Cry 4 it wasn't too extreme, but in Firestrike you could hear it from the case.

The coil whine thing seems to be an issue with more high-end models, including the GTX 970 as well. I've read many people complaining about light coil whine on some models, even ASUS ones, so I'll take it with a grain of salt.

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/67822-graphics-card-coil-whine-investigation.html

I just got mail that I'll get the money back in a few days. I'll just order another Lightning when I have the money back; it's an awesome card and the pricing is cheap for all the features you get compared to other models. I just hope I'll get the lucky draw this time.


----------



## bkvamme

Quote:


> Originally Posted by *Sunshine323*
> 
> Thank you for your response. i havn't modified anything on the card it had it right out of the box loud buzzing from the card when running any 3d application in dying light it was the most loud buzzing. in farcry 4 it wasn't to extreme but in firestrike you could hear it from the case.
> 
> the coilwhine thing seems to be an issue with more high-end models including gtx970 as well. i read may people complaining about light coilwhine on some models even asus models. i still take it with a grain of salt then.
> 
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/67822-graphics-card-coil-whine-investigation.html
> 
> i just got mail i get the money back in a few days. ill just order another lighting when i have the money back its an awesome card and the pricing is cheap for all features youll get with it compared to other models. i just hope ill get the lukcy draw this time


I have coil whine on my 290X, but only at ridiculously high FPS (1000+), so it's a non-issue for me. Did you experience coil whine at lower FPS as well?


----------



## Sunshine323

Yes, it had that at low FPS too. I tried using V-sync in those games, but it didn't help.


----------



## pozzallo

Please add me. I have 3 Asus r9 290x Matrix Platinum cards in crossfire. Asus DirectCU II aftermarket cooler.


----------



## cephelix

Quote:


> Originally Posted by *pozzallo*
> 
> 
> 
> Please add me. I have 3 Asus r9 290x Matrix Platinum cards in crossfire. Asus DirectCU II aftermarket cooler.


Those look nice. Are temps on the Matrix Platinums the same as on the normal Asus 290X cards, or are they better?


----------



## pozzallo

Quote:


> Originally Posted by *cephelix*
> 
> Those look nice. Are temps on the matrix platinums the same as the normal asus 290x cards? or are they better?


I read on a review site that the DirectCU II card ran cooler. But I had a 750D case, and my top card ran at 94 degrees in the AIDA64 stress test and would downclock the core speed. I have since changed my case to a SilverStone Raven RV-02 (not the EVO), and my video card temps dropped 10 degrees. I overclocked the cards further with no downclocking of the core speed. I am using GPU Tweak II and am happy with the results. I have a picture of my case; I also swapped the AP181 fans for the new AP182s.


----------



## aaroc

Quote:


> Originally Posted by *pozzallo*
> 
> Please add me. I have 3 Asus r9 290x Matrix Platinum cards in crossfire. Asus DirectCU II aftermarket cooler.


Can you please post a link to where they are sold? Are they sold separately for other R9 290X GPUs, or did I misunderstand what they are?

The EK thermal pads are very good. I bought Fujipolys after Roboyto's review, but stored them so well that I couldn't find them when I had all the parts to build the watercooled PC.








The user with the VRM temp problems should reinstall them if possible. I have EK backplates; they help with VRM cooling too.

Has anyone managed to use CFX with Vsync activated on Omega or the latest betas? I used the now-dead RadeonPro tool to activate frame pacing and Vsync last year with my CFX R9 290s, and it worked excellently.

For those interested in racing games, Assetto Corsa's use of 3 monitors in Eyefinity is very good without configuring the car's cockpit. You only have to drag and drop a HUD (speed, G-forces, etc., nothing mandatory) from the left screen to the center one with the mouse. In Project CARS, on the other hand, you have to move the POV a lot to get a nice view on 3 monitors.

Dirt Rally is odd in CFX. When using the bird's-eye view (above and behind the car) FPS are good, but if you use the hood view (windshield POV) the reflections of the world in the hood paint kill the FPS. There is a similar but smaller performance hit with the cockpit wheel view, where you see some of the hood and its reflections. I hope they fix it, as it's a Gaming Evolved title.


----------



## cephelix

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *pozzallo*
> 
> I read on a review site that the DirectCU II card ran cooler. But I had a 750D case, and my top card ran at 94 degrees in the AIDA64 stress test and would downclock the core speed. I have since changed my case to a SilverStone Raven RV-02 (not the EVO), and my video card temps dropped 10 degrees. I overclocked the cards further with no downclocking of the core speed. I am using GPU Tweak II and am happy with the results. I have a picture of my case; I also swapped the AP181 fans for the new AP182s.






They look great in that case. I've always wanted a SilverStone case, but they always seem to cost an arm and a leg for the feature set.


----------



## pozzallo

Quote:


> Originally Posted by *aaroc*
> 
> Can you please post a link to where they are sold? Are they sold separately for other R9 290X GPUs, or did I misunderstand what they are?
> 
> Has anyone managed to use CFX with Vsync activated on Omega or the latest betas? I used the now-dead RadeonPro tool to activate frame pacing and Vsync last year with my CFX R9 290s, and it worked excellently.
> 
> For those interested in racing games, Assetto Corsa's use of 3 monitors in Eyefinity is very good without configuring the car's cockpit. You only have to drag and drop a HUD (speed, G-forces, etc., nothing mandatory) from the left screen to the center one with the mouse. In Project CARS, on the other hand, you have to move the POV a lot to get a nice view on 3 monitors.
> 
> Dirt Rally is odd in CFX. When using the bird's-eye view (above and behind the car) FPS are good, but if you use the hood view (windshield POV) the reflections of the world in the hood paint kill the FPS. There is a similar but smaller performance hit with the cockpit wheel view, where you see some of the hood and its reflections. I hope they fix it, as it's a Gaming Evolved title.


The Matrix cards have the same cooler as the Asus R9 290X DirectCU II, but the PCB and the power phases are different on the cards themselves. JJ from Asus has a good video on YouTube explaining the differences between the Asus R9 290X models.


----------






## pozzallo

Quote:


> Originally Posted by *cephelix*
> 
> 
> They look great in that case....Always wanted a silverstone case.....but they always seem to cost an arm and a leg for the feature set.


I am in the USA; I bought my case on eBay. The case is not made by SilverStone any more, but they have new models. I bought the RV-02 because my PSU is so long it didn't fit in my RV-03 case; the RV-03 has a PSU depth limit of 180 mm, and my Corsair AX1500i is 225 mm deep.


----------



## jura11

Quote:


> Originally Posted by *colorfuel*
> 
> Thanks for your answer.
> 
> I want to get a 290X because the 290 has issues working with OSX, which I use on a regular basis on my main system. There are workarounds, but it is too much of a hassle.


Hi there

Yes, I agree the 290X will be easier to get working on OSX, but it's still possible to use an R9 290. I've installed Yosemite 10.10.3 (dual boot with W7), and my R9 290 Tri-X is working pretty much as I expected.

Thanks,Jura

Quote:


> Originally Posted by *colorfuel*
> 
> Well, somehow OSX detects your card and applies its drivers, and the drivers it uses are those from an 8XXXcontroller.kext.
> 
> I don't completely understand how it works, but the 290X shows up as an 8XXX card and is fully supported by OSX.
> 
> The 290, on the other hand, isn't, and it's not automatically recognized, but there are ways to trick OSX into using its drivers to make it seem like you have a 290X instead. There are people reporting issues, though, and you will need to edit those kexts every time there is an update to OSX.
> 
> Maybe there is a way to flash a 290X BIOS onto the 290 just to trick OSX, but apparently that doesn't work with every 290, or am I wrong?
> 
> Also, reinstalling your system will be a hassle.
> 
> I wanted to opt for the hassle-free variant and just get a 290X.
> 
> I know 5% isn't a big deal and can be easily achieved with a little OC.


Hi there

Not sure how far along you are with your Hackintosh, but I would go with Clover as the bootloader. It depends on the CPU, but I've got a fully working Clover bootloader on X58 with an X5670; it was a painless process, with only a few snags that can be ironed out pretty easily.

My R9 290 Tri-X is recognized as an R9 290 with 4GB. For now only HDMI is working, but I think this is down to the framebuffer, which I will sort out this week if I have spare time. Everything else works; I've tried a few benchmarks and everything runs as I wanted. The only pain is that I can't use multiple monitors. The only ways to get a multi-monitor setup are to enable the iGPU, which my motherboard doesn't have, or to add a second R9, but I still hope I will get my multi-monitor setup working via the framebuffer.

I've updated from 10.10 to 10.10.3 with the same settings and everything is working on my PC (I mean, I've not updated the kexts).

It really depends on the GPU, as not every 290 can be flashed to a 290X; mine can't be.

Hassle-free on OSX? Not sure; you will see how you get on.

Hope this helps

Thanks,Jura


----------



## colorfuel

@jura11

I'm using Clover as well. Did you need to modify any kexts, or does Clover do it for you?

Do you have the Tri-X new edition or the old one with the AMD pcb?

Also, I'm not planning to use multi-monitor, but my monitor only has DVI and DP, so I suppose another framebuffer should do the trick.

Thanks for the help, now I'm really considering the 290 again.


----------



## bkvamme

Quote:


> Originally Posted by *aaroc*
> 
> The EK thermal pads are very good. I bought Fujipolys after Roboytos review, but stored so well that I couldnt find them when I got all parts to build the WC PC.
> 
> 
> 
> 
> 
> 
> 
> 
> The user with VRM temp problems should reinstall them if possible. I have EK backplates, they help with vrm cooling too.


Yeah, I will have to reseat the thermal pads. The Fujipoly pads weren't all that expensive, and if I have to disassemble and drain the loop, I might as well change to the Fujipoly, if only for peace of mind. This card sucks power like no other, and the VRMs have a massive job to do, so I might as well give them the best treatment possible.


----------



## rdr09

Quote:


> Originally Posted by *bkvamme*
> 
> Yeah, I will have to reseat the thermal pads. The fujipoly pads weren't all that expensive, and if I have to disassemble/drain the loop, I might as well change to the Fujipoly, if only for ease of mind. This card sucks power like no other, and the VRMs have a massive job to do, so I might as well give them the best treatment possible.


I mislabeled VRM 2 as VRM 1 and was corrected by Kizwan. My bad.


----------



## jura11

Quote:


> Originally Posted by *colorfuel*
> 
> @yura11
> 
> I'm using clover aswell. Did you need to modify any kext or is clover doing it for you?
> 
> Do you have the Tri-X new edition or the old one with the AMD pcb?
> 
> Also, I'm not planning to use multi-monitor, but my monitor only has DVI and DP, so I suppose another framebuffer should do the trick.
> 
> Thanks for the help, now I'm really considering the 290 again.


Hi there

If you use Clover then you should be fine. I will upload pictures of my plist and the Clover settings later on, or post the plist itself.

Sorry, I forgot to say: I have edited two kexts, which is very easy. Right-click AMDRadeonX4000.kext and the 8000Controller.kext (Show Package Contents); inside each is an Info.plist. Find the IOPCIMatch line, and under it you will need to add the R9 290 device ID. I've tried two IDs; I'm not sure which one I'm using right now, but I will check later and post it.

In Clover, set the SMBIOS to Mac Pro 6,1 and your R9 290 should be recognized with full acceleration and the full 4GB of VRAM. I would also recommend dumping the BIOS from your GPU via GPU-Z and renaming it to "1002_67b1.rom"; tick Load VBIOS in Clover, then copy this BIOS to the EFI partition (mount it with Clover) at "EFI/CLOVER/ROM".

My Sapphire is, I think, the old Tri-X 4GB OC, and it's working. Full-load temps when I render in LuxRender are around 60C when it's hot outside; when I play games, temps are in the high 70s.

It's a bit complicated at first sight, but it works pretty well once you know how to do it. I've read probably 100 guides on how to enable full acceleration and full VRAM (the first time, my VRAM reading was 7MB), then tried a few other bits, and now my GPU is fully recognized.
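The IOPCIMatch edit jura11 describes can also be sketched as a small script rather than done by hand. This is only a hedged illustration: it assumes the usual kext Info.plist layout with driver personalities under IOKitPersonalities, and the "0x67B11002" ID (device 0x67B1, vendor 0x1002, matching the "1002_67b1.rom" name above) is an assumption you should verify against your own card's IDs in GPU-Z.

```python
# Hypothetical sketch of the kext edit described above: append the R9 290
# device/vendor ID to the IOPCIMatch string of a kext's Info.plist.
# The path and the ID are illustrative assumptions, not gospel.
import plistlib

def add_device_id(plist_path: str, new_id: str = "0x67B11002") -> str:
    with open(plist_path, "rb") as f:
        info = plistlib.load(f)
    # Driver personalities live under IOKitPersonalities; each one may
    # carry an IOPCIMatch string of space-separated device/vendor IDs.
    for personality in info.get("IOKitPersonalities", {}).values():
        match = personality.get("IOPCIMatch")
        if match is not None and new_id not in match.split():
            personality["IOPCIMatch"] = match + " " + new_id
    with open(plist_path, "wb") as f:
        plistlib.dump(info, f)
    return new_id
```

On a real system you would run this against the kext's Info.plist (e.g. inside AMDRadeonX4000.kext) with SIP considerations in mind, then rebuild the kernel cache; the manual Show Package Contents route jura11 gives does the same thing.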

If you need help with this please let me know

Thanks,Jura


----------



## cephelix

I didn't realise it was that complicated to run OSX on a DIY PC. I've always assumed it was like installing and running Windows.


----------



## colorfuel

Ok, I'm convinced. Will get myself a 290. 

Thanks for helping me save 60 EUR.

I'll be back if I need help.


----------



## maynard14

Hi guys, I've been out of the loop. What's the best driver for my reference XFX 290X? I just RMA'd my card and got it back a few hours ago; after work I'll play with it.


----------



## rdr09

Quote:


> Originally Posted by *maynard14*
> 
> hi guys, ive been out of the loop, whats the best driver for my 290x ref xfx? i just rma my card and i just got it a few hours ago, after work i will play with it,.


I use the latest all the time. Here is the link for Win 8.1:

http://support.amd.com/en-us/download/desktop?os=Windows+8.1+-+64

It works fine for my 290s.


----------



## the9quad

Quote:


> Originally Posted by *maynard14*
> 
> hi guys, ive been out of the loop, whats the best driver for my 290x ref xfx? i just rma my card and i just got it a few hours ago, after work i will play with it,.


I'm using the latest 15.6 beta, and it works fine. I tried the modded drivers people are using, and they break quite a few things. The 15.6s are stable and fast enough.


----------



## maynard14

Quote:


> Originally Posted by *rdr09*
> 
> i use the latest all the time. here is the link for Win 8.1 . . .
> 
> http://support.amd.com/en-us/download/desktop?os=Windows+8.1+-+64
> 
> works fine for my 290s.


Thank you so much. Downloading them now; I'll install later. Thanks!


----------



## jura11

Quote:


> Originally Posted by *cephelix*
> 
> I didn't realise it was that complicated to run osx on a diy pc. I've always assumed it was like installing and running windows.


Hi there

It really depends on the hardware you have, but it is a fairly straightforward installation. You will need a working Hackintosh or a real Mac for the first installation, to make the bootable USB with Yosemite.
Yours should be easier, as you have a Z97 UEFI board.

Hope this helps

Thanks,Jura


----------



## jura11

Quote:


> Originally Posted by *colorfuel*
> 
> Ok, I'm convinced. Will get myself a 290.
> 
> Thanks for making me spare 60 eur.
> 
> I'll be back if I need help.


Hi there

here are my settings for Clover:








Hope this helps

Thanks,Jura


----------



## Karathox

Hi everyone.

I sincerely hope that it won't be considered off-topic, but I need a little help. I recently bought a slightly damaged 290 (Sapphire Tri-X, to be exact) and... well, it turned out to be in a somewhat worse condition than I hoped - some capacitors and resistors are simply missing. I still want to attempt repairs but to do that I need to have the voltages of missing parts and, ideally, the way they're connected to each other. If anyone would be so kind as to give me that info, I would be very thankful.


Spoiler: Warning: Spoiler!


----------



## Dunan

Quote:


> Originally Posted by *Karathox*
> 
> Hi everyone.
> 
> I sincerely hope that it won't be considered off-topic, but I need a little help. I recently bought a slightly damaged 290 (Sapphire Tri-X, to be exact) and... well, it turned out to be in a somewhat worse condition than I hoped - some capacitors and resistors are simply missing. I still want to attempt repairs but to do that I need to have the voltages of missing parts and, ideally, the way they're connected to each other. If anyone would be so kind as to give me that info, I would be very thankful.
> 
> 
> Spoiler: Warning: Spoiler!


Call Sapphire; they may be able to tell you. Good luck!


----------



## jura11

Quote:


> Originally Posted by *cephelix*
> 
> I didn't realise it was that complicated to run osx on a diy pc. I've always assumed it was like installing and running windows.


Hi there

I would recommend having a look at this guide:

http://www.tonymacx86.com/yosemite-desktop-guides/143976-unibeast-install-os-x-yosemite-any-supported-intel-based-pc.html

It makes the install very easy. If you need help, please let me know; I'm very happy to help.

Thanks,Jura


----------



## cephelix

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> I would recommend have look on this guide
> 
> http://www.tonymacx86.com/yosemite-desktop-guides/143976-unibeast-install-os-x-yosemite-any-supported-intel-based-pc.html
> 
> Its very easy to install there,if you need help please let me know,I'm very happy to help
> 
> Thanks,Jura


Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> Really depends on HW which you do have,but is fairly straight forward installation,you will need working hackintosh or real mac for first installation and making bootable USB with Yosemite
> On yours should be easier as you have Z97 board and UEFI board
> 
> Hope this helps
> 
> Thanks,Jura






Thanks a bunch! I used to have a Mac way back in 2009 and it was awesome! The OS was so clean and quite responsive. Unfortunately, after moving back to Windows, I couldn't readjust myself to how the Mac functions anymore. Nonetheless, I'll give your link a read. Who knows, it might pique my interest enough to make a partition on my system.


----------



## Bigm

I'm going to be buying a pair of 290s soon and was wondering about my PSU. Currently I have this Seasonic 850w running my 4790k, 2 x HDDs, 2x SSDs, a Blu Ray Drive, an NZXT Kraken x61, and as soon as they send out the fixed models, an NZXT Grid+, as well as my ASRock Z97 Extreme6. Would 850w suffice or should I look into a 1000w or what?

Edit: The 290s I will most likely be buying are the MSI Gaming 290s, if that makes a difference.


----------



## jura11

Quote:


> Originally Posted by *cephelix*
> 
> 
> Thanks a bunch! I used to have a mac way back in 2009 and it was awesome! The OS was so clean and quite responsive. Unfortunately, after moving back to windows, I couldn't readjust myself to how the mac functions anymore. Nonetheless, I'll give your link a read. Who know, it might pique my interest enough to do a partition on my system


Hi there

In my case, my first Hackintosh back in 2010, when I got my first i7-920, was a total failure; I was unable to make a fully working Hackintosh and pretty much broke my Windows install too, and after that I didn't try again, though I did run a VMware image for a while for testing.

But last week my friend asked me if my GPU works with Yosemite, and I was curious whether it would work out of the box, so I gave it a try and ran a few tests. It still isn't perfect, and I think it never will be unless I find a good framebuffer for multi-monitor support.

I would give it a try. If you have two separate HDDs, I would go the route of Yosemite with the Clover bootloader. MultiBeast is good too, and I've also used Chameleon on my installation, but I think I prefer Clover; you will see.









Hope this helps

Thanks,Jura


----------



## pozzallo

Quote:


> Originally Posted by *Bigm*
> 
> I'm going to be buying a pair of 290s soon and was wondering about my PSU. Currently I have this Seasonic 850w running my 4790k, 2 x HDDs, 2x SSDs, a Blu Ray Drive, an NZXT Kraken x61, and as soon as they send out the fixed models, an NZXT Grid+, as well as my ASRock Z97 Extreme6. Would 850w suffice or should I look into a 1000w or what?
> 
> Edit: The 290s I will most likely be buying are the MSI Gaming 290s, if that makes a difference.


You could use an online PSU calculator to estimate your total system wattage. I use extreme.outervision.com; just fill out the information on their website.
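As a rough cross-check of what such a calculator will report for a build like the one asked about, the arithmetic can be sketched in a few lines. Every wattage here is a ballpark assumption, not a measured value; trust the calculator and actual reviews over these numbers.

```python
# Ballpark sanity check, not a substitute for a real PSU calculator.
# All per-component figures below are rough assumptions.
def estimate_draw_watts(gpus: int = 2) -> int:
    R9_290_PEAK = 275      # per card at stock; overclocking adds more
    CPU_OC = 130           # overclocked i7-4790K under load
    DRIVES = 40            # two HDDs, two SSDs, Blu-ray drive
    BOARD_RAM_FANS = 60    # motherboard, RAM, fans, AIO pump, Grid+
    return gpus * R9_290_PEAK + CPU_OC + DRIVES + BOARD_RAM_FANS

total = estimate_draw_watts()   # two cards
headroom = 850 - total          # what's left on an 850 W unit
```

Under these assumptions two stock 290s land near 780 W, which is why an 850 W unit leaves little headroom once you start overclocking.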


----------



## Bigm

Quote:


> Originally Posted by *pozzallo*
> 
> You could use a psu calculator online to give you a wattage used of total system. I use extreme.outervision.com, Just fill out the information on there website.


Thanks! +rep

Edit: Seems 850w would be pushing it if I wanted to overclock.


----------



## cephelix

Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> In my case first hackintosh back in 2010 when I've got my first i7-920 has been total failure,i've been unable to make fully working Hackintosh and pretty much I've broke my Windows too and after that I didn't tried,yes I've run VMWare image for while for testing
> 
> But last week my friend ask me,if my GPU does work with the Yosemite and I've been curious if will works out of box etc and give try and done few tests,still is not perfect and I think never will be,unless I find good framebuffer for multi monitor support
> 
> I would give try,if you have two separate HDD then I would go route of the Yosemite and Clover Bootloader,Multibeast is good too and Chameleon I've used too on my installation,but I think I prefer Clover,but you will see
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hope this helps
> 
> Thanks,Jura


Why is it such a hassle? Does the OS not recognize the hardware?
Quote:


> Originally Posted by *pozzallo*
> 
> You could use a psu calculator online to give you a wattage used of total system. I use extreme.outervision.com, Just fill out the information on there website.


Shilka has a link in his sig that shows exactly how to use the extreme.outervision PSU calculator, FYI.


----------



## pozzallo

Quote:


> Originally Posted by *Bigm*
> 
> Thanks! +rep
> 
> Edit: Seems 850w would be pushing it if I wanted to overclock.


Bigm you are welcome


----------



## jura11

Quote:


> Originally Posted by *cephelix*
> 
> Why is it such a hassle? does the OS not recognize the hardware?
> Shilka has a link in his sig that shows exactly how to use the extreme.outervision psu calculator FYI


Hi there

Not really; the first time I wanted to go the Hackintosh route, I hadn't done any research.
My last Hackintosh with Yosemite 10.10.3 was very easy, I would say very easy and hassle-free; everything is working, and I found every kext I needed for OSX to be fully working and usable.
It depends on the hardware used, but if you have a UEFI board the installation can be easier, and you should be able to find the kexts for your board.
The only thing I would still like to get working is a multi-monitor setup, but I think I will be able to sort that too; I just need to look at it a bit more during the weekend.

Hope this helps, and thanks for the rep :thumb:

Thanks,Jura


----------



## ZealotKi11er

Quote:


> Originally Posted by *pozzallo*
> 
> I read on a review site that the DirectCU II card ran cooler. But I had a 750D case, and my top card ran at 94 degrees in the AIDA64 stress test and would downclock the core speed. I have since changed my case to a SilverStone Raven RV-02 (not the EVO), and my video card temps dropped 10 degrees. I overclocked the cards further with no downclocking of the core speed. I am using GPU Tweak II and am happy with the results. I have a picture of my case; I also swapped the AP181 fans for the new AP182s.


That's like the best case for air-cooled cards.


----------



## rdr09

Quote:


> Originally Posted by *Bigm*
> 
> I'm going to be buying a pair of 290s soon and was wondering about my PSU. Currently I have this Seasonic 850w running my 4790k, 2 x HDDs, 2x SSDs, a Blu Ray Drive, an NZXT Kraken x61, and as soon as they send out the fixed models, an NZXT Grid+, as well as my ASRock Z97 Extreme6. Would 850w suffice or should I look into a 1000w or what?
> 
> Edit: The 290s I will most likely be buying are the MSI Gaming 290s, if that makes a difference.


Are those new? If you're getting two, you might be better off with the R9 390 8GB. The Gigabyte looks really nice.


----------



## Bigm

Quote:


> Originally Posted by *rdr09*
> 
> are those new? if two, you might be better off getting the R9 390 8GB. The Gigabyte looks really nice.


Nah, not new. Used on eBay. I've got a few bids out, but I'm not paying more than $150 a card. If I don't win any of them, I might look into a 390 when Amazon has their massive sale starting July 12th.

Edit: Though probably not the Gigabyte 390. Someone told me it was voltage-locked.


----------



## rdr09

Quote:


> Originally Posted by *Bigm*
> 
> Nah, not new. Used on eBay. I've got a few bids out but I'm not paying more than $150 a card. If I don't win any of them, I might look into a 390 when Amazon has their massive sale starting July 12th.


I see. If you find used Tri-Xs, those are the ones I recommend if you're going for two. And yes, if buying new, go for the 300 series.


----------



## pozzallo

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Thats like the best case for air cooled cards.


I like it better than my Raven RV-03; hard drives run cooler in this case, and there's no PSU size limit like in the RV-03. Thanks, ZealotKi11er.


----------



## Wezzor

Do you guys think AMD will release a new Omega driver anytime soon?


----------



## snow cakes

oh my goosh


----------



## Sleazybigfoot

I sold my R9 290s; I don't own them anymore.


----------



## maynard14

Quote:


> Originally Posted by *Sleazybigfoot*
> 
> I sold my r9 290's, don't own them anymore


What GPU are you going to buy, bro? Me too; tomorrow I'll sell my reference 290X. I'm sick of the cooler and I don't want to use the NZXT G10 anymore, hehe. I'm looking at a 980, or a 390X with very good cooling.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Wezzor*
> 
> Do you guys think AMD will release a new Omega driver anytime soon?


The driver that comes with Windows 10, aka 15.20, will be certified.


----------



## Wezzor

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The driver that come with Windows 10 aka 15.20 will be certified.


Alright!
Thank you!


----------



## MOSER91

Guys, I'm going to build an SFF system and was wondering: is a 600W PSU sufficient to power an R9 290X with an OC'ed i5-4670K? I was looking at this one specifically: _http://www.silverstonetek.com/product.php?pid=524_.


----------



## Arizonian

Quote:


> Originally Posted by *randhis*
> 
> hi this is my card
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I am using the Gelid Icy Rev 2.0 with the Arctic Accelero 3, almost similar to the Tri-X cooler. The VRAM heatsink is a bit tall; it comes into contact with the Arctic Accelero heatsink fins and bends the card a little. I checked my VRM temps using HWiNFO32: VRM1 is showing 55C and VRM2 53C. Not sure how accurate HWiNFO32 is, but if I were you, I would stick with the stock Tri-X; it's an awesome GPU with a proper cooler.


Congrats - added















Quote:


> Originally Posted by *pozzallo*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Please add me. I have 3 Asus r9 290x Matrix Platinum cards in crossfire. Asus DirectCU II aftermarket cooler.


Congrats - added


----------



## bkvamme

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added


You can add me as well. Single Gigabyte Windforce 3X OC R9 290X.

I appear to have gotten a pretty decent OC out of this card; I can get around 1250/1500 stable. I was able to get 1270/1520, but that wasn't stable enough for 3DMark; 3DMark 11 went OK, though.


----------



## Arizonian

Quote:


> Originally Posted by *bkvamme*
> 
> You can add me as well. Single Gigabyte Windforce 3X OC R9 290X.
> 
> Appear to have a pretty decent OC out of this card, can get around 1250/1500 stable. Was able to get 1270/1520, but that wasn't stable enough for 3DMark, 3DMark 11 went OK though.


Nice congrats - added


----------



## pozzallo

Quote:


> Originally Posted by *Arizonian*
> 
> Congrats added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added


Quote:


> Originally Posted by *Arizonian*
> 
> Congrats added
> 
> 
> 
> 
> 
> 
> 
> 
> Congrats - added


Thanks Arizonian


----------



## Tabinhu

I'm going to buy a used R9 290 for 190€; considering this is Portugal, that's cheap! It's the stock cooler, though.
I'm going with an overclocked 4690K. Do you think the 550W XFX PSU will do, or should I buy a 650W just to be sure?


----------



## Gumbi

Quote:


> Originally Posted by *Tabinhu*
> 
> Im going to buy an used R9 290 for 190€, consider this is cheap in Portugal! Its the stock cooler tho.
> I'm going with a 4690k overclocked, do you think the 550w XFX PSU will do? Or should I buy a 650W just to be sure?


A 550W PSU is fine if it's decent quality, and XFX is usually good. Is it bronze rated?

That's an OK price for a 290, but I would also invest in aftermarket cooling of some sort; the stock 290 cooling is absolute trash.


----------



## pozzallo

Quote:


> Originally Posted by *MOSER91*
> 
> Guys I'm going to build a SFF and was wondering if a 600w PSU is sufficient to power an r9 290x w/ an OC'ed i5 4670k? I was looking @ this one specifically...._http://www.silverstonetek.com/product.php?pid=524_.




You could use an online PSU calculator to estimate your total system wattage. I use extreme.outervision.com; just fill out the information on their website.


----------



## Tabinhu

Quote:


> Originally Posted by *Gumbi*
> 
> 550 w psu is fine if it's decent quality. XFX is usually good. It it bronze rated?
> 
> That's an ok price for a 290, but I would also invest in aftermarket cooling of some sort, the stock 290 cooling is absolute trash.


Yes, it's the new XFX TS 550W 80+ Bronze.
I could get the 600W Corsair CX, but the TS is of much higher quality.
Do you think it can handle the overclocked 4690K + R9 290?

I might consider getting an Accelero Extreme cooler for it.


----------



## Gumbi

Quote:


> Originally Posted by *Tabinhu*
> 
> yes, its the new XFX TS 550W 80+ bronze.
> I can get the 600W Corsair CX, bur the TS is of much higher quality.
> Do you think it can handle the overlocked 4690k + R9 290?
> 
> I might consider getting a Accelero Extreme cooler for it


Yep, it should do; check out system consumption in reviews. 290 systems generally draw about 400W in reviews. Just be careful of pumping too much extra voltage into the 290 when overclocking and you'll be fine.


----------



## XenoRad

Hello guys,

I have an Asus 290X DirectCU II (non-OC version) on the latest 15.6 beta drivers. I've tried some overclocking, but it only goes up to 1080 MHz GPU / 1350 MHz VRAM without problems at stock volts (~1.090 V) and +20 power limit.

I've been using the MSI Kombustor burn-in test, and while it doesn't crash, beginning at 1085 MHz on the GPU, discrete dark spots start flashing on the screen after a few minutes.
At 1090 the dark spots are more frequent, and I can't get rid of them even when adding +50 mV to the voltage, at which point the temperature for VRM 1 went up to 128 C. Increasing the power limit at this point, up to the maximum +50, does nothing to alleviate the issue.

My goal was a nice GPU frequency such as 1100 MHz, but it seems I can't even hit 1090 without artifacts and worrisome temperatures.

Is this it for my card - 1080 MHz with 1350 MHz? Or perhaps MSI Kombustor shows artifacts where games wouldn't?


----------



## Ha-Nocri

Quote:


> Originally Posted by *XenoRad*
> 
> Hello guys,
> 
> I have an Asus 290X DirectCU II (non-OC version) on the latest 15.6 beta drivers. I've tried some overclocking, but it only goes up to 1080 MHz GPU / 1350 MHz VRAM without problems at stock volts (~1.090 V) and +20 power limit.
> 
> I've been using the MSI Kombustor burn-in test, and while it doesn't crash, beginning at 1085 MHz on the GPU, discrete dark spots start flashing on the screen after a few minutes.
> At 1090 the dark spots are more frequent, and I can't get rid of them even when adding +50 mV to the voltage, at which point the temperature for VRM 1 went up to 128 C. Increasing the power limit at this point, up to the maximum +50, does nothing to alleviate the issue.
> 
> My goal was a nice GPU frequency such as 1100 MHz, but it seems I can't even hit 1090 without artifacts and worrisome temperatures.
> 
> Is this it for my card - 1080 MHz with 1350 MHz? Or perhaps MSI Kombustor shows artifacts where games wouldn't?


Well, that cooler is not that great for the 290(X); one of the worst, even. What are your GPU/VRM temps while running Valley and Fire Strike, for example?
Kombustor produces the highest temps, much higher than games.


----------



## DDSZ

Hawaii Fan Editor

Download
Not all ROMs may be supported, because I made this as a temporary solution.


----------



## Ha-Nocri

Quote:


> Originally Posted by *DDSZ*
> 
> Hawaii Fan Editor
> 
> Download
> Not all ROMs may be supported, because I made this as a temporary solution.


can it turn off fans?


----------



## CGabry

With my R9 290 I can overclock to 1100/1400 without any bump in core voltage or power limit...


----------



## colorfuel

Quote:


> Originally Posted by *jura11*
> 
> Hi there
> 
> here are my settings for Clover:
> 
> [...]
> 
> Hope this helps
> 
> Thanks,Jura


Hi,

The 290X Tri-X went down 40 eur in a german shop (Mindfactory.de) last night (just for a few hours). So the difference went down to just 20€ between the 290 and the 290X, that gave me reason enough to cancel the 290 Tri-X [new edition] and go for the 290X Tri-X [new edition] instead.

Thanks for all the help though!


----------



## XenoRad

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Well, that cooler is not that great for the 290(X). One of the worst, even. What are your GPU/VRM temps while running Valley or FireStrike, for example?
> Kombustor produces the highest temps, much higher than games.


I think below 78°C GPU / 100°C VRM in any case. With stock volts in Kombustor the temperatures are around 82/110.


----------



## Agent Smith1984

Quote:


> Originally Posted by *XenoRad*
> 
> Hello guys,
> 
> I have an Asus 290X DirectCU II (non-OC version) on the latest 15.6 Beta drivers. I've tried some overclocking but it only goes up to 1080 MHz GPU / 1350 MHz VRAM with stock volts (~1.090 V) and +20 Power Limit without problems.
> 
> I've been using the MSI Kombustor burn-in test, and while it doesn't crash, starting at 1085 MHz on the GPU discrete dark spots start flashing on the screen after a few minutes.
> At 1090 MHz the dark spots are more frequent and I can't get rid of them even when adding +50 mV to the voltage, at which point the VRM 1 temperature went up to 128°C. Increasing the Power Limit up to the maximum +50 does nothing to alleviate the issue.
> 
> My goal was to hit a nice GPU frequency such as 1100 MHz, but it seems I can't even hit 1090 without artifacts and worrisome temperatures.
> 
> Is this it for my card? 1080 MHz with 1350 MHz? Or perhaps MSI Kombustor shows artifacts where games wouldn't?


I would think AMD
That's about an average result for stock voltage. Try the +50% power limit and add some juice for 1100/1400+.

My 390 is running 1180/1750 right now with +100 mV core and +50 mV aux. You probably won't get that high on the memory, but the core should get close.


----------



## cephelix

Quote:


> Originally Posted by *CGabry*
> 
> With my R9 290 I can overclock to 1100/1400 without any bump in core voltage or power limit...


That's good! I need +56 mV for those clocks.


----------



## DDSZ

Quote:


> Originally Posted by *Ha-Nocri*
> 
> can it turn off fans?


I think it depends on the fan controller. Setting 0% speed on my Gigabyte WF3 makes the fans run at 500 RPM; they're quiet enough.


----------



## Ha-Nocri

Quote:


> Originally Posted by *DDSZ*
> 
> I think it depends on the fan controller. Setting 0% speed on my Gigabyte WF3 makes the fans run at 500 RPM; they're quiet enough.


Well, 1450 RPM here at idle (20% fan speed), and I can't get any lower with AB or TriXX... not that quiet.


----------



## XenoRad

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Well, that cooler is not that great for the 290(X). One of the worst, even. What are your GPU/VRM temps while running Valley or FireStrike, for example?
> Kombustor produces the highest temps, much higher than games.


Disregard my last, I was remembering the temps wrong. I just tested this morning with a 20+ minute loop of Valley on full details and 8x AA.
The maximum temperatures for 1080/1350 MHz GPU/VRAM at stock voltage and a maximum fan speed of 65% were:

GPU - 73 °C
VRM 1 - 79 °C
VRM 2 - 69 °C

They are quite a bit lower than in Kombustor, so can I go a bit further on my current voltage without suffering artifacts and whatnot in games?


----------



## Ha-Nocri

Quote:


> Originally Posted by *XenoRad*
> 
> Disregard my last, I was remembering the temps wrong. I just tested this morning with a 20+ minute loop of Valley on full details and 8x AA.
> The maximum temperatures for 1080/1350 MHz GPU/VRAM at stock voltage and a maximum fan speed of 65% were:
> 
> GPU - 73 °C
> VRM 1 - 79 °C
> VRM 2 - 69 °C
> 
> They are quite a bit lower than in Kombustor, so can I go a bit further on my current voltage without suffering artifacts and whatnot in games?


Not bad temps, actually. Yes, you can go higher. But I would expect higher temps in games, since the CPU will also be emitting more heat.


----------



## Ha-Nocri

Don't see ppl posting ASIC quality:


----------



## XenoRad

Still trying to see if I can get the core to 1100 MHz. So far I've done a few Valley runs, including playing around in free view, and the first two 3D tests in MSI Kombustor. All in all over 45 minutes of testing so far and no problems. Still a ways to go.

I've bumped the fan speed a bit and for a maximum of 70% the maximum temperatures are now:

GPU - 74 °C
VRM1 - 80 °C
VRM 2 - 72 °C
CPU (highest temp on any core) - 76 °C

I won't be trying the GPU burn-in test anymore since I know it will artifact and throw the temperatures off like crazy, but if I can run all of my games and other benchmarks with no problems, I think I'll leave it like this and not revert to 1080 MHz.

For what it's worth (I'm not sure how accurate this is, or what it means in the grand scheme of things):


----------



## cephelix

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Don't see ppl posting ASIC quality:


This was discussed early on in the thread and we came to the consensus that ASIC quality didn't really make a difference. What mattered more were the operating temps of the various models of card and, of course, the silicon lottery.


----------



## Nox95

My second 290X instantly jumped to 71 degrees, quickly climbed to 81 degrees, and then the PC froze. After a hard reset, the second card doesn't show up in Device Manager or any monitoring program. Also, the backplate is cold. Did I toast it?


----------



## cephelix

Quote:


> Originally Posted by *Nox95*
> 
> My second 290X instantly jumped to 71 degrees, quickly climbed to 81 degrees, and then the PC froze. After a hard reset, the second card doesn't show up in Device Manager or any monitoring program. Also, the backplate is cold. Did I toast it?


In terms of operating temps, 81°C is well within the acceptable range. What are your VRM temps like, though? Have you tried plugging the cards back in one at a time to see if the PC detects them?


----------



## royalkilla408

Hi all,

I flashed my old version of the 290X Tri-X OC (UEFI) BIOS with another one. I backed up my BIOS, but for some reason I didn't back up the UEFI one and I need it. Could someone with the older version of the 290X Tri-X OC (UEFI) PM me the UEFI BIOS? Thank you!


----------



## Scorpion49

Well, I'm back in the 290X club, I guess. In the process of moving away from my X99 rig, I stopped by Microcenter just to get a better case for my Athlon 860K machine and found they were selling reference refurb Diamond 290's for $239. I asked for one, and it turned out they were advertised wrong; they were actually 290X's. So now I have a shiny, new-looking 290X for $239. Can't really complain when that price at the store will only get you a GTX 960 or R9 285/380.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *cephelix*
> 
> This was discussed early on in the thread and we came to the consensus that ASIC quality didn't really make a difference. What mattered more were the operating temps of the various models of card and, of course, the silicon lottery.


and the operator's equipment, e.g. WATERCOOLING ........

Quote:


> Originally Posted by *Scorpion49*
> 
> Well, I'm back in the 290X club I guess. In the process of moving away from my X99 rig, I stopped by Microcenter just to get a better case for my Athlon 860K machine and found they were selling reference refurb Diamond 290's for $239. I asked for one, and it ended up they advertised them wrong, they were 290X's. *So now I have a shiny new looking 290X for $239,* can't really complain when that price at the store will only get you a GTX 960 or R9 285/380.


You lucky Murican, you









Where's 'The Nard' anyways?


----------



## Scorpion49

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> and the operators equipment eg : WATERCOOLING ........
> You lucky , Murican U
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wheres ' The Nard ' anyways


Ha, The Nard is a stupid slang name for Oxnard, California, where I used to live. It's on Urban Dictionary as being stupid as well. All I can say is thank god you got rid of the demon-face guy as your avatar; that's been bugging me for years (it was originally from an anti-drug video about spice made by the US Navy, which I had to watch when it became a big problem in the military because there were no rules about it).


----------



## jura11

Quote:


> Originally Posted by *colorfuel*
> 
> Hi,
> 
> The 290X Tri-X went down 40 eur in a german shop (Mindfactory.de) last night (just for a few hours). So the difference went down to just 20€ between the 290 and the 290X, that gave me reason enough to cancel the 290 Tri-X [new edition] and go for the 290X Tri-X [new edition] instead.
> 
> Thanks for all the help though!


Hi there

I would go with the Tri-X 290X without question if you have the chance to buy it so cheaply. This card should work pretty well with Yosemite; just have a look at a few threads about the 290X on other hackintosh forums, as some guys have had a few issues with their installations. In my case everything went pretty smoothly; only multi-monitor support is a pain, which I still need to sort out.









The Tri-X is an awesome card. Mine is always OC'd to 1150/1500 and I've never had any issue with it. The only issue arises when you use newer drivers with apps that use older OpenGL, or that translate OpenGL differently. Here is my comparison of two drivers in Poser Pro, which I mostly use. I've not tried this app in OSX; I've only tried C4D, which works great.




Hope this helps

Thanks,Jura


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Scorpion49*
> 
> Ha, The Nard is a stupid slang name for Oxnard California where I used to live. Its on urban dictionary as being stupid as well. All I can say is thank god you got rid of the demon face guy as your avatar, thats been bugging me for years (it was originally from an anti-drug video involving spice made by the US Navy, which I had to watch when it became a big problem in the military because there were no rules about it).


That's a crazy suburb name









Yeah the dealers cut their crap with it and peeps lose their minds real bad , like eating other peeps faces off for example









I was known as a madman hence the avatar


----------



## alancsalt

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Scorpion49*
> 
> Ha, The Nard is a stupid slang name for Oxnard California where I used to live. Its on urban dictionary as being stupid as well. All I can say is thank god you got rid of the demon face guy as your avatar, thats been bugging me for years (it was originally from an anti-drug video involving spice made by the US Navy, which I had to watch when it became a big problem in the military because there were no rules about it).
> 
> 
> 
> That's a crazy suburb name
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah the dealers cut their crap with it and peeps lose their minds real bad , like eating other peeps faces off for example
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I was known as a madman hence the avatar
Click to expand...

Now he's perfectly sane and eating a drop bear....


----------



## kizwan

Quote:


> Originally Posted by *Nox95*
> 
> My second 290X instantly jumped to 71 degrees, quickly climbed to 81 degrees, and then the PC froze. After a hard reset, the second card doesn't show up in Device Manager or any monitoring program. Also, the backplate is cold. Did I toast it?


Shut down, let it cool for a few minutes & turn it on again. The second card should be detected again. You might also want to remove the cooler & check; possibly the TIM is bad.
Quote:


> Originally Posted by *Scorpion49*
> 
> Well, I'm back in the 290X club I guess. In the process of moving away from my X99 rig, I stopped by Microcenter just to get a better case for my Athlon 860K machine and found they were selling reference refurb Diamond 290's for $239. I asked for one, and it ended up they advertised them wrong, they were 290X's. So now I have a shiny new looking 290X for $239, can't really complain when that price at the store will only get you a GTX 960 or R9 285/380.


Question: why moving away from x99? Downsizing?


----------



## Scorpion49

Quote:


> Originally Posted by *kizwan*
> 
> Question: why moving away from x99? Downsizing?


Mostly, and I like to try different things. Something seems to be wrong with this card, though; I'm going to take it apart and check the TIM. It idles fine, but under even the slightest load it instantly (within 3-5 seconds) hits 95°C and the fan blasts up to 100% regardless of what you have it set to do.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Scorpion49*
> 
> Mostly, and I like to try different things. Something seems to be wrong with this card though, I'm going to take it apart and check the TIM. It idles fine but under even the slightest load it instantly (within 3-5 seconds) hits 95*C and the fan blasts up to 100% regardless of what you have it set to do.


Yeah, the TIM is probably like chalk... or they used too much.


----------



## cephelix

CLU FTW!


----------



## mfknjadagr8

Quote:


> Originally Posted by *cephelix*
> 
> CLU FTW!


Unless you get it near a working circuit lol... I've wanted to try CLU, but I've always worried about screwing it up or having a rough time removing it if I decide to go another route.


----------



## Scorpion49

Yeah, I'm installing the card into my X99 rig right now (drivers). I had a really, really REALLY hard time getting the drivers to install on my FM2+ machine so I'm wondering if something got messed up.

If not I'll check the TIM.


----------



## cephelix

Quote:


> Originally Posted by *mfknjadagr8*
> 
> unless you get it near a working circuit lol...I've wanted to try clu but.I've always worried about screwing it up or having rough time removing it if I decide to go another route


Well, the application is the same as applying it to my 4790K die, and I apply a coat of clear nail polish as well. So far, 2-3 months in, no problems.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *alancsalt*
> 
> Now he's perfectly sane and eating a drop bear....


Yes its all an *ILLUSION*



'Drop bears' are a part of my feeding frenzy now 'cause I wasn't getting enough essential oils in my diet


----------



## Dunan

I may take a stab at overclocking my Lightning LE card. Is there an easy way to do it?


----------



## Scorpion49

Well, the TIM was really bad; there was about 3 tubes' worth around the chip but none on it. More disturbingly, the edges of the copper contact plate on the heatsink were raised, so I had to sand it down flat.


----------



## cephelix

Quote:


> Originally Posted by *Dunan*
> 
> I may take a stab at overclocking my lightning LE card, is there an easy way to do it?


You could use MSI Afterburner or Sapphire TriXX. There are guides around here and on the web for OC-ing your card. Just make sure you keep an eye on core temps and VRM temps.


----------



## SavageBrat

Hi folks, I asked this question over in the 290X Lightning forum but got no reply, so I thought I'd try here. Quick question: I just picked up the 290X Lightning card (very hard to come by here in Jordan), but will my little AMD 6300 @ 4.5 cause any major bottlenecks with it? Thanks. The card is at stock clocks for the moment.


----------



## Gumbi

Quote:


> Originally Posted by *SavageBrat*
> 
> Hi folks, I asked this question over in the 290X Lightning forum but got no reply, so I thought I'd try here. Quick question: I just picked up the 290X Lightning card (very hard to come by here in Jordan), but will my little AMD 6300 @ 4.5 cause any major bottlenecks with it? Thanks. The card is at stock clocks for the moment.


Depends on the game. Honestly, AMD chips are so much slower clock-for-clock that any single-threaded games will struggle with that CPU.

I would honestly recommend upgrading to an i5 setup for gaming, especially if you are going to run a Lightning, which can handily manage 1200/1600 overclocks. Even a non-overclockable i5 will smoke your CPU.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Dunan*
> 
> I may take a stab at overclocking my lightning LE card, is there an easy way to do it?


Well, there is no 'easy way' if you haven't clocked Hawaii before.









Better results if watercooled ....

My method is overvolting to 1.4 Vcore+ and running @ 1300 MHz via an unlocked BIOS using Asus GPU Tweak



But I just run em at these base clocks for 24/7

Maybe the first page in the Lightning club should have a guide; if not, just repost here if you're having dramas








Quote:


> Originally Posted by *SavageBrat*
> 
> Hi Folks, I ask this question over in the 290x lightning forum but got no reply so I thought I'd try here,quick question as I just picked up the 290x *lightning card (very hard to come by here in Jordan)* but will my little AMD 6300 @4.5 cause any major bottlenecks with it? Thanks..card is at stock clocks for the moment..
> Quote:
> 
> 
> 
> Originally Posted by *Gumbi*
> 
> Depends on the game. Honestly AMD chips are so much slower clock for clock, any single threaded games will struggle with that CPU.
> 
> I would honestly recommend upgrading to an i5 setup for gaming, especially if you are gonna run a Lightning, which can handily manage 1200/1600 overclocks. *Even a non overclockable i5 will smoke your CPU.*
Click to expand...

Well done on your Lightning pickup in your part of the world there









Have to agree with you on that .


----------



## SavageBrat

Well, at this time I'm holding off on any major upgrades until fall, to see what AMD and Intel are going to bring out, since I'll need to do a complete swap either way (CPU, MB and RAM). But for now, if my current CPU isn't enough, I don't mind dropping in an 8-core to hold me over till then.


----------



## Gumbi

Quote:


> Originally Posted by *SavageBrat*
> 
> Well at this time I'm holding off on any major upgrades at this time till fall comes to see what AMD and Intel are going to bring out as I will need to do a complete swap either way (cpu.mb and ram) but for now if my current cpu isn't enough, I don't mind dropping in an 8 core to hold me over till then.


There won't be any need for a new PSU if you switch to Intel. They use much less power than AMD chips







A cheapo mobo and a non-K i5 would not be too expensive and would be a huge upgrade for you


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *SavageBrat*
> 
> Well at this time I'm holding off on any major upgrades at this time till fall comes to see what AMD and Intel are going to bring out as I will need to do a complete swap either way (cpu.mb and ram) but for now if my current cpu isn't enough, I don't mind dropping in an 8 core to hold me over till then.


Good call .

Grab 4x4Gb / 16 gigs of ram and a SSD or two


----------



## 4kallday

I run a build with two Asus Radeon R9 290Xs in CrossFire. I decided to paint them recently; what do you think? I'll put a comparison photo here too so you can see.


----------



## bkvamme

Nice job, matches your build perfectly!


----------



## cephelix

@4kallday nice. how much of a pain was it to paint it?


----------



## 4kallday

@cephelix Not too bad actually; just dismantle, paint, reassemble. Just remember to find out the safe operating temperatures for your paint. Also, make sure you have some spare thermal paste, because when you open it up you'll want to replace the original paste.


----------



## cephelix

Quote:


> Originally Posted by *4kallday*
> 
> @cephelix Not too bad actually, just dismantle, paint, reassemble. Just remember to find out the safe operating temperatures for your paint. Also when you do it, make sure you have some spare thermal paste because when you open it up you'll want to replace the original paste.


Hmm, never considered operating temps for the paint. I always thought case temps wouldn't get high enough to mess with it. As for the thermal paste, I already swapped mine out for CLU.


----------



## 4kallday

Quote:


> Originally Posted by *cephelix*
> 
> Hmm, never considered operating temps for the paint. Always thought that case temps wouldn't reach high enough to mess with the paint. As for the thermal paste, already swapped mine out for CLU.


With some aftermarket cooler designs the backplate is used to help dissipate heat; if you're planning to paint one of those cards, that's where it's important to at least check your paint's temperature rating. To get around this, a lot of people use automotive engine paint, or even opt to powder coat parts. I didn't go for a powder coat, though, because the backplate has a layer of plastic on one side to stop it from shorting the PCB and I didn't want to mess with that.


----------



## Sempre

I took this before I installed the G10 brackets:



Also, I may have found a workaround for my crossfire gpu usage issues. Enabling V-Sync seems to have decreased the stuttering significantly.


----------



## Agent Smith1984

I am going to invent a GPU wonder bra.....

For large "saggy" cards that just need some good lift to be beautiful.


----------



## cephelix

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I am going to invent a GPU wonder bra.....
> 
> For large "saggy" cards that just need some good lift to be beautiful.


Lol. Aren't there already solutions to that? Not elegant ones, for sure: chopsticks, support brackets like in some Lian Li cases. I've used thin fishing line at one point as well, but after forcing the back panel of my case in so the card mounts securely, I don't have that problem anymore.


----------



## Agent Smith1984

Quote:


> Originally Posted by *cephelix*
> 
> Lol. Aren't there already solutions to that? not elegant for sure, chopsticks, support brackets like in some lian li cases. I've used thin fishing line at one point as well but after using force to kind of push the back panel of my case in so that the card can be mounted securely, i don't have that problem anymore


Yes there are some remedies for the problem.

I actually use the knockouts from the PCI slots in the back of the case. I just bend them to the right length and use them as a stand. They are black, so you can't really see them.

Fishing line works great too. You can also use your power cable if you feed the card from the top, using the cable tension for support.

The Devil 13 comes with a cool jack to put under the card. It needs it, too... it weighs around 5 lbs!


----------



## cephelix

Yeah, currently my GPU cables are run out the top. Not as pretty, but I'm too lazy to make new cables since I messed up some of the lengths leading to the Y-joints. I've recently seen photos of the Devil 13, and I must say that thing is massive. 4x 8-pin connectors! Whut!
Because of those types of graphics cards, I keep toying with the idea of making a horizontal ATX layout that can accommodate my NH-D15 as well.


----------



## OneB1t

R9 290X - 2.2W


----------



## Ha-Nocri

Quote:


> Originally Posted by *OneB1t*
> 
> R9 290X - 2.2W


What does it mean and how did you do it?


----------



## HOMECINEMA-PC

Wow 2.2 watts at idle ..... means nuthin really to me


----------



## Performer81

6W when core and mem are at stock, so not much more. But it's not the real consumption GPU-Z is showing; otherwise my 290 PCS+ would only need about 150W avg. under load.


----------



## fyzzz

I have finally got a BIOS that gives more VDDC. I downloaded a Tri-X BIOS that works on my card, opened HawaiiBiosReader, and copied the voltage numbers from a 390X BIOS. Now +100 mV gives about 1.2+ V, +150 mV gives about 1.33 V, and +200 mV I haven't tested yet (too scared, and I need better cooling).


----------



## gryphonza

Hello everyone

Got myself an XFX R9 290 DD last week and it is amazing (upgraded from a GTX 460). I will post a confirmation later.
I have a question, though, before I start overclocking: I followed the guides here to uninstall all traces of Nvidia before installing the 290 and its drivers.
Would it still be advisable to reinstall Windows from scratch? I'm worried there might still be issues because my previous card was Nvidia.


----------



## Gumbi

Quote:


> Originally Posted by *gryphonza*
> 
> Hello everyone
> 
> Got myself an XFX R9 290 DD last week and it is amazing (upgraded from a GTX 460). I will post a confirmation later.
> I have a question though before I start overclocking, I followed the guides here to uninstall all traces of nvidia before installing the 290 and its drivers.
> Would it still be advisable to reinstall windows from scratch? I'm worried that there might still be issues because my previous card was nvidia.


Generally there shouldn't be issues as long as you uninstall the drivers correctly.

Watch out for your VRM temps; some XFX 290s have trouble with high VRM temps when you pump extra voltage into them. Great value for money, though, those cards.


----------



## gryphonza

Quote:


> Originally Posted by *Gumbi*
> 
> Generally there shouldn't be issues as long as you uninstall the drivers correctly.
> 
> Watch out for your VRM temps; some XFX 290s have trouble with high VRM temps when you pump extra voltage into them. Great value for money, though, those cards.


Thanks, yeah, I heard about the VRM temps. I'll see how they are and maybe order a backplate or some heatsinks for them if they get too hot.

I also had one more question: my PSU (M12II EVO 620W) had a leaflet in the box that said I should use both PCI-E power cables for GPUs over 275W, so I am using both cables for the card. Would I be able to use only one cable, or should I be using both (each cable has one 8-pin and one 6-pin on it)?


----------



## Gumbi

Quote:


> Originally Posted by *gryphonza*
> 
> Thanks, yeah I heard about the VRM temps, will see how they are and maybe order a backplate or some heatsinks for them if they get too hot.
> 
> I also had one more question: My PSU (M12II EVO 620W) had a leaflet in the box that said I should use both PCI-E power cables for GPUs over 275W so I am using both my power cables for the card. Would I be able to use only one cable or should I be using both (each cable has one 8 pin and one 6 pin on it).


Hook both power cables into the GPU.
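For what it's worth, here is the back-of-the-envelope spec math behind the leaflet's advice (these are the standard PCI-E connector limits, not measured draw):

```python
# Back-of-the-envelope PCI-E power budget (spec limits, not measured draw).
SLOT_W = 75        # PCI-E x16 slot
SIX_PIN_W = 75     # 6-pin PEG connector
EIGHT_PIN_W = 150  # 8-pin PEG connector

def spec_budget(connectors):
    """Total in-spec power available to a card with the given connectors."""
    return SLOT_W + sum(connectors)

# An R9 290/290X uses one 6-pin + one 8-pin:
print(spec_budget([SIX_PIN_W, EIGHT_PIN_W]))  # 300 (W)
```

With a single daisy-chained cable, the whole ~225 W connector load rides on one wire bundle back to the PSU, which is presumably why the leaflet says to split cards over 275 W across two separate cables.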


----------



## Agent Smith1984

Can someone post a FRAPS log of Crysis 3 at these settings (2 minutes is fine):



You can pick campaign or multiplayer, just let me know the map.

Thanks


----------



## Gumbi

2015-07-07 18:43:20 - crysis3
Frames: 8999 - Time: 120000ms - Avg: 74.992 - Min: 60 - Max: 89

290 Vapor-X at 1175/1600 MHz. Hydro Dam multiplayer map, team deathmatch, 15/16 players (basically full). Walked around the main sections shooting and throwing grenades, i.e. being semi-active.
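(Quick sanity check on the log line, nothing more: FRAPS' "Avg" field is just frames divided by elapsed seconds.)

```python
# Recompute the FRAPS average from the raw fields of the log line above.
frames = 8999
time_ms = 120000

avg_fps = frames / (time_ms / 1000)
print(round(avg_fps, 3))  # 74.992, matching the reported "Avg:"
```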


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> 2015-07-07 18:43:20 - crysis3
> Frames: 8999 - Time: 120000ms - Avg: 74.992 - Min: 60 - Max: 89
> 
> 290 Vapor X at 1175/1600mhz. Hydro Dam multiplayer map, team deathmatch, 15/16 players (basically full). Walked up and won main sections shooting, throwing grenades, ie being semi active.


What CPU and clock speed, and also what driver?

Thanks for posting so quickly!


----------



## Gumbi

4.9ghz 4790k, and 15.5 drivers.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> 4.9ghz 4790k, and 15.5 drivers.


Great, thanks!

Do most people use something different for their AA?

I see some people use FXAA, some use SMAA, and some MSAA. I'm not sure which is best visually, or which of the three is hardest to run.

Anyone?


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Great, thanks!
> 
> Do most people use something different for their AA?
> 
> I see some people use FXAA, some use SMAA, and some MSAA, and not sure which is the best visually, and which of the three is most difficult to run?
> 
> Anyone?


AFAIK, FXAA is a more modern, post-process approximation of AA that is a lot lighter on the GPU than MSAA (but suffers slightly in the quality department).

MSAA is the heaviest, AFAIK. A quick google should help with this, however.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> AFAIK, FXAA is a more modern, approximate version of MSAA, which is a lot lighter on the GPU (but suffers slightly in the quality department).
> 
> MSAA is the heaviest AFAIK. A quick google should help with this however.


Yeah, I read a little on it, thanks!

Mainly wondering what others are using?

I'm old school, so I've been rocking MSAA at 4x for the longest time, and then I finally noticed that all the review sites usually use FXAA when testing this game.


----------



## Ha-Nocri

FXAA has a very low performance impact, but it's the worst option. It blurs the image. I never turn it on.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ha-Nocri*
> 
> FXAA has very low impact on performance, but is the worst option. It blurs image. I never turn it on.


I noticed the same thing..... that's why I always run MSAA. It's just that all these reviews were showing FXAA but also much lower FPS results than I'm getting, so I was a little confused.


----------



## ghabhaducha

That's crazy; this thread now has more posts than the [Official] AMD Radeon HD 7950/7970/7990 Owners Thread!


----------



## sinnedone

So I've seen bits here and there about how AMD is supposedly making their drivers favor the newer R9 300 series by gimping the R9 200 series.

Is this actually true?

Anyone dealt with this?


----------



## Mega Man

i have never seen amd do this..... nvoidia however does .....


----------



## kizwan

Quote:


> Originally Posted by *sinnedone*
> 
> So I've seen bits here and there about how AMD is supposedly making their drivers favor the newer R9 300 series by gimping the R9 200 series.
> 
> Is this actually true?
> 
> Anyone dealt with this?


Yeah, I read the same thing. Whether there's any truth in it, I don't know.

I'll just leave this here. Standard T&Cs apply (take with a grain of salt).


----------



## colorfuel

If this is true, then it's a really bad move by AMD.

I hope there will at least be a way to use "hacked" drivers. But having to rely on "hacked" drivers for the rest of my new 290X's lifetime?


----------



## Gumbi

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What CPU and clock speed, and also what driver?
> 
> Thanks for posting so quickly!


Do you get similar performance with your 390? I guess without a set bench it's hard to compare properly, even if they're close.


----------



## kizwan

Quote:


> Originally Posted by *colorfuel*
> 
> If this is true, then it is a really bad move by AMD.
> 
> I hope there will be a way to use "hacked" drivers at least. But having to rely on "hacked" drivers for the rest of my new 290Xs time?


I can only hope this is temporary. Looks like this move is to help sell the 300 series cards.

Even with "hacked" (inf-mod) drivers, my 290s lag behind the 390s at around 93 to 94%. The 300 series drivers seem to detect & load "special" optimizations only for 300 series cards.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Gumbi*
> 
> Do you get similar performance with your 390? I guess without a set bench it's hard to compare properly if they're close.


I went back and ran the same map you tested.

At my 1150/1700 daily clocks, I am getting 62 min, 77.8 avg, and 109 max

Not a drastic difference I don't suppose. Not sure how much my CPU is affecting my numbers (if at all), but the game is running beautifully.


----------



## gatygun

Quote:


> Originally Posted by *kizwan*
> 
> I can only hope this is temporary. Looks like this move is meant to help sell the 300 series cards.
> 
> Even with "hacked" (inf mod) drivers, my 290s are lagging behind the 390s at around 93 to 94% of their performance. The 300 series drivers seem to detect and load "special" optimizations only for 300 series cards.


Doesn't the 390's come with better clocks?


----------



## kizwan

Quote:


> Originally Posted by *gatygun*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I can only hope this is temporary. Looks like this move is meant to help sell the 300 series cards.
> 
> Even with "hacked" (inf mod) drivers, my 290s are lagging behind the 390s at around 93 to 94% of their performance. The 300 series drivers seem to detect and load "special" optimizations only for 300 series cards.
> 
> 
> 
> Doesn't the 390's come with better clocks?
Click to expand...

Higher clocks, yes. I meant that with both the 290 & 390 overclocked to the same clocks, my 290s lag behind the 390s at around 93 to 94%, both using the 300 series drivers.


----------



## mfknjadagr8

Quote:


> Originally Posted by *kizwan*
> 
> Higher clocks, yes. I meant that with both the 290 & 390 overclocked to the same clocks, my 290s lag behind the 390s at around 93 to 94%, both using the 300 series drivers.


I wouldn't think it should be that much of a difference... most benches aren't showing more than 20%, and even compounded across two cards that's only 40%, and we never get 100% scaling anyway.
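A quick sketch of that back-of-envelope math, with the single-card gap and the scaling efficiency both as illustrative assumptions rather than measured values: if both setups scale equally well in CrossFire, the dual-card gap stays the same as the single-card gap rather than compounding.

```python
# Toy model: does a single-card gap "compound" across two cards? (Numbers are made up.)
def dual_card_fps(single_fps, scaling=0.8):
    """FPS for two cards, given a per-setup scaling efficiency below 100%."""
    return single_fps * (1 + scaling)

fps_290 = 60.0            # hypothetical single-290 result
fps_390 = fps_290 * 1.06  # assume a ~6% single-card advantage for the 390

gap = dual_card_fps(fps_390) / dual_card_fps(fps_290) - 1
print(f"dual-card gap: {gap:.1%}")  # still ~6%: equal scaling cancels out
```

The gap only grows past the single-card figure if the two setups scale differently, which is why per-game CrossFire profiles matter more than raw card speed here.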


----------



## Agent Smith1984

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I wouldn't think it should be that much of a difference... most benches aren't showing more than 20%, and even compounded across two cards that's only 40%, and we never get 100% scaling anyway.


I think he is saying the 300 series is about 6-7% faster despite using the same clock speeds and the hacked driver.
This has been confirmed by a few other results I've seen around the net.

The hacked driver is showing improvements for 290 users, but I believe it's breaking some things too.

My question would be: if it's improving the 290 but still not performing exactly the same as the 390 at the same speeds, then what did they do to the 300 series? All of my research shows that everything is practically the same minus the increased VRAM, and the extra VRAM has made no performance difference when comparing 290X 4GB cards to 290X 8GB cards at normal resolutions.


----------



## Forceman

Quote:


> Originally Posted by *Agent Smith1984*
> 
> My question would be: if it's improving the 290 but still not performing exactly the same as the 390 at the same speeds, then what did they do to the 300 series? All of my research shows that everything is practically the same minus the increased VRAM, and the extra VRAM has made no performance difference when comparing 290X 4GB cards to 290X 8GB cards at normal resolutions.


Tighter memory timings seem the most likely.

That being said:





http://www.hardocp.com/article/2015/06/18/msi_r9_390x_gaming_8g_video_card_review/9


----------



## Agent Smith1984

Quote:


> Originally Posted by *Forceman*
> 
> Tighter memory timings seem the most likely.
> 
> That being said:
> 
> 
> 
> 
> 
> http://www.hardocp.com/article/2015/06/18/msi_r9_390x_gaming_8g_video_card_review/9


Agreed, and it has been confirmed that the 390 series is using a completely new VRAM IC rated at 1500MHz.
Many said the ICs were similar and only rated higher because their timings were loosened, but now it seems both timings and clock speed have been improved.
A better IC altogether.

I do wonder if the more recent 8GB 290's have the newer hynix IC's rated at 1500MHz.

Guess it'd be good to look at the overclocking reviews on that card to see for sure.


----------



## Scorpion49

It begins... As a side note, I just got my MG279Q Freesync monitor. Kinda bummed it has a ton of dead pixels, but I will say this: my first impression is that Freesync is definitely not as good as Gsync, having used them back to back. With Gsync I could get all the way down to 42-45fps in DA:I and still not know what the frame rate was; using Freesync and the 290X, I can tell immediately when it drops below 60fps, and 45-50 is as bad as if you didn't have the technology at all. Pretty disappointed in that tbh. This is just an immediate first impression, though, so I'm going to give it a few more days before making an absolute judgment on it.


----------



## kizwan

Quote:


> Originally Posted by *Agent Smith1984*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Forceman*
> 
> Tighter memory timings seem the most likely.
> 
> That being said:
> 
> 
> 
> 
> 
> http://www.hardocp.com/article/2015/06/18/msi_r9_390x_gaming_8g_video_card_review/9
> 
> 
> 
> 
> 
> 
> 
> Agreed, and it has been confirmed that the 390 series is using a completely new VRAM IC rated at 1500MHz.
> Many said the ICs were similar and only rated higher because their timings were loosened, but now it seems both timings and clock speed have been improved.
> A better IC altogether.
> 
> I do wonder if the more recent 8GB 290's have the newer hynix IC's rated at 1500MHz.
> 
> Guess it'd be good to look at the overclocking reviews on that card to see for sure.
Click to expand...

The 290's Elpida/Hynix VRAM is also rated at 1500MHz, so that's not the reason the 390 is faster than the 290. Like Forceman said, the increased performance on the 390 is possibly because of tighter memory timings. Seeing that you can overclock higher than 1700MHz, loosened memory timings (instead of tightened) seem like a possible explanation too, but I don't think it's likely, because my 290s wouldn't be lagging behind the 390s if that were the case. It looks like the 390s come with better memory ICs, in terms of overclockability and likely with tighter timings. However, I still hope it's the drivers playing tricks here.
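For intuition on why timings and clock interact the way the timing discussion above suggests, a timing's absolute latency is just its cycle count divided by the memory clock. The cycle counts below are made-up illustrative numbers, not actual Hawaii memory straps:

```python
def timing_latency_ns(cycles, mem_clock_mhz):
    """Absolute latency of a memory timing: cycles / clock (MHz) -> nanoseconds."""
    return cycles / mem_clock_mhz * 1000.0

# Same cycle count at a higher clock means lower real latency...
print(timing_latency_ns(15, 1250))  # 12.0 ns
print(timing_latency_ns(15, 1500))  # 10.0 ns
# ...and tightening cycles at the same clock gets you there too:
print(timing_latency_ns(13, 1250))  # 10.4 ns
```

This is why "tighter timings at the same clock" and "same timings at a higher clock" can produce similar real-world gains, and why loosened timings can buy clock headroom at little net benefit.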


----------



## mfknjadagr8

Quote:


> Originally Posted by *kizwan*
> 
> The 290's Elpida/Hynix VRAM is also rated at 1500MHz, so that's not the reason the 390 is faster than the 290. Like Forceman said, the increased performance on the 390 is possibly because of tighter memory timings. Seeing that you can overclock higher than 1700MHz, loosened memory timings (instead of tightened) seem like a possible explanation too, but I don't think it's likely, because my 290s wouldn't be lagging behind the 390s if that were the case. It looks like the 390s come with better memory ICs, in terms of overclockability and likely with tighter timings.


Tighter is better... nobody likes loose timings unless for some reason you have to settle. I would imagine that driver would make memory overclocking a bit more challenging, since it's expecting modules that perform with tighter timings.


----------



## Agent Smith1984

Quote:


> Originally Posted by *kizwan*
> 
> The 290's Elpida/Hynix VRAM is also rated at 1500MHz, so that's not the reason the 390 is faster than the 290. Like Forceman said, the increased performance on the 390 is possibly because of tighter memory timings. Seeing that you can overclock higher than 1700MHz, loosened memory timings (instead of tightened) seem like a possible explanation too, but I don't think it's likely, because my 290s wouldn't be lagging behind the 390s if that were the case. It looks like the 390s come with better memory ICs, in terms of overclockability and likely with tighter timings. However, I still hope it's the drivers playing tricks here.


Right.... better IC altogether.


----------



## Agent Smith1984

I wonder if 390 exhibits better frametimes even in the areas where the framerate is the same between the two?

Or how about vice versa: similar frametimes in areas where the 390 has higher FPS???


----------



## OneB1t

if you flash your 290/290X with a BIOS from a 390/390X, it will perform the same at the same clocks







(search for XFX 390X bios on guru3d)


----------



## Ha-Nocri

Quote:


> Originally Posted by *OneB1t*
> 
> if you flash your 290/290X with a BIOS from a 390/390X, it will perform the same at the same clocks
> 
> 
> 
> 
> 
> 
> 
> (search for XFX 390X bios on guru3d)


What about the memory? I'm pretty sure it wouldn't be able to hold the same voltage/speed as the 390(X)?!


----------



## intelfan

Do the 290/290X cards have screen tearing when using Eyefinity? (DVI DVI HDMI)

Thanks for any help.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Ha-Nocri*
> 
> What about the memory? I'm pretty sure it wouldn't be able to hold the same voltage/speed as the 390(X)?!


Agreed, and also, I've heard of people loosening the timings to get higher mem clocks on the 290.... IF the 390 is using tighter timings, wouldn't that put you in even worse shape trying to clock higher?

Again, that's IF the timings are tighter....


----------



## gatygun

Probably something to do with the memory indeed; maybe it can be addressed faster or something. Or they split the VRAM over 2 pools that both get addressed faster that way.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> Agreed, and also, I've heard of people loosening the timings to get higher mem clocks on the 290.... IF the 390 is using tighter timings, wouldn't that put you in even worse shape trying to clock higher?
> 
> Again, that's IF the timings are tighter....


Could you not just underclock the memory then, so you can automatically use the 390 drivers?

It would be interesting to test, just to see how the card performs tho.


----------



## Agent Smith1984

Quote:


> Originally Posted by *gatygun*
> 
> Probably something to do with the memory indeed; maybe it can be addressed faster or something. Or they split the VRAM over 2 pools that both get addressed faster that way.
> Could you not just underclock the memory then, so you can automatically use the 390 drivers?
> 
> It would be interesting to test, just to see how the card performs tho.


People have tested the two cards on the same drivers at the same clock speed, and the results are very close.
The difference is .5-1 FPS. That .5-1 FPS could be margin of error, or it could be tighter timings, because that's about how big a difference either would make.

If the 390 uses tighter timings and that's causing the difference, then I would suspect that placing the 390 BIOS on the corresponding 290 card could possibly work at stock clocks, but I would assume the memory would clock even worse. Now, that's not to say the BIOS doesn't change the timing sets after a certain clock speed to increase the headroom.

Some say the 290's had 1500MHz ICs on them.... but I have seen COUNTLESS folks top out at 1350-1450 and never have any luck getting to 1500MHz. That was more the case with Elpida based cards than Hynix cards, but even some Hynix owners had mem clock issues past 1400MHz.

That makes me think these cards have better ICs altogether: not just larger, but faster.

Now, it's not to say that they didn't increase the IC voltage to improve the timings/clocks, but I doubt that, especially considering they reduced TDP on these.

I will be curious to see how many people successfully flash their 290's to a 3 series to get the "better" driver support.

I know a lot of people had luck doing this with their 7970's (to 280X), but then again, a lot more weren't able to than were....


----------



## MrKZ

Anybody tried the 15.7 driver? It improved my 3dmark score by ~600

-Before, 15.6 beta : 11545 gpu score on firestrike
http://www.3dmark.com/3dm/7669193

-Now with 15.7 : 12174 gpu score
http://www.3dmark.com/3dm/7673665

both tests at 1060/1250 +50mV

Didn't test any games yet.


----------



## Agent Smith1984

Quote:


> Originally Posted by *MrKZ*
> 
> Anybody tried the 15.7 driver? It improved my 3dmark score by ~600
> 
> -Before, 15.6 beta : 11545 gpu score on firestrike
> http://www.3dmark.com/3dm/7669193
> 
> -Now with 15.7 : 12174 gpu score
> http://www.3dmark.com/3dm/7673665
> 
> both tests at 1060/1250 +50mV
> 
> Didn't test any games yet.


They probably just gave you guys the performance boost found in the 15.15 300 series driver with 15.7!

Congrats! Now 290 owners can quit cussing AMD for pulling a sneaky sneak on their drivers









Anyone who likes to run benchies better get this driver and post a new FireStrike!

It should help out with Crysis 3 and Far Cry 4 a lot too...


----------



## ghabhaducha

Alright, so when I had 15.5, my second 290X would fall asleep when not being used; however, when I installed 15.6, upon startup my second 290X would register a temperature in MSI Afterburner and not fall asleep. I would have to disable CrossFire and then re-enable it to prevent a constant wake on the second card. This was fixed in the 15.15 drivers, but it has come back with the 15.7 drivers. I have ULPS enabled on this system, and I don't really have any other problems besides this. Would you guys know why this may be happening?


----------



## Sempre

Quote:


> Originally Posted by *Scorpion49*
> 
> It begins... As a side note, I just got my MG279Q Freesync monitor. Kinda bummed it has a ton of dead pixels, but I will say this: my first impression is that Freesync is definitely not as good as Gsync, having used them back to back. With Gsync I could get all the way down to 42-45fps in DA:I and still not know what the frame rate was; using Freesync and the 290X, I can tell immediately when it drops below 60fps, and 45-50 is as bad as if you didn't have the technology at all. Pretty disappointed in that tbh. This is just an immediate first impression, though, so I'm going to give it a few more days before making an absolute judgment on it.


Well, that's disappointing. I want to buy a Freesync monitor this year. I've actually heard that Freesync is as smooth as Gsync when you're inside the monitor's refresh window.

When the frame rate goes below the panel's minimum refresh rate (e.g. 30 Hz), that's when the Gsync module kicks in, since it'll double the frames going to the monitor. So if the GPU is outputting 24fps, the module will duplicate them and send 48fps. But Freesync doesn't do this. Maybe this is what you encountered?


----------



## Scorpion49

Quote:


> Originally Posted by *Sempre*
> 
> Well, that's disappointing. I want to buy a Freesync monitor this year. I've actually heard that Freesync is as smooth as Gsync when you're inside the monitor's refresh window.
> 
> When the frame rate goes below the panel's minimum refresh rate (e.g. 30 Hz), that's when the Gsync module kicks in, since it'll double the frames going to the monitor. So if the GPU is outputting 24fps, the module will duplicate them and send 48fps. But Freesync doesn't do this. Maybe this is what you encountered?


No, it wasn't that. At slightly higher frame rates Freesync feels exactly the same as Gsync; it was just the margin area between the lower limit and 50-55ish fps that felt bad. More testing to be done, for science.


----------



## p5ych00n5

Seeing as one of my 7970's has died, I thought it would be a good time to upgrade to at the very least a 290X. To the Bays of E





Let's try 390X


----------



## rdr09

Normally gets 13K in graphics with a 290 @ 1200 and Win7 . . .

http://www.3dmark.com/3dm/7677297?

gained about 200 points. just installed it over the last beta and have not played a game.


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> Normally gets 13K in graphics with a 290 @ 1200 and Win7 . . .
> 
> http://www.3dmark.com/3dm/7677297?
> 
> gained about 200 points. just installed it over the last beta and have not played a game.


Got it close to 14K Graphics! on the new Driver.








Graphics Test 2 took a huge boost huh?
Both runs 1212 Core / 1717 Mem

http://www.3dmark.com/compare/fs/5342029/fs/5313728


----------



## Arizonian

Quote:


> Originally Posted by *4kallday*
> 
> I run a build with two Asus Radeon R9 290X's in crossfire, I decided to paint them recently, what do you think? I'l put a comparison photo here too so you can see.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Nice mod.


----------



## Ized

Quote:


> Originally Posted by *kizwan*
> 
> Yeah, I read the same thing. Whether there's any truth in it I don't know.
> 
> I'll just leave this here. Standard TnC apply (take with a grain of salt).


That's the guy that was doing custom BIOSes for the 290s right after launch (memory tweaks and such for the coin miner guys)!

He must work for AMD or something?


----------



## Ized

Quote:


> Originally Posted by *OneB1t*
> 
> R9 290X - 2.2W


That's 5 times lower than my idle watts figure! Amazing underclock (I assume)!

Anything special going on or needed to replicate?


----------



## bkvamme

Just got my Fujipoly Extreme pads; will probably apply them later today. Any tips for putting them on, or should I just follow EK's instructions?


----------



## MrWhiteRX7

Quote:


> Originally Posted by *MrKZ*
> 
> Anybody tried the 15.7 driver? It improved my 3dmark score by ~600
> 
> -Before, 15.6 beta : 11545 gpu score on firestrike
> http://www.3dmark.com/3dm/7669193
> 
> -Now with 15.7 : 12174 gpu score
> http://www.3dmark.com/3dm/7673665
> 
> both tests at 1060/1250 +50mV
> 
> Didnt tested any game yet.


Witcher 3 Xfire is much better. Arkham Knight still works great. Haven't tested GTA V yet, but I'm hearing the latest patch isn't doing so well? haha


----------



## edo101

With 15.7 I get an 11905 Fire Strike GPU score with the core at 1060MHz and mem at 1250MHz, all at +30mV

R9 290 Windforce OC

Not sure what other people are getting


----------



## YellowBlackGod

What matters most is that all the features advertised for the R9 300 series are available for the 200 series as well. Now I can say there is no reason to upgrade from the R9 200 series, since we are talking about the same thing.


----------



## fyzzz

http://www.3dmark.com/3dm/7681019? 1150/1600 with 15.7 driver


----------



## colorfuel

Quote:


> Originally Posted by *MrKZ*
> 
> Anybody tried the 15.7 driver? It improved my 3dmark score by ~600
> 
> -Before, 15.6 beta : 11545 gpu score on firestrike
> http://www.3dmark.com/3dm/7669193
> 
> -Now with 15.7 : 12174 gpu score
> http://www.3dmark.com/3dm/7673665
> 
> both tests at 1060/1250 +50mV
> 
> Didnt tested any game yet.


I get nearly the exact same GPU score with those settings on mine (290X, [email protected]).

http://www.3dmark.com/3dm/7681068?

Edit:

With 1150/1500 +87mv

http://www.3dmark.com/3dm/7681134?


----------



## Klocek001

help plz
http://www.overclock.net/t/1564129/how-to-remove-this-nasty-stain-form-my-gpus-pcb


----------



## MrKZ

Quote:


> Originally Posted by *colorfuel*
> 
> I get nearly the exact same GPU score with those settings on mine (290X, [email protected]).
> 
> http://www.3dmark.com/3dm/7681068?
> 
> Edit:
> 
> With 1150/1500 +87mv
> 
> http://www.3dmark.com/3dm/7681134?


Wow, that 1150 score is nice.

Unfortunately I can't test at higher clocks/voltages now because the VRM is already reaching 90-100C at 1060 +50mV. Yeah... Asus cooling + summer temps


----------



## King4x4

Go water or go home!


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> Got it close to 14K Graphics! on the new Driver.
> 
> 
> 
> 
> 
> 
> 
> 
> Graphics Test 2 took a huge boost huh?
> Both runs 1212 Core / 1717 Mem
> 
> http://www.3dmark.com/compare/fs/5342029/fs/5313728


nice. my system always scores lower than others with similar oc. could be my aging system. here is my crossfire result . . .

http://www.3dmark.com/3dm/7681711?

gained about 600 points.

@fyzzz, crank it up. can't wait for your higher clocks. i'm sure you'll crack 14500 gpu score.


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> nice. my system always scores lower than others with similar oc. could be my aging system. here is my crossfire result . . .
> 
> http://www.3dmark.com/3dm/7681711?
> 
> gained about 600 points.
> 
> @fyzzz, crank it up. can't wait for your higher clocks. i'm sure you'll crack 14500 gpu score.


I will soon.


----------



## Gumbi

Quote:


> Originally Posted by *King4x4*
> 
> Go water or go home!


Vapor X, MAX of 55c on VRMs at 75mv








Beefy as hell vrm/mem cooling.

Pads are directly in contact with heatsink for very efficient heat transfer.


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> nice. my system always scores lower than others with similar oc. could be my aging system. here is my crossfire result . . .
> 
> http://www.3dmark.com/3dm/7681711?
> 
> gained about 600 points.
> 
> @fyzzz, crank it up. can't wait for your higher clocks. i'm sure you'll crack 14500 gpu score.


Might run it on 17C water later.







I hope it will still improve. 30+C vs sub 20C you reckon?

For a reference card, I am amazed by the memory OC! Core is a bit meh though.


----------



## Scorpion49

Alright, so after playing with my new Freesync monitor a little more, I think my original feeling was correct (the first impression came after playing almost 300 hours of the same game on Gsync). Gsync seems to me to be the better performing of the two *near (NOT under) the lower limit*, although both are very good. The Freesync setup starts to feel stuttery within around 10fps of the lower limit, while Gsync managed to get within 1-2fps before it was really obvious. On the other hand, Gsync has a problem with runaway frame rates in menus and such that cause some weird flickering of the monitor, which can be very annoying as well, since it does not have the option to vsync past the max VRR.

All in all, while I think Gsync was slightly superior, Freesync is excellent as well and only required me to tweak my settings a little bit to have a marginally higher frame rate and I can no longer tell the difference between the two. Definitely worth the trade-off as I was able to buy a 1440P 144hz IPS panel for the same price as my 1080p TN Gsync screen. The drawbacks are not really worth noting when you figure in the $150-200 price gap between the same panel with each type of tech.

For anyone on the fence about trying it, I would go for it. This stuff really transformed my gaming experience, I never managed to finish Dragon Age because I was having a lot of problems with stuttering/tearing and kept on tweaking my PC settings trying to get the damn game to run smoothly. Now I just hop in and play with way more immersion than I could have had before when I was focusing on tear lines and how to get an extra 4fps out of my GPU. This was as drastic a change as my first SSD drive was, well worth the money.


----------



## YellowBlackGod

Has anyone got FRTC working with Catalyst 15.7? I checked it in CCC, set it to max (95 FPS), and launched Kingdoms of Amalur: Reckoning, which with V-Sync off hits 200+ FPS. It doesn't work; FPS is still that high. Any thoughts? I don't have a Freesync monitor, if that matters.


----------



## Scorpion49

Quote:


> Originally Posted by *YellowBlackGod*
> 
> Has anyone got FRTC working with Catalyst 15.7? I checked it in CCC, set it to max (95 FPS), and launched Kingdoms of Amalur: Reckoning, which with V-Sync off hits 200+ FPS. It doesn't work; FPS is still that high. Any thoughts? I don't have a Freesync monitor, if that matters.


It works for me, I set mine at 80 and it seems to stop around 78-79. Makes me want to go play kingdoms of amulbulbulrurburlrlr again though, that game was so fun. Shame what happened to the company before even the first update came out.

EDIT: if you run AB or something maybe try with it off and use something else like fraps to check the FPS, Rivatuner has the same functionality built in and it might be preventing the driver from hooking properly.


----------



## YellowBlackGod

Quote:


> Originally Posted by *Scorpion49*
> 
> It works for me, I set mine at 80 and it seems to stop around 78-79. Makes me want to go play kingdoms of amulbulbulrurburlrlr again though, that game was so fun. Shame what happened to the company before even the first update came out.


Would you kindly try? Could this possibly be game dependent? Amalur is a great game; this is my second time playing it. This game did back in 2012 what The Witcher 3 does now.


----------



## Scorpion49

Quote:


> Originally Posted by *YellowBlackGod*
> 
> Would you kindly try? Could this possibly be game dependent? Amalur is a great game; this is my second time playing it. This game did back in 2012 what The Witcher 3 does now.


Yeah give me a couple of minutes, gotta install steam (game should still be on my 3TB, I never uninstall them lol).


----------



## YellowBlackGod

Quote:


> Originally Posted by *Scorpion49*
> 
> Yeah give me a couple of minutes, gotta install steam (game should still be on my 3TB, I never uninstall them lol).


Right! Thank you mate!


----------



## Scorpion49

Quote:


> Originally Posted by *YellowBlackGod*
> 
> Right! Thank you mate!


Well, Steam doesn't want to link my games from the old library, so I have to DL it again; 20 minutes until it's done.

EDIT: DURP, I forgot. FRTC only works in DX10 or higher games, so it won't work on Amalur. It seems Freesync does not work in this game either. If you need a frame cap, use Afterburner and Rivatuner; it works great. To be fair, the game is not very demanding, so I guess Vsync would be fine as well.
Quote:


> Frame Rate Target Control™ (FRTC)
> FRTC allows the user to set a maximum frame rate when playing an application in full screen exclusive mode. This feature provides the following benefits:
> Reduced GPU power consumption
> Reduced system heat
> Lower fan speeds and less noise
> This feature is supported on applications using DirectX® 10 or higher and on the following AMD graphics products


----------



## intelfan

Quote:


> Originally Posted by *intelfan*
> 
> Do the 290/290X cards have screen tearing when using Eyefinity? (DVI DVI HDMI)
> 
> Thanks for any help.


Does anyone know?


----------



## Forceman

Quote:


> Originally Posted by *Scorpion49*
> 
> On the other hand, Gsync has a problem with runaway frame rates in menus and such that cause some weird flickering of the monitor, which can be very annoying as well, since it does not have the option to vsync past the max VRR.


I thought that option was added in the new drivers?


----------



## YellowBlackGod

Quote:


> Originally Posted by *Scorpion49*
> 
> Well, Steam doesn't want to link my games from the old library, so I have to DL it again; 20 minutes until it's done.










Thank you for this. It might be game dependent, who knows.... I will try with another game as well when I am home. I am running the game in the Windows 10 preview; I don't know if that matters.
Quote:


> Originally Posted by *Scorpion49*
> 
> Well, Steam doesn't want to link my games from the old library, so I have to DL it again; 20 minutes until it's done.
> 
> EDIT: DURP, I forgot. FRTC only works in DX10 or higher games, so it won't work on Amalur. It seems Freesync does not work in this game either. If you need a frame cap, use Afterburner and Rivatuner; it works great. To be fair, the game is not very demanding, so I guess Vsync would be fine as well.


Right! That clears everything up! Thank you for this!


----------



## mfknjadagr8

Quote:


> Originally Posted by *Scorpion49*
> 
> Well, Steam doesn't want to link my games from the old library, so I have to DL it again; 20 minutes until it's done.
> 
> EDIT: DURP, I forgot. FRTC only works in DX10 or higher games, so it won't work on Amalur. It seems Freesync does not work in this game either. If you need a frame cap, use Afterburner and Rivatuner; it works great. To be fair, the game is not very demanding, so I guess Vsync would be fine as well.


Sometimes you have to force it to rediscover existing files. To do this you move the folder out, delete local content through the list on Steam, move the folder back, then click install again and it will rediscover the files, assuming you point it to the correct library folder.









On the subject of the new driver (15.7), I haven't tried it yet, but I'm wondering if it fixed the horrible flicker I get from water and reflective surfaces in The Witcher 3 when in crossfire. It's downright nauseating at times. I've tried a few things that reduced it a little, but nothing takes it to a tolerable level. I still play it, but it's tough in some spots. What performance increases have you guys seen in The Witcher 3 running crossfire?


----------



## SavageBrat

Still learning to OC video cards, but this was with the new drivers at stock volts: 1140/1500


----------



## mfknjadagr8

Quote:


> Originally Posted by *SavageBrat*
> 
> Still learning to OC video cards, but this was with the new drivers at stock volts: 1140/1500


I forget, what is your 8370 clocked to? I'm thinking the score should be a little higher for those clocks... if I remember right, my 8320 at 4.8 and one 290 at stock clocks (947/1250) got around 10,500...


----------



## SavageBrat

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I forget what is your 8370 clocked to? I'm thinking the score should be a little higher for those clocks...if I remember right my 8320 at 4.8 and one 290 stock clocks (947 1250) got around 10,500...


Only at 4.5... slow poking, as I just got the 8370...


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> Might run it on 17C water later.
> 
> 
> 
> 
> 
> 
> 
> I hope it will still improve. 30+C vs sub 20C you reckon?
> 
> For a reference card, I am amazed by the memory OC! Core is a bit meh though.


it's like beer. the colder the better.

edit: here is my second 290 (second slot @ X8) . . .

http://www.3dmark.com/3dm/7690148?

Got a slightly higher gpu score, which is within margin of error i guess.


----------



## bkvamme

Quote:


> Originally Posted by *intelfan*
> 
> Does anyone know?


I did not notice anything on my setup (same outputs as you), but I no longer have triple displays, so I am afraid I can't verify further.


----------



## piquadrat

Quote:


> Originally Posted by *rdr09*
> 
> it's like beer. the colder the better.
> .


This is blasphemy







Some beer varieties require 16-18 degrees Celsius to show their glory. Don't be fooled by ubiquitous tasteless international lagers.


----------



## rdr09

Quote:


> Originally Posted by *piquadrat*
> 
> This is blasphemy
> 
> 
> 
> 
> 
> 
> 
> Some beer varieties require 16-18 degrees Celsius to show their glory. Don't be fooled by ubiquitous tasteless international lagers.


lol. my bad. i like dark slightly below room temp.


----------



## yawa

So weird Overclocking/benching question here.

I have my Diamond 290X in what could be construed as an "overkill" water loop (how overkill? At +200mV I peak in benching temps at 48C at 1241/1500 with my ambient at 68F, with 3DMark never going above 40C).

My few questions are...

- Does anyone have trouble with Afterburner (The only OC'ing program that seems to actually stick and change my benchmark scores is Trixx)? -

- Does anyone know a program that allows one to go above +200mV? -

- Does anyone post in the benching threads anymore? -

- And what's the highest Ram OC one can reasonably expect on this card? -


----------



## edo101

Quote:


> Originally Posted by *YellowBlackGod*
> 
> What matters most is that all the features advertised for the R9 300 series are available for the 200 series as well. Now I can say there is no reason to upgrade from the R9 200 series, since we are talking about the same thing.


Too bad you weren't on Nvidia. You'd be forced to upgrade


----------



## edo101

Quote:


> Originally Posted by *fyzzz*
> 
> http://www.3dmark.com/3dm/7681019? 1150/1600 with 15.7 driver










are you on water?


----------



## xKrNMBoYx

I have my custom fan profile set so it basically keeps temps at 70-73C and below for my 290X Crossfire, but on reference coolers it does get a bit noisy. I believe running the 290X at up to 95C is technically fine, so I'm thinking of setting up the profile so the GPUs run at about 80C. I wonder what RPM the fans will run at and how loud it will be. Since my room is about 18-19C, I think the temperature increase in the room will be okay.
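For what it's worth, a custom profile like that is just a set of (temperature, fan %) points that Afterburner-style tools interpolate between. A minimal sketch of that logic in Python (the curve points below are illustrative, not anyone's actual profile):

```python
def fan_speed(temp_c, curve):
    """Linearly interpolate fan % from a list of (temp C, fan %) points."""
    curve = sorted(curve)
    if temp_c <= curve[0][0]:          # below the curve: clamp to lowest point
        return curve[0][1]
    if temp_c >= curve[-1][0]:         # above the curve: clamp to max fan
        return curve[-1][1]
    for (t0, p0), (t1, p1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:         # interpolate within this segment
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)

# Illustrative curve that lets the card run warmer before ramping the fans
curve = [(40, 20), (60, 35), (80, 55), (95, 100)]
print(fan_speed(70, curve))  # halfway between 35% and 55% -> 45.0
```

Shifting the points toward hotter temperatures is effectively what a "target ~80C instead of ~73C" profile does: slower, quieter fans at the cost of higher GPU temps.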


----------



## Lao Tzu

Hi, downloading driver 15.7 and the new version of 3DMark Fire Strike; posting my results in an hour.

*With Omega and OC 1100/1400:*

http://www.3dmark.com/fs/4914719

*with 15.5 and OC 1125/1600*

http://www.3dmark.com/fs/5354302 (it gives a lower GPU score despite the higher OC)

*with 15.7 and OC 1125/1600 (no changes for me)*

http://www.3dmark.com/fs/5355313


----------



## xKrNMBoYx

Quote:


> Originally Posted by *YellowBlackGod*
> 
> Has anyone got FRTC working with Catalyst 15.7? I checked it in CCC, set it to max (95 FPS) and launched Kingdoms of Amalur Rekoning, which with V-Sync Off hits 200+ FPS. It doesn't work. FPS are still that high. Any thoughts? I don't have any Freesync monitor if that matters.


Works for me. Tried 60 and 75 and it works.
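Mechanically, a frame cap like FRTC just enforces a per-frame time budget: the driver holds each frame until roughly 1/target seconds have elapsed. A user-space sketch of the idea (purely illustrative, not AMD's implementation):

```python
import time

def frame_budget(target_fps):
    """Per-frame time budget in seconds for a given FPS cap."""
    return 1.0 / target_fps

def wait_out_frame(frame_start, target_fps, now=time.monotonic, sleep=time.sleep):
    """Sleep away whatever is left of the frame budget, like a driver-level cap."""
    remaining = frame_budget(target_fps) - (now() - frame_start)
    if remaining > 0:
        sleep(remaining)

# At a 95 FPS cap, each frame gets about 10.5 ms
print(round(frame_budget(95) * 1000, 2))  # 10.53
```

A game that would otherwise render 200+ FPS just spends the leftover budget idle, which is why a working cap also cuts power draw and heat.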


----------



## aaroc

Quote:


> Originally Posted by *intelfan*
> 
> Do the 290/290X cards have screen tearing when using Eyefinity? (DVI DVI HDMI)
> 
> Thanks for any help.


With CFX disabled and Vsync enabled I do not see any tearing in Eyefinity at 7680x1440 (DVI-DL, HDMI, DVI-DL). When CFX is enabled it depends on the game: in Dirt 3 or Dirt Showdown I can enable Vsync with no tearing. Other games flicker with Vsync on and CFX, so I run them with Vsync off and get some tearing from time to time.

With the latest 15.7 drivers, Tomb Raider runs super smooth with CFX without Vsync; it has the flicker with Vsync on.
Quote:


> Originally Posted by *King4x4*
> 
> Go water or go home!


----------



## AMDMAXX

I figured I'd post this in here too... I posted this in the O/C thread... There's a few more pictures over there though...





I had a fun time putting this machine back together...

Seems with no VDDC added I'm getting 1100 on both cores and 1350 on memory.. anything above that seems to give me artifacts...

One card is a 290 the other is a 290x...

One is hynix (non-X) ram and the other elpida (x)...

http://www.3dmark.com/3dm/7053383?

Temps in Valley are pretty low... around 45-46c after 3-4 hours of looping... 55c with the case closed up...

I was appalled at the stock TIM job... horrible....


----------



## fyzzz

Quote:


> Originally Posted by *edo101*
> 
> 
> 
> 
> 
> 
> 
> 
> are you on water?


No, but maybe soon. That score and OC are actually not that high; my card can do much better, I just need the cooling now.


----------



## DividebyZERO

290Xs still kicking, and enjoying every update AMD throws at them to keep them moving along after 2 years.

Using The Stilt's BIOS at default 1075/1375 clocks with an older driver



Using The Stilt's BIOS at default 1075/1375 clocks with the newest 15.7



I am tempted to OC higher and push my PSU to the limits for a 10k run


----------



## randhis

Has anyone here flashed a different 290X BIOS onto a reference 290X and seen any improvement? I have a ref 290X with an Arctic Accelero 3; it would be nice if I could flash the Tri-X BIOS onto my modded 290X. Anyone here have experience doing it? Help would be much appreciated, thank you.


----------



## edo101

Quote:


> Originally Posted by *fyzzz*
> 
> No, but maybe soon. That score and OC are actually not that high; my card can do much better, I just need the cooling now.


I guess you hit the 290 jackpot. Whats the brand and make of your card?


----------



## Unknownm

Installed AMD 15.7 on Windows 10 (build 10162) and now I can't disable ULPS, yet 15.6 beta and older drivers never had this issue.

Look at the GPU2 temperature: it's 0C. GPU-Z reports ULPS enabled even with the "disable ULPS" option enabled in MSI Afterburner.


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Might run it on 17C water later.
> 
> 
> 
> 
> 
> 
> 
> I hope it will still improve. 30+C vs sub 20C you reckon?
> 
> For a reference card, I am amazed by the memory OC! Core is a bit meh though.
> 
> 
> 
> it's like beer. the colder the better.
> 
> edit: here is my second 290 (second slot @ X8) . . .
> 
> http://www.3dmark.com/3dm/7690148?
> 
> Got a slightly higher gpu score, which is within margin of error i guess.
Click to expand...

1150/1600
http://www.3dmark.com/fs/5349386

1200/1600
http://www.3dmark.com/fs/5349579

1150/1610
http://www.3dmark.com/fs/5349697

I forgot to do 1620 on the memory. With 1150/1600, 290s can score between 12.9K and 13.5K, I guesstimate. 13.5K would be considered a golden card, I think (not on overclockability, but on synthetic bench scores)
Quote:


> Originally Posted by *xKrNMBoYx*
> 
> I have my custom fan profile set so it basically keeps temps at 70-73C and below for my 290X Crossfire, but on reference coolers it does get a bit noisy. I believe running the 290X at up to 95C is technically fine, so I'm thinking of setting up the profile so the GPUs run at about 80C. I wonder what RPM the fans will run at and how loud it will be. Since my room is about 18-19C, I think the temperature increase in the room will be okay.


When my cards were still using the reference cooler, I needed to run the fans at 70 to 80% to stay in the 70s to 80s Celsius, in a >30C room. I imagine you can get the same result at a much lower fan speed, or better temps at the same fan speed.


----------



## bkvamme

Here are my results on 15.7.

Stock clocks (1040/1250)
Firestrike: 10330
http://www.3dmark.com/fs/5352481

Overclocked (1250/1500)
Firestrike: 11815
http://www.3dmark.com/fs/5351996

The upgrade from my R9 280X to the R9 290X was well worth it. Almost doubled my score. The upgrade from the FX-6350 to the i7-2600K certainly helped as well.

Here is a comparison of my Firestrike scores:
http://www.3dmark.com/compare/fs/5352481/fs/5351996/fs/5233401/fs/5226864/fs/4215516/fs/4283072

Legend:
Result 1. OC CPU, Stock GPU / 10330
Result 2. OC CPU, OC GPU / 11815
Result 3. Stock CPU, OC GPU / 10533
Result 4. Stock CPU, Stock GPU / 9624
Result 5. OC CPU, OC GPU, old rig / 6440
Result 6. Laptop (Gigabyte P34W v3) / 6651

Finally I have a desktop that is stronger than my laptop


----------



## fyzzz

Quote:


> Originally Posted by *edo101*
> 
> I guess you hit the 290 jackpot. Whats the brand and make of your card?


XFX 290 DD


----------



## xKrNMBoYx

Quote:


> Originally Posted by *AMDMAXX*
> 
> I figured I'd post this in here too... I posted this in the O/C thread... There's a few more pictures over there though...
> 
> 
> 
> 
> 
> I had a fun time putting this machine back together...
> 
> Seems with no VDDC added I'm getting 1100 on both cores and 1350 on memory.. anything above that seems to give me artifacts...
> 
> One card is a 290 the other is a 290x...
> 
> One is hynix (non-X) ram and the other elpida (x)...
> 
> http://www.3dmark.com/3dm/7053383?
> 
> Temps in Valley are pretty low... around 45-46c after 3-4 hours of looping... 55c with the case closed up...
> 
> I was appalled at the stock TIM job... horrible....


Yeah, my ref 290 would go up to 1110MHz without extra VDDC before I got artifacts, and I didn't OC the memory at all since it was Elpida. My 2 ref 290Xs can go to 1080MHz/1400MHz without extra VDDC and no artifacts; 1090/1450 will cause artifacts.
Quote:


> Originally Posted by *Unknownm*
> 
> Installed AMD 15.7 on Windows 10 (build 10162) and now I can't disable ULPS, yet 15.6 beta and older drivers never had this issue.
> 
> Look at the GPU2 temperature: it's 0C. GPU-Z reports ULPS enabled even with the "disable ULPS" option enabled in MSI Afterburner.


I had an issue after 15.7 too. I never disabled ULPS with previous drivers, but everything showed up nicely. With 15.7, GPU-Z thinks my 2nd card has a 32-bit bus and 20GB/s bandwidth. It also shows the GPU as not in UEFI mode. Just ULPS working, I guess. I wonder why ULPS isn't disabling for you.


----------



## xKrNMBoYx

Quote:


> Originally Posted by *kizwan*
> 
> 1150/1600
> http://www.3dmark.com/fs/5349386
> 
> 1200/1600
> http://www.3dmark.com/fs/5349579
> 
> 1150/1610
> http://www.3dmark.com/fs/5349697
> 
> I forgot to do 1620 on the memory. With 1150/1600, 290s can score between 12.9K and 13.5K, I guesstimate. 13.5K would be considered a golden card, I think (not on overclockability, but on synthetic bench scores)
> When my cards were still using the reference cooler, I needed to run the fans at 70 to 80% to stay in the 70s to 80s Celsius, in a >30C room. I imagine you can get the same result at a much lower fan speed, or better temps at the same fan speed.


Thanks. I messed with my fan profile, and I need about mid-40s to mid-50s percent fan speed to keep the GPUs between the mid 70s and mid 80s. IMO, two 290Xs running at 40-50% fan is much easier on the ears than both running around 60-70%.


----------



## rdr09

Quote:


> Originally Posted by *Unknownm*
> 
> Installed AMD 15.7 on Windows 10 (build 10162) and now I can't disable ULPS, yet 15.6 beta and older drivers never had this issue.
> 
> Look at the GPU2 temperature: it's 0C. GPU-Z reports ULPS enabled even with the "disable ULPS" option enabled in MSI Afterburner.
> 
> 
> Spoiler: Warning: Spoiler!


Could be Win 10, not sure. In AB, "Unofficial Overclocking Mode" should be set to "Without PowerPlay support".

edit: more info (not sure if it will apply to Win 10) . . .

http://www.overclock.net/t/1265543/the-amd-how-to-thread

Quote:


> Originally Posted by *kizwan*
> 
> 1150/1600
> http://www.3dmark.com/fs/5349386
> 
> 1200/1600
> http://www.3dmark.com/fs/5349579
> 
> 1150/1610
> http://www.3dmark.com/fs/5349697
> 
> I forgot to do 1620 on the memory. With 1150/1600, 290s can score between 12.9K and 13.5K, I guesstimate. 13.5K would be considered a golden card, I think (not on overclockability, but on synthetic bench scores)
> When my cards were still using the reference cooler, I needed to run the fans at 70 to 80% to stay in the 70s to 80s Celsius, in a >30C room. I imagine you can get the same result at a much lower fan speed, or better temps at the same fan speed.


+rep. I think it's my Elpida VRAM, which can only do 1500 at most. Setting it at 1600 or above can add a few more points in graphics; at a similar OC mine are about 200 points lower in graphics. At stock, they are pretty normal of course. Here is a comparison at stock between the old and new driver . . .

http://www.3dmark.com/compare/fs/5358579/fs/3106844

300 more points.









Quote:


> Originally Posted by *bkvamme*
> 
> Here are my results on 15.7.
> 
> Stock clocks (1040/1250)
> Firestrike: 10330
> http://www.3dmark.com/fs/5352481
> 
> Overclocked (1250/1500)
> Firestrike: 11815
> http://www.3dmark.com/fs/5351996
> 
> The upgrade from my R9 280X to the R9 290X was well worth it. Almost doubled my score. The upgrade from the FX-6350 to the i7-2600K certainly helped as well.
> 
> Here is a comparison of my Firestrike scores:
> http://www.3dmark.com/compare/fs/5352481/fs/5351996/fs/5233401/fs/5226864/fs/4215516/fs/4283072
> 
> Legend:
> Result 1. OC CPU, Stock GPU / 10330
> Result 2. OC CPU, OC GPU / 11815
> Result 3. Stock CPU, OC GPU / 10533
> Result 4. Stock CPU, Stock GPU / 9624
> Result 5. OC CPU, OC GPU, old rig / 6440
> Result 6. Laptop (Gigabyte P34W v3) / 6651
> 
> Finally I have a desktop that is stronger than my laptop


+rep


----------



## Scorpion49

Man, this HG10 is really good even with only an H60. I have two AP-14s on it at ~800RPM and my temps haven't exceeded 62C yet; working on testing 1075MHz at stock volts.


----------



## colorfuel

Managed to get past a 14000 GPU score with 1156/1500 +100mV

http://www.3dmark.com/3dm/7700989?

yay!


----------



## Scorpion49

Seems like my 290X is a pretty good card; I'm now on to 1100MHz with only +20% power limit, no voltage added. Considering I've owned 7 other Hawaii cards and none of them would pass 1050MHz no matter what (water, air, non-ref PCB, didn't matter), I'm pretty happy with this one.


----------



## mfknjadagr8

I installed 15.7 last night... didn't have time to play anything, because for some reason the last two beta releases have been borked the first time I install. I remove via DDU in safe mode and run CCleaner to clear any other entries before reinstalling, as always. It installs as usual and reboots (I always unplug power from the second card when doing driver installs). After the restart I reconnect, let it detect, restart, then go to enable Crossfire. But with the last two betas I've had to uninstall and reinstall twice; the first time, Windows goes wonky: not responding, the start menu will come up but nothing can be selected, a bunch of things. After uninstalling and reinstalling the second time, it's all gone, no issues to be found. Another interesting thing is that Crossfire seems to enable itself automatically with the last two releases: reboot with the second card and it comes up asking to enable Crossfire, but when you go in it's already enabled and working.


----------



## FastEddieNYC

Quote:


> Originally Posted by *Unknownm*
> 
> Installed AMD 15.7 on Windows 10 (build 10162) and now I can't disable ULPS, yet 15.6 beta and older drivers never had this issue.
> 
> Look at the GPU2 temperature: it's 0C. GPU-Z reports ULPS enabled even with the "disable ULPS" option enabled in MSI Afterburner.


I had the same issue with ULPS when I installed 15.7 on Win 10 10166. I got it working after toggling ULPS on and off in AB and rebooting a few times.


----------



## bkvamme

Quote:


> Originally Posted by *FastEddieNYC*
> 
> I had the same issue with ULPS when I installed 15.7 on Win 10 10166. I got it working after toggling ULPS on and off in AB and rebooting a few times.


What is ULPS? And for that matter, what is PowerPlay?


----------



## diggiddi

ULPS is Ultra Low Power State; it puts the secondary card(s) in a Crossfire setup to sleep at idle. It can be disabled in Sapphire Trixx and other software.
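For the curious, what Trixx and Afterburner toggle under the hood is commonly understood to be the `EnableUlps` value under the display adapter's registry class keys. A hedged sketch of the .reg fragment (the `0000` subkey index varies per system; multi-GPU rigs have one `000x` key per adapter, so check each):

```
Windows Registry Editor Version 5.00

; Illustrative path - the 0000 index differs per system and per card
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

A reboot is needed for the driver to pick up the change, which lines up with the "rebooting a few times" reports above.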


----------



## Unknownm

Quote:


> Originally Posted by *xKrNMBoYx*
> 
> I had an issue after 15.7 too. I never disabled ULPS with previous drivers but everything showed nicely. With 15.7 GPUZ thinks of my 2nd card as a 32bit bus and 20gb/s bandwidth. Also shows GPU as not on UEFI mode. Just ULPS working I guess. Wonder why ULPS isn't disabling for you.


Quote:


> Originally Posted by *FastEddieNYC*
> 
> I had the same issue with ULPS when I installed 15.7 on Win 10 10166. I got it working after toggling ULPS on and off in AB and rebooting a few times.


Can confirm, a few restarts will disable ULPS.

Also, that screenshot (which I didn't notice at the time) shows a 290X instead of a 290. After the few restarts, GPU-Z started to report it as a 290.


----------



## JourneymanMike

Quote:


> Originally Posted by *bkvamme*
> 
> Quote:
> 
> 
> 
> Originally Posted by *FastEddieNYC*
> 
> I had the same issue with ULPS when I installed 15.7 on Win 10 10166. I got it working after toggling ULPS on and off in AB and rebooting a few times.
> 
> 
> 
> What is ULPS? And for that matter, what is PowerPlay?
Click to expand...

http://forums.guru3d.com/showthread.php?t=361702


----------



## fyzzz

I think I'm gonna go water soon. I've got the money for it now, and I am thinking about getting a 480 kit from Alphacool and connecting my 290 to it as well. I'll have to have the radiator outside my case, but that is no problem.


----------



## SavageBrat

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I forget, what's your 8370 clocked to? I'm thinking the score should be a little higher for those clocks... if I remember right, my 8320 at 4.8 and one 290 at stock clocks (947/1250) got around 10,500...


OK... I got the CPU tuned up a bit (4.8), but I'm still learning about the card, so this is still at stock volts... 1130/1500


----------



## mfknjadagr8

Quote:


> Originally Posted by *SavageBrat*
> 
> OK... I got the CPU tuned up a bit (4.8), but I'm still learning about the card, so this is still at stock volts... 1130/1500


If that's stable, that's very good for stock volts. My second card can't go over 1300 on memory without voltage; if I clock them both over 1300 with no voltage increase, the bottom card won't even initialize in Crossfire.


----------



## SavageBrat

I can go up to 1600 on the memory and it's stable, but I get a lower overall score and a slightly higher graphics score.


----------



## Hazardz

Going from 15.4 beta to 15.7, my R9 290 at stock (975/1250) gained nearly 300 points in Firestrike standard, from ~9200 to ~9500. Did a little bit of overclocking and ended up here.

http://www.3dmark.com/fs/5373363


----------



## peitinhos

Any initial thoughts on Catalyst 15.7? I had problems with previous CCC versions... should I install it or just the driver?


----------



## sugarhell

It's not omega


----------



## mfknjadagr8

Quote:


> Originally Posted by *peitinhos*
> 
> Any initial thoughts on Catalyst 15.7? I had problems with previous CCC versions... should I install it or just the driver?


And if you play Witcher 3 with Crossfire, definitely... CCC isn't as robust as I'd like, but it works.


----------



## Scorpion49

Is there any way to stop this 290X from downclocking to idle during games? Just bought The Witcher 3 and I can't play it, because the card just sits there at 384MHz looking stupid. I had this problem way back when Hawaii came out; good to see it's still alive and well years later.


----------



## pengs

Quote:


> Originally Posted by *Scorpion49*
> 
> Is there any way to stop this 290X from downclocking to idle during games? Just bought the witcher 3 and I can't play it because the card just sits there at 384mhz looking stupid. I had this problem way back when Hawaii came out, good to see it still alive and well years later.


Does your frame rate tank also?


----------



## Scorpion49

Quote:


> Originally Posted by *pengs*
> 
> Does your frame rate tank also?


It can't tank; it never raises up to anything. I start the game, and the card remains at idle. I get like 20fps in the menu and about 5fps when I try to play. I literally had the exact same problem in BF4 when it came out; I gave up and sold the 290s I had at the time because it never got fixed.


----------



## Ha-Nocri

Quote:


> Originally Posted by *Scorpion49*
> 
> It can't tank, it never raises up to anything. I start the game, the card remains at idle. I get like 20fps in the menu and about 5fps when I try to play. I literally had the same exact problem in BF4 when it came out, I gave up and sold my 290's I had at the time because it never got fixed.


You're doing something wrong then, or you're unlucky enough to have gotten 2 bad cards with the same bug.


----------



## Scorpion49

Quote:


> Originally Posted by *Ha-Nocri*
> 
> You're doing something wrong then, or you're unlucky enough to have gotten 2 bad cards with the same bug.


It isn't a bad card. I can play 10 other games all day with no issue. I can bench with no issue. Why it decided that with this one game it needs to be ******ed, I don't know.


----------



## Ha-Nocri

did you set real full screen?


----------



## pengs

Try Windows+L, log back in and refocus on the game


----------



## Scorpion49

Quote:


> Originally Posted by *Ha-Nocri*
> 
> did you set real full screen?


First thing I did. It's like the driver has no idea a game is running; it doesn't even briefly spike up to 3D clocks and drop back.


----------



## SavageBrat

Well, this was the best I could squeeze out of mine; ran out of room on the core voltage slider. 1200/1700, +200 on core, +50 on PL and +25 on RAM.
http://www.3dmark.com/3dm/7719908 - temps were 80C on the GPU, 67C on mem and 69C on the VRMs, on air and without the additional 6-pin power connector hooked up.


----------



## Scorpion49

I wiped my driver and re-installed it; now the game works, sort of. Definitely not good with Freesync, it's stutter heaven. I'll just wait until the next patch comes out, I guess.


----------



## HOMECINEMA-PC

Did this ages ago









290 @ 1300/1400

http://www.3dmark.com/3dm/4074915


----------



## Scorpion49

Welp, it seems like Freesync doesn't work at all now. I think this monitor is definitely going back to Newegg: all the dead pixels, and it seems to have some other issues as well. I hated paying the premium for a Gsync monitor, but that thing worked flawlessly.


----------



## fyzzz

http://www.3dmark.com/fs/5362161 1220/1700 with 15.7 on air...i think i will go water soon.


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> http://www.3dmark.com/fs/5362161 1220/1700 with 15.7 on air...i think i will go water soon.


Very nice.

I'd love to match this on an AMD FX. Just the graphics score before the boys get their balls tickled.









It would be hard but...









But might come later as the GPU is on X99 at the moment.


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> Very nice.
> 
> I'd love to match this on an AMD FX. Just the graphics score before the boys get their balls tickled.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It would be hard but...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But might come later as the GPU is on X99 at the moment.


I did that run with 90C on VRM1 and only a few artifacts.


----------



## th3illusiveman

Quote:


> Originally Posted by *fyzzz*
> 
> I did that run with 90c on vrm1 and only a few artifacts.


Nice score, but artifacts make it void. Maybe 1200/1600 will be stable?


----------



## fyzzz

Quote:


> Originally Posted by *th3illusiveman*
> 
> nice score but artifacts make it void. maybe 1200/1600 will be stable?


I will not benchmark more on air... I'm done with it. Next step: water.


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> I did that run with 90c on vrm1 and only a few artifacts.


Who doesn't artifact on these cards when gunning for numbers anyway?

Water will give you more.
Quote:


> Originally Posted by *th3illusiveman*
> 
> nice score but *artifacts make it void*. maybe 1200/1600 will be stable?


Who said?








Artifacts start to appear the moment you push the VRMs to 70+ C.
Quote:


> Originally Posted by *fyzzz*
> 
> I will not benchmark more on air...i'm done with it. Next step:water


17C water waiting.








Looks like we have comparable cards. Mine's reference though. With Hynix chips.


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> Who doesnt artifact on these cards when gunning numbers anyway?
> 
> Water will give you more.
> Who said?
> 
> 
> 
> 
> 
> 
> 
> 
> Artifacts start to appear the moment you push the VRMs to 70+ C.
> 17C water waiting.
> 
> 
> 
> 
> 
> 
> 
> 
> Looks like we have comparable cards. Mine's reference though. With Hynix chips.


How good is your card? Mine also has Hynix memory.


----------



## gatygun

I did get a 13K GPU score and 4.5K combined score with a 290 Tri-X, 10530 points overall. Took the #1 spot for my combination, if the driver is approved. Nice driver.


----------



## Ha-Nocri

I can't get past a 13,500 graphics score. 1230/1600 OC with +100mV


----------



## rdr09

Quote:


> Originally Posted by *Ha-Nocri*
> 
> I can't go past 13,500 graphics score. 1230/1600 OC with +100mv


1230 on the 290 gives me 13600 in graphics with an i7 @ 4.5GHz. Now, with a Thuban at 4GHz . . . about 200 points less.


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> How good is your card? My card has also hynix memory.


Not so good on the core, but it OCs very well on the memory.

http://www.3dmark.com/fs/5342354

Not the best score I have, but it shows the improvement with the new driver.

http://www.3dmark.com/compare/fs/5313728/fs/5342354


----------



## NiteNinja

Quote:


> Originally Posted by *Scorpion49*
> 
> Welp, it seems like Freesync doesn't work at all now. I think this monitor is definitely going back to newegg. All the dead pixels and it seems to have some other issues as well. I hated paying the premium for Gsync monitor but that thing worked flawlessly.


I'm thinking G-Sync, FreeSync, etc. aren't perfected yet. You might be better off just getting a higher refresh rate monitor and turning on Vertical Sync. You'll spend about as much on a Sync'd monitor as you would on a 144Hz one, so you might as well go with one that has been tried and tested to work. You won't have screen tearing unless you go over 144fps.


----------



## OneB1t

If you guys want to edit just the 2D voltage or 3D voltage without creating an offset in Afterburner, it can now be done with the BIOS reader








http://postimg.org/image/cm4sihond/

more information here
http://forums.guru3d.com/showthread.php?t=400050&page=26


----------



## Scorpion49

Quote:


> Originally Posted by *NiteNinja*
> 
> I'm thinking G-Sync, FreeSync, etc, this technology isn't perfected yet. Might be better off just getting a higher refresh rate monitor and turning on Vertical Sync. You'll spend about as much on a Sync'd monitor, as you would on a 144hz one, so might as well go with one that has been tried and tested to work. You won't have screen tearing unless you go over 144fps.


I disagree 100%. I used Gsync for the last 6 months and it was absolutely wonderful; I would never have tried Freesync otherwise. I can't even describe how smooth it is compared to a regular 120/144Hz monitor, which I also own. It is literally a night and day difference, and while I was using Gsync I only found a single game that did not work (Skyrim, but it's finicky about anything not vsync'd at 60Hz).

I think Freesync _might_ not be the problem here, only my particular monitor, but I'll not be trying it again for a while because I can't afford to keep "trying" $600 monitors and shipping them back and forth if they don't work right.


----------



## NiteNinja

Quote:


> Originally Posted by *Scorpion49*
> 
> I disagree 100%. I used Gsync for the last 6 months and it was absolutely wonderful. I would never have tried Freesync otherwise. I can't even describe how smooth it is compared to a regular 120/144hz monitor which I also own. It is literally a night and day difference, and when I was using Gsync I only found a single game that did not work (Skyrim, but its finicky about anything not vsync'd at 60hz).
> 
> I think Freesync _might_ not be the problem here, only my particular monitor but I'll not be trying it again for a while because I can't afford to keep "trying" $600 monitors if they don't work right and shipping them back and forth.


Yeah, I'm basing my opinion on things I've read, and I have used 144Hz monitors (currently using an OC'd 75Hz ultrawide IPS panel), so I really haven't seen a Sync'd screen firsthand.

I just don't see how syncing your card and monitor together can actually improve the smoothness of gameplay unless you're dropping frames, but it would have to be something I'd need to see firsthand.

Have you considered changing out the cable? I've been shopping DisplayPort cables myself for quite some time (really tired of the 30Hz limitation of HDMI at 2K), and it seems like el cheapo and stock cables go bad after a while.


----------



## Scorpion49

Quote:


> Originally Posted by *NiteNinja*
> 
> Yeah I'm basing my facts from things I've read, and have used 144hz monitors. (Currently using a OC'd 75hz Ultrawide IPS panel), so really haven't seen a Sync'd screen firsthand.
> 
> I just don't see how clocking your card and monitor together can actually improve the smoothness of gameplay unless you're dropping frames, but it would have to be something I'll need to see firsthand.
> 
> Have you considered changing out the cable? I've been shopping Displayport cables myself for quite some time, (really tired of the 30hz limitation of HDMI on 2K) and seems like el cheapo and stock cables go bad after awhile.


How can it not? Are you one of the young bucks that never gamed on a CRT?







I'll put it this way: I showed some of my friends who don't game on PC (consoles only, casual gamers), and they could see a huge difference right away between regular 144Hz and 144Hz Gsync. I really wanted Freesync to be just as good, because I could save a lot of money and get a better monitor. I'll have to try a TN panel with a wider sync range sometime down the road and see.

I don't use the stock DP cables; I have a nice Monoprice one that's like 3 times as thick as the dinky things they send in the package.


----------



## NiteNinja

Quote:


> Originally Posted by *Scorpion49*
> 
> How can it not? Are you one of the young bucks that never gamed on a CRT?
> 
> 
> 
> 
> 
> 
> 
> I'll put it this way, I showed some of my friends who don't game on PC (consoles only, casual gamers) and they could see right away a huge difference between regular 144hz and 144hz Gsync. I really wanted Freesync to be just as good because I could save a lot of money and get a better monitor. I'll have to try a TN panel with a wider sync range sometime down the road and see.
> 
> I don't use the stock DP cables, I have a nice monoprice I use thats like 3 times as thick as the dinky things they send in the package.


I look at an old CRT now and it gives me a headache. I think the 24" CRT I had attached to my laptop back at Michigan Tech killed some brain cells.

THAT'S screen flicker.

And after gaming on 21:9, going back to 16:9 makes me feel like I'm looking at 4:3 again.

Before I go too far off topic, can you recommend a good DisplayPort cable? HDMI isn't cutting it.


----------



## Scorpion49

Quote:


> Originally Posted by *NiteNinja*
> 
> Before I go too far off topic, can you recommend a good Displayport cable? HDMI isn't cutting it.


What kind do you need? Standard DP to DP or do you need a mini?


----------



## mus1mus

Quote:


> Originally Posted by *OneB1t*
> 
> if you guys want to edit just 2D voltage or 3D voltage without creating offset in afterburner it can be done now with bios reader
> 
> 
> 
> 
> 
> 
> 
> 
> http://postimg.org/image/cm4sihond/
> 
> more information here
> http://forums.guru3d.com/showthread.php?t=400050&page=26


This is very interesting!

Thanks a lot!


----------



## diggiddi

Monoprice Display Port cables here


----------



## Scorpion49

Quote:


> Originally Posted by *diggiddi*
> 
> Monoprice Display Port cables here


That's what I have, but they don't sell the one I use any more; it is way thicker and white instead of black. Never had a problem with it, even at 4K.


----------



## FastEddieNYC

Quote:


> Originally Posted by *OneB1t*
> 
> if you guys want to edit just 2D voltage or 3D voltage without creating offset in afterburner it can be done now with bios reader
> 
> 
> 
> 
> 
> 
> 
> 
> http://postimg.org/image/cm4sihond/
> 
> more information here
> http://forums.guru3d.com/showthread.php?t=400050&page=26


Thanks for the info. I definitely will look into that.


----------



## NiteNinja

Quote:


> Originally Posted by *Scorpion49*
> 
> What kind do you need? Standard DP to DP or do you need a mini?


Standard DP to DP; thankfully the 8GB XFX 290X DD still uses a standard port. I do not like it when they make connectors smaller and smaller, but I know mini will be the future (the 295X2 has 4 mini ports).


----------



## pengs

Anyone know how 15.7 fares against the modded 15.15?


----------



## NiteNinja

Quote:


> Originally Posted by *pengs*
> 
> Anyone know how 15.7 fares against the modded 15.15?


Coming from 14.9, I'm having stability issues in some pre-alpha games, such as Carmageddon: Reincarnation and Next Car Game: Wreckfest. I also get driver crashes when alt-tabbing out of games, which sometimes causes my GPU's core clock to run at full throttle instead of on-demand. It crashes the driver after a few benches of Valley as well.


----------



## mus1mus

Quote:


> Originally Posted by *pengs*
> 
> Anyone know how 15.7 fares against the modded 15.15?


FS shows better numbers on 15.7.

Graphics test 2 especially went up by about 5 FPS on average. This comparison doesn't show it, though.

http://www.3dmark.com/compare/fs/5313728/fs/5342354


----------



## pengs

Quote:


> Originally Posted by *NiteNinja*
> 
> Coming from 14.9, I'm having stability issues in some pre-alpha games, such as Carmageddon Reincarnation and Next Car Game Wreckfest. I also get driver crashes when alt-tabbing out of games, which sometimes causes my GPU's core clock to run open throttle instead of an on-demand governor. Crashes the driver after a few benches of Valley as well.


Good to know, I play a lot of Wreckfest.
Quote:


> Originally Posted by *mus1mus*
> 
> FS shows better numbers on 15.7.
> 
> Graphics test 2 especially went up by about 5 FPS on average. This comparison doesn't show it, though.
> 
> http://www.3dmark.com/compare/fs/5313728/fs/5342354


What's FS?


----------



## diggiddi

FS = Fire Strike


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Did this ages ago
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 290 @ 1300/1400
> 
> http://www.3dmark.com/3dm/4074915


14257 FS Graphics score


----------



## 4kallday

So I'm currently running two Asus Radeon R9 290X 4GB OC Edition cards in CrossFire, and I'm thinking about selling them and getting a single Gigabyte GeForce GTX 980 Ti G1 Gaming 6GB. Good or bad idea?


----------



## SavageBrat

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 14257 FS Graphics score


My learning bit for the day: is that considered a good graphics score?


----------



## By-Tor

Quote:


> Originally Posted by *4kallday*
> 
> So I'm currently running two Asus Radeon R9 290X 4gb OC Edition cards in crossfire and I'm thinking about selling them and getting a single Gigabyte GeForce GTX 980 Ti G1 Gaming 6GB. Good or bad idea?


I would say stay with the 290Xs. Together they will outperform any single card on the market now (a 295X2 would give them a run) at the same price as a single 980 Ti.


----------



## mus1mus

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 14257 FS Graphics score



ERASED
Even with your Tessellation tweaks, 14389 Graphics Valid








Quote:


> Originally Posted by *4kallday*
> 
> So I'm currently running two Asus Radeon R9 290X 4gb OC Edition cards in crossfire and I'm thinking about selling them and getting a single Gigabyte GeForce GTX 980 Ti G1 Gaming 6GB. Good or bad idea?


If your games don't support SLI/CrossFire, then yes. But I have to warn you, Nvidia drivers are just terribad for the 980 Ti.
Quote:


> Originally Posted by *By-Tor*
> 
> I would say stay with the 290Xs. Together they will outperform any single card on the market now (a 295X2 would give them a run) at the same price as a single 980 Ti.


If your games support SLI/CrossFire, then yes.


----------



## By-Tor

Quote:


> Originally Posted by *mus1mus*
> 
> If your Games support SLI/XFire, then yes.


True


----------



## 4kallday

Quote:


> Originally Posted by *By-Tor*
> 
> True


Everything I play supports CrossFire/SLI except for Trials Fusion. Everywhere I look, in-game FPS between the 980 Ti and my cards is pretty close, but in stuff like Heaven the 980 Ti blows my cards out of the park. Is it possible that in the future, with better drivers, the 980 Ti's in-game performance could gain a slight boost?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *SavageBrat*
> 
> My learning bit for the day, is that considered a good graphics score?


Yes









Quote:


> Originally Posted by *mus1mus*
> 
> 
> ERASED
> Even with your Tessellation tweaks, 14389 Graphics Valid
> 
> 
> 
> 
> 
> 
> 
> 
> If your games don't support SLI/CrossFire, then yes. But I have to warn you, Nvidia drivers are just terribad for the 980 Ti.
> If your games support SLI/CrossFire, then yes.


Nicely done









BUT I did that val *Dec 2013* on Sandybee


----------



## By-Tor

Quote:


> Originally Posted by *4kallday*
> 
> Everything I play supports CrossFire/SLI except for Trials Fusion. Everywhere I look, in-game FPS between the 980 Ti and my cards is pretty close, but in stuff like Heaven the 980 Ti blows my cards out of the park. Is it possible that in the future, with better drivers, the 980 Ti's in-game performance could gain a slight boost?


Better drivers would make a big difference, and they will always be working on them. 290X 4GB cards would not do as well at 4K either, but a pair of 390 8GB cards, at the same price as one 980 Ti, may also be a good upgrade.


----------



## mus1mus

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yes
> 
> 
> 
> 
> 
> 
> 
> 
> Nicely done
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BUT I did that val *Dec 2013* on Sandybee


Don't worry, I still have some in here:









On AMD FX if you want.

Considering it's running on a so-called weak, bottlenecking processor with PCIe 2.0.








Quote:


> Originally Posted by *4kallday*
> 
> Everything I play supports CrossFire/SLI except for Trials Fusion. Everywhere I look, in-game FPS between the 980 Ti and my cards is pretty close, but in stuff like Heaven the 980 Ti blows my cards out of the park. Is it possible that in the future, with better drivers, the 980 Ti's in-game performance could gain a slight boost?


If they had been more generous with the 980 Ti, hmmm. Titan X owners must be throwing their cards out!

I can do 1530/2100 without artifacts or glitches, but as soon as you pull more FPS than the driver should allow, the app just crashes and can no longer recover. Big disappointment there.


----------



## SavageBrat

All I could squeeze out of my 290x... http://www.3dmark.com/3dm/7739349


----------



## mus1mus

@4kallday

You have this to consider too.

http://wccftech.com/amd-radeon-r9-fury-quad-crossfire-nvidia-geforce-gtx-titan-quad-sli-uhd-benchmarks/

It's not Hawaii, but you know how AMD scales in CrossFire.


----------



## Agent Smith1984

Dat scaling tho!

AMD has done one thing for certain in the last few years, and that is get their CrossFire scaling to a top-notch level.


----------



## diggiddi

So which is the better performer right now: a 290X Lightning @ 1200 core, or an OC'd 390/390X?
I'm currently on 1080p, but my future max res will be 3440x1440.


----------



## By-Tor

My second Powercolor 290x LCS card came in today... Hope to install it in my loop this weekend.


----------



## 4kallday

Quote:


> Originally Posted by *mus1mus*
> 
> @4kallday
> 
> You have this to consider too.
> 
> http://wccftech.com/amd-radeon-r9-fury-quad-crossfire-nvidia-geforce-gtx-titan-quad-sli-uhd-benchmarks/
> 
> It's not Hawaii, but you know how AMD scales in CrossFire.


Yeah, I've decided to stick with my cards for now; it seems like the best option. How they can afford to do all the testing to write articles like that, I'll never know.


----------



## AMDMAXX

Quote:


> Originally Posted by *By-Tor*
> 
> My second Powercolor 290x LCS card came in today... Hope to install it in my loop this weekend.


That sucker is beautiful... though in theory it's the same block as my two... I like that they extend the block a bit further out, though.

I wish my 290 were a 290X... but the 290 and the 290X together still give decent scores... ~16750

I love my load temps though, with my 720mm worth of radiators... in Valley they're typically less than 40°C.


----------



## boredmug

Very happy with my two blocked 290Xs. Picked these up used, with blocks already installed, for $500 recently.


----------



## Ha-Nocri

Is AMD working on CPU usage with their cards, or are they waiting for DX12? Was there ever a response from AMD about CPU overhead?


----------



## Gumbi

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Is AMD working on CPU usage with their cards, or are they waiting for DX12? Was there ever a response from AMD about CPU overhead?


I hope they are! They get slaughtered by nVidia in CPU-limited games.


----------



## Maticb

I didn't want to open a new topic, so I'm just going to post here:

I have just bought a second R9 290, and I was hoping this would go much easier than it is... CrossFire just doesn't want to work.
I see no option for it in Catalyst, and the second card is shown as disabled in CCC even though it's not disabled in Device Manager (as seen in the screenshot). The cards are identical, both being reference Sapphire 290s; the only differences are the memory (Hynix and Elpida) and the fact that one is water-cooled and one is not. I tested them both separately, but I cannot swap the PCIe slots because one is in a water-cooling loop already, and the second one I bought is in the second PCIe x16 slot even though GPU-Z shows it running at only x2.

My motherboard is the GA-990FXA-UD3 rev 3.0 and has official support for 2-way CrossFire; my PSU is a Corsair RM850... I just have no more ideas. I am totally depressed right now. I tried 3 different versions of drivers and did a driver-cleanup uninstall with the tool, and all that did was disable the "GPU Overdrive" option in CCC once I reinstalled new drivers.

Does anyone have any ideas?

The motherboard BIOS has no options for PCIe lanes, only one option that sets which PCIe slot should hold the primary GPU.

Screen:



I am so disappointed right now. I hope I didn't throw those 200€ for the 2nd card away...


----------



## mus1mus

Quote:


> Originally Posted by *Gumbi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ha-Nocri*
> 
> Is AMD working on CPU usage with their cards, or are they waiting for DX12? Was there ever a response from AMD about CPU overhead?
> 
> 
> 
> I hope they are! *They get slaughtered by nVidia in CPU limited games.*
Click to expand...

lol. Intel is just better, OKAY?

The bolded statement implies nothing in terms of the CPU context.


----------



## mus1mus

Quote:


> Originally Posted by *Maticb*
> 
> I didn't want to open a new topic, so I'm just going to post here:
> 
> I have just bought a second R9 290, and I was hoping this would go much easier than it is... CrossFire just doesn't want to work.
> I see no option for it in Catalyst, and the second card is shown as disabled in CCC even though it's not disabled in Device Manager (as seen in the screenshot). The cards are identical, both being reference Sapphire 290s; the only differences are the memory (Hynix and Elpida) and the fact that one is water-cooled and one is not. I tested them both separately, but I cannot swap the PCIe slots because one is in a water-cooling loop already, and the second one I bought is in the second PCIe x16 slot even though GPU-Z shows it running at only x2.
> 
> My motherboard is the GA-990FXA-UD3 rev 3.0 and has official support for 2-way CrossFire; my PSU is a Corsair RM850... I just have no more ideas. I am totally depressed right now. I tried 3 different versions of drivers and did a driver-cleanup uninstall with the tool, and all that did was disable the "GPU Overdrive" option in CCC once I reinstalled new drivers.
> 
> Does anyone have any ideas?
> 
> The motherboard BIOS has no options for PCIe lanes, only one option that sets which PCIe slot should hold the primary GPU.
> 
> Screen:
> 
> 
> 
> I am so disappointed right now. I hope I didn't throw those 200€ for the 2nd card away...


You might just need to get the cards detected one by one in those slots.

1. Pull out both cards. Reset the BIOS.
2. Install the first card in the first x16 slot.
3. Boot to Windows. Make sure it gets detected. If not,
4. Fire up DDU and uninstall the driver. Reboot after the uninstall, install the driver again, and reboot if necessary until Windows gets your card working.
5. Pull the first card out of the slot. Install the second card in the next x16 slot. Boot to Windows to make sure it gets detected and working. If it is,
6. Install the first card back in the first slot.
7. Profit.


----------



## Gumbi

Quote:


> Originally Posted by *mus1mus*
> 
> lol. Intel is just better. OKAY?
> 
> Bold statement implies nothing in terms of the CPU context


Play a CPU-limited game. nVidia cards perform a lot better than AMD in CPU-limited games. I'm not a fanboy; my last 2 cards were AMD...


----------



## mus1mus

Quote:


> Originally Posted by *Gumbi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> lol. Intel is just better. OKAY?
> 
> Bold statement implies nothing in terms of the CPU context
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Play a CPU-limited game. nVidia cards perform a lot better than AMD in CPU-limited games. I'm not a fanboy; my last 2 cards were AMD...
Click to expand...

The only way you can tell that is to compare an AMD CPU against Intel. Nvidia should not be the one to determine that.

CLUE: *CPU*-intensive games depend on the *CPU*, not the GPU make.

Also note: some games are designed on specific hardware. You can't expect an AMD GPU to outhustle an nVidia GPU on titles coded for nVidia, unless you are comparing it to a lower-tier card.


----------



## MrKZ

Does anybody know how thick the VRM thermal pad is on the DirectCU II 290X? It looks like 1.0mm or 1.5mm, but I'm not sure, and I don't want to disassemble the cooler just to measure it.
(I need to order ones with better thermal conductivity... the ones that come from Asus are total crap.)


----------



## 4kallday

Quote:


> Originally Posted by *MrKZ*
> 
> Does anybody know how thick the VRM thermal pad is on the DirectCU II 290X? It looks like 1.0mm or 1.5mm, but I'm not sure, and I don't want to disassemble the cooler just to measure it.
> (I need to order ones with better thermal conductivity... the ones that come from Asus are total crap.)


From memory they looked about 1.5mm, but you're better off checking to be sure.


----------



## gordesky1

Has anyone tried Windows 10 with their 290X on GTA 5 with the 15.2 or the new 15.7 drivers? I've been fighting this problem where, 5 to 10 minutes into the game, it stutters like crazy and my core speed is all over the place. And when exiting the game, it will either freeze the whole PC or black-screen. The 15.4 to 15.6 Windows 8.1 drivers are perfect.

This only happens so far on GTA 5.

I just did a fresh install of Windows 10 preview 10162 to rule out a messy install. After going through that trouble, it still does it...

The only thing I can think of is that it's either Windows 10 or the Windows 10 drivers, because I would think that if it were something with my PC, it would do it on every driver.

But I cannot find anyone else with this problem on their 290X with Windows 10 and those drivers for GTA 5...

So I figured I'd post in here to see if anyone is having the same problem.


----------



## wstanci3

Hey Owners,
Thinking about grabbing a 290X to hold me over until AMD/Nvidia's next lineup. I am water-cooling, and I'm just wondering if I would see gains going with a 290X Lightning over a reference 290X. Opinions?


----------



## Gumbi

Quote:


> Originally Posted by *mus1mus*
> 
> The only way you can tell that is to compare an AMD CPU against Intel. Nvidia should not be the one to determine that.
> 
> CLUE: *CPU*-intensive games depend on the *CPU*, not the GPU make.
> 
> Also note: some games are designed on specific hardware. You can't expect an AMD GPU to outhustle an nVidia GPU on titles coded for nVidia, unless you are comparing it to a lower-tier card.


I'm talking about CPU-limited games. nVidia will always beat AMD in CPU-limited scenarios (whether it's an AMD or Intel CPU).


----------



## mus1mus

Quote:


> Originally Posted by *Gumbi*
> 
> I'm talking about CPU-limited games. nVidia will always beat AMD in CPU-limited scenarios (whether it's an AMD or Intel CPU).


Hello, sanity check...
















How can a GPU alleviate a CPU deficiency in games? If you have that notion, explain it. Prove it.

Or







keep silent about it.

Quote:


> Originally Posted by *wstanci3*
> 
> Hey Owners,
> Thinking about grabbing a 290x to hold over for amd/nvidia's next line up. I am water-cooling and just wondering if I would see gains going with a 290x Lightning over a reference 290x? Opinions?


OC headroom.

These GPUs are power-hungry; the better the power section of the card, the better. Binning might also favor the top-end models. If you end up with a bad card, that's just bad luck, I suppose.

But then again, since the release of the 300-series cards, Lightnings and Vapor-Xs have become somewhat irrelevant. Unless you can find one for a good price, of course.


----------



## wstanci3

Quote:


> Originally Posted by *mus1mus*
> 
> But then again, since the release of the 300-series cards, Lightnings and Vapor-Xs have become somewhat irrelevant. Unless you can find one for a good price, of course.


So I am guessing that the 300 series has slightly better power delivery? In that case, I might just grab a 390/390X...


----------



## mus1mus

Quote:


> Originally Posted by *wstanci3*
> 
> So I am guessing that the 300 series has slightly better power delivery? In that case, might just grab a 390/x...


But you'll lose the chance for FC blocks









Better money goes to the Fury, IMO.


EDIT:
ASUS 390Xs have FC blocks too.


----------



## wstanci3

Quote:


> Originally Posted by *mus1mus*
> 
> But you'll lose the chances for FC Blocks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Better money goes to the Fury IMO.
> 
> 
> EDIT:
> ASUS 390X's have FC too.


I am really interested in the Fury... it just seems a bit overpriced, is all. I know, new tech, new goodies cost more.
Maybe I will consider the Fury. Hopefully there's enough stock on Newegg today when it launches.
E: Overpriced relative to 390X performance, anyway


----------



## diggiddi

Quote:


> Originally Posted by *mus1mus*
> 
> Hello, Sanity.... check,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How can a GPU alleviate a CPU deficiency in games? If you have that notion, explain it. Prove it.
> 
> Or
> 
> 
> 
> 
> 
> 
> 
> keep silent about it.
> OC Headroom.
> 
> These GPUs are power hungry. The better the Power section of the card, the better. Binning might also favor the top end models. If you end up with a bad card, that's just bad luck I suppose.
> 
> But then again, since the release of the 300 cards, Lightnings and Vapor-X's somewhat became irrelevant. Unless you can find one for a good price of course.


So just get a 390 instead of a Lightning? I'm not into water cooling, so...


----------



## mus1mus

Quote:


> Originally Posted by *diggiddi*
> 
> So just get a 390 instead of a Lightning? I'm not into water cooling, so...


Not sure, mate. A 390X might just equate to a Lightning.

A 390 might not cut it when both are OC'd.


----------



## diggiddi

Quote:


> Originally Posted by *mus1mus*
> 
> Not sure mate. 390X might just equate to a lightning.
> 
> A 390 might not cut it when both are OC'd.


Thanks. The Egg has the LE version for 360


----------



## gatygun

Quote:


> Originally Posted by *diggiddi*
> 
> Monoprice Display Port cables here


And here I bought a 3dclub DisplayPort cable, version 1.2, for 30 euros.

I didn't want to risk getting a cheap cable that messes everything up


----------



## Scorpion49

Quote:


> Originally Posted by *mus1mus*
> 
> Hello, Sanity.... check,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How can a GPU alleviate a CPU deficiency in games? If you have that notion, explain it. Prove it.
> 
> Or
> 
> 
> 
> 
> 
> 
> 
> keep silent about it.
> OC Headroom.
> 
> These GPUs are power hungry. The better the Power section of the card, the better. Binning might also favor the top end models. If you end up with a bad card, that's just bad luck I suppose.
> 
> But then again, since the release of the 300 cards, Lightnings and Vapor-X's somewhat became irrelevant. Unless you can find one for a good price of course.


Nvidia has much lower driver overhead, especially with multiple cards. This has been proven over and over. Just because you don't understand doesn't mean he is wrong. Why do you think AMD developed Mantle? Here is just one example I found with 3 seconds of my time on this site; there are many more out there.


----------



## kizwan

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Is AMD working on CPU usage with their cards, or are they waiting for DX12? Was there ever a response from AMD about CPU overhead?


With what games do you have a CPU-overhead problem? I don't have any problems here with BF4 (DX11) & GTA V, to name a few. Your CPU may be too old for newer games, if you're using your sig rig to play.
Quote:


> Originally Posted by *Maticb*
> 
> I didn't want to open a new topic, so I'm just going to post here:
> 
> I have just bought a second R9 290, and I was hoping this would go much easier than it is... CrossFire just doesn't want to work.
> I see no option for it in Catalyst, and the second card is shown as disabled in CCC even though it's not disabled in Device Manager (as seen in the screenshot). The cards are identical, both being reference Sapphire 290s; the only differences are the memory (Hynix and Elpida) and the fact that one is water-cooled and one is not. I tested them both separately, but I cannot swap the PCIe slots because one is in a water-cooling loop already, and the second one I bought is in the second PCIe x16 slot even though GPU-Z shows it running at only x2.
> 
> My motherboard is the GA-990FXA-UD3 rev 3.0 and has official support for 2-way CrossFire; my PSU is a Corsair RM850... I just have no more ideas. I am totally depressed right now. I tried 3 different versions of drivers and did a driver-cleanup uninstall with the tool, and all that did was disable the "GPU Overdrive" option in CCC once I reinstalled new drivers.
> 
> Does anyone have any ideas?
> 
> MOBO BIOS has no options for PCIE lanes... Only one option that sets which PCIE lane should be the primary GPU.
> 
> Screen:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I am so disappointed right now. I hope I didn't throw those 200€ for the 2nd card away...


CrossFire isn't available because one of the cards is running at x2. Fix that and you'll have CrossFire.

Your cards should run at x8/x8. Given that the primary card is running at x16, I think you did not plug the second card into the correct PCIe x16 slot (the second slot is x8).
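The lane reasoning kizwan describes can be written down as a quick sanity check. The slot widths below are an assumption for a typical 990FXA-class board (consult the motherboard manual for the real electrical widths), and the x8 threshold simply encodes the rule of thumb in the post:

```python
# Hypothetical electrical widths per physical x16 slot on a 990FXA-class
# board -- illustrative only; the real values live in the board manual.
SLOT_WIDTH = {1: 16, 2: 8, 3: 4}  # slot number -> electrical PCIe lanes

def crossfire_ok(populated_slots, min_width=8):
    """CrossFire enabling expects every card on a healthy link (x8 here)."""
    return all(SLOT_WIDTH.get(slot, 0) >= min_width for slot in populated_slots)

print(crossfire_ok([1, 2]))  # True: x16/x8, the layout kizwan describes
print(crossfire_ok([1, 3]))  # False: a card on a narrow link blocks it
```

A card negotiating at x2 in a slot that should be x8, as in the screenshot, points at a seating or slot-choice problem rather than a driver one.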


----------



## mus1mus

Quote:


> Originally Posted by *Scorpion49*
> 
> Nvidia has much lower driver overhead, especially with multiple cards. This has been proven over and over and over. Just because you don't understand doesn't mean he is wrong. Why do you think AMD developed Mantle? Here is just one example I found with 3 seconds of my time on this site, there are many more out there.


Overhead? Really?

Look:

1. An nVidia card is not equal to an AMD card. And,
2. Changing CPUs doesn't equate to similar performance scaling either. Do you need an example?
3. Games are not created to give equal performance on both AMD and nVidia.
4. Testing methodology, game specifics, *and testers* change the results.
5. If overhead or driver improvement favors nVidia at a time, AMD develops theirs too. *If not,*
6. *nVidia should win in every game, in every situation, in every sense of the word, every time, at 1080p.*

BUT, *they don't.* Should I say more?
http://techgage.com/article/amd-radeon-r9-290x-nvidia-geforce-gtx-780-ti-review/5/

_*Just because you believe in something doesn't make a non-believer wrong about it.*_


----------



## Scorpion49

Quote:


> Originally Posted by *mus1mus*
> 
> Overhead? Really?
> 
> Look,
> 
> 1. An nVidia card is not equal to an AMD card. And,
> 2. Changing CPUs doesn't equate to similar performance scaling either. Do you need an example?
> 3. Games are not created to give equal performance to both AMD and nVidia.
> 4. Testing methodology, game specifics, *and testers* changes the results
> 5. If Overhead or Driver Improvement favors nVidia at a time, AMD develops theirs too. *If not,*
> 6. *nVidia should win in every game, in every situation, in every aspect of the word at 1080p.*
> 
> BUT, *They don't.* Should I say more?
> http://techgage.com/article/amd-radeon-r9-290x-nvidia-geforce-gtx-780-ti-review/5/
> 
> Just because you believe in something, doesn't make a non-believer wrong too.


No, you literally have no idea what you're talking about. AMD would not have wasted time developing Mantle, nor would DX12 be a thing, if CPU overhead with DX11 didn't suck. When you can get 10x as many draw calls from the same hardware with a more efficient API, there is a problem. The fact is, Nvidia's driver has been better at this for some time under DX11; you can argue all you want, but it doesn't change reality.


----------



## mus1mus

Quote:


> Originally Posted by *Scorpion49*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Overhead? Really?
> 
> Look,
> 
> 1. An nVidia card is not equal to an AMD card. And,
> 2. Changing CPUs doesn't equate to similar performance scaling either. Do you need an example?
> 3. Games are not created to give equal performance to both AMD and nVidia.
> 4. Testing methodology, game specifics, *and testers* changes the results
> 5. If Overhead or Driver Improvement favors nVidia at a time, AMD develops theirs too. *If not,*
> 6. *nVidia should win in every game, in every situation, in every aspect of the word, every time at 1080p.*
> 
> BUT, *They don't.* Should I say more?
> http://techgage.com/article/amd-radeon-r9-290x-nvidia-geforce-gtx-780-ti-review/5/
> 
> _*Just because you believe in something, doesn't make a non-believer wrong about it too.*_
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No, you literally have no idea what you're talking about. AMD would not waste time developing Mantle, nor would DX12 be a thing if CPU overhead with DX11 didn't suck. When you can get 10x as many draw calls with the same hardware and a more efficient API, there is a problem. The fact is, Nvidia's driver has been better at this for some time with DX11, you can argue all you want but it doesn't change reality.
Click to expand...

If AMD wants to develop Mantle to reduce your so-called overhead, it's part of their development to get ahead or stay in the game. And no, DirectX 12 is Microsoft's development to improve, not a cure for AMD's overhead issues, as you are saying.

Bottom line: you cannot say AMD vs Nvidia is an apples-to-apples comparison. Period.

And here is an example that might make you happy with your claims. Both my best scores. Both pushed to the limits. Both on the latest drivers.

http://www.3dmark.com/compare/fs/5288567/fs/5389526

If only my test bench were done, I could throw my AMD FX into these results, mind you. But I need a lot to do that.


----------



## Scorpion49

Quote:


> Originally Posted by *mus1mus*
> 
> If AMD wants to develop Mantle to reduce your so-called overhead, it's part of their development to be ahead or stay in the game. And nope, DirectX12 is Microsoft's development to improve. Not a cure to AMD's overhead issues - as you are saying.
> 
> Bottom line you cannot say AMD vs Nvidia is an apple to apple comparison. Period.
> 
> And here is your an example that might make you happy with your claims. Both my best scores. Both pushed to the limits. Both on Latest Drivers.
> 
> http://www.3dmark.com/compare/fs/5288567/fs/5389526


You really don't understand what I'm talking about, and it's clear that you have no intention of doing a quick Google search to educate yourself so you sound less foolish than you do now. So I'll break this down Barney-style for you and just copy/paste the graphs so you can't avoid them (hint: this is NOT about Nvidia vs AMD GPU scores, if you can manage to wrap your mind around that):
Quote:


> Spoiler: Warning: Spoiler!
> 
> 
> 
> Now from those you can already somewhat see that the 290X is falling behind the 780Ti more when using the FX8350 instead of the 4690K.
> 
> But let's make it even easier to see. Let's compare the average difference between the 780Ti and the 290X first on the 4690K platform and then on the FX8350 platform.
> 
> 4690K platform:
> 
> 
> 
> As we can see, the 780Ti is around 6% faster than the 290X on average.
> The difference makes sense and sounds just about right.
> 
> But what happens once we move to a more CPU bottlenecked platform? Before the 337.50 drivers the assumption would have been that the 780Ti and the 290X would be more evenly matched as the CPU is limiting the GPUs from reaching their maximum potential. However if the 337.50 driver set did its job we should see the difference stay the same or grow.
> 
> FX 8350 platform:
> 
> 
> 
> The difference between the cards did indeed change.
> The 780Ti is now around 15% faster than the 290X
> 
> Conclusion:
> 
> The newer Nvidia drivers do indeed help with CPU overhead. High core count CPUs that benefit from highly multithreaded software are better off with Nvidia GPUs in situations that are CPU bound (in DX11 applications).
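The gap arithmetic in the quoted comparison is straightforward to reproduce. A minimal Python sketch, using made-up FPS averages chosen only so the percentages match the quoted 6% and 15% (the real averages come from the benchmark graphs):

```python
# Hypothetical average FPS figures chosen only to illustrate the math;
# the quoted conclusion rests on the real benchmark graphs.
def gap_percent(faster_fps, slower_fps):
    """Return how much faster the first card is, as a percentage."""
    return (faster_fps / slower_fps - 1.0) * 100.0

# On the 4690K, suppose the 780 Ti averages 63.6 FPS vs 60 FPS for the 290X:
intel_gap = gap_percent(63.6, 60.0)   # ~6%: neither card is CPU-starved

# On the FX-8350, suppose the 780 Ti holds 57.5 FPS while the 290X drops to 50:
fx_gap = gap_percent(57.5, 50.0)      # ~15%: the gap widens under a weaker CPU

print(f"4690K gap: {intel_gap:.1f}%")
print(f"FX-8350 gap: {fx_gap:.1f}%")
```

A widening gap on the slower CPU is what you would expect if one driver spends fewer CPU cycles per frame; identical gaps on both platforms would suggest the two drivers are equally CPU-bound.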


----------



## Gumbi

Scorpion49 is right; nVidia ALWAYS wins in CPU-limited games. In CPU-limited games, even a 780 can compete with a 290X, despite the 290X trouncing the 780 in GPU-limited games such as Crysis 3, Far Cry 4, etc.


----------



## mus1mus

Quote:


> Originally Posted by *Scorpion49*
> 
> You really don't understand what I'm talking about, and its clear that you have no intention of doing a quick google search to educate yourself so you sound less foolish than you do now, so I'll break this down barney style for you and just copy/paste the graphs so you can't avoid it (hint: this is NOT about Nvidia vs AMD GPU scores, if you can manage to wrap your mind around that):


Did you realize I answered that?

*2. Changing CPUs doesn't equate to similar performance scaling either. Do you need an example?*
AMD platforms run PCIe 2.0 compared to PCIe 3.0 on Intel, regardless of the CPU. Though that has very little effect, it can be said that the R9 290X's bandwidth is being cut by the lane speed.

I gave you Firestrike as that is the simplest form of test for you to see the difference in results a test can give. There are too many variables to consider when you want to compare the two GPUs. You just have your mind closed.

If you want to delve into the differences between the two GPUs, go on. I'm done.


----------



## Scorpion49

Quote:


> Originally Posted by *mus1mus*
> 
> Did you realize I answered that?
> 
> *2. Changing CPUs doesn't equate to similar performance scaling either. Do you need an example?*
> AMD platforms run PCIe 2.0 compared to PCIe 3.0 on Intel, regardless of the CPU. Though that has very little effect, it can be said that the R9 290X's bandwidth is being cut by the lane speed.
> 
> I gave you Firestrike as that is the simplest form of test for you to see the difference in results a test can give. There are too many variables to consider when you want to compare the two GPUs. You just have your mind closed.
> 
> If you want to delve into the differences between the two GPUs, go on. I'm done.


*This ISN'T about the 290X or the 780ti.* You can't seem to get that out of your head. Just google "driver overhead" and learn something, I'm done trying to help you understand anything.


----------



## DividebyZERO

I guess the constant CPU argument must be for 1080p/1440p? I play at high resolutions, where the CPU doesn't matter. I guess I am in the minority, but if the CPU is holding you back, there are better ways of shifting the load aside from getting the fastest CPU. Of course, I wonder: in these CPU-limited games, does it really matter if you're getting massive FPS already? I guess we could all bring points in from different angles.


----------



## Gumbi

Quote:


> Originally Posted by *mus1mus*
> 
> Did you realize I answered that?
> 
> *2. Changing CPUs doesn't equate to similar performance scaling either. Do you need an example?*
> AMD platforms run PCIe 2.0 compared to PCIe 3.0 on Intel, regardless of the CPU. Though that has very little effect, it can be said that the R9 290X's bandwidth is being cut by the lane speed.
> 
> I gave you Firestrike as that is the simplest form of test for you to see the difference in results a test can give. There are too many variables to consider when you want to compare the two GPUs. You just have your mind closed.
> 
> If you want to delve into the differences between the two GPUs, go on. I'm done.


Why do lower tier nVidia cards compete with AMD's higher tier cards specifically in CPU limited games?


----------



## Scorpion49

Quote:


> Originally Posted by *DividebyZERO*
> 
> I guess the constant CPU argument must be about 1080/1440? I play at high resolutions where the CPU doesn't matter. I guess I'm in the minority, but if the CPU is holding you back there are better ways of shifting the load than getting the fastest CPU. Of course, in these CPU-limited games, does it really matter if you're getting massive fps already? I guess we could all bring points in from different angles.


Deciding on your setup based on the variables you have is different from arguing that driver overhead is mythical.







Naturally, as the resolution goes up, CPU power matters less than GPU power. This has always been the case, except that recently cards have become fast enough to start moving to 4K. There was a time when 1200p was the big resolution, and long before that 1024x768 was pretty tough.


----------



## mus1mus

Quote:


> Originally Posted by *Scorpion49*
> 
> *This ISN'T about the 290X or the 780ti.* You can't seem to get that out of your head. Just google "driver overhead" and learn something, I'm done trying to help you understand anything.


This is about the example you gave. Driver overhead or whatever. You can't compare the two MAKERS. You need to open your mind about that.

I have both AMD and Nvidia cards, and both AMD and Intel CPUs. If I switch between a 780 and a 290 on my AMD FX CPU at stock (to entertain your claims about API deficiency and CPU bottlenecks), the GTX 780 doesn't give me enough reason to pick it over the 290.
They were direct competitors in their time. I'm not even using Mantle for that.

Your claims specifically point towards comparing both companies' cards. They are simply not built the same. As I keep saying.


----------



## mus1mus

Quote:


> Originally Posted by *Gumbi*
> 
> Why do lower tier nVidia cards compete with AMD's higher tier cards specifically in CPU limited games?


Cite an example.


----------



## Scorpion49

Quote:


> Originally Posted by *mus1mus*
> 
> This is about the example you gave. Driver overhead or whatever. You can't compare the two MAKERS. You need to open your mind about that.
> 
> I have both AMD and Nvidia cards, and both AMD and Intel CPUs. If I switch between a 780 and a 290 on my AMD FX CPU at stock (to entertain your CPU bottleneck claims), the GTX 780 doesn't give me enough reason to pick it over the 290.
> They were direct competitors in their time. I'm not even using Mantle for that.
> 
> Your claims specifically point towards comparing both cards. They are simply not built the same. As I keep saying.


My claims are NOT about comparing those cards; I said that was the first example in a 3-second search. If you spent as much time verifying whether what I'm saying is real as you have making posts that completely miss the point, you would have understood by now. You literally can't let go of the 780/290 portion of the example and keep posting that they can't be compared.

If you look back past the 337.50 "miracle drivers" that came out to combat Mantle, AMD used to be the winner in having lower driver overhead. Here, have some more links (all literally from the first page of a Google search; hard, isn't it?):

http://www.pcper.com/reviews/Graphics-Cards/AMD-Mantle-and-NVIDIA-33750-Scaling-Demonstrated-Star-Swarm-AM1

http://forums.guru3d.com/showthread.php?t=398858

http://www.overclock3d.net/articles/gpu_displays/amd_to_improve_cpu_usage_in_their_catalyst_drivers/1

http://www.maximumpc.com/nvidia-takes-mantle-enhanced-dx11-driver-2014/

And then, back in 2011 when the situation was reversed, we saw things like this:

https://forums.geforce.com/default/topic/487446/nvidia-drivers-cpu-overhead-something-to-chew-upon/


----------



## mus1mus

Quote:


> Originally Posted by *Scorpion49*
> 
> My claims are NOT about comparing those cards; I said that was the first example in a 3-second search. If you spent as much time verifying whether what I'm saying is real as you have making posts that completely miss the point, you would have understood by now. You literally can't let go of the 780/290 portion of the example and keep posting that they can't be compared.
> 
> If you look back past the 337.50 "miracle drivers" that came out to combat Mantle, AMD used to be the winner in having lower driver overhead. Here, have some more links (all literally from the first page of a Google search; hard, isn't it?):
> 
> http://www.pcper.com/reviews/Graphics-Cards/AMD-Mantle-and-NVIDIA-33750-Scaling-Demonstrated-Star-Swarm-AM1
> 
> http://forums.guru3d.com/showthread.php?t=398858
> 
> http://www.overclock3d.net/articles/gpu_displays/amd_to_improve_cpu_usage_in_their_catalyst_drivers/1
> 
> http://www.maximumpc.com/nvidia-takes-mantle-enhanced-dx11-driver-2014/
> 
> And then, back in 2011 when the situation was reversed, we saw things like this:
> 
> https://forums.geforce.com/default/topic/487446/nvidia-drivers-cpu-overhead-something-to-chew-upon/


It will always come down to comparing cards. How will anyone ever evaluate a product's edge without comparing it to another product? You pointed to a link with an example that compares a 780Ti to an R9-290X using both an Intel and an AMD CPU for exactly that reason. But to say that AMD has always lagged behind nVidia on driver overhead is meh, considering you know for a fact that AMD fought back with Mantle, and that DX12 promises big gains for AMD in that regard.

Plus, you came into a convo about AMD vs nVidia cards and CPU bottlenecks where someone said nVidia will win against AMD on a bottlenecked CPU. My example contradicts exactly that.


----------



## Maticb

Quote:


> Originally Posted by *kizwan*
> 
> Crossfire is not available because one of the cards is running at x2. Fix that & you'll have crossfire.
> 
> Your cards should run at x8/x8. Given that the primary card is running at x16, I think you did not plug the second card into the correct PCIe x16 slot (the second slot is x8).


As I suspected myself, and as you said, that is correct. I have no idea why it works now or how I fixed it, but this is what I did:

I took my #1 (watercooled) card out (managed to do so without taking the loop apart), then put the #2 in the first slot and ran it; it was working at x16. Then I shut my PC down and put it in the 2nd x16 slot (it is marked on the MOBO and you can clearly see the PCIe physical connectors, so it had been in the correct slot before). This time it ran at x16 in the 2nd slot. So I plugged my #1 back into the first slot, booted up, and both cards are now running at x16, so I have x16/x16 CF running.









No driver reinstall or anything. I have no idea what happened, but I'm happy it works now.









EDIT:
PS: I tried to unlock my card to a 290X when I first got it, but Elpida memory can't be unlocked; if I'm not mistaken, Hynix can be? According to the receipt and GPU-Z, my 2nd card is Hynix and seems to be from the first batch of cards. The person I bought it from had no idea about unlocking, so I assume he didn't try, but the chances are higher with Hynix, right?


----------



## bkvamme

@Maticb Could very well be that there was some smudge on the PCIe connectors of the GPU. I had that problem on my second-hand 290X, and it restricted the card to x8 instead of x16. Taking the card out and wiping the PCIe connectors got it back up to x16.


----------



## mustrum

Quote:


> Originally Posted by *Scorpion49*
> 
> Nvidia has much lower driver overhead, especially with multiple cards. This has been proven over and over and over. Just because you don't understand doesn't mean he is wrong. Why do you think AMD developed Mantle? Here is just one example I found with 3 seconds of my time on this site, there are many more out there.


While what you said used to be true, you should go ahead and install Catalyst 15.7.
AMD fixed their DX11 draw call issues with that driver (it has been fixed for a while in the Windows 10 beta releases as well).


----------



## Scorpion49

Quote:


> Originally Posted by *mustrum*
> 
> While what you said used to be true, you should go ahead and install Catalyst 15.7.
> AMD fixed their DX11 draw call issues with that driver (it has been fixed for a while in the Windows 10 beta releases as well).


It's been installed since the day it came out.


----------



## Ized

http://www.3dmark.com/fs/5408693
http://www.3dmark.com/fs/5408772

My highest scores yet I believe. Spent the night screwing with BIOSes.

Core: 1230Mhz Mem: 1498Mhz


----------



## gatygun

Quote:


> Originally Posted by *Ized*
> 
> 
> 
> http://www.3dmark.com/fs/5408693
> http://www.3dmark.com/fs/5408772
> 
> My highest scores yet I believe. Spent the night screwing with BIOSes.
> 
> Core: 1230Mhz Mem: 1498Mhz


damn nice score, mate.


----------



## Ized

Quote:


> Originally Posted by *gatygun*
> 
> damn nice score, mate.


Ohh! Thanks for all of you guys' BIOS info over on guru3d, plus the posts here.

I'm really excited about the progress.

My score was just voltage + TDP/power limit increases in Hawaii Bios Reader.

Still a huge difference between random BIOSes, so the future is exciting.


----------



## fyzzz

My 290 will soon be under water. Ordered some watercooling parts today.


----------



## OneB1t

Quote:


> Originally Posted by *Ized*
> 
> Ohh! Thanks for all of you guys' BIOS info over on guru3d, plus the posts here.
> 
> I'm really excited about the progress.
> 
> My score was just voltage + TDP/power limit increases in Hawaii Bios Reader.


Happy that someone is using it. Please report on the best BIOS you find.


----------



## m0n4rch

Quote:


> Originally Posted by *mus1mus*
> 
> It will always come down to comparing cards. How will anyone ever evaluate a product's edge without comparing it to another product? You pointed to a link with an example that compares a 780Ti to an R9-290X using both an Intel and an AMD CPU for exactly that reason. But to say that AMD has always lagged behind nVidia on driver overhead is meh, considering you know for a fact that AMD fought back with Mantle, and that DX12 promises big gains for AMD in that regard.
> 
> Plus, you came into a convo about AMD vs nVidia cards and CPU bottlenecks where someone said nVidia will win against AMD on a bottlenecked CPU. My example contradicts exactly that.


There's an API overhead test by 3DMark. It's not graphically intensive, so it takes GPU performance out of the equation and focuses on a large number of draw calls. I ran the test with a 290X and a 970 in the same system.

290X, 15.4 beta:


290X, 15.7:


GTX 970:


Check out these 2 threads, they contain a lot of information on this topic and include a lot of benchmarks:

http://www.overclock.net/t/1528559/directx-driver-overhead-and-why-mantle-is-a-selling-point-bunch-of-benchmarks
http://forums.guru3d.com/showthread.php?t=398858

And DX11 isn't going anywhere. We will have DX12 games in the future, but right NOW we have DX11 games, and there will be plenty of DX11 games in the future as well.
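The overhead argument running through this thread can be sketched with a toy model. Every number below is hypothetical and only illustrates the mechanism, not either vendor's real per-call cost:

```python
# Toy model of driver overhead (all numbers hypothetical).
# A frame is ready when both the CPU (game logic plus driver work per draw
# call) and the GPU are done, so frame time = max(cpu_time, gpu_time).

def fps(draw_calls, per_call_us, game_cpu_ms, gpu_ms):
    cpu_ms = game_cpu_ms + draw_calls * per_call_us / 1000.0
    frame_ms = max(cpu_ms, gpu_ms)
    return 1000.0 / frame_ms

# Same GPU speed, different per-draw-call driver cost:
low_overhead  = fps(draw_calls=10_000, per_call_us=0.5, game_cpu_ms=5.0, gpu_ms=12.0)
high_overhead = fps(draw_calls=10_000, per_call_us=1.5, game_cpu_ms=5.0, gpu_ms=12.0)

print(round(low_overhead, 1))   # 83.3 -- GPU-bound, the overhead is hidden
print(round(high_overhead, 1))  # 50.0 -- CPU-bound, the overhead costs fps
```

This is also why the gap only shows up in draw-call-heavy or CPU-limited scenes: as long as the GPU is the slower side of the `max()`, extra driver overhead is invisible.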
Quote:


> Originally Posted by *mustrum*
> 
> While what you said used to be true, you should go ahead and install Catalyst 15.7.
> AMD fixed their DX11 draw call issues with that driver (it has been fixed for a while in the Windows 10 beta releases as well).


Not really fixed. Just improved, as you can see above. It's still not quite there yet.


----------



## sinholueiro

Hi! I've seen that the 300 series has improved tessellation performance with the 15.15 drivers. Has the 200 series matched that performance with the new 15.7?


----------



## MrKZ

Quote:


> Originally Posted by *4kallday*
> 
> From memory they looked about 1.5mm, but you're better off checking to be sure.


Yes, they are 1.5mm. Just measured them (also goodbye to the warranty screw sticker).
I saw that there are two standoffs between the PCB and the VRM heatsink. I'm wondering if I could fit a thinner thermal pad if I find some lower "ghetto style" standoffs. But will a thinner pad make a big difference over the 1.5mm one? (same brand, same thermal conductivity)


----------



## 4kallday

Quote:


> Originally Posted by *MrKZ*
> 
> Yes, they are 1.5mm. Just measured them (also goodbye to the warranty screw sticker).
> I saw that there are two standoffs between the PCB and the VRM heatsink. I'm wondering if I could fit a thinner thermal pad if I find some lower "ghetto style" standoffs. But will a thinner pad make a big difference over the 1.5mm one? (same brand, same thermal conductivity)


I think you're better off going with the same thickness just to be sure you get a proper contact so that heat is transferred correctly, but I can't be certain. I only knew the thickness from when I pulled my cards apart to paint them and change the thermal paste.


----------



## Blameless

Does anyone know of any modded firmware for the Sapphire R9 290X Tri-X New Edition, or is anyone willing to mod it?

This is a non-reference card with Samsung memory, and I'd like a PT1 equivalent for it (vdroop intact, but only a full speed 3D power state and no throttling/downclocking), with improved memory tables, if possible.
Quote:


> Originally Posted by *mus1mus*
> 
> Hello, sanity... check.
> 
> How can a GPU alleviate a CPU deficiency in games? If you have that notion, explain it. Prove it.
> 
> Or keep silent about it.


Gumbi is correct.

AMD's GPU drivers have more CPU overhead than NVIDIA's. AMD GPUs will therefore become CPU-limited on faster CPUs than NVIDIA GPUs will; an AMD card needs a faster CPU to avoid the bottleneck.

There are countless tests that show this to be the case. You can find them.
Quote:


> Originally Posted by *Scorpion49*
> 
> Nvidia has much lower driver overhead, especially with multiple cards. This has been proven over and over and over. Just because you don't understand doesn't mean he is wrong. Why do you think AMD developed Mantle? Here is just one example I found with 3 seconds of my time on this site, there are many more out there.


Yes.
Quote:


> Originally Posted by *DividebyZERO*
> 
> I guess I'm in the minority, but if the CPU is holding you back there are better ways of shifting the load than getting the fastest CPU.


If the CPU is holding you back, only a faster CPU will alleviate the issue without other trade-offs. Running higher image quality settings and loading the GPU more can shift the load, but it cannot improve performance.
Quote:


> Originally Posted by *mustrum*
> 
> While it was true what you said you should go ahead and install catalyst 15.7.
> AMD fixed their DX11 draw call issues with that driver (is fixed for a while in windows 10 beta releases as well).


I'm using 15.7, and it's essentially the same as the 15.15s, which have mild improvements over the 15.6s... nowhere near enough to close the gap in driver overhead with NVIDIA.


----------



## gatygun

Does anybody know of any benchmarks out there showing how Witcher 3 performs with 2x 290s compared to a 980 Ti / Titan X?

I tried to google all morning and can't find anything.


----------



## Ha-Nocri

The 15.7 drivers added CF support for Witcher 3. If it works as usual (70-90% scaling), no single card has a chance. So take 60 fps and add another 40-50.
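The scaling arithmetic works out like this (a back-of-the-envelope sketch; the 60 fps base and the 70-90% figures are just the ones from this post):

```python
# Rough CrossFire estimate: "70-90% scaling" means the second card adds
# 70-90% of a single card's frame rate on top (illustrative arithmetic only).

def crossfire_fps(single_fps, scaling):
    return single_fps * (1.0 + scaling)

low  = crossfire_fps(60, 0.70)   # second card adds ~42 fps
high = crossfire_fps(60, 0.90)   # second card adds ~54 fps
print(round(low), round(high))   # 102 114
```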


----------



## crislevin

I didn't bench my 295x2 with Witcher 3. I do know that it runs 40-55 without crossfire, and easily 60+ with crossfire (vsync is on, so I don't know the upper number).

However, there are some weird flickering issues with crossfire, at least in 15.6 beta, that are quite uncomfortable on the eyes. I read somewhere that 15.7 didn't fix this.

EDIT: Let me retract that. I just turned on crossfire again and it seems 15.7 did indeed fix the flickering issue in crossfire mode (the default mode; AFR still flickers), so it's now a smooth 60+ fps with no glitches. Great!


----------



## Ha-Nocri

I don't have VSR in CCC in Win 10, while in Win 7 it is there. Anyone else have this problem?


----------



## gatygun

Quote:


> Originally Posted by *crislevin*
> 
> I didn't bench my 295x2 with Witcher 3. I do know that it runs 40-55 without crossfire, and easily 60+ with crossfire (vsync is on, so I don't know the upper number).
> 
> However, there are some weird flickering issues with crossfire, at least in 15.6 beta, that are quite uncomfortable on the eyes. I read somewhere that 15.7 didn't fix this.


Aha, ty. Isn't the flickering happening because of AA being on? I read that somewhere. Disable that and the flickering should stop, I think.

I was wondering if 2x 290s would push about the same framerate with HairWorks on as a 980 Ti. But it's simply impossible to find anything that compares the two in that game, and any 295x2 benchmarks were all taken from before crossfire worked, so they're void.
Quote:


> Originally Posted by *Ha-Nocri*
> 
> The 15.7 drivers added CF support for Witcher 3. If it works as usual (70-90% scaling), no single card has a chance. So take 60 fps and add another 40-50.


Well, I found some random dude who had two of those cards and Witcher 3, but he struggled to maintain 50 fps. His cards were bottlenecked to hell and back, though, as they hovered mostly at 50-60% usage, so that doesn't give a good solid picture.


----------



## crislevin

Quote:


> Originally Posted by *gatygun*
> 
> Aha ty, isn't the flickering happening because of AA being on? readed that somewhere. Disable that and the flickering should stop i think.


Not sure it's the fault of AA; I have TXAA turned off in the ini file, as AMD instructed. The flickering still exists, but only in static UI (loading, character, quest, map, inventory, skill screens), where static elements keep blinking. It doesn't show up outside those screens though.

BTW, I've seen a lot of videos showing Witcher 3 doing 50-ish fps on a 980Ti/TitanX with HairWorks on too (1080p).


----------



## gatygun

Quote:


> Originally Posted by *Ha-Nocri*
> 
> The 15.7 drivers added CF support for Witcher 3. If it works as usual (70-90% scaling), no single card has a chance. So take 60 fps and add another 40-50.


Quote:


> Originally Posted by *crislevin*
> 
> Not sure it's the fault of AA; I have TXAA turned off in the ini file, as AMD instructed. The flickering still exists, but only in static UI (loading, character, quest, map, inventory, skill screens), where static elements keep blinking. It doesn't show up outside those screens though.
> 
> BTW, I've seen a lot of videos showing Witcher 3 doing 50-ish fps on a 980Ti/TitanX with HairWorks on too (1080p).


Yeah, I was wondering, because I see a lot of videos with a 7990 running at 90 fps, and a Titan X running at 44 fps in the big city. Yet another person pushes 80 fps with that same Titan in the city.

No clue what's going on anymore. I just hope I can get a stable 48 fps minimum on ultra with HairWorks, even if I have to reduce HairWorks AA to 8x. But who knows.


----------



## Scorpion49

So I just bought a SECOND Freesync monitor after sending the MG279Q back, and IT DOES NOT WORK (not the monitor, Freesync itself). What do I need to do to get smooth gameplay out of this technology? Gsync was so fantastically smooth; Freesync is just a giant stutter-fest. I can literally see the chop of every single frame when it's active, and if I shut it off and play with regular vsync my games feel so much better and smoother. I'm so beyond disappointed right now. I was sure the first monitor was just bad, but it seems to me ($1200 later) that it simply does not work as advertised.

Is the 290X not Freesync capable? I mean, I get the options in the menu and all.


----------



## sinnedone

Odd question, but does anyone have a dead/broken reference 290/290X that they'd be willing to take the I/O plate off of?

It has a sort of black chrome appearance. I'd be willing to pay a couple of dollars as well.


----------



## kaspar737

Hey, if I want to watercool an R9 290, would you recommend the NZXT Kraken G10 or the Corsair HG10 as an AIO bracket?


----------



## Scorpion49

The G10 is stupid imo. I have the HG10 and it's great. Just follow the instructions and don't torque the screws too hard or you'll bend the card.


----------



## sugarhell

Quote:


> Originally Posted by *Scorpion49*
> 
> So I just bought a SECOND Freesync monitor after sending the MG279Q back, and IT DOES NOT WORK (not the monitor, Freesync itself). What do I need to do to get smooth gameplay out of this technology? Gsync was so fantastically smooth; Freesync is just a giant stutter-fest. I can literally see the chop of every single frame when it's active, and if I shut it off and play with regular vsync my games feel so much better and smoother. I'm so beyond disappointed right now. I was sure the first monitor was just bad, but it seems to me ($1200 later) that it simply does not work as advertised.
> 
> Is the 290X not Freesync capable? I mean, I get the options in the menu and all.


My freesync just works out of the box. I don't know what you are doing wrong.


----------



## aaroc

Quote:


> Originally Posted by *Scorpion49*
> 
> So I just bought a SECOND Freesync monitor after sending the MG279Q back, and IT DOES NOT WORK (not the monitor, Freesync itself). What do I need to do to get smooth gameplay out of this technology? Gsync was so fantastically smooth; Freesync is just a giant stutter-fest. I can literally see the chop of every single frame when it's active, and if I shut it off and play with regular vsync my games feel so much better and smoother. I'm so beyond disappointed right now. I was sure the first monitor was just bad, but it seems to me ($1200 later) that it simply does not work as advertised.
> 
> Is the 290X not Freesync capable? I mean, I get the options in the menu and all.


From all the reviews I saw on YouTube, Gsync and Freesync monitors have low and high fps limits where they work; outside of those limits, or right at them, they are worse than with G/Freesync off. So you must tweak your settings to stay in the sync zone of your monitor. On the first Freesync monitors this zone was very small; on the newer ones it is bigger. Watch the latest Linus review where he compares G/Freesync with Vsync on or off; maybe their method is not accurate, but it shows some interesting results. I use 3 2560x1440 monitors for gaming, so upgrading to a G/Freesync monitor is still far away for me.
Quote:


> Originally Posted by *crislevin*
> 
> I didn't bench my 295x2 with witcher 3, I do know that it goes 40-55 without crossfire, and easily 60+ with crossfire (vsync is on, so don't know the upper number).
> 
> However, there are some weird flickering issues with crossfire at least in 15.6 beta that is quite uncomfortable on the eye, I read somewhere that 15.7 didn't fix this.
> 
> EDIT: let me retract that, just turned on crossfire again and it seems 15.7 did indeed fix the flickering issue on crossfire mode (default mode, AFR still flickers), so its now smooth 60fps+ and no glitches, great!


It depends on the game and Vsync. Some games do not flicker with Vsync on, and others flicker with Vsync on like there is no tomorrow. The smoothness while using CFX is a lot better in 15.7. Just compare the benchmark in Tomb Raider with everything on ultra (TressFX on): before 15.7 there were some frame drops that you could see or hear (coil whine); now the whole benchmark is smooth. The same in Dirt 3 and Dirt Showdown, all ultra (3x 2560x1440).


----------



## Scorpion49

Look, I know all about the limits. I'm not an idiot. I think comparing it to gsync is just a bad idea, and I should give up and live with the mediocre thing AMD put out.


----------



## rv8000

Quote:


> Originally Posted by *Scorpion49*
> 
> Look, I know all about the limits. I'm not an idiot. I think comparing it to gsync is just a bad idea, and I should give up and live with the mediocre thing AMD put out.


What games were giving you stuttering issues?


----------



## Scorpion49

Quote:


> Originally Posted by *rv8000*
> 
> What games were giving you stuttering issues?


Witcher 3, DAI, world of tanks, BF4, elite: dangerous, kingdoms of amalur, etc. I can't type all of the ones I've tried on my phone.


----------



## Hazardz

Quote:


> Originally Posted by *kaspar737*
> 
> Hey, if I want to watercool an R9 290, would you recommend the NZXT Kraken G10 or the Corsair HG10 as an AIO bracket?


I've used the G10 and Pulse Modding ones without issue. You can get more info on the G10 here.


----------



## rv8000

Quote:


> Originally Posted by *Scorpion49*
> 
> Witcher 3, DAI, world of tanks, BF4, elite: dangerous, kingdoms of amalur, etc. I can't type all of the ones I've tried on my phone.


The only game I've noticed stuttering in is TW3, and more so since patch 1.06; before that the game was much smoother, so I'm going to blame this one on the game or the drivers. Other than that, any dx10/11 game I've tried runs so smoothly it's almost eerie.


----------



## Scorpion49

Quote:


> Originally Posted by *rv8000*
> 
> The only game I've noticed stuttering in is TW3, and more so since patch 1.06; before that the game was much smoother, so I'm going to blame this one on the game or the drivers. Other than that, any dx10/11 game I've tried runs so smoothly it's almost eerie.


That's what I expected my experience to be, and it is as far from smooth as you can get without having a monitor that draws Braille. Three different rigs, two versions of Windows, and $1200 of monitors later, I'm really pissed.


----------



## sugarhell

Quote:


> Originally Posted by *Scorpion49*
> 
> That's what I expected my experience to be, and it is as far from smooth as you can get without having a monitor that draws Braille. Three different rigs, two versions of Windows, and $1200 of monitors later, I'm really pissed.


Do you run your games at fullscreen?


----------



## Scorpion49

Quote:


> Originally Posted by *sugarhell*
> 
> Do you run your games at fullscreen?


Yes.


----------



## elgreco14

Quote:


> Originally Posted by *Scorpion49*
> 
> That's what I expected my experience to be, and it is as far from smooth as you can get without having a monitor that draws Braille. Three different rigs, two versions of Windows, and $1200 of monitors later, I'm really pissed.


I am running a 295x2 (so CF) with my MG279Q and I never notice any stuttering in games like BF4. Everything runs perfectly. Try http://www.testufo.com/#test=frameskipping. Frame skipping could be the reason why every game feels like it's stuttering.


----------



## rv8000

Quote:


> Originally Posted by *Scorpion49*
> 
> Yes.


Oddly enough, I didn't try any games yet today, but after uninstalling my 7970 and dropping in my Fury, Freesync doesn't seem to be working. Just tried a driver reinstall and nothing. Very odd; I was having no issues with my Fury X on Tuesday with the same drivers.


----------



## FastEddieNYC

Quote:


> Originally Posted by *elgreco14*
> 
> I am running a 295x2 (so CF) with my MG279Q and I never notice any stuttering in games like BF4. Everything runs perfectly. Try http://www.testufo.com/#test=frameskipping. Frame skipping could be the reason why every game feels like it's stuttering.


Same here. I have no issues playing any of the current games. It seems that people either have crossfire working great or they don't. I've also seen cases where something running in the background takes up processor time, which results in stuttering that gets blamed on the video drivers or the card.


----------



## Scorpion49

Still trying to get Freesync working. Interestingly, with the 14.12 Omega drivers my monitor is correctly detected as a 10-bit panel with a 60hz maximum instead of 64hz, but I still get zero Freesync happening. I'm a few hours of working on this away from never buying an AMD product again as long as I live; two weeks straight of spending all of my leisure time (of which I have very little) troubleshooting this crap instead of playing the games I'd like to play is making me very, very annoyed with AMD as a whole.

It's a shame too; besides Freesync not working, this U28E590D is amazing.


----------



## Agent Smith1984

I still don't understand the advantage of G-Sync and Freesync over setting a frame target 1 FPS lower than the monitor's refresh rate.
Call me old-fashioned, I guess.


----------



## Forceman

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I still don't understand the advantage of G Sync and Freesync over setting a frame target 1FPS lower than the refresh rate on the monitor?
> Call me old fashioned I guess?


Because you can't always maintain a frame rate that is 1 FPS below refresh?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Forceman*
> 
> Because you can't always maintain a frame rate that is 1 FPS below refresh?


But even if it's lower, you won't have tearing, right?

I just don't see the need for the refresh rate to change with the framerate.
If the framerate never exceeds the refresh rate, you're good, right?

I thought everybody had gotten around the screen tearing thing a long time ago....
I don't follow the resolution/monitor-driven era of gaming so much though, since I use a big TV.

I'm actually picking up a 55" 4K TV today to use in the living room.

I'll be curious to see how 1440p 60FPS looks/plays against 4K 30FPS.


----------



## rv8000

Quote:


> Originally Posted by *Agent Smith1984*
> 
> But even if it's lower, you won't have tearing right?
> 
> I just don't see the need for the refresh to change with the framerate.
> If the framerate never exceeds the refresh rate, you're good right?
> 
> I thought everybody had gotten around the screen tearing thing a long time ago....
> I don't follow the resolution/monitor driven era of gaming so much though, since I use a big TV.
> 
> I'm actually picking up a 55" 4K TV today to use in the living room.
> 
> I'll be curious to see how 1440P 60FPS looks/plays against 4k 30FPS


With vsync disabled @ 144hz, in games that I can actually get to run at that high an FPS @ 1440p, I still get tearing with quick mouse movement (WITHOUT FREESYNC ENABLED). Forceman pretty much nailed the rest of it: you won't be able to keep that high fps across all of your games, certainly not in newer, more intensive games still to come, and the further (and lower) your fps falls from 120/144hz, the more tearing you will still get.
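The tearing point can be illustrated with a toy simulation; the frame times below are made up, but every one of them is under 60 fps and every buffer flip still lands mid-scanout:

```python
# Toy simulation (hypothetical frame times): with vsync off, a buffer flip
# that lands partway through a 60 Hz scanout splits the screen between two
# frames (a tear), even though every frame rate here is below 60 fps.

refresh_ms = 1000.0 / 60.0                        # one scanout every ~16.7 ms
frame_times_ms = [20.0, 22.0, 19.0, 25.0, 21.0]   # 40-53 fps, all under 60 Hz

tears = 0
flip_time = 0.0
for ft in frame_times_ms:
    flip_time += ft
    offset = flip_time % refresh_ms               # position within the scanout
    if 0.0 < offset < refresh_ms:                 # flip mid-scanout -> tear
        tears += 1

print(tears)  # 5 -- every flip tears; only vsync or adaptive sync aligns them
```

Capping fps below the refresh rate changes where the flips land, but without vsync or adaptive sync nothing forces them onto refresh boundaries, which is why a frame cap alone doesn't remove tearing.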

Quote:


> Originally Posted by *Scorpion49*
> 
> Still trying to get Freesync working. Interestingly, with the 14.12 Omega drivers my monitor is correctly detected as a 10-bit panel with a 60hz maximum instead of 64hz, but I still get zero Freesync happening. I'm a few hours of working on this away from never buying an AMD product again as long as I live; two weeks straight of spending all of my leisure time (of which I have very little) troubleshooting this crap instead of playing the games I'd like to play is making me very, very annoyed with AMD as a whole.
> 
> It's a shame too; besides Freesync not working, this U28E590D is amazing.


Have you been using AB and Rivatuner for fps monitoring/limiting? It actually stops freesync from functioning properly; the second I switched to FRC in CCC, freesync was working properly again and I had no tearing in Warframe or TW3. Obviously not optimal in its current form, because FRC has a stupid limit of 95 fps (at least for users running 120/144hz freesync monitors).


----------



## boredmug

Quote:


> Originally Posted by *Agent Smith1984*
> 
> But even if it's lower, you won't have tearing right?
> I just don't see the need for the refresh to change with the framerate.
> If the framerate never exceeds the refresh rate, you're good right?
> 
> I thought everybody had gotten around the screen tearing thing a long time ago....
> I don't follow the resolution/monitor driven era of gaming so much though, since I use a big TV.
> 
> I'm actually picking up a 55" 4K TV today to use in the living room.
> 
> I'll be curious to see how 1440P 60FPS looks/plays against 4k 30FPS


I couldn't game at 30fps, but I do have eyefinity at 60hz. Screen tearing doesn't bother me that much and it usually isn't hugely evident to me.


----------



## Scorpion49

Quote:


> Originally Posted by *rv8000*
> 
> Have you been using AB and RivaTuner for fps monitoring/limiting? RivaTuner actually stops Freesync from functioning properly; the second I switched to FRTC in CCC, Freesync was working properly again and I had no tearing in Warframe or TW3. Obviously not optimal in its current form, because FRTC has a stupid limit of 95fps (at least for users running 120/144Hz Freesync monitors).


Not currently, I uninstalled it just to be sure it wasn't doing anything. I've been using DA:I with the in-game frostbite engine frame counter right now. I can't believe I've wasted two weeks of my life on this crap, I should have just kept the Gsync panel.


----------



## rv8000

Quote:


> Originally Posted by *Scorpion49*
> 
> Not currently, I uninstalled it just to be sure it wasn't doing anything. I've been using DA:I with the in-game frostbite engine frame counter right now. I can't believe I've wasted two weeks of my life on this crap, I should have just kept the Gsync panel.


Do you have anything in your monitor's OSD, such as an information tab, showing that Freesync is enabled, just to ensure it's actually active? On the XL2730Z there is an information tab that shows the status of Freesync.


----------



## Scorpion49

Quote:


> Originally Posted by *rv8000*
> 
> Do you have anything in your monitor's OSD, such as an information tab, showing that Freesync is enabled, just to ensure it's actually active? On the XL2730Z there is an information tab that shows the status of Freesync.


Yes, it shows 3840x2160 60Hz Freesync. I can't get the option in CCC until I turn it on in the OSD; it simply doesn't exist otherwise. When I turn it on in the OSD, a box pops up saying I have connected a Freesync display.


----------



## sugarhell

Quote:


> Originally Posted by *Scorpion49*
> 
> Yes, it shows 3840x2160 60hz Freesync. I can't get the option in CCC until I turn it on in the OSD, it simply doesn't exist. When I put it on in the OSD the box pops saying I have connected a Freesync display.


Some monitors need to do that. You enable it from the OSD and then CCC pops up an option. Then it just works, at least for me.


----------



## gatygun

Well, I finally freaking did it. It took me weeks of tweaking and overclocking, but I finally took the #1 spot on 3DMark for my system setup.

13220 GPU score













yay


----------



## Bartouille

AMD cards seem to have started shipping with Samsung memory. I built a PC for a friend and the 290 had Samsung; I was surprised, because I thought only the Lightning cards used Samsung. Now I've bought two 280s, and the first card I tried also has Samsung. I wonder how well they OC vs Hynix.


----------



## rt123

Quote:


> Originally Posted by *Bartouille*
> 
> AMD seems to have started using Samsung memory on their cards. I built a pc for a friend and the 290 had samsung, I was surprised because I thought only lightning card used samsung. Now I bought two 280 and the first card I try also has samsung. I wonder how well they oc vs hynix.


The AIB partners pick the RAM, not AMD.


----------



## Bartouille

Quote:


> Originally Posted by *rt123*
> 
> The AIB partners pick the RAM, not AMD.


Well, the 290 was a Sapphire and this 280 is a HIS... maybe all the AIBs are starting to use Samsung, then.


----------



## Hazardz

Quote:


> Originally Posted by *Bartouille*
> 
> AMD seems to have started using Samsung memory on their cards. I built a pc for a friend and the 290 had samsung, I was surprised because I thought only lightning card used samsung. Now I bought two 280 and the first card I try also has samsung. I wonder how well they oc vs hynix.


I got an R9 290 Turbo Duo RMA back from Powercolor that uses Samsung memory.


----------



## Blameless

Quote:


> Originally Posted by *Agent Smith1984*
> 
> But even if it's lower, you won't have tearing right?


You have tearing any time you don't have the frame rate synced to the refresh rate, regardless of frame rate.

With standard vsync, any time your frame rate falls below the refresh rate, you are getting duplicated frames, and inconsistent frame times, somewhere.

This is why variable refresh rate solutions are favored by some. You get the same consistency and lack of tearing as ideal vsync, without having to maintain quite as high of a frame rate, and with potentially less latency.
Quote:


> Originally Posted by *Hazardz*
> 
> I got an R9 290 Turbo Duo RMA back from Powercolor that uses Samsung memory.


My Sapphire 290X Tri-X new edition uses Samsung.

Seems better than Elpida, but worse than Hynix, overall. Though I don't have many samples to compare.


----------



## Ized

Quote:


> Originally Posted by *Blameless*
> 
> You have tearing any time you don't have the frame rate synced to the refresh rate, regardless of frame rate.
> .


You make that sound like there are no exceptions? For a couple of years my (working!) workaround for tearing has been to set my FPS limiter to about 1 FPS below my refresh rate and, of course, disable v-sync.


----------



## Hazardz

Quote:


> Originally Posted by *Blameless*
> 
> My Sapphire 290X Tri-X new edition uses Samsung.
> 
> Seems better than Elpida, but worse than Hynix, overall. Though I don't have many samples to compare.


My wife's R9 290 Turbo Duo has Hynix. I may have to test it some time.


----------



## Blameless

Quote:


> Originally Posted by *Ized*
> 
> You make that sound like there are no exceptions?


That's because there aren't.

The odds of a non-synchronized frame falling exactly on the first horizontal refresh, continuing exactly to the last, and then repeating this process are infinitesimally small.

Unless some mechanism is deliberately constraining the frame rate to precisely the display's refresh rate, or vice versa, they will never be perfectly synchronized over any appreciable amount of time, and will thus always produce some degree of tearing.

You may not notice this tearing, but some degree of it is there, and many people will notice some degree of tearing without synchronization.
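A toy model makes this concrete. If frames are presented at a steady rate that differs from the refresh rate, each frame lands at a slightly different phase of the scan-out, so the tear line creeps across the screen instead of staying put. A minimal sketch (hypothetical numbers: a 1080-line panel at 60Hz with frames capped at 59fps; illustrative only, not measured from any real setup):

```python
# Toy model of tearing without sync: a 1080-line panel refreshing at
# 60 Hz, frames presented at a steady 59 fps (hypothetical numbers).
# Each frame arrives at a different phase of the scan-out, so the tear
# line drifts instead of staying in one place.
REFRESH_HZ = 60
FPS = 59
LINES = 1080

def tear_lines(n_frames):
    """Scanline at which each new frame is presented mid-refresh."""
    out = []
    for i in range(1, n_frames + 1):
        t = i / FPS                      # presentation time of frame i
        phase = (t * REFRESH_HZ) % 1.0   # fraction of the current refresh
        out.append(int(phase * LINES))
    return out

print(tear_lines(5))   # tear line drifts ~18 scanlines per frame
```

With these numbers each successive frame tears roughly 18 scanlines lower than the last, which works out to a tear line that slowly marches down the screen about once per second — the "creeping" tear people report near, but not at, the refresh rate.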


----------



## FastEddieNYC

Quote:


> Originally Posted by *Blameless*
> 
> My Sapphire 290X Tri-X new edition uses Samsung.
> 
> Seems better than Elpida, but worse than Hynix, overall. Though I don't have many samples to compare.


My Sapphire Tri-X new edition also has Samsung memory. It clocks as well as my XFX card with Hynix (1625).
I discovered that the PCB used on the Sapphire is the same as the 390X's; only the memory size is different.


----------



## Arizonian

Quote:


> Originally Posted by *AMDMAXX*
> 
> I figured I'd post this in here too... I posted this in the O/C thread... There's a few more pictures over there though...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I had a fun time putting this machine back together...
> 
> Seems with no VDDC added I'm getting 1100 on both cores and 1350 on memory.. anything above that seems to give me artifacts...
> 
> One card is a 290 the other is a 290x...
> 
> One is hynix (non-X) ram and the other elpida (x)...
> 
> http://www.3dmark.com/3dm/7053383?
> 
> Temps in Valley are pretty low... around 45-46c after 3-4 hours of looping... 55c with the case closed up...
> 
> I was appalled at the stock TIM job... horrible....


Congrats - updated








Quote:


> Originally Posted by *boredmug*
> 
> Very happy with my two blocked 290x's. Picked these guys up used with blocks already installed for $500 recently.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## By-Tor

So SEXY....

Installed my second 290x LCS card tonight..


----------



## sinnedone

Quote:


> Originally Posted by *By-Tor*
> 
> So SEXY....
> 
> Installed my second 290x LCS card tonight..
> 
> 
> Spoiler: Warning: Spoiler!


Definitely sexy.


----------



## By-Tor

My first Fire Strike run with the pair together.

Not sure if this is any good or not...


----------



## kizwan

Quote:


> Originally Posted by *Blameless*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ized*
> 
> You make that sound like there are no exceptions?
> 
> 
> 
> That's because there aren't.
> 
> The odds of a non-synchronized frame falling exactly on the first horizontal refresh, continuing exactly to the last, and then repeating this process are infinitesimally small.
> 
> Unless some mechanism is deliberately constraining the frame rate to precisely the display's refresh rate, or vice versa, they will never be perfectly synchronized over any appreciable amount of time, and will thus always produce some degree of tearing.
> 
> You may not notice this tearing, but some degree of it is there, and many people will notice some degree of tearing without synchronization.

So tearing can also happen without the user noticing it? 99% of the time I don't notice any screen tearing. Is this because I'm playing @ 1440p or 4K?

Quote:


> Originally Posted by *By-Tor*
> 
> My first Fire Strike run with the pair together.
> 
> Not sure if this is any good or not...


I'd say that looks like the cards are throttling. The score is too low, IMO.

1100/1350 - http://www.3dmark.com/fs/4711009
1200/1500 - http://www.3dmark.com/fs/2825429


----------



## By-Tor

Quote:


> Originally Posted by *kizwan*
> 
> I'd say that looks like the cards are throttling. The score is too low, IMO.
> 
> 1100/1350 - http://www.3dmark.com/fs/4711009
> 1200/1500 - http://www.3dmark.com/fs/2825429


Suggestions?


----------



## Hazardz

Quote:


> Originally Posted by *By-Tor*
> 
> Suggestions?


Under water, I can't imagine them being throttled. Weird.


----------



## By-Tor

Was able to bring it up a little...


----------



## kizwan

Quote:


> Originally Posted by *By-Tor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I'd say that looks like the cards are throttling. The score is too low, IMO.
> 
> 1100/1350 - http://www.3dmark.com/fs/4711009
> 1200/1500 - http://www.3dmark.com/fs/2825429
> 
> 
> 
> Suggestions?

Having ULPS enabled can prevent the cards from overvolting properly when overclocking, causing throttling.
Quote:


> Originally Posted by *Hazardz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *By-Tor*
> 
> Suggestions?
> 
> 
> 
> Under water, I can't imagine them being throttled. Weird.

Throttling isn't necessarily caused by overheating.


----------



## Blameless

Quote:


> Originally Posted by *kizwan*
> 
> So tearing also can happen without the user noticing it?


Tearing is when parts of different frames are on screen at the same time, because the display has started drawing another frame mid-refresh.

If the scenes in these frames are similar enough, or the dividing line falls in a less noticeable area, it's perfectly possible not to see the tear.


----------



## Ized

Quote:


> Originally Posted by *Blameless*
> 
> That's because there aren't.
> 
> The odds of a non-synchronized frame falling exactly on the first horizontal refresh, continuing exactly to the last, and then repeating this process are infinitesimally small.
> 
> Unless some mechanism is deliberately constraining the frame rate to precisely the display's refresh rate, or vice versa, they will never be perfectly synchronized over any appreciable amount of time, and will thus always produce some degree of tearing.
> 
> *You may not notice this tearing*, but some degree of it is there, and many people will notice some degree of tearing without synchronization.


Let's try and be realistic, though.

If my method gets me from "holy crap, this is unplayable" (which seems to be more common than not these days) all the way down to "wow, I can't even notice the tearing that Blameless assures us is *always* there", then does it matter?

No, I think not. Thankfully I don't play technicalities.

A frame limiter at 1fps under the screen refresh rate has always gotten the job done just fine for me.


----------



## Blameless

Quote:


> Originally Posted by *Ized*
> 
> Frame limiter at 1fps under screen refresh rate always got the job done just fine for me.


And for me, any frame rate almost, but not quite, identical to the refresh rate usually results in a very definite and annoying tear that creeps up and down the screen.

The only time I don't notice tearing in fast-paced games is when it's not there (meaning vsync), when the scene is static, or when my frame rate is absurdly high (several hundred or more frames per second).

If you don't notice tearing when capped at 59fps, you probably wouldn't notice it at any fixed fps anywhere near the refresh rate. Many other people, probably a majority, will... if they have any idea what they're looking at.


----------



## Ized

Quote:


> Originally Posted by *Blameless*
> 
> And for me any frame rate almost, but not quite, identical to refresh rate usually results in a very definite and annoying tear that creeps up and down the screen.
> 
> The only time I don't notice tearing in fast paced games is either when it's not there (meaning vsync), when the scene is static, or when my frame rate is absurdly high (several hundred or more frame per second).
> 
> If you don't notice tearing when capped at 59fps, you probably wouldn't notice it at any fixed fps anywhere near refresh rate. Many other people, probably a majority, will...if they have any idea what they are looking at.










Ok buddy.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Blameless*
> 
> And for me any frame rate almost, but not quite, identical to refresh rate usually results in a very definite and annoying tear that creeps up and down the screen.
> 
> The only time I don't notice tearing in fast paced games is either when it's not there (meaning vsync), when the scene is static, or when my frame rate is absurdly high (several hundred or more frame per second).
> 
> If you don't notice tearing when capped at 59fps, you probably wouldn't notice it at any fixed fps anywhere near refresh rate. Many other people, probably a majority, will...if they have any idea what they are looking at.


I get tearing in some games (looking at you, GTA 5 and Witcher 3) with vsync on or off... can't seem to find settings that alleviate this... still gaming at 1080p @ 60Hz... any suggestions?


----------



## sugarhell

He is right. If your fps isn't the same as the refresh rate, you get tearing. It doesn't matter whether you can see it; you still get tearing. Now, some engines can prevent heavy tearing by manipulating frame latency. With variable refresh rate + vsync you never get tearing, because the refresh rate always follows your fps.


----------



## Ized

Quote:


> Originally Posted by *sugarhell*
> 
> He is right. If your fps are not the same as the refresh rate you get tearing. *It doesnt matter if you cant see it* you get tearing. Now some engines can prevent huge tearing by manipulating the frames latency. With Variable refresh rate + vsync you never get tearing because the refresh rate now always follows your fps.


I never said he was wrong.

But if you can't see the tearing, please explain what the actual issue is?

How much tearing are we talking about here?

So little that it can't be seen... how do you measure that? Freeze frames and a microscope?


----------



## sugarhell

Quote:


> Originally Posted by *Ized*
> 
> I never said he was wrong.
> 
> But if you can't see the tearing, please explain what the actual issue is?
> 
> How much tearing are we talking about here?
> 
> So little that it can't be seen... How do you measure that? Freeze frames and a microscope?


Neither I nor he said it's an issue if you can't see it. Some people are more sensitive to tearing or stutter; you can't help that.

Also, higher fps can hide heavy tearing.

The pros of variable refresh rates are the elimination of tearing and of the jitter you get when the refresh rate and the fps are not synced. Some people aren't sensitive enough for these problems to affect them, so it's fine for them.
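The judder half of this can be sketched the same way. With vsync on a fixed-refresh display, a frame that finishes mid-interval has to wait for the next refresh tick, so some frames stay on screen twice as long as others; with a variable refresh rate the panel refreshes when each frame arrives, so every frame is held the same length of time. A rough model (hypothetical numbers: a steady 48fps stream on a 60Hz panel; exact fractions are used to avoid float rounding):

```python
# Toy model of vsync judder: frames rendered at a steady 48 fps shown
# on a fixed 60 Hz display. With vsync, a frame completed mid-interval
# waits for the next refresh tick before it appears.
from fractions import Fraction

REFRESH = Fraction(1, 60)   # fixed 60 Hz refresh interval
FRAME = Fraction(1, 48)     # steady 48 fps render interval

def vsync_ticks(n_frames):
    """Refresh tick on which each frame first appears with vsync on."""
    # -(-x // y) is ceiling division on exact Fractions
    return [-(-(i * FRAME) // REFRESH) for i in range(1, n_frames + 1)]

def hold_times(n_frames):
    """How many refresh intervals each displayed frame stays on screen."""
    t = vsync_ticks(n_frames)
    return [b - a for a, b in zip(t, t[1:])]

print(hold_times(6))
```

Under vsync the holds mix one- and two-tick intervals (some frames are displayed twice as long as their neighbours), which is the stutter being described; under VRR every frame would be held exactly 1/48 s instead.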


----------



## Scorpion49

Quote:


> Originally Posted by *Ized*
> 
> I never said he was wrong.
> 
> But if you can't see the tearing, please explain what the actual issue is?
> 
> How much tearing are we talking about here?
> 
> So little that it can't be seen... How do you measure that? Freeze frames and a microscope?


Do you use an SSD? Can you tell the difference between a PC with one and without? I sure can; it's night and day. Similarly, I didn't think VRR was going to be as big of a deal *until I used it*. Now I can't go back; I see EVERYTHING I used to ignore because I was conditioned not to pay it any mind. You can talk smack all you want, but go find someone or somewhere that has a working Gsync/Freesync monitor and be amazed.

Someone else in one of the threads put it aptly: it is so smooth it's eerie. Whatever you thought smooth was before, even locked at vsync, no longer applies.


----------



## Ized

I simply stated that there is a fairly feasible and easy workaround for the worst of the tearing I have come across, and you guys decided: nope, impossible! It will produce tearing that you simply are unable to see!









From unplayable to playable, I said... ya know, what people need in the meantime, until the adaptive technologies are more within reach.

At no point did I say it made things go from unplayable to perfect, therefore making FreeSync useless.

I'm sure FreeSync and G-Sync are bloody magical, but again, I never disputed that, and since most people don't have them, we make do with what we can.

You guys are awfully good at making something out of nothing.


----------



## moorhen2

Firestrike score from me.









http://www.3dmark.com/3dm/7806217


----------



## mfknjadagr8

Quote:


> Originally Posted by *moorhen2*
> 
> Firestrike score from me.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/7806217


nice one...


----------



## gatygun

Quote:


> Originally Posted by *moorhen2*
> 
> Firestrike score from me.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/7806217


Nice, dat cpu tho.


----------



## moorhen2

Quote:


> Originally Posted by *mfknjadagr8*
> 
> nice one...


Thanks


----------



## moorhen2

Quote:


> Originally Posted by *gatygun*
> 
> Nice, dat cpu tho.


????


----------



## Scorpion49

Okay, so I just picked up a second 290X. Now I'm doing a full fresh Windows install, and I'll try Freesync again with the new card, hoping it works.


----------



## Gualichu04

Finally made a better score with my crossfire R9 290Xs, stock BIOS on XFX cards: http://www.3dmark.com/3dm/7750324? 26K GPU score.







The old score, with the 14.9 drivers, is here: http://www.3dmark.com/3dm/4335845?


----------



## Scorpion49

Here goes mine in crossfire now, everything stock defaults: http://www.3dmark.com/3dm/7812165

Not too shabby for x8/x4 crossfire (the x4 is because of my m.2 drive, lol). Full system draw maxed out at 604W according to my UPS.


----------



## mfknjadagr8

So an interesting thing happened tonight... I downloaded the latest patch for Witcher 3 (1.07) and started playing... performance was better and the improvements were nice... but during the second flashback to Ciri, I encountered a blue screen of death on a driver crash (infinite loop)... it automatically restarted, but Windows would not boot; it would flash a blue screen when trying to load the login screen... so I rebooted into safe mode and uninstalled the driver. I'm not going to reinstall tonight, but I had put in probably 20 hours and didn't have a blue screen until this latest patch. I was also using the latest AMD driver, 15.7... hopefully this is an isolated incident (temps were under 60C when the crash occurred... CPU was at 55... VRMs were under 65).


----------



## Gualichu04

Try using Display Driver Uninstaller in safe mode and reinstall the drivers. I have no issues on the latest patch. Report back how it goes.


----------



## Scorpion49

So I finally have Freesync working now. I had to go buy a Z97 board and do another fresh install of Windows, and now it seems to be working with both a single card and crossfire in DX11 games.


----------



## colorfuel

Quote:


> Originally Posted by *FastEddieNYC*
> 
> My Sapphire Tri-X new edition also has Samsung memory. It clocks as good as my XFX card with Hynix.(1625).
> I discovered that the PCB used on the Sapphire is the same as the 390X. Only the memory size is different.


I have the same card with Samsung memory, and all I got was 1156/1500 stable with +100mV on air. The memory is really not a good overclocker here; raising the AUX voltage in Afterburner didn't help at all.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Gualichu04*
> 
> Try using Display driver uninstaller in safemode and reinstall the drivers. I have no issues on the latest patch. Report back how it goes.


Yeah, ironically it hasn't even been a week since I did that to upgrade to 15.7, so I've already used DDU. I just need to disconnect the second card and reinstall, but that will be after work today...


----------



## battleaxe

Quote:


> Originally Posted by *Scorpion49*
> 
> So I finally have Freesync working now, I had to go buy a Z97 board and do another fresh install of Windows and now it seems to be working both single card and crossfire in DX11 games.


What was the issue with the old board? And why did you think to replace it?


----------



## yawa

Uh, wow. 15.7 made quite a difference for me...

http://www.3dmark.com/3dm/7816706?



14191 graphics score. Also, no artifacts at clocks that used to give me artifacts (which was anything over 1218MHz).

Has anyone else rebenched everything and tested their clocking limits since 15.7? The highest graphics score I ever got before this was 13912 at 1281MHz, which I just beat handily at 1225MHz.


----------



## mus1mus

Quote:


> Originally Posted by *yawa*
> 
> Uh, wow. 15.7 Made quite a difference for me...
> 
> http://www.3dmark.com/3dm/7816706?
> 
> 
> 
> 14191 Graphics score. Also no artifacts at clocks that used to give me artifacts (which was anything over 1218Mhz)
> 
> Anyone else rebench everything and test out their clocking limits since 15.7? Highest Graphics score I ever got before this was 13912 at 1281Mhz, which I just beat handily at 1225Mhz.


Taking the top spot on 290X users on Top 30 FS?


----------



## yawa

Quote:


> Originally Posted by *mus1mus*
> 
> Taking the top spot on 290X users on Top 30 FS?


Well, if I did, it would be because no one with an AMD card posts over there anymore.









But no, I think I probably could. I'll have to see if I can get to 1300MHz now, though (my limit was 1281 at +200mV with heavy artifacts before), and screenshot every attempt. Temps are not an issue for me with the AC on atm (waterblocked, tops out at 45C in benches), so I might give it a good shot, though honestly I wish there were a way to go +300mV without getting into that scary multiplier stuff.


----------



## mus1mus

Quote:


> Originally Posted by *yawa*
> 
> Well if I did it would be because no one with an AMD card Posts over there anymore.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But no, I think I probably could, I'll have to see if I can get to 1300Mhz now though (my limit was 1281 at +200mv with heavy artifacts before), and screenshot every attempt. Temps are not an issue with an AC on for me atm (waterblocked, tops out at 45C in benches), so I might give it a good shot though honestly I wish there was a way to go +300mv without getting into that scary multiplier stuff.


14K graphics on a 290 is waiting for you here, so give it a shot.

And yes, you can go +300mV using the MSI AB volt hack. Google it.

But you might run into trouble with the screen blanking in and out. The test should still complete and give a nice score, though, if you can find a cure for that.


----------



## fyzzz

http://www.3dmark.com/fs/5448903 - this will do for now. I'm waiting for my watercooling parts, then I will try higher.


----------



## Scorpion49

Quote:


> Originally Posted by *battleaxe*
> 
> What was the issue with the old board? And why did you think to replace it?


Well, two things: the NIC died (this was on the Crossblade Ranger I was using; first time I've ever had a network port die), and I wanted to use my Ultra M.2 drive, which I couldn't do without an adapter. So I picked up a 4690K with an ASRock Z97 Extreme6. I think it was all a driver issue and it just so happened to install correctly this time around.


----------



## Scorpion49

Quote:


> Originally Posted by *yawa*
> 
> Well if I did it would be because no one with an AMD card Posts over there anymore.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But no, I think I probably could, I'll have to see if I can get to 1300Mhz now though (my limit was 1281 at +200mv with heavy artifacts before), and screenshot every attempt. Temps are not an issue with an AC on for me atm (waterblocked, tops out at 45C in benches), so I might give it a good shot though honestly I wish there was a way to go +300mv without getting into that scary multiplier stuff.


I see you're in MA, did you get the refurb diamond 290X from MC?


----------



## yawa

Quote:


> Originally Posted by *mus1mus*
> 
> 14K Graphics on a 290 waiting for you here. So give it a shot.
> 
> And yes, you can go 300mV using MSI AB Volt Hack. Google it.
> 
> But you might get into a trouble of screen blanking in and out. But the test should complete and get a nice score. If you can find a cure for that though.


Come and bench, my friend! Too many of us aren't bothering anymore over there.

I used that hack once back in the day, but I'll try it again. AB was always wonky with this card; I would always get higher scores with Trixx, even with just +100mV. It was weird, like the clocks and voltage just wouldn't stick.

Quote:


> Originally Posted by *Scorpion49*
> 
> I see you're in MA, did you get the refurb diamond 290X from MC?


I got one of the new ones a few months after launch, flashed the ASUS voltage BIOS, put it under water and never looked back. Really underrated card; even on the stock cooler it topped out at 80C. I was surprised at the build quality, to say the least.


----------



## Scorpion49

Quote:


> Originally Posted by *yawa*
> 
> I got one of the new ones a few months after launch, flashed the ASUS voltage bios, put it under water and never looked back. Really underrated card, even on stock cooler it topped out at 80C. I was surprised to say the least.


Ah, was just wondering. Both of mine were the $239 refurbs they have in stock in Cambridge right now. Kind of hard to pass up 290X CF for less than the price of one Fury.


----------



## BradleyW

Witcher 3 1.07

AMD CFX ruined

http://www.overclock.net/t/1565638/witcher-3-1-07-patch-amd-crossfire-performance-warning


----------



## battleaxe

Quote:


> Originally Posted by *fyzzz*
> 
> http://www.3dmark.com/fs/5448903 this will do for now. I am waiting for my watercooling parts, then i will try higher.


What clocks at this score? This puts it ahead of the 970 now.


----------



## Scorpion49

Quote:


> Originally Posted by *BradleyW*
> 
> Witcher 3 1.07
> 
> AMD CFX ruined
> 
> http://www.overclock.net/t/1565638/witcher-3-1-07-patch-amd-crossfire-performance-warning


Looks like I'm going to be waiting forever to play this game.


----------



## BradleyW

Quote:


> Originally Posted by *Scorpion49*
> 
> Looks like I'm going to be waiting forever to play this game.


I'm just going to stick with 1.06; 1.07 is a waste of time. I can't see a difference except pop-in and lower fps.


----------



## mus1mus

Quote:


> Originally Posted by *yawa*
> 
> Come and bench my friend! Too many of us not bothering anymore over there.
> 
> I used that hack once back in the day but l'll try it again. AB was always wonky with this card, I would always get higher scores with Trixx, even with just +100mv. It was weird. It was like the clocks and voltage just wouldnt stick.


http://www.3dmark.com/fs/5389526
Still working on my system's physics score. That should be around 18K, and the total might then reach 13K.


----------



## FastEddieNYC

Quote:


> Originally Posted by *BradleyW*
> 
> Witcher 3 1.07
> 
> AMD CFX ruined
> 
> http://www.overclock.net/t/1565638/witcher-3-1-07-patch-amd-crossfire-performance-warning


I was going to update to 1.07 later today but I may wait to see if more people report crossfire issues.


----------



## fyzzz

http://www.3dmark.com/fs/5450136 pretty happy with this one


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> http://www.3dmark.com/fs/5450136 pretty happy with this one


You.....

Pretty noice man!


----------



## DividebyZERO

Quote:


> Originally Posted by *fyzzz*
> 
> http://www.3dmark.com/fs/5450136 pretty happy with this one


Quote:


> Originally Posted by *mus1mus*
> 
> You.....
> 
> Pretty noice man!


I can't tell if he's using a 390/390X or just a 390 BIOS, because it's reporting "Pineview Industries Ltd.", which is what I got with a 3xx BIOS myself.


----------



## Scorpion49

Quote:


> Originally Posted by *DividebyZERO*
> 
> I can't tell if he's using a 390/390X or just a 390 BIOS, because it's reporting "Pineview Industries Ltd.", which is what I got with a 3xx BIOS myself.


Thats XFX.


----------



## DividebyZERO

Quote:


> Originally Posted by *Scorpion49*
> 
> Thats XFX.


Double derp for me. I wasn't aware of the name being XFX, and I was getting the 2xx owners thread and 3xx owners thread mixed up, lol.


----------



## roanlinde

Guys,

I require support and a shoulder to weep on while I drink and dream of the days when my card used to work.

Short version:
Apologies if this is in the wrong section







but I am an owner at least.
I got a second-hand PowerColor reference 290.
Had some black screen issues (like so many of our brethren).
Contacted PowerColor, who sent me a new BIOS; flashed it, and that sorted out most of the issues.
Still had to underclock the card a wee bit, but all in all a win.
The card was rock solid through the entire storyline of GTA V (albeit loud as a banshee).

Then the card turned on me. It started with small little taunts filled with nothing but malice.
The card wouldn't wake the monitor up after sleep. The fans would go balls to the wall every
now and then, followed by the famous "Graphics adapter has stopped working but recovered".
She grew stronger with each passing day, more terrifying, fuelled by hate.
Now all I get from her is a "VGA not detected" beep sequence. My old GTX 460 is currently carrying her on
his tiny little shoulders.
If I boot into DOS to run ATIFlash, she is not even detected ("Adapter not found"). Her cooling equipment is working, and the
top-side LED flashes intermittently, but she's essentially just a very warm and power-hungry paperweight.
And switching between the two BIOS chips makes absolutely no difference.

Can anyone point me in the direction of some creepy hardware hacker who knows more about the flash chips on these cards? I think my last option would be to revert to the 1+8/3+8 BIOS hack, but I cannot find a single relevant datasheet for the BIOS chips (25Q10T). As far as I know, the write-protect pin needs to be HIGH to be in the disabled state, but I don't know which pin that is. Some folks have suggested the BIOS chip is essentially a Pm25LD512/010/020, where the write-protect pin is pin 3 (ref: http://www.eevblog.com/forum/repair/help-fixing-r9-290-graphics-card/10/?wap2).

PC SPECS

MB: Asus P7P55D-E
CPU: i5 750 (no clocks)
RAM: 8GB Corsair XMS3 1333
SSD: Transcend 370 128GB
PSU: Antec 900W (80+)

Anyone with grand ideas and/or recommendations would be prayed for and loved forever.
Regards


----------



## kizwan

I believe your card is kaput. It's not a BIOS problem, because if it were, you would have had this problem from day one.


----------



## roanlinde

Hey hey hey, don't you be mean to my card








Honestly, I kind of suspect the same thing, although I would go to world's end for that little minx. So unless there is smoke coming from her, I'll resuscitate her senseless.


----------



## Gumbi

Quote:


> Originally Posted by *roanlinde*
> 
> Hey hey hey, don't you be mean to my card
> 
> 
> 
> 
> 
> 
> 
> 
> Honestly, I kind of suspect the same thing, although I will go to world's end for that little minx. So unless there is smoke coming from her, I'll resuscitate her senseless


Repasting the GPU can help a lot with reference cards too, btw, if you are having problems with loudness (it helps with temps a lot).

Stock setups are usually way over-pasted.


----------



## MrKZ

Ok, so I got the thermal pad and I have a big problem. I ordered a 1.5mm pad and they sent me 1.0mm, and there is low to no contact between the heatsink and the VRM chips. I asked them for a replacement (since it's their fault) and they said no, because I had already cut the pad (I didn't check the size; I was confident it was what I ordered,







and I only noticed the wrong thickness when I was screwing the heatsink back on).

So is there anything I can do apart from buying new pads of the right thickness? I was thinking about stacking two layers of the 1mm ones, but that comes to 2mm and the screw won't reach its hole







. I also tried without the plastic stands that keep the heatsink at height, but when I screw the heatsink down it tilts at the front (only about half of the heatsink has VRMs under it; the other half only has plain PCB under it), and it only makes contact with a portion of the VRM surface.


----------



## boot318

Quote:


> Originally Posted by *MrKZ*
> 
> Ok, so I got the thermal pad and I have a big problem. I ordered a 1.5mm pad and they sent me 1.0mm, and there is low to no contact between the heatsink and the VRM chips. I asked them for a replacement (since it's their fault) and they said no, because I had already cut the pad (I didn't check the size; I was confident it was what I ordered,
> 
> 
> 
> 
> 
> 
> 
> and I only noticed the wrong thickness when I was screwing the heatsink back on).
> 
> So is there anything I can do apart from buying new pads of the right thickness? I was thinking about stacking two layers of the 1mm ones, but that comes to 2mm and the screw won't reach its hole
> 
> 
> 
> 
> 
> 
> 
> . I also tried without the plastic stands that keep the heatsink at height, but when I screw the heatsink down it tilts at the front (only about half of the heatsink has VRMs under it; the other half only has plain PCB under it), and it only makes contact with a portion of the VRM surface.


I've doubled pads on VRMs before. Not ideal, but it works. Try doubling and stretching them out to make them thinner. That might work so you can screw the heatsink in.


----------



## MrKZ

Quote:


> Originally Posted by *boot318*
> 
> I've doubled pads on VRMs before. Not ideal, but it works. Try doubling and stretching them out to make them thinner. That might work so you can screw the heatsink in.


I will try this. As for the pads, they don't stretch much... they just tear. (The pads in question are Phobya XT.)


----------



## spyshagg

Hi guys

I have two 290Xs in crossfire. I just realised my voltage settings in Afterburner are not being applied (by watching the OSD in-game). That also explains why upping the voltage doesn't improve my overclock.

stock:

gpu1 ~ 1.224 volts
gpu2 ~ 1.146 volts

with +50mv

gpu1 ~ 1.224 volts
gpu2 ~ 1.146 volts

Also, the voltages vary wildly under load.

Also, I read that the voltage settings we apply in AB are not necessarily applied, because PowerTune decides what voltage the GPU needs at any given time.

This is a stark contrast to overclocking my previous 4870X2 (the voltage chosen was the voltage applied).

I am in need of guidance


----------



## kizwan

ULPS disabled?


----------



## MrKZ

Quote:


> Originally Posted by *MrKZ*
> 
> I will try this. As for the pads, they don't stretch much... they just tear. (The pads in question are Phobya XT.)


Ok. The results (vSync off; with vSync on, temps are in the 50-60 range):


Maybe if I had only one thermal pad instead of two stacked it would be even better?


----------



## mus1mus

Quote:


> Originally Posted by *spyshagg*
> 
> Hi guys
> 
> I have two 290Xs in crossfire. I just realised my voltage settings in Afterburner are not being applied (by watching the OSD in-game). That also explains why upping the voltage doesn't improve my overclock.
> 
> stock:
> 
> gpu1 ~ 1.224 volts
> gpu2 ~ 1.146 volts
> 
> with +50mv
> 
> gpu1 ~ 1.224 volts
> gpu2 ~ 1.146 volts
> 
> Also, the voltages vary wildly under load.
> 
> Also, I read that the voltage settings we apply in AB are not necessarily applied, because PowerTune decides what voltage the GPU needs at any given time.
> 
> This is a stark contrast to overclocking my previous 4870X2 (the voltage chosen was the voltage applied).
> 
> I am in need of guidance


Synced AB overclocking?

Look it up in the settings tab.

By the way, anyone using Trixx to push 1.4V into these cards?

I can't remember for the life of me the version I previously used. I'm getting black screens while the game or test is ongoing, blanking in and out. It happens on the AB volt hack too.

Should I use a benching BIOS?


----------



## spyshagg

Quote:


> Originally Posted by *kizwan*
> 
> ULPS disabled?


Yep

Quote:


> Originally Posted by *mus1mus*
> 
> Synced AB overclocking?
> 
> Look it up in the settings tab.


Tried both synced and unsynced. Same problem.

I also tried Asus GPU Tweak, but as soon as I apply any changes, no matter how small, the screen goes black and locks up. It does this on any change I make, be it MHz, power limits, or voltage. And I have two Asus 290Xs, lol.

Next I will try the Trixx software.


----------



## gatygun

Trying to buy a second 290, but it's hard to find a Tri-X in the wild with warranty.

So I came upon another card, a 290X Tri-X, and made the guy an offer. But my question is: does crossfiring a 290X with a 290 give any additional issues beyond the standard crossfire stuff?


----------



## Agent Smith1984

Quote:


> Originally Posted by *gatygun*
> 
> Trying to buy a second 290, but it's hard to find a Tri-X in the wild with warranty.
> 
> So I came upon another card, a 290X Tri-X, and made the guy an offer. But my question is: does crossfiring a 290X with a 290 give any additional issues beyond the standard crossfire stuff?


It'll run fine...

You will want to clock the 290 about 50MHz or so faster than the "X" to get the performance between the two as equal as possible, though.
Otherwise the 290X will be slightly held back by the 290, and you'll see a slightly lower load on it.


----------



## kizwan

The 290X will not be held back by the 290. Both cards should be able to run at their own clocks.
Quote:


> Originally Posted by *mus1mus*
> 
> By the way, anyone using Trixx to push 1.4V into these cards?
> 
> I can't remember for the life of me the version I previously used. I'm getting black screens while the game or test is ongoing, blanking in and out. It happens on the AB volt hack too.
> 
> Should I use a benching bios?


I use Trixx ver. 4.8.2.


----------



## Scorpion49

Quote:


> Originally Posted by *Agent Smith1984*
> 
> It'll run fine...
> 
> You will want to clock the 290 about 50MHz or so faster than the "X" to get the performance between the two as equal as possible, though.
> Otherwise the 290X will be slightly held back by the 290, and you'll see a slightly lower load on it.


As far as I know, AMD fixed the asymmetrical stuff many years ago and it has no bearing on crossfire any more. That was an issue with the original version of crossfire (anyone remember the master/slave cards and external dongle?).


----------



## gatygun

Quote:


> Originally Posted by *kizwan*
> 
> The 290X will not held back by 290. Both cards should be able to run at their clocks.
> I use Trixx ver. 4.8.2.


Oki thanks for your reaction.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Scorpion49*
> 
> As far as I know, AMD fixed the asymmetrical stuff many years ago and it has no bearing on crossfire any more. That was an issue with the original version of crossfire (anyone remember the master/slave cards and external dongle?).


I experienced differences in usage and stuttering when running my cards at different speeds (using two Tri-X 290s).
The issues stopped with the cards at the same clocks.... I would think two cards differing in shader count would produce much the same effect as running different clocks, but maybe there is a difference.

I tested many different times: different clocks, VRAM speeds, etc....

No matter what, if the core was more than 20MHz faster on one card, its usage would be roughly 1-2% lower, and the game would not run as smoothly.
When equating the cores (adding voltage to the second card to match the core of the first), the issues stopped, and both cards stayed pegged at 99%.

Just my experience though. I've also read about that happening to others in other forums.


----------



## sugarhell

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I experienced differences in usage and stuttering when running my cards at different speeds (using two Tri-X 290s).
> The issues stopped with the cards at the same clocks.... I would think two cards differing in shader count would produce much the same effect as running different clocks, but maybe there is a difference.
> 
> I tested many different times: different clocks, VRAM speeds, etc....
> 
> No matter what, if the core was more than 20MHz faster on one card, its usage would be roughly 1-2% lower, and the game would not run as smoothly.
> When equating the cores (adding voltage to the second card to match the core of the first), the issues stopped, and both cards stayed pegged at 99%.
> 
> Just my experience though. I've also read about that happening to others in other forums.


Because of AFR. If the first card renders one frame at 60 fps and the second renders the next frame at 58 fps, that constant frame variance could affect people sensitive to frame latency/variance.
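The frame-pacing effect described above can be sketched in a few lines. This is only an illustration with made-up numbers, not a measurement: in AFR the two cards alternate frames, so a per-card speed mismatch shows up as alternating frame times (microstutter).

```python
# Minimal sketch of AFR microstutter (illustrative numbers only):
# the two cards alternate frames, so a mismatch in per-card frame
# rate shows up as alternating frame delivery times.

def afr_frame_times(fps_card1, fps_card2, frames=6):
    """Frame times (ms) when two cards alternate frames (AFR)."""
    return [1000.0 / (fps_card1 if i % 2 == 0 else fps_card2)
            for i in range(frames)]

times = afr_frame_times(60, 58)
jitter = max(times) - min(times)     # gap between the two cards' frame times
print([round(t, 2) for t in times])  # alternates ~16.67 / ~17.24 ms
print(round(jitter, 2))              # ~0.57 ms of alternating variance
```

The average FPS barely changes, but the alternation between the two frame times is what frame-latency-sensitive people perceive.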


----------



## Agent Smith1984

Quote:


> Originally Posted by *sugarhell*
> 
> Because of AFR. If the first card renders one frame at 60 fps and the second renders the next frame at 58 fps, that constant frame variance could affect people sensitive to frame latency/variance.


Exactly, and the same would happen when adding a 290X to a 290, just with a difference in shaders instead of clock speed....

That's why I normally recommend that people planning to mix the two tune the 290 as close to the performance of the 290X as they possibly can.


----------



## Scorpion49

Anyone have a suggestion for running crossfire cards with the HG10 brackets, as to which cooler has the longest/most flexible tubes? It looks like the H75 is the best bet, but I'm wondering if anyone has tried something else. My Air 540 is getting pretty packed now, and when I add the HG10 on the second card it's going to be a nightmare.

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I experienced differences in usage and stuttering when running my cards at different speeds (using two Tri-X 290s).
> The issues stopped with the cards at the same clocks.... I would think two cards differing in shader count would produce much the same effect as running different clocks, but maybe there is a difference.
> 
> I tested many different times: different clocks, VRAM speeds, etc....
> 
> No matter what, if the core was more than 20MHz faster on one card, its usage would be roughly 1-2% lower, and the game would not run as smoothly.
> When equating the cores (adding voltage to the second card to match the core of the first), the issues stopped, and both cards stayed pegged at 99%.
> 
> Just my experience though. I've also read about that happening to others in other forums.


I guess I should have put that they "supposedly" fixed it, haha. I've never run different cards; it bothers me for whatever reason, so I try to buy identical ones.


----------



## gatygun

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Exactly, and the same would happen when adding a 290X to a 290, just with a difference in shaders instead of clock speed....
> 
> That's why I normally recommend that people planning to mix the two tune the 290 as close to the performance of the 290X as they possibly can.


Aha ty for the hint.

Another thing I was thinking about: would it be better to just flash the 290X to a 290 BIOS, so that everything is exactly the same? If that is even possible.

I'm not sure if I will pick the card up though, as he's asking a rather steep price for it (300 euros). I originally planned to go for 250 max. But this way, if he bites, I can agree to it right away.


----------



## Agent Smith1984

Quote:


> Originally Posted by *gatygun*
> 
> Aha ty for the hint.
> 
> Another thing I was thinking about: would it be better to just flash the 290X to a 290 BIOS, so that everything is exactly the same? If that is even possible.
> 
> I'm not sure if I will pick the card up though, as he's asking a rather steep price for it (300 euros). I originally planned to go for 250 max. But this way, if he bites, I can agree to it right away.


I'd probably leave the 290X as is and OC the 290 to match it.
It doesn't have to be perfect; I just recommend getting close.

I suggest testing the 290X stock in Firestrike...

Then OC the 290 until its Firestrike score matches that of the 290X.

Then do a little math and see what percentage of core-clock increase it took to get the 290 to 290X levels.

After that, if you decide you want to OC both cards, just make sure the 290 stays that percentage faster than the 290X's clock speed (on the core only; leave the memory speeds the same on both).

It's not a perfect method, but it should work pretty well.
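The percentage method above can be sketched in a couple of lines. All the numbers below are hypothetical, not measured: suppose the 290 needed 1040MHz (versus its 947MHz stock) to match the stock 290X's Firestrike score.

```python
# Sketch of the clock-matching method described above.
# 1040MHz and 947MHz are made-up example values, not benchmark results.

def matched_290_clock(clock_290x, uplift_ratio):
    """Keep the 290 'uplift_ratio' times faster than whatever the 290X runs."""
    return clock_290x * uplift_ratio

uplift_ratio = 1040 / 947  # ~9.8% core-clock uplift needed to match the 290X
print(round(matched_290_clock(1000, uplift_ratio)))  # 290X at 1000MHz -> 290 at ~1098MHz
print(round(matched_290_clock(1100, uplift_ratio)))  # 290X at 1100MHz -> 290 at ~1208MHz
```

The ratio comes from your own Firestrike runs, so it bakes in the shader-count difference between the two cards without needing their exact specs.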


----------



## gatygun

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I'd probably leave the 290X as is and OC the 290 to match it.
> It doesn't have to be perfect; I just recommend getting close.
> 
> I suggest testing the 290X stock in Firestrike...
> 
> Then OC the 290 until its Firestrike score matches that of the 290X.
> 
> Then do a little math and see what percentage of core-clock increase it took to get the 290 to 290X levels.
> 
> After that, if you decide you want to OC both cards, just make sure the 290 stays that percentage faster than the 290X's clock speed (on the core only; leave the memory speeds the same on both).
> 
> It's not a perfect method, but it should work pretty well.


Will check that out then. I will probably be bottlenecked to hell and back anyway.


----------



## Agent Smith1984

Quote:


> Originally Posted by *gatygun*
> 
> Will check that out then. I will probably be bottlenecked to hell and back anyway.


Why's that? CPU?


----------



## gatygun

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Why's that? CPU?


Yeah, I still have an old CPU, an i7 870 @ 4.2GHz atm. I just need additional performance for higher resolutions and filtering, so I figured a second card would do well.


----------



## Ized

Quote:


> Originally Posted by *gatygun*
> 
> Aha ty for the hint.
> 
> Another thing I was thinking about: would it be better to just flash the 290X to a 290 BIOS, so that everything is exactly the same? If that is even possible.
> 
> I'm not sure if I will pick the card up though, as he's asking a rather steep price for it (300 euros). I originally planned to go for 250 max. But this way, if he bites, I can agree to it right away.


300 seems pretty steep, as you say; wouldn't very little extra buy you a new one with warranty?

Not sure where you are exactly, but the first random German site I found has one for 319. Even Amazon France has one fancy Asus for 301, or a reference card (blurgh) for 270.

These 290s seem to have an unreasonably high number of RMAs mentioned on this site versus, say, an average CPU, so the warranty would have high value for me.

I also saw several "broken" or "spares and repairs" cards on eBay last time I looked too.


----------



## Ha-Nocri

Quote:


> Originally Posted by *gatygun*
> 
> Yeah, I still have an old CPU, an i7 870 @ 4.2GHz atm. I just need additional performance for higher resolutions and filtering, so I figured a second card would do well.


What FPS are you aiming for? If it's 60, your CPU is fine.


----------



## gatygun

Quote:


> Originally Posted by *Ized*
> 
> 300 seems pretty steep, as you say; wouldn't very little extra buy you a new one with warranty?
> 
> Not sure where you are exactly, but the first random German site I found has one for 319. Even Amazon France has one fancy Asus for 301, or a reference card (blurgh) for 270.
> 
> These 290s seem to have an unreasonably high number of RMAs mentioned on this site versus, say, an average CPU, so the warranty would have high value for me.
> 
> I also saw several "broken" or "spares and repairs" cards on eBay last time I looked too.


I'm from the Netherlands.

A 290 Tri-X costs me 350+ euros new, and 400+ for a 290X (if you can buy them at all); add another 100 to both for the 390 versions.

I bought a 290 Tri-X with 6 months' warranty 3-4 months ago for 200 euros, so basically a good deal, especially as I sold my 580 for 90, so it actually only cost me 110. The thing works like a champ, really. Now I need some more horsepower just for higher resolutions / filtering (21:9 gaming). I wanna play games at 2560x1080 with 8x MSAA enabled and get a steady 60+ fps.

The cheapest new 290 I can find is 290 euros, but I have no clue how their cooling solutions are, and if I run 2x crossfire 290s with a mild OC plus a heavily heating OC'd CPU on air cooling, I need a good solid cooler that cools the VRMs well, like the Tri-X.

On every (tech) forum I visit where people sell them off, all Tri-X cards are instantly bought out for ~250 euros with warranty, no matter if it's the 290 or 290X. So yeah, people ask a lot of money for them, as in crossfire they are pretty much the way to go to brute-force performance.

I originally didn't want to spend more than 200 for it, like my other card cost me, as I figured that 2x 290s would give enough performance to last me a few years for sure.
But I probably need to pay up to 250 to even get a chance at those cards, and even then it's a struggle.

Indeed, 300 euros is a lot, and a bit out of my budget for sure, but even if I sell my 290 and add 300 = 550, it buys me nothing that gives the same kind of performance as 2x 290s. A 980 Ti will cost me 700, and I need an AMD GPU for FreeSync







. The Fury X seems to be a disappointing card that I'd rather not drop my money on, and they cost exactly the same as a 980 Ti here.

So yea, kinda hard decisions to make.

Maybe if 295X2s were still 550 euros I could snag one of those bad boys
Quote:


> Originally Posted by *Ha-Nocri*
> 
> What FPS are you aiming for? If it's 60, your CPU is fine.


60 minimum with an average of 75 fps; that's all I need really.


----------



## Maticb

Quote:


> Originally Posted by *gatygun*
> 
> I'm from the netherlands.
> 
> A 290 tri-x costs me new 350+ euro's, and 400+ for a 290x. ( if you can buy them even at all ), add another 100 on both for the 390 versions.


So?
Buy it from germany.

https://www.caseking.de/en/sapphire-radeon-r9-290-tri-x-oc-4096-mb-gddr5-gcsp-111.html

Everything is also expensive here in Slovenia. Most of us enthusiasts order stuff from Germany; it's by far the cheapest in Europe, even with shipping added.


----------



## kizwan

A few people have run 290X + 290 crossfire in the past, and none of them reported any problems. The question has been asked many times before, and the answer is always that there's no problem running the 290X and 290 at their respective clocks in crossfire.


----------



## yawa

K here's my best effort Fyzzz and Mus. Or at least what I'm willing to do in a heatwave...



http://www.3dmark.com/3dm/7833774?

If I didn't have a dud 4790K, my overall score would be much higher too.

And to answer the question that was asked a bit ago: no, these are all 290Xs, on 15.7. I'm getting much better graphics scores with that driver.


----------



## gatygun

Quote:


> Originally Posted by *Maticb*
> 
> So?
> Buy it from germany.
> 
> https://www.caseking.de/en/sapphire-radeon-r9-290-tri-x-oc-4096-mb-gddr5-gcsp-111.html
> 
> Everything is also expensive here in Slovenia. Most of (us) enthusiasts order stuff from Germany. It's by far the cheapest in Europe. Even with added shipping.


Ah, thanks for the hint; the price is indeed nice. But with everything included I again move towards the 300 price point, which I kinda don't want to spend. A 295X2 isn't interesting, as I just bought a 3-meter DP cable for 30 euros that I don't want to replace or spend more money on to adapt to a mini connector; even if I could manage that, it would still cost me 700, so yeah, that plan is also scratched.
Quote:


> Originally Posted by *kizwan*
> 
> A few people have run 290X + 290 crossfire in the past, and none of them reported any problems. The question has been asked many times before, and the answer is always that there's no problem running the 290X and 290 at their respective clocks in crossfire.


Ah, thanks for the information.

A change of plans though:

The dude with the 390X wanted 325 for his card in the end and had already sold one of his two cards; he doesn't want to make a deal for anything less. So I'm kinda bailing out on it.

I got a message out of nowhere from the first person I originally wanted to buy that second 290 Tri-X from; he had pulled out, as he was too busy with his new PC and water setup and simply didn't want to sell his old stuff before his new PC was finished. He finished his PC build and asked if I still wanted to buy his 290 Tri-X. After some talking, the end price was 180 euros with warranty + box / cables, everything included. I took it right away, and now it's the waiting game.

It will probably arrive in 2-3 days; can't wait







.

Ty for the help people.


----------



## Raephen

Quote:


> Originally Posted by *gatygun*
> 
> Aha ty for the hint.
> 
> Another thing I was thinking about: would it be better to just flash the 290X to a 290 BIOS, so that everything is exactly the same? If that is even possible.
> 
> I'm not sure if I will pick the card up though, as he's asking a rather steep price for it (300 euros). I originally planned to go for 250 max. But this way, if he bites, I can agree to it right away.


Where in the EU are you? €300 is very close to some of the lower prices for an R9 290 Tri-X OC here in the Netherlands.


----------



## mus1mus

Quote:


> Originally Posted by *yawa*
> 
> K here's my best effort Fyzzz and Mus. Or at least what I'm willing to do in a heatwave...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/7833774?
> 
> If I didn't have a dud 4790K, my overall score would be much higher too.
> 
> And to answer the question that was asked a bit ago, no these are all 290X's with 15.7. Getting much better Graphics scores with that driver.


I don't think you have a dud chip.

Work out that Power Saving Option. It's messing up your OC. Your Total should be over 5K at 4.4GHz.

BIOS and/or Windows power savings are active. Set both to High Performance.


----------



## Arizonian

Quote:


> Originally Posted by *By-Tor*
> 
> So SEXY....
> 
> Installed my second 290x LCS card tonight..
> 
> 
> Spoiler: Warning: Spoiler!


Looks really nice man. Congrats - updated


----------



## kizwan

@rdr09, I can get a 14k GPU score with the Tess 2x mod.









http://www.3dmark.com/fs/5462863


----------



## Ized

Quote:


> Originally Posted by *Raephen*
> 
> Where in the EU are you? € 300 is very close to some of the lower prices for a R9 290 Tri-X OC here in the Netherlands.


Yeah, I don't get it. Literally hit up Google and they are going for that price and less all over the show.

Concerned for your welfare, gatygun


----------



## Ized

Quote:


> Originally Posted by *kizwan*
> 
> @rdr09, I can get 14k gpu score with Tess 2x mod.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/5462863


So there are mods for 3DMark?! What the hell... Pretty lame.


----------



## Newbie2009

Is the 295x2 volt locked?


----------



## mus1mus

Quote:


> Originally Posted by *Ized*
> 
> So there are mods for 3dmark?! What the hell... Pretty lame.


Check out your CCC settings. Tessellation tweaks are cool but won't get validated on FS.


----------



## Ized

Quote:


> Originally Posted by *Newbie2009*
> 
> Is the 295x2 volt locked?


If I'm not mistaken, this guy is happily editing his 295X2 BIOS:

http://i.imgur.com/LzB0efv.jpg

Quote:


> Originally Posted by *mus1mus*
> 
> Check out your CCC Settings. Tesselation tweaks are cool but won't get validated on FS.


There have been a few oddly high yet valid scores from people who mention tessellation, so...


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ized*
> 
> So there are mods for 3dmark?! What the hell... Pretty lame.
> 
> 
> 
> 
> 
> 
> 
> Check out your CCC Settings. Tesselation tweaks are cool but won't get validated on FS.

Unless there's a bug on the server again (validation is done server-side), scores with the tess mod (except tess off) will get validated. It looks like if you mod it anywhere between x2 and x64, the score will get validated.

There have been a couple of times in the past when scores with tess off got validated. Here is one example:
http://www.3dmark.com/fs/2200723

...and my cards were automatically re-branded as 390s.


----------



## mus1mus

Quote:


> Originally Posted by *kizwan*
> 
> Unless there's a bug on the server again (validation is done server-side), scores with the tess mod (except tess off) will get validated. It looks like if you mod it anywhere between x2 and x64, the score will get validated.
> 
> There have been a couple of times in the past when scores with tess off got validated. Here is one example:
> http://www.3dmark.com/fs/2200723
> 
> ...and my cards were automatically re-branded as 390s.


ohhh... nice one..

Pretty good scores nevertheless.


----------



## gatygun

Quote:


> Originally Posted by *kizwan*
> 
> Unless there's a bug on the server again (validation is done server-side), scores with the tess mod (except tess off) will get validated. It looks like if you mod it anywhere between x2 and x64, the score will get validated.
> 
> There have been a couple of times in the past when scores with tess off got validated. Here is one example:
> http://www.3dmark.com/fs/2200723
> 
> ...and my cards were automatically re-branded as 390s.


The worst part is that GPUs are sometimes validated as the wrong model, for example a 290 getting validated as a 280, or a 290 getting validated as a 390 like you mentioned there. That pretty much destroys the whole competition concept.

Also, I bought the Tri-X 290 and will probably get it later this week. Can't wait.


----------



## Ized

I don't understand how people can get "valid" scores while artifacting every other frame, too.

It makes comparing directly difficult.


----------



## sugarhell

If you artifact heavily then you are unstable; you get a lower score than you would if you didn't artifact. Also, the score is not invalid. Most high scores are run with a few artifacts here and there.

Also, Futuremark sucks at finding tweaks. LOD tweaks, tess tweaks, etc.


----------



## kizwan

Quote:


> Originally Posted by *gatygun*
> 
> Also, I bought the Tri-X 290 and will probably get it later this week. Can't wait.


Congrats!


----------



## Blameless

Quote:


> Originally Posted by *Ized*
> 
> I don't understand how people can get "valid" scores while artifacting every other frame too.
> 
> Makes comparing directly difficult.


Quote:


> Originally Posted by *sugarhell*
> 
> If you artifact heavily then you are unstable; you get a lower score than you would if you didn't artifact. Also, the score is not invalid. Most high scores are run with a few artifacts here and there.
> 
> Also, Futuremark sucks at finding tweaks. LOD tweaks, tess tweaks, etc.


3DMark is used precisely because you can get 'valid' scores at near suicide clocks with all sorts of silly driver manipulation.


----------



## morencyam

Picked up an MSI Gaming 4G 290X last night. It should be here either late this week or early next week. I'm really hoping I have it by the weekend, though.


----------



## Ized

Quote:


> Originally Posted by *Blameless*
> 
> 3DMark is used precisely because you can get 'valid' scores at near suicide clocks with all sorts of silly driver manipulation.


Any alternatives worth looking at, then?

The only reason I have stuck with 3DMark is the large user base.

The damn thing won't even record my clocks properly, which is frustrating as hell.

I can't learn crap from my old results, and I can't compare them to other people's for the reasons outlined in the posts above. Pretty pointless really.


----------



## Blameless

Only way to really compare any benchmark is to agree to certain conditions amongst a group of benchers, and even 3DMark has utility in these cases. Automated submission/ranking systems are of dubious utility for things other than bragging rights.

If you want a solid comparison for real-world diagnostic/tweaking purposes, post a thread on OCN.


----------



## Archea47

Quote:


> Originally Posted by *Archea47*
> 
> The second block was a success. The copper did start to peel up a bit and one small piece did chip off but it was away from the seal
> 
> The first block failed the leak test. I packed the area with JB Water Weld and am going to leak test again in the morning. Yeah yeah, JB Weld, but it's no use at this point so it can't hurt
> 
> If your card is like mine, and I hope it isn't, .030" makes a difference because it meant I could see a sliver of light between the VRM cooler on the block and the thermal pad on the VRMs. These cards I have, which may or may not be different from yours, would obviously overheat immediately and I bought these for overclocking.
> 
> The difficulty I had milling them is that 1) the thickness of the area to begin with is only about 0.050", and 2) there's a seal pressing against that area for the water channel going to the VRM. As I milled to the point where the test fit finally cleared, the copper started lifting from the acrylic (from the o-ring?). The cutting tool I was using was brand new, but on both blocks it sort of chipped off a couple of spots, and in the wrong spot (probably combined with the metal being pushed from the glass); like the first block, that can compromise the seal
> 
> We'll see how it goes with the Water Weld on the leaker. FWIW I'll be running parallel (crossing fingers still)
> 
> Like Bertovzki, I really want the Aquacomputer to work. I fell in love with the card at first sight, a few days after buying my previous set of 280Xs, and was kicking myself for not going '90s. I think if I can't get it to work I might buy another Aquacomputer block and have a friend try cutting it at work on a cooled CNC. I admittedly have never cut copper (except for the computer
> 
> 
> 
> 
> 
> 
> 
> ) and it can probably be done better


Hey team,

Quick update on the Aquacomputer blocks and the slightly-off-reference Sapphire Tri-X cards (most of them are compatible, but a few, like mine, aren't, because the chokes are non-reference and too tall).

The JB Water Weld eventually failed on me and leaked all the water into my power supply.

I bought another Aquacomputer block; same issue with interference from the chokes. I had to mill the copper over the chokes down to about .010" thick for it to fit, but had better luck re: chipping with a different (wider) cutting tool.

The other card, which I milled but didn't crack (and so didn't need JB Water Weld), is still holding strong.


----------



## mus1mus

Quote:


> Originally Posted by *Blameless*
> 
> 3DMark is used precisely because you can get 'valid' scores at near suicide clocks with all sorts of silly driver manipulation.


Sadly, they're the ones in control of the thing.

Quote:


> Originally Posted by *Ized*
> 
> I don't understand how people can get "valid" scores while artifacting every other frame too.
> 
> Makes comparing directly difficult.


But you have to consider that the driver will crash on too much artifacting, which automatically stops the test.

And I found it harder to pass 3DMark when trying a balls-to-the-wall OC. The drivers seem sensitive to Fire Strike, considering the standard test is on 720P.

In fact, I can push up to 50MHz more in Heaven or Valley without alarming the driver.

Quote:


> Originally Posted by *sugarhell*
> 
> If you artifact heavily then you are unstable. You get a lower score than you would if you didn't artifact. Also, the score is not invalid. Most high scores are run with a few artifacts here and there.
> 
> Also, Futuremark sucks at finding tweaks. LOD tweaks, tess tweaks, etc etc etc.


True.


----------



## By-Tor

Didn't know they made a 290x, 8gb, DX12 card...

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202144


----------



## Agent Smith1984

Quote:


> Originally Posted by *By-Tor*
> 
> Didn't know they made a 290x, 8gb, DX12 card...
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202144


Yep, those have been out for a while.
Curious to see how the RAM clocks on those... Could be an early 390X Tri-X


----------



## mus1mus

Interesting to see if R9 290-series cards will get DX12 support, even if it needs flashing the 3XX BIOS.


----------



## BradleyW

290X will support DX12. Not 12.1. Or was it 12.2? I forget. Anyway, it will run 12.0!


----------



## sugarhell

Welp... every single GCN card will get DX12 support.

And there isn't any 12.1.

12.1 is a feature level, not a D3D version...


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> Welp..Every single GCN card will get dx12 support
> 
> And there isnt any 12.1.
> 
> 12.1 is a feature level not a d3d version...


I never stated that 12.1 was a D3D version. When you have the D3D version followed by a point, that's a reference to that D3D's feature level.


----------



## sugarhell

Quote:


> Originally Posted by *BradleyW*
> 
> I never stated that 12.1 was a d3d version. When you have the d3d version followed by a point, that's reference to that d3d's feature level.


Nope. 12.1 will require new standard features. 12.1 is not even a real thing right now.

DirectX 12.1 =/= the 12_1 feature level, to keep it simple just for you.


----------



## BradleyW

Quote:


> Originally Posted by *sugarhell*
> 
> Nope. 12.1 will require new standard features. 12.1 is not even a real thing right now.
> 
> Directx12.1 =/= 12.1 feature levels to keep it simple just for you.


When I typed 12.1, I was referring to the feature level. It has been stated that the 290X may not support the 12.1 feature level. You love to nitpick and point out the most obscure things to win points over me whenever you get the opportunity. You are clearly one of those "know it all" personality types, where only you are correct and everyone else is wrong. I don't know if you do this to everyone, or if you just dislike me for reasons unknown. Whichever it is, I'm not interested.


----------



## Scorpion49

Quote:


> Originally Posted by *BradleyW*
> 
> When I typed 12.1., I was referring to the feature level. It has been stated that the 290X may not support 12.1 feature level. You love to nit pick and point out the most obscure things to win points over me whenever you get the opportunity. You are clearly one of those "know it all" personality types, where only you are correct and everyone else is wrong. I don't know if you do this to everyone, or if you just dislike me for reasons unknown. whichever it is, I'm not interested.


He's always like that, but only ever present in threads involving AMD.


----------



## sugarhell

Quote:


> Originally Posted by *BradleyW*
> 
> When I typed 12.1., I was referring to the feature level. It has been stated that the 290X may not support 12.1 feature level. You love to nit pick and point out the most obscure things to win points over me whenever you get the opportunity. You are clearly one of those "know it all" personality types, where only you are correct and everyone else is wrong. I don't know if you do this to everyone, or if you just dislike me for reasons unknown. whichever it is, I'm not interested.


Sorry that I hurt your feelings.

By the way, Scorpion, when I tried to help you with your FreeSync problem you answered me like I killed your dog. Now you are telling me this? Rofl, I am out of here; my IQ drops every minute with you guys.


----------



## Unknownm

Holy nuts, Batman. Bought an AIO watercooling kit for both my R9 290s: 2x Corsair HG10 A1 & 2x Corsair H75.

The problem is this bracket was only designed for reference cards (and mine are non-reference). It mounts fine, but VRM temps hit 100c. Regardless, this is a major improvement over stock cooling: first, the top card doesn't throttle, and second, the case side panel is on (it was off before because of the heat).

The HDDs are my RAID0 setup, and since the HDD bays took up too much room they both had to be removed!


----------



## StrongForce

I just had the worst experience of my whole OC career (can we talk about a career? I guess we can). I downloaded OC Guru II, upped the core by 15MHz and lol... then I launched Furmark, and boom, black screen, nothing responds. Ctrl-alt-del takes ages to bring up the menu; I click Task Manager... nothing happens. After a long black screen some textures of the Task Manager start to show, all glitchy, and I can't even move the mouse. Tried 2 times, then even just being in Windows and clicking on Firefox to come here gave me the same weird crash. Most horrible overclocking experience so far. Not sure what's up with that - is it OC Guru?? I uninstalled it anyway. Which program is the best? I also uninstalled my old progs Trixx, MSI Afterburner, and the other one whose name I can't remember, which I was using to run my HD 7950 at epic clocks.










I'm sure it's a software issue though, any help would be appreciated, or I just create a new thread maybe.


----------



## DDSZ

Quote:


> Originally Posted by *StrongForce*
> 
> I downloaded OC guru II, upped the core by 15mhz and lol.. then I launched Furmark, and boom, black screen, nothing responds..
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm sure it's a software issue though, any help would be appreciated, or I just create a new thread maybe.


My WF3 needs +100mV to run +60MHz (1040 -> 1100)


----------



## CGabry

Install MSI Afterburner...


----------



## Gumbi

Quote:


> Originally Posted by *Unknownm*
> 
> Holy nuts batman. Bought a AIO watercool kit for both my r9 290s. 2x Corsair HG10 A1 & 2x Corsair H75
> 
> The problem is this bracket was only design for reference cards (which mine is non reference). It mounts fine but VRM temps hit 100c. Regardless this is a major improvement over stock cooling, first thing being the top card doesn't throttle and second the case side panel is on (which was off before because of the heat)
> 
> The HDD are my RAID0 setup and since the HDD bays took up to much room they both had to be removed!


Reference cards? I actually bet the VRM cooling was a lot better before you put on the AIO cooler.

The fan blows cool air onto the VRMs before it gets to the core, so VRMs generally stay sub-75 degrees on reference cards.

You need a fan blowing directly onto the VRM/memory setup to get those temps under control. Do you have heatsinks mounted on them?


----------



## JourneymanMike

Quote:


> Originally Posted by *StrongForce*
> 
> I just had the worst experience of my whole OC career (can we talk about career ? I guess we can) I *downloaded OC guru II*, upped the core by 15mhz and lol.. then I launched Furmark, and boom, black screen, nothing responds.. ctrl-alt-del, takes ages to bring up the menu I click task manager... nothing happens, after a long black screen some textures start to show of the task manager, all glitchy, can't even move mouse. tryed 2 times, then after even just going on windows and clicking on firefox to come here, made me the same weird crash.. most horrible overclocking experience so far.. not sure what's up with that, is it OC guru ?? I uninstalled it anyway.. which program is the best.. *I also uninstalled my old progs Trixx, MSI afterburner*, and the other 1 can't remember name, which I was using to run my hd 7950 at epic clocks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm sure it's a software issue though, any help would be appreciated, or I just create a new thread maybe.


I've never even got OC Guru to work!

But, MSI AB and Sapphire Trixx always work









Quote:


> Originally Posted by *CGabry*
> 
> *Install MSI Afterburner...*


This ^ Reinstall MSI AB...


----------



## Scorpion49

Quote:


> Originally Posted by *Unknownm*
> 
> Holy nuts batman. Bought a AIO watercool kit for both my r9 290s. 2x Corsair HG10 A1 & 2x Corsair H75
> 
> The problem is this bracket was only design for reference cards (which mine is non reference). It mounts fine but VRM temps hit 100c. Regardless this is a major improvement over stock cooling, first thing being the top card doesn't throttle and second the case side panel is on (which was off before because of the heat)
> 
> The HDD are my RAID0 setup and since the HDD bays took up to much room they both had to be removed!


Just turn the fan up a little manually. I run mine at 30% and I don't see VRM temps over 75* or so, and it is still very quiet. The problem is that with the AIO the card's fan does not spin up, since it never reaches the temp where it normally would.


----------



## sinnedone

Quote:


> Originally Posted by *StrongForce*
> 
> I just had the worst experience of my whole OC career (can we talk about career ? I guess we can) I downloaded OC guru II, upped the core by 15mhz and lol.. then I launched Furmark, and boom, black screen, nothing responds.. ctrl-alt-del, takes ages to bring up the menu I click task manager... nothing happens, after a long black screen some textures start to show of the task manager, all glitchy, can't even move mouse. tryed 2 times, then after even just going on windows and clicking on firefox to come here, made me the same weird crash.. most horrible overclocking experience so far.. not sure what's up with that, is it OC guru ?? I uninstalled it anyway.. which program is the best.. I also uninstalled my old progs Trixx, MSI afterburner, and the other 1 can't remember name, which I was using to run my hd 7950 at epic clocks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm sure it's a software issue though, any help would be appreciated, or I just create a new thread maybe.


System restore from a point before this issue happened not an option?


----------



## StrongForce

I think it might be just the program. I need to try another - which one is your favourite for OC, guys?? It hasn't happened since I uninstalled OC Guru II...

I'll try MSI Afterburner.

But do you guys ever get an unresponsive Windows from a GPU OC? That was weird... freaking OC Guru, man


----------



## Dooderek

Quote:


> Originally Posted by *StrongForce*
> 
> I think it might be just the program.. I need to try with another, which one is your fav to OC guys ?? it didn't happend since I uninstalled OC guru II...
> 
> I'll try with MSI afterburner
> 
> but do you guys ever got unresponsive windows from a GPU OC ? that was weird.. freaking OC guru man


I like MSI AB. I tried using TRIXX the other day but I am just so accustomed to AB I switched back. Although Trixx is very user-friendly as well.


----------



## Dooderek

Quote:


> Originally Posted by *Unknownm*
> 
> Holy nuts batman. Bought a AIO watercool kit for both my r9 290s. 2x Corsair HG10 A1 & 2x Corsair H75
> 
> The problem is this bracket was only design for reference cards (which mine is non reference). It mounts fine but VRM temps hit 100c. Regardless this is a major improvement over stock cooling, first thing being the top card doesn't throttle and second the case side panel is on (which was off before because of the heat)
> 
> The HDD are my RAID0 setup and since the HDD bays took up to much room they both had to be removed!


The reason your VRM temps are so high is that you have the fan set on auto. The fan profile doesn't know that you have an AIO, so it bases its fan speed on the GPU core temp. Since you're water cooled, the fans never ramp up enough to cool the VRMs. Set a custom profile and you'll drop into the 70-80s (beware, it gets loud if you want sufficient VRM cooling).

I had the same experience when I was using the HG10 A1.
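What Dooderek describes - the stock auto profile keying off core temperature, which an AIO keeps low while the air-cooled VRMs bake - can be sketched roughly like this. The curves and breakpoints below are made-up illustrative numbers, not vendor values:

```python
# A minimal sketch of why an auto fan profile fails under an AIO:
# the stock curve keys off GPU core temperature, which the water
# loop keeps low, so the fan never ramps even as the air-cooled
# VRMs overheat. A custom curve keyed to VRM temperature avoids this.

def fan_duty(temp_c, curve):
    """Linearly interpolate fan duty (%) from a list of (temp, duty) points."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return curve[-1][1]  # clamp above the last breakpoint

STOCK_CURVE = [(40, 20), (70, 40), (90, 100)]   # keyed to core temp
VRM_CURVE   = [(50, 30), (80, 60), (100, 100)]  # keyed to VRM temp

# Under an AIO the core may idle around 45c while the VRMs hit 100c:
print(fan_duty(45, STOCK_CURVE))  # low duty -> fan barely spins, VRMs cook
print(fan_duty(100, VRM_CURVE))   # custom curve runs the fan flat out
```

Real tools like MSI Afterburner let you draw such a custom curve in the UI; the point is just that which temperature drives the curve matters as much as the curve itself.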


----------



## gatygun

Quote:


> Originally Posted by *StrongForce*
> 
> I think it might be just the program.. I need to try with another, which one is your fav to OC guys ?? it didn't happend since I uninstalled OC guru II...
> 
> I'll try with MSI afterburner
> 
> but do you guys ever got unresponsive windows from a GPU OC ? that was weird.. freaking OC guru man


I use both.

msi afterburner for typical 24/7 overclock that goes up to +100. Trixx for getting my max overclock with +200.

Basically 290 1213/1400 trixx, vs msi afterburner 1145/1375


----------



## StrongForce

Quote:


> Originally Posted by *gatygun*
> 
> I use both.
> 
> msi afterburner for typical 24/7 overclock that goes up to +100. Trixx for getting my max overclock with +200.
> 
> Basically 290 1213/1400 trixx, vs msi afterburner 1145/1375


I see, yea, nice. Now I remember - Trixx has a higher voltage limit then, right? Which I can't possibly understand why, lol.

I might stick to MSI Afterburner; I don't want too hot a card in there, my CPU already heats up so much... lol, even with a Primo it's crazy. Well, it's the socket/backplate more than the CPU itself though.


----------



## gatygun

Quote:


> Originally Posted by *StrongForce*
> 
> I see yea nice, now I remember Trixx have higher V limit then right ? which I can't possibly undertsand why lol.
> 
> I might stick to MSI afterburner I don't want too hot card in there, my CPU already heats so much.. lol even with a Primo it's crazy.. well it's the socket/backplate more than the CPU itself though.


Yea, I couldn't get my GPU core higher than 1145, which bummed me out as I wanted to hit 1200. Then I noticed Trixx and its +200 voltage limit, and I got 1200 without any issues. Memory still won't clock higher than 1400 without artifacting, even with that. Even though the artifacts are as minor as at a 1650 OC, it's still annoying and the performance gain is minimal for me.

As it's summer now, and like 30+ degrees in my room, if I want to push +200 I have to run my fan at 100% to cool the VRMs well. I don't have issues with that, as I mostly have my window open and the outside noise plus the headphones I use completely drown out the fan sound. But I only use that profile for benchmarking, basically, or to test games to see the maximum I can get out of them.

Whenever I game I mostly use the MSI +100 setting, as the difference between 1145 and 1200 isn't that much - not as much as 1000 -> 1145, that's for sure. Temps at 50% fan in the winter are like 70c VRM and 60 or so on the core with the MSI Afterburner overclock; in the summer it's 10 degrees more.

As I'm getting a second 290 tomorrow, I will probably not touch Trixx anymore, as I like to keep my cards cooler than 80c, and the extra heat of the second card is surely going to push temps up big time. And pushing 2x 290 Tri-X fans over 50% is going to be noisy for sure.

As I also have a Noctua D14 which will be right against the top 290 (got my 290 in the lowest slot atm), I need to check how my CPU cooler heats up, as it's close to its limit with my OC and an extra heat source will surely take a toll on it. As my CPU is then the main bottleneck of my setup, I could even see underclocking those 290s in the future. I basically bought the other 290 so I don't have to stress my current 290 with high overclocks anymore.

So yea


----------



## Dsrt

http://www.3dmark.com/3dm/7856376 . What's the max safe VRM temp for an R9 290X DirectCU II? These easily hit 110c+ on the VRMs when OC'ed. Core temp stays under 90c


----------



## JourneymanMike

Quote:


> Originally Posted by *Dsrt*
> 
> http://www.3dmark.com/3dm/7856376 . Whats the max safe VRMs temp for r9-290x directcuii. T*hese hit +110c VRM when oced easily.* Core temp stays under 90c


Very nice!









Great Physics score...

110c+? Holy krap! You need some cooling on those VRMs. You could put a fan over them to help cool...

WTH! Don't screw around - go full blown water cooling loop...


----------



## xDorito

Sorry for the poor quality. I've just been so thoroughly impressed by these two I decided to share a quick and dirty shot of them.


----------



## gatygun

Quote:


> Originally Posted by *Dsrt*
> 
> http://www.3dmark.com/3dm/7856376 . Whats the max safe VRMs temp for r9-290x directcuii. These hit +110c VRM when oced easily. Core temp stays under 90c


I believe it's 125c on the VRMs and 95c on the core.

But in my experience you don't want the VRMs to get any hotter than 90c, or the core above 80c


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Dsrt*
> 
> http://www.3dmark.com/3dm/7856376 . Whats the max safe VRMs temp for r9-290x directcuii. These hit +110c VRM when oced easily. Core temp stays under 90c












You be cooking your card running those temps 24/7.

Remount your cooler and replace the pads and TIM, or go watercooling.

Watercooling dropped my VRM temps by 60c at full load









P.S. Your physics score is very boss


----------



## Erecshyrinol

Heads up -- great deal on a 290x.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814202145&cm_re=sapphire_290x-_-14-202-145-_-Product

Stock is running out on the 290X, and soon the inferior and more expensive 390 will be the only choice in this price range. You Yanks get extremely nice deals; I paid 250 euro for a *used* Tri-X 290X and this is $260 for a new one.


----------



## Gumbi

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You be cooking your card running those temps 24/7 .
> 
> Re mount your cooler replace pads and tim or go watercooling .
> 
> Watercooling dropped my VRM temps by 60c at full load
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P.S. Your physics score is very boss


The Asus cooler is pretty terrible; it's pretty bad they didn't revise it for the 390(X) series. It had the same problems on the 290(X).

You don't necessarily need water cooling for better VRM temps; the VRM cooling on the Vapor-X is beastly. You can pump 200mV through them and stay sub-65 degrees, such is the strength of it.

I game at 1175/1650MHz at +75mV and they stay at 55 degrees or lower (core at 75 degrees, 50% fan).


----------



## Unknownm

Quote:


> Originally Posted by *Dooderek*
> 
> The reason why your VRM temps are so high is because you have the fan set on auto. The fan profile doesn't know that you have an AIO, so it bases its fan speed off of the GPU core temp. Since your water cooled the fans never ramp up enough to cool the VRM's. Set a custom profile and you'll drop into the 70-80's (Beware it gets loud if you want sufficient VRM cooling).
> 
> I had the same experience when I was using the HG A10.


Quote:


> Originally Posted by *Scorpion49*
> 
> Just turn the fan up a little manually. I run mine at 30% and I don't see VRM temps over 75* or so and it is still very quiet. The problem is with the AIO the cards fan does not spin up since it never reaches the temp where it normally would.


You are both correct, but not in my situation. The Corsair HG10 A1 was designed to work only with reference cards (you have to mount the blower fan from the stock heatsink, which only reference designs have), and mine isn't one!

What would be best here is strapping a high-CFM 120mm fan on the end of the cards so it's blowing on the VRMs. Right now my high-CFM Cooler Master 120mm fan is mounted at the front of the case and gives just enough airflow to cool the VRMs to 100c; I know this because before installing it my VRM temps hit 114c on the same test.


----------



## kizwan

@Unknownm, I saw your post at Valley thread. Pretty nice scores!


----------



## StrongForce

Quote:


> Originally Posted by *gatygun*
> 
> Yea i couldn't get my GPU core higher then 1145 which bumped me out as i wanted to hit 1200, then i noticed trixx and it's +200 v limit and i got 1200 without any issue's. Memory still won't clock without artifacting any higher then 1400 with that tho, even while the artifacts are minor as just as minor as a 1650 oc, it's still annoying and performance gain is minimal for me.
> 
> As it's summer now, and like 30+ degree's in my room, If i want to push +200, i have to push my fan at a 100% to cool the vrm's well tho. I don't have issue's with that as i mostly have my window open and the outside noise + headphones i use completely overtake the fan sound. But i only use that profile for benchmarking bassically or test games to see what i maximum can get out of it.
> 
> Whenever i game i push mostly the msi +100 v solution forwards, as the differnece between 1145 ans 1200 isn't that much, not as much as 1000>1145 that's for sure. Temps at 50% are in the winter like 70c vrm, and 60 or so on the core with msi after burner overclock, in the summer it's 10 degree's more tho.
> 
> As i'm getting a second 290 tommorow, i will probably not touch trixx anymore tho, as i like to keep my cards cooler then 80c, and the extra heat of the second card is surely going to push heat up big time. And increasing 2x 290's tri-x fans over 50% is going to be noisy for sure.
> 
> As i also have a noctua d14 which will be right against the top 290 ( got my 290 in the lowest slot atm ) i need to check out how my cpu cooler heats up. As it's close to it's limit with my oc on it and a extra heat source will surely take a toll on it. As my cpu is the main bottleneck of my setup then, i could see even underclocking those 290's in the future. I basically bought the other 290, to not have to stress my current 290 with high overclocks anymore.
> 
> So yea


Yea indeed..and lol, I'm at 82 currently without OC LOL !! no idea why.. it's only 27.6 in my room uh.

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You be cooking your card running those temps 24/7 .
> 
> Re mount your cooler replace pads and tim or go watercooling .
> 
> Watercooling dropped my VRM temps by 60c at full load
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P.S. Your physics score is very boss


What? How do you watercool VRMs though? I thought when you watercool a card you could only do it for the chipset and then add small heatsinks to the VRMs (and memory too, eventually?)


----------



## Unknownm

Quote:


> Originally Posted by *kizwan*
> 
> @Unknownm, I saw your post at Valley thread. Pretty nice scores!


Once I get my VRM temps under control (fan) i'll start pushing more. Thanks


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *StrongForce*
> 
> Yea indeed..and lol, I'm at 82 currently without OC LOL !! no idea why.. it's only 27.6 in my room uh.
> What ? how do you watercool VRMs though ? I thought when you watercool a card you could only do it for the chipset and then add small heatsinks to the VRMS and (memory too eventually?)


Waterblock the whole card mate , how else ?


----------



## Talon720

Quote:


> Originally Posted by *StrongForce*
> 
> Yea indeed..and lol, I'm at 82 currently without OC LOL !! no idea why.. it's only 27.6 in my room uh.
> What ? how do you watercool VRMs though ? I thought when you watercool a card you could only do it for the chipset and then add small heatsinks to the VRMS and (memory too eventually?)


The VRMs, core, and memory all get cooled under a full waterblock like EK or Aquacomputer - any of the water blocks that state they cool your GPU, mem, and VRM. Some blocks just cool certain components better than others: they might have a water jet near the VRM section of the block, or use an actively watercooled backplate. The AIO add-ons with a fan and heatsinks are what I assume you are thinking of?


----------



## mus1mus

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Waterblock the whole card mate , how else ?





----------



## By-Tor

Quote:


> Originally Posted by *StrongForce*
> 
> Yea indeed..and lol, I'm at 82 currently without OC LOL !! no idea why.. it's only 27.6 in my room uh.
> What ? how do you watercool VRMs though ? I thought when you watercool a card you could only do it for the chipset and then add small heatsinks to the VRMS and (memory too eventually?)


I used Swiftech MCW60 and MCW82 GPU blocks with heatsinks on the memory and VRMs for years without any issues, but the key is to have a fan blowing across the cards for the heatsinks to work.

One of my 7950's.



Full cover on one of my 290x's


----------



## gatygun

So, messing around with my crossfire system - where can I find game-specific profiles? I can't seem to find them.

I can add the Witcher 3 executable, then at the lower part I see AMD CrossFireX, AMD CrossFireX mode, and then it says use predefined profiles. But when I select one it only wants to overwrite it.

So how do you set it up?


----------



## kizwan

Quote:


> Originally Posted by *gatygun*
> 
> So messing around with my crossfire system, but where can i find game specific profiles?. I can't seem to find it.
> 
> I can add the witcher 3 executable, then i see at the lower part AMD crossfireX, AMD crossfir X modus and then it says use predifined profiles. But then when i select one it only wants to overwrite it.
> 
> So how do you set it up?


The profiles are built into the driver & usually you don't need to change or set up anything manually. You only use "add" if you want to override the predefined settings.


----------



## Scorpion49

Quote:


> Originally Posted by *Unknownm*
> 
> You are both correct but not in my situation. The corsair HG A10 was design to only work with reference cards (which means you have to mount a blower fan from the stock heatsink which is only in reference designs), mine isn't!
> 
> What would be best here is strapping on a high CFM 120mm at the end of the cards so it's blowing on the vrms. Right now my high CFM coolermaster 120mm fan is mounted at the front of the case and just gives enough air flow to cool the VRMs to 100c, I know this because before installing it my VRM temps hit 114c on the same test.


Oh I get it, you didn't have the blower fans. Don't know how I missed that before. Just an FYI, you can get a used fan from just about any AMD card that had a blower on it, they will all work but the RPM range might be different depending on what it was. For example, I have tried fans from both an R9 270 and a 4870 on the 290's and they run fine. Maybe you can find some for cheap on ebay (note: fans from older cards like the 48XX/58XX have a much higher minimum RPM and will be louder).


----------



## mfknjadagr8

Or he could ask someone like me to ship him one from one of my reference 290 coolers, which will never see use again... those suckers go up to 5k RPM according to HWiNFO64


----------



## gatygun

Quote:


> Originally Posted by *kizwan*
> 
> The profiles built in the driver & usually you don't need to change or setup anything manually. You use "add" if you want to override the default settings.


Well, whatever I do or choose, the picture it displays gives me eye damage. It's horrible: half the screen splitting up like a massive v-sync issue, and it stutters like hell.

Any solution for this?


----------



## mfknjadagr8

Quote:


> Originally Posted by *gatygun*
> 
> Well whatever i do or choose, the picture it displays gives me eye damage. it's horrible. Half the screen splitting up like massive v-sync issue's and it stutters like hell.
> 
> Any solution for this?


I haven't seen anything like this... you have 290s, yeah?

I was wondering what the go-to program is now for forcing settings... are people still using RadeonPro, or is there something better?


----------



## gatygun

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I haven't seen anything like this...you have 290s yeah?
> 
> I was wondering what the go to program was now for forcing settings..people still using Radeon pro or is there something better?


I have no clue. I just installed the second 290 and wanted to test Witcher 3 out, but god damn, it's unplayable. Like the whole screen is moving around.


----------



## mfknjadagr8

Quote:


> Originally Posted by *gatygun*
> 
> I have no clue i just insstalled the second 290, and wanted to test witcher 3 out. but god dam it's unplayable. Like the whole screen that is moving around.


Did you test the second 290 by itself first? You always want to test a new-to-you card before you crossfire them together, so if you have issues you can rule out the card being defective. I always unplug the power from the first card and put the monitor on the second card, to test and be sure the card is running well and at what clocks; this eliminates the card itself as an issue.


----------



## Scorpion49

Uh oh....


Spoiler: Warning: Spoiler!















I've become a traitor.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Scorpion49*
> 
> Uh oh....
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've become a traitor.


nah traitors use 970s lol


----------



## kizwan

Quote:


> Originally Posted by *gatygun*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> The profiles built in the driver & usually you don't need to change or setup anything manually. You use "add" if you want to override the default settings.
> 
> 
> 
> Well whatever i do or choose, the picture it displays gives me eye damage. it's horrible. Half the screen splitting up like massive v-sync issue's and it stutters like hell.
> 
> Any solution for this?
Click to expand...

I don't have Witcher 3. If I'm not mistaken, crossfire support for this game is still buggy. I remember someone mentioned not to update to patch 1.07. I also read that Optimize 1x1 solves the flickering, but fps drops a lot & it stutters. Did you try with Optimize 1x1?


----------



## By-Tor

Quote:


> Originally Posted by *kizwan*
> 
> I don't have Witcher 3. If I'm not mistaken crossfire support for this game still buggy. I remember someone did mentioned do not update to patch 1.07. I also read Optimize 1x1 solve flickering but fps drop a lot & stuttering. Did you try with Optimize 1x1?


I'm having no problems in Witcher 3 with Crossfire or the 1.07 patch. Knock on wood!!!


----------



## velocityx

Witcher 3 in 290 crossfire is really playable, although the flickering and stuttering depend on the in-game weather and can get kinda annoying. Not perfect, but good enough to play.

The ugly thing is that AMD apparently thought the same - that it's good enough - so I doubt they will ever fix this.


----------



## mfknjadagr8

Quote:


> Originally Posted by *velocityx*
> 
> witcher 3 in 290 crossfire is really playable although flickering and stuttering depends on the weather in game and can get kinda annoying. not perfect but good enough too play.
> 
> the ugly thing is that amd also thought the same, i mean that its good enough so i doubt they will fix this ever.


My crossfire works great now. The flicker was solved on mine by using lower anti-aliasing settings - I'm only using FXAA. I still had a little flicker, but the 1.07 patch fixed it in game; I still get the glitchy flicker in menus and on the level-up icon in the UI. My flicker was horrible on any reflective surface. Now, four hours of playtime since the patch, I've had no flicker outside of the above.


----------



## gatygun

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Did you test the second 290 by itself first? You always want to test a new-to-you card before you crossfire them together, so if you have issues you can rule out the card being defective... I always unplug the power from the first and put the monitor on the second card to test and be sure the card is running well, and at what clocks it can run. This eliminates the card itself as an issue.


Yeah, I did a lot of tests on the new card first, which cost me half a day before I crossfired them. The card is stable at +100 mV, 1165 core / 1690 memory; the thing is an overclocking wonder even though it only has 70% ASIC quality or whatever they call it.

I do use both cards at stock clocks atm, though. The cards work perfectly fine at higher resolutions in a few games I play, but in Witcher 3 it's simply not playable. It's like your whole screen is shaking.

I tried default mode / AFR / 1x1, and all of them brought their own issues but didn't solve the massive shaking. I have no issues with BF4 or Batman for that matter, or even the 3DMark benchmarks. Those are the only ones I've tested so far.

Witcher 3 also drops massively in fps: 1440p ultra = sometimes 30 fps. Almost like one card is doing nothing while it states it is pushing 80%. Frame pacing or whatever they call it jumps up and down like a rock.

This is my first dual-GPU setup, so I have pretty much no experience in this matter. Maybe I'm just doing something wrong entirely. My second GPU drops its memory clock to 150 MHz at idle while my other card stays at 1300. Maybe something is wrong here?

Edit

Also, does anybody know how to disable the RivaTuner fps limiter but keep the on-screen information working?


----------



## Ized

Screw having 100°C VRMs as mentioned above.

All you need is a couple of weedy heatsinks and some focused, direct airflow. A case fan 20 inches away isn't going to help much.










This is with the undersized Gelid 290 VRM kit and a small 80 mm Noctua fan, after running Firestrike Extreme at 1235 MHz core or something.


----------



## battleaxe

Quote:


> Originally Posted by *Erecshyrinol*
> 
> Heads up -- great deal on a 290x.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202145&cm_re=sapphire_290x-_-14-202-145-_-Product
> 
> Stock is running out on 290x and soon the inferior and more expensive 390 will be the only choice in the price range. You Yanks get extremely nice deals, I paid 250 euro for a *used* Tri-X 290x and this is $260 for a new one.


Just ordered one!

Thanks!

+1


----------



## Ha-Nocri

Quote:


> Originally Posted by *gatygun*
> 
> Also does anybody know how to disable rivatuner fps limiter but keeps the onscreen information working?


set it to 0 (zero)


----------



## gatygun

Quote:


> Originally Posted by *Ha-Nocri*
> 
> set it to 0 (zero)


Ah thanks that's it.


----------



## Lou HM

Quote:


> Originally Posted by *mfknjadagr8*
> 
> My crossfire works great now. The flicker was solved on mine by using lower antialiasing settings... I'm only using FXAA. I still had a little flicker, but the 1.07 patch fixed it in game; I still get the glitchy flicker in menus and on the level-up icon in the UI. My flicker was horrible on any reflective surface... now, going on four hours of playtime since the patch, I've had no flicker outside of the above.


Am I the only one whose performance worsened after 15.7?
My GPU utilization doesn't go beyond 75% in crossfire mode at 2160p. With 15.6 I have no problems other than flickering in the main menu, so I'm reverting back to 15.6 for now


----------



## BradleyW

Quote:


> Originally Posted by *Lou HM*
> 
> Am I the only one whose performance worsened after 15.7?
> My GPU utilization doesn't go beyond 75% in crossfire mode at 2160p. With 15.6 I have no problems other than flickering in the main menu, so I'm reverting back to 15.6 for now


Same here. I'm back to 15.5. Better scaling, less stuttering.


----------



## Lou HM

Quote:


> Originally Posted by *BradleyW*
> 
> Same here. I'm back to 15.5. Better scaling, less stuttering.


I have tried many things, including your guide, a 3rd-party uninstaller, and RadeonPro, but no luck with 15.7.

Please post if you find a solution, thanks


----------



## BradleyW

Quote:


> Originally Posted by *Lou HM*
> 
> I have tried many things, including your guide, a 3rd-party uninstaller, and RadeonPro, but no luck with 15.7.
> 
> Please post if you find a solution, thanks


Issue is with the drivers. Sorry, you'll have to wait for an update.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Waterblock the whole card mate , how else ?



----------



## StrongForce

Well, holy smokes, I didn't know that was possible.. that's pretty cool







No more risk of leaks than a normal GPU block, I suppose? (I mean, since it's like a dozen small VRMs, or half a dozen)


----------



## mus1mus

Is it just me, or does Windows 10 really do wonders?

http://www.3dmark.com/3dm11/10087864

1240 on W10 artifacts less than 1200 on 8.1.. Weird but good, eh?


----------



## mfknjadagr8

Quote:


> Originally Posted by *mus1mus*
> 
> Is it just me, or does Windows 10 really do wonders?
> 
> http://www.3dmark.com/3dm11/10087864
> 
> 1240 on W10 artifacts less than 1200 on 8.1.. Weird but good, eh?


1700 on mem nice job...that's around what my two get stock...lol


----------



## kizwan

Quote:


> Originally Posted by *By-Tor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I don't have Witcher 3. If I'm not mistaken, crossfire support for this game is still buggy. I remember someone mentioned not to update to patch 1.07. I also read that Optimize 1x1 solves the flickering, but fps drops a lot and it stutters. Did you try Optimize 1x1?
> 
> 
> 
> I'm having no problems in Witcher 3 with Crossfire or the 1.07 patch. Knock on wood!!!

Flicker-free, or just less? What is the GPU usage in crossfire?
Quote:


> Originally Posted by *velocityx*
> 
> Witcher 3 in 290 crossfire is really playable, although the flickering and stuttering depend on the in-game weather and can get kinda annoying. Not perfect, but good enough to play.
> 
> The ugly thing is that AMD apparently thought the same, i.e. that it's good enough, so I doubt they will ever fix this.


I never thought you could use the words "playable" & "stuttering" in the same sentence.








Quote:


> Originally Posted by *StrongForce*
> 
> Well holy smokes I didn't know it was possible.. that's pretty coool
> 
> 
> 
> 
> 
> 
> 
> , no more risk of leaks that a normal GPU block I suppose ? (I mean since it's like a dozens small VRMS or half a dozen)


"Normal gpu block"? The block in the picture IS the normal gpu block.








Quote:


> Originally Posted by *mus1mus*
> 
> Is it just me, or does Windows 10 really do wonders?
> 
> http://www.3dmark.com/3dm11/10087864
> 
> 1240 on W10 artifacts less than 1200 on 8.1.. Weird but good, eh?


Basically, Win8/8.1 gives a slight boost in score over Win7, and Win10 gives a slight boost over Win8/8.1?


----------



## LA_Kings_Fan

Quote:


> Originally Posted by *Erecshyrinol*
> 
> Heads up -- great deal on a 290x.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202145&cm_re=sapphire_290x-_-14-202-145-_-Product
> 
> Stock is running out on 290x and soon the inferior and more expensive 390 will be the only choice in the price range. You Yanks get extremely nice deals, I paid 250 euro for a *used* Tri-X 290x and this is $260 for a new one.


The Furys didn't blow me away ... and I think I may wait to pull the BIGGER UPGRADE until the next gen of HBM2 cards comes out in 2016 or later ... they're also slightly overpriced IMO for the time being.
Not sure why you'd say the R9 390X is inferior? More expensive, yeah, but from all reports they did make small improvements, and of course you get 8GB of RAM ... but how is the 390 inferior to the 290?

I'm REALLY tempted to buy another R9 290X at this price ... thinking of doing this instead and watercooling them with the Swiftech R9 LE blocks?

Grrrr .... not sure ... should I ... shouldn't I?


----------



## mus1mus

Quote:


> Originally Posted by *mfknjadagr8*
> 
> 1700 on mem nice job...that's around what my two get stock...lol


I have tried it higher before, 1725 actually. I am limited by core OC, though. It tapped out at 1220, producing enough artifacts and/or flickering throughout the test.

Now, on W10 with 15.7, I can clock the core higher: 1240, and then it starts to flicker a couple of times in each test. Not enough to affect the scores.
Quote:


> Originally Posted by *kizwan*
> 
> Basically, Win8/8.1 gives a slight boost in score over Win7, and Win10 gives a slight boost over Win8/8.1?


Still DL'ing FS now. Once it's up, I can confirm.







This is on AMD FX BTW. sooooo


----------



## mus1mus

Quote:


> Originally Posted by *LA_Kings_Fan*
> 
> The Fury's didn't blow me away ... and think I may wait to pull the BIGGER UPGRADE after the next gen of HBM2 cards come out in 2016 or later ... and are slightly overpriced IMO for the time being.
> *Not sure why you'd say the R9 390X is inferior ? more expensive yeah, but from all reports they did make small improvements and of course you get 8GB of RAM ... but how is the 390 inferior to the 290 ?*
> 
> I'm REALLY tempted to buy another R9 290X at this price ... Thinking do this instead and water cool them with the Swiftech R9 LE blocks ?
> 
> Grrrr .... not sure ... should I ... shouldn't I ?


He was comparing the 290X to the 390.


----------



## StrongForce

Quote:


> Originally Posted by *kizwan*
> 
> Flicker-free, or just less? What is the GPU usage in crossfire?
> I never thought you could use the words "playable" & "stuttering" in the same sentence.
> 
> 
> 
> 
> 
> 
> 
> 
> "Normal gpu block"? The block in the picture IS the normal gpu block.
> 
> 
> 
> 
> 
> 
> 
> 
> Basically, Win8/8.1 gives a slight boost in score over Win7, and Win10 gives a slight boost over Win8/8.1?


Ah, I didn't express myself properly; by normal GPU block I meant one without VRM cooling. I remember the Kraken G10 wasn't very expensive either. How much do these blocks cost, approximately?

The quote failed; I'm not really an expert at quoting









Oh, by the way, I had a problem tonight.. the screen started to flicker with blue horizontal lines.. but it only does it on some pictures in this thread, on the desktop background, and in games; currently on this page/browser there is nothing. I tried restarting the screen and it stayed black.. so I did a hard shutdown, which I hate to do...

Then at the load screen before the password it went black (BIOS and pre-boot were fine), which scared me; after a few seconds it popped up perfectly fine, but then the desktop was flickery with those blue lines again... I fear the worst.. could my card be dead? Have any of you guys experienced something similar?







I planned to buy a 980 Ti, but not just yet! And I don't even know if this card is still RMA-able..


----------



## kizwan

Quote:


> Originally Posted by *StrongForce*
> 
> Ah, I didn't express myself properly; by normal GPU block I meant one without VRM cooling. I remember the Kraken G10 wasn't very expensive either. How much do these blocks cost, approximately?
> 
> The quote failed; I'm not really an expert at quoting


Around $1XX usually.

There are 3 types of watercooling:

- GPU bracket + AIO cooler: e.g. Kraken G10 or Corsair HG10 with a compatible AIO cooler
- Universal GPU block: cools only the core
- Full waterblock: cools the core, memory & (unless stated otherwise) VRMs

A full waterblock is the "normal" GPU block, because it directly replaces the stock air cooler and cools almost everything the stock air cooler does.


----------



## LA_Kings_Fan

Quote:


> Originally Posted by *mus1mus*
> 
> He was comparing the 290X to the 390.


Ahhh, I see that now ... thanks. I thought he was comparing the 290X to the 390X ... but I'm still not sure I'd write the 390 off as INFERIOR to the 290X? They're mostly VERY similar.

Anyways ... I do'd it







... pulled the trigger and ordered a 2nd Sapphire 290X. For only *$260*, I didn't see how I could pass on it









now to get a pair of these and some other goodies soon











Would this Performance-PCs Über 655 Fully Modded (D5) 12 VDC water pump be enough for a CPU block, 2 x's GPU blocks, a 280 radiator and 2 x's 140 radiators???



http://www.performance-pcs.com/performance-pcs-b-ber-655-b-fully-modded-d5-12-vdc-water-pump-red.html


----------



## mus1mus

I won't either.







My 290 for example can hang with mid to low-high 290Xs soooo..









It should be enough. But if you want headroom, add a second one. That should raise performance as well.


----------



## gatygun

So many issues with my PC atm; I'm gonna reinstall Windows first. Got a feeling it's broken.


----------



## StrongForce

Phew, so after a while the blue lines disappeared yesterday, and today there were none.. crossing fingers..!! I wanted to try overclocking the card, but now I'm a little scared to do it.. lol


----------



## Mmonge

http://www.overclock.net/t/1561904/mlu-bios-builds-for-290x/

This BIOS changed my 290X. No more black screens, throttling or instability.


----------



## mus1mus

For those who have tried Windows 10, are you also getting fewer artifacts when pushing the cards?

I have raised my clocks a further 30 MHz before things get screwy. 15.7 on 8.1, btw, artifacts heavily at clocks where it only starts to show occasionally on 10.


----------



## mfknjadagr8

Quote:


> Originally Posted by *mus1mus*
> 
> For those who have tried Windows 10, are you also getting fewer artifacts when pushing the cards?
> 
> I have raised my clocks a further 30 MHz before things get screwy. 15.7 on 8.1, btw, artifacts heavily at clocks where it only starts to show occasionally on 10.


On that note, if I try Windows 10, does that prevent my key from being used for Windows 7 once 10 goes live, or can I use both with the same key?


----------



## mus1mus

Quote:


> Originally Posted by *mfknjadagr8*
> 
> On that note, if I try Windows 10, does that prevent my key from being used for Windows 7 once 10 goes live, or can I use both with the same key?


I believe there is that upgrade. Have you received an e-mail from MS?


----------



## Dsrt

http://www.3dmark.com/3dm/7880527 finally got my physics score to where it should be. 1.224 V @ 4625 MHz CPU.
I wanna hit that 20k total score









What waterblocks do you recommend for Asus DirectCU II R9 290Xs?


----------



## FastEddieNYC

Quote:


> Originally Posted by *Dsrt*
> 
> http://www.3dmark.com/3dm/7880527 finally got my physics to where it should be. 1.224v @ 4625Mhz cpu
> I wanna hit that 20k total score
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What waterblocks you recommend for Asus DirectCUII r9-290x's?


EK makes a block that fits that card. Your choices are limited because it's not a reference PCB.


----------



## cakedude501

I have had an MSI 290X Lightning (not LE) for a while now and never got around to joining the owners club. So here is my application.

http://www.techpowerup.com/gpuz/details.php?id=g6spk

It is currently on the aftermarket Tri-Frozr cooler it is sold with; probably around the end of this year I will put it on water and overclock it as far as it can go.


----------



## gatygun

Well, installed Windows again, and tried the Witcher with version 1.06 and the 15.7 beta. Still horrible looking. Removed the 15.7 beta and installed the 15.6 beta: big performance increase + perfectly working crossfire (other than minor flashing in places).

Upped the settings a bit, everything ultra (HairWorks off) at 1440p, and got a rock-solid 70+ fps outside the cities; inside, it drops to 45 at times (my CPU just can't handle it).

I pretty much get double the performance in this game; normally I would get ~32-40 fps everywhere on those settings. Dat scaling.

Also tried Metro 2033 (the original one), and finally I'm able to hold 60+ fps with everything on ultra. Some parts are just so taxing that I sometimes drop into the high 60s, but that's fine.

Tried to overclock my new GPU, and found out that despite its low ASIC of 71%, I could overclock it like a beast. The thing would run rock solid at +100 mV, 1160 core and 1690 memory. Got way higher scores with it in 3DMark than with my older 290. The VRM temps are also way lower on it: on my older 290 they are normally 10-15°C above the core temp, but on this one the VRM temp is the same as or lower than the core temp.

So I decided to put the new one on top. The cooling works so well that I can overclock both even on air, to 1100 core / 1400 memory (can't push the memory higher, as my older card won't go above that). If it gets colder in the winter that would surely go up to 1145 core, but for now my temps top out at 72°C on core and VRM on the top card, and 62°C core / 75°C VRM on the lower card.

The CPU, which is air-cooled, also doesn't go above 70°C, which is nice.

All in all, the performance of these cards is absolute beast mode. I had issues getting AC Unity above 40 fps with everything maxed at 1080p on a single 290, and it would drop to the low 30s at 1440p on top of that. Now it's a rock-solid 60 fps at 1440p with everything maxed; even at 1800p I get like 50 fps. The CPU is holding me back though, as fps sometimes drops, much like in the Witcher, when large crowds are in view, but nothing annoying.

Still have to test other stuff, but here's a picture of some benches.



Even though 3DMark says they're 390s, they are both 290 Tri-Xs.
Single 290 stock = 11,000
Single 290 = 13,220 (absolute best score, 1225/1650, "my first 290")
Crossfire 290 = 20,559 (if I leave it at stock with the automatic fan profile, I barely even hear the cards)
Crossfire 290 = 23,281 (dat increase, lol)

Got some new records in 3DMark for my CPU:

1) highest score for the 870.
2) highest crossfire score for my 870 CPU.
3) highest single-card score for my 870 CPU.

I should still do some runs with my new 290, as in the tests I did with overclocks it pushed a ton more fps than my first 290.

Even Metro, where my fps on a single OC'd 290 would drop to the low 40s or even high 30s at moments, now rushes along at 70/80 fps minimums. Just insane.

Loving it for sure.


----------



## LA_Kings_Fan

Quote:


> Originally Posted by *LA_Kings_Fan*
> 
> Ditto ... my little Asus RoG Maximus IV Gene Z-68 w/ a Sandy-Bridge i7-2600K @ 4.6 GHz has been running strong with no signs of getting old or slowing down as of yet either.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I moved it into a nice *Corsair Obsidian 450D case* finally, and upgraded the Power Supply (*eVGA SuperNOVA 1000 P2*) and Graphics card (Sapphire R9 290X) a while back
> ... but frankly I haven't seen much from Intel since Sandy-Bridge to even seriously consider upgrading as yet ? maybe by the time Skylake or Cannonlake comes out, but who knows ?
> 
> Only upgrades I see in the near future are to move from the little Noctua Air cooler to an AIO Swiftech H240-X Water cooler unit or a Full Custom Loop System at some point, and to add more storage, SSD's and HDD's. OH and also upgrade the Monitors ... maybe the ASUS RoG PG278Q Swift ? or ASUS MG279Q w/ FreeSync ?
> 
> *One question I DO have though is the future bottlenecking if I continue to upgrade the GPU's ...*
> 
> - would adding a second R9 290 X * be any issue ? wouldn't think so ? except for limited space and heat build up, I know.


*OK* so I posted on my MoBo page, but thought I'd ask here too ... well I couldn't pass up *$260* to get another NEW R9 290X and picked up the SAPPHIRE R9 290X Tri-X OC #100361-4L card to crossfire with my current Sapphire Reference R9 290X #100361BF4SR









 AND 

So now ...

*1*st - am I OK with my PSU ? *eVGA SuperNOVA 1000 P2* *1000* watt 80+ platinum rated power supply ... I think so, but ?

*1/b* - what about long term w/ full custom watercooling ... CPU (i7 2600k @ 4.8 Ghz), 2 x's GPU (290X OC'd @ ?), 32 GB ram, Asus RoG Max IV Gene z68 MoBo, 2 x's SSD's (a 120 & 250), 2 x's HDD's (a 1TB & 2TB - WD Black 7200's), 1 Optical Drive, RAD fans (3 x's 140's + 2 x's 120's), 2 x's 120 case fans, other misc LED's .... will the 1000 watt Platinum be fine long term ? or do I need a 1200 or even 1500 watt unit ?

*2*nd - yes, at SOME point, when I can afford to, I WILL watercool everything, but I can't do that right now! So ... which card should go where? I assume the reference (1 blower fan) card up TOP and the Tri-X (3 fan) non-ref under it, for best heat flow? Don't want the Tri-X OC card blowing all that hot air onto the backside of the reference card, right? Keep in mind my RoG Maximus IV Gene Z68 is an _m_ATX board and the pci-e slots leave little to no room between the cards ... so?

*3*rd - I assume I should use the Tri-X OC card as the primary card to plug the monitors into? Though, other than everything HAVING to run through ONE card, I don't think it makes much of a difference, seeing as they have the same video outputs?


----------



## Scorpion49

With both my 290Xs at 1100+, my rig pulls 600 W from the wall. I have the same PSU.


----------



## crazycrave

Physics Score: 18203

That's a nice score; I've gotten a little over 15,000 with my Xeon X5660 so far. Also, I picked up one of those Sapphire Tri-X 290Xs today, as $259.99 is dirt cheap. I have the cash to buy two of them, but the issues with X58 and crossfire for the 290/290X are something I don't care to try. The board has run a reference 290 before without issue, before I sold it to bitcoin, and I have run crossfire 5850s and 7950s before without issues, but the ole power supply is 5+ years old now.


----------



## kizwan

Quote:


> Originally Posted by *LA_Kings_Fan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *LA_Kings_Fan*
> 
> Ditto ... my little Asus RoG Maximus IV Gene Z-68 w/ a Sandy-Bridge i7-2600K @ 4.6 GHz has been running strong with no signs of getting old or slowing down as of yet either.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I moved it into a nice *Corsair Obsidian 450D case* finally, and upgraded the Power Supply (*eVGA SuperNOVA 1000 P2*) and Graphics card (Sapphire R9 290X) a while back
> ... but frankly I haven't seen much from Intel since Sandy-Bridge to even seriously consider upgrading as yet ? maybe by the time Skylake or Cannonlake comes out, but who knows ?
> 
> Only upgrades I see in the near future are to move from the little Noctua Air cooler to an AIO Swiftech H240-X Water cooler unit or a Full Custom Loop System at some point, and to add more storage, SSD's and HDD's. OH and also upgrade the Monitors ... maybe the ASUS RoG PG278Q Swift ? or ASUS MG279Q w/ FreeSync ?
> 
> *One question I DO have though is the future bottlenecking if I continue to upgrade the GPU's ...*
> 
> - would adding a second R9 290 X * be any issue ? wouldn't think so ? except for limited space and heat build up, I know.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *OK* so I posted on my MoBo page, but thought I'd ask here too ... well I couldn't pass up *$260* to get another NEW R9 290X and picked up the SAPPHIRE R9 290X Tri-X OC #100361-4L card to crossfire with my current Sapphire Reference R9 290X #100361BF4SR
> 
> 
> 
> 
> 
> 
> 
> 
> 
> AND
> 
> So now ...
> 
> *1*st - am I OK with my PSU ? *eVGA SuperNOVA 1000 P2* *1000* watt 80+ platinum rated power supply ... I think so, but ?
> 
> *1/b* - what about long term w/ full custom watercooling ... CPU (i7 2600k @ 4.8 Ghz), 2 x's GPU (290X OC'd @ ?), 32 GB ram, Asus RoG Max IV Gene z68 MoBo, 2 x's SSD's (a 120 & 250), 2 x's HDD's (a 1TB & 2TB - WD Black 7200's), 1 Optical Drive, RAD fans (3 x's 140's + 2 x's 120's), 2 x's 120 case fans, other misc LED's .... will the 1000 watt Platinum be fine long term ? or do I need a 1200 or even 1500 watt unit ?
> 
> *2*nd - yes at SOME point when I can afford to, I WILL water cool everything, but I can't do that right now ! so ... which card should go where ? I assume the Reference (1 blower fan) Card up TOP and the Tri-X (3 fan) non-Ref under it ? for best heat flow ? Don't want the Tri-X OC card blowing all that hot air onto the backside of the Reference card right ? Keep in mind my RoG Maximus IV Gene Z68 is a _m_ATX board and the pci-e slots leave little to no room between the cards ... so ?
> 
> *3*rd - assume I should use the Tri-X OC card as the Primary card to plug the monitors into ? though other than I think everything HAS to run through ONE card, I don't think it makes much of a difference seeing as they have the same video outputs ?

I have two 290s, watercooling, a 3820 @ 4.75GHz & a 1050W PSU. The PSU does feel warm to the touch when gaming. I have also clocked my cards (for gaming) from 1110/1350 to 1150/1600.


----------



## StrongForce

First thing.. lol, I downloaded Afterburner.. I open it and it hangs my PC.. GREAT.

Downloaded Trixx and now it works fine

However, umm, I get a hell of an overheat with my R9 290X Windforce... I only pushed the core to 1070 with +10 mV and it's already burning at 87°C after a couple of minutes, and VRM temperature 1 is already at 102°C, what the...







And I thought the airflow of a case like the Enthoo Primo would help my temps..


----------



## LA_Kings_Fan

Quote:


> Originally Posted by *kizwan*
> 
> I have two 290s, watercooling, 3820 @4.75GHz & 1050W PSU. PSU did feels warm to the touch when gaming. I also have clocked my cards (gaming) from 1110/1350 to 1150/1600.


Your 1050 PSU being that Seasonic X 80+ GOLD unit, right?

... so I should be right in about that same range with the eVGA Platinum unit ... at least there's no real reason to justify spending another $300 on a 1200 watt PSU at this point









Thanks for the feedback


----------



## StrongForce

Seems to run much cooler in game, phew; those GPU burn tests are crazy, lol. At 1080 core now, will do more testing.

Update: 25.7°C ambient, 1110 core, only +10 mV. Core: 72, VRM 1: 69, VRM 2: 57. Sweet.


----------



## pengs

Yeah, Furmark.... there's really no need for it. Just use Firestrike, Heaven and Valley to test stability.


----------



## gatygun

Quote:


> Originally Posted by *LA_Kings_Fan*
> 
> *OK* so I posted on my MoBo page, but thought I'd ask here too ... well I couldn't pass up *$260* to get another NEW R9 290X and picked up the SAPPHIRE R9 290X Tri-X OC #100361-4L card to crossfire with my current Sapphire Reference R9 290X #100361BF4SR
> 
> 
> 
> 
> 
> 
> 
> 
> 
> AND
> 
> So now ...
> 
> *1*st - am I OK with my PSU ? *eVGA SuperNOVA 1000 P2* *1000* watt 80+ platinum rated power supply ... I think so, but ?
> 
> *1/b* - what about long term w/ full custom watercooling ... CPU (i7 2600k @ 4.8 Ghz), 2 x's GPU (290X OC'd @ ?), 32 GB ram, Asus RoG Max IV Gene z68 MoBo, 2 x's SSD's (a 120 & 250), 2 x's HDD's (a 1TB & 2TB - WD Black 7200's), 1 Optical Drive, RAD fans (3 x's 140's + 2 x's 120's), 2 x's 120 case fans, other misc LED's .... will the 1000 watt Platinum be fine long term ? or do I need a 1200 or even 1500 watt unit ?
> 
> *2*nd - yes at SOME point when I can afford to, I WILL water cool everything, but I can't do that right now ! so ... which card should go where ? I assume the Reference (1 blower fan) Card up TOP and the Tri-X (3 fan) non-Ref under it ? for best heat flow ? Don't want the Tri-X OC card blowing all that hot air onto the backside of the Reference card right ? Keep in mind my RoG Maximus IV Gene Z68 is a _m_ATX board and the pci-e slots leave little to no room between the cards ... so ?
> 
> *3*rd - assume I should use the Tri-X OC card as the Primary card to plug the monitors into ? though other than I think everything HAS to run through ONE card, I don't think it makes much of a difference seeing as they have the same video outputs ?


For what it's worth.

I just bought a second 290 and overclocked both cards to 1100/1375 in crossfire, and I run a 7-year-old 1000 W Zalman power supply. Works like a champ. Just make sure you can get 40 amps to each card. My old PSU has 2x 8-pins, which are 28 A each, and 2x 6-pins, which are 18 A each. So enough juice.

I even have a heavily OC'd 870 with tons of voltage on it, which eats 100-200 W. I've got 3x 1 TB hard disks + a 128 GB SSD, like 8 fans + a Noctua cooler with 3 fans on it, a DVD burner, and 8 GB of RAM.

As you have a bit more RAM etc., I think you will be absolutely fine, especially as your CPU eats less power than mine.

Also, if you get the Tri-X model, put it in the upper slot, as that one usually gets hotter, and the Tri-X cools itself better than your stock GPU.
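For what it's worth, those amp figures square with the PCI-E spec limits (75 W from the x16 slot, 75 W per 6-pin, 150 W per 8-pin). Here's a rough sketch of the arithmetic, assuming those spec limits and a 12 V rail; the function names are just illustrative:

```python
# Back-of-the-envelope PCIe power budget, assuming the spec limits:
# 75 W from the x16 slot, 75 W per 6-pin, 150 W per 8-pin, 12 V rail.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W, RAIL_V = 75, 75, 150, 12.0

def card_budget_watts(six_pins: int, eight_pins: int) -> int:
    """In-spec power available to one card from slot + PEG cables."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

def rail_watts(amps: float) -> float:
    """Watts a single 12 V cable rated at `amps` can deliver."""
    return amps * RAIL_V

# Reference 290/290X: one 6-pin + one 8-pin.
print(card_budget_watts(1, 1))  # 300
# A 28 A 8-pin cable (as on the Zalman above):
print(rail_watts(28))           # 336.0
```

By that math a reference 290 has a 300 W in-spec budget, while a 28 A cable alone is good for 336 W, so the connector spec, not the wiring, is the limit here.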


----------



## yawa

Well, here's the highest score with the "no tess" mod I could achieve on 15.7.

Newest bench...

http://www.3dmark.com/fs/5505932

yawa--- i7 4790k at 4.6 GHz --- 290X 1264/1530 --- 11468

GPU Score: 14714
Physics Score: 13011
Combined: 4049


----------



## alancsalt

Does tess even affect this bench? Tess off is allowed on HWbot to help even the playing field. Nvidia owners like me have access to other undetectable changes. It has nothing to do with "real men" and more to do with the difficulties in detecting driver changes.


----------



## yawa

Quote:


> Originally Posted by *alancsalt*
> 
> Does tess even affect this bench? Tess off is allowed on HWbot to help even the playing field. Nvidia owners like me have access to other undetectable changes. It has nothing to do with "real men" and more to do with the difficulties in detecting driver changes.


It might or might not; I honestly can't remember. I think it's more of a Heaven/Valley thing. I'm referencing it because it was a running gag in the benchmarking threads last year.


----------



## alancsalt

It affects 3DM11.. a 6 to 11% gain


----------



## kizwan

Quote:


> Originally Posted by *alancsalt*
> 
> Does tess even affect this bench? Tess off is allowed on HWbot to help even the playing field. Nvidia owners like me have access to other undetectable changes. It has nothing to do with "real men" and more to do with the difficulties in detecting driver changes.


With FS it only gives you a couple of FPS boost in the first two tests & an increase of around 500 in graphics score.

EDIT: This is tess default vs. tess 2x


----------



## mus1mus

Quote:


> Originally Posted by *kizwan*
> 
> With FS it only gives you a couple of FPS boost in the first two tests & an increase of around 500 in graphics score.
> 
> EDIT: This is tess default vs. tess 2x


I have tried this, and everything I did got detected.







I'm so noob.

Scores indeed scale as mentioned.


----------



## aaroc

How do you test that the OC of the GPU is not causing problems, for single GPU and CFX?
For a watercooled GPU, what is a safe OC? What should one adjust in AB? Only mem and core clocks, or others too?


----------



## mus1mus

Quote:


> Originally Posted by *aaroc*
> 
> How do you test that the OC of the GPU is not causing problems, for single GPU and CFX?
> For a watercooled GPU, what is a safe OC? What should one adjust in AB? Only mem and core clocks, or others too?


Game on it.

A safe OC is the one you can dial that won't cause you issues.

Core is always king. But if you are somewhat limited by core OC (can't get past 1100, for example), set your sights on the VRAM. 1500 VRAM is easy.

Lastly, for water, go with Trixx. 1.4 Volts here for a year or so. No issues.


----------



## aaroc

Quote:


> Originally Posted by *mus1mus*
> 
> Game on it.
> 
> A safe OC is the one you can dial that won't cause you issues.
> 
> Core is always king. But if you are somewhat limited by Core OC, can't get past 1100 for example, set your eyes to the Vram. 1500 VRAM is easy.
> 
> Lastly, for water, go with Trixx. 1.4 Volts here for a year or so. No issues.


But how do you tell? You see artifacts, the game crashes, Windows crashes?
If you want to OC both CPU and GPUs, which one do you OC first?
What has your experience been?
Thanks!!!!!!!


----------



## diggiddi

Start with the GPU, it's easier to overclock


----------



## HOMECINEMA-PC

Always dial in CPU and Ram o/c before GPU


----------



## mus1mus

I don't have OC standards for CPU or GPU OC, but I would always start from a stable setting before OC'ing either the CPU or GPU.

On GPU OC, dial in an OC and verify by running Fire Strike first, then Heaven and Valley. Do maybe a couple of runs each just to check if something is amiss.

Fire Strike first, as it seems to be more sensitive in detecting faults with the GPU OC and driver. I can pass Heaven and Valley without artifacts at up to 20MHz more than where FS shows artifacts or crashes. Even 3DMark 11 is easier to pass. All tested on my 290, 780, and 980 Ti.

If that's not enough to determine stability, run your games. Games will be your determining factor as they will be the ones you will deal with daily.


----------



## LA_Kings_Fan

Simple Question ... *Swiftech KOMODO-R9-LE* & the Sapphire Tri-X OC R9 290X NEW EDITION #*100361-4L* Card ...

... ARE THEY COMPATIBLE ?

It's the NEW VERSION of their R9 290X Card ... "_The new SAPPHIRE R9 290X Tri-X 4GB OC is built on a SAPPHIRE original pcb design with the power upgraded to a 6-phase VDDC power design, and now has a Two 8-pin power connector (vs the 6+8 pin on prior #100361-2SR card) to ensure adequate system power is available (up to 375 Watts total including PCI-Express power)_".

Them saying it's "built on a SAPPHIRE original pcb design", and there's a NEW upgraded 6-phase VDDC power design, W/ Two 8-pin power connector" would have me thinking *MAYBE NOT*?

Can anyone chime in ? I saw on EK's config page it said NO ... it's not a Ref PCB ... so I'm thinking it's a NO GO ?









@ only *$260* on NewEgg I had to get one, and I already have a Sapphire reference R9 290X that I planned on getting a KOMODO-R9-LE for ... but wanted to know IF that water block would work on BOTH Cards ?

Thanks








- Ash


----------



## MissHaswellE

Hi everyone. I managed to land a 290X and was wondering what programs I could use to overclock it that would give me the right control over the card.
I used Sapphire TRIXX for a bit of overclocking since I came across a HardOC thread about it. I copied the stock clocks on their page for the TriX OC. It seems to run nicely at around 64~67C.
Is it worth pushing it harder for general gaming? Say 1440p resolution gaming?

TRIXX shows Core, Memory and VDDC offset, but it also has a Power limit as well.
Is TRIXX too limited for stable and properly tuned overclocking? Or should I find something better?


----------



## mfknjadagr8

Quote:


> Originally Posted by *LA_Kings_Fan*
> 
> Simple Question ... *Swiftech KOMODO-R9-LE* & the Sapphire Tri-X OC R9 290X NEW EDITION #*100361-4L* Card ...
> 
> ... ARE THEY COMPATIBLE ?
> 
> It's the NEW VERSION of their R9 290X Card ... "_The new SAPPHIRE R9 290X Tri-X 4GB OC is built on a SAPPHIRE original pcb design with the power upgraded to a 6-phase VDDC power design, and now has a Two 8-pin power connector (vs the 6+8 pin on prior #100361-2SR card) to ensure adequate system power is available (up to 375 Watts total including PCI-Express power)_".
> 
> Them saying it's "built on a SAPPHIRE original pcb design", and there's a NEW upgraded 6-phase VDDC power design, W/ Two 8-pin power connector" would have me thinking *MAYBE NOT*?
> 
> Can anyone chime in ? I saw on EK's config page it said NO ... it's not a Ref PCB ... so I'm thinking it's a NO GO ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @ only *$260* on NewEgg I had to get one, and I already have a Sapphire reference R9 290X that I planned on getting a KOMODO-R9-LE for ... but wanted to know IF that water block would work on BOTH Cards ?
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> - Ash


Send @BramSLI1 a PM and ask; he is the Swiftech rep on here.


----------



## kizwan

Quote:


> Originally Posted by *LA_Kings_Fan*
> 
> Simple Question ... *Swiftech KOMODO-R9-LE* & the Sapphire Tri-X OC R9 290X NEW EDITION #*100361-4L* Card ...
> 
> ... ARE THEY COMPATIBLE ?
> 
> It's the NEW VERSION of their R9 290X Card ... "_The new SAPPHIRE R9 290X Tri-X 4GB OC is built on a SAPPHIRE original pcb design with the power upgraded to a 6-phase VDDC power design, and now has a Two 8-pin power connector (vs the 6+8 pin on prior #100361-2SR card) to ensure adequate system power is available (up to 375 Watts total including PCI-Express power)_".
> 
> Them saying it's "built on a SAPPHIRE original pcb design", and there's a NEW upgraded 6-phase VDDC power design, W/ Two 8-pin power connector" would have me thinking *MAYBE NOT*?
> 
> Can anyone chime in ? I saw on EK's config page it said NO ... it's not a Ref PCB ... so I'm thinking it's a NO GO ?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @ only *$260* on NewEgg I had to get one, and I already have a Sapphire reference R9 290X that I planned on getting a KOMODO-R9-LE for ... but wanted to know IF that water block would work on BOTH Cards ?
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> - Ash


I don't think so, because that block is only for the reference PCB. I remember someone here in this thread having problems fitting a waterblock on the new Tri-X.
Quote:


> Originally Posted by *MissHaswellE*
> 
> Hi everyone. I managed to land a 290X and was wondering what programs I could use to overclock it that would give me the right control over the card.
> I used Sapphire TRIXX for a bit of overclocking since I came across a HardOC thread about it. I copied the stock clocks on their page for the TriX OC. It seems to run nicely at around 64~67C.
> Is it worth pushing it harder for general gaming? Say 1440p resolution gaming?
> 
> TRIXX shows Core, Memory and VDDC offset, but it also has a Power limit as well.
> Is TRIXX too limited for stable and properly tuned overclocking? Or should I find something better?


Trixx & MSI AB are the ones most people use to overclock. I personally prefer Trixx.


----------



## gatygun

Played Metro Last Light yesterday, and dat 1080p ~140 fps on ultra settings; 2x SSAA = 90 fps, 4x = ~60 fps. Just insane lol. I was honestly waiting for the framerate to tank at spots and had the feeling I must be playing on low settings. But that wasn't the case at all.
Quote:


> Originally Posted by *MissHaswellE*
> 
> Hi everyone. I managed to land a 290X and was wondering what programs I could use to overclock it that would give me the right control over the card.
> I used Sapphire TRIXX for a bit of overclocking since I came across a HardOC thread about it. I copied the stock clocks on their page for the TriX OC. It seems to run nicely at around 64~67C.
> Is it worth pushing it harder for general gaming? Say 1440p resolution gaming?
> 
> TRIXX shows Core, Memory and VDDC offset, but it also has a Power limit as well.
> Is TRIXX too limited for stable and properly tuned overclocking? Or should I find something better?


Trixx is probably the best. People use MSI Afterburner, but it can limit your OC potential because of RivaTuner, which it bundles so you can see what your GPU/system is doing on screen.

Just push the power limit to +50 and increase the voltage, especially if you can keep the card cooler than 80C; up the voltage and clock speeds to see what the max is you can get out of it.

For 1440p, every little bit of OC helps, as 1440p is still really demanding for a single 290X.


----------



## mus1mus

Trixx for the omppph!

1150 on the cores if you can keep it cool should be very noticeable on 1440p.


----------



## MissHaswellE

Alright so I got her at

1100 core
5900 Memory
50 VDDC
50 Power limit
GPU Z says the ASIC quality is 81.1%
3dmark FS runs flawlessly and I'm getting 11275

I was running a test on GW2 though, and it was spiking up to 72C.
Seems a little bit high? or is that pretty average for a 290X?


----------



## sinnedone

Temp is really good. On the reference models they hit 94-95C all day unless you make a custom fan profile.


----------



## MissHaswellE

Quote:


> Originally Posted by *sinnedone*
> 
> Temp is real good. On the reference models they hit 94-95 all day unless you make a custom fan profile.


Yeah, my custom profile has it set to 100% fan speed above 60C.
It's a dual-fan open-air cooler though. Dual 80mm fans, I think.
How I'd love to grab a pair of Noctua A-series 92mm fans and replace the stock ones.


----------



## aaroc

Quote:


> Originally Posted by *MissHaswellE*
> 
> Alright so I got her at
> 
> 1100 core
> 5900 Memory
> 50 VDDC
> 50 Power limit
> GPU Z says the ASIC quality is 81.1%
> 3dmark FS runs flawlessly and I'm getting 11275
> 
> I was running a test on GW2 though, and it was spiking up to 72C.
> Seems a little bit high? or is that pretty average for a 290X?


Are there two ways to measure memory speed? Other users said they have from 950 to 1500, and you write 5900. AB only allows up to 1600 when moving the memory clock slider. Thanks!


----------



## Maticb

Quote:


> Originally Posted by *aaroc*
> 
> Are there two ways to measure Memory speed? Other users said they have from 950 to 1500 and you write 5900. AB only allows up to 1600 moving the slide of memory clock. Thanks!


It is 1475 x 4 = 5900, because GDDR5's effective rate is quadrupled.


----------



## Scorpion49

Quote:


> Originally Posted by *aaroc*
> 
> Are there two ways to measure Memory speed? Other users said they have from 950 to 1500 and you write 5900. AB only allows up to 1600 moving the slide of memory clock. Thanks!


GDDR5 runs at 4x, so 1700MHz is 6800MHz effective. Some programs show the single-direction speed, so that's where you see the lowest number.
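The conversions above are just multiplication, but the three numbers tools report can be confusing, so here is a small sketch. The function name and the exact labels are mine, not from any tool; the assumption is the usual GDDR5 convention where the slider value in Afterburner/Trixx is the base clock and the advertised "effective" rate is 4x that.

```python
def gddr5_rates(base_mhz: int) -> dict:
    """Return the clock figures different tools may report for GDDR5.

    base_mhz: the value shown on the Afterburner/Trixx memory slider.
    """
    return {
        "base_mhz": base_mhz,           # slider value (e.g. 1250 on a stock 290X)
        "ddr_mhz": base_mhz * 2,        # double-data-rate figure some tools show
        "effective_mhz": base_mhz * 4,  # quad-pumped effective data rate
    }

# The 1475 vs 5900 numbers from the posts above:
print(gddr5_rates(1475))  # effective_mhz comes out to 5900
```

So a tool reporting 1475, one reporting 2950, and one reporting 5900 are all describing the same memory clock.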


----------



## MissHaswellE

Quote:


> Originally Posted by *aaroc*
> 
> Are there two ways to measure Memory speed? Other users said they have from 950 to 1500 and you write 5900. AB only allows up to 1600 moving the slide of memory clock. Thanks!


Use Sapphire TRIXX's effective memory clocks.


----------



## skyrrd

Just tried some basic overclocking with my 290 tri-x New edition

+13 in voltage offset
+50% power limit
1140 core clock
1440 memory clock

Didn't try any harder yet, but with this it stays cool (70° core, 60° VRM1) on the automatic fan profile, and quiet, and I have plenty of headroom left.
That's with 71% ASIC.


----------



## cephelix

Quote:


> Originally Posted by *skyrrd*
> 
> Just tried some basic overclocking with my 290 tri-x New edition
> 
> +13 in voltage offset
> +50% power limit
> 1140 core clock
> 1440 memory clock
> 
> Didn't try any harder yet, but with this it stays cool (70° core, 60° VRM1) on the automatic fan profile, and quiet, and I have plenty of headroom left.
> That's with 71% ASIC.


That is really good! I recently tried upping the clocks on mine, from 1100 to 1150 with +87mV. Too hot to play without the AC on though.


----------



## spyshagg

Did anyone ever have a coil "rattling" sound, like something touching a fan type of noise?

I'm on watercooling full cover block, so no fans. But the rattling sound is surely coming from the card.


----------



## cephelix

Quote:


> Originally Posted by *spyshagg*
> 
> Did anyone had a coil "rattling" sound like something touching a fan type of noise?
> 
> I'm on watercooling full cover block, so no fans. But the rattling sound is surely coming from the card.


Coil whine? Is it happening when switching from 2D to 3D and back to 2D clocks?


----------



## BradleyW

Quote:


> Originally Posted by *spyshagg*
> 
> Did anyone had a coil "rattling" sound like something touching a fan type of noise?
> 
> I'm on watercooling full cover block, so no fans. But the rattling sound is surely coming from the card.


Coil whine can also sound like rattling. You just simply have vibrating coils.


----------



## spyshagg

Just in 3D apps, even with low load. The rattle varies from low load to high load, but it's always there in 3D.

I don't remember this card making this sound with the stock air cooler.

It's an Asus DCII with an EK full-cover block + backplate.


----------



## cephelix

I'm inclined to go with @bradleyw
I have coil whine as well, but the only time it's really obvious is when I run and stop Valley.


----------



## BradleyW

Quote:


> Originally Posted by *spyshagg*
> 
> just in 3D apps, even with low load. The rattle varies from low load to high load, but its always there in 3D.
> 
> *I dont remember this card doing this sound with the stock air cooler.*
> 
> Its an Asus DCII with EK fullcover + backplace


Fan was most likely louder than the rattling.


----------



## blackhole2013

Quote:


> Originally Posted by *cephelix*
> 
> That is really good! I recently tried upping the clocks on mine, from 1100 to 1150 with a +87mv. Too hot to play without the ac on though


I can't get my PowerColor Devil 13 290X2 past 1070MHz or it will overheat, and when it hits around 80C it will start to artifact. But to be fair, it is two 290X chips on air on one card... I could cook eggs on it. I could probably use it as a heater in my house lol.


----------



## cephelix

Quote:


> Originally Posted by *blackhole2013*
> 
> I cant get my powercolor devil 13 290x2 past 1070 mhz or it will overheat and when it hits around 80c it will start to artifact but to be fair it is two 290x chips on air on one card..... I could cook eggs on it ..I probably could use it as a heater in my house lol.


That card is a whole different beast. Can't compare that to my MSI R9 290 4G. Is there a way to disable the 2nd chip on the devil13 though? Mine hits 85C @ 1100/1450/+56mV with 30C ambients. That's the best I could do on air with CLU applied to the die, fujipoly on VRM1, 70-80% gpu fan speed and 100% fan speed on an AP-15 strapped to the unused pcie slots. Turning on the AC to 26C gets me to 70C so I'm guessing it's definitely a case airflow problem but I can't find any way to rectify it without modding my 750D case.


----------



## blackhole2013

Quote:


> Originally Posted by *cephelix*
> 
> That card is a whole different beast. Can't compare that to my MSI R9 290 4G. Is there a way to disable the 2nd chip on the devil13 though? Mine hits 85C @ 1100/1450/+56mV with 30C ambients. That's the best I could do on air with CLU applied to the die, fujipoly on VRM1, 70-80% gpu fan speed and 100% fan speed on an AP-15 strapped to the unused pcie slots. Turning on the AC to 26C gets me to 70C so I'm guessing it's definitely a case airflow problem but I can't find any way to rectify it without modding my 750D case.


Well yeah, when I play games that don't have a Crossfire profile I can get to 1125 on the one chip. They both can do 1125, but like I said, even with the 100% jet-turbine-engine-sounding fans going, in Crossfire mode it will hit 80C within 10 mins and start to artifact. It's a weird thing, artifacting always starts at 80C. I thought these chips could get hotter than that?


----------



## skyrrd

Well yeah, the Tri-X cooler does a pretty decent job ;-) I also tried some undervolting of -83mV with stock clocks (1000/1300), which was stable in the 3D profile but failed at idle.


----------



## cephelix

Quote:


> Originally Posted by *blackhole2013*
> 
> Well yea when I play games that dont have a crossfire profile I can get to 1125 on the one chip .. they both can do 1125 but like i said at even 100% jet turbine engine sounding fans going in crossfire mode it will hit 80c with in 10 mins and start to artifact ... its a weird thing artifacting always starts at 80c I thought these chips could get hotter than that ?


Well, I've never had a multi-card setup, let alone an anything-x2 lol. I was under the assumption that the card always ran in Crossfire and you couldn't disable it. The die itself has a thermal limit of 95C. Maybe it's something else that can't handle the voltages at that particular temperature, but what do I know.
Quote:


> Originally Posted by *skyrrd*
> 
> Well yeah the tri-x cooler does a pretty decent job ;-) i also tried some undervolting of -83 mV with stock clocks (1000/1300) which was stable in 3d profile but failed on idle.


Yeah, I've heard good things about the Tri-X cooler. I've been tempted myself on multiple occasions to buy a Tri-X 290, but the mismatch with my current card would drive me insane and, as mentioned, I don't think my case could handle the temp increase.


----------



## MissHaswellE

I'd love to see a Tri-X OC cooler on an AMD Fury X.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *cephelix*
> 
> I'm inclined to go with @bradleyw
> I have coil whine as well but only time it's really obvious is when i run and stop valley


Coil whine







Tells me the bench is still running when I'm blackscreen benchmarking.


----------



## cephelix

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Coil wine
> 
> 
> 
> 
> 
> 
> 
> Tells me the bench is still running when im blackscreen benchmarking


lol, you run it that far? But I suppose if you're doing it for numbers, it's understandable. I'm too cautious to knowingly push my hardware that much.

@MissHaswellE well, unfortunately you could only see it on the Fury and not the X since the X's are like the titans, only reference designs allowed


----------



## mus1mus

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Coil wine
> 
> 
> 
> 
> 
> 
> 
> Tells me the bench is still running when im blackscreen benchmarking


Lucky you.









When I blackscreen with no coil whine, all I can do is wait for the score.


----------



## blackhole2013

Quote:


> Originally Posted by *cephelix*
> 
> Well, i've never had a multi-card setup let alone an anything x 2 lol.i was under tghe assumption that the card always ran in crossfire and you couldn't disable it. The die itself has a thermal limit of 95C. Maybe it's something else that can't handle the voltages at that particular temperature but what do I know.


I wish. Nope, it's just like crossfiring 2 separate cards: you can turn Crossfire on and off in CCC, and if there's no Crossfire profile for a game (e.g. Batman: Arkham Knight) it will only run on one chip. And yeah, it's the only card around called the 290X2, not like the other 295X2s... they custom-made this card before there was such a thing as a 295X2, and they only made 250 cards, all custom. It's a rare beast.


----------



## cephelix

Quote:


> Originally Posted by *blackhole2013*
> 
> I wish. Nope, it's just like crossfiring 2 separate cards: you can turn Crossfire on and off in CCC, and if there's no Crossfire profile for a game (e.g. Batman: Arkham Knight) it will only run on one chip. And yeah, it's the only card around called the 290X2, not like the other 295X2s... they custom-made this card before there was such a thing as a 295X2, and they only made 250 cards, all custom. It's a rare beast.


That is rare indeed. Maybe I should start dropping money on limited-production stuff....nahhhh....lol


----------



## blackhole2013

Quote:


> Originally Posted by *cephelix*
> 
> That is rare indeed.maybe i should start dropping money on limited production stuff....nahhhhh....lol


Well, the original price for my card last year was $1500, and I only paid $649 on Newegg like 3 months ago; I guess they were getting rid of the last of them they had. For the price, plus the free $100 Razer Ouroboros mouse that came with it, it was a no-brainer for me to buy it.


----------



## cephelix

Quote:


> Originally Posted by *blackhole2013*
> 
> Well, the original price for my card last year was $1500, and I only paid $649 on Newegg like 3 months ago; I guess they were getting rid of the last of them they had. For the price, plus the free $100 Razer Ouroboros mouse that came with it, it was a no-brainer for me to buy it.


That is quite the deal. Don't know how much prices are locally for that card or if they're even sold here. Was initially thinking of getting the fury x but after thinking about it, the 290 suits my gaming needs just fine. And i don't game all that often. Though to be fair, i can't say i'm not tempted by a 295x2


----------



## MissHaswellE

Quote:


> Originally Posted by *cephelix*
> 
> lol, you run it that far? but i suppose if you're doing it for numbers, it's understandable. i'm too cautious to knowingly push my hardware that much.
> 
> @MissHaswellE well, unfortunately you could only see it on the Fury and not the X since the X's are like the titans, only reference designs allowed


I'd be willing to bet that AMD isn't that stupid, given this time frame and low quarterly sales.
I'd like to save up for a Fury X, but I don't like liquids near my PC, much less inside of it.


----------



## cephelix

Quote:


> Originally Posted by *MissHaswellE*
> 
> I'd be willing to bet that AMD isn't that stupid within this time frame, and low quarter sales.
> I'd like to save up for a Fury X, but I don't like liquids near my PC, much less inside of it.


It's not so bad really. I've had a full custom loop and also an Accelero Hybrid. Though nothing beats the peace of mind of air cooling.


----------



## battleaxe

So excited to get my new 290x...

can't believe I got her for $260 what a deal...

Hope she clocks well too!


----------



## cephelix

Quote:


> Originally Posted by *battleaxe*
> 
> So excited to get my new 290x...
> 
> can't believe I got her for $260 what a deal...
> 
> Hope she clocks well too!


Congrats! what brand?


----------



## MrKZ

What is the best way to test a 290X for stability? I ran some Firestrike tests at 1125/1430 +50mV and it was fine; Far Cry 4 and Metro LL also seem to run fine, but Shadow of Mordor will artifact after a few seconds in the menu and then freeze.


----------



## BradleyW

Quote:


> Originally Posted by *MrKZ*
> 
> What is the best way to test the 290x for stability? I ran some firestrike tests at 1125/1430 +50mv and it was fine, also far cry 4 and metroLL seems to run fine but shadow of mordor will artifact after some seconds in menu then it will freeze


Assassin's Creed Unity.


----------



## MissHaswellE

Quote:


> Originally Posted by *MrKZ*
> 
> What is the best way to test the 290x for stability? I ran some firestrike tests at 1125/1430 +50mv and it was fine, also far cry 4 and metroLL seems to run fine but shadow of mordor will artifact after some seconds in menu then it will freeze


Interestingly:
TERA.
It's a free MMORPG built on Unreal Engine 3 with really heavy graphics, and surprisingly it's REALLY fussy about GPU overclocks.
TERA does not like overclocks at all, and it throws up black screens pretty quickly if your OC isn't stable.
If you're stable playing TERA, you'd probably be stable playing just about every other game on the market.


----------



## battleaxe

Quote:


> Originally Posted by *cephelix*
> 
> Congrats! what brand?


Sapphire Tri-X... it was the latest deal on Newegg.









Pretty excited about getting it too. 290x is still a great card. Can't be beat for the money IMO, and I own a 970, so I have no allegiance to anything.

I sure hope she does 1200+ on the core... fingers crossed.


----------



## mirzet1976

Guys, I got the same result, 10313, in two different setups:
First run
Fx8320 - 5126mhz
R9 290 - 1290/1600mhz

Second run
Fx 8320E - 4.8ghz
R9 290 - 1250/1625mhz


----------



## passinos

Anyone one using a MCR140-X Drive on a 290x?


----------



## MissHaswellE

Quote:


> Originally Posted by *mirzet1976*
> 
> Guys i have in two diferent setup same result 10313
> First run
> Fx8320 - 5126mhz
> R9 290 - 1290/1600mhz
> 
> Second run
> Fx 8320E - 4.8ghz
> R9 290 - 1250/1625mhz


Check the Graphics score rather than the overall score.


----------



## pengs

Quote:


> Originally Posted by *blackhole2013*
> 
> I wish. Nope, it's just like crossfiring 2 separate cards: you can turn Crossfire on and off in CCC, and if there's no Crossfire profile for a game (e.g. Batman: Arkham Knight) it will only run on one chip. And yeah, it's the only card around called the 290X2, not like the other 295X2s... they custom-made this card before there was such a thing as a 295X2, and they only made 250 cards, all custom. It's a rare beast.


Almost got one of those back in April, when it dropped below $600, $50 under a 295X2. That has got to be the heaviest card ever built from the look of it: 2.5 slots' worth of solid aluminum and copper. It's got to be a beast.
Quote:


> Originally Posted by *MrKZ*
> 
> What is the best way to test the 290x for stability? I ran some firestrike tests at 1125/1430 +50mv and it was fine, also far cry 4 and metroLL seems to run fine but shadow of mordor will artifact after some seconds in menu then it will freeze


Valley is alright but a little lacking as far as stress goes; there are games which will crash where Valley will barely artifact. I usually take the long route: set the clocks/voltage, give Valley one run and look for artifacts, then game on it, backing the voltage down until a display driver crash happens, and save the stable profile. Takes about a week, but it's solid, as it's been run through 10-12 games.
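The voltage trimming described above is just a step-down search: keep lowering the offset until something crashes, then keep the last offset that survived. A minimal sketch, where `run_session` is a hypothetical stand-in for "play games at this offset and report whether it stayed stable" (in practice that step takes days, not a function call):

```python
def find_min_voltage(start_mv: int, step_mv: int, run_session) -> int:
    """Lower the voltage offset in `step_mv` decrements until `run_session`
    reports a crash, then return the last offset that stayed stable.

    run_session(offset) -> bool is assumed to return True if no display
    driver crash happened at that offset.
    """
    offset = start_mv
    # Keep stepping down while the next lower offset is still non-negative
    # and still survives a session.
    while offset - step_mv >= 0 and run_session(offset - step_mv):
        offset -= step_mv
    return offset

# Mocked example: pretend anything below +31 mV crashes.
stable = find_min_voltage(100, 6, lambda mv: mv >= 31)
print(stable)  # 34, the lowest tested offset that survived
```

A binary search would converge in fewer sessions, but the linear walk mirrors the forum method: each step is cheap to set and you only ever move one notch past a known-good point.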


----------



## mus1mus

Quote:


> Originally Posted by *mirzet1976*
> 
> Guys i have in two diferent setup same result 10313
> First run
> Fx8320 - 5126mhz
> R9 290 - 1290/1600mhz
> 
> Second run
> Fx 8320E - 4.8ghz
> R9 290 - 1250/1625mhz


That 1290 might be unstable.
Great Graphics score! Best I can squeeze on pcie 2.0,
http://www.3dmark.com/fs/5520455


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> That 1290 might be unstable.
> Great Graphics score! Best I can squeeze on pcie 2.0,
> http://www.3dmark.com/fs/5520455


Man you guys are getting some nice scores. I can't even touch those clocks.


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> Man you guys are getting some nice scores. I can't even touch those clocks.


How come? Mine's terribad actually. PT1T + Trixx + Water


----------



## battleaxe

Yeah. I haven't tried another BIOS yet. I don't feel like messing with the cost of full water either... so I guess I can't really complain.


----------



## mus1mus

If you are not into OC'ing or benching, stay with your current BIOS, or try the new custom BIOSes from this forum (MLU and the hacked 390X BIOS).

Just be mindful of your temps. I wouldn't hesitate to set up a custom fan profile to keep these cards within temperature bounds so they stick to a certain clock (no throttling).
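A custom fan profile of the kind suggested above is just a temperature-to-fan-percent curve with interpolation between the points. A sketch only: the curve points below are made up for illustration, not a recommendation, and real tools (Afterburner/Trixx) apply their own curve logic and hysteresis.

```python
# Hypothetical curve: (temperature C, fan %) control points.
CURVE = [(40, 30), (60, 50), (75, 80), (85, 100)]

def fan_percent(temp_c: float, curve=CURVE) -> float:
    """Linearly interpolate a fan speed from a list of (temp, percent) points."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the curve: minimum fan speed
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # above the curve: pinned at maximum
    for (t0, p0), (t1, p1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # Linear interpolation between the two surrounding points.
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(70))  # 70.0: two-thirds of the way from 50% to 80%
```

The steep segment between 75C and 85C is what keeps a Hawaii card from throttling: fan speed ramps hard right before the clock would otherwise drop.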


----------



## battleaxe

I overclock a little. Mine can do 1215MHz in FS, 1255 in Valley. I don't go too crazy though. I've flashed a BIOS before but haven't bothered with this card since it does 1200MHz+ anyway.


----------



## cephelix

All you guys doing 1200MHz, and I'm just here maxing out at 1150.


----------



## Maticb

Quote:


> Originally Posted by *cephelix*
> 
> All you guys doing 1200mhz and i'm just here, maxing out at 1150


I never pushed mine over 1136


----------



## MissHaswellE

Quote:


> Originally Posted by *cephelix*
> 
> All you guys doing 1200mhz and i'm just here, maxing out at 1150


Mine's sitting at 1100, even though you all have me convinced I could easily go higher.
I mean, this is way more power than my GTX 680M. I'm pretty happy with a 290X at Tri-X OC stock clocks on a standard PCB (non-reference cooler).


----------



## Ha-Nocri

Well, I'm pretty sure all those 1230+ MHz cards are on water... correct?


----------



## battleaxe

Mine's core-only on water, just an AIO cooler. I'm too tight to drop money on a full water loop.


----------



## mus1mus

Not all. Some good ASIC Cards can do that on Air. But keeping these Hawaii GPUs cool really does magic.









Quote:


> Originally Posted by *battleaxe*
> 
> My core only on water. Just an aio cooler. I'm too tight to drop on a full water loop.


That is good enough provided you have a decent VRM Temp.


----------



## battleaxe

VRMs never exceed 75C even with 200mV added. So good enough, yeah. I'm too lazy to go any crazier than that.


----------



## mirzet1976

Quote:


> Originally Posted by *mus1mus*
> 
> That 1290 might be unstable.
> Great Graphics score! Best I can squeeze on pcie 2.0,
> http://www.3dmark.com/fs/5520455


Might be 15.7 driver + modded bios from Insan1tyONE


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> Vrm never exceed 75c even with 200mv added. So good enough yeah. I'm too lazy to go an crazier than that.


Pretty good! Limit it to 70C, I should say.

Tried PT1 and PT3; additional voltage didn't help, it would just go to black screens.
Quote:


> Originally Posted by *mirzet1976*
> 
> Might be 15.7 driver + modded bios from Insan1tyONE


I tried that as well, but my card is a bit picky; it goes laggy in 2D.

Found my optimum limit







at 1237/1717
http://www.3dmark.com/fs/5525110


----------



## boredmug

Quote:


> Originally Posted by *LA_Kings_Fan*
> 
> Ahhh I see that now ... thanks, thought he was comparing the 290X to the 390x ... but still not sure I'd write the 390 off as INFERIOR to the 290X ? they're VERY similar mostly.
> 
> Anyways ... I do'd it
> 
> 
> 
> 
> 
> 
> 
> ... pulled the trigger and ordered a 2nd Sapphire 290X. For only *$260*, I didn't see how I could pass on it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> now to get a pair of these and some other goodies soon
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Would this Performance-PCs Über 655 Fully Modded (D5) 12 VDC Water Pump be enough, for a CPU block, 2 x's GPU blocks, a 280 radiator and 2 x's 140 radiators ???
> 
> 
> 
> http://www.performance-pcs.com/performance-pcs-b-ber-655-b-fully-modded-d5-12-vdc-water-pump-red.html


More than enough. You don't need two unless you like the security of a backup pump


----------



## cephelix

Quote:


> Originally Posted by *Maticb*
> 
> I never pushed mine over 1136


Quote:


> Originally Posted by *MissHaswellE*
> 
> Mine's sitting at 1100, even though you all have me convinced I could easily go higher.
> I mean this is way more power than my GTX680M. I'm pretty happy with a 290X with TriX OC stock clocks on a standard PCB(non ref cooler).


It all depends on the temps of the core and VRMs, the voltage required to achieve the desired clocks, and the frames gained. There is a sweet spot; mine just happens to be at 1100. Haven't properly tested at 1150 yet, but it requires an additional 30mV or so on top of the 56 already required for 1100. At 1150, strangely, I can't OC memory at all without having to add more voltage or risk getting artifacts. Gaming at 1920 x 1200 on a 60Hz TN panel, I feel the 290 gives me more than enough grunt. My most intensive game right now would be DAI, and, when it goes on sale, Witcher 3. Though since I've never gone overkill, I wouldn't know if having more processing power would help my situation in terms of eye candy.


----------



## Maticb

Quote:


> Originally Posted by *cephelix*
> 
> It all depends on temps of core and vrms, voltage required to achieve desired clocks and frames gained. There is a sweet spot. Mine just happens to be at 1100. Haven't properly tested at 1150 yet but requires an additional 30mV or so on top of the 56 already required for 1100. At 1150 strangely, I can't OC memory at all without having to add more voltage or risk getting artifacts. Gaming at 1920 x 1200, 60Hz TN panel, I feel the 290 gives me more than enough grunt. My most intensive game right now would be DAI, and when it goes on sale, Witcher 3. Though since I've never gone overkill so I wouldn't know if having more processing power would help my situation in terms of eye candy.


I run 1136 for benchmarking; 24/7 is more like 1050. It is under water (so temps are never a problem), but I see no gain in games from such an OC, so I just see no reason to make it heat up more and degrade quickly. Especially now that I've gotten a second 290.

I get artifacting at 1136 even, but it completes 3DMark without a crash. The thing is, they're locked reference cards so I can't raise the voltage.


----------



## mus1mus

Quote:


> Originally Posted by *Maticb*
> 
> I run 1136 for benchmarking 24/7 is more like 1050. It is under water(so temps are never a problem), but I see no gain in games from such an OC, so I just see no reason to make it heat more and make it degrade quickly. Expecially now that i've gotten a second 290.
> 
> I get artefacting at 1136 even. But it completes 3dmark without a crash. The thing is they're locked refference cards so I can't raise the voltage.


Reference designs are not locked if you flash a BIOS.


----------



## battleaxe

Quote:


> Originally Posted by *boredmug*
> 
> More than enough. You don't need two unless you like the security of a backup pump


Quote:


> Originally Posted by *Maticb*
> 
> I run 1136 for benchmarking; 24/7 is more like 1050. It is under water (so temps are never a problem), but I see no gain in games from such an OC, so I see no reason to make it heat up more and degrade quickly. Especially now that I've gotten a second 290.
> 
> I get artefacting even at 1136, but it completes 3DMark without a crash. The thing is, they're locked reference cards, so I can't raise the voltage.


What do you mean locked? Just use Trixx software...? Maybe I missed something...


----------



## mus1mus

Some BIOSes top out at lower speeds than others.


----------



## LA_Kings_Fan

Hey Arizonian,

*UPDATE* ... I see I'm still on the list with that *PowerColor* PCS+ 290X unit (*link*) ... but I didn't end up keeping that one.







Should you remove it?

I still have the used *Sapphire* R9 290X reference card (*link*) I got for *$315.00*.

... and now I've added another 290X, the *Sapphire* Tri-X OC New Edition card (#100361-4L) from Newegg for *$260.00*.

So: *DUAL* R9 290X power for *$575.00* total.









Can ya update me on the List ... thanks


----------



## LandonAaron

For some reason, after updating to the latest Catalyst driver (15.7), I can no longer adjust the voltage on my second card. I have an R9 290X and an R9 290; both are just reference Sapphire cards. I have always been able to adjust voltage on either. Now on the 290 I can't change the voltage through Afterburner, Trixx, or OverDrive. Any idea what's going on?


----------



## kizwan

Quote:


> Originally Posted by *Maticb*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *cephelix*
> 
> It all depends on core and VRM temps, the voltage required to reach the desired clocks, and the frames gained. There is a sweet spot; mine just happens to be at 1100. I haven't properly tested at 1150 yet, but it requires an additional 30mV or so on top of the 56 already needed for 1100. Strangely, at 1150 I can't OC the memory at all without adding more voltage or risking artifacts. Gaming at 1920 x 1200 on a 60Hz TN panel, I feel the 290 gives me more than enough grunt. My most intensive game right now would be DAI, and when it goes on sale, Witcher 3. Though since I've never gone overkill, I wouldn't know whether more processing power would help my situation in terms of eye candy.
> 
> 
> 
> 
> 
> 
> 
> I run 1136 for benchmarking; 24/7 is more like 1050. It is under water (so temps are never a problem), but I see no gain in games from such an OC, so I see no reason to make it heat up more and degrade quickly. Especially now that I've gotten a second 290.
> 
> I get artefacting even at 1136, but it completes 3DMark without a crash. *The thing is, they're locked reference cards, so I can't raise the voltage.*
Click to expand...

Nope!







They don't lock voltage on reference cards. You can use MSI AB or Trixx to raise voltage. If you want more voltage, you can use the MSI AB voltage hack (see first post) or the PT1/PT1T BIOS.


----------



## cephelix

Quote:


> Originally Posted by *Maticb*
> 
> I run 1136 for benchmarking; 24/7 is more like 1050. It is under water (so temps are never a problem), but I see no gain in games from such an OC, so I see no reason to make it heat up more and degrade quickly. Especially now that I've gotten a second 290.
> 
> I get artefacting even at 1136, but it completes 3DMark without a crash. The thing is, they're locked reference cards, so I can't raise the voltage.


With a second 290, I agree that you wouldn't really need to OC that high, if at all. But I would like to turn up more eye candy in the more intensive games I play, though to be honest I can't seem to tell the difference even with them on or off.


----------



## cokker

Quote:


> Originally Posted by *cephelix*
> 
> All you guys doing 1200mhz and i'm just here, maxing out at 1150


1100 would be nice. I can only add +100mV, as anything lower still drops to stock volts, causing a crash. Trouble is, +100mV makes me run too hot, so I'm stuck at 1050 lol.


----------



## Ha-Nocri

I'm on air. I can bench @ 1230MHz and play games @ 1200 with +100mV, although I don't do it, since my card is happy @ 1125MHz at stock voltage, no artifacts.

And since the stock clock of a reference 290 is 947MHz, the card is a great overclocker. @ 1137MHz you are 20% over stock.
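For anyone checking that 20% figure, the overclock percentage is just a ratio of clocks. A quick sketch (the 947MHz stock and 1137MHz numbers come from the post above):

```python
# Percent overclock relative to a stock clock, e.g. "1137MHz is ~20% over 947MHz".
def oc_percent(stock_mhz: float, oc_mhz: float) -> float:
    """Return the overclock as a percentage over stock."""
    return (oc_mhz / stock_mhz - 1.0) * 100.0

print(f"{oc_percent(947, 1137):.1f}%")  # reference 290 stock -> 1137MHz: 20.1%
```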


----------



## Maticb

Quote:


> Originally Posted by *kizwan*
> 
> Nope!
> 
> 
> 
> 
> 
> 
> 
> > They don't lock voltage on reference cards. You can use MSI AB or Trixx to raise voltage. If you want more voltage, you can use the MSI AB voltage hack (see first post) or the PT1/PT1T BIOS.


This is what I see in Afterburner:

http://i.gyazo.com/39f6dcc1b2b44322852985897f426364.png

If I change the voltage, GPU-Z shows absolutely no change in the voltage, neither at idle nor at full usage. Or maybe I am not looking at the right voltage lol.


----------



## kizwan

Quote:


> Originally Posted by *Maticb*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Nope!
> 
> 
> 
> 
> 
> 
> 
> > They don't lock voltage on reference cards. You can use MSI AB or Trixx to raise voltage. If you want more voltage, you can use the MSI AB voltage hack (see first post) or the PT1/PT1T BIOS.
> 
> 
> 
> This is what I see in Afterburner:
> 
> http://i.gyazo.com/39f6dcc1b2b44322852985897f426364.png
> 
> If I change the voltage, GPU-Z shows absolutely no change in the voltage, neither at idle nor at full usage. Or maybe I am not looking at the right voltage lol.
Click to expand...

That AB screenshot looks wrong; the power limit slider is missing. Use Trixx instead (ver. 4.8.2).


----------



## rv8000

Just a quick question for reference 290/290x owners, as I can't remember anymore, can GPU-Z and AB read VRM temps on the reference cards? Hoping the answer will help point to an updated GPU-Z and AB being able to report VRM temps for Fiji.


----------



## Ha-Nocri

I think my card is reference, just with a different cooler, and yes, GPU-Z reads VRM temps; AB does not.


----------



## battleaxe

Quote:


> Originally Posted by *rv8000*
> 
> Just a quick question for reference 290/290x owners, as I can't remember anymore, can GPU-Z and AB read VRM temps on the reference cards? Hoping the answer will help point to an updated GPU-Z and AB being able to report VRM temps for Fiji.


Yes, it does. I had an MSI reference card and VRM temps were included.

Correction: AB doesn't read VRMs at all, as far as I know...?

But GPU-Z does show the VRMs on a reference MSI 290 for sure.


----------



## rv8000

Quote:


> Originally Posted by *Ha-Nocri*
> 
> I think my card is reference, just with a different cooler, and yes, GPU-Z reads VRM temps; AB does not.


Quote:


> Originally Posted by *battleaxe*
> 
> Yes, it does. I had an MSI reference card and VRM temps were included.


Thank you both!


----------



## Archea47

Quote:


> Originally Posted by *rv8000*
> 
> Just a quick question for reference 290/290x owners, as I can't remember anymore, can GPU-Z and AB read VRM temps on the reference cards? Hoping the answer will help point to an updated GPU-Z and AB being able to report VRM temps for Fiji.


HWiNFO64 as well, though not presented very elegantly in the case of the 290Xs I've seen in it. To me it's the superior solution over GPU-Z, apart from validations and pulling the ASIC value.


----------



## Unknownm

Having a problem getting ULPS disabled on Windows 10 build 10240, using 15.7 from the AMD website.

Using AB to disable ULPS doesn't work. I edited every registry entry found when searching for "disableulps"; same thing.

The utility OCN posted to disable ULPS: the website doesn't work and there are no links to download.

I'm not sure what else I can do to get ULPS disabled?
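For reference, the usual manual fix is setting every `EnableUlps` DWORD to 0 under the display-adapter class keys. A minimal Python sketch of that scan logic; the registry is modeled here as a plain dict so it can be read without touching anything (on a real system you'd walk `HKLM\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}` with `winreg` as admin; the key paths and helper name below are illustrative, not actual registry paths):

```python
# Sketch: find display-adapter subkeys whose EnableUlps value still enables ULPS.
# The registry is modeled as {key_path: {value_name: value}} so the logic is
# testable off-Windows; on a real box you'd enumerate subkeys with winreg.
def keys_needing_ulps_fix(registry: dict) -> list:
    """Return key paths where EnableUlps exists and is not already 0."""
    return [path for path, values in registry.items()
            if values.get("EnableUlps", 0) != 0]

simulated = {
    r"Class\{4d36e968-...}\0000": {"EnableUlps": 1},                  # needs fixing
    r"Class\{4d36e968-...}\0001": {"EnableUlps": 0},                  # already disabled
    r"Class\{4d36e968-...}\0002": {"DriverDesc": "AMD Radeon R9 290"},  # no ULPS value
}
print(keys_needing_ulps_fix(simulated))  # only the 0000 key is flagged
```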


----------



## MissHaswellE

Quote:


> Originally Posted by *Unknownm*
> 
> Having a problem getting ULPS disabled on Windows 10 build 10240, using 15.7 from the AMD website.
> 
> Using AB to disable ULPS doesn't work. I edited every registry entry found when searching for "disableulps"; same thing.
> 
> The utility OCN posted to disable ULPS: the website doesn't work and there are no links to download.
> 
> I'm not sure what else I can do to get ULPS disabled?


Did you try Trixx? Trixx has a built-in ULPS disable.

If that doesn't do it, it could be Win10.
Win10 has been creating issues for Nvidia users; I wouldn't be surprised if it starts giving AMD users crap as well.


----------



## kizwan

I lose 1 to 2 FPS to some people (CrossFire). Is this because of Win 7 vs. Win 8.1? Running single, my Hynix card can keep up; my Elpida card is lagging 1 to 2 FPS.


----------



## spyshagg

Quote:


> Originally Posted by *cephelix*
> 
> Coil whine? Is it happening when switching from 2D to 3D and back to 2D clocks?


Quote:


> Originally Posted by *BradleyW*
> 
> Coil whine can also sound like rattling. You just simply have vibrating coils.


Quote:


> Originally Posted by *cephelix*
> 
> I'm inclined to go with @bradleyw
> I have coil whine as well, but the only time it's really obvious is when I run and stop Valley


Quote:


> Originally Posted by *BradleyW*
> 
> Fan was most likely louder than the rattling.


Ok, I found the source of the rattling coil. It's funny.

It so happens I have a Corsair Air 540 case. The power supply sits on the other side of the case, exactly parallel to the second 290X card, so the sound REALLY appeared to come from the second 290X. But it came from the power supply under very heavy load!

It's a Corsair TX750 (750 watts).
According to some reviews, this power supply can handle up to 900 watts without breaking a sweat. And it does handle my system just fine; it just never made any coil sound before with my air-cooled 290X CrossFire.

But when I switched the 290X CrossFire to full-block watercooling and pushed the voltage up from stock to 1.325V, the PSU started making this crazy rattling coil sound.

Even though I have always had two power supplies in my Air 540 case working in parallel (the TX750 and a Nox 520W, with the Nox dedicated to the second 290X), the load still made the TX750 rattle like hell.

At first, my two power supplies were distributed like this, and it coil-rattled:
Quote:


> *Corsair TX750 =* Motherboard + Cpu power socket + 290x 6+8 pin
> *NOX 520w =* 290x 6+8 pin


Then I split the load like this and the coil rattle disappeared:
Quote:


> *Corsair TX750 =* Motherboard + 290x 6+8 pin
> *NOX 520w =* Cpu power socket + 290x 6+8 pin


But then the Nox 520W fan started kicking into high gear and became the loudest component in my dead-silent system. So I tried splitting the load like this:
Quote:


> *Corsair TX750 =* Motherboard + 290x 6+8 pin + 290x 6pin
> *NOX 520w =* Cpu power socket + 290x 8 pin


With this last revision, I have no coil rattle on the TX750, and the Nox 520 doesn't get loud. I have an almost dead-silent system again.
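The cable shuffle above is really just rebalancing rough wattages between the two supplies. A back-of-the-envelope sketch; the per-connector figures are PCIe-spec ceilings (75W for a 6-pin, 150W for an 8-pin) plus guessed board/CPU draws, purely for illustration, not measurements:

```python
# Rough per-PSU load for the cabling revisions in the post above.
# Per-connector wattages are spec ceilings / rough guesses, for illustration only.
WATTS = {"motherboard": 80, "cpu_eps": 150,
         "gpu1_8pin": 150, "gpu1_6pin": 75,
         "gpu2_8pin": 150, "gpu2_6pin": 75}

def psu_loads(assignment: dict) -> dict:
    """Sum the assumed draw per PSU, given {connector: psu_name}."""
    loads = {}
    for connector, psu in assignment.items():
        loads[psu] = loads.get(psu, 0) + WATTS[connector]
    return loads

# First revision: TX750 carries board + CPU + one full GPU; Nox the other GPU.
rev1 = {"motherboard": "TX750", "cpu_eps": "TX750",
        "gpu1_8pin": "TX750", "gpu1_6pin": "TX750",
        "gpu2_8pin": "NOX520", "gpu2_6pin": "NOX520"}
print(psu_loads(rev1))  # {'TX750': 455, 'NOX520': 225}
```

The final revision moves the CPU's EPS connector and one GPU 8-pin to the Nox, which evens the two totals out; that is the whole trick.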


----------



## cephelix

@spyshagg that's great news!! Any reason you didn't go with a single higher-wattage power supply?


----------



## Ha-Nocri

I get 5 FPS more in the GTA5 benchmark going from Win7 to Win10 in the last scene, which is the most CPU-bound (70fps to 75fps). I ran the benchmark twice on both Win7 and Win10 just to be sure.


----------



## Ized

spyshagg, nice to hear you sorted out your coil whine.

I tried 4 PSUs and at least 4 motherboards along with 4 GPUs, and had whine every time









I moved to headphones


----------



## spyshagg

Quote:


> Originally Posted by *cephelix*
> 
> @spyshagg that's great news!!any reason you didn't go with a single larger wattage power supply?


It's a matter of resources







I already had them from my old rig.

I regard the TX750 very highly. She's from 2009 but handles two stock 290Xs + an i7 2600K at 4.9GHz without getting crazy hot or noisy.
The coil whine only came with the extra Vcore on the 290s, but despite the whine it performs excellently: great voltage stability, great temperatures. Great PSU, I have to say.

The Nox 520W is cheap, but I had space in the Air 540 for it, so I used it to relieve pressure on the TX750.

Quote:


> Originally Posted by *Ized*
> 
> spyshagg, nice to hear you sorted out your coil whine.
> 
> I tried 4 PSUs and at least 4 motherboards along with 4 GPUs, and had whine every time
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I moved to headphones


I hear you. Before moving my watercooling rad box to another room (computer in one room, WC rad box in the other, with the watercooling tubes passing through a hole in the wall), I had to use my Sony NWZ-X1050 mp3 player with noise-cancelling phones to drown out the fan noise. The coil whine didn't exist before; it only appeared after I applied full-cover waterblocks to my 290s and pushed the voltages.

I'm lucky both my 290Xs don't suffer from coil whine









Speaking of my watercooling rad box: because I put it in another room, the fans are always at 100%.

After hours of pushing the CPU and the CrossFire 290Xs to their limits, I have a water delta temperature of 3-4°C.

I now have amazing performance, amazing silence, and a fresh room temperature of 27°C (before I moved the rad box, playing games meant sweating like a fountain in a 32°C room).


----------



## mfknjadagr8

Ok, so I had the same issue again using the 15.7 drivers. The Witcher 3 blue-screened, saying the driver was stuck in an infinite loop trying to recover. It restarted automatically and once again wouldn't boot: it hit the splash screen and a quick blue screen before restarting, on repeat. So I booted into safe mode, ran DDU, and restarted with the card unplugged; same song and dance with CrossFire. Then, after rebooting, the freshly installed driver still blue-screened during boot. So: safe mode, then DDU, and this time I added a CCleaner run before restarting to install the driver, and it worked. So it's either the driver, or the latest patch for The Witcher (1.07), or the two colliding in an epic battle. The odd thing is that both times it blue-screened were at the end of a battle, after the action had already subsided. Temps are pretty good, nothing breaking 60°C; voltages never exceed 1.24.


----------



## kizwan

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Ok, so I had the same issue again using the 15.7 drivers. The Witcher 3 blue-screened, saying the driver was stuck in an infinite loop trying to recover. It restarted automatically and once again wouldn't boot: it hit the splash screen and a quick blue screen before restarting, on repeat. So I booted into safe mode, ran DDU, and restarted with the card unplugged; same song and dance with CrossFire. Then, after rebooting, the freshly installed driver still blue-screened during boot. So: safe mode, then DDU, and this time I added a CCleaner run before restarting to install the driver, and it worked. So it's either the driver, or the latest patch for The Witcher (1.07), or the two colliding in an epic battle. The odd thing is that both times it blue-screened were at the end of a battle, after the action had already subsided. Temps are pretty good, nothing breaking 60°C; voltages never exceed 1.24.


If the BSOD created a crash dump, you can check what happened. A BSOD after reboot sounds like a more serious problem than a driver issue.


----------



## mfknjadagr8

Quote:


> Originally Posted by *kizwan*
> 
> If the BSOD created a crash dump, you can check what happened. A BSOD after reboot sounds like a more serious problem than a driver issue.


I thought so at first. I even dropped the cards and CPU to defaults to make sure it wasn't my overclock. From what I could see of the blue screen, it references the graphics driver, but it's literally up half a second or less before the reboot, and it does this when it loads the graphics driver to display the Windows login screen. I haven't looked for a crash dump yet; I simply got it back to a working state first, and then I'll look further. Also, to note: this would be my third blue screen ever on Windows 7 (not counting the restart ones): two from this problem and one when I had an Nvidia graphics card and tried to install AMD chipset drivers.


----------



## diggiddi

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I thought so at first. I even dropped the cards and CPU to defaults to make sure it wasn't my overclock. From what I could see of the blue screen, it references the graphics driver, but it's literally up half a second or less before the reboot, and it does this when it loads the graphics driver to display the Windows login screen. I haven't looked for a crash dump yet; I simply got it back to a working state first, and then I'll look further. Also, to note: this would be my third blue screen ever on Windows 7 (not counting the restart ones): two from this problem and one when I had an Nvidia graphics card and tried to install AMD chipset drivers.


There you go, WhoCrashed: http://www.resplendence.com/whocrashed


----------



## mfknjadagr8

Quote:


> Originally Posted by *diggiddi*
> 
> There you go, WhoCrashed: http://www.resplendence.com/whocrashed


Thanks for the link, by the way... but as I suspected, the crash dump shows the video driver.


Spoiler: crashdump



Crash Dump Analysis

Crash dump directory: C:\Windows\Minidump

Crash dumps are enabled on your computer.

On Tue 7/28/2015 11:19:23 PM GMT your computer crashed
crash dump file: C:\Windows\Minidump\072815-4477-02.dmp
This was probably caused by the following module: atikmdag.sys (atikmdag+0x2D114)
Bugcheck code: 0x3B (0xC0000005, 0xFFFFF88011045114, 0xFFFFF880067238B0, 0x0)
Error: SYSTEM_SERVICE_EXCEPTION
file path: C:\Windows\system32\drivers\atikmdag.sys
product: ATI Radeon Family
company: Advanced Micro Devices, Inc.
description: ATI Radeon Kernel Mode Driver
Bug check description: This indicates that an exception happened while executing a routine that transitions from non-privileged code to privileged code.
This appears to be a typical software driver bug and is not likely to be caused by a hardware problem.
A third party driver was identified as the probable root cause of this system error. It is suggested you look for an update for the following driver: atikmdag.sys (ATI Radeon Kernel Mode Driver, Advanced Micro Devices, Inc.).
Google query: Advanced Micro Devices, Inc. SYSTEM_SERVICE_EXCEPTION

On Tue 7/28/2015 11:19:23 PM GMT your computer crashed
crash dump file: C:\Windows\memory.dmp
This was probably caused by the following module: atikmdag.sys (atikmdag+0x2D114)
Bugcheck code: 0x3B (0xC0000005, 0xFFFFF88011045114, 0xFFFFF880067238B0, 0x0)
Error: SYSTEM_SERVICE_EXCEPTION
file path: C:\Windows\system32\drivers\atikmdag.sys
product: ATI Radeon Family
company: Advanced Micro Devices, Inc.
description: ATI Radeon Kernel Mode Driver
Bug check description: This indicates that an exception happened while executing a routine that transitions from non-privileged code to privileged code.
This appears to be a typical software driver bug and is not likely to be caused by a hardware problem.
A third party driver was identified as the probable root cause of this system error. It is suggested you look for an update for the following driver: atikmdag.sys (ATI Radeon Kernel Mode Driver, Advanced Micro Devices, Inc.).
Google query: Advanced Micro Devices, Inc. SYSTEM_SERVICE_EXCEPTION

On Tue 7/28/2015 11:13:05 PM GMT your computer crashed
crash dump file: C:\Windows\Minidump\072815-4477-01.dmp
This was probably caused by the following module: atikmdag.sys (atikmdag+0x2D114)
Bugcheck code: 0x3B (0xC0000005, 0xFFFFF880112C0114, 0xFFFFF880068F97F0, 0x0)
Error: SYSTEM_SERVICE_EXCEPTION
file path: C:\Windows\system32\drivers\atikmdag.sys
product: ATI Radeon Family
company: Advanced Micro Devices, Inc.
description: ATI Radeon Kernel Mode Driver
Bug check description: This indicates that an exception happened while executing a routine that transitions from non-privileged code to privileged code.
This appears to be a typical software driver bug and is not likely to be caused by a hardware problem.
A third party driver was identified as the probable root cause of this system error. It is suggested you look for an update for the following driver: atikmdag.sys (ATI Radeon Kernel Mode Driver, Advanced Micro Devices, Inc.).
Google query: Advanced Micro Devices, Inc. SYSTEM_SERVICE_EXCEPTION

On Sun 7/19/2015 5:02:43 AM GMT your computer crashed
crash dump file: C:\Windows\Minidump\071915-5257-01.dmp
This was probably caused by the following module: atikmdag.sys (atikmdag+0x2D114)
Bugcheck code: 0x3B (0xC0000005, 0xFFFFF8801124E114, 0xFFFFF880069387F0, 0x0)
Error: SYSTEM_SERVICE_EXCEPTION
file path: C:\Windows\system32\drivers\atikmdag.sys
product: ATI Radeon Family
company: Advanced Micro Devices, Inc.
description: ATI Radeon Kernel Mode Driver
Bug check description: This indicates that an exception happened while executing a routine that transitions from non-privileged code to privileged code.
This appears to be a typical software driver bug and is not likely to be caused by a hardware problem.
A third party driver was identified as the probable root cause of this system error. It is suggested you look for an update for the following driver: atikmdag.sys (ATI Radeon Kernel Mode Driver, Advanced Micro Devices, Inc.).
Google query: Advanced Micro Devices, Inc. SYSTEM_SERVICE_EXCEPTION
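Side note: the "probably caused by" module lines in a WhoCrashed-style report like the one above are easy to tally programmatically. A small sketch; the sample text just mimics the report format:

```python
import re
from collections import Counter

# Count how often each driver module is blamed in a WhoCrashed-style report.
def blamed_modules(report: str) -> Counter:
    pattern = r"probably caused by the following module: (\S+)"
    return Counter(re.findall(pattern, report))

sample = """\
This was probably caused by the following module: atikmdag.sys (atikmdag+0x2D114)
Error: SYSTEM_SERVICE_EXCEPTION
This was probably caused by the following module: atikmdag.sys (atikmdag+0x2D114)
"""
print(blamed_modules(sample))  # Counter({'atikmdag.sys': 2})
```

Here every crash points at atikmdag.sys, which is consistent with a display-driver problem rather than several unrelated failures.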


----------



## diggiddi

How'd you get the data? With Visual Studio?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *cephelix*
> 
> lol, you run it that far? But I suppose if you're doing it for numbers, it's understandable. I'm too cautious to knowingly push my hardware that much. SNIP


A water chiller makes 1.5 Vcore and 1350 core possible









Quote:


> Originally Posted by *mus1mus*
> 
> Lucky you.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> When I blackscreen with no coil whine, all I can do is wait for the score.


No finish = crashed

Quote:


> Originally Posted by *diggiddi*
> 
> There you go who crashed http://www.resplendence.com/whocrashed


$50AU for that ? ..........


----------



## diggiddi

Whatutalkinbout Willis? It's free; the pro version costs money.


----------



## mus1mus

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> No finish = crashed


Mine just blackscreens; the display comes back with the scores.


----------



## cephelix

@HOMECINEMA-PC 1.5?! you sir have my respect.....


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *diggiddi*
> 
> Whatutalkinbout Willis? its free, pro version costs money


That's what I saw, the pro version. Still fifty bucks 'Strayan... a bit overpriced









Quote:


> Originally Posted by *mus1mus*
> 
> Mine just blackscreens; the display comes back with the scores.


It would black out till the end of the bench and voilà, score screen









Quote:


> Originally Posted by *cephelix*
> 
> @HOMECINEMA-PC 1.5?! you sir have my respect.....


True story: 1360MHz @ 1.5 Vcore


----------



## cephelix

@HOMECINEMA-PC wondering what you could pull with LN2 then?


----------



## HOMECINEMA-PC

I've seen the same clocks submitted by a dude over a year ago on extreme cooling , so maybe not worth it . I'd want at least 1400/1500 on LN2


----------



## fyzzz

Wow, temperature seems to matter a lot. I can run 1270/1700 when the card is at 70°C, but 1230 doesn't work when the card is at 87°C. Even the memory seems to overclock better when the card runs colder. I'm just doing some testing while waiting for the parts to arrive; it's going to be interesting when it's under water.


----------



## diggiddi

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> That's what I saw, the pro version. Still fifty bucks 'Strayan... a bit overpriced
> 
> 
> 
> 
> 
> 
> 
> 
> It would black out till the end of the bench and voilà, score screen
> 
> 
> 
> 
> 
> 
> 
> 
> 
> True story: 1360MHz @ 1.5 Vcore


Well, get the free version, no need to pay.

Anyhoo, in other news, I just got this bad boy


----------



## morencyam

I picked up an MSI 290X 4G off eBay for $250 after shipping. It just barely fits into my Thermaltake Core V1


----------



## battleaxe

Brand new 290x should arrive on the step today! Whoo hoo!


----------



## Gumbi

Quote:


> Originally Posted by *battleaxe*
> 
> Brand new 290x should arrive on the step today! Whoo hoo!


Noice. Which model?


----------



## cephelix

@morencyam that looks like a very snug fit. Looks awesome though


----------



## Gumbi

Hey guys, I've a 290 Vapor-X and I'm just wondering if my idle voltage is working correctly. Here is how it works:

It idles at ~1.000V, and when I apply load it jumps to, say, 1.15V (the default offset is +25mV, by the way).

Now, say I add 75mV to the core: it actually adds that voltage to the idle voltage too, so my core will now idle at ~1.05-1.075V (numbers might be slightly off, I'm at work so can't confirm); then when I apply load, it might jump to 1.2V or more, fluctuating accordingly.

Is this how it works on your 290s? Basically wondering if applying voltage should also raise the idle voltage; I can't see the need for it, as it's only gonna idle at the same MHz anyway (300MHz).

Thanks! Not a big deal really, just wondering. Anything to buy a degree or two in temps at idle.


----------



## MrKZ

Quote:


> Originally Posted by *Gumbi*
> 
> Hey guys, I've a 290 Vapor-X and I'm just wondering if my idle voltage is working correctly. Here is how it works:
> 
> It idles at ~1.000V, and when I apply load it jumps to, say, 1.15V (the default offset is +25mV, by the way).
> 
> Now, say I add 75mV to the core: it actually adds that voltage to the idle voltage too, so my core will now idle at ~1.05-1.075V (numbers might be slightly off, I'm at work so can't confirm); then when I apply load, it might jump to 1.2V or more, fluctuating accordingly.
> 
> Is this how it works on your 290s? Basically wondering if applying voltage should also raise the idle voltage; I can't see the need for it, as it's only gonna idle at the same MHz anyway (300MHz).
> 
> Thanks! Not a big deal really, just wondering. Anything to buy a degree or two in temps at idle.


Yes, setting the voltage offset will modify all the voltages, not only the load voltage. I think the only way to edit the load voltage independently is to edit the BIOS (somebody correct me if I'm wrong).
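In other words, the offset is added on top of every power state's base voltage, idle included. Sketched in millivolts (the ~1000mV idle / ~1150mV load figures are taken from Gumbi's example above):

```python
# A global voltage offset shifts every power state, idle included.
def apply_offset(state_mv: dict, offset_mv: int) -> dict:
    """Add a global offset (in mV) to each state's base voltage."""
    return {state: mv + offset_mv for state, mv in state_mv.items()}

base = {"idle": 1000, "load": 1150}  # mV, roughly as reported above
print(apply_offset(base, 75))  # {'idle': 1075, 'load': 1225}
```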


----------



## Gumbi

Quote:


> Originally Posted by *MrKZ*
> 
> Yes, setting the voltage offset will modify all the voltages, not only the load voltage. I think the only way to edit the load voltage independently is to edit the bios (somebody correct me if I'm wrong.)


Awesome, thanks for the precise answer


----------



## battleaxe

Quote:


> Originally Posted by *Gumbi*
> 
> Noice. Which model?




So far I am very impressed with this card. It seems to run Valley at 1150MHz/1520MHz with no voltage bump, which seems stronger than my old MSI 290 for sure. I haven't really pushed much beyond that yet, but so far very impressed.

@Arizonian can you add me to the list for a 290x now also?


----------



## Gumbi

Nice, would you be so kind as to do a bench run of Valley and screenshot the core and VRM temps? Is that a Tri-X model? I thought the Nitro was its successor?

Perhaps provide ambients too.


----------



## cephelix

1150/1520 with no added voltage? You, sir, have a magical card.


----------



## battleaxe

Quote:


> Originally Posted by *Gumbi*
> 
> Nice, would you be so kind as to do a bench run of Valley and screenshot the core and VRM temps? Is that a Tri-X model? I thought the Nitro was its successor?
> 
> Perhaps provide ambients too.


Yeah, I will... just haven't had the time to do so yet. I'll post screenies, etc... I think I've got a decent card here. At least seems so initially.









It came from the factory set to +6 on the voltage and I just left it that way, so that's what I meant by no added voltage: it stayed at +6 when running Valley. But that's not Firestrike; it may not do 1150 at only +6 in FS... I have no idea yet.
Quote:


> Originally Posted by *Gumbi*
> 
> Nice, would you be so kind as to do a bench run of Valley and screenshot the core and VRM temps? Is that a Tri-X model? I thought the Nitro was its successor?
> 
> Perhaps provide ambients too.


Will do, when I get some time. Hope this weekend.


----------



## fyzzz

Quote:


> Originally Posted by *battleaxe*
> 
> Yeah, I will... just haven't had the time to do so yet. I'll post screenies, etc... I think I've got a decent card here. At least seems so initially.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It came from the factory set to +6 on the voltage and I just left it that way, so that's what I meant by no added voltage: it stayed at +6 when running Valley. But that's not Firestrike; it may not do 1150 at only +6 in FS... I have no idea yet.
> Will do, when I get some time. Hope this weekend.


Nice oc on stock voltage. How much voltage under load do you get?


----------



## Gumbi

Sweet







1150 core stable at +6mV is insane tbh! If you push 75 or 80mV and cool it fine with a new cooling setup, you could perhaps hit 1250 or more, which is insane on air!


----------



## Maticb

Okay guys, I have a question, just out of curiosity, because for 24/7 use I do not run high OCs.
I have 2x 290s, and yesterday I upgraded from an FX-8320 to an i7-5820K.

Before, my 290s handled 1136/1440, with artefacting in 3DMark, but didn't crash. Now I get a black screen when I set the memory to 1440.

Could that be related to the motherboard or the difference between PCIe 2.0 and 3.0 (since the AMD platform didn't support 3.0, but the i7 does)? I got a graphics score of 24096 in 3DMark 1080p with the FX-8320, and now my max is 23796, with the memory clocked 40MHz lower.

As I said, this doesn't affect me, because those clocks are far from what I would run 24/7, but I am a bit confused, though. Everything else is the same (PSU and GPUs).

EDIT: Here are the links if anyone is interested:
FX-8320:
http://www.3dmark.com/fs/5420063
i7-5820K:
http://www.3dmark.com/fs/5565937
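On the PCIe 2.0 vs 3.0 side of that question: the per-lane difference is real but rarely the bottleneck for a single GPU. The raw numbers work out like this (2.0 runs at 5GT/s with 8b/10b encoding, 3.0 at 8GT/s with 128b/130b):

```python
# Effective per-lane bandwidth in MB/s for PCIe 2.0 vs 3.0.
def lane_mb_s(gt_per_s: float, payload_bits: int, total_bits: int) -> float:
    """Line rate times encoding efficiency, converted to megabytes per second."""
    return gt_per_s * 1e9 * (payload_bits / total_bits) / 8 / 1e6

pcie2 = lane_mb_s(5, 8, 10)     # 500.0 MB/s per lane
pcie3 = lane_mb_s(8, 128, 130)  # ~984.6 MB/s per lane
print(f"x16: {pcie2 * 16 / 1000:.1f} GB/s vs {pcie3 * 16 / 1000:.1f} GB/s")
```

So an x16 slot roughly doubles from ~8 GB/s to ~15.8 GB/s, which is why 3DMark scores barely move; the small score difference above is more plausibly run-to-run variance or the CPU swap.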


----------



## fyzzz

Quote:


> Originally Posted by *Gumbi*
> 
> Sweet
> 
> 
> 
> 
> 
> 
> 
> 1150 core stable at 6mv is insane tbh! Agent Smith is push 75 or 80mv and cooling it fine with a new cooling setup, you could perhaps hit 1250 or more which is insane on air!


1250 on air? http://www.3dmark.com/fs/5531717 Too unstable though to get a good score, and it doesn't help that the run was on my AMD machine. But hey, it got through anyway.


----------



## Gumbi

Quote:


> Originally Posted by *fyzzz*
> 
> 1250 on air? http://www.3dmark.com/fs/5531717 Too unstable though to get a good score, and it doesn't help that the run was on my AMD machine. But hey, it got through anyway.


Yeah, I was talking about a stable overclock ;) I run 1175/1650 at +75mV on my Vapor-X; I'd say I could probably hit 1250MHz for a run, but it wouldn't necessarily be stable or cool enough.


----------



## fyzzz

Quote:


> Originally Posted by *Gumbi*
> 
> Yeah, I was talking about a stable overclock ;) I run 1175/1650 at +75mV on my Vapor-X; I'd say I could probably hit 1250MHz for a run, but it wouldn't necessarily be stable or cool enough.


I'm running a modded BIOS that gives me around 1.2V under load, and with that I can run 1170/1500 (maybe more, but I haven't tested further). I also have major heat problems, but I'm switching over to water soon, so that doesn't matter.


----------



## mfknjadagr8

Quote:


> Originally Posted by *fyzzz*
> 
> I'm running a modded BIOS that gives me around 1.2V under load, and with that I can run 1170/1500 (maybe more, but I haven't tested further). I also have major heat problems, but I'm switching over to water soon, so that doesn't matter.


It'd seem odd, but my cards both hit over 1.2 with no settings changed... on stock clocks, stock voltage, even with no power limit increase


----------



## battleaxe

Just ran a test. I can't seem to get Firestrike to run right. The program crashes right at the end when it is supposed to show your results. No artifacts. This PC did the same thing with the 970 in it, so IDK what is going on with it.

Anyway: here are some Valley tests. I need to post the scores, but I'm just showing temps and clocks here. On the second test I started to see some artifacts when pushing to 1210MHz on the core, so 1200 was all it wanted to do at 75mv.

Memory is Samsung. I'm not sure if that's a good thing on these cards or not, but the memory is not as good as some. Seems it can do around 1575 maybe on initial tests, but I got a blackscreen at 1600 when pushing 100mv extra, so that's probably the limit I'm guessing, even at 150mv.

Anyway.

1150mhz core
1475mhz memory
+6 mv extra (this is factory apparently)


And here is at:

1200mhz core
1550mhz memory
+75mv


Overall though, I'm very happy with this card. Next I'll see what kind of scores she can do...


----------



## fyzzz

Quote:


> Originally Posted by *battleaxe*
> 
> Just ran a test. I can't seem to get Firestrike to run right. The program crashes right at the end when it is supposed to show your results. No artifacts. This PC did the same thing with the 970 in it, so IDK what is going on with it.
> 
> Anyway: here are some Valley tests. I need to post the scores, but I'm just showing temps and clocks here. On the second test I started to see some artifacts when pushing to 1210MHz on the core, so 1200 was all it wanted to do at 75mv.
> 
> Memory is Samsung. I'm not sure if that's a good thing on these cards or not, but the memory is not as good as some. Seems it can do around 1575 maybe on initial tests, but I got a blackscreen at 1600 when pushing 100mv extra, so that's probably the limit I'm guessing, even at 150mv.
> 
> Anyway.
> 
> 1150mhz core
> 1475mhz memory
> +6 mv extra (this is factory apparently)
> 
> 
> And here is at:
> 
> 1200mhz core
> 1550mhz memory
> +75mv
> 
> 
> Overall though, I'm very happy with this card. Next I'll see what kind of scores she can do...


Wow, you've gotten a very good card.


----------



## mfknjadagr8

So what is the max voltage you guys recommend for reference 290s?


----------



## Ized

Quote:


> Originally Posted by *mfknjadagr8*
> 
> So what is the max voltage you guys recommend for reference 290s?


You'll probably hit the thermal throttle (at 94c) or be agitated by the noise long before voltage is a problem on a full reference card.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Ized*
> 
> You'll probably hit the thermal throttle (at 94c) or be agitated by the noise long before voltage is a problem on a full reference card.


It's under water now, but I'm saying the components on the board are all reference (XFX R9 290).


----------



## Ized

I don't think there is a magic number that is going to satisfy you but:

http://www.overclock.net/t/1561904/mlu-bios-builds-for-290x/0_60

I think this guy might work for AMD or similar, and he has placed a 1.36v voltage limit on his BIOS files.

A few of us did tests at 1.4v+, which he said was silly, but it passed the "nothing melted" test and quite a few Firestrike runs on my part.


----------



## Arizonian

Sorry, been slow on the updates; we're shorthanded in staffing and I've been very busy lately. You guys have been busy as well, overclocking and pushing scores I see.








Quote:


> Originally Posted by *Unknownm*
> 
> Holy nuts batman. Bought an AIO watercooling kit for both my R9 290s: 2x Corsair HG10 A1 & 2x Corsair H75
> 
> The problem is this bracket was only designed for reference cards (mine is non-reference). It mounts fine, but VRM temps hit 100c. Regardless, this is a major improvement over stock cooling: first, the top card doesn't throttle, and second, the case side panel is on (it was off before because of the heat)
> 
> The HDDs are my RAID0 setup, and since the HDD bays took up too much room, they both had to be removed!
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated








Quote:


> Originally Posted by *xDorito*
> 
> Sorry for the poor quality. I've just been so thoroughly impressed by these two I decided to share a quick and dirty shot of them.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added















Quote:


> Originally Posted by *LA_Kings_Fan*
> 
> Hey Arizonian,
> 
> *UPDATE* ... I see I'm still on the list with that *PowerColor* PCS+ 290X unit (*link*) ... but I didn't end up keeping that one.
> 
> 
> 
> 
> 
> 
> 
> 
> should you remove ?
> 
> I still have the used *Sapphire* R9 290X Reference card (*link*) I got for *$315.00* .
> 
> ... and now I've added another 290X, the *Sapphire* Tri-X OC New Edition Card #100361-4L from Newegg for *$260.00*
> 
> So *DUAL* R9 290X power for *$575.00* Total
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can ya update me on the List ... thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated








Quote:


> Originally Posted by *diggiddi*
> 
> Well get the free version no need to pay
> 
> Anyhoo in other news I just got this bad boy
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *morencyam*
> 
> I picked up an MSI 290X 4G off eBay for $250 after shipping. It just barely fits into my Thermaltake Core V1
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *battleaxe*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> So far I am very impressed with this card. Seems to run Valley so far at 1150mhz/1520mhz with no voltage bump. That seems stronger than my old MSI 290 for sure. I haven't really pushed much beyond that yet, but so far very impressed.
> 
> @Arizonian can you add me to the list for a 290x now also?


Congrats - updated


----------



## diggiddi

Guys, I'm having an issue with my card: GPU usage is fluctuating like crazy during gameplay (BF3 MP).
I increased the power limit to 50% and set the Windows power profile to High to see if that would help, but no dice.


----------



## the9quad

Quote:


> Originally Posted by *diggiddi*
> 
> Guys, I'm having an issue with my card: GPU usage is fluctuating like crazy during gameplay (BF3 MP).
> I increased the power limit to 50% and set the Windows power profile to High to see if that would help, but no dice.


Is it noticeable in game? If not, ignore it. That is my advice.
I think Unwinder has stated that the way he has to read GPU usage on AMD cards makes it look like that.
If it is affecting performance, well, then someone else will have to help.


----------



## SavageBrat

Quote:


> Originally Posted by *diggiddi*
> 
> Guys, I'm having an issue with my card: GPU usage is fluctuating like crazy during gameplay (BF3 MP).
> I increased the power limit to 50% and set the Windows power profile to High to see if that would help, but no dice.


As the9quad said, don't worry about it; my card does the same thing in WoT.


----------



## diggiddi

Ok, thanks guys. I'm used to it pegging my 7950 at 99%. Reps for you!!


----------



## SavageBrat

See, you're not alone..


----------



## gatygun

Still can't get used to the performance of 2x 290s.

Never had this much brute power to push into games; it almost feels like current-gen games are basically not even trying.


----------



## Gumbi

Quote:


> Originally Posted by *diggiddi*
> 
> Ok thanks guys I'm used to it pegging my 7950 at 99%, reps for you!!


It's probably CPU bound


----------



## diggiddi

Quote:


> Originally Posted by *Gumbi*
> 
> It's probably CPU bound


Yeah, but I'm at 4.6GHz, and it could peg my 9590 at almost 100%.


----------



## diggiddi

I just tried Crysis 3 and it pegged it at 100%, so I guess BF3 just doesn't put a strain on it. I'll increase the resolution via VSR to 2560 or higher and see what happens.


----------



## mus1mus

Feedback on 15.7.1?


----------



## diggiddi

Quote:


> Originally Posted by *mus1mus*
> 
> Feedback on 15.7.1?


Huh?


----------



## mus1mus

Quote:


> Originally Posted by *diggiddi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Feedback on 15.7.1?
> 
> 
> 
> Huh?

Latest driver revision, my friend.


----------



## diggiddi

I seem to be losing frames with it, at least in Crysis 3.


----------



## blackhole2013

Quote:


> Originally Posted by *mus1mus*
> 
> Feedback on 15.7.1?


The first Windows 10 driver needs some work; it slows down my PC while gaming.


----------



## spyshagg

Quote:


> Originally Posted by *gatygun*
> 
> Still can't get used to the performance of 2x 290s.
> 
> Never had this much brute power to push into games; it almost feels like current-gen games are basically not even trying.


Running triple screens, I kind of think I would need 4 290Xs to run Dirt Rally and Assetto Corsa on MAX. Project Cars is also heavy, but I feel it is more CPU bound than the other two.


----------



## diggiddi

Quote:


> Originally Posted by *spyshagg*
> 
> Running triple screens, I kind of think I would need 4 290Xs to run Dirt Rally and Assetto Corsa on MAX. Project Cars is also heavy, but I feel it is more CPU bound than the other two.


What are your opinions on AC and pCARS, and what resolutions are you running? I noticed the car sounds in AC are better, but the rumble from running over curbs in pCARS is more vibrant. Also, comparing Dirt 3 to Dirt Rally, the latter has far better-sounding cars.
Anyone else getting lower scores with 15.7.1?


----------



## Dooderek

3rd card in





minor oc, no volts

http://www.3dmark.com/3dm/7979255


----------



## spyshagg

Quote:


> Originally Posted by *diggiddi*
> 
> What are your opinions on AC and pCARS, and what resolutions are you running? I noticed the car sounds in AC are better, but the rumble from running over curbs in pCARS is more vibrant. Also, comparing Dirt 3 to Dirt Rally, the latter has far better-sounding cars.
> Anyone else getting lower scores with 15.7.1?


Assetto Corsa is great. Really good. The graphics are nice but could be better; they can be much better if you use the Andrea post-processing filters. You need to increase the curb feedback in the menu to feel them. The netcode is amazing, almost as good as Live for Speed.

Project Cars is pretty, but the physics are not up to spec with Assetto, and the netcode is in all honesty abysmal. It's still a great game to play offline, and it still feels like a sim, but Assetto is on another level.

Dirt Rally was said to be the spiritual successor to Richard Burns Rally, and I think they are correct. It's really good.

I play with 3x 1080p monitors. With bezel correction the resolution is about 5900x1080. But Eyefinity has a serious problem with vsync: only the center screen vsyncs, and the side screens feel like 20fps instead of 60. With vsync disabled, the tearing is so bad in Dirt Rally that I am forced to run with only one monitor and vsync on.

The tearing is much more acceptable in Assetto and pCARS.


----------



## By-Tor

Quote:


> Originally Posted by *Dooderek*
> 
> 3rd card in
> 
> 
> 
> minor oc, no volts
> 
> http://www.3dmark.com/3dm/7979255


Very Sexy!!!!

Which cards are they?


----------



## aaroc

Quote:


> Originally Posted by *spyshagg*
> 
> Running triple screens, I kind of think I would need 4 290x to run Dirt rally and Asseto Corsa on MAX. Project Cars is also heavy, but I feel it to be more CPU bound than the other two.


On what resolution do you want me to test? I have 4x R9 290X and 3x 2560x1440, my dream PC for racing sims








What PC do you have? Can you share the link to the post-processing filters?


----------



## diggiddi

Quote:


> Originally Posted by *spyshagg*
> 
> Assetto Corsa is great. Really good. The graphics are nice but could be better; they can be much better if you use the Andrea post-processing filters. You need to increase the curb feedback in the menu to feel them. The netcode is amazing, almost as good as Live for Speed.
> 
> Project Cars is pretty, but the physics are not up to spec with Assetto, and the netcode is in all honesty abysmal. It's still a great game to play offline, and it still feels like a sim, but Assetto is on another level.
> 
> Dirt Rally was said to be the spiritual successor to Richard Burns Rally, and I think they are correct. It's really good.
> 
> I play with 3x 1080p monitors. With bezel correction the resolution is about 5900x1080. But Eyefinity has a serious problem with vsync: only the center screen vsyncs, and the side screens feel like 20fps instead of 60. With vsync disabled, the tearing is so bad in Dirt Rally that I am forced to run with only one monitor and vsync on.
> 
> The tearing is much more acceptable in Assetto and pCARS.


Thanks for the detailed reply, +Rep. I guess I'll go with a single monitor then; choices are between 2560, 3440 ultrawide, and 4K. Definitely CrossFired, though.


----------



## aaroc

Quote:


> Originally Posted by *diggiddi*
> 
> Thanks for the detailed reply, +Rep. I guess I'll go with a single monitor then; choices are between 2560, 3440 ultrawide, and 4K. Definitely CrossFired, though.


If you want to play racing or flight sims/games, three screens is the way to go. The sensation of speed that the two side screens bring to your peripheral vision is priceless: to avoid collisions, to squeeze between cars, to counter maneuvers and avoid being overtaken. It is a real advantage over other players that only have one screen. GTA V is super fun on three screens too.


----------



## PachAz

Today I explored the overclocking abilities of my R9 290 OC Tri-X, and basically I have come up with two clocks. The first one is this:

GPU clock: 1100
Memory clock: 1400
VDDC offset: +75
Power limit: +50

And the second:

GPU clock: 1120
Memory clock: 1430
VDDC offset: +93
Power limit: +50

I tuned via the Sapphire TriXX utility and tested stability with 3DMark Firestrike Extreme: one initial run to make sure I didn't get any artifacts or crashes, and then 3-4 additional runs to make sure everything would work. I increased the core and memory by +10MHz each, and when getting artifacts or an application crash I upped the VDDC offset by +6 on the slider. I tuned the core first, and when I got it stable, I upped the VDDC by +6 and then increased the memory by +10MHz. Personally, I think Valley is really poor for testing stability, because I passed Valley several times with no artifacts at 1100/1400 and VDDC +51, but got an instant artifact and crash in Firestrike Extreme.

I still haven't found my max stable core, but I feel it needs a lot more VDDC past 1110MHz; in order to manage 1120/1400 I needed to increase the VDDC to +87. I have played around with both the core and memory today, but it's possible I will start all over again and try to find a higher core clock. At least for now, I have two decent OCs to fall back on if I don't succeed/lose patience, hehe. With +93 VDDC in the TriXX software, the voltage is 1.307v during benchmarking according to GPU-Z, and the temps after 2-3 Firestrike runs are 51C on the GPU, 54C for VRM1, and around 50C for VRM2. I expect these to increase when doing multiple runs back-to-back or gaming for a few hours. As far as I can tell, I don't have a great overclocker, since I need to up the VDDC a healthy bit to manage 1100/1400, something many can do at stock voltage.
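For anyone wanting to automate that kind of stepping, the procedure above (raise the core +10MHz, and on artifacts or a crash add +6 on the VDDC slider) can be sketched as a simple loop. This is only an illustration: `is_stable` stands in for an actual benchmark pass such as Firestrike Extreme, and the clock/offset caps are hypothetical safety limits, not recommendations.

```python
# Sketch of the +10 MHz / +6 mV stepping routine described above.
# is_stable() is a stand-in for a real benchmark pass (e.g. Firestrike
# Extreme); the core/offset caps are hypothetical safety limits.
def tune_core(is_stable, core=1000, vddc=0, core_max=1200, vddc_max=100):
    best = (core, vddc)
    while core + 10 <= core_max:
        core += 10
        while not is_stable(core, vddc):
            vddc += 6                # artifact or crash -> more voltage
            if vddc > vddc_max:
                return best          # out of voltage headroom, keep last good
        best = (core, vddc)
    return best

# Toy stability model: each 10 MHz past 1100 needs ~6 mV extra offset.
model = lambda c, v: c <= 1100 or v >= (c - 1100) * 6 // 10
print(tune_core(model))  # -> (1200, 60)
```

In practice each `is_stable` check is the slow part (several benchmark runs), which is why PachAz's manual approach of tuning the core first and the memory second is the sensible order.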


----------



## mus1mus

Just broke 11K in FS with the Insan1ty ROM on an all-AMD system! An amazing improvement over my 290, though clock rates have to be lowered.


----------



## Unknownm

Still cannot disable ULPS on Windows 10 using the 15.7.1 drivers... With 15.4 beta it took a few restarts to get it fully disabled; 15.7 is the same thing.

I've tried AB / Trixx / regedit. AB and Trixx show it's disabled, yet GPU-Z reports the second card's ULPS as active. I can't apply core/aux voltage to the second GPU unless I first start a 3D application and apply the profile while it's running; however, AB won't report core/aux voltage for the second GPU on the OSD, and the fan profile + power limit also reset to defaults on the second GPU when ULPS is enabled.

This is very annoying. I just want ULPS disabled... why is it so hard?


----------



## kizwan

Quote:


> Originally Posted by *Dooderek*
> 
> 3rd card in
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> minor oc, no volts
> 
> http://www.3dmark.com/3dm/7979255


Nice!







Get that CPU to 4.5GHz at least.








Quote:


> Originally Posted by *PachAz*
> 
> Today I explored the overclocking abilities of my R9 290 OC Tri-X, and basically I have come up with two clocks. The first one is this:
> 
> GPU clock: 1100
> Memory clock: 1400
> VDDC offset: +75
> Power limit: +50
> 
> And the second:
> 
> GPU clock: 1120
> Memory clock: 1430
> VDDC offset: +93
> Power limit: +50
> 
> I tuned via the Sapphire TriXX utility and tested stability with 3DMark Firestrike Extreme: one initial run to make sure I didn't get any artifacts or crashes, and then 3-4 additional runs to make sure everything would work. I increased the core and memory by +10MHz each, and when getting artifacts or an application crash I upped the VDDC offset by +6 on the slider. I tuned the core first, and when I got it stable, I upped the VDDC by +6 and then increased the memory by +10MHz. Personally, I think Valley is really poor for testing stability, because I passed Valley several times with no artifacts at 1100/1400 and VDDC +51, but got an instant artifact and crash in Firestrike Extreme.
> 
> I still haven't found my max stable core, but I feel it needs a lot more VDDC past 1110MHz; in order to manage 1120/1400 I needed to increase the VDDC to +87. I have played around with both the core and memory today, but it's possible I will start all over again and try to find a higher core clock. At least for now, I have two decent OCs to fall back on if I don't succeed/lose patience, hehe. With +93 VDDC in the TriXX software, the voltage is 1.307v during benchmarking according to GPU-Z, and the temps after 2-3 Firestrike runs are 51C on the GPU, 54C for VRM1, and around 50C for VRM2. I expect these to increase when doing multiple runs back-to-back or gaming for a few hours. As far as I can tell, I don't have a great overclocker, since I need to up the VDDC a healthy bit to manage 1100/1400, something many can do at stock voltage.


*Firestrike - 3x run back-to-back test*

GPU clock: 1100
Memory clock: 1300
VDDC offset: +37mV
Power limit: +50

GPU clock: 1100
Memory clock: 1600
VDDC offset: +87mV
Power limit: +50

Ambient : 32.2C
Case temp : 39.3C
Water temp : 38 - 39C
GPU #1 temp : 64C
GPU #2 temp : 67C

A 300MHz jump on the memory required a +50mV jump in voltage. These runs completed with zero artifacts.
Quote:


> Originally Posted by *Unknownm*
> 
> Still cannot disable ULPS on windows 10 using 15.7.1 drivers.... with 15.4 beta it took a few restarts to get it fully disabled, 15.7 the same thing.
> 
> AB / Trixx / Regedit. AB - Trixx show it's disabled yet GPUZ reports second card ULPS active. I can't apply core/aux voltage to second gpu unless I first start 3d application and apply profile while it's running... however AB won't report core/aux voltage on second GPU on OSD and also fan profile + powerlimit defaults on second GPU when ULPS is enabled
> 
> This is very annoying I just want ULPS disabled... why is it so hard


I use 15.7.1 too, and I use TriXX v4.9.0 to disable ULPS: in the Settings, tick "Disable ULPS" & then reboot.

EDIT: If TriXX shows "Disable ULPS" already selected, untick the checkbox & tick it again. Then reboot.


----------



## Unknownm

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Unknownm*
> 
> Still cannot disable ULPS on windows 10 using 15.7.1 drivers.... with 15.4 beta it took a few restarts to get it fully disabled, 15.7 the same thing.
> 
> AB / Trixx / Regedit. AB - Trixx show it's disabled yet GPUZ reports second card ULPS active. I can't apply core/aux voltage to second gpu unless I first start 3d application and apply profile while it's running... however AB won't report core/aux voltage on second GPU on OSD and also fan profile + powerlimit defaults on second GPU when ULPS is enabled
> 
> This is very annoying I just want ULPS disabled... why is it so hard
> 
> 
> 
> 
> 
> I use 15.7.1 too. I use Trixx v4.9.0 to disable ULPS. In the Settings, tick "Disable ULPS" & then reboot.
> 
> EDIT: If Trixx show "Disable ULPS" already selected, untick the checkbox & tick the checkbox again. Then reboot.

Just did this and got the same result. It won't disable for me.


----------



## kizwan

Quote:


> Originally Posted by *Unknownm*
> 
> Just did this and same result. It won't disable on me


That's weird. I didn't install or try MSI AB in Win 10; it's probably broken with AB. Reinstall the driver:

- Uninstall MSI AB
- Update GPU-Z (it kinda bugs me that the old version shows Win 8, not Win 10; just in case it fails to read the correct status of the cards)
- Uninstall the AMD driver from Control Panel (if it asks, uninstall all versions)
- Reboot
- Install the AMD 15.7.1 driver
- Reboot
- Run TriXX. It may show "Disable ULPS" already selected; just untick & tick the checkbox
- Reboot


----------



## diggiddi

Quote:


> Originally Posted by *aaroc*
> 
> If you want to play racing or flight sims/games, three screens is the way to go. The sensation of speed that the two side screens bring to your peripheral vision is priceless: to avoid collisions, to squeeze between cars, to counter maneuvers and avoid being overtaken. It is a real advantage over other players that only have one screen. GTA V is super fun on three screens too.


Three monitors in landscape? What about portrait?


----------



## spyshagg

Quote:


> Originally Posted by *aaroc*
> 
> On what resolution do you want me to test? I have 4x R9 290X and 3x 2560x1440, my dream PC for racing sims
> 
> 
> 
> 
> 
> 
> 
> 
> What PC do you have? Can you share the link to the post-processing filters?


I would love to know if you can reproduce the same vsync problem I have.
With vsync enabled, the center screen is smooth but the side screens are choppy. This happens in all games with vsync enabled. It also happened with my old 4870X2, so maybe it's a crossfire issue?

You can see my entire PC specs in this Firestrike link: http://www.3dmark.com/fs/5573462

About Dirt Rally: I run at 5900x1080 and I'm always above 60fps in dry weather with everything set to ultra (except the very last graphical option, which is set to off).
But with weather, my two 290Xs' performance tanks hard; I have to drop almost all graphical settings :\ Can you maintain 60FPS with weather?


----------



## Streetdragon

Hiho, I have a little question!
Some have flashed the 290 with the BIOS of a 390.
I have two R9 290 Vapor-X cards. Could I benefit from the 390 BIOS and get more power out of them? Has someone here tested this already?

(I want to be an owner too: http://www.3dmark.com/fs/5567509 )


----------



## spyshagg

Quote:


> Originally Posted by *Streetdragon*
> 
> Hiho, I have a little question!
> Some have flashed the 290 with the BIOS of a 390.
> I have two R9 290 Vapor-X cards. Could I benefit from the 390 BIOS and get more power out of them? Has someone here tested this already?
> 
> (I want to be an owner too: http://www.3dmark.com/fs/5567509 )


Yes

with 2 290x

Left: 390x Bios
Right: stock 290x Bios



Framerate also increased about 12fps in battlefield4


----------



## Streetdragon

Is there a tutorial for dummies? And BTW, I have non-X cards, so I can't flash a 390X BIOS, right?


----------



## skyrrd

Take a look here: http://www.overclock.net/t/1564219/r9-390x-modified-bios-for-r9-290-290x-now-with-higher-compatibility-for-all

I tried it with a Sapphire 290 Tri-X New Edition (so a non-reference PCB).

What I found is that, where reference-PCB cards are all flashable, I have to edit the device ID (with a hex editor) to the ID of a 290X to get it working in Windows. Background:

Up to version 1.5a, all versions (390X BIOS for 290 AND 290X) had the same device ID, causing a blackscreen during boot but a working device in Windows.
Since version 1.6 this is fixed for reference PCBs, which now work on boot and in Windows.
If I want to use 1.6 upwards, I have to change the device ID back to the "wrong" value to get it to work at least in Windows.
If I use the correct device ID from 1.6 upwards, I get a working boot prompt but a blackscreen in the OS.
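The device-ID edit described above can be illustrated with a tiny byte-patching sketch. Everything here is an assumption for illustration: 0x67B1 (290) and 0x67B0 (290X) are the commonly reported Hawaii device IDs, but the real location of the ID in the ROM, its byte order, and the checksums atiflash verifies are not handled, so don't run this against an actual BIOS image.

```python
# Toy byte-patch of a device ID inside a fake ROM image. The IDs, offsets,
# and byte order are assumptions for illustration only; a real BIOS also
# carries checksums that a blind replace would break.
OLD_ID = bytes.fromhex("b167")  # 0x67B1 (R9 290), little-endian -- assumed
NEW_ID = bytes.fromhex("b067")  # 0x67B0 (R9 290X), little-endian -- assumed

def patch_device_id(rom: bytes, old: bytes = OLD_ID, new: bytes = NEW_ID) -> bytes:
    # Refuse to patch unless the old ID appears exactly once in the image.
    if rom.count(old) != 1:
        raise ValueError("expected exactly one occurrence of the old device ID")
    return rom.replace(old, new)

# Synthetic 16-byte "ROM" with the old ID at offset 6:
fake_rom = bytes(6) + OLD_ID + bytes(8)
patched = patch_device_id(fake_rom)
assert patched[6:8] == NEW_ID and len(patched) == len(fake_rom)
```

In practice people did this edit by hand in a hex editor, as skyrrd describes; the sketch just shows what the one-byte change amounts to.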


----------



## spyshagg

Sure

follow this link http://www.overclock.net/t/1564219/r9-390x-modified-bios-for-r9-290-290x-now-with-higher-compatibility-for-all/0_50
PS: before doing this, download GPU-Z and save your original BIOS in case you need it.

What I do is
1- download the correct bios file for my card (in the link above)
2- download atiflash.exe (in the link above) and copy to a folder on C: called atiflash or something
3- copy the bios from step1 to the same folder as atiflash.exe
4- Open command prompt in windows with administrator rights
5-
c:\>cd atiflash
c:\atiflash> atiflash.exe -f -p 0 nameofbiosfile.rom

6- wait until it finishes. Could last 5 minutes.
7- Shutdown your computer
8- Before turning your computer on, first go get some Jesus figurine or a picture of a holy saint
9- Pray you have not bricked your card
10- This doesn't happen that often but it could happen.
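The steps above can be wrapped in a small script. This is a hedged sketch, not a verified tool: it assumes `atiflash.exe` supports `-s` to save the current BIOS and `-f -p` to force-program it (the flags used in the post), and it must be run from an elevated prompt with the files in the working directory.

```python
# Sketch of the flashing procedure above: back up the current BIOS, then
# flash the new one. Flags follow the post and the linked thread; verify
# them against your atiflash version before use.
import subprocess

def flash(adapter: int, rom: str, atiflash: str = "atiflash.exe") -> None:
    # The "PS" step: save the original BIOS so you can recover later.
    subprocess.run([atiflash, "-s", str(adapter), "backup.rom"], check=True)
    # Steps 5-6: force-program the new ROM onto the given adapter.
    subprocess.run([atiflash, "-f", "-p", str(adapter), rom], check=True)
```

`check=True` makes the script stop if either atiflash invocation returns a non-zero exit code, so a failed backup never proceeds to the flash.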

As for turning your 290 into a 290x/390x, it was possible a few years ago, but those chips are no longer available.

Edit: beaten with more accurate information from above post!


----------



## mus1mus

I know some guys who unlocked their newly bought cards to 290X last month. Unlockable cards are still out there.


----------



## Paul17041993

Meant to post about this months ago, but my ol' 290X has successfully been hybrid-cooled:



Corsair HG10 + Fractal Design Kelvin T12 + Alphacool heatsinks + Alphacool fittings, a temperature fitting (hidden), and silver water to top it off (harhar). Results are pretty pleasing: temperatures a little better than the stock blower at full tilt, while the noise is drastically more favorable, with a gentle whoosh at its peak (1500RPM 120mm sleeve fan, no vibration). The pump is only given about 80% power (~11V), though, as it has a whine at full tilt, which is mostly expected anyway for its size.

I was going to make a dedicated topic about the full procedure at the time but... mostly forgot about it...


----------



## mfknjadagr8

Ok, so the 15.7 driver randomly gives me a blue screen in Witcher 3, which corrupts the driver so badly it won't even boot into Windows... On 15.7.1, about half the games I have won't start; I just get a blackscreen on launch, then an error citing the video driver... so I've rolled back to 15.6 and all the issues seem to be gone; however, the improvements are too.







Sad, since I was seeing some improvement and really liking the frame limiter in CCC.


----------



## Paul17041993

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Ok so 15.7 driver gives me blue screen randomly in witcher 3 which corrupts the driver so badly it won't even boot into Windows....15.7.1 about half the games I have won't start I just get blackscreen on start then error citing video driver...so I've rolled back to 15.6 and all issues seem to be gone however the improvements are too
> 
> 
> 
> 
> 
> 
> 
> sad since I was seeing some improvement and really liking the frame limiter in ccc


What numbers and file (if it gives one) does the BSOD show? What errors do the games give? And have you tried a full driver uninstall (including deleting the AMD folder under C:) > regclean (CCleaner is a good option) > reinstall?

It could possibly be a certain file not being updated correctly.


----------



## diggiddi

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Ok so 15.7 driver gives me blue screen randomly in witcher 3 which corrupts the driver so badly it won't even boot into Windows....15.7.1 about half the games I have won't start I just get blackscreen on start then error citing video driver...so I've rolled back to 15.6 and all issues seem to be gone however the improvements are too
> 
> 
> 
> 
> 
> 
> 
> sad since I was seeing some improvement and really liking the frame limiter in ccc


Try this:
Roll back to stock clocks, uninstall using DDU, then reinstall at stock clocks.


----------



## Paul17041993

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Ok so 15.7 driver gives me blue screen randomly in witcher 3 which corrupts the driver so badly it won't even boot into Windows....15.7.1 about half the games I have won't start I just get blackscreen on start then error citing video driver...so I've rolled back to 15.6 and all issues seem to be gone however the improvements are too
> 
> 
> 
> 
> 
> 
> 
> sad since I was seeing some improvement and really liking the frame limiter in ccc


Actually, I just realised you're likely using crossfire, so if our suggestions have no effect then I guess crossfire support must have broken with the recent changes... :/


----------



## mfknjadagr8

Quote:


> Originally Posted by *diggiddi*
> 
> Try this
> Roll Back to stock clocks and uninstall using DDU then reinstall with stock Clocks


I have been running stock clocks and have reinstalled about a dozen times at stock clocks.







The first thing I dropped was any overclock... but yeah, 15.6 works well enough, just not as good as it could be. The BSOD points to the ATI driver; running WhoCrashed shows the same thing, it just shows all instances... I tried 15.7 probably ten times in different ways, even took the overclocks off of everything, same issue... 15.7.1 was just a no-go: 3 out of the 5 games I normally play wouldn't even launch... another 10 or so of the 28 or so games I have wouldn't launch either. I tried reinstalling using DDU every time and running CCleaner before leaving safe mode and rebooting. I also tried it with just one card, not running crossfire... and same thing...


----------



## diggiddi

I mean stock CPU clocks too.


----------



## mfknjadagr8

Quote:


> Originally Posted by *diggiddi*
> 
> I mean stock cpu clocks too


Tried that too... it's been a journey over the course of a week... I even tried running the RAM at lower speeds, even below spec. Oh well, maybe there's hope for the next one :0


----------



## Paul17041993

Quote:


> Originally Posted by *mfknjadagr8*
> 
> tried that too...its been a journey over the course of a week...even tried running ram at lower speeds even below specs..oh well maybe theres hope for the next one :0


Have you tried using a single 290 in the rig? From what I can tell from your sig you have a crossfire setup, though I have no idea if both cards are waterblocked and use a solid bridge; in that case it may be possible to just leave one card with its power connectors unplugged...


----------



## mus1mus

Quote:


> Originally Posted by *Paul17041993*
> 
> you tried using a single 290 in the rig? from what I can tell by your sig you have a crossfire setup, though I have no idea if both cards are waterblocked and use a solid bridge, though in that case it may be possible to just leave one card without it's power connectors plugged in...


29X-series cards don't use bridges. He's watercooled, too.

IMO, if the only thing the games get from the new driver is crashes, revert to the old one that worked. Wait for a better alternative.


----------



## Paul17041993

Quote:


> Originally Posted by *mus1mus*
> 
> 29X series cards dont use Bridges. He's watercooled too.


I meant the waterblock bridges that are commonly used in water-cooled crossfire/SLI setups; sometimes they're just half-inch tubes/fittings.


----------



## mus1mus

Quote:


> Originally Posted by *Paul17041993*
> 
> meant waterblock bridges that are commonly used in water crossfire/SLI, sometimes they're just half-inch tubes/fittings.


I truly made a mistake there.


----------



## kizwan

No need to unplug anything when you want to test a single card. Just disable crossfire in CCC.

BTW, for anyone using Open Hardware Monitor and running it in the background: since the last couple of driver updates, I think 15.4 or 15.6 beta, games may crash because the GPU somehow becomes unresponsive and gets stuck at max clocks. It's probably isolated to crossfire. Close Open Hardware Monitor before playing.


----------



## ghabhaducha

Quote:


> Originally Posted by *ghabhaducha*
> 
> Alright, so when I had 15.5, My second 290x would fall asleep when not being used; however, when I installed 15.6, upon startup my second 290x would register a temperature in MSI afterburner and not fall asleep. I would have to disable crossfire, and then re-enable it to prevent a constant-wake on the second card. This situation was fixed in the 15.15 drivers, but has come back again with the 15.7 drivers. I enable ULPS on this system, and I don't really have any other problems besides this. Would you guys know why this may be happening?


I figured it out! For anyone else having this same problem, it was the AMD Gaming Evolved app. I didn't auto-login, so it would just run without being logged in, and my second 290x would remain active. Because it was set to auto-start, I didn't realize this until it updated itself today, and for a moment my second card was asleep when I started my computer. With the AMD 15.5 drivers, I apparently hadn't installed this app.

I hope this helps someone.


----------



## diggiddi

That app has a mind of its own; it would record games without my permission and slow down my system


----------



## PachAz

This is with stock settings 1000/1300:



And this is with an overclock at 1120/1430



Should I increase the core or the memory to get higher minimum fps? That is what I want. Also, what do you recommend as the highest safe voltage for a 24/7 setup if the GPU is under water? Now it's 1.307 V under full load.


----------



## kizwan

Quote:


> Originally Posted by *ghabhaducha*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ghabhaducha*
> 
> Alright, so when I had 15.5, My second 290x would fall asleep when not being used; however, when I installed 15.6, upon startup my second 290x would register a temperature in MSI afterburner and not fall asleep. I would have to disable crossfire, and then re-enable it to prevent a constant-wake on the second card. This situation was fixed in the 15.15 drivers, but has come back again with the 15.7 drivers. I enable ULPS on this system, and I don't really have any other problems besides this. Would you guys know why this may be happening?
> 
> 
> 
> I figured it out! For anyone else having this same problem, it was the AMD Gaming Evolved App. I didn't auto login, so it would just run without being logged into, and my second 290x would remain active. Because it was set to auto start, I didn't realize this until it updated itself today, and for a moment my second card was asleep when i started my computer. With the AMD 15.5 drivers, I apparently didn't install this app.
> 
> I hope this helps someone.

I don't understand what you are trying to achieve, but if you don't want the secondary card to fall asleep, then you just need to disable ULPS. ULPS is responsible for putting your secondary card to sleep when idle.


----------



## mfknjadagr8

Quote:


> Originally Posted by *kizwan*
> 
> I don't understand what are you trying to achieve but if you don't want the secondary card to fall asleep, then you just need to disable ULPS. ULPS is responsible putting your secondary card to fall asleep when idle.


no, he's saying he did want it asleep but it was not going into ULPS mode... I'm not sure why anyone would want this, especially when overclocking... maybe I feel this way because I live in the land of cheap power, which is great because I'd hate to have seen my current electric bill otherwise ($370)... but to be fair the poor AC is running nearly 100% all the time; it's undersized and inefficient and it's been 90s for a few months


----------



## kizwan

That makes sense then. Electricity isn't really cheap where I live. I try to limit the AC to 5 or 6 hours a day; more than that and the bill gets expensive.

That's OK, but the thing I really want to complain about is broadband. No higher than 1Mbps in my area. For a faster connection, I ended up subscribing to mobile 4G. Fast, but with a limited quota. The thing is, the cost per GB increases but the service quality is still poor.


----------



## Streetdragon

I had to reinstall the AMD drivers. Now I can't downsample anymore. I don't have the options!

Where has the checkbox gone?


----------



## ghabhaducha

Quote:


> Originally Posted by *kizwan*
> 
> I don't understand what are you trying to achieve but if you don't want the secondary card to fall asleep, then you just need to disable ULPS. ULPS is responsible putting your secondary card to fall asleep when idle.


Ya, the poster below is right. I wanted ULPS enabled to save power (and heat, but I'm on water so that's not as important). 2x 290X's @ 1075/1375 is more than enough for my needs at 1440p.

Now for a side note, based on what you are saying, ULPS disabled gives better overclocks?


----------



## kizwan

Quote:


> Originally Posted by *ghabhaducha*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I don't understand what are you trying to achieve but if you don't want the secondary card to fall asleep, then you just need to disable ULPS. ULPS is responsible putting your secondary card to fall asleep when idle.
> 
> 
> 
> Ya, the below poster is right. I wanted ULPS enabled to save power (and heat, but I'm on water so that's not as important). 2x 290x's @1075/1375 is more than enough for my needs at 1440p.
> 
> Now for a side note, based on what you are saying, ULPS disabled gives better overclocks?

ULPS can interfere with your overclock, especially when overvolting. Your cards may underperform, but 2 x 290s are already plenty powerful at 1080p & 1440p, so you may not have noticed it.


----------



## Paul17041993

Quote:


> Originally Posted by *PachAz*
> 
> This is with stock settings 1000/1300:
> 
> 
> Spoiler: image
> 
> 
> 
> 
> 
> 
> 
> And this is with an overclock at 1120/1430
> 
> 
> Spoiler: image
> 
> 
> 
> 
> 
> 
> 
> Should I increase the core or the memory to get higher minimum fps, because that is what I want. Also what do you recommend as the highest safe voltage for 24/7 setup if the gpu is under water. Now its 1.307 under full load.


increasing both should gain an improvement, but memory could possibly go 100-200 higher than it currently is. While the bandwidth gain isn't incredibly useful for 1080p, it should decrease access latency and increase performance. Try to increase the clocks separately though, as if you go too high on memory it can start to decrease performance.

voltage-wise, I'd think 1.3 is ideal, but you could possibly go higher if you can keep the core, VRMs and PCB pretty cool. 7970s could take as much as 1.4V 24/7 and I don't think Hawaii should be much different...

though those who've actually done such overclocks 24/7 would give more accurate advice, but me being absent for about a year I've lost track of everyone... :/


----------



## MalsBrownCoat

I just purchased an ASUS MG279Q 144Hz monitor today, and after doing so, the bug to go triple-monitor keeps biting me. But before I go and add 2 more to my order, I had a sudden "uh oh" moment about connectivity.

I'm running 2 R9 290X (ASUS DC2OC-4GD5)'s, watercooled, in Crossfire.

My concern is that my 290x's have 4 outputs: 2 DVI, 1 HDMI and 1 DP.
This new monitor only has 2 HDMI and 2 DP.

Please excuse the stupid question, but how can I achieve 1440p at 144Hz from these 2 cards across *3* of these monitors?

Thanks!


----------



## diggiddi

I think you'll need an active DisplayPort adapter/hub
http://support.amd.com/en-us/recommended/eyefinity-adapters

or one of these here
http://www.accellcables.com/products/ultraav-mini-displayport-1-2-mst-multi-display-hub?variant=729676045


----------



## MalsBrownCoat

Thanks for that, diggiddi. Though I don't think the hub will resolve the refresh rate issue.

I haven't been able to find anything online that contradicts the notion that MST hubs can only support up to 2 monitors at 1440p per hub (and even that is pretty taxing on the system). That might be fine if I had more display ports on my card, and could just do 2 hubs, but since there is only 1 DP on the card, the point is moot.

I looked at the Club 3D MST Hub, and it will do 1440p, but it's limited to supporting only 2 monitors at that resolution, and at 60hz:
http://www.club-3d.com/index.php/products/reader.en/product/mst-hub-1-3.html?page=3

The Accell model hub says pretty much the same thing, but elaborates on the issue a bit further towards the bottom of the page.
http://www.accellcables.com/products/ultraav-mini-displayport-1-2-mst-multi-display-hub?variant=729676045

This would be so much easier if the monitors had DVI inputs, but they don't (...seriously?!).
So I started wondering if the best route to go would be to try to utilize the DVI connectors on the _290x_ (using adapters), in addition to the DP connector.

Then my question becomes; _Does DVI-D (Dual Link) support 1440p and 144hz?_

And if I have to go to the adapter route, I don't know if the adapters are one-way, or if they'll work in either direction (can I plug the DVI output of the card to a DP adapter, then DP to the monitor, or is it only supported the other way around?).

These DP to DVI adapters are from Monoprice, and they'll do 1440p, but they say that they're limited to 60hz:
http://www.monoprice.com/Product?c_id=104&cp_id=10428&cs_id=1042806&p_id=12784&seq=1&format=2
Now, I'm not sure if that's just old information, or if they'd really handle 144hz today.

Accell has some DP to DVI adapters, and they say that they'll do 2560 x 1600, but I think they're also limited to 60hz.
http://www.amazon.com/Accell-UltraAV-B087B-002B-DisplayPort-Dual-Link/dp/B002ISVI3U?

I'm wondering if these "60hz limitations" are just old details? If it's DVI, won't it support 144hz anyway (please correct me if I'm wrong)?

If the adapters _do_ support 144hz, would it work for me to go with:
- a DVI-D output of the card, to an active DVI-D to DisplayPort adapter, to a DisplayPort input of monitor 1
- the second DVI-D output of the card, to an active DVI-D to DisplayPort adapter, to a DisplayPort input of monitor 2
- the DisplayPort output of the card, to a DisplayPort input of monitor 3

...all while managing to run the three monitors at 2560 x 1440, at 144hz?

*SIGH* Why must this be so difficult...or am I missing some minute detail that's causing me to overthink this?
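For the curious, the feasibility can be sanity-checked with some back-of-the-envelope arithmetic. This is only a rough sketch: the 20% blanking overhead and the per-link pixel-clock limits below are approximations (real timing standards like CVT-R2 use tighter blanking), not exact figures.

```python
# Rough check of whether common display links can carry 2560x1440 @ 144 Hz.
# Assumptions: ~20% blanking overhead and 24-bit colour; the link limits
# are approximate maximum pixel clocks, not guaranteed values.

def pixel_clock_mhz(width, height, refresh_hz, blanking_overhead=1.2):
    """Approximate required pixel clock in MHz for a given mode."""
    return width * height * refresh_hz * blanking_overhead / 1e6

LINK_LIMITS_MHZ = {
    "single-link DVI": 165,
    "dual-link DVI": 330,
    "HDMI 1.4": 340,
    "DisplayPort 1.2 (HBR2, 4 lanes)": 720,
}

needed = pixel_clock_mhz(2560, 1440, 144)
print(f"2560x1440 @ 144 Hz needs roughly {needed:.0f} MHz")
for link, limit in LINK_LIMITS_MHZ.items():
    verdict = "OK" if limit >= needed else "too slow"
    print(f"{link:32s} ~{limit} MHz: {verdict}")
```

By this estimate the mode needs roughly 640 MHz of pixel clock, so dual-link DVI and HDMI 1.4 are out regardless of adapters; only the DisplayPort output has the headroom.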


----------



## sinnedone

I think even the active adapters are limited in bandwidth. In the QNIX thread, anytime someone wants to overclock the refresh rate and run triple surround using the 2 DVI ports and the DisplayPort, the most they can get is like 80-90Hz with an active adapter. I don't think there are any active converters that will allow for 144Hz at 1440p.


----------



## MalsBrownCoat

So would it be a fair assumption that if I want to enjoy 1440p on 3 144hz monitors, I need either

1. A GPU that has (at least) 3 DisplayPort outputs

or

2. Monitors that also have DVI-D (Dual-Link) inputs

?


----------



## sinholueiro

Hi! I will be buying a 1440p build soon. I have the monitor already, and looking at graphics cards I have my doubts about the 200 and 300 series. I mean, I have the numbers clear and I know it's a rename, but I wanna know if I will get all the benefits of the 300 series on the 200 series in terms of drivers, etc... For example, do the 15.7 drivers equalise performance (the tessellation improvements, etc.)?


----------



## Paul17041993

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> I just purchased an ASUS MG279Q 144hz monitor today, and after doing so, the bug to go triple setup keeps biting me. But before I go and add 2 more to my order, I had a sudden "uh oh" moment; about connectivity.
> 
> I'm running 2 R9 290X (ASUS DC2OC-4GD5)'s, watercooled, in Crossfire.
> 
> What my concern is about, is that my 290x's have 4 outputs; 2 DVI, 1 HDMI and 1 DP.
> This new monitor only has 2 HDMI and 2 DP.
> 
> Please excuse the stupid question, but how can I achieve 1440p, at 144hz across these 2 cards to *3* of these monitors?
> 
> Thanks!


have you tested DVI > HDMI for the full 144Hz @ 1440p? That may be your only option, and even then I think you'll be capped at 120Hz. You can't run more than one 1440p @ 144Hz display off the single DisplayPort, as it's too much bandwidth.

So you're likely stuck at 3x 1440p 120Hz unless either the DVI > HDMI allows for 144Hz or you get a Fury...


----------



## MalsBrownCoat

It's my (albeit limited) understanding that HDMI tops out at 1080p and 60Hz.

So, my 2 brand new 290x cards are going to be pulled out of my system, and replaced with 2 390x's (the ASUS STRIX-DC3OC-8GD5 versions, which have _3_ Displayports, 1 HDMI and 1 DVI) that I just now bought.

Guess I'll put the 290x's up in the For Sale forum and call it a day.


----------



## Paul17041993

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> It's my (albeit, limited) understanding that HDMI tops out at 1080p and 60hz.


twice that for HDMI 1.4a, which is what the AMD cards use; HDMI 2.0 has twice that again, however AMD hasn't adopted that standard yet.


----------



## MalsBrownCoat

Just to verify, you're talking about the frequency, right?
So 1.4a supports up to 120hz, correct?


----------



## Paul17041993

Quote:


> Originally Posted by *MalsBrownCoat*
> 
> Just to verify, you're talking about the frequency, right?
> So 1.4a supports up to 120hz, correct?


bandwidth: 1080p @ 120Hz, 2x 1080p @ 60Hz, 4K @ 30Hz, etc. There are also different colour modes that change the bandwidth usage and can allow higher resolutions and/or refresh rates.

you'll have to check the monitor specs for what its HDMI supports, but I believe only the DP can supply the full 1440p @ 144Hz.
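Those mode limits line up with some rough arithmetic against HDMI 1.4's TMDS clock. A sketch, under loose assumptions: ~20% blanking overhead, 24-bit colour, and ~340 MHz as the HDMI 1.4 pixel-clock ceiling:

```python
# Which modes fit under HDMI 1.4's ~340 MHz TMDS clock (approximate,
# assuming ~20% blanking overhead and 24-bit colour)?
HDMI_14_MHZ = 340

def pixel_clock_mhz(w, h, hz, blanking=1.2):
    """Approximate required pixel clock in MHz for a given mode."""
    return w * h * hz * blanking / 1e6

MODES = {
    "1080p @ 120Hz": (1920, 1080, 120),
    "4K @ 30Hz": (3840, 2160, 30),
    "1440p @ 144Hz": (2560, 1440, 144),
}

for name, (w, h, hz) in MODES.items():
    mhz = pixel_clock_mhz(w, h, hz)
    status = "fits" if mhz <= HDMI_14_MHZ else "exceeds HDMI 1.4"
    print(f"{name:14s} ~{mhz:.0f} MHz: {status}")
```

1080p @ 120Hz and 4K @ 30Hz land around 300 MHz (both fit), while 1440p @ 144Hz lands well above 340 MHz, which is why only the DP output can carry it.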


----------



## Streetdragon

I have 2 R9 290 Vapor-X cards in crossfire. My question is, how can I help the top card run cooler? I have a fan in front of them that blows air on them, and on the side 2 fans that pull hot air out of the tower. Can I do something more for cooling?


----------



## diggiddi

Might wanna put the bottom card in the last PCIe slot to give them more space, or get an acrylic dummy card and put it between the 2 so it separates them


----------



## gatygun

Quote:


> Originally Posted by *Streetdragon*
> 
> I have 2 r9 290 Vapor cards in crossfire. My question is, how can i help the top card to run cooler? I have in front of them a fan that blows qir on them and on the side 2 fans that pulls hot air out of the tower. Can i do something more for cooling?


Both fans on the front pull air inside here; the one at the back pulls it out. This works best for my setup, as it pushes more cool air inside the case.


----------



## mus1mus

There's no practicality in this. But izz fun!

I wanna crack 15K Graphics on a single card!


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> There's no practicality in this. But izz fun!
> 
> I wanna crack 15K Graphics on a single card!


15k on a 290, that would be amazing. Very nice score btw, now I have something to shoot for later









----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> There's no practicality in this. But izz fun!
> 
> I wanna crack 15K Graphics on a single card!
> 
> 
> 
> 15k on a 290 that would be amazing. Very nice score btw, now i have something too shoot for later
> 
> 
> 
> 
> 
> 
> 

Pretty sure you can get a crack at this.









Wanna try my BIOS? part of the experiment.


----------



## spyshagg

Heres mine cracking *15k*









http://www.3dmark.com/fs/5662841


didn't need to pump full volts either. Only 153mV, with very minor artifacts

Edit: Some crossfire love:
http://www.3dmark.com/fs/5662654


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> Pretty sure you can get a crack at this.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Wanna try my BIOS? part of the experiment.


Yes, I guess I could try it. I wasn't gonna benchmark more on air because the watercooling parts should be here next week, but I don't know now.


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> Yes i guess i could try it. I wasn't gonna benchmark more on air because the watercooling parts should be here next week, but i don't know now.


YGPM.


----------



## sinnedone

Have a quick question.

When using afterburner in unofficial overclocking mode, what are the pros and cons of going with or without powerplay support?

Any inherent stability issues when using either method?

Thanks


----------



## BradleyW

Quote:


> Originally Posted by *sinnedone*
> 
> Have a quick question.
> 
> When using afterburner in unofficial overclocking mode, what are the pros and cons of going with or without powerplay support?
> 
> Any inherent stability issues when using either method?
> 
> Thanks


Pro = Prevents downclocking when using an fps limiter, reducing the chance of stutters and increased latency (very minor of course).

Con = Increases power draw and heat at idle and under load.


----------



## Paul17041993

Quote:


> Originally Posted by *sinnedone*
> 
> Have a quick question.
> 
> When using afterburner in unofficial overclocking mode, what are the pros and cons of going with or without powerplay support?
> 
> Any inherent stability issues when using either method?
> 
> Thanks


Quote:


> Originally Posted by *BradleyW*
> 
> Pro = Prevents down clocking when using an fps limiter to reduce chance of stutters and increased latency (very minor of course).
> 
> Con = Increases power draw and heat on idle and load.


I think it'd also be a workaround if you get any idle crashing when overclocked, e.g. when using multiple monitors with a high memory clock. I have had a crash from that, though I've never tested that option as I haven't been interested in running anything overclocked for quite a while...


----------



## battleaxe

I sure love the extra power of Xfire and the 290x... and unlike last time I had Xfire, this time I'm not experiencing any issues with it. Very happy so far.


----------



## kizwan

Quote:


> Originally Posted by *battleaxe*
> 
> I sure love the extra power of Xfire and the 290x... and unlike last time I had Xfire, this time I'm not experiencing any issues with it. Very happy so far.


----------



## crazycrave

Can I join the club?? This is the Tri-X 290X New Edition. I haven't overclocked it yet, but man it runs so cool; that benchmark ran at 65°C.

http://www.3dmark.com/fs/5570685


----------



## battleaxe

Let me just say I am very much loving my 290x and 290 in crossfire. I'm running 950mhz on the 290x and 1050mhz on the 290. Both cards can bench much higher than this, but the frames are good, so I just don't see the need to push them. Keeping heat down and enjoying the power. Gotta love AMD value these days!


----------



## morencyam

Quote:


> Originally Posted by *battleaxe*
> 
> Let me just say I am very much loving my 290x and 290 in crossfire. *I'm running 950mhz on the 290x and 1050mhz on the 290*. Both cards can bench much higher than this, but the frames are good, so I just don't see the need to push them. Keeping heat down and enjoying the power. Gotta love AMD value these days!


Won't they both default to the lowest clock of the two cards in crossfire?


----------



## battleaxe

Quote:


> Originally Posted by *morencyam*
> 
> Won't they both default to the lowest clock of the two cards in crossfire?


Mine don't. I can see in the AB OSD 950 on one and 1050 on the other while playing a game. I can clock each card independently as I see fit.


----------



## kizwan

Quote:


> Originally Posted by *morencyam*
> 
> Quote:
> 
> 
> 
> Originally Posted by *battleaxe*
> 
> Let me just say I am very much loving my 290x and 290 in crossfire. *I'm running 950mhz on the 290x and 1050mhz on the 290*. Both cards can bench much higher than this, but the frames are good, so I just don't see the need to push them. Keeping heat down and enjoying the power. Gotta love AMD value these days!
> 
> 
> 
> Won't they both default to the lowest clock of the two cards in crossfire?

Nope. Both cards can run at their own clocks in crossfire.


----------



## morencyam

Quote:


> Originally Posted by *kizwan*
> 
> Nope. Both cards can run at their own clocks in crossfire.


Got it. I've only ever run SLI in the past so I wasn't sure if Crossfire worked the same way or not. Thanks for the info. +rep


----------



## HOMECINEMA-PC

Has anyone upgraded to the latest catalyst driver ??


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Has anyone upgraded to the latest catalyst driver ??


I did. I haven't gamed much atm. GTA V seems fine with the drivers.


----------



## diggiddi

15.7.1, yeah


----------



## Gumbi

Can anyone do me a favour? I recently repasted my GPU in order to perhaps gain a few degrees in temps. I believe I pasted it quite improperly (I put a dab in the middle, not too much, which I thought was prudent), as it's now getting quite hot under load (even though I've confirmed it's mounted correctly/tightly).

Can anyone give me an idea of how hot your 290 Vapor-X gets under load? I ran Heaven earlier, 100% fan, 75mV at 1175/1600, and within 5 mins my core temp was hitting 89 degrees, which I think is ridiculously hot for a Vapor-X; as a reference point, my VRMs were both 55 degrees or lower.

I'm going to repaste it at the weekend anyway, hopefully nothing's wrong


----------



## Streetdragon

I have crossfire. Under Firestrike at 1150/1500 my top card can reach 85° and the second ~70°, so it is a bit hot!
btw nice clocks! Are you on the stock BIOS or the modded 390?


----------



## Gumbi

Nope, just the stock Vapor-X setup. I can maintain 1650 on the mem too if I can keep the core cool enough. Something's definitely up with the temps anyway; I suspected it at first after the initial repasting, but it's got progressively worse and I was certain something was wrong then.

My idle core temps I suspected were *slightly* too high. With ambients of about 19 degrees, the VRMs were 22 and 23 degrees, while the core was 32-33 degrees; I think ~28 would be more usual. But the 90 degrees sealed the deal for me.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Gumbi*
> 
> Nope, just stock VaporX setup
> 
> 
> 
> 
> 
> 
> 
> . I can maintain 1650 on the mem too if I can keep the core cool enough. Something's definitely up with the temps anyways, I suspected it at first after the inital repasting, but it's got progressively worse and I was certain something was wrong then.
> 
> My idle core temps I suspected were *slightly* too high. Ambients of about 19 degrees, VRMs were 22 and 23 degrees, while core was 32-33 degrees, and I think 28~ would be more usual. But the 90 degrees sealed the deal for me.


I used about 1/4 of what I use on a CPU, then mounted and removed to check the spread... then reapplied the same way since the spread was beautiful... it's a pain to remount a few times, but it gives you a great baseline for the next tune


----------



## kizwan

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gumbi*
> 
> Nope, just stock VaporX setup
> 
> 
> 
> 
> 
> 
> 
> . I can maintain 1650 on the mem too if I can keep the core cool enough. Something's definitely up with the temps anyways, I suspected it at first after the inital repasting, but it's got progressively worse and I was certain something was wrong then.
> 
> My idle core temps I suspected were *slightly* too high. Ambients of about 19 degrees, VRMs were 22 and 23 degrees, while core was 32-33 degrees, and I think 28~ would be more usual. But the 90 degrees sealed the deal for me.
> 
> 
> 
> I used about 1/4 what I use on cpu then mounted and removed to check the spread...then reapplied same way since spread was beautiful...it's a pain to remount a few times but it gives you a great baseline for the next tune

This is what I did. I'm willing to waste a little TIM to get or confirm a good spread.


----------



## Agent Smith1984

I was going to wait on 15.7.1 until my Windows 10 update is available.....

Thoughts on either?


----------



## mus1mus

15.7 and 15.7.1 are very close. Maybe some games benefit, or some bugs were taken away.

With Windows 10, I just spent the whole day trying it and wondering why the system reboots when subjected to even a slight load! Must be something wrong with installing W10 in UEFI mode. The last issue pointed to when I installed the graphics driver.


----------



## Agent Smith1984

They're back....
http://www.newegg.com/Product/Product.aspx?Item=N82E16814202146

Refurb Tri-X 290 for $220.....

I had bought one of these before selling off my Crossfire set and was VERY happy...


----------



## mAs81

15.7 on Win 7 for me was a BSOD extravaganza








I think I'll refrain from using those Win 10 drivers until I actually have that OS on my desktop - and that will take a while.. Haven't tried 15.7.1 at all tho...


----------



## battleaxe

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Has anyone upgraded to the latest catalyst driver ??


15.7.1 here on Win10. Runs fine so far.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> They're back....
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202146
> 
> Refurb Tri-X 290 for $220.....
> 
> I had bought one of these before selling off my Crossfire set and was VERY happy...


Holy Crap. You're killing me...

Can an ASUS p8z77 Pro do Tri-Xfire?


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> 15.7.1 here on Win10. Runs fine so far.
> Holy Crap. You're killing me...
> 
> Can an ASUS p8z77 Pro do Tri-Xfire?


Dunno,

But you could sell your MSI 290, and have matching Tri-X crossfire if not....


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Dunno,
> 
> But you could sell your MSI 290, and have matching Tri-X crossfire if not....


Very true, but I'd better wait for an 8GB 390X or Fury X instead... as I put the wallet back in my pocket.

Still tempting I must say. If it were a 290x I would have already hit the 'buy' button for sure.

Edit: I am freaking loving these cards in Xfire for gaming. I don't even bother to bench much anymore. It's just more fun to play, especially at 4K with all this power. Looks awesome and just sucks you into the game. I guess that's the whole point of this hobby.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I was going to wait on 15.7.1 until my WIndows 10 update is available.....
> 
> Thoughts on either?


the 15.7 update for me caused a lot of blue screening citing the video driver, and 15.7.1 seems to have an issue launching games for me... the biggest offender is Ground Zeroes... because Phantom Pain is coming, and I would roll back as far as I had to to ensure that game runs properly; it's gonna be epic... no comments on Windows 10 as I'm on the fence about trying it


----------



## JourneymanMike

Quote:


> Originally Posted by *mfknjadagr8*
> 
> 15.7 update for me had caused a lot of blue screening citing video driver and 15.7.1 seems to have an issue launching games for me...the biggest offender of these is ground zeroes..because phantom pain is coming and I would roll back to 5.7 if I had to to ensure that game runs properly it's gonna be epic....*no comments on windows 10 as I'm on the fence about trying it*


I'm not putting it on my Sig Rig yet.

I have it on my laptop, trying it out.


----------



## mfknjadagr8

Quote:


> Originally Posted by *JourneymanMike*
> 
> I not putting it on my Sig Rig, yet
> 
> I have it on my laptop , trying it out.


I'm considering upgrading my desktop to Win 10 and trying to use the key on our other PC that is currently running a trial version, because somehow my older Windows 7 keys won't work even though I know one is retail... but alas, the more I read about Windows 10 the more unsure I am... but I did the same thing with Windows 7; I procrastinated until XP wasn't supported anymore, then ran it a few more years lol... I skipped Vista altogether (thank goodness)


----------



## fyzzz

I have finally watercooled my computer. I also broke 12k overall now, but a 15k GPU score will require some more tweaking. Much less artifacting on water, and I got about 60MHz more on the core. Maybe I need to test a regular 290 bios, because I have hit 1270 on a regular 290 bios on air... here's the Firestrike link: http://www.3dmark.com/fs/5694797. Much more testing must be had.


----------



## battleaxe

Quote:


> Originally Posted by *fyzzz*
> 
> I have finally watercooled my computer. I also broke 12k overall now, but 15k gpu score will require some more tweaking. Much less artifacting on water and got about 60mhz more on the core. Maybe i need to test a regular 290 bios, because i have hit 1270 on a regular 290 bios on air...here's the firestrike link:http://www.3dmark.com/fs/5694797. Much more testing must be had.


Cheez ol' Pete's... these are killing 970's now... who knew? LOL

The 970 came out and was beating the 290 for a bit. Then drivers caught up, and now an older card that costs less is beating the 970 again.


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> I have finally watercooled my computer. I also broke 12k overall now, but 15k gpu score will require some more tweaking. Much less artifacting on water and got about 60mhz more on the core. Maybe i need to test a regular 290 bios, because i have hit 1270 on a regular 290 bios on air...here's the firestrike link:http://www.3dmark.com/fs/5694797. Much more testing must be had.


Vuuury nice bud!

Let's go race for 15K.


----------



## MIGhunter

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I was going to wait on 15.7.1 until my WIndows 10 update is available.....
> 
> Thoughts on either?


I upgraded to Windows 10. All was fine. Then I upgraded to 15.7.1 and rebooted into the black screen of death.

Here's a thread on the AMD forums about it
https://community.amd.com/thread/184727


----------



## Unknownm

Quote:


> Originally Posted by *MIGhunter*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> I was going to wait on 15.7.1 until my WIndows 10 update is available.....
> 
> Thoughts on either?
> 
> 
> 
> I upgraded to windows 10. All is fine. Then I upgraded to 15.7.1 and rebooted into the Black screen of death.
> 
> Here's a thread on the AMD forums about it
> https://community.amd.com/thread/184727

I've had this issue since 15.4 on the Windows 10 Tech Preview. Pretty annoying, because it's 6 seconds to POST (RAID BIOS included) and another 3-5 seconds to boot Windows, then another 20-30 seconds of blank screen before login. Thank god it's not a couple of minutes or I would lose my mind


----------



## Paul17041993

Quote:


> Originally Posted by *Gumbi*
> 
> Can anyone do me a favour? I recently repasted my GPU in order to perhaos gain a few degrees in temps. I believe I either pasted it quite improperly (I put a dab in the middle, not too much, which I thought was prudent), as it's now getting wuite hot under load (even though I've confirmed it's mounted correctly/tightly).
> 
> Can anyone give me an idea of how hot your 290 Vapor X gets under load. I ran Heaven earlier, 100% fan, 75mv at 1175/1600, and within 5 mins my core temp was hitting 89 degrees, which I think is ridiculously hot for a Vapor X; as a reference point, my VRMs were both 55 degrees or lower.
> 
> I'm going to repaste it at the weekend anyway, hopefully nothing's wrong


With such results I would check the main 4 screws that hold the block to the die; they could be off-balance or simply not tight enough. Alternatively, you could have used a little too much paste.
Personally I don't use the 'dab in the middle' technique; I do a pair of squares or rectangles at an even offset from each other and the die edge, then apply-remove-apply the heatsink once or twice until it's an ideal thickness and lacking bubbles.


----------



## prjindigo

Quick question: Sapphire blower-type 290X with both BIOS positions tanked; any way to clear that out?


----------



## fyzzz

Come on firestrike! 5 points from 15k...http://www.3dmark.com/fs/5698317


----------



## MIGhunter

Quote:


> Originally Posted by *Unknownm*
> 
> I've had this issue since 15.4 on the Windows 10 Tech Preview. Pretty annoying, because it's 6 seconds to POST (RAID BIOS included), another 3-5 seconds to boot Windows, then another 20-30 seconds of blank screen before login. Thank god it's not a couple of minutes or I would lose my mind


See that's the problem. Mine never went past the black screen. I waited over an hour.


----------



## Paul17041993

Quote:


> Originally Posted by *prjindigo*
> 
> Quickie question: Sapphire blower-type 290X with a double-tanked bios, any way to clear that out?


If you have another GPU or display adapter in the system to at least get you to a command line, you should be able to re-flash at least one of the BIOSes by using the port argument in the flash utility (atiwinflash in a Windows environment). Be VERY careful not to get the ports wrong, though; I bricked my old 5770 from that mistake, although it was virtually dead anyway...
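As a rough sketch of that procedure (the adapter index `0` and the ROM file names here are placeholders, and every flag should be double-checked against your flasher's own help output before running anything):

```shell
# List installed adapters and their indices; identify which one is the bricked card
atiflash -i

# Save whatever is currently in that BIOS position, if it is still readable
atiflash -s 0 backup.rom

# Program adapter 0 with a known-good ROM (-f forces past subsystem ID mismatches)
atiflash -p 0 goodbios.rom -f
```

Flashing the wrong adapter index is exactly the mistake described above, so verify the `-i` listing twice before programming.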


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> Come on firestrike! 5 points from 15k...http://www.3dmark.com/fs/5698317


Good job! Are you using the 390X BIOS? Any issues?


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> Come on firestrike! 5 points from 15k...http://www.3dmark.com/fs/5698317


Witweeew! Very sweet!


----------



## Agent Smith1984

Damn, Hawaii's hitting 15k on graphics... that is impressive.

What kind of voltage/cooling is being used for these ~1250 core clocks?

I can get the 1750+ memory clock, but I have trouble with the core past 1200MHz using 100mv.....


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> Good job! You are using 390X bios? any issues?


Yes, I'm using the 390 BIOS from Insan1tyOne's thread with modified clock speed and TDP/PL/TDC limits. It can be finicky sometimes; black screens can occur for no reason, among other things. It can't clock as high as on a normal 290 BIOS, but that doesn't really matter since the score is so much higher. The 390 BIOS also runs much cooler: when I was testing in Fire Strike (200mv), my core temp got to around 45c and VRM1 to around 45c as well. Overall it works pretty well.


----------



## fyzzz

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Damn, Hawaii's hitting 15k on graphics... that is impressive.
> 
> What kind of voltage/cooling is being used for these 1250~ core clocks?
> 
> I can get the 1750+ memory clock, but I have trouble with the core after 1200MHz using 100mv.....


I just maxed out the offset in TriXX (200mv). I'm on water now, so I can maybe tweak things to get higher; I just built the loop. Under air, 1180 was my bench clock; now I can run that stable with 75mv (haven't tested lower).


----------



## Agent Smith1984

Quote:


> Originally Posted by *fyzzz*
> 
> I just maxed out the offset in trixx(200mv). I am on water now i can maybe tweak things to get higher, just built the loop. Under air 1180 was my bench clock,now i can run that stable with 75mv (haven't tested lower).


Yeah, my 390 does 1180 on air at 75mv as it sits (daily stable), which really makes me want to put it under water... there's no telling what it could do under water at 200mv, but unfortunately there are no plans to make a full-cover block for the MSI 390s...


----------



## Ized

Anyone have a 290*X* (4gb) 015.049 or 015.048 Bios?


----------



## moorhen2

FireStrike run from me, not too shabby.

http://www.3dmark.com/3dm/8175658

http://s572.photobucket.com/user/moorhen2/media/Capture20202.jpg.html


----------



## ReHWolution

Sapphire R9 290X Vapor-X 8GB, liquid cooled by Alphacool NexXxos GPX 290 M08. Here's the validation: http://www.techpowerup.com/gpuz/details.php?id=ed64k
and here's a picture









http://imgur.com/DkDBiuZ


----------



## rdr09

Thanks for sharing, fy. Break that 15.


----------



## Arizonian

Quote:


> Originally Posted by *Dooderek*
> 
> 3rd card in
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> minor oc, no volts
> 
> http://www.3dmark.com/3dm/7979255


Congrats - updated








Quote:


> Originally Posted by *Paul17041993*
> 
> meant to post about this months ago, but my ol 290X has successfully been hybrid-cooled;
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> corsair HG10 + fractal design kelvin T12 + alphacool heatsinks + alphacool fittings, temperature fitting (hidden) and silverwater to top it off (harhar). Results are pretty pleasing, temperatures a little better than the stock blower at full-tilt while the noise is drastically more favorable with a gentle whoosh at its peak (1500RPM 120mm sleeve fan, no vibration). Pump is only given about 80% power (~11V) though as it does have a whine at full-tilt, which is mostly expected anyway for the size of it.
> 
> was going to make a dedicated topic about the full procedure at the time but... mostly forgot about it...


Congrats - updated








Quote:


> Originally Posted by *Streetdragon*
> 
> I had to reinstall the AMD drivers. Now I can't downsample anymore. I don't have the option!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Where is the checkbox gone?


Congrats - added








Quote:


> Originally Posted by *crazycrave*
> 
> Can I join the club? This is the Tri-X 290X New Edition and I haven't overclocked it yet, but man it runs so cool; that benchmark ran at 65c.
> 
> http://www.3dmark.com/fs/5570685


Congrats - added









Quote:


> Originally Posted by *ReHWolution*
> 
> Sapphire R9 290X Vapor-X 8GB, liquid cooled by Alphacool NexXxos GPX 290 M08. Here's the validation: http://www.techpowerup.com/gpuz/details.php?id=ed64k
> and here's a picture
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://imgur.com/DkDBiuZ


Congrats - updated


----------



## mus1mus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Damn, Hawaii's hitting 15k on graphics... that is impressive.
> 
> What kind of voltage/cooling is being used for these 1250~ core clocks?
> 
> I can get the 1750+ memory clock, but I have trouble with the core after 1200MHz using 100mv.....


Are the 300 series clocking the memory better?


----------



## diggiddi

Looks like it


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> Thanks for sharing, fy. Break that 15.


Here you go http://www.3dmark.com/3dm/8184501? 15k gpu score


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> Here you go http://www.3dmark.com/3dm/8184501? 15k gpu score


Amazing 290 you have. It would take any of my 290s with the original BIOS maybe 1400 on the core to match yours, which is impossible.

I'm not sure if it's worth it with the accompanying issues; maybe Insan1tyOne will do further refinement.

Congrats!


----------



## Paul17041993

It appears VSR and GPU scaling may be broken for some DX titles in Windows 10 and Catalyst 15.7.1; anyone got any experiences? Currently grabbing Heaven 4.0 to try and test individual DX levels. OGL may be fine though...


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> Here you go http://www.3dmark.com/3dm/8184501? 15k gpu score


Congrats man.

Imma catch it if I still can.









repped for being the first? +1


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> Congrats man.
> 
> Imma catch if if I still can.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> repped for being the first? +1


Thanks! Took a little bit of tweaking before it went over 15k. I can run a higher core clock, but test 2 or combined crashes if I do that, while test 1 is just fine. I will tinker some more and see if I can squeeze more out of it; I doubt it, but I'm going to keep benchmarking. So happy anyway to break 15k, and hopefully the 390 BIOS can be further refined. I hope you also reach 15k, that would be fun.


----------



## sledgehammer1990

1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
2. Manufacturer & Brand - Please don't make me guess.
3. Cooling - Stock, Aftermarket or 3rd Party Water

Finally picked up a 290X yay! Way better than my 750 Ti.

Validation Link

Sapphire R9 290X

Tri-X Cooling


----------



## Paul17041993

Quote:


> Originally Posted by *Paul17041993*
> 
> it appears VSR and GPU scaling may be broken for some DX titles in windows 10 and catalyst 15.7.1, anyone got any experiences? currently grabbing heaven 4.0 to try and test individual DX levels, OGL may be fine though...


OK;
- DX11 is completely shot scaling-wise, and I mean completely: either run full-native, run windowed, or don't run at all.
- DX9 and OGL work, at least mostly, but o lorde does Windows kark itself with Heaven in these modes; it can't decide whether the window is active or not and repeatedly flashes out of context, leaving the taskbar off its rocker...


----------



## obababoy

Quote:


> Originally Posted by *ReHWolution*
> 
> Sapphire R9 290X Vapor-X 8GB, liquid cooled by Alphacool NexXxos GPX 290 M08. Here's the validation: http://www.techpowerup.com/gpuz/details.php?id=ed64k
> and here's a picture
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://imgur.com/DkDBiuZ


I am so far off from this it makes me sick lol. My Vapor-X 290 4GB can't play games at anything over 1150 core without artifacting badly. I am impressed with the cooling of this card, but it hates being overclocked.


----------



## ReHWolution

Quote:


> Originally Posted by *obababoy*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ReHWolution*
> 
> Sapphire R9 290X Vapor-X 8GB, liquid cooled by Alphacool NexXxos GPX 290 M08. Here's the validation: http://www.techpowerup.com/gpuz/details.php?id=ed64k
> and here's a picture
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://imgur.com/DkDBiuZ
> 
> 
> 
> 
> 
> I am so far off from this it makes me sick lol. My Vapor-X 290 4GB can't play games at anything over 1150 core without artifacting badly. I am impressed with the cooling of this card, but it hates being overclocked.

Actually everything is at default, because of 35°C+ ambient temperatures... as soon as it gets cooler I'll go fully overclocked


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> Are the 300 series clocking the memory better?


I haven't seen a 390 that won't do 1700+ on the memory, even if it takes a touch of AUX voltage.

Mine is stable up to 1780 with 75mv AUX voltage, but seems to get diminishing returns after 1740.

I run it at 1700/ 50mv AUX daily, with my core at 1170 (50mv+)

Love this card....


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I haven't seen a 390 that won't do 1700+ on the memory, even if it takes a touch of AUX voltage.
> 
> Mine is stable up to 1780 with 75mv AUX voltage, but seems to get diminishing returns after 1740.
> 
> I run it at 1700/ 50mv AUX daily, with my core at 1170 (50mv+)
> 
> Love this card....


Wow. That's probably the biggest reason the 3 series are doing better on benchmarks then, right?


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> Wow. That's probably the biggest reason the 3 series are doing better on benchmarks then, right?


Actually, there are some BIOS tweaks going on as well, because people are putting the 390 BIOS on their 290s and ending up with less overclocking headroom but better performance.
It's pretty serious gains too.....

AMD said ALL ALONG that calling the 390 just a rebrand was a discredit to their teams' refinement of the Hawaii cards. I think they were right!

I'm glad to see 290 owners get the beefed-up drivers, and the ability to mod their BIOS too (some of them, anyway).


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Actually, there are some kind of BIOS tweaks going on also, because people are putting 390 BIOS on their 290's, and ending up with less overclock ability, but better performance.
> It's pretty serious gains too.....
> 
> AMD said ALL ALONG that to just call the 390 a rebrand was a discredit to their teams' refinement of the Hawaii cards. I think they were right!
> 
> I am glad to see 290 owners get the beefed up drivers, and ability to mod their BIOS though (some anyways).


Yeah, I'm very happy with the 290x and my old 290 overall. Can't complain really. I have Samsung memory on the 290x, but it will only hit around 1600MHz on the RAM, so it's not going to beat any good 390x's anytime soon. The core is better than normal; I can get well over 1200MHz, but I've not really messed with it much. I haven't touched the AUX voltage you mentioned at all; just put it up to around 100mv at most. This card doesn't seem to like going much over 100mv for some reason. Maybe I should try the AUX voltage for that.


----------



## mus1mus

Quote:


> Originally Posted by *Agent Smith1984*
> 
> I haven't seen a 390 that won't do 1700+ on the memory, even if it takes a touch of AUX voltage.
> 
> Mine is stable up to 1780 with 75mv AUX voltage, but seems to get diminishing returns after 1740.
> 
> I run it at 1700/ 50mv AUX daily, with my core at 1170 (50mv+)
> 
> Love this card....


Wow. MSI AB?

For some reason I can't pull more than 1625 on my memory slider! Am I doing something wrong here?

Imma look that up, 'coz I know for sure I'm hitting the voltage I need to feed my chip even at +200 in TriXX.

Nice to know the 300 series are clocking the memory well. ( hmmm







)








don't do it mus!


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> Wow. MSI AB?
> 
> For some reason I can't pull more than 1625 on my memory slider! Doing something wrong here?
> 
> Imma look that up. Coz I know for sure I am hitting the Voltage I need to feed my chip even at +200 on Trixxx.


Update AB. I have a lot more room on my slider than I can even access, so it sounds like something on your rig.


----------



## Maticb

Um, ugh, I have a question, if any of you guys have ever experienced something like this:
I was playing CSGO and my PC black-screened (no problem there, my R9 290 does that a couple of times a month), but after I restarted it booted but showed no image, like the GPU was dead. I restarted a couple of times, then re-seated the card, booted once with no power to the GPU and then again, and now it boots again.

I've never had this happen before; every time with a black screen a simple reset worked fine. Is that normal? Anything I can do about it? I don't use crossfire in CSGO and my card is under water, so it never goes anywhere near the stock 94 degrees. Everything was working fine.


----------



## battleaxe

Quote:


> Originally Posted by *Maticb*
> 
> Um, ugh, I have a question, if any of you guys have ever experienced something like this:
> I was playing CSGO and my PC black-screened (no problem there, my R9 290 does that a couple of times a month), but after I restarted it booted but showed no image, like the GPU was dead. I restarted a couple of times, then re-seated the card, booted once with no power to the GPU and then again, and now it boots again.
> 
> I've never had this happen before; every time with a black screen a simple reset worked fine. Is that normal? Anything I can do about it? I don't use crossfire in CSGO and my card is under water, so it never goes anywhere near the stock 94 degrees. Everything was working fine.


Increase your voltage a tad, say 25mv, and see if that helps.

Or lower your memory clock by 50MHz; one or the other.


----------



## Archea47

Quote:


> Originally Posted by *Maticb*
> 
> Um, ugh, I have a question, if any of you guys have ever experienced something like this:
> I was playing CSGO and my PC black-screened (no problem there, my R9 290 does that a couple of times a month), but after I restarted it booted but showed no image, like the GPU was dead. I restarted a couple of times, then re-seated the card, booted once with no power to the GPU and then again, and now it boots again.
> 
> I've never had this happen before; every time with a black screen a simple reset worked fine. Is that normal? Anything I can do about it? I don't use crossfire in CSGO and my card is under water, so it never goes anywhere near the stock 94 degrees. Everything was working fine.


Windows 10? There are other reports of black screen on boot with Windows 10 this week on OCN


----------



## Maticb

Quote:


> Originally Posted by *Archea47*
> 
> Windows 10? There are other reports of black screen on boot with Windows 10 this week on OCN


No, and it's not "on boot"; it's like the graphics card is dead, as I said. Even the POST didn't show.

Also, I've noticed my new motherboard has some weird things going on with the PCIe lanes (MSI X99S SLI Plus). One time the electricity died (faulty extension cable in the kitchen), and when I restarted it wasn't showing the 2nd card; I mean in the BIOS, it wasn't shown there. It's not the card, because my previous AMD setup with the GA-990FXA-UD3 never had that kind of problem.

I've only had problem after problem since I upgraded to the i7 5820K + X99S









EDIT: Okay so I've installed the new 15.7.1 drivers and added +50mV with a mild overclock (which I used to run with no voltage increase), let's see how it goes.


----------



## Streetdragon

Quote:


> Originally Posted by *mus1mus*
> 
> Wow. MSI AB?
> 
> For some reason I can't pull more than 1625 on my memory slider! Doing something wrong here?
> 
> Imma look that up. Coz I know for sure I am hitting the Voltage I need to feed my chip even at +200 on Trixxx.
> 
> Nice to know the 300 series are clocking the memory well. ( hmmm
> 
> 
> 
> 
> 
> 
> 
> 
> )
> 
> 
> 
> 
> 
> 
> 
> 
> don't do it mus!


You must enable "Extend official overclocking limits" in the MSI AB settings; then you can go higher!


----------



## Maticb

Is it possible that my black screen issues were caused by a faulty DVI cable? It happened again and I took my primary card out and tried different combinations on the secondary (after considering that the DVI port had died on the primary). Then I realized it boots on the one screen that's on HDMI and black-screens entirely if I plug my middle screen in. I switched the DVI cable for another one and booted with all 3 screens and both cards in successfully. Hopefully it's that rather than a dead DVI port or even my primary card.


----------



## battleaxe

Quote:


> Originally Posted by *Maticb*
> 
> Is it possible that my black screen issues were caused by a faulty DVI cable? It happened again and I took my primary card out and tried different combinations on the secondary (after considering that my DVI slot died on the primary) Then I realized it boots on the 1 screen thats on HDMI and goes black screen entirely if I plug my middle screen in. I switched the DVI cable for another one and booted on all 3 screens and both cards in successfully. Hopefully it's that over a dead DVI slot or even my primary card.


Entirely possible.


----------



## Paul17041993

Quote:


> Originally Posted by *Paul17041993*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Paul17041993*
> 
> it appears VSR and GPU scaling may be broken for some DX titles in windows 10 and catalyst 15.7.1, anyone got any experiences? currently grabbing heaven 4.0 to try and test individual DX levels, OGL may be fine though...
> 
> 
> 
> ok;
> - DX11 is completely shot scaling wise, and I mean completely, either run full-native or run windowed, or not at all.
> - DX9 and OGL work, at least mostly, but o lorde does windows kark itself with heaven on these modes, it cant decide whether the window is active or not and repeatedly flashes out of context and leaves the taskbar off it's rocker...

Good news everybody! In less than 24 hours after discovering the bugs, I have worked out specific causes:
- (DX11) if any context-altering or attaching applications are used, such as Fraps, render and interface scaling will cease to function, resulting in something similar to the picture below.
- (ALL) if any application is open in the background, context control breaks and you won't be able to keep the game/application in fullscreen; ensure _all_ applications are minimized, and that includes any launchers.
- (ALL) Windows interface handling loses its rocker if anything intensive is run, including but not limited to taskbar lag, keyboard lag, other applications lagging, Windows interface artifacting, etc.



Note that screenshots still work correctly; the bugs are only in the buffer-to-monitor stage, whereas screenshots are taken before or in the buffer stage.


----------



## gatygun

Well, kind of a weird thing: a mate of mine played on my PC for a while the other day and found the 290s perform so well that he wanted to buy them. As I only spent 380 on them, he bought them for 500 from me.

Now I can't use my gaming PC anymore, so I need to find some new cards. rofl


----------



## Paul17041993

Quote:


> Originally Posted by *gatygun*
> 
> Well kinda a weird thing, a mate of mine played on my pc for a while the last day and found the 290's perform so well that he wanted to buy them up. As i only spended 380 for it, he bought them for 500 from me.
> 
> Now i can't use my game pc anymore, so i need to find some new cards. rofl


500 for a pair of 290s is pretty cheap for AU, so I can only assume things are cheaper in your country. Fury X maybe? In raw specs a single Fury X is slightly less than a pair of 290s, but I would expect it to perform better on average, as you don't have to worry about crossfire profiles. The other option would be a pair of 390s, essentially the same as the 290s but with 8GB VRAM standard and slight efficiency gains.

Or just something like a 380, and save money for better GPUs next year...


----------



## HOMECINEMA-PC

I wouldn't mind another 290 that's been waterblocked


----------



## VulpesInculta

Hello to everyone!

I installed an R9 290 PCS+ in my PC yesterday, and while I noticed a great increase in performance over my recently deceased Sapphire HD 7850 2GB OC, I also couldn't help but notice the very intense sound of the card under load after about 5 minutes in The Witcher 3, mostly maxed out at 1600x1200 resolution (my PC isn't the greatest, and my monitor is kinda old). The noise gets really annoying because the fans stay at 80% according to AMD CCC, and the temperature stays at about 70c. So I was wondering if that is normal for the card? And sorry for my other stupid question, but could the problem also be because of insufficient case fans? I have only 1 intake at the front and 1 exhaust at the back in a Fractal R4. This worked pretty well with my previous card, but doesn't seem good now. Could I get better temps with more fans, or should I try to return the card? I am afraid to lower the fan speed because the card seems to run pretty hot, and because I don't really know what I am doing I am afraid I could damage it.

Any help would be greatly appreciated!


----------



## mfknjadagr8

Quote:


> Originally Posted by *VulpesInculta*
> 
> Hello to everyone!
> 
> I installed a r9 290 pcs+ in my pc yesterday and while i noticed a great increase in performance from my recently deceased sapphire HD 7850 2gb OC, i also couldn't help but notice the very intense sound of the card under load after about 5 minutes in The Witcher 3, mostly maxed out on 1600x1200 resolution (my pc isn't the greatest, and my monitor is kinda old).The noise gets really annoying because the fans stay at 80% according to AMD CCC, and the temperature stays at about 70c. So i was wondering if that is normal for the card? And sorry for my other stupid question but could the problem also be because of my insufficient case fans - i have only 1 intake at the front and 1 exhaust at the back in a Fractal R4.This worked pretty well with my previous card, but doesn't seem good now.Could i get better temps with more fans or should i try to return the card?I am afraid to lower the fan speed because the card seems to run pretty hot and because i don't really know what i am doing i am afraid i could damage it.
> 
> Any help would be greatly appreciated!


70c isn't really hot for these cards, especially on air... AMD says keep them under 94c... I say keep them under 80c... but 10c is a lot... you can probably just set up a fan profile and tweak it until it suits you... to RMA a card that's doing as well as it should is silly... I can see nothing wrong with 70c on air running a GPU-intensive game like The Witcher. I would start with better case airflow; air-cooled GPUs benefit greatly from properly optimized airflow.


----------



## Gumbi

Quote:


> Originally Posted by *VulpesInculta*
> 
> Hello to everyone!
> 
> I installed a r9 290 pcs+ in my pc yesterday and while i noticed a great increase in performance from my recently deceased sapphire HD 7850 2gb OC, i also couldn't help but notice the very intense sound of the card under load after about 5 minutes in The Witcher 3, mostly maxed out on 1600x1200 resolution (my pc isn't the greatest, and my monitor is kinda old).The noise gets really annoying because the fans stay at 80% according to AMD CCC, and the temperature stays at about 70c. So i was wondering if that is normal for the card? And sorry for my other stupid question but could the problem also be because of my insufficient case fans - i have only 1 intake at the front and 1 exhaust at the back in a Fractal R4.This worked pretty well with my previous card, but doesn't seem good now.Could i get better temps with more fans or should i try to return the card?I am afraid to lower the fan speed because the card seems to run pretty hot and because i don't really know what i am doing i am afraid i could damage it.
> 
> Any help would be greatly appreciated!


The 290 PCS has great cooling. It does have an aggressive fan profile however.

I would say your "problem" (not really a problem btw, 70c is fantastic for the card), is a mixture of average to poor airflow AND having an aggressive fan profile.

The Witcher is a very GPU-intensive game too, so you won't see much worse than the stress you're putting on it now. My advice is first to have a look at your fan profile and modify it so it's a bit less aggressive. If it gets up to 80c in game, that's OK. It really is. Next, have a look at your airflow situation; a good fan setup can aid temps a lot. But honestly, allowing your card to get a little warmer is not a big deal!


----------



## VulpesInculta

Quote:


> Originally Posted by *mfknjadagr8*
> 
> 70c isnt really hot for these cards especially on air...amd says keep them under 94c....I say keep them under 80c...but 10c is a lot...you can probably just set up a fan profile and tweak it until it suits you...to rma a card that it's doing as well as it should is silly....I can see nothing wrong with 70c on air running a gpu intensive game like the witcher...I would start with better case airflow...air cooled gpus benefit greatly from properly optimized airflow


It's good to know that this is normal for the card. How can I set up a fan profile that won't damage my system in some way? I am a noob at this kind of stuff; I've never overclocked or tweaked my stuff in any way before.

Also, is there a good guide about air cooling? I know my 2 case fans are most likely not enough, but I am not sure how many fans and what types I should get.


----------



## VulpesInculta

Quote:


> Originally Posted by *Gumbi*
> 
> The 290 PCS has great cooling. It does have an aggressive fan profile however.
> 
> I would say your "problem" (not really a problem btw, 70c is fantastic for the card), is a mixture of average to poor airflow AND having an aggressive fan profile.
> 
> The Witcher is a very GPU intensive game too, so you won't see any worse than the stress you're putting on it. My advice is first to have a look at your fan profile, and modify it so it's a bit less aggressive. If it gets up to 80c in game that's OK. It really is. Next, have a look at your airflow situation. A good fan setup can aid temps a lot. But honestly, allowing your card to get a little warmer is not a big deal!


What do you mean by fan profile?

I am sorry for the stupid questions, but I have no idea when it comes to this sort of stuff.

I could get more fans, but I'm not really sure how many I need and where to put them.

Tbh I just want to somehow reduce the noise, so the fans on the card won't be sitting at 80% constantly.


----------



## battleaxe

Quote:


> Originally Posted by *VulpesInculta*
> 
> What do you mean by fan profile?
> 
> I am sorry for the stupid questions but i have no idea when it comes to this sort of stuff.
> 
> I could get more fans but not really sure how many i need and where to put them.
> 
> Tbh i just want to somehow reduce the noise, so the fans on the card won't be sitting on 80% constantly.


Download MSI Afterburner. Go into settings and click the "Fan" tab. Then set up your fan curve to do what you want it to do. Remember to tick the option for Afterburner to start with Windows, so it comes on and shuts your GPU up as soon as AB starts.










Link:

http://gaming.msi.com/features/afterburner

Mine looks like this


----------



## VulpesInculta

Quote:


> Originally Posted by *battleaxe*
> 
> Download MSI Afterburner. Go into settings and click the "Fan" tab. Then setup your fan curve to do what you want it to do. Remember to click for Afterburner to start when you start Windows, so it comes on and shuts your GPU up as soon as AB starts.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Link:
> 
> http://gaming.msi.com/features/afterburner


Thanks for the help. Can I ask what would be a decent fan profile? Also, can I damage my card with this?


----------



## battleaxe

Quote:


> Originally Posted by *VulpesInculta*
> 
> Thanks for the help. Can i ask what would be a decent fan profile? Also can i damage my card with this?


This won't do anything to your card. I've used it for years, as have most others on this site.

The best thing to do is to experiment. I usually have mine start around 30 or 35c, then ramp up in a straight line to whatever I want my maximum temp to be. But you can do it lots of different ways, using many points or whatever you wish. My fans are quiet, so I have a pretty aggressive curve set up. Just mess around with it; you'll see what I'm talking about very shortly.

Here's another example; all I did was move the points.
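For what it's worth, a fan profile is just a curve mapping GPU temperature to fan duty, and the software interpolates between the points you drag. A minimal Python sketch of that interpolation (the profile points below are made-up examples, not recommended values):

```python
def fan_speed(temp_c, curve):
    """Piecewise-linear fan duty (%) for a GPU temperature (C).

    curve: list of (temp_c, duty_pct) points sorted by temperature,
    the same shape a fan-curve editor draws. Below the first point the
    fan holds the minimum duty; above the last point it pins at the max.
    """
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # linear blend between the two surrounding points
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

# example profile: quiet at idle, ramping to 100% duty by 90c
profile = [(35, 20), (60, 40), (80, 70), (90, 100)]
```

With that example profile, 70c lands halfway between the 60c and 80c points, so the fan runs at 55% duty; moving a point just changes which segment the blend follows.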


----------



## VulpesInculta

Quote:


> Originally Posted by *battleaxe*
> 
> This won't do anything to your card. I've used it for years, as have most others on this site.
> 
> The best thing to do is to experiment. I usually have mine start around 30 or 35c then ramp up to whatever I want my maximum temp to be in a straight line. But you can do it lots of different ways using many points or whatever you wish. My fans are quiet so I have a pretty aggressive curve setup. Just mess around with it, you'll see what I'm talking about very shortly.
> 
> Here's another example, I just moved the points here is all I did.


Ah, I see now.

What temperature should I set as the maximum that would be safe for my card? I guess 80c?


----------



## battleaxe

Quote:


> Originally Posted by *VulpesInculta*
> 
> Ah, i see now.
> 
> What temperature should i be setting as maximum one, that would be safe for my card? I guess 80c?


Probably right around there would be good; 75-85c is a good range for these, I think. It depends on your noise threshold, but they can handle 80c easily.


----------



## VulpesInculta

Quote:


> Originally Posted by *battleaxe*
> 
> Probably right around there would be good; 75-85C is a good range for these, I think. It depends on your noise threshold, but they can handle 80C easily.


Do you think getting more than one intake and one exhaust case fan would help my situation?

I was thinking of maybe adding one to the side and another frontal intake.

Not sure if I should add top ones as well.


----------



## battleaxe

Quote:


> Originally Posted by *VulpesInculta*
> 
> Do you think getting more than one intake and one exhaust case fan would help my situation?
> 
> I was thinking of maybe adding one to the side and another frontal intake.


Uh... yeah. What CPU cooler are you using? It sounds like you only have enough airflow to evacuate heat from the CPU alone, so all that GPU heat is trapped inside. It takes a lot of CFM to move 290/X heat out of the PC.


----------



## VulpesInculta

Quote:


> Originally Posted by *battleaxe*
> 
> Uh... yeah. What CPU cooler are you using? It sounds like you only have enough airflow to evacuate heat from the CPU alone, so all that GPU heat is trapped inside. It takes a lot of CFM to move 290/X heat out of the PC.


My CPU cooler is a CM Hyper 212 EVO.

Also, one more question: let's say my R9 290 somehow croaks because I changed the fan speed. That would void my warranty, right?


----------



## gatygun

Quote:


> Originally Posted by *Paul17041993*
> 
> 500 for a pair of 290s is pretty cheap for AU, so I can only assume things are cheaper in your country, Fury X maby? in raw specs a single Fury X is slightly less than a pair of 290's, but I would expect it to perform better on average as you don't have to worry about crossfire profiles. Other option would be a pair of 390s, essentially the same as the 290s but with 8GB VRAM standard and slight efficiency gains.
> 
> Or just something like a 380 and save money for better GPUs next year...


Yeah, kind of looking into a GPU atm. The problem was my fps tanked hard the moment my CPU had to push things forward, because it's an old CPU.

I'll probably pick up a single 390 8GB (as Crossfire was simply overkill anyway), or add some more to the mix and go for a Fury, unless I can find another cheap 290 again, as the card alone was already a performance beast.

Another mate of mine is quitting PC gaming entirely though. He has a spare 980 which he wants to sell for 450, but I'd rather go for a Fury then, I guess, even though it costs a bit more.


----------



## battleaxe

Quote:


> Originally Posted by *VulpesInculta*
> 
> My CPU cooler is a CM Hyper 212 EVO.
> 
> Also, one more question: let's say my R9 290 somehow croaks because I changed the fan speed. That would void my warranty, right?


No, it won't void your warranty, and no, this will not kill your card. AB is totally safe. Now, if you use some other tools, send 1.4V to your card, and then run FurMark, you might just kill your card.









You need to get better airflow; that is 85% of the problem on your rig.


----------



## mfknjadagr8

Quote:


> Originally Posted by *VulpesInculta*
> 
> My CPU cooler is a CM Hyper 212 EVO.
> 
> Also, one more question: let's say my R9 290 somehow croaks because I changed the fan speed. That would void my warranty, right?


No, a fan profile doesn't void anything. Just keep an eye on your temperatures as you lower the fan speed; you don't want it over 85C under heavy load. But don't freak out if it goes above 85C either: these cards can handle a bit more heat than that, though quite a few people have reported the cards throttling at 85C, and my second card refused to even Crossfire when it was above 85. Most of the time you can drop fan speeds and not see a huge difference in temperatures, unless your airflow is non-existent. And yes, MOAR FANS MORE POWAH! Perhaps someone else who has your case will chime in here; if not, the air cooling forum has some experienced guys who have probably used your case and can recommend fans and placements.


----------



## VulpesInculta

Quote:


> Originally Posted by *battleaxe*
> 
> No, it won't void your warranty, and no, this will not kill your card. AB is totally safe. Now, if you use some other tools, send 1.4V to your card, and then run FurMark, you might just kill your card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You need to get better airflow; that is 85% of the problem on your rig.


Quote:


> Originally Posted by *mfknjadagr8*
> 
> No, a fan profile doesn't void anything. Just keep an eye on your temperatures as you lower the fan speed; you don't want it over 85C under heavy load. But don't freak out if it goes above 85C either: these cards can handle a bit more heat than that, though quite a few people have reported the cards throttling at 85C, and my second card refused to even Crossfire when it was above 85. Most of the time you can drop fan speeds and not see a huge difference in temperatures, unless your airflow is non-existent. And yes, MOAR FANS MORE POWAH! Perhaps someone else who has your case will chime in here; if not, the air cooling forum has some experienced guys who have probably used your case and can recommend fans and placements.


Thank you so much, guys. I will look into some fans and experiment with the fan curve once I get them.


----------



## sinnedone

I need help with games not using the GPUs to their full potential.

I'm using two R9 290's in Crossfire and seem to get fluctuations in usage all over the place. Both GPUs will go up to 100%, then down to 25% or so, constantly. I see this in Afterburner and GPU-Z.

I am running a 3770K at 4.5GHz at the moment, and according to Afterburner max usage is 85%, going down to 60% or so.

The card clocks are stable and so is memory; it's just that usage goes up and down constantly, and it's noticeable in games. Just finished GTA 5 and BF4 to double check. I'm using the latest 15.7 drivers as well.

The results are the same with and without overclocking. Any ideas? I know some have mentioned fixes etc. but I can't find the posts right now.

Thanks all


----------



## mfknjadagr8

Resolution? A higher resolution or downsampling will bring your usage up...


----------



## VulpesInculta

I have one more question. While playing The Witcher 3 for about two hours, I got two black screens almost in a row, for about one or two seconds each. Could this indicate some sort of problem? When this happened I tried to open CCC, but it just wouldn't open at the time; I had to wait, and it opened afterwards. The fans were going at 80% like they always do when I play that game, and the temp was about 70C. I have my card plugged in through two cables into a 750W Antec True Power Classic PSU, and I am using the 15.7.1 drivers, if that matters in any way.


----------



## Gumbi

Quote:


> Originally Posted by *VulpesInculta*
> 
> I have one more question. While playing The Witcher 3 for about two hours, I got two black screens almost in a row, for about one or two seconds each. Could this indicate some sort of problem? When this happened I tried to open CCC, but it just wouldn't open at the time; I had to wait, and it opened afterwards. The fans were going at 80% like they always do when I play that game, and the temp was about 70C. I have my card plugged in through two cables into a 750W Antec True Power Classic PSU, and I am using the 15.7.1 drivers, if that matters in any way.


Sounds like a driver crash. Is your overclock stable? Are you using Windows 10? There may still be some minor driver issues with it.


----------



## sinnedone

Quote:


> Originally Posted by *mfknjadagr8*
> 
> resolution? higher resolution or downsampling will bring your usage up...


Oops, sorry, it's at 1440p and normal scaling.

The problem is that when usage goes down, so does the fps. I feel like the demand is there, but for some reason the GPUs aren't responding correctly. I could be totally wrong on this, though.


----------



## gatygun

Well, after selling my 2x 290's I tested out my friend's GTX 980, and I must say the minimums at 1080p are way higher for my older CPU in every game. Witcher 3 was butter smooth, something I didn't experience with even a single 290 on ultra settings. Even the market, where I hit high 40s, is now a rock-solid 60+ fps on ultra.

Using the scaled-down settings I used with my 290, I get a massive boost in framerate everywhere. Hairworks hardly makes a dent in the fps with that card either. Ryse also ran way smoother and had far higher minimums, going from dips into the low 30s to dips into the low 60s; the difference is just insane.

As he wanted 475 euros for that card, I decided against it; I found it too much. I bought a 970 Gaming from somebody on the internet, overclocked it to 980 speeds, and I must say it's about equal now, if not a bit faster than that stock 980, for half its price.

But even at stock the 970 just plays any game I throw at it butter smooth. The thing hardly breaks 60C even while overclocked, without making any sound at all from the fans; the 290 was a hurricane and heated up massively.

The results are the following, my old 290 vs the 970, both heavily OC'ed:

290: 1225/1650

http://www.3dmark.com/fs/5431923

13,220 GPU score
10,590 final score

970: 210/525+

http://www.3dmark.com/fs/5725971

13,875 GPU score
10,937 final score

I instantly got first place for my older CPU segment, so that's nice. I did get a higher run of 13,975 GPU score, but it didn't register, and I didn't bother to push further for 11,000 as I'm already way above second place now.

At stock it's 8,900 vs 9,900 (290 vs 970) final scores.

Anyway, it's clear as day that GameWorks titles heavily favor Nvidia; all the games I play now are butter smooth, while with the 290 there was always some kind of issue. The 290 also seemed to stutter in cities when you first walked through them, but when you backtracked it was gone. The 970 has none of that.

Anyway, pretty happy right now; it gives me a bit more performance but also removes the framerate lows. The smoothness is something weird to experience though, lol.


----------



## Agent Smith1984

Quote:


> Originally Posted by *gatygun*
> 
> Well, after selling my 2x 290's I tested out my friend's GTX 980, and I must say the minimums at 1080p are way higher for my older CPU in every game. Witcher 3 was butter smooth, something I didn't experience with even a single 290 on ultra settings. Even the market, where I hit high 40s, is now a rock-solid 60+ fps on ultra.
> 
> Using the scaled-down settings I used with my 290, I get a massive boost in framerate everywhere. Hairworks hardly makes a dent in the fps with that card either. Ryse also ran way smoother and had far higher minimums, going from dips into the low 30s to dips into the low 60s; the difference is just insane.
> 
> As he wanted 475 euros for that card, I decided against it; I found it too much. I bought a 970 Gaming from somebody on the internet, overclocked it to 980 speeds, and I must say it's about equal now, if not a bit faster than that stock 980, for half its price.
> 
> But even at stock the 970 just plays any game I throw at it butter smooth. The thing hardly breaks 60C even while overclocked, without making any sound at all from the fans; the 290 was a hurricane and heated up massively.
> 
> The results are the following, my old 290 vs the 970, both heavily OC'ed:
> 
> 290: 1225/1650
> 
> http://www.3dmark.com/fs/5431923
> 
> 13,220 GPU score
> 10,590 final score
> 
> 970: 210/525+
> 
> http://www.3dmark.com/fs/5725971
> 
> 13,875 GPU score
> 10,937 final score
> 
> I instantly got first place for my older CPU segment, so that's nice. I did get a higher run of 13,975 GPU score, but it didn't register, and I didn't bother to push further for 11,000 as I'm already way above second place now.
> 
> At stock it's 8,900 vs 9,900 (290 vs 970) final scores.
> 
> Anyway, it's clear as day that GameWorks titles heavily favor Nvidia; all the games I play now are butter smooth, while with the 290 there was always some kind of issue. The 290 also seemed to stutter in cities when you first walked through them, but when you backtracked it was gone. The 970 has none of that.
> 
> Anyway, pretty happy right now; it gives me a bit more performance but also removes the framerate lows. The smoothness is something weird to experience though, lol.


What driver were you running on the 290?

My 390 at those clocks would be scoring about 14,600 graphics....

I run 1200/1750 daily now, and it gets 14,200. I have all but blackballed Fire Strike, as it seems to have no bearing on real-world performance.
I have been running Crysis 3 at 4K, very high, no AA, and averaging 55+ butter-smooth frames per second on this single card.

The 980 is a great card though, and very capable, but it basically competes directly with the 390X series right now, and at a $510 street price the 980 is just flat-out overpriced.

I did almost buy a used MSI Gaming 980 for $400 cash, though, before choosing to go with my 390 at $330 new.


----------



## VulpesInculta

Quote:


> Originally Posted by *Gumbi*
> 
> Sounds like a driver crash. Is your overclock stable? Are you using Windows 10? There may still be some minor driver issues with it.


I haven't overclocked anything at all, and I am using Windows 7.


----------



## Raephen

Quote:


> Originally Posted by *VulpesInculta*
> 
> Thank you so much, guys. I will look into some fans and experiment with the fan curve once I get them.


You own a Define series case, yes? An extra front intake plus the bottom intake should do wonders. A side intake should also help, but I'm thinking you chose the Define for a reason, so whether or not you want to remove that piece of sound deadening I leave up to you.

An extra exhaust fan in the top (the top rear) should also help greatly.

And have you removed the top hard drive cage (if you don't use it)? That should help with intake airflow.

And don't worry about lowering the fan profile of your card. PCS seem to use an aggressive profile in their BIOS: they cool great, but with too much noise for some. Even with a bit less cooling from a lower fan profile, the 70C you mentioned is a far cry from what my reference Sapphire R9 290 did (94C). Like everyone has been telling you: you'll be fine. Just try to stay sub-85C and you'll be golden.


----------



## 4kallday

What thermal paste do you guys all use? I recently switched the thermal paste on my cards (two Asus Radeon R9 290X 4GB OC editions) to Gelid GC-Extreme and gained an additional 2.5 fps in Heaven at 4K. My idle temps dropped by about 10 degrees, but it's hard to say how much of an improvement that is over stock, because before this I was using the paste that came with my CPU cooler; I had to change the paste after I painted my cards. I didn't realise thermal paste alone could have that much of an impact.


----------



## rt123

Quote:


> Originally Posted by *4kallday*
> 
> What thermal paste do you guys all use? I recently switched the thermal paste on my cards (two Asus Radeon R9 290X 4GB OC editions) to Gelid GC-Extreme and gained an additional 2.5 fps in Heaven at 4K. My idle temps dropped by about 10 degrees, but it's hard to say how much of an improvement that is over stock, because before this I was using the paste that came with my CPU cooler; I had to change the paste after I painted my cards. I didn't realise thermal paste alone could have that much of an impact.


Gelid GC-Extreme is one of the best pastes that exist.


----------



## Jflisk

Gelid Extreme here also; can't be beat.


----------



## rdr09

Quote:


> Originally Posted by *gatygun*
> 
> Well, after selling my 2x 290's I tested out my friend's GTX 980, and I must say the minimums at 1080p are way higher for my older CPU in every game. Witcher 3 was butter smooth, something I didn't experience with even a single 290 on ultra settings. Even the market, where I hit high 40s, is now a rock-solid 60+ fps on ultra.
> 
> Using the scaled-down settings I used with my 290, I get a massive boost in framerate everywhere. Hairworks hardly makes a dent in the fps with that card either. Ryse also ran way smoother and had far higher minimums, going from dips into the low 30s to dips into the low 60s; the difference is just insane.
> 
> As he wanted 475 euros for that card, I decided against it; I found it too much. I bought a 970 Gaming from somebody on the internet, overclocked it to 980 speeds, and I must say it's about equal now, if not a bit faster than that stock 980, for half its price.
> 
> But even at stock the 970 just plays any game I throw at it butter smooth. The thing hardly breaks 60C even while overclocked, without making any sound at all from the fans; the 290 was a hurricane and heated up massively.
> 
> The results are the following, my old 290 vs the 970, both heavily OC'ed:
> 
> 290: 1225/1650
> 
> http://www.3dmark.com/fs/5431923
> 
> 13,220 GPU score
> 10,590 final score
> 
> 970: 210/525+
> 
> http://www.3dmark.com/fs/5725971
> 
> 13,875 GPU score
> 10,937 final score
> 
> I instantly got first place for my older CPU segment, so that's nice. I did get a higher run of 13,975 GPU score, but it didn't register, and I didn't bother to push further for 11,000 as I'm already way above second place now.
> 
> At stock it's 8,900 vs 9,900 (290 vs 970) final scores.
> 
> Anyway, it's clear as day that GameWorks titles heavily favor Nvidia; all the games I play now are butter smooth, while with the 290 there was always some kind of issue. The 290 also seemed to stutter in cities when you first walked through them, but when you backtracked it was gone. The 970 has none of that.
> 
> Anyway, pretty happy right now; it gives me a bit more performance but also removes the framerate lows. The smoothness is something weird to experience though, lol.


Yah, the 980 is indeed faster than the 290, especially if your 290 is not performing on par with other 290s; your 1225 core OC should be scoring higher. Must be the CPU; then there is the CPU overhead thing with AMD cards, lol. Now, two 290s on that CPU?


----------



## 4kallday

Quote:


> Originally Posted by *Jflisk*
> 
> Gelid Extreme here also; can't be beat.


I want to switch the paste I'm using for my CPU to the Gelid as well, but I've already used the one tube for both of my GPUs. Do you think there'd be enough left in it for the CPU? Because if I run short, the only other paste I have on hand is Cooler Master IC Value, and if I end up using that I may as well be using toothpaste.


----------



## Paul17041993

Quote:


> Originally Posted by *VulpesInculta*
> 
> I have one more question. While playing The Witcher 3 for about two hours, I got two black screens almost in a row, for about one or two seconds each. Could this indicate some sort of problem? When this happened I tried to open CCC, but it just wouldn't open at the time; I had to wait, and it opened afterwards. The fans were going at 80% like they always do when I play that game, and the temp was about 70C. I have my card plugged in through two cables into a 750W Antec True Power Classic PSU, and I am using the 15.7.1 drivers, if that matters in any way.


I would check to make sure both rails of your PSU are being used correctly; if only one of them were powering the full system, voltage regulation would be very poor and could cause black screens. Otherwise, your VRMs may be overheating (115C is the peak).

Black screens usually occur when the controller on the card shuts the core down because something is too unstable, overheating, or exceeding the PCB's TDP. It can be anything, really, but the intention is that the card is never damaged unless the event is too extreme for the shutdown to occur fast enough. Drivers can occasionally be at fault, if they switch profiles too quickly, send an unstable combination (e.g. a high memory clock with low core voltage), or simply send invalid instructions.


----------



## Archea47

Quote:


> Originally Posted by *4kallday*
> 
> I want to switch the paste I'm using for my CPU to the Gelid as well, but I've already used the one tube for both of my GPUs. Do you think there'd be enough left in it for the CPU? Because if I run short, the only other paste I have on hand is Cooler Master IC Value, and if I end up using that I may as well be using toothpaste.


Less is more with TIM, especially if you've lapped. With Gelid GC Extreme, I suggest getting it warm (I put it in a ziplock bag in a cup of hot water) before application so it spreads easier.


----------



## 4kallday

Quote:


> Originally Posted by *Archea47*
> 
> Less is more with TIM, especially if you've lapped. With Gelid GC Extreme, I suggest getting it warm (I put it in a ziplock bag in a cup of hot water) before application so it spreads easier.


I understand that much; I just want to be sure I'll have enough for even coverage. As for heating it, I've got that covered: I usually heat the tube with a hairdryer for a good while before applying, and it works fine.


----------



## Unknownm

Does anyone know how to get more core voltage in AB? When I run Trixx I'm allowed up to +200, while AB only gives me +100.


----------



## fyzzz

Quote:


> Originally Posted by *Unknownm*
> 
> Does anyone know how to get more core voltage in AB? When I run Trixx I'm allowed up to +200, while AB only gives me +100.


http://forums.overclockers.co.uk/showthread.php?t=18556274 Make another shortcut to MSI AB on your desktop. Then right-click, Properties, and paste this in the target box: "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" /wi6,30,8d,20 (the 20 at the end should give +200mV). MSI AB will not open with this second shortcut; it will only apply the voltage. To get voltage control, you need to open your main MSI AB shortcut.


----------



## OneB1t

Careful: it's 20 HEX, which is 32 DEC, and 32 * 6.25 = 200mV.
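In other words, the last argument of that Afterburner shortcut is a hex step count at 6.25 mV per step, as noted above. A quick sanity-check sketch of the conversion (illustrative only; verify the step size against your own card):

```python
# Convert between the hex value at the end of the "/wi6,30,8d,20" shortcut
# argument and a core-voltage offset in mV, assuming 6.25 mV per step.
STEP_MV = 6.25

def hex_to_mv(hex_value: str) -> float:
    """Voltage offset in mV for a given hex step count."""
    return int(hex_value, 16) * STEP_MV

def mv_to_hex(offset_mv: float) -> str:
    """Hex step count to request a given voltage offset in mV."""
    return format(round(offset_mv / STEP_MV), 'x')

print(hex_to_mv('20'))  # 200.0 -> the +200mV the shortcut above applies
print(mv_to_hex(100))   # '10'  -> hex value for a +100mV offset
```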


----------



## VulpesInculta

Quote:


> Originally Posted by *Raephen*
> 
> You own a Define series case, yes? An extra front intake plus the bottom intake should do wonders. A side intake should also help, but I'm thinking you chose the Define for a reason, so whether or not you want to remove that piece of sound deadening I leave up to you.
> 
> An extra exhaust fan in the top (the top rear) should also help greatly.
> 
> And have you removed the top hard drive cage (if you don't use it)? That should help with intake airflow.
> 
> And don't worry about lowering the fan profile of your card. PCS seem to use an aggressive profile in their BIOS: they cool great, but with too much noise for some. Even with a bit less cooling from a lower fan profile, the 70C you mentioned is a far cry from what my reference Sapphire R9 290 did (94C). Like everyone has been telling you: you'll be fine. Just try to stay sub-85C and you'll be golden.


Yes, I own a Define R4. I did choose it to minimize the sound of the air cooling, but if I could provide enough cooling with a couple of low-ish rpm 140mm fans, I think it would still be much better than my current situation, because the card is getting pretty loud at the constant 80% fan speed it runs at now. I have removed the bottom hard drive cage; the top one can't be removed, I think.
Quote:


> Originally Posted by *Paul17041993*
> 
> I would check to make sure both rails of your PSU are being used correctly; if only one of them were powering the full system, voltage regulation would be very poor and could cause black screens. Otherwise, your VRMs may be overheating (115C is the peak).
> 
> Black screens usually occur when the controller on the card shuts the core down because something is too unstable, overheating, or exceeding the PCB's TDP. It can be anything, really, but the intention is that the card is never damaged unless the event is too extreme for the shutdown to occur fast enough. Drivers can occasionally be at fault, if they switch profiles too quickly, send an unstable combination (e.g. a high memory clock with low core voltage), or simply send invalid instructions.


As far as I know, my Antec TP-750C PSU has two rails: one for the PCI-E stuff and another for everything else. So I should be using them properly, right? I don't really know how to check that. Also, the temperature shown in CCC never went beyond 71C.


----------



## gatygun

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What driver were you running on the 290?
> 
> My 390 at those clocks would be scoring about 14,600 graphics....
> 
> I run 1200/1750 daily now, and it gets 14,200. I have all but blackballed Fire Strike, as it seems to have no bearing on real-world performance.
> I have been running Crysis 3 at 4K, very high, no AA, and averaging 55+ butter-smooth frames per second on this single card.
> 
> The 980 is a great card though, and very capable, but it basically competes directly with the 390X series right now, and at a $510 street price the 980 is just flat-out overpriced.
> 
> I did almost buy a used MSI Gaming 980 for $400 cash, though, before choosing to go with my 390 at $330 new.


I upgraded my 580 to a 290 because I wanted the 4GB of VRAM; my 580 didn't cut it anymore because of its jarring 1.5GB VRAM (should have bought the 3GB model). The performance increase was nice, especially once I OC'ed my CPU. But I found I still had sub-40 fps lows on higher settings (if not any settings) in games (Crysis 3 / Witcher 3 with 43 fps lows / BF4 with 45 fps lows / Ryse with 35 fps lows).

I thought I was just out of GPU power and bought a second GPU, but found the low fps stayed. It was nice to have for higher resolutions and filtering, though; totally overkill for what I needed it for. Crossfire also added extra issues, especially in games I played such as Witcher 3, which made it simply unplayable for me: some kind of judder and stutter.

I always had a weird loading stutter in Witcher 3 when I visited a town; once I passed an area and came back to it, it was smooth, as it had probably finished loading (I blamed my SSD for that). But with the 980 I found out AMD was actually the issue, as it was completely gone.

A mate of mine gave me a good two hours with his 980 (stock version, no OC, nothing) and my eyes opened to how smooth everything simply was compared to my 290.

The reason you probably have zero issues is that you pack a far better CPU than me; my CPU is clearly getting hammered by the AMD GPU. I just thought my CPU wasn't up to the task anymore, but the Nvidia card basically proved me wrong.

So I decided to look for a replacement card. From what I heard, Nvidia wouldn't be coming out with a new GPU series soon, or at least not within a month or so. So it was either a 960, 970 or 980. The 980 was only about 15% ahead of the 970 in benchmarks, and the 970 has the same tech as the 980 apart from its crap-tier VRAM architecture, at half the price; I found the Gaming model would overclock to 980 levels without much trouble. Bought one second hand for 240 euros with 18 months of warranty left, overclocked it, and now get roughly stock 980 G1 Gaming performance out of it.

It's just as fast, for sure, if not a bit faster than his stock 980, so I'm happy with it.

I'm just shocked at how big the CPU overhead difference is with AMD cards. I expected it to be like a ~5 fps difference, but in some games my fps lows almost doubled: Ryse 35 > 60, BF4 45 > 70, Witcher 3 45 > 61.

A pretty drastic improvement. It's also faster than my 290, that's for sure.

The only thing I dislike about this card is its weaker VRAM. I'd far rather have bought the 390 you got, as this card probably won't last long. But I'll most likely sell it off within a year anyway.
Quote:


> Originally Posted by *rdr09*
> 
> Yah, the 980 is indeed faster than the 290, especially if your 290 is not performing on par with other 290s; your 1225 core OC should be scoring higher. Must be the CPU; then there is the CPU overhead thing with AMD cards, lol. Now, two 290s on that CPU?


That 1225 is also something I can't sustain: it's a tornado in my room sound-wise with a 100% fan profile, and it gives me glitches over longer play sessions. I could only sustain 1145/1400 for long periods, still with a lot of added fan noise and heat buildup.

I know about the CPU overhead on AMD cards, but I was honestly shocked at how much it was with my older CPU. Yeah, I know two 290's on that CPU wasn't a smart thing to do, haha. But I wanted higher resolutions with better filtering, and in the end I found I honestly never use the raw performance.


----------



## Raephen

Quote:


> Originally Posted by *VulpesInculta*
> 
> Yes, I own a Define R4. I did choose it to minimize the sound of the air cooling, but if I could provide enough cooling with a couple of low-ish rpm 140mm fans, I think it would still be much better than my current situation, because the card is getting pretty loud at the constant 80% fan speed it runs at now. I have removed the bottom hard drive cage; the top one can't be removed, I think.


I meant the rack of five 3.5" drive mounts, below the two 5.25" bays and on top of the three lower 3.5" mounts.

I've used a Define Mini for an HTPC setup, and it looks pretty similar (only smaller).

You should try making a custom GPU fan profile in MSI Afterburner which tops out at 60% max (or even 50%).

Then play a game or run a benchmark like Valley to see what your temps do. There is no risk of damaging your card if you do that: when the card gets too hot, it will reduce its clock speed to compensate (throttle). When you notice that happening, just raise the max fan speed a bit to see if it helps.


----------



## cokker

Quote:


> Originally Posted by *sinnedone*
> 
> I need help with games not using the GPUs to their full potential.
> 
> I'm using two R9 290's in Crossfire and seem to get fluctuations in usage all over the place. Both GPUs will go up to 100%, then down to 25% or so, constantly. I see this in Afterburner and GPU-Z.
> 
> I am running a 3770K at 4.5GHz at the moment, and according to Afterburner max usage is 85%, going down to 60% or so.
> 
> The card clocks are stable and so is memory; it's just that usage goes up and down constantly, and it's noticeable in games. Just finished GTA 5 and BF4 to double check. I'm using the latest 15.7 drivers as well.
> 
> The results are the same with and without overclocking. Any ideas? I know some have mentioned fixes etc. but I can't find the posts right now.
> 
> Thanks all


I have a similar thing happening here; I don't think it's anything to worry about. L4D2 is a funny one for me: sometimes, for about 10-20 seconds, MSI reports the GPU as not being used, but my FPS is fine. GTA 5 does it too, but only for a split second.


----------



## VulpesInculta

Quote:


> Originally Posted by *Raephen*
> 
> I meant the rack of five 3.5" drive mounts, below the two 5.25" bays and on top of the three lower 3.5" mounts.
> 
> I've used a Define Mini for an HTPC setup, and it looks pretty similar (only smaller).
> 
> You should try making a custom GPU fan profile in MSI Afterburner which tops out at 60% max (or even 50%).
> 
> Then play a game or run a benchmark like Valley to see what your temps do. There is no risk of damaging your card if you do that: when the card gets too hot, it will reduce its clock speed to compensate (throttle). When you notice that happening, just raise the max fan speed a bit to see if it helps.


Yep, I got that rack removed because I only have one hard drive. I will definitely do the custom fan profile, but I want to get a couple more fans before that, just to help the airflow a bit, since as I said I only have one frontal intake and one exhaust on the back. My PSU has a lot of cables, so I don't think I will be able to put a fan on the bottom. So I was thinking: maybe add one more intake on the front, one on the side panel (since I have the non-windowed version), and maybe one more exhaust at the top back?


----------



## Raephen

Quote:


> Originally Posted by *VulpesInculta*
> 
> Yep, i got that rack removed because i only have one hard drive. I will definitely do the custom fan profile but i want to get a couple more fans before that, just to help the airflow a bit, since as i said i only have one frontal intake and one exhaust on the back. My psu has a lot of cables so i don't think i will be able to put a fan on the bottom, so i was thinking maybe add one more intake on the front, one on the side panel because i have the non-windowed version and maybe one more exhaust in the top back side?


Aye, one extra out in the back top would help airflow a lot.

So two in the front, one in the bottom or side + two out (rear and top rear) would give you decent airflow on the slightly positive side (positive pressure helps keep dust out of all the unfiltered nooks and crannies in a case).

When building performance systems, I've always preferred more fans at lower rpm rather than one or two at higher speed.

Ha! My Kabini-based HTPC even has 3 fans (1x 200mm + 2x 80mm) at very low speed just to make sure air moves through the system. You could say I'm addicted to airflow.









And you don't have to run your case fans that fast either - another reason I prefer more fans instead of fast fans: 3 140mm at low rpm (5 to 7 volt) move more fresh air in than a single 140mm in the 7 to 12 volt range.

You only use one storage device in the front bottom rack? Take out the two unused trays and use the room for storing your unused psu cables. That and a fan grill should make it possible to use the bottom fan position (if your psu isn't too long, that is).
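The intake/exhaust balance described above can be sanity-checked with simple arithmetic: sum the rated airflow of the intakes and exhausts and aim for a slightly positive result. A small Python sketch, where the CFM figures are made-up placeholders (real numbers come from fan spec sheets and drop when you undervolt fans to 5-7 V):

```python
# Rough case-pressure balance check for the fan layout discussed above.
# CFM values are hypothetical placeholders, not measurements.

def pressure_balance(intake_cfm, exhaust_cfm):
    """Positive result = slightly positive case pressure (keeps dust out)."""
    return sum(intake_cfm) - sum(exhaust_cfm)

# Two front intakes + one bottom intake vs rear + top-rear exhaust:
balance = pressure_balance([45, 45, 40], [50, 50])
# balance > 0, i.e. slightly positive pressure, as recommended
```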


----------



## VulpesInculta

Quote:


> Originally Posted by *Raephen*
> 
> Aye, one extra out in the back top would help airflow a lot.
> 
> So two in the front, one in bottom or side + two out (rear and top rear) would give you a more decent airflow on the slightly positive side (positive airflow help keep dust out from all the unfiltered nooks and crannies in a case).
> 
> With building performance systems, I've always preferred using more fans at lower rpm rather than one or two at higher speed.
> 
> Ha! My Kabini based HTPC even has 3 fans (1x 200mm + 2 80mm) at very low speed just to make sure air moves through the system. You could say I'm addicted to airflow
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And you don't have to run your case fans that fast either - another reason I prefer more fans instead of fast fans: 3 140mm at low rpm (5 to 7 volt) move more fresh air in than a single 140mm in the 7 to 12 volt range.
> 
> You only use one storage device in the front bottom rack? Take out the two unused trays and use the room for storing your unused psu cables. That and a fan grill should make it possible to use the bottom fan position (if your psu isn't too long, that is).


Okay, so just to make sure: the two fans on the front and the one on the side (or bottom if I can fit it) are intake, and the ones in the top back slot and the back slot are exhaust, right? Don't wanna mess this one up as I am a noob at that kinda stuff. 2000-ish rpm shouldn't be too loud, right?


----------



## Raephen

Quote:


> Originally Posted by *VulpesInculta*
> 
> Okay so just to make sure, two fans on the front and one on the side(or bottom if i can fit it) are intake, and the one on the top back slot and the back slot are exhaust ones, right? Don't wanna mess this one up as i am a noob in that kinda stuff. 2000-ish rpm shouldn't be too loud right?


Aye, that fan setup is what I would suggest.

Except I would stay away from running any fan over 1000 rpm, let alone going toward 2000.

I dislike running any fan over 1000 rpm, and any fan - even the silent ones - is noticeable when run at full speed.

It's not just about the noise a fan makes, but also its tonality, if that makes any sense to you.

My suggestion would be to get fans with a max speed of 1500 rpm or so and run them just below 1000 rpm - either through motherboard headers (if you have enough), a fan controller if you have one, or just 5 or 7 volt molex-to-fan adapters.

The Gelid Silent fans are good, inexpensive fans that would do well with your Fractal stock fans.
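The 5/7-volt trick mentioned above works because fan speed scales roughly with voltage above the startup threshold. A hedged Python sketch of that rule of thumb; the linear approximation and the 4 V startup figure are assumptions, not datasheet guarantees:

```python
# Rough RPM estimate for an undervolted fan, assuming RPM scales
# approximately linearly with voltage above a startup threshold.
# This is a rule-of-thumb sketch, not a datasheet model.

def rpm_at_voltage(rated_rpm, volts, rated_volts=12.0, start_volts=4.0):
    """Return approximate RPM, or 0 if below the (assumed) startup voltage."""
    if volts < start_volts:
        return 0
    return rated_rpm * volts / rated_volts

# A 1500 rpm fan on the 7 V molex trick (fan wired between 12 V and 5 V rails):
rpm_at_voltage(1500, 7)  # about 875 rpm, just under the ~1000 rpm comfort limit
```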


----------



## VulpesInculta

Quote:


> Originally Posted by *Raephen*
> 
> Aye, that fan setup is what I would suggest.
> 
> Except, I would stay away from running any fan over 1000 rpm, let alone going toward 2000.
> 
> I dislike running any fan over 1000 rpm and any fan - even the silent ones, is noticeable when ran at full speed.
> 
> It's not just about the noise a fan makes, also it's tonality, if that makes any sense at all you
> 
> My suggestion would be getting fans that have a max speed of 1500 or so and run them just below 1000 rpm - either through the motherboard headers (if you have enough), a fan controller if you have one or just 5 or 7 volt molex to fan plugs.
> 
> The Gelid Silent fans are good, inexpensive fans that would do well with your Fractal stock fans.


Thanks a lot for the help, I will see what I can find on the local market.


----------



## sinnedone

Trying to figure out my overclocks here on a pair of 290's and just wanted to know: are these good scores, and are they scaling well with core clock?









This is what I have so far on Firestrike scores:

15570 at 1000/1300

15931 at 1050/1300

16559 at 1150/1300

Haven't really tested the 1100/1300 too much for stability as I only added 50mV. Going to test overclocks further, as anything past 1050 on the core needs extra voltage.

Oh, forgot to add: this is with a 3770K at 4.5GHz.
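The scores above can be turned into a rough scaling check: compare the relative score gain to the relative clock gain. A Python sketch using the posted numbers (the helper name and the "1.0 = perfect scaling" framing are mine, not a standard metric):

```python
# How well do the posted Firestrike scores scale with core clock?
# Data from the post above: (core MHz, Firestrike score) at 1300 MHz memory.

results = [(1000, 15570), (1050, 15931), (1150, 16559)]

def scaling_efficiency(base, oc):
    """Score gain relative to clock gain; 1.0 would be perfect linear scaling."""
    (clk0, score0), (clk1, score1) = base, oc
    score_gain = score1 / score0 - 1   # fractional score increase
    clock_gain = clk1 / clk0 - 1       # fractional clock increase
    return score_gain / clock_gain

eff = scaling_efficiency(results[0], results[-1])
# eff is well below 1.0: a combined Firestrike score also depends on the
# CPU-bound physics test, so GPU clock gains never translate one-to-one.
```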


----------



## kizwan

Quote:


> Originally Posted by *sinnedone*
> 
> Trying to figure out my overclocks here on a pair of 290's and just wanted to know if these are good scores and are they scaling well with core clocks?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This is what I have so far on Firestrike scores:
> 
> 15570 at 1000/1300
> 
> 15931 at 1050/1300
> 
> 16559 at 1100/1300
> 
> Haven't really tested the 1100/1300 to much for stability as I only added 50mv. Going to be testing overclocks further as anything past 1050 on the core needs extra voltage.
> 
> OH forgot to add this is with a 3770k at 4.5 ghz


Your 1100/1300 score is pretty similar to mine.

http://www.3dmark.com/compare/fs/5736881/fs/5608368

I just wish I could sort my results by clocks. I have to open the results one by one to find out the clocks.

*EDIT:* I just realized yours is actually running at 1150/1300.


----------



## Arizonian

Quote:


> Originally Posted by *sledgehammer1990*
> 
> 1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
> 2. Manufacturer & Brand - Please don't make me guess.
> 3. Cooling - Stock, Aftermarket or 3rd Party Water
> 
> Finally picked up a 290X yay! Way better than my 750 Ti.
> 
> Validation Link
> 
> Sapphire R9 290X
> 
> Tri-X Cooling


Congrats - added


----------



## sinnedone

Quote:


> Originally Posted by *kizwan*
> 
> Your 1100/1300 score pretty much similar to mine.
> 
> http://www.3dmark.com/compare/fs/5736881/fs/5608368
> 
> I just wish I can sort my result based on the clocks. I have to open the results one by one to find out the clocks.
> 
> *EDIT:* I just realized yours actually running at 1150/1300.


Oops, fixed.









raid 0 too lol .

So I'm in the ballpark then. I just wanted to know because I was having an issue where the load on my cards was sporadic. I figured it out, so I think I'm good now as far as the actual performance I should be getting.


----------



## kizwan

Quote:


> Originally Posted by *sinnedone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Your 1100/1300 score pretty much similar to mine.
> 
> http://www.3dmark.com/compare/fs/5736881/fs/5608368
> 
> I just wish I can sort my result based on the clocks. I have to open the results one by one to find out the clocks.
> 
> *EDIT:* I just realized yours actually running at 1150/1300.
> 
> 
> 
> oops fixed
> 
> 
> 
> 
> 
> 
> 
> 
> 
> raid 0 too lol .
> 
> So I'm in the ballpark then. I just wanted to know because i was having an issue where the load on my cards was sporadic. I figured it out so I think Im good now as far as actual performance I should be getting.
Click to expand...

RAID 0 FTW!


----------



## fyzzz

So happy with the custom loop so far. 1200/1650 is stable with +87mV (it worked with +75mV in Firestrike and other benchmarks, but not BF4), and with +93mV I can run 1210/1700 without artifacts in Firestrike. I couldn't even reach 1200 on air before (using the 390 BIOS). But I think I will go with 1160/1550 at +25mV, because I want to keep things quiet and cool.


----------



## uscool

My MSI 290 Gaming was reaching 90°C in Heaven 4.0 with the fans hitting 100% on auto. I reapplied paste using MX-4 and now it maxes out at 81°C with the auto fan at 61%.


----------



## battleaxe

Does anyone know the VRM2 location on the new Sapphire 290x Tri-X card?

It's different from the older cards. VRM1 is in the same place, but there are several VRMs spread around the board, and I've been having fits trying to get the temps down on VRM2, so I think I'm missing something somewhere...

+1's await whoever knows this. Thanks!


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> Does anyone know the VRM2 location on the new Sapphire 290x Tri-X card?
> 
> Its different than the older cards. VRM1 is in the same place, but there's several VRM's spread around the board, and I've been having fits trying to get the temps down on VRM2, so I think I'm missing something somewhere...
> 
> +1's await to whoever knows this. Thanks!


Okay, so here is the old one:



And here is the new one:



A lot more different than I thought it would be....

I suspect VRM temps are hotter on the new version because they went from 6+8 to 8+8 power delivery.
Not to mention, the VRM section is one of the only weak points of the Tri-X cooler. Even my older version of that card could get toasty on the VRMs when overvolting.


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Okay, so here is the old one:
> 
> 
> 
> And here is the new one:
> 
> 
> 
> A lot more different than I thought it would be....
> 
> I suspect VRM temps are hotter on the new version because they went from 6+8 to 8+8 power delivery. .
> Not to mention, the VRM is one of the only weak points of the Tri-X cooler. Even my older version of that card could get toasty on the VRM's when overvolting.


+1 as promised. I took off the air cooler and put an H80 on the core. The core is now down about 15°C overall, and VRM1 is down about 20°C overall. VRM2, however, is still hotter than I want it. I have individual coolers on all the VRM spots, even a few that were not being cooled by the stock cooler and looked suspect to me. Interestingly, my memory can now OC to 1670MHz, where before the best I could get with stability was about 1575. So the memory is definitely appreciating the individual coolers, as opposed to being heated up by the stock block (by the core, I suspect). The core is also able to clock higher now, as I got 1235MHz on the core, I assume from lowering the core temps. Also, the card seems to need less voltage to do all this. I have not even messed with Trixx; all of this was done in AB using no more than +100mV.

After about 30 minutes running Heaven, here are my temps with +100mV (I realize these are acceptable, but I'd like to get VRM2 lower as it controls memory):

Core: 73c
VRM1: 73c
VRM2: 76c

Seems in the past VRM2 could run much cooler than this, so I want to figure out where it is... so far I'm doing no better on VRM2 than the stock cooler, while everything else is a lot cooler overall.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> +1 as promised. I took off the air cooler and put an H80 on the core. The core is now down about 15c overall. The VRM1 is down about 20c overall. VRM2 however is still hotter than I want it. I have individual coolers on all the places that are VRM's even a few that were not being cooled by the stock cooler that looked suspect to me. Interestingly, my memory can now OC to 1670mhz where before the best I could get was about 1575 with stability. So memory is definitely appreciating the individual coolers as opposed to being heated up by the stock block (from the core I suspect). The core also is able to clock higher now as I got 1235mhz on core I assume by lowering the core temps. Also, the card seems to need less volts to do all this. I have not even messed with Trixx, all done on AB using no more than 100mv extra.
> 
> After about 30 min running heaven here's my temps with +100mv: (I realize these are acceptable, but I'd like to get VRM2 lower as it controls memory)
> 
> Core: 73c
> VRM1: 73c
> VRM2: 76c
> 
> Seems in the past the VRM2 could be much lower than this, so I want to figure out where it is... my efforts thus far are that I'm not doing any better on VRM2 vs the stock cooler, while everthing else is a lot cooler overall.


Those overclocking results are great, but I'm surprised to see it with those temps!

When I had the previous version of the Tri-X, my VRM1 was around 85-90°C with +100mV, but my VRM2 was generally about 60°C... very strange to see yours so high.

Are you using a Kraken as a mounting bracket for the H80 or did you go a different route?

Do you have a fan blowing directly on the heatsinks you added to the VRM's?


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Those overclocking results are great, but I'm surprised to see it with those temps!
> 
> When I had the previous version of the Tri-X my VRM1 was around 85-90C with 100mv, but my VRM2 was generally about 60C.... very strange to see yours so high....
> 
> Are you using a Kraken as a mounting bracket for the H80 or did you go a different route?
> 
> Do you have a fan blowing directly on the heatsinks you added to the VRM's?


No Kraken; it's a strap over the top of the cooler head using long threaded rods.

Yes, I'm using a fan to cool the VRM heatsinks. My next thought is to maybe experiment with some higher quality thermal pads.


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> No on kraken, strap over top of cooler head using long threaded rods.
> 
> Yes, using fan to cool vrm heatsinks. My next thought is to maybe experiment using some higher quality thermal pads.


Could we get a pic?

Very curious to see the heat sinks.

Are you using the peel-off adhesive back kind? (like the zalman ones?)


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Could we get a pic?
> 
> Very curious to see the heat sinks.
> 
> Are you using the peel-off adhesive back kind? (like the zalman ones?)


I don't have any way to get a pic right now. I only have the Zalman sinks on the RAM; there's a lot more metal sunk to the VRMs. I'll try to adjust the VRMs/thermal pads and get some pics up soon.


----------



## Translator

Sapphire R9 290 Noctua edition))
http://i-fotki.info/19/d141d6973df016069eaacb583298990b25e5ff222501871.jpg.html


----------



## Agent Smith1984

Quote:


> Originally Posted by *Translator*
> 
> Sapphire R9 290 Noctua edition))
> http://i-fotki.info/19/d141d6973df016069eaacb583298990b25e5ff222501871.jpg.html


Too small of a pic bud.


----------



## JourneymanMike

Quote:


> Originally Posted by *Translator*
> 
> Sapphire R9 290 Noctua edition))
> http://i-fotki.info/19/d141d6973df016069eaacb583298990b25e5ff222501871.jpg.html


Roots, Rock, Reggae!

Saw Bob Marley and the Wailers at the Orpheum theater, Madison Wisconsin, in 1978...

I remember some of it!


----------



## Paul17041993

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Okay, so here is the old one:
> 
> 
> 
> And here is the new one:
> 
> 
> 
> A lot more different than I thought it would be....
> 
> I suspect VRM temps are hotter on the new version because they went from 6+8 to 8+8 power delivery. .
> Not to mention, the VRM is one of the only weak points of the Tri-X cooler. Even my older version of that card could get toasty on the VRM's when overvolting.


Under the PCB?


----------



## battleaxe

Quote:


> Originally Posted by *Paul17041993*
> 
> under the PCB?


That's a good idea; I never thought to look there. I know there are no VRM coolers behind the PCB at all, and there's no back plate on this card either, but that would explain why VRM2 doesn't seem to respond at all to cooling the VRMs. I'll have a look there today. Thanks.


----------



## ptrkhh

I got an offer to exchange my Asus DirectCU 290 for a Sapphire Tri-X OC 290 for €50; do you think it's worth it?


----------



## battleaxe

Quote:


> Originally Posted by *ptrkhh*
> 
> I got an offer to exchange my Asus DirectCU 290 for a Sapphire Tri-X OC 290 for €50, do you think its worth it?


I would say yes. Is your Asus dead?


----------



## ptrkhh

Quote:


> Originally Posted by *battleaxe*
> 
> I would say yes. Is your asus dead?


No, it's running perfectly. That's why I'm asking; €50 is quite steep for a cooler upgrade, isn't it?


----------



## battleaxe

Quote:


> Originally Posted by *ptrkhh*
> 
> No, its perfectly running. That's why I'm asking, €50 is quite steep for a cooler upgrade isn't it?


Probably not, then.


----------



## By-Tor

Quote:


> Originally Posted by *ptrkhh*
> 
> I got an offer to exchange my Asus DirectCU 290 for a Sapphire Tri-X OC 290 for €50, do you think its worth it?


I would say yes.

The Sapphire 290X Tri-X OC is a DirectX 12 card.


----------



## battleaxe

Quote:


> Originally Posted by *By-Tor*
> 
> I would say yes.
> 
> The Sapphire 290x Tri-X OC is a direct-x 12 card.


Good point. I didn't realize this even though I have one of the new ones... Might be worth it seeing that... Glad I already have one.










So does anyone know: if a new card and an old card are crossfired together, will I get the full benefits of DX12 in crossfire?


----------



## Streetdragon

All 290/290X chips are DX12 ready! So you can crossfire any 290 and 290X together. Would look cool... so many colors xD


----------



## By-Tor

Quote:


> Originally Posted by *Streetdragon*
> 
> all 290 290x chips are DX12 ready! So you can crossfire all 290 and 290x together. Would look cool.... so much colors xD


Ready meaning?

Both mine say 11.2.


----------



## Agent Smith1984

Quote:


> Originally Posted by *By-Tor*
> 
> Ready meaning?
> 
> Both mine say 11.2.


They are all 12_0 compatible.

They won't get the full benefits of 12_1, but they will give all the benefits that make the most improvements to performance.
Only Maxwell will have full 12_1 compliance.

Please read this to understand completely:
http://www.extremetech.com/extreme/207598-demystifying-directx-12-support-what-amd-intel-and-nvidia-do-and-dont-deliver
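The distinction above boils down to a feature-level table. A hedged Python sketch summarizing it, following the linked ExtremeTech article; the table entries are illustrative, not an exhaustive or authoritative compatibility list:

```python
# Illustrative D3D12 feature-level summary (per the linked article);
# entries are examples only, not an authoritative table.

FEATURE_LEVEL = {
    "R9 290 / 290X (GCN)": "12_0",   # full DX12 API at feature level 12_0
    "GTX 900 (Maxwell 2)": "12_1",   # additionally exposes 12_1 features
}

def feature_level(gpu):
    """Return the D3D12 feature level for a known GPU, else 'unknown'."""
    return FEATURE_LEVEL.get(gpu, "unknown")
```

So a 290/290X still runs DX12 titles and gets the headline performance benefits; it just lacks the extra 12_1 rendering features.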


----------



## By-Tor

Quote:


> Originally Posted by *Agent Smith1984*
> 
> They are all 12_0 compatible.
> 
> They won't get the full benefits of 12_1, but they will give all the benefits that make the most improvements to performance.
> Only Maxwell will have full 12_1 compliance.
> 
> Please read this to understand completely:
> http://www.extremetech.com/extreme/207598-demystifying-directx-12-support-what-amd-intel-and-nvidia-do-and-dont-deliver


Understood... thanks, Agent!

But unless you are running Windows 10 you won't get anything from DX12...


----------



## Agent Smith1984

Quote:


> Originally Posted by *By-Tor*
> 
> Understand... Thanks Agent!!
> 
> But unless you are running windows 10 you won't get anything from DX 12..


Correct, and I advise anyone who has not done so to upgrade.

It has been excellent for me so far. Performance improvements, usage improvements, smoothness, benching increases.... just overall really happy with it so far...


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Correct, and I advise anyone who has not done so, to upgrade.
> 
> It has been excellent for me so far. Performance improvements, usage improvements, smoothness, benching increases.... just overall really happy with it so far...


I can second that. I love Win10 too. Very nice.


----------



## By-Tor

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Correct, and I advise anyone who has not done so, to upgrade.
> 
> It has been excellent for me so far. Performance improvements, usage improvements, smoothness, benching increases.... just overall really happy with it so far...


I had it installed on an extra SSD to check it out, and it was giving me some issues in the 2 games I play (Witcher 3 and BF4). Plus, coming from Windows 7, I enjoyed the gadgets feature that Windows 8 did away with, and 10 is still without it. I run HWiNFO64 and use the HWiNFO monitor gadget to monitor the temps, usage, and pump RPMs of my system on my desktop.

May mess with it a little more, but happy with 7...


----------



## Agent Smith1984

Quote:


> Originally Posted by *By-Tor*
> 
> I had it installed on an extra SSD to check it out and it was giving me some issues in the 2 games I play (Witcher 3 and BF4), plus coming from windows 7 I enjoyed the gadgets feature that windows 8 did away with and 10 is still without them. I run HWinfo64 and use the HWinfo monitor gadget to monitor my temps, usage and pump RPM's of my system on my desktop.
> 
> May mess with it a little more, but happy with 7...


Understandable.

I have had no issues at all with BF4, but I do not own W3.

I actually did the upgrade path, and everything migrated over beautifully except for CPU-Z/HWMonitor. Not sure why those disappeared, but everything else was perfect.
All my file settings, system settings, games, favorites... all of it. I was completely shocked to see Origin automatically log in and both Crysis 3 and BF4 launch without issue.









Every other "upgrade" I had done to windows in the past ended up being a "screw this, I am wiping this drive, and doing a fresh install, cause almost everything is broken now"


----------



## By-Tor

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Understandable.
> 
> I have had no issues at all with BF4, but I do not own W3.
> 
> I actually did the upgrade thing, and everything migrated over beautifully except for CPU-Z/HWMonitor. Not sure why those dissapeared, but everything else was perfect.
> All my file settings, system settings, games, favorites.... all of it. I was completely shocked to see Origin automatically login and both Crysis 3 and B4 launch without issue.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Every other "upgrade" I had done to windows in the past ended up being a "screw this, I am wiping this drive, and doing a fresh install, cause almost everything is broken now"


I didn't want to mess with my main SSD and my 7 install so I installed 7 on an extra Samsung 250gb SSD and then upgraded to 10 after all the updates were installed to 7.

The HW monitor gadget is gone because 10 does not support gadgets, and that's what it is. I have read someplace about a workaround for gadgets, but I'm not interested in messing with it now...

PS: When are we going to get some cooler weather down here???


----------



## cruisx

Hey guys, so my friend bought a Sapphire 280X off someone and it was artifacting like crazy. We tried all the tricks and finally ended up RMAing it last week and got a replacement 280X. After some testing, the new card also shows artifacts, though not as bad as the first.

Fast forward to today: Sapphire is doing us a favor and asking what replacement card we want; we might just need to pay a small part of the difference. My friend and I talked it over and asked them for a Sapphire 290 Tri-X. Do you guys think that was the best card to ask for?


----------



## By-Tor

Well, the best would be a Fury X, Fury, or 390X, but the 290X may be the best choice of what they would let you pick over a 280X.


----------



## Agent Smith1984

Quote:


> Originally Posted by *By-Tor*
> 
> I didn't want to mess with my main SSD and my 7 install so I installed 7 on an extra Samsung 250gb SSD and then upgraded to 10 after all the updates were installed to 7.
> 
> HW monitor is gone because 10 does not support gadgets and that's what it is. I have read someplace about a work around for gadgets, but not interested in messing with it now...


Oddly, HWiNFO still worked perfectly, though it did have to gather all the sensor information over again. Anyway, when you do get a chance, Win 10 is definitely worth getting to know.
I hated 8, and was somewhat happy with the revisions they made to 8.1, but 10 really made me feel good about Microsoft again (then again, it's very early).

Quote:


> PS: When are we going to get some cooler weather down here???


I dunno man, I'm ready for my 69°F indoor winter temps, as opposed to the 74°F+ I struggle to keep right now...


----------



## By-Tor

HWInfo worked fine for me also, but HW Monitor won't work in 8 or 10.

Can't wait for much cooler weather... anything sub-70s is fine...


----------



## Agent Smith1984

Quote:


> Originally Posted by *cruisx*
> 
> Hey guys so my friend bought a Sapphire 280X of someone and it was artifacting like crazy. We tried all the tricks and finally ended up RMAing it last week and got a replacement 280x. After some testing the new card also shows artifacts thought not as bad as the first.
> 
> Fast forward today Sapphire is doing us a favor and is asking what replacement card we want? We might just need to pay a tiny bit of the difference. My friend and I talked about it and asked them to give him a Sapphire 290 Tri-x. Do you guys think that was the best card to ask for?


If they do an even swap to even a Tri-X R9 290, you come out a winner (though I dunno if I'd settle for a reference card or not).

Something about the 280X series from Sapphire and Asus was just a nightmare... SO MANY people had artifacting problems across several cards. It was always believed to be a memory issue, but who knows for sure (though that's all that makes sense, since the cores were rebranded Tahitis from the 7970 series)?

I was so relieved to find that the 390 series did not suffer from the same problems....

If you can score a Nitro 390 from them, even with a few bucks on the table, I think you'd be really happy with it.


----------



## Agent Smith1984

Quote:


> Originally Posted by *By-Tor*
> 
> HWInfo worked fine for me also, but HW Monitor won't work in 8 or 10.


That's weird, because HWMonitor worked for me in 8.1, and works now for me in 10 after completely reinstalling it.


----------



## mAs81

Quote:


> Originally Posted by *cruisx*
> 
> My friend and I talked about it and asked them to give him a Sapphire 290 Tri-x. Do you guys think that was the best card to ask for?


Either a Vapor-X or a Tri-X 290/290X will do very nicely.
Other than that, if the price is right, get the Nitro 390 or the Tri-X 390X.


----------



## By-Tor

Quote:


> Originally Posted by *Agent Smith1984*
> 
> That's weird, because HWMonitor worked for me in 8.1, and works now for me in 10 after completely reinstalling it.


There are a couple of different HW monitors out there; the one I'm talking about is a gadget. You have to have HWiNFO64 installed and then install the gadget, which gets all its data from HWiNFO64.

I'll screen shot it when I get home and show you what it looks like.


----------



## Archea47

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Correct, and I advise anyone who has not done so, to upgrade.
> 
> It has been excellent for me so far. Performance improvements, usage improvements, smoothness, benching increases.... just overall really happy with it so far...


Are you in Crossfire? I thought there were issues with games not keeping focus in Win 10 and Crossfire disabling itself.


----------



## battleaxe

Quote:


> Originally Posted by *Archea47*
> 
> Are you in Crossfire? I thought there were issues with games not keeping focus in Win10 and Crossfire disabling


I have crossfire and I'm on Win 10. I've had zero issues so far. I'm running 4K also.


----------



## Agent Smith1984

Quote:


> Originally Posted by *Archea47*
> 
> Are you in Crossfire? I thought there were issues with games not keeping focus in Win10 and Crossfire disabling


No, I got rid of my 290 crossfire setup before upgrading, and am on a single 390 at the moment.
I am adding a second 390 in three weeks (I delayed this through the summer, partly due to temps and partly to make sure I had taken care of some much-needed things for the kids before they begin school next week).

Looking very forward to seeing how it goes....


----------



## kizwan

Quote:


> Originally Posted by *Archea47*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> Correct, and I advise anyone who has not done so, to upgrade.
> 
> It has been excellent for me so far. Performance improvements, usage improvements, smoothness, benching increases.... just overall really happy with it so far...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Are you in Crossfire? I thought there were issues with games not keeping focus in Win10 and Crossfire disabling
Click to expand...

I believe BF3. I only tested it once after upgrading to Win 10, but I'm pretty sure that for BF3, when you return to fullscreen (after losing focus), Crossfire resumes working. I should be able to confirm this again later. Unigine Valley does not resume after returning to fullscreen, but I've had that since Win 7.


----------



## By-Tor

Agent, here's what the monitor gadget I'm using looks like. It opens automatically on bootup.


----------



## Agent Smith1984

Quote:


> Originally Posted by *By-Tor*
> 
> Agent here's what the monitor gadget looks like I'm using. Opens automatically on bootup.


OOOH, me likes that!!!

Very cool.

Does it seem to eat up any CPU cycles/resources?


----------



## kizwan

HWiNFO should consider a stand-alone gadget that doesn't depend on the Windows gadget feature.


----------



## By-Tor

Quote:


> Originally Posted by *Agent Smith1984*
> 
> OOOH, me likes that!!!
> 
> Very cool.
> 
> Does it seem to eat up any CPU cycles/resources?


I like it also and you can set it up many ways...

Have not noticed it using many resources, but at 4.9GHz with 16GB of RAM I don't think I would anyway...

Just noticed I have 2 GPU 1's... Changed it to 2


----------



## By-Tor

Quote:


> Originally Posted by *kizwan*
> 
> HWiNFO should consider stand-alone gadget, doesn't depends on windows feature.


That is an option...

How much can you mod it to your liking?

Can you enlarge the characters?


----------



## Agent Smith1984

Quote:


> Originally Posted by *By-Tor*
> 
> I like it also and you can set it up many ways...
> 
> Have not noticed it using much resources, but at 4.9ghz and 16gb's of ram I don't think I would anyway..


Yeah, you wouldn't even notice if it did. 4.9GHz on DC is pretty stout!

I only ask because there are certain monitoring apps that do seem to affect benchmarks when running in the background, and others that don't. As far as general use and gaming, I always have a few (by few I mean several, lol) cycles to spare for temp monitoring.


----------



## By-Tor

During benching & gaming I have not noticed any performance hit with it on or off.

That and Afterburner are always running anyways...


----------



## battleaxe

1220mhz core
1660mhz memory (Samsung)
+100mv in AB
+25mv in aux

61C max on core
78C max on VRM1
79C max on VRM2

Seems the new cards have a very well balanced VRM section. It used to be that VRM1 ran hotter than VRM2; at least on this card that is not the case. After an extended session the VRMs are at almost exactly the same temps.

*Newest Heaven results:*


Quote:


> Originally Posted by *Agent Smith1984*
> 
> Could we get a pic?
> 
> Very curious to see the heat sinks.
> 
> Are you using the peel-off adhesive back kind? (like the zalman ones?)


Yes, the system is ugly. There are no full-cover blocks available for the new Sapphire 290X, so I'll have to make do with a pair of H80s strapped to them. The main card is the 290X; the upper card is an MSI Gaming 290. I ran in windowed mode to stress only the main card, since running full screen engages Crossfire and the upper card.





I'd still like to get the VRMs down a bit more, even though they are acceptable now. VRM1 was over 100C running this test previously, and I could not reach 1220MHz with only +100mV/+25mV, so this setup is a lot better than the stock air cooler: VRM1 is at least 20C cooler, the core is about the same, and VRM2 didn't really move much.


----------



## kizwan

Quote:


> Originally Posted by *By-Tor*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> HWiNFO should consider a stand-alone gadget that doesn't depend on the Windows gadget feature.
> 
> 
> 
> 
> 
> That is an option...
> 
> How much can you mod. it to your liking?
> 
> Can you enlarge the characters?

That one can't be modded. You can enlarge the fonts though. That is the Open Hardware Monitor gadget (*not* HWMonitor). I don't recommend running it in the background when playing games, because since the 15.4 beta there's a problem with it making the secondary card in Crossfire unresponsive. It happened with GTA V at least. I'm just showing it as an example. A shame, because it's lightweight.

Example screenshot of the Open Hardware Monitor.


----------



## JourneymanMike

Quote:


> Originally Posted by *battleaxe*
> 
> 1220mhz core
> 1660mhz memory (Samsung)
> +100mv in AB
> +25mv in aux
> 
> 61C max on core
> 78C max on VRM1
> 79C max on VRM2
> 
> Seems the new cards have a very well balanced VRM section. It used to be that VRM1 ran hotter than VRM2, but at least on this card that is not the case. After running an extended session the VRMs are at almost exactly the same temps.
> 
> *Newest Heaven results:*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> Could we get a pic?
> 
> Very curious to see the heat sinks.
> 
> Are you using the peel-off adhesive back kind? (like the zalman ones?)
> 
> 
> 
> 
> 
> 
> 
> *Yes, the system is ugly*. There are no full-cover blocks available for the new Sapphire 290X, so I'll have to make do with a pair of H80s strapped to them. The main card is the 290X; the upper card is an MSI Gaming 290. I ran in windowed mode to stress only the main card, since running full screen engages Crossfire and the upper card.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'd still like to get the VRMs down a bit more, even though they are acceptable now. VRM1 was over 100C running this test previously, and I could not reach 1220MHz with only +100mV/+25mV, so this setup is a lot better than the stock air cooler: VRM1 is at least 20C cooler, the core is about the same, and VRM2 didn't really move much.

Holy Krrrrrrraaaaaappppppppp!









Absolutely amazing, I thought I had a mess...









You deserve a trophy or some kind of an award for this one


----------



## Forceman

Quote:


> Originally Posted by *By-Tor*
> 
> Agent, here's what the monitor gadget I'm using looks like. It opens automatically on bootup.


What gadget is that? You have a link? I'm using three different ones to get that info, it would be nice to just have one.


----------



## By-Tor

Quote:


> Originally Posted by *Forceman*
> 
> What gadget is that? You have a link? I'm using three different ones to get that info, it would be nice to just have one.


Here ya go... There are a lot of setup options and it takes time to set up, but it's worth it to me. It has to be used with HWiNFO64 running.

Tip: Disable all the sensors you don't want to see before installing the gadget. I have found things work much smoother if done in that order..

http://www.hwinfo.com/addons.php


----------



## battleaxe

Quote:


> Originally Posted by *JourneymanMike*
> 
> Holy Krrrrrrraaaaaappppppppp!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Absolutely amazing, I thought I had a mess...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You deserve a trophy or some kind of an award for this one


LOL... yeah. I know.


----------



## Paul17041993

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Correct, and I advise anyone who has not done so, to upgrade.
> 
> It has been excellent for me so far. Performance improvements, usage improvements, smoothness, benching increases.... just overall really happy with it so far...


except for DX11 scaling going to hell if programs like FRAPS are running; context constancy also seems to be problematic if any applications aren't minimised...


----------



## Forceman

Quote:


> Originally Posted by *By-Tor*
> 
> Here ya go... There are a lot of setup options and takes time to get it setup, but worth it to me. Has to be used with HWinfo64 running.
> 
> Tip: Disable all the sensors you don't want to see before installing the gadget. I have found things work much smoother if done in that order..
> 
> http://www.hwinfo.com/addons.php


It's the HWInfoMonitor on that page?


----------



## By-Tor

Quote:


> Originally Posted by *Forceman*
> 
> It's the HWInfoMonitor on that page?


Download the sidebar gadget, 32 or 64 bit, whichever you have.

Second row middle..


----------



## Geoclock

Hi. I want to buy a new card but can't decide: would a 290X be a better upgrade than a 390?
I know the 290X is similar to the 390X, but the 390 series runs much cooler and faster.
The price is $50 more for the 390.
Earlier I bought an XFX 290 BE and had nothing but black screens and BSODs, so I returned it.
Now I don't know what to choose.
Thanks.


----------



## mus1mus

Quote:


> Originally Posted by *Geoclock*
> 
> Hi. I want to buy a new card but can't decide: would a 290X be a better upgrade than a 390?
> I know the 290X is similar to the 390X, but the 390 series runs much cooler and faster.
> The price is $50 more for the 390.
> Earlier I bought an XFX 290 BE and had nothing but black screens and BSODs, so I returned it.
> Now I don't know what to choose.
> Thanks.


390. If looking for both new cards.

290X if you can find a good deal on a used model.


----------



## Roboyto

Quote:


> Originally Posted by *Geoclock*
> 
> Hi. I want to buy a new card but can't decide: would a 290X be a better upgrade than a 390?
> I know the 290X is similar to the 390X, but the 390 series runs much cooler and faster.
> The price is $50 more for the 390.
> Earlier I bought an XFX 290 BE and had nothing but black screens and BSODs, so I returned it.
> Now I don't know what to choose.
> Thanks.


290(X) and 390(X) are near identical but there are differences to be aware of.

Benefits to the 390(X) are:

- 8GB of memory standard compared to 4GB

- 6GHz memory standard compared to 5GHz

- Potential for better core, and more importantly, VRM cooling; XFX's new cards look promising



Spoiler: XFX 390 VRM Cooling















- Potential for slightly better efficiency, from what I have read in hardware reviews. A nice feat when you consider the higher core clocks and double the memory at 20% higher memory clocks. This would likely be negated when overclocking, but out-of-the-box performance is collectively better than a 290.

Benefits to the 290(X) are:

- Price

- May be able to find a 290X for equal/less than price of a 390

The 290X was ~5% faster than a 290, stock vs. stock. That gap can be closed or negated if you get a robust 290.

Your black screen and BSOD could very well have been caused by driver issues. When installing *any* new GPU, even if you're upgrading from another AMD card, you should uninstall/clean the old drivers with Display Driver Uninstaller; DDU http://www.guru3d.com/files-get/display-driver-uninstaller-download,20.html








390 is the better buy IMO with its few enhancements even if they are small. The 390(X) is left in the same position as the 290(X) was. If you want that little bit of extra performance then you must pay the premium...the 390 is still the better buy though.

For plenty of other Hawaii info see here: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781
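The memory-spec difference in the list above translates directly into peak bandwidth, since every Hawaii card (290/290X/390/390X) uses the same 512-bit bus. A quick sketch of the arithmetic (not from the original post, just the standard bandwidth formula):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective data rate).
# Hawaii GPUs (290/290X/390/390X) all have a 512-bit memory bus.
BUS_BITS = 512

def bandwidth_gbs(effective_gtps: float, bus_bits: int = BUS_BITS) -> float:
    """Peak bandwidth in GB/s for a given effective memory rate in GT/s."""
    return bus_bits / 8 * effective_gtps

print(bandwidth_gbs(5.0))  # 290(X) at "5GHz" effective: 320.0 GB/s
print(bandwidth_gbs(6.0))  # 390(X) at "6GHz" effective: 384.0 GB/s
```

So the 390's stock memory clocks are worth an extra 64 GB/s of peak bandwidth on paper, which is the 20% figure quoted above.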


----------



## Geoclock

I just bought an XFX 290X for $259 - $25 (credit) = $235 after MIR, with a lifetime warranty.
The 390 series doesn't have one, so that was a major issue for me.
I hope that in case of trouble XFX will replace it with a newer, higher-tier version.
I still have a 6-year-old XFX 5850 with the lifetime warranty, and I've tried all the tricks to "kill" it without success







,
Anyway, I'll keep it until I figure out how to "HELP" it a little...








Thanks.


----------



## Raephen

Quote:


> Originally Posted by *By-Tor*
> 
> There are a couple of diff. HW monitors out, the one I'm talking about is a gadget. You have to have HWinfo64 installed and then install the gadget which gets all its data from HWinfo64.
> 
> I'll screen shot it when I get home and show you what it looks like.


OpenHardwareMonitor has a built-in gadget-like thing that works fine on Windows 8(.1) and 10 alike.

In the main program you can choose which readings you want to show and such.



A screen grab I took from my HTPC just now.


----------



## Streetdragon

One of my Vapor-X R9 290s died. Now I'm hoping for an RMA. If I don't get one and can't find a second Vapor-X, which card could I crossfire with my other one? Maybe an R9 390?


----------



## Paul17041993

Quote:


> Originally Posted by *Roboyto*


apart from the missing VRAM VRM heatsink, that card is just screaming for a universal or CPU waterblock/AIO


----------



## mfknjadagr8

Quote:


> Originally Posted by *Geoclock*
> 
> I just bought an XFX 290X for $259 - $25 (credit) = $235 after MIR, with a lifetime warranty.
> The 390 series doesn't have one, so that was a major issue for me.
> I hope that in case of trouble XFX will replace it with a newer, higher-tier version.
> I still have a 6-year-old XFX 5850 with the lifetime warranty, and I've tried all the tricks to "kill" it without success
> 
> 
> 
> 
> 
> 
> 
> ,
> Anyway, I'll keep it until I figure out how to "HELP" it a little...
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks.


Remember to register it, or you only get two years I think.


----------



## Agent Smith1984

Quote:


> Originally Posted by *By-Tor*
> 
> During benching & gaming I have not noticed any performance draw with it on or off.
> 
> That and Afterburner are always running anyways...


Yeah, I always have AB running in the background. I feel like I am naked without it.
Quote:


> Originally Posted by *Paul17041993*
> 
> except for DX11 scaling going to hell if programs like FRAPS are running, context constancy also seems to be problematic if any applications aren't minimised...


That's strange, I haven't experienced any of that.

I actually did a lot of FRAPS logging Sunday and put up my best numbers ever.

Here were my results:
http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club/1430#post_24303616


----------



## battleaxe

Latest Heaven Benchmark

1230core
1660ram
+100mv
+19mv aux

with temps as shown in AB
I did some cleaning up in my case and rerouted cables. Helped on VRM1 actually.


----------



## Paul17041993

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Yeah, I always have AB running in the background. I feel like I am naked with out it.
> That's strange, I haven't experienced of that.
> 
> I actually did a lot of FRAPS logging Sunday and put up my best numbers ever.
> 
> Here were my results:
> http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club/1430#post_24303616


do you use VSR and/or GPU scaling?


----------



## Agent Smith1984

Quote:


> Originally Posted by *Paul17041993*
> 
> do you use VSR and/or GPU scaling?


No, actual res


----------



## Paul17041993

Quote:


> Originally Posted by *Agent Smith1984*
> 
> No, actual res


Quote:


> Originally Posted by *Paul17041993*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Paul17041993*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Paul17041993*
> 
> it appears VSR and GPU scaling may be broken for some DX titles in windows 10 and catalyst 15.7.1, anyone got any experiences? currently grabbing heaven 4.0 to try and test individual DX levels, OGL may be fine though...
> 
> 
> 
> ok;
> - DX11 is completely shot scaling-wise, and I mean completely: either run full-native, run windowed, or not at all.
> - DX9 and OGL work, at least mostly, but oh lord does Windows kark itself with Heaven in these modes; it can't decide whether the window is active or not, repeatedly flashes out of context, and leaves the taskbar off its rocker...
> 
> 
> good news everybody! in less than 24 hours after discovering the bugs I have worked out specific causes;
> - (DX11) if any context-altering or attaching applications are used, such as FRAPS, render and interface scaling will cease to function, resulting in something similar to the picture below
> - (ALL) if any application is open in the background, context control breaks and you won't be able to keep the game/application in fullscreen; ensure _all_ applications are minimized, and that includes any launchers.
> - (ALL) Windows interface handling loses its rocker if anything intensive is run, including but not limited to taskbar lag, keyboard lag, other applications lagging, Windows interface artifacting, etc.
> 
> 
> 
> note that screenshots still work correctly; the bugs are only in the buffer > monitor stage, whereas screenshots are taken before or in the buffer stage.


----------



## sinnedone

Quote:


> Originally Posted by *sinnedone*
> 
> Trying to figure out my overclocks here on a pair of 290's and just wanted to know if these are good scores and are they scaling well with core clocks?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This is what I have so far on Firestrike scores:
> 
> 15570 at 1000/1300
> 
> 15931 at 1050/1300
> 
> 16559 at 1150/1300
> 
> Haven't really tested the 1100/1300 to much for stability as I only added 50mv. Going to be testing overclocks further as anything past 1050 on the core needs extra voltage.
> 
> OH forgot to add this is with a 3770k at 4.5 ghz


Here are some updated benchmarks at the same gpu clocks and the only difference is the cpu has been bumped up to 4.8ghz from 4.5ghz.









Interesting to see the difference a higher cpu overclock makes.

16125 at 1000/1300

16484 at 1050/1300

17265 at 1150/1300
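The uplift from the CPU bump can be read straight off those scores. A quick sketch (using only the numbers quoted above) of the per-clock gain:

```python
# Firestrike scores from the two runs above: GPU clocks identical,
# only the 3770K changed from 4.5GHz to 4.8GHz.
runs = {
    "1000/1300": (15570, 16125),
    "1050/1300": (15931, 16484),
    "1150/1300": (16559, 17265),
}

for clocks, (at_45, at_48) in runs.items():
    gain = (at_48 - at_45) / at_45 * 100
    print(f"{clocks}: {at_45} -> {at_48} (+{gain:.1f}%)")
```

At every GPU clock the 300MHz CPU bump is worth roughly 3.5-4.3% overall, which suggests the pair of 290s is still partly CPU-bound in this benchmark.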


----------



## Geoclock

Hi. So I got some benches from my XFX 290X 4GB.
Valley

Is it OK?

Idle - 51C
Load - 72C
VRM1 - 70C at load

Any chance to cool it down a little? Maybe a backplate with some pads?


----------



## Paul17041993

well I managed a Heaven 4.0 bench, extreme 1080p @ 1200/1650 with ~125mV, albeit not 100% stable;



At this point I'm pump/block bound: the VRMs are cooled adequately, but heat transfer to the water caps out, and while the water stays at about 52C the core goes to 80C and over. Memory's limited to 1650, as 1700 will cause a blackscreen after a short while (not enough VRAM voltage), and the core gets the blue fuzzies at about 1200 but is mostly stable at 1150, simply due to heat.

In that sense, I wonder how much I could sell this card for, given its potential...


----------



## rdr09

Quote:


> Originally Posted by *Paul17041993*
> 
> well I managed a Heaven 4.0 bench, extreme 1080p @ 1200/1650 with ~125mV, albeit not 100% stable;
> 
> 
> 
> At this point I'm pump/block bound, VRMs are cooled adequately but the heat transfer to the water caps out and while the water stays at about 52C, the core goes 80C and over. Memory's limited to 1650 as 1700 will cause it to blackscreen after a short while (not enough VRAM voltage) and core gets the blue fuzzies at about 1200 but mostly stable at 1150 simply due to heat.
> 
> I wonder in that sense how much I could sell this card for its potential...


my 290 @ 1200/1500 scored a 61.8. it is unstable. lower your memory a bit, even to 1500.

i must have missed it . . . if it's on water, the core should not go to 80C.


----------



## Paul17041993

Quote:


> Originally Posted by *rdr09*
> 
> my 290 @ 1200/1500 scored a 61.8. it is unstable. lower your memory a bit. even 1500.
> 
> i must have missed it . . . on water, then core should not go 80C.


Your CPU's IPC could be greater than mine, though; atm I'm still using an FX-8150 that hasn't been overclocked for quite a while now...

One thing I also forgot to mention: the pump is only being fed 80% power as well as being very small, so giving it 100% could net me much greater heat transfer. Except that would require entering the motherboard BIOS to change the setting there and back afterwards, and I'm too lazy to do that...

Currently debating whether I should go for a full high-performance water loop; seeing as I'm not about to change out any hardware any time soon, it seems like a good idea...


----------



## Geoclock

I guess I've got high temps.

Idle: 51C
Load: 75C

Is it OK?

From what I've heard, nVidia cards are 15C cooler even after OC.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Geoclock*
> 
> I guess I've got high temps.
>
> Idle: 51C
> Load: 75C
>
> Is it OK?
>
> From what I've heard, nVidia cards are 15C cooler even after OC.


94C is the recommended max, and 85C is kinda what you want to stay under...


----------



## corrosion666

Hi. Is there anybody here who can give me exact measurements for a reference R9 290X with the EK waterblock attached (revision 2.0)? I'm currently doing a build with a Corsair Carbide Air 240 case, and I am having trouble finding a good card that will fit with a full-cover block attached. Length is of no importance: at 30 cm, I have plenty of clearance for any type of card. The width of the card is the important factor: clearance between the GPU and the side window is very small. So I would like to get a good measurement for this card, so I know for sure I will be able to close the case properly once I install this card + the EK waterblock.

Official CPU clearance is 120mm, so not many cards will fit this case with blocks attached.

I currently have an 11.1 cm wide GTX 970 installed, and with sleeved power connectors attached I am already pushing against the side window.

Thanks in advance!


----------



## Gumbi

Quote:


> Originally Posted by *Geoclock*
> 
> I guess I've got high temps.
>
> Idle: 51C
> Load: 75C
>
> Is it OK?
>
> From what I've heard, nVidia cards are 15C cooler even after OC.


Those temps sound about right for these cards. Some nVidia cards are cooler, some warmer; it depends on what card and what cooler is being used...

Your idles are "high" because the fans turn off at idle. Which is a neat feature


----------



## JourneymanMike

Quote:


> Originally Posted by *corrosion666*
> 
> Hi. Is there anybody here who can give me exact measurements for a reference R9 290X with the EK waterblock attached (revision 2.0)? I'm currently doing a build with a Corsair Carbide Air 240 case, and I am having trouble finding a good card that will fit with a full-cover block attached. Length is of no importance: at 30 cm, I have plenty of clearance for any type of card. The width of the card is the important factor: clearance between the GPU and the side window is very small. So I would like to get a good measurement for this card, so I know for sure I will be able to close the case properly once I install this card + the EK waterblock.
>
> Official CPU clearance is 120mm, so not many cards will fit this case with blocks attached.
>
> I currently have an 11.1 cm wide GTX 970 installed, and with sleeved power connectors attached I am already pushing against the side window.
>
> Thanks in advance!


A reference 290X with stock cooler = 36mm thick

With water block AND back plate = 25mm thick

This was measured with digital calipers...









Hope this helps...


----------



## corrosion666

Quote:


> Originally Posted by *JourneymanMike*
> 
> A reference 290x with stock cooler = 36mm thick
> 
> With water block AND back plate = 25mm thick
> 
> This was measured with a digital calipers...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hope this helps...


Thanks for the information, but I don't think those numbers are the ones I need.
English is not my primary language, so I think I gave a wrong description.

I am talking about the length measured from the PCI-connector to the outer side of the water block fittings.










Using the picture of the GTX 980 above as a reference, I am talking about the length of the blue line.
Hope this is clear.

Thanks


----------



## rdr09

Quote:


> Originally Posted by *Paul17041993*
> 
> well I managed a Heaven 4.0 bench, extreme 1080p @ 1200/1650 with ~125mV, albeit not 100% stable;
> 
> 
> 
> At this point I'm pump/block bound, VRMs are cooled adequately but the heat transfer to the water caps out and while the water stays at about 52C, the core goes 80C and over. Memory's limited to 1650 as 1700 will cause it to blackscreen after a short while (not enough VRAM voltage) and core gets the blue fuzzies at about 1200 but mostly stable at 1150 simply due to heat.
> 
> I wonder in that sense how much I could sell this card for its potential...


could very well be the cpu but i'm inclined to think it's an unstable memory oc. if you already have full waterblock on the gpu, then it just makes sense to go all out with the loop.


----------



## mfknjadagr8

Quote:


> Originally Posted by *corrosion666*
> 
> Thanks for the information, but i dont think those numbers are the ones i need.
> English is not my primary language so i think i gave a wrong description.
> 
> I am talking about the length measured from the PCI-connector to the outer side of the water block fittings.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Using the picture of the GTX 980 above as a reference, I am talking about the length of the blue line.
> Hope this is clear.
> 
> Thanks


height is what you are looking for


----------



## JourneymanMike

Quote:


> Originally Posted by *corrosion666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JourneymanMike*
> 
> A reference 290x with stock cooler = 36mm thick
> 
> With water block AND back plate = 25mm thick
> 
> This was measured with a digital calipers...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hope this helps...
> 
> 
> 
> Thanks for the information, but i dont think those numbers are the ones i need.
> English is not my primary language so i think i gave a wrong description.
> 
> I am talking about the length measured from the PCI-connector to the outer side of the water block fittings.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Using the picture of the GTX 980 above as a reference, I am talking about the length of the blue line.
> Hope this is clear.
> 
> Thanks

OK, I thought you meant thickness, what you really wanted was height...

Stock cooler = 111.18mm

With water block = 136.56mm

And there you got it...


----------



## Geoclock

Quote:


> Originally Posted by *Gumbi*
> 
> Those temps sound exactly cards. Some nVidia cards are cooler, some warmer, it depends on what card and what cooler is being used...
> 
> Your idles are "high" because the fans turn off at idle. Which is a neat feature


So actually the Gigabyte GTX 970 G1 is better value compared to the R9 290X?
Some are saying they OC the G1 by 20% and it stays as cool as before.


----------



## Gumbi

Quote:


> Originally Posted by *Geoclock*
> 
> So actually the Gigabyte GTX 970 G1 is better value compared to the R9 290X?
> Some are saying they OC the G1 by 20% and it stays as cool as before.


The 290X is a faster card than the 970. Why don't you try overclocking your card and see how your temps go from there?

What make/model card do you have? Like I said, a 71C load temp sounds fine...


----------



## corrosion666

Quote:


> Originally Posted by *JourneymanMike*
> 
> OK, I thought you meant thickness, what you really wanted was height...
> 
> Stock cooler = 111.18mm
> 
> With water block = 136.56mm
> 
> And there you got it...


Thanks.

So can anyone confirm that a reference R9 290X card with the EK waterblock fits inside the Corsair Carbide Air 240 case? I have seen pictures which lead me to believe that it will, but in some of them it looked like the panel didn't close completely. Can anyone confirm or deny the fit?

Thanks


----------



## JourneymanMike

Quote:


> Originally Posted by *corrosion666*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JourneymanMike*
> 
> OK, I thought you meant thickness, what you really wanted was height...
> 
> Stock cooler = 111.18mm
> 
> With water block = 136.56mm
> 
> And there you got it...
> 
> 
> 
> Thanks.
> 
> So can anyone confirm that a reference R9 290 X card with the EK waterblock fits inside the Corsair Carbide Air 240 case? I have seen pictures which lead me to believe that it will, but some of them looked like the panel didnt close completely. Can anyone confirm/disconfirm the fit?
> 
> Thanks

I measured with the cables attached. For mine, from the bottom of the gold fingers to the top of the cables (with the cables flattened as much as I can get them), the measurement is 145.61mm.


----------



## gatygun

Quote:


> Originally Posted by *Geoclock*
> 
> I guess I've got high temps.
>
> Idle: 51C
> Load: 75C
>
> Is it OK?
>
> From what I've heard, nVidia cards are 15C cooler even after OC.


I have had 2x 290s and a single 290 Tri-X, and now I've got a 970 MSI Gaming. I picked the Gaming over the G1 because its fans shut down in less taxing games, while the G1's would always be on.

My experience with the 290 Tri-X was that temps would hover around ~70C on the stock fan profile; with a slightly more aggressive fan profile it would run around 60C, obviously in a well-cooled case.
OC'ed, temps would move upwards towards ~80C, unless you pushed a more aggressive fan profile, in which case ~70C was doable.

With the 970 MSI Gaming at stock, the GPU hits about ~60C. Fan noise is far less than the 290 Tri-X; it's practically silent. In less demanding games the fans even shut down completely, and you don't notice a 70% fan profile, while with the Tri-X you surely did.

With the 970 OC'ed, the card gets to about 64-65C; the fans will start running but you won't hear anything. With the 290 you surely did hear them.

This is with a well-cooled case that has a lot of fans and room to push air through; if you've got a worse cooling solution, I'm sure the 290's temps will leap upwards. All in all the 970 runs far cooler and doesn't heat my room up, that's for sure. The 970 is limited in overclocking more by its power limit than by cooling.
Quote:


> Originally Posted by *Geoclock*
> 
> So actually the Gigabyte GTX 970 G1 is better value compared to the R9 290X?
> Some are saying they OC the G1 by 20% and it stays as cool as before.


The 970 is basically a 290X without the heat and fan noise when overclocked. Mine is overclocked to 980 speeds and frankly the fan is inaudible, and the heat is far less than my 290 put out.

The main issue with the 970 is its power delivery; you are limited by this more than by heat. The card here never gets hotter than 65C and I never hear the fans at all, and that's with a hefty overclock comparable to 1200/1400 clocks on a 290. With the 290 I had to push the fan really hard to keep it cool enough, and the heat would make itself felt.

Performance-wise the cards are comparable on high-end CPUs for sure; on lower-end or older CPUs the 970 will pull ahead drastically (Witcher 3 ultra 1080p: 290 = 44 fps lows in cities, 970 = 61 fps lows; BF4 ultra 1080p: 290 = 47 fps lows, 970 = 65 fps lows), especially at 1080p framerates. If you go to higher resolutions like 1440p, the 290 will surely be a better investment.

Still, the 970 effectively has only 3.5GB of VRAM, which isn't an issue at 1080p for sure in newer games, but at higher resolutions any extra VRAM you can get is better.

That's my experience with both cards.

If you want higher resolutions and can deal with more heat/noise, go with the 290/390 cards. At lower resolutions, and if you don't want to deal with heat/noise, I would get a 970.


----------



## Geoclock

Quote:


> Originally Posted by *Gumbi*
> 
> The 290X is a faster card than the 970. Why don't you try overclocking your card and seeing how your temps go from there.
> 
> What amke/model card do you have? Like I said, 71c load temo sounds fine...


XFX DD R9 290X. So do you think, with these temps, it's worth OCing?
What's the max temp I can hold, 80C?

Safest core and memory clocks for an OC?


----------



## Gumbi

Quote:


> Originally Posted by *Geoclock*
> 
> XFX DD R9 290X. So do you think, with these temps, it's worth OCing?
> What's the max temp I can hold, 80C?


Anything up to 85C is fine. I have heard the XFX 290Xs have VRM issues, though I have seen conflicting reports.


----------



## Paul17041993

Quote:


> Originally Posted by *rdr09*
> 
> could very well be the cpu but i'm inclined to think it's an unstable memory oc. if you already have full waterblock on the gpu, then it just makes sense to go all out with the loop.


it's hybrid cooled using the Corsair HG10 and the Fractal Design T12 AIO kit with some extra fittings and heatsinks stuck to the HG10; the pump is essentially a DC-LT sitting directly over the block. At this point it looks like I'll be trying a pair of cheap CPU blocks (CPU and GPU) with a D5 pump and heaps of rad space, which should be at least slightly better than what I have currently while being dead silent. A full-cover block for the 290X costs at least 150AUD, which is a bit steep for something I might not keep for very long, whereas pumps, rads and CPU blocks stay through CPU and GPU upgrades.

I'll be testing with lower clocks later today though to see how the results differ.


----------



## Raephen

Quote:


> Originally Posted by *corrosion666*
> 
> Thanks.
> 
> So can anyone confirm that a reference R9 290 X card with the EK waterblock fits inside the Corsair Carbide Air 240 case? I have seen pictures which lead me to believe that it will, but some of them looked like the panel didnt close completely. Can anyone confirm/disconfirm the fit?
> 
> Thanks


I don't own a Corsair 240 and neither do I own an EK block, but I'd think my AquaComputer block should be comparable in width, and it adds a bit more to the reference card's width than I think the 240 can handle.

You could go AIO (for the sake of brand: a Corsair one to match your case) with a suitable mounting bracket (which Corsair makes), and mount the radiator of the AIO on the front intake.

From the block I know and the specs I've seen, I think a fully blocked 290 would be too tight a fit for the case. The safest route, if you want a form of WC, IMO, is an all-in-one cooler plus bracket.


----------



## Geoclock

So I OC'ed my XFX 290X to 1090/5020 (effective).
GPU temp: 87-88C
VRM1 temp: 90-100C
Fan: 60%
Here are some benches; what do you recommend?
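Posts in this thread mix the two memory-clock notations: actual clock (e.g. battleaxe's 1660MHz) and effective transfer rate (the 5020 here). GDDR5 is quad-pumped, so converting between them is just a factor of four; a minimal sketch:

```python
# GDDR5 transfers four bits per pin per clock, so the "effective" rate
# quoted by some tools is 4x the actual memory clock.
def effective_rate(actual_mhz: float) -> float:
    return actual_mhz * 4

def actual_clock(effective_mhz: float) -> float:
    return effective_mhz / 4

# 5020 effective corresponds to a 1255MHz actual clock, in the same
# units as the 1650-1660MHz figures quoted elsewhere in the thread.
print(actual_clock(5020))    # 1255.0
print(effective_rate(1250))  # stock 290X memory: 5000.0
```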


----------



## Paul17041993

Quote:


> Originally Posted by *Geoclock*


Run Heaven at 1920x1080 extreme.


----------



## Geoclock

Changed it to 1920x1080; it's bad, right?


----------



## Paul17041993

Quote:


> Originally Posted by *Geoclock*
> 
> Changed by 1920x1080, it's bad right?


1600*900 is a poor resolution to test these cards on, you end up CPU-bound more than anything.


----------



## Geoclock

I mean the results are bad.

How good would an nVidia GTX 970 be at the same 1080p?
10% better?


----------



## Paul17041993

ok did more benches with heaven extreme 1080 with different clocks;


Spoiler: stock 1000/1250









Spoiler:  100mV 1101/1499









Spoiler:  130mV 1199/1501







it seems to be able to run stable at ~1200, but it doesn't seem to be able to go higher than that even with +200mV. One problem I've noticed, though, is how hot the shroud gets at the back end of the card, so the VRAM VRMs may be getting hotter than I thought and causing instability as the card gets stressed past a certain point. So I'll be looking at making a special cover to fit over the shroud and focus the airflow through the whole thing. The fan end of the shroud is cool to the touch even when overclocked (though at a high RPM), so that's working well.

Other than that I'm CPU-bound, so once the CPU is under water I'll get it to at least 4.8GHz and see from there.


----------



## kizwan

Quote:


> Originally Posted by *Geoclock*
> 
> So I OC'ed my XFX 290X to 1090 core / 5020 memory (effective).
> GPU temp 87-88C
> VRM1 temp 90-100C
> Fan 60%
> Here are some benches; what do you recommend?
> 
> 
> Spoiler: Warning: Spoiler!


You should run at 1080p 8xAA Fullscreen, then press F12 to take screenshots. Screenshots are saved in *C:\Users\[username]\Valley* & *C:\Users\[username]\Heaven*
Quote:


> Originally Posted by *Paul17041993*
> 
> OK, I did more benches with Heaven Extreme at 1080 with different clocks:
> 
> 
> Spoiler: stock 1000/1250
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler:  100mV 1101/1499
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler:  130mV 1199/1501
> 
> 
> 
> 
> 
> 
> 
> It seems to run stable at ~1200, but it doesn't seem to go higher than that even with +200mV. One problem I've noticed, though, is how hot the shroud gets at the back end of the card, so the VRAM VRMs may be getting hotter than I thought and causing instability once the card is stressed past a certain point. So I'll be looking at making a special cover to fit over the shroud and focus the airflow through the whole thing. The fan end of the shroud is cool to the touch even when overclocked (though at a high RPM), so that's working well.
> 
> Other than that I'm CPU-bound, so once the CPU is under water I'll get it to at least 4.8GHz and see from there.


Your 1199/1501 score seems too low IMO.

1200/1500 - only 1MHz more on the core & only 1MHz less on the memory.


BTW, did you try overclocking only the core, leaving the memory at stock? Just in case you can go higher than 1200MHz.


----------



## Paul17041993

Quote:


> Originally Posted by *kizwan*
> 
> 1200/1500 - only 1MHz more on the core & only 1MHz less on the memory.
> 
> 
> BTW, did you try overclocking only the core, leaving the memory at stock? Just in case you can go higher than 1200MHz.


you're using a 3rd gen i7 at 4.5GHz though








Though I did just try 1250/1250 while it was cold and the result wasn't any different from before, so 1200 on the core may just be the limit of this card without a VRM mod, or Trixx may not be setting the voltage correctly...
I don't seem to be able to set certain specific numbers though, which may be a Trixx bug, or it may simply be that all 3 values (core, memory, mV) must fit specific ratios; I've never messed about with it enough to really work it out...


----------



## kizwan

Quote:


> Originally Posted by *Paul17041993*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> 1200/1500 - only 1MHz more on the core & only 1MHz less on the memory.
> 
> 
> BTW, did you try overclocking only the core, leaving the memory at stock? Just in case you can go higher than 1200MHz.
> 
> 
> 
> you're using a 3rd gen i7 at 4.5GHz though
> 
> 
> 
> 
> 
> 
> 
> 
> Though I did just try 1250/1250 while it was cold and the result wasn't any different from before, so 1200 on the core may just be the limit of this card without a VRM mod, or Trixx may not be setting the voltage correctly...
> I don't seem to be able to set certain specific numbers though, which may be a Trixx bug, or it may simply be that all 3 values (core, memory, mV) must fit specific ratios; I've never messed about with it enough to really work it out...
Click to expand...

The FPS difference between yours & mine is too big. If the difference were only 2 to 3 FPS then we could attribute it to the CPU. Even comparing with your previous score, it doesn't look right.


----------



## Paul17041993

Quote:


> Originally Posted by *kizwan*
> 
> The FPS difference between yours & mine is too big. If the difference were only 2 to 3 FPS then we could attribute it to the CPU. Even comparing with your previous score, it doesn't look right.


Well I just discovered it may have been throttling to a certain degree...








But the VRAM VRMs and the back end of the shroud are just getting way too hot overall, as high as 100C, and are possibly making the VRAM ICs unstable. While the main VRMs stay cool (65C max even with +200mV, it seems), the PCB is still collecting heat from the core, which then builds up in the shroud. Even after sticking another little heatsink on the other end, there simply isn't enough airflow...

I need to locate me some perspex and do some sketchies...


----------



## skyrrd

I'm thinking about buying the Arctic Cooling Hybrid III 140, either the 290(X) or the generic edition (they differ in the VRM heatsinks).

Can someone tell me for sure whether the 290-edition heatsinks will fit my 290 Tri-X New Edition? Otherwise, which thermal pads/glue should I use for the generic heatsink parts? (AFAIK they have to be glued; on the 290 reference edition they can be mounted...)


----------



## kizwan

Anyone using a 390 modded BIOS on a 290 & using Secure Boot to boot Windows?


----------



## Offler

What about the R9-290X and 4K displays and/or 4K gaming?

(check my system below, still same config)


----------



## rdr09

Quote:


> Originally Posted by *Paul17041993*
> 
> Well I just discovered it may have been throttling to a certain degree...
> 
> 
> 
> 
> 
> 
> 
> 
> But the VRAM VRMs and the back end of the shroud are just getting way too hot overall, as high as 100C, and are possibly making the VRAM ICs unstable. While the main VRMs stay cool (65C max even with +200mV, it seems), the PCB is still collecting heat from the core, which then builds up in the shroud. Even after sticking another little heatsink on the other end, there simply isn't enough airflow...
> 
> I need to locate me some perspex and do some sketchies...


try 1150/1400 @ 150mv or lower. it should match a 290 @ 1200. make sure to max Power Limit.

Quote:


> Originally Posted by *Offler*
> 
> What about R9-290x and 4k displays and/or 4k gaming?
> 
> (check my system below, still same config)


you need another 290X. i use 2 290s at stock for my 4K. wait . . . 1090T? nah, with 2 290s . . . it's the cpu that needs oc.


----------



## cokker

My windows 10 install just installed an older driver after I DDU'd and installed 15.7.1. Windows what are you doing?


----------



## Offler

Quote:


> Originally Posted by *rdr09*
> 
> you need another 290X. i use 2 290s at stock for my 4K. wait . . . 1090T? nah, with 2 290s . . . it's the cpu that needs oc.


Dual R9-290(X) would hammer HyperTransport, due to the lack of an XFire bridge. Anyway, I wanted to stay away from CrossFire. It's a cool feature, but it requires additional attention.

At Full HD (and the highest possible image quality) the GPU sits at 50-70% utilization while downclocking (not throttling) below 800MHz, Vsynced at 60FPS/Hz. 4x more pixels will require 4x more resources, and it appears the GPU would only be able to handle about 2x...
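A quick sanity check of the pixel arithmetic above (an illustrative sketch; the ~2x capability figure is the poster's own estimate):

```python
# 4K (UHD) pushes exactly 4x the pixels of Full HD, while 50-70%
# utilization at 1080p implies only ~1.4-2x GPU headroom, which is
# consistent with the ~2x estimate in the post.
pixels_fhd = 1920 * 1080
pixels_uhd = 3840 * 2160
pixel_ratio = pixels_uhd / pixels_fhd   # 4.0
headroom_lo = 1 / 0.70                  # ~1.43x at 70% utilization
headroom_hi = 1 / 0.50                  # 2.0x at 50% utilization
print(pixel_ratio, round(headroom_lo, 2), headroom_hi)
```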

Edit:
BTW, the CPU isn't utilized above 60% in DX games either. Mantle allows squeezing it a bit more, up to 80%...

I checked a few gaming results. An R9-290X, properly clocked (and with all 44 CUs enabled, like mine), can handle such a resolution quite well.

Display specs like 1 billion colors don't matter much unless the display can be calibrated (I'm considering a 4K purchase also for photos; anyway, I don't know of an environment that allows more than 24-bit color depth).

The last thing I was considering is FreeSync. TBH I have reservations about this technology, as I prefer 60Hz, 60FPS, properly synced, with proper buffering. However, in the real world games are still poorly coded and I have to reckon with frame drops. Does disabling V-Sync and using FreeSync instead have any positive impact? In the 30-59 FPS range it certainly does.


----------



## rdr09

Quote:


> Originally Posted by *Offler*
> 
> Dual R9-290(X) would hammer HyperTransport, due to the lack of an XFire bridge. Anyway, I wanted to stay away from CrossFire. It's a cool feature, but it requires additional attention.
> 
> At Full HD (and the highest possible image quality) the GPU sits at 50-70% utilization while downclocking (not throttling) below 800MHz, Vsynced at 60FPS/Hz. 4x more pixels will require 4x more resources, and it appears the GPU would only be able to handle about 2x...
> 
> Edit:
> BTW, the CPU isn't utilized above 60% in DX games either. Mantle allows squeezing it a bit more, up to 80%...
> 
> I checked a few gaming results. An R9-290X, properly clocked (and with all 44 CUs enabled, like mine), can handle such a resolution quite well.
> 
> Display specs like 1 billion colors don't matter much unless the display can be calibrated (I'm considering a 4K purchase also for photos; anyway, I don't know of an environment that allows more than 24-bit color depth).
> 
> The last thing I was considering is FreeSync. TBH I have reservations about this technology, as I prefer 60Hz, 60FPS, properly synced, with proper buffering. However, in the real world games are still poorly coded and I have to reckon with frame drops. Does disabling V-Sync and using FreeSync instead have any positive impact? In the 30-59 FPS range it certainly does.


my thuban at 4GHz is maxed out by my 7950 in games like BF4. my i7 @ 4.5 HT off (basically an i5) is maxed out by a single 290 in BF4. with 2 290s, i can't turn off HT. i can max my 290s, though, in BF4 with my i7.


----------



## Offler

I always play with Vsync on at 60Hz/FPS with triple buffering (or tricked triple buffering).

E.g. I use my current display and its parameters (both resolution and refresh) as a limitation for the rest of the system. Both CPU and GPU can perform all render tasks faster than 60Hz in most games at 1920x1080x24b. A speedy hard drive (up to 2500MB/s read) also saves a lot of CPU time.

I observe mainly two scenarios:

a) The game runs at 60 fps, buttersmooth, while CPU and GPU are not used more than 60-80%.
b) The game runs at less than 60 fps, due to poor optimizations in the engine, while CPU and GPU are not used more than 60-80%.

I tend to observe stutter resulting from FPS fluctuation between 58-62 FPS in some poorly optimized games, apparently caused by some buffer overflow. And in some cases I can see missing Nvidia optimizations (Witcher 2, the scene with trebuchets on the battlefield, Ubersampling ON).


----------



## fyzzz

http://www.3dmark.com/fs/5780378 - not my highest GPU score (5 points off) but my highest overall, and I did that run at 24.4C ambient. But the best thing is that it seems like I got rid of the blackscreens: there's no random blackscreen at 1245MHz anymore (using a 390 BIOS that I've tinkered with). Sadly, it still blackscreens at 1250MHz.


----------



## VulpesInculta

Hello guys, I'm sorry if this question is dumb, but I don't really know where else to ask. I set up a custom fan curve for my R9 290 PCS+, and while the GPU VRAM and GPU core temps in HWiNFO don't go above 75C, the area around the PC became really warm, much warmer than before I set up the fan curve. Is it possible I'm getting wrong readings or something? The fans run at 50% under load instead of 80% and the temps went up by only 5-6 degrees.


----------



## Agent Smith1984

Quote:


> Originally Posted by *VulpesInculta*
> 
> Hello guys, I'm sorry if this question is dumb, but I don't really know where else to ask. I set up a custom fan curve for my R9 290 PCS+, and while the GPU VRAM and GPU core temps in HWiNFO don't go above 75C, the area around the PC became really warm, much warmer than before I set up the fan curve. Is it possible I'm getting wrong readings or something? The fans run at 50% under load instead of 80% and the temps went up by only 5-6 degrees.


Nope, the fan curve is doing its "job" and getting the hot air off the card...
That air is then drawn out of the case by your exhaust fans, much hotter than it was before.


----------



## VulpesInculta

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Nope, the fan curve is doing its "job" and getting the hot air off the card...
> That air is then drawn out of the case by your exhaust fans, much hotter than it was before.


I guess i will have to make the fans work a bit faster because it feels like i am having an electric heater next to me


----------



## mfknjadagr8

Quote:


> Originally Posted by *VulpesInculta*
> 
> I guess i will have to make the fans work a bit faster because it feels like i am having an electric heater next to me


Faster fans won't reduce the heat they output; if anything it will increase it slightly... However, the room gaining temperature at least lets you know you are removing heat from the PC.


----------



## VulpesInculta

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Faster fans won't reduce the heat they output; if anything it will increase it slightly... However, the room gaining temperature at least lets you know you are removing heat from the PC.


Well, with the stock fan curve the fans go to 80% when the temp nears 70, and it definitely feels much less warm; the PC is close to me, so I can feel it. And when I played a bit of The Witcher 3 with the fans at 50%, it was like having a small heater next to me while the temps seemed almost the same: instead of 70 they went up to about 75 and stayed there. That's why I asked whether it's possible to have wrong readings; it felt that much warmer.


----------



## Paul17041993

Quote:


> Originally Posted by *Offler*
> 
> Dual R9-290(x) will kill Hypertransport, due lack of Xfire bridge.


Pretty sure Hawaii doesn't use the HT link for frame transfer; the cards communicate with each other directly over PCIe (XDMA). Draw calls, of course, would.


----------



## battleaxe

I finished some fairly extensive cleaning and modding in my case today. Made a pretty significant difference if I do say so myself.

Before:


Spoiler: Warning: Spoiler!








After:


Spoiler: Warning: Spoiler!








I'm pretty happy with the results.









After this, I was able to push a bit harder. VRM1 was lower by about 6c after the changes. VRM2 maybe 1c lower. Able to hit a bit higher on core and memory as a result.


----------



## pcrevolution

Spoiler: Warning: Spoiler!






Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Lutro0*
> 
> Hey guys I am in need of 2x sapphire stock R9 290X 4G D5 heatsinks, I know many of you watercool and have them laying about and I honestly just need to borrow them for some troubleshooting so I would be willing to pay to rent them or buy them. I figured this would be the best place to ask. So I appreciate your time, shoot me a pm if you can help me out!





Quote:


> Originally Posted by *battleaxe*
> 
> I finished some fairly extensive cleaning and modding in my case today. Made a pretty significant difference if I do say so myself.
> 
> Before:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> After:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> I'm pretty happy with the results.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> After this, I was able to push a bit harder. VRM1 was lower by about 6c after the changes. VRM2 maybe 1c lower. Able to hit a bit higher on core and memory as a result.






Nice ghetto!


----------



## MissHaswellE

Quote:


> Spoiler: Warning: Spoiler!


My brain doesn't understand what it's looking at.

Is that an upside down motherboard?


----------



## skyrrd

BTX motherboard i guess


----------



## Paul17041993

It's a Silverstone chassis, not sure which one specifically (Raven or Fortress), but that's what they do.


----------



## battleaxe

Quote:


> Originally Posted by *MissHaswellE*
> 
> My brain doesn't understand what it's looking at.
> 
> Is that an upside down motherboard?


Quote:


> Originally Posted by *Paul17041993*
> 
> It's a Silverstone chasis, not sure which one specifically (raven or fortress) but that's what they do.


It's actually an AZZA Genesis case. And yes, it's backwards; actually, I like this configuration a lot better than the normal one where the GPUs are upside down. I never have to look at the back of the PCB.


----------



## Paul17041993

Quote:


> Originally Posted by *battleaxe*
> 
> It's actually an AZZA Genesis case.


ah, from what I can tell we don't have any of their products in AU, which would be why I didn't know of it...


----------



## battleaxe

Quote:


> Originally Posted by *pcrevolution*
> 
> 
> Nice ghetto!


Are you making fun of me?


----------



## Paul17041993

Quote:


> Originally Posted by *battleaxe*
> 
> Are you making fun of me?


I would consider your setup better than mine. I've slightly dinged one corner of my Noctua heatsink trying to get the radiator hose around it, and I have a mess of SATA cables that I don't see in your rig. You've done very well to manage and space out everything.


----------



## battleaxe

Quote:


> Originally Posted by *Paul17041993*
> 
> I would consider your setup better than mine. I've slightly dinged one corner of my Noctua heatsink trying to get the radiator hose around it, and I have a mess of SATA cables that I don't see in your rig. You've done very well to manage and space out everything.


Well, I doubt that... lol

I've always been a function-over-form kind of guy. I would love to get into full-cover waterblocks and all, but I just can't see spending all the money on it. It would look really nice, though. Plus, I can't push this new 290X very far, as the VRM gets too hot without a full block. Can't even think about approaching 200mV or it would burst into flames, I think.

I've ordered a few new parts to make my build look a bit neater still. Hopefully it doesn't look quite so "ghetto" when I'm done. Oh well, I don't care anyway; no one besides me ever sees inside the thing.


----------



## colorfuel

Hi All,

So I did a few overclocking tests with the Ashes of the Singularity benchmark on my 290X Tri-X New Edition, running the latest 15.7.1 drivers on Windows 10 (obviously).

I ran all the tests at 1080 with default settings, which are, I suppose, what reviewers call High. I use fan settings to keep the GPU at a max temp of 75°C and VRM1 at 80°C, to prevent any throttling.

My 3570K runs at 4.5GHz.

Here are the results, using the average the benchmark shows at the end.



First of all, there is the obvious CPU bottleneck in DX11: DX12 gives me almost 100% more FPS (more like 95%).
Secondly, overclocking the memory doesn't do it any good; I have never seen any tangible improvement between running 1350 and 1500.
And finally, the CPU bottleneck comes back as soon as I reach about 1125MHz (staying with 1350 on the memory). From 1020 to 1100 there is an improvement of 5.5% for a 7.8% GPU OC. From 1100 to 1125 there is a 1.3% improvement for a 2.2% OC. And then from 1125 to 1150 there is no improvement whatsoever, so I'm assuming my system hits its sweet spot at about 1125MHz on the GPU (edit: on further tests I get 44.3 FPS at 1150/1500, but that doesn't really make a noticeable difference). Everything above that seems to be bottlenecked by the CPU again, at which point the difference between DX11 and DX12 is also at its highest: 97.3%.
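The scaling percentages above can be recomputed directly from the clocks (the FPS gains are the poster's measurements; everything else here is simple arithmetic):

```python
# Recompute the OC percentages quoted in the post and derive the
# scaling efficiency (FPS gain per percent of core overclock).
def pct_gain(new, old):
    return (new / old - 1) * 100

oc_a = pct_gain(1100, 1020)   # ~7.8% core overclock
oc_b = pct_gain(1125, 1100)   # ~2.27% core overclock
fps_a, fps_b = 5.5, 1.3       # measured FPS improvements (%)

# Efficiency falling from ~0.70 to ~0.57 suggests a growing CPU limit.
print(round(oc_a, 1), round(oc_b, 1),
      round(fps_a / oc_a, 2), round(fps_b / oc_b, 2))
```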

The test is quite sensitive to overclocks. I've been able to run my 290X at 1150/1500 in anything I threw at it with no artifacting at +106mV.

For example, here is a Fire Strike run at 1157/1500, no artifacting:

http://www.3dmark.com/3dm/8308769?

Ashes of the Singularity, however, makes my PC reboot as soon as the test starts. I've been able to run 1150/1350, but there was no way to get it to run 1150/1500 (neither DX11 nor DX12). I suspect this is due to my 580W PSU reaching its limits; this has never happened before in any game. Or maybe there is another explanation I don't know of.

Edit: I'm attaching the results as I type.

1020_1350_DX11.txt 216k .txt file


1020_1350_DX12.txt 394k .txt file


1100_1350_DX11.txt 230k .txt file


1100_1350_DX12.txt 413k .txt file


1100_1500_DX11.txt 232k .txt file


1100_1500_DX12.txt 410k .txt file


1125_1350_DX11.txt 231k .txt file


1125_1350_DX12.txt 418k .txt file


1125_1500_DX11.txt 229k .txt file


1125_1500_DX12.txt 417k .txt file


1150_1350_DX11.txt 234k .txt file


1150_1350_DX12.txt 416k .txt file


----------



## Paul17041993

Quote:


> Originally Posted by *battleaxe*
> 
> Well, I doubt that... lol
> 
> I've always been a function-over-form kind of guy. I would love to get into full-cover waterblocks and all, but I just can't see spending all the money on it. It would look really nice, though. Plus, I can't push this new 290X very far, as the VRM gets too hot without a full block. Can't even think about approaching 200mV or it would burst into flames, I think.
> 
> I've ordered a few new parts to make my build look a bit neater still. Hopefully it doesn't look quite so "ghetto" when I'm done. Oh well, I don't care anyway; no one besides me ever sees inside the thing.


Hybrid cooling can sustain a high overclock; you just need good enough heatsinks and airflow. But the main limitation will be those AIO coolers, as they're simply not capable of dissipating the 300+ watts of heat that comes off the core at high overclocks. My Kelvin T12 has the same issue, which is why I'm going to try using a CPU block in a custom water loop with a D5 pump and see how it handles.


----------



## battleaxe

Quote:


> Originally Posted by *Paul17041993*
> 
> Hybrid cooling can sustain a high overclock; you just need good enough heatsinks and airflow. But the main limitation will be those AIO coolers, as they're simply not capable of dissipating the 300+ watts of heat that comes off the core at high overclocks. My Kelvin T12 has the same issue, which is why I'm going to try using a CPU block in a custom water loop with a D5 pump and see how it handles.


Yes, exactly. My old card does fine with only an AIO; for some reason it doesn't take as much voltage and OCs pretty well considering. The VRMs aren't stressed as much then as on most 290/X cards either. But that card isn't normal; on this new one the VRMs start getting really hot without a full block. High OCs need a full block on these cards, at least on most of them. Someday...


----------



## fyzzz

I modded the PT1 BIOS to fit my card today and managed to get to 1300MHz (sure, unstable, but 1300 is 1300): http://www.3dmark.com/fs/5806107. I couldn't OC the memory at all on that BIOS and the readings were wrong, so I just checked that temps were fine and pushed further. I know I could have gotten more points if I were more stable, and could maybe have gotten 1300 more stable, but I didn't know anything about what was happening in the card. I wasn't after points, just pure core speed.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> I modded the PT1 BIOS to fit my card today and managed to get to 1300MHz (sure, unstable, but 1300 is 1300): http://www.3dmark.com/fs/5806107. I couldn't OC the memory at all on that BIOS and the readings were wrong, so I just checked that temps were fine and pushed further. I know I could have gotten more points if I were more stable, and could maybe have gotten 1300 more stable, but I didn't know anything about what was happening in the card. I wasn't after points, just pure core speed.


that's pretty much what i get in graphics score with one of my 290s at 1305 core using original bios but Win 10. i think it was an older driver than Omega i used.


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> that's pretty much what i get in graphics score with one of my 290s at 1305 core using original bios but Win 10. i think it was an older driver than Omega i used.


I just wanted the magical number 1300. The BIOS was crazy (all the readings and such were off) and I switched BIOS quickly afterwards. I think I'll keep tinkering with the 390 BIOS though; it's my point-hunting BIOS. Here is my highest score yet with it, http://www.3dmark.com/fs/5780378, and I want to beat it again







.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> I just wanted the magical number 1300. The BIOS was crazy (all the readings and such were off) and I switched BIOS quickly afterwards. I think I'll keep tinkering with the 390 BIOS though; it's my point-hunting BIOS. Here is my highest score yet with it, http://www.3dmark.com/fs/5780378, and I want to beat it again
> 
> 
> 
> 
> 
> 
> 
> .


keep the 390 bios for benching and flip the switch for gaming.









can't imagine what you'll get at 1300 with 390 bios. keep on tweaking.


----------



## diggiddi

Just recently my system started crashing upon opening Afterburner; has anyone dealt with this before?
The card is at stock speeds when AB opens, but I'm on the LN2 BIOS.


----------



## Paul17041993

Quote:


> Originally Posted by *diggiddi*
> 
> Just recently my system started crashing upon opening Afterburner; has anyone dealt with this before?
> The card is at stock speeds when AB opens, but I'm on the LN2 BIOS.


Do 3D apps and benches still run correctly at stock clocks on that BIOS? If the peak clocks are unstable, you can end up with a system that runs and boots fine but crashes whenever the clocks spike, which can happen when you open things like Afterburner.

Other than that, I can only assume the BIOS is incompatible with your card or has a bug that's triggered when Afterburner scans the card.


----------



## By-Tor

Quote:


> Originally Posted by *diggiddi*
> 
> Just recently my system started crashing upon opening Afterburner; has anyone dealt with this before?
> The card is at stock speeds when AB opens, but I'm on the LN2 BIOS.


I had this problem yesterday, and the only way I could stop it was to boot into safe mode and uninstall it, then boot normally and reinstall AB.

Has been working fine since...


----------



## fyzzz

Hmm, I thought something had gone wrong today. When I applied my 1200MHz clock I was surprised that it was not stable, no matter what voltage. I touched the PSU and it was almost burning hot. Now that things have started to cool down a bit I tested again, and it was back to full stability. I've now put a fan next to it that blows over it; hopefully that helps (stupid RM750).


----------



## Coree

Hey ppl, I've started to get crazy artifacting while browsing the internet and playing games. Downclocking the memory to 700MHz doesn't help. I'm thinking of baking the GPU (290X); should I do it?
BTW, does anybody in the EU have a spare stock 290/290X cooler I could buy?
Thx


----------



## battleaxe

Quote:


> Originally Posted by *Coree*
> 
> Hey ppl, I've started to get crazy artifacting while browsing the internet and playing games. Downclocking the memory to 700MHz doesn't help. I'm thinking of baking the GPU (290X); should I do it?
> BTW, does anybody in the EU have a spare stock 290/290X cooler I could buy?
> Thx


Lower your core clock and see what happens. Unless you just want to bake for fun I guess.


----------



## diggiddi

Quote:


> Originally Posted by *Paul17041993*
> 
> Do 3D apps and benches still run correctly at stock clocks on that BIOS? If the peak clocks are unstable, you can end up with a system that runs and boots fine but crashes whenever the clocks spike, which can happen when you open things like Afterburner.
> 
> Other than that, I can only assume the BIOS is incompatible with your card or has a bug that's triggered when Afterburner scans the card.


I'm inclined to believe the latter, since I was running the GPU core and memory at stock clocks.

Quote:


> Originally Posted by *By-Tor*
> 
> I had this problem yesterday, and the only way I could stop it was to boot into safe mode and uninstall it, then boot normally and reinstall AB.
> 
> Has been working fine since...


I'll give it a shot and see; my last fix was to uninstall and reinstall, and the problem went away for a while and came back.


----------



## Paul17041993

Quote:


> Originally Posted by *fyzzz*
> 
> Hmm, I thought something had gone wrong today. When I applied my 1200MHz clock I was surprised that it was not stable, no matter what voltage. I touched the PSU and it was almost burning hot. Now that things have started to cool down a bit I tested again, and it was back to full stability. I've now put a fan next to it that blows over it; hopefully that helps (stupid RM750).


You likely need a stronger PSU. In testing I've done in the past, an overclocked 290X plus a stock CPU can easily go past 700W total draw, so you're likely very close to the limit of your 750W even if you just have a 290.
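For context, here is a rough DC-side power budget behind that advice (a sketch only; all wattages are assumptions, not measurements from this thread):

```python
# Rough DC-side power budget for an overclocked 290X system on a 750W
# unit. Every figure here is an illustrative assumption.
gpu_w = 375    # heavily overclocked 290X, assumed upper-range estimate
cpu_w = 150    # overclocked quad-core under combined load, assumed
rest_w = 75    # motherboard, RAM, drives, fans, pump, assumed
total_dc = gpu_w + cpu_w + rest_w   # 600W against a 750W rating
headroom = 1 - total_dc / 750       # 20% margin, thin for OC spikes
print(total_dc, round(headroom * 100))
```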

Quote:


> Originally Posted by *By-Tor*
> 
> I had this problem yesterday, and the only way I could stop it was to boot into safe mode and uninstall it, then boot normally and reinstall AB.
> 
> Has been working fine since...


Sounds like an odd driver bug then; not sure whether it could be ULPS or not.
The screen stays active on the lockups though, right? That means the card is still functioning when it occurs, which could point to driver memory corruption. If the screen goes all stripy and/or blacks out, it means the card went unstable and shut down, which would be a voltage-vs-clock issue: either a driver bug, or Afterburner for some reason sending bad settings over.


----------



## By-Tor

Quote:


> Originally Posted by *Paul17041993*
> 
> Sounds like an odd driver bug then; not sure whether it could be ULPS or not.
> The screen stays active on the lockups though, right? That means the card is still functioning when it occurs, which could point to driver memory corruption. If the screen goes all stripy and/or blacks out, it means the card went unstable and shut down, which would be a voltage-vs-clock issue: either a driver bug, or Afterburner for some reason sending bad settings over.


The screen just went blank after boot-up... I couldn't get to MSI AB fast enough before it would happen. I think it was the high memory overclock I was attempting and it just didn't like it. All has been normal since the reinstall.


----------



## Paul17041993

Quote:


> Originally Posted by *By-Tor*
> 
> The screen just went blank after boot-up... I couldn't get to MSI AB fast enough before it would happen. I think it was the high memory overclock I was attempting and it just didn't like it. All has been normal since the reinstall.


Ah, that's simply due to an unstable overclock; it occurs if you have CCC apply-at-startup enabled too. So when overclocking you should disable CCC and AB or Trixx startup settings until you know the clocks are fully stable.


----------



## battleaxe

Quote:


> Originally Posted by *By-Tor*
> 
> The screen just went blank after boot-up... I couldn't get to MSI AB fast enough before it would happen. I think it was the high memory overclock I was attempting and it just didn't like it. All has been normal since the reinstall.


Yes, and if you have AB set to restore clocks on restart, you may need to go into safe mode and uninstall AB to get things back to normal. Make sure to delete the settings too; then reboot, reinstall AB, and you're back in business. Whenever AB starts up and resets the memory to a clock it cannot sustain, it will blackscreen. I know of no way out of this loop other than uninstalling AB; unless someone else knows one, in which case I'm all ears.


----------



## By-Tor

Quote:


> Originally Posted by *battleaxe*
> 
> Yes, and if you have AB set to restore clocks on restart, you may need to go into safe mode and uninstall AB to get things back to normal. Make sure to delete the settings too; then reboot, reinstall AB, and you're back in business. Whenever AB starts up and resets the memory to a clock it cannot sustain, it will blackscreen. I know of no way out of this loop other than uninstalling AB; unless someone else knows one, in which case I'm all ears.


Yeah, I'm sure that's what was going on. I booted into safe mode and uninstalled AB and all its files.


----------



## diggiddi

In my situation the screen would freeze, the sound would loop, and I had to do a hard reset.


----------



## Coree

Quote:


> Originally Posted by *battleaxe*
> 
> Lower your core clock and see what happens. Unless you just want to bake for fun I guess.


Well, lowered core to 800 mhz and memory 700 mhz. Clearly a faulty card.


----------



## spyshagg

So, funny story happened to me.

I spent a Saturday finding the stable overclock limits of my two 290X's individually, using Firestrike, Heaven, FFXIV, and Dirt Rally.
The result was *1200/1620* on each card. *Fully stable in Firestrike and every benchmark individually.* But with crossfire I had black screens and the PSU shut down sometimes.
Yesterday I received my new Super Flower 1200W Platinum PSU. Great, now let's try *1200/1620 in crossfire again!*

I launched Firestrike. Perfect, super stable.

I launched Project Cars, and when starting the race, BLACK SCREEN.

I launched Assetto Corsa and when starting the race, BLACK SCREEN.

I launched Firestrike again, Perfect, Super stable. WHAT IS GOING ON!!!!!!!!!

I realized all my tests to find the stable overclock were done with a single monitor at Full HD 1080p, and all my games are using Eyefinity 3K resolution. So I dropped my games to Full HD.

I launched Project Cars, no black screens.

I launched Assetto Corsa, no black screens.

SON OF A GUN!!

I thought to myself, maybe the problem is Eyefinity. So I disabled Eyefinity and activated VSR to downscale 4K to Full HD.

I launched Project Cars, BLACK SCREEN!

I launched Assetto Corsa, BLACK SCREEN!

Conclusion: all my efforts to find the max overclock were done at Full HD, but the cards black screen almost immediately when I increase the VRAM load!
Now I have to start everything all over again, this time testing stability with VSR 4K instead of Full HD.


----------



## Roboyto

Quote:


> Originally Posted by *spyshagg*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> So, funny story happened to me.
> 
> I spend a saturday finding the stable Overclock limits of my two 290x's individually. Using firestrike, heaven, FFIV, Dirt rally.
> The result was *1200/1620* on each card. *Fully stable on Firestrike and every benchmark individually.* But with crossfire I had blackscreens and the PSU shut down sometimes.
> Yesterday I received my new Superflower 1200watts platinum PSU. Great, now lets try *1200/1620 in crossfire again!*
> 
> I launched Firestrike. Perfect, super stable.
> 
> I lauched Project Cars, and when starting the race, BLACK SCREEN.
> 
> I launched Assetto Corsa and when starting the race, BLACK SCREEN.
> 
> I launched Firestrike again, Perfect, Super stable. WHAT IS GOING ON!!!!!!!!!
> 
> I realized, all my tests to find the stable overclock were done with a single monitor at Full HD 1080P. And all my games are using eyefinity 3K resolution. So, i dropped my games to full hd.
> 
> I launched Project Cars, no black screens.
> 
> I launched Asseto Corsa, no black screens.
> 
> SON OF A GUN!!
> 
> I thought to my self, maybe the problem is eyefinity. So I disabled eyefinity, and activated VSR to downscale 4k to full hd.
> 
> I launched Project Cars, BLACK SCREEN!
> 
> I launched Assetto Corsa, BLACK SCREEN!
> 
> 
> 
> Conclusion. All my efforts to find the max overclock were done at Full HD. But they blackscreen almost immediately when I increase the Vram load!
> Now I have to start everything all over again, this time testing stability with VSR 4K instead of Full HD.


Black screen is most likely caused by the VRAM clocks. I would just start dialing back your VRAM clocks until the issue goes away. If you start seeing artifacts, then increase your volts or power limit a little; assuming you don't have them both maxed already.


----------



## velocityx

Quote:


> Originally Posted by *spyshagg*
> 
> So, funny story happened to me.
> 
> I spend a saturday finding the stable Overclock limits of my two 290x's individually. Using firestrike, heaven, FFIV, Dirt rally.
> The result was *1200/1620* on each card. *Fully stable on Firestrike and every benchmark individually.* But with crossfire I had blackscreens and the PSU shut down sometimes.
> Yesterday I received my new Superflower 1200watts platinum PSU. Great, now lets try *1200/1620 in crossfire again!
> *
> 
> I launched Firestrike. Perfect, super stable.
> 
> I lauched Project Cars, and when starting the race, BLACK SCREEN.
> 
> I launched Assetto Corsa and when starting the race, BLACK SCREEN.
> 
> I launched Firestrike again, Perfect, Super stable. WHAT IS GOING ON!!!!!!!!!
> 
> I realized, all my tests to find the stable overclock were done with a single monitor at Full HD 1080P. And all my games are using eyefinity 3K resolution. So, i dropped my games to full hd.
> 
> I launched Project Cars, no black screens.
> 
> I launched Asseto Corsa, no black screens.
> 
> SON OF A GUN!!
> 
> I thought to my self, maybe the problem is eyefinity. So I disabled eyefinity, and activated VSR to downscale 4k to full hd.
> 
> I launched Project Cars, BLACK SCREEN!
> 
> I launched Assetto Corsa, BLACK SCREEN!
> 
> Conclusion. All my efforts to find the max overclock were done at Full HD. But they blackscreen almost immediately when I increase the Vram load!
> Now I have to start everything all over again, this time testing stability with VSR 4K instead of Full HD.


Just recently went through the same thing with my CFX setup. Check what VRM 1 temp you get in FHD with a single monitor vs. the VRM temp when it crashes. In my case, one of my 290s black screens shortly after it reaches 94C on VRM 1 temp. I keep it at 88 and it's super stable. I was getting black screens in BF4, PCars, GTA V with supersampling. No black screens in Firestrike and Witcher 3. Turned out Witcher was vsynced so the GPUs weren't fully loaded; Firestrike I dunno. But it was VRM 1 temp (the long strip near the end of the card).
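If you log temps while you play, that rule of thumb is easy to check afterwards. A hypothetical Python sketch; the 94C figure is just the crash point observed on this one card, yours may differ:

```python
# Flag when VRM 1 temperature first crosses a black-screen threshold,
# given a list of (seconds, temp_c) samples from a monitoring tool.
BLACKSCREEN_TEMP_C = 94  # observed crash point on this card (assumption for yours)

def first_crossing(samples, limit=BLACKSCREEN_TEMP_C):
    """Return the timestamp at which the temp first reaches `limit`,
    or None if it never does."""
    for seconds, temp_c in samples:
        if temp_c >= limit:
            return seconds
    return None

# Example: temps rising during a long game session
log = [(0, 60), (300, 80), (600, 88), (900, 94), (1200, 95)]
print(first_crossing(log))       # 900 -> trouble about 15 minutes in
print(first_crossing(log, 100))  # None -> never reached 100C
```

If the crossing lines up with when your black screens happen, more airflow over the VRMs (or a lower clock) is the fix, not more voltage.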


----------



## Forceman

Quote:


> Originally Posted by *Roboyto*
> 
> Black screen is most likely caused by the VRAM clocks. I would just start dialing back your VRAM clocks until the issues goes away. If you start seeing artifacts then increase your volts or power limit a little; assuming you don't have them both maxed already.


Yeah, I doubt you'll see any real game performance gain from those sky-high memory clocks anyway.


----------



## spyshagg

Quote:


> Originally Posted by *Roboyto*
> 
> Black screen is most likely caused by the VRAM clocks. I would just start dialing back your VRAM clocks until the issues goes away. If you start seeing artifacts then increase your volts or power limit a little; assuming you don't have them both maxed already.


Quote:


> Originally Posted by *velocityx*
> 
> Just recently went through the same thing with my cfx setup. Check what vrm 1 temp you get in fhd single monitor vs vrm tmp when it crashes. In my case, one of my 290s black screens shortly after it reaches 94c on vrm 1 temp. I keep it at 88 and its super stable. I was getting black screens in bf4, pcars, gta v with super sampling. No black screens in firestrike and witcher 3. Turned out witcher
> was vsynced so gpus werent fully loaded, firestrike i donno. But it was vrm 1 temp ( the long strip near the end of the card )


Testing 1 gpu with vsr

1200/1250mem 137mv blackscreen on pcars
1000/1600mem 137mv blackscreen on pcars
1000/1250mem 0mv OK

GPUs are watercooled. The VRM on this card doesn't go above 45 degrees C.

Edit:
1200/1250mem 100mv no blackscreen so far (5 minutes on pcars)

This does not make sense. I have a Super Flower Platinum 1200W and a full-cover water block.


----------



## spyshagg

This is silly.

Half an hour at 1200/1250mem 100mv (37mv less than Firestrike needs!!) without black screening on PCars, and still going. But at 137mv it instantly black screens on PCars with VSR.

This card does a 15k graphics score in Firestrike at 1230/1630mem 150mv without breaking a sweat. What's up with VSR that screws this up?

I'm going to bed lol


----------



## Roboyto

Quote:


> Originally Posted by *Forceman*
> 
> Yeah, I doubt you'll see any real game performance gain from those sky-high memory clocks anyway.


I did a test a ways back on the effects of RAM clocks at 1080p, and for the few benches I used it averaged out to ~5% going from 5GHz to nearly 7GHz effective RAM speed.

Not a whole lot at all...can be seen here in this post in the VRAM CLK Performance Tab: http://www.overclock.net/t/1456279/honey-i-shrunk-the-ultra-tower-beast-my-journey-to-creating-a-more-compact-pc-with-an-r9-290/20_20#post_21847939

I have been meaning to retest that theory with more benchmarks at 3K eyefinity resolution, just never got around to it.
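The raw numbers behind those figures are easy to sanity-check: GDDR5 moves data at 4x the memory clock, and Hawaii has a 512-bit bus, so stock 1250MHz memory is the "5GHz" people quote. A rough Python sketch of the arithmetic:

```python
# Back-of-envelope GDDR5 numbers for Hawaii (290/290X).
BUS_WIDTH_BITS = 512  # Hawaii's memory bus width

def effective_gbps(mem_clock_mhz):
    """GDDR5 is quad-pumped, so effective data rate is 4x the clock."""
    return mem_clock_mhz * 4 / 1000

def bandwidth_gbs(mem_clock_mhz, bus_bits=BUS_WIDTH_BITS):
    """Peak bandwidth in GB/s: data rate times bus width in bytes."""
    return effective_gbps(mem_clock_mhz) * bus_bits / 8

print(effective_gbps(1250))  # 5.0  -> stock 290X memory, the "5GHz"
print(bandwidth_gbs(1250))   # 320.0 GB/s at stock
print(effective_gbps(1750))  # 7.0  -> the "nearly 7GHz" overclock
```

With 320GB/s already on tap at stock, the core usually runs out of steam before the memory does, which is consistent with the small ~5% gain measured above.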

Quote:


> Originally Posted by *spyshagg*
> 
> This is silly.
> 
> Half an hour at 1200/1250mem 100mv (37mv less than firestrike needs!!) without blackscreening on pcars and still going. But at 137mv it instantly blackscreens on pcars with vsr.
> 
> This card does 15k graphics on firestrike at 1230/1630mem 150mv without sweating. Whats up with VSR that screws this up.
> 
> Im going to bed lol


It could very well be a flaw with VSR, but you also have to consider how much more load you are putting on the GPU, and the system in general, when you increase the resolution; which you're doing artificially. What resolution are you running Firestrike at? Are you using VSR in Firestrike as well? If not, then it isn't an 'apples-to-apples' comparison.

I've said this 100 times and I will say it again. A benchmark is just that, a benchmark...they are a good test but definitely NOT a guarantee for stability in actual use. Aaannnd, there is also no guarantee when overclocking...Peak clocks for one program typically don't work in all programs...suck it up and keep testing until you find what works. Just look at that spreadsheet I have in the link above for how many times I ran benches to figure out what worked the best; and that was just the one 290, I have owned 3 others and setup several more for friends.

If you do want a program that will essentially guarantee stability, then you need to try my personal 'be all, end all' stress test: the FFXIV Benchmark. Find your max clocks in the FFXIV benchmark and I can nearly guarantee you will have stability in anything else; it has never failed to do so for me. If my overclock passes FFXIV without issue it will run: 3DMark, Unigine, Furmark, Crysis _, Far Cry ___, Wolfenstein, Unreal Tournament Alpha, Tomb Raider, Batman Arkham ___, anything by Blizzard, Borderlands _, Trine _, Bioshock _, Witcher _, etc. etc. And that is at 3K resolution where the aforementioned games would allow, and has spanned Win7, Win8/8.1 and now Win10.

I don't, and wouldn't suggest, running at the absolute threshold all the time...benching like that for some e-peen bragging is OK. But once you find the brink, dial it back ~5% so you know it will have stability and some longevity. The card that all that data came from is still in my rig, has never been removed and hasn't given me any troubles while running at the clocks/voltage I found were a good compromise to get most of the performance without abusing the hell out of it; i.e. sweet spot.

Read through this and see if you find anything you haven't tried to get things working appropriately: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781

Few highlights since that is a long read:

- Driver scrub with DDU & CCleaner.
- Latest drivers, or try beta drivers, or even one driver revision older; scrub between installs.
- Do NOT use CCC & Overdrive... if you are, disable it and pick a 3rd party program. If one doesn't work, uninstall and try another.
- In that 3rd party program make sure you: (A) Force Constant Voltage, (B) Disable ULPS, (C) DON'T allow it to boot/load OC settings on startup.
- If you are using MSI Afterburner, then AUX voltage can sometimes assist with RAM overclocking/stability in general.
- Try forcing tessellation off in CCC.
- Uninstall and reinstall the GPU, ensure the PCIe slot is clear of dust/debris, and check all cable connections.
- How long since a fresh Windows install? Upgraded to Win8/Win10 recently? If so, use the built-in function in Win8/Win10 to do a refresh/reset; upgrades can cause issues.
- Don't want to do a fresh install? Try Windows SysPrep. This eradicates all drivers from the computer and re-runs the preliminary Windows driver install. It's particularly handy if you need to upgrade an older system to newer hardware... it allows you to keep the current Windows install with all programs, files, and user accounts intact... priceless: http://www.sevenforums.com/tutorials/135077-windows-7-installation-transfer-new-computer.html

Just got an e-mail from MonoPrice again today...4K 28" monitor for $399. Who needs VSR at this price?

http://www.monoprice.com/Product?c_id=113&cp_id=11307&cs_id=1130703&p_id=12156&seq=1&format=2&cl=res&utm_source=150826_email_4k&utm_medium=email&utm_content=4k_bis_hero&utm_campaign=150826_4k_bis


----------



## kizwan

@spyshagg
IMO testing one GPU at a time for max stable clocks only adds more work, because eventually you're going to run them in crossfire anyway. That's why I test max clocks in crossfire. 1600MHz for the memory may be pushing it though. 1200 on the core may also be pushing it, because not all 290's can do 1200 gaming stable. You can try aiming between 1100 and 1150 for the core & 1300 to 1500 for the memory.

Regarding VSR, a higher resolution is going to put more load on the GPU than a lower resolution. Benchmarks & games also stress the GPU differently: benches usually consist of multiple short tests, while games stress the GPU almost continuously.

My cards, when playing BF4 at 1150/1500 to 1150/1600, crashed within 30 to 45 minutes. At 1110/1500, BF4 was still running fine after one hour.
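That narrowing-down approach can be sketched in a few lines. Hypothetical Python only; `run_test` stands in for an hour of BF4 or whatever your torture test is, there is no real API here:

```python
# Step the core clock down from the top of the suggested window until a
# (stubbed) stability test passes. run_test(mhz) is a placeholder you
# would replace with an actual manual or scripted test run.
def find_stable_core(run_test, start=1150, floor=1100, step=10):
    clock = start
    while clock >= floor:
        if run_test(clock):
            return clock  # first clock that survived the test
        clock -= step
    return None  # nothing inside the window was stable

# Example with a fake test that only "passes" at 1110MHz and below:
print(find_stable_core(lambda mhz: mhz <= 1110))  # 1110
```

The same loop works for memory with a 1300-1500 window and bigger steps.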


----------



## fyzzz

You are still running the 390X BIOS, right @spyshagg? The 390X BIOS can be tricky at high overclocks. It can run fine at a certain overclock and suddenly black screen, and sometimes the black screen can go away when you apply less voltage. Sounds strange, I know, but I've noticed that behavior before.


----------



## spyshagg

It's not VSR per se, it's the higher resolutions, 3K and 4K, in either Eyefinity or VSR.

@*Roboyto*
The only option I haven't used is "force constant voltage".
I used several BIOSes and tested all of them using *FFXIV, Firestrike, Heaven and Dirt Rally. But I ran all of them @ 1080p, my native screen.*
Firestrike was the absolute ass kicker for me, showing artifacts and locking up 20 or 30mhz below where FFXIV and Heaven did.

@*fyzzz*
My first thoughts exactly; that's why I re-flashed the stock BIOS and came to the conclusion the behaviour is the same, the difference being it takes a minute to black screen instead of doing so instantly.

@*kizwan*
This is all understandable. More resolution = more memory load = different stability results. Will have to retest everything from scratch.

What isn't understandable is why, at higher resolutions, my 290x black screens @1200mhz/137mv but is stable @1200mhz/100mv, when all my Full HD tests told me I need 137mv to be stable @1200mhz.


----------



## Agent Smith1984

Black screens are usually VRAM issues, even if it's not overclocked.

Instead of adding core voltage, try adding 25-50mv of AUX voltage.

The 390 series seems to LOVE AUX voltage; it may work on the 290 also.


----------



## spyshagg

To my dismay, 1200/1250mem 100mv is stable on this card. Ahoy! Let's move on to the memory.

Does someone have the straps list for the 290x? I saw it somewhere on this forum.


----------



## fyzzz

Quote:


> Originally Posted by *spyshagg*
> 
> To my dismay 1200/1250mem 100mv is stable on this card. Ahoy! lets jump to the memory.
> 
> Does someone have the straps list for the 290x? I saw it somewhere on this forum.


Hawaii MC strapping
- 150-400MHz
- 401-800MHz
- 801-900MHz
- 901-1000MHz
- 1001-1125MHz
- 1126-1250MHz
- 1251-1375MHz
- 1376-1500MHz
- 1501-1625MHz
- 1626-1750MHz
Taken directly from the Hawaii BIOS editing thread.
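If you don't want to eyeball the list every time, a quick Python lookup with the same ranges (the ranges matter because the memory controller's timings change when your clock crosses into a new strap):

```python
# Hawaii memory-controller straps, as listed in the BIOS editing thread.
STRAPS = [(150, 400), (401, 800), (801, 900), (901, 1000), (1001, 1125),
          (1126, 1250), (1251, 1375), (1376, 1500), (1501, 1625), (1626, 1750)]

def strap_for(mem_clock_mhz):
    """Return the (low, high) strap range covering a memory clock,
    or None if the clock is outside every listed range."""
    for low, high in STRAPS:
        if low <= mem_clock_mhz <= high:
            return (low, high)
    return None

print(strap_for(1250))  # (1126, 1250) -> stock 290X memory sits at the top of a strap
print(strap_for(1600))  # (1501, 1625) -> a 1600MHz overclock lands here
```

So 1250 to 1251 is a strap change, which is one reason a tiny memory bump can behave very differently.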


----------



## Roboyto

Quote:


> Originally Posted by *spyshagg*
> 
> To my dismay 1200/1250mem 100mv is stable on this card. Ahoy! lets jump to the memory.
> 
> Does someone have the straps list for the 290x? I saw it somewhere on this forum.


As I and @kizwan said, if you bench at one resolution and try to play games at another, it's not an apples-to-apples comparison. Keep playing around with it and you'll find the right blend of settings.

My 290 loves 1200/1500 +100mV +50% power in Trixx for 3K eyefinity.

The gains from memory OC are very small, so if you don't end up with much it's not killing your performance. 1200 on the core, with a single 290, has been enough for me to play even newest games with a reasonable compromise of eye candy settings at 3K. Still looks great, but maxed everything isn't plausible without moAr GPU oomph.

Also tick that Force Constant Voltage, it definitely makes a difference.


----------



## sinholueiro

Hi! I will build a 1440p computer and I'm looking at used 290Xs, because they seem the best way to go. I found a PowerColor LCS 290X, but I will need to put an air cooler on it until I buy the watercooling stuff. Does anyone know if it's a reference design, or which air coolers I can install on it?


----------



## By-Tor

Quote:


> Originally Posted by *sinholueiro*
> 
> Hi! I will build a 1440p computer and I'm looking to used 290X, because they seems the best way to go. I find a PowerColor LCS 290X, but I will need to put a air cooler while I don't buy the watercooling stuff. Anyone knows if it's a reference design or what air coolers I can intall to it?


I'm running 2 of the PowerColor LCS cards and they run great together on water. Depending on where you're located, if you're not going to water cool the card I would sell it and pick up an R9 390 card.


----------



## sinholueiro

Quote:


> Originally Posted by *By-Tor*
> 
> I'm running 2 of the Powercolor LCS cards and they run great together on water. Depending on where your located if your not going to water cool the card I would sell it and pickup an R9 390 card.


I didn't buy it yet, I was thinking about it. I will watercool, for sure, but I will purchase the PC at Christmas and buy the watercooling stuff around April/May. Some questions:
1. The core seems pretty good; how about the memory? I wanna push it to 1500MHz.
2. Which brand is the memory?
3. I assume that at the same clocks I will get more or less the same performance as a 390X, right?
4. The price is 290€. With the air cooler, it's 340€. It has no warranty. A 390 is 350€ and a 290X 370€. Keeping in mind that it comes with the waterblock and the backplate, it's a good deal, right?


----------



## brazilianloser

I couldn't even get 100 bucks for my 290 on eBay... what a sad, sad day. The card lost value fast.


----------



## JourneymanMike

Quote:


> Originally Posted by *brazilianloser*
> 
> I couldn't even get 100 bucks out of my 290 from ebay... what a sad sad day. Card lost value fast.


Just for grins, I checked eBay for used R9 290's...

There are some used ones that are being bid up to over $200...







<---- grin

There's more that are around $100, but they have days of bidding left...

Maybe you just had the wrong combo of time and starting price...


----------



## brazilianloser

Quote:


> Originally Posted by *JourneymanMike*
> 
> Just for grins, I checked eBay for used R9 290's...
> 
> There area some used ones that are being bid on for over $200...
> 
> 
> 
> 
> 
> 
> 
> <---- grin
> 
> There's more that are around $100, but have days of biding left...
> 
> Maybe you just had the wrong combo of time and starting price...


Guess so... But from what I'm seeing now, I think you are looking at 290X's... not seeing any 290 that is above 150 other than the non-reference fancy stuff. Oh well, I made enough from all my other parts to compensate... and it's not like I need it anyway, thanks to PayPal increasing my credit drastically, forcing me to upgrade to a new X99 build.


----------



## JourneymanMike

Quote:


> Originally Posted by *brazilianloser*
> 
> Guess so... But from what I am seeing it now I think you are looking at 290x... not seeing any 290 that is above 150 other than the non reference fancy stuff. Oh well though made enough from all my other parts to compesate... and not like I need it anyways thanks to paypal increasing my credit drasticaly forcing me to upgrade to a new x99 build.


Ok, I guess I didn't look specifically for reference cards...









Actually I'd rather have a reference card (I have 2 290X's) than non-reference; it's easier to find a reference water block...

Both of mine have full blocks on them...


----------



## kizwan

Quote:


> Originally Posted by *brazilianloser*
> 
> I couldn't even get 100 bucks out of my 290 from ebay... what a sad sad day. Card lost value fast.


Probably because the card is an Asus. Many people avoid getting Asus cards.


----------



## By-Tor

Quote:


> Originally Posted by *sinholueiro*
> 
> I didn't buy it yet, I was thinking in. I will watercool, for sure, but I will purchase the PC in xmas and buy the watercooling stuff about april/may. Some questions:
> 1. The core seems pretty good, how about the memos? I wanna push it to 1500Mhz.
> 2. Which is memo's brand?
> 3. I assume that at the same clocks I will get more or less the same performance of a 390X, right?
> 4. The price is 290€. With the air cooler, is 340€. It has no waranty. A 390 is 350€ and a 290X 370€. Keeping in mind that it comes with the waterblock and the backplate, it's a good deal, right?


I have run both my cards at 1220/1625 for benching and leave them at 1125/1500 for everyday use and gaming..

Not 100% sure on the memory, but I think they are both Elpida.

I purchased my LCS cards for $540 for the pair and feel that was a great deal..

Here they are in my loop..


----------



## sinholueiro

Quote:


> Originally Posted by *By-Tor*
> 
> I have run both my cards at 1220/1625 for benching and leave them at 1125/1500 for everyday use and gaming..
> 
> Not 100% sure on the memory, but I think they are both Elpida.
> 
> I purchased my LCS cards for $540 for the pair and feel that was a great deal..
> 
> Here they are in my loop..


1100/1500 is all I wish for. Thanks for the photo! I don't like the case, but the color scheme is really good and the block for the CPU is really sick. Such nice work!


----------



## By-Tor

It's a great case....

Here's a shot of the block lit up..


----------



## sinholueiro

Quote:


> Originally Posted by *By-Tor*
> 
> It's a great case....
> 
> Here's a shot of the block lit up..


I know, but I prefer others like the H440.


----------



## Ized

Quote:


> Originally Posted by *By-Tor*
> 
> I have run both my cards at 1220/1625 for benching and leave them at 1125/1500 for everyday use and gaming..
> 
> Not 100% sure on the memory, but I think they are both Elpida.
> 
> I purchased my LCS cards for $540 for the pair and feel that was a great deal..
> 
> Here they are in my loop..


Neat









I'm curious what BIOS version came on your LCS cards?


----------



## By-Tor

Quote:


> Originally Posted by *Ized*
> 
> Neat
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Im curious what Bios version came on your LCS cards?


I have no idea right at this moment and really haven't checked on any of my cards in a long time.

I can take a look when I get home...


----------



## Ized

Quote:


> Originally Posted by *By-Tor*
> 
> I have no idea right at this moment and really never checked on any of my cards in along time.
> 
> I can take a look when I get home...


That would be great, thank you; even a GPU-Z screenshot would work.


----------



## spyshagg

Quote:


> Originally Posted by *Roboyto*
> 
> As I and @kizwan said..if you bench at one resolution and try to play games at another it's not an apples-to-apples comparison. Keep playing around with it and you'll find the right blend of setting s.
> 
> My 290 loves 1200/1500 +100mV +50% power in Trixx for 3K eyefinity.
> 
> The gains from memory OC are very small, so if you don't end up with much it's not killing your performance. 1200 on the core, with a single 290, has been enough for me to play even newest games with a reasonable compromise of eye candy settings at 3K. Still looks great, but maxed everything isn't plausible without moAr GPU oomph.
> 
> Also tick that Force Constant Voltage, it definitely makes a difference.


Yeah, my 290x black screens either above 100mv or above 1350mem in games with Eyefinity/VSR.

It still does Firestrike 1080p artifact-free @ 1200/1620 137mv (including the long-ass annoying Firestrike DEMO).

AUX voltage and Constant Voltage produced zero results, unfortunately.

Lets see how my second card behaves!


----------



## KingCry

Well, time to finally join the club after owning my Lightning for over a month now.



I was able to get 1245 core and 1500 memory.

It has Samsung memory on it as well, which was a first for any AMD card I've owned so far.


----------



## By-Tor

Quote:


> Originally Posted by *Ized*
> 
> That would be great thank you, even a GPU-Z screenshot would work


Here ya go...


----------



## Gumbi

Per Agent Smith's success with high-airflow fans and his 390, I took the liberty of massively improving my case airflow. I now have 2 Noctua 120mm 2000 RPM fans as my front intakes, one side intake (120mm 2K Noctua again), a rear exhaust at 2K RPM 140mm, and a Nanoxia 1100 RPM 140mm bottom intake.

I did it per Agent Smith's success, but was also spurred on by the fact that currently the central fan on my Vapor-X 290 is not functional (working on a fix), so my GPU temps were MASSIVELY affected. Here's a bench run of Heaven at 1175/1600 at 75mv for 25-30 mins. VRMs are icy cool as always, but the core is a tad warm for my liking. Thankfully I am confident in putting it down to the lack of the central GPU fan spinning (probably the most important fan for the core, as it sits right on top of it!).







Core 77, VRMs at 68/57

They are hooked up to a fan controller too, so they can be silenced with the flick of a slider.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Gumbi*
> 
> Per Agent Smith's success with high airflow fans and his 390, I took the liberty of massively improving my case airflow. I now have 2 Noctua 120mm 2000 RPM fans as my front intakes, one side intake (120mm 2k Noctua again), a rear exhaust at 2k RPM 140mm, and a Nanoxia 1100 RPM 140mm bottom intake.
> 
> I did it per Agent Smith's success, but was also spurred on by the fact that currently the central fan on my Vapor X 290 is not functional (working on a fix), so my GPU temps were MASSIVELY affected. Here's a bench run of Heaven at 1175/1600 at 75mv for 25-30 mins. VRMs are icy cool as always, but the core is a tad warm for my liking. Thankfully I am confident in putting it down to the lack of the central GPU fan spinning (probably the most important for the core as it sits right on top of it!
> 
> 
> 
> 
> 
> 
> 
> Core 77, VRMs at 68/57
> 
> They are hooked up to a fancontroller too, so can be silenced with the flick of a slider


nice...much better temps than last time


----------



## By-Tor

My best 3DMark score ever...


----------



## Liranan

Hi everyone, I'm intending to replace my 6870 with a 290 and was looking at the PCS+ but was told that PowerColour is the wrong brand to get due to voltage control.

Here is the thread I posted.
http://www.overclock.net/t/1571461/which-290-to-get/10#post_24354677

Could you guys help me, please?


----------



## kizwan

Quote:


> Originally Posted by *By-Tor*
> 
> My best 3DMark score ever...


Score detective reporting in....

Your score seems low for those clocks. Is ULPS enabled?

My scores:-
1100/1300 http://www.3dmark.com/fs/5608308
1100/1600 http://www.3dmark.com/fs/5608433


----------



## Ized

Thankyou By-Tor


----------



## By-Tor

No, it's disabled.

For whatever reason I've never gotten great scores...


----------



## kizwan

Quote:


> Originally Posted by *By-Tor*
> 
> No its disabled.
> 
> For whatever reason I've never gotten great scores...


Power limit? In my view PowerColor cards always underperform somehow. It happened in the past in this thread & happened again today. Try monitoring the voltage on both cards. Try setting GPU-Z to log to file while you run Firestrike.
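Once you have a GPU-Z log, something like this can summarize a column so you can spot voltage droop under load. A hedged sketch: the column names in the log vary by card and GPU-Z version, so "VDDC" below is an assumption, check the header line of your own log for the exact name:

```python
import csv

def column_stats(path, column="VDDC"):
    """Return (min, max) of one numeric column in a GPU-Z sensor log,
    or None if no usable values were found. GPU-Z logs are CSV-like
    with spaces after the commas, hence skipinitialspace."""
    values = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f, skipinitialspace=True):
            try:
                values.append(float(row[column]))
            except (KeyError, TypeError, ValueError):
                continue  # skip blank or malformed rows
    return (min(values), max(values)) if values else None
```

If the minimum voltage during a Firestrike run dips well below what you set, that points at power delivery (or a power limit kicking in) rather than the clocks themselves.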


----------



## Ized

Quote:


> Originally Posted by *kizwan*
> 
> Power limit? In my view powercolor always underperform somehow. Happened in the past in this thread & happened again today. Try monitor voltage for both card. Try set GPU-Z to log to file while you running Firestrike.


The LCS is a reference card though; the waterblock, a bit of extra voltage by default, and slightly different clocks are the only differences.

I'd be interested in seeing the Firestrike results for each of the cards on their own.

I personally suffer from wildly differing scores, a few reboots normally gets me back in line with everyone else though - still pretty frustrating.

And to add fuel to your speculation, yes my card is also Powercolor


----------



## Gumbi

Quote:


> Originally Posted by *mfknjadagr8*
> 
> nice...much better temps than last time


Just benched 1200/1650 at 100mv. Seems OK. 100% GPU fan, 5 mins in Heaven: 74 on core, 60 on VRM 1 and 58 or so on VRM 2.

Have to overshoot with the GPU fan for the moment to stop the core getting warm, as the central fan, conveniently the one right on top of the GPU core, is not spinning at all lol.

Super happy with these fans, the VRMs loooove airflow. Can't believe I'm putting 100mv through them and they're not even flinching.

Gonna play with this a bit more. The core likes to be cool. I put 25mv aux through it and it was artefacting a bit; I removed that and the degree or two it dropped sorted it (same clocks). Might be able to bench artefact-free at 1225mhz or more, especially with 200mv via Trixx.


----------



## Gumbi

Working on my overclock. 1175/1600 at 75mv. 1200/1650 can be benched at 100mv, but it artefacts for a few mins and then the display driver crashes.


----------



## Streetdragon

I have a little question!
One of my 2 R9 290 Vapor-X cards died. Now I wanna go crossfire again, but I can't get a new Vapor-X. All sold out -.-
Now my question:
Can i crossfire my R9 290 Vapor-X with a "Sapphire AMD Radeon R9 390 OC Tri-X NITRO" ?

Thx for answer


----------



## fyzzz

Quote:


> Originally Posted by *Streetdragon*
> 
> I have a little question!
> One of my 2 r9 290 Vapor-X died. Now i wanna go crossfire again, but i cant get a new Vapor. All sold out-.-
> Now my question:
> Can i crossfire my R9 290 Vapor-X with a "Sapphire AMD Radeon R9 390 OC Tri-X NITRO" ?
> 
> Thx for answer


I think you could, but the 8GB on the 390 would be useless then.


----------



## battleaxe

Quote:


> Originally Posted by *fyzzz*
> 
> I think you could, but the 8gb on the 390 would be useless then.


Only in old games. If using Win10 DX12 and a game that's written for it, the 8GB (or would it be 12GB then?) could then be used. As I understand it anyway...


----------



## sinnedone

Quote:


> Originally Posted by *battleaxe*
> 
> Only on old games. If using Win10 DX12 and a game that's written for it, the 8GB (or would it be 12GB then?) could then be used. As I understand it anyway...


That's only if developers code their games like that, and as far as I've read it's not really a given.


----------



## Streetdragon

Besides the different RAM size, what about the different clocks? Can I clock them differently?


----------



## battleaxe

Quote:


> Originally Posted by *sinnedone*
> 
> thats only if developers code their games like that, and as far as I've read it's not really a given.


That's what I just said... "and a game that's written for it".









Quote:


> Originally Posted by *Streetdragon*
> 
> Beside of the different ram size, what about the different clocks? can i clock them differently?


Yes


----------



## sinnedone

Quote:


> Originally Posted by *sinnedone*
> 
> thats only if developers code their games like that, and as far as I've read *it's not really a given.*


Quote:


> Originally Posted by *battleaxe*
> 
> That's what I just said.,.. "and a game that's written for it".


lol

You made it sound a little bit too positive.









With all the crappy PC ports we've been getting I highly doubt it will be implemented much if at all.


----------



## battleaxe

Quote:


> Originally Posted by *sinnedone*
> 
> lol
> 
> With all the crappy PC ports we've been getting I highly doubt it will be implemented much if at all.


Well... you may have a point there.


----------



## Aussiejuggalo

How are the 15.7.1 drivers? Is it worth upgrading from 15.6?


----------



## battleaxe

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> How are the 15.7.1 drivers, is it worth an upgrade from 15.6?


IDK... I've been on 15.7.1 since it came out. Today I switched to the newest BETA to try it. It seems to work just fine so far too. I haven't tried benching yet to see if it does any better.


----------



## LandonAaron

Has anyone tried the 15.8 beta driver yet? http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx


----------



## battleaxe

Quote:


> Originally Posted by *LandonAaron*
> 
> Has anyone tried the 15.8 beta driver yet? http://support.amd.com/en-us/kb-articles/Pages/latest-catalyst-windows-beta.aspx


Yes, so far so good. I'll post back after I do some benching.


----------



## FastEddieNYC

I just downloaded 15.8, so I'll run some benchmarks later and post the results. According to the release notes, they fixed the Witcher 3 CrossFire problem with AA. I'll soon see.


----------



## battleaxe

I just ran the 15.8 BETA on Heaven. Seems to run about the same to me. Score is almost identical. No crashes or anything weird. Played a bit of BF3 and same thing, ran fine. Nothing new, nothing stood out.


----------



## bkvamme

Quote:


> Originally Posted by *bkvamme*
> 
> Yeah, I will have to reseat the thermal pads. The fujipoly pads weren't all that expensive, and if I have to disassemble/drain the loop, I might as well change to the Fujipoly, if only for ease of mind. This card sucks power like no other, and the VRMs have a massive job to do, so I might as well give them the best treatment possible.


Gents,

I figured out what the problem with my VRM thermal pad was.



For the three VRMs in the top-left corner of the GPU, I cut up two pads. The smallest pad had slid a bit (probably when I mounted the water block) and effectively prevented any proper contact with the block. I redid it, and it should work fine now. Just a tip for future me/others.


----------



## fyzzz

I think my PSU is acting weird. I know GPU-Z can report wrong values, but overclocking is a pain, and GPU-Z shows the 12V rail going up and down, sometimes dipping to 11.13V. I don't trust this PSU anymore. The VDDC also dips sometimes.


----------



## Gumbi

Quote:


> Originally Posted by *fyzzz*
> 
> I think my psu is acting weird. I know gpu-z can read wrong values, but overclocking is a pain and gpu-z is showing that the 12v is going up and down and sometimes dipping down to 11.13. I don't trust this psu anymore. Also sometimes the vddc dips.


Those software readings can be very unreliable.

My PSU also dips when I apply load to it, it's not unusual.

When you say "overclocking is a pain", what exactly do you mean?


----------



## fyzzz

Quote:


> Originally Posted by *Gumbi*
> 
> Those software readings can be very unreliable.
> 
> My PSU also dips when I apply load to it, it's not unusual.
> 
> When you say "overclocking is a pain", what exactly do you mean?


I feel like it's inconsistent; I don't know, maybe it's all in my head. I also change BIOSes quite frequently, which could be it too, but I still don't trust this PSU. I've had heat problems with it before: the GPU wasn't stable at 1200, but when things cooled off it was stable again.


----------



## Gumbi

Quote:


> Originally Posted by *fyzzz*
> 
> I feel like it is inconsistent, i don't know maybe it's all in my head. I also change bios quite frequently that could be it too, but still i don't trust this psu. I have had heating problems with it before, the gpu wasn't stable at 1200, but when thing cooled of it was stable again.


What's inconsistent? Cooling can certainly affect the stability of an overclock.

If you give us some numbers then maybe we can help. What do you score for a benched run of Heaven at max settings?


----------



## fyzzz

Quote:


> Originally Posted by *Gumbi*
> 
> What's inconsistent? Cooling can certainly affect the stability of an overclock.
> 
> If you give us some numbers then maybe we can help. What do you score for a benched run of Heaven at max settings?


I will tinker with this some more; cooling isn't a problem, everything is under water. It *seems* like the readings became steadier when I hooked up two PCIe cables to the graphics card, but I must test further before drawing any conclusions.


----------



## Gumbi

Quote:


> Originally Posted by *fyzzz*
> 
> I will tinker with this some more, cooling isn't a problem everything is under water. It *seems* like the reading become steadier when i hooked up 2 pcie cables to graphics card, i must test further before i take any conclusions.


Huh? You didn't have two cables hooked up to the card before? What connectors does your card have, 8+8 pin or 8+6? And you only had one connected?


----------



## fyzzz

Quote:


> Originally Posted by *Gumbi*
> 
> Huh? You didn't have 2 cables hooked up to the card before? What connectors does your card have? 8 + 8 pin or 8+ 6? And you only had one connected?


My card has an 8-pin and a 6-pin. A single PCIe cable has an 8-pin and a 6+2-pin.


----------



## BradleyW

Anyone tried Mad Max CFX yet?


----------



## kizwan

Quote:


> Originally Posted by *fyzzz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gumbi*
> 
> What's inconsistent? Cooling can certainly affect the stability of an overclock.
> 
> If you give us some numbers then maybe we can help. What do you score for a benched run of Heaven at max settings?
> 
> 
> 
> I will tinker with this some more, cooling isn't a problem everything is under water. It *seems* like the reading become steadier when i hooked up 2 pcie cables to graphics card, i must test further before i take any conclusions.

I always advise people to use a separate PCIe cable for each PCIe connector, especially on a power-hungry card like Hawaii.

GPU-Z's 12V reading is unreliable; for an accurate reading, use a DMM. But the symptom you're having, where it's stable once the PSU cools down, suggests there may be an issue with the PSU.


----------



## fyzzz

Quote:


> Originally Posted by *kizwan*
> 
> I always advice people to use different PCIe cables for each PCIe connectors, especially power hungry card like Hawaii.
> 
> GPU-Z 12V reading is unreliable. For accurate reading use DMM. But the symptoms that you having where it stable when the PSU cool down seems like there's maybe issue with the PSU.


The second PCIe cable solved it. I'm kind of surprised I even reached 1300MHz before with one cable and an overheating PSU. I could be crazy, but reaching 1200 takes less voltage now (I ran about +130mV to be stable before, but now I can run 1200 with +112mV without artifacts). Anyway, I'm so happy this PSU thing seems to be solved; everything seems much more stable.


----------



## battleaxe

Quote:


> Originally Posted by *fyzzz*
> 
> The second pcie cable solved it. I'm kinda surprised that i even reached 1300 mhz before with one cable and a overheating psu. I could be crazy but to reach 1200 i takes less voltage (i ran like 130mv to bestable before but now i can run 1200 with 112 without artifacts). Anyway im so happy that this psu things seems to be solved now and things seems much more stable.


So you only had one cable hooked to your card? Seriously?


----------



## fyzzz

Quote:


> Originally Posted by *battleaxe*
> 
> So you only had one cable hooked to your card? Seriously?


Well, the cable has an 8-pin and a 6-pin, which I thought was fine, so I left it that way. I never thought much about it until now, when I've really pushed things. Anyway, enough talking about this PSU thing; call me dumb if you want to.


----------



## battleaxe

Quote:


> Originally Posted by *fyzzz*
> 
> Well the cable has a 8pin and a 6pin, which i thought was fine and left it so. I never have thought so much about it until now, when i have really pushed things. Anyway enough talking about this psu thing, call me dumb if you want to.


No, I thought you meant you only had the 8-pin connected and the 6-pin not at all. I wouldn't call you dumb anyway. We all have varying degrees of understanding when it comes to computers. Some around here just think they know everything (and it's annoying to even share a forum with them).

No insult intended at all.


----------



## fyzzz

Quote:


> Originally Posted by *battleaxe*
> 
> No, I thought you meant you only had the 8 pin connected and the six pin not at all. I wouldn't call you dumb anyway. We all have varying degrees of understanding when it comes to computers. Some around here just think they know everything. (and its annoying to even share a forum with them)
> 
> No insult intended at all.


No worries, everything is solved now, and no, I would never leave a connector unplugged; I know that much at least.


----------



## sinnedone

Quote:


> Originally Posted by *fyzzz*
> 
> Well the cable has a 8pin and a 6pin, which i thought was fine and left it so. I never have thought so much about it until now, when i have really pushed things. Anyway enough talking about this psu thing, call me dumb if you want to.


Cooler master V series?


----------



## fyzzz

Quote:


> Originally Posted by *sinnedone*
> 
> Cooler master V series?


Corsair RM 750


----------



## sinnedone

I "HAD" the same issue with a Cooler Master V1000. At the power supply it's a single six-pin that splits into two eight-pins.

I redid the cables and now have a single six-pin at the power supply going to a six-pin at the graphics card, and a six-pin at the power supply going into an eight-pin at the graphics card.


----------



## bkvamme

Quote:


> Originally Posted by *sinnedone*
> 
> I "HAD" the same issue with a coolermaster V1000. At the power supply its a single six pin that splits into two eight pin.
> 
> I redid the cables and now have a single six pin at the power supply to a six pin at the graphics card, and a 6 pin at the power supply into an 8 pin at the graphics card.


I have it the same way with my V1000. It might be overkill, but at least the two plugs don't have to compete with each other for the cable. It shouldn't matter much in practice, as they all draw from the same board inside the PSU. Some PSUs have more than one rail, but that isn't common anymore.


----------



## allindaze

Quote:


> Originally Posted by *bkvamme*
> 
> I have it the same way with my V1000. Might be overkill, but atleast the two plugs don't have to compete which each other in the cable. Shouldn't have much to say in practice, as they are all drawing from the same board within the PSU. Some PSU's might have more than one rail, but that is not common anymore.


I was worried it would cause an issue using one cable to power both the 6-pin and 8-pin sockets, but on higher-quality PSUs the included cables are designed to power high-end video cards.


----------



## Archea47

Quote:


> Originally Posted by *bkvamme*
> 
> I have it the same way with my V1000. Might be overkill, but atleast the two plugs don't have to compete which each other in the cable. Shouldn't have much to say in practice, as they are all drawing from the same board within the PSU. Some PSU's might have more than one rail, but that is not common anymore.


The 6-pin cables from the PSU are normally rated for 75W, vs. the 225W specified for the 290X's 8+6-pin.

I had a colleague with frequent black screens on a Corsair RM with daisy-chained cables earlier this summer. He thought it was a bad card, but he tried an EVGA 750W and all his problems went away.


----------



## sinnedone

Quote:


> Originally Posted by *allindaze*
> 
> I was worried it would cause an issue using one cable to power both 6pin and 8pin sockets but on higher quality PSU's the cables they include are designed to power high end video cards.


I prefer the more hoses approach. Bad habit from car audio.


----------



## bkvamme

Quote:


> Originally Posted by *allindaze*
> 
> I was worried it would cause an issue using one cable to power both 6pin and 8pin sockets but on higher quality PSU's the cables they include are designed to power high end video cards.


Quote:


> Originally Posted by *Archea47*
> 
> The 6 cables from the PSU are normally rated for 75W, vs. the 225W specified for the 290X 8+6pin
> 
> I had a colleague with frequent black screens on a Corsair RM with daisy chained cables earlier this summer. He thought it was a bad card, but tried an EVGA 750W and all his problems went away


Gotcha. Just did some quick calculations. For the V1000, the cables are 18AWG, which should roughly allow 4.2-4.9A per wire (I could be wrong here). That translates to around 50-60W per wire at 12V — the physical "capacity" of the cable.

Assuming 6 wires in the primary cable from the PSU to the PCIe connector(s): multiply by the three +12V wires, and you have around 150-180W.

Assuming 8 wires in the primary cable from the PSU to the PCIe connector(s): multiply by the four +12V wires, and you have around 200-240W.

Add the 75W supplied through the PCIe x16 slot, and you should (in theory) have enough power when daisy chaining if you have an 8-wire cable. If you have a 6-wire cable (likely common on lower-end PSUs), you will be in the danger zone, especially with an R9 290X.

Very, very quick calculations. Please correct me if I'm wrong.
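For anyone who wants to redo the numbers, the back-of-the-envelope math above can be sketched like this (the per-strand ampacity figures are rough assumptions taken from the post, not measured ratings):

```python
# Rough per-cable capacity estimate for a PCIe power cable.
# Assumed ampacity per strand; real limits depend on insulation,
# bundling, and cable length.
AMPACITY_RANGE_A = {18: (4.2, 4.9), 16: (6.0, 7.0)}  # amps per wire

def cable_capacity_w(awg, total_wires, volts=12.0):
    """(low, high) watts a cable can carry.

    Half the wires carry +12V, the other half are grounds/sense.
    """
    hot_wires = total_wires // 2
    lo_a, hi_a = AMPACITY_RANGE_A[awg]
    return (lo_a * volts * hot_wires, hi_a * volts * hot_wires)

lo, hi = cable_capacity_w(18, 6)   # 6-wire 18AWG cable
print(f"6-wire 18AWG: {lo:.0f}-{hi:.0f} W")
lo, hi = cable_capacity_w(18, 8)   # 8-wire 18AWG cable
print(f"8-wire 18AWG: {lo:.0f}-{hi:.0f} W")
```

This lands close to the 150-180W and 200-240W ranges quoted above; the small differences come from rounding per-wire watts first.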


----------



## allindaze

Quote:


> Originally Posted by *bkvamme*
> 
> Gotcha. Just did some quick calculations. For the V1000, the cables are 18AWG, which should roughly allow for 4.2-4.9A through the wire (could be wrong here). This translates to around 50-60W per wire at 12V, physical "capacity" in the cable.
> 
> Assuming 6 wires in the primary cable from the PSU to the PCIe connector(s):
> Multiply by the three pairs of +12V cables for the 6pin connector, and you have around 150-180W.
> 
> Assuming 8 wires in the primary cable from the PSU to the PCIe connector(s):
> Multiply by the three pairs of +12V cables for the 6pin connector, and you have around 200-240W.
> 
> Add the 75W supplied through the PCIe 16x slot, and you should (in theory) have enough power when daisy chaining if you have a 8 wire cable. If you have a 6 wire cable (likely common on lower end PSUs), you will be in the danger zone, especially for the R9 290X.
> 
> Very, very quick calculations. Please correct me if I am wrong


Well, now I'm convinced it's better to have a dedicated cable for each.


----------



## bkvamme

Quote:


> Originally Posted by *allindaze*
> 
> Well now i'm convinced it's better to have a dedicated cable for each


Yeah, especially if you have two 8-pin connectors. According to spec, each of these should be able to supply 150W, totalling 300W, plus the 75W through the PCIe slot. Unless you have 16AWG cables from the PSU (72-84W per wire), you will likely suffer from a lack of power when daisy chaining. With 16AWG, you'd have 288-336W for 8-wire cables and 216-252W for 6-wire cables.

Add in a bit of overclocking, and you can see how this could become a problem very fast. If my calculations are correct, I find it strange that this is not mentioned in PSU manuals.

While it won't be a problem for most GPUs, the power-hungry top-tier AMD cards certainly could run into problems with cheaper PSUs.
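Continuing the sketch from a few posts up, here is a quick sanity check of whether a single daisy-chained cable covers the combined connector spec (connector wattages are the standard PCIe figures; the ampacities are the same assumed values as before):

```python
# Does one daisy-chained cable cover the spec draw of both GPU connectors?
# Conservative check against the low end of the assumed ampacity range.
CONNECTOR_SPEC_W = {"6pin": 75, "8pin": 150}   # standard PCIe connector ratings
LOW_AMPS = {18: 4.2, 16: 6.0}                  # assumed amps per strand

def daisy_chain_margin_w(awg, total_wires, connectors, volts=12.0):
    """Cable capacity minus combined connector spec; negative = under-spec."""
    capacity = LOW_AMPS[awg] * volts * (total_wires // 2)
    demand = sum(CONNECTOR_SPEC_W[c] for c in connectors)
    return capacity - demand

# 8-wire 18AWG cable feeding an 8pin + 6pin card (225 W demand): comes up short
print(daisy_chain_margin_w(18, 8, ["8pin", "6pin"]))   # about -23 W
# Same connectors on a 16AWG cable: comfortable headroom
print(daisy_chain_margin_w(16, 8, ["8pin", "6pin"]))   # about +63 W
```

A negative margin doesn't mean the cable instantly fails, only that it is running past its conservative rating, which matches the black-screen anecdotes earlier in the thread.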


----------



## battleaxe

I just pushed more voltage to the card. Didn't seem to help with the core really... But gave more stability to go higher on the RAM, and the score went up just a tad. This is with the new BETA driver 15.8


----------



## allindaze

Quote:


> Originally Posted by *bkvamme*
> 
> Yeah, especially if you have two 8pin connectors. According to spec, each of these should be able to supply 150W,totalling 300W, plus the 75W through the PCIe slot. Unless you have 16AWG cables from the PSU (72-84W per wire), you will likely suffer from lack of power when daisy chaining. With 16AWG, you'd have 288-336W for 8 wire cables, 216-252W for 6 wire cables.
> 
> Add in a bit of overclocking, and you can see how this could be a problem very fast. If my calculations are correct, I find it strange that this is not mentioned in PSU manuals.
> 
> While it will not be a problem for most GPUs, the power hungry top-tier AMD cards certainly could run in to problems with cheaper PSUs.


Are we talking about daisy chaining to another card, or simply powering one card through both the 8-pin and 6-pin connectors from an 8-pin cable with a split to 6?


----------



## JourneymanMike

Quote:


> Originally Posted by *allindaze*
> 
> Quote:
> 
> 
> 
> Originally Posted by *bkvamme*
> 
> Yeah, especially if you have two 8pin connectors. According to spec, each of these should be able to supply 150W,totalling 300W, plus the 75W through the PCIe slot. Unless you have 16AWG cables from the PSU (72-84W per wire), you will likely suffer from lack of power when daisy chaining. With 16AWG, you'd have 288-336W for 8 wire cables, 216-252W for 6 wire cables.
> 
> Add in a bit of overclocking, and you can see how this could be a problem very fast. If my calculations are correct, I find it strange that this is not mentioned in PSU manuals.
> 
> While it will not be a problem for most GPUs, the power hungry top-tier AMD cards certainly could run in to problems with cheaper PSUs.
> 
> 
> 
> Are we talking about daisy chaining to another card or simply powering one card through both 8 and 6 pin connectors from ab 8 pin cable with a split to 6?

Yes,

Daisy Chained



Not



I believe this is what y'all have been talking about...


----------



## kizwan

Nah, we're talking about this.

Daisy chain


----------



## sinnedone

lol


----------



## JourneymanMike

Quote:


> Originally Posted by *kizwan*
> 
> Nah, we're talking about this.
> 
> Daisy chain


Jeeze, sorry...

Do the flowers work?









I'll try it out tomorrow!


----------



## battleaxe

Quote:


> Originally Posted by *JourneymanMike*
> 
> Jeeze, sorry...
> 
> Do the flowers work?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll try it out tomorrow!


Yeah, they work fine. Been using 'em for years.


----------



## Archea47

Quote:


> Originally Posted by *JourneymanMike*
> 
> Jeeze, sorry...
> 
> Do the flowers work?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll try it out tomorrow!


Super Flowers?


----------



## Sempre

Quote:


> Originally Posted by *Archea47*
> 
> Super Flowers?


I hope so. Regular Flowers are boring.


----------



## fyzzz

Everything is working great now. 1222/1625 at +150mV stable (~1.258V under load), and still testing.


----------



## fyzzz

Wow, this card is starting to impress me now: 1270/1780, 14.7k GPU score on a STOCK Sapphire Tri-X BIOS. Imagine if I could reach those speeds on the 390 BIOS.

http://www.3dmark.com/3dm/8428729? Max temperature was 46C on both the core and VRM1, at +300mV.


----------



## bkvamme

Quote:


> Originally Posted by *allindaze*
> 
> Are we talking about daisy chaining to another card or simply powering one card through both 8 and 6 pin connectors from ab 8 pin cable with a split to 6?


Yes

Daisy Chained:


Essentially, one cable from the PSU to two connectors on the GPU.

Not Daisy Chained:


Essentially, one cable from the PSU to one connector on the GPU.

Again, daisy chaining should work fine with 16AWG cables, or if you have a lower-wattage GPU. The R9 290(X) is not a low-wattage GPU by any means, so you ideally should not daisy chain unless you have 16AWG wires (not common on low- to mid-end PSUs).


----------



## MEC-777

Hey there.

I'm not new to the R9 290 - I've been using one for almost a year now. But I am new to this thread (not sure why I didn't sub to it a long time ago...). Anyway, I have another R9 290 being shipped to me as we speak. I'll be pairing it with my HIS IceQ X2 290 to run in CrossFire.

What I'm curious about is just how loud and hot the reference 290s really are. As it stands, I'm using a Kraken G10 + H55 to cool the HIS 290 and was planning to keep the reference blower on the second card so all its hot air exhausts outside the case (NZXT S340). But if it's really THAT loud and hot, then I'll swap the G10 over to the reference card and put the [much better and quieter] stock cooler back on the HIS card. Just replacing the TIM on the HIS under its stock cooler gave an improvement of 5 degrees at full load. If I do the same with the reference card, maybe it'll run at a more bearable noise level...?

I'm ok with a fairly warm running card if it means lower noise, but how bad are they, really?

Thanks. Appreciate any input/suggestions.


----------



## Roboyto

Quote:


> Originally Posted by *MEC-777*
> 
> Hey there.
> 
> I'm not new to the R9 290 - been using one for almost a year now. But I am new to this thread (not sure why I didn't sub to this a long time ago...). Anyways, I have another R9 290 being shipped to me as we speak. I will be pairing it with my HIS IceQx2 290 to run in crossfire.
> 
> What I'm curious of is just how loud and hot the reference 290's really are? As it stands, I'm using a Kraken G10 + H55 to cool the HIS 290 and was planning to continue running the reference blower on the 2nd card to keep all the hot air exhausting outside the case (NZXT S340). But if it's really THAT loud and hot, then I'll swap the G10 over to the reference card and put the [much better and quieter] stock cooler back on the HIS card. After just replacing the TIM on the HIS with the stock cooler showed improvements of -5 degrees at full load. If I do the same with the reference, maybe it'll run at a more bearable noise level...?
> 
> I'm ok with a fairly warm running card if it means lower noise, but how bad are they, really?
> 
> Thanks. Appreciate any input/suggestions.


They're bad. I'd do exactly as you have planned, or run another Kraken if you can fit the extra radiator somewhere.

I tried changing thermal paste and upgrading thermal pads on my reference XFX 290, and it was barely a measurable difference in temperatures with no change in acoustics.


----------



## sinnedone

If you are a woman or have a woman in your life, go to her room/bathroom and grab the hair dryer. Turn it on high; that's how loud the reference R9 290 / R9 290X cooler is.









I'm not exaggerating.


----------



## OneB1t

More like grinding steel with an angle grinder.

That's the reference cooler @ 100% fan speed.


----------



## allindaze

Wow, we wouldn't even be having this conversation if I had that baby.
Quote:


> Originally Posted by *bkvamme*
> 
> Yes
> 
> Daisy Chained:
> 
> 
> Essentially, one cable from the PSU to two connectors on the GPU.
> 
> Not Daisy Chained:
> 
> 
> Essentially, one cable from the PSU to one connector on the GPU.
> 
> Again, daisy chaining should work fine for 16AWG cables, or if you have a lower wattage GPU. The R9 290(X) is not a low wattage GPU by any means, so you should ideally not daisy chain unless you have 16AWG wires (not common for low-mid end PSUs)


So ever since I got my second XFX R9 290, I was quite disappointed that Seasonic had only included three PCIe 6+2 connectors, one short of what I needed to power both at the same time. Out of stubbornness and not wanting to buy another cable, I wrestled with the problem, trying to figure out the best way to connect them without stressing a cable by daisy chaining. After a while I settled on plugging one 6+2 cable into the 8-pin port of one card, the other 6+2 cable into the 8-pin port of the other card, and daisy chaining the third 6+2 cable between the two 6-pin connectors on both cards. My thought process was that those were the two that would pull the least through the cable.

Anyway, a year later, here I am realizing that my Seasonic 850W X-Series is designed to run multi-GPU setups through the PCIe cables it included. They're 16AWG.

So now, after reading this thread, I'm a bit more comfortable daisy chaining the secondary card as it seems is intended, until I get off my butt and buy a fourth cable. I wonder, though: was daisy chaining the two 6-pin connectors between the two cards pulling less than daisy chaining one whole card by itself? It seems to me the way I was doing it would be safer.


----------



## bkvamme

Quote:


> Originally Posted by *allindaze*
> 
> Wow we wouldn't even be having this conversation if I had that baby
> ...
> I wonder though, was daisy chaining the two 6 pin connectors between the two cards pulling less than daisy chaining the whole card by itself. It seems to me the way I was doing it would be safer.


Can't say for certain, as I don't know whether the power draw on each of the card's connectors is equal. It doesn't make any sense that one connector would draw power until it maxed out and only then put load on the second, so I assume power is drawn equally between the two connectors (parallel power distribution, as opposed to serial). I'm afraid your solution didn't do any good, but it didn't do any harm either.

As you stated, you have 16AWG cables, so the power draw wouldn't be a problem in any case. In theory you have the connectivity to run Tri-Fire, although I sincerely doubt the 850W PSU would be able to churn out enough juice for three 290s.


----------



## allindaze

Quote:


> Originally Posted by *bkvamme*
> 
> Can't say for certain as I don't know if the power draw on each of the connectors on the card is equal. Doesn't make any sense that one of the connectors would draw power until it has maxed out, and then put load on the second. I assume that power is drawn equally between the two connectors (parallell power distribution, compared to serial). So I am afraid that your solution did not do any good, but it didn't do any harm either.
> 
> As you stated, you have 16AWG cables, so the power draw wouldn't be a problem in any case. In theory you have the connectivity to run Tri-Fire, although I sincerly doubt that the 850W PSU would be able to churn out enough juice for three 290's


Yeah, I agree I wasn't doing the system any benefit, rather hoping to cause the least amount of harm. Ultimately, all that matters now is that I'm happy knowing I didn't do anything too dumb or dangerous by using those three cables instead of the four I'd like. And just to clarify, the first line in my last post (above the picture) was intended for the PSU that was posted on the last page.


----------



## MEC-777

Quote:


> Originally Posted by *sinnedone*
> 
> If you are a woman or have a woman in your life, go to their room/bathroom and grab the hair dryer. TUrn it on high, that's how loud the reference R9 290 R9 290X cooler is.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm not exaggerating.


Quote:


> Originally Posted by *OneB1t*
> 
> more like grinding steel with angle grinder
> 
> 
> 
> 
> 
> 
> 
> thats reference cooler @100% speed


I can understand it being that loud at 100%, but does the blower actually get anywhere near 100% during normal gaming loads? (Not talking about FurMark and other unrealistic benchmark loads.)

From the initial reviews I read/watched of the reference card, the default fan speed was set to 47%.


----------



## Gumbi

Quote:


> Originally Posted by *MEC-777*
> 
> I can understand it being that loud at 100%, but does the blower actually get anywhere near 100% during normal gaming loads? (not talking about furmark and other unrealistic benchmark loads)
> 
> From the initial reviews I read/watched on the reference card, the default fan speed was set to 47%.


In most case setups it throttles a bit, even at 47% fan speed. Seriously, DON'T get the reference card.

Grab a PowerColor PCS 290, a Tri-X, a Vapor-X, etc. Those will leave you far more satisfied.


----------



## kizwan

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *sinnedone*
> 
> If you are a woman or have a woman in your life, go to their room/bathroom and grab the hair dryer. TUrn it on high, that's how loud the reference R9 290 R9 290X cooler is.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm not exaggerating.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *OneB1t*
> 
> more like grinding steel with angle grinder
> 
> 
> 
> 
> 
> 
> 
> thats reference cooler @100% speed
> 
> 
> I can understand it being that loud at 100%, but does the blower actually get anywhere near 100% during normal gaming loads? (not talking about furmark and other unrealistic benchmark loads)
> 
> From the initial reviews I read/watched on the reference card, the default fan speed was set to 47%.

If you manually set it to 100%, yes, it will go up to 100%. When I ran my cards in CrossFire with the reference coolers, the fans surprisingly didn't go up to 100% (automatic fan profile) when playing BF4. I don't remember what speed the fans went up to, but definitely not 100%; I'm pretty sure it was way lower. I believe this depends on the BIOS; mine was pretty much the stock BIOS the card came with. I managed to keep the core running between just below 80C and the low 80s Celsius at 80% fan speed, with ambient around 28-30C.


----------



## sinnedone

Quote:


> Originally Posted by *MEC-777*
> 
> I can understand it being that loud at 100%, but does the blower actually get anywhere near 100% during normal gaming loads? (not talking about furmark and other unrealistic benchmark loads)
> 
> From the initial reviews I read/watched on the reference card, the default fan speed was set to 47%.


With the stock cooler on auto fan speeds, it will hit 94-95C every time and throttle off of that. If you want to keep it from hitting the low 90s, you will need a custom fan profile, and it will be loud.

The only thing reference cards are good for is water cooling.


----------



## MEC-777

Quote:


> Originally Posted by *Gumbi*
> 
> In most case setups, it throttles a bit, even at 47% fan speed. Seriously, DON'T get the reference card.
> 
> Grab a Powercolor PCS 290, a Trix, VaporX etc. Those will leave you far more satisfied


Already have a reference 290 in the mail, on its way.







Anything with aftermarket cooling was/is way more money than I wanted to spend. I still can't believe how much some people pay for USED reference 290s...

I paid $285 (CAD) almost a year ago (it was on sale, 40% off at the time) for my non-reference HIS 290 (which is a very good 290, BTW; a very good overclocker, 1200/1600 with a 10,262 3DMark score), and I wasn't about to pay more than that for a second 290. So the only ones going for less than that right now are reference cards.

Quote:


> Originally Posted by *kizwan*
> 
> If you manual set to 100%, yes it will go up to 100%. When I run my cards in crossfire with the reference cooler, the fan surprisingly doesn't go up to 100% (automatic fan profile) when playing BF4. I don't remember at what speed the fan go up to but definitely not 100%, I'm pretty sure way lower. I believe this depends on the BIOS. Mine pretty much was using the stock BIOS the card came with. I managed to get the core to run from below 80C to low 80s Celsius at 80% fan speed. That with ambient around 28 - 30C.


Quote:


> Originally Posted by *sinnedone*
> 
> With the stock cooler on auto fan speeds it will hit 94/95c every time and throttle off of that. If you want to keep it from hitting low 90's you will need a custom fan profile and it will be loud.
> 
> Only good reference cards are for is for water cooling.


If you're all saying it's pretty darn loud, then it must be pretty darn loud. lol.

Guess I'll be transferring the G10 over to the reference card.









Thanks for the input, guys. Much appreciated.







I'll have to try it as-is for a little while, just for my own curiosity, though.


----------



## specopsFI

Personally, I never found the reference cooler to be as bad as the general opinion seems to be. I mean, it's bad, but it's not _that_ bad.









Up to 55% it is tolerable (when using headphones) and for my earlier 290 reference card in a well ventilated case, that was just enough to keep it from throttling even overclocked. That is overclocked with stock voltage, mind you. Extra voltage would have required fan speeds high enough to be heard even through my studio headphones. And yes, 100% is loud enough to be considered a health hazard.


----------



## sinnedone

Quote:


> Originally Posted by *MEC-777*
> 
> Already have a reference 290 in the mail, on its way.
> 
> 
> 
> 
> 
> 
> 
> Anything with aftermarket cooling was/is way more money than I wanted to spend. I still can't believe how much some people pay for USED reference 290's...
> 
> I paid $285 (CAD) almost a year ago (was on sale 40% off at the time) for my non-reference HIS 290 (which is a very good 290, BTW, very good overclocker (1200/1600 with 10,262 3Dmark score)), and I wasn't about to pay more than that for a 2nd 290. So the only ones that are going for less than that right now are reference cards.
> 
> If you're all saying it's pretty darn loud, then it must be pretty darn loud. lol.
> 
> Guess I'll be transferring the G10 over to the reference card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for the input, guys. Much appreciated.
> 
> 
> 
> 
> 
> 
> 
> I'll have to try it as-is for a little while, just for my own curiosity, though.


You could also just try using the HIS cooler on the reference card to see if it fits.

Quote:


> Originally Posted by *specopsFI*
> 
> Personally, I never found the reference cooler to be as bad as the general opinion seems to be. I mean, it's bad, but it's not _that_ bad
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Up to 55% it is tolerable (when using headphones) and for my earlier 290 reference card in a well ventilated case, that was just enough to keep it from throttling even overclocked. That is overclocked with stock voltage, mind you. Extra voltage would have required fan speeds high enough to be heard even through my studio headphones. And yes, 100% is loud enough to be considered a health hazard.


If you have a very well ventilated case and your pc sits under a desk or cabinet then you might not have an issue with noise.

Just for $hits and giggles I played with overclocking on the reference cooler and had to set fan speed at 100%. The wife was not impressed. lol


----------



## betam4x

Just picked up a 290 off Amazon for $249.99. Was looking at getting a new graphics card, and at that price it was an absolute steal, IMHO. Probably the best bang for the buck, graphics-card-wise, at the $250 price point.


----------



## battleaxe

Quote:


> Originally Posted by *betam4x*
> 
> Just picked up a 290 off Amazon for $249.99. Was looking at getting a new graphics card, but at that price it was an absolute steal IMHO. Probably the best bang for the buck Graphics card wise for the $250 price point.


Link please?


----------



## betam4x

http://amzn.to/1iiM9SX


----------



## battleaxe

Quote:


> Originally Posted by *betam4x*
> 
> http://amzn.to/1iiM9SX


Nice. Reference oldness, goodness.


----------



## betam4x

Just got it today; it's working great. My 6970 was starting to have issues in newer games. Almost went for a GTX 970, then I saw that for much cheaper. Now I can play games like Tomb Raider, Mortal Kombat, etc. on ultimate settings. ARK should be playable at higher settings as well. I thought people said these things were noisy? I didn't hear it spin up at all.


----------



## battleaxe

Quote:


> Originally Posted by *betam4x*
> 
> Just got it today, it's working great. My 6970 was starting to have issues on newer games. Almost went for a GTX 970, then i saw that for much cheaper. Now i can play games like Tomb Raider, Mortal Kombat, etc. on ultimate settings. ARK should be playable at higher settings as well. I thought people said these things were noisy? I didn't hear it spin up at all.


What kind of RAM does it have? GPUZ will tell you.


----------



## betam4x

Hynix GDDR5.


----------



## mfknjadagr8

Quote:


> Originally Posted by *betam4x*
> 
> Just got it today, it's working great. My 6970 was starting to have issues on newer games. Almost went for a GTX 970, then i saw that for much cheaper. Now i can play games like Tomb Raider, Mortal Kombat, etc. on ultimate settings. ARK should be playable at higher settings as well. I thought people said these things were noisy? I didn't hear it spin up at all.


Then you must not be taxing it, or you are running high temperatures... I had to run my reference 290 at 65 percent fan speed to prevent throttling... even on the higher switch setting it never broke 45 percent fan speed, and it's not that loud under 50, but above 50 it sounds more like a vacuum cleaner the higher you go... and I had two running at up to 75... I tried them both at 100 a few times and my god... it sounded like a jet plane


----------



## betam4x

Also, for what it's worth: CUinfo:

Adapters detected: 1
Card #1 PCI ID: 1002:67B1 - 1002:0B00
DevID [67B1] Rev [00] (0), memory config: 0x500036A9 Hynix
Hawaii-class chip with 11 compute units per Shader Engine
SE1 hw/sw: F8010005 / 00000000 [..........x]
SE2 hw/sw: F8010005 / 00000000 [..........x]
SE3 hw/sw: F8010005 / 00000000 [..........x]
SE4 hw/sw: F8200005 / 00000000 [.....x.....]
40 of 44 CUs are active. HW locks: 4 (R/O) / SW locks: 0 (R/W).
Sorry, all 4 disabled CUs can't be unlocked by BIOS replacement.

Can't be unlocked it seems.


----------



## betam4x

Quote:


> Originally Posted by *mfknjadagr8*
> 
> then you must not be taxing it or you are running high temperatures...I had to run my reference 290 at 65 percent fan speed...to prevent throttling...even on the higher switch setting it never broke 45 percent fan speed and it's not that loud under 50 but above 50 it sounds more like a vacuum cleaner the higher you go....and I had two running at up to 75....I tried them both at 100 a few times and my god....it sounded like a jet plane


I got it to spin up a bit with ark. Sounds about like my 6970. Fan is on automatic settings.


----------



## sinholueiro

Guys, is anyone playing at 1440p? I wonder if the 4GB will be a bottleneck. The TechReport article shows a little struggle at 4K, so I'm curious about 1440p to decide between the 200 and 300 series.


----------



## Gumbi

Nice







I would recommend re applying the thermal paste, that should buy you 10 or so degrees on a reference setup.


----------



## betam4x

From what I read at anandtech, 1440p is perfect for this card. However, I'm not gaming at 1440p, so I wouldn't know. All I know is what I read: http://www.anandtech.com/show/7481/the-amd-radeon-r9-290-review


----------



## Gumbi

Quote:


> Originally Posted by *sinholueiro*
> 
> Guys, anyone is playing at 1440p? I wonder if the 4GB will be a bottleneck. The article at TechReport shows a little struggle at 4K, so I am curious at 1440p to decide between 200 or 300


-
Honestly, for one card, 4GB should be enough at 1440p in most cases. You can get aftermarket 290s super cheap on Newegg right now. If you want to play it safe you can grab the 390; it has slightly better performance (BIOS tweaks and faster memory), but it's the same chip, so overall not much faster.


----------



## allindaze

Apparently I've been misinformed. According to this site, my PSU has 16AWG cables for the GPU:

http://www.hardwaresecrets.com/seasonic-x-series-850-w-power-supply-review/

but unfortunately my PCIe cables are clearly marked 18AWG. Boo.









I guess my X-series SS-850KM3 Active PFC F3 is different


----------



## kizwan

Quote:


> Originally Posted by *sinholueiro*
> 
> Guys, anyone is playing at 1440p? I wonder if the 4GB will be a bottleneck. The article at TechReport shows a little struggle at 4K, so I am curious at 1440p to decide between 200 or 300


I'm playing at 1440p. VRAM usage in BF4 is just 2GB @Ultra.

If it's between 200 or 300, obviously go for 300. New is always better.


----------



## MEC-777

Quote:


> Originally Posted by *betam4x*
> 
> Just got it today, it's working great.
> 
> I thought people said these things were noisy? I didn't hear it spin up at all.


Lol, I just asked that same question on the previous page (how loud are the reference cards, really?) and all the answers I got were basically "very loud!"

I guess "loudness" is subjective. While at anything above 60% I can see most people calling that loud, below that seems to be a matter of preference and perception.

Quote:


> Originally Posted by *sinholueiro*
> 
> Guys, anyone is playing at 1440p? I wonder if the 4GB will be a bottleneck. The article at TechReport shows a little struggle at 4K, so I am curious at 1440p to decide between 200 or 300


I'm running most of my games at 1440p (VSR) on high to ultra settings (depending on the game) with my 290. The 4GB Vram is plenty. GTA5 is probably the most Vram-heavy game I have and it uses about 2.5-3.5GB Vram on average and that's on a mix of high/very high settings.


----------



## By-Tor

Quote:


> Originally Posted by *sinnedone*
> 
> Only good reference cards are for is for water cooling.


Agreed, and this comes stock from PowerColor with one mounted, and I love my pair.


----------



## JourneymanMike

Quote:


> Originally Posted by *Archea47*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JourneymanMike*
> 
> Jeeze, sorry...
> 
> Do the flowers work?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll try it out tomorrow!
> 
> 
> 
> Super Flowers?
Click to expand...

What gauge are the stems? That's what really matters...


----------



## Roboyto

Quote:


> Originally Posted by *MEC-777*
> 
> Lol, I just asked that same question on the previous page (how loud are the reference cards, really?) and all the answers I got were basically "very loud!"
> 
> I guess "loudness" is subjective. While at anything above 60% I can see most people calling that loud, below that seems to be a matter of preference and perception.
> I'm running most of my games at 1440p (VSR) on high to ultra settings (depending on the game) with my 290. The 4GB Vram is plenty. GTA5 is probably the most Vram-heavy game I have and it uses about 2.5-3.5GB Vram on average and that's on a mix of high/very high settings.


No it's not subjective... well, I guess it is... but if you want the card to run cooler than 90C and want to increase the voltage or clocks at all, then you don't want reference. Once you start pushing the card, the fan speed needs to be ridiculous to keep the temperatures in check. And because of the reference design, the core temp suffers due to VRM1 being cooled first. The reference design is terrible.

If a ref card at 60% or higher isn't loud to someone, then they need their hearing checked. My .02


----------



## MEC-777

Quote:


> Originally Posted by *Roboyto*
> 
> No it's not subjective... well, I guess it is... but if you want the card to run cooler than 90C and want to increase the voltage or clocks at all, then you don't want reference. Once you start pushing the card, the fan speed needs to be ridiculous to keep the temperatures in check.
> 
> If ref card at 60% or higher isn't loud to someone, then they need their hearing checked. My .02


I wasn't questioning what fan speeds were required to achieve certain temps. I was saying that at certain fans speeds (example: anything above 60%) it could be considered objectively loud, but below that speed it's more subjective because people perceive sound differently from one another. There's a range where some would consider it "loud", others would consider it "tolerable". That's what I meant.

If the card required 60%+ fan speeds almost all the time while under even moderate loads, then yeah, I would agree it's an objectively loud card.


----------



## battleaxe

Quote:


> Originally Posted by *betam4x*
> 
> Hynix GDDR5.


Very nice. All you need now is to get that thing on some H2O to see what she can do.


----------



## Roboyto

Quote:


> Originally Posted by *MEC-777*
> 
> I wasn't questioning what fan speeds were required to achieve certain temps. I was saying that at certain fans speeds (example: anything above 60%) it could be considered objectively loud, but below that speed it's more subjective because people perceive sound differently from one another. There's a range where some would consider it "loud", others would consider it "tolerable". That's what I meant.
> 
> If the card required 60%+ fan speeds almost all the time while under even moderate loads, then yeah, I would agree it's an objectively loud card.


Read any review for a reference card and then find one that has a video of one running. They couldn't even sustain stock clocks on 290X with auto fan settings. Hot, loud and gimped out of the box.

Actually utilizing the card turns it into a leaf blower.


----------



## KingCry

Quote:


> Originally Posted by *Roboyto*
> 
> Read any review for a reference card and then find one that has a video of one running. They couldn't even sustain stock clocks on 290X with auto fan settings. Hot, loud and gimped out of the box.
> 
> Actually utilizing the card turns it into a leaf blower.


Friend of mine just got a reference 290X; it runs so hot that it force-crashes the driver into a crash loop and then into a system lockup.


----------



## rdr09

Quote:


> Originally Posted by *KingCry*
> 
> Friend of mine just got a reference 290X; it runs so hot that it force-crashes the driver into a crash loop and then into a system lockup.


Reference Hawaiis are old and we've known for quite some time that they are meant for water cooling. Got my second with a full block for $247 last December. They don't even see 60C.









Original bios . . .

http://www.3dmark.com/3dm/4644282?


----------



## MEC-777

Quote:


> Originally Posted by *Roboyto*
> 
> Read any review for a reference card and then find one that has a video of one running. They couldn't even sustain stock clocks on 290X with auto fan settings. Hot, loud and gimped out of the box.
> 
> Actually utilizing the card turns it into a leaf blower.


Yes, I know all this. I know they're hot and loud. I'm not arguing with you or disagreeing with you on that. I don't think you understood what I was trying to say, but never mind, it's ok. I think others got it.









I'll be installing the G10 on it anyways.


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *KingCry*
> 
> Friend of mine just got a reference 290X; it runs so hot that it force-crashes the driver into a crash loop and then into a system lockup.
> 
> 
> 
> Reference Hawaiis are old and we've known for quite some time that they are meant for water cooling. Got my second with a full block for $247 last December. They don't even see 60C.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Original bios . . .
> 
> http://www.3dmark.com/3dm/4644282?
Click to expand...

If you ran that with current drivers, you'd probably get an additional couple of thousand points on the graphics score.


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> If you ran that with current drivers, you'd probably get an additional couple of thousand points on the graphics score.


nice to know. hawaii still getting some love from amd. can't wait. my stuff is still floating in the atlantic. i miss my rigs and planetside. lol


----------



## By-Tor

Quote:


> Originally Posted by *rdr09*
> 
> nice to know. hawaii still getting some love from amd. can't wait. my stuff are still floating in the atlantic. i miss my rigs and planetside. lol


I feel your pain. I remember moving back to the States from Germany in 1990; all our stuff was in crates on a ship, and my computer at that time was a Commodore 64 (still have it).


----------



## Archea47

Quote:


> Originally Posted by *sinholueiro*
> 
> Guys, anyone is playing at 1440p? I wonder if the 4GB will be a bottleneck. The article at TechReport shows a little struggle at 4K, so I am curious at 1440p to decide between 200 or 300


1440p here - GTAV and BF4 at least run maxed well (not Maxwell) on my XFire system. There were originally some reports of GTAV showing excessive VRAM usage, but I believe that was a bug.

RE: reference... I don't think I'll ever buy a non-reference again. It's been a mini nightmare over the past year with my Sapphire 'reference' Tri-X 290Xs, on which they decided to use taller chokes than reference. Had to modify my blocks, one of which leaked and blew a $200 PSU. I strongly wanted to buy reference over the Tri-X, but a year ago the reference cards were ~$80 more apiece than the aftermarket-cooled cards.


----------



## mfknjadagr8

Quote:


> Originally Posted by *betam4x*
> 
> I got it to spin up a bit with ark. Sounds about like my 6970. Fan is on automatic settings.


I would check your temperatures with hwinfo64 or similar to be sure you are within thermal limits...even after a repaste my cards needed additional fan speed to run comfortably under load...most people say to keep them below 85 as throttling usually occurs around then or sooner in some cases...if I remember right the default fan profile didn't ramp up the fans fast enough to compensate for the heat...


----------



## Gumbi

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I would check your temperatures with hwinfo64 or similar to be sure you are within thermal limits...even after a repaste my cards needed additional fan speed to run comfortably under load...most people say to keep them below 85 as throttling usually occurs around then or sooner in some cases...if I remember right the default fan profile didn't ramp up the fans fast enough to compensate for the heat...


Throttling doesn't occur until 94 degrees; people just generally like keeping the temps below 85.

Which model do you have? Only reference cards routinely get as high as 85 degrees unless you have super-high ambients or woeful case cooling.


----------



## By-Tor

Quote:


> Originally Posted by *sinholueiro*
> 
> Guys, anyone is playing at 1440p? I wonder if the 4GB will be a bottleneck. The article at TechReport shows a little struggle at 4K, so I am curious at 1440p to decide between 200 or 300


I've been playing BF4 on ultra @ 1440p/100Hz with my 1080p 144Hz Asus monitor running at 1440p for the past couple of months, and it runs smooth as butter, getting 140-160 FPS most of the time.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Gumbi*
> 
> Throttling doesn't occur until 94 degrees; people just generally like keeping the temps below 85.
> 
> Which model do you have? Only reference cards routinely get as high as 85 degrees unless you have super-high ambients or woeful case cooling.


I'm under water now and never break 60C, but they are both reference cards... XFX with Elpida memory







but I got a good deal... my second card though needs voltage to go over 1300 memory, so I generally just overclock the core and the memory just a bit... I have the thermal headroom, I just don't want to spend a couple of days for not much gain on memory performance... not big on benching, and in-game performance isn't affected as much by memory speed... phantom pain... or mess with settings... the answer, it's just a supply drop away


----------



## betam4x

Just a few notes: The card I purchased (the reference card I mentioned earlier) is doing great so far. I'm in an air-conditioned room (66 degrees), so that's probably why the fans don't spin up that high. Haven't done any actual readings. Planning on building a water cooling setup at some point in the future when I get the budget. So far, it's been perfect for gaming. Best $250 upgrade I ever made. The main games I purchased it for were MKX (was choppy under max details on my 6970) and Ark (Ark was damn near unplayable without making things ugly as hell on my 6970), and it handles those great.


----------



## battleaxe

Quote:


> Originally Posted by *betam4x*
> 
> Just a few notes: The card I purchased (reference card I mentioned earlier) is doing great so far. I'm in an air conditioned room (66 degrees), so that's probably why the fans don't spin up that high. Haven't done any actual readings. Planning on building a water cooling setup at some point in the future when i get the budget. So far, it's been perfect for gaming. Best $250 upgrade I ever made. The main games I purchased it for were MKX (was choppy under max details on my 6970) and Ark (Ark was damn near unplayable without making things ugly as hell on my 6970) and it handles those great.


If on a budget, the "RedMod" works great on the 290/X. I have put 3 of them on water using this method. The VRMs won't be as cool as a full loop, of course, so it's unlikely that you can use a full 200mv if overclocking. But the temps versus stock are better by around 20-30C depending on your setup. The VRMs, too, will benefit greatly over stock if you add some form of air-cooled heatsinks. All told, you can do a RedMod for around $50 total if you find a deal on a water cooler. It's not ideal, but great in a pinch and far better than stock, I can tell you. The cost is about 1/4 of what a loop would cost, if I had to venture a guess (for only the GPU in question). Just something to think about if you don't want to go full custom water.


----------



## rdr09

Quote:


> Originally Posted by *By-Tor*
> 
> I feel you're pain. I remember moving back to the states from Germany in 1990 and all our stuff was in crates on a ship and my computer at that time was a Commodore 64 (still have it).


mine are coming from the states. hopefully they still work when put together.

hold on to the C64 - classic. i built my first pc right around that time and had a P1 in it. It's as fast as our phones nowadays.

actually . . . slower.


----------



## Faster_is_better

Does the trick for increasing voltage in MSI AB still work?

@sugarhell originally posted these instructions
Quote:


> Guys, it's easy to give more volts in MSI.
> 
> Just use /wi4,30,8d,10 for 100mv. The offset step is 6.25 mv and the argument is hexadecimal: 0x10 = 16 in decimal, so 16*6.25 = 100 mv. For 50mv you need 8. For 200mv you need 20 (0x20 = 32 in decimal, so 32*6.25 = 200mv).
> 
> The easy way to do changes:
> 
> Create a txt on desktop. Write
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi4,30,8d,10
> 
> and then save as a .bat file. Every time you start this bat file, MSI will start with +100mv
> 
> For 50mv: 8
> For 100mv:10
> For 125mv:14
> For 150mv:18
> For 175mv:1C
> For 200mv:20
> 
> I wouldn't go over this point because
> 1) You are close to leaving the sweet spot of the ref PCB VRMs' efficiency
> 2) These commands add 200mv on top of the 100mv offset through the AB GUI. That means 300mv total
> 
> By default the /wi command applies to the current GPU only. So if you have 2 or more GPUs you must use the /sg command. The command line looks something like this
> ex:MsiAfterburner.exe /sg0 /wi4,30,8d,10 /sg1 /wi4,30,8d,10
> 
> ***************
> 
> Just a fix. If you use the new beta with the new low level i2c bus you need to change wi4 to wi6


I just tried them and it doesn't seem to launch MSI correctly. I'm using MSI AB 4.1.1

I wanted to try some higher voltage and see if my cards are topping out on the OC or starved for voltage. Looks like the 390 BIOS applied to them has stunted the overall OC potential of at least one of my cards; not sure if voltage will help, though.
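For anyone double-checking the offset arithmetic in the quoted instructions, here's a minimal Python sketch. The 6.25 mV step and the `/wi6,30,8d,<hex>` argument format come from sugarhell's post above; the helper name `offset_arg` is made up here, and this is purely illustrative, not part of Afterburner itself:

```python
# Sanity-check of the voltage-offset arithmetic from the quoted instructions:
# the last /wi argument is a hex count of 6.25 mV steps.
STEP_MV = 6.25  # per the quoted post

def offset_arg(mv: float) -> str:
    """Return the hex digits for a +mv offset (mv must be a multiple of 6.25)."""
    units = mv / STEP_MV
    if units != int(units):
        raise ValueError("offset must be a multiple of 6.25 mV")
    return format(int(units), 'x')

# Reproduce the table from the quoted post:
for mv in (50, 100, 125, 150, 175, 200):
    print(f"+{mv} mV -> /wi6,30,8d,{offset_arg(mv)}")
```

This reproduces the table in the post (8, 10, 14, 18, 1c, 20); the post writes 1C in uppercase, but the value is the same.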


----------



## fyzzz

Quote:


> Originally Posted by *Faster_is_better*
> 
> Does the trick for increasing voltage in MSI AB still work?
> 
> @sugarhell originally posted these instructions
> I just tried them and it doesn't seem to launch MSI correctly. I'm using MSI AB 4.1.1
> 
> I wanted to try some higher voltage and see if my cards are topping out on OC or starved for voltage. Looks like these 390 BIOS applied to them has stunted the overall OC potential of at least one of my cards, not sure if voltage will help though.


You should have two shortcuts. The one that gives more voltage will only apply the voltage you have specified; use your normal shortcut to open MSI AB. Also, more voltage may only make things worse; the 390 BIOS works differently. The card will not overclock as high as on a 290 BIOS. I can get about 1270/1780 on the normal 290 BIOS and about 1245/1750 with the 390 BIOS, with frequent black screens. But the black screens usually don't show up with less voltage. Also, I think you need to change the 4 in wi4 to a 6.


----------



## kizwan

Quote:


> Originally Posted by *betam4x*
> 
> Just a few notes: The card I purchased (reference card I mentioned earlier) is doing great so far. I'm in an air conditioned room (66 degrees), so that's probably why the fans don't spin up that high. Haven't done any actual readings. Planning on building a water cooling setup at some point in the future when i get the budget. So far, it's been perfect for gaming. Best $250 upgrade I ever made. The main games I purchased it for were MKX (was choppy under max details on my 6970) and Ark (Ark was damn near unplayable without making things ugly as hell on my 6970) and it handles those great.


Water cooling sound like a good plan.
Quote:


> Originally Posted by *Faster_is_better*
> 
> Does the trick for increasing voltage in MSI AB still work?
> 
> @sugarhell originally posted these instructions
> Quote:
> 
> 
> 
> Guys its easy to give more volts on msi.
> 
> Just use /wi4,30,8d,10 for 100mv. The offset step is 6.25 mv and the argument is hexadecimal: 0x10 = 16 in decimal, so 16*6.25 = 100 mv. For 50mv you need 8. For 200mv you need 20 (0x20 = 32 in decimal, so 32*6.25 = 200mv).
> 
> The easy way to do changes:
> 
> Create a txt on desktop. Write
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi4,30,8d,10
> 
> and then save as a .bat file. Every time you start this bat file, MSI will start with +100mv
> 
> For 50mv: 8
> For 100mv:10
> For 125mv:14
> For 150mv:18
> For 175mv:1C
> For 200mv:20
> 
> I wouldn't go over this point because
> 1)You are close to leave the sweet spot of the ref pcb vrms efficiency
> 2)These commands add 200mv on top of the 100mv offset through AB gui.That means 300mv
> 
> By default /wi command apply to current gpu only. So if you have 2 or more gpus you must use /sg command. That means the command line is something like that
> ex:MsiAfterburner.exe /sg0 /wi4,30,8d,10 /sg1 /wi4,30,8d,10
> 
> ***************
> 
> Just a fix. If you use the new beta with the new low level i2c bus you need to change wi4 to wi6
> 
> 
> 
> I just tried them and it doesn't seem to launch MSI correctly. I'm using MSI AB 4.1.1
> 
> I wanted to try some higher voltage and see if my cards are topping out on OC or starved for voltage. Looks like these 390 BIOS applied to them has stunted the overall OC potential of at least one of my cards, not sure if voltage will help though.
Click to expand...

After you execute that command, you need to launch MSI AB by double-clicking the MSI AB icon. You'll see the voltage slider now pre-set to the value you set earlier using the command. Don't touch the voltage slider because it will reset to 100.

FYI, */wi4* no longer works; */wi6* is the correct one. Check the first post.


----------



## Faster_is_better

Quote:


> Originally Posted by *fyzzz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Faster_is_better*
> 
> Does the trick for increasing voltage in MSI AB still work?
> 
> @sugarhell originally posted these instructions
> I just tried them and it doesn't seem to launch MSI correctly. I'm using MSI AB 4.1.1
> 
> I wanted to try some higher voltage and see if my cards are topping out on OC or starved for voltage. Looks like these 390 BIOS applied to them has stunted the overall OC potential of at least one of my cards, not sure if voltage will help though.
> 
> 
> 
> You should have two shortcuts. The one that will give more voltage, will only apply the voltage you have specified, use your normal shortcut to open msi ab. Also more voltage is maybe only going to make things worse, the 390 bios works differently. The card will not overclock as high as on a 290 bios. I can get about 1270/1780 on normal 290 bios and about 1245/1750 with 390 bios, with frequent black screens. But the blackscreens usually don't show up under less voltage.Also I think you need to change the 4 in wi4 to a 6.
Click to expand...

Yeah I saw your posts in the 390 BIOS thread, so i suspect I am running into the same issue. I haven't done any benchmark runs to build up a comparison yet and see which BIOS is actually better.

So I created the bat file; it runs, I see the command pop up, then it immediately closes and MSI AB doesn't run. Using this:

Code:



Code:


CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /wi6,30,8d,14

Now I did test it with just opening MSIAfterburner.exe in the .bat file, and it worked that way, so something with the extra parameters isn't working.

If I open MSI AB after running that bat file, it's just the default program.


----------



## fyzzz

Quote:


> Originally Posted by *Faster_is_better*
> 
> Yeah I saw your posts in the 390 BIOS thread, so i suspect I am running into the same issue. I haven't done any benchmark runs to build up a comparison yet and see which BIOS is actually better.
> 
> So I created the bat file, it runs I see the command pop up then it immediately closes and MSI AB doesn't run. Using this:
> 
> Code:
> 
> 
> 
> Code:
> 
> 
> CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi6,30,8d,14
> 
> Now I did test with it just opening MSIAfterburner.exe, in the .bat file and it worked that way, so something with the extra parameters isn't working.
> 
> If I open MSI AB after running that bat file its just the default program.


I usually make a second shortcut to MSI AB and paste "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" /wi6,30,8d,20 in the target. When I open that, the voltage is applied, but MSI AB will not open; I must use the regular shortcut to control the voltage. That works for me at least.


----------



## betam4x

You guys make me sad. I'm 33 now, I was raised on tech. I've owned a number of gems throughout the years (Tandy CoCo 2, Intellivision, IBM XT, IBM XT clone, IBM PS/2, many MANY More)...I don't have any of them anymore.

My proudest moment? My Pentium 60 machine died (updated BIOS failed to flash properly; tried to repeat it... wouldn't work). Fired up my IBM PS/2 (286, 4 MB RAM, monochrome monitor), grabbed a book from a library that came with a PPP dialer, logged on to IRC via a DOS-based IRC client that I grabbed via the internet at the library... found a graphical DOS web browser somewhere... stuck with that setup for months while I saved up the cash for a new machine. I was a preteen or teen then.

Sometimes i miss the old days.


----------



## Faster_is_better

Quote:


> Originally Posted by *fyzzz*
> 
> I usually make a second shortcut to MSI AB and paste "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" /wi6,30,8d,20 in the target. When I open that, the voltage is applied, but MSI AB will not open; I must use the regular shortcut to control the voltage. That works for me at least.


Hmm, seems to be working; it's a bit tricky though. Maybe I wasn't noticing it work since I have 2 cards. Thanks for the help


----------



## kizwan

Not really tricky.



You can add "pause" just to put a delay between the two commands. This way you don't need to manually launch MSI AB.

Code:



Code:


MsiAfterburner.exe /sg0 /wi6,30,8d,14 /sg1 /wi6,30,8d,14
pause
MsiAfterburner.exe

or

Code:





cd "C:\Program Files (x86)\MSI Afterburner"
MsiAfterburner.exe /sg0 /wi6,30,8d,14 /sg1 /wi6,30,8d,14
pause
MsiAfterburner.exe

*Edit:* No need to add "pause". I was worried it might go haywire or something since it executes both commands pretty quickly. I just tested without "pause" and it is working without any problem.
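Putting the pieces together, the whole thing can live in a single .bat file. This is only a sketch assuming the default install path; /sg0 and /sg1 select GPU 0 and GPU 1, and the /wi6,30,8d,14 values are the card-specific I2C writes quoted above, so substitute whatever numbers work on your own card:

Code:


@echo off
rem Apply the voltage unlock to both GPUs (card-specific /wi values).
cd /d "C:\Program Files (x86)\MSI Afterburner"
MSIAfterburner.exe /sg0 /wi6,30,8d,14 /sg1 /wi6,30,8d,14
rem Then launch the normal UI; "start" spawns a separate process so
rem this batch window can close on its own.
start "" MSIAfterburner.exe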


----------



## Archea47

Quote:


> Originally Posted by *kizwan*
> 
> Not really tricky.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> You can add "pause" just to put a delay between the two commands. This way you don't need to manually launch MSI AB.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Code:
> 
> 
> 
> 
> 
> MsiAfterburner.exe /sg0 /wi6,30,8d,14 /sg1 /wi6,30,8d,14
> pause
> MsiAfterburner.exe
> 
> or
> 
> Code:
> 
> 
> 
> 
> 
> cd "C:\Program Files (x86)\MSI Afterburner"
> MsiAfterburner.exe /sg0 /wi6,30,8d,14 /sg1 /wi6,30,8d,14
> pause
> MsiAfterburner.exe
> 
> 
> 
> 
> 
> *Edit:* No need to add "pause". I was worried it might go haywire or something since it executes both commands pretty quickly. I just tested without "pause" and it is working without any problem.


Thanks kizwan! +rep


----------



## Liranan

Need some help guys.

I had made up my mind to buy a 290 Tri-X but then saw that an Asus 290X DCU II isn't that much more.

From the reviews I've seen the DCU II can't OC that much and the Tri-X is very quiet and can OC nicely.

Do you guys think the 290 Tri-X is still the better buy or is the DCU II the one I should get?


----------



## Gumbi

Quote:


> Originally Posted by *Liranan*
> 
> Need some help guys.
> 
> I had made up my mind to buy a 290 Tri-X but then saw that an Asus 290X DCU II isn't that much more.
> 
> From the reviews I've seen the DCU II can't OC that much and the Tri-X is very quiet and can OC nicely.
> 
> Do you guys think the 290 Tri-X is still the better buy or is the DCU II the one I should get?


Personally I would rather have the Sapphire one. It has a far better cooler (Asus used their 780 cooler, which means two of the heatpipes have no contact with the GPU at all).

That being said, the 290X is still just that, a 290X, and will likely perform a tad faster.


----------



## Faster_is_better

Quote:


> Originally Posted by *kizwan*
> 
> Not really tricky.
> 
> 
> 
> You can add "pause" just to put a delay between the two commands. This way you don't need to manually launch MSI AB.
> 
> Code:
> 
> 
> 
> 
> 
> MsiAfterburner.exe /sg0 /wi6,30,8d,14 /sg1 /wi6,30,8d,14
> pause
> MsiAfterburner.exe
> 
> or
> 
> Code:
> 
> 
> 
> 
> 
> cd "C:\Program Files (x86)\MSI Afterburner"
> MsiAfterburner.exe /sg0 /wi6,30,8d,14 /sg1 /wi6,30,8d,14
> pause
> MsiAfterburner.exe
> 
> *Edit:* No need to add "pause". I was worried it might go haywire or something since it executes both commands pretty quickly. I just tested without "pause" and it is working without any problem.


Nice, that will save an extra 2 clicks


----------



## Gumbi

Broke into the 1600 master race range in Heaven







+125mV, 1206MHz/1649MHz overclock (awkward numbers are due to the new Trixx being ******ed). 2 outer fans functional on my GPU (working on a fix) at 100% for the bench run: 74 degrees on the core, 60 on VRM 1 and 54 on VRM 2. Man I LOVE the Vapor-X cooling


----------



## battleaxe

Quote:


> Originally Posted by *Gumbi*
> 
> Broke into the 1600 master race range in Heaven
> 
> 
> 
> 
> 
> 
> 
> +125mV, 1206MHz/1649MHz overclock (awkward numbers are due to the new Trixx being ******ed). 2 outer fans functional on my GPU (working on a fix) at 100% for the bench run: 74 degrees on the core, 60 on VRM 1 and 54 on VRM 2. Man I LOVE the Vapor-X cooling


Nice Job sir!


----------



## FastEddieNYC

Quote:


> Originally Posted by *Liranan*
> 
> Need some help guys.
> 
> I had made up my mind to buy a 290 Tri-X but then saw that an Asus 290X DCU II isn't that much more.
> 
> From the reviews I've seen the DCU II can't OC that much and the Tri-X is very quiet and can OC nicely.
> 
> Do you guys think the 290 Tri-X is still the better buy or is the DCU II the one I should get?


The Sapphire has the better cooler; as stated above, Asus simply reused the cooler from another card. Sapphire makes two versions of the Tri-X: the old version uses 6-pin and 8-pin power connectors on the reference board, while the "New Edition" uses the 390X PCB and has two 8-pin power connectors.


----------



## Liranan

So the one that is so cheap is the reference model with 6+8-pin power. The 390-PCB version is just as expensive as the 390s, at which point I might as well get a 390.


----------



## battleaxe

Quote:


> Originally Posted by *Gumbi*
> 
> Broke into the 1600 master race range in Heaven
> 
> 
> 
> 
> 
> 
> 
> +125mV, 1206MHz/1649MHz overclock (awkward numbers are due to the new Trixx being ******ed). 2 outer fans functional on my GPU (working on a fix) at 100% for the bench run: 74 degrees on the core, 60 on VRM 1 and 54 on VRM 2. Man I LOVE the Vapor-X cooling


BTW, if you click on the sliders in TRIXX you can then use the arrow keys to get the exact value you want.







FYI

I just got Trixx to finally work right with +200mV. I had to slowly ramp up the volts to get it not to black screen. Strange, but it works. For some reason I can get a higher memory clock in Afterburner than in the Trixx software. I assume it's because Trixx has no aux voltage control? Anyway, here's my new personal best.


----------



## FastEddieNYC

Quote:


> Originally Posted by *Liranan*
> 
> So the one that is so cheap is the reference model with 6+8-pin power. The 390-PCB version is just as expensive as the 390s, at which point I might as well get a 390.


The reference model was the one listed. You just need to keep checking because they go out of stock quickly. The new version has been bouncing up and down in price. Be patient and you will catch one discounted.


----------



## Aussiejuggalo

So, which drivers should I go with, 15.7.1 or 15.8?


----------



## b0uncyfr0

Is a 1500MHz memory overclock on stock memory voltage any good? Does pushing it too far without enough voltage affect performance?

My card taps out around 1110 @ +100. Is it advisable to increase the TDP? Would that help the OC? I'd like to get to 1500 core. ASIC quality is 69.8%.

Do either of the BIOSes (non-UEFI vs UEFI) OC better? I have an 8GB 290X Vapor-X, btw.


----------



## bkvamme

1500 core is pretty crazy; I can't remember seeing anybody with that good an overclock on the core.

I reached 1250MHz on the core and 1550MHz on the memory, which (to my understanding) is not half-bad.

Increasing the TDP is a must to get a higher overclock, and if you want to push it further, you have to use Trixx or launch Afterburner with special command-line switches.


----------



## mus1mus

1500 on the core is nothing special lately, but only on the other side of the GPU world.









1550 on the VRAM is just about normal. 1250 gaming/daily on the core is nothing short of magnificent.


----------



## Gumbi

Quote:


> Originally Posted by *b0uncyfr0*
> 
> Is a 1500MHz memory overclock on stock memory voltage any good? Does pushing it too far without enough voltage affect performance?
> 
> My card taps out around 1110 @ +100. Is it advisable to increase the TDP? Would that help the OC? I'd like to get to 1500 core. ASIC quality is 69.8%.
> 
> Do either of the BIOSes (non-UEFI vs UEFI) OC better? I have an 8GB 290X Vapor-X, btw.


Anything over 1200 on air is extremely impressive. You should leave your power slider maxed, as there is no disadvantage to that.

AFAIK, neither BIOS offers an OC benefit over the other. +100mV for 1110 on the core is pretty poor btw, are you sure about that figure?


----------



## mfknjadagr8

Quote:


> Originally Posted by *Gumbi*
> 
> Anything over 1200 on air is extremely impressive. You should leave your power slider maxed, as there is no disadvantage to that.
> 
> AFAIK, neither BIOS offers an OC benefit over the other. +100mV for 1110 on the core is pretty poor btw, are you sure about that figure?


If you didn't, what would you do with that 50 cents a year? lol


----------



## Gumbi

Quote:


> Originally Posted by *mfknjadagr8*
> 
> If you didn't, what would you do with that 50 cents a year? lol


Maxing the power slider doesn't mean your card necessarily draws more power, it just means it has the option to do so. If you overclock you might get power throttled, so maxing the power slider eliminates this.

At stock you probably have headroom to spare and would never be throttled anyway.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Gumbi*
> 
> Maxing the power slider doesn't mean your card necessarily draws more power, it just means it has the option to do so. If you overclock you might get power throttled, so maxing the power slider eliminates this.
> 
> At stock you probably have headroom to spare and would never be throttled anyway.


I was just joking... even if you did use that extra power it wouldn't even be 50 cents a year, in all likelihood...


----------



## MEC-777

Best OC I've reached is 1200/1600 with +50 power limit and +100mV. It wasn't stable enough to run games, but it was enough to run through Fire Strike: http://www.3dmark.com/fs/5837443









Daily stable gaming OC I run is 1100/1500 with +50 power and +0mV. When not gaming, or when playing less-demanding games, I just leave it stock.

Card is an HIS IceQ X2 R9 290 - not even the "OC" version, so I was a little surprised to see how well it overclocked. It was able to hit these clocks on air, but the core temp reached the low 80s and the VRMs broke into the 90s. On water (Kraken G10 + H55) it rarely cracks the mid 60s, and same with the VRMs.


----------



## fyzzz

Testing some more overclocking; it seems like I can never stop. Tested 1247/1625 with +100mV (voltage is increased in the BIOS as well). Ran a few loops of Heaven at 1080p with no problems; max temp was 48°C on the core and VRMs with an ambient of 19°C. I will probably not run that clock, mostly did it for fun. The voltage under load was 1.28-1.3V, with a spike to 1.398V.


----------



## Gumbi

Quote:


> Originally Posted by *fyzzz*
> 
> Testing some more overclocking; it seems like I can never stop. Tested 1247/1625 with +100mV (voltage is increased in the BIOS as well). Ran a few loops of Heaven at 1080p with no problems; max temp was 48°C on the core and VRMs with an ambient of 19°C. I will probably not run that clock, mostly did it for fun. The voltage under load was 1.28-1.3V, with a spike to 1.398V.


That's pretty ridiculous. I'm assuming you're under water with those temps?


----------



## fyzzz

Quote:


> Originally Posted by *Gumbi*
> 
> That's pretty ridiculous. I'm assuming you're under water with those temps?


Yup, water, and autumn is here so lower ambients, yay. Tried playing Crysis 3 but it did not want to work: the CPU was at 100% and the GPU load went up and down. Moved on to BF4, where 1247 was stable for a while, but then small artifacts showed. Put the core down to 1240 and I had no problems playing a BF4 match.


----------



## Gumbi

Quote:


> Originally Posted by *fyzzz*
> 
> Yup, water, and autumn is here so lower ambients, yay. Tried playing Crysis 3 but it did not want to work: the CPU was at 100% and the GPU load went up and down. Moved on to BF4, where 1247 was stable for a while, but then small artifacts showed. Put the core down to 1240 and I had no problems playing a BF4 match.


+100mV is perfectly reasonable to run 24/7 under water. Crysis 3 is probably a bit more stressful on the GPU than BF4. Nonetheless, your overclock is very impressive indeed. You have a golden sample, my friend.


----------



## fyzzz

Quote:


> Originally Posted by *Gumbi*
> 
> +100mV is perfectly reasonable to run 24/7 under water. Crysis 3 is probably a bit more stressful on the GPU than BF4. Nonetheless, your overclock is very impressive indeed. You have a golden sample, my friend.


Well, not really golden I would say, just pretty decent. Keep in mind that I have raised the voltage in the BIOS as well; it would take +175mV or something to get that stable on the normal BIOS. Voltage under load is, as said, 1.28-1.3V.


----------



## Connolly

Hi guys,

Can't find very much about this problem anywhere; has anyone else had this security issue flagged by Windows Defender with the latest DDU software:

Program:win32/Hadsruda!bit

I can find a small piece saying that it's a false positive, but only on one forum. I'd love not to do a fresh install, but may need to!

Cheers.


----------



## Jflisk

Quote:


> Originally Posted by *Connolly*
> 
> Hi guys,
> 
> Can't find very much about this problem anywhere; has anyone else had this security issue flagged by Windows Defender with the latest DDU software:
> 
> Program:win32/Hadsruda!bit
> 
> I can find a small piece saying that it's a false positive, but only on one forum. I'd love not to do a fresh install, but may need to!
> 
> Cheers.


Had the same problem; I think there's a newer DDU update that fixes it. Haven't had the chance to try the updated version yet.


----------



## b0uncyfr0

Ah sorry guys, 1500 core is insane. I meant 1200 is my desired number. I always thought +100 for 1100 was very low for a Vapor-X. I'm gonna crank it up to 1500 and drop all other overclocks.

How would I remove the TDP limit? I have a SilverStone 750W so I know that's not the issue.


----------



## Gumbi

Quote:


> Originally Posted by *b0uncyfr0*
> 
> Ah sorry guys, 1500 core is insane. I meant 1200 is my desired number. I always thought +100 for 1100 was very low for a Vapor-X. I'm gonna crank it up to 1500 and drop all other overclocks.
> 
> How would I remove the TDP limit? I have a SilverStone 750W so I know that's not the issue.


I run my 290 Vapor-X at 1130MHz core at stock volts (+25mV); +100mV is a heck of a lot for 1100 core, are you sure that's correct?

Again, it's impossible to do 1500 core on air or even water.

If you are upping the memory you'll need more volts too. Use MSI Afterburner to slide the power slider to 150%.


----------



## MEC-777

Quote:


> Originally Posted by *Gumbi*
> 
> I run my 290 Vapor-X at 1130MHz core at stock volts (+25mV); +100mV is a heck of a lot for 1100 core, are you sure that's correct?
> 
> Again, it's impossible to do 1500 core on air or even water.
> 
> If you are upping the memory you'll need more volts too. Use MSI Afterburner to slide the power slider to 150%.


My HIS 290 runs at 1100/1500 at stock voltage (+0mV) all day long, happy as can be.

How did you get the power limit in AB to go to 150%!? It only goes to +50% afaik.

Keep in mind, everyone, that every time you apply a new overclock in AB you have to also re-set the power limit in overdrive in Catalyst to +50% (max setting) - otherwise it resets to zero by default and overrides the power limit set in AB (even if it still shows +50% in AB). You'll know when you have to do this because it'll start throttling back the core clock under load.







It's annoying but if you set a hotkey combo (I have ALT + P for +50% power limit and ALT + O for stock power limit in overdrive) it makes overclocking with AB much easier. I apply my new OC settings in AB, then hit "ALT + P" and it then holds that clock speed no matter what.


----------



## mfknjadagr8

I remedy that by simply not using Overdrive.


----------



## Gumbi

Quote:


> Originally Posted by *MEC-777*
> 
> My HIS 290 runs at 1100/1500 at stock voltage (+0mV) all day long, happy as can be.
> 
> How did you get the power limit in AB to go to 150%!? It only goes to +50% afaik.
> 
> Keep in mind, everyone, that every time you apply a new overclock in AB you have to also re-set the power limit in overdrive in Catalyst to +50% (max setting) - otherwise it resets to zero by default and overrides the power limit set in AB (even if it still shows +50% in AB). You'll know when you have to do this because it'll start throttling back the core clock under load.
> 
> 
> 
> 
> 
> 
> 
> It's annoying but if you set a hotkey combo (I have ALT + P for +50% power limit and ALT + O for stock power limit in overdrive) it makes overclocking with AB much easier. I apply my new OC settings in AB, then hit "ALT + P" and it then holds that clock speed no matter what.


Plus 50% = 150%, that's simple maths... 100% + 50% lol









I don't run CCC so I haven't run into that issue.


----------



## MEC-777

Quote:


> Originally Posted by *mfknjadagr8*
> 
> i remedy that by simply not using overdrive?


Quote:


> Originally Posted by *Gumbi*
> 
> Plus 50% = 150%, that's simple maths... 100% + 50% lol
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't run CCC so I haven't run into that issue.


Ah I see what you mean - about the +150%. I've just never heard someone refer to putting the slider to +50% as +150%.









So what you guys are telling me is that you don't install CCC when you install your Catalyst drivers? Is that correct?

I don't "use" overdrive for OCing, I use MSI AB, but overdrive still overrides/interferes regardless.

Also, do you guys not use the gaming 3D application settings in CCC? The settings in there are crucial for many of my games - for making them look and run better.


----------



## rdr09

Quote:


> Originally Posted by *MEC-777*
> 
> Ah I see what you mean - about the +150%. I've just never heard someone refer to putting the slider to +50% as +150%.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So what you guys are telling me is that you don't install CCC when you install your Catalyst drivers? Is that correct?
> 
> I don't "use" overdrive for OCing, I use MSI AB, but overdrive still overrides/interferes regardless.
> 
> Also, do you guys not use the gaming 3D application settings in CCC? The settings in there are crucial for many of my games - for making them look and run better.


Install CCC, but do not use Overdrive for OC'ing (others do with no issue, but with no third-party apps running).

What I do after installing the Catalyst driver, just before reboot, is go to msconfig and uncheck CCC and Raptr under Startup. I can still open CCC like any other app, such as Excel, if need be.


----------



## kizwan

@MEC-777

+50% == 150%
+150% =/= 150%

Writing +150% would be wrong, though; 150% is correct, and that's what Gumbi wrote.

Regarding OverDrive, simply disable OverDrive in CCC.
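If the offset-vs-total wording is still confusing, a throwaway batch snippet makes the arithmetic explicit (the 250W stock board power here is only an assumed illustrative figure, not an official spec):

Code:


@echo off
rem The slider stores an offset on top of 100% of stock board power.
set /a OFFSET=50
set /a TOTAL=100+OFFSET
set /a WATTS=250*TOTAL/100
rem Prints: +50% on the slider = 150% of stock TDP (375W on a 250W card)
echo +%OFFSET%%% on the slider = %TOTAL%%% of stock TDP (%WATTS%W on a 250W card)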


----------



## Arizonian

Quote:


> Originally Posted by *Translator*
> 
> Sapphire R9 290 Noctua edition))
> http://i-fotki.info/19/d141d6973df016069eaacb583298990b25e5ff222501871.jpg.html
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 


Congrats - added








Quote:


> Originally Posted by *colorfuel*
> 
> Hi All,
> 
> So I did a few overclocking tests with the Ashes of singularity benchmark on my 290X Tri-X new edition
> 
> running the latest drivers 15.7.1 on Windows 10 (obviously).
> 
> I ran all the tests at 1080 with default settings, which are, I suppose, what reviewers name high. I use fan settings to keep the GPU at max. temp of 75° and VRM1 80°, to prevent any throttling.
> 
> My 3570K runs at 4.5Ghz
> 
> Here are the results, using the average the benchmark shows at the end.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> First of all, there is the obvious CPU bottleneck in DX11. DX12 gives me almost 100% more FPS; more like 95%.
> Secondly, overclocking the memory doesn't do it any good. I have never seen any tangible improvement between running 1350 and 1500.
> And finally, the CPU bottleneck comes back as soon as I reach about 1125MHz (staying with 1350 on the memory). From 1020 to 1100 there is an improvement of 5.5% for a 7.8% GPU OC. From 1100 to 1125 there is a 1.3% improvement for a 2.2% OC. And then from 1125 to 1150 there is no improvement whatsoever, so I'm assuming my system hits its sweet spot at about 1125MHz on the GPU (edit: on further tests I get 44.3 FPS at 1150/1500, but that doesn't really make a noticeable difference). Everything above that seems to be bottlenecked by the CPU again, at which point the difference between DX11 and DX12 is also the highest: 97.3%.
> 
> The test is quite sensitive to overclocks. I've been able to run my 290X at 1150/1500 in anything I threw at it with no artifacting at +106mV.
> 
> For example, here is a Fire Strike run at 1157/1500, no artifacting:
> 
> http://www.3dmark.com/3dm/8308769?
> 
> Ashes of Singularity, however, makes my PC reboot as soon as the test starts. I've been able to run 1150/1350 but there was no way to get it to run 1150/1500 (neither DX11 nor DX12). I suspect this is due to my 580W PSU reaching its limits. This has never happened before in any game. Or maybe there is another explanation I don't know of.
> 
> Edit: I'm attaching the results as I type.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 1020_1350_DX11.txt 216k .txt file
> 
> 
> 1020_1350_DX12.txt 394k .txt file
> 
> 
> 1100_1350_DX11.txt 230k .txt file
> 
> 
> 1100_1350_DX12.txt 413k .txt file
> 
> 
> 1100_1500_DX11.txt 232k .txt file
> 
> 
> 1100_1500_DX12.txt 410k .txt file
> 
> 
> 1125_1350_DX11.txt 231k .txt file
> 
> 
> 1125_1350_DX12.txt 418k .txt file
> 
> 
> 1125_1500_DX11.txt 229k .txt file
> 
> 
> 1125_1500_DX12.txt 417k .txt file
> 
> 
> 1150_1350_DX11.txt 234k .txt file
> 
> 
> 1150_1350_DX12.txt 416k .txt file


Congrats - added








Quote:


> Originally Posted by *KingCry*
> 
> Well, time to finally join the club after owning my Lightning for over a month now
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Was able to get 1245 core and 1500 memory
> 
> It has Samsung memory on it as well which was a first for any AMD card I've owned so far


Congrats - added








Quote:


> Originally Posted by *JourneymanMike*
> 
> Yes,
> 
> Daisy Chained
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> Not
> 
> 
> 
> 
> 
> I believe this is what y'all have been talking about...


Congrats - updated








Quote:


> Originally Posted by *betam4x*
> 
> Hynix GDDR5.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added








Quote:


> Originally Posted by *Gumbi*
> 
> Broke into the 1600 master race range in Heaven
> 
> 
> 
> 
> 
> 
> 
> +125mV, 1206MHz/1649MHz overclock (awkward numbers are due to the new Trixx being ******ed). 2 outer fans functional on my GPU (working on a fix) at 100% for the bench run: 74 degrees on the core, 60 on VRM 1 and 54 on VRM 2. Man I LOVE the Vapor-X cooling
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added








Quote:


> Originally Posted by *MEC-777*
> 
> My HIS 290 runs at 1100/1500 at stock voltage (+0mV) all day long, happy as can be.
> 
> How did you get the power limit in AB to go to 150%!? It only goes to +50% afaik.
> 
> Keep in mind, everyone, that every time you apply a new overclock in AB you have to also re-set the power limit in overdrive in Catalyst to +50% (max setting) - otherwise it resets to zero by default and overrides the power limit set in AB (even if it still shows +50% in AB). You'll know when you have to do this because it'll start throttling back the core clock under load.
> 
> 
> 
> 
> 
> 
> 
> It's annoying but if you set a hotkey combo (I have ALT + P for +50% power limit and ALT + O for stock power limit in overdrive) it makes overclocking with AB much easier. I apply my new OC settings in AB, then hit "ALT + P" and it then holds that clock speed no matter what.


Congrats - added


----------



## tonymontana95

I've just clocked my PowerColor R9 290X PCS+ to 1150/1525MHz with +63mV and +50% power limit. Stock is 1050/1350. What do you think? Could I do a little more? The temps don't go above 80°C, and I could modify the fan speed a bit.


----------



## Gumbi

Quote:


> Originally Posted by *tonymontana95*
> 
> I've just clocked my PowerColor R9 290X PCS+ to 1150/1525MHz with +63mV and +50% power limit. Stock is 1050/1350. What do you think? Could I do a little more? The temps don't go above 80°C, and I could modify the fan speed a bit.


Very solid overclock, nice. From what I know the PCS+ cooler is capable of a bit better than that. How fast is your fan getting? What are your ambient temps? If you have 20-25 degree ambients ("normal", basically) and your fan speed is below 45-50%, upping the fan speed will give you good gains.









Either way, what you have at the moment seems decent.


----------



## tonymontana95

Quote:


> Originally Posted by *Gumbi*
> 
> Very solid overclock, nice. From what I know the PCS+ cooler is capable of a bit better than that. How fast is your fan getting? What are your ambient temps? If you have 20-25 degree ambients ("normal", basically) and your fan speed is below 45-50%, upping the fan speed will give you good gains.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Either way, what you have at the moment seems decent.


I think I'll try circa +80mV and 1170/1550, and if it's stable I'll keep it at that clock.
This is the fan speed curve; I'll modify it for further OC.


----------



## MEC-777

Quote:


> Originally Posted by *rdr09*
> 
> Install CCC, but do not use Overdrive for OC'ing (others do with no issue, but with no third-party apps running).
> 
> What I do after installing the Catalyst driver, just before reboot, is go to msconfig and uncheck CCC and Raptr under Startup. I can still open CCC like any other app, such as Excel, if need be.


I never install raptr. I always make sure I uncheck that horrible program during the installation process.









Ah I see, so you just have it so CCC doesn't even run at startup - but would that render any gaming 3D application settings you have changed in CCC no longer relevant (because CCC is not running)? I also use VSR in most of my games, which is an option you have to enable in CCC; wouldn't that stop working if CCC is not actively running?
Quote:


> Originally Posted by *kizwan*
> 
> @MEC-777
> 
> +50% == 150%
> +150% =/= 150%
> 
> Writing +150% would be wrong, though; 150% is correct, and that's what Gumbi wrote.
> 
> Regarding OverDrive, simply disable OverDrive in CCC.


Yeah, I got the +50% = 150% thing, thanks.







Just never heard anyone say "set it to 150%" before is all.

I wasn't aware it was possible to disable overdrive in CCC. Will have to try that. Thanks.


----------



## MEC-777

Quote:


> Originally Posted by *tonymontana95*
> 
> I've just clocked my PowerColor R9 290X PCS+ to 1150/1525MHz with +63mV and +50% power limit. Stock is 1050/1350. What do you think? Could I do a little more? The temps don't go above 80°C, and I could modify the fan speed a bit.


If it's stable in demanding games (like The Witcher 3, etc.), then I'd say yeah, you still have some headroom.


----------



## tonymontana95

Quote:


> Originally Posted by *MEC-777*
> 
> If it's stable in demanding games (like The Witcher 3 etc.), then I'd say yeah you still have some head room.


I tested it in a few benchmarks like Unigine Heaven, and in BF4. I played BF4 for 3-4 hours multiple times and everything seems to be OK (max temp 80°C, GPU load around 95%).


----------



## rdr09

Quote:


> Originally Posted by *MEC-777*
> 
> I never install raptr. I always make sure I uncheck that horrible program during the installation process.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ah I see, so you just have it so CCC doesn't even run at startup - but would that render any gaming 3D application settings you have changed in CCC no longer relevant (because CCC is not running)? I also use VSR in most of my games, which is an option you have to enable in CCC; wouldn't that stop working if CCC is not actively running?
> Yeah, I got the +50% = 150% thing, thanks.
> 
> 
> 
> 
> 
> 
> 
> Just never heard anyone say "set it to 150%" before is all.
> 
> I wasn't aware it was possible to disable overdrive in CCC. Will have to try that. Thanks.


I can activate Raptr when I need to. I use it to take screenshots: 1.5MB in 4K vs 24MB with Print Screen. I just close it when I'm done so it won't conflict with the game, in case it does.

Edit: I am not sure if you will lose your settings when you close CCC. Like I said, I can open it like any other app, such as Word. CCC slows the boot process when included in startup, which I hardly do anyway, since I just put my rigs to either sleep or hibernate.


----------



## MEC-777

Quote:


> Originally Posted by *rdr09*
> 
> I can activate Raptr when I need to. I use it to take screenshots: 1.5MB in 4K vs 24MB with Print Screen. I just close it when I'm done so it won't conflict with the game, in case it does.
> 
> Edit: I am not sure if you will lose your settings when you close CCC. Like I said, I can open it like any other app, such as Word. CCC slows the boot process when included in startup, which I hardly do anyway, since I just put my rigs to either sleep or hibernate.


You don't need to use Print Screen or Raptr to take screenshots. You can use one of the F keys in Steam (I use F4), and if it's not a Steam game, you can use Fraps (same thing, assign an F key).









Well as it turns out, I still have to have overdrive enabled in CCC and I still have to set the power limit to +50 in overdrive to get one of my 290's to not throttle. The other card does not throttle. Will explain more in a separate post. I had lots of "what the f**k" moments and spent several hours trying to get crossfire working properly last night. Was super frustrating...


----------



## MEC-777

Alright, so last night I setup crossfire with two 290's on my system and I had a heck of a time getting it to work properly. There's a lot to say, but I'll try to just condense it down to the most important issues, primarily related to the clocks, clock syncing and CCC overdrive power limit issues/confusion.

When I first installed the 2nd card, during driver uninstall it came up with a warning that it wasn't uninstalled properly. Before I could click on view logs to see what happened, or try uninstalling again, the PC restarted on its own. Upon restart, when Windows loaded, I was greeted by a screen of colorful pixel vomit. So I did a hard shut-down (had no choice), yanked both cards out and rebooted on the Intel graphics. Got into Windows no problem, and finished properly uninstalling the AMD drivers. Popped the cards back in, restarted and installed the latest stable drivers (15.7.1).

Now, here's where things got weird. Both cards were running at 470/625 core/memory by default. Upon running a 3D app like Unigine Valley, the clocks stayed at 470/625. I fired up MSI AB and in settings I checked the box to sync the two cards. I had to manually set the clocks to 947/1250 (where they should have been) and hit apply. Boom, the clocks went to 300/1250 on one card and 300/150 on the other. That's more like it. Overdrive in CCC was NOT enabled at this point.

Then I tried running Valley again, and for some reason only one card (the reference card) would fully ramp up to 947; the other (the HIS) was hovering around 800-900 but wouldn't ever fully reach 947. This, of course, caused less than ideal performance. I then opened Overdrive and, for the HIS card only, set the power limit to +50, and boom, it now also ramps up and stays at 947 alongside the reference card. But why should I have to do this? The cards are synced in MSI AB.

Then, I was setting up a custom fan curve to control the hairdryer, and after clicking OK, then running Valley again, now only one card ramped up to 947 and the other sat at 300! What the heck is with that!? Through trial and error (because I didn't know what was going on) I disabled overdrive, then re-enabled it and set the power limit back to +50 again and boom, both cards ramp up to 947 again.

Crossfire is enabled in CCC as well as the option to "run crossfire even in programs that have no crossfire profile" is also enabled.

And now, just this morning (I left the machine on overnight to download some updates) after waking the PC up, after a few minutes, the displays shut off and on several times as windows made the "device disconnected/reconnected" sound numerous times in a row. When the displays finally came back on after a few seconds of this, MSI AB showed no readings except voltage (for either cards) and the two GPUz panels I had open (one for each card) showed nothing, no readings on the sensors. Tried restarting MSI AB, still nothing. Restarted the PC and everything came back as before - full sensor readings and all.

So now... what the heck is going on??? Is this all normal issues with running crossfire? Have I done something wrong?

I would GREATLY appreciate any assistance with this.

Tried running The Witcher 3 and with Vsync, it's horrible. Vsync off is much better (running up in the 70fps range at 1440p, high settings - hairworks off), but it's still not "smooth" like it was with one 290 at the same settings in the 50fps range. Haven't had a chance to try any other games yet.

Please help! lol

Thanks.


----------



## diggiddi

Have you tried testing the second card alone?


----------



## kizwan

Quote:


> Originally Posted by *MEC-777*
> 
> Alright, so last night I setup crossfire with two 290's on my system and I had a heck of a time getting it to work properly. There's a lot to say, but I'll try to just condense it down to the most important issues, primarily related to the clocks, clock syncing and CCC overdrive power limit issues/confusion.
> 
> When I first installed the 2nd card, during driver uninstall it came up with a warning that it wasn't uninstalled properly. Before I could click on view logs to see what happened, or try uninstalling again, the PC restarted on its own. Upon restart, when Windows loaded, I was greeted by a screen of colorful pixel vomit. So I did a hard shut-down (had no choice), yanked both cards out and rebooted on the Intel graphics. Got into Windows no problem and finished properly uninstalling the AMD drivers. Popped the cards back in, restarted and installed the latest stable drivers (15.7.1).
> 
> Now, here's where things got weird. Both cards were running at 470/625 core/memory by default. Upon running a 3D app like Unigine Valley, the clocks stayed at 470/625. I fired up MSI AB and in settings I checked the box to sync the two cards. I had to manually set the clocks to 947/1250 (where they should have been) and hit apply. Boom, the clocks went to 300/1250 on one card and 300/150 on the other. That's more like it. Overdrive in CCC was NOT enabled at this point.
> 
> Then I tried running Valley again and for some reason, only one card (the reference card) would fully ramp up to 947, the other (the HIS) was hovering around 800-900 but wouldn't ever fully reach 947. This, of course, caused not ideal performance. I then opened overdrive and for the HIS card only, I set the power limit to +50 and boom, it now also ramps up and stays at 947 with the reference card. But why should I have to do this? The cards are synced in MSI AB.
> 
> Then, I was setting up a custom fan curve to control the hairdryer, and after clicking OK, then running Valley again, now only one card ramped up to 947 and the other sat at 300! What the heck is with that!? Through trial and error (because I didn't know what was going on) I disabled overdrive, then re-enabled it and set the power limit back to +50 again and boom, both cards ramp up to 947 again.
> 
> Crossfire is enabled in CCC as well as the option to "run crossfire even in programs that have no crossfire profile" is also enabled.
> 
> And now, just this morning (I left the machine on overnight to download some updates) after waking the PC up, after a few minutes, the displays shut off and on several times as windows made the "device disconnected/reconnected" sound numerous times in a row. When the displays finally came back on after a few seconds of this, MSI AB showed no readings except voltage (for either cards) and the two GPUz panels I had open (one for each card) showed nothing, no readings on the sensors. Tried restarting MSI AB, still nothing. Restarted the PC and everything came back as before - full sensor readings and all.
> 
> So now... what the heck is going on??? Is this all normal issues with running crossfire? Have I done something wrong?
> 
> I would GREATLY appreciate any assistance with this.
> 
> Tried running The Witcher 3 and with Vsync, it's horrible. Vsync off is much better (running up in the 70fps range at 1440p, high settings - hairworks off), but it's still not "smooth" like it was with one 290, same settings in the 50fps range. Haven't yet had a chance to try any other games yet.
> 
> Please help! lol
> 
> 
> 
> 
> 
> 
> 
> Thanks.


I have no problem like that with my crossfire setup. No stuck clocks or pixel vomit. I always have CCC installed & loaded, with OverDrive always disabled (the checkbox not ticked).

Even with ULPS enabled, the cards won't behave like you described above. With ULPS enabled, the cards still run at 3D clocks, but the 2nd card will have trouble maintaining overclocked clocks since the voltage doesn't get raised properly.

Once I set the Power Limit in MSI AB to whatever value, for example +50%, OverDrive automatically shows +50% too.


Software can go haywire sometimes. You could try uninstalling both & re-installing one by one, starting with the drivers. If you've already done this, please ignore.


----------



## MEC-777

Quote:


> Originally Posted by *diggiddi*
> 
> Have you tried testing the second card alone?


Yep. It works perfectly. Both cards run exactly as they should on their own.


----------



## MEC-777

Quote:


> Originally Posted by *kizwan*
> 
> I have no problem like that with my crossfire. No stuck clocks or pixel vomit. I always have CCC installed & loaded. OverDrive always disabled (the checkbox not tick).
> 
> Even with ULPS enabled, the cards won't behave like you described above. If ULPS enabled, the cards still run at 3d clocks but the 2nd card will have problem maintaining overclock clocks since the voltage doesn't overvoltage properly.


The pixel vomit was only because the drivers didn't uninstall properly and the computer restarted on its own before I could fix that. Thus, when Windows loaded with corrupted drivers, it gave a very messy image.









I will try it again tonight, with ULPS enabled and Overdrive disabled, and see what it does.

With both cards properly synced (one with power limit at +50 in overdrive, both set to +50 in AB) I was able to OC both cards to 1050/1350 and they ran great. Scored 13,999 in firestrike: http://www.3dmark.com/fs/5956797 They ran hot and loud, lol, but ran fine otherwise.


----------



## mfknjadagr8

Quote:


> Originally Posted by *MEC-777*
> 
> The pixel vomit was only because the drivers didn't uninstall properly and the computer restarted on it's own before I could fix that. Thus, when windows loaded with corrupted drivers - it gave a very messy image.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will try it again tonight, with ULPS enabled, overdrive disabled and see what it does.
> 
> With both cards properly synced (one with power limit at +50 in overdrive, both set to +50 in AB) I was able to OC both cards to 1050/1350 and they ran great. Scored 13,999 in firestrike: http://www.3dmark.com/fs/5956797 They ran hot and loud, lol, but ran fine otherwise.


always install drivers with only one card powered... then restart... then shut down, reconnect power to the second card, boot up, let it detect the second card, then reboot, and after that make sure crossfire is enabled... my 290s do weird things when I uninstall/install drivers with both connected... I've had clocks showing as clocks I've never run or that aren't even possible on the cards, and had clocks not change with any program regardless of settings, until I started doing this... also, DDU in safe mode and CCleaner are your best friends with AMD cards


----------



## MEC-777

Quote:


> Originally Posted by *mfknjadagr8*
> 
> always install drivers with only one card powered...then restart...then shutdown then reconnect power to the second card then boot up let it detect the second card then reboot and after that make sure crossfire is enabled...my 290s do wierd things when I uninstall/install drivers with both connected...I've had clocks showing as clocks ive never ran or isnt possible on the cards and had clocks not change with any program regardless of settings until I started doing this...also ddu in safe mode and cc cleaner is your best friend with amd cards


Next time I install new drivers, I'll try the method you suggested here. Thanks for the tip.









I don't know what the cause was of the initial problems I had, but everything seems to be working as it should now. I uninstalled, then re-installed MSI AB (wiping all previous settings along with it). I then tried a few different settings (unofficial overclocking disabled, ULPS disabled, and the PowerPlay option disabled as well). Also set the power limit to +50% in Overdrive. Both cards now ramp up together and remain locked at 947MHz as they should.

I also created a new custom fan curve in AB which now keeps temps reasonable (in the 70's in GTA 5) and noise is "tolerable".









Anyways, thanks everyone for all the help and suggestions. There are so many factors and options and settings that can affect how it works - it's just a matter of knowing which settings to use.









Now it's time to think about reinstalling the kraken on the HIS card and installing an HG10 on the reference card...


----------



## supermiguel

are 2 290x in crossfire good enough for 4k gaming?


----------



## MEC-777

Quote:


> Originally Posted by *supermiguel*
> 
> are 2 290x in crossfire good enough for 4k gaming?


For most games, yes. But you won't be running at 60fps with maxed-out settings. 60fps with medium/high settings, yes.


----------



## By-Tor

Quote:


> Originally Posted by *supermiguel*
> 
> are 2 290x in crossfire good enough for 4k gaming?


If you're running a pair of 8GB 290X's like the card in the link below, it may play pretty well, but 4GB cards are going to struggle a bit more at 4K.

http://www.newegg.com/Product/Product.aspx?Item=N82E16814150723


----------



## diggiddi

Quote:


> Originally Posted by *MEC-777*
> 
> The pixel vomit was only because the drivers didn't uninstall properly and the computer restarted on it's own before I could fix that. Thus, when windows loaded with corrupted drivers - it gave a very messy image.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will try it again tonight, with ULPS enabled, overdrive disabled and see what it does.
> 
> With both cards properly synced (one with power limit at +50 in overdrive, both set to +50 in AB) I was able to OC both cards to 1050/1350 and they ran great. Scored 13,999 in firestrike: http://www.3dmark.com/fs/5956797 They ran hot and loud, lol, but ran fine otherwise.


That is a low score for Firestrike; it must definitely be using only one card, cos I got 14197 on one card (290X)


----------



## fyzzz

Quote:


> Originally Posted by *diggiddi*
> 
> That is a low score for firestrike, it must definitely be using only one card cos I got 14197 on one card(290x)


He was talking about overall score I think


----------



## MEC-777

Quote:


> Originally Posted by *diggiddi*
> 
> That is a low score for firestrike, it must definitely be using only one card cos I got 14197 on one card(290x)


Quote:


> Originally Posted by *fyzzz*
> 
> He was talking about overall score I think


Yep, that was the overall score. Graphics score was 21,878. My best single card (290) graphics score was 13,061.


----------



## DDSZ




----------



## battleaxe

Quote:


> Originally Posted by *DDSZ*


Haha... I haven't seen too many of those CPU coolers. I have one too. I can't even remember what it's called though. Scythe something or other....


----------



## arrow0309

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *DDSZ*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Haha... I haven't seen too many of those CPU coolers. I have one too. I can't even remember what its called though. Skythe something or other....
Click to expand...

It's the Ninja 3


----------



## diggiddi

I connected a 40" TV and even though the immersion was better, it felt like it slowed down my system. Can anyone explain why? Or is that the dreaded lag I was experiencing?


----------



## kizwan

Quote:


> Originally Posted by *diggiddi*
> 
> I connected a 40" tv and even though the immersion was better, it felt like it slowed down my system can anyone explain why? or is that the dreaded lag I was experiencing


Good question. What you experienced was probably input lag. I've never experienced it myself though.


----------



## diggiddi

Quote:


> Originally Posted by *kizwan*
> 
> Good question. What happened is probably input lag. I've never experienced it myself though.


It felt like I needed to overclock the entire system, really sluggish movements but the immersion was off tha hook, I'm still trying to get used to my regular setup


----------



## sinnedone

Quote:


> Originally Posted by *diggiddi*
> 
> I connected a 40" tv and even though the immersion was better, it felt like it slowed down my system can anyone explain why? or is that the dreaded lag I was experiencing


What resolution? Size of the display doesn't matter; a 40" 1080p TV is the same render load as a 19" 1080p monitor.
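As a quick sanity check on that point, the load is set by the pixel count per frame, which is identical for the two displays:

```python
# GPU render load scales with resolution (pixel count), not panel size:
# a 40" 1080p TV and a 19" 1080p monitor push the same pixels per frame.
def pixels(width: int, height: int) -> int:
    return width * height

# Both displays are 1920x1080, so the GPU renders the same 2,073,600 pixels.
assert pixels(1920, 1080) == 2_073_600
```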


----------



## battleaxe

Quote:


> Originally Posted by *diggiddi*
> 
> It felt like I needed to overclock the entire system, really sluggish movements but the immersion was off tha hook, I'm still trying to get used to my regular setup


input/output lag brah....


----------



## MEC-777

Here's my new 290 crossfire setup. Added a Corsair HG10 + H55 to the reference card (couldn't stand the high temps and noise).

Was fun squeezing three 120mm AIO's in the S340.


----------



## diggiddi

Quote:


> Originally Posted by *sinnedone*
> 
> What resolution? SIze of the display doesn't matter, a 1080p 41" tv is the same as a 19" 1080p monitor.


I was running at 1080

Quote:


> Originally Posted by *battleaxe*
> 
> input/output lag brah....


I see







Yeah, it's a cheap 40" Haier TV, but now I must go big or go home; the next monitor will be 40"+. Most likely a 40"+ curved 4K TV.


----------



## fyzzz

My GPU driver keeps disappearing and it is highly annoying. The screen sometimes goes black and flickers, and after that's done, the driver is gone.


----------



## MEC-777

Quote:


> Originally Posted by *fyzzz*
> 
> My gpu driver keeps on disappearing. it is highly annoying. The screen goes black sometimes and flickers and after thats done, the driver is gone.


Can you explain a little more? What do you mean by "it's gone"?


----------



## fyzzz

Quote:


> Originally Posted by *MEC-777*
> 
> Can you explain a little more? What do you mean by "it's gone"?


When the screen goes black and flickers, I check with GPU-Z and the GPU clocks are at 0, and the graphics card is using the basic driver it uses when no driver is installed. I'm running Windows 10, so maybe some update is conflicting or something. I guess I could live with it, because it happens rarely. But it can be annoying.


----------



## sinnedone

The only time I get that in GPU-Z is because of BIOS flashing.

You could try flashing the stock BIOS and see if it continues?


----------



## LandonAaron

Quote:


> Originally Posted by *BradleyW*
> 
> Anyone tried Mad Max CFX yet?


Replying to a super old post, but whatever. Yeah, Mad Max works like a dream with crossfire. Running 290X + 290 @ 1440p, all settings maxed out, I get a constant 96FPS with Vsync. About 60% unified GPU utilization.


----------



## MEC-777

Quote:


> Originally Posted by *fyzzz*
> 
> When the screen goes black and flickers, I then check with gpu-z and the gpu clocks are at 0 and the graphics card is using the basic driver that it uses when no driver is installed. I'm running windows 10, so maybe some update is conflicting or something. I guess i could live with it, because it happens rarely. But it can be annoying.


Oh ok, I see. The driver isn't "gone" - it has crashed, is what happened. I've had that happen a few times and had to restart, but only on very rare occasions - like when I was trying to get crossfire working properly recently.

I've also found that running with GPU-Z open actually causes system and driver crashes on my system. Not sure why, but with GPU-Z closed - not a single issue and everything runs smooth. With GPU-Z open (so I can monitor the sensors on the 2nd monitor) while gaming, the GPU usage is very erratic and games run worse than usual.


----------



## MEC-777

Quote:


> Originally Posted by *LandonAaron*
> 
> Replying to super old post, but whatever. Yeah Mad Max works like a dream with crossfire. Running 290x+290 @ 1440p all settings maxed out I get constant 96FPS with Vsync. About 60% unified GPU utilizaiton.


Glad to hear this. Been thinking about grabbing that game and wondered how it runs with CFX 290's.


----------



## Blue Dragon

Quote:


> Originally Posted by *MEC-777*
> 
> Oh ok, I see. The driver isn't "gone" but it has crashed, is what happened. I've had that happen a few times and had to restart, but it's on very rare occasions - like when I was trying to get crossfire working properly recently.
> 
> I've also found that running with GPUz open actually causes problems with system and driver crashes on my system. Not sure why, but with GPUz closed - not a single issue and everything runs smooth. With GPUz open (so I can monitor the sensors on 2nd monitor) while gaming, the GPU usage is very erratic and games run worse than usual.


I've had problems with GPUZ and my 290s as well... I run HWinfo64 with no problems though, you could try that one.


----------



## fyzzz

1700 in heaven at 1080p







The card will surely do more than 1225/1625, but mostly I just wanted to break into the 1700 mark. Running a very heavily modded 390 BIOS with tighter timings and such.


----------



## Gumbi

Quote:


> Originally Posted by *fyzzz*
> 
> 1700 in heaven at 1080p
> 
> 
> 
> 
> 
> 
> 
> Card will surely do more than 1225/1625, but mostly i did want to break into the 1700 mark. Running a very heavily modded 390 bios with tighter timings and such.


Very nice! I was running slightly gimped cooling on my 290 Vapor X and was proud of breaking 1600 at 1206/1649.

How hot did the cores and VRMs get? Card/cooling?


----------



## fyzzz

Quote:


> Originally Posted by *Gumbi*
> 
> Very nice! I was running slightly gimped cooling on my 290 Vapor X and was proud of breaking 1600 at 1206/1649.
> 
> How hot did the cores and VRMs get? Card/cooling?


The card is under water; the core went to 43°C and VRM1 hit 50°C. I'm running the 390 BIOS, so the core runs a bit cooler than on the normal 290 BIOS. I have a fan on the backside that helps cool the VRM; I could increase its RPM to shave a couple of degrees off the VRM, but I don't really need to since the VRM is so cool anyway. When I put the timings from the 1126-1250 strap into the 1501-1625 strap, I got a huge boost, plus the 390 BIOS itself helps a bit. My card's memory seems to handle pretty high clocks with tight timings.


----------



## Gumbi

Quote:


> Originally Posted by *fyzzz*
> 
> The card is under water, the core went to 43 and vrm 1 hit 50c. I'm running the 390 bios so the core is running a bit cooler than on normal 290 bios. I have fan on the backside that helps cool vrm, i could increase the rpm on to get a couple of degrees of vrm, but i don't really need to since the vrm is so cool anyways. When i put in the timings for the 1126-1250 strap into the 1501-1625 strap i got a huge boost + the 390 bios itself helps a bit. My cards memory seems to handle pretty high memory clocks with tight timings.


Very nice. A cool core and VRMs help a lot with stability. What card is it though? Make/model, I mean. And also, what kind of voltage are you putting through it for those clocks?


----------



## fyzzz

Quote:


> Originally Posted by *Gumbi*
> 
> Very nice. Cool core and vrms helps a lot with stability. What card is it though? Make/model I mean. And also, what kind of voltage you putting through it for those clocks?


Yeah, I noticed that too when I was on air; as soon as the VRM hit 80°C or something, the stability went out the window. The card is an XFX DD, with Hynix BFR memory, weirdly enough. I have the voltage increased in the BIOS and ran that with +50mV, so it's hard to say how much it would take on the normal BIOS, but I would estimate around +125mV. I had a few artifacts, but not many; I was careful with the voltage since it seems to cause black screens if pushed too high. The 390 BIOS is weird and seems to black-screen with high voltages, but it seems like I can push more voltage when running Heaven than running Valley.


----------



## battleaxe

Quote:


> Originally Posted by *fyzzz*
> 
> 1700 in heaven at 1080p
> 
> 
> 
> 
> 
> 
> 
> Card will surely do more than 1225/1625, but mostly i did want to break into the 1700 mark. Running a very heavily modded 390 bios with tighter timings and such.


This a 390 or 390X?

Incredibadness score, by the way. I've wanted to break 1700 too, but I don't have full-cover blocks so I haven't been able to yet. I'm so close too... my best so far has been 1676... but I want 1700.

Awesome card you have there.


----------



## fyzzz

Quote:


> Originally Posted by *battleaxe*
> 
> This a 390 or 390x?
> 
> Incredibadness score by the way. I've wanted to break 17k too, but I don't have full cover blocks so have not been able to yet. I'm so close too... my best so far has been 1676... but I want 17k
> 
> Awesome card you have there.


It is a 290 with a heavily modded 390 BIOS that I made myself; I was kinda surprised myself when it went over 1700.


----------



## Gumbi

I will beat you all with my air cooled 290 Muwhaha!







Can break 1600 with just 2 of my 3 Vapor-X fans functional; that was 1206/1649 at +125mV (could prob have done it at a bit less voltage). RMAing right now, fingers crossed for a replacement 290 or 290X Vapor-X.... I would honestly prefer it over a 390 Nitro...


----------



## MEC-777

Quote:


> Originally Posted by *Blue Dragon*
> 
> I've had problems with GPUZ and my 290s as well... I run HWinfo64 with no problems though, you could try that one.


I keep MSI AB open on my second screen for monitoring. Was only running GPUz to watch the clocks in real time on both cards at the same time. Now I only use GPUz to check in the moment, then close it again before I do anything else. Otherwise, MSI AB is enough. Can see the temps and loads of both cards at all times, which is all I really need.









Before this, I had no idea GPUz could cause so many problems. Kind of surprising, really. When I think back now, GPUz being open is probably what caused most of the performance issues I was experiencing, not the crossfire setup itself.


----------



## battleaxe

Quote:


> Originally Posted by *fyzzz*
> 
> It is a 290 with a heavily modded 390 bios that i have made myself, was kinda surprised my self when it went over 1700.


That's crazy. Not many 290X's can even do that. I haven't seen any 390 or 390X get that score. Maybe it's time to mod my BIOS.


----------



## Archea47

fyzzz are you going to share this bios back to the 390X bios thread?


----------



## battleaxe

Quote:


> Originally Posted by *Archea47*
> 
> fyzzz are you going to share this bios back to the 390X bios thread?


Quote:


> Originally Posted by *fyzzz*
> 
> It is a 290 with a heavily modded 390 bios that i have made myself, was kinda surprised my self when it went over 1700.


Please share it with us too!!!!


----------



## fyzzz

Quote:


> Originally Posted by *battleaxe*
> 
> Please share it with us too!!!!


Maybe... it may not work with all cards though, because I run such tight timings. I can fix that easily, however. I'm having so much fun right now, getting more and more performance out of Hawaii - I love BIOS modding.


----------



## battleaxe

Quote:


> Originally Posted by *fyzzz*
> 
> Maybe..it may not work with all cards tho, because i run so tight timings. I can fix that easily however. I'm having so much fun right now, just getting more and more performance out of hawaii, i love bios modding.


What tool did you use to adjust the timings?


----------



## fyzzz

Quote:


> Originally Posted by *battleaxe*
> 
> What tool did you use to adjust the timings?


There are different memory straps, so I took the timings from the 1126-1250 strap and put them in the 1501-1625 strap; therefore I was running 1625 with the 1250 timings. All of this can be changed in a hex editor. The timings change gave me a good boost.
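For anyone curious what that hex edit boils down to, here's a minimal Python sketch of the idea - copying one strap's timing bytes over another's in a BIOS dump. The offsets and strap length below are invented for illustration; real strap locations have to be found with a BIOS editor first, and flashing a badly edited image can brick the card.

```python
# Hypothetical sketch: copy the timing bytes of one memory strap over
# another inside a BIOS image. All offsets/lengths here are made up --
# locate the real straps with a BIOS editor before touching anything.

STRAP_LEN = 0x30     # assumed size of one timing set (hypothetical)
SRC_OFFSET = 0x1A00  # assumed start of the 1126-1250 MHz strap (hypothetical)
DST_OFFSET = 0x1A90  # assumed start of the 1501-1625 MHz strap (hypothetical)

def copy_strap(bios: bytes, src: int, dst: int, length: int) -> bytes:
    """Return a new image with `length` bytes at `src` copied over `dst`."""
    img = bytearray(bios)
    img[dst:dst + length] = img[src:src + length]
    return bytes(img)

# Typical use: read the dump, patch it, then write out a new file to flash.
# patched = copy_strap(open("hawaii.rom", "rb").read(),
#                      SRC_OFFSET, DST_OFFSET, STRAP_LEN)
```

Note this doesn't touch checksums; tools that rebuild the BIOS checksum afterwards are still needed before flashing.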


----------



## sinnedone

Quote:


> Originally Posted by *fyzzz*
> 
> There are different straps so I took the timings of the 1126-1250 strap and put it in the 1501-1625 strap and therefore i was running 1625 with 1250 timings and all of this can be changed in a hex editor. But the timings change gave me a good boost.


It sounds like you are putting a lot of legwork into this.

I really haven't had the time to test this further but hopefully soon.


----------



## battleaxe

Quote:


> Originally Posted by *fyzzz*
> 
> There are different straps so I took the timings of the 1126-1250 strap and put it in the 1501-1625 strap and therefore i was running 1625 with 1250 timings and all of this can be changed in a hex editor. But the timings change gave me a good boost.


I have heard of this being done. I've flashed a few GTX 660's and GTX 670's before, but just never felt the need to mess with the 290's. Now, though, this just seems kinda fun. Pretty cool you got a 290 to score that high. That's so insane.


----------



## adog12341

Got around to overclocking my QX2710 again and it seems that my GPU might have died







I overclocked it last night before I went to bed, and today any 3D application causes my computer to freeze and the screen to fade to black. I unpatched the drivers, uninstalled, and then reinstalled the latest beta driver, and it's still freezing. Really hoping I didn't kill her


----------



## battleaxe

Quote:


> Originally Posted by *adog12341*
> 
> Got around to overclocking my QX2710 again and it seems that my GPU might have died
> 
> 
> 
> 
> 
> 
> 
> I overclocked it last night before I went to bed and today any 3d application will cause my computer to freeze and the screen will fade to black. I unpatched the drivers, uninstalled, and then reinstalled the latest beta driver and still freezing. Really hoping I didn't kill her


Turn off the PC, unplug it, press the power button to drain any power, then plug back in, and restart... see if problem persists. I've experienced this on a GPU or two and the system just needed to be reset was all. Hope that's it for you.


----------



## adog12341

God I really hope it's that easy of a fix! Don't want to throw down for another 290...

EDIT: Tried that last night. No dice from that. Left her unplugged all night, along with the monitor, and it made it through a few passes of Heaven, whereas before it would freeze before the first loop. So thank you!!!







Hopefully I am not speaking too soon, but I have to go to school so I can't test out on a game.


----------



## MEC-777

So, is this a good Firestrike score for crossfire 290's clocked at 1100/1350? http://www.3dmark.com/fs/6012827

The reason I'm asking is because when I look at others with the same or similar config, I see people scoring upwards of 17,000+ with lower clocks than I'm running. How can that be?


----------



## By-Tor

Quote:


> Originally Posted by *MEC-777*
> 
> So, is this a good Firestrike score for crossfire 290's clocked at 1100/1350? http://www.3dmark.com/fs/6012827
> 
> Reason I'm asking is because if I look at others with the same or similar config, I see people scoring upwards of 17,000+ with lower clocks than what I'm running. How can that be?


The score for the 290's looks fine, but the processor's Physics score looks low to me.


----------



## kizwan

Quote:


> Originally Posted by *MEC-777*
> 
> So, is this a good Firestrike score for crossfire 290's clocked at 1100/1350? http://www.3dmark.com/fs/6012827
> 
> Reason I'm asking is because if I look at others with the same or similar config, I see people scoring upwards of 17,000+ with lower clocks than what I'm running. How can that be?


Same CPU, i5-4570?

This is mine at 1100/1300, but mine is an i7 anyway. Graphics score is close enough, within margin of error. I believe the CPU does affect FPS (+/- 2 to 3 FPS).
http://www.3dmark.com/fs/5608368

Comparison: http://www.3dmark.com/compare/fs/6012827/fs/5608368


----------



## rdr09

Quote:


> Originally Posted by *MEC-777*
> 
> So, is this a good Firestrike score for crossfire 290's clocked at 1100/1350? http://www.3dmark.com/fs/6012827
> 
> Reason I'm asking is because if I look at others with the same or similar config, I see people scoring upwards of 17,000+ with lower clocks than what I'm running. How can that be?


Compare graphics scores, not overall. Your scores are fine.


----------



## MEC-777

Quote:


> Originally Posted by *kizwan*
> 
> Same CPU, i5-4570?
> 
> This is mine, 1100/1300 but mine is i7 anyway. Graphics score close enough, within margin of error. I believe CPU does affect FPS (+/- 2 to 3 FPS).
> http://www.3dmark.com/fs/5608368
> 
> Comparison: http://www.3dmark.com/compare/fs/6012827/fs/5608368


Quote:


> Originally Posted by *rdr09*
> 
> compare graphics scores not overall. your scores are fine.


Ok, so it's on par, just my CPU that's not as powerful as those people scoring higher. I see. Thanks for clearing that up.


----------



## rdr09

Quote:


> Originally Posted by *MEC-777*
> 
> Ok, so it's on par, just my CPU that's not as powerful as those people scoring higher. I see. Thanks for clearing that up.


Like By-Tor said, the physics score is kinda low. Pretty much any CPU left at stock will seem less powerful; you're gonna need to OC with two 290s. Even OC'd, the i5 will get maxed out with two of those in some games.

Also, disable the iGPU when you're not using it. It can conflict with the GPUs.


----------



## Arizonian

Quote:


> Originally Posted by *MEC-777*
> 
> Here's my new 290 crossfire setup. Added a Corsair HG10 + H55 to the reference card (couldn't stand the high temps and noise).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Was fun squeezing three 120mm AIO's in the S340.


Nice job - updated


----------



## Jorginto

Guys, I'm in the dark here. I updated to Win 10 recently and my crossfire setup is just giving me gray hair. I cannot disable ULPS - tried through regedit and MSI AB. One card works like a charm, but the second one... omg, no monitoring, no overclock, not scaling with the primary GPU. Not to mention startup freezes and black screens. I've been removing and installing the latest drivers like 5 times now, with one card and the second one after that, and both at the same time. Makes no difference.

Just on the verge of selling them and getting 980ti.


----------



## Unknownm

Quote:


> Originally Posted by *Jorginto*
> 
> Guys, I'm in the dark here. I updated to win 10 recently and my crossfire setup is just giving me gray hair. I can not disable ulps. Tried though regedit and MSI AB. One card works like a charm, but the second one... omg, no monitoring, no overclock, is not scaling with primary GPU. Not mentioning statup freezes and black screen. Been removing and instaling the latest drivers like 5 times now with one card and second one atfter that. Both at the same time. Makes no difference.
> 
> Just on the verge of selling them and getting 980ti.


That was my issue when upgrading to windows 10.

Download Sapphire TriXX and tick "Disable ULPS". Works like a charm!
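For reference, the regedit route usually amounts to zeroing `EnableUlps` under the display-adapter class key. A hypothetical sketch - the numbered subkeys (0000, 0001, ...) vary per system, so check every one under that GUID for an `EnableUlps` value, and back up the registry first:

```reg
Windows Registry Editor Version 5.00

; Hypothetical example -- the 0000/0001 subkey numbers differ per machine;
; set EnableUlps to 0 in each numbered subkey that has it.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0001]
"EnableUlps"=dword:00000000
```

A reboot is needed afterwards, and driver reinstalls can recreate the value, which may be why the regedit attempts didn't stick.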


----------



## Jorginto

Quote:


> Originally Posted by *Unknownm*
> 
> That was my issue when upgrading to windows 10.
> 
> Download Sapphire TRIXX and enable Disable ULPS. Works like a charm!


Did you leave MSI AB, or are you using just trixx?


----------



## Unknownm

Quote:


> Originally Posted by *Jorginto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Unknownm*
> 
> That was my issue when upgrading to windows 10.
> 
> Download Sapphire TRIXX and enable Disable ULPS. Works like a charm!
> 
> 
> 
> Did you leave MSI AB, or are you using just trixx?

Trixx didn't have an Aux Voltage option, so I just use it for disabling ULPS.


----------



## MEC-777

Quote:


> Originally Posted by *rdr09*
> 
> like By said physics is kinda low. pretty much any cpu when left at stock will seem less powerful. you gonna need to oc with 2 290s. even oc'ed the i5 will get maxed with 2 of those in some games.
> 
> also, disable igpu when you are not using it. it can conflict with the gpus.


I have the CPU overclocked just a little, with the base clock at 103, since it's a non-K CPU. Good for a ~100MHz bump anyway.









I have no choice but to use the iGPU for my second monitor. Why? Because having both monitors hooked up to the 290's gives a popping sound in the audio when running Vsync in some games. It's a known crossfire issue that's been around for a long time and AMD has yet to fix it.

BTW, here's my latest score with an OC of 1100/1350 with the new 15.8 beta drivers: http://www.3dmark.com/fs/6020432 I'm sure I can crack 15k but I don't want to push these cards any further. Running them at stock clocks gives me more than enough performance.


----------



## rdr09

Quote:


> Originally Posted by *MEC-777*
> 
> I have the CPU overlocked just a little with the baseclock at 103 since it's a non-K CPU. Good for a 100mhz bump anyways.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have no choice but to use the iGPU for my second monitor. Why? Because having both monitors hooked up to the 290's gives a popping sound in the audio when running Vsync in some games. It's a known crossfire issue that's been around for a long time and AMD has yet to fix it.
> 
> BTW, here's my latest score with an OC of 1100/1350 with the new 15.8 beta drivers: http://www.3dmark.com/fs/6020432 I'm sure I can crack 15k but I don't want to push these cards any further. Running them at stock clocks gives me more than enough performance.


keep your igpu if it is not giving you issues, then. i heard of that issue. you've got excellent 290s. i play with my 290s at stock myself.


----------



## kizwan

Quote:


> Originally Posted by *Unknownm*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Jorginto*
> 
> Guys, I'm in the dark here. I updated to Win 10 recently and my crossfire setup is just giving me gray hair. I cannot disable ULPS - tried through regedit and MSI AB. One card works like a charm, but the second one... omg, no monitoring, no overclocking, and it's not scaling with the primary GPU. Not to mention startup freezes and black screens. I've been removing and installing the latest drivers like 5 times now - with one card, with the second one after that, and with both at the same time. Makes no difference.
> 
> Just on the verge of selling them and getting 980ti.
> 
> 
> 
> That was my issue when upgrading to windows 10.
> 
> Download Sapphire TRIXX and enable Disable ULPS. Works like a charm!

Yup, using TRIXX to disable ULPS in Win 10 here too.

@Jorginto TRIXX won't notify you, but after selecting 'Disable ULPS' in TRIXX, close TRIXX & reboot your computer. If 'Disable ULPS' is somehow already selected, untick & re-tick the checkbox, then close TRIXX & reboot.
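For anyone who'd rather go the regedit route instead of TRIXX, the usual fix is to zero out every `EnableUlps` value under the display-adapter class keys. A sketch of the idea as a `.reg` fragment - the `{4d36e968-...}` GUID is the standard display-adapter class, but the numbered subkey varies per machine, so treat this as illustrative rather than copy-paste:

```reg
Windows Registry Editor Version 5.00

; Illustrative .reg fragment: disable ULPS on a secondary GPU.
; The four-digit subkey (0000, 0001, ...) differs per system --
; search regedit for every "EnableUlps" value, set each one to 0,
; then reboot.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0001]
"EnableUlps"=dword:00000000
```

Back up the registry (or at least export the key) before editing, and reboot afterwards for the change to take effect.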


----------



## Archea47

Is the Cinebench R15 OpenGL benchmark considered at all taxing?

Here's my results and max temps with 2x 290X Tri-X with stock Tri-X BIOS and no tweaks/overclock:



Is this a normal/healthy/reasonable starting point? Fresh build of Windows 7 with the latest (15.7.1?) WHQL drivers


----------



## MEC-777

Quote:


> Originally Posted by *rdr09*
> 
> keep your igpu if it is not giving you issue, then. i heard of the that issue. you've got excellent 290s. i play with my 290s at stock myself.


Yeah, I actually had to disable the iGPU and go back to running both monitors off the 290's. The main reason being it was causing Project Cars to freeze on me. As long as I don't run Vsync I don't get the audio popping, so it's not really a big deal.









Running Project Cars at 1440p (VSR) on high with DS2X AA. It looks beautiful and runs fairly smooth. Frame rates are 70-100+ and rarely drops below 60. Both cards sit at about 60 and 54 degrees, respectively. Very happy with the performance - especially considering this game is an Nvidia title.


----------



## kizwan

Quote:


> Originally Posted by *Archea47*
> 
> Is the Cinebench R15 OpenGL benchmark considered at all taxing?
> 
> Here's my results and max temps with 2x 290X Tri-X with stock Tri-X BIOS and no tweaks/overclock:
> 
> 
> 
> Is this a normal/healthy/reasonable starting point? Fresh build of Windows 7 with the latest (15.7.1?) WHQL drivers


They are normal, healthy & reasonable. As for the CPU temp, the CPU sensor on the motherboard is the most accurate. You likely know this already, but since you highlighted both...


----------



## Archea47

Quote:


> Originally Posted by *kizwan*
> 
> They are normal, healthy & reasonable. As for the CPU temp, the CPU sensor on the motherboard is the most accurate. You likely know this already but since you highlight both, so...


Thanks kizwan!

Very much familiar with the sensors on the 990FX sabertooth and their behavior







The sensor on the motherboard is a different sensor - socket vs. package temp. Yeah, the package temp isn't accurate at low levels (like shown). Once it gets into the 50s+ the package temp is very useful, and distinct from the motherboard CPU (socket) temp, though the two are certainly related due to proximity. When I watercooled the VRMs, for instance, it had a greater impact on the socket temp than on the package temp, as does cooling the rear of the socket. When the temps are high, what I really care about is the package.

A co-worker ran the Cinebench R15 test today for me on his new single-socket Haswell Xeon + single GTX980 build, all stock. He got 104FPS







His CPU score was in the mid 800s, though, while mine at stock is in the mid-low 600s.


----------



## Liranan

Guys, please forgive me for not reading all 4000 pages, but I have a problem with my brand new 290. It arrived today, and when running games and Furmark I've noticed that the GPU doesn't stay at a constant 1000MHz. The speed fluctuates between 750 and 1000, sometimes sitting at 1000 and sometimes even dropping to 300. I've disabled ULPS in Afterburner but still have the same problem. Can someone please explain what's going on, and how I can ensure I get the maximum performance?

Thanks.


----------



## Gumbi

Quote:


> Originally Posted by *Liranan*
> 
> Guys, please forgive me for not reading all 4000 pages, but I have a problem with my brand new 290. It arrived today, and when running games and Furmark I've noticed that the GPU doesn't stay at a constant 1000MHz. The speed fluctuates between 750 and 1000, sometimes sitting at 1000 and sometimes even dropping to 300. I've disabled ULPS in Afterburner but still have the same problem. Can someone please explain what's going on, and how I can ensure I get the maximum performance?
> 
> Thanks.


Stop cooking your card in Furmark! It's downclocking to protect itself from your abuse!

Seriously, it is throttling due to power limits, overheating, or both. This is normal behaviour with Furmark.

Do a bench run of the Heaven benchmark with GPU-Z running in the background and post a screenshot of your clocks then - I guarantee they will sit at 1000MHz.


----------



## Liranan

You're making me download Heaven









50Mbit to the rescue









50Mbit turned into 4Mbit for some reason.

Here are the results, still not a consistent 1000.


----------



## kizwan

Quote:


> Originally Posted by *Liranan*
> 
> You're making me download Heaven
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 50Mbit to the rescue
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 50Mbit turned into 4Mbit for some reason.
> 
> Here are the results, still not a consistent 1000.
> 
> 
> Spoiler: Warning: Spoiler!


Little fluctuation on the core is normal. Try running in fullscreen 1080p or higher.


----------



## Gumbi

Quote:


> Originally Posted by *Liranan*
> 
> You're making me download Heaven
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 50Mbit to the rescue
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 50Mbit turned into 4Mbit for some reason.
> 
> Here are the results, still not a consistent 1000.


That's pretty much bang on 1000MHz the whole time... the only dips would be during scene changes or CPU-limited sections. Your GPU is fine and working exactly as it should - nothing to worry about at all.


----------



## Liranan

Thanks guys, now I know it's working well I can test that 100MHz overclock I was playing with earlier.


----------



## buttface420

just got a reference 290x... is it supposed to run at 95c? even on youtube it's at 90c. should i get an aio and an hg10 bracket? one time today i got a blackscreen crash.. any fixes for that?


----------



## fyzzz

Quote:


> Originally Posted by *buttface420*
> 
> just got a reference 290x...is this suppossed to run 95c? even on youtube its 90c. should i get a aio and a hg10 bracket? one time today i got a blackscreen crash..any fixes for that?


The reference card is designed to run at 95c, but it's strange that it's hitting 90c just on YouTube. The black screen is/was a common problem with these cards; sometimes increasing the voltage and/or lowering the frequency can help get rid of it. If I were you, I would get a better cooler.


----------



## mfknjadagr8

Quote:


> Originally Posted by *fyzzz*
> 
> The reference card is designed to run at 95c, but it sounds strange that it is running at 90c on youtube. The black screen is/was a common problem for these cards, sometimes increasing voltage and/or lower frequency can help get rid of it. If I were you I would get a better cooler.


honestly it sounds like the paste has given up the ghost... my reference 290s were hitting 90s... the paste was like dried clay... also you should try upping the fan speed via msi afterburner and see how that affects temperature... I'd start with 60% as that's about as loud as most people can stand... my black screens went away after a repaste and remount, BUT the fan speed had to be ramped up, otherwise temps would cause black screens even after the repaste with stock fan profiles
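For anyone experimenting with fan speeds like the above, Afterburner's custom fan curve is just a set of temp-to-fan% breakpoints with linear interpolation in between. A sketch of the math - the breakpoint values here are made-up examples for illustration, not tuning advice:

```python
# Illustrative linear fan curve, like the custom profiles people
# set in MSI Afterburner. The (temp_C, fan_%) breakpoints below
# are hypothetical examples, not recommended values.

def fan_percent(temp_c, curve=((40, 30), (70, 60), (90, 100))):
    """Linearly interpolate fan % between (temp, percent) breakpoints."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the curve: floor speed
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # above the curve: max speed
    for (t0, p0), (t1, p1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return p0 + (p1 - p0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(55))  # halfway between 30% and 60% -> 45.0
```

The takeaway from the posts above is just where you put the breakpoints: a steeper curve holds temps (and clocks) but costs noise, which is why 60% keeps coming up as the practical ceiling on the reference blower.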


----------



## Gumbi

Quote:


> Originally Posted by *mfknjadagr8*
> 
> honestly it sounds like the paste had given up the ghost....my reference 290s were hitting 90s....the paste was like dried clay...also you should try upping the fan speed via msi afterburner and see how that effects temperature...I'd start with 60 as that's about as loud as must people can stand with it...my black screens went away after repaste and remount BUT the fan speed had to be ramped up otherwise temps would cause black screen even after repaste with stock fan profiles










Def repaste


----------



## buttface420

ahhh okay i will re-apply paste thanks guys!


----------



## mfknjadagr8

Quote:


> Originally Posted by *buttface420*
> 
> ahhh okay i will re-apply paste thanks guys!


you need about 1/4 of what you would use on a cpu... you might find they way overdid it and it's extremely dried out


----------



## Liranan

Quote:


> Originally Posted by *buttface420*
> 
> just got a reference 290x...is this suppossed to run 95c? even on youtube its 90c. should i get a aio and a hg10 bracket? one time today i got a blackscreen crash..any fixes for that?


Sadly, AMD reference cards are terrible. My reference 4870 and even 6870 were awfully hot and loud, had terrible paste, and both died after a short while. Fortunately they were replaced by cards with aftermarket coolers from MSI (6870) and PowerColor (4870); that's why I made sure I got the card with the best cooler this time. nVidia cards, on the other hand, are entirely different - their stock coolers are quite good and quiet.


----------



## Roboyto

Quote:


> Originally Posted by *buttface420*
> 
> just got a reference 290x...is this suppossed to run 95c? even on youtube its 90c. should i get a aio and a hg10 bracket? one time today i got a blackscreen crash..any fixes for that?


Yup, like others said, it's normal for those temperatures.

AIO and bracket is a great idea if you want to experience what the card is capable of and not a hot, noisy, performance throttling turd. Harsh, but true, words coming from someone who currently has both gaming PCs powered by these cards; love 'em but they need proper cooling to achieve full potential.

Re-pasting won't likely get you very far, I've tried it and unless the paste is like concrete you may not see much of a performance gain. Try manually forcing the fan speed to about 60%, at that speed it should easily be able to sustain factory clocks/voltages without any throttling. Pushing the stock blower upwards of 75+ percent fan speed can tame fairly large amounts of overclocking and overvolting; even for VRM1 which controls power for core. I benched my 290 with the stock blower @ 75% fan, with +125mv, +50% power to obtain 1215/1675 clock speeds. This kept the core in the low 70s and VRMs in the low 60s for 3DMark11 Extreme. Higher temps were seen for other benchmarks, but the temps were well within operating range to prevent throttling. Bench runs can be seen here: http://www.overclock.net/t/1456279/honey-i-shrunk-the-ultra-tower-beast-my-journey-to-creating-a-more-compact-pc-with-an-r9-290/20_20#post_21847939

Your black screen issues could be due to heat. If core is at 95, then VRM2(power for RAM) may be very hot as well. Use GPU-Z or something similar to check your VRM1 and VRM2 temps. VRM1 is just as important, and problematic, as the core on these cards without proper cooling. Typically VRM2 temps aren't of concern once you get rid of the crappy blower since all the heat from VRM1 & Core aren't being blown over it.

I presume you purchased the card used? Did the previous owner have the stock blower off of it? Was it used for mining or anything else that put the card under constant heavy stress?

In an overclocking scenario, black screens are typically caused by pushing the RAM clocks too high.

Regardless of what GPU you came from before this one, it is best to scrub all previous drivers with DDU(display driver uninstaller) and start fresh. Driver conflicts can cause all kinds of issues. http://www.guru3d.com/files-get/display-driver-uninstaller-download,20.html

General good read since you're new to the Hawaii GPUs: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781
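Since the advice above hinges on actually watching VRM1/VRM2 alongside the core, here is a hypothetical helper for scanning a GPU-Z style sensor log for peak temps. The column names are assumptions - check the header row of your own log, since the exact labels vary by card and GPU-Z version:

```python
# Hypothetical helper: scan a GPU-Z style CSV sensor log for peak
# temperatures. Column names below are assumptions -- match them to
# the header row of your own log file.
import csv
import io

def peak_temps(log_text, columns=("GPU Temperature",
                                  "VRM Temperature 1",
                                  "VRM Temperature 2")):
    """Return {column: max value seen} for the requested sensor columns."""
    reader = csv.DictReader(io.StringIO(log_text))
    peaks = {c: float("-inf") for c in columns}
    for row in reader:
        for c in columns:
            if row.get(c, "").strip():
                peaks[c] = max(peaks[c], float(row[c]))
    return peaks

# Tiny made-up log for demonstration:
sample = """GPU Temperature,VRM Temperature 1,VRM Temperature 2
71.0,62.0,55.0
74.0,68.0,58.0
"""
print(peak_temps(sample))
```

The point is just to capture the worst case over a whole gaming session rather than eyeballing the live readout - a VRM1 spike that only lasts a few seconds is exactly what you'd miss otherwise.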


----------



## Archea47

Quote:


> Originally Posted by *Archea47*
> 
> Is the Cinebench R15 OpenGL benchmark considered at all taxing?
> 
> Here's my results and max temps with 2x 290X Tri-X with stock Tri-X BIOS and no tweaks/overclock:
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Is this a normal/healthy/reasonable starting point? Fresh build of Windows 7 with the latest (15.7.1?) WHQL drivers


Quote:


> Originally Posted by *kizwan*
> 
> They are normal, healthy & reasonable. As for the CPU temp, the CPU sensor on the motherboard is the most accurate. You likely know this already but since you highlight both, so...


With the 290Xs still at stock (Tri-X stock), I got a big bump in my Cinebench R15 score by overclocking the FX CPU from the stock 4GHz to 4.8GHz (working back up to 5+):


EDIT: From 1040/1300 to 1150/1500 the score only went up from 108.19 to 108.81? This seems to be a CPU-bound test so far.
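Those numbers do point at a CPU-bound test. A quick sanity check on the figures quoted above:

```python
# Back-of-the-envelope check on the numbers above: a ~10% GPU core
# overclock moved the Cinebench OpenGL score by well under 1%,
# which points at a CPU (or engine) bottleneck, not the GPU.

core_before, core_after = 1040, 1150   # MHz
fps_before, fps_after = 108.19, 108.81

clock_gain = (core_after - core_before) / core_before * 100
fps_gain = (fps_after - fps_before) / fps_before * 100

print(f"core clock: +{clock_gain:.1f}%")  # +10.6%
print(f"fps:        +{fps_gain:.2f}%")    # +0.57%
# If the GPU were the limiter, the fps gain would roughly track the
# clock gain; here it barely moves, so the test is bound elsewhere.
```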


----------



## mfknjadagr8

Quote:


> Originally Posted by *Roboyto*
> 
> Yup, like others said its normal for those temperatures.
> 
> AIO and bracket is a great idea if you want to experience what the card is capable of and not a hot, noisy, performance throttling turd. Harsh, but true, words coming from someone who currently has both gaming PCs powered by these cards; love 'em but they need proper cooling to achieve full potential.
> 
> Re-pasting won't likely get you very far, I've tried it and unless the paste is like concrete you may not see much of a performance gain. Try manually forcing the fan speed to about 60%, at that speed it should easily be able to sustain factory clocks/voltages without any throttling. Pushing the stock blower upwards of 75+ percent fan speed can tame fairly large amounts of overclocking and overvolting; even for VRM1 which controls power for core. I benched my 290 with the stock blower @ 75% fan, with +125mv, +50% power to obtain 1215/1675 clock speeds. This kept the core in the low 70s and VRMs in the low 60s for 3DMark11 Extreme. Higher temps were seen for other benchmarks, but the temps were well within operating range to prevent throttling. Bench runs can be seen here: http://www.overclock.net/t/1456279/honey-i-shrunk-the-ultra-tower-beast-my-journey-to-creating-a-more-compact-pc-with-an-r9-290/20_20#post_21847939
> 
> Your black screen issues could be due to heat. If core is at 95, then VRM2(power for RAM) may be very hot as well. Use GPU-Z or something similar to check your VRM1 and VRM2 temps. VRM1 is just as important, and problematic, as the core on these cards without proper cooling. Typically VRM2 temps aren't of concern once you get rid of the crappy blower since all the heat from VRM1 & Core aren't being blown over it.
> 
> I presume you purchased the card used? Did the previous owner have the stock blower off of it? Was it used for mining or anything else that put the card under constant heavy stress?
> 
> In an overclocking scenario, black screens are typically caused by pushing the RAM clocks too high.
> 
> Regardless of what GPU you came from before this one, it is best to scrub all previous drivers with DDU(display driver uninstaller) and start fresh. Driver conflicts can cause all kinds of issues. http://www.guru3d.com/files-get/display-driver-uninstaller-download,20.html
> 
> General good read since you're new to the Hawaii GPUs: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781


some good info... mine was like clay/concrete as you described it... i went from the 90s at load at stock clocks to low 70s... which is pretty good considering i only pushed the fan to 60 percent... dont get me wrong, i did run it at 100 for awhile just to check it out... and two 290s at 100 percent fan... omg... that was unbearable for long periods... with him seeing 90 in youtube video playback, id almost bet a large sum on the paste being so hard he could stab someone with it... not that i endorse that


----------



## buttface420

Quote:


> Originally Posted by *mfknjadagr8*
> 
> some good info... mine was like clay/ contrete as you described it.. i went from 90s at load stock clocks to low 70s.. which is pretty good considering i only pushed the fan to 60 percent... dont get me wrong i did run it at 100 for awhile just to check it out... and two 290s at 100 percent fan.. omg... that was unbearable for long periods... with him seeing 90 in youtube video playback id almost bet a large sum on the paste being so hard he could stab someone with it.... not that in endorse that


i redid the paste a while ago and yes, it was hard and crusty. applied mx-4 and load temps in games are still 95c, but idle and youtube temps dropped a lot - during youtube it's in the 60's now.

im looking at aio coolers to slap on it right now. manually setting the fan to 60% dropped load temps to the mid 80's, but it sounds like a rocket.


----------



## kizwan

Quote:


> Originally Posted by *Archea47*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Archea47*
> 
> Is the Cinebench R15 OpenGL benchmark considered at all taxing?
> 
> Here's my results and max temps with 2x 290X Tri-X with stock Tri-X BIOS and no tweaks/overclock:
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Is this a normal/healthy/reasonable starting point? Fresh build of Windows 7 with the latest (15.7.1?) WHQL drivers
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> They are normal, healthy & reasonable. As for the CPU temp, the CPU sensor on the motherboard is the most accurate. You likely know this already but since you highlight both, so...
> 
> 
> 
> 
> 
> 
> With the 290Xs still at stock (Tri-X stock) I got a big bump in my Cinebench R15 score by overclocking the FX CPU from stock 4 to 4.8GHz (building back up to the 5+):
> 
> 
> EDIT: From 1040/1300 to 1150/1500 I only went up from 108.19 to 108.81? This seems like a CPU-bound test so far

My gpu usage was less than 50% - Cinebench doesn't fully utilize the gpu. CPU load is 92% when running the multi-core test, but single-core is less than 40%.

947/1250 115.59 FPS vs. 1159/1499 117.39. Stupid TRIXX, I can't set it to 1160/1500.








CPU 3820 @4.75GHz


----------



## Arizonian

@Liranan & @buttface420 _or anyone else I may have missed._

I read you both got new cards, congrats. I'm sorry if I missed your submission post. If you did, I didn't find it (if you could link me) and if not, please post either GPU-Z pic or link open to 'VALIDATION' tab with OCN name. Even better an actual pic. I'd like to add you to the club roster of *604* members with *871* cards to date.


----------



## Liranan

Quote:


> Originally Posted by *Arizonian*
> 
> @Liranan & @buttface420 _or anyone else I may have missed._
> 
> I read you both got new cards, congrats. I'm sorry if I missed your submission post. If you did, I didn't find it (if you could link me) and if not, please post either GPU-Z pic or link open to 'VALIDATION' tab with OCN name. Even better an actual pic. I'd like to add you to the club roster of *604* members with *871* cards to date.


Usually I don't join clubs. The only club I'm a member of is the 840 SSD club, and that was an accident. However, I will join this one because your politeness has warmed my heart.






Two years late, but these things are so cheap now it's ridiculous. I got mine for 160 USD less than a 390, and the stock of 290's is dwindling fast. Did AMD discontinue production long ago, and are we now seeing the last batches that were produced being sold?


----------



## MEC-777

Quote:


> Originally Posted by *buttface420*
> 
> i re did the paste just a while ago and yes it was hard and krusty applied mx-4 and now load temps in games still is 95c but idle and youtube dropped down alot. during youtube its in the 60's now.
> 
> im looking at aio coolers to slap on it right now, manually setting fan to 60% dropped temps down to mid 80's under load but sounds like a rocket.


I have two R9 290's, both with AIO's slapped on them. If you're running a single 290, you can get away with air IF it has a good aftermarket cooler on it. The reference 290's, IMO, are unbearable and unacceptable in terms of temps and noise.

One is an HIS IceQ X2 which, on its own with its aftermarket air cooler, was fairly quiet and would run in the mid/high 70's with a mild OC. The other 290 I picked up about a month ago is a Gigabyte reference. With both cards installed and air cooled, they would both easily reach 90C+ at stock clocks in short order. The noise and heat radiating from my case was just insane.

Now I have a Kraken G10 bracket + H55 AIO on the HIS 290 and a Corsair HG10 + H55 AIO on the reference 290. Water cooling these cards makes a massive difference! The top card (HIS) never goes much beyond the mid 60's and the bottom card (reference) rarely exceeds the mid 50's - and this is under full gaming load for hours. Silence is bliss! All the rad fans run at no more than about 1400-1500rpm, which is like a gentle whisper.

For you and anyone with a reference 290/X, I STRONGLY recommend the Corsair HG10 A1 kit + the H55 (or any Corsair AIO of choice). It's worth every penny. The Kraken G10 is a good alternative, but it's not as ideal for reference cards. The HG10 was designed for the reference 290's and does a better job cooling the VRMs and VRAM modules with its included mid-plate.

And one more thing: make sure you set up the rad for your GPU as exhaust.









Pics:


----------



## buttface420

Quote:


> Originally Posted by *Arizonian*
> 
> @Liranan & @buttface420 _or anyone else I may have missed._
> 
> I read you both got new cards, congrats. I'm sorry if I missed your submission post. If you did, I didn't find it (if you could link me) and if not, please post either GPU-Z pic or link open to 'VALIDATION' tab with OCN name. Even better an actual pic. I'd like to add you to the club roster of *604* members with *871* cards to date.


here ya go! please excuse all the dust.. also, my power supply is 7 years old - is that a bad thing? it's a corsair tx750w 80+ bronze


----------



## buttface420

Quote:


> Originally Posted by *MEC-777*
> 
> I have two R9 290's, both with AIO's slapped on them. If you're running a single 290, you can get away with running it on air IF it has a good aftermarket cooler on it. The reference 290's, IMO are unbearable and unacceptable in terms of temps and noise.
> 
> One is an HIS IceQx2 which, on it's own with it's aftermarket air cooler, was fairly quiet and would run in the mid/high 70's with a mild OC. The other 290 I picked up about a month ago is a Gigabyte reference. With both cards installed and both air cooled, they would both easily reach 90*C+ at stock clocks in not a very long period of time. The noise and heat radiating from my case was just insane.
> 
> Now I have a Kraken G10 braket + H55 AIO on the HIS 290 and a Corsair HG10 + H55 AIO on the reference 290. Water cooling these cards makes a massive difference! The top card (HIS) never reaches much beyond the mid 60's and the bottom card (reference) rarely exceeds the mid 50's - and this is under full gaming load for hours. Silence is bliss! All the rad fans run no more than about 1400-1500rpm which is like a gentle whisper.
> 
> For you and for anyone with a reference 290/X I STRONGLY recommend the Corsair HG10 A1 kit + the H55 (or any Corsair AIO of choice). It's worth every penny. The Kraken G10 is a good alternative, but it's not as ideal for reference cards. The HG10 was designed for the reference 290's and does a better job with cooling the VRMs and Vram modules with it's included mid-plate.
> 
> And one more thing; make sure you setup the rad for your GPU as exhaust.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Pics:


ahh cool! the hg10 a1 bracket is exactly what i was looking at. i didn't know if i should get an h50 (intel only) or an h55.. i think that's the route im going to go because it looks awesome!


----------



## Archea47

Quote:


> Originally Posted by *kizwan*
> 
> My gpu usage was less than 50%. Cinebench doesn't fully utilized the gpu. CPU load is 92% when running the multi-core test but single core is less than 40%.
> 
> 947/1250 115.59 FPS vs. 1159/1499 117.39. Stupid TRIXX, I can't set it to 1160/1500.
> 
> 
> 
> 
> 
> 
> 
> 
> CPU 3820 @4.75GHz


Thanks for sharing your results! BTW how do you switch between single and multi core on the cpu test?


----------



## mfknjadagr8

Quote:


> Originally Posted by *buttface420*
> 
> here ya go! please excuse all the dust..also my power supply is 7 years old is that a bad thing?


I would replace a 7 year old psu, but assuming it doesn't fail, it's enough wattage for what you have... being the corsair tx line and 7 years old, I would consider replacing it if you can find a better brand or rebrand such as Super Flower or Seasonic for a decent price


----------



## supermiguel

will i see a big difference between 2 way and 3 way crossfire? using a 4k monitor and playing ArcheAge on ultra, i get 30fps with 2 290x's, not overclocked... will i reach 60fps if i add another 290x?


----------



## kizwan

Quote:


> Originally Posted by *Archea47*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> My gpu usage was less than 50%. Cinebench doesn't fully utilized the gpu. CPU load is 92% when running the multi-core test but single core is less than 40%.
> 
> 947/1250 115.59 FPS vs. 1159/1499 117.39. Stupid TRIXX, I can't set it to 1160/1500.
> 
> 
> 
> 
> 
> 
> 
> 
> CPU 3820 @4.75GHz
> 
> 
> 
> 
> Thanks for sharing your results! BTW how do you switch between single and multi core on the cpu test?

Go to File >> Advanced benchmark.


----------



## semitope

Quote:


> Originally Posted by *Liranan*
> 
> Sadly AMD reference cards are terrible. My reference 4870 and even 6870 were just awfully hot, loud, had terrible paste and both died after a short while. Fortunately they got replaced by aftermarket coolers by MSI (6870) and Powercolour (4870)m that's why I made sure I got the card with the best cooler this time. nVidia cards on the other hard are entirely different, their stock coolers are quite good and quiet.


I actually had a decent reference 5770. Temps got hot after a good while, but I redid the paste and it stayed at least under 80. It hadn't died after 5 years AFAIK. Sold it last year.


----------



## sinnedone

Quote:


> Originally Posted by *supermiguel*
> 
> will i see a big difference between 2 way and 3 way crossfire? Using a 4k monitor and playing archage, on ultra i get 30fps with 2 290x not overclocked.... will i reach 60fps if i add another 290x?


Honestly I would just lower the quality settings. At that resolution you really shouldn't be running AA/MSAA etc. Set your settings on high with no antialiasing or post processing. You should be able to get 60 fps.

Graphics cards have not caught up to 4k yet so I would compromise for a smoother gameplay experience.


----------



## rdr09

Quote:


> Originally Posted by *supermiguel*
> 
> will i see a big difference between 2 way and 3 way crossfire? Using a 4k monitor and playing archage, on ultra i get 30fps with 2 290x not overclocked.... will i reach 60fps if i add another 290x?


tried oc'ing your cpu more? how much oc do you have?


----------



## MIGhunter

Quote:


> Originally Posted by *buttface420*
> 
> here ya go! please excuse all the dust..also my power supply is 7 years old is that a bad thing? its a corsair tx750w 80+ bronze
> 
> 
> Spoiler: Warning: Spoiler!


I bought a similar model in 2012 (750tx). I bought it because bench tests at the time showed it pushing 1000 watts easy. You shouldn't need to replace it.
Quote:


> Originally Posted by *supermiguel*
> 
> will i see a big difference between 2 way and 3 way crossfire? Using a 4k monitor and playing archage, on ultra i get 30fps with 2 290x not overclocked.... will i reach 60fps if i add another 290x?


AA is ******ed. It's not optimized for crossfire. My 295x gets 145 fps at 1080p in a lot of zones. In Windscour it bounces from 15 to 145. It's hard to play there sometimes.

p.s. what server are you on? I'm on Ollo.


----------



## supermiguel

Quote:


> Originally Posted by *MIGhunter*
> 
> I bought a similar model in 2012 (750tx). I bought it because bench tests at the time showed it pushing 1000 watts easy. You shouldn't need to replace it.
> AA is ******ed. It's not optimized for crossfire. My 295x gets 145 fps at 1080p in a lot of zones. In Windscour it bounces from 15 to 145. It's hard to play there sometimes.
> 
> p.s. what server are you on? I'm on Ollo.


Morpheus


----------



## Arizonian

Quote:


> Originally Posted by *Liranan*
> 
> Usually I don't join clubs. The only club I'm a member of is the 840 SSD club but that is an accident however I will join this one because your politeness has warmed my heart.
> Quote:
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Two years late but these things are so cheap now it's ridiculous. I got mine for 160 USD less than a 390 and the stock of 290's are dwindling fast. Did AMD discontinue production long ago and we are now seeing the last batches that were produced being sold?

Then I really feel honored you joined and congrats - added








Quote:


> Originally Posted by *buttface420*
> 
> here ya go! please excuse all the dust..also my power supply is 7 years old is that a bad thing? its a corsair tx750w 80+ bronze
> 
> Quote:
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 

Seven years on the PSU makes it a great value.









Congrats - added









606 members with 873 GPUs


----------



## HOMECINEMA-PC

Luv me 290's


----------



## supermiguel

Is the upgrade from 2 to 3 worth it? Especially for 4K.


----------



## rdr09

Quote:


> Originally Posted by *supermiguel*
> 
> is the upgrade from 2 to 3 worth it? specially for 4k


How much is your CPU OC'ed?

I asked because I have a similar setup, and I think my Sandy Bridge i7 has hit its limit with two 290s. I won't dare add more. I use 4K.


----------



## Gumbi

Quote:


> Originally Posted by *supermiguel*
> 
> is the upgrade from 2 to 3 worth it? specially for 4k


Depends on your CPU, you'd want an overclocked i5/7 to minimise bottlenecking on that front.

Also, the scaling is good on 2 cards, but drops off a nice bit with a third card.
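To make the diminishing-returns point concrete, here's a rough back-of-envelope sketch. The per-card scaling factors below are illustrative assumptions, not measured benchmarks; real scaling varies per game and driver:

```python
# Back-of-envelope multi-GPU scaling estimate.
# Scaling factors are made-up illustrative values, NOT benchmarks:
# card 1 = 100%, card 2 adds ~85%, card 3 adds ~60%, card 4 adds ~40%.

def estimated_fps(single_card_fps, card_count, scaling=(1.0, 0.85, 0.6, 0.4)):
    """Estimate fps: each extra card contributes a diminishing fraction
    of one card's performance (ignores CPU bottlenecks entirely)."""
    effective_cards = sum(scaling[:card_count])
    return single_card_fps * effective_cards

base = 18  # assumed fps on one card at 4K ultra, for illustration only
for n in (1, 2, 3):
    print(n, "cards ->", round(estimated_fps(base, n), 1), "fps")
```

Under these assumed factors, going from two cards to three adds far less than the second card did, which is why the CPU and game engine matter so much before adding a third GPU.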


----------



## kizwan

Quote:


> Originally Posted by *sinnedone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *supermiguel*
> 
> Will I see a big difference between 2-way and 3-way CrossFire? Using a 4K monitor and playing ArcheAge on ultra, I get 30fps with two 290Xs, not overclocked... will I reach 60fps if I add another 290X?
> 
> 
> 
> Honestly I would just lower the quality settings. At that resolution you really shouldn't be running AA/MSAA etc. Set your settings on high with no antialiasing or post processing. You should be able to get 60 fps.
> 
> Graphics cards have not caught up to 4k yet so I would compromise for a smoother gameplay experience.

I agree with AA but MSAA doesn't affect performance right?
Quote:


> Originally Posted by *supermiguel*
> 
> is the upgrade from 2 to 3 worth it? specially for 4k


For 4K, definitely worth it.


----------



## megax05

I want to join this club but have some questions.
My R9 290X will come back from RMA within a week.
I'm using an i7 4770 (non-K) with 16GB of DDR3-2133.
My PSU is a Thermaltake 730W. Can I run 2x R9 290X in CF with a bit of undervolting? I'm planning to sell my GTX 970 and grab another R9 290X because I'm planning to get a new 4K monitor.


----------



## ImJJames

Quote:


> Originally Posted by *megax05*
> 
> I want to join this club but have some questions.
> My r9 290x will come from an RMA within a week.
> I am using i7 4770 non k with 16gb of ddr3 2133
> PSU is thermaltake 730w can I run 2x r9 290x in cf with a bit undervolt I am planning to sell my gtx 970 and grab another r9 290x cause planning to get a new 4k monitor.


Just don't OC; you'll be fine with 2x 290X on a 730W.


----------



## kizwan

Quote:


> Originally Posted by *megax05*
> 
> I want to join this club but have some questions.
> My r9 290x will come from an RMA within a week.
> I am using i7 4770 non k with 16gb of ddr3 2133
> PSU is thermaltake 730w can I run 2x r9 290x in cf with a bit undervolt I am planning to sell my gtx 970 and grab another r9 290x cause planning to get a new 4k monitor.


Your PSU's +12V max power output is 672W. I think that's enough even without an undervolt. No overclocking, of course.
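The 672W figure is just rail amps times volts. A minimal sketch of the headroom check, assuming a 56A combined +12V rating (typical for a Thermaltake 730W unit) and rough, unmeasured component draws:

```python
# +12V rail headroom check: P = I * V.
# 56 A is an ASSUMED combined +12V rating for a Thermaltake 730W unit;
# the component draws below are rough typical figures, not measurements.

RAIL_AMPS = 56.0
RAIL_VOLTS = 12.0
rail_watts = RAIL_AMPS * RAIL_VOLTS  # 672 W

loads = {
    "i7 4770 (stock)": 85,
    "R9 290X #1 (stock)": 290,
    "R9 290X #2 (stock)": 290,
}
total = sum(loads.values())
print(f"+12V capacity: {rail_watts:.0f} W, estimated draw: {total} W, "
      f"headroom: {rail_watts - total:.0f} W")
```

With these assumed numbers the rail is essentially at its limit at stock clocks, which is why everyone in the thread cautions against overclocking on that unit.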


----------



## MIGhunter

Are you guys finding a good 4K monitor? Everything I've read has been "stick with 1440p and a 144Hz monitor for gaming because 4K is too far behind the curve."


----------



## HOMECINEMA-PC

Get a 4K monitor and run whatever res you want.

DP of course. But I run tri monitors at 1440p, as the middle one is a 28" Sammy UHD with a 60Hz refresh.

Looks soooo gooood.

For me, 4K over 1080p at 144Hz. All day long.


----------



## supermiguel

Quote:


> Originally Posted by *rdr09*
> 
> how much is your cpu oc'ed?
> 
> i asked coz i have a similar setup and i think my i7 Sandy has seen its limit with 2 290s. i won't dare add more. i use 4K.


3930x 4.8-5.0 GHz


----------



## Mega Man

It's a common bogus argument. Quad-fire (or tri-fire) is still better.


----------



## supermiguel

Quote:


> Originally Posted by *Mega Man*
> 
> It is a common fake argument. Quad fire is still better ( or tri )


????


----------



## rdr09

Quote:


> Originally Posted by *supermiguel*
> 
> 3930x 4.8-5.0 GHz


Depending on your motherboard, you can go tri first. Your other components will matter as well, like the PSU. I'd say minimum 1300W for three cards.


----------



## supermiguel

Quote:


> Originally Posted by *rdr09*
> 
> depending on your motherboard, you can go tri first. your other components will matter as well like the PSU. i'd say minimum 1300W for 3.


Can't find those damn MSI 290X 4G Twin Frozrs anywhere :s


----------



## rdr09

Quote:


> Originally Posted by *supermiguel*
> 
> cant find those dam msi 290x 4g twin frozr anywhere :s


Doesn't have to be the exact same brand unless you're really particular, but... on air... unless you have enough space between the GPUs and really good airflow... I would suggest going water. Those MSI Twin Frozrs are not the coolest to run in multi-GPU.


----------



## supermiguel

Quote:


> Originally Posted by *rdr09*
> 
> does not have to be exact same brand unless you are really particular but . . . on air . . . unless you have enough good space between the gpus and really good air flow . . . i would suggest going water. those msi twins are not the coolest to run in multi.


i gotcha u =) http://www.overclock.net/t/1508183/build-log-super-monster

Just trying to finish a long project, but can't find video cards...

How does CrossFire work with different brands of cards? Does it cause issues?


----------



## rdr09

Quote:


> Originally Posted by *supermiguel*
> 
> i gotcha u =) http://www.overclock.net/t/1508183/build-log-super-monster
> 
> just trying to finish a long project but cant find video cards....
> 
> How does crossfire work with different brands of cards? does it cause issues?


Lovely. CrossFire will work fine between different brands; some people even CrossFire 290s with 290Xs. With that amount of rad... you should keep everything below 60 at load.

If you've run CrossFire before, there shouldn't be any surprises. I always install (upgrade) the driver with CrossFire disabled.

Edit: I would seek help if you need any from members like Homecinema from an earlier post.


----------



## Mega Man

Quote:


> Originally Posted by *supermiguel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> how much is your cpu oc'ed?
> 
> i asked coz i have a similar setup and i think my i7 Sandy has seen its limit with 2 290s. i won't dare add more. i use 4K.
> 
> 
> 
> 3930x 4.8-5.0 GHz

Quote:


> Originally Posted by *Mega Man*
> 
> It is a common fake argument. Quad fire is still better ( or tri )


@supermiguel see rdr09's comment about not daring to add another.


----------



## battleaxe

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Luv me 290's


Nice... you had some of the more beast 290s around, didn't you? Weren't they doing 1300MHz+?


----------



## Widde

If someone would be so kind as to remove the Sapphire card from my entry on the list ^^ Sold one of the cards.


----------



## Streetdragon

Sooooooo one more time Crossfire!

http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club/2860#post_24448137

First Slot r9 390 Nitro

Second Slot r9 290 Vapor-X

Runs nicely and without problems on the newest beta driver.


----------



## Arizonian

Quote:


> Originally Posted by *Widde*
> 
> If someone would be so kind and remove the sapphire card from me on the list ^^ Sold one of the cards


Done.








Quote:


> Originally Posted by *Streetdragon*
> 
> Sooooooo one more time Crossfire!
> 
> http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club/2860#post_24448137
> 
> First Slot r9 390 Nitro
> 
> Second Slot r9 290 Vapor-X
> 
> runs nice and without problems with the newest Beta Driver


Congrats on the 390/290 combo.


----------



## serave

Hello guys, I've asked this somewhere before, but I'd post it in this thread again because a guy tried to sell me his own Sapphire R9 290 for $215.

I have a SuperFlower 500W unit (mine is the exact same model as the one in this link).

It has 41.5A on the 12V rail; I have an Intel i5 4590, 4 case fans, and 1 HDD with no ODD.

And now for the generic question: would my PSU be able to cope with the load? I might get a new PSU in the future, but definitely not until my next paycheck.


----------



## Widde

Quote:


> Originally Posted by *serave*
> 
> hello guys, i've asked this somewhere before, but i'd post it at this thread again because a guy tried to sell me his own Sapphire R9 290 for $215
> 
> I have the SuperFlower 500Watt unit (mine is the exact same model as the one in this link )
> 
> it has 41,5A on the 12V rail, i have an Intel i5 4590, 4 case fans and 1 HDD with no ODD
> 
> and now for the generic question, would my PSU be able to cope with the load ? i might get a new PSU in the future, but definitely not until my next paycheck


I'm pulling around 490-510 watts at the wall with full load. 1 R9 290 1100/1350 +50% power limit +96mV , 4770k 4,2 ghz 1,37v 3 hdds and 1 ssd, 6 fans
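One caveat when comparing a wall reading like that against a PSU's rating: the wall figure includes the PSU's own conversion losses, so the DC load the unit actually delivers is lower. A minimal sketch, assuming roughly 85% efficiency (a made-up but plausible figure for an 80 PLUS unit at high load):

```python
# Convert a measured wall draw to the approximate DC load the PSU delivers.
# The 0.85 efficiency is an ASSUMED figure for an 80 PLUS unit near full
# load, for illustration only -- check the unit's own efficiency curve.

def dc_load(wall_watts, efficiency=0.85):
    """DC watts delivered to components for a given AC wall draw."""
    return wall_watts * efficiency

for wall in (490, 510):
    print(f"{wall} W at the wall ~= {dc_load(wall):.0f} W DC load")
```

So a 490-510W wall reading corresponds to roughly 415-435W of actual DC load under this assumption, which is why a quality 500W unit can still cope with a single 290 plus a locked i5.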


----------



## serave

So if we factor in the PSU efficiency, the voltage boost on your processor, and a moderate OC on your R9 290,

I guess I should be fine with my PSU? I don't have any plans for overclocking either.

Nevertheless, I think I'm going to give it a shot after all. Thanks for the reply.


----------



## Mega Man

Quote:


> Originally Posted by *serave*
> 
> so if we factor the PSU efficiency, the voltage boost on your processor, and a moderate OC on your R9 290
> 
> i guess i should be fine with my PSU? i dont have any plan for overclocking aswell
> 
> nevertheless, i think im going to give it a shot afterall, thanks for the reply


Huh? What does PSU efficiency have to do with anything? Nvm, I missed a post, sorry.

But yes, a 500W should be able to push one card no problem. Maybe not OC'ed, but other than that, no problem.


----------



## Streetdragon

Maybe push a bit of air through the PSU to get rid of the dust in it and everything will be OK!


----------



## mathesar

Acquired a used single-fan Diamond R9 290X 4GB from a friend who recently upgraded to dual 980 Tis. The Diamond version appears to be a rare and probably undesirable card (reference cooling), but it was free so I couldn't turn it down lol.

I've had it installed a few days now, and so far no problems and great performance. My previous card was an Nvidia GTX 670 FTW model (EVGA).

My only gripe with this 290X is the heat output under load. With the stock auto fan it would hit 94C very quickly during gaming. I installed MSI Afterburner and enabled user fan control; with MSI's default fan curve it rarely sees over 81C, depending on the game. Soma, for example, maxes out around 78C; Mad Max 81C; Plants vs Zombies Garden Warfare 74C.

The one downside to the custom fan curve method is semi-loud fan operation, although it hasn't hit 100% (usually around 80% fan speed).

Is there a popular aftermarket replacement cooler I should consider for this 290X? (Prefer air-cooled if possible.)

Thanks!
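For anyone curious what a "fan curve" actually is under the hood: it's just piecewise-linear interpolation between (temperature, fan %) points. A minimal sketch of the idea; the curve points below are made up for illustration, not MSI Afterburner's defaults:

```python
# Piecewise-linear fan curve, the same idea a custom Afterburner curve uses.
# The (temperature C, fan %) points are ILLUSTRATIVE, not MSI defaults.

CURVE = [(40, 30), (60, 45), (75, 65), (85, 80), (94, 100)]

def fan_percent(temp_c):
    """Interpolate fan duty between curve points; clamp below/above the ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(81))  # interpolated between the 75C and 85C points
```

A steeper curve trades noise for lower peak temperature, which is exactly the semi-loud-at-80% behavior described above.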


----------



## fyzzz

Quote:


> Originally Posted by *mathesar*
> 
> Acquired a used single fan Diamond R9 290X 4GB from a friend who recently upgraded to dual 980 TI's, The Diamond version appears to be a rare and probably undesirable card (reference cooling) but It was free so I couldn't turn it down lol.
> 
> I've had it installed a few days now and so far no problems and great performance, my previous card was a Nvidia GTX 670 FTW model (EVGA).
> 
> My only gripe with this 290X is the heat output under load, With the stock Auto fan it would hit 94C very quickly during gaming, I installed MSI Afterburner and enabled user fan control, With MSI's default fan curve it rarely sees over 81C, depending on the game, Soma for example maxes out around 78C, Mad Max 81C, Plants vs Zombies Garden warfare 74C.
> 
> The one downside to the custom fan speed method is semi-loud fan operation although it hasn't hit 100% (usually around 80% RPM)
> 
> Is there a popular aftermarket replacement cooler I should consider for this 290X? (prefer air cooled if possible).
> 
> Thanks!


I used a Raijintek Morpheus on my 290 before; it comes with heatsinks and cools the card well. But you need pretty good fans: I ran it with a couple of NF-F12s and the VRMs got a bit hot, but when I changed the fan that was above VRM 1, it dropped a bit. You could also check out the Arctic Accelero IV or the Prolimatech MK-26.


----------



## Gumbi

Quote:


> Originally Posted by *mathesar*
> 
> Acquired a used single fan Diamond R9 290X 4GB from a friend who recently upgraded to dual 980 TI's, The Diamond version appears to be a rare and probably undesirable card (reference cooling) but It was free so I couldn't turn it down lol.
> 
> I've had it installed a few days now and so far no problems and great performance, my previous card was a Nvidia GTX 670 FTW model (EVGA).
> 
> My only gripe with this 290X is the heat output under load, With the stock Auto fan it would hit 94C very quickly during gaming, I installed MSI Afterburner and enabled user fan control, With MSI's default fan curve it rarely sees over 81C, depending on the game, Soma for example maxes out around 78C, Mad Max 81C, Plants vs Zombies Garden warfare 74C.
> 
> The one downside to the custom fan speed method is semi-loud fan operation although it hasn't hit 100% (usually around 80% RPM)
> 
> Is there a popular aftermarket replacement cooler I should consider for this 290X? (prefer air cooled if possible).
> 
> Thanks!


I don't think you understand thermodynamics. A cooler card doesn't mean it's producing less heat (sure, it'll produce slightly less due to improved efficiency, but overall the difference is minimal). Aftermarket coolers are just _more efficient_ at removing the same heat.

As to your question, look into the Arctic Accelero IV cooler.


----------



## kizwan

Even though he wrote "heat", what he wrote after that shows he's actually talking about temperature, as in the core temp.

@mathesar, well I don't have any recommendation for air cooling but whatever aftermarket cooler you're going to get, make sure you get heatsink & a fan for VRM cooling.


----------



## davidm71

Just redo the thermal grease.


----------



## MEC-777

Quote:


> Originally Posted by *davidm71*
> 
> Just redo the thermal grease.


This does very little for a reference Hawaii card (R9 290/X). They simply produce so much heat, and the stock reference "blower" cooler is just not efficient enough to cool the card effectively without making a ton of noise.

@mathesar I would HIGHLY recommend the Corsair HG10 A1 water cooling bracket kit, which you can use with any Corsair AIO cooler (I suggest the H55, which is inexpensive and very reliable), for anyone with a reference 290/X. It uses the blower fan from the stock cooler to help keep the VRMs and VRAM modules cool (it only has to run at a constant 30%, which is silent), and it has a mid-plate incorporated into its design to act as a large heatsink for the VRMs and VRAM as well. It's really well designed and works great. Using one myself on a reference 290, and the temps are mid-50s at max load and 32 at idle.


----------



## Mega Man

Ironically, we just had this conversation in another thread.

First, changing thermal paste may have an effect.

Second, the reference cooler is fine at doing what it does.

Third, it is far better to just buy a full block and, say, an MCR140X than any crappy AIO. Does it cost more? Sure. But it is far better in so many ways.


----------



## megax05

I will go with the HG10 + H55. The combo will cost about $80, but it's just amazing; since you got the card for free, I highly recommend it.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Mega Man*
> 
> Ironically we just had this conversation in another thread.
> 
> First. Changing thermal paste may have an effect.
> 
> Second the ref cooler is fine at doing what it does
> 
> Third it is far better to just buy a full block and say a mcr140x then any crapy aio does it cost more sure. But it is far better in so many ways


Especially VRM temps... when the VRMs stay around the core temps, it's a great thing.


----------



## mathesar

Quote:


> Originally Posted by *MEC-777*
> 
> This does very little for a reference Hawaii card. (R9 290/X) They simply produce so much heat and the stock reference "blower" cooler is just not efficient enough to cool the card effectively without making a ton of noise.
> 
> @mathesar I would HIGHLY recommend the Corsair HG10 A1 water cooling bracket kit which you can use with any Corsair AIO cooler (I suggest the H55 which is inexpensive and very reliable) for anyone with a reference 290/X. It uses the blower fan from the stock cooler to help keep the VRMs and Vram modules cool (only has to run at 30% constant which is silent) and it has a mid-plate incorporated into it's design to act as a large heatsink for the VRMs and Vram as well. It's really well designed and works great. Using one myself on a reference 290 and the temps are mid-50's at max load and 32 at idle.


After researching the Accelero IV and reading user reviews, it appears to have issues with cooling the VRMs on the 290X, and people tend to buy separate VRM heatsinks; in that case the overall cost ends up being very close to the Corsair cooler + bracket (Newegg reviews are very mixed on the Accelero).

Either way, the Corsair bracket & H55 cooler seems like a much nicer design for both looks & function, so thanks for the info!


----------



## buttface420

Quote:


> Originally Posted by *mathesar*
> 
> Acquired a used single fan Diamond R9 290X 4GB from a friend who recently upgraded to dual 980 TI's, The Diamond version appears to be a rare and probably undesirable card (reference cooling) but It was free so I couldn't turn it down lol.
> 
> I've had it installed a few days now and so far no problems and great performance, my previous card was a Nvidia GTX 670 FTW model (EVGA).
> 
> My only gripe with this 290X is the heat output under load, With the stock Auto fan it would hit 94C very quickly during gaming, I installed MSI Afterburner and enabled user fan control, With MSI's default fan curve it rarely sees over 81C, depending on the game, Soma for example maxes out around 78C, Mad Max 81C, Plants vs Zombies Garden warfare 74C.
> 
> The one downside to the custom fan speed method is semi-loud fan operation although it hasn't hit 100% (usually around 80% RPM)
> 
> Is there a popular aftermarket replacement cooler I should consider for this 290X? (prefer air cooled if possible).
> 
> Thanks!


I recently got a reference 290X also. I took the advice of another member and bought an HG10 A1 bracket and a Corsair H55; waiting for the H55 to come. I looked at air cooling alternatives, but the Gelid needed 2 extra heatsink kits, which made it like 100 bucks total, and the Accelero looked too goofy to me, so I went with liquid.

I was going to go with the NZXT brackets, but they don't cool VRMs well and they required a "shim". The HG10 bracket acts like a heatsink itself and gets really good temps.


----------



## MEC-777

Quote:


> Originally Posted by *Mega Man*
> 
> Ironically we just had this conversation in another thread.
> 
> First. Changing thermal paste may have an effect.
> 
> Second the ref cooler is fine at doing what it does
> 
> Third it is far better to just buy a full block and say a mcr140x then any crapy aio does it cost more sure. But it is far better in so many ways


First, not much. Maybe 2-5 degrees if you're lucky. It's still going to hit 90+ unless you have the blower fan running at 60%+, which is still quite loud (way too loud IMO).

Second, if you don't mind the sound of a 747 at full throttle running in your case, sure, it's fine at what it does. For me and for many others, its performance and acoustics are unacceptable.

Third, a full-cover waterblock for a custom loop alone is $100+. The HG10 + H55 combo can be had for less than that. The temps and silence achieved are incredible value per dollar, and you avoid the pitfalls of a full custom loop. If going custom loop is your thing and you have the money, by all means. But for many others who just want to properly tame the heat and the atrocious noise of the Hawaii reference coolers, the HG10 + H55 is simply unbeatable in practicality, performance and cost. "Crappy AIO"? I think not. Just because it's not what you would choose doesn't mean it's "crappy".









The 290 I'm running with the HG10 rarely ever cracks the mid-50s. Ever. That's a 40-degree drop in load temps for $100 or less. Sorry, but I'm not paying hundreds of dollars more just to get it 5 degrees cooler than that.


----------



## Mega Man

Actually, I have heard of 10+C from changing paste. I never said every card would, hence the "may".

And that crappy AIO is why you will never have a golden card.


----------



## JourneymanMike

Quote:


> Originally Posted by *Mega Man*
> 
> Actually I have heard of 10+c from changing paste. I never said everything hence the may.
> 
> And *that crappy aio* is why you will never have a golden card


^^^^^^^^^^^^^^^^^^^^^This^^^^^^^^^^^^^^^^^^^^^^


----------



## MEC-777

Quote:


> Originally Posted by *Mega Man*
> 
> Actually I have heard of 10+c from changing paste. I never said everything hence the may.
> 
> And that crappy aio is why you will never have a golden card


I've never heard of temps being dropped 10 degrees on a reference Hawaii by simply changing the paste. I find it amusing that you defend the horrible reference cooler and then call an AIO solution crappy when it drops temps by 40+ degrees.

And what? Golden card? What are you talking about?

Like I said, if a custom loop is your thing and you have the cash for it, by all means. But I don't see where you get off calling an AIO solution crappy when it can get the card within 5 degrees of a custom loop for a fraction of the cost.

Also, it helps if you quote the person you're talking/replying to.


----------



## JourneymanMike

Quote:


> Originally Posted by *MEC-777*
> 
> I've never heard of temps being dropped 10 degrees on a reference Hawaii by simply changing the paste. I find it amusing that you defend the horrible reference cooler and then call an AIO solution crappy when it drops temps by 40 degrees +.
> 
> And what? Golden card? What are you talking about?
> *
> Like I said, if custom loop is your thing and you have the cash for it, by all means. But I don't see where you get off calling an AIO solution crappy when it can get the card within 5 degrees of a custom loop for a fraction of the cost.
> *
> Also, it helps if you quote the person you're talking/replying to.


I agree that some AIOs come really close to a custom loop for the CPU. I started with a Corsair H100i and then quickly switched to a Swiftech H220, which gave me almost 10C lower temps on the CPU than the Corsair...

That was my humble start. So I decided to go full-blown custom. I bought a CaseLabs case w/pedestal (so I'd have enough room for the loop) and went nuts! Thousands of $ spent on WC blocks, fittings and the like. It works great, looks good, and I still want even better cooling! It also takes a little more time and effort...

It's an addiction! But it's a money-pit hobby... What else would I need? A girlfriend, to keep me totally confused at all times?


----------



## MEC-777

Quote:


> Originally Posted by *JourneymanMike*
> 
> I agree that some AIO's come really close to a custom loop for the CPU. I started with a Corsair H100i and then quickly switched to a Swiftech H220, which gave me all most 10c lower temps on the CPU than the Corsair...
> 
> That was humble my start. So, I decided to go full blown custom. Then I bought a CaseLabs case w/pedestal (so I'd have enough room for the loop) and went nuts! Thousands of $ spent on WC blocks, fittings and alike. It works great, looks good and I still want more better cooling! Also it takes a little more time and effort...
> 
> It's an addiction! But it's money pit hobby... What else would I need? A girlfriend, to keep me totally confused at all times?


That's awesome and I know the feeling when you start upgrading and keep wanting/adding more and more.







My PC started off as a modest little mini-ITX rig in a Node 304. Now, well, you can see the specs in my sig. I don't have the kind of money required to build a huge fully custom loop system like you've done (and I don't really want to build something like that anyways), but I did want a high-performance rig that runs quiet and looks good. AIO's suit my needs and budget and that's the route I took.









I just don't see the logic in calling an AIO solution "crappy" when its performance-to-cost value is really good, especially when it comes to cooling GPUs. @Mega Man You know what IS crappy? The Hawaii (290/X) reference coolers. THOSE are crappy.


----------



## Mega Man

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> Actually I have heard of 10+c from changing paste. I never said everything hence the may.
> 
> And that crappy aio is why you will never have a golden card
> 
> 
> 
> I've never heard of temps being dropped 10 degrees on a reference Hawaii by simply changing the paste. I find it amusing that you defend the horrible reference cooler and then call an AIO solution crappy when it drops temps by 40 degrees +.
> 
> And what? Golden card? What are you talking about?
> 
> Like I said, if custom loop is your thing and you have the cash for it, by all means. But I don't see where you get off calling an AIO solution crappy when it can get the card within 5 degrees of a custom loop for a fraction of the cost.
> 
> Also, it helps if you quote the person you're talking/replying to.

Yes, I know you have never had a golden card. That is why I said that.

And since you never heard of a 10C drop, obviously it can't happen. Come on.

Lastly, just because you can't read one post and then the next doesn't mean I need to quote the post directly above mine, sorry.


----------



## JourneymanMike

Quote:


> Originally Posted by *MEC-777*
> 
> That's awesome and I know the feeling when you start upgrading and keep wanting/adding more and more.
> 
> 
> 
> 
> 
> 
> 
> My PC started off as a modest little mini-ITX rig in a Node 304. Now, well, you can see the specs in my sig. I don't have the kind of money required to build a huge fully custom loop system like you've done (and I don't really want to build something like that anyways), but I did want a high-performance rig that runs quiet and looks good. AIO's suit my needs and budget and that's the route I took.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I just don't see logic in calling an AIO solution "crappy" when it's performance to cost value is really good - especially when it comes to cooling GPUs. @Mega Man You know what IS crappy? *The Hawaii (290/X) reference coolers*. THOSE are crappy.


I had to save to get where I am today with my build; I even got a loan to finish it!

Yes, the 290X reference stock coolers make a lot of racket! Especially @ 100%.









I had two of them in CFX on my test bench... My idea was to water cool them, so I got 2 Aquacomputer Kryographics water blocks with active backplates... Very nice...

Reference coolers



Beautiful water blocks


----------



## MEC-777

Quote:


> Originally Posted by *Mega Man*
> 
> Yes I know you have never had a golden card. That is why I said that.
> 
> And since you never heard of a 10c drop obviously it can't happen. Come on.
> 
> Lastly if you can not read one post then the next does not mean I need to quote the post directly above mine sorry


So you won't tell me what a golden card is? OK.

I never said it can't happen; I said I've never heard of it happening before on these cards. 85 degrees is still hella hot, IMO, and the cooler is still sub-par. It's like splashing a cup of water on a huge bonfire.









Quoting just helps keep track of the conversation for everyone since new posts with new questions can pop up at any time in this thread.
Quote:


> Originally Posted by *JourneymanMike*
> 
> I had to save to get where I am today with my build, I even got a loan to finish it!
> 
> Yes, the 290x reference stock coolers make a lot of racket! Especially @ 100%
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I had two of them in CFX on my test bench... My idea was to water cool them, so I got 2 Aquacomputer Kryographics water blocks with active back plates... Very nice...
> 
> Reference coolers
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Beautiful water blocks
> 
> 
> Spoiler: Warning: Spoiler!


Those are awesome waterblocks. Very fitting for the Hawaii cards with the palm trees and all.


----------



## JourneymanMike

Quote:


> Originally Posted by *MEC-777*
> 
> Those are awesome water blocks. *Very fitting for the Hawaii cards with the palm trees and all.*


A golden card is one that you can get stable, high overclocks on...

They even have the Hawaiian islands machined into the water chamber...






You can see the islands right next to the palm tree plaque.

They had a video of it being machined, but I couldn't find it...


----------



## MEC-777

Quote:


> Originally Posted by *JourneymanMike*
> 
> Golden card is one that you can get stable, high overclocks on...


I assumed as such. Just never heard anyone call it that before.









I will make the argument to @Mega Man that you'd still do far better with the "crappy" AIO solution, getting pretty darn close to a custom loop on a golden card, than you would on the air/stock cooler, and again, for a fraction of the cost of a custom loop. So I still fail to see how the AIO solution is "crappy"...


----------



## mathesar

Quote:


> Originally Posted by *MEC-777*
> 
> I assumed as such. Just never heard anyone call it that before.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I will make the argument to @Mega Man that you'd still do far better on the "crappy" AIO solution and pretty darn close to a custom loop on a golden card than you would on air/stock cooler - and again, for a fraction of the cost of a custom loop. So I still fail to see how the AIO solution is "crappy"...


Yeah, to be honest I'm not interested in overclocking the card; at least I haven't seen a reason to yet... it's already running my games excellently now.

I'm only pushing 1080p/60Hz monitors at the moment and only have one display active while gaming.

The Aquacomputer block looks great, but how much are we talking $$$-wise for *everything* that's required if you're starting from scratch?

If an AIO is going to get rid of the hair dryer sound from my case, then it seems good enough to me lol.


----------



## MEC-777

Quote:


> Originally Posted by *mathesar*
> 
> Those Aquacomputer blocks looks nice and all but the price for one plus everything else required puts them in the Hell No category for me.
> Yea to be honest I'm not interested in overclocking the card, at least I haven't seen a reason to yet.. it's already running my games excellent now.
> 
> I'm only pushing 1080p / 60hz monitors at the moment and only have one display active while gaming.
> 
> The Aquacomputer block looks great but how much are we talking $$$ wise for *everything* that's required if you're starting from scratch?
> 
> If an AIO is going to get rid of the hair dryer sound from my case then seems good enough to me lol.


It really depends on how far you want to go. The HG10 + H55 is by far the best in terms of price to performance. But if you want to go full custom loop, I'm guessing you're looking in the neighborhood of $350-400+ for the basics.


----------



## Mega Man

@MEC-777

You can talk to sugarhell or tsm about golden cards. You will not get them on AIOs. Ever.
You also have obviously not read the 290 thread from the beginning.

VRMs and core need to be in the 40s.

I have been here since the beginning.

The only reason I did not upgrade to quad Fury X is my newborn, but I still keep up with them.

@mathesar
Going from scratch kills watercooling from a cost standpoint. I know; I have more in rads in just one build than most put into their whole build. (Current builds (non-server) are all CaseLabs: M8 + ped, TH10, TX10, and (2) S3s.)
However, it does not have to be so bad.

GPU only: use an MCR140-X and a GPU block.

Or an H220-X and a GPU block.

Otherwise I would look at a small res, pump, rad, tubing (you can buy cheap), fittings, and blocks (CPU/GPU).

I can do a small kit from $250 to several thousand (my TH10 has (5) 480s (3x Monsta, 1x UT60, 1x XT45), 2x MCP35X2, 1 CPU block and 2 295X2 blocks).

But on the other hand, everything but the GPU blocks can be used in the next build.


----------



## mfknjadagr8

Quote:


> Originally Posted by *mathesar*
> 
> Yea to be honest I'm not interested in overclocking the card, at least I haven't seen a reason to yet.. it's already running my games excellent now.
> 
> I'm only pushing 1080p / 60hz monitors at the moment and only have one display active while gaming.
> 
> The Aquacomputer block looks great but how much are we talking $$$ wise for *everything* that's required if you're starting from scratch?
> 
> If an AIO is going to get rid of the hair dryer sound from my case then seems good enough to me lol.


Yeah, it will remove the hair dryer sound. If you look around you might find a better air cooler that fits reference cards from someone who went water. If you aren't overclocking at all, there are a lot of air coolers that will keep temps in check and not be extremely loud. Most of us here overclock everything to hell and back, so water is an excellent choice. The real advantages of a full water loop with full-cover blocks are not only cooler load temperatures on the core but also cooler VRM and memory temperatures. This lets you get that extra overclock and still stay below what most air coolers can achieve, and all of the components SHOULD last longer from not being stressed by heat. Now, AIOs can be a cheaper option, but you can't push the overclock as high without proper cooling on the VRMs, so you buy heatsinks and fans, and a full-cover block still outperforms that in most cases. Plus, since you did it on the cheap, the risk of failure on a cheap AIO is higher than with quality components in a custom loop or a quality AIO expanded into a custom loop. If you aren't overclocking, will you see benefits? Sure, but nothing like you would if you were overclocking, so it's not cost efficient.


----------



## MEC-777

Quote:


> Originally Posted by *Mega Man*
> 
> @MEC-777
> 
> You can talk to sugarhell or tsm. About golden cards. You will not get them on aios. Ever
> You also have obviously not read the 290 thread from the beginning.
> 
> Vrms and core need to be in 40s.
> 
> I have been here since the beginning.
> 
> Only reason I did not upgrade to quad fury x is my newborn but I still keep up with them.


The person who was asking (@mathesar) was not interested in overclocking and obtaining a golden card. They just want a better solution than the reference cooler. An AIO solution is the best in terms of cooling performance per cost.

Sorry, I don't have time to read through 3,957 pages of this thread, but that doesn't mean I don't know what I'm talking about. I'm not arguing that you can get a golden card on an AIO; in fact, I agree with you that you can't. That's not the point I'm disputing. Calling an AIO solution "crappy" is blatantly misleading. Anyway, I think I've made my point (several times now), so I'm going to leave it at that.

You've been here since the beginning? That's nice.


----------



## MEC-777

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Yeah it will remove the hair dryer sound...if you look around you might find a better air cooler to fit reference from someone who went water...if you aren't overclocking at all there are a lot of air coolers that will keep temps in check and not be extremely loud....most of us here overclock everything to hell and back so water is an excellent choice...the real advantages to full water loop with full cover blocks are not only cooler load temperatures on cores but cooler vrm and memory temperatures...this allows you to get that extra overclock and still remain below what most air coolers can achieve and all of the components SHOULD last longer due to not being stressed from heat...now aios can be a cheaper option but you can't push overclock as high without proper cooling on vrms so you buy heatsink and fans and the full cover block still outperforms this in most cases...plus since you did it on the cheap the risk of failure on a cheap aio is higher than quality components in a custom loop or a quality aio expanded to a custom loop...if you aren't overclocking will you see benefits sure but nothing like you would see if you were overclocking it's not cost efficient...


I just want to clarify: the HG10 was designed for the reference 290/X. It has a midplate that covers all the VRMs (VRM1 and VRM2) and all the memory modules, and it uses the stock blower fan to cool that midplate. Temps on the core and VRMs, from what I've observed myself, have dropped by 40 degrees or more. So there's no need to add extra heatsinks to the VRMs etc. with the HG10, and the results are dramatic.

For those of us who do like to play around with overclocking but don't need to push it to the ragged edge with modded BIOS and overvolting (beyond normal OC limits) etc., this is an excellent cooling solution.


----------



## mfknjadagr8

Quote:


> Originally Posted by *MEC-777*
> 
> I just want to clarify; the HG10 was designed for the reference 290/X's. It has a mid plate that covers all the VRMs (1 and 2) and all the memory modules. It uses the stock blower fan to cool the mid plate. Temps on the core and VRMs from what I've observed myself have dropped 40 degrees +. So there's no need to be adding additional heatsinks to the VRMs etc. with the HG10 and the results are dramatic.
> 
> For those of us who do like to play around with overclocking but don't need to push it to the ragged edge with modded BIOS and overvolting (beyond normal OC limits) etc., this is an excellent cooling solution.


For what it is, it does do well, I don't dispute that, but full-cover blocks have several advantages, with the only real drawback being startup cost. In most cases an expandable AIO can move from build to build, as can a budget custom loop, so there's that too. But I wouldn't trust the cheaper AIOs on my card, especially if I bought something "top of the line". Also, the stock blower fan can be pretty loud even at lower speeds, but it's good to know it at least has some VRM cooling.
Edit: the H55 is meh. Depending on the card, you can get a full-cover block and an H220-X for around $225 if you shop around. If you buy used parts, say from the forums here, you could put together a setup for less. A lot of the watercoolers here spend bank on items and then sell them cheaply to help fund their next upgrades. I've gotten quite a few good deals here; in fact, I got both of my 290s for around $300.


----------



## mathesar

I appreciate the replies from everyone. I'm not 100% sure which cooling solution I'm going with yet, but the Corsair method seems like a decent choice so far. I do have a new question about the 290X's fan behavior.

If I let the card run with the default auto fan control, the RPMs won't increase any higher than 20% until it gets above 90°C. This is my first AMD card in years, so I'm used to the way Nvidia coolers work. Is this "stuck at 20% RPM" thing normal behavior for a 290X?

Because of the card's unwillingness to ramp up RPMs until it's near peak temps, it tends to run very hot even when not gaming. I watched a few 1080p 60 FPS YouTube videos and the card was running 84°C @ 20% fan RPM; when I closed the browser, the GPU temp slowly dropped to around 52°C idle.

From what I've read online, the card will lower its clock speeds to keep the temp from exceeding the max GPU threshold. That makes sense, but then why isn't the fan control trying a little harder to prevent this from happening? It waits until the card is right at peak temp before reacting.









Just making sure this is normal behavior,

Thanks!


----------



## kizwan

Quote:


> Originally Posted by *mathesar*
> 
> I appreciate the replies from everyone I'm not 100% sure which cooling solution I'm going with yet but the Corsair method seems like a decent choice so far, I do have a new question about the 290X's Fan behavior.
> 
> If I let the card run with the default Auto fan control the RPM's wont increase any higher than 20% until it reaches above 90C Temps (?) This is my first AMD card in years so I'm used to the way Nvidia coolers work., This "stuck" at 20% RPM thing is new to me, or is this not normal behavior for a 290X?
> 
> Because of the cards unwillingness to ramp up RPMs until its near peak temps it tends to run very hot even when not gaming, I watched a few 1080P 60FPS YouTube videos and the card was running 84C @ 20% Fan RPM, Then when I closed down the browser GPU temp slowly dropped to around 52C Idle.
> 
> From what I've read online the card will lower its clock speeds in order to keep the temp from exceeding the Max GPU threshold temp..This makes sense but then why isn't the fan control trying a little harder to prevent this from happening, It's waiting until the card is right at peak temp before reacting?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just making sure this is normal behavior,
> 
> Thanks!


The 290X has Quiet and Uber modes (BIOS switch). In Quiet mode the fan is capped at 40% max, while in Uber mode it's capped at 55% max. Did you select Uber mode? Move the switch toward the I/O plate for Quiet mode, or away from the I/O plate for Uber mode. You have two other options: 1) create a custom fan profile (run the fan at a certain speed at a certain temp), or 2) set a fixed fan speed using any monitoring software when playing games.
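For option 1, a custom fan profile is really just a handful of (temperature, fan %) points that the software linearly interpolates between. Here's a rough sketch of that interpolation in Python; the curve points are made-up examples for illustration, not AMD's stock profile:

```python
# Example curve: (core temp in C, fan duty %). Hypothetical values --
# tune to taste; these are NOT AMD's defaults.
CURVE = [(40, 20), (60, 35), (75, 55), (90, 85), (94, 100)]

def fan_duty(temp_c, curve=CURVE):
    """Linearly interpolate fan duty % between the curve points."""
    if temp_c <= curve[0][0]:
        return curve[0][1]          # below the curve: minimum duty
    if temp_c >= curve[-1][0]:
        return curve[-1][1]         # at/above the top point: max duty
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
```

The point is just that with a curve like this the fan is already at ~55% by 75°C, instead of sitting at 20% until the card hits the throttle point.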


----------



## mfknjadagr8

Quote:


> Originally Posted by *kizwan*
> 
> The 290X has Quiet and Uber modes (BIOS switch). In Quiet mode the fan is capped at 40% max, while in Uber mode it's capped at 55% max. Did you select Uber mode? Move the switch toward the I/O plate for Quiet mode, or away from the I/O plate for Uber mode. You have two other options: 1) create a custom fan profile (run the fan at a certain speed at a certain temp), or 2) set a fixed fan speed using any monitoring software when playing games.


I used the second method, as the default fan profile didn't ramp up the speeds fast enough. MSI Afterburner lets you do both, and I'm sure other programs do the same. The fan curve is probably the best way. I set Afterburner to use a fan curve on the 2D profile, then set the 3D profile to a fixed number that keeps my temps in check no matter the load. The reason the default profiles aren't set to ramp up steadily is the noise that can create, and they state 94°C is acceptable. The problem with that is a lot of cards will throttle, and some black screen too, at default settings because the fan profile doesn't get the heat out fast enough.


----------



## rdr09

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I used the second method as the default fan profile didn't ramp up the speeds fast enough...msi afterburner allows you to do both...but I'm sure other programs do the same...the fan curve is probably the best way...I set afterburner to use fan curve on 2d profile then set 3d profile to a set number that would keep my temps in check no matter the load...the reason the default profiles aren't set to ramp up steadily is because of the noise profile that can create and they state 94 is acceptable...problem with that is a lot of cards will throttle and some black screen to at default settings because the fan profile doesn't get the heat out fast enough


Very true. I tested my second reference 290 before putting the full block on it. It was up against the PSU (crossfired) and it was throttling in C3, causing the usage to fluctuate on both GPUs. I left the fan on auto, though.


----------



## MEC-777

Quote:


> Originally Posted by *mfknjadagr8*
> 
> for what it is it does do well I don't dispute that but full cover blocks have several advantages with the only real drawback being startup cost....most cases the expandable aio can move from build to build as well as a budget custom loop so there's that too...but I wouldn't trust the cheaper aios on my card...especially if I bought something "top of the line"....also the stock blower fan can be pretty loud even at lower speeds but it's good to know at least it does have some vrm cooling...
> Edit:h55 is meh...depending on the card you can get full cover block and h220x for around 225 if you shop around...it's you buy used parts from say the forums here you could part together a setup for less...a lot of the watercoolers here spend bank on items then sell them cheaply to help fund their next upgrades...I've gotten quite a few good deals here...in fact I got both of my 290s for around $300


The stock blower fan at a constant 30% (which is all you need with the HG10) is pretty much inaudible.









I've explored the idea of going full custom loop, but what keeps me from going that route, aside from cost, is the ease of swapping parts out. Should a problem arise where I need to remove one card or both, I can still do that easily with the AIOs. It's basically the "next best thing" to a custom loop. I definitely do want to try building a custom loop one day, though. Maybe on my next major system upgrade which will be a few years down the road.


----------



## mfknjadagr8

Quote:


> Originally Posted by *MEC-777*
> 
> The stock blower fan at 30% constant (which is all you need with the HG10) is pretty much inaudible.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've explored the idea of going full custom loop, but what keeps me from going that route, aside from cost, is the ease of swapping parts out. Should a problem arise where I need to remove one card or both, I can still do that easily with the AIOs. It's basically the "next best thing" to a custom loop. I definitely do want to try building a custom loop one day, though. Maybe on my next major system upgrade which will be a few years down the road.


Well, one thing you can do to fix this is QDCs, or quick disconnects, on the card. This is how the Predator 360 is set up, and they will be selling prefilled waterblocks for easy expansion, no fuss no muss. It's a great idea, but nothing really beats the experience of learning to do things to the loop yourself. It teaches you what to do if you have a problem and helps you understand watercooling a bit better, even if it is more work.


----------



## MEC-777

Quote:


> Originally Posted by *mfknjadagr8*
> 
> well one thing you can do to fix this is qdcs or quick disconnects on the card this is how the 360 predator is setup and they will be selling prefilled waterblocks for easy expansion no fuss no muss...great idea but nothing really beats the experience of learning to do things to the loop yourself it's an experience that allows you to know what to do if you have a problem and understand water cooling a bit better even if it is more work


Yeah, QDCs would definitely make things easier; it just means no hard-line tubing, though.









Next upgrade for me is the monitor. After that... maybe custom loop...


----------



## obiwan kanobi

please add me....














I upgraded my rig from an HD 6950 to an R9 290. I swapped the DirectCU II cooler for a Prolimatech MK-26 with two 140mm BitFenix Spectre Pros. Idle temps dropped to 37-39°C, and the max I get is around 78°C; the card won't reach 80°C, unlike with the DirectCU II. Usually I get 50-56°C.


----------



## By-Tor

Powercolor released the dual core 390 II Devil 13..

http://www.newegg.com/Product/Product.aspx?Item=N82E16814131680


----------



## buttface420

Quote:


> Originally Posted by *By-Tor*
> 
> Powercolor released the dual core 390 II Devil 13..
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814131680


nice!


----------



## mfknjadagr8

Quote:


> Originally Posted by *obiwan kanobi*
> 
> 
> 
> please add me....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i upgrade my rig from hd6950 to r9 290.. i swapped direct cuii to prolimatech mk-26 with 2 140mm bitfenix spectre pro... idle temp drop to 37-39c.. max i get around 78c. the card wont reach 80 unlike Direct CUii... usually i get 50-56c.


Dang, that thing is like a four-slot card now... hope you aren't going to crossfire, you'd have to lose the fans, lol.

@By-Tor They really need to stop calling the dual-GPU variants 16GB when they are two 8GB GPUs that can't use more than 8; it's very misleading. If DX12 were here and actually did combine the RAM into one pool, then I would be OK with that, but otherwise... that sucker is a beast!


----------



## By-Tor

Quote:


> Originally Posted by *mfknjadagr8*
> 
> DAng that thing is like a four slot card now...hope you arent going to crossfire youd have to lose the fans lol
> 
> @By-Tor they really need to stop calling the dual card variants 16 gb when they are two 8gb cards that cant use more than 8 its very misleading..... if dx12 were here and actually did combine the ram into one pool then i would be ok with that but... otherwise that sucker is a beast!


Yeah, sounds like a ploy to sell more cards, but I don't see somebody who doesn't know what they are buying just dropping $800 on any card.

I wish someone would produce a water block for it.

It's damn sexy...


----------



## Gumbi

@obiwan... sweet mod, man. Your post was a bit unclear... you get 50-56°C load temps now? Sick! How hot do the VRMs get? You seem to have VRM cooling mounted, which is cool!

100mV is a lot for a 1120MHz core; are you sure you need that much to keep 1120MHz stable? I can do that at 25mV on my card.


----------



## mathesar

Quote:


> Originally Posted by *kizwan*
> 
> The 290X has Quiet and Uber modes (BIOS switch). In Quiet mode the fan is capped at 40% max, while in Uber mode it's capped at 55% max. Did you select Uber mode? Move the switch toward the I/O plate for Quiet mode, or away from the I/O plate for Uber mode. You have two other options: 1) create a custom fan profile (run the fan at a certain speed at a certain temp), or 2) set a fixed fan speed using any monitoring software when playing games.


I've had it set on Uber, but the fan RPM still doesn't move from 20% until the card is near max (93-94°C). I'm using MSI Afterburner now with a custom fan profile, which seems to be working out pretty well. I'm still fiddling with the curve here and there. Thanks.


----------



## obiwan kanobi

@Gumbi 50-60°C if I turn vsync on, high 70s if I turn vsync off. Most games I run at high/medium settings. My GPU will crash with lower voltage; that screenshot I uploaded is the highest I can get with this card. I keep it down to 1047MHz core and 1350MHz memory. VRAM temp is acceptable: the Asus cooler comes with a heatsink, and I stacked another heatsink on top of it. The highest VRAM temp recorded was in the high 70s; can't remember all the numbers.


----------



## rdr09

Quote:


> Originally Posted by *JourneymanMike*
> 
> I had to save to get where I am today with my build, I even got a loan to finish it!
> 
> Yes, the 290x reference stock coolers make a lot of racket! Especially @ 100%
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I had two of them in CFX on my test bench... My idea was to water cool them, so I got 2 Aquacomputer Kryographics water blocks with active back plates... Very nice...
> 
> Reference coolers
> 
> 
> 
> Beautiful water blocks
> 
> 
> Spoiler: Warning: Spoiler!


That bridge looks so much better, IMO.


----------



## Bajloz

Quote:


> Originally Posted by *By-Tor*
> 
> Powercolor released the dual core 390 II Devil 13..
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814131680


Do you have any reviews for that card?


----------



## megax05

I am getting my 2x R9 290 Tri-X today. Do you think those cards are capable of the 390's stock clocks?


----------



## fyzzz

Quote:


> Originally Posted by *megax05*
> 
> I am getting my 2x r9 290 tri-x today do think those cards are capable of 390s stock clocks


They can probably handle the core clock but maybe not the memory speed. There are cards that will go as high as 1150MHz on the core without added voltage. I have seen memory around 1500-ish, but mostly in the 1400 range without added voltage.


----------



## Streetdragon

Could it be that MSI AB is very buggy on Windows 10 with Crossfire?
It won't set the clock speeds on startup (Overdrive is disabled; it only changes the speeds on startup if I enable Overdrive in CCC).
And the voltage won't change either. From time to time it changes the voltage on the first card to what I set, but the second card stays on stock voltage.
On top of that, after a restart I can't change the voltage on my second card anymore. I have to change the voltage settings in AB's settings menu to re-enable the voltage bar.

I found a "solution"... can I even call it a solution? I don't know.
In Windows autostart I put a batch file where I manually adjust the voltage to +100 (the same way you'd add +200 or more voltage in MSI AB, with the /wi6 command).

Is there another way to fix the MSI AB issues? The batch file works, but I think this solution is ugly... I can post the batch file if wanted.


----------



## kizwan

When you set AB to run at startup, it creates a task in Task Scheduler. Set a delay in that task so that AB starts late.


----------



## By-Tor

Quote:


> Originally Posted by *Bajloz*
> 
> do you got eny reviw for that card ?


Haven't seen any yet, but they will surface I'm sure.

Like the 290X version, it takes four 8-pin power connectors and uses 15 power phases...

Info I found

http://techreport.com/news/28976/run-with-powercolor-devil-13-dual-core-r9-390-graphics-card


----------



## Agent Smith1984

Quote:


> Originally Posted by *By-Tor*
> 
> Yeah sounds like a selling ploy to sell more cards, but I don't see somebody that doesn't know what they are buying just droping down $800 for any card.
> 
> I wish someone would produce a water block for it.
> 
> It's Damn sexy.....


This http://www.newegg.com/Product/Product.aspx?Item=N82E16814131584&cm_re=devil_13-_-14-131-584-_-Product

Is $100 cheaper after rebate, and is essentially a faster card, minus the 4GB of VRAM......

Why did PowerColor gimp the newest Devil 13 by using Hawaii Pro??

And why did they gimp the VRAM by using slower clocks than the current 390's offer?

It's nice, but a bad buy in my opinion, I'd take the 290X II Devil 13 all day long over that one.


----------



## By-Tor

Quote:


> Originally Posted by *Agent Smith1984*
> 
> This http://www.newegg.com/Product/Product.aspx?Item=N82E16814131584&cm_re=devil_13-_-14-131-584-_-Product
> 
> Is $100 cheaper after rebate, and is essentially a faster card, minus the 4GB of VRAM......
> 
> Why did PowerColor gimp the newest Devil 13 by using Hawaii Pro??
> 
> And why did they gimp the VRAM by using slower clocks than the current 390's offer?
> 
> It's nice, but a bad buy in my opinion, I'd take the 290X II Devil 13 all day long over that one.


Not sure why myself, but I think I'll just stay with my pair of 290x's anyway...


----------



## Agent Smith1984

Quote:


> Originally Posted by *By-Tor*
> 
> Not sure why myself, but I think I'll just stay with my pair of 290x's anyway...


Absolutely!

Anyone running a pair of 290s and NOT hitting any VRAM limitations should stick with that setup for a while, because a move from that in any direction just isn't worth the $$$ right now...


----------



## MEC-777

Quote:


> Originally Posted by *Streetdragon*
> 
> Could it be, that MSI AB is very buggy on Windows10/Crossfire?
> It wont set the Clock speed on startup (Overdrive is disabled, only change the speeds on startup, if i enable Overdrive in CCC)
> And the Voltage wont change too. From time to time it change the voltage on the first card to what i set up but the secound is still on stock voltage.
> Add to this, after a restart, i cant change the voltage on my second card anymore. I have to change the voltage settings in the settings menü of AB to enable the voltagebar.
> 
> I found a "Solution" hmmm can i say solution to it? hmmm dont know^^
> In autostart of windows i put in a batch file, where i manualy adjust the Voltage to +100 (like if you wanna add +200 or more voltage in MSI AB, with the /wi6 command)
> 
> Is there a other way to fix the MSI AB issus? The batchfile works. but i think this solution is ugly... Can post the batchfile if wanted.


I had similar issues when I upgraded to crossfire 290's recently. Took me a while to "get it just right".

Here's what I did:
Make sure the "Sync settings" in AB is checked, so it applies the same clocks/voltages etc. to both cards. To overcome powertune/overdrive and prevent those features from overriding AB settings, I had to do the following.
-Create a preset in CCC for +0 power limit in overdrive.
-Create another preset in CCC for +50 power limit in overdrive.
-Assign a hotkey combo for both presets.
-With the power limit set to +0, go into CCC overdrive and disable it.
-Now hit the hotkey combo preset for the +50 power setting. This will re-enable overdrive, but it's OK.
-It will now no longer override AB settings when you apply overclocks.

It's also important to note that when you re-apply the stock clocks/settings in AB, you must then hit the hotkey combo for the +0 power preset in Overdrive, open Overdrive again, disable it, then hit the hotkey combo for the +50 power preset again to circumvent PowerTune once more.

It took a lot of trial and error to figure this all out and it's frustrating this is what has to be done to make crossfire work properly, but it is what it is. My one card does not throttle at all, while my other card does - even when temps are WELL below thermal limits. I don't know why, but there it is. This is the only way I've been able to get both cards to run at the same fixed clocks.

Hope this helps.


----------



## Streetdragon

Quote:


> Originally Posted by *kizwan*
> 
> When you set AB to run at startup, it create a task in Task Scheduler. Set a delay in the task, so that AB will delay start.


Quote:


> Originally Posted by *MEC-777*
> 
> I had similar issues when I upgraded to crossfire 290's recently. Took me a while to "get it just right".
> 
> Here's what I did:
> Make sure the "Sync settings" in AB is checked, so it applies the same clocks/voltages etc. to both cards. To overcome powertune/overdrive and prevent those features from overriding AB settings, I had to do the following.
> -Create a preset in CCC for +0 power limit in overdrive.
> -Create another preset in CCC for +50 power limit in overdrive.
> -Assign a hotkey combo for both presets.
> -With the power limit set to +0, go into CCC overdrive and disable it.
> -Now hit the hotkey combo preset for the +50 power setting. This will re-enable overdrive, but it's OK.
> -It will now no longer override AB settings when you apply overclocks.
> 
> It's also important to note that when you re-apply the stock stock clocks/settings in AB, you then must hit the hotkey combo for the +0 power preset in over drive, open overdrive again, disable overdrive, then hit the hotkey combo for the +50 power preset again to circumvent powertune once more.
> 
> It took a lot of trial and error to figure this all out and it's frustrating this is what has to be done to make crossfire work properly, but it is what it is. My one card does not throttle at all, while my other card does - even when temps are WELL below thermal limits. I don't know why, but there it is. This is the only way I've been able to get both cards to run at the same fixed clocks.
> 
> Hope this helps.


Tried both. Now it clocks to the speed I want, but my second card still resets its voltage to stock...

I think this is the problem: I enabled voltage control in MSI AB, but after a restart the voltage bar becomes unusable. Afterburner, why do you hate me?

Edit:
The batch to get it working:

Code:

CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /sg0 /wi6,30,8d,10 /sg1 /wi6,30,8d,10
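Since the trouble only shows up right after a restart, one workaround (just a sketch; it assumes Python is installed and uses the default Afterburner install path, and the /wi6 register bytes are copied verbatim from the batch above, not something I can vouch for on other cards) is a small wrapper that waits for the driver to settle before issuing the same writes:

```python
import os
import subprocess
import time

# Default install path -- adjust if Afterburner lives elsewhere.
AB_PATH = r"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe"

def build_args(value="10"):
    """Build the same command line as the batch file.

    /sgN selects GPU N; /wi6,30,8d,<value> performs the I2C write that
    applies the voltage offset (bytes taken from the batch file above).
    """
    return [AB_PATH,
            "/sg0", f"/wi6,30,8d,{value}",
            "/sg1", f"/wi6,30,8d,{value}"]

if __name__ == "__main__" and os.path.exists(AB_PATH):
    time.sleep(30)                      # let AB and the driver finish loading
    subprocess.run(build_args(), check=False)
```

If the Task Scheduler delay kizwan mentioned does the job, that's simpler; this is only an alternative if you want to keep the batch-style approach.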


----------



## kizwan

Did you disable ULPS?


----------



## Streetdragon

Yep, disabled everything. Tried every setting and trick I found on the net, but somehow I'm not lucky. ^^


----------



## buttface420

I'm about to install an H55 on an HG10 on my 290X. For those of you who have done this, can you please tell me if you used the plastic pieces in the retention ring, or did you just use the screws? I'm about to install it without them.


----------



## MEC-777

Quote:


> Originally Posted by *buttface420*
> 
> im about to install a h55 to a hg10 on my 290x...for those of you who have done this can you please tell me if you used the plastic pieces into the retention ring or did you just use the screws? im about to use it without them


Installed mine without the plastic pieces. The instructions don't mention using them. Just be careful to make sure the block is centered on the GPU die when you tighten the screws and do them up in a cross pattern, each a little at a time.


----------



## NicksTricks007

Finally upgraded from my 2GB GTX 770 to an R9 290X. I'm enjoying the performance difference so far. Here's the GPU-Z link and information needed so I can join:

GPU-Z Validation

Manufacturer: XFX
Model/Series: R9 290X
Cooling: Double Dissipation


----------



## 00Smurf

Quote:


> Originally Posted by *buttface420*
> 
> im about to install a h55 to a hg10 on my 290x...for those of you who have done this can you please tell me if you used the plastic pieces into the retention ring or did you just use the screws? im about to use it without them


Check out my prior post. I did this conversion on five 290Xs; I used the zip-tie method, which took all of 8 minutes and worked very well.


----------



## buttface420

Okay, got the H55 installed. I'm maxing out at 67°C in BF4, where before I was hitting 95°C! Does that sound right, or should it be cooler? If anything, it's wayyyyyyyy quieter.


----------



## battleaxe

Quote:


> Originally Posted by *buttface420*
> 
> okay got the h55 installed im maxing out at 67c in bf4 which before i was hitting 95c! does that sound right or should it be cooler? if anything its wayyyyyyyy quieter


What are your clock and voltage settings? That's the only way I can tell you whether those numbers are good.

For example, at stock clocks and voltage mine never goes over 61°C on the core. If you are OCing at all, though, that could explain it.

I used the X-bracket from the H80 cooler and two long screws that go through to the back of the PCB. The X-bracket sits on top of the GPU; I drilled some holes in it so it would line up with the holes on the H80 pump head, then used long screws and rubber grommets on the back to protect the PCB. Results are better than I ever got with the zip-tie method.


----------



## buttface420

Quote:


> Originally Posted by *battleaxe*
> 
> What are your clocks and voltage settings. Only way I can tell you if those numbers are good enough.


Everything stock... I haven't increased fan settings yet though, still testing.


----------



## battleaxe

Quote:


> Originally Posted by *buttface420*
> 
> everything stock..i havnt incresed fan settings yet tho..still testing


I think you could get better temps, then. Something is not quite seated right; it should be lower, IMO. What is your case airflow like?

Might not be a bad idea to take it apart and see how well the thermal paste spread... what mounting method are you using?


----------



## buttface420

im using a hg10 bracket, it just screws in. i did however use the stock thermal paste that the h55 came with; im going to try using mx-4 and reseating it!


----------



## battleaxe

Quote:


> Originally Posted by *buttface420*
> 
> im using a hg10 bracket, it just screws in. i did however use the stock thermal paste that the h55 came with; im going to try using mx-4 and reseating it!


Yeah, seems it should be a bit lower then. That bracket typically mounts up pretty well from what I have heard. Just make sure to use a non-conductive paste, as it can bleed out onto the contacts close to the core. Also, make sure you don't have any corners left uncovered, as this happens easily and will cause temps to rise and lead to instability issues. (I've done it a few times, so I found out the hard way.)

I always hold the top of the water pump and the back of the GPU together, front to back, with one hand. Then I tighten the bolts with my other hand. It makes good contact without using too much force.


----------



## megax05

I did that in my old rig. I didn't see mid-60s till I OC'd to 1170/1450. It was always mid-50s at 23c ambient.


----------



## diggiddi

Quote:


> Originally Posted by *MEC-777*
> 
> Installed mine without the plastic pieces. The instructions don't mention using them. Just be careful to make sure the block is centered on the GPU die when you tighten the screws and do them up in a cross pattern, each a little at a time.


The HG10 works with the Zalman LQ315, right? Is there a list of compatible AIOs? I'm thinking of going this route with an Aqua Computer passive backplate,
either that or a Corsair H80i.


----------



## battleaxe

Quote:


> Originally Posted by *megax05*
> 
> I did that in my old rig. I didn't see mid-60s till I OC'd to 1170/1450. It was always mid-50s at 23c ambient.


Yes, there are some differences between the cards themselves. I have one that won't hit 60c no matter what voltage I put through it, even on an H80 AIO. The other one, though, runs about 10C hotter. The difference is in the voltage: one card runs about .1v higher on the stock VBIOS. No idea why, it's just always been that way. So each user can experience slightly different results. The majority, though, would be around 60c on the core at normal volts. This is after gaming for over an hour, not just a short bench run. Ambient plays a huge part as well.


----------



## buttface420

yeah im still hitting 68c in short runs on bf4... i don't know if my pump or fan are running as fast as they should be; it's way too quiet. i think hwinfo showed the pump at 1022rpm maximum but the fan at 670rpm maximum... probably just something silly im overlooking


----------



## battleaxe

Quote:


> Originally Posted by *buttface420*
> 
> yeah im still hitting 68c in short runs on bf4... i don't know if my pump or fan are running as fast as they should be; it's way too quiet. i think hwinfo showed the pump at 1022rpm maximum but the fan at 670rpm maximum... probably just something silly im overlooking


That fan speed sounds a bit low. But that's about 5c max I would guess. That would get you to 63c though, so maybe that would be enough.


----------



## buttface420

lol nevermind guys.. i had a blonde moment that lasted the whole day apparently... the pump and fan were running slow; i had to go into the bios and set them from low mode to high mode lmao. max temps now 52c


----------



## battleaxe

Quote:


> Originally Posted by *buttface420*
> 
> lol nevermind guys.. i had a blonde moment that lasted the whole day apparently... the pump and fan were running slow; i had to go into the bios and set them from low mode to high mode lmao. max temps now 52c


Very nice!


----------



## Accurate

http://imgur.com/Lwlevvm

Gigabyte with the Windforce 3X aftermarket cooler.


----------



## Gumbi

Quote:


> Originally Posted by *Accurate*
> 
> 
> 
> http://imgur.com/Lwlevvm
> 
> Gigabyte with the Windforce 3X aftermarket cooler.


How does she perform? Core/VRM cooling under load?


----------



## Accurate

Quote:


> Originally Posted by *Gumbi*
> 
> How does she perform? Core/VRM cooling under load?


Really well: 60 fps on Ultra in Witcher 3 with no HairWorks, and the temps cap out at 75 with 60% fan speed, barely even audible. Not so sure about the VRMs, but they seem to stay <80 most if not all of the time. Mine overclocks up to 1110 with only a 10% power increase, but I can't seem to change the voltage; I think it may be locked.


----------



## fyzzz

Quote:


> Originally Posted by *Accurate*
> 
> Really well: 60 fps on Ultra in Witcher 3 with no HairWorks, and the temps cap out at 75 with 60% fan speed, barely even audible. Not so sure about the VRMs, but they seem to stay <80 most if not all of the time. Mine overclocks up to 1110 with only a 10% power increase, but I can't seem to change the voltage; I think it may be locked.


I don't think any card is locked. Are you using MSI Afterburner? If so, have you checked the 'unlock voltage control' box under the settings tab?


----------



## Gumbi

Quote:


> Originally Posted by *fyzzz*
> 
> I don't think any card is locked. Are you using MSI Afterburner? If so, have you checked the 'unlock voltage control' box under the settings tab?


Of all the 290s, Gigabyte's is possibly the only one that's locked actually, AFAIK.


----------



## MEC-777

Quote:


> Originally Posted by *diggiddi*
> 
> The HG10 works with the Zalman LQ315, right? Is there a list of compatible AIOs? I'm thinking of going this route with an Aqua Computer passive backplate,
> either that or a Corsair H80i.


I'm not sure. The HG10 works with any Corsair AIO. You'd have to check the height of the mounting bracket since Zalman used their own mounting bracket design. However, since it "looks" like the same Asetek head design as the H55, it *might* work. Again, it depends on the bracket/X-plate and length of the screws.
Quote:


> Originally Posted by *buttface420*
> 
> lol nevermind guys.. i had a blonde moment that lasted the whole day apparently... the pump and fan were running slow; i had to go into the bios and set them from low mode to high mode lmao. max temps now 52c


Yeah, you want to make sure the pump gets a constant 12v (runs at full speed). The rad fan can run as low as you want at idle; have it ramp up using fan control software. I have mine ramp up to about 1500rpm max. The blower fan on the card itself you can set to a constant 30% in MSI AB - this will keep it silent and yet still easily cool the VRMs.
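That setup is essentially a clamped linear fan curve with the pump and blower pinned. A rough sketch in Python; the thresholds and RPM values here are illustrative assumptions, not tuned settings:

```python
# Rough sketch of the fan-ramp idea above: pump pinned at 12 V (full speed),
# radiator fan ramping linearly with core temperature up to ~1500 RPM, and the
# card's own blower held at a constant 30%. Numbers are illustrative only.

PUMP_DUTY_PCT = 100   # AIO pumps generally want a constant 12 V
BLOWER_DUTY_PCT = 30  # just enough airflow over the VRMs to stay silent

def rad_fan_rpm(core_temp_c, idle_rpm=600, max_rpm=1500, t_low=40, t_high=70):
    """Linear ramp between t_low and t_high; clamped outside that range."""
    if core_temp_c <= t_low:
        return idle_rpm
    if core_temp_c >= t_high:
        return max_rpm
    frac = (core_temp_c - t_low) / (t_high - t_low)
    return round(idle_rpm + frac * (max_rpm - idle_rpm))
```

Most fan-control software lets you enter the same thing as two or three curve points; the ramp shape matters far more than the exact numbers.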


----------



## supermiguel

Is there a guide around here on how to OC a 290x?


----------



## BradleyW

Quote:


> Originally Posted by *supermiguel*
> 
> Is there a guide around here on how to OC a 290x?


Same as any other. Increase speeds until it becomes unstable. Then, add voltage to regain stability. Watch the temps and enjoy.


----------



## rdr09

Quote:


> Originally Posted by *supermiguel*
> 
> Is there a guide around here on how to oc 290x ?


From the op . . .

scroll down to post # 21896 . . . it should be in there.

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781

always max the power limit when oc'ing, whether using afterburner or trixx.

check your temps at stock first (core and the vrms, especially vrm1). keep them all below 80 at stock or oc'd, by whatever method: fan profile, add'l fans, open side door, lower ambient, etc.
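The advice in the thread boils down to a simple feedback loop: push the clock, test, and trade small voltage bumps for stability while watching temps. A minimal sketch, where `is_stable` and the toy card model are hypothetical stand-ins for real benchmark passes done by hand in Afterburner or Trixx:

```python
# Sketch of the trial-and-error OC loop described above. `is_stable` stands in
# for a real stability check (an hour of gaming, Valley/3DMark loops); the
# clock and voltage changes would be manual steps in Afterburner or Trixx.
# This illustrates the procedure only; it is not a real tuning API.

def find_stable_clock(is_stable, start_mhz=1040, step_mhz=10,
                      step_mv=25, max_offset_mv=100):
    """Raise the core clock until unstable, then trade in small voltage bumps."""
    clock, offset_mv = start_mhz, 0
    while True:
        if is_stable(clock, offset_mv):
            clock += step_mhz                   # stable: push a little higher
        elif offset_mv < max_offset_mv:
            offset_mv += step_mv                # artifacts: add a modest bump
        else:
            return clock - step_mhz, offset_mv  # out of headroom: back off

# Toy model of a card that does 1100 MHz at stock and gains 10 MHz per +25 mV.
result = find_stable_clock(lambda mhz, mv: mhz <= 1100 + (mv // 25) * 10)
```

The key habit the loop encodes is backing off to the last known-stable clock once you run out of voltage (or thermal) headroom, rather than living on the edge.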


----------



## supermiguel

Quote:


> Originally Posted by *rdr09*
> 
> From the op . . .
> 
> scroll down to post # 21896 . . . it should be in there.
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781
> 
> always max the power limit when oc'ing, whether using afterburner or trixx.
> 
> check your temps at stock first (core and the vrms, especially vrm1). keep them all below 80 at stock or oc'd, by whatever method: fan profile, add'l fans, open side door, lower ambient, etc.


Perfect, thanks!!!! what do you normally use to check your vrm temps? btw, do any 290x's come with 2x dp ports? i have 2 4k monitors and can't crossfire my cards now







since i need to plug displays to both

How do video cards work when not in crossfire? like if im playing 2 games one on each monitor, does it use video card 1 for game in monitor 1 and video card 2 for game on monitor 2?


----------



## rdr09

Quote:


> Originally Posted by *supermiguel*
> 
> Perfect, thanks!!!! what do you normally use to check your vrm temps? btw, do any 290x's come with 2x dp ports? i have 2 4k monitors and can't crossfire my cards now
> 
> 
> 
> 
> 
> 
> 
> since i need to plug displays to both
> 
> How do video cards work when not in crossfire? like if im playing 2 games one on each monitor, does it use video card 1 for game in monitor 1 and video card 2 for game on monitor 2?


lol. can't help you there. i only have one 4K. i would have never thought of that setup.


----------



## supermiguel

Quote:


> Originally Posted by *rdr09*
> 
> lol. can't help you there. i only have one 4K. i would have never thought of that setup.


Do they make any 290x/390x with 2 dp ports?


----------



## kizwan

Quote:


> Originally Posted by *supermiguel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> From the op . . .
> 
> scroll down to post # 21896 . . . it should be in there.
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781
> 
> always max the power limit when oc'ing, whether using afterburner or trixx.
> 
> check your temps at stock first (core and the vrms, especially vrm1). keep them all below 80 at stock or oc'd, by whatever method: fan profile, add'l fans, open side door, lower ambient, etc.
> 
> 
> 
> Perfect, thanks!!!! what do you normally use to check your vrm temps? btw, do any 290x's come with 2x dp ports? i have 2 4k monitors and can't crossfire my cards now
> 
> 
> 
> 
> 
> 
> 
> since i need to plug displays to both
> 
> *How do video cards work when not in crossfire? like if im playing 2 games one on each monitor, does it use video card 1 for game in monitor 1 and video card 2 for game on monitor 2?*

I would think there will be resource problems, like whether you have enough CPU processing power to handle that. The proper approach for this is to set up a virtual machine server like Xen or VMware ESXi, for example. Then you can assign the GPUs and the monitors to virtual machines. This way you can play two different games on separate virtual machines.


----------



## supermiguel

ya, don't think that would work...


----------



## fyzzz

I have been trying to get another 290, but it seems hard to get hold of one where I live, and I don't know if I want to spend 326 € on a new one. I guess I'd have to buy a new PSU and waterblock as well. The main reason I want another card is BIOS modding. I doubt I would get a better card than my XFX, but I usually have luck with my GPUs. CrossFire will also cost a lot of money, which I don't really have. I also have an AMD rig that needs a new GPU. I'll have to see and think about this a bit more, but what I managed to do with my XFX makes me want another card. So much money goes into PC parts







I don't really know why I'm sharing this, but I wanted to share some thoughts I had.


----------



## supermiguel

under a burn test, what's the highest VRM temp i can hit and still be safe?


----------



## Accurate

Quote:


> Originally Posted by *fyzzz*
> 
> I don't think any card is locked. Are you using MSI Afterburner? If so, have you checked the 'unlock voltage control' box under the settings tab?


Oooo, thanks dude. Never noticed that setting there.


----------



## Accurate

Quote:


> Originally Posted by *fyzzz*
> 
> I don't think any card is locked. Are you using MSI Afterburner? If so, have you checked the 'unlock voltage control' box under the settings tab?


With the core voltage increased, I was able to get a stable 1160 overclock with 1500 on the memory. This is from a factory OC of 1040 and 1250 on memory. I think this is quite good, tbh. The temps only go up by 10 degrees, but I was able to get a 10-20 fps jump in The Witcher 3. The fan speed increased to a max of 80% though.


----------



## matt9882

Joining the Club -

2x MSI Gaming 290X -8GB,
MSI Gaming Stock Cooler

http://www.techpowerup.com/gpuz/details.php?id=dk7yz


----------



## megax05

Add me with 2x Sapphire r9 290 tri-x 4gb
I will get my 850w PSU within 2 days.


----------



## rdr09

Quote:


> Originally Posted by *matt9882*
> 
> Joining the Club -
> 
> 2x MSI Gaming 290X -8GB,
> MSI Gaming Stock Cooler
> 
> http://www.techpowerup.com/gpuz/details.php?id=dk7yz


now i am wishing i had 8GB hawaiis. Nice rig.
Quote:


> Originally Posted by *megax05*
> 
> Add me with 2x Sapphire r9 290 tri-x 4gb
> I will get my 850w PSU within 2 days.


upgrade psu? your current one just needs dusting. jk.

i came down from a 1300w to an 850w evga psu. i am done benching and i play with the 290s at stock. pretty cramped in there; you might find it hard to keep temps down.


----------



## megax05

I bought this PSU just to fire up a single GPU. It was fine till I decided to play at 4K and grab a dual-GPU setup, and the R9 290 was the cheapest way to do it. I got two cards for a nice $385 including shipping, and now I will get the budget Acer IPS 4K, the 28-inch one. Not planning a massive OC for the GPUs; only targeting somewhere close to 390 clocks, or however far I can get on stock voltages.
The problem is my new case will be a tight fit for those cards, and I might modify it to keep these beasts cool.
The Aerocool GT-RS will have a hard time. I'll see what I can do when DHL knocks on the door.


----------



## rdr09

Quote:


> Originally Posted by *megax05*
> 
> I bought this PSU just to fire up a single GPU. It was fine till I decided to play at 4K and grab a dual-GPU setup, and the R9 290 was the cheapest way to do it. I got two cards for a nice $385 including shipping, and now I will get the budget Acer IPS 4K, the 28-inch one. Not planning a massive OC for the GPUs; only targeting somewhere close to 390 clocks, or however far I can get on stock voltages.
> The problem is my new case will be a tight fit for those cards, and I might modify it to keep these beasts cool.
> The Aerocool GT-RS will have a hard time. I'll see what I can do when DHL knocks on the door.


we have a similar setup but mine are watered. you may have to keep the side door off. i use 28in 4K for games and 1440 for work. i normally disable crossfire if just at 1440. takes less than a minute to switch everything up or down. i don't need to oc my gpus at all in any of the games i play including c3, bf3, and bf4. my cpu, though, i keep my ht on and 4.5GHz oc.

100MHz on the core and prolly 1400 MHz vram oc should match a stock 390. you prolly don't need add'l VDDC. at most 25mv. try stock voltage first.

at any oc, though, max the power limit.


----------



## MEC-777

Quote:


> Originally Posted by *matt9882*
> 
> Joining the Club -
> 
> 2x MSI Gaming 290X -8GB,
> MSI Gaming Stock Cooler
> 
> http://www.techpowerup.com/gpuz/details.php?id=dk7yz


Just curious why you have your rear fan as intake instead of exhaust? I'd be worried about dust since there's no dust filter. But then, I have 4 dogs, so maybe that's not as much of an issue for you.


----------



## sinnedone

Quote:


> Originally Posted by *supermiguel*
> 
> Do they make any 290x/390x with 2 dp ports?


Is a DisplayPort hub out of the question?


----------



## supermiguel

is it worth overclocking?
Quote:


> Originally Posted by *sinnedone*
> 
> Is a DisplayPort hub out of the question?


does it work?? connecting 2 4k monitors via a dp splitter into a single dp port on the video card?

something like this? http://www.amazon.com/StarTech-com-Triple-DisplayPort-Multi-Monitor/dp/B00JLRBB8I/ref=sr_1_2?s=electronics&ie=UTF8&qid=1443792225&sr=1-2&keywords=display+port+hub


----------



## sinnedone

Quote:


> Originally Posted by *supermiguel*
> 
> is it worth overclocking?
> does it work?? connecting 2 4k monitors via a dp splitter into a single dp port on the video card?


Don't quote me on this, but I believe the more expensive ones were capable of more bandwidth. Might try researching that.


----------



## supermiguel

Quote:


> Originally Posted by *sinnedone*
> 
> Don't quote me on this, but I believe the more expensive ones were capable of more bandwidth. Might try researching that.


ya already found this: http://www.startech.com/AV/Displayport-Converters/Triple-Head-DisplayPort-Multi-Monitor-MST-Hub~MSTDP123DP i guess i need to look deeper for one that supports 4k at 60Hz instead of 30Hz


----------



## supermiguel

anyway, do any of you know a way to go higher than 200mV in trixx? or do i need to use MSI AB for it?

edit:

I tried using MSI AB, and when running MSIAfterburner.exe /wi6,30,8d,40 it goes to 1.35V, but when I open MSI AB it doesn't show the 400mV there; it still shows a max of 100mV. Is this a problem with the latest version of MSI AB?
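For context on that switch: Afterburner's /wi is a raw I2C register write, which would explain why the slider never reflects it; the write bypasses AB's own offset limit and goes straight to the voltage controller. The fields are bus (decimal), then device address, register, and value in hex. On reference Hawaii boards, address 0x30 is commonly reported as the IR3567B voltage controller, though that mapping is board-specific and assumed here. A sketch that only decodes the switch and never touches hardware:

```python
# Decodes an Afterburner /wi switch of the form /wi<bus>,<dev>,<reg>,<val>.
# The last three fields are hex. This is a parsing illustration only; the
# register meanings are board-specific, and the example values are simply
# the ones from the command quoted above.

def parse_wi(switch):
    body = switch[len("/wi"):]           # drop the '/wi' prefix
    fields = body.split(",")
    bus = int(fields[0])                 # I2C bus index (decimal)
    dev, reg, val = (int(f, 16) for f in fields[1:])
    return bus, dev, reg, val

decoded = parse_wi("/wi6,30,8d,40")      # bus 6, device 0x30, reg 0x8D, 0x40
```

Since the slider only mirrors offsets applied through AB's own control path, checking the actual VDDC readout (GPU-Z, HWiNFO) is the reliable way to confirm the write took effect.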


----------



## sinnedone

Quote:


> Originally Posted by *supermiguel*
> 
> ya already found this: http://www.startech.com/AV/Displayport-Converters/Triple-Head-DisplayPort-Multi-Monitor-MST-Hub~MSTDP123DP i guess i need to look deeper for one that supports 4k at 60Hz instead of 30Hz


It looks like on DisplayPort 1.2, two 4K monitors at 30Hz will be the limit. Are you planning to game on these at the same time?

I don't know the HDMI version on these cards, but I think HDMI is limited to 30Hz at 4K as well.
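That 30Hz ceiling falls out of the link budget. DP 1.2 (HBR2) runs 4 lanes at 5.4 Gbit/s with 8b/10b coding, leaving about 17.28 Gbit/s of usable payload shared by everything behind an MST hub. A back-of-envelope sketch; the pixel rates are approximations (24 bpp, ~11% blanking overhead), not exact video timings:

```python
# Back-of-envelope check of the DP 1.2 MST limit discussed above. HBR2 gives
# 4 lanes * 5.4 Gbit/s, and 8b/10b coding leaves 80% of that as payload.
# Stream rates are approximate: 24 bits per pixel with ~11% blanking overhead.

DP12_PAYLOAD_GBPS = 4 * 5.4 * (8 / 10)   # 17.28 Gbit/s for the whole link

def stream_gbps(h, v, hz, bpp=24, blanking=1.11):
    """Approximate bandwidth of one uncompressed video stream, in Gbit/s."""
    return h * v * hz * bpp * blanking / 1e9

uhd60 = stream_gbps(3840, 2160, 60)      # roughly 13.3 Gbit/s
uhd30 = stream_gbps(3840, 2160, 30)      # roughly 6.6 Gbit/s
```

One 4K60 stream fits on its own, but two of them need roughly 26.5 Gbit/s, well past the 17.28 Gbit/s link, while two 4K30 streams fit comfortably. That matches the two-monitors-at-30Hz limit over a single DP 1.2 port.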


----------



## supermiguel

Quote:


> Originally Posted by *sinnedone*
> 
> It looks like on DisplayPort 1.2, two 4K monitors at 30Hz will be the limit. Are you planning to game on these at the same time?


Yup, I'll just disable crossfire and overclock the video cards more..


----------



## By-Tor

Quote:


> Originally Posted by *MEC-777*
> 
> Just curious why you have your rear fan as intake instead of exhaust? I'd be worried about dust since there's no dust filter. But then, I have 4 dogs, so maybe that's not as much of an issue for you.


I run both of my 360 rads with the fans set up as exhaust, so all my case fans are set up as intake to feed them. My guess is his may be set up the same, but that's just a guess.

I didn't mount any fans at the rear of my Enthoo Primo as it's all open grating and they pull in whatever air they need.


----------



## matt9882

Quote:


> Originally Posted by *MEC-777*
> 
> Just curious why you have your rear fan as intake instead of exhaust? I'd be worried about dust since there's no dust filter. But then, I have 4 dogs, so maybe that's not as much of an issue for you.


I actually have two dogs in a relatively small apartment - but now the rear intake is filtered. That picture was from a while ago, before I got my second card (and let me tell you, it was hard to find the second card). I now have all fans installed (2 x 140 AFL in the front, 2 x 120 AF Quiet in the bottom, 1 x 140 AF Quiet in the rear) , filtered, on intake, aside from the 140 GTX cooler, which has upgraded fans to Phanteks PH-140MP's in exhaust. New pictures will come soon, but I'm actually taking the whole thing apart and making a build log. Little mods here and there, ya know?

Dust wasn't as big an issue as I thought it would be with an un-filtered rear intake. Temps, however, improved DRASTICALLY. I dropped about 4-5 deg C to the CPU when I switched the fan to intake, and my GPU temps didn't degrade at all. The 750D, while being a nice, big, sleek case, doesn't allow a ton of airflow, and the extra fresh air to the rad up top helped a lot. I need to validate my results now that I'm in crossfire, though. I feel as though I need to kill some of the extremely hot air spewing out of my 290X's before they get to the rad. I'm likely going to either switch the fan back to exhaust, or install a PCI - Slot cooler above the top 290X.


----------



## MEC-777

Quote:


> Originally Posted by *matt9882*
> 
> I actually have two dogs in a relatively small apartment - but now the rear intake is filtered. That picture was from a while ago, before I got my second card (and let me tell you, it was hard to find the second card). I now have all fans installed (2 x 140 AFL in the front, 2 x 120 AF Quiet in the bottom, 1 x 140 AF Quiet in the rear) , filtered, on intake, aside from the 140 GTX cooler, which has upgraded fans to Phanteks PH-140MP's in exhaust. New pictures will come soon, but I'm actually taking the whole thing apart and making a build log. Little mods here and there, ya know?
> 
> Dust wasn't as big an issue as I thought it would be with an un-filtered rear intake. Temps, however, improved DRASTICALLY. I dropped about 4-5 deg C to the CPU when I switched the fan to intake, and my GPU temps didn't degrade at all. The 750D, while being a nice, big, sleek case, doesn't allow a ton of airflow, and the extra fresh air to the rad up top helped a lot. I need to validate my results now that I'm in crossfire, though. I feel as though I need to kill some of the extremely hot air spewing out of my 290X's before they get to the rad. I'm likely going to either switch the fan back to exhaust, or install a PCI - Slot cooler above the top 290X.


Cool Cool.







Gotta do what you gotta do to get proper airflow. I know all too well.

With two 290's in my S340, on air, they both just ramped right up to 90+ degrees and the fans screamed like mad to try and cool them. The top and rear of the case would actually get quite warm to the touch. kinda scary. Thus, put both cards under water with kraken G10 on one and Corsair HG10 on the other - both with H55's plus an H60 cooling the CPU. S340's feel quite small once you start putting a lot of hardware inside. lol.







Still love it none-the-less.

Here's how I'm running it now:



Couldn't fit a fan in push on that top rad, so had to mount it outside on top in pull. Still works just fine though. Now the whole case stays cool to the touch, even under constant heavy loads. Top card hits low/mid 60's max and bottom card hits low/mid 50's max.


----------



## matt9882

Yeah, I wish I could hit my cards with some AIO brackets, but alas the MSI Gaming series doesn't use a reference design, so it'll be a little modding to get it to work properly.


----------



## battleaxe

I just ordered a 360 RAD, tubing, connectors, etc, from Amazon... picking up a new case to build it in tonight.

The 290x's will be going in a full loop very soon.

Finally, a full loop is on the horizon.







It's been far too long...


----------



## Arizonian

Quote:


> Originally Posted by *obiwan kanobi*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> please add me....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> i upgraded my rig from an hd6950 to an r9 290.. i swapped the direct cuii for a prolimatech mk-26 with 2 140mm bitfenix spectre pros... idle temps dropped to 37-39c.. max i get is around 78c. the card won't reach 80, unlike the direct cuii... usually i get 50-56c.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *matt9882*
> 
> Joining the Club -
> 
> 2x MSI Gaming 290X -8GB,
> MSI Gaming Stock Cooler
> 
> http://www.techpowerup.com/gpuz/details.php?id=dk7yz
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added









Quote:


> Originally Posted by *megax05*
> 
> Add me with 2x Sapphire r9 290 tri-x 4gb
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> I will get my 850w PSU within 2 days.


Congrats - added









Club has 609 members and 878 GPUs


----------



## Ashura

Great to see new member/ owners, I'm not the only one








XFX R9 290 DD


How high can I push the core @ +20mv? Any ideas?


----------



## Ashura

Quote:


> Originally Posted by *MEC-777*
> 
> Cool Cool.
> 
> 
> 
> 
> 
> 
> 
> Gotta do what you gotta do to get proper airflow. I know all too well.
> 
> With two 290's in my S340, on air, they both just ramped right up to 90+ degrees and the fans screamed like mad to try and cool them. The top and rear of the case would actually get quite warm to the touch. kinda scary. Thus, put both cards under water with kraken G10 on one and Corsair HG10 on the other - both with H55's plus an H60 cooling the CPU. S340's feel quite small once you start putting a lot of hardware inside. lol.
> 
> 
> 
> 
> 
> 
> 
> Still love it none-the-less.
> 
> Here's how I'm running it now:
> 
> 
> 
> Couldn't fit a fan in push on that top rad, so had to mount it outside on top in pull. Still works just fine though. Now the whole case stays cool to the touch, even under constant heavy loads. Top card hits low/mid 60's max and bottom card hits low/mid 50's max.


Looks Great









Case lighting?


----------



## Arizonian

Quote:


> Originally Posted by *Ashura*
> 
> Great to see new member/ owners, I'm not the only one
> 
> 
> 
> 
> 
> 
> 
> 
> XFX R9 290 DD
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> How high can I push the core @ +20mv ? any ideas?


Congrats - added


----------



## Streetdragon

If you are lucky.. maybe 1100... but you will see when you try it! Go higher on the stock voltage and try a benchmark like 3DMark or Heaven. If you don't get artifacts, try a higher clock, and so on.


----------



## rdr09

Quote:


> Originally Posted by *MEC-777*
> 
> Cool Cool.
> 
> 
> 
> 
> 
> 
> 
> Gotta do what you gotta do to get proper airflow. I know all too well.
> 
> With two 290's in my S340, on air, they both just ramped right up to 90+ degrees and the fans screamed like mad to try and cool them. The top and rear of the case would actually get quite warm to the touch. kinda scary. Thus, put both cards under water with kraken G10 on one and Corsair HG10 on the other - both with H55's plus an H60 cooling the CPU. S340's feel quite small once you start putting a lot of hardware inside. lol.
> 
> 
> 
> 
> 
> 
> 
> Still love it none-the-less.
> 
> Here's how I'm running it now:
> 
> 
> 
> Couldn't fit a fan in push on that top rad, so had to mount it outside on top in pull. Still works just fine though. Now the whole case stays cool to the touch, even under constant heavy loads. Top card hits low/mid 60's max and bottom card hits low/mid 50's max.


Very nice. exactly what did you use to light things up?


----------



## Ramzinho

dual 290X are awesome guys.. i believe this is the best Value for money setup one can get.


----------



## Ashura

Quote:


> Originally Posted by *Streetdragon*
> 
> If you are lucky.. maybe 1100... but you will see when you try it! Go higher on the stock voltage and try a benchmark like 3DMark or Heaven. If you don't get artifacts, try a higher clock, and so on.


Thanks.

Was getting low scores on valley. 52fps with 1100/1300.

Reinstalled 15.7.1
60Fps @1100/1250


WEI isn't working since my upgrade to the 290 & 15.7.1. Bad driver?


----------



## rdr09

Quote:


> Originally Posted by *Ramzinho*
> 
> dual 290X are awesome guys.. i believe this is the best Value for money setup one can get.


indeed they are. Lovely rig you got there. what's up with the . . . is that an SSD at the bottom right, not flush?

Quote:


> Originally Posted by *Ashura*
> 
> Thanks.
> 
> Was getting low scores on valley. 52fps with 1100/1300.
> 
> Reinstalled 15.7.1
> 60Fps @1100/1250
> 
> 
> *WEI isn't working since my upgrade to 290 &15.7.1, bad driver?*


might need a bios/cmos reset. save your oc settings in bios if you will do it.


----------



## Ashura

Quote:


> Originally Posted by *rdr09*
> 
> might need a bios/cmos reset. save your oc settings in bios if you will do it.


It crashes during the "Direct3D 10 Texture Load Assessment".
Not sure if it's significant.


----------



## rdr09

Quote:


> Originally Posted by *Ashura*
> 
> It crashes during the "Direct3D 10 Texture Load Assessment".
> Not sure if it's significant.


that's too bad. you know WEI > Anand, right?


----------



## Ramzinho

Quote:


> Originally Posted by *rdr09*
> 
> indeed they are. Lovely rig you got there. what's up with the . . . is that a SSD at bottom right not flushed?
> might need a bios/cmos reset. save your oc settings in bios if you will do it.


i fixed that. Sata power cable was so tough it wouldn't let me position the ssd correctly. just did a reseat for it.


----------



## rdr09

Quote:


> Originally Posted by *Ramzinho*
> 
> i fixed that. Sata power cable was so tough it wouldn't let me position the ssd correctly. just did a reseat for it.


i see. really nicely done. it is not too late to watercool hawaiis. we have at least two more years of life out of these gpus.


----------



## NicksTricks007

Add me to the list too now









Just realized I didn't add a darn picture to my original post, so here it is again with a pic









Finally upgraded from my 2GB GTX 770 to an R9 290X. I'm enjoying the performance difference so far. Here's the GPU-Z link and information needed so I can join:

GPU-Z Validation

Manufacturer: XFX
Model/Series: R9 290X
Cooling: Double Dissipation


----------



## MEC-777

Quote:


> Originally Posted by *NicksTricks007*
> 
> Add me to the list too now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just realized I didn't add a darn picture to my original post, so here it is again with a pic
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Finally upgraded from my 2GB GTX 770 to an R9 290X. I'm enjoying the performance difference so far. Here's the GPU-Z link and information needed so I can join:
> 
> GPU-Z Validation
> 
> Manufacturer: XFX
> Model/Series: R9 290X
> Cooling: Double Dissipation


Just curious why all your fans are as intake?

Nice setup, none-the-less.


----------



## NicksTricks007

Quote:


> Originally Posted by *MEC-777*
> 
> Just curious why all your fans are as intake?
> 
> Nice setup, none-the-less.


Thanks! The only thing I'm really wanting to change up now is my PSU. I want a stronger, higher wattage one. This Antec is about 6 years old, but has been in 3 different builds of mine.

As for the fans, I like more cold air coming in to create positive pressure inside my case. I just have to watch my temps. When I see them creeping up a bit, I'll take the side off to let the heat escape. Thankfully I don't have to do it that often. The fans on my radiator are the 140mm jetflows in push/pull, so they exhaust enough air to almost keep up with the others.


----------



## Arizonian

Quote:


> Originally Posted by *NicksTricks007*
> 
> Add me to the list too now
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Just realized I didn't add a darn picture to my original post, so here it is again with a pic
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Finally upgraded from my 2GB GTX 770 to an R9 290X. I'm enjoying the performance difference so far. Here's the GPU-Z link and information needed so I can join:
> 
> GPU-Z Validation
> 
> Manufacturer: XFX
> Model/Series: R9 290X
> Cooling: Double Dissipation
> 
> 
> Spoiler: Warning: Spoiler!


Totally appreciated.







Congrats - added


----------



## NicksTricks007

Thanks, I'm really excited to see what this card can do. But I want to get a better PSU first. My 6-year-old 650 watt Antec has been a workhorse, but I don't think it's up to the task of pushing my CPU at its current 4.5 GHz and the 290x.


----------



## fyzzz

I was looking through this thread today and realized that I never joined this club. I've had this card since May, and it has been under water for almost two months now. I'm waiting on some parts, so the whole loop is down, and I took the opportunity to take a picture. Excuse the bad picture; my phone takes bad pictures in low light. XFX R9 290 DD, watercooled. Can I join? @Arizonian


----------



## By-Tor

Very Sexy!!!


----------



## Accurate

Quote:


> Originally Posted by *Ashura*
> 
> Great to see new member/ owners, I'm not the only one
> 
> 
> 
> 
> 
> 
> 
> 
> XFX R9 290 DD
> 
> 
> How high can I push the core @ +20mv ? any ideas?


I pushed the core to the max at +100mV and was able to get a 1160 stable overclock with the memory at 1400. Though this did lead to an increase in temperatures, which were 80 degrees @ 60% fan speed.


----------



## Ashura

Quote:


> Originally Posted by *NicksTricks007*
> 
> Thanks! The only thing I'm really wanting to change up now is my PSU. I want a stronger, higher wattage one. This Antec is about 6 years old, but has been in 3 different builds of mine.
> 
> As for the fans, I like more cold air coming in to create positive pressure inside my case. I just have to watch my temps; when I see them creeping up a bit, I'll take the side off to let the heat escape. Thankfully I don't have to do it that often. The fans on my radiator are 140mm JetFlos in push/pull, so they exhaust enough air to almost keep up with the others.


Nice setup








Positive pressure is good, but I'd suggest keeping the rear fan as exhaust.
It'll push out the hot air from the VRMs.


----------



## NicksTricks007

Quote:


> Originally Posted by *Ashura*
> 
> Nice setup
> 
> 
> 
> 
> 
> 
> 
> 
> Positive pressure is good, but I'd suggest keeping the rear fan as exhaust.
> It'll push out the hot air from the VRMs.


I'll try it and see what kind of difference I get in temps. It doesn't seem like an issue so far, though. Idle temps are about 38C with 50% fan speed, and load temps are 71C at 50% and 61C at 100%. Well within spec of the card with room to spare. I may give it a bit of an overclock soon, though.


----------



## Arizonian

Quote:


> Originally Posted by *fyzzz*
> 
> I was looking through this thread today and realized I never joined this club, even though I've had this card since May and it has been under water for almost two months. I'm waiting for some parts, so the whole loop is down, and I took the opportunity to take a picture. Excuse the bad picture; my phone takes bad pictures in low light. XFX R9 290 DD, watercooled. Can I join? @Arizonian
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## rdr09

waterblocks available . . .

http://www.overclock.net/t/1574773/appraisal-bnib-ek-full-copper-blocks-xspc-photon-reservoirs-alphacool-radiators-primochill-ghost-fittings


----------



## buttface420

Loving my 290X, but I have one problem that's annoying the crap out of me.

It seems if I open MSI Afterburner or HWiNFO at random times, it causes my PC to freeze. It doesn't do it all the time, I'd say 3 out of 10 times. If I open them as soon as I've booted into Windows, it's fine. I have no other problems, but this is bugging me.

Any ideas?


----------



## kizwan

Try reinstalling drivers. Use DDU to remove the current drivers.


----------



## NicksTricks007

Quote:


> Originally Posted by *buttface420*
> 
> Loving my 290X, but I have one problem that's annoying the crap out of me.
> 
> It seems if I open MSI Afterburner or HWiNFO at random times, it causes my PC to freeze. It doesn't do it all the time, I'd say 3 out of 10 times. If I open them as soon as I've booted into Windows, it's fine. I have no other problems, but this is bugging me.
> 
> Any ideas?


Are you on Windows 10? Just wondering because I've had issues with a few programs in 10 that cause the same thing as you. Some of them haven't been patched for 10 yet. Speedfan is one of them.


----------



## mfknjadagr8

Quote:


> Originally Posted by *buttface420*
> 
> Loving my 290X, but I have one problem that's annoying the crap out of me.
> 
> It seems if I open MSI Afterburner or HWiNFO at random times, it causes my PC to freeze. It doesn't do it all the time, I'd say 3 out of 10 times. If I open them as soon as I've booted into Windows, it's fine. I have no other problems, but this is bugging me.
> 
> Any ideas?


You could try running them in Windows 7 compatibility mode... worth a shot (that is, if you're running Windows 10). But refreshing the drivers never hurts anyway...


----------



## diggiddi

Quote:


> Originally Posted by *kizwan*
> 
> Try reinstalling drivers. Use DDU to remove the current drivers.


I had/have the same issue; I reinstalled the driver with DDU and it came back.

Quote:


> Originally Posted by *NicksTricks007*
> 
> Are you on Windows 10? Just wondering because I've had issues with a few programs in 10 that cause the same thing as you. Some of them haven't been patched for 10 yet. Speedfan is one of them.


I'm running Win 7 and have the same crashing issue


----------



## mfknjadagr8

Quote:


> Originally Posted by *diggiddi*
> 
> I had/have the same issue; I reinstalled the driver with DDU and it came back.
> I'm running Win 7 and have the same crashing issue


I've read here and in other places about Afterburner locking up and even crashing the PC when it tries to alter the clocks/voltages/limits on starting up after Windows boots. I'm not sure anyone found a solution, or if they did I didn't see it among the posts. I can't think of a time mine has locked up, and I never start it with Windows. Maybe someone here has fixed it.


----------



## diggiddi

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I've read here and in other places about Afterburner locking up and even crashing the PC when it tries to alter the clocks/voltages/limits on starting up after Windows boots. I'm not sure anyone found a solution, or if they did I didn't see it among the posts. I can't think of a time mine has locked up, and I never start it with Windows. Maybe someone here has fixed it.


It crashes at stock clocks too. BTW, I don't have mine set up to open on startup.


----------



## mfknjadagr8

Quote:


> Originally Posted by *diggiddi*
> 
> It crashes at stock clocks too. BTW, I don't have mine set up to open on startup.


That sucks... knowing Microsoft, it's probably one of their crappy updates.







I wasn't saying to set it as a boot item; I meant starting it after Windows boots. But yeah, that's irrelevant now.


----------



## buttface420

Quote:


> Originally Posted by *NicksTricks007*
> 
> Are you on Windows 10? Just wondering because I've had issues with a few programs in 10 that cause the same thing as you. Some of them haven't been patched for 10 yet. Speedfan is one of them.


Yes, I'm on Windows 10. Win 10 has been sucky on some things; Firefox on it is a nightmare.


----------



## matt9882

Is anyone aware of just how different the MSI Gaming 8G PCB is from the Reference design board? I want to throw an EK full-cover block on there, but they claim it won't work. Looking at pictures of the two side by side, however, they don't seem all that different.

Here's the 8G (according to TechPowerUp)



And here's the reference:



Thoughts?


----------



## sinnedone

Looks like that black metal heatsink is what makes the difference.


----------



## matt9882

Quote:


> Originally Posted by *sinnedone*
> 
> Looks like that black metal heatsink is what makes the difference.


Right, but that should be pretty easily removable, shouldn't it?


----------



## kizwan

Quote:


> Originally Posted by *matt9882*
> 
> Is anyone aware of just how different the MSI Gaming 8G PCB is from the Reference design board? I want to throw an EK full-cover block on there, but they claim it won't work. Looking at pictures of the two side by side, however, they don't seem all that different.
> 
> Here's the 8G (according to TechPowerUp)
> 
> 
> 
> And here's the reference:
> 
> 
> 
> Thoughts?


Try taking a peek near the PCIe power connectors: how many caps do you see there? Coolingconfigurator seems poorly maintained. Even slightly taller or fatter caps can make the block incompatible. No one is going to know whether it fits until someone becomes a guinea pig. Visually it should fit, but I'm not 100% sure.


----------



## rdr09

Quote:


> Originally Posted by *rdr09*
> 
> waterblocks available . . .
> 
> http://www.overclock.net/t/1574773/appraisal-bnib-ek-full-copper-blocks-xspc-photon-reservoirs-alphacool-radiators-primochill-ghost-fittings


just to clear things up . . . these aren't mine. member just needs help appraising stuff.


----------



## mfknjadagr8

Quote:


> Originally Posted by *kizwan*
> 
> Try taking a peek near the PCIe power connectors: how many caps do you see there? Coolingconfigurator seems poorly maintained. Even slightly taller or fatter caps can make the block incompatible. No one is going to know whether it fits until someone becomes a guinea pig. Visually it should fit, but I'm not 100% sure.


Taller being the bigger issue... a Dremel can route it out a bit, but still no guarantees... and then you can't return it if it doesn't work...


----------



## Ashura

Quote:


> Originally Posted by *diggiddi*
> 
> It crashes at stock clocks too. BTW, I don't have mine set up to open on startup.


It happened to me too on stock clocks.

Overclocked it a bit; so far no freezes. It is at least GTA V stable.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Ashura*
> 
> It happened to me too on stock clocks.
> 
> Overclocked it a bit; so far no freezes. It is at least GTA V stable.


"GTA V stable"... that made me smile.


----------



## megax05

I just finished building my PC with a new PSU and case, and ran the Heaven benchmark at 1080p, Extreme, 8x tessellation. I got 2186 with 87 avg FPS. Is this normal for dual R9 290s with an i7 4770?


----------



## MEC-777

Quote:


> Originally Posted by *megax05*
> 
> I just finished building my PC with a new PSU and case, and ran the Heaven benchmark at 1080p, Extreme, 8x tessellation. I got 2186 with 87 avg FPS. Is this normal for dual R9 290s with an i7 4770?
> 
> 
> Spoiler: Warning: Spoiler!


Will run my dual 290's through Heaven within the hour and let you know if the score is similar.







Are your cards overclocked at all?

What case is that? Looks very... full.







I'm assuming it's a dual-chamber design with the PSU and storage mounted on the back side?

Very nice setup.


----------



## semitope

http://www.techpowerup.com/gpuz/558xd

Sapphire R9 290x Vapor-X 8GB Stock


----------



## megax05

Quote:


> Originally Posted by *MEC-777*
> 
> Will run my dual 290's through Heaven within the hour and let you know if the score is similar.
> 
> 
> 
> 
> 
> 
> 
> Are your cards overclocked at all?
> 
> What case is that? Looks very... full.
> 
> 
> 
> 
> 
> 
> 
> I'm assuming it's a dual-chamber design with the PSU and storage mounted on the back side?
> 
> Very nice setup.


Indeed, it's a dual-chamber case from Aerocool. I had to do some metal cutting for the tall GPUs, but it's fine temp-wise. I still need 2x high-flow 120mm fans plus a 1x 80mm fan; I'll make it front/top intake and rear/bottom exhaust.


----------



## shicedreck

Just received my new Sapphire R9 290X Tri-X OC today. I must say it's a beast, but a sexy one at that.

I even did some benchmarking on it in comparison to a GTX 970.

Well, as you all know, in some games the 970 is faster, and in most the R9 290X is.

In case you wanna see my benchmarks, I posted them on Google Docs.

-> https://docs.google.com/spreadsheets/d/10qodyVGqILcWXC99WoZAEOWm0sTCjdJNMRDQkp6q0cI/edit?usp=sharing


----------



## MEC-777

Quote:


> Originally Posted by *megax05*
> 
> I just finished building my PC with a new PSU and case, and ran the Heaven benchmark at 1080p, Extreme, 8x tessellation. I got 2186 with 87 avg FPS. Is this normal for dual R9 290s with an i7 4770?


Ok, so I ran Heaven on the same settings and scored 2485, with an avg fps of 98.6. That was with the cards running at stock clocks 947/1250. With a mild OC of 1100/1350 @+0mV the score was 2723 with avg fps of 108.1.

So it looks like your score is a bit on the low side. Which drivers are you using? I'm running 15.8 beta and it's been quite stable. Also, make sure you don't have any other applications running in the background. You should also use MSI Afterburner to watch the clock speeds of both cards and ensure they're synchronized. If the cards don't run at the same clock speeds, performance can be significantly reduced.

What are your temps like under load?


----------



## megax05

I think my PCIe x16/x4 is crippling my second card; my stock clock is 1000/1300.
I will try to add more fans to keep the cards from throttling.
I am using the latest stable, 15.7.
I will try to figure it out.


----------



## mfknjadagr8

Quote:


> Originally Posted by *megax05*
> 
> I think my PCIe x16/x4 is crippling my second card; my stock clock is 1000/1300.
> I will try to add more fans to keep the cards from throttling.
> I am using the latest stable, 15.7.
> I will try to figure it out.


fill out your rig in the rigbuilder in the top right and set it to show on your post signature... this will let people see what you have at a glance when you ask questions


----------



## MEC-777

Quote:


> Originally Posted by *megax05*
> 
> pcie 16x/4x


This could be what's holding it back since 290's use the PCIe lanes to communicate between GPUs in crossfire. The motherboard I'm using is SLI certified which means it has two PCIe 3.0 slots that run at 8x/8x instead of 16x/4x like most other crossfire only certified motherboards. AMD should really change this certification to match the standard of SLI for two slots at 8x minimum. You are definitely seeing a good performance boost but this could be a limiting factor.


----------



## mfknjadagr8

Quote:


> Originally Posted by *MEC-777*
> 
> This could be what's holding it back since 290's use the PCIe lanes to communicate between GPUs in crossfire. The motherboard I'm using is SLI certified which means it has two PCIe 3.0 slots that run at 8x/8x instead of 16x/4x like most other crossfire only certified motherboards. AMD should really change this certification to match the standard of SLI for two slots at 8x minimum. You are definitely seeing a good performance boost but this could be a limiting factor.


Best case is like the Saber I have here:
3 x PCIe 2.0 x16 (dual x16, or x16/x8/x8)

x4 won't give you much out of that second card, for sure.


----------



## megax05

As I thought, that might be what's holding back my GPUs.
I will be fine till I save some $$ for a decent motherboard.
For now I will work on cooling this case.
I saw 90C on my top card and 73C on the bottom one; that means I need more airflow in the case.


----------



## MEC-777

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Best case is like the saber i have here:
> 3 x PCIe 2.0 x16 (dual x16 or x16, x8, x8)
> 
> x4 wont give you much out of that second card for sure


Same as my board. PCIe 3.0 at x8 = PCIe 2.0 at x16. (PCIe 3.0 has roughly twice the per-lane bandwidth of PCIe 2.0.)
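The equivalence can be sketched with the published line rates. This is a rough back-of-the-envelope calculation; the encoding figures are the spec values, and real-world throughput is somewhat lower due to protocol overhead:

```python
# Per-lane usable bandwidth after encoding overhead, from the spec line rates.
def lane_gbs(gt_per_s, enc_num, enc_den):
    """Usable GB/s per lane: line rate (GT/s) x encoding efficiency / 8 bits."""
    return gt_per_s * enc_num / enc_den / 8

pcie2 = lane_gbs(5.0, 8, 10)      # PCIe 2.0: 5 GT/s, 8b/10b   -> 0.5 GB/s per lane
pcie3 = lane_gbs(8.0, 128, 130)   # PCIe 3.0: 8 GT/s, 128b/130b -> ~0.985 GB/s per lane

print(f"PCIe 2.0 x16: {pcie2 * 16:.2f} GB/s")  # 8.00 GB/s
print(f"PCIe 3.0 x8:  {pcie3 * 8:.2f} GB/s")   # 7.88 GB/s, effectively the same
print(f"PCIe 2.0 x4:  {pcie2 * 4:.2f} GB/s")   # 2.00 GB/s, the slot in question
```

So a 3.0 x8 slot is within a couple of percent of 2.0 x16, while a 2.0 x4 slot has only a quarter of that bandwidth.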









Quote:


> Originally Posted by *megax05*
> 
> As I thought, that might be what's holding back my GPUs.
> I will be fine till I save some $$ for a decent motherboard.
> For now I will work on cooling this case.
> I saw 90C on my top card and 73C on the bottom one; that means I need more airflow in the case.


Yeah, get some fans in the top blowing fresh air down on the cards if you can.


----------



## mfknjadagr8

Quote:


> Originally Posted by *MEC-777*
> 
> Same as my board. PCIe 3.0 at x8 = PCIe 2.0 x16. (PCIe 3.0 has twice the lane bandwidth of PCIe 2.0)
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, get some fans in the top blowing fresh air down on the cards if you can.


lemme know when your cards can fully saturate 2.0 much less 3.0... but that point is moot so long as you are getting the best performance you can out of your board


----------



## Gumbi

Dammit. My RMAed card did 1200/1650 at +80mV. The card I got back can only do 1150/1550 at +80mV, and tbh I'm not even confident in that 1150 yet. Just when I got the cooling set up to push my 290 to the limits, it craps out and I get something a lot worse back. Makes me appreciate what I had, tbh.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Gumbi*
> 
> Dammit. My RMAed card did 1200/1650 at +80mV. The card I got back can only do 1150/1550 at +80mV, and tbh I'm not even confident in that 1150 yet. Just when I got the cooling set up to push my 290 to the limits, it craps out and I get something a lot worse back. Makes me appreciate what I had, tbh.


I must've missed your post where you had to RMA. That sucks; you had a good specimen there, but that's how the lottery works. I can't count how many people buy two cards and expect them to clock the same, I guess thinking that since they were bought at the same time they'd perform exactly the same. But yeah, some people are having trouble hitting those clocks even at +200, so it could always be worse.








My bottom card won't budge over 1100/1300 unless I add +100mV, so you still didn't get a complete dud.


----------



## MIGhunter

Quote:


> Originally Posted by *MEC-777*
> 
> Which drivers are you using? I'm running 15.8 beta and it's been quite stable.


How did you install those? Last time I installed beta drivers, I used DDU to wipe my old drivers. Then when I installed the beta drivers, it acted like it didn't include CCC, Raptr, or anything else beyond the actual driver. So I DDU'd it again, installed the stable release, and then updated to the beta, and it was pretty glitchy.


----------



## mfknjadagr8

Quote:


> Originally Posted by *MIGhunter*
> 
> How did you install those? Last time I installed beta drivers, I used DDU to wipe my old drivers. Then when I installed the beta drivers, it acted like it didn't include CCC, Raptr or anything else past the actual driver. So, DDU'd it again, installed the stable and then updated to the beta and it was pretty glitchy.


I always uninstall via DDU IN SAFE MODE (make sure Windows' automatic driver installation is turned off until you have the new driver installed; then you can turn it back on). Then I reboot and run CCleaner on the registry to be sure DDU got everything. Most of the time it does, but sometimes I've had issues with an install only to find something was missed. Then reinstall the new driver and reboot. If you happen to have crossfire, be sure you install the drivers with only one card powered. Then reboot, power down, reconnect power to the second card, let it detect on startup, then turn on crossfire. I've had issues in the past installing with both cards connected.


----------



## MIGhunter

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I always uninstall via DDU IN SAFE MODE (make sure Windows' automatic driver installation is turned off until you have the new driver installed; then you can turn it back on). Then I reboot and run CCleaner on the registry to be sure DDU got everything. Most of the time it does, but sometimes I've had issues with an install only to find something was missed. Then reinstall the new driver and reboot. If you happen to have crossfire, be sure you install the drivers with only one card powered. Then reboot, power down, reconnect power to the second card, let it detect on startup, then turn on crossfire. I've had issues in the past installing with both cards connected.


Thanks, I'll try that and see if it's different. I'm using a 295X2 so I can't disable the crossfire. I just thought it was weird that last time I did the beta drivers it didn't install CCC and stuff.


----------



## mfknjadagr8

Quote:


> Originally Posted by *MIGhunter*
> 
> Thanks, I'll try that and see if it's different. I'm using a 295X2 so I can't disable the crossfire. I just thought it was weird that last time I did the beta drivers it didn't install CCC and stuff.


If it doesn't work this time, you should redownload the file; you might have gotten a corrupted download. It happens sometimes.


----------



## Arizonian

Quote:


> Originally Posted by *semitope*
> 
> http://www.techpowerup.com/gpuz/558xd
> 
> Sapphire R9 290x Vapor-X 8GB Stock


Congrats - added









Quote:


> Originally Posted by *shicedreck*
> 
> Just received my new Sapphire R9 290X Tri-X OC today. I must say it's a beast, but a sexy one at that.
> 
> I even did some benchmarking on it in comparison to a GTX 970.
> 
> Well, as you all know, in some games the 970 is faster, and in most the R9 290X is.
> 
> In case you wanna see my benchmarks, I posted them on Google Docs.
> 
> -> https://docs.google.com/spreadsheets/d/10qodyVGqILcWXC99WoZAEOWm0sTCjdJNMRDQkp6q0cI/edit?usp=sharing


Congrats - added









Thanks for proper submissions.


----------



## MEC-777

Quote:


> Originally Posted by *mfknjadagr8*
> 
> lemme know when your cards can fully saturate 2.0 much less 3.0... but that point is moot so long as you are getting the best performance you can out of your board


I know, even the fastest cards currently available can't fully saturate a 2.0 x16 slot (equivalent to 3.0 x8). I was just pointing out that dual 2.0 x16 slots are the same as dual 3.0 x8 slots.









Agreed.


----------



## bigpoppapump

Is this kind of voltage fluctuation normal under load?










VisionTek R9 290, XFX TS 750w.


----------



## Mega Man

Voltage isn't fluctuating. Usage is.


----------



## Ha-Nocri

Quote:


> Originally Posted by *bigpoppapump*
> 
> Is this kind of voltage fluctuation normal under load?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> VisionTek R9 290, XFX TS 750w.


Yeah, it's normal if the game is not very taxing on the GPU. It was strange to me too when I got the 290, coming from an Nvidia card where usage was a constant number (30% for example, all the time).


----------



## rdr09

Quote:


> Originally Posted by *bigpoppapump*
> 
> Is this kind of voltage fluctuation normal under load?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> VisionTek R9 290, XFX TS 750w.


Use the slide bar to show the other readings. Was that a reading in a game? Can you run a bench like FS and do the same, but set the chart to show the other readings? Your temp looks good.

edit: not sure if it still does, but Voltage Monitoring in AB can cause issues. Try with it unchecked. Follow this thread to configure AB . . .

http://www.overclock.net/t/1265543/the-amd-how-to-thread


----------



## megax05

I was thinking abou Asus Z87 Pro
With 16x or 8x/8x i can get it for about 85$
I think it will be better than my h97 with 16/4 pci-e that crippled my crossfire r9 290 tri-x OC i berly get 2200 in heaven 4.0 with 1080p 8x extreme set also my case getting really hot while benching and gaming top card hits 87c and bottom one max @ 70c i will rearrange my fans to 3x 120mm fans with max cmf of 233 and 1x 120mm + 80mm exhaust with max cmf of 190 to get enough air flow while keep positive pressure.


----------



## kizwan

Quote:


> Originally Posted by *bigpoppapump*
> 
> Is this kind of voltage fluctuation normal under load?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> VisionTek R9 290, XFX TS 750w.


Voltage fluctuation is normal.


----------



## Gumbi

It's not fluctuating. That's also idle voltage.


----------



## bigpoppapump

Cool, thanks.

Yeah that reading was in a game. I've been having a few black screen issues with this card and I wasn't sure if power delivery was the culprit. I'll run FS tomorrow after work.


----------



## eyesfire

http://www.techpowerup.com/gpuz/9mchc/

Two 290Xs, 1 MSI, 1 XFX, both watercooled with an H50 w/ Kraken G10.


----------



## Gumbi

Has anyone here got a 290X Vapor-X 8GB? I'm looking at possibly grabbing one second-hand and wondering how well they clock!

I RMAed a great-clocking 290 Vapor-X and got a very meh one in return, and I miss my other one greatly!


----------



## weetbix07

Hi,

I've spent a bit of time reading these forums but haven't posted much.

I was wondering if anyone can help. I have a R9 290 Crossfire (watercooled) setup and I have been trying to run the Unigine Valley Benchmark to see where my temps are getting to. I have noticed that only one card is doing the work during the benchmark. Am I missing some settings in Unigine? I do have crossfire enabled and I can see both cards working in games.

Would appreciate any suggestions or help. Thanks in advance.


----------



## eyesfire

First thing I can think of: is Unigine fullscreen or windowed?


----------



## weetbix07

Quote:


> Originally Posted by *eyesfire*
> 
> First thing I can think of: is Unigine fullscreen or windowed?


Ahh I'm pretty sure I tried both. I will try again tonight with fullscreen.


----------



## megax05

I managed to do some testing and OCing of my cards yesterday.
The best I got was 1050/1400 @ -37mV with the power limit at default, and my Heaven benchmark went from 2189 to 2390.
Settings are 1080p fullscreen, Extreme, 8xAA.
While still using my crappy x16/x4 PCIe motherboard till I can afford a better one, the top card maxed @ 84C and the lower card @ 63C.
I think I will keep this setting as my daily-use OC.


----------



## rdr09

Quote:


> Originally Posted by *weetbix07*
> 
> Ahh I'm pretty sure I tried both. I will try again tonight with fullscreen.


That, and disable and enable crossfire just before running the bench.
Quote:


> Originally Posted by *megax05*
> 
> I managed to do some testing and OCing of my cards yesterday.
> The best I got was 1050/1400 @ -37mV with the power limit at default, and my Heaven benchmark went from 2189 to 2390.
> Settings are 1080p fullscreen, Extreme, 8xAA.
> While still using my crappy x16/x4 PCIe motherboard till I can afford a better one, the top card maxed @ 84C and the lower card @ 63C.
> I think I will keep this setting as my daily-use OC.


You can compare the two cards. Disable crossfire, then shut down the PC. Move the cable from the primary card to the card at x4. Turn on the PC and test the secondary 290 with a bunch of benchies from Futuremark and Unigine.

Shut down the PC and switch the cable back to the primary card, run the same benchies, then compare results. Just run the benchies on the primary card first.

I would go as far as unplugging power from the idle GPU. Do not forget to plug all the cables back in. Take your time.


----------



## megax05

I did the test and the results were a bit strange.

Test settings:
Unigine Heaven 4.0
1080p, Extreme, 8xAA, Ultra, fullscreen.

Master R9 290 stock 1000/1300

FPS : 52.8
Score: 1330
Min: 24.1
Max: 108

Slave R9 290 stock 1000/1300

FPS : 51.1
Score: 1287
Min: 23.1
Max: 103.6

The gap isn't much.

Now 2x crossfire R9 290 @ 1000/1300
FPS: 95.7
Score: 2410
Min : 26.7
Max : 203.7

I think i will need to redo with OC .
Will post update.
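As a quick sanity check on the numbers above, the crossfire scaling actually works out quite well. A back-of-the-envelope calculation using the average FPS figures from those runs:

```python
# Crossfire scaling from the Heaven 4.0 runs above (average FPS).
single_fps = 52.8   # faster single card (master) at stock 1000/1300
xfire_fps = 95.7    # both cards in crossfire at stock

scaling = xfire_fps / single_fps   # ~1.81x
uplift = (scaling - 1) * 100       # ~81% gain from the second card

print(f"Scaling: {scaling:.2f}x ({uplift:.0f}% uplift)")  # Scaling: 1.81x (81% uplift)
```

Roughly 80% scaling even on an x16/x4 board is respectable; an x8/x8 board might claw back a few more percent, which lines up with the small differences others have reported in this thread.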


----------



## rdr09

Quote:


> Originally Posted by *megax05*
> 
> I did the test and the results were a bit strange.
> 
> Test settings:
> Unigine Heaven 4.0
> 1080p, Extreme, 8xAA, Ultra, fullscreen.
> 
> Master R9 290 stock 1000/1300
> 
> FPS : 52.8
> Score: 1330
> Min: 24.1
> Max: 108
> 
> Slave R9 290 stock 1000/1300
> 
> FPS : 51.1
> Score: 1287
> Min: 23.1
> Max: 103.6
> 
> The gap isn't much.
> 
> Now 2x crossfire R9 290 @ 1000/1300
> FPS: 95.7
> Score: 2410
> Min : 26.7
> Max : 203.7
> 
> I think i will need to redo with OC .
> Will post update.


I was gonna say, yes, you won't see much difference. I did the test too on my AMD rig with a 7950, using x16 vs x4; I lost around 200 points in FS graphics score.

Not sure if it's worth getting a new board for the difference. I know you can OC the cards individually using either AB or Trixx; it might just bring issues.

Crossfired, the CPU may make a bit of a difference. If we had a member here with the exact same setup as yours except running x8/x8, then you'd see their score in Heaven. Prolly above 100 with a similar OC.

edit: I just checked one of my benchies in Heaven . . . at 1090/1300 core on my 290 I got 54, within range of yours. I say keep your motherboard.


----------



## buttface420

Anyone recommend a standalone backplate for a reference 290X? Is there one? I got the HG10 on it and would love to add a backplate.


----------



## megax05

Wow, I did another run after turning off MSI AB, and I guess it was crippling my GPUs.

With a small OC and same test

Crossfire r9 290 @ 1075/1450
FPS : 103.1
Score: 2630
Min: 26.6
Max: 216.2


----------



## rdr09

Quote:


> Originally Posted by *megax05*
> 
> Wow, I did another run after turning off MSI AB, and I guess it was crippling my GPUs.
> 
> With a small OC and same test
> 
> Crossfire r9 290 @ 1075/1450
> FPS : 103.1
> Score: 2630
> Min: 26.6
> Max: 216.2


So, do you still need to spend on a new motherboard? BTW, a VRAM OC makes a difference in Unigine.


----------



## megax05

Damn, no. I will settle on a lower clock; 1050/1400 is more than enough to push my 1440p.
Even my Firestrike score went up from 14k to 16.1k, which is a huge jump. I will never use MSI AB again; I think I will use the Sapphire tool instead, I think it's called Trixx or something.


----------



## rdr09

Quote:


> Originally Posted by *megax05*
> 
> Damn, no. I will settle on a lower clock; 1050/1400 is more than enough to push my 1440p.
> Even my Firestrike score went up from 14k to 16.1k, which is a huge jump. I will never use MSI AB again; I think I will use the Sapphire tool instead, I think it's called Trixx or something.


Check in CCC whether Overdrive is enabled; it conflicts with AB or even Trixx.

Have some steak tonight. lol

edit: wait, do you let CCC be part of your startup programs? I don't.


----------



## diggiddi

Quote:


> Originally Posted by *buttface420*
> 
> anyone recommend a standalone backplate for reference 290x? is there one? i got the hg10 on it and would love to add a backplate


Aquacomputer and alphacool have backplates that might work


----------



## bigpoppapump

Quote:


> Originally Posted by *rdr09*
> 
> Use the slide bar to show the other readings. Was that a reading in a game? Can you run a bench like FS and do the same, but set the chart to show the other readings? Your temp looks good.
> 
> edit: not sure if it still does, but Voltage Monitoring in AB can cause issues. Try with it unchecked. Follow this thread to configure AB . . .
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread




Here's my Firestrike.


----------



## rdr09

Quote:


> Originally Posted by *bigpoppapump*
> 
> 
> 
> Here's my Firestrike.


Everything looks normal except your memory clock. Could someone else read it and double check? Does it only go to 1000MHz?

What should the default be? It's 1250, right?

edit: I think even your core clock is low. Did you underclock them?

Your load voltage is also low. That may cause blackscreens.


----------



## MEC-777

Quote:


> Originally Posted by *megax05*
> 
> Wow, I did another run after turning off MSI AB, and I guess it was crippling my GPUs.
> 
> With a small OC and same test
> 
> Crossfire r9 290 @ 1075/1450
> FPS : 103.1
> Score: 2630
> Min: 26.6
> Max: 216.2


Compare that to my 290 cf setup running at 1100/1350:

FPS: 108.1
Score: 2723
Min FPS: 24.5
Max FPS: 221.0

I'd say you're right on the money now.









When running MSI AB, you have to disable Overdrive in CCC or it will conflict and cause some problems. With crossfire, if both cards are not running at the same clock speeds, it can cripple performance. I found running my cards at stock (947/1250) to be more than fast enough for most games at 1440p on high-ultra settings while maintaining a minimum of 60fps+. When I was running just one 290, I had it overclocked quite a bit to boost performance, but with two I find it's not necessary. I'd rather not stress them and keep the temps/noise down instead.









Quote:


> Originally Posted by *bigpoppapump*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Here's my Firestrike.


Have you underclocked your card? Stock for most 290's is 947 core and 1250 memory. Yours is running at 900/1000, so something is going on there. Is this a 290 or 290x? Which brand/model?


----------



## megax05

Funny, I just found that one of my 290s unlocks to a 290X, and by swapping the slots I gained an extra 70 points in Heaven.
I was happy and wanted to see if my other card was unlockable, but bleh, it was hardware locked.
Now that I'm happy with my results being in line with others on similar setups, I think I can work on my case cooling.
However, The Witcher 3 still has minor stuttering even with the latest beta driver and the 1.08 game patch, but it's not terrible.


----------



## MEC-777

Quote:


> Originally Posted by *megax05*
> 
> Funny, I just found that one of my 290s unlocks to a 290X, and by swapping the slots I gained an extra 70 points in Heaven.
> I was happy and wanted to see if my other card was unlockable, but bleh, it was hardware locked.
> Now that I'm happy with my results being in line with others on similar setups, I think I can work on my case cooling.
> However, The Witcher 3 still has minor stuttering even with the latest beta driver and the 1.08 game patch, but it's not terrible.


You can actually unlock a locked 290. I did it with my HIS card. I forget what the command was... something like "atiflash unlock rom"? Someone here might know and can maybe confirm.


----------



## mus1mus

whut?


----------



## mfknjadagr8

Quote:


> Originally Posted by *MEC-777*
> 
> You can actually unlock a locked 290. I did it with my HIS card. Forget what the command was... something like "atiflash unlock rom"? Someone here might know and can maybe confirm.


No, you don't just go flashing the card... you must find out if it will unlock first; there's no reason going through the hassle if it won't unlock. There is a thread for that, and it contains all the information: http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread


----------



## kizwan

Quote:


> Originally Posted by *megax05*
> 
> Funny that I just found one of my 290s is unlocked to 290x and by swapping the slots I gained extra 70p in Heaven.
> I was happy and wanted to see if my other card is unlockable but bleh it was HW locked.
> Now I am happy with my results to be inline with others with the similar setup, I think I can now work on my case cooling
> However The Witcher 3 still have minor stuttering even with the latest beta driver and 1.08 game patch but its not terrible.


Congrats! One of your 290s "unlocked" to a 290X because of ULPS.

*Edit:* I probably read too fast. Did you unlock the 290 to a 290X, or are you just saying that because you saw a higher shader count on your secondary 290 in GPU-Z?


----------



## MEC-777

Quote:


> Originally Posted by *mfknjadagr8*
> 
> no you don't just go flashing the card...you must find out if it will unlock first no reason going through the hassle if it won't unlock...there is a thread for that and it contains all the information on this..http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread


Yeah, I've read through that thread extensively and did a ton of research before attempting to flash one of my cards. That's when I found out my HIS card was listed as "locked" and supposedly not unlockable. Well, I did more research, then used the 'unlock rom' command with atiflash and... tadaaaa - it unlocked.

I then successfully flashed it to a modified 390X BIOS (adjusted to work with 4GB of VRAM and a default memory clock of 1250). I have since switched back to the stock BIOS because the gains were minimal and it caused some instability.

It may not work with every card, I understand that. But not every card that is supposedly "locked" is actually impossible to unlock. I simply tried it and it worked. The worst that can happen is that it doesn't unlock.


----------



## ZealotKi11er

Anyone with CF playing SW BF?


----------



## mfknjadagr8

Quote:


> Originally Posted by *MEC-777*
> 
> Yeah, I've read through that thread extensively. Did a ton of research before attempting to flash one of my cards. That's when I found out my HIS card was listed as "locked" and supposedly not unlockable. Well I did more research, then used the 'unlock rom' command with atiflash and... tadaaaa - it unlocked.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I then successfully flashed it to a modified 390x BIOS (to work with 4GB Vram and default memory clock of 1250). I have since switched back to the stock BIOS because the gains were minimal and it caused some instability.
> 
> It may not work with every card, I understand that. But not every card that is supposedly "locked" are really not able to be unlocked. I simply just tried it and it worked. The worst that will happen is it will not unlock.


I guess that's less of an issue than the risk of bricking it from the BIOS screwing up... wasn't the difference in shaders like 2100 unified vs 2560, or something like that, with base clocks slightly higher on the 290X? I know both of mine are listed as locked... both XFX with Elpida memory.


----------



## MEC-777

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Anyone with CF playing SW BF?


Yep. 1440p, ultra settings; it maintains 60+ fps on average and the GPUs aren't even being taxed that heavily. Runs very well with CF.








Quote:


> Originally Posted by *mfknjadagr8*
> 
> I guess that's less of an issue than risk bricking it from bios screwing up...wasn't the difference in shaders like 2100 unified vs 2560 or something like that and base clocks were slightly higher on 290x? I know both if mine are listed as locked...both xfx with elpedia memory


The 290 has 2560 shaders and the 290X has 2816. Stock reference clocks are 947/1250 (290) and 1000/1250 (290X), respectively (I believe).

Typically, the only time your card can be bricked is if something goes wrong while you're flashing the new BIOS. And even if that happens, our cards have a dual-BIOS switch, so you can switch to the other BIOS and still use the card. I believe you can also un-brick the bad/corrupt BIOS by powering up your PC with the integrated graphics (if your CPU has it) and re-flashing the bad BIOS with the original.
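For a rough sense of what those shader counts mean, peak single-precision throughput on GCN is usually estimated as shaders × 2 ops/clock (FMA) × core clock. A minimal sketch using the reference figures quoted above (the `gflops` helper is just for illustration, not a real tool):

```python
# Rough theoretical FP32 throughput for a GCN card:
# shaders x 2 ops/clock (fused multiply-add) x core clock.
def gflops(shaders: int, core_mhz: int) -> float:
    """Peak single-precision GFLOPS (a back-of-envelope figure only)."""
    return shaders * 2 * core_mhz / 1000

# Reference specs quoted above.
r9_290 = gflops(2560, 947)    # ~4849 GFLOPS
r9_290x = gflops(2816, 1000)  # 5632 GFLOPS

print(f"290X advantage over 290: {r9_290x / r9_290 - 1:.1%}")
```

Real-world gains from unlocking or overclocking will of course differ from this paper figure.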


----------



## ZealotKi11er

Quote:


> Originally Posted by *MEC-777*
> 
> Yep. 1440p ultra settings, maintains over 60fps+ on average and GPUs aren't even being taxed that heavily. Runs very well with CF.
> 
> 
> 
> 
> 
> 
> 
> 
> The 290 has 2560 shaders and the 290x has 2816. Stock reference clocks are 947/1250 (290) and 1000/1250 (290x), respectively (I believe).
> 
> Typically, the only time your card can be bricked is if something goes wrong while you're flashing the new BIOS. And even if that happens, our cards have the dual BIOS switch, so you can switch it back to the other BIOS and still use the card. I believe you can also un-brick the bad/corrupt BIOS by powering up your PC with the integrated graphics (if you CPU has it) and re-flashing the bad BIOS with the original.


My second card stays at 0% for me, and that normally only happens when CFX is off or disabled in the drivers for the game.


----------



## megax05

Quote:


> Originally Posted by *kizwan*
> 
> Congrats! One of your 290s "unlocked" to 290X because ULPS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Edit:* I probably read too fast. Did you unlocked the 290 to 290X or you just saying that because you saw higher shaders count on your secondary 290 in GPU-Z?


Yeah, I saw higher shader counts. Dang, I look like a dummy.

But anyhow, I am happy with those cards; they suit my needs, and I got them for so cheap. My rig cost around $670, so it's a killer when you look at bang/$, considering it's an i7 4770 + 16GB DDR3 + H97 ATX mobo + 730W PSU + case + 2x R9 290 Tri-X OC + 2TB HDD.

It's a win-win for me.

BTW, did anyone test an R9 290 vs an R9 390 clock for clock? If there's a link to one online, please share. I want to see how much of an improvement they made to the Hawaii chip, because benchmarks are everywhere but they don't do a clock-for-clock test.


----------



## mus1mus

Quote:


> Originally Posted by *MEC-777*
> 
> Yep. 1440p ultra settings, maintains over 60fps+ on average and GPUs aren't even being taxed that heavily. Runs very well with CF.
> 
> 
> 
> 
> 
> 
> 
> 
> The 290 has 2560 shaders and the 290x has 2816. Stock reference clocks are 947/1250 (290) and 1000/1250 (290x), respectively (I believe).
> 
> Typically, the only time your card can be bricked is if something goes wrong while you're flashing the new BIOS. And even if that happens, our cards have the dual BIOS switch, so you can switch it back to the other BIOS and still use the card. I believe you can also un-brick the bad/corrupt BIOS by powering up your PC with the integrated graphics (if you CPU has it) and re-flashing the bad BIOS with the original.


I'd like to see proof of this claim.

Also, just in case some have not yet tried it: you can flash a bricked BIOS position without an iGPU. As long as the other BIOS position is clean/working fine, you have nothing to fear.

If one BIOS position gets a bad flash and there's no screen, just turn off the PC, flip the BIOS switch to the other side, power on your PC, and press DEL to go into the motherboard BIOS. Upon POST, or once you are in the motherboard BIOS screen, flick the switch back to the bad BIOS' side, then flash again. The video won't even flicker.

Alternatively, you can go straight to the ATIFlash DOS window before flipping the switch, so you can flash.


----------



## sinnedone

Quote:


> Originally Posted by *ZealotKi11er*
> 
> My Second card stays 0% for me and this only happens when CFX is off or disables in the drivers for the game.


Make sure you're not in borderless mode.

The game used the GPUs strangely for me. Usage was up and down for no apparent reason.


----------



## ZealotKi11er

Quote:


> Originally Posted by *sinnedone*
> 
> Make sure you're not in borderless mode.
> 
> Game used the gpus strangely for me. Usage was up and down for no apparent reason.


Always have Full Screen.


----------



## sinnedone

Yeah, odd then. I know on mine both GPUs were only about half used and their usage was constantly up and down. I wouldn't say it ran great, but with everything on ultra at 1440p it ran around 65-90 fps. The odd thing is neither the GPUs nor the CPU saw full usage.


----------



## MEC-777

Quote:


> Originally Posted by *mus1mus*
> 
> I'd like to see proofs of this claim.
> 
> Also, just in case some have not yet try, you can flash to a bricked BIOS location without an IGPU. As long as you have the other BIOS position clean/working fine, you should not fear.
> 
> If one BIOS position gets into a bad flash, no screen, just turn off the PC, flip the BIOS switch to the other side, power on you PC while pressing DEL or go into the MOBO - BIOS, upon post or once you are in the Mobo BIOS screen, flick the switch to the bad BIOS' side, then flash again. Video won't even flick. :Thumb:
> 
> Alternatively, you can go straight to the ATIFlash dos window before flipping the switch so you can flash.


Proof of which claim?

I didn't know you could flip the BIOS switch with the card powered on. Learn something new everyday.









Quote:


> Originally Posted by *ZealotKi11er*
> 
> *My Second card stays 0%* for me and this *only happens when CFX is off or disables in the drivers* for the game.


If CFX is off/disabled, then the game will only use one card.


----------



## By-Tor

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Anyone with CF playing SW BF?


The game seems to play fine in CF, but usage is very erratic and I get heavy flickering between rounds.

One card is fine and plays great.


----------



## ZealotKi11er

Quote:


> Originally Posted by *MEC-777*
> 
> Proof of which claim?
> 
> I didn't know you could flip the BIOS switch with the card powered on. Learn something new everyday.
> 
> 
> 
> 
> 
> 
> 
> 
> If CFX is off/disabled, then the game will only use one card.


It works fine in other games. I tried to force CFX and then it turned on, but it's not stable; CPU usage goes to 95%. To me it looks like CFX is disabled for this game by AMD in 15.9.1.


----------



## bigpoppapump

Quote:


> Originally Posted by *rdr09*
> 
> everything looks normal except your memory clock. could someone else read and double check? does it only go to 1000MHz?
> 
> what's should be the default? it's 1250, right?
> 
> edit: i think even your core clock is low. did you underclock them?
> 
> Your load voltage is also low. that may cause blackscreen.


Quote:


> Originally Posted by *MEC-777*
> 
> Have you underclocked your card? Stock for most 290's is 947 core and 1250 memory. Yours is running at 900/1000, so something is going on there. Is this a 290 or 290x? Which brand/model?


Yeah, I underclocked it while I was troubleshooting. VisionTek R9 290, no special name. It's at 1000 core, 1200 memory, +50mV core now, and it seems stable. What's the safe max voltage?


----------



## rdr09

Quote:


> Originally Posted by *bigpoppapump*
> 
> Yeah I underclocked it while I was troubleshooting. VisionTek R9 290, no special name. It's at 1000 core 1200 memory +50mV core now and it seems stable. What's the safe max voltage?


That explains it. My 290s at load go up to 1.2V at the stock 947MHz; I see yours goes to 1.1V. With +50mV... check and see how high it goes when loaded.

Edit: when I was benching and just using Trixx, adding +200mV, I saw my load voltage go as high as 1.41V. Of course, I would not use that 24/7 even if my temps were manageable.

I'd say 1.2V load, as long as your temps are all below 80C, is good.
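That rule of thumb can be written as a tiny, hypothetical helper; the 1.2V / 80C limits here are rdr09's numbers from this thread, not an official AMD specification:

```python
# Hypothetical sanity check for 24/7 settings, encoding the rule of thumb
# above: ~1.2 V under load is OK for daily use as long as core temps
# stay below 80 C. Community guidance, not an AMD limit.
def looks_safe_24_7(load_voltage_v: float, core_temp_c: float) -> bool:
    return load_voltage_v <= 1.2 and core_temp_c < 80.0

print(looks_safe_24_7(1.18, 77))  # the +50mV example above -> True
print(looks_safe_24_7(1.41, 70))  # bench-only voltage -> False
```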


----------



## mfknjadagr8

Quote:


> Originally Posted by *rdr09*
> 
> that explains it. my 290s at load go up to 1.2v at stock of 947MHz. i see yours goes to 1.1v. with +50mV . . . check and see how high it goes when loaded.
> 
> edit: when i was benching and just using Trixx, adding +200mv . . . i saw my load voltage go as high as 1.41v. of course, i would not use that 7/24 even if my temps are manageable.
> 
> i say 1.2v load as long as your temps are all below 80C . . . is good.


Both of my cards run 1.24V under load at stock settings, no mV added, with or without the power limit increased... go figure. And my temps don't even know what 60 looks like anymore.


----------



## rdr09

Quote:


> Originally Posted by *mfknjadagr8*
> 
> both of my cards run 1.24v under load at stock settings no mv added...with or without the power limit increased... go figure and my temps dont even know what 60 looks like anymore


That's about right. These software apps, I don't think, are really that accurate, but they can be used as a reference with enough readings.

My first card only goes to 1.17V at load at 947MHz; it does 1170MHz in benches before needing extra VDDC. My second goes to 1.2V and can bench to 1150MHz at that voltage. The ASICs are pretty close, if that matters: 79/81.


----------



## weetbix07

Quote:


> Originally Posted by *rdr09*
> 
> that and disable and enable crossfire just before running the bench.
> you can compare the two cards. disable crossfire, then shutdown the pc. move the cable from the primary card to the card at X4. turn on pc and test the secondary 290 with a bunch benchies from futuremark and unigine.
> 
> shutdown pc and switch back the cable to the primary card and run the same benchies, then compare results. just run the benchies first on the primary card.
> 
> i would go as far as unplugging power from the idle gpu. do not forget to plug any of the cables. take your time.


Thanks for the help. Disabling and re-enabling CF helped, as did running in fullscreen. I could actually see one card heating up 10 degrees higher than the other, and I realised I had plumbed my GPUs in parallel incorrectly. I have just finished draining the loop and am fixing it up now to test again.

Thanks again!


----------



## weetbix07

Quote:


> Originally Posted by *buttface420*
> 
> anyone recommend a standalone backplate for reference 290x? is there one? i got the hg10 on it and would love to add a backplate


I bought the XSPC ones and I quite like them:

http://www.xs-pc.com/waterblocks-gpu/razor-r9-290x-290-backplate

Here is a shot of mine:


----------



## Arizonian

Quote:


> Originally Posted by *weetbix07*
> 
> I bought the XSPC ones and I quite like them:
> 
> http://www.xs-pc.com/waterblocks-gpu/razor-r9-290x-290-backplate
> 
> Here is a shot of mine:
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## weetbix07

Quote:


> Originally Posted by *Arizonian*
> 
> Conrats - added


Cheers and thanks!


----------



## juno12

I've owned my DCUII for almost 2 years, but recently I watercooled it.
Temps dropped from 80-90C to only 60C while playing BF4.









http://www.techpowerup.com/gpuz/details.php?id=eyu47

Manufacturer: asus
Model/Series: R9 290X DCU ii oc
Cooling: ek waterblock


----------



## Liranan

That's why I got this Tri-X, so I wouldn't need to watercool it


----------



## megax05

This Tri-X heatsink is connected to the card's I/O, which is what makes the case so hot.
Anyone have the same issue?


----------



## buttface420

Quote:


> Originally Posted by *weetbix07*
> 
> I bought the XSPC ones and I quite like them:
> 
> http://www.xs-pc.com/waterblocks-gpu/razor-r9-290x-290-backplate
> 
> Here is a shot of mine:


Nice! But I wonder if they would work without the waterblock. I'm wondering if the screws with the XSPC backplate would be long enough (or too long) for my GPU. The HG10 bracket I have came with longer screws than the original GPU screws, but I'd guess they'd have to be a tad longer still to add the backplate.


----------



## fyzzz

My 290 never fails to surprise me








http://www.3dmark.com/3dm/8863511?


----------



## gupsterg

That's a super 290 mate!!

Platinum!!


----------



## fyzzz

Quote:


> Originally Posted by *gupsterg*
> 
> That's a super 290 mate!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Platinum!!


Thanks! I wouldn't have been able to bring this 290 this far without your help and your thread about BIOS modding. I also got my hands on another 390 BIOS, which allowed me to push to 1260+ without blackscreen.


----------



## bigpoppapump

Quote:


> Originally Posted by *rdr09*
> 
> that explains it. my 290s at load go up to 1.2v at stock of 947MHz. i see yours goes to 1.1v. with +50mV . . . check and see how high it goes when loaded.
> 
> edit: when i was benching and just using Trixx, adding +200mv . . . i saw my load voltage go as high as 1.41v. of course, i would not use that 7/24 even if my temps are manageable.
> 
> i say 1.2v load as long as your temps are all below 80C . . . is good.


+50mV was putting it at 1.18V at peak; +100mV does 1.22V and 77C under load. 1100 core was unstable, so I backed it down to 1000 and it's running smooth.

The card won't blink at Firestrike or Valley, but War Thunder at low settings will find instability in about five minutes.









Thanks for all the help.


----------



## diggiddi

Quote:


> Originally Posted by *fyzzz*
> 
> My 290 never fails to surprise me
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/8863511?


Whoa, 15k! You must be on water, huh?


----------



## sinnedone

Quote:


> Originally Posted by *fyzzz*
> 
> My 290 never fails to surprise me
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/8863511?


Still on the 390x bios?


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> Thanks! I wouldn't been able to bring this 290 this far without your help and your thread about bios modding. I also got my hands on another 390 bios, which allowed me to push to 1260+ without blackscreen.


Nice work to both of you!

Quote:


> Originally Posted by *bigpoppapump*
> 
> +50mV was putting it at 1.18v at peak, +100mV does 1.22v and 77c under load. 1100 core was unstable so I backed it down to 1000 and it's running smooth.
> 
> The card won't blink at Firestrike or Valley but War Thunder at low settings will find instability in about five minutes.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks for all the help.


Have you raised the power limit? Max it if not. What are the VRM temps?


----------



## Liranan

Quote:


> Originally Posted by *fyzzz*
> 
> My 290 never fails to surprise me
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/8863511?


That is amazing. What is the bandwidth at those speeds?


----------



## megax05

That is one hell of a card, a Kingpin R9 290.


----------



## bigpoppapump

Quote:


> Originally Posted by *rdr09*
> 
> You raised the power limit? max it if not. what are the vrm temps?


Yeah, +50%. HWiNFO says 49C max, but I don't know if I believe that.

1000 turned out not to be stable either; I'm back down to 950. My card's BIOS is of Jan 2014 vintage; would flashing it potentially improve things?


----------



## fyzzz

Quote:


> Originally Posted by *diggiddi*
> 
> Whoa 15k ,you must be on water huh


Yes, water and very cold air; the card maxed out at like 30C. It's also on a custom 390 BIOS.
Quote:


> Originally Posted by *sinnedone*
> 
> Still on the 390x bios?


No, it didn't allow me to go above 1240MHz without blackscreen, but I got my hands on a non-X 390 BIOS, which I modded, and it allowed me to go 1260+ without blackscreen.


----------



## rdr09

Quote:


> Originally Posted by *bigpoppapump*
> 
> Yeah, +50%. HWINFO says 49c max but I don't know if I believe that.
> 
> 1000 turned out to not be stable either, back down to 950. My card's BIOS are Jan 2014 vintage, would flashing it potentially improve things?


If there is a newer BIOS and the current one is giving you issues, then I suggest you flash it. Save the original BIOS to a separate drive using GPU-Z first.

You can try fyzzz's BIOS.


----------



## juno12

Quote:


> Originally Posted by *Liranan*
> 
> That's why I got this Tri-X, so I wouldn't need to watercool it


what are your max temps??


----------



## gupsterg

Quote:


> Originally Posted by *fyzzz*
> 
> Thanks! I wouldn't been able to bring this 290 this far without your help and your thread about bios modding. I also got my hands on another 390 bios, which allowed me to push to 1260+ without blackscreen.


Glad to read the information is helping you, but gotta say it's all your work at using it!

It was so great when many were involved in taking apart the BIOS to see how to manipulate it.

Which ROM are you using now? What mods did you do?

At times now I wish I had a ref-PCB 290/290X to play with; I should never have sold the Tri-X 290 I had, as it OC'd to 1100MHz with just +25mV. At the time I had no idea about BIOS manipulation, and I think that card would have rocked with some differing BIOSes.
Quote:


> Originally Posted by *bigpoppapump*
> 
> +50mV was putting it at 1.18v at peak, +100mV does 1.22v and 77c under load. 1100 core was unstable so I backed it down to 1000 and it's running smooth.


What's your ASIC quality?

If you're hitting a lower voltage with the same offset as @rdr09's 290, yours should be a higher ASIC quality.

Simply put, ASIC quality = Leakage ID; a higher Leakage ID means a lower GPU voltage.


----------



## fyzzz

Quote:


> Originally Posted by *gupsterg*
> 
> Glad to read the information is helping you!
> 
> 
> 
> 
> 
> 
> 
> , but gotta say it's all your work at using it!
> 
> 
> 
> 
> 
> 
> 
> .
> 
> It was so great when many were involved in taking apart the bios to see how to manipulate it.
> 
> Which rom you using now? what mods did you do?


I'm using a 390 non-X BIOS that I got from someone in the 390X/390 owners club. I have changed the timings and the voltage table, and made it support 4GB. My card can do ~1110MHz without added voltage.


----------



## gupsterg

Cheers.

So you're using this one.

Did you just change the timings, or did you also mod the 290 RAM IC info into the ROM, i.e. the whole vram info table, then mod the timings?


----------



## megax05

Dang, I have no idea about editing BIOSes. Is there any chance you could share yours?


----------



## fyzzz

Quote:


> Originally Posted by *gupsterg*
> 
> Cheers
> 
> 
> 
> 
> 
> 
> 
> .
> 
> So you're using this one .
> 
> Did you just change timings or mod 290 RAM IC info as well into rom? ie whole vram info table then mod timings.


Yep, using that BIOS. I just modded in the timings; I left the vram info alone. I'm using the 290's 1126-1250 timings up through the 1376-1500 strap, and its 1251-1375 timings for the last two straps.


----------



## gupsterg

I've also found the 1126-1250 timings in the 1376-1500 strap good over long-term gaming / OC use.

It's been a while since I meddled with my GPU, but I will be getting back to it and will try the 1251-1375 timings in the higher RAM frequency straps.

So many thanks for the share.

Are you using the 390 stock timings or the 290 stock timings in the 390 ROM?


----------



## fyzzz

Quote:


> Originally Posted by *gupsterg*
> 
> I've also found 1126-1250 timings in 1376-1500 strap good over long term gaming / OC use.
> 
> Been a while since I meddled with my GPU but will be getting back to it and will try 1251-1375 timings in the higher RAM frequency straps.
> 
> So many thanks for share
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Are you using the 390 stock timings or 290 stock timings in 390 rom?


I am using 290 stock timings.


----------



## sinnedone

What's your max voltage at stock on the 390 BIOS?

I think my 290s are 1.21V.


----------



## fyzzz

Quote:


> Originally Posted by *sinnedone*
> 
> What's your max voltage at stock for the 390 bios?
> 
> I think my 290's are 1.21v


I had to increase the voltage in the Hawaii BIOS, otherwise the voltage was too low. I put 1250 in DPM7 and that gave about ~1.41V+ under load, which is stock voltage for me when I run the 290 BIOS.


----------



## Liranan

Quote:


> Originally Posted by *juno12*
> 
> what are your max temps??


At 1100/1350 (I haven't really started testing the limits of this card as the RAM has some problems beyond 1400) I hit 80C running Heaven.

I haven't tested Firestrike because I haven't downloaded it yet, I only downloaded Heaven to test thermal limits.

Edit: Firestrike is the same. Max temperature is 79.


----------



## Gumbi

Quote:


> Originally Posted by *Liranan*
> 
> At 1100/1350 (I haven't really started testing the limits of this card as the RAM has some problems beyond 1400) I hit 80C running Heaven.
> 
> I haven't tested Firestrike because I haven't downloaded it yet, I only downloaded Heaven to test thermal limits.
> 
> Edit: Firestrike is the same. Max temperature is 79.


RAM will overclock further the more voltage you put through it (core voltage).


----------



## Gumbi

A 290X Vapor-X 8GB is incoming to replace the poorly clocking 290 Vapor-X OC I got back on RMA! Feeling good about it.


----------



## gupsterg

Quote:


> Originally Posted by *Liranan*
> 
> At 1100/1350 (I haven't really started testing the limits of this card as the RAM has some problems beyond 1400) I hit 80C running Heaven.
> 
> I haven't tested Firestrike because I haven't downloaded it yet, I only downloaded Heaven to test thermal limits.
> 
> Edit: Firestrike is the same. Max temperature is 79.


Quote:


> Originally Posted by *Gumbi*
> 
> RAM will over clock more the more volts you put through it. (Core voltage).


Core voltage is not the same as RAM voltage.

Core voltage is just helping the GPU/IMC along to use higher-clocked RAM, *if* the RAM can do that clock.

There's some info by the Stilt in this post; I've not found MVDD within the ROM, or a way to edit the ROM to raise it.

Perhaps a little upping of VDDCI may help; there's some info in the Hawaii BIOS editing thread, under the heading Memory Interface / I/O bus voltage (VDDCI) (between memory and GPU core) and frequency editing, and IIRC the Aux voltage in MSI AB is the same voltage.

Perhaps Liranan has Elpida, which tends not to clock as well as Hynix.

Another thing that keeps cropping up whenever I search for the best ways to OC a 290/X is mention of the ratio between GPU and RAM clock.
Quote:


> This ratio varies from card to card but is generally between 1.25 and 1.4


The above quote is from this blog, under the heading The VRAM. I've read other posts where, again, people stated they couldn't get a given RAM clock stable, then tried another GPU/RAM clock ratio and it worked.
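Assuming the blog's "ratio" means memory clock divided by core clock (the reference 290's 1250/947 ≈ 1.32 fits that reading), here is a quick sketch that checks whether a given clock pair lands in the quoted 1.25-1.4 window. The helper names are hypothetical and the interpretation of the ratio is my assumption, not the blog's wording:

```python
# Hedged sketch: assume the "GPU/RAM clock ratio" from the blog means
# memory clock / core clock, and the sweet spot is roughly 1.25-1.4.
def mem_core_ratio(core_mhz: int, mem_mhz: int) -> float:
    return mem_mhz / core_mhz

def in_sweet_spot(core_mhz: int, mem_mhz: int) -> bool:
    return 1.25 <= mem_core_ratio(core_mhz, mem_mhz) <= 1.4

print(mem_core_ratio(947, 1250))   # reference 290 clocks, ~1.32
print(in_sweet_spot(947, 1250))    # True
print(in_sweet_spot(1100, 1350))   # 1350/1100 ~ 1.23, just below the range
```

If a RAM clock won't stabilize, the idea is to try a core clock that moves the pair back inside the window, rather than only chasing the memory clock.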


----------



## Gumbi

Quote:


> Originally Posted by *gupsterg*
> 
> Core voltage not equal RAM voltage
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Core voltage is just helping GPU/IMC along to use higher clocked RAM *if* RAM can do that clock.
> 
> Some info by the Stilt in this post, not searched within rom or found a way to edit rom to up MVDD
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Perhaps a little upping of VDDCI may help, some info in Hawaii bios editing thread, heading Memory Interface / I/O bus voltage (VDDCI) (between memory and GPU core) and frequency editing and IIRC Aux voltage in MSI AB is same voltage.
> 
> Perhaps Liranan has Elpida which tends to not clock as well as Hynix.
> 
> Another thing that keeps cropping up whenever I search about best ways to OC 290/X there is mention of ratio between GPU/RAM clock.
> Above quote from this blog heading The VRAM. Read other bits where again people stated can't get x RAM clock stable and then they try another ratio of GPU/RAM clocking and it works.


My 290 Vapor-X OC could only do 1450MHz at stock voltage (+25mV). At +100mV it could do 1650MHz. Upping the core voltage helped an awful lot with the memory...


----------



## gupsterg

I don't doubt it helped, and I'm in no way refuting your experience.
Quote:


> Originally Posted by *Gumbi*
> 
> RAM will over clock more the more volts you put through it. (Core voltage).


When I read this post, it comes across as you stating that core voltage is RAM voltage, which, as far as everything I've read states, it is not.


----------



## Gumbi

All I am saying is that core voltage helps RAM clocks.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Gumbi*
> 
> All I am saying is that core voltage helps ram clocks


aux voltage ftw


----------



## Gumbi

Do you find aux voltage helps RAM clocks on the 290? I never looked into it much, but from my experience it didn't seem to do much.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Gumbi*
> 
> Do you find aux voltage helps ram clocks on 290? I never looked into it much but from my experience they didn't seem to do too much.


It helps my second card, which clocks horribly... on the first card it doesn't seem to help much.


----------



## gupsterg

Quote:


> Originally Posted by *mfknjadagr8*
> 
> it helps my second card which clocks horribly...the first card it doesn't seem to help much


How much do you use, and what are your clocks, if you don't mind? The RAM type would also be handy to know.

For Aux voltage, the Stilt advised no more than 1.10V, as this part is sensitive to voltage increases. He also stated there's no real point in playing with it, but again, it could help someone really trying to eke every last bit out of a card.

And as always, every card can react differently, i.e. in some cases it helps, in others it has no effect.

I have noticed in screenies of the 390 / 390X that by default aux voltage / VDDCI can be around the 1.05V mark, so it must be helping stabilize the higher default RAM clocks.


----------



## mfknjadagr8

Quote:


> Originally Posted by *gupsterg*
> 
> How much do you use and what is your clocks? if you don't mind
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ram type be also handy to know.
> 
> For Aux voltage the Stilt advised no more than 1.10v as this part is sensitive to voltage increase, also stated no real point in playing with, but again could help someone really try to eek every last bit out of a card.
> 
> And as always every card can react differently. ie some cases it helps others no effect
> 
> I have noticed in screenies of 390 / 390X that by default aux voltage / VDDCI can be around the 1.05v mark, so must be helping stabilize the higher default RAM clocks.


I haven't broken 1.0, but I only needed it because both of my cards have Elpida memory, and the bottom one won't pass 1300 without a voltage increase (and won't pass 1100 without a core voltage increase). I'm sitting at 1200/1400 for my 3D clocks on the stock BIOS, with temps never breaking 65C, but the second card is holding the first back, as I run them at the same clocks. Power limit maxed, +50mV vcore; it's the best I've done so far. Under load the cards hit 1.29V core.


----------



## gupsterg

Thank you for the reply.

When you say you haven't broken 1.0 do you mean you have kept AUX voltage below say 1.009v?

As you *can't* clock RAM frequency high have you considered your stock bios with tightened RAM timings?

For example when I did a test with same GPU frequency but tight RAM timings for say 1375MHz that was as fast as 1500MHz with stock RAM timings.

Besides modifying RAM timings, another thing that occurs is there are RAM straps.

- 1001-1125MHz
- 1126-1250MHz
- 1251-1375MHz
- 1376-1500MHz
- 1501-1625MHz
- 1626-1750MHz

Each strap has it's own set of timings, each higher strap has looser timings but higher clocking per strap can make up for slack timings (as long as you at least past the mid point of a strap IMO).

1375MHz should be very close to 1400MHz in benches. Why do I say that?

When you go past 1375MHz you're in the next strap, and 1400MHz is near the beginning of that strap, so you're not at the most efficient clock for those timings. Whereas 1375MHz is at the end of its strap, and thus is the most efficient clock for the timings set within it.

The other nugget of info that the Stilt has posted on OCN is regarding EDC (ECC). Link: http://www.overclock.net/t/1561904/mlu-bios-builds-for-290x/10#post_24120051

So let's say you are getting error corrections at 1400MHz which you're not seeing as artifacts; 1375MHz may then be the best balance of clock / timings without errors.
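For anyone who wants to play with the numbers, the strap logic above can be sketched in Python. The boundaries are exactly as listed in the post; the efficiency figure is just my own way of expressing "how close to the top of its strap a clock sits", purely illustrative:

```python
# Hawaii memory strap boundaries (MHz) as listed above.
# Each strap's timings apply to any clock inside its range, so a clock
# near the top of a strap gets the most out of that strap's timings.
STRAPS = [(1001, 1125), (1126, 1250), (1251, 1375),
          (1376, 1500), (1501, 1625), (1626, 1750)]

def strap_for(clock_mhz):
    """Return the (low, high) strap a memory clock falls into, or None."""
    for low, high in STRAPS:
        if low <= clock_mhz <= high:
            return (low, high)
    return None

def strap_efficiency(clock_mhz):
    """Fraction of the strap's range used: ~1.0 means you're at the top
    of the strap, i.e. making the most of its timings."""
    strap = strap_for(clock_mhz)
    if strap is None:
        return None
    low, high = strap
    return (clock_mhz - low) / (high - low)

# 1375MHz sits at the very top of its strap; 1400MHz is near the
# bottom of the next (looser) strap.
print(strap_for(1375), round(strap_efficiency(1375), 2))  # (1251, 1375) 1.0
print(strap_for(1400), round(strap_efficiency(1400), 2))  # (1376, 1500) 0.19
```

Which is exactly why 1400MHz on the looser strap can end up no faster than 1375MHz in benches.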


----------



## mfknjadagr8

Quote:


> Originally Posted by *gupsterg*
> 
> Thank you for the reply.
> 
> When you say you haven't broken 1.0, do you mean you have kept AUX voltage below say 1.009v?
> 
> As you *can't* clock RAM frequency high, have you considered keeping your stock BIOS with tightened RAM timings?
> 
> For example, in a test at the same GPU frequency, 1375MHz with tight RAM timings was as fast as 1500MHz with stock RAM timings.
> 
> Besides modifying RAM timings, another factor is the RAM straps:
> 
> - 1001-1125MHz
> - 1126-1250MHz
> - 1251-1375MHz
> - 1376-1500MHz
> - 1501-1625MHz
> - 1626-1750MHz
> 
> Each strap has its own set of timings; each higher strap has looser timings, but higher clocks within a strap can make up for the slack (as long as you're at least past the mid point of the strap, IMO).
> 
> 1375MHz should be very close to 1400MHz in benches. Why do I say that?
> 
> When you go past 1375MHz you're in the next strap, and 1400MHz is near the beginning of that strap, so you're not at the most efficient clock for those timings. Whereas 1375MHz is at the end of its strap, and thus is the most efficient clock for the timings set within it.
> 
> The other nugget of info that the Stilt has posted on OCN is regarding EDC (ECC). Link: http://www.overclock.net/t/1561904/mlu-bios-builds-for-290x/10#post_24120051
> 
> So let's say you are getting error corrections at 1400MHz which you're not seeing as artifacts; 1375MHz may then be the best balance of clock / timings without errors.


That's correct, and I've yet to try higher than that, mostly because my second card isn't a very good clocker, but also because I'm still at 1080p 60Hz, so other than running a bench (which I don't compete in) I don't need higher. Also, I have considered this, but I've never modded a BIOS before, so I'm a little leery of it. I'd like to try it eventually though; you have me wondering what I could get within the clocks I'm currently running.


----------



## gupsterg

Attach your ROMs to a post in the thread; I will change the timings and upload when I find time to do the edit for you.


----------



## VulpesInculta

My recently obtained R9 290 PCS+'s fans started to rattle at even 30% fan speed, and they sound pretty loud above 50%. Due to how the shops here work, I doubt I will be able to send it back, since it works perfectly fine otherwise, so I want to ask if anyone here has experience with aftermarket coolers for this card. I was thinking of buying one, air or maybe water cooling, but I have no idea how to install it or which one would be compatible with the card. I am sorry if I am asking in the wrong place.


----------



## Gumbi

Quote:


> Originally Posted by *VulpesInculta*
> 
> My recently obtained r9 290 pcs+ fans started to rattle at even 30% fan speed and sound pretty loud when above 50%. Due to how the shops here work, i doubt i will be able to send it back or anything since it works perfectly fine otherwise, so i want to ask if anyone here has any experience with aftermarket coolers for that card? I was thinking of buying one, air or maybe some water cooling one but i have no idea how to install it and which one would be compatible with the card. I am sorry if i am asking in the wrong place.


If you bought it online in the EU you can return it for any reason for a full refund within 2 weeks.


----------



## VulpesInculta

Quote:


> Originally Posted by *Gumbi*
> 
> If you bought it online in the EU you can return it for any reason for a full refund within 2 weeks.


I bought it from a shop in my country (Bulgaria), and that was in August; there was some time to return the product, but it's over now.


----------



## Gumbi

Quote:


> Originally Posted by *VulpesInculta*
> 
> I bought it from a shop in my country (Bulgaria), and that was in August; there was some time to return the product, but it's over now.


Why not talk to the manufacturer directly?


----------



## VulpesInculta

Quote:


> Originally Posted by *Gumbi*
> 
> Why not talk to the manufacturer directly?


Do you think this would work, since I didn't buy the card from them?


----------



## VulpesInculta

Also, I mostly asked here because I am actually fine with getting a good cooler instead of waiting a long time; I was hoping that someone might help me with choosing and installing one.


----------



## Gumbi

Of course, I just did an RMA through Sapphire as the shop I bought the card from went bankrupt.


----------



## VulpesInculta

Quote:


> Originally Posted by *Gumbi*
> 
> Of course, I just did an RMA through Sapphire as the shop I bought the card from went bankrupt.


I was told that they don't usually deal with stuff that works though, like issues with fan rattling or coil whine.


----------



## Gumbi

Quote:


> Originally Posted by *VulpesInculta*
> 
> I was told that they don't usually deal with stuff that works though, like issues with fan rattling or coil whine.


No harm trying... And I wouldn't be happy with a fan rattle myself, the product isn't living up to its claims of silent cooling...


----------



## mfknjadagr8

Quote:


> Originally Posted by *gupsterg*
> 
> Attach your ROMs to a post in the thread; I will change the timings and upload when I find time to do the edit for you.


Cool, thanks man... attached are both the ROM I dumped from GPU-Z and the full download from TechPowerUp that matches the version shown in GPU-Z. I will do one card only, see how it works, then do the other; they have the same BIOS version.

Hawaiidumped.zip 43k .zip file


techpowerupbios.zip 99k .zip file


----------



## fat4l

Anyone experiencing black screens when adding more voltage to the core?
On my card, if I add more than +200mV to the core, black screens under load start to appear. This happens only if the power target is set above 0%.
Below +200mV, even with +50% power target, no black screens. Tried with OC'd RAM and core and non-OC'd as well. Temps are fine, 40-50C @ load.


----------



## Gumbi

Got my second-hand 290X Vapor-X 8GB on my country's Craigslist equivalent today.
I RMAed my original Vapor-X 290 OC due to a dodgy fan connector. I received a replacement but wasn't happy with how it overclocked at all; it was very poor compared to my original card (a very good clocker: 1200/1650 at 100mv, whereas this would barely do 1150/1500 at 100mv). So I am selling that card, and I saw a great deal on the 290X 8GB. More RAM and slightly more performance per clock for ~60 euro (I will make back the rest in the sale of the current card). SURE I'LL TAKE IT.

As it turns out it's a great clocker! I haven't tested the memory yet, but the core can do 1130 on stock voltage (25mv), fine for 10 min in Heaven so far! My original card topped out at 1140 on stock voltage (1135 in Crysis 3, though, as 1140 would show some very minor artefacts after a while), so I'm feeling good about this one!

Gonna come hunting for 'dem 390 Heaven scores! My other card (290) did 1600 at 1080p/tess/AA with only 2 working fans


----------



## megax05

I was testing my cards to see what the limits on those beasts are, and they didn't disappoint me, but didn't impress me either. The best I managed to get was 1220/1650 on both cards at +200mv and +50 PL, but the temps were out of control, so I returned my clocks to stock with -25mv until I get a new case. I am going to get the Phanteks Enthoo Pro because of the decent airflow, and I am not planning on water cooling soon.

I will update when I get the new case, but I still don't want to regret my choice of case, so any advice will be helpful. I want to decide between:
Phanteks Enthoo Pro / Corsair Obsidian 450 / be quiet! 800.


----------



## MEC-777

Quote:


> Originally Posted by *megax05*
> 
> I was testing my cards to see what the limits on those beasts are, and they didn't disappoint me, but didn't impress me either. The best I managed to get was 1220/1650 on both cards at +200mv and +50 PL, but the temps were out of control, so I returned my clocks to stock with -25mv until I get a new case. I am going to get the Phanteks Enthoo Pro because of the decent airflow, and I am not planning on water cooling soon.
> 
> I will update when I get the new case, but I still don't want to regret my choice of case, so any advice will be helpful. I want to decide between:
> Phanteks Enthoo Pro / Corsair Obsidian 450 / be quiet! 800.


What about the Fractal Define S? I think if I were to change to another case, that would be it. I just love the design, which allows for tons of airflow and room for water cooling (whichever you want to do) without the drive cages in the front.


----------



## ZealotKi11er

Quote:


> Originally Posted by *fat4l*
> 
> Anyone experiencing blackscreening if adding more voltage to the core ?
> On my card, if I add more than +200mV to the core, blackscreening under load starts to appear. This happens only if powertarget is set for more than 0%.
> Below +200mV, even with +50% power target, no blackscreening. Tried with oced ram and core and non oced as well. Temps are fine, 40-50C @load.


Is it the ARES? If yes then maybe you are pulling too much power from PCIE.


----------



## serave

Just bought this for $240; it's actually pretty good. The fans are quiet, and temps hover around the 62-65°C mark.

It plays FC4, Saints Row IV and Crysis 3 just fine @High settings, maybe a little bit lower in Crysis 3, but this unit has been amazing for me so far.

Considering a 970 costs around $400 and upwards here, I don't think this one is a bad purchase; the performance gap between them isn't that big to my knowledge, though I could be wrong.

Running it with a SuperFlower 500-watt unit.


----------



## megax05

Quote:


> Originally Posted by *serave*
> 
> just bought this for $240, its actually pretty good, fans were quiet, temps would hover around the 62-65°C mark
> 
> plays FC4, Saints Row IV and Crysis 3 just fine @High settings, maybe a liittle bit lower at Crysis 3, but this unit has been amazing for me so far
> 
> considering 970 costs around $400 and upwards here, i don't think this one is a bad purchase the performance gap between them isn't that big to my knowledge, i could be wrong though
> 
> running it with SuperFlower 500watts unit


Actually it's on par with the GTX 970, and with the right OC you will pass the GTX 970 and get the performance of the R9 390.

Just an update on my case: I found a nice deal on a Corsair Vengeance C70 for around $95 brand new, local pickup, no shipping. I decided to bite the bullet after reading a few reviews; I liked the ability to house 320mm cards without removing the drive caddy, which I will remove anyway for better airflow, and I will fit my SSD + HDD in a 5.25" caddy adapter.

I will update with some pics when I am done with the build.
I was leaning toward the Phanteks Enthoo Pro but would have to wait a month for it to get to my country, and the price tag on the C70 was tempting, so I hope I will not regret this decision.


----------



## joeh4384

Quote:


> Originally Posted by *serave*
> 
> just bought this for $240, its actually pretty good, fans were quiet, temps would hover around the 62-65°C mark
> 
> plays FC4, Saints Row IV and Crysis 3 just fine @High settings, maybe a liittle bit lower at Crysis 3, but this unit has been amazing for me so far
> 
> considering 970 costs around $400 and upwards here, i don't think this one is a bad purchase the performance gap between them isn't that big to my knowledge, i could be wrong though
> 
> running it with SuperFlower 500watts unit


290 or 290x? Nice card, the Tri-X is probably the best mainstream cooler for the 290s.


----------



## Gumbi

Is it the Tri-X new edition? You have an AMD CPU in that system? If it's overclocked you'd want to be careful not to pump too many volts into that Tri-X, as your PSU won't be up for it; it's a quality unit though, so you should be fine.


----------



## smithydan

Guys, just the other day R9 290s and 290Xs were low in price, and this was before the 300 series launch; now they have gone up, both used and new. Can anyone tell me why the increase? Am I missing something? Thanks.


----------



## Gumbi

Quote:


> Originally Posted by *smithydan*
> 
> Guys just the other day r9 290s and 290Xs were low in prices and this was before the 300 series launch, now they have gone up both used and new. Can anyone tell me why the increase? Am I missing something? Thanks.


Probably due to the fact that most of the 290 lines are EOL and there isn't much supply available.


----------



## ghabhaducha

Perhaps because 290/290x are almost exactly the same as 390/390x sans the extra 4GB of memory and a better memory controller? Coupled with AMD Hawaii's competitive performance against GK110 and GM204 in newer titles, people are turning to secondhand markets to grab some 290/290x's (what I would personally do) for more frames without selling kidneys.


----------



## MEC-777

Quote:


> Originally Posted by *smithydan*
> 
> Guys just the other day r9 290s and 290Xs were low in prices and this was before the 300 series launch, now they have gone up both used and new. Can anyone tell me why the increase? Am I missing something? Thanks.


People know the value of the 290/Xs. Instead of buying new 390/Xs for significantly more money, it makes more sense (to me and many others) to buy a used/sale-priced 290/X and OC it slightly to match the 390/X. Desire and demand have gone up, thus prices have gone up.


----------



## gupsterg

Quote:


> Originally Posted by *serave*


Quote:


> Originally Posted by *Gumbi*
> 
> Is it the Trix new edition?


Nope, it has 8+6 pin.

The new one has 8+8 pin.

Still a great card, serave; I do like the orange/black of the Tri-X cooler.


----------



## megax05

I got my 2x R9 290 Tri-X OC for less than $200 each, and I can match or even pass the performance of a GTX 980 Ti for 65% of the cost.
And prices are going up, I think, because the 390/390X didn't bring that much of a performance gain, and it can be had for free with some OCing and BIOS modding, which creates more demand for the 290/290X while supply is low. However, if someone is going to water cool the cards, the reference cards are still priced, used, for less than what others are paying for GTX 960 and R9 380 4GB cards.


----------



## serave

Quote:


> Originally Posted by *megax05*
> 
> Actually its on par with the gtx 970 and with the right OC you will pass the GTX 970 and get the performance of the r9 390.
> 
> Just an update for my case I found a nice deal on Corsair vengeance c70 for around 95$ brand new and local pickup no shipping or so , I decided to bite a bullet after reading few reviews and liked the ability to house a 320mm cards without removing the drive caddy wich i will remove anyway for better airflow and will fit my SSD + HDD in a 5.25 caddy adapter.
> 
> I will update the with sime pics when I am done with the build.
> I was leaning toward the Phanteks enthoo pro but will have to wait for a month for it to get to my country and the price tag on the c70 was tempting so I hope that I will not regret this decision.


Well, that's good to hear; also, leaving the case open is the way to go for me.

Quote:


> Originally Posted by *joeh4384*
> 
> 290 or 290x? Nice card, the Tri-X is probably the best mainstream cooler for the 290s.


It's the 290 one.

Quote:


> Originally Posted by *Gumbi*
> 
> Is it the Trix new edition? You have an AMD CPU in that system? If it's overclocked you'd want to be careful not to pump too many volts into that Trix as your PSU won't be up for it; it's a quality one though so you should be fine


I was using an i5 4590 before, but someone was willing to buy mine at a higher price than the market price.

I'm using my old i3-4150 now. Do I really need to change the PSU? Pic somehow related.



Quote:


> Originally Posted by *gupsterg*
> 
> Still great card serave
> 
> 
> 
> 
> 
> 
> 
> , do like the orange black of Tri-X cooler
> 
> 
> 
> 
> 
> 
> 
> .


Thanks mate, feels good that someone agrees with me on that matter.


----------



## Gumbi

An i5 or i3, even overclocked, would be fine. You're good.


----------



## MEC-777

Quote:


> Originally Posted by *serave*
> 
> well, that's good to hear
> 
> 
> 
> 
> 
> 
> 
> , also leaving the case open is the way to go for me
> it's the 290 one
> was using i5 4590 before, but someone was willing to buy mine with higher price compared to the market price
> 
> i'm using my old i3-4150 now, do i really need to change the PSU ? pic somehow related
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> thanks mate
> 
> 
> 
> 
> 
> 
> 
> , feels good someone agrees with me to that matter


You'd be better off with the i5, honestly. If I were you, I would have sold the i3 and kept the i5. The i3 may hold your system back in a few games.

I would also get a better PSU. That one is OK for now, but it doesn't even say if it's 80+ rated (I didn't see it listed on their website either). 290s can pull 300 watts+ on their own, even more when OC'd. Personally I would recommend a good quality unit in the 650w range, 80+ Gold rated, for running a single 290.
Quote:


> Originally Posted by *Gumbi*
> 
> i5 or i3 even overclocked would be fine. You're good


You can't OC an i3, only via baseclock, and that will only net maybe a 100-150MHz increase.


----------



## gupsterg

Quote:


> Originally Posted by *serave*
> 
> it's the 290 one


From what I saw on my rig, the 290 Tri-X was benching very close to the 290X Asus DCU II & Vapor-X. I think they are the best value cards.

If I hadn't got the 290Xs very cheap new, I'd have been upset.


----------



## Gumbi

Quote:


> Originally Posted by *MEC-777*
> 
> You'd be better off with the i5, honestly. If I were you, I would have sold the i3 and kept the i5. The i3 may hold your system back in a few games.
> 
> I would also get a better PSU. That one is ok for now, but it doesn't even say if it's 80+ rated (didn't see it listed on their website either). 290's can pull 300watts + on their own, even more when OC'd. Personally I would recommend good quality unit in the 650w range and 80+ gold rated for running a single 290.
> Can't OC an i3. Only via baseclock, and that will only net maybe a 100-150mhz increase.


I know that, I meant for the i5.

An overclocked i5 + 290 is fine for 500 watts, even with an overvolt on the CPU and an overvolt on the GPU (though a modest one, DEFINITELY no more than 100mv, preferably under that).


----------



## serave

Quote:


> Originally Posted by *MEC-777*
> 
> You'd be better off with the i5, honestly. If I were you, I would have sold the i3 and kept the i5. The i3 may hold your system back in a few games.
> 
> I would also get a better PSU. That one is ok for now, but it doesn't even say if it's 80+ rated (didn't see it listed on their website either). 290's can pull 300watts + on their own, even more when OC'd. Personally I would recommend good quality unit in the 650w range and 80+ gold rated for running a single 290.
> Can't OC an i3. Only via baseclock, and that will only net maybe a 100-150mhz increase.


From local review sites, the efficiency of that PSU tops out at 82%-ish.

It doesn't have the 80+ rating because it only does 230VAC. Anyway, I realize I goofed when I sold the i5; I might get a new one when my next paycheck arrives.


----------



## MEC-777

Quote:


> Originally Posted by *Gumbi*
> 
> I know that, I meant for the i5
> 
> 
> 
> 
> 
> 
> 
> An overclocked i5 + 290 is fine for 500watts, even with an overvolt on the CPU and an overvolt on the GPU (though a modest one, DEFINITELY no more than 100mv, preferably under that).


It depends what other components the person is running alongside the GPU/CPU as well: how many fans, how many HDDs, etc. It will work, yes, but I would not use that combination long-term. The harder a PSU has to work, the shorter its life span will be. Superflower is a good PSU brand/manufacturer, but we (you and I) don't know how old the unit in question is, and the fact that I see no 80+ rating makes me worry a little.

When it comes to PSUs it's always better to err on the side of caution and make sure you have a good quality unit with ample overhead wattage available, rather than "just enough". I'm not saying you need a 1000w PSU minimum for a single 290, but simply that a quality 650w 80+ Gold would be more ideal and will have a longer life. That's all.


----------



## MEC-777

Quote:


> Originally Posted by *serave*
> 
> from local review sites, efficiency of that PSU tops at 82%-ish
> 
> it doesnt have the 80+ rating because it only does 230VAC, anyway i realize i've done goof'd when i sold the i5, might get a new one when my next paycheck arrives


If 82% is its best peak efficiency, then that's just barely an 80+ rating (not even Bronze). That concerns me because the lower-rated units tend to have lower-grade/quality components. They also tend to use group-regulated 3.3, 5 and 12V rails, which could be very dangerous for a system running a power-hungry card like a 290. If you notice any artifacting or black screens when running a game (or anything that puts load on the GPU), turn your system off immediately and swap that PSU out right away. Not trying to scare you, just trying to save you from headaches down the road and help you get the most out of your system.
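As a rough illustration of what an efficiency figure like 82% means at the wall (the load number here is hypothetical, not a measurement of that unit):

```python
# At a given efficiency, the AC power drawn from the wall exceeds the
# DC power delivered to the components; the difference is waste heat
# the PSU itself has to dissipate.
def wall_draw(dc_load_watts, efficiency=0.82):
    """AC power pulled from the wall for a given DC load."""
    return dc_load_watts / efficiency

load = 450  # hypothetical: an overclocked 290 plus the rest of the system
print(round(wall_draw(load)))  # ~549W from the wall at 82% efficiency
```

A higher-efficiency unit at the same load draws less from the wall and runs cooler, which is part of why the 80+ tiers matter.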


----------



## spyshagg

Quote:


> Originally Posted by *fat4l*
> 
> Anyone experiencing blackscreening if adding more voltage to the core ?
> On my card, if I add more than +200mV to the core, blackscreening under load starts to appear. This happens only if powertarget is set for more than 0%.
> Below +200mV, even with +50% power target, no blackscreening. Tried with oced ram and core and non oced as well. Temps are fine, 40-50C @load.


Yes. My crossfire setup does that.

I ended up settling for a more stable and "humbling" clock speed at 100mv. I got tired of random black screens mid-game.


----------



## megax05

Update: I changed my case to the Corsair Vengeance C70. With the stock fan config and the cards at stock clocks (1000/1300), my top card hits 91C in Unigine Heaven 4.0 and the bottom card maxes out at 71C.
I think I will need more fans in this case.
91C on the top card and 71C on the lower one is quite high for a Tri-X card.


----------



## gatygun

Quote:


> Originally Posted by *serave*
> 
> from local review sites, efficiency of that PSU tops at 82%-ish
> 
> it doesnt have the 80+ rating because it only does 230VAC, anyway i realize i've done goof'd when i sold the i5, might get a new one when my next paycheck arrives


Drop the PSU and get a 650w 80+ Gold one like the EVGA SuperNOVA 650 G2, or the 750 version (best on the market atm) for a bit more headroom. If you plan on heavily OCing, both PSUs will be enough, but the 750 will probably give you a bit more lifetime in general than the 650. If you plan on going Crossfire in the future, I wouldn't go with anything lower than a 1000w Gold-rated one.

The 290s are power hungry, if not the most power-hungry GPUs; my single overclocked 290 will push past 350 watts at times. A budget PSU that meets zero standards will probably not even push a stable 400w; your PSU will have a hard time, if it works at all. I hope your PSU has protections built in so it doesn't undervolt / overvolt your hardware on top of that, which can make your life a nightmare; I had that happen to me once, years back.

My current 1000w PSU, which I used for 2x 290s, is dying and shutting my PC down (probably my 2x 290s killed it over time). I ordered a new PSU yesterday, as I only have a 970 atm. I could have bought a cheapo 750w power supply for 80 euros, but I bought a 750w for 130 euros because the quality of it is just that much better and safer: all kinds of protections, lots of headroom, silence, etc.

One thing I learned: never save money on the PSU.


----------



## kizwan

Quote:


> Originally Posted by *megax05*
> 
> update
> I changed my case to the corsair vengeance c70 with stock fan config and cards on stock clocks 1000/1300 and my top card hits 91c on unigen heaven 4.0 and bottom card max @ 71c
> I think i will need more fans in this case.
> I think 91c for top card + 71c for the lower one is quite high for a tri-x card.


The stock fans are pretty poor at moving air, especially when you have multiple GPUs. Two exhaust fans on the side panel may help.


----------



## megax05

Quote:


> Originally Posted by *kizwan*
> 
> The stock fans are pretty poor in moving air especially when you have multiple gpus. Two exhaust fans on the side panel may help.


What I did is: I removed the drive cage, moved the fans to the front of the case, and added one cheapo 120mm fan at the top of the case as exhaust. I also set the fans to spin @ 100%; they are barely audible, the top card tops out at 82C @ 1050/1350, and the lower one hits 75C with a custom fan curve.
I think it's all about the quality of the fans and how much air they can move.

Overall I'm happy with my current results; now I have to save for a set of quality fans and a 4K monitor.


----------



## megax05

I managed to OC up to 1100/1500 with +50mv and +50 power limit, and my Firestrike result was decent: 15855, which ranks me 11th among similar system builds.
I can live with that.


----------



## AliNT77

I got my 290 ref. for the price of the cheapest 960 I could find here (around $200).

Bought a used CLC (Corsair H70) for $50.

OC'd it to 1145/1525 (+100mV, +15%, 24/7 stable for gaming). I know it's not a good overclocker.

Max temp I saw was 66C.
The VRMs are pasted with TIM + 2 fans blowing on them, and are surprisingly cool.

Super great value considering that a 970 is priced about $400 here.

Also scored 14789 (graphics) @ 1200/1620 with tessellation turned off.


----------



## Gumbi

Quote:


> Originally Posted by *AliNT77*
> 
> I got my 290 ref. for the price of the cheapest 960 I could find here (around $200).
> 
> Bought a used CLC (Corsair H70) for $50.
> 
> OC'd it to 1145/1525 (+100mV, +15%, 24/7 stable for gaming). I know it's not a good overclocker.
> 
> Max temp I saw was 66C.
> The VRMs are pasted with TIM + 2 fans blowing on them, and are surprisingly cool.
> 
> Super great value considering that a 970 is priced about $400 here.
> 
> Also scored 14789 (graphics) @ 1200/1620 with tessellation turned off.


Sweet, dude. The 290 is such a great value buy and performs superbly if you give it a bit of love and put effort into cooling!


----------



## MEC-777

Quote:


> Originally Posted by *gatygun*
> 
> Drop the psu get a 650 gold 80+ one like EVGA PSU SuperNOVA 650 G2 or the 750 ( best on the market atm ) version for a bit more room. If you plan on heavily oc'ing both psu's will be enough, but the 750 will probably bring you a bit more lifetime in general then the 650 then. If you plan on going crossfire in the future, i wouldn't even go below anything lower then 1000w gold rated one.
> 
> The 290's are power hungry if not the most power hungry gpu's, my single 290 overclocked will push past 350 watt at times. And a budget psu which feature zero standards will probably not even push a stable 400w forwards. Your psu will have a hard time if it works at all. I hope your psu has protections build into it so it doesn't undervolt / overvolt your hardware on top of it which can make your live a nightmare. I had this happen once to me years back.
> 
> My current 1000w psu which i used for 2x 290's is dying and shutting my pc down ( probably my 2x 290's that killed it over time ). Ordered a new psu yesterday as i only have a 970 atm. I could buy a 750 cheapo power supply for 80 euro's, but i bought a 750w for 130 euro's, because the quality of it is just that much better and safer. All kinds of protections, lots of room to play around / silence etc.
> 
> One thing i learned never safe money on the psu.


You can safely go less than 1000w for CF 290s. I'm running CF 290s with a new EVGA SuperNOVA 850GS PSU (made by Seasonic). I have OC'd my 290s to 1200/1450 for benchmarking, but I mainly run them at stock clocks 24/7 (that gives me more than enough performance anyway), so I'm not concerned at all. I could have gone with a cheaper-quality 1000w Bronze PSU, but decided to go with a top quality 850w Gold unit instead (the best I could afford). The quality of the components used and the design of the unit also play a significant role in stability/longevity.

Besides that, I agree with you 100%: never skimp on the PSU.


----------



## mfknjadagr8

Quote:


> Originally Posted by *MEC-777*
> 
> You can safely go less than 1000w for CF 290's. I'm running CF 290's with a new EVGA Supernova 850GS PSU (made by seasonic). I have OC'd my 290's to 1200/1450 for benchmarking, but I mainly run them at stock clocks 24/7 (gives me more than enough performance anyways), so I'm not concerned at all. I could have gone with a cheaper quality 1000w bronze PSU, but decided to go with a top quality 850w gold unit instead (the best I could afford). The quality of the components used and the design of the unit also plays a significant roll in stability/longevity as well.
> 
> Besides that, I agree with you 100% - never skimp of the PSU.


Really, that depends on how much power your CPU is pulling and whether you plan to overvolt a lot. Either way, go quality, because failure means headache and money. With an Intel CPU it's probably not as much of an issue, but say an 8320, heavily overclocked, can run near 250 watts; we know overvolted 290s can push the upper 200s, so 850w is the minimum I would use for Crossfire unless I knew they would stay stock.
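The back-of-envelope sizing above can be sketched like this. The wattages are the ballpark figures from these posts, not measurements, and the 100W "other" budget and 25% headroom are my own assumptions:

```python
# Rough PSU sizing using the ballpark figures from the thread:
# an overvolted 290 pushing the upper 200s of watts, a heavily
# overclocked FX-8320 near 250W. Illustrative only, not measured.
def recommended_psu(gpu_watts, gpu_count, cpu_watts,
                    other_watts=100, headroom=0.25):
    """Peak system draw plus 'other' (board, drives, fans), and a
    recommended PSU rating with headroom so the unit isn't running
    flat out at peak load."""
    draw = gpu_watts * gpu_count + cpu_watts + other_watts
    return draw, draw * (1 + headroom)

# Crossfire 290s with a heavily overclocked FX-8320:
draw, rec = recommended_psu(gpu_watts=290, gpu_count=2, cpu_watts=250)
print(draw, round(rec))  # 930 1162
```

With a stock Intel quad pulling far less, the same arithmetic lands closer to the 850w figure mentioned above, which is why the answer depends on the whole build.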


----------



## MEC-777

Quote:


> Originally Posted by *mfknjadagr8*
> 
> really that depends on how much power your cpu is pulling and if you plan to overvolt a lot...either way go quality because failure means headache and money...with Intel cpu probably not as much of an issue but say the 8320 heavily overclocked can run near 250 watt...we know overvolted 290s can push the upper 200s so 850 would be a minimum I would use for crossfire unless I knew they would stay stock...


Oh, for sure. I'm running an i5-4570, so no overclocking (just a little via baseclock @ 104). But if I were running an FX 8-core and had it OC'd, I'd definitely agree and go for a 1000w minimum with 290s in CF. It really depends on the whole system, not just the core components; every situation/build is a bit different.

Indeed, quality is key.


----------



## Liranan

How do you guys OC the RAM so high? My card black screens when I hit 1400. The core is at 1100 without an increase in power limit or voltage but the RAM really doesn't like being OC'd.


----------



## mus1mus

Quote:


> Originally Posted by *Liranan*
> 
> How do you guys OC the RAM so high? My card black screens when I hit 1400. The core is at 1100 without an increase in power limit or voltage but the RAM really doesn't like being OC'd.


Custom BIOS, or feeding it volts.


----------



## EpicOtis13

Just picked up two 290s for $410 with blocks. I have been doing some benching, and at 1100/1350 I am getting a score of 17652; is this normal? If it is, I probably should have just stuck with my 7970s, which would hit 15800 at 1250/1700.


----------



## megax05

Quote:


> Originally Posted by *Liranan*
> 
> How do you guys OC the RAM so high? My card black screens when I hit 1400. The core is at 1100 without an increase in power limit or voltage but the RAM really doesn't like being OC'd.


Well, every card reacts to OCing differently, but it's mostly down to the card's memory chips. Samsung and Hynix are capable of higher OCs than the problematic Elpida chips, which tend to black screen even at stock settings and don't like being OC'd past 1350. As I said, it's all about the silicon lottery, but Samsung and Hynix can hit over 1500MHz without a lot of tweaking.
You can try bumping the core volts / mem aux volts a bit and see if you can get your card stabilized, because I've seen cards with Elpida chips running @ 1600MHz.
Good luck.


----------



## megax05

Quote:


> Originally Posted by *EpicOtis13*
> 
> Just picked up two 290's for $410 with blocks. I have been doing some benching, and at 1100/1350 I am getting a score of 17652, is this normal? If it is I probably should have just stuck with my 7970s that would hit 15800 at 1250/1700.


Are you talking about the Firestrike bench or another app? If you mean Firestrike, that's a good result.
Also, benchmarks don't really reflect the gaming abilities of the cards. My CF 290s maxed @ 16,120 pts, around 24,000+ in graphics, running them @ 1200/1600.


----------



## Gumbi

Quote:


> Originally Posted by *Liranan*
> 
> How do you guys OC the RAM so high? My card black screens when I hit 1400. The core is at 1100 without an increase in power limit or voltage but the RAM really doesn't like being OC'd.


Voltage. My RAM only does 1450MHz but can do 1650 when I feed in ~80mV of core voltage. So an overclock would be 1180/1650 at +80mV.


----------



## rdr09

Quote:


> Originally Posted by *megax05*
> 
> Well, every card reacts to OCing differently, but it's mostly down to the card's memory chips. Samsung and Hynix are capable of higher OCs than the problematic Elpida chips, which tend to black screen even at stock settings and don't like being OC'd past 1350. As I said, it's all about the silicon lottery, but Samsung and Hynix can hit over 1500MHz without a lot of tweaking.
> You can try bumping the core volts / mem aux volts a bit and see if you can get your card stabilized, because I've seen cards with Elpida chips running @ 1600MHz.
> Good luck.


i read you are shooting for 4K, right? i think they have one with FreeSync now. i remember when the 4Ks came out that users were having issues with the DP cable, even the ones that came with the monitor. i've had mine for a year now and i like it.

i only have a tiny 28 inch but it works fine even at the desktop . . .



maps in one of my games are hard to see . . .


----------



## EpicOtis13

Quote:


> Originally Posted by *megax05*
> 
> Are you talking about the Firestrike bench or another app? If you mean Firestrike, that's a good result.
> Also, benchmarks don't really reflect the gaming abilities of the cards. My CF 290s maxed @ 16,120 pts, around 24,000+ in graphics, running them @ 1200/1600.


It would be firestrike scores. I have been getting around 80 fps at 4k with settings on high and no AA


----------



## mfknjadagr8

Quote:


> Originally Posted by *EpicOtis13*
> 
> It would be firestrike scores. I have been getting around 80 fps at 4k with settings on high and no AA


I'm unsure what you think is wrong; the scores and everything look fine to me. Mine don't do that well, as the Elpida memory on my second card doesn't like any speed over 1400.


----------



## mus1mus

I am out to swap a 980 Ti (a very good one) for 2x 290s + $200. Anything to expect from Elpida memory?

My 290 with a Hynix is pretty good on the memory at 1700+ MHz.


----------



## mirzet1976

My Gigabyte R9 290 reference with Elpida memory under water can bench at 1300/1625MHz. 1625MHz is the max; the system restarts when I apply 1626MHz with MSI AB.
http://www.overclock.net/t/1447763/amd-r9-290-290x-overclockers-club/240#post_23382804


----------



## fyzzz

I was going to go crossfire 290's but i got a good deal on a 980 ti. I will probably never sell my 290 however, considering how special it is.


----------



## rdr09

Quote:


> Originally Posted by *EpicOtis13*
> 
> Just picked up two 290's for $410 with blocks. I have been doing some benching, and at 1100/1350 I am getting a score of 17652, is this normal? If it is I probably should have just stuck with my 7970s that would hit 15800 at 1250/1700.


compare graphics scores. 7970 to 290 is only about a 20 to 30 percent increase depending on clocks. tahitis are known to score high in FS and 1250 is an above average oc.

my 290's at 1250 get around 25K in graphics score.


----------



## mfknjadagr8

Quote:


> Originally Posted by *mus1mus*
> 
> I am out to swap a 980 Ti (a very good one) for 2x 290s + $200. Anything to expect from Elpida memory?
> 
> My 290 with a Hynix is pretty good on the memory at 1700+ MHz.


Hopefully you don't get one like my bottom card: Elpida memory and black screens over 1400. Oddly, I've never seen it artifact first; it just black screens after 1400. I might be able to push the voltage up more, but honestly I'm not willing to potentially lose the card for another 50 or even 100MHz on the memory clock that likely won't help a whole lot.


----------



## mus1mus

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Hopefully you don't get one like my bottom card: Elpida memory and black screens over 1400. Oddly, I've never seen it artifact first; it just black screens after 1400. I might be able to push the voltage up more, but honestly I'm not willing to potentially lose the card for another 50 or even 100MHz on the memory clock that likely won't help a whole lot.


Yeah. Fingers crossed. That 980ti is hard to give up. But it's like 3 vs 1 trade.

And one of the cards is unlockable! Not sure which, between the PowerColor and the Sapphire, though.


----------



## battleaxe

Quote:


> Originally Posted by *Liranan*
> 
> How do you guys OC the RAM so high? My card black screens when I hit 1400. The core is at 1100 without an increase in power limit or voltage but the RAM really doesn't like being OC'd.


Not all cards are created equal. My old 290 can only do 1375 before black screening. Just luck of the lottery.


----------



## xKrNMBoYx

Hehehe they arrived a day early.


----------



## Streetdragon

Quote:


> Originally Posted by *EpicOtis13*
> 
> Just picked up two 290's for $410 with blocks. I have been doing some benching, and at 1100/1350 I am getting a score of 17652, is this normal? If it is I probably should have just stuck with my 7970s that would hit 15800 at 1250/1700.


I would say, it is a bit low!
http://www.3dmark.com/fs/6103632

Topcard is 390 Nitro and the other is a 290 Vapor


----------



## gatygun

Quote:


> Originally Posted by *MEC-777*
> 
> You can safely go less than 1000w for CF 290's. I'm running CF 290's with a new EVGA Supernova 850GS PSU (made by Seasonic). I have OC'd my 290's to 1200/1450 for benchmarking, but I mainly run them at stock clocks 24/7 (gives me more than enough performance anyway), so I'm not concerned at all. I could have gone with a cheaper quality 1000w bronze PSU, but decided to go with a top quality 850w gold unit instead (the best I could afford). The quality of the components used and the design of the unit also play a significant role in stability/longevity.
> 
> Besides that, I agree with you 100% - never skimp on the PSU.


It works for you because you have an efficient CPU, your GPUs mostly run at stock, and you have a high-quality 850W PSU. If you push those GPUs to 1200+ core and 1600 mem and push an AMD CPU to high frequencies, you will go over 850W at times.

Example:

This is with a newer i7 (which is better on wattage) at 4.4GHz with 2x 290s at 1040/1400 each (a minor overclock).


Spoiler: Warning: Spoiler!







A single GPU is probably going to draw about ~60W more on a heavy overclock in a worst-case scenario, which is about 120W for both. Yeah, good luck with that. Let's not even talk about a less efficient CPU (like AMD's) that uses a lot more wattage than those newer Intel CPUs.

A 1000W gold quality PSU is absolutely going to be needed to keep things stable; a cheap 1000W, or a cheap anything, is never going to work well.

Unless you keep things at stock (GPU-wise) and have a solid CPU that doesn't push wattage to the max, in which case a good quality 850W PSU will work.
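The arithmetic above can be sanity-checked with a quick back-of-envelope sketch. All wattage figures below are illustrative assumptions pulled from the discussion (overvolted 290s in the upper 200s, a heavily OC'd FX near 250W), not measurements:

```python
# Back-of-envelope PSU sizing for a CrossFire 290 build.
# All numbers are illustrative assumptions, not measurements.

def estimate_dc_load(cpu_w, gpu_w, gpu_count, other_w=75):
    """Estimate worst-case DC draw in watts (other_w covers board, RAM, drives, fans)."""
    return cpu_w + gpu_w * gpu_count + other_w

def recommended_psu(dc_load, headroom=0.20):
    """Suggest a PSU rating with ~20% headroom so the unit isn't run at its limit."""
    return dc_load * (1 + headroom)

# Overvolted 290s pushing the upper 200s, plus a heavily OC'd FX-8320 near 250 W:
load = estimate_dc_load(cpu_w=250, gpu_w=280, gpu_count=2)
print(f"Estimated DC load: {load} W")                   # 885 W
print(f"Suggested PSU: {recommended_psu(load):.0f} W")  # ~1062 W
```

Swap in your own CPU/GPU estimates; the point is simply that two overvolted 290s plus a hungry CPU leave little margin on an 850W unit.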


----------



## Gumbi

Currently benching a STABLE and COOL (loud fans though) 1248/1651MHz overclock on my 290X Vapor-X @ +200mV. I think this is shaping up to be one of the best clockers I've ever had. And I just bought it second hand!

Sold my poorly clocking RMA'd 290 and paid a difference of 60 euro for a better-clocking 290X with 4GB of extra VRAM.


----------



## Gumbi

Well, I failed at the print screen but have the log to prove it: 68.4 FPS, 1080p, AA and tess maxed, 1252MHz core and 1650MHz memory.


----------



## sinnedone

Quote:


> Originally Posted by *Streetdragon*
> 
> I would say, it is a bit low!
> http://www.3dmark.com/fs/6103632
> 
> Topcard is 390 Nitro and the other is a 290 Vapor


You have an overclocked 6-core though.

My older i7 at 4.7GHz and 290 CrossFire at 1150/1350 hit the mid-17,000s.


----------



## kizwan

Quote:


> Originally Posted by *Streetdragon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *EpicOtis13*
> 
> Just picked up two 290's for $410 with blocks. I have been doing some benching, and at 1100/1350 I am getting a score of 17652, is this normal? If it is I probably should have just stuck with my 7970s that would hit 15800 at 1250/1700.
> 
> 
> 
> I would say, it is a bit low!
> http://www.3dmark.com/fs/6103632
> 
> Topcard is 390 Nitro and the other is a 290 Vapor
Click to expand...

390 + 290 CrossFire will always be faster than 290 + 290 CrossFire at the same clocks. Also, 1150/1600 vs. 1100/1350. Come on.


----------



## fat4l

You should be comparing GPU scores, not overall. And the best to compare with is Fire Strike Extreme.


----------



## megax05

Indeed, graphics score is the way to compare. Mine scores over 24k @ 1100/1500, so with an extra 70/150MHz on core and mem I'll probably hit 25k in graphics.


----------



## Streetdragon

Quote:


> Originally Posted by *kizwan*
> 
> 390 + 290 CrossFire will always be faster than 290 + 290 CrossFire at the same clocks. Also, 1150/1600 vs. 1100/1350. Come on.


Sure it's faster, BUT not 8k faster! The gap is too huge.

edit: old RIG http://www.3dmark.com/fs/5721307 lower clocks and still more points...


----------



## rdr09

Quote:


> Originally Posted by *xKrNMBoYx*
> 
> Hehehe they arrived a day early.


Nice. pics when installed.
Quote:


> Originally Posted by *gatygun*
> 
> It works for you because you have an efficient CPU, your GPUs mostly run at stock, and you have a high-quality 850W PSU. If you push those GPUs to 1200+ core and 1600 mem and push an AMD CPU to high frequencies, you will go over 850W at times.
> 
> Example:
> 
> This is with a newer i7 (which is better on wattage) at 4.4GHz with 2x 290s at 1040/1400 each (a minor overclock).
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> A single GPU is probably going to draw about ~60W more on a heavy overclock in a worst-case scenario, which is about 120W for both. Yeah, good luck with that. Let's not even talk about a less efficient CPU (like AMD's) that uses a lot more wattage than those newer Intel CPUs.
> 
> A 1000W gold quality PSU is absolutely going to be needed to keep things stable; a cheap 1000W, or a cheap anything, is never going to work well.
> 
> Unless you keep things at stock (GPU-wise) and have a solid CPU that doesn't push wattage to the max, in which case a good quality 850W PSU will work.


don't we have to factor in efficiency?
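Efficiency matters for the wall reading, but the PSU's rating is on the DC side. A minimal sketch of the relationship (the efficiency figures are illustrative, roughly what an 80 Plus Gold unit does at mid-to-high load):

```python
# PSU efficiency relates wall draw to DC output; the PSU's rating is on the DC side.
# Efficiency figures below are illustrative (roughly 80 Plus Gold at ~50-100% load).

def wall_draw(dc_load_w, efficiency):
    """Watts pulled from the wall to deliver dc_load_w to the components."""
    return dc_load_w / efficiency

dc = 850  # watts delivered to CPU/GPUs/etc.
print(f"{wall_draw(dc, 0.90):.0f} W at the wall at 90% efficiency")  # ~944 W
print(f"{wall_draw(dc, 0.87):.0f} W at the wall at 87% efficiency")  # ~977 W
```

So a wall-meter reading over 850W doesn't necessarily mean an 850W unit is overloaded on its DC side.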


----------



## gatygun

Quote:


> Originally Posted by *MEC-777*
> 
> You can safely go less than 1000w for CF 290's. I'm running CF 290's with a new EVGA Supernova 850GS PSU (made by Seasonic). I have OC'd my 290's to 1200/1450 for benchmarking, but I mainly run them at stock clocks 24/7 (gives me more than enough performance anyway), so I'm not concerned at all. I could have gone with a cheaper quality 1000w bronze PSU, but decided to go with a top quality 850w gold unit instead (the best I could afford). The quality of the components used and the design of the unit also play a significant role in stability/longevity.
> 
> Besides that, I agree with you 100% - never skimp on the PSU.


It works for you because you have an efficient CPU, your GPUs mostly run at stock, and you have a high-quality 850W PSU. If you push those GPUs to 1200+ core and 1600 mem and push an AMD CPU to high frequencies, you will go over 850W at times.

Example:

This is with a newer i7 (which is better on wattage) at 4.4GHz with 2x 290s at 1040/1400 each (a minor overclock).

http://cdn.overclock.net/2/26/900x900px-LL-26848ea7_Prime95andValley.png

A single GPU is probably going to draw about ~60W more on a heavy overclock in a worst-case scenario, which is about 120W for both. Yeah, good luck with that. Let's not even talk about a less efficient CPU (like AMD's) that uses a lot more wattage than those newer Intel CPUs.

A 1000W gold quality PSU is absolutely going to be needed to keep things stable; a cheap 1000W, or a cheap anything, is never going to work well.

Quote:


> Originally Posted by *EpicOtis13*
> 
> Just picked up two 290's for $410 with blocks. I have been doing some benching, and at 1100/1350 I am getting a score of 17652, is this normal? If it is I probably should have just stuck with my 7970s that would hit 15800 at 1250/1700.


18k is low for 2x 290s; something is going wrong. At 1200+ core you should be getting 25k.


----------



## Mega Man

Ahem. Someone with an INTEL cpu has already shown that 2 290Xs can pull 1300w from the wall


----------



## mfknjadagr8

Quote:


> Originally Posted by *gatygun*
> 
> It works for you because you have an efficient CPU, your GPUs mostly run at stock, and you have a high-quality 850W PSU. If you push those GPUs to 1200+ core and 1600 mem and push an AMD CPU to high frequencies, you will go over 850W at times.
> 
> Example:
> 
> This is with a newer i7 (which is better on wattage) at 4.4GHz with 2x 290s at 1040/1400 each (a minor overclock).
> 
> http://cdn.overclock.net/2/26/900x900px-LL-26848ea7_Prime95andValley.png
> 
> A single GPU is probably going to draw about ~60W more on a heavy overclock in a worst-case scenario, which is about 120W for both. Yeah, good luck with that. Let's not even talk about a less efficient CPU (like AMD's) that uses a lot more wattage than those newer Intel CPUs.
> 
> A 1000W gold quality PSU is absolutely going to be needed to keep things stable; a cheap 1000W, or a cheap anything, is never going to work well.
> 18k is low for 2x 290s; something is going wrong. At 1200+ core you should be getting 25k.


I'm pretty sure he is quoting total score, not graphics score. Good luck getting a 25k total score with two 290s.


----------



## gatygun

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I'm pretty sure he is quoting total score, not graphics score. Good luck getting a 25k total score with two 290s.


Never realized people would actually look at that number when comparing GPU performance.


----------



## mfknjadagr8

Quote:


> Originally Posted by *gatygun*
> 
> Never realized people actually would look at that number when comparing gpu performance.


I may be wrong, but that's around what I get for total score with my two 290s and my processor at 4.8; Fire Strike combined is always gimped for AMD processors though.


----------



## EpicOtis13

I just wanted to clarify and say that my CPU is a 5930k at 4.3 ghz.


----------



## fat4l

Quote:


> Originally Posted by *Mega Man*
> 
> Ahem. Someone with an INTEL cpu has already shown that 2 290Xs can pull 1300w from the wall


 so now my 1200W psu is not nuff ?







cmonnnn
Do u maybe have the link or something ? I would love to see that


----------



## Mega Man

You need to talk to tsm, IIRC. It was in this thread.

All I meant is: get off the anti-AMD bike, it all has to do with how much you OC.


----------



## kizwan

Quote:


> Originally Posted by *Streetdragon*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> 390 + 290 CrossFire will always be faster than 290 + 290 CrossFire at the same clocks. Also, 1150/1600 vs. 1100/1350. Come on.
> 
> 
> 
> Sure it's faster, BUT not 8k faster! The gap is too huge.
> 
> edit: old RIG http://www.3dmark.com/fs/5721307 lower clocks and still more points...
Click to expand...

From his score, I can tell that was actually the total score, not the graphics score.
Quote:


> Originally Posted by *fat4l*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> Ahem. Someone with an INTEL cpu has already shown that 2 290Xs can pull 1300w from the wall
> 
> 
> 
> so now my 1200W psu is not nuff ?
> 
> 
> 
> 
> 
> 
> 
> cmonnnn
> Do u maybe have the link or something ? I would love to see that
Click to expand...

You can find pics in this thread. I think it was a 2011 hexacore, if I'm not mistaken. The 4790K doesn't pull a lot, so I can understand why you don't believe it. We have 8-core Haswell-E now; I'm interested to see power consumption with that.


----------



## Ruzbynen

Hello. I have a big problem: CS:GO freezes for about 2 seconds with some lagging noise and then fixes itself. Temps are good. VRMs are good too. My PSU is a 750W FSP Raider.
i5-3330
B75-D3V
16GB RAM
I tried reinstalling Windows 10; the problem is still there. I tried V-Sync on, underclocking the card, default clocks, overclocked: still the same problem. Help.


----------



## mus1mus

Is 1250/1700 for an Elpida 290 good? It is unlockable to 290X but I can't get into that memory clock on the Bioses I have tried.


----------



## semitope

Quote:


> Originally Posted by *mus1mus*
> 
> Is 1250/1700 for an Elpida 290 good? It is unlockable to 290X but I can't get into that memory clock on the Bioses I have tried.


Sounds nice. What voltage?

I have only gotten up to 1150, barely, on my 290X Vapor-X with +70mV or so.


----------



## megax05

Quote:


> Originally Posted by *mus1mus*
> 
> Is 1250/1700 for an Elpida 290 good? It is unlockable to 290X but I can't get into that memory clock on the Bioses I have tried.


Dude, that is a nice OC for an Elpida-memory R9 290.
At what volts is that OC?


----------



## mus1mus

Quote:


> Originally Posted by *semitope*
> 
> sounds nice .What voltage?
> 
> I have only gotten up to 1150 barely on my 290x vapor-x with over+70mV or so


Quote:


> Originally Posted by *megax05*
> 
> Dude, that is a nice OC for an Elpida-memory R9 290.
> At what volts is that OC?


+200

Will be on water soon. So no heaven til then.


----------



## EpicOtis13

Quote:


> Originally Posted by *mus1mus*
> 
> +200
> 
> Will be on water soon. So no heaven til then.


Sweet Jesus, that is a lot of voltage.


----------



## mus1mus

Thou can never have enough Voltage!


----------



## EpicOtis13

Quote:


> Originally Posted by *mus1mus*
> 
> Thou can never have enough Voltage!


Is that 1.4 or 1.3?

Also, I can't get my cards to run well past 1100/1350. In Firestrike, any higher clocks on either result in a drop of almost 1,000 points in the overall score and 2,000 in the graphics score. Are my cards just poor clockers? One is Elpida and one is Hynix.


----------



## mus1mus

I don't know, dude. 1.4 is still not that dangerous IMO if it stays cool.

My MSI reference 290 has been the same since I got it: +200 in TriXX and 1200/1700. But it stays within 40C all the time.


----------



## Gumbi

AFAIK it's not necessarily voltage that kills chips, but current. And since chips draw more current at higher temps (due to resistance, I think), 1.4V is not a problem if you can cool it.
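As a rough illustration of why overvolting heats things up so fast: dynamic power scales roughly with frequency times voltage squared. A toy sketch, with baseline numbers that are illustrative assumptions rather than card measurements:

```python
# Toy model: dynamic power scales ~ f * V^2, so overvolting raises draw (and heat) fast.
# Baseline figures are illustrative assumptions, not card measurements.

def scaled_power(p_base, f_base, v_base, f_new, v_new):
    """Scale a baseline power figure to new clock/voltage (dynamic power ~ f * V^2)."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

# Assume ~250 W at 1000 MHz / 1.20 V, then push 1200 MHz at +200 mV (1.40 V):
p = scaled_power(250, 1000, 1.20, 1200, 1.40)
print(f"~{p:.0f} W")  # ~408 W
```

That square term is why +200mV costs far more heat than the clock bump alone would suggest.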


----------



## ZealotKi11er

For me, past +125mV I won't get a higher OC. With +200 I can run 1285MHz core, but with artifacts.


----------



## xKrNMBoYx

Quote:


> Originally Posted by *rdr09*
> 
> Nice. pics when installed.


----------



## megax05

Nice setup! How much did the loop cost? I'm planning to WC my PC after I upgrade it.
The current plan is to sell my i7 4770 + mobo + 16GB of DDR3 + 730W PSU and get an i7 5820K + mobo + 16GB of DDR4 and a 1200W PSU.


----------



## xKrNMBoYx

Quote:


> Originally Posted by *megax05*
> 
> Nice setup! How much did the loop cost? I'm planning to WC my PC after I upgrade it.
> The current plan is to sell my i7 4770 + mobo + 16GB of DDR3 + 730W PSU and get an i7 5820K + mobo + 16GB of DDR4 and a 1200W PSU.


$670, if there are no more purchases. That's the price going with a cheaper line of radiators; if I bought higher-quality rads it would cost $100-200+ more, I would think.

Your current computer is nice but the new one will be sweet.


----------



## megax05

Quote:


> Originally Posted by *xKrNMBoYx*
> 
> $670, if there are no more purchases. That's the price with going with a cheaper line of radiators. If I bought higher quality rads it would cost $100-200+ more I would think.
> 
> Your current computer is nice but the new one will be sweet.


Yeah, it's good for now, but everyone needs to upgrade at some point. Lately I was OCing and overvolting my cards, then decided to see how low on voltage I could go. I tested @ 1000/1300 and was able to go as low as -30mV before the driver crashed on me; then, tested @ 975/1250, I was able to get to -70mV, and the Heaven score was the same as at stock speeds, with less heat and noise.


----------



## xKrNMBoYx

Quote:


> Originally Posted by *megax05*
> 
> Yeah, it's good for now, but everyone needs to upgrade at some point. Lately I was OCing and overvolting my cards, then decided to see how low on voltage I could go. I tested @ 1000/1300 and was able to go as low as -30mV before the driver crashed on me; then, tested @ 975/1250, I was able to get to -70mV, and the Heaven score was the same as at stock speeds, with less heat and noise.


Yup. I enjoy OCing but I also enjoy undervolting to compare temp/performance


----------



## gupsterg

Quote:


> Originally Posted by *fyzzz*
> 
> My 290 never fails to surprise me
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/8863511?


Was just doing some benching _and_ I thought I'd take a peek at the 3DMark database for the i5 4690K CPU and 1x 290 GPU _*and*_ what do I find???

The top 50 has been BLITZED by you!!!
















Needless to say, I LOLed and ROFLed.


Spoiler: Top 50







So that's 1-25, 28-30, 32, 34-37, 39-40, 42, 44, 47.

They may as well name that section fyzzz's golden hits!!!


----------



## Liranan

Quote:


> Originally Posted by *Gumbi*
> 
> Voltage. My RAM only does 1450mhz but can do 1650 whn I feed in 80mv~ of core voltage. So an overclock would be 1180/1650 at 80mv.


Quote:


> Originally Posted by *megax05*
> 
> Well every card react to OCing differntly however its mostly caused by the card's memory chips samsung and hinyx are more capable of getting higher OC than the problematic elpida chip they tend to black screen even on stock settings and dont like to get OC more than 1350 , but as I say its all about the silicon lottery but samsung and hinyx can hit over 1500 mhz without a lot of tweaking .
> You can try to bump the core volt / mem aux volt a bit and see if you can get your card stablized cause I'v seen cards with elpida chips running @ 1600 mhz.
> Good luck.


I have Hynix not Elpida RAM. Will experiment with voltages later to see if i can stabilise it.

Thanks guys.


----------



## r0llinlacs

How are y'all cooling the VRM's with these insane overclocks?


----------



## mus1mus

Water helps.


----------



## Mega Man

Quote:


> Originally Posted by *mus1mus*
> 
> Water beer helps.


Fixed for you


----------



## xKrNMBoYx

Quote:


> Originally Posted by *Mega Man*
> 
> Fixed for you


Well, time to restock, since I ran out of bottles yesterday.


----------



## fyzzz

Quote:


> Originally Posted by *gupsterg*
> 
> Was just doing some benching _and_ I thought I'd take a peek at the 3DMark database for the i5 4690K CPU and 1x 290 GPU _*and*_ what do I find???
> 
> The top 50 has been BLITZED by you!!!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Needless to say, I LOLed and ROFLed.
> 
> 
> Spoiler: Top 50
> 
> 
> 
> 
> 
> 
> 
> So that's 1-25, 28-30, 32, 34-37, 39-40, 42, 44, 47.
> 
> They may as well name that section as fyzzz's golden hits!!!


I love benching, especially with that 290, and I sometimes have a little too much time on my hands. I also have the number one spot with the 4690K/980 Ti in Firestrike Extreme (by 1 point), and I haven't even started yet.


----------



## rdr09

Quote:


> Originally Posted by *xKrNMBoYx*


nice. i replaced my old pump with that and now my system is quiet again. love the variable feature; mine is set to 4. didn't even bolt it down, it's sitting on foam.

Quote:


> Originally Posted by *fyzzz*
> 
> I love benching, especially with that 290, and I sometimes have a little too much time on my hands. I also have the number one spot with the 4690K/980 Ti in Firestrike Extreme (by 1 point), and I haven't even started yet.


that's not a 290. it's a 390.


----------



## xKrNMBoYx

Quote:


> Originally Posted by *rdr09*
> 
> nice. i replaced my old pump with that and now my system is quiet again. love the variable feature; mine is set to 4. didn't even bolt it down, it's sitting on foam.


That's nice to know. My pump is bolted down on top of using the pads. I had the inlet and the reservoir outlet perfectly aligned for a straight connection but using the pad made the pump inlet a little higher up. Now that you mention it the way I have the pump setup makes it harder to access the variable speed notch.


----------



## rdr09

Quote:


> Originally Posted by *xKrNMBoYx*
> 
> That's nice to know. My pump is bolted down on top of using the pads. I had the inlet and the reservoir outlet perfectly aligned for a straight connection but using the pad made the pump inlet a little higher up. Now that you mention it the way I have the pump setup makes it harder to access the variable speed notch.


might be able to bore a hole through the case just under the HDD so you can adjust the pump speed.

i have a slim 240, a thick 360, a similar res, and the same number of blocks as you; setting 4 is perfect.


----------



## ansha

Hi guys, I need your advice on a GPU choice. It's between the XFX 290X DD 8GB and the MSI Lightning 290X 4GB. The XFX is cheaper, but the difference is negligible.


----------



## xKrNMBoYx

Quote:


> Originally Posted by *rdr09*
> 
> might be able to bore a hole through the case just under the HDD so you can adjust the pump speed.
> 
> i have a slim 240, thick 360, similar res, and same amount of blocks as you using the setting 4 is perfect.


Awesome, thanks.


----------



## MEC-777

Quote:


> Originally Posted by *ansha*
> 
> Hi guys, I need your advice on a GPU choice. It's between the XFX 290X DD 8GB and the MSI Lightning 290X 4GB. The XFX is cheaper, but the difference is negligible.


I would go with the MSI Lightning. Haven't heard anything special or particularly good about the XFX DD cards.


----------



## fat4l

So guys I was testing my card today and found out something very weird.
Yeah, the blackscreen problem.

Let me tell you more. I have an ARES III, which is 290Xs in CF. The monitor is a BenQ XL2730Z.
The issue is related to the GPU1 clock and depends on the monitor refresh rate: if the GPU1 clock is 1200+ and the refresh rate is more than 60Hz, it black screens.
At 60Hz, all is fine; no black screen problem.

I'm using a Lindy Chrome 2m cable, which is a really good cable.
I tried setting different RAM clocks, and I can tell it doesn't depend on the RAM clock; I went from 4000MHz to 6600MHz.
I tried bumping the VDDCI voltage too; it didn't help.
It doesn't depend on GPU2 clocks either.

It happens mainly when the states/clocks go from 2D to 3D.
So if I run 3DMark 13 and switch between tabs (benchmark, custom, feature test...), it black screens.

Anyone?
Could it be a 20-pin DisplayPort issue?


----------



## Gumbi

Quote:


> Originally Posted by *ansha*
> 
> Hi guys, I need your advice on a GPU choice. It's between the XFX 290X DD 8GB and the MSI Lightning 290X 4GB. The XFX is cheaper, but the difference is negligible.


The Lightning, without a doubt; the XFX 290X has VRM cooling issues and only OK core cooling. The Lightning has great core/VRM cooling and a beefy custom board.


----------



## GorillaSceptre

Hi guys,

I've found a deal on a brand new Sapphire 290x Vapor X for around $70 cheaper than a new msi 390x 8g.

Are the 290X and 390X essentially the same besides clock speeds and VRAM? Will the performance be similar? Is the Vapor-X a decent 290X or a dud? Which one would you go for?

Sorry for all the questions, i don't have time for a lot of research.


----------



## rdr09

Quote:


> Originally Posted by *GorillaSceptre*
> 
> Hi guys,
> 
> I've found a deal on a brand new Sapphire 290x Vapor X for around $70 cheaper than a new msi 390x 8g.
> 
> Are 290x's and 390x's essentially the same, besides clock speeds and Vram, will the performance be similar? Is the Vapor X a decent 290x or a dud? Which one would you go for?
> 
> Sorry for all the questions, i don't have time for a lot of research.


Vapor X are pretty decent. Even a good clocking 290 will perform similarly to the 390X.

But if going with a 290X, i'd go with a used one.


----------



## Gumbi

Quote:


> Originally Posted by *rdr09*
> 
> Vapor X are pretty decent. Even a good clocking 290 will perform similarly to the 390X.
> 
> But if going with a 290X, i'd go with a used one.


Definitely, and the Vapor-X is a beastly cooler. I have a 290X Vapor-X 8GB that does 1250MHz/1650MHz on air. Pretty ridiculous.


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> Vapor X are pretty decent. Even a good clocking 290 will perform similarly to the 390X.
> 
> But if going with a 290X, i'd go with a used one.


True!

One of my reference 290s does 1222/1725 and scores better than an unlocked 290X that does 1250/1500.

And Core is not always king.


----------



## GorillaSceptre

Quote:


> Originally Posted by *rdr09*
> 
> Vapor X are pretty decent. Even a good clocking 290 will perform similarly to the 390X.
> 
> But if going with a 290X, i'd go with a used one.


Quote:


> Originally Posted by *Gumbi*
> 
> Definitely, and the VaporX is a beastly cooler. I have a 290X VaporX 8GB that does 1250mhz/1650mhz on air. Pretty ridiculous


Quote:


> Originally Posted by *mus1mus*
> 
> True!
> 
> One of my reference 290s does 1222/1725 and scores better than an unlocked 290X that does 1250/1500.
> 
> And Core is not always king.


I went with the Vapor X 290x.

I was just looking for a card to hold me over until next year, so for a higher price I could've gotten a mid-tier 390X, or a high-tier 290X for cheaper. I think I made the right call.

Thanks for the help guys.


----------



## serave

This might be the best place to ask this: I recently found out there was some kind of "290X press samples being faster than the retail version" thing.

I'm pretty sure it was Tom's who claimed such a thing. I've googled my ass off, but I just can't find a good answer.

Can someone please enlighten me regarding the issue?


----------



## mus1mus

By faster you might mean better-clocking, more stable samples.

That is a possibility.

If you want good feedback from the press, you send them the best-binned sample you can find. After all, most users rely on press reviews for their buying decisions. Happens most of the time.

There are several people around here who were once reviewers and got samples that do things better than most other samples.

No need to name names.


----------



## gupsterg

The article Serave is referring to was about how the retail cards' 40% fan duty did not produce the same RPM as the AMD press cards' 40%, so throttling occurred on retail cards.

Catalyst 13.11 Beta 9.2 fixed it.

When I was trawling the Asus & Sapphire support pages for official ROMs, they had also released updated firmware for the ref PCB / cooler cards.
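To illustrate the mechanism as a toy sketch (every constant below is a made-up illustrative number, not a measured value): the firmware held a fixed fan duty cycle, but the duty-to-RPM mapping varied between fan batches, so a retail card could spin slower at the same 40% setting, run hotter, and trip PowerTune's temperature target.

```python
# Toy model of the press-vs-retail 290X fan issue: same 40% duty cycle,
# different duty->RPM curves per fan batch, different steady-state temps.
# All constants are hypothetical illustration values, not measurements.

TEMP_TARGET_C = 94  # Hawaii's stock PowerTune temperature target

def gpu_temp(rpm, ambient_c=25.0, heat_w=250.0, w_per_c_per_krpm=1.8):
    """Crude steady-state GPU temp: higher RPM -> more watts shed per degree C."""
    conductance = w_per_c_per_krpm * (rpm / 1000.0)  # W per deg C
    return ambient_c + heat_w / conductance

def press_vs_retail(duty_pct=40):
    press_rpm = duty_pct * 53   # hypothetical press-card fan: 40% -> 2120 RPM
    retail_rpm = duty_pct * 47  # hypothetical retail fan:     40% -> 1880 RPM
    return gpu_temp(press_rpm), gpu_temp(retail_rpm)

press_t, retail_t = press_vs_retail()
print(f"press sample: {press_t:.1f} C (throttles: {press_t > TEMP_TARGET_C})")
print(f"retail card : {retail_t:.1f} C (throttles: {retail_t > TEMP_TARGET_C})")
```

Targeting RPM instead of duty cycle, as the driver fix did, removes that batch-to-batch gap.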


----------



## xKrNMBoYx

Loop is done and now leak testing. I initially had a problem with the fill port coming loose at the beginning, but after that I have not seen any leaks. Time to leave it on for 24 hours; then I can connect everything and see the temp change from reference cooling.


----------



## Shweller

Quote:


> Originally Posted by *xKrNMBoYx*
> 
> 
> 
> Loop is done and now leak testing. I initially had a problem where the fill port got loose in the beginning but after that I have not seen any leaks. Time to leave it on for 24 hours. Then I can connect everything and get to seeing temp changes from reference cooling.


Nice! I have not had time to put the blocks on mine yet. I am looking forward to my computer not sounding like a jet.


----------



## xKrNMBoYx

Quote:


> Originally Posted by *Shweller*
> 
> Nice! I have not had time to put the blocks on mine yet. I am looking forward to my computer not sounding like a jet.


Thanks. Having Cool and Quiet 290Xs will be nice. Really hope no leaks appear during the next 24 hours.


----------



## Arizonian

Not exactly the correct proof from some, but enough for me to know what was what, so I added them.








Quote:


> Originally Posted by *juno12*
> 
> I own my dcu ii for almost 2 years but recently i watercooled it.
> temps dropped from 80-90 to only 60 while playing bf4
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/details.php?id=eyu47
> 
> Manufacturer: asus
> Model/Series: R9 290X DCU ii oc
> Cooling: ek waterblock


Congrats - added















Quote:


> Originally Posted by *serave*
> 
> just bought this for $240, its actually pretty good, fans were quiet, temps would hover around the 62-65°C mark
> 
> plays FC4, Saints Row IV and Crysis 3 just fine @High settings, maybe a liittle bit lower at Crysis 3, but this unit has been amazing for me so far
> 
> considering 970 costs around $400 and upwards here, i don't think this one is a bad purchase the performance gap between them isn't that big to my knowledge, i could be wrong though
> 
> running it with SuperFlower 500watts unit
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added








Quote:


> Originally Posted by *AliNT77*
> 
> i got my 290 ref. for the price of cheapest 960 i could find here (around 200$)
> 
> 
> 
> 
> 
> 
> 
> 
> bought a used CLC (corsair h70) for 50$
> 
> 
> 
> 
> 
> 
> 
> 
> OC'd it to 1145/1525 (+100mV +15% 24/7 stable for gaming) (i know
> 
> 
> 
> 
> 
> 
> 
> its not a good overclocker
> 
> 
> 
> 
> 
> 
> 
> )
> max temp i saw was 66
> vrms are pasted with TIM +2fans blowing on them and suprisingly cool
> 
> super great value considering that 970 is priced about 400$ here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> also scored 14789 (graphics) @ 1200/1620 with tesselation turned off


Congrats - added








Quote:


> Originally Posted by *mirzet1976*
> 
> My gigabyte r9 290 referent elpida mem under water ccan bench at 1300/1625mhz. 1625mhz is max after that system restarts wenn I apply 1626mhz with msi ab.
> http://www.overclock.net/t/1447763/amd-r9-290-290x-overclockers-club/240#post_23382804


Congrats - added








Quote:


> Originally Posted by *xKrNMBoYx*
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - updated








Quote:


> Originally Posted by *GorillaSceptre*
> 
> I went with the Vapor X 290x.
> 
> I was just looking for a card to hold me over to next year, so for a higher price i could've gotten a mid-tier 390x or a high-tier 290x for cheaper. I think i made the right call.
> 
> Thanks for the help guys.


When you get a chance please post proof (read OP for what is needed) and I'll get you added on the roster. Also for anyone else that hasn't posted proper proof and would like join the list, please do.


----------



## MEC-777

Quote:


> Originally Posted by *xKrNMBoYx*
> 
> 
> 
> Loop is done and now leak testing. I initially had a problem where the fill port got loose in the beginning but after that I have not seen any leaks. Time to leave it on for 24 hours. Then I can connect everything and get to seeing temp changes from reference cooling.


Looks great.









I've been thinking about doing a custom loop on my own setup with 290's in crossfire as well, but there's something I don't understand. After looking at a number of other people's loops and paying particularly close attention to the order of the components in terms of water flow, I don't get why people often have all their system components - CPU, GPU(s) - flowing one after the other, especially when they have more than one rad in their system.

Looking at your system, for example - and please forgive me, this is just an observation - but the order of flow doesn't make sense to me and I don't think this would give you the best possible temps. It looks like you have things in the following order:

Res/Pump -> top rad -> front rad -> GPU -> GPU -> CPU -> rear rad -> res/pump.

Now, to me, I would think the water would be warmer right after going through both GPUs, so wouldn't having that water go through the CPU right after cause the CPU to run a little warmer? If you have multiple rads, my thinking is it would be best to have a rad between the CPU and GPU blocks and to run the water through the CPU first, as it generally produces a lot less heat than GPUs can.

Would this not be a better order for water flow?

Res/pump -> CPU -> rear rad -> GPU -> GPU (with parallel flow between GPU blocks instead of single-line flow) -> front rad -> top rad -> res/pump.

My thoughts on this is to have the majority of the cooling done after the water flows through the components and before the pump (so the pump gets cooler water going through it) and to have one rad between the CPU and GPU(s) so the water is as cool as possible before going through each component.

Again, I'm not trying to pick apart your system at all - I think it's awesome! I'm just trying to understand why I've seen so many people order their components and water flow in what I believe to be a less efficient, less optimal layout. Perhaps there's something I'm missing? Does what I'm saying make any sense?


----------



## xKrNMBoYx

Quote:


> Originally Posted by *MEC-777*
> 
> Looks great.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've been thinking about doing a custom loop on my own setup with 290's in crossfire as well, but there's something I don't understand. After looking at a number of other people's loops and paying particular close attention to the order of the components in terms of water flow, I don't get why people often have all their system components - CPU, GPU(s) - flowing one after the other, especially when they have more than one rad in their system.
> 
> Looking at your system, for example - and please forgive me, this is just an observation - but the order of flow doesn't make sense to me and I don't think this would give you the best possible temps. It looks like you have things int he following order:
> 
> Res/Pump -> top rad -> front rad -> GPU -> GPU -> CPU -> rear rad -> res/pump.
> 
> Now, to me, I would think the water would be warmer right after going through both GPUs, and having that water go through the CPU right after would that not cause the CPU to run a little warmer? If you have multiple rads, my thinking is it would be best to have a rad between the CPU and GPU blocks and run the water through the CPU first as they generally produce a lot less heat than GPUs can.
> 
> Would this not a better order for water flow?
> 
> Res/pump -> CPU -> rear rad -> GPU -> GPU (with parallel flow between GPU blocks instead of single-line flow) -> front rad -> top rad -> res/pump.
> 
> My thoughts on this is to have the majority of the cooling done after the water flows through the components and before the pump (so the pump gets cooler water going through it) and to have one rad between the CPU and GPU(s) so the water is as cool as possible before going through each component.
> 
> Again, I'm not trying to pick apart your system at all, I think it's awesome! I'm just trying to understand why I've seen so many people who have the order components and flow of water in what I believe to be a less efficient and less optimal layout. Perhaps there's something I'm missing? Does what I'm saying make any sense?


Not offended. During my research phase I asked your exact question A LOT, since by basic logic it seemed to me that going block > rad > block would give better temps.

That said, the short answer is that the majority said order doesn't matter, since the water temp equalizes fairly quickly. There are temp differences, but they're fairly small.

This is my second loop, and the first done by myself, so I lacked planning and execution. The original plan was to go GPU > GPU > rear 120mm > CPU, but I could not get the 120mm rad to connect to the GPU without severe kinking in the tube. The kinking happened around 2am and I did not expect it, so I improvised on the spot. Honestly, the whole loop turned out backwards from how I imagined it: Front/Top Rad > CPU > Rear Rad > GPU/GPU > Res > Pump.

Good news: no leaks after about 10 hours.


----------



## mus1mus

Loop order doesn't matter. Temps will stabilize in no time, and the water flows fast enough that it doesn't warm up significantly from one component to the next.

It's all about ambient and rad space.


----------



## GorillaSceptre

Quote:


> Originally Posted by *Arizonian*
> 
> When you get a chance please post proof (read OP for what is needed) and I'll get you added on the roster. Also for anyone else that hasn't posted proper proof and would like join the list, please do.


Will do.







Just got to wait for it to get here.

I'm only a few years late..


----------



## Moret

My Sapphire stock 290 with EK waterblock and backplate: http://www.techpowerup.com/gpuz/details.php?id=4xq5x
I'd like to be on the list.


----------



## kizwan

Quote:


> Originally Posted by *MEC-777*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *xKrNMBoYx*
> 
> 
> 
> Loop is done and now leak testing. I initially had a problem where the fill port got loose in the beginning but after that I have not seen any leaks. Time to leave it on for 24 hours. Then I can connect everything and get to seeing temp changes from reference cooling.
> 
> 
> 
> 
> 
> 
> 
> Looks great.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I've been thinking about doing a custom loop on my own setup with 290's in crossfire as well, but there's something I don't understand. After looking at a number of other people's loops and paying particular close attention to the order of the components in terms of water flow, I don't get why people often have all their system components - CPU, GPU(s) - flowing one after the other, especially when they have more than one rad in their system.
> 
> Looking at your system, for example - and please forgive me, this is just an observation - but the order of flow doesn't make sense to me and I don't think this would give you the best possible temps. It looks like you have things int he following order:
> 
> Res/Pump -> top rad -> front rad -> GPU -> GPU -> CPU -> rear rad -> res/pump.
> 
> Now, to me, I would think the water would be warmer right after going through both GPUs, and having that water go through the CPU right after would that not cause the CPU to run a little warmer? If you have multiple rads, my thinking is it would be best to have a rad between the CPU and GPU blocks and run the water through the CPU first as they generally produce a lot less heat than GPUs can.
> 
> Would this not a better order for water flow?
> 
> Res/pump -> CPU -> rear rad -> GPU -> GPU (with parallel flow between GPU blocks instead of single-line flow) -> front rad -> top rad -> res/pump.
> 
> My thoughts on this is to have the majority of the cooling done after the water flows through the components and before the pump (so the pump gets cooler water going through it) and to have one rad between the CPU and GPU(s) so the water is as cool as possible before going through each component.
> 
> Again, I'm not trying to pick apart your system at all, I think it's awesome! I'm just trying to understand why I've seen so many people who have the order components and flow of water in what I believe to be a less efficient and less optimal layout. Perhaps there's something I'm missing? Does what I'm saying make any sense?
Click to expand...

Yes, you're missing thermodynamics theory.







Like everyone else said, the gist is that the water warms up slowly and in time reaches a balance - an equilibrium. Heat made equals heat removed, the loop stabilizes, and temperatures stop changing. If you measure the difference between the radiator-out temp & the CPU-out temp, it's only 2 to 3 degrees. This has been verified in testing, not just theory; I've read the test but I don't think I bookmarked it.
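That 2-3 degree figure is easy to sanity-check on the back of an envelope: the coolant's temperature rise across a heat source is dT = P / (m_dot * c_p). The flow rate and heat loads below are assumed example values, not measurements from anyone's loop.

```python
# Why loop order barely matters: at typical flow rates, even the whole
# system's heat load only warms the water a couple of degrees per pass.
# Flow rate and wattages are assumed example values.

C_P_WATER = 4186.0  # J/(kg*K), specific heat of water

def delta_t(heat_w, flow_lpm):
    """Coolant temperature rise across a heat source, in deg C."""
    m_dot = flow_lpm / 60.0 * 0.998  # L/min -> kg/s (water ~0.998 kg/L)
    return heat_w / (m_dot * C_P_WATER)

# ~1 GPM (3.8 L/min) loop: one OC'd 290 (~250 W), and the whole loop
# (two 290s plus an overclocked CPU, ~600 W total):
print(f"per-GPU rise   : {delta_t(250, 3.8):.2f} C")
print(f"whole-loop rise: {delta_t(600, 3.8):.2f} C")
```

So the water entering the CPU block after two GPUs is only about a degree warmer than in the "optimal" order, which is why measured differences land in the 2-3 degree range at worst.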


----------



## battleaxe

I just made a small discovery about coil whine.

My 290X has not had any coil whine that I could hear until today. I made a custom cooler for the VRM2 and screwed it on, and it was then that I noticed significant coil whine while running Heaven. I remembered that my old VRM2 cooler was contacting the choke just below the VRM2 - I had previously stuck a thermal pad to that choke to make the cooler seat better.

When I installed the new cooler I did not do this. I took it back apart, this time put a small roll of thermal pad between the choke and the VRM cooler, then restarted, and the coil whine was gone. Everyone knows it's the coils making the noise.

Just an idea, but if some of you are getting coil whine, you could use some VRM pads and small RAM sinks, if you have them, to tame the noise if it bothers you. It worked on mine.


----------



## MEC-777

Quote:


> Originally Posted by *mus1mus*
> 
> Loop order doesn't matter. Temps will stabilize in no time. And water flows fast enough to be warmed up significantly from a component to the next.
> 
> It's all about ambient and rad space.


Quote:


> Originally Posted by *kizwan*
> 
> Yes, you're missing thermodynamics theory.
> 
> 
> 
> 
> 
> 
> 
> Like everyone else said, the gist is that the water begins to warm up slowly, and in time it reaches a balance, called an equilibrium. Heat is made and heat removed, the loop is stabilized and temperatures will not change. If you measure the temp between the radiator out temp & cpu out temp, the difference is only 2 to 3 degrees. This has been verified, not in theory. I have read the test but I don't think I bookmark it.


Ah, thanks, understood.







I've studied thermodynamics in school, so I know what you mean and what you're trying to explain. I guess I just never realized the delta-T in the actual temperature of the water itself is not that significant, so long as you have sufficient rad dissipation. In that case, yeah, I guess it really doesn't matter.

Still, to satisfy my OCD nature and get temps as low as possible, if I do end up building a custom loop for my system, I'll probably go with the following order: Res/pump > CPU > top 120mm rad > rear 120mm rad > GPU1 > GPU2 > front 240mm rad > Res/pump.


----------



## kizwan

Quote:


> Originally Posted by *MEC-777*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Loop order doesn't matter. Temps will stabilize in no time. And water flows fast enough to be warmed up significantly from a component to the next.
> 
> It's all about ambient and rad space.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Yes, you're missing thermodynamics theory.
> 
> 
> 
> 
> 
> 
> 
> Like everyone else said, the gist is that the water begins to warm up slowly, and in time it reaches a balance, called an equilibrium. Heat is made and heat removed, the loop is stabilized and temperatures will not change. If you measure the temp between the radiator out temp & cpu out temp, the difference is only 2 to 3 degrees. This has been verified, not in theory. I have read the test but I don't think I bookmark it.
> 
> Click to expand...
> 
> 
> 
> 
> 
> Ah, thanks, understood.
> 
> 
> 
> 
> 
> 
> 
> I've studied thermodynamics in school, so I know what you mean and what you're trying to explain. I guess I just never realized the delta-T in the actual temperature of the water itself is not that significant, so long as you have sufficient rad dissipation. In that case, yeah, I guess it really doesn't matter.
> 
> Still, to satisfy my OCD nature and get temps as low as possible, if I do end up building a custom loop for my system, I'll probably go with the following order: Res/pump > CPU > top 120mm rad > rear 120mm rad > GPU1 > GPU2 > front 240mm rad > Res/pump.
Click to expand...

Why not!







My loop order is res+pump >> cpu >> top 360 rad >> 290's >> bottom 120 rad >> front 240 rad >> res+pump. The loops you saw going gpu >> cpu probably have the top rad ports toward the front instead of at the back, in which case the order is actually likely .... >> gpu >> cpu >> top rad >> res+pump >> (and the rest of the loop).

Yeah, I know, the top rad is red, but that was the only one available (in stock) at the time. I was going to paint it but I didn't.


----------



## MEC-777

Quote:


> Originally Posted by *kizwan*
> 
> Why not!
> 
> 
> 
> 
> 
> 
> 
> My loop order is res+pump >> cpu >> top 360 rad >> 290's >> bottom 120 rad >> front 240 rad >> res+pump. The loops that you saw cpu >> gpu probably have the top rad ports toward the front instead of at the back which I think the order is actually likely .... >> gpu >> cpu >> top rad >> res+pump >> (and the rest of the loop).
> 
> Yeah, I know, top rad is red but that was the only one available (in stock) that time. I was going to paint it but I didn't.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Nice setup! Very clean.


----------



## LandonAaron

Does anyone know what the "ACP Application" is that is installed during the catalyst driver installation?


----------



## mus1mus

Quote:


> Originally Posted by *MEC-777*
> 
> Ah, thanks, understood.
> 
> 
> 
> 
> 
> 
> 
> I've studied thermodynamics in school, so I know what you mean and what you're trying to explain. I guess I just never realized the delta-T in the actual temperature of the water itself is not that significant, so long as you have sufficient rad dissipation. In that case, yeah, I guess it really doesn't matter.
> 
> Still, to satisfy my OCD nature and get temps as low as possible, if I do end up building a custom loop for my system, I'll probably go with the following order: Res/pump > CPU > top 120mm rad > rear 120mm rad > GPU1 > GPU2 > front 240mm rad > Res/pump.


Hey, to add to your OCD, configure your loop to be as clean as possible.









Loop order should be chosen to keep tube runs as few and as short as possible. Nothing scientific, just clean runs. You will be satisfied.


----------



## Roboyto

Quote:



> Originally Posted by *MEC-777*
> 
> Ah, thanks, understood.
> 
> 
> 
> 
> 
> 
> 
> I've studied thermodynamics in school, so I know what you mean and what you're trying to explain. I guess I just never realized the delta-T in the actual temperature of the water itself is not that significant, so long as you have sufficient rad dissipation. In that case, yeah, I guess it really doesn't matter.
> 
> Still, to satisfy my OCD nature and get temps as low as possible, if I do end up building a custom loop for my system, I'll probably go with the following order: Res/pump > CPU > top 120mm rad > rear 120mm rad > GPU1 > GPU2 > front 240mm rad > Res/pump.


I felt the same as you with component order and followed through accordingly









@mus1mus Short and sweet tube runs get the job done.

Temps are respectable with a 4.5GHz 4770K and a 1200/1500 290 with only (2) standard-thickness 240 radiators; the top is push with slim Yate Loons, the front push/pull with SP120s.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> I felt the same as you with component order and followed through accordingly
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @mus1mus
> Short and sweet tube runs gets the job done
> 
> Temps are respectable with 4.5GHz 4770k and 1200/1500 290 with only (2) standard thickness 240 radiators; top is push using slim Yate Loons with front push/pull SP120s.


I like your build, but I never saw the appeal of the tube coils... I know they're for anti-kink, but man, they put me off. I think I'd rather see zip ties everywhere... but to each their own. Nice looking stuff.


----------



## Mega Man

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MEC-777*
> 
> Ah, thanks, understood.
> 
> 
> 
> 
> 
> 
> 
> I've studied thermodynamics in school, so I know what you mean and what you're trying to explain. I guess I just never realized the delta-T in the actual temperature of the water itself is not that significant, so long as you have sufficient rad dissipation. In that case, yeah, I guess it really doesn't matter.
> 
> Still, to satisfy my OCD nature and get temps as low as possible, if I do end up building a custom loop for my system, I'll probably go with the following order: Res/pump > CPU > top 120mm rad > rear 120mm rad > GPU1 > GPU2 > front 240mm rad > Res/pump.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hey, to add to your OCD, configure your loop to be as clean as possible.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Loop order should be made to promote less or short tube runs as possible. Nothing scientific. Just clean runs. You will be satisfied.
Click to expand...

Very true. With the exception of res > pump, loop order does not matter! Esp. for temps.


----------



## kizwan

I think Win 10 screwed up my Elpida card's overclocking stability. I can say this after using several drivers in Win 10. Say I want to run Heaven at 1100/1600 - it tends to crash, even if I dial the voltage up by +100mV. I didn't get this problem with Win 7, though. However, I can run Heaven in Crossfire (Elpida + Hynix cards) at 1100/1600 without any problem.


----------



## diggiddi

It seems Win 10 is sensitive to OC, at least CPU OC.


----------



## Mega Man

I have no issues on any of my rigs; never had to change a thing.


----------



## mus1mus

me sad.









My first ever high-end card died. An R9 290 with very good Hynix memory.

So long my friend.....

In memory of you, I present your greatest hits! R.I.P.

http://www.3dmark.com/compare/fs/5607835/fs/5618528/fs/5618504/fs/5608463/fs/5609270/fs/5618444/fs/5607969

http://www.3dmark.com/compare/fs/5661030/fs/5662677/fs/5661065/fs/5660881/fs/5662861/fs/5661261/fs/5660594


----------



## MEC-777

Quote:


> Originally Posted by *mus1mus*
> 
> me sad.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My first ever hi-end card died. R9 290 with a very good Hynix Memory.
> 
> So long my friend.....
> 
> For the memory of you, I present your greatest hits! R.I.P
> 
> http://www.3dmark.com/compare/fs/5607835/fs/5618528/fs/5618504/fs/5608463/fs/5609270/fs/5618444/fs/5607969
> 
> http://www.3dmark.com/compare/fs/5661030/fs/5662677/fs/5661065/fs/5660881/fs/5662861/fs/5661261/fs/5660594


Sorry to hear.









What happened? A bit too much overclocking?


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> me sad.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My first ever hi-end card died. R9 290 with a very good Hynix Memory.
> 
> So long my friend.....
> 
> For the memory of you, I present your greatest hits! R.I.P
> 
> http://www.3dmark.com/compare/fs/5607835/fs/5618528/fs/5618504/fs/5608463/fs/5609270/fs/5618444/fs/5607969
> 
> http://www.3dmark.com/compare/fs/5661030/fs/5662677/fs/5661065/fs/5660881/fs/5662861/fs/5661261/fs/5660594


That's sad to hear, that card was like my 290's long lost brother







. Nice runs however.


----------



## mus1mus

I left it downloading GTA5 and W3 last weekend. No OC, and it stays in a sub-20C room, so I don't think it's the OC. Must have been visited by alien (particles) around that time.

Or, maybe she got tired of chasing Fyzz' card!









RMA in the works BTW.


----------



## By-Tor

I've noticed an issue with my cards' usage lately in BF4. Card 2 doesn't seem to be stepping up to the plate...

Anyone have any ideas what's going on?

Thanks

BF4


Heaven


----------



## MIGhunter

Quote:


> Originally Posted by *By-Tor*
> 
> I've noticed an issue with my cards usage lately in BF4. Card 2 doesn't seem to be stepping up to the plate...
> 
> Anyone have any ideas whats going on?
> 
> Thanks
> 
> BF4
> 
> 
> Heaven


what is the program on the right?


----------



## By-Tor

Quote:


> Originally Posted by *MIGhunter*
> 
> what is the program on the right?


HWINFO windows gadget. Reads HWINFO64 sensors. Has a lot of setting options.


----------



## MIGhunter

Quote:


> Originally Posted by *By-Tor*
> 
> HWINFO windows gadget. Reads HWINFO64 sensors. Has a lot of setting options.


works in win10?


----------



## By-Tor

Quote:


> Originally Posted by *MIGhunter*
> 
> works in win10?


I don't think Windows 8, 8.1 or 10 support gadgets.

I'm on 7...


----------



## Mega Man

No it doesn't; Microsoft removed gadgets because they're a security risk:

http://windows.microsoft.com/en-us/windows/gadgets

http://www.pcworld.com/article/259085/microsoft_urges_users_to_shut_down_windows_gadgets_or_risk_attack.html

http://support.microsoft.com/kb/2719662


----------



## ydrogios

My watercooling system with the fantastic R9 290 Sapphire Toxic. The backplate with the SSD is temporary; I will put on another one.

http://s278.photobucket.com/user/ydrogios/media/my system_zpsy3rx1y9e.jpg.html


----------



## Paul17041993

Quote:


> Originally Posted by *By-Tor*
> 
> I've noticed an issue with my cards usage lately in BF4. Card 2 doesn't seem to be stepping up to the plate...
> 
> Anyone have any ideas whats going on?
> 
> Thanks
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> BF4
> 
> 
> Heaven


Scaling issue by the looks of it. What drivers have you tested with, and how much of an increase over a single card is there? If you're at 1080p, though, you may be CPU-bound more than anything.


----------



## By-Tor

Quote:


> Originally Posted by *Paul17041993*
> 
> Scaling issue by the looks of it, what drivers you tested with and how much of an increase over a single card is there? If you're using 1080p though you may be CPU-bound more than anything.


15.7.1 drivers.

This is something that started happening in the past couple weeks. They both used to stay from 90-100 most of the time while playing in BF4.


----------



## fat4l

Quote:


> Originally Posted by *ydrogios*
> 
> My watercooling system with the fantastic R9 290 Sapphire Toxic. The backplate with the SSD is temporary; I will put on another one.
> 
> http://s278.photobucket.com/user/ydrogios/media/my system_zpsy3rx1y9e.jpg.html


Nice !








Is this your daily core clock? 1240MHz?
Could you please run the EVV application for me, to see what the default voltage for your card is?
This link or This link.


----------



## By-Tor

Quote:


> Originally Posted by *Paul17041993*
> 
> Scaling issue by the looks of it, what drivers you tested with and how much of an increase over a single card is there? If you're using 1080p though you may be CPU-bound more than anything.


I disabled crossfire and unplugged the bottom card and now get much better frame rates in BF4.


----------



## Paul17041993

Quote:


> Originally Posted by *By-Tor*
> 
> 15.7.1 drivers.
> 
> This is something that started happening in the past couple weeks. They both used to stay from 90-100 most of the time while playing in BF4.


Quote:


> Originally Posted by *By-Tor*
> 
> I disabled crossfire and unplugged the bottom card and now get much better frame rates in BF4.


I'm thinking either a driver bug or that second card has lost stability, though unfortunately I haven't done crossfire so my knowledge on it is very limited...


----------



## By-Tor

Whats a good new driver to use?


----------



## EpicOtis13

Quote:


> Originally Posted by *By-Tor*
> 
> Whats a good new driver to use?


15.10 is awesome on my xfire 290's


----------



## By-Tor

Quote:


> Originally Posted by *EpicOtis13*
> 
> 15.10 is awesome on my xfire 290's


ill give them a try... ty


----------



## jbottz

My crossfire 290X WC loop... Just to echo the short/clean philosophy:

(I know that I need to clean up the cable management a bit, but I'm waiting on new PCIe power extenders to arrive.)


----------



## By-Tor

Quote:


> Originally Posted by *Paul17041993*
> 
> I'm thinking either a driver bug or that second card has lost stability, though unfortunately I haven't done crossfire so my knowledge on it is very limited...


Quote:


> Originally Posted by *EpicOtis13*
> 
> 15.10 is awesome on my xfire 290's


switched to the 15.10 drivers and all is well now.

Thanks


----------



## EpicOtis13

Quote:


> Originally Posted by *By-Tor*
> 
> switched to the 15.10 drivers and all is well now.
> 
> Thanks


np man, they worked for me so I thought I might as well share the news.


----------



## ydrogios

Here is the default voltage

1.21.JPG 32k .JPG file


----------



## kizwan

Quote:


> Originally Posted by *MIGhunter*
> 
> Quote:
> 
> 
> 
> Originally Posted by *By-Tor*
> 
> HWINFO windows gadget. Reads HWINFO64 sensors. Has a lot of setting options.
> 
> 
> 
> works in win10?
Click to expand...

Open Hardware Monitor gadget works in Win 10.


----------



## Gumbi

Quote:


> Originally Posted by *ydrogios*
> 
> My watercooling system with the fantastic R9 290 Sapphire Toxic. The backplate with the SSD is temporary; I will put on another one.
> 
> http://s278.photobucket.com/user/ydrogios/media/my system_zpsy3rx1y9e.jpg.html


Sick overclock! I see your board limit is only set to 25% - do you find this limits you? Why not set it to 50%?


----------



## fyzzz

Thinking about selling my 290... I have no use for it right now. I doubt, however, that I'll get it sold here where I live, and I won't sell it for a 'normal' price. Or I could sell the 980 Ti and go back to the 290... decisions, decisions.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> Thinking about selling my 290..I have no use for it right now. I doubt however that i will get it sold here where i live and i will not sell it for a 'normal' price. Or i can sell the 980 ti and go back to the 290...decisions decisions


Twice as fast and more VRAM. Looks like an easy pick.

edit: your 290 is special, maybe only 30% slower.


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> twice as fast and more vram. looks like an easy pick.
> 
> edit: your 290 is special, maybe 30% slower.


Yeah I know, but I like my 290 more. And it did everything I asked of it. The reason I got this 980 Ti in the first place was the price; I got it for 637€. This 980 Ti also has bad coil whine.


----------



## mus1mus

I sold my 980 Ti in the hope that I could grab a 290 that could match mine.

I traded it for two 290s and a bit of cash that could easily get me a third.

One of them is an unlockable (unlocked) 290X. And it occurred to me: damn, it's hard to find a 290 like mine. Much more so Fyzz's.









That thing smothers my 290X in FS, even with the 290X at a higher core clock!

http://www.3dmark.com/compare/fs/6324704/fs/5607835/fs/5618528/fs/5618504

You gotta understand, fyzz, some things are just hard to replicate. But I see that your 980 Ti is also good. (Good is common on 980 Tis anyway; even my ref card did 1520/2040 some driver revs ago.)


----------



## diggiddi

Quote:


> Originally Posted by *EpicOtis13*
> 
> 15.10 is awesome on my xfire 290's


Quote:


> Originally Posted by *mus1mus*
> 
> I sold my 980ti in a hope that I could grab a 290 that could match mine.
> 
> I traded it for two 290s and a bit of cash that could easily get me a third.
> 
> One of them is an unlockable (unlocked) 290X. And occured, damn, hard to find a 290 like mine. Much more Fyzz's.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That thing smothers my 290X with a higher core clock in FS!
> 
> http://www.3dmark.com/compare/fs/6324704/fs/5607835/fs/5618528/fs/5618504
> 
> You gotta understand fyzz, some things are just hard to replicate. But I see that your 980TI is also good. (good anyway is common on 980TIs. Even my ref card does 1520/2040 some driver revs ago.)


Guys, what are the recommended CPU-NB and HT settings for CrossFire?
I also read it's best to run dual-rank 2400MHz RAM to ease any CPU bottlenecking that might occur due to bandwidth, or am I misinterpreting this?
http://overclocking.guide/ddr3-ram-myths-enlightened/


----------



## GoLDii3

Do you think it's worth going from a 280X to a used R9 290X Tri-X?

It should cost me 80-90 bucks if I sell my 280X. I was thinking about waiting for Greenland, but I doubt I will be able to get anything better than 290X performance within the 250-300 bucks range.

The R9 290/290X will probably be mid-to-high-tier GPUs when Greenland launches. Just like the 280X now.


----------



## diggiddi

Quote:


> Originally Posted by *GoLDii3*
> 
> *Do you think it's worth going from a 280X to a used R9 290X Tri-X?*
> 
> It should cost me 80-90 bucks if I sell my 280X. I was thinking about waiting for Greenland, but I doubt I will be able to get anything better than 290X performance within the 250-300 bucks range.
> 
> The R9 290/290X will probably be mid-to-high-tier GPUs when Greenland launches. Just like the 280X now.


Yes. I went from a 7950 at 1165/1300 to a 290X and could tell the difference.


----------



## MEC-777

Quote:


> Originally Posted by *GoLDii3*
> 
> Do you think it's worth going from a 280X to a used R9 290X Tri-X?
> 
> It should cost me 80-90 bucks if I sell my 280X. I was thinking about waiting for Greenland, but I doubt I will be able to get anything better than 290X performance within the 250-300 bucks range.
> 
> The R9 290/290X will probably be mid-to-high-tier GPUs when Greenland launches. Just like the 280X now.


I went from a 7950 (280) to a 290 and it was quite a jump in performance. So yes, it's definitely worth it.









You might want to consider upgrading your PSU at the same time, though. 520W is barely enough for a 290X at stock clocks. The minimum I would recommend is a good quality 650W Gold unit.


----------



## GoLDii3

Quote:


> Originally Posted by *MEC-777*
> 
> I went from a 7950(280) to a 290 and it was quite a jump in performance. So yes, it's definitely worth it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You might want to consider upgrading your PSU at the same time, though. 520W is barely enough for a 290X at stock clocks. The minimum I would recommend is a good quality 650W Gold unit.


520W is enough for an R9 290X. Most of them have an 8+6-pin configuration, which means 300W max draw by spec. I have no problems with a 280X, which should already pull 250W easily at stock. I usually clock it at 1150MHz and have never had a problem. Not even with a 3570K.

That said, damn, I should have kept my 7950. I got it basically for free thanks to an RMA, then I sold it and spent 70 bucks on this 280X. Wasn't a smart move. Trying to avoid doing something like that again; 7950 to 7970 is like 10-15% more at most.

That means 7950 to 290X is like 35-45%, while 7970 to 290X is like 25-30%. Oh well, I don't think I will buy anything better than a 290X performance-wise even if I wait for Greenland.
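For reference, the 300W figure is just the sum of the PCIe spec ceilings for a card's power inputs. A minimal sketch of that arithmetic (these are the official spec limits only, not measured draw; as other posters point out, real cards can spike past them):

```python
# PCIe spec power ceilings in watts (nominal limits, not measurements).
PCIE_SLOT = 75   # power delivered through the motherboard slot
PIN6 = 75        # one 6-pin PEG connector
PIN8 = 150       # one 8-pin PEG connector

def board_power_budget(slot=True, six_pins=0, eight_pins=0):
    """Sum the spec ceilings for a card's power inputs."""
    return (PCIE_SLOT if slot else 0) + six_pins * PIN6 + eight_pins * PIN8

# A reference 290/290X with one 6-pin and one 8-pin:
print(board_power_budget(six_pins=1, eight_pins=1))  # 300
```

So 75W (slot) + 75W (6-pin) + 150W (8-pin) = 300W; that's where the "max 300W" claim comes from, even though actual draw under overclock can exceed it.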


----------



## buttface420

Quote:


> Originally Posted by *GoLDii3*
> 
> Do you think it's worth going from a 280X to a used R9 290X Tri-X?
> 
> It should cost me 80-90 bucks if I sell my 280X. I was thinking about waiting for Greenland, but I doubt I will be able to get anything better than 290X performance within the 250-300 bucks range.
> 
> The R9 290/290X will probably be mid-to-high-tier GPUs when Greenland launches. Just like the 280X now.


Yes it is! I did the same, going from a 280X to a 290X.


----------



## Rob27shred

Guess I better get my name on the list if you are still adding them! My 290X arrived today







It is a MSI R9 290X Gaming 4GB, here are my pics. Please add my name when you get a chance.


----------



## Mega Man

Quote:


> Originally Posted by *GoLDii3*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MEC-777*
> 
> I went from a 7950(280) to a 290 and it was quite a jump in performance. So yes, it's definitely worth it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You might was to consider upgrading your PSU at the same time though. 520w is barely enough for a 290x at stock clocks. Minimum I would recommend is a good quality 650w gold unit.
> 
> 
> 
> 520W is enough for an R9 290X. Most of them have an 8+6-pin configuration, which means 300W max draw by spec. I have no problems with a 280X, which should already pull 250W easily at stock. I usually clock it at 1150MHz and have never had a problem. Not even with a 3570K.
Click to expand...

AMD has already stated they push the power boundaries.

Just an FYI.

IIRC, TSM has a screenshot showing two 290Xs and his whole PC pulling 1300W from the wall...


----------



## Arizonian

Quote:


> Originally Posted by *mus1mus*
> 
> me sad.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My first ever hi-end card died. R9 290 with a very good Hynix Memory.
> 
> So long my friend.....
> 
> For the memory of you, I present your greatest hits! R.I.P
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://www.3dmark.com/compare/fs/5607835/fs/5618528/fs/5618504/fs/5608463/fs/5609270/fs/5618444/fs/5607969
> 
> http://www.3dmark.com/compare/fs/5661030/fs/5662677/fs/5661065/fs/5660881/fs/5662861/fs/5661261/fs/5660594


Sorry to hear this. Please let us know how the RMA goes.
Quote:


> Originally Posted by *ydrogios*
> 
> My watercooling system with the fantastic Sapphire R9 290 Toxic. The backplate with the SSD is temporary; I will put another one in.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> http://s278.photobucket.com/user/ydrogios/media/my system_zpsy3rx1y9e.jpg.html


Congrats - added















Quote:


> Originally Posted by *Rob27shred*
> 
> Guess I better get my name on the list if you are still adding them! My 290X arrived today
> 
> 
> 
> 
> 
> 
> 
> It is a MSI R9 290X Gaming 4GB, here are my pics. Please add my name when you get a chance.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## Harry604

290 Tri-X or GTX 970 ACX 2.0+ SSC?

my monitor is 2560x1440 @ 120Hz


----------



## diggiddi

I just doubled up........ or is it doubled down


Spoiler: Warning: Spoiler!






Some better shots



Spoiler: Warning: Spoiler!


----------



## diggiddi

Quote:


> Originally Posted by *Harry604*
> 
> 290 Tri-X or GTX 970 ACX 2.0+ SSC?
> 
> my monitor is 2560x1440 @ 120Hz


290 for sure


----------



## MEC-777

Quote:


> Originally Posted by *GoLDii3*
> 
> 520W is enough for a R9 290X. Usually most of them have 8+6 pin. That means MAX 300W draw.


No, it isn't. Will it run? Yes, for the most part, but it's not ideal, and it will significantly shorten the life of that PSU. 290Xs will pull more than 300W. Don't go by the 8-pin and 6-pin ratings; many high-end GPUs exceed those ratings even at stock clocks. The power draw of these GPUs is constantly being regulated and can spike momentarily to over 400W. A lower-capacity PSU will have a much more difficult time dealing with those high load spikes, and that's what will kill it sooner.

The PSU is the most important part of a PC and should not be compromised. Do yourself a favor: if you're going to upgrade to a 290X, sell your 520W unit and use that money to help pay for a good quality unit of at least 650W, Gold rated. Save yourself the extra costs and headaches later on, when that 520W unit starts to fail and possibly takes out other components with it.









It's your system, you can do what you want, but I speak from experience and from doing A LOT of research about PSU's and GPU power draw characteristics. I STRONGLY recommend you upgrade your PSU if you install a 290x.
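That sizing advice can be put into a rough headroom calculation. A sketch under stated assumptions: the spike factor, sustained-load target, and wattage estimates below are illustrative guesses, not measurements of any particular card or PSU:

```python
def recommend_psu_watts(cpu_w, gpu_w, other_w=75, spike_factor=1.35, load_target=0.8):
    """Rough PSU sizing: budget for transient GPU spikes and keep the
    estimated peak at or below load_target of the PSU's rating.
    All inputs are illustrative estimates."""
    peak = cpu_w + gpu_w * spike_factor + other_w
    return peak / load_target

# Illustrative: ~100W CPU + ~300W Hawaii-class GPU + drives/fans/pumps
print(round(recommend_psu_watts(100, 300)))  # 725
```

With those assumed numbers the math lands around 725W, which is roughly why a quality 650-750W unit keeps coming up as the single-card recommendation.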


----------



## diggiddi

Quote:


> Originally Posted by *MEC-777*
> 
> No it isn't. Will it run? Yes, for the most part but it's not ideal and it will significantly shorten the life of that PSU. 290x's will pull more than 300w. Don't go by the 8 and 6 pin ratings. Many high-end GPUs exceed those ratings even at stock clocks. The power draw of these GPUs is constantly being regulated and can draw momentary spikes of over 400w. A lower capacity PSU will have a much more difficult time trying to deal with those high load spikes and that's what will kill it sooner.
> 
> The PSU is the most important part of a PC and should not be compromised. Do yourself a favor; if you're going to upgrade to a 290x, then sell your 520w unit and use that money to help pay for a good quality unit with at least 650w and gold rated. Save your self the extra costs and headaches later on when that 520w unit starts to fail and possibly takes out other components with it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's your system, you can do what you want, but I speak from experience and from doing A LOT of research about PSU's and GPU power draw characteristics. I STRONGLY recommend you upgrade your PSU if you install a 290x.


Whaddabout CrossFire? I'm running stock right now, but I'm thinking 1300-1600W for both GPU and CPU with stout OCing.


----------



## MEC-777

Quote:


> Originally Posted by *diggiddi*
> 
> Whaddabout Crossfire? I'm running on stock right now but I'm thinking 1300-1600w for both GPU and CPU Stout OCing


I'm running an EVGA 850 GS with my 290 CrossFire setup, which is plenty. I've had both cards at 1200/1400, which it can handle just fine. But my CPU is a locked i5 (just a slight OC via a base clock bump) and I run my 290s at stock clocks 90% of the time anyway, so it's more than enough for me.

For anyone looking to run CrossFire 290s or 290Xs who is planning some hefty OCing on both the GPUs and the CPU, I'd recommend at least a 1000W Gold unit. 1200W is more than plenty, and 1300W+ I would say is excessive.









It also depends on what else is in your system. How many HDD's? How many fans? Custom loop water cooling? Have to take that into account as well.


----------



## diggiddi

Repped up you are, sir.
I'm thinking of adding a third card and also an FX-9590, so I'd like to buy one PSU that can feed those beasts and feed them well.


----------



## kizwan

Quote:


> Originally Posted by *diggiddi*
> 
> Repped up, you are sir
> I'm thinking of adding a third and also an FX 9590 so I'd like to buy one psu that can possibly feed those beasts and feed them well


I'd go with 1300W minimum then. Any 80 Plus rated PSU should be good. All those Bronze, Silver, Gold, or Platinum ratings are mostly good for your electricity bill; unless your computer is under load 24/7, I don't think you'll see much difference between them.
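To put the electricity-bill point in numbers, here is a quick sketch of the yearly cost gap between efficiency tiers. The load, daily hours, efficiency figures, and price per kWh are all assumptions for illustration:

```python
def annual_energy_cost(dc_load_w, efficiency, hours_per_day, usd_per_kwh=0.13):
    """Yearly electricity cost: wall draw = DC load / efficiency."""
    wall_w = dc_load_w / efficiency
    kwh_per_year = wall_w * hours_per_day * 365 / 1000
    return kwh_per_year * usd_per_kwh

# 400W DC load, 4 hours/day of gaming: ~85% (Bronze-ish) vs ~90% (Gold-ish)
bronze = annual_energy_cost(400, 0.85, 4)
gold = annual_energy_cost(400, 0.90, 4)
print(round(bronze - gold, 2))  # only a few dollars per year
```

Under those assumptions the difference is around five dollars a year, which backs up the point: for a machine that isn't loaded 24/7, the efficiency tier alone is not a big money saver.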


----------



## diggiddi

Quote:


> Originally Posted by *kizwan*
> 
> I'd go with 1300 minimum then. Any 80Plus rated PSUs should be good. All that bronze, silver, gold or platinum rating only good for your electricity bills which unless your computer always under load all the time 24/7, I think you won't see much difference between them.










+1 to you. I've got my eye on a Lepa 1600.


----------



## EpicOtis13

Quote:


> Originally Posted by *diggiddi*
> 
> Repped up, you are sir
> I'm thinking of adding a third and also an FX 9590 so I'd like to buy one psu that can possibly feed those beasts and feed them well


Don't get the 9590; it will bottleneck and cause a headache, especially with three cards. Also, for a PSU suggestion, the EVGA SuperNOVA G2s are excellent. They have a nice weight, and I have never had any problems with the three that I own.


----------



## diggiddi

Quote:


> Originally Posted by *EpicOtis13*
> 
> Don't get the 9590, it will bottle neck an cause a headache especially with three cards. Also for a PSU suggestion, the Evga Supernova G2's are excellent, they have a nice weight and I have never had any problems with the three that I own.


I'm sticking with AMD for several reasons, but thanks for the advice. I'll take a look at the G2s.


----------



## mfknjadagr8

Quote:


> Originally Posted by *EpicOtis13*
> 
> Don't get the 9590, it will bottle neck an cause a headache especially with three cards. Also for a PSU suggestion, the Evga Supernova G2's are excellent, they have a nice weight and I have never had any problems with the three that I own.


red1866 would disagree, as would I. But yeah, the G2 is good, not because of its weight but because it's a great OEM rebrand. I've heard great things about the Lepa models too...


----------



## mus1mus

Or go straight for the 1600T2.

Why? Coz I want to be moar overkill than anyone else.









I have not tested my 3 reference 290s with the FX at 5.2 - 5.5 but I bet I will need my two Seasonic 1250s for the oooomph!


----------



## diggiddi

Quote:


> Originally Posted by *mus1mus*
> 
> Or go straight for the 1600T2.
> 
> Why? coz want to be moar overkill than anyone else.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have not tested my 3 reference 290s with the FX at 5.2 - 5.5 but I bet I will need my two Seasonic 1250s for the oooomph!


Whoa there, a $400+ PSU!!!! Thank you, but I'll pass on that one.


----------



## mus1mus

Quote:


> Originally Posted by *diggiddi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Or go straight for the 1600T2.
> 
> Why? coz want to be moar overkill than anyone else.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have not tested my 3 reference 290s with the FX at 5.2 - 5.5 but I bet I will need my two Seasonic 1250s for the oooomph!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Whoa there a $400+ PSU!!!! Thank you but I'll pass on that one
Click to expand...

lol

Good things don't come cheap, she said.
If you want it, come and get it, she said.









And woops,

If there's a *T2*, there should also be a *P2* and a *G2*


----------



## MEC-777

Quote:


> Originally Posted by *kizwan*
> 
> I'd go with 1300 minimum then. Any 80Plus rated PSUs should be good. All that bronze, silver, gold or platinum rating only good for your electricity bills which unless your computer always under load all the time 24/7, I think you won't see much difference between them.


Agree with the 1300W minimum for a 9590 with three Hawaii cards. However, there are differences beyond the 80+ ratings that should be noted; it's not just a difference in efficiency. The higher-efficiency models also use higher-quality components and better-designed, better-built circuitry, which will prolong the life of the unit and give you cleaner, more stable power at all times. A number of the lower-efficiency (80+ Bronze or below) models use what is called a group-regulated design for the 3.3V, 5V, and 12V rails, which is something to stay FAR away from. So by sticking with at least Gold-rated units, you're pretty much guaranteed independent rail voltage regulation and better components overall.


----------



## Rob27shred

Quote:


> Originally Posted by *diggiddi*
> 
> I just doubled up........ or is it doubled down
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Some better shots
> 
> 
> 
> Spoiler: Warning: Spoiler!


Very nice! I almost grabbed some 290X Lightnings off the marketplace here, but I've spent so much on my PC lately that I settled on that MSI Gaming 290X for $250. I'm hoping it will play well with my XFX DD 390 in CrossFire. My 1250W Seasonic should be here today, so I'll know for sure soon. How are them Lightnings doing for you?


----------



## mus1mus

Quote:


> Originally Posted by *EpicOtis13*
> 
> Quote:
> 
> 
> 
> Originally Posted by *diggiddi*
> 
> Repped up, you are sir
> I'm thinking of adding a third and also an FX 9590 so I'd like to buy one psu that can possibly feed those beasts and feed them well
> 
> 
> 
> Don't get the 9590, it will bottle neck *BE GIMPED* an cause a headache especially with three cards *BENCHMARKS*. Also for a PSU suggestion, the Evga Supernova G2's are excellent, they have a nice weight and I have never had any problems with the three that I own.
Click to expand...

cuhrrected fuhr Ü sir.


----------



## mfknjadagr8

Quote:


> Originally Posted by *mus1mus*
> 
> cuhrrected fuhr Ü sir.


lol nice


----------



## Rob27shred

Thanks for the add, @Arizonian!


----------



## gatygun

Quote:


> Originally Posted by *GoLDii3*
> 
> 520W is enough for an R9 290X. Most of them have an 8+6-pin configuration, which means 300W max draw by spec. I have no problems with a 280X, which should already pull 250W easily at stock. I usually clock it at 1150MHz and have never had a problem. Not even with a 3570K.
> 
> That said, damn, I should have kept my 7950. I got it basically for free thanks to an RMA, then I sold it and spent 70 bucks on this 280X. Wasn't a smart move. Trying to avoid doing something like that again; 7950 to 7970 is like 10-15% more at most.
> 
> That means 7950 to 290X is like 35-45%, while 7970 to 290X is like 25-30%. Oh well, I don't think I will buy anything better than a 290X performance-wise even if I wait for Greenland.


I had a 290 Tri-X with one 8-pin and one 6-pin, yet it would push up to 400W at times. (I believe 6-pins can also carry 150W perfectly fine; they just lack the extra ground pins.)

I currently have a 970 plus my i7 870 OC'd to 4GHz, and both stressed consume up to 505W together. A 970 is a lot more energy efficient than a 290, let alone a 290X, which has even more hardware on it.

So a 520W Gold-rated unit is way too little if you plan on OC'ing a 290X, especially since pushing your PSU to its limits will kill it much faster than running 100W or so under them. My 2x 290s killed my 1000W Silver PSU because I pushed them to their limits.

I wouldn't go lower than 650W for a single 290X, and even then a 750W would probably fare a lot better with more headroom. For CrossFire, 1000W Gold is absolutely the minimum if you plan on overclocking, though.


----------



## diggiddi

Quote:


> Originally Posted by *MEC-777*
> 
> Agree with the 1300 minimum for a 9590 with 3 Hawaii cards. However, there are differences beyond the 80+ ratings that should be noted. It's not just a difference in efficiency. The higher efficiency models also use higher quality components, better designed and better built circuitry as well which will prolong the life of the unit and give you more clean, stable power at all times. A number of the lower efficiency (80+ bronze or below) models use what is called a group regulation design for the 3.3, 5 and 12v rails which is something to stay FAR away from. So by sticking with at least gold rated units, you're pretty much guaranteed individual rail voltage regulation and better components overall.


Good advice +rep again

Quote:


> Originally Posted by *Rob27shred*
> 
> Very nice, I almost grabbed some 290x Lightnings off the marketplace here but have spent so much on PC lately, I settled on that MSI gaming 290x for $250. I'm hoping it will play well with my XFX DD 390 in crossfire. My 1250w Seasonic should be today so I'll know for sure soon. How are them Lightnings doing for you?


Still trying to get Afterburner to work correctly. I can't adjust any settings and it's not reading GPU 2's memory usage, etc. Not too sure what to do.
It looks like there are very few games it can't handle at 60 FPS at 1080p max settings, Crysis 3 being one of them.

Quote:


> Originally Posted by *mus1mus*
> 
> cuhrrected fuhr Ü sir.


Huh, I'm lost. So you agree with him?


----------



## Roboyto

Quote:



> Originally Posted by *MEC-777*
> 
> Agree with the 1300 minimum for a 9590 with 3 Hawaii cards. However, there are differences beyond the 80+ ratings that should be noted. It's not just a difference in efficiency. The higher efficiency models also use higher quality components, better designed and better built circuitry as well which will prolong the life of the unit and give you more clean, stable power at all times. A number of the lower efficiency (80+ bronze or below) models use what is called a group regulation design for the 3.3, 5 and 12v rails which is something to stay FAR away from. So *by sticking with at least gold rated units, you're pretty much guaranteed individual rail voltage regulation and better components overall*.


What he said ^

Quote:


> Originally Posted by *MEC-777*
> 
> No it isn't. Will it run? Yes, for the most part but it's not ideal and it will significantly shorten the life of that PSU. 290x's will pull more than 300w. Don't go by the 8 and 6 pin ratings. Many high-end GPUs exceed those ratings even at stock clocks. The power draw of these GPUs is constantly being regulated and can draw momentary spikes of over 400w. A lower capacity PSU will have a much more difficult time trying to deal with those high load spikes and that's what will kill it sooner.
> 
> The PSU is the most important part of a PC and should not be compromised. Do yourself a favor; if you're going to upgrade to a 290x, then sell your 520w unit and use that money to help pay for a good quality unit with at least 650w and gold rated. Save your self the extra costs and headaches later on when that 520w unit starts to fail and possibly takes out other components with it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's your system, you can do what you want, but I speak from experience and from doing A LOT of research about PSU's and GPU power draw characteristics. I STRONGLY recommend you upgrade your PSU if you install a 290x.


*I'm not disagreeing with you/anyone, or saying anyone should run a small PSU, just stating my experience.*

I ran a fairly well-OC'd 4790K and R9 290 on a Rosewill Capstone 450W Gold for ~18 months without any issues at all; Capstones are a Super Flower rebrand. The same PSU had been through numerous system/GPU upgrades prior to fulfilling that duty: Z77, Z87, Z97, 3570, 3770, 4770, GT 640, HD 7770, GTX 670, R9 270X.

I never had any hiccups when running stress tests, benchmarks, BD rips, or games with that little PSU driving that hardware. Both CPU and GPU were cooled with a single 120mm AIO; 2 pumps, a few fans, SSD, HDD, and ODD.

4790k runs at 4.7GHz ~1.28V with 1.5V 2133 RAM. Stable in IBT and for countless BD rips.

The 290 I benched up to +187mV and 1200+ core, but ran daily with +37mV @ 1075/1375.


----------



## Mega Man

80 Plus Gold means nothing. Nothing! It says nothing about voltage regulation.

http://www.overclock.net/t/711542/on-efficiency

That is from 2010...and yet just as relevant now. . .
Quote:


> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *EpicOtis13*
> 
> Don't get the 9590, it will bottle neck an cause a headache especially with three cards. Also for a PSU suggestion, the Evga Supernova G2's are excellent, they have a nice weight and I have never had any problems with the three that I own.
> 
> 
> 
> red1866 would disagree as would I but yeah the g2 it's good but not because of its weight but because it's a great oem rebrand I've heard great things about the lepa models too...
Click to expand...

Red1776.

The Lepas were top of the line, but now there are better PSUs. I hope they soon release another that ages this well.
Quote:


> Originally Posted by *diggiddi*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Or go straight for the 1600T2.
> 
> Why? coz want to be moar overkill than anyone else.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I have not tested my 3 reference 290s with the FX at 5.2 - 5.5 but I bet I will need my two Seasonic 1250s for the oooomph!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Whoa there a $400+ PSU!!!! Thank you but I'll pass on that one
Click to expand...

There are other ratings; the top one is Titanium. There are also Gold and Platinum, which are a better value. You should look into the 2kW PSU by Super Flower (it's a Leadex).


----------



## mfknjadagr8

Quote:


> Originally Posted by *Mega Man*
> 
> 80 plus gold means nothing. Nothing ! on voltage regulation.
> 
> http://www.overclock.net/t/711542/on-efficiency
> 
> That is from 2010...and yet just as relevant now. . .
> Red1776.
> 
> The lepa were top of the line. But now there is better psus I hope they soon release another that ages this well
> There is other ratings that is titanium. There is gold and plat. Which are a better value. You should look into the 2kw psu by SuperFlower ( its a Leadex)


wow I botched that one up...not sure how I got that number tbh...auto correct I guess lol


----------



## Mega Man

Hehe. I miss him. Hope he is doing well


----------



## mus1mus

Same here. ^

Quote:


> Originally Posted by *diggiddi*
> 
> Huh I'm lost so u agree with him?


Do you agree?









I keep coming back to my FX for a reason. And, http://www.3dmark.com/compare/fs/5661030/fs/5607835

The only gripes I have really is having a lower Graphics score on PCIe 2.0 vs 3.0.


----------



## kizwan

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I'd go with 1300 minimum then. Any 80Plus rated PSUs should be good. All that bronze, silver, gold or platinum rating only good for your electricity bills which unless your computer always under load all the time 24/7, I think you won't see much difference between them.
> 
> 
> 
> Agree with the 1300 minimum for a 9590 with 3 Hawaii cards. However, there are differences beyond the 80+ ratings that should be noted. It's not just a difference in efficiency. The higher efficiency models also use higher quality components, better designed and better built circuitry as well which will prolong the life of the unit and give you more clean, stable power at all times. A number of the lower efficiency (80+ bronze or below) models use what is called a group regulation design for the 3.3, 5 and 12v rails which is something to stay FAR away from. So by sticking with at least gold rated units, you're pretty much guaranteed individual rail voltage regulation and better components overall.
Click to expand...

Group regulation is only found in low-budget, low-wattage PSUs. That being said, group regulation is not necessarily bad. I have read a couple of reviews of low-wattage PSUs (up to 600W) using group regulation that have excellent voltage regulation on all rails. So no problem there if people get good quality PSUs.

I have also read a couple of times before about people's Gold-rated PSUs dying. So to me, the 80 Plus rating is no more than an efficiency rating.
Quote:


> The bad:
> 1. Their tests are conducted inside a thermal chamber with a constant temperature of 23° C (73.4° F) ±5%. This is ridiculous as no computer in the world works internally at such low temperature. The problem is that as temperature increases power supplies start consuming more from the power grid in order to deliver the same amount of power on their outputs, so efficiency typically decreases with temperature. Our tests here on Hardware Secrets are conducted with a temperature between 45° C and 50° C (113° F and 122° F) inside our thermal chamber, as we want to measure power supplies under real-world conditions.
> 2. Power supplies are tested only under three loads: 20%, 50% and 100% (called "light," "typical" and "full," respectively). At one hand the use of these three loads is enough for having an overall idea of the power supply efficiency. On the other hand, for a more precise measurement it is our opinion that they needed to do tests under several different loads, especially when they are charging for doing so. In our tests we test power supplies under five different loads: 20%, 40%, 60%, 80%, and 100%. On the other hand, the new Titanium certfication provides minimum requirements for 10% load, which is excellent.
> 3. They don't disclose the exact equipments (e.g., manufacturers and models) they use on their testing.
> 
> Read more at http://www.hardwaresecrets.com/understanding-the-80-plus-certification/2/#BeSIIPOHEheAjV0L.99


Quote:


> Originally Posted by *mus1mus*
> 
> Same here. ^
> 
> Quote:
> 
> 
> 
> Originally Posted by *diggiddi*
> 
> Huh I'm lost so u agree with him?
> 
> 
> 
> Do you agree?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I keep coming back to my FX for a reason. And, http://www.3dmark.com/compare/fs/5661030/fs/5607835
> 
> The only gripes I have really is having a lower Graphics score on PCIe 2.0 vs 3.0.
Click to expand...

In benchmarks, yes (gimped). But can you game with triple 290s and an FX CPU? Yes. I don't think anyone with triple 290s is going to play at 1080p.


----------



## mus1mus

HI-RES, +1

SOO TRUE.

And with DX12 GAMES coming,









I know I have my CPU clocked way up where most users will not be able to reach, 'cept for my fellas at the Vish thread who taught this guy how to properly OC, but a decently clocked AMD FX will play most games just fine.

This is not an Intel vs. AMD debate, but rather a truth that is misunderstood quite often. (I have a lot of friends who say, "Don't go for an AMD CPU, it's crap," blah blah!) But they run a locked i5, for heaven's sake.


----------



## EpicOtis13

Quote:


> Originally Posted by *mus1mus*
> 
> HI-RES, +1
> 
> SOO TRUE.
> 
> And with DX12 GAMES coming,
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I know I have my CPU clocked up top where most users will not be able to reach, 'cept for my fellas at the Vish thread who taught this guy how to properly OC, but a decent clocked AMD FX will play most games just fine.
> 
> This is not an Intel vs. AMD debate but rather some truth that has been misunderstood quite often. (I have a lot of friends that say, don't go for an AMD CPU, it's crap, blah blah!) But they run a locked i5 for heaven's sake.


I am definitely not saying that AMD CPUs are crap at all; I love the 8320 in my second rig. But there is definitely a tangible benefit to getting a CPU with stronger single-core perf. Right now I have a 5930K, which beats my old 4790K only in super CPU-bound games and at 4K, but going from an 8320 to a 4790K at 4.8GHz was like night and day. My 8320 at 4.6 is a great chip, but you need the single-core perf for most games.


----------



## mus1mus

Well, you have CPUs that are actually heaven and earth apart.

A 4.8GHz 4790K is a monster in single-core perf.
A 4.6GHz AMD FX is mediocre.

Things will really become apparent that way. Even your 5930K's single-core perf is no match for that 4790K, in fact.


----------



## GoLDii3

Quote:


> Originally Posted by *MEC-777*
> 
> No it isn't. Will it run? Yes, for the most part but it's not ideal and it will significantly shorten the life of that PSU. 290x's will pull more than 300w. Don't go by the 8 and 6 pin ratings. Many high-end GPUs exceed those ratings even at stock clocks. The power draw of these GPUs is constantly being regulated and can draw momentary spikes of over 400w. A lower capacity PSU will have a much more difficult time trying to deal with those high load spikes and that's what will kill it sooner.
> 
> The PSU is the most important part of a PC and should not be compromised. Do yourself a favor; if you're going to upgrade to a 290x, then sell your 520w unit and use that money to help pay for a good quality unit with at least 650w and gold rated. Save your self the extra costs and headaches later on when that 520w unit starts to fail and possibly takes out other components with it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's your system, you can do what you want, but I speak from experience and from doing A LOT of research about PSU's and GPU power draw characteristics. I STRONGLY recommend you upgrade your PSU if you install a 290x.


Whatever you say.









Why don't you watch some reviews?

I was talking about the Tri-X, which doesn't even reach 350W at peak. Yes, peak.

https://www.techpowerup.com/reviews/Sapphire/R9_290X_Tri-X_OC/22.html
http://www.hardocp.com/article/2014/08/01/sapphire_vaporx_r9_290x_trix_oc_video_card_review/9#.VjF3uW5HyUk
http://hexus.net/tech/reviews/graphics/70913-sapphire-radeon-r9-290x-vapor-x-4gb/?page=10
http://www.legitreviews.com/sapphire-r9-290x-vaporx-oc-4gb-video-card-review_142216/12

520W is plenty for a single GPU and an Intel CPU. Total system draw of my PC without the GPU must be 150W tops. Now, if you have AMD, I've got some bad news.


----------



## mus1mus

'Til you put overclocking into the equation.


----------



## fat4l

A 520W PSU? Why would anyone want to put himself in a situation where he can be PSU-wattage limited?
You always want to have some headroom, plus if you start overclocking then 520W is not gonna be enough. Also, a PSU doesn't like to run at its limits...
750W is a good standard for these cards. 650W would do the trick as well...


----------



## MEC-777

Quote:


> Originally Posted by *kizwan*
> 
> Group regulation is only found in low-budget, low-wattage PSUs. That being said, group regulation is not necessarily bad. I have read a couple of reviews of low-wattage PSUs (up to 600W) that used group regulation and still had excellent voltage regulation on all rails. So no problem there if people get good quality PSUs.
> 
> I have read a couple of times before about people's gold-rated PSUs dying. So to me the 80 Plus rating is no more than an efficiency rating.


That's why you have to research the specific units.







There are a number of 80+ rated units with group regulation that are actually decent, and there are also a number of gold-rated units that are actually not so good. I agree, group regulation isn't something you run into much in the higher-capacity units, but it's something to be aware of and to avoid, IMO.

The point I'm trying to make, and what we're both saying, is to at least make sure you're buying a good quality unit. Don't just buy something because it has the right rating or a certain branding. Research is key.









Quote:


> Originally Posted by *GoLDii3*
> 
> Whatever you say.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Why don't you watch some reviews?
> 
> I was talking about the Tri-X wich doesn't even reach 350W at peak - Yes,peak.
> 
> https://www.techpowerup.com/reviews/Sapphire/R9_290X_Tri-X_OC/22.html
> http://www.hardocp.com/article/2014/08/01/sapphire_vaporx_r9_290x_trix_oc_video_card_review/9#.VjF3uW5HyUk
> http://hexus.net/tech/reviews/graphics/70913-sapphire-radeon-r9-290x-vapor-x-4gb/?page=10
> http://www.legitreviews.com/sapphire-r9-290x-vaporx-oc-4gb-video-card-review_142216/12
> 
> 520w is plenty for a single GPU and a intel CPU - Total system draw of my PC without GPU must be 150W tops. Now,if you have AMD,got sum bad news.


Reviews don't tell the whole story, and the equipment they use is not sensitive enough to pick up on the power draw spikes from modern high-end GPUs, especially Hawaii cards. 520w is not plenty. You will be running very close to that PSU's max output, which will kill it (and/or other components in your system) sooner rather than later. Trust me, I have been there and I have had two GPUs killed by an inadequate PSU (a 7950 and a 290). It's not a road you want to go down. There are a number of us here who have owned and run 290/Xs for some time telling you 520w is not enough. Not for the long term.

Heed our warnings or do what you want. It's your system and your money.

Here's a really good article you should read in its entirety: http://www.tomshardware.com/reviews/graphics-card-power-supply-balance,3979.html


----------



## GoLDii3

Quote:


> Originally Posted by *fat4l*
> 
> A 520W PSU? Why would anyone want to put himself in a situation where he can be PSU-wattage limited?
> You always want to have some headroom, plus if you start overclocking then 520W is not gonna be enough. Also, a PSU doesn't like to run at its limits...
> 750W is a good standard for these cards. 650W would do the trick as well...


Like I said, and as confirmed by reviews, 520W is plenty for a single-GPU build with Intel. No one forces you to overclock, and I prefer to keep the extra cash instead of using it to buy something that I don't need.








Quote:


> Originally Posted by *MEC-777*
> 
> Reviews don't tell the whole story and the equipment they use is not sensitive enough to pick up on the power draw spikes from modern high-end GPUs, especially Hawaii cards. 520w is not plenty. You will be running very close to that PSUs max output which will kill it (and or other components in your system) sooner than later. Trust me, I have been there and I have had two GPUs killed by an inadequate PSU (7950 and a 290). It's not a road you want to go down. There are a number of us here who have owned and run 290/x's for some time, telling you 520w is not enough. Not for the long term.
> 
> Heed our warnings or do what you want. It's your system and your money.
> 
> Here's a really good article you should read in it's entirety: http://www.tomshardware.com/reviews/graphics-card-power-supply-balance,3979.html


Heh. Looks like you had very bad luck.









I'm sorry, but I prefer to trust reviews done by professionals rather than hearsay. "Equipment they use is not sensitive enough to pick up on the power draw spikes from modern high-end GPUs". Yeah...









I had a 7870, a 7950, and now a 280X, always on the Seasonic S12II 520W platform. Never had a problem. That's 3 years already, so I guess that "sooner or later" is more like "very, very later".









You're making quite a lot of unfounded statements.


----------



## mfknjadagr8

Quote:


> Originally Posted by *GoLDii3*
> 
> Like said and confirmed by reviews,520w is plenty for a single GPU build with Intel. No one forces you to overclock,and i prefer to keep the extra cash instead of using it to buy something that i don't need.
> 
> 
> 
> 
> 
> 
> 
> 
> Heh. Looks like you had very bad luck.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Im sorry,but i prefer to trust reviews done by professionals,rather than hearsay. "Equipment they use is not sensitive enough to pick up on the power draw spikes from modern high-end GPUs". Yeah...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I had a 7870,7950 and now a 280X. Always on the Seasonic SII12 520W platform. Never had a problem. That's 3 years already so i guess that "sooner or later" is more like "very very later"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You're making quite a lot of unfounded statements.


Ironically, you guys are fighting opposite sides with the same ammo: one side says it's never happened to me, the other says it's happened twice. Do I think 520W is enough for that setup? At its bare minimum, yes; at stock clocks and voltages it's just enough. That said, it truly depends on the specific setup. All 290s aren't created equal: some have a higher TDP out of the box due to factory overclocks. The CPU is a factor too; some lower-power-draw CPUs boost to a higher speed under load, which increases power draw, and each of these components may draw more or less power based on the voltages they run at stock. Would I run any of my rigs that close to the edge of a PSU's capabilities? Not for a minute. Saving money I'm all about, but PSUs that are run hard and close to max output are more prone to fail regardless of OEM; that's simple for anyone to understand. This varies for a ton of reasons, which is why I wouldn't recommend running on the edge of a PSU's capabilities 24/7 to anyone. You might run it 24/7 for years on your setups; then someone takes your recommendation, spends $1500 on a rig, and the PSU they buy (same make and model) doesn't hold up as well as yours, or their case airflow doesn't keep it cool enough, and kaput, there goes that savings when the PSU fails and takes things with it. There is a reason GPU manufacturers recommend higher-wattage PSUs. Some say it's a conspiracy; I say it's simply that they want to account for a wide range of setups and reduce their RMAs because someone ran on the ragged edge and lost...


----------



## MEC-777

Quote:


> Originally Posted by *GoLDii3*
> 
> Like said and confirmed by reviews,520w is plenty for a single GPU build with Intel. No one forces you to overclock,and i prefer to keep the extra cash instead of using it to buy something that i don't need.
> 
> 
> 
> 
> 
> 
> 
> 
> Heh. Looks like you had very bad luck.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Im sorry,but i prefer to trust reviews done by professionals,rather than hearsay. "Equipment they use is not sensitive enough to pick up on the power draw spikes from modern high-end GPUs". Yeah...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I had a 7870,7950 and now a 280X. Always on the Seasonic SII12 520W platform. Never had a problem. That's 3 years already so i guess that "sooner or later" is more like "very very later"
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You're making quite a lot of unfounded statements.


I'm not making a single unfounded statement. The reviews you're referring to do not tell the whole story, plain and simple. Go read that article I posted in my last reply if you don't believe me. I provided a source, you ignore it and say I'm making unfounded statements? Please.









For the GPUs you've been running in the past, yes, that PSU is "OK". They don't draw near as much power as Hawaii cards do. It's not the same ball game. Just because you "haven't had a problem yet" doesn't mean you never will in the future. I said the exact same thing about the PSU I used to use. Then it failed.

Nobody is forcing you to do anything, but the majority of us here, who've all had experience running Hawaii's are all telling you it's not a good idea to stick with your 520w unit. We're not idiots, we're telling you based on experience and knowledge. If you choose to ignore us, it's your call.

The fact remains: 520w is NOT plenty for a 290x and you will be running it at or near its limits.
Quote:


> Originally Posted by *mfknjadagr8*
> 
> Ironically, you guys are fighting opposite sides with the same ammo: one side says it's never happened to me, the other says it's happened twice. Do I think 520W is enough for that setup? At its bare minimum, yes; at stock clocks and voltages it's just enough. That said, it truly depends on the specific setup. All 290s aren't created equal: some have a higher TDP out of the box due to factory overclocks. The CPU is a factor too; some lower-power-draw CPUs boost to a higher speed under load, which increases power draw, and each of these components may draw more or less power based on the voltages they run at stock. Would I run any of my rigs that close to the edge of a PSU's capabilities? Not for a minute. Saving money I'm all about, but PSUs that are run hard and close to max output are more prone to fail regardless of OEM; that's simple for anyone to understand. This varies for a ton of reasons, which is why I wouldn't recommend running on the edge of a PSU's capabilities 24/7 to anyone. You might run it 24/7 for years on your setups; then someone takes your recommendation, spends $1500 on a rig, and the PSU they buy (same make and model) doesn't hold up as well as yours, or their case airflow doesn't keep it cool enough, and kaput, there goes that savings when the PSU fails and takes things with it. There is a reason GPU manufacturers recommend higher-wattage PSUs. Some say it's a conspiracy; I say it's simply that they want to account for a wide range of setups and reduce their RMAs because someone ran on the ragged edge and lost...


Here's the thing; I used to be in his shoes - the one saying "well it's never happened to me", then it happened and I wised up. Will it happen to everyone? No, not at all. But the risk is there and I see no reason to take that risk if you don't have to.

Agree with everything else said here.


----------



## kipperfl1012

Hi guys, I am building my first new PC in 7 years and I had some questions on which card to buy. I am not really concerned about the price difference, just performance, heat, and noise.

What would be the best card, and why?

Thanks

SAPPHIRE Radeon R9 290 100362SR
http://www.newegg.com/Product/Product.aspx?Item=N82E16814202043

XFX Black Edition Double Dissipation Radeon R9 290 R9-290A-EDBD
http://www.newegg.com/Product/Product.aspx?Item=N82E16814150701

XFX Radeon R9 290 R9-290A-EDFD
http://www.newegg.com/Product/Product.aspx?Item=N82E16814150697


----------



## MEC-777

Quote:


> Originally Posted by *kipperfl1012*
> 
> Hi guys, I am building my first new PC in 7 years and I had some questions on which card to buy. I am not really concerned about the price difference, just performance, heat, and noise.
> 
> What would be the best card, and why?
> 
> Thanks
> 
> SAPPHIRE Radeon R9 290 100362SR
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202043
> 
> XFX Black Edition Double Dissipation Radeon R9 290 R9-290A-EDBD
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150701
> 
> XFX Radeon R9 290 R9-290A-EDFD
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150697


The XFX DD Black Edition is the best of those 3 choices; it has a slight factory overclock. Both of the XFX cards have a MUCH better cooler vs the reference blower. The Sapphire is the reference blower card and will run extremely loud and hot. Unless you're planning to run water cooling, with either a custom loop or an AIO solution like the Kraken G10 or Corsair HG10 A1 paired with something like an H55, then stay away from the reference cards. Unless you don't care about the noise and heat.


----------



## Agonist

Quote:


> Originally Posted by *MEC-777*
> 
> The XFX DD Black Edition is the best of those 3 choices; it has a slight factory overclock. Both of the XFX cards have a MUCH better cooler vs the reference blower. The Sapphire is the reference blower card and will run extremely loud and hot. Unless you're planning to run water cooling, with either a custom loop or an AIO solution like the Kraken G10 or Corsair HG10 A1 paired with something like an H55, then stay away from the reference cards. Unless you don't care about the noise and heat.


I have 2 XFX Black Edition DD 290s.
Both stay around 83C load at stock clocks and are pretty quiet.
But I am gonna put 2 Cooler Master Seidon 120Ms on them.
What's wicked about the DD cards is that the bottom half of the cooler is a plate for the VRAM and VRM, so there's no need for a G10 bracket. Also, the 120M bolts directly up to the GPUs.
I'll post pictures when everything arrives.


----------



## sinnedone

Honestly, I'd research XFX a little more. They have a habit of silently revising their PCBs/cooler designs to make them cheaper, and of locking down voltage control.


----------



## Agonist

Quote:


> Originally Posted by *sinnedone*
> 
> Honestly, I'd research XFX a little more. They have a habit of silently revising their PCBs/cooler designs to make them cheaper, and of locking down voltage control.


The DD Black Editions have fully unlocked voltage.
From what I have read, the DD versions are actually reference designs as well,
but I'm not 100% sure how true that is. I don't have my Gigabyte R9 290 Windforce 3X OC anymore to compare. That one uses a 100% reference board.


----------



## GoLDii3

Quote:


> Originally Posted by *MEC-777*
> 
> I'm not making a single unfounded statement. The reviews you're referring to do not tell the whole story, plain and simple. Go read that article I posted in my last reply if you don't believe me. I provided a source, you ignore it and say I'm making unfounded statements? Please.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For the GPU's you've been running in the past, yes, that PSU is "OK" for. They don't draw near as much power as Hawaii cards do. It's not the same ball game. Just because you "haven't had a problem yet" doesn't mean you never will in the future. I said the exact same thing with the PSu I used to use. Then it failed.
> 
> Nobody is forcing you to do anything, but the majority of us here, who've all had experience running Hawaii's are all telling you it's not a good idea to stick with your 520w unit. We're not idiots, we're telling you based on experience and knowledge. If you choose to ignore us, it's your call.
> 
> The fact remains; 520w is NOT plenty for a 290x and you will be running it on or near it's limits.


Please enlighten me on what "the whole story" is. The R9 290X is a VGA for gaming. Those reviews got their results by gaming. What else could be relevant?

I'd like to know, according to you, what PSU the average enthusiast build with Intel and an R9 290X needs.

Most i5s barely go over 150W total power consumption at stock. You have 300-350W free, which is normally what an 8+6-pin GPU consumes.

And just because one specific card has junk power management doesn't mean all 290Xs peak at 400W sometimes.

You keep saying 520W is not enough, yet provide no proof. "Experience and knowledge" is not proof.







And even if it was, review sites would still have far more credibility.

By the way... don't you think no OEM would put an 8+6 pin on a VGA if that wasn't enough to power it? Yet plenty of 550W PSUs have those connectors...









Also, I'd take a look at the 280X's power consumption... yep... totally NOT like the 290/290X:









http://www.anandtech.com/show/7406/the-sapphire-r9-280x-toxic-review/4
Quote:


> Originally Posted by *kipperfl1012*
> 
> Hi guys, I am building my first new PC in 7 years and I had some questions on which card to buy. I am not really concerned about the price difference, just performance, heat, and noise.
> 
> What would be the best card, and why?
> 
> Thanks
> 
> SAPPHIRE Radeon R9 290 100362SR
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202043
> 
> XFX Black Edition Double Dissipation Radeon R9 290 R9-290A-EDBD
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150701
> 
> XFX Radeon R9 290 R9-290A-EDFD
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814150697


The XFX, because it's not a reference model. Why? Because the reference models are crap that run hot and throttle like no tomorrow.


----------



## fyzzz

The XFX Radeon R9 290 R9-290A-EDFD is a reference PCB. I bought one back in May and it was simply amazing.


----------



## mfknjadagr8

Quote:


> Originally Posted by *fyzzz*
> 
> The XFX Radeon R9 290 R9-290A-EDFD is reference pcb. I bought one back in may and it was simply amazing.


The card he's looking at has the reference cooler, and that's a big issue unless you plan to waterblock it anyway...
@GoLDii3, saying "enthusiast" and "i5" doesn't really fit these days... and arguing that they wouldn't put a connector on a PSU that couldn't handle it isn't a great argument, because even garbage OEM PSUs have these connectors. In fact, I had a 300-watt Ultra power supply years ago and it had two 6-pins, back when even the 500s and 650s had only just started using two... the Yugo came with racing stripes back in the day, just saying.


----------



## Gumbi

An Intel CPU (anything Ivy Bridge or later overclocked up to 1.3V, excluding 6-core-plus chips) plus a 290X is completely fine if you have a good 520W unit.

All reviews show systems consuming 400-450 watts under a gaming load, and a tad over 500 in worst-case scenarios when overclocking.

520 watts is fine for mild overclocking of a 290X and an Intel chip. It's cutting it close, but is definitely OK if the PSU is decent.
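One wrinkle worth noting with numbers like those: reviews typically measure total system draw at the wall (AC), while a PSU's rating describes the DC power it can deliver, so the load actually on the unit is lower than the wall figure. A rough sketch in Python, using assumed (not measured) numbers:

```python
# Wall (AC) draw vs. actual DC load on the PSU.
# All numbers are illustrative assumptions, not measurements.
wall_draw_w = 450    # system draw reported at the wall by a review
efficiency = 0.87    # assumed PSU efficiency at this load (roughly Gold territory)

dc_load_w = wall_draw_w * efficiency  # what the PSU actually has to deliver
print(f"{wall_draw_w} W at the wall is about {dc_load_w:.0f} W of DC load")
# -> 450 W at the wall is about 392 W of DC load
```

So a "450 W at the wall" review figure puts meaningfully less than 450 W of actual load on the unit itself.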


----------



## fyzzz

Quote:


> Originally Posted by *mfknjadagr8*
> 
> The card he's looking at has the reference cooler, and that's a big issue unless you plan to waterblock it anyway...
> @GoLDii3, saying "enthusiast" and "i5" doesn't really fit these days... and arguing that they wouldn't put a connector on a PSU that couldn't handle it isn't a great argument, because even garbage OEM PSUs have these connectors. In fact, I had a 300-watt Ultra power supply years ago and it had two 6-pins, back when even the 500s and 650s had only just started using two... the Yugo came with racing stripes back in the day, just saying.


The point I was trying to make is that I bought the R9-290A-EDFD and the cooler was quite good; it held the card just a bit over 70C if I remember right, and was pretty quiet at 100%. BUT the card I got was something special: it had Hynix BFR memory instead of the normal AFR memory, and it performed better than most 290s.


----------



## kipperfl1012

OK, bear with me guys, I know very little about GPUs since my last system build was 7 years ago.

I looked up what a reference card was, and I realize that is not what I want. And yes, I do plan to water-cool the GPU, but not right away, since I'm already $2400 out of pocket for this system.

I guess this is the card I have come to after reading your comments and looking at some benchmarks. What do you think?

SAPPHIRE Radeon R9 290X 100361-4L 4GB 512-Bit GDDR5 PCI Express 3.0 Tri-X OC(UEFI) Video Card, New Edition
http://www.newegg.com/Product/Product.aspx?Item=N82E16814202145


----------



## MEC-777

Quote:


> Originally Posted by *GoLDii3*
> 
> Please enlighten me on what "the whole story" is. The R9 290X is a VGA for gaming. Those reviews got their results by gaming - What else could be relevant?
> 
> I'd like to know according to you- What PSU the average enthusiast build with Intel and a R9 290X needs.
> 
> Most i5's barely go over 150W total power consumption at stock - You have 300-350W free wich is normally what a 8+6 Pin GPU consumes.
> 
> And just because a specific card has junk power management it doesn't mean all 290X peak to 400W sometimes.
> 
> You keep saying 520W is not enough,yet provide no proof. "Experience and knowledge" is not proof.
> 
> 
> 
> 
> 
> 
> 
> And even if it was,review sites would have by far more credibility.
> 
> By the way...don't you think no OEM would put 8+6 pin to power a VGA if it wasn't enough...yet plenty of 550W PSU's have them...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also i'd take a look at the 280X's power consumption..yep..totally NOT like the 290/290X


Please go read that article I posted and then come back and we'll talk.

Just because a card has an 8 and 6 pin connector doesn't mean it will only ever draw the rated power through it. Look at the 295x2, dual 8 pins on a card with two 290X GPUs. You can do the math. I'm not saying your 290x will do the same, but it can and will (at times) pull more than what those connectors are rated for. It should not be taken as gospel that that is all the power it will ever draw - "because connectors". It doesn't work that way.

There are plenty of lower power cards that use 8 and 6 pin connectors which are totally safe to run on your 520w PSU, that doesn't mean you can run any GPU, regardless, just because it has the same connectors. Again, it's not that simple.

Another thing about all these "reviews" you keep holding up as gospel: how long do these reviewers actually use and test these cards for? A day? A few hours? That's really about it. Just long enough to gather some numbers, slap them on a graph, and show you where they stack up relative to the other cards on the market. The power consumption numbers are what they are, and again, that equipment does not capture the true power draw characteristics of these cards. You'll also notice they don't use 520W PSUs for testing; they typically use units in the 750W+ range because they don't want anything to interfere with the performance of the components in their test system.

280X - 250W TDP
290X - 290W TDP
On a lower-capacity PSU like the one you're using, that 40W difference can be the difference between sitting close to the limit and not.
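To put the transient-spike concern being argued here into numbers (all figures are rough estimates for illustration, not measurements of any particular card or system):

```python
# Sustained vs. momentary load against a 520 W unit.
# Illustrative estimates only, not measurements.
PSU_CAPACITY_W = 520   # total rated output of the unit in question
SYSTEM_BASE_W = 150    # CPU, board, drives, fans (rough Intel-platform estimate)
GPU_SUSTAINED_W = 290  # a 290X board-power-class figure
GPU_SPIKE_W = 400      # the momentary spike claimed for Hawaii cards

sustained = SYSTEM_BASE_W + GPU_SUSTAINED_W
spike = SYSTEM_BASE_W + GPU_SPIKE_W
print(f"sustained: {sustained} W ({sustained / PSU_CAPACITY_W:.0%} of rating)")
print(f"spike:     {spike} W ({spike / PSU_CAPACITY_W:.0%} of rating)")
# -> sustained: 440 W (85% of rating)
# -> spike:     550 W (106% of rating)
```

On those assumed figures the average load fits, but a transient can momentarily exceed the rating, which is exactly the disagreement in this thread.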

The 290X doesn't have junk power management. Again, please go read that article. I'm not discussing this further with you until you do so. I don't have time for people who refuse to listen to reason and tell me I'm spouting BS. If you want to use your 520w unit, go right ahead. A number of us here have warned you.
Quote:


> Originally Posted by *Gumbi*
> 
> An Intel CPU (anything Ivy Bridge or later overclocked up to 1.3V, excluding 6-core-plus chips) plus a 290X is completely fine if you have a good 520W unit.
> 
> All reviews show systems consuming 400-450 watts under a gaming load, and a tad over 500 in worst-case scenarios when overclocking.
> 
> 520 watts is fine for mild overclocking of a 290X and an Intel chip. It's cutting it close, but is definitely OK if the PSU is decent.


When it comes to PSUs, "cutting it close" is not "fine". The PSU is the most important part of a PC. I just don't understand the mentality behind using something that is barely enough and putting the rest of the system/components at risk. You're all certainly free to use whatever parts you wish, but I will hold to the position that calling this "OK" is not OK.


----------



## MEC-777

Quote:


> Originally Posted by *kipperfl1012*
> 
> OK, bear with me guys, I know very little about GPUs since my last system build was 7 years ago.
> 
> I looked up what a reference card was, and I realize that is not what I want. And yes, I do plan to water-cool the GPU, but not right away, since I'm already $2400 out of pocket for this system.
> 
> I guess this is the card I have come to after reading your comments and looking at some benchmarks. What do you think?
> 
> SAPPHIRE Radeon R9 290X 100361-4L 4GB 512-Bit GDDR5 PCI Express 3.0 Tri-X OC(UEFI) Video Card, New Edition
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202145


Excellent choice. One of the best 290X's on the market.


----------



## kipperfl1012

Great thank you !


----------



## GoLDii3

Quote:


> Originally Posted by *MEC-777*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Please go read that article I posted and then come back and we'll talk.
> 
> Just because a card has an 8 and 6 pin connector doesn't mean it will only ever draw the rated power through it. Look at the 295x2, dual 8 pins on a card with two 290X GPUs. You can do the math. I'm not saying your 290x will do the same, but it can and will (at times) pull more than what those connectors are rated for. It should not be taken as gospel that that is all the power it will ever draw - "because connectors". It doesn't work that way.
> 
> There are plenty of lower power cards that use 8 and 6 pin connectors which are totally safe to run on your 520w PSU, that doesn't mean you can run any GPU, regardless, just because it has the same connectors. Again, it's not that simple.
> 
> Another thing about all these "reviews" you keep holding up as pure spoken truth; How long do these reviewer actually use and test these cards for? A day? A few hours? That's really about it. Just long enough to gather some number, slap them on a graph and show you where they stack up relative to the other cards on the market. The power consumption numbers are what they are, and again, that equipment is not inclusive of the true power draw characteristics of these cards. You'll also notice they don't use 520w PSUs for testing, they typically use units in the 750w+ range because they don't want anything to interfere with the performance of the components in their test system.
> 
> 280X - 250w TDP
> 290X - 290w TDP
> On a lower capacity PSU like the one you're using, that 40w difference can be the difference between being close to or on the limit and not.
> 
> The 290X doesn't have junk power management. Again, please go read that article. I'm not discussing this further with you until you do so. I don't have time for people who refuse to listen to reason and tell me I'm spouting BS. If you want to use your 520w unit, go right ahead. A number of us here have warned you.


Already did; it didn't show anything interesting.

Makes me wonder how you can pull off 290 CFX on an 850W power supply if, according to your own logic, those cards can pull up to 400W each. Since you have two, that would be 800W. And the rest of the system? Must be magic.









It's not about a "lower capacity PSU" like you said. It's about math.

PC with i5/i7 at stock, without GPU -> max 150W

150 + 290 (your supposed 290X TDP) = 440W

Seasonic S12II 520W +12V rail rating: 480W

It doesn't matter if it's close to the limit. The 12V rail is rated to output 480W, and it will output that without a single problem. It doesn't? Then it's a junk PSU.
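That budget arithmetic can be sketched out directly (using the same estimated figures as the post above; the 290W number is treated as sustained board power, and none of these are measured values):

```python
# 12 V rail budget for the setup described above. Estimates, not measurements.
RAIL_12V_W = 480    # combined +12 V rating of the 520 W unit
PLATFORM_W = 150    # stock i5/i7 system without the GPU
GPU_W = 290         # the 290X figure used above

load = PLATFORM_W + GPU_W
headroom = RAIL_12V_W - load
print(f"{load} W of {RAIL_12V_W} W -> {headroom} W headroom ({load / RAIL_12V_W:.0%} load)")
# -> 440 W of 480 W -> 40 W headroom (92% load)
```

Whether ~40 W of headroom on the 12V rail is acceptable is precisely what the two sides of this thread disagree about.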


----------



## Roboyto

Quote:


> Originally Posted by *MEC-777*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Please go read that article I posted and then come back and we'll talk.
> 
> Just because a card has an 8 and 6 pin connector doesn't mean it will only ever draw the rated power through it. Look at the 295x2, dual 8 pins on a card with two 290X GPUs. You can do the math. I'm not saying your 290x will do the same, but it can and will (at times) pull more than what those connectors are rated for. It should not be taken as gospel that that is all the power it will ever draw - "because connectors". It doesn't work that way.
> 
> There are plenty of lower power cards that use 8 and 6 pin connectors which are totally safe to run on your 520w PSU, that doesn't mean you can run any GPU, regardless, just because it has the same connectors. Again, it's not that simple.
> 
> Another thing about all these "reviews" you keep holding up as pure spoken truth; How long do these reviewer actually use and test these cards for? A day? A few hours? That's really about it. Just long enough to gather some number, slap them on a graph and show you where they stack up relative to the other cards on the market. The power consumption numbers are what they are, and again, that equipment is not inclusive of the true power draw characteristics of these cards. You'll also notice they don't use 520w PSUs for testing, they typically use units in the 750w+ range because they don't want anything to interfere with the performance of the components in their test system.
> 
> 
> 
> 280X - 250w *TDP*
> 290X - 290w *TDP*
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> On a lower capacity PSU like the one you're using, that 40w difference can be the difference between being close to or on the limit and not.
> 
> The 290X doesn't have junk power management. Again, please go read that article. I'm not discussing this further with you until you do so. I don't have time for people who refuse to listen to reason and tell me I'm spouting BS. If you want to use your 520w unit, go right ahead. A number of us here have warned you.
> When it comes to PSUs, "cutting it close" is not "fine". The PSU is the most important part of a PC. I just don't understand the mentality behind using something that is barely enough and putting the rest of the system/components at risk. You're all certainly free to use whatever parts you wish, but I will hold to the position that calling this "OK" is not OK.


TDP is not the same as power consumption. It is the amount of heat that needs to be dissipated under normal operating conditions.

Just want to be certain you are aware of this.


----------



## Gumbi

Quote:


> Originally Posted by *MEC-777*
> 
> Please go read that article I posted and then come back and we'll talk.
> 
> Just because a card has an 8 and 6 pin connector doesn't mean it will only ever draw the rated power through it. Look at the 295x2, dual 8 pins on a card with two 290X GPUs. You can do the math. I'm not saying your 290x will do the same, but it can and will (at times) pull more than what those connectors are rated for. It should not be taken as gospel that that is all the power it will ever draw - "because connectors". It doesn't work that way.
> 
> There are plenty of lower power cards that use 8 and 6 pin connectors which are totally safe to run on your 520w PSU, that doesn't mean you can run any GPU, regardless, just because it has the same connectors. Again, it's not that simple.
> 
> Another thing about all these "reviews" you keep holding up as pure spoken truth; How long do these reviewer actually use and test these cards for? A day? A few hours? That's really about it. Just long enough to gather some number, slap them on a graph and show you where they stack up relative to the other cards on the market. The power consumption numbers are what they are, and again, that equipment is not inclusive of the true power draw characteristics of these cards. You'll also notice they don't use 520w PSUs for testing, they typically use units in the 750w+ range because they don't want anything to interfere with the performance of the components in their test system.
> 
> 280X - 250w TDP
> 290X - 290w TDP
> On a lower capacity PSU like the one you're using, that 40w difference can be the difference between being close to or on the limit and not.
> 
> The 290X doesn't have junk power management. Again, please go read that article. I'm not discussing this further with you until you do so. I don't have time for people who refuse to listen to reason and tell me I'm spouting BS. If you want to use your 520w unit, go right ahead. A number of us here have warned you.
> When it comes to PSUs, "cutting it close" is not "fine". The PSU is the most important part of a PC. I just don't understand the mentality behind using something that is barely enough and putting the rest of the system/components at risk. You're all certainly free to use whatever parts you wish, but I will hold to the fact that saying this is "ok" is not ok.


It's not barely enough though. When I say "cutting it close", I don't mean it's barely enough. For God's sake, here is a 290x vapor x system drawing 400 watts under load... Add in an overclock and you're still OK.

http://www.tweaktown.com/reviews/6818/sapphire-radeon-r9-290x-8gb-vapor-x-oc-video-card-review/index16.html


----------



## mfknjadagr8

Quote:


> Originally Posted by *Gumbi*
> 
> It's not barely enough though. When I say "cutting it close", I don't mean it's barely enough. For God's sake, here is a 290x vapor x system drawing 400 watts under load... Add in an overclock and you're still OK.
> 
> http://www.tweaktown.com/reviews/6818/sapphire-radeon-r9-290x-8gb-vapor-x-oc-video-card-review/index16.html


funny thing being they recommended using a quality 600 watt unit for a 400 watt draw...wonder why?


----------



## By-Tor

I have a watt meter that my tower is plugged into, so I can see what I'm pulling at a glance. I'm using an XFX 850w Black Edition PSU.

Both of my 290x's are OCed to 1150/1500, with the 4790k at 4.7GHz.

These things are also running, but not drawing a ton of power:

12 - 120mm rad. fans
4 - 140mm case fans
Swiftech MCP-355 pump
LED lighting
Cold cathode lighting
2 - SSD's
1 - HDD

When I ran FurMark awhile back, the rig was pulling just over 1000 watts without issue.

This is in firestrike with both cards at 99%


BF4 with both cards at 99%


----------



## Gumbi

Quote:


> Originally Posted by *mfknjadagr8*
> 
> funny thing being they recommended using a 600 watt quality unit for a 400watt draw...wonder why?


They're taking into account the trash PSUs some people put in their systems.


----------



## battleaxe

Quote:


> Originally Posted by *kipperfl1012*
> 
> Ok bear with me guys, I know very little about GPUs since my last system build was 7 years ago.
> 
> I looked up what a reference card was and I realize that is not what I want. And yes i do plan to water cool the GPU but not right away since I already have $2400 out of my pocket right now for this system.
> 
> I guess this is the card I have come to after reading your comments and looking at some benchmarks. What do you think ?
> 
> SAPPHIRE Radeon R9 290X 100361-4L 4GB 512-Bit GDDR5 PCI Express 3.0 Tri-X OC(UEFI) Video Card, New Edition
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202145


I have that exact card. I've gotten mine up to 1275 on the core at +200mv. It's a beast. I wouldn't count on getting one like mine, I think it's rare. Still, great cards. I'm tempted to buy another one.


----------



## mus1mus

This PSU thing has taken a lot of replies already.

I don't wanna mention any names but it seems that the confusion is growing.

I don't understand why some would not consider spending a little extra to get a little headroom on PSU power, knowing that everything in the system connects to the power supply. If that thing goes bad from overloading, chances are it will take components with it. Does this even make sense?

Also, people are putting too much emphasis on spending a premium on high quality units rather than on power ratings, even basing calculations on what reviewers have shown. Like, a certain reviewer tested a unit past its rated output, i.e. 1000W on an 850W unit.

Some people take this and buy an 850W unit when their calculated load would be around 1000W -- or even more.

That doesn't sound like healthy thinking to me. But then, it'll take a hard lesson to learn on something that is generally accepted as truth, or even myth, sometimes.

Good luck.


----------



## kipperfl1012

I figure my system is going to run around 650-700 watts but to give myself some headroom and room for upgrading I went with a 1000 watt psu. Just makes sense to me.


----------



## Rob27shred

The PSU is the most important part of a PC, period. Like many have already said, if your PSU blows due to overloading, pretty much expect it to take some things with it.


----------



## mus1mus

Though, contrary to my previous statement, PSU prices per watt are not linear. So maybe some people are not up to spending $300-400 on a PSU. They'd rather spend it on a GPU.

Personally, I don't want to trust a sub-$200 PSU to handle my 290s, my CPU, and my motherboard.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Gumbi*
> 
> They're taking into account the trash PSUs some people put in their systems.


Nah, it says a quality 600 watt unit is their recommendation...meaning non-trash.







but I support people's decision to risk their hardware...if they gamble and win, that's their gain; if they lose, I'd even help them troubleshoot and see what they've murdered


----------



## mus1mus

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Nah, it says a quality 600 watt unit is their recommendation...meaning non-trash.
> 
> 
> 
> 
> 
> 
> 
> but I support people's decision to risk their hardware...if they gamble and win, that's their gain; if they lose, I'd even help them troubleshoot and see what they've murdered


That's reassuring.

BTW bud, I think I read somewhere you mentioned your 290 is crap at mem OC.

Giga is it?


----------



## kizwan

Quote:


> Originally Posted by *kipperfl1012*
> 
> Ok bear with me guys, I know very little about GPUs since my last system build was 7 years ago.
> 
> I looked up what a reference card was and I realize that is not what I want. And yes i do plan to water cool the GPU but not right away since I already have $2400 out of my pocket right now for this system.
> 
> I guess this is the card I have come to after reading your comments and looking at some benchmarks. What do you think ?
> 
> SAPPHIRE Radeon R9 290X 100361-4L 4GB 512-Bit GDDR5 PCI Express 3.0 Tri-X OC(UEFI) Video Card, New Edition
> http://www.newegg.com/Product/Product.aspx?Item=N82E16814202145


If you're planning on watercooling, then that card is not for you. There's no full cover water block available for it.

My cards are reference cards. Before putting them under water, I played BF4 for hours without the cards throttling, and with two cards in Crossfire. There are a couple of things you need to know when using reference cards with reference coolers: 1) don't overclock, 2) use custom fan profiles & 3) don't mind the noise. If anyone is having throttling problems, they're not doing it right, which basically means user error. My ambient is pretty much above 32C most of the time.
Quote:


> Originally Posted by *Roboyto*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MEC-777*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Please go read that article I posted and then come back and we'll talk.
> 
> Just because a card has an 8 and 6 pin connector doesn't mean it will only ever draw the rated power through it. Look at the 295x2, dual 8 pins on a card with two 290X GPUs. You can do the math. I'm not saying your 290x will do the same, but it can and will (at times) pull more than what those connectors are rated for. It should not be taken as gospel that that is all the power it will ever draw - "because connectors". It doesn't work that way.
> 
> There are plenty of lower power cards that use 8 and 6 pin connectors which are totally safe to run on your 520w PSU, that doesn't mean you can run any GPU, regardless, just because it has the same connectors. Again, it's not that simple.
> 
> Another thing about all these "reviews" you keep holding up as pure spoken truth; How long do these reviewers actually use and test these cards for? A day? A few hours? That's really about it. Just long enough to gather some numbers, slap them on a graph and show you where they stack up relative to the other cards on the market. The power consumption numbers are what they are, and again, that equipment is not inclusive of the true power draw characteristics of these cards. You'll also notice they don't use 520w PSUs for testing, they typically use units in the 750w+ range because they don't want anything to interfere with the performance of the components in their test system.
> 
> 
> 
> 280X - 250w _*TDP*_
> 
> 290X - 290w _*TDP*_
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> On a lower capacity PSU like the one you're using, that 40w difference can be the difference between being close to or on the limit and not.
> 
> The 290X doesn't have junk power management. Again, please go read that article. I'm not discussing this further with you until you do so. I don't have time for people who refuse to listen to reason and tell me I'm spouting BS. If you want to use your 520w unit, go right ahead. A number of us here have warned you.
> 
> When it comes to PSUs, "cutting it close" is not "fine". The PSU is the most important part of a PC. I just don't understand the mentality behind using something that is barely enough and putting the rest of the system/components at risk. You're all certainly free to use whatever parts you wish, but I will hold to the fact that saying this is "ok" is not ok.
> 
> 
> 
> 
> 
> 
> TDP is not the same as power consumption. It is the amount of heat that needs to be dissipated under normal operating conditions.
> 
> Just want to be certain you are aware of this.

That is correct, but if the card is outputting, let's say, 250W of heat, how much power do you think the card is pulling? The answer is: more. You can actually use the TDP value, especially with a graphics card, as a reference for how high a wattage PSU you should get.
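The sizing logic kizwan describes (treat the GPU's TDP as a stand-in for its draw, then size the PSU up from there) can be sketched roughly. The 1.25 headroom factor is an illustrative assumption, not a figure from this thread; the 150W "rest of system" number comes from GoLDii3's earlier estimate for a stock i5/i7 system.

```python
# Rough sketch of TDP-based PSU sizing, as discussed above.
# Assumptions: 150 W for CPU + rest of system (GoLDii3's estimate),
# 1.25x headroom (illustrative, not a measured requirement).

def recommended_psu_watts(gpu_tdp_w: float, rest_of_system_w: float = 150,
                          headroom: float = 1.25) -> float:
    """Estimated PSU rating: (GPU TDP + everything else) plus headroom."""
    return (gpu_tdp_w + rest_of_system_w) * headroom

print(recommended_psu_watts(290))  # 290X (290 W TDP): (290 + 150) * 1.25 = 550.0
print(recommended_psu_watts(250))  # 280X (250 W TDP): (250 + 150) * 1.25 = 500.0
```

Which lands right in the 550-650W range several posters recommend for a single-card setup.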


----------



## serave

You wouldn't need more than a 600/650W unit for a single GPU setup anyway.

I've been running Valley and lots of games (Crysis 3, GTA V, SRIV) etc. on my 290 with a 500W PSU with no problems.

Just don't go full ****** with overclocking/volting with it and you should probably be okay.

Maybe it's because I'm only using an i3, but I plan to go with a 650W unit in the future, as I plan to upgrade my whole system in less than a year or so.


----------



## By-Tor

When I was running an AMD 8350 @ 5GHz+ and a pair of 7950's last year, I had my system shut down during a couple of benchmarks for drawing too much power. I have no idea what wattage it was drawing at the time, due to not having a watt meter then.

I was thinking about going with a 1000-1200 watt PSU awhile back and was advised that my 850 watt was good quality and there was no need to go with that high of a wattage for my setup, and since going Intel I have not had any shutdowns due to drawing too much power from my PSU. If I were to add a 3rd 290x, sure, I would grab a higher rated PSU, but things run great with what I have.

Head over to the components section under power supplies and read some of Shilka's threads on PSUs.


----------



## megax05

I am no expert in such things, but my dual R9 290s, OCed to 1100/1450 24/7, with a non-OC Haswell i7, 1 HDD, 1 SSD and 6 120mm AF fans, are running fine on a sub-$100 730w PSU.


----------



## MEC-777

A lot of posts for me to respond to. Will try to keep it brief...
Quote:


> Originally Posted by *GoLDii3*
> 
> Already did, didn't show anything interesting.
> 
> Makes me wonder how you can pull off 290 CFX with an 850W power supply if, according to your own logic, those cards can pull up to 400W. And since you have two, that would be 800W. And the rest of the system? Must be magic.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It's not about "lower capacity PSU" like you said. It's about math.
> 
> PC with i5/i7 at stock without GPU -> Max 150W
> 
> 150 + 290 (Your supposed 290X TDP) = 440W
> 
> Seasonic S12II +12V rail wattage: 480W
> 
> It doesn't matter if it's close to the limit. The 12V is rated to output 480W; it will output that without a single problem. It doesn't? Then it's a junk PSU.


Didn't show anything interesting? Ok, then.









I did my research when I chose my PSU. I don't OC my CPU, and I run my 290's stock because they run all my games fast enough without OCing. In that regard, I've got decent overhead. If I were OCing my CPU and GPUs daily, I would have gone with a 1000w unit, just for peace of mind, to have the extra overhead and know my PSU should last a long time since it's not being run on the ragged edge all the time.

Like I said before, I never said your 520w unit wouldn't work. I said it's not ideal and I would strongly recommend upgrading to something beefier for reliability and longevity's sake. But do what you want. It's your system.

Quote:


> Originally Posted by *Roboyto*
> 
> TDP is not the same as power consumption. It is the amount the heat that needs to be displaced under normal operating conditions.
> 
> Just want to be certain you are aware of this.


Yes, I am aware. It's still a good value to use to estimate the actual power draw of a GPU - which quite often can actually be higher than the TDP rating.








Quote:


> Originally Posted by *Gumbi*
> 
> It's not barely enough though. When I say "cutting it close", I don't mean it's barely enough. For God's sake, here is a 290x vapor x system drawing 400 watts under load... Add in an overclock and you're still OK.
> 
> http://www.tweaktown.com/reviews/6818/sapphire-radeon-r9-290x-8gb-vapor-x-oc-video-card-review/index16.html


I guess it depends on how you define "cutting it close". I don't consider "cutting it close" (running a PSU at 90%+ load) to be "OK". Not for daily use. Just personal preference.

Quote:


> Originally Posted by *mus1mus*
> 
> This PSU thing has taken a lot of replies already.
> 
> I don't wanna mention any names but it seems that the confusion is growing.
> 
> I don't understand why some would not consider spending a little extra to get a little headroom on PSU power knowing that everything in the system connects to a power supply. If that thing goes bad for overloading, chances are, it will take components with itself. Does this even make sense?
> 
> Also, people are putting too much emphasis on spending a premium on high quality units more than power ratings. Even basing calculations to what reviewers shown. Like, a certain reviewer tested a unit past it's rated output. i.e. 1000W on an 850W unit.
> 
> Some people take this and buy an 850W unit when their calculated load would be around a 1000W --even more.
> 
> Doesn't sound so healthy of a thinking for me. But then, it'll take a hard lesson to learn on something that is generally accepted truth or even myth sometimes.
> 
> Good luck.


Well said. Just to comment on one thing: power ratings and quality are both important. I don't recommend people buy crazy overpowered PSUs for their systems, or only the best top-quality platinum units, but rather find a balance where power and quality relative to price is reasonable for what you need.

Quote:


> Originally Posted by *kizwan*
> 
> That is correct but if the card is outputting let say 250W of heat, how much power do you think the card is pulling? The answer is more. You can actually use TDP value especially with graphic card as a reference for how high (wattage) PSU you should get.


Yep.


----------



## MEC-777

Quote:


> Originally Posted by *megax05*
> 
> I am no expert in such things but my dual r9 290s OC to 1100/1450 24/7 with a non OC i7 haswell 1 HDD 1 SSD 6 120 AF fans are running fine on a sub 100$ 730w PSU.


What "works" and what is ideal are two very different things. I would say you're treading on thin ice. As mentioned before, I used to be one of those people saying "it works fine", until one day it didn't. Just sayin'.


----------



## Gumbi

@ MEC 777.

400 watts (actually 397) is nowhere near 90% load on a 520 watt PSU. It's less than 80%. And it's not like you'll be gaming 24/7.

Like I said, if it's a quality PSU (like the aforementioned Seasonic), I don't see the problem.

I personally recommend 550 - 650 watt PSUs for 390/390X systems depending on whether it's an Intel CPU and/or how heavy the overclocking (especially on the GPU, overvolting can add quite a bit more wattage) will be.
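For what it's worth, the load percentages being argued here are simple arithmetic, with one wrinkle: a wall meter reads AC input, while the PSU's rating is DC output, so the real load on the unit is lower than the wall figure by the efficiency factor. The 0.87 efficiency below is an assumed ballpark for an 80+ Bronze unit, not a number from the review.

```python
# Load-percentage check for the 397 W wall draw / 520 W PSU case above.
# Assumption: ~87% efficiency (typical 80+ Bronze ballpark, not measured).

def dc_load_percent(wall_draw_w: float, rated_w: float, efficiency: float = 0.87) -> float:
    """Approximate DC-side load as a percentage of the PSU's rating."""
    return 100.0 * (wall_draw_w * efficiency) / rated_w

print(round(397 / 520 * 100, 1))            # 76.3 -> naive AC-side figure, under 80%
print(round(dc_load_percent(397, 520), 1))  # 66.4 -> DC-side load is lower still
```

So by either measure the quoted system sits below the 80% mark Gumbi mentions.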


----------



## rdr09

Quote:


> Originally Posted by *Gumbi*
> 
> @ MEC 777.
> 
> 400 watts (actually 397) is nowhere near 90% load on a 520 watt PSU. It's less than 80%. And it's not like you'll be gaming 24/7.
> 
> Like I said, if it's a quality PSU (like the aforementioned Seasonic), I don't see the problem.
> 
> I personally recommend 550 - 650 watt PSUs for 390/390X systems depending on whether it's an Intel CPU and/or how heavy the overclocking (especially on the GPU, overvolting can add quite a bit more wattage) will be.


Do you think the amperage on the 12V rail needs to be considered? It could be a lowly 500W PSU, but if the 12V rail is putting out 41A or higher, then it would be good. I think there are some PSUs that are 600W and even higher that put out fewer amps on the 12V.

I had a 700W with my first 290, so I was not shy about adding +200 VDDC for 1300/1600 benching. Over a year ago . . .

http://www.3dmark.com/3dm/3154307?
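rdr09's point about the +12V rail can be made concrete: watts available on the rail are just volts times amps, and the label wattage of a unit also counts the minor rails. The amperage figures below are hypothetical examples, not ratings of any specific unit.

```python
# +12V rail capacity = volts x amps. Two units with similar label wattage can
# differ a lot where the GPU actually draws. Amperages here are hypothetical.

def rail_watts(amps: float, volts: float = 12.0) -> float:
    """Power deliverable on a single rail at the given voltage."""
    return volts * amps

print(rail_watts(41))  # a "500 W" unit with 41 A on +12V: 492.0 W for CPU/GPU
print(rail_watts(34))  # a "600 W" unit with only 34 A on +12V: just 408.0 W
```

Which is why checking the +12V spec on the label matters more than the headline number for a power-hungry card like the 290/290X.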


----------



## MEC-777

Quote:


> Originally Posted by *Gumbi*
> 
> @ MEC 777.
> 
> 400 watts (actually 397) is nowhere near 90% load on a 520 watt PSU. It's less than 80%. And it's not like you'll be gaming 24/7.
> 
> Like I said, if it's a quality PSU (like the aforementioned Seasonic), I don't see the problem.
> 
> I personally recommend 550 - 650 watt PSUs for 390/390X systems depending on whether it's an Intel CPU and/or how heavy the overclocking (especially on the GPU, overvolting can add quite a bit more wattage) will be.


@GoLDii3's system could very well pull more than 400w at high load while gaming/benchmarking. Not every system is the same and there are other factors at work to consider.

I'm done with this PSU discussion. I will continue to make recommendations to others, but it's up to the individual to do what they want beyond that. I'm just trying to be helpful and save people time and money, ultimately. Take my advice or leave it (speaking more to @GoLDii3 here), I don't care. But don't put down what I'm trying to tell you just because you don't agree and claim I don't know what I'm talking about. Do what you want. If it works out for you, then great. If not, don't say I (and others) didn't warn you.


----------



## mfknjadagr8

Quote:


> Originally Posted by *mus1mus*
> 
> That's reassuring.
> 
> BTW bud, I think I read somewhere you mentioned your 290 is crap at mem OC.
> 
> Giga is it?


Note it's XFX reference...with Elpida memory...this is the bottom of my two cards. The top card easily handles 1200/1400 with +50 on core, but the bottom card needs aux voltage and a bump to +100 on core or it won't work at all in crossfire at those speeds, so I've been running at 1100/1300 with +25 core...power limit at max.


----------



## Mega Man

Quote:


> Originally Posted by *Gumbi*
> 
> @ MEC 777.
> 
> 400 watts (actually 397) is nowhere near 90% load on a 520 watt PSU. It's less than 80%. And it's not like you'll be gaming 24/7.
> 
> Like I said, if it's a quality PSU (like the aforementioned Seasonic), I don't see the problem.
> 
> I personally recommend 550 - 650 watt PSUs for 390/390X systems depending on whether it's an Intel CPU and/or how heavy the overclocking (especially on the GPU, overvolting can add quite a bit more wattage) will be.


I'll just leave this here http://www.overclock.net/t/928113/a-message-to-the-community-on-enthusiast-power-supplies


----------



## GoLDii3

Quote:


> Originally Posted by *Mega Man*
> 
> I'll just leave this here http://www.overclock.net/t/928113/a-message-to-the-community-on-enthusiast-power-supplies


Exactly. It's why "OH NOES MY PSU IS NEAR ITS LIMITS" is just being dramatic.

You buy a PSU capable of "X" wattage, you can expect it to output it perfectly without any consequence, because, you know, it's been rated to work within those specifications. It doesn't work? Then the PSU is junk. Easy.


----------



## kizwan

Quote:


> Originally Posted by *Mega Man*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Gumbi*
> 
> @ MEC 777.
> 
> 400 watts (actually 397) is nowhere near 90% load on a 520 watt PSU. It's less than 80%. And it's not like you'll be gaming 24/7.
> 
> Like I said, if it's a quality PSU (like the aforementioned Seasonic), I don't see the problem.
> 
> I personally recommend 550 - 650 watt PSUs for 390/390X systems depending on whether it's an Intel CPU and/or how heavy the overclocking (especially on the GPU, overvolting can add quite a bit more wattage) will be.
> 
> 
> 
> I'll just leave this here http://www.overclock.net/t/928113/a-message-to-the-community-on-enthusiast-power-supplies

^^This!

If the PSU breaks when running at 100% load & takes other hardware with it, then you do not call it a good quality PSU. A good quality PSU will be able to deliver the rated wattage for extended periods of time without any problem.


----------



## MEC-777

Quote:


> Originally Posted by *GoLDii3*
> 
> Exactly. It's why "OH NOES MY PSU IS NEAR IT'S LIMITS" is just being dramatic
> 
> You buy a PSU capable of "X" wattage,you can expect it to output it perfectly without any consequence,because,you know,it's been rated to work within those specifications. It doesn't work¿? Then the PSU is junk. Easy


Who was ever being dramatic? Again, I said it's not ideal.

Go ahead and run your 290x on your 3-year-old 520w unit. Like I said before, if it works out for you, then great. If it doesn't, don't say you weren't warned.

Again, I'm just trying to help is all. You know, a lot of people would have just said, "oh yeah grab a 290x and slap it in your system, it'll be great" without even asking about or looking at the rest of your system specs to even check if everything is compatible. I took a look at your system to see if there's anything that might be of concern, knowing it's a very power-hungry GPU. I thought you would appreciate a suggestion/recommendation that would help you get the most out of your new upgrade and keep it running strong/reliable for a longer period of time.


----------



## mfknjadagr8

Quote:


> Originally Posted by *kizwan*
> 
> ^^This!
> 
> If the PSU breaks when running at 100% load & takes other hardware with it, then you do not call it a good quality PSU. A good quality PSU will be able to deliver the rated wattage for extended periods of time without any problem.


there can be quality control issues or simply a slightly weaker cap on any model, even the best...we all know cooler is better and heat can kill components faster...a PSU running at 30C will be much happier than one pegging the meter and running at, say, 50C...I agree that a quality unit should produce full rated wattage at all times when within temp spec, but I play it safe in this area for all the unforeseen...I prefer longevity, so I would rather overestimate than underestimate...keep them running cool and for a long time...to each their own...so far I've never had a PSU fail, but maybe I've just been lucky in this regard...


----------



## MEC-777

Quote:


> Originally Posted by *Mega Man*
> 
> I'll just leave this here http://www.overclock.net/t/928113/a-message-to-the-community-on-enthusiast-power-supplies


Agree with this. However, I wouldn't consider the unit in question (Seasonic S12II 520w Bronze) to be an "enthusiast" high-end PSU. Seasonic is one of the best PSU OEMs out there, but that doesn't mean every single unit they make is bulletproof. Is it a good unit? Yeah. It does use group regulation, which is something I'd try to avoid; however, according to Jonnyguru this unit did surprisingly well on cross-load tests (typically where you'd find the flaws of group regulation and where you'd run into problems powering a beefy GPU like a 290x). So while it is a pretty good unit, I still wouldn't classify it under the "enthusiast high-end" umbrella of PSUs that Phaedrus2129 is referring to.

I've lost track of how many times I've said this: will it work? Yes. Is it ideal/recommended? I would say no. That's all I'm saying.

Not trying to nit-pick or draw this discussion out longer than it needs to be, just trying to be objective/realistic and help a fellow member out.


----------



## kizwan

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> ^^This!
> 
> If the PSU breaks when running at 100% load & takes other hardware with it, then you do not call it a good quality PSU. A good quality PSU will be able to deliver the rated wattage for extended periods of time without any problem.
> 
> 
> 
> there can be quality control issues or simply a slightly weaker cap on any model, even the best...we all know cooler is better


I would avoid any brands that have known quality control issues. Quality control is there to prevent faulty PSUs from getting to the consumer.
Yes, sometimes a faulty PSU will find its way to a consumer, but a well-designed PSU, when it breaks down, won't take other hardware with it, and at the same time doesn't harm the end-user either.
If a PSU only works at certain (low) temps, then it's not a good quality PSU. The worst case scenario with a good quality PSU is that efficiency may drop when the temp is higher.


----------



## mus1mus

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Note it's XFX reference...with Elpida memory...this is the bottom of my two cards. The top card easily handles 1200/1400 with +50 on core, but the bottom card needs aux voltage and a bump to +100 on core or it won't work at all in crossfire at those speeds, so I've been running at 1100/1300 with +25 core...power limit at max.


Try flashing a reference BIOS. Still the most stable I have tried on my cards. Handles 1270/1650 without the blackscreens on high voltages.

Or try the modded 3XX BIOS. Sadly, mine doesn't seem to benefit much from those BIOSes.


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *mfknjadagr8*
> 
> Note it's XFX reference...with Elpida memory...this is the bottom of my two cards. The top card easily handles 1200/1400 with +50 on core, but the bottom card needs aux voltage and a bump to +100 on core or it won't work at all in crossfire at those speeds, so I've been running at 1100/1300 with +25 core...power limit at max.
> 
> 
> 
> 
> 
> 
> 
> Try flashing a reference BIOS. Still the most stable I have tried on my cards. Handles 1270/1650 without the blackscreens on high voltages.
> 
> Or try the modded 3XX BIOS. Sadly, mine doesn't seem to benefit much from those BIOSes.

A good quality PSU....oh! wait, you didn't talk about PSU. Is it over? LOL

My Elpida card is unstable when overclocked; blackscreen crashes at any voltage with the modded BIOS. The hynix ones have been fine so far on my hynix card. Right now I'm running one card with the reference BIOS (TRI-X OC BIOS, 1000/1300) & another card with the 390 hynix modded BIOS (edited default clock 1000/1300). The drivers tend to auto-overclock the hynix card from 947/1250 to 999/1250, so I edited the default clock to 1000/1300 to satisfy my OCD.









Too bad you didn't get any benefit from it. I honestly didn't measure & compare the FPS between the reference & modded BIOS in games (in benchmarks, yes). All I know is I can play GTA V fine at stock clocks with the modded BIOS. Come to think of it, I don't really need the modded BIOS because I can play games just fine with the reference BIOS anyway.


----------



## mus1mus

Quote:


> Originally Posted by *kizwan*
> 
> A good quality PSU....oh! wait, you didn't talk about PSU. Is it over? LOL
> 
> My Elpida card is unstable when overclocked; blackscreen crashes at any voltage with the modded BIOS. The hynix ones have been fine so far on my hynix card. Right now I'm running one card with the reference BIOS (TRI-X OC BIOS, 1000/1300) & another card with the 390 hynix modded BIOS (edited default clock 1000/1300). The drivers tend to auto-overclock the hynix card from 947/1250 to 999/1250, so I edited the default clock to 1000/1300 to satisfy my OCD.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Too bad you didn't get any benefit from it. I honestly didn't measure & compare the FPS between the reference & modded BIOS in games (in benchmarks, yes). All I know is I can play GTA V fine at stock clocks with the modded BIOS. Come to think of it, I don't really need the modded BIOS because I can play games just fine with the reference BIOS anyway.


Funny, I was about to answer your quality PSU claim. lolol. But I'll let it go to sleep.









One thing I have noticed with the modded BIOS is that OCing is a bit limited, and you'll be having blackscreen issues if you are unlucky.

A reference BIOS like the one I am messing with right now has none of those issues on my 290. Higher OC and just one or two frames behind @fyzz's on the graphics tests in FS. So the benefit really matters in benchmarks.

I have also been playing around with modding BIOSes. And I suck. lol

My unlocked 290 was put aside today so I can do a better job of tuning them one card at a time.


----------



## mfknjadagr8

Quote:


> Originally Posted by *kizwan*
> 
> If PSU is only working at certain (low) temp, then it's not good quality PSU. The worst case scenario with good quality PSU is the efficiency may drop when the temp is higher.


I never said it doesn't work or shuts down at high temps; I said lower temps mean more longevity. This is true for anything that stores or transmits current directly through it...cooler is better...but no point in talking about this more; I think everyone has posted their view on it. I wouldn't recommend a crap unit, that's most important...better chance of no failure regardless of whether you exceed its limits. I'm cheap so I'm always looking for a deal, but some things I refuse to compromise on...personal preference...just like I don't buy store brand sodas...ironically I buy oversized cups to hold them too...lol


----------



## fyzzz

@mus1mus Nice, keep going. The last and best run I did with my 290 was with that 390 BIOS I got. It got rid of the black screens I had with the original 390X BIOS and allowed me to OC to 1260, 20MHz higher, which was about my max on the stock 290 BIOS. Man, I miss my 290; I must revisit it someday.


----------



## By-Tor

I always had stuttering in BF4 in certain parts of each map while in crossfire, but after installing the 15.10 I don't get it anymore. (knock on wood)


----------



## rdr09

Quote:


> Originally Posted by *By-Tor*
> 
> I always had stuttering in BF4 in certain parts of each map while in crossfire, but after installing the 15.10 I don't get it anymore. (knock on wood)


Do you mind checking your pagefile reading next time you play? I registered almost 11K on mine. Pagefile is set to auto in Windows.


----------



## By-Tor

Quote:


> Originally Posted by *rdr09*
> 
> Do you mind checking your pagefile reading next time you play? I registered almost 11K on mine. Pagefile is set to auto in Windows.


16325 MB


----------



## rdr09

Quote:


> Originally Posted by *By-Tor*
> 
> 16325 MB


+rep. I thought mine was unusually high.


----------



## kizwan

Quote:


> Originally Posted by *By-Tor*
> 
> I always had stuttering in BF4 in certain parts of each map while in crossfire, but after installing the 15.10 I don't get it anymore. (knock on wood)


I haven't played BF4 for a while now. Anything interesting?


----------



## diggiddi

Quote:


> Originally Posted by *kizwan*
> 
> I haven't played BF4 for a while now. Anything interesting?


New community map with a jungle setting.


----------



## mfknjadagr8

Quote:


> Originally Posted by *rdr09*
> 
> +rep. i thought mine was unusually high.


I've seen upwards of 30GB on mine before in some games... go figure. I have it set to a solid 16GB now and it swaps in and out with no issues. I often run out of space on the SSD, so I left it lower; so far no ill effects.


----------



## GoLDii3

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I've seen upwards of 30GB on mine before in some games... go figure. I have it set to a solid 16GB now and it swaps in and out with no issues. I often run out of space on the SSD, so I left it lower; so far no ill effects.


And what are the benefits of a pagefile? I always have it low or disabled.


----------



## rdr09

Quote:


> Originally Posted by *GoLDii3*
> 
> And what are the benefits of pagefile? I always have it low or disabled


To my understanding, the pagefile is accessed by Windows when system RAM is running low. So if you only have 8GB of system RAM and are playing BF4, for example, RAM usage can go high enough that the pagefile is then used to compensate.


----------



## mfknjadagr8

Quote:


> Originally Posted by *GoLDii3*
> 
> And what are the benefits of pagefile? I always have it low or disabled


It can prevent crashing and hiccups when usable RAM is getting low. Some people run with it off, but some programs don't react well to not having a pagefile at all. The pagefile isn't as useful as it used to be when RAM was lower speed and capacity; the old rule of thumb was that the pagefile should be double the RAM, but now you can usually get away with a small one. I still do it the old way out of habit. Some games and programs like to fill up the pagefile for some reason; Shadow of Mordor can be bad about this, and all of the Battlefield games. I think Mega said his Shadow of Mordor pagefile usage was similar to mine, around 30GB. But I've played since with my pagefile set to 16GB with no noticeable difference, even though it's only using half that, so I'm unsure why some programs love filling it up.
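The sizing rules in that post (the old "double the RAM" habit vs. a smaller modern fixed size) can be sketched in a few lines of Python. The halving and the 4GB floor for the "modern" case are illustrative assumptions, not an official guideline:

```python
def pagefile_size_gb(ram_gb, rule="legacy"):
    """Suggest a pagefile size in GB.

    "legacy" - the old rule of thumb: pagefile = double the RAM.
    "modern" - a small fixed file; half the RAM with a 4GB floor
               (illustrative assumption, not an official guideline).
    """
    if rule == "legacy":
        return ram_gb * 2
    return max(ram_gb // 2, 4)

print(pagefile_size_gb(16))            # legacy habit: 32
print(pagefile_size_gb(16, "modern"))  # smaller fixed size: 8
```

Either way, a fixed size (like the 16GB mentioned above) avoids Windows growing and shrinking the file on its own.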


----------



## By-Tor

Quote:


> Originally Posted by *kizwan*
> 
> I have not play BF4 for a while now. Anything interesting?


The new map reminds me of BF Vietnam.


----------



## Archea47

Continuing the page file discussion... many SSD optimization guides suggest disabling the page file, because it can make a lot of writes to the flash, causing accelerated wear on the drive.

On the other hand, the page file is being used as extra RAM, so high performance is desirable. Some suggest putting the page file on spinning disc to avoid that wear on the flash, but I do use my SSDs.

The page file can be used for parking idle programs' in-memory data out of RAM to make room for active apps' hot data, or when free physical RAM is being exhausted.

Playing GTAV, for example, 4 of my 8GB are still free but it uses 6-8GB of page file on my system, pretty much as soon as it starts. Supposedly this is a carryover behavior from consoles, which assume they will exhaust their RAM and so default to using the page file. That's undesirable on PC if you have enough RAM: data in RAM is directly available for operations on the CPU, rather than having to be shuffled in first off the page file on disc.

FWIW, in BF4 I've started to need a page file. I used to run without one (and 8GB RAM - to be gentle on the AMD IMC), but on my new install the game says I need a video card with at least 512MB vRAM (each has 4GB). I had to turn the page file up to 10GB for it to run - it immediately fills 8.5GB on launch.

Now at work we have systems with 1.5TB RAM and a 3TB pagefile.


----------



## By-Tor

I have an empty Samsung 250gb EVO SSD in my system.

Would I see any performance issues taking the page file off of the main SSD (Samsung 500gb EVO) and using the empty SSD for the page file?


----------



## mfknjadagr8

Quote:


> Originally Posted by *By-Tor*
> 
> I have an empty Samsung 250gb EVO SSD in my system.
> 
> Would I see any performance issues taking the page file off of the main SSD (Samsung 500gb EVO) and using the empty SSD for the page file?


I wouldn't think so, but I doubt you would see any gains either. I would leave the page file and transfer whatever programs or games to the empty drive.


----------



## EpicOtis13

Quote:


> Originally Posted by *mus1mus*
> 
> Well, you have CPUs that are actually heaven and earth.
> 
> 4.8 4790K is a monster in Single core perf.
> 4.6 AMD FX is Mediocre.
> 
> Things will really become apparent that way. Even your 5930K single core perf. is "no mas" for that 4790K in fact.


Somehow my 5930K boosted my frames again, even though the core count is high and the single-core isn't great, like the 8320.
Quote:


> Originally Posted by *kizwan*
> 
> I haven't played BF4 for a while now. Anything interesting?


You can now get the Phantom bow really easily on the new community map.


----------



## Mega Man

Quote:


> Originally Posted by *Archea47*
> 
> Continuing the page file discussion... many SSD optimization guides suggest disabling the page file, because it can make a lot of writes to the flash, causing accelerated wear on the drive.
> 
> On the other hand, the page file is being used as extra RAM, so high performance is desirable. Some suggest putting the page file on spinning disc to avoid that wear on the flash, but I do use my SSDs.
> 
> The page file can be used for parking idle programs' in-memory data out of RAM to make room for active apps' hot data, or when free physical RAM is being exhausted.
> 
> Playing GTAV, for example, 4 of my 8GB are still free but it uses 6-8GB of page file on my system, pretty much as soon as it starts. Supposedly this is a carryover behavior from consoles, which assume they will exhaust their RAM and so default to using the page file. That's undesirable on PC if you have enough RAM: data in RAM is directly available for operations on the CPU, rather than having to be shuffled in first off the page file on disc.
> 
> FWIW, in BF4 I've started to need a page file. I used to run without one (and 8GB RAM - to be gentle on the AMD IMC), but on my new install the game says I need a video card with at least 512MB vRAM (each has 4GB). I had to turn the page file up to 10GB for it to run - it immediately fills 8.5GB on launch.
> 
> Now at work we have systems with 1.5TB RAM and a 3TB pagefile.


This is not really relevant anymore like it was when SSDs were first introduced (talking about page files on SSDs).


----------



## Archea47

Quote:


> Originally Posted by *Mega Man*
> 
> This is not really relevant anymore like it was when SSDs were first introduced (talking about page files on SSDs).


Agree to disagree? If anything, average write endurance has decreased, with many consumer drives moving to TLC and most enterprise flash on MLC these days. Some controllers are better than others at distributing writes to fresher cells, but trust me, the flash guys are still very conscious of this issue and continuing to engineer around it. Your workload or use case may vary, but the topic is still of interest to professionals.


----------



## ryboto

Guys, I need help deciding... I'm coming from a 21.5" 16:9 monitor and can't decide between a 29" or 34" 21:9 monitor... can anyone help me make a decision? My brain is locking up on this one!


----------



## diggiddi

What's the budget here? Bigger is always better, so........


----------



## Archea47

Quote:


> Originally Posted by *ryboto*
> 
> Guys, I need help deciding....I'm coming from a 21.5" 16:9 monitor and can't decide on a 29'' or 34'' 21:9 monitor.... can anyone help me make a decision? My brain is locking up on this one!


What's your target resolution?


----------



## ryboto

I'm trying to decide between the two freesync LG Ultrawides, the 34UM67 and 29UM67. I just worry the 34 is much too large for my desk, and for its native resolution.

edit-Coming from a 21.5 I'm leaning toward the 29'', because it'll still be a sizeable increase. It's also not as expensive. I'd love to get the 34, but I'm primarily concerned about the pixel density, and secondarily concerned about its size. I think when/if curved 34'' 3440 x 1440 becomes more affordable, I'll upgrade to that. Thoughts?


----------



## mus1mus

Quote:


> Originally Posted by *EpicOtis13*
> 
> Somehow my 5930K boosted my frames again, even though the core count is high and the single-core isn't great, like the 8320.


I have no doubt about that. But you are simply having a bit of a conundrum there.

Like I said, your FX OC is mediocre, your 4790K OC is good, and your 5930K at 4.6 is pretty stout; it can match a 5.0GHz clock of the previous-gen CPUs.

You can also add that your Intel mobos allow the card to run at PCIe 3.0 while the FX does 2.0. Firestrike alone shows at least a 2 fps boost for the Intels in my tests.

Now, game performance is a different story, as it may vary from title to title, especially at 1080p. But as soon as you move to higher resolutions (multi-card, obviously), the CPU effect becomes less and less.


----------



## EpicOtis13

Quote:


> Originally Posted by *mus1mus*
> 
> I have no doubt about that. But you are simply having a bit of a conundrum there.
> 
> Like I said, your FX OC is mediocre, your 4790K OC is good, and your 5930K at 4.6 is pretty stout; it can match a 5.0GHz clock of the previous-gen CPUs.
> 
> You can also add that your Intel mobos allow the card to run at PCIe 3.0 while the FX does 2.0. Firestrike alone shows at least a 2 fps boost for the Intels in my tests.
> 
> Now, game performance is a different story, as it may vary from title to title, especially at 1080p. But as soon as you move to higher resolutions (multi-card, obviously), the CPU effect becomes less and less.


You are right about higher res, except for GTA V; that game needs all the CPU power it can get to run fully maxed out with the advanced graphics.


----------



## rdr09

Quote:


> Originally Posted by *ryboto*
> 
> I'm trying to decide between the two freesync LG Ultrawides, the 34UM67 and 29UM67. I just worry the 34 is much too large for my desk, and for its native resolution.
> 
> edit-Coming from a 21.5 I'm leaning toward the 29'', because it'll still be a sizeable increase. It's also not as expensive. I'd love to get the 34, but I'm primarily concerned about the pixel density, and secondarily concerned about its size. I think when/if curved 34'' 3440 x 1440 becomes more affordable, I'll upgrade to that. Thoughts?


If you sit about two feet from the monitor, then I suggest the 29. If the price difference isn't much, then go for the 34 and back up a bit.


----------



## taem

Having this super annoying issue where upon waking up from sleep mode, my OC settings keep clocks, but reset voltage to stock. (Using Afterburner.) That results in the memory OC being unstable and I get a black screen. And then I have to hard shutdown by holding down the power button. Anyone know what's causing this and how to get rid of it? It's a 290.

Quote:


> Originally Posted by *ryboto*
> 
> I'm trying to decide between the two freesync LG Ultrawides, the 34UM67 and 29UM67. I just worry the 34 is much too large for my desk, and for its native resolution.
> 
> edit-Coming from a 21.5 I'm leaning toward the 29'', because it'll still be a sizeable increase. It's also not as expensive. I'd love to get the 34, but I'm primarily concerned about the pixel density, and secondarily concerned about its size. I think when/if curved 34'' 3440 x 1440 becomes more affordable, I'll upgrade to that. Thoughts?


I'm in the same boat as you; I want an adaptive sync monitor and am having a hard time choosing.

The LG models have a 48-75Hz FreeSync range, which seems rather narrow. The Acer XR341CK has a 30-75Hz range, which is a lot better, but it's $1000+, so quite a bit more. And on the G-Sync side, the same Acer will get you a 30-100Hz range. Seems unfair. But the 48Hz minimum on the LG FreeSync, I dunno about that.


----------



## rdr09

Quote:


> Originally Posted by *taem*
> 
> Having this super annoying issue where upon waking up from sleep mode, my OC settings keep clocks, but reset voltage to stock. (Using Afterburner.) That results in the memory OC being unstable and I get a black screen. And then I have to hard shutdown by holding down the power button. Anyone know what's causing this and how to get rid of it? It's a 290.
> I'm in the same boat as you, I want an adaptive sync monitor, having a hard time choosing.
> 
> Those are 48-75Hz range freesync on the LG models. Seems rather narrow a refresh range. The Acer XR341CK has 30-75Hz range which is a lot better, but it's $1000+ so quite a bit more. But on the g sync side, the same Acer will get you 30-100Hz range. Seems unfair. But the 48Hz minimum on the LG freesync, I dunno about that.


Do you have CCC as part of the startup programs in msconfig? If yes, then I suggest unchecking it. That and Raptr.


----------



## taem

Quote:


> Originally Posted by *rdr09*
> 
> Do you have CCC as part of the startup programs in msconfig? If yes, then I suggest unchecking it. That and Raptr.


Oh duh. Totally forgot about that. With Trixx I always had to set a delay on the auto-launch so it launches after CCC, or else CCC will launch after it and reset the volts. I'd like to not disable CCC or Raptr, though; I wonder how I can get around this with AB.


----------



## fat4l

OK, so I finished my ROM for 1100MHz/1500MHz.
GPU1 - 1.212V set in DPM7
GPU2 - 1.250V set in DPM7

Real max voltage is:
GPU1 - 1.195V
GPU2 - 1.234V

It would be nice to see how many volts your 290X/390X needs for 1100/1500MHz clocks.
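As a quick aside for anyone comparing numbers, the gap between the set DPM7 value and the real reported voltage works out to roughly 16-17mV per GPU here. A throwaway Python sketch (assuming the DPM7 figures are millivolts, i.e. 1.212V and 1.250V):

```python
# Set DPM7 voltage vs. real measured voltage (values from the post above,
# assuming the DPM7 figures are millivolts, i.e. 1.212V and 1.250V)
set_v  = {"GPU1": 1.212, "GPU2": 1.250}
real_v = {"GPU1": 1.195, "GPU2": 1.234}

for gpu, sv in set_v.items():
    delta_mv = round((sv - real_v[gpu]) * 1000)  # set-vs-real offset in millivolts
    print(f"{gpu}: set {sv:.3f}V, real {real_v[gpu]:.3f}V, delta {delta_mv}mV")
```

On these numbers it reports a 17mV delta for GPU1 and 16mV for GPU2, which is the kind of offset worth noting when comparing cards.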


----------



## rdr09

Quote:


> Originally Posted by *taem*
> 
> Oh duh. Totally forgot about that. With Trixx I always had to set a delay on the auto launch so it launches after CCC, or else CCC will launch after it and reset the volts. I'd like to not disable CCC or Raptr though, wonder how I can get around this with AB.


CCC will open like any other app even if it's not part of startup. Raptr will require a reboot.


----------



## HOMECINEMA-PC

My cards have been running error free for 2 years now ........


----------



## ryboto

Quote:


> Originally Posted by *taem*
> 
> Having this super annoying issue where upon waking up from sleep mode, my OC settings keep clocks, but reset voltage to stock. (Using Afterburner.) That results in the memory OC being unstable and I get a black screen. And then I have to hard shutdown by holding down the power button. Anyone know what's causing this and how to get rid of it? It's a 290.
> I'm in the same boat as you, I want an adaptive sync monitor, having a hard time choosing.
> 
> Those are 48-75Hz range freesync on the LG models. Seems rather narrow a refresh range. The Acer XR341CK has 30-75Hz range which is a lot better, but it's $1000+ so quite a bit more. But on the g sync side, the same Acer will get you 30-100Hz range. Seems unfair. But the 48Hz minimum on the LG freesync, I dunno about that.


I went with the LG anyway. For a little over $300 I'll get a pretty decent upgrade in visual space and fluidity, regardless of the FreeSync range. When single video cards can easily push 3440 x 1440, I'll jump on something like the XR341CK. For now I don't have the budget to pay for bleeding edge.


----------



## happoman

Quote:


> Originally Posted by *ryboto*
> 
> I went with the LG anyway. For a little over $300 I'll get a pretty decent upgrade in visual space and in fluidity regardless of the Freesync range. When single video cards can easily push 3440 x 1440 I'll jump on something like the XR341CK. For now I don't have the budget to pay for bleeding edge.


I also went with the LG 29UM67-P. With a little display driver mod you can modify your FreeSync range down to 32Hz.

http://hardforum.com/showthread.php?t=1872620


----------



## ryboto

Quote:


> Originally Posted by *happoman*
> 
> I also went with the LG 29UM67-P. With a little display driver mod you can modify your FreeSync range down to 32Hz.
> 
> http://hardforum.com/showthread.php?t=1872620


Did you perform the mod? Can you share the drivers?


----------



## taem

Quote:


> Originally Posted by *ryboto*
> 
> I went with the LG anyway. For a little over $300 I'll get a pretty decent upgrade in visual space and in fluidity regardless of the Freesync range. When single video cards can easily push 3440 x 1440 I'll jump on something like the XR341CK. For now I don't have the budget to pay for bleeding edge.


Yeah. $1100 shipped for a 21:9 1440p is tough to justify for me too, when my single 290 can't max out the QNIX 27" 1440p I'm using. I think I'll probably hold off till HBM2, and by then we should see better and cheaper FreeSync on the market. I almost bought a variant of that LG; Newegg recently had the 34UM94-P with a PS4 and The Last of Us Remastered thrown in for $800 or so, and I was seconds from clicking checkout.

Anyway, if you can get that screen to a 32-75Hz range like happoman says, that's a great deal you just picked up. Even at the default 48-75Hz, you're getting a great price on it.


----------



## happoman

Quote:


> Originally Posted by *ryboto*
> 
> Did you perform the mod? Can you share the drivers?


I haven't performed the mod, because my display hasn't arrived yet. I'm going to mod right away when my display arrives. Probably tomorrow, hopefully.

Here is a link to the post where you can download the modded driver.

http://www.forum-3dcenter.org/vbulletin/showthread.php?p=10744881#post10744881


----------



## gatygun

Quote:


> Originally Posted by *happoman*
> 
> I haven't performed the mod, because my display hasn't arrived yet. I'm going to mod right away when my display arrives. Probably tomorrow, hopefully.
> 
> Here is a link to the post where you can download the modded driver.
> 
> http://www.forum-3dcenter.org/vbulletin/showthread.php?p=10744881#post10744881


Just when I ditched AMD for Nvidia... if this is true, I'm kinda bummed out that I went this way. I really wanted that monitor but found the minimum too low; this would change everything.

Nice find.


----------



## NastyAIDS

OK, so I'd had my R9 290 for 4 months and out of nowhere one of the VRM controllers decided to explode. I have a Corsair HG10 waterblock that I used for the R9 290 and was wondering if it would fit an R9 390?


----------



## EpicOtis13

So the highest core OC I can run where both cards are stable is only 1100MHz. Is this unreasonable? The highest 3DMark score I have gotten so far is a 23554 graphics score with two R9 290s at 1100/1350; any higher than this gives me some artifacting. I may just have expectations that are too high, because my 7970s liked to do 1200/1700 at stock core volts and +10% power limit.


----------



## Mega Man

Quote:


> Originally Posted by *NastyAIDS*
> 
> OK, so I'd had my R9 290 for 4 months and out of nowhere one of the VRM controllers decided to explode. I have a Corsair HG10 waterblock that I used for the R9 290 and was wondering if it would fit an R9 390?










Hmmm, methinks it was not "out of nowhere".

AFAIK the 390 is just a 290X, so AFAIK it will fit. Take the 290's hint - either keep the stock cooler on or go full block.


----------



## diggiddi

I think that's right for a reference 290; someone else can chime in. I think you were spoiled by that 7970 -
1200/1700 is not to be sniffed at.
The Tahitis were overclocking beasts; the Hawaiis... ehhh, not so much.


----------



## Mega Man

Quote:


> Originally Posted by *diggiddi*
> 
> I think thats right for a reference 290, someone else can chime in I think you were spoiled by that 7970
> 1200/1700 is not to be sniffed at
> The Tahiti's were are still overclocking beasts the Hawaii's.....ehhh not so much


Fixed for you.

Still rocking my 7970 quadfire and my 290X quad and 295X2 quad.


----------



## diggiddi

Lol my bad


----------



## Mercennarius

Has anyone had screen "stuttering" issues with their R9 280X before? It looks like the screen moves up and down really quickly, and it seems to happen most when opening certain windows or scrolling in a window. It's not my computer, but it looks like this:





I've reinstalled the drivers and tried both the beta and most recent WHQL drivers with no success.


----------



## rdr09

Quote:


> Originally Posted by *Mercennarius*
> 
> Has anyone had screen "stuttering" issues with their R9 280X before? It looks like the screen moves up and down really quickly, and it seems to happen most when opening certain windows or scrolling in a window. It's not my computer, but it looks like this:
> 
> 
> 
> 
> 
> I've reinstalled the drivers and tried both the beta and most recent WHQL drivers with no success.


ask here . . .

http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-270-owners-club


----------



## Mercennarius

Quote:


> Originally Posted by *rdr09*
> 
> ask here . . .
> 
> http://www.overclock.net/t/1432035/official-amd-r9-280x-280-270x-270-owners-club


Thanks. Thought that was the thread I was in.


----------



## Ironsight

I'm running my 290X on a QNIX monitor. It will not go into 2D mode, even at 60Hz.
Is there a way to fix this, or is it not possible?


----------



## fat4l

Quote:


> Originally Posted by *EpicOtis13*
> 
> So the highest core OC I can run where both cards are stable is only 1100MHz. Is this unreasonable? The highest 3DMark score I have gotten so far is a 23554 graphics score with two R9 290s at 1100/1350; any higher than this gives me some artifacting. I may just have expectations that are too high, because my 7970s liked to do 1200/1700 at stock core volts and +10% power limit.


It depends on your DPM7 voltage and the silicon lottery.
You can go through the 290X BIOS modding thread; there's a utility that shows you the default 3D voltage set in the BIOS for your card.
It depends on the BIOS "default" clocks, ASIC quality and some other factors.
Check it for yourself here: The Stilt's VID APP

Will look like this:









These values mean:
3D voltage
default clock
ASIC quality (hex -> decimal, then divide by 1023) - you can check ASIC quality with GPU-Z as well.

Mega Man, could you please run it as well? I'm interested to see 295X2 results.
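The hex-to-decimal conversion described above is easy to sketch in Python. Treating the result as a percentage (multiplying by 100) is my assumption, based on how GPU-Z reports ASIC quality; the `0x2F7` sample value is made up for illustration:

```python
def asic_quality_pct(raw_hex):
    """ASIC quality: hex -> decimal, divided by 1023.

    Expressing the result as a percentage (x100) is an assumption
    based on how GPU-Z reports ASIC quality.
    """
    return int(raw_hex, 16) / 1023 * 100

# An illustrative raw value of 0x2F7 (759 decimal) would read as ~74.2%
print(round(asic_quality_pct("2F7"), 1))
```

The 1023 divisor makes sense as the maximum of a 10-bit field, so a raw value of 0x3FF maps to 100%.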


----------



## Gumbi

Quote:


> Originally Posted by *Mega Man*
> 
> 
> 
> 
> 
> 
> 
> 
> hmmm methinks it was not "out of nowhere"
> 
> AFAIK the 390 is just a 290X, so AFAIK it will fit. Take the 290's hint - either keep the stock cooler on or go full block.


A 390 is a 290, not a 290X.


----------



## Mega Man

so either way it will work


----------



## rdr09

Quote:


> Originally Posted by *EpicOtis13*
> 
> So the highest core OC I can run where both cards are stable is only 1100MHz. Is this unreasonable? The highest 3DMark score I have gotten so far is a 23554 graphics score with two R9 290s at 1100/1350; any higher than this gives me some artifacting. I may just have expectations that are too high, because my 7970s liked to do 1200/1700 at stock core volts and +10% power limit.


What are your other settings? Specifically VDDC and power limit. What app are you using to OC? What are your temps (core and VRMs) at load?

Edit: if that is at stock volts, then that is actually pretty good, and your 7970 above average.

It's the silicon. My first 290 can do 1170 in FS at stock volts, while the other does 1150. Power limit maxed for both.


----------



## sinnedone

Quote:


> Originally Posted by *Ironsight*
> 
> I'm running my 290x on a QNIX monitor. It will not go into 2D mode even at 60 hz.
> Is there a way to fix this or is it not possible?


Are you running a second monitor?


----------



## fat4l

Quote:


> Originally Posted by *rdr09*
> 
> edit: if that is at stock volts, then that is actually pretty good and your 7970 - above average.


What does "stock volts" mean?
The higher the ASIC, the lower the stock volts.
The higher the default clock, the lower the stock volts.

A higher ASIC doesn't have to mean higher "achievable" clocks.

...thus stock volts differ, from about 1.18 to about 1.25V... and this affects OC.
That's why I said to run the EVV application to tell you the stock volts, so you know how much headroom you have.


----------



## rdr09

Quote:


> Originally Posted by *fat4l*
> 
> what does it mean stock volts ?
> higher the asic->lower the stock volts
> higher the default clock->lower the stock volts
> 
> Higher asic doesn't have to mean higher "achievable" clocks
> 
> ...thus......stock volts differ, from about 1.18 to about 1.25v.....and this affects OC.
> That's why I said run EVV application to tell the stock volts so u know how much headroom u have.


Not too familiar with ASIC. IIRC, as discussed in this thread two years ago... it has to do with leakage, but does not really matter with Hawaii.

My 290 with an ASIC of 79 can bench to 1320-1330; my higher-ASIC 290 can only do 1290. Could be their ASICs are too close to even affect how far they can bench.

What I meant by stock is OC'ing as high as you can without needing to add VDDC. Both my 290s need about +50 to do 1200 core.


----------



## gupsterg

From some results I've seen of the DPM 7 VID gained from The Stilt's app for Hawaii cards, even a small difference in ASIC quality (aka Leakage ID) makes a difference.

The reason all this goes on is so the ROM doesn't need the VID manually set per GPU. Every stock ROM uses EVV, Electronic Variable Voltage: ASIC profiling determines the VID, and besides the ROM, the driver also plays a role in this.

Next, the default GPU clock in the ROM plays a part in the set VID: when it rises above the AMD default value, the VID reduces, and that's why these edition cards tend to have a GPU core voltage offset programmed from the factory.

Link:- A post by The Stilt on ASIC Quality, etc
Quote:


> Originally Posted by *rdr09*
> 
> my 290 with an asic of 79 can bench to 1320 - 1330. my higher asic 290 can only do 1290. could be their asic is too close to even affect how far they can bench.


Run The Stilt's VID app and see what DPM 7 voltage you get for both; do it at stock settings, i.e. no OC app running, etc.

I do agree that besides ASIC quality there is just plain simple silicon lottery going on.

How I see it, your lower-ASIC card may be clocking higher as it a) has a higher default VID and b) is less leaky than the other card.

In this post by The Stilt is some other info on what goes on with low/high ASIC parts.


----------



## EpicOtis13

Quote:


> Originally Posted by *rdr09*
> 
> What are your other settings? Specifically VDDC and Power Limit. What app are you using to oc? What are your temps (CORE and VRMs) at load?
> 
> edit: if that is at stock volts, then that is actually pretty good and your 7970 - above average.
> 
> it's silicon. my first 290 can do 1170 in FS at stock volts while the other 1150. Power Limit maxed for both.


The power limit is set to +15% and I am running them at 1.35 volts. Any higher than 1150 and I get artifacting. I miss my 7970s; at 1.35 volts they were both hitting 1350, fully stable in game. I use Afterburner to OC.


----------



## Ironsight

Quote:


> Originally Posted by *sinnedone*
> 
> Are you running a second monitor?


I have a 2nd monitor plugged in, but it's not enabled.


----------



## kizwan

Quote:


> Originally Posted by *EpicOtis13*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> What are your other settings? Specifically VDDC and Power Limit. What app are you using to oc? What are your temps (CORE and VRMs) at load?
> 
> edit: if that is at stock volts, then that is actually pretty good and your 7970 - above average.
> 
> it's silicon. my first 290 can do 1170 in FS at stock volts while the other 1150. Power Limit maxed for both.
> 
> 
> 
> The power limit is set to +15% and I am running them at 1.35 volts. Any higher than 1150 and I get artifacting. I miss my 7970s. At 1.35 volts they were both hitting 1350 fully stable in game. I use afterburner to OC

You should max out the power limit. There's no reason to set it lower; it can only mask an unstable overclock.


----------



## rdr09

Quote:


> Originally Posted by *gupsterg*
> 
> From some results I've seen of DPM 7 VID gained from The Stilts app for Hawaii cards even a small difference in ASIC quality (aka Leakage ID) makes a difference.
> 
> Why all this is going on is so ROM doesn't need to be manually set for VID per GPU. Every stock rom uses EVV, Electronic Variable Voltage. ASIC profiling goes on to then determine VID, besides ROM the driver also plays a role in this.
> 
> Next what goes on is default GPU clock in ROM plays an element in set VID, when it rises in a ROM from AMD default value VID reduces and that's why these edition cards tend to have a GPU core voltage offset programmed from factory.
> 
> Link:- A post by The Stilt on ASIC Quality, etc
> Run the Stilt VID app see what DPM 7 voltage you get for both, do it at stock setting ie no OC app running , etc.
> 
> I do agree that besides ASIC quality there is just plain simple silicon lottery going on.
> 
> How I see it your lower ASIC card maybe clocking higher as a) has higher default vid b) less leaky than other card.
> 
> In this post by the stilt is some other info on what goes on with low/high asic parts.


will do as soon as i get my rig together. still in storage.









Quote:


> Originally Posted by *EpicOtis13*
> 
> The power limit is set to +15% and I am running them at 1.35 volts. Any higher than 1150 and I get artifacting. I miss my 7970s. At 1.35 volts they were both hitting 1350 fully stable in game. I use afterburner to OC


Max the power limit at any OC. We are opposites: my Tahitis (4) are all low-clocking; the highest was 1255.


----------



## taem

Quote:


> Originally Posted by *gatygun*
> 
> Just when i ditched amd for nvidia, if this is true i'm kinda bumped out that i did go this way. I really wanted that monitor but found the minimum to low, this would change everything.
> 
> Nice found.


It's a pain that there are two different standards. Right now I have the freedom to go AMD or Nvidia, but if I invest in adaptive sync, I'll be wedded to one brand for the life of the monitor, and that's a component I don't upgrade often.

Right now it seems like G-Sync is more mature than FreeSync. Like I mentioned earlier, the Acer XR341 is 30-75Hz, while the G-Sync version of the same display is 30-100Hz. I like adaptive sync mostly for the sub-60fps benefits and I think I'd be fine with a 75Hz max, but at the same price and performance, why wouldn't I go 980 Ti + X34 versus Fury X + XR341CK? Same investment, same game performance, but a better refresh range. Hopefully FreeSync will catch up; I don't know why, but I always prefer AMD. If this state of affairs persists well into next year, though, I might have to go green.


----------



## ragtag7

Hi guys! I just bought a Sapphire 290X 8GB! Got a pretty decent deal on it. Hopefully I can play Witcher 3 at better frames. ^_^


----------



## Gumbi

Quote:


> Originally Posted by *ragtag7*
> 
> Hi guys! I just bought a Sapphire 290X 8GB! Got a pretty decent deal on it. Hopefully I can play Witcher 3 at better frames. ^_^


Nice buy! Which model/cooler?


----------



## serave

Anybody here play GTA V?

I can't get mine stable at 60 fps with an i3-4150 and a 290 Tri-X.

It mostly dips into the 40fps range when I'm driving fast in town. Settings are mixed between high/very high, advanced settings are turned off, and it's fluctuating in the 50-60 range most of the time though.

Please enlighten me.


----------



## happoman

Quote:


> Originally Posted by *serave*
> 
> Anybody here play GTA V?
> 
> i cant get mine stable @60 fps w/ i3-4150 and 290 Tri-X
> 
> mostly dips into the 40fps range when i am doing fast driving in town, settings were mixed between high/very high, advanced settings are turned off, it's fluctuating between the 50-60 range most of the time though
> 
> please enlighten


It's probably your CPU that's limiting your FPS. I had the same experience with my previous CPU (a G3258).

I'm now running almost maxed settings (advanced settings off) with a 290 and an i5-4590 @50-60fps. No dips in framerate.


----------



## Gumbi

Quote:


> Originally Posted by *serave*
> 
> Anybody here play GTA V?
> 
> I can't get mine stable @60fps w/ an i3-4150 and a 290 Tri-X.
> 
> It mostly dips into the 40fps range when I'm driving fast in town. Settings are mixed between high/very high with the advanced settings turned off; it fluctuates in the 50-60 range most of the time though.
> 
> Please enlighten me.


Your CPU is causing the issue.


----------



## Shweller

My 290Xs in crossfire are utterly useless. I get better performance with just one card, except in benchmark programs like 3DMark. I guess 1080p is not the strong point of these cards. I switched from a GTX 770 SLI setup and followed other threads' instructions, such as disabling ULPS, etc. GPU usage is all over the place. I'm running a custom fan curve, so the cards are loud but not throttling due to temps.


----------



## rdr09

Quote:


> Originally Posted by *Shweller*
> 
> My 290Xs in crossfire are utterly useless. I get better performance with just one card, except in benchmark programs like 3DMark. I guess 1080p is not the strong point of these cards. I switched from a GTX 770 SLI setup and followed other threads' instructions, such as disabling ULPS, etc. GPU usage is all over the place. I'm running a custom fan curve, so the cards are loud but not throttling due to temps.


If combined with an i5 at 1080p . . . yes, they would be useless. When I had only one 290 I kept HT off to keep my water cooler; with two 290s, I don't even think of turning HT off anymore. I had to increase my radiator capacity to compensate for the heat.

With two 7900 cards, yes, I was able to turn HT off.


----------



## HOMECINEMA-PC




----------



## Shweller

Quote:


> Originally Posted by *rdr09*
> 
> If combined with an i5 at 1080p . . . yes, they would be useless. When I had only one 290 I kept HT off to keep my water cooler; with two 290s, I don't even think of turning HT off anymore. I had to increase my radiator capacity to compensate for the heat.
> 
> With two 7900 cards, yes, I was able to turn HT off.


HT?


----------



## fyzzz

Quote:


> Originally Posted by *Shweller*
> 
> HT?


Hyper-Threading


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Shweller*
> 
> HT?
> Quote:
> 
> 
> 
> Originally Posted by *fyzzz*
> 
> Hyper-Threading

CPU Hyper-Threading (Intel)


----------



## mfknjadagr8

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> CPU hyperthreading (Intel)


Not to be confused with the HT (HyperTransport) link, AMD's CPU-to-chipset interconnect.


----------



## MEC-777

Quote:


> Originally Posted by *Shweller*
> 
> My 290x's in crossfire are utterly useless. I get better performance with just one card except for benchmark programs like 3d Mark. I guess 1080p is not strong point of these cards. Switched from a gtx 770 sli set up. Followed other threads instructions such as disabling ulps etc. Gpu usage is all over the place. Running custom ran curve so the cards are loud but not throttling due to temps.


I'm running crossfire 290s with an i5-4570 @ 3.7 and I get really good performance in most games, except those that don't play nice with crossfire (Witcher 3, Wolfenstein: The New Order, etc.).

This much GPU horsepower likes to be worked and loaded up; 1080p gaming is not its forte, though it should still run smooth regardless. Try enabling VSR (virtual super resolution) and running your games at 1440p (or higher). I run most of my games at 1440p, anywhere from high to maxed settings. In Project Cars at 1440p, high settings with DS4x AA, I get upwards of 80+ fps, 60 minimum. GTA V at 1440p, very high with some advanced settings on, runs at 70+ fps and rarely ever dips below 60. Mad Max at 1440p maxed settings just flies along at over 120fps 95% of the time. The list goes on.

Make sure you have the latest drivers installed and that it was a clean install (properly uninstall the old drivers with DDU first). Make sure both cards are properly synced in MSI AB (it's an option box you have to select in AB's settings). Make sure Overdrive is disabled in CCC (Catalyst Control Center) and crank the power limit to +50% in AB (yes, even if you're not OCing). The reason is that some 290s still throttle even though they are nowhere near the thermal limit, due to PowerTune interfering. One of my 290s does this, and that's how I got around the issue. If both cards aren't running the exact same clocks, it will noticeably hinder performance.


----------



## kizwan

Quote:


> Originally Posted by *Shweller*
> 
> My 290Xs in crossfire are utterly useless. I get better performance with just one card, except in benchmark programs like 3DMark. I guess 1080p is not the strong point of these cards. I switched from a GTX 770 SLI setup and followed other threads' instructions, such as disabling ULPS, etc. GPU usage is all over the place. I'm running a custom fan curve, so the cards are loud but not throttling due to temps.


Depending on the game, at 1080p in crossfire, GPU usage will fluctuate a lot. This doesn't mean the cards are throttling, however. If you get a fluctuating core frequency and the FPS dips from time to time, then the cards are throttling. The higher the resolution, the less the usage will fluctuate. If you use MSI AB (Afterburner), enable unified GPU usage monitoring in the settings window.

What games did you play? Do the games support crossfire? Did you get worse FPS in crossfire, or no change in FPS at all?

MSI AB comes with RivaTuner, which you can use to turn on an FPS counter in games.
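The rule of thumb above (usage fluctuation at 1080p is normal, but a core clock that repeatedly drops below its set value means throttling) can be sketched as a simple check. The function name and the sample numbers below are hypothetical, not taken from any real monitoring log:

```python
def is_throttling(core_clocks_mhz, target_mhz, tolerance=0.05, max_dips=3):
    """Flag throttling when the core clock repeatedly drops below its set value."""
    dips = sum(1 for c in core_clocks_mhz if c < target_mhz * (1 - tolerance))
    return dips > max_dips

# A card glued to its set clock is fine even if GPU usage fluctuates:
stable = [947, 947, 946, 947, 947, 947]
# Repeated drops well below the set clock indicate throttling:
throttled = [947, 800, 720, 947, 650, 700, 947, 690]

print(is_throttling(stable, 947))     # False
print(is_throttling(throttled, 947))  # True
```

The tolerance just ignores normal one or two MHz jitter; only sustained drops count.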


----------



## Shweller

Thank you everyone for the help. In general it's all games, including BF4 and GTA V. I run +30 on AB. Graphs show GPU usage is inconsistent. GTA V will be smooth, then stutter for a second like it froze, then smooth again. My SLI setup was always smooth and both GPUs ran at 100%. The 290Xs are beasts. GTA V with crossfire disabled and AMD optimal settings gives me 60+ fps; I only get 40 fps with crossfire enabled. I will play around with resolution settings like suggested. All the other settings that were mentioned were already in place at the time of my results.


----------



## kizwan

Quote:


> Originally Posted by *Shweller*
> 
> Thank you everyone for the help. In general it's all games, including BF4 and GTA V. I run +30 on AB. Graphs show GPU usage is inconsistent. GTA V will be smooth, then stutter for a second like it froze, then smooth again. My SLI setup was always smooth and both GPUs ran at 100%. The 290Xs are beasts. GTA V with crossfire disabled and AMD optimal settings gives me 60+ fps; I only get 40 fps with crossfire enabled. I will play around with resolution settings like suggested. All the other settings that were mentioned were already in place at the time of my results.


Which driver version do you have installed? If I'm not mistaken, someone complained that crossfire in BF4 was problematic with the 15.7.1 drivers and that the latest 15.10 beta solved it.


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


lol
Quote:


> Originally Posted by *Shweller*
> 
> HT?


Like others said, HT stands for Hyper-Threading, present on i7s. Turn it off and the i7 basically turns into an i5. You've probably heard of AMD's infamous CPU overhead. It could be a contributing factor, along with the lack of crossfire support in some games, especially Ubisoft and GameWorks titles.

Anyways, here is an example of HT off on my i7, using crossfire HD 7900 cards and a single 290 in BF3 . . .



Spikes during huge explosions and such. Crossfire 7900s are a bit faster than a single 290.


----------



## diggiddi

Quote:


> Originally Posted by *Shweller*
> 
> Thank you everyone for the help. In general it's all games, including BF4 and GTA V. I run +30 on AB. Graphs show GPU usage is inconsistent. GTA V will be smooth, then stutter for a second like it froze, then smooth again. My SLI setup was always smooth and both GPUs ran at 100%. The 290Xs are beasts. GTA V with crossfire disabled and AMD optimal settings gives me 60+ fps; I only get 40 fps with crossfire enabled. I will play around with resolution settings like suggested. All the other settings that were mentioned were already in place at the time of my results.


Crossfire doesn't always take; I ran into the same issue yesterday when I was getting 45 fps in BF3 MP. Close out of the game, go into CCC, disable CF and re-enable it, restart AB if you have it running, apply all the settings that have been recommended, and add a slight overclock (e.g. 20MHz) to the GPU to "wake it up". When I had them at stock, one was sitting at idle speed (300/150) and that was causing the low fps.


----------



## MEC-777

Quote:


> Originally Posted by *diggiddi*
> 
> Crossfire doesn't always take; I ran into the same issue yesterday when I was getting 45 fps in BF3 MP. Close out of the game, go into CCC, disable CF and re-enable it, restart AB if you have it running, apply all the settings that have been recommended, and add a slight overclock (e.g. 20MHz) to the GPU to "wake it up". When I had them at stock, one was sitting at idle speed (300/150) and that was causing the low fps.


I discovered getting crossfire to work properly on 290's can be a bit tricky. There are a lot of little things/tweaks that sometimes need to be done to get it working "just right".

When I first installed my 2nd 290 it took me a few days to figure everything out and start seeing the proper performance gains it should be providing. Until I got it working right, I did see the same or worse performance in a number of games. The biggest issues for me were getting both cards clocking the same and preventing PowerTune from causing one of my cards to down-clock even though it was nowhere near the thermal limit. Now that I have it all figured out and set up correctly, both my cards clock down to 300/150 at idle with the primary card clocking up a little from time to time depending on desktop load. In games, both clock up together to 947/1250 (stock) and they both stay glued there the whole time, regardless of load.

Here's what I would suggest for @Shweller to try: (This is the process I follow to update drivers and ensure my crossfire setup works properly thereafter)

-Download latest version of DDU.
-Download latest AMD drivers (15.10 beta) from AMD's support website.
-Exit MSI AB (close it down)
-Uninstall your current AMD drivers via add/remove programs in control panel.
-Shut PC down (don't restart). With PC shut down, unplug the PCIe power cables from the secondary card but leave the card installed. The PC just simply won't detect it.
-Power up the PC and run DDU. It will ask you to restart in safe mode; do it and follow the steps to completely remove all AMD software. Then restart the PC.
-When PC boots back into windows, run and install the new AMD drivers (15.10). Restart as prompted.
-After restart with the new drivers installed, check and make sure it installed correctly (open CCC (catalyst control center) and check version number etc.) If everything looks good, shut the PC down again.
-Kill the power switch on the PSU just to be safe, wait ~10-15 seconds, then reconnect the PCIe power cables to the secondary card. Turn the PSU back on and power up the machine.
-When you get back into windows give it a minute to detect the 2nd card. If it doesn't open a prompt to enable crossfireX, open up CCC and enable Crossfire. (make sure you also check the box that says "enable crossfire even for applications that have no crossfire profile).
-While you're in CCC, go to the performance section and disable overdrive.
-Next, open up MSI AB and go into settings. Check the box to sync the cards so they both run at the same clocks. You can check the box to disable ULPS, but I left mine enabled and it's been fine (it's an optional thing). Apply and close the settings.
-In AB, set the power limit to +50% (as high as it will go). No harm will come from doing this; it just makes sure PowerTune doesn't downclock when it shouldn't. Set the GPU core clock to 947 (or whatever is stock for your cards) and the same for the memory (1250, or whatever it should be). Click apply, then set that as profile #1 and click the little dot that says "apply OC on startup".
-Now, go run a 3D application (I like using unigine valley as a quick easy test) and watch your GPU clocks (if you have a 2nd display watch AB on the 2nd screen, if not, just alt-tab out while the program is running and look back to see what they were doing. Just keep in mind; some applications will drop out of crossfire mode (drop down to one card) when you alt-tab out and back in (valley does this)). They should sit at 300/150 at idle and ramp up to max stock clocks under load. If both cards ramp up together and stay that way, you should be good to go. GPU load should also be matched on both cards.

Let me know if that does the trick or if you have any trouble with any of those steps. Crossfire can be finicky and the silliest little thing(s) can cause undesirable performance if it's not working just right.

Clear as mud?
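The final verification step above (both cards ramping together and staying matched under load) can be sketched as a sample-by-sample comparison of the two clock traces. The function name and all the numbers are illustrative assumptions, not values from any real monitoring API:

```python
def cards_synced(clocks_a, clocks_b, max_delta_mhz=10):
    """True when two cards' core-clock traces stay matched sample-by-sample."""
    return all(abs(a - b) <= max_delta_mhz for a, b in zip(clocks_a, clocks_b))

# Both cards idle at 300, ramp to stock 947 under load, and return to idle:
card1 = [300, 947, 947, 947, 300]
card2 = [300, 947, 947, 947, 300]
print(cards_synced(card1, card2))  # True

# One card stuck downclocked (e.g. PowerTune interfering) fails the check:
print(cards_synced(card1, [300, 618, 618, 618, 300]))  # False
```

In practice you'd eyeball this on AB's graphs, but the idea is the same: any sustained gap between the two traces means one card is being held back.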


----------



## battleaxe

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *MEC-777*
> 
> I discovered getting crossfire to work properly on 290's can be a bit tricky. There are a lot of little things/tweaks that sometimes need to be done to get it working "just right".
> 
> When I first installed my 2nd 290 it took me a few days to figure everything out and start seeing the proper performance gains it should be providing. Until I got it working right, I did see the same or worse performance in a number of games. The biggest issues for me were getting both cards clocking the same and preventing PowerTune from causing one of my cards to down-clock even though it was nowhere near the thermal limit. Now that I have it all figured out and set up correctly, both my cards clock down to 300/150 at idle with the primary card clocking up a little from time to time depending on desktop load. In games, both clock up together to 947/1250 (stock) and they both stay glued there the whole time, regardless of load.
> 
> Here's what I would suggest for @Shweller to try: (This is the process I follow to update drivers and ensure my crossfire setup works properly thereafter)
> 
> -Download latest version of DDU.
> -Download latest AMD drivers (15.10 beta) from AMD's support website.
> -Exit MSI AB (close it down)
> -Uninstall your current AMD drivers via add/remove programs in control panel.
> -Shut PC down (don't restart). With PC shut down, unplug the PCIe power cables from the secondary card but leave the card installed. The PC just simply won't detect it.
> -Power up the PC and run DDU. It will ask you to restart in safe mode; do it and follow the steps to completely remove all AMD software. Then restart the PC.
> -When PC boots back into windows, run and install the new AMD drivers (15.10). Restart as prompted.
> -After restart with the new drivers installed, check and make sure it installed correctly (open CCC (catalyst control center) and check version number etc.) If everything looks good, shut the PC down again.
> -Kill the power switch on the PSU just to be safe, wait ~10-15 seconds, then reconnect the PCIe power cables to the secondary card. Turn the PSU back on and power up the machine.
> -When you get back into windows give it a minute to detect the 2nd card. If it doesn't open a prompt to enable crossfireX, open up CCC and enable Crossfire. (make sure you also check the box that says "enable crossfire even for applications that have no crossfire profile).
> -While you're in CCC, go to the performance section and disable overdrive.
> -Next, open up MSI AB and go into settings. Check the box to sync the cards so they both run at the same clocks. You can check the box to disable ULPS, but I left mine enabled and it's been fine (it's an optional thing). Apply and close the settings.
> -In AB, set the power limit to +50% (as high as it will go). No harm will come from doing this; it just makes sure PowerTune doesn't downclock when it shouldn't. Set the GPU core clock to 947 (or whatever is stock for your cards) and the same for the memory (1250, or whatever it should be). Click apply, then set that as profile #1 and click the little dot that says "apply OC on startup".
> -Now, go run a 3D application (I like using unigine valley as a quick easy test) and watch your GPU clocks (if you have a 2nd display watch AB on the 2nd screen, if not, just alt-tab out while the program is running and look back to see what they were doing. Just keep in mind; some applications will drop out of crossfire mode (drop down to one card) when you alt-tab out and back in (valley does this)). They should sit at 300/150 at idle and ramp up to max stock clocks under load. If both cards ramp up together and stay that way, you should be good to go. GPU load should also be matched on both cards.
> 
> Let me know if that does the trick or if you have any trouble with any of those steps. Crossfire can be finicky and the silliest little thing(s) can cause undesirable performance if it's not working just right.
> 
> Clear as mud?






Yeah, it's weird. It seems to really depend on the system. On my 3570K machine I had similar issues to what you describe, but on my 2700K machine I had zero issues. I didn't have to do anything besides put the PC together, start it up, and enable crossfire. Crazy.


----------



## kizwan

I don't need to do any of that. Mine just works.


----------



## MEC-777

Quote:


> Originally Posted by *kizwan*
> 
> I don't need to do any of that. Mine just works.


Lucky you.









Not all of that is completely necessary; it's to cover a "worst case scenario" type situation. You shouldn't have to do all of it, but in my case much of it was/is necessary. The very first time I powered up my system with both cards installed, neither would run higher than about 618MHz or something weird like that. It was very bizarre. Following the procedure I outlined, I haven't had any such problems or auto-downclocking since. Now it just works as it should.


----------



## mfknjadagr8

Quote:


> Originally Posted by *MEC-777*
> 
> Lucky you.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not all of that is completely necessary; it's to cover a "worst case scenario" type situation. You shouldn't have to do all of it, but in my case much of it was/is necessary. The very first time I powered up my system with both cards installed, neither would run higher than about 618MHz or something weird like that. It was very bizarre. Following the procedure I outlined, I haven't had any such problems or auto-downclocking since. Now it just works as it should.


I've had similar issues, and I always do it almost exactly the way you outlined now... I even had issues where the cards would show crazy numbers for clocks, like 2700 on memory or 1700 on core. It was simply a driver issue. But I do it slightly differently, to enable the unofficial overclocking mode. As for ULPS, it affects some overclocking programs and can affect crossfire when overclocking, so I disable it as well.


----------



## Shweller

Quote:


> Originally Posted by *MEC-777*
> 
> I discovered getting crossfire to work properly on 290's can be a bit tricky. There are a lot of little things/tweaks that sometimes need to be done to get it working "just right".
> 
> When I first installed my 2nd 290 it took me a few days to figure everything out and start seeing the proper performance gains it should be providing. Until I got it working right, I did see the same or worse performance in a number of games. The biggest issues for me were getting both cards clocking the same and preventing PowerTune from causing one of my cards to down-clock even though it was nowhere near the thermal limit. Now that I have it all figured out and set up correctly, both my cards clock down to 300/150 at idle with the primary card clocking up a little from time to time depending on desktop load. In games, both clock up together to 947/1250 (stock) and they both stay glued there the whole time, regardless of load.
> 
> Here's what I would suggest for @Shweller to try: (This is the process I follow to update drivers and ensure my crossfire setup works properly thereafter)
> 
> -Download latest version of DDU.
> -Download latest AMD drivers (15.10 beta) from AMD's support website.
> -Exit MSI AB (close it down)
> -Uninstall your current AMD drivers via add/remove programs in control panel.
> -Shut PC down (don't restart). With PC shut down, unplug the PCIe power cables from the secondary card but leave the card installed. The PC just simply won't detect it.
> -Power up the PC and run DDU. It will ask you to restart in safe mode; do it and follow the steps to completely remove all AMD software. Then restart the PC.
> -When PC boots back into windows, run and install the new AMD drivers (15.10). Restart as prompted.
> -After restart with the new drivers installed, check and make sure it installed correctly (open CCC (catalyst control center) and check version number etc.) If everything looks good, shut the PC down again.
> -Kill the power switch on the PSU just to be safe, wait ~10-15 seconds, then reconnect the PCIe power cables to the secondary card. Turn the PSU back on and power up the machine.
> -When you get back into windows give it a minute to detect the 2nd card. If it doesn't open a prompt to enable crossfireX, open up CCC and enable Crossfire. (make sure you also check the box that says "enable crossfire even for applications that have no crossfire profile).
> -While you're in CCC, go to the performance section and disable overdrive.
> -Next, open up MSI AB and go into settings. Check the box to sync the cards so they both run at the same clocks. You can check the box to disable ULPS, but I left mine enabled and it's been fine (it's an optional thing). Apply and close the settings.
> -In AB, set the power limit to +50% (as high as it will go). No harm will come from doing this; it just makes sure PowerTune doesn't downclock when it shouldn't. Set the GPU core clock to 947 (or whatever is stock for your cards) and the same for the memory (1250, or whatever it should be). Click apply, then set that as profile #1 and click the little dot that says "apply OC on startup".
> -Now, go run a 3D application (I like using unigine valley as a quick easy test) and watch your GPU clocks (if you have a 2nd display watch AB on the 2nd screen, if not, just alt-tab out while the program is running and look back to see what they were doing. Just keep in mind; some applications will drop out of crossfire mode (drop down to one card) when you alt-tab out and back in (valley does this)). They should sit at 300/150 at idle and ramp up to max stock clocks under load. If both cards ramp up together and stay that way, you should be good to go. GPU load should also be matched on both cards.
> 
> Let me know if that does the trick or if you have any trouble with any of those steps. Crossfire can be finicky and the silliest little thing(s) can cause undesirable performance if it's not working just right.
> 
> Clear as mud?


Thanks, I will give it a try and report back. I am happy to see support in the AMD community is strong.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> I don't need to do any of that. Mine just works.


----------



## MEC-777

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I've had similar issues, and I always do it almost exactly the way you outlined now... I even had issues where the cards would show crazy numbers for clocks, like 2700 on memory or 1700 on core. It was simply a driver issue. But I do it slightly differently, to enable the unofficial overclocking mode. As for ULPS, it affects some overclocking programs and can affect crossfire when overclocking, so I disable it as well.


I left ULPS enabled and "extend OC limits" disabled. With that, I've been able to push both cards up to 1200/1450, but that was on the ragged edge (able to finish a run of Firestrike but that's about it). I'm not too concerned with extreme OCing though, as I find that even at stock they're more than fast enough to run the games I play.







That being said, they'll do 1100/1350 all day long perfectly stable.








Quote:


> Originally Posted by *Shweller*
> 
> Thanks, I will give it a try and report back. I am happy to see support in the AMD community is strong.


No prob, let me know.







Yeah, these Hawaii cards offer by far the best performance per dollar of any GPU currently available, so it's understandable that quite a few people are running them. If you can effectively control the heat they produce and get a good OC on them, they can really pump out a lot of performance.


----------



## diggiddi

Oh boy, I just tripped the shutoff mechanism at 1100/1505. My 750W PSU can't handle it.
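A rough power-budget sketch shows how an aggressive overclock can push a 750W unit past its trip point. Every wattage below is an illustrative assumption, not a measured figure for any specific card or CPU:

```python
def psu_headroom(psu_watts, draws):
    """Watts left over after summing the estimated component draws."""
    return psu_watts - sum(draws)

# Assumed draws: two heavily overclocked 290s, the CPU, board/drives/fans.
draws = [320, 320, 120, 60]
print(psu_headroom(750, draws))  # -70: over budget, so protection can trip
```

Negative headroom means the estimated system draw exceeds the PSU's rating, which is exactly the situation where over-power protection shuts the unit down.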


----------



## gatygun

Quote:


> Originally Posted by *serave*
> 
> Anybody here play GTA V?
> 
> I can't get mine stable @60fps w/ an i3-4150 and a 290 Tri-X.
> 
> It mostly dips into the 40fps range when I'm driving fast in town. Settings are mixed between high/very high with the advanced settings turned off; it fluctuates in the 50-60 range most of the time though.
> 
> Please enlighten me.


AMD drivers have massive CPU overhead, and combining a fast GPU with a slower CPU isn't going to work well. I experienced that myself. It's your CPU. Either swap the 290 for an Nvidia card, or upgrade your CPU to resolve it.


----------



## battleaxe

Quote:


> Originally Posted by *HOMECINEMA-PC*











Quote:


> Originally Posted by *gatygun*
> 
> AMD drivers have massive CPU overhead, and combining a fast GPU with a slower CPU isn't going to work well. I experienced that myself. It's your CPU. Either swap the 290 for an Nvidia card, or upgrade your CPU to resolve it.


Amen


----------



## rdr09

Quote:


> Originally Posted by *Shweller*
> 
> Thanks, I will give it a try and report back. I am happy to see support in the AMD community is strong.


You can do all that, but if you play a multi-threaded game . . . it would not help. I learned early on from my HD 7900 cards in crossfire: HT on gave me higher minimums. Now, with Hawaii, I don't even think about turning HT off.

Like kizwan, mine just works. I just install the new driver over the old one. No DDU on any of my rigs, Intel or AMD.

What I do differently from others, maybe, is . . . install one card, install the driver, then install the other card. CCC automatically enables crossfire.

i do some rituals, too, with Afterburner. mainly how it set as per this thread . . .

http://www.overclock.net/t/1265543/the-amd-how-to-thread


----------



## MEC-777

Quote:


> Originally Posted by *gatygun*
> 
> AMD drivers have massive CPU overhead, and combining a fast GPU with a slower CPU isn't going to work well. I experienced that myself. It's your CPU. Either swap the 290 for an Nvidia card, or upgrade your CPU to resolve it.


It's not an AMD vs. Nvidia thing. That CPU will hold back a 980 in the same or similar situations. The best solution is to upgrade to an i5.


----------



## MEC-777

Quote:


> Originally Posted by *rdr09*
> 
> You can do all that, but if you play a multi-threaded game . . . it would not help. I learned early on from my HD 7900 cards in crossfire: HT on gave me higher minimums. Now, with Hawaii, I don't even think about turning HT off.
> 
> Like kizwan, mine just works. I just install the new driver over the old one. No DDU on any of my rigs, Intel or AMD.
> 
> What I do differently from others, maybe, is . . . install one card, install the driver, then install the other card. CCC automatically enables crossfire.
> 
> i do some rituals, too, with Afterburner. mainly how it set as per this thread . . .
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread


Just a few thoughts....

The member's system in question is running an i5, so there is no HT to be turned on/off. An i5 is more than enough for crossfire 290Xs. I'm running a slower i5 (4570) with two 290s and have not experienced much CPU bottlenecking, if any at all.

You've mentioned a few times now about enabling or disabling HT. Forgive me, but I see no reason why you'd ever want to disable it. Wouldn't you want the most CPU performance available at all times?









The member's system in question is having issues getting crossfire to work/perform properly, which doesn't really have anything to do with multi-threaded games. It has more to do with the drivers and/or additional settings.

I agree with installing new drivers with one card powered in the system, then shutting down, reconnecting power to the 2nd card, and powering up again. CCC should enable crossfire automatically or at least bring up a prompt, but it doesn't always do that (in my experience). That's why I included those steps in my write-up a few posts back.







When I tried installing new drivers with both cards powered, I ran into issues.


----------



## rdr09

Quote:


> Originally Posted by *MEC-777*
> 
> Just a few thoughts....
> 
> The member's system in question is running an i5, so there is no HT to be turned on/off. An i5 is more than enough for crossfire 290Xs. I'm running a slower i5 (4570) with two 290s and have not experienced much CPU bottlenecking, if any at all.
> 
> You've mentioned a few times now about enabling or disabling HT. Forgive me, but I see no reason why you'd ever want to disable it. Wouldn't you want the most CPU performance available at all times?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The member's system in question is having issues with getting crossfire to work/perform properly which doesn't really have anything to do with multi-threaded games. It has more to do with the drivers and/or additional settings.
> 
> I agree with installing new drivers with one card powered in the system, then shutting down, reconnecting power to the 2nd card and power up again. CCC should enable crossfire automatically or at least bring up a prompt, but it doesn't always do that (from my experience). But yeah, that's why I included those steps in my write up a few posts back.
> 
> 
> 
> 
> 
> 
> 
> When I tried installing new drivers with both cards powered, I ran into issues.


I had a single 290 for a while, about a year. Turning HT off made my water cooler, and with it off . . . it could handle the 290 just fine. But, like I said, based on my experience with crossfire HD 7900 cards and now with two 290s . . . I have to keep it on. I added rads to compensate for the extra heat from adding another 290 and turning HT on.

edit: here was my usage with a single 290 in BF4 at 1080, which is fine . . .



but adding another 290? maybe playing with settings will work like one member did. i'll post the link when i find it. here was what Truelies did . . .

http://www.overclock.net/t/1553034/low-fps-with-r9-290x/50#post_24447176


----------



## MEC-777

Quote:


> Originally Posted by *rdr09*
> 
> I had a single 290 for a while, about a year. Turning HT off made my water cooler, and with it off . . . it could handle the 290 just fine. But, like I said, based on my experience with crossfire HD 7900 cards and now with two 290s . . . I have to keep it on. I added rads to compensate for the extra heat from adding another 290 and turning HT on.
> 
> edit: here was my usage with a single 290 in BF4 at 1080, which is fine . . .
> 
> 
> 
> but adding another 290? maybe playing with settings will work like one member did. i'll post the link when i find it. here was what Truelies did . . .
> 
> http://www.overclock.net/t/1553034/low-fps-with-r9-290x/50#post_24447176


Oh I see, so you turned HT off to reduce your water temps. How much of a difference does that make though?

In my system, with the i5 and two 290s, most of my games show near-100% usage on both GPUs fairly consistently (as long as it's a demanding game like GTA V). Even though I do see high CPU usage in some games, the system still remains mainly GPU-bound, which is what you want. I run most of my games at 1440p, which puts the load primarily on the GPUs versus 1080p. The heavier the workload on the GPUs, the less CPU overhead matters.







So running at 1080p may show some CPU bottlenecking with a straight i5, which is also why you'd see an improvement by enabling HT on an i7.


----------



## kizwan

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I don't need to do any of that. Mine just works.
> 
> 
> 
> Lucky you.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not all of that is completely necessary, but it's to cover a "worst case scenario" type situation. You shouldn't have to do all that, but in my case, much of that was/is necessary. The very first time I powered up my system with both cards installed, neither would run higher than about 618mhz or something weird like that. It was very bizarre. Following the procedure I outlined, I haven't had any such problems or auto-downclocking since. Now it just works as it should.

I still have screenshots of GPU usage & core frequency (playing BF4 with the cards still on the stock cooler) that I took in the first week I got my 290s. Yeah, it's pretty much smooth sailing for me. No black screen issue either, like many people had back then.


Spoiler: Warning: Spoiler!







Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I don't need to do any of that. Mine just works.











Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mfknjadagr8*
> 
> I've had similar issues and I always do it almost exactly the way you outlined now...I even had issues where the cards would show crazy numbers on clocks like 2700 on memory...or 1700 on core...it was simply a driver issue...but I do it slightly different to enable the unofficial overclocking mode...as to ulps it effects some overclocking programs and can effect crossfire when overclocking so I disable it as well
> 
> 
> 
> I left ULPS enabled and "extend OC limits" disabled. With that, I've been able to push both card up to 1200/1450, but that was on the ragged edge (able to finish a run of firestrike but that's about it). I'm not too concerned with extreme OCing though as I find even at stock they're more than fast enough to run the games I play.
> 
> 
> 
> 
> 
> 
> 
> That being said, they'll do 1100/1350 all day long perfectly stable.
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Shweller*
> 
> Thanks, I will give it a try and report back. I am happy to see support in the AMD community is strong.
> 
> 
> No prob, let me know.
> 
> 
> 
> 
> 
> 
> 
> Yeah, these Hawaii cards offer by far, the best performance per dollar of any GPU currently available, so it's understandable that there's quite a few people running them. If you can control the heat they produce effectively and get a good OC on them, they can really pump out a lot of performance.

What I found when overclocking & overvolting with ULPS still enabled is that the secondary card, while it appeared to run at max frequency, was not running at full performance. The voltage on the secondary card was not raised like it should have been, yet it didn't crash when running benchmarks or playing games. ULPS can mask overclock instability. This was about a year ago & probably no longer true with driver updates and whatnot. You can check this by logging data from the secondary card with GPU-Z under two conditions: stock frequency & overclocked (1200/1450). Then plot the voltage for both in Excel and compare to see whether the secondary card was properly overvolted or not.
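That log comparison can also be scripted instead of eyeballed in a chart. A minimal sketch, assuming a GPU-Z sensor log saved as comma-separated text with a `VDDC [V]` column (check the header of your own log; the filenames in the usage comment are hypothetical):

```python
import csv

def mean_vddc(log_path, column="VDDC [V]"):
    """Average a voltage column from a GPU-Z sensor log (comma-separated, header row)."""
    values = []
    with open(log_path, newline="") as f:
        rows = csv.reader(f)
        # GPU-Z pads column names with spaces, so strip the header fields
        header = [h.strip() for h in next(rows)]
        idx = header.index(column)
        for row in rows:
            if len(row) <= idx:
                continue  # skip short/blank lines
            try:
                values.append(float(row[idx]))
            except ValueError:
                continue  # skip malformed samples
    return sum(values) / len(values) if values else 0.0

# usage (hypothetical filenames), secondary card at stock vs. 1200/1450:
#   print(mean_vddc("gpu2_stock.txt"), mean_vddc("gpu2_oc.txt"))
```

If the two averages come out nearly identical under load, the overvolt likely isn't reaching the secondary card.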


----------



## rdr09

Quote:


> Originally Posted by *MEC-777*
> 
> Oh I see, so you turned HT off to reduce your water temps. How much of a difference does that make though?
> 
> In my system, with the i5 and two 290's, most of my games show near 100% usage on both GPUs (as long as it's a demanding game like GTA-V) fairly consistently. Even though I do see high CPU usage in some games, it still seems to remain mainly GPU-bound which is what you want. I run most of my games at 1440p which puts more load primarily on the GPUs vs at 1080p. The heavier the workload on the GPUs the less CPU overhead you will have.
> 
> 
> 
> 
> 
> 
> 
> So running at 1080p may show some CPU bottlenecking with a straight i5, which is also why you'd see an improvement by enabling HT on an i7.


if you saw my earlier post in BF3 . . . with HT off my cores were hitting 70C. that was because i did not have enough rad space (only a 120 per block, limited by my case). if i had the right amount, then i would keep HT on with a single 290. i've never tried HT off with 2 290s. maybe sandy's ipc is holding things back and a bigger oc would help. here is my usage in BF4 at 4K using Mantle and DX11 . . .



pretty similar, but Mantle's load seems more evenly distributed. both keep my GPUs at work.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *battleaxe*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Amen




Quote:


> Originally Posted by *rdr09*
> 
> you can do all that, but if you play a multi-threaded game . . . it would not help. i learned early from my HD 7900 cars in crossfire. HT on gave me higher minimums. now, with hawaii, i don't even think about turning HT off.
> 
> *like Kizwan, mine just works.* i just install the old driver over the new. no ddu on all my rigs. intel or amd.
> 
> what i do different from others "maybe" is . . . install one card, install driver, install the other card. CCC automatically enables crossfire.
> 
> i do some rituals, too, with Afterburner. mainly how it set as per this thread . . .
> 
> http://www.overclock.net/t/1265543/the-amd-how-to-thread




Quote:


> Originally Posted by *kizwan*
> 
> I still have screenshots of gpu usage & core frequency (playing BF4 with the cards still using the stock cooler) that I took in the first week I got my 290's. Yeah, it's pretty much smooth sailing for me. No black screen issue either like many people have back then.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What I found when overclock & overvolt while ULPS still enabled is that the secondary card, while it seems running at max frequency, was not running at full performance. The voltage on the secondary card was not overvolt like it should be but it doesn't crash when running benchmark or when playing games. ULPS can masking overclock instability. This was like a year ago & probably not true anymore with the drivers update and what not. You can check this by logging the data from secondary card using GPU-Z in two conditions; stock frequency & overclock (1200/1450). Then you can plot a graph in Excel for the voltage & compare both to see whether the secondary card was properly overvolt or not.


----------



## mus1mus

Ohmigawd! My phone!


----------



## battleaxe

Quote:


> Originally Posted by *kizwan*
> 
> I still have screenshots of gpu usage & core frequency (playing BF4 with the cards still using the stock cooler) that I took in the first week I got my 290's. Yeah, it's pretty much smooth sailing for me. No black screen issue either like many people have back then.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What I found when overclock & overvolt while ULPS still enabled is that the secondary card, while it seems running at max frequency, was not running at full performance. The voltage on the secondary card was not overvolt like it should be but it doesn't crash when running benchmark or when playing games. ULPS can masking overclock instability. This was like a year ago & probably not true anymore with the drivers update and what not. You can check this by logging the data from secondary card using GPU-Z in two conditions; stock frequency & overclock (1200/1450). Then you can plot a graph in Excel for the voltage & compare both to see whether the secondary card was properly overvolt or not.


No, this is still true. I had this happen just the other day. I couldn't figure out why my second card was running cooler than the first, and this was the reason. Once ULPS was disabled, FPS increased, as did heat. So you're still correct, even on new drivers.


----------



## AliNT77

so after daaays of research & bios editing & undervolting & benchmarking &... i came up with this as my 24/7 gaming-stable profile on my 290:


-33% power consumption & higher performance at the same time















makes me wonder WTH they (AMD's "engineers") are doing









BTW my score at stock is 10900


----------



## Ironsight

Quote:


> Originally Posted by *AliNT77*
> 
> so after daaays of research & bios editing & undervolting & benchmarking &... i came up with this as my 24/7 gaming stable profile on my 290:
> 
> 
> -33% power consumption & higher performance at the same time
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> wonder WTH are they (AMD's "engineers") doing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BTW my score @stock is 10900


Whoa, nice job. Would you mind sharing some tips on where to start so I can see similar improvements?


----------



## MEC-777

Quote:


> Originally Posted by *AliNT77*
> 
> so after daaays of research & bios editing & undervolting & benchmarking &... i came up with this as my 24/7 gaming stable profile on my 290:
> 
> 
> -33% power consumption & higher performance at the same time
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> wonder WTH are they (AMD's "engineers") doing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BTW my score @stock is 10900


Very nice!









Makes me wonder if my two 290s can run stock clocks at lower voltages without any stability issues. (hours of testing incoming...







)

Just keep in mind, not ALL Hawaii/Grenada cards are the same or can do this. The AMD engineers have to carefully select clocks and voltages that ALL cards can run at and remain stable.


----------



## AliNT77

Quote:


> Originally Posted by *MEC-777*
> 
> Very nice!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Makes me wonder if my two 290's can run stock clocks with lower voltages without any stability issues. (hours of testing incoming...
> 
> 
> 
> 
> 
> 
> 
> )
> 
> Just keep in mind, not ALL Hawaii/Grenada cards can do this/are the same. The AMD engineers have to carefully select clocks and voltages that ALL cards can run at and remain stable.


i know, but they could add a middle memory clock like i did (450MHz) so that on the desktop the memory won't go full throttle at idle voltage and cause a black screen

the bottleneck for undervolting hawaii is the memory's idle state, so i removed it this way:


before editing the bios i could only undervolt by -31mV, but now -185mV without a black screen
undervolting this much will affect core stability though, so you must test under load and find the sweet spot for your card

sorry for my poor english


----------



## battleaxe

Quote:


> Originally Posted by *AliNT77*
> 
> so after daaays of research & bios editing & undervolting & benchmarking &... i came up with this as my 24/7 gaming stable profile on my 290:
> 
> 
> -33% power consumption & higher performance at the same time
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> wonder WTH are they (AMD's "engineers") doing
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BTW my score @stock is 10900


Every card is different, and not all cards could do that.


----------



## AliNT77

Quote:


> Originally Posted by *Ironsight*
> 
> Woah nice job. Would you mind sharing some tips on where to start so I can see some improvements like you?


first, download Hawaii Bios Reader and learn how to flash a bios (it's super easy, so don't panic)

extract your card's bios using GPU-Z and open it with Hawaii Bios Reader

make it look exactly like this:

save the bios and flash it

now open Afterburner and, in settings/monitoring, set "hardware polling period" to 100, then close settings

then detach the MSI AB hardware monitor and make its window very big (more than half of your screen)
*now you have an @idle memory torture test*









then start undervolting with Sapphire TriXX until a black screen happens

shut down the system and switch the bios using the physical switch on your graphics card (in case windows won't boot on the modded one)
once windows has booted, shut down again and switch back to the modded bios
*(you have to do this step every time you get a black screen)*

now set "the voltage that caused the black screen + 12mV" and fire up a benchmark like Unigine Valley
let it run for a while; if there are artifacts, black screens, driver crashes, etc., start underclocking
if there are no artifacts then you are damn lucky, so overclock the core

then keep lowering the power limit until you get what you want









hope that helps
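For what it's worth, the search above reduces to: step the voltage offset down until a black screen, then set "the voltage that caused the black screen + 12mV" and verify under load. A toy sketch of just that logic; `is_stable_at` is a stand-in for the manual idle-torture / Valley test, and nothing here touches real hardware:

```python
def find_undervolt(is_stable_at, start_mv=0, step_mv=6, floor_mv=-250, margin_mv=12):
    """Step a voltage offset down until instability, then back off by a margin.

    is_stable_at(offset_mv) -> bool stands in for the manual test
    (the @idle memory torture / Unigine Valley run described above).
    """
    offset = start_mv
    # keep lowering while the next step down is still stable
    while offset - step_mv >= floor_mv and is_stable_at(offset - step_mv):
        offset -= step_mv
    crash_mv = offset - step_mv   # first offset that black-screened (or the floor)
    return crash_mv + margin_mv   # "voltage that caused blackscreen + 12mV"

# example: a pretend card that black-screens below -180mV
print(find_undervolt(lambda mv: mv >= -180))   # -> -174
```

On a real card you would of course do the stepping by hand in TriXX; the sketch just makes the stopping rule explicit.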


----------



## mathesar

Hello, I recently noticed that GPU usage seems very erratic when viewed in MSI Afterburner or GPU-Z. For example, this screenshot shows what it does while playing Batman: Arkham Knight. The card is running stock settings except for a custom fan profile, but I also tried the stock auto fan settings with the same results.

This is my first AMD card, so I'm not sure if this is considered normal, but I'm guessing not, lol (I never saw this with previous Nvidia cards).

The card is a Diamond R9 290X (reference cooling) with the bios set to Uber mode.


----------



## AliNT77

Quote:


> Originally Posted by *battleaxe*
> 
> Every card is different, and not all cards could do that.


but most of them can do more than -100mV

my card is a super ****ty clocker (max stable core: 1140, no matter how much voltage)
trust me, a less voltage-hungry gpu can do -220mV easily









you can test it for yourself


----------



## battleaxe

Quote:


> Originally Posted by *AliNT77*
> 
> but most of them can do more than -100mV
> 
> my card is a super ****ty clocker (max stable core: 1140 no matter how much voltage)
> trust me; a less-voltage-hungry gpu can do -220mV easily
> 
> 
> 
> 
> 
> 
> 
> 
> 
> u can test it for yourself


What I was talking about was the memory. Mine could not do 1500 at -150. Well... I guess I should try before I say it can't. But I know one of my cards cannot, as it can't hit 1500 no matter what voltage. The core, yes; that seems very possible.


----------



## AliNT77

Quote:


> Originally Posted by *battleaxe*
> 
> What I was talking about was the memory. Getting 1500 at -150 mine could not do. Well... I guess I should try before I say it can't. But I know one of my cards cannot, as it can't hit 1500 no matter what voltage. The core, yes. That seems very possible.


definitely try editing the bios,
it helps memory OC a lot


----------



## MEC-777

Quote:


> Originally Posted by *AliNT77*
> 
> i know it but they could add a middle memory clock like i did (450mHz) so when on desktop , memory wont go full throttle with idle voltage and cause black screen
> 
> the bottleneck for undervolting hawaii is the memory's idle state so i removed it this way:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> before editing bios i just could UV it by -31mV but now -185mV without blackscreen
> but UVing this much will affect core stability so U must test it under load and find the sweet spot for Ur card
> 
> sorry for my poor english


Cool.







Thanks for the tips. I think I'll see how much I can undervolt them without messing with the BIOS. I tried playing around with the BIOS back when I had just one card (with a modified 390X BIOS), but it became unstable and didn't yield any gains worth the hassle. Now that I have two cards, flashing the BIOS on both is just more steps. I'm pretty happy with the way they both run at stock, to be honest. Wouldn't mind slightly lower temps though. Will see how far I can drop them.









Quote:


> Originally Posted by *mathesar*
> 
> Hello I recently noticed GPU usage seems very erratic when viewing in MSI Afterburner or GPU-Z, For example this screenshot is what it does while playing Batman Arkham Knight, The card is running stock settings except with a custom fan profile, but I also tried running it with the stock Auto fan settings with same results.
> 
> This is my first AMD card so not sure if it's considered normal but I'm guessing not lol (Never saw this with previous Nvidia cards).
> 
> Card is a Diamond R9 290X (reference cooling) bios set on Uber mode.
> 
> 
> Spoiler: Warning: Spoiler!


Arkham Knight is a broken game with numerous bugs and errors in the PC version. It's a mess, to be blunt, lol. Even after its second release, it's still apparently broken. It's not a good example of how a game should run, so don't be too concerned. Try another game like GTA V or The Witcher 3, something that runs much more stably.


----------



## MEC-777

Quote:


> Originally Posted by *battleaxe*
> 
> What I was talking about was the memory. Getting 1500 at -150 mine could not do. Well... I guess I should try before I say it can't. But I know one of my cards cannot, as it can't hit 1500 no matter what voltage. The core, yes. That seems very possible.


Yeah, memory OC is hit and miss with these cards. My HIS can do 1500 on the memory but my Gigabyte can only do 1450 and barely. Both can do 1200 on the core though.


----------



## battleaxe

Quote:


> Originally Posted by *MEC-777*
> 
> Yeah, memory OC is hit and miss with these cards. My HIS can do 1500 on the memory but my Gigabyte can only do 1450 and barely. Both can do 1200 on the core though.


Yeah, both of mine hit 1250+; one will do 1280 on the core. But one of the cards will only do 1325MHz on memory, lol.... what a waste. Luckily, gaming doesn't really need it, so I don't care. I bench with my strong card but game in Xfire, and I only run 1250 mem when gaming. Just no need to push harder.


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> Yeah, both of mine hit 1250+ one will do 1280 on core. *But one of the cards will only do 1325mhz on memory* lol.... what a waste. Luckily, gaming doesn't really need it, so I don't care. I bench with my strong card, but game in Xfire. I only run 1250 mem when gaming. Just no need to push harder.


BIOS can fix that.









Some BIOSes have tighter memory strap timings that limit overclockability, especially for Elpida memory. Some BIOSes are also missing memory straps, so the next defined strap may be too tight for what the memory can handle.

Looking for the perfect BIOS for your card, though, can be time consuming and may not be worth the hassle.

It can also be due to your card's ASIC quality and the *theory* that memory voltage is linked to the VDDC. I think higher-ASIC cards have lower VDDC at DPM 7, which may be affecting the memory voltage.

Edit: The BIOSes available from TPU for GIGAs, for example, have tighter timings than most BIOSes from other makes.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> BIOS can fix that.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Other bioses have tighter memory strap timings that limit overclockability. Esp for Elpida memory. Some bios also have missing memory straps that the next defined strap may be too tight for the memory capability.
> 
> Looking for the perfect BIOS for your card, though, can be time consuming and may not be worth the hassle.
> 
> It can also be due to your card's ASIC quality and the *theory* that memory Voltage is linked with the VDDC. I think higher cards have lower VDDC at DPM 7 that may be affecting the Memory Voltage.
> 
> Edit: The BIOSES available from TPU for GIGAs, for example, have tighter timings than most BIOS from other makes.


Interesting. Maybe I should have a go at the BIOS. Just don't have a lot of motivation to do so. Know what I mean?


----------



## mathesar

Quote:


> Originally Posted by *MEC-777*
> 
> Arkham Knight is a broken game with numerous bugs and errors in the PC version. It's a mess, to be blunt. lol. Even after it's latest second release, it's still apparently broken. Not a good game to use as a proper example for how a game should run, so don't be too concerned. Try another game like GTA-V or Witcher 3 or something that runs much more stable.


Aaanddd you were right, lol. I tried running Metro 2033, since that one always taxes systems pretty well, and the GPU usage readout was very stable this time.

I'm actually enjoying Arkham Knight even though it has questionable performance on even high-end PCs; it mostly stays at 60fps for me, with the biggest issues occurring when driving the Batmobile (stuttering & erratic framerates).

Thanks.


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> Interesting. Maybe I should have a go at the BIOS. Just don't have a lot of motivation to do so. Know what I mean?


Of course I do.


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *battleaxe*
> 
> Interesting. Maybe I should have a go at the BIOS. Just don't have a lot of motivation to do so. Know what I mean?
> 
> 
> 
> Of course I do.



My cards, both Elpida & Hynix, can bench at 1200/1600, but since upgrading to Win 10 I can't do both core & memory at the same time anymore (at the same voltage). I use a 390X modded BIOS on my Hynix card as a workaround. With the modded BIOS at 1159/1501 I can get the same bench score I got at 1200/1500. Overclocks seem more stable with the modded BIOS, but don't quote me, because I didn't test it thoroughly. The modded BIOS, however, introduced instability on my Elpida card, so I stick with the 290 TRI-X OC BIOS on that one.


----------



## mus1mus

Stable with instabilities









I feel the timings in the modded BIOS are the culprit on my cards, as they apply the 1250 timings to the 1500 strap (not too sure, I didn't look).

Though some are getting better bench scores on them, I just don't want black screens.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> Of course I do.


Quote:


> Originally Posted by *kizwan*
> 
> 
> 
> My cards, both Elpida & Hynix, can bench at 1200/1600 but since upgrade to Win 10, I can't do both core & memory at the same time anymore (at same voltage). I use 390X modded BIOS on my Hynix card as a workaround. I can get same bench score that I got with 1200/1500 at 1159/1501 with the modded BIOS. Overclocks seems more stable with modded BIOS but don't quote me because I didn't test it thoroughly. The modded BIOS however introduced instability on my Elpida card. So I stick with 290 TRI-X OC BIOS with that card.


LOL... yeah. Maybe I do need a hug.

The silicon lottery drives me crazy sometimes, but then again, it's what makes this hobby fun, because you're never sure what you'll get.


----------



## MEC-777

Quote:


> Originally Posted by *battleaxe*
> 
> LOL... yeah. Maybe I do need a hug.
> 
> The silicon lottery drives me crazy sometimes but then again, its what makes this hobby fun because you never are sure what you will get.


"Graphics cards are like a box of chocolates. Ya never know what ya gonna get."


----------



## robinaish

Quote:


> Originally Posted by *AliNT77*
> 
> first download hawaii bios reader and learn how to flash a bios (its super easy so dont panic)
> 
> extract your card's bios using gpu-z and open it with hawaii bios reader
> 
> make it look exactly like this:
> 
> save the bios and flash it
> 
> now open afterburner and in settings/monitoring set "hardware polling period" to 100 and close settings
> 
> then detach MSI AB hardware monitoring and make its window very big (like more than half of your screen)
> *now U have an @idle memory torture test*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> then start UVing with sapphire trixx until blackscreen happens
> 
> shut down the system and switch bios using physical switch on your graphics card ( unless windows wont boot )
> when windows booted up , shut down the system and switch bios to the modded one
> *(u have to this step every time U get blackscreen)*
> 
> now set "the voltage that caused blackscreen+12mV" and fire up a benchmark like unigine valley
> let it run for a while and if there were artifacts,BS,driver crash,etc.. start underclocking
> if there were no artifact then u are so damn lucky and overclock the core
> 
> then start lowering power limit untill u got what u want
> 
> 
> 
> 
> 
> 
> 
> 
> 
> hope that helps


Hi,
I'm very interested in this... but I don't understand the voltage table.

Where do you find the voltage corresponding to each of your DPM frequencies?
Is this voltage table in fact 3 voltage tables, each used for a certain ASIC quality, so that I have to find out which one my card uses?

Can I change the values directly in Hawaii Bios Reader, or do I have to fiddle with the hex editor?

Thank you

(for now I have only changed the offset value and fan curve in the hex editor)


----------



## fyzzz

Since some of you have experimented with low voltages and refined your BIOSes, I want to share some things you can do. If you put in a 'middle' clock that takes care of the low voltages, you can run a high memory clock at low voltage under load. I hope the pictures explain this a little better. The memory wasn't supposed to be at 1000MHz at idle; I messed up, since Hawaii Bios Reader mangled the clock speeds with this BIOS, but this is just a quick mod to show what's possible. Notice the voltages and the clock speeds. It made it through Firestrike without a problem: http://www.3dmark.com/fs/6411494.
Idle:

Load:
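The 'middle clock' trick described here and by AliNT77 amounts to reshaping the memory DPM table so the desktop no longer jumps straight from idle to full memory speed at idle voltage. A purely illustrative before/after (clock and voltage numbers are hypothetical, in the spirit of the 450MHz middle clock; real tables are edited in Hawaii Bios Reader, not in Python):

```python
# Hypothetical Hawaii memory DPM tables: clock in MHz, voltage in mV.
stock_dpm = [
    {"state": "idle", "mem_mhz": 150,  "mv": 975},
    # desktop can demand this clock at idle voltage -> black screens when undervolted
    {"state": "load", "mem_mhz": 1250, "mv": 1000},
]
modded_dpm = [
    {"state": "idle",   "mem_mhz": 150,  "mv": 975},
    {"state": "middle", "mem_mhz": 450,  "mv": 975},   # desktop sits here safely at idle voltage
    {"state": "load",   "mem_mhz": 1500, "mv": 1000},  # full memory clock only at load voltage
]

# the modded table never asks for a high memory clock at idle voltage
assert all(s["mem_mhz"] <= 450 for s in modded_dpm if s["mv"] < 1000)
```

The point of the extra state is exactly the invariant in the last line: no low-voltage state carries a high memory clock.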


----------



## fat4l

So I updated to Win 10 and my CF setup is completely broken.
Firestrike is running very badly, clocks fluctuating like crazy, GPU usage too..... bahhh
I'm on the 15.11 drivers


----------



## MEC-777

Quote:


> Originally Posted by *fat4l*
> 
> So I update to win 10 and my CF setup is completelly broken.
> FS is running very badly, clocks fluctuating like crazy, gpu usage too.....bahhh
> im on 15.11 drivers


After upgrading to windows 10, it's a good idea to uninstall (completely) and reinstall the latest drivers. That's at least where I would start.


----------



## fat4l

Quote:


> Originally Posted by *MEC-777*
> 
> After upgrading to windows 10, it's a good idea to uninstall (completely) and reinstall the latest drivers. That's at least where I would start.


That's what I did. The 15.11 drivers are not working for me.
I reverted back to the official 15.7.1 and all is good. Even better than on Win 7.
Very pleased.

Clocks are 100% now.
Voltage regulation is awesome.
27500 GPU points in FS! yay



EDIT:
it looks like it's caused by FreeSync...


----------



## MEC-777

Quote:


> Originally Posted by *fat4l*
> 
> Thats what I did. 15.11 drivers are not working fine for me.
> I reverted back to official 15.7.1 and all is good. Even better than on win 7.
> Very pleased.
> 
> Clocks are 100% now.
> Voltage regulation is awesome.
> 27500GPU points in FS! yay


Ah, maybe there's a problem with 15.11 then. I'm on 15.9.1 with windows 10 and all is well.


----------



## fat4l

Quote:


> Originally Posted by *MEC-777*
> 
> Ah, maybe there's a problem with 15.11 then. I'm on 15.9.1 with windows 10 and all is well.


I'm testing now and it looks like the problem is caused by FreeSync.
Enabling it causes all this crap...


----------



## taem

Quote:


> Originally Posted by *MEC-777*
> 
> After upgrading to windows 10, it's a good idea to uninstall (completely) and reinstall the latest drivers. That's at least where I would start.


I'm confused about versions though. I did a clean install of Windows 10 and it automatically installed Catalyst 15.201.1151. Benchmarks are about where they should be, just a tad slower than the latest stable version I was running under Windows 8.1.

AMD Gaming Evolved tells me I should upgrade to 15.7.1 (stable) or 15.11 (beta). But CCC tells me what I have is up to date.

So, should I keep 15.201.1151 or go to 15.7.1?

If I do install 15.7.1, won't Windows think that's an older version and just reinstall 15.201.1151?


----------



## AliNT77

Quote:


> Originally Posted by *robinaish*
> 
> Hi,
> I'm very interested in this... but I don't understand the Voltage Table.
> 
> Where do you find the voltage corresponding to each of your dpm frequency?
> Is this voltage table in fact 3 voltage tables, each used for a certain asic quality? And i have to find out which one I use.
> 
> Can i change value directly in the "Hawaii Bios Reader" or do I have to fiddle whith the Hex editor?
> 
> Thank you
> 
> (for now I have only changed the offset value and fan curve in hex editor)


there's no need to change the voltages in the bios;
just set a middle clock for the memory and do the voltage adjustments with Sapphire TriXX


----------



## AliNT77

Quote:


> Originally Posted by *fyzzz*
> 
> Since some guys have experimented with low voltages and refined your bioses, i want to share some things that you can do. If you put a 'middle' Clock that take care of the low voltages, you can run a high memory Clock on low voltage under load. I hope the Pictures explain this a Little bit better. The memory wasn't supposed to be at 1000 mhz at idle, i messed up since Hawaii bios reader messed up the clockspeeds with this bios, but this is just a Quick bios mod to show something that you can do. Notice the voltages and the clockspeeds. It made it through firestrike without a problem http://www.3dmark.com/fs/6411494.
> Idle:
> 
> Load:


yes this is exactly what i came up with
















my card just gets 1.01V under load







with -33% power limit and 950/1500MHz, with very very slight throttling
but overall performance is higher than stock


----------



## kizwan

Quote:


> Originally Posted by *AliNT77*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Ironsight*
> 
> Woah nice job. Would you mind sharing some tips on where to start so I can see some improvements like you?
> 
> 
> 
> first download hawaii bios reader and learn how to flash a bios (its super easy so dont panic)
> 
> extract your card's bios using gpu-z and open it with hawaii bios reader
> 
> make it look exactly like this:
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> save the bios and flash it
> 
> now open afterburner and in settings/monitoring set "hardware polling period" to 100 and close settings
> 
> then detach MSI AB hardware monitoring and make its window very big (like more than half of your screen)
> *now U have an @idle memory torture test*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> then start UVing with sapphire trixx until blackscreen happens
> 
> shut down the system and switch bios using physical switch on your graphics card ( unless windows wont boot )
> when windows booted up , shut down the system and switch bios to the modded one
> *(u have to this step every time U get blackscreen)*
> 
> now set "the voltage that caused blackscreen+12mV" and fire up a benchmark like unigine valley
> let it run for a while and if there were artifacts,BS,driver crash,etc.. start underclocking
> if there were no artifact then u are so damn lucky and overclock the core
> 
> then start lowering power limit untill u got what u want
> 
> 
> 
> 
> 
> 
> 
> 
> 
> hope that helps

Quote:


> Originally Posted by *fyzzz*
> 
> Since some guys have experimented with low voltages and refined your bioses, i want to share some things that you can do. If you put a 'middle' Clock that take care of the low voltages, you can run a high memory Clock on low voltage under load. I hope the Pictures explain this a Little bit better. The memory wasn't supposed to be at 1000 mhz at idle, i messed up since Hawaii bios reader messed up the clockspeeds with this bios, but this is just a Quick bios mod to show something that you can do. Notice the voltages and the clockspeeds. It made it through firestrike without a problem http://www.3dmark.com/fs/6411494.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Idle:
> 
> Load:


Thanks for the tips. So far only tested memory at 1600 with +0mV & firestrike completed without any issue. I'll try lower voltage later. I honestly never tried overclock memory only, so I don't have baseline to compare. Haha.

+rep for both of ya.


----------



## taem

So Windows 10 will not let you install your own GPU drivers. I just tried, DDU to remove the auto installed package, installed 15.7.1, upon reboot, Windows auto installs its own package. No way to stop it on the Home edition of 10. It's a weird install, doesn't require a reboot. Seems to work ok. But not allowing gamers to choose their own discrete gpu driver package seems insane. Am I the only idiot who chose to upgrade to Windows 10 on a gaming pc?


----------



## GorillaSceptre

Quote:


> Originally Posted by *taem*
> 
> So Windows 10 will not let you install your own GPU drivers. I just tried, DDU to remove the auto installed package, installed 15.7.1, upon reboot, Windows auto installs its own package. No way to stop it on the Home edition of 10. It's a weird install, doesn't require a reboot. Seems to work ok. But not allowing gamers to choose their own discrete gpu driver package seems insane. Am I the only idiot who chose to upgrade to Windows 10 on a gaming pc?


A lot are on Win 10 now. I upgrade from 7 ultimate so i got 10 pro.

That's unacceptable.. What if the latest driver release is broken? MS have stated that driver and compatibility issues have dropped, so what they're doing must be working, but taking away peoples choice is a load of bollocks..


----------



## taem

Quote:


> Originally Posted by *GorillaSceptre*
> 
> A lot are on Win 10 now. I upgrade from 7 ultimate so i got 10 pro.
> 
> That's unacceptable.. What if the latest driver release is broken? MS have stated that driver and compatibility issues have dropped, so what they're doing must be working, but taking away peoples choice is a load of bollocks..


I'm hoping I'm doing something wrong, and someone can correct me. Because if you can't opt for beta driver packages, I would think that's a deal breaker, gaming wise, for most folks.

Is it different for Pro version users? Afaik with Pro you can opt to delay updates, but at some point it will auto install just like Home. Maybe that delay is long enough so you can manually install drivers, and then repeat the process everytime the auto update does kick in?

3dMark also reports the drivers auto installed by Windows 10 to be "not approved." AMD Gaming Evolved keeps telling me to update to 15.7.1. Shrug


----------



## GorillaSceptre

Quote:


> Originally Posted by *taem*
> 
> I'm hoping I'm doing something wrong, and someone can correct me. Because if you can't opt for beta driver packages, I would think that's a deal breaker, gaming wise, for most folks.
> 
> Is it different for Pro version users? Afaik with Pro you can opt to delay updates, but at some point it will auto install just like Home. Maybe that delay is long enough so you can manually install drivers, and then repeat the process everytime the auto update does kick in?
> 
> 3dMark also reports the drivers auto installed by Windows 10 to be "not approved." AMD Gaming Evolved keeps telling me to update to 15.7.1. Shrug


No idea man.. I went back to an old driver version a couple weeks back, and they didn't change until i manually went back to the new ones a few days ago.

There must be a fix for home users somewhere though.


----------



## fat4l

Quote:


> Originally Posted by *taem*
> 
> So Windows 10 will not let you install your own GPU drivers. I just tried, DDU to remove the auto installed package, installed 15.7.1, upon reboot, Windows auto installs its own package. No way to stop it on the Home edition of 10. It's a weird install, doesn't require a reboot. Seems to work ok. But not allowing gamers to choose their own discrete gpu driver package seems insane. Am I the only idiot who chose to upgrade to Windows 10 on a gaming pc?


when I did it, DDU "said" it is stopping this windows auto install thingy to prevent installing drivers.
Download the newest version, run it in safe mode and it should tell you it's disabling windows instalations or check this column here:


----------



## taem

Quote:


> Originally Posted by *fat4l*
> 
> when I did it, DDU "said" it is stopping this windows auto install thingy to prevent installing drivers.
> Download the newest version, run it in safe mode and it should tell you it's disabling windows instalations or check this column here:


Are you on Pro or Home? Because I didn't touch that button, I only clicked "Clean and Restart." My understanding was if I leave that alone, I get to manually install my drivers and Windows won't auto install over it. But if I click it, then update settings go to Default, and Windows *will* reinstall the drivers it wants to. But as soon as I restarted WIndows installed new drivers anyway. I'm on Home obviously.

So do I have that right? You leave that button alone, right?


----------



## diggiddi

I don't ever worry about that button. Like u said windows will install its driver but that doesn't matter since CCC will remove it upon installation


----------



## fat4l

i didnt touch it.
It said that its disabling windows automatic instalations and after that if I want to enable it I need to press the button.
Disabling automatic instalations should be offered automatically.
I'm on PRO version.


----------



## taem

DDU 15.5.1.0 (the latest afaik) tells me no such thing about preventing auto install by Windows update. I guess this is a Pro vs Home issue, unless there's someone here on Windows 10 Home who is able to manually install catalyst drivers and *keep* them without Windows automatically updating over it. Windows 10 Home wants me to use 15.201.1151, and will not let me manually install. I just tried it again, re-downloading DDU just in case. Same result.

I'll be interested in seeing what happens to the Pro users down the road, because afaik, with Pro, you can only delay auto updating, not prevent them entirely. With Home your only option is whether to let Windows auto reboot after updates, or tell you to reboot.

If my understanding of all this is correct, there's no way gamers can use WIndows 10 Home, you have to get Pro just for this, and even then, it might just be delaying the issue. Someone please correct me if I'm wrong here. If Pro will allow me to choose my own gpu drivers, then I would pay the $99, but not if it's merely temporary.


----------



## ragtag7

Quote:


> Originally Posted by *Gumbi*
> 
> Nice buy! Which model/cooler?


I bought the Sapphire Tri-X 8GB version. It was on sale on Amazon for $365. I'm using it now, downside is my CPU is a major bottleneck. -__-


----------



## fat4l

Quote:


> Originally Posted by *taem*
> 
> DDU 15.5.1.0 (the latest afaik) tells me no such thing about preventing auto install by Windows update. I guess this is a Pro vs Home issue, unless there's someone here on Windows 10 Home who is able to manually install catalyst drivers and *keep* them without Windows automatically updating over it. Windows 10 Home wants me to use 15.201.1151, and will not let me manually install. I just tried it again, re-downloading DDU just in case. Same result.
> 
> I'll be interested in seeing what happens to the Pro users down the road, because afaik, with Pro, you can only delay auto updating, not prevent them entirely. With Home your only option is whether to let Windows auto reboot after updates, or tell you to reboot.
> 
> If my understanding of all this is correct, there's no way gamers can use WIndows 10 Home, you have to get Pro just for this, and even then, it might just be delaying the issue. Someone please correct me if I'm wrong here. If Pro will allow me to choose my own gpu drivers, then I would pay the $99, but not if it's merely temporary.


yeah htats prolly it. Go and get "pro"








However I had an issue with resolution. After uninstalling my vga drivers my resolution greyed out and was at 640x480(afaik).
I couldn't install amd drivers cuz of this low resolution as the installer was too big for the screen and I coudln't press(see) the "next" button....
I had to rotate my screen(portrait mode) and this allowed me to install them...


----------



## kizwan

GPU-Z Sensor Log - ELPIDA 1600 MEMORY -25mV Fire Strike 20151107-1357

Core 1000MHz (290 TRI-X OC BIOS)
MSI AB: Memory = 1600MHz, Core Voltage = -25mV

Note: This is not lowest my card can go. Still in progress. Voltage still high because my card voltage seems pretty high by default.



Cropped: Test 1 & 2 only.


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> GPU-Z Sensor Log - ELPIDA 1600 MEMORY -25mV Fire Strike 20151107-1357
> 
> Note: This is not lowest my card can go. Still in progress. Voltage still high because my card voltage seems pretty high by default.


what are your other reading/settings? Core? And, you use the VDDC slider of AB or through BIOS?

Thanks. +rep


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> GPU-Z Sensor Log - ELPIDA 1600 MEMORY -25mV Fire Strike 20151107-1357
> 
> Core 1000MHz (290 TRI-X OC BIOS)
> MSI AB: Memory = 1600MHz, Core Voltage = -25mV
> 
> Note: This is not lowest my card can go. Still in progress. Voltage still high because my card voltage seems pretty high by default.
> 
> 
> 
> Cropped: Test 1 & 2 only.
> 
> 
> 
> 
> what are your other reading/settings? Core? And, you use the VDDC slider of AB or through BIOS?
> 
> Thanks. +rep
Click to expand...

This is using 290 TRI-X OC BIOS, so the core already 1000MHz. Memory & voltage (VDDC) was set in MSI AB. I only alter the "middle" memory frequency, following the tips by fyzzz & AliNT77 earlier. (I updated my previous post.)


----------



## AliNT77

memory @1600 @1v


----------



## fyzzz

Quote:


> Originally Posted by *AliNT77*
> 
> memory @1600 @1v


Nice, I experimented with lower idle voltage today, it may be able to go lower, but i won't probably test any lower than this, since this is my other computer that i don't use that much.
290 'Power saving mode':


----------



## gupsterg

Quote:


> Originally Posted by *AliNT77*
> 
> memory @1600 @1v


You have memory voltage control on your card? OR are you thinking AUX voltage in MSI AB is memory voltage?


----------



## fyzzz

I think he means gpu voltage


----------



## gupsterg

Ahh , I see now ...

I just get excited every time when someone mentions memory voltage that a non lightning / matrix card owner has snagged it







...


----------



## AliNT77

Quote:


> Originally Posted by *fyzzz*
> 
> Nice, I experimented with lower idle voltage today, it may be able to go lower, but i won't probably test any lower than this, since this is my other computer that i don't use that much.
> 290 'Power saving mode':


holy crap







that voltage :|
what about under load? is 947mhz on core stable with that voltage offset?


----------



## fyzzz

Quote:


> Originally Posted by *AliNT77*
> 
> holy crap
> 
> 
> 
> 
> 
> 
> 
> that voltage :|
> what about under load? is 947mhz on core stable with that voltage offset?


Well i did everything through bios editing, so i can choose whatever load voltage i want, i was just testing how low the idle voltage could go. I won't do hours testing, no point in that anymore, since my 290 is in a computer that i barely use, which is a bit of a shame.


----------



## AliNT77

Quote:


> Originally Posted by *fyzzz*
> 
> Well i did everything through bios editing, so i can choose whatever load voltage i want, i was just testing how low the idle voltage could go. I won't do hours testing, no point in that anymore, since my 290 is in a computer that i barely use, which is a bit of a shame.


how can i just change load voltage through bios?


----------



## fyzzz

Quote:


> Originally Posted by *AliNT77*
> 
> how can i just change load voltage through bios?


DPM 7 changes load voltage, you must change all 6 of them (there are also 4 that you must change under the limit table tab), for example if i put 1250 in all of the dpm 7 states, i get about 1.148 under load. And the dpm 0 state changes idle voltage, and once again you must edit all 6 of them.


----------



## fyzzz

Since I decided to test my psu in my amd rig, i Went ahead and benchmarked a bit too. Everything is on air now. Want to break 10k overall, i know the 290 has alot of room left, but i'm testing the cpu first.
EDIT: I finally got 10 030 overall with my fx6300/r9 290 i'm so happy now








http://www.3dmark.com/3dm/9192320?


----------



## jagdtigger

It has been a while since i last posted in this thread but i just run into some issue. Have somebody any idea what can cause the blackouts on this viseo?








https://dl.dropboxusercontent.com/u/1201829/OCN/20151107_231742.mp4


----------



## caenlen

mmm nom nom http://www.3dmark.com/3dm/9193690? oc'd the hell out of my crossfire, and its stable, and freesync is smooth as butter. beats a 980 ti at 1500 core and cost me $500 less. AMD fo life. /flex


----------



## jagdtigger

Nice, that 2500k gets almost the same score as my 4670k...
http://www.3dmark.com/fs/5577240


----------



## MEC-777

Quote:


> Originally Posted by *caenlen*
> 
> mmm nom nom http://www.3dmark.com/3dm/9193690? oc'd the hell out of my crossfire, and its stable, and freesync is smooth as butter. beats a 980 ti at 1500 core and cost me $500 less. AMD fo life. /flex


Nice.







Very similar score and OC to my 290's (1200/1450). http://www.3dmark.com/fs/6128206

Nailed a 25802 graphics score, but you got me on the CPU side.


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> This is using 290 TRI-X OC BIOS, so the core already 1000MHz. Memory & voltage (VDDC) was set in MSI AB. I only alter the "middle" memory frequency, following the tips by fyzzz & AliNT77 earlier. (I updated my previous post.)
> 
> 
> Spoiler: Warning: Spoiler!


Thanks for the info. such a wuzz at flashing bios.


----------



## EpicOtis13

Quote:


> Originally Posted by *MEC-777*
> 
> Nice.
> 
> 
> 
> 
> 
> 
> 
> Very similar score and OC to my 290's (1200/1450). http://www.3dmark.com/fs/6128206
> 
> Nailed a 25802 graphics score, but you got me on the CPU side.


You beat me by a lot in graphics score but in total score I you got your a$$ handed to you. http://www.3dmark.com/fs/6413092


----------



## megax05

Now I know my what is my next upgrade


----------



## mfknjadagr8

Quote:


> Originally Posted by *EpicOtis13*
> 
> You beat me by a lot in graphics score but in total score I you got your a$$ handed to you. http://www.3dmark.com/fs/6413092


we get it you have hyperthreading lol


----------



## Roboyto

Been a while since I've done any benching, so I fired up 3DMark to see if there has been any degradation to my hardware.

This was my first FS run: http://www.3dmark.com/fs/6427151 Driver came up as not approved...couldn't remember the last time I did anything with drivers since I upgraded to Win10 a little ways back. Fired up DDU, and installed 15.7.1

Problem Solved: http://www.3dmark.com/fs/6427238

Or so it seemed until I ran 3DMark11.

First run in 3DMark11: http://www.3dmark.com/3dm11/10499435 All was well with drivers, but I noticed by overall score was down about 1000 points from when I first got my 290 and was benching it constantly; previous best was P16312 w/ Tesselation on. After closer examination my physics score is definitely the culprit.

I then remembered I had removed my RAM recently to test in my HTPC, and I probably forgot to reset RAM speed. Into the BIOS, and sure enough RAM was running at 1600. I corrected this and enabled the XMP to get it back to 2400.

Re-run 3DMark11 and the good news is that the card is still performing like it was 2 years ago







...but now I'm back to Graphics Driver Not Approved...














http://www.3dmark.com/3dm11/10501280 http://www.3dmark.com/3dm/9201086?

Any input here?

Quote:


> Originally Posted by *AliNT77*
> 
> first download hawaii bios reader and learn how to flash a bios (its super easy so dont panic)
> 
> extract your card's bios using gpu-z and open it with hawaii bios reader
> 
> make it look exactly like this:
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> save the bios and flash it
> 
> now open afterburner and in settings/monitoring set "hardware polling period" to 100 and close settings
> 
> then detach MSI AB hardware monitoring and make its window very big (like more than half of your screen)
> *now U have an @idle memory torture test*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> then start UVing with sapphire trixx until blackscreen happens
> 
> shut down the system and switch bios using physical switch on your graphics card ( unless windows wont boot )
> when windows booted up , shut down the system and switch bios to the modded one
> *(u have to this step every time U get blackscreen)*
> 
> now set "the voltage that caused blackscreen+12mV" and fire up a benchmark like unigine valley
> let it run for a while and if there were artifacts,BS,driver crash,etc.. start underclocking
> if there were no artifact then u are so damn lucky and overclock the core
> 
> 
> then start lowering power limit untill u got what u want
> 
> 
> 
> 
> 
> 
> 
> 
> 
> hope that helps


Very cool, I will definitely have to try this. +Rep! My card already reaches 1700 on the VRAM, I wonder how far I can push it with a little tweaking









You may be receiving some PMs at some point when I start messing with that


----------



## rdr09

@Roboyto,

It has degraded. I know you use to bench at 1300 core.


----------



## fat4l

Quote:


> Originally Posted by *rdr09*
> 
> @Roboyto,
> 
> It has degraded. I know you use to bench at 1300 core.


I wonder what core voltage u need for that lol...
My cores are asking for 1.381v and 1.418v as for DPM7(1.336v and 1.375v max real) to be completelly artifact free @1200MHz.
Not rly sure if it's safe to go any higher than that...








Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Been a while since I've done any benching, so I fired up 3DMark to see if there has been any degradation to my hardware.
> 
> This was my first FS run: http://www.3dmark.com/fs/6427151 Driver came up as not approved...couldn't remember the last time I did anything with drivers since I upgraded to Win10 a little ways back. Fired up DDU, and installed 15.7.1
> 
> Problem Solved: http://www.3dmark.com/fs/6427238
> 
> Or so it seemed until I ran 3DMark11.
> 
> First run in 3DMark11: http://www.3dmark.com/3dm11/10499435 All was well with drivers, but I noticed by overall score was down about 1000 points from when I first got my 290 and was benching it constantly; previous best was P16312 w/ Tesselation on. After closer examination my physics score is definitely the culprit.
> 
> I then remembered I had removed my RAM recently to test in my HTPC, and I probably forgot to reset RAM speed. Into the BIOS, and sure enough RAM was running at 1600. I corrected this and enabled the XMP to get it back to 2400.
> 
> Re-run 3DMark11 and the good news is that the card is still performing like it was 2 years ago
> 
> 
> 
> 
> 
> 
> 
> ...but now I'm back to Graphics Driver Not Approved... " src="https://www.overclock.net/images/smilies/thinking.gif" style="width:18px;">
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/10501280 http://www.3dmark.com/3dm/9201086?
> 
> Any input here?
> 
> Very cool, I will definitely have to try this. +Rep! My card already reaches 1700 on the VRAM, I wonder how far I can push it with a little tweaking
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You may be receiving some PMs at some point when I start messing with that


It says 15.201.1151.0.
That's windows 10 oem amd driver. U need to run ddu in safe mode, select "disable automatic windows drivers instalations" and install a proper amd driver.


----------



## rdr09

Quote:


> Originally Posted by *fat4l*
> 
> I wonder what core voltage u need for that lol...
> My cores are asking for 1.381v and 1.418v as for DPM7(1.336v and 1.375v max real) to be completelly artifact free @1200MHz.
> Not rly sure if it's safe to go any higher than that...
> 
> 
> 
> 
> 
> 
> 
> 
> It says 15.201.1151.0.
> That's windows 10 oem amd driver. U need to run ddu in safe mode, select "disable automatic windows drivers instalations" and install a proper amd driver.


+200 VDDC in Trixx and showed 1.41v in HWIFO64 . . .

http://www.3dmark.com/3dm/2098310

it was a cold winter.


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> +200 VDDC in Trixx and showed 1.41v in HWIFO64 . . .
> 
> http://www.3dmark.com/3dm/2098310
> 
> it was a cold winter.


Elpida?

Close match.
http://www.3dmark.com/compare/fs/1465417/fs/5526928


----------



## megax05

That score is more than what gtx 980 can do i think.


----------



## Roboyto

Quote:


> Originally Posted by *fat4l*
> 
> I wonder what core voltage u need for that lol...
> My cores are asking for 1.381v and 1.418v as for DPM7(1.336v and 1.375v max real) to be completelly artifact free @1200MHz.
> Not rly sure if it's safe to go any higher than that...
> 
> 
> 
> 
> 
> 
> 
> 
> It says 15.201.1151.0.
> That's windows 10 oem amd driver. U need to run ddu in safe mode, select "disable automatic windows drivers instalations" and install a proper amd driver.


It still benches at 1305/1725, but artifacts and tears like crazy. 1295 has minimal issues.

I run +200mV in Trixx w/ 50% power. Still on the stock XFX 290 black edition BIOS.

I still run 1200/1500 +87mV +50% power for eyefinity gaming. Hasn't given me issues in any new titles. If it's degraded it is minor and only at the brink.

I did run DDU and checked disable auto driver install in the DDU menu. It appears that it ran on 15.7.1 for a couple benchmarks until I rebooted to alter the RAM settings in the BIOS...how is that possible?


----------



## fat4l

Quote:


> Originally Posted by *Roboyto*
> 
> It still benches at 1305/1725, but artifacts and tears like crazy. 1295 has minimal issues.
> 
> I run +200mV in Trixx w/ 50% power. Still on the stock XFX 290 black edition BIOS.
> 
> I still run 1200/1500 +87mV +50% power for eyefinity gaming. Hasn't given me issues in any new titles. If it's degraded it is minor and only at the brink.
> 
> I did run DDU and checked disable auto driver install in the DDU menu. It appears that it ran on 15.7.1 for a couple benchmarks until I rebooted to alter the RAM settings in the BIOS...how is that possible?


No idea








Anyway, whats the max vcore voltage reported byafterburner/gpuz/ or any program ? With lets say +200mV trixx in fs.


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> Elpida?
> 
> Close match.
> http://www.3dmark.com/compare/fs/1465417/fs/5526928


yes, Elpida.

Quote:


> Originally Posted by *megax05*
> 
> That score is more than what gtx 980 can do i think.


stock 980 maybe.


----------



## megax05

Yes stock reference gtx 980 .
For the half of the price.


----------



## fyzzz

I'm switching back to my 290 for a while. I need to rma my 980 ti, i had bunch of issues with it and it had coil whine too.


----------



## Ironsight

Quote:


> Originally Posted by *kizwan*
> 
> This is using 290 TRI-X OC BIOS, so the core already 1000MHz. Memory & voltage (VDDC) was set in MSI AB. I only alter the "middle" memory frequency, following the tips by fyzzz & AliNT77 earlier. (I updated my previous post.)


Tried this and the idle voltage went up. Think it has sumething to do with my korean monitor.


----------



## AliNT77

Quote:


> Originally Posted by *Ironsight*
> 
> Tried this and the idle voltage went up. Think it has sumething to do with my korean monitor.


if u mean those spikes then i must say its norma


----------



## Roboyto

Quote:


> Originally Posted by *fat4l*
> 
> No idea
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway, whats the max vcore voltage reported byafterburner/gpuz/ or any program ? With lets say +200mV trixx in fs.


I'm really baffled how the hell the default Win10 drivers got installed...DDU ran in safe mode, it booted to low resolution, I manually installed the 15.7 drivers, rebooted and checked it in CCC to make sure it said 15.7. the first time I ran 3DMark...

Double checked the auto update settings and they are definitely off



Spoiler: I ain't crazy :D















Reran DDU and reinstalled drivers again...

Here's max values as shown in GPU-Z for Fire Strike

http://www.3dmark.com/3dm/9205561?



And 3DMark11

http://www.3dmark.com/3dm11/10502869



I see the abnormal temperature spiking/reading issue is still around

Another strange issue that has happened twice now is a blackscreen/hard lock when using the Reset Button in Trixx to return clocks/volts/pwr to default. Anybody else come across a problem like this? @rdr09 @kizwan

About to uninstall and reinstall Trixx to see if it changes anything

New version of Trixx is pretty alright it seems. They have a hardware monitor built in now that is spot on with GPU-Z for readings.

One odd thing is I can't get the same clocks as before; 1295/1725 is now 1296/1724. Overall score down slightly; graphics down, physics up.



Blackscreen/Hard lock problem is still present with the newest version of Trixx


----------



## fat4l

Quote:


> Originally Posted by *Roboyto*
> 
> I'm really baffled how the hell the default Win10 drivers got installed...DDU ran in safe mode, it booted to low resolution, I manually installed the 15.7 drivers, rebooted and checked it in CCC to make sure it said 15.7. the first time I ran 3DMark...
> 
> Double checked the auto update settings and they are definitely off
> 
> 
> Spoiler: I ain't crazy :D
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Reran DDU and reinstalled drivers again...
> 
> Here's max values as shown in GPU-Z for Fire Strike
> http://www.3dmark.com/3dm/9205561?
> 
> 
> 
> 
> And 3DMark11
> http://www.3dmark.com/3dm11/10502869
> 
> 
> 
> 
> I see the abnormal temperature spiking/reading issue is still around
> 
> Another strange issue that has happened twice now is a blackscreen/hard lock when using the Reset Button in Trixx to return clocks/volts/pwr to default. Anybody else come across a problem like this? @rdr09
> @kizwan
> 
> About to uninstall and reinstall Trixx to see if it changes anything
> 
> New version of Trixx is pretty alright it seems. They have a hardware monitor built in now that is spot on with GPU-Z for readings.
> 
> One odd thing is I can't get the same clocks as before; 1295/1725 is now 1296/1724. Overall score down slightly; graphics down, physics up.
> 
> 
> 
> 
> Blackscreen/Hard lock problem is still present with the newest version of Trixx


Nice







1.422v max...thats about 1.45v DPM7 i would say.
I just realised it's actually you who inspired me to change the thermal pads on my ares 3









Anyway try HIS iTurbo. It should work fine and it even allow you to use +400mV (if you want to try







)

Could you please run EVV application to see the default DPM7(bios) voltage for your card ?
CLICK
I'm almost quite sure it's 1.25v.


----------



## Ironsight

Quote:


> Originally Posted by *AliNT77*
> 
> if u mean those spikes then i must say its norma


It wasn't spiking, it was just stuck at 1.25v. I assume since it was pretty much the same bios I wouldn't have to reinstall all the AMD drivers again but maybe that's the problem.


----------



## Roboyto

Quote:


> Originally Posted by *fat4l*
> 
> Nice
> 
> 
> 
> 
> 
> 
> 
> 1.422v max...thats about 1.45v DPM7 i would say.
> I just realised it's actually you who inspired me to change the thermal pads on my ares 3
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway try HIS iTurbo. It should work fine and it even allow you to use +400mV (if you want to try
> 
> 
> 
> 
> 
> 
> 
> )
> 
> Could you please run EVV application to see the default DPM7(bios) voltage for your card ?
> CLICK
> I'm almost quite sure it's 1.25v.


Yeah, I'm the crazy thermal pad guy







How'd the Fuji's work out for ya? You use the extreme or ultras?

I will get to checking your BIOS voltage in a bit due to running into some new issues...

Trixx isn't allowing me to alter the RAM clocks too high now for some reason. Anything over 1600 and Trixx errors out/crashes.

Decided to give AfterBurner a try...for whatever reason if 3DMark is open prior to AB, the computer hard locks...no black screen/BSOD or anything like that, but it is completely unresponsive requiring a physical reset. At first it wouldn't hard lock until I actually clicked on AB to choose it as my primary window. I decided to uninstall/download/reinstall both 3DMark and AB. After doing so, the computer immediately hard locks when AB opens..I don't have to do/click on anything.

As strange as that was, I figured whatever...I'll just open AB first and then 3DMark. First thing I noticed is AB only allows 1275/1625 for maximum clocks? I'm not super familiar with AB because I've always had issues with it, since my 4890, is that normal for the clock limitations? I do have "Extend Official Overclock Limits" enabled in the options.

Maxed out the sliders for everything: volts, power limit, aux volts, core 1275/1625 memory. Fire up FS and holy hell the artifacts! Duh...only +100mV. Toned the clocks down to 1200/1500 which has always been stable in Trixx @ 87mv 50% power. It made it through FS, but the coil whine was equivocal or possibly worse than what Trixx would emit @ 200mV: http://www.3dmark.com/3dm/9206606? Returned the aux voltage to stock and the coil whine was eliminated.

Turned RAM clocks up to max so 1200/1625 and FS ran flawlessly: http://www.3dmark.com/3dm/9206685?

Ran without flaw up to 1203 core; surpassing gave artifacts.

Gotta reboot to run that BIOS viewing program...

You are correct sir. Default BIOS voltage @ 1.25V

Now time to try HIS iTurbo...which of course the download link through HIS website is currently inoperable...I've had enough BS for the day. Going to go clean up the wiring in my HTPC...I'll be back later


----------



## rdr09

@Roboyto,

a second of blackscreen does happen on my systems both for oc'ing and setting back to default using Trixx. no hard locks, though.


----------



## kizwan

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *fat4l*
> 
> No idea
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway, whats the max vcore voltage reported byafterburner/gpuz/ or any program ? With lets say +200mV trixx in fs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm really baffled how the hell the default Win10 drivers got installed...DDU ran in safe mode, it booted to low resolution, I manually installed the 15.7 drivers, rebooted and checked it in CCC to make sure it said 15.7. the first time I ran 3DMark...
> 
> Double checked the auto update settings and they are definitely off
> 
> 
> Spoiler: I ain't crazy :D
Click to expand...

That settings doesn't work. Windows update still can update the drivers regardless the settings there.
Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Reran DDU and reinstalled drivers again...
> 
> Here's max values as shown in GPU-Z for Fire Strike
> http://www.3dmark.com/3dm/9205561?
> 
> 
> 
> 
> And 3DMark11
> http://www.3dmark.com/3dm11/10502869
> 
> 
> 
> 
> I see the abnormal temperature spiking/reading issue is still around
> 
> 
> 
> Another strange issue that has happened twice now is a blackscreen/hard lock when using the Reset Button in Trixx to return clocks/volts/pwr to default. Anybody else come across a problem like this? @rdr09
> @kizwan
> 
> About to uninstall and reinstall Trixx to see if it changes anything
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> New version of Trixx is pretty alright it seems. They have a hardware monitor built in now that is spot on with GPU-Z for readings.
> 
> One odd thing is I can't get the same clocks as before; 1295/1725 is now 1296/1724. Overall score down slightly; graphics down, physics up.
> 
> 
> 
> 
> Blackscreen/Hard lock problem is still present with the newest version of Trixx


Yeah, it happened to me from time to time with Trixx too, but so far so good with the latest version. You can try closing GPU-Z and any monitoring software prior to clicking the reset button. I think this workaround worked for me.

By the way, the drivers that Windows 10 keeps wanting to install are the latest AMD drivers, currently the 15.11 beta (15.201.1151). DDU only temporarily prevents Windows Update from updating the driver; it will still be able to auto-update on the next boot. There's a tool from Microsoft that can hide the update, but it's temporary, and I'm pretty sure that when a new update becomes available, Windows Update will auto-update the drivers again.

Follow these steps:

Use DDU to disable auto-update & uninstall the drivers
Install the drivers of your choosing (follow the on-screen instructions, including a reboot if the installer asks for it)
Use this tool (https://support.microsoft.com/en-us/kb/3073930) to hide the driver update.
Quote:


> Originally Posted by *Ironsight*
> 
> Quote:
> 
> 
> 
> Originally Posted by *AliNT77*
> 
> If you mean those spikes, then I must say it's normal
> 
> 
> 
> It wasn't spiking, it was just stuck at 1.25v. I assume since it was pretty much the same bios I wouldn't have to reinstall all the AMD drivers again but maybe that's the problem.
Click to expand...

Was the memory running at max clock? I think that's the reason why.


----------



## Matt-Matt

Alright I'm back again!

Just wondering what people run their cards at voltage wise?

Right now I've got 1179/1450 stable with +94mV... Should I put an ASUS BIOS on and push further, or just leave it?

Temps are in check; it stays under 50°C most of the time on the core/VRMs.


----------



## YellowBlackGod

I run my 290X/8GB at 1090/1520 stable at stock volts. I have also tried 1100/1520, but it gives me some artifacts in Firestrike and some games. I will try again with the Crimson drivers, which are said to have some performance improvements, and with a DP cable as well, now that my FreeSync monitor is on the way.


----------



## Matt-Matt

Quote:


> Originally Posted by *YellowBlackGod*
> 
> I run my 290X/8GB at 1090/ 1520 stable at stock volts. I have tried also with 1100/1520, but it gives me some artifacts in Firestrike and some games. I will try again with the Crimson Drivers which are said to have some performance improvements and using a DP cable as well, now that my Freesync Monitor is on the way.


Yeah I'm keen for Crimson drivers too!


----------



## AliNT77

@roboyto
To fix the blackscreen caused by the reset button in Trixx: first decrease the memory clock, then press reset.

The reason is that when you press the reset button in Trixx, it sets the stock voltage first and the memory clock second. Stock voltage is obviously not enough for your memory OC, and that causes the blackscreen.
(Though if you set a middle clock for the memory, it will never happen)
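The ordering problem can be modeled in a few lines. This is a toy Python sketch of the sequence described above, with hypothetical setter logic standing in for whatever Trixx actually does internally — not Sapphire's real code:

```python
# Toy model of the reset-ordering issue described above.
# All values and functions are hypothetical stand-ins, not Trixx's real API.

STOCK_MV = 0          # stock voltage offset (mV)
STOCK_MEM = 1250      # stock memory clock (MHz)

def unsafe_reset(state):
    """Mimics the reported behavior: voltage drops to stock FIRST,
    while the memory is still at its overclock -> potential blackscreen."""
    state["mv"] = STOCK_MV
    if state["mem"] > STOCK_MEM and state["mv"] == STOCK_MV:
        state["blackscreen"] = True      # OC'd memory at stock volts
    state["mem"] = STOCK_MEM
    return state

def safe_reset(state):
    """Lower the memory clock first, then the voltage."""
    state["mem"] = STOCK_MEM
    state["mv"] = STOCK_MV
    return state

oc = {"mem": 1625, "mv": 200, "blackscreen": False}
print(unsafe_reset(dict(oc))["blackscreen"])  # True
print(safe_reset(dict(oc))["blackscreen"])    # False
```

The fix is just swapping the order of the two assignments — which matches the manual workaround of dropping the memory clock yourself before hitting reset.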


----------



## Gumbi

Quote:


> Originally Posted by *AliNT77*
> 
> @roboyto
> To fix the blackscreen caused by the reset button in Trixx: first decrease the memory clock, then press reset.
> 
> The reason is that when you press the reset button in Trixx, it sets the stock voltage first and the memory clock second. Stock voltage is obviously not enough for your memory OC, and that causes the blackscreen.
> (Though if you set a middle clock for the memory, it will never happen)


Yep, it's a silly design. It applies voltage before clock speeds when it should be the other way around.


----------



## AliNT77

Quote:


> Originally Posted by *Gumbi*
> 
> Yep, it's silly design. It applies voltage before clock speeds when it should be the other way around.


Yeah, Trixx needs a lot of work


----------



## fat4l

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Yeah, I'm the crazy thermal pad guy
> 
> 
> 
> 
> 
> 
> 
> How'd the Fuji's work out for ya? You use the extreme or ultras?
> 
> I will get to checking your BIOS voltage in a bit due to running into some new issues...
> 
> Trixx isn't allowing me to alter the RAM clocks too high now for some reason. Anything over 1600 and Trixx errors out/crashes.
> 
> Decided to give AfterBurner a try...for whatever reason if 3DMark is open prior to AB, the computer hard locks...no black screen/BSOD or anything like that, but it is completely unresponsive requiring a physical reset. At first it wouldn't hard lock until I actually clicked on AB to choose it as my primary window. I decided to uninstall/download/reinstall both 3DMark and AB. After doing so, the computer immediately hard locks when AB opens..I don't have to do/click on anything.
> 
> As strange as that was, I figured whatever...I'll just open AB first and then 3DMark. First thing I noticed is AB only allows 1275/1625 for maximum clocks? I'm not super familiar with AB because I've always had issues with it, since my 4890, is that normal for the clock limitations? I do have "Extend Official Overclock Limits" enabled in the options.
> 
> Maxed out the sliders for everything: volts, power limit, aux volts, core 1275/1625 memory. Fire up FS and holy hell the artifacts! Duh...only +100mV. Toned the clocks down to 1200/1500 which has always been stable in Trixx @ 87mV 50% power. It made it through FS, but the coil whine was equivalent to or possibly worse than what Trixx would emit @ 200mV: http://www.3dmark.com/3dm/9206606? Returned the aux voltage to stock and the coil whine was eliminated.
> 
> Turned RAM clocks up to max so 1200/1625 and FS ran flawlessly: http://www.3dmark.com/3dm/9206685?
> Ran without flaw up to 1203 core; surpassing gave artifacts.
> 
> Gotta reboot to run that BIOS viewing program...
> 
> You are correct sir. Default BIOS voltage @ 1.25V
> 
> Now time to try HIS iTurbo...which of course the download link through HIS website is currently inoperable...I've had enough BS for the day. Going to go clean up the wiring in my HTPC...I'll be back later


I used Fujipoly Extreme Plus which are 14W/mK.
My results were Δ7C-Δ21C.
http://www.overclock.net/t/1576426/fujipoly-thermal-pads-vrm-insanely-low-temps-must-have-recommended/0_30









However, I don't like my core temps... it can get to 60°C with a water temp of ~25°C while running at about +200mV, 1200MHz.
With my old 295X2 and an Aquacomputer kryographics waterblock, my temps would maybe be ~50°C with this setup.
I think it's the Ares3-EK waterblock's fault.

Anyway, my AB allows me 1625MHz mem max too. I also have "extended oc limits" ticked, but it's still 1625...

here is HIS_iTurbo.rar


----------



## Matt-Matt

Quote:


> Originally Posted by *AliNT77*
> 
> Yeah trixx needs a lot of work


The one thing with Trixx is that it can do up to +200mv, whereas afterburner cannot.
Version 5.0 is rubbish as you can't manually enter clocks...

If anyone can show me how to get +200mv in afterburner that'd be great and I'd love you forever...









Anyway, just validated 1220/1500MHz on valley and got 68.3 Avg FPS which is nice.


----------



## zealord

Looks like Fallout 4 is another game where the 290(X) performs about 25-30% lower than it should be
















http://www.pcgameshardware.de/Fallout-4-Spiel-18293/Specials/Test-Benchmark-vor-Release-1177284/


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Matt-Matt*
> 
> The one thing with Trixx is that it can do up to +200mv, whereas afterburner cannot.
> Version 5.0 is rubbish as you can't manually enter clocks...
> 
> If anyone can show me how to get +200mv in afterburner that'd be great and I'd love you forever...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway, just validated 1220/1500MHz on valley and got 68.3 Avg FPS which is nice.


Nice one Matt

Hey silly question but im curious to know that if I gots me a 390 will it play nice with my 290's , being a 8gb vram card ??

Well Im off to sleepy bye byes Cyas


----------



## Ha-Nocri

Quote:


> Originally Posted by *zealord*
> 
> Looks like Fallout 4 is another game where the 290(X) performs about 25-30% lower than it should be
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.pcgameshardware.de/Fallout-4-Spiel-18293/Specials/Test-Benchmark-vor-Release-1177284/


Yes, bad performance, will have to lower some settings it seems


----------



## Matt-Matt

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Nice one Matt
> 
> Hey silly question but im curious to know that if I gots me a 390 will it play nice with my 290's , being a 8gb vram card ??
> 
> Well Im off to sleepy bye byes Cyas


Cheers! Time for me to sleep as well!

With AMD CrossFire, the effective VRAM is that of the card with the least memory, so in your case you'll have only 4GB available to both, but it'll run fine.
Matching the clocks of the cards makes it run a bit nicer in my experience with older AMD cards, and there's no reason not to match them either way.

Just had a few rounds of COD BO2.. Seems stable


----------



## fat4l

Quote:


> Originally Posted by *Matt-Matt*
> 
> The one thing with Trixx is that it can do up to +200mv, whereas afterburner cannot.
> Version 5.0 is rubbish as you can't manually enter clocks...
> 
> If anyone can show me how to get +200mv in afterburner that'd be great and I'd love you forever...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway, just validated 1220/1500MHz on valley and got 68.3 Avg FPS which is nice.


Try HIS iTurbo. I uploaded it just one post above yours...
It lets you change VDDCI (aux) too... and +400mV for the core, should you want to


----------



## Gumbi

Quote:


> Originally Posted by *Matt-Matt*
> 
> The one thing with Trixx is that it can do up to +200mv, whereas afterburner cannot.
> Version 5.0 is rubbish as you can't manually enter clocks...
> 
> If anyone can show me how to get +200mv in afterburner that'd be great and I'd love you forever...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway, just validated 1220/1500MHz on valley and got 68.3 Avg FPS which is nice.


How much voltage for a 1220MHz core? I believe a simple edit of a file allows more than a +100mV overvolt in Afterburner, but I haven't bothered looking into it.


----------



## kizwan

Quote:


> Originally Posted by *Matt-Matt*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *AliNT77*
> 
> Yeah trixx needs a lot of work
> 
> 
> 
> 
> 
> 
> 
> The one thing with Trixx is that it can do up to +200mv, whereas afterburner cannot.
> Version 5.0 is rubbish as you can't manually enter clocks...
> 
> If anyone can show me how to get +200mv in afterburner that'd be great and I'd love you forever...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyway, just validated 1220/1500MHz on valley and got 68.3 Avg FPS which is nice.
Click to expand...

That score is a little bit low though. Was your CPU running at stock?

BTW, to get more than +200mV in Afterburner, you can check the info in the first post.


----------



## Gumbi

Quote:


> Originally Posted by *kizwan*
> 
> That score is a little bit low though. Was your CPU running at stock?


Agreed. Did you possibly power/thermal throttle? How were temps? (Core/VRM?) Did you have plus 50% set on the power limit?


----------



## Roboyto

Quote:


> Originally Posted by *rdr09*
> 
> @Roboyto,
> 
> A second of blackscreen does happen on my systems, both for OC'ing and setting back to default using Trixx. No hard locks, though.


Mine had always done that as well with the older version. Switched to the newest one and it isn't cooperating at all. Thanks for the input.

Quote:


> Originally Posted by *kizwan*
> 
> *That settings doesn't work. Windows update still can update the drivers regardless the settings there.*
> Yeah, happened to me from time to time with Trixx too but so far so good with latest version. You can try closing the GPU-Z and any monitoring software prior clicking the reset button. This workaround worked for me I think.
> 
> By the way, the drivers that Windows 10 keep want to update is the AMD latest drivers which is currently 15.11 beta drivers (15.201.1151). DDU only temporarily prevent the windows update from updating the driver. *Windows update still be able to auto-update drivers in the next boot.* There's tool from Microsoft that can temporarily hide the update. It's temporary & I'm pretty sure when new update available, windows update will be able to auto-update the drivers.
> 
> Follow this steps:-
> 
> Use DDU to disable auto-update & uninstall the drivers
> Install the drivers of your choosing (follow the on-screen instructions, including reboot if the installer ask it)
> Use this tool (https://support.microsoft.com/en-us/kb/3073930) to hide the drivers update.


Well, I'm glad I'm not crazy...that is the dumbest thing I've seen in a while. Thanks for the info.

Quote:


> Originally Posted by *AliNT77*
> 
> @roboyto
> To fix the blackscreen caused by the reset button in Trixx: first decrease the memory clock, then press reset.
> 
> The reason is that when you press the reset button in Trixx, it sets the stock voltage first and the memory clock second. Stock voltage is obviously not enough for your memory OC, and that causes the blackscreen.
> (Though if you set a middle clock for the memory, it will never happen)


I had a sneaking suspicion while pondering this at work today. I remember running into this issue in the opposite direction when I was benching my card; setting power/voltage before clock speeds. Thanks for the input.

Anyone have any idea why the combination of 3DMark and Afterburner is causing lockups? I don't have it set to auto-apply any overclocks/voltages...thinking about a fresh windows install if this weird sh..stuff keeps happening

Quote:


> Originally Posted by *fat4l*
> 
> I used Fujipoly Extreme Plus which are 14W/mK.
> My results were Δ7C-Δ21C.
> http://www.overclock.net/t/1576426/fujipoly-thermal-pads-vrm-insanely-low-temps-must-have-recommended/0_30
> 
> 
> 
> 
> 
> 
> 
> 
> 
> However I don't like my core temps...It can get to 60C with water temp of ~25C while running at about +200mV, 1200MHz.
> With my old 295X2 and aquacomputer kryographics waterblock my temps would be ~50C maybe with this setup.
> I think it's the ares3-EK waterblock's fault.
> 
> Anyway, my AB allows me 1625mhz mem max too. I also have "extended oc limits" ticked but it's still 1625....
> 
> here is HIS_iTurbo.rar


Could the higher temps on the Ares be due to higher voltage overall? Or did it have the same default as the stock 295X2?

Great results with Fuji pads as usual









Never used a Kryographics block yet. Had a couple of EK blocks for the 4890, 7870, and GTX 670, then switched to XSPC for the illuminated feature. Very pleased with their craftsmanship and performance overall. Not sure what brand I'll be using when Arctic Islands hits the streets









Thanks much for the HIS download. You know I had used it a little ways back when I had my Devil 270X in my HTPC...got lost between a couple SSD formats between now and then though.

Quote:


> Originally Posted by *Matt-Matt*
> 
> Alright I'm back again!
> 
> Just wondering what people run their cards at voltage wise?
> 
> Right now I've got 1179/1450 stable with +94mv.. Should I put an ASUS Bios on and push further or just leave it?
> 
> Temps are in check, stays under 50c most of the time on the core/VRM's.


There are diminishing returns for performance after 1200 core clock, and RAM speed doesn't help all that much collectively... I'd say leave it alone for everyday/gaming purposes. If you want to use the ASUS BIOS as a secondary for benching or something to push the envelope, that's another story.

I run 1200/1500 +87mV on my 290, and have for just about the last 2 years. It's been very good to me driving my 3x1 EyeFinity setup.


----------



## Matt-Matt

Quote:


> Originally Posted by *Gumbi*
> 
> How much voltage for 1220mhz core? I believe a simple edit of a file allows more than 100mv overvolt in Afterburner, but haven't bothered looking into it.


Quote:


> Originally Posted by *kizwan*
> 
> That score is a little bit low though. Was your CPU running at stock?
> 
> BTW, to get more than +200mV in Afterburner, you can check the info in the first post.


Quote:


> Originally Posted by *Roboyto*
> 
> Mine had always done that as well with the older version. Switched to the newest one and it isn't cooperating at all. Thanks for the input.
> 
> Well, I'm glad I'm not crazy...that is the dumbest thing I've seen in a while. Thanks for the info.
> 
> I had a sneaking suspicion while pondering this at work today. I remember running into this issue in the opposite direction when I was benching my card; setting power/voltage before clock speeds. Thanks for the input.
> 
> Anyone have any idea why the combination of 3DMark and Afterburner is causing lockups? I don't have it set to auto-apply any overclocks/voltages...thinking about a fresh windows install if this weird sh..stuff keeps happening
> 
> Higher temps on the Ares could be due to higher voltage overall? Or did they have same default as stock 295X2?
> 
> Great results with Fuji pads as usual
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Never used a Kryographics block yet. Had a couple EK blocks for 4890, 7870, GTX670, and then switched to XSPC for the illuminated feature. Very pleased with their craftmanship and performance overall. Not sure what brand I'll be using when Arctic Islands hit the streets
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks much for the HIS download. You know I had used it a little ways back when I had my Devil 270X in my HTPC...got lost between a couple SSD formats between now and then though.
> 
> There are diminishing returns for performance after 1200 core clock, and RAM speed doesn't help all that much collectively... I'd say leave it alone for everyday/gaming purposes. If you want to use the ASUS BIOS as a secondary for benching or something to push the envelope, that's another story.
> 
> I run 1200/1500 +87mV on my 290, and have for just about the last 2 years. It's been very good to me driving my 3x1 EyeFinity setup.


Running 1220/1500 at +130mV, which equates to around 1.28V maximum in GPU-Z.

I should probably run 1179/1450 for everyday clocks; I'm still within temperature and voltage ranges.

Also, I missed the overvolt info in the first post, thanks guys!

CPU is stock ATM, yes


----------



## EpicOtis13

So I installed Trixx instead of Afterburner, and now I can bench the card at 1200/1525 with no artifacting and get a decent Fire Strike score. But when I try to play Battlefield 4 using Mantle at those clocks, the game freezes and I have to hard-restart my PC. The card is at 1200/1525 with +100mV on the core and +50 power limit; is there a way to get it stable? I also want to try 1225 and 1250 on the core, and 1550 and 1575 on the VRAM, to see how my cards fare at higher clocks.


----------



## mus1mus

You first need to know how high your card can go, by benching. Gaming will always be clocked lower.


----------



## MEC-777

Ok, so I tried undervolting my 290's and I'd like to know if this is normal or how low the average 290 can go at stock clocks.

I was able to drop it to -37mV using MSI AB and ran Project Cars for a good 2 hours totally smooth and problem free. Both cards exhibited about the same temps maybe 1 degree cooler, tough to say as ambient can fluctuate quite a bit where I have my PC.

I now have it set to -44mV and I'm just wondering how low any of you have been able to drop the voltage while maintaining stable stock clocks?


----------



## rdr09

Quote:


> Originally Posted by *MEC-777*
> 
> Ok, so I tried undervolting my 290's and I'd like to know if this is normal or how low the average 290 can go at stock clocks.
> 
> I was able to drop it to -37mV using MSI AB and ran Project Cars for a good 2 hours totally smooth and problem free. Both cards exhibited about the same temps maybe 1 degree cooler, tough to say as ambient can fluctuate quite a bit where I have my PC.
> 
> I now have it set to -44mV and I'm just wondering how low any of you have been able to drop the voltage while maintaining stable stock clocks?


I can't wait to do the same. I wish someone would check the power use from the wall after getting stable at the lowest voltage achievable.

I'm really interested because my UPS is only good for 900W, and I'm planning on putting a car battery on it to power my CrossFire setup.

+rep
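For a rough back-of-envelope estimate while waiting on real wall measurements: dynamic power scales roughly with V²·f, so at fixed clocks a small undervolt gives a predictable ballpark saving. A hedged Python sketch — the 1.25V stock voltage and 250W board power are assumptions, and leakage and VRM/PSU losses are ignored:

```python
# Back-of-envelope estimate of GPU power savings from undervolting.
# Dynamic power scales roughly with V^2 * f; leakage and VRM losses
# are ignored, so treat the result as a ballpark, not a measurement.

def undervolt_power_ratio(v_stock, v_new):
    """Ratio of new dynamic power to stock, at unchanged clocks."""
    return (v_new / v_stock) ** 2

v_stock = 1.250           # assumed stock VDDC (V)
v_new = v_stock - 0.044   # the -44mV offset mentioned above

ratio = undervolt_power_ratio(v_stock, v_new)
# With an assumed ~250W card, per-card saving before PSU losses:
watts_saved = 250 * (1 - ratio)
print(f"{ratio:.3f} -> ~{watts_saved:.0f}W saved per card")
```

So a -44mV undervolt would shave only on the order of 15-20W per card by this crude model; an actual kill-a-watt reading would settle it.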


----------



## mus1mus

By the way guys, for those who are into BIOS editing and can read HEX codes









Memory timings on both have been altered. Same card.

http://www.3dmark.com/compare/3dm11/10508433/3dm11/10508382


----------



## AliNT77

Quote:


> Originally Posted by *MEC-777*
> 
> Ok, so I tried undervolting my 290's and I'd like to know if this is normal or how low the average 290 can go at stock clocks.
> 
> I was able to drop it to -37mV using MSI AB and ran Project Cars for a good 2 hours totally smooth and problem free. Both cards exhibited about the same temps maybe 1 degree cooler, tough to say as ambient can fluctuate quite a bit where I have my PC.
> 
> I now have it set to -44mV and I'm just wondering how low any of you have been able to drop the voltage while maintaining stable stock clocks?


If you edit the BIOS like I showed earlier, you can undervolt down to -180mV with a stock core clock and the memory fully OC'd.

I'm running my card at 950/1625, -153mV, -33% power limit, and I get ~3% higher performance than everything at stock.


----------



## Gumbi

Quote:


> Originally Posted by *AliNT77*
> 
> If you edit the BIOS like I showed earlier, you can undervolt down to -180mV with a stock core clock and the memory fully OC'd.
> 
> I'm running my card at 950/1625, -153mV, -33% power limit, and I get ~3% higher performance than everything at stock.


I don't understand how your memory controller is still stable with so little voltage. My card can do 1640MHz memory, but only if I give the core +100mV.

Great results btw!


----------



## MEC-777

Quote:


> Originally Posted by *AliNT77*
> 
> If you edit the BIOS like I showed earlier, you can undervolt down to -180mV with a stock core clock and the memory fully OC'd.
> 
> I'm running my card at 950/1625, -153mV, -33% power limit, and I get ~3% higher performance than everything at stock.


Not really interested in modding the BIOS at this point. I tinkered with BIOS modding a little while back, saw very minor gains versus the time I spent testing, and decided it wasn't worth it. I'd prefer to see what can be done with just MSI AB. That is pretty cool, nonetheless.









Makes you wonder: if each vendor had spent more R&D time on the BIOS for these Hawaii and Grenada GPUs and gotten them running more efficiently, maybe they wouldn't have received such a bad rep in their early days. Though the reference coolers were pretty bad anyway.








Quote:


> Originally Posted by *Gumbi*
> 
> I don't understand how your memory controller is still stable with so little voltage. My card can do 1640mhz memory but only if I give the core 100mv.
> 
> Great results btw!


That's what BIOS modding can do for you.







Not every card is the same, though. What each card is capable of depends on a number of factors.

________________

Going to see how much lower I can go on the voltage tonight. I think the reason I'm seeing almost the same temps is the way I have the fan speed set up. The cards are probably producing slightly less heat, but the fans are just running slightly slower and maintaining nearly the same temps. I'm fine with that; the quieter the better.


----------



## fyzzz

Well, this happened








http://www.3dmark.com/3dm/9225912?
And still testing


----------



## Gumbi

Wow, what a beast. Water? Also, Hynix memory I'm guessing? Beats my 290X at 1240/1640 (14.8k ~)


----------



## gatygun

Quote:


> Originally Posted by *fyzzz*
> 
> Well, this happened
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/9225912?
> And still testing


You won the 290 lottery, my friend. God damn lol.


----------



## AliNT77

ran this with my 24/7 gaming profile:


















The actual voltage during the test hovers around 0.992-1.018V
















Ran this @stock:


Temps @stock can go higher than what you see, but undervolted it sits at those numbers


----------



## megax05

Yeah, that is one beast of a card. Mine can barely hit 1200/1600 @ +100mV and maxed power limit.


----------



## fyzzz

Quote:


> Originally Posted by *Gumbi*
> 
> Wow, what a beast. Water? Also, Hynix memory I'm guessing? Beats my 290X at 1240/1640 (14.8k ~)


Yup, water and an open window with 7°C outside. Modded 390 BIOS, and the card has Hynix BFR memory, yes. On the stock Sapphire 290 BIOS it can do 1270/1780, which gives about 14.7k in score.


----------



## Gumbi

Quote:


> Originally Posted by *fyzzz*
> 
> Yup, water and an open window with 7°C outside. Modded 390 BIOS, and the card has Hynix BFR memory, yes. On the stock Sapphire 290 BIOS it can do 1270/1780, which gives about 14.7k in score.


Innnnteresting. I might look into using a modified BIOS myself on my 390X. It seems like the memory timing tweaks give a nice boost in performance!


----------



## fyzzz

Quote:


> Originally Posted by *Gumbi*
> 
> Innnnteresting. I might look into using a modified BIOS myself on my 390X. It seems like the memory timing tweaks give a nice boost in performance!


Yeah, it gives a couple of hundred points I would imagine, depending on how tight the timings you can run are. But my card responded well to the 390 BIOS, and that's probably where most of my performance came from.


----------



## fat4l

Quote:


> Originally Posted by *fyzzz*
> 
> Yeah, it gives a couple of hundred points I would imagine, depending on how tight the timings you can run are. But my card responded well to the 390 BIOS, and that's probably where most of my performance came from.


What's up with those BFR timings?
Are they better than the 1250 timings in the 1500+ strap?


----------



## megax05

Quote:


> Originally Posted by *fyzzz*
> 
> Yeah, it gives a couple of hundred points I would imagine, depending on how tight the timings you can run are. But my card responded well to the 390 BIOS, and that's probably where most of my performance came from.


I have two R9 290 Tri-X (old version) cards with the reference PCB and Hynix VRAM. Do you have a modded BIOS that will work with those? I don't OC for daily usage; I run the stock 1000/1300 on both cards because temps here in Saudi Arabia are insane. With AC I can manage to keep my ambient temps to 25°C; without AC it can hit 35°C indoors.
I could use the extra performance at stock clocks for my new 4K monitor, which I expect to get within two weeks from Korea.


----------



## fyzzz

Quote:


> Originally Posted by *fat4l*
> 
> What's up with those BFR timings?
> Are they better than the 1250 timings in the 1500+ strap?


The AFR timings and BFR timings are a bit different, but there is no difference performance-wise between them. So no, no gains there. But I would almost say the BFR timings are more stable for me, which makes sense since I have BFR memory. The only BIOS where I can reach a 1780 memory clock is the Sapphire Tri-X BIOS, which has the BFR timings.


----------



## fat4l

Quote:


> Originally Posted by *fyzzz*
> 
> The AFR timings and BFR timings are a bit different, but there is no difference performance-wise between them. So no, no gains there. But I would almost say the BFR timings are more stable for me, which makes sense since I have BFR memory. The only BIOS where I can reach a 1780 memory clock is the Sapphire Tri-X BIOS, which has the BFR timings.


And are you using non-modified straps (in terms of timings from lower straps) to reach that frequency?


----------



## fyzzz

Quote:


> Originally Posted by *fat4l*
> 
> And are you using non-modified straps (in terms of timings from lower straps) to reach that frequency?


Yes, on a completely stock sapphire tri-x bios, it can reach that memory speed.


----------



## fyzzz

Quote:


> Originally Posted by *megax05*
> 
> I have two R9 290 Tri-X (old version) cards with the reference PCB and Hynix VRAM. Do you have a modded BIOS that will work with those? I don't OC for daily usage; I run the stock 1000/1300 on both cards because temps here in Saudi Arabia are insane. With AC I can manage to keep my ambient temps to 25°C; without AC it can hit 35°C indoors.
> I could use the extra performance at stock clocks for my new 4K monitor, which I expect to get within two weeks from Korea.


I don't have a modded Tri-X BIOS, but I can make one for you. You would get a bit more performance, but probably not much.


----------



## fat4l

Quote:


> Originally Posted by *fyzzz*
> 
> Yes, on a completely stock sapphire tri-x bios, it can reach that memory speed.


Aha.
My point is, don't the 1250 timings in the 1500+ strap bring better results?


----------



## fyzzz

Quote:


> Originally Posted by *fat4l*
> 
> aha.
> my point is, don't 1250 timings in 1500+ strap bring better results ?


Yes, 1250 timings in any higher strap perform better.


----------



## megax05

Quote:


> Originally Posted by *fyzzz*
> 
> I don't have a modded trix bios, but I can make one for you. You would get a bit more performance, but probably not much.


Man, that would be awesome. Do you need me to upload my cards' stock BIOSes so you can mod them, or will you use one from the TPU BIOS database?


----------



## surfinchina

Maybe the answer is already somewhere in the 4030 pages...

I just got an EVGA Micro 2 with a 5960X CPU and kept my old R9 280X OC (Gigabyte).
It crashes if I set Auto or Gen 3 for PCIe in the BIOS.
Gen 2 is fine.
It cuts Cinebench down from 180 to 170, so not a huge deal, but I'm shopping for a new card and don't want to buy an R9 390X (I need the memory for CAD models) if I have to run it on Gen 2.

Anybody had this issue on other X99 boards?


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> Yes 1250 timings in any higher strap performs better.


Heya mate. Nice score.

How tight did you go?

Are you modding the memory straps as well?


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> Heya mate. Nice score.
> 
> How tight did you go?
> 
> Are you modding the memory straps as well?


Yes, I can run the 1126-1250 timings all the way up to 1625, but beyond that I need to loosen them, so I used the 1376-1500 timings in the 1626-1750 strap.
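The strap scheme being described — each memory-clock range carries its own timing set, and the mod copies a tighter set into a higher range — can be sketched roughly like this. The strap table and timing names below are placeholders, not real Hawaii BIOS data:

```python
# Sketch of the strap/timing lookup described above. Each strap covers a
# memory-clock range (MHz ceiling) and carries one timing blob; the card
# uses the first strap whose ceiling >= the running memory clock.
# Timing names here are placeholders, NOT real Hawaii BIOS bytes.

straps = [
    (1125, "tight-A"),
    (1250, "tight-B"),    # the well-tested "1250" timing set
    (1375, "medium"),
    (1500, "medium"),
    (1625, "loose"),
    (1750, "loose"),
]

def timings_for(mem_clock, table):
    """Return the timing blob the table selects for a given memory clock."""
    for ceiling, blob in table:
        if mem_clock <= ceiling:
            return blob
    raise ValueError("clock above highest strap")

# Stock: a 1625MHz memory OC picks the loose set.
print(timings_for(1625, straps))          # loose

# The mod: copy the 1250 set into every higher strap, so an OC'd
# memory clock still runs the tighter (faster) timings.
modded = [(c, "tight-B" if c > 1250 else b) for c, b in straps]
print(timings_for(1625, modded))          # tight-B
```

This is why "1250 timings in the 1500+ strap" gains performance: the clock stays the same, but the lookup now returns the tighter set — at the cost of stability if the memory can't actually handle it.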


----------



## fat4l

So I finally have my 1250 timings in 1250-1749 straps.
Big thanks to Gupsterg!









My quick results with 1200/1500MHz CF enabled are:

*Performance*

*19854*


vs

*20008*


*Extreme*

*10364*


vs

*10588*


*Ultra*

*5603*


vs

*5703*


Will continue testing tomorrow


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> Yes, I can run the 1126-1250 timings all the way up to 1625, but beyond that I need to loosen them, so I used the 1376-1500 timings in the 1626-1750 strap.


Nice, what are the first two hex values of your target strap? Like 77/71, DD/D1, etc.

I have gone down as far as the 1000 strap on my Elpida for 1625.







but still, it's not a bencher.

http://www.3dmark.com/compare/3dm11/10508382/3dm11/10508433

This is the subject. If you can test it with yours.

DD/D1 vs 77/71. Same strap, just the first 2 hex values changed.


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> Nice, what are the first two HEX values of your target strap? Like, 77/71, DD/D1 etc.
> 
> I have went down as much as 1000 strap on my elpida for 1625.
> 
> 
> 
> 
> 
> 
> 
> but still, it's not a bencher.
> 
> http://www.3dmark.com/compare/3dm11/10508382/3dm11/10508433
> 
> This is the subject. If you can test it with yours.
> 
> DD/D1 vs 77/71. Same strap, just the first 2 hex values changed.


The timings I use start with 77 71, and I've noticed that in the Sapphire Tri-X BIOS the later straps start with 99 91. Is DD/D1 an Elpida thing?


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> Yes, i can have the 1126-1250 timings all the way up to 1625, but then i need to loosen the timings, so I used 1376-1500 timings in the 1626-1750 strap.


I believe so. BBBG_DEBUG.

In my limited tests, you can replace it with anything keeping the semantics.

DD scored better but limits the core. 77 clocks high but meh. Though, I noticed artifacts are non-existent on 77/71.

That may be the reason why some chips clock better but score less.


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> I believe so. BBBG_DEBUG.
> 
> In my limited tests, you can replace it with anything keeping the semantics.
> 
> DD scored better but limits the core. 77 clocks high but meh. Though, I noticed artifacts are non-existent on 77/71.
> 
> That may be the reason why some chips clock better but score less.


Hmm, interesting; this makes me want to test more timing-related stuff. Maybe that's why 290s vary so much in score and clock speed.


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> Hmm, interesting; this makes me want to test more timing-related stuff. Maybe that's why 290s vary so much in score and clock speed.


Yeah man. But yours is an exception.


----------



## fyzzz

Comparison between DD/D1 and 77/71, almost 150 points, not bad. Will test higher memory clock too.
http://www.3dmark.com/compare/fs/6455217/fs/6438355

A bit more testing and results at higher memory clock.
http://www.3dmark.com/compare/fs/6455389/fs/6455360/fs/6455324


----------



## mus1mus

You confirmed it, buddy!

+1

Are you testing it on your best profile?

Can you do the same to the lower straps as well? Say, down to 1250? And push the card to where it previously was, so we can get an idea of how it affects OC and stability.

hmmm. Might try FF as well.


----------



## megax05

Never thought a single R9 290 would hit 14-15k in Fire Strike.
I think AMD needs to change some of their BIOS and driver division staff.


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> You confirmed it buddy!.
> 
> +1
> 
> Are you testing it on your best profile?
> 
> Can you do the same to the lower straps as well? Say, down to 1250? And push the card to where it previously was, so we can get an idea of how it affects OC and stability.
> 
> hmmm. Might try FF as well.


I have edited all straps with DD/D1, I will also test my max clock and see how that goes.


----------



## mus1mus

Quote:


> Originally Posted by *megax05*
> 
> Never thought a single R9 290 would hit 14-15k in Fire Strike.
> I think AMD needs to change some of their BIOS and driver division staff.


His is an exceptional card. I couldn't stress it more.









But yeah, I almost made it there using the modified 300X bios. Maybe you should try it as well.

Quote:


> Originally Posted by *fyzzz*
> 
> I have edited all straps with DD/D1, I will also test my max clock and see how that goes.


Thanks mate. I'm rooting for you breaking your own record.

Let us know.


----------



## megax05

Nvidia hits hard from the start, unlike AMD. From what I see, AMD cards get better over time with BIOS fixes by users and prove to be the better cards in the long run, whereas Nvidia sorts everything out before releasing new cards.


----------



## YellowBlackGod

Quote:


> Originally Posted by *megax05*
> 
> Nvidia hits hard from the start, unlike AMD. From what I see, AMD cards get better over time with BIOS fixes by users and prove to be the better cards in the long run, whereas Nvidia sorts everything out before releasing new cards.


I fully agree.


----------



## mus1mus

Quote:


> Originally Posted by *megax05*
> 
> Nvidia hits hard from the start, unlike AMD. From what I see, AMD cards get better over time with BIOS fixes by users and prove to be the better cards in the long run, whereas Nvidia sorts everything out before releasing new cards.


Good point. I've seen it actually.


----------



## megax05

Quote:


> His is an exceptional card. I couldn't stress it more.


I know he won the lottery with this card, but what I'm saying is that, in general, if AMD had better BIOS and driver writers they could've gained a higher market share in the GPU division.
I will wait for my edited BIOS; I'm not planning on an insane OC, I've only got the Tri-X cooler.


----------



## rdr09

Quote:


> Originally Posted by *megax05*
> 
> Nvidia hits hard from the start, unlike AMD. From what I see, AMD cards get better over time with BIOS fixes by users and prove to be the better cards in the long run, whereas Nvidia sorts everything out before releasing new cards.


it is the start that matters most in sales. their marketing is great too, relatively speaking. imagine, they were able to sell so many 970s. those who felt they got duped upgraded to 980s. those who were not satisfied upgraded to 980Tis.


----------



## mus1mus

Quote:


> Originally Posted by *megax05*
> 
> I know he won the lottery with this card, but what I'm saying is that, in general, if AMD had better BIOS and driver writers they could've gained a higher market share in the GPU division.
> I will wait for my edited BIOS; I'm not planning on an insane OC, I've only got the Tri-X cooler.


BIOS and drivers are right where they should be. It just happens that Nvidia is more geared towards current demand, and they do better on that front.

As with any tech, it takes time to mature, and that is where AMD is doing well. You can almost always expect a better experience with each driver release — something Nvidia tends to screw up (in my experience).

And by the way, the BIOS that fyzzz has was developed outside the AMD camp. It took a few curious minds to start doing it, and a lot of people benefit from those mods. Not always a given, though; my Elpida card doesn't work well with the modded ones.

On another note, the modded 300 series BIOS lowered my temps too. At least as reported.


----------



## megax05

Quote:


> Originally Posted by *rdr09*
> 
> it is the start that matters most in sales. their marketing is great too, relatively speaking. imagine, they were able to sell so many 970s. those who felt they got duped upgraded to 980s. those who were not satisfied upgraded to 980Tis.


Indeed. Also, I undervolted my 290s to -60mV and was pulling just a bit over my GTX 970 in total system power draw.
Well, I hope Su will do something about that.


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> Thanks mate. I'm rooting for you breaking your own record.
> 
> Let us know.


1270 is about my max on the stock 290 BIOS without going too crazy with the artifacts. I would say the new timings affect stability a bit: 1270 was a mess, but 1260 was much better, and 1750 vs 1730 on the memory; the performance is pretty good, however. I must implement all of these things in the 390 BIOS, run the card a bit cooler, and see how that goes.
All test done on sapphire tri-x 290 bios:

14811 gpu score with 1260/1730
http://www.3dmark.com/3dm/9236988?

13705 gpu score with 1260/1250
http://www.3dmark.com/fs/6455608


----------



## rdr09

Quote:


> Originally Posted by *megax05*
> 
> Indeed. Also, I undervolted my 290s to -60mV and was pulling just a bit over my GTX 970 in total system power draw.
> Well, I hope Su will do something about that.


wait, let me get this straight . . . you are comparing a system with 2 290s and a system with a single 970?

i will try to undervolt and use BF4 MP64 as a test.


----------



## megax05

Quote:


> Originally Posted by *rdr09*
> 
> wait, let me get this straight . . . you are comparing a system with 2 290s and a system with a single 970?
> 
> i will try to undervolt and use BF4 MP64 as a test.


Just a single card at a time, and I tested both cards with almost identical results.


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> it is the start that matters most in sales. their marketing is great too, relatively speaking. imagine, they were able to sell so many 970s. those who felt they got duped upgraded to 980s. those who were not satisfied upgraded to 980Tis.


Their technique is actually pretty genius.

Release the 970/980 to boost performance a bit over the previous gen (people jumped on it right away). After selling all their ref models, release the Titan X ("this is the best card on the planet", evil laugh).

All stock deployed, release the 980 Ti.

Genius! You know how many people regretted buying the 980 when the Ti came out?








Quote:


> Originally Posted by *fyzzz*
> 
> 1270 is about my max on the stock 290 BIOS without going too crazy with the artifacts. I would say the new timings affect stability a bit: 1270 was a mess, but 1260 was much better, and 1750 vs 1730 on the memory; the performance is pretty good, however. I must implement all of these things in the 390 BIOS, run the card a bit cooler, and see how that goes.
> All test done on sapphire tri-x 290 bios:
> 
> 14811 gpu score with 1260/1730
> http://www.3dmark.com/3dm/9236988?
> 
> 13705 gpu score with 1260/1250
> http://www.3dmark.com/fs/6455608


1000 points up? Holy guacamole!

I shouldn't have told you.


----------



## fyzzz

A big thanks to @mus1mus for his awesome discovery of the dd/d1 timing. Here is the result of it:
http://www.3dmark.com/fs/6456169
I just keep breaking my record over and over again


----------



## fat4l

Quote:


> Originally Posted by *fyzzz*
> 
> A big thanks to @mus1mus for his awesome discovery of the dd/d1 timing. Here is the result of it:
> http://www.3dmark.com/fs/6456169
> I just keep breaking my record over and over again


Are these DD/D1 values valid for BFR?
My card can't do 1500MHz on the mems with 1250 timings...
1500MHz passes with no artifacts.
1550MHz fails: driver down + artifacts.

Any thoughts?


----------



## megax05

Quote:


> Originally Posted by *fyzzz*
> 
> A big thanks to @mus1mus for his awesome discovery of the dd/d1 timing. Here is the result of it:
> http://www.3dmark.com/fs/6456169
> I just keep breaking my record over and over again


Gratz man that is one awesome result.
Is this on a full cover block?


----------



## fyzzz

Quote:


> Originally Posted by *fat4l*
> 
> Are these DD/D1 values valid for BFR?
> My card can't do 1500MHz on the mems with 1250 timings...
> 1500MHz passes with no artifacts.
> 1550MHz fails: driver down + artifacts.
> 
> Any thoughts?


You have to put in less tight timings above 1500 then. DD/D1 will work for any memory; I have BFR and it works flawlessly. But it may lower your overclock slightly.
Quote:


> Originally Posted by *megax05*
> 
> Gratz man that is one awesome result.
> Is this on a full cover block?


Yes, it is under water, and I had the computer next to a window so it got cold air. I could probably hit the same overclock at ambient temps, but the lower temperature helps with stability.


----------



## gupsterg

Quote:


> Originally Posted by *mus1mus*
> 
> Nice, what are the first two HEX values of your target strap? Like, 77/71, DD/D1 etc.
> 
> I have went down as much as 1000 strap on my elpida for 1625.
> 
> 
> 
> 
> 
> 
> 
> but still, it's not a bencher.
> 
> http://www.3dmark.com/compare/3dm11/10508382/3dm11/10508433
> 
> This is the subject. If you can test it with yours.
> 
> DD/D1 vs 77/71. Same strap, just the first 2 hex values changed.


+Rep mus1mus, many thanks for posting this info







...
Quote:


> Originally Posted by *fyzzz*
> 
> A big thanks to @mus1mus for his awesome discovery of the dd/d1 timing. Here is the result of it:
> http://www.3dmark.com/fs/6456169
> I just keep breaking my record over and over again


WOW!









You've got one great card there!

Anyone checking the 3DMark database for the 290 is gonna weep when they don't get the same result for like clocks!

+Rep for your testing results







.

Now that I'm finally getting my DPM 2-6 set manually after a lot of testing gonna try out the new RAM timings mod







.

IIRC, when I was looking for other info I saw that atombios.h (the one from the latest AMD Linux drivers) has some info on what the timing hex values may mean, but there's still more to do to unravel them.

I guess the more people like mus1mus try new values, the more insight we'll get into them.

As you know from what I PM'd, there may be an MVDDC BIOS mod in the pipeline; I still haven't tested it. I'm waiting until Buildzoid has his card to hand, then I plan on sending him a test ROM so he can check via a DMM whether it had any effect.


----------



## battleaxe

You guys are making me want to see what my card can really do... I can hit 1280 on the core, but these memory timings make me consider trying harder on the RAM too. Makes me think. Too bad I'm so lazy.









If I got her colder I'm sure I could break 1300MHz on the core. Haven't really tried very hard yet, to be honest. Hmmm


----------



## GoLDii3

Does anyone know if you can change timings on other cards too? My 280X has Hynix.


----------



## gupsterg

Indeed you can; take a look at this thread: http://www.overclock.net/t/1554360/tahiti-memory-timings-patch-for-hynix-vram


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> You guys are making me want to see what my card can really do... I can hit 1280 on core, but these memory timings make me consider trying harder on the RAM too. Makes me think. Too bad I'm so lazy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If I got her colder I'm sure I could break 1300MHz on the core. Haven't really tried very hard yet, to be honest. Hmmm


Core clock is not the goal. The goal is a well-performing clock on both mem and core, along with the necessary timings.

See the posts above. What are you going to do with a 1300 core when it performs like a 1200?

But yeah, we are in the early stages of this, and the results are pointing us in the right direction.


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> A big thanks to @mus1mus for his awesome discovery of the dd/d1 timing. Here is the result of it:
> http://www.3dmark.com/fs/6456169
> I just keep breaking my record over and over again


I knew it! I was wrong to tell you all this. :sad:








Quote:


> Originally Posted by *gupsterg*
> 
> +Rep mus1mus, many thanks for posting this info
> 
> 
> 
> 
> 
> 
> 
> ...
> WOW!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You've got one great card there!
> 
> Anyone checking the 3dmark database for 290 is gonna weep when they don't get same result for like clocks!
> 
> +Rep for your testing results
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Now that I'm finally getting my DPM 2-6 set manually after a lot of testing gonna try out the new RAM timings mod
> 
> 
> 
> 
> 
> 
> 
> .
> 
> IIRC I swear when I was looking for other info I saw atombios.h (one from latest AMD linux drivers) it has some info regarding what the timings hex values may mean but still more to do to unravel them.
> 
> I guess the more that people like mus1mus try new values we'll get some insight into them perhaps.
> 
> As you know from what I PM'd there maybe a MVDDC bios mod in the pipeline, still not tested it. I'm waiting on when Buidzoid has his one to hand, then planning on sending him a test rom where he'll see via a DMM if it had effect.


Fun times ahead @gupsterg.


----------



## MEC-777

Really cool to see how high some of you are able to clock these 290's and score on the benchmarks. Very impressive.









I'm curious how they stack up against highly overclocked GTX 970's, which I understand, can clock quite a bit higher out of the box and without messing around with the BIOS...


----------



## fat4l

What is causing it, basically?

Also, when playing with timings and straps: can you apply 1250 timings at, for example, 1520MHz, or does it have to be 1499?


----------



## mus1mus

Quote:


> Originally Posted by *MEC-777*
> 
> Really cool to see how high some of you are able to clock these 290's and score on the benchmarks. Very impressive.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm curious how they stack up against highly overclocked GTX 970's, which I understand, can clock quite a bit higher out of the box and without messing around with the BIOS...


970 meh.

You can hang out with the 980 crowd on a well tuned 290/290X. 980 does OC quite well though.

You can do a search query on 3dmark to compare. Same platform is always a better way to compare cards.

As an example, I cannot beat my 290's scores at 1212/1720 using a 780 that overclocked like crazy at 1450/1750. And I think the 970 is close to what a 780 can do.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> Core clock is not the goal. The goal is a perfectly performing clock on both mem and core along with the necessary timings.
> 
> See the above posts. What you gonna do with a 1300 Core when it performs like a 1200?
> 
> But yeah, we are in the stages of this bit the results are pointing us in the right direction.


What?........

Edit: NVM... I think I misinterpreted your post. Disregard


----------



## zealord

Has anyone tested MGSV TPP with 15.11 beta driver?

I get ~1 fps everywhere, even in the menus; the game is utterly unplayable. There are also graphical glitches in Fallout 4 with 15.11.

With 15.7.1 everything is fine again.


----------



## GoLDii3

Quote:


> Originally Posted by *zealord*
> 
> Has anyone tested MGSV TPP with 15.11 beta driver?
> 
> I get ~1 fps everywhere, even in the menus; the game is utterly unplayable. There are also graphical glitches in Fallout 4 with 15.11.
> 
> With 15.7.1 everything is fine again.


Not happening for me. Playing both fine on 15.11

Time to use DDU


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> What?........
> 
> I wasn't making any statements about what you are talking about. Isn't this the 290/x thread? Am I not allowed to post here?
> 
> And why would 1300mhz perform like 1200mhz exactly? Relax a bit maybe... just a thought.


What are you on? lol.

Take a look at this, I said: http://www.3dmark.com/compare/3dm11/10508382/3dm11/10508433

And this:
http://www.3dmark.com/compare/fs/1465417/fs/5526928

Memory tuning does affect performance and core OC numbers. It's not the raw OC numbers that matter so much as the performance standpoint.


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> What are you on? lol.
> 
> Take a look at this, I said: http://www.3dmark.com/compare/3dm11/10508382/3dm11/10508433
> 
> And this:
> http://www.3dmark.com/compare/fs/1465417/fs/5526928
> 
> Memory tuning does affect performance and core OC numbers. It's not the raw OC numbers that matter so much as the performance standpoint.


So is there any rule on how to get the best performance?


----------



## zealord

Quote:


> Originally Posted by *GoLDii3*
> 
> Not happening for me. Playing both fine on 15.11
> 
> Time to use DDU


Hmm, well, you have a 280X according to your signature. Maybe 15.11 only has these problems with the 290(X).


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> What are you on? lol.
> 
> Take a look at this, I said: http://www.3dmark.com/compare/3dm11/10508382/3dm11/10508433
> 
> And this:
> http://www.3dmark.com/compare/fs/1465417/fs/5526928
> 
> Memory tuning does affect performance and core OC numbers. It's not about the OC numbers, more of performance standpoint that matters.


As mentioned... I edited my post because I misunderstood what you were saying. I figured out what you meant. Just didn't make sense at first.









Edit: and I'm on all sorts of things...


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> As mentioned... I edited my post because I misunderstood what you were saying. I figured out what you meant. Just didn't make sense at first.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: and I'm on all sorts of things...


I figured. All good, buddy. I hope no offense was taken either.

I'm a bit intoxicated as well, with a flu tab.









Actually, I'm a bit mad. You must be smoking something yet you are not sharing.















Quote:


> Originally Posted by *fat4l*
> 
> So is there any rule "how to" get the best performance ?


I guess figuring out an OC baseline on the stock BIOS is a good way to start, then squeezing out the rest with a BIOS mod.

You have two ways to do it, actually: tweak via Hawaii Bios Reader and/or edit via hex.

I gauge performance via 3DMark because I can compare scores on the same platform easily.
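One practical detail when editing via hex rather than a tool: after changing any bytes you must also correct the ROM's checksum, or flashing tools will typically reject the image. A sketch, assuming the usual ATOM VBIOS layout (image size in 512-byte units at offset 0x02, checksum byte at offset 0x21) — verify these offsets against your own ROM or tool before trusting them:

```python
ATOM_CHECKSUM_OFFSET = 0x21  # assumed ATOM VBIOS checksum-byte offset

def fix_checksum(vbios: bytearray) -> int:
    """Adjust the checksum byte so every byte of the ROM image
    sums to 0 mod 256 (the PCI option-ROM rule). Returns the
    new checksum byte."""
    size = vbios[0x02] * 512          # image size in 512-byte units
    vbios[ATOM_CHECKSUM_OFFSET] = 0   # zero it before summing
    total = sum(vbios[:size]) & 0xFF
    vbios[ATOM_CHECKSUM_OFFSET] = (-total) & 0xFF
    return vbios[ATOM_CHECKSUM_OFFSET]
```

Tools like Hawaii Bios Reader handle this for you; it only bites when you patch bytes directly in a hex editor.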


----------



## kizwan

Looks like a bench shootout is commencing, but I can only join on the low side.









Unfortunately my second (Hynix) card hit a wall at 1180 on the core with the modded 390 BIOS. My method is more of a brute force: 1180 already needs +100mV, and combined with the memory OC it needs even more, so I just maxed out at +200mV. No artifacts visible; if there are any, I don't see them. I thought I could hit 1800 on the memory, but it crashed at 1775.

1179/1599 Core +200mV AUX +13mV (1.047V)
Graphics Score 14064
Graphics Test 1 - 66.8 fps
Graphics Test 2 - 56.38 fps
http://www.3dmark.com/3dm/9240826?

1181/1708 Core +200mV AUX +13mV (1.047V)
Graphics Score 14140
Graphics Test 1 - 66.76 fps
Graphics Test 2 - 56.97 fps
http://www.3dmark.com/3dm/9240968?

1179/1726 Core +200mV AUX +13mV (1.047V)
Graphics Score 14189
Graphics Test 1 - 66.95 fps
Graphics Test 2 - 57.2 fps
http://www.3dmark.com/3dm/9241171?

1181/1749 Core +200mV AUX +13mV (1.047V)
Graphics Score 14239
Graphics Test 1 - 67.35 fps
Graphics Test 2 - 57.28 fps
http://www.3dmark.com/3dm/9241242?

Primary (Elpida) card with the 290 BIOS. Same thing with the voltage: I can run it at 1200 on the core with +100mV, but it needs more when combined with the memory OC. I can only max out at 1620 on the memory.

1200/1600 Core +200mV AUX +50mV (1.047V)
Graphics Score 13470
Graphics Test 1 - 64.61 fps
Graphics Test 2 - 53.56 fps
http://www.3dmark.com/3dm/9241519?

1200/1620 Core +200mV AUX +50mV (1.047V)
Graphics Score 13551
Graphics Test 1 - 65.15 fps
Graphics Test 2 - 53.77 fps
http://www.3dmark.com/3dm/9241609?


----------



## mus1mus

http://www.3dmark.com/fs/6436361

1200/1625


----------



## gatygun

Quote:


> Originally Posted by *MEC-777*
> 
> Really cool to see how high some of you are able to clock these 290's and score on the benchmarks. Very impressive.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm curious how they stack up against highly overclocked GTX 970's, which I understand, can clock quite a bit higher out of the box and without messing around with the BIOS...


My 290 wouldn't go above 13.2k points (it's not a 290X, so yeah) at 1225/1650; my 970 hits its limits at 1580/4080, which equals 14.1k points. Both on stock BIOSes (290 = Tri-X, 970 = MSI Gaming). Total 3DMark score would be 10590 for the 290 and 11900 for the 970, with only the GPU being different.

290 = slower than a 290X.


----------



## mus1mus

My card failed to post on a water block.

Scurry!


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> My card failed to post on a water block.
> 
> Scurry!


What happened? I thought your card was already under water.


----------



## mus1mus

Quote:


> Originally Posted by *kizwan*
> 
> What happen? I thought your card already underwater.


That was the Hynix.

This elpida has been tested on air.

It posted and went black.

Pulled it off the WB and it posted again on air. Weird.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> That was the Hynix.
> 
> This elpida has been tested on air.
> 
> It posted and went black.
> 
> Pulled it out the WB and it posted back on air. Weird.


Is it possible you missed a corner on the die with TIM? I've done that before.


----------



## mus1mus

What do you mean?

I think something is making contact with metal.

Washed the card again with ISO. Well.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> what do you mean?
> 
> I think something is in contact with a metal.
> 
> Washed the card again with ISO. Well.


Like one corner of the die didn't make contact with the block. I did this once or twice: I didn't get quite enough TIM on the die, and a corner wasn't making contact. It could boot, but as soon as I put any load on it, black screens came every single time. It took a while to figure out that I didn't have quite enough TIM on the die. I put a bit more on, spread it out with a credit card, reseated it, and boom, fixed. When one or more corners aren't making good contact it can do this, and temps can even look great while it still happens. Just an idea. It tends to happen when you use the pea or rice-spot method instead of the spreading method on GPUs; some recommend always using the spread method on GPUs.


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> Like one corner of the die didn't make contact with the block. I did this once or twice, didn't get quite enough TIM on the die, and a corner wasn't making contact. It could boot, but as soon as I put any load on it, black screens came every single time. Took a while to figure out that I didn't have quite enough TIM on die. Put a bit more on and spread it out with a CC. Reseated it, and boom, fixed. When one or more corners aren't making good contact it can do this, temps can even look great and it still happens. Just an idea. Tends to happen when you use the pea or rice spot method instead of the spreading method on GPU's. Some recommend always using spread method on GPU's.


Thanks, but it's a complete black screen, no POST whatsoever, while it posted fine on the stock cooler.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> Thanks. But it's just complete black screen.No post whatsoever. While it posted on the stock cooler.


Was worth a shot.


----------



## mus1mus

It will be for sure.


----------



## kizwan

Which water block do you have there? Check the thermal pads; make sure all of them are making good contact by looking at the impressions left on the pads. If you have an Aqua Computer block, make sure you use TIM, not thermal pads, on the memory ICs.


----------



## mus1mus

Must have been too tight, she said.









Sooo, the card is running cold now, and I'm slowly giving it some workout...

http://www.3dmark.com/compare/3dm11/10516400/3dm11/10516423/3dm11/10516439/3dm11/10516456/3dm11/10516462/3dm11/10516475/3dm11/10516483/3dm11/10516494

Also running FF/F1. No artifacts yet at 1270/1600 UNTOUCHED Strap.

OHMIGOD!!! It keeps on going.......

Artifacts at 1290/1600

And Boom!
http://www.3dmark.com/3dm11/10516506


----------



## EpicOtis13

Quote:


> Originally Posted by *mus1mus*
> 
> must of been too tight. she said'
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sooo, the card is running cold now. And is slowly giving it some workout...
> 
> http://www.3dmark.com/compare/3dm11/10516400/3dm11/10516423/3dm11/10516439/3dm11/10516456/3dm11/10516462/3dm11/10516475/3dm11/10516483/3dm11/10516494
> 
> Also running FF/F1. No artifacts yet at 1270/1600 UNTOUCHED Strap.
> 
> OHMIGOD!!! It keeps on going.......
> 
> Artifacts at 1290/1600
> 
> And Boom!
> http://www.3dmark.com/3dm11/10516506


Holy hell, teach me your ways; I want to crush those stupid 970s in SLI with my CrossFire 290s.


----------



## mus1mus

Quote:


> Originally Posted by *EpicOtis13*
> 
> Holy hell, teach me your ways; I want to crush those stupid 970s in SLI with my CrossFire 290s.


My friend, a grudge is a bad thing. You cannot move forward with a burden in your heart. Pray to mus, my friend.









The first thing you need to do is find your perfect BIOS: good scoring, high OC, fewer issues. Do a lot of tests.

Then modify until you've squeezed her tight.


----------



## Mega Man

translated, lots of voltage


----------



## mus1mus

Quote:


> Originally Posted by *Mega Man*
> 
> translated, lots of voltage










yeah man!


----------



## megax05

Quote:


> Originally Posted by *mus1mus*
> 
> must of been too tight. she said'
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sooo, the card is running cold now. And is slowly giving it some workout...
> 
> http://www.3dmark.com/compare/3dm11/10516400/3dm11/10516423/3dm11/10516439/3dm11/10516456/3dm11/10516462/3dm11/10516475/3dm11/10516483/3dm11/10516494
> 
> Also running FF/F1. No artifacts yet at 1270/1600 UNTOUCHED Strap.
> 
> OHMIGOD!!! It keeps on going.......
> 
> Artifacts at 1290/1600
> 
> And Boom!
> http://www.3dmark.com/3dm11/10516506


Man, 19k is just insane. I will start reading about BIOS editing.


----------



## rdr09

Quote:


> Originally Posted by *megax05*
> 
> Man, 19k is just insane. I will start reading about BIOS editing.


19 using orig bios . . .

http://www.3dmark.com/3dm11/8776470

good job to mus and fy!


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> 19 using orig bios . . .
> 
> http://www.3dmark.com/3dm11/8776470
> 
> good job to mus and fy!


Very nice!

That's the elpida card I assume. If so, are you willing to give it a spin on my BIOS? Not today though. Tomorrow?


----------



## megax05

Quote:


> Originally Posted by *rdr09*
> 
> 19 using orig bios . . .
> 
> http://www.3dmark.com/3dm11/8776470
> 
> good job to mus and fy!


I can do 1300 on the core only in my dreams; in reality, 1200 is my max.


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> Very nice!
> 
> That's the elpida card I assume. If so, are you willing to give it a spin on my BIOS? Not today though. Tomorrow?


thanks for the offer but i am a wuss at flashing. i think they are more geared to low-clocking cards. someday, when i get the balls.









Quote:


> Originally Posted by *megax05*
> 
> I can do 1300 on the core only in my dreams; in reality, 1200 is my max.


proper cooling is the first step. like water.

btw, my 1300 runs were full of artifacts.


----------



## fyzzz

It's so fun to see everyone benching and showing great results. The BIOS modding really helps these 290(X)s; they are ageing well.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> It's so fun to see everyone benching and showing great results. The bios modding really helps these 290(x)'s. The 290(x)'s are ageing well.


i'd like to see crossfire reach 30K in graphics. i think it's possible.


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> i like to see crossfire reach 30K in graphics. i think it's possible.


Yeah, probably with two good ones. I have played around with even more timings (can't stop, this is too fun); tested with 1200/1625 and got a GPU score of 14707 on a Sapphire Tri-X BIOS! http://www.3dmark.com/3dm/9246605?


----------



## mus1mus

Quote:


> Originally Posted by *megax05*
> 
> I can do 1300 on the core only in my dreams; in reality, 1200 is my max.


Would you believe I couldn't even stabilize 1230/1625 yesterday? Even with +200.

Quote:


> Originally Posted by *rdr09*
> 
> thanks for the offer but i am a wuzz in flashing. i think they are more geared to low clocking cards. someday when i get the balls.
> 
> 
> 
> 
> 
> 
> 
> 
> *proper cooling is first step. like water.*
> 
> btw, my 1300 runs were full of artifacts.


So true. But I think the BIOS helps as well.

Was running at 70C tops yesterday on air.

1300/1625 runs flawlessly at +175.







But I have adjusted my VID to 1.25 from the stock 1.19.
Quote:


> Originally Posted by *fyzzz*
> 
> It's so fun to see everyone benching and showing great results. The bios modding really helps these 290(x)'s. The 290(x)'s are ageing well.


I am still gunning for you. If I break 15K, I will be satisfied.







hopefully I can catch up soon.


----------



## fyzzz

I believe you can break 15k, @mus1mus! That would be fun. Also, 15k on the 290 BIOS at 1250/1625: http://www.3dmark.com/3dm/9246655?. It was not too happy during the second test, however; I got a blank screen, but the test was still running. Guess I need to play around with this a bit more.


----------



## megax05

I am really considering water cooling my system and changing CPU and mobo now, with the Skylake i7 6700K. Do you guys think it's still worth getting a custom loop and full blocks for R9 290s? 'Cause I will get my 4K screen soon.


----------



## mus1mus

30K Prolly.
http://www.3dmark.com/fs/6268384


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> 30K Prolly.
> http://www.3dmark.com/fs/6268384


prolly if you have a skylake you'll get an extra 400 pts in GS. got 26K with an older driver at 1290. win 10 will prolly add 400 pts.


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> prolly if you have a skylake you'll get an extra 400 pts in GS. got 26K with an older driver at 1290. win 10 will prolly add 400 pts.


Doable, I think. Still haven't RMA'd my MSI 290, and my unlocked card is now detected as an ATI 8XXX!

Well, life's throwing lemons!


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> I have played around with even more timings (can't stop, this is too fun), tested with 1200/1625 and got a GPU score of 14707 on a Sapphire Tri-X BIOS! http://www.3dmark.com/3dm/9246605?












Let us know what you find.

I have a couple of things changed on the timings as well. That did this:
http://www.3dmark.com/fs/6462131


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> It's so fun to see everyone benching and showing great results. The BIOS modding really helps these 290(X)s. The 290(X)s are ageing well.


they are indeed ageing very well. i find benching boring. i did mine the first few weeks of receiving the cards while doing other stuff like cleaning the house. best part is these cards can handle 4K quite well. of course, we can't max games like having a couple of 980Tis but . . . still looks better than maxed 1080. so long as they are not gimpworks.


----------



## battleaxe

You guys are getting me excited. I started flashing last night. It's a flash dance around here. What BIOS are you guys using?

Edit: Okay so I went over to here (http://www.overclock.net/t/1353325/tutorial-atiwinflash-how-to-flash-the-bios-of-your-ati-cardsf) and started flashing, but so far none of the flashes are taking.

I followed all the steps and it seems to be doing what it should. But... They all show as unsupported. Ideas?

Also, are you guys modding your own bios? I've seen that done some, wasn't sure how you all were doing it. I've got one pretty nice card that should do well. The other one has some turd RAM, so I would like to try to get better perf from it if I can. This should be fun.

Edit: I found this (http://www.overclock.net/t/1564219/r9-390x-bios-for-r9-290-290x-now-with-stock-and-modded-voltage-tables)

I'll read up and try before asking stupid questions. LOL

One question though...

do you guys uninstall your AMD drivers and disable crossfire before flashing?


----------



## sinnedone

Quote:


> Originally Posted by *battleaxe*
> 
> You guys are getting me excited. I started flashing last night. It's a flash dance around here. What BIOS are you guys using?
> 
> Edit: Okay so I went over to here (http://www.overclock.net/t/1353325/tutorial-atiwinflash-how-to-flash-the-bios-of-your-ati-cardsf) and started flashing, but so far none of the flashes are taking.
> 
> I followed all the steps and it seems to be doing what it should. But... They all show as unsupported. Ideas?
> 
> Also, are you guys modding your own bios? I've seen that done some, wasn't sure how you all were doing it. I've got one pretty nice card that should do well. The other one has some turd RAM, so I would like to try to get better perf from it if I can. This should be fun.
> 
> Edit: I found this (http://www.overclock.net/t/1564219/r9-390x-bios-for-r9-290-290x-now-with-stock-and-modded-voltage-tables)
> 
> I'll read up and try before asking stupid questions. LOL
> 
> One question though...
> 
> *do you guys uninstall your AMD drivers and disable crossfire before flashing?*


Yes


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> 19 using orig bios . . .
> 
> http://www.3dmark.com/3dm11/8776470
> 
> good job to mus and fy!
> 
> 
> 
> Very nice!
> 
> That's the elpida card I assume. If so, are you willing to give it a spin on my BIOS? Not today though. Tomorrow?

I'm interested to test on my card.

BTW, did you try both cards, overclock in crossfire?
Quote:


> Originally Posted by *battleaxe*
> 
> You guys are getting me excited. I started flashing last night. It's a flash dance around here. What BIOS are you guys using?
> 
> Edit: Okay so I went over to here (http://www.overclock.net/t/1353325/tutorial-atiwinflash-how-to-flash-the-bios-of-your-ati-cardsf) and started flashing, but so far none of the flashes are taking.
> 
> I followed all the steps and it seems to be doing what it should. But... They all show as unsupported. Ideas?
> 
> Also, are you guys modding your own bios? I've seen that done some, wasn't sure how you all were doing it. I've got one pretty nice card that should do well. The other one has some turd RAM, so I would like to try to get better perf from it if I can. This should be fun.
> 
> Edit: I found this (http://www.overclock.net/t/1564219/r9-390x-bios-for-r9-290-290x-now-with-stock-and-modded-voltage-tables)
> 
> I'll read up and try before asking stupid questions. LOL
> 
> One question though...
> 
> *do you guys uninstall your AMD drivers and disable crossfire before flashing?*


No, I didn't do any of that. Flashing in DOS though.


----------



## battleaxe

Quote:


> Originally Posted by *sinnedone*
> 
> Yes


Could explain why my flash didn't take then...
Quote:


> Originally Posted by *kizwan*
> 
> I'm interested to test on my card.
> 
> BTW, did you try both cards, overclock in crossfire?
> No, I didn't do any of that. Flashing in DOS though.


DOS?

Using command with admin privileges?

I didn't realize we had DOS on Win anymore? Maybe I'm confused.


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> Could explain why my flash didn't take then...
> DOS?
> 
> Using command with admin privileges?
> 
> I didn't realize we had DOS on Win anymore? Maybe I'm confused.


Try using ATIFlash found in this thread . . .

http://www.overclock.net/t/1564219/r9-390x-bios-for-r9-290-290x-now-with-stock-and-modded-voltage-tables

don't forget to rep op.


----------



## battleaxe

Quote:


> Originally Posted by *sinnedone*
> 
> Yes


Quote:


> Originally Posted by *rdr09*
> 
> Try using ATIFlash found in this thread . . .
> 
> http://www.overclock.net/t/1564219/r9-390x-bios-for-r9-290-290x-now-with-stock-and-modded-voltage-tables
> 
> don't forget to rep op.


Yup, plan to try that. I just referenced that thread a few posts ago. Thanks man!









and will do on the rep, to you too for trying to help.


----------



## mus1mus

No ATIWINFLASH.
Just use the bootable DOS version.

Prep:

http://www.techpowerup.com/forums/threads/how-to-use-atiflash.57750/

For those interested, I will upload the BIOS I am testing a bit later. Elpida for now, as I think the 390X BIOS already works well for the Hynix cards.

I've seen a few people with Elpida benefit from them.
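For reference, the usual DOS atiflash sequence (from the bootable stick described in the how-to above) looks something like this. The adapter index and filenames here are just examples:

```shell
atiflash -i               # list adapters so you know which index is which card
atiflash -s 0 backup.rom  # save the stock BIOS of adapter 0 FIRST
atiflash -p 0 mod.rom -f  # program adapter 0 with the modded ROM; -f forces past SSID checks
```

Always keep the `backup.rom` somewhere safe before programming anything.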


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> No ATIWINFLASH.
> Just use the bootable DOS version.
> 
> Prep:
> 
> http://www.techpowerup.com/forums/threads/how-to-use-atiflash.57750/
> 
> For those interested, I will upload the BIOS I am testing a bit later. Elpida for now, as I think the 390X BIOS already works well for the Hynix cards.
> 
> I've seen a few people with Elpida benefit from them.


Working on it now. Still won't flash, I'm still doing something wrong. Grrrr

+1


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> Working on it now. Still won't flash, I'm still doing something wrong. Grrrr
> 
> +1


What does it return?

-p -f 0

The 0 is a zero, pointing to the 1st card.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> What does it return?
> 
> -p -f 0
> 
> 0 is zero to point 1st card.


Yup did that. Says no adapter to flash. I'm trying to flash my second card. Wait... I think I know what I did wrong now. I put 2 instead of 1. Should have been a "1" for the second slot. Duh.

Thanks man!

+1


----------



## mus1mus

Glad you figured it out.


----------



## sinnedone

Quote:


> Originally Posted by *battleaxe*
> 
> Could explain why my flash didn't take then...
> DOS?
> 
> Using command with admin privileges?
> 
> I didn't realize we had DOS on Win anymore? Maybe I'm confused.


Yeah, you need to make a bootable USB stick. I believe there is a how-to in the first post of the 390X-BIOS-on-290 thread.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> Glad you figured it out.


Quote:


> Originally Posted by *sinnedone*
> 
> Yeah, you need to make a bootable USB stick. I believe there is a how-to in the first post of the 390X-BIOS-on-290 thread.


Yeah, it worked. So I proved I can flash it now. Seems to work okay, but not getting any better results on my dud card. Gonna try timings now.


----------



## mus1mus

Elpida or Hynix?

Who wants to give this BIOS a spin?

http://www.3dmark.com/fs/6468184

Tweaked.zip 43k .zip file


Tweaks include:

FF-F1 for Elpida
DPM 7 - 1.25
1250 Strap for 1625.

Best to give the card a spin at 1625 memory before doing any core OC. And maybe +100 on VDDC.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> Elpida or Hynix?
> 
> Who wants to give this BIOS a spin?
> 
> http://www.3dmark.com/fs/6468184
> 
> Tweaked.zip 43k .zip file
> 
> 
> Tweaks include:
> 
> FF-F1 for Elpida
> DPM 7 - 1.25
> 1250 Strap for 1625.
> 
> Best to give the card a spin to 1625 Memory before doing any Core OC. And maybe +100 On VDDC.


It's Hynix, but it doesn't clock well on mem for some reason. Never has. I need to work on timings to see if I can improve it. Not 100% sure how to start though. Ideas?


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> It's Hynix, but it doesn't clock well on mem for some reason. Never has. I need to work on timings to see if I can improve it. Not 100% sure how to start though. Ideas?


Give me a bit of time to write a little guide.









Hwub to scwub mwy lwazy bwutt fwor thwat.


----------



## mus1mus

Okay here it is.

First thing to do is to find your stock BIOS and mod it via a hex editor. I use HxD.


Spoiler: Let's begin







Hit Alt + F to do a search query and type "rgr"; this is very close to the memory timings section.

Scroll up till you see these:


Spoiler: Marker, the timings should be above these









Spoiler: Warning: But before that, A little rundown.







Some ROMs can support two memory types, namely Elpida and Hynix, as shown above.

Now, if you look at the memory timings section of the ROM, you will see the pointers showing 01 and 02. If the card can only support one chip make, it shows 00.
In this ROM, as shown by the image above, Hynix memory was listed first. So 01 stands for Hynix and 02 points to the Elpida timings.


Spoiler: Timings section.







It means that, if you have Hynix, you should only mess with the timings that point to Hynix. (Same idea for Elpida and Samsung.)

The picture above also shows what @fyzzz and I were talking about: DD-D1, 77 71, etc.


Spoiler: She's sooo tight, she said







Final output.
This is yet to be proven, but tightening the timings sometimes brings stability at higher clocks. It does its magic on mine, so try your luck.
What I did was copy the timings one strap down at a time. Say, 1000 for 1250, 1250 for 1375, 1375 for 1500, 1500 for 1625, 1625 for 1750; save the ROM. Flash, test, go tighten her more.
If she scored better, I try some more, till she won't even run 1500 after targeting 1625.

I think I ended up using the timings from 1250 for 1625.











Spoiler: If you spend time with caressing her, she will be tighter







Lastly,

Correcting the Checksum for the rom file.


Spoiler: Checksum: WRONG.
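The copy-one-strap-down step plus the checksum fix can be sketched in a few lines of Python. Everything here is a sketch under assumptions: the strap offsets are hypothetical (find the real ones in your hex editor as described above), and the checksum rules assume the usual ATI VBIOS layout (image size in 512-byte blocks at offset 0x02, checksum byte at offset 0x21, whole image summing to 0 mod 256).

```python
# Sketch: copy one memory strap's 48-byte timing block onto the next
# strap up, then correct the ROM checksum. Offsets below are HYPOTHETICAL.

TIMING_LEN = 48  # each strap's timing block in these Hawaii ROMs is 48 bytes


def copy_strap(rom: bytearray, src_off: int, dst_off: int) -> None:
    """Overwrite the timing block at dst_off with the one at src_off."""
    rom[dst_off:dst_off + TIMING_LEN] = rom[src_off:src_off + TIMING_LEN]


def fix_checksum(rom: bytearray) -> None:
    """Make the BIOS image sum to 0 mod 256 (assumed ATI VBIOS layout:
    image size in 512-byte blocks at 0x02, checksum byte at 0x21)."""
    size = rom[0x02] * 512
    rom[0x21] = 0                       # zero the old checksum before summing
    rom[0x21] = (-sum(rom[:size])) & 0xFF


# Tiny demo on a fake 512-byte "ROM" so the sketch is self-contained:
rom = bytearray(512)
rom[0x02] = 1                                              # image size: 1 * 512 bytes
rom[0x100:0x100 + TIMING_LEN] = bytes(range(TIMING_LEN))   # pretend 1500 strap
copy_strap(rom, src_off=0x100, dst_off=0x200)              # onto the pretend 1625 strap
fix_checksum(rom)
assert sum(rom[:512]) % 256 == 0                           # image now sums to zero
```

On a real card you would load the ROM saved with atiflash, edit it, fix the checksum, and write the result back out before flashing.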


----------



## EpicOtis13

Quote:


> Originally Posted by *mus1mus*
> 
> *[full BIOS timing-modding guide snipped — see the original post above]*


Thanks for the guide! I'm ready to push my cards tomorrow and see how I score with my 5930K. I'm going to OC my CPU, and then hopefully thrash some people's scores with the help of the guide.


----------



## battleaxe

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *mus1mus*
> 
> *[full BIOS timing-modding guide snipped — see the original post above]*






So does it start on line 0000B000?

That's where yours starts, and I see those lines as well. Mine are all zeros in those locations. Your guide should have its own thread. Very crazy, and helpful. Can't imagine attempting this without it.


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> So does it start on line 0000B000?
> 
> That's where yours starts, and I see those lines as well. Mine are all zeros in those locations. Your guide should have its own thread. Very crazy, and helpful. Can't imagine attempting this without it.


After looking at different ROMs, the offset locations for the memory straps vary. So you need to be careful. "rgr" is the closest search query I can get to find them.

One key way to spot them is the repetition of the first 2 hex values on each strap. It's pretty common to find 77 71, 99 91, DD D1, to mention a few.
Follow that by looking at the 4 hex values that precede the 77 71 or the DD D1; minus the pointer, you will find the strap value. Always have a hex-to-decimal converter open.










No need to add a new thread. We already have one;
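That repetition hunt can also be scripted. A rough sketch under the layout assumed above: a 3-byte little-endian frequency (clock in MHz × 100), then a 00/01/02 memory-type pointer, then the repeated-nibble timing bytes (77 71 33, 99 91 33, and so on):

```python
# Sketch: scan a Hawaii ROM for memory-strap headers using the
# "77 71 33"-style signature described above.

REPEATED = {0x33, 0x55, 0x77, 0x99, 0xBB, 0xDD, 0xFF}  # repeated-nibble lead bytes


def find_straps(rom: bytes):
    """Return (offset, memory_type_pointer, freq_mhz) for each strap-like match."""
    hits = []
    for i in range(4, len(rom) - 2):
        b0, b1, b2 = rom[i], rom[i + 1], rom[i + 2]
        # e.g. 77 followed by 71 followed by 33 marks the start of the timings
        if b0 in REPEATED and b1 == (b0 & 0xF0) | 0x01 and b2 == 0x33:
            freq = int.from_bytes(rom[i - 4:i - 1], "little")  # 3 bytes before pointer
            pointer = rom[i - 1]                               # 00/01/02 memory type
            hits.append((i, pointer, freq / 100))              # MHz*100 -> MHz
    return hits


# Demo on a fake fragment holding one Elpida (02) strap at 1750 MHz:
fragment = bytes.fromhex("00000000" "98AB02" "02" "777133" "20000000")
print(find_straps(fragment))  # -> [(8, 2, 1750.0)]
```

False positives are possible, so treat the output as candidates to verify in the hex editor, not gospel.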


----------



## battleaxe

Okay, so this is what I see. Am I highlighting the correct set of chars? These are the first mem timings, correct?

Edit: I can see my thinking was wrong. I found the 77 71, etc... it's a bit further down.


----------



## mus1mus

Very close.

Can you see the 33 31 33 in the center, a couple of lines above the highlighted area? That's the first strap. (Not really our concern, as those are the low clocks.)

Also within the highlighted area is 55 51 33. That's another strap.

So if you see three "5"s, for example, followed by a "1" and "33", you're in a strap.











Your concern starts here.

The 77 71 that follows the 01 shows the highest strap of the ROM for Hynix (if that is the first defined RAM supported). Scroll up.

The 99 91 is for the second RAM supported, as indicated by the 02 preceding it.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> Very close.
> 
> Can you see the 33 31 33 in the center that's a couple of lines above the highlighted area? That's the first strap. (Not really our concern as those are the low clocks)
> 
> Also within the highlighted area are 55 51 33. That's also another strap.
> 
> So if you see three "5"s for example followed by a "1" and "33" you're in a strap.


How the heck do you figure that out? How do you see that? Feeling quite dumb over here. LOL

I see what you are talking about now...

Edit: Okay, so with my turd of a card I cannot get it to 1350 on mem. The highest I have tried with success is 1325MHz. My thinking was to use slower timings on the faster RAM speeds to free them up so I can clock higher. So, like, use the 1650 RAM timings on 1350-1500, for example. Is that reasonable?

And if doing so, what am I looking at changing, from and to what? Or do you have a better idea what I should try?

Edit: okay, I see the straps now. Got it...







... shewww... was starting to feel a little dumb there. LOL


----------



## battleaxe

Weird how this creates a checksum error... wonder why it does that?


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> How the heck do you figure that out? How do you see that? Feeling quite dumb over here. LOL
> 
> I see what you are talking about now...
> 
> Edit: Okay so with my turd of a card I cannot get it to 1350 on mem. Highest I have tried with success is 1325mhz. My thinking was to use slower timings on the faster RAM speeds to free them up so I can clock higher. So, like use the 1650 ram timings on 1350-1500 for example. Is that reasonable?
> 
> And if doing so, what am I looking at changing, from and to what? Or do you have a better idea what I should try?
> 
> Edit: okay, I see the straps now. Got it...
> 
> 
> 
> 
> 
> 
> 
> ... shewww... was starting to feel a little dumb there. LOL


Is it your stock rom?

Quote:


> Originally Posted by *battleaxe*
> 
> Weird how this creates a checksum error... wonder why it does that?


Any editing of the ROM will create a checksum error; the checksum has to be corrected afterwards.

Can you let me take a look at your BIOS?


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> Very close.
> 
> Can you see the 33 31 33 in the center that's a couple of lines above the highlighted area? That's the first strap. (Not really our concern as those are the low clocks)
> 
> Also within the highlighted area are 55 51 33. That's also another strap.
> 
> So if you see three "5"s for example followed by a "1" and "33" you're in a strap.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Your concern starts here.
> 
> The 77 71 that follows the 01 shows the highest strap of the ROM for Hynix (if that is the first defined RAM supported). Scroll up.
> 
> The 99 91 is for the second RAM supported, as indicated by the 02 preceding it.


Okay. Got it. Makes sense. My BIOS is this one. Well, that's the one on it now...

0_NEW.zip 42k .zip file


Here's the stock one.

Hawaii2.zip 99k .zip file


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> *[full BIOS timing-modding guide snipped — see the original post above]*


+rep







Can you make a PDF one for me?







Seriously, so I can save a copy.
Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Very close.
> 
> Can you see the 33 31 33 in the center that's a couple of lines above the highlighted area? That's the first strap. (Not really our concern as those are the low clocks)
> 
> Also within the highlighted area are 55 51 33. That's also another strap.
> 
> So if you see three "5"s for example followed by a "1" and "33" you're in a strap.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> How the heck do you figure that out? How do you see that? Feeling quite dumb over here. LOL
> 
> I see what you are talking about now...
> 
> Edit: Okay so with my turd of a card I cannot get it to 1350 on mem. Highest I have tried with success is 1325mhz. My thinking was to use slower timings on the faster RAM speeds to free them up so I can clock higher. So, like use the 1650 ram timings on 1350-1500 for example. Is that reasonable?
> 
> And if doing so, what am I looking at changing, from and to what? Or do you have a better idea what I should try?
> 
> Edit: okay, I see the straps now. Got it...
> 
> 
> 
> 
> 
> 
> 
> ... shewww... was starting to feel a little dumb there. LOL

If your memory is crap when overclocking, the tightened timings won't help, but you should still be able to squeeze out additional performance with the modded BIOS.


----------



## mus1mus

A PDF of what?

Actually, just to verify whether there's a placebo effect in my tests, I will flash back to a low-clocking BIOS to see what can be done.


----------



## battleaxe

I think I have the basic handle on this now. I just have to get the courage up to modify it. Tomorrow!

What frequency range is the last strap? I'm not quite understanding how to calculate those hex values preceding the strap. I put them into a hex calc and it just gives me some really random crazy numbers, so I'm doing something wrong, I think.


----------



## mus1mus

for ref:



Spoiler: Warning: Spoiler!



800
80 38 01 02
77 71 33 20 00 00 00 00 A5 AC 35 1F 30 55 09 0C 20 8E 75 02 00 44 82 00 22 AA 1C 08 44 09 14 20 2A 89 00 A5 00 00 07 C0 0C 06 14 1A 27 19 21 0F

900
90 5F 01 02
77 71 33 20 00 00 00 00 E7 B4 36 23 40 55 09 0D 24 90 C6 02 00 44 A2 00 22 AA 1C 08 4C 0B 14 20 2A 89 80 A5 00 00 07 C0 0E 08 16 1C 2C 1C 25 0F

1000
A0 86 01 02
77 71 33 20 00 00 00 00 29 39 57 26 50 55 09 0E 26 11 17 03 00 68 C2 00 22 AA 1C 08 54 0C 14 20 AA 89 00 A6 00 00 07 C0 0F 0A 18 1D 31 1E 27 10

1250
48 E8 01 02
77 71 33 20 00 00 00 00 4A BD 47 2A 60 55 0F 0F 23 1D 87 03 00 46 C4 00 22 AA 1C 08 5C 0B 14 20 4A 89 00 A0 00 00 01 20 11 0D 20 23 4A 1D 24 11

1375
1C 19 02 02
77 71 33 20 00 00 00 00 4A BD 47 2A 60 55 0F 0F 23 1D 87 03 00 46 C4 00 22 AA 1C 08 5C 0B 14 20 4A 89 00 A0 00 00 01 20 11 0D 20 23 4A 1D 24 11

1500
F0 49 02 02
77 71 33 20 00 00 00 00 31 5A 6B 3A 90 55 09 12 36 19 AB 04 00 6A E4 00 22 AA 1C 08 74 04 14 20 CA 89 00 A9 02 00 07 C0 17 12 24 29 4A 2A 37 12

1625
C4 7A 02 02
77 71 33 20 00 00 00 00 73 62 7C 3E B0 55 09 14 3A 1B 1C 05 00 69 26 01 22 AA 1C 08 04 06 14 20 EA 89 80 A9 03 00 07 C0 19 14 26 2B 51 2E 3B 13

1750
98 AB 02 02
77 71 33 20 00 00 00 00 B5 6A 7D 43 C0 55 09 15 3E 1D 7D 05 00 6A 27 01 22 AA 1C 08 0C 08 14 20 FA 89 40 AA 03 00 07 C0 1B 16 29 2E 57 31 3F 13



The top number shows the frequency strap.
The first 3 hex bytes on each second line show its hex equivalent: you read them from right to left.
The 4th value on the 2nd line is the pointer; 02, for I have Elpida.

i.e.:
98 AB 02 = 02 AB 98 = 175000 = the 1750 strap (the stored value is the clock in MHz x 100)
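To sanity-check the byte order, every strap header from the table above can be decoded the same way (one assumption here, drawn from the table itself: the stored value is the clock in MHz × 100):

```python
# Decode the 3-byte little-endian strap headers from the table above.
# Assumption: stored value = clock in MHz * 100, so 175000 -> 1750 MHz.
headers = {
    "80 38 01": 800, "90 5F 01": 900, "A0 86 01": 1000, "48 E8 01": 1250,
    "1C 19 02": 1375, "F0 49 02": 1500, "C4 7A 02": 1625, "98 AB 02": 1750,
}
for hex_bytes, expected_mhz in headers.items():
    raw = bytes.fromhex(hex_bytes.replace(" ", ""))
    mhz = int.from_bytes(raw, "little") / 100     # read right-to-left, then scale
    assert mhz == expected_mhz, (hex_bytes, mhz)
    print(f"{hex_bytes} -> {mhz:.0f} MHz")
```

All eight headers round-trip cleanly, which is a nice confirmation that the straps really are 3-byte little-endian values.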


----------



## battleaxe

So if I put 98 AB 02 into a hex to decimal converter I get 10005250

? Or do I need to convert 98 AB 02 into something else first?


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> So if I put 98 AB 02 into a hex to decimal converter I get 10005250
> 
> ? Or do I need to convert 98 AB 02 into something else first?


You read it from right to left.

98 AB 02 - this is how it appears in the ROM.
02 AB 98 - this is how you should put it into a hex converter.


----------



## kizwan

Quote:


> Originally Posted by *battleaxe*
> 
> So if I put 98 AB 02 into a hex to decimal converter I get 10005250
> 
> ? Or do I need to convert 98 AB 02 into something else first?


Reverse the order before converting. For example, if "98 AB 02" is what you see in the hex editor, reverse it to "02 AB 98" & then convert to decimal.
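That reversal is exactly why the converter spits out "random crazy numbers": read unreversed (big-endian), 98 AB 02 really is 10005250. A quick sketch (the ÷100 scaling to MHz is an assumption taken from the strap table posted earlier):

```python
raw = bytes.fromhex("98AB02")
print(int.from_bytes(raw, "big"))           # 10005250 - the "crazy number", unreversed
print(int.from_bytes(raw, "little"))        # 175000   - reversed: 02 AB 98
print(int.from_bytes(raw, "little") // 100) # 1750     - the strap in MHz
```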


----------



## fat4l

BTW, I found out what is causing my clock dropping:
CrossFire + FreeSync + Afterburner installed = dropping.
If I do a fresh install of the drivers and enable FreeSync but don't install Afterburner, my clocks are 100% stable. Once I install AB, it breaks.

Also, my scores in 3DMark are now higher by about ~300 points.

Also, regarding FreeSync:
If I enable FreeSync and run FS X or FS U, my scores are the same as if FreeSync were disabled. However, this doesn't apply to "normal" FS. If FreeSync is enabled in normal FS, my scores are lower by about ~600 points than with it disabled.

Will post screens soon.


----------



## kizwan

Quote:


> Originally Posted by *kizwan*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Looks like bench shootout commencing but I can only join in the low side.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Unfortunately my second (Hynix) card hit a wall at 1180 on the core with 390 modded BIOS. My method is more of a bruteforce. 1180 already need +100mV & combined with memory it already needs more. So I just max out to +200mV. No artifacts visible, if there's any, I don't see it. I thought I can hit 1800 on the memory but at 1775, it crashed.
> 
> 1179/1599 Core +200mV AUX +13mV (1.047V)
> Graphics Score 14064
> Graphics Test 1 - 66.8 fps
> Graphics Test 2 - 56.38 fps
> http://www.3dmark.com/3dm/9240826?
> 
> 1181/1708 Core +200mV AUX +13mV (1.047V)
> Graphics Score 14140
> Graphics Test 1 - 66.76 fps
> Graphics Test 2 - 56.97 fps
> http://www.3dmark.com/3dm/9240968?
> 
> 1179/1726 Core +200mV AUX +13mV (1.047V)
> Graphics Score 14189
> Graphics Test 1 - 66.95 fps
> Graphics Test 2 - 57.2 fps
> http://www.3dmark.com/3dm/9241171?
> 
> 1181/1749 Core +200mV AUX +13mV (1.047V)
> Graphics Score 14239
> Graphics Test 1 - 67.35 fps
> Graphics Test 2 - 57.28 fps
> http://www.3dmark.com/3dm/9241242?
> 
> 
> 
> Primary (Elpida) card with 290 BIOS. Same thing with the voltage, I can run it at 1200 on the core with +100mV but it needs more when combined with memory. I can only max out 1620 on the memory.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 1200/1600 Core +200mV AUX +50mV (1.047V)
> Graphics Score 13470
> Graphics Test 1 - 64.61 fps
> Graphics Test 2 - 53.56 fps
> http://www.3dmark.com/3dm/9241519?
> 
> 
> 
> 1200/1620 Core +200mV AUX +50mV (1.047V)
> Graphics Score 13551
> Graphics Test 1 - 65.15 fps
> Graphics Test 2 - 53.77 fps
> http://www.3dmark.com/3dm/9241609?


Quote:


> Originally Posted by *mus1mus*
> 
> Elpida or Hynix?
> 
> Who wants to give this BIOS a spin?
> 
> http://www.3dmark.com/fs/6468184
> 
> Tweaked.zip 43k .zip file
> 
> 
> Tweaks include:
> 
> FF-F1 for Elpida
> DPM 7 - 1.25
> 1250 Strap for 1625.
> 
> Best to give the card a spin to 1625 Memory before doing any Core OC. And maybe +100 On VDDC.


mus1mus FF F1 timings 947/1625 AUX +50mV
Graphics Score 11370
Graphics Test 1 - 54.2 fps
Graphics Test 2 - 45.44 fps
http://www.3dmark.com/fs/6474140

mus1mus FF F1 timings 1200/1620 Core +200mV AUX +50mV (1.047V)
Graphics Score 13864
Graphics Test 1 - 66.59 fps
Graphics Test 2 - 55.06 fps
http://www.3dmark.com/fs/6474064

mus1mus FF F1 timings 1200/1625 Core +200mV AUX +69mV (1.063V)
_(Note. AUX at +50mV ran just fine but I overvolted a little bit anyway)_
Graphics Score 13903
Graphics Test 1 - 66.91 fps
Graphics Test 2 - 55.13 fps
http://www.3dmark.com/fs/6474351

TRI-X OC vs. mus1mus FF F1 timings BIOS - 1200-1620


mus1mus FF F1 timings BIOS 1200-1620 vs. 1200-1625


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> You put it reading from right to left.
> 
> 98 AB 02 - this is how it appears on the rom.
> 02 AB 98 - this is how you should put it on a HEX converter.


Quote:


> Originally Posted by *kizwan*
> 
> Reverse the order before converting. For example, if "98 AB 02" is what you see in the hex editor, reverse it to "02 AB 98" & then convert to decimal.



You guys are so helpful. Thanks!

+1 to you both


----------



## fat4l

Here it is... Freesync test.

This only applies to FS Performance... the score in Ultra and Extreme stays the same with FreeSync on and off.

*FREESYNC ON*
*19753*


*FREESYNC OFF*
*20338*


Hmmm
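Working out the gap implied by those two scores:

```python
# Performance difference implied by the Firestrike scores above.
fs_on, fs_off = 19753, 20338
overhead = (fs_off - fs_on) / fs_off * 100
print(f"FreeSync costs about {overhead:.1f}% here")  # about 2.9%
```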


----------



## Agent Smith1984

You guys are getting some pretty sick results out of those "old" 290's.....

The 15.2 and up driver combined with some of these modified BIOS', really opened up some doors.....

I still say the 290, when found cheap enough, is hands down the best bang per buck out there right now....


----------



## mus1mus

@kizwan

So it gave you some boost?

I am looking at Insanity's now.

I think I can figure this out. And give @fyzzz some scare. lol

http://www.3dmark.com/fs/6475112

Surprisingly, no Black Screens as before.


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> @kizwan
> 
> So it gave you some boost?
> 
> I am looking at Insanity's now.
> 
> I think I can figure this out. And give @fyzzz some scare. lol
> 
> http://www.3dmark.com/fs/6475112
> 
> Surprisingly, no Black Screens as before.


Ohh almost at the magical 15k. We can soon start a 15k 290 club


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> @kizwan
> 
> So it gave you some boost?
> 
> I am looking at Insanity's now.
> 
> I think I can figure this out. And give @fyzzz some scare. lol
> 
> http://www.3dmark.com/fs/6475112
> 
> Surprisingly, no Black Screens as before.
> 
> Ohh almost at the magical 15k. We can soon start a 15k 290 club

Incoming mate.....

http://www.3dmark.com/fs/6475202

I'msooohappyimmadie!


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> @kizwan
> 
> So it gave you some boost?
> 
> I am looking at Insanity's now.
> 
> I think I can figure this out. And give @fyzzz some scare. lol
> 
> http://www.3dmark.com/fs/6475112
> 
> Surprisingly, no Black Screens as before.


How did you get rid of them? Mind sharing?


----------



## mus1mus

Quote:


> Originally Posted by *fat4l*
> 
> How did u get rid of them ?
> 
> 
> 
> 
> 
> 
> 
> Mind sharing ?


Apparently, the timings are too tight for my card.

I'll continue mine next week.

For you guys, refer to the guide I posted:

1. Look up your card's BIOS timings
2. Copy them into Insanity's

I get screen tearing in 2D, indicative of some timing issues I guess.

And black screens at 1625.

Gotta hit the sack guys. Have fun all ya


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> Incoming mate.....
> 
> 
> 
> http://www.3dmark.com/fs/6475202
> 
> I'msooohappyimmadie!


Nice! Your hard work is paying off I see!


----------



## sinnedone

I really need to get some time off to play with Bios.

I'd really like to see if I can clock higher with less heat. Currently running crossfired 290's at 1050/1300 on stock voltage.


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> Nice! Your hard work is paying off I see!


It's a quick and dirty job actually. Added FF F1, put 1250 for DPM 7. Benched. I can still push higher than 1300 I guess, and may squeeze a bit more from 1625.


----------



## battleaxe

I had no luck putting looser timings into the 1250, 1325, and 1500 memory straps. My thought was that by loosening them up I could push higher frequencies. No dice.

I was, however, able to put the 1000 timings into everything higher than it. So I'm running 1000MHz timings now at 1125 and 1250MHz. I can still only get this RAM up to about 1325 before it peters out... weird. Next is to find out what difference it made.


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> @kizwan
> 
> So it gave you some boost?
> 
> I am looking at Insanity's now.
> 
> I think I can figure this out. And give @fyzzz some scare. lol
> 
> http://www.3dmark.com/fs/6475112
> 
> Surprisingly, no Black Screens as before.


Yes, I got some boost.

That's a good idea actually, copying timings into Insanity's BIOS.

I really need a lot of voltage to break 1200. I wonder whether setting 1300 for DPM 7 would be a good idea, or risky?


----------



## fat4l

Quote:


> Originally Posted by *battleaxe*
> 
> I had no luck putting more slack timings at the 1250, 1325, and 1500 memory straps. My thought was by loosening them up I could push higher frequencies. No dice.
> 
> I was however able to put the 1000 timings into everything higher than it. So I'm running 1000mhz timings now on 1125 and 1250mhz. I can only get this ram up to about 1325 still before it peters out... weird. Next is to find out what difference it made.


I think MHz is a bit more important than timings...
For me, 1250 timings work till 1520MHz. After that, artifacts start.
I will have to try 1375 timings in the 1625 strap...


----------



## Agent Smith1984

Quote:


> Originally Posted by *battleaxe*
> 
> I had no luck putting more slack timings at the 1250, 1325, and 1500 memory straps. My thought was by loosening them up I could push higher frequencies. No dice.
> 
> I was however able to put the 1000 timings into everything higher than it. So I'm running 1000mhz timings now on 1125 and 1250mhz. I can only get this ram up to about 1325 still before it peters out... weird. Next is to find out what difference it made.


You tried adding aux voltage?


----------



## mus1mus

Quote:


> Originally Posted by *kizwan*
> 
> Yes I got some boost.
> 
> That's good idea actually, copy timings to Insanity's BIOS.
> 
> I really need a lot of volt to break 1200. I wonder whether setting 1300 for DPM 7 would be good idea or risky?


Observe the ROM's effect on your card. I cannot get my voltage above 1.344 under load, even with iTurbo at +400 and PT1. BIOS position?

I don't think setting it to 1300 is dangerous, if it ever accepts it.
Quote:


> Originally Posted by *Agent Smith1984*
> 
> You tried adding aux voltage?


It doesn't seem to do anything for me, but it can be tried.

How about the DPM 7 setting?
Quote:


> Originally Posted by *battleaxe*
> 
> I had no luck putting more slack timings at the 1250, 1325, and 1500 memory straps. My thought was by loosening them up I could push higher frequencies. No dice.
> 
> I was however able to put the 1000 timings into everything higher than it. So I'm running 1000mhz timings now on 1125 and 1250mhz. I can only get this ram up to about 1325 still before it peters out... weird. Next is to find out what difference it made.


Try looking for other ROMs. Flash, test, observe.

Maybe Insan1ty's?


----------



## battleaxe

Quote:


> Originally Posted by *Agent Smith1984*
> 
> You tried adding aux voltage?


Oh yeah. There seems to be a wall at 1350 on this mem for some reason. It takes tighter timings just fine, still OCs up to about 1340, then sputters out. Kinda strange.


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Yes I got some boost.
> 
> That's good idea actually, copy timings to Insanity's BIOS.
> 
> I really need a lot of volt to break 1200. I wonder whether setting 1300 for DPM 7 would be good idea or risky?
> 
> 
> 
> Observe the rom's effect to your card. I cannot get my voltage more than 1.344 when loaded. Even with iTurbo at +400 and PT1. Bios position?
> 
> I don't think setting it to 1300 is dangerous. If it ever accepts it.

1.344 before Vdroop? Mine reads 1.4V before Vdroop with +200mV, on the stock, TRI-X OC & Insanity's BIOSes.
Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Agent Smith1984*
> 
> You tried adding aux voltage?
> 
> 
> 
> It doesn't seem to do anything for me. But that can be tried.
> 
> How about DPM 7 setting?

I thought overvolting AUX didn't help on my cards based on past tests, but with +50mV I can get my Elpida card to bench at 1200/1600 again in Win 10 (which it used to be able to do in Win 7); it can go up to 1200/1620, and with FF F1 timings up to 1200/1625, without black-screen crashing. On my Hynix card, with +50mV it can bench with memory up to 1750, where before I couldn't even break 1625. However, going above +50mV doesn't improve the memory overclock further.


----------



## mus1mus

After droop. 1.359 pre-droop.

Maybe I need more voltage-tweaking know-how.


----------



## mus1mus

Quote:


> Originally Posted by *kizwan*
> 
> 1.344 before Vdroop? Mine with +200mV, 1.4V before Vdroop with stock, TRI-X OC & Insanity's BIOSes.
> I thought overvolting AUX doesn't help on my cards based on the past tests but with +50mV I can get my Elpida card to bench at 1200/1600 again in Win 10 (that it used to be able to do it in Win 7), can go up to 1200/1620 & with FF F1 timings it can go up to 1200/1625 without black screen crashing. On my Hynix card, with +50mV it can bench with memory up to 1750 which I can't even break 1625 before. However, above +50mV doesn't improve memory overclock further.


Dude, I didn't mod the hynix timings on the BIOS I posted. Did you?


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> 1.344 before Vdroop? Mine with +200mV, 1.4V before Vdroop with stock, TRI-X OC & Insanity's BIOSes.
> I thought overvolting AUX doesn't help on my cards based on the past tests but with +50mV I can get my Elpida card to bench at 1200/1600 again in Win 10 (that it used to be able to do it in Win 7), can go up to 1200/1620 & with FF F1 timings it can go up to 1200/1625 without black screen crashing. On my Hynix card, with +50mV it can bench with memory up to 1750 which I can't even break 1625 before. However, above +50mV doesn't improve memory overclock further.
> 
> 
> 
> Dude, I didn't mod the hynix timings on the BIOS I posted. Did you?

My Hynix card uses Insan1ty's 290_HYNIX_STOCK_V1.7 BIOS. I only edited the default clocks to 1000/1300. This BIOS is pretty stable.

Insan1ty's BIOS for ELPIDA made my ELPIDA card unstable when overclocked. But I think it is unstable because of the "middle" clock mod I did earlier, which also caused hard locks in CrossFire when overclocked, which didn't happen before. I may try Insan1ty's BIOS for ELPIDA again later.


----------



## mus1mus

Quote:


> Originally Posted by *kizwan*
> 
> My Hynix card use Insan1ty's 290_HYNIX_STOCK_V1.7 BIOS. I edited the default clocks to 1000/1300.
> 
> Insan1ty's BIOS for ELPIDA caused my ELPIDA card unstable when overclock. But I think it is unstable because of the "middle" clock mod I did earlier which also caused hard lock in crossfire which didn't happen before. I think I may try Insan1ty's BIOS for ELPIDA again later.


Nice.

I had some fun with the modded Hynix before. It really does add some boost. The Elpida timings, on the other hand, are a bit too tight, limiting the OC.

Maybe integrating the stock Elpida timings will do the trick. I am not gonna be back at it till Monday though.


----------



## Roboyto

Quote:


> Originally Posted by *battleaxe*
> 
> Oh yeah. Seems to be a wall at 1350 on this mem for some reason. It takes tighter timings just fine, still OC's up to about 1340, then sputters out, kinda strange.


I've owned two different 290's that would only clock the RAM into that range, under 1400. I never messed with the timings, but they both exhibited the same symptoms. Personally I think it boils down to the silicon lottery...some are winners...others not so much.

Quote:


> Originally Posted by *kizwan*
> 
> I thought overvolting AUX doesn't help on my cards based on the past tests but with +50mV I can get my Elpida card to bench at 1200/1600 again in Win 10 (that it used to be able to do it in Win 7), can go up to 1200/1620 & with FF F1 timings it can go up to 1200/1625 without black screen crashing. On my Hynix card, with +50mV it can bench with memory up to 1750 which I can't even break 1625 before. However, above +50mV doesn't improve memory overclock further.


I've seen this phenomenon with AUX voltage before as well...come to think of it, AUX has never really helped all that much with any of my RAM clocks.

A Powercolor reference 290 I had exhibited a similar problem, but with the core voltage. Anything over +175mV and it would hardlock the system immediately under any 3D load. It got OK speeds of 1075/1375 with +37mV, but even with +175 it didn't make it to 1200 core...I bought it open box on The Egg for cheap, so I wasn't disappointed, as $297 at that time (April 2014) was a great deal on any 290.


----------



## mus1mus

I still find it weird why 3 days ago, I was clocked here:
http://www.3dmark.com/fs/6445106

But working on the BIOS I was able to OC further.

http://www.3dmark.com/compare/fs/6445592/fs/6452320/fs/6461753/fs/6468551/fs/6475202


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> My Hynix card use Insan1ty's 290_HYNIX_STOCK_V1.7 BIOS. I edited the default clocks to 1000/1300.
> 
> Insan1ty's BIOS for ELPIDA caused my ELPIDA card unstable when overclock. But I think it is unstable because of the "middle" clock mod I did earlier which also caused hard lock in crossfire which didn't happen before. I think I may try Insan1ty's BIOS for ELPIDA again later.
> 
> 
> 
> Nice.
> 
> I have sime fun on the modded Hynix before. It really does add some boost. The Elpida on the other hand is a bit too tight limiting the OC.
> 
> Maybe integrating the stock Elpida timings will do the trick. I am not gonna be back at it til Monday though.

Tried Insan1ty's BIOS for Elpida. I can confirm the problem I was facing before with Insan1ty's BIOS was caused by the "middle" memory mod I did earlier. As usual, the overclock dropped a little, but performance is better; e.g. 1180/1600 with Insan1ty's BIOS outperforms 1200/1625. Please take a look at my score & FPS. Do you think I can squeeze out more performance with different timings? Your FF F1 timings allow my card to run 1625 on the memory without black-screen crashing. I wonder whether I can break 1600 on the memory with Insan1ty's BIOS plus FF F1 timings.

Insan1ty's 290_ELPIDA_STOCK_V1.7 1180/1600 Core +200mV AUX +13mV (1.047V)
Graphics Score 14030
Graphics Test 1 - 66.67 fps
Graphics Test 2 - 56.22 fps
http://www.3dmark.com/compare/fs/6474351/fs/6480061

1200/1625 vs. 1180/1600
Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I thought overvolting AUX doesn't help on my cards based on the past tests but with +50mV I can get my Elpida card to bench at 1200/1600 again in Win 10 (that it used to be able to do it in Win 7), can go up to 1200/1620
> 
> 
> 
> 
> 
> 
> 
> I've seen this phenomenon with AUX voltage before before as well...come to think of it AUX has never really helped all that much with any of my RAM clocks.
> 
> A Powercolor reference 290 I had exhibited similar problem but with the core voltage. Anything I would put on it over +175mV and it would hardlock the system immediately under any 3D load. It got OK speeds of 1075/1375 with +37mV, but even with +175 it didn't make it to 1200 core...I bought it open box on The Egg for cheap, so I wasn't disappointed as $297 at that time(april 2014) was a great deal on any 290.

Yeah, I've heard a couple of times before that some cards just can't take extra voltage without crashing. My card's voltage wall is probably higher. So far the highest before vdroop I've recorded is 1.445V, with +200mV, on my Elpida card.
Quote:


> Originally Posted by *mus1mus*
> 
> I still find it weird why 3 days ago, I was clocked here:
> http://www.3dmark.com/fs/6445106
> 
> But working on the BIOS I was able to OC further.
> 
> http://www.3dmark.com/compare/fs/6445592/fs/6452320/fs/6461753/fs/6468551/fs/6475202


Nice!

What is the voltage before Vdroop for that card?


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> I still find it weird why 3 days ago, I was clocked here:
> http://www.3dmark.com/fs/6445106
> 
> But working on the BIOS I was able to OC further.
> 
> http://www.3dmark.com/compare/fs/6445592/fs/6452320/fs/6461753/fs/6468551/fs/6475202


So you're saying that playing around with timings allowed you to clock the core further on the same volts?


----------



## mus1mus

It sure did. But clocks are not my main concern. For all I know, my previous settings would post better scores at the same clocks than my new ones, i.e. Insanity's.

It just scores better the higher I go, so I'll take that.

@kizwan 1.344 at load. 1.365 at idle.

Even forcing iTurbo won't raise it past 1.344 under load. Using GPU-Z of course.
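For anyone following along, Vdroop as discussed here is just the idle/set reading minus the reading under load. With those two numbers:

```python
# Vdroop = set/idle voltage minus the voltage measured under load.
idle_v, load_v = 1.365, 1.344
droop_mv = (idle_v - load_v) * 1000
print(f"Vdroop: {droop_mv:.0f} mV")  # Vdroop: 21 mV
```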


----------



## gupsterg

@ members who may not have seen this thread, link: http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x

I started that thread about 4-5 months ago, collating the things that can be modded.

Why I'm highlighting it: when some of you are searching for the memory strap, be aware that if your ROM has two supported RAM ICs you may need to identify which section is the correct one to modify, plus which IC your card has.

For example, if a ROM supports Hynix & Elpida, each strap MHz identifier has a 01 or 02 to denote which IC the timings belong to.

Please refer to the image and info in post 1 of the above linked thread, heading *Memory Timings Modding* >> *What area of ROM to mod?*

I would also suggest modders create tables for a ROM using AtomDis (again, info in the linked thread, heading *AtomDis installation and usage in Ubuntu*); this will give you the location of the VRAM_Info table.
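A hedged sketch of what that search might look like in Python, assuming (per the description above and the hex-reversal discussion earlier) that each strap entry starts with a 24-bit little-endian frequency in 10 kHz units followed by the 01/02 IC byte. `find_straps` and the synthetic buffer are my own illustration, not a tool from the linked thread:

```python
# Scan a Hawaii ROM dump for candidate memory-strap headers.
# Assumption: strap entry = 24-bit little-endian frequency (10 kHz
# units) + one byte (01 or 02) selecting which RAM IC the timings
# belong to. Always verify hits against the VRAM_Info table offsets.
def find_straps(rom, targets_mhz=(1250, 1375, 1500, 1625)):
    wanted = {mhz * 100 for mhz in targets_mhz}  # MHz -> 10 kHz units
    hits = []
    for off in range(len(rom) - 4):
        freq = int.from_bytes(rom[off:off + 3], "little")
        ic = rom[off + 3]
        if freq in wanted and ic in (0x01, 0x02):
            hits.append((off, freq // 100, ic))
    return hits

# Synthetic example: 1500 MHz = 150000 * 10 kHz = bytes F0 49 02
# (little-endian), tagged for IC 01.
rom = b"\x00" * 4 + bytes([0xF0, 0x49, 0x02, 0x01]) + b"\x00" * 4
print(find_straps(rom))  # [(4, 1500, 1)]
```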


----------



## megax05

Guys, I am selling my R9 290s and now want to decide what to get: GTX 980 Ti vs. R9 Fury X. I will be playing at 4K. The R9 Fury X will cost me around $70-100 less, but the performance edge goes to the GTX 980 Ti in the reviews. I don't know about now, since they did a few driver updates for the Fury X.


----------



## rdr09

Quote:


> Originally Posted by *megax05*
> 
> guys I am selling my r9 290s and now want to decide what to get gtx 980 ti vs r9 fury x I will be playing on 4k also the r9 fury x will cost me around 70-100$ less but the performance go to the gtx 980 ti from the reviews I dont know about now since they did few drivers update for the fury x.


you'll still need two of any of those cards for 4K.


----------



## megax05

I know that, but I am not aiming for ultra settings here; probably a mix of high/medium to maintain 60 fps. It will still look better than ultra @ 1080p.
So GTX 980 Ti or Fury X?


----------



## rdr09

Quote:


> Originally Posted by *megax05*
> 
> I know that but I am not aiming for ultra settings here probably a mix of high med to mantain 60 fps it will still look better than ultra @ 1080p.
> So gtx 980 ti or fury x?


definitely not high settings. prolly the 980Ti if single, and oc it. 4K needs help from the cpu as well. i just saw the physics score of a stock 4770K and it is lower than my i7 Sandy at 4.5. i've had my 4K for about a year now and i know that this cpu oc helps push my 2 290s.

that's why i suggested to icy to oc the cpu a bit . . .

http://www.overclock.net/t/1580257/sapphire-amd-radeon-r9-290-4gb-g5-crossfire-fps-issue-revisted-please/10#post_24606621

edit: btw, if it is a 4K TV . . . you really want to go 980Ti.


----------



## megax05

What I did is I got myself a new i7 6700K, 16GB of DDR4, and a Z170 board with SLI/CFX support for future dual GPUs when prices go down. Or I can hold on to my current R9 290s to see what AMD and NVIDIA have up their sleeves for Q1 2016. But that didn't answer my question.


----------



## rdr09

Quote:


> Originally Posted by *megax05*
> 
> What i did is i got my self a new i7 6700k and 16gb of ddr4 and z170 with sli/cfx support for future dual gpus when price go down or I can hold on my current r9 290s to see what amd and nvidia have in sleeves for q1 2016 but that didnt answer my question


Have you even tried using the two 290s at 4K? If your games don't support CrossFire, then I suggest the 980Ti. Save and get another.


----------



## fat4l

OK guys.
I got into mem timings now.

*Firestrike Extreme*

1250 timings in the 1500 strap: 10879 pts
(Max with 1250 timings is about 1520MHz stable) => tested at 1500MHz.


1375 timings in the 1750 strap: 11002 pts
(Max with 1375 timings is about 1660MHz stable) => tested at 1650MHz.


Will have to test 1500 timings in the 1750 strap and also try the DD D1 stuff.


----------



## mus1mus

I think I'm on the other end of the stick regarding your methods.

First, the straps end as follows:
1000
1250
1375
1500
1625
1750, etc.

So putting 1000 timings into the 1500 strap, for example, should be complemented by putting either 1250 or 1000 timings into 1625. Same with 1750.

And test the effectiveness of that mod at the very end of the strap. So setting your OC to 1500, if that's where your mod is, is the best way to test it.

Don't mix things up. If you intend to test at 1500, mod 1500 and limit the tested frequency there.

Your last example is a fail IMO, as you can't reach 1750. Not even 1700. I'd suggest you mod 1625 and end your tests there, as the maximum performance gains (if any) can be attained there.

1626 will be passed on to the 1750 strap, and will not be as effective as 1625.
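That strap rule (a memory clock uses the timings of the lowest strap bound it does not exceed) can be sketched as a tiny lookup. The listed bounds are from the post above; the trailing 2000 entry is my assumption about the next strap up:

```python
# Pick which memory strap's timings a given clock will use.
import bisect

STRAPS = [1000, 1250, 1375, 1500, 1625, 1750, 2000]  # MHz upper bounds

def strap_for(mem_mhz):
    # bisect_left finds the first strap bound >= mem_mhz.
    # (Clocks above the last bound aren't handled in this sketch.)
    return STRAPS[bisect.bisect_left(STRAPS, mem_mhz)]

print(strap_for(1500))  # 1500 -> ends exactly on the 1500 strap
print(strap_for(1626))  # 1750 -> one MHz over 1625 falls into 1750
```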


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> I think I'm on the other end of the stick on your methods.
> 
> First, the straps ends where as follows:
> 1000
> 1250
> 1375
> 1500
> 1625
> 1750 etc.
> 
> So putting 1000 timings for 1500 strap for example should be complemented by putting either putting 1250 or 1000 timings for 1625. Same with 1750.
> 
> And test for the effectiveness of that mod on the very end of the strap. So setting your OC to 1500 if that's where your mod is, is the best way to test it.
> 
> Do not mix up things. If you intend to test at 1500, mod 1500 and limit the tested frequency there.
> 
> Your last example is a fail IMO as you can't reach 1750. Not even 1700. I'd suggest you mod 1625 and end your tests there. As the maximum performance gains(if any) can be attained there.
> 
> 1626 will be passed on to 1750 strap. And will not be as effective as 1625.


http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/420_30#post_24607299


----------



## MEC-777

Ok, so my 290's can run no problem at stock clocks (947/1250) undervolted to as low as -50mV so far. However, I made a stupid mistake and applied a 1050/1350 OC while forgetting to put the voltage back up to stock first. Immediate artifacting, then black screens. I hard-restarted, and the moment MSI AB launched when Windows booted: black screens. I got around it by bringing up Task Manager the moment I logged back into Windows and killing MSI AB before it applied the OC again. With an SSD I only had a few seconds to do this, but I managed it. Then I had to uninstall AB and reinstall it to wipe out the bad OC setting.

It wasn't until after all that that I realized: if you hold the Ctrl key after you log in to Windows, it will bypass the OC in AB.

Learn something new every day. Sometimes you learn the hard way.


----------



## AliNT77

Quote:


> Originally Posted by *MEC-777*
> 
> Ok, so my 290's can run no problem at stock clocks (947/1250) undervolted to as low as -50mV so far. However I made a stupid mistake and applied a 1050/1350 OC while forgetting to put the voltage back up to stock first. Immediate artifacting, then black screens. Hard restarted and the moment MSI AB launched when windows booted - black screens. Got around it by bringing up task manager the moment I logged back into windows and had to find and kill MSI AB before it applied the OC again. With an SSD, I only had a few seconds to do this but managed to get it. Then had to uninstall AB and reinstall it to wipe out that bad OC setting.
> 
> It wasn't until after all that, I realized and learned; if you hold the Ctrl key after you login to windows it will bypass the OC in AB.
> 
> 
> 
> 
> 
> 
> 
> Learn something new everyday. Sometimes, you learn the hard way.


-50mV sounds low to me for a BIOS-modded card.

If your BIOS is already modded (middle clock), try lower voltages.
If not, then definitely mod it.

If you ever experience a black screen at Windows boot:

Shut down your PC
Switch your card's BIOS to the other one
Boot into Windows
Shut down your PC
Switch your card's BIOS back to the one that caused the black screen
Boot into Windows
Done


----------



## Wickedtt

Hey all, finally got my hands on a Lightning. Is there a BIOS for the Samsung memory with tighter timings? I can hit 1700+ no problem so far.


----------



## MEC-777

Quote:


> Originally Posted by *AliNT77*
> 
> -50mV sounds low to me for a bios-modded card
> 
> If your bios is already modded (middle clock) try lower voltages
> If not , then definitely mod it
> 
> If U ever experienced a blackscreen @win boot :
> 
> Shut down Ur PC
> Switch your cards bios to the other one
> Boot into windows
> Shut down Ur PC
> Switch your cards bios to the bios that caused BS
> Boot into windows
> Done


Both my 290's are running on their stock BIOS. Neither has been modded.

I wasn't looking to OC and lower voltages at the same time, I just forgot to bring the voltage back up when applying a new OC.

It didn't give me a black screen until the moment MSI AB launched. But it's OK; if it happens again, all I have to do is hold Ctrl after logging into Windows to bypass the current OC settings.

Both cards run just fine at stock clocks with -50mV.


----------



## Renton577

http://www.techpowerup.com/gpuz/details.php?id=67wgm
http://www.techpowerup.com/gpuz/details.php?id=5x2d6

I have 290X CF; the primary card is an ASUS while the second is a Sapphire, both on stock reference coolers. Been really happy with them, wayyyy more than my 970.


----------



## Roboyto

@kizwan

@rdr09

@fat4l

@AliNT77

I think I've found a workaround for the auto driver update issue in Win10.

I started poking around inside the automatic update settings and this is what I changed:

Go to Windows Update and then into Advanced Options



I altered:

Choose how updates are installed

Defer Upgrades

Choose how updates are delivered



Once in 'Choose how updates are delivered', change it to OFF



Once you're through those you're heading to Device Manager > Display Adapters > Properties > Roll Back Driver



I just clicked on 'Roll Back Driver', my monitors went black for a moment and came back on. 'Roll Back Driver' is now greyed out. CCC was no longer in the tray so I opened it manually to find that I'm on 15.7

I've rebooted 5 times consecutively and I'm still good







I presume the last driver before whatever Windows installs must be 15.7 for this to work correctly. So if need be, run DDU, install 15.7, let it auto-install the 15.11 beta, and follow the process.

I will verify this method on my HTPC which is also running Win10 with a 290. I checked for updates on it and there was a large Win10 upgrade from 11/11/15 pending. That is installing now...maybe they will have fixed this issue...?

Well, the Win10 'upgrade' didn't fix anything as it still automatically installed the latest beta drivers. However, using the 'Roll Back Driver' function in Device Manager appears to be successful on the other computer as well


----------



## AliNT77

Quote:


> Originally Posted by *Renton577*
> 
> http://www.techpowerup.com/gpuz/details.php?id=67wgm
> http://www.techpowerup.com/gpuz/details.php?id=5x2d6
> 
> have 290x CF, primary card is an ASUS while the second is a Sapphire both on stock reference coolers, been really happy with them, wayyyy more than my 970


nice

what about temps?
if it's high then definitely try undervolting them using this little guide:

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/40170#post_24575831

it should decrease the temps & power consumption of your cards

(always wanted to test UVing on a reference card, but mine is under water and my reference cooler is salvaged, so...)


----------



## fat4l

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> @kizwan
> 
> @rdr09
> 
> @fat4l
> 
> @AliNT77
> 
> I think I've found a workaround for the auto driver update issue in Win10.
> 
> I started poking around inside the automatic update settings and this is what I changed:
> 
> Go to Windows Update and then into Advanced Options
> 
> 
> 
> I altered:
> Choose how updates are installed
> Defer Upgrades
> Choose how updates are delivered
> 
> 
> 
> Once in 'Choose How Updates are Delivered" change it to OFF
> 
> 
> 
> Once you're through those you're heading to Device Manager > Display Adapters > Properties > Roll Back Driver
> 
> 
> 
> I just clicked on 'Roll Back Driver', my monitors went black for a moment and came back on. 'Roll Back Driver' is now greyed out. CCC was no longer in the tray so I opened it manually to find that I'm on 15.7
> 
> I've rebooted 5 times consecutively and I'm still good
> 
> 
> 
> 
> 
> 
> 
> I presume the last driver before whatever Windows installs must be 15.7 for this to work correctly. So if need be run DDU, install 15.7, let it auto-install 15.11 beta and follow the process.
> 
> I will verify this method on my HTPC which is also running Win10 with a 290. I checked for updates on it and there was a large Win10 upgrade from 11/11/15 pending. That is installing now...maybe they will have fixed this issue...?
> 
> Well, the Win10 'upgrade' didn't fix anything as it still automatically installed the latest beta drivers. However, using the 'Roll Back Driver' function in Device Manager appears to be successful on the other computer as well


Thanks for that!

+rep
I will try it as I will be reinstalling my Win 10 soon...

Anyway, my final results are:
1200MHz cores
1700MHz mems
1376-1500 strap timings used for 1626+ MHz
This gives me the best results in terms of performance and stability. No artifacts even in long runs.

For comparison, this is my old "untuned" 295X2 at 1250/1650MHz:
~1300pts less...


----------



## rdr09

Quote:


> Originally Posted by *Roboyto*
> 
> @kizwan
> 
> @rdr09
> 
> @fat4l
> 
> @AliNT77
> 
> I think I've found a workaround for the auto driver update issue in Win10.
> 
> I started poking around inside the automatic update settings and this is what I changed:
> 
> Go to Windows Update and then into Advanced Options
> 
> 
> 
> I altered:
> Choose how updates are installed
> Defer Upgrades
> Choose how updates are delivered
> 
> 
> 
> Once in 'Choose How Updates are Delivered" change it to OFF
> 
> 
> 
> Once you're through those you're heading to Device Manager > Display Adapters > Properties > Roll Back Driver
> 
> 
> 
> I just clicked on 'Roll Back Driver', my monitors went black for a moment and came back on. 'Roll Back Driver' is now greyed out. CCC was no longer in the tray so I opened it manually to find that I'm on 15.7
> 
> I've rebooted 5 times consecutively and I'm still good
> 
> 
> 
> 
> 
> 
> 
> I presume the last driver before whatever Windows installs must be 15.7 for this to work correctly. So if need be run DDU, install 15.7, let it auto-install 15.11 beta and follow the process.
> 
> I will verify this method on my HTPC which is also running Win10 with a 290. I checked for updates on it and there was a large Win10 upgrade from 11/11/15 pending. That is installing now...maybe they will have fixed this issue...?
> 
> Well, the Win10 'upgrade' didn't fix anything as it still automatically installed the latest beta drivers. However, using the 'Roll Back Driver' function in Device Manager appears to be successful on the other computer as well


Hard to spoiler your post. lol

You have Win 10 Home or Pro?

+rep


----------



## Roboyto

Quote:


> Originally Posted by *fat4l*
> 
> Thanks for that !
> 
> 
> 
> 
> 
> 
> 
> +rep
> I will try it as I will be reinstalling my Win 10 soon...
> 
> Anyway..
> My final results are,
> 1200MHz Cores
> 1700MHz Mems
> 1376-1500 strap timings used for 1626+ MHz
> This gives me the best results in terms of performance and stability. No artifacts, even in long runs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For comparison, this is my old "untuned" 295X2, 1250/1650MHz.
> ~1300pts less...


You're welcome. Nice clocks with the Ares. It makes 1200/1700 with no additional power/voltage?

Quote:


> Originally Posted by *rdr09*
> 
> Hard to spoiler your post. lol
> 
> You have Win 10 Home or Pro?
> 
> +rep


Both are running Pro. I was getting really salty







and was nearly ready to bust out my Win7 flash drive and roll back.

I went into the Windows Feedback app and submitted a post for the auto-update issue. If you're on Win10 give me some up-votes!
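For anyone who would rather script the "stop Windows fetching drivers" part of the workaround above: the following is a sketch of my own (an assumption, not something from the thread) that flips the documented "Device installation settings" registry value. Windows-only, needs an elevated prompt, and whether your particular Win10 build honors it is worth verifying.

```python
# Sketch (assumption, not from the thread): set the documented
# "Device installation settings" registry value so Windows stops
# downloading driver software from Windows Update.
# Windows-only; run from an elevated (administrator) Python.
import winreg

key = winreg.CreateKeyEx(
    winreg.HKEY_LOCAL_MACHINE,
    r"SOFTWARE\Microsoft\Windows\CurrentVersion\DriverSearching",
    0,
    winreg.KEY_SET_VALUE,
)
# 0 = do not search Windows Update for drivers
winreg.SetValueEx(key, "SearchOrderConfig", 0, winreg.REG_DWORD, 0)
winreg.CloseKey(key)
```

DDU plus a clean 15.7 install still applies as described; this just keeps Windows Update from pulling a beta driver back in afterward.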


----------



## MEC-777

When I first upgraded to Windows 10, the first thing I did was disable the auto updates, just as @Roboyto described. Since then, it's never messed with or changed my drivers. It's always let me install the drivers I want. Never had to do the driver rollback either.


----------



## Renton577

Quote:


> Originally Posted by *AliNT77*
> 
> nice
> 
> 
> 
> 
> 
> 
> 
> 
> 
> what about temps?
> if its high then definitely try Undervolting them using this little guide :
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/40170#post_24575831
> 
> it should decrease temps & power consumption of your cards
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (always wanted to test UVing on a reference card but mine is under water and my reference cooler is salvaged so...
> 
> 
> 
> 
> 
> 
> 
> )


Sometimes they reach the throttling point, but only for a few seconds while the fans kick in. Once I've been playing for a while, the first card usually hovers around 86°C while the second sits at about 78°C.


----------



## taem

So I have a single 290. I was thinking I'd wait for the next gen of card but I have no clue how long that will be. If I got a 980 ti or Fury X now, how much of an upgrade would that be over a 290 running at 1160 core? My display is 2560x1440.


----------



## AliNT77

Quote:


> Originally Posted by *Renton577*
> 
> Sometimes they reach the throttling point, but only for a few seconds while the fans kick in. Once I've been playing for a while, the first card usually hovers around 86°C while the second sits at about 78°C.


How much did you UV them?


----------



## Roboyto

Quote:


> Originally Posted by *taem*
> 
> So I have a single 290. I was thinking I'd wait for the next gen of card but I have no clue how long that will be. If I got a 980 ti or Fury X now, how much of an upgrade would that be over a 290 running at 1160 core? My display is 2560x1440.


The 980ti Lightning review shows Fury X catching up to reference 980Ti in lower resolutions, matching it at 2560x1440, and even slightly surpassing it at 4K.

http://www.techpowerup.com/reviews/MSI/GTX_980_Ti_Lightning/23.html

From the looks of this you would see roughly a 15-20% performance bump.

This review is with the latest AMD drivers. It appears, per the usual trend, AMD's hardware is getting better with age. I'm no Nvidia flag waver by a long shot, but it's no secret that even the worst OCing Maxwell cards will easily give you a 15-20% performance bump with an overclock.

Fury X can get somewhere around a 5% bump from an overclock, but you could save yourself a couple bucks and grab a regular Fury, which is typically only 5-7% behind the X. The Sapphire/XFX versions retained the reference board and wisely still stuck a full-length cooler on them. The last 1/3 of the card is free of the normal airflow restrictions, giving the cards impressive thermals without watercooling. The XFX Fury can be had on Amazon right now for $550, and the Sapphire for $565.

If you really need to upgrade you could, but the die shrink alone for next gen should yield large gains, plus next gen HBM, AND whatever other tricks both camps have up their sleeves. If you're set on a Fury/980 then I would still wait for next gen to drop so the prices on these come down.


----------



## mus1mus

Quote:


> Originally Posted by *taem*
> 
> So I have a single 290. I was thinking I'd wait for the next gen of card but I have no clue how long that will be. If I got a 980 ti or Fury X now, how much of an upgrade would that be over a 290 running at 1160 core? My display is 2560x1440.


I have benched the 980 Ti to the max, and I can say that, max OC to max OC, the Ti's average lands around a maxed-out 290's best.

This comparison was based off a 980ti at 1520/2080 vs my Hynix 290 at 1220/1725 with the added boost from using a 390X bios.

I swapped the TI for 2 290s though.


----------



## fat4l

Quote:


> Originally Posted by *Roboyto*
> 
> You're welcome. Nice clocks with the Ares. It makes 1200/1700 with no additional power/voltage?


No







Ares is not that great tbh...
Gpu1 has dpm7=1.387v
Gpu2 has dpm7=1.425v

These values are set in the BIOS. They include 12.5mV extra voltage, for the worst-case scenario.
Not sure what the droop voltages are, as I'm on my laptop now and can't check, but it's around 1.35V max and 1.38V max...


----------



## kizwan

Quote:


> Originally Posted by *fat4l*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roboyto*
> 
> You're welcome. Nice clocks with the Ares. It makes 1200/1700 with no additional power/voltage?
> 
> 
> 
> No
> 
> 
> 
> 
> 
> 
> 
> Ares is not that great tbh...
> Gpu1 has dpm7=1.387v
> Gpu2 has dpm7=1.425v
> 
> These values are set in bios. They include 12.5mV extra voltage, for the worst case scenario.
> Not sure what droop voltages are as I'm on my laptop now and can't check but it's around 1.35v_max and 1.38v_max....
Click to expand...

What is the average increase in FPS when gaming at 1200/1250 over stock clocks?


----------



## Caveat

Hello,

I have two Sapphire R9 290Xs in Crossfire. One is about two years old now, and I've had the other for a year (bought it from another guy, so I'm not sure how old it is). They're not getting hotter than 70°C, so that's good. But when do I need to refresh the thermal paste? Is there a rule like "refresh it every year or every two years," or do you just do it when you think it needs to be done?


----------



## rdr09

Quote:


> Originally Posted by *Caveat*
> 
> Hello,
> 
> I have two Sapphire R9 290Xs in Crossfire. One is about two years old now, and I've had the other for a year (bought it from another guy, so I'm not sure how old it is). They're not getting hotter than 70°C, so that's good. But when do I need to refresh the thermal paste? Is there a rule like "refresh it every year or every two years," or do you just do it when you think it needs to be done?


70 . . . do not touch them. Less handling = Less chance of borking them. And other components.


----------



## Caveat

Quote:


> Originally Posted by *rdr09*
> 
> 70 . . . do not touch them. Less handling = Less chance of borking them. And other components.


Alright. Thanks for the reply ^^


----------



## gatygun

Quote:


> Originally Posted by *taem*
> 
> So I have a single 290. I was thinking I'd wait for the next gen of card but I have no clue how long that will be. If I got a 980 ti or Fury X now, how much of an upgrade would that be over a 290 running at 1160 core? My display is 2560x1440.


If you also overclock that Ti, you'd probably gain around double the performance if you buy the Lightning model.


----------



## mus1mus

Hmmm, not quite.


----------



## Ha-Nocri

Quote:


> Originally Posted by *gatygun*
> 
> If you also overclock that Ti, you'd probably gain around double the performance if you buy the Lightning model.


That's the worst-case scenario with nVidia features turned on. The 390X is usually on par with the 980, and in your example it's slower than a 970.


----------



## gatygun

Quote:


> Originally Posted by *Ha-Nocri*
> 
> That's the worst-case scenario with nVidia features turned on. The 390X is usually on par with the 980, and in your example it's slower than a 970.


Settings for that benchmark:
Quote:


> Our test run has enabled:
> 
> DX11
> Ultra mode
> AA enabled
> 16x AF enabled
> SSAO enabled
> Nvidia hairworks OFF
> Other settings ON


Hairworks is disabled.

I picked this one because it was one of the newer games, to see how it compares.

The 390X will indeed perform around 980 levels at 1440p. It will be slower when more Nvidia features are enabled or heavy tessellation performance is needed, and about the same speed when they're not, in most of the other benchmarks out there. The higher the resolution, the smaller the difference between them; the lower the resolution, the more the 980 pulls away. Also, the 390X will need a fast CPU.


----------



## mus1mus

Clearly an nVidia Title for a reason.

Plus, it may also mean that the tested card was a reference. Hello throttling!









No question, the Ti is a fast card. But it won't really mean doubling a 290's capability. Heck, crossfired 290s will smoke a Ti in crossfire-supported games, just to give you an example.


----------



## MEC-777

It totally depends on the specific game(s) in question. In some, Nvidia cards run away; in others, AMD cards do. You could take the average performance across a wide range of games, but the problem is that it doesn't tell the whole story. My two 290s nearly match a 980 Ti at max settings at 1440p. But in The Witcher 3 they just tank (frame rates are fine, but it's VERY rough/stuttery).

So it's not enough to say something like, "this card is 2x faster than that card" because there needs to be more context. In what game? At what resolution? etc. etc.









FYI folks, AMD just dropped their 15.11.1 beta drivers. Includes optimizations for Fallout 4 (and hopefully crossfire support).


----------



## Ha-Nocri

Quote:


> Originally Posted by *gatygun*
> 
> Settings for that benchmark:
> Hairworks is disabled,
> 
> I picked this one because it was one of the newer games to see how it compares.
> 
> The 390X will indeed perform around 980 levels at 1440p. It will be slower when more Nvidia features are enabled or heavy tessellation performance is needed, and about the same speed when they're not, in most of the other benchmarks out there. The higher the resolution, the smaller the difference between them; the lower the resolution, the more the 980 pulls away. Also, the 390X will need a fast CPU.


Those results are strange. Other Witcher 3 benchmarks show a different story:


----------



## rdr09

Quote:


> Originally Posted by *Ha-Nocri*
> 
> Those results are strange. Other Witcher 3 benchmarks show a different story:


Not sure what's up with g3D (actually, I do). I was just pointing out to another member a score in this bench . . .

290 stock with a 5960X @ 4.4GHz



290 stock with an i7 Sandy at 4.5GHz



I am sure there are some who based their purchase on reviews like those.


----------



## Renton577

Quote:


> Originally Posted by *AliNT77*
> 
> How much did you UV them?


I didn't. I set both cards to a max fan speed of 80% so they don't throttle while initially heating up, and those temps are what they hover at once the fans have normalized. Also, the second card is now overclocked to 1080 on the core and 1350 on the memory, so the fan speeds are a little higher when they level out, but I left the primary card at stock clocks.


----------



## kizwan

@mus1mus, this is for the load voltage with +200mV you asked about earlier in the 390/390X thread. With the 390 modded BIOS, the voltage droops a lot. I think this is why overclockability is lowered a little bit when using the 390 modded BIOS. @fyzzz, what do you think?

mus1mus's 290 Tweaked BIOS. DPM7 set to 1.25V. (Bus 1)

GPU-Z Sensor Log - ELPIDA 1200-1625 +200mV AUX +50mV Fire Strike
*1.383 - 1.398V*


Insan1ty's 290_ELPIDA_STOCK_V1.7 BIOS (390X modded BIOS). DPM7 is not set. (Bus 1)

GPU-Z Sensor Log - ELPIDA 1180-1600 +200mV AUX +13mV Fire Strike
*1.297 - 1.32V*
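Those min/max figures can be pulled from the sensor log programmatically instead of eyeballing the file. A sketch, assuming a comma-separated GPU-Z log with a `VDDC` column; the exact column header varies by GPU-Z version and card, so treat the name and the sample rows as assumptions:

```python
# Sketch: extract the min/max core voltage from GPU-Z sensor-log text.
# The "VDDC" column name and comma-separated layout are assumptions
# about the log format; adjust them to match your GPU-Z version.
import csv
import io

def vddc_range(log_text):
    """Return (min, max) of the VDDC column from GPU-Z sensor-log text."""
    rows = csv.DictReader(io.StringIO(log_text), skipinitialspace=True)
    volts = []
    for row in rows:
        # GPU-Z pads fields with spaces; strip them before parsing
        cleaned = {k.strip(): v.strip() for k, v in row.items() if k and v}
        if cleaned.get("VDDC"):
            volts.append(float(cleaned["VDDC"]))
    return min(volts), max(volts)

# Hypothetical three-sample log, for illustration only
sample = """Date, GPU Core Clock [MHz], VDDC
2015-11-15 10:00:01, 1200, 1.297
2015-11-15 10:00:02, 1200, 1.320
2015-11-15 10:00:03, 1200, 1.305
"""
print(vddc_range(sample))  # (1.297, 1.32)
```

Handy when comparing two BIOSes: run the same Fire Strike pass under each, log with GPU-Z, and diff the ranges.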


----------



## fyzzz

Yeah, I always get a much lower voltage on the 390 BIOS; that's why I always set 1250 in the DPM 7 state, which fixes it. The voltage table differs quite a bit too.


----------



## kizwan

Quote:


> Originally Posted by *fyzzz*
> 
> Yeah, I always get a much lower voltage on the 390 BIOS; that's why I always set 1250 in the DPM 7 state, which fixes it. The voltage table differs quite a bit too.


Maybe I should set mine to 1300 for DPM7. It should be safe because it will droop when under load anyway, right?


----------



## mus1mus

Wow. Nice find.

My problem happens on any BIOS though. I've tried 1.281, which produced the same results. Hmmmm


----------



## fyzzz

Quote:


> Originally Posted by *kizwan*
> 
> Maybe I should set mine to 1300 for DPM7. It should be safe because it will droop when under load anyway, right?


Yep, should be no problem. This got me thinking about playing around with different values in DPM 7; I always put 1250 in and adjust with Trixx. With 1250 in DPM 7 and +200mV, I get around ~1.3V under load and some spikes to 1.4V+.
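The arithmetic behind those numbers, as a sketch with illustrative values taken from the posts above (the droop figure is just what these example values imply, not a measured constant):

```python
# Droop arithmetic: requested voltage = DPM7 table value + software offset;
# the VRM's load-line then lets it sag ("droop") under load.
# Numbers below are illustrative, pulled from the discussion above.
dpm7 = 1.250           # V, highest DPM state set in the BIOS voltage table
offset = 0.200         # V, the +200 mV added in Trixx/Afterburner
requested = dpm7 + offset           # what the card asks the VRM for
observed_load = 1.30   # V, roughly what GPU-Z reports under load
droop = requested - observed_load   # the load-line sag
print(f"requested {requested:.3f} V, droop {droop * 1000:.0f} mV")
```

So a 1250 DPM7 entry plus +200mV really requests ~1.45V, and the ~1.30V GPU-Z reading implies roughly 150mV of droop, which is why the 390 BIOS (with no DPM7 set) ends up reading so much lower.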


----------



## gatygun

Quote:


> Originally Posted by *rdr09*
> 
> Not sure what's up with g3D (actually I do)? I was just pointing out to another member about a score in this bench . . .
> 
> 290 stock with a 5960X @ 4.4GHz
> 
> 
> 
> 290 stock with an i7 Sandy at 4.5GHz
> 
> 
> 
> I am sure there are some who based their purchase on reviews like those.


LOL, wait, what? They picked the total 3DMark score?

hahahaha
Quote:


> Originally Posted by *Ha-Nocri*
> 
> Those results are strange. Other Witcher 3 benchmarks show different story:


Yeah, it's indeed faster there. Dunno about that then.


----------



## fat4l

Here is my score for a single card, 1250/1700MHz.
15030 Graphics score


----------



## Kittencake

K, I have the Corsair HG10 bracket on with an H60 on my 290X.. XD Now what's next?


----------



## rdr09

Quote:


> Originally Posted by *gatygun*
> 
> LOL wait what they picked the total 3dmark score?
> 
> hahahaha


yah, overall score. i used to look up to Guru3D. lol. now it's just another Tom's. and people use these review sites for GPU purchases.









Quote:


> Originally Posted by *Kittencake*
> 
> K i have the hg10 corsair bracket on with a h60 on my 290x.. XD now whats next


what do you have to cool the vrms, especially vrm1?

Go check post #21896 . . . scroll down to "My card runs too hot" . . .

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890


----------



## corey3333

I have two Sapphire AMD R9 290Xs and need help registering on this forum. Not to sound dumb, but I have built computers all my life and have never been to, read, or posted on a forum before. I need help joining the official AMD 290X club and crossfiring my cards. Thanks to whoever responds.


----------



## MEC-777

Quote:


> Originally Posted by *rdr09*
> 
> what do you have to cool the vrms, especially vrm1?


With the Corsair HG10, the bracket itself makes contact with the VRMs and acts as a heatsink, unlike the NZXT G10, which needs additional heatsinks for the VRMs. It also uses the stock blower fan to move air across the bracket to help cool the VRMs and VRAM modules. You can set the fan to a constant 30%, which is enough to cool everything and remain inaudible. It's actually a well-designed product and a great cooling upgrade for those with reference 290/Xs.


----------



## rdr09

Quote:


> Originally Posted by *MEC-777*
> 
> With the Corsair HG10, the bracket itself makes contact with the VRMs and acts as a heatsink, unlike the NZXT G10, which needs additional heatsinks for the VRMs. It also uses the stock blower fan to move air across the bracket to help cool the VRMs and VRAM modules. You can set the fan to a constant 30%, which is enough to cool everything and remain inaudible. It's actually a well-designed product and a great cooling upgrade for those with reference 290/Xs.


Did it come with thermal pads, or did you use something better?
Quote:


> Originally Posted by *corey3333*
> 
> I have two Sapphire AMD R9 290Xs and need help registering on this forum. Not to sound dumb, but I have built computers all my life and have never been to, read, or posted on a forum before. I need help joining the official AMD 290X club and crossfiring my cards. Thanks to whoever responds.


A pic of your cards and a sign with your name on it (corey3333). I think the OP accepts a screenshot of GPU-Z.

But we'd rather see your rig.


----------



## alancsalt

Quote:


> Originally Posted by *corey3333*
> 
> I have two Sapphire AMD R9 290Xs and need help registering on this forum. Not to sound dumb, but I have built computers all my life and have never been to, read, or posted on a forum before. I need help joining the official AMD 290X club and crossfiring my cards. Thanks to whoever responds.


R9 290X & 290 Club Roster

To be added on the member list please submit the following in your post

1. GPU-Z Link with OCN name or Screen shot of GPU-Z validation tab open with OCN Name or finally a pic of GPU with piece of paper with OCN name showing.
2. Manufacturer & Brand - Please don't make me guess.
3. Cooling - Stock, Aftermarket or 3rd Party Water
Please note if you do not provide manufacturer, brand or cooling - by default Sapphire Stock Cooling will be chosen for you.


----------



## corey3333




----------



## corey3333




----------



## corey3333

http://www.techpowerup.com/gpuz/details.php?id=56ehc

Sorry, my camera sucks; these are the best pics I can send. Now can I become a member? I need help crossfiring my cards, please. Corey3333


----------



## alancsalt

GPU-Z, with a G - you can download from https://www.techpowerup.com/downloads/SysInfo/GPU-Z/


----------



## mus1mus

Quote:


> Originally Posted by *fat4l*
> 
> Here is my score for a single card, 1250/1700MHz.
> 15030 Graphics score


Maybe you can be up for a challenge.








http://www.3dmark.com/3dm11/10544612


----------



## kizwan

Quote:


> Originally Posted by *corey3333*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/details.php?id=56ehc
> 
> Sorry, my camera sucks; these are the best pics I can send. Now can I become a member? I need help crossfiring my cards, please. Corey3333


What kind of problem are you having with crossfire?


----------



## MEC-777

Quote:


> Originally Posted by *corey3333*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.techpowerup.com/gpuz/details.php?id=56ehc
> 
> 
> 
> Sorry, my camera sucks; these are the best pics I can send. Now can I become a member? I need help crossfiring my cards, please. Corey3333


Sweet system! Those Vapor-X cards are awesome.









Can you give us more info about what kind of help you need? What graphics card(s) were you running before this (if any)?


----------



## rdr09

Quote:


> Originally Posted by *corey3333*


Very nice. How are your temps, though? Those HDDs aren't blocking airflow?

I have an EVGA 850 and I only use one power cable (daisy-chained) per card. No issues, just like when I had an EVGA 1300.


----------



## MEC-777

Quote:


> Originally Posted by *rdr09*
> 
> Very nice. How are your temps though? Those HDDs not blocking air flow?
> 
> I have an evga 850 and i only use one power cable (daisy chained) per card. No issues just like when i had an evga 1300.


I think that is an earlier version of the EVGA G2 1300, as it has the red PCIe cables, which were later changed to black ones from the factory (people complained about the red). It also looks like those cables are a single 8-pin per cable, vs 8-pin plus 6+2 on each cable like you mentioned.

Also have an EVGA 850 (GS) and it has black PCIe cables (thank gawd







) and 8-pin plus 6+2 connectors on each cable, same as yours. As long as the wire gauge is good for beyond the rated wattage, it's fine.


----------



## rdr09

Quote:


> Originally Posted by *MEC-777*
> 
> I think that is an earlier version of the EVGA G2 1300, as it has the red PCIe cables, which were later changed to black ones from the factory (people complained about the red). It also looks like those cables are a single 8-pin per cable, vs 8-pin plus 6+2 on each cable like you mentioned.
> 
> Also have an EVGA 850 (GS) and it has black PCIe cables (thank gawd
> 
> 
> 
> 
> 
> 
> 
> ) and 8 plus 6+2 connectors on each cable, same as yours. As long as the gauge of the wires is enough for beyond the rated wattage, it's fine.


i see. yah, i had a P2 1300 and now a G2 850. Both have the same cables, daisy-chained. The G2 is newer.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> Maybe you can be up for a challenge.
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/10544612


Cow... 1330MHz on the core? WTH?


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> Cow... 1330MHz on the core? WTH?


i did the same almost two years ago . . .

http://www.3dmark.com/3dm/2098310

extreme though. wonder what the difference would be with a newer driver like Crimson?


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> i did the same almost two years ago . . .
> 
> http://www.3dmark.com/3dm/2098310
> 
> extreme though. wonder what would be the difference with newer driver like the crimson?


You guys got lucky with those cards. That's nuts. Full cover blocks?


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> You guys got lucky with those cards. That's nuts. Full cover blocks?


No other way.


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> Maybe you can be up for a challenge.
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/10544612


no








1330MHz is .....not happening... and I test only [email protected]_artifacts clocks


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> No other way.


I can't even get a block for my good card. Hohum


----------



## gatygun

1330 god dam that's a lot.


----------



## mus1mus

Quote:


> Originally Posted by *fat4l*
> 
> no
> 
> 
> 
> 
> 
> 
> 
> 
> 1330MHz is .....not happening... and I test only [email protected]_artifacts clocks


I am not artifacting at that run (evil grin).









1300 is.

I think a colder ambient can raise it more.


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> I am not artifacting at that run (evil grin).
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1300 is.
> 
> I think a colder ambient can raise it more.











What volts are reported in GPU-Z as the max value?


----------



## corey3333

Thank you, building computers is my passion. Here are my specs:

Motherboard: Asus ROG Crosshair V Formula-Z, NB950/SB910 RD9x0
CPU: AMD FX-8350 Vishera, 125W TDP
RAM: HyperX Fury 16GB (2x8GB) kit, 1866MHz, CL9-10-11-27
Power supply: EVGA 1300W G2 SuperNOVA Gold, with "Super Flower components"
Graphics cards: 2x AMD Sapphire Vapor-X Radeon R9 290X, 10 VRMs, UEFI, w/GCN Mantle API; both cards identical, air cooled

About the crossfire problems... This is my fourth set of AMD cards I have tried to crossfire. First set, 2x AMD Radeon HD 7750: fail. FPS dropped to about 9 or 10; games unplayable.
Next set, 2x AMD Radeon HD 7790: fail. Frame rate 11 to 14.
Next set again, 2x AMD Radeon R9 270X: fail. Low frame rate and very glitchy.
Now, with the 2x Sapphire cards I am using right now, on one card I usually get top frame rates in all games, around 128 to 140 FPS depending on the game. When I enable crossfire, frames drop to around 32 and it gets a little glitchy.

And yes, I got a new power supply, so that's not it. I have gone through three motherboards, still no fix. Got a new CPU, going from a four-core to the 8-core I'm using now (as you see above). I bought new RAM three times. With the first set of graphics cards I had Windows XP, then went to Win 7 SP1 64-bit; now I run Windows 10 64-bit. I even bought a brand new monitor (I have 4 or 5 monitors to use), so that's not it. My latest problem with this setup: with crossfire ON, most of my games drop way down in frame rate to where they are unplayable. In more modern games like The Witcher 3 and Assassin's Creed Syndicate, with crossfire OFF I get 32 to 46 FPS; with crossfire ON I get a consistent 60 max, but the game gets glitchy and images flicker.

I really hope someone can help. Thnx.. Corey3333


----------



## mus1mus

Quote:


> Originally Posted by *fat4l*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> what volts reported in gpu-z as max value ?


I need a flat 1.4 on load to do that.


----------



## code33

Quote:


> Originally Posted by *corey3333*
> 
> Thank you, building computers is my passion. Here are my specs:
> 
> Motherboard: Asus ROG Crosshair V Formula-Z, NB950/SB910 RD9x0
> CPU: AMD FX-8350 Vishera, 125W TDP
> RAM: HyperX Fury 16GB (2x8GB) kit, 1866MHz, CL9-10-11-27
> Power supply: EVGA 1300W G2 SuperNOVA Gold, with "Super Flower components"
> Graphics cards: 2x AMD Sapphire Vapor-X Radeon R9 290X, 10 VRMs, UEFI, w/GCN Mantle API; both cards identical, air cooled
> 
> About the crossfire problems... This is my fourth set of AMD cards I have tried to crossfire. First set, 2x AMD Radeon HD 7750: fail. FPS dropped to about 9 or 10; games unplayable.
> Next set, 2x AMD Radeon HD 7790: fail. Frame rate 11 to 14.
> Next set again, 2x AMD Radeon R9 270X: fail. Low frame rate and very glitchy.
> Now, with the 2x Sapphire cards I am using right now, on one card I usually get top frame rates in all games, around 128 to 140 FPS depending on the game. When I enable crossfire, frames drop to around 32 and it gets a little glitchy.
> 
> And yes, I got a new power supply, so that's not it. I have gone through three motherboards, still no fix. Got a new CPU, going from a four-core to the 8-core I'm using now (as you see above). I bought new RAM three times. With the first set of graphics cards I had Windows XP, then went to Win 7 SP1 64-bit; now I run Windows 10 64-bit. I even bought a brand new monitor (I have 4 or 5 monitors to use), so that's not it. My latest problem with this setup: with crossfire ON, most of my games drop way down in frame rate to where they are unplayable. In more modern games like The Witcher 3 and Assassin's Creed Syndicate, with crossfire OFF I get 32 to 46 FPS; with crossfire ON I get a consistent 60 max, but the game gets glitchy and images flicker.
> 
> I really hope someone can help. Thnx.. Corey3333


----------



## code33

Quote:


> Originally Posted by *code33*


The 7000 series weren't good cards to crossfire. I think the 200 series is optimized a lot better.


----------



## By-Tor

Not the highest around, but the best I've been able to get out of my system... I'm happy with it...


----------



## corey3333

Not at all... Front and side case fans suck air in; rear, top, and side fans blow out. Plus my top two fans are Scythe Ultra 3000s, 38mm thick, and all the fans (the important ones anyway) are connected to a riosmart 6. Temps while gaming: CPU 41°C/44°C max; graphics cards usually 58°C/61°C in demanding games. I crank all the case fans up to max to get those low temps. NO water cooling, just AIR. Thanks for the comment.


----------



## mfknjadagr8

Quote:


> Originally Posted by *corey3333*
> 
> Not at all... Front and side case fans suck air in; rear, top, and side fans blow out. Plus my top two fans are Scythe Ultra 3000s, 38mm thick, and all the fans (the important ones anyway) are connected to a riosmart 6. Temps while gaming: CPU 41°C/44°C max; graphics cards usually 58°C/61°C in demanding games. I crank all the case fans up to max to get those low temps. NO water cooling, just AIR. Thanks for the comment.


this site has a function called rig builder it does basically what you have done but is easier to see the whole rig at a glance...


----------



## corey3333

http://www.3dmark.com/3dm11/10548186


----------



## rdr09

@corey3333, you're gonna need to OC your CPU. You can't leave the CPU at stock or at a low OC with two 290Xs; try 4.5GHz or higher. Your score is about equal to a single-GPU score.

Could it be that the cards, or one card, is throttling? Check your VRM temps, especially VRM1. Use GPU-Z, but set the reading to show the max temp.

Here was my run with a highly oc'ed 290 though . . .

http://www.3dmark.com/3dm11/8776470


----------



## Paul17041993

The graphics score is correct for the configuration; it's just that 3DMark's crummy math means the physics score drops the greater the graphics score is, and the total score ends up being something stupid.


----------



## EpicOtis13

Quote:


> Originally Posted by *By-Tor*
> 
> Not the highest around, but the best I've been able to get out of my system... I'm happy with it...


That seems low; at 1200/1525 I'm getting 19338, which is a huge gain over your score.


----------



## rdr09

Quote:


> Originally Posted by *Paul17041993*
> 
> The graphics score is correct for the configuration, just that 3Dmark's crummy maths means the physics score drops the greater the graphics score is as well as the total score being something stupid.


the same thing is going to happen to my system if i put the Sandy back to stock. i think only a really powerful CPU, or the newer ones with high IPC, will work at stock when combined with two 290(X)s. you've got to consider that these Hawaii cards in crossfire are a tad faster than a single 980 Ti, especially when OC'ed.

Futuremark results can mirror what happens in gaming. i think my 290s at stock scored a bit higher in graphics score compared to corey's.


----------



## kizwan

Quote:


> Originally Posted by *EpicOtis13*
> 
> Quote:
> 
> 
> 
> Originally Posted by *By-Tor*
> 
> Not the highest around, but the best I've been able to get out of my system... I'm happy with it...
> 
> 
> 
> 
> 
> That seem low, at 1200/1525 I'm getting 19338 which is a huge gain over your score.
Click to expand...

Do not compare the overall score; compare the graphics score instead. The graphics score is indeed lower than it should be, and this is compared with my old score, with older drivers and a pair of 290s at 1200/1500.

I'm using two samples for comparison, both with different driver versions, and both with older drivers. The differences between mine and his were ~330 and ~560 respectively. Not a lot, but remember this is 290Xs vs 290s in crossfire. I'm pretty sure this is because the Powercolor BIOS is less optimized than the other brands' BIOSes. I have seen other Powercolor 290 scores before and they always seem to score lower too. Feel free to dig through this thread for other Powercolor Fire Strike scores.
http://www.3dmark.com/fs/2215393
http://www.3dmark.com/fs/2825429

However, a bench score is a poor representation of the cards' gaming performance. I'm pretty sure the FPS difference between his and mine is less than 2 FPS, which is nothing really. His cards are probably a couple of FPS behind other 290Xs, which is still not a lot. Does this affect his gaming experience? Definitely NOT. So @By-Tor...


----------



## corey3333

I already have the CPU OC'ed in the ROG BIOS. Benchmarks went up but the MHz did not. What software do you suggest for overclocking? When I had Windows 7 my CPU ran at 4446MHz, but after upgrading to Windows 10 it now runs at 4.05GHz.


----------



## Gumbi

Quote:


> Originally Posted by *corey3333*
> 
> I already have the CPU OC'ed in the ROG BIOS. Benchmarks went up but the MHz did not. What software do you suggest for overclocking? When I had Windows 7 my CPU ran at 4446MHz, but after upgrading to Windows 10 it now runs at 4.05GHz.


How are you reading the CPU clock speeds in Windows?


----------



## mus1mus

Who would be brave enough to try my best scoring BIOS so far?

PT1- Based

http://www.3dmark.com/compare/fs/6524731/fs/6524712


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> Who would be brave enough to try my best scoring BIOS so far?
> 
> PT1- Based
> 
> http://www.3dmark.com/compare/fs/6524731/fs/6524712


Welp, nice, you have beaten my score. I did at least 15300 at a lower clock. I need to find a magical timing for my card's memory.









----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Who would be brave enough to try my best scoring BIOS so far?
> 
> PT1- Based
> 
> http://www.3dmark.com/compare/fs/6524731/fs/6524712
> 
> 
> 
> Welp nice you have beaten my score. I did atleast 15300 at a lower clock. i need to find a magical timing for my card's memory
> 
> 
> 
> 
> 
> 
> 
> .
Click to expand...

Yeah man. You really are a tough nut to crack!

I'm taking your spot in my next runs. And will set it there for eternity. lol


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> Yeah man. You really are a tough nut to crack!.
> 
> I'm taking your spot in my next runs. And will set it there for eternity. lol


Hmm we will see. I have scored over 15k at 1200 mhz before, with the right timings, so the potential is definitely there...


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> Hmm we will see. I have scored over 15k at 1200 mhz before, with the right timings, so the potential is definitely there...


I have no doubt in that.

Got lucky today I think. I added another 360 into the mix to cool things a bit.
http://www.3dmark.com/fs/6524795


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> I have no doubt in that.
> 
> Got lucky today I think. I added another 360 into the mix to cool things a bit.
> http://www.3dmark.com/fs/6524795


Wow, just wow, nice scores and that clockspeed! I 'only' have 2x240 radiators, 30mm, but it's 3c outside today, so that takes care of the temperatures. I need to fiddle around more with the timings; my core doesn't want to budge over 1265 mhz, no matter the voltage. I just realized we are approaching 16k!


----------



## mus1mus

Yesssir we are.

Thanks for the push!


----------



## rdr09

Quote:


> Originally Posted by *corey3333*
> 
> I already have the CPU OC'd with the BIOS in ROG. Benchmarks went up but the MHz did not. What software do you suggest for overclocking? When I had Windows 7 my CPU ran at 4446MHz, but after upgrading to Windows 10 it now runs at 4050MHz.


I suggest you seek further help with OC'ing the CPU here:

http://www.overclock.net/t/1318995/official-fx-8320-fx-8350-vishera-owners-club

One thing, though: check to make sure your CPU temp does not go above ~60C. Ask the owners' club for more accurate info. Again, check the VRM temps on your GPUs as well, like this . . .



Showing max readings. Open two instances of GPU-Z, one for each GPU.


----------



## Paul17041993

Quote:


> Originally Posted by *rdr09*
> 
> same thing is going to happen to my system if i put the sandy to stock. i think only the really powerful cpu or the newer ones with high ipc will work at stock when combined with 2 290(X)s. you got to consider that these hawaii in crossfire are a tad faster than a single 980 Ti, especially when oc'ed.
> 
> Futuremark results can mirror what happens in gaming. i think my 290s at stock scored a bit higher in GS score compared to correy's.


Considering crossfire is mainly intended for pushing more pixels each frame, vs pushing more frames, I'm still quite happy with my FX-8150 @ stock for gaming as I'm always GPU-bound (single 290X with 4k, don't have adequate cooling to sustain a high OC in AU heat).

That being said, my next major upgrade is going to see all the main components replaced (Zen CPU/APU, DDR4, "GCN2.0") and I'll be "retiring" this system as more of a server/test box.
Quote:


> Originally Posted by *corey3333*
> 
> I already have the CPU OC'd with the BIOS in ROG. Benchmarks went up but the MHz did not. What software do you suggest for overclocking? When I had Windows 7 my CPU ran at 4446MHz, but after upgrading to Windows 10 it now runs at 4050MHz.


Keep in mind that some applications are quite drunk when it comes to reading current clocks and multipliers. Both CoreTemp and Task Manager, for example, have had known bugs for a long time now where they can't read dynamic FSB clocks correctly at all, which is something AMD processors use along with dynamic per-core/module multipliers. HWiNFO64 is, I think, the most reliable application for clock monitoring currently, and even it can have its rare moments...
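Part of why tools disagree is that dynamic clocks are only ever sampled snapshots. On Linux you can see this by parsing the per-core frequency fields yourself; a rough sketch (the `cpu MHz` field name is the common x86 `/proc/cpuinfo` layout, an assumption rather than a guarantee — Windows tools like HWiNFO64 get the same data via MSRs instead):

```python
# Sketch: extract per-core frequencies from /proc/cpuinfo-style text.
# Two samples taken moments apart can legitimately differ, which is
# exactly why naive monitoring tools show misleading numbers.

def core_clocks(cpuinfo_text):
    """Return the per-core frequencies (MHz) found in cpuinfo text."""
    clocks = []
    for line in cpuinfo_text.splitlines():
        if line.lower().startswith("cpu mhz"):
            clocks.append(float(line.split(":", 1)[1]))
    return clocks

# Canned sample; on a real Linux box you'd pass open("/proc/cpuinfo").read().
sample = """processor\t: 0
cpu MHz\t\t: 4013.2
processor\t: 1
cpu MHz\t\t: 1400.0"""
print(core_clocks(sample))  # [4013.2, 1400.0]
```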


----------



## fyzzz

I think I need to buy a new PSU. It feels like the PSU can't really keep up with the 290, at least when I'm benching at high volts. More vddc almost makes things worse, and when I touched the PSU it was cold, while it's usually very warm. I have the Corsair RM 750, I know all about the rumors about it, and I've considered swapping it out for a long time. I've been thinking about getting a Super Flower PSU, since they seem pretty good and are actually affordable where I live. My current PSU may be able to handle the 980 Ti if I switch back to it when it comes back from RMA, but I feel like it's time to buy a reliable PSU.


----------



## mus1mus

Have you checked the voltages during the runs?

Droop is normal on every BIOS I have tested sans the PT1.

I hear you on the PSU though. Somehow, my X-1250 gets warm on just a single 290 and a 4790K. That thing is even bathing in cold air from rads. At 20C.

By the way, getting close to 16K.








http://www.3dmark.com/fs/6525388


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> Have you checked the voltages during the runs?
> 
> Droop is normal on every BIOS I have tested sanz the PT1.
> 
> I hear you on the PSU though. Somehow, my X-1250 gets warm on just a single 290 and a 4790K. That thing is even bathing in cold air from rads. At 20C.
> 
> By the way, getting close to 16K.
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/6525388


Yeah, this PSU is acting weird sometimes. I might have damaged it during the summer; it didn't like benching in 25c ambient and overheated from time to time. Now I just want something I can rely on when benching. Wouldn't surprise me if you started to touch 16k soon, you have really figured out how to push your 290. I will never reach that score, just too many issues. Maybe things will change if I change the PSU, but I doubt it will change much. I need to figure out a good combination for my 290; I still believe I can push it a bit more. I'm still happy with my 15319 GPU score at 1265/1750, however. I wonder what my Asus 290 DC2 would have done with all these BIOS mods. I remember benching it at 1265/1625 on air (Raijintek Morpheus) and maybe a bit higher than that; it had Elpida, but it was still good. It also ran very cold and had an 84 ASIC.


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> Yeah this psu is acting weird sometimes. I might have damaged it during the summer, it didn't like benching in 25c ambient, it overheated from time to time. Now i just want something that i can rely on when benching. Wouldn't surprise me if you started to touch 16k soon. You have really figured out how to push your 290. I will never reach that score, just too much issues. Maybe things will change, if i change psu, but i doubt that it will change much. I need to figure out a good combination for my 290, i still believe i can push it a bit more. I'm still happy with my 15319 gpu score at 1265/1750 however. I wonder what my asus 290 dc 2would had done with all these bios mods, i remember benching it at 1265/1625 on air (raijintek morpheus) and maybe a bit higher than that, it had elpida, but it was still good. It also ran very cold and had 84 asic.


Naah, just trial and error runs.









I have another set of timings to try, from a GIGA BIOS that is for the BBBG DEBUG chips. Looser, but who knows.


----------



## By-Tor

Quote:


> Originally Posted by *kizwan*
> 
> Do not compare overall score but instead compare with graphics score. The graphics score is indeed lower than it should be & this is compare with my old score, with older drivers & a pair of 290's at 1200/1500.
> 
> Using two samples for comparison with both using different drivers version & both was with older drivers. The differences between mine & his was ~330 & ~560 respectively. Not a lot but remember this is 290X's vs 290's crossfire. I'm pretty sure this is because Powercolor BIOS is less optimized than the other brands BIOSes. I have seen other Powercolor 290's scores before & they always also seems scored lower though. Feel free to dig this thread for other Powercolor firestrike scores.
> http://www.3dmark.com/fs/2215393
> http://www.3dmark.com/fs/2825429
> 
> However, bench score is a poor presentation of the cards gaming performance. I'm pretty sure FPS difference between his & mine is less than 2 FPS which is nothing really. His cards probable a couple FPS behind other 290X's which is still not a lot. Do this affect his gaming experience? Definitely NO. So @By-Tor


Well, since we play games and not benchmarks, I'm very happy with the performance these cards give me. Getting very smooth gameplay and great frame rates.
I have been a PowerColor fan for a while and have owned 77xx, 78xx, 79xx and now these 290's, and I would purchase from them again on my next upgrade.

A huge plus with this 290X gen is that they came from the factory with full-cover EK water blocks attached.


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> Have you checked the voltages during the runs?
> 
> Droop is normal on every BIOS I have tested sanz the PT1.
> 
> I hear you on the PSU though. Somehow, my X-1250 gets warm on just a single 290 and a 4790K. That thing is even bathing in cold air from rads. At 20C.
> 
> By the way, getting close to 16K.
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/6525388


Don't bork the platinum card. Very Good Job!

Quote:


> Originally Posted by *By-Tor*
> 
> Well since we play games and not bench marks I'm am very happy with the performance these cards give me. Getting very smooth game play and great frame rates.
> I have been a Powercolor fan for awhile and have owned 77xx, 78xx, 79xx and now these 290's and would purchase from them again on my next upgrade.
> 
> A huge plus with this 290x gen. is that they came from the factory with full cover EK water blocks attached.


IIRC, there were PowerColor cards that came out underperforming and needed a new BIOS from the manufacturer. Those who flashed their cards successfully got the boost.

With two, you've got all the power you need for gaming anyway, and that's all that matters. Your 290Xs are still faster than my 290s.


----------



## mus1mus

I promise I won't.

It's the promise I might break though.









Any takers for a PT1 based BIOS for Elpida RAM?


----------



## Nassenoff

Quote:


> Originally Posted by *mus1mus*
> 
> I promise I won't.
> 
> It's the promise I might break though.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any takers for a PT1 based BIOS for Elpida RAM?


I might try once I get the 290 under water.
Tried a 290X BIOS a while back. After that I had some blue screen issues due to memory compatibility problems, and now I'm trying to get Fallout 4 running.
It has not been a good PC month for me.


----------



## EpicOtis13

Quote:


> Originally Posted by *mus1mus*
> 
> I promise I won't.
> 
> It's the promise I might break though.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any takers for a PT1 based BIOS for Elpida RAM?


I would take it. My cards are stable in games at 1200/1525 with +100mv and +25% powerlimit. I want to tune it in, but it will be annoying to do so since I have one elpida card and one hynix card.


----------



## MEC-777

Installed the latest catalyst 15.11.1 the other day hoping to see some gains in Fallout 4. It's in the driver notes that they added optimizations for this game.

Well, I saw maybe 10-15% bump, enough to allow me to turn up the eye candy just a bit more and still maintain 60+fps most of the time. But still no crossfire profile.







There's no SLI profile either, so I'm not too concerned about that. It runs well enough on one card. I have a feeling they won't ever release a proper multi-GPU profile for this game, which sucks, as I'm sure the Nexus mod community will be modding the hell out of this game, making it more demanding to run.

Usually it runs fairly smooth, only dipping in FPS when there's a lot of "stuff" on screen (buildings etc.). But last night I noticed a number of significant frame rate drops, down into the high 20's at one point. Looking at GPU usage, it was between 60-80%, so it wasn't GPU bound, and CPU usage was also around 60-70%, so it wasn't a CPU bottleneck either. Not sure why it was tanking like that. The game has run fine up to this point.

How's Fallout 4 running for you guys? Did any of you notice any performance increase or issues after updating to 15.11.1?


----------



## buttface420

Is my 290X a bad overclocker? Its ASIC is 68.1%.

The highest stable overclock I've gotten is 1100 core and 1375 memory (Elpida).

It's cooled with an HG10 and an H55 AIO; temps are around 60c at 1100/1375.

Core voltage (mV) is +75 and power limit maxed (Afterburner).


----------



## EpicOtis13

Quote:


> Originally Posted by *buttface420*
> 
> is my 290x a bad overclocker? its asic is 68.1%
> 
> the highest most stable overclock i've gotten is 1100 core and 1375 memory (elpida)
> 
> its cooled with a hg10 and a h55 aio, temps are around 60c at 1100/1375
> 
> core voltage(mV) is +75 and power limit max (afterburner)


Try out Trixx. On Afterburner my max stable clock was 1150/1450, and now I'm at 1250/1600 with +200mv core and +50% power limit.


----------



## Nassenoff

Quote:


> Originally Posted by *MEC-777*
> 
> Installed the latest catalyst 15.11.1 the other day hoping to see some gains in Fallout 4. It's in the driver notes that they added optimizations for this game.
> 
> Well, I saw maybe 10-15% bump, enough to allow me to turn up the eye candy just a bit more and still maintain 60+fps most of the time. But still no crossfire profile.
> 
> 
> 
> 
> 
> 
> 
> There's no SLI profile either, so I'm not too concerned about that. Runs good enough on one card. I have a feeling they won't ever release a proper multi-GPU profile for this game which sucks as I'm sure the Nexus mod community will be modding the hell of out this game, making it more demanding to run.
> 
> Usually it runs fairly smooth, only dipping in FPS when there's a lot of "stuff" on screen (buildings etc.). But last night I noticed a number of significant frame rate drops, down into the high 20's at one point. Looking at GPU usage, it was between 60-80%, so it wasn't GPU bound, and CPU usage was also around 60-70%, so it wasn't a CPU bottleneck either. Not sure why it was tanking like that. The game has run fine up to this point.
> 
> How's Fallout 4 running for you guys? Did any of you notice any performance increase or issues after updating to 15.11.1?


The driver didn't help jack Sh.... on my rig. I have exactly the same problem as you (and numerous other people). People with an overclocked Skylake i5 and a 980 are having problems too.
I have uploaded a video taken from the game on my rig, and have been advised to reduce the number of cores Fallout 4 is allowed to use, in hopes of increasing the single-core CPU utilization. I will try it tonight.
You can follow this thread:
http://www.overclock.net/t/1576308/bethesda-net-fallout-4-system-requirements-recommendation/330


----------



## kizwan

Disabling cores will likely increase CPU utilization, but it may also reduce the game's performance because every task then has to be handled by the remaining enabled cores. If you want to prevent power saving from interfering, set the Windows power plan to High Performance & disable C-states (C1E, C3, C6 & C7) in the BIOS.
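As an alternative to disabling cores in the BIOS outright, a process can be pinned to a subset of cores at runtime, which is reversible. A minimal sketch using the POSIX-style affinity calls in Python's standard library (Linux only; on Windows the equivalent is Task Manager's "Set affinity" or `start /affinity`):

```python
import os

# Remember the full affinity mask so it can be restored afterwards.
all_cores = os.sched_getaffinity(0)      # pid 0 = the current process

# Pin this process to (up to) two of its currently allowed cores.
pinned = set(sorted(all_cores)[:2])
os.sched_setaffinity(0, pinned)
print(os.sched_getaffinity(0) == pinned)  # True

# Undo the pinning; background threads get the other cores back,
# avoiding the drawback of disabling cores in the BIOS.
os.sched_setaffinity(0, all_cores)
```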


----------



## MEC-777

Quote:


> Originally Posted by *Nassenoff*
> 
> The driver didn't help jack Sh.... on my rig. I have exactly the same problem as you(and a numerous amount of other people). People with Skylake OC I5 and 980 are having problems.
> I have uploaded a video taken from the game with my rig, have been advised to reduce the number cores Fallout 4 is allowed to use in hope of increasing the singel core CPU utilization. I will try it tonight.
> You can follow this thread:
> http://www.overclock.net/t/1576308/bethesda-net-fallout-4-system-requirements-recommendation/330


Glad to hear I'm not the only one with these issues. Thanks, I'll go check out that thread.


----------



## corey3333

Games without a crossfire profile can be forced if you want. Enable "AMD CrossfireX for applications that have no associated profile", as seen in the pic I sent you. Check that box; hope this helps.
I am having crossfire problems myself in Witcher 3: when I enable crossfire my FPS doubles, but I get some flickering on icons and at the bottom of the screen. Maybe you can help me with that?

Also, you can turn OFF V-Sync; that should allow above 60FPS.


----------



## joeh4384

I found forcing profiles most of the time leads to epileptic seizures from the flickering.


----------



## Paul17041993

Quote:


> Originally Posted by *Nassenoff*
> 
> The driver didn't help jack Sh.... on my rig. I have exactly the same problem as you(and a numerous amount of other people). People with Skylake OC I5 and 980 are having problems.
> I have uploaded a video taken from the game with my rig, have been advised to reduce the number cores Fallout 4 is allowed to use in hope of increasing the singel core CPU utilization. I will try it tonight.
> You can follow this thread:
> http://www.overclock.net/t/1576308/bethesda-net-fallout-4-system-requirements-recommendation/330


Quote:


> Originally Posted by *kizwan*
> 
> Disabling cores will likely increase cpu utilization but that also may reduce the games performance because everything, every tasks need to be handled by remaining enabled cores. If you want to prevent power saving from interfering, set power options to high performance & disable C-states (C1E, C3, C6 & C7) in the BIOS.


Quote:


> Originally Posted by *Paul17041993*
> 
> Keep in mind, the more cores of a CPU that are heavily utilised means the surrounding cores must throttle down to the same level to remain in the max TDP, this would be seen a lot with many intel processors but probably not so much with AMD as they have a higher allowed TDP (or TDP throttle disabled if OC'ed or other settings changed in BIOS).
> 
> If someone gives me a summary of what seems to be happening in Fallout4 I may be able to give a more detailed explanation, there's various other things that can contribute to the performance hill between more and less active threads.


Silly me went and put it in the wrong thread. Also, disabling cores in the BIOS is rarely a good thing; it means all the background threads as well as the primary task threads get crammed onto the fewer cores that remain. Disabling turbo and HT on some Intel processors (mainly 1st and 2nd gen i-series) can give more reliable performance in certain situations.


----------



## MEC-777

Quote:


> Originally Posted by *corey3333*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Games without a Crossfire profile can be forced if you want to. Enable "AMD CrossfireX for applications that have no associated profile" as seen on the pic I sent you' Check that box, hope this helps you.
> I am having Crossfire problems myself on Witcher 3. when I enable Crossfire my FPS Double But I get some flickering on icons and the bottom of screen maybe you can help Me with that?
> 
> Also you can turn OFF V-Sync that should allow above 60FPS


Yeah, I have that box checked under enable CF. The thing with Fallout 4 is, the only way to actually force CF to work is by using AFR, or 1x1, or using a CF profile from another game. One that works for FO4 is Crysis 3, but it's only a minor improvement and frame rates are terribly inconsistent. AFR and 1x1 give a flickering effect that looks awful. Same thing with The Witcher 3; I just disable CF for that game as well.
Quote:


> Originally Posted by *joeh4384*
> 
> I found forcing profiles most of the time leads to epileptic seizures from the flickering.


Yep. Forcing CF in FO4 is a mess. I found I get a much better experience and smoother/consistent frame rates by just leaving CF disabled for this game.

I think the core of the issue with frame rate dips in FO4 is something to do with how the game is optimized/threaded.


----------



## Eudisld15

What do you guys think?

http://www.3dmark.com/fs/6532670


----------



## kizwan

Quote:


> Originally Posted by *Paul17041993*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Paul17041993*
> 
> Keep in mind, the more cores of a CPU that are heavily utilised means the surrounding cores must throttle down to the same level to remain in the max TDP, this would be seen a lot with many intel processors but probably not so much with AMD as they have a higher allowed TDP (or TDP throttle disabled if OC'ed or other settings changed in BIOS).
> 
> If someone gives me a summary of what seems to be happening in Fallout4 I may be able to give a more detailed explanation, there's various other things that can contribute to the performance hill between more and less active threads.
> 
> 
> 
> Silly me went and put it in the wrong thread. Also disabling cores in the BIOS is rarely a good thing, it means all the background threads as well as the primary task threads are crammed together the less cores that are available. Disabling turbo and HT on some intel processors (mainly 1st and 2nd gen i series) can give more reliable performance in certain situations.
Click to expand...

Intel processors can run with all cores active & utilized all day long as long as the temp is under the limit. The other cores do *not* have to throttle down to stay within TDP limits when many cores are heavily utilized; it doesn't work that way. If you're talking about Turbo Boost, it's the other way around regarding the TDP limit: (assuming other factors are also below their limits) an Intel Core CPU is allowed to operate at a power level higher than TDP, i.e. at the max Turbo Boost frequency, for short durations. Basically, what I'm saying is that only while the CPU's power level is below the TDP limit are some of the cores allowed to run at the max Turbo Boost frequency while the rest clock down. Once the TDP limit is reached, this no longer happens & all cores run at the max non-Turbo Boost frequency. On an unlocked CPU that allows overclocking, TDP is not really capped though; it can be overridden & is usually pre-set to a crazy high number anyway. Only temp will be the limit for those CPUs.

Actually, it's not about which gen the CPU belongs to; it depends on the game whether it'll perform better with HT off or not. It seems Fallout 4 is not a CPU-intensive game though, so I doubt disabling HT, or even disabling cores, or setting affinity to just a few cores/threads will give any performance boost.


----------



## mus1mus

Quote:


> Originally Posted by *Eudisld15*
> 
> What do you guys think?
> 
> http://www.3dmark.com/fs/6532670


It's a nice score for that clock.

Any idea of your Memory chip?

EDIT: It's a 290X. So must be in the ballpark of the card's capabilities in FS.


----------



## MEC-777

Quote:


> Originally Posted by *kizwan*
> 
> Actually it's not about which gen the CPU belong to but it's depends on the games whether it'll perform better with HT off or not. It seems fallout 4 is not CPU intensive games though, so I doubt disabling HT or even disabling cores or even with affinity set to just a number of cores/threads will give any performance boost.


FO4 is not CPU intensive in the sense that it is heavy on all cores, but it is (or seems to be) heavily single-thread dependent. This means (from what I've observed) it can use multiple cores but is still limited, primarily, by a single-threaded task/workload. Some of the other processes for the game, which can be performed on the other cores, are still held back by that one heavy thread. So it loads up one core and practically maxes it out while the other cores are only partially utilized. In that sense, it can be considered CPU intensive in a way.

The game engine is quite old now (same engine as Skyrim), which happens to be heavily single threaded. You'd think with today's CPUs and with Intel's strong IPC per core, this shouldn't be a problem, but I think it's just that the game is not as well optimized as it could be.

As for CPU clocks and turbo clocks, my i5-4570 can easily maintain max boost clocks on all four cores all day long. From my understanding, as long as temps are well under control, it can do this. I've never seen it boost some cores and not others, not for more than brief moments at a time. Usually all 4 run the same.


----------



## kizwan

Quote:


> Originally Posted by *MEC-777*
> 
> As for CPU clocks and turbo clocks, my i5-4570 can easily maintain max boost clocks on all four cores all day long. From my understanding, as long as temps are well under control, it can do this. I've never seen it boost some cores and not others, not for more than brief moments at a time. Usually all 4 run the same.


You mean 3.4GHz? That's the rule though & by default that's how Intel Core CPUs behave. With an unlocked CPU you can override this. CPU-Z may report 3.6GHz, but did you check how many cores were under load? Can you set "by all cores" to a 36 multi in the BIOS?


----------



## corey3333

I am going to get Fallout 4 any day now. When I do, I'll play around with the .debug, .dll, .pak and .ini files and a hex editor. I can usually read/edit things at that level, so I'll look for a fix; if I find anything I'll let you know.


----------



## corey3333

CPUID Hardware Monitor Pro, DxDiag and AMD OverDrive (AOD). The Win 10 Task Manager is not correct.


----------



## corey3333

http://www.3dmark.com/3dm11/10556434


----------



## MEC-777

Quote:


> Originally Posted by *kizwan*
> 
> You mean 3.4GHz? That's the rule though & by default that's how Intel core CPU behave. With unlocked CPU, you can override this. CPU-Z may report 3.6GHz but did you check how many cores was under load? Can you set "by all cores" to 36 multi in BIOS?


Actually, Intel claims the 4570 is 3.2 base and 3.6 turbo. I have my baseclock bumped up to 103 (or 104, can't remember) which has bumped the base up to 3.3 and 3.74 turbo. CPU-Z will show 3.74 on all 4 cores in a multi-threaded game. Resource monitor shows even loading on all 4 cores at the same time.

According to CPU-world, it is as follows:

turbo clocks:
4 cores @ 3.4
3 cores @ 3.5
1-2 cores @ 3.6

However, as mentioned above, that's not what I observe.
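For what it's worth, the arithmetic behind these figures is just BCLK × multiplier, so the numbers in this exchange can be sanity-checked quickly (BCLK of 104 assumed below, since 103 × 36 works out to ~3.71 GHz rather than the reported 3.74):

```python
# Effective clock (MHz) = BCLK (MHz) x multiplier.
def effective_mhz(bclk, multiplier):
    return bclk * multiplier

# Stock i5-4570 single-core turbo bin: 100 MHz x 36 = the advertised 3.6 GHz.
print(effective_mhz(100, 36))  # 3600

# With BCLK raised to 104, the same x36 bin lands on the reported 3.74 GHz.
print(effective_mhz(104, 36))  # 3744

# The all-core turbo bin (x34) at BCLK 104 would be ~3.54 GHz.
print(effective_mhz(104, 34))  # 3536
```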


----------



## the9quad

I've been thinking of upgrading lately to some newer cards, but you know what? These cards are beasts. I am going to wait a while. Everything I play runs amazingly well. Best purchase in a long time, getting these day one.


----------



## kizwan

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> You mean 3.4GHz? That's the rule though & by default that's how Intel core CPU behave. With unlocked CPU, you can override this. CPU-Z may report 3.6GHz but did you check how many cores was under load? Can you set "by all cores" to 36 multi in BIOS?
> 
> 
> 
> Actually, Intel claims the 4570 is 3.2 base and 3.6 turbo. I have my baseclock bumped up to 103 (or 104, can't remember) which has bumped the base up to 3.3 and 3.74 turbo. CPU-Z will show 3.74 on all 4 cores in a multi-threaded game. Resource monitor shows even loading on all 4 cores at the same time.
> 
> According to CPU-world, it is as follows:
> 
> turbo clocks:
> 4 cores @ 3.4
> 3 cores @ 3.5
> 1-2 cores @ 3.6
> 
> However, as mentioned above, that's not what I observe.
Click to expand...

Like I said, it won't be able to run at the 36 multi on all cores. It only runs at the max turbo boost multi when one or two cores are active. When all cores are loaded at the same time, it will only be able to run up to 3.4GHz or ~3.5GHz (BCLK 104).


----------



## MEC-777

Quote:


> Originally Posted by *kizwan*
> 
> Like I said, it wont be able to run at 36 multi on all cores. Only two or one core active when it is running at max turbo boost multi. When all cores loaded at the same time, it only be able to run up to 3.4GHz or ~3.5GHz (BCLK 104).


So why then, does CPU-Z and other hardware monitoring software show 3.74 on all 4 cores simultaneously when that's technically not possible?


----------



## kizwan

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Like I said, it wont be able to run at 36 multi on all cores. Only two or one core active when it is running at max turbo boost multi. When all cores loaded at the same time, it only be able to run up to 3.4GHz or ~3.5GHz (BCLK 104).
> 
> 
> 
> So why then, does CPU-Z and other hardware monitoring software show 3.74 on all 4 cores simultaneously when that's technically not possible?
Click to expand...

You didn't answer my question earlier. Let me rephrase the question. Did you set CPU Core Ratio to Sync All Cores?

And can you post screenshot showing all cores running at 3.74GHz?


----------



## MEC-777

Quote:


> Originally Posted by *kizwan*
> 
> You didn't answer my question earlier. Let me rephrase the question. Did you set CPU Core Ratio to Sync All Cores?
> 
> And can you post screenshot showing all cores running at 3.74GHz?


Sorry about that. I honestly don't know. All I did was increase the BCLK until it was unstable (~105), then backed it off to the highest stable BCLK. Everything else, I believe is on default settings, whatever that may be. I will check later tonight.

Yep, will upload a screen shot while running P95 later tonight.


----------



## battleaxe

Quote:


> Originally Posted by *the9quad*
> 
> I've been thinking of upgrading lately to some newer cards, but you know what? These cards are beasts. I am going to wait a while. Everything I play runs amazingly well. Best purchase in a long time, getting these day one.


are you still running three of them on air? What's your temps like?

And agreed, still awesome cards. Worth every penny for sure.


----------



## kizwan

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> You didn't answer my question earlier. Let me rephrase the question. Did you set CPU Core Ratio to Sync All Cores?
> 
> And can you post screenshot showing all cores running at 3.74GHz?
> 
> 
> 
> Sorry about that. I honestly don't know. All I did was increase the BCLK until it was unstable (~105), then backed it off to the highest stable BCLK. Everything else, I believe is on default settings, whatever that may be. I will check later tonight.
> 
> Yep, will upload a screen shot while running P95 later tonight.
Click to expand...

Thank you. I was actually trying to find a similar screenshot before but couldn't find any. Just to be clear, I'm only explaining the default Intel Core CPU behavior with Turbo Boost. When the motherboard overrides this, the CPU is running outside the default configuration, just like the extra +4 turbo bin on previous-gen Intel Core CPUs. This requires the motherboard to have a "by all cores" or "sync all cores" feature in the BIOS; otherwise the CPU will run at the default configuration. With Haswell, Intel removed the free turbo bin but still allows overriding the number of cores at max turbo boost frequency.


----------



## the9quad

Quote:


> Originally Posted by *battleaxe*
> 
> are you still running three of them on air? What's your temps like?
> 
> And agreed, still awesome cards. Worth every penny for sure.


Heck yeah I run them on air. I wear headphones, and the fans run a custom curve. So temps are fine, but you can definitely hear them fans lol. Doesn't bother me though.

me>>>>><<<<<290x's


----------



## battleaxe

Quote:


> Originally Posted by *the9quad*
> 
> Heck yeah I run them on air. I wear headphones, and the fans run a custom curve. So temps are fine, but you can definitely hear them fans lol. Doesn't bother me though.
> 
> me>>>>><<<<<290x's


LOL.... that's awesome. Embrace the noise. Who cares? LOL


----------



## kizwan

Yeah, noise FTW?!


----------



## mus1mus

Noise is good. It reminds you that something's up and running.


----------



## Widde

Quote:


> Originally Posted by *the9quad*
> 
> Heck yeah I run them on air. I wear headphones, and the fans run a custom curve. So temps are fine, but you can definitely hear them fans lol. Doesn't bother me though.
> 
> me>>>>><<<<<290x's


Like me ^^ As long as I get the performance and it overclocks, I don't mind the noise


----------



## MEC-777

Man, when I first installed both 290s in CrossFire, I had them both on their air coolers (HIS IceQ X2 and a Gigabyte reference). They both hit 90°C in a short amount of time and sounded like a 747 at full throttle. Way too loud for my liking.









I like my PC nice and quiet.


----------



## MEC-777

Quote:


> Originally Posted by *kizwan*
> 
> Thank you. I actually tried to find a similar screenshot before but couldn't find any. Just to be clear, I'm only explaining the default Intel Core CPU behavior with Turbo Boost. When the motherboard overrides this, it's running outside the default configuration, just like the extra +4 turbo bins on previous-gen Intel Core CPUs. This applies when the motherboard has a "by all cores" or "sync all cores" feature in the BIOS; otherwise the CPU will run at the default configuration. With Haswell, Intel removed the free turbo bin but still allows overriding the number of cores that run at max Turbo Boost frequency.


I didn't get a chance to check those settings in the BIOS yet, but here's a screeny running P95.


----------



## MEC-777

FYI, for those of you having issues with severe frame rate drops in Fallout 4: try turning the shadow distance option down to medium. Even on high it just tanks performance for some reason, but on medium the game runs silky smooth and stays glued at 60fps+ at 1440p ultra on a single 290.









So yeah, something about the shadow distance on high/ultra just holds the game back for some reason. Maybe it'll be addressed in the FO4 patch that's coming soon...
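For anyone who'd rather not dig through the launcher every time, the shadow tweak above can be scripted. This is only a rough sketch: the `[Display]` keys `fShadowDistance`/`fDirShadowDistance` and the 3000.0 "medium" value are my assumption of what the launcher presets change, so back up `Fallout4Prefs.ini` before trying it.

```python
from configparser import ConfigParser

def set_shadow_distance(prefs_path, distance=3000.0):
    """Drop shadow draw distance to the (assumed) 'medium' preset value.

    fShadowDistance / fDirShadowDistance are assumed to be the [Display]
    keys the launcher presets adjust - back the file up first.
    """
    cfg = ConfigParser()
    cfg.optionxform = str          # Bethesda INI keys are case-sensitive
    cfg.read(prefs_path)
    for key in ("fShadowDistance", "fDirShadowDistance"):
        if cfg.has_option("Display", key):
            cfg.set("Display", key, f"{distance:.4f}")
    with open(prefs_path, "w") as f:
        cfg.write(f)
```

Point it at your own `Fallout4Prefs.ini` path; it leaves any other settings in the file untouched.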


----------



## dbrisc

Just ordered my Sapphire 290 from the Newegg flash for 199 with the MIR and I cannot wait! Can finally retire my GTX 460


----------



## Paul17041993

Quote:


> Originally Posted by *kizwan*
> 
> Intel processors can run with all cores active & utilized all day long, as long as temp is under the limit. "The rest of the cores must throttle down to keep within TDP limits when many cores are heavily utilized" - no, it doesn't work that way. If you're talking about Turbo Boost, it's the other way around regarding the TDP limit. (Assuming other factors are also below their limits,) an Intel Core CPU is allowed to operate at a power level higher than TDP - meaning at max Turbo Boost frequency, with one or more cores active while the other cores throttle down - for short durations. Basically, what I'm saying is that only when the CPU power level is below the TDP limit are some of the cores allowed to run at max Turbo Boost frequency while the rest throttle down. Once the TDP limit is reached, this no longer happens & all cores run at the max non-Turbo frequency. With an unlocked CPU that allows overclocking, TDP isn't capped though - it can be overridden & is usually pre-set to a crazy high number anyway. Only temp will be the limit for those CPUs.
> 
> Actually it's not about which gen the CPU belongs to; it depends on the game whether it'll perform better with HT off or not. Fallout 4 doesn't seem to be a CPU-intensive game though, so I doubt disabling HT - or even disabling cores, or setting affinity to just a few cores/threads - will give any performance boost.


1st and 2nd gen had problems with HT: 1st had a very significant performance drop and 2nd had some stutter and other threading issues. 3rd gen and later don't really have any issues and should see a slight improvement with HT enabled vs disabled.

Turbo clocks are only allowed on a fixed number of cores. What each level is depends on the model and TDP, but it's commonly one core for the very highest bin and two cores for the halfway bin, with the rest of the cores either at the standard maximum or throttled down. When turbo is disabled and the processor is overclocked, this no longer applies unless it starts overheating or the voltage goes unstable (depending on the motherboard).


----------



## ansha

Sure, shadow distance to medium does wonders for fps, but the game looks awful..


----------



## gupsterg

Quote:


> Originally Posted by *mus1mus*
> 
> Noise is good. It reminds you that something's up and running.


I've still got some Vantec Tornado 80mm & 92mm fans in my PC parts hoard!


----------



## kizwan

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Thank you. I actually tried to find a similar screenshot before but couldn't find any. Just to be clear, I'm only explaining the default Intel Core CPU behavior with Turbo Boost. When the motherboard overrides this, it's running outside the default configuration, just like the extra +4 turbo bins on previous-gen Intel Core CPUs. This applies when the motherboard has a "by all cores" or "sync all cores" feature in the BIOS; otherwise the CPU will run at the default configuration. With Haswell, Intel removed the free turbo bin but still allows overriding the number of cores that run at max Turbo Boost frequency.
> 
> 
> 
> I didn't get a chance to check those settings in the BIOS yet, but here's a screeny running P95.
Click to expand...

Thanks. That means your motherboard automatically syncs all cores.
Quote:


> Originally Posted by *Paul17041993*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Intel processors can run with all cores active & utilized all day long, as long as temp is under the limit. "The rest of the cores must throttle down to keep within TDP limits when many cores are heavily utilized" - no, it doesn't work that way. If you're talking about Turbo Boost, it's the other way around regarding the TDP limit. (Assuming other factors are also below their limits,) an Intel Core CPU is allowed to operate at a power level higher than TDP - meaning at max Turbo Boost frequency, with one or more cores active while the other cores throttle down - for short durations. Basically, what I'm saying is that only when the CPU power level is below the TDP limit are some of the cores allowed to run at max Turbo Boost frequency while the rest throttle down. Once the TDP limit is reached, this no longer happens & all cores run at the max non-Turbo frequency. With an unlocked CPU that allows overclocking, TDP isn't capped though - it can be overridden & is usually pre-set to a crazy high number anyway. Only temp will be the limit for those CPUs.
> 
> Actually it's not about which gen the CPU belongs to; it depends on the game whether it'll perform better with HT off or not. Fallout 4 doesn't seem to be a CPU-intensive game though, so I doubt disabling HT - or even disabling cores, or setting affinity to just a few cores/threads - will give any performance boost.
> 
> 
> 
> 
> 
> 
> 
> 1st and 2nd gen had problems with HT: 1st had a very significant performance drop and 2nd had some stutter and other threading issues. 3rd gen and later don't really have any issues and should see a slight improvement with HT enabled vs disabled.
> 
> Turbo clocks are only allowed on a fixed number of cores. What each level is depends on the model and TDP, but it's commonly one core for the very highest bin and two cores for the halfway bin, with the rest of the cores either at the standard maximum or throttled down. When turbo is disabled and the processor is overclocked, this no longer applies unless it starts overheating or the voltage goes unstable (depending on the motherboard).
Click to expand...

I've never heard of or seen the problem you're describing with 1st & 2nd gen Intel Core processors, & I've owned both. You're probably thinking of core parking, a Windows 7 feature that caused problems with some games.


Spoiler: Warning: Spoiler!



Quote:


> Turbo clocks are only allowed on a fixed number of cores. What each level is depends on the model and TDP, but it's commonly one core for the very highest bin and two cores for the halfway bin, *with the rest of the cores either at the standard maximum or throttled down*.


I'm putting this in a spoiler because I want my reply to look brief, since this is off topic anyway. In short, Turbo Boost doesn't work that way. Take a quad-core processor as an example, and say 2 cores are running at Turbo Boost frequency. The rest of the cores are in one of the C-states. That's actually a must, because otherwise the two cores wouldn't be able to run at Turbo Boost frequency at all. Monitoring software will report those cores as running at idle frequency. If those cores exit their C-state and enter C0 - even if they only speed up by 1MHz - the two cores that were running at Turbo Boost frequency can no longer maintain it and will throttle down to at least a lower Turbo Boost bin, since 4 cores are now active instead of 2. Yes, the number of active cores is one of the rules; the others are type of workload, current and power consumption, & temperature. Regarding TDP: once the CPU power level goes above TDP, except for short durations, all cores can no longer Turbo Boost and will only run at the max non-turbo frequency. An Intel CPU doesn't turbo boost one or two cores & throttle down the others to keep the power level below TDP - it doesn't work that way. All the cores throttle down to non-turbo frequency until the power level is below TDP again.


When overclocking an Intel Core processor, you're actually overclocking the Turbo frequency - Turbo Boost gets auto-enabled when you overclock even if you had it disabled. And yes, as I mentioned, an unlocked CPU no longer conforms to those rules, except for temp.
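The default behavior kizwan describes can be put into a toy model: the max bin is set by how many cores are active, and once package power exceeds TDP (outside a short burst) every core falls back to the non-turbo base clock. All the numbers here are invented for illustration - real bin tables are per-SKU:

```python
# Toy model of default Turbo Boost per the rules above; not a real SKU.
BASE_MHZ = 3500
TDP_W = 84
TURBO_BIN_MHZ = {1: 3900, 2: 3900, 3: 3800, 4: 3700}  # active cores -> max turbo

def allowed_mhz(active_cores, package_power_w, short_burst=False):
    """Highest frequency the active cores may run at under default rules."""
    if package_power_w > TDP_W and not short_burst:
        return BASE_MHZ                    # over TDP: no turbo for anyone
    return TURBO_BIN_MHZ[active_cores]     # otherwise the active-core bin
```

So two active cores under TDP get the 2-core bin, but waking the other two drops everyone to the 4-core bin - which is exactly the throttle-on-core-wake effect described above.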


----------



## Ironsight

Anyone having issues with the new version of Trixx? I tried running overclocks that were perfectly fine on MSI Afterburner but it gave me a lot of issues.


----------



## Roboyto

Quote:



> Originally Posted by *Ironsight*
> 
> Anyone having issues with the new version of Trixx? I tried running overclocks that were perfectly fine on MSI Afterburner but it gave me a lot of issues.


Yes, it appears to be problematic. The biggest issue I was having was using the reset button to return clocks/voltage to normal: the new version of Trixx foolishly reduces voltage first, which will likely cause a crash/lockup depending on how far you were pushing the card. Reduce clocks manually and then you can hit the reset button.
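The order-of-operations problem is easy to picture. Here's a minimal sketch of the safe sequence - clocks down first, voltage second - where `GpuState` and the numbers are stand-ins for illustration, not the actual Trixx internals:

```python
from dataclasses import dataclass

@dataclass
class GpuState:          # stand-in for whatever the tuning tool drives
    core_mhz: int
    offset_mv: int

def safe_reset(gpu, stock_mhz=1000):
    """Return to stock without ever running the OC clock undervolted."""
    gpu.core_mhz = stock_mhz   # step 1: clocks back to stock
    gpu.offset_mv = 0          # step 2: only then drop the extra voltage
    return gpu
```

Doing the two assignments in the opposite order is the failure mode described above: for a moment the card runs its overclock at stock voltage and can black-screen.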


----------



## kizwan

Quote:


> Originally Posted by *Ironsight*
> 
> Anyone having issues with the new version of Trixx? I tried running overclocks that were perfectly fine on MSI Afterburner but it gave me a lot of issues.


What kind of issues?


----------



## Timer5

So I've been lurking around here for a while now and figured it was time to post. I finally got the guts and decided to OC my R9 290X. I put a Kraken G10 and a Corsair H50 (2nd edition, so it does fit) on my stock XFX card, installed the GELID cooler for my VRMs, and put copper heatsinks on my VRAM. I was happy to get the jet engine out of my computer room, to say the least.

So I fired up Trixx and got to work. Well, that's where the troubles started. After a few days of work I was able to get up to 1150 on the core with no issue; it's the memory where my system seems to mess up. I'm at +150mV in Trixx and +33% on power (for some reason, if I go over I get artifacts, and if I go under I never get close to 1150MHz). My card has Elpida memory, and from snooping around I've found it's kinda known for being buggy.

So my issue is kinda weird. I'm able to run Valley ALL day at 1500MHz on memory - I did a 2hr test run with no issues at the voltage and power settings posted above. But when I get in game I get black screens. Now the strange part: it's ONLY when I go from super action-packed to something slow. So in Fallout 4, when I'm blowing stuff up and killing, no issue, but when I go to a store and leave the store menu I get a black screen. It does it with games like Metro 2033 too, but games like TF2 and Fallout 3 give no issues!?!? My temps are fine, and I've moved the RAM speed around - it does it at 1300, 1350, 1400, 1420, 1450 and 1500, and at everything between +120mV and +200mV. The issue happened with MSI Afterburner too, even after adjusting the aux voltage. Is it my BIOS - should I flash to a different one? I haven't seen anyone around here with the same issue.

P.S. I'm running a SeaSonic 750W PSU, so I'm pretty sure it's not a power issue - I picked it up from Newegg last year, so I know it's good. My temps stay great: my GPU hits 64°C during super intense scenes and my VRMs stay in the 50-60°C range at full load. My case is a Cooler Master HAF 932, so there's great cooling on everything else. I've also tested with just the core overclocked and it didn't black screen.


----------



## mus1mus

Quote:


> Originally Posted by *Timer5*
> 
> So I've been lurking around here for a while now and figured it was time to post. I finally got the guts and decided to OC my R9 290X. I put a Kraken G10 and a Corsair H50 (2nd edition, so it does fit) on my stock XFX card, installed the GELID cooler for my VRMs, and put copper heatsinks on my VRAM. I was happy to get the jet engine out of my computer room, to say the least.
> 
> So I fired up Trixx and got to work. Well, that's where the troubles started. After a few days of work I was able to get up to 1150 on the core with no issue; it's the memory where my system seems to mess up. I'm at +150mV in Trixx and +33% on power (for some reason, if I go over I get artifacts, and if I go under I never get close to 1150MHz). My card has Elpida memory, and from snooping around I've found it's kinda known for being buggy.
> 
> So my issue is kinda weird. I'm able to run Valley ALL day at 1500MHz on memory - I did a 2hr test run with no issues at the voltage and power settings posted above. But when I get in game I get black screens. Now the strange part: it's ONLY when I go from super action-packed to something slow. So in Fallout 4, when I'm blowing stuff up and killing, no issue, but when I go to a store and leave the store menu I get a black screen. It does it with games like Metro 2033 too, but games like TF2 and Fallout 3 give no issues!?!? My temps are fine, and I've moved the RAM speed around - it does it at 1300, 1350, 1400, 1420, 1450 and 1500, and at everything between +120mV and +200mV. The issue happened with MSI Afterburner too, even after adjusting the aux voltage. Is it my BIOS - should I flash to a different one? I haven't seen anyone around here with the same issue.
> 
> P.S. I'm running a SeaSonic 750W PSU, so I'm pretty sure it's not a power issue - I picked it up from Newegg last year, so I know it's good. My temps stay great: my GPU hits 64°C during super intense scenes and my VRMs stay in the 50-60°C range at full load. My case is a Cooler Master HAF 932, so there's great cooling on everything else. I've also tested with just the core overclocked and it didn't black screen.


Have you tried running your card in-game with a +50% power target? It's always advised to max this out to prevent the card from being power capped.

If you feel like flashing a BIOS to your card, save a backup of your current one first. This can be done with GPU-Z.

Then maybe try this rom I posted a while back. I am working on a better one today.

Report your findings with the above recommendations.









http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/40350_50#post_24601245


----------



## refirendum

If I'm currently running a GTX 480 with a full-cover water block, how worthwhile is an upgrade to a used 290X with a full-cover block?


----------



## battleaxe

Quote:


> Originally Posted by *refirendum*
> 
> If i am currently running a full-covered WC GTX 480, how worthwhile is an upgrade to a used 290X w/ full cover WC block?


Extremely worthwhile.

Best I can tell it should be roughly twice as powerful. Maybe more.


----------



## refirendum

Well, then I didn't make a terrible mistake. Fingers crossed my 290X is a good one (eBay used purchases >_>), but at around 200 for the card and 150-ish for the block/backplate, this better be good. lol


----------



## battleaxe

Quote:


> Originally Posted by *refirendum*
> 
> Well, then i didn't make a terrible mistake. fingers crossed my 290X is a good one (ebay used purchases >_>) but at around 200 for card and 150-ish for block/backplate, this better be good. lol


As long as it runs fine, you won't regret it. I was looking a bit more and it could be up to 3x more powerful. Quite a huge gap between them - there's really no comparison to speak of, as the generations are so far apart, but as a best guess, somewhere between 2x and 3x more powerful. With H2O you should be able to push it quite hard too; they OC pretty nicely. Mine both do 1240MHz+ on the cores. Abnormal, but lots of these will go over 1200, especially on water. At which point they're GTX 980 competitive in lots of games, and for way less money. Good luck to you.


----------



## Timer5

Gah, this is an R9 290 BIOS, not a 290X! Do you have one for the 290X by chance? I'm resting at 2560 shaders now


----------



## refirendum

So how much of an overclock will an average 290X need to keep up with the 390/390X?


----------



## rdr09

Quote:


> Originally Posted by *refirendum*
> 
> so how much of an overclock will an average 290X take to keep up with the 390/390x?


Without editing the BIOS of the 290X? Prolly about 50MHz more on the 290X will match a 390X. It will also depend on how well the VRAM on the 290X OCs - most 390Xs are set at 1500MHz out of the box. This is without going into the difference in timings.


----------



## mus1mus

Quote:


> Originally Posted by *Timer5*
> 
> Gah this is an R9 290 BIOS not a 290x! Do you have one for the 290x by chance?? I am resting at 2560 shaders now


I do. But you need waterrrrrr.









Actually, wait a bit. In the meantime, show me where you are on your OC.


----------



## Timer5

Well, I decided I'd start out a bit on the easy side. I put it up to 1400 and 1150 with complete stability. So I decided to move up to 1500MHz and yep, 100% stable - no black screens, no issues at all, it fixed my card! I thought the lower shader count would hurt performance, but it didn't change much - in fact I gained a lot of performance without them! At one point I had enough stability on my old BIOS to run 1500 without a black screen.

This is the original



New BIOS got me



So it got me 700 extra points on graphics ALONE! My core is still at 1150, though I'm thinking about moving higher; as for the memory, I might try my luck with 1550. At this point I'm more than pleased with my results, thank you.

Please keep me posted on that new BIOS you mentioned you're working on - I'd love to test it out. Also, will it be for Elpida or Hynix?
One last question: can I try the R9 290X one? I'm interested to find out how much more performance the extra shaders could add. I mean, considering this BIOS gave me an extra 700+ points on my graphics score with 256 fewer shaders, the performance boost from the extra shaders plus the BIOS should be amazing!


----------



## mus1mus

Quote:


> Originally Posted by *Timer5*
> 
> Well, I decided I'd start out a bit on the easy side. I put it up to 1400 and 1150 with complete stability. So I decided to move up to 1500MHz and yep, 100% stable - no black screens, no issues at all, it fixed my card! I thought the lower shader count would hurt performance, but it didn't change much - in fact I gained a lot of performance without them! At one point I had enough stability on my old BIOS to run 1500 without a black screen.
> 
> This is the original
> 
> 
> 
> New BIOS got me
> 
> 
> 
> So it got me 700 extra points on graphics ALONE! My core is still at 1150, though I'm thinking about moving higher; as for the memory, I might try my luck with 1550. At this point I'm more than pleased with my results, thank you.
> 
> Please keep me posted on that new BIOS you mentioned you're working on - I'd love to test it out. Also, will it be for Elpida or Hynix?
> One last question: can I try the R9 290X one? I'm interested to find out how much more performance the extra shaders could add. I mean, considering this BIOS gave me an extra 700+ points on my graphics score with 256 fewer shaders, the performance boost from the extra shaders plus the BIOS should be amazing!


Sure thing. Just give me a moment to massage this PT1-based rom.

Can you try 1625? Should be very doable.









One last thing: try to OC that CPU.


----------



## mus1mus

Okay. New ROM:

Based on an Asus rom.

Mods done:

1250 MHz memory Strap for Elpida adopted up to 1750 Strap.
FF F1 mods.
DPM 7 - 1.287
Stock Voltage at LOAD on my card: 1.211

Much to my surprise, it's hitting very respectable scores. 15300+ at 1300? I needed 1320 to even break 15K on a PT1.

If you need a pre-overclocked version, let me know.

















@kizwan @mfknjadagr8 @Nassenoff @EpicOtis13

Are you guys up for this?

ASUSX.zip 104k .zip file


Let me know how the BIOS treats your card. We can mod the lower straps to get better close-to-stock-speed performance.

Any performance testing info at 1375 or 1500 will be highly appreciated.


----------



## Timer5

Well, I went in and set the VRAM to 1650 and max voltage, and got a black screen in 10 seconds. I've been toying with it for a bit, and for some reason I can't get past 1500MHz, be it 1550 or 1600. I'm going to play with it more in the morning.

As for the CPU, it's an FX 8350 and I have it at 4.33. My motherboard is low end (bought back before I had my good job), so I was a moron and bought a board without proper VRM coolers. I put some heatsinks on the VRMs and it does help, but I still don't trust it too much, though I'm debating trying to push it to 4.5GHz. I have a Hyper 212+ on it, so cooling the CPU isn't too big of an issue. I figure one of these days I'll sit down and get that puppy up to 4.5GHz and start FLYING! I do plan on replacing the CPU in the coming year if things go as planned (aka Zen coming out on time).

Good luck with the BIOS - if you get it done tonight, I'll download it in the morning and tell you the results. If it's anything like the last one, it's going to be AMAZING!

Have a great night, and thank you for the amazing BIOS.


----------



## Timer5

Gosh dang it I wanted to sleep







Time to start testing!


----------



## mus1mus

Don't aim for 1650. Aim for these mem clocks:

1250,
1375,
1500,
1625,
1750.

These are the sweet spots for performance. If you get a black screen above one of these straps, go back to the last one that worked.
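If it helps to see the strap logic spelled out, here's a tiny sketch of how a memory clock maps onto those points. This is my reading of how the timing straps in these ROMs apply, not something dumped from the BIOS:

```python
import bisect

# Elpida timing straps (MHz). A memory clock runs on the timings of the
# lowest strap at or above it, so clocking even slightly past a strap
# lands you on the next (looser) timing set.
STRAPS = [1250, 1375, 1500, 1625, 1750]

def active_strap(mem_clock_mhz):
    """Strap whose timings a given memory clock runs on (None past 1750)."""
    i = bisect.bisect_left(STRAPS, mem_clock_mhz)
    return STRAPS[i] if i < len(STRAPS) else None
```

Which is why, say, 1510 already runs on the 1625 strap's timings, and the listed clocks are the points of most performance per strap.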


----------



## Timer5

Wow, what a BIOS! This thing turned out FANTASTIC! I wanted to see how much of a boost I'd get with just the new BIOS, so: same clocks as the last test, same everything, just the new BIOS - and the results are worth it!

Original BIOS, 1150 core and 1500 memory



First modded BIOS



Newest BIOS



The performance boost has been fantastic! I can't wait to do more tests tomorrow and see if the new BIOS lets me push past 1500. Fingers crossed

Thank you once again for the BIOS - this thing is AMAZING!


----------



## mus1mus

Glad it worked for ya.









Can you provide more info on your tested frequency next time you take it for a spin?


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> Glad it worked for ya.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can you provide more info on your tested frequency next time you take it for a spin?


Prolly needs more tweaking. That's a 290X at 1150, right? My 290 with the original BIOS gets the same GS at 1200 using the old driver. I know I have to OC around 60-70MHz more to match an unedited 290X. And I'm using Win7 - Win10 adds a few hundred points in GS.


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Glad it worked for ya.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Can you provide more info on your tested frequency next time you take it for a spin?
> 
> 
> 
> Prolly needs more tweaking. That's a 290X at 1150, right? My 290 with the original BIOS gets the same GS at 1200 using the old driver. I know I have to OC around 60-70MHz more to match an unedited 290X. And I'm using Win7 - Win10 adds a few hundred points in GS.
Click to expand...

I believe so.

I tweaked the ROM with emphasis on 1625, so anything below that will be a bit underperforming. Imma try it myself now, since most of the folks might still be in bed.


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> I believe so.
> 
> I tweaked the ROM with emphasis on 1625 so anything below that will be a bit under performing. Imma try it myself now. Since most of the folks might still be in bed.


Could just be the difference in our systems. I know my Thuban gets a lower GS compared to my Sandy using the same GPU.


----------



## AliNT77

Quote:


> Originally Posted by *mus1mus*
> 
> Okay. New ROM:
> 
> Based on an Asus rom.
> 
> Mods done:
> 
> 1250 MHz memory Strap for Elpida adopted up to 1750 Strap.
> FF F1 mods.
> DPM 7 - 1.287
> Stock Voltage at LOAD on my card: 1.211
> 
> Much to my Surprise: It's hitting Very Respectable scores. 15300+ on 1300? I needed 1320 to even break 15K on a PT1
> 
> If you need a pre-overclocked version, let me know.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> @kizwan @mfknjadagr8 @Nassenoff @EpicOtis13
> 
> Are you guys up for this?
> 
> ASUSX.zip 104k .zip file
> 
> 
> Let me know how the BIOS treats your card. We can mod the lower straps to get better close to stock speeds performance.
> 
> Any performance testing info at 1375, 1500 will be highly appreciated.


Can I use it with my ref. 290?

Does this BIOS have a "middle clock"?

Nice work BTW!


----------



## mus1mus

YES. I tested that with my 290.

Middle clocks are in the works.

1500, 1375, and 1250 all use the 1250 strap timings at the moment.


----------



## mus1mus

Here's the new BIOS with modded middle straps:

Mods:

Stock Clocks: 1080 / 1375
DPM7 raised to 1287 mV (Stock loaded Voltage 1.211)
FF F1 Mods
1000 Strap Timings for 1000-1375 (Heaven Verified)
1250 Strap Timings for 1500 (Heaven Verified)
1375 Strap Timings for 1625/1750 (Heaven Verified)

Voltages:





Performance Metrics:
Stock 1080/1375:



1150/1375 @ +100 Trixx = 1.281



1150/1500 @ +100 Trixx = 1.281



1150/1625 @ +100 Trixx = 1.281



MUS1MUS.zip 104k .zip file


Find bugs and instabilities?

Lemme know.

This ROM does not come with a warranty, nor a guarantee that you'll be 100% stable or OC as much as the pics indicate.









MODS Work only for Elpida memory.


----------



## Gumbi

63.5FPS at that core clock is very nice indeed! I'll have to check this out - currently using a 290X Vapor-X 8GB which can bench 1240/1640







(Elpida).


----------



## mus1mus

Quote:


> Originally Posted by *Gumbi*
> 
> 63.5FPS at that core clock is very nice indeed! I have to check this out, currently using a 290X VaporX 8GB which can bench 1240/1640
> 
> 
> 
> 
> 
> 
> 
> (Elpida).


Oops, you might need to mod the VRAM size before you do - this is based on 4GB cards.

Also, try not to go for the max frequency your VRAM can clock to unless it's very near the end of a strap.


----------



## Gumbi

Quote:


> Originally Posted by *mus1mus*
> 
> Oppps, you might need to mod the VRAM size before you do. This is based on 4GB cards.


Hmm, I was thinking that might be an issue, I'll have to look into it! I'll see what scores I get at those speeds later. What CPU are you using btw and what clocks?

As is, I'm getting great scores. I've benched 1240 or 1250MHz core and 1640 mem for a score of 67.7FPS (31 min).


----------



## mus1mus

Quote:


> Originally Posted by *Gumbi*
> 
> Hmm, I was thinking that might be an issue, I'll have to look into it! I'll see what scores I get at those speeds later. What CPU are you using btw and what clocks?
> 
> As is I'm getting great scores. I've benched 1240 or 1250mhz core and 1640 mem for a score for 67.7FPS (31 min).


4790K. 5.0GHz is barely stable enough to complete 3DMark11 Physics, but it's solid at 4.9. Thing runs so hot!

Those clocks, by the way, aren't representative of my benching clocks. I have to be conservative with the ROMs being shared, as most people game on their cards more than they bench.


----------



## Gumbi

Quote:


> Originally Posted by *mus1mus*
> 
> 4790K. 5.0GHz barely stable to complete 3DMark11 Physics. But solid at 4.9. Thing runs so hot!
> 
> Those clocks by the way are not representative of my benching clocks. I have to be conservative on the ROMs being shared. As most people enjoy more gaming with their cards than benching.


Nice, I have a 4790K at 4.9GHz myself, on air! Best damn chip I've ever had - rock solid at 1.31V in BIOS, so even when stressing it only hits the mid 80s.

Appreciate the conservative timings!


----------



## mus1mus

Quote:


> Originally Posted by *Gumbi*
> 
> Nice, I have a 4790k at 4.9ghz myself, on air! Best damn chip I've ever had, rock solid at 1.31v in BIOS so even when stressing it only hits mid 80s.
> 
> Appreciate the conservative timings!


80 is hot for my tastes.









If you base it off a basic ROM, it's not so conservative at all, considering it's pre-OC'd with some timing mods. But I need to secure solid base clocks.









You'd be surprised, 1500 mem is just a hair off 1625.


----------



## Gumbi

Quote:


> Originally Posted by *mus1mus*
> 
> 80 is hot for my tastes.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you base it off a basic rom, it's not so conservative at all. Considering itcs pre-OC'd with some timing mods. But I need to secure a solid base clocks.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You'd be surprised, 1500 mem is just a hair off 1625.


Unless I go custom water and/or delidded, there's no way I can run a Haswell at ~1.3V and expect less than 80 degrees Celsius









When I game though it's well south of 80, so it doesn't bother me.


----------



## mus1mus

Quote:


> Originally Posted by *Gumbi*
> 
> Unless I go custom water and/or delidded there's no way I can go 1.3v~ Haswell and expect less than 80 degrees celcius
> 
> 
> 
> 
> 
> 
> 
> 
> 
> When I game though it's well south of 80, so it doesn't bother me.


Yah, you don't especially need to WC these CPUs, but it's better if you do. Though mine will hit 80°C even with 15°C water.







1.32 Core.

Marvelous Intel masterpiece.


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> Could just be the difference in our systems. I know my Thuban gets a lower GS compared to my Sandy using the same GPU.


Oops, this somehow slipped my mind.

The results posted prior to our replies were from a different user, not mine.

He compared the score from his stock 290X BIOS to a modded 290, then to another modded 290X.

His results are quite amazing, especially for a quick-and-dirty BIOS mod.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Timer5*
> 
> Well, I went in and set the VRAM to 1650 and max voltage, and got a black screen in 10 seconds. I've been toying with it for a bit, and for some reason I can't get past 1500MHz, be it 1550 or 1600. I'm going to play with it more in the morning.
> 
> As for the CPU, it's an FX 8350 and I have it at 4.33. My motherboard is low end (bought back before I had my good job), so I was a moron and bought a board without proper VRM coolers. I put some heatsinks on the VRMs and it does help, but I still don't trust it too much, though I'm debating trying to push it to 4.5GHz. I have a Hyper 212+ on it, so cooling the CPU isn't too big of an issue. I figure one of these days I'll sit down and get that puppy up to 4.5GHz and start FLYING! I do plan on replacing the CPU in the coming year if things go as planned (aka Zen coming out on time).
> 
> Good luck with the BIOS - if you get it done tonight, I'll download it in the morning and tell you the results. If it's anything like the last one, it's going to be AMAZING!
> 
> Have a great night, and thank you for the amazing BIOS.


One little thing... about cooling being "no problem": an FX chip and a Hyper 212 don't fit so great together.

You're likely nearing the limit of the 212 with your 4.5 hopes.


----------



## superkeest

Does anyone have any advice on how to push my card further? I have a HIS 290X IceQ X2 with Hynix memory, which is 1000 MHz core / 1250 MHz memory stock.

I've been able to get it to 1100 MHz and 1375 MHz with +100 mV in Trixx. However, I see people pushing them much further. I read that each card has a certain memory/core "ratio", usually between 1.25 and 1.4, and that if you can find it you can really push the card. Have you guys found this to be true? And any ideas on how to isolate the ratio?

I was thinking of trying out the BIOS that was posted a few pages back that you guys have been talking about; maybe that will help? Will it work on a non-reference card?

Thanks guys.
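The "ratio" idea above is forum folklore rather than anything AMD documents, but the arithmetic behind it is simple enough to sketch. This is a minimal helper, assuming the 1.25-1.4 memory/core range mentioned in the post; the outputs are starting points to test, not guaranteed-stable clocks.

```python
# Sketch of the mem/core "ratio" heuristic from the post above.
# The 1.25-1.4 range is a forum rule of thumb, not an AMD spec.

def ratio_candidates(core_mhz, lo=1.25, hi=1.40, step=0.05):
    """Return (ratio, mem_mhz) pairs to try for a given core clock."""
    out = []
    r = lo
    while r <= hi + 1e-9:  # epsilon guards against float drift
        out.append((round(r, 2), round(core_mhz * r)))
        r += step
    return out

for ratio, mem in ratio_candidates(1100):
    print(f"ratio {ratio:.2f} -> try {mem} MHz memory")
```

For a 1100 MHz core this suggests memory clocks of 1375, 1430, 1485, and 1540 MHz; each would still need stability testing in Heaven or games.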


----------



## mus1mus

Quote:


> Originally Posted by *superkeest*
> 
> Does anyone have any advice on how to push my card further? I have a HIS 290x iceq x2 with hynix memory. which is 1000 mhz 1250 mhz stock.
> 
> Ive been able to get it to 1100 mhz and 1375 mhz with 100+ mv in Trixx. However i see people pushing them much further. I read that each card has a certain "ratio" ram/cpu that if you can find the ratio, usually between 1.25 and 1.4 that you can really push it. have you guys found this to be true? and any ideas on how to isolate the ratio?
> 
> I was thinking of trying out the bios that was posted a few pages back that you guys have been taking about maybe that will help? will it work on a non reference card?
> 
> thanks guys.


1.
You can push it more if you can cool it better.

2.
Try out Insanity's 390X modded BIOS for your card. Hynix cards benefit from that BIOS.


----------



## superkeest

Quote:


> Originally Posted by *mus1mus*
> 
> 1.
> You can push it more if you can cool it better.
> 
> 2.
> Try out Insanity's 390X modded bios for your card. Hynix cards benefit from that Bios.


I have tried Insanity's BIOSes, but last time I tried I could not get through a pass of Heaven without it crashing. Maybe I will try it again tonight. My temps seem OK; at 1100/1375 I run about 68 degrees.


----------



## mus1mus

Quote:


> Originally Posted by *superkeest*
> 
> I have tried iInsanity's bios' but last time i tried i could not get through a pass of heaven without heaven crashing. Maybe I will try it again tonight. my temps seem ok, at 1100/1375 i run about 68 degrees.


I can mod the Hynix timings later, but that would need more testing on the end-user's part to squeeze out good performance, as I do not have a Hynix memory card.

But it should be doable.


----------



## superkeest

Awesome, yeah, I'm willing to do any kind of testing necessary. I was trying all sorts of things over the weekend, but would love to get the 390X BIOS working stable!


----------



## Timer5

I have started testing this morning with the modded ASUS BIOS you made for the 290X and IDK what is up. It will run 3DMark all day with no issues, but when I am in Fallout 4 it acts like the old BIOS and gives me a black screen of death.







I did back down to 1375 and it is 100% stable. I am confused: on the original tweaked BIOS that was R9 290 based I could do 1500 MHz without issue, but with the R9 290X ASUS-based BIOS I can only pass 3DMark without getting a black screen, while games black-screen?! I also tested with Valley and yeah, it black-screens there too. Any idea what could be causing this? My card is a real R9 290X, not a flashed R9 290, so the chip is a proper 290X. Is there any way to use the settings the tweaked BIOS had, with R9 290X support?


----------



## mus1mus

Quote:


> Originally Posted by *superkeest*
> 
> awesome yea im willing to do any kind of testing necessary i was trying all sorts of things over the weekend. but would love to get the 390x bios working stable!


Maybe it's because of the timings on the 390X ROMs.

Try raising the frequency to 1500 if you can. I remember it uses The Stilt's timings for 1250 and 1375, which may be too tight for your card.

Or try adding some voltage.
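The strap behaviour being described here, where raising the clock past a strap limit switches the card to the next (looser) timing set, can be sketched as a simple lookup. This is a minimal model with a hypothetical strap table; the limits are illustrative, not values read from any real Hawaii ROM.

```python
# Sketch of Hawaii-style memory "straps": the BIOS stores one timing set
# per frequency bracket, and the card uses the timings of the lowest
# strap whose limit is >= the current memory clock.
from bisect import bisect_left

STRAP_LIMITS = [1000, 1250, 1375, 1500, 1625]  # MHz, hypothetical table

def strap_for(mem_mhz):
    """Return the strap limit whose timings govern mem_mhz."""
    i = bisect_left(STRAP_LIMITS, mem_mhz)
    if i == len(STRAP_LIMITS):
        raise ValueError("clock above the highest strap in the table")
    return STRAP_LIMITS[i]

# 1375 MHz memory runs on the (tight) 1375 strap; bumping the clock to
# 1400 MHz crosses into the looser 1500 strap, which can actually be
# MORE stable if the 1375 timings were too tight for the card.
print(strap_for(1375))  # 1375
print(strap_for(1400))  # 1500
```

This is why "raise the frequency to 1500" can cure instability: the card leaves the too-tight strap entirely.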


----------



## superkeest

So you're suggesting I flash the BIOS, then use Trixx to raise the RAM frequency to 1500 and add extra voltage? Should I increase the core clock too, or just use the 390X BIOS clock speed?

Thanks.


----------



## mus1mus

Quote:


> Originally Posted by *superkeest*
> 
> So your suggesting flash the bios, then use trixx to up the ram frequency to 1500? and add extra voltage? should i increase the clock speed too? or just use the 390x bios clock speed.
> 
> thanks.


Flash Insanity's modded BIOS.
Add a bit of voltage.
Clock memory first to 1500.
Bench Heaven on stock core but OC'd memory.

Check for black screens.
It could be the timings on the lower strap.


----------



## superkeest

Great! I will try this. If it's unstable or the driver is crashing, should I lower the VRAM frequency?


----------



## mus1mus

More likely YES.


----------



## Timer5

Quote:


> Originally Posted by *mus1mus*
> 
> Here's the new BIOS with modded middle straps:
> 
> Mods:
> 
> Stock Clocks: 1080 / 1375
> DPM7 raised to 1287 mV (Stock loaded Voltage 1.211)
> FF F1 Mods
> 1000 Strap Timings for 1000-1375 (Heaven Verified)
> 1250 Strap Timings for 1500 (Heaven Verified)
> 1375 Strap Timings for 1625/1750 (Heaven Verified)
> 
> Find bugs and instabilities?
> 
> Lemme know.
> 
> This ROM does not come with a Warranty nor a Guarantee you will be 100% Stable nor OC as much as the pics indicate.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MODS Work only for Elpida memory.


So I tried out the MUS1MUS BIOS and my computer could not get into Windows. It would go straight to a black screen, so the MUS1MUS BIOS does not like my card.

And the ASUS one is unstable when I overclock to 1500 MHz in games: it works like a charm in 3DMark but gives me a black screen in games. IDK what is causing it; the black screen didn't happen on the "Tweaked" BIOS, but for some reason I lost that overclocking ability with this one!? I am 100% stable at 1375 MHz on the memory. Figured I would report back my stability results for you.


----------



## mus1mus

Quote:


> Originally Posted by *Timer5*
> 
> So I tried out the MUS1MUS Bios and my computer could not get into windows. It would go straight to black screen. So the MUS1MUS does not like my card
> 
> 
> 
> 
> 
> 
> 
> and the ASUS one is unstable when I overclock to 1500Mhz in games. It will work like a charm in 3D mark but gives me a black screen. IDK what is causing it I mean the black screen didn't happen on the "Tweaked" BIOS but for some reason I lost that overclocking ability with this one!?? I am 100% stable at 1375Mhz on the Memory. Figured I would report back my results on Stability for you.


Thanks for the feedback. I think I know what's causing it.

I'll pm you a refined one 6-8 hours from now.


----------



## YellowBlackGod

Guys, have you tried the Crimson drivers? All in all they improve the GPU's behavior. I am using them right now, and I can tell you, 1100/1500 at stock volts with power limit +50 runs stable in Firestrike.


----------



## MojoW

Quote:


> Originally Posted by *YellowBlackGod*
> 
> Guys, have you tried the Crimson Drivers? All in all they improve the GPU behavior. I am using them right now, and I can tell you, 1100/1500. stock volts, power limit 50+ runs stable in Firestrike.


And for the VSR lovers out there it has custom resolution support!


----------



## fyzzz

Testing the Crimson driver also. The colors are very different in Firestrike and the score went down a touch, but the physics score seems to have gone up. 1120/1625 on stock volts is stable as before. Will also test BF4 and see how that works. I also did a comparison between Catalyst 15.7.1 and Crimson in the API overhead test: http://www.3dmark.com/compare/aot/87370/aot/87363


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Timer5*
> 
> So I tried out the MUS1MUS Bios and my computer could not get into windows. It would go straight to black screen. So the MUS1MUS does not like my card
> 
> 
> 
> 
> 
> 
> 
> and the ASUS one is unstable when I overclock to 1500Mhz in games. It will work like a charm in 3D mark but gives me a black screen. IDK what is causing it I mean the black screen didn't happen on the "Tweaked" BIOS but for some reason I lost that overclocking ability with this one!?? I am 100% stable at 1375Mhz on the Memory. Figured I would report back my results on Stability for you.
> 
> 
> 
> Thanks for the feedback. I think I know what's causing it.
> 
> I'll pm you a refined one 6-8 hours from now.
Click to expand...

middle memory mod?


----------



## YellowBlackGod

Quote:


> Originally Posted by *fyzzz*
> 
> Testing crimson driver also. The colors are very different in firestrike and the score went down a touch, but the physics score seems to have gone up. 1120/1625 on stock volts is stable as before. Will also test bf4 and see how that works. I also did a comparison between catalyst 15.7.1 vs Crimson drivers in the Api overhead test: http://www.3dmark.com/compare/aot/87370/aot/87363


So, that wasn't just me with the colours (I noticed that too; it looks more vibrant) and the OC. Good work AMD. Tons of features, stability, a modern UI, everything needed to bring out the full Radeon potential. At last!

Edit: Raised the memory up to 1520 MHz. Works great. This suggests AMD has done real work on the thermal and general behavior of the GPUs.


----------



## kizwan

Based on Firestrike graphics scores, Crimson driver performance is pretty much identical to the 15.11 beta.

BTW, does it depend on the panel (1080p) whether it can do a custom 4K resolution?


----------



## kizwan

Be careful with the Crimson drivers. I noticed that if I overclock the GPU & restart the computer without resetting to default clocks, the drivers auto-apply the overclock (including the voltage) when logging into Windows. This shouldn't be an issue if the overclock is stable. I also experienced a crash while overclocking, & this is where the horror starts. Once logged into Windows, the drivers try to auto-apply the overclock, seemingly with stock voltage. Screen flickering & a black-screen hard crash occurred. The way I fixed this was by rebooting into safe mode & uninstalling the drivers via Device Manager.


----------



## fat4l

Quote:


> Originally Posted by *kizwan*
> 
> Be careful with Crimson drivers. *I noticed that if I overclocked the GPU & I restart the computer without resetting to default clocks, the drivers auto-apply the overclock (including the voltage) when logging into windows.* This shouldn't be any issue if the overclock is stable. I also experienced crash while overclocking & this is where the horror start. Once logged into the windows, drivers trying to auto-apply the overclock which seems with stock voltage. Screen flickering & black screen hard crash occurred. The way I fixed this is by rebooting into safe mode & uninstall the drivers via Device Manager.


Realised this too.

New driver (Crimson), new improved results








1200/1700 MHz (1500 strap timings)










Spoiler: Warning: 15.11 DRIVERS

*P* / *X* / *U* (benchmark screenshots)


Spoiler: Warning: CRIMSON DRIVERS

*P* / *X* / *U* (benchmark screenshots)




edit:// reuploaded images.
edit2://added comparison


----------



## Timer5

Quote:


> Originally Posted by *kizwan*
> 
> Be careful with Crimson drivers. I noticed that if I overclocked the GPU & I restart the computer without resetting to default clocks, the drivers auto-apply the overclock (including the voltage) when logging into windows. This shouldn't be any issue if the overclock is stable. I also experienced crash while overclocking & this is where the horror start. Once logged into the windows, drivers trying to auto-apply the overclock which seems with stock voltage. Screen flickering & black screen hard crash occurred. The way I fixed this is by rebooting into safe mode & uninstall the drivers via Device Manager.


Yeah, found that out the hard way today XD. I am going to stick with CCC for a bit longer; I will move once I am done tinkering with my card and I'm confident this will not happen again.


----------



## asciii

_Removed._


----------



## Gumbi

Quote:


> Originally Posted by *asciii*
> 
> Have been running stable at 1150/1500 +100mv with a R9 290 Windforce (Elpida) on MUS1MUS BIOS for a bit now, testing in different games and benchmarks.
> *The stock cooler can't keep up even at 100% fans during Heaven benchmark.* Replaced thermal paste with Arctic MX-4 and now the GPU starts to throttle around stage 11/26.


Really?? What's your case airflow like? Assuming average to good airflow, your ambients must be very high; I've heard of users bringing their reference temps down to the mid 80s after replacing the paste and running saner fan speeds.


----------



## gupsterg

Quote:


> Originally Posted by *kizwan*
> 
> Be careful with Crimson drivers. I noticed that if I overclocked the GPU & I restart the computer without resetting to default clocks, the drivers auto-apply the overclock (including the voltage) when logging into windows. This shouldn't be any issue if the overclock is stable. I also experienced crash while overclocking & this is where the horror start. Once logged into the windows, drivers trying to auto-apply the overclock which seems with stock voltage. Screen flickering & black screen hard crash occurred. The way I fixed this is by rebooting into safe mode & uninstall the drivers via Device Manager.


Quote:


> Originally Posted by *fat4l*
> 
> Realised this too.


Just about to uninstall the 15.11.1 beta and install Crimson. I don't think I'll run into the same issue, with the modded ROM holding my mild 1090/1475 OC.


----------



## mus1mus

Quote:


> Originally Posted by *kizwan*
> 
> middle memory mod?


Naah, probably the OC. It must be too high for some cards.

@fat4l

I don't mean to offend you, but there's nothing informative about your posted scores. It would be very helpful to others if you uploaded your scores for easy comparison.

And yeah, your images are also too heavy to load, especially on mobile. Think about using spoilers, maybe.


----------



## kizwan

Quote:


> Originally Posted by *kizwan*
> 
> Be careful with Crimson drivers. I noticed that if I overclocked the GPU & I restart the computer without resetting to default clocks, the drivers auto-apply the overclock (including the voltage) when logging into windows. This shouldn't be any issue if the overclock is stable. I also experienced crash while overclocking & this is where the horror start. Once logged into the windows, drivers trying to auto-apply the overclock which seems with stock voltage. Screen flickering & black screen hard crash occurred. The way I fixed this is by rebooting into safe mode & uninstall the drivers via Device Manager.


I was really sleepy when I posted this. Thankfully I didn't sound like Yoda; it's really hard to find the correct words when you're sleepy. What I'm trying to say is that I experienced a hard lock when testing my overclock in BF4 (after 15 minutes in the game). Likely BF4 doesn't like my memory overclock. This overclock also involved over-voltage. Usually when it crashes like this, whether I turn off or reset my computer, the clocks reset to default on the next boot. However, this is not true with Crimson: the only thing that gets reset is the voltage, not the clocks. So it black-screen crashes immediately after logging into Windows. The only way I know to fix this is by uninstalling the drivers in safe mode via Device Manager. Thankfully I had enabled the legacy boot menu for accessing safe mode in Win 10 beforehand.

Anyone know how to set voltage, including AUX, using the command line? I'm thinking of writing a batch file that will auto-execute at Windows login. The idea is that if it crashes again, the batch file will auto-overvolt for me to prevent crashing on the boot after a crash.


----------



## mus1mus

Quote:


> Originally Posted by *kizwan*
> 
> I am really sleepy when I posted this. Thankfully I didn't sound like Yoda. It's really hard to find correct words when you're sleepy. What I'm trying to say is I experienced hard lock when testing my overclock with BF4 (after 15 minutes in the game). Likely BF4 doesn't like my memory overclock. This overclock also involving over voltage. Usually when crashed like this, whether I turn off or reset my computer, on the next boot, clocks will reset to default. However, this is not true with Crimson. The only thing that get reset is voltage, not clocks. So it will immediately black screen crash after log in into windows. The only way I know to fixed this is by uninstalling the drivers in safe mode via device manager. Thankfully I did enabled legacy boot menu for accessing safe mode in Win 10 before.
> 
> Anyone know how to set voltage including AUX using commnad line? I'm thinking writing a batch file that will auto execute when log in into windows. The idea is when it crash again, the batch file will auto-overvolt for me to prevent crashing when booting after crash.


This is true.

But I haven't had a hard-lock black screen yet, even on my benching profile of 1350/1650. It may have to do with my core voltage being set at DPM7.

That's probably enough for 2D to behave normally.

I don't see big gains on mine, even coming from 15.7.1, but it did raise my bench scores closer to a 16K GS.









http://www.3dmark.com/fs/6583861

On a note: has anyone noticed the change in graphics appearance in 3DMark?


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> This is true.
> 
> But I don't have a hard lock Black screen yet. Even on my Benching profile of 1350/1650. But it may have to do with my Core Voltage set at DPM7.
> 
> It's probably enough to warrant 2D to act normally.
> 
> I don't see enough gains on mine even coming from 15.7.1. But did raise my bench scores closer to 16K GS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/6583861
> 
> In a note: anyone notice the changes in graphics appearance in 3D?


I know it's addicting, but... do not bork the platinum card, please.

Great job!


----------



## fat4l

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *mus1mus*
> 
> Naah, Prolly the OC. Must be high for some cards.
> 
> @fat4l
> 
> I dont mean to offend you, but there's nothing informative about your posted scores. It would be very helpful to the others if you upload your scores for easy comparison.
> 
> And yeah, your images are also too heavy to load. Esp on mobile. Think about using spoilers maybe.





Fixed it just for you


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> This is true.
> 
> But I don't have a hard lock Black screen yet. Even on my Benching profile of 1350/1650. But it may have to do with my Core Voltage set at DPM7.
> 
> It's probably enough to warrant 2D to act normally.
> 
> I don't see enough gains on mine even coming from 15.7.1. But did raise my bench scores closer to 16K GS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/6583861
> 
> In a note: anyone notice the changes in graphics appearance in 3D?


Yes, 3DMark looks different with the Crimson drivers. Wow, you are so close to 16k... ridiculous. I didn't notice any performance improvement with the new driver; in fact my 3DMark score went down a bit.
You have done some benchmarking, I see.


Spoiler: Warning: Spoiler!


----------



## mus1mus

Not much as you can see.


----------



## MEC-777

Quote:


> Originally Posted by *kizwan*
> 
> I am really sleepy when I posted this. Thankfully I didn't sound like Yoda. It's really hard to find correct words when you're sleepy. What I'm trying to say is I experienced hard lock when testing my overclock with BF4 (after 15 minutes in the game). Likely BF4 doesn't like my memory overclock. This overclock also involving over voltage. Usually when crashed like this, whether I turn off or reset my computer, on the next boot, clocks will reset to default. However, this is not true with Crimson. The only thing that get reset is voltage, not clocks. So it will immediately black screen crash after log in into windows. The only way I know to fixed this is by uninstalling the drivers in safe mode via device manager. Thankfully I did enabled legacy boot menu for accessing safe mode in Win 10 before.
> 
> Anyone know how to set voltage including AUX using commnad line? I'm thinking writing a batch file that will auto execute when log in into windows. The idea is when it crash again, the batch file will auto-overvolt for me to prevent crashing when booting after crash.


Where/how did you get voltage control in Crimson? It only allows me to adjust core/mem clocks, power limit, temp target and fan speed target.

That being said, I love the new Crimson layout and how you can apply specific OC and CF settings for specific games/apps (I know this can be done in MSI AB, but it's much easier to do so now in Crimson).

It's giving me a problem though. Playing Project Cars last night with a modest OC that I know is stable (1060/1350) and after 5 mins or so, the game would freeze and go to a blue screen with the following message: "THREAD_STUCK_IN_DEVICE_DRIVER" It would then restart my system 2-3 times before coming to the windows repair screen. If I restart, it would keep doing the same thing. If I shut down the machine, and then power it back on it would then start up just fine. All other games tested thus far with Crimson have been working fine. Researching the issue, I haven't really found any solutions aside from removing and reinstalling the drivers. (I used DDU before installing Crimson).

Anyone else running into this problem?


----------



## kizwan

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I am really sleepy when I posted this. Thankfully I didn't sound like Yoda. It's really hard to find correct words when you're sleepy. What I'm trying to say is I experienced hard lock when testing my overclock with BF4 (after 15 minutes in the game). Likely BF4 doesn't like my memory overclock. This overclock also involving over voltage. Usually when crashed like this, whether I turn off or reset my computer, on the next boot, clocks will reset to default. However, this is not true with Crimson. The only thing that get reset is voltage, not clocks. So it will immediately black screen crash after log in into windows. The only way I know to fixed this is by uninstalling the drivers in safe mode via device manager. Thankfully I did enabled legacy boot menu for accessing safe mode in Win 10 before.
> 
> Anyone know how to set voltage including AUX using commnad line? I'm thinking writing a batch file that will auto execute when log in into windows. The idea is when it crash again, the batch file will auto-overvolt for me to prevent crashing when booting after crash.
> 
> 
> 
> Where/how did you get voltage control in Crimson? It only allows me to adjust core/mem clocks, power limit, temp target and fan speed target.
> 
> That being said, I love the new Crimson layout and how you can apply specific OC and CF settings for specific games/apps (I know this can be done in MSI AB, but it's much easier to do so now in Crimson).
> 
> It's giving me a problem though. Playing Project Cars last night with a modest OC that I know is stable (1060/1350) and after 5 mins or so, the game would freeze and go to a blue screen with the following message: "THREAD_STUCK_IN_DEVICE_DRIVER" It would then restart my system 2-3 times before coming to the windows repair screen. If I restart, it would keep doing the same thing. If I shut down the machine, and then power it back on it would then start up just fine. All other games tested thus far with Crimson have been working fine. Researching the issue, I haven't really found any solutions aside from removing and reinstalling the drivers. (I used DDU before installing Crimson).
> 
> Anyone else running into this problem?
Click to expand...

I didn't say I can set voltage in the Crimson control panel (or CNext). You misunderstood my post there.







I'm using AB/Trixx of course.

I have been in that situation (a failed boot loop) two times, & it's annoying that Win 10 takes control of everything. The first time, Win 10 found an error on the C: drive & insisted on checking+fixing the problem while booting into Windows. Guess what: it failed to correct the "error" twice, with an auto-reboot in between, and on the 3rd reboot Win 10 tried to enter Advanced Startup, which succeeded but was no use at all because the keyboard was not responding. I managed to access Advanced Startup using a Win 10 USB installer bootable drive & fixed it with System Restore (usually turned off, but thankfully I had forgotten to disable it that time). The second time was the BF4+overclock crash with the Crimson driver.

I tried to Google how to set the AUX voltage using the command line. I'm pretty sure somebody already shared the info, but I can't find it, so I had to investigate it myself using info from *here* & *here*.


Spoiler: MSI Afterburner I2C dump



Code:



Code:


C:\Program Files (x86)\MSI Afterburner>MSIAfterburner.exe /i2cd

AUX +0mV (left) vs. +50mV (right)
There are two entries because I have two cards.



View in HEX editor. AUX +0mV (top) vs. +50mV (bottom).
GPU1


GPU2


We already know that this is how to set voltage in MSI AB using the command line. An example: to set +125mV.

Code:



Code:


MsiAfterburner.exe /sg0 /wi6,30,8d,14 /sg1 /wi6,30,8d,14

/sg# : # = 0 - first card, 1 - second card, etc
/wi6,30,8d,14 : I2C bus 6, device 30, offset/location in I2C dump, offset voltage in HEX (e.g. HEX 14 = DEC 20 : offset voltage = 20 x 6.25mV = +125mV)

When I set the AUX voltage in MSI AB to +50mV, I checked the I2C dump again & noticed the value at offset/location 0x8e changed from 00 to 08.

HEX 08 = DEC 8 : offset AUX voltage = 8 x 6.25mV = +50mV

I confirmed this by resetting to default in MSI AB, closing MSI AB, & running this command line.

Code:



Code:


MsiAfterburner.exe /sg0 /wi6,30,8e,8 /sg1 /wi6,30,8e,8

Running MSI AB again shows that the AUX voltage was set to +50mV, & GPU-Z confirmed the AUX voltage increased.

*Use it at your own risk! Be careful not to over voltage AUX. You may damage your card.*
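The 6.25 mV-per-step math and the `/sg` `/wi` argument construction above can be sketched as a small helper. The bus (6), device (30) and register offsets (0x8d core, 0x8e AUX) are the values kizwan's dump reported for *his* cards; verify them with `/i2cd` on your own card before writing anything, and the same over-voltage warning applies.

```python
# Sketch of the voltage-offset math and MSI Afterburner command line
# described above.  Bus/device/register values are from kizwan's dump
# and will differ per card -- check your own /i2cd output first.

STEP_MV = 6.25  # one voltage-offset step, per the post

def mv_to_hex(offset_mv):
    """Convert a voltage offset in mV to the hex byte for /wi."""
    steps = round(offset_mv / STEP_MV)
    return format(steps, "x")

def ab_command(reg, offset_mv, n_gpus=2):
    """Build the MSIAfterburner.exe argument string for n_gpus cards."""
    byte = mv_to_hex(offset_mv)
    parts = [f"/sg{i} /wi6,30,{reg},{byte}" for i in range(n_gpus)]
    return "MsiAfterburner.exe " + " ".join(parts)

print(mv_to_hex(125))         # 125 / 6.25 = 20 steps -> "14"
print(ab_command("8d", 125))  # the +125 mV core example above
print(ab_command("8e", 50))   # the +50 mV AUX example above
```

Dropped into a login batch file (the idea floated earlier in the thread), this would re-apply a known-safe offset after a crash; again, a wrong register or too large a byte can damage the card.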


----------



## fat4l

RAM I/O bus voltage mod done!
GPU-Z is showing 1.047 V VDDCI now.








(More info in the Hawaii BIOS editing thread.)


----------



## MEC-777

Quote:


> Originally Posted by *kizwan*
> 
> I didn't said I can set voltage in Crimson control panel (or CNext). You misunderstood my post there.
> 
> 
> 
> 
> 
> 
> 
> I'm using AB/Trixx of course.
> 
> I have been in the situation - failed boot loop - two times & it's annoying that Win 10 take control of everything. First time Win 10 found error the C: drive & insisted in checking+fixing the problem during booting into windows. Guess what, it failed to correct the "error" twice - auto-reboot in between - and on the 3rd reboot, Win 10 tried to enter Advanced Startup which was successful but no use at all because keyboard is not responding. I managed to access Advanced Startup using Win 10 USB installer bootable drive & fixed it by using System Restore (usually turned off but thankfully forgot at that time). The second time when BF4+overclock crash with the Crimson driver.


Ah, my bad. Misunderstood.









So, it's possible there is an error with my Windows 10 installation which might need a system restore/repair? Or do you think reinstalling the Crimson drivers could fix the problem?

Could a mild OC cause this? (1060/1350, +50% power and +0mV.) I've run this OC many times before, even higher without any extra voltage (1100/1350), and it was totally stable. And, sorry, one more question: is it possible that OCing with Overdrive in the new Crimson is not as stable as OCing with MSI AB? (I'm still using MSI AB, but only for monitoring clocks/temps/usage, etc.)


----------



## mfknjadagr8

Quote:


> Originally Posted by *kizwan*
> 
> I didn't said I can set voltage in Crimson control panel (or CNext). You misunderstood my post there.
> 
> 
> 
> 
> 
> 
> 
> I'm using AB/Trixx of course.
> 
> I have been in the situation - failed boot loop - two times & it's annoying that Win 10 take control of everything. First time Win 10 found error the C: drive & insisted in checking+fixing the problem during booting into windows. Guess what, it failed to correct the "error" twice - auto-reboot in between - and on the 3rd reboot, Win 10 tried to enter Advanced Startup which was successful but no use at all because keyboard is not responding. I managed to access Advanced Startup using Win 10 USB installer bootable drive & fixed it by using System Restore (usually turned off but thankfully forgot at that time). The second time when BF4+overclock crash with the Crimson driver.
> 
> I tried to google how to set AUX voltage using command line. I'm pretty sure somebody already shared the info but I can't find it. So have to invetigate it myself using info from *here* & *here*.
> 
> 
> Spoiler: MSI Afterburner I2C dump
> 
> 
> 
> Code:
> 
> 
> 
> Code:
> 
> 
> C:\Program Files (x86)\MSI Afterburner>MSIAfterburner.exe /i2cd
> 
> AUX +0mV (left) vs. +50mV (right)
> There are two entry because I have two cards.
> 
> 
> 
> View in HEX editor. AUX +0mV (top) vs. +50mV (bottom).
> GPU1
> 
> 
> GPU2
> 
> 
> We do know that this is how to set voltage in MSI AB using the command line. An example, to set +125mV.
> 
> Code:
> 
> 
> 
> Code:
> 
> 
> MsiAfterburner.exe /sg0 /wi6,30,8d,14 /sg1 /wi6,30,8d,14
> 
> /sg# : # = 0 - first card, 1 - second card, etc
> /wi6,30,8d,14 : I2C bus 6, device 30, offset/location in I2C dump, offset voltage in HEX (e.g. HEX 14 = DEC 20 : offset voltage = 20 x 6.25mV = +125mV)
> 
> When I set AUX voltage in MSI AB to +50mV, I check I2C dump again & noticed value at offset/location 0x8e changed from 00 to 08.
> 
> HEX 08 = DEC 8 : offset AUX voltage = 8 x 6.25mV = +50mV
> 
> I confirmed this by resetting to default in MSI AB, closed MSI AB & run this command line.
> 
> Code:
> 
> 
> 
> Code:
> 
> 
> MsiAfterburner.exe /sg0 /wi6,30,8e,8 /sg1 /wi6,30,8e,8
> 
> Running MSI AB again show that AUX voltage was set to +50mV & GPU-Z confirmed the AUX voltage is increased.
> 
> *Use it at your own risk! Be careful not to over voltage AUX. You may damage your card.*


I'm confused here... why wouldn't you just set the 2D overclock to modest, or even stock, and push the 3D clock up using another saved preset? Or am I missing something? Are you saying you had it set this way but it persists after restart for both 2D and 3D?

EDIT: never mind, you aren't using AB... sorry... I'm too tired for this.


----------



## kizwan

Quote:


> Originally Posted by *MEC-777*
> 
> Ah, my bad. Misunderstood.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So, it's possible there is an error with my windows 10 installation which might need to have a system restore/repair performed on it? Or do you think reinstalling the Crimson drivers could fix the problem?
> 
> Could a mild OC cause this? (1060/1350, +50% power and +0mV). I've run this OC many times before, even higher without any voltage (1100/1350) and it was totally stable. And, sorry, one more question; is it possible that OCing with Overdrive in the new Crimson is not as stable as OCing with MSI AB? (I'm still using MSI AB, but only for monitoring clocks/temps/usage etc.)


No idea. Try disabling Fast Boot (_Control Panel \ System and Security \ Power Options \ "Choose what the power buttons do" \ "Change settings that are currently unavailable"_, then untick Fast Boot). That can cause problems too. Remember to reboot afterwards. If you still get that BSOD, you can try uninstalling Crimson using DDU & re-installing. If it's still problematic, do a system restore to the last point where you didn't have this problem.

I have no experience overclocking with CCC or Crimson, but it shouldn't cause any problems. When troubleshooting, I usually close all monitoring & overclocking software while playing games. Sometimes this software can cause problems too.


----------



## MEC-777

Quote:


> Originally Posted by *kizwan*
> 
> No idea. Try disabling Fast Boot (_Control Panel \ System and Security \ Power Options \ "Choose what the power buttons do" \ "Change settings that are currently unavailable"_, then untick Fast Boot). That can cause problems too. Remember to reboot afterwards. If you still get that BSOD, you can try uninstalling Crimson using DDU & re-installing. If it's still problematic, do a system restore to the last point where you didn't have this problem.
> 
> I have no experience overclocking with CCC or Crimson, but it shouldn't cause any problems. When troubleshooting, I usually close all monitoring & overclocking software while playing games. Sometimes this software can cause problems too.


Thanks for the suggestions. I still got the crash/freeze/BSOD with MSI AB closed. So I know it's not that.

Odd that Fast Boot could cause a specific game to crash (I haven't tried many other games yet). But yeah, if the problem persists, I'll try DDU again, then reinstall Crimson.


----------



## mus1mus

Have you tried flipping the BIOS switch to the other position?

It might reset the driver and its saved attributes after boot.


----------



## MEC-777

Quote:


> Originally Posted by *mus1mus*
> 
> Have you tried flipping the BIOS switch to the other position?
> 
> It might reset the driver and its saved attributes after boot.


I'll give it a try. Just did a whole race including two practice sessions, a qualifying session and a 7-lap race (about 30 mins of straight racing, GPUs near full load, no problem). After the race ended, it went to the external camera. After a few seconds of that, it froze and went to BSOD with the "thread stuck in device driver" message.









Verified game files, all is well. Will try reinstalling Crimson if the bios switch doesn't fix it.


----------



## corey3333

Does anyone know how to get rid of or reduce screen tearing? I'm also getting some icon flickering in-game while running crossfire. Looking for any advice on this, thanks.


----------



## mfknjadagr8

Quote:


> Originally Posted by *corey3333*
> 
> Does anyone know how to get rid of or reduce Screen Tearing? Also I am getting some Icon Flickering in-game while Crossfiring. Looking for Any advice on this, Thnx


Vsync... if the game you are talking about is The Witcher 3, it's an issue with crossfire. I think a few people got it stopped, but their fixes never worked on mine.


----------



## corey3333

Thanks for the reply, but I always have vsync on. Still having problem.


----------



## mfknjadagr8

Quote:


> Originally Posted by *corey3333*
> 
> Thanks for the reply, but I always have vsync on. Still having problem.


Tearing generally occurs when the framerate exceeds the refresh rate, so the screen sometimes plays catch-up... I edited the above post... which games are you having the issues in?


----------



## MEC-777

Quote:


> Originally Posted by *corey3333*
> 
> Does anyone know how to get rid of or reduce Screen Tearing? Also I am getting some Icon Flickering in-game while Crossfiring. Looking for Any advice on this, Thnx


Best results I've found, especially with crossfire, is to use Vsync in combination with frame rate target control in Radeon Settings set to 60fps. This way, your GPUs will only need to work as hard as necessary to render no more than 60fps, providing a frame for every Vsync refresh. Depending on the game, this will also reduce GPU usage, temps and noise, as the GPUs aren't rendering frames as fast as possible like with straight Vsync. I've also found this reduces/eliminates most of the input lag associated with using Vsync.


----------



## MEC-777

Quote:


> Originally Posted by *corey3333*
> 
> Thanks for the reply, but I always have vsync on. Still having problem.


Are you using Vsync in Catalyst/Radeon settings, or the in-game Vsync option?

I've found the Catalyst/Radeon Vsync doesn't always apply/override the game settings. If you haven't tried yet, try using the in-game Vsync option.

The Witcher 3 is notorious for screen flicker/stutter in crossfire mode, especially with Vsync on. However, you still shouldn't see screen tearing. Just to be clear, this is what screen tearing looks like:



It is when multiple frames are sent to the monitor during its refresh cycle. You can see from the above example that the display received 2 frames during this refresh cycle.


----------



## MEC-777

Wiped out Crimson with DDU and re-installed. So far no crashes, freezing or BSOD. Fingers crossed...









This time I installed it leaving both cards in and powered. Maybe that's what caused the problem the first time....


----------



## corey3333

Witcher 3 and Fallout 4


----------



## corey3333

Vsync does help in most games, but not The Witcher 3, and Fallout 4 does not have a vsync option.


----------



## corey3333

I did not know AMD Catalyst had a vsync option; I thought it was only available in-game. Can you tell me where I can find this option?


----------



## fat4l

Today I did some Fraps benchmarks of *Crysis 3*, Welcome to the Jungle.
134s benchmark, Crossfire 290X (Ares III).
All max details, 1440p, SMAA MGPU 2x.
My results are:
*Stock clocks, 1030/1250MHz:*
Min: 59
Max: 108
Avg: 71.127

*Overclocked clocks, 1200/1700MHz(1500Timings):*
Min: 62
Max: 132
Avg: 85.517

Difference: 20.2%

*3DMark Firestrike eXtreme:*
http://www.3dmark.com/compare/fs/6593207/fs/6590481


Difference in Graphics Score: 19.7%

Nice scaling! Start modding your BIOS, people
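The scaling figures quoted above fall straight out of the averages; a one-liner (my own helper, just reproducing the arithmetic in the post) makes the calculation explicit:

```python
# Percentage gain of the overclocked run over the stock run,
# using the average FPS figures quoted above.
def pct_gain(stock, oc):
    """Return the percent improvement of `oc` over `stock`."""
    return (oc - stock) / stock * 100

# Crysis 3 averages from the post: 71.127 fps stock, 85.517 fps OC'd.
print(round(pct_gain(71.127, 85.517), 1))  # 20.2 (%)
```

The same formula applied to the Firestrike graphics scores gives the 19.7% figure.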


----------



## mus1mus

http://www.3dmark.com/fs/6610752


----------



## ZealotKi11er

Quote:


> Originally Posted by *mus1mus*
> 
> http://www.3dmark.com/fs/6610231


What did you do lol to get that high of overclock?


----------



## Gumbi

Over 16k on a 290... wow. My 290X does 14.8k at 1241MHz core and 1640MHz memory.

Haven't benched with the new drivers yet though!


----------



## mus1mus

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What did you do lol to get that high of overclock?


Twerking the BIOS.









Nothing good though. Only useful in benching.


----------



## ZealotKi11er

Quote:


> Originally Posted by *mus1mus*
> 
> Twerking the BIOS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nothing good though.Only useful in Benching.


What does your card do 24/7 OC?


----------



## mus1mus

1200/1500. ?

Not sure. I'll try gaming on it.


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> http://www.3dmark.com/fs/6610752


gZ lol


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> http://www.3dmark.com/fs/6610752


LOL


----------



## MEC-777

So just an update on the BSOD/freezing/crashing I've been getting since upgrading to Crimson drivers... I've discovered the problem only occurs when playing Project Cars and only when an OC is applied. I have the power limit cranked to keep powertune from downclocking, but with even just a 10% OC, it will randomly freeze and go to BSOD with the "thread stuck in device driver" error message.

I would try a lower OC, but IMHO any OC below 10% isn't really worth it for the gain. I can OC the primary card 10% easily without any problems (like in Fallout 4 which doesn't use crossfire), but any games that do use CF, I'll just have to run the cards at stock clocks. They're fast enough anyways.









After running CF 290's for a while now, I'm giving serious thought about selling both 290's and going to a single Fury or Fury X. I like tinkering, but the time I've spent messing with CF and associated settings when I just wanted to be playing games is accumulating. AMD just cut prices on the Fury line, so it's kind of tempting...


----------



## kizwan

What would cause your computer to hang in games like BF4 & GTA V, not immediately, but after 15 to 30 minutes in-game, when running the card overclocked? I tested several overclocks before with BF4 & did find one core/memory combo that worked, but I lost all of those records when my computer received the Win 10 November update. If the crash were preceded by artifacts or screen blinking, we could easily narrow it down, but it just hangs. What's consistent with the crash is that the memory was overclocked to 1500 or above. Does this mean my cards, or one of my cards, can't run the memory at 1500 or above continuously?


----------



## rdr09

Quote:


> Originally Posted by *MEC-777*
> 
> So just an update on the BSOD/freezing/crashing I've been getting since upgrading to Crimson drivers... I've discovered the problem only occurs when playing Project Cars and only when an OC is applied. I have the power limit cranked to keep powertune from downclocking, but with even just a 10% OC, it will randomly freeze and go to BSOD with the "thread stuck in device driver" error message.
> 
> I would try a lower OC, but IMHO any OC below 10% isn't really worth it for the gain. I can OC the primary card 10% easily without any problems (like in Fallout 4 which doesn't use crossfire), but any games that do use CF, I'll just have to run the cards at stock clocks. They're fast enough anyways.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> After running CF 290's for a while now, I'm giving serious thought about selling both 290's and going to a single Fury or Fury X. I like tinkering, but the time I've spent messing with CF and associated settings when I just wanted to be playing games is accumulating. AMD just cut prices on the Fury line, so it's kind of tempting...


MEC, I am serious here... why do you have to OC your GPUs? It's the CPU that needs the OC. In your case, though, you have a locked one. My crossfire HD 7900 cards got bottlenecked by my i7 at 4.5 GHz with HT off (basically an i5) in BF3, C3, and BF4.

With a single 290, though, HT off was fine.


----------



## mus1mus

Hanging and BSOD or Black screen are memory OC instability attributes.









Core instability will give you flickers and stuff.


----------



## ZealotKi11er

Quote:


> Originally Posted by *mus1mus*
> 
> Hanging and BSOD or Black screen are memory OC instability attributes.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Core instability will give you flickers and stuff.


I get RSOD when I am not stable.


----------



## kizwan

My thoughts exactly. Oh well.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Hanging and BSOD or Black screen are memory OC instability attributes.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Core instability will give you flickers and stuff.
> 
> 
> 
> I get RSOD when I am not stable.
Click to expand...

Sometimes I do get a red screen too. More like a maroon screen of death to me.


----------



## mus1mus

No green?


----------



## mfknjadagr8

Quote:


> Originally Posted by *mus1mus*
> 
> Hanging and BSOD or Black screen are memory OC instability attributes.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Core instability will give you flickers and stuff.


Either that, or it's driver related... do you get a sound loop and video freeze, or a black-screen hang? Or a screen freeze with no sound?


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> No green?


Quote:


> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Hanging and BSOD or Black screen are memory OC instability attributes.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Core instability will give you flickers and stuff.
> 
> 
> 
> either that or driver related...do you get sound loop and video freeze or do you get black screen hang? Or screen freeze with no sound?
Click to expand...

Video freeze & sound loop.


----------



## mfknjadagr8

Quote:


> Originally Posted by *kizwan*
> 
> Video freeze & sound loop.


I've gotten this several times from a borked driver, whether from a bad overclock (low voltage, or more than the card can handle on core, memory, or both) or from Windows Update doing what it sometimes does. I always run DDU, then CCleaner on the registry (in safe mode), then restart and install drivers with one card unpowered, then shut down and reconnect the second card. Boot, let Windows detect the second card, then restart and activate crossfire. The last restart may not be necessary, but this has never failed to work once I knew it was a driver issue.


----------



## MEC-777

Quote:


> Originally Posted by *rdr09*
> 
> MEC, i am serious here . . . why do you have to oc your gpus? it is the cpu that needs oc. in your case, though, you have a locked one. My crossfire HD 7900 cards got bottlenecked by my i7 4.5 GHz HT off (basically an i5) in BF3, C3, and BF4.
> 
> With a single 290, though, HT off was fine.


I've rarely seen any situation where my CPU was a bottleneck, if at all. Believe it or not, a 10% OC on both cards does give a noticeable performance gain, depending on the game. It can mean the difference between frame rates in the 50s or 60+. Not entirely necessary, though; they are fast enough without an OC. But if I can run them a bit faster, why not? It's not like I'm trying to break benchmark records; it's just a 10% OC.









Quote:


> Originally Posted by *mus1mus*
> 
> Hanging and BSOD or Black screen are memory OC instability attributes.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Core instability will give you flickers and stuff.


Interesting. Perhaps I'll try a 10% OC on the core and leave the memory stock. Maybe the Elpida memory on my second card doesn't like doing +100MHz with +0mV.
Quote:


> Originally Posted by *mfknjadagr8*
> 
> I've gotten this several times from a borked driver...from bad overclock (low voltage or more than card can handle on either core or memory or both) or Windows update doing what it sometimes does....I always ddu then cccleaner on registry (in safe mode)...then restart then install drivers with one card unpowered then shutdown and reconnect the second card...boot let windows detect the second card then restart and activate crossfire...the last restart may not be necessary but this has never failed to work once I knew it was driver issue


I usually follow that method. Last time I re-installed the Crimson drivers, I just left both cards powered and everything went smoothly.

The only odd thing is where you go to enable CF: it will not allow me to check the option to enable CF for apps with no CF profile. I check the box and click apply, close the window, reopen the window, and that box is unchecked again (not enabled). I don't understand why that is, or whether it's actually enabling or not.


----------



## kizwan

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Video freeze & sound loop.
> 
> 
> 
> I've gotten this several times from a borked driver...from bad overclock (low voltage or more than card can handle on either core or memory or both) or Windows update doing what it sometimes does....I always ddu then cccleaner on registry (in safe mode)...then restart then install drivers with one card unpowered then shutdown and reconnect the second card...boot let windows detect the second card then restart and activate crossfire...the last restart may not be necessary but this has never failed to work once I knew it was driver issue
Click to expand...

In my case, not driver issue. It was unstable overclock.


----------



## mus1mus

Quote:


> Originally Posted by *MEC-777*
> 
> Interesting. Perhaps I'll try a 10% OC on the core and leave the memory stock. Maybe the Elpida memory on my second card doesn't like to do +100mhz with +0mV.


You know, I've been pushing things since last night, and it's not the core that gives me a blue screen, nor the core needing more voltage.

It's always the memory. You can dial in a core-only OC without adding voltage, but when you try to OC the memory along with the core, more errors will arise.

Also note, 1500 is right at the very end of the strap. So it's always a give-and-take situation between stability and performance. I mean, it's where most of the performance for that strap happens, but it's also more prone to be unstable.

I am willing to bet 1525 will give you more stability. Maybe even 1550.

That is due to how the timings are set for the straps. But also note that the memory has its own limit.


----------



## Shweller

Been having a weird issue with Star Wars Battlefront. It seems my 290Xs in crossfire break the game. Once I am in a match I can see the HUD, but the rest of the screen goes black. I run around and get shot; sometimes the image will pop up for a second. The game works fine with crossfire disabled. Gaming at 1440p with FreeSync. Anyone else have this problem?


----------



## MEC-777

Quote:


> Originally Posted by *mus1mus*
> 
> You know, I've been pushing things last night and it's not the core that gives me a blue screen nor core needing more Voltage.
> 
> It's always the Memory. You can dial a Core only OC without adding Voltages but when you try to OC the memory along with the core, more errors will arise.
> 
> Also note, 1500 is right at the very end of the strap. So it's always a give and take situation for stability and performance. I mean, it's where most performance for that strap happen. But it's also more prone to be unstable.
> 
> I am willing to bet, 1525 will give you more stabilty. Maybe even 1550.
> 
> That is due to how the timings are set for the straps. But also note that the memory has it's own limit.


1525 or 1550? Holy smokes, not on my cards. Not the Elpida card, anyways. My HIS card with Hynix will do 1600, but on the Elpida card I can't get more than 1450, and it barely finishes a run of Firestrike.









After some testing this evening, I played Project Cars for several hours with a 10% OC on the core and left the memory at stock (1250). Totally fine, no crashes or hiccups at all. Then I tried a 16.2% core OC (1100) and again, totally fine. Even the temps were good. So it seems any kind of memory OC on the Elpida card doesn't sit well. I'm happy though. The frame rate was tickling 100fps at times, running 1440p high settings with DX4S AA.









Also, just as a side note: I decided to take a look at CPU usage, and it was pegged at 100% on all four cores most of the time while racing. Both GPUs were also pegged at 100% most of the time. So I guess the system is really being used to its fullest in this game.







I was watching a youtube video at the same time (only streaming at 360p) so that probably wasn't helping.


----------



## corey3333

I have many hundreds of videos that play at 576i, and they are blurry. Is there any way to clear up the picture quality of 576i video?


----------



## corey3333

I also get a lot of ghosting on 576i. Any advice?


----------



## mus1mus

Quote:


> Originally Posted by *corey3333*
> 
> I have many hundreds of videos that play at 576i, they are blurry. is there anyway to clear up picture quality that is 576i ?


Play it at its defined res.

Meaning, a small window: no zoom or full-screen playback.


----------



## H1vemind

Hi guys, I would like to join the community as I recently upgraded to an MSI R9 290X. It's a stock-cooled model with the Twin Frozr 4 cooler. Here's the GPU-Z validation link with my highest stable OC without extra voltage: http://www.techpowerup.com/gpuz/details.php?id=8f4mb


----------



## JourneymanMike

Quote:


> Originally Posted by *H1vemind*
> 
> Hi guys, i would like to join the community as I recently upgraded to an MSI R9 290x, its a stock cooled model with the twin frzr 4 cooler. Heres the gpuz validation link with my highest stable OC without extra voltage. http://www.techpowerup.com/gpuz/details.php?id=8f4mb


Have a picture of that BEAST?


----------



## fyzzz

@mus1mus has helped me to improve my score. Not quite as impressive as his, but I would say pretty good considering the clock speeds. I also believe it can go even further with a bit of time, but this will do for now. I couldn't run the 390 BIOS because it gives me a black screen when loading test 2, which is a shame. http://www.3dmark.com/fs/6623545 - kind of a bad run, missing about 100 points on physics, but the highest GPU score I've gotten.


----------



## H1vemind

@JourneymanMike Unfortunately I didn't do that before installing it, so I haven't got a pic ready yet; the card is much beefier than I initially thought it would be, for some reason. Gotta say beast is indeed the right word to describe it. After some quick benching I got it stable at 1190MHz core / 1375MHz memory with +131mV (I am maxing out the fans for Firestrike testing, though, and it's also much quieter than I thought it would be). Gotta stop there though, as it's forcing my CPU to throttle the higher I up the voltage (I have a 4690K with the stock cooler from an Intel Pentium G3258 for now, which means my physics scores are terrible compared to anyone with a decently cooled 4690K).


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> @mus1mus has helped me to improve my score, not quite as impressive as his, but i would say pretty good considering the clockspeeds. I also believe that it can go even further with a bit of time. But this will do for now. Couldn't run 390 bios because it gives me blackscreen when loading test 2, which is a shame. http://www.3dmark.com/fs/6623545 ,kinda of a bad run missing about 100 points on physics, but highest gpu score i've gotten.


Holy!

That's impressive. Your memory might still improve that!

Do you fancy joining the OCN HWBot team?


----------



## alancsalt

You haven't bothered to get the HWbot postbit mus1mus?

http://www.overclock.net/t/803475/hwbot-postbit-information


----------



## fat4l

Quote:


> Originally Posted by *fyzzz*
> 
> @mus1mus has helped me to improve my score, not quite as impressive as his, but i would say pretty good considering the clockspeeds. I also believe that it can go even further with a bit of time. But this will do for now. Couldn't run 390 bios because it gives me blackscreen when loading test 2, which is a shame. http://www.3dmark.com/fs/6623545 ,kinda of a bad run missing about 100 points on physics, but highest gpu score i've gotten.


Mind sharing some tips ?
What exactly do u change in bios ?
Any good for gaming ?
Thx


----------



## ZealotKi11er

Quote:


> Originally Posted by *fat4l*
> 
> Mind sharing some tips ?
> What exactly do u change in bios ?
> Any good for gaming ?
> Thx


Would like to know too.


----------



## mus1mus

Quote:


> Originally Posted by *alancsalt*
> 
> You haven't bothered to get the HWbot postbit mus1mus?


Not yet. But looking into it now.

Thanks for the tip.


----------



## mus1mus

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fat4l*
> 
> Mind sharing some tips ?
> What exactly do u change in bios ?
> Any good for gaming ?
> Thx
> 
> 
> 
> Would like to know too.
Click to expand...

Something that involves BIOS editing. And practicality-wise, it's yet to be proven whether it's beneficial for gaming. Otherwise, it's only good for benching.

But actually, maybe you guys can test it with gaming? Are you up to that?


----------



## ZealotKi11er

Quote:


> Originally Posted by *mus1mus*
> 
> Something that involves BIOS editing. And practicality wise, it's yet to be proven if beneficial for gaming. Otherwise, it's only good for benching.
> 
> But actually, maybe you guys can test it with gaming? Are you up to that?


Sure thing.


----------



## mus1mus

Okay, if you are into BIOS modding and have been initiated into memory timing tweaking, here's my recipe:

77 71 33 20 00 00 00 00 29 39 57 *26 50* 55 09 0E 26 1D 17 03 00 68 C2 00 22 AA 1C 08 54 0C 14 20 AA 89 00 A6 00 00 07 C0 0F 0A 18 1D 31 1E 27 10

These are the series of HEX values for the memory timings.

The idea is to tighten the strap you are testing. These Hawaii GPUs have these straps:

1750
1625
1500
1375
1250
1000, etc. It follows that 1626 belongs to the 1750 strap and 1599 belongs to the 1625 strap. And so on.

Now, if you can clock your memory up to, say, 1650, you can substitute the timings on that strap with the timings from the previous strap. So you can take the 1625 strap timings and substitute them into the 1750 strap. Do that until you find the lowest (tightest) strap timings you can use for your needs, or before instabilities start to ruin performance.

Now, if you look at the timings above, two values are in bold. These are the final recipe.

On my BIOS (sorry, I can't put one up at the moment, as I am not on my rig now, nor do I have a copy I can access while on mobile), I use the 1250 strap timings from 1375 up to 1750. That means I don't lose performance going past a certain strap's end.

The tricky part is, using the 1000 strap timings causes too many issues. So does using the Stilt's 1250 timings. So I ended up looking for a way to tighten my 1250 a bit.

So far, 2 HEX values have been tested that give me that. Those in bold can be replaced by the values from the 1000 strap at the same positions.

This is a little tricky if you are not into modding, so yeah. Let me know, or ask fyzzz, if you have further questions.
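The strap-boundary rule described above can be sketched as a small lookup. This is illustrative only (the helper names are my own; the actual timings live in the BIOS and are edited by hand in hex): each memory clock falls into the lowest strap whose ceiling is at or above that clock, and "tightening" means copying a lower strap's timing block into a higher strap's slot.

```python
# Sketch of the Hawaii memory-strap rule from the post above.
STRAPS = [1000, 1250, 1375, 1500, 1625, 1750]  # strap ceilings in MHz

def strap_for(mem_clock):
    """Return the strap whose timings apply at a given memory clock."""
    for ceiling in STRAPS:
        if mem_clock <= ceiling:
            return ceiling
    raise ValueError("clock beyond the highest strap")

print(strap_for(1626))  # 1750 (just past the 1625 ceiling)
print(strap_for(1599))  # 1625
```

This is why running the memory at, say, 1626 instead of 1625 can suddenly cost performance: the card switches to the looser 1750-strap timings, unless you have substituted tighter ones into that slot.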


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Okay, if you are into BIOS modding and have been initiated to memory timing tweaking, here's my recipe:
> 
> 77 71 33 20 00 00 00 00 29 39 57 *26 50* 55 09 0E 26 1D 17 03 00 68 C2 00 22 AA 1C 08 54 0C 14 20 AA 89 00 A6 00 00 07 C0 0F 0A 18 1D 31 1E 27 10
> 
> These are the series of HEX values for the memory timings.
> 
> The idea is to tighten the strap you are testing. So these hawaii GPUs have these straps:
> 
> 1750
> 1625
> 1500
> 1375
> 1250
> 1000 etc. It follows, that 1626 belongs to 1700 strap and 1599 belongs to 1625. And so on.
> 
> Now, if you can clock your memory up to, say 1650, you can substitute the timings on that strap with the timings from the previous strap. So, you can pick 1625 strap timings and substitute that to 1750 Strap. Do that till you find the lowest(tightest) strap timings you can use for your need. Or before instabilities start to ruin performance.
> 
> Now, if you look at the timings above, two values are on bold. These are the final recipe.
> 
> On my BIOS, (sorry, can't put up one at the moment as I am not on my rig now nor have a copy I can access while on mobile) I use 1250 strap to 1375 up to 1750. That means I dont lose performance going higher past a certain strap end.
> 
> The tricky part is, using 1000 strap timings causes too many issues. So is using the Stilt's 1250 timings. So I ended up looking for a way to tighen a bit my 1250.
> 
> So far, 2 HEX values have been tested that gives me that. Those in bold can be replaced by the values from 1000 strap on the same positions.
> 
> This is a little tricky if you are not into modding so yeah. Let me know, or Fyzzz if you have further questions.


Interesting.
Will check my timings and see whats going on there


----------



## fyzzz

I finally know how to get rid of the black screens I've had with the 390 BIOS: I just had to run it more stable, and then the magic happened.
http://www.3dmark.com/3dm/9477943?


----------



## mus1mus

Pusheet!










By the way, it may not be a complete blackscreen. My monitor sometimes loses the ability to refresh the input.


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> Pusheet!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> By the way, it may not be a complete blackscreen. My monitor sometimes loses the ability to refresh the input.


I'm trying, but as soon as I go over 1240MHz it black-screens and I can't do anything about it. Test 1 runs fine at 1265MHz, and when test 2 is about to start, it black-screens. So frustrating.









----------



## fat4l

Quote:


> Originally Posted by *fyzzz*
> 
> I'm trying, but as soon as i go over 1240 mhz it blackscreens and i can't do anything about it. Test 1 runs fine on 1265mhz and when test 2 is about to start, it blackscreens. So frustrating
> 
> 
> 
> 
> 
> 
> 
> .


Similar happening here. What Hz are u using ?


----------



## fyzzz

Quote:


> Originally Posted by *fat4l*
> 
> Similar happening here. What Hz are u using ?


Just 60Hz; it runs fine on the 290 BIOS, however.


----------



## fat4l

Quote:


> Originally Posted by *fyzzz*
> 
> Just 60hz, runs fine on 290 bios however.


Aha. And this black screen is affected only by the core clock, right?


----------



## fyzzz

Quote:


> Originally Posted by *fat4l*
> 
> aha. and this blackscreen is affected only by core clock right ?


Seems like it, and it's only happening with the 390 BIOS too.


----------



## fyzzz

Well, I gave up on the 390 BIOS for now. I went back and ran a few benches with the 290 BIOS instead. The highest I got:
http://www.3dmark.com/fs/6628183


----------



## fat4l

Quote:


> Originally Posted by *fyzzz*
> 
> Seems like it, only happening with 390 bios too


It's happening to my card as well, and I'm using the stock BIOS.
There must be something in the BIOS causing it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *fyzzz*
> 
> Well i gave up the 390 bios for now. Went back and ran a few benches with 290 bios instead. Highest i got:
> http://www.3dmark.com/fs/6628183


New drivers better score?


----------



## fyzzz

Quote:


> Originally Posted by *ZealotKi11er*
> 
> New drivers better score?


Maybe it helped a little, yes, but not much. That score is mainly thanks to all of the BIOS mods.


----------



## ZealotKi11er

Quote:


> Originally Posted by *fyzzz*
> 
> Maybe a little help yes, but not much. But that score is mainly thanks to all of the bios mods.


You mean memory mods?


----------



## fyzzz

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You mean memory mods?


Yes, memory mods. My card's memory can handle tight timings at high clock speeds, and therefore it scores pretty high without being north of 1300MHz.


----------



## ZealotKi11er

Quote:


> Originally Posted by *fyzzz*
> 
> Yes, memory mods. My cards memory can handle tight timings at high clockspeeds and therefore it scores pretty high without being north of 1300 mhz.


Have you tried gaming on these BIOS mods? I can run 3DMark all day and then get a black screen or red screen during heavy gaming.


----------



## fyzzz

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Have you tried to game on these BIOS mods? I can run 3DMark all day and then get Black screen, Red Screen during heavy gaming.


No, I use a 980 Ti for that







Just using the 290 for benching right now. But I used to run a BIOS with 1100/1625 and tighter timings as a 24/7 BIOS, and that worked fine.


----------



## kizwan

Quote:


> Originally Posted by *kizwan*
> 
> What would be the cause if your computer hanged in games like BF4 & GTA V, not immediately but after 15 to 30 minutes in game when running the card overclocked? I did test several overclock before with BF4 & did found one core/memory combo that work but lost all of the record when my computer received Win 10 November update. If the crash preceding with artifacts or screen blinking, then we can easily narrow it down but it just hanged. What consistent with the crash is the memory was overclocked to 1500 or above. Is this means my cards or one of my card can not run with memory at 1500 or above continuously?


I just found out that the problem was caused by the modded BIOS. My PC would crash exactly 15 minutes into BF4, and not just when overclocked but at stock as well. Actually, I don't have the same problem with GTA V (I erroneously said I did); those crashes were caused by an unstable CPU OC (WHEA BSOD), which I already fixed by lowering the VCCSA & VTT voltages. I did a clean install of the driver like @mfknjadagr8 suggested, just to rule out a borked driver. Only after I re-flashed the cards with the 290 TRI-X OC BIOS did BF4 run without any problem. I think I'll just stick with the 290 BIOS.


----------



## gupsterg

Quote:


> Originally Posted by *kizwan*
> 
> I just found out that the problem was caused by modded BIOS.


Had you used a modded factory BIOS for your card? Just to share my own experience: I've had no issues, and I use my factory BIOS modded with clocks/voltages/tightened RAM timings.

My recent roster of games played for several hours since modding/using a BIOS includes GTA IV, Crysis 2 & 3, Diablo 3 and Battlefront, with my OC'd CPU setup.

I've also clocked up a lot of hours (at least 100+) of GPU Folding@home on the modded BIOS; the logs show no faults, and no issues with the PPD rate either.


----------



## buttface420

Seems my 290X is a poor clocker. The best stable OC I can get is 1100/1375; even with Trixx at +200 I can't get much better. I had 1150/1400 and it crashed before I was done testing. Temps are still in the 50s, so I'm assuming it's just not a good chip.

It seems that no matter how much voltage I give it, it doesn't help, although I found that lowering the power limit from 50 to 30 makes it a little better...

1100/1375 on a water-cooled 290X is kinda lame.


----------



## kizwan

Quote:


> Originally Posted by *buttface420*
> 
> seems my 290x is a poor clocker, best stable i can get is 1100/1375, even with trixx on +200 i cant get much better i had 1150/1400 and it crashes before i was done testing,temps are still in the 50's so im assuming its just not good.
> 
> it seems no matter how much voltage i give it still doesnt help, although i found lowering power limit from 50 to 30 makes it a little better still...
> 
> 1100/1375 on a water cooled 290x is kinda lame


Lowering the power limit in your case is preventing your card from running at full tilt. Once the power level exceeds 130%, your GPU will start throttling. This is why it looks a little better.


----------



## mus1mus

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *buttface420*
> 
> seems my 290x is a poor clocker, best stable i can get is 1100/1375, even with trixx on +200 i cant get much better i had 1150/1400 and it crashes before i was done testing,temps are still in the 50's so im assuming its just not good.
> 
> it seems no matter how much voltage i give it still doesnt help, although i found lowering power limit from 50 to 30 makes it a little better still...
> 
> 1100/1375 on a water cooled 290x is kinda lame
> 
> 
> 
> Lowering power limit in your case preventing your card from running at full tilt. Once power level exceeds 130%, your gpu will start throttling. This is why it look little better.

This^

You will also need more than that kind of assumption and testing to prove it's a dud. Memory instabilities cause crashes, and so do the driver and borked installs.


----------



## ManofGod1000

I have an XFX R9 290 that I unlocked to a full 290X using the XFX core BIOS firmware. However, that BIOS has no UEFI support. There is an XFX R9 290X BIOS with UEFI support, but it's for a Black OC edition. My card has no issue overclocking to 1050 MHz, which is what that BIOS runs at, and I was able to use a Sapphire R9 290 core-edition BIOS that has UEFI. Do you think that XFX BIOS would work with the card I have, or is it too different to be of any use?


----------



## Gumbi

What kind of crashes do you get? Black screens/coloured artefacting generally indicates memory instability. I've never seen a core overclock that badly; most Hawaii chips can do around 1100 MHz core without fiddling too much with voltages.

Max out the power limit and overclock just the core to see what you can do, then tackle the memory.


----------



## gupsterg

Quote:


> Originally Posted by *buttface420*
> 
> it seems no matter how much voltage i give it still doesnt help, although i found lowering power limit from 50 to 30 makes it a little better still...
> 
> 1100/1375 on a water cooled 290x is kinda lame


IIRC, 15% PL was more than ample for me to stabilise 1100/1475; the tests I ran were the full 3DMark FS set with the demo, plus Heaven & Valley for 20 to 30 mins. Only in Heaven & Valley, when scene changes occur, do I get very slight drops in clocks. In things like the RealBench OpenCL test it's stable, plus Folding@home. The whole "PowerTune" thing can play with the frequency even when it's not throttling due to temps, etc., depending on how the app loads the GPU, IMO.

Depending on your ROM's coded power limit, you may need more or less than another owner.


Spoiler: my comparison of some factory ROM power limits





Basically, when you add PL via an OC app, the values in the ROM get that percentage added on top, so the owner of a card with higher values in its ROM needs to add less than one with lower values. I use the last column's values in my ROM and don't need to add any PL via an app.
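As a rough illustration of that stacking, here's a minimal sketch; the wattages are hypothetical examples, not values read from any real Hawaii ROM:

```python
def effective_limit(rom_limit_watts, slider_percent):
    """Effective power cap: the ROM's PowerTune limit plus the OC app's slider %."""
    return rom_limit_watts * (1 + slider_percent / 100)

# A card whose ROM encodes a (hypothetical) 208 W limit needs +20% on the
# slider to reach roughly the same cap a 250 W ROM hits with the slider at 0:
print(round(effective_limit(208, 20), 1))  # 249.6
print(effective_limit(250, 0))             # 250.0
```

That's why two owners adding the same slider percentage can end up with very different real caps.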



I'd try mus1mus's suggestion, buttface420: uninstall all OC apps without keeping settings/profiles and make sure their install directories are empty/deleted. Uninstall the AMD driver, run DDU, then reinstall the driver and OC apps. When testing the OC, try upping the PL in graduations of 5% to see what works best.


----------



## refirendum

Yay, I got my 290X and it's on water: Aquacomputer block and backplate, XFX reference card.

What settings should I go for for just a mild overclock, or should I be satisfied with how it is as a reference card?


----------



## mus1mus

Quote:


> Originally Posted by *refirendum*
> 
> yay i got my 290X and it's on water. Aquacomputer block and backplate, XFX reference card.
> 
> what settings should i go for for only a mild overclock or should i be satisfied with how it is as a reference card?


Try Trixx by Sapphire.
Actually, before you OC, what memory chips do you have on the card?

Hynix: try to reach 1625 on the memory and 1100 core at +100 mV.

Elpida: aim for 1500 memory and 1100 core at +100.

If you can reach those, try +200 and record the max temps and voltages in Heaven at, say, 1200 core with the memory speeds mentioned above.

Verify your OC by actually gaming on it. It could improve your gaming.
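Jotting those suggested starting points down as a lookup, purely as a note-taking sketch (the clocks are the forum suggestions above, not validated limits for any particular card):

```python
# Suggested first-pass OC starting points by memory vendor, per the post
# above. Clocks in MHz, voltage as the Trixx offset in mV.
START_POINTS = {
    "Hynix":  {"core": 1100, "mem": 1625, "mv_offset": 100},
    "Elpida": {"core": 1100, "mem": 1500, "mv_offset": 100},
}

def suggested(vendor):
    """Return the suggested starting point for a memory vendor, or None."""
    return START_POINTS.get(vendor)

print(suggested("Elpida"))  # {'core': 1100, 'mem': 1500, 'mv_offset': 100}
```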


----------



## i2CY

Hey,

Has anyone tried the new Radeon Software Crimson Edition driver yet?
http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64

Does the R9 290 not have AMD FreeSync™ support in the BIOS? Crimson states "Not Supported", but it should show up as supported for my Sapphires in Crossfire.

Side note: has anyone rolled back to Catalyst due to choppy video playback, or am I the only one?


----------



## i2CY

Sorry, forget the last post.
I've got one for everyone: on the R9 290's BIOS DIP switch, which BIOS should be loaded in which position?

https://www.techpowerup.com/vgabios/index.php?did=1002-67b1--&page=2


----------



## ZealotKi11er

Quote:


> Originally Posted by *refirendum*
> 
> yay i got my 290X and it's on water. Aquacomputer block and backplate, XFX reference card.
> 
> what settings should i go for for only a mild overclock or should i be satisfied with how it is as a reference card?


Under water, the card should do 1100-1125 MHz with no extra voltage. The memory should do about 1375 MHz; you need ~+25 mV for 1500 MHz. Even though you are under water, don't go over +100 mV for 24/7 use. My cards, for example, can do 1200 MHz/1500 MHz with +100 mV; the other needs +125 mV. I would say almost all cards can do 1175 MHz.
Quote:


> Originally Posted by *i2CY*
> 
> Hey,
> 
> Has anyone tried the New Radeon Software Crimson Edition Driver Manager yet?
> http://support.amd.com/en-us/download/desktop?os=Windows+10+-+64
> 
> Does the R9 290 not have AMD FreeSync™ Technology in the bios? Crimson States "Not Supported" show up supported for my Sapphires Crossfired.
> 
> Side note, anyone roll back to Catalyst due to choppy video play back or am I the only one?


Yes, the R9 290 supports FreeSync. The problem is that your TV/monitor does not.


----------



## spyshagg

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Have you tried to game on these BIOS mods? I can run 3DMark all day and then get Black screen, Red Screen during heavy gaming.


Same.

My cards could do this back in August:

single: http://www.3dmark.com/fs/5662841 (15101 GPU score)

Crossfire: http://www.3dmark.com/fs/5818885 (28472 GPU score)

But they black-screen in any serious gaming above 1130 core/1300 mem.


----------



## battleaxe

Quote:


> Originally Posted by *spyshagg*
> 
> Same.
> 
> My cards could do this back in August
> 
> single: http://www.3dmark.com/fs/5662841 15101 gpu score
> 
> Crossfire: http://www.3dmark.com/fs/5818885 28472 gpu score
> 
> But blackscreen at any serious gaming above 1130gpu/1300mem.


It's normal not to be able to game at benching clocks. No worries.


----------



## spyshagg

Sure, it just means that things have changed considerably since I was last deeply involved in overclocking (~2003-2008).

Back then we used GPU benchmarks to find the max stable 24/7 gaming overclock more quickly. But now my cards can push 1230 MHz in Firestrike/Heaven easily, yet fail at 1140 MHz in Mad Max, BF4, etc.

That makes GPU benchmarking for stability quite pointless, IMHO. Thank god the CPU stability benchmarks behave like the old days.


----------



## H1vemind

Hey guys/gals, just a quick question: is it possible to increase the core and memory voltage independently, or does the card derive the memory voltage from what the core gets?


----------



## JourneymanMike

Oops, never mind.


----------



## mtrai

Now that I have pretty much rebuilt my system, switching from an AMD FX-8120 BE to an Intel Skylake 6600K, I'm working on overclocking everything. I think I'm reaching my limits until I can switch to either an AIO or full liquid cooling. Here are my 3DMark results for my cards. Mind you, I never planned to crossfire when I bought the R9 290X back in the spring, but I won an R9 290 from the company shortly after I registered it.

Thoughts on these results? What other info is needed? I can push my CPU a bit higher, but it's on air at this time. It will take some time before I have the money to buy liquid cooling for everything.

http://www.3dmark.com/3dm/9549716


----------



## BadRobot

Hey guys,

R9 290 user here. I bought the Arctic Accelero Xtreme IV a while ago and only recently found out that the 290 doesn't push enough power to the fan header to make the fans spin. I had to rig an ATX 4-pin to 3-pin fan adapter to push 12 V to the three fans (see image). I messaged Arctic and they said to set the PWM myself to see if it works... how on earth do I do that? ATI CCC doesn't react when I put it on manual 100%, and neither does MSI Afterburner. In fact, they both report the fans at xx% based on temperature even though nothing is connected.

Help D:



Video of what is happening:


----------



## i2CY

Hey!
I just run my Arctic Accelero Xtreme III off my PSU; the fans run at 100% the whole time off a splitter, both running fine!
Did your IV not come with the 7 V + 12 V PSU adapter?


----------



## BadRobot

Quote:


> Originally Posted by *i2CY*
> 
> Hey!
> I just run my Arctic Accelero Xtreme III off my PSU, they run at 100% the whole time, and it a swiper, both running!
> Did you VI not come with the Adapter 7V + 12V to PSU?


It's not as loud as expected running off the 12 V from the PSU. I checked the box again and there's no mention of an adapter. Box contents: heat sink, long screws, screws, metal washers, nuts, spacers, washers, foam, thermal pads, clips, card holder, protective film. No adapter.







And good luck to me trying to actually find one online from stores nearby; I can't find any cables like it.


----------



## i2CY

http://www.arctic.ac/worldwide_en/accelero-xtreme-iii.html

No mine came with.....


----------



## BadRobot

Quote:


> Originally Posted by *i2CY*
> 
> 
> http://www.arctic.ac/worldwide_en/accelero-xtreme-iii.html
> 
> No mine came with.....


Well, I've now forced a 3-pin fan cable into the cooler's 4-pin connector and that works too; at least I can control the speed through the front panel. The instruction PDF doesn't mention any adapter like yours. It would have been nice to have, though. I messaged them about it and they told me to set the PWM higher.

Thanks for the help.


----------



## JourneymanMike

Fire Strike results, No OC on the cards, CPU @ 4.9GHz...
http://www.3dmark.com/fs/6681635

Edit: did a little OC on the GPU, 1000 to 1100...








http://www.3dmark.com/3dm/9553564

Want more? Actually, I have no idea what I'm doing!









Edit: again,

Just a little more
http://www.3dmark.com/3dm/9553653

And a little more...
http://www.3dmark.com/3dm/9553829

I'm using MSI AB

Even Better!
http://www.3dmark.com/3dm/9553990


----------



## JourneymanMike

The last one of the night...

http://www.3dmark.com/fs/6682805

Plus RealBench.... http://rog.asus.com/rog-pro/realbench-v2-leaderboard/?search=JourneymanMike


----------



## MEC-777

It is with mixed emotions that I say it's time for a change. I ordered a 980 Strix last night and got a good deal. I'll be selling both 290s.


----------



## fyzzz

Quote:


> Originally Posted by *MEC-777*
> 
> It is with mixed emotions that I say; it's time for a change. Ordered a 980 Strix last night. Got a good deal. Will be selling both 290's.


Yeah, I know the feeling. I didn't want to switch from my 290, but I got an awesome deal on a 980 Ti, so I ended up switching anyway. It feels better now, seeing how powerful this card is and how well it overclocks.


----------



## Gumbi

Quote:


> Originally Posted by *MEC-777*
> 
> It is with mixed emotions that I say; it's time for a change. Ordered a 980 Strix last night. Got a good deal. Will be selling both 290's.


Why not grab a 980 Ti? Two 290s will solidly outperform a single 980.


----------



## Slam-It

Finally...count me in!


----------



## MEC-777

Quote:


> Originally Posted by *fyzzz*
> 
> Yeah i know the feeling, i didn't want to switch from my 290, but i got a awesome deal on a 980 ti, so i ended up switching anyway. But it feels better now, seeing how powerful this card is and how good it overclocks.


Quote:


> Originally Posted by *Gumbi*
> 
> Why not grab a 980ti? 2 290s will outperform a 980 solidly.


I would have liked to go with a 980 Ti, but here in Canada the prices are just insane right now. They start at around $800, which means well over $900 after tax. I simply can't justify spending 30-40% more for not that much extra performance (over an OC'd 980). My budget was pretty much whatever I can sell both 290s for (~$500).

I know the 290s can outperform a single 980, but here's the thing: only a few games I play actually take full advantage of crossfire. The rest either don't benefit much or have to run on a single card. So I'd rather have better performance overall across all my games, if that makes sense. I'm only using a 60 Hz display, so the 980 will still run everything at over 60 fps easily anyway; I won't really see the difference between 90 and 70 fps with Vsync on.









The crossfire setup has also given me a number of issues and headaches. I feel like I've spent more time tinkering and messing with it to get it working right than actually playing games. So I'm ready for a change.


----------



## ManofGod1000

Quote:


> Originally Posted by *MEC-777*
> 
> Would have liked to go with a 980Ti but here in Canada the prices are just insane right now. They start at around $800, which means well over $900+ after tax. I simply can't justify spending 30-40% more for not that much extra performance (over a 980 OC). Pretty much whatever I can sell both 290's for is what my budget was. (~$500).
> 
> I know the 290's can outperform a single 980, but here's the thing; only a few games I play actually take full advantage of crossfire. The rest either don't benefit much or have to run on a single card. So I'd rather have better performance overall across all my games, if that makes sense. I'm only using a 60hz display so the 980 will still run everything at over 60fps easy anyways, I won't really see the difference between 90 and 70fps with Vsync on.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The crossfire setup has also given me a number of issues and headaches. I feel like I've spent more time tinkering and messing with it to get it to work right, than actually playing games. So I'm ready for a change.


The 980 and 290X are very close at stock. A Ti would definitely be much faster but, like you said, it's a lot more expensive. From what I'm reading, though, you must be on a 1080p or 1440p monitor. Myself, I'm on a 4K monitor, so R9 390 + R9 290X crossfire makes a huge difference there. Otherwise only a Titan X would fit the bill, and that is way beyond what I want to spend.


----------



## BiruZ

After flashing the 390 BIOS from the other thread, I was finally able to beat 20k in FS.


----------



## fat4l

Quote:


> Originally Posted by *BiruZ*
> 
> After flashing the 390 bios from the other thread I as finally able to beat the 20k on FS.


Nice.

However, the graphics score is the one that's important; it's CPU-independent.


----------



## rdr09

Quote:


> Originally Posted by *ManofGod1000*
> 
> The 980 stock and 290x stock are very close together. Now, a Ti would definitely be much faster but like you said, it is a lot more expensive. However, from what I am reading, you must be on a 1080p or 1440p monitor. Myself, I am on a 4k monitor so a R9 390 and R9 290x crossfire makes a Hugh difference there. Otherwise, only a Titan X would fit the bill and that is way beyond what I want to spend.


MEC has a locked i5; it will play better with a single 980 than with his current cards.


----------



## ManofGod1000

Quote:


> Originally Posted by *rdr09*
> 
> MEC has a locked i5. it will play better with a single 980 than with his current cards.


As long as he's not at 4K, you're right. Still, it's not like he's going to get the same fps as with two 290s, right? The reason I bring that up is that I recently got a 390 to crossfire with my 290X, but I'm playing at 4K. Yes, they do get hot, but I can keep the wattage down by capping the frame rate at 60. I'm more concerned about when summer shows up again, but I don't see anything short of a Titan X being able to play 4K well on a single card. (I didn't have major issues playing on my single R9 290X, but it can be limiting at times in 4K.)

Now, I'm not trying to argue, just seeing what the experience of others is. I'm not going to buy a card from Newegg and then discover that it doesn't cut it. (You cannot return it.) Oh, and I'm on an i7-6700K at 4.7 GHz, but I can't find the place to edit my system information on these forums.


----------



## rdr09

Quote:


> Originally Posted by *ManofGod1000*
> 
> As long as he is not a 4k, you are right. However, it is not like he is going to get the same fps as two 290's, right? The reason I am bringing that up is because I recently got a 390 to crossfire with my 290x but I am playing at 4k. Yes, they do get hot but, I can keep the wattage usage down by setting the frames cap to 60. However, I am more concerned about when the summer shows up again but, I do not see anything short of a Titan X to be able to play 4k on a single card. (I did not have major issues playing on my single R9 290x but it can be limiting at times in 4k.)
> 
> Now, I am not trying to argue but just seeing what the experience of others is. I am not going to buy a card from newegg and then discover that it does not cut it. (You cannot return it.) Oh, and I am on an i7-6700k at 4.7 Ghz but, I cannot find the place to edit my system information on these forums.


I'm using 4K too, and it helps stretch our GPUs' legs. Also, back when I was using 1080p and a single 290, I could turn off HT just to keep my CPU temps down (lack of radiator space). With two 290s it wouldn't even cross my mind to turn off HT. Not that I need to, since I have enough cooling now.

I'm really curious how your i7 will behave with HT off in multi-threaded games. I've never tried it.


----------



## ManofGod1000

Quote:


> Originally Posted by *rdr09*
> 
> i am using 4K, too, and it helps stretch the legs of our gpus. also, when i was using 1080 and single 290, i can turn off HT fine just to keep my cpu temp down (lack of radiator space). With two 290s, it would not even cross my mind turning off HT. Not that i need to since i have enuf cooling now.
> 
> I am really curious how your i7 will behave when you turn off HT in multi-threaded games. I never tried.


I'll have to try that later and let you know. I'm not running water cooling, and that's why I'm a bit concerned about the summer heat. These cards both go over 80°C when I'm gaming and the fans do get noisy. But short of a Titan X, no other single card out today can really do 4K gaming well, right?


----------



## MEC-777

Quote:


> Originally Posted by *ManofGod1000*
> 
> The 980 stock and 290x stock are very close together. Now, a Ti would definitely be much faster but like you said, it is a lot more expensive. However, from what I am reading, you must be on a 1080p or 1440p monitor. Myself, I am on a 4k monitor so a R9 390 and R9 290x crossfire makes a Hugh difference there. Otherwise, only a Titan X would fit the bill and that is way beyond what I want to spend.


I'm gaming on a single 1080p monitor; however, I don't actually run most of my games at native 1080p. I run most of them at 1440p using VSR. I found that this, in combination with less AA, provides a cleaner, higher-quality image while still maintaining good fps. Since with VSR the GPU is actually rendering the game at the higher resolution, technically I need a card that's good at 1440p. A single 290 does this decently, but a 980 does it much better. (I'm planning to upgrade to an actual 1440p or 4K display sometime in the new year.)

Whenever I buy something, and especially when spending this much money, I put a LOT of research into the decision. The 390X is decently faster than a 290 and priced $50-$100 less than most 980s. However, the accumulated data from all the benchmarks and reviews I read consistently showed the 980 performing better than the 390X in almost every scenario: sometimes not by much, other times by as much as 20 fps. On top of that, I plan to OC, and we all know how well Maxwell chips OC. Even if the performance scaling per OC isn't as good as the 390X's, the 980 still runs away from it. So, since I had the chance to grab a 980 for the price of the cheapest 390Xs currently available here, I jumped on it.

I love the Hawaii cards and always will. They remain performance-per-dollar kings and will be for some time yet. But I'm also not brand loyal; I go wherever the best deal that suits my needs and budget takes me.









The Fury may be a tad faster than the 980 overall, but right now they start at over $700 CAD, so they were out of the question.
Quote:


> Originally Posted by *rdr09*
> 
> MEC has a locked i5. it will play better with a single 980 than with his current cards.


Yes and no. I never saw an instance where my CPU actually held the 290s back. However, when playing Project Cars I did see 100% CPU usage and 100% GPU usage (on both cards), so that game with both 290s was about the most my CPU could handle. There will, however, be less chance of that happening with a single 980, for sure.


----------



## rdr09

Quote:


> Originally Posted by *MEC-777*
> 
> I am gaming on a single 1080p monitor, however I don't actually run most of my games at 1080 native. I run most of them at 1440p using VSR. I found this in combination with less AA provides a cleaner, higher-quality image while still allowing to maintain good fps. So technically I need a card that's good at 1440p. A single 290 does this decently, but a 980 does it much better.
> 
> When ever I buy something and especially when spending this much money, I put A LOT of research into making my decision. The 390X is decently faster than a 290 and priced $50-$100 less than most 980's. However, the accumulated data of all the benchmarks and reviews I read all showed consistently that the 980 does perform better than the 390X in 'almost' every scenario. Some times not by much, other times by as much as 20fps. On top of that, I plan to OC and we all know how well Maxwell chips OC. Even if the performance scaling per OC isn't as good as the 390X, the 980 still runs away from it. So, since I had the chance to grab a 980 for the price of the cheapest 390X's currently available here, I jumped on it.
> 
> I love the Hawaii cards and always will. They remain performance per dollar kings and will for some time yet. But I'm also not brand loyal. I go where the best deal that suits my needs and budget takes me.
> 
> 
> 
> 
> 
> 
> 
> 
> Yes and no. I never saw an instance where my CPU actually held the 290's back. However, when playing Project Cars I did see 100% CPU usage and 100% GPU usage (on both cards). So that game with both 290's was about the most my CPU could "handle". There will, however, be a lesser chance of that happening with a single 980, for sure.


A single 980? Not sure. I know some members use a single 290(X) or 390(X) at 4K, highly OC'd I assume. I've tested a few games at 4K with a single 290 and they were fine; Titanfall is one of them.

If you'd be so kind, use Afterburner to monitor your CPU usage with HT off.

@MEC, I read your response. I'd much rather see AB graphs, though. Wait, crossfire works in Project Cars? I may have to get that game.


----------



## ManofGod1000

Quote:


> Originally Posted by *MEC-777*
> 
> I am gaming on a single 1080p monitor, however I don't actually run most of my games at 1080 native. I run most of them at 1440p using VSR. I found this in combination with less AA provides a cleaner, higher-quality image while still allowing to maintain good fps. So technically I need a card that's good at 1440p. A single 290 does this decently, but a 980 does it much better.
> 
> When ever I buy something and especially when spending this much money, I put A LOT of research into making my decision. The 390X is decently faster than a 290 and priced $50-$100 less than most 980's. However, the accumulated data of all the benchmarks and reviews I read all showed consistently that the 980 does perform better than the 390X in 'almost' every scenario. Some times not by much, other times by as much as 20fps. On top of that, I plan to OC and we all know how well Maxwell chips OC. Even if the performance scaling per OC isn't as good as the 390X, the 980 still runs away from it. So, since I had the chance to grab a 980 for the price of the cheapest 390X's currently available here, I jumped on it.
> 
> I love the Hawaii cards and always will. They remain performance per dollar kings and will for some time yet. But I'm also not brand loyal. I go where the best deal that suits my needs and budget takes me.
> 
> 
> 
> 
> 
> 
> 
> 
> Yes and no. I never saw an instance where my CPU actually held the 290's back. However, when playing Project Cars I did see 100% CPU usage and 100% GPU usage (on both cards). So that game with both 290's was about the most my CPU could "handle". There will, however, be a lesser chance of that happening with a single 980, for sure.


Cool, thanks for the information. However, what about at 4K? Is the 980 capable, or would I have to use a Titan X if I wanted to do 4K on a single card? Or are crossfire and SLI the only real options for this right now?


----------



## MEC-777

Quote:


> Originally Posted by *rdr09*
> 
> Single 980? not sure. i know some members use a single 290(X) and 390(X) in 4K. Highly oc'ed i assume. I've tested a few games in 4K with a single 290 and they were fine. Titanfall is one of them.
> 
> If you be so kind to use Afterburner to monitor your cpu usage with HT off.
> 
> @MEC, i read your response. Much rather see AB graphs, though. Wait, crossfire works in Project Cars? I may have to get that game.


The 980 is notably faster than the 290 and 290X at all resolutions, so it will have no trouble at 1440p, and if a single 290/X can handle some games at 4K, a 980 can as well. 4K benchmarks show less of a gap between the 290X and 980 (and the 390X matches it, stock for stock), but I'm not running games at 4K, so I'm not really concerned with that at this point.

When I do upgrade to 1440p or 4K, I'll buy a display with G-Sync, so sub-60 fps will still play smoothly.







If not, I'll wait for the new 14/16nm GPUs from Nvidia and AMD next year and see what's what then.

I can't turn HT off because my CPU doesn't have hyperthreading; it's an i5.







I wouldn't turn it off if I had it, though. If temps are an issue to the point where you have to turn off HT, perhaps you need a better CPU cooler?
Quote:


> Originally Posted by *ManofGod1000*
> 
> Cool, thanks for the information. However, what about at 4k? Is that 980 capable or would I have to use a Titan X if I wanted to do 4k on a single card? Or is crossfire and sli the only real option for this right now?


At 4K, a 980 is borderline strong enough for playable frame rates, depending on the game. I'd say the 980 Ti and Fury X are both not quite there yet for single-card 4K gaming either. You really need two cards to get enough muscle if you want to run 4K at high+ settings and 60+ fps. With the next generation, however, that may change: the 14/16nm process node could provide quite a leap in GPU performance. The last few generations have all been on 28nm, which is why we've only seen incremental improvements at the extreme top end; nothing that's blown our minds, so to speak.


----------



## the9quad

There are quite a few games where even three 290Xs don't hold 60 fps at 1440p... I'll be happy when a single card comes in under a grand that can do that.


----------



## LandonAaron

So what's the word on the Crimson driver package / Catalyst replacement (15.11.1)? I did a quick forum search and all I saw were comments from MEC-777, I believe, about a problem he was having with it. Does anyone else have any thoughts on it?


----------



## Forceman

Quote:


> Originally Posted by *LandonAaron*
> 
> So whats the word on Crimson driver package/ Catalyst replacement (15.11.1). I did a quick forum search and all I saw were comments from MEC-777, I believe, about a problem he was having with it. Does anyone else have any thoughts on it?


Fixes the Just Cause 3 texture problem, at least.


----------



## BadRobot

Quote:


> Originally Posted by *the9quad*
> 
> Quite a few games where even 3 290x's don't get 60 fps at 1440p.... I'll be happy when a single card comes in under a grand that will do that.


I'll be happy when there's a good, affordable upgrade to the 290 that's still in the AMD line; the Fury X is still out of my price range. So far, almost every step in my upgrade history has roughly doubled the pixel fill rate:
5670 > 6670 > 7770 > 7850 > R9 290


----------



## LandonAaron

Well I ran off to give Crimson a test and my initial impressions aren't good. The first annoying thing is that after the installation finishes I keep getting a message that, "The installation finished but there were errors." then below: "AMD Drivers and Software weren't installed, check log for details." I looked through the log and the only things I saw was a line that says download failed. Which is strange because this isn't the web installer its the big half gig one that shouldn't need to download anything. The only other thing I saw was the very last line which said "Driver already pre-installed". This got me thinking that maybe since I was using 15.10 driver previously maybe the driver was so similar it thought I already had it? I duno.

So I tried reinstalling, using the AMD Clean Uninstall Utility to remove the driver completely first. I still got the same message so I tried again, using Display Driver Uninstaller to remove any traces of AMD software, and got the same error yet again for a 3rd time, so I just said screw it. I opened up the AMD Settings menu and checked the driver version which shows 15.11.1, so I decided the driver installed fine and just chalked it up to a dumb installer.

The next order of business was to raise my refresh rate from the 60Hz it defaults to, to the 96Hz the monitor is capable of. AMD has never before had the ability to create custom resolutions through the driver software, so I was happy to see this new feature. However, I soon discovered that the feature is pointless: either it won't allow you to create a custom resolution with a refresh rate greater than the monitor's reported maximum, or maybe it's just broken and you can't create a custom resolution with it for some other reason. The bottom line is that I couldn't get this feature to work even when I copied every single value, line for line, from a working custom resolution I had created with the CRU program.

So yeah, there is my little spiel on the matter. The installer gives annoying and seemingly inaccurate error messages, and the Custom Resolution feature doesn't seem to work.


----------



## rdr09

Quote:


> Originally Posted by *MEC-777*
> 
> The 980 is notably faster than the 290 and 290X at all resolutions, so yeah, it will have no trouble at 1440. If a single 290/X can handle some games at 4K, a 980 can as well. 4K benchmarks show less of a gap between the 290X and 980 (and the 390X matches, stock for stock), but I'm not running games at 4K, so I'm not really concerned with that at this point.
> 
> When I do upgrade to 1440 or 4K I will be buying one with G-Sync, so sub-60fps will still play smooth.
> 
> 
> 
> 
> 
> 
> 
> If not, I'll wait for the new 14/16nm GPUs from Nvidia and AMD next year and see what's what then.
> 
> I can't turn HT off because my CPU doesn't have hyperthreading. It's an i5.
> 
> 
> 
> 
> 
> 
> 
> I wouldn't turn it off if I had it though. If temps are an issue to the point where you have to turn off HT, perhaps you need a better CPU cooler?


Yes, I turned off HT with a single 290 when I had the CM 912 case; I've since gone with a bigger case and added more radiators. But a single 290 ran fine with HT off, really. Not with two. Not even with two HD 7900 cards.

edit: see post #1458

http://www.overclock.net/t/1517133/official-fallout-4-information-and-discussion-thread/1450

that's with a single 970.


----------



## MEC-777

Quote:


> Originally Posted by *BadRobot*
> 
> I'll be happy when there's a good and affordable upgrade to the 290 that's still in the AMD line. The Fury X is still out of my price range. So far my history has almost doubled the pixel fill rate each time.
> 5670 > 6670 > 7770 > 7850 > R9 290


Honestly, there isn't a double-performance equivalent to the 290 in this current generation. Part of the reason for the somewhat lackluster incremental performance bump of this latest gen of GPUs is that they're still on the 28nm node. The current GPU architectures on this node have pretty much reached the limit of what they can do in terms of performance. I'm sure they could get a bit more with more tweaking, but not much.

With the next generation coming out next year on the 14 and 16nm nodes with HBM2 and GDDR5X memory, I predict we're going to see some of the biggest jumps in GPU performance we've seen in years. So hold tight, it's coming...








Quote:


> Originally Posted by *LandonAaron*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Well I ran off to give Crimson a test and my initial impressions aren't good. The first annoying thing is that after the installation finishes I keep getting a message that says "The installation finished but there were errors," then below it: "AMD Drivers and Software weren't installed, check log for details." I looked through the log and the only thing I saw was a line that said "download failed," which is strange because this isn't the web installer, it's the big half-gig one that shouldn't need to download anything. The only other thing I saw was the very last line, which said "Driver already pre-installed". That got me thinking that maybe, since I was on the 15.10 driver previously, the driver was so similar it thought I already had it? I dunno.
> 
> So I tried reinstalling, using the AMD Clean Uninstall Utility to remove the driver completely first. I still got the same message, so I tried again using Display Driver Uninstaller to remove any traces of AMD software, and got the same error yet again for a third time, so I just said screw it. I opened the AMD Settings menu and checked the driver version, which shows 15.11.1, so I decided the driver installed fine and chalked it up to a dumb installer.
> 
> The next order of business was to raise my refresh rate from the 60Hz it defaults to, to the 96Hz the monitor is capable of. AMD has never before had the ability to create custom resolutions through the driver software, so I was happy to see this new feature. However, I soon discovered that the feature is pointless: either it won't allow you to create a custom resolution with a refresh rate greater than the monitor's reported maximum, or maybe it's just broken and you can't create a custom resolution with it for some other reason. The bottom line is that I couldn't get this feature to work even when I copied every single value, line for line, from a working custom resolution I had created with the CRU program.
> 
> So yeah, there is my little spiel on the matter. The installer gives annoying and seemingly inaccurate error messages, and the Custom Resolution feature doesn't seem to work.


Always, ALWAYS run DDU when installing new AMD drivers. Even then, there have been a number of times when I had to run DDU again and install a 2nd time to fix issues/get it to properly install.

I like the new Crimson interface and the convenience of setting up custom profiles for each game. But it's definitely lacking some options. In some ways they streamlined it a bit too much.

The drivers themselves are... "OK". I've experienced some crashing, BSODs, and freezing, but I'm unsure if it's crossfire related, driver related, or both.


----------



## Regnitto

Quote:


> Originally Posted by *BadRobot*
> 
> I'll be happy when there's a good and affordable upgrade to the 290 that's still in the AMD line. The Fury X is still out of my price range. So far my history has almost doubled the pixel fill rate each time.
> 5670 > 6670 > 7770 > 7850 > R9 290


I think it's kind of funny: my last three upgrades match yours. MSI HD 7770 > PowerColor HD 7850 > VisionTek R9 290


----------



## kizwan

My upgrade path is Voodoo 3 3000 > 5870 > 290 Crossfire. lol So basically every time I upgrade, it's always a "WOW!" experience.

Actually, between the Voodoo 3 3000 & 5870 I did own several low-end ATI/Nvidia GPUs, which aren't worth mentioning here.


----------



## LandonAaron

Quote:


> Originally Posted by *MEC-777*
> 
> Always, ALWAYS run DDU when installing new AMD drivers. Even then, there have been a number of times when I had to run DDU again and install a 2nd time to fix issues/get it to properly install.
> 
> I like the new Crimson interface and the convenience of setting up custom profiles for each game. But it's definitely lacking some options. In some ways they streamlined it a bit too much.
> 
> The drivers themselves are... "OK". I've experienced some crashing, BSODs, and freezing, but I'm unsure if it's crossfire related, driver related, or both.


BTW, I used AMD's Clean Uninstall Utility and DDU from safe mode before installing the Crimson driver.

I had a much more serious issue with the new Crimson driver today. I have an overclockable Korean monitor that I run at 96Hz, at which it has always been stable and never given me any issues. Well, today, while watching some YouTube videos in Firefox with hardware acceleration enabled in the FF settings, the monitor kept flickering with green horizontal lines on the screen. I eventually disabled HW acceleration and they went away. However, later in the day my 96Hz monitor randomly went black as if it had no signal, and my second monitor, running at 60Hz, just froze holding its last displayed image. Unplugging and re-plugging the monitor had no effect and I couldn't get the mouse cursor to show up on the second monitor. The time in the bottom right corner wasn't updating either, so I concluded the whole computer had frozen up. I rebooted and looked through Event Viewer and found two events related to AMD/ATI, but both occurred two hours before the computer crashed, so I'm not sure how relevant they are:

Event ID 0


Spoiler: Warning: Spoiler!



"The description for Event ID 0 from source amdacpusrsvc cannot be found. Either the component that raises this event is not installed on your local computer or the installation is corrupted. You can install or repair the component on the local computer.

If the event originated on another computer, the display information had to be saved with the event.

The following information was included with the event:

amdacpusrsvc
[EVENT]: SERVICE_CONTROL_POWEREVENT: RESUME: FAILED."



and

Event ID 4101


Spoiler: Warning: Spoiler!



"Display driver amdkmdap stopped responding and has successfully recovered.".



Does anyone know what exactly amdacpusrsvc is? It sounds like something for an AMD processor, not a GPU. I have an Intel system and wonder what it's for.

The issue may have just been a problem with the GPU overclock. I loaded up MSI AB to make sure it wasn't temperature related and saw that the clock rates on both 290s were set to the overclocked level I had set the night before, but their voltage settings had reset to default. So the crash and the strange YouTube behavior may have just been from running the overclock at too low a voltage. I've never had this happen before, though, where the computer would reload a GPU overclock after being restarted. Another strange thing was that when I launched MSI AB for the first time, it didn't ask me anything about needing to restart to unlock voltage control like it normally does after installing a new driver. So I went ahead and unticked both voltage settings and reloaded, then reticked them and reloaded again, to make sure MSI AB was configured properly with the new driver.

One final thing that annoyed me in the AMD Settings menu: beyond creating a custom resolution, I didn't see any option for setting the monitor's refresh rate. Does anyone know where this setting is located?


----------



## refirendum

http://www.userbenchmark.com/UserRun/545180
http://www.userbenchmark.com/UserRun/545147
http://www.userbenchmark.com/UserRun/545140

Ran two runs at my chosen overclock and one at a milder overclock.



I think I'm satisfied with this. I don't want to push more voltage into the card. Anything above 1300 on mem and it craps out under load, and over 1125 on core it artifacts under load, so I'm not going there.


----------



## colorfuel

Today I tried the OCCT GPU stress test on my 290X for the first time, and it seems I can't even get a proper 10% OC anymore. 1100MHz core is impossible to stabilize even with +100mV. At +130mV it just crashes the system.

On stock volts, I'm happy OCCT doesn't show errors @ stock clocks.









OCCT shows me my GPU is a real POS, even though this [new edition] has those extra phases. Pfft.

I wish I had never tried OCCT and had just stuck to using Crysis 2 or some such to test my OCs.


----------



## rdr09

Quote:


> Originally Posted by *colorfuel*
> 
> Today I tried the OCCT GPU stress test on my 290X for the first time, and it seems I can't even get a proper 10% OC anymore. 1100MHz core is impossible to stabilize even with +100mV. At +130mV it just crashes the system.
> 
> On stock volts, I'm happy OCCT doesn't show errors @ stock clocks.
> 
> OCCT shows me my GPU is a real POS, even though this [new edition] has those extra phases. Pfft.
> 
> I wish I had never tried OCCT and had just stuck to using Crysis 2 or some such to test my OCs.


Want to try editing the BIOS?

http://www.overclock.net/t/1564219/r9-390x-bios-for-r9-290-290x-now-with-stock-and-modded-voltage-tables

Your card is a good candidate. Keep your temps below 80C on the core and VRMs, especially VRM1.


----------



## colorfuel

Quote:


> Originally Posted by *rdr09*
> 
> Want to try editing the BIOS?
> 
> http://www.overclock.net/t/1564219/r9-390x-bios-for-r9-290-290x-now-with-stock-and-modded-voltage-tables
> 
> Your card is a good candidate. Keep your temps below 80C on the core and VRMs, especially VRM1.


Thanks for your help. I've already tried keeping the temps below 80°, which the Tri-X cooler is able to do. I would try the BIOS mod, but this Sapphire 290X Tri-X New Edition uses a different PCB layout than the standard 290X. Apparently it's the same as their 390X Tri-X, but instead of Hynix (like the 390X) it uses Samsung RAM. AFAIK I have not seen a suitable BIOS so far.



Here is the memory info:

Samsung
K4G20325FS

Density: 64MX32

And here is my bios

SapphireTriX290XNewEd.zip 42k .zip file


----------



## sinnedone

Quote:


> Originally Posted by *colorfuel*
> 
> Today I tried the OCCT GPU stress test on my 290X for the first time, and it seems I can't even get a proper 10% OC anymore. 1100MHz core is impossible to stabilize even with +100mV. At +130mV it just crashes the system.
> 
> On stock volts, I'm happy OCCT doesn't show errors @ stock clocks.
> 
> OCCT shows me my GPU is a real POS, even though this [new edition] has those extra phases. Pfft.
> 
> I wish I had never tried OCCT and had just stuck to using Crysis 2 or some such to test my OCs.


If your overclocks are stable for whatever you use the card for, don't worry about stress tests.


----------



## scottydsntknow

Quote:


> Originally Posted by *sinnedone*
> 
> If your overclocks are stable for whatever you use it for don't worry about stress tests.


This

I can't get OCCT to run for more than 90 seconds without erroring out, btw, but my rig will then run 10 stable passes in IBT and P95 all night, so maybe try a different test. Or just do a really intense gaming session; if it's fine there, it will be fine. I can't put my rig through more than 5 or 6 passes in IBT at max overclock or the socket gets too hot, but in my most stressful gaming situations it never touches 70 and the core never hits 60.


----------



## Jflisk

Quote:


> Originally Posted by *scottydsntknow*
> 
> This
> 
> I can't get OCCT to run for more than 90 seconds without erroring out btw. But then will run 10 stable passes in IBT and P95 all night so try a different test maybe. Or just do a real intense gaming session and if its fine it will be fine. I can't put my rig through more than 5 or 6 passes in IBT at max overclock or the socket gets too hot but in my most stressful gaming situations it never touches 70 and the core never hits 60.


OCCT is an old dinosaur that hasn't been updated in years. It heats the CPU up far more than gaming or any real use ever would. It's like FurMark for a CPU.


----------



## HOMECINEMA-PC

Got Catalyst 15.8 running well in 3-way Eyefinity: a steady 150fps at 1000/1250, at 7680x1440 in COD: AW ..........





4K


----------



## bkvamme

I have a 290X 4GB and might pick up a second if the price is right. How much power do you need for crossfired 290Xs? I tried out a power calculator and at 90% load I ended up at 977W with no overclocks, which is a bit on the short side as I have a 1000W PSU (Cooler Master V1000) and I was hoping to overclock the CPU at least.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *bkvamme*
> 
> I have a 290X 4GB, and might pick up a second if the price is right. How much power do you need for crossfire 290X? Tried out the power calculator, and at 90% load I ended up at 977W with no overclocks, which is a bit on the shortside as I have a 1000W PSU (Cooler Master V1000) and I was hoping to overclock the CPU at least.


Just go with a 1200-watter and be done with it


----------



## bkvamme

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Just go a 1200 watter and be done with it


Already have a 1000W PSU, and I'm not sure I'd be able to fit a 180mm PSU in my case due to the res/pump combo. I'll see how much space I can spare. I know the 170mm of the V1000 was a tight enough fit when I bought that one!


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *bkvamme*
> 
> Already have a 1000W PSU, and I am not sure if I'll be able to fit 180mm PSUs in my case due to the res/pump combo. I'll see how much space I can spare. I know the 170mm of the V1000 was a tight enough fit when I bought that one!


I have a desk / chillputer, so space is not an issue ..... for me


----------



## bkvamme

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I have desk / chillputer so space is not a issue ..... for me


Haha!

Wish I could say the same. Here is the spacing now with the V1000. I might be able to squeeze a few additional mm out of it by moving the res a bit further to the left, but not many.


----------



## MEC-777

Quote:


> Originally Posted by *bkvamme*
> 
> I have a 290X 4GB, and might pick up a second if the price is right. How much power do you need for crossfire 290X? Tried out the power calculator, and at 90% load I ended up at 977W with no overclocks, which is a bit on the shortside as I have a 1000W PSU (Cooler Master V1000) and I was hoping to overclock the CPU at least.


You will be fine with that V1000. It's a really good PSU, and 1000W is plenty for two 290Xs.


----------



## ZealotKi11er

Quote:


> Originally Posted by *MEC-777*
> 
> You will be fine with that V1000. It's a really good PSU and 1000w is plenty for two 290X's.


As long as you don't go over +100mV and don't have an overclocked FX-8 or a 6-core+ Intel.


----------



## bkvamme

Quote:


> Originally Posted by *MEC-777*
> 
> You will be fine with that V1000. It's a really good PSU and 1000w is plenty for two 290X's.


Alright, that's good to hear. I've been doing a bit more reading, and as far as I can tell I should be fine with moderate overclocks. Each 290X draws around 300W stock, and the motherboard + CPU draw around 150-200W under load. That gives me around 200W of headroom for overclocking the CPU. I think the V1000 can handle higher peak loads for shorter periods of time as well.

Suddenly my problem became whether the circuit breaker can handle this plus my server. Should be exciting
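The headroom estimate above is just back-of-the-envelope arithmetic; here is a quick sketch of it, using the thread's ballpark figures (estimates, not measurements):

```python
# Rough PSU-headroom estimate using the thread's ballpark figures.
# Real draw varies with load, so treat these as estimates only.
PSU_CAPACITY_W = 1000            # Cooler Master V1000

gpu_draw_w = 2 * 300             # two R9 290X cards, ~300 W each at stock
platform_draw_w = 200            # CPU + motherboard under load (upper estimate)

headroom_w = PSU_CAPACITY_W - gpu_draw_w - platform_draw_w
print(headroom_w)                # 200 W left for CPU overclocking
```

If the GPUs are overvolted, their draw can rise well past 300W each, which eats that margin quickly.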


----------



## fyzzz

I've run 2x290 in crossfire on a 750W PSU with an overclocked i5 4690K. I even benched the cards a little and had no shutdowns, but I was probably on the edge.


----------



## bkvamme

Hehe, only one way to find out. Bid is placed, and fingers are crossed.


----------



## diggiddi

I am running two Lightnings and an FX 8350 @ 4.8 on an Antec HCG 750. It can withstand mild overclocking of the GPUs, but I tripped OCP at 1180 core on both. I think it would have run if I had a 1000W PSU.


----------



## bkvamme

Quote:


> Originally Posted by *diggiddi*
> 
> I am running 2 lightnings and a FX 8350 @ 4.8 on an Antec HCG 750, it can withstand mild overclocking to the gpu's but I tripped OCP at 1180 core on both I think it would have run If I had a 1000w psu


Huh, not too shabby. The 8350 has almost the same TDP as the 5820K, so it's absolutely comparable. I have my hopes up, then! I just saw in a review that TechPowerUp managed to hit 1050W with a 290X crossfire setup while overclocked.


----------



## ZealotKi11er

Quote:


> Originally Posted by *bkvamme*
> 
> Huh, not too shabby. The 8350 has almost the same TDP as the 5820K, so it is absolutely comparable. I have my hopes up then! Just saw in a review that TechPowerUp managed to get 1050W in a 290X Crossfire setup while overclocked.


You've got to consider efficiency also.


----------



## bkvamme

Quote:


> Originally Posted by *ZealotKi11er*
> 
> You got to consider efficiency also.


Ah, of course. That was wattage at the wall. Real usage would probably be ~10% lower. Thanks a lot for clearing that up; it helped a lot! Feeling a lot better about my V1000 now.


----------



## ZealotKi11er

Quote:


> Originally Posted by *bkvamme*
> 
> Ah, of course. That was wattage at the wall. Real usage would probably be ~10% lower. Thanks a lot for clearing that up, helped a lot! Feeling lot better about my V1000 now.


Just for reference with my AX1200 Gold PSU.

3770k @ 4.6GHz w/ 1.35v
2 x R9 290s

3DMark
GPUs @ Stock ~ 670W Peak
GPUs @ +100mV ~ 1175MHz Core ~ 860W Peak
GPUs @ +200mV ~ 1275MHz Core ~ 1040W Peak

Those are all readings at the wall.
I don't expect many people to run +200mV for anything other than benchmarks. It just does not make sense on any level, unless you want to degrade the PSU and heat up the room in a cold winter.
I don't even run an OC on my cards right now because there is no real need.
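For a rough sense of what those wall readings mean for the PSU itself, here is a sketch of the wall-to-DC conversion. The ~90% figure is an assumption for a Gold-class unit at these loads; actual efficiency varies with load level:

```python
# Wall-socket readings include the PSU's own conversion losses.
# Assuming a Gold-class unit at ~90% efficiency at these loads, the
# DC power actually delivered to the components is roughly:
def dc_load_from_wall(wall_watts, efficiency=0.90):
    """Approximate DC output the PSU delivers for a given wall reading."""
    return wall_watts * efficiency

# Applied to the +200mV peak reading quoted above:
print(round(dc_load_from_wall(1040)))  # prints 936 (DC load for a 1040 W wall draw)
```

So even the worst-case 1040W wall peak corresponds to roughly 940W of actual load on the PSU, which is how a 1000W-rated unit survives it.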


----------



## bkvamme

Thanks a lot for your help and feedback, I am now confident that the PSU won't be a problem!


----------



## kizwan

1000W is not plenty for 2 x 290X & an OC'd 5820K, but it should be enough. Let me know if the PSU feels warm or hot while benching or gaming with both OC'd. I say it's enough because most likely, most of the time, you're going to run the 2 x 290X at stock anyway.


----------



## AliNT77

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Just for reference with my AX1200 Gold PSU.
> 
> 3770k @ 4.6GHz w/ 1.35v
> 2 x R9 290s
> 
> 3DMark
> GPUs @ Stock ~ 670W Peak
> GPUs @ +100mV ~ 1175MHz Core ~ 860W Peak
> GPUs @ +200mV ~ 1275MHz Core ~ 1040W Peak
> 
> Those are all readings off the wall.
> I dont expect many people to run +200 mV other then benchmarks. It just does not make sense in any level unless you want to degrade the PSU and heat up the room in cold winter.
> I dont even run a OC from my cards right now because there is no real need.


Can you mod your cards' BIOS (middle memory clock), try undervolting them, and measure the power draw?

I think 450W with a 290 CFX undervolted is absolutely possible.


----------



## bkvamme

Quote:


> Originally Posted by *kizwan*
> 
> 1000W is not plenty for 2 x 290X & 5820K OC-ed but should be enough. Let me know if you feel the PSU is warm or hot while benching or gaming with both OC-ed. I said it's enough because most likely most of the time you're going to run 2 x 290X at stock anyway.


Yeah, I figure the same. The 5820K needs to be OC'd, but when I crossfire the 290Xs I think stock should be fine. For games that don't work well with crossfire I might end up overclocking one of the cards, but then the second card is not used, so power usage should be fine.

With the new Crimson software I got the impression that it's possible to set overclocks on a per-game basis; is this correct?
Is it also possible to enable crossfire on a per-game basis? In that case it would be perfect.

My motherboard died, so I am unable to try it out myself









Nevermind, found some screenshots that showed that it is indeed possible. Very, very nice!


----------



## Ironsight

For some reason, now if I overclock and the monitor turns off, I wake up to a black screen. This never happened before, so I thought it was the new Crimson drivers, but I rolled back and it still black-screens when overclocked. Kinda weird, as I've never run into this issue since I got my card. ULPS is disabled, btw.


----------



## RatusNatus

Quote:


> Originally Posted by *Ironsight*
> 
> For some reason now if I overclock and the monitor turns off I will wake up to a black screen. This never happened before so I thought it was the new crimson drivers, but I rolled back and it will still blackscreen when overclocked. Kinda weird as I've never ran into this issue since I got my card. ULPS is disabled btw.


Welcome to the pool!







You ready? Not good news, man.

http://www.overclock.net/t/1441349/290-290x-black-screen-poll


----------



## morencyam

Does anyone know if there is a full-cover water block that will fit an MSI Gaming 290X 4G, model number 912-V308-001? I've seen some reports that the EK-FC R9 290X rev 2.0 block fits the V308-001R card, but my model doesn't have the "R" and I don't know what the difference between the two is.


----------



## Cyber Locc

I do not know. What I sadly do know, however, is that EK (and probably everyone else) EOL'd 290X blocks, so good luck finding one. I found that out myself the other day, which sucks because I was going to add a third card soon.


----------



## Ironsight

Quote:


> Originally Posted by *RatusNatus*
> 
> Welcome to the pool!
> 
> 
> 
> 
> 
> 
> 
> u ready? Not good news man.
> 
> http://www.overclock.net/t/1441349/290-290x-black-screen-poll


I figured out that my tightened memory timings won't let the monitor wake up properly if I go beyond 1375MHz, but they're stable when gaming up to 1625.

I guess I just won't bother going beyond 1375 until the Crimson drivers go back to resetting to stock values after a restart.


----------



## FuriousPop

Just tried the Crimson drivers and I'm still getting the flickering issue...

Anyone else in this boat?

If yours is fixed, what drivers are you using?


----------



## JourneymanMike

Quote:


> Originally Posted by *FuriousPop*
> 
> Just trying the Crimson drivers and im still getting the flickering issue...
> 
> Anyone else in this boat?
> 
> if fixed, what drivers are you using?


Hell, I can't even get the Crimson drivers to install...


----------



## JourneymanMike

Does anyone have any idea how to tell the manufacturing date on a Sapphire R9 290X?

I've been gooooooooogling all over the internet trying to find out. No dice!


----------



## ZealotKi11er

Quote:


> Originally Posted by *JourneymanMike*
> 
> Does anyone have any idea, how to tell the manufacturing date, on a Sapphire R9 290X?
> 
> I've been gooooooooogling all over the internet trying to find out, No Dice!


What cards do you have? They have a 2-year warranty, so it's probably close. You'd want to check with Sapphire.


----------



## FuriousPop

Quote:


> Originally Posted by *JourneymanMike*
> 
> Hell, I can't even get the Crimson drivers to install...


I've been using the same method since about 14.10:

Log into Windows
Uninstall AMD CC Manager + the AMD drivers
Restart into safe mode
Run the DDU cleaner (Clean without restart)
Locate all AMD folders on C: (AppData + Program Files) and delete them all
Restart and log into Windows normally
Run the exe for the Crimson / AMD driver

I've been using this method for ages without ever having an issue with AMD drivers not working properly, blah blah...

I did, however, have the Crimson driver installation manager freeze on me at 20-something %. I just cancelled it and followed the above again to clean everything out. The second time around it went straight through without issue.

Plus I run Eyefinity, which is always fun when it comes to going through the setup again and again........

My main concern now is that I'm getting flickering on the monitors when I exit any game; the desktop just starts to flicker like crazy when I move the mouse, but if I go back into the game it doesn't happen. It only occurs on the Windows desktop. Weird...


----------



## JourneymanMike

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JourneymanMike*
> 
> Does anyone have any idea, how to tell the manufacturing date, on a *Sapphire R9 290X?
> 
> *I've been gooooooooogling all over the internet trying to find out, No Dice!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *What cards do you have?* They have 2 years warranty so its probably close. You would want to check with Sapphire.
Click to expand...

The card I put in bold, in my original post...

Yeah, I'll have to contact Sapphire support...

The Sapphire BF4 reference cards were released in October 2013...


----------



## RatusNatus

Quote:


> Originally Posted by *JourneymanMike*
> 
> The card I've put in bold, in my original post...
> 
> Yeah, I'll have to contact Sapphire Support...
> 
> The Sapphire BF4 Reference card were released in October, 2013...


That was the release date of the R9 290. All those reference cards came straight from AMD. They're hitting two years now, so hurry up.
The Sapphire forum is a good place to start.


----------



## Ha-Nocri

One of the fans on my Tri-X card is dying; it barely spins and makes noises. Temps went up probably 10 degrees. Not sure what the best thing to do is; apply machine oil?


----------



## JourneymanMike

Quote:


> Originally Posted by *Ha-Nocri*
> 
> One of the fans on my triX card is dying, barely spins and makes noises. Temps went up 10 degrees probably. Not sure what's the best thing to do, apply machine oil?


Fans? On a GPU? Where's the full-cover water block and backplate?



No bad fan here!









Just screwing around...


----------



## bkvamme

Quote:


> Originally Posted by *Ha-Nocri*
> 
> One of the fans on my triX card is dying, barely spins and makes noises. Temps went up 10 degrees probably. Not sure what's the best thing to do, apply machine oil?


Sounds like the ball bearing is worn out / running out of lubrication. You can have a look at the fan and see if it's possible to add lubricant, or if it's a sealed bearing (my guess).


----------



## LandonAaron

Does anyone know how to stop MSI AB from applying an overclock at startup? I never used to have this problem, but ever since I updated to the Crimson drivers, if I forget to reset my overclock before I shut down my computer, it will return to that overclock when I turn the computer back on. This wouldn't be a problem except that about half the time it will only restore the frequency settings and not the voltage settings. So I will be running too high a clock for that voltage, and randomly I will lose the picture and the computer will be completely unresponsive, I assume from some critical error with the video card due to an unstable overclock that I forgot to disable. Very frustrating.


----------



## mfknjadagr8

Quote:


> Originally Posted by *LandonAaron*
> 
> Does anyone know how to stop MSI AB from applying an overclock at startup? I never used to have this problem, but ever since I updated to the Crimson drivers, if I forget to reset my overclock before I shut down my computer, it will return to that overclock when I turn the computer back on. This wouldn't be a problem except that about half the time it will only restore the frequency settings and not the voltage settings. So I will be running too high a clock for that voltage, and randomly I will lose the picture and the computer will be completely unresponsive, I assume from some critical error with the video card due to an unstable overclock that I forgot to disable. Very frustrating.


This is why I set 2D and 3D profiles... when it boots it will use 2D; when you game or bench, the 3D profile kicks it up to whatever. I guess this wouldn't be ideal if you do a ton of modeling or editing, but it's an option. I use default clocks for my 2D profile and change the 3D one in the settings depending on what I'm doing. If you disable AB loading on startup that would work too, but then you have to start it manually. I was thinking there was an option in the settings to not apply it on startup, but I can't remember where.


----------



## kizwan

Quote:


> Originally Posted by *LandonAaron*
> 
> Does anyone know how to stop MSI AB from applying an overclock at startup? I never used to have this problem, but ever since I updated to the Crimson drivers, if I forget to reset my overclock before I shut down my computer, it will return to that overclock when I turn the computer back on. This wouldn't be a problem except that about half the time it will only restore the frequency settings and not the voltage settings. So I will be running too high a clock for that voltage, and randomly I will lose the picture and the computer will be completely unresponsive, I assume from some critical error with the video card due to an unstable overclock that I forgot to disable. Very frustrating.


Actually MSI AB is not doing that (I'm assuming you did not set MSI AB to run at startup). It's the Crimson driver's fault. It will do that, remember the overclocked clock but not the voltage, when you shut down or reboot after a crash (a GPU-related crash). With a normal restart it will remember both the overclocked clock and the voltage.

The workaround for now is either to reinstall older drivers or to create a batch file that auto-overvolts for you every time you log into Windows, to prevent crashing. If you choose the latter, you can reset the voltage in Windows using MSI AB.
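That startup workaround can be as simple as re-applying a saved Afterburner profile (which stores clocks AND voltage together) when you log in. A minimal sketch in Python rather than a .bat, under assumptions: the `-ProfileN` command-line switch and the default install path below may differ on your setup, so check both before relying on it.

```python
import subprocess

# Default install path - an assumption, adjust to your system.
AFTERBURNER = r"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe"

def profile_command(slot: int) -> list[str]:
    """Build the command line that loads saved profile slot 1-5."""
    if not 1 <= slot <= 5:
        raise ValueError("Afterburner exposes profile slots 1-5")
    return [AFTERBURNER, f"-Profile{slot}"]

def apply_profile(slot: int) -> None:
    # Launch Afterburner with the profile switch; a shortcut to this
    # script in the shell:startup folder re-applies the OC at login.
    subprocess.Popen(profile_command(slot))
```

Save your stable clocks and voltage to profile 1 inside Afterburner first, then have the startup script call `apply_profile(1)`.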


----------



## Sycksyde

Quote:


> Originally Posted by *colorfuel*
> 
> OCCT shows me my gpu is a real POS


It can't be worse than mine, 1050MHz is as far as it will go without over 100mV of extra voltage.

Pretty damn disappointed tbh, but that's the luck of the draw I guess... my card is a 290 Tri-X.


----------



## corey3333

I have two Sapphire R9 290Xs. When crossfired I get some flickering in games, and crossfire doesn't seem compatible with a lot of games. So I ordered an MSI R9 390X. It has DirectX 12, which I hope will help with future games that are coming out. My questions are: 1) Can I crossfire my R9 290X with a 390X? 2) Will DirectX 12 drive awesome graphics better than DirectX 11?


----------



## cokker

Quote:


> Originally Posted by *Sycksyde*
> 
> It can't be worse than mine, 1050mhz is as far as it will go without over 100mv of voltage.
> 
> Pretty damn disappointed tbh but that's the luck of the draw I guess....my card is a 290 Tri-X.


If it makes you feel any better, I'm in the same boat (MSI 290X Gaming). I was running at 1080MHz but GTA5 crashed.

I'm now stuck at this speed; if I add voltage to overclock, things start to get too hot.


----------



## RatusNatus

Quote:


> Originally Posted by *corey3333*
> 
> I have two Sapphire R9 290Xs. When crossfired I get some flickering in games, and crossfire doesn't seem compatible with a lot of games. So I ordered an MSI R9 390X. It has DirectX 12, which I hope will help with future games that are coming out. My questions are: 1) Can I crossfire my R9 290X with a 390X? 2) Will DirectX 12 drive awesome graphics better than DirectX 11?


This is a bad call, since the 390 is essentially the same card.

Next year will bring some big new releases. Wait for it.


----------



## battleaxe

Quote:


> Originally Posted by *corey3333*
> 
> I have two Sapphire R9 290Xs. When crossfired I get some flickering in games, and crossfire doesn't seem compatible with a lot of games. So I ordered an MSI R9 390X. It has DirectX 12, which I hope will help with future games that are coming out. My questions are: 1) Can I crossfire my R9 290X with a 390X? 2) Will DirectX 12 drive awesome graphics better than DirectX 11?


Yes you can. But just so you're aware, that was a sidegrade. All great cards though, TBH.


----------



## i2CY

Tried Crimson...
Not going to fight it....
Glitchy ...
Waiting till full blown release in 2016.


----------



## kizwan

Quote:


> Originally Posted by *cokker*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sycksyde*
> 
> It can't be worse than mine, 1050mhz is as far as it will go without over 100mv of voltage.
> 
> Pretty damn disappointed tbh but that's the luck of the draw I guess....my card is a 290 Tri-X.
> 
> 
> 
> If it makes you feel any better I'm in the same boat (MSI 290x gaming). I was running at 1080MHz but GTA5 crashed.
> 
> I'm now stuck at this speed, if I add voltage to overclock things start to get too hot.

What kind of crash? The whole PC or just the game? A BSOD? GTA V is both a problematic game and very stressful on the CPU and system memory.


----------



## cokker

Quote:


> Originally Posted by *kizwan*
> 
> What kind of crash? The whole PC crash or just the game crash? BSOD? GTA V is both problematic game & also very stressful to the CPU & the memory (system memory).


Those "ERR_GFX_D3D_INIT" type crashes; it just threw me back to the desktop. After downclocking to 1050MHz I could play for hours.

Never mind, I guess I'll have to invest some money in cooling


----------



## Mercennarius

Quote:


> Originally Posted by *corey3333*
> 
> I have two Sapphire R9 290Xs. When crossfired I get some flickering in games, and crossfire doesn't seem compatible with a lot of games. So I ordered an MSI R9 390X. It has DirectX 12, which I hope will help with future games that are coming out. My questions are: 1) Can I crossfire my R9 290X with a 390X? 2) Will DirectX 12 drive awesome graphics better than DirectX 11?


The 290X and 390X both offer the same DirectX 12 support... the 390X is just an overclocked 290X. Great card, but not an upgrade over a 290X per se.


----------



## corey3333

Thanks for the advice, food for thought. I would like any input I can get on this.


----------



## afyeung

A bit late, but add me to the club please. Running the MSI 290, liquid cooled, under the 390X. Awesome card btw, especially for $175! OC settings: 1164/1526, +96mV core, +50 power.


----------



## ZealotKi11er

Quote:


> Originally Posted by *afyeung*
> 
> A bit late. But add me to the club please. Running the MSI 290 liquid cooled under the 390x. Awesome card btw, especially for $175! OC settings: 1164/1526 +96mv core +50 power.


Drop memory to 1500 flat.


----------



## mus1mus

And put the liquid-cooled card up top, above the air-cooled one.

The liquid-cooled one will have higher headroom, granted it can take it. And you will need airflow to cool that air-cooled card. Tons of air.


----------



## afyeung

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Drop memory to 1500 flat.


Why? 1525 is the stock clock for the factory OC'ed 390x's


----------



## afyeung

Quote:


> Originally Posted by *mus1mus*
> 
> And put the liquid cooled card up top the air cooled one.
> 
> The liquid cooled will have higher headroom, granting it can. And you will need air flow to cool that aircooled card. Tons of air.


That's the highest I can go without having to crank up the voltage to insane levels. I have a 140mm fnv2 fan directly pointed at the 390x. Will add more fans after I repaste.


----------



## ZealotKi11er

Quote:


> Originally Posted by *afyeung*
> 
> Why? 1525 is the stock clock for the factory OC'ed 390x's


Not sure about the 390X, but for the 290 you want to run it at 1250/1375/1500/1625


----------



## afyeung

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Not sure about 390X but for 290 you want to run it 1250/1375/1500/1625


Eh. I'll play around with it later. Performance isn't the issue for me now. The temps on the 390x are just terrible. I'll have to see what I can do to fix it.


----------



## mus1mus

The memory straps should be the same. So as mentioned by Zeal, 1375, 1500, 1625 and 1750 are the best memory clocks.

For the heat, my suggestion stands.


----------



## kizwan

Quote:


> Originally Posted by *afyeung*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Not sure about 390X but for 290 you want to run it 1250/1375/1500/1625
> 
> 
> 
> Eh. I'll play around with it later. Performance isn't the issue for me now. The temps on the 390x are just terrible. I'll have to see what I can do to fix it.

To be clear, the suggestion to run memory at those frequencies is for optimal performance. 1250/1375/1500/1625/1750 represent memory straps (excluding the lower straps, which are not useful to us), and these numbers are the max memory clock for each strap. The higher the strap, the looser the memory timings; e.g. memory at 1376 - 1500 runs tighter timings than memory at 1501 - 1625. Because of this you may get better performance with memory at 1500 on tighter timings than at 1526 on looser timings. Basically, if you can't do 1625, better to go back to 1500.

In my opinion, if my card can only do 1600, not 1625, I will run memory at 1600 instead of 1500. If my card can only do 1525, I will run memory at 1500 instead of 1525.
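That rule of thumb can be sketched in a few lines. The strap edges are the ones listed above; the 25MHz "not worth it" margin is an illustrative assumption that matches the 1525-to-1500 and 1600-stays-1600 examples, not a measured value.

```python
# Max memory clock (MHz) for each timing strap discussed above
STRAP_TOPS = [1250, 1375, 1500, 1625, 1750]

def suggested_mem_clock(stable_max: int) -> int:
    """Given the highest stable memory clock, return the clock worth running.

    If the stable max sits only slightly past a strap edge (looser timings
    for a tiny frequency gain), fall back to that edge; otherwise keep it.
    """
    # Highest strap edge the card can actually reach
    best_edge = max((t for t in STRAP_TOPS if t <= stable_max),
                    default=stable_max)
    # Assumed margin: within 25MHz of the edge, the looser strap isn't worth it
    return best_edge if stable_max - best_edge <= 25 else stable_max

print(suggested_mem_clock(1525))  # -> 1500 (barely past the 1500 strap)
print(suggested_mem_clock(1600))  # -> 1600 (far enough into the next strap)
```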


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> To be clear, the suggestion to run memory at those frequency is for getting optimal performance. 1250/1375/1500/1625/1750 represent memory straps (excluding lower memory straps which are not useful to us) & these numbers are the max memory clock for that strap. The higher the strap, the looser the memory timings, e.g. memory at 1376 - 1500 running at tighter timings than memory at 1501 - 1625. Because of this you may get better performance with memory at 1500 with tighter timings than memory at 1526 with looser timings.


Howsit goin ??

Yep, that goes for your DRAM timings as well


----------



## Arizonian

Quote:


> Originally Posted by *afyeung*
> 
> A bit late. But add me to the club please. Running the MSI 290 liquid cooled under the 390x. Awesome card btw, especially for $175! OC settings: 1164/1526 +96mv core +50 power.
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> To be clear, the suggestion to run memory at those frequency is for getting optimal performance. 1250/1375/1500/1625/1750 represent memory straps (excluding lower memory straps which are not useful to us) & these numbers are the max memory clock for that strap. The higher the strap, the looser the memory timings, e.g. memory at 1376 - 1500 running at tighter timings than memory at 1501 - 1625. Because of this you may get better performance with memory at 1500 with tighter timings than memory at 1526 with looser timings.
> 
> 
> 
> Howsit goin ??
> 
> Yep , that goes for your dram timings as well

Good.







How are you? I haven't seen you posting for some time.

Are you not going to upgrade to Haswell-E? If you do, be careful posting your mad overclocks & overvolts in the HW-E thread. Voltage is a pretty sensitive subject over there.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *kizwan*
> 
> Good.
> 
> 
> 
> 
> 
> 
> 
> How are you? I haven't seen you posting for some time.
> 
> Are you not going to upgrade to Haswell-E? If you do, be careful posting your mad overclocks & overvolts in the HW-E thread. Voltage is a pretty sensitive subject over there.


Things are fantastic as always dude









When I do upgrade, sounds like that's the first place to go and turn everything on its head, eh


----------



## kizwan

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Things are fantastic as always dude
> 
> 
> 
> 
> 
> 
> 
> 
> 
> When I do upgrade sounds like that's the first place to go and turn everything on its head eh


Sounds like a good plan.


----------



## afyeung

Quote:


> Originally Posted by *kizwan*
> 
> To be clear, the suggestion to run memory at those frequency is for getting optimal performance. 1250/1375/1500/1625/1750 represent memory straps (excluding lower memory straps which are not useful to us) & these numbers are the max memory clock for that strap. The higher the strap, the looser the memory timings, e.g. memory at 1376 - 1500 running at tighter timings than memory at 1501 - 1625. Because of this you may get better performance with memory at 1500 with tighter timings than memory at 1526 with looser timings. Basically if you can't do 1625, better go back to 1500.
> 
> In my opinion, if my card can only do 1600, not 1625, I will run memory at 1600 instead of 1500. If my card can only do 1525, I will run memory at 1500 instead of 1525.


Thanks for the great explanation! My card can actually do a lot higher than 1600, but I need aux voltage to get past 1550, and as of now MSI AB is the only tool that offers aux voltage, but AB doesn't play well with the 390X + 290 combo.







I'll drop it down to 1500. No biggie.


----------



## afyeung

Quote:


> Originally Posted by *kizwan*
> 
> Good.
> 
> 
> 
> 
> 
> 
> 
> How are you? I haven't seen you posting for some time.
> 
> Are you not going to upgrade to Haswell-E? If you do, be careful posting your mad overclocks & overvolts in the HW-E thread. Voltage is a pretty sensitive subject over there.


Lol. I have my 5820K @ 4.5GHz at 1.3V. If I go any higher than 1.3V, my friend will kill me


----------



## corey3333

Are you running a Hawaii 290 and/or 290X in crossfire with a 390X? I can't get my Sapphire 290X to crossfire with my 390X without flicker.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *corey3333*
> 
> are you running a Hawaii 290 and/or X with a 390x crossfire? I can't get my Sapphire 290x to crossfire with my 390x without flicker.


That's 'cause you need to add a northern flicker to counteract it










Spoiler: Warning: Spoiler!


----------



## corey3333

what is a Northern Flicker?


----------



## afyeung

Quote:


> Originally Posted by *corey3333*
> 
> are you running a Hawaii 290 and/or X with a 390x crossfire? I can't get my Sapphire 290x to crossfire with my 390x without flicker.


Yep. Running MSI 290 with the 390x in crossfire. Only game I've had issues with is gta 5 because of vram difference. Using crimson beta driver


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> That's cause you need to add a northern flicker to counter act it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *corey3333*
> 
> what is a Northern Flicker?

Open the spoiler noob ......


----------



## corey3333

Yeah, GTA V in crossfire gives me like 7 or 9 fps, not playable.


----------



## corey3333

Nobody calls me a noob! I've been building computers and programming code since the Intel 8088 and when the 8086 came out... NOOB!!


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *corey3333*
> 
> Nobody calls me a noob! I've been building computers and programming code since the Intel 8088 and when the 8086 came out... NOOB!!


The point is that you didn't open the spoiler to get the joke, and that's a noob thing ...........

Plus


----------



## JourneymanMike

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *corey3333*
> 
> Nobody calls me a noob! I've been building computers and programming code since the Intel 8088 and when the 8086 came out... NOOB!!
> 
> 
> 
> The point is that you didn't open the spoiler to get the joke that's a noob thing ...........
> 
> Plus
Just my 2 cents


----------



## afyeung

Quote:


> Originally Posted by *corey3333*
> 
> Ya GTAV in crossfire gives me like 7 or 9 fps, not playable.


That's not possible. The obvious thing then is to turn off crossfire for that game.


----------



## mfknjadagr8

Quote:


> Originally Posted by *corey3333*
> 
> Ya GTAV in crossfire gives me like 7 or 9 fps, not playable.


Sounds like a driver issue to me... did you use DDU in safe mode to remove the old drivers? Did you install with one card powered, then restart, shut down, reconnect the second card, boot until the second card is found, then restart and enable crossfire?

Funny thing about experience is... just because you do something for a long time doesn't mean you know all there is to know... otherwise you wouldn't be here asking...


----------



## corey3333

Yep


----------



## JourneymanMike

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *corey3333*
> 
> Ya GTAV in crossfire gives me like 7 or 9 fps, not playable.
> 
> 
> 
> sounds like driver issue to me...did you use *ddu* in safe mode to remove? did you install with one card powered then restart cycle shutdown reconnect second card boot until second card is found then restart and enable crossfire....?
> 
> Funny thing experience is...just because you do something for a long time doesn't mean you know all there is to know...otherwise you wouldn't be here asking...

I run across so many initials in here...

I already know what Di Di Mao means...









Please excuse my ignorance, but what does DDU stand for?

Thanks

Mike


----------



## mus1mus

Spoiler: DUM DUM UHM!



http://www.guru3d.com/files-details/display-driver-uninstaller-download.html


----------



## HOMECINEMA-PC

Display Driver Uninstaller .......


----------



## HOMECINEMA-PC

Didn't see the post above


----------



## mus1mus

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Didn't see the post above


where are them gifs?


----------



## Vici0us

I have a quick question.
Is it safe to have a reference 290 @ 1.258V? I'm talking about 24/7 OC. I usually have my 290's below 1.25V.
Temps stay around 80C - 85C (depending on the room temperature). I'm using a custom fan profile.
Thanks in advance!


----------



## mus1mus

Those are pretty standard Voltages. The reference cards are built tough.


----------



## Vici0us

Quote:


> Originally Posted by *mus1mus*
> 
> Those are pretty standard Voltages. The reference cards are built tough.


Thanks mang, which voltage isn't safe for a ref 290?


----------



## mus1mus

Quote:


> Originally Posted by *Vici0us*
> 
> Thanks mang, which voltage isn't safe for a ref 290?


I fear I can't tell you even if I have been there.







I will if you are on water.


----------



## Vici0us

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vici0us*
> 
> Thanks mang, which voltage isn't safe for a ref 290?
> 
> 
> 
> I fear I can't tell you even if I have been there.
> 
> 
> 
> 
> 
> 
> 
> I will if you are on water.

Well the reason I want to know is so I can push my cards as far as I can but still be safe. I've pushed em to 1.3V but that was just to mess around with benchmarks.


----------



## corey3333

I hear ya, that's why I'm here. You can find others who have solved problems that have plagued you for so long on PC. I've tried 3 sets of AMD cards, that's six different combinations from the 7790 to the 390X, and from Windows XP to 7 SP1 to now Windows 10, and all have the same problem with GTA V. I've resigned myself to the fact that I have to run on just one card. Also, Watch Dogs has a similar problem, like 14 fps.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *mus1mus*
> 
> where are them gifs?


On my phone dude ;-)


----------



## mfknjadagr8

Quote:


> Originally Posted by *Vici0us*
> 
> I have a quick question.
> Is it safe to have a reference 290 @ 1.258V? I'm talking about 24/7 OC. I usually have my 290's below 1.25V.
> Temps stay around 80C - 85C (depending on the room temperature). I'm using a custom fan profile.
> Thanks in advance!


Mine run 1.24 under load with no voltage changes, but I'm under water and they never break 60C... I don't like seeing 80s, but according to AMD it's completely fine


----------



## bkvamme

Quote:


> Originally Posted by *mfknjadagr8*
> 
> mine run 1.24 under load with no voltage changes but I'm under water and they never break 60c....I don't like seeing 80s but according to amd completely fine


Are we talking 24/7 as in GPU computing 24/7?

Edit: Too early for me it seems, you don't mention 24/7 clocks. Where is that coffee?!


----------



## Vici0us

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vici0us*
> 
> I have a quick question.
> Is it safe to have a reference 290 @ 1.258V? I'm talking about 24/7 OC. I usually have my 290's below 1.25V.
> Temps stay around 80C - 85C (depending on the room temperature). I'm using a custom fan profile.
> Thanks in advance!
> 
> 
> 
> mine run 1.24 under load with no voltage changes but I'm under water and they never break 60c....I don't like seeing 80s but according to amd completely fine

I got my cards when they first came out. I remember everyone was complaining about cards running @ 95C, but AMD confirmed that that's how it's supposed to be. So running a reference card @ 80C - 85C is not bad.


----------



## th3illusiveman

Anyone find a way to run "High" shadows in Fallout without the FPS tanking? I can max everything but that setting (and godrays, of course)


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *JourneymanMike*
> 
> 
> Just my 2 cents




Quote:


> Originally Posted by *mus1mus*
> 
> where are them gifs?






Quote:


> Originally Posted by *mfknjadagr8*
> 
> sounds like driver issue to me...did you use ddu in safe mode to remove? did you install with one card powered then restart cycle shutdown reconnect second card boot until second card is found then restart and enable crossfire....?
> 
> Funny thing experience is...just because you do something for a long time doesn't mean you know all there is to know...otherwise you wouldn't be here asking...




Quote:


> Originally Posted by *Vici0us*
> 
> I got my cards when they first came out. I remember everyone was complaining about cards running @ 95C, but AMD confirmed that that's how it's supposed to be. So running a reference card @ 80C - 85C is not bad.


Yikes that's a tad warm











Since I blocked these cards 2 years ago I've never looked back. I've gotten 1360MHz at 1.56V on a single card, 1330MHz in crossfire and 1300MHz in trifire









I get 42C under full load without the chiller and 27C to 32C with the chiller on, at 1000 / 1250 at 1.18V vdrooped ........



Anyways, I'm outta here .....


----------



## mfknjadagr8

Quote:


> Originally Posted by *bkvamme*
> 
> Are we talking 24/7 as in GPU computing 24/7?
> 
> Edit: Too early for me it seems, you don't mention 24/7 clocks. Where is is that coffee?!


I'm not folding on them if that's what you mean, but I run them in all 3D apps as that is my unmodified stock value... under any load over 25 percent I'm at that voltage







Clocks are 1050/1300, no changes aside from the power limit maxed


----------



## Vici0us

You're forgetting that I don't have my cards under water like most of you guys. I'm @ 1100 / 1375 on air.


----------



## rdr09

Quote:


> Originally Posted by *corey3333*
> 
> I hear ya, that's why I'm here. You can find others who have solved problems that have plagued you for so long on PC. I've tried 3 sets of AMD cards, that's six different combinations from the 7790 to the 390X, and from Windows XP to 7 SP1 to now Windows 10, and all have the same problem with GTA V. I've resigned myself to the fact that I have to run on just one card. Also, Watch Dogs has a similar problem, like 14 fps.


Both those games need help from the CPU. OC if at stock, whatever CPU you have. Use the AB graphs to monitor usage.


----------



## refirendum

So who's excited for DX12? That performance boost recent AMD cards get in DX12 is pretty awesome.


----------



## YellowBlackGod

Quote:


> Originally Posted by *refirendum*
> 
> so who's excited for dx12? that performance boost on dx12 that recent amd cards get is pretty awesome.


I am. And the value and potential of this card are awesome. A 2-year-old GPU that still lives today, also under a different name. It is by far the best GPU; it still performs in new titles as if it were released today. When I swapped the reference 780 Ti for the 8GB Vapor-X, some could say I was crazy. But I guess I am not.







DirectX 12, FreeSync via HDMI, and AMD's new technologies will all be supported by our GPU. In 1-2 days my FreeSync monitor arrives as well. Can't wait to enjoy the full gaming experience. And I don't want to hear that the GTX 970 is the competitor of the 290X. The GTX 980 is. Or even the 980 Ti, if we think about DX12.


----------



## minh2134

Hello, may I join this club?
GPU-Z link: http://www.techpowerup.com/gpuz/details.php?id=m79ak
Manufacturer:AMD(since this is a ref ver.) Brand: Gigabyte
Cooler:Stock.


----------



## Arizonian

Quote:


> Originally Posted by *minh2134*
> 
> Hello, may I join this club?
> GPU-Z link: http://www.techpowerup.com/gpuz/details.php?id=m79ak
> Manufacturer:AMD(since this is a ref ver.) Brand: Gigabyte
> Cooler:Stock.


I see you're fairly new to OCN. Welcome, and congrats - added


----------



## minh2134

Quote:


> Originally Posted by *Arizonian*
> 
> I see you're fairly new to OCN. Welcome, and congrats - added


Nice, I will try to be more active in the future, I'm sure of that


----------



## scottydsntknow

Finally got mine all setup properly. Pulled the 290x at stock speeds I was borrowing and installed my non X 290 with a water cooler and cranked it to 1175/1450 and we'll see how she does.


----------



## scottydsntknow

Just ran Fire Strike and pulled a 13307 graphics score, 9654 combined, at the 1175/1450 overclock. Apparently there is more to be squeezed out of this card, but I'll figure that out later. The 290X did 12207 graphics and 9209 at stock speeds. I am going to see if I have the correct bin to try to flash this thing to a 290X, but if not, oh well.

The card did not break 70C at any time, which is a little hot for water, but I'm using a basic H50 cooler with the single stock Corsair fan. When I swap motherboards to a Sabertooth, once my RMA comes back, I will put water back on the CPU and throw the 212 Evo fans, which have way more CFM and much better static pressure, onto this rad and see what happens.


----------



## rdr09

Quote:


> Originally Posted by *scottydsntknow*
> 
> Just ran Firestrike and pulled a 13307 graphics score, 9654 combined score on the 1175/1450 overclock. Apparently there is more to be squeezed out of this card but I'll figure that out later. The 290x did 12207 graphics and 9209 at stock speeds. I am going to see if I have the correct bin to try to flash this thing to a 290x but if not oh well.
> 
> Card did no break 70C at any time which is a little hot for water but I'm using a basic H50 cooler with the single stock Corsair fan. When I swap motherboards to a Sabertooth when my RMA comes back in I will put water back on the CPU and throw the 212Evo fans that have way more CFM and much better static pressure onto this rad and see what happens.


Normally, when on air, the VRM1 temp will be your limiting factor. What was it?

Use GPUZ to check temps or HWINFO64.


----------



## corey3333

Hey, I just bought an MSI 390X and successfully crossfired it with my Sapphire Vapor-X 290X; it works perfectly. And everyone I asked told me you can't mix different brands and that the 390 has a totally different chip than the 290X. Here are the facts: the 390/390X has the same Hawaii GPU as the 290/290X, and different brands DO crossfire. I had to try it to find this out for myself. But in the end I am returning the MSI 390X. According to the specs and what they tell you, the 390 is supposed to be a bit faster than the 290X. NOT true. I ran several benchmarks with several different benchmarking apps. The MSI 390X with OpenGL 4 at 1080p gets an average of 320FPS. My Sapphire 290X with all the same settings gets 430FPS. More importantly, the 390 gets way too hot for me, usually around 68/72C at 92% load. The Sapphire runs around 53/58C at 97/100% load. This is just my experience and opinion; maybe other people have better results on their setups. The point is I am returning the 390X and sticking with my two Sapphire 290X cards, and I'm happy with that.


----------



## mfknjadagr8

Quote:


> Originally Posted by *corey3333*
> 
> Hey, I just bought an MSI 390X and successfully crossfired it with my Sapphire Vapor-X 290X; it works perfectly. And everyone I asked told me you can't mix different brands and that the 390 has a totally different chip than the 290X. Here are the facts: the 390/390X has the same Hawaii GPU as the 290/290X, and different brands DO crossfire. I had to try it to find this out for myself. But in the end I am returning the MSI 390X. According to the specs and what they tell you, the 390 is supposed to be a bit faster than the 290X. NOT true. I ran several benchmarks with several different benchmarking apps. The MSI 390X with OpenGL 4 at 1080p gets an average of 320FPS. My Sapphire 290X with all the same settings gets 430FPS. More importantly, the 390 gets way too hot for me, usually around 68/72C at 92% load. The Sapphire runs around 53/58C at 97/100% load. This is just my experience and opinion; maybe other people have better results on their setups. The point is I am returning the 390X and sticking with my two Sapphire 290X cards, and I'm happy with that.


No one here should have told you any of those things... the 390X should have outperformed your 290X easily, due to tighter memory timings and better base clocks stock, but they tend to have slightly higher base voltages to compensate


----------



## corey3333

I did not get any of that info from here, just feedback from Amazon customers who purchased the card, plus some feedback from Newegg and some friends. Yes, I agree, the memory has better timings and a higher effective clock speed of 1500MHz. Also, the 390's GPU is factory set at 1080MHz and can OC for gaming to 1100MHz. My Sapphire 290X is set at a GPU speed of 1030MHz and uses more millivolts, and the memory clock is factory set to 1375MHz. The cool thing about the 390 is it can handle triple overvoltage. For some reason, in my case, two 290Xs run faster than one 390X linked with a 290X. I bet if you put together two 390Xs you would see a large improvement.


----------



## corey3333

Also, the 390 can process 384GB a second and the 290X can do 352GB a second.
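Those figures fall straight out of the GDDR5 math on Hawaii's 512-bit bus: bandwidth = memory clock x 4 transfers per clock x bus width / 8. Note the 352GB/s number corresponds to the Vapor-X's 1375MHz memory clock, not the reference 290X's 1250MHz (which gives 320GB/s). A quick check:

```python
def gddr5_bandwidth_gbs(mem_clock_mhz: float, bus_width_bits: int = 512) -> float:
    """Peak bandwidth in GB/s: clock (MHz) x 4 (GDDR5 quad-pumped) x bytes/transfer."""
    return mem_clock_mhz * 4 * bus_width_bits / 8 / 1000

print(gddr5_bandwidth_gbs(1500))  # 390/390X at 1500MHz -> 384.0 GB/s
print(gddr5_bandwidth_gbs(1375))  # Vapor-X 290X at 1375MHz -> 352.0 GB/s
print(gddr5_bandwidth_gbs(1250))  # reference 290X at 1250MHz -> 320.0 GB/s
```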


----------



## mfknjadagr8

I'm assuming you had the 390 in the first slot and the 290X below it? Were you running them with unified clocks or separate?


----------



## sinnedone

Quote:


> Originally Posted by *corey3333*
> 
> Also the 390 can process 384GB a sec and the 290x can do 352GB a sec.


At what clocks for each?

Put them both at the same core and memory frequencies and see if that differs.


----------



## afyeung

BTW guys, I got rid of the heat by removing the liquid-cooled card and just running the 390X. My 290 will be going into another system.


----------



## navjack27

I'm glad you fixed those temp issues @afyeung


----------



## scottydsntknow

Quote:


> Originally Posted by *rdr09*
> 
> Normally, when on air, the VRM1 temp will be your limiting factor. What was it?
> 
> Use GPUZ to check temps or HWINFO64.


I never even checked; I put the water cooler on right off the bat, pretty much. Just verified I did not have a dead card first.

My VRM1 temp right now stays in the 50s during heavy gaming sessions. The core gets a little hot, it got into the 70s today. I am going to move the radiator to an intake setup and install some really good fans in push/pull.


----------



## pcrevolution

Quote:


> Originally Posted by *mfknjadagr8*
> 
> mine run 1.24 under load with no voltage changes but I'm under water and they never break 60c....I don't like seeing 80s but according to amd completely fine


Just a quick question.

My 290X runs close to 60C too. Would you consider this temperature too high in that it can speed up oxidation in the waterblocks?


----------



## Forceman

Quote:


> Originally Posted by *pcrevolution*
> 
> Just a quick question.
> 
> My 290X runs close to 60C too. Would you consider this temperature too high in that it can speed up oxidation in the waterblocks?


I don't know the impact on oxidation, but bear in mind that is the core temp, not the water temp. The core temp is always going to be higher than the water temp, and water temp is what speeds up problems with plasticizers, so likely the same with oxidation.


----------



## Roboyto

Quote:



> Originally Posted by *pcrevolution*
> 
> Just a quick question.
> 
> My 290X runs close to 60C too. Would you consider this temperature too high in that it can speed up oxidation in the waterblocks?


If you are using some sort of anti-corrosion substance and a biocide you shouldn't have any issues. I typically just mix in a small amount of automobile anti-freeze for its anti-corrosive properties. I also run a silver kill coil with some Dead Water biocide. When constructing the loop, I run some isopropyl alcohol through the blocks/tubing/reservoirs before everything goes together.

Following this procedure has always worked out well for me. My current loop in my main rig hasn't been touched in nearly 2 years with very minimal evaporation, no apparent corrosion/gunk build up at all in my res, and still good load temps.

Many times manufacturers for your various components will list a maximum operating temperature for your water/coolant. So if you feel it's too high you can always check what they say and compare to your temp. 60C for GPU/VRMs with some extra voltage/overclock isn't uncommon if you don't have excessive amounts of radiator space.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Roboyto*
> 
> If you are using some sort of anti-corrosion substance and a biocide you shouldn't have any issues. I typically just mix in a small amount of automobile anti-freeze for it's anti-corrosive properties. I also run a silver kill coil with some Dead Water biocide. When constructing the loop, I run some Isopropyl alcohol through blocks/tubing/reservoirs before everything goes together.
> 
> Following this procedure has always worked out well for me. My current loop in my main rig hasn't been touched in nearly 2 years with very minimal evaporation, no apparent corrosion/gunk build up at all in my res, and still good load temps.
> 
> Many times manufacturers for your various components will list a maximum operating temperature for your water/coolant. So if you feel it's too high you can always check what they say and compare to your temp. 60C for GPU/VRMs with some extra voltage/overclock isn't uncommon if you don't have excessive amounts of radiator space.


This... also just remember it's water temp that matters, not core temps or VRM temps (when talking about failure and corrosion). Under 60 is good in my opinion; your water temp is likely quite a bit lower than that. But nearly every manufacturer of tubing and blocks specifies an operating range, and as long as you don't exceed that I'd never worry.


----------



## scottydsntknow

Heh... I hit 77C on the core last night in a REALLY heavy eyefinity session with all kinds of explosions going on in Elite Dangerous. I'm going to crack this sucker back open tonight and maybe check my TIM application and then put the fans in a push/pull as a bottom intake where the coolest air is going to be (filter and 1" rubber case feet so plenty of clean air).

Even still, these cards are good to go over 90C so I'm not worrying about frying anything... not exactly. I could always bump down my OC, don't need 1175/1450 for ED...


----------



## Tobiman

I started having a problem after installing 15.12 yesterday. My PC monitors turn off after gaming for like 30mins while every other thing works in the background. I have to hard restart to get the PC showing anything again. I thought it was the new driver but uninstalling with DDU and installing 15.10 didn't fix it.


----------



## navjack27

There are some interesting TDR manifestations with Crimson. At least on my rig I've noticed that I think my computer is frozen (Caps Lock won't light up my keyboard), but then I wait a little bit and it comes back, without a driver-reset notice in the taskbar. It could be something like that, maybe; I dunno.


----------



## i2CY

15.12 agrees with my rig, 15.11 did not....
did you install all current windows updates before updating Crimson?


----------



## Tobiman

Since the problem I was having manifested with 15.10 as well, I went ahead and reinstalled 15.12 and BAM! Everything is fine again.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Tobiman*
> 
> Since the problem I was havng manifested with 15.10 as well, I went ahead and reinstalled 15.12 and BAM! Everything is fine, again.


welcome to amd borked installs....


----------



## scottydsntknow

I was having all sorts of... lag-like issues with even just basic browsing with my new 290 and 15.10 but after 15.12 everything is good to go. Different strokes for different rigs.


----------



## mfknjadagr8

Quote:


> Originally Posted by *scottydsntknow*
> 
> I was having all sorts of... lag-like issues with even just basic browsing with my new 290 and 15.10 but after 15.12 everything is good to go. Different strokes for different rigs.


Note: likely the same as above... sometimes, even with proper prep, the AMD driver just doesn't install or initialize properly. It seems to be worse with crossfire as well; at least in my case I have way more borked installs with two cards than I had with one, even doing it the way it should be done.


----------



## wemo92

Seeking help: I have an R9 290 Tri-X with an Aerocool KCAS 700 W power supply. During testing I saw that the GPU core clock graph is unstable and I couldn't understand why; I fear the power supply is not sufficient to keep it stable.

I've attached a screenshot:


----------



## diggiddi

The GPU core speed will fluctuate based on the load, it will not be constantly fixed at one speed. I can't say anything about your PSU


----------



## OneB1t

Let's try to push AMD to ship a driver fix for the 2D power consumption of the 290X/390X:
https://community.amd.com/thread/193942
Thanks for your contribution.


----------



## Cyber Locc

Quote:


> Originally Posted by *OneB1t*
> 
> lets try to force AMD to make driver fix for 2D power consumption of 290X/390X
> https://community.amd.com/thread/193942
> thanks for your contribution


Okay, so a word of warning about this. I did this rodeo with AMD over a much, much larger issue: the crackling noises in crossfire that were/are a nightmare (they're still there, last I checked). We had a ton of people back it, and AMD reps said they'd look into it. Last I heard they recreated it and would fix it; that was a year ago, and as of Omega I still had it. In other words, don't hold your breath.

That said, I really don't think you're going to get a large backing on this issue, tbh. Let's break this down realistically.


Spoiler: Warning: Long and Math!



Let's take your TechPowerUp examples and the largest difference between the 290X and the 980 Ti, which is Blu-ray movies. I will also do the idle differences, though in less detail.

All wattage examples are from "https://www.techpowerup.com/reviews/Powercolor/R9_390_PCS_Plus/28.html"

Legend: W = watts. PDKW = kWh per 24 hours. PMKW = kWh running 24/7 for 30 days. PYKW = kWh running 24/7 for 365 days. PYC = cost per year at the US average of $0.12 per kWh.

| | 290X/390X (Blu-ray) | 980 Ti (Blu-ray) | Difference |
|---|---|---|---|
| W | 79 | 14 | 65 |
| PDKW | 1.90 | 0.34 | 1.56 |
| PMKW | 56.88 | 10.08 | 46.80 |
| PYKW | 692.04 | 122.64 | 569.40 |
| PYC | $83.04 | $14.72 | $68.33 |

As you can see, the difference in $$$ if I were to run Blu-rays 24/7 is $68.33 a year on average. The price of an R9 390X as of writing is around $430; a 980 Ti is around $650, a $220 price difference. Therefore the difference won't be made up until a bit over three years of continuous use.
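For anyone who wants to redo the math, here is a quick Python sketch using the stated wattages. The $0.12/kWh rate and the ~$430/~$650 street prices are this post's assumptions, not measurements:

```python
# Sanity-check of the power-cost comparison above.
# Wattages are the TechPowerUp Blu-ray playback numbers; the
# $0.12/kWh rate and the card prices are this post's assumptions.

RATE_USD_PER_KWH = 0.12

def yearly_cost(watts: float, rate: float = RATE_USD_PER_KWH) -> float:
    """Cost of running a constant load 24/7 for a year."""
    kwh_per_year = watts * 24 * 365 / 1000
    return kwh_per_year * rate

r9_290x = yearly_cost(79)   # 290X/390X during Blu-ray playback
gtx_980ti = yearly_cost(14) # 980 Ti during Blu-ray playback
diff = r9_290x - gtx_980ti

price_gap = 650 - 430  # assumed street-price difference in dollars
payback_years = price_gap / diff

print(f"Blu-ray 24/7 difference: ${diff:.2f}/yr")
print(f"Payback on the ${price_gap} price gap: {payback_years:.1f} years")
```

Scale the rate for your own power price; at Europe's roughly $0.20/kWh the payback arrives proportionally faster.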

The thing is, you're jumping in all gung-ho on something that may not be as easy to change as you think, and on a matter that is well known and has been for some time. With Nvidia you pay up front at the store; with AMD you pay half at the store and half on your power bill.








As you can also see from your graph, this issue also persists in the 7000 series, most likely the 6000 series as well, and so on. It isn't as easy as changing a setting in a driver; this behavior is built into the card's architecture.

Furthermore, as other users have told you in the other thread, it may very well come down to how the card handles 2D: the AMD cards still do the lifting, while the Nvidia cards appear to shut off almost entirely and hand video off to the CPU.

So here are the idle findings:

| | 290X/390X (idle) | 980 Ti (idle) | Difference |
|---|---|---|---|
| W | 11 | 8 | 3 |
| PYKW | 96.36 | 70.08 | 26.28 |
| PYC | $11.56 | $8.41 | $3.15 |

As you can see, that difference shrinks a lot at idle. So unless you are one of the rare people in the world who builds a rig with a flagship GPU to watch Blu-rays 24/7, the issue isn't nearly as bad as you're making it out to be.

On that very note, the average Blu-ray player consumes 60-100 W while playing Blu-rays, which is more than the GPU.









I truly hate to be the Debbie Downer, and I realize that some countries' power is much more expensive than in the US. However, to expect AMD to "fix" an issue with a last-gen GPU, one most of us knew about going in, just to save us a few dollars a year, is just ludicrous.

This is amplified by the fact that there are much, much bigger issues that need to be resolved and have not been, the static issue included. (I'm not holding my breath though.)

Edit: Added source and cleaned the mess a little.



I really hope I don't offend you with this post; I'm not trying to be mean or condescending. I'm just trying to help you see the bigger picture here. I think we can all say we bought our GPUs to play games, not to save money on our power bills. In that light, I personally would much rather they focus on performance, stability, and bug fixes, far more than on a few dollars on my power bill.

EDIT: Spoiler-ed the large math and stuff








EDIT, AGAIN: BTW, I forgot about one part of your post on the other forum. You mentioned locking 2D clocks. I know I have seen guides about this before, but I never wanted to do that (I always needed the opposite). It may work in reverse, though; you could check out the thread in the OP and skim through it for an idea. Or we could just ping the master who wrote that guide: @tsm106. I am not sure that will solve your power issue, but it might.

"maybe just add ability to lock clocks to 2D with certain applications?" - From the other forum.


----------



## mfknjadagr8

Quote:


> Originally Posted by *cyberlocc*
> 
> Okay so a word of warning about this. I did this rodeo with AMD over a much much larger issue. The Crackling noises in crossfire that were/are a nightmare (there still there last I checked). We had a ton of people back it and amd reps say they look into it. Last I heard they recreated it and would fix it that was a year ago and as of omega I still had it. In other words don't hold your breath.
> 
> That said I really don't think your going to get a large backing on this issue tbh. Lets break this down realistically.
> 
> Lets take your tech power examples and the largest difference between the 290x and the 980 ti, Which is bluray movies. I will also do the Idle differences however less detailed.
> 
> Legend: W = watts. PDKW = kWh per 24 hours. PMKW = kWh running 24/7 for 30 days. PYKW = kWh running 24/7 for 365 days. PYC = cost per year at the US average of $0.12 per kWh.
> 
> | | 290X/390X (Blu-ray) | 980 Ti (Blu-ray) | Difference |
> |---|---|---|---|
> | W | 79 | 14 | 65 |
> | PDKW | 1.90 | 0.34 | 1.56 |
> | PMKW | 56.88 | 10.08 | 46.80 |
> | PYKW | 692.04 | 122.64 | 569.40 |
> | PYC | $83.04 | $14.72 | $68.33 |
> 
> As you can see, the difference in $$$ if I were to run Blu-rays 24/7 is $68.33 a year on average. The price of an R9 390X as of writing is around $430; a 980 Ti is around $650, a $220 price difference. Therefore the difference won't be made up until a bit over three years of continuous use.
> 
> The thing is your jumping in all gun ho on something that may not be as easy to change as you think. Also on a matter that is well known and has been for some time. With Nvidia you pull up front at the store, with AMD you pay half at the store and half on your power bill. As you can also see from your graph this issue also persists in the 7000 series, most likely the 6000 series as well and so on. It isn't something so easy as change a setting in a driver, this issue is built into the cards architecture.
> 
> Furthermore as other users have told you on the other thread it may very well have to do with the way the card handles 2d while AMD cards do the lifting still the Nvidia cards appear to shut off entirely and give video to the CPU.
> 
> So here is our Idle findings.
> 
> | | 290X/390X (idle) | 980 Ti (idle) | Difference |
> |---|---|---|---|
> | W | 11 | 8 | 3 |
> | PYKW | 96.36 | 70.08 | 26.28 |
> | PYC | $11.56 | $8.41 | $3.15 |
> 
> As you can see above that difference shrinks alot at idle. So unless you are one of the rare people in the world that build a rig with a flagship GPU to watch Blurays 24/7 then the issue isn't nearly as bad as your making it.
> 
> On that very note though, the average Bluray player consumes 60-100ws of power to watch Blurays which is more than the GPU
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I truly hate to be the Debbie Downer, and I realize that some country's power is much more expensive than in the US. However to expect AMD to "Fix" an issue with a last gen gpu, that most of us knew before going in to save us a few dollars a year is just ludicrous. This is amplified by the fact that there is much much bigger issues that need to be resolved and have not the static issue included (I'm not holding my breath though.)
> 
> I really hope I don't offend you with this post, I'm not trying to be mean or condescending. I'm just trying to help you see the bigger picture here. I think we can all say we bought our GPUs to play games, not to save money on our power bills. In that light I personally would much rather they focus on Performance and stability and bug work outs ect, far more than a few dollars on my power bill.


surely you don't mean the spotty crossfire support and lack of driver updating in general rendering 300 to 500 dollars worth of cards unusable in a large number of circumstances?


----------



## Cyber Locc

Quote:


> Originally Posted by *mfknjadagr8*
> 
> surely you don't mean the spotty crossfire support and lack of driver updating in general rendering 300 to 500 dollars worth of cards unusable in a large number of circumstances?


I sure do man, I sure do. Last I checked I was still having the sound crackling with crossfire on and more than one monitor hooked up; sucks so bad, so bad. (Well, by "still" I mean the last time I checked, but my rig has been down for a bit and I hadn't upgraded to Crimson prior, so fingers crossed.)


----------



## the9quad

Quote:


> Originally Posted by *cyberlocc*
> 
> I sure do man, I sure do
> 
> 
> 
> 
> 
> 
> 
> . Last I checked I am still having the sound crackling with my crossfire on and more than 1 monitor hooked up, sucks so bad, so bad. (well by still I mean last I checked but my rig has been down for a bit and I hadn't upgraded to crimson prior, so fingers crossed)


I have two monitors, and 3 290x's hooked up. I don't get that crackling. I am using DVI-D though, on both monitors. What are you using?


----------



## mfknjadagr8

Quote:


> Originally Posted by *the9quad*
> 
> I have two monitors, and 3 290x's hooked up. I don't get that crackling. I am using DVI-D though, on both monitors. What are you using?


Up until last week I had two monitors and two XFX 290s, one DVI, one HDMI... no crackling. But my monitors are used as an extended desktop, and only one was used while gaming or watching videos, as one is full 1080p and the other 900p.


----------



## Cyber Locc

Quote:


> Originally Posted by *the9quad*
> 
> I have two monitors, and 3 290x's hooked up. I don't get that crackling. I am using DVI-D though, on both monitors. What are you using?


Basically the same as you, actually. However, there was a thread about the crackling, and there was a specific set of conditions that caused it: crossfire enabled, more than one monitor hooked up (ports don't matter), and Vsync enabled. If Vsync is off it goes away. That said, it may not affect all cards; of that I'm not sure. When I first encountered it I was using 2 GPUs, both launch cards, and that may have something to do with it.

However, AMDMatt on Overclockers was helping us get the issue to the driver team and confirmed they were able to recreate it. Hopefully it got fixed in Crimson. Did you ever experience the crackling? The best way to check is to enable crossfire with multiple monitors (not in Eyefinity) and Vsync on, then run DPC Latency Checker; when you see latency spikes, that's when the audio crackles.

My rig is torn down or I would check. My pump died, and I figured it would be a good time to change some stuff around; when I added my third card my cooling wasn't good enough for me anymore, so it's time to move into an STH10. I had a 480, a 420, and two 240s in a 900D; I'm moving into an STH10 with three 560s.























----------



## fat4l

Guys.
I decided to sell my Ares 3.








Not sure what I will be moving to tho...









Any ideas ?


----------



## mus1mus

Quote:


> Originally Posted by *fat4l*
> 
> Guys.
> I decided to sell my Ares 3.
> 
> 
> 
> 
> 
> 
> 
> 
> Not sure what I will be moving to tho...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any ideas ?


980TI for a single card. If your budget can budge.

Else, 390X is a good replacement for your card.


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> 980TI for a single card. If your budget can budge.
> 
> Else, 390X is a good replacement for your card.


Well, going green would mean I have to change my monitor as well, as it's a FreeSync one.
I'm thinking about the 390X as I really like the OC and performance results. I hoped the Ares 3 would be special in some way, but it really is not.








I'm also thinking about the Fury X, but a reference PCB is not something I like.
I will be watercooling, of course.


----------



## OneB1t

1. Crossfire is a 1%-of-users issue; that's why they don't care much about it.
2. Here in Europe the power-cost gap between the 980 Ti and 290X closes much faster (about 1-2 years with your numbers). Also, if you are using a multi-monitor setup, the card will stay at full 3D memory clocks, +30 W even while idle. The numbers for the 970/980 are even worse (but the guy who tested this on the pctuning.cz forums only had a 980 Ti for comparison).
3. It is a pretty easy fix, since they have full control over card frequency from the drivers: just create profiles for specific .exe files like chrome, opera, potplayer etc. to stay in 2D mode, or improve PowerPlay to not ramp up frequency when there is no need to.


----------



## Cyber Locc

Quote:


> Originally Posted by *OneB1t*
> 
> 1. crossfire is 1% users issue thats why the dont care that much about it
> 2. here in europe gap between 980ti and 290X in terms of power cost is much faster (about 1-2 years with your numbers) also if you are using multimonitor setup card will stay on full 3D mem clocks +30W even while idle
> also for 970/980 numbers are even worse (but guy who tested this on pctuning.cz forums have only 980ti for comparsion)
> 3. it is pretty easy fix as they have full control over card frequency from drivers so just create profiles for specific .exe files like chrome, opera, potplayer etc.. to stay in 2D mode or improve powerplay to not ramp up frequency when there is no need to do it


First off, it's a lot more than 1%, lol. Probably more like 5-10% use 2-GPU crossfire; then figure in the 295X2s, 7990s, 6990s, etc., which also use crossfire, and that number jumps a lot more than you think. If one percent of users used crossfire they wouldn't care about it at all. Furthermore, the people who buy 2+ GPUs of every year's new models make up much more of the total sales than the person who buys 1 every 4 years. Not to mention the people who bin their GPUs, buying in the double digits every generation; those people use crossfire. We may be a smaller base, but in dollars we make up more of their sales.

The average cost of power in Europe is about $0.20 US, so ya, a little less than double. My math still applies, just with the payback arriving roughly twice as fast.

"also if you are using multi monitor setup card will stay on full 3D mem clocks +30W even while idle" - This is due to the additional monitor; unhook it and your memory clock will go down. And guess what, my 670s and 780s (I'm sure the 980s too) do the same thing. This isn't an AMD thing, it's a GPU thing; all of them do it.

The fact is, as the thing you linked shows as well, their power consumption has been an ongoing issue for many generations. Less than 1% of users actually care; that is the point you're missing. Most of us knew that going in or simply don't care. If we cared about the extra cost a PC adds, we would be gaming on a $400 PS4 that uses 300 W total.

Just to clarify, http://www.tomshardware.com/reviews/radeon-r9-290-review-benchmark,3659-17.html has documented this issue very clearly since the card's release. In spite of that, no one else has said anything; read all those comments, no one mentions it. The only people who bring it up are fanboys trying to throw it in AMD's face. We are spending $400 on a GPU to add to an already expensive PC to do what a PS4 or Xbox One can do for $400. Money spent on a gaming PC is never price-to-performance optimized; we do it because we want the better graphics etc., and money isn't an issue standing in the way of what we want.

At any rate good luck with your endeavor I wish you luck.


----------



## Cyber Locc

Quote:


> Originally Posted by *fat4l*
> 
> Well going Green would mean I have to change my monitor as well, as it's a freesync one.
> I'm thinking about 390X as I rly like the OC and performance results. I hoped Ares 3 will be special in some way but it is rly not.
> 
> 
> 
> 
> 
> 
> 
> 
> I'm also thinking about FuryX but reference PCB is not something I like.
> I will be wcooling ofc.


I would go with 2 390Xs and pocket the extra. You could get 2 390Xs and blocks, have the same performance, and pocket some money from the sale.


----------



## mus1mus

Quote:


> Originally Posted by *fat4l*
> 
> Well going Green would mean I have to change my monitor as well, as it's a freesync one.
> I'm thinking about 390X as I rly like the OC and performance results. I hoped Ares 3 will be special in some way but it is rly not.
> 
> 
> 
> 
> 
> 
> 
> 
> I'm also thinking about FuryX but reference PCB is not something I like.
> I will be wcooling ofc.


Go with the Fury X. It's the only upgrade path for you.

Watercooling is also not friendly towards the 390X.

Also, having a reference PCB is not AMD's weak point. Remember my 1350 MHz 290? It's a ref.


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> Go with the fury x. It's the only upgrade path for you.
> 
> Watercooling is also not friendly towards the 390X.
> 
> Also, having a reference PCB is not AMD's weak point. Remember my 1350 Mhz 290? It's a ref.


Oh, I didn't know that.
Then probably I should go with the Fury X. The Sapphire Fury X costs £490 on Amazon now.

This block looks so nice!


----------



## Cyber Locc

Quote:


> Originally Posted by *mus1mus*
> 
> Go with the fury x. It's the only upgrade path for you.
> 
> Watercooling is also not friendly towards the 390X.
> 
> Also, having a reference PCB is not AMD's weak point. Remember my 1350 Mhz 290? It's a ref.


Wait, what? How is watercooling not friendly with the 390X? A reference 390X water-cools just fine using a 290X block; a 390X is a 290X with more VRAM, on the same PCB, using the same block.

That said, 2 Fury Xs would also be an upgrade, more so than 290/390Xs (as those would be the same card, just splitting 1 into 2).

Also, just to clarify something, as the way you guys are talking doesn't sound like you realize it: the Ares III is a dual GPU, so you will have to replace it with 2 GPUs, period. That is the fastest single GPU in the world still, until we see a Fury X2 or a 980 Ti dual GPU.


----------



## mus1mus

Quote:


> Originally Posted by *cyberlocc*
> 
> Wait What? How s watercooling not friendly with 390x? A reference 390x water-cools just fine using a 290x block, A 390x is a 290x with more Vram is the same PCB uses the same block.
> 
> That said 2 Fury Xs would also be a upgrade more so than the 290/390xs (as they would be the same card just splitting 1 into 2).
> 
> Also just to clarify something as the way you guys are talking doesn't sound like you realize. *The Ares III is a dual GPU* so you will have to replace it with 2 gpus period. *That is the fastest single GPU in the world still,* until we see a Fury X2 or a 980 TI dual GPU.


Number 1. No 390X is not a custom cooled card. Block compatibility varies from make to make.









Number 2. 2 or 3 or 4 Fury Xs will still fall into the same family.









Number 3. Wait, what? I didn't know it's a 295X2 with a different nameplate. No, I didn't.







I'm too lost, really. And don't know what I'm writing.

And ohh, if you insist you must upgrade a dual gpu card into a dual gpu card, good luck. :bye:

Last edit: since the Ares 3 is a dual gpu card, how can it become the fastest single gpu in the world?


----------



## Cyber Locc

Quote:


> Originally Posted by *mus1mus*
> 
> Number 1. No 390X is not a custom cooled card. Block compatibility varies from make to make.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Number 2. 2 or 3 or 4 Fury Xs will still fall into the same family.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Number 3. Wait, what? I didn't know it's a 295X2 with a different nameplate. No, I didnt.
> 
> 
> 
> 
> 
> 
> 
> I'm too list really. And not knowing what I'm writing.
> 
> And ohh, if you insist you must upgrade a dual gpu card into a dual gpu card, good luck. :bye:
> 
> Last edit: since the Ares 3 is a dual gpu card, how can it become the fastest single gpu in the world?


No 1. There are reference 390Xs as well as non-reference 390Xs, just like any other GPU on the market. Get a reference 390X and it will use a reference 290X block. It is the 290X PCB, period; what board makers change is another story, as with any other GPU. So I am confused what you're on about. http://www.ekwb.com/news/606/19/Existing-EK-Full-Cover-blocks-compatible-with-Radeon-Rx-300-series-GPUs/

No 2. I know the Furys would fall into the same family; what does this have to do with anything, really?

"And ohh, if you insist you must upgrade a dual gpu card into a dual gpu card, good luck. :bye:" When did I ever say this? I didn't; can you read? I said since it is a dual GPU he's going to need 2 cards to replace it and match its performance: "so you will have to replace it with *2* gpus period."

"Last edit: since the Ares 3 is a dual gpu card, how can it become the fastest single gpu in the world?" How long have you been in the tech industry and you don't know that dual GPUs on a card is still a single card? There is physically 1 card, not 2; it could have 4 GPUs on there, and if it's 1 physical card it's still a single card. Unless I missed something and the Ares III is 2 cards that you stick in 2 PCIe slots, it is a single card. (This was my fault; I should have said "card" when I said "GPU".)

A lesson in English so this can be avoided in the future: "a" = singular, so when you say "a 980 Ti" that implies 1; "s" = plural, so "replace with 980 Tis" implies 2 or more. Don't get an attitude with me because you don't type precisely. Assuming that you knew it was a dual GPU, or that he did, is a bad idea, because most likely half the people owning dual-GPU cards don't know that. This is also why those cards enable crossfire by themselves.

Furthermore, even if you and he did know that, Joe Schmo reading this thread sees that and decides to replace his 295X2 with a single Fury, thinking it's an upgrade, because of the way you worded it.


----------



## fat4l

Yeah, Ares 3 = 295X2 + custom PCB + waterblock. That's all.
It's obvious that going from a 295X2 to a single Fury X will decrease performance in multi-GPU applications.
However, I'm bored with dual cards. They don't clock as well as single-GPU cards, and to utilize my watercooling, which is very OP, I need a card that will clock nicely, take extra volts, and keep low temps.
In other words, I have my watercooling for a reason.


----------



## Cyber Locc

Quote:


> Originally Posted by *fat4l*
> 
> Yeah ares3 = 295x2 + custom pcb + waterblock. Thats all.
> It's obvious that going from 295X2 to a single Fury X will decrease the performance, in multi gpu applications.
> However I'm bored with dual cards. They don't clock as good as single gpu cards and to utilize my watercooling, which is very OP, I need a card that will clock nice+like extra volts+like low temps.
> In other words, I have my wcooling for a reason.


Ya, I completely understand that. I hate dual GPUs for that reason, plus I like the look of a lot of GPUs. I was just saying I would replace your card with 2 cards. Anyway, there are a few Ares floating about on eBay and such, asking around $1300, so I'd assume you would net probably $1200 or so. For that price you could pretty much get whatever you want, 2-GPU-wise. Fury Xs would probably be the best bet, as mus1mus said.


----------



## OneB1t

1. I used crossfire before, and it's a mess and always will be. 1% is the real value from the Steam statistics.
2. I found that I can fix nearly all the issues with a global 300/150 MHz profile in the Crimson driver, then add a 1000/1250 profile for every game I play (a dumb way, but it works, and the GPU then keeps its clocks at 150 MHz).
3. So, now that I know it can easily be fixed with a global profile, I want such a solution from AMD as standard for everyone.


----------



## mus1mus

@cyberlocc

I will be very easy on you.







don't be mad at these.

EK said, if you read your link, they only support 3 makes of the 300 series cards. Namely, DC2, XFX DD, and the PCS+.

DC2 is already not a reference design. So you're left with the DD and PCS+. Yes, they use ref PCB but the notion is, you are limited by the choices.

It's not lost in the translation.

2. Quoting you.

"That said 2 Fury Xs would also be a upgrade more so than the 290/390xs (as they would be the same card just splitting 1 into 2)."

Of course, everyone will agree that since buying a Fury to replace a 200 series card is an UPGRADE, it follows that 2 or 3 or 4 FURYs will be an UPGRADE. What do you imply to that statement of yours?









An upgrade does not follow the scheme that if you have a dual GPU or 2 cards you need to also pick a dual GPU or 2 cards. Nope. Heck, there are games where a Fury X or a 980 Ti will perform better than crossfire Hawaii.

That is a no-brainer.

3. It doesn't take an industry veteran to know that a single GPU means a single GPU CHIP in a card.

Dual GPU means 2 GPU CHIPS in a Card.

So better check your semantics. Before teaching English to a guy with a comment you cannot understand.

4. If you don't think a 980 Ti or a Fury X can be an upgrade to an existing 295X2, you simply have no idea how much better a single-GPU solution is compared to an SLI or crossfire setup in the majority of games.

Lastly, my brain is wired properly to my typing hands. You need to check yours.


----------



## Cyber Locc

Quote:


> Originally Posted by *mus1mus*
> 
> @cyberlocc
> 
> I will be very easy on you.
> 
> 
> 
> 
> 
> 
> 
> don't be mad at these.
> 
> EK said, if you read your link, they only support 3 makes of the 300 series cards. Namely, DC2, XFX DD, and the PCS+.
> 
> DC2 is already not a reference design. So you're left with the DD and PCS+. Yes, they use ref PCB but the notion is, you are limited by the choices.
> 
> It's not lost in the translation.
> 
> 2. Quoting you.
> 
> "That said 2 Fury Xs would also be a upgrade more so than the 290/390xs (as they would be the same card just splitting 1 into 2)."
> 
> Of course, everyone will agree that since buying a Fury to replace a 200 series card is an UPGRADE, it follows that 2 or 3 or 4 FURYs will be an UPGRADE. What do you imply to that statement of yours?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> An upgrade does not follow the scheme that if you have a DUAL GPU or 2 Cards you need to also pick a DUAL or 2 cards. Nope. Heck, there are games that a FuryX or a 980TI will perform better than XFire Hawaii.
> 
> That is a no-brainer.
> 
> 3. It doesn't take an industry veteran to know that a single GPU means a single GPU CHIP in a Card.
> 
> Dual GPU means 2 GPU CHIPS in a Card.
> 
> So better check your semantics. Before teaching English to a guy with a comment you cannot understand.
> 
> 4. If you don't think a 980TI or a FURY X will be an upgrade to an existing 295X2, you simply don't have the idea of how better a single GPU solution is compared to an SLI or XFire set-up in majority of the games.
> 
> Lastly, my brain is wired properly to my typing hands. You need to check yours.


A single GPU is not as strong as two in any game, period. Nope, nada. Is it less buggy? Sure. Easier to deal with? Again, yes. Will it provide more GPU horsepower? Not a chance. A Fury is like a 10% performance increase over a 290X, while Crossfire scales at damn near 90% on the second card, so sorry, the Fury gets kicked.

That said, yeah, if you're playing at [email protected] it really doesn't matter, either will be fine, but when you want [email protected] or
[email protected] etc., sorry, no.

I understand not everyone likes the hassle of Crossfire or SLI but saying it does not perform as well as a single card is just crazy.

However, I guess you're right, and all of us that use CF and SLI in every rig every time don't know what we're doing and are throwing away money. I wish someone would have told me I could run 4K on max settings with 1 Fury X instead of my 3 290s; that would have saved me a ton.

An upgrade follows the scheme of improvement; 1 Fury is not an improvement over 2 290Xs, that is a sidegrade/downgrade. It's like going from a hexacore Ivy-E to a 4-core Haswell: you gain 5% performance per core but lose 2 cores. That is a downgrade/sidegrade.
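The back-of-the-envelope math behind this claim can be sketched out. A minimal illustration, assuming (as the post does) that a Fury is ~10% faster than a 290X and that Crossfire adds ~90% of the second card's throughput; the numbers are illustrative assumptions, not benchmarks:

```python
# Rough relative-throughput sketch; a single 290X is the 1.0 baseline.
# Assumed figures from the post: Fury ~ +10% over a 290X,
# Crossfire second card contributes ~90% extra.
R9_290X = 1.0
FURY = 1.10          # ~10% faster than a 290X (assumption)
CF_SCALING = 0.90    # second CF card adds ~90% (assumption)

cf_290x = R9_290X * (1 + CF_SCALING)   # 2x 290X in Crossfire
single_fury = FURY                     # 1x Fury

print(f"2x 290X CF: {cf_290x:.2f}x a single 290X")   # → 1.90x
print(f"1x Fury   : {single_fury:.2f}x a single 290X")  # → 1.10x
```

Under these assumptions the Crossfire pair keeps a large lead in games that scale, which is the "sidegrade/downgrade" point being made.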


----------



## JourneymanMike

Quote:


> Originally Posted by *cyberlocc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> @cyberlocc
> 
> I will be very easy on you.
> 
> 
> 
> 
> 
> 
> 
> don't be mad at these.
> 
> EK said, if you read your link, they only support 3 makes of the 300 series cards. Namely, DC2, XFX DD, and the PCS+.
> 
> DC2 is already not a reference design. So you're left with the DD and PCS+. Yes, they use ref PCB but the notion is, you are limited by the choices.
> 
> It's not lost in the translation.
> 
> 2. Quoting you.
> 
> "That said 2 Fury Xs would also be a upgrade more so than the 290/390xs (as they would be the same card just splitting 1 into 2)."
> 
> Of course, everyone will agree that since buying a Fury to replace a 200 series card is an UPGRADE, it follows that 2 or 3 or 4 FURYs will be an UPGRADE. What do you imply to that statement of yours?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> An upgrade does not follow the scheme that if you have a DUAL GPU or 2 Cards you need to also pick a DUAL or 2 cards. Nope. Heck, there are games that a FuryX or a 980TI will perform better than XFire Hawaii.
> 
> That is a no-brainer.
> 
> 3. It doesn't take an industry veteran to know that a single GPU means a single GPU CHIP in a Card.
> 
> Dual GPU means 2 GPU CHIPS in a Card.
> 
> So better check your semantics. Before teaching English to a guy with a comment you cannot understand.
> 
> 4. If you don't think a 980TI or a FURY X will be an upgrade to an existing 295X2, you simply don't have the idea of how better a single GPU solution is compared to an SLI or XFire set-up in majority of the games.
> 
> Lastly, my brain is wired properly to my typing hands. You need to check yours.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> A single GPU is not as strong as 2 in any game period, nope nata. Is it less buggy? Sure ya, easier to deal with? Again yes, will it provide more GPU horsepower? Not a chance a fury is like a 10% performance increase over a 290x, with crossfire scaling at damn near 90% on the second card sorry Fury gets kicked.
> 
> That said then ya if your playing at [email protected] it really doesn't matter either will be fine but when you want [email protected] or
> [email protected] ect sorry no.
> 
> I understand not everyone likes the hassle of Crossfire or SLI but saying it does not perform as well as a single card is just crazy.
> 
> However I guess your right and all of us that use CF and SLI in every rig every time don't know what were doing and we are throwing away money. I wish someone would of told me I could run 4k on max settings with 1 fury x instead of my 3 290s that would of saved me a ton.
> 
> An upgrade follows the scheme of improvement 1 fury is not an improvement over 2 290xs, that is a sidegrade/downgrade that's like going from a hexacore ivy e to a 4 core haswell you gain 5% performance per core but lose 2 that is a downgrade/side grade.
Click to expand...

Geeze Guys, I was thinking of selling my two 290X's, both with Aquacomputer Kryographics blocks and active back plates, and getting a Fury X..

But now I'm more confused than ever. Will this or will this not be an improvement over my Crossfire 290X's? What about 2 x 390X's with the 290X PCB, so I can transfer my blocks over and save some money?


----------



## fat4l

Quote:


> Originally Posted by *JourneymanMike*
> 
> Geeze Guys, I was thinking of selling my two 290X's, both with Aquacomputer Kryographics blocks and active back plates, and getting a Fury X..
> 
> But, now I'm more confused than ever. will this or will this not be an improvement over my Crossfire 290X's? What about 2 x 390X's, with the 290x PCB, so I can transfer my blocks over,and save some money on that?


Performance-wise 2x 290X > Fury X.
However, some games don't like Crossfire, and there a Fury X will bring you more fps.

So again, performance-wise, if you now have 2x 290X, which is 2 cards, and you want more horsepower from a single card, you have to wait a few years.
290X cards are good "performance for money" cards, but they also have their downsides: high power consumption, heat, and weaker single-card performance (1440p+).

In my eyes, the 390X is a bit better, but going from a 290X to a 390X makes almost no sense.

If I sell my Ares 3 now, I will either go Fury X (possibly CF) or wait for the next-gen cards.


----------



## kizwan

Quote:


> Originally Posted by *JourneymanMike*
> 
> Geeze Guys, I was thinking of selling my two 290X's, both with Aquacomputer Kryographics blocks and active back plates, and getting a Fury X..
> 
> But, now I'm more confused than ever. will this or will this not be an improvement over my Crossfire 290X's? What about 2 x 390X's, with the 290x PCB, so I can transfer my blocks over,and save some money on that?


I don't think you'll see an improvement with a Fury X over 290X Crossfire. Best-case scenario, performance will be the same, I think. Overall, though, since not all games support Crossfire, a Fury X can be considered an improvement over 290X Crossfire.

Regarding watercooling with the 390/390X: XFX did a revision with new inductors on their DD cards which renders the card no longer compatible with 290/290X waterblocks. Unless you can find an earlier revision of the XFX DD 390/390X cards on the market, you cannot transfer your block to the new cards. PowerColor 390/390X cards are not reference cards & are only compatible with the EK-FC R9-290X SE waterblock.

@mus1mus is correct when he said 390/390X cards are not watercooling friendly. Listen to him.


----------



## the9quad

If money was no object, I would take a single 980 Ti over my 3 290X's any day. A single 980 Ti in the majority of new games these days is going to perform better (since the majority of games are using GameWorks). Fallout 4, Rainbow Six: Siege, The Witcher 3, etc. all use GameWorks, and all of them run like poo in Crossfire. That said, in some games Crossfire shines, and in those cases the 290X's will beat a single 980 Ti, but that doesn't happen all that often, sadly. The 290X's (and perhaps AMD) for me had a good run, but this GameWorks thing is killing us. I am holding out for the next cards, but it is a painful wait to be honest.


----------



## Regnitto

I'm going to upgrade my 290 to a 980ti at tax time unless next gen is out by then.....

edit: but when I do, I'll be upgrading my girl's rig to my 290 from her current HD 7850


----------



## cephelix

Been out of the game for a while, and quite a few new things have popped up since I was last here. How are the new Crimson drivers in relation to a 290, mainly in RPG games (Witcher 3, AC Unity, Syndicate)? Is it worth switching to Crimson, or should I just stick with my v15 drivers?
Pushed my card more; now it's at 1150/1250/+92 mV on Trixx with max temps in the mid 70s core / mid 60s VRM1. That is of course with CLU on the die and the high-thermal-conductivity pads (forgot what they're called). Ambient temps are at 26C, air-conditioned room.


----------



## OneB1t

The secret about Crossfire is that in 80% of situations it's worse than a single powerful card:

microstutter, increased input delay, and it doesn't work in the majority of games and never will.

If you really need that performance, just go with a 980 Ti OCed to the max.


----------



## mus1mus

I understand where some guys are coming from.

1. Two cards are better than one, in Crossfire-supported games.

2. The Fury X recommendation over 2 290Xs doesn't apply to everyone. You need to check out where that notion came from.

@fat4l mentioned a FreeSync monitor, thus sticking with AMD. But the first recommendation is a 980 Ti.

3. Next gen cards are out of the equation now if you are itching for a new card.

4. When you bring in an argument about Ivy-E vs Haswell, can you rethink what you are trying to imply?

Ivy-E vs Haswell was never meant to be an upgrade. What about Haswell-E?

6C/12T (4960X) vs 8C/16T is. Yes it is, if you itch for a new platform. Nope, if you have a very good clocking 49XX chip and only care about gaming. The additional cores and DDR4 bandwidth mean very little in gaming.

BTW, it's Christmas, so I hope you guys have a merry one.

Chill.


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> @fat4l mentioned of a Freesynch monitor. Thus sticking with AMD. But first recommendation is a 980TI.


I will prolly sell my monitor as well


----------



## mus1mus

Quote:


> Originally Posted by *fat4l*
> 
> I will prolly sell my monitor as well


Mo monies, mo problems.


----------



## NeoTheOne2015

Hello









Please send me a BIOS for the Sapphire Vapor-X Radeon R9 290X Tri-X OC, 4GB (11226-09-40G)

Thank you very much


----------



## minh2134

Hey, I wonder if there is a real difference worth considering between blocks for the R9 290? I am in a dilemma between upgrading my block or adding more rads.


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> Mo monies, mo problems.


yeah








And buy Asus Swift PG278Q + 980 Ti.
Maybe...


----------



## mus1mus

Quote:


> Originally Posted by *fat4l*
> 
> yeah
> 
> 
> 
> 
> 
> 
> 
> 
> And buy Asus Swift PG278Q + 980 Ti.
> Maybe...


And embrace nVidia's premature driver tripping.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> And embrace nVidia's premature driver tripping.


Amen brother, Amen... I like how AMD's drivers keep helping their cards and Nvidia keeps gimping theirs so the next gen can 'seem' stronger. Whatever. I'll pass on the 980 Ti this time. At 4K the Fury X is still better, even if by a hair.


----------



## JourneymanMike

Quote:


> Originally Posted by *fat4l*
> 
> *Performance-wise 2x 290X > Fury X.
> *However some games doesn't like crossfire and there, Fury X will bring you more fps.
> 
> So again, performance-wise, if you now have 2x290X which is 2 cards and you want to have more horsepower with a single card, you have to wait a few years.
> 290X cards are good "performance for money" cards but they also have their downsides which are:
> High consumption, heat, low single core performance(1440p+).
> 
> In my eyes, 390X is a bit better, but going from 290X to 390X makes almost no sense.
> 
> If I sell my Ares 3 now, I will either go fury X(possibly cf) or wait for the next gen cards.


My thought is, if I have a single Fury X running at 16X vs two 290X's running at 8X, wouldn't I get more performance with the single Fury X running 16X?

See, I switched from an AMD FX-8350 processor on a Crosshair V Formula-Z (which supports a full 16X for both cards in CFX) to an i7 4790K on a Maximus VII Formula, which only allows 8X in CFX...

Do I have a point, or is there something else I'm not getting?


----------



## fat4l

Quote:


> Originally Posted by *JourneymanMike*
> 
> My thought is, that if I have a single Fury X, running at 16X vs two 290X's running at 8X, wouldn't I get more performance with the single Fury X running 16X?
> 
> See, I switched from and AMD FX 8350 processor on a Crosshair V Formula-Z (which supports a full 16X for both cards in CFX) to an i7 4790K on a Maximus VII Formula, which only allows 8X in CFX...
> 
> Do I have a point, or is there something else I'm not getting...


Well, going from PCIe Gen3 16X to 8X doesn't decrease performance drastically; the difference is maybe 1 fps.





So, let's compare my Ares 3 (2x 290X) graphics score to yours with the Fury X.
Fire Strike Extreme:

13074 pts
http://www.3dmark.com/fs/6590481
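For context on why x16 vs x8 barely matters: PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding, so halving the lane count halves a bandwidth ceiling that games rarely saturate. A quick sketch of the math (standard PCIe 3.0 figures, not card-specific measurements):

```python
# PCIe 3.0 per-direction bandwidth: 8 GT/s per lane, 128b/130b line code.
GT_PER_S = 8.0
ENCODING = 128 / 130          # usable fraction after line-code overhead

def pcie3_bandwidth_gbytes(lanes):
    # GT/s * efficiency = usable Gbit/s per lane; /8 -> GB/s; * lane count
    return GT_PER_S * ENCODING / 8 * lanes

print(f"x16: {pcie3_bandwidth_gbytes(16):.2f} GB/s")  # → 15.75 GB/s
print(f" x8: {pcie3_bandwidth_gbytes(8):.2f} GB/s")   # → 7.88 GB/s
```

Even the x8 figure is well above what a game typically pushes over the bus per frame, which matches the "maybe 1 fps" observation.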


----------



## Timer5

Quote:


> Originally Posted by *battleaxe*
> 
> Amen brother, Amen... I like how AMD's drivers keep helping their cards and Nvidia keeps gimping theirs so next gen can 'seem' stronger. Whatever. I'll pass on the 980ti this time. On 4k FuryX is still better even if by a hair.


I have to comment here. I love AMD, I ONLY buy AMD, but their recent "Crimson" drivers are total garbage! I have had to roll back to 15.11 any time I test out Crimson! They are BEYOND unstable for me.

1. When I overclock and restart, the drivers try to apply the overclock without the voltage change, so my screen flickers for like 30 seconds until I can open Trixx and hit the reset button to fix the issue.

2. I can't YouTube and play games at once (I have 3 monitors, so I game on the middle monitor and YouTube on the monitor to my right); when the video opens, my whole screen freezes and the drivers crash; after the crash I can use the computer again.

3. Games have artifacting with or without overclocking; in Fallout 4 the little map at the bottom artifacts like crazy, to the point where it is unusable.

So no, I have to say AMD drivers went downhill recently. I am going to stick with the 15.11 Beta until AMD can figure out how to make a driver that ACTUALLY WORKS! I love you AMD, FIX YOUR DRIVERS!


----------



## mus1mus

Quote:


> Originally Posted by *JourneymanMike*
> 
> My thought is, that if I have a single Fury X, running at 16X vs two 290X's running at 8X, wouldn't I get more performance with the single Fury X running 16X?
> 
> See, I switched from and AMD FX 8350 processor on a Crosshair V Formula-Z (which supports a full 16X for both cards in CFX) to an i7 4790K on a Maximus VII Formula, which only allows 8X in CFX...
> 
> Do I have a point, or is there something else I'm not getting...


Running at X8 vs X16 will not decrease performance by a lot. At least in Benchmarks.

http://www.3dmark.com/compare/3dmv/5379325/3dmv/5371555

That's X8 vs X16.

The obvious observation would be lower minimums at the very best.
Quote:


> Originally Posted by *fat4l*
> 
> Well going from PCiE gen3 16X to 8X doesn't decrease the performance drastically, the difference is maybe 1fps.
> 
> 
> 
> 
> 
> So, let's compare my ares 3(2x290X) graphics score to yours with Fury X.
> Firestrike eXtreme.
> 
> 13074 pts
> http://www.3dmark.com/fs/6590481


This ^

Quote:


> Originally Posted by *Timer5*
> 
> I have to comment here I love AMD I ONLY buy AMD but their recent "Crimson" Drivers are total garbage! I have had to roll back to 15.11 any time I test out Crimson! They are BEYOND unstable for me.
> 
> 1 When I overclock and restart the drivers try to put in the overclock with out the voltage change so my screens flickers for like 30 seconds until I can open Trixx and hit the reset button to fix the issue.
> 
> 2 I can't youtube and play games at once (I have 3 monitors so I game on the middle monitor and youtube on the monitor to my right) when the video open my whole screen freezes and the drivers crash and after the crash I can use the computer again
> 
> 3 Games have artifacting with or without overclocking like Fallout 4 the little map at the bottom artifacts like crazy to the point where it is unusable
> 
> So no I have to say AMD drivers went down hill recently I am going to stick with 15.11 Beta until AMD can figure out how to make a driver that ACTUALLY WORKS! I love you AMD FIX YOUR DRIVERS!


Well, it's the first release of the new driver. Things aren't really robust at launch.

What I mean though is (pre-Crimson talk) you can push an AMD card to its knees and the driver will allow it. Artifacts, lockups, etc. While nVidia drivers will crash even before you artifact.

This has been my experience on 290X, 290s vs 780s and a 980TI.

To note, I have played with the TI back in June. Close to 6 months later, my score still stands.









http://www.3dmark.com/search#/?mode=advanced&url=/proxycon/ajax/search/cpugpu/fs/P/1843/1033/19199?minScore=16800&cpuName=Intel Core i7-5930K&gpuName=NVIDIA GeForce GTX 980 Ti

Something AMD excels at is driver improvement.


----------



## kizwan

Quote:


> Originally Posted by *Timer5*
> 
> Quote:
> 
> 
> 
> Originally Posted by *battleaxe*
> 
> Amen brother, Amen... I like how AMD's drivers keep helping their cards and Nvidia keeps gimping theirs so next gen can 'seem' stronger. Whatever. I'll pass on the 980ti this time. On 4k FuryX is still better even if by a hair.
> 
> 
> 
> I have to comment here I love AMD I ONLY buy AMD but their recent "Crimson" Drivers are total garbage! I have had to roll back to 15.11 any time I test out Crimson! They are BEYOND unstable for me.
> 
> 1 When I overclock and restart the drivers try to put in the overclock with out the voltage change so my screens flickers for like 30 seconds until I can open Trixx and hit the reset button to fix the issue.
> 
> 2 I can't youtube and play games at once (I have 3 monitors so I game on the middle monitor and youtube on the monitor to my right) when the video open my whole screen freezes and the drivers crash and after the crash I can use the computer again
> 
> 3 Games have artifacting with or without overclocking like Fallout 4 the little map at the bottom artifacts like crazy to the point where it is unusable
> 
> So no I have to say AMD drivers went down hill recently I am going to stick with 15.11 Beta until AMD can figure out how to make a driver that ACTUALLY WORKS! I love you AMD FIX YOUR DRIVERS!
Click to expand...

Enabling Unofficial Overclocking mode without PowerPlay fixed the flickering-without-increased-voltage issue for me.


----------



## battleaxe

Quote:


> Originally Posted by *Timer5*
> 
> I have to comment here I love AMD I ONLY buy AMD but their recent "Crimson" Drivers are total garbage! I have had to roll back to 15.11 any time I test out Crimson! They are BEYOND unstable for me.
> 
> 1 When I overclock and restart the drivers try to put in the overclock with out the voltage change so my screens flickers for like 30 seconds until I can open Trixx and hit the reset button to fix the issue.
> 
> 2 I can't youtube and play games at once (I have 3 monitors so I game on the middle monitor and youtube on the monitor to my right) when the video open my whole screen freezes and the drivers crash and after the crash I can use the computer again
> 
> 3 Games have artifacting with or without overclocking like Fallout 4 the little map at the bottom artifacts like crazy to the point where it is unusable
> 
> So no I have to say AMD drivers went down hill recently I am going to stick with 15.11 Beta until AMD can figure out how to make a driver that ACTUALLY WORKS! I love you AMD FIX YOUR DRIVERS!


Yes, it seems some have had trouble with Crimson. I however have not had a single issue; it's been running just fine for me with a slight perf increase. But from what others have said, Crimson has been a tad buggy. It happens to Nvidia too, though. Nvidia had a driver out that was killing cards back in the 6 series, so it happens to both. My GTX 670's were spared, but were buggy as **** too, so IDK if I buy that Nvidia's drivers are any better than AMD's. I've had very good luck with AMD drivers; never really had a significant problem, if memory serves.


----------



## ZealotKi11er

CF and SLI is for people that have high-resolution screens and want to have MAX settings. You cannot get by with a single GPU in a lot of games. Also, 290X to Fury X is not worth the upgrade; only the GTX 980 Ti, because of the extra performance you get from overclocking. Going from 2 x 290X to a Fury X or GTX 980 Ti, you are mostly doing it because the games you play don't scale as well. The problem with most games not scaling well is that you are not running GPU-taxing enough settings. CFX and SLI work much better at bringing a 30 fps game to 50+ fps than a 70 fps game to 120+ fps. The reason is that at high fps you start to neglect the difference above 60 fps. Another reason, especially for AMD, is CPU overhead.


----------



## By-Tor

I have been on the fence about selling my second PowerColor 290X LCS card for some time now.

It's installed in my rig and in my watercooling loop, but its power has been unplugged for some time. The other PowerColor 290X LCS card does everything I do on the computer.

I'm running 2x 24" Asus monitors, with my main one a 144Hz model I have set to 1440p and 100Hz, and I'm still getting great frames in games. With both cards running together I don't see a big performance change in games, and the cards' usage seems to jump around a lot in games and benches.

I have been thinking about upgrading to a single 390 or 390X card and putting a waterblock on it, but what I have been reading is that it would be more of a sidegrade as far as performance goes.

Anyone have thoughts on this they would like to share?

PS: This may be more of just an upgrade itch that needs scratching...

Thanks


----------



## Fickle Pickle

Quote:


> Originally Posted by *By-Tor*
> 
> I have been on the fence about selling my second Powercolor 290x LCS card for sometime now.
> 
> It's installed in my rig and in my water cooling loop, but power has been unplugged from it for sometime. The other Powercolor 290x LCS card does everything I do on the computer.
> 
> I'm running 2-24" Asus monitors with my main one a 144hz model I have set to 1440p and 100hz and still getting great frames in games. With both cards running together I don't see a big performance change in games and the cards usage seem to jump around a lot in games and benches.
> 
> I have been thinking about upgrading to a single 390 or 390x card and putting a water block on it, but what I have been reading is that it would be more of a side grade as far as performance goes.
> 
> Anyone have thoughts on this they would like to share?
> 
> PS: This may be more of just an upgrade itch that needs a scratching...
> 
> Thanks


I would recommend against it. Just wait for the successor to the Fury X. The only reason to go from a 290 to a 390X is the VRAM; unless you're using more than 4GB, I would wait.


----------



## HOMECINEMA-PC

I want an R9 card that's 2-slot so I can block it, with 3 DPs so I can run tri-UHD monitors @ 60Hz...

Is this possible??


----------



## fat4l

Well guys, after seeing how well the Fury X scales in CF, and that with the newest drivers CF is getting the same (sometimes even better) performance as 980 Ti SLI, plus the fact I don't need to bother selling my monitor, I will stay with the Fury X and possibly go CF.









http://www.techpowerup.com/forums/threads/an-epic-fury-x-review-quad-fury-x-vs-quad-titan-x.214231/
http://www.pcper.com/reviews/Graphics-Cards/AMD-Fury-X-vs-NVIDIA-GTX-980-Ti-2-and-3-Way-Multi-GPU-Performance/Crysis-3
(This is even with old drivers)


----------



## Cyber Locc

Quote:


> Originally Posted by *the9quad*
> 
> If money was no object, I would take a single 980ti over my 3 290x's any day. A single 980ti in the majority of new games these days is going to perform better (since the majority of games are using gameworks). Fallout4, Rainbow Six:Siege, The witcher 3, etc.... all using gameworks, and all of them run like poo in crossfire. That said, in some games, crossfire shines and in those cases the 290x's will beat a single 980ti, but that doesn't happen all that often sadly. The 290x's (and perhaps AMD) for me had a good run, but this gameworks thing is killing us. I am holding out for the next cards, but it is a painful wait to be honest.


Umm, I don't know where you get this; there was a review with Fury Crossfire and 980 Ti SLI the other day I saw, and 290X Crossfire beat 980 SLI in those GameWorks games (Witcher 3 mainly was the one I saw).

Again, the only problem with your statement is 4K. See, personally I'm stuck: no single card in the world will max 4K, and I play @ 4K, so it's CF or bust for me.









Anyway, I don't have any issues with Witcher and Crossfire. I did for the first few weeks after release, but I don't anymore. Are you trying to use GameWorks features? Because yeah, that will definitely cause issues.

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I want a r9 card that's 2 slot so I can block it also with 3 DP's so I can run Tri UHD monitors @ 60Hz .......
> 
> Is this possible ??


A 295X2 has 4 DPs. That said, I have seen reviews of quad-SLI Titan Xs not being up to 3x 4K. However, give it a shot with dual 295X2s; it may work out semi-OK with Crossfire's better scaling. I have been considering getting 2 more 4K displays just for my side monitors, but then I don't use Eyefinity or Surround; it makes my stomach hurt.


----------



## the9quad

Quote:


> Originally Posted by *cyberlocc*
> 
> Umm I don't know where you get this there was a review with fury crossfire and 980ti SLI the other day I seen and 290x crossfire beat 980 SLI in those game works games. (Wicther 3 mainly was the one I seen)
> 
> Again the only problem with your statement is 4k see personally I'm stuck no single card in the world will max 4k and I play @ 4k so thus CF or bust for me


I get it from owning GameWorks games, and owning 290X's in tri-fire. Not trying to be a jerk, but I don't think I would trust any benchmark that has 980 Ti SLI losing to 290X's in Crossfire in The Witcher 3.


----------



## Cyber Locc

Quote:


> Originally Posted by *the9quad*
> 
> I get it from owning gameswork games, and owning 290x's in tri-fire. Not trying to be a jerk, but I don't think I would trust any benchmark that has 980ti sli's losing to 290x's in crossfire in The Witcher 3.


Okay, then show me one where it doesn't? With GameWorks off. That said, with GameWorks on, Fury X Crossfire still wins: http://www.hardocp.com/article/2015/10/06/amd_radeon_r9_fury_x_crossfire_at_4k_review/4#.Vn9UXvkrLIU.

I am trying to find the review with the 290Xs (it was 390Xs, but pretty much the same thing); here was the image I saw of said situation (it was posted by someone in another thread).
That said, a single 980 Ti would crush a Fury in Witcher, but SLI is horrid right now and CF is currently way, way superior. It is all about scaling, man: CF sees a 90%+ increase from the second GPU, SLI gives a 60% increase. CF will always win IF the cards are close in single-card performance and the drivers for that game are good.

I'm not saying the 980 Tis shouldn't be winning, or that they can't with better drivers; I'm saying SLI drivers are trash ATM.

All this said, yes, with HairWorks on they both take a massive hit and the 980s win by a bit.
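The scaling argument above can be made concrete with a small sketch: even a faster single card can lose as a pair if its multi-GPU scaling is worse. The 90%/60% figures are the post's claims, and the 15% single-card gap is a purely hypothetical number for illustration:

```python
# Hedged sketch of the CF-vs-SLI scaling argument.
# Scaling figures are the post's claims; the single-card gap is hypothetical.
CF_SECOND_CARD = 0.90    # Crossfire: second GPU adds ~90% (claimed)
SLI_SECOND_CARD = 0.60   # SLI: second GPU adds ~60% (claimed)

fury_x = 1.00            # baseline single card
gtx_980ti = 1.15         # hypothetical: 15% faster as a single card

fury_cf = fury_x * (1 + CF_SECOND_CARD)      # 1.90
ti_sli = gtx_980ti * (1 + SLI_SECOND_CARD)   # 1.84

# With these numbers the slower single card wins in dual-card form.
print(fury_cf > ti_sli)  # → True
```

The crossover point moves with the assumed scaling factors; if SLI scaling improves, the ordering flips, which is exactly the "SLI drivers are trash ATM" caveat.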


----------



## the9quad

2560x1440 and 40-ish fps with HairWorks off, 3 290X's. Scaling is pretty much crap in TW3 in CFX with the profile. Fire up Rainbow Six Siege if you want to see pathetic.

Also, read the whole article you posted, don't just show the graph. Read the conclusion about how stuttery CFX is and how SLI has the smoother performance.

Besides the point, you posted benches for Fury CFX; you said 290X CFX beats the 980 Ti, not Fury CFX.


----------



## JourneymanMike

Quote:


> Originally Posted by *fat4l*
> 
> Well going from PCiE gen3 16X to 8X doesn't decrease the performance drastically, the difference is maybe 1fps.
> 
> 
> 
> 
> 
> So, let's compare my ares 3(2x290X) graphics score to yours with Fury X.
> Firestrike eXtreme.
> 
> 13074 pts
> http://www.3dmark.com/fs/6590481


I'd like to do that...

Only thing is, I'm still running my 290X Crossfire...

I was just asking about the Fury X and its performance as compared to 290X CFX...

I see you've been using Fire Strike Extreme, which I have some results for, but not at my best overclocks. I do have some regular Fire Strike scores though...

http://www.3dmark.com/fs/6703166


----------



## ZealotKi11er

Quote:


> Originally Posted by *the9quad*
> 
> 2560x1440 and 40 ish fps with hair works off, 3 290x's. Scaling is pretty much crap in TW3 in CFX with the profile. Fire up rainbiw six siege want to see pathetic.
> 
> Also read the whole article you posted, don't just show the graph. Read the conclusion about how stuttery CFX is and SLI has the smoother performance.
> 
> Beside the point, you posted benched for fury CFX, you said 290x CFX beats 980ti not fury CFX.


4K just makes scaling crazy, even if you don't get that good scaling at 1440p. At 4K all the AMD CPU overhead is gone, so it's all GPU vs GPU and nothing else. Yeah, CFX does not behave well in Witcher 3 @ 1440p on my system either.


----------



## Cyber Locc

Quote:


> Originally Posted by *the9quad*
> 
> 2560x1440 and 40 ish fps with hair works off, 3 290x's. Scaling is pretty much crap in TW3 in CFX with the profile. Fire up rainbiw six siege want to see pathetic.
> 
> Also read the whole article you posted, don't just show the graph. Read the conclusion about how stuttery CFX is and SLI has the smoother performance.
> 
> Beside the point, you posted benched for fury CFX, you said 290x CFX beats 980ti not fury CFX.


The Picture was for the 290xs(390xs) I can not find that review that image was linked in another thread (that was promoting 980tis). To the stutter I don't have that issue however I have a freesync display so that may be why?

I do not play at 1440p, our rigs differ slightly, all my cards are all overclocked quite a bit. (1175/1550) and they are all water cooled to prevent throttling as with the stock coolers they will throttle especially if you have more than one.

I also never used crimson before taking my rig down. I heard that hurts performance are your 40 fps result post crimson?

I do not know if what ZealotKi11er is saying is accurate or not I don't play at 1440, I do know that Wicther 3 plays fine on my 4k Freesync monitor with 2 cards and 3.

Also 40fps is bad







have you tried using just 1 card? It would probably perform better than that with 1 card overclocked that that.

Again don't play FPS games so I have to take your word for that one. Also would suggest getting some kind of cooler for those GPUs those reference coolers so close like that your cards are defiantly throttling. I tested it in my 900d before putting the blocks on 2 290s 2 slots away were throttling alot even with the fan set to jet engine at 100% fan speed they would still throttle.

Aha, found the 390X review:







http://www.hardocp.com/article/2015/09/28/asus_strix_radeon_r9_fury_dc3_crossfire_at_4k_review/5#.VoAwpvkrLIU They also mention the stutter. I don't notice it, but I have FreeSync, so maybe that's why? This also shows an avg of 40 fps; however, my game is modded to play ultra with foliage on high and I still get 60 fps (the AMD mods with tessellation and such).


----------



## the9quad

Quote:


> Originally Posted by *cyberlocc*
> 
> The picture was for the 290Xs (390Xs). I can't find that review; the image was linked in another thread (one promoting 980 Tis). As for the stutter, I don't have that issue, but I have a FreeSync display, so that may be why.
> 
> I don't play at 1440p, and our rigs differ slightly; my cards are all overclocked quite a bit (1175/1550) and water cooled to prevent throttling, since with the stock coolers they will throttle, especially if you have more than one.
> 
> I also never used Crimson before taking my rig down. I heard that hurts performance; is your 40 fps result post-Crimson?
> 
> I don't know whether what ZealotKi11er is saying is accurate since I don't play at 1440p, but I do know that The Witcher 3 plays fine on my 4K FreeSync monitor with 2 cards and with 3.
> 
> Also, 40 fps is bad.
> 
> Have you tried using just 1 card? It would probably perform better than that with 1 card overclocked like that.
> 
> Again, I don't play FPS games so I'll have to take your word for that one. I'd also suggest getting some kind of cooler for those GPUs; with the reference coolers that close together your cards are definitely throttling. I tested it in my 900D before putting the blocks on: 2 290s spaced 2 slots apart were throttling a lot, and even with the fans set to jet-engine 100% they would still throttle.
> 
> Aha, found the 390X review:
> 
> http://www.hardocp.com/article/2015/09/28/asus_strix_radeon_r9_fury_dc3_crossfire_at_4k_review/5#.VoAwpvkrLIU They also mention the stutter. I don't notice it, but I have FreeSync, so maybe that's why? This also shows an avg of 40 fps; however, my game is modded to play ultra with foliage on high and I still get 60 fps (the AMD mods with tessellation and such).


Cards don't throttle, fans just get loud. Custom fan curves and headphones, bruh. I'm deaf, but I'm not throttling.


----------



## fat4l

Quote:


> Originally Posted by *JourneymanMike*
> 
> I'd like to do that...
> 
> Only thing is, I'm still running my 290X Crossfire...
> 
> I was just asking about the Fury X, and it's performance as compared to 290X CFX...
> 
> I see you've been using FireStrike Extreme, which I have some results for, but not at my best overclocks. I do have some regular FireStrike scores though...
> 
> http://www.3dmark.com/fs/6703166


This is my 24/7 overclock.
I got 12% more graphics score.

http://www.3dmark.com/compare/fs/6734662/fs/6703166


----------



## JourneymanMike

Quote:


> Originally Posted by *fat4l*
> 
> This is my 24/7 overclock.
> I got 12% more graphics score.
> 
> http://www.3dmark.com/compare/fs/6734662/fs/6703166


Looks like I need to do some tuning, Fat...

My Rig is down for delidding and a change of motherboard, the MB I'm using 4 bent pins...



I don't know what these pins may effect, but I just got my other M7F back from Asus RMA, on different issues...

I'll be back!!

Are you using the Uber BIOS ( switch towards the back of the card ) on your cards?


----------



## Tobiman

Quote:


> Originally Posted by *JourneymanMike*
> 
> Looks like I need to do some tuning, Fat...
> 
> My Rig is down for delidding and a change of motherboard, the MB I'm using 4 bent pins...
> 
> 
> 
> I don't know what these pins may effect, but I just got my other M7F back from Asus RMA, on different issues...
> 
> I'll be back!!
> 
> Are you using the Uber BIOS ( switch towards the back of the card ) on your cards?


Why don't you lift the pins up with a credit card? I had 12 bent pins in my motherboard and fixed it easily. As long as they are not broken, you have hope.


----------



## Tobiman

Quote:


> Originally Posted by *fat4l*
> 
> Well going from PCiE gen3 16X to 8X doesn't decrease the performance drastically, the difference is maybe 1fps.
> 
> 
> 
> 
> 
> So, let's compare my ares 3(2x290X) graphics score to yours with Fury X.
> Firestrike eXtreme.
> 
> 13074 pts
> http://www.3dmark.com/fs/6590481


That ARES 3 is a beast. I would love to take it off your hands, but no monies -_-.


----------



## fat4l

Quote:


> Originally Posted by *JourneymanMike*
> 
> Are you using the Uber BIOS ( switch towards the back of the card ) on your cards?


Well, my card is technically 2x 290X on a single PCB = a 295X2, but with a custom PCB, so... Ares 3.










Quote:


> Originally Posted by *Tobiman*
> 
> That ARES 3 is a beast. I would love to take if off your hands but no monies -_-.


It really is; however, I feel like leaving Hawaii behind and moving towards something new: Fury X.


----------



## ZealotKi11er

Quote:


> Originally Posted by *fat4l*
> 
> Well, my card is technically 2x 290X on a single PCB = a 295X2, but with a custom PCB, so... Ares 3.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It really is; however, I feel like leaving Hawaii behind and moving towards something new: Fury X.


Fury X scores like 17K in 3DMark, so it's 29K vs 17K. You will get a huge downgrade. When I moved from HD 7970s to a 290X I went from ~18K to 15K in 3DMark.


----------



## Tobiman

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Fury X scores like 17K in 3DMark, so it's 29K vs 17K. You will get a huge downgrade. When I moved from HD 7970s to a 290X I went from ~18K to 15K in 3DMark.


The ARES 3 is also somehow faster than your standard R9 295X2. Couple that with that insane overclock and you have a winner.


----------



## fat4l

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Fury X scores like 17K in 3DMark, so it's 29K vs 17K. You will get a huge downgrade. When I moved from HD 7970s to a 290X I went from ~18K to 15K in 3DMark.


I compared it to a highly clocked Fury X.

Observed results:
FS: http://www.3dmark.com/compare/fs/6802832/fs/6734662# = 47% difference in GS
FSX: http://www.3dmark.com/compare/fs/6802501/fs/6590481# = 36% difference in GS.

That's a huge drop in performance for me. Only Fury X Crossfire makes sense to me...


----------



## ZealotKi11er

Quote:


> Originally Posted by *fat4l*
> 
> I compared it to Fury X, highly clocked.
> 
> Observed results are :
> FS: http://www.3dmark.com/compare/fs/6802832/fs/6734662# = 47% difference in GS
> FSX: http://www.3dmark.com/compare/fs/6802501/fs/6590481# = 36% difference in GS.
> 
> That's a huge drop in performance for me. Only Fury X Crossfire makes sense to me...


Might as well wait for the Fury X2 card, then.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *fat4l*
> 
> Guys.
> I decided to sell my Ares 3.
> 
> 
> 
> 
> 
> 
> 
> 
> Not sure what I will be moving to tho...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Any ideas ?


How much ??


----------



## fat4l

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> How much ??


Well, I'm open to offers, so if you feel like it you can send me a PM. Just please be reasonable...


----------



## fat4l

For those who are interested in the performance of 290X CF vs Fury X, here it is:

*2x 290X* vs *Fury X*
Both setups clocked high: 1200/1700 vs 1180/590 MHz.
2x 290X wins by ~40%

FS: http://www.3dmark.com/compare/fs/6802832/fs/6734662# = 47% difference in GS
FSX: http://www.3dmark.com/compare/fs/6802501/fs/6590481# = 36% difference in GS
FSU: http://www.3dmark.com/compare/fs/6812511/fs/6592177# = 37% difference in GS
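For anyone wondering how those "difference in GS" percentages are computed, it's just the ratio of the two graphics scores (the score values in this sketch are hypothetical placeholders, not the numbers from the linked results):

```python
# Percentage lead of one 3DMark graphics score over another.
def percent_lead(score_a: float, score_b: float) -> float:
    """How far score_a is ahead of score_b, in percent."""
    return (score_a / score_b - 1) * 100

# Hypothetical graphics scores for a 2x 290X setup vs a single Fury X.
cf_290x = 24000
fury_x = 17500
print(f"2x 290X ahead by {percent_lead(cf_290x, fury_x):.0f}%")  # -> 37%
```

Note this is a one-directional figure: a card that is 47% ahead is not 47% "slower" in reverse, which is why the FS/FSX/FSU gaps above can look larger than the headline ~40% summary.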


----------



## ZealotKi11er

Quote:


> Originally Posted by *fat4l*
> 
> For those who are interested in performance 290X CF vs Fury X, here it is:
> 
> *2x 290X* vs *Fury X*
> Both setups clocked high: 1200/1700 vs 1180/590 MHz.
> 2x 290X wins by ~40%
> 
> FS: http://www.3dmark.com/compare/fs/6802832/fs/6734662# = 47% difference in GS
> FSX: http://www.3dmark.com/compare/fs/6802501/fs/6590481# = 36% difference in GS
> FSU: http://www.3dmark.com/compare/fs/6812511/fs/6592177# = 37% difference in GS


I am guessing at 4K the Fury X's HBM probably kicks in. Keep in mind your Ares OC is at the very top of the spectrum for a 290X; I would say most people hit about 1150/1500.


----------



## MEC-777

As I discovered when I was running crossfire 290s, not a lot of games can make use of crossfire. So in all games where only one card/GPU can be utilized, the Fury X will be faster. That's something not a lot of people mention when comparing multi-GPU cards to single-GPU cards, but it's very important to consider, IMO. It's one of the reasons why I dropped the CF 290s for a single 980.


----------



## JourneymanMike

Quote:


> Originally Posted by *fat4l*
> 
> For those who are interested in performance 290X CF vs Fury X, here it is:
> 
> *2x 290X* vs *Fury X*
> Both setups clocked high: 1200/1700 vs 1180/590 MHz.
> 2x 290X wins by ~40%
> 
> FS: http://www.3dmark.com/compare/fs/6802832/fs/6734662# = 47% difference in GS
> FSX: http://www.3dmark.com/compare/fs/6802501/fs/6590481# = 36% difference in GS
> FSU: http://www.3dmark.com/compare/fs/6812511/fs/6592177# = 37% difference in GS


The comparison is not equal, as far as the OC on the CPU and the GPU...

I've found that CPU OC does matter in GPU benchmarking...

The platform is also different, Z97 vs Z170...

A like-for-like system would be a better comparison...


----------



## By-Tor

Quote:


> Originally Posted by *Fickle Pickle*
> 
> I would recommend against it. Just wait for the successor to the Fury X. The only reason to go 390x from a 290 is for the Vram. Unless you're using more than 4gb, I would wait.


Going to go ahead and just wait it out and see what's next...

Thanks


----------



## rdr09

Quote:


> Originally Posted by *MEC-777*
> 
> As I discovered when I was running crossfire 290's, not a lot of games can make use of crossfire. So in all games where only one card/GPU can be utilized, the Fury X will be faster. Just something not a lot of people mention when comparing multi-GPU cards to single GPU cards, but is very important to consider, IMO. It's one of the reasons why I dropped the CF 290's for a single 980.


With a locked i5... even if the game supports crossfire, your CPU will limit it, especially in games that use a lot of cores.


----------



## fat4l

Quote:


> Originally Posted by *JourneymanMike*
> 
> The comparison is not equal, as far as the OC on the CPU, and the GPU...
> 
> I found that CPU OC, does matter, on GPU benchmarking...
> 
> The platform is also different, Z97 to Z170...
> 
> An alike system, would be a better comparison...


It would be better, but no one else submitted any results.
I also believe that at 4K (at least) the CPU overclock makes no difference in graphics score. The platforms are different, but these results can't be far off what they would be on the same system.


----------



## MEC-777

Quote:


> Originally Posted by *rdr09*
> 
> with a locked i5 . . . even if the game supports crossfire your cpu will limit, especially those that use a lot of cores.


That's beside the point I was trying to make. The single more powerful GPU will always be faster than dual GPUs when the game can only use one GPU.

Furthermore, I never ran into any CPU bottlenecking with my locked i5 and CF 290 setup.


----------



## spyshagg

Keep in mind:

The tearing with crossfire is abysmal on a 60Hz panel.

It's better with a 144Hz panel, but I'd call it almost obligatory to use either Vsync or FreeSync.


----------



## rdr09

Quote:


> Originally Posted by *MEC-777*
> 
> That's beside the point I was trying to make. The single more powerful GPU will always be faster than dual GPU's when the game can only use one GPU.
> 
> *Furthermore, I never ran into any CPU bottlenecking with my locked i5 and CF 290 setup*.


lol, no way. I've seen an overclocked i5 bottleneck a single 970, which has less CPU overhead than a single 290. You had two with a locked i5.


----------



## AliNT77

I have an i5-2500K @ 4.4 and 1600 CL8 RAM, and it's already bottlenecking my stock R9 290 in CPU-intensive scenarios like:

Witcher 3 in Novigrad
AC:U almost everywhere
AC:S when driving a carriage
Project cars running in win7
GTA V in the city
Fallout 4 with too many NPCs
MGSV with too many NPCs and Light sources
NFS:Rivals so often (framerate unlocked to 60)
&...

All of them running @ FHD.
It's nearly impossible to achieve a rock-solid 60 in these scenarios with an i5 paired with just an R9 290.


----------



## Cyber Locc

Quote:


> Originally Posted by *MEC-777*
> 
> As I discovered when I was running crossfire 290's, not a lot of games can make use of crossfire. So in all games where only one card/GPU can be utilized, the Fury X will be faster. Just something not a lot of people mention when comparing multi-GPU cards to single GPU cards, but is very important to consider, IMO. It's one of the reasons why I dropped the CF 290's for a single 980.


All games can make use of crossfire. You may need to use AFR-friendly mode or mod the settings for a given game, but all games can use crossfire. Whether it's stable or playable with it on is another story.









Here is the thing I think everyone keeps missing: the only reason to run crossfire is if you play in a way where 1 card is not enough. For instance, I play maxed out at 4K and won't accept under 60 fps, so anything less than 2 Fury Xs will not meet my needs. The same could be said for people playing 3x1440p; 1 Fury X is not going to cut the mustard.

If you're playing at 1080p or 1440p on a single display, then yes, the Fury X or 980 is the best option; that said, you never needed CF or SLI to begin with. The people who *need* CF or SLI still need it a year later, or they need to step back and re-evaluate whether their expectations or wants outweigh the troubles.

There will be issues running on the bleeding edge; the question every person has to ask themselves is whether it's worth it. No one can answer this except you: is it worth the issues to game in 4K? Or would there be more value in turning settings down or dropping resolution?

That is the same way a Fury X could be seen as an upgrade: if I were prepared to swear off 4K completely, I could trade my 290s for a Fury. That would in essence give me better frames at 1080p or 1440p without the issues of CF, but by the same token I would be giving up my ability to play at 4K. As I said before, everyone has different needs and wants. I play RPGs and, TBH, that's all; on the rare occasion I play FPS/racing games, it's on console. So to me 144 fps is not a concern and 4K is, and that makes a massive difference in my immersive RPGs; likewise, frame latency plays little to no role in my games, while in an FPS it does.

This also falls in line with what spyshagg said. Vsync is looked down on a lot due to its latency, but people need to realize not everyone is an FPS player. If you're playing an FPS or some other competitive online game, then Vsync is bad for business and you want it off, but not everyone plays those games. I fall on both sides but am lucky: I don't play those types of games for the most part. I do at times play PvP-based MMOs where that does matter, but in those games I don't need crossfire, so I am lucky.







(I have FreeSync now anyway.)

You really have to look at the games you play and the way you want to play them to make the decision; there is not one answer to rule them all. This is why I made the statement that I did in the beginning. If you have CF because you need it and you think switching to a single card 1 gen later will be an upgrade, it will not. It can be, if you are prepared to change the way you game and your expectations. That said, if you were already playing at lowered needs, you wasted money to begin with and were suffering through issues without reason; downgrading would just put you where you should have been from the get-go, and it becomes an upgrade to your wallet.

Also, I agree not a lot of people mention it; I think that's because they don't see a need. See again *Need CF vs Want CF*: if you are playing at 1080p you don't NEED CFed Furys and you didn't NEED CFed 290s; you may have wanted them for X reason and found that X wasn't worth Y issues, so you changed your config.

You are the perfect example of this; I'm not trying to pick on you. If your rig specs are accurate, you're using a 1080p 60Hz monitor. With that config you never needed CF 290Xs; 1 would have done just fine, as will 1 980. If you had a 4K panel then you would need CF, so in your circumstance you were dealing with CF issues when you didn't need to be; it was pointless, and your best bet was to remove 1 GPU.
Quote:


> Originally Posted by *JourneymanMike*
> 
> The comparison is not equal, as far as the OC on the CPU, and the GPU...
> 
> I found that CPU OC, does matter, on GPU benchmarking...
> 
> The platform is also different, Z97 to Z170...
> 
> An alike system, would be a better comparison...


I agree with this. Skylake CPUs are stronger than Haswell and use a completely different, well, everything. They have also changed the way the PCIe works, in part giving the card a boost in performance. I regretfully say this is a worthless comparison.


----------



## the9quad

Quote:


> Originally Posted by *cyberlocc*
> 
> All games can make use of crossfire. You may need to use AFR-friendly mode or mod the settings for a given game, but all games can use crossfire. Whether it's stable or playable with it on is another story.
> 
> Here is the thing I think everyone keeps missing: the only reason to run crossfire is if you play in a way where 1 card is not enough. For instance, I play maxed out at 4K and won't accept under 60 fps, so anything less than 2 Fury Xs will not meet my needs. The same could be said for people playing 3x1440p; 1 Fury X is not going to cut the mustard.
> 
> If you're playing at 1080p or 1440p on a single display, then yes, the Fury X or 980 is the best option; that said, you never needed CF or SLI to begin with. The people who *need* CF or SLI still need it a year later, or they need to step back and re-evaluate whether their expectations or wants outweigh the troubles.
> 
> There will be issues running on the bleeding edge; the question every person has to ask themselves is whether it's worth it. No one can answer this except you: is it worth the issues to game in 4K? Or would there be more value in turning settings down or dropping resolution?
> 
> That is the same way a Fury X could be seen as an upgrade: if I were prepared to swear off 4K completely, I could trade my 290s for a Fury. That would in essence give me better frames at 1080p or 1440p without the issues of CF, but by the same token I would be giving up my ability to play at 4K. As I said before, everyone has different needs and wants. I play RPGs and, TBH, that's all; on the rare occasion I play FPS/racing games, it's on console. So to me 144 fps is not a concern and 4K is, and that makes a massive difference in my immersive RPGs; likewise, frame latency plays little to no role in my games, while in an FPS it does.
> 
> This also falls in line with what spyshagg said. Vsync is looked down on a lot due to its latency, but people need to realize not everyone is an FPS player. If you're playing an FPS or some other competitive online game, then Vsync is bad for business and you want it off, but not everyone plays those games. I fall on both sides but am lucky: I don't play those types of games for the most part. I do at times play PvP-based MMOs where that does matter, but in those games I don't need crossfire, so I am lucky.
> 
> (I have FreeSync now anyway.)
> 
> You really have to look at the games you play and the way you want to play them to make the decision; there is not one answer to rule them all. This is why I made the statement that I did in the beginning. If you have CF because you need it and you think switching to a single card 1 gen later will be an upgrade, it will not. It can be, if you are prepared to change the way you game and your expectations. That said, if you were already playing at lowered needs, you wasted money to begin with and were suffering through issues without reason; downgrading would just put you where you should have been from the get-go, and it becomes an upgrade to your wallet.
> 
> Also, I agree not a lot of people mention it; I think that's because they don't see a need. See again *Need CF vs Want CF*: if you are playing at 1080p you don't NEED CFed Furys and you didn't NEED CFed 290s; you may have wanted them for X reason and found that X wasn't worth Y issues, so you changed your config.
> 
> You are the perfect example of this; I'm not trying to pick on you. If your rig specs are accurate, you're using a 1080p 60Hz monitor. With that config you never needed CF 290Xs; 1 would have done just fine, as will 1 980. If you had a 4K panel then you would need CF, so in your circumstance you were dealing with CF issues when you didn't need to be; it was pointless, and your best bet was to remove 1 GPU.
> I agree with this. Skylake CPUs are stronger than Haswell and use a completely different, well, everything. They have also changed the way the PCIe works, in part giving the card a boost in performance. I regretfully say this is a worthless comparison.


Periodically some games just flat-out won't work at all with CFX, as it is completely disabled; see past Ubi games that had CFX disabled.

In addition, with Crimson, not all games use crossfire. Pretty much, if they don't have a profile, you are out of luck.







It is a crapshoot whether you can enable a per-game anything in Crimson.

Even if they do work, most newer games just don't scale well, making 4K a suspect resolution for people who don't want to sacrifice visual settings while still maintaining 60 fps.

CFX can be useful for 1080p 60Hz. Quite a few games will not do 60 fps at 1080p on a single 980 Ti; throw in some AA and bam, you need SLI or CFX. Not everyone wants or needs a 4K monitor, and there are just as many reasons to have an SLI or CFX system at 1080p as there are at 4K.

Not trying to pick on you


----------



## JourneymanMike

Quote:


> Originally Posted by *MEC-777*
> 
> That's beside the point I was trying to make. The single more powerful GPU will always be faster than dual GPU's when the game can only use one GPU.
> 
> Furthermore, *I never ran into any CPU bottlenecking with my locked i5 and CF 290 setup*.


In my practical experience, I know, and have proof, that a higher CPU clock can result in a higher GPU score.

My performance rig is down due to a D5 pump failure. I don't have access to the SSD where the results are, because I'm using my laptop now.

I will try to get results to show you later...

Mike


----------



## ZealotKi11er

Quote:


> Originally Posted by *JourneymanMike*
> 
> In my practical experience, I know, and have proof, that a higher CPU clock can result in a higher GPU score.
> 
> My Performance Rig is down, due to a D5 pump failure. I don't have access to the SSD, where the results are, because I'm using my laptop now.
> 
> I will try to get results to show you later...
> 
> Mike


In 3DMark I did not get a higher GPU score from a faster CPU. There is a limit, though: going from a 3770K to a 6700K will get you a 0% increase in GPU score but ~2000 more in CPU score.


----------



## Cyber Locc

Quote:


> Originally Posted by *the9quad*
> 
> Periodically some games just flat-out won't work at all with CFX, as it is completely disabled; see past Ubi games that had CFX disabled.
> 
> In addition, with Crimson, not all games use crossfire. Pretty much, if they don't have a profile, you are out of luck.
> 
> It is a crapshoot whether you can enable a per-game anything in Crimson.
> 
> Even if they do work, most newer games just don't scale well, making 4K a suspect resolution for people who don't want to sacrifice visual settings while still maintaining 60 fps.
> 
> CFX can be useful for 1080p 60Hz. Quite a few games will not do 60 fps at 1080p on a single 980 Ti; throw in some AA and bam, you need SLI or CFX. Not everyone wants or needs a 4K monitor, and there are just as many reasons to have an SLI or CFX system at 1080p as there are at 4K.
> 
> Not trying to pick on you


Every game I play runs maxed at 4K (without a ton of AA, which you don't need) at 60Hz; no visual settings are lost. Visuals are lost by not playing at 4K, though; again, this comes down to preference. I will agree there are games that don't have a profile, and also that CF is always a crapshoot; matter of fact, I've said that multiple times.

There is not 1 game that cannot run at 60 fps @ 1080p on a 980 Ti. Sorry, not buying that, buddy; I'm going to need links. Even if there are a "few" (I use that word sparingly because there aren't many, if any; what, maybe 2-3?), those can most likely be easily fixed by reducing AA slightly, which makes no visible difference.


----------



## Cyber Locc

Quote:


> Originally Posted by *ZealotKi11er*
> 
> In 3DMark I did not get more GPU score from faster CPU. There is a limit though. Going from 3770K to 6700K will get you 0% increase in GPU score ~ 2000 in CPU score.


It isn't just the CPU that accounts for the difference we're talking about. Skylake changes the way the PCIe behaves, as does the faster DDR4, and it has been proven that DDR4 does make a difference in gaming. Results from a Z97 board are not going to be the same as from a Z170.

Here's a review with a 6700K vs a 5820K in Firestrike: http://www.bjorn3d.com/2015/08/intel-core-i7-6700k-review-skylake-falling/3/ Spoiler: the 6700K wins.

However, I do see what you're saying about using the GPU-only score; I would still like to see an equal test bench used to get the results.

Edit: found an equal test. I don't like linking to other forums, but the results are out there. It would seem his results are pretty accurate.


----------



## JourneymanMike

Quote:


> Originally Posted by *ZealotKi11er*
> 
> *In 3DMark I did not get more GPU score from faster CPU*. There is a limit though. Going from 3770K to 6700K will get you 0% increase in GPU score ~ 2000 in CPU score.


Yes I did, on my FX-8350 and 4790K...

I'll get the scores after Supper sometime.

As I have important political garbage to watch!


----------



## the9quad

Quote:


> Originally Posted by *cyberlocc*
> 
> Every game I play runs maxed at 4K (without a ton of AA, which you don't need) at 60Hz; no visual settings are lost. Visuals are lost by not playing at 4K, though; again, this comes down to preference. I will agree there are games that don't have a profile, and also that CF is always a crapshoot; matter of fact, I've said that multiple times.
> 
> There is not 1 game that cannot run at 60 fps @ 1080p on a 980 Ti. Sorry, not buying that, buddy; I'm going to need links. Even if there are a "few" (I use that word sparingly because there aren't many, if any; what, maybe 2-3?), those can most likely be easily fixed by reducing AA slightly, which makes no visible difference.


Quite a few games do not maintain a solid 60 fps at 1080p (and especially at 1440p) on a 980 Ti, and just like you won't sacrifice 4K, a lot of people won't sacrifice AA. Quite a few games also don't hit a solid 60 fps at 4K without turning things down in SLI or CFX.

The Witcher 3, GTA V, Far Cry 4, etc. all go under 60 fps when completely maxed on a 980 Ti at 1080p, and they're all very popular games (and yes, when maxed at 4K in SLI they do as well). Same thing with Rainbow Six Siege when any AA besides temporal AA is used. I am glad every game "you" play runs fine without turning down settings at 4K and "you" still maintain 60 fps. Keep in mind, before belittling some dude for running CFX at 1080p 60Hz, that maybe he doesn't play the games "you" do, and maybe he likes AA (which is very much needed at 1080p, and turning it down is a huge thing to some people).


----------



## ZealotKi11er

Quote:


> Originally Posted by *the9quad*
> 
> Quite a few games do not maintain a solid 60 fps at 1080p (and especially at 1440p) on a 980 Ti, and just like you won't sacrifice 4K, a lot of people won't sacrifice AA. Quite a few games also don't hit a solid 60 fps at 4K without turning things down in SLI or CFX.
> The Witcher 3, GTA V, Far Cry 4, etc. all go under 60 fps when completely maxed on a 980 Ti at 1080p, and they're all very popular games (and yes, when maxed at 4K in SLI they do as well). Same thing with Rainbow Six Siege when any AA besides temporal AA is used. I am glad every game "you" play runs fine without turning down settings at 4K and "you" still maintain 60 fps. Keep in mind, before belittling some dude for running CFX at 1080p 60Hz, that maybe he doesn't play the games "you" do, and maybe he likes AA (which is very much needed at 1080p, and turning it down is a huge thing to some people).


I am playing Assassin's Creed Syndicate and with everything maxed @ 1680x1050 I am getting 40 fps, lol. CFX is scaling 100%.


----------



## JourneymanMike

Quote:


> Originally Posted by *JourneymanMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> *In 3DMark I did not get more GPU score from faster CPU*. There is a limit though. Going from 3770K to 6700K will get you 0% increase in GPU score ~ 2000 in CPU score.
> 
> 
> 
> Yes I did, on my FX-8350 and 4790K..
> 
> I'll get the scores after Supper sometime.
> 
> As I have important political garbage to watch!

Comparison: I needed more vCore for the higher score...

http://www.3dmark.com/compare/fs/6593063/fs/6874761#


----------



## kizwan

Quote:


> Originally Posted by *spyshagg*
> 
> Keep in mind:
> 
> *The tearing with crossfire is abysmal on a 60hz panel.*
> 
> Its better with a 144hz panel, but I'm calling it almost obligatory to use either Vsync or Freesync.


Very noticeable?
Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *MEC-777*
> 
> That's beside the point I was trying to make. The single more powerful GPU will always be faster than dual GPU's when the game can only use one GPU.
> 
> *Furthermore, I never ran into any CPU bottlenecking with my locked i5 and CF 290 setup*.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> .
> 
> lol. no way. i've seen an overclocked i5 bottleneck a single 970, which has less cpu overhead than a single 290. you had two with a locked i5.

He was probably playing @ 4K, or had MSAA at max, or the games weren't CPU-intensive.
Quote:


> Originally Posted by *JourneymanMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> *In 3DMark I did not get more GPU score from faster CPU*. There is a limit though. Going from 3770K to 6700K will get you 0% increase in GPU score ~ 2000 in CPU score.
> 
> 
> 
> Yes I did, on my FX-8350 and 4790K..
> 
> I'll get the scores after Supper sometime.
> 
> As I have important political garbage to watch!

I noticed the same thing, actually. A faster CPU does boost FPS a little bit, which in benching can give you hundreds of additional (graphics) points. I think this is an AMD thing, where a CPU with better single-threaded performance has less DX11 CPU overhead & usage.


----------



## Cyber Locc

Quote:


> Originally Posted by *the9quad*
> 
> Quite a few games do not maintain a solid 60 fps at 1080p (and especially at 1440p) on a 980 Ti, and just like you won't sacrifice 4K, a lot of people won't sacrifice AA. Quite a few games also don't hit a solid 60 fps at 4K without turning things down in SLI or CFX.
> The Witcher 3, GTA V, Far Cry 4, etc. all go under 60 fps when completely maxed on a 980 Ti at 1080p, and they're all very popular games (and yes, when maxed at 4K in SLI they do as well). Same thing with Rainbow Six Siege when any AA besides temporal AA is used. I am glad every game "you" play runs fine without turning down settings at 4K and "you" still maintain 60 fps. Keep in mind, before belittling some dude for running CFX at 1080p 60Hz, that maybe he doesn't play the games "you" do, and maybe he likes AA (which is very much needed at 1080p, and turning it down is a huge thing to some people).


I didn't belittle anyone, did I? Funny, because I don't remember belittling anyone. I said he could have gotten by with a 980 from the get-go, seeing how that's what he's running now; that was the truth. Sorry if speaking the truth is belittling in your eyes.

And I'm still waiting to hear which games a 980 can't run. Or is this list accurate?

All AVG FPS at [email protected] for a 980ti

GTA V - 63.3
Witcher 3 - 56.2
http://www.maximumpc.com/nvidia-geforce-gtx-980-ti-review/

Fallout 4 - 88
http://www.gamersnexus.net/game-bench/2177-fallout-4-pc-video-card-fps-benchmark-all-resolutions

And just because someone else mentioned it I will throw it in as well.
AC Syndicate - 71
http://www.gamersnexus.net/game-bench/2195-assassins-creed-syndicate-gpu-fps-benchmarks

Also, those are cards at stock, btw; an overclocked 980 Ti would get much better performance.

If you want to run CF for whatever reason (so your game never dips, so you have more than 60 FPS where you don't need it, or just because it looks cool as heck), it really doesn't matter, does it? After all, this is overkill.net. I'm getting ready to blow a ton on an STH10 with three 560s. Do I need more cooling? No. Do I want it? Yes. Is it a waste of money? Completely, and I know that going in, and that's okay.

So by your logic, if someone wants a 16-core Xeon to game with, we should just let him spend $6k on a CPU he won't use? You're too sensitive, man. I wasn't trying to be mean to him or "belittle" him in any way, and if he took it that way, my apologies to him. It makes no difference to me or anyone else how he spends his money. All I'm saying is that a single 290X (a Fury X even better :thumb:) can do the job. He stated he was having crossfire issues, moved to a 980, and is much happier. It's a shame I wasn't there the day he bought the CFed GPUs to save him the trouble and money he spent getting to that point.








I used him as an example only because his statement fit what I was saying.

If anyone is belittling anyone here, it's you. You have come at me for the past few days because of your issues with CF and my lack of them. I'm sorry you have issues with CF that I don't share, and I'm sorry you seemingly hate crossfire. The same applies to you: don't use it then; get a 1080p screen and skip CF. But you don't need to constantly bash it; every post you have made here has been negative and just argumentative with me. I am trying to help the people in this thread asking about going from two cards to a Fury X, while you are just trying to argue, and I'm done talking to you. Kindly, after you read this, block me or say your final words, and I will block you.

Thanks, and sorry to @MEC-777 if what I said offended you in any way, as that was not my intent.


----------



## mus1mus

Quote:


> Originally Posted by *JourneymanMike*
> 
> Yes I did, on my FX-8350 and 4790K..
> 
> I'll get the scores after Supper sometime.
> 
> As I have important political garbage to watch!


Don't forget that the FX is running the GPU at X16 PCIe 2.0 vs the X16 PCIe 3.0 with Intel. It's not big. But it's there.

Otherwise, Intel to Intel at same link speeds is a better comparison.


----------



## sinnedone

Quote:


> Originally Posted by *cyberlocc*
> 
> I didn't belittle anyone did I? Funny because I don't remember belittling anyone, I said he could have gotten by a 980 from the get go seeing how that is what he is now doing that was truth. Sorry if speaking the truth is belittling in your eyes.
> 
> And I am still waiting for which games that a 980 cant run? or is this list accurate.
> 
> All AVG FPS at [email protected] for a 980ti
> 
> GTA V - 63.3
> Wicther 3 - 56.2
> http://www.maximumpc.com/nvidia-geforce-gtx-980-ti-review/
> 
> Fallout 4 - 88
> http://www.gamersnexus.net/game-bench/2177-fallout-4-pc-video-card-fps-benchmark-all-resolutions
> 
> And just because someone else mentioned it I will throw it in as well.
> AC Syndicate - 71
> http://www.gamersnexus.net/game-bench/2195-assassins-creed-syndicate-gpu-fps-benchmarks
> 
> Also those are cards at stock btw a 980ti overclocked would yet much better performance.
> 
> If you want to run CF for whatever reason, so your game doesn't dip ever, so you have more than 60 FPS where you dont need it, because it looks cool as heck. IT really doesn't matter does it? I mean after all this is overkill.net I am getting ready to blow a ton on a STH10 with 3 560s. Do I need more cooling, no I dont, Do I want it? yes I do. Is it a waste of money? Completely and I know that going in and thats okay.
> 
> So what by your logic I am belittling the guy wanting a 16 core Xeon to game with so lets let him spend 6k on a CPU he wont use? You are too sensitive man, I wasnt trying to be mean to him or "Belittle" him in anyway and if he thought that my apologizes to him. It makes no difference to me or anyone else how he spends his money, All I am saying is that a single 290x (Fury X much better:thumb can do. He stated he was having crossfire issues and that he moved to a 980 and is much happier. Its a shame I wasn't there the day he bought the CFed gpus to save him the trouble and money that he spent to get to that point
> 
> 
> 
> 
> 
> 
> 
> .
> 
> I used him as an example to what I was saying only because of his statement fit what I was saying.
> 
> If anyone is belittling anyone here its you. You have came at me for the past few days because of your issues with CF and my lack of.Im sorry you have issues with CF and I do not share them, I am sorry you seemingly hate crossfire. The same applies to you don't use it then, get a 1080p screen and don't use CF. However you do not need to consistently bash it, every post you have made here has been negative and just argumentative with me. I am trying to help the people in the thread that are asking about a fury x from 2 cards you are just trying to argue and im done talking to you. Kindly after you read this block me or say your final words and I will you.
> 
> Thanks and sorry to @MEC-777 if what I said offended you in anyway as that was not my intent.


Those are all 6-core hyperthreaded CPUs. It makes a difference when most of us are running quad-core CPUs.


----------



## the9quad

Quote:


> Originally Posted by *cyberlocc*
> 
> I didn't belittle anyone did I? Funny because I don't remember belittling anyone, I said he could have gotten by a 980 from the get go seeing how that is what he is now doing that was truth. Sorry if speaking the truth is belittling in your eyes.
> 
> And I am still waiting for which games that a 980 cant run? or is this list accurate.
> 
> All AVG FPS at [email protected] for a 980ti
> 
> GTA V - 63.3
> Wicther 3 - 56.2
> http://www.maximumpc.com/nvidia-geforce-gtx-980-ti-review/
> 
> Fallout 4 - 88
> http://www.gamersnexus.net/game-bench/2177-fallout-4-pc-video-card-fps-benchmark-all-resolutions
> 
> And just because someone else mentioned it I will throw it in as well.
> AC Syndicate - 71
> http://www.gamersnexus.net/game-bench/2195-assassins-creed-syndicate-gpu-fps-benchmarks
> 
> Also those are cards at stock btw a 980ti overclocked would yet much better performance.
> 
> If you want to run CF for whatever reason, so your game doesn't dip ever, so you have more than 60 FPS where you dont need it, because it looks cool as heck. IT really doesn't matter does it? I mean after all this is overkill.net I am getting ready to blow a ton on a STH10 with 3 560s. Do I need more cooling, no I dont, Do I want it? yes I do. Is it a waste of money? Completely and I know that going in and thats okay.
> 
> So what by your logic I am belittling the guy wanting a 16 core Xeon to game with so lets let him spend 6k on a CPU he wont use? You are too sensitive man, I wasnt trying to be mean to him or "Belittle" him in anyway and if he thought that my apologizes to him. It makes no difference to me or anyone else how he spends his money, All I am saying is that a single 290x (Fury X much better:thumb can do. He stated he was having crossfire issues and that he moved to a 980 and is much happier. Its a shame I wasn't there the day he bought the CFed gpus to save him the trouble and money that he spent to get to that point
> 
> 
> 
> 
> 
> 
> 
> .
> 
> I used him as an example to what I was saying only because of his statement fit what I was saying.
> 
> If anyone is belittling anyone here its you. You have came at me for the past few days because of your issues with CF and my lack of.Im sorry you have issues with CF and I do not share them, I am sorry you seemingly hate crossfire. The same applies to you don't use it then, get a 1080p screen and don't use CF. However you do not need to consistently bash it, every post you have made here has been negative and just argumentative with me. I am trying to help the people in the thread that are asking about a fury x from 2 cards you are just trying to argue and im done talking to you. Kindly after you read this block me or say your final words and I will you.
> 
> Thanks and sorry to @MEC-777 if what I said offended you in anyway as that was not my intent.


You asked which games can't maintain a solid 60 fps, and I told you which ones couldn't. Some people want a nice flat frame time and don't really care what the average FPS is. High framerates and low framerates can average out to a nice framerate and still give a shaky experience. This is a 56 fps average on an overclocked 980 Ti in The Witcher 3, and it shows what I mean:



Sidenote: I really don't have an issue with you, and I am sure a lot of this is just missing context, because we are typing over the internet without inflection, and a lot of meaning gets read into things based on how we happen to read them.

Truce:thumb:

That said, crossfire performance is generally good in Frostbite engine games, and in pretty much anything AMD chooses to be a part of. It is generally awful in GameWorks titles. The rest is a crapshoot or a waiting game. Sometimes I don't mind waiting. Case in point: Dying Light was awful at release; I waited months, and then it ran flawlessly. Other games are still on the shelf waiting for improvements. It gets frustrating, though, when they shove a driver out the door that makes things take a step back (Crimson). Yes, AMD Matt stated that they are aware that CFX profiles have issues in the new Crimson drivers, and they are working on fixing it.
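To put numbers on the averages-vs-frame-times point: here's a quick Python sketch (the frame times are made up purely for illustration, not measured from any game):

```python
def summarize(frametimes_ms):
    """Return (average FPS, worst single frame time in ms)."""
    avg_fps = round(1000 * len(frametimes_ms) / sum(frametimes_ms), 1)
    return avg_fps, max(frametimes_ms)

flat  = [16.7] * 6                            # steady ~60 fps pacing
shaky = [10.0, 10.0, 10.0, 23.4, 23.4, 23.4]  # same total time, uneven pacing
print(summarize(flat))
print(summarize(shaky))
```

Both runs report the same ~59.9 fps average, but the second one spends half its frames at 23+ ms, which is exactly the stutter an average hides.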


----------



## JourneymanMike

Quote:


> Originally Posted by *mus1mus*
> 
> Don't forget that the FX is running the GPU at X16 PCIe 2.0 vs the X16 PCIe 3.0 with Intel. It's not big. But it's there.
> 
> Otherwise, Intel to Intel at same link speeds is a better comparison.


That's true, but the AMD board (CVF-Z) runs an actual 16x in both PCIe slots 1 & 4 in CFX, at Gen 2.0.

As opposed to the Intel M7F board, which runs 8x in CFX, at Gen 3.0.

I really don't know what the difference would be between the two...

I'm waiting for AMD to get with it, and come out with a new platform.

The FX processor is getting old. That's why I switched to Intel...
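To put rough numbers on the x16 Gen 2 vs x8 Gen 3 question, here's a back-of-the-envelope Python sketch. The per-lane rates and encoding overheads come from the PCIe spec; this is approximate, one direction only, and ignores protocol overhead beyond line encoding:

```python
def pcie_bandwidth_gbs(gen, lanes):
    """Approximate one-direction PCIe link bandwidth in GB/s."""
    if gen == 2:
        per_lane = 5.0 * (8 / 10) / 8     # 5 GT/s, 8b/10b encoding -> 0.5 GB/s/lane
    elif gen == 3:
        per_lane = 8.0 * (128 / 130) / 8  # 8 GT/s, 128b/130b -> ~0.985 GB/s/lane
    else:
        raise ValueError("only Gen 2 and Gen 3 sketched here")
    return per_lane * lanes

print(pcie_bandwidth_gbs(2, 16))  # FX board: x16 Gen 2
print(pcie_bandwidth_gbs(3, 8))   # Intel board in CFX: x8 Gen 3
```

So x16 Gen 2 (8.0 GB/s) and x8 Gen 3 (~7.9 GB/s) are nearly identical on paper, which matches mus1mus's "it's not big, but it's there."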


----------



## mus1mus

Well, figure about 300 points off the graphics score at the same clocks for running PCIe 2.0 vs 3.0.


----------



## spyshagg

Quote:


> Originally Posted by *kizwan*
> 
> Very noticeable?


At 60hz, *extremely*. Like you wouldn't believe. Vsync or Freesync obligatory.


----------



## ZealotKi11er

Quote:


> Originally Posted by *JourneymanMike*
> 
> Comparison: I needed more vCore on the higher score...
> 
> http://www.3dmark.com/compare/fs/6593063/fs/6874761#


They are both 4790K and the one with bigger OC is scoring less.


----------



## fat4l

Quote:


> Originally Posted by *ZealotKi11er*
> 
> They are both 4790K and the one with bigger OC is scoring less.


exactly lol


----------



## JourneymanMike

Quote:


> Originally Posted by *ZealotKi11er*
> 
> They are both 4790K and the one with bigger OC is scoring less.


OK, back to the drawing board!

I'll be back!


----------



## Cyber Locc

Quote:


> Originally Posted by *the9quad*
> 
> You asked which games can't maintain a solid 60 fps, and I told you which ones couldn't. Some people want a nice flat frametime, and don't really care what average FPS is. High framerates, and low framerates can equal a nice average framerate and still give a shaky experience.This is 56 fps average on a overclocked 980ti, in The Witcher 3, and it shows what I mean:
> 
> 
> 
> Sidenote-I really don't have an issue with you, and I am sure a lot of this is just context missing because we are typing over the internet without inflection, and a lot of stuff gets read in to the way we read it.
> 
> Truce:thumb:
> 
> That said, crossfire performance is generally good in Frostbite engine games, and pretty much anything AMD chooses to be a part of. It is generally awful in gameworks titles. The rest is a crapshoot or a waiting game. Sometimes I don't mind waiting. Case in point dying light was awful at release, and I waited months, then it ran flawlessly. Other games, still on the shelf waiting for improvements. It gets frustrating though, when they shove a driver out the door that makes things take a step back (crimson). Yes, AMD Matt stated that they are aware that CFX profiles have issues in the new Crimson drivers, and they are working on fixing it.


I 100% agree; it is very frustrating, and sometimes I even want to ditch CF and 4K to play with one card, problem-free. I also agree about the missing-inflection thing, from your read of me to my read of him. I apologize if I seemed mean to him or anyone else; that wasn't my intent. I have been told a lot in real life as well that I lack tact and tend to make people feel like I am insulting them, but truthfully that is never my intent. So I am sorry to you and anyone else I may have offended.
Quote:


> Originally Posted by *sinnedone*
> 
> All 6 core hyperthreaded cpu's. It makes a difference when most are running quad core cpu's


I don't think a hexacore makes a difference in most of those games; however, the chipset and CPU platform might. I see similar results to those (in the games I play), though I'm on an X79 chipset with a quad-core CPU, albeit a different and stronger variant than what I think you mean. So yeah, I see what you're saying, and that is possible.


----------



## sinnedone

What I mean is the physics work gets offloaded to the CPU. Depending on frequency and core count, that will affect minimum FPS.


----------



## Cyber Locc

Quote:


> Originally Posted by *sinnedone*
> 
> What I mean is the physics part gets offloaded to cpu. Depending on frequency and core count that will affect low fps.


Okay, no. First off, only one game in that list uses PhysX, and that is Witcher 3. Secondly, PhysX only runs on the CPU if you set it to in the Nvidia drivers; by default the GPU handles PhysX. The only impact of a 6-core vs a 4-core would come from the platform.

My 4820K will perform just as well as a 4930K in most games (assuming clocks etc. are the same). There are a few games that do support 6 cores, but not many. Off the top of my head I can think of one, BF4, and maybe Civ (I think), but the differences are far from huge: 2-5 FPS at best.

Now, if we were talking about AMD cards, that would change, and you would be right that PhysX would run on the CPU in the few games that don't require hardware PhysX and simply don't use those features on AMD. Also, PhysX is dead, lol; the only time you'll see it these days is in GameWorks games, because Nvidia likes to hold on to dead proprietary stuff.

Edit: Actually, Witcher 3 uses CPU-only PhysX, so that game may see an improvement from a hexacore.

BTW, as to most games, see https://www.techpowerup.com/reviews/Intel/Core_i7-5960X_5930K_5820K_Comparison/5.html


----------



## Mercennarius

Quote:


> Originally Posted by *cyberlocc*
> 
> There are a few games that do support 6 cores but not many. Off the top of my head I can think of 1 and that is BF4 and maybe Civ (I think) but the differences are far from huge, 2-5fps at best.


That is certainly changing, especially with DirectX 12. Ashes of the Singularity can use 12+ cores:
http://www.overclock.net/t/1584528/directx-11-vs-directx-12-multi-thread-performance-ashes-of-the-singularity


----------



## sinnedone

Quote:


> Originally Posted by *cyberlocc*
> 
> Okay ya no, First off only one game in that list uses Phys X and that is Wicther 3. Secondly Phsy X only runs on CPU if you set it to in the Nvidia drivers, by default the GPU handles Phsy X. The only impact a 6 core vs a 4 core makes would be from platform.
> 
> My 4820k will perform just as well as a 4930k in most games.(assuming clocks ect are the same) There are a few games that do support 6 cores but not many. Off the top of my head I can think of 1 and that is BF4 and maybe Civ (I think) but the differences are far from huge, 2-5fps at best.
> 
> Now if we were talking about AMD cards that would change and you would be right that PhysX would run from the CPU on the few games that don't have hardware required PhysX and flat out don't use those features with AMD. Also PhysX is dead lol, the only time you will see it these days is in Games Works games because Nvidia likes to hold on to dead proprietary stuff.


I at no time mentioned PhysX; I said physics.

Considering this is the *AMD* 290/290x thread it most certainly makes a difference.


----------



## Cyber Locc

Quote:


> Originally Posted by *sinnedone*
> 
> I at no time mentioned physx, I sais physics.
> 
> Considering this is the *AMD* 290/290x thread it most certainly makes a difference.


I don't understand what you mean by physics, then (that's why I assumed you meant PhysX). Anyway, here is this for now: https://www.techpowerup.com/reviews/Intel/Core_i7-5960X_5930K_5820K_Comparison/5.html

That is a 4-core vs the new 6-cores in gaming; a 5960X loses to a 4770K in most games.
Quote:


> Originally Posted by *Mercennarius*
> 
> That is certainly changing especially with Directx 12. Ashes of the singularity can use 12+ cores:
> http://www.overclock.net/t/1584528/directx-11-vs-directx-12-multi-thread-performance-ashes-of-the-singularity


I am not going to say one way or the other on this; I have seen both claims made and presented, but DX12 isn't here yet, so this isn't fact yet. Also, it isn't really relevant to what we were talking about: he said the reviews I linked earlier were skewed by their use of a 6-core CPU, which, as I showed above with DX11 and these games, isn't true.


----------



## Mercennarius

Quote:


> Originally Posted by *cyberlocc*
> 
> I am not going to say one way or the other on this issue, I have seen both things said and presented but dx12 isn't here this isn't fact yet.


I disagree. DX12 is here, on every copy of Windows 10 with a compatible graphics card. Ashes of the Singularity can be purchased on Steam right now and run in DX12 or DX11. I can say for sure that the game is using 12+ threads right now.


----------



## the9quad

I can't think of a single game that I play where having a six core processor has benefitted me. 3D mark and video stuff yea, but games I play, no. I do however think being on x79 or x99 benefits multi gpu systems somewhat in games, not really anything great though. Would still take a 5960x over anything else if I could afford it though


----------



## sinnedone

Quote:


> Originally Posted by *cyberlocc*
> 
> I don't understand what your saying by physics then (thats why I assumed you meant PhysX). Anyway here is this for now https://www.techpowerup.com/reviews/Intel/Core_i7-5960X_5930K_5820K_Comparison/5.html
> 
> That is a 4core vs the new 6 cores in gaming a 5960x loses to a 4770k in most games,
> I am not going to say one way or the other on this issue, I have seen both things said and presented but dx12 isn't here this isn't fact yet. Also it isn't really relevant to what we were talking about, he said the review I linked earlier are skewed as to there use of a 6 core CPU which as I showed above with DX11 and these games that isn't true.


Actual explosions, etc. that use the CPU, not Nvidia's proprietary PhysX. What you linked uses an Nvidia card, not an AMD 290/290X.

Quote:


> Originally Posted by *the9quad*
> 
> I can't think of a single game that I play where having a six core processor has benefitted me. 3D mark and video stuff yea, but games I play, no. I do however think being on x79 or x99 benefits *multi gpu systems* somewhat in games, not really anything great though. Would still take a 5960x over anything else if I could afford it though


That might be the reason I can see a difference in lower fps in certain scenarios on my personal system when cpu is at different clock speeds. (ie stock vs overclocked)


----------



## ZealotKi11er

Quote:


> Originally Posted by *the9quad*
> 
> I can't think of a single game that I play where having a six core processor has benefitted me. 3D mark and video stuff yea, but games I play, no. I do however think being on x79 or x99 benefits multi gpu systems somewhat in games, not really anything great though. Would still take a 5960x over anything else if I could afford it though


For me, I changed from an i5 to an i7 for HT because a lot of people said it would help in games. It did not for me. The only game where I saw more fps was Crysis 3, in some parts of the game.


----------



## By-Tor

Quote:


> Originally Posted by *ZealotKi11er*
> 
> For me i changed from i5 to i7 for HT because a lot of people said it would help in games. I did not for me. Only game I sow more fps was Crysis 3 in some part of the game.


I turned HT off on my 4790k to see how BF4 would play and it played like crap..


----------



## navjack27

Quote:


> Originally Posted by *By-Tor*
> 
> I turned HT off on my 4790k to see how BF4 would play and it played like crap..


Keep it on and use something like Process Lasso.
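For the curious, what tools like Process Lasso do under the hood is essentially CPU affinity pinning: restricting a process to chosen logical cores. A minimal stdlib sketch; note `os.sched_setaffinity` is Linux-only (Process Lasso uses the equivalent Windows API), and `pin_to_cores` is just a hypothetical helper name for illustration:

```python
import os

def pin_to_cores(pid, cores):
    """Restrict `pid` (0 = the current process) to the given logical CPUs
    and return the resulting affinity set."""
    os.sched_setaffinity(pid, set(cores))
    return os.sched_getaffinity(pid)

# Pin this process to logical CPU 0, e.g. to keep a game off busy cores.
print(pin_to_cores(0, [0]))
```

A tool like Process Lasso just does this per-process, persistently, with a GUI.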


----------



## Cyber Locc

Quote:


> Originally Posted by *sinnedone*
> 
> Actual explosions etc that use the cpu, not nvidias proprietary physX. What you linked uses an nvidia card, not an amd 290/290x.
> That might be the reason I can see a difference in lower fps in certain scenarios on my personal system when cpu is at different clock speeds. (ie stock vs overclocked)


Okay, got ya. Still, core count doesn't make a difference (well, past 4 cores, again in most games). However, clock speed does play a major role, up to a point; there is a sweet spot past which performance doesn't increase much, but below it, it mostly does.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> For me i changed from i5 to i7 for HT because a lot of people said it would help in games. I did not for me. Only game I sow more fps was Crysis 3 in some part of the game.


Yep, sounds about right. CryEngine 3 and Frostbite 3 will both utilize more than 4 cores and make a difference, but most games won't.
Quote:


> Originally Posted by *Mercennarius*
> 
> I disagree. DX12 is here, on every copy of Windows 10 and a compatible graphics card. Ashes of the Singularity can be purchased on Steam right now and ran in DX12 or DX11. I can say for sure that the game is using 12 + threads right now.


Yes, that's true, but the way you worded it suggested that all or a lot of DX12 games will use 6c/12t (at least from what I gathered), and there simply isn't enough evidence to support that yet. I mean, I hope they do start using more cores; I just don't know what will happen.


----------



## kizwan

Multithreaded doesn't necessarily mean the process/tasks are distributed evenly across all cores/threads. It still depends on the game's coding. If you check CPU core/thread usage in some games, you'll notice some cores/threads have higher usage than others. In that case, performance still depends on the CPU's single-thread performance, because one task can hold up other tasks. It will be no different with DX12-enabled games, because it still depends on the coding (as far as how the work is distributed across cores). The benefit with DX12 is that the API's CPU overhead will be a lot lower than DX11's for AMD GPUs, so we should see better CPU core/thread utilization.

Many games nowadays can already use all available cores, but that doesn't necessarily mean more cores are better, because, like I said above, it depends on the coding and also on the CPU's single-thread performance. That remains true with DX12, except core/thread utilization will be a lot better. It's all about the coding. I can't stress this enough.
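The "one busy thread holds everything up" effect is basically Amdahl's law. A quick illustrative Python sketch (the parallel fractions are hypothetical, not measured from any real game):

```python
def amdahl_speedup(parallel_fraction, cores):
    """Theoretical speedup when only `parallel_fraction` of the work scales
    across cores; the rest stays serialized on a single thread."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A DX11-style frame with ~90% of the work stuck on one submission thread
# barely gains from extra cores; a 90%-parallel DX12-style frame keeps scaling.
for cores in (2, 4, 8):
    dx11_like = amdahl_speedup(0.10, cores)
    dx12_like = amdahl_speedup(0.90, cores)
    print(f"{cores} cores: {dx11_like:.2f}x vs {dx12_like:.2f}x")
```

This is why single-thread performance still dominates when the coding leaves most work on one thread, no matter how many cores the CPU has.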


----------



## ZealotKi11er

Quote:


> Originally Posted by *kizwan*
> 
> Multithreaded doesn't necessarily means the process/tasks is distributed to all cores/threads evenly. It's still depends on the games coding. If you guys checked the CPU core/threads usage (with some games), you'll notice some cores/threads have higher usage than the others. In this case, the performance still depends on the CPU single thread performance because task(s) can still hold up other task(s). It will be no difference with DX12-enabled games because it still depends on the coding (this is regarding how the process/tasks distributed to all cores). The benefit with DX12 is the API CPU overhead will be a lot better than DX11 for AMD gpus. So we should see better CPU cores/threads utilization.
> 
> Many games nowadays can use all available cores already but it doesn't necessarily means more core is better because like I said above depends on the coding & also depends on the CPU single thread performance. This remain true with DX12 except CPU cores/threads utilization will be a lot better. It's all about the coding. I can't stress this enough.


Well, that is what is going to happen more of the time. In gaming it's much better to have a CPU with 2 cores @ 5GHz and 6 @ 3GHz than 8 @ 4GHz.


----------



## kizwan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Multithreaded doesn't necessarily means the process/tasks is distributed to all cores/threads evenly. It's still depends on the games coding. If you guys checked the CPU core/threads usage (with some games), you'll notice some cores/threads have higher usage than the others. In this case, the performance still depends on the CPU single thread performance because task(s) can still hold up other task(s). It will be no difference with DX12-enabled games because it still depends on the coding (this is regarding how the process/tasks distributed to all cores). The benefit with DX12 is the API CPU overhead will be a lot better than DX11 for AMD gpus. So we should see better CPU cores/threads utilization.
> 
> Many games nowadays can use all available cores already but it doesn't necessarily means more core is better because like I said above depends on the coding & also depends on the CPU single thread performance. This remain true with DX12 except CPU cores/threads utilization will be a lot better. It's all about the coding. I can't stress this enough.
> 
> 
> 
> 
> 
> 
> 
> Well that is what is going to happen more of the time. Much better to have a CPU with 2 cores @ 5GHz and 6 @ 3GHz then 8 @ 4GHz in gaming.
Click to expand...

If the 2-core CPU @5GHz has better single-thread performance than the 8-core CPU @4GHz, then yes. But a 2-core CPU is probably too "little" for modern games, because it has limited resources (2 cores) to process tasks. Unless the CPU has HT, the 8-core CPU @4GHz clocked up to 4.5GHz may perform more efficiently than the 2-core CPU @5GHz. I think a dual-core with HT or a quad-core CPU should be the minimum nowadays.


----------



## ZealotKi11er

Quote:


> Originally Posted by *kizwan*
> 
> If the 2 cores CPU @5GHz have better single thread performance than the 8 cores CPU @4GHz, then yes. But 2 cores CPU probably too "little" for modern games though because it have limited resources (2 cores) to process the tasks. Unless the CPU is HT-able, 8 cores CPU @4GHz clocked up to 4.5GHz may perform efficiently than the 2 cores CPU @5GHz. I think dual-cores with HT or quad-core CPU should be the minimum nowadays.


I am saying both are 8-core CPUs. The first CPU has different clock speeds for different cores: 2 @ 5GHz and the other 6 @ 3GHz.


----------



## kizwan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> If the 2 cores CPU @5GHz have better single thread performance than the 8 cores CPU @4GHz, then yes. But 2 cores CPU probably too "little" for modern games though because it have limited resources (2 cores) to process the tasks. Unless the CPU is HT-able, 8 cores CPU @4GHz clocked up to 4.5GHz may perform efficiently than the 2 cores CPU @5GHz. I think dual-cores with HT or quad-core CPU should be the minimum nowadays.
> 
> 
> 
> 
> 
> 
> 
> I am saying both are 8 Core CPUs. The first CPU has different clock speed for different cores. 2 are @ 5 and the other 6 @ 3.
Click to expand...

I thought you meant CPUs from different generations there.

If you have an (overclocked) turbo boost profile like that, where 6 active cores run @3GHz and 2 active cores run @5GHz, then depending on the workload it may offer better performance than locking all cores @4GHz. Operating temperatures may be lower too (guesstimate).


----------



## rdr09

Quote:


> Originally Posted by *By-Tor*
> 
> I turned HT off on my 4790k to see how BF4 would play and it played like crap..


My cores with HT off on my i7 Sandy @ 4.5 were maxed out with 2 HD 7900 cards in BF3.


----------



## Mercennarius

Quote:


> Originally Posted by *kizwan*
> 
> Multithreaded doesn't necessarily means the process/tasks is distributed to all cores/threads evenly. It's still depends on the games coding. If you guys checked the CPU core/threads usage (with some games), you'll notice some cores/threads have higher usage than the others. In this case, the performance still depends on the CPU single thread performance because task(s) can still hold up other task(s). It will be no difference with DX12-enabled games because it still depends on the coding (this is regarding how the process/tasks distributed to all cores). The benefit with DX12 is the API CPU overhead will be a lot better than DX11 for AMD gpus. So we should see better CPU cores/threads utilization.
> 
> Many games nowadays can use all available cores already but it doesn't necessarily means more core is better because like I said above depends on the coding & also depends on the CPU single thread performance. This remain true with DX12 except CPU cores/threads utilization will be a lot better. It's all about the coding. I can't stress this enough.


True. If you look at my Ashes of the Singularity DX11 vs DX12 thread linked on the previous page, you can see both DX11 and DX12 using 12+ threads. The difference is that DX11 puts 90+% of the workload on one thread, while DX12 for the most part distributes the workload evenly across many threads.


----------



## mus1mus

An hour to go yet,

HAPPY NEW YEAR!
everyone.


----------



## BadRobot

Quote:


> Originally Posted by *rdr09*
> 
> My cores with HT off on my i7 Sandy @ 4.5 were maxed out with 2 HD 7900 cards in BF3.


Lucky

my 2600K can't get past 4.2GHz, while my 2500K reaches 4.3, maybe 4.4GHz. Curse you, silicon lottery!

Happy new year in advance!


----------



## battleaxe

Quote:


> Originally Posted by *BadRobot*
> 
> Lucky
> 
> 
> 
> 
> 
> 
> 
> my 2600k can't get past 4.2Ghz while my 2500k reaches 4.3 maybe 4.4Ghz. Curse you, silicon lottery!
> 
> Happy new year in advance!


I'm pretty sure they could both do better than that. What are you using, the auto-OC feature on the motherboard? You should be able to beat that by 200-300MHz using your own settings in the BIOS instead of auto-OC.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> My cores with HT off on my i7 Sandy @ 4.5 were maxed out with 2 HD 7900 cards in BF3.


My 2500K @ 4.8GHz bottlenecked the heck out of 2 x HD 7970 @ 1440p 0AA. Even a 3770K did not help. In BF4 I have to run the game @ 4K to avoid a CPU bottleneck, i.e. to get around AMD's CPU driver overhead.


----------



## the9quad

1440p, 2xMSAA/FXAA on low, the rest on Ultra: I get 100% GPU usage in BF4 on all 3 290Xs.


----------



## BadRobot

Quote:


> Originally Posted by *battleaxe*
> 
> I'm pretty sure they could both do better than that. What are you using, the auto OC feature on the motherboard? You should be able to beat that by 200-300MHz using your own settings in the BIOS instead of auto OC.


Quote:


> Originally Posted by *ZealotKi11er*
> 
> My 2500K @ 4.8GHz bottlenecked the heck out of 2 x HD 7970 @ 1440p 0AA. Even 3770K did not help. In BF4 I have to run the game @ 4K to get no CPU bottleneck aka for AMD CPU overhead.


Using the BIOS to OC. Going any higher than what I mentioned, the voltage increases too much IMO. With both I don't want to go past 1.33V, since 100MHz higher requires nearly 1.4V.

I should download and boot up BF4 again to see how the R9 290 fares at 1440p 75Hz with the 2600k.


----------



## ZealotKi11er

Quote:


> Originally Posted by *the9quad*
> 
> 1440p 2xMSAA/FXAA low the rest on Ultra, i get 100% GPU usage in BF4 on all 3 290x's.


So you must be getting like 200 fps in BF4 then?


----------



## the9quad

Quote:


> Originally Posted by *ZealotKi11er*
> 
> So you must be getting like 200 fps in BF4 then?


Sometimes. Here's a quick bench. Settings: 1440p, all Ultra, 4xMSAA with FXAA on low, FOV 90. My monitor is 120Hz, so I usually cap the frames, and it doesn't seem to have a problem maintaining 120fps. But as you can see, if I run the frames uncapped it hits the 200s most of the time. This was run using Mantle as the renderer. I will do another with DX11 and see if it makes a difference.


















Here's DX11. Huge difference! lol:


----------



## kizwan

Happy New Year 2016!

















My 2015 ended with weird conversation at local forum.


----------



## MEC-777

Quote:


> Originally Posted by *kizwan*
> 
> Happy New Year 2016!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My 2015 ended with weird conversation at local forum.


Still just under 6 hours to go here.


----------



## JourneymanMike

Quote:


> Originally Posted by *mus1mus*
> 
> An hour to go yet,
> 
> HAPPY NEW YEAR!
> everyone.


Holy Smokes Man! I still have 6 hours to go...

How is the future? Has Global Warming destroyed the Earth yet?


----------



## HOMECINEMA-PC

Happies 2016 ...... its 10.30am here , new years day and the neighbor is mowing the lawn ........









Woke me up


----------



## JourneymanMike

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Happies 2016 ...... its 10.30am here , new years day and the neighbor is mowing the lawn ........
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Woke me up


I've got 5hours and 15 minutes to go!

So, how's the future? Has Global Warming destroyed the earth yet?


----------



## HOMECINEMA-PC

I farted and helped contribute .........



Gonna do some gaming today with the 3x290 chiller on and post some pics in 4k and 7680x1440p


----------



## JourneymanMike

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> *I farted and helped contribute .........
> 
> 
> 
> 
> 
> 
> 
> 
> *
> 
> 
> *Gonna do some gaming today with the 3x290 chiller* on and post some pics in 4k and 7680x1440p


I believe it may be us guys with the overclocked AMD GPUs who are the great contributors to destroying the Earth!

Farting, especially my dog's farts, is also destroying the planet!


----------



## Tivan

Quote:


> Originally Posted by *JourneymanMike*
> 
> I've got 5hours and 15 minutes to go!
> 
> So, how's the future? Has Global Warming destroyed the earth yet?


Can't really destroy the earth with global warming. Just make it a less nice place to live on for humans (and a couple of other species, which will conveniently just go extinct so they can't complain).









At least over here in Germany, we pay for more investment in sustainable energy by consuming more energy! So using more electricity here will just bring us sooner to the day when using electricity is mostly free and has no environmental impact.









So I wouldn't worry too much about these 290s... depending on where you use em!


----------



## kizwan

Quote:


> Originally Posted by *MEC-777*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Happy New Year 2016!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My 2015 ended with weird conversation at local forum.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Still just under 6 hours to go here.

Spoiler alert: 2016 is awesome! ...nope, not really...lol
Quote:


> Originally Posted by *JourneymanMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> An hour to go yet,
> 
> HAPPY NEW YEAR!
> everyone.
> 
> 
> 
> Holy Smokes Man! I still have 6 hours to go...
> 
> How is the future? Has Global Warming destroyed the Earth yet? :

Not yet unfortunately.








Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Happies 2016 ...... its 10.30am here , new years day and the neighbor is mowing the lawn ........
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Woke me up


Turn up those 5.1 speakers for your neighbor's "entertainment".


----------



## mus1mus

It's 1100H of 01-01-2016 round here. Clear skies. So it's very bright indeed.









Astrology says 2016 will be financially great for "RATS" (which I am), so I will just have to lay back and wait for them monies.


----------



## spyshagg

Quote:


> Originally Posted by *mus1mus*
> 
> Okay, if you are into BIOS modding and have been initiated to memory timing tweaking, here's my recipe:
> 
> 77 71 33 20 00 00 00 00 29 39 57 *26 50* 55 09 0E 26 1D 17 03 00 68 C2 00 22 AA 1C 08 54 0C 14 20 AA 89 00 A6 00 00 07 C0 0F 0A 18 1D 31 1E 27 10
> 
> These are the series of HEX values for the memory timings.
> 
> The idea is to tighten the strap you are testing. These Hawaii GPUs have the following straps:
> 
> 1750
> 1625
> 1500
> 1375
> 1250
> 1000, etc. It follows that 1626 belongs to the 1750 strap and 1599 belongs to the 1625 strap. And so on.
> 
> Now, if you can clock your memory up to, say, 1650, you can substitute the timings on that strap with the timings from the previous strap. So you can take the 1625 strap timings and substitute them into the 1750 strap. Do that until you find the lowest (tightest) strap timings you can use for your needs, or until instabilities start to ruin performance.
> 
> Now, if you look at the timings above, two values are in bold. These are the final recipe.
> 
> On my BIOS (sorry, can't put one up at the moment as I am not on my rig now, nor do I have a copy I can access while on mobile) I use the 1250 strap timings from 1375 all the way up to 1750. That means I don't lose performance going higher past a certain strap boundary.
> 
> The tricky part is that using the 1000 strap timings causes too many issues. So does using the Stilt's 1250 timings. So I ended up looking for a way to tighten my 1250 a bit.
> 
> So far, 2 HEX values have been tested that give me that. The ones in bold can be replaced by the values from the 1000 strap in the same positions.
> 
> This is a little tricky if you are not into modding, so yeah. Let me know, or Fyzzz, if you have further questions.


Hi

What are the hex values that precede the 1000 strap?
Edit: found them
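For anyone else following along, the strap boundaries mus1mus describes can be sketched in a few lines of Python (the strap list comes straight from the quote; `strap_for` is just an illustrative helper, not a real tool):

```python
# Hawaii memory-timing straps (MHz), per the post above.
STRAPS = [1000, 1250, 1375, 1500, 1625, 1750]

def strap_for(mem_clock):
    """Return the strap whose timings apply at a given memory clock:
    the lowest strap value >= the clock."""
    for strap in STRAPS:
        if mem_clock <= strap:
            return strap
    raise ValueError("clock is above the highest strap")

print(strap_for(1626))  # 1750 -- just past the 1625 boundary
print(strap_for(1599))  # 1625
```

"Tightening" then means copying a lower strap's timing bytes into the strap your target clock actually lands on.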


----------



## scottydsntknow

So, I was having horrible cooling issues with my overclocked water-cooled 290. I'm using an H50, which is a starter water cooler, but it was still getting way too hot. I replaced the single stock Corsair fan with two Cooler Master Evo fans in push/pull and the core temps tanked down into the low 60s under load. VRM1 still pushes into the 80s under stress, but that's nowhere close to what it was. I think I will have to replace the NZXT fan with a Noctua or something to get my VRM temps in line. And yes, I do have the Gelid heatsinks.


----------



## JourneymanMike

Quote:


> Originally Posted by *scottydsntknow*
> 
> So, was having horrible cooling issues with my overclocked water cooled 290. Using a H50 which is a starter water cooler but was still getting way too hot. Replaced the single stock Corsair fan with two Cooler Master Evo fans in a push/pull and the core temps tanked down into the low 60s under load. VRM1 still pushes into the 80s under stress but that's nowhere close to what it was. I think I will have to replace the NZXT fan with a Noctua or something to get my VRM temps in line. And yes I do have the Gelid heatsinks.


You're spending all this time & money on AIO solutions, plus some half-a$$ed switching of fans...

Why not just save some time & money, and get a Full Block GPU cooler w/ backplate...

You'll be very much happier...

Just my opinion, based on personal experience...

Mike


----------



## Cyber Locc

Quote:


> Originally Posted by *scottydsntknow*
> 
> So, was having horrible cooling issues with my overclocked water cooled 290. Using a H50 which is a starter water cooler but was still getting way too hot. Replaced the single stock Corsair fan with two Cooler Master Evo fans in a push/pull and the core temps tanked down into the low 60s under load. VRM1 still pushes into the 80s under stress but that's nowhere close to what it was. I think I will have to replace the NZXT fan with a Noctua or something to get my VRM temps in line. And yes I do have the Gelid heatsinks.


Your rad is not cooling the VRMs with that cooler at all. 80C is fine for the VRMs, 60C is fine for the GPU, and switching fans isn't going to make a huge difference. If you want a huge difference and your VRM temps down, get a block.


----------



## JourneymanMike

Quote:


> Originally Posted by *JourneymanMike*
> 
> You're spending all this time & money on AIO solutions, plus some half-a$$ed switching of fans...
> 
> Why not just save some time & money, and *get a Full Cover GPU Block w/ backplate...
> 
> *You'll be very much happier...
> 
> Just my opinion, based on personal experience...
> 
> Mike


Quote:


> Originally Posted by *cyberlocc*
> 
> Your rad is not cooling the VRMs with that cooler at all. 80C is fine for the VRMs, 60C is fine for the GPU, and switching fans isn't going to make a huge difference. If you want a huge difference and your VRM temps down, *get a block*.


This, right here ^^^^^^^^^^^^^^^^^^^^^^^^


----------



## mirzet1976

Something new from MSI - Radeon R9 380









http://www.msi.com/product/graphics-card/#?category=AMD-Radeon%26trade%3B%20R9%20380
http://www.msi.com/product/graphics-card/R9-380-2GD5T-OC.html#hero-overview


----------



## HOMECINEMA-PC

Aio solutions are a scam and are for peeps too scared to watercool properly

Block it and be done with it .

Pump res combo and 1 360 rad per card


----------



## Cyber Locc

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Aio solutions are a scam and are for peeps too scared to watercool properly
> 
> Block it and be done with it .
> 
> Pump res combo and 1 360 rad per card


360.... No way 560 per card and one for your CPU and be done with it.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *cyberlocc*
> 
> 360.... No way 560 per card and one for your CPU and be done with it.


I have twin D5's and 2 360 rads for cards ....... and NO rad for cpu


----------



## scottydsntknow

Lol elitism ftw. Don't feel like spending a metric assload of money on a system with a single 290. That is counterproductive. My setup is now fine with an aio on the gpu and cpu at pretty much max overclocks for both. If I upgrade to a second card and go Intel or Zen next year I'll look into a custom loop then.


----------



## PJFT808

Never got around to posting this but why not. I just did it cause I could lol.



http://imgur.com/3vkqsXg


----------



## battleaxe

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Aio solutions are a scam and are for peeps too scared to watercool properly
> 
> Block it and be done with it .
> 
> Pump res combo and 1 360 rad per card


Or they can be a stepping stone to full blocks, loops, etc... as was for me.


----------



## scottydsntknow

Quote:


> Originally Posted by *battleaxe*
> 
> Or they can be a stepping stone to full blocks, loops, etc... as was for me.


That's what they will be for me. I have a Fractal Define S case which has provisions for mounting a pump and brackets for a reservoir. Custom cooling is my next step but the AIO setup I have now works great.


----------



## HOMECINEMA-PC

Yeah, it depends on the size of the case, ambient temps, and so on and so forth.

I gave em a miss

Had one on my first w/c cpu and it died


----------



## battleaxe

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Yeah it depends on size of case , ambient temps so on and so forth .
> 
> I gave em a miss
> 
> Had one on my first w/c cpu and it died


I seem to just keep building more computers and putting the old AIO's in my old PC's lol... they have their place still. I think I have 6 of them laying around here in different PC's.

I bought all my AIO's on a fire sale back when we were mining with GPU's. I used to use them to keep the GPU cores cool on my miners.


----------



## HOMECINEMA-PC

Use em if u got em eh


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *scottydsntknow*
> 
> *Lol elitism ftw.* Don't feel like spending a metric assload of money on a system with a single 290. That is counterproductive. My setup is now fine with an aio on the gpu and cpu at pretty much max overclocks for both. If I upgrade to a second card and go Intel or Zen next year I'll look into a custom loop then.












No sub ambient cooling running









Add another 5-6c under load


----------



## spyshagg

3x 360 rads with mcp655 and 15 fans on one loop here







stupid crazy low temps on 5ghz i7 and 2x 290x

Its an investment for every build going forward. You only need to change CPU/gpu blocks as required.

Don't really trust aio pumps


----------



## JourneymanMike

Quote:


> Originally Posted by *PJFT808*
> 
> Never got around to posting this but why not. I just did it cause I could lol.
> 
> 
> 
> http://imgur.com/3vkqsXg


Man, that is one $**t load of AIO's!!!









You should get an award for it!!!

Or, you could post it in this forum...

http://www.overclock.net/t/666445/post-your-ghetto-rigging-shenanigans/0_50









Good Work!


----------



## the9quad

Quote:


> Originally Posted by *PJFT808*
> 
> Never got around to posting this but why not. I just did it cause I could lol.
> 
> 
> 
> http://imgur.com/3vkqsXg


----------



## ZealotKi11er

Can't believe how well 2 x 290X handle most games @ 4K.
Crysis 3 Very High 0AA i get 40-50 fps
Assassin's Creed Syndicate High 30-40 fps
BF4 Ultra 0AA 80-110 fps.

I am having problems overclocking my cards right now. I have not overclocked them for games in some time, and now I can't even do 1175MHz without the game failing to start. It's almost like ULPS is ON, but I made sure to turn it off. Could be the custom BIOS I flashed.


----------



## kizwan

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Can't believe how well 2 x 290X handle most games @ 4K.
> Crysis 3 Very High 0AA i get 40-50 fps
> Assassin's Creed Syndicate High 30-40 fps
> BF4 Ultra 0AA 80-110 fps.
> 
> I am having problem overclocking my cards right now. I have not overcloked them for games for some time now and can't even do 1175MHz without the game not even starting. Almost like ULPS is ON but I made sure to turn it off. Could be the custom BIOS I flashed.


You are not alone. Mine don't fare well either when gaming with overclocked cards. They last maybe 15 or 30 minutes in GTA V when overclocked before the whole PC locks up. I have better luck with memory below 1400. There's no difference between the stock and modded BIOS. I think it's heat related, even if the temp is only in the 60s.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *PJFT808*
> 
> Never got around to posting this but why not. I just did it cause I could lol.
> 
> 
> 
> http://imgur.com/3vkqsXg


----------



## fat4l

Guys, I wasn't satisfied with the core temps on my Ares III.....
I even use *Thermal Grizzly Kryonaut* (which is supposed to be the best non-conductive paste) and I was still getting a max of 65C on GPU 1 and 57C on GPU 2 with a water temp of 26C (after 2 runs of Heaven).

Now I used *Coolaboratory Liquid Pro.*
Results are amazing.
GPU 1 went from 65C to 50C.
GPU 2 went from 57C to 43C.

That's a hell of a difference......
......*a 14-15C difference!!!*


----------



## battleaxe

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Can't believe how well 2 x 290X handle most games @ 4K.
> Crysis 3 Very High 0AA i get 40-50 fps
> Assassin's Creed Syndicate High 30-40 fps
> BF4 Ultra 0AA 80-110 fps.
> 
> I am having problem overclocking my cards right now. I have not overcloked them for games for some time now and can't even do 1175MHz without the game not even starting. Almost like ULPS is ON but I made sure to turn it off. Could be the custom BIOS I flashed.


If you are using AB to disable ULPS, don't. For some reason it's buggy as **** right now. Use TRIXX to disable it while AB is still running; then you can turn TRIXX back off and not use it again. I've found this is the only way to make sure ULPS is off with AB right now. No idea why, but this has been my experience, so give it a try if you haven't already. It may fix the issue for you.


----------



## JourneymanMike

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Can't believe how well 2 x 290X handle most games @ 4K.
> Crysis 3 Very High 0AA i get 40-50 fps
> Assassin's Creed Syndicate High 30-40 fps
> BF4 Ultra 0AA 80-110 fps.
> 
> I am having problem overclocking my cards right now. I have not overcloked them for games for some time now and can't even do 1175MHz without the game not even starting. Almost like ULPS is ON but I made sure to turn it off. Could be the custom BIOS I flashed.
> 
> 
> 
> If you are using AB to disable ULPS don't. For some reason its buggy as **** right now. Use TRIXX to disable it while AB is still running, then you can turn TRIXX back off and not use it again. I've found this is the only way to make sure ULPS is off with AB right now. No idea why, but this has been my experience, so give it a try if you haven't already. It may fix the issue for you.

Instead of having two GPU OC Utilities open, why not try...

http://ww2.ulpsconfigurationutility.com/

I haven't had problems with it... I found it here...

http://www.overclock.net/t/1088266/ulps-gui-config-utility-enable-disable/0_50

Good Luck


----------



## battleaxe

Quote:


> Originally Posted by *JourneymanMike*
> 
> Instead of having two GPU OC Utilities open, why not try...
> 
> http://ww2.ulpsconfigurationutility.com/
> 
> I haven't had problems with it... I found it here...
> 
> http://www.overclock.net/t/1088266/ulps-gui-config-utility-enable-disable/0_50
> 
> Good Luck


Well, I personally use both from time to time. Just depends on what I am doing. I just happened to notice that TRIXX would disable ULPS while AB would not. I use TRIXX for benching and AB for gaming. I'm guessing others do the same. That's why.


----------



## ZealotKi11er

Quote:


> Originally Posted by *battleaxe*
> 
> If you are using AB to disable ULPS don't. For some reason its buggy as **** right now. Use TRIXX to disable it while AB is still running, then you can turn TRIXX back off and not use it again. I've found this is the only way to make sure ULPS is off with AB right now. No idea why, but this has been my experience, so give it a try if you haven't already. It may fix the issue for you.


Yeah, MSI AB clearly does not work. I still see the green light on the back of the second card.


----------



## HOMECINEMA-PC

1.5 hrs of COD AW @ 4k 1000/1250 with chiller and A/C on set at 20c = 18c water temp = Card temps of 25c - 28c under full load











Its been awhile since I dids a screener









I think I will do a better one tomorrow


----------



## mfknjadagr8

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 1.5 hrs of COD AW @ 4k 1000/1250 with chiller and A/C on set at 20c = 18c water temp = Card temps of 25c - 28c under full load
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Its been awhile since I dids a screener
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think I will do a better one tomorrow


how much power does your chiller pull from the wall wattage wise?


----------



## battleaxe

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 1.5 hrs of COD AW @ 4k 1000/1250 with chiller and A/C on set at 20c = 18c water temp = Card temps of 25c - 28c under full load
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Its been awhile since I dids a screener
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I think I will do a better one tomorrow


How low a temp can you set your chiller to? Why aren't you running it colder than 20c?


----------



## scottydsntknow

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> No sub ambient cooling running
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Add another 5-6c under load


If I ran my 290 at those speeds my temps wouldn't be a ton higher. I'm running it at 1175/1450 on an H50 and NZXT bracket. Voltage is cranked. The only difference is I'm running two very good fans on the rad instead of the single regular fan that Corsair ships with the H50.

Last night on 3 screens in Rebel Galaxy I was at 61 core and 81 VRM. Still hot, but well within limits, and if I drop down to stock speeds the temps are very cool.

Like I said, I understand AIO setups are meh compared to a custom loop. I will be going custom loop for my next build, but that won't be for a little bit. I don't extreme game, I don't game at higher than 1080p (on 3 screens) yet, and it's an FX-8350/R9 290 combo, not setting any sort of records lol.


----------



## i2CY

ULPS: How to disable: regedit
https://community.amd.com/thread/176003
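For reference, the regedit method in that thread boils down to zeroing the `EnableUlps` value under the display-adapter class key for each card. A sketch of the .reg fragment (note: the four-digit instance subkey, `0000` here, is an example and varies per system; search the registry for `EnableUlps` and edit every match):

```reg
Windows Registry Editor Version 5.00

; Instance subkey 0000 is an example -- it differs per system.
; Search for "EnableUlps" in regedit and edit every match.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Reboot after merging; set it back to `1` to re-enable ULPS.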


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *mfknjadagr8*
> 
> how much power does your chiller pull from the wall wattage wise?


No idea mate sorry









Keep in mind that the HC-1000 isn't running all the time; it chills to the setpoint and switches off until the water temp goes 3C above the setpoint.

Gonna have to get a watt-o-meter









Quote:


> Originally Posted by *battleaxe*
> 
> How low a temp can you set your chiller to? Why aren't you running it colder than 20c?


I could set it to 4C if I wanted to, but it's summer here in 'Straya and the humidity dictates condensation and setpoint temp.

Even with insulation I've had sweating with a previous mobo, and the watered-down TIM leaked into the CPU socket. Neither the CPU nor the mobo died. Thank you Asus









So I try not to run it any lower than 6-8C below ambient room temps, and if it's raining, no lower than 5C below.











It's run flawlessly for 18 months like this









Quote:


> Originally Posted by *scottydsntknow*
> 
> If I ran my 290 at those speeds my temps aren't exactly a ton higher. I'm running it at 1175/1450 on a H50 and NZXT bracket. Voltage is cranked. Only difference is I'm running two very good fans on the rad instead of the single regular fan that Corsair sends with the H50.
> 
> Last night on 3 screens in Rebel Galaxy I was 61 core and 81 VRM. Still hot but well within limits and if I drop down to stock speeds the temps are very cool.
> 
> Like I said, I understand the AIO setups are meh compared to custom loop. I will be going custom loop for my next build but that won't be for a little bit. I don't extreme game, I don't game at higher than 1080p (on 3 sceens) yet and its a FX8350/R9 290 combo, not setting any sorts of records lol.


Don't get me wrong im no e-lit-ist









That's a pretty sweet result, 25C less from an H50


----------



## JourneymanMike

Quote:


> Originally Posted by *i2CY*
> 
> ULPS: How to disable: regedit
> https://community.amd.com/thread/176003


That's the way to do it!









I wonder when more people will start editing the registry to disable ULPS, instead of screwing around with settings on overclocking software?









I've been using the ULPS Configuration Utility, which edits the registry for you...


----------



## H1vemind

Hey guys, what should be the hottest thing on an air-cooled card, the core or the VRMs?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *H1vemind*
> 
> Hey guys what should be the hottest thing on an air cooled card? The core or the vrm's?


Core fer sure


----------



## H1vemind

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Core fer sure


Ah, thanks. Just wanted to know, as I was trying to find my MSI 290X Gaming's max overclock and hit thermal limits at 1205MHz and +164mV, but the VRMs were a good 5-10C cooler.


----------



## RivaLCF

Hey guys, I'm sure this has been asked often enough already, but I'm here to get final thoughts on flashing an XFX R9 290X Core Edition 4GB to an R9 390X BIOS (if that's what it's called?).

Simply put, is it worth a try, or is there too much risk in doing this? I'm just not sure whether it's a no-no overall or whether some models are better suited to it than others. I heard XFX is the ideal card to have.


----------



## battleaxe

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> No idea mate sorry
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Keep in mind that the HC-1000 its not running all the time its chills to the setpoint and switches off till the water temp goes 3c higher than setpoint
> 
> Gonna have to get a watt o meter
> 
> 
> 
> 
> 
> 
> 
> 
> I could set it to 4c if I wanted to but its summer here in 'Straya and the humidity dictates condensation and set point temp .
> 
> Even with insulation ive had sweating with previous mobo and the watered down tim leaked into the cpu socket . Cpu or mobo didn't die either . Thank you Asus
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So I try not to run it any lower than 6c - 8c below ambient room temps and if its raining no less than 5c
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Its ran flawlessly for 18mths like this
> 
> 
> 
> 
> 
> 
> 
> 
> Don't get me wrong im no e-lit-ist
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That's pretty sweet results 25c less from a H50






You just can't be bothered to put it in a case can you? LOL

epic Ghetto man. Epic.


----------



## Johny Boy

Hi, just wanted to know how the 290/290X is responding to the Crimson drivers. I heard that performance has increased, but I was unable to find any review of these cards with the new drivers in comparison with, say, a GTX 970.
I am getting a Sapphire Tri-X 290 at a good price with 14 months of warranty left, so I just wanted to know whether it would be wise to buy it now, knowing AMD will be launching new GPUs this year.


----------



## Dunan

Quote:


> Originally Posted by *Johny Boy*
> 
> Hi just wanted to know how is 290/290x responding to crimson drivers, I heard that performance has increased but was unable to find any review of this cards with new drivers in comparison say gtx970.
> I am getting Sapphire TriX 290 @ good price with 14 months of warranty left so just wanted to know will it be wise to buy it now knowing AMD will be launching new GPU's this year.


AMD is launching cards 2 years in a row? What?

And Crimson drivers are fine with the 290x


----------



## Roboyto

Quote:


> Originally Posted by *Johny Boy*
> 
> Hi just wanted to know how is 290/290x responding to crimson drivers, I heard that performance has increased but was unable to find any review of this cards with new drivers in comparison say gtx970.
> I am getting Sapphire TriX 290 @ good price with 14 months of warranty left so just wanted to know will it be wise to buy it now knowing AMD will be launching new GPU's this year.


Haven't had any issues so far with the Crimson drivers on both of my PC's running a 290 GPU on Win X.

The 290 and 970 trade blows depending on the title/benchmark at hand. Because the Maxwell architecture has outstanding overclocking headroom, the 970 can often edge out the 290, which is less likely to reach outstanding clocks; the 290X closes these gaps. And even if you do get a fantastic-overclocking 290(X), you have lots of heat to deal with.

I have not compared Crimson to previous drivers yet, but I did do a comparison of a GTX 970 to my 290 a year ago...some things may have changed slightly...but it's a decent shootout nonetheless:

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/34380_20#post_23458320

I'll end up doing a Crimson driver comparison at some point...just haven't gotten around to it.


----------



## battleaxe

Quote:


> Originally Posted by *Johny Boy*
> 
> Hi just wanted to know how is 290/290x responding to crimson drivers, I heard that performance has increased but was unable to find any review of this cards with new drivers in comparison say gtx970.
> I am getting Sapphire TriX 290 @ good price with 14 months of warranty left so just wanted to know will it be wise to buy it now knowing AMD will be launching new GPU's this year.


Crimson drivers seem to be working just fine with my Xfire 290x's. I have had no issues. I've probably logged 20 hours of game time on them.


----------



## battleaxe

Quote:


> Originally Posted by *Roboyto*
> 
> Haven't had any issues so far with the Crimson drivers on both of my PC's running a 290 GPU on Win X.
> 
> The 290 and 970 trade blows depending on the title/benchmark at hand. Due to the Maxwell architecture having outstanding overclock capabilities the 970 can often edge out the 290 since they are less likely to have obtain outstanding clocks; the 290X will close these gaps. And even if you do get a fantastic overclocking 290(X) you have lots of heat to deal with.
> 
> I have not compared Crimson to previous drivers yet, but I did do a comparison of a GTX 970 to my 290 a year ago...some things may have changed slightly...but it's a decent shootout nonetheless:
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/34380_20#post_23458320
> 
> I'll end up doing a Crimson driver comparison at some point...just haven't gotten around to it.


The drivers have advanced pretty significantly during this time. The 290 now trades blows with the 970 with ease, and where the 970 used to be equal to the 290X, the 290X is now trading blows with the 980. Who knew?

Well, we did... but I digress.


----------



## cephelix

@Johny Boy I asked the same question myself a few weeks back without getting an answer, so I decided to take the plunge even after reading about some issues, and I must say I'm impressed. Playing AC Syndicate, fps went from 25-30 on the old Omega driver to almost a solid 60 on the Crimson WHQL.


----------



## Johny Boy

Quote:


> Originally Posted by *Roboyto*
> 
> Haven't had any issues so far with the Crimson drivers on both of my PC's running a 290 GPU on Win X.
> 
> The 290 and 970 trade blows depending on the title/benchmark at hand. Due to the Maxwell architecture having outstanding overclock capabilities the 970 can often edge out the 290 since they are less likely to have obtain outstanding clocks; the 290X will close these gaps. And even if you do get a fantastic overclocking 290(X) you have lots of heat to deal with.
> 
> I have not compared Crimson to previous drivers yet, but I did do a comparison of a GTX 970 to my 290 a year ago...some things may have changed slightly...but it's a decent shootout nonetheless:
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/34380_20#post_23458320
> 
> I'll end up doing a Crimson driver comparison at some point...just haven't gotten around to it.


Wow, that was an awesome analysis. So with the Crimson drivers I guess the gap between the 970 and 290 is even closer. Would you advise someone to buy a 290 over a 290X or a 390/GTX 970 at this time of year, when new GPUs are about to be announced?
I will mostly use it for benching purposes.
+rep for you.
Quote:


> Originally Posted by *battleaxe*
> 
> Crimson drivers seem to be working just fine with my Xfire 290x's. I have had no issues. I've probably logged 20 hours of game time on them.


Nice to hear that. Any stutters or other in-game problems?
Sorry, just wanted to make sure, as resale value in my part of the world is rather low.
Quote:


> Originally Posted by *cephelix*
> 
> @Johny Boy I asked the same question myself a few weeks back without getting an answer, so I decided to take the plunge even after reading about some issues, and I must say, I'm impressed. Playing AC Syndicate, fps went from 25-30 on the old Omega drivers to almost a solid 60 on the Crimson WHQL drivers.


Boom, that's double the fps from the Crimson drivers.

Thanks guys for your replies, will keep an eye on this thread.


----------



## battleaxe

Quote:


> Originally Posted by *Johny Boy*
> 
> Wow, that was an awesome analysis. So with the Crimson drivers, I guess the gap between the 970 and 290 is even closer. Would you advise someone to buy a 290 over a 290X, 390, or GTX 970 at this time of year, when new GPUs are about to be announced?
> I will mostly use it for benching purposes.
> +rep for you.
> Nice to hear that. Any stutters or other in-game problems?
> Sorry, just wanted to make sure, as resale value in my part of the world is rather low.
> Boom, that's double the fps from the Crimson drivers.
> 
> Thanks guys for your replies, will keep an eye on this thread.


If you ever plan to go 4k and crossfire go with a 290/390 or better yet a 290x/390x. Either combination will beat a 970. At 1080p I don't know. I don't use that resolution for anything serious anymore. But even before Crimson the 290 was matching the 970. It has just taken a while for the AMD drivers to mature as they usually do. We can expect the same thing for the Fury series too IMO.


----------



## cephelix

@Johny Boy
Forgot to mention that I'm gaming at 1200p, and I don't know whether it's the newest version of Trixx or the drivers, but I'm finally able to stabilise my 290's OC at 1150/1300/+102mV, where before I'd get a little artifacting in games.
Not regretting my purchase at all :thumbs: coming from a GTX 460 and 570


----------



## Johny Boy

Quote:


> Originally Posted by *battleaxe*
> 
> If you ever plan to go 4k and crossfire go with a 290/390 or better yet a 290x/390x. Either combination will beat a 970. At 1080p I don't know. I don't use that resolution for anything serious anymore. But even before Crimson the 290 was matching the 970. It has just taken a while for the AMD drivers to mature as they usually do. We can expect the same thing for the Fury series too IMO.


Well, I only game at 1080p, and that only occasionally. I just wanted sound advice from more informed members, as I will mainly use it for benching with different setups. Right now I have a GTX 760, which is enough for my needs, but I can get a 290 at a good price, so I thought I'd ask how the new drivers have improved it.
I do plan to move on to 4K, but that's something different from what I intend to use this for.
So the 290 and 390 are almost the same with minor tweaks, right? Or should I just buy a 390X, which bought new would cost me double the price of the 290, or a 980, which would be a further 130% pricier than this used 290?

Quote:


> Originally Posted by *cephelix*
> 
> @Johny Boy
> Forgot to mention that I'm gaming at 1200p, and I don't know whether it's the newest version of Trixx or the drivers, but I'm finally able to stabilise my 290's OC at 1150/1300/+102mV, where before I'd get a little artifacting in games.
> Not regretting my purchase at all :thumbs: coming from a GTX 460 and 570


How much can we expect to OC a 290 over stock, and is it easy to BIOS mod like the GTX cards (kbt/kbt bios mods)?

Thank You.


----------



## diggiddi

Quote:


> Originally Posted by *Johny Boy*
> 
> Well, I only game at 1080p, and that only occasionally. I just wanted sound advice from more informed members, as I will mainly use it for benching with different setups. Right now I have a GTX 760, which is enough for my needs, but I can get a 290 at a good price, so I thought I'd ask how the new drivers have improved it.
> I do plan to move on to 4K, but that's something different from what I intend to use this for.
> So the 290 and 390 are almost the same with minor tweaks, right? Or should I just buy a 390X, which bought new would cost me double the price of the 290, or a 980, which would be a further 130% pricier than this used 290?
> How much can we expect to OC a 290 over stock, and is it easy to BIOS mod like the GTX cards (kbt/kbt bios mods)?
> 
> Thank You.


Just get the 290 since you're rocking 1080p. This is the order I'd recommend them: the 290 first for 1080p up to 2K, then the 390, then the 390X for higher resolutions.
For 4K gaming you'll need Crossfire though; if that's the aim, get a 390 and double up later.


----------



## cephelix

Quote:


> Originally Posted by *Johny Boy*
> 
> How much can we expect to OC a 290 over stock, and is it easy to BIOS mod like the GTX cards (kbt/kbt bios mods)?
> 
> Thank You.


Well, the extent to which you can OC depends on how you manage the core and VRM temps, I suppose. Do take note that I have CLU between the cooler and die, and also 17W/mK Fujipoly thermal pads on both VRM1 and VRM2. With 26C ambients, I'm getting about 75-80C on the core and 65C on VRM1. My card isn't the best, though; there are quite a few here who get 1200MHz core with wonderful temps. As for BIOS flashing, I'll let someone else answer that, as I have no experience with it.


----------



## battleaxe

Quote:


> Originally Posted by *Johny Boy*
> 
> Well I only game @1080p and that too occasionally, just wanted sound advice from more informed members as I will just use it mainly for benching with different setups plus right now I have gtx 760 which is enough for my needs but getting [email protected] good price so thought lets ask how new drivers have improved it.
> I do plan to move on to 4k but that's something different than what I intend to use it for.
> So 290/390 are almost same with minor tweaks right or just buy 390x which bought as in new would cost me double the price of 290 or get 980 which will be further 130% pricier than this used 290.
> How much can we expect to oc 290 over stock and is it easy to bios mod like gtx (kbt/kbt bios mods).
> 
> Thank You.


Quote:


> Originally Posted by *cephelix*
> 
> Well, the extent to which you can OC depends on how you manage the core and VRM temps, I suppose. Do take note that I have CLU between the cooler and die, and also 17W/mK Fujipoly thermal pads on both VRM1 and VRM2. With 26C ambients, I'm getting about 75-80C on the core and 65C on VRM1. My card isn't the best, though; there are quite a few here who get 1200MHz core with wonderful temps. As for BIOS flashing, I'll let someone else answer that, as I have no experience with it.


I'm running a pair of 290Xs in Crossfire, which is great for 1440p or 4K gaming. As mentioned, there are a few cards here that are truly exceptional and can do 1300+ on the core. My weak card does around 1245MHz on the core; the strong one will hit about 1280MHz. I can't get a full block on the strongest card as they don't make them, so I am running regular blocks on the cores and some retrofitted blocks on the VRMs. The cores are usually around 43C with 24C ambient while gaming, even after several hours. The VRMs sit at about 55C in the same test. For benching, my temps don't get this hot even when running very high Vcore using TRIXX. All that said, some do a lot better, staying below 40C with full blocks on the VRMs. I think my core temps are about normal even for full cover blocks, though. I'm probably more limited by rad space than anything else. Some of these guys are sporting a full 360 rad on one 290X, as I understand it.

So getting the core and VRM temps down as low as possible is the goal with OC'ing and benching. mus1mus has a 290 that scores better than any 290X I've ever seen, including mine. I think he holds the current record for the 290 series cards, though I could be wrong on that. His scores trounce any 970, as I understand it.

Modding the BIOS is fairly easy. Maybe not quite as simple as with Nvidia, though I'm not sure, as it's been a while since I flashed my old 670s; from memory they seem about the same in difficulty. I used the AMDFLASH method, which involves a boot drive or flash stick with the BIOS files on it. Fairly straightforward and simple. There's a whole thread dedicated to this method here.


----------



## Roboyto

Quote:


> Originally Posted by *battleaxe*
> 
> The drivers have advanced pretty significantly during this time. The 290 now trades blows with the 970 with ease, and where the 970 used to be equal to the 290X, the 290X is now trading blows with the 980. Who knew?
> 
> Well, we did... but I digress.


It is better to contribute positively than be snarky, especially when I specifically stated that things may have changed since I did my comparison nearly a year ago.

A question for you though: if 'the 970 used to be equal to the 290X', how did my comparison turn out the way it did? Not sure if you read it, but it was essentially a tie for benchmark performance, with the 970 receiving praise for power consumption, operating temps, and more consistent/superior overclocking capabilities on average. This is why I said that the 290X would close the gap created by inferior OC capabilities.

I am on the AMD side of the spectrum...trust me. My contributions to this thread and both my PCs running FC water cooled 290s should attest to that.

A recent (12/22/15) review of the Strix 390 with Crimson shows a result *very similar* to what I experienced a year ago with Omega.

390 @ 1190/1675 ~VS~ 970 @ 1542/2000

http://hardocp.com/article/2015/12/22/asus_r9_390_strix_directcu_iii_video_card_review/2

It is great that our cards still have untapped potential that is being released with drivers over 2 years later. It can be seen with that 390 running a lower core than my 290, and their GTX 970 running at higher speeds than mine did.

1190/1675 are decent clocks for air, and it is somewhat surprising temps were in check running 1350mV and +50% power. This required 70% fan speed with the DC3 cooler, plus ASUS's overall efficiency improvements, which older/other cards do not benefit from. If all Hawaii cards could make 1190/1675 with maxed voltage/power on AIR it would be wonderful, but most won't, or would require excessive fan speed/noise to do so.

I'm actually surprised the DC3 fares so well considering they are still only utilizing 3/5 heatpipes, a problem the 290(X) DC2 cards suffered from more painfully. I'm curious how ridiculously hot the VRMs are running with those settings...



Spoiler: Strix 390 Heatsink















Quote:


> Originally Posted by *Johny Boy*
> 
> Wow, that was an awesome analysis. So with the Crimson drivers, I guess the gap between the 970 and 290 is even closer. Would you advise someone to buy a 290 over a 290X, 390, or GTX 970 at this time of year, when new GPUs are about to be announced?
> I will mostly use it for benching purposes.
> +rep for you.
> 
> Thanks guys for your replies, will keep an eye on this thread.


The 'X' nets you ~5-10% additional performance depending on resolution. 1080p gaming isn't these cards' strongest suit, so you won't likely notice a difference between a 290/390 and their X counterparts.

If you plan on sticking with air cooling, I would say go with a 390. Nearly all the manufacturers have put in extra effort to tame temperatures on the core and VRMs which means less worry/hassle and noise for you to deal with. The 8GB of VRAM won't help you now at 1080P, but for future proofing it certainly won't hurt. A warranty is also always nice with new hardware.

If you plan on going with water, then your options are vast. Picking up a used 290 with/without a waterblock would be your cheapest endeavor.

The 290 in my HTPC I grabbed off our Marketplace for $250 with a full cover block, and then I spent a little $ to make a small water loop for it. I'm quite pleased with its performance for the investment, considering it collectively cost me less than what I paid for just a 290 GPU 2 years ago: $450.

My little loop:

$50 - XSPC X2O 420 Single Bay res/pump combo
$24 - (6) Bitspower Premium Matte Black 1/4" fitting
$5 - (6) Koolance Chrome Spring Clamps
$5 - 3ft of Tygon 1/4 ID tubing
$35 - XSPC EX240 Radiator (Used OCN Marketplace)

____

$119
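For anyone pricing out a similar budget loop, the itemized list above can be sanity-checked with a quick script (the prices are the ones quoted here, not current market prices; the part names are just labels):

```python
# Tally the budget water loop above; prices in USD as quoted.
parts = {
    "XSPC X2O 420 Single Bay res/pump combo": 50,
    '(6) Bitspower Premium Matte Black 1/4" fittings': 24,
    "(6) Koolance Chrome Spring Clamps": 5,
    '3ft of Tygon 1/4" ID tubing': 5,
    "XSPC EX240 Radiator (used)": 35,
}

total = sum(parts.values())
print(f"Loop total: ${total}")  # Loop total: $119
```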

I also bought the XSPC backplate and stacked some thermal pads up on the rear of VRM1 for added passive cooling. The results collectively are pretty good for a standard 240mm radiator using (2) SilenX Effizio fans in pull that have mediocre static pressure and, most importantly, *can't be heard*.



Spoiler: HTPC - Fractal Node 804












*Looks much better with the DeepCool Captain 240 in there*





After a 1 hour loop of Final Fantasy XIV Benchmark we have the following max temperatures with all stock settings on a reference R9 290:

Core: 52C
VRM1: 44C
VRM2: 48C












It would certainly appear that the thermal pads on the back side of VRM1 do have a solid impact on temperatures. It is very rare to see VRM1 running cooler than VRM2 on these cards. I bet VRM2 would drop a little as well with some pads on the back side.

This card is no 'lottery winner', and I found its best performance ratio to be 1100/1350 +50mV +50% power. (It did bench up to 1200/1350 +175mV +50% power, but the added volts/heat just aren't worth it for my use.)

20 minute loop of Valley 1100/1350 looks like this:

Core: 68C
VRM1: 61C
VRM2: 59C

Certainly not the best temperatures for water, but it is absolutely silent. It would most likely benefit from better radiator placement to enhance cool air intake, but this orientation made installation easiest. Plus, I rarely ever run the OC, as 1080/60 is all I need on my 60" Sharp Aquos.











I'm sure someone is probably thinking why go to all the trouble and run the itty-bitty 3/8" tubing?

The answer is multi-faceted:


Before making the loop seen here, the GPU was plumbed into a Corsair H55 AIO pump & ancient Corsair H100 radiator with the same tubing; the results were lackluster, and not long after installation the pump took a dump.
Smaller size is easier to run/bend in an HTPC
Driving 1080P; OC was not my priority
Already had tubing & (some) fittings
Curiosity of performance
Cost


----------



## cephelix

The cards you guys have are running OCs that I can only dream of! But I can't complain; with graphics tweaks in game, mine runs smooth, and I don't even notice the difference after a while. Maybe my next card should be a Sapphire, as their cards seem to be doing great temp-wise.


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> The cards you guys have are running OCs that I can only dream of! But I can't complain; with graphics tweaks in game, mine runs smooth, and I don't even notice the difference after a while. Maybe my next card should be a Sapphire, as their cards seem to be doing great temp-wise.


I'm sure plenty of people won't agree but...

Compromise is the name of the game sometimes. My main rig drives 3K/5760*1080 with a lonely 290 @ 1200/1500. Are all settings maxed in demanding games? Nope. However, after playing around with settings for a little while I end up finding a good compromise of performance/quality stretched across nearly 5 feet of monitors. And when I'm blowing off Nazi heads, soaring through the air as Batman, or conjuring up fireballs I forget about the fact that I don't have anti-aliasing and shadow quality maxed because I'm driving a Pagani, robbing a bank, or saving the world from certain doom...and it still looks awesome.

Could I benefit from a 2nd GPU? Probably...BUT, the better question is do I want to deal with the proverbial baggage that comes with a 2nd GPU? Definitely not. This setup just works, and has worked *flawlessly* for 2 years without question through 3 operating systems. Couldn't be happier with the way it has served me thus far.

Aaaannnd I'm extra pumped that AMD, or RTG (Radeon Technologies Group), has started to give out bits and pieces of information on our GPUs' FinFET successor: Polaris.

http://www.anandtech.com/show/9886/amd-reveals-polaris-gpu-architecture


----------



## the9quad

I am hoping AMD really pulls out one heck of a card; they could use a big win. Next time I am going for just two cards instead of three, and I hope that combo is 2-3x as powerful as what I have now. I am leaning AMD again, barring a failure on their part to keep up with what Nvidia has. I've been really happy with these 290Xs as far as gaming goes.


----------



## cephelix

@Roboyto I wholeheartedly agree with your assessments. I've only recently started building computers myself. My first rig was an i5-760 and a GTX 460. At that point all my friends were using an Intel/Nvidia combo, and so, not knowing anything, I followed their recommendation, all along reading up on AMD/Intel/Nvidia stuff as well as air vs watercooling. That is when I stumbled across OCN and the people here.

Reading up more, I finally decided on the 290 instead of the Nvidia equivalent, since I really didn't feel like parting with one of my kidneys to pay for a graphics card. Lol. I've also kept coming back to the idea of CFX, but have always hesitated for the reasons you mentioned. I have realised through trial and error that even though graphics play a part in enjoying a game, the storyline is what matters more. To this day, my favourite game is Legacy of Kain: Soul Reaver. Which reminds me, I should look into whether there's a PC port of it.

I was sceptical at first whether the 290 would last, as it was my first AMD card, but Crimson and you guys have shown me that it will. Now I can sit back and comfortably enjoy my system. At least till the next AMD card launches. Heh

@the9quad Yeah, they really do need a win, but seeing as how they're the smaller of the two companies, I suppose it'll be somewhat more difficult for them. And as Roboyto mentioned, exciting times with the Polaris announcements, though I don't know enough about the processes/tech to make sense of it all.


----------



## battleaxe

Quote:


> Originally Posted by *Roboyto*
> 
> It is better to contribute positively than be snarky, especially when I specifically stated that things may have changed since I did my comparison nearly a year ago.
> 
> A question for you though: if 'the 970 used to be equal to the 290X', how did my comparison turn out the way it did? Not sure if you read it, but it was essentially a tie for benchmark performance, with the 970 receiving praise for power consumption, operating temps, and more consistent/superior overclocking capabilities on average. This is why I said that the 290X would close the gap created by inferior OC capabilities.
> 
> I am on the AMD side of the spectrum...trust me. My contributions to this thread and both my PC's running FC water cooled 290s should attest to that.
> 
> [...]

I wasn't trying to be 'snarky' at all. You may have misinterpreted my post. I have one of the best 970 cards around so I also wasn't disagreeing with you as I'm well aware of how they compare to both a 290 and 290x. No worries.


----------



## kizwan

Well, Hawaii was my first attempt at multi-GPU, and I don't regret it at all. All the games that I play support Crossfire, and so far I'm very happy with the scaling.
Quote:


> Originally Posted by *battleaxe*
> 
> I wasn't trying to be 'snarky' at all. You may have misinterpreted my post. I have one of the best 970 cards around so I also wasn't disagreeing with you as I'm well aware of how they compare to both a 290 and 290x. No worries.


Nah, your previous post didn't look snarky at all. I remember many reviews back then showed the 970 trading blows with the 290X, but it wasn't long after that, with driver improvements, that the 970 became closer to the 290 than the 290X. By then, however, the 'impression' that the 970 is equal to the 290X had basically been permanently imprinted in everyone's mind. I ran a couple of benches myself with my cards and compared them with results posted in the 970 thread, which also confirmed this.


----------



## freek2005

What is default (stock) GPU voltage for reference R9 290?


----------



## Roboyto

Quote:


> Originally Posted by *freek2005*
> 
> What is default (stock) GPU voltage for reference R9 290?


Under load, it should be somewhere around 1.211V.

This is what my reference PowerColor "OC" card registered at 975/1250.

It can be seen in the GPU-Z screenshot embedded here:

"*Kraken G10 + Antec 620 + SilenX Effizio + Gelid R9 290 Heatsink Upgrade Kit"*

http://www.overclock.net/t/1478544/the-build-formerly-known-as-the-devil-inside-a-gaming-htpc-for-my-wife-4770k-corsair-250d-powercolor-r9-290/20_20#post_22214255


----------



## fat4l

Quote:


> Originally Posted by *freek2005*
> 
> What is default (stock) GPU voltage for reference R9 290?


It differs from card to card, usually from 1.19-1.25V.
You can run the EVV program to tell you what your default voltage is.


----------



## mfknjadagr8

Quote:


> Originally Posted by *fat4l*
> 
> It differs from card to card.
> Usually from 1.19-1.25v.
> You can run EVV program to tell you what is your default voltage.


Both of mine run around 1.1V at idle and 1.24V at full load with the stock BIOS. I thought 1.24 was kind of a lot, but then I've seen people here running near 1.4 with BIOS edits, so I wasn't so scared. They never break 60C on the cores anyway.


----------



## fat4l

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Both of mine run around 1.1V at idle and 1.24V at full load with the stock BIOS. I thought 1.24 was kind of a lot, but then I've seen people here running near 1.4 with BIOS edits, so I wasn't so scared. They never break 60C on the cores anyway.


Run this. It will tell you your 3D voltage.
Link: The Stilt's VID APP


----------



## Evil-Mobo

Guys, if you were going to buy a 290X "today", what would you look for?

I want to put this under water, so that's why not a 390; blocks are hard to find for those, and I've been there and done that...

Thanks
EM


----------



## sinnedone

Quote:


> Originally Posted by *Evil-Mobo*
> 
> Guys, if you were going to buy a 290X "today", what would you look for?
> 
> I want to put this under water, so that's why not a 390; blocks are hard to find for those, and I've been there and done that...
> 
> Thanks
> EM


Used, with Hynix memory.

There have been board revisions etc., so you might need to watch out for that if you are planning on installing a waterblock.


----------



## kizwan

Quote:


> Originally Posted by *Evil-Mobo*
> 
> Guys, if you were going to buy a 290X "today", what would you look for?
> 
> I want to put this under water, so that's why not a 390; blocks are hard to find for those, and I've been there and done that...
> 
> Thanks
> EM


You want to look for reference cards. Non-reference cards with blocks available would be the PowerColor PCS+ (true for both 290/290X & 390/390X) & the ASUS DC2 (likewise). I'm pretty sure there's no revision of these two non-reference cards that renders them incompatible with their waterblocks (the EK 290X SE & EK 290X DC2 respectively), but you should double-check anyway.


----------



## Shweller

You can have mine. I am completely disappointed with AMD and their products. Screen tearing even with my FreeSync monitor. Lack of Crossfire support. Frame stuttering, and there is always something wrong with something. It's not the hardware, just the crappy software.


----------



## Evil-Mobo

Quote:


> Originally Posted by *kizwan*
> 
> You want to look for reference cards. Non-reference cards with blocks available would be the PowerColor PCS+ (true for both 290/290X & 390/390X) & the ASUS DC2 (likewise). I'm pretty sure there's no revision of these two non-reference cards that renders them incompatible with their waterblocks (the EK 290X SE & EK 290X DC2 respectively), but you should double-check anyway.


Thanks. Coincidentally, I'm eyeballing a couple of DCII OC 290Xs... let's see what happens.


----------



## mus1mus

Just got meself a reference VTX3D R9 290X.

Found it in a local store for less than used cards go for locally, as the store was clearing them out of inventory.

Clocks 1275/1625 on Elpida memory.
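For context on what a 1625MHz memory overclock means: Hawaii's 512-bit bus combined with GDDR5's 4x data-rate multiplier means peak bandwidth scales directly with the command clock that GPU-Z reports. A rough back-of-the-envelope sketch (the 4x multiplier and 512-bit width are the standard 290X figures; this is theoretical peak, not measured throughput):

```python
# Peak theoretical memory bandwidth for a Hawaii (290/290X) card.
BUS_WIDTH_BITS = 512   # Hawaii memory bus width
GDDR5_MULT = 4         # GDDR5 data transfers per command-clock cycle

def bandwidth_gbs(mem_clock_mhz: float) -> float:
    """Peak bandwidth in GB/s from the GPU-Z-reported memory clock."""
    return mem_clock_mhz * GDDR5_MULT * BUS_WIDTH_BITS / 8 / 1000

print(bandwidth_gbs(1250))  # stock 290X: 320.0 GB/s
print(bandwidth_gbs(1625))  # the OC above: 416.0 GB/s
```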


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> Just got meself a reference VTX3D R9 290X.
> 
> Found it in a local store for less than used cards go for locally, as the store was clearing them out of inventory.
> 
> Clocks 1275/1625 on Elpida memory.


Another one... lol

GZ


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> Just got meself a reference VTX3D R9 290X.
> 
> Found it in a local store for less than used cards go for locally, as the store was clearing them out of inventory.
> 
> Clocks 1275/1625 on Elpida memory.


Good grief. Nice.


----------



## mus1mus

Not to mention, on the stock cooler at 1.34V.







Might slap it onto a WB.







and/or pick another one.

It's hard not to pick one over a 390, which costs a little less than 2 reference cards.

It's BNDS, in sneaker-speak.









Managed to get this FS on a 290X + 290 combo.

http://www.3dmark.com/fs/7127031


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> Not to mention, on the stock cooler at 1.34V.
> 
> 
> 
> 
> 
> 
> 
> Might slap it onto a WB.
> 
> 
> 
> 
> 
> 
> 
> and/or pick another one.
> 
> It's hard not to pick one over a 390, which costs a little less than 2 reference cards.
> 
> It's BNDS, in sneaker-speak.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Managed to get this FS on a 290X + 290 combo.
> 
> http://www.3dmark.com/fs/7127031


Nice... nice... makes me tempted to look for another 290X just for the hell of it. I've had my eye on the 390Xs of late. Gonna be a while before 14nm comes out.


----------



## Roboyto

Quote:


> Originally Posted by *Evil-Mobo*
> 
> Guys, if you were going to buy a 290X "today", what would you look for?
> 
> I want to put this under water, so that's why not a 390; blocks are hard to find for those, and I've been there and done that...
> 
> Thanks
> EM


If you are willing to buy EK waterblocks, I believe their Rev 2.0 blocks accommodate some of the non reference designs.

Per EK's site and I quote:

EK-FC R9-290X is a high performance full-cover water block for AMD reference design Radeon R9 290(X) and -390(X) graphics card.

*Rev.2.0 brings compatibility for previously unsupported MSI® and GIGABYTE® model graphics cards.*

https://shop.ekwb.com/ek-fc-r9-290x-acetal-nickel-rev-2-0

This may open up your options a little bit.


----------



## Evil-Mobo

Quote:


> Originally Posted by *Roboyto*
> 
> If you are willing to buy EK waterblocks, I believe their Rev 2.0 blocks accommodate some of the non reference designs.
> 
> Per EK's site and I quote:
> 
> EK-FC R9-290X is a high performance full-cover water block for AMD reference design Radeon R9 290(X) and -390(X) graphics card.
> 
> *Rev.2.0 brings compatibility for previously unsupported MSI® and GIGABYTE® model graphics cards.*
> 
> https://shop.ekwb.com/ek-fc-r9-290x-acetal-nickel-rev-2-0
> 
> This may open up your options a little bit.


Thank you for this information..... I will look into this. Had an issue with a previously purchased XFX 390X.......

Is one brand making a better 290X than the other between the Gigabyte or MSI....?


----------



## Roboyto

Quote:


> Originally Posted by *Evil-Mobo*
> 
> Thank you for this information..... I will look into this. Had an issue with a previously purchased XFX 390X.......
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is one brand making a better 290X than the other between the Gigabyte or MSI....?


Both pretty solid AFAIK. Gigabyte probably the underdog in this arena, but I do like Gigabyte motherboards. Maybe some others can chime in for more helpful feedback.

I currently own 2 XFX 290's. One I purchased new and has been in beast mode since day one. The 2nd I purchased used and that PC is having some issues lately...can't seem to pinpoint the problem. While using the 290 for video I get audio skipping that appears to have ceased since switching to Intel IGP.

Just noticed EK has a compatibility list for those 2.0 blocks:

https://www.ekwb.com/configurator/waterblock/3831109869031

And it hits just about every major manufacturer: ASUS, Club3D, Colorful, Diamond, Gigabyte, HIS, MSI, Powercolor, Sapphire, VisionTek, VTX3D, XFX and Yeston?

Looks like you could buy just about any reference or slightly altered 290(X) and be covered...pun intended


----------



## Cyber Locc

Quote:


> Originally Posted by *Roboyto*
> 
> If you are willing to buy EK waterblocks, I believe their Rev 2.0 blocks accommodate some of the non reference designs.
> 
> Per EK's site and I quote:
> 
> EK-FC R9-290X is a high performance full-cover water block for AMD reference design Radeon R9 290(X) and -390(X) graphics card.
> 
> *Rev.2.0 brings compatibility for previously unsupported MSI® and GIGABYTE® model graphics cards.*
> 
> https://shop.ekwb.com/ek-fc-r9-290x-acetal-nickel-rev-2-0
> 
> This may open up your options a little bit.


Ya, only one issue though: those blocks are EOL, so good luck finding one; I've been looking for a month. Of course I am slightly picky, I need a Nickel/Acetal Rev 2.0, but in my search I have found nada for any of them. Seen a few cards with copper blocks and a few with plexi blocks, but that was the card and block sold together.

If you do find somewhere selling blocks let me know



----------



## battleaxe

Quote:


> Originally Posted by *Roboyto*
> 
> Both pretty solid AFAIK. Gigabyte probably the underdog in this arena, but I do like Gigabyte motherboards. Maybe some others can chime in for more helpful feedback.
> 
> I currently own 2 XFX 290's. One I purchased new and has been in beast mode since day one. The 2nd I purchased used and that PC is having some issues lately...can't seem to pinpoint the problem. While using the 290 for video I get audio skipping that appears to have ceased since switching to Intel IGP.
> 
> Just noticed EK has a compatibility list for those 2.0 blocks:
> 
> https://www.ekwb.com/configurator/waterblock/3831109869031
> 
> And it hits just about every major manufacturer: ASUS, Club3D, Colorful, Diamond, Gigabyte, HIS, MSI, Powercolor, Sapphire, VisionTek, VTX3D, XFX and Yeston?
> 
> Looks like you could buy just about any reference or slightly altered 290(X) and be covered...pun intended


Really wish they made a block for my card. I don't think anyone will now though. Too late in the game.


----------



## Roboyto

Quote:


> Originally Posted by *Cyber Locc*
> 
> Ya only 1 issue though, those blocks are EOL good luck finding one I been looking for a month. Of course I am slightly picky I need a Nickel/Acetal Rev 2.0 however in my search I have found nada for any of them, seen a few cards with copper blocks and a few with plexi blocks but that was the card and block.
> 
> If you do find somewhere selling blocks let me know
> 
> 
> 
> 
> 
> 
> 
> .


EK's site says they can be assembled upon purchase from them.

Quote:


> Originally Posted by *battleaxe*
> 
> Really wish they made a block for my card. I don't think anyone will now though. Too late in the game.


Looks like Sapphire Tri-X and MSI Gaming cards are in that list


----------



## battleaxe

Quote:


> Originally Posted by *Roboyto*
> 
> EK's site says they can be assembled upon purchase from them.
> 
> Looks like Sapphire Tri-X and MSI Gaming cards are in that list


My MSI card is no problem. But the Tri-X is the (new version) of the 290X. No one that I know of makes blocks for them. I've looked everywhere... kinda a bummer. But I also can't justify getting rid of it, as it does just shy of 1300MHz on the core and 1680MHz on the Samsung RAM. I've only seen a few cards that can beat it here (390X and 290X/290). Mus1mus has a 290 that beats every card I have seen, period. I haven't had any other 290/290X cards that were even close to this new-version 290X. So I'm stuck for now it seems.


----------



## Roboyto

Quote:


> Originally Posted by *battleaxe*
> 
> My MSI card is no problem. But the Tri-X is the (new version) of 290x. No-one that I know of makes them. I've looked everywhere... kinda a bummer. But I also can't justify getting rid of it as it does just shy of 1300mhz on the core 1680mhz on the Samsung RAM. I've only seen a few cards that can beat it here (390x and 290x/290). Mus1Mus has a 290 that beats every card I have seen period. I haven't had any other 290/290x cards that were even close to this new version 290x. So I'm stuck for now it seems.


That's where my 290 ends up as well ~1275+ / 1700 with Hynix

I take it you have one of the 11227-13 or 11227-14 cards then, with the upgraded VRMs?


----------



## battleaxe

Quote:


> Originally Posted by *Roboyto*
> 
> That's where my 290 ends up as well ~1275+ / 1700 with Hynix
> 
> I take it you have one of the 11227-13 or 11227-14 cards then with upgrade VRMs?


Its the 100361-4L version. Here

I'd love it if I could find a full cover block. I'm not optimistic though.

The only positive is that its DX12, so I can pair it with a 390x and still be compatible with the new games coming out.


----------



## Cyber Locc

Quote:


> Originally Posted by *Roboyto*
> 
> EK's site says they can be assembled upon purchase from them.
> 
> Looks like Sapphire Tri-X and MSI Gaming cards are in that list


Would love to know where you see that at; 290x reference blocks are not even listed in their shop.


----------



## Roboyto

Quote:


> Originally Posted by *battleaxe*
> 
> Its the 100361-4L version. Here
> 
> I'd love it if I could find a full cover block. I'm not optimistic though.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The only positive is that its DX12, so I can pair it with a 390x and still be compatible with the new games coming out.


So many different versions/model #s of the same card...enough to make your head spin


----------



## Roboyto

Quote:


> Originally Posted by *Cyber Locc*
> 
> Would love to know where you see that at, 290x reference blocks are not even listed in there shop.




Hovering mouse over the ? says "This item is currently out of stock but will be assembled on short notice. Ordering this item will not cause major shipping delays."

https://shop.ekwb.com/ek-fc-r9-290x-acetal-nickel-rev-2-0


----------



## Cyber Locc

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> 
> Hovering mouse over the ? says "This item is currently out of stock but will be assembled on short notice. Ordering this item will not cause major shipping delays."
> 
> https://shop.ekwb.com/ek-fc-r9-290x-acetal-nickel-rev-2-0


Ya, now try to add it to cart and watch it tell you no. They also seem to have 2 of that block on their store, one says ready to assemble and one says End Of Life.


----------



## battleaxe

Quote:


> Originally Posted by *Cyber Locc*
> 
> Ya now try to add it to cart and watch it tell you no.


LOL... bloody sellers...


----------



## Roboyto

Quote:


> Originally Posted by *Cyber Locc*
> 
> Ya now try to add it to cart and watch it tell you no.


I wasn't looking to buy one...so I didn't attempt to add it to cart. Pretty dumb that their site says they will assemble one.


----------



## Evil-Mobo

Every time I have ordered something from them that said it could be put together, I was then days later sent an email saying EOL......happened multiple times, so if you want one I would say email them first to check.

Just my $0.02


----------



## Cyber Locc

Quote:


> Originally Posted by *Roboyto*
> 
> I wasn't looking to buy one...so I didn't attempt to add it to cart. Pretty dumb that their site says they will assemble one.


Ya, I never even saw that listing until now; it was never on the page, you have to search for 290x to get it to come up. Anyway, back to the issue: they're EOL, so the only way you're getting one is used, or from a store that has some it never sold.
Quote:


> Originally Posted by *Evil-Mobo*
> 
> Every time I have ordered something from them that said could be put together I was then days later sent an email saying EOL......happened multiple times, so of you want one I would say email them first to see.
> 
> Just my $0.02


Ya, I did that already; they have been EOL for months, they said no go.


----------



## kizwan

Quote:


> Originally Posted by *Evil-Mobo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Roboyto*
> 
> If you are willing to buy EK waterblocks, I believe their Rev 2.0 blocks accommodate some of the non reference designs.
> 
> Per EK's site and I quote:
> 
> EK-FC R9-290X is a high performance full-cover water block for AMD reference design Radeon R9 290(X) and -390(X) graphics card.
> 
> *Rev.2.0 brings compatibility for previously unsupported MSI® and GIGABYTE® model graphics cards.*
> 
> https://shop.ekwb.com/ek-fc-r9-290x-acetal-nickel-rev-2-0
> 
> This may open up your options a little bit.
> 
> 
> 
> Thank you for this information..... I will look into this. Had an issue with a previously purchased XFX 390X.......
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is one brand making a better 290X than the other between the Gigabyte or MSI....?
Click to expand...

Just remember what I replied earlier. The Sapphire Tri-X 290/290X (2nd revision & new edition) & MSI Gaming 290/290X currently available are a newer revision which is no longer compatible with the EK waterblock. Unless you can find the first revision of these cards, you will have better luck with reference cards, the Powercolor PCS+ (EK SE waterblock), or the ASUS DCU2 (EK DCU2 waterblock) cards. Remember, the EK Rev. 2.0 waterblock was introduced around the time people were having issues with the taller/bigger capacitors on MSI Gaming 290/290X cards & the 1st version of the EK reference waterblock.

For the 390/390X, only the Powercolor PCS+ & ASUS DCU2 are compatible with an EK waterblock (EK SE & EK DCU2 waterblock). XFX 390/390X cards also recently moved to a newer-revision PCB which is no longer compatible with the EK waterblock.


----------



## Evil-Mobo

Quote:


> Originally Posted by *kizwan*
> 
> For 390/390X, only Powercolor PCS+ & ASUS DCU2 are compatible with EK waterblock (EK SE & EK DCU2 waterblock).


You're positive on this? Because then I will look for a 390X.......


----------



## Cyber Locc

Quote:


> Originally Posted by *Evil-Mobo*
> 
> You're positive on this? Because then I will look for a 390X.......


Are you in the market for a 390X or a 290X? If the former, then you just need to wait a bit and you will be golden; EK is releasing 390 blocks next month.

I have seen mid February confirmed by a few EK reps on different forums.


----------



## Evil-Mobo

Oh I had no idea.....then I will just wait.

And yes IF I can get a decent block for a good 390X that's what I want to do, because I don't see the Fury/FuryX worth the extra cost. The Nano is tempting for an ITX build, but I have a Z87 mobo and DDR3 on hand already............ so not sure still toying with stuff. My build is all up in the air again thanks to a major F up with a custom case order.......story for another day


----------



## kizwan

Quote:


> Originally Posted by *Evil-Mobo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> For 390/390X, only Powercolor PCS+ & ASUS DCU2 are compatible with EK waterblock (EK SE & EK DCU2 waterblock).
> 
> 
> 
> You're positive on this? Because then I will look for a 390X.......
Click to expand...

Yes, last time I checked they're the ones left that are still compatible. You can try emailing EKWB support.


----------



## Cyber Locc

Quote:


> Originally Posted by *Evil-Mobo*
> 
> Oh I had no idea.....then I will just wait.
> 
> And yes IF I can get a decent block for a good 390X that's what I want to do, because I don't see the Fury/FuryX worth the extra cost. The Nano is tempting for an ITX build, but I have a Z87 mobo and DDR3 on hand already............ so not sure still toying with stuff. My build is all up in the air again thanks to a major F up with a custom case order.......story for another day


Well honestly, if you can hold out for a bit, the new 14nm GPUs should be rolling in around Augustish, and that would be a lot better than a 390X.


----------



## Evil-Mobo

Quote:


> Originally Posted by *Cyber Locc*
> 
> Well honestly if you can hold out for a bit the New 14nm gpus should be rolling in around Augustish and that would be alot better than a 390x.


Yeah, the issue is waiting that much longer with no PC. Been "building my RIG" since last August............


----------



## kizwan

Quote:


> Originally Posted by *Evil-Mobo*
> 
> Oh I had no idea.....then I will just wait.
> 
> And yes IF I can get a decent block for a good 390X that's what I want to do, because I don't see the Fury/FuryX worth the extra cost. The Nano is tempting for an ITX build, but I have a Z87 mobo and DDR3 on hand already............ so not sure still toying with stuff. My build is all up in the air again thanks to a major F up with a custom case order.......story for another day


As far as I know, the EK rep at OCN has never mentioned this at all.


----------



## Cyber Locc

Quote:


> Originally Posted by *kizwan*
> 
> As far as I know, EK rep at OCN have never mention about this at all.


Well he may not have, but was he asked? I've seen EK reps say it when asked.

Anyway, it's also on their Twitter page:

https://twitter.com/i/web/status/684725981874155520 So Derick, 390x blocks? Here, let's ping @akira749 too just in case; I know Derick is busy.

Also Akira, what's up with the 290x blocks, can they still be ordered and assembled? I need one really bad

I really think you guys took them off the market a little too soon


----------



## Evil-Mobo

Quote:


> Originally Posted by *Cyber Locc*
> 
> Well he may not have but was he asked? I seen EK reps say it when asked.
> 
> Anyway its also on there twitter page,
> 
> https://twitter.com/i/web/status/684725981874155520 So Derick 390x blocks?


Thanks let's see what the response is


----------



## akira749

Quote:


> Originally Posted by *Cyber Locc*
> 
> Well he may not have but was he asked? I seen EK reps say it when asked.
> 
> Anyway its also on there twitter page,
> 
> https://twitter.com/i/web/status/684725981874155520 So Derick 390x blocks? Here lets ping @akira749 too just in case I know Derick is busy so.
> 
> Also Akira whats up with 290x blocks can they still be ordered and assembled? I need one really bad
> 
> 
> 
> 
> 
> 
> 
> I really think you guys took them off the market a little too soon


Quote:


> Originally Posted by *Evil-Mobo*
> 
> Thanks let's see what the response is


Hi









What's the question here? I read a few posts back but didn't get a clear idea.









Also, @Cyber Locc I have sent an email at the HQ about the 290 blocks.


----------



## Evil-Mobo

Quote:


> Originally Posted by *akira749*
> 
> Hi
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What's the question here? I read a few posts back but didn't get a clear idea.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, @Cyber Locc I have sent an email at the HQ about the 290 blocks.


Ok the question at hand is whether it is true that soon in the near future EKWB will be releasing water blocks for R9 390/390X GPU's.....?


----------



## akira749

Quote:


> Originally Posted by *Evil-Mobo*
> 
> Ok the question at hand is whether it is true that soon in the near future EKWB will be releasing water blocks for R9 390/390X GPU's.....?


Which one? Brand and model


----------



## Evil-Mobo

Quote:


> Originally Posted by *akira749*
> 
> Which one? Brand and model


Anyway, I'm very confused, because I once bought an XFX R9 390 that was listed as compatible, then the block didn't work for it, and now that same GPU is listed on the compatibility list.......

So are there new blocks coming to ADD more R9 390/390X's?

I should not have to and don't want to dremel a brand new block lol........

http://www.overclock.net/t/1569032/xfx-r9-390-dd-removed-from-the-ek-fc-r9-290x-v2-ca-cooling-configurator


----------



## JourneymanMike

Quote:


> Originally Posted by *Evil-Mobo*
> 
> Every time I have ordered something from them that said could be put together I was then days later sent an email saying EOL......happened multiple times, so of you want one I would say email them first to see.
> 
> Just my $0.02


[wall of 192 ":2cents:" emoticons — images stripped in archiving]

EDIT: I'm bored today!


----------



## battleaxe

Quote:


> Originally Posted by *JourneymanMike*
> 
> [wall of 192 ":2cents:" emoticons — images stripped in archiving]
> 
> EDIT: I'm bored today!


How many cents is that? I'm too lazy to count them.


----------



## Cyber Locc

Quote:


> Originally Posted by *akira749*
> 
> Hi
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What's the question here? I read a few posts back but didn't get a clear idea.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also, @Cyber Locc I have sent an email at the HQ about the 290 blocks.


Thanks very much. The question was: what 390 blocks are we going to see? We know about the MSI, as it's on Twitter, but I have seen a rep mention others. Are you able to tell us what 390 blocks are in the works? A lot of people looking at buying 390s want to block them, so this info is invaluable.


----------



## JourneymanMike

Quote:


> Originally Posted by *battleaxe*
> 
> How many cents is that? I'm too lazy to count them.


There are 197 2 centses (?), so that = 394 cents... I think...


----------



## Cyber Locc

Quote:


> Originally Posted by *JourneymanMike*
> 
> There are 197 2 centses (?), so that = 394 cents... I think...


39 x 5 - 3 = 192, 192 x 2 = 384 (based off quote)

27 x 7 + 3 = 192, 192 x 2 = 384 (based off OP)

There is 384 cents there now what do I win.
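For anyone who'd rather not eyeball the grid, the two readings above really do agree; a quick sketch (the row/column figures are taken straight from the post, not recounted here):

```python
# Count the ":2cents:" emoticons both ways, per the two grid readings quoted above.
from_quote = 39 * 5 - 3   # quote reading: 5 rows of 39, minus 3 short on the last row
from_op = 27 * 7 + 3      # OP reading: 7 rows of 27, plus 3 extra

assert from_quote == from_op == 192  # both readings land on 192 emoticons

total_cents = 192 * 2  # each emoticon is "my $0.02"
print(total_cents)     # 384
```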


----------



## JourneymanMike

Quote:


> Originally Posted by *Cyber Locc*
> 
> *39 x 5 - 3 = 192, 192 x 2 = 384 (based off quote)
> 
> 27 x 7 + 3 = 192, 192 x 2 = 384 (based off OP)
> *
> There is 384 cents there now what do I win.


What the hell are you doin'? Did you recount them, or edit the centses?

If you do win, you win! I'm not going to recount them right now! My poor little brain hurts...


----------



## Cyber Locc

Quote:


> Originally Posted by *JourneymanMike*
> 
> What the hell are you doin'? Did you recount them, or edit the centses?
> 
> If you do win, you win! I'm not going to recount them right now! My poor little brain hurts...


I counted them, lol. You were off; I assume you counted the quote and counted the rows off by one: there's 39 to each row, not 40. I assume you counted 40 and thus came out 5 higher than me. But I counted the quote and the OP multiple times.

See if we were to say 40 to the row with 5 to the column that would be 40 x 5 - 3 = 197, so thats where you went wrong. Does that make Sense how you miscounted the Cents









The answer is 384, now what do I win



----------



## kizwan

Quote:


> Originally Posted by *JourneymanMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Evil-Mobo*
> 
> Every time I have ordered something from them that said could be put together I was then days later sent an email saying EOL......happened multiple times, so of you want one I would say email them first to see.
> 
> Just my $0.02
> 
> [wall of 192 ":2cents:" emoticons — images stripped in archiving]
> 
> EDIT: I'm bored today!
Click to expand...









+ 6% GST


----------



## Cyber Locc

Quote:


> Originally Posted by *kizwan*
> 
> 
> 
> 
> 
> 
> 
> 
> + 6% GST


409.16 What do I win


----------



## battleaxe

What have I done? LOL


----------



## mus1mus

The Exchange Rate plummeted bad.


----------



## JourneymanMike

Quote:


> Originally Posted by *Cyber Locc*
> 
> I counted them lol, You were off, I assume you counted the quote and you counted the row off by 1, there's 39 to each row not 40. I assume you counted 40 and thus you came out 5 higher than me. But I counted the quote and the OP multiple times.
> 
> See if we were to say 40 to the row with 5 to the column that would be 40 x 5 - 3 = 197, so thats where you went wrong. Does that make Sense how you miscounted the Cents
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The answer is 384, now what do I win
> 
> 
> 
> 
> 
> 
> 
> .


This time I came up with 42 x 4 + 32 = 200 x 2 = *400*

I'm half


----------



## AliNT77

What is goin on in this thread ???

There's 192x2 BTW ??


----------



## JourneymanMike

*@Cyber Locc & @AliNT77
*
_*You guys win!

*_


----------



## AliNT77

yeeeeeeeeeeeeeeeeeeeeeeeaaaaaaaaaaahhhh









wait a minute ...
where's my reward ?


----------



## JourneymanMike

Quote:


> Originally Posted by *AliNT77*
> 
> yeeeeeeeeeeeeeeeeeeeeeeeaaaaaaaaaaahhhh
> 
> 
> 
> 
> 
> 
> 
> 
> 
> wait a minute ...
> where's my reward ?


----------



## fat4l

I'm doing some testing of my 1750MHz mem if it is fully stable and it seems so...yay.
1500timings applied.
1050aux voltage.

1200/1750MHz + Crossfire

*13146 GS in FS X!*

http://www.3dmark.com/fs/7157746


----------



## akira749

Quote:


> Originally Posted by *Evil-Mobo*
> 
> Any I'm very confused because I once bought an XFX R9 390 that was listed as compatible, then the block didn't work for it, now that same GPU is listed on the compatibility list.......
> 
> So are there new blocks coming to ADD more R9 390/390X's?
> 
> I should not have to and don't want to dremel a brand new block lol........
> 
> http://www.overclock.net/t/1569032/xfx-r9-390-dd-removed-from-the-ek-fc-r9-290x-v2-ca-cooling-configurator


The problem with XFX is that they changed the caps on their cards, and the new revision doesn't fit anymore.

The new ones with a "Z" on the caps are not compatible. Only those with a "C" are.


----------



## Evil-Mobo

Quote:


> Originally Posted by *akira749*
> 
> The problems with the XFX is that they changed the caps on their cards and the new revision don't fit anymore.
> 
> The new one with a "Z" on the caps are not compatibles. Only those with a "C" are.


So there are no "new" 390/390X blocks coming out then?


----------



## akira749

Quote:


> Originally Posted by *Evil-Mobo*
> 
> So there are no "new" 390/390X blocks coming out then?


Other than the one for the MSI Gaming no.


----------



## Evil-Mobo

Quote:


> Originally Posted by *akira749*
> 
> Other than the one for the MSI Gaming no.


MSI 390 or 390X or both?

Thanks


----------



## akira749

Quote:


> Originally Posted by *Evil-Mobo*
> 
> MSI 390 or 390X or both?
> 
> Thanks


Both


----------



## Evil-Mobo

Quote:


> Originally Posted by *akira749*
> 
> Both


Perfect thanks for the confirmation, much appreciated.


----------



## ZealotKi11er

Quote:


> Originally Posted by *fat4l*
> 
> I'm doing some testing of my 1750MHz mem if it is fully stable and it seems so...yay.
> 1500timings applied.
> 1050aux voltage.
> 
> 1200/1750MHz + Crossfire
> 
> *13146 GS in FS X!*
> 
> http://www.3dmark.com/fs/7157746


http://www.3dmark.com/fs/6693268

290X + 290 @ 1275/1625. I might have to try 1750 because I never used any AUX Voltage.


----------



## MoGTy

I recently bought a cheap Sapphire R9 290X 8GB edition, I'm pretty pleased with the performance however the VRM temps (when overclocking) freak me out. I like to ghetto mod my own stuff but before I start I'd like to know if DIY'ing a small backplate with a heatsink over the VRMs is worth it? I never made a backplate before so I wouldn't know if the back of the PCB has the ability to contribute in cooling those hot buggers.


----------



## fat4l

Quote:


> Originally Posted by *MoGTy*
> 
> I recently bought a cheap Sapphire R9 290X 8GB edition, I'm pretty pleased with the performance however the VRM temps (when overclocking) freak me out. I like to ghetto mod my own stuff but before I kicked off the process I was wondering if DIY'ing a small backplate with a heatsink over the VRMs is worth it? I never made a backplate before so I wouldn't know if the back of the PCB has the ability to contribute in cooling those hot buggers.


Well, just use different pads and that's it.


----------



## MoGTy

Quote:


> Originally Posted by *fat4l*
> 
> Well just use different pads and thats it


I should point out, my graphics card doesn't have a backplate







That's why I'm making one.


----------



## fat4l

Quote:


> Originally Posted by *MoGTy*
> 
> I should point out, my graphics card doesn't have a backplate
> 
> 
> 
> 
> 
> 
> 
> That's why I'm making one.


I don't think it will help you








Change the pads on the VRMs (on the front).

There are threads about it already. You can get -25C easily by doing it.


----------



## cephelix

Quote:


> Originally Posted by *fat4l*
> 
> I don't think it will help you
> 
> 
> 
> 
> 
> 
> 
> 
> Change pads on vrms(on the front)
> 
> Theres threads already about it You can get -25C easily by doing it


I changed the pads on my backplate to the Fujipoly ones and it did drop temps some; can't exactly recall how much. I do agree, though, that changing the pads on the front side would see the biggest drop in VRM temps. I was going for as much of a temperature decrease on air as I could attain, and I wasn't disappointed.
1150 core / 1300 mem / +50 power limit / +102mV using the 15.12 Crimson drivers and the latest version of Sapphire Trixx. CLU on die; 26C ambients nets me mid-to-high 70s C on load.


----------



## MoGTy

Quote:


> Originally Posted by *fat4l*
> 
> I don't think it will help you
> 
> 
> 
> 
> 
> 
> 
> 
> Change pads on vrms(on the front)
> 
> Theres threads already about it You can get -25C easily by doing it


Oh I see :-D so a custom vrm heatsink on the back is pointless?
Quote:


> Originally Posted by *cephelix*
> 
> I changed the pads on my backplate to the fujipoly ones and it did drop temps some. Can't exactly recall how much. I do agree though that changing pads on the front side would see the biggest drop in vrm temps. I was going for as much decrease in temps on air as I could attain and I wasn't disappointed.
> 1150 core/1300mem/+50powerlimit/+102mv using the 15.12 crimsom drivers and latest version of sapphire trixx. Clu on die, 26C ambients nets me mid to high 70C on load


What thickness are they?


----------



## Roboyto

Quote:


> Originally Posted by *fat4l*
> 
> I don't think it will help you
> 
> 
> 
> 
> 
> 
> 
> 
> Change pads on vrms(on the front)
> 
> Theres threads already about it You can get -25C easily by doing it


Quote:


> Originally Posted by *MoGTy*
> 
> I should point out, my graphics card doesn't have a backplate
> 
> 
> 
> 
> 
> 
> 
> That's why I'm making one.


I have a thread for VRM thermal pads; you can get somewhere around a (+/-) 20% drop in load temps: http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures/0_20

The thermal pads on the back can make a little difference. I did it in my media center with an XSPC block/backplate using Fuji Extreme pads. I only added them to the backside of VRM1, which took three 0.5mm layers since that was all I had lying around; if you get thicker pads it would obviously take fewer. The backplate does get noticeably hotter than my other rig with the same block/backplate that doesn't have any pads there.

This card is no 'lottery winner' and I found its best performance ratio to be at 1100/1350 +50mV +50% power. (It did bench up to 1200/1350 +175mV +50% power, but the added volts/heat just aren't worth it for my use.)

It is a mini-loop for the GPU only, using 1/4" ID tubing and a standard-thickness XSPC EX240 rad with 2 fans pulling air. Temps aren't outstanding, but it is whisper quiet under maximum load.

20 minute loop of Valley @ 1100/1350 looks like this with VRM1 well under GPU core temp:

Core: 68C
VRM1: 61C
VRM2: 59C


----------



## fat4l

Quote:


> Originally Posted by *MoGTy*
> 
> Oh I see :-D so a custom vrm heatsink on the back is pointless?
> What thickness are they?


Yes, it is pointless I would say.
The thickness differs, but mostly it is 1mm. You would have to check yours.

Check @Roboyto's thread, and you can also check mine:
http://www.overclock.net/t/1576426/fujipoly-thermal-pads-vrm-insanely-low-temps-must-have-recommended/0_30

This is what I dropped (OC results):
GPU1 VRMs
VRM1 = Δ21C
VRM2 = Δ7C

GPU2 VRMs
VRM1 = Δ34C
VRM2 = Δ17C
(in other words, I dropped 7-34C)


----------



## cephelix

I used 1mm-thick Fujipolys for both the front and back side of VRM1. Overall on load, my VRM1 temps are around 65C max. A lot cooler than my cores.


----------



## diggiddi

Is this a good stock crossfire score? http://www.3dmark.com/3dm/10225336
I got a 14K+ score with a single 290X at 1230/1620, [email protected] 4.6GHz


----------



## kizwan

Quote:


> Originally Posted by *diggiddi*
> 
> Is this a good stock crossfire score ? http://www.3dmark.com/3dm/10225336
> I got 14K+ score with a single 290X at 1230/1620 [email protected] 4.6GHZ


Your crossfire score at stock is OK.


----------



## fat4l

Quote:


> Originally Posted by *diggiddi*
> 
> Is this a good stock crossfire score ? http://www.3dmark.com/3dm/10225336
> I got 14K+ score with a single 290X at 1230/1620 [email protected] 4.6GHZ


Well, my OC setup (1200/1750) is doing a 29000+ graphics score. Not sure about 1080/1250MHz.


----------



## mus1mus

Not yet a SCORER but hella clocker. http://www.3dmark.com/3dm11/10792066


----------



## diggiddi

Quote:


> Originally Posted by *kizwan*
> 
> Your crossfire score at stock is OK.


Cool
Quote:


> Originally Posted by *fat4l*
> 
> Well my OC setup (1200/1750) is doing 29000+ graphics score. Not sure about 1080/1250MHz ..


Unfortunately my PSU can't feed those beasts, so I'm stuck at stock speeds for now; looking to pick up a 1600W LEPA and trifire in the future


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> Not yet a SCORER but hella clocker. http://www.3dmark.com/3dm11/10792066


lol...DPM7 volts ?


----------



## mfknjadagr8

Quote:


> Originally Posted by *mus1mus*
> 
> Not yet a SCORER but hella clocker. http://www.3dmark.com/3dm11/10792066


nice core you crazy fool you


----------



## mus1mus

Quote:


> Originally Posted by *fat4l*
> 
> lol...DPM7 volts ?


Funny you asked. I can't run the EVV app.









Installed a fresh OS to try doing that. But ASIC is 75-ish and VID might be over 1.175-ish. Me guess.

Quote:


> Originally Posted by *mfknjadagr8*
> 
> nice core you crazy fool you


I am.









1350/1625 is more stable on this card than my 290, BTW, and requires the same 1.4+ volts.


----------



## kizwan

@mus1mus, did you use PT1 on that card?


----------



## mus1mus

No other way sir. And Trixx +200 at the very least.


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> No other way sir. And Trixx +200 at the very least.


How much offset voltage did you feed the card for those clocks? And what was your ambient at the time?


----------



## mus1mus

+400, or about 1.5V under load at over 1350/1625.


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> + 400 or about 1.5 under Load at 1350/1625.


How about ambient temp at that time?


----------



## mus1mus

15C water, maybe. The hottest thing on the card is VRM1. I bet getting it down from 60C would allow a few more MHz.

BUUUTTTT

It scores less than my 290, which artifacts like mad at the same clocks.
http://www.3dmark.com/compare/3dm11/10792089/3dm11/10791943


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> 15C Water maybe. The hottest thing on the card is VRM 1. I bet to get it down from 60C may help more MHz.
> 
> BUUUTTTT
> 
> It scores less than my 290 That artifacts like mad on the same clocks.
> http://www.3dmark.com/compare/3dm11/10792089/3dm11/10791943


15C water. That's what I want to hear.







I should get a water chiller if I don't want to be left behind in benching.









Anyway, +1 FPS increase with Crimson 16.1 driver. Yayyyy!
http://www.3dmark.com/compare/fs/7022885/fs/7166136#


----------



## diggiddi

Better scores with ULPS turned off: 300 more graphics points.

12662 with AMD Radeon R9 290X(2x) and AMD FX-8350

Graphics Score 24016

Physics Score 9276

Combined Score 3167

http://www.3dmark.com/3dm/10230025


----------



## mus1mus

Lightnings right?

Push eem!









http://www.3dmark.com/fs/7127031

290X + 290 Combo.

Edit:

We had a notion back then of a 30K Graphics Score on 2 cards. Seems highly doable.


----------



## greenscobie86

Sooo I'm pondering adding a second reference 290 to my system for X-fire. Pros/cons? My opinion is that they are cheap enough to do it now. I want a better gaming experience on my 3x1 Eyefinity setup (5040x1050).


----------



## AliNT77

Quote:


> Originally Posted by *greenscobie86*
> 
> Sooo I'm pondering adding a second reference 290 to my system for x-fire. Pros/Cons??? My opinion is that they are cheap enough to do it now. Want to get a better experience gaming on my 3x1 Eyefinity setup(5040x1050)


Do you have a reference 290 right now?
Hynix?
If so, try this undervolting trick and see how much it helps with temps and noise on that ref. cooler:

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/40170#post_24575831


----------



## greenscobie86

^
Yep, I have a ref. MSI at stock speeds. I'll definitely try the undervolting trick. The noise is not really that bad, to be honest; I have it sitting in an NZXT Phantom 410 case with lots of fans, and the card runs at about 74-79 degrees on full load with a custom fan curve through AB. I was more interested in the sense/benefit of adding another one to improve gaming performance for my Eyefinity setup.


----------



## AliNT77

Quote:


> Originally Posted by *greenscobie86*
> 
> ^
> Yep, I have a ref. MSI at stock speeds. Will definitely try the undervolting trick. The noise is not really that bad to be honest, I have it sitting in an NZXT Phantom 410 case with lots of fans and the card runs at about 74~79 degrees on full load with a custom fan curve through AB. I was more interested in the sense/benefit of adding another one to improve gaming performance for my eyefinity setup.


With that trick, I'm running my 290 24/7 @ 950/1625 with -156mV, and I can decrease the power limit by 32% without ANY throttling at all (voltage on full load: 1.018)

Or 1050/1625 @ -63mV

But my card is under an AIO.
I need someone to test it with the ref. cooler because I'm curious


----------



## mus1mus

Quote:


> Originally Posted by *AliNT77*
> 
> With that trick , i'm running my 290 24/7 @950/1625 with -156mV and i can decrease power limit by 32% without ANY throttling at all ??? ( voltage on full load : 1.018 )
> 
> Or 1050/1625 @-63mv
> 
> But my card is under AIO
> I need someone to test it with ref. Cooler 'cuz im curious ??


Can you send me the specific BIOS and the trick for UVing?

I find it amazing, though, that you were able to dial in 1625 memory even when undervolted. Must be an amazing card to OC as well. Have you tried how far it OCs?


----------



## JourneymanMike

Quote:


> Originally Posted by *greenscobie86*
> 
> ^
> Yep, I have a ref. MSI at stock speeds. Will definitely try the undervolting trick. The noise is not really that bad to be honest, I have it sitting in an NZXT Phantom 410 case with lots of fans and the card runs at about *74~79 degrees on full load* with a custom fan curve through AB. I was more interested in the sense/benefit of adding another one to improve gaming performance for my eyefinity setup.


That's damn good @ full load!









With a ref cooler, mine ran 94C-95C @ full load... Is yours gaming, stress testing, or benchmarking? And with what programs?

The reference 290X is meant to run at 94C...


----------



## greenscobie86

^
That is definitely during gaming. Stress testing, it's closer to 90C. I've got like 7 fans in that case and a cool room in the winter, A/C'd in the summer, hahaha... Granted, the card is at stock clocks; maybe with AliNT77's undervolting I can get it lower.

Also, I guess I'll pick up another reference R9 290 then; jonesing for more power...


----------



## kizwan

Quote:


> Originally Posted by *greenscobie86*
> 
> ^
> Yep, I have a ref. MSI at stock speeds. Will definitely try the undervolting trick. The noise is not really that bad to be honest, I have it sitting in an NZXT Phantom 410 case with lots of fans and the card runs at about 74~79 degrees on full load with a custom fan curve through AB. I was more interested in the sense/benefit of adding another one to improve gaming performance for my eyefinity setup.


Cooling two reference cards will be challenging. It's doable but noisy. The top card will be running hotter than the bottom card, though.


----------



## greenscobie86

^
I figured as much. Hopefully they will still be okay enough not to throttle; might have to run my fan controller on full blast during gaming.
Anyone have experience with an X-fire ref. 290 setup on stock coolers?


----------



## AliNT77

Quote:


> Originally Posted by *mus1mus*
> 
> Can you send me the specific BIOS and the trick on UVing?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I find it amazing though, that you were able to dial a 1625 memory even when Undervolted. Must be an amazing card to OC as well. Have you tried how much it OCs?


Yeah, I'm impressed with my card's memory








I can even run it @ -187mV @ 1625MHz, but the core won't be stable at 947MHz









*The trick:*

Flash it and start UVing on the desktop with Afterburner's monitoring window maximized (100ms update period), stepping down until you see a black screen.

Restart the PC, then set ([the voltage that caused the black screen] + 12mV). Now you can be sure that any remaining instability is because of the core (not the memory).

Start a benchmark (Valley is nice). If you see any artifacts, driver crashes, etc., start overvolting until everything is stable.
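The step-down search above can be sketched as a short loop. This is only a sketch: `stable_at`, the 6mV step size, and the starting offset are illustrative stand-ins for what is really done by hand while watching Afterburner.

```python
def find_minimum_offset(stable_at, start_mv=0, step_mv=6, floor_mv=-300,
                        safety_mv=12):
    """Walk the voltage offset down until instability, then back off.

    stable_at -- callable(offset_mv) -> bool; True means no black screen
                 was observed at that offset (checked by eye in practice).
    safety_mv -- margin added back after the first failure (the +12mV above).
    """
    offset = start_mv
    # Keep stepping down while the next lower offset is still stable.
    while offset - step_mv >= floor_mv and stable_at(offset - step_mv):
        offset -= step_mv
    return offset + safety_mv  # lowest stable offset plus the safety margin

# Toy model: pretend the card black-screens below -156mV.
print(find_minimum_offset(lambda mv: mv >= -156))  # -144
```

Any instability left after settling on the returned offset is then blamed on the core, which is why the benchmark pass comes afterwards.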

here's my modded bios:

Hawaii.zip 43k .zip file


My card's core is not a good clocker, and not a good scorer either;
I hit a stability wall @1145 (+100mV) and 1180 (+200mV).

But it benches @1225 (+200mV) with artifacts.
1230 = very unstable
1240 = not doable


----------



## OneB1t

that seems very nice








need to check it

PS: this is a very old release of HawaiiReader; check the new one


----------



## Riktar54

My XFX DD R9 290 is now one year old and the cooling is starting to go south. When I first got the card, my VRMs seldom hit 90C and the GPU never touched 80C.

To be clear: the temps are while gaming. I am an Elite Dangerous junkie...









Fast forward to today: the VRMs are always over 100C, the GPU is hitting 95C, and (of course) the card starts to throttle back and my frame rates start to drop.

Bummer.









XFX tech support suggested that the thermal paste might be drying out and that I should take off the Double D cooler and clean/re-apply some new thermal paste.

If I am going to go through the trouble of doing that I am thinking I would spring for this:

Arctic Accelero Hybrid III on Amazon

However...

I can't find a single review (in English) about this product. Has anyone here tried this particular unit?

And for future consideration: is the PCB on the 390 the same as the 290, from a reference standpoint?


----------



## OneB1t

@AliNT77: your trick is not working for me







the card is staying at 450MHz memory in 2D and it's unstable in 3D, so I reverted it back
also, the 250MHz 2D core clock is causing a blue screen driver crash for me :-/ (16.1 Crimson, Windows 10)


----------



## AliNT77

Quote:


> Originally Posted by *OneB1t*
> 
> @AliNT77: your trick is not working for me
> 
> 
> 
> 
> 
> 
> 
> card is staying on 450mhz memory in 2D and its unstable in 3D so i reverted it back
> also 250mhz 2D core clock is causing blue screen driver crash for me :-/ (16.1 crimson windows 10)


ahh dammit
















staying @450 and having spikes in frequency is normal

what do you mean by unstable in 3D? what kind of instability?

is it unstable with stock voltage?

can you try one more time, please?

but this time use 800 instead of 450 and don't change the 2D core clock

thanks


----------



## OneB1t

it will stay on 800
by unstable in 3D I mean that after a 3DMark load the whole computer rebooted








this is just not a good way









you can undervolt just 2D (i have 900mV @ DPM0)


this mod is stable for me without issues







(i have 290X reference with hynix memory)


----------



## AliNT77

Quote:


> Originally Posted by *OneB1t*
> 
> it will stay on 800
> by unstable 3D i mean that after 3dmark load whole computer rebooted
> 
> 
> 
> 
> 
> 
> 
> 
> this is just not a good way
> 
> 
> 
> 
> 
> 
> 
> 
> 
> you can undervolt just 2D (i have 900mV @ DPM0)
> 
> 
> this mod is stable for me without issues
> 
> 
> 
> 
> 
> 
> 
> (i have 290X reference with hynix memory)


but what about under load?

we want lower power consumption & temps under load, right?


----------



## OneB1t

you can set whatever voltage you want under load








I have it [email protected], but you can go [email protected] + 700MHz memory (which will be about 85% of stock 290X performance at about half the power usage)


----------



## fyzzz

I messed around with undervolting my 290 a long time ago. Got it down to 0.766V at idle and 1.1V at 1050/1625MHz on load, and that was just some quick testing.


----------



## OneB1t

I think most cards need a higher vcore to keep the memory happy (as the memory controller is part of the core).
For example, I need 1300mV for 1625MHz memory; anything under that will create artifacts or a black screen.

But you can undervolt a lot if you go down on memory clocks, and card performance is not really hurt much by running memory @800MHz or so in 3D.

A card running a 900MHz core and 800MHz memory @1.1V (or even less) will have about 50% power consumption.
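The rough "half power" figure lines up with first-order CMOS dynamic power scaling, P ∝ f·V². A back-of-envelope check (the 1000MHz core / 1250MHz memory / 1.25V stock operating point is an assumed reference here, not a measured one):

```python
def dynamic_power_ratio(f_new, f_old, v_new, v_old):
    # First-order CMOS dynamic power scales as frequency * voltage^2;
    # static/leakage power is ignored in this estimate.
    return (f_new / f_old) * (v_new / v_old) ** 2

# Assumed stock operating point: 1000 MHz core, 1250 MHz memory, 1.25 V.
core_ratio = dynamic_power_ratio(900, 1000, 1.10, 1.25)   # ~0.70 of stock
mem_ratio = dynamic_power_ratio(800, 1250, 1.10, 1.25)    # ~0.50 of stock

print(f"core {core_ratio:.2f}, memory {mem_ratio:.2f}")
```

Core power alone only drops about 30%, so the overall halving depends on the memory clock cut (and reduced leakage at lower voltage) as well; treat it as a ballpark, not a measurement.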


----------



## kizwan

Quote:


> Originally Posted by *greenscobie86*
> 
> ^
> I figured as much. Hopefully they will still be okay enough not to throttle, Might have to run my fan controller on full blast during gaming.
> Anyone have any experience with a X-fire Ref. 290 setup on stock coolers?


I have the experience. That's why I replied to you.







I ran 290 Crossfire with the stock coolers for no less than 2 months before I put them under water. It's noisy because you need to turn the fans up to at least 70% to keep temps below 80C at stock clocks. This is with ambients in the low 30s Celsius, too.


----------



## AliNT77

Quote:


> Originally Posted by *OneB1t*
> 
> i think that most card needs higher vcore to keep memory happy (as memory controller is part of core)
> for example i need 1300mV to 1625mhz memory everything under will create artifacts or black screen
> 
> but you can undervolt alot if you go down with memory clocks and card performance is not really affected by running memory @800mhz or so 3D
> 
> card running 900mhz vcore and 800mhz memory @1.1V (or even less) will have 50% power consumption


oh man, I think you lost the memory-chip lottery badly
















mine hits 1625 with 0.992v














yeah that's under load









BTW sorry it didn't work for you


----------



## OneB1t

but with my undervolt there is such a small amount of heat that I can keep my card under 50C on AIR








and still keep a decent amount of performance (about 9000 graphics score in 3DMark, down from the 12500-13000 at normal settings)


----------



## greenscobie86

Quote:


> Originally Posted by *kizwan*
> 
> I have the experience. That's why I replied to you.
> 
> 
> 
> 
> 
> 
> 
> I ran 290 Crossfire with stock cooler for no less than 2 months, before I put them under water. It's noisy because need to turn up the fan to at least 70% to keep temp below 80C at stock clock. This is with ambient at low 30s Celsius too.


I got ya, thank you. Well, it seems like some watercooling is in my future then, haha.


----------



## AliNT77

Quote:


> Originally Posted by *OneB1t*
> 
> but with my undervolt there is such small amount of heat that i can keep my card under 50C on AIR
> 
> 
> 
> 
> 
> 
> 
> 
> and still keeping decent amount of performance (abour 9000 graphic score in 3dmark down from 12500-13000 which is normal settings)


challenge accepted







Everything @ stock:


24/7 stable gaming profile:


higher performance
lower temps
lower noise
lower power consumption

at the same time


----------



## greenscobie86

^
Yeah I'm definitely doing this...


----------



## diggiddi

Annnd I just got a shutdown running Heaven benchmark


----------



## OneB1t

@AliNT77: I have more than -200mV on my card, so -156mV is still low









you can never achieve such values in Afterburner without modding the BIOS, as Afterburner applies the voltage offset to the 2D clocks as well, so the card will be unstable










132W


----------



## mus1mus

Guys, Where is VRM1 again?


----------



## cephelix

Quote:


> Originally Posted by *mus1mus*
> 
> Guys, Where is VRM1 again?


Seriously? Well, just in case you are, VRM1 is the strip on the right side of the card, nearer the power connectors. These pictures would explain it better


----------



## The Stilt

Quote:


> Originally Posted by *mus1mus*
> 
> Guys, Where is VRM1 again?




Low-side fets marked with a red star (hot). MVDDC VRM has no temperature sensor. The VRM1 & VRM2 NTCs are located adjacent to the phase inductors.


----------



## AliNT77

Quote:


> Originally Posted by *OneB1t*
> 
> @AliNT77: i have more than -200mV on my card so -156mV still low
> 
> 
> 
> 
> 
> 
> 
> 
> 
> you can never achieve such values in afterburner without moding bios as afterburner is aplying voltage offset also to 2D clocks so card will be unstable
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 132W


10600 GS @120w :


----------



## mus1mus

Thanks guys.

And yeah, I am serious. I'm 30 and don't take things seriously.









I am hitting 70+ C on VRM1 on sub 20C Water. Core stays within 40C.

Weird eh?


----------



## cephelix

Quote:


> Originally Posted by *mus1mus*
> 
> Thanks guys.
> 
> And yeah, I am serious. I'm 30 and doesn't take things seriously.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am hitting 70+ C on VRM1 on sub 20C Water. Core stays within 40C.
> 
> Weird eh?


hey! two peas in a pod...i'm 30 and still pulling childish pranks..








oh, those are high temps. Is the thermal pad making proper contact with the block? What thermal pads are you using? Are you pumping tons of voltage through the card?
I have to recommend the Fujipoly Ultra Extreme if you haven't already got them. @Roboyto has a thread on them and they are da bomb...
Keeps my VRM1 cool and I'm on air... highest temp is 65-69C


----------



## mus1mus

You know, in your 30s, you have all the urge to pump things up. In this case, 1.5V









I keep hearing Fujipoly but I can't get them anywhere.


----------



## AliNT77

Quote:


> Originally Posted by *mus1mus*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You know, ar your 30s, you have all the urge to pump things up. In this case, 1.5V
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I keep hearung fujipoly but I cant get them anywhere.


You know, at 16, you have all the urge to pump things down. In my case, 0.984V


----------



## cephelix

Quote:


> Originally Posted by *mus1mus*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You know, ar your 30s, you have all the urge to pump things up. In this case, 1.5V
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I keep hearung fujipoly but I cant get them anywhere.


lol...I've been pumping a lot of things a lot less....I'm sleepy by 11pm...








Performance PCs has them if you're willing to ship from the US. That's where I got mine anyway. And CLU straight from Germany. Though for you, I don't see the need for CLU on die


----------



## mus1mus

Quote:


> Originally Posted by *AliNT77*
> 
> You know , as your 16, you have all the urge to pump things down. in my case , 0.984v
> ??


Good for you. The change of heart will not happen for another 10 or so years.









Quote:


> Originally Posted by *cephelix*
> 
> lol...i've been pumping alot of things alot less....i'm sleepy by 11pm...
> 
> 
> 
> 
> 
> 
> 
> 
> Performance PCs has them if you're willing to ship from the US. That's where I got mine anyway. And CLU straight from Germany. Though for you, I don't see the need for CLU on die


I might try that.

CLU is not really needed.


----------



## cephelix

Quote:


> Originally Posted by *mus1mus*
> 
> Good for you. The change of heart will not happen for another 10 or so years.
> 
> 
> 
> 
> 
> 
> 
> 
> I might try that.
> 
> CLU is notbreally needed.


Definitely not needed. I had to give my card the best in terms of cooling mods: Fujipoly on the front and rear of VRM1 (VRM2 doesn't heat up, staying in the 50C range), CLU on die, and an 1850rpm Gentle Typhoon strapped to the unused PCIe slots to act as extra exhaust. Even then it's insufficient, and my case heats up quite fast during gaming sessions.


----------



## mus1mus

You can always go for a custom/aftermarket cooling solution.









Or play undervolted.


----------



## cephelix

Quote:


> Originally Posted by *mus1mus*
> 
> You can always go for a custom/aftermarket coolung solutions.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Or play undervolted.


Nah, trying to curb my expenses for now. The card performs surprisingly well for me; it'll last me at least through 2016. But my next build will hopefully be balls-to-the-wall watercooling...


----------



## AliNT77

Quote:


> Originally Posted by *cephelix*
> 
> Definitely not needed. I had to give my card the best in terms of cooling mods, fujipoly on the front and rear of vrm1(vrm 2 doesn't heat up, staying in the 50C range) CLU on die, and a 1850rpm gentle typhoon strapped to the unused pcie slots to act as an extra exhaust. even then it's insufficient and my case heats up quite fast during gaming sessions


clocks? voltage? temps? give us some numbers


----------



## cephelix

Quote:


> Originally Posted by *AliNT77*
> 
> clocks? voltage? temps? give us some numbers


Well, first time anyone has asked..lol
My card is nothing impressive, and since it's on air, it is temp limited.
Mine is an MSI R9 290 Gaming 4G with PCB rev 2.2
Core: 1150MHz
Memory: 1300MHz
Aux: +102mV
Power Limit: +50%

In 26C ambients (air conditioned),
max temps on load for core/VRM1/VRM2 are 75C/69C/58C
With the air con turned off, ambients reach 30-31C and I'll usually dial it down to 1100MHz/1450MHz/+50mV/+50% power limit.
It isn't necessary, as even at the higher clocks it's still within the core temp limit, but I just like running it sub-80C with very little impact on fps.


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> Thanks guys.
> 
> And yeah, I am serious. I'm 30 and doesn't take things seriously.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am hitting 70+ C on VRM1 on sub 20C Water. Core stays within 40C.
> 
> Weird eh?


I had similar problem with my card









vrm1 hitting 100C+









(The screenshots were taken before I knew anything about BIOS modding, volts, etc...)

*Before Fujipoly mod:*
~100C max
http://postimg.org/image/igdbwho0z/full/

*After the mod:*
~66C max
http://postimg.org/image/gaefo8v6r/full


This is the difference I achieved:
GPU1_vrms
VRM1=Δ21C
VRM2=Δ7C

GPU2_vrms
VRM1=Δ34C
VRM2=Δ17C

From my point of view, every time I buy a card I'm going to use Fujipoly straight away! It's worth every penny. The sad thing is, I will have to get them shipped from the USA.

BTW, does performancepcs.com ship to the UK?


----------



## cephelix

Quote:


> Originally Posted by *fat4l*
> 
> I had similar problem with my card
> 
> 
> 
> 
> 
> 
> 
> 
> 
> vrm1 hitting 100C+
> 
> 
> 
> 
> 
> 
> 
> 
> 
> (The screens were made before I even knew anything about bios modding and volts etc... )
> 
> *Before Fujipoly mod:*
> ~100C max
> http://postimg.org/image/igdbwho0z/full/
> 
> *
> After the mode:*
> ~66C max
> http://postimg.org/image/gaefo8v6r/full
> 
> 
> This is the difference I achevied:
> GPU1_vrms
> VRM1=Δ21C
> VRM2=Δ7C
> 
> GPU2_vrms
> VRM1=Δ34C
> VRM2=Δ17C
> 
> From my point, everytime I buy a card, I'm going to use fujipoly straight away! Its works every penny. Sad thing is, I will have to get them shipped from the usa.
> 
> BTW, do performancepcs.com ship to the UK ?


Impressive temp drops! I'm gonna be switching out thermal pads for any future cards with fujipoly as well:thumb:


----------



## ZealotKi11er

Quote:


> Originally Posted by *cephelix*
> 
> Impressive temp drops! I'm gonna be switching out thermal pads for any future cards with fujipoly as well:thumb:


I had to do it too. So pathetic of EK. With stock pads VRM1 was hitting 80C lol.


----------



## The Stilt

Just be sure when you work with the MOSFETs that you don't puncture the thermal pad or tape. The MOSFET body is directly connected to the drain, so shorting it to ground with a heatsink or a water block isn't that good of an idea. All of the PCB holes are grounded.


----------



## kizwan

I bought big sheet of Fujipoly Extreme pad. Enough for the next 3 to 4 GPUs upgrade.


----------



## mus1mus

Hmmm. Fujipoly Extremes right?

Thickness?


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> hey! two peas in a pod...i'm 30 and still pulling childish pranks..
> 
> 
> 
> 
> 
> 
> 
> 
> oh, those are high temps..is the thermal pad making proper contact with the block?what thermal pads are you using?are you pumping tons of voltage through the card?
> i have to recommend the fujipoly ultra extreme if you haven't already got them. @Roboyto has a thread on it and they are da bomb....
> Keeps my vrm1 cool and i'm on air... highest temp is 65-69C


Quote:


> Originally Posted by *mus1mus*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You know, in your 30s, you have all the urge to pump things up. In this case, 1.5V
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I keep hearing fujipoly but I can't get them anywhere.


Amazon maybe? http://www.amazon.com/gp/product/B00ZSELP3O/ref=pd_lpo_sbs_dp_ss_3?pf_rd_p=1944687642&pf_rd_s=lpo-top-stripe-1&pf_rd_t=201&pf_rd_i=B00ZSJQUT8&pf_rd_m=ATVPDKIKX0DER&pf_rd_r=1K74XCS4DF1ZTF1CH795

Primochill now carries them too: http://www.primochill.com/product/fujipoly-modsmart-ultra-extreme-xr-m-thermal-pad-100-x-15-x-1-0-thermal-conductivity-17-0-wmk/

Aquatuning US site has them: http://www.aquatuning.us/thermal-pads-und-paste/thermal-pads/19471/alphacool-eisschicht-17w/mk-120x20x1mm-2-stueck-sarcon-xr-m

Seems as though other companies are taking note of our obsession for these glorious thermal pads









You can also try searching for Mod Smart thermal pads, or Sarcon XR-m. Maybe that will help you locate them by you.


----------



## Roboyto

Quote:


> Originally Posted by *mus1mus*
> 
> Hmmm. Fujipoly Extremes right?
> 
> Thickness?


My thread has this table in it



EK suggests using TIM to help make up for their crap pads, which isn't really necessary once you have the good thermal pads in there.


----------



## fat4l

Yep, *XR-m* version is the one you are looking for. 17w/mK


----------



## battleaxe

Quote:


> Originally Posted by *MoGTy*
> 
> I recently bought a cheap Sapphire R9 290X 8GB edition, I'm pretty pleased with the performance however the VRM temps (when overclocking) freak me out. I like to ghetto mod my own stuff but before I start I'd like to know if DIY'ing a small backplate with a heatsink over the VRMs is worth it? I never made a backplate before so I wouldn't know if the back of the PCB has the ability to contribute in cooling those hot buggers.


Very worth it. Check my post at the bottom of this page.


----------



## cephelix

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I had to do it too. So pathetic of EK. With stock pads VRM1 was hitting 80C lol.


80C on water is unacceptable!

@kizwan where do you order yours from??


----------



## mus1mus

This card doesn't seem to like me much. Scores less than my 290.









http://www.3dmark.com/3dm11/10799927


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> This card doesn't seem to like me much. Scores less than my 290.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm11/10799927


I would wager every card is going to do worse than your 290. My 290X is a rockstar and can't match your 290. 1350 core is insane BTW.


----------



## mus1mus

Gotta mod some moarrrrrr.


----------



## cephelix

1350 core?! Life is so unfair... why doesn't god want me to have 1200 core so I can get awesome gaming experiences?! Lol


----------



## mus1mus

Quote:


> Originally Posted by *cephelix*
> 
> 1350 core?!life is so unfair...why doesn't god want me to have 1200core so i can get awesome gaming experiences?! Lol


God loves me more than you for shoor.







No other reasons.

http://www.3dmark.com/3dm11/10799958


----------



## cephelix

Quote:


> Originally Posted by *mus1mus*
> 
> God loves me more than you for shoor.
> 
> 
> 
> 
> 
> 
> 
> No other reasons.
> 
> http://www.3dmark.com/3dm11/10799958


1390 now?! That is just ridiculous. Make it an even 1400??


----------



## rt123

Quote:


> Originally Posted by *mus1mus*
> 
> God loves me more than you for shoor.
> 
> 
> 
> 
> 
> 
> 
> No other reasons.
> 
> http://www.3dmark.com/3dm11/10799958


What cooling...?


----------



## mus1mus

Quote:


> Originally Posted by *cephelix*
> 
> 1390 now?!that is just ridiculous.make it an even 1400??


It freaked out at 1400.
But then, it is scoring less than a 290 that is clocked lower.









Quote:


> Originally Posted by *rt123*
> 
> What cooling...?


Water sir. sub 20C at the moment.


----------



## rt123

Quote:


> Originally Posted by *mus1mus*
> 
> Water sir. sub 20C at the moment.


That's a Damn good card man.









If you don't mind, I have 2 more questions for you.

1) What's the ASIC?
2) Is it a Matrix? (FM link shows an Asus card)


----------



## Bartouille

Your card is probably throttling or something... 1390mhz is sure insane but the score doesn't really come with it.









Unless I'm missing something? This is my friend's 290X run at 1200/1550MHz. Not much lower score (graphics ofc), but MUCH lower clocks. I don't get it.

http://www.3dmark.com/3dm11/10795977


----------



## cephelix

Quote:


> Originally Posted by *mus1mus*
> 
> It freaked out at 1400.
> But then, it is scoring less than a 290 that is clocked lower.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Water sir. sub 20C at the moment.


How are you getting sub-20C ambients? Air conditioned room?
Regardless of it freaking out at 1400, that is a very good OC. I'll need to go for reference cards next time. Was disappointed that the particular revision of my card didn't have a full cover block for it


----------



## mus1mus

Quote:


> Originally Posted by *rt123*
> 
> That's a Damn good card man.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you don't mind, I have 2 more questions for you.
> 
> 1) What's the Asic.?
> 2) Is it a Matrix..? (FM link shows an Asus card)


76ish ASIC.
Reference. (I may be capping the crap out of the power delivery, thus the low scores)

Quote:


> Originally Posted by *Bartouille*
> 
> Your card is probably throttling or something... 1390mhz is sure insane but the score doesn't really come with it.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Unless I'm missing something? This is my friend 290x run 1200/1550mhz. Not much lower score (graphics ofc), but MUCH lower clocks. I don't get it.
> 
> http://www.3dmark.com/3dm11/10795977


I reckon that card has Hynix, which are better scorers.







And you seldom get exceptional core clocks from them, as far as I am aware.
Gotta check the clock during the run to see if it's falling off.

Quote:


> Originally Posted by *cephelix*
> 
> How are you getting sub 20C ambients? Air conditioned room?
> Regardless of it freaking out at 1400, that is a very good oc. I'll need to go for reference cards next time. Was disappointed that the particular revision of my card didn't have a full cover block for it


Industrial AC behind my wall. Meaning, I get the first gush of cold air from it. And 2*360mm with 3k Fans










You can't go wrong with AMD refs. But then again, you play lottery just the same.


----------



## Bartouille

Quote:


> Originally Posted by *mus1mus*
> 
> I reckon that card has Hynix. Which are better scorers.
> 
> 
> 
> 
> 
> 
> 
> 
> Gotta check for the clock during the run if it's falling off.


Damn, didn't know it could make such a difference. Still, 1390MHz is insane.


----------



## cephelix

Quote:


> Originally Posted by *mus1mus*
> 
> Industrial AC behind my wall. Meaning, I get the first gush of cold air from it. And 2*360mm with 3k Fans
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You can't go wrong with AMD refs. But then again, you play lottery just the same.


No wonder... how do you stand such cold temps? The most I could comfortably tolerate without having to bundle up is 26C. As for the lottery bit, that is the case with everything silicon related I suppose. It's just a bummer when I dismantled my card only to see that I had the new PCB rev. And I was so hyped up to put my card under water as well


----------



## rt123

Quote:


> Originally Posted by *mus1mus*
> 
> 76ish ASIC.
> Reference. (I may be capping the crap out of the Power Delivery thus low scores)


Thanks.








+Rep.

AMD reference PCBs are way more sturdy than Nvidia ones. A couple of hardmods should allow you to go higher.


----------



## mus1mus

Quote:


> Originally Posted by *Bartouille*
> 
> Damn, didn't know it could make such a different. Still, 1390mhz is insane.


They do. By a lot.







Check these links.

http://www.3dmark.com/fs/6641212

http://www.3dmark.com/fs/6610752

Clocks don't throttle.







But at 1300/1625


Quote:


> Originally Posted by *cephelix*
> 
> No wonder...how do you stand such cold temps?the most i could comfortably tolerate without having to bundle up is 26C. As for the lottery bit, that is the case with everything silicon related i suppose. It's just a bummer when I dismantled my card only to see that i had the new pcb rev. And I was so hyped up to put my card under water as well


26C is fairly comfortable. Cold by tropical standards. But I seem to have gotten used to this room now.








Quote:


> Originally Posted by *rt123*
> 
> Thanks.
> 
> 
> 
> 
> 
> 
> 
> 
> +Rep.
> 
> AMD reference PCBs are way more sturdy than Nvidia ones. A couple of hardmods should allow you to go higher.


I am just fairly limited by the VRM I reckon.









P.S.: I am already at 1.45ish on the V


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> Hmmm. Fujipoly Extremes right?
> 
> Thickness?


Yes. 1mm thickness. Hmm...I can see the flaw in my plan there.
Quote:


> Originally Posted by *cephelix*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> I had to do it too. So pathetic of EK. With stock pads VRM1 was hitting 80C lol.
> 
> 
> 
> 80C on water is unacceptable!
> 
> @kizwanwhere do you order yours from??

FrozenCPU...before the "crisis". 11 W/mK is more than enough for my Hawaii.



Quote:


> Originally Posted by *cephelix*
> 
> 1350 core?!life is so unfair...why doesn't god want me to have 1200core so i can get awesome gaming experiences?! Lol


You just need to feed more voltage (more than +200mV offset) & sub-ambient cooling, e.g. A/C and/or a water chiller if under water.


----------



## cephelix

Quote:


> Originally Posted by *mus1mus*
> 
> They do. By a lot.
> 
> 
> 
> 
> 
> 
> 
> Check these links.
> 
> http://www.3dmark.com/fs/6641212
> 
> http://www.3dmark.com/fs/6610752
> 
> Clocks don't throttle.
> 
> 
> 
> 
> 
> 
> 
> But at 1300/1625
> 
> 26C is fairly comfortable.Cold by tropical standards. But seem to have gotten used to this room now.
> 
> 
> 
> 
> 
> 
> 
> 
> I am just fairly limited by the VRM I reckon.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P.S.: I am already at 1.45ish on the V


Yeah...but 30 is just so damn hot, esp with the humidity...
Quote:


> Originally Posted by *kizwan*
> 
> Yes. 1mm thickness. Hmm...I can see the flaw in my plan there.
> FrozenCPU...before the "crisis". 11 W/mK is more than enough for my Hawaii.
> 
> 
> You just need to feed more voltage, more than +200mV offset voltage & sub-ambient cooling, e.g. A/C and/or water chiller if underwater.


Ahh, same here then. I thought you ordered it from somewhere within the region. Agree with the moar volts thing, but I'm thermally limited now. If I wasn't, you can be sure I'd go crazy on the volts. Ah well, at least it's serving me well, gaming at 1920 x 1200..
Only way I'd go CFX is to go water.


----------



## Paul17041993

Quote:


> Originally Posted by *kizwan*
> 
> You just need to feed more voltage, more than +200mV offset voltage & sub-ambient cooling, e.g. A/C and/or water chiller if underwater.


VRM/wire mod or can we now get over +200 via software?


----------



## kizwan

Quote:


> Originally Posted by *cephelix*
> 
> Ahh, same here thn. I thought you ordered it from somewhere within the region. Agree with the moar volts thing but thermally limited now. If i wasn't you can be sure i'd go crazy on the volts.ah well, at least it's serving me well, gaming at 1920 x 1200..
> Only way i'd go cfx is to go water.


I found that they have a factory in Malaysia. Tried to contact them but was unsuccessful. They have an office in Singapore too.
Quote:


> Originally Posted by *Paul17041993*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> You just need to feed more voltage, more than +200mV offset voltage & sub-ambient cooling, e.g. A/C and/or water chiller if underwater.
> 
> 
> 
> VRM/wire mod or can we now get over +200 via software?

I personally have never tried going beyond +200mV, but @fyzzz & @mus1mus have gone beyond +200mV with modded BIOSes (modded memory timings), which allow them to run without black screens.


----------



## mus1mus

iTurbo for +400mV Voltage + PT based BIOS to allow more than 1.4 Volts at load.

You need a full cover block at the very least, 20C or lower water temps, a Powerful PSU to ice the game.









VRM limitation will be your draw back.


----------



## cephelix

Quote:


> Originally Posted by *kizwan*
> 
> I found that they have factory in Malaysia. Try to contact them but unsuccessful. They have office at Singapore too.
> I personally have never tried go beyond +200mV but @fyzzz & @mus1mus have go beyond +200mV with modded BIOS (modded memory timings) which allow them to run without black screen.


Yeah they do. On the site they stated to contact them should you want samples, but I never did. Looks like they deal with businesses instead of individual buyers, similar to Nidec


----------



## fyzzz

I couldn't really go above +200mv, since my PSU didn't seem to be able to deliver. It would be fun to benchmark it again with a better PSU, especially now (-7C outside and even -20C some days)


----------



## mus1mus

I would love that ambient









It's not about the Offset though. But the actual Voltage. Software or True Voltage, the gist is the same.

My card is nuts! Purely a show-off. Not a scorer.


----------



## fat4l

1.6v omggggggggg


----------



## JourneymanMike

Quote:


> Originally Posted by *fat4l*
> 
> 1.6v omggggggggg


I couldn't even get my 8350 to boot, on a CVF-Z, over 1.6V


----------



## mfknjadagr8

Quote:


> Originally Posted by *JourneymanMike*
> 
> I couldn't even get my 8350 to boot, on a CVF-Z, over 1,6v :


I've stressed my Saber at 1.65 but I wouldn't run that for an everyday clock...


----------



## rdr09

Testing Crimson one card at a time. So far so good. Saw 477W from the UPS with this run . . .

http://www.3dmark.com/3dm/10268542?

Coming from 15.4 Driver.


----------



## fyzzz

I'm testing my 290 again








http://www.3dmark.com/fs/7195413


----------



## Sgt Bilko

Alrighty guys, so I replaced the cooler on my XFX DD 290X with the one from the 390X and it dropped the temps like a rock...

More testing is needed (I don't have much time to devote to it), but from what I've seen so far XFX did a helluva better job than I originally gave them credit for






And part 2: 








Apologies for the smudge marks on the front of it, but the cooler swap dropped the temps (especially VRM temps) by a substantial amount, so good on XFX for listening to the consumer and improving what needed to be improved


----------



## battleaxe

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Sgt Bilko*
> 
> Alrighty guys, so i replaced the cooler on my XFX DD 290x with the one from the 390x and it dropped the temps like a rock...
> 
> More testing is needed ( i don't have much time to devote to it) but from what I've seen so far XFX did a helluva better job than i originally gave them credit for
> 
> 
> 
> 
> 
> 
> And part 2:
> 
> 
> 
> 
> 
> 
> 
> 
> Apologies for the smudge marks on the front of it but the cooler swap dropped the temps (especially vrm temps) by a substantial amount so good on XFX for listening to the consumer and improving what needed to be improved






Very nice comparison!


----------



## ZealotKi11er

Why testing 3DMark11?


----------



## Sgt Bilko

Quote:


> Originally Posted by *battleaxe*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Alrighty guys, so i replaced the cooler on my XFX DD 290x with the one from the 390x and it dropped the temps like a rock...
> 
> More testing is needed ( i don't have much time to devote to it) but from what I've seen so far XFX did a helluva better job than i originally gave them credit for
> 
> 
> 
> 
> 
> 
> And part 2:
> 
> 
> 
> 
> 
> 
> 
> 
> Apologies for the smudge marks on the front of it but the cooler swap dropped the temps (especially vrm temps) by a substantial amount so good on XFX for listening to the consumer and improving what needed to be improved
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Very nice comparison!

Honestly, all things considered, it was a bit rough and rushed on my part. I was in Denmark when I did that and I'm now back in Australia (the card stayed in Denmark). I really would have liked to give it a proper rundown in more games and benches than what I did, but time was quite short unfortunately.

And if anyone was considering doing this, I can say that everything fits perfectly: the plate for the memory, both core and VRM heatsinks, plus the backplate all just fit like a glove


----------



## mfknjadagr8

So, a question for all you CrossFire and BIOS guys: when running in CrossFire, is it better to have the same BIOS on all cards? The reason I ask is I just noticed both of my cards are XFX cards, but neither of them has an XFX BIOS installed and the two BIOSes are different...


Spoiler: generation by trixxx



CARD 1
Report generated on 01/13/16 15:28:35

Card name: AMD Radeon R9 200 Series
GPU: Hawaii
Device Id: 1002-67B1 : 1682-9295
Die Size: 438 mm²
Bus Interface: PCI-E 3.0 x16 @ x16 1.1
Memory Size: 4096 MB
Memory Type: GDDR5
Memory Bus Width: 512 bit
ROPs: 64
Shaders: 2560 Unified / DirectX 12 (12_0)
Driver Version: 15.301.1201.0 Catalyst 15.8
BIOS Version: 015.039.000.006.003516
BIOS Part Number: 113-C6711100-101_MBA
UEFI Support: No
Clocks: 1000 / 1300 MHz

CARD 2

Report generated on 01/13/16 15:28:19

Card name: AMD Radeon R9 200 Series
GPU: Hawaii
Device Id: 1002-67B1 : 1682-9295
Die Size: 438 mm²
Bus Interface: PCI-E 3.0 x16 @ x16 1.1
Memory Size: 4096 MB
Memory Type: GDDR5
Memory Bus Width: 512 bit
ROPs: 64
Shaders: 2560 Unified / DirectX 12 (12_0)
Driver Version: 15.301.1201.0 Catalyst 15.8
BIOS Version: 015.039.000.007.003523
BIOS Part Number: 113-C6711100-102_MBA
UEFI Support: No
Clocks: 1000 / 1300 MHz



Another interesting thing is it says I'm running the cards on PCI-E 3.0, which my board doesn't support, and DX12, which Windows 7 doesn't support? Or is this a statement of capabilities?


----------



## AliNT77

Can someone teach me how to have the 1250 strap timings on the 1250-1750 straps plz ??

Thx in advance ☺?


----------



## fat4l

Quote:


> Originally Posted by *AliNT77*
> 
> Can someone tech me how to have 1250 strap timing on 1250-1750 straps plz ??
> 
> Thx in advance ☺?


----------



## AliNT77

Quote:


> Originally Posted by *fat4l*


Thx a lot, but how can I find the highlighted numbers? (strap indicators)

Sry for my poor english ?


----------



## mus1mus

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Why testing 3DMark11?


It's quicker.


----------



## AliNT77

Also what about hawaii bios reader?
It has a tab called memory timing right? No way i can do the mod with it?


----------



## ZealotKi11er

Quote:


> Originally Posted by *mus1mus*
> 
> It's quicker.


Lol so less time for card to crash...


----------



## mus1mus

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Lol so less time for card to crash...


Nope. Too lazy to install FS and Catzilla via my Steam.

It's actually longer than FS. And has the option to skip the Demo. So yeah.

Besides, Heaven and/or Valley can be benched at my clocks.









By the way, 3DM11 and FS are two different things when benching. Last time I checked with my 290, I could bench at higher clocks than in FS. But my FS benching BIOS craps out with 3DM11.


----------



## fat4l

Quote:


> Originally Posted by *AliNT77*
> 
> thx a lot but how can i find the highlighted numbers? (Strap indicators)
> 
> Sry for my poor english ?


Well, you need to check which memory type your card has. Then you need to check in the BIOS which "indicator" is for your memory: 00, 01 or 02 (some BIOSes support different memory types, that's why).
Then, once you know all this, you can simply open HxD and search for a hex value of 48e801*00*, 48e801*01* or 48e801*02*. This is the 1250 MHz strap, but you need to know which one of these three is for you (this is the 00, 01 and 02 I was talking about above).

Memory info tool and HxD is here:
Link:- HxD Hex editor
Link:- Memory Info tool

The Memory Info tool will tell you which type of mem you have, for example Hynix AFR or Hynix BFR etc.
Then you open HxD and search for the "text" AFR or BFR and see the order. If your mem type is BFR and it is second, then you need to search for 48e801*01*. If it's first, then 48e801*00*, etc.

GL









Or you can use Hawaii Reader, but you need to copy the timings one by one (imho). That's prolly easier for you









*EDIT://*
I forgot, after all this you need to fix the checksum. Open the modded BIOS in Hawaii Reader and save it again; that will fix the checksum.
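For anyone following along, the search fat4l describes can also be scripted instead of done by hand in HxD. This is a minimal sketch, not a tested tool: the `48e801` marker and the trailing 00/01/02 memory-type byte come from his post, while the tiny synthetic ROM below is just a stand-in for a real BIOS dump you would read from disk.

```python
# Sketch of fat4l's strap search: find the 1250MHz strap entry for a
# given memory-type index in a Hawaii BIOS image. The synthetic "rom"
# bytes are illustrative only; with a real card you would use
# open("yourcard.rom", "rb").read() instead.

def find_strap(rom: bytes, mem_index: int) -> int:
    """Offset of the 1250MHz strap for memory type 00/01/02, or -1."""
    marker = bytes.fromhex("48e801") + bytes([mem_index])
    return rom.find(marker)

def vendor_index(rom: bytes, vendor: bytes) -> int:
    """Guess the 00/01/02 index from the order the vendor strings
    (AFR, BFR, ...) appear in the image, as fat4l suggests."""
    order = sorted((v for v in (b"AFR", b"BFR") if v in rom), key=rom.find)
    return order.index(vendor)

# Fake image: BFR listed first, then AFR, then a strap entry for index 00.
rom = (b"\x00" * 16 + b"BFR" + b"\x00" * 4 + b"AFR" + b"\x00" * 8
       + bytes.fromhex("48e80100") + b"\x00" * 16)

idx = vendor_index(rom, b"BFR")     # BFR appears first -> index 0
print(idx, hex(find_strap(rom, idx)))
```

Against a real dump you would then jump to the reported offset in HxD and edit the timing bytes there.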


----------



## kizwan

Quote:


> Originally Posted by *AliNT77*
> 
> Can someone tech me how to have 1250 strap timing on 1250-1750 straps plz ??
> 
> Thx in advance ☺?


This should help. It will be able to bring you straight to the memory straps.

http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/590#post_24725815


----------



## AliNT77

+rep for you two







thanks a lot









early results:

FS GS 1100/1625 (+25mv core & +50mv AUX) with stock Timings : 12689

FS GS 1100/1625 (+25mv core & +50mv AUX) with 1250 Timings : 12896

FS GS 1100/1625 (+25mv core & +50mv AUX) with 1125 Timings : heavy artifacting
but 947/1625 with -156mv on core & stock AUX was stable and scored less than 1250 timings







weird behavior so no luck with 1125


----------



## fat4l

Quote:


> Originally Posted by *AliNT77*
> 
> +rep for you two
> 
> 
> 
> 
> 
> 
> 
> thanks a lot
> 
> 
> 
> 
> 
> 
> 
> 
> 
> early results:
> 
> FS GS 1100/1625 (+25mv core & +50mv AUX) with stock Timings : 12689
> 
> FS GS 1100/1625 (+25mv core & +50mv AUX) with 1250 Timings : 12896
> 
> FS GS 1100/1625 (+25mv core & +50mv AUX) with 1125 Timings : heavy artifacting
> but 947/1625 with -156mv on core & stock AUX was stable and scored less than 1250 timings
> 
> 
> 
> 
> 
> 
> 
> weird behavior so no luck with 1125


Nice. try higher mem freq







1750?


----------



## AliNT77

Quote:


> Originally Posted by *fat4l*
> 
> Nice. try higher mem freq
> 
> 
> 
> 
> 
> 
> 
> 1750?


tried but no luck

The highest clock for my memory is 1700 and it's not fully stable, so...









my gaming profile right now is: 1100/1625+25mv + 10% + 50mv AUX and it scores 12970 in FS & 67.8 in valley (crimson 16.1 hotfix)
for less demanding games: 950/1625 -156mV -30% = FS11400 Valley60.5


----------



## mus1mus

You might need to add more Voltage for the memory to fly.

But that is against your youthful philosophy right?


----------



## AliNT77

Quote:


> Originally Posted by *mus1mus*
> 
> You might need to add more Voltage for the memory to fly.
> 
> But that is against your youthful philosophy right?


















Tried it right now:
Added 200mv to the core but no luck even for 1700 (now that I'm running 1250 timings)


----------



## mus1mus

Quote:


> Originally Posted by *AliNT77*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> tried it right now:
> added 200mv to the core but no luck even for 1700 (now that im running 1250 timings)


Ohh. So you're on memory modding now. Nice.

Remind me: what make are your memory chips?

If Hynix, then shoot for 1750. Elpidas can shoot for 1625. Unstable at those clocks on tighter timings means go up a step on the straps.

Also, if you are keen enough, try out other ROMs. You might find one that matches your card pretty well. Benching at the same clocks using a different ROM is what I am saying. Then copy those memory timings into your stock ROM.









I'm on a hunt for such a ROM for my 290Xs. These are rather superb clockers but poor scorers. Poor compared to my 290, that is.


----------



## AliNT77

It's Hynix AFR
Gonna try 1125 timings again
Results incoming...


----------



## ZealotKi11er

compare 1500 with 1125 timings to 1625 with 1250 timings.


----------



## AliNT77

So far =

1625 is not stable anymore

1600 seems to be the highest stable clock (1100/1600 +100mv +50mv AUX):
FS: 12959
Up from 12900

Disappointed ?

Gonna try 1500:
FS 12818

Gonna try 1000timings ?


----------



## AliNT77

1000 timings = artifacts in desktop ??


----------



## kizwan

I have two questions.


My primary monitor is connected using an HDMI cable while my secondary monitor is connected using a DVI to HDMI cable (DVI on the card & HDMI on the monitor). My question is, should I use a DVI cable instead (the monitor also has a DVI-D port), or doesn't it matter which cable I use?

Is an MVA monitor suitable for gaming? I mistakenly bought an MVA monitor to replace my faulty one. I purposely wanted a TN monitor because my primary monitor has a TN panel, and I thought the new monitor used TN. I'm asking this because when I did a test run with GTA V, just testing Eyefinity for fun, there were noticeable wavy color transitions (for lack of a better word) on the MVA monitor. Is this because of the slower response time (8ms GtG)?
Quote:


> Originally Posted by *AliNT77*
> 
> So far =
> 
> 1625 not stable no more
> 
> 1600 seems highest stable clock (1100/1600 +100mv +50mv AUX) :
> FS: 12959
> Up from 12900
> 
> Dissapointed ?
> 
> Gonna try 1500:
> FS 12818
> 
> Gonna try 1000timings ?


If the 1250 timings are not stable with 1625MHz & 1750MHz memory clocks, try higher (looser) timings like the 1375 or 1500 timings. You can also try the 1625 timings for a 1750MHz memory clock.


----------



## fat4l

yeah do 1375 timings with 1750 MHz







or even above..2000MHz


----------



## ZealotKi11er

Quote:


> Originally Posted by *fat4l*
> 
> yeah do 1375 timings with 1750 MHz
> 
> 
> 
> 
> 
> 
> 
> or even above..2000MHz


If you can't do 1750 with stock timings, what makes you think you can do it with tighter timings lol?


----------



## fat4l

Quote:


> Originally Posted by *ZealotKi11er*
> 
> If you can't do 1750 with stock timings what makes you think you can do it with tighter timings lol?


Oh, I didn't know that he's unable to get it stable @ stock timings.


----------



## mus1mus

Stock BIOS should not be the basis for top memory clock. Some of them are gimped.

My current cards have a gimped BIOS for Elpida. 1625 is all I can muster on them at stock; 1626 will result in corruption. That is due to a single hex value that they purposely put in to act as a limiter. Changing that value to a good one allows for up to a 1675 MHz memory clock. Even using the good timings from the 1750 strap does that. But that is another story, as the guy can do 1700.


----------



## cephelix

Quote:


> Originally Posted by *kizwan*
> 
> I have two questions.
> 
> 
> My primary monitor is connected using HDMI cable while my secondary monitor is connected using DVI to HDMI (DVI on the card & HDMI on the monitor) cable. My question is should I use DVI cable instead (the monitor also have DVI-D port) or it shouldn't matter whichever cable I use?
> 
> Do the MVA monitor suitable for gaming? I mistakenly buy MVA monitor to replace my faulty monitor. I purposely getting TN monitor because my primary monitor have TN panel. I thought the new monitor use TN panel. I'm asking this because when I test run with GTA V, just testing for fun with Eyefinity, there's noticeable wavy colors transition (for lack of a better word) on the MVA monitor. Is this because the slower response time (8ms GtG)?
> If 1250 timings not stable with 1625MHz & 1750MHz memory clock, try higher timings like 1375 or 1500 timings. You also can try 1625 timings for 1750MHz memory clock.


I think between DVI and HDMI there's very little difference besides the connector, so you should be good. Someone correct me if I'm wrong though.


----------



## AliNT77

Quote:


> Originally Posted by *mus1mus*
> 
> Stock BIOS should not be the basis for top Memory Clock. Some of them are gimped.
> 
> My current cards has a gimped BIOS for Elpida. 1625 is all I can muster on them at stock. 1626 will result into corruption. That is due to a single hex value that they purposedly put to act as a limitter. Changing that value to a good one allows for up to 1675 MHz memory clock. Even using the good timings for 1750 strap does that. But that is another story as the guy can do 1700.


I can confirm the limiting thing ?
On the stock BIOS I couldn't do 1626 because of some weird behavior, so now I know why...

BTW @mus1mus can you give me a good AFR 290 reference BIOS? I would be grateful ??


----------



## cephelix

Quote:


> Originally Posted by *AliNT77*
> 
> I can comfirm the limiting thing ?
> On stock bios , i couldnt do 1626 because of some weired behavior so now i know why...
> 
> BTW @musimus can U give me a good AFR 290 reference bios? I would be grateful??


Just buy over his awesome clocking/scoring 290..lol


----------



## mus1mus

Quote:


> Originally Posted by *AliNT77*
> 
> I can comfirm the limiting thing ?
> On stock bios , i couldnt do 1626 because of some weired behavior so now i know why...
> 
> BTW @musimus can U give me a good AFR 290 reference bios? I would be grateful??


Sure. Give me a few moments to look for it. But I don't have a Hynix card to test its strength.
Quote:


> Originally Posted by *cephelix*
> 
> Just buy over his awesome clocking/scoring 290..lol


That won't do it sir. No. Shipping alone might cost more than the card.


----------



## cephelix

Quote:


> Originally Posted by *mus1mus*
> 
> Sure. Give me a few moment to look for it. But I don't have a Hynix Card to test it's strength.
> That won't do it sir. No. Shipping alone might cost more than the card.


Well, that's true.. maybe mus should sell his card to me instead since I'm so near..lol


----------



## mus1mus

aint these things cheaper over there?

I might be flying into SG for that purpose.


----------



## cephelix

Quote:


> Originally Posted by *mus1mus*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> aint these things cheaper over there?
> 
> I might be flying into SG for that purpose.


Well, you can head over to the local forum and download pricelists to compare...
It's pricey compared to the US..
But things are cheaper in Japan. Last I went, it was about $100 cheaper in Akihabara


----------



## ZealotKi11er

Can someone check this BIOS? It's from one of the R9 290s I got directly from AMD.
Could someone change the memory straps to something game-stable?

HawaiiR9290PressSample.zip 43k .zip file


----------



## Loosenut

come test your overclocks or just support Team AMD & come help us win the Forum Folding War

http://www.overclock.net/t/1587731/forum-folding-war-2016-team-amd#post_24796394


----------



## cephelix

Ooo.. folding.. very interesting. One question tho: would I still be able to use my rig while folding, or would it slow my PC down to a crawl?


----------



## mus1mus

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Can someone check this BIOS. It's from one of my R9 290 i got directly from AMD.
> If someone can change to memory strap for something game stable.
> 
> HawaiiR9290PressSample.zip 43k .zip file


Verify this buddy.

1250.rom -
copied 1250 strap up to 1750
1281 mV - DPM7
FF F1

1375.rom -
copied 1375 up to 1750
1281 DPM 7
FF F1

Both Hynix and Elpida Timings Twerkeedddd









HawaiiR9290PressSample.zip 170k .zip file


EDIT: With these modded BIOSes, you no longer need to shoot for the maximum clock of the strap. From 1250 or 1375 up to whatever your card can manage, the scaling will be linear since the timings are the same.
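What "copied the 1250 strap up to 1750" amounts to can be sketched roughly as below. This is illustrative only: `TIMING_LEN` and the offsets are made-up placeholders, not the real Hawaii timing-entry size or locations, and a real edit also needs the checksum fixed afterwards (re-saving the ROM in Hawaii Bios Reader does that).

```python
# Rough sketch of "copying a strap up": overwrite each higher strap's
# timing bytes with those of a lower (tighter) strap, so every strap
# from the source upward shares the same timings. TIMING_LEN and the
# offsets are hypothetical; real values come from the BIOS structure.

TIMING_LEN = 48  # placeholder, NOT the actual Hawaii entry size

def copy_strap(rom: bytearray, src_off: int, dst_offsets) -> bytearray:
    """Copy TIMING_LEN timing bytes from src_off over each destination."""
    block = rom[src_off:src_off + TIMING_LEN]
    for dst in dst_offsets:
        rom[dst:dst + TIMING_LEN] = block
    return rom

# Toy image: three "straps" of 48 bytes each, marked 1, 2, 3.
rom = bytearray(160)
for n, off in enumerate((0, 48, 96)):
    rom[off:off + TIMING_LEN] = bytes([n + 1]) * TIMING_LEN

copy_strap(rom, 0, (48, 96))   # "1250" timings over the higher straps
print(rom[48], rom[96])        # both now carry strap 1's bytes
```

The linear-scaling note in the EDIT follows from this: once every strap carries identical timings, raising the memory clock no longer crosses into a looser timing set.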


----------



## mus1mus

Quote:


> Originally Posted by *AliNT77*
> 
> I can comfirm the limiting thing ?
> On stock bios , i couldnt do 1626 because of some weired behavior so now i know why...
> 
> BTW @musimus can U give me a good AFR 290 reference bios? I would be grateful??


Try this as well.

Unedited.









ASUS290ROM.zip 43k .zip file


----------



## AliNT77

Quote:


> Originally Posted by *cephelix*
> 
> Well, that's true..maybe mus should sell his card to me instead since I'm so near..lol


My country is under sanctions and postage costs a lot....

I can find 290s pretty easily here and they cost similar to a 960 2GB.

I have a really nasty exam tomorrow, so today it's study time.

@mus1mus
Thanks - if you have any other one, post it here and I'll test them tomorrow.


----------



## cephelix

Quote:


> Originally Posted by *AliNT77*
> 
> My country is under sanctions and postage costs a lot....
> 
> I can find 290s pretty easily here and they cost similar to a 960 2GB.
> 
> I have a really nasty exam tomorrow, so today it's study time.
> 
> @mus1mus
> Thanks - if you have any other one, post it here and I'll test them tomorrow.


dude, what country am I from??


----------



## kizwan

Quote:


> Originally Posted by *cephelix*
> 
> Ooo..folding..very interesting.one question tho would i still be able to use my rig while folding.or would it slow my pc down to a crawl?


It's like folding laundry, you do it during free time. With the hot season right now, I don't recommend you folding. Not good for environment. Wait for winter, which means never. lol


----------



## AliNT77

Quote:


> Originally Posted by *cephelix*
> 
> dude, what country am I from??


Iran ❤


----------



## cephelix

Quote:


> Originally Posted by *kizwan*
> 
> It's like folding laundry, you do it during free time. With the hot season right now, I don't recommend you folding. Not good for environment. Wait for winter, which means never. lol


Ahaha..unless I'm in an air conditioned room! But the electricity bills would be through the roof, I suppose.
Quote:


> Originally Posted by *AliNT77*
> 
> Iran ❤


Iran is under sanctions? Man, I better brush up on my current events..how are things there in general though?


----------



## AliNT77

Quote:


> Originally Posted by *cephelix*
> 
> Ahaha..unless I'm in an air conditioned room! But the electricity bills would be through the roof, I suppose.
> Iran is under sanctions? Man, I better brush up on my current events..how are things there in general though?


It's peaceful inside.
No mass shootings
No bombings
No threat from ISIS
...

Search for #mustseeiran on Instagram.

Yeah, it's been under sanctions for 35 years, so it doesn't matter anymore.

And people here care about tech a lot.
For example: more than 75-80% of people own a smartphone, and the iPhone (which can only be found unlocked here, so you have to pay $700 or more up front) is the most common smartphone (all of them are smuggled from Dubai and China, so no official sales numbers).

But salaries are a bit on the low side.
For example: my dad is a dentist, owns his own clinic, and makes about 15 grand a month, which compared to the US and EU I think is way less.

Worst thing here is the internet.
Slow, high ping, and the worst: filtering.

No access to YT, FB, Twitter, etc. without a VPN (though everyone has a VPN).
I pay $20 a month for 10mb/s ADSL with 20GB of traffic and 6 hours of free download time per day.

Funniest thing:
There's no copyright here - 60-70% of people don't even know what that is...
You can find pirated PC games, movies, TV shows, Windows, software, ... being sold completely legally for $0.5 per DVD (for example, The Witcher is 6 DVDs, so $3).


----------



## AliNT77

Quote:


> Originally Posted by *mus1mus*
> 
> Try this as well.
> 
> Unedited.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> ASUS290ROM.zip 43k .zip file


It has the EXACT same timings as my stock bios (checked it with HBR), so I think there's no point in flashing it.


----------



## mus1mus

hmm.


----------



## AliNT77

Quote:


> Originally Posted by *mus1mus*
> 
> hmm.


Trying The Stilt's strap timings:

1100/1500 (1250 timing): FS 12780
1100/1500 (The Stilt's timing): FS 12850

Couldn't do 1625 'cause it crashed at the desktop.
Rolled back to the 1250 timings.


----------



## Offler

Hello guys.

There is a highly un-optimized MMO built on CryEngine 3 called ArcheAge. I am running it on Ultra high details, DX11, with multithreaded rendering enabled. The goal of these settings was to test whether the game becomes less stable or causes more glitches.

I encountered several game stops, a few times with "Program stopped working" and rarely even BSODs referring to cores not receiving interrupts. I started from the assumption that my PC is stable, yet these issues say otherwise.

I decreased the CPU frequency by 100MHz and increased voltages by 2 steps. The "program stopped working" issues and BSODs were gone; only one type of crash remained. In that type of crash the game freezes for a second or two and closes, without any message. There is but one side effect - the fans on the graphics card became slightly noisier. During the tests I was running the GPU at stock settings (1000MHz core, 1250MHz ram).

Because the symptom still pointed toward the graphics card (the system logs did not contain info about a driver restart), I tried downclocking it to 900MHz. Apparently that resolved even this very last symptom.

=======

a) The game puts stress on my system unmatched by OCCT or LinX, even combined.
b) The second type of crash, where the game freezes for a while and closes, is quite common on multiple systems.

=======

Even though the tests and results suggest my GPU is unstable at 1000MHz, that shouldn't be true. I increased VDDC a bit (Sapphire TriXX), yet without any positive result in stability. The only thing that worked so far was the downclock to 900MHz...

Is it possible to write code for a GPU that becomes less reliable at higher GPU frequencies, even ones considered stable? My particular chip was running fine at 1165MHz with VDDC +200...

A power supply issue does not seem very plausible either. The voltages on 12V are stable.


----------



## Vellinious

I'm considering a switch to the red team. I'm looking VERY hard at the XFX 8GB 290X. Can anyone give me the pros / cons of this card? Voltage limitations? Power limit limitations? Keep in mind that it's going to be under water, in a VERY good loop, with sub 20c ambient temps in the man cave.

Also, what's the flash process like? With Maxwell (my old GPU), it was stupid easy.....it literally took 20 seconds and was bug free. I must have flashed my cards 150+ times.


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> I'm considering a switch to the red team. I'm looking VERY hard at the XFX 8GB 290X. Can anyone give me the pros / cons of this card? Voltage limitations? Power limit limitations? Keep in mind that it's going to be under water, in a VERY good loop, with sub 20c ambient temps in the man cave.
> 
> Also, what's the flash process like. With Maxwell (my old GPU), it was stupid easy.....it literally took 20 seconds and was bug free. I must have flashed my cards 150+ times.


Looks like you are the benching type.









Well then, if you can find a 290/X, watercooling would be easier. 390/X will be limited in choices. But they are good. Hynix memory does wonders.

In terms of voltage, you are mainly concerned with temps. You can pump more than 1.5 volts into the core, as far as you're willing to go,







assuming you go that far.

Here are some examples:

on Water
http://www.3dmark.com/3dm11/10800210

On Air, Dual Cards
http://www.3dmark.com/fs/7127031

And we also have another cool BIOS Modding for these cards. More interesting than just upping the Power Limits on the Green side.

Flashing needs to be done in DOS, but the process is fairly similar, and takes a couple of seconds. And we have dual BIOS. Yay!

So if you plan on doing so, make the right decision for your needs.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Looks like you are the benching type.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well then, if you can find a 290/X, watercooling would be easier. 390/X will be limited in choices. But they are good. Hynix memory does wonders.
> 
> In terms of Voltage, you are mainly concerned with temps. You can pump more than 1.5 Volts into the core up to your willingness.
> 
> 
> 
> 
> 
> 
> 
> assuming you go that far.
> 
> Here are the some examples:
> 
> on Water
> http://www.3dmark.com/3dm11/10800210
> 
> On Air, Dual Cards
> http://www.3dmark.com/fs/7127031
> 
> And we also have another cool BIOS Modding for these cards. More interesting than just upping the Power Limits on the Green side.
> 
> Flashing needs to be done in DOS. But the process is fairly similar. And takes a couple of seconds to be done. And we have dual BIOS. Yay!
> 
> So if you plan on doing so, make the right decision for your needs.


The XFX 8GB 290X has a dual bios? Is it a switch?

I was pulling 28k graphics scores with my 970s in SLI, and just under 15k graphics score on a single 970 in Firestrike....I was kind of hoping to see higher with a 290X...

http://www.3dmark.com/fs/6368664


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> The XFX 8GB 290X has a dual bios? Is it a switch?
> 
> I was pulling 28k graphics scores with my 970s in SLI, and just under 15k graphics score on a single 970 in Firestrike....I was kind of hoping to see higher with a 290X...
> 
> http://www.3dmark.com/fs/6368664


Maybe someone here can confirm whether that XFX 390X DD has the dual BIOS switch.

A hair under 15K? Yours was really a top dog. But I guess you missed the part that says, "on air".









BTW, my single 290X runs at x8 coz that sheety RVE won't run the 2nd x16 slot at x16.







and them cards are not yet gunners.

















http://www.3dmark.com/fs/6610752


----------



## Vellinious

lol, top dog....who knows. Not many people push 970s very hard. Got bored with them, because they were as high as I could clock them. Running Firestrike at 1630+ on the core and 2175 on the memory. They were hummin.....

Now, I want more....more volts, more power and...higher scores. = )


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> lol, top dog....who knows. Not many people push 970s very hard. Got bored with them, because they were as high as I could clock them. Running Firestrike at 1630+ on the core and 2175 on the memory. They were hummin.....
> 
> Now, I want more....more volts, more power and...higher scores. = )


It really is boring in there. Came from a 980ti and back to the Reds as the game doesn't need skills. Just dollar-heavy card.









Did you say voltage?

PT Bios + DPM 7 Mod + HIS iTurbo App equate to:

Unlimited Power Limit + 1.287 ref 3D Voltage + 400mV offset. Or,

1.6+ Volts.
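For clarity, the "1.6+ Volts" figure is just the quoted reference voltage plus the software offset. A trivial sanity check, using the values quoted in the post above:

```python
# The "1.6+ Volts" figure above is the reference 3D voltage plus the
# software offset (both values quoted from the post, in volts):
ref_3d_voltage = 1.287   # DPM7 voltage set by the PT BIOS + DPM7 mod
offset = 0.400           # +400 mV offset applied via the iTurbo app

total = ref_3d_voltage + offset
print(f"{total:.3f} V")  # 1.687 V -- hence "1.6+ Volts"
```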


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> It really is boring in there. Came from a 980ti and back to the Reds as the game doesn't need skills. Just dollar-heavy card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Did you say voltage?
> 
> PT Bios + DPM 7 Mod + HIS iTurbo App equate to:
> 
> Unlimited Power Limit + 1.287 ref 3D Voltage + 400mV offset. Or,
> 
> 1.6+ Volts.


and then connect your watercooling to the central heating system and have your house cozy and warm


----------



## mus1mus

Quote:


> Originally Posted by *fat4l*
> 
> and then connect your watercooling to the central heating system and have your house cozy and warm


Remind me.









My new ref 290Xs kept me warm in the office today. Looped Heaven with the cards beside me to battle the 17C room temp coz I'm feeling nauseous.







It helped a lot better than my thick parka. Which, by the way, makes me look way out of place in the tropics.


----------



## Vellinious

I ordered the XFX 8GB 290X. Will be here Monday. Have to admit...I'm curious how far I can push an AMD card.


----------



## mus1mus

Depends on how willing you are to feed it. But don't forget the silicon lottery.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Depends on how willing you are on feeding it. But don't forget lottery.


Yeah, I know...hoping I get a decent card. I'd like to see close to 1300 out of it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Vellinious*
> 
> Yeah, I know...hoping I get a decent card. I'd like to see close to 1300 out of it.


On Air? Not going to happen.


----------



## Vellinious

Quote:


> Originally Posted by *ZealotKi11er*
> 
> On Air? Not going to happen.


Uh...no.

It looks like the XFX cards are all reference design. Would you say that's accurate? Trying to find a block in stock somewhere. I really don't want to wait 2 weeks for the EK block....


----------



## AliNT77

Quote:


> Originally Posted by *mus1mus*
> 
> Remind me.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My new ref 290Xs kept me warm in the office today. Looped Heaven with the cards beside me to battle the 17C room temp coz I'm feeling nauseous.
> 
> 
> 
> 
> 
> 
> 
> helped a lot better than my thick parka. Which, by the way, makes me look too out of place in the tropics.


You have reference cards right now???

Please try undervolting them and see how far you can go with no performance loss (temps and noise).


----------



## Vellinious

Block and backplate ordered. I'll set the baselines on air on Monday night after the card arrives, and get the block / backplate on when they arrive. Should be by the weekend. I'm anxious to see how well it stacks up.


----------



## Torvi

is 600W enough to power a single R9 290 + i3-4170?


----------



## karod

Why does my LG 34UC97C flicker from time to time if I enable DP 1.2 in the monitor OSD? (the AMD driver then automatically enables 10bit).
When I disable DP 1.2 the flicker is gone but also there is only 8bit in the AMD driver.


----------



## karod

Quote:


> Originally Posted by *Torvi*
> 
> is 600w enough to power up single r9 290 + i3-4170?


More than enough. I can OC my Core i5 to 4.6GHz and use the R9 290 with a 500W be quiet! PSU.


----------



## AliNT77

Quote:


> Originally Posted by *Torvi*
> 
> is 600w enough to power up single r9 290 + i3-4170?


It's more than enough.

I have a very poor 550W PSU (Gigabyte Hercules X580) and I'm running an i5-2500k @ 4.5GHz and an R9 290.

Even did some 4.8GHz alongside +200mV on the GPU with no issues, so you're gonna be fine.


----------



## Torvi

so a 500W PSU should be enough as well? i don't plan to OC the CPU, and maybe just a very slight OC on the GPU

this one
http://www.cclonline.com/product/187510/SST-SX500-LG/Power-Supplies/Silverstone-SFX-L-500W-80-Plus-Gold-Fully-Modular-Power-Supply/PSU0941/


----------



## ZealotKi11er

Quote:


> Originally Posted by *Torvi*
> 
> so a 500w psu should be enough as well? i dont plan to oc cpu and maybe a very slight oc on gpu
> 
> this one
> http://www.cclonline.com/product/187510/SST-SX500-LG/Power-Supplies/Silverstone-SFX-L-500W-80-Plus-Gold-Fully-Modular-Power-Supply/PSU0941/


Yeah. 290 or 290X will not use more than 450W unless you start to add voltage.


----------



## karod

The R9 290 can only draw a max of 300W per spec (75W via the PCIe slot, and the remaining 225W via the 1x 6-pin and 1x 8-pin connectors, 75W + 150W).
Also, in the GPU's BIOS the max allowed wattage is set to 208W for most cards.
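The 300W figure above follows directly from the per-connector limits in the PCI Express spec; a quick sanity check (the limits below are the standard spec values, not anything specific to this card):

```python
# Power budget for a reference R9 290 board with one 6-pin and
# one 8-pin connector, per the PCI Express specification:
PCIE_SLOT_W = 75    # x16 slot limit
SIX_PIN_W = 75      # 6-pin PEG connector limit
EIGHT_PIN_W = 150   # 8-pin PEG connector limit

total_spec_w = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W
connector_w = SIX_PIN_W + EIGHT_PIN_W
print(total_spec_w)  # 300 W total per spec
print(connector_w)   # 225 W of that comes from the connectors
```

Note this is the spec limit, not a hard electrical one - as kizwan points out below, heavily overvolted cards can and do exceed it.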


----------



## Torvi

thanks a lot guys, im still battling myself between 290 used and 960 4gb


----------



## ZealotKi11er

Quote:


> Originally Posted by *Torvi*
> 
> thanks a lot guys, im still battling myself between 290 used and 960 4gb


Really? 290 will stomp 960.


----------



## Torvi

I know, I know. It's just that I'm not too trusting of Radeons, since I always seem to have problems with drivers, and in the past I think I burnt like 4 of them (in the times before joining Team AMD). Now on this laptop I have some driver-related problems as well. On top of that, Radeons are often used for mining, and I'm not sure I can trust a GPU after such abuse.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Torvi*
> 
> i know i know it's just that im not too trusty in radeons since i always seem to have problems with drivers and in the past i think i burnt like 4 of them (times before joining with amd) now on this laptop i have some drivers related problems as well, on top of that radeons are often used for mining, not sure if i can trust gpu after such abuse.


Get the GTX 960 then. Seems like it's not worth the trouble for you. It's all about personal feelings. We here just recommend stuff based on things you might not be interested in. I hate trying to convince people, because really there is nothing you get in return.

Yes, you have to be careful with mining cards, especially if they are the reference model. I mined with my R9 290 and 2 x HD 7970 and they are all good now, but that is just me.


----------



## Torvi

luckily i still have time to think about it, 10 days at best so ill probably change my mind like 5 times more haha


----------



## Torvi

lmao it will be doublepost

i just won auction for r9 290x for 195 gbp...

is there any difference between red coloured version of asus r9 290x and golden one? i won the latter


----------



## ZealotKi11er

Quote:


> Originally Posted by *Torvi*
> 
> lmao it will be doublepost
> 
> i just won auction for r9 290x for 195 gbp...
> 
> is there any difference between red coloured version of asus r9 290x and golden one? i won the latter


The golden one I think is the ASUS anniversary edition.


----------



## Torvi

so i got myself a rare one for a nice price... right?


----------



## miklkit

Finally decided to OC this Sapphire 8GB 290X. Not looking for top OCs, but an average everyday OC on air. I already tinkered with the fan profiles and now the temps seem to stay under 70C, so up we go. But how far?


----------



## ZealotKi11er

Quote:


> Originally Posted by *miklkit*
> 
> Finally decided to OC this Saphire 8gb 290x. Not looking for top OCs but an average every day OC on air. I already tinkered with the fan profiles and now the temps seem to stay under 70C so up we go. But how far?


Try 1100-1150 with no extra voltage.


----------



## Torvi

now since im going to be owning an r9 290x i need a different case because of the higher temps. any ideas for an itx case that wont turn into an oven?


----------



## refirendum

so 290x. should i stick with it when the new polaris and zen AMD stuff is released? or should i just build a new machine and swap up to the new stuff?


----------



## sinnedone

Quote:


> Originally Posted by *refirendum*
> 
> so 290x. should i stick with it when the new polaris and zen AMD stuff is released? or should i just build a new machine and swap up to the new stuff?


I'd wait to see how DX12 games perform on it before deciding to upgrade. I mean, if it's not broken, don't fix it, right?

It also depends on what resolution you are playing at and what you expect to see performance-wise.


----------



## Torvi

Quote:


> Originally Posted by *refirendum*
> 
> so 290x. should i stick with it when the new polaris and zen AMD stuff is released? or should i just build a new machine and swap up to the new stuff?


Question is: do you even need more power atm? AFAIK for single-screen 1080p the 290x seems good enough for at least two more years. If you want to build just for the pure joy of building a new rig, then go upgrade, but otherwise, performance-wise there is no point imo.


----------



## Torvi

ok so my final build at this moment looks like this:



+r9 290x off from ebay for 195gbp+8,5 postage
250gb ssd samsung evo that sits in my laptop
mionix naos 3400 that is left over from old rig (sold everything except for mouse lol)

Total cost: 605 gbp (starting budget was 500)

At this moment im hunting for a better used cpu on ebay but i might just stay with this i3 and move on to something better if i see it bottlenecking the gpu (which i dont think it will happen)

games im going to play on it:
bf4
hots
tibia
all sorts of survival games
insurgency
gtaV

Ps. i know there is no onboard wifi card and i do not need one. at this moment i got myself a cheap ebay wifi antenna that im using because the signal in my room is poor, and it turns out it has a wifi card inside it lol, so i will be running my internet off of it.


----------



## cephelix

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Torvi*
> 
> ok so my final build at this moment looks like this:
> 
> 
> 
> +r9 290x off from ebay for 195gbp+8,5 postage
> 250gb ssd samsung evo that sits in my laptop
> mionix naos 3400 that is left over from old rig (sold everything except for mouse lol)
> 
> Total cost: 605 gbp (starting budget was 500)
> 
> At this moment im hunting for a better used cpu on ebay but i might just stay with this i3 and move on to something better if i see it bottlenecking the gpu (which i dont think it will happen)
> 
> games im going to play on it:
> bf4
> hots
> tibia
> all sorts of survival games
> insurgency
> gtaV
> 
> Ps. i know there is no onboard wifi card and i do not need one, at this moment i got myself some cheap ebay wifi antena that im using because signal in my room is poor and turns out it has a wifi card inside it lol so i will be running my internet off of it.





not considering an i5 instead? if i'm not mistaken, aren't those sorts of games quite cpu intensive?


----------



## Torvi

Quote:


> Originally Posted by *cephelix*
> 
> not considering an i5 instead? if i'm not mistaken, aren't those sorts of games quite cpu intensive?


I do, but I'm out of budget at this moment. I've even tried to take it on finance but got refused, even though my history is perfect lol.

I had an i5-4670k + GTX 770 in my old rig. I ran the CPU at the following settings: 3.4GHz no boost, 3.4GHz with boost enabled, and a 4.2GHz OC. I saw no FPS changes in BF4 with any of these setups, so methinks the i3-4170 with 3.7GHz, 2 cores + 2 virtual cores, should run just fine.


----------



## kizwan

Quote:


> Originally Posted by *Torvi*
> 
> is 600w enough to power up single r9 290 + i3-4170?


Quote:


> Originally Posted by *Torvi*
> 
> so a 500w psu should be enough as well? i dont plan to oc cpu and maybe a very slight oc on gpu
> 
> this one
> http://www.cclonline.com/product/187510/SST-SX500-LG/Power-Supplies/Silverstone-SFX-L-500W-80-Plus-Gold-Fully-Modular-Power-Supply/PSU0941/


Depends on the PSU specification. If it's a dual +12V rail PSU, look at the max amperage on +12V1. I recommend at least 25A.
Quote:


> Originally Posted by *karod*
> 
> The R9 290 can only draw max 300W. (75W via PCIe slot, and the remainder off 300-75W via the 1x 6pin and 1x 8 pin connectors).
> Also in the BIOS of the GPU the max allowed wattage is set to 208W for most cards.


Actually it can draw more than that & the value set in BIOS is TDP, not max power consumption.

https://www.techpowerup.com/reviews/MSI/R9_290X_Lightning/22.html


----------



## Torvi

i saw that a 290x can draw up to 413W, so i guess 500-550W will be enough eh?


----------



## kizwan

Quote:


> Originally Posted by *Torvi*
> 
> i saw that 290x can go up to 413 tdp so i guess 500-550W will be enough eh?


What is the PSU specification?


----------



## Torvi

http://www.corsair.com/en-gb/cs-series-modular-cs550m-550-watt-80-plus-gold-certified-psu


----------



## kizwan

Quote:


> Originally Posted by *Torvi*
> 
> http://www.corsair.com/en-gb/cs-series-modular-cs550m-550-watt-80-plus-gold-certified-psu


That will be enough for a 290 or 290X & i3-4170.

I asked for the PSU specification because there are dual +12V rail PSUs (among low-wattage PSUs) that divide the total wattage between the two rails. In that case, the total wattage available for the GPU is lower. A 500 or 600W dual +12V rail PSU may or may not be enough for a 290 or 290X.
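The rail advice above boils down to watts = amps × 12V. A tiny sketch (the `rail_watts` helper and example amperages are illustrative, not taken from any particular PSU):

```python
# A +12V rail's usable wattage is simply its amp rating times 12 V,
# so a 25 A rail delivers 300 W. On a dual-rail PSU that splits its
# capacity, the rail feeding the GPU may come up short even when the
# total label wattage looks sufficient.
def rail_watts(amps: float, volts: float = 12.0) -> float:
    """Wattage available from a single rail at the given amp rating."""
    return amps * volts

print(rail_watts(25))  # 300.0 W -- the recommended 25 A minimum
print(rail_watts(20))  # 240.0 W -- marginal for an overclocked 290/290X
```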


----------



## Torvi

it's gonna be a 290x after all. somehow i won the auction without even intending to win, just wanted to raise the guy's bid haha. there goes my budget gaming pc, which at first was supposed to have a 750ti inside lmao


----------



## ZealotKi11er

Quote:


> Originally Posted by *Torvi*
> 
> it's gonn be 290x after all, somehow i won auction without even intending to win, just wanted to raise guy's bid haha, there goes my budget gaming pc which at first was supposed to have 750ti inside lmao


Is this a different build than your current system, or an upgrade? Can you use parts from your current build?


----------



## Torvi

my old rig is long gone, it's completely fresh build


----------



## ZealotKi11er

I would try to get an i5 with an AMD GPU. They need more CPU than Nvidia cards. With an i3, most games will be CPU bound. Try to get an i3-6100 and OC that if you want to stay with an i3.


----------



## cephelix

What is the price difference between the i3+mobo and the i5+mobo? Can the build be bought in parts instead of the whole setup in one go?


----------



## Torvi

Quote:


> Originally Posted by *cephelix*
> 
> What is the price difference between the i3+mobo and i5+mobo? Can the build be bought in parts instead of the whole setup in one go??


there is no difference. said chipset will hold 4th-generation i3s, i5s, and i7s no problem









anyway im chancing upon making a build on i5-2500k too, currently bidding at one


----------



## rdr09

My secondary 290 running at X8 is scoring about 200 more points in graphics score than my primary.









Stock . . .

http://www.3dmark.com/3dm/10315340?


----------



## Torvi

decided to dig deeper and just stumbled upon a prebuilt pc for 360gbp, which looks like a good deal. i tried to put together the same pc and it was 400gbp including postage

- Case: Fractal Core 1000
- Power Supply: 500W corsair bronze 80
- CPU: Intel I5 4690K 3.5GHz
- Motherboard: Asus H81 USB3 M-ATX Motherboard
- RAM: 8GB DDR3 1600MHz Dual Channel Kit
- Graphics : Intel HD4600 Graphics Technology
- Optical Drive: 24x DVD+/-RW SATA Drive

now thinking whether i should go that way and get r9 290 or 290x since psu there is quite small or drop the idea of radeon and go for less power hungry nvidia


----------



## rdr09

Quote:


> Originally Posted by *Torvi*
> 
> decided to dig deeper and just stumbled upon prebuilt pc for 360gbp which looks like a good deal, tried to put together same pc and it was 400gbp including postage
> 
> - Case: Fractal Core 1000
> - Power Supply: 500W corsair bronze 80
> - CPU: Intel I5 4690K 3.5GHz
> - Motherboard: Asus H81 USB3 M-ATX Motherboard
> - RAM: 8GB DDR3 1600MHz Dual Channel Kit
> - Graphics : Intel HD4600 Graphics Technology
> - Optical Drive: 24x DVD+/-RW SATA Drive
> 
> now thinking whether i should go that way and get r9 290 or 290x since psu there is quite small or drop the idea of radeon and go for less power hungry nvidia


You're looking at . . . at least a GTX 970, which is definitely less power hungry. Using the reading on my UPS while benching my 290 @ 1200 with +50mV, I was getting around 440W max running FS. That was with an i7 Sandy at 4.5GHz. About 100W less with the 290 at stock running the same bench. That run is at post #41338.

BTW, I don't think that motherboard will support OC'ing an unlocked CPU.


----------



## Torvi

im not going to oc it. the unlocked version is just futureproof safety, and i will defo swap things inside in a matter of time

about the gpu, i dont really need 290/290x performance. it just happens that it's quite cheap for its performance. if not that, i will most probably settle with a 960 4gb and call it a day


----------



## rdr09

Quote:


> Originally Posted by *Torvi*
> 
> im not going to oc it, the unlocked version is just a futureproof safety, i will defo swap things inside in a matter of time
> 
> about gpu i dont really need 290/290x performance, it just happens that it's quite cheap for it's performance, if not that i will most probably settle with 960 4gb and call it a day


Then you are good.


----------



## cephelix

Quote:


> Originally Posted by *Torvi*
> 
> decided to dig deeper and just stumbled upon prebuilt pc for 360gbp which looks like a good deal, tried to put together same pc and it was 400gbp including postage
> 
> - Case: Fractal Core 1000
> - Power Supply: 500W corsair bronze 80
> - CPU: Intel I5 4690K 3.5GHz
> - Motherboard: Asus H81 USB3 M-ATX Motherboard
> - RAM: 8GB DDR3 1600MHz Dual Channel Kit
> - Graphics : Intel HD4600 Graphics Technology
> - Optical Drive: 24x DVD+/-RW SATA Drive
> 
> now thinking whether i should go that way and get r9 290 or 290x since psu there is quite small or drop the idea of radeon and go for less power hungry nvidia


That build looks good, but do delve into the specs of the PSU. I've read on OCN that some Corsair PSUs are just junk. Other than that, if you're going to swap out the mobo, it looks good. The 290/X should do fine with that PSU provided you're not gonna OC it. But I suppose rdr09 should know more than me.


----------



## kizwan

Quote:


> Originally Posted by *rdr09*
> 
> My secondary 290 running at X8 is scoring about 200 more points in graphics score than my primary.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Stock . . .
> 
> http://www.3dmark.com/3dm/10315340?


200 is not much though. The difference is probably less than 1 or 2 FPS. It's within the margin of error.
Quote:


> Originally Posted by *Torvi*
> 
> decided to dig deeper and just stumbled upon prebuilt pc for 360gbp which looks like a good deal, tried to put together same pc and it was 400gbp including postage
> 
> - Case: Fractal Core 1000
> - Power Supply: 500W corsair bronze 80
> - CPU: Intel I5 4690K 3.5GHz
> - Motherboard: Asus H81 USB3 M-ATX Motherboard
> - RAM: 8GB DDR3 1600MHz Dual Channel Kit
> - Graphics : Intel HD4600 Graphics Technology
> - Optical Drive: 24x DVD+/-RW SATA Drive
> 
> now thinking whether i should go that way and get r9 290 or 290x since psu there is quite small or drop the idea of radeon and go for less power hungry nvidia


With that budget, you can afford this.

ASUS z170m-plus
CORE i5-6400 BX80662I56400 Skylake LGA1151 *OR* CORE i5-6600K BX80662I56600K Skylake LGA1151
GSkill pc4-19200 / ddr4 2400 2x4GB
Fractal Core 1000
I put two choices for CPU just in case you don't want to overclock.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Torvi*
> 
> im not going to oc it, the unlocked version is just a futureproof safety, i will defo swap things inside in a matter of time
> 
> about gpu i dont really need 290/290x performance, it just happens that it's quite cheap for it's performance, if not that i will most probably settle with 960 4gb and call it a day


R9 290 and 290X use same power. Maybe 20-30W difference MAX.


----------



## Torvi

i have asked the seller on ebay if he could put a 550/600W psu in there. it's some online retailer so he should be able to do it


----------



## RatusNatus

About your new rig....

I don't trust Corsair PSUs anymore since my HX850 fried my rig with my R9 290 in it. My 290 was the only thing that survived.
You can find a better PSU at the same price. Any Seasonic will do, but get a single 12V rail.

AOC just released some new monitors with the FreeSync option. There are some cheap 75Hz FreeSync ones. I think that's the best option IF you won't go to 144Hz.


----------



## rdr09

Quote:


> Originally Posted by *cephelix*
> 
> Tht build looks good, do delve into the specs of the psu though.i've read on ocn some corsair psus are just junk. Other than that if you're going to swap out the mobo, it looks good.the 290/x should do fine with the psu provided you're not gonna oc it. But i suppose rd09 should know more thn me


What I meant to say was the 500W PSU will work with a 960. I had to re-read Torvi's post.
Quote:


> Originally Posted by *kizwan*
> 
> 200 not much though. The difference probably less than 1 FPS or less than 2 FPS. It is within margin of error.
> With that budget, you can afford this.
> 
> ASUS z170m-plus
> CORE i5-6400 BX80662I56400 Skylake LGA1151 *OR* CORE i5-6600K BX80662I56600K Skylake LGA1151
> GSkill pc4-19200 / ddr4 2400 2x4GB
> Fractal Core 1000
> I put two choices for CPU just in case you don't want to overclock.


Could be. I'll do more testing.


----------



## Torvi

Quote:


> Originally Posted by *RatusNatus*
> 
> About your new rig....
> 
> I don't trust Corsair PSUs anymore since my HX850 fried my rig with my R9 290 in it. My 290 was the only one thing remaining there.
> You can find a better PSU at same price. Any Seasonic will do but get a single 12v rail.
> 
> AOC just released some new monitors with FREESYNC option. There are some 75hz freesync cheap. I think is the best option IF you won't go to 144hz.


On my old rig I used to be a Corsair fanboy, so I still have a lot of trust in Corsair-made stuff.


----------



## miklkit

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Try 1100-1150 with no extra voltage.


It's now sitting at 1100/1400 and running just fine. Thanks!


----------



## RatusNatus

Yeah, me too, since my 10-year-old HX620, but I'm not the only one to leave the brand.
I only bought the HX850 because the old HX620 couldn't handle my new (2 years ago) R9 290.

Looks like the HX850 has very BAD ageing reports. It can't handle 24/7 use, confirmed by lots of faulty PSUs from bitcoin miners and people who do folding like myself.

And I'm talking about the HX series, not the cheaper CX.


----------



## karod

Quote:


> Originally Posted by *kizwan*
> 
> Actually it can draw more than that & the value set in BIOS is TDP, not max power consumption.
> 
> https://www.techpowerup.com/reviews/MSI/R9_290X_Lightning/22.html


I am not talking about the TDP (important for air cooling, since it can't cool as much as liquid cooling).

I refer to the power limit entry; it really is described in the BIOS editing guide as the value that regulates how much power is drawn from the PSU.

See:


----------



## HOMECINEMA-PC

Under 30C gaming @ 7680 x 1440


----------



## karod

This is with a chiller?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *karod*
> 
> This is with a chiller?


No and yes.

The a/c is doing nearly all of the work, keeping the water temp @ 18C by chilling the rads.

The card chiller HC-1000 is set to kick in at 20C because I don't want it running all the time.


----------



## Vellinious

Quote:


> Originally Posted by *karod*
> 
> I am not talking about the TDP (important for air cooling, since it can't cool as much as liquid cooling)
> 
> I refer to the power limit entry; it really is described in the BIOS editing guide as the value that regulates how much power is drawn from the PSU.
> 
> See:


300 watts seems pretty low for a 290 / 290X. Do the AMD cards "power limit throttle" like NVIDIA cards do, if they're not able to draw enough power? As in, the card goes above the 300 watts?


----------



## karod

The 300W was actually set by me. It is a modded BIOS; the stock BIOS shows 208W for the power limit.

With the power limit you set the point where "AMD PowerTune" (that's the function you were asking about) throttles the core frequency and voltage, I think.
You can see in a benchmark how the clocks drop while overclocking if the limit isn't high enough.
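Karod's description of PowerTune can be sketched as a toy feedback model. This is illustrative only, not AMD's actual algorithm; the base clock, voltage, and limit figures are made-up round numbers for a reference 290:

```python
# Toy model of a PowerTune-style power limiter (illustrative only,
# NOT AMD's actual algorithm). Dynamic power scales roughly with V^2 * f.

def throttled_clock(target_mhz, vcore, base_mhz=947, base_vcore=1.2,
                    base_power=208.0, power_limit=312.0):
    """Return the core clock (MHz) the limiter would settle at."""
    # Estimate power at the requested operating point relative to stock.
    est_power = base_power * (vcore / base_vcore) ** 2 * (target_mhz / base_mhz)
    if est_power <= power_limit:
        return target_mhz           # inside the budget: clocks untouched
    # Over the limit: scale frequency down until estimated power fits.
    return target_mhz * power_limit / est_power

print(throttled_clock(1000, 1.20))  # mild OC that fits the budget
print(throttled_clock(1100, 1.40))  # heavy OC: clocks drop, as seen in benchmarks
```

This matches what karod describes: raise the power limit entry in the BIOS and the throttle point moves up, so clocks stop dropping mid-benchmark.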


----------



## PacificNic

Does anyone know where I could buy a used but reliable waterblock for my R9 290? It's got a stock cooler so I obviously want to remedy that but am having a hard time dropping $160 on a block and backplate.


----------



## Cyber Locc

Quote:


> Originally Posted by *PacificNic*
> 
> Does anyone know where I could buy a used but reliable waterblock for my R9 290? It's got a stock cooler so I obviously want to remedy that but am having a hard time dropping $160 on a block and backplate.


Ebay. I don't understand why you don't want to spend $160 on a block and backplate; that is how much they always cost, for every card.

You might also watch the marketplace here. However, finding someone selling a used block without the GPU is not common in my experience; it does happen, just not that often. Even if you do find a used one in good condition, it's going to cost $120+ for the block and backplate.

No one said water cooling is cheap, and if they did, they lied.


----------



## kizwan

Quote:


> Originally Posted by *karod*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Actually it can draw more than that & the value set in BIOS is TDP, not max power consumption.
> 
> https://www.techpowerup.com/reviews/MSI/R9_290X_Lightning/22.html
> 
> 
> 
> I am not talking about the TDP (important for air cooling, since it can't cool as much as liquid cooling)
> 
> I refer to the power limit entry; it really is described in the BIOS editing guide as the value that regulates how much power is drawn from the PSU.
> 
> See:
> 
> 
> Spoiler: Warning: Spoiler!

Aah, yes, that power limit. Don't forget to add +50%, because that's how much extra you can draw with the stock BIOS. Still, the fact stands: the GPU can draw more power than the theoretical max (75 + 75 + 150 W from the PCIe slot + 6-pin + 8-pin cables). You can find examples in this thread alone (though you'll need to dig way back, approximately 2 years ago I think) showing the GPU drawing more than 300W with a modded BIOS.
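kizwan's connector arithmetic, as a quick sketch. The per-connector figures are the PCIe spec ratings; as the thread notes, real slots and cables routinely deliver more than these paper limits:

```python
# Theoretical max power per the PCIe spec ratings kizwan cites:
# 75 W from the slot, 75 W per 6-pin, 150 W per 8-pin connector.
SLOT_W, SIX_PIN_W, EIGHT_PIN_W = 75, 75, 150

def spec_budget(six_pins=1, eight_pins=1):
    """Paper power budget for a card with the given aux connectors."""
    return SLOT_W + six_pins * SIX_PIN_W + eight_pins * EIGHT_PIN_W

print(spec_budget())   # typical 290/290X (6-pin + 8-pin): 300 W on paper
print(208 * 1.5)       # 208 W BIOS limit with the +50% slider: 312 W
# ...already above the 300 W "theoretical" budget, which is why modded
# cards in this thread are observed pulling well past 300 W.
```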


----------



## HOMECINEMA-PC

When I was benching I saw over 365W per card.


----------



## PacificNic

Quote:


> Originally Posted by *Cyber Locc*
> 
> Ebay, dont understand why you dont want to spend 160 on a block and back plate that is how much they always cost for every card.
> 
> You might also watch the marketplace here, however finding someone selling a used block without the GPU is not common in my experience it does happen just not that often. Even if you do find a used one in good condition its going to cost 120+ for the block and backplate.
> 
> No one said water cooling is cheap and if they did they lied.


No one ever told me it was cheap!

I've done a lot of research and have found that most WBs for the 290(X) are around $130 (+$30 for a backplate). I was just wondering whether I would be able to score one at a reduced cost. Thanks for the info!


----------



## ZealotKi11er

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> When I was benching I saw over 365w per card


It's a 208W limit per card; with the 50% increase that's ~312W. If you include PSU efficiency, you get to ~350-370W per card at the wall. A 290X can probably go to 400-420W.
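The efficiency math above checks out as a one-liner; the 88% PSU efficiency is an assumed figure for illustration, not a measured one:

```python
# Rough wall-draw estimate: card power divided by PSU efficiency.
# The 0.88 efficiency is an assumption (plausible for a Gold unit at load).
def wall_draw(card_watts, psu_efficiency=0.88):
    """Watts pulled from the wall to deliver card_watts to the GPU."""
    return card_watts / psu_efficiency

limit = 208 * 1.5                # 208 W BIOS limit + 50% slider = 312 W
print(round(wall_draw(limit)))   # lands inside the ~350-370 W range quoted above
```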


----------



## mus1mus

Quote:


> Originally Posted by *PacificNic*
> 
> No one ever told me it was cheap!
>
> I've done a lot of research and have found that most WB for the 290(X) are around $130 (+$30 for backplate). I was just wondering whether I would be able to score one at reduced cost. Thanks for the info


Try Aussie stores. With their dollar currently at about 0.70 US$, you might end up on the cheap side.


----------



## fat4l

Guys, new liquid metal paste coming to the market soon: *Thermal Grizzly Conductonaut - 73 W/mK*

http://www.computerbase.de/2015-12/intel-skylake-heatspreader-delid-die-mate-test/2/



For comparison, here are the thermal conductivity values:
Thermal Grizzly Conductonaut - 73 W/mK
Coollaboratory Liquid Ultra - 38.4 W/mK
Coollaboratory Liquid Pro - 32.6 W/mK
Thermal Grizzly Kryonaut - 12.5 W/mK
Gelid GC-Extreme - 8.5 W/mK

Our GPUs need it! (at least mine does)

http://www.overclock.net/t/1588116/thermal-grizzly-conductonaut-73-w-mk/0_30
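For a rough sense of what those conductivity numbers buy you, Fourier's law gives the temperature drop across a TIM layer. The layer thickness, die area, and the 250 W load below are illustrative assumptions, not measurements of any real mount:

```python
# Temperature drop across a TIM layer via Fourier's law: dT = P * t / (k * A).
# Thickness (50 um) and die area (4 cm^2) are assumed values for illustration.
def tim_delta_t(power_w, k_w_per_mk, thickness_m=50e-6, area_m2=4.0e-4):
    """Kelvin lost across the paste layer at a given heat load."""
    return power_w * thickness_m / (k_w_per_mk * area_m2)

for name, k in [("Conductonaut", 73), ("Liquid Ultra", 38.4),
                ("Kryonaut", 12.5), ("GC-Extreme", 8.5)]:
    # 250 W is a plausible load for an overclocked Hawaii core
    print(f"{name:12s} {tim_delta_t(250, k):.2f} K")
```

The spread between the best and worst pastes in the list works out to a few degrees at this load; real-world gains also depend heavily on mount pressure and layer thickness.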


----------



## fyzzz

Hey, @mus1mus what do you think about this score: http://www.3dmark.com/fs/7247395


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> Hey, @mus1mus what do you think about this score: http://www.3dmark.com/fs/7247395


Holy mooooother!
Any other mods added to that run?

Wait for my 290X.


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> Holy mooooother!
> Any other mods added to that run?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Wait for my 290X.


Nope, no more mods added. Just more vcore and running it colder. Beat my score so I have something to aim for again.


----------



## mus1mus

Just so everyone knows, I am not in a grudge fight with this chap. Nor is he. But we like shooting at each other's crap.
Quote:


> Originally Posted by *fyzzz*
> 
> Nope no more mods added. Just more vcore and running it colder. Beat my score so i have something to aim for again
> 
> 
> 
> 
> 
> 
> 
> .


The rig is in a tear down for another Novice Showdown. And I will be shooting for your score eventually.


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> Just so everyone knows, I am not in a grudge fight with this chap. Nor is he. But we like shooting at each other's crap.
> 
> 
> 
> 
> 
> 
> 
> 
> The rig is in a tear down for another Novice Showdown. And I will be shooting for your score eventually.


Sounds good. Yeah, this is kinda fun. It is amazing how far you can push these cards; I wonder when we will start to hit 17k. I hope it gets even colder here, so I can test again.


----------



## ZealotKi11er

Quote:


> Originally Posted by *fyzzz*
> 
> Sounds good. Yeah, this is kinda fun. It is amazing how far you can push these cards; I wonder when we will start to hit 17k. I hope it gets even colder here, so I can test again.


Never get impressed unless it's a 24/7 game-stable overclock.


----------



## mus1mus

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fyzzz*
> 
> Sounds good. Yeah, this is kinda fun. It is amazing how far you can push these cards; I wonder when we will start to hit 17k. I hope it gets even colder here, so I can test again.
> 
> 
> 
> Never get impressed unless it's a 24/7 game-stable overclock.

Ohh come on, we know you've been there.


----------



## ZealotKi11er

Quote:


> Originally Posted by *mus1mus*
> 
> Ohh come on, we know you've been there.


Yeah, once in a blue moon, but not every single day. It gets boring going after something which gives no gain in real gaming.


----------



## mus1mus

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah, once in a blue moon, but not every single day. It gets boring going after something which gives no gain in real gaming.


The 26 50 mod might, at stock speeds, up to a certain memory clock.

My 290X accepted 1000 timings up to 1375 just fine. So yeah, it does help a bit.


----------



## fat4l

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Never get impressed unless it's a 24/7 game-stable overclock.


The same here


----------



## battleaxe

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Never get impressed unless it's a 24/7 game-stable overclock.


It is impressive if you compare what these cards can do against Nvidia. Remember when the 970 could hang with a 290X? It can't anymore. And it's because of scores like the ones these guys are putting up.


----------



## fyzzz

I tried gaming on my bench BIOS today too: 1230/1625, +200mv, and with all my memory mods. During Crysis 3 on very high I saw maybe one weird thing, probably caused by memory, but other than that it was fine. Highest core temp was 46C and VRM1 48C.


----------



## fyzzz

Well this is kinda interesting: http://www.3dmark.com/compare/fs/6228104/fs/7252410#


----------



## ZealotKi11er

Quote:


> Originally Posted by *fyzzz*
> 
> Well this is kinda interesting: http://www.3dmark.com/compare/fs/6228104/fs/7252410#


lol


----------



## rdr09

Spiked to 660W (using UPS reading) running FS at 1100 core +0 mV and +50 Power Limit . . .



Still using Trixx.


----------



## Vellinious

Got my 290x in this morning, and started playing around with it. I have to say, it's an impressive looking card. Messed with the clocks a bit using TRIXX (only 200mv with TRIXX?), and pushed the core to 1251 really easily, so that's promising. Without any driver tweaks got really close to the GPU score I had run on my 970 (14880 max). This is just a temporary setup until the block/backplate arrives.

http://www.3dmark.com/fs/7256603


----------



## Vellinious

What software allows for +400mv? Or does it take a bios mod to get it that high?


----------



## mus1mus

HIS iTurbo + PT1 ROM.

Edit: if you mod PT1 to have a higher base 3D voltage, you can go as high as 1.7V before droop, and/or more than 1.5V clean under load.


----------



## Cyber Locc

Quote:


> Originally Posted by *Vellinious*
> 
> Got my 290x in this morning, and started playing around with it. I have to say, it's an impressive looking card. Messed with the clocks a bit using TRIXX (only 200mv with TRIXX?), and pushed the core to 1251 really easily, so that's promising. Without any driver tweaks got really close to the GPU score I had run on my 970 (14880 max). This is just a temporary setup until the block/backplate arrives.
> 
> http://www.3dmark.com/fs/7256603


AB should allow 300mv (I think). Also you could do what I do: flash an Asus BIOS on there and use GPU Tweak; this will allow up to 1.4V.

The PT1 BIOS is also an option, used with GPU Tweak (I think it allows 1.5V? Maybe just 1.4 and the other one allows 1.5).

Anyway, if you just want to get 1.4V, then going with the regular Asus BIOS is easiest; it shouldn't cause any issues with your card. All 3 of my cards run the Asus BIOS, and they are 2 XFX and 1 Sapphire BF4.

Scratch that, HIS iTurbo is new on me, but it allows 1.4V. I will give that a shot myself when my rig is back up.

You can flash it to only 1 side to be extra careful, or you could double down: Asus on 1 and PT1 on the other (that's what I do).


----------



## fyzzz

HIS iTurbo allows up to 400mv


----------



## ZealotKi11er

Does +400 mV help with anything? I can go 1275 MHz + 200mV but still see artifacts.


----------



## Vellinious

I googled HIS iTurbo, but the download link was dead...

I'll definitely do the PT1 bios in the 2nd position, and leave the 1st position straight stock


----------



## fyzzz

Quote:


> Originally Posted by *Vellinious*
> 
> I googled HIS iTurbo, but the download link was dead...
> 
> I'll definitely do the PT1 bios in the 2nd position, and leave the 1st position straight stock


https://drive.google.com/file/d/0Bzz6JsWm9EHPRWZ6LTB2eFRyclk/view


----------



## mus1mus

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Does +400 mV help with anything? I can go 1275 MHz + 200mV but still see artifacts.


Yes.

It does help a ton. But you need to evaluate where the artifacts are coming from: they can come from either the core or the memory. Both need moar voltage and a little fiddling.

Also, I have seen some voltage walls on these cards. Like 1300/1625 can run without artifacts at just +200, while 1320 needs +300 to eliminate the glitches.

Quote:


> Originally Posted by *Vellinious*
> 
> I googled HIS iTurbo, but the download link was dead...
> 
> I'll definitely do the PT1 bios in the 2nd position, and leave the 1st position straight stock


And don't forget modding the memory timings.


----------



## Vellinious

Best I can do on air is 1240 on the core and 1720 on the memory before it starts artifacting. Ran a 14.8k graphics score....I'm looking forward to seeing what it can do under water.


----------



## battleaxe

Quote:


> Originally Posted by *fyzzz*
> 
> https://drive.google.com/file/d/0Bzz6JsWm9EHPRWZ6LTB2eFRyclk/view


+1 been waiting for that for a while now. Thank you!!!!


----------



## Arizonian

Quote:


> Originally Posted by *Vellinious*
> 
> Got my 290x in this morning, and started playing around with it. I have to say, it's an impressive looking card. Messed with the clocks a bit using TRIXX (only 200mv with TRIXX?), and pushed the core to 1251 really easily, so that's promising. Without any driver tweaks got really close to the GPU score I had run on my 970 (14880 max). This is just a temporary setup until the block/backplate arrives.
> 
> http://www.3dmark.com/fs/7256603
> 
> 
> Spoiler: Warning: Spoiler!


Congrats - added


----------



## cainy1991

Quote:


> Originally Posted by *Vellinious*
> 
> Got my 290x in this morning, and started playing around with it. I have to say, it's an impressive looking card. Messed with the clocks a bit using TRIXX (only 200mv with TRIXX?), and pushed the core to 1251 really easily, so that's promising. Without any driver tweaks got really close to the GPU score I had run on my 970 (14880 max). This is just a temporary setup until the block/backplate arrives.
> 
> http://www.3dmark.com/fs/7256603


Constantly monitor the VRM temps with that card until you get your block.

Unless XFX has actually fixed the design of the cooler, your VRMs have no real form of cooling. My core would sit around 70C-ish and my VRM would be over 100C at all times under load (stock clocks).


----------



## Vellinious

Yeah, it got pretty warm, so I won't go any further until I get it under water.


----------



## kizwan

Quote:


> Originally Posted by *fyzzz*
> 
> I tried gaming on my bench BIOS today too: 1230/1625, +200mv, and with all my memory mods. During Crysis 3 on very high I saw maybe one weird thing, probably caused by memory, but other than that it was fine. Highest core temp was 46C and VRM1 48C.


Are you still using the XFX390 modded BIOS, or have you already gone back to a 290 BIOS? I have been going back and forth between the 290 Tri-X OC BIOS & XFX390. Currently back on the XFX390 modded BIOS & still testing it. I had a hard lock last week with GTA V at stock even though it was fine a few days before. The only difference is that I had a secondary monitor connected to the DVI port & was running Eyefinity (I know, only two monitors, but testing just for fun). I re-tested my CPU overclock & that's still fine and stable. I made some corrections to the modded BIOS. We'll see what happens when I test it this weekend.


----------



## fyzzz

Quote:


> Originally Posted by *kizwan*
> 
> Did you still use XFX390 modded BIOS or already go back to 290 BIOS? I have been go back and forth between 290 Tri-X OC BIOS & XFX390. Currently go back to XFX390 modded BIOS & still testing them. Having hard lock last week with GTA V at stock even though it was fine few days before. The only difference is that I have secondary monitor connected to DVI port & was running with Eyefinity (I know only two monitors but testing just for fun). Re-tested my CPU overclock & that still fine and stable. I did some correction to the modded BIOS. We'll see what happen when I test it this weekend.


I go back and forth between 290 and 390 BIOSes too. My bench BIOS is a modded PT1 BIOS, which is a 290 BIOS, and my daily BIOS is a 390 BIOS.


----------



## Cyber Locc

Quote:


> Originally Posted by *kizwan*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Did you still use XFX390 modded BIOS or already go back to 290 BIOS? I have been go back and forth between 290 Tri-X OC BIOS & XFX390. Currently go back to XFX390 modded BIOS & still testing them. Having hard lock last week with GTA V at stock even though it was fine few days before. The only difference is that I have secondary monitor connected to DVI port & was running with Eyefinity (I know only two monitors but testing just for fun). Re-tested my CPU overclock & that still fine and stable. I did some correction to the modded BIOS. We'll see what happen when I test it this weekend.


Quote:


> Originally Posted by *fyzzz*
> 
> I go back and forth between 290 and 390 bios too. My bench bios is a modded pt1 bios, which is a 290 bios and my daily bios is a 390 bios.


Since both of you use 2 bioses, how do you switch them on the fly? I am looking for a better way to do it, maybe some kind of extender on the switch. It is a real pain to switch my cards with the EK terminal being right over top of the bios switch.


----------



## battleaxe

Quote:


> Originally Posted by *Cyber Locc*
> 
> Since both of you use 2 bioses, how do you switch them on the fly? I am looking for a better way to do it, maybe some kind of extender on the switch. It is a real pain to switch my cards with the EK terminal being right over top of the bios switch.


The switch on the cards, maybe? All they would have to do is reboot and hit the switch while it's rebooting. But it's super fast to re-flash too, so that could be a method as well.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Cyber Locc*
> 
> Since both of you use 2 bioses, how do you switch them on the fly? I am looking for a better way to do it, maybe some kind of extender on the switch. It is a real pain to switch my cards with the EK terminal being right over top of the bios switch.


you could glue something to the switch as an extender


----------



## Cyber Locc

Quote:


> Originally Posted by *battleaxe*
> 
> The switch on the cards maybe? All they would have to do is reboot and hit the switch while its rebooting. But its super fast to re-flash too, so that could be a method as well.


I know the switch on the card works; I run PT1 on 1 and regular on the other, just like they do. The issue is the EK terminal adapter blocks that switch, so flipping it is a challenge and requires a paperclip and some stealthiness.
Quote:


> Originally Posted by *mfknjadagr8*
> 
> you could glue something to the switch as an extender


That is what I was thinking. It's a shame that no one has thought of this, as most blocks cover them up. A little extender that clips onto it would be awesome.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Cyber Locc*
> 
> I know the switch on the card works, I run PT1 on 1 and regular on another just like they do. The issue is the EK terminal adapter blocks that switch so to flip it is a challenge and requires a paperclip and some stealthiness.
> That is what I was thinking, its a shame that no one thought of this as most blocks cover them up. A little extender that clips on to it would be awesome.


I've seen an extender on a 290 once, but I'm not sure it was a buyable part. I can reach mine, but I don't have a terminal; I have an extendable fitting. The one I saw went over the other switch then flat across to a textured top; it looked professional.


----------



## kizwan

Quote:


> Originally Posted by *Cyber Locc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Did you still use XFX390 modded BIOS or already go back to 290 BIOS? I have been go back and forth between 290 Tri-X OC BIOS & XFX390. Currently go back to XFX390 modded BIOS & still testing them. Having hard lock last week with GTA V at stock even though it was fine few days before. The only difference is that I have secondary monitor connected to DVI port & was running with Eyefinity (I know only two monitors but testing just for fun). Re-tested my CPU overclock & that still fine and stable. I did some correction to the modded BIOS. We'll see what happen when I test it this weekend.
> 
> 
> 
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *fyzzz*
> 
> I go back and forth between 290 and 390 bios too. My bench bios is a modded pt1 bios, which is a 290 bios and my daily bios is a 390 bios.
> 
> 
> Since both of you use 2 bioses, how do you switch them on the fly? I am looking for a better way to do it, maybe some kind of extender on the switch. It is a real pain to switch my cards with the EK terminal being right over top of the bios switch.

I don't use the switch, because the 2nd BIOS slot holds the stock BIOS. Both the 290 Tri-X OC & XFX390DD BIOSes I use are modded; I just re-flash the cards.
Quote:


> Originally Posted by *battleaxe*
> 
> The switch on the cards maybe? All they would have to do is reboot and hit the switch while its rebooting. But its super fast to re-flash too, so that could be a method as well.


Yes, re-flashing, because it's supa fast.


----------



## Vellinious

I'm hitting 1783 on the memory on this card, with 1250 on the core, and haven't had to go over +220mv yet. Coming from NVIDIA, I'm not sure how much people are getting on these cards. What is considered "good" and what is "outstanding"?

Also, there are two voltage sliders in the HIS iTurbo software. The top one is core, obviously, but what is the bottom slider for?

Another test run, this time I put some clock on the CPU.

http://www.3dmark.com/3dm/10367995


----------



## fyzzz

Quote:


> Originally Posted by *Vellinious*
> 
> I'm hitting 1783 on the memory on this card, with 1250 on the core and haven't had to go over +220mv yet....coming from NVIDIA, I'm not sure how much people are getting on these cards. What is considered "good" and what is "outstanding"?
> 
> Also, there are two voltage sliders on the HIS iTurbo software. The top one is core, obviously, but....what is the bottom slider for?
> 
> Another test run, this time I put some clock on the CPU.
> 
> http://www.3dmark.com/3dm/10367995


You have a very nice card. It will probably shine even more if you BIOS mod it. A bit of VDDCI can sometimes help with memory clock or add a little more stability. I also have an XFX DD card, a 290, that is also very good.


----------



## Vellinious

What is it, though? Like input voltage on a CPU? Memory voltage? I bumped it up from 1000mv to 1010mv and it seemed to help a tiny bit, but...

a) How much is too much?
b) What temps would it affect? I'm paying particular attention to the VRM temps, as the air cooling on this card for the VRM seems to be lacking.


----------



## fyzzz

Quote:


> Originally Posted by *Vellinious*
> 
> What is it, though? Like...input voltage on a CPU? Memory voltage? I bumped it up from 1000mv to 1010mv, and it seemed to help a tiny bit, but....
> 
> a) How much is too much
> b) What temps would it affect. I'm paying particular attention to the VRM temps, as the air cooling on this card for the VRM seems to be....lacking.


There is useful information in this thread: http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/0_50.
Quote:


> The quality of the memory controllers vary so some of them might need a slight increase in the supply voltage (VDDCI) even at or slightly below 1500MHz. Usually 20mV increase in the VDDCI is enough to stabilise it. On Hawaii the VDDCI should never be set higher than +50mV (= 1.050V) as the memory PHY / controller is the hottest part of the GPU already.


----------



## Vellinious

Yeah, I've started reading through that thread....I try to skip ahead a bit to find the stuff more relevant to where it's at now, but....long process.


----------



## Vellinious

Quote:


> Originally Posted by *fyzzz*
> 
> There is useful information in this thread: http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/0_50.


That quoted bit...good information. Thank you, very much.


----------



## ITAngel

I wanted to share some info. I just recently got the PowerColor Devil 13 290X2 and it's not a bad card at all. It's not as loud as many make it out to be; during stress tests, yes, it does get a bit louder. I think the only two places I heard the card run a bit louder were Battlefield 4 and the Heaven benchmark. My tests were done in my office; ambient temps were 72F. My room stays within 68F-72F.

The Devil 13 is in the case.

-Closed Case-


-Open Case-


Temps didn't get past 86C in Battlefield 4.
Temps in Heaven got to 93C in the first test run, stock mode.


So far the sound was quiet during normal gaming in GW2 and Rift and normal desktop activities, but during Battlefield 4 and the Heaven benchmark it was a bit louder. Temps went up to 94C during the first test. Temps stayed within 48C when the card was not being stress tested.

Not a bad card for $400, only a month old.


----------



## cephelix

4 cables?! that thing is a beast


----------



## Vellinious

Quote:


> Originally Posted by *cephelix*
> 
> 4 cables?! that thing is a beast


It's a dual GPU card......


----------



## cephelix

Quote:


> Originally Posted by *Vellinious*
> 
> It's a dual GPU card......


yeah, i know....still a beast!


----------



## Raephen

Quote:


> Originally Posted by *ITAngel*
> 
> I wanted to share some info I just recently got the PowerColor Devil's 13 290X2 and is not a bad card at all. Is not as loud as many make them to be and during stress test yes they do get a bit louder. I think that only two places I heard the card run a bit louder was in Battlefield 4 and Heaven Bench test. My test were done in my office ambiet temps were 72C. My room stay within 68C-72C.
> 
> The Devil 13 is in the case.
> 
> -Closed Case-
> 
> 
> -Open Case-
> 
> 
> Temps didnt get pass 86C in Battlefield 4 thumb.gif
> Temps in heaven got to 93C in the first test run stock mode.
> 
> 
> So far sound was quiet during normal gaming in GW2 and Rift and normal desktop activities but during battlefield 4 and heaven bench mark it was a bit louder. Temps went up to 94C during the first test. Temps stay within 48C after that when the card was not stress tested.
> 
> Not a bad card for $400 only a month old.


Ambient temps of up to 72C? I do hope you don't mean C as in degrees Celsius; otherwise both the office and your room are very, very, very hot.
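For anyone following along, the unit mix-up is easy to sanity-check with the standard conversions:

```python
# Fahrenheit <-> Celsius conversions, to show why 72 "degrees" ambient
# must be Fahrenheit, not Celsius.
def f_to_c(f):
    return (f - 32) * 5 / 9

def c_to_f(c):
    return c * 9 / 5 + 32

print(f_to_c(72))   # 72 F is about 22.2 C: a normal room
print(c_to_f(72))   # 72 C would be 161.6 F: an oven, not an office
```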


----------



## ITAngel

Oh yeah, I meant on the cooler side of things. Lol


----------



## ZealotKi11er

Have you tried having the top fans as intake? It seems you have negative pressure in the case.


----------



## ITAngel

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Have you tried having the top fans as intake? It seems you have negative pressure in the case.


No, I don't. I have the primary fan on the front pushing in and 3 fans pushing out. I used to have 3 in and 3 out, but I sold a system that needed some fans and removed them from my case. Should I have both fans pushing in? I also have one more fan I can drop in so I can have 3 fans pushing in. Not sure what would be the best setup; any advice?


----------



## ZealotKi11er

Quote:


> Originally Posted by *ITAngel*
> 
> No I don't I have primary fan on the front pushing in, 3 fans pushing out. I use to have 3 in and 3 out but I sold a system that needed some fans and I remove them from my case. Should I have both fans pushing in? I also have one more fan I can drop so I can have 3 fans pushing in. Not sure what would be the best setup any advice?


The card produces a lot of hot air inside the case. With only a restricted 200mm intake fan, the card will try to take air from the back vents of the case. That air is all hot because of the exhaust. Try it with the top 2 fans as intake.


----------



## Cyber Locc

Quote:


> Originally Posted by *ITAngel*
> 
> No I don't I have primary fan on the front pushing in, 3 fans pushing out. I use to have 3 in and 3 out but I sold a system that needed some fans and I remove them from my case. Should I have both fans pushing in? I also have one more fan I can drop so I can have 3 fans pushing in. Not sure what would be the best setup any advice?


1 fan pushing in and 3 pulling out is negative pressure, lol. Also remember the GPU's fans exhaust into the case airflow, so you effectively have 4 fans out and 1 in. That is extreme negative pressure; it's all bad. Always have more fans going in than out. You want positive or neutral pressure; negative is all bad.

Your CPU temps should drop too by flipping those fans.
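The pressure argument is just an airflow balance: intake CFM minus exhaust CFM. The fan CFM figures below are placeholders for illustration, not the specs of anyone's actual fans:

```python
# Net case pressure as a simple CFM balance: intake minus exhaust.
# CFM values here are made-up placeholders; check your fans' spec sheets.
def net_airflow(intake_cfm, exhaust_cfm):
    """Positive result = positive case pressure; negative = negative."""
    return sum(intake_cfm) - sum(exhaust_cfm)

# The described setup: one 200mm front intake vs three exhaust fans.
balance = net_airflow(intake_cfm=[110], exhaust_cfm=[60, 60, 60])
print(balance)  # negative: the case pulls unfiltered air in through every gap
```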


----------



## ITAngel

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The card produces a lot of hot air inside the case. With only a 200mm restricted intake fan the card will try to take air from the back vents of the case. That air is all hot because of exhaust. Try to see with top 2 cans as intake.


Okay I will give that a shot will report back shortly.


----------



## ITAngel

Quote:


> Originally Posted by *Cyber Locc*
> 
> 1 fan pushing in and 3 taking out is negative pressure lol. Also remember the GPUs fan is exhaust so you have 4 fans out 1 in. That is extreme negative pressure its all bad. Always have more fans going in than out. You want positive or neutral pressure negative is all bad.


Okay cool thanks I will set that up now.


----------



## ITAngel

Okay, here is what I did; let me know if it looks good.


----------



## specopsFI

Guys, you're setting ITAngel up for a disappointment. One has to have a lot of fans blowing heat out when running a dual Hawaii setup with an open-air cooler (or two). One exhaust fan won't cut it, no matter how many intake fans you have. Sure, it would be a better balance to have like 3 in / 3 out, but personally I wouldn't even bother trying that setup with the top fans as intakes. The amount of hot air circulating in that case would be massive.

ITAngel, I would try adding an extra intake at the bottom and keeping the top part of the case for exhaust. There is a bottom 140mm spot in the Enthoo Primo, right? 200mm + 140mm as intake could already be enough to keep those exhaust fans occupied. Even if the intakes are a bit restricted, it doesn't matter for the pressure; the cold air just won't get to the GPU as effectively.


----------



## ITAngel

Quote:


> Originally Posted by *specopsFI*
> 
> Guys, you're setting ITAngel up for a disappointment. One has to have a lot of fans blowing heat out when having a dual Hawaii setup with an open-air cooler (or two). One exhaust fan won't cut it, no matter how many intake fans you have. Sure, it would be a better balance to have like 3 in - 3 out, but personally I wouldn't even bother trying that setup with top fans as intakes. The amount of hot air circulating in that case will be massive.
> 
> ITAngel, I would try adding an extra intake at the bottom and keeping the top part of the case for exhaust. There is a bottom 140mm spot in the Enthoo Prime, right? 200mm+140mm as intake could already be enough to keep those exhaust fans occupied. Even if the intakes are a bit restricted, it doesn't matter for the pressure, the cold air just won't go to the GPU as effectively.


Hi there, thanks for the tip. I've been running a game to test it. The fans were getting pretty loud, so after a while I decided to remove the side panel. As soon as I did that, the GPU fans slowed down and temps started dropping from 89C to 82C, so this case is lacking a lot of cool air. My room is only 72F.

I was thinking of grabbing a *Thermaltake Core X9 Black E-ATX* for better airflow, but I'm not sure; maybe it's a matter of arranging my fans and adding powerful quiet fans to push more air into this case. Not sure.


----------



## kizwan

Quote:


> Originally Posted by *ITAngel*
> 
> I wanted to share some info. I just recently got the PowerColor Devil 13 290X2 and it's not a bad card at all. It's not as loud as many make it out to be, and during stress tests, yes, it does get a bit louder. I think the only two places I heard the card run a bit louder were Battlefield 4 and the Heaven benchmark. My tests were done in my office; ambient temps were 72C. My room stays within 68C-72C.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> The Devil 13 is in the case.
> 
> -Closed Case-
> 
> 
> -Open Case-
> 
> 
> Temps didn't get past 86C in Battlefield 4
> Temps in Heaven got to 93C in the first test run in stock mode.
> 
> 
> So far the sound was quiet during normal gaming in GW2 and Rift and normal desktop activities, but during Battlefield 4 and the Heaven benchmark it was a bit louder. Temps went up to 94C during the first test. Temps stayed around 48C after that when the card was not being stress tested.
> 
> Not a bad card for $400 only a month old.


Whoa! I'd most likely get heat stroke if I lived in a 72C ambient room. The survival rate is also not very good.










I disagree with the saying that negative pressure is bad. Not at all. The only downside of negative pressure is dust. With heat-dumping cards like Hawaii, especially two or more of them, you want active exhaust airflow, which you can get with negative pressure. With positive pressure, if there is air turbulence, it can prevent the heat from escaping the case properly and trap the heat instead. I didn't say positive pressure is bad, but make sure you position the intake and exhaust fans properly if you want positive pressure. Negative pressure is much easier.
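For what it's worth, the intake-vs-exhaust argument in this thread reduces to simple arithmetic: total intake airflow minus total exhaust airflow. A minimal sketch below; the CFM figures are made-up illustrative numbers, not measured specs for anyone's actual fans.

```python
# Rough case-pressure check: compare total intake vs. exhaust airflow.
# CFM values here are hypothetical examples only.

def pressure_balance(intake_cfm, exhaust_cfm):
    """Return 'positive', 'negative', or 'neutral' based on net airflow."""
    net = sum(intake_cfm) - sum(exhaust_cfm)
    if net > 0:
        return "positive"
    if net < 0:
        return "negative"
    return "neutral"

# The setup being discussed: one front intake vs. three case exhausts.
intake = [90]             # single 200mm front fan (assumed rating)
exhaust = [50, 50, 50]    # three smaller fans out (assumed ratings)
print(pressure_balance(intake, exhaust))  # -> negative
```

An open-air GPU cooler dumping heat into the case tilts the practical balance further, which is why both sides of the argument above still agree on needing real exhaust airflow.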


----------



## ITAngel

Quote:


> Originally Posted by *kizwan*
> 
> Whoa! I'd most likely get heat stroke if I lived in a 72C ambient room. The survival rate is also not very good.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I disagree with the saying that negative pressure is bad. Not at all. The only downside of negative pressure is dust. With heat-dumping cards like Hawaii, especially two or more of them, you want active exhaust airflow, which you can get with negative pressure. With positive pressure, if there is air turbulence, it can prevent the heat from escaping the case properly and trap the heat instead. I didn't say positive pressure is bad, but make sure you position the intake and exhaust fans properly if you want positive pressure. Negative pressure is much easier.


Well, I am trying to find the best way to cool everything down quicker, since the fan noise comes from the heat trapped in the case. I did notice that as soon as I removed the side panel, the graphics card noise started going down. In just a minute or two the temps dropped from 89C to 82C and the noise level also dropped. So I have to figure out the best way to get everything working here with the current fans I have. I can't afford to add more at the moment, as I have lots of drives, an OC'd CPU, this graphics card, etc., and am only running a 1000W EVGA SuperNova G2 power supply.

I have not placed fans on the bottom of the case because it looks like I need some low-profile fans to fit under there, so I only have the top, front, and rear. If I want to add intake fans at the bottom, I will have to remove the PSU cover so I can place one to blow directly at the GPU.

What do you guys think?

If I did that, I would put a 120mm on the bottom, have my 200mm in front, a 120mm on top, and the 140mm on the back.


----------



## mus1mus

Quote:


> Originally Posted by *Cyber Locc*
> 
> 1 fan pushing in and 3 taking out is negative pressure, lol. *Also remember the GPU's fan is an exhaust, so you have 4 fans out, 1 in.* That is extreme negative pressure, and it's all bad. Always have more fans going in than out. You want positive or neutral pressure; negative is all bad.
> 
> Your CPU temps should drop too by flipping those fans.


As bad a piece of advice as I have ever seen.

1. Only reference cards exhaust hot air. Custom cooled cards spray hot air all inside the case.

2. Negative or Positive pressure, the difference is DUST. Not cooling.


----------



## kizwan

Quote:


> Originally Posted by *ITAngel*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Whoa! I'd most likely get heat stroke if I live in 72C ambient room. Survival rate also not very good.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I disagree the saying that negative pressure is bad. Not at all. The only negative about negative pressure is dust. With heat dumping cards like Hawaii, especially two or more of them, you want active exhaust airflow which you can get by negative pressure. With positive pressure, if there is air turbulence, it can prevent the heat from escaping the case properly & trap the heat instead. I didn't say positive pressure is bad but make sure you positioned the intake & exhaust fans properly if you want positive pressure. Negative pressure is much easier.
> 
> 
> 
> Well I am trying to find the best way I can make it so that it will cool everything in quicker since the noise of the fans come in due to the heat trap in the case. I did notice as soon I removed the side panel the graphic card noise started going down. In just about a minute or two the temps drop from 89C to 82C and the noise level also did drop. So I have to figure out a best way to get everything working here with the current fans I have. I can't afford to add more at the moment as I have lots of drives, an OC cpu, this graphic card etc... and only running a 1000w EVGA SuperNova G2 power supply.
> 
> I have not placed fans on the button of the case because it looks like I need some low profile fans to move under the case. so I only have the top, and front and rear. If I want to use the bottom to add intake fans, I will have to remove the PSU cove so I can place one to blow directly to the GPU.
> 
> What you guys think?
> 
> If I did that, I would put a 120mm down, have my 200mm on front, 120mm on top and the 140mm on the back.

Open-case temps will always be lower than closed-case temps. I think you can lower the temps with higher-CFM fans, but there will be an issue with noise. I would keep both the top and rear fans working as exhaust. Also keep the front as intake, although I would replace the 200mm with a pair of good 120mm fans. Did you use a custom fan profile on the Devil?


----------



## Mega Man

Speak for yourself! Mine are the same no matter what


----------



## cephelix

I was going to suggest @ITAngel the same setup I have running in my 750D, but it seems like there isn't enough space on the unused PCI slot covers to strap another 120mm fan from the inside.
Nonetheless, here it is. The fans are all 1850rpm Gentle Typhoons, which work the best in my scenario:
Top: 2 as intake nearer the front since the slot nearest the back as intake actually increases my CPU temps.
Front: 2 as intake
Bottom: 2 as intake, fiddling with temps suggests that it doesn't really do much. Maybe 1 degree difference in card temps
Top Back: 1 as exhaust
Bottom Back (Unused PCIE slots): Cable-tied a fan to the inside. This one made a huge difference to temps, a 5C drop. Since the case doesn't have a side panel fan to extract all the hot air thrown out by the card, this was the only other way.

Of course I did other mods to the card, CLU on die and Fujipoly Ultra Extreme on the VRMs.
Hope this helps. If not, you could probably ghetto-rig a 120mm CLC to each die to bring down temps as well, though it won't look as elegant.


----------



## battleaxe

Quote:


> Originally Posted by *ITAngel*
> 
> Hi there, thanks for the tip. I've been running a game to test it. The fans were getting pretty loud, so after a while I decided to remove the side panel. As soon as I did that, the GPU fans slowed down and temps started dropping from 89C to 82C, so this case is lacking a lot of cool air. My room is only 72F.
> 
> I was thinking of grabbing a *Thermaltake Core X9 Black E-ATX* for better airflow, but I'm not sure; maybe it's a matter of arranging my fans and adding powerful quiet fans to push more air into this case. Not sure.


Interestingly, I have both of these cases. The Phanteks and the X9. The X9 has vastly more availability for airflow. That being said you should be able to do fine with the one you have too. You just need to add more 120mm high pressure fans. 8-10 of them would do it nicely.


----------



## Vellinious

lol, my goodness.. 8-10 fans? That's more fans than a lot of watercooled rigs.


----------



## mus1mus

I would set the top as exhaust, set the rear and the front as intake, and reverse the orientation of the CPU cooler to pull cold air from the rear.

Add a little undervolt and an aggressive fan profile on the GPU.

Maybe a fan at the bottom (if supported) as intake.

With that monster of a card, I would avoid letting it heat the CPU with the hot air it radiates.

Otherwise, replacing the fans should be seriously considered.


----------



## battleaxe

Quote:


> Originally Posted by *Vellinious*
> 
> lol, my goodness..8-10 fans? More fans than a lot of watercooled rigs.


Well, I should have added the caveat that those fans would be low-RPM, high static pressure, so as to not be annoying and loud. That's the reason for more of them.


----------



## ITAngel

Quote:


> Originally Posted by *mus1mus*
> 
> I would set the top as exhaust, set the rear and the front as intake, reverse the orientation of the CPU cooler to muster cold air from the rear.
> 
> Add a little Undervolt and an agressive fan profile on the GPU.
> 
> Maybe a fan at the bottom (if supported) as intake.
> 
> With that monster of a card, I would avoid it heating the CPU from the hot air it radiates.
> 
> Otherwise, fans replacement should be highly considered.


Thanks, I will give that a shot and see. I don't have many connectors on the mATX motherboard; until I get a better fan controller, I am limited in the number of fans I can run in the case.


----------



## ITAngel

Okay, this is what I did, and with the help of a friend I configured the fan profile in MSI Afterburner. So far temps are much better, but I need more cool intake air: better fans or a better layout, etc.



The top fan is feeding the CPU with fresh cool air directly, and the rear fan is taking the CPU's hot air out of the case. The front fan is feeding the case, while the inside 120mm fan helps move that air from the 200mm fan in front to the GPU. I also added a fan to the bottom of the case to push more fresh air up, while having that inner fan push it toward the GPU. We found that the PSU cover was causing a heat pocket between the GPU and the PSU. So far temps are great, but they could be better.









----------



## kizwan

Quote:


> Originally Posted by *ITAngel*
> 
> Okay, this is what I did, and with the help of a friend I configured the fan profile in MSI Afterburner. So far temps are much better, but I need more cool intake air: better fans or a better layout, etc.
> 
> 
> 
> The top fan is feeding the CPU with fresh cool air directly, and the rear fan is taking the CPU's hot air out of the case. The front fan is feeding the case, while the inside 120mm fan helps move that air from the 200mm fan in front to the GPU. I also added a fan to the bottom of the case to push more fresh air up, while having that inner fan push it toward the GPU. We found that the PSU cover was causing a heat pocket between the GPU and the PSU. So far temps are great, but they could be better.
> 
> 
> 
> 
> 
> 
> 
> :


Yeah, a custom fan profile is important with Hawaii cards.

I think the fan on the CPU cooler helps a lot with the airflow in that fan configuration.
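For readers wondering what the "custom fan profile" discussed here actually does: it's a piecewise-linear map from GPU temperature to fan duty cycle, like the curve editors in Afterburner or Trixx. A minimal sketch with illustrative curve points (not anyone's actual profile):

```python
# A fan curve is a list of (temperature C, fan %) points; the controller
# linearly interpolates between them. These points are made-up examples.

CURVE = [(40, 30), (60, 45), (75, 70), (85, 100)]

def fan_percent(temp_c, curve=CURVE):
    """Return the fan duty cycle for a given GPU temperature."""
    if temp_c <= curve[0][0]:
        return curve[0][1]        # below the curve: minimum speed
    if temp_c >= curve[-1][0]:
        return curve[-1][1]       # above the curve: pin at maximum
    for (t0, f0), (t1, f1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            # linear interpolation between the two surrounding points
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_percent(67.5))  # -> 57.5, midway between the 60C and 75C points
```

An "aggressive" profile, as suggested earlier in the thread, just means steeper segments: fan speed climbs faster per degree, trading noise for lower peak temps.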


----------



## ITAngel

Yeah, I wonder if it would help more to put the Noctua NH-D15 back in, since it has larger fans. Or take the CPU fans, I mean the Noctua 140mm fans, and place them in my case instead.


----------



## JourneymanMike

Quote:


> Originally Posted by *ITAngel*
> 
> Yeah, I wonder if it would help more to put the Noctua NH-D15 back in, since it has larger fans. Or take the CPU fans, I mean the Noctua 140mm fans, and place them in my case instead.


Can you say "Custom Loop Water Cooling"?









You won't have the heat problems, or the noise problem!!!


----------



## ITAngel

Lol, I can say a $400-$500 WC parts list. Lol


----------



## Vellinious

Quote:


> Originally Posted by *ITAngel*
> 
> Lol, I can say a $400-$500 WC parts list. Lol


On the low side....


----------



## JourneymanMike

Quote:


> Originally Posted by *ITAngel*
> 
> Lol, I can say a *$400-$500* WC parts list. Lol


Just to start. It can become a horrible addiction that will empty your spare cash in one big hurry!!!

I have twice that much in fittings alone!


----------



## mus1mus

Quote:


> Originally Posted by *JourneymanMike*
> 
> Just to start. It can become a horrible addiction that will empty your spare cash in one big hurry!!!
> 
> I have twice that much in fittings alone!


GAWLD FIIITIINGS?


----------



## mfknjadagr8

Quote:


> Originally Posted by *JourneymanMike*
> 
> Just to start. It can become a horrible addiction that will empty your spare cash in one big hurry!!!
> 
> I have twice that much in fittings alone!


I've got around 1k in mine now...
Two gpu blocks - 300
H220x - 150
280 and 240 rad - 100
Mcp50x and res - 100
420mm rad - 75
480mm rad - 86
12 fans (four included with h220x and 240 rad) - 60
8 fittings - 40(other 6 are the included barbs)
10 feet of Onyx tubing - 30

I'm sure I've forgotten something, but I've got nothing fancy in my loop: all straight fittings, nothing rotary, and all my fans are Swiftech Helix fans. Also keep in mind I found great deals on most of my parts; the H220X, the GPU blocks, and the fittings are the only things I paid retail for. I've actually got a Koolance high-FPI radiator I've been meaning to give away as a freebie, I just haven't had the time.
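The "around 1k" figure in the list above is easy to sanity-check by summing the quoted prices:

```python
# Totaling the parts list quoted above (prices in USD as stated).
parts = {
    "two GPU blocks": 300,
    "H220X": 150,
    "280mm + 240mm rads": 100,
    "MCP50X + res": 100,
    "420mm rad": 75,
    "480mm rad": 86,
    "12 fans": 60,
    "8 fittings": 40,
    "10 feet of tubing": 30,
}
total = sum(parts.values())
print(total)  # -> 941, so "around 1k" checks out even before forgotten items
```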


----------



## Vellinious

I've got a ton in my loop....the overall $$$ value dropped when I sold both 970s, and went to a single 290X block, but....still rather pricey.

Some pics of my rig here...this is during the transition from 970s to 290X.

http://www.modsrigs.com/detail.aspx?BuildID=35270


----------



## ITAngel

Quote:


> Originally Posted by *Vellinious*
> 
> I've got a ton in my loop....the overall $$$ value dropped when I sold both 970s, and went to a single 290X block, but....still rather pricey.
> 
> Some pics of my rig here...this is during the transition from 970s to 290X.
> 
> http://www.modsrigs.com/detail.aspx?BuildID=35270


damn nice looking setup







I am too poor to try going that route yet, hahaha. I will stick to air cooling for now.


----------



## Vellinious

Quote:


> Originally Posted by *ITAngel*
> 
> damn nice looking setup
> 
> 
> 
> 
> 
> 
> 
> I am too poor to try going that route yet, hahaha. I will stick to air cooling for now.


Thanks...it was a lot of work. = )


----------



## mAs81

Hey guys..
Sooo... I'm having a somewhat peculiar problem that I don't think is normal. This has been going on for some time.

While watching YouTube videos on my 2560x1440 monitor and browsing OCN on my 1920x1080 one, I noticed that my Vapor-X LEDs turned yellow because the card was running at 55-58C despite low ambient temps of 22-24C (it's winter here, and it's been cold these days).
I then started checking what was going on, and saw that my core clock was at max (1030MHz) even though my GPU load was minimal, to say the least.
I have hardware acceleration turned off in my browser, Virtual Super Resolution turned off in my 16.1 Radeon drivers, and my GPU is not OC'd at all.

Here's how it looks :


Spoiler: Warning: Spoiler!








What I have noticed is that from time to time my Vapor-X fans rev up for a second or two, and then go back to normal.
What I was thinking of doing is taking out the card and changing the TIM, but that won't change the fact that I get full core clock speeds at minimal load.

So, what do you guys think: is this normal and I'm paranoid, or is something going on??

Any help will be highly appreciated


----------



## kizwan

Quote:


> Originally Posted by *mAs81*
> 
> Hey guys..
> Sooo...I'm having a somewhat peculiar problem,that I do not think is normal..This has been going on for some time..
> 
> When watching YouTube videos on my 2560x1440 monitor,while browsing OCN on my 1920x1080 one , I noticed that my Vapor-X LEDs turned yellow because it was running @ 55-58c with low ambient temps(since it's winter here,and it's been cold these days) of 22-24c
> I started then to check out what was going on,and I saw that my Core Clock was @ max(1030) even though my GPU Load was minimal,to say the least..
> I have Hardware acceleration turned off on my browser ,Virtual Super Resolution Turned off on my 16.1 Radeon drivers , and my GPU is not OC'd at all..
> 
> Here's how it looks :
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> What I have noticed is that from time to time,my Vapor-X fans rev up for a sec or 2 , and then go back to normal..
> What I was thinking to do is,take out the card and change the TIM,but that won't change the fact that I get full core clock speeds at minimal load..
> 
> So,what do you guys think,is this normal and I'm paranoid,or something is going on??
> 
> Any help will be highly appreciated


In MSI AB, if you press the (I)nformation button, you can see the list of active 3D processes. Maybe it will list which app is causing that.


----------



## mAs81

Quote:


> Originally Posted by *kizwan*
> 
> In MSI AB, if you press the (I)nformation button, you can see the list of active 3D processes. Maybe it will list which app is causing that.


I'm using Sapphire Trixx, but I guess it's worth a try. As we speak, though, my fans went to 100% and I had to reboot to make it stop.
What in the world is going on, lol?


----------



## kizwan

Quote:


> Originally Posted by *mAs81*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> In MSI AB, if you press the (I)nformation button, you can see list of active 3d process. Maybe it list what app is causing that.
> 
> 
> 
> I'm using Sapphire Trixx..but I guess it's worth a try..As we are speaking tho , my fans went to 100% and I had to reboot for it to stop..
> What in the world is going on, lol ?

I think that's the Crimson bug. Are you using the latest drivers?


----------



## mAs81

Quote:


> Originally Posted by *kizwan*
> 
> I think that's the Crimson bug. Are you using the latest drivers?


I'm using the 16.1 Hotfix beta drivers


----------



## Vellinious

Quote:


> Originally Posted by *mAs81*
> 
> Hey guys..
> Sooo...I'm having a somewhat peculiar problem,that I do not think is normal..This has been going on for some time..
> 
> When watching YouTube videos on my 2560x1440 monitor,while browsing OCN on my 1920x1080 one , I noticed that my Vapor-X LEDs turned yellow because it was running @ 55-58c with low ambient temps(since it's winter here,and it's been cold these days) of 22-24c
> I started then to check out what was going on,and I saw that my Core Clock was @ max(1030) even though my GPU Load was minimal,to say the least..
> I have Hardware acceleration turned off on my browser ,Virtual Super Resolution Turned off on my 16.1 Radeon drivers , and my GPU is not OC'd at all..
> 
> Here's how it looks :
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> What I have noticed is that from time to time,my Vapor-X fans rev up for a sec or 2 , and then go back to normal..
> What I was thinking to do is,take out the card and change the TIM,but that won't change the fact that I get full core clock speeds at minimal load..
> 
> So,what do you guys think,is this normal and I'm paranoid,or something is going on??
> 
> Any help will be highly appreciated


NVIDIA drivers experienced the same issues with multi-monitor setups. There was a workaround, but I can't remember what it was. Worth googling to see if the workaround will work in this case. It sounds like exactly the same problem.


----------



## mAs81

Quote:


> Originally Posted by *Vellinious*
> 
> NVIDIA drivers experienced the same issues with multiple monitor setups. There was a work around, but I can't remember what it was. Worth googling, see if the workaround will work in this case. Sounds like exactly the same problem.


Yeah, the workaround was to disable extreme performance mode in the NVIDIA panel..
Unfortunately there's not much I can do here.









But since I restarted because the fans went to 100%, there's been some progress... kinda.
I downloaded and installed MSI AB, which I've decided I'll be using from now on, and everything seems back to normal?


Spoiler: Warning: Spoiler!




I captured this during the same conditions


----------



## ZealotKi11er

Quote:


> Originally Posted by *mAs81*
> 
> Yeah , the workaround was to disable extreme performance in the NVIDIA panel..
> unfortunately not much I can do here
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But since I restarted because the fans went 100% , there's been some progress...kinda..
> I downloaded and installed the msi AB which I decided that I'll be using from now on , and everything's seems back to normal?
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> I captured this during the same conditions


What happens when you remove your 1080p monitor?


----------



## mAs81

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What happens when you remove your 1080p monitor?


Didn't try it, to tell you the truth. What I do know is that dual monitors keep the memory clock at maximum, at least. I tried some gaming and back-to-back dual-monitor YouTube/browsing, and my clocks/temps seem to be fine for now.








But if it re-occurs,I'll disconnect the 1080p monitor and get back to you..
I love my Vapor X to death , but all these peculiar thermal glitches are really making me furious. . .


----------



## ZealotKi11er

Quote:


> Originally Posted by *mAs81*
> 
> Didn't try it to tell you the truth..What I do know is that dual monitors keep the memory clock at maximum at least..I tried some gaming and back to back dual monitor You tube/Browsing and my clocks/temps seem to be fine for now..
> 
> 
> 
> 
> 
> 
> 
> 
> But if it re-occurs,I'll disconnect the 1080p monitor and get back to you..
> I love my Vapor X to death , but all these peculiar thermal glitches are really making me furious. . .


It's going to be the same with every other air-cooled card. I have been under water for years now and only used dual monitors for a very short time, so I never had the 2D clock bug. Some people get this with certain drivers. Not sure where the problem is, but it's an old problem that has never really been fixed.


----------



## mAs81

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It's going to be the same with every other air-cooled card. I have been under water for years now and only used dual monitors for a very short time, so I never had the 2D clock bug. Some people get this with certain drivers. Not sure where the problem is, but it's an old problem that has never really been fixed.


Yeah, after googling around, I saw people complaining about clock bugs from waaay back..
Let's hope that it's fixed now, because having my card run at 58C for simple everyday stuff was not life-threatening for the card itself, but it was making me anxious.


----------



## ZealotKi11er

Quote:


> Originally Posted by *mAs81*
> 
> Yeah,after googling around,I saw people complaining about clock bugs from waay back..
> Let's hope that it's fixed now , because having my card work at 58c for simple everyday stuff was not life threatening for the card itself , but was making me anxious


Newer cards like the 390 run even hotter with their 0 RPM fan modes.


----------



## cephelix

I have a dual-monitor setup and my core clocks remain low unless I game. With YouTube/videos my core is at 300-500MHz only. The only time my fans rev up and down is when Trixx is enabled, but I suppose that's a characteristic of the software. Memory does stay at constant 3D clocks though. I'm using the Crimson drivers without the hotfix. The downclocking issue in games is a pain though; I resorted to using ClockBlocker.


----------



## mAs81

Quote:


> Originally Posted by *ZealotKi11er*
> 
> New card like 390 run even hotter with 0 RPM fan.


Yeah, but isn't it kinda normal for it to do that with no fan? AMD has always liked to run its cards hot.








Quote:


> Originally Posted by *cephelix*
> 
> I have a dual monitor setup and my core clocks remain low unless I game. Youtube/videos and my core is at 300-500mhz only. The only time my fans rev up and down is when trixx is enabled but I suppose that's the characteristic of the software. Memory does stay at constant 3D clocks though. Using the crimson drivers without the hotfix. The downclocking issue in games is a pain though. Resorted to using clockblocker


I'm kinda leaning towards Trixx being the problem too. I haven't uninstalled it yet, as it was very convenient, but it seems that using AB for my custom fan curve is not creating any problems, for now at least.


----------



## cephelix

Quote:


> Originally Posted by *mAs81*
> 
> Yeah but isn't it kinda regular for it to do that with no fan?AMD always liked to run it's cards hot
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm kinda leaning towards Trixx being the problem too..I haven't uninstalled it yet , it was very convenient , but it seems that having AB for my custom fan curve is not creating any problem , for now at least.


Well, as far as I know, AB doesn't give me those jerky fan ramp-ups. I went with Trixx because for some reason it gives my OC better stability than AB. Not sure if I'm just imagining it though. I just use what works for me.


----------



## mAs81

Well, with Trixx I had my card running at 1100/1500 without touching the voltage, and at 1150/1500 with minimal voltage added..
The last time I used AB was with my MSI 280X, which OC'd to 1200/1600 with the power limit at +20%..
Time to see what my 81% ASIC 290 can do with AB, I guess


----------



## cephelix

Quote:


> Originally Posted by *mAs81*
> 
> Well with Trixx I had my card running at 1100/1500 without touching the voltage , and at 1150/1500 with minimal voltage added..
> Last time I used AB was with my msi 280X , which OC'd at 1200/1600 with the power limit @ +20% ..
> Time to see what my 81%ASIC 290 can do with the AB , I guess


LUCKY with a capital everything! I need +52mV for 1100 and +102mV for 1150, and my ASIC is around the 80% mark as well.


----------



## ZealotKi11er

Quote:


> Originally Posted by *cephelix*
> 
> LUCKY with a capital everything! I need +52mv for 1100 and +102mv for 1150. And my asic is around the 80% mark as well.


I need +50mV for one card, but since I use CFX I have to feed them both the same. That's for 1150/1500. I can do 1200 with +125mV, but it's really not worth the heat.


----------



## mAs81

Quote:


> Originally Posted by *cephelix*
> 
> LUCKY with a capital everything! I need +52mv for 1100 and +102mv for 1150. And my asic is around the 80% mark as well.


Don't really know what to tell you there.. I guess I got a good card? I'll do some benchmarks in the next few days and post them here for you guys to see..

This card has a funny backstory to it. I bought it when I was in the U.S., and since then the fan controller on the PCB for the two heatsink fans got fried, so now I control those two fans with my 5.25" bay fan controller via a custom cable, and I've had my share of black-screening/thermal issues and whatnot..

But I really do love this card


----------



## cephelix

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I need +50mV for 1 card but since I use CFX I have to feed them both the same. Thats for 1150/1500. I can do 1200 with +125mV but really not worth the heat.


If only I had the thermal headroom for 1200MHz. How is CFX in games, by the way? Kinda tempted to go CFX for Polaris. Wondering if it'll be worth it, and also whether my 6-7-year-old Seasonic X-750 would be able to handle it.
Quote:


> Originally Posted by *mAs81*
> 
> Don't really know what to tell you there..I guess I got a good card?I'll do some benchmarks the next days and post them here for you guys to see..
> 
> This card has a funny backstory to it..I bought it when I was in the U.S and since then , the fan controller on the PCB for the 2 heatsink fans got fried,so now I control the 2 said fans with my 5.25 bay fan controller via a custom cable and I've had my series of black screening/thermal issues and whatnot..
> 
> But I really do love this card


How'd the fan controller get fried? And why didn't you RMA it??


----------



## mAs81

Quote:


> Originally Posted by *cephelix*
> 
> How'd the fan controller get fried? And why didn't you RMA it??


Don't really know how, it just did, I guess..
And RMA was not really an option, since I bought it using my cousin's Amazon Prime account and I live in Greece..
I've made a thread about it, if you'd be interested to see how I did it and what I went through until I noticed it








Here..
Plus, I did a quick Unigine run @ 1100/1500:


----------



## ZealotKi11er

Quote:


> Originally Posted by *cephelix*
> 
> If only I had the thermal headroom for 1200mhz. How is cfx in games by the way? Kinda tempted to go cfx for polaris. Wondering if it'll be worth it and also whether my 6-7 year old seasonic x750w would be able to handle it.
> How'd the fan controller get fried?and why didn't you rma??


Right now CFX is in a very bad state. Almost all games that use GimpWorks have CFX problems. All I can say for CFX is that games that come from AMD will work well. Games from EA will work well since they use Frostbite. For Polaris, just get the best single GPU and call it a day.


----------



## jodybdesigns

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Right now CFX is in a very bad state. Almost all games that use GimpWorks have CFX problems. All I can say for CFX is that games that come from AMD will work well. Games from EA will work well since they use Frostbite. For Polaris, just get the best single GPU and call it a day.


I second this. I am debating cards in another thread. My Crossfire 7950s are absolutely worthless in 70% of games now. CFX should be awesome in games like Fallout 4, but it just isn't.

Stupid GimpWorks is ruining everything with Nvidia... like another bankrupt company called 3dfx once did in the past with this crap....


----------



## cephelix

Quote:


> Originally Posted by *mAs81*
> 
> Don't really know how, it just did I guess..
> And RMA was not really an option, since I bought it using my cousin's Amazon Prime account and I live in Greece..
> I've made a thread about it, if you'd be interested to see how I did it and what I went through until I noticed it
> 
> 
> 
> 
> 
> 
> 
> 
> Here..
> Plus , I did a quick Unigine run @ 1100/1500 ;


Ahh, bookmarked for later reading. Well, at least you have a workaround that is less of a hassle than RMA-ing the card.
Quote:


> Originally Posted by *ZealotKi11er*
> 
> Right now CFX is in a very bad state. Almost all games that use GimpWorks have CFX problems. All I can say for CFX is that games that came from AMD will work well. Games from EA will work well since they use Frostbite. For Polaris, just get the best single GPU and call it a day.


That was what I was afraid of. But that's my initial plan for Polaris, only if it's good though. If not, I'll just keep my 290 till the next run of cards comes out. I can still play AC Syndicate near 60fps, mostly on high.


----------



## Vellinious

Has anyone with a 290, 290X, 390, or 390X hit any of the following? No tess tweaks....

16k graphics score in firestrike?
3500+ in Valley (extreme HD preset)
1900+ in Heaven (highest settings, 1080p)


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> Has anyone with a 290, 290X, 390, or 390X hit any of the following? No tess tweaks....
> 
> 16k graphics score in firestrike?
> 3500+ in Valley (extreme HD preset)
> 1900+ in Heaven (highest settings, 1080p)


16K FS? Not even a handful have hit that. You did hit it, right? You know the other two.









Imma try Heaven and Valley. I think I hit 1600+ in Heaven on Extreme? Do note, system resolution scores higher. You need to specify 1080.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> 16K FS, not a handful. You did hit it right? You know the other two.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Imma try Heaven and Valley. I think I hit it 1600+ Heaven on Extreme? Do note, System Resolution scores higher. You need to specify 1080.


If you set the preset to Extreme, that'll set everything to the highest values. Then set it back to Custom so you can change the resolution to 1920 x 1080.

I want to see 1900.....

For Firestrike, I did hit 16.1k, but... it was a fluke. Graphics test 2 glitched, the screen flashed, and it kept running. It inflated the score... I can't count that. My high without any glitches is 15.7k.


----------



## ITAngel

Is it me, or does my graphics card like to run at 94C? I have the following setup. The test I am using is Heaven; I didn't post a result yet.

2x 120mm top fan intake
1x 120mm top rear fan exhaust
1x 120mm bottom fan intake
1x 140mm rear fan exhaust
1x 200mm front fan intake
2x 120mm internal support fans to help move more air from the 200mm.

Lol, a lot of fans. Not sure if this case is doing it justice; all I know is the two inner fans have 74.x CFM @ 2k RPM.
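For what it's worth, whether a fan layout like this ends up with positive or negative case pressure can be roughly sanity-checked by summing rated CFM per direction. A minimal sketch; the CFM figures below are illustrative placeholders (only the two 74 CFM inner fans come from the post), not measured values:

```python
# Rough case-pressure check: total rated intake CFM vs exhaust CFM.
# Fan ratings here are placeholders, not measurements.

def net_airflow(fans):
    """fans: list of (cfm, direction), direction is 'in' or 'out'.
    Returns (intake_total, exhaust_total, net); net > 0 means positive pressure."""
    intake = sum(cfm for cfm, d in fans if d == "in")
    exhaust = sum(cfm for cfm, d in fans if d == "out")
    return intake, exhaust, intake - exhaust

layout = [
    (74.0, "in"),   # 120mm top intake (rated ~74 CFM per the post)
    (74.0, "in"),   # 120mm top intake
    (50.0, "out"),  # 120mm top-rear exhaust (placeholder rating)
    (50.0, "in"),   # 120mm bottom intake (placeholder)
    (60.0, "out"),  # 140mm rear exhaust (placeholder)
    (90.0, "in"),   # 200mm front intake (placeholder)
]

intake, exhaust, net = net_airflow(layout)
print(intake, exhaust, net)
```

With those placeholder numbers the intake far outweighs the exhaust, which matches the later advice in the thread that dual Hawaii setups need a lot more exhaust.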


----------



## JourneymanMike

Quote:


> Originally Posted by *ITAngel*
> 
> Is it me, or does my graphics card like to run at 94C? I have the following setup. The test I am using is Heaven; I didn't post a result yet.
> 
> 2x 120mm top fan intake
> 1x 120mm top rear fan exhaust
> 1x 120mm bottom fan intake
> 1x 140mm rear fan exhaust
> 1x 200mm front fan intake
> 2x 120mm internal support fans to help move more air from the 200mm.
> 
> Lol, a lot of fans. Not sure if this case is doing it justice; all I know is the two inner fans have 74.x CFM @ 2k RPM.


Do you have a pic or a drawing, as to where these fans are exactly located?

Are your two top fans right next to each other? One exhaust, one intake?


----------



## ITAngel

Quote:


> Originally Posted by *JourneymanMike*
> 
> Do you have a pic or a drawing, as to where these fans are exactly located?
> 
> Are your two top fans right next to each other? One exhaust, one intake?


I can make a drawing of the layout, give me a bit.


----------



## specopsFI

As I already said, you need a lot of exhaust for dual Hawaii. I wouldn't even consider anything less than two top exhausts and one rear exhaust, then adjust the intake to your liking.


----------



## ITAngel

Sorry, not the best drawing, but I am at work and needed to do it fast since I am busy with some coding. lol


----------



## Cyber Locc

Quote:


> Originally Posted by *ITAngel*
> 
> Sorry not the best drawing but I am at work and needed to do it fast since I am busy with some coding. lol


I looked into this a little for you, and the answer to your earlier question ("Is it me, or does my graphics card like to run at 94C?") is yes.

I looked at a few reviews and posts about that card, and all of them show it running at 80-90C with the fans at almost Performance-mode speed, on a test bench, much less in a case.

I will preface this part by saying that I do not air cool. I haven't in a very long time; I don't even own an air cooler for CPUs. Just FYI, so take this with a grain of salt if you want.

That card has a bad rap for a reason. It should NOT have been air cooled; no other 295X2 is air cooled, and there is a reason AMD water cooled the reference. You are seeing that reason right now.

All that said, I have a few ideas for you. Take the heatsink off and reapply good thermal paste; if you're brave enough, use conductive paste like CLU/CLP. Make an aggressive fan profile (by aggressive I mean 100% at load; it's going to be very loud, but that is the only way to get that card even remotely cooled).

You have tried a lot as far as case cooling; it is obvious the issue is not your case. That card should never have been made on air (however, that is just my opinion, and as a WCer it may be biased; that's why the preface).

Here is a review for ya: http://www.guru3d.com/articles-pages/powercolor-radeon-290x-295x2-devil13-review,27.html. They have the same temp issues, and they do not even have it in a case.

Here is the issue with "it's not as loud as many make them out to be": they ARE as loud as people make them out to be. You most likely have it in quiet mode and don't have a fan curve capable of cooling the card. If your card is hitting 95C, your fans are nowhere near fast enough. When you run those fans at the speed they need to be to cool that thing, it will be a jet engine.


----------



## ITAngel

Quote:


> Originally Posted by *Cyber Locc*
> 
> I looked into this a little for you, and the answer to your earlier question ("Is it me, or does my graphics card like to run at 94C?") is yes.
> 
> I looked at a few reviews and posts about that card, and all of them show it running at 80-90C with almost max fan speed, on a test bench, much less in a case.
> 
> I will preface this part by saying that I do not air cool. I haven't in a very long time; I don't even own an air cooler for CPUs. Just FYI, so take this with a grain of salt if you want.
> 
> That card has a bad rap for a reason. It should NOT have been air cooled; no other 295X2 is air cooled, and there is a reason AMD water cooled the reference. You are seeing that reason right now.
> 
> All that said, I have a few ideas for you. Take the heatsink off and reapply good thermal paste; if you're brave enough, use conductive paste like CLU/CLP. Make an aggressive fan profile (by aggressive I mean 100% at load; it's going to be very loud, but that is the only way to get that card even remotely cooled).
> 
> You have tried a lot as far as case cooling; it is obvious the issue is not your case. That card should never have been made on air (however, that is just my opinion, and as a WCer it may be biased; that's why the preface).
> 
> Here is a review for ya: http://www.guru3d.com/articles-pages/powercolor-radeon-290x-295x2-devil13-review,27.html. They have the same temp issues, and they do not even have it in a case.


I see; the tips/advice and info are much appreciated. What is the safe temp zone, and where should I stay away from on this card? As soon as I removed the side panel, temps went from 89C to 82C in a matter of a minute or so on stock profile settings. I am thinking a case such as a Cooler Master HAF XB EVO (HAF XB Rev. 2), which has holes on the side, may provide a good amount of air to the card. Just a thought; I am checking the dimensions on that case at the moment.

I did notice the air coming out of the GPU is hitting the side panel and getting pulled back into the graphics card. So I need to figure out a way to make sure that air doesn't come back to the card.


----------



## Cyber Locc

Quote:


> Originally Posted by *ITAngel*
> 
> I see; the tips/advice and info are much appreciated. What is the safe temp zone, and where should I stay away from on this card? As soon as I removed the side panel, temps went from 89C to 82C in a matter of a minute or so on stock profile settings. I am thinking a case such as a Cooler Master HAF XB EVO (HAF XB Rev. 2), which has holes on the side, may provide a good amount of air to the card. Just a thought; I am checking the dimensions on that case at the moment.


It may make some difference; however, I do not think it will make as big of a difference as you think. Even then, it's a personal thing, and as a water cooler my opinion will be different from an air cooler's. However, you asked, so I will answer.

To me a GPU running at even 82C is too much, way too much. I would never allow a card to run that hot, ever. I would personally not run a card over 70C, and even that would be pushing it. However, I am a spoiled water cooler whose reference 290s run at 42C under full load, so take that with a grain of salt.

As to taking the panel off versus using a vented side: that is a huge difference. Venting will not make a large difference; it may change things by 1-3C, and the temps will still be way unacceptable. You need to make an aggressive fan curve; that is the reality. It will sound like a jet engine with a proper curve; that is why you got it for $400.

Can you SS your fan curve for us and post it, please? Also do me a favor: go into AB or whatever you use, force the fans to 100%, then run Heaven and see what temps you get.


----------



## ITAngel

Quote:


> Originally Posted by *Cyber Locc*
> 
> It may make some difference however I do not think it will make as big of a difference as you think. Even then its a personal thing and as a water cooler my opinion will be different than an air cooler, however you asked so I will answer.
> 
> To me a GPU running at even 82c is too much, way too much. I would never allow a card to run that hot ever. I would personally not run a card over 70C and even that would be pushing it, However I am a spoiled water cooler whos reference 290s run at 42c under full load so take that with a grain
> 
> 
> 
> 
> 
> 
> 
> .
> 
> As to taking the panel off versus use a side that is vented, that is a huge difference. Venting will not make a large difference it may change by 1-3c and still be way unacceptable temps. You need to make an aggressive fan curve that is reality, it will sound like a jet engine with a proper curve, that is why you got it for 400.
> 
> Can you SS your fan curve for us and post it please. Also do me a favor, go into AB or whatever you use and force the fans to 100% then run heaven and see what temps you get.


I will once I get home today, but I did a run with a 60% fan curve and still hit 94C, and I stopped the test. Once again, it was touching the side panel and hot like crazy. I know I could cool this card if I had direct cool air going to the fans and had the top pulling the hot air out, but I doubt it will work with this case because of that side panel and the way the fans are.

I will do what you asked once I am home and we can go from there. Another layout I was thinking of would be something like this; would helping the air rise quicker help the card?

Worst case, I may be looking at a *LIAN LI PC-T60B Black Aluminum ATX / Micro-ATX TEST BENCH Computer Case* LOL!


----------



## jodybdesigns

You could be getting negative pressure going on here, especially if all your fans have different fan curves. Try to put them all on a controller so the air is circulating correctly at the same speeds.

On another note, I thought these 290Xs were safe up to 120C before they throttle....


----------



## Cyber Locc

Quote:


> Originally Posted by *ITAngel*
> 
> I will once I get home today, but I did a run with a 60% fan curve and still hit 94C, and I stopped the test. Once again, it was touching the side panel and hot like crazy. I know I could cool this card if I had direct cool air going to the fans and had the top pulling the hot air out, but I doubt it will work with this case because of that side panel and the way the fans are.
> 
> I will do what you asked once I am home and we can go from there. Another layout I was thinking of would be something like this; would helping the air rise quicker help the card?
> 
> Worst case, I may be looking at a *LIAN LI PC-T60B Black Aluminum ATX / Micro-ATX TEST BENCH Computer Case* LOL!


Ya, the 60% is the concern here; no matter what case you use, 60% simply isn't going to cut it. It's the same song and dance as with the reference cards: their fans need to be at like 80-90% at load to get decent temps. At those speeds it is going to sound like a jet engine, but that is simply the reality.
Quote:


> Originally Posted by *jodybdesigns*
> 
> You could be getting negative pressure going on here, especially if all your fans have different fan curves. Try to put them all on a controller so the air is circulating correctly at the same speeds.
> 
> On another note, I thought these 290Xs were safe up to 120C before they throttle....


Umm, no? The throttle for the 295X2 is at 75C; he is way past where that card should be running. Also, in the reviews that I have seen, the Devil 13 loses to the reference 295X2; this is undoubtedly because of throttling.

A regular 290X has a throttle temp of 95C; maybe that's what you were talking about? Even then, 120 is a crazy number, and no, just no. At 120C the plastic would be melting lol.

Correction: the 290X core TJ max is 95C; the 295X2's cores' TJ max is 75C.


----------



## jodybdesigns

Quote:


> Originally Posted by *Cyber Locc*
> 
> A regular 290X has a throttle temp of 95C; maybe that's what you were talking about? Even then, 120 is a crazy number, and no, just no. At 120C the plastic would be melting lol. (I think 110 is TJ max for a 290X; it may be 120C, and that's what you might be thinking of.)


I thought the new Radeons had a VRAM limit of 120C (that is what I was referring to)? But of course, we are talking about the core. The core will, of course, throttle at far lower temps than 120C.

Yep, I had it confused with the TJ max. 120C is the TJ; 95C is the throttle temp.


----------



## Cyber Locc

Quote:


> Originally Posted by *jodybdesigns*
> 
> I thought the new Radeons had a VRAM limit of 120c (that is what I was referring to)? But of course, we are talking about the core. The core will of course throttle at far lower temps than 120C.


Ahh, ya, I'm not sure about the VRAM limits. I know the core will throttle at 95C on a regular 290X, and the VRMs will at 85C on a 290; a 295X2, however, seems to have lower max temps. At any rate, he is saying his core is at 94C in Heaven and BF4; that is beyond unacceptable.

He went on to say that with his case panel off, in other games he keeps it at 82C; that is still unacceptable. He needs to turn his fan curves way up; 60% simply isn't anywhere near enough, and that card should never have been air cooled to begin with. I know the reference 290Xs get a lot of hate for running hot, but honestly I think this thing is worse. I do know, though, that when I was testing my cards before putting them under water, it took 85% fan speed to keep them right at 95C; I wouldn't buy a reference card to keep it on air.

Also, @ITAngel: "Worst case, I may be looking at a LIAN LI PC-T60B Black Aluminum ATX / Micro-ATX TEST BENCH Computer Case LOL!" No, you don't need a bench; your case is fine. The issue is your settings. You seem to think that 60% fan speed or below is enough; well, it is not. It is not even close to enough, and this is why you think the card is quiet while everyone says it isn't. You have to have those fans running a whole lot faster than you are.

When they told you before to set an aggressive fan profile: 60% at max is not an aggressive fan profile. An aggressive fan profile means those fans hit 100% to keep that card under 80C. Set the minimum fan speed at 40% at like 60C, then the maximum at 100% fan speed at 75C.

Use the money you saved on the card for a set of really nice, really loud headphones, because you are going to need them.
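The "aggressive profile" suggested above (40% at 60C ramping to 100% at 75C) is just a linear interpolation between two points; something like this sketch shows the duty cycle you'd dial into Afterburner (the temperatures and percentages mirror the suggestion, not any official AMD curve):

```python
# Linear fan curve: 40% duty at/below 60C, ramping to 100% at/above 75C.
def fan_duty(temp_c, t_min=60.0, t_max=75.0, d_min=40.0, d_max=100.0):
    """Return fan duty (%) for a given core temp (C)."""
    if temp_c <= t_min:
        return d_min
    if temp_c >= t_max:
        return d_max
    # Interpolate between the two anchor points.
    frac = (temp_c - t_min) / (t_max - t_min)
    return d_min + frac * (d_max - d_min)

for t in (50, 60, 67.5, 75, 90):
    print(f"{t}C -> {fan_duty(t):.0f}%")
```

Halfway between the anchors (67.5C) this lands at 70% duty, already above the 60% cap being used, which is why the card keeps hitting its throttle point.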


----------



## ITAngel

I see. I will set those profiles tonight; however, I felt that the Air 540 kept my 290X Lightning cool when overclocked. I may go the HAF XB or Air 540 route.


----------



## Cyber Locc

Quote:


> Originally Posted by *ITAngel*
> 
> I see. I will set those profiles tonight; however, I felt that the Air 540 kept my 290X Lightning cool when overclocked. I may go the HAF XB or Air 540 route.


The Lightning has a good cooler and is a fairly cool-running card. The Devil 13 is not a quiet card, and it's not a cool-running card. They air cooled a card that was made to be water cooled; they took a leap and made a bad design. It sucks that the card will run hot and loud, but those are the facts.

Honestly, if noise is a concern, you need to look at a different GPU. That is sadly the reality; that card will not be quiet and run correctly, ever. All the cases in the world won't change that. You have to have an aggressive fan profile, and it's going to be very, very loud.


----------



## mus1mus

Jerry-rig AIOs. Problem solved.


----------



## JourneymanMike

Quote:


> Originally Posted by *mus1mus*
> 
> Jerry-rig AIOs. Problem solved.


Yeah, Gerry-rig (I can say that because of my German ancestry - So it's not an ethnic slur)

Are you talking about CLC's, or OLC's, or both?


----------



## mus1mus

Quote:


> Originally Posted by *JourneymanMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Jerry-rig AIOs. Problem solved.
> 
> 
> 
> Yeah, Gerry-rig (I can say that because of my German ancestry - So it's not an ethnic slur)
> 
> Are you talking about CLC's, or OLC's, or both?









all in ones?









That is the closest thing he can do.


----------



## cephelix

If @ITAngel does the top exhaust, his/her CPU temps are going to skyrocket. You need to adjust your fans in such a way that they create a nice straight path through your case, ideally of course. A side panel fan exhausting would also work. Get good fans as well; I found a massive difference between using the BitFenix Spectres vs Silverstone vs Gentle Typhoon fans.
Also, as previously mentioned, if you're feeling adventurous, CLU on the die (with a nail polish coating on the VRMs) and a change of thermal pads to something better would help decrease your temps.


----------



## ITAngel

Quote:


> Originally Posted by *cephelix*
> 
> If @ITAngel does the top exhaust, his/her CPU temps are going to skyrocket. You need to adjust your fans in such a way that they create a nice straight path through your case, ideally of course. A side panel fan exhausting would also work. Get good fans as well; I found a massive difference between using the BitFenix Spectres vs Silverstone vs Gentle Typhoon fans.
> Also, as previously mentioned, if you're feeling adventurous, CLU on the die (with a nail polish coating on the VRMs) and a change of thermal pads to something better would help decrease your temps.


Nice info, thanks! At the moment the plan is to swap my case tonight to a Corsair Air 540 and start testing/messing with the profile settings and so on. I will post my results and see how things go tonight.


----------



## kizwan

Quote:


> Originally Posted by *JourneymanMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Jerry-rig AIOs. Problem solved.
> 
> 
> 
> Yeah, Gerry-rig (I can say that because of my German ancestry - So it's not an ethnic slur)
> 
> Are you talking about CLC's, or OLC's, or both?

There is no such thing as an OLC. All loops are closed loops, unless you have the fill port open. It is either an All-In-One or a custom loop.


----------



## cephelix

Keep us updated!


----------



## JourneymanMike

Quote:


> Originally Posted by *kizwan*
> 
> There is no such thing as OLC. All loops are closed loop, unless you have fill port open. It is either All-In-One or custom loop.


Open Loop Cooler... I don't have time right now, but I just saw a video comparing different coolers... The guy called the expandable ones (the Swiftech H240X, for one) Open Loop Coolers..

Heck what do I know?


----------



## cephelix

Quote:


> Originally Posted by *JourneymanMike*
> 
> Open Loop Cooler... I don't have time right now, but I just saw a video comparing different coolers... The guy called the expandable ones (Swiftech H240X, for one) Open Loop Cooler..
> 
> Heck what do I know?


Ahh, usually the terms I've seen thrown around on OCN are CLC (Asetek stuff), AIO (Swiftech/EK Predator), and custom loops. First time I've heard OLC. Could you possibly link the video?


----------



## Cyber Locc

Quote:


> Originally Posted by *kizwan*
> 
> There is no such thing as OLC. All loops are closed loop, unless you have fill port open. It is either All-In-One or custom loop.


Dude, yes there is; I use an OLC every day. Y'all haven't lived till you run a loop with the reservoir cap off at all times; plus, you can drop in ice cubes with ease.

You just have to pray the power doesn't go out, esp if you have your reservoir over your motherboard like me, hehehehe.

Living life dangerously; they call me 001337


----------



## Mega Man

Leeloo?


----------



## Cyber Locc

Quote:


> Originally Posted by *Mega Man*
> 
> Leeloo?


Na man, Double O LEET









Leeloo is my girlfriend I love me some gingers.


----------



## Mega Man




----------



## ITAngel

Okay, now we are talking. Temps are holding way better now, and I can play my game with temps hovering around 79C-80C at 60% fan. I have not even really fine-tuned it yet; I am still setting the fan profile and testing a few games that I own with it.

Will report later on some more testing. I've also been wondering whether the Noctua NH-D15 would actually help move even more air, due to its own 140mm fans, versus the be quiet! Dark Rock Pro 3. Either way, during my games the CPU stayed within 39C.

The fan layout is:
2x 140mm Front Intake
2x 120mm Top Exhaust
1x 140mm Rear Exhaust

-Update-
Played Battlefield 4 and temps stayed within 76C-78C; the avg temp was 77C for most of the game.


----------



## Vellinious

NM...fixed it


----------



## lostsurfer

Guys, I've got a lemon of an XFX 280X DD; opened a ticket to start the RMA process. I'm more than 100% sure it's the VRM overheating. I have some leftover heatsinks from my 290's Gelid aftermarket cooler. Should I try those, or is it not worth it, seeing that the card reboots my PC around 5-10 min into gameplay?


----------



## ITAngel

Quote:


> Originally Posted by *lostsurfer*
> 
> Guys, I've got a lemon of an XFX 280X DD; opened a ticket to start the RMA process. I'm more than 100% sure it's the VRM overheating. I have some leftover heatsinks from my 290's Gelid aftermarket cooler. Should I try those, or is it not worth it, seeing that the card reboots my PC around 5-10 min into gameplay?


Hard call; I would just RMA it personally, but that is just me. At the same time, I'd be willing to try a few things before sending it out; it could be an easy fix, as long as it doesn't void my RMA.


----------



## jodybdesigns

Quote:


> Originally Posted by *lostsurfer*
> 
> Guys, I've got a lemon of an XFX 280X DD; opened a ticket to start the RMA process. I'm more than 100% sure it's the VRM overheating. I have some leftover heatsinks from my 290's Gelid aftermarket cooler. Should I try those, or is it not worth it, seeing that the card reboots my PC around 5-10 min into gameplay?


Kinda sounds like a power issue to me if you are getting reboots under load...


----------



## kizwan

Quote:


> Originally Posted by *jodybdesigns*
> 
> Quote:
> 
> 
> 
> Originally Posted by *lostsurfer*
> 
> Guys, I've got a lemon of an XFX 280X DD; opened a ticket to start the RMA process. I'm more than 100% sure it's the VRM overheating. I have some leftover heatsinks from my 290's Gelid aftermarket cooler. Should I try those, or is it not worth it, seeing that the card reboots my PC around 5-10 min into gameplay?
> 
> 
> 
> Kinda sounds like a power issue to me if you are getting reboots under load...


----------



## lostsurfer

Quote:


> Originally Posted by *kizwan*


Well, sorry, I should have clarified: it doesn't reboot the system; the screen goes black and no signal is shown. Tried with a 575-watt PSU, and now it's behind a Corsair 700-watt, with an FX-8320. I used to get it to wing along fine in Afterburner at 1144 mV and +20 to power. Swapped motherboards, and now I'm back to this issue again. : (

Have to hard reset the system to get the display back up, using the latest BIOS. By default in Afterburner the voltage is 1200 mV; the ASIC quality of the card is 74.5%.


----------



## kizwan

Quote:


> Originally Posted by *lostsurfer*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well, sorry, I should have clarified: it doesn't reboot the system; the screen goes black and no signal is shown. Tried with a 575-watt PSU, and now it's behind a Corsair 700-watt, with an FX-8320. I used to get it to wing along fine in Afterburner at 1144 mV and +20 to power. Swapped motherboards, and now I'm back to this issue again. : (
> 
> Have to hard reset the system to get the display back up, using the latest BIOS. By default in Afterburner the voltage is 1200 mV; the ASIC quality of the card is 74.5%.

You're not using your sig rig? How about at stock clocks, do games run fine? Did you try playing games at the highest clock you can get without touching the voltage? If even running at stock clocks gives you problems, then you should RMA.


----------



## lostsurfer

Quote:


> Originally Posted by *kizwan*
> 
> You're not using your sig rig? How about at stock clock, games running fine at stock clock? Did you try play games at highest clock you can get without touching voltage? Even if running at stock clock give you problem, then you should RMA.


Yea, not my sig rig; this is my second rig, not listed, as I'm still adding things here and there before I'm ready to call it quits. My sig rig stays at home; this one chills at the gf's. Yea, it's stock everything. Ran DDU and did a clean Crimson install (not beta), removed the Realtek drivers just for good measure, and made sure I had the latest BIOS flashed. Temps are good; not sure about the VRM temps, though... I started a ticket over 3 days ago and haven't heard back. They extended the warranties on this card, which is good. I just wanted to be ready to check The Division beta this Friday; that's why I was wondering about putting some small sinks over the VRM, or just putting my sig rig's card in for the time being. I've spent way too much time trying to get this thing going...


----------



## battleaxe

Quote:


> Originally Posted by *lostsurfer*
> 
> Yea, not my sig rig; this is my second rig, not listed, as I'm still adding things here and there before I'm ready to call it quits. My sig rig stays at home; this one chills at the gf's. Yea, it's stock everything. Ran DDU and did a clean Crimson install (not beta), removed the Realtek drivers just for good measure, and made sure I had the latest BIOS flashed. Temps are good; not sure about the VRM temps, though... I started a ticket over 3 days ago and haven't heard back. They extended the warranties on this card, which is good. I just wanted to be ready to check The Division beta this Friday; that's why I was wondering about putting some small sinks over the VRM, or just putting my sig rig's card in for the time being. I've spent way too much time trying to get this thing going...


Sounds like the RAM can't take the clocks. Try clocking the RAM down a bit. If it can't hold at stock clocks but can at lower, then you have some defective RAM on the card. That's my 2 cents' worth. Black screens are usually caused by the RAM on the card not being able to handle the clock.


----------



## jodybdesigns

Quote:


> Originally Posted by *lostsurfer*
> 
> Well sorry i should of clarified, doesn't reboot the system, screen goes black and no signal is shown on screen. Tried with a 575watt and now it's behind a corsair 700 watt, and a FX-8320. I used to get it to wing along fine in afterburner at 1144 volt, and +20 to power. Swapped motherboard now i'm back to this issue again. : (
> 
> Have to hard reset the system to get display back up, using latest bios. By default in afterburner the volts are 1200,Asic quality of the card is 74.5%.


Just RMA it, don't even fight with it anymore. If you can exchange it, do it while you can.


----------



## lostsurfer

Quote:


> Originally Posted by *jodybdesigns*
> 
> Just RMA it, don't even fight with it anymore. If you can exchange it, do it while you can.


Quote:


> Originally Posted by *battleaxe*
> 
> Sounds like the RAM can't take the clocks. Try clocking the RAM down a bit. If it can't hold at stock clocks but can at lower, then you have some defective RAM on the card. That's my 2 cents' worth. Black screens are usually caused by the RAM on the card not being able to handle the clock.


Thanks guys, I PM'd our rep and he got the ball moving forward; thanks for everyone's responses. Looking forward to getting a working 280X; I'll be creeping these posts as soon as I get it


----------



## ITAngel

Question for you card gurus out there: what is better to run?

A single Devil 13 290X2, or dual 290Xs?

What are the noise and temp level differences?

The two cards would be in CF:

SAPPHIRE TRI-X OC Radeon R9 290X DirectX 11.2 100361-2SR 4GB 512-Bit GDDR5 PCI Express 3.0 CrossFireX Support Video Card

and

MSI R9 290X LIGHTNING 4GB 512-Bit GDDR5 PCI Express 3.0 HDCP Ready CrossFireX Support Video Card

*Note* It's just a curious question, that is all. My friend and I are comparing setups, temps, performance, etc...


----------



## Cyber Locc

Quote:


> Originally Posted by *ITAngel*
> 
> Question for you card gurus out there what is better to run.
> 
> Single Devil 13 290X2 or Dual 290X ?
> 
> What are the noise and temp levels differences?
> 
> The Two cards would be in CF
> 
> SAPPHIRE TRI-X OC Radeon R9 290X DirectX 11.2 100361-2SR 4GB 512-Bit GDDR5 PCI Express 3.0 CrossFireX Support Video Card
> 
> and
> 
> MSI R9 290X LIGHTNING 4GB 512-Bit GDDR5 PCI Express 3.0 HDCP Ready CrossFireX Support Video Card
> 
> *Note* Is just a curious question that is all. My friend and I are comparing setups, temps and performance etc...


I am scared to give my honest answer, as I don't want to knock your purchase, but you asked.

The two 290Xs win on all counts: they will be faster, and their coolers are better and will yield better temps. Noise is up in the air; the fans won't need to spin as fast to keep better temps on each card, but there are more fans, so it may be a tad louder or quieter. Really, that's a tough one.

I would never suggest buying a dual-GPU card unless you need one. By "need one" I mean you have an mATX or ITX board and don't have enough GPU slots to run two. Plus, let's be honest, two cards look a whole lot better than one, especially on bigger boards, since they fill more space.

Another huge advantage, and I mean huge: if your 295X2 dies, you're out everything; if one of his 290Xs dies, he is out one 290X.


----------



## ITAngel

Thanks for letting me know that if one card dies I can still continue with the other one. I didn't think of that being a possible issue with a 295X2 kind of card.









I forgot to mention my friend was interested in trading with me, but I was not sure if it's a good idea or not. What do you guys think? I got Cyber Locc's response, but I'm curious what others think about it.


----------



## Cyber Locc

Quote:


> Originally Posted by *ITAngel*
> 
> Thanks for letting me know about the fact if one card dies I can still continue with the other one. I didn't think of that being a possible issue in a 295x2 kind of card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I forgot to mention my friend was interested in trading me but I was not sure if it a good idea or not. What you guys think? I got Cyber Locc response but curious what other think about it.


Yeah, that dying-together thing is an old throwback to the TV/VCR/DVD combos, lol. If one dies, you're out everything!


----------



## ITAngel

Quote:


> Originally Posted by *Cyber Locc*
> 
> Ya that dieing thing is an old throw back from the TV/VCR/DVD combos lol if one dies your out everything!


I was looking at some configurations I'm planning for my system, and it should work out perfectly well.


----------



## Roboyto

Quote:


> Originally Posted by *ITAngel*
> 
> Question for you card gurus out there what is better to run.
> 
> Single Devil 13 290X2 or Dual 290X ?
> 
> What are the noise and temp levels differences?
> 
> The Two cards would be in CF
> 
> SAPPHIRE TRI-X OC Radeon R9 290X DirectX 11.2 100361-2SR 4GB 512-Bit GDDR5 PCI Express 3.0 CrossFireX Support Video Card
> 
> and
> 
> MSI R9 290X LIGHTNING 4GB 512-Bit GDDR5 PCI Express 3.0 HDCP Ready CrossFireX Support Video Card
> 
> *Note* Is just a curious question that is all. My friend and I are comparing setups, temps and performance etc...


Two cards, for all the reasons everyone else has mentioned; it's also easier to sell individual cards than a dual-GPU card once you're done with them.

If you were going to get a dual-GPU Hawaii card, you'd want a reference design so you could at least put a water block on it.

While PowerColor did a hell of a job cooling dual 290Xs on air, it wouldn't be my first choice.

Whichever road you choose, just be sure to have excellent airflow through your case to keep temps within reason.

If you're in dire need of a second card you can get one, but if it isn't an absolutely necessary upgrade, I would just hang tight for Polaris.


----------



## cephelix

Can't wait to see what's in store for Polaris. Let's see if the specs are good, since the Fury/Fury X was underwhelming at the resolution I play at and seems like more of a sidegrade.


----------



## YellowBlackGod

From what I understand, my 290X (8GB) + Z97 setup will survive the whole of 2016. It's the first time in four years that I don't have any upgrade itch. I'll wait for Zen to come out, see what it has to offer, and keep my 290X until I see the available Polaris prices and models. After all, by the time that arrives, DX12 will breathe new life into the MVGPU (aka most valuable GPU) of the last 2.5 years.


----------



## fat4l

Hi guys, I just wanted to share some interesting stuff.
When you change settings in the drivers, 3DMark can't really detect it and still says "Valid Result", which makes the benchmark scores a bit questionable.








Settings that can be adjusted: Tessellation OFF, Texture filtering quality-Performance, Frame pacing-OFF.

Look here:
Normal default driver settings, 1030/1250 MHz, all stock, vs. changed driver settings, 1030/1250 MHz, all stock.
Newest drivers, 16.1.1.

http://www.3dmark.com/compare/fs/7418510/fs/7418566

http://www.3dmark.com/fs/7418566
http://www.3dmark.com/fs/7418510



All it says is "graphics driver not approved", since this is a beta driver from today. It says that for both results; however, only one of them has custom driver settings.
This gives me a 12% CrossFire boost in graphics score at stock (just out of curiosity)...









With that said, here is my single 290X at 1250/1750 MHz with 1500 timings and the non-beta driver.
16.6k graphics score...








You can see all three scores... and all three say "Valid Result"!












Also, Crimson 16.1.1 vs. 16.1: no boost in 3DMark.








http://www.3dmark.com/compare/fs/7418510/fs/7418456#


----------



## OneB1t

This is known.


----------



## Vellinious

Which settings, though? Tessellation turned down, or off?


----------



## fat4l

Quote:


> Originally Posted by *Vellinious*
> 
> Which settings, though? Tessellation turned down, or off?


Ah, my bad, I'll add it.
Yes: Tessellation OFF, Texture Filtering Quality set to Performance, Frame Pacing OFF.


----------



## fyzzz

3DMark noticed that I turned off tessellation when I tried it before; it's weird that some people get a valid score even though they turned it off. I tried it on Crimson 15.12 and turned off tessellation only.


----------



## mus1mus

hmmm.

So Crimson does that?

I'm still using pre-Crimson.

Last month's runs.

http://www.3dmark.com/compare/fs/7187932/fs/7189852#


----------



## fat4l

Well, fyzzz is using Crimson and he says 3DMark noticed he turned it off.
You have pre-Crimson and you say the same thing.

So it probably depends on the system, I guess.


----------



## mus1mus

Weird.

I don't turn tessellation off when trying to shoot for scores here, but I do on the bot. And most of the time, it gets detected.


----------



## fat4l

Quote:


> Originally Posted by *mus1mus*
> 
> Weird.
> 
> I don't do tess off when trying to shoot some scores here. But I do on the bot. And most of the times, it gets detected.


maybe I should snipe them big [email protected] hwbot


----------



## Vellinious

This is my best run so far with standard driver settings.

http://www.3dmark.com/fs/7358506


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> Can't wait to see what's in store for polaris...let's see if specs are good since fury/x was underwhelming at the res i play and seems like more of a sidegrade


I was just entertaining the idea of a Fury or Fury X compared to my 290.

I went to TPU and scoped out the 4K results, since I run 3K Eyefinity (triple 1080p); any 4K result would obviously pan out a bit better at my resolution.

The review was from Aug 7, 2015 and TPU said:


R9 390X reference performance was tested by running the MSI R9 390X Gaming at reference design clocks (1050/1500)
R9 Fury reference performance was tested by running the ASUS R9 Fury Strix

A 390X @ 1050/1500 will trail my 290 @ 1200/1500 (gaming clocks) by a bit, but it's still a decent comparison from a few months back *(August 7th, 2015)*.

The Fury they tested in this instance was the Sapphire Tri-X OC, which runs at 1040/500.

I took frame rates from their 22 game benchmark suite and averaged the gains to get the following:


The gains from the MSI 390X to the Sapphire Fury averaged 16.98%.
The gains from the MSI 390X to the Fury X averaged 24.51%.
They were also able to squeeze another 4.5% (in BF3 specifically) out of the Fury by hitting 1110/500.
Adding that 4.5% to either margin gives 21.48% and 29.01% respectively.
https://www.techpowerup.com/reviews/Sapphire/R9_Fury_Tri-X_OC/1.html


Spoiler: Sheet




| GAME @ 4K | 390X | FURY | FURY X | GAINS FURY | GAINS FURY X |
| --- | --- | --- | --- | --- | --- |
| ALIEN ISOLATION | 52.5 | 58.5 | 62.2 | 111.43% | 118.48% |
| AC UNITY | 21.8 | 29.3 | 31.6 | 134.40% | 144.95% |
| BATMAN AK | 74.7 | 95 | 98.5 | 127.18% | 131.86% |
| BF 3 | 42.3 | 49.1 | 52.8 | 116.08% | 124.82% |
| BF 4 | 29.5 | 32.8 | 34.3 | 111.19% | 116.27% |
| BIOSHOCK INF | 50.5 | 60.5 | 63.2 | 119.80% | 125.15% |
| COD AW | 58.9 | 64.1 | 67.9 | 108.83% | 115.28% |
| CIV BEYOND EARTH | 52.5 | 64.5 | 71.5 | 122.86% | 136.19% |
| CRYSIS 3 | 23 | 26.2 | 28.2 | 113.91% | 122.61% |
| DEAD RISING 3 | 24.4 | 27.9 | 30.8 | 114.34% | 126.23% |
| DRAGON AGE INQ | 28.7 | 32.7 | 33.7 | 113.94% | 117.42% |
| FAR CRY 4 | 31.8 | 36.9 | 39.6 | 116.04% | 124.53% |
| GTA V | 26 | 30.4 | 32.2 | 116.92% | 123.85% |
| METRO LL | 30.2 | 37.2 | 39.9 | 123.18% | 132.12% |
| PROJECT CARS | 39.7 | 43.9 | 45.9 | 110.58% | 115.62% |
| RYSE | 34.3 | 39.4 | 41.5 | 114.87% | 120.99% |
| SHADOW OF MORDOR | 44.1 | 49.9 | 52.7 | 113.15% | 119.50% |
| WITCHER 3 | 25.5 | 30.6 | 32.8 | 120.00% | 128.63% |
| TOMB RAIDER | 45.3 | 52.6 | 58.7 | 116.11% | 129.58% |
| WATCH DOGS | 32.7 | 37.5 | 39.6 | 114.68% | 121.10% |
| WOLFENSTEIN | 28.7 | 35.8 | 37.4 | 124.74% | 130.31% |
| WOW: WOD | 71.3 | 78 | 81.1 | 109.40% | 113.74% |
| GAINS (AVG) | | | | 116.98% | 124.51% |
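The averaging described above is easy to reproduce yourself. Here is a minimal Python sketch, using the frame rates quoted from the TechPowerUp review, that recomputes the mean per-game ratio of the Fury and Fury X over the 390X (the game names and numbers are copied from the post; everything else is just illustrative):

```python
# Recompute the averaged Fury / Fury X gains over a 390X from the
# TechPowerUp 4K frame rates quoted above. Tuples: (390X, Fury, Fury X) fps.
fps = {
    "ALIEN ISOLATION": (52.5, 58.5, 62.2),
    "AC UNITY": (21.8, 29.3, 31.6),
    "BATMAN AK": (74.7, 95.0, 98.5),
    "BF 3": (42.3, 49.1, 52.8),
    "BF 4": (29.5, 32.8, 34.3),
    "BIOSHOCK INF": (50.5, 60.5, 63.2),
    "COD AW": (58.9, 64.1, 67.9),
    "CIV BEYOND EARTH": (52.5, 64.5, 71.5),
    "CRYSIS 3": (23.0, 26.2, 28.2),
    "DEAD RISING 3": (24.4, 27.9, 30.8),
    "DRAGON AGE INQ": (28.7, 32.7, 33.7),
    "FAR CRY 4": (31.8, 36.9, 39.6),
    "GTA V": (26.0, 30.4, 32.2),
    "METRO LL": (30.2, 37.2, 39.9),
    "PROJECT CARS": (39.7, 43.9, 45.9),
    "RYSE": (34.3, 39.4, 41.5),
    "SHADOW OF MORDOR": (44.1, 49.9, 52.7),
    "WITCHER 3": (25.5, 30.6, 32.8),
    "TOMB RAIDER": (45.3, 52.6, 58.7),
    "WATCH DOGS": (32.7, 37.5, 39.6),
    "WOLFENSTEIN": (28.7, 35.8, 37.4),
    "WOW: WOD": (71.3, 78.0, 81.1),
}

def avg_gain(col):
    """Mean per-game frame-rate ratio vs. the 390X baseline, in percent."""
    return 100 * sum(v[col] / v[0] for v in fps.values()) / len(fps)

fury, fury_x = avg_gain(1), avg_gain(2)
print(f"Fury   vs 390X: {fury:.2f}%")    # ~116.98%, i.e. ~16.98% gain
print(f"Fury X vs 390X: {fury_x:.2f}%")  # ~124.51%, i.e. ~24.51% gain
# Adding the extra ~4.5% from the 1110 MHz Fury OC mentioned above:
print(f"OC'd Fury vs 390X: {fury + 4.5:.2f}%")
```

This matches the 16.98% / 24.51% averages quoted in the post, with sub-0.01% drift from the per-game rounding.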





I reckon most Fury (X) cards could reach a ~1110 core clock, and that doesn't include any HBM overclocking, which, AFAIK, gives a little boost as well.

For those of us with exceptionally well-clocking Hawaii cards, either Fury would probably be a 'sidegrade', as you mentioned.

What's interesting right now, however, is how low R9 Nano pricing has gotten. Considering you get the full Fury X core count for ~$500, it's not that bad of a deal. Yes, they only have a single 8-pin, but you're getting the higher-binned, more efficient chips in this scenario. Slap a block on a Nano and the results would probably come pretty close to a Fury X...?


----------



## ITAngel

Roboyto, nice post, thanks! It does look interesting to me. I'm currently happy with my Devil 13 290X2, but I'm very interested in the Fury X findings and the Nano cards.


----------



## Roboyto

Quote:


> Originally Posted by *ITAngel*
> 
> Roboyto nice post. Thanks! It does look interesting to me. I am currently happy with my devil 13 290x2 but very interested in the fury x finding and nanotechnology cards.


Sapphire's new Nitro Fury, a custom PCB with numerous enhancements, looks promising as well. In this particular review the card hit 1200/550: http://sapphirenation.net/sapphire-nitro-r9-fury-technical-overview/

If Fury cards on average could reach clocks similar to this Sapphire offering, I believe it would make them more attractive.


----------



## cephelix

@Roboyto great analysis there. I know the Fury X does quite well at higher resolutions but always seems less impressive at 1080p. I game at 1920x1200 myself, so I don't think I'll see that much of a difference unless I start supersampling?


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> @Roboyto great analysis there. I do know that the fury x does quite well at higher resolutions but always seems to be less so at 1080p. I myself game at 1920x1200 so i don't think I'll see that much of a difference unless I start supersampling?


They're definitely a bit lackluster at the lower resolutions, though AMD seems to be working on this. If you're not driving the higher resolutions, then it definitely isn't worth the upgrade at this point.

Polaris is right around the corner, and if things go as planned *I think* we could see a 40-50% performance jump, considering the Fury X touts a ~25% bump over a 390X. With the possible combination of a near-50% die shrink, HBM2, and higher clocks all around thanks to both... I don't see how it can't be a giant leap forward. AMD has been quoted as saying somewhere in the vicinity of double the performance per watt, IIRC.

28nm hung on for a long time, but I am certainly ready to see some amazing things happen that will bring 4K much closer to the standard and make consoles look even sillier than they already do.


----------



## cephelix

Quote:


> Originally Posted by *Roboyto*
> 
> They're definitely a bit lackadaisical at the lower resolutions; though AMD seems to be working on this. If you're not driving the higher resolutions, then it definitely isn't worth the upgrade at this point.
> 
> Polaris is right around the corner, and if things go as planned _*I think*_ we could see a 40-50% performance jump if you consider Fury X is touting a ~25% bump over a 390X. With the possible combination of a near 50% die shrink, HBM2, and higher clocks all around due to die shrink and HBM2...I don't see how it can't be a giant leap forward. AMD has been quoted saying somewhere in the vicinity of doubling performance per watt IIRC.
> 
> 28nm hung on for a long time, but I am certainly ready to see some amazing things happen that will bring 4K much closer to the standard and make consoles look even sillier than they already do.


Yeah... maybe Polaris would finally tempt me into getting a 1440p monitor. 4K is still just a distant dream for me; it requires too much graphical power, and in my opinion things aren't yet up to par to properly drive it. For now my monitors are still serving me well. If Polaris does great, I'll finally get a reference card and put my whole system under water...


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> Yeah....maybe polaris would actually tempt me into getting a 1440p monitor finally. 4K is still just a distant dream for me. It requires too much graphical power and for now things aren't up to par to properly drive 4K in my opinion. For now my monitors are still serving me well. If polaris is doing great, I'll finally get a reference card and put my whole system under water...


I think 4K will be much less of a dream within the next 4-10 months, honestly. I picked up an Acer 4K from Newegg for $299 over the holidays. Greatest monitor in the world? Nope, but games look pretty solid on it, and I have a real 4K monitor for testing hardware and fiddling around.

Add HDMI 2.0 to AMD's list of capabilities along with large performance gains, and 4K with a reasonable compromise on eye candy should be a reality. My 290 has been kicking arse for over two years driving triple 1080p, so I have high hopes for Polaris.


----------



## cephelix

Quote:


> Originally Posted by *Roboyto*
> 
> I think 4K will be much less of a dream in the next 4-10 months honestly. I picked up an Acer 4K from NewEgg for $299 over the holidays. Greatest monitor in the world...nope, but games look pretty solid on it and I have a real 4K monitor for testing hardware and fiddling around.
> 
> Add HDMI 2.0 to AMD's list of capabilities along with large performance gains. 4K with a reasonable compromise of eye-candy should be a reality. My 290 has been kicking arse for over 2 years driving triple 1080 so I have high hopes for Polaris.


That is a real steal for a 4K monitor. Heck, I was looking at one of the BenQ 27" 1440p monitors and it costs SGD 888 with just a TN panel; the Dell ones cost about the same. My main priority now is a new card and water-cooling my rig, though. Oh, and I'm definitely looking forward to your tests...


----------



## OneB1t

The problem is that you need more than 34" for a real 4K experience, as a 24" 4K panel is good for games but too small to work with without messing with DPI scaling.


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> That is a real steal for a 4K monitor.heck i was looking at one of the benq 27" 1440p monitors and it cost SGD888 with just a TN panel..dell ones cost about the same.main priority now is new card and wc-ing my rig though. Oh, and definitely looking forward to your tests...


http://www.newegg.com/Product/Product.aspx?Item=N82E16824009659

That's what I grabbed, and it looks like it's back to $450-$500 depending on where you shop. I didn't particularly need it for any one reason, but with reviews around the web being generally positive, I just couldn't pass it up at $299... so I threw it into the cart with some other Xmas gifts and got 0% financing on everything.









I'm also looking forward to something new to fiddle around with. I'm out of things to upgrade in my HTPC at this point, since it has evolved into a near clone of my main rig. I may even jump into the Zen pool when those release, just to mix things up a bit.

Watercooling doesn't have to cost you an arm and a leg, honestly. If you have any questions I'd be happy to chat; just shoot me a PM.


----------



## cephelix

Quote:


> Originally Posted by *Roboyto*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16824009659
> 
> That's what I grabbed and it looks like it's back to $450-$500 depending where you shop. I didn't particularly need it for any one reason, but with the reviews around the web being generally positive and the price I just couldn't pass it up at $299..so I threw it into the cart with some other Xmas gifts and got 0% financing for everything
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am also looking forward to something new to fiddle around with. I'm out of things to upgrade in my HTPC at this point since it has evolved into a near clone of my main rig. I may even jump into the Zen pool when those release, as well, just to mix things up a bit.
> 
> WCing doesn't have to cost you an arm & leg honestly. If you have any questions I'd be happy to chat you up; just shoot me a PM


For now, besides Polaris, I'm all set. Though like you, I'm also excited for Zen, to see what it can do. If it's good enough, who knows, my next upgrade might actually be Zen or Zen+.
As for water cooling, I'll PM you in due time. Right now I'm leaning towards the new EK D5 Revo pump, rigid tubing, and Barrow fittings. I've already purchased the rads (360 and 240) and a reservoir.


----------



## rdr09

Quote:


> Originally Posted by *Roboyto*
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16824009659
> 
> That's what I grabbed and it looks like it's back to $450-$500 depending where you shop. I didn't particularly need it for any one reason, but with the reviews around the web being generally positive and the price I just couldn't pass it up at $299..so I threw it into the cart with some other Xmas gifts and got 0% financing for everything


That's what I have, and I've been using it for over a year now. I go down to 1440p for work. I've had zero issues with mine (crosses fingers). Got it for $355.


----------



## EdInk007

Currently experiencing loss of monitor signal when browsing or doing non-GPU-intensive tasks:
i7-4790K (no OC)
290X CF (liquid cooled, no OC)
Maximus Ranger
Kingston HyperX 16GB
AX760i
LG 34UM57 monitor via DisplayPort
AMD driver 15.11.1 (had to revert to an older driver, as Crimson gave me more frequent signal losses; DDU is now my best friend)

Is anyone experiencing this, or has anyone experienced it?


----------



## mfknjadagr8

Quote:


> Originally Posted by *EdInk007*
> 
> Currently experiencing loss of monitor signals when browsing or doing non-GPU intensive tasks
> i7-4790K (no OC)
> 290X CF (liquid cooled - no OC)
> Maximus Ranger
> Kingston Hyper X 16GB
> AX760i
> LG34UM57 monitor using Displayport
> AMD Driver 15.11.1 (had to revert to older driver as Crimson gave me more frequent losses- DDU now my best friend)
> 
> Anyone experiencing this or experienced this?


I experience this frequently when connected to an HDTV. I get signal losses in a lot of places, but mostly when the Windows login screen comes up and when going from 2D to 3D.


----------



## i2CY

Hey,

is there a way to set PCI-E lane priority? Possibly more of a mobo question, or a case airflow one.

Reason: I like my case but hate how crammed it is. Truth be told, it didn't have a good airflow rating when I got it, but then the build bug bit me...

I installed aftermarket air coolers on my 290s (though after reading about ITAngel's flow issues, I wish I maybe hadn't).

Either way, the cards are super quiet but still getting hot, even without an OC, so my top card, or rather lane 1's card, is suffocating.

I want to switch the full load to lane 2, with lane 1 picking up the slack when needed if CrossFire is enabled.

In 1 - 95 CFM bottom case intake (Cooler Master JetFlo 120)
In 2 - 95 CFM front case intake (Cooler Master JetFlo 120)
Out 2 - 95 CFM top case exhaust (Cooler Master JetFlo 120)
Out 1 - 77 CFM H80i rear case cooler, routed as exhaust

Crazy? Possible? As simple as installing the lane 2 card first, hooking up the cable, and then putting the lane 1 card in? Pointless; man up and get another case?

I doubt it's that simple, though, and I'd like to OC, as I'm just below the 4k mark on the Fire Strike bench.

PS - They were claiming performance gains with Crimson; I get losses as well, so I rolled back to Catalyst 15.7.


----------



## ITAngel

Yeah, I'm still working on my cooling. So far I had it running last night at 58°C with the room at 68°F. Of course, the CPU was set back to stock from its OC, which could explain the lower case temps?

I'm also finding that my system hates the Crimson drivers; they keep disabling my CrossFire or crashing the system on login to the desktop. Going back to 15.7.


----------



## Cyber Locc

Quote:


> Originally Posted by *ITAngel*
> 
> Yea I am still working on my cooling so far I had it working last night with 58C with the room at 68F. Of course the CPU was changed to stock from being OC which can explain lower temps on the case?
> 
> Also finding out that my system hate crimson drivers. Keeps disabling my Crossfire or crashing my system upon login to the desktop. Going back to 15.7.


You don't plan on leaving the CPU at stock, right? At stock, your CPU will bottleneck your GPUs pretty badly.


----------



## InsideJob

I've recently, finally, got my R9 290 under water.


Anyone else with a single 290 running 1440p? I'm wondering how well the card will hold up if I upgrade from 1080p to 1440p.


----------



## Cyber Locc

Quote:


> Originally Posted by *InsideJob*
> 
> I've recently finally got my R9 290 under water.
> 
> 
> Anyone else with a single 290 running 1440p? I'm wondering how well the card will hold up if I upgrade from 1080 to 1440.


It will most likely be able to play almost anything maxed at 1440p, but you can try it yourself: just turn on VSR and set it to 1440p, and boom, it will look almost as good as native 1440p and will tax your card a tad more than native 1440p would.


----------



## cephelix

Quote:


> Originally Posted by *Cyber Locc*
> 
> It will most likely be able to play almost anything maxed at 1440p, you can try it yourself though. Just turn on VSR and set it to 1440p and boom, it will look almost as good as 1440p and will tax your card a tad more than native 1440p would.


I assume this won't be at max settings? In games like DA:I, I couldn't even hold a steady 60 fps with tweaked settings at 1920x1200.


----------



## ITAngel

Quote:


> Originally Posted by *Cyber Locc*
> 
> You dont plan on leaving the CPU stock right? At stock your CPU will bottleneck your GPUs pretty badly.


No, that's just until I can figure out what I need to do to fix it. I ended up going back to the Jan 29 driver and it seems to be holding for now. It's back to 4.5 GHz.


----------



## InsideJob

The most notable game I have issues with currently, although it's still in development, is Armored Warfare. It runs on CryEngine, and depending on the map I'm playing I struggle to maintain 60 fps even with a few things turned down. Then there's Elite: Dangerous; I get fine FPS playing it, but it heats my loop up to nearly 60°C, and I'd rather not have my CPU that hot. Gonna have to get a pair of Gentle Typhoons for the one rad soon and hope that helps.

I did just price out an Eyefinity setup of my current 1080p monitor, which is cheaper than the 1440p monitor I'm looking at. So now I'm torn between Eyefinity and a single 1440p...


----------



## battleaxe

Quote:


> Originally Posted by *InsideJob*
> 
> The most notable game I have issues with currently, although it is still in development, is Armored Warfare. It runs off Cryengine and depending on the map I'm playing I struggle to maintain 60fps with a few things turned down. Then there's Eliteangerous, I get fine FPS playing, but it makes my loop heat up to nearly 60c which I'd rather not have my CPU that hot. Gunna have to get a pair of gentle typhoons for the one rad soon and hope that helps that situation.
> 
> I did just price out going for an eyefinity setup of the current 1080 monitor I have which is cheaper than the 1440 monitor I'm looking at. So now I'm torn between eyefinity or a single 1440...


1440... all day.


----------



## cephelix

Quote:


> Originally Posted by *InsideJob*
> 
> The most notable game I have issues with currently, although it is still in development, is Armored Warfare. It runs off Cryengine and depending on the map I'm playing I struggle to maintain 60fps with a few things turned down. Then there's Eliteangerous, I get fine FPS playing, but it makes my loop heat up to nearly 60c which I'd rather not have my CPU that hot. Gunna have to get a pair of gentle typhoons for the one rad soon and hope that helps that situation.
> 
> I did just price out going for an eyefinity setup of the current 1080 monitor I have which is cheaper than the 1440 monitor I'm looking at. So now I'm torn between eyefinity or a single 1440...


The best of both worlds: 1440p Eyefinity..


----------



## rdr09

Quote:


> Originally Posted by *ITAngel*
> 
> No that was until I can figure out what I need to do in order to fix it. I ended up going back to Jan 29 and it seems to be holding for now. It is back to 4.5Ghz.


I never had the need to OC the 5820K; it was faster than my Sandy Bridge i7 OC'd to 4.5 GHz, which never, ever bottlenecked my two 290s.

http://www.3dmark.com/3dm/4644282

http://www.3dmark.com/3dm/6302964

Just check out the physics scores. If I had two 980 Tis or Fury Xs, then I would have kept the 5820K.


----------



## Roboyto

Quote:


> Originally Posted by *InsideJob*
> 
> The most notable game I have issues with currently, although it is still in development, is Armored Warfare. It runs off Cryengine and depending on the map I'm playing I struggle to maintain 60fps with a few things turned down. Then there's Eliteangerous, I get fine FPS playing, but it makes my loop heat up to nearly 60c which I'd rather not have my CPU that hot. Gunna have to get a pair of gentle typhoons for the one rad soon and hope that helps that situation.
> 
> I did just price out going for an eyefinity setup of the current 1080 monitor I have which is cheaper than the 1440 monitor I'm looking at. So now I'm torn between eyefinity or a single 1440...


I rather enjoy 1080 eyefinity.
Quote:


> Originally Posted by *cephelix*
> 
> The best of both worlds 1440p eyefinity..


Interesting proposition indeed!


----------



## InsideJob

Quote:


> Originally Posted by *Roboyto*
> 
> I rather enjoy 1080 eyefinity.
> Expensive proposition indeed!


^fixed

I'll probably go 1440 and get more down the line when price drops.


----------



## rdr09

Quote:


> Originally Posted by *InsideJob*
> 
> ^fixed
> 
> I'll probably go 1440 and get more down the line when price drops.


Not sure what your budget is, or what refresh rate the 1440p monitor you're planning on getting runs at, but you could maybe snag a 4K for about the same amount. Check out Rob's post . . .

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/41590#post_24855742

It only takes two seconds to drop down to 1440p; not 1080p though, 'cause that looks funny.


----------



## OneB1t

4k all the way


----------



## cephelix

Quote:


> Originally Posted by *InsideJob*
> 
> ^fixed
> 
> I'll probably go 1440 and get more down the line when price drops.


Expensive? Don't all enthusiasts earn a six-figure annual salary? lol
But I do agree it is pricey... the cost of the monitor itself is putting me off upgrading.


----------



## Cyber Locc

Quote:


> Originally Posted by *InsideJob*
> 
> ^fixed
> 
> I'll probably go 1440 and get more down the line when price drops.


Yeah, I honestly don't feel that will happen. 1440p is kind of a PC exclusive, and a niche at that. 4K will become the new standard, and its prices have dropped a whole lot since release; you can already buy a 4K monitor for less than a 1440p one, lol. 4K will be the new 1080p, not 1440p; 1440p is and always will be a niche, so I don't think its prices will ever drop. They will hold, and then the panels will phase out just like all the other niche resolutions.

However, you can buy a 4K panel and use it at 1440p, and honestly, today there's no reason to buy a 1440p screen unless you want 120 Hz (which 4K monitors don't yet support). You can buy a 4K screen for the same price; it's way more future-proof and can still be run at 1440p.

Sadly, 1440p monitors' time is up and their days are numbered. I'd say within the next year or two, tops, they will no longer be produced in any great volume. There simply isn't any point to them anymore. The second a company produces a nice 4K panel that can run 120 Hz, 1440p is dead tech. There will still be old models around, but new ones won't happen and they will fade away.


----------



## Cyber Locc

Quote:


> Originally Posted by *cephelix*
> 
> I assume this won't be at max settings? Since in games like DA:I i couldn't even hold a steady 60fps with tweaked settings at 1920x1200


Sorry, I had missed this. Honestly, I don't know; I don't play at 1440p, nor with only one card, so I'm just going from what I've read and been told. I can max DA:I at 1080p with one card, and at 4K with two overclocked cards, if that helps. However, my rig is torn down on a bench right now, and because of that I'm only running one card, so I'll check it out. I've been playing The Witcher a little at 1440p maxed (GimpWorks features off, of course) and haven't had any issues. I don't watch the FPS, but I haven't noticed any dips at all.

I'll get an FPS monitor going next time I play DA:I and see what happens.

Here is one at 1600p; I'm having trouble finding one at 1440p. Most people like to use the benchmark that comes with it, and that thing is fake: it's not only way more demanding than the actual game, it also favors Nvidia heavily. Anyway, here is maxed, or pretty close, at 1600p.

However, your pick is a very demanding and very Nvidia-favoring game. In most games, 1440p maxed with a 290X is no problem, especially one with a decent overclock. Most of the benches are at stock; my card is heavily overclocked, so it's a much different story.

Also, since we're talking about it: I have some other issues with DA:I. I have to back off my CPU overclock for it, and only it. Every other game is stable, Prime95 is stable, everything is stable at 4.7, yet anything over 4.5, no matter how much voltage I give it, BSODs in DA:I. I don't know why that game is cursed, lol.


----------



## cephelix

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Cyber Locc*
> 
> Sorry I had missed this, Um I honestly dont know I dont play at 1440p nor with only 1 card, just saying from what I have read / been told. I can max DAI at 1080 with 1 card and at 4k with 2 overclocked if that helps. However my rig is ripped down right now and on a bench and becasue of that I am only running 1 card so I will check that out. I been playing witcher a little at 1440p maxed (GImpworks features off of course) and haven't had any issues. I dont watched FPS but I haven't noticed any shifts at all.
> 
> I will get a fps monitor going next time I play and play DAI and see what happens.
> 
> Here is of 1600, having issues finding one thats of 1440p. Most people like to use the benchmark that comes with it and that thing is fake, it not only is way more demanding the actual game it favors nvidia heavily, anyway here is maxxed or pretty close at 1600.
> 
> However your selection is a very demanding and very nvidia favoring game. In most games 1440p maxxed with a 290x is no problem especially one with a decent overclock. Most of the benches are for stock my card is heavily overclocked so a much different story.






No worries... the benches seem about right, I suppose. I haven't revisited DA:I since the release of the new drivers, though. All I know is that Crimson provided a massive boost to my 290 compared to the Omega driver.


----------



## Cyber Locc

Quote:


> Originally Posted by *cephelix*
> 
> 
> No worries....The benches seem about right I suppose. Haven't revisited DA:I since the release of the new drivers though. All I know is the crimson provided a massive boost to my 290 compared to the omega driver


Yeah, I haven't played it in a long time either, but I will next time I get a chance to game for a few hours, and I'll report back my findings on that cursed game. I seriously hate having to change my settings for that one stupid game, lol, but such is life.

Everyone else should take note as well: want to see if your CPU is 100% stable? Prime is so yesterday; use DA:I, it's the ultimate CPU tester, hehehe. It literally BSODs after less than five minutes. I just love how I can run AIDA, Prime, and IBT for 48 hours straight with no issues, yet five minutes of DA:I gives a BSOD.


----------



## cephelix

Quote:


> Originally Posted by *Cyber Locc*
> 
> Ya I havent played it in a long time either, I will though next time I get a chance to game for a few hours. I will report back my findings
> 
> 
> 
> 
> 
> 
> 
> on that cursed game, I seriously hate having to change my settings for that stupid game lol but such is life.


Lol... it's fun, but unlike DAO, where I preferred playing as a tank, now I prefer being a mage. I'm still hooked on AC Syndicate though. I know people hate the AC series for a plethora of reasons, but I somehow like it. Obviously I have niggles with AC3 and Unity, but the rest I really enjoyed. After this I'm playing Witcher 2 so I can move on to 3... man, my social life is gonna go down the drain, lol.


----------



## Cyber Locc

Quote:


> Originally Posted by *cephelix*
> 
> Lol...it's fun but unlike DAO where i prefer playing as a tank, now i prefer being a mage. I'm still hooked on AC syndicate though. I know people hate the AC series for a plethora of different reasons but i somehow like it. Obviously i have niggles with ac3 and unity but the rest i really enjoyed. After this is playing witcher 2 so i can move on to 3...man, my social life is gonna go down the drain..lol


The Witchers are very, very fun, but get ready to expand your vocabulary, because you're going to be yelling curse words at the screen that you won't even know you knew. Saying that The Witcher is hard is a serious understatement. To me The Witcher 3 is much easier than 2, but even then it's tough, so prepare to die A LOT.


----------



## cephelix

Quote:


> Originally Posted by *Cyber Locc*
> 
> The Witchers are very very fun but get ready to increase your vocabulary because your going to be saying curse words at the screen you wont even know you knew. Saying that the Wicther is hard is a serious understatement the Witcher 3 to me is much easier than 2 even then though its tough prepare to die ALOT.


I have to expand it even more than when I played the first installment?

That's the one thing I still haven't gotten used to. In other games, once you're at a high enough level you practically cannot be killed by the AI, but in The Witcher the monsters still whoop my ass repeatedly.
Frustrating at times but immensely fun. Currently reading threads on Skyrim and the accompanying mods as well. Still wondering if I could play it mostly in third person.


----------



## Cyber Locc

Quote:


> Originally Posted by *cephelix*
> 
> I have to expand it even more than when i played the first installment?
> 
> 
> 
> 
> 
> 
> 
> 
> That's the one thing I still haven't gotten used to. In other games, once you're of a high enough level you practically cannot be killed by the AI but with Witcher, the AI monsters still whoop my ass repeatedly.
> Frustrating at times but immensely fun. Currently reading threads on Skyrim and the accompanying mods as well. Still wondering if I could play it majorly in 3rd person.


Yeah, I play Skyrim in third person; I don't see an issue with that.

And yeah, you'll definitely have to. I have played all three, and I have to say that 2 is by far the hardest; it's ridiculously hard. The other two are tough, but 2 is just insane. At times I had to drop the difficulty, and even on easy I would die repeatedly. Most of my deaths involved being jumped by more drowners than I feel were actually necessary. There are a few hives, and I mean hives: walk in the wrong spot and you've got like 20 of them on you lol.

3 isn't too bad, though there is some stupid stuff. In Novigrad there are quests you may pick up by accident, and at level 12 (as half the quests there are around that level) you find yourself having to fight people twice your level; needless to say, it's not possible. Make sure you hard save, and a lot, lol. I do like the idea of being able to pick up any quest at any time, but sometimes that has drawbacks, like the above.


----------



## cephelix

Quote:


> Originally Posted by *Cyber Locc*
> 
> Ya I play Skyrim in 3rd person I dont see an issue with that.
> 
> Ya defiantly going to have to, I have played all 3 and I have to say that 2 is by far the hardest its like ridiculously hard, the other 2 are tough but 2 is just insane. I would at times have to drop the difficulty and even with it on easy I would die repeatedly. Most of the time my deaths involved being jumped by more drowners than I feel were actually necessary. There is a few hives and I mean hive you walk in the wrong spot and you got like 20 of them on you lol.
> 
> 3 isnt too bad, there is some stupiud stuff though like in Novigrad theres some quests you may get by accident and while at level 12 (as half the quests there are around that level) you find yourself having to fight people twice your level, needless to say its not possible, Make sure you hard save and alot lol. I do like the idea of being able to pick up any quest all the time but sometimes that has drawbacks like above.


Ahh, then Witcher 3 sounds like Skyrim for me, where I stupidly wander into areas way above my current level and die a stupid death..


----------



## OneB1t

Quote:


> Originally Posted by *Cyber Locc*
> 
> Sorry I had missed this, Um I honestly dont know I dont play at 1440p nor with only 1 card, just saying from what I have read / been told. I can max DAI at 1080 with 1 card and at 4k with 2 overclocked if that helps. However my rig is ripped down right now and on a bench and becasue of that I am only running 1 card so I will check that out. I been playing witcher a little at 1440p maxed (GImpworks features off of course) and haven't had any issues. I dont watched FPS but I haven't noticed any shifts at all.
> 
> I will get a fps monitor going next time I play and play DAI and see what happens.
> 
> Here is of 1600, having issues finding one thats of 1440p. Most people like to use the benchmark that comes with it and that thing is fake, it not only is way more demanding the actual game it favors nvidia heavily, anyway here is maxxed or pretty close at 1600.
> 
> However your selection is a very demanding and very nvidia favoring game. In most games 1440p maxxed with a 290x is no problem especially one with a decent overclock. Most of the benches are for stock my card is heavily overclocked so a much different story.
> 
> Also since were talking about it
> 
> 
> 
> 
> 
> 
> 
> have some other issues with DAI, I have to back my CPU overclock for it and only it. Every other game is stable Prime 95 is stable everything is stable at 4.7 however anything over 4.5 no matter how much voltage i give BSODs with DAI I dont know why that game is cursed lol.


As Dragon Age supports Mantle, these results are not correct; in fact, with Mantle all Radeons will beat their competition.


----------



## Chewiethrash

I have a little issue with my GPU.
It's a Sapphire Tri-X 290X 8GB with dual BIOS. I want to get more out of it, and I'm looking for the best BIOS file to flash, but I can't find any clue about what I need. I'm a noob at that kind of modding, at BIOS modding at all, and I'm from Spain, so my English sometimes isn't sufficient for the technical terms... I'm just asking for a little help, because I'm reading your posts and it seems you know very well what I should do.

If you need my PC's specs, here they go:
- CPU: AMD FX-8320
- RAM: 4x4GB Corsair Vengeance DDR3 1866MHz PC3-15000 CL9
- GPU: 290X 8GB
- Motherboard: MSI 970 Gaming
- CPU cooler: Noctua NH-U9B SE2
Thank you for your attention.


----------



## Cyber Locc

Quote:


> Originally Posted by *OneB1t*
> 
> as dragon age supports mantle these results are not correct in fact in mantle all radeons will beat their competition


So you jest? Or did you actually drink the Kool-Aid?

This bench uses the benchmark app, which I will be the first to admit gives worse performance, but that's not what is important here; what's important is the Mantle vs. DX numbers. There is a reason Mantle was given up on, and it wasn't just because DX12 incorporates some of its backbone.


----------



## OneB1t

These results are bull****.

Here are my results from the integrated benchmark:
Quote:


> dragon age inquisition ultra 1920x1080
> 
> [email protected]
> R9-290x 1050/1450
> 
> Mantle:
> no MSAA
> min 59
> avg 70
> 
> 2x MSAA
> min 50.7
> avg 60.5
> 
> 4x MSAA
> min 44.9
> avg 52.4
> 
> DX11
> no MSAA
> min 55.5
> avg 68.7


----------



## Cyber Locc

Quote:


> Originally Posted by *OneB1t*
> 
> these results are bull****
> 
> 
> 
> 
> 
> 
> 
> 
> 
> here are my results from integrated benchmark


Yeah, so those results are barely different from the ones I posted as far as the gap between Mantle and DX goes. Furthermore, those results are old. Again, Mantle was discontinued for a reason; you're seriously mistaken if you think Mantle makes every Radeon card better than Nvidia, or even better at all, in a game that has it.

The only place you will ever see a gain from Mantle is when the CPU is struggling, which yours is; mine and the other benches' CPUs are not.

I am trying to find newer reviews on it, but no one tests 1080p anymore lol; it's all 1440p/4K.

However, I can go all day with benches showing the same thing I showed. Your CPU is struggling and Mantle helps with that, but anyone with an i7 will show the opposite of your results. That's on a game-by-game basis, though: Mantle will not be better or worse in every single game; it comes down to how each developer utilizes Mantle in that game.


----------



## mfknjadagr8

Quote:


> Originally Posted by *cephelix*
> 
> Expensive? don't all enthusiasts earn a six figure annual salary? lol
> But I do agree it is pricey... the cost of the monitor itself is putting me off from upgrading


if only I did...I'd have so much crap I don't need...
Quote:


> Originally Posted by *Cyber Locc*
> 
> Ya so that is the results that are barely different from the ones I posted as far as the difference between Mantle and DX. Furthermore those results are old, again I will say mantle was discounted for a reason your seriously mistaken if you think that mantle makes every Radeon card better than Nvidia or even better at all in a game that has it.
> 
> The only place you will ever see a gain from mantle is if you have a CPU that is struggling, in which you do me and the other benches do not.
> 
> I am trying to find newer reviews about it, however no one tests 1080p anymore lol its all 1440p/4k.
> 
> However I can go all day with benches showing the same I showed, Your CPU is struggling mantle helps with that however anyone with an I7 will show the opposite of your results. However thats on a game for game basis, mantle will not be better or worse in every single game, more so how each developer utilizes that Mantle in that game.


I assume you would say my CPU struggles too? I rarely get 100 percent usage from the CPU or GPUs at 1080p... However, I agree Mantle may or may not help depending on how the developers implement it. In BF4 I gained 8 fps on minimums and over 15 on average, but in DA:I I see little to no gains...


----------



## Cyber Locc

Quote:


> Originally Posted by *mfknjadagr8*
> 
> if only I did...I'd have so much crap I don't need...
> I assume you would say my cpu struggles too? I rarely get 100 percent usage from cpu or gpus at 1080p...however I agree mantle may or may not help based on how the developers implement it's usage...in bf4 I gained 8 fps on mins and over 15 on average however in da:I I see little to no gains...


With CFX, oh yes, it is definitely struggling lol. The fact that you are not maxing your GPUs shows it, unless of course you mean with Vsync on and in games that would run at more than 60 FPS. "I rarely get 100 percent usage from cpu": see, this is a misconception. A CPU that is bottlenecking your cards doesn't necessarily hit 100% load; as a matter of fact, it's usually more like 75-85%. For some insight, my CPU never hits 100% while gaming; it rarely even hits 60%.

I am not saying all AMD CPUs are bad, or that any of them are; they have their place. However, they are weaker in gaming, and at stock clocks, which I assume both of you run going by your rig specs, yes, they are definitely a bottleneck. I'm not saying it will be by much, not 10-30 fps, more like 5 to 10 at most, most likely. However, that is enough to show gains in Mantle.


----------



## ITAngel

Can I ask what temp range I should keep my Devil 13 290X2 in, and what temp range I should stay away from? Thanks!


----------



## AMDMAXX

The lower the better...

I keep my 290X and 290 at around 50 degrees celsius...

When I was air cooled... I kept them between 70-80 degrees celsius...


----------



## ITAngel

Quote:


> Originally Posted by *AMDMAXX*
> 
> The lower the better...
> 
> I keep my 290X and 290 at around 50 degrees celsius...
> 
> When I was air cooled... I kept them between 70-80 degrees celsius...


I see. When both GPUs are working it wants to push around 80-85C, but occasionally I can keep it at 74-79C when room temps are around 68F. I figure I can configure the fan profile better if I have an idea of where to keep the temps under control.
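For what it's worth, the fan-profile idea above can be sketched as a simple temperature-to-fan-speed curve. The points below are made-up illustrative values, not Devil 13 defaults; the actual curve would be set in whatever tuning tool you use (Afterburner, TriXX, etc.):

```python
# Illustrative fan curve for keeping a dual-GPU card out of the 80-85C range.
# The (temp C, fan %) points are hypothetical examples, not vendor defaults.

CURVE = [(40, 30), (60, 45), (70, 60), (75, 75), (80, 100)]

def fan_speed(temp_c: float) -> float:
    """Linearly interpolate fan % between curve points, clamping at the ends."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, f0), (t1, f1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            return f0 + (f1 - f0) * (temp_c - t0) / (t1 - t0)

print(fan_speed(72.5))  # halfway between 70C/60% and 75C/75% -> 67.5
```

The point of a curve shaped like this is to ramp the fans hard before the ~80C region rather than after the card is already there.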


----------



## OneB1t

The DA:I benchmark was not very CPU intensive, but try DX11 vs. Mantle at this position and you will see a massive difference:
http://postimg.org/image/rokyp0xkd/full/

Since the draw calls in this spot are massive, you will barely see 30 fps in DX11.


----------



## Mega Man

Quote:


> Originally Posted by *ITAngel*
> 
> Can I ask what temps range I should keep my Devil 13 290X2 and what temp range I should stay way from? Thanks!


Devil 13 is not a 295X2;

Devil 13 is an air-cooled 290X.

The 295X2 has crappy AIO pumps; max temp on Asecrap pumps is 75C IIRC, which is why the 295X2 throttles at 75 or whatever temp the pumps get damaged at.

(Sorry, I don't keep up with Asecrap specs, and I switched to an open loop on my 295X2s like a week after getting them, basically right after testing for DOA.)


----------



## Cyber Locc

Quote:


> Originally Posted by *Mega Man*
> 
> Devil 13 is not 295x2,
> 
> Devil 13 is air cooled 290x.
> 
> 295x2 has crappy aio pumps, mac temp on asecrap pumps is 75 iirc, which is why 295x2 throttles at 75 or w.e. temps the pumps get damaged at
> 
> ( sorry I don't keep up with asecrap specs, and I switched to open loop on my 295x2s like a week after getting them, basically after testing for doa )


Umm, no, the Devil 13 is an air-cooled 295X2....

Max pump temp has no correlation to max GPU core temp; just because your GPU is at 75C doesn't mean the water is. The Devil 13 throttles at 75C as well, just like the 295X2, because it is a non-reference 295X2.

Where you came up with the water pump stuff is beyond me; it's very strange.

I will agree the Asetek cooler is junk, though. However, for a card that has to be water-cooled stock, when not everyone has a custom loop, they didn't really have much choice. Their other option would be to air cool and have the big mess that is the Devil 13.

Quote:


> Originally Posted by *OneB1t*
> 
> daii benchmark was not very cpu intensive but try DX11 vs mantle on this position and you will see masive difference
> http://postimg.org/image/rokyp0xkd/full/
> 
> as this place draw calls are massive you will barrely see 30fps in DX11


And there is my point: my CPU never does that in any game, period.

However, as I stated earlier, DA:I also seems to have some CPU problems IMO, so it's a bad example for any CPU at all; even so, that doesn't happen to me.

Mantle does not show gains for me; a stock FX-8350 will bottleneck an R9 290X. I also run crossfire most of the time, so again we see zero gains from Mantle; it is actually a performance loss. If it were the king you seem to think it is, AMD wouldn't have written it off after a year lol.


----------



## OneB1t

Go into the Hinterlands using DX11 and you will see how badly your CPU performs.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Cyber Locc*
> 
> With CFX oh yes it is defiantly struggling lol. The fact that you are not maxing your GPUs goes to show that unless of course you mean with Vsync on and in games where it will get more than 60 FPS. "I rarely get 100 percent usage from cpu" well see now this is a misnomer just because a CPU doesn't go to 100% load doesn't mean anything, a CPU that is bottle-necking your cards doesn't necessarily hit 100% load as a matter of fact it is usually more like 75-85%. To give you some insight my CPU never ever hits 100% while gaming as a matter of fact it rarely hits 60%.
> 
> I am not saying that all AMD CPUs are bad or that any of them are, they have there place. However they are weaker in gaming and when you have them at stock clocks which both of you do I assume as your rig specs, yes they are defiantly a bottleneck. Now I am not saying it will by much I am not talking a 10-30fps, I am talking like maybe 5 to 10 at most, most likely. However that is enough to show gains in mantle.


I barely see 75 percent ever, only when loading maps etc. And as this isn't stockclocks.net, it's running at 4.8, not stock. With Vsync off my GPU usage goes up quite a bit, but at 1080p it's rarely above 70%; with Vsync on it's in the low 50s. Supersampled at 3200x1800 with Vsync off I see mid-80s GPU usage and 70s CPU usage. If the processor were really struggling that badly, wouldn't I be seeing higher usage, especially at lower resolutions, which take load off the GPUs? In fact, the only time I've truly seen full usage on both is loading Call of Duty: Advanced Warfare maps. For some reason that game, until the later patches, taxed both CPU and GPU super hard; in fact, I hit nearly 80C once loading a map, when I never break 60 except during IBT AVX.


----------



## Cyber Locc

Quote:


> Originally Posted by *OneB1t*
> 
> go into hinterlands use dx11 and you will see how bad your cpu will perform


I will tonight and report back. Downloading the updates now.


----------



## OneB1t

It's a good thing when the CPU can reach 100% load; that means the game can use all of the CPU's performance.


----------



## Cyber Locc

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I barely see 75 percent ever...only when loading maps etc..and as this isn't stockclocks.net it's running at 4.8 not stock...when running with vsync off my gpu usage goes up quite a bit but at 1080p is its rarely above 70...with vsync on it's low 50s...supersampled 3200 x 1800 with vsync off I see mid 80s gpu usage and 70s in cpu usage...if the processor is really struggling that badly wouldn't I be seeing higher usage expecially at lower resolutions which takes load off the gpus? In fact the only true time I've seen full usage on both is loading call of duty advanced warfare maps....for some reason that game until the later patches taxed both cpu and gpu super high in fact I hit nearly 80c once loading a map when I never break 60 except during ibt avx


At 3200x1800 you should be at more like 100% usage; with Vsync off you definitely would be, which proves my point: you are bottlenecking. We just had a guy on here who was bottlenecking with two 290Xs and a heavily overclocked i5, a 3570K; if a 3570K at 4.5 is bottlenecking, you definitely are too. It's easy to test: run Heaven and see if your GPUs remain at 100% load at 1080p. I bet they don't.

BTW, Vsync off puts the cards at full; they should be at 100% load, and if they are at 70% they are being bottlenecked.

Your rig in your sig doesn't show an OC, so I assumed; I don't know you're overclocked unless you tell me. Look at his chart: he is showing 100% usage.
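To make the heuristic above concrete: with Vsync off, sustained GPU usage well below ~100% suggests the CPU can't feed the cards. A tiny sketch, using made-up usage samples (not logs from anyone's rig here):

```python
# Hedged sketch of the "GPUs below ~100% at 1080p, Vsync off => CPU-bound"
# rule of thumb from the discussion above. Sample values are illustrative.

def likely_cpu_bound(gpu_usage_samples, threshold=95.0):
    """Flag a probable CPU bottleneck if average GPU usage stays below threshold."""
    avg = sum(gpu_usage_samples) / len(gpu_usage_samples)
    return avg < threshold, avg

capped = [68, 70, 72, 69, 71]    # ~70% GPU usage at 1080p, Vsync off
maxed = [99, 100, 98, 100, 99]   # GPUs fully fed

print(likely_cpu_bound(capped))  # (True, 70.0)
print(likely_cpu_bound(maxed))
```

The threshold is a judgment call; frame-time spikes and per-core CPU load tell you more than a single average, but this is the gist of the test being described.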

Quote:


> Originally Posted by *OneB1t*
> 
> its good think when cpu can reach 100% load that means that game can use all performance of cpu



Just LOL, that is all.


----------



## i2CY

Hey,
I'm wondering what the most common cause is of graphical patches in games, i.e. spots with missing textures or random non-environment colors?
Overheating?


----------



## ITAngel

Hey Mega Man, this is my card below. It is a dual 290X, comparable to the 295X2, except this version is not water cooled.


----------



## mfknjadagr8

Quote:


> Originally Posted by *i2CY*
> 
> Hey,
> wondering what the most common cause of graphic patches in games, rather spots that there is missing textures, or random non environment colour?
> over heading?


You mean from the developer? They usually fix poorly optimized textures, textures being out of place, or improperly rendered textures, as in the case of FO4 lol. They can also fix issues with specific graphics card setups and the scaling/implementation of SLI or crossfire.


----------



## fat4l

So this is what I'm experiencing:
Flickering with crossfire + FreeSync enabled.
This is with the 16.1.1 drivers.
For example, when I run World of Warcraft, then alt-tab into Windows and run Skype, the desktop starts flickering, especially when I try to move the Skype window.

Also, usage and frequency fluctuate in games, mostly older ones (DX9, DX10)...
Here are two vids showing the fluctuations. Both clock and usage.
#1 World of Warcraft 3.3.5 - 



#2 Crysis 1 - 




System spec-
Intel i7 4790K @5.1G
Asus M7 Hero
Corsair Dominator Platinum 4*4GB 2666CL10
Asus Ares III (2*290X)
Superflower Leadex 1200W Platinum
BenQ XL2730Z @1440p/144Hz

AMD Drivers: 16.1.1 Crimson, Windows 10 Pro 64 bit.


----------



## i2CY

Yes.

I rolled back to Cat. 15.7 because I find it more stable.
I was having huge issues with Heroes of the Storm (layers not showing, glitch patches, among others), plus some little things with Elite, Bttlfrnt3, and BF4, none OC, stock card settings.
I still have some patches with 15.7, but it's far more stable.
I was wondering if it's a driver or hardware issue.


----------



## ITAngel

Does anyone here know if an XFX P1-850B-BEFX 850W 80 Plus Gold can power a Devil 13 290X on a non-OC'd system?


----------



## Roboyto

Quote:


> Originally Posted by *ITAngel*
> 
> Anyone here know if a XFX P1-850B-BEFX 850W 80-Plus Gold can Power up a Devil 13 290X on a non-oc system?


I would say it is borderline. You are probably better off with a good 1kw unit, especially if you have a hefty CPU.

EVGA G 1000W for $139 + $20 MIR

http://www.newegg.com/Product/Product.aspx?Item=N82E16817438062


----------



## ITAngel

Quote:


> Originally Posted by *Roboyto*
> 
> I would say it is borderline. You are probably better off with a good 1kw unit, especially if you have a hefty CPU.
> 
> EVGA G 1000W for $139 + $20 MIR
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817438062


My friend is interested; I'm just checking to see if he can run it stock, since he runs everything stock. He has an i7-4790K.


----------



## Roboyto

Quote:


> Originally Posted by *ITAngel*
> 
> My friend is interested just checking to see if he can run it stock since he run everything stock. he has a i7-4790k


PSU is the last place to cheap out.

You should check several different review sites for the system power draw they measured. I looked at only one, and system draw with an OC'd 3770K was 802W.


----------



## ITAngel

Quote:


> Originally Posted by *Roboyto*
> 
> PSU is the last place to cheap out.
> 
> Should check several different review sites for what they rated their system power draw at. I looked at only one and system draw with an OC'd 3770k was 802W


Yeah, true. He will get that power supply; for now he is wondering if he can run the card until the new power supply arrives.

He has the following:

i7-4790K
2x 8GB DDR3
1x solid state drive
1x storage HDD
5x case fans

His room temps stay within 60F, so it's pretty cool.
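As a rough sanity check of the 850W question, the draw can be ballparked like this. Every wattage figure below is an assumption in the style of review estimates, not a measurement of this system:

```python
# Back-of-envelope power budget for the 850W PSU question above.
# All component wattages are assumed ballpark figures, not measured values.

psu_watts = 850
headroom = 0.80  # keep sustained load around 80% of the PSU rating

draw = {
    "Devil 13 290X2 (dual Hawaii, peak)": 550,
    "i7-4790K (stock)": 100,
    "motherboard/RAM/SSD/HDD/fans": 75,
}

total = sum(draw.values())
budget = psu_watts * headroom

print(f"estimated draw {total}W vs comfortable budget {budget:.0f}W")
print("borderline" if total > budget else "ok")
```

With numbers like these the estimate lands above the comfortable budget, which matches the "borderline, go 1kW" advice in the thread; a measured review figure for the exact system is obviously better than this sketch.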


----------



## Roboyto

Quote:


> Originally Posted by *ITAngel*
> 
> Yea true he will get that power supply, for now he is wondering if he can run the card until his new power supply arrives.
> 
> He has the following
> 
> i7-4790k
> 2x 8GB DDR3
> 1x Solid State
> 1x Storage HDD
> 2x Case Fan
> 
> His room temps stay within 60F so is pretty cool.


Yeah, I'm sure he can plug it in and run it to see whether the card works. But a heavy load could cause some hiccups.


----------



## Roboyto

Not sure if everyone saw this, but it is a call to arms Red Team!

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/0_20

*Originally Posted by **Noxinite* 

*Yes as per the HWBot rules:*

*Allowed optimisations :*
*Disable subtests*
*Driver settings finetuning*
*Tesselation tweaking*


----------



## JourneymanMike

*New Crimson driver out*

*Hotfix driver 16.1.1 just out today...*

http://www.techspot.com/news/63673-amd-releases-radeon-software-16-1-1-hotfix-drivers.html

*I haven't seen it on this forum, thought I'd break the news to all*

I'll try it, but I'll probably go back to 15.7.1 again! For some reason, I just can't get the Crimsons to work correctly...


----------



## rdr09

Quote:


> Originally Posted by *JourneymanMike*
> 
> *New Crimson driver out*
> 
> *Hotfix Driver 16.1.1 just out today...
> 
> *http://www.techspot.com/news/63673-amd-releases-radeon-software-16-1-1-hotfix-drivers.html
> 
> *I haven't seen it on this forum, thought I'd break the news to all*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'll try it, but I'll probably go back to 15.7.1, again! For some reason, I just can't get the crimsons to work correctly...


lol. Journey, it went out like 4 days ago. i'll prolly stick with the first fix. i came from 15.4, and crimson seems fine so far.

i normally just install the new over the old, but since crimson is a "major" change . . . i used the 13.x driver's uninstall feature to remove 15.4, rebooted, and installed hotfix 16 less evolved. i disabled crossfire first, btw.


----------



## JourneymanMike

Quote:


> Originally Posted by *rdr09*
> 
> lol. Journey, it went out like 4 days ago. i'll prolly stick with the first fix. i came from 15.4 and crimson seems fine so far.
> 
> i normally just install the new over the old but since crimson is a "major" change . . . i used 13 driver uninstall feature for 15.4. rebooted and *installed hotfix 16 less evolved*. *i disabled crossfire first btw*.


OOOOPS...

I don't use Evolved either. On the disabling of CrossFireX: you do this before uninstalling whatever driver you have, correct? I haven't been doing that... It seems like there's a whole lot of stuff you need to do to install the Crimson drivers... I've never gotten the Crimsons to work for me... Still using 15.7.1...


----------



## rdr09

Quote:


> Originally Posted by *JourneymanMike*
> 
> OOOOPS...
> 
> I don't use evolved either, on the disabling of CrossFire-X, you do this before the uninstall of whatever driver you have, before uninstalling, correct? I haven't been doing that... it seems like there's a whole lot stuff, you need to do, to install the Crimson drivers... I've never got the Crimsons to work for me... Still using 15.7.1...


yes, i disable crossfire. i go as far as unplugging the power to the second card; that's just my ritual. after installing the new driver i shut down the system and plug the power back into the second card. turn on the system and crossfire is set automatically.

not saying you should do the same, but it works for me. like i said, i normally install the new over the old with crossfire off. never used ddu.


----------



## JourneymanMike

Quote:


> Originally Posted by *rdr09*
> 
> yes, i disable crossfire. *i go as far as unplugging the power to the second card.* that's just my ritual. after installing the new driver i shut down the system and plug the power back to second card. turn on the system and crossfire is set automatically.
> 
> not saying you do the same but it works for me. like i said, i normally install the new over the old with crossfire off. never used ddu.


Do you mean that you take the second card out of the slot as well?

Because if you just unplug the cables, you still have 75 watts of power going into the card from the PCIe slot...

I've always understood that when you add a second card, you uninstall the drivers first, shut down and install the second card, then reboot and install the drivers again... In fact, I learned this on this forum quite some time ago! More than a year or two ago...


----------



## Cyber Locc

Quote:


> Originally Posted by *JourneymanMike*
> 
> Do you mean that you take the second card out of the slot, also?
> 
> Because, if you just unplug the cables, you still have 75 watts of power, going into the card, from the PCIe slot...
> 
> I've always understood, that when you add a second card, that you were to uninstall the drivers first, shut down and install the second card, then reboot and install the drivers again... In fact, I learned this, on this forum, quite some time ago! More than a year or two ago...


That is correct, and that's why it works for him: because of those 75W the card stays detected. You cannot install a new driver with only one card present and get it to work with CF; both cards have to be installed for the driver to enable CF.


----------



## cephelix

So does the new hotfix resolve the downclocking issue, or do I still have to keep using ClockBlocker?


----------



## rdr09

Quote:


> Originally Posted by *JourneymanMike*
> 
> Do you mean that you take the second card out of the slot, also?
> 
> Because, if you just unplug the cables, you still have 75 watts of power, going into the card, from the PCIe slot...
> 
> I've always understood, that when you add a second card, that you were to uninstall the drivers first, shut down and install the second card, then reboot and install the drivers again... In fact, I learned this, on this forum, quite some time ago! More than a year or two ago...


No, my cards are watercooled. unplugging it is like having it out anyway and just having one; you don't have to pull it out. you prolly don't even have to unplug it. i just do.

1. Disable crossfire.
2. Shut down and unplug power to the second card.
3. Boot and uninstall the old driver with the 12.x or 13.x driver (older drivers have an uninstall feature).
4. Reboot and install the new driver.
5. Shut down and plug the second card back in.
6. Boot, and crossfire should set itself; if not, set it yourself.

these steps i do on both my amd and intel rigs. but, like i said before, i normally just install the new over the old after steps 1 & 2. i followed the steps above for crimson and omega.


----------



## Mega Man

Quote:


> Originally Posted by *Cyber Locc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> Devil 13 is not 295x2,
> 
> Devil 13 is air cooled 290x.
> 
> 295x2 has crappy aio pumps, mac temp on asecrap pumps is 75 iirc, which is why 295x2 throttles at 75 or w.e. temps the pumps get damaged at
> 
> ( sorry I don't keep up with asecrap specs, and I switched to open loop on my 295x2s like a week after getting them, basically after testing for doa )
> 
> 
> 
> Umm no devil 13 is an air cooled 295x2....
> 
> Max temp pumps has no correlation to max temp on GPU core, Just because your GPUs temp has a temp of 75c doesn't mean the water does. The devil 13 throttles at 75C as well just like the 295x2 which it is a non reference 295x2.
> 
> Where you came up with water pump stuff is beyond me and is very strange.
> 
> I will agree the asetek cooler is junk though, however as a card that has to be watercooled stock and not everyone has a custom loop they didn't really have much choice. There other option would be to air cool and have a big mess that is a the Devil 13.
> 
> Quote:
> 
> 
> 
> Originally Posted by *OneB1t*
> 
> daii benchmark was not very cpu intensive but try DX11 vs mantle on this position and you will see masive difference
> http://postimg.org/image/rokyp0xkd/full/
> 
> as this place draw calls are massive you will barrely see 30fps in DX11
> 
> Click to expand...
> 
> And there is my point my CPU never does that in any game at all period.
> 
> However as I stated earlier DAI also seems to have some CPU problems imo so its a bad example for any CPU at all, even still that doesn't happen to me.
> 
> Mantle does not show gains for me, a FX8350 stock will bottleneck a r9 290x. I also run crossfire most of the time so again we have zero gains on mantle it is actually a performance loss. If it was this king thing you seem to think it is AMD wouldn't have wrote it off after a year lol.
Click to expand...

No, the Devil 13 is NOT a 295X2.

It is just two 290X cores on one PCB, and they are not the same; the Devil 13 was also released earlier than the 295X2.

If PowerColor made them throttle that low, that is on them.

If you don't believe me about the 295 throttling to protect the pump, come ask in the 295 thread.
Quote:


> Originally Posted by *ITAngel*
> 
> Hey Mega Man This is my card below which is dual 290X or can be compare to the 295x2 which this version is not water cooled.


I know, I have quad 290x and 2 295s


----------



## spyshagg

anyone found their 290x to degrade with time?

not the overclock capability, but the image scaler/output becomes sensitive to voltage, as in: the screen loses sync every other second.
This is "normal" with FreeSync, but one of my cards is doing this over HDMI, and now over DVI as well.

DisplayPort craps itself above stock vcore, HDMI above +150mV, and DVI above +190mV. It's not GPU frequency (MHz) but vcore: set the card at 1000MHz and increase the vcore until the screen loses sync.

This same card gave no problems at all 7 months ago.


----------



## Cyber Locc

Quote:


> Originally Posted by *Mega Man*
> 
> No the devil 13 is NOT a 295x2.
> 
> It is just 2 290x cores on one pcb, and they are not the same the devil 13 was also released earlier then a 295x2
> 
> If power color made them throttle so low that is due to them.
> 
> If you don't believe me about the 295 throttling to protect the pump, come ask in the 295 thread.
> I know, I have quad 290x and 2 295s


Okay, first things first: 295x2 is AMD's name for a dual-290X card, and any card with dual 290X GPUs follows AMD's 295x2 design plans. 295x2 is just the reference card's name, nothing more; they can call it whatever they want, it is still a 295x2.

"and they are not the same the devil 13 was also released earlier then a 295x2 " No it wasn't, nor could it be. GPU partners don't just start making cards; if AMD doesn't make a dual-GPU reference design, then board partners do not make one, period. Same thing with Nvidia: AMD and Nvidia make the cards, those are your reference models, and then they allow board partners to make certain changes. I do not know where you are getting this, but it's incorrect. The dates are below.

295x2 - April 21, 2014
Devil 13 - around June 15, 2014

"If power color made them throttle so low that is due to them. " You are right, I suppose they could have edited it into their custom BIOS; however they did not, nor did Asus AFAIK.

"If you don't believe me about the 295 throttling to protect the pump," again, I don't need to ask anyone; what you are saying doesn't pan out for anyone who knows anything about liquid cooling. If your GPUs are at 75C, your water is more like 30C. A pump can operate upwards of 50C, and your water is not hitting anywhere near 50C.
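As a rough sanity check on the numbers being thrown around here, a back-of-envelope sketch (the wattage, flow rate, and die-to-coolant thermal resistance below are assumed illustrative figures, not measured Asetek or 295X2 specs):

```python
# Back-of-envelope loop-temperature sketch. All figures are assumptions
# chosen for illustration; none are measured Asetek or 295X2 specs.

CP_WATER = 4186.0  # specific heat of water, J/(kg*K)


def coolant_rise(heat_w: float, flow_lpm: float) -> float:
    """Coolant temperature rise across the block(s), in kelvin."""
    mass_flow = flow_lpm / 60.0  # L/min -> kg/s (1 L of water is ~1 kg)
    return heat_w / (mass_flow * CP_WATER)


def water_temp_from_core(core_c: float, heat_w: float, r_th: float) -> float:
    """Estimate bulk water temp from core temp, given an assumed
    die-to-coolant thermal resistance r_th in K/W."""
    return core_c - heat_w * r_th


# Two GPUs at ~250 W each through an assumed 1 L/min loop:
print(round(coolant_rise(500.0, 1.0), 1))  # ~7.2 K rise across the blocks

# One core reading 75 C with an assumed 0.12 K/W die-to-coolant resistance:
print(round(water_temp_from_core(75.0, 250.0, 0.12), 1))  # ~45.0 C bulk water
```

Either way the water sits well below the core reading; how far below depends entirely on the assumed thermal resistance and flow rate, which is why the same 75C throttle point can be read so differently.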

Quote:


> Originally Posted by *rdr09*
> 
> No, my cards are watercooled. uplugging it is like having it out anyways and just having one. you don't have to pull it out. you prolly don't have to unplug. i just do it.
> 
> 1. disable crossfire
> 2 shutdown and unplug power to second card
> 3. boot uninstall old driver with 12 or 13 driver (older drivers have uninstall feature)
> 4. Reboot and install new driver
> 5. Shutdown and plug second card
> 6. Boot and crossfire should set itself. if not, then set it yourself.
> 
> These steps i do to both my amd and intel. but, like i said before, i normally just install the new over the old after steps 1 & 2. i followed the steps above for crimson and omega.


Umm, I am 90% sure a GPU will still get power and still be recognized with just the 6-pins removed. As a matter of fact, it would have to be, or your idea would not work. As we both pointed out to you already, it doesn't work that way.

If you have one card installed and try to add a second without reinstalling drivers, it will not work; SLI does, but CFX does not. Both cards have to be installed at the time of the driver install to unlock CF abilities. I ran into this when I got my second 290X: I did not know this and for days couldn't get CFX to work, until I asked here and TSM told me that.

I do agree that removing water-cooled cards sucks, though. I love that my RIVBE can shut off PCIe lanes with switches; best feature ever!


----------



## JourneymanMike

Quote:


> Originally Posted by *rdr09*
> 
> No, my cards are watercooled. uplugging it is like having it out anyways and just having one. you don't have to pull it out. you prolly don't have to unplug. i just do it.
> 
> 1. disable crossfire
> 2 shutdown and unplug power to second card
> 3. boot uninstall old driver with 12 or 13 driver (older drivers have uninstall feature)
> 4. Reboot and install new driver
> 5. Shutdown and plug second card
> 6. Boot and crossfire should set itself. if not, then set it yourself.
> 
> These steps i do to both my amd and intel. but, like i said before, i normally just install the new over the old after steps 1 & 2. i followed the steps above for crimson and omega.


I believe that you're going through extra, unnecessary, steps ..

First of all...

1. Uninstall the old drivers... This, in itself, will disable CrossFire.
2. Make sure that all the GPUs are mounted and plugged in, then start the system.
3. Install the new drivers and reboot; CrossFire should be picked up automatically. Go to work!!!!

Sometimes, I'll even do a fresh install of the OS


----------



## Cyber Locc

Quote:


> Originally Posted by *JourneymanMike*
> 
> I believe that you're going through extra, unnecessary, steps ..
> 
> First of all...
> 
> 1. Uninstall the old drivers... This, in itself, will disable CrossFire
> 2. Make sure that all the GPU's are mounted and plugged in, start the system,
> 3. Install the new drivers, reboot, CrossFire should be picked up automatically, go to work!!!!
> 
> Sometimes, I'll even do a fresh install of the OS


The biggest issue is what we already told him: he isn't plugging the second card in until after the drivers are installed. CF doesn't work that way; he has to have both cards installed for the CF drivers to install.


----------



## rdr09

Quote:


> Originally Posted by *JourneymanMike*
> 
> I believe that you're going through extra, unnecessary, steps ..
> 
> First of all...
> 
> 1. Uninstall the old drivers... This, in itself, will disable CrossFire
> 2. Make sure that all the GPU's are mounted and plugged in, start the system,
> 3. Install the new drivers, reboot, CrossFire should be picked up automatically, go to work!!!!
> 
> Sometimes, I'll even do a fresh install of the OS


That's my way. Never had a driver issue, at least with the games I play: no flickering, no loss of signal, no fluctuating GPU usage, no clocks refusing to go back to idle. I never had a driver crash except from OC'ing too much, like 1330 core. lol

For the next WHQL, I will just install the new driver over the old. But, still, I will disable CrossFire.

@Cyber, once I plug the other 290 in and boot . . . Crimson/CCC will set CrossFire by itself, which it did when I installed the hotfix.


----------



## Enzarch

Quote:


> Originally Posted by *spyshagg*
> 
> anyone found their 290x to degradate with time?
> 
> not the overclock capability, but the image scaler/output becomes sensitive to voltage, as in: the screen loses sync every other second.
> This is "normal" with freesync, but one of my cards is doing this over hdmi, and now over DVI as well.
> 
> display port craps it self above stock vcore. HMDI above 150mv, and DVI above 190mv. Its not gpu frequency (mhz) but vcore. Set the card at 1000mhz and increase the vcore until the screen loses sync.
> 
> This same card gave no problems at all 7 months ago.


This is a fairly well-known issue with some cards, particularly when running high refresh rates, but I haven't heard of it getting worse over time. We are just beginning to discuss this in the Hawaii BIOS modding thread.

Also, my card, which has this issue, will drop any of the outputs at similar voltages.


----------



## Cyber Locc

Quote:


> Originally Posted by *rdr09*
> 
> That's my way. Never had driver issue. At least with the games i play. no flickering, no lost of signal, no fluctuating gpu usage, no clocks not going back to idle. I never had a driver crash except oc'ing too much like 1330 core. lol
> 
> For the next whql, i will just install the new driver over the old. but, still, i will disable crossfire.
> 
> @Cyber, once i plug the other 290 and boot . . . Crimsom/CCC will set crossfire by itself, which it did when i installed hotfix.


That is because the driver is seeing another card. Even with your 6- and 8-pins unplugged, AMD's drivers can still see the card; if you completely removed the card or shut off the lane it was in, that would not happen. When you install the drivers there is a version for single cards and a version for CF cards, and the driver chooses by looking at how many cards are in your system. It is stupid, and you should just be able to pop a card in and go, but in my experience that is not the case.


----------



## Mega Man

Quote:


> Originally Posted by *Cyber Locc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> No the devil 13 is NOT a 295x2.
> 
> It is just 2 290x cores on one pcb, and they are not the same the devil 13 was also released earlier then a 295x2
> 
> If power color made them throttle so low that is due to them.
> 
> If you don't believe me about the 295 throttling to protect the pump, come ask in the 295 thread.
> I know, I have quad 290x and 2 295s
> 
> 
> 
> Okay first thing is first. A 295x2 is AMDs name for a dual 290x card, anycard that has dual 290x gpus follows AMDs designs plans of a 295x2. 295x2 is the reference cards name nothing more, they can call it whatever they want it is still a 295x2.
> 
> "and they are not the same the devil 13 was also released earlier then a 295x2 " No it wasn't, nor could it be GPU partners dont just start making cards, if AMD doesn't make a Dual GPU reference design then board partners do not make one period. Same thing with Nvidia, Amd and Nvidia make the cards those are your reference models, then they allow certain changes to be made from board partners. I do not know where you are getting this but its incorrect. Below is the dates for you.
> 
> 295x2 - April 21, 2014
> Devil 13 - around June 15,14
> 
> "If power color made them throttle so low that is due to them. " You are right I suppose they could have edited in there custom bios, however they did not nor did Asus AFAIK.
> 
> "If you don't believe me about the 295 throttling to protect the pump," again I dont need to ask anyone, what you are saying doesn't pan out for anyone that knows anything about liquid cooling. If your GPUs are at 75c your water is more like 30c a pump can operate upwards of 50c your water is not hitting anywhere near 50c.
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> No, my cards are watercooled. uplugging it is like having it out anyways and just having one. you don't have to pull it out. you prolly don't have to unplug. i just do it.
> 
> 1. disable crossfire
> 2 shutdown and unplug power to second card
> 3. boot uninstall old driver with 12 or 13 driver (older drivers have uninstall feature)
> 4. Reboot and install new driver
> 5. Shutdown and plug second card
> 6. Boot and crossfire should set itself. if not, then set it yourself.
> 
> These steps i do to both my amd and intel. but, like i said before, i normally just install the new over the old after steps 1 & 2. i followed the steps above for crimson and omega.
> 
> 
> Umm I am 90% sure a gpu will still get power and still be recognized with just the 6 pins removed. As a mater of fact it would have to be or your idea would not work. As we both pointed out to you already it doesnt work that way.
> 
> If you have 1 card installed and try to add a second without reinstalling drivers it will not work, SLI does but CFX does not. Both cards have to be installed at the time of the driver install to unlock CF ability's. I ran into this when I got my second 290x as I did not know this and for days couldn't get CFX to work, I asked here and TSM told me that.
> 
> I do agree about the removing Water Cooled cards thing sucks though I love that my RIVBE can shut off PCIE lanes with switches best feature ever!.

omg

so hard, now i remember why i blocked you

http://www.powercolor.com/us/products_features.asp?id=549
Quote:


> devil 13 dual core 290x


you can find it under products by searching for 290x
http://www.powercolor.com/us/products_search_VGA.asp?Bus=&Generation=RADEON+R9+290X&Series=&MemerySize=&MemoryInterface=&ByIntention=&ByUniqueFeature=&submit=Search

under 295x2s you can't:
http://www.powercolor.com/us/products_search_VGA.asp?Bus=&Generation=RADEON+R9+295X2&Series=&MemerySize=&MemoryInterface=&ByIntention=&ByUniqueFeature=&submit=Search

crap-in-ones cannot be compared to WATER COOLING

crap-in-one water: if the cores on the 295x2s could reach 95 like a normal 290/290x, then the water could SATURATE and get to 75 (seriously, have you seen how crappy asecrap is?). again
Quote:


> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Alex132*
> 
> Has anyone found any reviews of using different fans on the R9 295X2's radiator? I can't find any myself...
> 
> I have 2x Cougar Turbines and 2x SP120 Quiets and would like to know how they perform.
> 
> 
> 
> Here ya go: http://www.overclock.net/t/1540259/r9-295x2-rad-fan-shootout/0_50
> 
> The SP120's do quite well, i'm running a couple of Noctua iPPC NF-F12's on mine and the Cougar's should perform well too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh, also is there a way to disable the red LED?
> 
> 
> Yes!
> 
> Under the shroud you can unplug the LED's power cable, taking off the shroud is quite easy and will not void your warranty
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The LED has its own power cable? +1 AMD.
> It really annoys me launching and having to install the bloatware that is known as GF Experience JUST to turn off the LED on my GTX 690.
> 
> Coming from Nvidia's "best" dual GPU to AMD's best. I really am going to be interested to see the difference
> 
> 
> 
> 
> 
> 
> 
> 
> One thing I love that almost no reviews seem to mention is the customizability of the cooling - simply change out the fans for different ones.
> 
> I heard/read that the 295X2 has a thermal throttle of 75'c - whereas the 290Xs have a higher throttle (85'c?). Seeing as how the R9 295X2 should have binned 290X cores - surely the thermal throttle is just for the AIO cooler / heat / power draw. Is it safe going beyond it (with a BIOS flash on the one BIOS to disable it)?
> 
> Purely for benchmarking, I will be running this thing stock 24/7 otherwise
> 
> 
> 
> 
> 
> 
> 
> 
> 
> here is the cable:
> 
> 
> 
> And yeah, the 295x2 throttles at 75c due to the amount of heat the pump can withstand afaik, the 290x has a throttle point of 95c.
> 
> i have the Sapphire OC Bios flashed onto my card and it runs quite well, haven't tried any hard benching but with the stock Bios i managed 1150/1625 with +100mV in Afterburner.
> 
> With the Sapphire Bios i can go up to +300mV in Sapphire Trixx.....waiting for cooler weather though

since you won't ask

@wermad @Sgt Bilko and anyone else
Quote:


> Originally Posted by *Mega Man*
> 
> i am sorry what was the general consensus to why the 295x2 throttled at 75 and not 95 like normal 290xs, and could you please respond in the quoted thread, ... thanks


----------



## Cyber Locc

Quote:


> Originally Posted by *Mega Man*
> 
> omg
> 
> so hard, now i remember why i blocked you
> 
> http://www.powercolor.com/us/products_features.asp?id=549
> you can find it under products and searching for 290x
> http://www.powercolor.com/us/products_search_VGA.asp?Bus=&Generation=RADEON+R9+290X&Series=&MemerySize=&MemoryInterface=&ByIntention=&ByUniqueFeature=&submit=Search
> 
> under 295x2s you cant
> http://www.powercolor.com/us/products_search_VGA.asp?Bus=&Generation=RADEON+R9+295X2&Series=&MemerySize=&MemoryInterface=&ByIntention=&ByUniqueFeature=&submit=Search
> 
> crap in ones can not be compared to WATER COOLING
> 
> crap in ones water, if the cores on the 295x2s could reach 95 like normal 290/290x then the water could SATURATE and get to 75, ( seriously have you seen how crappy asecrap is ?) again
> since you wont ask
> 
> @wermad @Sgt Bilko and anyone else


Dude, a 295x2 is a dual-290X GPU, and a Devil 13 is a dual-290X GPU; they are the same thing! The PowerColor Devil 13 and the Asus Ares are both 295x2s.....

They can call it whatever they want; it is still a non-reference 295x2.

I won't ask because I do not need to ask. Anyone who thinks a card needs to be throttled at 75C to prevent damaging a pump that can handle up to a 50C water temp needs to go back to school. A core temp of 75C is nowhere near a water temp of 50C. If the water were to hit 50C, the barbs wouldn't hold on the tubing, much less there being any pump concerns.

BTW, a 7970's max temp is 95C but a 7990's is 85C. I guess they did that to protect the pump that doesn't exist on that card, huh?


----------



## mfknjadagr8

The driver cannot see that you have the second card unplugged, because it polls the operating system. Guess what: with the power unplugged, try to find the card in Device Manager... not happening. And if the driver somehow did know it was there, it would offer CrossFire as an option after installing drivers with one card installed and powered and the other unplugged.

Also, if there really are two versions of the driver as Cyber stated, why does CrossFire work immediately when you install a new card on the same drivers, or when you plug it back up after clean-installing with it unplugged? Can you explain that, Cyber, or anyone else who believes this?


----------



## Sgt Bilko

Quote:


> Originally Posted by *Cyber Locc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Mega Man*
> 
> omg
> 
> so hard, now i remember why i blocked you
> 
> http://www.powercolor.com/us/products_features.asp?id=549
> you can find it under products and searching for 290x
> http://www.powercolor.com/us/products_search_VGA.asp?Bus=&Generation=RADEON+R9+290X&Series=&MemerySize=&MemoryInterface=&ByIntention=&ByUniqueFeature=&submit=Search
> 
> under 295x2s you cant
> http://www.powercolor.com/us/products_search_VGA.asp?Bus=&Generation=RADEON+R9+295X2&Series=&MemerySize=&MemoryInterface=&ByIntention=&ByUniqueFeature=&submit=Search
> 
> crap in ones can not be compared to WATER COOLING
> 
> crap in ones water, if the cores on the 295x2s could reach 95 like normal 290/290x then the water could SATURATE and get to 75, ( seriously have you seen how crappy asecrap is ?) again
> since you wont ask
> 
> @wermad @Sgt Bilko and anyone else
> 
> 
> 
> Dude a 295x2 is a dual 290x GPU, a Devil 13 is a Dual 290x GPU they are the same thing! Powercooler Devil 13, and Asus Ares are both 295x2s.....
> 
> They can call it whatever they want it is still a non reference 295x2.
> 
> I wont ask because I do not need to ask, anyone that thinks a card needs to be throttled at 75c to prevent damaging a pump that can handle up to a 50c water temp needs to go back to school. A core temp of 75 is no where near a water temp of 50c. If the water was to get 50c the barbs wont hold on the tube must less pump concerns.
> 
> BTW a 7970s max temp is 95c however a 7990s is 85c, I guess they did that to protect the pump that doesn't exist on that card huh?.

The 295x2 uses the same Hawaii cores as the 290X, yes, but they aren't the same: the Ares and the Devil read under GPU-Z as CrossFire (Hawaii) 290Xs, while the 295x2 reads as Vesuvius, and CrossFire cannot be disabled on it (major difference there).

The 75C throttle temp is to prevent damage to the pump; the Devil obviously doesn't need that (being air cooled), and the Ares should never get close to it (EK waterblock).

And IIRC the throttle temp on the 7990 was because of the VRMs' proximity to the core.

But hey, AMD are stupid I guess......I mean, why intentionally throttle the card if they didn't have to?


----------



## Cyber Locc

Quote:


> Originally Posted by *mfknjadagr8*
> 
> The driver can not see that you have the second card unplugged because it polls the operating system and guess what with the power unplugged try and find it in device manager....not happening...if somehow it did know it was there it would give crossfire as an option after installing drivers with one installed and powered and the other unplugged...also if in fact there are two versions of the driver as cyber stated why does crossfire work immediately when you install a new card on the same drivers or when you plug it back up after clean installing with it unplugged? Can you explain cyber or anyone who believes this?


It doesn't work when you install a second card on the same drivers. I have done that, and it doesn't read the second card. Even AMD Matt will tell you that if you add a second card you have to reinstall the drivers. If you have drivers installed from a single card and put a second one in, it will not give you CrossFire settings. At least it didn't for me, and countless other threads have been made about the same exact issue.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Cyber Locc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mfknjadagr8*
> 
> The driver can not see that you have the second card unplugged because it polls the operating system and guess what with the power unplugged try and find it in device manager....not happening...if somehow it did know it was there it would give crossfire as an option after installing drivers with one installed and powered and the other unplugged...also if in fact there are two versions of the driver as cyber stated why does crossfire work immediately when you install a new card on the same drivers or when you plug it back up after clean installing with it unplugged? Can you explain cyber or anyone who believes this?
> 
> 
> 
> It doesnt work when you install a second card on the same drivers. I have done that it doesnt read the second card, Even AMD Matt will tell you if you add a second card you have to reinstall the drivers. If you have drivers installed from a single card and put a second in it will not give you crossfire settings. At least it didn't for me and countless other threads made having the same exact issue.

I've done it when i ran Crossfire 290's and it worked fine unless they've gone backwards with Crimson in that regard....


----------



## Cyber Locc

Quote:


> Originally Posted by *Sgt Bilko*
> 
> 295x2 uses the same Hawaii cores as the 290x yes but they aren't the same, the Ares and Devil reads under GPU-Z as Crossfire (Hawaii) 290x's, the 295x2 doesn't, it reads as Vesuvius and Crossfire cannot be disabled on it (major difference there)
> 
> the 75c throttle temp is to prevent damage to the pump, the Devil obviously doesn't need that (being air cooled) and the Ares should never get close to that (EK waterblock).
> 
> and iirc the throttle temp on the 7990 was because of the vrm's proximity to the core.
> 
> but hey, AMD are stupid I guess......I mean why intentionally throttle the card if they didn't have to?


I am not saying they do not have to; I am saying there is another reason. It is not to protect the pump, that is just hogwash.

"it reads as Vesuvius and Crossfire cannot be disabled on it (major difference there)" This is due to the way the BIOS is coded, nothing more. My 290s say 390X in GPU-Z; they are not 390Xs, but their BIOS is. If you don't understand that, then I cannot help you. CrossFire being able to be disabled is likewise a BIOS feature from AMD, and a change that PC and Asus made to their cards.

You need to go and do some research on reference vs non-reference cards, because you both seem to have a lack of understanding of the subject. Funny how every tech review site on the planet calls the Devil 13 a 295x2, but hey, they are stupid and don't know what they are talking about, and you know better.

I did it pre-Omega, back in the 14.4-ish area, and it did not work, period. You can Google and find many, many threads on the subject, and AMD's own reps saying you have to reinstall the drivers with both cards in, but hey, AMD Matt doesn't know what he is talking about.

BTW @Mega Man, since you mentioned it, you blocked me over the argument we had about power delivery, in which the Power Supply Editor here on OCN stepped in and told you that you were wrong and I was right lol.


----------



## Mega Man

Funny thing about that: 295x2s DO NOT have custom PCBs. AMD did not authorize vendors to make custom PCBs; that is why they are 290Xs. And there is actually a big difference between the two: chip voltage leakage.

but alas, I am just a crazy psycho, and as of now I will go back to just not reading your posts


----------



## Sgt Bilko

Quote:


> Originally Posted by *Cyber Locc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> 295x2 uses the same Hawaii cores as the 290x yes but they aren't the same, the Ares and Devil reads under GPU-Z as Crossfire (Hawaii) 290x's, the 295x2 doesn't, it reads as Vesuvius and Crossfire cannot be disabled on it (major difference there)
> 
> the 75c throttle temp is to prevent damage to the pump, the Devil obviously doesn't need that (being air cooled) and the Ares should never get close to that (EK waterblock).
> 
> and iirc the throttle temp on the 7990 was because of the vrm's proximity to the core.
> 
> but hey, AMD are stupid I guess......I mean why intentionally throttle the card if they didn't have to?
> 
> 
> 
> I am not saying they do not have to, I am saying there is another reason. It is not to protect a pump that is just hogwash.
> 
> "it reads as Vesuvius and Crossfire cannot be disabled on it (major difference there)" This is due to the way the bios is coded nothing more, My 290s say 390x in GPU Z they are not 390xs however there bios is. If you dont understand that then I can not help you. The crossfire being able to be disabled is again a bios feature by AMD and a change that PC and Asus made to there cards.
> 
> You need to go and do some research about reference vs non reference cards because you seem to both have a lack of understanding on the subject. Funny how every tech reivew wite on the planet calls the Devil 13 a 295x2, but hey they are stupid and dont know what they are talking about and you know better.
> 
> I did it pre omega like back in 14.4ish area and it did not work period. You can google and find many many threads about the subject and AMDs own reps saying you have to reinstall the drivers with both cards in but hey AMD Matt doesn't know what he is talking about.

Asus Ares III:
Quote:


> *Dual R9 290X* factory-overclocked to 1030MHz- delivering a 15% faster performance than GTX Titan Z


https://www.asus.com/au/Graphics-Cards/ROG_ARESIII8GD5/

Powercolor Devil 13 290x2:
Quote:


> PowerColor Radeon™ Devil 13 *Dual Core R9 290X* 8GB GDDR5


http://www.powercolor.com/au/products_features.asp?id=549

Guess both Asus and PowerColor had no idea what they were making, then.....

And yes, I've done it before with my 290s; believe me or don't........and didn't you just say that AMD makes up hogwash anyway?


----------



## mfknjadagr8

Quote:


> Originally Posted by *Sgt Bilko*
> 
> I've done it when i ran Crossfire 290's and it worked fine unless they've gone backwards with Crimson in that regard....


I've done it probably 6 times since I've had my CrossFire setup; it has never failed to immediately detect CrossFire, and it even enables it automatically in most cases. In fact, installing with both cards powered and connected is the only time I've had issues with CrossFire not initializing properly. This includes Crimson in all three flavors, Omega, and the 14 series. I have never had to reinstall drivers when simply adding a second card for CrossFire.

I've had to reinstall a lot due to drivers borked by overclocking, Windows failures, or bad installs. I always use DDU in safe mode, then manually clean any leftover registry entries, then reboot with one card unpowered. After installing the new drivers I restart to make sure they installed properly, then shut down and reconnect the second card, and it's always good to go. Before this method it would occasionally give issues with CrossFire after installing a new driver.

I always disable ULPS in the registry and via the checkboxes in AB or Trixx, whichever I'm using at the time. (I know I don't need to do it in the program after I've manually done it in the registry, but I've always done it just to ensure the program doesn't try to re-enable it.)
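For reference, the registry side of the ULPS toggle mentioned above usually means setting `EnableUlps` to 0 under each display-adapter subkey. A sketch of the .reg fragment (the `0000` subkey number is an assumption; the numbering varies per system, so check which `00xx` subkeys actually belong to the AMD cards before importing anything):

```reg
Windows Registry Editor Version 5.00

; Disable ULPS (Ultra Low Power State) so the secondary CrossFire card
; is not put to sleep. The 4d36e968-... GUID is the standard display
; adapter device class; repeat the block for each 00xx subkey that
; corresponds to an AMD GPU (the 0000 shown here is an example).
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Tools like Afterburner and Trixx flip the same value when their ULPS checkbox is used, which is why doing both is redundant but harmless.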


----------



## Sgt Bilko

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> I've done it when i ran Crossfire 290's and it worked fine unless they've gone backwards with Crimson in that regard....
> 
> 
> 
> I've done it probably 6 times since I've had my crossfire setup it had never failed to immediately detect crossfire and even enable it automatically in most cases...this includes crimson in all three flavors and omega and 14 series...
> 
> I have never had to reinstall drivers when simply adding a second card for crossfire...I've had to reinstall a lot due to borked drivers from overclock, Windows failure, or bad install...I always use ddu in safe then manually clean any registry entries if any are left then reboot with one card unpowered...after install of new drivers I restart ensure they installed properly...then shutdown and reconnect the second card then it's always good to go...
> 
> Before this method it would occasionally give issues with crossfire after installing a new driver...always disable ulps in registry and via checkboxes in ab or trix whichever I'm using at the time...(I know I don't need to do it in the program after I've manually done it in the registry but I've always done it just to ensure the program doesn't try to reenable it)

You must be wrong though, it's not possible to do it









In all seriousness though, it's pretty simple: plug in the other card and boot up the PC......if it doesn't detect it, then re-install the driver.

No harm in trying that before wiping it all straight away.


----------



## Cyber Locc

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Asus Ares III:
> https://www.asus.com/au/Graphics-Cards/ROG_ARESIII8GD5/
> 
> Powercolor Devil 13 290x2:
> http://www.powercolor.com/au/products_features.asp?id=549
> 
> Guess both Asus and Powercolor had no idea what they were making then.....
> 
> And yes I've done it before with my 290's, believe me or don't........and didn't you just say that AMD makes up hogwash anyway?


I concede maybe you did do it, Maybe the feature that is suppose to be worked for you. However in my case and millions of others on this forum and others it doesn't work you have to reinstall the drivers with both cards installed.

The issue that present when doing it like that is not the cards are not detected by the system they are. The case I had, presented with the cards showing up in device manager, they showed up in afterburner yet CCC had no crossfire options to turn CFX on or off or anything about CF at all. This occurs to a lot of people when they do not reinstall new drivers with both cards installed. I presented my problem in this very thread and was Told by TSM that I needed to install the drivers fresh with both cards installed. I have since seen many people have the same exact issue and even AMD Matt say you have to do it like that.

Asus Ares III:
https://www.asus.com/au/Graphics-Cards/ROG_ARESIII8GD5/

Powercolor Devil 13 290x2:
http://www.powercolor.com/au/products_features.asp?id=549

What they label the card and what the card is is irrelevant, 295x2 is AMDs code name for a dual 290x GPU, both of those cards are dual 290x GPUs that makes them 295x2s period. AMD decides code names around here not Powercooler not Asus. They all use the same base PCB and GPU cores, it is simply reference versus non reference.

Board partners don't have the pull you seem to think they do. AMD gives them specs for a single GPU and they alter it to their liking within AMD's guidelines; AMD can then give them a design for a dual-GPU setup and they do the same. They don't just start making their own cards out of thin air; it doesn't work that way.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You must be wrong though, it's not possible to do it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In all seriousness though, it's pretty simple: plug in the other card and boot up the PC... if it doesn't detect it, then re-install the driver.
> 
> There's no harm in trying that instead of wiping it all straight away.


Hey, thanks for making some paragraphs. I'm horrible about creating text walls, as I usually have several thoughts in what should be one paragraph, lol.

I know formatting pretty well, I just seem to forget it when I post... and punctuation? Wth is that?

My old English teachers and my fiancée, the English nazi, would smack me around if they had to read my posts.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Cyber Locc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> Asus Ares III:
> https://www.asus.com/au/Graphics-Cards/ROG_ARESIII8GD5/
> 
> Powercolor Devil 13 290x2:
> http://www.powercolor.com/au/products_features.asp?id=549
> 
> Guess both Asus and Powercolor had no idea what they were making then.....
> 
> And yes I've done it before with my 290's, believe me or don't........and didn't you just say that AMD makes up hogwash anyway?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I concede that maybe you did do it; maybe the feature worked for you the way it's supposed to. However, in my case, and for many others on this forum and elsewhere, it doesn't work: you have to reinstall the drivers with both cards installed.
> 
> The issue that presents when doing it like that is not that the cards aren't detected by the system; they are. In my case, both cards showed up in Device Manager and in Afterburner, yet CCC had no CrossFire options at all: nothing to turn CFX on or off, nothing about CF whatsoever. This happens to a lot of people when they don't reinstall new drivers with both cards installed. I presented my problem in this very thread and was told by TSM that I needed to install the drivers fresh with both cards installed. I have since seen many people hit the same exact issue, and even AMD Matt says you have to do it that way.
> 
> Asus Ares III:
> https://www.asus.com/au/Graphics-Cards/ROG_ARESIII8GD5/
> 
> Powercolor Devil 13 290x2:
> http://www.powercolor.com/au/products_features.asp?id=549
> 
> What they label the card is irrelevant to what the card is. 295X2 is AMD's name for a dual-290X GPU; both of those cards are dual-290X GPUs, and that makes them 295X2s, period. AMD decides the names around here, not PowerColor, not Asus. They all use the same base PCB and GPU cores; it is simply reference versus non-reference.
> 
> Board partners don't have the pull you seem to think they do. AMD gives them specs for a single GPU and they alter it to their liking within AMD's guidelines; AMD can then give them a design for a dual-GPU setup and they do the same. They don't just start making their own cards out of thin air; it doesn't work that way.

"Vesuvius" is the code name; "295X2" is the product name. The Ares and the Devil 13 use a custom PCB, they do not follow the reference design, and you'll see my answer to the CrossFire question above your post.

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> You must be wrong though, it's not possible to do it
> 
> 
> 
> 
> 
> 
> 
> 
> 
> In all seriousness though, it's pretty simple: plug in the other card and boot up the PC... if it doesn't detect it, then re-install the driver.
> 
> There's no harm in trying that instead of wiping it all straight away.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hey, thanks for making some paragraphs. I'm horrible about creating text walls, as I usually have several thoughts in what should be one paragraph, lol.
> 
> I know formatting pretty well, I just seem to forget it when I post... and punctuation? Wth is that?
> 
> 
> 
> 
> 
> 
> 
> my old English teachers and my fiance the English nazi would smack me around if they had to read my posts

No worries mate, just have a re-read before you hit submit; it makes reading a bit easier.


----------



## Cyber Locc

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Hey, thanks for making some paragraphs. I'm horrible about creating text walls, as I usually have several thoughts in what should be one paragraph, lol.
> 
> I know formatting pretty well, I just seem to forget it when I post... and punctuation? Wth is that?


I am completely lost about what you are saying.

Was your initial install of the AMD drivers (since it sounds like you are not wiping every time) done with one card or two? I am only talking about the initial install for CF; if you install drivers over old drivers that already had CF, all bets are off. I don't know about that case, as I have never done it; I always run DDU and then install the new drivers every time.

I am saying that if you start from a clean slate and install the drivers with only one GPU, then turn the system off, install the second card, and reboot the PC, you will not have CFX options in CCC. Both cards will show up in AB, GPU-Z, etc., but the option for CFX is not there.

However, maybe in some cases it does work and you do get CF options. It did not for me, and Google will show you hundreds of threads from others it didn't work for either, as well as AMD Matt saying you need to reinstall after adding a second card.
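
To put the reported behavior in one place, here is a toy sketch of the rule being claimed (purely illustrative; the function names and the rule itself are my reading of the reports in this thread, not AMD's actual driver logic):

```python
# Toy model of the CrossFire behavior reported in this thread: both cards
# are *detected* either way, but the CFX toggle in CCC only appears if the
# driver was (re)installed while two or more GPUs were present.


def cards_detected(gpus_now: int) -> bool:
    """Device Manager / Afterburner / GPU-Z see every installed card."""
    return gpus_now >= 1


def cfx_toggle_present(gpus_at_driver_install: int, gpus_now: int) -> bool:
    """True if CCC would show CrossFire options, per the reports above."""
    return gpus_now >= 2 and gpus_at_driver_install >= 2


# Installed the driver with one card, then added a second without reinstalling:
assert cards_detected(2) is True          # both cards show up...
assert cfx_toggle_present(1, 2) is False  # ...but no CFX option in CCC

# Clean install (DDU first) with both cards already in the system:
assert cfx_toggle_present(2, 2) is True
```

In other words: detection and CrossFire enablement are two separate things, which is why "the system sees both cards" doesn't settle the argument either way.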

To Sgt Bilko: "Ares and Devil 13 use a custom PCB, they do not follow the reference design."
That is what I have been saying this entire time.... It is a non-reference design of a 295X2. However, they did not just decide "let's make a dual GPU"; they cannot do that. All AMD GPUs are based on an AMD reference design and then altered to their wants.

AMD designs the cards; board partners alter them, nothing more.

Speaking of which, here is a line from Asus's website: "Made for water-cooled rig featuring custom-designed EK water block - running at 25% cooler than reference R9 295X2."

Looks like Asus does call the Ares a 295x2.

And you are right, Vesuvius is the code name; it is also the code name for the Devil 13 and the Ares III.

The PowerColor Radeon R9 295X2 Devil 13 which is also codenamed 'Vesuvius' (Official AMD naming) features two Hawaii XT cores with GCN 1.1 architecture. With two Hawaii XT GPUs on the same board,

Oh, BTW, here is another reference point:
"PowerColor's Radeon R9 295X2 Devil 13 will feature a custom designed PCB which we can't see at the moment but its really beefy in terms of design"

Read more: http://wccftech.com/powercolor-radeon-r9-295x2-devil13-unveiled-monstrous-pcb-design-requires-8pin-power-connectors/#ixzz3zPTZUrGM

Just because they put 2x 290X in the naming scheme does not mean it's not a 295X2; a 295X2 is two 290Xs as well.


----------



## wermad

Generally, it's accepted to call the dual-GPU card designed *directly by AMD* whatever name AMD gave it at the consumer level: i.e. 295X2, 7990, 6990, 5970, 4870X2, 3870X2, etc. AIB partners can and will launch their own dual-GPU products, though; to ensure we can distinguish those from the factory "reference" dual-GPU design, they're usually named differently.

I think a lot of the confusion comes from the Tahiti (7970/7950/7870-XT) and New Zealand (7990) era. PowerColor launched their card under the 7990 name (AX7990, and AX7990 Devil for the OC version) in late 2012 if I'm right (Asus still uses the Ares and Mars names to distinguish their dual-GPU cards). While AMD was still mulling over a challenger to the GTX 690 it had no product of its own, so I'm sure it had no objection to PowerColor using "7990". Eventually we did get a 7990 from AMD, and that led to more confusion. With the Hawaii launch, Vesuvius was named the 295X2, while the products from Asus (Ares) and PowerColor (Devil) were dubbed "290X x2" or "dual 290X". So part of the blame for the confusion comes from AMD, and part from people who don't keep up with or understand the (very wacky at times) naming schemes of *both* GPU companies, tbh.

In the 295X2 club, we welcome both Ares and Devil 290X X2 owners (the Devil dual 390 non-X owners have their own club/thing), and we all get along and help each other. There are more 295X2 owners, but ultimately most issues can be experienced on either side. Some unique features do set us apart, though, and one of them is that the reference 295X2 model from AMD is thermally limited to 75°C to preserve the AIO system (from Asetek). My cards have been under Koolance blocks for most of their lives, since I got them in early 2015. Even with water blocks I'm still limited to 75°C, and I've seen it kick in when I accidentally unplugged the power to the fan controller (trying to fix a rattling fan). Another common question for our group is power requirements and whether a given power supply can be used with this beast; it's confusing enough that I took the liberty of compiling a long list to help answer it (took a few days).

I did have a triple Sapphire 290 OC setup not too long ago, but then switched to dual 295X2s.


edit:

Quote:


> Originally Posted by *Cyber Locc*
> 
> 
> 
> 
> 
> 
> I am completely lost about what you are saying.
> 
> Was your initial install of the AMD drivers (since it sounds like you are not wiping every time) done with one card or two? I am only talking about the initial install for CF; if you install drivers over old drivers that already had CF, all bets are off. I don't know about that case, as I have never done it; I always run DDU and then install the new drivers every time.
> 
> I am saying that if you start from a clean slate and install the drivers with only one GPU, then turn the system off, install the second card, and reboot the PC, you will not have CFX options in CCC. Both cards will show up in AB, GPU-Z, etc., but the option for CFX is not there.
> 
> However, maybe in some cases it does work and you do get CF options. It did not for me, and Google will show you hundreds of threads from others it didn't work for either, as well as AMD Matt saying you need to reinstall after adding a second card.
> 
> To Sgt Bilko: "Ares and Devil 13 use a custom PCB, they do not follow the reference design."
> That is what I have been saying this entire time.... It is a non-reference design of a 295X2. However, they did not just decide "let's make a dual GPU"; they cannot do that. All AMD GPUs are based on an AMD reference design and then altered to their wants.
> 
> AMD designs the cards; board partners alter them, nothing more.
> 
> Speaking of which, here is a line from Asus's website: "Made for water-cooled rig featuring custom-designed EK water block - running at 25% cooler than reference R9 295X2."
> 
> 
> Looks like Asus does call the Ares a 295X2.


Quote:


> ASUS ROG Ares III: Limited Edition. Unlimited Power
> 
> *Dual R9 290X* factory-overclocked to 1030MHz - delivering a 15% faster performance than GTX Titan Z
> Made for water-cooled rig featuring custom-designed EK water block - running at 25% cooler than reference R9 295X2
> DIGI+ VRM with black metallic capacitors and 16-phase Super Alloy Power: 30%-less power noise and 5X-greater durability
> GPU Tweak: Modify clock speeds, voltages, fan performance and more, all


I don't know where you get your info from....

https://www.asus.com/us/Graphics-Cards/ROG_ARESIII8GD5/

Quote:


> PowerColor Radeon™ Devil 13 Dual Core *R9 290X* 8GB GDDR5
> Part Number: AXR9 290X II 8GBD5
> Graphics Engine: RADEON R9 290X
> Video Memory: 8GB GDDR5
> Engine Clock: 1000MHz
> Memory Clock: 1350MHz x4 (5.4 Gbps)
> Memory Interface: 512-bit x2
> DirectX® Support: 11.2
> Bus Standard: PCIE 3.0
> Display Connectors: DL DVI-D / DL DVI-D / HDMI / DisplayPort


http://www.powercolor.com/US/products_features.asp?id=549


----------



## Cyber Locc

Quote:


> Originally Posted by *wermad*
> 
> 
> 
> 
> 
> 
> Generally, it's accepted to call the dual-GPU card designed *directly by AMD* whatever name AMD gave it at the consumer level: i.e. 295X2, 7990, 6990, 5970, 4870X2, 3870X2, etc. AIB partners can and will launch their own dual-GPU products, though; to ensure we can distinguish those from the factory "reference" dual-GPU design, they're usually named differently. I think a lot of the confusion comes from the Tahiti (7970/7950/7870-XT) and New Zealand (7990) era. PowerColor launched their card under the 7990 name (AX7990, and AX7990 Devil for the OC version) in late 2012 if I'm right (Asus still uses the Ares and Mars names to distinguish their dual-GPU cards). While AMD was still mulling over a challenger to the GTX 690 it had no product of its own, so I'm sure it had no objection to PowerColor using "7990". Eventually we did get a 7990 from AMD, and that led to more confusion. With the Hawaii launch, Vesuvius was named the 295X2, while the products from Asus (Ares) and PowerColor (Devil) were dubbed "290X x2" or "dual 290X". So part of the blame for the confusion comes from AMD, and part from people who don't keep up with or understand the (very wacky at times) naming schemes of *both* GPU companies, tbh.
> 
> In the 295X2 club, we welcome both Ares and Devil 290X X2 owners (the Devil dual 390 non-X owners have their own club/thing), and we all get along and help each other. There are more 295X2 owners, but ultimately most issues can be experienced on either side. Some unique features do set us apart, though, and one of them is that the reference 295X2 model from AMD is thermally limited to 75°C to preserve the AIO system (from Asetek). My cards have been under Koolance blocks for most of their lives, since I got them in early 2015. Even with water blocks I'm still limited to 75°C, and I've seen it kick in when I accidentally unplugged the power to the fan controller (trying to fix a rattling fan).
> 
> I did have a triple Sapphire 290 OC setup not too long ago, but then switched to dual 295x2
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't know where you get your info from....:confusion:
> 
> https://www.asus.com/us/Graphics-Cards/ROG_ARESIII8GD5/
> http://www.powercolor.com/US/products_features.asp?id=549


You don't know where I got my info from, yet you just linked it, lol. It's on that page right there, read it! I copied and pasted directly from that very page that you just linked.

Then read my updates. Board partners are not allotted the ability to just make whatever the heck they want. They have to follow a reference design and make changes that AMD deems okay. This is why we do not see dual-GPU cards prior to AMD's release; the one exception to this is the Devil 13 dual 390X, but that is simply a 295X2 with more VRAM and some tweaks etc., just as the 390X is to the 290X.


----------



## wermad

Quote:


> Originally Posted by *Cyber Locc*
> 
> You don't know where I got my info from, yet you just linked it, lol. It's on that page right there, read it! I copied and pasted directly from that very page that you just linked.


I think you need to adjust your glasses; read the initial wording from each of their product specs.

I'll repost:
Quote:


> Quote:
> 
> 
> 
> ASUS ROG Ares III: Limited Edition. Unlimited Power
> 
> *Dual R9 290X* factory-overclocked to 1030MHz - delivering a 15% faster performance than GTX Titan Z
> Made for water-cooled rig featuring custom-designed EK water block - running at 25% cooler than reference R9 295X2
> DIGI+ VRM with black metallic capacitors and 16-phase Super Alloy Power: 30%-less power noise and 5X-greater durability
> GPU Tweak: Modify clock speeds, voltages, fan performance and more, all
> 
> 
> 
> I don't know where you get your info from....
> 
> 
> 
> 
> 
> 
> 
> 
> 
> https://www.asus.com/us/Graphics-Cards/ROG_ARESIII8GD5/
> 
> Quote:
> 
> 
> 
> PowerColor Radeon™ Devil 13 Dual Core *R9 290X* 8GB GDDR5
> Part Number: AXR9 290X II 8GBD5
> Graphics Engine: RADEON R9 290X
> Video Memory: 8GB GDDR5
> Engine Clock: 1000MHz
> Memory Clock: 1350MHz x4 (5.4 Gbps)
> Memory Interface: 512-bit x2
> DirectX® Support: 11.2
> Bus Standard: PCIE 3.0
> Display Connectors: DL DVI-D / DL DVI-D / HDMI / DisplayPort
> 
> 
> http://www.powercolor.com/US/products_features.asp?id=549

Can you read the bold? Or do I need to make the text bigger for you? I'm sorry, it's just that it's plainly obvious, and I'm not sure whether it's your limitations or trolling that keep you from seeing this fact. I think I'll just leave it there.


----------



## Mega Man

You said even the reviews call it a 295X2...

http://www.overclockersclub.com/reviews/powercolor_devil13_dual_core_r9_290x_8gb/

Nope, "Dual Core R9 290X".

Again nope:
http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/66784-powercolor-devil-13-r9-290x-dual-core-review.html

Hey look, this one does:
http://www.guru3d.com/articles-pages/powercolor-radeon-290x-295x2-devil13-review,1.html

But it is Guru3D, so no big surprise there.

Another nope:

http://www.tomshardware.com/print/powercolor-devil-13-dual-amd-r9-290x-review,reviews-3853.html

Another nope:
http://www.modders-inc.com/powercolor-devil-13-r9-290x-dual-core-review/

So you have one site that does and what, four others that don't? That tells me one site epic failed (big surprise, Guru3D...).

Feel free to show proof of all these other "sites".


----------



## Cyber Locc

Quote:


> Originally Posted by *wermad*
> 
> I think you need to adjust your glasses, read the initial wording from each of their products specs.
> 
> I'll repost:
> Can you read the bold? Or do I make the text bigger for you? I'm sorry, its just that its plainly obvious and I'm not sure if its your limitations or trolling that can't see this fact. I think I'll just leave it there.


Dude, what they decide to label it as has no bearing on what it is. They can call it whatever they want; it is still a non-reference card based on the 295X2, period.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Cyber Locc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wermad*
> 
> 
> 
> 
> 
> 
> Generally, it's accepted to call the dual-GPU card designed *directly by AMD* whatever name AMD gave it at the consumer level: i.e. 295X2, 7990, 6990, 5970, 4870X2, 3870X2, etc. AIB partners can and will launch their own dual-GPU products, though; to ensure we can distinguish those from the factory "reference" dual-GPU design, they're usually named differently. I think a lot of the confusion comes from the Tahiti (7970/7950/7870-XT) and New Zealand (7990) era. PowerColor launched their card under the 7990 name (AX7990, and AX7990 Devil for the OC version) in late 2012 if I'm right (Asus still uses the Ares and Mars names to distinguish their dual-GPU cards). While AMD was still mulling over a challenger to the GTX 690 it had no product of its own, so I'm sure it had no objection to PowerColor using "7990". Eventually we did get a 7990 from AMD, and that led to more confusion. With the Hawaii launch, Vesuvius was named the 295X2, while the products from Asus (Ares) and PowerColor (Devil) were dubbed "290X x2" or "dual 290X". So part of the blame for the confusion comes from AMD, and part from people who don't keep up with or understand the (very wacky at times) naming schemes of *both* GPU companies, tbh.
> 
> In the 295X2 club, we welcome both Ares and Devil 290X X2 owners (the Devil dual 390 non-X owners have their own club/thing), and we all get along and help each other. There are more 295X2 owners, but ultimately most issues can be experienced on either side. Some unique features do set us apart, though, and one of them is that the reference 295X2 model from AMD is thermally limited to 75°C to preserve the AIO system (from Asetek). My cards have been under Koolance blocks for most of their lives, since I got them in early 2015. Even with water blocks I'm still limited to 75°C, and I've seen it kick in when I accidentally unplugged the power to the fan controller (trying to fix a rattling fan).
> 
> I did have a triple Sapphire 290 OC setup not too long ago, but then switched to dual 295x2
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I don't know where you get your info from....:confusion:
> 
> https://www.asus.com/us/Graphics-Cards/ROG_ARESIII8GD5/
> http://www.powercolor.com/US/products_features.asp?id=549
> 
> 
> 
> You don't know where I got my info from, yet you just linked it, lol. It's on that page right there, read it! I copied and pasted directly from that very page that you just linked.
> 
> Then read my updates. Board partners are not allotted the ability to just make whatever the heck they want. They have to follow a reference design and make changes that AMD deems okay. This is why we do not see dual-GPU cards prior to AMD's release; the one exception to this is the Devil 13 dual 390X, but that is simply a 295X2 with more VRAM and some tweaks etc., just as the 390X is to the 290X.

Asus and PowerColor both sold the R9 295X2, AND they both created custom dual-GPU cards called the Ares III and the Devil 13... which they listed and sold as R9 290X2s or dual 290X cards.


----------



## Mega Man

Also, to add: I think you are confusing what Nvidia does with what AMD does. I never heard that Asus has to ask AMD if the Ares is okay; actually, I read an article about it, but I probably won't find it, as I'm currently in China and can't use Google (no, I am not joking).

AMD lets companies make their own choices when it comes to non-reference boards, unlike Nvidia; they don't need AMD's stamp of approval.

Reference boards, of course, just follow the blueprint.


----------



## wermad

Well... I must help as best I can... pictures might do it this time for those in need:





Even posted it in 1kx1k for yah


----------



## Cyber Locc

Quote:


> Originally Posted by *Sgt Bilko*
> 
> Asus and Powercolor both sold the R9 295x2 AND they both created Custom Dual GPU cards called the Ares 3 and Devil 13.........which they listed and sold as R9 290x2's or Dual 290x cards


That is correct, they both sold reference 295X2s and then made custom board options, as you said; I agree on those counts. That is no different from an MSI Lightning 290X or an Asus DC2: those are still 290Xs, just with tweaked boards.
Quote:


> Originally Posted by *Mega Man*
> 
> you said even the reviews call it a 295x2..
> 
> http://www.overclockersclub.com/reviews/powercolor_devil13_dual_core_r9_290x_8gb/
> 
> nope, dual core 290x
> 
> again nope
> http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/66784-powercolor-devil-13-r9-290x-dual-core-review.html
> 
> hey look this one does
> http://www.guru3d.com/articles-pages/powercolor-radeon-290x-295x2-devil13-review,1.html
> 
> but it is guru3d, so no big surprise there
> 
> another nope
> 
> http://www.tomshardware.com/print/powercolor-devil-13-dual-amd-r9-290x-review,reviews-3853.html
> 
> another nope
> http://www.modders-inc.com/powercolor-devil-13-r9-290x-dual-core-review/
> so you have 1 sad site who does and what 4 others who dont, that tells me 1 site epic failed ( big surprise guru3d.... ) ........
> 
> feel free to show proof of all these other " sites "


Okay, let's do that.

First, from your own links:

http://www.overclockersclub.com/reviews/powercolor_devil13_dual_core_r9_290x_8gb/
"PowerColor has again been thinking outside the box to deliver what is essentially an air cooled R9 295X2... "

http://www.hardwarecanucks.com/forum/hardware-canucks-reviews/66784-powercolor-devil-13-r9-290x-dual-core-review.html
"PowerColor's R9 290X Devil 13 goes back to basics by utilizing virtually the same specifications as the R9 295X2 but backing things up with an infinitely more adaptable fan-based heatsink."

http://www.tomshardware.com/print/powercolor-devil-13-dual-amd-r9-290x-review,reviews-3853.html
"Now, PowerColor is trying to improve upon the first workable reference design we've seen from AMD in years with a gigantic air cooler. Is this an act of deft engineering or blind ambition?"

Okay, so now you want links to review sites that call it a 295X2 in the title? No problem.

http://www.fudzilla.com/35401-powercolor-devil-13-dual-gpu-r9-290x-reviewed
"PowerColor is not using standard R9 295X2 branding for its dual-GPU card. It calls it the Devil 13 Dual GPU R9 290X. The marketing plan is simple - PowerColor wants to differentiate its dual-Hawaii card from the rest of the field."

http://wccftech.com/powercolor-radeon-r9-295x2-devil13-unveiled-monstrous-pcb-design-requires-8pin-power-connectors/ - it's in the title.

http://www.tweaktown.com/news/37990/powercolor-expected-to-unveil-its-crazy-new-gpu-at-computex-next-week/index.html

http://www.hardwareheaven.com/2014/06/powercolor-devil13-r9-290x2-graphics-card-review/ - calling it PowerColor's own branded 295X2.

Not a reviewer, but PCPartPicker calls it a 295X2 as well: https://pcpartpicker.com/part/powercolor-video-card-axr9290xii8gbd5


----------



## Cyber Locc

Quote:


> Originally Posted by *wermad*
> 
> Well...I must help as best I can...pictures might do this time for those in need:
> 
> 
> 
> 
> 
> Even posted it in 1kx1k for yah


OMG, no one is arguing that the card doesn't have a custom PCB design; it does. It is a NON-REFERENCE 295X2.

By your logic, an MSI Lightning 290X is not a 290X, it is an MSI Lightning. Get real, you all seriously have some fanboy issues. That is what this comes down to, I think: you want to believe your card is different so it can be superior or something, but it isn't.

A 295X2 is a dual-290X GPU.
A PowerColor Devil 13 is a dual-290X GPU.

That makes them the same thing!


----------



## Sgt Bilko

Quote:


> Originally Posted by *wermad*
> 
> Well...I must help as best I can...pictures might do this time for those in need:
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Even posted it in 1kx1k for yah


^ That.......end of "discussion"


----------



## Cyber Locc

Quote:


> Originally Posted by *Sgt Bilko*
> 
> ^ That.......end of "discussion"


Okay, I will keep that in mind any time I'm talking in the future.

A Titan X is not a Titan X unless it is reference.
A 290X is not a 290X unless it is reference.
A Fury X is not a Fury X unless it is reference.

So in laments terms, a DC2 is a DC2, not a 290X DC2. Get freaking real, your guys' argument is the stupidest thing I have heard in years.


----------



## Sgt Bilko

Quote:


> Originally Posted by *Cyber Locc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Sgt Bilko*
> 
> ^ That.......end of "discussion"
> 
> 
> 
> Okay I will keep that in mind any time talking in the future.
> 
> A Titan X is not a Titan X unless it is reference.
> A 290x is not a 290x unless it is reference.
> A fury X is not a Fury X unless it is reference.
> 
> So in laments terms, a DC2 is a DC2 not a 290x DC2, Get freaking real your guy's argument is the stupidest thing I have heard in years.

You've completely missed the point of it all...

To call the Devil 13 and Ares III an "R9 295X2" is actually an insult, given the amount of work those companies put into those cards... the same reason I don't put the Lightning 290X and the reference 290X in the same class: the amount of work and engineering in them puts them into another class.

They have the same GPU core and the same amount of VRAM (usually), and that is where the similarities end.

I'm sorry if you do not understand that, and if it still makes no sense to you then I have nothing more to say on the matter.

Oh, and btw, the Titan X is reference-only, and the same goes for the Fury X; neither AMD nor Nvidia let AIBs create and engineer custom versions of those.


----------



## Cyber Locc

Quote:


> Originally Posted by *Sgt Bilko*
> 
> You've completely missed the point of it all.......
> 
> to call the Devil 13 and Ares 3 a "R9 295x2" is actually an insult due to the amount of work those companies put into those cards.....same reason I don't put the Lightning 290x and the Ref 290x in the same class, the amount of work and engerinnering in them puts them into another class.
> 
> They have the same GPU Core and the same amount of vram (usually) and that is where the similarities end with them.
> 
> I'm sorry if you do not understand that and if it still makes no sense to you then I have nothing more to say on the matter
> 
> Oh and btw, Titan X is only Reference and same goes with the Fury X, both AMD and Nvidia didn't let AIB's create and engineer custom versions of them


Fine, Furys then. And actually, I would say the opposite for the Devil 13, seeing how a 295X2 is faster than the Devil 13...

And you just hit the nail on the head: a PowerColor Devil 13 is a custom version of a 295X2, but a 295X2 it is. AMD allowed AIBs to make a custom 295X2, and thus the PowerColor Devil 13 was born.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Cyber Locc*
> 
> Fine Fury's then, and actually I would say the opposite of the Devil 13 seeing how a 295x2 is faster than the Devil 13...
> 
> And you just hit the nail on the hammer, a Power Color Devil 13 is a Custom Version of a 295x2 however a 295x2 it is. AMD allowed AIBs to make a custom 295x2 and thus the Power Cooler Devil 13 was born.


That's cool and all, but you never explained how there are two different versions of the AMD drivers in every install (allegedly), yet an installed but not powered (via PSU) card doesn't show up in Device Manager and doesn't change the driver structure or files in any way... you in fact skipped over that little tidbit.


----------



## mus1mus

Quote:


> Originally Posted by *Cyber Locc*
> 
> Okay I will keep that in mind any time talking in the future.
> 
> A Titan X is not a Titan X unless it is reference.
> A 290x is not a 290x unless it is reference.
> A fury X is not a Fury X unless it is reference.
> 
> So in laments terms, a DC2 is a DC2 not a 290x DC2, Get freaking real your guy's argument is the stupidest thing I have heard in years.


This is the 2nd time I've seen "laments terms" in this thread. So I'll chime in.

Put it in layman's terms, coz by your logic, Washington APPLES are the same as Fuji APPLES. Nope, they are just apples. But not the same.


----------



## mfknjadagr8

Sorry, I needed it to be a little more light-hearted.


----------



## Alex132

Wait, is that one guy still believing that the R9 295X2 is limited to 75°C for lolz and not for a real reason?


----------



## mfknjadagr8

Quote:


> Originally Posted by *Alex132*
> 
> Wait, is that one guy still believing that the R9 295X2 is limited to 75°C for lolz and not for a real reason?
> 
> 


No, he never said that; he simply said he doesn't think it was because of the pump.


----------



## Cyber Locc

Quote:


> Originally Posted by *mfknjadagr8*
> 
> that's cool and all but you never explained how there are two different versions of amd drivers in every install (allegedly)yet an installed but not powered (via psu)card doesn't show up in device manager and doesn't change the driver structure or files in any way....you in fact skipped over that little tidbit


I did not; that was covered by the other guy. Just because the GPU doesn't get power from the PCIe cables doesn't mean it isn't being read.

I had that happen long ago, but it was a weak card, a GT 430 that I used for testing boards. I put it in an HTPC without its cables hooked up by accident; it worked, it displayed, and it was read in Device Manager, but under load it would crash, which is what led me to figure out I had forgotten the cables. Of course that is a completely different card with far lower power requirements; the same thing could be happening here.

As I stated previously, "there are two different versions of AMD drivers in every install (allegedly)". I don't know that; it's just a guess. What I do know is that if you install drivers and then try to add a second card without reinstalling, it does not work. At least it didn't for me, or for the hundreds of people who have threads about the exact same issue, along with the other guy who mentioned it before I did.

Now, could this issue affect just a certain set of drivers every now and then? Yes, that's possible. Could it work sometimes and not others? That is also possible. Did it fail for me and many, many others? Yes.
Could installing a fresh driver, versus installing a driver over another, change the way that works? Yes. Have I tried? No, because I don't have a need to; I install my drivers with all my GPUs already installed.
Quote:


> Originally Posted by *Alex132*
> 
> Wait, is that one guy still believing that the R9 295X2 is limited to 75°C for lolz and not for a real reason?


Like he said, I never said there was not a reason. There is a reason, I am sure, and seeing how the Devil 13 and Ares 3 have the same feature, obviously it has nothing to do with pumps. Not that any person familiar with water cooling in the slightest would come to that conclusion; then again, AMD engineers might not be familiar with water cooling, so there is that.

However, if you want to keep your crazy conspiracy theory that the AIO cooler is the cause, then so be it, but I promise you that is not the reason.

If it were the reason, someone had better inform Corsair and everyone else that makes AIOs that get put on CPUs that run, even with them, way over 75°C, because we would be seeing RMAs left and right. Oh wait, we don't.

Funny, AIOs have been out for years and years, yet this 75°C limit to avoid water temps of 50°C+ has never been mentioned. Well, by golly, you guys are going to revolutionize the AIO water cooling market with these findings.

"AMD has clamped down on the R9 295X2's GPUs, only allowing them to reach 75C before throttling. Upon finding this we asked AMD why they were using such a relatively low temperature limit, and the response we received is that it's due to a combination of factors including the operational requirements of the CLLC itself, and what AMD considers the best temperature for optimal performance. As we briefly discussed in our 290X review leakage increases with temperature, and while Hawaii is supposed to be a lower leakage part leakage is still going to be occurring. To that end our best guess is that 75C is as warm as Hawaii can get before leakage starts becoming a problem for this card."

http://www.anandtech.com/show/7930/the-amd-radeon-r9-295x2-review/3

However, this again does not imply pump problems either; the thing is, the pump will withstand more heat than the tubing will. Those hoses will pull off the barbs long before there are any problems with the pump. Seeing how the other cards have the same limit, I would think it is safe to assume that "leakage" is the problem.

When you put two GPUs so close together on one card, that vastly changes the thermal dynamics of the card. So while a regular 290X chip with space to spread out has a thermal limit of 95°C, it is not unreasonable for a dual version to have a different limit.

I would go on to mention this is why I pointed out the 7990 vs the 7970 having different throttle limits. As a matter of fact, I would venture a guess that if we looked at all the different dual-GPU cards, we would see a pattern emerge of lower temp limits for dual GPUs versus their single-GPU counterparts.

Back to the pump convo, though: DDCs can and do hit way over 70°C due to their air-cooling-friendly design. An AIO's pump is water-cooled; like a D5, all heat from the pump is dumped into the loop, so the temp of the AIO's pump will be slightly higher than that of the water. Now couple that with the fact that DDCs and D5s can both handle way more than 70°C (the DDC's circuit board cannot, but the pump itself can).

Now we are suggesting that an R9 295X2 can heat the water beyond 70°C when its cores are past 75°C; that would be insane, and the tubes would come off long before that.

But hey, like I said, you are welcome to think what you want.
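For anyone who wants numbers instead of shouting, the water-versus-core argument is easy to sanity-check with a back-of-the-envelope steady-state estimate. Everything below is an illustrative assumption, not a measured spec of the 295X2's CLC: roughly 450 W dumped into the loop, a single 120 mm radiator around 15 W/°C of effective dissipation at high fan speed, and about 1.2 L/min of flow.

```python
# Rough steady-state loop temperature estimate (illustrative numbers only).
# Assumptions: gpu_power_w, rad_w_per_c, and flow_lpm are ballpark guesses,
# not measured values for the 295X2's cooler.

def loop_temps(gpu_power_w, ambient_c, rad_w_per_c, flow_lpm):
    """Return (steady-state water temp in C, coolant rise across the block in C)."""
    # Radiator-limited water temperature above ambient at steady state:
    water_temp = ambient_c + gpu_power_w / rad_w_per_c
    # Coolant temperature rise across the block (Q = m_dot * c_p * dT):
    mass_flow_kg_s = flow_lpm / 60.0                    # ~1 kg per litre of water
    delta_t = gpu_power_w / (mass_flow_kg_s * 4186.0)   # c_p of water, J/(kg*K)
    return water_temp, delta_t

water, delta = loop_temps(gpu_power_w=450, ambient_c=25, rad_w_per_c=15, flow_lpm=1.2)
print(f"water ~{water:.0f} C, block delta ~{delta:.1f} C")
```

With those guesses the water sits around 55°C and only rises a few degrees passing through the block; the cores run well above the water because of the die-to-block thermal resistance, which is exactly the water-versus-heat-source distinction being argued here.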


----------



## Cyber Locc

Quote:


> Originally Posted by *mus1mus*
> 
> This is the 2nd time I've seen "laments terms" in this thread. So I'll chime in.
> 
> Put it in laymen's term coz by your logic, Washington APPLES are the same as Fuji APPLES. Nope, they are just apples. But not the same.


Quote:


> Originally Posted by *mfknjadagr8*
> 
> Sorry I needed it to be a little more light hearted


How do y'all know I wasn't saying that in grief??? Hmmmm, I could have been saying it with sadness in my voice to get across my outlook on the statement, and used "laments" to express my grief, seeing as you cannot hear my voice? Ahhh, assumptions and ad hominems, always fun.

JK, I meant laymen's.









----------



## Alex132

Quote:


> Originally Posted by *Cyber Locc*
> 
> Like he said, I never said there was not a reason. There is a reason, I am sure, and seeing how the Devil 13 and Ares 3 have the same feature, obviously it has nothing to do with pumps. Not that any person familiar with water cooling in the slightest would come to that conclusion; then again, AMD engineers might not be familiar with water cooling, so there is that.
> 
> However, if you want to keep your crazy conspiracy theory that the AIO cooler is the cause, then so be it, but I promise you that is not the reason.
> 
> If it were the reason, someone had better inform Corsair and everyone else that makes AIOs that get put on CPUs that run, even with them, way over 75°C, because we would be seeing RMAs left and right. Oh wait, we don't.
> 
> Funny, AIOs have been out for years and years, yet this 75°C limit to avoid water temps of 50°C+ has never been mentioned. Well, by golly, you guys are going to revolutionize the AIO water cooling market with these findings.
> 
> "AMD has clamped down on the R9 295X2's GPUs, only allowing them to reach 75C before throttling. Upon finding this we asked AMD why they were using such a relatively low temperature limit, and the response we received is that it's due to a combination of factors including the operational requirements of the CLLC itself, and what AMD considers the best temperature for optimal performance. As we briefly discussed in our 290X review leakage increases with temperature, and while Hawaii is supposed to be a lower leakage part leakage is still going to be occurring. To that end our best guess is that 75C is as warm as Hawaii can get before leakage starts becoming a problem for this card."
> 
> http://www.anandtech.com/show/7930/the-amd-radeon-r9-295x2-review/3
> 
> However, this again does not imply pump problems either; the thing is, the pump will withstand more heat than the tubing will. Those hoses will pull off the barbs long before there are any problems with the pump. Seeing how the other cards have the same limit, I would think it is safe to assume that "leakage" is the problem.
> 
> When you put two GPUs so close together on one card, that vastly changes the thermal dynamics of the card. So while a regular 290X chip with space to spread out has a thermal limit of 95°C, it is not unreasonable for a dual version to have a different limit.
> 
> I would go on to mention this is why I pointed out the 7990 vs the 7970 having different throttle limits. As a matter of fact, I would venture a guess that if we looked at all the different dual-GPU cards, we would see a pattern emerge of lower temp limits for dual GPUs versus their single-GPU counterparts.
> 
> Back to the pump convo, though: DDCs can and do hit way over 70°C due to their air-cooling-friendly design. An AIO's pump is water-cooled; like a D5, all heat from the pump is dumped into the loop, so the temp of the AIO's pump will be slightly higher than that of the water. Now couple that with the fact that DDCs and D5s can both handle way more than 70°C (the DDC's circuit board cannot, but the pump itself can).
> 
> Now we are suggesting that an R9 295X2 can heat the water beyond 70°C when its cores are past 75°C; that would be insane, and the tubes would come off long before that.
> 
> But hey, like I said, you are welcome to think what you want.


Let's compare it to the other dual-pump OEM AIO graphics cards!
Oh. Wait. There are none.

Yeah, AMD clearly put the 75°C limit on there for lolz. You gotta raise the limit to 900°C and stick it to the man! You show AMD.
Also, nice sources on pump temperature limits. Oh. Wait. There are none.


----------



## Alex132

Also pretty sure the thermal limits for dual GPU cards have been the same as their single-GPU counterparts - except for 295x2.

I know this is a fact for the 4870X2, 5970, 590, 690, 7990 and Titan-Z.


----------



## Cyber Locc

Quote:


> Originally Posted by *Alex132*
> 
> Let's compare it to the other dual-pump OEM AIO graphics cards!
> Oh. Wait. There are none.
> 
> Yeah, AMD clearly put the 75°C limit on there for lolz. You gotta raise the limit to 900°C and stick it to the man! You show AMD.
> Also, nice sources on pump temperature limits. Oh. Wait. There are none.


We don't need to compare it to other dual-pump graphics cards lol.

Pump temps:

https://www.ekwb.com/shop/ek-d5-vario-laing-d5-vario
EK d5 - Maximum liquid temperature: 60°C

https://www.ekwb.com/shop/ek-ddc-3-2-pwm-laing-ddc-3-2-pwm
EK DDC - Maximum liquid temperature: 60 °C

http://koolance.com/pmp-500-pump-g-1-4-bsp -

Koolance PMP 500 - 60°C (140°F)

http://www.laing.cz/index.php?option=com_content&task=view&id=1&Itemid=2
Laing's pump temps: medium temperature for Noryl pumps, 0 to +60°C

Pay special attention to this next one, as it is the same type of pump employed by the AIO:
http://www.alphacool.com/product_info.php/info/p1055_Alphacool-DC-LT-Keramik-12-Volt-Pumpe---bulk.html?language=en&XTCsid=2f6v9a5a1kaemlbvct8mr53qj2
Alphacool DC-LT Keramik
Maximum system temperature: 65°C

So you are right, I was off by 10°C; it's 60°C. Again, those plastic barbs would begin to melt at 50°C, hoses would pop, acrylic reservoirs would crack, all sorts of madness.

If you cannot understand the difference between water temp and heat-source temp, well, then I cannot help you. A 75°C temp on the core will not put the water anywhere near 65°C; it won't even be 50°C, it will be less than 40°C lol.
Quote:


> Originally Posted by *Alex132*
> 
> Also pretty sure the thermal limits for dual GPU cards have been the same as their single-GPU counterparts - except for 295x2.
> 
> I know this is a fact for the 4870X2, 5970, 590, 690, 7990 and Titan-Z.


Also, I am pretty sure that's not right; the 7990's was 85°C vs the 7970's 95°C, but by all means keep throwing out conspiracy theories. Sgt Biko said this had something to do with VRMs; I do not know personally.

Also, check the facts again: AnandTech says you are wrong, I said you are wrong, pump makers tell you you are wrong, science tells you that you ARE WRONG. So please stop spreading misinformation.

Edit: got the results for a Titan Z vs a Titan Black; this is a bad example, as the Titan Z is not as strong as its single-GPU SLI counterpart.

Throttle temps
Titan Z 83c
Titan Black 95c

Well, would you look at that: it seems that they do have varying max temps. Granted, not as large a variance as the 295X2's, but varying temps still.

"Yeah, AMD clearly put the 75°C limit on there for lolz. You gotta raise the limit to 900°C and stick it to the man! You show AMD."
Yeah, let that troll flag fly... Did you even read the convo?

As I am fairly sure you did not: no one is trying to raise the max temp. Someone said that the thermal limitation was solely because of the pump; this is not true, it is 100% false and ridiculous.

Things that could be a reason:

- They didn't feel the rad on the CLC could cool higher temps with the already-fast fans they added.
- The leakage, like AnandTech pointed out.
- The closeness of the components on the dual-GPU card, mainly the VRMs' and memory chips' positions relative to the cores.

Or any number of other situations; most likely it was a combination of all three, as that is what AMD implied to AnandTech when asked.


----------



## Butthurt Beluga

I know I'm just a little late by now, but is it too late to join the club?








1)http://www.techpowerup.com/gpuz/details.php?id=w254n
2) XFX R9 290X 8GB
3) Stock

Hopefully it's okay to ask here (if not, please point me in the right direction if you don't mind), but what are some good aftermarket coolers for the R9 290X?
Right now I have the stock cooler, and while it does its job, things do get a little toasty and I'd like a little more overclocking headroom.
Preferably I'm looking for a self-contained unit that doesn't require me to have my own custom loop (because I don't have one yet).


----------



## JourneymanMike

Quote:


> Originally Posted by *Butthurt Beluga*
> 
> I know I'm just a little late by now, but is it too late to join the club?
> 
> 
> 
> 
> 
> 
> 
> 
> 1)http://www.techpowerup.com/gpuz/details.php?id=w254n
> 2) *XFX R9 290X 8GB
> *3) Stock
> 
> *Hopefully it's okay to ask here* (if not please point me in the right direction if you don't mind) but what are some good aftermarket coolers for the R9 290X?
> Right now I have a stock cooler and while it does it's job, things do get a little toasty and I'd like a little more overclocking headroom.
> Preferably I'm looking for a self-contained unit that doesn't require me to have my own custom loop (because I don't have one yet)


1. Pictures?

2. Yes, it is OK to check here, that's what this thread is all about...

Are you looking to water cool, or air cool?


----------



## Butthurt Beluga

Quote:


> Originally Posted by *JourneymanMike*
> 
> Are you looking to water cool, or air cool?


Either, really, but if it is water it could only have a ~160mm radiator, as it'd have to fit onto the back of my case (there's no room elsewhere).


----------



## Roboyto

Quote:


> Originally Posted by *Butthurt Beluga*
> 
> I know I'm just a little late by now, but is it too late to join the club?
> 
> 
> 
> 
> 
> 
> 
> 
> 1)http://www.techpowerup.com/gpuz/details.php?id=w254n
> 2) XFX R9 290X 8GB
> 3) Stock
> 
> Hopefully it's okay to ask here (if not please point me in the right direction if you don't mind) but what are some good aftermarket coolers for the R9 290X?
> Right now I have a stock cooler and while it does it's job, things do get a little toasty and I'd like a little more overclocking headroom.
> Preferably I'm looking for a self-contained unit that doesn't require me to have my own custom loop (because I don't have one yet)


Welcome to the 290X thread!

If you're going to make a change, then at the least you should attach an AIO water cooler to your card. Lots of good info here, but scroll down to the bottom of the post for alternate cooling means: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21880_20#post_22208781

One that I haven't added to the list is the Corsair HG10. Unfortunately it requires the use of a reference blower fan to cool the VRMs, so you'd have to buy one of those: http://www.corsair.com/en-us/hydro-series-hg10-a1-gpu-liquid-cooling-bracket

Be sure to check https://www.ekwb.com/configurator/ to see if a full cover block will fit your card if you eventually plan to fully water cool it.

If you don't want to make an entire loop for CPU and GPU you could construct a small loop for just the GPU. I did this in my HTPC and got fair results:

http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/800_20#post_24762217


----------



## Butthurt Beluga

Quote:


> Originally Posted by *JourneymanMike*
> 
> 1. Pictures?



Hopefully that's good enough; unfortunately I don't have the original packaging.
Quote:


> Originally Posted by *Roboyto*
> 
> Welcome to the 290X thread!
> 
> If you're going to make a change then at the least you should attach a AIO watercooler to your card. Lots of good info here but scroll down to the bottom of the post for alternate cooling means: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21880_20#post_22208781
> 
> One that I haven't added into the list is Corsair HG10. Unfortunately it requires the use of a reference blower fan to cool the VRM's so you'd have to buy one of those: http://www.corsair.com/en-us/hydro-series-hg10-a1-gpu-liquid-cooling-bracket
> 
> Be sure to check https://www.ekwb.com/configurator/ to see if a full cover block will fit your card if you eventually plan to fully water cool it.
> 
> If you don't want to make an entire loop for CPU and GPU you could construct a small loop for just the GPU. I did this in my HTPC and got fair results:
> 
> http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/800_20#post_24762217


Thank you! The first thing I should have done was read through at least the first page of the thread, I got a little excited


----------



## JourneymanMike

Quote:


> Originally Posted by *Butthurt Beluga*
> 
> 
> Hopefully that's good enough; unfortunately I don't have the original packaging.


That'll do 'er....

Thanks


----------



## rdr09

Quote:


> Originally Posted by *Butthurt Beluga*
> 
> Either, really, but if it is water it could only have a ~160mm radiator, as it'd have to fit onto the back of my case (there's no room elsewhere).


There is this, too . . .

http://www.overclock.net/t/1331663/xfx-black-double-dissipation-club

Welcome.


----------



## c0V3Ro

Hey Mates!
Just got a used PowerColor 290X with a waterblock and a VisionTek BIOS.
It's scoring low in benches, but doesn't show artefacts or that kind of issue.
I noticed a low value for the CPU physics test and a high value for loading time (ms) in Catzilla.
Could it be a CPU-to-GPU communication issue?

Do these log values look acceptable?

Or did I just get a lemon?


----------



## rdr09

Quote:


> Originally Posted by *c0V3Ro*
> 
> Hey Mates!
> Just got an used PowerColor 290x with waterblock and Visiontek bios.
> It's scoring low in benchs, but doesn't shows artefacts or this kind of issue.
> Noticed a low value for physics test CPU and a high value for loading time(ms) on Catzilla.
> Can it be CPU to VGA communication issue?
> 
> Does this log values looks acceptable?
> 
> Or I just got a lemon?


We normally compare Firestrike scores (Graphics) or Heaven 4.0. Hard to tell with that bench.

Welcome.

Edit: Use settings found here . . .

http://www.overclock.net/t/1235557/official-top-30-heaven-benchmark-4-0-scores

Your CPU will definitely hold back the GPU; that's pretty well known.


----------



## c0V3Ro

Hey, thanks for the prompt answer.
I'm wondering whether to go for an 8350. But this CPU's low scores, less than half of an 8350's, also made me think about the motherboard.
I also got low scores in 3DMark, Unigine, and Aquamark.


----------



## rdr09

Quote:


> Originally Posted by *c0V3Ro*
> 
> Hey, thanks for the prompt answer.
> I'm wondering to go for a 8350. But this cpu low scores, less than half of a 8350, also made me think about the motherboard.
> Also got low scores on 3dmark, unigine, aquamark.


Top of the page is the Rigbuilder. Fill it out if you get a chance.

Not knowing the rest of your rig, like the PSU... it is hard to tell if it is just the CPU. Your graphics score is quite low. I know some PowerColor cards had an issue of underperforming Hawaii chips, so they put out a BIOS to fix it. A stock 290X should be getting above 11K in Graphics score in Fire Strike.

Even your Heaven score is low for a bench that relies less on the CPU. At 1030 core you should be scoring close to, if not over, 60.

About the CPU: yes, the best and cheapest route is an 8300-series chip.


----------



## c0V3Ro

Thanks, mate. I'll fill it out.
The board came with a VisionTek BIOS. The PSU is a 7team ST-750ZAF.
Would it be bad luck to catch two lemons in a row?


----------



## Cyber Locc

Quote:


> Originally Posted by *c0V3Ro*
> 
> Thanks Mate. I'll fill it up.
> The board came with Visiontek bios. The PSU is a 7team ST-750ZAF.
> Would it be that bad luck to catch two lemons in a row?


Umm, that PSU is over 7 years old and was mediocre the day it was released. Yeah, I would say it definitely needs replacing.


----------



## Riktar54

Quote:


> Originally Posted by *Butthurt Beluga*
> 
> I know I'm just a little late by now, but is it too late to join the club?
> 
> 
> 
> 
> 
> 
> 
> 
> 1)http://www.techpowerup.com/gpuz/details.php?id=w254n
> 2) XFX R9 290X 8GB
> 3) Stock
> 
> Hopefully it's okay to ask here (if not please point me in the right direction if you don't mind) but what are some good aftermarket coolers for the R9 290X?
> Right now I have a stock cooler and while it does it's job, things do get a little toasty and I'd like a little more overclocking headroom.
> Preferably I'm looking for a self-contained unit that doesn't require me to have my own custom loop (because I don't have one yet)


I purchased an Arctic Accelero Hybrid III-140 and what a difference! XFX can kiss my behind about their DD cooler.









This is my card running Furmark with the stock DD cooler:


And this is running Furmark with the Arctic Accelero Hybrid III-140:


One thing to note: the kit does not provide ANY cooling for the VRM2 chips. My first run showed the VRM2s running about 6°C hotter than with the stock DD cooler. So I put a 140mm intake fan in the bottom of my R5 case, and that was enough to blow some air over the VRM2 chips and keep the temps down.

Amazon was selling them for $99 (Prime), but they are now out of stock on the 290 version. There is a 3rd-party vendor who is trying to get $294.









Arctic does sell this cooler on their website for $129 (140mm radiator) and $119 (120mm radiator) + shipping.

Last point: Installing this cooler (by yourself) is a royal PITA! If I ever have to do this again, I will have a friend over to help.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Cyber Locc*
> 
> I did not; that was covered by the other guy. Just because the GPU doesn't get power from the PCIe cables doesn't mean the system isn't reading it.
> 
> I had that happen long ago, however it was a weak card, a GT 430 that I used for testing boards. I put it in an HTPC without its cables hooked up by accident; it worked, it displayed, and it was read in Device Manager, however under load it would crash, which is what led me to figure out I forgot the cables. Of course that is a completely different card with far lower power requirements, but the same thing could be happening here.
> 
> As I stated previously, "there are two different versions of AMD drivers in every install (allegedly)". I don't know that; that is just a guess. What I do know is that if you install drivers and try to add a second card without reinstalling, it does not work. At least it didn't for me or the hundreds of people that have threads about the exact same issue, along with the other guy that said something about it before I did.
> 
> Now, could this issue just be a certain set of drivers every now and then? Yeah, it's possible. Could it work sometimes and not others? That is also possible. Did it fail for me and many, many others? Yes.
> Could installing a fresh driver versus installing a driver over another change the way that works? Yes.


I always do clean installs. I didn't used to, but AMD drivers have shown me the error of my ways... Neither of my 290s will show up in Device Manager or sync to a display without being powered via the power supply. This includes each by itself and both together, or with both installed but only one powered.


----------



## cephelix

@Butthurt Beluga if you're going air, you could try the Raijintek Morpheus. My best upgrade, though, has been CLU on the die and Fujipoly Ultra Extreme on the VRMs.


----------



## c0V3Ro

Quote:


> Originally Posted by *Cyber Locc*
> 
> Umm, that PSU is over 7 years old and was mediocre the day it was released. Yeah, I would say it definitely needs replacing.


Oh my...








I got it new (a bargain?) and kept it until I got time to assemble a gaming rig. It has had very little use.
Its voltages in the BIOS seem pretty solid.
What do you guys think of the GPU-Z log from Heaven 4.0 at 100% GPU usage?
12V line drops to 11.88V;
Max Vddc(voltage): 1.164V;
Max Vddc(current-in): 15.7A;
Max Vddc(current-out): 126.5A;
Max Vddc(power-in): 178.8W;
Max Vddc(power-out): 148.3W
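Two quick checks you can run on numbers like these (a rough sketch, not a diagnostic tool): the VRM conversion efficiency implied by the power-in/power-out pair, and the 12V droop against the ±5% tolerance the ATX spec allows.

```python
# Sanity-check the GPU-Z readings quoted above (values copied from the log).
power_in_w = 178.8      # Max VDDC power-in
power_out_w = 148.3     # Max VDDC power-out
rail_12v_loaded = 11.88 # 12V line under load

vrm_efficiency = power_out_w / power_in_w           # ~0.83, plausible for a hot VRM
droop_pct = (12.0 - rail_12v_loaded) / 12.0 * 100   # ATX spec allows +/-5% on +12V

print(f"VRM efficiency ~{vrm_efficiency:.0%}, 12V droop {droop_pct:.1f}%")
```

A 1% droop under load is well inside spec, so these readings alone don't point at the PSU as the cause of the low benchmark scores.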

What would be a good PSU? Cougar, Aerocool, Cooler Master, Xigmatek, Seasonic, Sentey? Corsair (very overpriced here because of fanboys)?


----------



## Cyber Locc

Quote:


> Originally Posted by *c0V3Ro*
> 
> Oh my...
> 
> 
> 
> 
> 
> 
> 
> 
> I got it new (bargain?) and kept until got time to assemble a game rig. It has very little use.
> It's voltages on BIOS seems pretty solid.
> What do you guys think about Gpu-z log on Heaven4.0 at 100% gpu use?
> 12V line drops to 11.88V;
> Max Vddc(voltage): 1.164V;
> Max Vddc(current-in): 15.7A;
> Max Vddc(current-out): 126.5A;
> Max Vddc(power-in): 178.8W;
> Max Vddc(power-out): 148.3W
> 
> What would be a good PSU? Cougar, Aerocool, Coolermaster, Xigmatek, Seasonic, Sentey? Corsair (
> 
> 
> 
> 
> 
> 
> 
> very over priced here because of fan-boys)?


That 12V drop seems fine, so I think the PSU is okay. The reviews I read said it had the 12V rail running at 11.15 when it was brand new.

Good PSUs are always "overpriced" because they are good. IMO, the one thing you never, ever cheap out on is the PSU; otherwise you are asking for trouble. You have to consider that if the PSU fries, it can take all of your hardware with it, so always spend more and buy a solid, trusted power supply.

The thing is, people run out and buy a 20-dollar power supply as an afterthought. That 20-dollar power supply fries and destroys their GPU and motherboard, and now they are out thousands to save 100 dollars.

That is of course a worst-case scenario, and it doesn't happen often, but it does happen.


----------



## mfknjadagr8

Quote:


> Originally Posted by *c0V3Ro*
> 
> Oh my...
> 
> 
> 
> 
> 
> 
> 
> 
> I got it new (bargain?) and kept until got time to assemble a game rig. It has very little use.
> It's voltages on BIOS seems pretty solid.
> What do you guys think about Gpu-z log on Heaven4.0 at 100% gpu use?
> 12V line drops to 11.88V;
> Max Vddc(voltage): 1.164V;
> Max Vddc(current-in): 15.7A;
> Max Vddc(current-out): 126.5A;
> Max Vddc(power-in): 178.8W;
> Max Vddc(power-out): 148.3W
> 
> What would be a good PSU? Cougar, Aerocool, Coolermaster, Xigmatek, Seasonic, Sentey? Corsair (
> 
> 
> 
> 
> 
> 
> 
> very over priced here because of fan-boys)?


Send shilka a PM; he can help you find a quality PSU for a decent price.


----------



## Arizonian

Quote:


> Originally Posted by *Butthurt Beluga*
> 
> I know I'm just a little late by now, but is it too late to join the club?
> 
> 
> 
> 
> 
> 
> 
> 
> 1)http://www.techpowerup.com/gpuz/details.php?id=w254n
> 2) XFX R9 290X 8GB
> 3) Stock
> 
> Hopefully it's okay to ask here (if not please point me in the right direction if you don't mind) but what are some good aftermarket coolers for the R9 290X?
> Right now I have a stock cooler and while it does it's job, things do get a little toasty and I'd like a little more overclocking headroom.
> Preferably I'm looking for a self-contained unit that doesn't require me to have my own custom loop (because I don't have one yet)


Congrats - added







Welcome to OCN; you came to the right place to ask your question, as you can see.


----------



## Roboyto

Quote:


> Spoiler: Warning: Spoiler!
> 
> 
> 
> Originally Posted by *c0V3Ro*
> 
> Oh my...
> 
> 
> 
> 
> 
> 
> 
> 
> I got it new (bargain?) and kept until got time to assemble a game rig. It has very little use.
> It's voltages on BIOS seems pretty solid.
> What do you guys think about Gpu-z log on Heaven4.0 at 100% gpu use?
> 12V line drops to 11.88V;
> Max Vddc(voltage): 1.164V;
> Max Vddc(current-in): 15.7A;
> Max Vddc(current-out): 126.5A;
> Max Vddc(power-in): 178.8W;
> Max Vddc(power-out): 148.3W
> 
> 
> 
> What would be a good PSU? Cougar, Aerocool, Coolermaster, Xigmatek, Seasonic, Sentey? Corsair (
> 
> 
> 
> 
> 
> 
> 
> very over priced here because of fan-boys)?


Many PSUs are rebrands from a larger manufacturer. Here's all the information/reviews about nearly every PSU you'll ever need: http://www.realhardtechx.com/index_archivos/Page447.htm

That is the direct link to the Corsair PSUs, but you can choose from nearly 100 different brands for a fantastic collection of information on any of them.

Here's a good look at a highly overclocked 290X and 4770k: http://www.overclock.net/t/1441118/290x-psu-power-output-tests/0_20#post_21156921

During normal benchmarking, total system draw was around 450-475W. Furmark jumped up to 600W, but that is obviously a worst-case scenario. If you are running a single card, then a solid 650W PSU should be enough for nearly all scenarios, the exception being a highly overclocked/overvolted FX 8-core CPU, where you would probably want something a little larger.

Antec, Corsair, EVGA, Rosewill, Seasonic... there are lots of choices, and in the 650W range you can spend around $100 and get something very nice and reliable with a good 5-10 year warranty.

For $85 this would be a very solid choice with a 10 year warranty:

http://www.newegg.com/Product/Product.aspx?Item=N82E16817438026&ignorebbr=1&cm_re=650w_psu-_-17-438-026-_-Product
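The sizing logic above can be sketched as a quick load-fraction check. The draw figures are the ones quoted in this post; treating ~80-90% as the comfortable ceiling is just a common rule of thumb, not anything specified here.

```python
# Rough PSU load check using the system-draw figures quoted above.
typical_draw_w = 475   # normal benchmarking, whole system
worst_case_w = 600     # Furmark power-virus load
psu_rating_w = 650     # the recommended unit's continuous rating

typical_load = typical_draw_w / psu_rating_w   # fraction of rated capacity
worst_load = worst_case_w / psu_rating_w       # Furmark is pathological, not gaming

print(f"typical load {typical_load:.0%}, Furmark load {worst_load:.0%}")
```

Roughly 73% typical load leaves sensible headroom; only the Furmark case pushes the unit past 90%, which is why a heavily overvolted FX chip on top of that would justify stepping up a size.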



> Originally Posted by *Cyber Locc*
> 
> That 12V drop seems fine, so I think the PSU is okay. The reviews I read said it had the 12V rail running at 11.15 when it was brand new.
> 
> *Good PSUs are always "overpriced" because they are good. IMO, the one thing you never, ever cheap out on is the PSU; otherwise you are asking for trouble. You have to consider that if the PSU fries, it can take all of your hardware with it, so always spend more and buy a solid, trusted power supply.*
> 
> The thing is, people run out and buy a 20-dollar power supply as an afterthought. That 20-dollar power supply fries and destroys their GPU and motherboard, and now they are out thousands to save 100 dollars.
> 
> That is of course a worst-case scenario, and it doesn't happen often, but it does happen.


This.

Quote:


> Originally Posted by *mfknjadagr8*
> 
> *send shilka a pm he can help you find a quality psu for a decent price*


Beat me to it.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Roboyto*
> 
> This.
> 
> Beat me to it.


Yeah, I like to defer to him on almost all my PSU-related purchases... The last one I whimmed on price; luckily I can avoid the issues he mentions, as I keep it below the temperatures where the problems occur.


----------



## c0V3Ro

Thank you Roboyto, mfknjadagr8, Cyber-Locc








7team was one of the most well-known brands available when I bought it.








I agree the PSU should be the cornerstone.
If Corsair makes the best PSUs, I have no problem buying one. I just don't want to be caught up in go-with-the-flow marketing.








Well, I'll read the review.








Now the desktop image has started to turn off and on from time to time.








The TV doesn't do this when fed by another device like the cable decoder.


----------



## Roboyto

Quote:


> Originally Posted by *c0V3Ro*
> 
> Thank you Roboyto, mfknjadagr8, Cyber-Locc
> 
> 
> 
> 
> 
> 
> 
> 
> 7team was one of the most well known brand available when I bought
> 
> 
> 
> 
> 
> 
> 
> 
> I agree PSU should be the cornerstone.
> If Corsair makes the best PSU I have no problem in buying one. I just don't want to be caught on "go-with-the-flow" marketing.
> 
> 
> 
> 
> 
> 
> 
> 
> Well, I'll read the review.
> 
> 
> 
> 
> 
> 
> 
> 
> Now desktop image started to turn-off and on from time-to-time.
> 
> 
> 
> 
> 
> 
> 
> 
> The TV doesn't do this when input by other device like cable decoder.


You're welcome.

I wouldn't say they make the best PSUs outright, as they have had some hiccups here and there, but generally speaking you won't be disappointed with one of their units. They do have very good customer service, which can be worth the entry premium if something does go wrong.

Do a little reading of reviews from reputable sources and you will probably be OK. If you're still unsure, just ask, and someone will likely be able to tell you whether you're making a mistake.


----------



## Cyber Locc

Quote:


> Originally Posted by *c0V3Ro*
> 
> Thank you Roboyto, mfknjadagr8, Cyber-Locc
> 
> 
> 
> 
> 
> 
> 
> 
> 7team was one of the most well known brand available when I bought
> 
> 
> 
> 
> 
> 
> 
> 
> I agree PSU should be the cornerstone.
> If Corsair makes the best PSU I have no problem in buying one. I just don't want to be caught on "go-with-the-flow" marketing.
> 
> 
> 
> 
> 
> 
> 
> 
> Well, I'll read the review.
> 
> 
> 
> 
> 
> 
> 
> 
> Now desktop image started to turn-off and on from time-to-time.
> 
> 
> 
> 
> 
> 
> 
> 
> The TV doesn't do this when input by other device like cable decoder.


Ya, I honestly never knew 7team made PC PSUs. Their server PSUs are pretty good, but it would seem their desktop lines didn't share the same quality, and that's why the desktop line was short-lived.

Also, just to be clear, I'm not sure the PSU is your problem; I would just make a note to replace it either way.


----------



## mfknjadagr8

Quote:


> Originally Posted by *c0V3Ro*
> 
> Thank you Roboyto, mfknjadagr8, Cyber-Locc
> 
> 
> 
> 
> 
> 
> 
> 
> 7team was one of the most well known brand available when I bought
> 
> 
> 
> 
> 
> 
> 
> 
> I agree PSU should be the cornerstone.
> If Corsair makes the best PSU I have no problem in buying one. I just don't want to be caught on "go-with-the-flow" marketing.
> 
> 
> 
> 
> 
> 
> 
> 
> Well, I'll read the review.
> 
> 
> 
> 
> 
> 
> 
> 
> Now desktop image started to turn-off and on from time-to-time.
> 
> 
> 
> 
> 
> 
> 
> 
> The TV doesn't do this when input by other device like cable decoder.


When you input to a TV, most TVs don't sync very well with varying resolutions, and any time there is a resync (like switching from 2D to 3D clocks) there's a possibility of sync issues and even loss of signal... The problem is most TVs, even if they support PC mode, don't sync properly and you get improper aspect ratios applied.


----------



## c0V3Ro

That's a relief








I intend to buy a proper monitor, at least one that's flicker-free








I put some stress on the CPU and GPU to see how the voltages perform.
Voltages:
3.3V: 3.324 - 3.305
5V: 5.130 - 5.100
12V: 12.240 - 12.240
GPU Vcc: 3.340 - 3.301
Vram GPU: 1.001 - 1.001
GPU +12V: 12.031 - 11.938
GPU VRM: 0.964 - 1.184
Current:
Vram GPU: 1.25 - 24.00
GPU VRM: 7.50 - 81.00
Power:
Vram GPU: 1.25 - 23.75
GPU VRM: 7.22 - 97.38
Ideally I should measure it with a multimeter, but I still need to learn how to do that.
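Before breaking out a multimeter, the software readings above can be sanity-checked against the ATX spec, which allows roughly ±5% deviation on the main rails. A minimal sketch, using the min/max values logged above (nominal rail values assumed):

```python
# Quick ATX rail sanity check: the ATX spec allows roughly +/-5%
# deviation on the main rails. Readings are the min/max values
# reported by the monitoring software above.
def within_spec(nominal, measured, tol=0.05):
    """True if a measured voltage is within `tol` of the nominal rail voltage."""
    return abs(measured - nominal) <= nominal * tol

readings = {
    3.3: (3.305, 3.324),    # 3.3V rail (min, max)
    5.0: (5.100, 5.130),    # 5V rail
    12.0: (12.240, 12.240), # 12V rail
}

for nominal, (lo, hi) in readings.items():
    ok = within_spec(nominal, lo) and within_spec(nominal, hi)
    print(f"{nominal}V rail: {'OK' if ok else 'OUT OF SPEC'}")
```

By this check all three rails above are comfortably in tolerance, so the numbers alone don't point at the PSU.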
 
I think I'll just go for an 850W PSU.

For about US$205 + shipping:
- EVGA 220-G2-0850-XR;
- Coolermaster RS-850-AFBA-G1.
For ~US$235 :
- Corsair CP-9020056-WW.


----------



## cephelix

Quote:


> Originally Posted by *c0V3Ro*
> 
> That's a relief
> 
> 
> 
> 
> 
> 
> 
> 
> I intend to buy a proper monitor, with at least flicker-free
> 
> 
> 
> 
> 
> 
> 
> 
> Put some stress on cpu and gpu to see how voltages perform.
> Voltages:
> 3.3V: 3.324 - 3.305
> 5V: 5.130 - 5.100
> 12V: 12.240 - 12.240
> GPU Vcc: 3.340 - 3.301
> Vram GPU: 1.001 - 1.001
> GPU +12V: 12.031 - 11.938
> GPU VRM: 0.964 - 1.184
> Current:
> Vram GPU: 1.25 - 24.00
> GPU VRM: 7.50 - 81.00
> Power:
> Vram GPU: 1.25 - 23.75
> GPU VRM: 7.22 - 97.38
> The ideal should be measure it with a multimeter but I still need to learn how to do it.
> 
> I think i'll go for a 850W PSU already.
> 
> For about US$205 + shipment:
> - EVGA 220-G2-0850-XR;
> - Coolermaster RS-850-AFBA-G1.
> For ~US$235 :
> - Corsair CP-9020056-WW.


Have you considered the Seasonic X-series or Super Flower Leadex? I have a Seasonic X-750 that's about six years old and still running strong.


----------



## Mega Man

Or any of the spin-offs:

EVGA G2, T2, P2, GS

XFX has some; the Cooler Master V series and several others are based off the Seasonic X series


----------



## eucalyptux

Hello everyone !

I would like to know if there is any info or solution for the black-screen-at-boot bug in the Crimson driver.
The problem occurs if you overclock the card with AB, for example with a voltage offset: the voltage is not applied at boot, the card crashes into a black screen, and the driver needs to be reinstalled via DDU in safe mode.
I saw a few posts about this problem but no solution other than going back to 15.11.1.

Thanks in advance, and excuse my English


----------



## Pete2

Quote:


> Originally Posted by *eucalyptux*
> 
> Hello everyone !
> 
> I would like to know if there is any info/solution about the blackscreen at boot bug in the Crimson driver
> The probleme occure if you overclock the card with AB for exemple with a voltage offset, the voltage is not applied at boot and the card crash into a black screen and the driver need to be reinstalled via DDU in safe mode.
> I saw a few post about this probleme but no solution unless going back to 15.11.1.
> 
> Thanks in advance and excuse my english


I've been looking for a solution too but I haven't found one yet.


----------



## mus1mus

The latest Crimson drivers no longer have these issues.

Clocks and voltages are now retained after a bad reboot or dirty shutdown.


----------



## eucalyptux

Is this in 16.1.1?
I didn't see that in the changelog; I'll try it tomorrow, thx


----------



## mus1mus

They never did mention such stuff.


----------



## i2CY

Crossfire Card installs....

I have plugged one card in, and it worked, but I found it to be glitchy; I always found the cards and software to run better after DDU and a driver reinstall every time they are removed from my mobo... the cards have airflow problems.

Also, I found DDU with Crimson gave me black screens on a safe-mode boot. The direct result left me twitching horizontally on the floor, weeping because I could not get out of it, and I had to clean install thrice...

Has anyone ever had a GPU lock up while benching... at stock settings?

To try to solve the issue I re-flashed my cards' (290) BIOS to the latest ones on the Sapphire site. I haven't tried FS yet, but it happened 3 or 4 times before the re-flash, even with the GPU-Z test screen.

This was with 15.10; I scaled back to 15.7 as well. My card(s) hate me for some reason.

Unrelated to the issue: those Arctic Accelero Xtreme III VGA Cooler side decals look hideous.


----------



## c0V3Ro

Thanks cephelix








I'll look at Seasonic too. Probably an 850W, so it'll give me some room for overclocking









Well, let's join the club









1) http://www.techpowerup.com/gpuz/8mnku/
2) Powercolor R9 290x 4GB OC (Visiontek BIOS)
3) Water (Aquacomputer Kryographics Hawaii)
 

What would be the advantages of flashing a Visiontek BIOS onto a Powercolor board? Unlocked voltage? More detailed reporting in GPU-Z?








I bought this card with this BIOS already flashed.


----------



## rdr09

Quote:


> Originally Posted by *c0V3Ro*
> 
> Thanks cephelix
> 
> 
> 
> 
> 
> 
> 
> 
> I'll look for Seasonic too. Probably a 850W so it'll give some room to overclocking
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well, let's joining the club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1)http://www.techpowerup.com/gpuz/8mnku/
> 2) Powercolor R9 290x 4GB OC (Visiontek BIOS)
> 3) Water (Aquacomputer Kryographics Hawaii)
> 
> 
> What would be the advantages in flashing Visiontek BIOS on a Powercolor board? Unlock voltage? More detailed report on GPU-z?
> 
> 
> 
> 
> 
> 
> 
> 
> Bought this card already with this BIOS flashed.


Even a good 750W will work for just one GPU; save the rest for a CPU upgrade. My system with two 290s only spiked to 689W in Firestrike when I was testing. Get a good CPU cooler too.


----------



## Roboyto

Quote:
Originally Posted by *c0V3Ro* 

Thanks cephelix








I'll look for Seasonic too. Probably a 850W so it'll give some room to overclocking









What would be the advantages in flashing Visiontek BIOS on a Powercolor board? Unlock voltage? More detailed report on GPU-z?








Bought this card already with this BIOS flashed.








Quote:


> Originally Posted by *rdr09*
> 
> Even a good 750W will work for just one gpu. save the rest for a cpu upgrade. my system with 2 290s only spiked to 689W in Firestrike when i was testing. Get a good cpu cooler too.


850W is borderline even for a crossfire setup of 290(X)s, depending on what kind of cooling, overclocks, and CPU you're running. I've been on a 650W Rosewill Capstone for over 2 years and it has been perfect. My 4770k runs at 4.5, benches up to 4.8, and my 290 runs at 1200/1500 and will bench up to 1300/1700; it's also powering 16GB of 2400MHz RAM, a water pump, 6 fans, 3 SSDs and 1 HDD.

If you plan on adding another card then a larger PSU will help you in the long run; otherwise 650-750W is plenty for an overclocked single CPU/GPU setup.
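The sizing logic here can be sketched as a rough power budget. The wattage figures below are ballpark assumptions on my part (a Hawaii card around 290W, a desktop CPU around 120W, the rest of the system around 80W), not measurements, and the 80% margin is just a common rule of thumb for headroom:

```python
# Rough PSU sizing sketch. Component wattages are ballpark assumptions,
# not measured figures; `margin` keeps the estimated draw below ~80% of
# the PSU's rating for headroom and efficiency.
def psu_headroom(psu_watts, gpus=1, gpu_watts=290, cpu_watts=120,
                 rest_watts=80, margin=0.8):
    """Spare watts left when sizing the PSU to `margin` of its rating."""
    draw = gpus * gpu_watts + cpu_watts + rest_watts
    return psu_watts * margin - draw

print(psu_headroom(650))           # one card on 650W: positive headroom
print(psu_headroom(850, gpus=2))   # crossfire on 850W: negative, borderline
print(psu_headroom(1000, gpus=2))  # crossfire on 1kW: back in the green
```

Under these assumptions a 650W unit clears one card, 850W comes up short for two, and a 1kW unit has room to spare, which matches the recommendations in this thread.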

A different voltage profile of some sort, I would guess. I believe VisionTek had a factory full-cover waterblock card, the CryoVenom IIRC; maybe the previous owner flashed that BIOS to it.


----------



## i2CY

Frack!!!
Forgot to do this in Nov.!!








May I pretty please joinith?

1) *Techpowerup*
*GPU1:*http://www.techpowerup.com/gpuz/9afg9/
*GPU2:*http://www.techpowerup.com/gpuz/555af/
2) *Card Type:* _2 x Sapphire R9 290 4GB GDDR5_
*Part Number:* _21227-00,_
*Bios:* _Sapphire 285PA500.L41_
3) *Cooler:* _Arctic Accelero Xtreme III VGA Cooler (Air)_


----------



## Riktar54

Please add me:

1. GPU-Z Link with OCN name: http://www.techpowerup.com/gpuz/details.php?id=awq8u

2. Manufacturer & Brand - XFX R9 290 DD

3. Cooling - ARCTIC Accelero Hybrid III-140


----------



## eucalyptux

@mus1mus I tried 16.1.1 and nope, exact same problem at boot (for me at least...)
Just to be sure: since it's a hotfix, does that mean it needs to be installed on top of 16.1? I doubt it


----------



## mus1mus

Roll back to 15.7.1 then. If you know DDU, use it first: wipe the drivers, then revert to 15.7.1.


----------



## EdInk007

Quote:


> Originally Posted by *rdr09*
> 
> Even a good 750W will work for just one gpu. save the rest for a cpu upgrade. my system with 2 290s only spiked to 689W in Firestrike when i was testing. Get a good cpu cooler too.


I wouldn't recommend any 750W for a 290X CF setup... I had an AX760i and suffered monitor signal losses; I upgraded to the AX860i and all is well.


----------



## EdInk007

May I join the club?
Quote:


> Originally Posted by *EdInk007*
> 
> Currently experiencing loss of monitor signals when browsing or doing non-GPU intensive tasks
> i7-4790K (no OC)
> 290X CF (liquid cooled - no OC)
> Maximus Ranger
> Kingston Hyper X 16GB
> AX760i
> LG34UM57 monitor using Displayport
> AMD Driver 15.11.1 (had to revert to older driver as Crimson gave me more frequent losses- DDU now my best friend)
> 
> Anyone experiencing this or experienced this?


So much for the OWNERS CLUB, pfft! No one could point out that it may have been a power supply issue.


----------



## Roboyto

Quote:



> Originally Posted by *EdInk007*
> 
> May I join the club?
> So much from the OWNERS CLUB pfft! no one could point out that it may have been a power supply issue.


I didn't see your original post, but if you have ever read any review of a 290(X)/390(X) then it would be fairly obvious that a 760W power supply is typically not enough for two of these cards. I have seen many manufacturers suggest 750W for one card.

Best bet for trouble free 290(X)/390(X) crossfire is a 1kW PSU. That would allow solid overclocking of your CPU and both GPUs.

I did just notice that your cards are liquid cooled and you are not running an OC on the CPU/GPUs. If that is the case, then that Platinum 760W PSU could potentially suffice. *I say this only because of the quality of those particular PSUs; they can be pushed fairly far past their rated output and still maintain functionality and fairly good efficiency.*

I ran a 4790k @ 4.7GHz and a 290 @ 1075/1375 off of a 450W PSU for quite a while without any issues.

I think your problem was the PSU, but not due to lack of power; rather, the unit was faulty, since your issue did not occur during intensive loads.

Also, your tone is not necessary. It is not a good way to get anyone to answer your questions now or later, especially when your post count is 8. This is a fairly popular thread and it moves quickly at times; posts get buried or missed because not every member capable of answering questions or assisting with troubleshooting is watching it all day, every day.

Now if you want to join the club, there is a format for doing so. Go to the OP (original post) to see the criteria, make a post with that information, and you will be added.


----------



## kizwan

Last time I checked, the 80 Plus certification process doesn't test the PSU thoroughly. I would not choose a PSU based on its 80 Plus rating; I'd rather check the jonnyGURU site, for example, to know whether a PSU is OK or not.
Quote:


> Originally Posted by *eucalyptux*
> 
> @mus1mus tried 16.1.1 and nop, exact same probleme at boot (for me at least..)
> Just to be sure, being an hotfix, does that mean that it need to be installed on top of 16.1 ? I doupt it


Hotfix for AMD means clean install.


----------



## eucalyptux

Quote:


> Originally Posted by *kizwan*
> 
> Hotfix for AMD means clean install.


yep, im back to 16.1.1 but staying at stock clock ...
not ideal for playing with custom bios


----------



## kizwan

Quote:


> Originally Posted by *eucalyptux*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Hotfix for AMD means clean install.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> yep, im back to 16.1.1 but staying at stock clock ...
> not ideal for playing with custom bios

I use the 15.7.1 drivers when playing with custom BIOSes.


----------



## eucalyptux

Quote:


> Originally Posted by *kizwan*
> 
> I use 15.7.1 drivers when playing with custom biosessssss.


Thanks for the tip. Is this driver preferable over 15.11.1?
Also, is there any file I can grab from 16.1 to put into the 15.7.1 install?


----------



## kizwan

Quote:


> Originally Posted by *eucalyptux*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I use 15.7.1 drivers when playing with custom biosessssss.
> 
> 
> 
> Thanks for the tip, is this driver preferable over 15.11.1 ?
> also, is there any file that i can grab from 16.1 to put in the 15.7.1 one ?

Any driver before Crimson will do. No need to copy any files, though.


----------



## eucalyptux

So this is definitely an issue with Crimson; how is it not better documented?
Weird


----------



## kizwan

Quote:


> Originally Posted by *eucalyptux*
> 
> So there this is definitly an issu from Crimson, how is this not more documented ?
> weird


It is well documented here, but it's already buried deep in the thread. I think I was the first person to report the problem with the Crimson driver in this thread.


----------



## cephelix

@c0V3Ro as the others have said, if you're planning to run only one card, a quality 650W PSU will do fine. For crossfire, I'd go with 1000W so you won't run into any issues.
But as @kizwan said, an 80 Plus certification isn't a measure of PSU quality; I've read this but cannot recall where. jonnyGURU is the place to go to read up on PSUs. Alternatively, you could look up shilka on the forum; he's the resident PSU guy.


----------



## Arizonian

Quote:


> Originally Posted by *c0V3Ro*
> 
> Thanks cephelix
> 
> 
> 
> 
> 
> 
> 
> 
> I'll look for Seasonic too. Probably a 850W so it'll give some room to overclocking
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Well, let's joining the club
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1)http://www.techpowerup.com/gpuz/8mnku/
> 2) Powercolor R9 290x 4GB OC (Visiontek BIOS)
> 3) Water (Aquacomputer Kryographics Hawaii)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> What would be the advantages in flashing Visiontek BIOS on a Powercolor board? Unlock voltage? More detailed report on GPU-z?
> 
> 
> 
> 
> 
> 
> 
> 
> Bought this card already with this BIOS flashed.


Congrats added















Quote:


> Originally Posted by *i2CY*
> 
> Frack!!!
> Forgot to do this in Nov.!!
> 
> 
> 
> 
> 
> 
> 
> 
> May I pretty please joinith?
> 
> 1) *Techpowerup*
> *GPU1:*http://www.techpowerup.com/gpuz/9afg9/
> *GPU2:*http://www.techpowerup.com/gpuz/555af/
> 2) *Card Type:* _2 x Sapphire R9 290 4GB GDDR5_
> *Part Number:* _21227-00,_
> *Bios:* _Sapphire 285PA500.L41_
> 3) *Cooler:* _Arctic Accelero Xtreme III VGA Cooler (Air)_
> 
> 


Congrats added








Quote:


> Originally Posted by *Riktar54*
> 
> Please ad me:
> 
> 1. GPU-Z Link with OCN name: http://www.techpowerup.com/gpuz/details.php?id=awq8u
> 
> 2. Manufacturer & Brand - XFX R9 290 DD
> 
> 3. Cooling - ARCTIC Accelero Hybrid III-140


Congrats added









628 members and 899 GPUs have come through the club.


----------



## rdr09

Quote:


> Originally Posted by *EdInk007*
> 
> I won't recommend any 750W for a 290X CF setup.... I had an AX760i and suffered monitor signal losses, upgraded to the AX860i and all is well.


I wouldn't either; I recommended the 750W for one Hawaii card. I have an 850W now and I can't OC my 290s like I used to when I had a 1300W, like 1290 core at +200mV.

Welcome to OCN and club.


----------



## Timer5

So I am looking to move to a new BIOS. I had an Elpida-based R9 290X that was not too friendly to overclocking; I was able to sell it to a buddy of mine who has no idea what overclocking is but wanted to replace his aging 5850.

This card, by contrast, is VERY friendly to overclocking: I've been able to get it up to 1200MHz core and 1550MHz memory needing only +150mV, and I want to push it farther. I have a Kraken G10 attached to an H90, I made sure to get the GELID VRM coolers so my VRMs are frosty, and for the memory I picked up two packs of the Enzotech copper memory coolers. So far my temps have been VERY good.

I want to see how much more I can squeeze out of this card, and I'm wondering if any of you have a good BIOS for Hynix memory. I tried one of the BIOSes I used on my Elpida card that was supposed to also support Hynix, but I would BSOD as soon as I tried to install the drivers, so it looks like I need a new BIOS. Any suggestions? I'm looking to use this card for gaming until the launch of the R9 4XX series (I didn't see the Fury X as a big enough jump, so I'm waiting a generation).

Thank you all in advance









P.S I have a 750W PSU so power is not too much of a problem


----------



## Roboyto

Quote:


> Originally Posted by *Timer5*
> 
> So I am looking to move to a new BIOS. I had an elpida based R9 290x that was not too friendly to overclocking well I was able to sell it off to a buddy of mine who has no idea what overclocking is but wanted to replace his aging 5850 card. So far this card is VERY friendly to overclocking I have been able to get it up to 1200Mhz Core and 1550Mhz on the Memory and only needing +150mv. I want to push it farther and make it better. I have a Kraken G10 attached to a H90 I made sure to get the GELID Vrm coolers so my VRMS are frosty as for the memory I went out and picked up 2 packs of the Enzotech copper Memory coolers. So far my temps have been VERY good and I want to see how much more I can squeeze out of this card and wondering if any of you guys had any good BIOSes for Hynex memory. I tried to use one of the BIOSes I used on my Elpidia card that was supposed to support Hynex also but I would BSOD as soon as I tried to install the drivers so it looks like I need a new BIOS. Anyone here have a suggestion I am looking to use this card for gaming until the Launch of the R9 4XX series (I didn't see the Fury X as big enough so I am waiting a Gen)
> 
> Thank you all in advance
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P.S I have a 750W PSU so power is not too much of a problem


I don't know if another BIOS is going to help you too much but, there are some interesting things going on with VRAM timings and Straps in the Hawaii BIOS editing thread so you may have better luck asking there: http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/0_20

I don't have any experience with BIOS altering on these cards, but I do have plenty of experience OCing and altering cooling means on them.

I was one of the first to test out that Gelid VRM kit and while it does work pretty well for moderate clocks and voltages, I'm wondering what you're considering 'frosty' for VRM1 temps with +150mV running through them?

I squeezed every bit of capability out of the Kraken/Gelid combination with a fan/thermal pad upgrade, and VRM1 would be in the low 50s after a couple hours of gaming at stock settings. Before I changed the pads and fan it was in the low 70s at stock settings. I could be mistaken, but with +150mV I don't think your temperatures would be that great unless you have some extravagant airflow over them. 80-90C is about as high as you want to run them on a regular basis; beyond that you are asking for trouble.

As far as your current overclock is concerned +150mV for 1200/1550 is fair. Are you running +50% on the power slider as well?

Since you only have another +50mV to go I don't think there is a whole lot of room left for your GPU to stretch its proverbial legs. Additionally, ~1200MHz core is when the Hawaii GPUs begin to see diminishing returns for performance gains.

My 290 sits rock steady at 1200/1500 +100mV +50% power and it will run upwards of 1300/1700 if pushed to the brink. However, after doing lots of benching and some simple calculations I figured I was getting a very large majority of my performance without maxing the card out.

These benchmark numbers are old, but they should serve their purpose for trying to make a point:

*3DMark11 Extreme*

980/1250 Stock Settings/Power - X5175 Score (tess off)

1200/1500 +100mV + 50% Power - X6238 Score (tess off)

1255/1675 +125mV + 50% Power - X6444 Score (tess off)

*Initial Gain: 20.5%*

*Pushing it Gain: 3.3%*

*3DMark11 Performance*

1200/1500 +100mV + 50% Power - 17006 Score (tess off)

1295/1700 +200mV + 50% Power - 17793 Score (tess off)

*Abuse Gain: 4.6%*

*Unigine Valley*

980/1250 Stock Settings/Power - 2408 Score

1200/1500 +100mV + 50% Power - 2928 Score

1260/1675 +137mV + 50% Power - 3081 Score

*Initial Gain: 21.6%*

*Pushing it Gain: 5.2%*

*Unigine Heaven*

1200/1500 +100mV + 50% Power - 1999 Score

1255/1675 +137mV + 50% Power - 2079 Score

*Pushing it Gain: 4%*

*FFXIV Benchmark*

980/1250 Stock Settings/Power - 13795 Score

1255/1675 +125mV + 50% Power - 17129 Score

1315/1675 +200mV + 50% Power - 17511 Score

*Initial Gain: 24.2%*

*Abuse Gain: 2.2%*
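The gain percentages above are just relative score deltas; a quick way to reproduce them from any pair of scores:

```python
# Percentage gain between two benchmark scores, as quoted above:
# (new - old) / old * 100, rounded to one decimal place.
def oc_gain(base_score, oc_score):
    """Percent improvement of oc_score over base_score."""
    return round((oc_score - base_score) / base_score * 100, 1)

print(oc_gain(5175, 6238))  # 3DMark11 Extreme, stock -> 1200/1500: 20.5
print(oc_gain(6238, 6444))  # 1200/1500 -> 1255/1675: 3.3
print(oc_gain(2408, 2928))  # Unigine Valley, stock -> 1200/1500: 21.6
```

Running it against each pair of scores listed reproduces every gain figure in the post, which is the whole point: the first step of the overclock buys ~20%, the last push buys only a few percent.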

While there is nothing wrong with pushing your hardware...'cuz this is OCN and that's the name of the game...it does have its effects over time. The point being that finding a happy medium of settings means longer life, less (chance of) degradation, cooler temps, and quieter operation.

Another thought is your CPU settings and RAM. If your rigbuilder is correct you have a FX-8350 with a Hyper 212 cooler and 1600MHz RAM.

I know the 212 is a decent cooler, but the FX are toasty little buggers especially with some volts/OC. If you're not running an OC there, or a small one, an AIO and more MHz for the CPU could help.

Also, many modern games are taking advantage of system RAM speeds much more than they used to. 1600Mhz used to be OK, but around 2133 is a much better place to be.


----------



## Maggots

Hello, I have a Gigabyte 290 OC. It's not brand new but second-hand. When I installed it in my PC, I tested it with 3DMark. I'm surprised by the fan noise; it's very loud (compared to my old HD 7870). And the temperature is high too. Is it normal for the idle temperature to be 45-47C and load about 83C? I'm not playing any games yet.


----------



## Vellinious

Quote:


> Originally Posted by *Maggots*
> 
> Hello, I have gigabyte 290 oc . It's not brand new but second hand, when I installed it in my pc, I tested it with 3d mark. I'm surprise with the fan sound, it's very loud (compare to my old HD 7870). And the temperature is high to. Is it normal if idle temperature is 45-47 C, and load about 83 C? I'm not playing with any game yet.


What's your ambient temp? That seems kinda high.....


----------



## Maggots

Quote:


> Originally Posted by *Vellinious*
> 
> What's your ambient temp? That seems kinda high.....


I don't have a tool to measure my room temperature. But 83C at load is with the side panel open; with the side panel installed, the load temperature rises to 87C


----------



## i2CY

What is your OC settings?
Stock Cooler?


----------



## Roboyto

Quote:


> Originally Posted by *Maggots*
> 
> Hello, I have gigabyte 290 oc . It's not brand new but second hand, when I installed it in my pc, I tested it with 3d mark. I'm surprise with the fan sound, it's very loud (compare to my old HD 7870). And the temperature is high to. Is it normal if idle temperature is 45-47 C, and load about 83 C? I'm not playing with any game yet.


That is quite high for idle, and the load is a little high as well, especially if the card is running at stock clocks. Are you overclocked with additional voltage?

If you're comfortable with pulling the cooler off the card I would check to see how the thermal paste looks and try some fresh paste on there and see what happens.

Do you have a picture of the card? Is it a reference or does it have the WindForce cooler on it?

If it's the WindForce cooler, I believe they are normally pretty quiet in general; the fans might be going out if the card has seen a lot of use, with the fans forced to higher RPMs to keep the temperatures down.


----------



## Cyber Locc

WOW, edited, this was the wrong thread hehehe.

While I'm here though, a friend asked me how much used 290s are going for at the moment, so I figured I'd let everyone here know: they're going for $250+ right now on eBay etc., which is way, way higher than they have ever been (which is stupid), so if you were planning to sell, now is the time.


----------



## new boy

How can I recover from an unstable overclock that Afterburner is set to apply at startup?

I've tried opening msconfig in safe mode, but can't see it..

Typing on phone now, which is further adding to the rage lol


----------



## eucalyptux

Boot into safe mode, delete your card's profile in AB (\Program Files\MSI Afterburner\Profiles\), and use DDU to wipe the driver and reinstall it.
Also, if you use the Crimson driver (15.12 ~ 16.1.1), this may be the cause; it has been reported that the driver doesn't remember or apply the OC voltage at startup.
Roll back to 15.7.1 or 15.11.1 if that's the case


----------



## spyshagg

Quote:


> Originally Posted by *new boy*
> 
> How can I recover from an unstable overclock set via after burner to start at startup?
> 
> I've tried opening misconfig in safe mode, but can't see it..
> 
> Typing on phone now, which is further adding to the rage lol


15.11.1 is the place to be for us who are living on the thin red line of stability lol


----------



## Vellinious

Quote:


> Originally Posted by *spyshagg*
> 
> 15.11.1 is the place to be for us who are living on the thin red line of stability lol


Is that the best one for benchmarking?


----------



## Maggots

Quote:


> Originally Posted by *i2CY*
> 
> What is your OC settings?
> Stock Cooler?


I'm not OCing the card. Using the WindForce cooler
Quote:


> Originally Posted by *Roboyto*
> 
> That is quite high for an idle, and the load is a little high as well especially if the card is running at stock clocks? Are you overclocked with additional voltage?
> 
> If you're comfortable with pulling the cooler off the card I would check to see how the thermal paste looks and try some fresh paste on there and see what happens.
> 
> Do you have a picture of the card? Is it a reference or does it have the WindForce cooler on it?
> 
> If it's the windforce cooler I believe they are normally pretty quiet in general, they fans might be going out if the card has seen a lot of use with the fans forced to higher RPMS to keep the temperatures down.


I'm running at stock clocks, not overclocking at all, using the WindForce cooler. If I pull off the cooler and change the paste, won't that void the warranty?

If I try to RMA it, will Gigabyte take it if I tell them the fan is very loud and the card runs very hot? It still has one year of warranty time remaining.


----------



## i2CY

Quote:


> Originally Posted by *Maggots*
> 
> I'm not OC the card. Using the windforce cooler
> I'm running at stock clocks, not overclocking it at all. Using the windforce cooler. If I'm pull off the cooler, and change the paste will it not void the warranty?
> 
> If I'm trying to RMA it, will gigabyte take it if I'm told them the fan is very loud and the card run very hot? It's still got 1 year remaining warranty time.


How is the rest of the cooling in your case?
How many intake fans versus exhaust?
Before I installed my aftermarket air coolers I was idling around 55C, and I was hitting thermal limits in BF4 and shutting down my rig.
Mind you, I never researched my case at the time; now that I've read up on it, there are cases with better airflow, but the build bug bit me, and I have what I have for now.

If you pull the cooler off and put it back together carefully, they probably won't know. That cooler setup throws the heat all over the case, so you need good airflow to get rid of it.

The R9 290s run hot, and are designed to run hot, but at idle... something is amiss.


----------



## Maggots

Quote:


> Originally Posted by *i2CY*
> 
> how is the rest of your cooling in your case?
> How many intake fans over exghust?
> before i installed my after market air coolers i was at idle around 55*C, was hitting thermo limits with BF4 and shutting down my rig,
> mind you I never researched my case at the time, and now that i had read up on it there is better airflow threw a spong; but build bug bit me, and i have what i have for now.
> 
> if you pull it off and put it back together carfully, they probably won't know. That setup throw the heat all over the case so you need good air flow to get rid of the heat.
> 
> but the R9 290s run hot, and are designed to run hot, but at idle... somethings is a miss.


The case I use is a Storm Trooper with 2 intake fans in the front, 1 exhaust fan at the top, and a Corsair H75 as exhaust too. As for intake, well, I have 6 hard drives, so the intake airflow isn't that good. But opening the side panel doesn't change much; the load temp is only 3C lower compared to the side panel closed.


----------



## Roboyto

Quote:


> Originally Posted by *Maggots*
> 
> The case that I use is storm trooper with 2 intake fan in the front, and 1 exhaust fan at the top, and corsair h75 as exhaust too. The intake, well, I have 6 hard drive so the intake airflow not that good. But when I'm open the side panel it's not give any change, only load temp is 3 C lower compare to the side panel closed.


You could definitely benefit from some more airflow through the case. I would add at least the extra fan for exhaust at the top. Maybe even consider upgrading all the fans.

All those HDDs are killing the airflow from the front of the case. If you can figure some way to consolidate or move things around to get at least one of the front intake fans unobstructed it should help a bit.

Airflow aside, I would pop the cooler off and check whatever paste is on there currently. This should not have any effect on the warranty of the card. However, I don't know if Gigabyte will honor a warranty if you're not the original purchaser.


----------



## mfknjadagr8

Ok guys so im having an issue with my second card...if i run with just the top card everything works great so i know its something to do with the second card...when gaming earlier with both cards connected and crossfire turned on i got a blue screen... citing the amd driver as the cause... ive gotten that before on crimson and i was monitoring temperatures and voltages on screen and seeing nothing over 60c and nothing over 1.22v on core... anyhow...i assumed it was crimson at first...though after it happened again last week and a few more times this week i was getting an issue where after the blue screen windows would just blue screen loop over and over citing service failure or something to that effect...

So I did my normal ritual of unplugging the second card while powered down and restarting in safe mode: clean drivers with DDU, then restart, reinstall, restart, shutdown, reconnect the second card, and test that everything works. CrossFire worked and everything. Well, this happened again this week, so I assumed it was Crimson and rolled back to the 15.10 drivers to eliminate the driver itself as the problem, and everything was smooth for a few days.

Tonight while gaming, blue screen again. Normal temps, normal voltages. But this time I thought I'd just leave the second card connected and try to install the drivers that way. After restarting, it did the blue screen loop again, so I said fine: safe mode, DDU, then reinstalled 15.7 just to ensure once again it wasn't drivers, even though I knew it wasn't by this point. Only now, any time the second card is powered via the PSU, I get the blue screen loop after installing the drivers fresh. The error says service failure again and has an ATI file cited below.

My question is: can BIOS corruption due to a bad overclock cause blue screens when loading a driver to a card? If so, would a reflash possibly fix this? I don't have time for a few days to tear the PC down, pull the block and check things out, and if I didn't have to and there was another option, that would be great. The cards were performing well when the blue screens happened, and fairly cool. Are there any other options I'm overlooking as a possible cause? I'm stumped, and I don't know anything about the BIOSes on GPUs. Another thing to note is that there is no crash dump from any of these crashes, and WhoCrashed finds nothing as well (pretty sure it just checks the crash dump and the Windows error services).


----------



## new boy

Quote:


> Originally Posted by *eucalyptux*
> 
> Boot into safe mode, delete your card's profile in AB (\Program Files\MSI Afterburner\Profiles\) and use DDU to wipe the driver and reinstall it.
> Also, if you use the Crimson driver (15.12 ~ 16.1.1), this may be the cause; it has been reported that the driver doesn't remember or apply the "OC" voltage at startup.
> Roll back to 15.7.1 or 15.11.1 if that's the case.


Thank you, deleting the profile in safe mode worked a treat.

I'm actually using the newest Catalyst driver, as Crimson was causing me issues, and I've just moved to water so I wanted to dial my overclock in on something stable.

Thanks again.


----------



## spyshagg

Quote:


> Originally Posted by *Vellinious*
> 
> Is that the best one for benchmarking?


Use 15.11.1 to find your absolute most stable overclock, then switch to the latest Crimson and enjoy your games. But to be honest, there's only a real difference in the most recent games (Tomb Raider, Fallout).


----------



## gupsterg

@fyzzz

Wanted some of your input, using stock ROM = 1100/1475 & 1110/1460 (info in this post).

Due to erratic/intermittent "blackscreen" I went to mod stage 1, removed the VDDC & VDDCI offset = -6.25mV (info in this post).

Then did mod stage 2; this was to improve the fan profile a little to keep GPU temp at ~<75C, plus I upped the "PowerLimit".


Spoiler: ROM Mod stage 2 info



PowerLimit mod

TDP 220 > 238
MPDL 220 > 238
TDC 211 > 229

Fan Profile mod

TMP2 80C > 75C
TMP3 95C > 90C
PWM2 47% > 50%
PWM3 100% > 95%
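For anyone wanting to preview a curve like this before flashing: the profile is just (temperature, PWM) points, with the controller ramping between them. A minimal sketch assuming a linear ramp between the two modded points above (the linear behavior is my assumption, not a dump of the ROM's actual fan logic):

```python
# Sketch: approximate fan PWM between the modded curve points
# (TMP2=75C -> PWM2=50%, TMP3=90C -> PWM3=95%). A linear ramp
# between points is assumed for illustration.

CURVE = [(75.0, 50.0), (90.0, 95.0)]  # (temp C, PWM %)

def fan_pwm(temp_c):
    """Return PWM % for a GPU temperature, clamping outside the curve."""
    (t_lo, p_lo), (t_hi, p_hi) = CURVE
    if temp_c <= t_lo:
        return p_lo
    if temp_c >= t_hi:
        return p_hi
    frac = (temp_c - t_lo) / (t_hi - t_lo)
    return p_lo + frac * (p_hi - p_lo)

print(fan_pwm(75))    # 50.0
print(fan_pwm(82.5))  # midway between the points: 72.5
print(fan_pwm(95))    # clamped to 95.0
```

Handy for eyeballing how much extra fan speed the 80C>75C move actually buys at typical load temps.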



1125/1475 tested in games, etc and 7hrs [email protected]


Spoiler: [email protected] screenie







Now I'm at 1150/1475 (3dMark FS result), attached is HML file logging 2 runs of full 3dMark FS.

Tri-X290OC1150_14753dmark.zip 13k .zip file


Here's some VRM temp, etc info for 3rd run full 3dMark FS.


Spoiler: HWiNFO data for 3dMark FS







Comparing it to other cards I've owned / own:-

Sapphire Tri-X 290 STD = 1100 / 1475 ~+25mV
Asus DCUII 290X STD = 1070 / 1340 ~+50mV (IIRC)

Wasn't that knowledgeable about the OC aspect / voltages, etc, at the time of those cards, or about ROM modding. But I was by the time I got the Vapor-X 290X STD (1100/1575 @ 1.30V VID, 1.000V VDDCI).

I'm phenomenally happy with this newly acquired Sapphire Tri-X 290 OC edition (PCI-E Con. 8+6 & VRM 5+1+1); amazed I'm still using EVV 1.25V VID and 1.000V VDDCI to get these clocks (all cards stock air cooling). Gonna test further what I get before modding RAM timings and adding 390 MC timings.

How is MAX GPU VRM Current Out looking (181A)? What figures have you seen when OC'ing your card? (This is a ref PCB, with the AMD stamp by the PCI-E fingers.)


----------



## fyzzz

Seems like a great card. What Hynix memory is it? Normal AFR? My card's ASIC is very near yours actually, 78.3%, and it does 1120MHz on stock voltage (1.25V).


----------



## gupsterg

Cheers mate.

Screenie of info as out of box; mine is 0.3% less leaky.



Marked in red lines in the i2cdump is the -6.25mV VDDC & VDDCI offset programmed to the IR3567B; note the 0.993V DPM 0 VID in the AIDA64 registers dump and, due to that offset, 0.984V in GPU-Z as VDDC @ idle/DPM 0.
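As a quick sanity check of those numbers: the idle VDDC should be roughly the DPM 0 VID plus the programmed offset, with the small remaining gap to GPU-Z's 0.984V reading plausibly down to readout resolution and load-line effects (that last part is my assumption):

```python
# Sanity check: DPM0 VID plus the programmed negative offset should
# land near the VDDC that GPU-Z reports at idle. Values from the post.
vid_dpm0 = 0.993     # V, from the AIDA64 register dump
offset = -0.00625    # V, the -6.25 mV programmed into the IR3567B

expected = vid_dpm0 + offset
print(round(expected, 5))  # 0.98675 V, close to the 0.984 V GPU-Z shows
```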

Unless I upgrade cooling, there's no chance of benching at your clocks (if it can do them).

Any info on your GPU VRM Current out values you can share buddy?


----------



## fyzzz

I can go and run Firestrike plus the demo and see what values I get. The BIOS that I use every day has the clocks set to 1170/1625 and 1325 in the DPM 7 state.


----------



## gupsterg

Cheers, I'd appreciate the info. Yours being watercooled, VRM temps must be low?


----------



## fyzzz

Quote:


> Originally Posted by *gupsterg*
> 
> Cheers, I'd appreciate the info. Yours being watercooled, VRM temps must be low?


Yes, they usually stay pretty low; I have an AMD fan from a stock cooler on the back of the card that helps too. I don't have much rad space, only two 30mm-thick 240mm radiators, but the temperatures are decent.
I ran Firestrike plus the demo and some BF4 at ultra, and these are the values I got:


Spoiler: Warning: Spoiler!


----------



## gupsterg

Cheers. What "PowerLimit" values are you using in the ROM?


----------



## fyzzz

Quote:


> Originally Posted by *gupsterg*
> 
> Cheers. What "PowerLimit" values are you using in the ROM?


Stock: TDP 208 / Power limit 208 / TDC limit 200, and the power limit maxed out in iTurbo.


----------



## spyshagg

Shameless copy-paste, but it's a call to arms.

jump on this will ya!!!!!

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/250_50#post_24886638

lets do it come on


----------



## i2CY

Quote:


> Originally Posted by *spyshagg*
> 
> shameless copy paste but it is a calling to arms
> 
> jump on this will ya!!!!!
> 
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/250_50#post_24886638
> 
> lets do it come on


i am in,
should be able to whop a few 980s


----------



## Vellinious

Quote:


> Originally Posted by *spyshagg*
> 
> shameless copy paste but it is a calling to arms
> 
> jump on this will ya!!!!!
> 
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/250_50#post_24886638
> 
> lets do it come on


I know I probably asked this before....are they allowing tess tweaks for the bench runs?


----------



## spyshagg

Quote:


> Originally Posted by *Vellinious*
> 
> I know I probably asked this before....are they allowing tess tweaks for the bench runs?


YES !!

I may contribute up to 45000 points this weekend if all goes well. Let's do this, guys!


----------



## mus1mus

The more guys joining the competition, the better. The green guys seem to flood that thread; we need to show we're still in the game.


----------



## gupsterg

@mus1mus

Just getting my new Tri-X 290 tweaked to add the max I can get.

Do you have any info to share concerning the max A I can pull out of the main VRM1, being on air at ~75C?


----------



## Vellinious

Quote:


> Originally Posted by *spyshagg*
> 
> YES !!
> 
> I may contribute with up to 45000 points this weekend if all goes well. Lets do this guys


My replacement PSU comes in today from EVGA, finally....I'll put some numbers up tomorrow.


----------



## Roboyto

Quote:


> Originally Posted by *spyshagg*
> 
> shameless copy paste but it is a calling to arms
> 
> jump on this will ya!!!!!
> 
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/250_50#post_24886638
> 
> lets do it come on





> Originally Posted by *Vellinious*
> 
> I know I probably asked this before....are they allowing tess tweaks for the bench runs





> Originally Posted by *mus1mus*
> 
> The more guys joining the competition, the better. Green guys seem to flood that thread we need to voice out we are still in the game.





> Originally Posted by *gupsterg*
> 
> @mus1mus
> 
> Just getting my new Tri-X 290 tweaked to add the max I can get.
> 
> You have any info to share concerning max A I can pull out of main VRM1 being on air ~75C temp?


Yes, tessellation tweaks are allowed!

We're getting walloped in dual GPU scenario, otherwise it would be a very close race.

VRM1 can run at very high temperatures for short periods, 90C+. This is not good long term, but they are rated for quite high temperatures. I had a GTX 970 hit 132C, measured with an IR thermometer, before it shut itself down.


----------



## gupsterg

Quote:


> Originally Posted by *Roboyto*
> 
> VRM1 can run at very high temperatures for short periods; 90+. This is not good for the long term, but they are rated for quite high temperatures. I had a GTX 970 hit 132C, measured with IR thermometer, before it shut itself down.


Cheers! +rep. Temps I'm sorta aware of, but I need more info on A.

Plan is only for testing / max bench for the compo.

Otherwise quite happy with 1140/1495 @ 1.25 VID 1.000V VDDCI at present for 24/7 use.

Real shame the Vapor-X 290X doesn't OC as well; that has 12 phases in total and runs very cool on air (~60C VRM load, ~30C idle).

*** edit ***

Does this data from the relevant datasheets look correct?

Dunno if those figures from the datasheets take switching losses into account?
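One rough way to frame the 181A question: the card's main VRM was described earlier as 5 phases for VDDC, so evenly shared that's about 36A per phase, to be compared against the per-phase datasheet rating derated for ~75C operation. A back-of-envelope sketch (even current sharing and the specific per-phase rating/derating are assumptions for illustration, not datasheet values, and switching losses will move the numbers):

```python
# Back-of-envelope: per-phase current on the main (VDDC) VRM.
# 181 A is the logged MAX GPU VRM Current Out; 5 phases per the
# "VRM 5+1+1" description. Even sharing across phases is assumed.
total_current_a = 181.0
phases = 5

per_phase = total_current_a / phases
print(f"{per_phase:.1f} A per phase")  # 36.2 A per phase

# Compare against a hypothetical per-phase rating, derated for heat.
# 40 A continuous and a 25% derating at ~75C are illustrative numbers,
# not taken from any specific datasheet.
rated_a = 40.0
derated = rated_a * 0.75
print(f"headroom at ~75C: {derated - per_phase:.1f} A")
```

With those made-up ratings the headroom comes out negative, which is exactly why the derated figure for the actual power stages matters before pushing further on air.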


----------



## Maggots

Quote:


> Originally Posted by *Roboyto*
> 
> You could definitely benefit from some more airflow through the case. I would add at least the extra fan for exhaust at the top. Maybe even consider upgrading all the fans.
> 
> All those HDDs are killing the airflow from the front of the case. If you can figure some way to consolidate or move things around to get at least one of the front intake fans unobstructed it should help a bit.
> 
> Airflow aside, I would pop the cooler off and check whatever paste is on there currently. This should not have any effect on the warranty of the card. However, I don't know if Gigabyte will honor a warranty if you're not the original purchaser.


Thanks for the advice. But I want to ask: apart from my case, should the WindForce cooler's performance on the 290 series be good and not loud?


----------



## Roboyto

Quote:


> Originally Posted by *Maggots*
> 
> Thanks for the advice. But I want to ask: apart from my case, should the WindForce cooler's performance on the 290 series be good and not loud?


It should perform as well as most other 290's from what I can tell from reviews:

http://www.tomshardware.com/reviews/radeon-r9-290-and-290x,3728-6.html

http://www.guru3d.com/articles_pages/gigabyte_radeon_r9_290x_windforce_3x_oc_review,11.html

It doesn't look like it should be running that loud or hot.

Did you get a chance to check the thermal paste?


----------



## Paul17041993

I've noticed the EK 290X R2 block doesn't come with the 1mm pads it requires for the VRMs, meaning you have to stack the 0.5mm ones; quite silly really.
Still, temps are much better than my makeshift hybrid cooling.


----------



## Mega Man

They usually do; I am surprised, TBH.


----------



## mixolyd

Hey guys, I have an MSI R9 290 on Win 10 x64. I've noticed occasionally, while doing 2D tasks (don't really game much these days), that the screen will flicker for a second. It doesn't flicker to black; there is still some color. Sometimes it doesn't happen for days, but today it's happened 3 times while I'm doing work and it's getting annoying. Anyone have any tips for what to do? I tried both stable and beta Radeon drivers. The card is not OC'd or anything. I checked the MSI update tool and have the latest firmware. Maybe I should RMA? I've only had the card since May (it was sent as an RMA replacement for a 7850 that died on me).


----------



## kizwan

Quote:


> Originally Posted by *mixolyd*
> 
> Hey guys, I have an MSI R9 290 on Win 10 x64. i've noticed occasionally while doing 2D tasks (don't really game much these days) that the screen will flicker for a second. It doesn't flicker to black, there is still some color. Sometimes it doesn't happen for days but today it's happened 3 times while I'm doing work and it's getting annoying. Anyone have any tips for what to do? I tried both stable and beta Radeon drivers. The card is not OC'd or anything. I checked MSI update tool and have latest firmware. Maybe I should RMA? I've only had the card since May (it was sent as an RMA replacement for a 7850 that died on me)


One of my 290s used to do this, but in my case it's because I use a modded 390 ROM, and this ROM sets much lower voltages for 2D than the stock ROM. After I fixed this, it hasn't happened again. It's a long shot maybe, but try increasing the voltage a little bit; +25mV should be more than enough.


----------



## Caveat

Is it worth changing 2 Sapphire R9 290X OC cards for an R9 390X or an R9 Fury X? Does it get much better or not?


----------



## mus1mus

Quote:


> Originally Posted by *Caveat*
> 
> Is it worth to change 2 Sapphire R9 290X OC to an R9 390x? Does it get much better or not much?


For free? Why not?

Almost every 390X comes with Hynix Memory. Better performance and high memory clocks give them some edge over the 2XX cards.

Ohh, wait. 2 against 1? No way. A 390X has only a very small advantage over a 290X.


----------



## Caveat

Quote:


> Originally Posted by *mus1mus*
> 
> For free? Why not?
> 
> Almost every 390X comes with Hynix Memory. Better performance and high memory clocks give them some edge over the 2XX cards.
> 
> Ohh, wait. 2 against 1? No way. 390X has a very small advantage over a 290X.


And a Fury X?


----------



## rdr09

Quote:


> Originally Posted by *Caveat*
> 
> And an FuryX?


No high-end single gpu atm will match the raw power of crossfire 290X.


----------



## Caveat

Quote:


> Originally Posted by *rdr09*
> 
> No high-end single gpu atm will match the raw power of crossfire 290X.


Ok, thank you mus1mus and rdr09. I'll stay with my 2 R9 290X's ^^


----------



## mus1mus

True ^

Though, if you don't play crossfire-supported games, face power supply limitations, or are cooling- and noise-conscious, the Fury X is a good choice.

Performance difference is not huge though.


----------



## Roboyto

Quote:


> Originally Posted by *Caveat*
> 
> Is it worth to change 2 Sapphire R9 290X OC to an R9 390x or N R9 FuryX? Does it get much better or not much?


@mus1mus and @rdr09 are absolutely correct.

I was pondering the Fury(X) situation myself the other day since my main rig drives 5760x1080 on a single 290, so I looked at the most recent reviews TechPowerUp had to offer and made a little spreadsheet.

I went to TPU and scoped out the results for 4K since I run 3x1080 Eyefinity; results at my lower resolution would obviously pan out better.

The review was from Aug 7, 2015 and TPU said:


R9 390X reference performance was tested by running the MSI R9 390X Gaming at reference design clocks (1050/1500)
R9 Fury reference performance was tested by running the ASUS R9 Fury Strix @ reference clocks 1000/500.

A 390X @ 1050/1500 will trail my 290 @ 1200/1500 (gaming clocks) by a bit, but it is still a decent comparison from a few months ago *(August 7th, 2015)*.

The Fury they tested in this instance was the Sapphire Tri-X OC which runs @ 1040/500.

I took frame rates from their 22 game benchmark suite and averaged the gains to get the following:


The gains from the MSI 390X to the Sapphire Fury were: 16.98%
The gains from the MSI 390X to Fury X were: 24.51%
They were also able to squeeze another 4.5% (BF3 specifically) out of the Fury by hitting 1110/500.
If you add the 4.5% to either of those margins you have 21.48% and 29.01% respectively.
https://www.techpowerup.com/reviews/Sapphire/R9_Fury_Tri-X_OC/1.html



| GAME @ 4K | 390X | FURY | FURY X | GAINS FURY | GAINS FURY X |
|---|---|---|---|---|---|
| ALIEN ISOLATION | 52.5 | 58.5 | 62.2 | 111.43% | 118.48% |
| AC UNITY | 21.8 | 29.3 | 31.6 | 134.40% | 144.95% |
| BATMAN AK | 74.7 | 95 | 98.5 | 127.18% | 131.86% |
| BF 3 | 42.3 | 49.1 | 52.8 | 116.08% | 124.82% |
| BF 4 | 29.5 | 32.8 | 34.3 | 111.19% | 116.27% |
| BIOSHOCK INF | 50.5 | 60.5 | 63.2 | 119.80% | 125.15% |
| COD AW | 58.9 | 64.1 | 67.9 | 108.83% | 115.28% |
| CIV BEYOND EARTH | 52.5 | 64.5 | 71.5 | 122.86% | 136.19% |
| CRYSIS 3 | 23 | 26.2 | 28.2 | 113.91% | 122.61% |
| DEAD RISING 3 | 24.4 | 27.9 | 30.8 | 114.34% | 126.23% |
| DRAGON AGE INQ | 28.7 | 32.7 | 33.7 | 113.94% | 117.42% |
| FAR CRY 4 | 31.8 | 36.9 | 39.6 | 116.04% | 124.53% |
| GTA V | 26 | 30.4 | 32.2 | 116.92% | 123.85% |
| METRO LL | 30.2 | 37.2 | 39.9 | 123.18% | 132.12% |
| PROJECT CARS | 39.7 | 43.9 | 45.9 | 110.58% | 115.62% |
| RYSE | 34.3 | 39.4 | 41.5 | 114.87% | 120.99% |
| SHADOW OF MORDOR | 44.1 | 49.9 | 52.7 | 113.15% | 119.50% |
| WITCHER 3 | 25.5 | 30.6 | 32.8 | 120.00% | 128.63% |
| TOMB RAIDER | 45.3 | 52.6 | 58.7 | 116.11% | 129.58% |
| WATCH DOGS | 32.7 | 37.5 | 39.6 | 114.68% | 121.10% |
| WOLFENSTEIN | 28.7 | 35.8 | 37.4 | 124.74% | 130.31% |
| WOW: WOD | 71.3 | 78 | 81.1 | 109.40% | 113.74% |
| **Average gains** | | | | 116.98% | 124.51% |
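The percentage columns are just per-game frame-rate ratios against the 390X, averaged across the suite. A minimal sketch of that calculation on a few of the rows (only a subset of the 22 games, so the average printed here will differ from the full-suite 16.98% / 24.51% figures):

```python
# Reproduce the per-game "gains" ratios from a few rows of the data
# above, then average them. Subset only, for illustration.
fps = {
    # game: (390X, Fury, Fury X) average fps at 4K
    "Alien Isolation": (52.5, 58.5, 62.2),
    "Battlefield 4":   (29.5, 32.8, 34.3),
    "GTA V":           (26.0, 30.4, 32.2),
    "The Witcher 3":   (25.5, 30.6, 32.8),
}

def gain(base, other):
    """Frame rate as a percentage of the 390X result."""
    return other / base * 100

fury_gains = [gain(r390x, fury) for r390x, fury, _ in fps.values()]
print(round(gain(52.5, 58.5), 2))  # 111.43, matches the Alien Isolation row
print(round(sum(fury_gains) / len(fury_gains), 2))  # subset average
```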

I reckon most Fury(X) cards could hit ~1110 on the core, and that did not include any HBM overclocking, which does, AFAIK, give a little boost as well.

For those of us with exceptionally well clocking Hawaii cards, either Fury would probably be a 'sidegrade'.


----------



## JourneymanMike

Quote:


> Originally Posted by *Caveat*
> 
> Is it worth to change 2 Sapphire R9 290X OC to an R9 390x or N R9 FuryX? Does it get much better or not much?


EN OH
Quote:


> Originally Posted by *rdr09*
> 
> No high-end single gpu atm will match the raw power of crossfire 290X.


^^^^^^^^^^^^^^^^^^^^^This
Quote:


> Originally Posted by *Caveat*
> 
> Ok thank you Mus1mus and rdr09. Ill stay with my 2 R9 290x's ^^


Good Decision!








Quote:


> Originally Posted by *mus1mus*
> 
> True ^
> 
> Though, if you don't play crossfire-supported games, face power supply limitations, or are cooling- and noise-conscious, the Fury X is a good choice.
> 
> *Performance difference is not huge though*.


Then, why change?

I'm waiting for something that has a real difference... Who knows, maybe when AMD releases its new processors and improves their GPUs (and drivers!!!!)...

Then I'll switch to AMD and trade my Crossfire 290X's in...


----------



## Caveat

Yeah, that's why I'm staying with my 2 290X's too. If it's not making a huge difference, then I'm not changing. Noise isn't a problem; most of the time in-game I use a headset, so I don't hear them "roar" anyway. I can play GTA V on almost high on my LG 29" ultrawide 2560x1080 screen.


----------



## rdr09

Quote:


> Originally Posted by *Caveat*
> 
> Ye. Thats why i stay with my 2 290x's too. If its not making a huge difference than im not changing. Noise isnt a problem. Most of the time ingame i use a headset. So i dont hear m "roar" anyway. I can play gta v on almost high on my lg 29" ultrawide 2560x1080 screen.


I do envy those with 8GB, or with HBM that manages usage better than GDDR5.


----------



## Caveat

Quote:


> Originally Posted by *rdr09*
> 
> I do envy those with 8GB or HBM that manages usage better than GDDR5.


I have 2 4GB's.


----------



## rdr09

Quote:


> Originally Posted by *Caveat*
> 
> I have 2 4GB's.


Yeah, Hawaiis have 4GB each. They don't add up or stack; it's 4GB usable in CrossFire, tri or quad.


----------



## Caveat

Quote:


> Originally Posted by *rdr09*
> 
> Yah, Hawaiis have 4 each. They don't add up or stack up. it's 4GB usable in crossfire, tri or quad.


Oooh you meant it that way. Sorry haha


----------



## rdr09

Quote:


> Originally Posted by *Caveat*
> 
> Oooh you meant it that way. Sorry haha


I don't think it will be an issue on your ultrawide, not even in some games like Dying Light. 4GB is more than enough.


----------



## Riktar54

I am pondering "upgrading" (note the quotes) from my 290 (nonx) to a Fury Nano.

My reasoning is twofold:

1. Increased performance. Granted, we are talking maybe 10-15 fps, but hey, it is an increase.

2. Power consumption. Now this is something that shows a remarkable difference. I am amazed at what AMD was able to achieve on the TDP. And the reduced load has to make somewhat of a difference to the longevity of the PSU.

Given the recent drop in pricing of the Nano and the market value of the 290, I am thinking the upgrade cost would be significantly less doing this now instead of waiting for the next gen of cards, which would find my 290 worth half of what it is now.

Questions, questions...


----------



## spyshagg

guys

guys

guys

Its time!!!




*AMD vs NVIDIA*

> Run three benchmarks; screenshot the score with GPU-Z, CPU-Z and the custom wallpaper

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/0_50

Put your 290's to WORK!
Lets do this! come on!


----------



## mus1mus

Quote:


> Originally Posted by *Riktar54*
> 
> I am pondering "upgrading" (note the quotes) from my 290 (nonx) to a Fury Nano.
> 
> My reasoning is twofold:
> 
> 1. Increased performance. Granted we are talking maybe 10 - 15 fps but hey, it is an increase.
> 
> 2. Power consumption. Now this is something that shows a remarkable difference. I am amazed at what AMD was able to achieve on the TDP. And the difference has to make somewhat of a difference on the longevity of the PS since it's load will be reduced.
> 
> Given the recent drop on pricing of the Nano and the market value of the 290 I am thinking the upgrade cost would significantly less doing this now. instead of waiting for the next gen of cards that would find my 290 being worth 1/2 of what it is now.
> 
> Questions, questions,,,,,,,,,,,,,


Those are good reasons you have there. You can sell your 290s now, as they should drop further in time.

I reckon the Nano will be limited in OCing though.

Sidegrade or upgrade, you can still gain a few fps.


----------



## keikei

Quote:


> Originally Posted by *Riktar54*
> 
> I am pondering "upgrading" (note the quotes) from my 290 (nonx) to a Fury Nano.
> 
> My reasoning is twofold:
> 
> 1. Increased performance. Granted we are talking maybe 10 - 15 fps but hey, it is an increase.
> 
> 2. Power consumption. Now this is something that shows a remarkable difference. I am amazed at what AMD was able to achieve on the TDP. And the difference has to make somewhat of a difference on the longevity of the PS since it's load will be reduced.
> 
> Given the recent drop on pricing of the Nano and the market value of the 290 I am thinking the upgrade cost would significantly less doing this now. instead of waiting for the next gen of cards that would find my 290 being worth 1/2 of what it is now.
> 
> Questions, questions,,,,,,,,,,,,,


That is an upgrade. On a side note, I'm surprised your rig info hasn't been created yet. A 10-15 frame increase at what resolution/settings? Selling your current card is a great idea; it'll make getting the Nano even more attractive. It's essentially a Fury, but much more efficient.


----------



## JourneymanMike

Quote:


> Originally Posted by *Caveat*
> 
> Ye. Thats why i stay with my 2 290x's too. If its not making a huge difference than im not changing. Noise isnt a problem. Most of the time ingame i use a headset. So i dont hear m "roar" anyway. I can play gta v on almost high on my lg 29" ultrawide 2560x1080 screen.


I don't worry about the noise!



Not with two, beautiful, Aquacomputer Kryographics water blocks & Active backplates!!!


----------



## mus1mus

Quote:


> Originally Posted by *JourneymanMike*
> 
> I don't worry about the noise!
> 
> 
> 
> Not with two, beautiful, Aquacomputer Kryographics water blocks & Active backplates!!!


Wow! Didn't realize you now have a lovely FX rig! Well done.

On a note, do us a favor and run your system to help the team: 3DMark, 3DM11, 3DM Vantage. All accept tess off.

Give them cards a workout!


----------



## Riktar54

Quote:


> Originally Posted by *keikei*
> 
> That is an upgrade. On a side note, i'm surprised your rig info hasnt been created yet. A 10-15 frame increase on what resolution/setting? Selling your current card is a great idea, itll make getting the Nano even more attractive. Its essentially a Fury, but much more efficient.


Currently running 2560x1080 but am considering going 3 panel 1920x1080's which is why I am looking for a bit more oomph. If I stay with the 2560x1080 I will most likely just keep the 290 since I am currently pegging 75fps with the 75Hz refresh rate on my LG 34UM57-P monitor.

I did create a rig info. Just have to get it worked into my signature.


----------



## keikei

Quote:


> Originally Posted by *Riktar54*
> 
> Currently running 2560x1080 but am considering going 3 panel 1920x1080's which is why I am looking for a bit more oomph. If I stay with the 2560x1080 I will most likely just keep the 290 since I am currently pegging 75fps with the 75Hz refresh rate on my LG 34UM57-P monitor.
> 
> I did create a rig info. Just have to get it worked into my signature.


Oh nice. I attempted Eyefinity a while back, and one of the positives I would highlight is the screen space. As a workstation it's insanely efficient. Gaming is almost another experience altogether as well. One thing I didn't like was that, back then, not many games supported the resolution, and the borders were an eyesore.


----------



## chronicfx

Quote:


> Originally Posted by *Riktar54*
> 
> Currently running 2560x1080 but am considering going 3 panel 1920x1080's which is why I am looking for a bit more oomph. If I stay with the 2560x1080 I will most likely just keep the 290 since I am currently pegging 75fps with the 75Hz refresh rate on my LG 34UM57-P monitor.
> 
> I did create a rig info. Just have to get it worked into my signature.


For reference, I used to run the setup below with three 290Xs and had decent framerates. I did not play the hardest games while I had it, but I finished Dragon Age: Inquisition in Eyefinity, which was really immersive!! Also, Grid Autosport was a lot of fun to play on it. Just remember, if you haven't dealt with driver issues before, expect to be tweaking a game to death for the first three weeks and 25% of the story; you have been warned.









My son playing grid, my iphone cannot even get all three screens













So this is two yamakasi catleaps on the outside with an LG34UM97 in the middle.
(2560x1440)(3440x1440)(2560x1440)


----------



## Riktar54

WOW!!! Nice setup!









My current setup (2560x1080 on a 34") is, gaming-wise, for playing Elite Dangerous. I used to run 2 monitors (1920x1080 as main and 1600x1200 as secondary), which was great for my work.

Changing my main monitor to 2560x1080 for Elite was GREAT from a gameplay perspective but I do miss the extra monitor for work and photo editing.

I wonder if the Fury Nano can handle 3 monitors? All the reviews I read only show single-monitor rates. I wonder if I could guess how it would fare by taking the 1440p rates and the 4K rates and doing some rough guesstimating.
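One crude way to do that guesstimate: at GPU-bound settings, frame rate scales very roughly with the inverse of pixel count, so triple-1080p (~6.2 MP) should land between a card's 1440p (~3.7 MP) and 4K (~8.3 MP) results. A sketch of that interpolation (the inverse-pixel-count scaling is a rough assumption, and the sample frame rates are made up for illustration, not Nano review numbers):

```python
# Guesstimate fps at 5760x1080 from known 2560x1440 and 3840x2160
# results, assuming fps scales roughly linearly with 1/pixel-count
# between the two known points. The sample fps values are hypothetical.
def pixels(w, h):
    return w * h

def estimate_fps(fps_1440p, fps_4k, target_px):
    p1, p2 = pixels(2560, 1440), pixels(3840, 2160)
    # interpolate fps against inverse pixel count
    x1, x2, x = 1 / p1, 1 / p2, 1 / target_px
    return fps_1440p + (fps_4k - fps_1440p) * (x - x1) / (x2 - x1)

triple = pixels(5760, 1080)  # 6,220,800 pixels, between 1440p and 4K
print(round(estimate_fps(60.0, 35.0, triple), 1))  # ~41.7 fps
```

Nothing exact, since CPU limits and memory pressure don't scale with pixels, but it gives a sensible bracket for a triple-screen estimate.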


----------



## diggiddi

Quote:


> Originally Posted by *chronicfx*
> 
> For reference I used to run the setup below with three 290x and had decent framerate. I did not play the hardest games while I had it but finished Dragon age inquisition in eyefinity which was really immersive!! Also Grid Autosport was alot of fun to play on it. Just remember if you haven't dealt with driver issues before and tweaking a game to death for the first three weeks and 25% of the story you have been warned
> 
> 
> 
> 
> 
> 
> 
> 
> 
> My son playing grid, my iphone cannot even get all three screens
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So this is two yamakasi catleaps on the outside with an LG34UM97 in the middle.
> (2560x1440)(3440x1440)(2560x1440)


How is that PSU handling your tri-CF setup? I will be getting one next. Are you on stock clocks? CPU too?


----------



## Cyber Locc

Quote:


> Originally Posted by *diggiddi*
> 
> How is that PSU handling your tri CF setup? I will be getting one next, are you on stock clocks? cpu too?


Barely, lol. No, seriously though: it will handle 3 cards if they are not overclocked that high. If you plan on overclocking high, go with the 1600. Also, the 1300W is very loud; source: I have both.


----------



## mus1mus

Just go straight to the 1600 G2 if you have more than 2 cards. Or split the load over two 1250s.

3-way with a mediocre OC trips OCP on my X-1250.

5930K at 4.75GHz, 1.35V.


----------



## Cyber Locc

Quote:


> Originally Posted by *mus1mus*
> 
> Just go straight 1600G2 if you have more than 2 cards. Or split it on 2 1250s.
> 
> 3 way - mediocre OC- OCP trips my X-1250.
> 
> 5930K at 4.75 - 1.35


^ This, and the 1600W G2 is a thing of beauty. I would, however, advise going with the P2 if you can afford it; the fanless mode is worth it. The G2 1600 is kinda loud, nowhere near as loud as the 1300 G2, but still, fanless mode would be nice.

I am actually putting my G2 in my bench rig and buying a P2 for my main rig for this reason.


----------



## rdr09

Quote:


> Originally Posted by *JourneymanMike*
> 
> I don't worry about the noise!
> 
> 
> 
> Not with two, beautiful, Aquacomputer Kryographics water blocks & Active backplates!!!


Looks very good. Somehow multi-card rigs, especially watercooled ones, look 'cooler' in boards that are ATX and above. Nice job.


----------



## mus1mus

Quote:


> Originally Posted by *Cyber Locc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Just go straight 1600G2 if you have more than 2 cards. Or split it on 2 1250s.
> 
> 3 way - mediocre OC- OCP trips my X-1250.
> 
> 5930K at 4.75 - 1.35
> 
> 
> 
> ^ this and the 1600w G2 is a thing of beauty, I would however advise to go with the P2 if you can afford it the fanless mode is worth it. The G2 1600 is kinda loud, no where near as loud as the 1300G2 but still fanless mode would be nice.
> 
> I am actually putting my g2 in my bench and buying a p2 for my main rig for this reason.

Well, TBH, any rig with 3 GPUs will be kinda loud. LOUD as in PSU standards.

You can't really cool down 3 Hawaiis without a lot of rad space and the number of fans that comes with it. Unless 60C for watercooled GPUs is your thing, and no OC of course.


----------



## Cyber Locc

Quote:


> Originally Posted by *mus1mus*
> 
> Well, TBH, any rig with 3 GPUs will be kinda loud. LOUD as in PSU standards.
> 
> You can't really cool down 3 Hawaiis without a lot of rad space and the number of fans that comes with it. Unless 60C for watercooled GPUs is your thing, and no OC of course.


I agree, but my loop before I tore it down had a bunch of Corsair SP120 QEs at full speed (1350rpm?), Cougar Vortex 140s at 7V, and an MCP35X running at 80%. The G2 1600W is the loudest of all that stuff by quite a bit. Of course it gets louder at full load, but then it doesn't matter as I have headphones on; but even at idle it's louder than all of the above. You can hear it in the case 5 feet away; you can't really hear the other stuff (at least I can, but sound is subjective).

The passive fan mode will only help at low load, but that really is when it is an issue.


----------



## mus1mus

Ohh. That's just terribad then.


----------



## chronicfx

Quote:


> Originally Posted by *diggiddi*
> 
> How is that PSU handling your tri CF setup? I will be getting one next, are you on stock clocks? cpu too?


That setup was using an EVGA 1300W G2 and a 4790K at 4.7GHz and 1.35V. The PSU handled it just fine. Yes, the GPUs were at stock clocks.


----------



## Cyber Locc

Quote:


> Originally Posted by *mus1mus*
> 
> Ohh. That's just terribad then.


Ya, and I mean it could just be mine, something with the fan or whatever. However, I do know that the fan is fairly fast right off the bat no matter what the load, and 140mm fans tend to be pretty loud above 800rpm in my experience (hence my Cougars at half voltage), so the fanless mode is very attractive to me. It is a whole heaping lot quieter than the 1300; that thing sounds like a 290X reference cooler.


----------



## chronicfx

Quote:


> Originally Posted by *Cyber Locc*
> 
> Barely lol, no seriously though. It will handle 3 cards if they are not overclocked that high if you plan on overclocking high go with the 1600. Also the 1300w is very loud, source I have both
> 
> 
> 
> 
> 
> 
> 
> .


The PSU was silent considering it was three 290x reference on air


----------



## Unknownm

What PSU is recommended for overvolting two R9 290s? I own an RM850W with an overclocked 4690K; I had a 750W in the system and it shut down during a benchmark.

Both 290s are overclocked 1140Mhz with +133 voltage and aux @ +50. What's limiting me right now is the vrm cooling but once that's fixed I will push the overclocks more. Should I upgrade or keep the 850?

Sent from my D5803 using Tapatalk


----------



## Cyber Locc

Quote:


> Originally Posted by *chronicfx*
> 
> The PSU was silent considering it was three 290x reference on air


Oh ya, compared to those it is quiet, but by itself it's a pretty loud unit. Like I said, I find it almost as loud as a single reference card's fan. Then again, noise is subjective and the 1600W isn't really loud; I can hear it from a few feet away but not "loudly", so maybe I am just picky lol. Both of them are a great deal louder than my XFX power supplies, though those are 850W and 1000W, so a much different beast I guess. At any rate I would strongly suggest the Eco Mode versions, so you do not need to deal with unneeded noise.

I also don't think the fan is bad per se; I think it just moves a whole lot of air, and it's more the swooshing sound of tons of air being moved than a fan issue, especially seeing how it moves more air than the 1300W and is still much, much quieter.

Also, ya, 1300W should be okay with no overclock on the GPUs, but swap that desktop CPU for an E-series chip and overclock all 3 GPUs and now you can have issues.


----------



## Cyber Locc

Quote:


> Originally Posted by *Unknownm*
> 
> What psu is recommended for over volting two R9 290s? I own a RM850W, with a overclocked 4690k, I had a 750w in the system and it shut down during a benchmark.
> 
> Both 290s are overclocked 1140Mhz with +133 voltage and aux @ +50. What's limiting me right now is the vrm cooling but once that's fixed I will push the overclocks more. Should I upgrade or keep the 850?
> 
> Sent from my D5803 using Tapatalk


Definitely get a bigger PSU. I had my 850W shut down at 1175MHz on 2 cards; I would go with 1000W+.


----------



## mus1mus

Quote:


> Originally Posted by *Cyber Locc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Unknownm*
> 
> What psu is recommended for over volting two R9 290s? I own a RM850W, with a overclocked 4690k, I had a 750w in the system and it shut down during a benchmark.
> 
> Both 290s are overclocked 1140Mhz with +133 voltage and aux @ +50. What's limiting me right now is the vrm cooling but once that's fixed I will push the overclocks more. Should I upgrade or keep the 850?
> 
> Sent from my D5803 using Tapatalk
> 
> 
> 
> Defiantly get a bigger psu, I had my 850w shut down from 1175 on 2 cards, I would go with 1000+ws.
Click to expand...

Minimum of 1000. 1250 may cut it. I can trip it with 2 290Xs during benchmarks.
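A quick back-of-the-envelope sketch of why 1000W+ is the safe call for two overvolted 290s. Every wattage below is an assumption for illustration (stock board power, overvolt scaling factor, CPU draw), not a measurement:

```python
# Rough PSU sizing for two overvolted R9 290s.
# All figures are assumed ballpark values, not measurements.

def system_draw(gpu_count, gpu_stock_w=250, overvolt_factor=1.4,
                cpu_w=150, rest_w=75):
    """Estimate peak DC draw in watts.

    overvolt_factor ~1.4 assumes power climbs steeply with a
    +133 mV offset (power scales roughly with V^2 * frequency).
    """
    return gpu_count * gpu_stock_w * overvolt_factor + cpu_w + rest_w

draw = system_draw(2)        # two 290s plus an overclocked 4690K
psu_target = draw / 0.80     # keep the PSU at or under ~80% load
```

At those assumptions the system lands around 925W peak, which is exactly the territory where an 850W unit trips during benchmarks, and dividing by an 80% load target puts the sensible PSU size past 1100W.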


----------



## Cyber Locc

BTW guys, sorry for not helping in the benchmark rumble. You guys have horrid timing; my rig is torn down and on a bench in a pretty bad state lol. I am using 2 240s to cool my CPU and 1 of the GPUs. It overheats gaming (with my GPU clocks at stock), never mind benching hehehe. Hope we win. I will see how long the contest lasts; maybe I can get my loop back together before it ends and do some benches for it.

Shoot, that ends March 31. I will definitely have my loop back together before then; I thought it would only run for a few days lol. Scratch that above, I will get my benches in for the red team.


----------



## mus1mus

Quote:


> Originally Posted by *Cyber Locc*
> 
> BTW guys sorry for not helping in the benchmark rumble
> 
> 
> 
> 
> 
> 
> 
> 
> you guys have horrid timing, my rig is tore down and in a bench that is pretty bad lol I am using 2 240s to cool my CPU and 1 of GPUs. It overheats gaming lol (with my clocks at stock on the GPU) never mind benching hehehe. Hope we win, I will see how long the contest lasts maybe I can get my loop back together before that and do some benches for it.
> 
> Shoot that ends March 31, I will defiantly have my loop back together before then I thought it would only be for a few days lol. Scratch that above, I will get my benches in for the red team
> 
> 
> 
> 
> 
> 
> 
> .


Sounds good. Maybe we can help you gain more points as well, if you are willing.


----------



## JourneymanMike

Quote:


> Originally Posted by *JourneymanMike*
> 
> I don't worry about the noise!
> 
> 
> 
> Not with two, beautiful, Aquacomputer Kryographics water blocks & Active backplates!!!


Quote:


> Originally Posted by *mus1mus*
> 
> Wow! *Didn't realize you now have a lovely FX rig! Well done.
> 
> *In a note, do us a favor and run your system to help the team. 3D Mark, 3DM11, 3DM Vantage. All accepts tess off.
> 
> Give them cards a workout!.


Quote:


> Originally Posted by *rdr09*
> 
> *Looks very good. Somehow multi-card rigs, especially watercooled, look 'cooler' in boards that are atx and above. Nice job*.


First of all, thank you guys! Now the other news... I no longer have the FX / CVF-Z setup; that's an old rig...

I got worn out waiting for AMD to stop sitting on their hands and release something new, as far as processors go...

But I still have the GPUs w/ blocks. This time on a Maximus VII Formula w/ delidded 4790K!





BTW: The 4790K lets these cards breathe a little more!!


----------



## rdr09

Quote:


> Originally Posted by *JourneymanMike*
> 
> First of all, ThankYou guys! Now the other news... I no longer have the FX / CVF-Z setup, that's an old rig...
> 
> I got worn out waiting for AMD, to stop sitting on their hands, and release something new, as far as processors go...
> 
> But I still have the GPU's w/ blocks. This time on a Maximus VII Formula w/ delidded 4790K!
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> BTW: The 4790K lets these cards breath a little more!!


Wow. The only other member I know going over the top like that is wermad. Really good looking. I know there are others, but I haven't really seen many.

I also use QDCs. They are positioned so I can isolate the GPUs - ready for a single GPU that can match or exceed my Hawaiis.

EDIT: I see your Aquas. Got that backplate cooler. Nice.


----------



## Cyber Locc

Quote:


> Originally Posted by *mus1mus*
> 
> Sounds good. Maybe we can help you gain more points as well.If you are willing.


Of course. It will be a couple of weeks though, as I have a new bench that I am modding to move the PC to, with lots of work still to do on it. I do have a bunch of rads around I can string up lol, just actually getting that done is the hard part hehehe. I will hit you up about it when I get all set up.


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> Wow. Only other member i know going over the top like that is wermad. Really good looking. i know there are others but i haven't really seen many.
> 
> I also use QDCs. They are positioned so i can isolate the gpus - ready for a single gpu to match or excel over my hawaiis.
> 
> EDIT: I see you Aquas. Got that backplate cooler. Nice.


Wermad's newly updated rig is just stunning, indeed.
Quote:


> Originally Posted by *Cyber Locc*
> 
> Of course, it will be a couple weeks though, as I have a new bench that I am modding to move the pc too, lots of work to do on it though. I do have a bunch of rads around I can string up lol just actually getting that done is the hard part hehehe
> 
> 
> 
> 
> 
> 
> 
> I will hit you up about it when I get all setup


Nice. We have an awesome gaming recipe for the taking.


----------



## c0V3Ro

New PSU Coolermaster V850, new cpu fx-8350...









Bought another crappy vga


----------



## JourneymanMike

Quote:


> Originally Posted by *rdr09*
> 
> Wow. Only other member i know going over the top like that is wermad. Really good looking. i know there are others but i haven't really seen many.
> 
> *I also use QDCs. They are positioned so i can isolate the gpus* - ready for a single gpu to match or excel over my hawaiis.
> 
> EDIT: *I see you Aquas. Got that backplate cooler. Nice*.


I like the QDCs; there are four in the build... Having a CaseLabs case with a removable motherboard tray / tech station, I can disconnect from my WC system and take the whole motherboard assembly out... I also have QDCs on each side of the CPU block (Supremacy EVO), so I can change CPUs in a flash!









I liked the Aquas from the first time I studied them, so I bought two with active backplates... I got them directly from Germany (I live in the States); it was cheaper than getting them here, even including shipping!


----------



## rdr09

Quote:


> Originally Posted by *c0V3Ro*
> 
> New PSU Coolermaster V850, new cpu fx-8350...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Bought another crappy vga


Really hard to gain points in that bench and in Valley. 5 above stock ain't bad at all. Here are my stock runs, from single to dual . . .



Check out the scaling from driver to driver.

Quote:


> Originally Posted by *JourneymanMike*
> 
> I like the QDC's, there are four in the build... Having a Caselabs case, with removable Motherboard Tray / Tech Station, I can disconnect from my WC system, and take the whole motherboard assembly out... I also have QDC's on each side of the CPU Block (Supremacy EVO), so I can change CPU's, in a flash!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I liked the aqua's, from the first time I studied them, so I bought two with Active Backplates... I got them directly from Germany (I live in the States), It was cheaper than getting them here, even including shipping!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!


what pump are you using?


----------



## JourneymanMike

Best Fire Strike, with new Intel i7 4790K - http://www.3dmark.com/fs/7575143

Best Fire Strike, with AMD FX-8350 - http://www.3dmark.com/fs/4823373

Fire Strike Extreme w/ 4790K... http://www.3dmark.com/fs/7575180

So far! I must do better, I'm still new to GPU OC'ing...

Aren't there certain sweet spots for the 290X, as far as core clocks and memory clocks go?

I thought I saw a chart on that, a while back...


----------



## JourneymanMike

Quote:


> Originally Posted by *rdr09*
> 
> .
> what pump are you using?


I'm using an Aquacomputer D5 PWM... Of course!


----------



## mus1mus

Quote:


> Originally Posted by *JourneymanMike*
> 
> Best Fire Strike, with new Intel i7 4790K - http://www.3dmark.com/fs/7575143
> 
> Best Fire Strike, with AMD FX-8350 - http://www.3dmark.com/fs/4823373
> 
> Fire Strike Extreme w/ 4790K... http://www.3dmark.com/fs/7575180
> 
> So far! I must do better, I'm still new to GPU OC'ing...
> 
> Aren't there certain sweet spots. for the 290X's, as far as core clocks and memory clocks go?
> 
> I thought I saw a chart on that, a while back...


For the clocks, try to reach 1625 on the memory if you have Elpida chips. If Hynix, go for 1750. It may require a bit of voltage, but since you're on water, it's not gonna be an issue.

Core, go for 1250 if the card allows.

You would be playing within the max VDDCR allowed by the card's BIOS, which usually sits at 1.35ish. So the core may not clock past 1250 unless you go for a BIOS that allows more voltage, 1.4V and over. We can mod your own BIOS for that though.
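For reference, those memory targets translate directly into bandwidth on Hawaii's 512-bit bus. A minimal sketch (the 4-transfers-per-clock GDDR5 figure and the 1250 MHz reference clock are standard spec; the rest is plain arithmetic):

```python
# Memory bandwidth for Hawaii (290/290X): 512-bit bus, GDDR5
# moving 4 bits per pin per memory clock.
def bandwidth_gbs(mem_clock_mhz, bus_bits=512, transfers=4):
    return mem_clock_mhz * 1e6 * transfers * bus_bits / 8 / 1e9

stock  = bandwidth_gbs(1250)  # reference spec: 320 GB/s
elpida = bandwidth_gbs(1625)  # Elpida target: 416 GB/s
hynix  = bandwidth_gbs(1750)  # Hynix target: 448 GB/s
```

So the Hynix target buys roughly 40% more raw bandwidth over stock, which is why the memory overclock matters so much in Fire Strike graphics scores.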


----------



## rdr09

Quote:


> Originally Posted by *Cyber Locc*
> 
> Of course, it will be a couple weeks though, as I have a new bench that I am modding to move the pc too, lots of work to do on it though. I do have a bunch of rads around I can string up lol just actually getting that done is the hard part hehehe
> 
> 
> 
> 
> 
> 
> 
> I will hit you up about it when I get all setup


Share some pics pls.
Quote:


> Originally Posted by *mus1mus*
> 
> Wermad's newly updated rig is just stunning, indeed.


i saw that. tsm106 and Mega Man are both in the same category.

Quote:


> Originally Posted by *JourneymanMike*
> 
> I'm using an Aquacomputer, D5 PWM... Of coarse!


+rep. Thanks.

BTW, I was looking at the numbers the 980 Ti is putting out in the AMD vs NVIDIA challenge, and it looks like the 290 is just 30% slower. I thought it was more than that. Probably 35% at the most, at least in Futuremark benchies.


----------



## cephelix

@rdr09 Only 30%? Colour me surprised. I've been amazed at the longevity of my 290, and at AMD cards in general. Can't wait to see what Polaris brings to the table. If my card were under water, I bet I could push my clocks further


----------



## spyshagg

30% perhaps against the very best 290X in that competition (17xxx graphics score!!). A very good 24/7 score for a 290X would be 15K without tessellation, which would put the difference above 40%, I think?
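Worth noting that "X% slower" flips depending on which card you take as the baseline, which is probably where the 30% and 40% figures diverge. A sketch with illustrative scores (the 21500 for the 980 Ti is an assumed number, not one from the thread):

```python
# Percent gap between two graphics scores, computed both directions.
def pct_slower(fast, slow):
    return (fast - slow) / fast * 100   # deficit relative to the fast card

def pct_faster(fast, slow):
    return (fast - slow) / slow * 100   # lead relative to the slow card

ti_980, daily_290x = 21500, 15000       # assumed graphics scores
# pct_slower -> ~30.2%, pct_faster -> ~43.3%: the very same gap reads
# as "30% slower" or "over 40% faster" depending on the baseline.
```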


----------



## fyzzz

These cards are still very good. http://www.3dmark.com/compare/fs/7252410/fs/6228104


----------



## mus1mus

Simply put, 2 Titan Xs = 3 290s. We're talking top overclocking Titan Xs vs mediocre 290s. I can't wait til I get my waterblocks so I can do a full watercooled tri-fire and give them titans a run for their money.


An easy 1300/1625 on 3-cards can't be too unreachable I reckon.


----------



## rdr09

Quote:


> Originally Posted by *cephelix*
> 
> @rdr09 only 30%. Colour me surprised. I've been amazed at the longevity of my 290. And at amd cards in general. Can't wait to see what polaris brings to the table. If my card was under water, i bet i could push my clocks further


Quote:


> Originally Posted by *spyshagg*
> 
> 30% perhaps against the very best 290X in that competition (17xxx graphics score!!). A very good 24/7 score for a 290x would be 15K without tessellation. Which would put the difference above 40% I think?


Quote:


> Originally Posted by *cephelix*
> 
> @rdr09 only 30%. Colour me surprised. I've been amazed at the longevity of my 290. And at amd cards in general. Can't wait to see what polaris brings to the table. If my card was under water, i bet i could push my clocks further


Maybe I was focusing more on CrossFire/SLI numbers.

Quote:


> Originally Posted by *fyzzz*
> 
> These cards are still very good. http://www.3dmark.com/compare/fs/7252410/fs/6228104


Quote:


> Originally Posted by *cephelix*
> 
> @rdr09 only 30%. Colour me surprised. I've been amazed at the longevity of my 290. And at amd cards in general. Can't wait to see what polaris brings to the table. If my card was under water, i bet i could push my clocks further


lol
Quote:


> Originally Posted by *mus1mus*
> 
> Simply put, 2 Titan Xs = 3 290s. We're talking top overclocking Titan Xs vs mediocre 290s. I can't wait til I get my waterblocks so I can do a full watercooled tri-fire and give them titans a run for their money.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> An easy 1300/1625 on 3-cards can't be too unreachable I reckon.


Those Titan Xs will not have any VRAM issue running GTA V maxed out in 4K. Over 8GB.

Not sure what happened there, lol. I am updating PlanetSide 2.


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> Those Titan Xs will not have any vram issue running GTA V maxed out in 4K. Over 8GBs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not sure what happened there. lol. i am updating Planetside 2.


At least on 3DMark you said.


4K is in another league, yeah, but the 3XX series can, I guess.


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> At least on 3DMark you said.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 4K is on another league yeah,but 3XX can. meguess


Oh yah. I forgot.

Anyways, for those wanting to run a 4K TV . . .

http://www.overclock.net/t/1587329/has-anybody-used-this-adaptor


----------



## spyshagg

Quote:


> Originally Posted by *fyzzz*
> 
> These cards are still very good. http://www.3dmark.com/compare/fs/7252410/fs/6228104


That's insane! How do your VRMs handle so much current without blowing up?


----------



## fyzzz

Quote:


> Originally Posted by *spyshagg*
> 
> thats insane! How do your VRM's hold so much current without blowing up?


When I benchmark at those frequencies and voltages, I usually have the window next to the computer open, so it's getting really cold air. I also have a 120mm fan spinning at 2K RPM blowing air at the back of the card. When it was -5C here, the core temp at idle was 9C and VRM 1 at 2C, with a constant voltage of 1.25V. I didn't check how hot it got under load when I applied more voltage, but I'm sure it wasn't too hot.


----------



## cephelix

For me, I'm still running comfortably under 4GB of VRAM.
Quote:


> Originally Posted by *fyzzz*
> 
> When i benchmark at those frequencies and voltages, i usually have the window open next to the computer, so it's getting really cold air. I also have a 120mm fan spinning at 2k rpm that is blowing air at the back of the card. When it was -5c here, when the card was at idle the core temp was at 9c and vrm 1 at 2c, with a constant voltage of 1.25v. I didn't check how hot it got under load when i applied more voltage however, but i'm sure it wasn't too hot.


*slow clap* that is amazing dude


----------



## spyshagg

Yeah, reminds me of when I was a young bloke doing the exact same thing in winter, some 12 years ago! Very nice

(Now I just want everything stable, with peace and quiet, because FREE TIME FLIES!)


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> When i benchmark at those frequencies and voltages, i usually have the window open next to the computer, so it's getting really cold air. I also have a 120mm fan spinning at 2k rpm that is blowing air at the back of the card. When it was -5c here, when the card was at idle the core temp was at 9c and vrm 1 at 2c, with a constant voltage of 1.25v. I didn't check how hot it got under load when i applied more voltage however, but i'm sure it wasn't too hot.


Max I see during benches is 68C on VRM 1, 49C core, with sub-20C ambient and 1.46V.

For the guys who encounter black screens with high voltages: 2 of my cards do this, while one can go up to 1.52V without black-screening. Is it worth making that one the primary to avoid the issue?


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *fyzzz*
> 
> When i benchmark at those frequencies and voltages, i usually have the window open next to the computer, so it's getting really cold air. I also have a 120mm fan spinning at 2k rpm that is blowing air at the back of the card. When it was -5c here, when the card was at idle the core temp was at 9c and vrm 1 at 2c, with a constant voltage of 1.25v. I didn't check how hot it got under load when i applied more voltage however, but i'm sure it wasn't too hot.
> 
> 
> 
> 
> 
> 
> 
> Max I see during benches is 68C on VRM 1. 49C Core. Sub 20C ambient and 1.46Volts.
> 
> For the guys that encounter blackscreens with high Voltages, 2 of my cards do this while one can go up to 1.52 and not do a blackscreen. *Is it worth making it the primary to avoid the issue?*
Click to expand...


----------



## Cyber Locc

I will, RD.

However it may take a little longer. I had a very very bad day yesterday.

So I was playing Witcher 3 and was having memory leak issues. I'd never had them before and didn't understand what the heck was going on. Anyway, during troubleshooting I saw that Windows was only seeing half of my RAM.

So then more troubleshooting, and none of the DIMMs on the left side were registering (A1, B1, C1, D1, though only A1 and C1 were populated). After I figured out it was just that side, I thought maybe it was just those 2 sticks or those 2 slots. Anyway, I kept testing, which led to less and less working until I was getting error 56 with any stick in any slot.

Going forward that turned into error 65. I pulled the CPU to reseat it and check the pins, etc. All looked fine, but after reinstalling my CPU I am now hit with error 00! That means no CPU detected. So something is seriously wrong; either my CPU is fried or my board is. And of course, as luck would have it, I don't have an extra X79 chip or CPU to test...

I think it's the board, as I did scratch the back a tiny bit a while back. At first I saw no issues at all, but then my CMOS quit working; any time I cut power from the board everything would reset, and now this. Then again, my CPU has been pushing very, very high voltage for a long time, running 24/7 at that voltage (no SpeedStep), so it could have died.


----------



## mus1mus

@kizwan

I'll take that as a yes.

On another note, do you have a 390 MC ROM for Elpida? I wanna try it. I can't figure out how to do it yet.


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> @kizwan
> 
> I'll take that as a yes.
> 
> In another, do you have a 390MC rom for Elpida? I wanna try it. I can figure out how to do it yet.


I can help. Send me a bios if you want help.


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> I can help. Send me a bios if you want help.


NICE buddy.

The ASUSX.

ASUSX.zip 104k .zip file


I'll do the rest like memory timings and VDDCR Limit.


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> NICE buddy.
> 
> The ASUSX.
> 
> ASUSX.zip 104k .zip file
> 
> 
> I'll do the rest like memory timings and VDDCR Limit.


Try this:

ASUSX390MC.zip 103k .zip file


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> Try this:
> 
> ASUSX390MC.zip 103k .zip file


Ugh. Not getting any luck with these 290Xs.

Card doesn't scale with my OC.


----------



## c0V3Ro

Really disappointed with my AMD rig :/




Could it be the MB? It already has the NB watercooled and a fan over the MOSFET heatsink


----------



## mirzet1976

Quote:


> Originally Posted by *c0V3Ro*
> 
> Really disappointed with my AMD rig :/
> 
> 
> 
> 
> Could it be the MB? It already has the NB watercooled and a fan over the mosfets heatsink


Your OC is definitely not stable. See my CPU at 4.7GHz with an R9 280X and the difference in the Physics and Combined scores


----------



## Roboyto

Quote:


> Originally Posted by *c0V3Ro*
> 
> Really disappointed with my AMD rig :/
> 
> 
> 
> 
> Could it be the MB? It already has the NB watercooled and a fan over the mosfets heatsink


Your graphics score is on par for a 290(X) at those clocks. And that graphics score is the important thing to look at in this scenario because an i5 or i7 is going to vastly outperform in benchmarks for the physics score. That larger physics score increases the overall score quite a bit.

Looks like you have 1600MHz RAM? It certainly couldn't hurt to get something in the 2133, or better, range.

Why no overclock on the GPU? 1030/1250 is essentially stock for a 290X. There is definitely more performance to be had even if you have a terrible 290(X). Is it watercooled?

4.6 is pretty good on an 8350, but I know there is hidden performance to be had by doing more than just raising the core clock on those chips, though I have 0 experience with that. A trip over to the FX 8320/8350 owners club would divulge some useful information if you haven't been there already: http://www.overclock.net/t/1318995/official-fx-8320-fx-8350-vishera-owners-club


----------



## Vellinious

How long does it take futuremark to approve a freakin driver?! What's the latest driver that futuremark approved for AMD?

http://www.3dmark.com/fs/7586373


----------



## alancsalt

two or three weeks?


----------



## Vellinious

Quote:


> Originally Posted by *alancsalt*
> 
> two or three weeks?


16.1 released in early January...still not approved. = /


----------



## alancsalt

Worse than ever then.


----------



## c0V3Ro

Thanks mirzet1976.
My suspicion is on the MB, causing CPU throttling.

Thanks Roboyto.
But shouldn't the Fire Strike score for a 290X at stock clocks be around 10K, even with a stock FX-8350?
Yes, it's watercooled, which makes these scores even worse.
The RAM is KHX1600. I already push it to 240 FSB at 1.5V, but that requires lots of NB voltage because the MB can't hold the NB frequency at 2200.

Well, I've already started looking for a new MB, probably an M5A99FX.


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fyzzz*
> 
> Try this:
> 
> ASUSX390MC.zip 103k .zip file
> 
> 
> 
> 
> Ugh. Not getting any luck with these 290Xs.
> 
> Card doesn't scale with my OC.
Click to expand...

Were you also unable to get a better score with the 290X ROM than the 390X ROM?


----------



## diggiddi

Quote:


> Originally Posted by *Roboyto*
> 
> _Your graphics score is on par for a 290(X) at those clocks. And that graphics score is the important thing to look at in this scenario because an i5 or i7 is going to vastly outperform in benchmarks for the physics score. That larger physics score increases the overall score quite a bit._
> 
> Looks like you have 1600MHz RAM? It certainly couldn't hurt to get something in the 2133, or better, range.
> 
> Why no overclock on the GPU? 1030/1250 is essentially stock for a 290X. There is definitely more performance to be had even if you have a terrible 290(X). Is it watercooled?
> 
> 4.6 Is pretty good on an 8350, but I know there is hidden performance to be had by doing more than just raising the core clock on those chips; which I have 0 experience with. A trip over to the FX 8320/8350 owners club would divulge some useful information if you haven't been there already: http://www.overclock.net/t/1318995/official-fx-8320-fx-8350-vishera-owners-club


Those scores are low for a 290X and an 8350 at 4.6GHz; I got 11874 at the same CPU speed, [email protected]/1250
http://www.3dmark.com/3dm/7980519


----------



## diggiddi

Quote:


> Originally Posted by *spyshagg*
> 
> guys
> 
> guys
> 
> guys
> 
> Its time!!!
> 
> 
> 
> 
> *AMD vs NVIDIA*
> 
> > Run three benchmarks, print screen the score with GPU-Z CPU-Z and the custom wallpaper
> 
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/0_50
> 
> Put your 290's to WORK!
> Lets do this! come on!


What tessellation tweaks on Fire Strike get more points?


----------



## Vellinious

Quote:


> Originally Posted by *diggiddi*
> 
> What tesselation tweaks on Firestrike to get more points?


Driver settings, turn tessellation off. Profit.


----------



## diggiddi

+rep. Anything else? Unfortunately I can't overclock using Afterburner; it just crashes my whole system, and Crimson doesn't seem to change any settings even though I maxed the core to +48% and set the mem to 1550.

OK, just tried it; the score actually decreased slightly.


----------



## Vellinious

Quote:


> Originally Posted by *diggiddi*
> 
> +rep, any anything else? unfortunately I cant overclock using Afterburner, it just crashes my whole system and Crimson doesn't seem to change any settings even though I maxed core to +48% and set mem to 1550
> 
> OK just tried it it actually decreased slightly


Try using TriXX


----------



## diggiddi

Quote:


> Originally Posted by *Vellinious*
> 
> Try using Trixxx


Which version


----------



## cephelix

Quote:


> Originally Posted by *diggiddi*
> 
> Which version


The latest version. Should be 5.01 or something.


----------



## diggiddi

Okay i have 5.0.0


----------



## Roboyto

Quote:


> Originally Posted by *c0V3Ro*
> 
> Thanks mirzet1976
> My suspicious is on the MB. Causing CPU throttling.
> 
> Thanks Roboyto
> But Firestrike score for a 290x at stock clocks shouldn't be around 10k? Even with a stock fx8350?
> Yes, it's watercooled. What makes this scores even worse.
> Rams are KHX1600. Already push them for 240FSB with 1.5V. But requires lots of NB voltage because the MB can't fix the NB frequency at 2200.
> 
> Well, already start to look for new MB, probably a m5a99fx.





> Originally Posted by *diggiddi*
> 
> Those scores are low for a 290x and 8350 at 4.6ghz i got 11874 at same cpu speed [email protected]/1250
> http://www.3dmark.com/3dm/7980519


@c0V3Ro what PSU are you running in your system?

You should fill out the rig builder so we can see exactly what you're working with.

Should try resetting CPU to stock to see what score you get, then compare to your OC settings.

It had been some time since I ran FS on my rig and you are absolutely correct @diggiddi

Looks like driver improvements for a 550 point bump compared to a year ago: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/34380_20#post_23458320

Just ran on my machine with R9 290 at XFX Black Edition stock clocks of 980/1250. i7 4770k @ 4.5 w/ 16GB DDR3 @ 2400

http://www.3dmark.com/3dm/10862665?


----------



## mus1mus

Quote:


> Originally Posted by *c0V3Ro*
> 
> Thanks mirzet1976
> My suspicious is on the MB. Causing CPU throttling.
> 
> Thanks Roboyto
> But Firestrike score for a 290x at stock clocks shouldn't be around 10k? Even with a stock fx8350?
> Yes, it's watercooled. What makes this scores even worse.
> Rams are KHX1600. Already push them for 240FSB with 1.5V. But requires lots of NB voltage because the MB can't fix the NB frequency at 2200.
> 
> Well, already start to look for new MB, probably a m5a99fx.


Before you buy a new mobo, try to do these:

Fix your CPU overclock. 1.27V may be low for a 4.6GHz core, hence the very low Physics score, which could be attributed to voltage throttling.

If you haven't done so yet, update Windows and look for the W7 hotfix file on the 83XX thread -- it helps even on X99.

Set your power limit in MSI AB, TriXX, or HIS iTurbo even when not overclocking -- it helps.

I have nothing negative to say about the M5A99X, but your money is better spent on the Kitty or at least a Gigabyte UD5.


----------



## Roboyto

Quote:


> Originally Posted by *mus1mus*
> 
> If you havent done yet, update Windows and look for the hotfix file for W7 on the 83XX Thread. -- It helps even on X99.
> 
> Set your Power Limit in MSI AB, Trixx or HIS iTurbo even when not Overclocking --it helps.


@c0V3Ro

Windows Hotfix for Bulldozer

https://support.microsoft.com/en-us/kb/2645594


----------



## Vellinious

Quote:


> Originally Posted by *Roboyto*
> 
> @c0V3Ro
> 
> Looks like driver improvements for a 550 point bump compared to a year ago: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/34380_20#post_23458320


Wow...his 970 numbers SUCK.

Just sayin


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> Wow...his 970 numbers SUCK.
> 
> Just sayin


Well, compared to what you used to have, YES.

On the same note, his 290X numbers suck too. Compared to what you and fyzzz can post now.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Well, compared to what you used to have, YES.
> 
> On the same note, his 290X numbers suck too. Compared to what yoi and fyzz can post now.


True... it just shocked me how low the 970 numbers were. Just about everyone I help with a BIOS and an overclock can hit a minimum 14K graphics score in Fire Strike... Quite a few guys out there are closing in on my scores.


----------



## Roboyto

Quote:


> Originally Posted by *Vellinious*
> 
> Wow...his 970 numbers SUCK.
> 
> Just sayin


Those numbers are from a year ago. It was obviously not the best-performing 970, judging by its clock speeds of 1500/1800, but it was a pure reference board. A very good 970 would hit somewhere around 1600/2000, I reckon? Regardless, a 290/390 still trades blows with a 970.

What are good scores for a 970?

I don't follow their bench scores that closely, but you're the first person to say that they are that bad in the year that comparison has been around. I followed my FS link from a year ago for the 970 and went to comparisons for similar hardware. The best recorded score for standard FS with a 4770k and a GTX 970 is 11999, from Nov 2015, with clock speeds of...drum roll please: 1596/2051. http://www.3dmark.com/fs/6488217

Or there is a more recent submission tying that score, Jan 2016, at 1353/2090: http://www.3dmark.com/fs/7126804

That ridiculously high reported GPU clock speed for the first one, I would presume, is due to a BIOS modification of some sort forcing the card to max boost all the time rather than boosting once under load like the second.

That is 968 higher than my sucky score from a year ago.

If you look at purely graphics scores that guy has a sizable lead 14262 compared to 12891; a difference of 1371. Or comparing the second submission of 14330 to 12891; a difference of 1439. If AMD could squeeze 559 points from driver improvements, I'm sure Nvidia could achieve the same/similar feat. If we subtract the 559 from the 1371 we are left with 812 graphics points difference. And then 559 from 1439 to land at 880. Let's take an average of 846 between the two. I'm no Nvidia GPU expert, but the extra clock speeds between those kickass 970s, and that feeble reference I had (with a ghetto-rigged cooling solution), would most likely account for an 846 point delta in the GPU score.

You are using a 5820K, which skews benchmarks where CPU/Physics are factored in. Must compare apples-to-apples; your 6 cores give you an advantage.
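If anyone wants to sanity-check that arithmetic, a few lines of Python (numbers copied straight from the post) reproduce it:

```python
# Graphics-score deltas from the post above
my_gfx = 12891                # the old 970 graphics score from a year ago
best_scores = [14262, 14330]  # the two comparison 970 submissions
driver_gain = 559             # driver-improvement estimate subtracted out

deltas = [s - my_gfx for s in best_scores]    # raw differences
adjusted = [d - driver_gain for d in deltas]  # minus assumed driver gains
average = sum(adjusted) / len(adjusted)

print(deltas)     # [1371, 1439]
print(adjusted)   # [812, 880]
print(average)    # 846.0
```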


----------



## Roboyto

Quote:
> Originally Posted by *mus1mus*
> 
> Well, compared to what you used to have, YES.
> 
> On the same note, his 290X numbers suck too. Compared to what you and fyzz can post now.


Quote:

> Originally Posted by *Vellinious*
> 
> True....just shocked me how low the 970 numbers were. Just about everyone I help with a bios and the overclock can hit a minimum of 14k graphics score in Firestrike.... Quite a few guys out there closing in on my scores.


Yes, this is the difference between running a stock card and BIOS modding/tweaking.

My card is a 290.


----------



## mus1mus

Good 290/Xs hang with 980s now, thanks to AMD's driver improvements. Can't say the same for nVidia drivers. In fact, I'm seeing more driver interventions now than I saw last year. But I only compare drivers with 980 Tis.

@Vellinious, did you notice that as well?


----------



## c0V3Ro

@ Roboyto
Just upgraded from a ST-750Z-AF to a Coolermaster V850.
Rig is updated.
If I set no overclock, results are even worse.

@ mus1mus
Set vcore to 1.41 in the BIOS. Disabled almost all energy saving options. HPC on, APM off.
Will try using AB power limit set to maximum.
I'm using win10 x64. Hotfix is for win7.

Another Gigabyte user (990FX chipset) said these AMD mobos are sensitive to VRM temperature and throttle automatically even with APM disabled in the BIOS.
The results with the RMAed 280x DB were also kind of low. The shop tested it and said the reported problem was confirmed.
http://www.3dmark.com/3dmv/5386394
290X with the same CPU:
http://www.3dmark.com/3dmv/5409703
That's why I'm suspicious about the MB.
I intend to buy an M5A99FX Pro because the Sabertooth 990FX is kind of overkill for the overclocking potential of my watercooled rig.

Thanks all for helping.


----------



## mfknjadagr8

Quote:


> Originally Posted by *c0V3Ro*
> 
> @ Roboyto
> Just upgraded from a ST-750Z-AF to a Coolermaster V850.
> Rig is updated.
> If i set no overclock results are even worse.
> 
> @ mus1mus
> Set vcore to 1.41 in the BIOS. Disabled almost all energy saving options. HPC on, APM off.
> Will try using AB power limit set to maximum.
> I'm using win10 x64. Hotfix is for win7.
> 
> Another gigabyte user (990fx chipset) said their AMD mobos are sensitive about VRM temperature and throttles automatic even with APM disable in BIOS.
> The results with RMAed 280x DB were also kind of low. The shop tested and said problem reported was confirmed.
> http://www.3dmark.com/3dmv/5386394
> 290x with same CPU.
> http://www.3dmark.com/3dmv/5409703
> That's why i'm suspicious about MB.
> I intend buying M5A99FX Pro because Sabertooth 990FX is kind of over kill for overclocking potential of my watercooled rig.
> 
> Thanks all for helping.


You don't want to buy another board that might limit you in the same way, though... you might do well with that board, but the Saber is much more trusted if you're going to be pushing things hard.


----------



## Dunan

Anyone know the resale value on a 290X Lightning LE?

I'm thinking of selling and going with something that I can run a straight 60fps for 1920x1200 res regardless of settings. It'll most likely be the last card I buy as I'm getting old and don't plan on running at higher res - maybe 1440, only because I would like a bigger monitor sooner or later.

The card is not that old, maybe a year and I don't OC my gfx cards. Runs around 52-55c under load.


----------



## NicksTricks007

Quote:


> Originally Posted by *Dunan*
> 
> Anyone know the resale value on a 290X Lightning LE?
> 
> I'm thinking of selling and going with something that I can run a straight 60fps for 1920x1200 res regardless of settings. It'll most likely be the last card I buy as I'm getting old and don't plan on running at higher res - maybe 1440, only because I would like a bigger monitor sooner or later.
> 
> The card is not that old, maybe a year and I don't OC my gfx cards. Runs around 52-55c under load.


I'd say probably $300ish. These cards are still holding their own and will continue to, especially at 1080p.


----------



## Dunan

Quote:


> Originally Posted by *NicksTricks007*
> 
> I'd say probably $300ish. These cards are still holding their own and will continue to, especially at 1080p.


Excellent, thanks... Might put it on the chopping block then. Anyone interested, PM me. Mods if this isn't allowed, please let me know.


----------



## NicksTricks007

You'd have to list it in the ocn marketplace for sale section. In order to do that, you need 35 rep.


----------



## Riktar54

I just picked up a FreeSync monitor that mandates using the DisplayPort on my XFX R9 290 DD card for FreeSync to be enabled. It works (sorta) OK, BUT...

I can't see the startup BIOS screen when I switch on the computer. If I use the HDMI port everything seems fine.

I figured maybe I just got a bad DisplayPort cable, but after having the same issue with 3 different cables I am at a loss.

If it matters, the monitor is an LG 34UM57-P.

Right now it's just a minor annoyance. I have both HDMI and DisplayPort cables hooked up in case I need to go into the BIOS. But I would prefer just using the DisplayPort cable; maybe I am missing something simple, or there could be a problem?

Anyone else run into this?


----------



## Raephen

Quote:


> Originally Posted by *c0V3Ro*
> 
> @ Roboyto
> Just upgraded from a ST-750Z-AF to a Coolermaster V850.
> Rig is updated.
> If i set no overclock results are even worse.
> 
> @ mus1mus
> Set vcore to 1.41 in the BIOS. Disabled almost all energy saving options. HPC on, APM off.
> Will try using AB power limit set to maximum.
> I'm using win10 x64. Hotfix is for win7.
> 
> Another gigabyte user (990fx chipset) said their AMD mobos are sensitive about VRM temperature and throttles automatic even with APM disable in BIOS.
> The results with RMAed 280x DB were also kind of low. The shop tested and said problem reported was confirmed.
> http://www.3dmark.com/3dmv/5386394
> 290x with same CPU.
> http://www.3dmark.com/3dmv/5409703
> That's why i'm suspicious about MB.
> I intend buying M5A99FX Pro because Sabertooth 990FX is kind of over kill for overclocking potential of my watercooled rig.
> 
> Thanks all for helping.


I used to own AMD 990FX setups (both boards you mentioned, actually) and indeed: the VRM heat can cause all sorts of issues.
I never did any benching or anything really extreme, but even for mere everyday stability, a couple of 40mm fans atop the VRM heatsink did wonders, even spinning at sub-1000 rpm.


----------



## Dunan

Quote:


> Originally Posted by *NicksTricks007*
> 
> You'd have to list it in the ocn marketplace for sale section. In order to do that, you need 35 rep.


Great.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Good 290/Xs hang with 980s now with AMD's driver improvements. Can't say the same to nVidia Drivers. In fact, I'm seeing more driver interventions now than I was able to see last year. But I only compare drivers with 980TIs.
> 
> @Vellinious, did you notice that as well?


The performance updates for Maxwell were done over the summer. All the driver updates since August / September were game specific hot fixes. Most of them actually hurt benchmark scores. NVIDIA driver 353.62 was the best for benchmarking.

Quote:


> Originally Posted by *Roboyto*
> 
> What are good scores for a 970?
> 
> Or there is a more recent submission tying that score, Jan 2016, at 1353/2090: http://www.3dmark.com/fs/7126804
> 
> That is 968 higher than my sucky score from a year ago.
> 
> If you look at purely graphics scores that guy has a sizable lead 14262 compared to 12891; a difference of 1371. Or comparing the second submission of 14330 to 12891; a difference of 1439. If AMD could squeeze 559 points from driver improvements, I'm sure Nvidia could achieve the same/similar feat. If we subtract the 559 from the 1371 we are left with 812 graphics points difference. And then 559 from 1439 to land at 880. Let's take an average of 846 between the two. I'm no Nvidia GPU expert, but the extra clock speeds between those kickass 970s, and that feeble reference I had (with a ghetto-rigged cooling solution), would most likely account for an 846 point delta in the GPU score.
> 
> You are using a 5820k which skews benchmarks where CPU/Physics are factored in. Must compare apples-to-apples, your 6 cores give you an advantage.


That score you linked is likely on a stock bios, and the core wasn't reading correctly. He didn't pull a 14k graphics score with 1353 on the core.

Good 970 scores will range from low to mid 14k. Exceptional scores would go up to, but not eclipse 15k. Highest graphics score I ever posted was 14880. That's the highest I've seen a 970 run.

Most 970s under water will hit 1600 pretty easy, and 2100+ on the memory. I had mine running at 1633 core and 2176 memory on Firestrike, and 1652 / 2205 on Heaven.

A well overclocked 290X still puts it to shame though.


----------



## JourneymanMike

These are my 290X's; notice the difference between the two, besides that one is a Sapphire Battlefield 4 (reference) & the other an AMD (reference)... both are WC'd, BTW.

They have different BIOS's and the Pixel Fillrate & Texture Fillrate are different (might be the BIOS's?)...

I'm running Crossfire w/ ULPS turned off, also using the Crimson beta drivers 16.1.1

Now, for the best performance, shouldn't both BIOS's be the same?

I've hit a wall with my OC: I can get the Core Clock to 1200, and if I mess with the Memory Clock, I only get 1300... I'm using MSI AB. BTW, how does one enable the memory voltage? It's in AB, but I don't see a way to enable it...


----------



## kizwan

Quote:


> Originally Posted by *JourneymanMike*
> 
> 
> 
> These are my 290X's, notice the difference between the two, besides that one is a Sapphire Battlefield 4 (reference) & the other, an AMD (reference)... both are WC'd BTW
> 
> They have different BIOS's and the Pixel Fillrate & Texture Fillrate are different (might be the BIOS's?)...
> 
> I'm running Crossfire w/ ULPS turned off, also using the Crimson beta drivers 16.1.1
> 
> Now, for the best performance, shouldn't both BIOS's be the same?
> 
> I've hit a wall with my OC, I can get the Core Clock to 1200 and if I mess with the Memory Clock, I only get 1300... I'm using MSI AB, BTW how does one enable the memory voltage? It's on AB, but I don't see a way to enable it...


One of the cards is overclocked, which explains the difference in the Pixel Fillrate & Texture Fillrate readings.

They don't need to have same BIOS.

Only the MSI Lightning & Asus Matrix have memory voltage control.


----------



## rdr09

Quote:


> Originally Posted by *JourneymanMike*
> 
> 
> 
> These are my 290X's, notice the difference between the two, besides that one is a Sapphire Battlefield 4 (reference) & the other, an AMD (reference)... both are WC'd BTW
> 
> They have different BIOS's and the Pixel Fillrate & Texture Fillrate are different (might be the BIOS's?)...
> 
> I'm running Crossfire w/ ULPS turned off, also using the Crimson beta drivers 16.1.1
> 
> Now, for the best performance, shouldn't both BIOS's be the same?
> 
> I've hit a wall with my OC, I can get the Core Clock to 1200 and if I mess with the Memory Clock, I only get 1300... I'm using MSI AB, BTW how does one enable the memory voltage? It's on AB, but I don't see a way to enable it...


What Kiz said. Also, what do you use to disable ULPS? I use AB and the rest of the settings. Let's compare . . .





I use AB and close it. I use Trixx to OC. I never use any of these apps, though, for gaming. Also, you can uncheck synchronize multi-GPU and OC the cards individually like what you have; my settings will be limited by the slower card. It would be safe to assume it would be better to have them synchronized for games.

With Crimson, I noticed that when you OC with AB or Trixx, CCC follows the OC and it might take precedence. What I do is open Crimson and reset the OC to default clocks, then let AB or Trixx OC the cards. Resetting the clock reading in Crimson should not change the OC settings in AB or Trixx. Double check. I never use CCC to OC.


----------



## mus1mus

This ^

For best performance, grab a bios from @fyzzz or @gupsterg's thread.

@kizwan can help too.


----------



## Roboyto

Quote:


> Originally Posted by *Vellinious*
> 
> That score you linked is likely on a stock bios, and the core wasn't reading correctly. He didn't pull a 14k graphics score with 1353 on the core.
> 
> Good 970 scores will range from low to mid 14k. Exceptional scores would go up to, but not eclipse 15k. Highest graphics score I ever posted was 14880. That's the highest I've seen a 970 run.
> 
> Most 970s under water will hit 1600 pretty easy, and 2100+ on the memory. I had mine running at 1633 core and 2176 memory on Firestrike, and 1652 / 2205 on Heaven.
> 
> A well overclocked 290X still puts it to shame though.


My 3DMark results did the same thing for the 970 displaying the base frequency and not the boost.

Thanks for clarifying my suspicions were correct as I only spent a short time with the 970s I had.

My 290 with stock BIOS is tickling 15K graphics score: http://www.3dmark.com/3dm/10864900?



Quote:


> Originally Posted by *mus1mus*
> 
> For best performance, grab a bios from @fyzzz or @gupsterg's thread.
> 
> @kizwan can help too.


This will be happening sooner than later so I can post some results for the Fanboy Competition, which we are getting in by a large margin.


----------



## mus1mus

Do it mate. They are pounding us with quantity, and we need as many as we can get to beat them.

You have a very good sample there. I reckon using @fyzzz bios, your card will scream 16K with TESS Off.


----------



## Roboyto

Quote:


> Originally Posted by *mus1mus*
> 
> Do it mate. They are pounding us with quantity, and we need as many as we can get to beat them.
> 
> You have a very good sample there. I reckon using @fyzzz bios, your card will scream 16K with TESS Off.


Or it will hit 16K now with the stock BIOS @ 1310/1750









I benched the hell out of this card when I first got it and I know it couldn't surpass 1725 on any benchmark I tried, and the core didn't like anything past 1300. Don't know what the deal is, but I'll take it.

*Tessellation ON 14990: http://www.3dmark.com/3dm/10883028?*



*Tessellation OFF 16090: http://www.3dmark.com/3dm/10883300?*


----------



## mus1mus

My bad. I should have said 17K.

Which BIOS do you use, btw? Some stock BIOSes implement VDDC limits, which prevents high clocks. Unless you have a Lightning or a Matrix, a custom BIOS such as a PT-based one is needed for high voltages.

I can remove VDDC limits if you want.


----------



## Roboyto

Quote:


> Originally Posted by *mus1mus*
> 
> My bad. I should have said 17K.
> 
> Which BIOS do you use, btw? Some stock BIOSes implement VDDC limits, which prevents high clocks. Unless you have a Lightning or a Matrix, a custom BIOS such as a PT-based one is needed for high voltages.
> 
> I can remove VDDC limits if you want.


 XFX290RefBlackEdition.zip 41k .zip file


XFX R9 290 Reference Black Edition

I don't know about voltage limits; I was more intrigued by the VRAM strap/timing adjustments. It seems like the memory could go further, but once I cross 1750 -> 1751+, FS causes a black screen hard lock in the 2nd GPU test.


----------



## mus1mus

Quote:


> Originally Posted by *Roboyto*
> 
> XFX290RefBlackEdition.zip 41k .zip file
> 
> 
> XFX R9 290 Reference Black Edition
> 
> I don't know about voltage limits, I was more intrigued by the VRAM strap/timing adjustments. It seems like the memory could go further, but once I cross 1750 -> 1751+ FS causes black screen hard lock in 2nd GPU test.


I'll get back with a modded bios in a couple of hours. Had to prepare for the day.









Once I can view your BIOS, I can mod your VRAM straps as well. It gives some boost, but @gupsterg's finishing touches take it further.

It's amazing we have this BIOS modding going on that takes the cards' performance further.


----------



## Vellinious

Wish my card wasn't such an oddball....I'd love to get the memory timings fine tuned. It just doesn't seem to like it when the memory is touched.


----------



## fyzzz

Interesting: Roboyto, Vellinious and I have very good XFX cards.


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> Wish my card wasn't such an oddball....I'd love to get the memory timings fine tuned. It just doesn't seem to like it when the memory is touched.


But yours is already a very good one. 137t strap for 1800+ mem.
Quote:


> Originally Posted by *fyzzz*
> 
> Interesting: Roboyto, Vellinious and I have very good XFX cards.


Rob's card looks like a winner, you reckon?


----------



## gupsterg

Quote:


> Originally Posted by *Vellinious*
> 
> Wish my card wasn't such an oddball....I'd love to get the memory timings fine tuned. It just doesn't seem to like it when the memory is touched.


You have MFR or BABG RAM IC?
Quote:


> Originally Posted by *mus1mus*
> 
> But yours is already a very good one. 137t strap for 1800+ mem.


I'd agree; if he's using 1375MHz strap timings for 1800+, that's damn good in my book.


----------



## Vellinious

It's an 8GB 290X. It's made things difficult.


----------



## mus1mus

Quote:


> Originally Posted by *gupsterg*
> 
> You have MFR or BABG RAM IC?


MFR 8GB on his BIOS.

Quote:


> Originally Posted by *Vellinious*
> 
> It's an 8GB 290X. It's made things difficult.


gups can do that.

IMO, it makes it easier to fuse a 390X bios into your card.


----------



## gupsterg

Quote:


> Originally Posted by *Vellinious*
> 
> It's an 8GB 290X. It's made things difficult.


We can work with it IMO, no harm in trying.
Quote:


> Originally Posted by *mus1mus*
> 
> MFR 8GB on his BIOS.


It's also got BABG support, so need confirmation; I downed and viewed Vell's stock ROM when we were discussing other things in that group PM session between me, you, fyzzz & kizwan.


----------



## mus1mus

It's Hynix.









1800 is way off what Elpida can clock to. I guess.


----------



## Vellinious

Yes, Hynix. At first I thought I got a dud....Hynix memory on Maxwell is just awful. Elpida and Samsung are the better clockers.


----------



## gupsterg

3 files in zip.

1. Your stock rom
2. Stock + 390 MC mod
3. Stock + 390 MC mod + RRF 03 mod

Try 2 first, then 3 and see how it goes.

Vellinious.zip 296k .zip file


----------



## JourneymanMike

Quote:


> Originally Posted by *kizwan*
> 
> One of the cards is overclocked which explained the difference in Pixel Fillrate & Texture Fillrate reading.
> 
> They don't need to have same BIOS.
> 
> *Only MSI lightning & Asus matrix have memory voltage control*.


Bummer! Thanks for the info! +1
Quote:


> Originally Posted by *rdr09*
> 
> What Kiz said. Also, *what do you use to disable ULPS?* I use AB and the rest of the settings. Let's compare . . .
> 
> 
> 
> 
> 
> I use AB and close it. *I use Trixx to oc.* I never use any of these apps, though, for gaming. Also, you can uncheck the synchronize multi-gpu and oc the cards individually like what you have. my settings will be limited by the slower card. it would be safe to assume it would be better to have them synchronize for games.
> 
> With Crimson, I noticed that when you oc with AB or Trixx . . . CCC follows the oc and it might take precedence. What i do is open Crimson and reset the oc to default clocks. Let AB or Trixx oc the cards. Resetting the clock reading in Crimson should not change the OC settings in AB or Trixx. Double check. I never use CCC to oc.


I use a registry hack program that I got from here ( http://www.sevenforums.com/tutorials/316913-ulps-ultra-low-power-state-disable-amd-crossfirex.html ) to disable/enable ULPS in the registry... I'll give Trixx a try!







+1
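For anyone who'd rather skip the download: that registry hack boils down to setting EnableUlps to 0 under each display-adapter subkey. A rough Python sketch that just builds the equivalent reg command (the class GUID below is the standard display-adapter one; the subkey names are an assumption, so double-check which ones exist on your own system):

```python
GPU_CLASS_GUID = "{4d36e968-e325-11ce-bfc1-08002be10318}"  # display-adapter device class

def ulps_disable_cmd(subkey):
    """Build the 'reg add' line that sets EnableUlps = 0 for one adapter
    subkey (e.g. '0000', '0001'). Run it from an elevated prompt, then reboot."""
    path = (r"HKLM\SYSTEM\CurrentControlSet\Control\Class"
            + "\\" + GPU_CLASS_GUID + "\\" + subkey)
    return f'reg add "{path}" /v EnableUlps /t REG_DWORD /d 0 /f'

print(ulps_disable_cmd("0000"))
```

Set the value back to 1 (or delete it) to re-enable ULPS.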
Quote:


> Originally Posted by *mus1mus*
> 
> This ^
> 
> For best performance, grab a bios from @fyzzz or @gupsterg's thread.
> 
> @kizwan can help too.


Now I'll have to figure out flashing the BIOS; is that done on both cards, or one at a time? Thanks for the help! +1


----------



## Vellinious

Quote:


> Originally Posted by *gupsterg*
> 
> 3 files in zip.
> 
> 1. Your stock rom
> 2. Stock + 390 MC mod
> 3. Stock + 390 MC mod + RRF 03 mod
> 
> Try 2 first, then 3 and see how it goes
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Vellinious.zip 296k .zip file


Thanks, I'll give it a try

Anything specific I need to know? Like...don't use the power target slider, or, voltage can be tricky....anything?


----------



## mus1mus

Quote:


> Originally Posted by *JourneymanMike*
> 
> Now I'll have to figure out flashing the BIOS, is that on both cards, or one at a time? Thanks for the help! +1


You can flash the BIOS with both cards installed by using pointers. Note the bold number: 0 is the primary card, 1 for the second, 2 for the third, etc.

You just need your stock ROM in either switch position as a fail-safe. Flip the switch, flash, profit.

atiflash -f -p *0* biosname.rom


----------



## Jflisk

Quote:


> Originally Posted by *mus1mus*
> 
> You can flash the bios with both cards in by using pointers. Note of the bold number. 0 is primary card. 1 for second, 2 for third..etc.
> 
> You just need your stock rom in either switch position as a fail-safe. Flip the switch, flash, profit.
> 
> atiflash -f -p *0* biosname.rom


I'll leave this here for the next guy. There is a better one somewhere in the Fury X thread.

Run the command prompt as administrator in Windows; you don't need to exit Windows to flash the card. Put everything, including the BIOS, into a folder on C:\ called atiflash.

Open a command prompt, type cd c:\atiflash, then type:

atiflash -p 0 biosname.rom

(0 = top card, 1 = second card, depending on card position.) That is the proper usage for atiflash 2.71.

All the switches can be found here:
http://www.techpowerup.com/forums/threads/how-to-use-atiflash.57750/

More here:
http://www.overclock.net/t/640063/how-to-flash-ati-cards
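To make the adapter-index idea concrete, here's a small hypothetical Python helper that just builds the atiflash command line (atiflash itself is the real tool; the -f force flag is optional and skips ID checks):

```python
def atiflash_cmd(adapter, rom, force=False):
    """Build an atiflash command line; adapter 0 = top/primary card,
    1 = second card, and so on. Hypothetical helper for illustration."""
    cmd = ["atiflash"]
    if force:
        cmd.append("-f")  # skip ID checks; only when you know the ROM fits
    cmd += ["-p", str(adapter), rom]
    return cmd

# e.g. flash the primary card (needs an elevated prompt):
#   subprocess.run(atiflash_cmd(0, "biosname.rom"), check=True)
print(" ".join(atiflash_cmd(0, "biosname.rom")))
```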


----------



## gupsterg

Quote:


> Originally Posted by *Vellinious*
> 
> Thanks, I'll give it a try
> 
> Anything specific I need to know? Like...don't use the power target slider, or, voltage can be tricky....anything?


No worries.

Nothing specific I need to state; the ROMs are stock clocks/voltage/PL limit/stock RAM timings per strap, so do as you would with the stock ROM.

You have one with the 390 MC = some gain in performance, and it may possibly aid higher-frequency RAM (that's what The Stilt posted; see the last paragraph here). The other one is the same, but RRF (RefreshRateFactor, as we currently know it) is 03 vs 02. All 8GB ROMs, regardless of 290/X or 390/X, have 02. 4GB 290/X ROMs have 01; when only that is changed from 01 to 02 there is a boost, plus more going to 03. Fyzzz reported no extra gain with RRF 04 vs 03 in tests with a 290 ROM with the 390 MC.
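Condensing the RRF notes above into a quick reference (values exactly as reported in this thread; treat them as observations, not gospel):

```python
# RefreshRateFactor (RRF) defaults per ROM type, per the notes above
RRF_DEFAULT = {
    "290/X 4GB": 0x01,
    "290/X 8GB": 0x02,  # all 8GB ROMs ship with 02...
    "390/X 8GB": 0x02,  # ...regardless of 290/X or 390/X
}
# Observed: bumping a 4GB ROM 01 -> 02 gives a boost, 02 -> 03 a bit more;
# fyzzz saw no extra gain at 04 vs 03 (290 ROM with the 390 MC).
```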


----------



## miklkit

Hi!

I just discovered in MSI Afterburner that my Sapphire 290x 8gb card seems to be using only 3 gb of ram. Is this a feature of Afterburner or is this the actual state of affairs?

I just took the Steam VR test and it failed badly and that is when I noticed the ram usage.


----------



## diggiddi

I don't think that's the problem, because the graph in AB only goes to 3 GB in mine too, but I passed with a 7.3.

Strange that you failed considering your specs, but try running it without AB; mine kept crashing with it on.


----------



## Regnitto

290 died. Has been replaced with a 390x


----------



## mfknjadagr8

Quote:


> Originally Posted by *Regnitto*
> 
> 290 died. Has been replaced with a 390x


rip great card


----------



## Vellinious

Quote:


> Originally Posted by *gupsterg*
> 
> No worries
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Nothing specific I need to state
> 
> 
> 
> 
> 
> 
> 
> , ROMs are stock clocks/voltage/PL limit/stock RAM timings per strap, so do as you would with stock ROM.
> 
> You have 1 with 390 MC = some gain in performance, possibly aid higher freq. RAM (that's what the Stilt posted see last paragraph here). The other 1 is same but RRF (RefreshRateFactor as we currently know) is 03 vs 02. All 8GB ROMs regardless of 290/X or 390/X have 02. 4GB 290/X have 01, when only that is changed from 01 to 02 in those ROMs there is a boost plus going 03. Fyzzz reported no extra gain with RRF 04 vs 03 in tests with 290 ROM with 390 MC.


Well, that definitely helped. Results from the 3rd bios were pretty impressive, given that it was with the stock bios voltage issues.


----------



## kizwan

Quote:


> Originally Posted by *Vellinious*
> 
> Well, that definitely helped. Results from the 3rd bios were pretty impressive, given that it was with the stock bios voltage issues.
> 
> 


How much gain did you see with it?


----------



## Vellinious

Quote:


> Originally Posted by *kizwan*
> 
> How much gain did you see with it?


Nothing in graphics test 1....maybe .1 fps. But in graphics test 2 it added over 2fps. Raised the graphics score by 400+ points. I just did a quick run with clocks I knew would be stable....I'm pretty certain with some fine tuning and pushing the core / memory harder, I could get it up some more.


----------



## Vellinious

Did a quick run with no tess tweaks. Been looking for the magic combination to hit 16k straight up..... Oh yeah...

Can we combine this mod with the voltage mods that mus did for me? This is sick.....


----------



## Vellinious

I beat Fyzzz.... = P

http://www.3dmark.com/3dm/10903929


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> Can we combine this mod with the voltage mods that mus did for me? This is sick.....


That is very easy. Post the actual one you are using right now.


----------



## miklkit

Quote:


> Originally Posted by *diggiddi*
> 
> I don't think that's the problem because the graph in AB only goes to 3gb in mine too , but I passed with a 7.3
> 
> Strange that you failed considering your specs, but try running it without AB mine kept crashing with it on


I installed Trixx and it read the same 3 GB. The CPU and RAM are loafing along, hardly being used, while the GPU is pegged, and it isn't even close to passing. In fact it is absolutely minimum performance, kinda like software rendering.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> That is very easy. post the actual one you are using right now.


This was the one with the best results.

22616GOOD.zip 98k .zip file


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> This was the one with the best results.


 22616GOOD_NOOV.zip 98k .zip file


Shoot 18K


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> 22616GOOD_NOOV.zip 98k .zip file
> 
> 
> Shoot 18K


That's with the voltage mod the same as the OV3 bios you sent me?


----------



## mus1mus

YES


----------



## kizwan

Quote:


> Originally Posted by *Vellinious*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> How much gain did you see with it?
> 
> 
> 
> Nothing in graphics test 1....maybe .1 fps. But in graphics test 2 it added over 2fps. Raised the graphics score by 400+ points. I just did a quick run with clocks I knew would be stable....I'm pretty certain with some fine tuning and pushing the core / memory harder, I could get it up some more.

If I do that mod, my 290 modded ROM can outpace my 390 modded ROM but if I do that mod in 390 modded ROM, then my 290 modded ROM may still be defeated.







Nice score btw.








Quote:


> Originally Posted by *miklkit*
> 
> Quote:
> 
> 
> 
> Originally Posted by *diggiddi*
> 
> I don't think that's the problem because the graph in AB only goes to 3gb in mine too , but I passed with a 7.3
> 
> Strange that you failed considering your specs, but try running it without AB mine kept crashing with it on
> 
> 
> 
> I installed Trixx and it read the same 3 gb. The cpu and ram are loafing along hardly being used while the gpu is pegged and it isn't even close to passing. In fact it is absolutely minimum performance, kinda like software rendering.

Something wrong with your card then. Try clean the drivers using DDU, then re-install the drivers.


----------



## Vellinious

Sweet. Gotta wait for the coolant temps to come back down and I'll give it a try.

Thank you both.....


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> Sweet. Gotta wait for the coolant temps to come back down and I'll give it a try.
> 
> Thank you both.....


No worries mate.

Can I make a request too?


----------



## battleaxe




Quote:


> Originally Posted by *Vellinious*
> 
> Well, that definitely helped. Results from the 3rd bios were pretty impressive, given that it was with the stock bios voltage issues.






Holy Crap Batman... daaauuuuummm...


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> No worries mate.
> 
> Can I make a request too?


Sure. I owe ya. lol

That didn't seem to do anything. Voltages were back to dropping below 1.3v again under load. 1.281v at the lowest point. Same settings I was using before when the voltage was only dropping to 1.305v under load.

EDIT: gotta try something else...I had the voltage set wrong.


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> Sure. I owe ya. lol


It involves bios modding for a card on the other side.








Edit me Maxwell.


----------



## Vellinious

Well, not gonna get 18k. This is riding right on the ragged edge. 1310 core / 1807 memory.

http://www.3dmark.com/3dm/10904498



Maxwell? Easy. What card is it and what's the ASIC? Water or air? And....how aggressive do you want it? Just....please tell me it's not a STRIX, Classy, KPE or HOF.....not much we can do with those, except increase the power limits.


----------



## mus1mus

HOF TI.. AIR.

One card did 1555/4404


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> HOF TI.. AIR.
> 
> One card did 1555/2404


GM200 or GM204?

Can't raise the voltage on the HOF card in the bios. Gotta use the voltage tool..... I think there was a thread on the KingPin forums about those. Same with the others I listed. I think it's a modified GPU tweak, iirc....I never read too much about it.

I can give you higher power limits, but...that'll only help if you're hitting the PWR perf cap.

2400 memory? Good lord....

EDIT: NM. 980ti, GM200. Saw the FS. 2200 memory. That's god like for air cooling, man. Wow....must be Samsung?


----------



## mus1mus

Yep. Samsung.

That's alright. I just need Boost Removed and 200% Power and TDP Limits.







It helps on some benches.

3 Cards. that one can probably go further. I'm just testing them one by one. Will send you a PM later.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Yep. Samsung.
> 
> That's alright. I just need Boost Removed and 200% Power Limits.
> 
> 
> 
> 
> 
> 
> 
> It helps on some benches.
> 
> 3 Cards. that one can probably go further. I'm just testing them one by one. Will send you a PM later.


Found a thread on the voltage tool for the HOF cards. PM me your stock bios, and I'll take a look at it. The GM200 is a bit different in the tables...may have Mr Dark take a look at it. He's the magic man. = P

http://www.overclock.net/t/1564410/just-got-my-galax-gtx-980-ti-hall-of-fame-pics-and-more-info-inside/10


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> Found a thread on the voltage tool for the HOF cards. PM me your stock bios, and I'll take a look at it. The GM200 is a bit different in the tables...may have Mr Dark take a look at it. He's the magic man. = P
> 
> http://www.overclock.net/t/1564410/just-got-my-galax-gtx-980-ti-hall-of-fame-pics-and-more-info-inside/10


Nice info. Thanks for the link, buddy. I'll check it out.


----------



## diggiddi

Quote:


> Originally Posted by *miklkit*
> 
> I installed Trixx and it read the same 3 gb. The cpu and ram are loafing along hardly being used while the gpu is pegged and it isn't even close to passing. In fact it is absolutely minimum performance, kinda like software rendering.


Where did Trixx give you that reading? Under hardware monitor (Trixx v5.2.1) my dedicated memory maxed out at 3770 MB.
I'm beginning to hate Afterburner, needs fixin


----------



## Vellinious

Does anyone have the link to the HiS iTurbo download link off of google drive? I reinstalled my OS and lost my link to it.

NM, found it. = )


----------



## miklkit

Quote:


> Originally Posted by *diggiddi*
> 
> Where did Trixx give you that reading? Under hardware monitor (Trixx v5.2.1) my dedicated memory maxed out at 3770 MB.
> I'm beginning to hate Afterburner, needs fixin


In Trixx 5.2.1 it's in "hardware monitor". I just ran it again and it hit 3369 MB and got a little bit better fps.

I cleaned and installed the Crimson drivers with the AMD utility. Will try DDU next.


----------



## diggiddi

Also, in your AMD folder on C:, how many driver installations are there? I always delete the old ones.


----------



## b0uncyfr0

Does anyone know why Trixx doesn't allow changing Aux voltage on 2xx/3xx cards?


----------



## miklkit

It was corrupted drivers. After cleaning and reinstalling the Crimson hotfix drivers performance is all the way up to average. Thanks everyone!

Now off to do some gaming................


----------



## cephelix

Quote:


> Originally Posted by *b0uncyfr0*
> 
> Does anyone know why Trixx doesn't allow changing Aux voltage on 2xx/3xx cards?


It doesn't? It works fine on mine. I know one of the newer Trixx versions wasn't working and I had to go back to an earlier version.


----------



## fyzzz

@Vellinious Awesome card you have there. But you haven't beaten me yet







http://www.3dmark.com/fs/7252410 and my highest but i had trouble with combined: http://www.3dmark.com/fs/7572102.


----------



## mus1mus

Ohh dear. I am falling behind these 2 brats!

I'll beat you in numbers you freaks


----------



## Vellinious

Quote:


> Originally Posted by *fyzzz*
> 
> @Vellinious Awesome card you have there. But you haven't beaten me yet
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/7252410 and my highest but i had trouble with combined: http://www.3dmark.com/fs/7572102.


I'll have to let my ambient temps drop and make another run at it. Loop temps were gettin pretty warm last night. lol


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Ohh dear. I am falling behind these 2 brats!
> 
> I'll beat you in numbers you freaks


I must have tried 20 times with different settings last night....I can't touch the 58k you posted for Vantage. /shrug


----------



## mus1mus

Maybe you can see a pattern here.

http://www.3dmark.com/compare/3dmv/5407520/3dmv/5408003/3dmv/5375930/3dmv/5375508/3dmv/5375926#


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Maybe you can see a pattern here.
> 
> http://www.3dmark.com/compare/3dmv/5407520/3dmv/5408003/3dmv/5375930/3dmv/5375508/3dmv/5375926#


Are those clocks reading right? ........that doesn't make sense. lol


----------



## mus1mus

That, obviously, is a misread. But look down the CPU Clock column.

FSB maybe? It translates to - you know.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> That, obviously is a misread. But look down the CPU Clock.
> 
> FSB maybe? It translates to - you know.


Ah. What is that, 125.5? 126?


----------



## mus1mus

This will show it better.

http://www.3dmark.com/3dmv/5415511

Try a PCIe clock over 101.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> this will show it better.
> 
> http://www.3dmark.com/3dmv/5415511
> 
> Try a PCIe clock over 101.


Hmm. I'll have to look around in my bios.


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> Hmm. I'll have to look around in my bios.


Try 103 FSB or 128 for 125 strap.
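For anyone puzzling over "128 for 125 strap", here is a rough sketch of the arithmetic, assuming the usual Intel strap behavior where the bus clock scales as BCLK × 100 / strap — the exact behavior is board- and BIOS-specific:

```python
# Approximate bus (PCIe/DMI) clock for a given BCLK and strap.
# Assumes the common BCLK * (100 / strap) relationship; boards differ.

def effective_bus_clock(bclk_mhz, strap):
    """Rough PCIe/DMI clock for a BCLK run on a given strap."""
    return bclk_mhz * 100.0 / strap

print(effective_bus_clock(103, 100))  # 103.0 -> the "103 FSB" suggestion
print(effective_bus_clock(128, 125))  # 102.4 -> "128 for 125 strap"
```

Either way the bus ends up a bit over the 101 mark suggested earlier, while the 125 strap lets the BCLK itself run much higher.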

Btw, anyone tried this? Meant to do this but got busy with the TIs.

__
https://www.reddit.com/r/36io8f/amd_onlytweakheres_a_lod_tweak_for_a_fps_boost/


----------



## mirzet1976

Quote:


> Originally Posted by *mus1mus*
> 
> Try 103 FSB or 128 for 125 strap.
> 
> Btw, anyone tried this? Meant to do this but got busy with the TIs.
> 
> __
> https://www.reddit.com/r/36io8f/amd_onlytweakheres_a_lod_tweak_for_a_fps_boost/


Can't find LodAdj under HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video\ ( your driver video id) \0000\UMD

3- Change the LodAdj string value to -1
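If the `LodAdj` value isn't there, it can be created rather than edited. A minimal .reg sketch following the quoted steps — the `{your-driver-video-id}` segment is a placeholder for your card's actual driver key (it varies per system), and the string type and `-1` value are taken straight from the quoted instructions, not verified against any AMD documentation:

```reg
Windows Registry Editor Version 5.00

; Placeholder path: replace {your-driver-video-id} with the key that
; contains your card's 0000\UMD subkey.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video\{your-driver-video-id}\0000\UMD]
"LodAdj"="-1"
```

Back up the key before merging, since UMD values differ between driver versions.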


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Try 103 FSB or 128 for 125 strap.
> 
> Btw, anyone tried this? Meant to do this but got busy with the TIs.
> 
> __
> https://www.reddit.com/r/36io8f/amd_onlytweakheres_a_lod_tweak_for_a_fps_boost/


Anything above 126 started giving me big problems. I'd have to mess with the memory timings to get it to run stable. That's a lot of work for a benchmark. lol


----------



## juno12

Has anyone found a solution to the black screen crash in YouTube videos? I have this since the 1st crimson release.

Sent from my LG-D855 using Tapatalk


----------



## rdr09

Quote:


> Originally Posted by *juno12*
> 
> Has anyone found a solution to the black screen crash in YouTube videos? I have this since the 1st crimson release.
> 
> Sent from my LG-D855 using Tapatalk


If you have Hardware Acceleration enabled in the browser, your system might be sensitive to it. I have crimson (16.1) with 2 290s and YT does not give me issue with or without HA.

Go to Advance settings and disable HA. See if it helps.


----------



## Ashura

Guys, why does vcore reset after reboot in Afterburner? Everything else remains the same, just vcore is back to 0.

This is happening after upgrading to Win 10 & the 15.12 Crimson driver.

Any ideas?


----------



## xAcid9

Quote:


> Originally Posted by *mirzet1976*
> 
> Cant find LodAdj under HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Video\ ( your driver video id) \0000\UMD
> 
> 3- Change the LodAdj string value to -1


You can use RadeonMod to adjust that and few others tweaks within registry.

Quote:


> Originally Posted by *mus1mus*
> 
> Try 103 FSB or 128 for 125 strap.
> 
> Btw, anyone tried this? Meant to do this but got busy with the TIs.
> 
> __
> https://www.reddit.com/r/36io8f/amd_onlytweakheres_a_lod_tweak_for_a_fps_boost/


Quote:


> Originally Posted by *Ashura*
> 
> Guys, why does vcore reset after reboot in Afterburner? Everything else remains the same, just vcore is back to 0.
> 
> This is happening after upgrading to Win 10 & the 15.12 Crimson driver.
> 
> Any ideas?


The Crimson driver is the culprit; you need to revert to pre-Crimson. Not sure if they fixed it in the Feb 12 WHQL though.


----------



## i2CY

Anyone run the Steam VR ready system test?

Wasn't thrilled with my results.

I wonder what they're basing it on compared to 3DMark.


----------



## Vellinious

Yup. I was curious. Daily clocks. CPU at 4.2 and GPU at 1250 / 1625


----------



## mirzet1976

CPU at 4.8ghz GPU 1275/1625mhz daily



CPU 5ghz GPU 1275/1625mhz


----------



## i2CY

lol
k
just me then ha!!


----------



## diggiddi

Afterburner sux, it's started acting up recently, too many bugs.
Quote:


> Originally Posted by *xAcid9*
> 
> You can use RadeonMod to adjust that and few others tweaks within registry.
> If i'm not mistaken LOD tweak only works in DX9 games.
> *Crimson driver is the culprit, need to revert back to pre-Crimson. Not sure if they fixed it in Feb12 WHQL though*.


I think you are right, AB 4.11 and 4.20 are causing crashes and Crimson is the only software change.


----------



## Roboyto

*DX12 information is heating up folks







*

*R9 390 ~VS~ GTX 970 in New Hitman:*






And the new Ashes of the Singularity beta benchmark, which has tossed in some extra DX12 asynchronous shading, seems to be giving the Red Team a bit of a boost:

http://www.anandtech.com/show/10067/ashes-of-the-singularity-revisited-beta





Good things are coming...good things


----------



## Roboyto

Quote:


> Originally Posted by *i2CY*
> 
> Anyone run the Steam VR ready system test?
> 
> Wasn't thrilled with my results.
> 
> I wonder what they're basing it on compared to 3DMark.


Quote:


> Originally Posted by *Vellinious*
> 
> Yup. I was curious. Daily clocks. CPU at 4.2 and GPU at 1250 / 1625
> 
> 9620 frames Avg Fidelity 10


Quote:


> Originally Posted by *mirzet1976*
> 
> CPU at 4.8ghz GPU 1275/1625mhz daily
> 
> 9159 frames Avg Fidelity 9.5
> 
> CPU 5ghz GPU 1275/1625mhz
> 
> 10134 frames Avg Fidelity 9.6


4770k @ 4.5 | 16GB DDR3 2400 | R9 290

*Stock 980/1250: *

*8729 Frames | Avg Fidelity 6.3 High*



Spoiler: Stock















*Gaming Clocks 1200/1500:*

*8846 Frames | Avg Fidelity 8.1 Very High*



Spoiler: Game















*Bench Clocks 1275/1700:*

*9141 Frames | Avg Fidelity 8.4 Very High*



Spoiler: Bench















*All core 1275/1250:*

*9474 Frames | Avg Fidelity 7.3 Very High*



Spoiler: OC Core















*All VRAM 980/1700:*

*9063 Frames | Avg Fidelity 5.4 Medium*



Spoiler: OC VRAM















Interesting that the 8320 is putting up better fidelity/frame numbers than my 4770k with the GPUs at near identical settings.

The 5820k is pushing fewer frames but higher fidelity. VR making use of 12 threads?

Maxing out my VRAM with the stock core clock and performance plummets...maybe my 1700 isn't stable for the VR test.


----------



## battleaxe

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *Roboyto*
> 
> *DX12 information is heating up folks
> 
> 
> 
> 
> 
> 
> 
> *
> 
> *R9 390 ~VS~ GTX 970 in New Hitman:*
> 
> 
> 
> 
> 
> 
> And the new Ashes of the Singularity beta benchmark, which has tossed in some extra DX12 asynchronous shading, seems to be giving the Red Team a bit of a boost:
> 
> http://www.anandtech.com/show/10067/ashes-of-the-singularity-revisited-beta
> 
> 
> 
> 
> 
> 
> 
> Good things are coming...good things






I'm going to laugh uncontrollably if the 390X catches the 980ti after all the smoke settles on DX12, even if only on 4k. Unlikely maybe, but it would be cool if it happened.

Still, I hope these results are right and what we will actually see. Somehow though I doubt it.


----------



## Vellinious

Quote:


> Originally Posted by *battleaxe*
> 
> 
> I'm going to laugh uncontrollably if the 390X catches the 980ti after all the smoke settles on DX12, even if only on 4k. Unlikely maybe, but it would be cool if it happened.
> 
> Still, I hope these results are right and what we will actually see. Somehow though I doubt it.


There is very little, to absolutely no chance of that happening...just sayin. lol


----------



## battleaxe

Quote:


> Originally Posted by *Vellinious*
> 
> There is very little, to absolutely no chance of that happening...just sayin. lol


Yeah, but it would be sweet if it did. I'd just like to see AMD get a break for once and beat the giant.


----------



## Vellinious

Quote:


> Originally Posted by *battleaxe*
> 
> Yeah, but it would be sweet if it did. I'd just like to see AMD get a break for once and beat the giant.


It won't be with Hawaii.....


----------



## mus1mus

Hawaii X2 did.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Hawaii X2 did.


Yeah....only took 2 of them. lol


----------



## Roboyto

Quote:


> Originally Posted by *Vellinious*
> 
> There is very little, to absolutely no chance of that happening...just sayin. lol


Quote:


> Originally Posted by *battleaxe*
> 
> I'm going to laugh uncontrollably if the 390X catches the 980ti after all the smoke settles on DX12, even if only on 4k. Unlikely maybe, but it would be cool if it happened.
> 
> Still, I hope these results are right and what we will actually see. Somehow though I doubt it.


It will be interesting to see how things unfold. AMD has a clear advantage at the hardware level with DX12, where Nvidia's typically superior software optimizations are of little to no help.

It all depends on how these upcoming games are coded. If they all turn out like RotR, where Gameworks takes precedence over DX12...then there is little hope.


----------



## Vellinious

Quote:


> Originally Posted by *Roboyto*
> 
> It will be interesting to see how things unfold. AMD has a clear advantage at the hardware level with DX12, where Nvidia's typically superior software optimizations are of little to no help.
> 
> It all depends on how these upcoming games are coded. If they all turn out like RotR, where Gameworks takes precedence over DX12...then there is little hope.


AMD needs more than DX12 to be a game changer to help them. They need Arctic Islands and Zen to be killers. If they don't gain some market share in the CPU market, they're done...worst part is, they know it. That's why they split off the GPU division last year: to protect it against what's almost inevitable at this point.


----------



## Roboyto

Quote:


> Originally Posted by *Vellinious*
> 
> AMD needs more than DX12 to be a game changer to help them. They need Arctic Islands and Zen to be killers. If they don't gain some market share in the CPU market, they're done...worst part is, they know it. That's why they split off the GPU division last year: to protect it against what's almost inevitable at this point.


I agree, but DX12 giving them a leg up certainly isn't going to hurt them. The early signs from DX12 appear to be positive and they could certainly get better.

Their efforts for improvement of CCC with crimson as a whole are a positive sign.

Hiring Jim Keller for arch design of Zen is a positive sign. Additionally being on the same/similar node as Intel is a positive sign.

Containing Fury power consumption with the Nano is a positive sign.

Now speculating....if DX12 shows good gains for AMD, it could get even better with Polaris if they've tailored the architecture to even further exploit the benefits of the new API.


----------



## Vellinious

I agree with all of that....but I've believed the hype before. I'll wait until I actually see it perform before I buy in again. lol


----------



## Roboyto

Quote:


> Originally Posted by *Vellinious*
> 
> I agree with all of that....but I've believed the hype before. I'll wait until I actually see it perform before I buy in again. lol


I was let down, like many, with FX CPUs...and again with Fury, even if it is roughly a 25% performance gain over the 290(X) cards. There was too much hype combined with a poor launch.

However...

I think they've also made some changes in their PR dept. The only major thing they've shown, AFAIK, is a low-end Polaris up against a GTX 950. As long as they haven't skewed/cherry-picked the results in that scenario, that is also a good sign. Much less foot-in-mouth, with the hardware doing the talking, would be a wise strategy this time around.

Let's hope the trend continues and it kicks the performance up a couple notches for our already mature hardware. Word on the proverbial street is Rise of the Tomb Raider is getting a DX12 patch sooner rather than later.


----------



## mus1mus

Quote:


> Originally Posted by *Roboyto*
> 
> I agree, but DX12 giving them a leg up certainly isn't going to hurt them. The early signs from DX12 appear to be positive and they could certainly get better.
> 
> Their efforts for improvement of CCC with crimson as a whole are a positive sign.
> 
> Hiring Jim Keller for arch design of Zen is a positive sign. Additionally being on the same/similar node as Intel is a positive sign.
> 
> Containing Fury power consumption with the Nano is a positive sign.
> 
> Now speculating....if DX12 shows good gains for AMD, it could get even better with Polaris if they've tailored the architecture to even further exploit the benefits of the new API.


Jim already left AMD, I heard. We can say he was there during the design stage though.


----------



## Roboyto

Quote:


> Originally Posted by *mus1mus*
> 
> Jim already left AMD, I heard. We can say he was there during the design stage though.


Just like he has for other projects.

https://en.wikipedia.org/wiki/Jim_Keller_(engineer)

He was with AMD from Aug 2012 to Sep 2015...3 years is plenty long to have a large influence and much longer than his first tour with AMD. He now works for Tesla.


----------



## mus1mus

Quote:


> Originally Posted by *Roboyto*
> 
> Just like he has for other projects.
> 
> https://en.wikipedia.org/wiki/Jim_Keller_(engineer)
> 
> He was with AMD from Aug 2012 to Sep 2015...3 years is plenty long to have a large influence and much longer than his first tour with AMD. He now works for Tesla.


I have high hopes for Zen, TBH.

How did you go with the modded bios btw?


----------



## Roboyto

Quote:


> Originally Posted by *mus1mus*
> 
> I have high hope for ZEN TBH.
> 
> How did you go with the modded bios btw?


No test yet. Need a solid uninterrupted stretch to fiddle around with all of that. Work, wife, cats, side work, car fixing, mom's bday...weekend is approaching quickly however







.


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> 
> I'm going to laugh uncontrollably if the 390X catches the 980ti after all the smoke settles on DX12, even if only on 4k. Unlikely maybe, but it would be cool if it happened.
> 
> Still, I hope these results are right and what we will actually see. Somehow though I doubt it.


Who cares about DX12?



battle, just noticed we have very similar setups. How's Sandy holding up? Even our monitor is the same.

Sorry, I have no high-scoring Firestrike to show off.


----------



## JourneymanMike

Crimson 16.2 beta is out...

http://www.techpowerup.com/220351/amd-releases-radeon-software-crimson-edition-16-2-beta-drivers.html


----------



## ILOVEPOTtery

Hey guys/gals,

I've been out of the game for a while and have missed a few generations of hardware issues, so please bear with me. I think I've found my answers across the web, but I wanted to confirm and see what you all have to say.

My go-to for GPU overclocking has been Afterburner (currently using 4.2.0). I've ticked "Start with Windows" and "Force Constant Voltage", and have selected the same profile for both 2D and 3D applications. If I shutdown or reboot with an OC applied, I get a black screen when booting into Win7. A reboot into Safe Mode and restoring stock clocks clears the issue.

Would Trixx offer better boot stability or do I need to go the way of a BIOS mod/flash for this card? Would it make any difference going from the Legacy to UEFI switch position?

TIA


----------



## MoGTy

Quote:


> Originally Posted by *ILOVEPOTtery*
> 
> Hey guys/gals,
> 
> I've been out of the game for a while and have missed a few generations of hardware issues, so please bear with me. I think I've found my answers across the web, but I wanted to confirm and see what you all have to say.
> 
> My go-to for GPU overclocking has been Afterburner (currently using 4.2.0). I've ticked "Start with Windows" and "Force Constant Voltage", and have selected the same profile for both 2D and 3D applications. If I shutdown or reboot with an OC applied, I get a black screen when booting into Win7. A reboot into Safe Mode and restoring stock clocks clears the issue.
> 
> Would Trixx offer better boot stability or do I need to go the way of a BIOS mod/flash for this card? Would it make any difference going from the Legacy to UEFI switch position?
> 
> TIA


Have you ever reverted back from win10?
I had better luck with the legacy bios. But generally speaking (especially in Win10!) don't auto boot with the overclock enabled. It'll cause the bug you described.


----------



## ILOVEPOTtery

Quote:


> Originally Posted by *MoGTy*
> 
> Have you ever reverted back from win10?
> I had better luck with the legacy bios. But generally speaking (especially in Win10!) don't auto boot with the overclock enabled. It'll cause the bug you described.


This has been a Win7 build from the get go.

Now that you mention it, I've reverted to stock clocks (and removed the 2D/3D profiles) with Afterburner before a warm reboot or cold boot, but the "Start with Windows" option was still ticked. Perhaps Afterburner is trying to apply one of the OC profiles at boot... Curiouser and curiouser....

I think I need to fiddle with the software a bit more. What I'm really hoping is that a BIOS profile with my desired clocks and voltages would solve this issue.


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> Who cares about DX12?
> 
> 
> 
> battle, just noticed we have very similar setup. How's Sandy holding up? Even our monitor is same.
> 
> Sorry, i have no high scoring Firestrike to show off.


I think Sandy Bridge is still relevant, honestly. My 2700k will go to 5GHz on water. I keep it at 4.7 for daily driving. I have two other 2500ks that both do 4.7 and could go a little higher. Both still running great. One I let a friend borrow indefinitely until I need it back. And I have a 3570k that my wife uses that runs at 4.5GHz.

Until something comes out that is really quite a bit better, I'm going to hold off. I'd rather buy new GPUs; seems you get more mileage with those upgrades, IMO. I'm loving my 290, 290X, and 390X right now. Great cards. Waiting to see what next gen brings us.


----------



## Vellinious

Thinking about adding another 8GB 290X...can't find another clear acrylic / nickel plated kryographics block though. All I can find is the black edition.

I've looked at Mod My Mods and PPCS....any other ideas?


----------



## kizwan

Quote:


> Originally Posted by *ILOVEPOTtery*
> 
> Hey guys/gals,
> 
> I've been out of the game for a while and have missed a few generations of hardware issues, so please bear with me. I think I've found my answers across the web, but I wanted to confirm and see what you all have to say.
> 
> My go-to for GPU overclocking has been Afterburner (currently using 4.2.0). I've ticked "Start with Windows" and "Force Constant Voltage", and have selected the same profile for both 2D and 3D applications. If I shutdown or reboot with an OC applied, I get a black screen when booting into Win7. A reboot into Safe Mode and restoring stock clocks clears the issue.
> 
> Would Trixx offer better boot stability or do I need to go the way of a BIOS mod/flash for this card? Would it make any difference going from the Legacy to UEFI switch position?
> 
> TIA


When you set AB to start with Windows, it'll create a task in Task Scheduler. Find that task & set a start delay, so CCC/Crimson starts before the AB task does.


----------



## ILOVEPOTtery

Quote:


> Originally Posted by *Vellinious*
> 
> Thinking about adding another 8GB 290X...can't find another clear acrylic / nickel plated kryographics block though. All I can find is the black edition.
> 
> I've looked at Mod My Mods and PPCS....any other ideas?


Frozen CPU has 'em in stock.
Quote:


> Originally Posted by *kizwan*
> 
> When you set AB to start with windows, it'll create a task in Task Scheduler. Find that task & set a delay start. Let the CCC/Crimson started first before the AB task start.


I'll try that out, thank you!


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> I think Sandy Bridge is still relevant, honestly. My 2700k will go to 5GHz on water. I keep it at 4.7 for daily driving. I have two other 2500ks that both do 4.7 and could go a little higher. Both still running great. One I let a friend borrow indefinitely until I need it back. And I have a 3570k that my wife uses that runs at 4.5GHz.
> 
> Until something comes out that is really quite a bit better, I'm going to hold off. I'd rather buy new GPUs; seems you get more mileage with those upgrades, IMO. I'm loving my 290, 290X, and 390X right now. Great cards. Waiting to see what next gen brings us.


Wut? You have three hawaiis with that cpu? My thought is that it has seen its limit with two. Although, I've only seen a couple of cores hit the 90s in BF4 and C3 at 4K.


----------



## mfknjadagr8

Quote:


> Originally Posted by *ILOVEPOTtery*
> 
> Frozen CPU has 'em in stock.


If you get what you ordered by phone


----------



## Vellinious

Quote:


> Originally Posted by *mfknjadagr8*
> 
> If you get what you ordered by phone


Yeah.....I'm not ready to trust him again.

I sent an email to PPCS, see if they can get me one.


----------



## ILOVEPOTtery

Sorry man, didn't realize the guy was a shyster. Good to know, since I was considering the same block from them.


----------



## battleaxe

Quote:


> Originally Posted by *rdr09*
> 
> Wut? You have three hawaiis with that cpu? My thought is that it has seen its limit with two. Although, I've only seen a couple of cores hit the 90s in BF4 and C3 at 4K.


No, the 290 is on a 2500k. Have a 390x and 290x on the 2700k together.


----------



## rdr09

Quote:


> Originally Posted by *battleaxe*
> 
> No, the 290 is on a 2500k. Have a 390x and 290x on the 2700k together.


Nice.


----------



## mfknjadagr8

Quote:


> Originally Posted by *ILOVEPOTtery*
> 
> Sorry man, didn't realize the guy was a shyster. Good to know, since I was considering the same block from them.


it's an ongoing thing...there's about three threads here about it and it's sad really...some are saying they are getting things with the phone-in orders but it's risky given the circumstances...when a web-based company doesn't do business on the web I don't usually do business with them at all, it's a bad sign imo


----------



## Jflisk

Quote:


> Originally Posted by *ILOVEPOTtery*
> 
> Frozen CPU has 'em in stock.
> I'll try that out, thank you!


Try
http://www.aquatuning.us/

They are in Germany but ship quickly to the US. I have used them multiple times with no problems.


----------



## Vellinious

Quote:


> Originally Posted by *Jflisk*
> 
> Try
> http://www.aquatuning.us/
> 
> They are in Germany but ship quick to the US . I have used them multiple times with no problems.


Yeah, I checked them already. All they have is the black edition. = /

Think I'm just going to wait until Pascal / Arctic Island releases to upgrade....I had an itch, but, not being able to find the same block I have for my other card pretty much killed it. lol


----------



## Jflisk

Quote:


> Originally Posted by *ILOVEPOTtery*
> 
> Sorry man, didn't realize the guy was a shyster. Good to know, since I was considering the same block from them.


I would try modmymods.com, see if they can get them for you. I have not ordered from FCPU since the whole debacle with their employees, who happen to now own modmymods.com. Good luck.


----------



## JourneymanMike

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ILOVEPOTtery*
> 
> Sorry man, didn't realize the guy was a shyster. Good to know, since I was considering the same block from them.
> 
> 
> 
> it's an ongoing thing...there's about three threads here about it and it's sad really...some are saying they are getting things with the phone-in orders but it's risky given the circumstances...when a web-based company doesn't do business on the web I don't usually do business with them at all, it's a bad sign imo

I've ordered from them, twice, since the big shutdown...

Got the stuff, no problem...

They are more expensive than most places, and the shipping is UPS only!









One thing is that they have many things that other places don't have, or many of the items that are not being made anymore...

Lots of EK spare parts!









I talked to them 4 months ago, they said they'd be online in a week! NOT!!!


----------



## Cyber Locc

Quote:


> Originally Posted by *Jflisk*
> 
> I would try modmymods.com see if they can get them for you. I have not ordered from FCPU since the whole debacle with their employees who happen to now own modmymods.com. Good luck


He said he already tried Mod My Mods, and I also wouldn't hold my breath on them getting much high-end branded stuff. They decided to carry Barrow and commit water cooling shop suicide; EK, Bitspower, and Monsoon won't deal with them now. I haven't seen those companies state that about MMM precisely, but they have made comments in the past that if you carry Barrow you lose the ability to carry their parts.

The watercooling makers stick together, and if you start dealing with people like Barrow you get blacklisted lol. Especially seeing how they were FCPU and now they carry Barrow, well, that's 2 strikes.


----------



## Jflisk

Quote:


> Originally Posted by *Cyber Locc*
> 
> He said he already tried Mod My Mods, and I also wouldn't hold my breath on them getting much high-end branded stuff. They decided to carry Barrow and commit water cooling shop suicide; EK, Bitspower, and Monsoon won't deal with them now. I haven't seen those companies state that about MMM precisely, but they have made comments in the past that if you carry Barrow you lose the ability to carry their parts.
> 
> The watercooling makers stick together, and if you start dealing with people like Barrow you get blacklisted lol. Especially seeing how they were FCPU and now they carry Barrow, well, that's 2 strikes.


I had to go look up what Barrow was; those parts look awfully familiar. I also gave him Aquatuning - just trying to help him out.


----------



## Cyber Locc

Quote:


> Originally Posted by *Jflisk*
> 
> I had to go look up what Barrow was those parts look awfully familiar. I also gave him aquatunning - Just trying to help him out.


They do, don't they lol. Barrow steals designs from other watercooling makers. I am not talking Thermaltake-style, steal a design and change it a bit and call it your own; they straight up make it exactly like the other, 100% the same. At first they just did it to BP, now they have moved on and copied some AQ designs and most recently Frozen Q's reservoir; they even reference it as Frozen Q! They are a straight 110% knockoff, thieving company.







.

People buy it and I don't hate on them; they want to save money or don't like the Bitspower designs all over. They are much cheaper than Bits due to the fact that they steal all their designs; they have no overhead for R&D, testing, etc., just steal and produce. So the water cooling companies stick up for each other, as do the US suppliers; that is why no one else carries Barrow, before you had to get them on eBay or AliExpress. So ya, MMM carrying them IMO is wrong; that company is poison to our industry and anyone that supports them like that is the same.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Cyber Locc*
> 
> They do, don't they, lol. Barrow steals designs from other water-cooling makers. I am not talking Thermaltake-style, where you steal a design, change it a bit and call it your own; they straight up make it exactly like the original, 100% the same. At first they just did it to BP; now they have moved on and copied some AQ designs and, most recently, Frozen Q's reservoir. They even reference it as Frozen Q! They are a straight 110% knockoff Chinese thieving company.
> 
> 
> 
> 
> 
> 
> 
> 
> People buy it and I don't hate on them; they want to save money, or don't like the Bitspower designs all over everything. They are much cheaper than Bits because they steal all their designs: no overhead for R&D, testing, etc., just steal and produce. So the water-cooling companies stick up for each other, as do the US suppliers; that is why no one else carries Barrow, and before, you had to get them on eBay or AliExpress. So yeah, MMM carrying them is, IMO, wrong. That company is poison to our industry, and anyone that supports them like that is the same.


I surmise that over time MMM will stop carrying Barrow once it gets off the ground a bit... gotta have capital to afford a stock of the higher-end parts. If not, they may well have messed up. The fact that they are presumably the part of FCPU that left because of the service to the customers is a positive, IMO...


----------



## Cyber Locc

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I surmise that over time MMM will stop carrying Barrow once it gets off the ground a bit... gotta have capital to afford a stock of the higher-end parts. If not, they may well have messed up. The fact that they are presumably the part of FCPU that left because of the service to the customers is a positive, IMO...


I agree it's a positive for us; however, to a maker that is owed a lot of money by FCPU, that may not be the case. I saw EK talk to MMM on Twitter about carrying their parts back in September of 2015; two weeks later they started carrying Barrow, and I haven't seen anything more about it.

Like I said, none of those companies have said specifically that MMM is cut off, that I have seen; it has just been referenced before that they do not support companies that carry Barrow.

Also, AFAIK it had nothing to do with service to customers; it had to do with their boss throwing a party in the shop, and problems they had with him personally being an addict. At least that is what was referenced when they blasted the owner and his addiction on the internet, which honestly also goes to show their moral compass and business sense isn't the best.

However, I do hope that MMM succeeds; we need more options. I am just not sure they will.


----------



## Vellinious

This card is godlike...I love it.

1307 / 1836

Full run, no tess tweaks.



http://www.3dmark.com/fs/7701597


----------



## mus1mus

You guys are putting on a show now. I can no longer compete.


----------



## jodybdesigns

I think you should take a look around on AliExpress or Taobao before you fully judge Barrow. There are now like 5 different brands making the same fittings, look and all.

Unfortunately it isn't going to stop; they are making a name for themselves whether we like it or not. TONS of people are using Barrow... including myself. And I will continue to use them as long as they are selling $1 fittings for $3.50 and not trying to charge me $11 a fitting. The cost to make one of these tiny fittings is probably $1, and that is pushing it.

Why do I have to pay $11 a fitting? A name? So you are saying when I go to the grocery store and I buy Food Club Green Beans and not Del Monte Green Beans, I am doing something wrong? Green Beans is green beans, and so is my fittings.


----------



## Vellinious

Quote:


> Originally Posted by *jodybdesigns*
> 
> I think you should take a look around on AliExpress or Taobao before you fully judge Barrow. There are now like 5 different brands making the same fittings, look and all.
> 
> Unfortunately it isn't going to stop; they are making a name for themselves whether we like it or not. TONS of people are using Barrow... including myself. And I will continue to use them as long as they are selling $1 fittings for $3.50 and not trying to charge me $11 a fitting. The cost to make one of these tiny fittings is probably $1, and that is pushing it.
> 
> Why do I have to pay $11 a fitting? A name? So you are saying when I go to the grocery store and I buy Food Club Green Beans and not Del Monte Green Beans, I am doing something wrong? Green Beans is are green beans, and so is are my fittings.


There...I fixed it for you

The difference is, some of those companies spent the money on R&D to make sure their designs work properly, and on the quality of the construction and the materials that go into making them. When I consider how much I put into my loop, I'm not going to skimp on fittings just to save myself $50, especially when they're protecting a couple thousand dollars in hardware. If you choose to... that's on you.

And the difference between your $1 can of green beans and the $3 can of green beans is the quality. Store-brand products are made by the same people that make Jolly Green Giant and Del Monte, but they use the lower-quality vegetables, and cuttings from the good ones, to put in the cans. You're eating leftovers... same with any store-brand product. I worked at CAG for a number of years... I know exactly what they do. lol


----------



## jodybdesigns

Quote:


> Originally Posted by *Vellinious*
> 
> The difference is, some of those companies spent the money on R&D to make sure their designs work properly, and on the quality of the construction and the materials that go into making them. When I consider how much I put into my loop, I'm not going to skimp on fittings just to save myself $50, especially when they're protecting a couple thousand dollars in hardware. If you choose to... that's on you.
> 
> And the difference between your $1 can of green beans and the $3 can of green beans is the quality. Store-brand products are made by the same people that make Jolly Green Giant and Del Monte, but they use the lower-quality vegetables, and cuttings from the good ones, to put in the cans. You're eating leftovers... same with any store-brand product. I worked at CAG for a number of years... I know exactly what they do. lol


I am pretty sure ALL these fittings are made from "recycled" materials, the same quality as those lower-quality green beans I'm eating. But you paid $11, and I paid $3.50.


----------



## Vellinious

Quote:


> Originally Posted by *jodybdesigns*
> 
> I am pretty sure ALL these fittings are made from "recycled" materials, the same quality as those lower-quality green beans I'm eating. But you paid $11, and I paid $3.50.


Enjoy your leftovers


----------



## jodybdesigns

Quote:


> Originally Posted by *Vellinious*
> 
> Enjoy your leftovers


Heh, I will


----------



## jodybdesigns

Can't fix when you eat your own words either.


----------



## Vellinious

Man, that was brutal. Good one.


----------



## jodybdesigns

Eh, prefer not to rage about it. We are all here for the same reasons, for help and to help others. Let's do that big bro


----------



## Vellinious

I can agree to disagree. lol

It's all good


----------



## Cyber Locc

Quote:


> Originally Posted by *jodybdesigns*
> 
> I think you should take a look around on AliExpress or Taobao before you fully judge Barrow. There are now like 5 different brands making the same fittings, look and all.
> 
> Unfortunately it isn't going to stop; they are making a name for themselves whether we like it or not. TONS of people are using Barrow... including myself. And I will continue to use them as long as they are selling $1 fittings for $3.50 and not trying to charge me $11 a fitting. The cost to make one of these tiny fittings is probably $1, and that is pushing it.
> 
> Why do I have to pay $11 a fitting? A name? So you are saying when I go to the grocery store and I buy Food Club Green Beans and not Del Monte Green Beans, I am doing something wrong? Green Beans is green beans, and so is my fittings.


Yep, and if everyone hopped on the Barrow train, innovation would stop and we would be stuck with the same tired clones and nothing new. You also grossly misrepresent the reasoning in this case: Barrow steals their designs; they do not pay for R&D, they do not innovate, they copy. They are most likely a subsidiary of a very large company that clones many products; they have their own factories, and they take cheap ways out, like not changing their tooling when they should, which equals lower quality.

This is why the price is lower. I always love to hear the "they're ripping us off" excuse; as a small business owner myself, I can tell you you could not be more wrong.
The reasons BP is more expensive are:

A. Quality takes money.
B. Innovation takes money.
C. BP is a small company; they do not own their own factories, and even if they do, they're nowhere near as vast as a Chinese cloner's.

As to the price difference, your numbers are skewed; all the Barrows I've seen are more like $5-6 on AliExpress, and the same fittings are $8 on MMM, whereas the Bits version is $10. Also, I addressed this in my post: if you want to use Barrow stuff, that is your business, and I do not care and will not judge you. However, if you want this industry to tell you it's okay to give your money to the thieves, well, you are mistaken.

A user of Barrow trying to save a few bucks doesn't bother me; however, a water-cooling shop that is supposed to be bettering our industry supporting toxic thieves that are playing a part in destroying it, well, that's another matter. I love this hobby and this industry, and I want to see it succeed; if that means I need to spend a few more dollars, so be it. If you do not see how supporting thieves like Barrow hurts our industry, well, then I do not know what to tell you, as it's pretty much common sense.

Also, this is all really irrelevant; you and I do not produce water-cooling parts, and the makers that do, do not support Barrow, for good reason. They have every right to deny anyone the ability to carry their products, and they do; they are actually pretty strict about who they let into the circle, to protect this community.

And you're right, it will continue, but that's okay, because 80% of this community has morals and understands the repercussions of dealing with such companies. If they succeed and take over a large market share, then our companies will leave, and then we can go back to making custom blocks, as the cloners will have nothing to copy.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Cyber Locc*
> 
> Yep, and if everyone hopped on the Barrow train, innovation would stop and we would be stuck with the same tired clones and nothing new. You also grossly misrepresent the reasoning in this case: Barrow steals their designs; they do not pay for R&D, they do not innovate, they copy. They are most likely a subsidiary of a very large company that clones many products; they have their own factories, and they take cheap ways out, like not changing their tooling when they should, which equals lower quality.
> 
> This is why the price is lower. I always love to hear the "they're ripping us off" excuse; as a small business owner myself, I can tell you you could not be more wrong.
> The reasons BP is more expensive are:
> 
> A. Quality takes money.
> B. Innovation takes money.
> C. BP is a small company; they do not own their own factories, and even if they do, they're nowhere near as vast as a Chinese cloner's.
> 
> As to the price difference, your numbers are skewed; all the Barrows I've seen are more like $5-6 on AliExpress, and the same fittings are $8 on MMM, whereas the Bits version is $10. Also, I addressed this in my post: if you want to use Barrow stuff, that is your business, and I do not care and will not judge you. However, if you want this industry to tell you it's okay to give your money to the thieves, well, you are mistaken.
> 
> A user of Barrow trying to save a few bucks doesn't bother me; however, a water-cooling shop that is supposed to be bettering our industry supporting toxic thieves that are playing a part in destroying it, well, that's another matter. I love this hobby and this industry, and I want to see it succeed; if that means I need to spend a few more dollars, so be it. If you do not see how supporting thieves like Barrow hurts our industry, well, then I do not know what to tell you, as it's pretty much common sense.
> 
> Also, this is all really irrelevant; you and I do not produce water-cooling parts, and the makers that do, do not support Barrow, for good reason. They have every right to deny anyone the ability to carry their products, and they do; they are actually pretty strict about who they let into the circle, to protect this community.
> 
> And you're right, it will continue, but that's okay, because 80% of this community has morals and understands the repercussions of dealing with such companies. If they succeed and take over a large market share, then our companies will leave, and then we can go back to making custom blocks, as the cloners will have nothing to copy.


for that matter there are low cost options that don't break the bank and also function well...xspc is a big one


----------



## jodybdesigns

Quote:


> Originally Posted by *Cyber Locc*
> 
> Yep, and if everyone hopped on the Barrow train, innovation would stop and we would be stuck with the same tired clones and nothing new. You also grossly misrepresent the reasoning in this case: Barrow steals their designs; they do not pay for R&D, they do not innovate, they copy. They are most likely a subsidiary of a very large company that clones many products; they have their own factories, and they take cheap ways out, like not changing their tooling when they should, which equals lower quality.
> 
> This is why the price is lower. I always love to hear the "they're ripping us off" excuse; as a small business owner myself, I can tell you you could not be more wrong.
> The reasons BP is more expensive are:
> 
> A. Quality takes money.
> B. Innovation takes money.
> C. BP is a small company; they do not own their own factories, and even if they do, they're nowhere near as vast as a Chinese cloner's.
> 
> As to the price difference, your numbers are skewed; all the Barrows I've seen are more like $5-6 on AliExpress, and the same fittings are $8 on MMM, whereas the Bits version is $10. Also, I addressed this in my post: if you want to use Barrow stuff, that is your business, and I do not care and will not judge you. However, if you want this industry to tell you it's okay to give your money to the thieves, well, you are mistaken.
> 
> A user of Barrow trying to save a few bucks doesn't bother me; however, a water-cooling shop that is supposed to be bettering our industry supporting toxic thieves that are playing a part in destroying it, well, that's another matter. I love this hobby and this industry, and I want to see it succeed; if that means I need to spend a few more dollars, so be it. If you do not see how supporting thieves like Barrow hurts our industry, well, then I do not know what to tell you, as it's pretty much common sense.
> 
> Also, this is all really irrelevant; you and I do not produce water-cooling parts, and the makers that do, do not support Barrow, for good reason. They have every right to deny anyone the ability to carry their products, and they do; they are actually pretty strict about who they let into the circle, to protect this community.
> 
> And you're right, it will continue, but that's okay, because 80% of this community has morals and understands the repercussions of dealing with such companies. If they succeed and take over a large market share, then our companies will leave, and then we can go back to making custom blocks, as the cloners will have nothing to copy.


Well played sir









I agree with this. There are hobbyists (me) and enthusiasts (mostly you guys) when it comes to water cooling. It's a really neat thing to do, it's unique, and quite frankly, it's beautiful to look at. I have never heard so many Oooh's and Ahhh's about anything. I understand why the long-time enthusiasts would stand by a long-term product. But man, I am just a hobbyist. I don't want to spend a ton of money. I would LOVE to spend a ton of money, but the budget is tight, and I would rather spend my money on my little girl.

Now, if we were getting a nice server to host some websites and you wanted to know what the best database for your site would be, then I could see me being on the enthusiast side of that argument. But this is the other side of the fence, and I am new(ish) to water cooling.


----------



## Cyber Locc

Quote:


> Originally Posted by *mfknjadagr8*
> 
> for that matter there are low cost options that don't break the bank and also function well...xspc is a big one


Agreed








Quote:


> Originally Posted by *jodybdesigns*
> 
> Well played sir
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I agree with this. There are hobbyists (me) and enthusiasts (mostly you guys) when it comes to water cooling. It's a really neat thing to do, it's unique, and quite frankly, it's beautiful to look at. I have never heard so many Oooh's and Ahhh's about anything. I understand why the long-time enthusiasts would stand by a long-term product. But man, I am just a hobbyist. I don't want to spend a ton of money. I would LOVE to spend a ton of money, but the budget is tight, and I would rather spend my money on my little girl.
> 
> Now, if we were getting a nice server to host some websites and you wanted to know what the best database for your site would be, then I could see me being on the enthusiast side of that argument. But this is the other side of the fence, and I am new(ish) to water cooling.


I agree; that's why I said in the beginning, I don't judge users of the products who are new to the hobby or don't want the logos. Honestly, I wasn't overly upset at Barrow for making the fittings at first, as BP does like to put lots of logos on things and is very expensive.

However, let's fast forward: now they are cloning other companies, ones such as Frozen Q, which is a very small company that is barely getting by. Not only did they clone their product, they took their name and labeled it as such in a misleading way. That is just wrong. The fittings, I mean, there are only so many ways to make a fitting, so that can sort of be written off, but they have gone too far as of late.

To take a design and innovate on it is something else entirely; TT, for instance, kind of copies designs, but they change them and innovate on them. Straight up stealing and changing nothing just isn't right. Like your title says, you are a database designer, so put yourself in those shoes: imagine you spent a ton of time designing and perfecting a database, and someone just blatantly ripped off your code and sold it as their own for cheaper.


----------



## jodybdesigns

Quote:


> Originally Posted by *Cyber Locc*
> 
> Agreed
> 
> 
> 
> 
> 
> 
> 
> 
> I agree; that's why I said in the beginning, I don't judge users of the products who are new to the hobby or don't want the logos. Honestly, I wasn't overly upset at Barrow for making the fittings at first, as BP does like to put lots of logos on things and is very expensive.
> 
> However, let's fast forward: now they are cloning other companies, ones such as Frozen Q, which is a very small company that is barely getting by. Not only did they clone their product, they took their name and labeled it as such in a misleading way. That is just wrong. The fittings, I mean, there are only so many ways to make a fitting, so that can sort of be written off, but they have gone too far as of late.
> 
> To take a design and innovate on it is something else entirely; TT, for instance, kind of copies designs, but they change them and innovate on them. Straight up stealing and changing nothing just isn't right. Like your title says, you are a database designer, so put yourself in those shoes: imagine you spent a ton of time designing and perfecting a database, and someone just blatantly ripped off your code and sold it as their own for cheaper.


Yeah, I have seen those T-Virus rezzies they came out with. I have also seen the arguments about it, and I do agree that is wrong. There are so many awesome ways you could design a rezzie. It makes you wonder if Barrow happened to get their hands on a mold, because I would imagine that rezzie would take some serious work to put together.

But on another note, I created many databases for a company over 3 years (no longer employed there, btw) that I know for a fact are being used (and abused) to this day. That was the choice I made when I signed up for the job. It is what it is; just create something new and fresh that will blow people away and move on...


----------



## JourneymanMike

_









*Is this still the [Official] AMD R9 290X / 290 Owners Club? Or has it changed into the "I like, or don't like, Barrow fittings" Club?*

Just for grins: I have Bitspower Black Sparkle fittings and Koolance QDCs...


----------



## Cyber Locc

Quote:


> Originally Posted by *JourneymanMike*
> 
> 
> 
> 
> 
> 
> 
> 
> _
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Is this still the [Official] AMD R9 290X / 290 Owners Club? Or has it changed into the "I like, or don't like, Barrow fittings" Club?*
> 
> Just for grins: I have Bitspower Black Sparkle fittings and Koolance QDCs...


Actually, we were talking about MMM more so; and then Barrow as a company, not their fittings, but yeah, only as it related to MMM.








And seeing how MMM sells 290 blocks, which is how all this started, it's pretty on topic, though it did veer a bit.


----------



## Roboyto

Quote:


> Originally Posted by *mus1mus*
> 
> How did you go with the modded bios btw?


The best run I've achieved is with the 1250 RAM timings across the board. Flashing any of the BIOSes with altered RAM timings dropped the peak core speed I could hit to 1285, for whatever reason. The 'Magic BIOS' didn't seem to help much, as I couldn't beat the 1250 score. I may have to do a little more fiddling around, but the black-screen lockups got old after ~4 hours of benching, with most requiring a hard reset.

This is 1285/1749, since getting into the 1750 strap would cause a lockup.

16600 Graphics Score with TESS OFF. That's a 510-point bump from my best run with the stock BIOS.

http://www.3dmark.com/3dm/10982636?


----------



## mus1mus

Weird. 1750 shouldn't be an issue. I might have missed something.

Try using HIS iTurbo for moar...


----------



## JourneymanMike

Quote:


> Originally Posted by *Cyber Locc*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JourneymanMike*
> 
> 
> 
> 
> 
> 
> 
> 
> _
> 
> 
> 
> 
> 
> 
> 
> 
> 
> *Is this still the [Official] AMD R9 290X / 290 Owners Club? Or has it change into the I like, or don't like, Barrow fittings Club?*
> 
> Just for grins: I have Bitspower Black Sparkle fittings and Koolance QDCs...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Actually, we were talking about MMM more so; and then Barrow as a company, not their fittings, but yeah, only as it related to MMM.
> 
> 
> 
> 
> 
> 
> 
> 
> And seeing how MMM sells 290 blocks, which is how all this started, it's pretty on topic, though it did veer a bit.
Click to expand...

Okey, Dokey...


----------



## KaffieneKing

You guys have probably seen this but worth representing *AMD* here


----------



## mus1mus

Whoa!

Tripping 1250W with 2 Cards!


----------



## Vellinious

Quote:


> Originally Posted by *Roboyto*
> 
> The best run I've achieved is with the 1250 RAM timings across the board. Flashing any of the BIOSes with altered RAM timings dropped the peak core speed I could hit to 1285, for whatever reason. The 'Magic BIOS' didn't seem to help much, as I couldn't beat the 1250 score. I may have to do a little more fiddling around, but the black-screen lockups got old after ~4 hours of benching, with most requiring a hard reset.
> 
> This is 1285/1749, since getting into the 1750 strap would cause a lockup.
> 
> 16600 Graphics Score with TESS OFF. That's a 510-point bump from my best run with the stock BIOS.
> 
> http://www.3dmark.com/3dm/10982636?


That's right up there with mine. Good run, man

EDIT: Sorry, didn't see you were running with Tess off. Still. Good run


----------



## Screener

Hey guys

I had a PowerColor 290 at the end of 2015, and it began suffering chronic black-screen-at-idle problems after about 8 months, I presume due to low power states going too low and causing the card to bomb out. No amount of different BIOSes could sort this, so it was RMA'd, and I was lucky: I got an Asus Matrix Platinum 290X in return.

This has worked fine for months, till recently; sometimes on boot (once every couple of weeks) I get a black screen.









Neither card has had issues in game; it's always at idle.

I have swapped every other component out (I have stacks of high-end gear here), so it's definitely the cards. I am currently on the latest Crimson drivers.

I'm ultra sick of it. Is there any way to disable the power management and just make them run at full clocks all the time?


----------



## Kokin

Quote:


> Originally Posted by *Screener*
> 
> Hey guys
> 
> I had a PowerColor 290 at the end of 2015, and it began suffering chronic black-screen-at-idle problems after about 8 months, I presume due to low power states going too low and causing the card to bomb out. No amount of different BIOSes could sort this, so it was RMA'd, and I was lucky: I got an Asus Matrix Platinum 290X in return.
> 
> This has worked fine for months, till recently; sometimes on boot (once every couple of weeks) I get a black screen.
> 
> Neither card has had issues in game; it's always at idle.
> 
> I have swapped every other component out (I have stacks of high-end gear here), so it's definitely the cards. I am currently on the latest Crimson drivers.
> 
> I'm ultra sick of it. Is there any way to disable the power management and just make them run at full clocks all the time?


Try turning off ULPS. You can do this via Sapphire Trixx or MSI Afterburner. I would also suggest switching motherboard slots or doing a motherboard BIOS reset. I've had 2 or 3 instances in the past where my GPU would cause a black screen during boot up and all it took to fix it was a BIOS reset. It didn't make much sense, but it worked for me and a few others in the past.

My R9 290 has been solid for 2+ years despite having ULPS turned on. I do water-cool my card, so it hardly gets hotter than 40-50°C. If you are maxing out your core temps (95°C), the VRMs may be slowly degrading, as they would be running at 100°C+ at that point.
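For reference, the "Disable ULPS" checkboxes in Trixx and Afterburner work by flipping a registry value. If you'd rather check or set it by hand, the commonly cited value is `EnableUlps` under the display-adapter class subkeys; a hedged sketch of the .reg fragment (the `0000` subkey index is an assumption, yours may be `0001`, `0002`, etc., so identify which subkeys belong to your Radeons first, back up the registry, and reboot afterwards):

```
Windows Registry Editor Version 5.00

[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

Setting it back to `dword:00000001` (or re-checking the box in Afterburner/Trixx) re-enables ULPS.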


----------



## HOMECINEMA-PC

Ohh yeah















Grabbed my first freesync 4k







and its also my 3rd 4k monitor


----------



## jodybdesigns

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Ohh yeah
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Grabbed my first freesync 4k
> 
> 
> 
> 
> 
> 
> 
> and its also my 3rd 4k monitor


Jesus....

I am Lime Green Jello right now son lol


----------



## battleaxe

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Ohh yeah
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Grabbed my first freesync 4k
> 
> 
> 
> 
> 
> 
> 
> and its also my 3rd 4k monitor


Holy balls batman. That is nuts.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *jodybdesigns*
> 
> Jesus....
> 
> I am Lime Green Jello right now son lol
> Quote:
> 
> 
> 
> Originally Posted by *battleaxe*
> 
> Holy balls batman. That is nuts.
Click to expand...

It sure is .

Thanks 290 bro's


----------



## Roboyto

Quote:


> Originally Posted by *Vellinious*
> 
> That's right up there with mine. Good run, man
> 
> EDIT: Sorry, didn't see you were running with Tess off. Still. Good run


I'm still pleased with the results for a setup that hasn't been altered in over 2 years...I can't complain.

I bought Ashes of the Singularity from Steam over the weekend since it was 50% off. I had to see for myself how the card acted with DX12, since they patched in/increased more DX12 'features'.

I will get some graphs/screenies up later (I don't have time currently), but the numbers right now are looking impressive.

My 290 was run at 980/1250 stock clocks and 1200/1500 gaming clocks. The benchmark gives tons of info, especially for DX12, but for the sake of simplicity I will only list the overall 'All Batches Average FPS'.

*'1080' Windowed (1908*1033) - Crazy Graphics Preset*

DX11 Stock: 31.6 FPS

DX11 OC'ed: 36.3 FPS

Overclocking gains 14.87%

DX12 Stock: 35.9 FPS

DX12 OC'ed: 42.6 FPS

Overclocking gains 18.66%

DX12 Gains Stock: 13.61%

DX12 Gains OC'ed:  17.36%

*5760*1080 Eyefinity Fullscreen - High Graphics Preset*

DX11 Stock: 31.3 FPS

DX11 OC'ed: 33.3 FPS

Overclocking Gains 6.39%

DX12 Stock: 43.5 FPS

DX12 OC'ed: 51.4 FPS

Overclocking gains 18.16%

DX12 Gains Stock: 38.98%

DX12 Gains OC'ed: 54.35%

*I know plenty of people are thinking, and I am conscious of it as well, that this is one game, in beta, and it most definitely is not a concrete depiction of what DX12 can/will do for some AMD GPUs.*

However, to take the game from a stuttery, sloppy and unacceptable 31 FPS average, to a smooth and enjoyable 51 FPS average simply by overclocking and clicking 'Launch DX12 Version', is downright fantastic.

The 1080 boost is alright...no one will be angry to get an extra 15%...but a 40% boost in frames from DX12 for 3K resolution?!



Another interesting realization is that not only do you get a solid boost from DX12, but it also makes the OC more effective. 3.79% additional at 1080, and a walloping 11.77% at 3K.



I'm going to run through the bench some more to double check my findings. YMMV obviously, but if anyone has the game and would like to throw up some stats to compare, that would be cool.
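For anyone wanting to sanity-check the percentages above, or drop in their own numbers to compare, they all come from the same simple ratio (new/base - 1). A quick sketch, using the FPS figures quoted in the post:

```python
# Recompute the OC and DX12 gains from the "All Batches Average FPS"
# numbers quoted above (Ashes of the Singularity benchmark).

def pct_gain(base: float, new: float) -> float:
    """Percentage improvement of `new` over `base`."""
    return (new / base - 1) * 100

# (DX11 stock, DX11 OC, DX12 stock, DX12 OC) average FPS per run
runs = {
    "1080 windowed, Crazy preset": (31.6, 36.3, 35.9, 42.6),
    "5760x1080 Eyefinity, High preset": (31.3, 33.3, 43.5, 51.4),
}

for label, (dx11_stock, dx11_oc, dx12_stock, dx12_oc) in runs.items():
    print(label)
    print(f"  OC gain,   DX11:  {pct_gain(dx11_stock, dx11_oc):6.2f}%")
    print(f"  OC gain,   DX12:  {pct_gain(dx12_stock, dx12_oc):6.2f}%")
    print(f"  DX12 gain, stock: {pct_gain(dx11_stock, dx12_stock):6.2f}%")
    print(f"  DX12 gain, OC'd:  {pct_gain(dx11_oc, dx12_oc):6.2f}%")
```

Running it reproduces the figures in the post (14.87 / 18.66 / 13.61 / 17.36 at 1080; 6.39 / 18.16 / 38.98 / 54.35 in Eyefinity), and the "3.79% additional at 1080, 11.77% at 3K" deltas are just 18.66 - 14.87 and 18.16 - 6.39.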


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> It sure is .
> 
> Thanks 290 bro's


Are you using DVI to DP adapters? If you are, brand please. Thanks.

Also, what brand is the 4K Freesync monitor?


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> Are you using DVI to DP adapters? If you are, brand please. Thanks.
> 
> Also, what 4K Freensync brand is it?


Hey bloke, ummm, no adapters here, just HDMI, DP, DVI.

DP (60Hz Freesync) is the middle 4K monitor; it's an AOC U2879VF.



With a 1st-gen Sammy U28D590 and a Philips 288P6L on either side.


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Hey bloke, ummm, no adapters here, just HDMI, DP, DVI.
> 
> DP (60Hz Freesync) is the middle 4K monitor; it's an AOC U2879VF.
> 
> With a 1st-gen Sammy U28D590 and a Philips 288P6L on either side.


Oh that is nice! Thanks m8. +rep.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *rdr09*
> 
> Oh that is nice! Thanks m8. +rep.












12k 4k ............









Oh yeah forgots ..... I finally dids the Windows 10 upgrade and so mucho snappier .

Its fixed all of my W7 start up bugs and glitches .........

considering its my bench OS that is 3 years old LooooooL


----------



## rdr09

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 12k 4k ............
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Oh yeah forgots ..... I finally dids the Windows 10 upgrade and so mucho snappier .
> 
> Its fixed all of my W7 start up bugs and glitches .........
> 
> considering its my bench OS that is 3 years old LooooooL


That's almost as old as our Hawaiis. I'm sticking with 7. I'm skerd.

At least you can keep the rest of your system and just switch to Polaris. Maybe mine can handle one. Our QDCs will prove valuable.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> 22616GOOD_NOOV.zip 98k .zip file
> 
> 
> Shoot 18K


Ok....tell me about tighter memory timings.... Specifically for 1750+


----------



## mus1mus

Where is your new BIOS? We can try new things.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Where is your new BIOS? We can try new things.


The one in the last post is the one I've been using.


----------



## mus1mus

Try

22616GOOD_NOOV.zip 98k .zip file


1312 DPM7
1250 Timings up


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Try
> 
> 22616GOOD_NOOV.zip 98k .zip file
> 
> 
> 1312 DPM7
> 1250 Timings up


Sweet. Thanks, Mus. I'll give it a go tonight.


----------



## Agent Smith1984

Quote:


> Originally Posted by *mus1mus*
> 
> Where is your new BIOS? We can try new things.


You had a chance to take a look at mine yet? MSI 390X.... I guess I'd mainly benefit from getting 1250 straps all the way up the ladder and then seeing how high it will clock from there....
Maybe a little bit of voltage "headroom" wouldn't hurt, but I'm near the end on temps anyway. It would be nice if the vdroop and voltage delivery to the core weren't so inconsistent, because I think that's part of the issue.

Thanks!


----------



## fyzzz

1250 timings in all the upper straps:

MSI390x.zip 98k .zip file


----------



## Agent Smith1984

Quote:


> Originally Posted by *fyzzz*
> 
> 1250 timings in all the upper straps:
> 
> MSI390x.zip 98k .zip file


Awesome! +1


----------



## JourneymanMike

Just screwin' around the last couple of days...

Tried some 390X BIOSes; they did worse than my stock BIOS...









I'd been trying to break the 19000 mark in FS, then it was the 20000, then I hit 21062! All on stock BIOS's (each card has a different BIOS)














http://www.3dmark.com/3dm/11072769?









Anyone have a stable BIOS for a pair of reference 290Xs with Hynix memory that I could try?


----------



## fyzzz

Over 30k on stock BIOS? That is awesome. Weird that you got lower performance with the 390X BIOS; did you try a modded or stock one? I can help with BIOS modifications if you want.


----------



## OneB1t

When you push a 290X to the limit of artifacting, the score increases a lot


----------



## JourneymanMike

Quote:


> Originally Posted by *fyzzz*
> 
> Over 30k on stock bios? That is awesome. Weird that you got lower performance with 390X bios, did you try a modded or stock one? I can help with bios modifications if you want to.


I tried both - stock and modded...

I spent two days trying out different BIOS's...


----------



## JourneymanMike

Quote:


> Originally Posted by *OneB1t*
> 
> when you push 290X to limit of artifacts the score increases alot


No artifacting on this OC (1251 - 1653); anything more and it does artifact, and the scores are about 50 to 75 lower when it artifacts...

I wish I could get more voltage on the GPU, and a higher power limit too...

Is the BIOS a couple posts up for a reference 290X with Hynix memory?


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Try
> 
> 22616GOOD_NOOV.zip 98k .zip file
> 
> 
> 1312 DPM7
> 1250 Timings up

1625 worked well, 1750 crashed the driver from the word go. The voltage seemed to help a tiny bit. It got up to 1.468v max, pushing the voltage offset any higher caused black screens. The minimum was still 1.305v though. Weird.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *JourneymanMike*
> 
> I tried both - stock and modded...
> 
> I spent two days trying out different BIOS's...


Cross flashing is fun until you accidentally flash an 8GB BIOS onto a 4GB card LooooL









Vice a versa


----------



## diggiddi

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Cross flashing is fun until you accidently flash a 8gb bios on to a 4gb card LooooL
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Vice a versa










Huh, it's being done on the regular in here, 290 with 390 BIOS


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *diggiddi*
> 
> 
> 
> 
> 
> 
> 
> 
> Huh its being done on the regular in here, 290 with 390 bios


The last time I cross-flashed an 8GB BIOS onto a 4GB card I bricked it, is what I'm saying









I haven't done that in years anyways


----------



## JourneymanMike

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JourneymanMike*
> 
> I tried both - stock and modded...
> 
> I spent two days trying out different BIOS's...
> 
> 
> 
> Cross flashing is fun until you accidently flash a 8gb bios on to a 4gb card LooooL
> 
> 
> 
> 
> 
> 
> 
> 
> Vice a versa









Uhmmmm? I did that, twice! What I had to do was use the BIOS switch to set the top card to the Uber BIOS...

Then it would boot... I flashed the bottom card. Success!









So now the problem was to flash the top card back to normal.... I slid the motherboard tray out (I have a CaseLabs case; the whole motherboard mount slides out to make a tech station), then I switched the cards in the slots, set the one that was still bricked to the switch-forward position, and flashed that card back to normal... Hell, it worked for me, TWICE!

Let's hope I don't do that one again! It is nice that I have QDC's, so I can take the MB tray out, without spilling water...









Hey, I'm learning! Now I need to learn BIOS editing!


----------



## kizwan

Quote:


> Originally Posted by *JourneymanMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JourneymanMike*
> 
> I tried both - stock and modded...
> 
> I spent two days trying out different BIOS's...
> 
> 
> 
> Cross flashing is fun *until you accidently flash a 8gb bios on to a 4gb card LooooL
> 
> 
> 
> 
> 
> 
> 
> *
> 
> Vice a versa
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Uhmmmm? I did that, twice! What I had to do was use the BIOS switch the top card, to the Uber BIOS...
> 
> Then it would boot... I flashed the bottom card, Success!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So now the problem was to flash the top card back to normal.... I slid the motherboard tray out (I have a CaseLabs case, the whole motherboard mount slides out to make a tech station) Then I switched the cards in the slots, set the one that was still bricked, in the switch forward position, and flashed that card back to normal... Hell it worked for me, TWICE!
> 
> Let's hope I don't do that one again! It is nice that I have QDC's, so I can take the MB tray out, without spilling water...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hey, I'm learning, Now I need to learn how BIOS editing!

You can achieve the same thing by booting with the switch set to the good BIOS (which you did) and, once booted, moving the switch to the bricked BIOS, then flashing.


----------



## JourneymanMike

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JourneymanMike*
> 
> Quote:
> 
> 
> 
> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *JourneymanMike*
> 
> I tried both - stock and modded...
> 
> I spent two days trying out different BIOS's...
> 
> 
> 
> Cross flashing is fun *until you accidently flash a 8gb bios on to a 4gb card LooooL
> 
> 
> 
> 
> 
> 
> 
> *
> 
> Vice a versa
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Uhmmmm? I did that, twice! What I had to do was use the BIOS switch the top card, to the Uber BIOS...
> 
> Then it would boot... I flashed the bottom card, Success!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So now the problem was to flash the top card back to normal.... I slid the motherboard tray out (I have a CaseLabs case, the whole motherboard mount slides out to make a tech station) Then I switched the cards in the slots, set the one that was still bricked, in the switch forward position, and flashed that card back to normal... Hell it worked for me, TWICE!
> 
> Let's hope I don't do that one again! It is nice that I have QDC's, so I can take the MB tray out, without spilling water...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Hey, I'm learning, Now I need to learn how BIOS editing!
> 
> 
> You can achieve the same thing by booting with the switch set to good BIOS (which you did) & once booted, move the switch to brick BIOS, then flash.

Very Good! That will make it a lot easier, WHEN I brick another one!!

+Rep for the time saving tip!


----------



## rdr09

Quote:


> Originally Posted by *JourneymanMike*
> 
> Just screwin' around the last couple of days...
> 
> Tried some 390X BIOS's, did worse than my stock BIOS...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'd been trying to break the 19000 mark in FS, then it was the 20000, then I hit 21062! All on stock BIOS's (each card has a different BIOS)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/3dm/11072769?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anyone have a stable BIOS, for a pair of Reference 290X's w/ Hyninx? memory, that I could try?


Good job!


----------



## c0V3Ro

Old MB, 970, [email protected], 7270.
http://www.3dmark.com/fs/7633998
New MB, 990fx. All stock, 1st run, 8726.
http://www.3dmark.com/fs/7824338


----------



## jodybdesigns

Quote:


> Originally Posted by *c0V3Ro*
> 
> New MB, 990fx. All stock, 1st run.
> http://www.3dmark.com/fs/7824338
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Old MB, 970, [email protected]
> http://www.3dmark.com/fs/7633998


Your graphics score dropped almost 2k points though... That's a lot for a mobo swap.


----------



## Mega Man

I assume (on mobile can't see your pics) different program versions and different drivers


----------



## c0V3Ro

Same drivers. I had to do a fresh Win10 install. But does that justify the extra 2k pts? I tried lots of different BIOS and driver settings just to reach 7k pts at 4.7GHz. Maybe the old GA-970A-UD3P was already tired. Now I think it's worth the effort to run benches at 5GHz+


----------



## Agent Smith1984

Okay, is anybody able to flash a BIOS from within Windows?

I did it all the time on my Fury, but every time I try to do it on my 390x, it tells me the file "msi390x.rom" is not found.

hmmmmm


----------



## YellowBlackGod

The new driver is out, 16.3 Version.

http://www.overclock.net/t/1594113/amd-radeon-software-crimson-16-3-out-now

Has anybody tried it? Is the power efficiency feature available for us or not? It must be buggy: the 7970 has it, the Nano does not, and the 290X does not, according to user reports in that thread.


----------



## rdr09

Quote:


> Originally Posted by *YellowBlackGod*
> 
> The new driver is out, 16.3 Version.
> 
> http://www.overclock.net/t/1594113/amd-radeon-software-crimson-16-3-out-now
> 
> Has anybody tried it? Is there the power efficiency feature for us available or not? It must be buggy, 7970 has it, nano does not, 290x does not, according to user reports from the specific thread.


I don't see it. Not that I need it.



There is an explanation for it, though.


----------



## Offler

Hello guys.

More than a year ago I decided to do stress tests and 3DMark benches on my system. Results:
http://www.overclock.net/t/1406832/single-gpu-fire-strike-top-30/1200#post_22936403

The goals back then were a 3DMark score over 10k and a graphics score over 13k.

Questions:
1. We now have Crimson driver 16.3, which is in a completely different league compared to 14.9. I expect better results than a year ago.
I am mainly interested in the graphics score; the new goal is above 14k.

2. The GPU works great with a lapped stock cooler (if you wear earplugs).
The chip is most likely a win in the silicon lottery (ASIC over 80), and the overall card build from Sapphire is solid. All I did with the card was lap the bumps on the vapour chamber. What kind of modifications do you recommend?

I can choose between two types of coolers: either the Arctic Accelero Xtreme IV, or the watercooled Arctic Accelero Hybrid III.

For practical reasons I cannot allow more than 1 slot for cooling, so the closed-loop water looks like the way to go, even though I am really not a fan of any watercooling.

3. Does anyone even run a Phenom II X6 1090T?
I was surprised that nobody got a better result for this combination of graphics card and CPU. Yes, it's an old CPU, but I really don't see that as an issue


----------



## diggiddi

Quote:


> Originally Posted by *Offler*
> 
> Hello guys.
> 
> More than a year ago I decided to do stress tests and 3dmark benches on my system. Results:
> http://www.overclock.net/t/1406832/single-gpu-fire-strike-top-30/1200#post_22936403
> 
> Goals for the days were 3dmark score over 10k, and graphics score over 13k.
> 
> Questions:
> 1. We have now Crimson Driver 16.3 which is a completely different league compared to 14.9. I expect better results as more than a year ago.
> I am interested mainly in Graphics score, while the new goal is above 14k.
> 
> 2. the GPU works great for a lapped stock cooler (if you wear earplugs)
> The chip is most likely a win in a silicon lottery (asic over 80), and overall card build from Sapphire. All I did with the card is just to lap the bumps on the vapour chamber. What kind of modifications you recommend?
> 
> I can choose between two types of coolers. Either Arctic Accelero Xtreme IV, or a watercoolign Arctic Accelero Hybrid III.
> 
> Due practical reasons I cannot allow more than 1 slot for cooling, so the closed loop water looks a way even when I am really not a fan of any watercooling.
> 
> 3. Does anyone even run Phenom II x6 1090t?
> I was suprised that nobody got better result for combination of a graphics and CPU. Yes, its an old CPU, but I really dont see that an issue


1. Yes, you should see some improvement with the new Crimson; if not 16.3, try the WHQL.

2. Another option would be the Corsair HG10 with a CLC.

3. The X6 1090T is still a decent CPU, especially at or above 4GHz.


----------



## cephelix

Can't really say much about the Arctic hybrids; I just put CLU between the die and heatsink, which brought temps down quite a bit.


----------



## i2CY

Agh,
Fresh install, was working last night (6hrs ago).
Card 2 not running: detected, but not giving feedback in GPU-Z or MSI.
Sapphire R9 290.
Reinstalled the driver, with no luck.

Thoughts....

FYI - 390 full mod BIOS installed.

Card one is working fine.

----
FYI Windows 10 did just roll out an update, but I can't say that this is the real issue that caused me to have to fresh install.
----
Ah,
could this be the issue?
2016-03-11 09:40:36, Info CSI 000053d4 [SR] Repairing corrupted file [l:23 ml:24]"\??\C:\WINDOWS\SysWOW64"\[l:10]"opencl.dll" from store
2016-03-11 09:40:36, Info CSI 000053d8 [SR] Repair complete

ARG!!!
Decided to finally wake up and start pulling its weight, agh!


----------



## Offler

Quote:


> Originally Posted by *diggiddi*
> 
> 1. Yes you should see some improvements with the new Crimson if not 16.3 try the WHQL
> 
> 2. Another option would be the Corsair HG10 with a CLC
> 
> 3. X6 1090T is stil a decent cpu especially greater than or equal to 4ghz


How important is the physics score in 3DMark? I noticed that Intel CPUs from the i7 series onward have a really decent score here; the Core 2 generation was worse than my current Phenom.

However, my particular CPU can't do 3900+ MHz stable. I even recently downclocked it by 100MHz, until I realized that the BSOD trouble could be caused by conflicting versions of Adobe Flash (trying to access 3D acceleration capabilities). To be more specific, my system should be able to pass 10-15 minutes of LinX/OCCT stress testing at ambient temperatures around 35 degrees Celsius. Above 3900 under heavy load it becomes too temperature sensitive.

This is true for graphics cards as well. 84 degrees under normal GPU load is a bit too much, considering that summer isn't here yet.

So I tightened RAM latency instead. It indirectly increases performance of the whole system (about 5% in LinX).


----------



## jodybdesigns

Quote:


> Originally Posted by *Offler*
> 
> How much importaint is physics score in 3d mark? I noticed that CPUs from intel from series I7 and later have really decent score here, Core2 generation was worse as my current Phenom.
> 
> However my particular CPU cant do 3900+ Mhz stable. I even recently downclocked it by 100Mhz, until I realized that BSOD trouble could be caused by conflicting versions of Adobe flash (trying to access 3d acceleration capabilities). To be more specific, my system should be able to pass 10-15 minutes of LinX/OCCT stress testing, in ambient temperatures around 35 degrees of celsius. Above 3900 under heavy load its becoming too temperature sensitive.
> 
> This is ttrue for graphic cards as well. 84 degrees under normal GPU load is bit too much, considering that summer isnt here yet.
> 
> So I rather tigthened RAM latency. It indirectly increases performance of whole system (about 5% in LinX).


Lots of people ignore the physics score and look directly at the graphics score. In benchmarking, the 1090T is going to fall far, far behind Intel. But it will compare nicely to the FX series, if not pass it on most occasions. 3DMark relies on a strong single core for the most part, and AMD just can't offer that performance. I hope that changes with Zen. But look at the track record: the APU was hyped up before release, and then it was terrible. Just like Bulldozer and Vishera. Overhyped. And I love AMD, I just don't like the current CEO, she is kind of a dimwit if you actually pay attention to her.


----------



## c0V3Ro

VGA @ stock. FX-8350 @ 5GHz, 16GB @ 800MHz, SSD.
5636, Crimson 15.12
http://www.3dmark.com/fs/7855636
5611, Crimson 16.1
http://www.3dmark.com/fs/7838857
5531, 16.3 with same specs.
http://www.3dmark.com/fs/7856187
Weird, huh?
Used DDU after uninstall.
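Those three runs do look like a small but consistent regression. A quick way to turn raw Firestrike totals into percent deltas (scores are the three runs linked above; the little helper itself is just an illustration, not part of any benchmark tool):

```python
# Compare Firestrike totals across driver versions as percent deltas.
def pct_delta(baseline, score):
    """Percent change of `score` relative to `baseline`."""
    return (score - baseline) / baseline * 100.0

# Totals from the three runs posted above
runs = {"15.12": 5636, "16.1": 5611, "16.3": 5531}
base = runs["15.12"]
for drv, score in runs.items():
    print(f"Crimson {drv}: {score} ({pct_delta(base, score):+.1f}% vs 15.12)")
# 16.3 comes out about -1.9% vs 15.12 -- small, but outside run-to-run noise
```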


----------



## Roboyto

Quote:


> Originally Posted by *Offler*
> 
> Hello guys.
> 
> More than a year ago I decided to do stress tests and 3dmark benches on my system. Results:
> http://www.overclock.net/t/1406832/single-gpu-fire-strike-top-30/1200#post_22936403
> 
> Goals for the days were 3dmark score over 10k, and graphics score over 13k.
> 
> Questions:
> 1. We have now Crimson Driver 16.3 which is a completely different league compared to 14.9. I expect better results as more than a year ago.
> I am interested mainly in Graphics score, while the new goal is above 14k.
> 
> 2. the GPU works great for a lapped stock cooler (if you wear earplugs)
> The chip is most likely a win in a silicon lottery (asic over 80), and overall card build from Sapphire. All I did with the card is just to lap the bumps on the vapour chamber. What kind of modifications you recommend?
> 
> I can choose between two types of coolers. Either Arctic Accelero Xtreme IV, or a watercoolign Arctic Accelero Hybrid III.
> 
> Due practical reasons I cannot allow more than 1 slot for cooling, so the closed loop water looks a way even when I am really not a fan of any watercooling.
> 
> 3. Does anyone even run Phenom II x6 1090t?
> I was suprised that nobody got better result for combination of a graphics and CPU. Yes, its an old CPU, but I really dont see that an issue


Run the new Crimson, you may see some slight improvements in scores but don't expect it to be a giant leap.

Due to the heavy tessellation in Firestrike it is perfectly acceptable to disable it through CCC. You should see around a 7% increase in your score just from that, which would put you very close to 14k already.

I got 14990 graphics score at 1310/1725: http://www.3dmark.com/3dm/10883028

And then 16090 with tessellation forced off: http://www.3dmark.com/3dm/10883300

Both of these were with Crimson 16.2 IIRC

And then if you want to experiment with some BIOS flashing then you may be able to get a little extra performance.

1285/1749 got me 16600 graphics score: http://www.3dmark.com/3dm/10982636

This performance bump is from VRAM timings from the stock 1250 clock being applied to every VRAM clock speed up to 1875.

@gupsterg or @mus1mus could help you with that if you were interested.

An AIO setup will certainly help with your core temperatures, but may leave VRM1 a little on the toasty side if you plan on running excessive amounts of voltage.

The Arctic Hybrid kit is alright, but you may want to consider the HG10 as @diggiddi suggested. Your other option is the Kraken AIO setup with the Gelid 290X VRM kit. Regardless of which of these you choose, you will want to grab some good thermal pads for VRM1. Fujipoly Ultra Extreme is the only way to go if you want maximum performance.

Info for Fujipoly performance in general: http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures

Some Kraken & Gelid VRM Kit info starts here: http://www.overclock.net/t/1478544/the-build-formerly-known-as-the-devil-inside-a-gaming-htpc-for-my-wife-4770k-corsair-250d-powercolor-r9-290/20_20#post_22214255

Read on a bit through that log and you will find when I swapped in the Fuji pads for some nice temp drops with the Kraken/Gelid solution.
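For what it's worth, the ~7% figure checks out against the two runs linked above. A quick back-of-envelope (graphics scores taken from those links; the 13k projection is only a rough scaling estimate, not a measurement):

```python
# Sanity-check the tessellation-off uplift from the two linked runs.
def uplift_pct(before, after):
    """Percent gain of `after` over `before`."""
    return (after - before) / before * 100.0

with_tess, no_tess = 14990, 16090   # graphics scores from the two runs above
print(f"uplift: {uplift_pct(with_tess, no_tess):.1f}%")   # about 7.3%

# Applying the same ratio to a ~13k graphics score, as a rough projection:
projected = 13000 * no_tess / with_tess
print(f"projected: {projected:.0f}")                      # just short of 14k
```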


----------



## Agent Smith1984

Are you guys able to use winflash or are you having to do a boot flash?

My Fury would flash in windows easily, and I did it quite often, but I can't get my 390X to take any of the roms....


----------



## Vellinious

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Are you guys able to use winflash or are you having to do a boot flash?
> 
> My Fury would flash in windows easily, and I did it quite often, but I can't get my 390X to take any of the roms....


Boot flash.


----------



## Roboyto

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Are you guys able to use winflash or are you having to do a boot flash?
> 
> My Fury would flash in windows easily, and I did it quite often, but I can't get my 390X to take any of the roms....


Boot flash over here.


----------



## gupsterg

Quote:


> Originally Posted by *Roboyto*
> 
> @gupsterg
> or @mus1mus
> could help you with that if you were interested.


OP of the Hawaii BIOS mod thread has the info, plus there is the 390 MC mod for 290/X which may improve performance. It all depends on the individual card. If Offler is stuck I'm happy to help; maybe best if he posts in that thread with his stock ROM attached. I'm also sure other modders will help out with advice as well.

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Are you guys able to use winflash or are you having to do a boot flash?
> 
> My Fury would flash in windows easily, and I did it quite often, but I can't get my 390X to take any of the roms....


I'd be using a pure DOS flash on your 390X.

If you prefer Windows (and have no stability issues) then I'd use a winflash taken from an Asus ROM pack (regardless if it's a 290X pack).


----------



## diggiddi

Quote:


> Originally Posted by *Offler*
> 
> How much importaint is physics score in 3d mark? I noticed that CPUs from intel from series I7 and later have really decent score here, Core2 generation was worse as my current Phenom.
> 
> However my particular CPU cant do 3900+ Mhz stable. I even recently downclocked it by 100Mhz, until I realized that BSOD trouble could be caused by conflicting versions of Adobe flash (trying to access 3d acceleration capabilities). To be more specific, my system should be able to pass 10-15 minutes of LinX/OCCT stress testing, in ambient temperatures around 35 degrees of celsius. Above 3900 under heavy load its becoming too temperature sensitive.
> 
> This is ttrue for graphic cards as well. 84 degrees under normal GPU load is bit too much, considering that summer isnt here yet.
> 
> So I rather tigthened RAM latency. It indirectly increases performance of whole system (about 5% in LinX).


Might want to improve your cooling on the 1090T to get better clocks; a fan on the VRMs and the back of the socket will help a bit.
I think the 14k barrier for Firestrike is somewhere in the 1200MHz/1600MHz region for the card, at least it is for me.
I don't exactly know how much better going from 14.9 to the latest Crimson is; you'll just have to try.


----------



## rdr09

Quote:


> Originally Posted by *Offler*
> 
> How much importaint is physics score in 3d mark? I noticed that CPUs from intel from series I7 and later have really decent score here, Core2 generation was worse as my current Phenom.
> 
> However my particular CPU cant do 3900+ Mhz stable. I even recently downclocked it by 100Mhz, until I realized that BSOD trouble could be caused by conflicting versions of Adobe flash (trying to access 3d acceleration capabilities). To be more specific, my system should be able to pass 10-15 minutes of LinX/OCCT stress testing, in ambient temperatures around 35 degrees of celsius. Above 3900 under heavy load its becoming too temperature sensitive.
> 
> This is ttrue for graphic cards as well. 84 degrees under normal GPU load is bit too much, considering that summer isnt here yet.
> 
> So I rather tigthened RAM latency. It indirectly increases performance of whole system (about 5% in LinX).


Here was a comparison i did when i got my second 290 (used) and tested it on my Phenom on air.

http://www.3dmark.com/compare/fs/3064629/fs/3003367

Pretty much the same driver version, and with the Phenom the graphics score was less than 200 pts lower. I had to OC my Phenom to 4.2GHz to lessen the bottleneck a bit in some games. It did max out the same games as my Intel, though, at 1080p.

EDIT: With 16.3, to hit a 14K graphics score I may need to OC my 290 to 1250 core on my Intel rig. With the Phenom . . . about 1275 core.


----------



## Vellinious

Quote:


> Originally Posted by *rdr09*
> 
> Here was a comparison i did when i got my second 290 (used) and tested it on my Phenom on air.
> 
> http://www.3dmark.com/compare/fs/3064629/fs/3003367
> 
> Pretty much the same driver version and the Phenom scored about less than 200 pts in graphics. I had to oc my Phenom to 4.2GHz in order to lessen the bottleneck a bit in some games. It did max out same games as my intel, though, at 1080.
> 
> EDIT: With 16.3 to hit 14K graphics score, i may need to oc my 290 with my intel rig to 1250 core. With the Phenom . . . about 1275 core.


Man, I love those old Phenoms.....great processors. I still have a 7750BE that I use in a backup rig for XP 32 bit, for some of my old games that just won't play on the new OSs. Good stuff.


----------



## Offler

Bit OT:
I used one rather old benchmark with plain arithmetic tests based on matrix calculations that gave an FPU score. In single-core terms, an Intel Pentium III-S had 430, Core 2 had 650-800 (2833-3400MHz), an i7-965 had 1250, my Phenom II X6 @ 3800 has 1375, and an i7-2700K had 1300 at stock.
Quote:


> Originally Posted by *rdr09*
> 
> Here was a comparison i did when i got my second 290 (used) and tested it on my Phenom on air.
> 
> http://www.3dmark.com/compare/fs/3064629/fs/3003367
> 
> Pretty much the same driver version and the Phenom scored about less than 200 pts in graphics. I had to oc my Phenom to 4.2GHz in order to lessen the bottleneck a bit in some games. It did max out same games as my intel, though, at 1080.
> 
> EDIT: With 16.3 to hit 14K graphics score, i may need to oc my 290 with my intel rig to 1250 core. With the Phenom . . . about 1275 core.


Good comparison

The different CPU caused +2% in both graphics scores and +33% in the physics score, but surprisingly again only 2% in the combined test. In general it says there is an insignificant bump from slightly better CPU performance (1 FPS, 2 percent), a significant bump in the physics test (10 FPS, 33%), but again only 1 FPS / 2 percent in the combined test.


----------



## rdr09

Quote:


> Originally Posted by *Offler*
> 
> Bit OT:
> I used one quite older benchmark which used plain arithmetic tests based on calculations of the matrixes and gave score in FPU. In terms of single core Intel Pentium III-S had 430, Core 2 had 650-800 (2833-3400Mhz), I7-965 had 1250, Phenom II x6 @ 3800 has 1375, and I7-2700k had 1300 on stock.
> Good comparison
> 
> Different CPU, caused +2% in both Graphic Scores, 33% in physics scores, but suprisingly again 2% in combined test. In general it says that there is some insignificant performance bump because of slightly better CPU performance ( 1 FPS, 2 percent), significant bump in physics test (10fps, 33%), but again 1fps 2 percent in combined test.


I did other tests as well, since the hardware is available and I have the luxury of swapping GPUs. Not many have seen a Thuban reach OCs higher than 4GHz. It is around 4.2GHz that it shows a jump in performance. Here was the old Cine with both at 4.5GHz . . .

In FS, if I turn off HT on my i7 at 4.5GHz, its physics score is lower than my Thuban's at 4GHz. The bench loves cores, just like some games. But when a game needs single-core performance, the Thuban falls behind a bit. If only it could run at 4.5GHz stable, I think it could even handle two 290s.

@Vell, i will keep this Phenom even after it is dead. lol


----------



## FLaguy954

Okay, I am losing my mind here. My 290X Lightning is starting to give me the dreaded "black screen" issue. Are there any fixes I should be aware of?

Edit: It might just be my motherboard, I'm going to test and see.


----------



## jodybdesigns

Quote:


> Originally Posted by *FLaguy954*
> 
> Okay, I am losing my mind here. My 290X Lightning is starting to give me the dreaded "black screen" issue. Are there any fixes I should be aware of?
> 
> Edit: It might just be my motherboard, I'm going to test and see.


My 390 was doing this last week. Is it causing you to hard reset? A PSU replacement last weekend cleared up my issue.

*edit* Check your Event Viewer. I was getting Kernel-Power 41 errors. Also, with HWiNFO open, I could sit and count the hardware errors while under load. I have had 0 errors since Monday.


----------



## i2CY

Clean install.
15.7 driver
R9 290
FPS issues in Arma III...
Thoughts?

I should have more than enough power


----------



## OneB1t

Arma 3 is a CPU-overhead mess


----------



## FLaguy954

Quote:


> Originally Posted by *jodybdesigns*
> 
> My 390 was doing this last week. Is it causing you to hard reset? A PSU replacement last weekend cleared up my issue.
> 
> *edit* Check your Event Viewer. I was getting Kernel-Power 41 errors. Also, with HWinfo open, I could set and count the hardware errors while under load. I have had 0 errors since Monday.


It was the new motherboard all along. I guess I am returning it tomorrow morning.


----------



## Offler

My 1st attempt is quite bad...

http://www.3dmark.com/compare/fs/2873251/fs/7876016

The 1st result is from October, the 2nd from today. The old CCC let me adjust two sliders for performance and power. That is not present in the current control panel, and the results clearly follow the "default" settings.

Anyway, I see it as remarkable that the physics score is slightly better. BTW I found out that many files in the system have been updated (including Kernel32.dll), so things can probably run smoother.

Edit:
All tests so far were done on Crimson 16.1

2nd test:
http://www.3dmark.com/3dm/11205921

Much better. This run has the best graphics and combined scores I have had so far.

3rd test:
http://www.3dmark.com/3dm/11206103

Best physics score, but Graphics test 1 had trouble again.

Contrary to my tests from October 2014, there are artifacts in the 1st and 2nd tests.


----------



## jodybdesigns

Quote:


> Originally Posted by *FLaguy954*
> 
> It the new motherboard all along. I guess I am returning it tomorrow morning
> 
> 
> 
> 
> 
> 
> 
> .


At least you didn't fight with it for a week lol. Good you found your issue.


----------



## Forceman

Anyone else with Hitman tested performance difference between DX11 and DX12? I've run a bunch of tests, and am seeing very minimal (1 FPS) difference at 1440p. Looking for some more data.


----------



## serave

I can't get my R9 290 Tri-X to 1100/1400 on stock voltage.

I did raise the core voltage by about 20mv and it still crashes every time I run Valley.

Is there anything I can do regarding this matter? Please send help


----------



## mus1mus

Add more Voltage. Set a more aggressive fan profile or simply back off the clocks.


----------



## serave

That was fast. Also, is it safe if I increase the voltage by about +100mV and start from there?

Most people said Hawaii cards can reach 1100 without raising the core voltage.


----------



## mus1mus

As long as you can keep the card cool.


----------



## cmac68

Quote:


> Originally Posted by *serave*
> 
> That was fast. Also, is it safe if I increase the voltage by about +100mV and start from there?
> 
> Most people said Hawaii cards can reach 1100 without raising the core voltage.


I use Sapphire Trixx 4.9 (the newer version is still buggy for me) for my Tri-X OCs; at stock they are already +25mv. I run my cards 24/7 with +50 on the power slider and +75mv for 1100MHz core and 1500MHz memory.

I also run a custom fan profile; core temps on my top card stay in the low 70's, and the VRM temps are around the same.


----------



## spyshagg

Quote:


> Originally Posted by *serave*
> 
> can't get my R9 290 Tri-X to 1100/1400 on stock voltage
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I did raise the core voltage by about 20mv and it still crashes everytime i ran valley
> 
> is there anything i can do regarding this matter? please send help


1100MHz @ +25mv would be a pretty good card, or a pretty "cool" card. You most likely need to add voltage. Don't go by other examples; each card is different.

For now forget games and benchmarks.

Install OCCT, enable error detection, and increase the tested VRAM to 2GB. Start the benchmark at stock clocks/voltage and let the VRM temperatures stabilize. After the VRM temperature stops increasing, start adding 10MHz until OCCT tells you errors were found. Then increase the voltage by 10mv until the errors stop, and then add another 10MHz. Keep doing this until you find a good balance between MHz and voltage.

Pay very close attention to the VRM temperature. I'm OCCT stable at [email protected] up to a 73ºC VRM temperature. At 74ºC it shows errors, and adding voltage would be counter-productive. If I want to increase my overclock, I need to decrease my VRM temperature first.

After this baseline, play a heavy game (Project Cars, Metal Gear Solid, Assassin's Creed, etc.; don't use benchmarks, they are poor for stability testing) and test if it's stable.

==============

Memory overclocking is a different story. It's really hard to find stable memory clocks. Both my cards pass OCCT at 1580/1610MHz without errors, but fail 10 minutes into Project Cars. At 1500MHz it fails after 2 hours of gameplay.

So, you have to pick a 100% stable baseline speed (mine is 1375MHz) and start increasing the speed every other gaming session. My last 2 gaming sessions were done at 1425MHz, so I think it may be stable. I will keep this speed for the rest of the week. If it passes, I will increase to 1450 and keep playing games.
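The core-clock procedure above is essentially a two-level search: raise the clock in small steps, and when errors appear, add voltage until the run comes back clean. A toy sketch of that loop (the `is_stable` helper and all the numbers are invented stand-ins for a real OCCT error-scan run, not values from this post):

```python
# Hypothetical sketch of the clock/voltage tuning loop described above.
# is_stable(clock_mhz, volts_mv) stands in for a full OCCT error-scan run.

def tune(clock, volts, is_stable, max_mv, clock_step=10, volt_step=10):
    """Raise the core clock in small steps; when errors appear, add voltage
    until the run comes back clean, or stop when out of voltage headroom."""
    while True:
        if is_stable(clock + clock_step, volts):
            clock += clock_step              # clean run: keep the higher clock
            continue
        while not is_stable(clock + clock_step, volts):
            if volts + volt_step > max_mv:
                return clock, volts          # no headroom left: keep last stable point
            volts += volt_step               # errors found: feed it more voltage
        clock += clock_step

# Toy stability model: every +25mV buys ~30MHz above a 1000MHz stock ceiling.
model = lambda clk, mv: clk <= 1000 + 30 * (mv // 25)
print(tune(1000, 0, model, max_mv=100))  # → (1120, 100)
```

In a real session each `is_stable` call is a long OCCT run with the VRM temperature allowed to settle first, which is why the process takes hours rather than seconds.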


----------



## serave

Quote:


> Originally Posted by *mus1mus*
> 
> As long as you can keep the card cool.


Thanks








Quote:


> Originally Posted by *cmac68*
> 
> I use Sapphire Trixx 4.9 (newer version is still buggy for me) for my Tri-X OC's and stock they are already +25mv. I run my cards 24/7 with +50 on the power slider and +75mv voltage for 1100MHz core and 1500MHz memory.
> 
> I also run a custom fan profile and core temps on my top card stays in the low 70's and the VRM temps are around the same.


Quote:


> Originally Posted by *spyshagg*
> 
> 1100mhz @25mv would be a pretty good card or a pretty "cool" card. You most likely need to add voltage. Don't go by other examples, each card is different.
> 
> For now forget games and benchmarks.
> 
> Install OCCT, enable error detection and increase tested vram to 2GB. Start the benchmark at stock clocks/voltage and let the VRM temperatures stabilize. After the VRM temperature stops increasing, start adding 10mhz until occt tells you errors were found. Then increase the voltage by 10mv until the error reporting stops increasing and then add another 10mhz. Keep doing this until you find a good balance between mhz and voltage.
> 
> Pay very close attention to the VRM temperature. I'm OCCT stable at [email protected] up to a 73ºC VRM temperature. At 74ºC it shows errors, and adding voltage would be counter-productive. If I want to increase my overclock, I need to decrease my VRM temperature first.
> 
> After this baseline, enter a heavy game (project cars, metal gear solid, assassins creed etc... dont do benchmarks they suck for stability) and test if its stable.
> 
> ==============
> 
> Memory overclock is a different story. Its really hard to find stable memory clocks. Both my cards pass OCCT at 1580/1610mhz without errors, but fail 10 minutes into Project cars. At 1500mhz it fails after 2 hours of gameplay.
> 
> So, you have to pick a 100% stable baseline speed (mine is 1375mhz) and start increasing the speed every other gaming session. My last 2 gaming sessions were done at 1425mhz, so I think it may be stable. I will keep this speed the rest of the week. If it passes, I will increase to 1450 and keep playing games


Gee, thanks for the detailed answer, friend. I decided to go 1100/1400 @ +45mv; I tried going higher, and I just can't be arsed to deal with the memory clocks (it keeps failing on me just like you said in those last sentences).

At least it's stable in most games I tested (Metro 2033, TR 2013, Witcher 3 and so on).

Didn't know VRM temps that low would affect overclockability though; guess you learn something new every day.


----------



## mus1mus

Forget games and benchmarks, just do OCCT, you said. Then you fail Project Cars after passing OCCT.

Nice concept.


----------



## spyshagg

Quote:


> Originally Posted by *mus1mus*
> 
> Forget Games and Benchmark just DO OCCT, you said. Then you fail Project Cars after passing OCCT.
> 
> Nice concept.


Oi mate, you misread. I separated the GPU overclock from the memory overclock in that post.







Games fail when using OCCT as a reference for the *memory* overclock.

For the GPU overclock, OCCT is almost pinpoint accurate in my experience.


----------



## mus1mus

I'll stick to simpler, more specific testing for GPU OC: gaming with the games you actually play.

Not everyone plays the same assortment of games.

If it works for the games I play, the clocks are stable for me.


----------



## spyshagg

Both work; one is much faster than the other at pinpointing your baseline. To each his own, I guess.

I'll always stick to the notion that if it is unstable, then it is unstable no matter what games one plays. Same as with a CPU OC.

Cheers


----------



## jodybdesigns

Quote:


> Originally Posted by *mus1mus*
> 
> I'll stick to a simpler, more specific testing in terms of GPU OC by gaming with the games you play.
> 
> Not everyone plays the same assortment of games as everybody.
> 
> If it works for the games I play, they're stable for me.


I have 5 saved profiles in Afterburner; it takes 2 seconds to choose one for these types of situations. I run multi-monitor, so my cards won't downclock memory, but Afterburner can.


----------



## Roboyto

Quote:


> Originally Posted by *Forceman*
> 
> Anyone else with Hitman tested performance difference between DX11 and DX12? I've run a bunch of tests, and am seeing very minimal (1 FPS) difference at 1440p. Looking for some more data.


I will be picking it up soon to see what it's about and can let you know.

Dunno if you have Ashes or not, but I saw tremendous improvements in performance switching from DX11 to DX12.


----------



## diggiddi

Quote:


> Originally Posted by *spyshagg*
> 
> Both work, one is much faster than the other to pin point your baseline. To each its own, i guess.
> 
> I'll always stick to the notion that if it is unstable, then it is unstable no matter what games one plays. Same as the cpu oc.
> 
> Cheers


You play Pcars too? Add me on steam, same handle


----------



## asciii

_Removed._


----------



## mus1mus

Quote:


> Originally Posted by *asciii*
> 
> Which ROM would be the best for stable overclocking on a locked core R9 290 with Hynix memory?
> Searched all over, but I've failed to find one that isn't a R9 290X rom.


V1.8 from this thread.


----------



## asciii

_Removed._


----------



## melodystyle2003

Guys, can I use a Corsair H80 AIO to cool the core of a reference R9 290? I mean, will it fit, or will its size cause me issues? Has anyone tried it already?
I also have a VRM cooling kit + RAM copper heatsinks to use with an 80mm fan.


----------



## diggiddi

Quote:


> Originally Posted by *melodystyle2003*
> 
> Guys can i use a corsair H80 aio to cool the core of a ref r9 290? I mean will it fit or its size will cause me issues? Has anyone tried it already?
> I also have vrms cooling kit + ram copper heatsinks to use with a 80mm fan.


Yes, it should work with your GPU; you might consider getting the Corsair HG10:
http://www.tomshardware.com/news/corsair-hg10-water-cooling-gpu,26959.html


----------



## melodystyle2003

@diggiddi thanks for your answer.

Also, I noticed that the RAM speed is always 1250MHz, whether on the desktop or in game, using the latest 16.3 beta. Is this normal while using a 144Hz monitor?


----------



## spyshagg

My card falls back to idle clocks @ 144Hz unless something is moving on the screen (browsing overclock.net, for instance).


----------



## diggiddi

Have you disabled PowerPlay? That could be the reason. Or are you on a dual-monitor setup?


----------



## cephelix

I noticed my memory doesn't downclock either, on any version of the drivers. Doesn't bother me though.

FYI, I'm on a dual-monitor setup.


----------



## Ha-Nocri

Quote:


> Originally Posted by *cephelix*
> 
> I noticed my memory doesn't downclock as well on any version of drivers. Doesn't bother me though.
> 
> FYI on a dual monitor setup


Yep, been happening since Crimson


----------



## rdr09

Vell is crazy.

http://www.3dmark.com/fs/7911442

His card is loco.


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> Vell is crazy.
> 
> http://www.3dmark.com/fs/7911442
> 
> His card is loco.


Should be @fyzzz

Vell is another one though.

My replacement for the MSI ref 290 arrived and is now ready for pick-up.









It grew an X from Taiwan


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> Should be @fyzzz
> 
> Vell is another one though.
> 
> My replacement for the MSI ref 290 arrived and now ready for pick-up.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It grew an X from Taiwan


My bad, it is fy. That's two reaching those crazy numbers. We are waiting for you, mus.


----------



## mus1mus

I decided not to chase those guys anymore.









I am working on the 3X and 2X. Maybe even 4X if power allows.


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> I decided not to chase those guys anymore.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am working for the 3X and 2X. Maybe even 4X if power allows.


4 will cause a power outage in the city. Shoot for a 35K FS graphics score with 2X. Easy for me to say, it ain't my rig. lol

I'll help out in the competition soon, before the deadline. I might do 1150 in crossfire with the 850W.


----------



## melodystyle2003

Quote:


> Originally Posted by *diggiddi*
> 
> Have you disabled powerplay? that could be the reason or are you on dual monitor setup?


Thank you, that was the issue; I had disabled ULPS in MSI AB. I am using a single-monitor setup.


Spoiler: Warning: Spoiler!


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> 4 will cause a power outage in the city. shoot for 35K FS graphics score for 2X. easy for me to say. it aint my rig. lol
> 
> I'll help out in the competition soon before the deadline. i might do 1150 in crossfire with the 850W.


I will. In fact, I just did a round of FS.

GS1 tops past 210 FPS.
GS2 tops 215 FPS.

Not online, so no results.









Funny, as a single card cannot break 100 FPS.









Spoiler: hmm. 178 FPS Average on GS 1 should mean over 35K it seems. Can't wait for the AC to turn ON. :D


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> I will. In fact I just did a round for FS.
> 
> GS 1 tops past 210 FPS.
> GS2 tops 215 FPS.
> 
> Not online so no results.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Funny as a single cannot break 100 FPS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: hmm. 178 FPS Average on GS 1 should mean over 35K it seems. Can't wait for the AC to turn ON. :D


I tried that volt mod you sent me... didn't work, man. That BIOS wouldn't boot up. This is the last good one, I think, without the 390/390X memory timings. I didn't like the way they ran.

OV3MH.zip 98k .zip file


----------



## mus1mus

Hey, I'll try to mess with that again. Later.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> Hey, I'll try to mess with that again. Later.


I appreciate ya. I've downloaded the old drivers...gonna test them, see if they help with the black screen issues.


----------



## mus1mus

No worries, mate. I'll try to help as much as I can. Your card is a monster; I'd love to see it break some barriers.

I'm gonna try a single 290X later. I'm feeling positive I can squeeze some goodies out of it. I think I was too hard tightening the CPU block the last time I installed it, and that I was bending the PCB or creating some bad contacts between the pins and the CPU.

If a dual can break 200+ FPS at 1350/1400... I am excited.


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> I will. In fact I just did a round for FS.
> 
> GS 1 tops past 210 FPS.
> GS2 tops 215 FPS.
> 
> Not online so no results.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Funny as a single cannot break 100 FPS.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Spoiler: hmm. 178 FPS Average on GS 1 should mean over 35K it seems. Can't wait for the AC to turn ON. :D


Ugh, I only get 130 in GS1 with two 290s. You are talking crossfire, right? Saw your last post. Can't wait to see.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> No worries mate. I'll try to help as much as I can. Your card is a monster. I love to see it break some barriers.
> 
> I'm gonna try a single 290X later. I am feeling positive, I can squeeze some goodies off it. I think I have been to hard on tightening the CPU Block last time I installed it. That I was bending the PCB or creating some bad contacts on the pins and the CPU.
> 
> If a dual can break 200+ FPS AT 1350/1400......I am excited.


I'm gonna be selling mine next month, and either wait for Pascal or go with a 980 Ti. Unless something drastic changes, I've gotten about as much as I can out of this card.


----------



## Vellinious

Full run with the WHQL approved drivers on the stock bios. Graphics score: 15918

Couldn't break 16k on the stock bios.

http://www.3dmark.com/fs/7922980


----------



## YellowBlackGod

Quote:


> Originally Posted by *melodystyle2003*
> 
> @diggiddi thanks for your answer.
> 
> Also i noticed that ram speed is always 1250Mhz, in desktop or in game. Using the latest beta 16.3. Is this normal while using an 144Hz monitor?


It should be a 144Hz issue, not the GPU's fault. My 290X's memory idled when I was not playing games back when I had my 60Hz monitor. Now I have a 144Hz monitor, and the GPU RAM always runs at 1400 MHz (5600 MHz effective). When I change the refresh rate to 60Hz, it downclocks and idles normally.


----------



## spyshagg

That doesn't happen to me, even with eyefinity enabled (3 panels).

@ 144Hz my card always defaults to idle clocks if nothing on the screen is moving. If you start moving a window or scrolling a webpage, then yes, my clocks jump to game speeds.


----------



## YellowBlackGod

Hmmmm, I must look into it then. I have the impression that my GPU also runs the memory at full speed when nothing is moving. What should I do to change that? Is this ULPS or PowerPlay related?


----------



## spyshagg

ULPS only affects the second GPU in a crossfire setup. I do know for sure my card only leaves idle clocks when there is big movement on screen or something else pushing the GPU.


----------



## TheReciever

Well, I just hopped into the 290X realm from my old 6970M, and man, you've got to love the performance/price. I bought my MSI Gaming 290X 4GB for $215.

Took off the heatsink, put on 2x 120mm fans, and now have it at +200mv.

1175/1490 is what I have it running at right now...

http://www.3dmark.com/3dm/11202846


----------



## spyshagg

Quote:


> Originally Posted by *TheReciever*
> 
> Well just now hopped into the 290x realm from my old 6970m and man, got to love the performance/price. I bought my MSI Gaming 290x 4gb for 215.
> 
> Took off the heatsink and put on 2x 120mm fans and now have it at +200mv
> 
> 1175/1490 is what I have it running at right now...
> 
> http://www.3dmark.com/3dm/11202846


come to our side, grasshopper. We can make those clocks give you a valid 14500 graphics score.

come, come.
http://www.overclock.net/t/1564219/modded-r9-390x-bios-for-r9-290-290x-updated-02-16-2016/0_50


----------



## chiknnwatrmln

Does the 390/X BIOS give a performance benefit? I thought the cards were exactly the same, minus the amount of VRAM?

I'd be interested in flashing to a 390/X BIOS if it gives a boost; both my cards are very old reference boards. My 290 is not unlockable; my 290X is obviously unlocked.


----------



## Vellinious

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Does the 390/x BIOS give a performance benefit? I thought the cards were the exact same, minus the amount of VRAM?
> 
> I'd be interested in flashing to a 390/x BIOS if it gives a boost, both my cards are very old reference boards. My 290 is not unlockable, my 290x is obviously unlocked.


Didn't do anything on mine. The memory timings are a bit tighter, but the tighter timings made the higher clocks unstable, so it was a wash.


----------



## TheReciever

Quote:


> Originally Posted by *spyshagg*
> 
> come to our side, grasshopper. We can make those clocks give you a valid 14500 graphics score.
> 
> come, come.
> http://www.overclock.net/t/1564219/modded-r9-390x-bios-for-r9-290-290x-updated-02-16-2016/0_50


14500?!

Really?


----------



## spyshagg

Yes. http://www.3dmark.com/fs/7825736


----------



## rdr09

Quote:


> Originally Posted by *TheReciever*
> 
> Well just now hopped into the 290x realm from my old 6970m and man, got to love the performance/price. I bought my MSI Gaming 290x 4gb for 215.
> 
> Took off the heatsink and put on 2x 120mm fans and now have it at +200mv
> 
> 1175/1490 is what I have it running at right now...
> 
> http://www.3dmark.com/3dm/11202846


Try lowering your OC to 1150 core and 1400 memory at a lower voltage, like +175mv or less, and see if it runs. Also, you are using an older driver. Are you scared of updating? lol

At 1150 you should be matching or even surpassing my 1200 core . . .

http://www.3dmark.com/3dm/11169440

I think that was 16.1; 16.3 seems to score a little higher. Watch your temps and keep all of them under 80.

EDIT: Max your power limit to 50.


----------



## TheReciever

Quote:


> Originally Posted by *spyshagg*
> 
> Yes. http://www.3dmark.com/fs/7825736


That's crazy lol. I'll have to look into it when I get some extra time. Thanks for sharing!

Quote:


> Originally Posted by *rdr09*
> 
> Try to lower your oc to 1150 core and 1400 memory at a lower voltage like +175mv or lower. see if it runs. also, you are using an older driver. are you scared of updating? lol
> 
> at 1150 you should be matching or even surpassing my 1200 core . . .
> 
> http://www.3dmark.com/3dm/11169440
> 
> i think that was 16.1. 16.3 seems to give a little higher. watch your temps and keep all of them lower than 80.
> 
> EDIT: Max your power limit to 50.


I was having issues with it holding a GPU clock, and I was told that the newer drivers could be at fault. I went back to 15.11.1 and that seemed to sort out the problem.

Though I can try the new ones; I have no problem with updating, I just figured it might have been Crimson-related.

Right now, since the GPU clock doesn't stick, I have it overclocked 24/7, as it doesn't like changing profiles. When I change the profile (OC vs. stock) I get black screens and have to uninstall the drivers in order to not have it automatically black screen on boot.

From what I read, it's because of the memory controller? Adding juice seems to be the solution.









Thanks for the information and suggestions, guys! The last time I had an AMD GPU was the 4890, way back in the day.


----------



## spyshagg

Where do you change profiles?

Never had that happen but i only use HIS trixx or afterburner


----------



## TheReciever

I used to use MSI Afterburner, but because it wasn't applying power as desired, I stopped using it for changing settings and now only use it for the OSD.

I now use Trixx.


----------



## 241pizza

Owner of a BF4 Edition 290X since the first month it came out.
Replaced the leaf-blower stock cooler.

Picked up an MSI Lightning 290X about a year later.

Stock air cooling: 60-64°C under max load.


----------



## rdr09

Quote:


> Originally Posted by *241pizza*
> 
> 
> 
> Owner of BF4 Edition 290x the first month it came out.
> Replace leaf blower stock cooler,
> 
> Picked up a Msi lightning 290x about a year later.
> 
> Stock Air cooling 60-64c under max load


Just joining now, after three years. lol

We followed a similar path, but mine are 290s. Got my second one for $247 with a full block.

Anyway, this is what I get with tess off at stock . . .

http://www.3dmark.com/3dm/11302839?

Looks about right? About 1000 pts more.


----------



## TheReciever

Finally got the driver downloaded. I'll probably get it running tomorrow sometime and see what the difference is.


----------



## rdr09

Quote:


> Originally Posted by *TheReciever*
> 
> Finally got the driver downloaded, ill probably get it running tomorrow some time and see what the difference is


Yeah! You can uncheck Evolve if you are not going to use it. Also, after installing and just before rebooting, I go to msconfig, under Startup, and uncheck Radeon Settings. I think it helps prevent conflicts with other apps like Trixx. You can always open Radeon Settings like any other app anyway.


----------



## 241pizza

Quote:


> Originally Posted by *rdr09*
> 
> Just joining now after three years. lol
> 
> We followed a similar path but mine are 290s. Got my second one for $247 with a full block.
> 
> Anyways, this is what i get with tess off at stock . . .
> 
> http://www.3dmark.com/3dm/11302839?
> 
> Looks about right? About 1000 pts more.


http://www.3dmark.com/fs/5592722 - tess on

And ya... lol on the 3 years. I had mixed feelings about clubs.


----------



## OneB1t

That's a pretty low score for a 1185MHz OC... you're doing it wrong.


----------



## 241pizza

Quote:


> Originally Posted by *OneB1t*
> 
> That's a pretty low score for a 1185MHz OC... you're doing it wrong.


Not an overly helpful comment.

Suggestions or specifics are always welcome.


----------



## OneB1t

Power limit?


----------



## spyshagg

I think that is a great score for a stock BIOS. Here is a stock-BIOS Firestrike @ 1200 core / 1250 mem: http://www.3dmark.com/fs/7938309


----------



## 241pizza

Quote:


> Originally Posted by *OneB1t*
> 
> powerlimit?


50%, if I recall (it was a long time ago).

I am fairly sure some thermal throttling was happening.

Not a HUGE benchmarker anyhow.


----------



## OneB1t

For the stock BIOS that's OK, but with the 390X BIOS it should do over 14,000.


----------



## 241pizza

Quote:


> Originally Posted by *OneB1t*
> 
> for stock bios its ok but with 390X bios it should do over 14 000


Agreed,

I have been thinking quite a bit about flashing to the 390X BIOS.

After 4 yrs of being a primary caregiver (which is one reason it took 3 yrs to finally submit), I now have time to do some things I have been putting off, like flashing and ....







water









Hope to flash the BIOS this week.


----------



## TheReciever

Updated to 16.3 and got an instant black screen and BSODs :/


----------



## josephimports

Quote:


> Originally Posted by *Agent Smith1984*
> 
> Okay, is anybody able to flash BIOS from winthin windows?
> 
> I did it all the time on my Fury, but every time I try to do it on my 390x, it tells me the file "msi390x.rom" is not found.
> 
> hmmmmm


I just flashed the "290_ELPIDA_MOD_V1.8" rom using Atiflash 2.71 successfully.










Spoiler: Warning: Spoiler!



Create a new folder in c:\ and name it atiflash.

Extract the atiflash zip file into the newly created folder.

Copy the BIOS file you would like to flash into this folder as well.

Run the CMD prompt as admin.

Type cd c:\atiflash

For GPU adapter info, type atiflash -i

To save the GPU BIOS, use the info found with -i. Type atiflash -s (GPU NUM) (BIOS FILE NAME) [BIOS FILE SIZE]

Example: atiflash -s 0 Fiji 40000

To flash a BIOS, type atiflash -p 0 biosname.rom (0 for the top card or 1 for the second card, depending on position)

Example: "atiflash -p 0 Fiji.rom". This process may take several minutes. Restart the PC once complete.

List of commands can be found here
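Collapsed into a single command-prompt session, the steps above look roughly like this (the adapter index `0` and the file names `backup.rom`/`biosname.rom` are examples, not values from the post; substitute your own):

```shell
:: Run from an elevated command prompt (cmd.exe as admin).
mkdir c:\atiflash
cd c:\atiflash
:: List adapters to find your card's index
atiflash -i
:: Back up the current BIOS from adapter 0 before flashing anything
atiflash -s 0 backup.rom
:: Program the new BIOS onto adapter 0, then reboot once it finishes
atiflash -p 0 biosname.rom
```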



Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Does the 390/x BIOS give a performance benefit? I thought the cards were the exact same, minus the amount of VRAM?
> 
> I'd be interested in flashing to a 390/x BIOS if it gives a boost, both my cards are very old reference boards. My 290 is not unlockable, my 290x is obviously unlocked.


There is an increase in performance and a drop in voltage/temps using the "290_ELPIDA_MOD_V1.8" rom on my reference Asus 290.









stock bios


Spoiler: Warning: Spoiler!





http://www.3dmark.com/3dm/11308877



mod bios


Spoiler: Warning: Spoiler!





http://www.3dmark.com/3dm/11309028



Thanks










Spoiler: Warning: Spoiler!



1.) Lard
2.) The Stilt
3.) Plug2k
4.) gupsterg
5.) Synyster Gates
6.) to all others who have participated/contributed


----------



## TheReciever

Looks like it was a botched installation/download.

Re-ran DDU and redownloaded 16.3; so far, I can see the desktop now! lol


----------



## OneB1t

http://postimg.org/image/ewope67kj/full/

16.3 drivers, 290X with the 390X BIOS at 1100/1575 (this one: https://mega.nz/#!dU4AxRLI!zpCi2P-ybfy2ZxkQJ5TM0noSSSv7QoQD1KPvAVt1UVU), no tweaks.
(It's the normal Firestrike; I just disabled the physics and combined tests.)


----------



## TheReciever

Gave up on 16.3.

Booting to black screens is not what I want from my computer right now.

Went back to 15.11.1 since it works...

Uninstalled Trixx and Afterburner; still black screen.

Even when 16.3 did work, it did not hold GPU clocks.


----------



## 241pizza

Noooooooo,
I feel your pain.
Not to rub it in, but I just loaded up 16.3. So far, no issues at all. Afterburner is fine; it actually now shows FPS in Far Cry 4.
No black screens; surprised, really, that it went so well.
I did use DDU.

Gonna try the 390X BIOS next......

edit: it's even downclocking to 300MHz and 150MHz.
Idle temp 32°C, ambient 23°C.

Happy!


----------



## TheReciever

I'm not having any significant issues in the games I currently play, so I'm not too motivated to troubleshoot at the moment.

The only game I play right now is Black Desert Online, but it's more of a passive play; I'm mostly working on school and take a break every so often to path to other quests.


----------



## Forceman

Quote:


> Originally Posted by *TheReciever*
> 
> Gave up on 16.3
> 
> Booting to black screens is not what I desire from my computer right now.
> 
> Went back to 15.11.1 since it works...
> 
> uninstalled trixx, afterburner, still black screen.
> 
> Even when 16.3 did work, it did not hold GPU clocks.


You get the black screen right after Windows tries to go to the desktop (after the login screen)? I've had the same problem with the 16.x drivers. The only way I've found to get past it is to install the drivers, boot into safe mode, delete the video card from the device manager, then reboot. When it reboots and installs the driver automatically, it installs the new 16.x one and then works fine. I really wish I knew how to fix it, it is super annoying (especially since Windows boots so fast it is hard to get to safe mode).


----------



## TheReciever

Quote:


> Originally Posted by *Forceman*
> 
> You get the black screen right after Windows tries to go to the desktop (after the login screen)? I've had the same problem with the 16.x drivers. The only way I've found to get past it is to install the drivers, boot into safe mode, delete the video card from the device manager, then reboot. When it reboots and installs the driver automatically, it installs the new 16.x one and then works fine. I really wish I knew how to fix it, it is super annoying (especially since Windows boots so fast it is hard to get to safe mode).


Yes, precisely. Windows loads up, services begin their rituals, and then: black screen.

Interesting fix; it may be worth a try. I can troubleshoot a lot of things, but a black screen is hard to work with lol. Luckily 15.11.1 works, so I can revert to that to get my productivity back up again.

Do your GPU clocks hold during gaming? The other issue I faced was dynamic clocks, but then I black screened and couldn't begin troubleshooting it.


----------



## serave

Pardon yet another tech-support question.

I'm having some kind of jagged image quality in every game I've played (GTA V, Witcher 3, SoM), and some kind of flickering, _but not really flickering_.
*Driver version, if that helps: Catalyst 15.7*

Here's a video to illustrate what i'm saying:




Things I've tried to resolve this:

1. Switching to other GPUs (only integrated graphics and a friend's 750Ti; I don't have any spare GPUs lying around atm)
2. Using another computer to play the games; the problem still persists

*Also, I've noticed something like this on my R9 290:*

1. Notice how the sunlight goes off and on in this video (can't think of a better way to say that in English)






2. Lines in the bottom right of the screen






The shadows seem to act funny overall.

Is this a sign of a bad card, or just artifacts?

I've done "artifact scanning" tests in programs such as ATI Tool, OCCT, and the Valley benchmark (using my own eyes to look for anything funny), but none of them shows any sign of error. I have high hopes it isn't a bad card; it's hard to find any card that matches the price/performance of an R9 290.









Any replies would be appreciated


----------



## melodystyle2003

I think it looks like a driver issue. Which one do you use? Have you tried the latest WHQL?


----------



## Forceman

Quote:


> Originally Posted by *TheReciever*
> 
> Yes precisely. Windows loads up, services begin their rituals and then black screen.
> 
> Interesting fix, may be worth a try. I can troubleshoot a lot of things but a black screen is hard to work with lol. Luckily 15.11.1 works so I can revert back to that to get my productivity back up again.
> 
> Do your GPU clocks hold during gaming? The other issue I faced was dynamic clocks but then I black screened and couldnt begin troubleshooting it


Yes, clocks are fine. I never had any issues with that, luckily.


----------



## TheReciever

Quote:


> Originally Posted by *Forceman*
> 
> Yes, clocks are fine. I never had any issues with that, luckily.


Ok thanks, Ill probably give it a go tomorrow and see how it works out


----------



## rdr09

Quote:


> Originally Posted by *TheReciever*
> 
> I'm not having any significant issues in games I currently play so I'm too motivated with troubleshooting at the moment.
> 
> The only game I play right now is black desert online but it's more of a passive play as I'm mostly working on school and take a break every so often to path to other quests


I think your card is a good candidate for a bios flash. we have a couple threads dealing with those.

Here is one . . .

http://www.overclock.net/t/1564219/modded-r9-390x-bios-for-r9-290-290x-updated-02-16-2016


----------



## serave

Quote:


> Originally Posted by *melodystyle2003*
> 
> I think it looks like drivers issue. Which one you use? Have you tried the latest whql?


Unlikely; I was using Catalyst 15.7 before and tried Crimson 16.3, and the problem still persists.

I think I'll just live with it, though.


----------



## Arizonian

Quote:


> Originally Posted by *TheReciever*
> 
> Well just now hopped into the 290x realm from my old 6970m and man, got to love the performance/price. I bought my MSI Gaming 290x 4gb for 215.
> 
> Took off the heatsink and put on 2x 120mm fans and now have it at +200mv
> 
> 1175/1490 is what I have it running at right now...
> 
> http://www.3dmark.com/3dm/11202846


Welcome to the red tide - congrats, added.








Quote:


> Originally Posted by *241pizza*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Owner of BF4 Edition 290x the first month it came out.
> Replace leaf blower stock cooler,
> 
> Picked up a Msi lightning 290x about a year later.
> 
> Stock Air cooling 60-64c under max load


Quote:


> Originally Posted by *241pizza*
> 
> http://www.3dmark.com/fs/5592722 - tess on
> 
> and ya .....lol on 3 yrs. had mixed feelings about clubs.


Clubs are probably the best place on OCN, IMO, to get other owners' experiences with the same products.

Congrats, added.

630 members and 900 GPUs to date in the club.


----------



## 241pizza

Don't get me wrong, there are too many good ideas out there; they can get me in trouble. However, clubs are by far the best source of knowledge going!

Clubs tend to make me spend money and try to fix things that aren't always broken... if you know what I mean.









Thanks for having me.

Now on to flashing the BIOS... if I can find a 390X BIOS for a Lightning.


----------



## Paul17041993

Quote:


> Originally Posted by *serave*
> 
> -snip-


A lot of game developers don't implement shader effect mathematics correctly or accurately and don't test them, making them only work with certain hardware/drivers. Skyrim's light shimmering looks normal however, it's just an artefact of low resolution lighting/bloom.

Have you talked to anyone in the support areas of the other games about it? Or do you have anything configured in CCC/Rad.Set.? (texture quality, AF, AA, etc)


----------



## i2CY

Anyone have the original BIOS (Reg/Uber) for a Sapphire R9 290 @ 947MHz?
Need to troubleshoot something.
Thanks


----------



## kizwan

Quote:


> Originally Posted by *i2CY*
> 
> Anyone have orginal Bios: Reg/Uber for a Sapphire R9 290 947mhz?
> Need to trouple shoot something.
> Thanks


Reference card?

https://drive.google.com/file/d/0B_32SYawOggYLXhnSm1malRweG8/view?usp=sharing


----------



## i2CY

Yes, sorry.


----------



## Paul17041993

Quote:


> Originally Posted by *i2CY*
> 
> Anyone have orginal Bios: Reg/Uber for a Sapphire R9 290 947mhz?
> Need to trouple shoot something.
> Thanks


One-stop shop for all GPU BIOSes:
https://www.techpowerup.com/vgabios/index.php?manufacturer=Sapphire&model=R9+290


----------



## serave

Quote:


> Originally Posted by *Paul17041993*
> 
> A lot of game developers don't implement shader effect mathematics correctly or accurately and don't test them, making them only work with certain hardware/drivers. Skyrim's light shimmering looks normal however, it's just an artefact of low resolution lighting/bloom.
> 
> Have you talked to anyone in the support areas of the other games about it? Or do you have anything configured in CCC/Rad.Set.? (texture quality, AF, AA, etc)


It happens in all games. Well, actually, even when I'm watching videos the jagged lines/flickering edges are still there, so I guess it's not from the GPU itself.

Tried my luck with an old GTX 480 and 570, to no avail. I guess I'll just have to suck it up / get a new monitor in the future.


----------



## Paul17041993

Quote:


> Originally Posted by *serave*
> 
> it happens in all games, well actually even when i'm watching videos the jagged lines/flickering edges are still there so i guess it's not from the GPU itself
> 
> tried my luck with an old GTX 480 and 570, no avail, i guess i'll just have to suck it up/get a new monitor in the future


What resolution is the monitor? I also forgot to ask whether the resolution in use is native to the monitor, or if you're using a super resolution; the latter can have some funny grid-alignment effects if the res isn't a direct multiple of the screen's.

You can always try forcing SSAA in selected applications, or globally, if you want really smooth graphics at the minor cost of heavier GPU usage. Even with my 4K screen I still use 2x or 4x SSAA in some games, as it looks more natural and appealing.

The odd Z-fighting and shadow corruption in your GTA5(?) video is something completely different, though.


----------



## rdr09

Welp . . .

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd


----------



## Vellinious

Quote:


> Originally Posted by *rdr09*
> 
> Welp . . .
> 
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd


Yeah...red team's gonna lose this round.


----------



## mus1mus

No way, if people help.


----------



## rdr09

Quote:


> Originally Posted by *Vellinious*
> 
> Yeah...red team's gonna lose this round.


Quote:


> Originally Posted by *mus1mus*
> 
> No way if people helps.


^This. I'll try to steal some distilled water tonight and get my 290s to join.


----------



## mus1mus

There are big players in this club who have yet to submit scores for us. I'm talking 3X and 4X guys. That deficit is not insurmountable; a single player who has 4X and runs every level can easily negate that lead, mind you.


----------



## Agent Smith1984

I'll be running all in tomorrow.

Looking forward to some no-tess suicide runs


----------



## Vellinious

The weather has been pretty cold here the last few days. I'll probably rerun all my benches this weekend with a clean OS install. My Samsung 950 Pro arrives tonight. = D


----------



## Agent Smith1984

Quote:


> Originally Posted by *Vellinious*
> 
> The weather has been pretty cold here the last few days. I'll probably rerun all my benches this weekend with a clean OS install. My Samsung 950 Pro arrives tonight. = D


nuh nuh nuh-950 Pro????

AWESOME!!

Samsung just keeps impressing with their SSDs, man. That's an awesome drive; I was reading a little about it earlier.


----------



## Vellinious

Found it "open box" on Newegg last week for almost $100 off the 512GB drive. Couldn't pass that up.


----------



## kizwan

My secondary reference 290 card has developed a problem. It can't overclock without black-screen crashing, even at a pretty low overclock, e.g. 1100 on the core. At the TRI-X OC stock clock of 1000/1300 it runs fine, though. I think it has already degraded.


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> My secondary 290 referenced card developed a problem. It unable to overclock without black screen crashing even at pretty low overclock e.g. 1100 on the core. At TRI-X OC stock clock 1000/1300, it is running fine though. It's degraded already I think.


That's messed up. You've enjoyed them long enough anyway. Let's look forward to Polaris or Pascal.


----------



## mus1mus

Quote:


> Originally Posted by *kizwan*
> 
> My secondary 290 referenced card developed a problem. It unable to overclock without black screen crashing even at pretty low overclock e.g. 1100 on the core. At TRI-X OC stock clock 1000/1300, it is running fine though. It's degraded already I think.


I'd inspect the drivers and the coolers. Maybe dried-up paste, too.

I've seen that happen.


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> My secondary 290 referenced card developed a problem. It unable to overclock without black screen crashing even at pretty low overclock e.g. 1100 on the core. At TRI-X OC stock clock 1000/1300, it is running fine though. It's degraded already I think.
> 
> 
> 
> I'd inspect the drivers and the coolers. Maybe dried up paste too.
> 
> I've seen that happen.

The card was not able to heat up at all because it crashed at the beginning of Firestrike. I've had an air bubble problem since I flushed my loops earlier this week. The core & VRM1 are 10°C higher than on the first card, but temps are still safe.

The weird thing is that I managed to bench Firestrike with crossfire enabled today at 1100/1375 & 1120/1375 (at +100mV). No problems during benching, unlike last night when it crashed at the beginning of the test. I did find the primary card somehow running at x8 today; I don't know how relevant that is to last night's problem. After I shut down my PC and gently wiggled/pushed the primary card a little, it went back to running at x16, and at the moment I've managed to overclock in crossfire at 1100/1375 & 1120/1375. I'm taking it easy on the memory for the moment. I'm going to need to drain my loop again because of the air bubbles.


----------



## mus1mus

Quote:


> Originally Posted by *kizwan*
> 
> The card was not able to heat up at all because it crash at the beginning of the firestrike. I did have air bubbles problem since I flushed my loops earlier this week. The core & VRM1 are 10C higher than the first card but temps still safe.
> 
> The weird thing is that I managed to bench with firestrike with crossfire enabled today at 1100/1375 & 1120/1375 (at +100mV). No problem during benching unlike last night crashed at the beginning of the test. I did found the primary card somehow running at x8 today, I don't know how relevant it is with last night problem. After I shutdown my PC & gently wiggle/push the primary card a little bit, it back run at x16 again & at the moment managed to overclocked in crossfire at 1100/1375 & 1120/1375. Take it easy on the memory for the moment. I'm going to need to drain may loop again because of the air bubbles.


Firestrike has been my cards' Achilles heel. I'm getting constant crashes too. Guess I'll have to spend a bit more time with it in the coming days.

I don't think it's degrading, tbh.


----------



## battleaxe

Quote:


> Originally Posted by *kizwan*
> 
> My secondary 290 referenced card developed a problem. It unable to overclock without black screen crashing even at pretty low overclock e.g. 1100 on the core. At TRI-X OC stock clock 1000/1300, it is running fine though. It's degraded already I think.


However could that have happened?

My 290X just died too. LOL

Geez. We should be able to push these things like a mule indefinitely, right?


----------



## Lucky 23

Quote:


> Originally Posted by *battleaxe*
> 
> However could have that happened?
> 
> My 290x just died too. LOL
> 
> Geez. We should be able to push these things like a mule indefinitely right?


What was yours clocked at?


----------



## Roboyto

Quote:


> Originally Posted by *kizwan*
> 
> My secondary 290 referenced card developed a problem. It unable to overclock without black screen crashing even at pretty low overclock e.g. 1100 on the core. At TRI-X OC stock clock 1000/1300, it is running fine though. It's degraded already I think.


Quote:


> Originally Posted by *battleaxe*
> 
> However could have that happened?
> 
> My 290x just died too. LOL
> 
> Geez. We should be able to push these things like a mule indefinitely right?


Push 'em to the limit and these things happen. Not saying they should fail under high stress, but if you've never broken something, PC parts or otherwise, when pushing boundaries..then you ain't doin' it right.









Polaris around the corner...can probably pick up a used one on eBay for cheap to tide you over for now.


----------



## battleaxe

Quote:


> Originally Posted by *Lucky 23*
> 
> What was yours clocked at?


It would do 1280/1750 for bench runs. Now it's a good paperweight.


----------



## JourneymanMike

Quote:


> Originally Posted by *Lucky 23*
> 
> Quote:
> 
> 
> 
> Originally Posted by *battleaxe*
> 
> However could have that happened?
> 
> *My 290x just died too.* LOL
> 
> Geez. We should be able to push these things like a mule indefinitely right?
> 
> 
> 
> *What was yours clocked at?*

I just toasted one of my 290X's - BUMMER!









Was clocked at 1251/1653, Sapphire Tri-X BIOS on a reference BF4...

Best two card score (It was the top card) http://www.3dmark.com/fs/7862043

Best single card score (the one that's bricked, for good!) http://www.3dmark.com/fs/7963481

The dead card...




OK, enough of the viewing. Time to put the reference cooler back on and sell it for parts...


----------



## mus1mus

Wow. So many dead cards lately. Are you guys sure they're dead?


----------



## kizwan

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> My secondary 290 referenced card developed a problem. It unable to overclock without black screen crashing even at pretty low overclock e.g. 1100 on the core. At TRI-X OC stock clock 1000/1300, it is running fine though. It's degraded already I think.
> 
> 
> 
> However could have that happened?
> 
> My 290x just died too. LOL
> 
> Geez. We should be able to push these things like a mule indefinitely right?

I know, right?

I haven't pushed my cards yet; at least, I haven't gone beyond what MSI AB & Trixx allow.

Previously I ran Firestrike @ 1100/1375 & 1120/1375 with +100mV, and just now 1100/1500 with +50mV. Damn air bubbles: core & VRM1 on the second card reach the 70s Celsius in just a couple of seconds but, interestingly, stay there. They cool down in just a couple of seconds too.

P.S.: Is anyone having problems browsing their results on the 3DMark page?


----------



## mus1mus

Quote:


> Originally Posted by *kizwan*
> 
> I know, right?
> 
> 
> 
> 
> 
> 
> 
> I didn't push my cards yet, at least have not go beyond what MSI AB & Trixx allowed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Previously running firestrike @1100/1375 & 1120/1375 with +100mV & just now 1100/1500 with +50mV. Dammit air bubbles, core & VRM1 on the second card reaching 70s Celsius in just a couple of seconds but interestingly stayed there. Cool down in just couple of seconds too.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P/S: Anyone having problem browsing their results at 3dmark page?


I'd recheck the block installation, IMO.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> I'd recheck the blocks installation IMO.


^^THIS. Sounds like a bad mount.


----------



## battleaxe

Quote:


> Originally Posted by *kizwan*
> 
> The card was not able to heat up at all because it crash at the beginning of the firestrike. I did have air bubbles problem since I flushed my loops earlier this week. The core & VRM1 are 10C higher than the first card but temps still safe.
> 
> The weird thing is that I managed to bench with firestrike with crossfire enabled today at 1100/1375 & 1120/1375 (at +100mV). No problem during benching unlike last night crashed at the beginning of the test. I did found the primary card somehow running at x8 today, I don't know how relevant it is with last night problem. After I shutdown my PC & gently wiggle/push the primary card a little bit, it back run at x16 again & at the moment managed to overclocked in crossfire at 1100/1375 & 1120/1375. Take it easy on the memory for the moment. I'm going to need to drain may loop again because of the air bubbles.


Sounds like the TIM is not reaching the corners of the die. I've done this before. My suggestion is to double-check your contact, because this is exactly what would happen in such a scenario.


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I know, right?
> 
> 
> 
> 
> 
> 
> 
> I didn't push my cards yet, at least have not go beyond what MSI AB & Trixx allowed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Previously running firestrike @1100/1375 & 1120/1375 with +100mV & just now 1100/1500 with +50mV. Dammit air bubbles, core & VRM1 on the second card reaching 70s Celsius in just a couple of seconds but interestingly stayed there. Cool down in just couple of seconds too.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> P/S: Anyone having problem browsing their results at 3dmark page?
> 
> 
> 
> I'd recheck the blocks installation IMO.

I didn't remove the block. I only drained and changed one tube run between the bottom & front radiators. Prior to draining, the temps were nowhere near this bad. The worst temps were in the 60s Celsius on both cards when I didn't turn on the A/C (ambient 32-34°C). Right now I can see air bubbles dancing in the tube (between the bottom & front radiators). I'll drain & change the tube back to the shorter one.


----------



## mus1mus

When was the last time you changed the TIM, though?

I occasionally inject bubbles into my loop because I enjoy seeing them run along the tubes, but I see no change in temps.

Weird me.


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> When was the last time you changed TIM though?
> 
> I am ocassionally injecting bubbles to loop as I enjoy seeing them run along the tubes. But sees no change in temps.
> 
> Weird me.


3 years ago.

I've never thought to change the TIM because I don't see a reason to. I'll fix that tube first; temps should be back to normal. If not, I'll re-seat the block.

The air bubbles don't move, they just dance there.

My temps running at 1100/1500. Look at the difference between the two cards.

Okay, I checked my logs. It seems the temps have changed on the secondary card since I installed the block. Temps on the first card still haven't changed as far as I can tell.


----------



## mus1mus

3 years may be a bit over-aged.

Those are my core temps at load.

I'm damned seeing 40+°C idle on mine. With the AC on, with two 360 rads cooling 3 cards and a constant clock for the CPU, I see 20°C idle. AC off: 33°C or more. But not 48°C. Unless I am on PT3, which I don't need to be with the voltage limiter off on a modded BIOS.


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> 3 years may be a bit overaged.
> 
> Those are my core temps at load. I am damned seeing 40+C idle on mine. With the AC on, on 2-360 rads cooling 3 cards and a constant clock for the CPU, I am seeing 20C idle. AC OFF - 33C or more. But not 48C. Unless I am on PT3. Which I don't need to with Voltage Limiter Off on modded BIOS


True. Somehow I keep forgetting it was 3 years ago that I installed the blocks.

I'll tear down my loop soon.


----------



## battleaxe

Quote:


> Originally Posted by *kizwan*
> 
> True. Somehow I keep forgetting it was 3 years ago I installed the blocks.
> 
> 
> 
> 
> 
> 
> 
> I'll tear down my loop soon.


You shouldn't have to tear down the loop to change the TIM, right? I don't have to on mine anyway... they're all different though, so who knows?

Your rads could be fairly clogged up with dust by now too, if you haven't cleaned them out. Just a thought. I'm sure you probably have, though. Most of us are OCD about those things.


----------



## kizwan

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> True. Somehow I keep forgetting it was 3 years ago I installed the blocks.
> 
> 
> 
> 
> 
> 
> 
> I'll tear down my loop soon.
> 
> 
> 
> You shouldn't have to tear down the loop to change the TIM right? I don't have to on mine anyway... they're all different though, so who knows... ?
> 
> Your RADs could be fairly clogged up with dust by now too if you haven't cleaned them out. Just a thought. I'm sure you probably have though. Most of us are OCD about those things.

Mine needs it. They are short tube runs; I need to drain & disconnect the tubes before I can remove the cards.

I just cleaned my rig earlier this week, so the rads are clean. I drained & changed the tube between the bottom & front rads to a longer one. The pic below is from before changing the tube. The tube was kinking a little bit, which is why I changed it, but the new longer tube causes the bubbles to get somewhat trapped there. I can see air bubbles dancing in the tube.

*Edit:* OK, "tear down" may not be the right words. I'll just need to disconnect the tubes to the cards.


----------



## MIGhunter

Since you guys are good about the software, I have a question.

I just finished my new watercooled setup with a 295X2, so the i5 running my reference 290X got moved into my son's room. I had planned on using one of my monitors, but one of them died, so I hooked it up to my son's 40" RCA TV. It has HDMI. When I hook the TV up to the HDMI output on the 290X, I get a black screen. If I hook the TV's HDMI to the onboard GPU, it works fine. If I delete the drivers with DDU, it works just fine; as soon as I install the drivers, at about 18% the screen blinks, then goes black and stays that way. I've logged into the computer from my tablet using Splashtop and changed all the resolutions, hoping that was the issue, but nothing works.

Any ideas?


----------



## rdr09

Quote:


> Originally Posted by *MIGhunter*
> 
> Since you guys are good about the software, I have a question.
> 
> I just finished my new watercooled setup with a 295x2, so my i-5 running my reference 290x got moved into my son's room. I had planned on using one of my monitors but one of them died. So, I hooked it up to my Son's 40" RCA TV. It has HDMI. When I hook up the TV to the HDMI output on the 290x, I get a black screen. If I hook the HDMI on the TV to the onboard GPU, it works fine. If I delete the drivers with DDU, it works just fine. As soon as I install the drivers, at about 18% the screen blinks and then goes black and stays that way. I've logged into the computer with my tablet using splashtop and changed all the resolutions hoping that was the issue but nothing works.
> 
> Any ideas?


When the screen turns black, try pulling the HDMI cable off the card and plugging it back in. If the picture comes back, reboot and see if the black screen goes away.

What driver are you using?


----------



## mus1mus

Guys, keep pounding. They can come back, so we'd better annihilate them.

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/1150_50#post_25019282


----------



## MIGhunter

Quote:


> Originally Posted by *rdr09*
> 
> When the screen turns black try pulling the hdmi cable off the card and plug it back in. If it goes back, reboot and see if it the blackscreen goes away.
> 
> What driver are you using?


The first few reinstalls were with the 16.3.1 beta, then I went back to the 16.3 driver.


----------



## rdr09

Quote:


> Originally Posted by *MIGhunter*
> 
> 1st few re-installs were the 16.3.1 Beta, then I went back to the 16.3 driver.


You may have to go all the way back to 15.7. Also, make sure the TV is on first and set to HDMI, if possible.


----------



## rdr09

nvm.


----------



## MIGhunter

Quote:


> Originally Posted by *rdr09*
> 
> You may have to go all the way back to 15.7. Also, make sure the TV is on first set to HDMI if possible.


I don't see the 15.7 drivers; when I go to the AMD drivers page, it's 15.12 and then 16.3 on the site. Kinda weird.


----------



## kizwan

Quote:


> Originally Posted by *MIGhunter*
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> You may have to go all the way back to 15.7. Also, make sure the TV is on first set to HDMI if possible.
> 
> 
> 
> I don't see the 15.7 drivers, when I goto the AMD drivers, it's 15.12 and then 16.3 on the site. Kinda weird.

On that page, there's a link to previous drivers under "Helpful Links" on the right side of the page.

http://support.amd.com/en-us/download/desktop/previous?os=Windows%2010%20-%2064


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> On that page, there's link to previous drivers under the "Helpful Links" on the right side of the page.
> 
> http://support.amd.com/en-us/download/desktop/previous?os=Windows%2010%20-%2064


^This. I got there using the manual download for Win 7 64-bit . . .

http://support.amd.com/en-us/download/desktop/previous?os=Windows%207%20-%2064


----------



## TheReciever

When does the competition end? I'm working on uni stuff right now along with other time-consuming parts of the day...

I'm guessing I'll need to flash the BIOS and reattempt an overclock.


----------



## rdr09

Quote:


> Originally Posted by *TheReciever*
> 
> When does the competition end? Im working on uni stuff right now along with other time consuming parts of the day...
> 
> Im guessing ill need to flash the BIOS and reattempt an overclock


March 31st, US Eastern Time. No need for a high OC; just submit first, then you can update later before time is up if you like.


----------



## mus1mus

31st of March.


----------



## gupsterg

*1x special offer*

Whoever submits a new entry to the 3D Fanboy Competition 2016: nVidia vs AMD (Red Team, of course), I will offer a ROM mod service over and above what HawaiiReader does (i.e. multi-state VDDCI / VDDC Limit / LLC / fSW), aka "The Works".

*Only 2 conditions:-*

i) Do an entry.

ii) The ROM mod is done after the entry, within my own time constraints (which is usually not a long wait).


----------



## TheReciever

I need some help going through the mod for my card, so I'll submit after my exam, which is in about 30 minutes. Maybe around 5 PM Central?

I'll definitely resubmit after the mod, as I'd like to see the benefits myself and also hopefully use Crimson (still having to use 15.11.1).


----------



## gupsterg

No worries; PM me as soon as the entry is done.


----------



## mus1mus

Thank you very much, gups, for the support! We really need the push!


----------



## Vellinious

Quote:


> Originally Posted by *gupsterg*
> 
> *1x special offer*
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Whomever submits a new entry to 3D Fanboy Competition 2016: nVidia vs AMD (Red Team of course
> 
> 
> 
> 
> 
> 
> 
> ) I will offer ROM mod service over and above what HawaiiReader does (ie multi state VDDCI / VDDC Limit / LLC / fSW ) aka "The Works"
> 
> 
> 
> 
> 
> 
> 
> .
> 
> *Only 2 conditions:-*
> 
> i) do a entry .
> 
> ii) ROM mod done after entry, within my own time constraints (which usually is not long wait
> 
> 
> 
> 
> 
> 
> 
> ) .


I could use some help in that area. I've attached a copy of the stock BIOS and the BIOS I'm currently using. My card doesn't seem to do well with 390X memory timings... and it runs extremely well just as it is. I would love to see some more voltage/clocks on the core, though. My scores are submitted already; if I could get some more core, I could bump them up a tad for some additional points. = )

290x1STOCK.zip 98k .zip file


GOOD32616.zip 98k .zip file


----------



## gupsterg

Vel, my online buddy: you want LLC on? Done. You want fSW? Done. AFAIK those are what you want?
Quote:


> Originally Posted by *mus1mus*
> 
> Thank you very much gups for the support! We really need some push!


No worries mate.


----------



## Vellinious

Quote:


> Originally Posted by *gupsterg*
> 
> Vel my online buddy
> 
> 
> 
> 
> 
> 
> 
> , you want LLC On? done
> 
> 
> 
> 
> 
> 
> 
> , you want fSW? done
> 
> 
> 
> 
> 
> 
> 
> , AFAIK those are what you want?
> No worries mate
> 
> 
> 
> 
> 
> 
> 
> .


Explain fSW? Is that the VRM switching frequency? You're the pro... I trust ya. LLC, yes, for sure.


----------



## TheReciever

Before I make the entry: will I gain any notable performance from running in single-monitor mode vs. the four I have now?


----------



## fyzzz

Quote:


> Originally Posted by *TheReciever*
> 
> Before I make the entry, will I gain any notable performance from running in single monitor mode vs the 4 I have now?


Yes, you will probably get a few more points. I can't really remember, but I think I got about 50 more points switching from two monitors to one.


----------



## TheReciever

I have figured out the issue I had with black screens, and it seems to be AMD OverDrive. Every time it's enabled, I have to uninstall it on the next reboot.

So that's what I am doing now...

Though without it, the GPU clocks seem to be dynamic instead of a static 3D clock.

I can no longer get my main monitor to display. Sigh, this is why I don't bother much with this stuff anymore; I end up troubleshooting the machine more than using it.


----------



## gupsterg

Quote:


> Originally Posted by *Vellinious*
> 
> Explain fSW? This is VRM frequency? You're the pro...I trust ya. LLC yes, for sure.


fSW is the VRM switching frequency; as mus1mus said, it will help you at the edge of OC/stability.

I will do your ROM with editable fSW and LLC on/off. Is that what you want?


----------



## Vellinious

Quote:


> Originally Posted by *gupsterg*
> 
> fSW is VRM frequency, as said by mus1mus it will help you at the edge of OC/stability
> 
> 
> 
> 
> 
> 
> 
> .
> 
> I will do you ROM with editable fSW and LLC on/off
> 
> 
> 
> 
> 
> 
> 
> , is that what you want?


Yup! I'd love to give it a shot. This card seems pretty picky....

Is there a special BIOS reader I need in order to change those settings, if need be?


----------



## kivikas14

Quote:


> Originally Posted by *gupsterg*
> 
> *1x special offer*
> 
> 
> 
> 
> 
> 
> 
> .
> 
> Whomever submits a new entry to 3D Fanboy Competition 2016: nVidia vs AMD (Red Team of course
> 
> 
> 
> 
> 
> 
> 
> ) I will offer ROM mod service over and above what HawaiiReader does (ie multi state VDDCI / VDDC Limit / LLC / fSW ) aka "The Works"
> 
> 
> 
> 
> 
> 
> 
> .
> 
> *Only 2 conditions:-*
> 
> i) do a entry .
> 
> ii) ROM mod done after entry, within my own time constraints (which usually is not long wait
> 
> 
> 
> 
> 
> 
> 
> ) .


Entry done on the red team:
http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/1270#post_25023350
Maybe there's still some juice left in this card to be squeezed out.

StockAsusDC2.zip 98k .zip file

Thank you in advance.


----------



## TheReciever

Made the submission:

http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/1250_50#post_25023692


----------



## MIGhunter

Quote:


> Originally Posted by *kizwan*
> 
> On that page, there's link to previous drivers under the "Helpful Links" on the right side of the page.
> 
> http://support.amd.com/en-us/download/desktop/previous?os=Windows%2010%20-%2064


Quote:


> Originally Posted by *rdr09*
> 
> ^this. I got here using the manual download for Win 7 64 bit . . .
> 
> http://support.amd.com/en-us/download/desktop/previous?os=Windows%207%20-%2064


Thanks. I went with that driver and got the same thing. When the computer loads I can see my MB logo, the Windows boot screen, and the little circle that lets you know it's loading, then a quick flash of green, and then just a black screen; the "HDMI 1" logo appears on my TV like it's not getting a signal. So I swap over to the onboard GPU and it's fine.

I tried rebooting: same thing. I tried unplugging and replugging: same thing. I tried unplugging the HDMI, rebooting, and then plugging the HDMI back in: nothing.


----------



## Forceman

I've got a similar (if not the same) problem whenever I install a new driver version. I'm able to fix it by booting into Safe Mode and uninstalling the video card in Device Manager (but unchecking the "delete device driver" box). Then, when I reboot into normal mode, Windows detects the card and installs the latest driver (the one I just installed). From then on it works fine.


----------



## kizwan

Quote:


> Originally Posted by *MIGhunter*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> On that page, there's link to previous drivers under the "Helpful Links" on the right side of the page.
> 
> http://support.amd.com/en-us/download/desktop/previous?os=Windows%2010%20-%2064
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *rdr09*
> 
> ^this. I got here using the manual download for Win 7 64 bit . . .
> 
> http://support.amd.com/en-us/download/desktop/previous?os=Windows%207%20-%2064
> 
> 
> Thanks, so I went with that driver and same thing. When the computer loads, I can see my MB logo, the Windows boot screen, the little circle that lets you know it's loading, then a quick flash of green and then just a black screen, and the "HDMI 1" logo appears on my TV like it's not getting a signal. So I swap over to the onboard GPU and it is fine.
> 
> I tried rebooting, same thing. I tried unplugging and replugging, same thing. I tried unplugging the HDMI, rebooting, and then plugging the HDMI back in, nothing.

Your TV's EDID may not be compatible with AMD cards. I don't know, I'm just assuming. If you can, dump the EDID of your TV & post it here. You can use software like MonInfo. If possible dump the one marked as _Real-Time_; if not, the one marked as _Registry_ is fine too.


----------



## MIGhunter

Quote:


> Originally Posted by *Forceman*
> 
> I've got a similar (if not the same) problem whenever I install a new driver version. I am able to fix it by booting into safe mode and uninstalling the video card in device manager (but uncheck the delete device driver box). Then when I reboot into normal mode windows detects the card and installs the latest driver (the one I just installed). From then on it works fine.


I'll try that when I get off work in the morning
Quote:


> Originally Posted by *kizwan*
> 
> Your TV's EDID may not be compatible with AMD cards. I don't know, I'm just assuming. If you can, dump the EDID of your TV & post it here. You can use software like MonInfo. If possible dump the one marked as _Real-Time_; if not, the one marked as _Registry_ is fine too.


I'll look into it tomorrow morning and see if I can figure that out. Working 12 hour night shifts in the ER.


----------



## diggiddi

Need help running the fanboy benches in CrossFire. I have downclocked both core/mem to half speed with minimum power limit and voltage, but can't get it to work - any ideas?
It crashes after running for a bit.


----------



## gupsterg

Quote:


> Originally Posted by *Vellinious*
> 
> Yup! I'd love to give it a shot. This card seems pretty picky....
> 
> Is there a special bios reader that I need to make changes to those settings, if need be?


You will only be able to edit fSW / LLC manually using a hex editor, and you may not want LLC on (i.e. no VDROOP).

Multistate VDDCI is now editable via HawaiiReader. On my Vapor-X 290X I used to have 3 RAM clock/VDDCI states: 150MHz (850mV), 1250MHz (925mV), 1525MHz (1000mV). From tests kizwan did, multimonitor setups don't like lowered VDDCI / many states, IIRC. I had no issues as I have a single-monitor setup; with the states I had, general desktop usage never hit 1000mV VDDCI. So this mod is handy for members who, say, use a higher VDDCI than stock (1000mV) to stabilise an OC.
Quote:


> Originally Posted by *kivikas14*
> 
> Entry done on the red team.
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/1270#post_25023350
> Maybe there is still some juice left in this card to be squeezed out
> 
> StockAsusDC2.zip 98k .zip file
> 
> Thank You in advance


What would you like done? Cheers for the sub.
Quote:


> Originally Posted by *TheReciever*
> 
> Made the submission
> 
> http://www.overclock.net/t/1586140/3d-fanboy-competition-2016-nvidia-vs-amd/1250_50#post_25023692


What would you like done? Cheers for the sub.


----------



## MIGhunter

Quote:


> Originally Posted by *kizwan*
> 
> Your TV's EDID may not be compatible with AMD cards. I don't know, I'm just assuming. If you can, dump the EDID of your TV & post it here. You can use software like MonInfo. If possible dump the one marked as _Real-Time_; if not, the one marked as _Registry_ is fine too.


Here is the Dump from MonInfo


Spoiler: Warning: Spoiler!



Code:


Monitor
  Model name............... TTE HDTV
  Manufacturer............. TCL-Thomson
  Plug and Play ID......... TTE1205
  Serial number............ n/a
  Manufacture date......... 2008, ISO week 0
  Filter driver............ None
  -------------------------
  EDID revision............ 1.3
  Input signal type........ Digital
  Color bit depth.......... Undefined
  Display type............. RGB color
  Screen size.............. 160 x 90 mm (7.2 in)
  Power management......... Not supported
  Extension blocs.......... 1 (CEA-EXT)
  -------------------------
  DDC/CI................... Not supported

Color characteristics
  Default color space...... Non-sRGB
  Display gamma............ 2.20
  Red chromaticity......... Rx 0.625 - Ry 0.340
  Green chromaticity....... Gx 0.280 - Gy 0.595
  Blue chromaticity........ Bx 0.155 - By 0.070
  White point (default).... Wx 0.283 - Wy 0.298
  Additional descriptors... None

Timing characteristics
  Horizontal scan range.... 15-46kHz
  Vertical scan range...... 59-61Hz
  Video bandwidth.......... 80MHz
  CVT standard............. Not supported
  GTF standard............. Not supported
  Additional descriptors... None
  Preferred timing......... Yes
  Native/preferred timing.. 1280x720p at 60Hz (16:9)
    Modeline............... "1280x720" 74.250 1280 1390 1430 1650 720 725 730 750 +hsync +vsync
  Detailed timing #1....... 1920x1080i at 60Hz (16:9)
    Modeline............... "1920x1080" 74.250 1920 2008 2052 2200 1080 1084 1094 1124 interlace +hsync +vsync

Standard timings supported

EIA/CEA-861 Information
  Revision number.......... 3
  IT underscan............. Not supported
  Basic audio.............. Supported
  YCbCr 4:4:4.............. Supported
  YCbCr 4:2:2.............. Supported
  Native formats........... 1
  Detailed timing #1....... 720x480p at 60Hz (16:9)
    Modeline............... "720x480" 27.000 720 736 798 858 480 489 495 525 -hsync -vsync
  Detailed timing #2....... 720x480p at 60Hz (4:3)
    Modeline............... "720x480" 27.000 720 736 798 858 480 489 495 525 -hsync -vsync

CE video identifiers (VICs) - timing/formats supported
    1280 x  720p at  60Hz - HDTV (16:9, 1:1) [Native]
    1920 x 1080i at  60Hz - HDTV (16:9, 1:1)
     720 x  480p at  60Hz - EDTV (16:9, 32:27)
     720 x  480p at  60Hz - EDTV (4:3, 8:9)
     640 x  480p at  60Hz - Default (4:3, 1:1)
     720 x  480i at  60Hz - Doublescan (16:9, 32:27)
     720 x  480i at  60Hz - Doublescan (4:3, 8:9)
    NB: NTSC refresh rate = (Hz*1000)/1001

CE audio data (formats supported)
  LPCM    2-channel, 16-bit              at 32/44/48 kHz

CE speaker allocation data
  Channel configuration.... 2.0
  Front left/right......... Yes
  Front LFE................ No
  Front center............. No
  Rear left/right.......... No
  Rear center.............. No
  Front left/right center.. No
  Rear left/right center... No
  Rear LFE................. No

CE vendor specific data (VSDB)
  IEEE registration number. 0x000C03
  CEC physical address..... 1.0.0.0
  Supports AI (ACP, ISRC).. Yes
  Supports 48bpp........... No
  Supports 36bpp........... No
  Supports 30bpp........... No
  Supports YCbCr 4:4:4..... No
  Supports dual-link DVI... No
  Maximum TMDS clock....... 75MHz
  Audio/video latency (p).. n/a
  Audio/video latency (i).. n/a
  HDMI video capabilities.. No
  Data payload............. 030C001000800F00

Report information
  Date generated........... 3/27/2016
  Software revision........ 2.90.0.1002
  Data source.............. Real-time 0x0061
  Operating system......... 10.0.10586.2

Raw data
  00,FF,FF,FF,FF,FF,FF,00,52,85,05,12,00,00,00,00,00,12,01,03,80,10,09,78,0A,0D,C9,A0,57,47,98,27,
  12,48,4C,00,00,00,01,01,01,01,01,01,01,01,01,01,01,01,01,01,01,01,01,1D,00,72,51,D0,1E,20,6E,28,
  55,00,A0,5A,00,00,00,1E,01,1D,80,18,71,1C,16,20,58,2C,25,00,A0,5A,00,00,00,9E,00,00,00,FC,00,54,
  54,45,20,48,44,54,56,0A,20,20,20,20,00,00,00,FD,00,3B,3D,0F,2E,08,00,0A,20,20,20,20,20,20,01,64,
  02,03,1D,71,47,84,05,03,02,01,07,06,23,09,07,01,83,01,00,00,68,03,0C,00,10,00,80,0F,00,8C,0A,D0,
  8A,20,E0,2D,10,10,3E,96,00,A0,5A,00,00,00,18,8C,0A,D0,8A,20,E0,2D,10,10,3E,96,00,28,1E,00,00,00,
  18,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,
  00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,00,2A
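For anyone wanting to sanity-check a dump like this in software: every 128-byte EDID block must sum to 0 mod 256 (the last byte is a checksum), and a modeline's refresh rate can be recomputed as pixel clock divided by total pixels. A minimal Python sketch of both checks - the demo block below is synthetic, not this TV's EDID, and the timing numbers are taken from the native 1280x720 modeline above:

```python
def edid_block_valid(block: bytes) -> bool:
    """An EDID base/extension block is 128 bytes whose sum is 0 mod 256."""
    return len(block) == 128 and sum(block) % 256 == 0

def modeline_refresh(pclk_mhz: float, htotal: int, vtotal: int) -> float:
    """Refresh rate (Hz) = pixel clock / (total horizontal * total vertical)."""
    return pclk_mhz * 1_000_000 / (htotal * vtotal)

# Synthetic demo block: 127 arbitrary bytes plus a checksum byte chosen so
# the whole 128-byte block sums to 0 mod 256 (hypothetical data only).
payload = bytes(range(127))
checksum = (-sum(payload)) % 256
block = payload + bytes([checksum])
print(edid_block_valid(block))

# Native timing from the dump: "1280x720" 74.250 ... 1650 h-total, 750 v-total
print(modeline_refresh(74.250, 1650, 750))   # 60.0
```

Pasting the four base-block lines of the raw dump into `bytes.fromhex` would let you run the same check against the real data.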


----------



## Vellinious

Quote:


> Originally Posted by *gupsterg*
> 
> You will only be able to edit fSW / LLC manually using a hex editor, and you may not want LLC on (i.e. no VDROOP).
> 
> Multistate VDDCI is now editable via HawaiiReader. On my Vapor-X 290X I used to have 3 RAM clock/VDDCI states: 150MHz (850mV), 1250MHz (925mV), 1525MHz (1000mV). From tests kizwan did, multimonitor setups don't like lowered VDDCI / many states, IIRC. I had no issues as I have a single-monitor setup; with the states I had, general desktop usage never hit 1000mV VDDCI. So this mod is handy for members who, say, use a higher VDDCI than stock (1000mV) to stabilise an OC.


Leave LLC on. Is it adjustable, or just "off" and "on"? lol Either way, leave it on.


----------



## gupsterg

There's no adjustable level of LLC; from what we understand that's a ROM constraint, as the hardware does support varying LLC.

You are fully aware that when LLC is on, VDDC will be slightly higher than the set VID? i.e. VID 1.25V = ~1.26V VDDC, not your normal drooped value.

It can be a card-killing feature.


----------



## Vellinious

Quote:


> Originally Posted by *gupsterg*
> 
> There's no adjustable level of LLC; from what we understand that's a ROM constraint, as the hardware does support varying LLC.
> 
> You are fully aware that when LLC is on, VDDC will be slightly higher than the set VID? i.e. VID 1.25V = ~1.26V VDDC, not your normal drooped value.
> 
> It can be a card-killing feature.


Does it read the voltage correctly in the monitoring software (HW Monitor / GPUz sensors tab)?

My card has a tendency to get wonky with the voltages set above 1.443v anyway. The screen goes fuzzy. In and out, but....fuzzy. And, I get the black screen bug occasionally. I think the black screens are related to VRM temps...I think. At least, when the VRM temps get too high, or my coolant goes above 23c, that's when I start seeing them. /shrug


----------



## gupsterg

Monitoring will function as normal and be as correct as software monitoring ever is.

Black screen can be due to a number of things, from what I've experienced.

So it's hard to just say "do x and it will fix it". In my case it had been too much RAM clock coupled with too low a VDDC for the GPU to be happy. I solved it by adding an extra RAM state, i.e. 1250MHz, so my card would use 150MHz at DPM 0, 1250MHz at DPM 1 & 2, and then 1525MHz from DPM 3 onwards.

Now if you ref the above image of my 24/7 use ROM, what you'll note is that GPU clock DPM 2 is 799MHz. So at desktop/low loads, if the GPU remains below 799MHz the RAM will only go up to 1250MHz, so it needs less juice and = no chance of black screen. When the GPU is above 799MHz the RAM goes to 1525MHz and is using more juice, so again no chance of black screen. The above setup was even fine for 1575MHz long-term use; I've done sessions of up to 70hrs continuous [email protected]

Another fix for black screen is when the 0.95V rail needs more juice; the only solution there is a hard mod. The only card supporting it out of the box is the Matrix 290X, which also had MVDDC / fSW adjustable via GPU Tweak.
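To put the DPM interplay above into code form, here's a tiny Python sketch of the rule as described (the 799MHz threshold and the 150/1250/1525MHz RAM states come from my ROM; the function and its names are just for illustration, not anything the ROM actually runs):

```python
# RAM DPM selection as described for the modded 24/7 ROM:
# at or below the GPU-clock DPM 2 threshold (799MHz) the RAM sits at the
# intermediate 1250MHz state; above it, the full 1525MHz state is used.
GPU_DPM2_MHZ = 799
RAM_STATES_MHZ = {"idle": 150, "intermediate": 1250, "full": 1525}

def ram_clock_for_gpu(gpu_mhz: int, idle: bool = False) -> int:
    """Pick the RAM clock the modded ROM would settle on (illustrative)."""
    if idle:
        return RAM_STATES_MHZ["idle"]          # DPM 0, desktop idle
    if gpu_mhz <= GPU_DPM2_MHZ:
        return RAM_STATES_MHZ["intermediate"]  # DPM 1 & 2, light loads
    return RAM_STATES_MHZ["full"]              # DPM 3 onwards, gaming loads

print(ram_clock_for_gpu(500))    # 1250 - light desktop load
print(ram_clock_for_gpu(1000))   # 1525 - gaming load
```

The point being: the memory never runs its highest clock at a voltage too low for it, which is what was producing the black screens.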


----------



## Vellinious

Quote:


> Originally Posted by *gupsterg*
> 
> Monitoring will function as normal and be as correct as software monitoring ever is.
> 
> Black screen can be due to a number of things, from what I've experienced.
> 
> So it's hard to just say "do x and it will fix it". In my case it had been too much RAM clock coupled with too low a VDDC for the GPU to be happy. I solved it by adding an extra RAM state, i.e. 1250MHz, so my card would use 150MHz at DPM 0, 1250MHz at DPM 1 & 2, and then 1525MHz from DPM 3 onwards.
> 
> Now if you ref the above image of my 24/7 use ROM, what you'll note is that GPU clock DPM 2 is 799MHz. So at desktop/low loads, if the GPU remains below 799MHz the RAM will only go up to 1250MHz, so it needs less juice and = no chance of black screen. When the GPU is above 799MHz the RAM goes to 1525MHz and is using more juice, so again no chance of black screen. The above setup was even fine for 1575MHz long-term use; I've done sessions of up to 70hrs continuous [email protected]
> 
> Another fix for black screen is when the 0.95V rail needs more juice; the only solution there is a hard mod. The only card supporting it out of the box is the Matrix 290X, which also had MVDDC / fSW adjustable via GPU Tweak.


Hmm. I run my memory quite high. 1792 is its best. I can get occasional runs in at 1836, but...it crashes there more than it runs.

I've seen the thread for the .95v rail hard mod....I might do that down the road, after Pascal releases and I see how well they perform / overclock.

Let's do it. I'll just monitor things very closely until I find its happy place. VRM temps are bound to change with the extra voltage, so I'll just have to keep a watchful eye on those temps during testing. Right now they only rarely break 40c, so....

I leave on business in about 2 hours, and won't be back until next weekend. So, it'll be a while before I can do any testing with it. So....take your time.


----------



## gupsterg

Increased fSW will also increase VRM temps AFAIK; to what degree, no idea.

So I will add LLC on by default plus an editable fSW, which will be stock 480kHz, and you then bump it up as required. I will add a little mod guide to the ROM pack; if stuck, post/PM for help.

I will use ROMs in post 42257.


----------



## Vellinious

Quote:


> Originally Posted by *gupsterg*
> 
> Increased fSW will also increase VRM temps AFAIK; to what degree, no idea.
> 
> So I will add LLC on by default plus an editable fSW, which will be stock 480kHz, and you then bump it up as required. I will add a little mod guide to the ROM pack; if stuck, post/PM for help.
> 
> I will use ROMs in post 42257.


Excellent, thank you. I'll watch and compare temps between stock / mod and see how much they climb with voltage.

Use the stock one if you would...start from scratch. The other one has a volt mod that Mus set up for me.


----------



## gupsterg

The ROM is done. Before releasing it I'd like to see an i2cdump for your card using the stock ROM and settings, just so I can verify the stock fSW on your card.

Example of how your shortcut property's "Target" box should be:

"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" -i2cdump

*Note:* the space between " and -

You'll find an i2cdump.txt in the install dir of MSI AB; attach it to your post.


----------



## Vellinious

Quote:


> Originally Posted by *gupsterg*
> 
> The ROM is done. Before releasing it I'd like to see an i2cdump for your card using the stock ROM and settings, just so I can verify the stock fSW on your card.
> 
> Example of how your shortcut property's "Target" box should be:
> 
> "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" -i2cdump
> 
> *Note:* the space between " and -
> 
> You'll find an i2cdump.txt in the install dir of MSI AB; attach it to your post.


I'll do that first thing when I get home on Friday. I left this morning on a business trip.


----------



## TheReciever

I guess I just want to try and reach 1200 core, but in the end I'll gladly take any free performance improvements regardless lol


----------



## rdr09

nvm.


----------



## TheReciever

I'll make another submission after tweaking the BIOS; hopefully it resolves the overall issues I've been facing with the card.

Plus I need to run the Tess-off benches again too, so hopefully I can get a bit of a boost for the red team.


----------



## rdr09

Quote:


> Originally Posted by *TheReciever*
> 
> I'll make another submission after tweaking the BIOS; hopefully it resolves the overall issues I've been facing with the card.
> 
> Plus I need to run the Tess-off benches again too, so hopefully I can get a bit of a boost for the red team.


With the help of gupsterg, et al . . . you are in good hands. You've submitted so no rush.

EDIT: I am thinking of flashing my reference 290s with Elpida when Polaris comes out.


----------



## TheReciever

Software issues aside, I am really enjoying the performance offered by the 290X. The last time I had a desktop AMD GPU it was still ATI, with the EAH4890, which put me off their products for maybe half a decade.

Running 4 displays and doing something on all of them, plus a little AFK leveling in Black Desert Online


----------



## rdr09

Quote:


> Originally Posted by *MIGhunter*
> 
> Thanks, so I went with that driver and same thing. When the computer loads, I can see my MB logo, the Windows boot screen, the little circle that lets you know it's loading, then a quick flash of green and then just a black screen, and the "HDMI 1" logo appears on my TV like it's not getting a signal. So I swap over to the onboard GPU and it is fine.
> 
> I tried rebooting, same thing. I tried unplugging and replugging, same thing. I tried unplugging the HDMI, rebooting, and then plugging the HDMI back in, nothing.


Quote:


> Originally Posted by *MIGhunter*
> 
> Here is the Dump from MonInfo
> 
> _[EDID dump snipped - identical to the post above]_


Bump. Have you tried updating the monitor's driver? The thing is . . . all my monitors (including TVs used as monitors) say Generic and use the Windows driver.

BTW, all of the monitor drivers on both my AMD and Intel rigs are from 2006.


----------



## kizwan

Quote:


> Originally Posted by *MIGhunter*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Your TV's EDID may not be compatible with AMD cards. I don't know, I'm just assuming. If you can, dump the EDID of your TV & post it here. You can use software like MonInfo. If possible dump the one marked as _Real-Time_; if not, the one marked as _Registry_ is fine too.
> 
> 
> 
> Here is the Dump from MonInfo
> 
> _[EDID dump snipped - identical to the post above]_

Sorry, I have no idea what's wrong.

What resolution can you achieve with the Intel HD GPU? Did you try remote login to your computer while the TV is connected to your 290X & check whether CCC detects the TV?


----------



## rdr09

Quote:


> Originally Posted by *kizwan*
> 
> Sorry, I have no idea what's wrong.
> 
> What resolution can you achieve with the Intel HD GPU? Did you try remote login to your computer while the TV is connected to your 290X & check whether CCC detects the TV?


Ran out of ideas too.


----------



## gupsterg

Quote:


> Originally Posted by *TheReciever*
> 
> I'll make another submission after tweaking the BIOS; hopefully it resolves the overall issues I've been facing with the card.
> 
> Plus I need to run the Tess-off benches again too, so hopefully I can get a bit of a boost for the red team.


You got mail.


----------



## TheReciever

So do you!

I'll get the info ASAP


----------



## MIGhunter

Quote:


> Originally Posted by *kizwan*
> 
> Sorry, I have no idea what's wrong.
> 
> What resolution can you achieve with the Intel HD GPU? Did you try remote login to your computer while the TV is connected to your 290X & check whether CCC detects the TV?


Thanks for the help. My oldest son went and got a TV he'd left over at his cousin's. It works just fine. No clue what the issue with the other TV is, but whatever.


----------



## JourneymanMike

A little help please...

I just put my #2 290X back in my system

I uninstalled the drivers according to BradleyW's guide on the OP...

I then reinstalled according to the guide also...

When I went to disable ULPS, this is what I got...



So I uninstalled and reinstalled again... Got the same result...
I've had this happen before and was able to correct it, but I don't know how I did that... Brain fade...
Have any ideas?


----------



## Jflisk

Quote:


> Originally Posted by *JourneymanMike*
> 
> A little help please...
> 
> I just put my #2 290X back in my system
> 
> I uninstalled the drivers according to BradleyW's guide on the OP...
> 
> I then reinstalled according to the guide also...
> 
> When I went to disable ULPS, this is what I got...
> 
> 
> 
> So I uninstalled and reinstalled again... Got the same result...
> I've had this happen before and was able to correct it, but I don't know how I did that... Brain fade...
> 
> Have any ideas?


1) Use DDU to remove all traces of the drivers, then reinstall the drivers.
2) Restart and let it see the first card, with the second card removed/disabled while powered down.
3) Remove/disable the card in the first slot, insert a card in the second slot, and let the system see the second card on start. Check to make sure it's in the device info.
4) Reinsert the first card and restart - see if you have both. I had the same problem a long time ago.

Remember to power down and shut off between pulling and placing cards. I know you know this, but it's part of the disclaimer. Good luck.


----------



## Roboyto

Quote:


> Originally Posted by *JourneymanMike*
> 
> A little help please...
> 
> I just put my #2 290X back in my system
> 
> I uninstalled the drivers according to BradleyW's guide on the OP...
> 
> I then reinstalled according to the guide also...
> 
> When I went to disable ULPS, this is what I got...
> 
> 
> 
> So I uninstalled and reinstalled again... Got the same result...
> 
> I've had this happen before and was able to correct it, but I don't know how I did that... Brain fade...
> 
> Have any ideas?


Did you run DDU? I would try running that along with CCleaner to make sure the registry is squeaky clean.


----------



## JourneymanMike

Quote:


> Originally Posted by *Jflisk*
> 
> 1) Use DDU to remove all traces of the drivers, then reinstall the drivers.
> 2) Restart and let it see the first card, with the second card removed/disabled while powered down.
> 3) Remove/disable the card in the first slot, insert a card in the second slot, and let the system see the second card on start. Check to make sure it's in the device info.
> 4) Reinsert the first card and restart - see if you have both. I had the same problem a long time ago.
> 
> Remember to power down and shut off between pulling and placing cards. I know you know this, but it's part of the disclaimer. Good luck.


Quote:


> Originally Posted by *Roboyto*
> 
> Did you run DDU? I would try running that along with CCleaner to make sure the registry is squeaky clean.


I thought of DDU earlier, but was unable to get it to install - something about permissions - and then my anti-virus deleted DDU, including the download!

I got it from the author's web-site, here: http://www.wagnardmobile.com/forums/viewforum.php?f=5

After downloading and trying to run it, I ran my spyware and AV scans; the spyware scan picked up ransomware! Don't know where that came from... All I know is that you have to jump through a few flaming hoops to find the actual DDU download, or whatever it was...


----------



## mfknjadagr8

Quote:


> Originally Posted by *JourneymanMike*
> 
> I thought of DDU earlier, but was unable to get it to install - something about permissions - and then my anti-virus deleted DDU, including the download!
> 
> I got it from the author's web-site, here: http://www.wagnardmobile.com/forums/viewforum.php?f=5
> 
> After downloading and trying to run it, I ran my spyware and AV scans; the spyware scan picked up ransomware! Don't know where that came from... All I know is that you have to jump through a few flaming hoops to find the actual DDU download, or whatever it was...


You probably mistakenly clicked on a bad link or an ad to start with... If I remember correctly, DDU downloads are done via his forum posts on his site that link to the new versions... Wagnard Mobile is the correct site; not sure what happened there


----------



## JourneymanMike

Quote:


> Originally Posted by *mfknjadagr8*
> 
> you probably mistakenly clicked on a bad link to start with or an ad...if I remember correctly *ddu downloads are done via his forum posts on his site that link to the new versions..*.wagnard mobile is the correct site not sure what happened there


That's exactly where I got it from; it took me through 3 or 4 different pages to get that bad download...









EDIT: OK, I got it... It took me, from the home page, 7 different pages to get to the download!

Worked just fine!


----------



## jodybdesigns

Guru3D usually has DDU too. Always clean there.


----------



## Roboyto

Quote:


> Originally Posted by *JourneymanMike*
> 
> I thought of DDU earlier, but was unable to get it to install - something about permissions, then my anti-virus deleted DDU, including the download!
> 
> I got it from the authors web-site, here http://www.wagnardmobile.com/forums/viewforum.php?f=5
> 
> After downloading, and trying to run, I ran my spyware and AV, the spyware picked up Ransomeware! Don't know where that came from... All I know, is that you have to jump through a few flaming hoops, to find the actual DDU download, or whatever it was...


Quote:


> Originally Posted by *jodybdesigns*
> 
> Guru3D usually has DDU too. Always clean there.


 ^ What he said


----------



## spdaimon

Hi guys, new to the forum. I just picked up a card off of eBay, an ASUS R9 290X DCUII OC, since I am handing off one of my 7950s to a buddy. The card was not working right until the seller told me to lower the mem and core speeds and give it a slight bump in voltage. I assume this wouldn't nerf the card that much; I mainly want it for compute anyhow. Did anyone, manufacturer or user, release a revised BIOS to fix this issue? I mean, MSI Afterburner is only good in Windows, but in Linux or OS X (if that is possible) you'd need more of a hardware-level solution. Seems kind of bad on AMD's or ASUS's part to release such a card.


----------



## TheReciever

You could try increasing the VDDCI to help the memory, but I wouldn't push it much as IIRC they run hot already


----------



## spdaimon

Thanks. I was just reading Roboyto's Need to Know post, which pretty much answered some of my questions. I was thinking maybe applying a heatsink there if the stock non-reference cooler allows; I may have some spare Arctic Cooling heatsinks left. I don't mind having lower mem speeds. Core speed, or rather the core voltage, is what affects VRM1, correct?


----------



## JourneymanMike

I've been looking into buying a Sapphire Tri-X R9 290X, and I was just wondering if these were built on Reference PCB...

And if it is so, would it be compatible with my Aquacomputer Kryographics R9 290X block and Active back plate, such as this one here... http://www.performance-pcs.com/aquacomputer-kryographics-hawaii-for-radeon-r9-290x-and-290-black-edition-nickel-plated.html#Questions-Answersqa[]

IDK, but somewhere on this forum, I think, I read that the Tri-X was a Reference PCB...


----------



## Roboyto

Quote:


> Originally Posted by *spdaimon*
> 
> Thanks. I was just reading Roboyto's Need to Know post. Pretty much answered some of my questions. I was thinking maybe applying a heatsink there if the stock non-reference cooler allows. I may have some spare Artic Cooler heatsinks left. I don't mind having lower mem speeds. Core speeds is what affects or rather the core voltage is what affects VRM1 correct?


Core voltage pertains to VRM1, you are correct.
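As a rough back-of-the-envelope for why core voltage hammers VRM1: the VRM must shed as heat whatever its conversion losses are on the power it delivers to the core. A small sketch, with assumed (not measured) numbers:

```python
# Rough estimate of heat dissipated by the core VRM (VRM1).
# All figures are illustrative assumptions, not measured values.

def vrm_loss_watts(core_voltage, core_current, efficiency=0.85):
    """Heat the VRM stage itself must shed for a given core load.

    Power delivered to the core is V * I; an `efficiency`-efficient
    VRM wastes (1 - efficiency) / efficiency of that as heat.
    """
    delivered = core_voltage * core_current
    return delivered * (1.0 - efficiency) / efficiency

# Example: ~1.25 V at 200 A (a heavily overvolted Hawaii core, assumed)
# through an assumed 85%-efficient VRM stage.
loss = vrm_loss_watts(1.25, 200, 0.85)
print(f"VRM1 heat: {loss:.0f} W")  # on the order of 40-45 W
```

Which is why a few dozen watts concentrated in that little strip of phases needs good pads or a shim far more than the RAM does.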

Are you having issues with the card just sustaining stock clocks? May not be a bad idea to see if the BIOS(es) are factory. Quite common for these cards to be tinkered with and pushed to the brink due to their rather stout nature.

The DC2 290(X) had a heatsink that was designed for the 780, and not all the heatpipes make contact with the GPU die. This is why the cooling performance, for that otherwise solid DC2, was less than great on the 290(X); I make mention of it in that post of mine. I think it may be possible to help the performance with a thin copper shim to get some of the heat over to the other unused heat pipes. You would probably need to use some longer screws to secure the heatsink from the backside due to the height change.

If you can fit some RAM sinks under there you could put them on, but it may not affect the RAM performance any. I think these are what I used with my Kraken setup:

http://www.performance-pcs.com/3m-8810-thermally-conductive-adhesive-transfer-tapes-thermal-sink-8-pack.html

http://www.overclock.net/t/1478544/the-build-formerly-known-as-the-devil-inside-a-gaming-htpc-for-my-wife-4770k-corsair-250d-powercolor-r9-290/20_20#post_22214255

A more important cooling upgrade would be the thermal pads for VRM1. You would probably see a decent temperature drop since ASUS has a decent heatsink on them. Fujipoly Ultra Extreme or Sarcon XR-M; most likely 1mm thickness would be enough, I imagine, for that type of heatsink.

http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures



Spoiler: DC2 Under The Hood















On another note did everyone see our lead in the Fanboy Competition? Red Team stepping up big time in the last 72 hours to put some points on the board!


----------



## Roboyto

Quote:


> Originally Posted by *JourneymanMike*
> 
> I've been looking into buying a Sapphire Tri-X R9 290X, and I was just wondering if these were built on Reference PCB...
> 
> And if it is so, would it be compatible with my Aquacomputer Kryographics R9 290X block and Active back plate, such as this one here... http://www.performance-pcs.com/aquacomputer-kryographics-hawaii-for-radeon-r9-290x-and-290-black-edition-nickel-plated.html#Questions-Answersqa[]
> 
> IDK, but somewhere on this forum, I think, I read that the Tri-X was a Reference PCB...


https://www.ekwb.com/configurator/

I believe there have been numerous versions of the Tri-X...so be certain you know which one exactly and Cooling Configurator will tell you if they are reference or not.


----------



## JourneymanMike

Quote:


> Originally Posted by *Roboyto*
> 
> https://www.ekwb.com/configurator/
> 
> I believe there *have been numerous versions of the Tri-X*...so be certain you know which one exactly and Cooling Configurator will tell you if they are reference or not.


That's kinda what I understood. the first ones were reference PCB, then it changed, I believe...

I guess I'll have to get the model No.'s, off the cards I'm considering...

Thanks for the link... + Rep.


----------



## battleaxe

Quote:


> Originally Posted by *JourneymanMike*
> 
> That's kinda what I understood. the first ones were reference PCB, then it changed, I believe...
> 
> I guess I'll have to get the model No.'s, off the cards I'm considering...
> 
> Thanks for the link... + Rep.


You def have to get the model number. Because I have one and it is not compatible with any waterblock.


----------



## JourneymanMike

Quote:


> Originally Posted by *battleaxe*
> 
> You def have to get the model number. Because I have one and it is not compatible with any waterblock.


I know that's true of the Gigabyte Windforce, but from the reading I've done on the Tri-X, when it first came out it was said to be based on the reference PCB, with a custom cooler added... Now I'm not sure what they mean by *"Based On"*, they don't make it very clear ...


----------



## battleaxe

Quote:


> Originally Posted by *JourneymanMike*
> 
> I know that's true of the Gigabyte Windforce, but from the reading I've done on the Tri-X, when it first came out, says it was based on the reference PBC, with a custom cooler added... Now I'm not sure what they mean by *"Based On"*, they don't make it very clear ...


I have a Sapphire Tri-X 290x, and it's not compatible. Just an FYI


----------



## JourneymanMike

Quote:


> Originally Posted by *battleaxe*
> 
> I have a Sapphire Tri-X 290x. And its not compatible. Just an FYI


Not so fast, you must have the New Edition of the Tri-X; according to the EK configurator, there are two models of the Tri-X that are reference PCB, Models 11226-00 & 11226-03, and of course the reference model 21226-00...

https://www.ekwb.com/shop/ek-fc-r9-290x-acetal-nickel-rev-2-0

Check the list...


----------



## battleaxe

Quote:


> Originally Posted by *JourneymanMike*
> 
> Not so fast, you must have the New Edition of the Tri-X, according to the EK configurator, there are two models of the Tri-X that are reference PCB, Models 11226-00 & 11226-03, and of coarse the reference model 21226-00...
> 
> https://www.ekwb.com/shop/ek-fc-r9-290x-acetal-nickel-rev-2-0
> 
> Check the list...


Yes, new edition.


----------



## JourneymanMike

Quote:


> Originally Posted by *battleaxe*
> 
> Yes, new edition.


I found the exact Tri-X that my block will fit, Woo Hoo!









Correction, I just bought the Tri-x Version 11226-00, it should fit my block, according to the EK configurator...

Guess I'll report back on this, when it gets here!


----------



## Roboyto

Quote:


> Originally Posted by *JourneymanMike*
> 
> I found the exact Tri-X that my block will fit, Woo Hoo!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Correction, I just bought the Tri-x Version 11226-00, it should fit my block, according to the EK configurator...
> 
> Guess I'll report back on this, when it gets here!


----------



## spdaimon

Quote:


> Originally Posted by *Roboyto*
> 
> Core voltage pertains to VRM1, you are correct.
> 
> Are you having issues with the card just sustaining stock clocks? May not be a bad idea to see if the BIOS(es) are factory. Quite common for these cards to be tinkered with and pushed to the brink due to their rather stout nature.
> 
> The DC2 290(X) had a heatsink that was designed for the 780, and not all the heatpipes make contact to the GPU die. This is why the cooling performance, for that otherwise solid DC2, was less than great on the 290(X); I make mention of it in that post of mine. I think it may be possible to help the performance with a thin copper shim to get some of the heat over to the other unused heat pipes. You would probably need to use some longer screws to secure the heatsink from the backside due to the height change.
> 
> If you can fit some RAM sinks under there you could put them on, but it may not effect the RAM performance any. I think these are what I used with my Kraken setup:
> 
> http://www.performance-pcs.com/3m-8810-thermally-conductive-adhesive-transfer-tapes-thermal-sink-8-pack.html
> 
> http://www.overclock.net/t/1478544/the-build-formerly-known-as-the-devil-inside-a-gaming-htpc-for-my-wife-4770k-corsair-250d-powercolor-r9-290/20_20#post_22214255
> 
> A more important cooling upgrade would be the thermal pads for VRM1. You would probably see a decent temperature drop since ASUS has a decent heatsink on them. Fujipoly Ultra Extreme or Sarcon XR-M. Most likely 1mm thickness would be enough I imagine for that type of heatsink.
> 
> http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures
> 
> 
> Spoiler: DC2 Under The Hood
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> On another note did everyone see our lead in the Fanboy Competition? Red Team stepping up big time in the last 72 hours to put some points on the board!


Thanks for responding so quickly. Thanks again for the info.
I had some heatsinks like those 3M 8810 that came with a GTX Arctic Cooling kit. Yeah, the thermal tape was lacking. I ended up thermal-epoxying some on... it's not like I was going to take the kit off again.
This kind of reminds me of my old 4870X2... went the watercooling route, sort of. It was so loud and hot, I got a DD waterblock and a Zalman Reserator V2. Completely silent, it was eerie, and it didn't go above 55C even with the Core2 Quad in the loop. I still have the Reserator sitting around; even though people say it's junk, it performed just fine for me.

I am pretty sure the BIOS is stock. The specs match the ASUS specs... 1050 core clock, 1350 mem clock. It's now at 980/1100 with a +25% power limit and a +38mV boost. After idling all day around 37/38C, VRM1 and VRM2 are at 41/40C now that I am browsing OCN.

Ok, I just noticed that ASUS approved my RMA request. I did all this before researching the problem. With the flickering and black screening, it surely seemed like a faulty card. Is it worth me sending it back?


----------



## Roboyto

Quote:


> Originally Posted by *spdaimon*
> 
> Thanks for responding so quickly. Thanks again for the info.
> I had some heatsinks like those 3M 8810 that came with a GTX Artic Cooling kit. Yea, the thermal tape was lacking. I ended up thermal epoxying some on...its not like I was going to take the kit off again.
> This kind of reminds me of my old 4870X2...went the watercooling route, sort of. It was so loud and hot, I got a DD waterblock and a Zalman Reservator V2. Completely silent, it was eerie and didn't go above 55C even with the Core2 Quad in the loop. I still have the Reservator sitting around even though people say its junk, it performed just fine for me.
> 
> I am pretty sure the BIOS is stock. The specs match the ASUS specs... 1050 core clock, 1350 mem clock. Its now at 980/1100 with a +25% power level and +38mV boost. After idling all day around 37/38C the VRM1 and VRM2 are at 41/40C now that I am browsing OCN.
> 
> Ok, I just noticed that ASUS approved my RMA request. I did all this before researching the problem. With the flickering and black screening, it surely seemed like a faulty card. Is it worth me sending it back?


If it can't sustain stock clocks with stock power and voltage then I would send it back... that is a bad sign.


----------



## mus1mus

Congrats EVERYONE..

Solid TEAM WORK!


----------



## mus1mus

Guys, Heads UP

http://www.guru3d.com/files-details/amd-catalyst-15-10-beta-driver-download.html

http://forums.guru3d.com/showthread.php?t=403142

Gave me this

NEW HIGH SCORE


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> Guys, Heads UP
> 
> http://www.guru3d.com/files-details/amd-catalyst-15-10-beta-driver-download.html
> 
> http://forums.guru3d.com/showthread.php?t=403142
> 
> Gave me this
> 
> NEW HIGH SCORE


Nice. Jeez.







Redonkulous.


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> Redonkulous.


There's more. http://www.3dmark.com/fs/8063796

I might try to secure them Hardware records just coz the contest didn't give me enough time to tinker with the QUAD set-up.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> There's more. http://www.3dmark.com/fs/8063796
> 
> I might try to secure them Hardware records just coz the contest didn't give me enough time to tinker the QUAD set-up.


How does tri-Xfire scale in games? Does it work pretty well?

I'm considering going Tri-Xfire.

I would imagine with the 8GB 390Xs it would do pretty decent?


----------



## mus1mus

I am only benching at the moment, but it shows pretty good scaling.

Now, ideas? Or a stupid idea?


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> I am only benching at the moment. But it show pretty good scaling.
> 
> Now, ideas: or stupid idea?


Hmm.... LOL... looks like some things I have done. I wouldn't expect VRM1 to stay very cool though.










Edit: that's pretty funny though. But hey, if it works. Who cares right?









Is that a good card? One that does 1400?


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> Hmm.... LOL... looks like some things I have done. I wouldn't expect VRM1 to stay very cool though.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: that's pretty funny though. But hey, if it works. Who cares right?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Is that a good card? One that does 1400?


I bet you figured it out.

I want to test first if water cooling the heatpipe means a cooler CORE, thus the CPU block in there. If that works, I will solder copper pipes onto the stock cooler so I keep the stock look.









And ohh, that is a dead card.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> I bet you figured it out.
> 
> I want to test first if water cooling the heatpipe means cooler CORE. Thus the CPU block in there. If that works, I will solder copper pipes unto the that stock cooler so I keep the look stock.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And ohh, that is a dead card.


Good idea. Who have we seen do that before? LOL

Hey, the copper pipes on the VRMs are working splendidly. Only about 5C hotter than a full cover block from what I can tell. It's a little tricky getting good contact though. Takes some fiddling for sure. But your method would be easy. Good ideas, I see where you are going with this. Makes me think. LOL


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> Good idea. Who have we seen do that before? LOL
> 
> Hey the copper pipes on the VRM's are working splendidly. Only about 5C hotter than a full cover block from what I can tell. Its a little tricky getting good contact though. Takes some fiddling for sure. But your method would be easy. Good ideas, I see where you are going with this. Makes me think. LOL


Yeah. I have thought about your mod.

Thing is, I am not sure what the heat transfer would be like going from the core through the heatpipe to a waterblock.









I'll test it for sure.


----------



## mus1mus

Well, well, well...











Not bad. Considering it's not even soldered!


----------



## battleaxe

Spoiler: Warning: Spoiler!



Quote:


> Originally Posted by *mus1mus*
> 
> Well, well, well...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not bad. Considering it's not even soldered!






I'm either lame or dumb (maybe both) but what's the core at?

Edit: Dude, I love experiments like this. So much fun. This is more rewarding than slapping full cover blocks on 'cause it's more involved. Lots of tweaking and thinking about what to do next.

I'm shocked the VRM1 only got to 61C when getting hit with 1.4+ volts. Not bad at all.

+1

If you turned the block sideways could you screw that block right into the copper plate if it were threaded?


----------



## JourneymanMike

Quote:


> Originally Posted by *mus1mus*
> 
> Well, well, well...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Not bad. Considering it's not even soldered!


You really need to post this in the Ghetto Forum! Lookin' good


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> I bet you figured it out.
> 
> I want to test first if water cooling the heatpipe means cooler CORE. Thus the CPU block in there. If that works, I will solder copper pipes unto the that stock cooler so I keep the look stock.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And ohh, that is a dead card.


I see what you are doing now.

This is a crazy idea. Using the stock copper block as the waterblock and soldering some pipe to it. Best to maximize the contact area; if possible, make several loops hitting the block. And you'd probably have to do it in the oven. It's gonna take crazy heat with a torch, so that may not work.

But in the oven, hit it at 400+ and then lots of solder. Should work. You're nuts man. LOL

Great ideas here.


----------



## Vellinious

Quote:


> Originally Posted by *gupsterg*
> 
> ROM is done, before releasing it I'd like to see an i2cdump for your card using stock ROM and settings, just so I can verify stock fSW on your card.
> 
> Example of how your shortcut property for "Target" box should be:-
> 
> "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" -i2cdump
> 
> *Note:* the space between " -
> 
> You'll find an i2cdump.txt in install dir of MSI AB, attach to post
> 
> 
> 
> 
> 
> 
> 
> .


I'm home...will probably mess with this tomorrow. Is there a way to get you what you need without downloading that software?


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> 
> I'm either lame or dumb (maybe both) but what's the core at?
> 
> Edit: Dude, I love experiments like this. So much fun. This is more rewarding than slapping full cover blocks on cause its more involved. Lots of tweaking and thinking about what to do next.
> 
> I'm shocked the VRM1 only got to 61c when getting hit with 1.4+volts. Not bad at all.
> 
> +1
> 
> If you turned the block sideways could you screw that block right into the copper plate if it were threaded?


Core didn't hit 50C. The app was bugged by spikes to 600C, so I switched to the average reading. The copper base didn't heat up or even get warm at all. So it is working. :weh:
Quote:


> Originally Posted by *JourneymanMike*
> 
> You really need to post this in the Ghetto Forum! Lookin' good


Ugh.








Quote:


> Originally Posted by *battleaxe*
> 
> I see what you are doing now.
> 
> This is a crazy idea. Using the stock copper block as the block and soldering some pipe to it. Best to maximize the contact area. If possible make several loops hitting the block. And probably have to do it in the oven. Gonna take crazy heat with a torch, so that may not work.
> 
> But in the oven, hit it at 400+ and then lots of solder. Should work. You're nuts man. LOL
> 
> Great ideas here.


Nice tips. That seems to be an easier way to get this done!


----------



## gupsterg

Quote:


> Originally Posted by *Vellinious*
> 
> I'm home...will probably mess with this tomorrow. Is there a way to get you what you need without downloading that software?


AIDA64 (available as a 30-day trial); view the embedded video for how to do the dump using that.


----------



## Vellinious

Quote:


> Originally Posted by *gupsterg*
> 
> AiDA64 (available 30 day trial), view
> 
> 
> 
> for how to do dump using that.


Hmm... maybe the debug feature isn't in the free version.



I'm just gonna buy it. Been meaning to anyway.


----------



## gupsterg

In the "View" menu, enable the status bar, then right-click the status bar to get the menu.

I gotta add this to the video/instructions since a) I always have the status bar enabled and b) by default it's off.


----------



## Vellinious

Tricky....

Here's the file. Let me know if you need anything else. That should be with the stock BIOS.

atismbusdump.zip 4k .zip file


----------



## tictoc

Quote:


> Originally Posted by *mus1mus*
> 
> I am only benching at the moment. But it show pretty good scaling.
> 
> Now, ideas: or stupid idea?


I did something similar on a 7970 with an AIO, and it worked great.



That card crunched numbers 24/7 at 1215/1550 with max volts for more than one year with no issues.


----------



## fyzzz

I traded my 980 Ti for 2x ASUS reference cards with Kryographics waterblocks and the active backplate. I'm testing them one by one to begin with. I have one of them installed now; Elpida memory, as I expected. But its ASIC is 79.1% and it does 1150/1450 on stock voltage without any problem. Can't wait to try out a custom BIOS, but first I'm going to test it on the stock BIOS.


----------



## mus1mus

Quote:


> Originally Posted by *tictoc*
> 
> I did something similar on a 7970 with an AIO, and it worked great.
> 
> 
> 
> That card crunched numbers 24/7 at 1215/1550 with max volts for more than one year with no issues.


Nice. I wonder why not many do this mod. It seems to be rather effective!

Quote:


> Originally Posted by *fyzzz*
> 
> I traded my 980 ti for 2x Asus reference cards with kryographics waterblock and the active backplate. I'm testing them one by one to begin with. I have one of them installed now, elpida memory as i expected. But it's asic is 79.1% and it does 1150/1450 on stock voltage without any problem. Can't wait to try out a custom bios, but first i'm going to test it on stock bios.


3 on 3 shoot out again?


----------



## gupsterg

@Vellinious

Cheers. Due to the nature of the ROM I have PM'd it to you. There is a guide within the zip to edit the ROM to varying fSW; if stuck, PM for info.

@fyzzz

Nice card


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> 3 on 3 shoot out again?


I will 'only' run 2 cards; 3 cards would require lots of cooling and a new PSU, and my motherboard/CPU can't really do 3. But this card that I'm testing now seems to be awesome. Stock BIOS 1210/1625 +75mV: http://www.3dmark.com/fs/8079244 - no artifacts on this run. Also it seems to like more VDDCI; at first when I tried the 1625 memory clock it blackscreened almost instantly, but it ran just fine with 1.050V.


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> I will 'only' run 2 cards, 3 cards would require lot's of cooling, new psu and my motherboard/cpu can't really do 3. But this card that I'm testing now seems to be awesome. Stock Bios 1210/1625 +75mv: http://www.3dmark.com/fs/8079244. no artifacts on this run. Also it seems to like more vddci, at first when i tried 1625 memory clock it blackscreened almost instantly, but it ran just fine with 1.050.


Your biggest challenge is power, not temps on water.

You should have at least 1200W with 2 cards overclocked.
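That 1200W figure can be sanity-checked with a quick power budget. A sketch with assumed wattages (ballpark figures for illustration, not measurements of any specific rig):

```python
# Back-of-the-envelope PSU sizing for two overclocked 290/290X cards.
# All wattages below are assumptions for illustration only.

def psu_recommendation(gpu_watts, gpu_count, system_watts, headroom=0.8):
    """Smallest PSU rating that keeps the total load at or below
    `headroom` of the PSU's rated output (0.8 = stay under 80%)."""
    load = gpu_watts * gpu_count + system_watts
    return load / headroom

# Two heavily overclocked Hawaii cards (~350 W each is a common ballpark
# with raised power limits) plus ~250 W for CPU, board, drives, pumps.
needed = psu_recommendation(350, 2, 250)
print(f"Recommended PSU: {needed:.0f} W")  # just under 1200 W
```

So an 850W unit ends up running near its limit with two pushed cards, which is exactly why a second PSU or a 1200W-class unit comes up.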


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> Your biggest challenge is Power. Not twmps on water.
> 
> You should have at least 1200W with 2 cards overclocked.


Yeah, I know that my SuperNOVA 850 G2 will probably not be enough when I push 2 cards. I still have my old RM 750 that I could maybe use as extra power.


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> Yeah, i know that my supernova 850 g2 will probably be not enough when i push 2 cards. I still have my old rm 750 that i could maybe use as extra power.


Good idea. But with a little OC, you may get away with it.

My advice, though, is to look for the card that black screens less at higher voltages and set that as the primary. 2 cards scale very well. Even 3.

It takes a lot of effort on fours!









And yeah, just tap both green cables from each PSU together to sync powering them on.


----------



## fyzzz

It's really, really weird to OC two cards. I have the 79.1% ASIC Elpida as the main card and my 78.3% ASIC Hynix card as secondary. It was a mistake installing the Crimson driver: weird crashes, and the memory almost didn't OC at all. Then I tried the 15.10 beta and it works much better; I could at least get the Elpida card to 1200/1500 and the Hynix card to 1200/1625. Both are flashed with a custom 390 BIOS. Highest score so far: http://www.3dmark.com/fs/8082461.

Instant blackscreen in crysis 3







got a lot of issues to solve


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> It's really, really weird to oc two cards. I have the 79.1% asic elpida as main card and my 78.3% asic hynix card as secondary. It was a mistake installing crimson driver, weird crashes, memory didn't oc at all almost. Then i tried 15.10 beta and it works much better, i could atleast get the elpida card to 1200/1500 and the hynix card to 1200/1625, both is flashed with a custom 390 bios, highest score so far: http://www.3dmark.com/fs/8082461.
> 
> Instant blackscreen in crysis 3
> 
> 
> 
> 
> 
> 
> 
> got a lot of issiues to solve


15.10 works much, much better. Even 15.9 topples my scores with Crimson.

You might also find it better to run memory at the same clocks. 1500 worked best for me with multiple cards; 1625 didn't introduce issues like artifacts, but scored lower.

Also, 2X scaling is linear. And synched clocks perform better than async.
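For anyone wanting to put a number on that, scaling efficiency falls straight out of the graphics scores. A small sketch; the scores below are made-up placeholders, not real results:

```python
# CrossFire scaling efficiency from benchmark graphics scores.
# The example scores are placeholders for illustration only.

def scaling_efficiency(single_card_score, multi_card_score, card_count):
    """1.0 means perfectly linear scaling across `card_count` cards."""
    return multi_card_score / (single_card_score * card_count)

# e.g. a single card scoring 14,000 graphics points and a 2-card setup
# scoring 26,600 sits at 95% of perfect 2x scaling.
eff = scaling_efficiency(14000, 26600, 2)
print(f"{eff:.0%} of linear")
```

Anything in the 90%+ range counts as near-linear; the gap usually widens as you add a third or fourth card.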

Were you able to test which card didn't black screen at high (1.4-ish) voltages? If you can find that one, put it in the primary slot and clock/overvolt the secondary high enough for high clocks, and run async when gunning for high scores.

And yeah, congrats for the win mate.









Same goes to

@rt123
@kizwan

Felt good to see you 3 being winners!


----------



## kizwan

Quote:


> Originally Posted by *fyzzz*
> 
> It's really, really weird to oc two cards. I have the 79.1% asic elpida as main card and my 78.3% asic hynix card as secondary. It was a mistake installing crimson driver, weird crashes, memory didn't oc at all almost. Then i tried 15.10 beta and it works much better, i could atleast get the elpida card to 1200/1500 and the hynix card to 1200/1625, both is flashed with a custom 390 bios, highest score so far: http://www.3dmark.com/fs/8082461.
> 
> Instant blackscreen in crysis 3
> 
> 
> 
> 
> 
> 
> 
> got a lot of issiues to solve


I can do 1180/1500 (both cards) in benches with Crimson & voltage at +100mV. Above +100mV, it will crash while the bench is loading regardless of clock. This is new actually; I never had this problem before. Maybe 15.10 will fix this problem.

@mus1mus, thanks!


----------



## fyzzz

I don't know anymore; even with one card I have lots of issues now. I'm going to reinstall Windows and hope that it works better.


----------



## mus1mus

Quote:


> Originally Posted by *kizwan*
> 
> I can do 1180/1500 (both cards) in bench with Crimson & voltage +100mV. Above +100mV, it will crash while bench loading regardless clock. This is new actually, never have this problem before. Maybe 15.10 will fixed this problem.
> 
> @mus1mus, thanks!


You did replace the TIM, right?

Quote:


> Originally Posted by *fyzzz*
> 
> I don't know anymore, even with one card i have lot's of issues now. I'm going to reinstall windows and hope that it works better.


I have had several issues like that during the competition.







had to DDU a lot and reinstall Windows every other day.


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> You did replace the TIM, right?


Not yet. The hot weather makes me very lazy. The good card (50s to low 60s Celsius on the core depending on the overclock) is also having this problem when testing individually.


----------



## mus1mus

Quote:


> Originally Posted by *kizwan*
> 
> Not yet. Overheat weather make me very lazy. The good card (50s to low 60s Celsius on the core depending on the overclock) also having this problem when testing individually.


Ahhh... That might be the issue really.

On the reference cooler, I can't exceed 60C on the core, but it seems to be more sensitive to voltage than when it's on water.

Popping off the cooler revealed some lumps on the copper base touching the die. I did replace it with a flatter, semi-polished one, so that might help. See my mod a few posts back.

I am yet to run it on 16C ambient temps so yeah. That will be later.







we'll see how that card scales on cold with my mod.


----------



## kizwan

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *fyzzz*
> 
> It's really, really weird to oc two cards. I have the 79.1% asic elpida as main card and my 78.3% asic hynix card as secondary. It was a mistake installing crimson driver, weird crashes, memory didn't oc at all almost. Then i tried 15.10 beta and it works much better, i could atleast get the elpida card to 1200/1500 and the hynix card to 1200/1625, both is flashed with a custom 390 bios, highest score so far: http://www.3dmark.com/fs/8082461.
> 
> Instant blackscreen in crysis 3
> 
> 
> 
> 
> 
> 
> 
> got a lot of issiues to solve
> 
> 
> 
> I can do 1180/1500 (both cards) in bench with Crimson & voltage +100mV. Above +100mV, it will crash while bench loading regardless clock. This is new actually, never have this problem before. Maybe 15.10 will fixed this problem.
Click to expand...

Oops! Correction, actually 1140/1500 with (1.287V)+100mV. 1180 only possible when running Vantage.








Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Not yet. The hot weather makes me very lazy. The good card (50s to low 60s Celsius on the core, depending on the overclock) is also having this problem when tested individually.
> 
> 
> 
> Ahhh... That might be the issue really.
> 
> On reference cooler, I can't exceed 60C on the core but it seems to be more sensitive with Voltage than having it on water.
> 
> Popping out the cooler revealed some lumps on the copper base touching the die. I did replace it with a flatter one and semi polished so that might help. See my mod a few posts back.
> 
> I am yet to run it on 16C ambient temps so yeah. That will be later.
> 
> 
> 
> 
> 
> 
> 
> we'll see how that card scales on cold with my mod.
Click to expand...

Ambient: 35C
Water temp: 38C
Clocks: 1140/1500 @1.287V+100mV
GPU1 core temp: 66C
GPU1 VRM1: 60C
GPU2 core temp: 78C
GPU2 VRM1: 76C

















Yeah, yeah, I know. I will change the TIM on the GPU2. Just need to find time for it.







Honestly, when I clean my loop I've never touched the blocks yet. Maybe I should, to clean out any gunk in them.
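Kizwan's numbers above actually make the diagnosis: with a shared loop, both cards see the same water temperature, so the core-minus-water delta isolates each card's block mount / TIM. A quick sketch (per-card power is assumed roughly equal, which is an assumption):

```python
# Mount-quality check from the readings above: same loop, same water
# temp, so the core-to-water delta reflects each card's TIM/mount.
water = 38                      # shared loop water temp, C
gpu1_core, gpu2_core = 66, 78   # reported core temps, C

delta1 = gpu1_core - water      # GPU1: 28 C above water
delta2 = gpu2_core - water      # GPU2: 40 C above water

# A 12 C gap at (roughly) equal load points at GPU2's TIM or mount,
# not at the loop itself.
print(delta1, delta2)
```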


----------



## fyzzz

Everything seems to work just fine now, even with crimson. I have reinstalled everything and i have my rm 750 as extra power for the cards.


----------



## kizwan

Quote:


> Originally Posted by *fyzzz*
> 
> Everything seems to work just fine now, even with crimson. I have reinstalled everything and i have my rm 750 as extra power for the cards.


That's good news. Looks like I have a lot to do on mine.


----------



## fyzzz

The Elpida memory on my second card is holding me back. It runs just fine at 1500MHz until the end of test 2 in Firestrike, and then RSOD. When only that card was installed, I could bench it at 1625MHz. I also reached a 29k graphics score with my Hynix card at 1240/1625 and my Elpida card at 1250/1450: http://www.3dmark.com/fs/8098401.


----------



## kizwan

Quote:


> Originally Posted by *fyzzz*
> 
> The Elpida memory on my second card is holding me back. It runs just fine at 1500MHz until the end of test 2 in Firestrike, and then RSOD. When only that card was installed, I could bench it at 1625MHz. I also reached a 29k graphics score with my Hynix card at 1240/1625 and my Elpida card at 1250/1450: http://www.3dmark.com/fs/8098401.


RSOD = fun stuff. You should put the Elpida card as primary & the Hynix card as secondary.


----------



## fyzzz

Quote:


> Originally Posted by *kizwan*
> 
> RSOD = fun stuff. You should put Elpida card as primary & Hynix card as secondary.


At first I had it set up that way, but it doesn't seem to matter which card is first or second.


----------



## Vellinious

Quote:


> Originally Posted by *kizwan*
> 
> Oops! Correction, actually 1140/1500 with (1.287V)+100mV. 1180 only possible when running Vantage.
> 
> 
> 
> 
> 
> 
> 
> 
> Ambient: 35C
> Water temp: 38C
> Clocks: 1140/1500 @1.287V+100mV
> GPU1 core temp: 66C
> GPU1 VRM1: 60C
> GPU2 core temp: 78C
> GPU2 VRM1: 76C
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Yeah, yeah, I know. I will change the TIM on the GPU2. Just need to find time for it.
> 
> 
> 
> 
> 
> 
> 
> Honestly when I clean my loop, I've never touch the blocks yet. Maybe I should, to clean any gunk in it.


35c ambient? Good lord, man. Where do you live?!


----------



## Offler

Erhm.. My original submission to top 30 in graphics score.

http://www.overclock.net/t/1406832/single-gpu-fire-strike-top-30/1200#post_22936403

At that point I was able to achieve 3900MHz, but due to stability issues I had to downclock to 3800MHz, and at those frequencies I had absolutely no artifacts.

So far best result since I tried to repeat the test:
http://www.3dmark.com/fs/7877224

Exactly same system settings, but...
a) Artifacts in Graphic tests
b) Unstable results. It was fluctuating badly.
c) Graphic driver not approved.
d) It's nice to see that the graphics score is better.

My personal rules are that the result has to be valid, driver settings on "reset to defaults" with max power, and no artifacts. That also indicates the driver was not tweaked and the result is genuine.

Anyway, it's nice to see that the graphics score is a bit better, but the overall score is not...

Any experience with more artifacting on Crimson drivers at the same (topmost) frequency?


----------



## mus1mus

Quote:


> Originally Posted by *Offler*
> 
> Erhm.. My original submission to top 30 in graphics score.
> 
> http://www.overclock.net/t/1406832/single-gpu-fire-strike-top-30/1200#post_22936403
> 
> At that point I was able to achieve 3900MHz, but due to stability issues I had to downclock to 3800MHz, and at those frequencies I had absolutely no artifacts.
> 
> So far best result since I tried to repeat the test:
> http://www.3dmark.com/fs/7877224
> 
> Exactly same system settings, but...
> a) Artifacts in Graphic tests
> b) Unstable results. It was fluctuating badly.
> c) Graphic driver not approved.
> d) It's nice to see that the graphics score is better.
> 
> My personal rules are that the result has to be valid and driver settings on "reset to defaults" with max power, and no artifacts. It also indicates that the driver was not tweaked, and the result is really fine.
> 
> Anyway, it's nice to see that the graphics score is a bit better, but the overall score is not...
> 
> Any experience regarding more artifacting with Crimson drivers on same (topmost) frequency?


Topmost frequency? You are not yet there.









Provide the card either more voltage or better cooling!
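For a rough intuition of the voltage-versus-cooling trade-off: dynamic power scales roughly with frequency times voltage squared, so a voltage bump costs far more heat than the clock gain alone suggests. A back-of-envelope sketch; all numbers below are illustrative assumptions, not measurements from any 290X:

```python
# Rough GPU dynamic-power scaling sketch: P is approximately
# proportional to f * V^2. Illustrative numbers only.

def relative_power(f0, v0, f1, v1):
    """Ratio of dynamic power at (f1, v1) versus baseline (f0, v0)."""
    return (f1 / f0) * (v1 / v0) ** 2

# e.g. going from 1000 MHz @ 1.20 V to 1150 MHz @ 1.30 V:
# a 15% clock bump with an 8% voltage bump costs ~35% more heat.
ratio = relative_power(1000, 1.20, 1150, 1.30)
print(f"~{ratio:.2f}x dynamic power")
```

Which is why "more voltage" and "better cooling" are the same knob in practice: the extra volts only stick if the cooler can absorb the roughly quadratic power increase.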


----------



## Offler

Quote:


> Originally Posted by *mus1mus*
> 
> Topmost frequency? You are not yet there.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Provide the card either more voltage or better cooling!


I am planning to get a closed-loop water cooler. Not so much for thermal reasons as because I am slowly getting deaf.


----------



## mus1mus

Look up my mod a couple of pages back, if you find that easier.


----------



## Offler

Well, many pages back I posted some info about lapping the stock cooler on the R9 290X. Contact with the chip was poor because of the shape and bumps on the vapour chamber. After lapping, the stock cooler gave 10°C lower temperatures at the same ambient and fan speed. In any case, 1180MHz was beyond my expectations for the stock cooler.

The cooler itself isn't that bad (when properly seated, which was not done at the factory), but it's very noisy.


----------



## mus1mus

Quote:


> Originally Posted by *Offler*
> 
> Well, many pages back I posted some info about lapping the stock cooler on the R9 290X. Contact with the chip was poor because of the shape and bumps on the vapour chamber. After lapping, the stock cooler gave 10°C lower temperatures at the same ambient and fan speed. In any case, 1180MHz was beyond my expectations for the stock cooler.
> 
> The cooler itself isn't that bad (when properly seated, which was not done at the factory), but it's very noisy.


Very true.

I have a total of 5 ref blowers and some of them have bad surfaces.

I did lap one too. But I did it on the flattest one I have, and was pumping +250 on the card at 100% fan speed and 15C ambient. Good enough to max the card at 1250/1650. Temps didn't hit 70C during benching.

But I already have a trio on full covers. I just needed the fourth to have more decent temps than the 70s. Hence that mod.

It works better than expected! Still getting the max of 1250, but temps are at 45C during several GPUPI 32B runs that take 20 minutes each.

When done with @battleaxe 's copper pipe mod, this thing can easily outplay full covers!

We learn a new trick each day, I guess!


----------



## nX3NTY

Quote:


> Originally Posted by *Offler*
> 
> Well, many pages back I posted some info about lapping the stock cooler on the R9 290X. Contact with the chip was poor because of the shape and bumps on the vapour chamber. After lapping, the stock cooler gave 10°C lower temperatures at the same ambient and fan speed. In any case, 1180MHz was beyond my expectations for the stock cooler.
> 
> The cooler itself isn't that bad (when properly seated, which was not done at the factory), but it's very noisy.


There is a stock cooler mod where people cut the rear I/O panel to reduce the noise. You can even remove the panel altogether.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> Very true.
> 
> I have a total of 5 ref blowers and some of them have bad surfaces.
> 
> I did lap one too. But I did it on the flattest one I have, and was pumping +250 on the card at 100% fan speed and 15C ambient. Good enough to max the card at 1250/1650. Temps didn't hit 70C during benching.
> 
> But I already have a trio on full covers. I just needed the fourth to have more decent temps than the 70s. Hence that mod.
> 
> It works better than expected! Still getting the max of 1250 but temps at 45C during several runs for GPUPI 32B that takes 20 minutes for each run.
> 
> When done with @battleaxe 's copper pipe mod, this thing can easily outplay full covers!
> 
> We learn a new trick each day, I guess!


Nice Mus... nice.

Does it look decent? Or was that not the idea? Sounds like it turned out really well. I'm thinking about chrome plating mine, or black paint. Not sure which. I plan to redo my loop routing today to try to make the tubing look a bit more 'neat'.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> When done with @battleaxe 's copper pipe mod, this thing can easily outplay full covers!


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> Nice Mus... nice.
> 
> Does it look decent? Or was that not the idea? Sounds like it turned out really well. I'm thinking about chrome plating mine, or black paint. Not sure which. I plan to redo my loop routing today to try to make the tubing look a bit more 'neat'.


Well, I don't have the pipes yet. But temps-wise, this is as good as it gets! Mind you, it's just a block slapped onto the heat pipe at the moment, with just zip ties securing the block. Once soldered, you can only expect better results.

To do it more simply, any universal block that can be screwed to the base plate up top will work, as you can solder that base plate onto the heat pipe for better transfer and assemble the whole thing easily.
Cover it up with the original shroud and you have a stock reference blower look with water temps.


----------



## Offler

Quote:


> Originally Posted by *mus1mus*
> 
> Very true.
> 
> I have a total of 5 ref blowers and some of them have bad surfaces.
> 
> I did lap one too. But I did it on the flattest one I have, and was pumping +250 on the card at 100% fan speed and 15C ambient. Good enough to max the card at 1250/1650. Temps didn't hit 70C during benching.
> 
> But I already have a trio on full covers. I just needed the fourth to have more decent temps than the 70s. Hence that mod.
> 
> It works better than expected! Still getting the max of 1250 but temps at 45C during several runs for GPUPI 32B that takes 20 minutes for each run.
> 
> When done with @battleaxe 's copper pipe mod, this thing can easily outplay full covers!
> 
> We learn a new trick each day, I guess!


The card immediately gave me the impression of a poorly fitting cooler: high temperature even without load, and a really fast jump up under load. There was no mystery to it.
Quote:


> Originally Posted by *nX3NTY*
> 
> There is a stock cooler mod where people cut the rear I/O panel to reduce the noise. You can even remove the panel altogether.


The card is silent to me up to 36% fan speed, yet a common load without downclocking or throttling requires up to 60%. I know that mod, but I guess it does not have as big an impact.


----------



## mus1mus

Quote:


> Originally Posted by *nX3NTY*
> 
> There is a stock cooler mod where people cut the rear I/O panel to reduce the noise. You can even remove the panel altogether.


The fan itself sitting on the table is responsible for about 70% of the card's noise. And that is being generous.

So modding the shroud will have an insignificant effect. It will help a bit if you are into it, though.
Quote:


> Originally Posted by *Vellinious*


I will show you temps tomorrow!









Edit: actually, I have some pics.


----------



## spyshagg

I am VRM limited (temperatures wise). Would lowering the VRM frequency lower the temps without detriment to the stability I already have?


----------



## Torvi

hey guys, anyone here with raijintek morpheus cooler on their r9 290x? i want to get one for my gpu, is it as good as sites says?


----------



## mus1mus

Quote:


> Originally Posted by *spyshagg*
> 
> I am VRM limited (temperatures wise). Would lowering the VRM frequency lower the temps without detriment to the stability I already have?


Yes. You've just got to find the minimum frequency that keeps you stable.

Refer to my post here

From stock to, I think, 799-ish and 899-ish.

It didn't improve my artifact issue. But you can see the VRM temps go higher with frequency.
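The reason lowering the VRM switching frequency helps temps: hard-switching losses in the MOSFETs grow roughly linearly with switching frequency. A back-of-envelope sketch; the voltage, current, and edge times below are illustrative assumptions, not Hawaii VRM measurements:

```python
# Approximate hard-switching MOSFET loss: 0.5 * V * I * (tr + tf) * fsw.
# Loss scales linearly with switching frequency, so dropping the VRM
# clock (e.g. ~900 kHz -> ~800 kHz) directly lowers VRM heat.

def switching_loss(v_in, i_out, t_rise_s, t_fall_s, f_sw_hz):
    """Per-switch hard-switching loss in watts (first-order estimate)."""
    return 0.5 * v_in * i_out * (t_rise_s + t_fall_s) * f_sw_hz

# Illustrative phase: 12 V input, 20 A, 20 ns rise/fall edges.
p_900k = switching_loss(12, 20, 20e-9, 20e-9, 900e3)  # ~4.3 W
p_800k = switching_loss(12, 20, 20e-9, 20e-9, 800e3)  # ~3.8 W
print(p_900k, p_800k)
```

Conduction losses don't change with frequency, which is why the gain from downclocking the VRM is real but modest, and why there's a floor below which ripple and transient response suffer instead.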


----------



## mirzet1976

New 3DMark with a new interface and more.

http://www.futuremark.com/support/update/3dmark

What's new in 3DMark 2.0.1979

This is a major update that adds a redesigned UI for all editions and a preview of VRMark for Advanced and Professional Edition users.

New

- 3DMark UI has been redesigned and rebuilt to be faster and more flexible.
- Home screen recommends the best test for your PC based on your system details.
- Run other benchmarks and feature tests from the Benchmarks screen.
- Russian localization.

Improved

- Each benchmark test can now be updated independently.
- Ice Storm Extreme and Ice Storm Unlimited are unlocked in 3DMark Basic Edition.
- SystemInfo module updated to 4.43 for improved hardware detection.

VRMark preview

- Explore two test scenes in a preview of VRMark, our new benchmark for VR systems. The preview does not produce a score.
- The VRMark preview is not available in 3DMark Basic Edition or the Steam demo.

Fixed

*Workaround for the AMD driver issue where the preview videos in the UI caused some AMD graphics cards to use low power mode and run at lower clock speeds.*


----------



## mus1mus

I just tried this today. And oops! It has a few bugs when updating components. It takes a while to respond and should be more responsive to user input.

VRMArk looks fun though.


----------



## Paul17041993

Has there been any known cold-bug issues with the reference 290X? I've had the system unexpectedly blank-out (similar to putting the system to sleep, but the disks still do random things and the system doesn't respond to any events while remaining powered on for eternity) twice on this cold morning with only minecraft running, and I noticed that event viewer apparently has a record of the display driver crashing that I never saw...


----------



## JourneymanMike

I ran into a problem... When trying to disable ULPS, with the ULPS Configuration Utility v1.1.4, I got this message...

Error: ULPS Registry Key not found.









This was after uninstalling, using DDU, and then reinstalling with AV disabled and running as Administrator -









Anybody ever run into this? What did you do? Or, what should I do?


----------



## diggiddi

Quote:


> Originally Posted by *JourneymanMike*
> 
> I ran into a problem... When trying to disable ULPS, with the ULPS Configuration Utility v1.1.4, I got this message...
> 
> Error: ULPS Registry Key not found.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This was after uninstalling, using DDU, and then reinstalling with AV disabled and running as Administrator -
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anybody ever run into this? What did you do? Or, what should I do?


Try looking in the registry; hopefully Bradley's post will help: http://www.overclock.net/t/1587347/how-to-disable-ulps-cfx-windows-10-crimson/0_20
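For anyone who wants to check the registry by hand: ULPS is normally controlled by an `EnableUlps` DWORD under the display-adapter class key. A sketch of the .reg fix; the adapter subkey (`0000`, `0001`, ...) varies per system, and on some driver versions the value is simply absent, which would explain the utility's "key not found" error:

```
Windows Registry Editor Version 5.00

; Illustrative example only -- the adapter subkey index (0000 here)
; differs per system. Search the registry for "EnableUlps" under this
; class key first, then set the value you find to 0.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

If no `EnableUlps` value exists at all, searching for it (as in the linked post) is safer than guessing the subkey.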


----------



## Paul17041993

Quote:


> Originally Posted by *Paul17041993*
> 
> Has there been any known cold-bug issues with the reference 290X? I've had the system unexpectedly blank-out (similar to putting the system to sleep, but the disks still do random things and the system doesn't respond to any events while remaining powered on for eternity) twice on this cold morning with only minecraft running, and I noticed that event viewer apparently has a record of the display driver crashing that I never saw...


Turns out it was possibly something weird with the PCIe slot; at one point the card seemed to stop working completely but still let Windows boot to a certain point (disk and peripheral activity). But after I pulled the card out of the slot and re-seated it, it was working perfectly again.

I can only assume the cold must have made the card or the pins in the slot shift slightly...


----------



## kizwan

Quote:


> Originally Posted by *JourneymanMike*
> 
> I ran into a problem... When trying to disable ULPS, with the ULPS Configuration Utility v1.1.4, I got this message...
> 
> Error: ULPS Registry Key not found.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> This was after uninstalling, using DDU, and then reinstalling with AV disabled and running as Administrator -
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Anybody ever run into this? What did you do? Or, what should I do?


I use Trixx to disable ULPS.


----------



## gupsterg

Quote:


> Originally Posted by *mus1mus*
> 
> I just had this today. And oops! Has a bit of bugs when updating components. Takes a while to respond and should have been more responsive to user input.
> 
> VRMArk looks fun though.


I didn't have any install, GUI, etc. issues; what I miss is being able to run the FS demo looped in custom mode







, so I guess I should complain on 3DM forum?

*** edit ***

The other issue is that it doesn't remember custom settings after closing and reopening the app







.


----------



## JourneymanMike

Quote:


> Originally Posted by *kizwan*
> 
> I use Trixx to disable ULPS.


I use TriXX also... But the thing is that the registry key can't be found...

I did use TriXX, and disabled it there, now I can OC with TriXX, but I have to wonder about the missing Registry Key







...


----------



## 0razor1

Guys,

I've got the 290 Sapphire Tri-x.

Now it's the old 4+1 phase design. I'm looking to cut down on the temps.
Can you advise on how I can achieve this?
The core is perfectly fine; it's the VRMs I'm worried about.

The OC is mild, with very little VCore offset.

I doubt lapping anything will help! If I change my thermal pads, which company should I switch to?
I'd like them to be the same color as the original pads (black).


----------



## Offler

Quote:


> Originally Posted by *0razor1*
> 
> Guys,
> 
> I've got the 290 Sapphire Tri-x.
> 
> Now it's the old 4+1 phase design. I'm looking to cut down on the temps.
> Can you advise on how I can achieve this?
> The core is perfectly fine, it's the VRM's I'm worried about.
> 
> The OC is mild, with very little VCore offset.
> 
> I doubt lapping anything will help! If I change my thermal pads, to which company should I switch.
> I'd like it to be the same color as the original pads (black)


Can you provide frequencies (GPU, RAM) and temperatures (VRM, GPU) at idle and under load? You can also give stock values.

A few times I've noticed that non-reference designs are overclocked by the manufacturer, but further overclocking is almost impossible (5% on my old 7970).


----------



## 0razor1

Quote:


> Originally Posted by *Offler*
> 
> Can you provide frequencies (GPU, RAM) and temperatures (VRM, GPU) at idle and under load? You can also give stock values.
> 
> A few times I've noticed that non-reference designs are overclocked by the manufacturer, but further overclocking is almost impossible (5% on my old 7970).


Hey man, thanks for taking an interest.

AB +50 mV to the Core.
Power limit +50%
Core 1100
Mem 1500 *hynix








GPU temp with the GPU-z render test ( I know, I know) - Stabilizes at 80C (5 minutes)
Fan at this temp - 40% 1970 RPM
VDDC - 1.188 (Droops from 1.195 and 1.203)
VRM1 - 74 C
VRM2- 62 C

With some firestrike it'll hit ~100C
With OCCT the VRM1 will hit 115C+ and I shut it off since it's abnormally high.

This is with that GPU- z render and then idled for 30 seconds.


And this is after say 30 seconds of occt :/


Ambients are at ~ 27C. Case is a HAF 922 with moderate airflow. Positive pressure.


----------



## Agent Smith1984

Quote:


> Originally Posted by *0razor1*
> 
> Hey man, thanks for taking an interest.
> 
> AB +50 mV to the Core.
> Power limit +50%
> Core 1100
> Mem 1500 *hynix
> 
> 
> 
> 
> 
> 
> 
> 
> GPU temp with the GPU-z render test ( I know, I know) - Stabilizes at 80C (5 minutes)
> Fan at this temp - 40% 1970 RPM
> VDDC - 1.188 (Droops from 1.195 and 1.203)
> VRM1 - 74 C
> VRM2- 62 C
> 
> With some firestrike it'll hit ~100C
> With OCCT the VRM1 will hit 115C+ and I shut it off since it's abnormally high.
> 
> This is with that GPU- z render and then idled for 30 seconds.
> 
> 
> And this is after say 30 seconds of occt :/
> 
> 
> Ambients are at ~ 27C. Case is a HAF 922 with moderate airflow. Positive pressure.


The OCCT test is very similar to Furmark; I wouldn't use it if I were you... unrealistic heat generation, and it can really cook a card, to death even...


----------



## Offler

I use OCCT primarily to test the cooling and the power delivery/consumption of the graphics card and CPU, while checking temperatures as well.

I was hitting 65-70 degrees on the VRM with a reference card and stock cooler which had been slightly modded (thermal paste, some new thermal pads). In this case I would compare the VRM temperature with a card from the same manufacturer, because my old 7970 from Gigabyte was able to cook its VRMs up to 115 degrees as well.

And yes, when running OCCT in some stressful scenarios you have to be ready to power off the PC to avoid damage. In my case I am using an old OCCT version which is bugged and uses only one CPU core to generate the stress test. (I do it for comparison purposes.)


----------



## Agent Smith1984

Quote:


> Originally Posted by *Offler*
> 
> I use OCCT to measure primarily to test cooling and power source+consumption of the graphic card and CPU, while checking temperatures as well.
> 
> I was hitting 65-70 degrees on the VRM with a reference card and stock cooler which had been slightly modded (thermal paste, some new thermal pads). In this case I would compare the VRM temperature with a card from the same manufacturer, because my old 7970 from Gigabyte was able to cook its VRMs up to 115 degrees as well.
> 
> And yes, when running OCCT in some stressful scenarios you have to be ready to power off the PC to avoid damage. In my case I am using an old OCCT version which is bugged and uses only one CPU core to generate the stress test. (I do it for comparison purposes.)


What I'm saying is that VRMs will only hit 100C+ in apps like Furmark, OCCT, Kombustor, etc.

Try looping Heaven to see what your "real" temps are....


----------



## Offler

Quote:


> Originally Posted by *Agent Smith1984*
> 
> What I'm saying is that VRMs will only hit 100C+ in apps like Furmark, OCCT, Kombustor, etc.
> 
> Try looping Heaven to see what your "real" temps are....


It was above 110C in Heaven with the 7970. The VRMs were simply dreadful.


----------



## 0razor1

Thanks for your replies. Well, I've hit 100C on VRM1 easily with Heaven maxed out.
So, in general, I have a feeling this may be wise (just ideas at this point, feel free to talk me out of them):

1. Lightly lap the heatsink at the points of pad contact, as well as at the core?
If so, how? I've done my CPU in the past, but never a GPU heatsink.
2. Buy thermal pads - recommendations would be great. I'll be using ebay.com with global shipping.
3. Combine said efforts with fresh thermal paste as well?

Has anyone ever used thermal paste and pads together? Does that even work?
Also, what thickness of thermal pads? Preferably black.
Which metrics does one look for when purchasing? Are they similar to the conductivity and resistivity figures relevant to thermal paste?
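On the metrics question: pads are rated by thermal conductivity in W/mK, and what actually matters is the resulting thermal resistance, R = thickness / (conductivity × area). A quick sketch with illustrative dimensions (not actual 290 VRM pad sizes):

```python
# Thermal-pad sanity check. Lower resistance means cooler VRMs, so a
# thinner pad or a higher W/mK rating both help -- as long as the pad
# still fills the gap. All dimensions below are illustrative.

def pad_resistance_k_per_w(thickness_m, conductivity_w_mk, area_m2):
    """Thermal resistance of a pad in K/W: R = t / (k * A)."""
    return thickness_m / (conductivity_w_mk * area_m2)

area = 10e-3 * 40e-3                                  # 10 mm x 40 mm strip
stock = pad_resistance_k_per_w(1.0e-3, 3.0, area)     # 1.0 mm, 3 W/mK
fuji = pad_resistance_k_per_w(1.0e-3, 11.0, area)     # 1.0 mm, 11 W/mK
print(stock, fuji)  # 11 W/mK pad has ~3.7x lower resistance
```

This is also why thickness matters as much as the W/mK figure: a 1.5 mm pad at 11 W/mK has the same resistance as a 1.0 mm pad at about 7.3 W/mK.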


----------



## Agent Smith1984

Quote:


> Originally Posted by *0razor1*
> 
> Thanks for your replies. Well, I've hit 100C on VRM1 easily with Heaven maxed out.
> So, in general, I have a feeling this may be wise (just ideas at this point, feel free to talk me out of them):
> 
> 1. Lightly lap the heatsink at the points of pad contact, as well as at the core?
> If so, how? I've done my CPU in the past, but never a GPU heatsink.
> 2. Buy thermal pads - recommendations would be great. I'll be using ebay.com with global shipping.
> 3. Combine said efforts with fresh thermal paste as well?
> 
> Has anyone ever used thermal paste and pads together? Does that even work?
> Also, what thickness of thermal pads? Preferably black.
> Which metrics does one look for when purchasing? Are they similar to the conductivity and resistivity figures relevant to thermal paste?


If trying to improve VRM temps, I suggest some 11W/mK or even 17W/mK Fujipoly pads.

$20 or so, and you could see temps drop by 30C or more.


----------



## 0razor1

What about thickness? TY :}

http://www.ebay.com/sch/i.html?_from=R40&_sacat=0&_nkw=Fujipoly&_sop=15
pin 201301


----------



## Agent Smith1984

Quote:


> Originally Posted by *0razor1*
> 
> What about thickness? TY :}
> 
> http://www.ebay.com/sch/i.html?_from=R40&_sacat=0&_nkw=Fujipoly&_sop=15
> pin 201301


Not sure on that card exactly, but there should be some google info out there.

Probably between 1 and 1.5mm

Definitely won't be thicker than that...


----------



## 0razor1

Hey what about shims?


----------



## Offler

I usually check whether the original pads have dried out (they may start to turn to dust). I only have normal silicone-based pads to replace them with. Nothing special.

I would avoid metal shims on a GPU. I can hardly imagine they would work the same way as a piece of rubber/silicone material when it comes to the tension between the card, chip, and cooler (sorry, my English isn't good enough to explain it).


----------



## kizwan

Quote:


> Originally Posted by *0razor1*
> 
> Hey what about shims?


You only need shims if there's a gap between the GPU die & heatsink. Likewise, you only need to lap the heatsink if there is evidence of uneven pressure or poor contact. You can easily check both when you remove the cooler. For now all you need to do is re-apply TIM & new pads. Fujipoly pads are expensive; the 11W/mK ones are less so, or Phobya 7W/mK for the VRMs. You'll need to figure out the correct thickness though.


----------



## spyshagg

How do you guys like them 290X's















http://gamegpu.com/action-/-fps-/-tps/killer-instinct-test-gpu


----------



## Offler

I am fine with the R9 290X results, as long as I keep in mind that the GTX 980 Ti has 96 ROPs while the R9 290X has just 64.









Unless there is a game test showing Nvidia with 30% or more FPS at said graphics settings, there is no reason for me to even consider Nvidia.


----------



## Riktar54

Hmmm. How do I like my 290?

Let's see,,,,,

I paid $250 for my 290 (Brand new with a lifetime warranty) on Amazon.

The GTX 980 costs twice as much. The Ti? Even more.

The difference in frame rates is barely 10%. Wait, forget percentages. How many here could tell the difference between 119fps and 135fps? I would bet that if you put 5 monitors side by side and gave someone $250 if they could identify which ones were 119 and which were 135, you would have no winners. OK, if enough people tried, SOMEONE would make a lucky guess.

So,,,,,,, I paid half of what someone else did and got (virtually) indistinguishable performance from the higher priced card.

No, I don't like my 290.

I LOVE it,,,,,
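For what it's worth, the 119-vs-135fps argument is even starker in frame times, since a 1 ms per-frame difference is what you would actually have to perceive:

```python
# Convert the fps comparison above into per-frame times.

def frame_time_ms(fps):
    """Milliseconds spent on each frame at a given frame rate."""
    return 1000.0 / fps

delta = frame_time_ms(119) - frame_time_ms(135)
print(f"{delta:.2f} ms per frame")  # ~1.0 ms difference
```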


----------



## TheReciever

A 7870 beating a 680? ouch...


----------



## Offler

Quote:


> Originally Posted by *TheReciever*
> 
> A 7870 beating a 680? ouch...


A proof that AMD cards age better


----------



## Lucky 23

Quote:


> Originally Posted by *spyshagg*
> 
> How do you guys like them 290X's


I love mine











http://www.3dmark.com/fs/8189380


----------



## Vellinious

The 290Xs are really, pretty awesome cards. I hit almost 17k in Firestrike graphics score with mine, and have seen 2 x 290Xs hit 30k+. Rather impressive for such an old piece of hardware.


----------



## fyzzz

My 2x290 with a custom BIOS at 1270/1560 (card 2 mem clock 1625) can score 30727 GPU points in Firestrike: http://www.3dmark.com/fs/8187123 . I could benchmark them even higher, but I only have an 850W PSU and 2x240 radiators.


----------



## spyshagg

http://www.3dmark.com/fs/7865388

30800 with tess off (probably 29800 with tess on?) with lowly 1150/1580 clocks


----------



## mus1mus

30K is pretty easy. Even 45K for 3.

With TESS OFF of course.


----------



## mus1mus

Quote:


> Originally Posted by *fyzzz*
> 
> My 2x290 with a custom BIOS at 1270/1560 (card 2 mem clock 1625) can score 30727 GPU points in Firestrike: http://www.3dmark.com/fs/8187123 . I could benchmark them even higher, but I only have an 850W PSU and 2x240 radiators.


You are very lucky! My 1250W PSU trips OCP with 2 cards at 1300!


----------



## Vellinious

A little playing... I didn't bother pushing the CPU up to my normal benchmark clocks, and the ambient temps were a bit too high to mess around much, but it still ran pretty darn well.

The comparison between my 970 and 290X runs:
http://www.3dmark.com/compare/fs/8198130/fs/6639792


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> You are very lucky! My 1250W trips off OCP with 2 cards at 1300!


Your cards are pigs. Eating everything that gets in their way.


----------



## SpykeZ

Sooo... just scored a Sapphire 290X Tri-X OC. After all is said and done, I'm only out 75 bucks









I was flipping through eBay to see if the 290X had dropped in price or not. Saw some guy with one for 300 with a brand new cooler on it, listed with "or best offer". Those never work for me, so as a joke I put 250 down. 10 minutes later I get a message from eBay that it's mine... 2 minutes after that he had already shipped it... and I hadn't even paid the dude yet, LOL. Sent him his money instantly, and was going to put my 7950 with backplate up on here but figured I'd ding my buddy to see if he wanted to buy it; sold it for 175.

75 bucks for a 290x upgrade...I'm down with that..

So how much can I expect to OC this thing? My 7950 did 1250/1800 under water. On air it's sitting at 1100/1400.


----------



## chiknnwatrmln

Broke 19k!

http://www.3dmark.com/3dm/11665196?


----------



## battleaxe

I just heard back from Sapphire. A new 290X is on the way, apparently. Fingers crossed that it's not a turd, or it's getting sold on Fleabay.


----------



## TheReciever

Hopefully they do you a solid and get you a well rounded card


----------



## Mega Man

You act like they test these things and care. They don't: grab box, ship box, nothing more.


----------



## battleaxe

Quote:


> Originally Posted by *TheReciever*
> 
> Hopefully they do you a solid and get you a well rounded card


Quote:


> Originally Posted by *Mega Man*
> 
> you act like they test these things and care, they dont, grab box, ship box nothing more


I didn't get that from his response at all. He was just hoping the best for a fellow owner. And I'm hoping too. If it's decent I'm selling my gtx 970.


----------



## SpykeZ

Quote:


> Originally Posted by *battleaxe*
> 
> I didn't get that from his response at all. He was just hoping the best for a fellow owner. And I'm hoping too. If it's decent I'm selling my gtx 970.


good support shows they care, but actually testing the product? No. Big companies like Sapphire don't do that.


----------



## TheReciever

Quote:


> Originally Posted by *Mega Man*
> 
> you act like they test these things and care, they dont, grab box, ship box nothing more


I implied nothing of the sort. It's surprisingly irritating that it seems like every post needs a disclaimer for people who want to extrapolate naive notions that were never communicated.

They still have to pick up the box and send it to him, in that process I was simply hoping what ever they pick up is not a lemon.

Just, appalling.

Quote:


> Originally Posted by *battleaxe*
> 
> I didn't get that from his response at all. He was just hoping the best for a fellow owner. And I'm hoping too. If it's decent I'm selling my gtx 970.


Well, at least the member in question, which is all that mattered in this instance, understood it.


----------



## battleaxe

Well, I just got my replacement RMA 290X back from Sapphire. It's not a dud. That's good, but it's definitely not as good as the old one. I got it to 1200MHz/1625MHz at under +100mV, but there's no way it will hit 1280MHz like the old one did. Oh well.

Good enough to beat the 970, so I'm selling off my Giga 970 G1 Gaming. Done with Nvidia for now after the 970 fiasco. Nice to have almost all AMD cards now. AMD makes great GPUs. These 290s are still relevant even today if you ask me.


----------



## TheReciever

Sweet deal. Mine doesn't seem to hit 1200MHz even with +200mV, but either way it was a large step up from the 6970M I used to have.


----------



## SpykeZ

Quote:


> Originally Posted by *battleaxe*
> 
> Well, I just got my replacement RMA 290X back from Sapphire. It's not a dud. That's good, but it's definitely not as good as the old one. I got it to 1200MHz/1625MHz at under +100mV, but there's no way it will hit 1280MHz like the old one did. Oh well.
> 
> Good enough to beat the 970, so I'm selling off my Giga 970 G1 Gaming. Done with Nvidia for now after the 970 fiasco. Nice to have almost all AMD cards now. AMD makes great GPUs. These 290s are still relevant even today if you ask me.


I really wish AMD could be as strong with their CPUs as they are with GPUs.

I've switched back and forth, depending on which generation was better when I got my card. Started off on a Radeon 7000 back in the day, went to a 9250, to an X800 GTO, then went Nvidia with the 7900GS KO, to the amazing 8800GT, back to AMD with the 5850 the week they were released, to a 7950, to the 290X (that I got for like 75 bucks all said and done).

I almost got the 970 but I just can't stomach supporting Nvidia and their ******ed Gameworks scam.


----------



## battleaxe

Quote:


> Originally Posted by *SpykeZ*
> 
> I really wish AMD could be as strong with their CPU's as they are GPU.
> 
> I've switched back and forth, depending on which generation was better when I got my card. Started off on a Radeon 7000 back in the day, went to a 9250, to an X800 GTO, then went Nvidia with the 7900GS KO, to the amazing 8800GT, back to AMD with the 5850 the week they were released, to a 7950, to the 290X (that I got for like 75 bucks all said and done).
> 
> I almost got the 970 but I just can't stomach supporting Nvidia and their ******ed Gameworks scam.


The 970 is a piece of garbage. 4GB... yeah, whatever.

They're getting beaten by the old 290s now. What a joke.

Yeah, I'm still pissed my old card died. I've had some luck lately: had a 290X die and a 390X die, and both were my best cards. My 390X was truly unusual, as it would hit 1280MHz/1750MHz and I've not heard of any that could match it. Now it's dead. Such a suckfest. But at least XFX replaced it, can't complain about that. I'll have Xfire back up and running soon.

This 290x RMA is going in the wife's PC. The 970 is getting offed to some Nvidia fanboy.


----------



## battleaxe

Quote:


> Originally Posted by *TheReciever*
> 
> Sweet deal, mine doesn't seem to hit 1200mhz even with +200mv but either way it was a large step up from the 6970m I used to have


Sold my friend a 6970; that was a nice card too. All good stuff really. AMD has really been hitting it with the drivers lately, haven't they?


----------



## asciii

_Removed._


----------



## mus1mus

Quote:


> Originally Posted by *SpykeZ*
> 
> *I really wish AMD could be as strong with their CPU's as they are GPU.*
> 
> I've switched back and forth, depending on which generation was better when I got my card. Started off on a Radeon 7000 back in the day, went to a 9250, to an X800 GTO, then went Nvidia with the 7900GS KO, to the amazing 8800GT, back to AMD with the 5850 the week they were released, to a 7950, to the 290X (that I got for like 75 bucks all said and done).
> 
> I almost got the 970 but I just can't stomach supporting Nvidia and their ******ed Gameworks scam.


Z-E_N.


----------



## battleaxe

Quote:


> Originally Posted by *asciii*
> 
> Amazing isn't it?
> 
> Comparing benchmarks today with benchmarks a few years ago you can see how the drivers have improved the R9 290(x).
> Will be interesting to see the R9 290(x) against the newer Nvidia and AMD cards later this year.


Yes. Right now it is looking like the 290X/390X might be somewhat competitive with the 980 Ti on DX12. I'm not optimistic about that staying that way, but I bet Nvidia is scrambling right now and trying to buy off whoever is in control of DX12. Money buys anything these days, and Nvidia will stop at nothing, I have no doubt about that.

As long as the general idiot customer thinks Nvidia is the best, Nvidia will be #1, at least in the minds of those who purchase them.

The rest of us are buying 290's and 390's like our money actually means something to us.

Especially after that stupid 970 fiasco.

Only my opinion of course, and I'm not bitter...

well... maybe a little.

Worst $400.00 I ever spent in my life (on a 970).

Edit: no wait. My ex-wife's engagement ring and every penny I ever spent on her was the worst money ever spent. And that was far more than I care to share (BMWs included).


----------



## battleaxe

My apologies if I offended anyone. I drank too much wine tonight.


----------



## Vellinious

I had a couple of 970s that ran very well..... They were great cards. But, like the 290 / 290X....ya just had to make some changes to get the most out of them.


----------



## TheReciever

Quote:


> Originally Posted by *battleaxe*
> 
> Sold my friend a 6970; that was a nice card too. All good stuff really. AMD has really been hitting it with the drivers lately, haven't they?


Ah, no, I had a 6970M, which is more like a 6850, so quite a bit more noticeable gains lol


----------



## SpykeZ

Quote:


> Originally Posted by *mus1mus*
> 
> Z-E_N.


b-u-l-l-d-o-z-e-r
p-i-l-e-d-r-i-v-e-r

I've been with AMD from the K6 to the 939 X2 4400+, to the 965, to the 8350.

AMD's been playing catch-up ever since AM2. I know how their CPUs go: promise, promise, promise, release, no promises kept.

Anywho

that 970 fiasco: did that 3.5GB of memory really make that much of a difference in performance? My cousin bought one not knowing about it, but he plays at 1080p/144Hz and everything he plays runs flawlessly; even Ark runs amazingly.


----------



## mus1mus

Quote:


> Originally Posted by *SpykeZ*
> 
> b-u-l-l-d-o-z-e-r
> p-i-l-e-d-r-i-v-e-r
> 
> I've been with AMD from the K6 to the 939 X2 4400+, to the 965, to the 8350.
> 
> *AMD's been playing catch-up ever since AM2. I know how their CPUs go: promise, promise, promise, release, no promises kept.*
> Anywho
> 
> that 970 fiasco. Did that 3.5GB of memory really make that much of a difference in performance? My cousin bought one not knowing about it but he plays 1080p with 144hz and everything he plays just runs flawless, even Ark runs amazing.


I'm not saying Zen will be the game-changer. But at the very least, if they can put up an adequate fight against Intel, I'm all in.


----------



## Mega Man

Quote:


> Originally Posted by *battleaxe*
> 
> Well, I just got my replacement RMA 290X back from Sapphire. It's not a dud. That's good, but it's definitely not as good as the old one. I got it to 1200MHz/1625MHz at under +100mV, but there's no way it will hit 1280MHz like the old one did. Oh well.
> 
> Good enough to beat the 970, so I'm selling off my Giga 970 G1 Gaming. Done with Nvidia for now after the 970 fiasco. Nice to have almost all AMD cards now. AMD makes great GPUs. These 290s are still relevant even today if you ask me.


AMD has been solid for quite some time; people have just been brainwashed to think Nvidia is better... my 7970s are still going strong...
Quote:


> Originally Posted by *SpykeZ*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Z-E_N.
> 
> 
> 
> b-u-l-l-d-o-z-e-r
> p-i-l-e-d-r-i-v-e-r
> 
> I've been with AMD from the K6 to the 939 X2 4400+, to the 965, to the 8350.
> 
> AMD's been playing catch-up ever since AM2. I know how their CPUs go: promise, promise, promise, release, no promises kept.
> 
> Anywho
> 
> that 970 fiasco. Did that 3.5GB of memory really make that much of a difference in performance? My cousin bought one not knowing about it but he plays 1080p with 144hz and everything he plays just runs flawless, even Ark runs amazing.
Click to expand...

Apparently you missed the part where AMD STATED they will not try to compete with Intel on performance, but are heading in a different direction.

I think it is funny people are mad that AMD "does not compete" when they said they wouldn't. They didn't lie to you. As for me, I love my FXs and I would take them over any Intel for day-to-day tasks.


----------



## rdr09

Quote:


> Originally Posted by *SpykeZ*
> 
> b-u-l-l-d-o-z-e-r
> p-i-l-e-d-r-i-v-e-r
> 
> I've been with AMD from the K6 to the 939 X2 4400+, to the 965, to the 8350.
> 
> AMD's been playing catch-up ever since AM2. I know how their CPUs go: promise, promise, promise, release, no promises kept.
> 
> Anywho
> 
> that 970 fiasco. Did that 3.5GB of memory really make that much of a difference in performance? My cousin bought one not knowing about it but he plays 1080p with 144hz and everything he plays just runs flawless, even Ark runs amazing.


IIRC (and I think I do), the 970 was marketed for 1080p, which tells me that Nvidia knew about the limitation but skipped mentioning it, of course. Users realized the cards are very fast and tried higher resolutions, and then they soon discovered what was up.


----------



## SpykeZ

Quote:


> Originally Posted by *rdr09*
> 
> IIRC (and I think I do), the 970 was marketed for 1080p, which tells me that Nvidia knew about the limitation but skipped mentioning it, of course. Users realized the cards are very fast and tried higher resolutions, and then they soon discovered what was up.


I don't get how they even got away with that; the fact that all 970s are still labeled as 4GB should be considered a scam.


----------



## TheReciever

Quote:


> Originally Posted by *SpykeZ*
> 
> I don't get how they even got away with that, the fact all 970's are still labeled as 4GB should be considered a scam.


4GB is present in the card, just not of a uniform speed.


----------



## Vellinious

Quote:


> Originally Posted by *SpykeZ*
> 
> I don't get how they even got away with that, the fact all 970's are still labeled as 4GB should be considered a scam.


Uh, they get away with it because they do, in fact, have 4GB.


----------



## Vellinious

I was messing around a bit with my overclocks last night and found that my card doesn't start black-screening as early as it used to. It had been at 1.461v that I started seeing them; now it doesn't really happen until 1.477v. With vdroop, at 1.461v set the voltage was dropping to 1.305v under load; at 1.477v it drops to 1.328v. It gave me some extra core clock to play with.

Question is....what would allow that to happen?


----------



## spyshagg

In my experience, every time I flash a BIOS (even the same BIOS), the black screens and blackouts happen at different voltages.


----------



## Vellinious

Quote:


> Originally Posted by *spyshagg*
> 
> In my experience, Every time I flash a bios (even the same bios), the blackscreens and blackouts happen at different voltages


I haven't changed bios versions in over a month... /shrug


----------



## chiknnwatrmln

You might be running too high a voltage. One of my cards is not that great, and if I increase the voltage too much it black-screens. The highest I can get it stable is 1200MHz.


----------



## Vellinious

I know it's voltage that causes the black screens. What I'm wondering is why it would black-screen last week at 1.461v, and then this week doesn't black-screen until 1.477v. Not that I'm complaining... none of the other variables in my system have changed: same driver version, ambient temps within 1C, coolant temps within 2C. I just wonder what prompted the change. Break-in time? lol
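For what it's worth, the droop in both cases works out to roughly the same fraction of the set voltage. A quick sketch of that arithmetic (the `vdroop_pct` helper is just illustrative, not any tool's API):

```python
# Quick check of the vdroop figures quoted above: the voltage set in
# software vs. the voltage actually observed under load.

def vdroop_pct(v_set, v_load):
    """Voltage droop as a percentage of the set voltage."""
    return (v_set - v_load) / v_set * 100

print(round(vdroop_pct(1.461, 1.305), 1))  # 10.7 -> ~10.7% droop at the old limit
print(round(vdroop_pct(1.477, 1.328), 1))  # 10.1 -> ~10.1% at the new one
```

So the droop itself is proportional and essentially unchanged; the shift is in where the black screens start, not in how the voltage sags.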


----------



## alancsalt

Isn't it down to how much you are drawing at the wall (and hitting the limit), therefore the load rather than the particular vcore? Maybe the power feed to your house isn't quite consistent? I had dedicated higher-amperage breakers and power points put in to avoid this, and ran dual PSUs (for triple GPU).


----------



## chiknnwatrmln

Because it's not a stable overclock. If it black-screens at that voltage then you can't run that high.

1.46v is ridiculously high anyway.


----------



## mus1mus

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Because it's not a stable overclock. If it blackscreens at that voltage then you can't run that high.
> 
> *1.46v is ridiculously high* anyways.


----------



## Vellinious

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Because it's not a stable overclock. If it blackscreens at that voltage then you can't run that high.
> 
> 1.46v is ridiculously high anyways.


I'm gonna go out on a limb and say that you don't overclock much......


----------



## dagget3450

Quote:


> Originally Posted by *Vellinious*
> 
> Uh, they get away with it because they do, in fact, have 4GB.


There is another part of the GTX 970 fiasco that was hilarious. People were like "oh no, well, I should upgrade to a GTX 980 now." So they bought a 980 in place of the 970 and rewarded Nvidia for being shady.


----------



## Vellinious

People who are too dumb to realize that the 980 isn't worth the upgrade, maybe.


----------



## dagget3450

Quote:


> Originally Posted by *Vellinious*
> 
> People who are too dumb to realize that the 980 isn't worth the upgrade, maybe.


Yep. Just a few posts I pulled; each was either someone who got a 980 to replace a 970, or was considering it as the ONLY option. There are many more, but I didn't want to waste any more of my time.


Spoiler: Warning: Spoiler!



http://www.overclock.net/t/1535502/gtx-970s-can-only-use-3-5gb-of-4gb-vram-issue/2600#post_23671211
http://www.overclock.net/t/1535502/gtx-970s-can-only-use-3-5gb-of-4gb-vram-issue/2250#post_23584480
http://www.overclock.net/t/1535502/gtx-970s-can-only-use-3-5gb-of-4gb-vram-issue/2220#post_23565009
http://www.overclock.net/t/1535502/gtx-970s-can-only-use-3-5gb-of-4gb-vram-issue/2160#post_23545423
http://www.overclock.net/t/1535502/gtx-970s-can-only-use-3-5gb-of-4gb-vram-issue/2860#post_23857336
http://www.overclock.net/t/1535502/gtx-970s-can-only-use-3-5gb-of-4gb-vram-issue/2860#post_23855087
http://www.overclock.net/t/1535502/gtx-970s-can-only-use-3-5gb-of-4gb-vram-issue/2680#post_23704623
http://www.overclock.net/t/1535502/gtx-970s-can-only-use-3-5gb-of-4gb-vram-issue/370#post_23451444
http://www.overclock.net/t/1535502/gtx-970s-can-only-use-3-5gb-of-4gb-vram-issue/370#post_23451377
http://www.overclock.net/t/1535502/gtx-970s-can-only-use-3-5gb-of-4gb-vram-issue/280#post_23447248
http://www.overclock.net/t/1513920/official-nvidia-gtx-980-owners-club/8650#post_24448981



There are many, many out there that only consider Nvidia even when they are taken like that. For me personally, I'd be using Nvidia right now if AMD had done this. I used to be a hardcore Nvidia fanboy; I started waking up after the SLI certification scandal.


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Vellinious*
> 
> I'm gonna go out on a limb and say that you don't overclock much......


http://www.3dmark.com/3dm/11665196?


----------



## Vellinious

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> http://www.3dmark.com/3dm/11665196?


I would have expected more.....

http://www.3dmark.com/compare/fs/8207585/fs/6368664#


----------



## questgraves

So I'm new.
My card is an Asus DCU II, currently on custom water. My GPU-Z validation: http://www.techpowerup.com/gpuz/details/24q6y and here is my snapshot:

I've had the Asus DirectCU II 290 for about 6 months. I recently did a Skylake 6700K build and OC'd it to 4.6 on custom water. I was lucky enough to get a full-cover block from EK for my DCU II card, and was very bummed that I couldn't find any new 290s out there for sale, as I need a second one to balance out my system. So if any of you have one you want to offload, even with a block on it, even better! I don't care if it's been OC'd, so long as the BIOS hasn't been messed with to get crazy high OCs with LN2 or something. This will be my daily driver for work and play for the next couple of years, until the HBM2 cards are out and affordable. Please PM me if you do, or even if you just want to chat :thumb:

GPU-Z_290_Snapshot.png 55k .png file


20160409_180224-1.jpg 3571k .jpg file


----------



## chiknnwatrmln

Quote:


> Originally Posted by *Vellinious*
> 
> I would have expected more.....
> 
> http://www.3dmark.com/compare/fs/8207585/fs/6368664#


6-core vs 4-core... you'll win every time.

Our graphics scores are just about even, though. I am limited by my UPS: it outputs a max of 865 watts and my PC draws 850 with that overclock.

Also, not that it makes much of a difference, but one of my cards is a 290, not a 290X.


----------



## Vellinious

I only look at graphics scores when comparing GPUs. From what I've seen of the 2xx series cards, they should be markedly higher than 970s. /shrug


----------



## chiknnwatrmln

Yeah, my cards and CPU are not particularly great overclockers. If memory serves me correctly, my cards are at a little over 1.3v; any higher and I get black screens. My CPU was at like 1.47v for 4.8GHz, and I didn't really want to run any higher.

My 290's been going strong since November 2013, so I can't really complain. I'm waiting for Broadwell-E and the next-gen video cards to upgrade. I can't wait to have an M.2 drive.


----------



## Vellinious

I just bought the Samsung 950 PRO....I love it.


----------



## Offler

So, I re-tested more stored OC profiles for my graphics card. Each of them is now artifacting. Judging by the "checkered" pattern, it's certain that in the affected zones every second CU is failing, so it's definitely an issue with the core...

I don't get it... Either the load on the shader units has increased a lot between 14.9 and the current Crimson 16.3, or the cooling is failing badly, or there's a power problem.


----------



## gupsterg

@Vellinious

A 0.95V rail volt mod would help, I reckon, with your card/situation. This mod helps not only Hawaii/Grenada but Fiji as well. The only Hawaii card that has this rail software-moddable out of the box is the Matrix 290X; all other cards need a hard mod. Several regular overclockers have done this, plus all the pro overclockers do it.


----------



## Vellinious

Quote:


> Originally Posted by *gupsterg*
> 
> @Vellinious
> 
> 0.95V rail volt mod would help I reckon with your card/situation. This mod not only helps hawaii/grenada but fiji as well. Only hawaii card that out of the box has this rail moddable by software is Matrix 290X, all other cards hard mod needed. Several usual overclockers have done this plus all the pro overclockers do it.


Yeah, I've been holding off on hardware mods until the new cards actually release. I really don't want to get stuck buying a GPU right now. lol


----------



## cokker

Just re-pasted my 290X with some MX-4; it dropped a solid 15C under load.

To MSI's credit the original paste job wasn't too bad: none of the paste was on the little capacitors, but there was a thin dried-paste sandwich between the core and heatsink.


----------



## mus1mus

Quote:


> Originally Posted by *cokker*
> 
> Just re-pasted my 290x with some MX-4, dropped a solid 15c under load.
> 
> To MSI's credit the original paste job wasn't too bad, none of the paste was on the little capacitor things but there was a minor dried paste sandwich between the core and heatsink.


It'd either be crap TIM, too-old paste, or a dried-up contact. A 15C drop is too good to be true without any of those conditions.


----------



## mfknjadagr8

Quote:


> Originally Posted by *mus1mus*
> 
> It'd either be crap TIM, too old paste, or dried up contact. 15C drop is too good to be true in without any of the mentioned conditions.


He said dried paste sandwich... implying clay-like TIM. I got nearly a 21C drop with mine on the stock cooler: went from 95+ under load to 75C under stress testing. It was so hard it looked like dried mortar.


----------



## gupsterg

Quote:


> Originally Posted by *Vellinious*
> 
> Yeah, I've been holding off on hardware mods until the new cards actually release. I really don't want to get stuck buying a GPU right now. lol


Yeah, I lack the ability to do hard mods and would probably end up making a right mess / killing the card.


----------



## Vellinious

Quote:


> Originally Posted by *gupsterg*
> 
> Yeah, I lack the ability to do hard mods and would probably end up making a right mess / killing the card.


I've never done a hardware mod. I've soldered quite a bit, but... still. I'd rather wait to do something like that, so that just in case I DO kill it, I can just buy the new card I was going to upgrade to anyway and be happy. lol

It's increasingly looking like the motherboard/processor will come first, though... Broadwell-E is calling me already.


----------



## questgraves

delete?


----------



## questgraves

Quote:


> Originally Posted by *Vellinious*
> 
> I just bought the Samsung 950 PRO....I love it.


Nice! Other than the obvious, such as boot times (I'm sure you are coming from a normal SSD), what other noticeable gains do you see? One thing that really bummed me out: I play a lot of MMORPGs with my retired mom and wife, so there aren't level loads very often, and when there are I don't have time to read the tips they show because the level or dungeon loads so fast. However, I recently got Fallout 4, and even with an 850 EVO it seems to take forever to load levels... I'm considering the 950 Pro because I can with the new board, but how quickly will depend on the return on investment, I guess...


----------



## crystaal

Quote:


> Originally Posted by *cokker*
> 
> Just re-pasted my 290x with some MX-4, dropped a solid 15c under load.
> 
> To MSI's credit the original paste job wasn't too bad, none of the paste was on the little capacitor things but there was a minor dried paste sandwich between the core and heatsink.


Btw, how do you guys clean out the paste that gets smeared in between those tiny little capacitors?


----------



## TTheuns

Quote:


> Originally Posted by *crystaal*
> 
> Btw how do you guys clean out the paste that got smeared inbetween those tiny little capacitors?


Alcohol and a Q-tip?


----------



## mus1mus

Quote:


> Originally Posted by *TTheuns*
> 
> Alcohol and a Q-tip?


Too scared to use a toothbrush eh?


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> Too scared to use a toothbrush eh?


I always use a toothbrush. Is that bad?


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Too scared to use a toothbrush eh?
> 
> 
> 
> I always use a toothbrush. Is that bad?
Click to expand...

I use all sorts of things to clean the die and the surrounding components, and a toothbrush is the best, esp. when cleaning TIM-infested SMDs.

I even use water.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> I use all sorts of things to clean the die and the surrounding components, and a toothbrush is the best, esp. when cleaning TIM-infested SMDs.
> 
> I even use water.


A mixture of gas and acetone is my personal favorite. Lol... Kidding.


----------



## mus1mus

Quote:


> Originally Posted by *battleaxe*
> 
> A mixture of gas and acetone is my personal favorite. Lol... Kidding.


Oh hell no, you'll damage the block and its innards!


----------



## dagget3450

Is grizzly paste conductive?


----------



## mfknjadagr8

Quote:


> Originally Posted by *battleaxe*
> 
> A mixture of gas and acetone is my personal favorite. Lol... Kidding.


I mix bleach and ammonia, then I don't have to worry about the paste... (please, no one be stupid and try this; it was a joke and it will likely kill you)

I use 92 percent rubbing alcohol and all sorts of swabs and stuff to get it where I need it. I used to use denatured alcohol, but it will remove adhesives... which can be very bad lol


----------



## battleaxe

Quote:


> Originally Posted by *mfknjadagr8*
> 
> I mix bleach and ammonia then I don't have to worry about the paste....(please no one be stupid and try this it was a joke and it will likely kill you)
> 
> I use 92 percent rubbing alcohol and all sorts of swabs and stuffs to get it where I need it...I used to use denatured but it will remove adhesives...which can be very bad lol


You don't have to worry about the paste or your face... cause they're both melted off.


----------



## crystaal

Quote:


> Originally Posted by *TTheuns*
> 
> Alcohol and a Q-tip?


Qtips tend to leave behind tiny strands that get caught on the sharp edges of the tiny capacitors.

Thanks for the toothbrush suggestions, I'll look for one with soft bristles.


----------



## SovereigN7

What should I be looking at with an overclock on a 290x tri-x with default voltage on the core and memory?


----------



## mus1mus

Quote:


> Originally Posted by *crystaal*
> 
> Qtips tend to leave behind tiny strands that get caught on the sharp edges of the tiny capacitors.
> 
> Thanks for the toothbrush suggestions, I'll look for one with soft bristles.


It doesn't need to be soft-bristled. I looked for the stiff ones when I cleaned the die last week; dried TIM will be hard for softies.

There are no SMD components there that are prone to being brushed off.


----------



## boot318

Quote:


> Originally Posted by *dagget3450*
> 
> Is grizzly paste conductive?


Thermal Grizzly Conductonaut is conductive. Thermal Grizzly Hydronaut, Aeronaut and Kryonaut are not conductive.

http://www.thermal-grizzly.com/en/products


----------



## dagget3450

Quote:


> Originally Posted by *boot318*
> 
> Thermal Grizzly Conductonaut is conductive. Thermal Grizzly Hydronaut, Aeronaut and Kryonaut are not conductive.
> 
> http://www.thermal-grizzly.com/en/products


+rep good sir, I wasn't aware they made that many products.

I am using Kryonaut.


----------



## battleaxe

Quote:


> Originally Posted by *mus1mus*
> 
> Doesn't need to be soft bristled. I looked for the stiff ones when I did clean the die last week. Dried TIM will be hard for softies.
> 
> There are no SMD components prone to being brushed off there.


I just used a stiff one today to clean before installing my block on the new 390X. She booted and ran at 1100 fine, so I went ahead and cleaned the loop out while I was at it. Stiff toothbrush all the way.


----------



## SpykeZ

So, I got a 290X super cheap, as I said earlier, and I just discovered AMD added resolution scaling to the drivers, so I can play games at 1440p. The 290X is still a beefy card, and since they've become rather cheap I was kinda thinking about throwing another one in. Not really interested in the new-generation stuff, as I doubt it'll be THAT much better.

Anywho, not looking to debate, so don't even bother. I'm curious about power supplies, as I've never dicked around with CrossFire before.

http://www.newegg.com/Product/Product.aspx?Item=N82E16817151087

I've got that PSU in my system ATM.

The CPU is currently running 4.5GHz @ 1.272v
GPU is 1100/1400 +50mV (dunno which voltage matters in GPU-Z)

Is 750 watts going to be enough to handle another 290X, both running stock speeds? (The extra OC really doesn't do much.)


----------



## mus1mus

Two cards will serve you well for quite a while.

Open GPU-Z monitoring and note the VDDC value. That is the card's core voltage.

750W may not be enough for two cards, but since you want to run stock, you can try it anyway. 1000W is better; I have seen FPS improvements from changing just the PSU.

Power consumption will vary depending on your card's ASIC quality. I have two 290X cards with 76% ASIC and they can run OC'd to 1300/1625 on a 1250W unit. If I pull one of them and run my 290 (86% ASIC) alongside instead, I will trip my PSU's OCP when running the Firestrike Combined test; I have to limit my voltage to +120mV and/or my clocks to 1235/1625.
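To put rough numbers on the headroom question, here's a back-of-the-envelope check. The wattage figures and the `estimated_draw` helper below are illustrative assumptions, not measurements of any particular card or chip:

```python
# Rough PSU headroom check for a CrossFire build.
# All wattage figures are assumed for illustration only.

def estimated_draw(gpu_watts, gpu_count, cpu_watts, rest_watts=75):
    """Sum worst-case component draw in watts (board, drives, fans lumped into rest_watts)."""
    return gpu_watts * gpu_count + cpu_watts + rest_watts

# Assumed figures: ~300 W per 290X at stock, ~150 W for an overclocked quad-core.
load = estimated_draw(gpu_watts=300, gpu_count=2, cpu_watts=150)
print(load)  # 825 -- uncomfortably close to a 750 W unit's limit

# With extra voltage, Hawaii can pull far more per card:
oc_load = estimated_draw(gpu_watts=400, gpu_count=2, cpu_watts=175)
print(oc_load)  # 1050 -- why a 1000 W+ PSU is the safer choice for OC'd CrossFire
```

The exact numbers depend heavily on ASIC quality and voltage, as noted above, but the arithmetic shows why stock clocks on a 750W unit are borderline while overclocked CrossFire really wants a bigger PSU.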


----------



## Vellinious

Sure it's not the over-current protection on the ASUS motherboard? Mine was doing that... I thought at first it was the PSU, but turning off that feature in the BIOS fixed the issue.


----------



## mus1mus

Pretty sure it wasn't the mobo. I can stress the CPU heavier, and I have pushed it more than you can imagine.

Just to give you a clue, the PSU breathes out 40ish C air when I am feeding it forced 12C intake air.


----------



## chiknnwatrmln

750 watts is more than enough at stock voltages. If you do OC, be careful with the voltage, because these cards start sucking power pretty easily.


----------



## diggiddi

Quote:


> Originally Posted by *SpykeZ*
> 
> So, I got a 290x super cheap as I said earlier, I just discovered AMD added resolution scaling to the drivers so I can play 1440p in games. 290x are still a beefy card but since they're become rather cheap i was kinda thinking about throwing another one in. Not really interested in the new generation stuff as I doubt it'll be THAT much better.
> 
> Anywho, not looking to debate so don't even bother. I'm curious about power supplies as I've never dicked around with crossfire before.
> 
> http://www.newegg.com/Product/Product.aspx?Item=N82E16817151087
> 
> I've got that PSU in my system ATM.
> 
> The CPU is currently running 4.5GHz @ 1.272v
> GPU is 1100/1400 +50mV (dunno which voltage matters in GPU-z)
> 
> 750 watts going to be enough to handle another 290x both running stock speeds? (the extra OC really doesn't do much.)


Upgrade if you can, but if that's not possible, keep the cards at stock and reduce the core voltage.
If you must OC, then minimal memory OCing is recommended.


----------



## spyshagg

Undervolt them a little bit. They can handle it, and the power consumption will drop like a rock.
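The reason a small undervolt pays off so well: to a first approximation, dynamic power scales with frequency times voltage squared. A minimal sketch under that assumption (the `relative_power` helper is hypothetical, and the model ignores leakage and board losses):

```python
# Why a small undervolt drops power "like a rock": dynamic power scales
# roughly with frequency * voltage^2. First-order approximation only.

def relative_power(v_new, v_old, f_new=1.0, f_old=1.0):
    """Power of the new operating point relative to the old one."""
    return (f_new / f_old) * (v_new / v_old) ** 2

# Same clocks, core voltage dropped from 1.20 V to 1.10 V:
print(round(relative_power(1.10, 1.20), 3))  # 0.84 -> roughly a 16% cut
```

Because of the squared term, even a 0.1V undervolt at unchanged clocks yields a double-digit percentage drop in GPU power draw, which is why undervolted CrossFire setups fit so much more comfortably on a given PSU.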


----------



## melodystyle2003

Hello guys,

The Asus DCII does not have a heatsink on VRM1. Would it be possible to tell me the size of the gap between the DCII cooler and VRM1/the VRAM? I searched on the internet and couldn't find any relevant information. TIA


----------



## SpykeZ

Quote:


> Originally Posted by *spyshagg*
> 
> undervolt them a little bit. They can handle it, and the power consumption will drop like a rock.


...I never knew you could undervolt a GPU :O How do you do that?


----------



## Vellinious

Move the voltage slider to the left


----------



## boot318

Quote:


> Originally Posted by *melodystyle2003*
> 
> Hello guys,
> 
> asus dcii does not have vrm1 heatsink. Would be possible to tell me what is the gap between the dcii cooler and the vrm1/vrams? I searched on the internet and couldnt't find any relative information. TIA


VRM1 does have a heatsink on it. It covers the GDDR and GPU VRMs.



VRM2 rarely gets above 60C.



I don't think you can use memory heatsinks on the DC2. You probably can cover some of them with the heatsink on, but the gap is going to be incredibly small to fit normal memory heatsinks. Maybe something like this could fit on the memory modules with the cooler on. I'm not even sure how effective they would be on memory.


----------



## SpykeZ

Quote:


> Originally Posted by *Vellinious*
> 
> Move the voltage slider to the left


......HA, never realized the thing moves left


----------



## melodystyle2003

Quote:


> Originally Posted by *boot318*
> 
> VRM1 does have a heatsink on it. It covers the GDDR and GPU VRMs.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> VRM2 rarely gets above 60C.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> I don't think you can use memory heatsinks on the DC2. You probably can cover some of them with the heatsink on, but the gap is going to be incredibly small to fit normal memory heatsinks. Maybe something like this could fit on the memory modules with the cooler on. I'm not even sure how effective they would be on memory.


VRM2 has that long black heatsink; VRM1 is not using any (you confused VRM1's position with VRM2's). So according to your statement, VRM1 (VRM2 according to you) only goes up to 60°C? I've read on various review sites that it goes up to 110°C while overclocked, and varies between 80-95°C at stock frequencies. Do you own an Asus R9 290 DCII, sir?
Edit: no matter the naming, I would like to know the space between VRM1 (the one without a heatsink) and the DCII cooler.


----------



## boot318

Quote:


> Originally Posted by *melodystyle2003*
> 
> VRM2 has that long black heatsink, VRM1 is not using any (you confused vrm1 positioning with vrm2's). So according to your state, VRM1 (vrm2 according to you) is going up to 60oC only? I ve read on various review sites that it goes up to 110oC while overclocked, whilst varies between 80-95oC on stock frequencies. Do you own an asus r9 290 dcii sir?
> Edit: no matter the name, i would like to know the space between vrm1 (the one without heasink) and dcii cooler.


I wanted to do a quick 25 minute run of GTA5 to show you.


The VRM with the heatsink is VRM1. GPU-Z even shows it.

I have that exact card. I'm using a CLC on my card with a G10, so my temps will be way better than people using the DCU2 cooler, plus I changed the thermal pad on the VRM heatsink. The stock cooler is probably heating the whole card up to make it 85°C+. I have no experience with the stock cooler since I took it off the second I got my card. All I know is I've never seen VRM1 (heatsink) over 75°C or VRM2 (non-heatsink) over 60°C.
Quote:


> i would like to know the space between vrm1 (the one without heasink) and dcii cooler.


I don't know the answer. I know with a CLC that VRM doesn't get that hot.

EDIT: After looking at the heatsink there is a small cutout on that VRM section. I'm not sure how big it is but you could fit something right there.


----------



## melodystyle2003

Quote:


> Originally Posted by *boot318*
> 
> EDIT: After looking at the heatsink there is a small cutout on that VRM section. I'm not sure how big it is but you could fit something right there.
> 
> 
> Spoiler: Warning: Spoiler!


On the same horizontal line it also has a cutout over the power connectors. Does it have the same depth?


Spoiler: Warning: Spoiler!


----------



## boot318

Quote:


> Originally Posted by *melodystyle2003*
> 
> On the same horizontal line it has also a cutout over power connectors. Does it have the same depth?
> 
> 
> Spoiler: Warning: Spoiler!


They are different.


Spoiler: Warning: Spoiler!








http://www.hardocp.com/article/2014/01/13/asus_r9_290x_directcu_ii_oc_overclocking_review/6
http://www.legitreviews.com/asus-r9-290x-directcu-ii-sapphire-r9-290x-tri-x-video-card-reviews_132158/11

I'm not sure if we are on the same page. Look at the reviews for the temps on the VRMs. The higher temps (VRM 1 in GPU-Z) are for the heatsinked VRM. It gets way hotter than the unsinked one.


----------



## melodystyle2003

Quote:


> Originally Posted by *boot318*
> 
> They are different.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.hardocp.com/article/2014/01/13/asus_r9_290x_directcu_ii_oc_overclocking_review/6
> http://www.legitreviews.com/asus-r9-290x-directcu-ii-sapphire-r9-290x-tri-x-video-card-reviews_132158/11
> 
> I'm not sure if we are on the same page. Look at the reviews for the temps on the VRMs. The higher temps for the VRMs (VRM 1 in GPU-Z) is for the heatsink VRM. It gets way hotter than the unsinked one.


You are correct, rep+. So I guess higher-airflow fans, e.g. 2x120mm, will do the trick. We will see.


----------



## TheLAWNOOB

Should I get another R9 290 and put it in CF or get the Polaris card (probably 450CAD)?

Polaris will probably give me a 20-30% boost depending on how it overclocks, or I can get a used 290 for about 300CAD?

I can swap my 290 for a 780 and sell the 780 for about 300CAD, if I go with Polaris.

I need more than 40fps to get FreeSync working. I already tried lowering the FreeSync range to 35, and it makes me reinstall the driver every day I boot up.


----------



## diggiddi

At 4K, CrossFire for sure.


----------



## fyzzz

Well, I think my Hynix BFR 290 just died. It was at idle for a while, and when I tried to start Fallout 4 I got an instant RSOD and then I couldn't get into Windows anymore.


----------



## battleaxe

Quote:


> Originally Posted by *fyzzz*
> 
> Well i think my hynix bfr 290 just died. It was at idle for a while and when i tried to start fallout 4, i got instant rsod and then i couldn't get into windows anymore.


Unplug the power and try again. Sometimes you have to sever all power to the GPU's and Motherboard/CPU before she will respond.


----------



## fyzzz

Quote:


> Originally Posted by *battleaxe*
> 
> Unplug the power and try again. Sometimes you have to sever all power to the GPU's and Motherboard/CPU before she will respond.


I have tried different cables, unplugging the power, etc.; it doesn't work. I get a display and it works without the driver, but as soon as the driver installs, everything freezes. My two other 290s work just fine.


----------



## chiknnwatrmln

I think I just lost my 290x...

Was installing monitor mounts and for some reason I smelled something electrical burning (god I've grown to hate that smell... every time it means more work and less money) but all my components and PC were off...

Can boot into safe mode fine with both GPUs, can't boot normally with both.

Can boot with just 1st GPU (290x) but is all artifacty.

Can boot just fine with 2nd GPU (290).

Edit: well the old uninstall-and-reinstall drivers worked, 'cept I had to do it twice.

Who knows.


----------



## battleaxe

Quote:


> Originally Posted by *fyzzz*
> 
> I have tried different cables, unplugging the power etc, doesn't work. I get display and it works without driver, but as soon as the driver installs, everything freezes. My two other 290's works just fine.


Quote:


> Originally Posted by *chiknnwatrmln*
> 
> I think I just lost my 290x...
> 
> Was installing monitor mounts and for some reason I smelled something electrical burning (god I've grown to hate that smell... every time it means more work and less money) but all my components and PC were off...
> 
> Can boot into safe mode fine with both GPUs, can't boot normally with both.
> 
> Can boot with just 1st GPU (290x) but is all artifacty.
> 
> Can boot just fine with 2nd GPU (290).
> 
> Edit: well the old uninstall-and-reinstall drivers worked, 'cept I had to do it twice.
> 
> Who knows.


Man guys... I just had to replace my 290x also. Really sucks, I feel your pain. I lost a great card to get a so-so card back. Still does over 1200mhz but just barely. Old one hit almost 1300mhz. So much suck.


----------



## fyzzz

Strange that so many 290(X)'s have died lately. My second 290 that I'm using now is behaving strangely too...


----------



## gupsterg

Quote:


> Originally Posted by *fyzzz*
> 
> Well i think my hynix bfr 290 just died. It was at idle for a while and when i tried to start fallout 4, i got instant rsod and then i couldn't get into windows anymore.


OMG, sorry to read that mate.


----------



## SpykeZ

Quote:


> Originally Posted by *gupsterg*
> 
> OMG, sorry to read that mate.


----------



## mus1mus

@fyzzz

What do you get when you do an atiflash -i ?

Could just be the bios. Check for 67B0 and 67B1 maybe?
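For anyone following along: `atiflash -i` lists the adapters it sees along with the PCI device ID, and 0x67B0 / 0x67B1 are the device IDs for the reference Hawaii parts (290X and 290 respectively). A trivial sketch of the lookup mus1mus is describing; the input is assumed to be the bare hex ID string you read off atiflash or GPU-Z:

```python
# PCI device IDs for the reference Hawaii cards.
HAWAII_IDS = {
    0x67B0: "R9 290X (Hawaii XT)",
    0x67B1: "R9 290 (Hawaii PRO)",
}

def identify_hawaii(device_id_hex):
    """Map a hex device-ID string (as reported by atiflash -i / GPU-Z) to a name."""
    return HAWAII_IDS.get(int(device_id_hex, 16), "not a reference Hawaii ID")

print(identify_hawaii("67B0"))  # R9 290X (Hawaii XT)
```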


----------



## fyzzz

Quote:


> Originally Posted by *mus1mus*
> 
> @fyzzz
> 
> What do you get when you do an atiflash -i ?
> 
> Could just be the bios. Check for 67B0 and 67B1 maybe?


Haven't tried that command in atiflash. I have tried both BIOS switches and different BIOSes; it doesn't help. It runs just fine without the driver and is recognized properly in Windows. But as soon as I try to install the driver, the whole computer locks up, and I get a black screen when it tries to boot into Windows and the driver loads.


----------



## mus1mus

BSOD? Maybe the DPMs.


----------



## gupsterg

Until the AMD driver loads, a lot of the GPU's features aren't active and the card doesn't operate at full speed, so that's probably why the card is OK without the driver.


----------



## boot318

Quote:


> Originally Posted by *melodystyle2003*
> 
> You are correct, rep+. So guess higher air flow fans, e.g. 2x120mm fans will do the trick. We will see.


You don't have a spare CLC? I recommend the "RED MOD" for that model.


----------



## melodystyle2003

Quote:


> Originally Posted by *boot318*
> 
> You don't have a spare CLC? I recommend the "RED MOD" for that model.


My GPU has a warranty sticker (white sticker with a red circle) on one of the screws that hold the DCII cooler, so going to the "red mod" would void the warranty (18 months left). I do have a Corsair H80 waiting though...
Got her in my hands yesterday; she feels and looks quite neat. Removing the top shroud and fans would be easy, and 2x120mm high-flow fans may help reduce overall temps.


----------



## boot318

Quote:


> Originally Posted by *melodystyle2003*
> 
> My gpu has warranty sticker (white sticker with a red circle) on one of the screws which holds the dcii cooler, so going to "red mod" would void the warranty (18 months more left). I do have a corsair h80 waiting though...
> Got her on my hands yesterday, feels and looks quite neat. Removing top shroud and fans would be easy and 2x120mm high flow fans may help reducing overall temps.


I voided my warranty within the first 15 minutes of getting mine. lol

I loved her more when I did the cheap/ghetto/RED mod on her.


I'm interested in seeing the temperature change by slapping 120mm fans on her.


----------



## Mega Man

I vote for options 2-4: much better looking, and you don't have to support Asecrap, aka the patent trolls.


----------



## fyzzz

It sucks that my 290 just died; it could do 1625MHz on the memory with a custom BIOS and tighter timings, no problem. The 290 I'm using now can't even do 1250MHz on the memory with a custom BIOS, but the core overclocks very well: 1140+ on air and over 1150 on water at stock voltage. I have it on air right now, because I'm going to rebuild my loop and order some new parts.


----------



## fat4l

Quote:


> Originally Posted by *fyzzz*
> 
> It sucks that my 290 just died, it could do 1625mhz on the memory with custom bios and tighter timings, no problem. The 290 i'm using now can't even do 1250mhz on the memory with custom bios, but the core overclocks very well. 1140+ on air and over 1150 on water with stock voltage. I have it on air right now, because i'm going to rebuild my loop and order some new parts.


It's those XXX high voltages, brah... running them so high has its cost.


----------



## fyzzz

Quote:


> Originally Posted by *fat4l*
> 
> its those XXX high voltages brah.... Running them so high has its cost


Maybe. I've used pretty high voltages when benchmarking, but it was at stock voltage for most of its life. No damage on the PCB, at least.


----------



## battleaxe

Quote:


> Originally Posted by *Mega Man*
> 
> 
> 
> i vote for options 2-4, much better looking and you dont have to support asecrap aka patent trolls


Also, hundreds of dollars more expensive. You can't really compare the two. Custom is better, I know, but I started out on AIOs and they are not a bad way to get into water. Great for those not wanting to spend a boatload of money; my loop cost well over $700 alone, and that's nothing compared to some of them around here.

Seeing how an AIO costs about $50 on sale and gets the end user into the 50s C on a 290/390, I don't see how they are bad. I've had 5 or 6 of them and all are still alive. As good as my custom loop? No way. But a lot better than stock air, and simpler too.

Not everyone has the budget for a full loop.


----------



## mogie

So I'm new, btw. Hi







With no BIOS mod and stock cooling, I've got mine at a sweet spot for fps: MV +81, CC 1090, VRAM 1465. No artifacts, but I am interested to see how much more I can gain with water cooling, because this is only a 5% fps boost at 1080p and I'm hoping for more. It handles higher overclocking with 100% volts, but I'm getting an fps drop; it's not the CPU bottlenecking it, I believe it's just unstable. Will I see better results once cooled, or is this it, you reckon? Should I strap my spare cooler on her, flash a BIOS, and go a little further? 10 fps would be nice. On the other hand, keeping this card as-is would be nice too.









Also, I've got a pretty crap motherboard that can't be overclocked; my Z77 fried. I'm getting another one in a few days, as I need to overclock my GPU. I don't think a better mobo will help if I'm not maxing out my CPU usage in games, right? Even overclocked, my awesome 3570K is insane and sits at 4.7 for days, no problem. Someone mentioned a crap mobo can slow down fps. RAM's fine, power's fine, fps is going somewhere. Right now it's stock at 3.4 and I can't do anything about that.

Thoughts?


----------



## questgraves

Quote:


> Originally Posted by *mogie*
> 
> So im new btw hi
> 
> 
> 
> 
> 
> 
> 
> with no bios mod and stock cooling ive got mine at a sweet spot for fps.... MV +81 CC 1090 Vram 1465 .... no artifacts but I am intrested to see how much more i can gain with water cooling because this is a 5% fps boost in 1080p and im hoping to get more......It handles higher overclocking with 100% volts but im getting an fps drop....its not the cpu bottlenecking it I believe its just unstable....will I see better results once cooled or is this it you reckon? should i strap my spare cooler on her and a bios and go a lil furthure? 10 fps would be nice.On the other hand keeping this card would be nice too
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Also I got a pretty crap motherboard that cant be overclocked my z77 fried...Getting another one in a few days as i need to over clock my gpu.....I dont think if im not maxing out my cpu usage in games a better MOb will help right? even overclocked to my awsome 3570k that is insaine and sits on 4.7 for days no problem...Someone mentioned a crap Mob can slow down fps.....rams fine...powers fine...Fps is going somewhere.Right now its stock at 3.4 and i cant do nothing bout that
> 
> Thoughts?


Welcome Mogie,

It's been my experience that no matter what people say, such as "a chipset is a chipset and costly motherboards are a waste of money," you get what you pay for (most of the time). I may have gone a little overboard (no pun intended) with an Asus Maximus VIII Extreme for my 6700K; I'm not planning on using LN2 any time soon, but I have a custom water loop and like the feature set. I could have gotten the same thing a day later, with better features for me, in the Formula for $100 less, but that's just crappy luck. I should have known that part was coming, but I digress...

Of all the systems I've built (about a dozen or more just for myself, not including family and friend builds), and of all the supposedly equivalent machines with the same RAM, storage, CPU, etc. from companies like HP or Dell that I've compared against, nothing compares to a good quality motherboard as the foundation, especially here at OCN! I say this because I'm not surprised that you are losing frames per second someplace. It could be leakage in the traces, bad power delivery, who knows; crappy components just about anywhere and you lose efficiency.

That all being said, I'm also new as far as getting involved. I likely have some knowledge upstairs that can help, so now I hope to give a little back. I develop software for a living, but I know hardware fairly well also. I'm no electrical engineer, but I know my way around a PCB.


----------



## questgraves

I posted this in the Black Screen Poll forum for our card and nobody seemed to have any thoughts on the matter so please don't be upset that I'm asking a larger audience here, but I just have a few questions that I'm hoping to get some feedback on.

I'll start by saying I've been overclocking CPU's for about 15 years and just got into GPU Overclocking after building a custom water loop in my new rig.

Question 1:

Can GPU Tweak kill VRMs even if plenty cool?

I accidentally went WAY over with voltage switching to GPU tweak.

I just made a rookie move, leaving the lock on between core frequency and core voltage so they step up together.

GPU Tweak seems to want to add an incredible amount of extra voltage to hit an 1150MHz core clock, compared to a manual MSI Afterburner OC where I only add, say, 25mV and +30% power.

I don't remember how far up it went, but I know that when I opened Afterburner after saving in GPU Tweak to compare, the mV line for core voltage was pegged all the way up! I have no idea how far beyond the maximum it went. I heard from a few people I trust on YouTube that OCing via software is fairly safe, so I didn't think anything of it.

Anyway, now I cannot add more voltage or it's unstable, and at stock clocks it fluctuates between 100% and 0% GPU load when stability testing or running Valley or games. I don't really notice a problem in fps, though, other than it being lower because I cannot OC.

Question 2:
Could I have possibly ruined the cable, or maybe even the PCI-E slot? I can test the power cables later today, but it will be a lot harder to test the rest, as I will have to disassemble the water loop.

Question 3:
Asus issued an RMA number, but I punched out the standoffs on my backplate and used some custom plastic ones and thermal tape on the back behind the VRMs and chip, so that I could use it with the EK water block. It's noticeable even if I reassemble the card, so I don't know how Asus will feel about that.

Temps are not an issue; the VRMs max out at 50°C, same for the GPU itself. However, I think I just put too much voltage through, and hopefully I will not end up with a dead card in a couple of weeks.


----------



## SpykeZ

Anyone here have a 290x with the Kraken G10 bracket inside a Corsair Air 540? Wanna see how it fits in there.


----------



## LandonAaron

After updating my driver to Crimson 16.3.2, I have noticed an issue with MSI AB where my overclock does not completely reset back to default settings after rebooting the computer. I noticed it while investigating a freeze/black screen issue I started experiencing after updating to 16.3.2. What happens is I use MSI AB to overclock my core clock and memory clock and to raise the core voltage. Then, after a reboot, the core clock and core voltage reset back to their default settings, but the memory clock remains at the overclocked setting. This in turn is what caused the freezing and black screen issues, as the raised memory clock was not stable at the default voltage.

Has anyone else run into this issue or know of a potential fix other than just always remembering to reset your OC settings before a reboot?


----------



## mus1mus

Quote:


> Originally Posted by *LandonAaron*
> 
> After updating my driver to Crimson 16.3.2 I have noticed an issue with MSI AB where my overclock does not completely reset back to default settings after rebooting the computer. I noticed it after investigating a freeze/black screen issue I started experience after updating to 16.3.2. What is happening is I will use MSIAB to overclock my Core Clock and Memory Clock, and to raise the core voltage. Then after a reboot the core clock and core voltage will reset back to the default setting but the memory clock will remain at the overclocked setting. This in turn is what caused the freezing and black screen issues as the raised memory clock setting was not stable with the default voltage.
> 
> Has anyone else run into this issue or know of a potential fix other than just always remembering to reset your OC settings before a reboot?


Crimson Crap.

One thing: if you happen to find stable core and memory clocks, save them as a profile, set MSI AB to start with Windows, and enable applying the profile at boot as well.


----------



## spyshagg

I have that problem and many more with Crimson, none of which exist with Catalyst 15.11.1 Beta, the best driver I've tested so far.


----------



## Sp3cialUs3r

Hello, I found the solution to the black screen problems with Crimson and restarts:

https://community.amd.com/thread/197510

See the last post from user Neo0. It fixed the black screen problem for me (the issue is: the VRAM needs more voltage, but at Windows startup Crimson applies the VRAM clock without the extra voltage = black screen).

With regards,

Sp3cial Us3r

PS:

Please deactivate hybrid shutdown in Windows 10 -> powercfg -h off (in CMD), otherwise the problem still exists. If you have more than one display adapter, change the 0000 entry to 0001.









Sorry for the bad English


----------



## OneB1t

You can increase the voltage for the low DPM states via a BIOS edit;







then the card will be stable without black screens.


----------



## Sp3cialUs3r

I tried this on my setup and it didn't help: the idle voltage increased, but the flickering and black screens came back (even with the higher idle voltage). Normally it black screens under 1.087 V; after modding the idle voltage, it black screens at 1.112 V.

...

With MSI Afterburner there's no black screen at 1.087 V idle voltage (+100 in MSI Afterburner).

I'm a little bit confused...


----------



## Paul17041993

I'd rather just OC via a BIOS flash; it's much more reliable and easier to debug. Though I haven't checked back on 290X > 390X flashing for reference cards. Anyone know how well those go?


----------



## OneB1t

Quote:


> Originally Posted by *Sp3cialUs3r*
> 
> I tried this by my Setup and it didnt helped, the idle voltage increased but the flickers and Blackscreens come back (with higher Idle Voltage). Normally Blackscreen under 1.087 V, after Modding the Idle Voltage Blackscreen hat 1,112
> 
> 
> 
> 
> 
> 
> 
> ...
> 
> With MSI Afterburner no Blackscreen with 1,087 Idle Voltage (+100 in MSI Afterburner)
> 
> Im a little bit confused...


You need to increase more than one state:
increase the voltage in DPM0-DPM3, then retest.
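The "raise DPM0-DPM3 together" idea is just a uniform offset applied to the low entries of the voltage table. A sketch of the arithmetic only (the baseline millivolt values below are placeholders, not from any real BIOS; the actual table lives in your BIOS dump and gets written back with a BIOS editor, not with Python):

```python
# Hypothetical DPM voltage table in mV (placeholder values, not a real BIOS dump).
dpm_mv = {0: 968, 1: 1000, 2: 1050, 3: 1087, 4: 1150, 5: 1212, 6: 1250, 7: 1281}

def bump_low_states(table, offset_mv, states=range(4)):
    """Return a copy of the table with DPM0-DPM3 raised by offset_mv."""
    return {s: v + (offset_mv if s in states else 0) for s, v in table.items()}

bumped = bump_low_states(dpm_mv, 25)
print(bumped[0], bumped[3], bumped[4])  # 993 1112 1150
```

The point of the sketch: states 0-3 move together while the load states (DPM4+) stay untouched, which is the pattern OneB1t is suggesting to test against the idle black screens.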


----------



## kizwan

Quote:


> Originally Posted by *Sp3cialUs3r*
> 
> I tried this by my Setup and it didnt helped, the idle voltage increased but the flickers and Blackscreens come back (with higher Idle Voltage). Normally Blackscreen under 1.087 V, after Modding the Idle Voltage Blackscreen hat 1,112
> 
> 
> 
> 
> 
> 
> 
> ...
> 
> With MSI Afterburner no Blackscreen with 1,087 Idle Voltage (+100 in MSI Afterburner)
> 
> Im a little bit confused...


Let me get this straight...

- DPM volt mod --> black screen at idle
- MSI AB offset voltage --> no black screen

If this is the case, then add & set either offset voltage (I2C register/ID 0x8D) or hidden offset voltage (I2C register/ID 0x26) in VoltageObjectInfo in the ROM.


----------



## mogie

Works on my 290X Sapphire Tri-X.







Also, I've got a beef with people saying the MSI Lightning stock coolers are better; if people want to argue, I can show evidence the Tri-X is like insanely better.


----------



## Sp3cialUs3r

Quote:


> Originally Posted by *OneB1t*
> 
> you need to increase more than 1 state
> increase voltage in DPM0-DPM3 then retest


I did this already.







But i will check kizwanns Answer







Quote:


> Originally Posted by *kizwan*
> 
> Let me get this straight...
> 
> - DPM volt mod --> black screen at idle
> - MSI AB offset voltage --> no black screen
> 
> If this is the case, then add & set either offset voltage (I2C register/ID 0x8D) or hidden offset voltage (I2C register/ID 0x26) in VoltageObjectInfo in the ROM.


- DPM volt mod --> black screen at idle
- MSI AB offset voltage --> no black screen

Right, this is exactly my problem.


----------



## spyshagg

Quote:


> Originally Posted by *Paul17041993*
> 
> I'd rather just OC via a BIOS flash, much more reliable and easy to debug. Though I haven't checked back on 290X > 390X flashing for reference cards, anyone know how well they go?


In my case, the difference from winter (when I tested my overclock limit) to today is about 10°C, and my winter overclock fails at this temperature. I like to keep the OC in software for this reason. I do the exact same with my CPU.

cheers


----------



## mogie

Quote:


> Originally Posted by *spyshagg*
> 
> In my case, the difference from winter (when i tested my overclock limit) to today is about 10ºc, and my winter overclock fails at this temperature. I like to keep OC via software for this reason. I do the exact same with my cpu
> 
> cheers


Makes sense. Here in Aus, the temps in a tropical place fluctuate all day; I'm talking a 25-degree difference on a good day.


----------



## Paul17041993

Quote:


> Originally Posted by *spyshagg*
> 
> In my case, the difference from winter (when i tested my overclock limit) to today is about 10ºc, and my winter overclock fails at this temperature. I like to keep OC via software for this reason. I do the exact same with my cpu
> 
> cheers


BIOS switch. The CPU is a bit of a different story, though; I've never had any working form of software CPU OC anyway, not that I'd want to try, as I need access to every little setting possible.

Quote:


> Originally Posted by *mogie*
> 
> Makes sense here in aus the temps in a tropical place fluctuate all day.....im talking 25 degree difference on a good day


Which was basically why I invested ~1K AUD in an overkill water loop (although probably not overkill at all if you want pure silence).


----------



## spyshagg

I'm watercooled. Water temps go up by the same delta as air temps.


----------



## mogie

Guys, I really need your help with my watercooling idea. Can someone please take 2 minutes and outline where the VRAM and such are? It's a 290X Tri-X.

I'm begging you; I need to get the watercooling going today.


----------



## fyzzz




----------



## mogie

Quote:


> Originally Posted by *fyzzz*


Cheers bro, but what are those modules all around the GPU chip in a circle?


----------



## mogie

I'm flat-out sick of this though. Can someone please link me a picture of exactly what everything on the board is, so I know? I'm asking pretty please with sugar on top... ALSO, are they all the same once the coolers are off? I mean, besides the RAM modules being different brands.


----------



## fyzzz

Quote:


> Originally Posted by *mogie*
> 
> cheers bro bu wth are the mats all around in the circle of the gpu chip?


That's the VRAM, it doesn't need as much cooling as the VRM's.


----------



## mogie

Quote:


> Originally Posted by *fyzzz*
> 
> That's the VRAM, it doesn't need as much cooling as the VRM's.


The stupidest part, man: when I ripped off the cooling system, it was focused on that, and I think the thermal pads weren't even touching the VRMs. How ******ed are manufacturers, man.


----------



## LandonAaron

Quote:


> Originally Posted by *mogie*
> 
> cheers bro bu wth are the mats all around in the circle of the gpu chip?




Here, I edited fyzzz's picture. Also, the shiny part in the middle is the actual GPU chip. Like others said, the VRM needs more cooling than the VRAM, but it is a bit more difficult to cool due to its smaller surface area. You really need some sort of heatsink for these. Someone used to make a set of VRM heatsinks specifically for the 290 that came with thermal pads, but I forget who. Maybe someone else remembers.

Edit: Isn't Google wonderful. Found 'em: http://www.newegg.com/Product/Product.aspx?Item=N82E16835426042 These are good because the mounting hardware ensures good contact with the VRM.


----------



## Chopper1591

Hello everyone.

I am seeking some advice once more. I would like to upgrade my build, but I'm not sure what to do: CPU or GPU.

It would be cool if you could post your experience with different CPUs combined with a 290(X). I know some games are more CPU-oriented than others.
I made a thread about my upgrade idea here: link.

What do you guys think will help more?

Upgrade cpu: to 6600k or 6700k with proper board (Maximus VIII Hero) and something like 16GB ddr4-3000.
Upgrade gpu: to gtx 1080 or 1070.


----------



## ManofGod1000

Quote:


> Originally Posted by *Chopper1591*
> 
> Hello everyone.
> 
> I am seeking some advice once more. Would like to upgrade my build but not sure what to do... cpu or gpu.
> 
> Would be cool if you can post experience with different cpu's combined with a 290(x) card. I know some games are more cpu oriented than others..
> I made a thread about my upgrade idea here: link.
> 
> What do you guys think will help more?
> 
> Upgrade cpu: to 6600k or 6700k with proper board (Maximus VIII Hero) and something like 16GB ddr4-3000.
> Upgrade gpu: to gtx 1080 or 1070.


What games are you playing, and at what resolution? I personally would stick with what you have for now and enjoy it. I upgraded to a 980 Ti and 6700K, and in day-to-day use I see no real difference at all. Granted, I am gaming at 4K, but still, I have regretted spending the money, since I do not really game all that much on my computer compared to my Xbox One. However, if you absolutely must upgrade, the GPU is the best option.

Edit: I recommend adding 16GB more RAM; you will notice a difference in today's games.


----------



## Chopper1591

Quote:


> Originally Posted by *ManofGod1000*
> 
> What games are you playing and at what resolution? However, I personally would stick with what you have for now and enjoy it. I upgraded to a 980 Ti and 6700k and in day to day use, I see no real difference at all. Now, I am gaming at 4k but still, I have regretted spending the money since I do not really game all that much on my computer like I do on my XBox one. However, if you absolutely must upgrade, the gpu is the best option.
> 
> Edit: I recommend upgrading your ram with 16GB more, you will notice a difference with today's games.


What did you have before the upgrade to the Ti and 6700K?
4K is another story, I guess. I don't expect Intel to be much faster than a well-clocked FX-83x0.

I do, however, only game on PC. I have no Xbox... well, I do, but it's the real Xbox (the original).








Regarding the RAM upgrade: I have actually thought about buying a 16GB kit, as some games require it (Forza, for example); I can't choose Ultra in-game because my 8GB of RAM rules that out.








Sadly, I'd have to get DDR3, and when I do upgrade I can't reuse it on the new platform. Hmmm... decisions, decisions.

Will wait until the cards are released to see some benchmarks.


----------



## Roboyto

Quote:


> Originally Posted by *Chopper1591*
> 
> Hello everyone.
> 
> I am seeking some advice once more. Would like to upgrade my build but not sure what to do... cpu or gpu.
> 
> Would be cool if you can post experience with different cpu's combined with a 290(x) card. I know some games are more cpu oriented than others..
> I made a thread about my upgrade idea here: link.
> 
> What do you guys think will help more?
> 
> Upgrade cpu: to 6600k or 6700k with proper board (Maximus VIII Hero) and something like 16GB ddr4-3000.
> Upgrade gpu: to gtx 1080 or 1070.


We are so close to new CPUs and GPUs that any purchase of current hardware would be a fool's errand, IMHO. There is little *gaming* performance to gain by switching CPUs. A 980 Ti would obviously be faster, but depending on what resolution, refresh rate, etc. you are working with, it may be pointless.

I would really wait for the dust to settle with the new hardware, especially the GPUs, given all the clout around DX12. With Ashes and Hitman, which had DX12 in mind from the start, AMD has an advantage. In Hitman, Nvidia no longer runs away at 1080p, and the 290(X) surpasses the Titan X at 4K. http://www.guru3d.com/articles_pages/hitman_2016_pc_graphics_performance_benchmark_review,7.html

The first Tomb Raider favored Nvidia despite being an AMD-'branded' game, from what I remember of my 290/970 comparison: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/34380_20#post_23458320. Then ROTR comes out with Crapworks features and DX12 'support' added after the fact, and it shows DX12 running worse than DX11? Coincidence? I highly doubt it.

If Nvidia did nothing with Pascal to leverage DX12 features, then choosing AMD will be a clear no-brainer, as DX12 adoption will only grow at an exponential rate from here on out.









If you don't have 16GB of RAM currently, just buy a kit as they are unbelievably cheap right now. I'm pretty sure I have seen 16GB 2400MHz kits for like $75.


----------



## Chopper1591

Quote:


> Originally Posted by *Roboyto*
> 
> We are so close to new CPUs and GPUs that any purchase of current hardware would be a fool's errand IMHO. There is little *gaming* performance to gain by switching CPUs. A 980 Ti would obviously be faster, but depending on what resolution, refresh rate, etc. you are working with, it may be pointless.
> 
> I would really wait for the dust to settle with the new hardware, especially the GPUs, given all the hype around DX12. With Ashes and Hitman, which had DX12 in mind from the start, AMD has an advantage. In Hitman, Nvidia no longer runs away at 1080p, and the 290(X) surpasses the Titan X at 4K. http://www.guru3d.com/articles_pages/hitman_2016_pc_graphics_performance_benchmark_review,7.html
> 
> The first Tomb Raider favored Nvidia despite being an AMD-'branded' game, from what I remember of my 290/970 comparison: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/34380_20#post_23458320. Then ROTR comes out with Crapworks features and DX12 'support' added after the fact, and it shows DX12 running worse than DX11? Coincidence? I highly doubt it.
> 
> If Nvidia did nothing with Pascal to leverage DX12 features, then choosing AMD will be a clear no-brainer, as DX12 adoption will only grow at an exponential rate from here on out.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> If you don't have 16GB of RAM currently, just buy a kit as they are unbelievably cheap right now. I'm pretty sure I have seen 16GB 2400MHz kits for like $75.


That is one of the main things I am waiting to see in actual tests: the DX12 performance of Pascal. Clearly AMD has the edge (quite a big one, too) in DX12 with current-gen cards. I will actually be amazed if Pascal is better at DX12 than AMD's Vega cards. Let's just hope they arrive soon.









You are probably right that a CPU upgrade wouldn't justify the cost. It's just that you hear/read everywhere that Intel is so much better for games than AMD...
I guess in most cases the gap isn't that big after all, Far Cry 4, GTA V, etc. excluded; those do seem to run substantially better on Intel.


----------



## mogie

Quote:


> Originally Posted by *Chopper1591*
> 
> That is one of the main things I am waiting to see in actual tests: the DX12 performance of Pascal. Clearly AMD has the edge (quite a big one, too) in DX12 with current-gen cards. I will actually be amazed if Pascal is better at DX12 than AMD's Vega cards. Let's just hope they arrive soon.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You are probably right that a CPU upgrade wouldn't justify the cost. It's just that you hear/read everywhere that Intel is so much better for games than AMD...
> I guess in most cases the gap isn't that big after all, Far Cry 4, GTA V, etc. excluded; those do seem to run substantially better on Intel.


Bro, you know what? In my opinion, don't do it. If you do anything, get a sick watercooling system and push her to the limits at 1080p. What games are you looking at? If you stick to old Source games, who cares. Don't do it, man, unless you have to. 4K gaming is just MORE PIXELS... you wanna fork out 2 grand for bragging rights? You'll be alright. I just reckon watercool it, overclock it, and when it dies, get a new rig. I would however get FreeSync; that's a godsend: no vsync and no screen tearing, it's the best thing I've ever invested in. ignore my spelling mistkes its 4am and im drunk working on custom colling my gpu


----------



## battleaxe

Quote:


> Originally Posted by *mogie*
> 
> ignore my spelling mistkes its 4am and im drunk working on custom colling my gpu


----------



## Chopper1591

Quote:


> Originally Posted by *mogie*
> 
> Bro, you know what? In my opinion, don't do it. If you do anything, get a sick watercooling system and push her to the limits at 1080p. What games are you looking at? If you stick to old Source games, who cares. Don't do it, man, unless you have to. 4K gaming is just MORE PIXELS... you wanna fork out 2 grand for bragging rights? You'll be alright. I just reckon watercool it, overclock it, and when it dies, get a new rig. I would however get FreeSync; that's a godsend: no vsync and no screen tearing, it's the best thing I've ever invested in. ignore my spelling mistkes its 4am and im drunk working on custom colling my gpu


I know the feeling. Wuahah.
Beware of the admins, though.









But...
On your first sentence: I already have a decent water cooling system. This bad boy has set me back around 600 euro. I'm already pushing it.
Next?









But damn.
I might even consider getting a new screen with *Freesync*.
Anybody here have experience with it? Is it worth it? Something like the Asus 144Hz 1440p FreeSync (500 euro).


----------



## Roboyto

Quote:


> Originally Posted by *Chopper1591*
> 
> That is one of the main things I am waiting to see in actual tests: the DX12 performance of Pascal. Clearly AMD has the edge (quite a big one, too) in DX12 with current-gen cards. I will actually be amazed if Pascal is better at DX12 than AMD's Vega cards. Let's just hope they arrive soon.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You are probably right that a CPU upgrade wouldn't justify the cost. *It's just that you hear/read everywhere that Intel is so much better for games than AMD...*
> I guess in most cases the gap isn't that big after all, Far Cry 4, GTA V, etc. excluded; those do seem to run substantially better on Intel.


I would have to, at least partially, disagree with the highlighted portion of your comment. What you hear/read everywhere about Intel being *vastly superior* to AMD specifically in regards to gaming is likely just people's opinions.

There are always numerous factors to consider when people make blanket statements such as:

What game(s) is it specifically? Are they utilizing numerous CPU cores, because we all know Intel is superior in single-threaded performance. (This is also an area where DX12 levels the playing field.)

Are we talking about Xfire/SLI of enthusiast class cards? Gee, I wonder why the $150 AMD CPU is bottlenecking a pair of $650 graphics cards? This is a no-brainer...

What resolution/refresh rate? Are people's overclocks stable? Are they using a proper PSU? Do they have enough RAM, and is it fast enough? Are they running proper drivers? What OS are they on? The list of questions to ask someone who blurts out blanket statements is usually vast. But you don't ever get a chance to find any of this out, because they are typically quite immature.

There are also numerous other comparisons and benchmarks where Intel obviously holds a sizable advantage; I won't dispute that. Hell, every PC device in my house is powered by an Intel Core i5/i7 for this particular reason.

However...

An AMD FX 8-core with a mild overclock shouldn't pose a vast disadvantage for the average user when it comes to gaming. I'm sure there are some people in this thread who have AMD chips that perform just fine across a myriad of titles who could chime in.

I would also be surprised if Nvidia caught up to AMD in DX12 performance with next gen...The hardware isn't out yet so all we can do is speculate.

But I *really* wouldn't be purchasing a new CPU/GPU until everything from Intel/AMD/Nvidia has the covers pulled off of them.

If you do decide to switch to team green, I wouldn't be purchasing a 1070/1080 until a few months later when they release the fully enabled Ti variant of either card...cuz you know they will be pulling that BS once again.


----------



## Paul17041993

Quote:


> Originally Posted by *mogie*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Guys I really need your help for my watercooling idea can smeone please take 2 mins and outline where the vrams n stuff are its a 290x tri-x force
> 
> Im begging you i need to get the watercooling going today


If the VRAM only runs at the stock 1250MHz or close to it, you can skip cooling it for now. I'm not sure what options you have for mounting that H100i, as it's larger than the mounting holes around the die, but most importantly you'll need a way to cool both the main and VRAM VRMs.

The Corsair HG10 bracket would do a decent job for what you need if you can get one; it bypasses the mounting-hole issue by using a solid shroud that mounts around the entire card.
Quote:


> Originally Posted by *Chopper1591*
> 
> You are probably right about a cpu upgrade not justifying the cost. It's just that you hear/read everywhere that Intel is so much better at games compared to AMD....
> I guess in most cases the gap isn't that big after all. Far Cry 4, GTA V etc. excluded. These do run substantially better on Intel it seems.


I've possibly said this to death by now, but I still use a stock FX-8150 and have never needed anything "better" (I'd have to go to a 2011-3 hex-core or higher for it to be an actual upgrade...).

But if the developers of the games you like simply refuse to optimise their games adequately, then a side-grade to Intel is reasonable until Zen comes out...


----------



## Chopper1591

Quote:


> Originally Posted by *Roboyto*
> 
> I would have to, at least partially, disagree with the highlighted portion of your comment. What you hear/read everywhere about Intel being *vastly superior* to AMD specifically in regards to gaming is likely just people's opinions.
> 
> There are always numerous factors to consider when people make blanket statements such as:
> 
> What game(s) is it specifically? Are they utilizing numerous CPU cores, because we all know Intel is superior in single-threaded performance. (This is also an area where DX12 levels the playing field.)
> 
> Are we talking about Xfire/SLI of enthusiast class cards? Gee, I wonder why the $150 AMD CPU is bottlenecking a pair of $650 graphics cards? This is a no-brainer...
> 
> What resolution/refresh rate? Are people's overclocks stable? Are they using a proper PSU? Do they have enough RAM, and is it fast enough? Are they running proper drivers? What OS are they on? The list of questions to ask someone who blurts out blanket statements is usually vast. But you don't ever get a chance to find any of this out, because they are typically quite immature.
> 
> There are also numerous other comparisons and benchmarks where Intel obviously holds a sizable advantage; I won't dispute that. Hell, every PC device in my house is powered by an Intel Core i5/i7 for this particular reason.
> 
> However...
> 
> An AMD FX 8-core with a mild overclock shouldn't pose a vast disadvantage for the average user when it comes to gaming. I'm sure there are some people in this thread who have AMD chips that perform just fine across a myriad of titles who could chime in.
> 
> I would also be surprised if Nvidia caught up to AMD in DX12 performance with next gen...The hardware isn't out yet so all we can do is speculate.
> 
> But I *really* wouldn't be purchasing a new CPU/GPU until everything from Intel/AMD/Nvidia has the covers pulled off of them.
> 
> If you do decide to switch to team green, I wouldn't be purchasing a 1070/1080 until a few months later when they release the fully enabled Ti variant of either card...cuz you know they will be pulling that BS once again.


Totally agree with you. Well said.

I am probably chasing the impossible here. The very few titles I actually have major problems with would most likely not run much better on Intel anyway.








I'll just have to temper myself and save up cash until, like you said, everything is out in the open and properly tested. And then, if I still want to, grab some new stuff.

A mild overclock? I thought a 4.8GHz OC was pretty decent on an 8320 (3.5GHz stock).









Haha, so true, what you said about the Ti joke.
Quote:


> Originally Posted by *Paul17041993*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> If the VRAM only runs at the stock 1250MHz or close to it, you can skip their cooling for now. Im not sure what options you have to mounting that H100i as it's larger than the mounting holes around the die, but most importantly you'll need a way to cool both the main and VRAM VRMs.
> 
> The corsair HG10 bracket would do a decent job for what you need if you can get one, it bypasses the mounting hole issue by using a solid shroud that mounts around the entire card.
> 
> 
> I've possibly said this to death by now; but I still use a stock FX-8150, never needed anything "better" (of which I would have to go for a 2011-3 hex or higher for it to be an actual upgrade...)
> 
> But if it's the case that the developers for the game/s you like are some form of nazi that refuse to develop and optimise their games adequately, then a side-grade to intel is reasonable until zen comes out...


Kudos...
A stock 8150. I do honestly find the overclock on my 8320 noticeable.

I will settle with my setup. It is doing its thing for me after all. Will just blame the companies when I have performance issues. After all, it has been like this for a few years now. Companies rushing the release of unfinished crap (remember GTA IV anyone?).

On the side note. Something I wanted to grab for some time now and probably just will.
A nice new display. My current Samsung 24" 60hz is starting to get old.

Got my eye on the *Asus MG279Q*.
Any warnings here? It is a 1440p 144Hz FreeSync display. The only downside would be, well... FreeSync: it will tie me to AMD for some time, and I am not sure I would even notice the difference between a FreeSync and a non-FreeSync display.
The technology is fairly mature now, so it should be good, right?


----------



## mogie

Quote:


> Originally Posted by *Chopper1591*
> 
> I know the feeling. Wuahah.
> Beware of the admins, though.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But...
> On your first sentence: I already have a decent water cooling system. This bad boy has set me back around 600 euro. I'm already pushing it.
> Next?
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But damn.
> I might even consider getting a new screen with *Freesync*.
> Anybody here have experience with it? Is it worth it? Something like the Asus 144Hz 1440p FreeSync (500 euro).


Yes dude, I bought one 5 days ago and there's no more screen tearing. It's a complete godsend, worth every penny.


----------



## Chopper1591

Quote:


> Originally Posted by *mogie*
> 
> Yes dude, I bought one 5 days ago and there's no more screen tearing. It's a complete godsend, worth every penny.


Niice.

Which one did you get? There are various choices.
I'm not sure which is more important. You have 1080p vs 1440p, TN vs IPS, 60Hz vs 144Hz... choices.


----------



## mogie

Quote:


> Originally Posted by *Chopper1591*
> 
> Niice.
> 
> Which one did you get? There are various choices.
> I'm not sure which is more important. You have 1080p vs 1440p, TN vs IPS, 60Hz vs 144Hz... choices.


24-inch AOC 144Hz, set me back 400 Australian though.


----------



## Chopper1591

Quote:


> Originally Posted by *mogie*
> 
> 24-inch AOC 144Hz, set me back 400 Australian though.


Do you by any chance mean the "AOC GF2460PF"?
1080p TN, 144Hz, with FreeSync support over a 35-144Hz range.

I have actually been looking at that screen too. But I'm not sure it is a decent upgrade, considering I already have a 1080p LED 2ms screen (5 years old).
The screen is decently priced though: €259 at the cheapest store (292 USD). Hehe, funny thing is... that is actually 401 Australian dollars. The most expensive store is selling it for ~620 Australian dollars.

The ones I have my mind on now are the Asus MG279Q and MG278Q.
Both are 27-inch 1440p 144Hz.
*MG278Q*: 1ms TN with FreeSync 35-144Hz - €509 (789 AUD)
*MG279Q*: 4ms IPS with FreeSync 35-90Hz - €579 (898 AUD)

What is your experience running games below 60fps on your monitor? Is it smoother than a non-FreeSync 60Hz display?
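For anyone sanity-checking those price comparisons, here is a quick sketch. The exchange rates below are my own rough assumptions for the period (not figures from the post), so treat the output as ballpark only:

```python
# Rough price comparison across currencies; the rates are assumed
# (approximately mid-2016 values), not live data.
EUR_TO_USD = 1.13
EUR_TO_AUD = 1.55

def convert(eur: float) -> tuple:
    """Return (usd, aud) for a price given in euros, rounded to whole units."""
    return round(eur * EUR_TO_USD), round(eur * EUR_TO_AUD)

for name, price_eur in [("AOC GF2460PF", 259), ("MG278Q", 509), ("MG279Q", 579)]:
    usd, aud = convert(price_eur)
    print(f"{name}: €{price_eur} ≈ ${usd} USD ≈ {aud} AUD")
```

At these assumed rates, €259 does indeed come out to roughly 401 AUD, matching the cheapest-store comparison above.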


----------



## mogie

Quote:


> Originally Posted by *Chopper1591*
> 
> Do you by any chance mean the "AOC GF2460PF"?
> 1080p TN, 144Hz, with FreeSync support over a 35-144Hz range.
> 
> I have actually been looking at that screen too. But I'm not sure it is a decent upgrade, considering I already have a 1080p LED 2ms screen (5 years old).
> The screen is decently priced though: €259 at the cheapest store (292 USD). Hehe, funny thing is... that is actually 401 Australian dollars. The most expensive store is selling it for ~620 Australian dollars.
> 
> The ones I have my mind on now are the Asus MG279Q and MG278Q.
> Both are 27-inch 1440p 144Hz.
> *MG278Q*: 1ms TN with FreeSync 35-144Hz - €509 (789 AUD)
> *MG279Q*: 4ms IPS with FreeSync 35-90Hz - €579 (898 AUD)
> 
> What is your experience running games below 60fps on your monitor? Is it smoother than a non-FreeSync 60Hz display?


Yep, that's the one, bro. Actually, when I wasn't playing at higher than 60fps I had some weird lag... I wouldn't call it screen tearing, it's hard to explain. Just run it at max 144Hz with as much fps as possible. Issue is I've only tested on Source engine games at the moment, because my card's in pieces while I do the water mod on her.


----------



## Chopper1591

Quote:


> Originally Posted by *mogie*
> 
> Yep, that's the one, bro. Actually, when I wasn't playing at higher than 60fps I had some weird lag... I wouldn't call it screen tearing, it's hard to explain. Just run it at max 144Hz with as much fps as possible. Issue is I've only tested on Source engine games at the moment, because my card's in pieces while I do the water mod on her.


Which water mod?
There is only one water mod IMO, and that is a full custom loop. I am so happy I switched fully to water: much cooler and quieter.
With a 1200MHz OC my R9 290 Tri-X sits around 45-50°C after hours of gaming. My VRMs are also very chill (40-50°C) at 1.3V.

I think I will just buy the MG279Q as I will probably upgrade my gpu by the end of the year anyway.


----------



## mogie

Quote:


> Originally Posted by *Chopper1591*
> 
> Which water mod?
> There is only one water mod IMO, and that is a full custom loop. I am so happy I switched fully to water: much cooler and quieter.
> With a 1200MHz OC my R9 290 Tri-X sits around 45-50°C after hours of gaming. My VRMs are also very chill (40-50°C) at 1.3V.
> 
> I think I will just buy the MG279Q as I will probably upgrade my gpu by the end of the year anyway.


I ground down and got rid of the heatsink, so the whole block will be cooled by this monster. I'm also putting small heatsinks where VRM1 and that is, so it's both water and air cooled at the same time... crazy, no?


----------



## Chopper1591

Quote:


> Originally Posted by *mogie*
> 
> I ground down and got rid of the heatsink, so the whole block will be cooled by this monster. I'm also putting small heatsinks where VRM1 and that is, so it's both water and air cooled at the same time... crazy, no?


I like thinking outside the box, but I doubt all that work is worth it.
Why not just go all out and get a proper water block?

I also have a Tri-X.
My GPU sits around 45°C with a 1200MHz overclock, and the block keeps the VRMs around 40-45°C as well.


----------



## spyshagg

Quote:


> Originally Posted by *Chopper1591*
> 
> Do you by any chance mean the "AOC GF2460PF"?
> 1080p TN, 144hz with freesync support ranging 35-144hz.
> 
> I actually have also been looking at that screen. But I'm not sure if that is a decent upgrade considering I already have a 1080p LED 2ms screen now (5 years old).
> The screen is decent priced though, €259 at the cheapest store (292 USD). Hehe, funny thing is... that is actually 401 Australian dollar. The most expensive store is selling it for ~620 Australian dollar.
> 
> The one I have my mind on now is: Asus MG279Q or MG278Q.
> Both are 27 inch 1440p 144hz.
> *MG278Q* is 1ms TN with Freesync 35-144hz - €509 (789 AUD)
> *MG279Q* is 4ms IPS with Freesync 35-90hz - €579 (898 AUD)
> 
> What is your experience when running games below 60fps on your monitor? Or is more smoothly compared to the non Freesync 60hz display?


The MG278Q's FreeSync range is actually 42-144Hz.


----------



## mogie

Quote:


> Originally Posted by *Chopper1591*
> 
> I like thinking outside the box, but I doubt all that work is worth it.
> Why not just go all out and get a proper water block?
> 
> I also have a Tri-X.
> My GPU sits around 45°C with a 1200MHz overclock, and the block keeps the VRMs around 40-45°C as well.


It came down to cash, and me wanting to see how well I could create a block, with the idea of prefab-making them myself at a cheaper rate than the ones out there and hopefully making some money. But it's a **** ton of work, and I'm slightly regretting it only because I can't play Insurgency at the moment.









Stuff in Aus, bro, is crazy expensive. A waterblock that's 150 bucks in the USA is automatically like 400 here.


----------



## Chopper1591

Quote:


> Originally Posted by *spyshagg*
> 
> The MG278Q's FreeSync range is actually 42-144Hz.


Whatever








Anything with a minimum below 50Hz is good for me.
Quote:


> Originally Posted by *mogie*
> 
> It came down to cash, and me wanting to see how well I could create a block, with the idea of prefab-making them myself at a cheaper rate than the ones out there and hopefully making some money. But it's a **** ton of work, and I'm slightly regretting it only because I can't play Insurgency at the moment.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Stuff in Aus, bro, is crazy expensive. A waterblock that's 150 bucks in the USA is automatically like 400 here.


Making something yourself is nice. But like you said, it's a lot of work.


----------



## Paul17041993

Seems stable: 1.3V core, with the memory/aux left at 1V, done via a modded BIOS. I didn't touch the TDPs though, so it does throttle a little unless the power limit is set to +50% via the Crimson panel. I managed 1200/1750 clocks with 1.35V via software before modding the BIOS, but it blacked out, possibly due to the card's physical TDP limits (it looked stable until the sudden blackout).



Not sure if the ASIC quality may have been affected by the mod though...

Quote:


> Originally Posted by *mogie*


I'd encourage you to solder the blocks together if you're not worried about them being almost permanently attached, however a good conductive or diamond compound spread across the pump block's entire surface area should do a good job.


----------



## mogie

Quote:


> Originally Posted by *Paul17041993*
> 
> Seems stable: 1.3V core, with the memory/aux left at 1V, done via a modded BIOS. I didn't touch the TDPs though, so it does throttle a little unless the power limit is set to +50% via the Crimson panel. I managed 1200/1750 clocks with 1.35V via software before modding the BIOS, but it blacked out, possibly due to the card's physical TDP limits (it looked stable until the sudden blackout).
> 
> 
> 
> Not sure if the ASIC quality may have been affected by the mod though...
> I'd encourage you to solder the blocks together if you're not worried about them being almost permanently attached, however a good conductive or diamond compound spread across the pump block's entire surface area should do a good job.


I was gonna, bro; I need flux and stuff first, and also to make the surface areas smooth as silk. I believe the cooling will reach the VRMs, but putting a strip of heatsinks on top of their area as well will make it a weird first-time custom job: water and air at the same time. And if you saw the pump I got for it, holy god, it would cool down Australia.

Anyone know the name of the VRM heatsink kits everyone used to use for this card?

Thanks for the input.









I know a heatsink is a heatsink, but if I've gone this far with the build I might as well keep going all the way, right?


----------



## diggiddi

Quote:


> Originally Posted by *mogie*
> 
> I was gonna, bro; I need flux and stuff first, and also to make the surface areas smooth as silk. I believe the cooling will reach the VRMs, but putting a strip of heatsinks on top of their area as well will make it a weird first-time custom job: water and air at the same time. And if you saw the pump I got for it, holy god, it would cool down Australia.
> 
> Anyone know the name of the VRM heatsink kits everyone used to use for this card?
> 
> Thanks for the input.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I know a heatsink is a heatsink, but if I've gone this far with the build I might as well keep going all the way, right?


See this prev post
Quote:


> Originally Posted by *LandonAaron*
> 
> 
> 
> Here I edited Fryzz's picture. The shiny part in the middle is the actual GPU chip. Like others said, the VRM needs more cooling than the VRAM, but it is a bit more difficult to cool due to its smaller surface area. You really need some sort of heatsink for these. Someone used to make a set of VRM heatsinks specifically for the 290 that came with thermal pads, but I forget who. Maybe someone else remembers.
> 
> Edit: Isn't Google wonderful? Found 'em: http://www.newegg.com/Product/Product.aspx?Item=N82E16835426042 These are good because the mounting hardware ensures good contact with the VRMs.


----------



## Paul17041993

Quote:


> Originally Posted by *mogie*
> 
> I was gonna, bro; I need flux and stuff first, and also to make the surface areas smooth as silk. I believe the cooling will reach the VRMs, but putting a strip of heatsinks on top of their area as well will make it a weird first-time custom job: water and air at the same time. And if you saw the pump I got for it, holy god, it would cool down Australia.
> 
> Anyone know the name of the VRM heatsink kits everyone used to use for this card?
> 
> Thanks for the input.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I know a heatsink is a heatsink, but if I've gone this far with the build I might as well keep going all the way, right?


The full-cover heatspreader that the card already has should do a good job, possibly even better than other VRM heatsinks. You also still have the two fans attached, so airflow shouldn't be an issue.


----------



## mogie

Quote:


> Originally Posted by *Paul17041993*
> 
> The full-cover heatspreader that the card already has should do a good job, possibly even better than other VRM heatsinks. You also still have the two fans attached, so airflow shouldn't be an issue.


I'll let you boys know in a few hours how it goes, although the cosmetics of the original plastic framing with the fans will be modded in future too.


----------



## TheLAWNOOB

How much is a R9 290 with EK block worth?


----------



## mogie

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> How much is a R9 290 with EK block worth?


A cheapo one is around 88 bucks, a better one 200. I'd recommend one, because the rad mod is crap for VRM cooling unless you add heatsinks, and it all ends up looking ghetto and takes a million years. Not worth it IMO.


----------



## Paul17041993

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> How much is a R9 290 with EK block worth?


It would somewhat depend on how well it clocks and the country it's from, I suppose, but $300 USD? Possibly more. In AU they're still pretty costly, with some places selling reference 290Xs for 1K AUD...


----------



## Vellinious

Been thinking about selling mine as well, with the Aquacomputer Kryographics block. The card is amazing: it easily hits 1300 on the core and 1750 on the memory. I was going to ask $350 USD... seems a pretty fair price. /shrug


----------



## Forceman

Quote:


> Originally Posted by *Vellinious*
> 
> Been thinking about selling mine as well, with the Aquacomputer Kryographics block. The card is amazing: it easily hits 1300 on the core and 1750 on the memory. I was going to ask $350 USD... seems a pretty fair price. /shrug


If you are looking for that kind of price, I'd sell it before Polaris launches.


----------



## Vellinious

Quote:


> Originally Posted by *Forceman*
> 
> If you are looking for that kind of price, I'd sell it before Polaris launches.


Why? From everything I've read, they're supposed to have the same kind of performance as the 390 and 390X.....I've seen nothing from the 3XX series that impressed me at all. That's why I bought the 290X lol.


----------



## Forceman

Quote:


> Originally Posted by *Vellinious*
> 
> Why? From everything I've read, they're supposed to have the same kind of performance as the 390 and 390X.....I've seen nothing from the 3XX series that impressed me at all. That's why I bought the 290X lol.


Because who's going to pay $350 for a used 290X when you can get a new, faster card for $300? Assuming the rumors are true, that is.

Are people paying $350 for used 290X cards now?


----------



## mogie

Quote:


> Originally Posted by *Forceman*
> 
> Because who's going to pay $350 for a used 290X when you can get a new, faster card for $300? Assuming the rumors are true, that is.
> 
> Are people paying $350 for used 290X cards now?


SIGH ....http://www.aliexpress.com/item/VGA-WATER-BLOCK-A-R290X-X-R9-290X-R290-full-coverage-of-the-public-version-of/32250471224.html?spm=2114.30010308.3.19.7vGLBa&ws_ab_test=searchweb201556_10,searchweb201602_2_10017_10021_507_10022_10020_10009_10008_10018_10019_101,searchweb201603_6&btsid=c12e9243-22c9-4183-951d-69cb83ef02a5

shhhhhhhhhhhhhhhh


----------



## Vellinious

Quote:


> Originally Posted by *Forceman*
> 
> Because who's going to pay $350 for a used 290X when you can get a new, faster card for $300? Assuming the rumors are true, that is.
> 
> Are people paying $350 for used 290X cards now?


Actually, it'd be about $275 for the card, and $75 for the waterblock. lol

Newer AND faster? Seems the latter part of that is still up in the air. IF the new cards have 390 / 390X type performance, this will still be faster. /wink


----------



## TheLAWNOOB

Got a pair of Gigabyte R9 290 Windforce WITH EK Fullcover for $225 Canadian each









Could have got a third one for another $100, but I have no need for it.


----------



## Forceman

Quote:


> Originally Posted by *Vellinious*
> 
> Actually, it'd be about $275 for the card, and $75 for the waterblock. lol
> 
> Newer AND faster? Seems the latter part of that is still up in the air. IF the new cards have 390 / 390X type performance, this will still be faster. /wink


Well, good luck with the sale.


----------



## Newbie2009

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Got a pair of Gigabyte R9 290 Windforce WITH EK Fullcover for $225 Canadian each
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Could have got a third one for another $100, but I have no need for it.


Very nice. Still holding up well these days: max temp of 48°C, and CrossFire is really good now, when the game supports it.


----------



## TheLAWNOOB

Out of the games I play, the only one not supported is NFS 2016. AMD have been working on it for 3 months and still haven't fixed it.


----------



## Darknessrise13

Just got a ref 290 recently and I'm more or less set on keeping the ref cooler. Are there any backplates that are compatible with the reference cooler itself?


----------



## TheLAWNOOB

Why do you need a backplate for ref cooler?


----------



## Darknessrise13

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Why do you need a backplate for ref cooler?


Don't NEED it per se; I want it for aesthetics.


----------



## diggiddi

So do these cards have HEVC/H.265 capabilities?


----------



## TheLAWNOOB

Well I tried to play CSGO.

Disabled Crossfire, and this is what happened.



GG AMD.


----------



## diggiddi

Looks like it was not disabled


----------



## TheLAWNOOB

Yeah, hitting disable doesn't actually disable it.


----------



## diggiddi

It does for me, but currently I just use game profiles in Crimson for CFX


----------



## TheLAWNOOB

What OS? It's a common problem on Win10; not sure if they fixed it then broke it in a beta, or never fixed it at all.


----------



## rdr09

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Well I tried to play CSGO.
> 
> Disabled Crossfire, and this is what happened.
> 
> 
> 
> GG AMD.


Quote:


> Originally Posted by *TheLAWNOOB*
> 
> What OS? It's a common problem on Win10; not sure if they fixed it then broke it in a beta, or never fixed it at all.


Like I said in the other thread, your setup is messed up and needs fixing before you blame other stuff. Did you buy your GPUs used? The core clocks when loaded should be as straight as they can be. If not, the core usage will fluctuate wildly.

What are your settings in AB if you used it at all?


----------



## diggiddi

Win 8.1, Crimson 16.5.2
Here's a SS of Cry 3. Stock CPU can't handle CFX; everything on highest settings except blur (Med) and AA (4xMSAA IIRC) @1080.


Spoiler: Warning: Spoiler!


----------



## TheLAWNOOB

I put +31mv, 1100/1500, didn't disable ULPS.

I'll try disabling that. Will fullscreen windowed disable CF?


----------



## spyshagg

Yes.

Also, disabling ULPS is mandatory. You will encounter many problems with it enabled (for instance, Firestrike locks up instantly on my PC).

Regarding your GPU usage with CF enabled: that seems pretty normal to me, because CF really is game dependent. Some games will show that kind of usage (it basically means the game scales badly). Games that don't have a CF profile will also show that kind of bad usage if you enabled the option "force for all titles even if no profile exists".

Try running Firestrike and compare the usage results. Both GPUs should be >90% all the time. If they are, you are good and the game is the problem.


----------



## TheLAWNOOB

The GPU usage is fine in KF2 and Tomb Raider 2013.

It was fluctuating because I didn't manage to disable CF for CSGO, which is a DX9 game (no frame pacing support for CF).

I'll try running it in fullscreen windowed.


----------



## rdr09

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> The GPU usage is fine in KF2 and Tomb Raider 2013.
> 
> It was fluctuating because I didn't manage to disable CF for CSGO which is a DX9 game (no frame pacing support for CF).
> 
> I'll try running it in fullscreen windowed.


DX9 . . . yah, try windowed mode or disable crossfire the SLI way . . . unplug secondary gpu.


----------



## spyshagg

With Crimson, having frame pacing enabled results in heavy stuttering in some CF games, BF4 for instance. I have to disable it. With 15.11.1 I can have it enabled. Go figure.


----------



## rdr09

Quote:


> Originally Posted by *spyshagg*
> 
> With Crimson, having frame pacing enabled results in heavy stuttering in some CF games, BF4 for instance. I have to disable it. With 15.11.1 I can have it enabled. Go figure.


BF4 MP is quite demanding even with Mantle. Scenes vary too much from light to heavy load, especially when switching weapons. But, even with that . . . the core clocks should stay as straight as they can be. I test mine with Heaven first after updating to every driver to see if they do remain straight. If not, I won't even start my games.



Saw 3750 MB of vram use here per card.


----------



## spyshagg

Mantle provided better minimum framerates but lower peak framerates when I had [email protected]

But after switching to [email protected] I no longer see any benefit with Mantle. DX11 gives me minimum framerates always above 60, and my normal framerate is always between 110 and 155. Mantle struggles to go above 110 at all.

Crossfire choked my computer with 1333MHz RAM.


----------



## Chopper1591

Quote:


> Originally Posted by *Darknessrise13*
> 
> Just got a ref 290 recently and I'm more or less set on keeping the ref cooler. Are there any backplates that are compatible with the reference cooler itself?


CNC machined, anyone?


----------



## TheLAWNOOB

Disabled CF globally, worked great.

I have a CF profile for Tomb Raider, but CF is not enabling.

Turned global CF back on. Now I gotta reinstall the driver because I forgot to reset my overclock.

Also, SolidWorks' viewport freezes with CF. I need SolidWorks; not sure what to do now.

Maybe I'll run it on my Pentium G3258 with integrated graphics.


----------



## Darknessrise13

Quote:


> Originally Posted by *Chopper1591*
> 
> CNC machined, anyone?


Haha that would work but I don't have access to anyone with a CNC.


----------



## diggiddi

Quote:


> Originally Posted by *spyshagg*
> 
> With Crimson, having frame pacing enabled results in heavy stuttering in some CF games, BF4 for instance. I have to disable it. With 15.11.1 I can have it enabled. Go figure.


Quote:


> Originally Posted by *rdr09*
> 
> BF4 MP is quite demanding even with Mantle. Scenes vary too much from light to heavy load, especially when switching weapons. But, even with that . . . the core clocks should stay as straight as they can be. I test mine with Heaven first after updating to every driver to see if they do remain straight. If not, I won't even start my games.
> 
> 
> 
> Saw 3750 MB of vram use here per card.


Quote:


> Originally Posted by *spyshagg*
> 
> Mantle provided me better minimal framerates but lower upper framerates when I had [email protected]
> 
> But after switching to [email protected] I no longer have any benefits with mantle. DX11 provides me minimal framerates always above 60 and my normal framerate is always between 110 and 155. Mantle will struggle to go above 110 at all.
> 
> Crossfire choked my computer with 1333mhz ram.


BF4 loves RAM bandwidth. It is smoother with my stock 8350 and 16GB of 2400MHz RAM than at 4.8GHz with 24GB of 1600MHz.


----------



## TheLAWNOOB

For R9 290 OCed with fullcover block, is 72C under max load for 2 hours in game too hot?

My 1st card runs at 66C, 2nd at 72C, CPU at 75C.

Have 480mm rad with 1300rpm fans, order is: pump>rad>cpu>gpu1>gpu2


----------



## Forceman

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> For R9 290 OCed with fullcover block, is 72C under max load for 2 hours in game too hot?
> 
> My 1st card runs at 66C, 2nd at 72C, CPU at 75C.
> 
> Have 480mm rad with 1300rpm fans, order is: pump>rad>cpu>gpu1>gpu2


Any idea what the water temperature is? That's a safe temp for the card, but seems pretty high for full water cooling. My card gets to 50-55C under long-term gaming, and that feels warm to me.


----------



## TheLAWNOOB

The rad feels warm, so the water is probably about 40C when it exits the rad. I'll measure the radiator temp tomorrow.

I read some reviews of the EK block and they say it has a 21C delta between GPU temp and water temp. Considering the water gets pretty warm by the time it reaches the 2nd GPU, it shouldn't be too bad.

I was only concerned because when I had a GTX 780 OCed with a universal block I saw 50C under max load.


----------



## battleaxe

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> For R9 290 OCed with fullcover block, is 72C under max load for 2 hours in game too hot?
> 
> My 1st card runs at 66C, 2nd at 72C, CPU at 75C.
> 
> Have 480mm rad with 1300rpm fans, order is: pump>rad>cpu>gpu1>gpu2


Seems hot. Mine never exceeds 45C at a 1100MHz OC.

What's your ambient? Likely the culprit.


----------



## TheLAWNOOB

Ambient is around 20-25C.

I might not have enough radiator space or fan speed. According to martinsliquidlab, my 480mm rad can dissipate 500W at a 20C delta with 1300rpm fans.

If the water temperature going to the CPU is 40C, the temp entering the 1st GPU is 45C, and the temp entering the 2nd GPU is 50C, a 72C GPU temp is not far off, since my waterblock has a 21C delta (core temp to water temp).
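That arithmetic can be sanity-checked with a quick back-of-the-envelope script. It only uses numbers quoted in the thread (the martinsliquidlab figure of 500W dissipated at a 20C delta, the 21C EK block delta); the ~650W total load and ~4 L/min flow are assumptions for illustration, not measurements:

```python
# Rough steady-state estimate of GPU core temp in this loop.
# Assumed/quoted figures: 500 W dissipated at 20 C water-to-air delta
# (480 rad, 1300 rpm fans), 21 C EK block core-to-water delta, 20 C ambient.

RAD_W_PER_C = 500 / 20    # ~25 W dissipated per degree C of water/air delta
BLOCK_DELTA_C = 21        # quoted EK full-cover core-to-water delta
AMBIENT_C = 20

def steady_state_core_temp(total_load_w):
    """Water settles where radiator dissipation equals the total heat load."""
    water_delta_c = total_load_w / RAD_W_PER_C
    water_c = AMBIENT_C + water_delta_c
    return water_c + BLOCK_DELTA_C

# e.g. a ~200 W CPU plus two overclocked 290s at ~225 W each:
print(round(steady_state_core_temp(200 + 2 * 225)))  # -> 67

# Note: the per-pass water rise through one 250 W block at ~4 L/min
# (~0.066 kg/s, cp ~4186 J/kg.K) is only 250 / (0.066 * 4186) ~ 0.9 C,
# so the big numbers come from steady-state water temp, not "+5 C per block".
```

Under those assumptions a ~67C core is roughly what the rad capacity predicts, which suggests the loop is rad-limited rather than broken, though the per-pass note above implies the +5C-per-component steps overstate sequential heating.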


----------



## rdr09

Quote:


> Originally Posted by *spyshagg*
> 
> Mantle provided me better minimal framerates but lower upper framerates when I had [email protected]
> 
> But after switching to [email protected] I no longer have any benefits with mantle. DX11 provides me minimal framerates always above 60 and my normal framerate is always between 110 and 155. Mantle will struggle to go above 110 at all.
> 
> Crossfire choked my computer with 1333mhz ram.


Quote:


> Originally Posted by *diggiddi*
> 
> BF4 Loves Ram bandwidth It is smoother with my stock 8350 and 16gb of 2400mhz ram than @4.8ghz and 24gb 1600mhz


I really had to go Mantle when I played BF4. My RAM is only 1600 MHz.

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> For R9 290 OCed with fullcover block, is 72C under max load for 2 hours in game too hot?
> 
> My 1st card runs at 66C, 2nd at 72C, CPU at 75C.
> 
> Have 480mm rad with 1300rpm fans, order is: pump>rad>cpu>gpu1>gpu2


Even when I only had a 120 rad per block (3 blocks), my GPU temps stayed around 60. Got rid of those little 120s and replaced them with a 240/360. Now my temps stay below 60 on all of them. VRMs should be cooler than the core with EK blocks. I don't OC my GPUs during games, though.

Use HWiNFO64 to monitor your temps during a game. *Close the other apps*.



That was with 120 rad per block.


----------



## spyshagg

just played doom for 4 hours straight and the gpu temp hasn't gone above 36ºc @ 1150mhz 75mv

3 x360 radiators with 15 fans

70ºc seems very high for a watercooled card.


----------



## rdr09

Quote:


> Originally Posted by *spyshagg*
> 
> just played doom for 4 hours straight and the gpu temp hasn't gone above 36ºc @ 1150mhz 75mv
> 
> 3 x360 radiators with 15 fans
> 
> 70ºc seems very high for a watercooled card.


Very nice. It really was a hard lesson learned, cheaping out on watercooling parts. Should have gone with bigger rads right off the bat.


----------



## spyshagg

big thick rads because these babies like watts like i do cookies!


----------



## Newbie2009

Yeah, those temps are way too high; refit the block I'd say. I've never seen my cards break 50C with maximum abuse. Mid 40s should be the norm. I have a 360 & 240 rad.


----------



## NaXter24R

After many troubles, I've made it: I've watercooled my 290X.

On that note, I'd like to point out that the XSPC Razor waterblock is compatible with the MSI Gaming 290X OC!
I've read that it has some taller caps, but mine seems to be standard. Moreover, the block covers the GPU, memory and VRM with enough clearance for the rear and front caps as well.
For reference, my PCB model is: V308-005

Here's my WC thread about it: http://www.overclock.net/t/1599044/first-loop-some-advice-needed

Right now I'm running a Swiftech H220 with another XSPC 120mm rad and a 4790K. GPU temps are about 51°C while mining and never went above 60 while gaming (some CPU-intensive ones).
The only drawback was my backplate (the MSI one that came with the card), but I'm planning to use an EK one. It should fit with a slightly longer screw, like 10mm instead of 4mm. I'll let you know about that.

I'll also let you know my max OC; right now I don't have much time to experiment.


----------



## Chopper1591

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> For R9 290 OCed with fullcover block, is 72C under max load for 2 hours in game too hot?
> 
> My 1st card runs at 66C, 2nd at 72C, CPU at 75C.
> 
> Have 480mm rad with 1300rpm fans, order is: pump>rad>cpu>gpu1>gpu2


IMO 72C is pretty hot. Almost a waste to go custom water with those temps.
Are you sure the flow rate is okay? I haven't followed the discussion: is it a new loop? If so, did you clean the rads? I've seen this cause high temps many times.

My R9 290 Tri-X with a 1200MHz OC (+75mv) mostly sits around 45C after prolonged gaming. I have hit 50C with 25C ambient on warm days. Do note that there is an FX-8320 dumping heat heavily into my loop (4.8GHz OC at 1.525v, for around 200W).

You are a bit short on rad space for the parts you cool. Either up that or accept that you need some more fan speed. I like mine low for sure (preferably sub-1000 rpm).
Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Ambient is around 20-25C.
> 
> I might not have enough radiator space or fan speeds. According to martinsliquidlab, my 480mm rad can disappate 500W at 20C delta with 1300rpm fans.
> 
> If the water temperature going to CPU is 40C, temp entering 1st GPU is 45C, and temp entering 2nd GPU is 50C, 72C GPU temp is not far off since my waterblock has 21C delta (core temp to water temp).


Measuring by hand (feel) or calculating from static data is not the way to go, in my opinion. Been there, done that.
When you next clean/take apart your loop, I highly advise you to add one or two temp sensors so you can actually read the water temp.

I bought some fishing displays from eBay (like these). They are basically the same as the water-cooling-brand ones, but a lot cheaper (like €2 compared to €15).
Then I got some in-line temp sensors, these to be precise: Aquatuning inline sensor. Then you can just cut off the 2-pin plug and the sensor probe from the LCD temp meter and solder the wires. Done. Works perfectly.









Do note that most (if not all) pumps are rated for 50C max water temp, so you might want to check that you don't push it too hard.
Quote:


> Originally Posted by *spyshagg*
> 
> just played doom for 4 hours straight and the gpu temp hasn't gone above 36ºc @ 1150mhz 75mv
> 
> 3 x360 radiators with 15 fans
> 
> 70ºc seems very high for a watercooled card.


This.
I am on a 360 UT60 paired with a single 140mm EK rad. The 360 is push/pull, with fans mostly running at 700-1200 rpm depending on my mood (OC intensity) and ambient temp.

With a mild CPU clock (4.4-4.6) and a stock R9 290, my water is mostly around 25-30C with ambient around 21C. This keeps my CPU and GPU below 40C during prolonged gaming.


----------



## spyshagg

Quote:


> Originally Posted by *rdr09*
> 
> Very nice. Really was a hard lesson learned cheaping out in watercooling parts. Should have went with bigger rads right off the bat.


Here's 30 minutes of Doom on afterburner


Ambient temp is 21ºC. Water temp is 22ºC. The radiators barely heat up. Something is very wrong with your cooling!


----------



## NaXter24R

What fans do you have? Are you sure your loop is air free? I had to flip my case upside down to bleed it completely; a massive air bubble was inside one of the rads. I have 55°C on the core with a 4790K and a 290X at +25mv/1120MHz on a 240+120 rad. My fans are 3 Noctua NF-F12s at 700rpm, more or less...


----------



## TheLAWNOOB

At a room temp of 20C, the radiator gets to 26C and the 2nd GPU core goes to 66C.

No air bubbles, flow rate is fine.

I'll monitor the VRM temps with HWiNFO.

I have an infrared thermometer for measuring radiator temps.

Using 1300rpm fans.


----------



## battleaxe

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> At room temp of 20C, radiator gets to 26C, 2nd GPU core goes to 66C.
> 
> No air bubbles, flow rate is fine.
> 
> I'll monitor the VRM temps with HWinfo.
> 
> I have an infrared thermometer for measuring radiator temps.
> 
> Using 1300rpm fans


Sum ting wong.

Try tipping the PC while it is running. Could be a bubble in the block, I've had them in there before too.


----------



## kizwan

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> At room temp of 20C, radiator gets to 26C, 2nd GPU core goes to 66C.
> 
> No air bubbles, flow rate is fine.
> 
> I'll monitor the VRM temps with HWinfo.
> 
> I have an infrared thermometer for measuring radiator temps.
> 
> Using 1300rpm fans


Define "flow rate is fine"? What is your flow rate? That's way too hot in that ambient & with that water temp. You likely need to re-seat the block.


----------



## Chopper1591

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> At room temp of 20C, radiator gets to 26C, 2nd GPU core goes to 66C.
> 
> No air bubbles, flow rate is fine.
> 
> I'll monitor the VRM temps with HWinfo.
> 
> I have an infrared thermometer for measuring radiator temps.
> 
> Using 1300rpm fans


Hmm. Measuring rad temp is not ideal, but it should give you OK info.

How do you know the flow rate is fine? Looking at the tubing?








Something is definitely not right. Your temp is not okay if your water is around a 6C delta-T (which is also unlikely, btw).
Quote:


> Originally Posted by *kizwan*
> 
> Define "flow rate is fine"? What is your flow rate? That's way too hot in that ambient & with that water temp. You likely need to re-seat the block.


This x10.

Even if his water is 26C at 20C ambient (which is unlikely with a CPU and 2 GPUs on a 480 rad with 1300rpm fans), a 40C rise on the GPU (66C core vs 26C water) is not okay by any means.

My money is also on block contact and/or flow restriction (gunk in the blocks? Been there).

For comparison purposes:
Current temps. Just stopped a game after ~2 hours of running. Water temp at the time of the screenshot: 27.9C.

Though this is idle after gaming, it shows roughly 7C over water temp on both the CPU and GPU (highlighted).

Edit:
To give more insight I just did ~20 minutes of Unigine Heaven. Stock GPU clock (1000) with the 4.8 CPU clock.
Water temp was around 31C. Ambient, I think, around 22-23C.

As you can see, the GPU core is 23C above water temp and the VRMs are just 14C (I love my Fujipoly VRM thermal pads).


----------



## NaXter24R

Quote:


> Originally Posted by *Chopper1591*
> 
> As you can see, the GPU core is 23C above water temp and the VRMs are just 14C (I love my Fujipoly VRM thermal pads).


Dude, those VRMs

Little OT: I'm trying to get an EK backplate for my Razor block (or an Aquacomputer one, cheaper, but I don't know if it fits; read my old post, I have an MSI Gaming). Is that pad so different from the others? I mean, when I push my card my core temp is 55°C, which is OK since I'm running a 240+120mm rad with the CPU as well, and my VRM2 varies from 60 to 70ish °C under heavy load. My 120 rad is in the front of the case (450D) with an NF-F12 pulling air into the case and over the card (passive cooling, sort of...). Without that fan, with only the case fans, my temps rise to 78/80°C on VRM1. On the back of the card there are 6 mosfets, I think. Those need some cooling.


----------



## TheLAWNOOB

The hotter GPU's VRMs go up to 100C...

It's within safe limits but a little too hot for water cooling.


----------



## NaXter24R

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> The hotter GPU's VRMs go up to 100C...
> 
> It's within safe limits but a little too hot for water cooling.


VRM2 should be the VRAM one, and it's usually the cooler one... Maybe the thermal pad. I'd suggest you reseat the block. If you have a drain port you can do that in 1h or less.


----------



## TheLAWNOOB

Actually there are 2 sets of VRM temps for each card, and I circled the bottom one. The one on top has the VRM temps flipped.

It seems like each GPU raises the water temp by 10C.


----------



## NaXter24R

Yep, I saw that. In theory, if it's like GPU-Z (and it should be), VRM1 is the core one (VDDC) and VRM2 is the VRAM one (VDDCI). If you change core voltage you are dealing with VRM1, and if you change aux voltage you are dealing with VRM2.

That said, VRM2 should sit at 1v, period; VRM1 fluctuates according to the core clock. Moreover, since VRM2 is the memory one, it is, or should be, the coolest one. Usually it's not even cooled on air-cooled cards, or only passively with most brands. So, since it's that hot, I'm wondering if it's a bad block seat. If you're sure there are no bubbles, the rads are OK and everything else is OK, it's probably a bad seat; it's the only thing left. That, or a bad thermal pad.


----------



## TheLAWNOOB

Thanks, I'll reseat it if I get another rad for the loop.

I'm thinking about getting a 240 or 360 rad and putting it after the first GPU to lower the temps on the second GPU.


----------



## NaXter24R

In theory loop order is not relevant. Just put the res before the pump and you should be OK. Once the fluid starts to move, temps will equalize after a while. Yes, you can see some difference, but it should be minor.
For example, my loop is 240 rad > pump/CPU block > 120 rad > GPU > 240 rad. Since I have an H220, my CPU waterblock has a pump on it and my 240 rad has a built-in res. I put the rad between the GPU and CPU for routing convenience. Moreover, that way I have a quick-disconnect fitting that acts as a bleed port, just in case. Temps wouldn't have changed, at least not in the long term, e.g. in a 1h gaming scenario.

If you reseat the block (I highly suggest that), look at the thermal pads: they should have the VRM shape imprinted in them; if not, they weren't seated correctly. Also, when you put the thermal paste on the GPU, use the cross method or an asterisk. The GPU doesn't have a heatspreader like a CPU; the shiny thing is all die, all GPU, all transistors, so it should be covered with thermal paste. If you remove the block and it's not, clean it and apply more paste.
Less is more, yeah, but just cover the whole die (maybe that's the issue, you never know).


----------



## TheLAWNOOB

I just bought these GPUs used with the waterblocks already on them, so I'm not sure how well the TIM is applied.

I found a cheap 360 rad; not sure if the seller is willing to ship.

The temps are OK, but disappointing considering it's a full-cover water loop.


----------



## NaXter24R

Maybe it's just a bad thermal compound spread. My 290X was used too, and something weird happened: the thermal compound dried out during shipment. In fact that card (MSI Gaming) thermal throttled like a boss. Really, 60ish when idle and 95 in less than 15 seconds under any 3D load.
I took the cooler apart and, surprise, dry thermal compound. I replaced it with some MX-2 and boom, 40 idle, 77 under load.


----------



## mus1mus

Before buying a rad, check the TIM first by reseating the block.

Some sellers have the habit of not checking and reseating the blocks prior to selling. So you should do that on your end.

And lastly, check your rad too. Cleaning it will do wonders.


----------



## TheLAWNOOB

Would something like the EK Parallel Terminal help reduce my temps?

https://www.ekwb.com/shop/ek-fc-terminal-dual-parallel-3-slot-plexi

Currently I'm using a no-name Chinese pump at medium speed. I figured a parallel connector would reduce restriction and increase flow rate. It would also feed both GPUs with the same water temperature, which should help reduce the temps on the second card.

Right now the second card is 6-10C hotter than the first one.
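For what it's worth, a back-of-the-envelope comparison of series vs parallel blocks (the ~225W per card and ~4 L/min figures below are assumptions, not measurements from this loop) suggests the water-side difference between the two layouts is well under a degree per pass, so a 6-10C gap between cards points more toward block contact or gunk than pre-heated water:

```python
# Per-pass water temperature rise across two GPU blocks, series vs parallel.
# Assumed figures: ~225 W per card, ~4 L/min total flow, water cp ~4186 J/kg.K.

CP_J_PER_KG_K = 4186       # specific heat of water
TOTAL_FLOW_KG_S = 4 / 60   # 4 L/min of water is roughly 4 kg/min

def rise_c(power_w, flow_kg_s):
    """Temperature rise of the water crossing one block."""
    return power_w / (flow_kg_s * CP_J_PER_KG_K)

# Series: GPU2's inlet is pre-heated by GPU1, but at full flow.
series_preheat = rise_c(225, TOTAL_FLOW_KG_S)

# Parallel: both cards share one inlet temp, but each sees half the flow,
# so the rise *within* each block doubles.
parallel_rise = rise_c(225, TOTAL_FLOW_KG_S / 2)

print(f"series pre-heat: {series_preheat:.2f} C")   # ~0.81 C
print(f"parallel rise:   {parallel_rise:.2f} C")    # ~1.61 C
```

Either way the per-pass numbers are tiny compared to the observed gap, which matches the later advice in the thread that the bridge won't change much once temps equalize.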


----------



## Vellinious

Buy a better pump....get a D5 or a DDC. Problem solved.


----------



## NaXter24R

Never skimp on the pump. It's not about China; it's just better to spend a bit more on the most important part of the loop. My old H220 pump died on me (well, it didn't start, luckily), so I had to replace it. But what if it had stopped working while rendering a video, for example? OK, the PC shuts off immediately, and that's fine, but my 4790K was idling at 80°C after the boot.

p.s. yes, I'm a lucky man


----------



## diggiddi

Speaking of pumps, what do you all think of this? Just for a single GPU:

http://www.xs-pc.com/water-pumps/ion-pumpreservoir

http://thermalbench.com/2016/03/03/xspc-ion-pumpreservoir/

Comes out in red and white too
https://www.facebook.com/XSPC-186277998125590/


----------



## Chopper1591

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Would something like the EK Parallel Terminal help reduce my temps?
> 
> https://www.ekwb.com/shop/ek-fc-terminal-dual-parallel-3-slot-plexi
> 
> Currently I'm using a no name brand chinese pump at medium speed, I figured a parallel connector will reduce restriction and increase flow rate. It would also feed both GPUs with the same water temperature, which should help reduce the temps on the second card.
> 
> Right now the second card is 6-10C hotter than first one.


I'll ask again:
Is the loop new? If so, did you clean/flush the radiator(s)? Gunk can be a cause of high temps.

Besides that, it is most likely either poor contact on the GPU and/or gunk in the GPU block.


----------



## TheLAWNOOB

That pump looks nice, but it seems like the water level might be hard to tell.

I increased the speed of my pump to 100%. It now runs at 5000rpm instead of 3000, and temps are pretty much the same.

I'll have to open the block, since the previous owner used Mayhems dye and it might have clogged up the block. Or maybe he did a terrible job of applying thermal paste.

The radiator is fine, zero dust build-up.


----------



## Chopper1591

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> That pump looks nice but seem like the water level might be hard to tell.
> 
> I increased the speed of my pump to 100%. It now runs at 5000rpm instead of 3000, and temps pretty much the same.
> 
> I'll have to open the block since the previous owner used Mayhem dye, and it might clogged up the block. Or maybe he did a terrible job of applying thermal paste.
> 
> The radiator is fine, zero dust build up.


Uhmm, I mean the inside of the radiator, actually.
But I'm curious to hear the results when you take the block off and inspect the inside.


----------



## NaXter24R

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> That pump looks nice but seem like the water level might be hard to tell.
> 
> I increased the speed of my pump to 100%. It now runs at 5000rpm instead of 3000, and temps pretty much the same.
> 
> I'll have to open the block since the previous owner used Mayhem dye, and it might clogged up the block. Or maybe he did a terrible job of applying thermal paste.
> 
> The radiator is fine, zero dust build up.


Just to show you, this was my old H220 pump. Look at what I found inside:

http://imgur.com/nxWS5

It was a stock H220, so nothing fancy at all. Still, that gunk killed my pump. The first time it just stopped working; the second time it was pretty much dead. It's still functional, but it's closer to death than ever...

It turned out to be plasticizer, straight from the tubing, so if the old owner had tubing like I had, most likely there is a lot of gunk in the wb.


----------



## TheLAWNOOB

That looks pretty bad. I'll buy the parallel bridge and open up the GPU blocks. I might get another 480 rad too, but my Corsair 240 already looks ridiculous with a single 480 rad.


----------



## Chopper1591

Quote:


> Originally Posted by *NaXter24R*
> 
> Just to show you, this was my old h220 pump. Look at what i've found inside,
> 
> 
> http://imgur.com/nxWS5
> 
> 
> 
> 
> It was a stock h220, so nothing fancy at all. Still, that gunk killed my pump. the first time it stopped working, the second time was pretty much dead. It's still functional but it's closer to death than ever...
> 
> It turned out to be plasticizer, straight from the tubing, so if the old owner had tubing like I had, most likely there is a lot of gunk in the wb.


Wow. I've seen my share of gunked blocks but this is one of the most severe.
Quote:


> Originally Posted by *TheLAWNOOB*
> 
> That looks pretty bad. I'll buy the parallel bridge and open up the GPU blocks. I might get another 480 rad too but my Corsair 240 is already looking ridiculous with a single 480 rad.


I would really inspect the blocks first. I hardly think the parallel bridge will help. As said earlier, loop order makes little to no difference; water temp will settle after a while no matter the order.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Chopper1591*
> 
> Wow. I've seen my share of gunked blocks but this is one of the most severe.
> I would really inspect the blocks first. I hardly think the parallel bridge will help. Like said earlier loop order makes little to no difference. Water temp will settle after a while no matter the order.


This is both plasticizer build-up and the dye from the coolant breaking down... I've seen about 5 or 6 like that in the Swiftech thread. They changed the coolant and tubing so that doesn't happen now, but it's pretty grody for sure.


----------



## NaXter24R

Yeah, I got lucky after all. After a block cleaning and some trouble, Swiftech was kind enough to send me a spare pump, without even contacting the store that sold me that AiO. Bryan was amazing.
He told me about plasticizer. The very first version of the H220 had a lot of it, and my old H220 was one of those. Some tubing has lots of plasticizer in it, so try to avoid those.
I'm trying some Masterkleer right now. I know, they're ugly and more yellowish than clear, but I got 2m for €3 I think, so I'm OK with that.
Another piece of advice: put a few drops of anti-algae in your loop. I got the Mayhems one, but right now I can't tell if it's worth it or not since my loop is less than a month old. Last but not least, I'm a huge fan of nature: water, distilled water, is the best. Cheap and functional, with no additional unknown things inside.

Ah, almost forgot: that bridge, do as you wish, but it won't change much. In the short term, yes, it will help and you'll see a degree or so of difference, but if you use your PC for a while, after the temps equalize it will be the same as before, so save some money.
Same goes for the rad. Try without it first.
As I told you, I'm a bit low on space in my case and I have a 240+120mm rad; think of it like a 360. Now, in this config, while gaming some CPU-intensive game like GTA V, I can run my [email protected]/53°C and my [email protected]/55°C. Fans at 50/60% since they are PWM, and the pump at 40% or less, like 1700/1800rpm (min is 1300 and max is 3000 for this pump).

So, if I can do that with a pretty minimal radiator setup, you surely can at least match it with a 480.


----------



## csgofanatic

Not sure if I should make a separate topic for this issue. I love the 290 I got from my cousin, who previously used it for bitcoin mining. It still runs fine (for now), but there's a bit of an issue with my dual monitor setup. I didn't have this problem when I had Nvidia cards installed.

I've set up my computer so that my monitors go to sleep after ten minutes of idleness (also so that my Razer keyboard and its LEDs stop glowing). I notice that any applications I had open on my 2nd monitor (Asus VE228) have been moved to my primary monitor (BenQ 2411Z). It's a pain to constantly drag TeamSpeak back onto my second monitor when I AFK for a bit.

The 2nd monitor is connected through HDMI, and the BenQ monitor through DVI-DL. I noticed that when I plugged my DVI-DL cable into the 2nd DVI-D port the 290 provides, the issue was gone, but my games felt choppy and laggy (CSGO in particular), so I switched back to the top DVI-D port. Any workarounds for this issue?

I have tried re-installing the AMD drivers, but to no avail.

Here are my current specs:

Asus Z87-K
Intel i7 4770S
MSI R9 290 Reference
16 GB of RAM (Corsair, 1600MHz)
1TB HDD (OS) + 500 GB HDD (both WD)
Windows 8.1
Current AMD drivers are 16.3.2. I tried updating to the hotfix but my screen would constantly flicker when watching videos.


----------



## kizwan

Quote:


> Originally Posted by *csgofanatic*
> 
> Not sure if I should make a separate topic for this issue. Love the 290 I got from my cousin who previously used it for bitcoin mining. Still runs fine (for now) but there's a bit of an issue with my dual monitor setup. Didn't have this problem when I had a Nvidia cards installed on my computer.
> 
> I've set up my computer so that my monitors go to sleep after ten minutes of idleness (also so that my Razer keyboard and its LEDs stop glowing). I notice that any applications I had open on my 2nd monitor (Asus VE228) have been moved to my primary monitor (BenQ 2411Z). It's a pain to constantly drag TeamSpeak back onto my second monitor when I AFK for a bit.
> 
> The 2nd monitor is connected through HDMI, and the BenQ monitor through DVI-DL. I noticed that when I plugged my DVI-DL cable into the 2nd DVI-D port the 290 provides, the issue was gone, but my games felt choppy and laggy (CSGO in particular), so I switched back to the top DVI-D port. Any workarounds for this issue?
> 
> I have tried re-installing AMD drivers, but no avail.
> 
> Here are my current specs:
> 
> Asus Z-87-K
> Intel i7 4770S
> MSI R290 Reference
> 16 GB of RAM (Corsair, 1600mhz)
> 1TB HDD (OS) + 500 GB HDD (both WD)
> Windows 8.1
> Current AMD drivers are the 16.3.2 drivers. Tried updating to the hotfix but my screen would constantly flicker when watching videos.


I've had my first monitor connected to the top DVI port (DVI-to-HDMI cable) & my second monitor connected to HDMI, but had no problem like yours. Currently my first monitor is connected to HDMI while the second monitor is connected to the top DVI port. No problem either. Your card probably has a bad DVI port.


----------



## NaXter24R

If this can help you, I have 2 monitors as well, the first one connected via DisplayPort, the other via DVI-D.
When they go to sleep, nothing happens, but if instead I turn off one of those monitors, everything goes to the first one.
It's something related to digital connections such as HDMI or DisplayPort; in fact, whenever I turn off (off, not sleep or power-saving mode) my second monitor (DVI), everything moves to the 1st one.

Now, which R9 290 do you have? It has 2 DVI-D ports, so they are the same, or should be.

I used to have your same layout with my old monitors, DVI with the 1st and HDMI with the 2nd, and moreover 2 different resolutions: no issue whatsoever, and this with a 7770, 280x and 290x.
Also, try to disable ULPS; sometimes it might help.
Could you also tell me what the VRAM clock is when idle with 2 monitors?
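For anyone reading along who isn't sure what "disable ULPS" means in practice: tools like Afterburner and TriXX commonly do it by flipping the `EnableUlps` registry value under the Windows display-adapter class key. A sketch of the equivalent .reg fragment follows; the `0000` instance number is an assumption for illustration, so check each 00xx subkey that actually contains an `EnableUlps` value on your own system, then reboot.

```reg
Windows Registry Editor Version 5.00

; Hypothetical example: disables ULPS for display-adapter instance 0000.
; Repeat for every 00xx subkey under this class key that has an EnableUlps
; value. Setting it back to dword:00000001 re-enables ULPS.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Class\{4d36e968-e325-11ce-bfc1-08002be10318}\0000]
"EnableUlps"=dword:00000000
```

If you use Afterburner anyway, ticking its ULPS option does the same thing without touching regedit by hand.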


----------



## TheLAWNOOB

I tried playing other games. Both KF2 and GTA V only get my GPUs to the mid 40s to 50s. Only Rise of the Tomb Raider pushes them that hard.


----------



## Chopper1591

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> I tried playing other games. Both KF2 and GTA V only get my GPUs to the mid 40s to 50s. Only Rise of the Tomb Raider pushes them that hard.


Bottleneck?


----------



## NaXter24R

That's strange... We need a baseline. Run some Heaven or Valley; they use the GPU only, so the CPU won't make any difference at all. Don't use Furmark, since it's a waste of electricity and not a good bench, just a VRM killer.
Also, what settings do you use for GTA V? It's pretty demanding, but maybe you're running 1080p with v-sync at 60fps, so both cards are not flat out.
Rise of the Tomb Raider, on the other hand, is an Nvidia title, so we can say that, technically speaking, it's garbage; still, that cannot justify a huge temperature difference.


----------



## csgofanatic

Quote:


> Originally Posted by *NaXter24R*
> 
> If this can help you, I have 2 monitors as well, the first one connected via DisplayPort, the other via DVI-D.
> When they go to sleep, nothing happens, but if instead I turn off one of those monitors, everything goes to the first one.
> It's something related to digital connections such as HDMI or DisplayPort; in fact, whenever I turn off (off, not sleep or power-saving mode) my second monitor (DVI), everything moves to the 1st one.
> 
> Now, which R9 290 do you have? It has 2 DVI-D ports, so they are the same, or should be.
> 
> I used to have your same layout with my old monitors, DVI with the 1st and HDMI with the 2nd, and moreover 2 different resolutions: no issue whatsoever, and this with a 7770, 280x and 290x.
> Also, try to disable ULPS; sometimes it might help.
> Could you also tell me what the VRAM clock is when idle with 2 monitors?


That sounds about right. If I turn off my DVI-D-connected monitor, everything is transferred to my HDMI-connected monitor. If I simply let the monitor sleep, nothing happens though. I'll disable ULPS and see what happens from there.

I have the MSI 290 4GD5, which is the reference variant.

Even at idle, the memory clock is at 1250MHz. See the screenshot below.


----------



## NaXter24R

Quote:


> Originally Posted by *csgofanatic*
> 
> That sounds about right. If I turn off my DVI-D-connected monitor, everything is transferred to my HDMI-connected monitor. If I simply let the monitor sleep, nothing happens though. I'll disable ULPS and see what happens from there.
> 
> I have the MSI 290 4GD5, which is the reference variant.
> 
> Even at idle, the memory clock is at 1250MHz. See the screenshot below.


Both DVI ports are the same; they are all-digital on the reference card. Maybe there is some custom card with an analog one, but I can't remember right now.
The memory clock is OK; when multiple monitors are connected, the clock does not scale down. I've experienced it myself with a 7770, 280x and 290x; there is nothing you can do. It's like 2°C more at idle, nothing to worry about. You could use a custom BIOS to fix that, but I've read that Nvidia scales the clock down, and most of the time when you resume the PC after the screens turn black, you have to reset the PC since the power draw is too low. It's like a safety feature: use a bit more power to be sure of resuming Windows after the screen's sleep.


----------



## csgofanatic

Disabling ULPS seems to have fixed the issue. I left Teamspeak on my second monitor to test, and when the monitors went to sleep, the Teamspeak window stayed where it belonged. Thanks for your help, guys.

EDIT: I lied. It still happens.

I read on another forum that the ATi drivers are to blame when it comes to HDMI/DisplayPort: when the monitor goes to sleep, AMD thinks the monitor is disconnected, and as a result the applications you have open end up moving.


----------



## NaXter24R

Quote:


> Originally Posted by *csgofanatic*
> 
> Disabling ULPS seems to have fixed the issue. I left Teamspeak on my second monitor to test, and when the monitors went to sleep, the Teamspeak window stayed where it belonged. Thanks for your help, guys.
> 
> EDIT: I lied. It still happens.
> 
> I read on another forum that the ATi drivers are to blame when it comes to HDMI/DisplayPort: when the monitor goes to sleep, AMD thinks the monitor is disconnected, and as a result the applications you have open end up moving.


Glad to be helpful


----------



## TheLAWNOOB

I ran all games capped at 60 fps, but Rise of the Tomb Raider seems to be consistently using 90-100% of my GPUs.

I can't max out GTA V because of my VRAM, and it gets to 90% usage every once in a while.

KF2 uses 90-100%, but it's nowhere near as hot as Tomb Raider.


----------



## Paul17041993

Quote:


> Originally Posted by *Chopper1591*
> 
> (I love my Fujipoly VRM thermal pads.)


Definitely getting some of those, as the EK pads are not great.

I will also note that the GPU-water delta on my overclocked 290X is about 20C too; however, I'm also 20C higher on average, as I tuned my fans for silence. My VRMs are definitely problematic, though, as they get quite hot under load, or at least the primary ones do; I had to back out of 1325mV as they ended up at 120C...
Quote:


> Originally Posted by *csgofanatic*
> 
> Not sure if I should make a separate topic for this issue. Love the 290 I got from my cousin, who previously used it for bitcoin mining. It still runs fine (for now), but there's a bit of an issue with my dual-monitor setup. I didn't have this problem when I had an Nvidia card installed in my computer.
> 
> I've set up my computer so that my monitor goes to sleep after ten minutes of idleness (also so that my Razer keyboard and its LED stop glowing). I notice that any applications I have open on my 2nd monitor (Asus VE 228) get moved to my primary monitor (BenQ 2411Z). It's a pain to constantly drag Teamspeak back onto my second monitor after I AFK for a bit.
> 
> The 2nd monitor is connected through HDMI, and the BenQ monitor through DVI-DL. I noticed that when I plugged my DVI-DL into the 2nd DVI-D port the 290 provides, the issue was gone, but my games felt choppy and laggy (CS:GO in particular), so I switched back to the top DVI-D port. Any workarounds for this issue?
> 
> I have tried re-installing the AMD drivers, but to no avail.
> 
> Here are my current specs:
> 
> Asus Z-87-K
> Intel i7 4770S
> MSI R290 Reference
> 16 GB of RAM (Corsair, 1600MHz)
> 1TB HDD (OS) + 500 GB HDD (both WD)
> Windows 8.1
> Current AMD drivers are 16.3.2. I tried updating to the hotfix, but my screen would constantly flicker when watching videos.


Quote:


> Originally Posted by *NaXter24R*
> 
> If this can help you, I have 2 monitors as well, the first one connected via DisplayPort, the other via DVI-D.
> When they go to sleep, nothing happens, but if instead I turn off one of those monitors, everything goes to the first one.
> It's something related to digital connections such as HDMI or DisplayPort; in fact, whenever I turn off (off, not sleep or power-saving mode) my second monitor (DVI), everything moves to the 1st one.
> 
> Now, which R9 290 do you have? It has 2 DVI-D ports, so they are the same, or should be.
> 
> I used to have your same layout with my old monitors, DVI with the 1st and HDMI with the 2nd, and moreover 2 different resolutions: no issue whatsoever, and this with a 7770, 280x and 290x.
> Also, try to disable ULPS; sometimes it might help.
> Could you also tell me what the VRAM clock is when idle with 2 monitors?


Both HDMI and DP are PnP power-active: if you turn the monitor off, it will completely disconnect itself from the GPU, leaving the GPU thinking it was unplugged. It's an annoying flaw that, for the most part, you're stuck with unless you use DVI only.

There's a trick with HDMI, though, that you could try: with either a special adapter or a strip of tape you can disable the data pin that handles this and effectively disable the flawed behaviour.
It's mentioned in this TechNet topic, about 75% of the way down:
https://social.technet.microsoft.com/Forums/windows/en-US/8a9b5aa7-fe33-4e6d-b39b-8ac80a21fdc2/disable-monitor-off-detection-how?forum=w7itprogeneral


----------



## NaXter24R

Quote:


> Originally Posted by *csgofanatic*
> 
> EDIT: I lied. Still happens.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I read on a forum elsewhere that the ATi drivers are to blame when it comes to HDMI/Displayport - because when the monitor goes to sleep, AMD thinks the monitor is disconnected and as a result, the applications you have opened end up moving.


Have you tried older or newer drivers? If so, same thing? Just for the sake of it, try to invert the cables, so the DVI on the HDMI monitor and vice versa.
Quote:


> Originally Posted by *Paul17041993*
> 
> Both HDMI and DP are PnP power-active: if you turn the monitor off, it will completely disconnect itself from the GPU, leaving the GPU thinking it was unplugged. It's an annoying flaw that, for the most part, you're stuck with unless you use DVI only.
> 
> There's a trick with HDMI, though, that you could try: with either a special adapter or a strip of tape you can disable the data pin that handles this and effectively disable the flawed behaviour.
> It's mentioned in this TechNet topic, about 75% of the way down:
> https://social.technet.microsoft.com/Forums/windows/en-US/8a9b5aa7-fe33-4e6d-b39b-8ac80a21fdc2/disable-monitor-off-detection-how?forum=w7itprogeneral


I know that; in fact it happens when you turn the monitor off, but when it goes into power-saving mode it shouldn't do that. At least, I've never experienced it. The only way to replicate that situation is to turn off the monitor, like pushing the button.


----------



## csgofanatic

My only question is, if that is the case, how come it didn't happen with my Nvidia card (ASUS 960 Strix)? Sounds like this problem happens with the 200 series? I'll ask my friend who owns a 390 how he manages his dual-monitor setup.

Not really in a position to invert my primary monitor either - I need my 144Hz.

I will try different cables, but I'm a bit attached to HDMI, as the monitor acts as a set of speakers without me needing to constantly plug my headset in and out. I don't like using my headset all the time when browsing the internet. Will post an update.

Found the thread that confirms my issue: http://forums.guru3d.com/showthread.php?t=391428


----------



## Chopper1591

Quote:


> Originally Posted by *Paul17041993*
> 
> Definitely getting some of those, as the EK pads are not great.
> 
> I will also note that the GPU-water delta on my overclocked 290X is about 20C too; however, I'm also 20C higher on average, as I tuned my fans for silence. My VRMs are definitely problematic, though, as they get quite hot under load, or at least the primary ones do; I had to back out of 1325mV as they ended up at 120C


Are you sure the contact is okay?
Which GPU block do you have?

120 degrees is insane. Even for air that would have been high; for water it's just absurd.
Even when I clock to 1250MHz with ~1.4v, my VRMs stay around 50-55C depending on ambient temp.

Sure, my thermal pads are nice, but the difference shouldn't be that big. Keep in mind, though, that I have the ultra extreme version of the pads. I had to get them from the USA (I'm from the Netherlands). For just the VRM it was around 20 dollars, if I remember correctly. The "normal" extreme Fujipoly pads are on the VRAM modules.


----------



## Paul17041993

Quote:


> Originally Posted by *Chopper1591*
> 
> Are you sure the contact is okay?
> Which GPU block do you have?
> 
> 120 degrees is insane. Even for air that would have been high; for water it's just absurd.
> Even when I clock to 1250MHz with ~1.4v, my VRMs stay around 50-55C depending on ambient temp.
> 
> Sure, my thermal pads are nice, but the difference shouldn't be that big. Keep in mind, though, that I have the ultra extreme version of the pads. I had to get them from the USA (I'm from the Netherlands). For just the VRM it was around 20 dollars, if I remember correctly. The "normal" extreme Fujipoly pads are on the VRAM modules.


Atm the VRMs just have two layers of the stock EK .25 pads, as my block didn't come with the proper .5 pads; the block is EK's revision 2 290X nickel acetal.

Found the industrial model of the pads, though:
http://www.fujipoly.sg/index.php?id=46
SARCON XR-m, 17 W/m-K

Not sure whether I should get .5 or 1mm thickness, though; I suppose 1mm should compress adequately without issues...


----------



## rdr09

A bit off topic: Will there be 490/490X or just Volta?


----------



## Chopper1591

Quote:


> Originally Posted by *Paul17041993*
> 
> Atm the VRMs just have two layers of the stock EK .25 pads, as my block didn't come with the proper .5 pads; the block is EK's revision 2 290X nickel acetal.
> 
> Found the industrial model of the pads, though:
> http://www.fujipoly.sg/index.php?id=46
> SARCON XR-m, 17 W/m-K
> 
> Not sure whether I should get .5 or 1mm thickness, though; I suppose 1mm should compress adequately without issues...


Never stack thermal pads. Never!

And obviously go with .5mm, as that is what EK recommends. You should improve a lot with any pad. Right now you are insulating more than conducting the heat. Replace that stuff ASAP.


----------



## Paul17041993

Quote:


> Originally Posted by *rdr09*
> 
> A bit off topic: Will there be 490/490X or just Volta?


The 490/X will likely be separate from Volta and released some time around Q3-Q4; the reason being that HBM2 is still expensive and early, and GDDR5X somewhat negates the point of HBM1 now.
Quote:


> Originally Posted by *Chopper1591*
> 
> Never stack thermal pads. Never!
> 
> And obviously go with .5mm, as that is what EK recommends. You should improve a lot with any pad. Right now you are insulating more than conducting the heat. Replace that stuff ASAP.


Yeah, it was just a quick hack that seems to be holding up decently for 1300mV, 1100/1700, while going into winter. Will definitely try to get better pads, especially considering I'm planning on doing a rebuild to AM4 at some point soon, depending on what motherboards arise.


----------



## rdr09

Quote:


> Originally Posted by *Paul17041993*
> 
> The 490/X will likely be separate from Volta and released some time around Q3-Q4; the reason being that HBM2 is still expensive and early, and GDDR5X somewhat negates the point of HBM1 now.


That's what I thought and was hoping for. Thanks.


----------



## TheLAWNOOB

Gave Need for Speed 2016 another shot.

Newest driver. Tried the official profile, AFR, and default. 4K, high settings. ULPS disabled. +40% power limit, 1100/1500 on both cards.



Spoiler: Warning: Spoiler!


----------



## TheReciever

That's how Afterburner displays my 290X. I don't use it anymore.


----------



## TheLAWNOOB

How it displays it doesn't matter.

It's getting zero scaling, and according to this review it should have 60-80% scaling at 4K.

http://nl.hardware.info/reviews/6658/6/need-for-speed-2016-review-getest-met-22-gpus-slicrossfire-schaling


----------



## TheReciever

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> How it displays it doesn't matter.
> 
> It's getting zero scaling, and according to this review it should have 60-80% scaling at 4K.
> 
> http://nl.hardware.info/reviews/6658/6/need-for-speed-2016-review-getest-met-22-gpus-slicrossfire-schaling


confirmed with a different tool?


----------



## kizwan

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Gave Need for Speed 2016 another shot.
> 
> Newest driver. Tried the official profile, AFR, and default. 4K, high settings. ULPS disabled. +40% power limit, 1100/1500 on both cards.
> 
> 
> 
> Spoiler: Warning: Spoiler!


- Frametimes are too high
- High CPU usage

Are you sure you're playing at 4K? lol, I'm just kidding. All I know is the game is still problematic in CrossFire.


----------



## TheLAWNOOB

Quote:


> Originally Posted by *TheReciever*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheLAWNOOB*
> 
> How it displays it doesn't matter.
> 
> It's getting zero scaling, and according to this review it should have 60-80% scaling at 4K.
> 
> http://nl.hardware.info/reviews/6658/6/need-for-speed-2016-review-getest-met-22-gpus-slicrossfire-schaling
> 
> 
> 
> confirmed with a different tool?

There's no need to confirm it with another tool.

The game's framerate/frametimes are brutal on my eyes.


----------



## TheReciever

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> There's no need to confirm it with another tool.
> 
> The game's framerate/frametimes are brutal on my eyes.


...K

then good luck to you


----------



## TheLAWNOOB

Fixed the high VRM temps.

I pointed a 1000rpm 80mm fan at the back of the GPU, and the VRMs now only top out at 90C.

Didn't know I'd still need airflow with full-cover blocks.


----------



## Paul17041993

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Fixed the high VRM temps.
> 
> I pointed a 1000rpm 80mm fan at the back of the GPU, and the VRMs now only top out at 90C.
> 
> Didn't know I'd still need airflow with full-cover blocks.


You shouldn't; it means the block doesn't have adequate contact with the VRMs to cool them effectively, and/or the pads in use can only handle low power.


----------



## rdr09

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Fixed the high VRM temps.
> 
> I pointed a 1000rpm 80mm fan at the back of the GPU, and the VRMs now only top out at 90C.
> 
> Didn't know I'd still need airflow with full-cover blocks.


No, you don't, as Paul said. With EK, the VRMs should be cooler than the core, even OC'ed. Poor contact. Even with the lack of rad space... your temps should not go that high with a full block.

EDIT: Your CPU was maxing out. WTH. Check what resources other than the game are causing that. My physics score in FS was about 1000 pts lower and I saw my cores alternately maxing out; it was Windows Update causing it on my rig. I turned that thing off. lol

*Wait, it's an i5. Is it even OC'ed?*


----------



## mus1mus

Stock EK pads are crap for the VRM, but they shouldn't be that bad. They can go higher than the core when pushed really hard, but it might be worth looking at contact and/or whether the pads were placed on the correct spots. We've seen this a lot: people placing the pads on the chokes rather than the VRMs.


----------



## Chopper1591

Quote:


> Originally Posted by *mus1mus*
> 
> Stock EK pads are crap for the VRM, but they shouldn't be that bad. They can go higher than the core when pushed really hard, but it might be worth looking at contact and/or whether the pads were placed on the correct spots. We've seen this a lot: people placing the pads on the chokes rather than the VRMs.


He knows what to do. It's been said multiple times.

If you scroll back a bit, you'll see that he actually stacked the thermal pads.


----------



## mus1mus

Quote:


> Originally Posted by *Chopper1591*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Stock EK pads are crap for the VRM, but they shouldn't be that bad. They can go higher than the core when pushed really hard, but it might be worth looking at contact and/or whether the pads were placed on the correct spots. We've seen this a lot: people placing the pads on the chokes rather than the VRMs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> He knows what to do. It's been said multiple times.
> 
> If you scroll back a bit, you'll see that he actually *stacked* the thermal pads.

Do you reckon *that* may be causing his high VRM temps?

FYI,

The stock VRM thermal pad that comes with reference cards works damn well on my end. It's hard to pull it off the reference cooler intact, so extra care is needed.


----------



## kizwan

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Chopper1591*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Stock EK pads are crap for the VRM, but they shouldn't be that bad. They can go higher than the core when pushed really hard, but it might be worth looking at contact and/or whether the pads were placed on the correct spots. We've seen this a lot: people placing the pads on the chokes rather than the VRMs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> He knows what to do. It's been said multiple times.
> 
> If you scroll back a bit, you'll see that he actually *stacked* the thermal pads.
> 
> 
> Do you reckon *that* may be causing his high VRM temps?
> 
> FYI,
> 
> The stock VRM thermal pad that comes with reference cards works damn well on my end. It's hard to pull it off the reference cooler intact, so extra care is needed.

Nah, I had stacked pads before I got my Fuji pads, & the highest I got with VRM1 was in the 60s Celsius at 30C ambient. VRM temps actually didn't get worse compared to the non-stacked pads before. 90C is more like a bad-contact issue than crappy or stacked pads, or even trapped air bubbles.


----------



## mus1mus

Quote:


> Originally Posted by *kizwan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Chopper1591*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Stock EK pads are crap for the VRM, but they shouldn't be that bad. They can go higher than the core when pushed really hard, but it might be worth looking at contact and/or whether the pads were placed on the correct spots. We've seen this a lot: *people placing the pads on the chokes rather than the VRMs*.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> He knows what to do. It's been said multiple times.
> 
> If you scroll back a bit, you'll see that he actually stacked the thermal pads.
> 
> 
> Do you reckon *that* may be causing his high VRM temps?
> 
> FYI,
> 
> The stock VRM thermal pad that comes with reference cards works damn well on my end. It's hard to pull it off the reference cooler intact, so extra care is needed.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Nah, I had stacked pads before I got my Fuji pads, & the highest I got with VRM1 was in the 60s Celsius at 30C ambient. VRM temps actually didn't get worse compared to the non-stacked pads before. 90C is more like a bad-contact issue than crappy or stacked pads, or even trapped air bubbles.

I concur, hence the bolded original statement.


----------



## TheLAWNOOB

The i5 is OC'ed to 4.8GHz. It's only maxed out during loading screens. I assume the stacked-thermal-pad comment is directed at someone else, since I never said I stacked thermal pads.

The parallel bridge now comes next Tuesday instead of tomorrow, because of Canada Post. I'll take everything apart then and have a look.

Even if the bridge doesn't help temps, it should look a bit better. Starting to regret not ordering that EK XE 360 rad with my bridge.


----------



## Chopper1591

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> The i5 is OC'ed to 4.8GHz. It's only maxed out during loading screens. I assume the stacked-thermal-pad comment is directed at someone else, since I never said I stacked thermal pads.
> 
> The parallel bridge now comes next Tuesday instead of tomorrow, because of Canada Post. I'll take everything apart then and have a look.
> 
> Even if the bridge doesn't help temps, it should look a bit better. Starting to regret not ordering that EK XE 360 rad with my bridge.


Hmm... My bad then. I thought you posted about the pads.

Anyway, I am curious to see the inside of your GPU blocks and pictures of the compressed (or non-compressed) thermal pads on your VRMs.


----------



## Paul17041993

Stock'n'stacked EK pads are working OK for me with 1300mV and usual loads: about the same 30-60C temp range (15C if you count cold-start). But once the voltage is bumped a little higher and heavily stressed, the pads simply can't conduct the heat fast enough and the temp keeps rising. They're probably only rated for 5W/mK or something...

Edit: oh, though I also have the backplate, so that may make a significant difference...


----------



## TheLAWNOOB

Ok, problem solved.



#Dying


----------



## mus1mus

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Ok, problem solved.
> 
> 
> 
> #Dying


#dayum!
#noobjob!

Now set things up correctly and you'll enjoy the card a whole lot more.


----------



## TheLAWNOOB

And then I found out why the core temp was so high...


----------



## rdr09

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Ok, problem solved.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> #Dying


I was wrong. It ain't poor contact but no contact.

If it's the reference design, make sure to use 1mm pads for the VRMs and 0.5mm for the VRAM.

If no additional rad, maybe push/pull will help.

EDIT: check out post #21896

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781


----------



## kizwan

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Ok, problem solved.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> #Dying


#ProblemNotSolved

#PadOnTheWrongPlace

#EpicFail

I'm going to tweet this, guys.


----------



## TheLAWNOOB

Thanks for the link.

And now it gets better. No wonder one of the GPUs ran 10C hotter...

Looks like he barbecued some chicken meat or something.


----------



## battleaxe

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Thanks for the link.
> 
> And now it gets better. No wonder one of the GPUs ran 10C hotter...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Looks like he barbecued some chicken meat or something.


What in the HEY is going on over there? LOL

WTH, man?

LOL


----------



## mus1mus

Let the cleaning commence!

Grab some lemonade!


----------



## mfknjadagr8

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Thanks for the link.
> 
> And now it gets better. No wonder one of the GPUs ran 10C hotter...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Looks like he barbecued some chicken meat or something.


That mess probably hurt flow through the loop pretty badly as well.


----------



## Paul17041993

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Thanks for the link.
> 
> And now it gets better. No wonder one of the GPUs ran 10C hotter...
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Looks like he barbecued some chicken meat or something.


What even IS that... looks to be a bit more than just dye and tube gunk...


----------



## mus1mus

Must be some non-watercooling dye used to give the liquid some color.


----------



## toyz72

Some quick questions for you guys: what software do you use for OC'ing an R9 290? I've been out of the loop for a while now and would like to OC it a bit. I have MSI Afterburner, but I only use it to monitor temps and usage. Things I have no clue on would be...

Max voltage?
What's more important (core or memory)?
Max temp?

I only have stock cooling on my card, and I'm moving back into my Node 304. I know not to expect a lot on the temp side of things, so if you could help me with software, and what software you use to test overclocks... it would really help me out. Sorry for being the new guy.


----------



## Chopper1591

Quote:


> Originally Posted by *mus1mus*
> 
> #dayum!
> #noobjob!
> 
> Now take things correctly and you'll enjoy the card a whole further


Haha, you were right after all.
Nice start to my day. I like epic fails.

I don't get how people manage to do that. "Get another hobby" is my standard advice on these kinds of things.
Quote:


> Originally Posted by *TheLAWNOOB*
> 
> And then I found out why the core temp was so high...
> 
> 
> Spoiler: Warning: Spoiler!


Like I said a few times, you could have saved the cash you spent on the parallel block.

Gunk is a common problem. The dude you bought your stuff from probably didn't clean his rads. But still, that is some serious gunk. It looks sticky.

Anyway, your temps overall should drop after cleaning, including the CPU.

Pro tip:
Scrub those blocks with ketchup and a soft toothbrush.


----------



## Paul17041993

Quote:


> Originally Posted by *toyz72*
> 
> Some quick questions for you guys: what software do you use for OC'ing an R9 290? I've been out of the loop for a while now and would like to OC it a bit. I have MSI Afterburner, but I only use it to monitor temps and usage. Things I have no clue on would be...
> 
> Max voltage?
> What's more important (core or memory)?
> Max temp?
> 
> I only have stock cooling on my card, and I'm moving back into my Node 304. I know not to expect a lot on the temp side of things, so if you could help me with software, and what software you use to test overclocks... it would really help me out. Sorry for being the new guy.


I use Sapphire TriXX to test what the card can do; however, before you do anything, *disable CCC/Radeon Settings from starting at boot*, as having it enabled will force the clocks every time Windows starts, but without the voltage, and in turn leave the system in a constant hang-loop.

After I know what the card can do, I modify the BIOS with the clocks and voltage and re-install the drivers. If it needs tuning afterwards, I re-mod the BIOS and reset the overdrive settings in Radeon Settings; provided the card is stable enough to run the system, you don't need to re-install.

If the system gets into the mentioned hang-loop, i.e. it starts booting but then locks just after the Windows logo, you'll have to force Windows into safe mode and disable CCC/Radeon Settings from starting from there.
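If you can't even reach the desktop to hit msconfig, one way into safe mode is forcing it from the boot configuration beforehand. A sketch using the stock Windows `bcdedit` tool (run from an elevated command prompt; the commands below are standard, but treat the exact entry identifier as something to verify on your own system):

```
:: Force the next boots into minimal safe mode
bcdedit /set {current} safeboot minimal

:: ...from safe mode, disable the CCC/Radeon Settings startup entry
:: (msconfig > Startup, or Task Manager > Startup on Win 8.1+), then undo:
bcdedit /deletevalue {current} safeboot
```

Remember to remove the `safeboot` value afterwards, or the machine will keep booting into safe mode every time.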


----------



## TheLAWNOOB

Quick Fire Strike run with no fans pointing at the GPUs.

The VRMs ran up to 115C with 2 fans before; now they sit at 78C at 1.3V.



And after cheating a little bit in the drivers (tessellation off, texture filtering set to Performance):


----------



## LandonAaron

Quote:


> Originally Posted by *toyz72*
> 
> Some quick questions for you guys: what software do you use for OC'ing an R9 290? I've been out of the loop for a while now and would like to OC it a bit. I have MSI Afterburner, but I only use it to monitor temps and usage. Things I have no clue on would be...
> 
> Max voltage?
> What's more important (core or memory)?
> Max temp?
> 
> I only have stock cooling on my card, and I'm moving back into my Node 304. I know not to expect a lot on the temp side of things, so if you could help me with software, and what software you use to test overclocks... it would really help me out. Sorry for being the new guy.


Quote:


> Originally Posted by *Paul17041993*
> 
> I use Sapphire TriXX to test what the card can do; however, before you do anything, *disable CCC/Radeon Settings from starting at boot*, as having it enabled will force the clocks every time Windows starts, but without the voltage, and in turn leave the system in a constant hang-loop.
> 
> After I know what the card can do, I modify the BIOS with the clocks and voltage and re-install the drivers. If it needs tuning afterwards, I re-mod the BIOS and reset the overdrive settings in Radeon Settings; provided the card is stable enough to run the system, you don't need to re-install.
> 
> If the system gets into the mentioned hang-loop, i.e. it starts booting but then locks just after the Windows logo, you'll have to force Windows into safe mode and disable CCC/Radeon Settings from starting from there.


I like to use MSI Afterburner for overclocking my cards, but it maxes out at +100mV, where TriXX can go up to +200mV. However, I used to get black screens on my cards whenever I went over +100mV, so no need for TriXX for me. I haven't tried going past +100mV in a long time, though; the black screen issue may be resolved on newer drivers.

Core is king. If you want to swing for the fences right off the bat, I would try +200mV, +50% power limit, 1200MHz core clock, and 1500MHz memory. This is what I used to run when I had a 1080p monitor. Now though, on my 1440p monitor, I can't go over +100mV without a black screen, so my OC has been reduced to 1100/1400. Just be sure you disable CCC/Radeon Settings from starting at boot like Paul said.

As for testing your overclock, I think 3DMark Vantage is best. It's kind of slow, but if it can get through the entire benchmark, especially the "demo" portion, then it's usually a stable OC in my experience. Unigine Valley is another option, but in my experience Valley doesn't do a good job of testing stability: I can run Valley all day with an OC that would crash after about 20 minutes of gaming or 3DMark Vantage.


----------



## Paul17041993

Quote:


> Originally Posted by *LandonAaron*
> 
> However I use to get black screens on my cards whenever I went over +100mV


This is generally due to the TDP limits forced in the BIOS and/or the card itself; I believe it's 375W total for the card, which can be fairly easy to hit once you go over +100mV. Cards will vary, though, in what they can achieve at a given total power draw, especially the 8GB variants, which may use more power for the VRAM ICs. Alternatively, certain motherboards or PSU noise may cause black screens under heavy load.
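As a back-of-the-envelope check on why an extra ~100mV can blow through a 375W cap (a sketch with assumed baseline figures, not measurements of any particular card), dynamic power scales roughly with frequency times voltage squared:

```python
# Back-of-the-envelope dynamic power scaling, P ~ f * V^2.
# The 250 W @ 1000 MHz / 1.20 V baseline below is an assumed illustrative
# figure, not a measurement of any particular 290/290X.
def scale_power(p_base, f_base, v_base, f_new, v_new):
    """Estimate power at a new clock/voltage from the f * V^2 approximation."""
    return p_base * (f_new / f_base) * (v_new / v_base) ** 2

est = scale_power(250.0, 1000.0, 1.20, 1200.0, 1.40)
print(round(est))  # ~408 W, comfortably past a 375 W board limit
```

Leakage and VRM losses make the real curve even steeper, so this is an optimistic lower bound.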


----------



## TheLAWNOOB

Well, +200mV is a lot. It's about 1.4V or maybe more.

I only gained 5% in 3DMark going from 1100 to 1200 core. IMO a 5% gain is not worth the extra 100mV.
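That 5%-for-100MHz observation can be sanity-checked with a quick ratio (illustrative arithmetic only):

```python
# Illustrative arithmetic: what fraction of the 1100 -> 1200 MHz clock
# increase actually showed up as benchmark score.
def scaling_efficiency(clk_old, clk_new, score_gain):
    """score_gain is fractional, e.g. 0.05 for a 5% higher score."""
    clock_gain = clk_new / clk_old - 1.0
    return score_gain / clock_gain

eff = scaling_efficiency(1100, 1200, 0.05)
print(round(eff, 2))  # 0.55: only about half the extra clock becomes score
```

With barely half the clock gain turning into score, the last 100MHz (and the voltage it costs) is poor value outside of benching.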


----------



## battleaxe

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Well, +200mV is a lot. It's about 1.4V or maybe more.
> 
> I only gained 5% in 3DMark going from 1100 to 1200 core. IMO a 5% gain is not worth the extra 100mV.


But when you're benching, it's those last few points that mean so much.

Gaming? Yeah agreed, don't bother.

Just refinished the thermal paste on my 290x. Ahhhh.... less heat is always a good thing.


----------



## rdr09

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Quick Fire Strike run with no fans pointing at GPUs.
> 
> VRM ran into 115C with 2 fans before, now it sits at 78C at 1.3V.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> And after cheating a little bit in drivers (tessellation off, filter in performance)


Man, that cpu is fast. It gets lower physics score than my i7 Sandy at 4.5 but the combined score tells a different story. prolly the ipc combined with DDR4 makes up for the lack of HT.

EDIT: BTW, your graphics score at 1200 is kinda low. Here is an old run with Tess on . . .

http://www.3dmark.com/3dm/6302964

But look, your combined and overall score are much higher. But, 2 290s at 1200 oc should be getting 24K in Graphics. Prolly higher with newer driver.

This is stock . . .

http://www.3dmark.com/3dm/10351872?

Quite simply an unstable run.


----------



## Paul17041993

Quote:


> Originally Posted by *rdr09*
> 
> Man, that cpu is fast. It gets lower physics score than my i7 Sandy at 4.5 but the combined score tells a different story. prolly the ipc combined with DDR4 makes up for the lack of HT.


Or the lack of HT makes a difference, have you tried with and without it to see if the scores differ?

Other than that there would be very minor arch differences that could make the difference, DDR4 and cache probably affect draw calls a fair bit.


----------



## toyz72

Quote:


> Originally Posted by *Paul17041993*
> 
> I use Sapphire TriXX to test what the card can do. However, before you do anything, *disable CCC/Radeon Settings from starting at boot*, as having it enabled will force the clocks every time Windows starts, but without the voltage, and in turn leave the system in a constant hang-loop.
> 
> After I know what the card can do, I modify the BIOS with the clocks and voltage and re-install the drivers. If it needs tuning afterwards, I re-mod the BIOS and reset the Overdrive settings in Radeon Settings; provided the card is stable enough to run the system, you don't need to re-install.
> 
> If the system gets into the mentioned hang-loop (i.e. it starts booting but then locks just after the Windows logo), you'll have to force Windows into safe mode and disable CCC/Radeon Settings from starting from there.


thx for the heads up on CCC... I was worried about system locks and restarts. I'm not tech enough to fiddle with the BIOS on video cards yet, but maybe after some time playing around. I must say I don't have a lot of faith in Win 10, not sure how it will act when things go wrong, but I'll figure it out... thx again


----------



## Chopper1591

Quote:


> Originally Posted by *toyz72*
> 
> thx for the heads up on ccc....i was worried about system locks and restarts. im not tech enough to fiddle with bios on vcards yet,but maybe after some time playing around. i must say i dont have a lot of faith in win 10 .not sure how it will act to things going wrong,but ill figure it out...thx again


I don't know if your card is dual BIOS?
If not, do extensive research before BIOS flashing. I've seen cards bricked due to improper flashing.


----------



## rdr09

Quote:


> Originally Posted by *Paul17041993*
> 
> Or the lack of HT makes a difference, have you tried with and without it to see if the scores differ?
> 
> Other than that there would be very minor arch differences that could make the difference, DDR4 and cache probably affect draw calls a fair bit.


Or it could be my Win7. My combined score on Sandy and X99 is the same with 2 290s: 6K.

I just noticed, though, that TheLawn's graphics score is short by about 2K at stock. My bad. It was an OC'ed run of 1200/1500. With tess on, it should be around 24K. Must be an unstable run.


----------



## mus1mus

Yep. 1200 may be a little too much to ask for 1.3 VDDC.


----------



## TheLAWNOOB

1200 is a 5% gain from 1100 at 1.23V.

I'll try stock clock tonight. Do I need to apply a CF profile or leave it at default?

Also,








But they only used reference cards


----------



## rdr09

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> 1200 is a 5% gain from 1100 at 1.23V.
> 
> I'll try stock clock tonight. Do I need to apply a CF profile or leave it at default?
> 
> Also,
> 
> 
> 
> 
> 
> 
> 
> 
> But they only used reference cards


The chart is prolly showing Overall score. Here is 1100 oc on my 290s using 16.3 driver . . .

http://www.3dmark.com/3dm/12195069?

I wish i could run 1200 but it is kinda scary with an 850w psu.









I used to be able to oc them both to 1290 with a 1300w psu.

EDIT: yah, that's overall score cause a stock 290 graphics score is 11K.


----------



## TheLAWNOOB

Did you apply any settings to 3D mark? Any profiles?

Should I delete my profile and try to run it as is?


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> Or it could be my Win7. My combined score in Sandy and X99 are the same with 2 290s - 6K.
> 
> I just noticed, though, that TheLawn's graphics score is short by about 2K at stock. My bad. It was an oc'ed run of 1200/1500. With tess on, it should be around 24K. Must be unstable run.


I use high clocks for fun and sometimes for heavy games. Most of the time I have my 290 at 1100 or even stock (1000).

I'm pretty sure mine is at or over 1.4V with +200mV. Does run 1275 core though.

But yeah, +200 is probably not worth it for the power usage. hwinfo shows it uses around 350-375 watts.

edit:
woops. wrong quote here


----------



## rdr09

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Did you apply any settings to 3D mark? Any profiles?
> 
> Should I delete my profile and try to run it as is?


Nah, I was running a single 290, then just plugged in the other 290 and booted. Crimson auto-set crossfire. After that, I ran the bench. No profiles.

You can delete it. It is not needed. Just reset the default.
Quote:


> Originally Posted by *Chopper1591*
> 
> I use high clocks for fun and sometimes for heavy games. Most of the time I have my 290 at 1100 or even stock (1000).
> 
> Im pretty sure mine is at or over 1.4v with +200mv. Does run 1275 core though.
> 
> But yeah, +200 is probably not worth it for the power usage. hwinfo shows it uses around 350-375 watts.
> 
> edit:
> woops. wrong quote here


I did see 1.41v at +200.


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> The chart is prolly showing Overall score. Here is 1100 oc on my 290s using 16.3 driver . . .
> 
> http://www.3dmark.com/3dm/12195069?
> 
> I wish i could run 1200 but it is kinda scary with an 850w psu.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I used to be able to oc them both to 1290 with a 1300w psu.
> 
> EDIT: yah, that's overall score cause a stock 290 graphics score is 11K.


I pushed 2x 290s to 1275MHz on an EVGA G2 850W, not a single shutdown: http://www.3dmark.com/fs/8242288 (this run was with tess off and a custom BIOS, etc.)


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> I pushed 2x290 to 1275 mhz on a Evga g2 850w, not a single shutdown. http://www.3dmark.com/fs/8242288 , this run was with tess off and custom bios etc..


It's my UPS that i am really concerned about. Plus, no need to stress my 290s cause looks like Polaris are not worth a replacement. Waiting for Vega or whatever they call it. Nice run.

Man, that graphics score. That's with tweaked bios, right? Must have used less VDDC too.


----------



## TheLAWNOOB

Maybe my cards are physically damaged thanks to the idiot who ran it for years in 4way CF not knowing the VRM has no thermal pad over it.


----------



## Paul17041993

Quote:


> Originally Posted by *Chopper1591*
> 
> I don't know if your card is dual bios?
> if not, do extensive research before bios flashing. I've seen bricked cards due to improper flashing.


I'd be very concerned if it only had the one BIOS; AMD basically made dual BIOS a fixed standard for at least their mid-to-high-end cards, particularly because it allows both multiple profiles and BIOS updates without the risk of bricking (unless you're silly enough to flash a bad BIOS to both...).
Quote:


> Originally Posted by *rdr09*
> 
> It's my UPS that i am really concerned about. Plus, no need to stress my 290s cause looks like Polaris are not worth a replacement. Waiting for Vega or whatever they call it. Nice run.
> 
> Man, that graphics score. That's with tweaked bios, right? Must have used less VDDC too.


The first line of Polaris is mid-range, similar to the 7970/280X in specs; it will come close to the 290X/390X but will likely lack the raw pixel-crunching power.


----------



## rdr09

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Maybe my cards are physically damaged thanks to the idiot who ran it for years in 4way CF not knowing the VRM has no thermal pad over it.


Do a run at stock.


----------



## Paul17041993

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Maybe my cards are physically damaged thanks to the idiot who ran it for years in 4way CF not knowing the VRM has no thermal pad over it.


I wonder if they even knew how to read a manual...


----------



## Chopper1591

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Maybe my cards are physically damaged thanks to the idiot who ran it for years in 4way CF not knowing the VRM has no thermal pad over it.


Who knows. But I doubt it.

Do note that a small-to-no gain can be a sign of instability.
I can share some 3DMark runs if you want. Tell me what to run.


----------



## TheLAWNOOB

I'll run at stock tonight.
Quote:


> Originally Posted by *Chopper1591*
> 
> Who knows. But I doubt it.
> 
> Do note that small to no gain can be a sign of instability.
> I will share some 3dmark runs if you want? Tell me what to run.


Do you have 2 290s? I already have results from a single ASUS R9 290 (gave that to a friend).

Nevertheless, thanks for the gesture.


----------



## mus1mus

ULPS OFF and Performance setting on Crimson. Have you looked at them?

It can also be due to the clocks not being sync'ed.

Oops! My bad! Your clocks were sync'ed.


----------



## TheLAWNOOB

ULPS is off, I'll try the performance setting.


----------



## kizwan

Mine submitted long time ago. All default, no tweak, old driver.

http://www.3dmark.com/fs/2825429


----------



## TheLAWNOOB

Thanks


----------



## Chopper1591

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> I'll run at stock tonight.
> Do you have 2 290s? I already have results from a single ASUS R9 290 (gave that to a friend).
> 
> Nevertheless, thanks for the gesture.


Na, sadly not. Single runner here.

But I take it you were doubting your performance, or am I mistaken?
I can share the difference with various clocks.

Ah.... whatever. I want to know it for myself again. Haha
I like benching.


----------



## LandonAaron

I don't think Lawnoob's score is off. Here is my Firestrike results:


----------



## TheLAWNOOB

I got a 24000 graphics score AFTER I turned off tessellation and anisotropic filtering.

2 hours before I get home








Quote:


> Originally Posted by *Chopper1591*
> 
> Na, sadly not. Single runner here.
> 
> But I take it you were doubting your performance, or do I mistake?
> I can share the difference with various clocks.
> 
> Ah.... whatever. I want to know it for myself again. Haha
> I like benching.


I hate benching. I only have the 3DMark demo, so I have to sit through 5 minutes of the demo before the actual benchmark starts. A waste of my time.


----------



## TheLAWNOOB

Spoiler: Freesync on



No profile, graphics set to performance.


----------



## TheLAWNOOB

Spoiler: Freesync off



Ok it's Freesync messing it up.

Stock, clean driver



Graphics score higher than at 1200MHz, but combined score a lot lower

Edit: Afterburner does not affect it. It dropped the score just a little because the monitoring requires some CPU time.

Conclusion: Freesync sucks, I wasted my money.


----------



## mus1mus

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> 2 hours before I get home
> 
> 
> 
> 
> 
> 
> 
> 
> I hate benching. Only have the 3DMark demo so I have to sit thru 5 minutes of the demo before actual benchmark starts. Waste of my time.


You can try opening an app before you fire up Fire Strike. Any app will do, but I have been doing this with TriXX.

I have TriXX in the background to adjust the clocks before firing up Fire Strike. Adjust the power limit to max or +50, but don't hit Apply yet.

Fire up Fire Strike; it has a loading window before it occupies the rest of the screen. The time between the loading window appearing and the demo going full screen is your window to hit Apply in TriXX. It will skip the demo, but you lose the valid score.


----------



## TheLAWNOOB

Thanks for the "hack" but I "acquired" the full version a few hours ago









It seems if I leave FreeSync on it caps the framerate to 88fps, even though the FreeSync limit for my monitor is 62Hz.

Still don't understand why I get a higher combined score with a capped framerate. The combined test runs around 40fps without FreeSync, but magically runs better with FreeSync.


----------



## rdr09

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Thanks for the "hack" but I "acquired" the full version a few hours ago
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It seems if I leave freesync on it caps framerate to 88fps, even though the freesync limit for my monitor is 62hz.
> 
> Still don't understand why I get a higher combined score with capped framerate. The combined runs around 40fps without freesync, but magically run better with freesync.


Just be thankful it is not the cards that are messed up.







No need for freesync running Firestrike.

I'd much rather have a higher combined than graphics score, tbh.


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *TheLAWNOOB*
> 
> Thanks for the "hack" but I "acquired" the full version a few hours ago
> 
> 
> 
> 
> 
> 
> 
> 
> 
> It seems if I leave freesync on it caps framerate to 88fps, even though the freesync limit for my monitor is 62hz.
> 
> Still don't understand why I get a higher combined score with capped framerate. The combined runs around 40fps without freesync, but magically run better with freesync.
> 
> 
> 
> Just be thankful it is not the cards that's messed up.
> 
> 
> 
> 
> 
> 
> 
> No need for freesync running Firestrike.
> 
> *I'd much rather have a higher combined than graphics score*, tbh.

Very true. Combined pulls the score higher than Graphics. And yep, if it correlates to gaming, that's exactly what you will be getting and should always aim for.


----------



## catbeef

Hey there, something weird has happened and I can only imagine it is the graphics card, but the whats and whys are perplexing me.

To begin, I have a Sapphire R9 290 Tri-X. The other week I got a Kraken G10 and a Corsair H55 and watercooled the thing to reduce noise, blah blah. My card runs at 1100/1400 with +19mV and has done so fine for a long time. I used the Gelid VRM heatsinks and put some heatsinks on my VRAM chips just to be safe. Temps are great: VRM 1 and 2 do not go beyond ~68C, and the GPU sits at around ~65C while gaming. Much better than the near-90 it was pushing with the Tri-X hair-dryer.

So now it is a week later, no problems; everything has been smooth sailing. I decide to play "The Division" for the first time in... well, a long time. I played it for a week when it came out and got bored with the lack of end-game content. Now they had just deployed patch 1.2, which, as anyone who has played it will know, was a major disaster. All kinds of awful UI bugs, graphical bugs. Bad times. And then it froze my PC. No crash to desktop, a proper freeze. I had to push the reset button.

Ever since then I get this happening every now and again:



~~big update~~

Seems this only happens when I have both monitors connected to my card; each one on its own is fine. But this never happened before. Any reason why having two monitors connected would cause this flickering? The little one is pretty old now, but it appears to work on its own okay? *shrug*

Monitor 1: QNIX 2710 Evolution II, 2560x1440 - DVI > DVI
Monitor 2: ASUS VH-something-from-2005 22", 1680x1050 - HDMI > DVI

Any ideas?

Edit: new DVI cable, same effect. Having both monitors DVI > DVI, same effect. Very confusing.


----------



## TheLAWNOOB

Wipe driver with DDU and reinstall it?


----------



## rdr09

For those with dual hawaii cards looking for a true single card solution . . . looks like the 1080 Ti will be it.

http://www.3dmark.com/3dm/12195069

http://www.3dmark.com/3dm/12204180

The Ti should have a larger memory bus like the 980 Ti. If not, then Vega, 'cause I want to stay AMD.


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> For those with dual hawaii cards looking for a true single card solution . . . looks like the 1080 Ti will be it.
> 
> http://www.3dmark.com/3dm/12195069
> 
> http://www.3dmark.com/3dm/12204180
> 
> The Ti should have a larger memory bus like the 980 Ti. If not, the Vega 'cause i want to stay AMD.


Vega.

Especially since I just bought a 1440p 144hz freesync panel


----------



## TheLAWNOOB

When's Vega coming out, realistically?


----------



## spyshagg

My next card must hit 28k graphics score at least

My single 290x hits 14150 24/7.
My dual hits 28k 24/7


----------



## kizwan

26k graphics score at 1100/1500 with latest driver. No tweak as usual. I'm using 390 modded ROM but right now with stock memory timings. 623 errors on the Hynix card.

http://www.3dmark.com/3dm/12216103?

stock timings vs. 1250 timings
http://www.3dmark.com/compare/fs/8608427/fs/7998487


----------



## catbeef

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Wipe driver with DDU and reinstall it?


Edited too much of my original post away, including my troubleshooting so far:

removing drivers and reinstalling various versions [16.4.2, 16.5.1, 16.5.3]
new cables
removing all overclocks from CPU, GPU, RAM, monitor
underclocking GPU, VRAM
reseating everything in my PC
one stick of RAM at a time
probably more things I have forgotten. But so far the only thing getting rid of the flicker is running one monitor at a time.

I have been running these two monitors together on this card for 2 and a half years now with no problem, and on a 7870 XT before that with no problem. I don't know if the game freezing my PC was related at all, or just a weird coincidence. But it is just really confusing and there has to be a reason. Even if it is unfixable, I'd still really like to know why this suddenly happened.


----------



## Paul17041993

Quote:


> Originally Posted by *catbeef*
> 
> hey there, something weird has happened and i can only imagine it is the graphics card, but the whats and whys are perplexing me.
> 
> to begin, i have a sapphire r9 290 tri-x. the other week i got a kraken g10 and a corsair h55 and watercooled the thing to reduce noise blah blah. my card runs at 1100 / 1400 with +19mV and has done so fine for a long time. i used the gelid vrm heatsinks and put some heatsinks on my vram chips just to be safe. temps are great, vrm 1 and 2 do not go beyond 68~, gpu sits at around 65~ while gaming. much better than the near 90 it was pushing with the tri-x hair-dryer.
> 
> so now it is a week later, no problems. everything has been smooth sailing. i decide to play "the division" for the first time in.. well a long time. i played it for a week when it came out and got bored with the lack of end-game content. now they had just deployed patch 1.2, which as anyone who has played it will know.. was a major disaster. all kinds of awful ui bugs, graphical bugs. bad times. and then it froze my pc. no crash to desktop, a proper freeze. i had to push the reset button.
> 
> ever since then i get this happening every now and again:
> 
> 
> 
> 
> ~~big update~~
> 
> seems this only happens when i have both monitors connected to my card, each one on their own are fine. but this never happened before. any reason why having two monitors connected would cause this flickering? little one is pretty old now, but it appears to work on its own okay? *shrug*
> 
> monitor 1: qnix 2710 evolution ii, 2560x1440 - DVI > DVI
> monitor 2: asus vhsomething-from-2005 22", 1680x1050 - HDMI > DVI
> 
> any ideas?
> 
> edit: new dvi cable same effect. having both monitors dvi > dvi same effect. very confusing


The memory controller is unstable; it occurs with dual monitors because the memory is forced to run at max clock. Try giving the core more voltage.


----------



## spyshagg

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Whens Vega coming out, realistically?


1st quarter of 2017 is realistic


----------



## TheLAWNOOB

Can't wait for a 200W GPU that can replace my CF R9 290s.

I can't game for a few hours straight because my rig heats up my room too much. Room temperature went from 22C to 27C in 4 hours.
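For a rough sense of the numbers here (assumed wattages, since essentially every watt a PC draws ends up as heat in the room):

```python
# Essentially every watt a PC draws ends up as heat in the room, so heat
# dumped is just power * time. The ~650 W full-load draw for a CF 290 rig
# (and ~275 W per card) are assumed illustrative figures, not measurements.
def heat_kwh(watts, hours):
    """Heat released into the room, in kilowatt-hours."""
    return watts * hours / 1000.0

cf_rig = heat_kwh(650, 4)                          # two 290s + rest of the system
single_gpu_rig = heat_kwh(650 - 2 * 275 + 200, 4)  # one hypothetical 200 W card
print(cf_rig, single_gpu_rig)  # 2.6 vs 1.2 kWh, roughly half the heat
```

2.6 kWh over a session is in space-heater territory, which matches a small room warming by several degrees.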


----------



## spyshagg

Quote:


> Originally Posted by *kizwan*
> 
> 26k graphics score at 1100/1500 with latest driver. No tweak as usual. I'm using 390 modded ROM but right now with stock memory timings. 623 errors on the Hynix card.
> 
> http://www.3dmark.com/3dm/12216103?
> 
> stock timings vs. 1250 timings
> http://www.3dmark.com/compare/fs/8608427/fs/7998487


Nice

Here is my dual 24/7
http://www.3dmark.com/fs/8398187

My single. (very happy. Almost same result as stock fury)
http://www.3dmark.com/fs/8591492

Why are you using stock timings?


----------



## spyshagg

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Can't wait for a 200W GPU that can replace my CF R9 290s.
> 
> I can't game for a few hours straight because my rig heats up my room too much. Room temperature went from 22C to 27C in 4 hours.


I understand completely







that is why my water tubes go to a separate room where the radiators are. Zero noise and zero extra heat


----------



## Paul17041993

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Can't wait for a 200W GPU that can replace my CF R9 290s.
> 
> I can't game for a few hours straight because my rig heats up my room too much. Room temperature went from 22C to 27C in 4 hours.


To me, that's quite cold, I'd prefer 30C...


----------



## Forceman

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Can't wait for a 200W GPU that can replace my CF R9 290s.
> 
> I can't game for a few hours straight because my rig heats up my room too much. Room temperature went from 22C to 27C in 4 hours.


I doubt Vega will be 200W if 1080 is 180W and P10 is 150W. More likely 250W, I'd guess.


----------



## SpykeZ

Quote:


> Originally Posted by *Forceman*
> 
> Because who's going to pay $350 for a used 290X when you can get a new, faster card for $300? Assuming the rumors are true, that is.
> 
> Are people paying $350 for used 290X cards now?


I paid 250 for mine.


----------



## catbeef

Quote:


> Originally Posted by *Paul17041993*
> 
> Memory controller is unstable, occurs with dual-monitor as the memory is forced to run at max clock, try giving the core more voltage.


still occurring all the way up to the max of +200mV that Trixx allows








i appreciate the suggestions though


----------



## ZealotKi11er

Quote:


> Originally Posted by *spyshagg*
> 
> My next card must hit 28k graphics score at least
> 
> My single 290x hits 14150 24/7.
> My dual hits 28k 24/7


It has to hit at least 30K for me. I want more than just 2x 290X performance out of a single card.


----------



## SpykeZ

Quote:


> Originally Posted by *catbeef*
> 
> still occurring all the way up to the max of +200mV that Trixx allows
> 
> 
> 
> 
> 
> 
> 
> 
> i appreciate the suggestions though


Ditch TriXX, it's a pile of junk. Afterburner craps all over it. You can set up different profiles and save them, then in the options you can set one of the profiles for 2D and another for 3D, and it'll auto-manage those two profiles for you. So when you exit a game, it'll drop to the 2D profile settings you made, and when you launch a game or something that uses 3D it loads that profile and its settings.


----------



## kizwan

Quote:


> Originally Posted by *spyshagg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> 26k graphics score at 1100/1500 with latest driver. No tweak as usual. I'm using 390 modded ROM but right now with stock memory timings. 623 errors on the Hynix card.
> 
> http://www.3dmark.com/3dm/12216103?
> 
> stock timings vs. 1250 timings
> http://www.3dmark.com/compare/fs/8608427/fs/7998487
> 
> 
> 
> Nice
> 
> Here is my dual 24/7
> http://www.3dmark.com/fs/8398187
> 
> My single. (very happy. Almost same result as stock fury)
> http://www.3dmark.com/fs/8591492
> 
> Why are you using stock timings?

Nice CF score.

Using stock timings because one of the cards (the Hynix, secondary card) is showing degradation: from Firestrike-stable with no artifacts at 1200/1600 to some artifacts at 1140-1160/1500, even with the voltage offset all the way to +200mV. I'm returning to stock timings to let my cards "chill out". The difference is less than 2FPS anyway. Oh, btw, no errors at 1400 either. I'll try 1425 later.


----------



## catbeef

Quote:


> Originally Posted by *SpykeZ*
> 
> Ditch TriXX, it's a pile of junk. Afterburner craps all over it. You can set up different profiles and save them, then in the options you can set one of the profiles for 2D and another for 3D, and it'll auto-manage those two profiles for you. So when you exit a game, it'll drop to the 2D profile settings you made, and when you launch a game or something that uses 3D it loads that profile and its settings.


Underclocking is not removing or reducing the glitching at all, so I don't think profiles will help (interesting info for future reference, however).

500/700 + 2 monitors = glitch glitch glitch
1000/1300 (stock) + 2 monitors = glitch glitch glitch
1100/1400 +200mV + 2 monitors = glitch glitch glitch

1 monitor fine, 2 monitors bad. It is just really frustrating not knowing why, after years of it being fine, it has suddenly decided no more.


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> It has to hit at least 30K for me. I want more than just 2x 290X performance out of a single card.


^this at stock.

btw, I saw you wrote you average 90 in BF4 at 4K. Mine was a tad higher using all Medium and no AA, although it varies with maps. My minimum (the lowest I've seen) did not go lower than 70. No tearing though, unlike Titanfall.


----------



## Paul17041993

Quote:


> Originally Posted by *Forceman*
> 
> I doubt Vega will be 200W if 1080 is 180W and P10 is 150W. More likely 250W, I'd guess.


Vega will likely follow the same 3-tier group as the Furys: Nano, mid-range, and full-power models.

Quote:


> Originally Posted by *catbeef*
> 
> still occurring all the way up to the max of +200mV that Trixx allows
> 
> 
> 
> 
> 
> 
> 
> 
> i appreciate the suggestions though


Hm, that doesn't sound good; your card might be defective. Still got a warranty? Other things to check, though, would be the PSU and motherboard. If you haven't done so already, pull the card out of the system and re-seat it, and/or test it in another system.

Quote:


> Originally Posted by *SpykeZ*
> 
> Ditch TriXX, it's a pile of junk. Afterburner craps all over it. You can set up different profiles and save them, then in the options you can set one of the profiles for 2D and another for 3D, and it'll auto-manage those two profiles for you. So when you exit a game, it'll drop to the 2D profile settings you made, and when you launch a game or something that uses 3D it loads that profile and its settings.


We use TriXX because it allows more voltage than Afterburner. In my case, though, and possibly others', it's purely for testing settings before writing out a custom BIOS.


----------



## gordesky1

Guys, my 290X Lightning died today... I was playing Forza Apex for about 3 hours, then my screen went black with the sound looping. I thought it was my CPU OC, but nope: every time after the Windows 10 splash screen it black screens... So I tried Display Driver Uninstaller in safe mode and it got me into Windows, but every time it tries to install the driver it goes to a black screen and locks up.

So I figured I'd try it on another PC, and it does the same thing... There are no signs of damage, and I even tried the BIOS switch to the other BIOS, but that still doesn't help...

So I guess it's RMA time? And if so, will they replace it with another Lightning? Another thing: I'm the 2nd owner, so how will this work? And the warranty sticker is a little messed up because I changed the thermal paste a couple months ago...

At the moment I'm using a 5870 and it pretty much sucks... I mean, a good card, but not for today's games.


----------



## Paul17041993

Quote:


> Originally Posted by *gordesky1*
> 
> Guys, my 290X Lightning died today. I was playing Forza Apex for about 3 hours, then my screen went black with the sound looping. I thought it was my CPU OC, but nope: every boot, after the Windows 10 splash screen, it black screens. So I ran the driver uninstaller in safe mode, which got me into Windows, but every time it tries to install the driver it goes to a black screen and locks up.
> 
> So I figured I'd try it in another PC, and it does the same thing. There are no signs of damage, and I even tried the BIOS switch to the other BIOS, but that still doesn't help.
> 
> So I guess it's RMA time? If so, will they replace it with another Lightning? One more thing: I'm the 2nd owner, so how will this work? Also, the warranty sticker is a little messed up because I changed the thermal paste a couple of months ago.
> 
> At the moment I'm using a 5870, and it pretty much sucks. I mean, it's a good card, but not for today's games.


A little odd that it would black screen only with the drivers installed; have you tried a fresh Windows 7 install with 2014 or 2015 drivers?

Warranty is a bit of an issue if the owner changed; for the most part the warranty is considered void. However, if you have the original shop details and/or contact with the original owner, you may be able to get it RMA'd. The exact process will depend on the country and shop, though.


----------



## GHADthc

So I've got a bit of an issue. A while back I installed a Raijintek Morpheus on my 290X and did the good ol' stock heatsink memory/VRM heatsink mod. Problem is, I mangled the thermal pads, so I bought some expensive Alphacool pads (equivalent to Fujipoly). Problem is, they were way too thin, and even when doubled up they didn't make the best contact, if any at all. Can anyone recommend some pads that will go under the original heatsink for this card that aren't going to break the bank?


----------



## catbeef

Quote:


> Originally Posted by *Paul17041993*
> 
> Hm, that doesn't sound good as your card might be defective, still got a warranty? Other things to check though would be the PSU and motherboard, if you haven't done so already; pull the card out of the system and re-seat it, and/or test it in another system.


re-seated it about 10 times lately with all my fiddling. sapphire do a 2 year warranty and i've had it since early 2014 so i think that is out the window. but that is what gets me, same 2 monitors, same computer for 2 years and it suddenly does this after a game forces me to reset. is it possible the reset just caused something to surge a bit too much power somewhere bad? was having no issues at all before that


tried another psu, same stuff. don't have access to another computer to try it in unfortunately. tried lowering the resolution on both monitors just to see what'd happen, but it still did it.

it is so strange that it is working perfectly with just one monitor, able to game for hours and it doesn't show any signs of instability or anything.

well, new cards are coming and i am saving up for one of them anyway. guess i can survive on one monitor until then if i still can't figure this out! appreciate the suggestions~


----------



## Aaron_Henderson

Quote:


> Originally Posted by *catbeef*
> 
> re-seated it about 10 times lately with all my fiddling. sapphire do a 2 year warranty and i've had it since early 2014 so i think that is out the window. but that is what gets me, same 2 monitors, same computer for 2 years and it suddenly does this after a game forces me to reset. is it possible the reset just caused something to surge a bit too much power somewhere bad? was having no issues at all before that
> 
> 
> 
> 
> 
> 
> 
> 
> 
> tried another psu, same stuff. don't have access to another computer to try it in unfortunately. tried lowering the resolution on both monitors just to see what'd happen, but it still did it.
> 
> it is so strange that it is working perfectly with just one monitor, able to game for hours and it doesn't show any signs of instability or anything.
> 
> well, new cards are coming and i am saving up for one of them anyway. guess i can survive on one monitor until then if i still can't figure this out! appreciate the suggestions~


I doubt there is anything physically wrong with the card if it's getting into Windows just fine. Something is up in regards to drivers/Windows, or even perhaps a corrupted GPU BIOS?


----------



## mus1mus

It happens. I have had that previously.

One thing, check atiflash.

If atiflash detects the card as Hawaii, you're good. If it doesn't, the card is literally bricked; it could be the BIOS chip or anything else on that path.

atiflash -i


----------



## catbeef

i updated the bios the other day to the latest version which i found here, with no noticeable improvement or degradation (fancy word for making it any worse).

Code:



adapter bn dn fn dID       asic           flash      romsize test    bios p/n
======= == == == ==== =============== ============== ======= ==== ==============
   0    01 00 00 67B1 Hawaii          M25P10/c         20000 pass 113-E285HBOC-U004

atiflash is what i used to update the bios, and this is the current -i output.
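For anyone who wants to script the check mus1mus suggested, here is a minimal sketch (Python is my choice, and the column layout is assumed from the table above, not from any atiflash documentation) that parses `atiflash -i` output and confirms the adapter still identifies as Hawaii:

```python
# Minimal sketch: parse an `atiflash -i` table (layout assumed from the
# output shown above) and confirm the adapter still reports as Hawaii.
SAMPLE = """\
adapter bn dn fn dID       asic           flash      romsize test    bios p/n
======= == == == ==== =============== ============== ======= ==== ==============
   0    01 00 00 67B1 Hawaii          M25P10/c         20000 pass 113-E285HBOC-U004
"""

def parse_atiflash_info(text):
    """Return a list of dicts, one per adapter row."""
    rows = []
    for line in text.splitlines():
        parts = line.split()
        # Data rows start with the adapter index (a bare integer);
        # the header and the ==== separator line do not.
        if parts and parts[0].isdigit():
            rows.append({
                "adapter": int(parts[0]),
                "device_id": parts[4],
                "asic": parts[5],
                "test": parts[8],
                "bios_pn": parts[9],
            })
    return rows

cards = parse_atiflash_info(SAMPLE)
print(cards[0]["asic"], cards[0]["test"])  # Hawaii pass
```

If the ASIC column reads Hawaii and the test column says pass, the BIOS chip is at least responding, which matches mus1mus's rule of thumb.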

i'm on windows 10 pro. is there anything recommended i can do to verify the integrity of windows? as (i think) i said before, i've done the whole safe-mode driver uninstall with DDU a couple of times and tried various versions of the drivers; i'm currently on the latest, as neither of the older versions resolved the issue.

and i just thought of something i haven't mentioned: i am not using eyefinity or whatever it is called, just in case that was not obvious.


----------



## gordesky1

Quote:


> Originally Posted by *Paul17041993*
> 
> A little odd that it would black screen only with the drivers installed; have you tried a fresh Windows 7 install with 2014 or 2015 drivers?
> 
> Warranty is a bit of an issue if the owner changed; for the most part the warranty is considered void. However, if you have the original shop details and/or contact with the original owner, you may be able to get it RMA'd. The exact process will depend on the country and shop, though.


From what I keep hearing through Google searches, MSI only goes by the S/N, and as long as the card wasn't damaged by opening it up, they don't care about the warranty sticker being removed.

Will find out Monday.

I also just tried a fresh install of Windows, and as soon as it tried installing the drivers it black screened. So it is the card, sadly...


----------



## Paul17041993

Quote:


> Originally Posted by *catbeef*
> 
> i updated the bios the other day to the latest version which i found here, with no noticeable improvement or degradation (fancy word for making it any worse).
> 
> Code:
> 
> 
> 
> adapter bn dn fn dID       asic           flash      romsize test    bios p/n
> ======= == == == ==== =============== ============== ======= ==== ==============
> 0    01 00 00 67B1 Hawaii          M25P10/c         20000 pass 113-E285HBOC-U004
> 
> atiflash is what i used to update the bios, and this is the current -i output.
> 
> i'm on windows 10 pro. is there anything recommended i can do to verify the integrity of windows? as (i think) i said before, i've done the whole safe-mode driver uninstall with DDU a couple of times and tried various versions of the drivers; i'm currently on the latest, as neither of the older versions resolved the issue.
> 
> and i just thought of something i haven't mentioned: i am not using eyefinity or whatever it is called, just in case that was not obvious.


Is the BIOS update specifically for your card? Have you also tried a full reference BIOS?


----------



## catbeef

Quote:


> Originally Posted by *Paul17041993*
> 
> Is the BIOS update specifically for your card? have you also tried a full-reference BIOS?


whoops, i actually linked to the wrong one; this is the bios i updated to: https://www.techpowerup.com/vgabios/162230/sapphire-r9290-4096-140320-1 - i did actually try to flash the other one, and atiflash informed me "yo, wrong card", and i said oops and got the correct one.

i have not tried a reference bios, i will look into this -- would that be either of these?

https://www.techpowerup.com/vgabios/149259/sapphire-r9290-4096-131003

https://www.techpowerup.com/vgabios/160868/amd-r9290-4096-131003


----------



## tokoam

I paid $222 including an EK waterblock; working like a champ too.


----------



## rdr09

Quote:


> Originally Posted by *tokoam*
> 
> I paid $222 including an EK waterblock; working like a champ too.


How are your temps? We have a member who bought his used too; go back a few pages and you'll see the gunk he found.

As soon as I got my second one, I cleaned and replaced the thermals, using CLU and Fuji leftover from my first.

Your core should not see 50°C at load, and your VRMs should be about 5-7°C cooler.

This is with the original thermals from almost three years ago . . .


----------



## tokoam

Quote:


> Originally Posted by *rdr09*
> 
> How are your temps? We have a member who bought his used too; go back a few pages and you'll see the gunk he found.
> 
> As soon as I got my second one, I cleaned and replaced the thermals, using CLU and Fuji leftover from my first.
> 
> Your core should not see 50°C at load, and your VRMs should be about 5-7°C cooler.
> 
> This is with the original thermals from almost three years ago . . .


 temps4.gif 15k .gif file


temps3.gif 15k .gif file


----------



## melodystyle2003

Let me share my experience with an asus r9 290 dcii card.

With the stock cooler I couldn't get it stable over 1075MHz, no matter the voltage, the stock heatsink fans' rpms, or even with 120mm fans strapped to the stock heatsink. To be more accurate, it was running 1075 with stock voltage, but anything higher produced artifacts. Also, the noise of the cooler was getting annoying now that the ambient temperature inside the room reaches 30°C.

I installed a spare AIO with a ghetto zip-ties installation method. Temps were reduced quite a bit (~15°C less, even though the installation method didn't tighten the block on the GPU core enough) and the o/c capabilities started to shine, so I decided to proceed further.

I put a custom watercooling loop on my CPU and used a universal block on the GPU. Magically, the GPU core can now be o/ced to i. 1170MHz +100mV, ii. 1200MHz +150mV, iii. 1250MHz +225mV, stable for gaming (haven't tried higher). I was really surprised with the outcome. Temps reduced dramatically, from 85°C on the stock cooler (with a custom aggressive fan profile) to ~58°C after hours of gaming o/ced! Both VRM locations are cooled with heatsinks and fans, and their temps lie around 75-80°C and 55°C for VRM1 and VRM2 respectively (these are the maximum recorded values for o/c profile ii. above). And all this by just using some spare wc parts I had: a single 240mm rad with 3 fans (due to space restrictions, the RAM blocks the installation of the 4th fan) at low speeds and a pump capable of circulating just 600l/h (will change it to a higher-flow one; the whole wc setup flows within 1/2" ID hoses and barbs).

My question: what's the max voltage we can apply for gaming on the ASUS DCII PCB design (6+2 phases)? I see that ASUS GPU Tweak allows up to +160mV, while TriXX allows up to +200mV.


----------



## TheLAWNOOB

Quote:


> Originally Posted by *melodystyle2003*
> 
> Let me share my experience with an asus r9 290 dcii card.
> 
> With the stock cooler I couldn't get it stable over 1075MHz, no matter the voltage, the stock heatsink fans' rpms, or even with 120mm fans strapped to the stock heatsink. To be more accurate, it was running 1075 with stock voltage, but anything higher produced artifacts. Also, the noise of the cooler was getting annoying now that the ambient temperature inside the room reaches 30°C.
> 
> I installed a spare AIO with a ghetto zip-ties installation method. Temps were reduced quite a bit (~15°C less, even though the installation method didn't tighten the block on the GPU core enough) and the o/c capabilities started to shine, so I decided to proceed further.
> 
> I put a custom watercooling loop on my CPU and used a universal block on the GPU. Magically, the GPU core can now be o/ced to i. 1170MHz +100mV, ii. 1200MHz +150mV, iii. 1250MHz +225mV, stable for gaming (haven't tried higher). I was really surprised with the outcome. Temps reduced dramatically, from 85°C on the stock cooler (with a custom aggressive fan profile) to ~58°C after hours of gaming o/ced! Both VRM locations are cooled with heatsinks and fans, and their temps lie around 75-80°C and 55°C for VRM1 and VRM2 respectively (these are the maximum recorded values for o/c profile ii. above). And all this by just using some spare wc parts I had: a single 240mm rad with 3 fans (due to space restrictions, the RAM blocks the installation of the 4th fan) at low speeds and a pump capable of circulating just 600l/h (will change it to a higher-flow one; the whole wc setup flows within 1/2" ID hoses and barbs).
> 
> My question: what's the max voltage we can apply for gaming on the ASUS DCII PCB design (6+2 phases)? I see that ASUS GPU Tweak allows up to +160mV, while TriXX allows up to +200mV.


I had 2 of those and they were terrible on air. I used an H50 on one of them, and the radiator of the H50 got to 50°C...

Try to stick to +100mV if you want it to last a few more years. Only run it at +200mV 24/7 if you are waiting to replace it with Vega.

I would've run my pair of R9 290s at +200mV, but I don't have enough radiator and it heats up my room too much.


----------



## melodystyle2003

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> I had 2 of those and they were terrible on air. I used an H50 on one of them, and the radiator of the H50 got to 50°C...
> 
> Try to stick to +100mV if you want it to last a few more years. Only run it at +200mV 24/7 if you are waiting to replace it with Vega.
> 
> I would've run my pair of R9 290s at +200mV, but I don't have enough radiator and it heats up my room too much.


That's true; the AIO's radiator was quite warm to the touch when it was cooling the R9, despite two high-flow 120mm fans installed on it in a push-pull configuration. Probably the lower liquid flow compared to custom wc setups is to blame for that behavior.


----------



## Chopper1591

Quote:


> Originally Posted by *melodystyle2003*
> 
> That's true; the AIO's radiator was quite warm to the touch when it was cooling the R9, despite two high-flow 120mm fans installed on it in a push-pull configuration. Probably the lower liquid flow compared to custom wc setups is to blame for that behavior.


Flow rate doesn't help that much.

You said you were just using a single 240, right? That is low rad space.

I have a 360 with 5 fans (odd number love here). Also added a 140.1 to the loop. Water is mostly below 35°C. The GPU stays below 50°C at a 1250 clock (Sapphire R9 290), with VRMs at 40-45°C.

Do note there is an FX-8320 @ 4.8GHz dumping massive amounts of heat into the water too.


----------



## Paul17041993

Quote:


> Originally Posted by *catbeef*
> 
> https://www.techpowerup.com/vgabios/149259/sapphire-r9290-4096-131003
> 
> https://www.techpowerup.com/vgabios/160868/amd-r9290-4096-131003


Those two should be identical, as their hashes are the same.


----------



## Paul17041993

Quote:


> Originally Posted by *melodystyle2003*
> 
> My question: Whats the max voltage we can apply for gaming on asus dcii pcb design (6+2 phases)? I see that the asu gputweak allows up to +160mV, while trixx up to +200mV.


Considering how similar it is to the reference VRMs, the max is whatever stable voltage you can apply while keeping the temps low; the core must be kept below 60°C, and the VRMs close to it, for best results.

Most cards are TDP-limited before they get to +200mV, and many are too unstable to reach that anyway. While higher voltage increases logic stability at higher clocks, it also causes more leakage in weaker areas and reduces stability, so it can take a little time to find the sweet spot for your particular GPU.

If you get the core below 0°C, however, leakage is reduced and you can push it a lot higher; most records these days use liquid helium (-269°C).
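That sweet-spot hunt is essentially a search loop: step the clock up at a fixed voltage offset until it stops being stable, then back off. A purely illustrative sketch, where `is_stable` is a hypothetical stand-in for a real stress test (Valley, 3DMark, hours of gaming) and the numbers are made up:

```python
# Illustrative only: model the "find the sweet spot" process described above.
def is_stable(clock_mhz, offset_mv):
    # Placeholder for a real stress test that watches for artifacts,
    # driver resets, or black screens. This dummy model pretends the
    # card holds 1180MHz at low offsets and anything at +150mV or more.
    return clock_mhz <= 1180 or offset_mv >= 150

def find_max_clock(start_mhz, offset_mv, step=10, ceiling=1300):
    """Climb in `step` MHz increments until the next step fails."""
    clock = start_mhz
    while clock + step <= ceiling and is_stable(clock + step, offset_mv):
        clock += step
    return clock

print(find_max_clock(1100, 100))  # 1180 with the dummy model above
```

In practice each `is_stable` call is hours of testing, which is why Paul says finding the spot "can take a little time".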


----------



## melodystyle2003

Quote:


> Originally Posted by *Chopper1591*
> 
> Flow rate doesn't help that much.
> 
> You said you were just using a single 240, right? That is low rad space.
> 
> I have a 360 with 5 fans (odd number love here). Also added a 140.1 to the loop. Water is mostly below 35°C. The GPU stays below 50°C at a 1250 clock (Sapphire R9 290), with VRMs at 40-45°C.
> 
> Do note there is an FX-8320 @ 4.8GHz dumping massive amounts of heat into the water too.


Then it would probably be a good idea to add a 120mm rad together with a couple of fans when I change the pump?
Quote:


> Originally Posted by *Paul17041993*
> 
> Considering how similar it is to the reference VRMs, the max is whatever stable voltage you can apply while keeping the temps low; the core must be kept below 60°C, and the VRMs close to it, for best results.
> 
> *Most cards are TDP-limited before they get to +200mV, and many are too unstable to reach that anyway. While higher voltage increases logic stability at higher clocks, it also causes more leakage in weaker areas and reduces stability, so it can take a little time to find the sweet spot for your particular GPU.*
> 
> If you get the core below 0°C, however, leakage is reduced and you can push it a lot higher; most records these days use liquid helium (-269°C).


Thanks, I agree with you, especially on the bolded part.


----------



## rdr09

Quote:


> Originally Posted by *tokoam*
> 
> temps4.gif 15k .gif file
> 
> 
> temps3.gif 15k .gif file


Very nice, and a good buy. Got mine for $248 about a year and a half ago with a full block.

Supposedly a mining card.


----------



## Chopper1591

Quote:


> Originally Posted by *melodystyle2003*
> 
> Then it would probably be a good idea to add a 120mm rad together with a couple of fans when I change the pump?
> Thanks, I agree with you, especially on the bolded part.


More is always better.

Can't you fit an extra 240? You can always mod a little, with a radiator stand for example.

You can also run lower rpm with more rad space. My fans are mostly around 1000rpm depending on room temp.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> Very nice and good buy. Got mine for $248 about a year and half ago with full block.
> 
> Supposedly a mining card.


Mining does nothing to these cards. If anything, the 290/290X has proven to be a monster: so much mining has been done on them, and not many people are complaining about dead cards.


----------



## mfknjadagr8

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Mining does nothing to these cards. If anything, the 290/290X has proven to be a monster: so much mining has been done on them, and not many people are complaining about dead cards.


Both of mine were miner cards. They both had horribly dried paste (it looked like dried modeling clay), but they are still kicking...


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Mining does nothing to these cards. If anything, the 290/290X has proven to be a monster: so much mining has been done on them, and not many people are complaining about dead cards.


I've heard/read all kinds of stories about cards used in mining: how they use special BIOSes for mining, undervolt the cards, etc. Mine was in mint condition; hardly any dust on the PCB compared to my own. I got the block separately, so I tested on air first before installing it. It still works as great as my launch card.

Got it for $184. Six months prior to buying this . . . I sold my 7950 for $325.


----------



## OneB1t

The 290X had pretty decent PCB quality (same as the FirePro W9100), so mining has nearly zero effect on these cards.


----------



## Chopper1591

Mine was also a miner. I have no comparison, but it still works perfectly; bought it about 1.5 years ago.

As it runs a 1250 core without breaking a sweat, I guess it held up great while mining.

What do you guys think I will be able to get for it when I sell it? It comes with an EK full-cover block, backplate, reinforcer, and Fujipoly Ultra Extreme thermal pads.


----------



## LandonAaron

I recently saw it advised on this thread to disable Radeon Settings from starting with Windows, to avoid the problem of your overclock's core and memory clocks being applied at Windows startup without your overclock's voltage setting. This is a problem I have suffered from myself, and after reading this advice I even took it upon myself to repeat it to others. Unfortunately, following this advice I have encountered a new problem, where my Radeon Settings gaming/application settings aren't saving, or more accurately are being forgotten. For instance, I am playing Fallout 4 right now, which has horrible stuttering with CrossFire enabled, so I set "AMD CrossFire: Disabled" under Fallout 4's application profile in Radeon Settings. Everything will be fine until I reboot my computer, at which point Radeon Settings reverts my Fallout 4 application profile setting back to "AMD CrossFire: Default Mode". After encountering this, I decided to go ahead and set Radeon Settings back to enabled at startup. Unfortunately, I continue to experience this issue even after setting it back to launch at startup. What is especially strange is that it doesn't seem to forget my settings every time; more like every other time. It's really strange and annoying.

Has anyone else experienced a problem with Radeon Settings forgetting your application profile settings?


----------



## Chopper1591

Quote:


> Originally Posted by *LandonAaron*
> 
> I recently saw it advised on this thread to disable Radeon Settings from starting with Windows, to avoid the problem of your overclock's core and memory clocks being applied at Windows startup without your overclock's voltage setting. This is a problem I have suffered from myself, and after reading this advice I even took it upon myself to repeat it to others. Unfortunately, following this advice I have encountered a new problem, where my Radeon Settings gaming/application settings aren't saving, or more accurately are being forgotten. For instance, I am playing Fallout 4 right now, which has horrible stuttering with CrossFire enabled, so I set "AMD CrossFire: Disabled" under Fallout 4's application profile in Radeon Settings. Everything will be fine until I reboot my computer, at which point Radeon Settings reverts my Fallout 4 application profile setting back to "AMD CrossFire: Default Mode". After encountering this, I decided to go ahead and set Radeon Settings back to enabled at startup. Unfortunately, I continue to experience this issue even after setting it back to launch at startup. What is especially strange is that it doesn't seem to forget my settings every time; more like every other time. It's really strange and annoying.
> 
> Has anyone else experienced a problem with Radeon Settings forgetting your application profile settings?


Have you tried wiping with DDU and reinstalling the suite?


----------



## catbeef

Quote:


> Originally Posted by *Paul17041993*
> 
> Those two should be identical, as their hashes are the same.


well, oops, they were the wrong ones for my card too, but i did find a suitable reference bios and tried that. still no change.

i did notice that almost every time i've tested lately, it does the glitch on the windows 10 boot screen with the spinning dots.. are windows drivers loaded by that time?


----------



## Paul17041993

Quote:


> Originally Posted by *catbeef*
> 
> well, oops, they were the wrong ones for my card too, but i did find a suitable reference bios and tried that. still no change.
> 
> i did notice that almost every time i've tested lately, it does the glitch on the windows 10 boot screen with the spinning dots.. are windows drivers loaded by that time?


That's about the time the desktop components are started, which would involve the drivers being fully started up and any profiles loaded. This is also where a black screen occurs if an unstable memory overclock is set in the Crimson/CCC panel.

You could try underclocking the VRAM and see if the flickering still occurs; otherwise I don't think there's anything that can be done, as it seems like one of the VRAM ICs or the memory controller might have developed a defect...

You could contact Sapphire about it and see what they say; they may be able to repair or replace it at a low price, especially considering the stock may be going into clearance soon.


----------



## catbeef

sounds reasonable enough. i'll give them a tickle and see what they say! genuine thanks for the continued suggestions


----------



## gordesky1

Guys, my MSI Lightning 290X got approved for warranty and I sent it in today. But I wanted to ask: should I email them beforehand and let them know I replaced the thermal paste and that the sticker on the screw was torn by the screwdriver, though no damage was done? Or shouldn't I say anything?

Searching everywhere, it says MSI doesn't care about that sticker and you can remove the heatsink to install an aftermarket one or even change the thermal paste.

But yeah, I wanted to know if I should tell them about it or not.

Edit: when I gave them the tracking I told them about it anyway, and told them that when researching it, it says it was OK.

So we'll see for sure what they say; I figure it's best to let them know rather than not when they get it.


----------



## rdr09

Quote:


> Originally Posted by *gordesky1*
> 
> Guys, my MSI Lightning 290X got approved for warranty and I sent it in today. But I wanted to ask: should I email them beforehand and let them know I replaced the thermal paste and that the sticker on the screw was torn by the screwdriver, though no damage was done? Or shouldn't I say anything?
> 
> Searching everywhere, it says MSI doesn't care about that sticker and you can remove the heatsink to install an aftermarket one or even change the thermal paste.
> 
> But yeah, I wanted to know if I should tell them about it or not.
> 
> Edit: when I gave them the tracking I told them about it anyway, and told them that when researching it, it says it was OK.
> 
> So we'll see for sure what they say; I figure it's best to let them know rather than not when they get it.


No, you don't have to. If they ask, sure, tell them. But I doubt they'll ask.

Anyway, . . .


----------



## mus1mus

Are you saying "X-Fire Hawaiis are still better than any single card solution"?


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> Are you saying "X-Fire Hawaiis are still better than any single card solution"?


For those of us who use 2 (including 970/980 users), yes. It has to match a tri-fire imo. Looks like the Ti will.


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> For those of us who use 2 (including 970/980 users), yes. *It has to match a tri-fire imo.* Looks like the Ti will.


That might be too much to ask.

Slightly beating 2 will do imo.


----------



## Paul17041993

Quote:


> Originally Posted by *gordesky1*
> 
> Guys, my MSI Lightning 290X got approved for warranty and I sent it in today. But I wanted to ask: should I email them beforehand and let them know I replaced the thermal paste and that the sticker on the screw was torn by the screwdriver, though no damage was done? Or shouldn't I say anything?
> 
> Searching everywhere, it says MSI doesn't care about that sticker and you can remove the heatsink to install an aftermarket one or even change the thermal paste.
> 
> But yeah, I wanted to know if I should tell them about it or not.
> 
> Edit: when I gave them the tracking I told them about it anyway, and told them that when researching it, it says it was OK.
> 
> So we'll see for sure what they say; I figure it's best to let them know rather than not when they get it.


Usually it's not worth mentioning, as replacing the paste is rarely a cause of failure. Unless of course you're stupid enough to flood the thing with a conductive compound, which they'd notice, and they'd at least ask why you would do such a thing.

The stickers are simply a way to know if the cooler was tampered with after it left the factory. In most cases the tampering is perfectly innocent (a paste replacement or watercooling), but there are people out there who may loosen the screws or similar just to make the card appear defective so they can try to get a refund.


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> That might be too much to ask.
> 
> Slightly beating 2 will do imo.


If that's the case, then Vega can easily accomplish that next year. One first, then two, 'cause in a mATX or larger case one GPU looks ugly; the case will look empty. I'll upgrade the rest of the system by that time. We'll call it - Viva Las Vegas. lol


----------



## mus1mus

Nice name.

And by then, you will most likely be using a mobo with all that RGB-ewness! I can imagine how the rig will look then.


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> Nice name.
> 
> And by then, you will most likely be using a mobo with all that RGB-ewness! I can imagine how the rig will look then.


2 for 4K, with CrossFire disabled if a game does not support it; one for 1440p. XDMA, baby!


----------



## serave

which thermal paste application method do you guys use?

i recently grabbed one of these just for the hell of it, and noticed that the temps slowly rise into the upper 70s °C and sometimes hit 82°C.

i use the spread-across-the-die method, by the way.

fan speed operates at the same % as the temps (i.e. 75°C = 75% fan speed).

i know those temps are perfectly fine for the 290, but is there anything i can do to stop the "slowly rising to 80°C" part?

the guys at [H] tested the card and they only got 70°C temps during gaming
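The 1:1 fan profile described above (fan % tracking core temperature) can be sketched like this; it's purely illustrative, not any AMD fan-control API, and the clamp bounds are assumptions:

```python
# Sketch of a 1:1 fan profile: fan duty cycle equals core temp in °C,
# clamped to a floor (so the fan never stops) and a 100% ceiling.
def fan_percent(temp_c, floor=20, ceiling=100):
    """Map core temperature directly to fan duty cycle."""
    return max(floor, min(ceiling, int(temp_c)))

for t in (35, 75, 82):
    print(t, "->", fan_percent(t))
# 75°C gives 75% fan, matching the behaviour described above.
```

The steady creep toward 80°C just means the heatsink reaches equilibrium there with this curve; a steeper curve (or better case airflow, as suggested below) shifts that equilibrium down.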


----------



## Streetdragon

Quote:


> Originally Posted by *serave*
> 
> which thermal paste application method do you guys use?
> 
> i recently grabbed one of these just for the hell of it, and noticed that the temps slowly rise into the upper 70s °C and sometimes hit 82°C.
> 
> i use the spread-across-the-die method, by the way.
> 
> fan speed operates at the same % as the temps (i.e. 75°C = 75% fan speed).
> 
> i know those temps are perfectly fine for the 290, but is there anything i can do to stop the "slowly rising to 80°C" part?
> 
> the guys at [H] tested the card and they only got 70°C temps during gaming


i use a bit of liquid metal + good airflow.
like this: https://www.amazon.de/Coollaboratory-Liquid-Pro-Fl%C3%BCssigmetall-W%C3%A4rmeleitpaste/dp/B001PE5XAC


----------



## serave

Quote:


> Originally Posted by *Streetdragon*
> 
> i use a bit of liquid metal + good airflow.
> like this: https://www.amazon.de/Coollaboratory-Liquid-Pro-Fl%C3%BCssigmetall-W%C3%A4rmeleitpaste/dp/B001PE5XAC


isn't that kind of paste conductive, though? i might get one in a few weeks


----------



## mus1mus

Quote:


> Originally Posted by *serave*
> 
> which thermal paste application method do you guys use?
> 
> i recently grabbed one of these just for the hell of it, and noticed that the temps slowly rise into the upper 70s °C and sometimes hit 82°C.
> 
> i use the spread-across-the-die method, by the way.
> 
> fan speed operates at the same % as the temps (i.e. 75°C = 75% fan speed).
> 
> i know those temps are perfectly fine for the 290, but is there anything i can do to stop the "slowly rising to 80°C" part?
> 
> the guys at [H] tested the card and they only got 70°C temps during gaming


More than the thermal paste, you should be looking at your case airflow. Bad airflow, or a lack of it, will cause high temps. Fix that first; then delve into taking the cooler apart and reapplying new thermal paste. Gelid GC Extreme and the like are good.

Another important aspect is your ambient temps.

Most reviewers do their testing on an open bench, so their temps can be misleading.


----------



## Paul17041993

Quote:


> Originally Posted by *serave*
> 
> which thermal paste application method do you guys use?
> 
> i recently grabbed one of these just for the hell of it, and noticed that the temps slowly rise to the upper 70s °C and sometimes hit 82°C
> 
> i use the spread-across-the-die method by the way
> 
> fan speed operates at the same % as the temps (i.e. 75°C = 75% fan speed)
> 
> i know they are perfectly fine with the 290, but is there anything i can do to stop the "slowly rising to 80°C" part?
> 
> the guys at [H] tested the card and they only got 70°C temps during gaming


Paste is likely just fine; however, the heatsink isn't dissipating heat quite fast enough. Either it needs a more aggressive fan profile or you need more case airflow.

Usually if there's a paste issue the temp will rise very sharply, except in the case of high overclocks in a watercooled scenario, which is sensitive to the thermal conductivity of the paste and pads.

Quote:


> Originally Posted by *Streetdragon*
> 
> i use a bit of liquid metal + good airflow.
> like this: https://www.amazon.de/Coollaboratory-Liquid-Pro-Fl%C3%BCssigmetall-W%C3%A4rmeleitpaste/dp/B001PE5XAC


Metal compounds are generally a bad idea for GPUs and other bare dies due to the possibility of a short.


----------



## Streetdragon

sure there is a risk, but the performance speaks for itself







i've used it now on cpu and gpu and i will never go back to silicone-based paste


----------



## SpykeZ

If I can get another 290X for around $200ish, do you think it's worth dropping into my system for smoother 1440p? I know the RX is supposedly dropping next month for around $200, and it's supposed to have 980 performance, which is quite a bit more than the 290X.


----------



## mus1mus

Sell yours and go for the RX.
2 of them.


----------



## TheReciever

For the sake of driver support I would ditch the 290X if you're considering doubling down for the RX.


----------



## rdr09

Quote:


> Originally Posted by *SpykeZ*
> 
> If I can get another 290X for around $200ish, do you think it's worth dropping into my system for smoother 1440p? I know the RX is supposedly dropping next month for around $200, and it's supposed to have 980 performance, which is quite a bit more than the 290X.


What game is that?

We have a member playing games at 1440p and he is fine with a GTX 960. Prolly you just need to lower some settings. If it has GimpWorks, disable it. Your 290X with a little OC should be matching a 390X, which is beating the 980 in some games.


----------



## SpykeZ

Quote:


> Originally Posted by *rdr09*
> 
> What game is that?
> 
> We have a member playing games at 1440p and he is fine with a GTX 960. Prolly you just need to lower some settings. If it has GimpWorks, disable it. Your 290X with a little OC should be matching a 390X, which is beating the 980 in some games.


Well, there have really only been a couple of games that dip a tiny bit, but the point isn't to turn the settings down, it's to turn them up.

Nonetheless, I was thinking about picking up that G10 cooling bracket this weekend; what's the possibility that it'll work with the 480?


----------



## Roboyto

Quote:


> Originally Posted by *SpykeZ*
> 
> Well, there have really only been a couple of games that dip a tiny bit, but the point isn't to turn the settings down, it's to turn them up.
> 
> Nonetheless, I was thinking about picking up that G10 cooling bracket this weekend; what's the possibility that it'll work with the 480?


Sometimes you need to turn the settings down though. Maximum settings probably bog you down far more than they increase the fidelity of the image on your screen. I play 3K with a single 290 by adjusting settings accordingly to give me acceptable frame rates. You just have to play around with the settings a bit to find a happy medium. I understand it's nice to just ram every slider to the right or click 'Ultra', but you have a graphics card that came out in 2013 and that's not how things work. If you want absolute max settings without compromise, you either need the biggest, baddest GPU or two good ones.

No way to know for certain if the G10 will be compatible with the RX 480. I will say that it probably wouldn't be good for the reference board, because it looks rather short and the fan meant to cool the VRMs would likely not do its job.


----------



## ebduncan

Quote:


> Originally Posted by *catbeef*
> 
> underclocking is not removing or reducing the glitching at all, so i don't think profiles will help (interesting info for future reference, however)
> 
> 500 / 700 + 2 monitors = glitch glitch glitch
> 1000 / 1300 (stock) + 2 monitors = glitch glitch glitch
> 1100 / 1400 / 200mV + 2 monitors = glitch glitch glitch
> 
> 1 monitor fine, 2 monitors bad. it is just really frustrating not knowing why after years of it being fine it has suddenly decided no more


it's the drivers. 16.5.3 screwed my system up bad. Revert back to the WHQL drivers and you shouldn't have any more issues.

That being said is anyone else having issues with the 16.5.3 drivers?


----------



## mus1mus

I always stick with 15.10 as it's hassle-free, and it gives me the best frames in benching.


----------



## kizwan

Quote:


> Originally Posted by *ebduncan*
> 
> Quote:
> 
> 
> 
> Originally Posted by *catbeef*
> 
> underclocking is not removing or reducing the glitching at all, so i don't think profiles will help (interesting info for future reference, however)
> 
> 500 / 700 + 2 monitors = glitch glitch glitch
> 1000 / 1300 (stock) + 2 monitors = glitch glitch glitch
> 1100 / 1400 / 200mV + 2 monitors = glitch glitch glitch
> 
> 1 monitor fine, 2 monitors bad. it is just really frustrating not knowing why after years of it being fine it has suddenly decided no more
> 
> it's the drivers. 16.5.3 screwed my system up bad. Revert back to the WHQL drivers and you shouldn't have any more issues.
> 
> That being said is anyone else having issues with the 16.5.3 drivers?

No problem here. With 2 monitors too, hdmi & dvi-to-hdmi.


----------



## Paul17041993

Quote:


> Originally Posted by *SpykeZ*
> 
> None the less, I was thinking about picking up that G10 cooling bracket this weekend, what's the possibility that it'll work with the 480?


The 480's PCB is similar in size to the Fury X's, with a different memory config and a much different die; no, there's basically no chance of any 290/X bracket being compatible with the 480.

The AIO cooler itself should be, though, as they all use standardised dimensions, but with the 480 having only 100W or less going through the die there's not much point in watercooling. That depends on how effective the stock blower is; it's got double-sided intakes, so I'm expecting it to be very effective in virtually any environment...


----------



## TheLAWNOOB

It seems my driver keeps trying to apply my previous OC on startup. I OC using Trixx, but if I don't reset settings and don't shut down normally, it black-screens when I boot up again.

Has this happened to anyone else? I think the driver is trying to apply 1200/1500 even before Trixx auto-starts to apply the voltage.

On the bright side, I get a steady 60fps in NFS 2015 on high now, with textures on ultra and AA off. It runs at 62°C with 24°C ambient because my 1400rpm fans are too slow.


----------



## sinnedone

Quote:


> Originally Posted by *kizwan*
> 
> No problem here. With 2 monitors too, hdmi & dvi-to-hdmi.


With 2 monitors on the latest drivers I don't have a problem, with 3 I get flickering when playing youtube and changing sizes and starting a game etc.


----------



## ebduncan

Quote:


> Originally Posted by *mus1mus*
> 
> I always go stick with 15.10 as it's hassle free. And gives me the best frames in benching.


I was using the 16.5.2 WHQL drivers without an issue before changing to the 16.5.3 "broken ones", haha. I'm back on 16.5.2 now with zero issues again.
Quote:


> Originally Posted by *kizwan*
> 
> No problem here. With 2 monitors too, hdmi & dvi-to-hdmi.


Thanks

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> It seems my driver keeps trying to apply my previous OC on startup. I OC using Trixx, but if I don't reset settings and don't shut down normally, it black-screens when I boot up again.
> 
> Has this happened to anyone else? I think the driver is trying to apply 1200/1500 even before Trixx auto-starts to apply the voltage.
> 
> On the bright side, I get a steady 60fps in NFS 2015 on high now, with textures on ultra and AA off. It runs at 62°C with 24°C ambient because my 1400rpm fans are too slow.


I had problems in the past with some of the earlier crimson drivers doing this to me. Once I updated to 16.5.2 whql those issues were gone.

Quote:


> Originally Posted by *sinnedone*
> 
> With 2 monitors on the latest drivers I don't have a problem, with 3 I get flickering when playing youtube and changing sizes and starting a game etc.


err, i have 4 monitors, but same problem on the 16.5.3 drivers; reverted back to 16.5.2 WHQL and no issues


----------



## kizwan

Quote:


> Originally Posted by *sinnedone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> No problem here. With 2 monitors too, hdmi & dvi-to-hdmi.
> 
> 
> 
> With 2 monitors on the latest drivers I don't have a problem, with 3 I get flickering when playing youtube and changing sizes and starting a game etc.

I believe it is memory related. Did you try without PowerPlay?


----------



## catbeef

Quote:


> Originally Posted by *ebduncan*
> 
> it's the drivers. 16.5.3 screwed my system up bad. Revert back to the WHQL drivers and you shouldn't have any more issues.
> 
> That being said is anyone else having issues with the 16.5.3 drivers?


no, it was my motherboard it seems. died officially on saturday :')


----------



## Roboyto

http://www.newegg.com/Product/Product.aspx?Item=14-202-042&utm_medium=Email&utm_source=GD060616&cm_mmc=EMC-GD060616-_-index-_-Item-_-14-202-042

Sapphire Reference 290X for $219 after $20 MIR


----------



## Paul17041993

Quote:


> Originally Posted by *Roboyto*
> 
> http://www.newegg.com/Product/Product.aspx?Item=14-202-042&utm_medium=Email&utm_source=GD060616&cm_mmc=EMC-GD060616-_-index-_-Item-_-14-202-042
> 
> Sapphire Reference 290X for $219 after $20 MIR


I'd jump on two of those, especially considering they're the exact model as mine; however, the 480s are just too close and more desirable...

If they were 100 or less though then I'd get them regardless.


----------



## Roboyto

Quote:


> Originally Posted by *Paul17041993*
> 
> I'd jump on two of those, especially considering they're the exact model as mine; however, the 480s are just too close and more desirable...
> 
> If they were 100 or less though then I'd get them regardless.


I hear ya. I need to finish the marketplace ad for my HTPC's 290 w/ waterblock so I'll have the fundage for a 480. Even if it's barely more performance, I want to jump headfirst into new GPU overclocking


----------



## SpykeZ

That 480 is really killing the value of everything lol. I was planning on throwing another 290x into my system just as a performance boost and now, here's the 480......which I might as well just wait to get two of them.


----------



## Roboyto

Quote:


> Originally Posted by *SpykeZ*
> 
> That 480 is really killing the value of everything lol. I was planning on throwing another 290x into my system just as a performance boost and now, here's the 480......which I might as well just wait to get two of them.


Or just wait and get a 490


----------



## rdr09

Quote:


> Originally Posted by *catbeef*
> 
> no, it was my motherboard it seems. died officially on saturday :')


Hopefully a new motherboard will fix it. But, yah, blaming a "driver issue" is the path of least resistance.


----------



## sinnedone

Quote:


> Originally Posted by *kizwan*
> 
> No problem here. With 2 monitors too, hdmi & dvi-to-hdmi.


Quote:


> Originally Posted by *kizwan*
> 
> I believe it is memory related. Did you try without powerplay?


No, I did not.

The settings are in Afterburner and not in Crimson, correct?


----------



## kizwan

Quote:


> Originally Posted by *sinnedone*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> No problem here. With 2 monitors too, hdmi & dvi-to-hdmi.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> I believe it is memory related. Did you try without powerplay?
> 
> 
> No I did not.
> 
> Settings are in afterburner and not in crimson correct?

Yes, using Afterburner. It's called "unofficial overclocking mode", with a drop-down option for "without PowerPlay support".
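For reference, the same switch can also be flipped in MSIAfterburner.cfg directly; going from memory, the relevant keys live in the [ATIADLHAL] section, so double-check the key names and values against your own file before editing (the mode-value meanings below are my recollection, not gospel):

```ini
; MSIAfterburner.cfg - AMD hardware-access section (key names from memory, verify!)
[ATIADLHAL]
; 0 = disabled, 1 = unofficial overclocking with PowerPlay,
; 2 = unofficial overclocking without PowerPlay
UnofficialOverclockingMode = 2
; Afterburner requires the exact confirmation string here before it honors the mode
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
```

Same effect as the GUI drop-down; handy if the setting keeps resetting between driver installs.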


----------



## Roboyto

Selling my HTPC's R9 290 w/ waterblock & backplate

$225

http://www.overclock.net/t/1602597/xfx-r9-290-reference-w-xspc-razor-block-backplate/0_20


----------



## cephelix

Quote:


> Originally Posted by *Roboyto*
> 
> Selling my HTPC's R9 290 w/ waterblock & backplate
> 
> $225
> 
> http://www.overclock.net/t/1602597/xfx-r9-290-reference-w-xspc-razor-block-backplate/0_20


Noooo!!! It's a sad day when you sell off your 290...


----------



## Starbomba

Well, after a tryst with a 79x0 Trifire, a completely reference R9 290, and an attempt to like the green team again with a 780 Classy, i'm back on the red team with an R9 290X Tri-X. Now all i need is for AMD to release the RX 490 (or whatever the successor to the Fury X is) so i can get two.

The funny part is that i miscalculated the length of the card. I read in my case manual that a 12'' card would fit, but i kinda forgot i had a tube running on that side of the case, and when i tried installing the card, the tubing was in the way. Since then i love my quick disconnects even more xD

I need to find the time to disassemble this loop and put the card under water; i have an EK full block + backplate i sooo want to use.

http://valid.x86.fr/apg389
http://www.techpowerup.com/gpuz/details/8dysb


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> Noooo!!!it's a sad day when you sell off your 290...


My main rig is still rocking my kickass 290 for the time being. I want to get into the Polaris OCing game right off the bat so the 290 should fund that.

Towards the end of the year when Zen and bigger GPUs start showing up my other 290 will also be going up for sale...and that will be a sad day.


----------



## cephelix

Quote:


> Originally Posted by *Roboyto*
> 
> My main rig is still rocking my kickass 290 for the time being. I want to get into the Polaris OCing game right off the bat so the 290 should fund that.
> 
> Towards the end of the year when Zen and bigger GPUs start showing up my other 290 will also be going up for sale...and that will be a sad day.


I'm thinking the same thing, whether i should get Polaris or just wait for Vega. Something that is an actual worthy replacement for my 290.


----------



## Roboyto

Quote:


> Originally Posted by *cephelix*
> 
> I'm thinking the same thing, whether i should get Polaris or just wait for Vega. Something that is an actual worthy replacement for my 290.


I know this isn't a leap in performance of epic proportions...but I'm really itching to play around with something new. I will definitely be grabbing one, maybe 2 for Xfire...haven't decided. For my wife, I try to keep things simple so when she wants to use it everything just works as intended. It took me a while to get her off the PS3 and justify upgrading the HTPC, so keeping the PC running appropriately is a must.


----------



## mus1mus

I'm contemplating the same thing. But my 290X is epic; it feels like a waste to throw it away. But hey, i still have 3 cards, and only one fails to do 1300+ on core.

The thing that really makes me consider letting them go is power consumption. These cards can easily drain two 1250W units in a system.

I reckon 3 480s won't break a sweat on one of my PSUs.


----------



## Roboyto

Quote:


> Originally Posted by *mus1mus*
> 
> I'm contemplating the same thing. But my 290X is epic; it feels like a waste to throw it away. But hey, i still have 3 cards, and only one fails to do 1300+ on core.
> 
> The thing that really makes me consider letting them go is power consumption. These cards can easily drain two 1250W units in a system.
> 
> I reckon 3 480s won't break a sweat on one of my PSUs.


The one I'm selling isn't a record setter, but it is still a decent performer running 1100/1350 with 50% power and a mere +25mV. It doesn't like extra voltage to boost the clocks a whole lot though. Best it does is 1200/1350 with +175mV.

1100 on the core is still a decent jump in performance. Sure it's not magical, but those clocks easily put it in stock 290X territory.

The 480s draw literally half the power, so there's no doubt your PSU would be working less.


----------



## Vellinious

I'm selling mine too, with the kryographics block. Is there a "for sale" section in this forum?


----------



## Roboyto

Quote:


> Originally Posted by *Vellinious*
> 
> I'm selling mine too, with the kryographics block. Is there a "for sale" section in this forum?


In the OCN Marketplace under Video you can post an ad.


----------



## TheLAWNOOB

I put up an ad on Kijiji saying I'm buying a 290 or 290X, and put the price at 180. Some dude with a 290X emailed me and asked how much I want to pay. He was surprised when I offered him 200. He got all defensive when I said the RX 480 is going to match the 290X at half the power. Maybe he wanted $300 lol.

200 might be a lowball, but he asked me how much I wanted to pay and didn't ask for more.


----------



## Starbomba

I've seen reference 290Xs for $200. Heck, i bought my own 290X Tri-X for $236, which, in my opinion, is a steal.


----------



## Paul17041993

Quote:


> Originally Posted by *Starbomba*
> 
> Well, after a tryst with a 79x0 Trifire, a completely reference R9 290 and attempting to like again the green team with a 780 Classy, i'm back into the red team with a R9 290X Tri-X. Now, all i need is for AMD to release the RX 490 (or w/e sucessor to the Fury X is) for me to get two.
> 
> The funny part is that i miscalculated the length of the card. I read on my case manual that a 12'' card would fit, but i kinda forgot i had a tube running on that side of the case, and when i tried installing the card, the tubing was on the way. Since then i love my quick disconnects even more xD
> 
> I need to find the time to disassemble this loop and put the card on water, i have an EK full block+backplate i sooo want to use.
> 
> http://valid.x86.fr/apg389
> http://www.techpowerup.com/gpuz/details/8dysb


Too many QDC's :I

Though wouldn't having only 3/4 of the memory channels limit your CPU's performance? I've never really seen a config like that...

If you're getting a new EK 290X block then you'll want to get some proper 1mm or 1.5mm pads for the VRMs, as the block no longer comes with them.


----------



## TheLAWNOOB

Anybody tried putting a Gigabyte Windforce cooler on a reference R9 290?

The R9 290 Windforce has a reference PCB, except the capacitors are solid state.


----------



## Starbomba

Quote:


> Originally Posted by *Paul17041993*
> 
> Too many QDC's :I
> 
> Though wouldn't having only 3/4 of the memory channels limit your CPU's performance? I've never really seen a config like that...
> 
> If you're getting a new EK 290X block then you'll want to get some proper 1mm or 1.5mm pads for the VRMs, as the block no longer comes with them.


And i'm missing one more set of these QDCs. They make dismantling certain parts easy as hell, and now with this semi-ghetto config i like them even more.

About the RAM, i am not 100% sure why, but the 4th channel (2 memory slots) is completely unrecognized. I would check it out, but it would involve taking everything apart and checking the motherboard for bent or missing socket pins. Triple-channel RAM isn't that bad, especially when running @ 1866 MHz.

Lastly, i got the pads with the block. I had it with my 290 back in the day. I am thinking of getting the Fujipoly Extreme or Ultra or whatever is the best thermal pad for the VRMs though.
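On the triple- vs quad-channel question, the theoretical DDR3 numbers back Starbomba up: each channel is 64 bits wide, i.e. 8 bytes per transfer, so peak bandwidth is just channels × transfer rate × 8. At DDR3-1866:

```python
# Theoretical peak DDR3 bandwidth: channels x transfer rate (MT/s) x 8 bytes.
def peak_bandwidth_gbs(channels, mt_per_s):
    # MT/s * 8 bytes/transfer = MB/s per channel; /1000 converts to GB/s
    return channels * mt_per_s * 8 / 1000

print(peak_bandwidth_gbs(3, 1866))  # ~44.8 GB/s triple-channel
print(peak_bandwidth_gbs(4, 1866))  # ~59.7 GB/s quad-channel
```

So losing the fourth channel costs about a quarter of the theoretical peak, which few desktop workloads saturate anyway.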


----------



## cephelix

Seems that a few of us are in the same position, thinking about the 480s. Yeah, performance wouldn't be much different from my current 290 (1150/1300/+102mV), but at this point I'm just waiting for a new reference card to put my whole system under water. Everything has been sitting in a box under my bed for 3-4 months as it is.

@Starbomba Fujipoly Ultra Extreme FTW! Can't remember what it's called on PPCs though. Sarcon something something


----------



## kizwan

Quote:


> Originally Posted by *Starbomba*
> 
> Well, after a tryst with a 79x0 Trifire, a completely reference R9 290 and attempting to like again the green team with a 780 Classy, i'm back into the red team with a R9 290X Tri-X. Now, all i need is for AMD to release the RX 490 (or w/e sucessor to the Fury X is) for me to get two.
> 
> The funny part is that i miscalculated the length of the card. I read on my case manual that a 12'' card would fit, but i kinda forgot i had a tube running on that side of the case, and when i tried installing the card, the tubing was on the way. Since then i love my quick disconnects even more xD
> 
> I need to find the time to disassemble this loop and put the card on water, i have an EK full block+backplate i sooo want to use.
> 
> http://valid.x86.fr/apg389
> http://www.techpowerup.com/gpuz/details/8dysb


Is there any reason why you populated memory like that? That is not the correct way btw. You should have 3 sticks on each side. Check the manual. On the left side, the first 3 leftmost slots (A1/B1/B2) & on the right side, the first 3 rightmost slots (C1/D1/D2).

*Edit: correction.*


----------



## Starbomba

Quote:


> Originally Posted by *kizwan*
> 
> Is there any reason why you populated memory like that? That is not the correct way btw. You should have 3 sticks on each side. Check the manual. On the left side, the first 3 leftmost slots & on the right side, the first 3 rightmost slots.


Quote:


> Originally Posted by *Starbomba*
> 
> About the RAM, i am not 100% sure why, but the 4th channel (2 memory slots) are completely unrecognized. I would check it out, but it would involve taking everything apart and checking the motherboard if any socket pin is bent or missing. Triple channel RAM isn't that bad, especially when running @ 1866 MHz.


----------



## Paul17041993

Quote:


> Originally Posted by *kizwan*
> 
> Is there any reason why you populated memory like that? That is not the correct way btw. You should have 3 sticks on each side. Check the manual. On the left side, the first 3 leftmost slots & on the right side, the first 3 rightmost slots.


Pretty sure he has it set up correctly for 3-channel; channel pairs have to sit right next to each other.
If any of the channel groups mismatch, the CPU will be locked into compatibility (single-channel) mode.


----------



## gupsterg

Quote:


> Originally Posted by *kizwan*
> 
> Is there any reason why you populated memory like that? That is not the correct way btw.


I'd agree, ref'ing page 27 & 37.


----------



## Starbomba

I am starting to wonder if anyone even dares to read previous posts unless they've been quoted.


----------



## kizwan

Quote:


> Originally Posted by *kizwan*
> 
> Is there any reason why you populated memory like that? That is not the correct way btw. You should have 3 sticks on each side. Check the manual. On the left side, the first 3 leftmost A1/B1/B2 slots & on the right side, the first 3 rightmost C1/D1/D2 slots.
> 
> *Edit: correction.*

Quote:


> Originally Posted by *Paul17041993*
> 
> Pretty sure he has it set up correctly for 3-channel; channel pairs have to sit right next to each other.
> *If any of the channel groups mismatch, the CPU will be locked into compatibility (single-channel) mode.*


That's not true (bolded). When the memory sticks are installed according to the way I described above (before I made the correction), they will run in quad-channel. It doesn't matter now, though, because I know his slots are faulty.
Quote:


> Originally Posted by *gupsterg*
> 
> Quote:
> 
> 
> 
> Originally Posted by *kizwan*
> 
> Is there any reason why you populated memory like that? That is not the correct way btw.
> 
> 
> 
> I'd agree, ref'ing page 27 & 37.

Thank you for correcting me (*referring to* A1/B1/B2/C1/D1/D2 for better compatibility).

But I'm pretty sure if we use the 3 leftmost slots on the left side & the 3 rightmost slots on the right side, we still get quad-channel on an X79 mobo...

http://www.mslinn.com/sites/mike/computers/gojira/E7069_X79_8-DIMM_Memory_Installation_Guide.pdf

...as long as the four blue/beige/red/dark gray (Black Edition X79 ROG) slots & another pair in the two black slots are populated, it will run in quad-channel.
Quote:


> 6 DIMMs: Supports six (6) modules inserted into four blue slots(P9X79 series) / beige slots(TUF series) / red slots(ROG series) and one pair of black slots(P9X79 series; ROG series) / brown slots(TUF series) as three pairs of Quadchannel memory configurations. Install the modules into slots A1/B1/B2/C1/D1/D2 for better compatibility


Quote:


> Originally Posted by *Starbomba*
> 
> I am starting to wonder if anyone even dare think on reading previous posts unless you've been quoted.


I'm pretty sure *gupsterg* is only correcting me. Don't worry about it.

Honestly, I did notice & read your previous post, but the 4th channel to me is D1 & D2. Your explanation didn't make sense to me at the time, which is why I asked the question. Anyway, don't worry about it.


----------



## Paul17041993

Quote:


> Originally Posted by *kizwan*
> 
> That's not true (bolded). When the memory sticks are installed according to the way I described above (before I made the correction), they will run in quad-channel. It doesn't matter now, though, because I know his slots are faulty.
> Thank you for correcting me (*referring to* A1/B1/B2/C1/D1/D2 for better compatibility).
> 
> But I'm pretty sure if we use the 3 leftmost slots on the left side & the 3 rightmost slots on the right side, we still get quad-channel on an X79 mobo...
> 
> http://www.mslinn.com/sites/mike/computers/gojira/E7069_X79_8-DIMM_Memory_Installation_Guide.pdf
> 
> ...as long as the four blue/beige/red/dark_gray(black edition x79 ROG) & another pair in the two black slots are populated, it will run in quad-channel.
> 
> I'm pretty sure *gupsterg* is only correcting me. Don't worry about it.
> 
> Honestly, I did noticed & read your previous post but 4th channel to me is D1 & D2. Basically your explanation doesn't make sense to me that time which is the reason why I asked the question anyway. Anyway, don't worry about it.


Well, that makes no sense, as the CPU can't possibly make full use of a mixed-channel configuration due to address misalignment...

Unless the single sticks are 2x the capacity of the paired sticks...

Either way, the issue is that the 4th channel is non-functional for some reason, so those last slots can't be used at all unless he fixes the root problem; if a CPU re-seat has already been tried, then the mobo and/or CPU possibly needs an RMA...


----------



## ZealotKi11er

Meh, I am not selling my 290Xs. They will be replacing the 2 x 7970s, and those are going into my collection. These cards have served me too well to be sold.


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Meh, I am not selling my 290Xs. They will be replacing the 2 x 7970s, and those are going into my collection. These cards have served me too well to be sold.


Served you well? All you do is complain about them lol

Are you serious?


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> Served you well? All you do is complain about them lol
> 
> Are you serious?


What does complaining have to do with this? I would be splitting the cards into 2 different systems.


----------



## dagget3450

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What does complaining have anything to do with this? I would be spiting the cards into 2 different system.


Those must be hard to chew and fit in your mouth for spiting


----------



## ZealotKi11er

Quote:


> Originally Posted by *dagget3450*
> 
> Those must be hard to chew and fit in your mouth for spiting


Got to love autocorrect. A 4K 40" screen also helps.


----------



## dagget3450

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Got to love autocorrect. A 4K 40" screen also helps.


No worries. I am only good at two things in my life: syntax errors and confusion. Well, i am also a master of chicken scratch, but that's not applicable here


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What does complaining have to do with this? I would be splitting the cards into 2 different systems.


All you do is complain about them, and they have served you well?

You are kidding.


----------



## dagget3450

Quote:


> Originally Posted by *rdr09*
> 
> All you do is complain about them, and they have served you well?
> 
> You are kidding.


It's not that unheard of. It's like the boy raking up leaves complaining about the trees, when in reality the trees are making him money. I complain about equipment at my job, but in reality i love the fact that it needs lots of repairs because, ya know, job security ;-)


----------



## battleaxe

When the 290x came out it was competitive with what from Nvidia?

What was the competition? The GTX 780ti and the Titan?

And how long did the 780ti hold its own against the 290x as drivers matured?

Then the 980 came out, right? And beat the 290x for a while, right? Then drivers continued to mature and the 290x ended up besting it at 4K (at least), correct me if I'm wrong?

The 390x came out and didn't really compete with the 980, but it is beating it now, especially at 4K and in DX12, where it even competes with the 980ti, correct? Seems to me like AMD is adding some serious value for their customers, while Nvidia does anything but create value for products already purchased...

So why do some have so much allegiance to Nvidia? Seems to me they are bleeding enthusiasts of all their money and shooting them in the face while laughing the entire time.


----------



## ZealotKi11er

Quote:


> Originally Posted by *battleaxe*
> 
> When the 290x came out it was competitive with what from Nvidia?
> 
> What was the competition? The GTX 780ti and the Titan?
> 
> And how long did the 780ti hold its own against the 290x as drivers matured?
> 
> Then the 980 came out right? And beat the 290x for a while right? Then drivers continued to mature and the 290x ended up besting it on 4k (at least) correct me if I'm wrong?
> 
> 390x came out and didn't really compete with the 980 but in the end it is beating it now, especially on 4k and DX12 where it even competes with the 980ti correct? Seems to me like AMD is really adding some serious value to their customers while Nvidia does anything but create value for products already purchased...
> 
> So why do some have so much allegiance to Nvidia? Seems they are bleeding enthusiasts of all their money to me and shooting all of them in the face while laughing the entire time.


290X came out 6 months after Titan OG and 3 months after the GTX 780. It was faster than both and cheaper than both. The problem was you had a 290X with a crappy stock cooler going up against GTX 780s with amazing aftermarket coolers. No idea why AMD did not release this card with custom coolers off the bat. I hope they do it like the 390X. All the cards from the R9 290 to the GTX 980 are in the same ballpark.


----------



## battleaxe

Quote:


> Originally Posted by *ZealotKi11er*
> 
> 290X came out 6 months after Titan OG and 3 months after the GTX 780. It was faster than both and cheaper than both. The problem was you had a 290X with a crappy stock cooler going up against GTX 780s with amazing aftermarket coolers. No idea why AMD did not release this card with custom coolers off the bat. I hope they do it like the 390X. All the cards from the R9 290 to the GTX 980 are in the same ballpark.


Yeah, which is my point. AMD really hit it out of the park with the Hawaii series. The 290X cards are still quite awesome and hold their own against the 980 Ti in DX12, which is going to be the new standard for games. It's hard not to like a brand that treats its customers like this (meaning good).

And hard not to take a small offense at what Nvidia has been up to lately.

I just sold my 970 as a result of how Nvidia has been treating us... ie, (me). The 970 RAM was a bad enough deal, but I'm just done with them ruining their old cards by bringing out new cards that are only marginally better and offering no real driver support (for past-generation cards) that lasts more than about 6-8 months. It's a joke and a never-ending upgrade cycle that is just plain annoying to me.

I get you need to make money, but I sincerely hope AMD can do a repeat on the 290x.


----------



## mus1mus

AMD released the 290/X to counter the 780/Titan. Both were neck and neck at the time, on stock reference coolers with throttling issues due to heat.

nVidia released the 780 Ti with better specs (sans memory size) over the Titan, beating reference 290Xs. They then released the 980/970 and wiped the 780/Ti off the map. At the time, reviewers were still comparing them to reference-blower 200 series cards. That, along with the fact that 780/Ti to 970/980 isn't really a worthy upgrade -- more an effort by nVidia to attract customers with "1500MHz OC clocks!" (well, they have to go that far to justify the 700 to 900 series as an upgrade) -- and people were easily attracted.

But nVidia moved on from there. The 980 Ti cannot be matched by any single AMD card except in DX12/compute, and everything in current games is dominated by nVidia. Let's just hope Vega fixes that.

Regarding DX12, nVidia probably has more than enough money to pay game devs to delay the full DX12 transition, which benefits them big time.


----------



## gordesky1

So it looks like MSI is taking care of my dead 290X Lightning; the status says "Your RMA has been received and is now in replacement, testing, or repair process." If they do replace it because it can't be repaired, what's the chance of getting another 290X Lightning back?

Cause I really don't think I should accept any non-Lightning card that's a 390X or under..

I'm really hoping they do have an extra 290X Lightning, because I had no problems in any game with that card... If they manage to fix it, that would be great too.


----------



## ZealotKi11er

Quote:


> Originally Posted by *gordesky1*
> 
> So it looks like MSI is taking care of my dead 290X Lightning; the status says "Your RMA has been received and is now in replacement, testing, or repair process." If they do replace it because it can't be repaired, what's the chance of getting another 290X Lightning back?
> 
> Cause I really don't think I should accept any non-Lightning card that's a 390X or under..
> 
> I'm really hoping they do have an extra 290X Lightning, because I had no problems in any game with that card... If they manage to fix it, that would be great too.


Most likely 390X Gaming.


----------



## gordesky1

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Most likely 390X Gaming.


That really doesn't sound right tho, replacing a Lightning GPU with a normal plain-jane GPU... tho I guess I will have to accept it if it comes down to it, cause while this GTX 570 is still holding its own today, the 1280MB of VRAM is very, very lacking lol...

Really hoping they still have refurbished 290X Lightnings tho.


----------



## TheLAWNOOB

Quote:


> Originally Posted by *gordesky1*
> 
> Quote:
> 
> 
> 
> Originally Posted by *ZealotKi11er*
> 
> Most likely 390X Gaming.
> 
> 
> 
> That really doesn't sound right tho, replacing a Lightning GPU with a normal plain-jane GPU... tho I guess I will have to accept it if it comes down to it, cause while this GTX 570 is still holding its own today, the 1280MB of VRAM is very, very lacking lol...
> 
> Really hoping they still have refurbished 290X Lightnings tho.

You don't want the free 8GB VRAM upgrade?


----------



## TheLAWNOOB

Got a reference R9 290 for $180 CAD. Have no waterblocks.

Already modded a Gigabyte Windforce cooler so it will hopefully fit. I took off 20% of the fins and hammered the heat pipes to give it more clearance.

Pics coming in a few hours if I don't lose an eye or something.


----------



## gordesky1

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> You don't want the free 8GB VRAM upgrade?


Well, the thing is the Lightning is a much better-built card and will outmatch a normal card in overclocking.

Wouldn't mind the 8GB upgrade, but it bothers me getting a normal card back when before I had a higher-end card. Imagine if I had paid $700 when these cards came out... luckily I didn't.

Still, when these can be found, they can go for a lot...


----------



## questgraves

Update:

Turned out to be the daisy-chained cables on the Thermaltake Toughpower Grand Platinum 1050. I changed to individual PCI-E cables from my last PSU and no more problems; I can OC again just like when the card was new!!

Then I started to have other PSU problems, such as tripping my home breaker, so I RMA'd the PSU and am sending it out tomorrow.


----------



## Paul17041993

Quote:


> Originally Posted by *battleaxe*
> 
> So why do some have so much allegiance to Nvidia? Seems they are bleeding enthusiasts of all their money to me and shooting all of them in the face while laughing the entire time.


Nvidia's name and sole objective as a company is to create jealousy, nothing more. They've basically never done anything significant and probably never will, besides creating OGL and then leaving it to rot...

They live off marketing nonsense and white lies, and unfortunately there are people who fall for it all and even continue to after they get caught out...

Nvidia is basically the Apple of the GPU space, to put it simply.


----------



## Paul17041993

Quote:


> Originally Posted by *questgraves*
> 
> Update:
> 
> Turned out to be the PSU Daisy Chained Cables on the Thermaltake Toughpower Grand Platinum 1050. I changed to individual PCI-E cables from my last PSU and no more problems, can OC again just like the card was new!!
> 
> Then I started to have other PSU problems such as tripping my home breaker so I RMA'd the PSU am sending out tomorrow.


How much power does your house support?

Though if it's an old breaker it may need replacing; they tend to degrade and go wonky after 5-10 years, maybe more.


----------



## questgraves

That's a good point, I will check that out. After I fixed the cable issue I had my card OC'd for a few days playing games, and not only was the card stable but my FPS shot up way higher than it should have for the 14-17% OC. We are talking 45 to 80 in SWTOR with three 1080p monitors.

So I was playing with it, bumped it up a little more, and ended up taking a nap. I woke up and it was over 80 degrees in the house, and the power kept trying to turn on and then shutting right back off... I went through messing with things. Turned off the back of the PSU and the power stayed on. I waited about 5 minutes, tried the PSU, and all was good... Very weird.

But another point that lends weight to your theory is that we blow bulbs too often. Our house is 11 years old, so I assumed it should be OK. My darkest fear is a deep short someplace, but the breaker sounds like a good place to start. Thanks! Reading this, I now sound like a moron. I'm a software engineer and have been building and fixing computers for over 15 years; you would think I would have thought of that... sometimes the answer is too close to be seen.


----------



## battleaxe

Quote:


> Originally Posted by *questgraves*
> 
> That's a good point, I will check that out. after I fixed the cable issue I had my card OC'd for a few days playing games and not only was the card stable but my FPS shot up way higher than they should have for the 14-17% OC. we are talking 45 to 80 in SWTOR with 3 1080 monitors. So I was playing with it, bumped it up a little more and ended up taking a nap. I wake up and it's over 80 degrees in the house and the power keeps trying to turn on and then shuts right back off...I go through messing with things. Turn off the back of the PSU and the power stays on. I wait about 5 minutes and try the PSU and all is good...Very weird. But another point that holds weight to your theory is we blow bulbs too often. Our house is 11 years old. So I assumed it should be ok. My darkest fear is a deep short some place, but the breaker sounds like a good place to start. Thanks! Reading this I now sound like a moron. I'm a software engineer and have been building and fixing computers for over 15 years you would think I would have thought of that...sometimes the answer is too close to be seen.


See if the breaker in question is getting warm. If it is, then there is too great a load placed on that circuit. One fix is to run a dedicated circuit to your office (PC hardware) and relieve some of the stress on that circuit.
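As a rough sanity check on circuit loading, you can estimate the wall-side current and compare it against the breaker's continuous rating. A sketch with assumed numbers (120 V mains, a 15 A breaker, 90% PSU efficiency; adjust for your region and panel):

```python
def wall_current(load_watts, mains_volts=120.0, psu_efficiency=0.90):
    """Estimate the current a PC pulls from the wall, in amps."""
    return (load_watts / psu_efficiency) / mains_volts

def breaker_headroom(amps, breaker_amps=15.0, continuous_derate=0.80):
    """Continuous loads are commonly limited to ~80% of the breaker rating."""
    return breaker_amps * continuous_derate - amps

# Example: an overclocked multi-GPU rig plus monitors drawing ~900 W DC
amps = wall_current(900)           # about 8.3 A at the wall
headroom = breaker_headroom(amps)  # about 3.7 A left on a 15 A circuit
```

Once an AC unit or other appliance shares the same circuit, that headroom disappears fast, which is why a dedicated circuit helps.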


----------



## kizwan

Quote:


> Originally Posted by *questgraves*
> 
> Update:
> 
> Turned out to be the PSU Daisy Chained Cables on the Thermaltake Toughpower Grand Platinum 1050. I changed to individual PCI-E cables from my last PSU and no more problems, can OC again just like the card was new!!
> 
> *Then I started to have other PSU problems such as tripping my home breaker* so I RMA'd the PSU am sending out tomorrow.


When I got my 290s, I had a similar problem when playing games, because other high-wattage appliances were on the same circuit. I ended up getting a dedicated circuit for my computer. At the same time I also re-wired most of the wiring in the house (it's an old house too).

Looking at your signature, your computer should not be pulling too much power from the wall. Most likely you'll need to change the circuit breaker and/or the main breaker. You should probably check the wiring in your house too.


----------



## LandonAaron

I have noticed that anytime I run a game or other fullscreen application which utilizes crossfire, that when I Alt-Tab out to the desktop Windows will behave very strange and sluggish. Typing is extremely slow, and MSIAB's GUI doesn't respond. Has anyone else experienced this?


----------



## Paul17041993

Quote:


> Originally Posted by *LandonAaron*
> 
> I have noticed that anytime I run a game or other fullscreen application which utilizes crossfire, that when I Alt-Tab out to the desktop Windows will behave very strange and sluggish. Typing is extremely slow, and MSIAB's GUI doesn't respond. Has anyone else experienced this?


Tried without MSIAB running? It could be causing a driver/hardware event-pipe fault. Also test with CrossFire disabled; if CrossFire is the sole cause, then you may have to use a different driver version.


----------



## TheLAWNOOB

Too lazy to put the Windforce coolers on the Windforce cards, so I put one on the reference 290 I just got.

Of course it didn't fit. I removed a bunch of fins and hammered a few heatpipes to make room for a capacitor, and bent a bunch of fins for the inductor.

Then I realized the seller didn't give me any screws for the Windforce cooler.

Welp.

I yoloed. And it worked.


----------



## mus1mus

And the results?


----------



## TheLAWNOOB

72C with the fan at 7V and the core at -60mV at stock clocks when mining.

The fan plug on the Windforce is backwards compared to the reference card, so I powered it with 7V directly.
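Running a fan at 7 V is usually done by wiring it between the +12 V and +5 V rails so it sees the difference. A minimal sketch of the arithmetic (the 1800 RPM rating and 3 V stall voltage are hypothetical; real fans are not perfectly linear):

```python
def fan_voltage(high_rail=12.0, low_rail=5.0):
    """Voltage seen by a fan wired between two positive rails."""
    return high_rail - low_rail

def approx_rpm(volts, rated_rpm, rated_volts=12.0, stall_volts=3.0):
    """Very rough linear RPM model above an assumed stall voltage."""
    if volts <= stall_volts:
        return 0.0
    return rated_rpm * (volts - stall_volts) / (rated_volts - stall_volts)

rpm_at_7v = approx_rpm(fan_voltage(), 1800)  # 800.0 for a hypothetical 1800 RPM fan
```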


----------



## Paul17041993

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> 72C with fan at 7V and core at -60mv stock clocking when mining
> 
> The fan plug on windforce is backwards compared to reference card, so powered with 7V directly


If it's just for mining or folding, then you might as well have it on a fixed voltage anyway; the core will throttle if it needs to.


----------



## diggiddi

Quote:


> Originally Posted by *gordesky1*
> 
> That really doeisnt sound right tho. replacing lightning gpu with a normal plain jane gpu... tho i guess i will have to accept if it comes down to it cause while so far this 570gtx is still holding its own today the 1280mb vram is very very lacking lol...
> 
> Really hopeing they still have refurbish 290x lightnings tho.


Maybe we can swap, I'll give you one of mine


----------



## clao

1. http://www.techpowerup.com/gpuz/details/d8qnn

2. Gigabyte R9 290x

3. Aftermarket



Is my OC attempt in Unigine OK?

1100MHz core and 1500MHz memory

Max power limit


----------



## mus1mus

Turn Tessellation to Maximum for easy comparison.

You will then find out that you need to push that CPU.


----------



## Tradition

My top record.
R9 290(X) XFX DD; it used to be a 290, I unlocked the streams with the R9 390X BIOS.
1250/1550, +200mV, 1.4V max


----------



## mus1mus

Tess is disabled though.

And yeah, use print screen.


----------



## Tradition

I ran that for a friend's ranking and it had to be with tess off, and it was easier to just take a pic xD


----------



## Tradition

@TheLAWNOOB just put the R9 390X BIOS on it; it will give you a boost in mining and also lower your temperatures by about 7 degrees. At least that's what it did for me ^^ 30.5MH/s at 1060/1250, -6mV, 60C GPU, 78C VRM


----------



## TheLAWNOOB

Thanks, I'll try it


----------



## TheLAWNOOB

Getting hyped for RX 480


----------



## serave

A guy is willing to buy my 290 for $235. While it is a pretty fair offer,

I feel bad getting rid of my 290s lol. Probably the best card I've ever owned to this date, as the price/performance couldn't be beaten since its inception.

Well, bring on the 480 I guess


----------



## Starbomba

Quote:


> Originally Posted by *serave*
> 
> A guy is willing to buy my 290 for $235, which is a pretty fair offer


It is more than good. I bought my 290X Tri-X for $236 like a month ago.

I'll definitely wait for the RX 480, but mostly since I plan to go Vega. If all the "leaks" turn out to not be that far off, it will be a powerhouse. However, I will keep my 290X until it burns out. My GTX 470 was begging for a replacement anyway, after around 3 years of abuse under me (and like 1.5 years with the previous owner to boot).


----------



## LandonAaron

Forbes has an interesting article on the upcoming RX480 and its marketing campaign: http://www.forbes.com/sites/jasonevangelho/2016/06/12/exclusive-amds-radeon-uprising-campaign-incites-gpu-rebellion/#7dd6709b1c8d


----------



## TheLAWNOOB

Quote:


> Originally Posted by *LandonAaron*
> 
> Forbes has an interesting article on the upcoming RX480 and its marketing campaign: http://www.forbes.com/sites/jasonevangelho/2016/06/12/exclusive-amds-radeon-uprising-campaign-incites-gpu-rebellion/#7dd6709b1c8d


Please give another link that doesn't force me to turn off adblock.


----------



## TheLAWNOOB

Flashed to a modded 390 BIOS and the aux current is through the roof.

It still runs at the same temps, but why is the aux current so high? It makes my card go over the 270W TDP and throttles the core from 1050MHz to 933MHz.


----------



## Tradition

@TheLAWNOOB which miner are you using? I'm getting around 28MH/s with my R9 290X at 1060/1250.


----------



## Tradition

The +38mV on aux is normal; just go into Afterburner and lower it. But it doesn't make any difference to the board power consumption or temperature, so I think there might be something more to it.


----------



## Tradition

Changed the fans on the card for 2 Cooler Master SickleFlows and managed to get this temp 24/7 ^^


----------



## TheLAWNOOB

Using Claymore, and getting 25-27 after modding. Was about 28-29 before modding.

Ran at 1000/1250 before; the mod is 1000/1500.


----------



## Tradition

Try something like this. Also, click on that arrow; it will give you access to the aux voltage.


----------



## TheLAWNOOB

Aux voltage is fine; the problem is HWiNFO is reporting over 100A of current on aux, which is not possible.

My GPU is also throttling, because the false aux current reading is making the card exceed its TDP.
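A minimal sketch of why a bogus aux-current reading trips power-limit throttling: board power is roughly the sum of per-rail volts times amps, and the clock is cut once the total passes the TDP cap. The rail numbers and linear throttle model here are hypothetical, not the actual PowerTune implementation:

```python
def board_power(rails):
    """Sum per-rail power; rails is a list of (volts, amps) pairs."""
    return sum(v * a for v, a in rails)

def throttled_clock(power_w, tdp_w, nominal_mhz):
    """Crudely scale the core clock down in proportion to TDP overshoot."""
    if power_w <= tdp_w:
        return nominal_mhz
    return nominal_mhz * tdp_w / power_w

# Plausible rails: (core volts, core amps) and (aux volts, aux amps)
healthy = board_power([(1.2, 180), (1.0, 25)])   # 241 W, under a 270 W cap
faulty = board_power([(1.2, 180), (1.0, 100)])   # 316 W once the sensor misreads
```

With the misreading sensor, `throttled_clock(faulty, 270, 1050)` lands around 900 MHz, which is in the same ballpark as the 1050-to-933 drop described above.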


----------



## TheLAWNOOB

Could you send me your BIOS? My R9 is now doing 25MH/s with the default BIOS at 1000 core, 1500 mem.

My other R9s are doing 29MH/s with the default BIOS at 1040 core, 1250 mem.

Also, for that screwed-up 290, HWiNFO reports 889MHz while MSI Afterburner shows 1000.


----------



## ZealotKi11er

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Could you send me your BIOS? My R9 is now doing 25MH/s with default BIOS at 1000 core 1500 mem
> 
> My other R9s are doing 29MH/s with default BIOS, 1040 core, 1250mem
> 
> Also, for that screwed up 290, HWinfo reports 889Mhz while MSI Afterburner shows 1000.


My 290 + 290X do 66MH/s @ 1200/1375 + 75mV.


----------



## Tradition

Sure, but I use this BIOS at full stock and overclock afterwards:

http://www.overclock.net/t/1564219/modded-r9-390x-bios-for-r9-290-290x-updated-02-16-2016


----------



## TheLAWNOOB

Well, I give up. HWiNFO keeps telling me I'm running 10% slower than what GPU-Z and MSI Afterburner tell me, and the hashrate reflects that.

At 1040MHz core I used to get 26MH/s last night with the stock BIOS, but now I get 23-24MH/s.

HWiNFO used to report the same clock as MSI Afterburner, but after I flashed BIOSes it reports 10% lower all the time, even after I flash back to stock.

I didn't reset the overclock before flashing; could that make a difference?
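The numbers above are at least self-consistent: a core that silently runs 10% slower roughly tracks the observed hashrate drop. A back-of-envelope check, assuming near-linear scaling with core clock (only an approximation, since Ethash is also memory-bound):

```python
def expected_hashrate(baseline_mhs, actual_mhz, baseline_mhz):
    """Scale hashrate linearly with effective core clock (rough approximation)."""
    return baseline_mhs * actual_mhz / baseline_mhz

# 26 MH/s at a true 1040 MHz; HWiNFO claims the core now runs ~10% lower
est = expected_hashrate(26, 1040 * 0.9, 1040)  # about 23.4 MH/s, close to the 23-24 observed
```

If the estimate matches what the miner reports, the lower HWiNFO clock is probably the real one and the Afterburner readout is stale.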


----------



## Tradition

Hardly. Are your drivers right? 15.12, or the OpenCL in the mining folder?


----------



## TheLAWNOOB

I'm using 16.3 Beta but it worked fine on my other rig. I'll try 15.12.


----------



## ZealotKi11er

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> I'm using 16.3 Beta but it worked fine on my other rig. I'll try 15.12.


That's your problem. 16.2 is the latest you can use.


----------



## TheLAWNOOB

My 290 is hopeless.

26MH/s after using 15.12 driver, 390 BIOS, 1030core 1250 mem.

1500 mem cuts hashrate in half 50% of the time

Soon on ebay: "Mint condition never overclocked only used for gaming R9 290 for sale, barely used, $300"


----------



## ZealotKi11er

Is it a reference R9 290?


----------



## TheLAWNOOB

Yes, an XFX reference 290 that somehow came with a PowerColor cooler.

It has a Windforce cooler on it right now; core 59C, VRM 63C.


----------



## ZealotKi11er

http://www.overclock.net/t/1561904/mlu-bios-builds-for-290x


----------



## gupsterg

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Yes, XFX reference 290 that somehow came with a powercolor cooler
> 
> It has a windforce cooler on it right now, core 59C VRM 63C


If you have a reference PCB (ie AMD stamp by the PCI-E fingers), I'd get The Stilt's mining ROMs over MLU. ROPs have been shed in the mining ROM, plus it has the fSW tweak (the MLU ROMs do also).

Now, flashing to a 390/X ROM can be problematic IMO; the insanity ROMs are not ideal for everyone. Personally I'd stick to a 290/X ROM with mods, or The Stilt's ROMs.


----------



## gordesky1

Quote:


> Originally Posted by *diggiddi*
> 
> Maybe we can swap
> 
> 
> 
> 
> 
> 
> 
> , I'll give you one of mine


Will let you know if I do get one and if I dislike it lol

So far the RMA status went a step back for some reason, so I guess more waiting.... lol


----------



## ZealotKi11er

Quote:


> Originally Posted by *gupsterg*
> 
> If you have reference PCB (ie AMD stamp by PCI-E fingers) I'd get The Stilt mining ROMs vs MLU. ROPs have been shed in mining ROM plus has fSW tweak (MLU do also).
> 
> Now flashing to 390/X can be problematic IMO, the insanity ROMs are not ideal for all, personally I'd stick to 290/X ROM with mods or The Stilt ROMs.


Do you have a link for his mining ROMs?


----------



## mus1mus

Are you still gaining with them these days?


----------



## gupsterg

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Do you have a link for his mining ROMs?


Here you go.

Benefit over stock ROM:-

i) manual VID set for all DPMs, lower than the EVV VID per DPM of the stock ROM.
ii) fSW lowered = power efficiency/lower VRM temp.
iii) ROPs shed for power saving.
iv) RAM timings tweaked for mining.

The Stilt MLU vs Mining ROM:-

i) VID per DPM not manually set; they are EVV.
ii) ROPs are not shed.
iii) RAM timings tweaked for gaming.

I'm currently investigating how ROPs were shed in The Stilt's mining ROMs; this would mean we can have:-

i) a 390/X ROM with ROPs shed for mining, plus mining ROMs for non-reference 290/X & 390/X cards.
ii) new ROMs, if the older The Stilt mining ROMs are an issue for anyone to use.

All command tables in The Stilt's mining ROMs are the same as stock, hex for hex. I don't think the ROPs are shed by the data tables; hoping to find time to check. Currently I believe the shedding is done via the upper section of the ROM.

The Stilt's mining ROMs are non-UEFI, so if you're using one on a UEFI mobo you will need to make sure legacy compatibility is on (ie CSM, etc). A UEFI module can be added to his ROMs; check out the Hawaii BIOS mod OP and post there if you need help with the mod.
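The DPM/VID idea can be sketched numerically: each DPM state pairs a clock with a voltage (VID), and a mining ROM pins each VID below what the chip's fused EVV would request. All numbers below are hypothetical illustrations, not values from an actual Stilt ROM:

```python
# Hypothetical DPM table: state -> (core MHz, EVV VID in mV, mining-ROM VID in mV);
# real values come from the card's own ROM and differ per ASIC.
dpm_table = {
    0: (300, 968, 900),
    4: (727, 1093, 1000),
    7: (947, 1212, 1100),
}

def dynamic_power_saving(evv_mv, manual_mv):
    """Dynamic (switching) power scales roughly with V^2 at a fixed clock."""
    return 1.0 - (manual_mv / evv_mv) ** 2

savings = {state: dynamic_power_saving(evv, manual)
           for state, (_mhz, evv, manual) in dpm_table.items()}
# savings[7] comes out near 0.18, i.e. meaningfully less dynamic power at the top state
```

This is why a lower manual VID per DPM cuts VRM load and temperature even at unchanged clocks.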


----------



## TheLAWNOOB

Can you game on a card with mining BIOS, or do I need to flash back for gaming?


----------



## rdr09

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Can you game on a card with mining BIOS, or do I need to flash back for gaming?


Check whether your card has a secondary BIOS . . . it should. If it does: shut down, switch, then game. Not sure if the driver for gaming matches the mining one, though.


----------



## gupsterg

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Can you game on a card with mining BIOS, or do I need to flash back for gaming?


The Stilt mining ROM with ROPs shed will perform poorly in games.

You could have the stock ROM in one BIOS position and the mining ROM in the other. You'll need to power down/up when changing between ROMs.


----------



## TheLAWNOOB

Thanks. I'll flash the reference 290 and not bother with the pair of Gigabytes. The Gigabytes are already doing 29.5MH/s at 1040 core with 16.5.3, so I don't want to fix what's not broke.

Looks like I'll sell the EK blocks and use some universal blocks instead. Got Chinese universal blocks for 12 bucks a piece. Once the 480 drops, 290s are going to be worthless and nobody will want blocks for them anymore.

Not sure if I mentioned it already, but I got an aluminum F-150 radiator and heater core that I will be using for the 480s I'm going to buy. Also bought cheap aluminum blocks for 5 bucks a piece. Will update once everything works out.


----------



## Paul17041993

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Thanks. I'll flash the reference 290 and not bother with the pair of Gigabytes. Gigabytes are already doing 29.5MH/s at 1040 core with 16.5.3 so I don't want to fix what's not broke.
> 
> Looks like I'll sell the EK blocks and use some universal blocks instead. Got chinese universal blocks 12 bucks a piece. Once the 480 drops 290s are going to be worthless and nobody will want block for them anymore.
> 
> Not sure if I mentioned it alredy, I got some aluminum F150 radiator and heat core that I will be using for the 480s I'm going to buy. Also bought cheap aluminum blocks for 5 bucks a piece. Will update once everything works out.


You'll have to be careful what coolant you use with aluminium components, you may need some automotive grade stuff, though most mixed coolants with corrosion inhibitors should do the job.

If _everything_ is aluminium though you may be fine with just distilled water and some form of biocide.


----------



## TheLAWNOOB

I am well aware. Going to use 40% antifreeze.

For the 290: flashed the mining BIOS and got 29MH/s at first. Then it downclocked to 900MHz and I got 26MH/s.

I think the aux current sensor is screwed up. It keeps saying the aux is drawing 100W of power, which is 4 times the regular amount drawn by the memory controller. My other 290 only draws 25W on aux.


----------



## TheLAWNOOB

Did a clean install of Win7 with no OC software; same issue.

Switched to Ether-only mining and the problem seems to be solved.


----------



## HOMECINEMA-PC

11520x2160


----------



## mfknjadagr8

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> 
> 
> 11520x2160


nice very nice...


----------



## gordesky1

MSI just got back to me and said their RMA service center stated the Lightning 290X model is out of buffer (stock); only the R9 390 GAMING 8G is available......

And from what I've seen, the normal 390 is a bit slower than the 290X, correct?


----------



## rdr09

Quote:


> Originally Posted by *gordesky1*
> 
> MSI just got back to me and said their RMA service center stated the Lightning 290X model is out of buffer (stock); only the R9 390 GAMING 8G is available......
> 
> And from what I've seen, the normal 390 is a bit slower than the 290X, correct?


No, even.


----------



## gordesky1

Still say they should've offered the 390X, cause well, my card is a Lightning, not just a normal card...

They got back to me cause I asked if they are getting any back in stock, and they said their RMA service center informed them that the buffer will be waiting for another 3 weeks.

So does that mean it will be in stock in 3 weeks?

I also asked them which card they mean, the 390 or 390X, cause I told them about having a Lightning and that I should be getting the 390X instead.


----------



## rdr09

Quote:


> Originally Posted by *gordesky1*
> 
> Still say they should've offered the 390X, cause well, my card is a Lightning, not just a normal card...


Retailers have the 480s. Try that. The 470 matches the Hawaiis.


----------



## gordesky1

Quote:


> Originally Posted by *rdr09*
> 
> Retailers have the 480s. Try that. The 470 matches the Hawaiis.


Hmm, just asked them about the 390X and they said the only card they have is the 390 GAMING 8G: https://us.msi.com/Graphics-card/R9-390-GAMING-8G.html#hero-specification (buffer = stock)...

Will ask them about the 480, tho wouldn't you think they would have the 390X in stock?

It seems like they're saying the 390 is the only thing they have in stock...

Edit: just asked them about the 480, and I also asked if the 390 non-X is the only card they have in stock. Just waiting for them to get back to me. Also asked them whether, if I wait 3 weeks, I will get back a 290X Lightning.


----------



## TheLAWNOOB

Quote:


> Originally Posted by *gordesky1*
> 
> MSI just got back to me and said their RMA service center stated the Lightning 290X model is out of buffer (stock); only the R9 390 GAMING 8G is available......
> 
> And from what I've seen, the normal 390 is a bit slower than the 290X, correct?


Just ask nicely for an RX 480.

290X Lightning to 390 Gaming is a huge downgrade.


----------



## ZealotKi11er

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Just ask nicely for a RX 480.
> 
> 290X Lightning to 390 Gaming is huge downgrade.


What's special about the 290X Lightning? The 390 is a downgrade, but the 390X is not, cause you get 8GB of VRAM.


----------



## gordesky1

Quote:


> Originally Posted by *ZealotKi11er*
> 
> What's special about the 290X Lightning? The 390 is a downgrade, but the 390X is not, cause you get 8GB of VRAM.


It's mostly because the Lightning is a premium card and a great overclocker, has a lot of good features, performs better than the normal 290X at times, and stays cooler.

The 390 is an 8GB card too, but still, I would think they would've offered the X version for a premium card. Heck, I would've said yes, even tho I still think I should be offered more cause of it being a Lightning. Not sure if they're going to give in on the 480, even tho I don't see why not, since it is cheaper than the 390s..

So far they still haven't replied back after I asked them... wonder if it would be better if I called?

I also have to watch out, cause I'm not the first owner; they took the RMA in on the serial number like they're usually known to do. But I just hope my asking them so much stuff doesn't make them ask for proof lol


----------



## Paul17041993

I wouldn't accept a 390 as a replacement for a 290X, as it's a straight downgrade; the 390X is the optimal replacement, as it's a side-grade when you factor in clocks. The 8GB of VRAM doesn't actually do much unless you're pushing enough pixels that you actually need multiple cards.

So they either give you a 290X, a 390X, or a refund. In AU that's the law and they can't back out of it; it shouldn't be much different in your country.


----------



## gordesky1

Quote:


> Originally Posted by *Paul17041993*
> 
> I wouldn't accept a 390 as a replacement for a 290X as it's a straight down-grade, 390X is the optimal replacement as it's a side-grade when you factor in clocks, the 8GB of VRAM doesn't actually do much unless you're pushing enough pixels that you actually need multiple cards.
> 
> So they either give you a 290X, 390X or a refund, in AU that's the law and they cant back out of it, shouldn't be much different in your country.


They just got back to me and said their RMA service center will be in touch with me later for follow-up.... Not sure if the RMA service center will be contacting me through email or phone.

They didn't say which card will be the replacement..

And yea, I won't accept a 390 non-X. If they had the 390X I would take that, or even another Lightning if they had one.

I'm not sure how a refund would go if it comes down to that, cause I did buy it used from the market here from MunneY. Not sure if they would refuse the RMA if they knew that or what.. all they asked for is the serial on the back of the card, which I heard is the only thing they go by.

I just hope asking questions and refusing the 390 non-X won't push them to ask more questions...


----------



## mus1mus

My MSI BF4 290 was replaced with a 290X.

Was it an upgrade? NO!

That 290 had Hynix memory; the 290X came with Elpida. The 290 has been my go-to bencher. This Elpida 290X sucks.

Lightning 290X is still slower than a 390 Gaming at stock. :meh:

Your greatest leverage in their RMA system is that you paid for a PREMIUM CARD. They should prioritize you.


----------



## Eroticus

Quote:


> Originally Posted by *mus1mus*
> 
> My MSI BF4 290 was replaced with a 290X.
> 
> Was it an upgrade? NO!
> 
> That 290 has Hynix. 290X with Elpida. 290 has been my go to bencher. This 290X Elpida sucks.
> 
> Lightning 290X is still slower than a 390 Gaming at stock. :meh:
> 
> Your greatest immunity to their RMA system is that, you paid for a PREMIUM CARD. They should prioritize you.


Can't be, show me your 390 Gaming score in 3DMark.

The Lightning 290X is 1080MHz on the GPU + more cores, while the 390 Gaming is 1040 + fewer cores.


----------



## mus1mus

Quote:


> Originally Posted by *Eroticus*
> 
> Can't be , show me ur 390 Gaming score in 3D Mark
> 
> Lightning 290X is 1080mhz on gpu + more cores while MSI is 1040 + less cores.


Don't make me pull such stuff for you to be in the know.









1. I don't have a 390/X
2. I don't run stock
3. 390 Memory is making a ton of difference
4. 300 series cards benefit from driver support better than 200 series cards.

I can go on, you know.

My 1517MHz 290X

vs

Fyzzz's 1300MHz 390

http://www.3dmark.com/3dm11/11315810

http://www.3dmark.com/3dm11/11329448


----------



## Eroticus

But you said stock, not overclocked...


----------



## mus1mus

Quote:


> Originally Posted by *Eroticus*
> 
> But you said stock, not overclocked...


Grow some common sense.

1500MHz on core for the 290X is being beaten by a 1300MHz 390.

Come on man.


----------



## Eroticus

Quote:


> Originally Posted by *mus1mus*
> 
> Grow some common sense.
> 
> 1500MHz on core for the 290X is being beaten by a 1300MHz 390.
> 
> Come on man.


Anything can lower your score, from bad voltages to a broken card or old drivers.

I can't make sense of it; when someone says stock, I'm thinking stock vs. stock clocks.

Plus, not everyone is even able to overclock to half of your or his clock, so it's pretty useless for me and 95% of the planet =D... Anyway, nice overclock dude...


----------



## mus1mus

Quote:


> Originally Posted by *Eroticus*
> 
> *Anything can lower ur score , from bad voltages to a broken card.*
> 
> I don't have sense, when some one says stock, so i'm thinking about stock clock.
> 
> +not everyone is even able to overclock to half of ur or his clock, so it's pretty useless for me and 95% of the planet =D.... any way nice overclock dude...


Are you saying I don't know how to overclock dude?









Here is another 1500MHz example for you then.

http://www.3dmark.com/3dm11/9423744

Look at just the graphics score. And check out fyzzz's 390.

A 1300MHz-core 390 will annihilate a 290X running the same clocks.

Heck, even the variance from Hynix to Elpida among 290Xs matters quite a lot.


----------



## ZealotKi11er

Quote:


> Originally Posted by *mus1mus*
> 
> Don't make me pull such stuff for you to be in the know.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1. I don't have a 390/X
> 2. I don't run stock
> 3. 390 Memory is making a ton of difference
> 4. 300 series cards benefit from driver support better than 200 series cards.
> 
> I can go on, you know.
> 
> My 1517MHz 290X
> 
> vs
> 
> Fyzzz's 1300MHz 390
> 
> http://www.3dmark.com/3dm11/11315810
> 
> http://www.3dmark.com/3dm11/11329448


290X Lightning had the best memory of any 290X. Could do 1700MHz easily.


----------



## mus1mus

I am still a Hynix believer.









Lightnings use Samsung, which clocks high but performs worse, IIRC.

Holds true even on my 780s.


----------



## Eroticus

Nah, Sapphire had the best memory! =P

How do you even run 3DMark 11 Performance? Didn't they remove it some years ago?

Your driver is 2 years old (Nov 2014)...


----------



## ZealotKi11er

Quote:


> Originally Posted by *mus1mus*
> 
> I am still a Hynix believer.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Lightnings use Samsung to clock high. Though performs worse. IIRC
> 
> Holds true even on my 780s.


Have there been any tests on this?

Speaking of clocking high. What about the new 8Gbps chips on RX480 and 1070? Who is making them?


----------



## Vellinious

I had great luck with the Hynix memory on the 290X I have. Clocked easily to 1800, and would run 1860 for benchmark runs.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Vellinious*
> 
> I had great luck with the Hynix memory on the 290X I have. Clocked easily to 1800, and would run 1860 for benchmark runs.


That's impressive. Only my 7970's Hynix can do 1900MHz.


----------



## gordesky1

Still waiting for the RMA service to contact me, but I wonder if they will ask me for proof of purchase if I do keep pushing for the 390X or 480?

I mean, they didn't ask for any of this beforehand, just the serial number and that's it. Can they turn around and deny the warranty if they do find out I'm not the first owner?

It just sucks that they only offered me a 390 non-X to replace my premium card; I thought they were going to say the 390X, which I pretty much would've gone with. Heck, the 480 is going to be cheaper than both cards, so I really don't see a problem replacing it with that..

I just don't know what to say when they ask me how much I paid and where I bought it and do I have a receipt... that's if they do ask, but you never know..


----------



## Eroticus

Quote:


> Originally Posted by *gordesky1*
> 
> Still waiting for the RMA service to contact me, but I wonder if they will ask me for proof of purchase if I do keep pushing for the 390X or 480?
> 
> I mean, they didn't ask for any of this beforehand, just the serial number and that's it. Can they turn around and deny the warranty if they do find out I'm not the first owner?
> 
> It just sucks that they only offered me a 390 non-X to replace my premium card; I thought they were going to say the 390X, which I pretty much would've gone with. Heck, the 480 is going to be cheaper than both cards, so I really don't see a problem replacing it with that..
> 
> I just don't know what to say when they ask me how much I paid and where I bought it and do I have a receipt... that's if they do ask, but you never know..


Ask the guy you bought it from for the documents, and they won't ask. You don't get to pick the RX 480..

The only options are 290X Lightning / 390X or 390.


----------



## mus1mus

That really depends on the shop/service center you deal with.

I am already the 3rd owner of the 290 I RMAed. I have the receipt though. The card had even been opened up for watercooling.

According to the shop, my card was even sent to Taiwan for repair and was replaced with a Taiwan unit.


----------



## Paul17041993

Quote:


> Originally Posted by *gordesky1*
> 
> They just got back to me and said "our RMA service center will be in touch with you later for follow-up"... Not sure if the RMA service center will be contacting me through email or through phone.
> 
> They didn't say which card will be the replacement..
> 
> And yeah, I won't accept a 390 non-X. If they had the 390X I would take that, or even if they had another Lightning.
> 
> I'm not sure how a refund would go if it comes down to that, cause I did buy it used from the market here from MunneY. Not sure if they would refuse the RMA if they knew that or what.. All they asked for is the serial on the back of the card, which I heard is the only thing they go by.
> 
> I just hope asking and refusing the 390 non-X won't push them to ask more questions...


IMHO the Lightning isn't worth much more than a reference card; however, if you wanted to "LN2" bench it, that's where they shine.

But the fact still stands that you paid for a "premium" card, so they can't skimp you with a lower-spec card (4GB vs 8GB doesn't count).

Quote:


> Originally Posted by *mus1mus*
> 
> My MSI BF4 290 was replaced with a 290X.
> 
> Was it an upgrade? NO!
> 
> That 290 has Hynix. 290X with Elpida. 290 has been my go to bencher. This 290X Elpida sucks.
> 
> Lightning 290X is still slower than a 390 Gaming at stock. :meh:
> 
> Your greatest immunity to their RMA system is that, you paid for a PREMIUM CARD. They should prioritize you.


What are the ICs spec'd at? Most Hawaii cards only have 1500MHz-rated ICs and, as such, usually just meet that specification. Some clock higher somewhat independently of the brand of the ICs, as the limit tends to be the memory controller, which is generally weaker than in the other GCN designs (it's optimised for an efficient 512-bit bus).

Have you also tried meddling with the aux rail and/or BIOS versions?

Quote:


> Originally Posted by *mus1mus*
> 
> Don't make me pull such stuff for you to be in the know.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1. I don't have a 390/X
> 2. I don't run stock
> 3. 390 Memory is making a ton of difference
> 4. 300 series cards benefit from driver support better than 200 series cards.
> 
> I can go on, you know.
> 
> My 1517MHz 290X
> 
> vs
> 
> Fyzzz's 1300MHz 390
> 
> http://www.3dmark.com/3dm11/11315810
> 
> http://www.3dmark.com/3dm11/11329448


Different drivers, CPU, system config, and all the clocks. Also, a higher physics score tends to bring the graphics score down on some systems, because that's just how stupid 3DMark is.


----------



## Chopper1591

A lot of talking about the RX 480, I see.
I doubt it is worth getting that to replace my 290 Tri-X. I am probably waiting for the 490(X).

Have to stay red, as I just bought a 144Hz FreeSync panel a few weeks ago. I love it.


----------



## diggiddi

Quote:


> Originally Posted by *gordesky1*
> 
> Still waiting for the RMA service to contact me, but I wonder if they will ask me for proof of purchase if I do keep pushing for the 390X or 480?
> 
> I mean, they didn't ask for any of this beforehand, just the serial number and that's it. Can they turn around and deny the warranty if they do find out I'm not the first owner?
> 
> It just sucks that they only offered me a 390 non-X to replace my premium card; I thought they were going to say the 390X, which I pretty much would've gone with. Heck, the 480 is going to be cheaper than both cards, so I really don't see a problem replacing it with that..
> 
> I just don't know what to say when they ask me how much I paid and where I bought it and do I have a receipt... that's if they do ask, but you never know..


No need to panic; the RMA is done by serial number and runs from the date of manufacture, not based on the receipt, otherwise they'd have asked you for that already.


----------



## rdr09

Quote:


> Originally Posted by *Chopper1591*
> 
> A lot of talking about the RX 480, I see.
> I doubt it is worth getting that to replace my 290 Tri-X. I am probably waiting for the 490(X).
> 
> Have to stay red, as I just bought a 144Hz FreeSync panel a few weeks ago. I love it.


We are talking RMA stuff.


----------



## Chopper1591

Quote:


> Originally Posted by *rdr09*
> 
> We are talking RMA stuff.


No


----------



## Eroticus

Quote:


> Originally Posted by *Chopper1591*
> 
> No


Are you drunk ?

Not about ur card...


----------



## battleaxe

Quote:


> Originally Posted by *Eroticus*
> 
> Are you drunk ?
> 
> Not about ur card...


LOL... he's just messing with you.


----------



## TheLAWNOOB

They see me trollin', they hatin'....


----------



## rdr09

Tryin to catch me ridin dirty


----------



## mus1mus

coz I'm so white and nerdy...


----------



## TheLAWNOOB

How likely is it for Hawaii BIOS editor to work with Polaris 10?


----------



## OneB1t

I'm thinking about upgrading it after the card is released

or someone else will prolly modify it


----------



## Chopper1591

Quote:


> Originally Posted by *Eroticus*
> 
> Are you drunk ?
> 
> Not about ur card...


Haha, dude.
I posted while on my cigarette break at work. So no, I was not drunk. Just trolling a bit.

You thinking someone is drunk says more about yourself than about me.








But.. no offense meant. Sowwy








Quote:


> Originally Posted by *battleaxe*
> 
> LOL... he's just messing with you.


We all have to sometimes, right?
Nothing personal, this time







.

I just had such a crappy day at work that I just had to play troll for a bit.
Quote:


> Originally Posted by *TheLAWNOOB*
> 
> They see me trollin', they hatin'....


Quote:


> Originally Posted by *rdr09*
> 
> Tryin to catch me ridin dirty


Quote:


> Originally Posted by *mus1mus*
> 
> coz I'm so white and nerdy...


Whaha, wow.
I love OC.net so much sometimes for the dry humor.


----------



## Connolly

Did anyone find a solution for the latest Crimson drivers conflicting with AB? Where the PC black screens at login.


----------



## spyshagg

Quote:


> Originally Posted by *Connolly*
> 
> Did anyone find a solution for the latest Crimson drivers conflicting with AB? Where the PC black screens at login.


I have a batch file at home that runs on Windows logout and resets the voltage. I don't know when I'm going home, but I'll give it to you when I do.


----------



## i2CY

Sorry for this... cause it kinda un-advances the thread, but

I have stopped looking strictly at the GPU temps and FPS, and now, when I can remember to, I look at the usage %, the MHz, and the volts.

Now, due to airflow, I have plugged my HDMI cable into the bottom GPU2 [PCI-E lane 2] and it is the one pulling the load. With that being said, I am seeing nothing but idle on the other GPU1 [PCI-E lane 1]: 468C, 0% load, 300MHz, 0.984V, 1.000V

Stock BIOS

Epic graphics settings on Ark - 1080p

Mobo BIOS - PCI-E lanes set manually to Gen 3.

ULPS disabled

GPU2 [load-bearing] is at 70*C, 92%, 946MHz, 1.125V, 1.00V

So... I have the latest Crimson drivers; do I have to select my own CFX profile pre-game?

Also - GPU-Z says CFX is enabled on both cards.

Sorry again for the redundancy of this question.


----------



## TheLAWNOOB

If it has a CF profile, use it. If it doesn't, then it probably doesn't need special driver optimization for CF to work.


----------



## Starbomba

Well, after a forced downtime, I was able to fix the two things that have bothered me on my rig. First, the RAM issue on my mobo (there was a small piece of plastic between the socket and the CPU); once I took that off I could get 28 GB to work flawlessly. If only I had one more stick of WonderRAM...

Second, I was able to get my 290X on water, at long last. The Tri-X cooler is awesome as hell, but now only 2 Gentle Typhoons at half speed keep everything under 55c, and the noise is... wait, what noise? My pumps are noisier than the fans!









Will see if i can tidy up the rest of the case then take some pics this weekend.


----------



## spyshagg

Talking about watercooling: starting today I have two full-cover waterblocks needing a new owner, an Aquacomputer for the 290X ref and an EK for the 290X DCU II.

They served me greatly.

And now, one month to see what the AIB RX 480s can do, or if I will need to spend $100 on a waterblock for the $230 card (sickening lol)


----------



## gordesky1

Wow, just wow.. I just got an email from MSI saying they don't have any replacements for the 290X Lightning and would I accept a refund of $141.00?

***>?

Do they even know what they're talking about? Even used, this card is worth way more than that...


----------



## TheLAWNOOB

Quote:


> Originally Posted by *gordesky1*
> 
> Wow, just wow.. I just got an email from MSI saying they don't have any replacements for the 290X Lightning and would I accept a refund of $141.00?
> 
> ***>?
> 
> Do they even know what they're talking about? Even used, this card is worth way more than that...












Ask them if the $141 was a typo or not.

Ask them if they meant $441.


----------



## gordesky1

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Ask them if the $141 was a typo or not.
> 
> Ask them if they meant $441.


Pretty much just messaged them back and asked if that was a typo lol

I mean, hell, a 7950-7970 goes for almost that from looking on eBay....

First it was the 390 non-X, and now they went lower....

I know they have the 390X in stock... heck, I don't see why they won't send me the 480 when it releases; the 8GB model seems like it's going to be like 220-240... I did ask them about that.

Before that message I was told to contact TomH, which I did; I also left a message on his voicemail, and right after that, like 10 mins later, Grace Liao sent me the $141.00 message....


----------



## TheReciever

$141... take the card back and try to bake it or something.

I got my card for $215 and I wouldn't take $141 for it still...


----------



## gordesky1

Quote:


> Originally Posted by *TheReciever*
> 
> 141...take the card back and try to bake it or something.
> 
> I got my card for 215 and I wouldnt take 141 for it still...


Yep, that's pretty much what's been going through my mind: if they refuse to give an equal replacement card for it, I'll tell them to send it back to me at their expense and I'll try baking it. I will not accept that lowball offer...

Really not sure why it died.. Like, it would boot and show everything right, but when you tried to install any drivers it would black screen. And on the next boot it would black screen after the Windows loading screen.. Tried it in another PC and also on a fresh install of Windows, with the same issue..

Even tried reflashing the BIOS.. And no damage on the card, which had me puzzled.. So it could be a bad solder joint, which baking it might fix.. But since it was under warranty, I thought MSI would treat me well and replace it with the same card or an equal replacement.. But so far it's not looking good.

Just sucks having this big fancy Lightning box and certificate and no Lightning...


----------



## TheLAWNOOB

Kyle on HardOCP is a dick :/

All I posted was "Loading this page is a waste of my bandwidth", and he shadow banned me :\

#Dying


----------



## Talon720

So I'm at the same crossroads a lot of you have faced. My recent custom-watercooled 290X tri-fire / Z87 Formula VI / 4770K died... well, the motherboard died. So I found a good price on a 5820K; Silicon Lottery had a sale on an overclocked one. I just bought a CaseLabs S8 to replace my modded 540 Air, and to fit more radiators into. I was just deciding on what X99 board to transfer my cards into (Deluxe II or Rampage Edition 10; I like the PCIe spacing and M.2 location) when I started to have second thoughts. I started thinking that my backup small Ncase M1 with a Z97 Impact could be almost as fast as this tri-fire build with a newer single GPU. I do just enjoy the building part of it, but I could literally buy a new G-Sync or FreeSync monitor, new GPU, M.2 NVMe SSD, and some nice new speakers instead. I do love my 290Xs, but we all know the practicality of more than one.


----------



## paresser

Who has GV-R929D5-4GD-B (r2 290 ref from giga)?


----------



## TheLAWNOOB

Quote:


> Originally Posted by *paresser*
> 
> Who has GV-R929D5-4GD-B (r2 290 ref from giga)?


Need help with BIOS?

All ref cards have the same BIOS.


----------



## Newbie2009

TechPowerUp VGA BIOS database

https://www.techpowerup.com/vgabios/?architecture=AMD&manufacturer=&model=R9+290&interface=&memType=&memSize=
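For anyone grabbing a reference ROM from that database, the usual way to apply it is the DOS/Windows ATIFlash command-line tool. A minimal sketch, assuming the downloaded file is named `ref290.rom` and the card is adapter 0 (`-i` will tell you which index is which) — flashing the wrong ROM can brick a card, so save a backup first:

```shell
:: List adapters so you know which index your 290 is
atiflash -i

:: Save the current BIOS before touching anything
atiflash -s 0 backup290.rom

:: Program the reference ROM onto adapter 0
:: (some builds need -f to force past the SSID check on an
:: otherwise-correct ROM; use it only if you know why)
atiflash -p 0 ref290.rom
```

Reboot after flashing; on dual-BIOS cards, flash with the BIOS switch in the position you booted from so the other position stays as a known-good fallback. Filenames above are placeholders for illustration.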


----------



## Newbie2009

Quote:


> Originally Posted by *mus1mus*
> 
> Don't make me pull such stuff for you to be in the know.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1. I don't have a 390/X
> 2. I don't run stock
> 3. 390 Memory is making a ton of difference
> 4. 300 series cards benefit from driver support better than 200 series cards.
> 
> I can go on, you know.
> 
> My 1517MHz 290X
> 
> vs
> 
> Fyzzz's 1300MHz 390
> 
> http://www.3dmark.com/3dm11/11315810
> 
> http://www.3dmark.com/3dm11/11329448


I always thought performance was near identical too.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Have there been any tests on this?
> 
> Speaking of clocking high. What about the new 8Gbps chips on RX480 and 1070? Who is making them?


I think I saw a mention of Samsung somewhere.


----------



## gordesky1

They got back to me, Grace did... She said this: "Refund amount of $141.00 is correct. Depreciation value is factored by remaining warranty.
Refund is being offered because there are currently no same-model or swap replacements available in stock to offer.

Sorry for the limited option."

Now what should I say? Thinking about telling them to send the card back to me.


----------



## diggiddi

What happened to the 390 option?


----------



## TheLAWNOOB

Quote:


> Originally Posted by *gordesky1*
> 
> they got back to me grace... she said this.. Refund amount of $141.00 is correct. Depreciation value is factored by remaining warranty.
> Refund is being offered because there are currently no same model or swap replacements available in stock to offer.
> 
> Sorry for the limited option.
> 
> now what should i say? thinking about telling them to send the card back to me.


Use colorful words to tell them off, and never buy MSI again.


----------



## gordesky1

Quote:


> Originally Posted by *diggiddi*
> 
> What happened to the 390 option?


That's what I'm trying to figure out... When they brought it up on the 17th, I did ask: couldn't I get a 390X instead? And that's when I was asked to wait for the RMA team to contact me, then to contact TomH, who so far hasn't even returned my call... A couple hours later I get the $141 refund offer...


----------



## Therandomness

First post... Can I join the owner's club?


----------



## gordesky1

Welp, just got off the phone with them, and I asked if they could send me a 390X as a replacement, or even the 480 when it comes out. They said they can only send out a replacement of almost matching performance, which they said is the 390, and since I refused the 390 Gaming 8GB, they can only do the refund of $141. So I told them the 290X Lightning is a premium card and I would like a 390X as a replacement. He put me on hold and went to talk to Grace, I guess she's in charge. Then they came back and asked if I have an invoice I can send them, and she will see if she can get me the 390X...

And since I don't have one, I pretty much asked them about the 390 Gaming 8GB they offered first. He put me on hold, got back, and said they can do that and have it ship right out to me...

Pretty much I had no choice and had to take that offer, and I think it's better than getting the $141, or even asking for the card back and taking my chances baking it.. Even then, if the previous owner gave me the papers, I think they still could refuse it because my name is not on them, so I'm not the first owner, and they probably would have given me trouble over that...

Damn, it sucks, but I guess I'm not losing much because I only paid $280 for it.


----------



## diggiddi

That's not right. The refund is based on the serial, not the invoice. See if you can speak to Grace directly.


----------



## Starbomba

If I were you, I'd take the 390 and sell it. The 390 is at ~$350 on Amazon, so you could sell it for $280 to recoup losses and sell it fast, and get rid of MSI altogether. I've never liked that brand too much, and their RMA process is even more of a pain than other manufacturers'. For AMD I stick to Sapphire and some XFX models.

Asus is good, but has had some flukes with the 290 and 390 series, like not adapting their coolers to the card itself (like the 290X DirectCU II and 390X STRIX fiasco, where not all heatpipes made contact with the core).


----------



## gordesky1

Quote:


> Originally Posted by *diggiddi*
> 
> That's not right, Refund is based on serial not invoice. See if you can speak to Grace directly


Well, the thing is, on their warranty page there's nothing that says it's based on the serial, not the invoice.. So I think they can refuse it if they want, for not having the invoice.

They said if I had the invoice showing the price I paid, then they could work out getting me the 390X. If I told them I don't have an invoice, they might've refused the whole RMA altogether... and I might be left with nothing... If I had proof I bought it, trust me, I would fight harder, but without any, I don't think I can do much more..
Quote:


> Originally Posted by *Starbomba*
> 
> If i were you i'd take the 390 and sell it. The 390 is at ~$350 on Amazon, so you could sell it back for $280 to recoup losses and sell it fast, and get rid of MSI altogether. I've never liked that brand too much, and their RMA processes are even more of a pain than other manufacturers. For AMD i stick to Sapphire and some XFX models.
> 
> Asus is good, but has had some flukes with the 290 and 390 series, like not adapting their coolers to the card itself (like the 290X DirectCU II and 390X STRIX fiasco where not all heatpipes made contact with the core).


Yeah, pretty much, after they asked for the invoice, I asked them about their first offer of the 390, and that's when he got off again and went to ask Grace. He came back 5 min later and said she said OK; they have those in stock and it will be shipped to me. Really, I don't think I had any choice but to take the 390.


----------



## gordesky1

Oh, another thing: do you think they would be willing to send back the 290X if I pay shipping to have it shipped back too? Or do you think they wouldn't, since they're already sending a replacement?

Edit: just got back off the phone again, asking if I could receive the old card back as well. They first said, didn't you already accept the 390 Gaming 8GB? I told them yes, but I was seeing if I could get the 290X back also, so I can hang it on my wall (I would've tried baking it lol). He put me on hold, got back to me, and said the cards that can't be repaired get sent to the manufacturer, I guess AMD? So it would be a no..

Guess it was worth a try, and really I can't do much now, cause they did say I already accepted the 390. I guess it's better getting that card than $141, or even no card lol..


----------



## Paul17041993

Quote:


> Originally Posted by *diggiddi*
> 
> No need to panic as RMA is done by serial number and is from date of manufacture, not based on receipt otherwise they'd have asked you for that already


In AU, RMA's require the receipt, but many [online] stores automate their process, so you just select the order from the drop-down list, enter the serial and they'll verify the warranty etc and send you the RMA number.
Quote:


> Originally Posted by *TheLAWNOOB*
> 
> How likely is it for Hawaii BIOS editor to work with Polaris 10?


Probably won't, until the ROM addresses are sorted out, assuming the system is still similar.
Quote:


> Originally Posted by *Connolly*
> 
> Did anyone find a solution for the latest Crimson drivers conflicting with AB? Where the PC black screens at login.


Disabling the panel's startup on boot is the best way; for Win10 this is done via Task Manager. No idea how it affects game profiles, but you could always add an entry in the Task Scheduler to start the panel up after a certain delay.
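The Task Scheduler workaround can be registered from an elevated command prompt with `schtasks`. A sketch, assuming Afterburner lives in its default install path (adjust the path and delay to taste, and untick "Start with Windows" in AB's settings first so it no longer races the driver at logon):

```shell
:: Register a logon task that starts MSI Afterburner ~1 minute after
:: login, once the Crimson driver has finished initialising.
:: /delay takes mmmm:ss and is valid with /sc onlogon;
:: /rl highest runs it elevated so voltage control still works;
:: /f overwrites any existing task of the same name.
schtasks /create ^
  /tn "MSI Afterburner (delayed)" ^
  /tr "\"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe\"" ^
  /sc onlogon ^
  /delay 0001:00 ^
  /rl highest ^
  /f

:: Confirm the task was registered:
schtasks /query /tn "MSI Afterburner (delayed)"
```

The task name and the one-minute delay are arbitrary choices for illustration; if the black screen still occurs, lengthen the delay before assuming the conflict lies elsewhere.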
Quote:


> Originally Posted by *gordesky1*
> 
> they got back to me grace... she said this.. Refund amount of $141.00 is correct. Depreciation value is factored by remaining warranty.


In AU they can only use the price the card was purchased for at the original store, no depreciation, even after 3 years or more. Though I suppose your country's laws may differ...

Selling the 390 is still a good option if you wish; new ones still sell for upwards of 300 USD, with which you could then get one or two 480s.


----------



## gordesky1

Quote:


> Originally Posted by *Paul17041993*
> 
> In AU, RMA's require the receipt, but many [online] stores automate their process, so you just select the order from the drop-down list, enter the serial and they'll verify the warranty etc and send you the RMA number.
> Probably wont until the ROM addresses are sorted out, assuming the system is still similar.
> Disable the panel starting up on boot is the best way, for win10 this is done via taskmanager. No idea how it affects game profiles though, but you could always add an entry in the task scheduler to start the panel up after a certain delay time.
> In AU they can only use the price the card was purchased for at the original store, no depreciation, even after 3 years or more. Though I suppose your country's laws may differ...
> 
> Selling the 390 is still a good option if you wish, new ones still sell upwards of 300USD, of which you could then get one or two 480's.


Yeah, though I think it's going to be a refurb, cause that's what I heard they give.. And yeah, it might be different, cause they wouldn't budge on giving me a 390X at all.. Then, when he came back from talking to the Grace rep (I guess she's the GM), he said they could work out getting me a 390X if I could send them an invoice showing the price I paid... That's when I had to tell them to just send me the 390 Gaming 8GB like they said the first time, and they said that would be no problem and they will ship it out to me soon.

If I had an invoice, I probably could've fought it more, even though I'm not sure if they could still deny it if they saw a different name on the invoice. Really, they could've asked for an invoice even for the 390, but they didn't.

----------



## Starbomba

I loved my 290X with the default BIOS; I love it even more now with the modded one. I cannot remember why I dared trade my old 290 for a GTX 780. Must've been drunk when I did that.


----------



## TheLAWNOOB

My Ref R9 290 had a "pop" and black screened. Can't get it to display anything anymore. No physical damage.

Should I sell it on ebay as broken?

I already tried everything.


----------



## Paul17041993

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> My Ref R9 290 had a "pop" and black screened. Can't get it to display anything anymore. No physical damage.
> 
> Should I sell it on ebay as broken?
> 
> I already tried everything.


If there's no option for RMA'ing, I'd just mount it on a wall.

If you wanted though you could try to find the popped component and replace it, but there's no guarantee that it'll work again.


----------



## mus1mus

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> My Ref R9 290 had a "pop" and black screened. Can't get it to display anything anymore. No physical damage.
> 
> Should I sell it on ebay as broken?
> 
> I already tried everything.


What exactly happened?

Quote:


> Originally Posted by *Starbomba*
> 
> I loved my 290X with the default BIOS, i love it more even now with the modded one. I cannot remember why i dared change my old 290 for a GTX 780. Must've been *drunk* when i did that


You must be.









Even a 1500/1900 GTX780 has nothing compared to a 1250/1500 290. Let alone a 290X.


----------



## gordesky1

Hmm, the seller I bought the 290X Lightning off of got back to me with the Newegg invoice... Wonder if I can tell MSI I have the invoice now, since they asked for it, to see if they can get me the 390X? Or is it too late? Wish I had it like 12 hours ago lol... It does have 2 names on it though, bill-to and ship-to...

Might be too late, since I did already accept the 390..

Edit: sent them the invoice that just shows the invoice number, order number, tracking, date, and the card price, which was $389. Didn't bother sending them the other invoice yet.

All they told me on the phone, when they asked if I have an invoice, was to send them the one showing the price I paid and the date. At the time I didn't have it, so I just asked them to send me the 390 instead, which they agreed to.

But they did say if I had an invoice they would work with me to get the 390X.

Edit: pretty much too late now... I just figured I'd check the RMA status to see if it was ever updated, and I noticed a tracking number at the bottom, which I thought was just the tracking I gave them at first.. Nope, it's a new one, and it says it shipped at 7pm yesterday and I will get it on the 28th... Guess they figured they'd ship it fast in case I changed my mind or found the invoice... lol..


----------



## battleaxe

Quote:


> Originally Posted by *gordesky1*
> 
> Hmm, the seller I bought the 290X Lightning from got back to me with the Newegg invoice... wonder if I can tell MSI I have the invoice now, since they asked for it, and see if they can get me the 390X? Or is it too late? Wish I had it 12 hours ago lol... it does have two names on it though, bill-to and ship-to...
> 
> Might be too late since I already accepted the 390...
> 
> Edit: sent them the invoice that just shows the invoice number, order number, tracking, date, and the card price, which was $389. Didn't bother sending them the other invoice yet.
> 
> All they told me to do on the phone was send the invoice showing the price I paid and the date. When they asked if I had an invoice, which at the time I didn't, I just asked if they could send me the 390 instead, and they said yes.
> 
> But they did say that if I had an invoice they would work with me to get the 390X.
> 
> Edit: pretty much too late now... I figured I'd check the RMA status to see if it was ever updated, and I noticed a tracking number at the bottom, which at first I thought was just the tracking I gave them. Nope, it's a new one, and it says it shipped at 7 PM yesterday and I'll get it on the 28th... guess they shipped it fast in case I changed my mind or found the invoice... lol.


Just forget it. Sell that 390, put it towards a new RX 480, and be happy.


----------



## Paul17041993

Quote:


> Originally Posted by *gordesky1*
> 
> Hmm, the seller I bought the 290X Lightning from got back to me with the Newegg invoice... wonder if I can tell MSI I have the invoice now, since they asked for it, and see if they can get me the 390X? Or is it too late? Wish I had it 12 hours ago lol... it does have two names on it though, bill-to and ship-to...
> 
> Might be too late since I already accepted the 390...
> 
> Edit: sent them the invoice that just shows the invoice number, order number, tracking, date, and the card price, which was $389. Didn't bother sending them the other invoice yet.
> 
> All they told me to do on the phone was send the invoice showing the price I paid and the date. When they asked if I had an invoice, which at the time I didn't, I just asked if they could send me the 390 instead, and they said yes.
> 
> But they did say that if I had an invoice they would work with me to get the 390X.
> 
> Edit: pretty much too late now... I figured I'd check the RMA status to see if it was ever updated, and I noticed a tracking number at the bottom, which at first I thought was just the tracking I gave them. Nope, it's a new one, and it says it shipped at 7 PM yesterday and I'll get it on the 28th... guess they shipped it fast in case I changed my mind or found the invoice... lol.


Probably not worth the effort, as the cost of shipping the 390 back will probably be about the same as the extra value of the 390X...


----------



## gordesky1

Yep, just taking the 390 and seeing how it does; if I don't like it I'll just sell it. I know one thing: my next card will not be an MSI... I also had a Gigabyte GTX 570 that died because of the VRMs back in 2014. I RMA'd it and they sent another GTX 570 back to me in about a week, and that card was going on 3 years old at the time and they still had them in stock... while MSI took more than 2 weeks, didn't have any of the old cards in stock, and then, worse, tried offering only $141. That's when the ball dropped for me...

Last message they sent:

"Sorry, after you agreed to accept the swap model yesterday, a replacement R9 390 GAMING 8G was already processed and shipped. RMA now closed."
They probably hurried and shipped it just in case I found the invoice...

Pretty much I just wish they'd get real benchmarks up for the 480 so I can decide what to do... it's just hard to believe the 480 is going to be that good at its price point of $220 for the 8 GB... The stuff that's going around does look good though...

What sucks is that the price of the 300 series will probably drop even more by the time I get mine. I'm getting it on the 28th, and from what the net says the 480 comes out on the 29th...


----------



## mus1mus

Prices have been going crazy lately from both camps. Imagine a GTX 980 Ti for around $300-400 used. They were what, $700-ish?

Some even claimed $250. Blasphemous for the performance it offers.


----------



## gordesky1

Yep, seeing them in the $300s and low $400s on eBay, and even seeing regular 980s in the $200s lol...


----------



## mus1mus

980s should not be considered. lol

The Tis are still "the game", TBH.


----------



## Starbomba

What I want is a Fury Pro Duo for $400, or a 295X2 for $200.


----------



## mus1mus

Fury is problematic, and a 295X2 is less than two 290s when it comes to overclocking.

And yeah, you are asking too much.


----------



## Paul17041993

The 480 is going to be a beast for 1080p and probably 1440p, though for 4K it may struggle unless you have multiple of them.

I'm likely going to get 2 or 3 at some point, unless the 480X or 490/490X pops up soon, and use them to experiment with interlaced/stride frame rendering in Vulkan...


----------



## Starbomba

I will wait for the 480 reviews, if only to get a glimpse of the big boss Vega. I miiiiiight get a 4 GB 470, since they should be really cheap. I really hope there is a low-profile version of this; it would make for one of the most powerful low-power HTPC cards. If not, it's still a Polaris chip on the cheap.
Quote:


> Originally Posted by *mus1mus*
> 
> Fury is problematic. 295X2 is less than 2 290s when it come to overclocking.
> 
> And yeah, you are asking too much.


Eh, I want them for BOINC. I don't really think I can get them under a real water loop, but on compute power alone they should be beasts in their own right.


----------



## Paul17041993

Interesting thing with GCN4's efficiency: Vega may have as many as 5,120 + 80 shaders. GPUs are getting pretty ridiculous...

Also, there are now methods to achieve near-linear scaling across multiple GPUs...


----------



## Shweller

Hello everyone,

I was wondering if someone could give me some advice. Long story short: I'm having black-screen issues with my 290X any time I raise the voltage via MSI AB. The card will still be rendering in the background, but the screen will be black. I ran two cables to the card, one for the 6-pin and one for the 8-pin connector, thinking one cable for both wasn't enough juice.

This run was made looking at a black screen: stock core/mem clocks, voltage raised to max, latest drivers.

http://www.3dmark.com/fs/9010618

This is what the card used to do with no black screens:

http://www.3dmark.com/3dm/10487940

Not sure if this is driver-related, card-related, or what. If I run the benchmark at stock voltage I have no issues. Please advise.

Edit: I am monitoring voltage via the OSD. The card gets a constant 1.227 V, and VRM and core temps all stay under 48°C since the card is under water.


----------



## mus1mus

How much voltage do you have to add before it black-screens?

Are you doing the MSI AB voltage hack to get past +100 mV?

The issue is common on these cards when you push the voltage. Some cards do it earlier than others; it's just luck. The only real remedy is a hard mod:
http://www.hwbox.gr/hardware-mods/33792-amd-radeon-r9-290x-vmods.html

We're talking about the 0.95 V rail mods.

If you have two cards, use the one that doesn't black-screen as the primary when running CrossFire. That way you won't lose the signal.


----------



## Paul17041993

Quote:


> Originally Posted by *Shweller*
> 
> Hello everyone,
> 
> I was wondering if someone could give me some advice. Long story short: I'm having black-screen issues with my 290X any time I raise the voltage via MSI AB. The card will still be rendering in the background, but the screen will be black. I ran two cables to the card, one for the 6-pin and one for the 8-pin connector, thinking one cable for both wasn't enough juice.
> 
> This run was made looking at a black screen: stock core/mem clocks, voltage raised to max, latest drivers.
> 
> http://www.3dmark.com/fs/9010618
> 
> This is what the card used to do with no black screens:
> 
> http://www.3dmark.com/3dm/10487940
> 
> Not sure if this is driver-related, card-related, or what. If I run the benchmark at stock voltage I have no issues. Please advise.
> 
> Edit: I am monitoring voltage via the OSD. The card gets a constant 1.227 V, and VRM and core temps all stay under 48°C since the card is under water.


How many watts/amps are going through the VRMs? The common cause of black-screening when overvolting is simply exceeding the card's full TDP, which is very easy to do most of the time. A possible fix is to flash a patched BIOS that overrides the TDP limit, but that runs the risk of destroying the card.


----------



## mus1mus

errmmm..

I have been doing my crazy OCs on unmodded BIOSes, sans voltage limit, and they don't, again, THEY DON'T LIMIT my OCs.









You have 50% more power limit to push when overclocking, you should know.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> errmmm..
> 
> I have been doing my crazy OCs on unmodded BIOSes, sans voltage limit, and they don't, again, THEY DON'T LIMIT my OCs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You have 50% more power limit to push when overclocking, you should know.


Agreed. I never changed the TDP/power limit in the BIOS on my 290X, and it clocked to 1300+ pretty easily.


----------



## Shweller

Quote:


> Originally Posted by *mus1mus*
> 
> How much voltage do you have to add before it black-screens?
> 
> Are you doing the MSI AB voltage hack to get past +100 mV?
> 
> The issue is common on these cards when you push the voltage. Some cards do it earlier than others; it's just luck. The only real remedy is a hard mod:
> http://www.hwbox.gr/hardware-mods/33792-amd-radeon-r9-290x-vmods.html
> 
> We're talking about the 0.95 V rail mods.
> 
> If you have two cards, use the one that doesn't black-screen as the primary when running CrossFire. That way you won't lose the signal.


It seems anything above +30 mV starts giving me intermittent black screens. I've resorted to just unplugging my second card, because most scaling is horrible and I run into too many problems to mess with it all the time. I don't normally overclock on a day-to-day basis. I remember these cards overvolting way better just 6 months ago.

Quote:


> Originally Posted by *Paul17041993*
> 
> How many watts/amps are going through the VRMs? The common cause of black-screening when overvolting is simply exceeding the card's full TDP, which is very easy to do most of the time. A possible fix is to flash a patched BIOS that overrides the TDP limit, but that runs the risk of destroying the card.


Not sure what wattage is being pushed through, but everything is staying well within design in regard to temperatures. Not sure what the TDP limits are on this card...

Quote:


> Originally Posted by *mus1mus*
> 
> errmmm..
> 
> I have been doing my crazy OCs on unmodded BIOSes, sans voltage limit, and they don't, again, THEY DON'T LIMIT my OCs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You have 50% more power limit to push when overclocking, you should know.


Agreed, I used to be able to turn up the voltage with no issues too. The most I've ever gotten is 1225 MHz on the core with very little artifacting.

Quote:


> Originally Posted by *Vellinious*
> 
> Agreed. I never changed the TDP/power limit in the BIOS on my 290X, and it clocked to 1300+ pretty easily.


X2

This is a bench I just ran with the black screen coming in and out. No crashes; it was still rendering while the screen was black.

http://www.3dmark.com/3dm/12776795



My understanding is that these cards can be overvolted to 1.4-1.5 V under water...


----------



## Shweller

Just did this run with +0 mV and the power limit set to +50%: no black screens, core clock at 1050 MHz.

http://www.3dmark.com/3dm/12777114

Not sure if AB is causing any issues. I do remember there was a 1.3 V soft mod you could do in Afterburner, but I'm not sure it works on AMD cards; I used to do it on my GTX 770s...


----------



## mus1mus

When did you last repaste the cards?

Would you be willing to check the VRM temps?

If temps are fine and the paste is fresh, do a DDU and reinstall your driver.

I have two cards that do that when I add +300-ish mV, and one that doesn't blink until around +381 mV. I put that one in as the primary and didn't have to worry about black screens.


----------



## Shweller

Quote:


> Originally Posted by *mus1mus*
> 
> When did you last repaste the cards?
> 
> Would you be willing to check the VRM temps?
> 
> If temps are fine and the paste is fresh, do a DDU and reinstall your driver.
> 
> I have two cards that do that when I add +300-ish mV, and one that doesn't blink until around +381 mV. I put that one in as the primary and didn't have to worry about black screens.


It's hard to tell from my picture above, but the VRMs don't go past 48°C. I just repasted about 3 months ago. I did revert to the previous AMD driver, since I seemed to have fewer issues with it, and I ran DDU in Safe Mode before reinstalling. What program do you use to add 300 mV? I'd like to be able to go to 1.3 V to bench, maybe even 1.4 V. I'm also willing to run a custom BIOS, since these cards are now out of warranty.


----------



## rdr09

Quote:


> Originally Posted by *Shweller*
> 
> Just did this run with +0 mV and the power limit set to +50%: no black screens, core clock at 1050 MHz.
> 
> http://www.3dmark.com/3dm/12777114
> 
> Not sure if AB is causing any issues. I do remember there was a 1.3 V soft mod you could do in Afterburner, but I'm not sure it works on AMD cards; I used to do it on my GTX 770s...


So, you are using AB to OC? If so, did you accept AMD OverDrive in Crimson? If you did, that might be the issue. Ever since Crimson, I've noticed that when I use a 3rd-party app, OverDrive follows the OC settings, but not all of them. Windows may be following OverDrive; not sure.

I accidentally accepted OverDrive on my two 290s.


----------



## mus1mus

Quote:


> Originally Posted by *Shweller*
> 
> It's hard to tell from my picture above, but the VRMs don't go past 48°C. I just repasted about 3 months ago. I did revert to the previous AMD driver, since I seemed to have fewer issues with it, and I ran DDU in Safe Mode before reinstalling. What program do you use to add 300 mV? I'd like to be able to go to 1.3 V to bench, maybe even 1.4 V. I'm also willing to run a custom BIOS, since these cards are now out of warranty.


Sapphire TriXX allows +200.
Grab HIS iTurbo for +400 here:

Be careful.

If you can't get past a certain voltage level, it means the ROM is voltage-locked. We can unlock that for you. Head over to the Hawaii BIOS Editing thread.


----------



## Paul17041993

Quote:


> Originally Posted by *mus1mus*
> 
> errmmm..
> 
> I have been doing my crazy OC's on unmodded BIOSes sans Voltage limit and they don't, again, THEY DON'T LIMIT my OCs.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> You have 50% Power Limit more to push when overclocking, you should know.


At +200 mV you'll probably black-screen regardless of the die lottery.

The power limit also has no effect on the card's TDP; it only sets a peak below it (+50% is right at the card's ~375 W limit).
Quote:


> Originally Posted by *mus1mus*
> 
> I have two cards that do that when I add +300-ish mV, and one that doesn't blink until around +381 mV. I put that one in as the primary and didn't have to worry about black screens.


1.5-1.6 V...? I would assume VRM-modded cards with sub-zero cooling...


----------



## mus1mus

Quote:


> Originally Posted by *Paul17041993*
> 
> At +200 mV you'll probably black-screen regardless of the die lottery.


Not true. There are cards that can hold their signal up to +300. My reference 290 does, and my newer reference 290X from VTX3D holds up to +350.
Quote:


> The power limit also has no effect on the card's TDP; it only sets a peak below it (+50% is right at the card's ~375 W limit).


The TDP limit doesn't seem to actually work on these cards, especially reference ones.
Quote:


> 1.5-1.6 V...? I would assume VRM-modded cards with sub-zero cooling...


Water.

Another notable spec: ASIC quality.

The higher the chip's ASIC quality, the more power-hungry it is, even with lower VIDs.


----------



## Paul17041993

Quote:


> Originally Posted by *mus1mus*
> 
> The higher the chip's ASIC quality, the more power-hungry it is, even with lower VIDs.


Except at 1.6 V the die will melt off the card unless you keep it below -150°C, assuming the caps and VRMs don't rupture first. I can only assume your stock voltage is a lot lower than mine and +350 on yours actually results in 1.4 V...

+100 puts mine a little above 1.3 V and 300 W at 1100/1600; any higher in voltage results in black screens from hitting the 375 W limit.
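For anyone wondering why a small voltage bump eats the headroom so fast, here is a rough back-of-the-envelope sketch using the classic dynamic-power scaling P ∝ f·V². It takes the ~300 W at 1100 MHz / 1.3 V figure above as the baseline; this simple model ignores leakage and VRM losses, so treat the numbers as ballpark only:

```python
import math

def estimated_power(p_base, f_base, v_base, f, v):
    """Scale a baseline board power to a new clock/voltage point (P ~ f * V^2)."""
    return p_base * (f / f_base) * (v / v_base) ** 2

# Baseline from the post above: ~300 W at 1100 MHz core and ~1.3 V.
P0, F0, V0 = 300.0, 1100.0, 1.3

# Same clock, +100 mV more: roughly how much board power?  (~348 W)
print(f"1100 MHz @ 1.40 V -> ~{estimated_power(P0, F0, V0, 1100.0, 1.40):.0f} W")

# Voltage at which the model crosses the ~375 W connector limit at 1100 MHz (~1.45 V).
v_cross = V0 * math.sqrt(375.0 / P0)
print(f"model crosses 375 W near {v_cross:.2f} V")
```

By this crude model a +100 mV bump alone burns most of the margin to 375 W, which fits the reports in this thread of cards black-screening somewhere in the mid-1.4 V range on stock power delivery.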


----------



## rdr09

Quote:


> Originally Posted by *Paul17041993*
> 
> At +200 mV you'll probably black-screen regardless of the die lottery.
> 
> The power limit also has no effect on the card's TDP; it only sets a peak below it (+50% is right at the card's ~375 W limit).
> 1.5-1.6 V...? I would assume VRM-modded cards with sub-zero cooling...


I only use TriXX to OC both my 290s. +200 doesn't give them any black screen at all. Higher clocks, though, do.

One can do 1300 benching no problem; it black-screens at 1330. The other can do 1300 with the same +200 voltage; never tried higher. Together they can do 1290. They are water-cooled, and ambients were really low when I was benching.

At +200, I saw it hit 1.41 V in GPU-Z.


----------



## AliNT77

Can someone please send me a link to the HIS iTurbo software?

It's nowhere on the internet :|


----------



## Vellinious

Quote:


> Originally Posted by *AliNT77*
> 
> Can someone please send me a link to the HIS iTurbo software?
> 
> It's nowhere on the internet :|


https://drive.google.com/open?id=0B6zqzZ0qTCB5TkRDNWgwcVd1Z1k


----------



## AliNT77

Quote:


> Originally Posted by *Vellinious*
> 
> https://drive.google.com/open?id=0B6zqzZ0qTCB5TkRDNWgwcVd1Z1k


TY Rep+









Scored 14731 in Fire Strike at +300 mV, 1225/1725 (1250 timings), with my reference 290 with an AIO and mid-plate.
How is it?


----------



## Vellinious

Quote:


> Originally Posted by *AliNT77*
> 
> TY Rep+
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Scored 14731 in Fire Strike at +300 mV, 1225/1725 (1250 timings), with my reference 290 with an AIO and mid-plate.
> How is it?


Link?


----------



## AliNT77

Quote:


> Originally Posted by *Vellinious*
> 
> Link?


Saved it:

1225-1725.zip 88k .zip file


----------



## mus1mus

Quote:


> Originally Posted by *Paul17041993*
> 
> Except at 1.6 V the die will melt off the card unless you keep it below -150°C, assuming the caps and VRMs don't rupture first. I can only assume your stock voltage is a lot lower than mine and +350 on yours actually results in 1.4 V...
> 
> +100 puts mine a little above 1.3 V and 300 W at 1100/1600; any higher in voltage results in black screens from hitting the 375 W limit.


Stop assuming things.









The die will not melt; it will crack itself with too much voltage when not cooled.

You don't need sub-zero cooling to tame these chips.

Caps and VRMs won't be damaged unless you mess with their switching frequency.

+381 results in 1.6 V under load on 2 of my cards! No load, higher.

My 290 has a VID of 1.18; the 290Xs are about 1.25 each.

You need to classify these black screens. A momentary loss of signal is due to high voltages and clocks (the 1300 MHz region, usually).

If your card loses the signal due to instability, a lock-up with no recovery of the video signal, then it may be down to things other than voltage and hitting the limit. Maybe heat?

Do note that uneven thermal-paste coverage can do this, as can a mediocre PSU or high memory clocks.

We can help you if you're willing, you know.
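The VID-plus-offset numbers being thrown around here are easy to sanity-check. A quick sketch (it assumes the software offset simply stacks on top of the stock VID; real cards see vdroop under load, so the measured voltage can sit a little lower):

```python
# Sanity-check of the VID + offset arithmetic above. Assumes an
# Afterburner/TriXX-style mV offset adds directly on top of the stock VID.

def target_voltage(vid_v, offset_mv):
    """Stock VID (volts) plus a software mV offset, returned in volts."""
    return vid_v + offset_mv / 1000.0

# 290X with a 1.25 V VID at +381 mV lands near the 1.6 V quoted above.
print(target_voltage(1.25, 381))   # ~1.63 V

# 290 with a 1.18 V VID at +300 mV.
print(target_voltage(1.18, 300))   # ~1.48 V
```

That puts the quoted figures in the right ballpark: 1.25 V + 381 mV is ~1.63 V before droop, consistent with "1.6 V under load, higher at no load".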


----------



## Vellinious

Quote:


> Originally Posted by *AliNT77*
> 
> saved it :
> 
> 1225-1725.zip 88k .zip file


Yeah, that's not bad.


----------



## AliNT77

Quote:


> Originally Posted by *Vellinious*
> 
> Yeah, that's not bad.


What about this one?
Again +300 mV, 1225/1725 (1250 timings).


Unfortunately, its core doesn't overclock well.








It can do just 1085 with stock voltage.


----------



## Shweller

Quote:


> Originally Posted by *mus1mus*
> 
> Sapphire TriXX allows +200.
> Grab HIS iTurbo for +400 here:
> 
> Be careful.
> 
> If you can't get past a certain voltage level, it means the ROM is voltage-locked. We can unlock that for you. Head over to the Hawaii BIOS Editing thread.


Thank you for the helpful links. So what I'm getting from everyone's replies is that the card will black-screen regardless once it gets past a certain voltage. I'm very open to a modded BIOS; I was recently reading a thread about modded 390X BIOSes on the 290X. Do I need to post my BIOS for someone to mod, or is there a ROM already out there for the reference ASUS 290X cards?


----------



## mus1mus

Quote:


> Originally Posted by *Shweller*
> 
> Thank you for the helpful links. So what I'm getting from everyone's replies is that the card will black-screen regardless once it gets past a certain voltage. I'm very open to a modded BIOS; I was recently reading a thread about modded 390X BIOSes on the 290X. Do I need to post my BIOS for someone to mod, or is there a ROM already out there for the reference ASUS 290X cards?


You can try the latest here:
http://www.overclock.net/t/1564219/modded-r9-390x-bios-for-r9-290-290x-updated-02-16-2016/0_50

If that doesn't work well, post your BIOS here and I can mod things for you.


----------



## mirzet1976

With the new driver I see a boost in graphics score, 14,624 vs 15,555, +6.4%: http://www.3dmark.com/compare/fs/8807484/fs/9030332
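For anyone checking the math on that claim, the gain is just (new - old) / old; a quick sketch:

```python
# Percent gain between two benchmark graphics scores: (new - old) / old * 100.

def percent_gain(old, new):
    return (new - old) / old * 100.0

# Scores from the post above: 14,624 before vs 15,555 after the driver update.
print(f"{percent_gain(14624, 15555):.1f}%")  # -> 6.4%
```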


----------



## mus1mus

wut?

That's huge!


----------



## Vellinious

Makes me wonder what I could get out of that 290X with these new drivers. I was already pulling just a shade short of a 16k graphics score... tempting.


----------



## Paul17041993

Quote:


> Originally Posted by *mus1mus*
> 
> Stop assuming things.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> The die will not melt; it will crack itself with too much voltage when not cooled.
> 
> You don't need sub-zero cooling to tame these chips.
> 
> Caps and VRMs won't be damaged unless you mess with their switching frequency.
> 
> +381 results in 1.6 V under load on 2 of my cards! No load, higher.
> 
> My 290 has a VID of 1.18; the 290Xs are about 1.25 each.
> 
> You need to classify these black screens. A momentary loss of signal is due to high voltages and clocks (the 1300 MHz region, usually).
> 
> If your card loses the signal due to instability, a lock-up with no recovery of the video signal, then it may be down to things other than voltage and hitting the limit. Maybe heat?
> 
> Do note that uneven thermal-paste coverage can do this, as can a mediocre PSU or high memory clocks.
> 
> We can help you if you're willing, you know.


True, the silicon would probably rupture before the solder could get hot enough, but I'm still very skeptical of your cards running at 1.6 V and over. Do your cards have custom VRMs that can handle 400 W and over?


----------



## mus1mus

I'm not preaching over 1.4 V, FYI.

To get it straight: as long as you cool these chips, you can pump voltage into them.

Heck, I pushed 1.6 V through a GTX 780 just days ago on water. Temps didn't even break 40°C.
http://hwbot.org/submission/3241985_mus1mus_catzilla___720p_geforce_gtx_780_32261_marks

And yep, AMD reference VRMs are the best you can find today.







You just need to cool them properly.


----------



## Shweller

Quote:


> Originally Posted by *mirzet1976*
> 
> With new driver I see a boost in graphics score 14 624 vs 15 555 + 6.4 % http://www.3dmark.com/compare/fs/8807484/fs/9030332


16.6.2 Drivers?


----------



## melodystyle2003

Guys
Quote:


> Originally Posted by *Shweller*
> 
> 16.6.2 Drivers?


16.200.1035.0 is the 16.6.2 driver.


----------



## mirzet1976

Quote:


> Originally Posted by *Shweller*
> 
> 16.6.2 Drivers?


Yes.


----------



## mus1mus

I can't enable XFire though. Hmmm.


----------



## AliNT77

Close to 0% improvement with the new driver (16.6.2).


----------



## NaXter24R

Guys, do you have any experience with the Aquacomputer backplate for the 290X? I have an MSI Gaming OC; the PCB is reference-like, but with minor changes. I already have an XSPC waterblock on it, so I bought the Aquacomputer backplate, but it doesn't fit. I mean, the screws are fine and aligned, but on the power-connector side of the card, in the VRM zone, there are some cap or inductor solder points that sit a bit high and touch the metal; the cutout on the backplate is about 3 mm off. Any advice?


----------



## Paul17041993

Quote:


> Originally Posted by *NaXter24R*
> 
> Guys, do you have any experience with the Aquacomputer backplate for the 290X? I have an MSI Gaming OC; the PCB is reference-like, but with minor changes. I already have an XSPC waterblock on it, so I bought the Aquacomputer backplate, but it doesn't fit. I mean, the screws are fine and aligned, but on the power-connector side of the card, in the VRM zone, there are some cap or inductor solder points that sit a bit high and touch the metal; the cutout on the backplate is about 3 mm off. Any advice?


I wouldn't suggest using the backplate at all, as anything you try could kill the card. There is the option of machining the backplate in the needed areas, though; do you have at least a bench drill nearby for that?


----------



## NaXter24R

I might know someone with that. I'm listening


----------



## NaXter24R

Quote:


> Originally Posted by *Paul17041993*
> 
> I wouldn't suggest using the backplate at all, as anything you try could kill the card. There is the option of machining the backplate in the needed areas, though; do you have at least a bench drill nearby for that?


I might know someone with that, or at least something that could work as well...


----------



## joseph172g

Hello everyone, I am trying to overclock my MSI R9 290X Gaming 4 GB. I OC'd it all the way up to 1170 MHz core and 1350 MHz mem without artifacts. It gives a pretty nice boost in Witcher 3 (1080p, no HairWorks, ultra: 55-74 FPS), but there is a problem I can't solve. I just can't get it stable; it always ends up shutting down my GPU and losing signal to the monitor after a couple of minutes.

It seems like a voltage problem: to play at 1170 core I have to push it to +90 mV. But I can't even push +50 mV at 1130 MHz, because that also ends with losing signal. Please, how can I solve this problem and make it stable? Thanks in advance.


----------



## Paul17041993

Quote:


> Originally Posted by *NaXter24R*
> 
> I might know someone with that, or at least something that could work as well...


You'll have to somehow mark and machine down the areas of the block that make close contact with the PCB components/solder points, though I can't help you much there, as I don't have your particular card to look at and measure...
Quote:


> Originally Posted by *joseph172g*
> 
> Hello everyone, I am trying to overclock my MSI R9 290X Gaming 4 GB. I OC'd it all the way up to 1170 MHz core and 1350 MHz mem without artifacts. It gives a pretty nice boost in Witcher 3 (1080p, no HairWorks, ultra: 55-74 FPS), but there is a problem I can't solve. I just can't get it stable; it always ends up shutting down my GPU and losing signal to the monitor after a couple of minutes.
> 
> It seems like a voltage problem: to play at 1170 core I have to push it to +90 mV. But I can't even push +50 mV at 1130 MHz, because that also ends with losing signal. Please, how can I solve this problem and make it stable? Thanks in advance.


It's generally because you're making the GPU pull more power than the PCB allows by default, but there are BIOS patches that can help (with risks). Ideally, just bring the core clock and voltage down a bit; you could also probably get the memory to run at 1500 without much additional core voltage.

I'm sure other users here will post links for the BIOS-patch option.


----------



## joseph172g

Thank you... also, I would be happy if someone could give me a modded BIOS for my Hynix MSI R9 290X. I tried all the Hynix 390X ROMs from the other thread, but they all end in a black screen.


----------



## NaXter24R

Quote:


> Originally Posted by *Paul17041993*
> 
> You'll have to somehow mark and machine down the areas of the block that make close contact with the PCB components/solder points, though I can't help you much there, as I don't have your particular card to look at and measure...
> It's generally because you're making the GPU pull more power than the PCB allows by default, but there are BIOS patches that can help (with risks).


OK, I can post some pictures later so I can explain better where the problem is.
Quote:


> Originally Posted by *joseph172g*
> 
> Hello everyone, I am trying to overclock my MSI R9 290X Gaming 4 GB. I OC'd it all the way up to 1170 MHz core and 1350 MHz mem without artifacts. It gives a pretty nice boost in Witcher 3 (1080p, no HairWorks, ultra: 55-74 FPS), but there is a problem I can't solve. I just can't get it stable; it always ends up shutting down my GPU and losing signal to the monitor after a couple of minutes.
> 
> It seems like a voltage problem: to play at 1170 core I have to push it to +90 mV. But I can't even push +50 mV at 1130 MHz, because that also ends with losing signal. Please, how can I solve this problem and make it stable? Thanks in advance.


If you lose signal, it's not the core voltage; otherwise it would crash. It's the rail that delivers power to the video output. Have you tried increasing the auxiliary voltage just a bit? It should be 1.0 V by default; try 1.08 or so.


----------



## joseph172g

Quote:


> Originally Posted by *NaXter24R*
> 
> OK, I can post some pictures later so I can explain better where the problem is.
> 
> If you lose signal, it's not the core voltage; otherwise it would crash. It's the rail that delivers power to the video output. Have you tried increasing the auxiliary voltage just a bit? It should be 1.0 V by default; try 1.08 or so.


I will try adding just a little, because last time I added too much and it didn't help. Thanks.


----------



## kizwan

Quote:


> Originally Posted by *joseph172g*
> 
> Hello everyone, I am trying to overclock my MSI R9 290X Gaming 4 GB. I OC'd it all the way up to 1170 MHz core and 1350 MHz mem without artifacts. It gives a pretty nice boost in Witcher 3 (1080p, no HairWorks, ultra: 55-74 FPS), but there is a problem I can't solve. I just can't get it stable; it always ends up shutting down my GPU and losing signal to the monitor after a couple of minutes.
> 
> It seems like a voltage problem: to play at 1170 core I have to push it to +90 mV. But I can't even push +50 mV at 1130 MHz, because that also ends with losing signal. Please, how can I solve this problem and make it stable? Thanks in advance.


How can it be stable but not stable at the same time?









What you're experiencing there is a known issue with some Hawaii cards. Yours seems a bit on the low side (the voltage at which the black screen occurs). In case you missed @mus1mus's post a couple of pages back:
Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Shweller*
> 
> Hello everyone,
> 
> I was wondering if someone could give me some advice. Long story short: I'm having black-screen issues with my 290X any time I raise the voltage via MSI AB. The card will still be rendering in the background, but the screen will be black. I ran two cables to the card, one for the 6-pin and one for the 8-pin connector, thinking one cable for both wasn't enough juice.
> 
> This run was made looking at a black screen: stock core/mem clocks, voltage raised to max, latest drivers.
> 
> http://www.3dmark.com/fs/9010618
> 
> This is what the card used to do with no black screens:
> 
> http://www.3dmark.com/3dm/10487940
> 
> Not sure if this is driver-related, card-related, or what. If I run the benchmark at stock voltage I have no issues. Please advise.
> 
> Edit: I am monitoring voltage via the OSD. The card gets a constant 1.227 V, and VRM and core temps all stay under 48°C since the card is under water.
> 
> 
> 
> How much voltage is added til it Black Screens?
> 
> Are you doing the MSI AB Voltage hack to get the Voltage past +100mV ?
> 
> The issue is common on these cards when you push the Voltage. Some cards do that earlier than the others. It's just luck. The only remedy is a Hard Mod.
> http://www.hwbox.gr/hardware-mods/33792-amd-radeon-r9-290x-vmods.html
> 
> we're talking about the 0.95 Rail mods.
> 
> If you have 2 cards, use the one that doesn't black screen as the primary when doing XFire. That way, you won't lose the signal.
Click to expand...

You have two choices here: 1) shammy's PT1 BIOS or 2) the 0.95V rail mod. Actually I'm not sure whether shammy's PT1 BIOS can help, but I'm putting it here as one of the things you can try. The only problem is that both are made for, or well documented only for, reference cards. I don't know which Gaming card you have, but if I remember correctly the Gaming had three revisions: the reference-design card, a card with slightly bigger caps, and the "golden" chokes card. If you want to go with solution no. 1, you either need to find out whether other people with the same Gaming card have successfully used shammy's PT1 BIOS, or ask @mus1mus to copy shammy's PT1 mod into your card's BIOS. He has done this before, so it's child's play to him.

For no. 2, if your card is not the first-revision card that follows the reference design, you'll need to trace the 0.95V rail on your card.
You can refer to shammy's doc on how to do it: http://forum.kingpincooling.com/showthread.php?t=2473&highlight=290x

This may help too:
http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/1990#post_24936560
... with the result:
http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/1990#post_24938022

This is a slightly different but similar mod:
http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/2140#post_24964867
... with the result:
http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/2150#post_24965770

This post may help you understand why the black/blank screen occurs and why the 0.95V rail mod helps:
http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/2160#post_24965878


----------



## NaXter24R

The problem is indeed on the 0.95V rail. When you overvolt, the card doesn't take it very kindly, and because that 0.95V rail feeds the video signal, you should increase it a bit in order to keep a picture.
Raising Aux might help; it's not a fix and it's not 100% certain, but it might help stabilize the card a bit and hopefully provide some more juice.

Another workaround could be using another type of connector. What are you using right now: DVI, HDMI or DP?


----------



## joseph172g

I am using HDMI.


----------



## Casterina

My Sapphire R9 290 Tri-X's current BIOS version is 015.043.000.001.000000. Is that the latest version? If not, are there any advantages to flashing the latest version?


----------



## NaXter24R

Quote:


> Originally Posted by *joseph172g*
> 
> i am using HDMI


You could try DVI, but that's just empirical. Practically you have two options: increase the 0.95V rail with a mod (and I think that's a hard mod), or step the voltage back a bit.

Quote:


> Originally Posted by *Casterina*
> 
> My Sapphire R9 290 Tri-X current BIOS version is 015.043.000.001.000000, is that the latest version? If not then is there any advantages of flashing the latest version?


Nope. And you could potentially brick your card if you mess up the BIOS: wrong version, wrong memory support, wrong voltage and so on. There's no need to mess with the BIOS unless you want to push the card with some heavy voltage.


----------



## Casterina

Quote:


> Originally Posted by *NaXter24R*
> 
> Nope, you could potentially brick your card if you mess up with the bios, wrong version, wrong memory support, wrong voltage and so on. Don't need to mess with the bios unless you want to push the card with some heavy voltage


I don't plan to overclock the card, as it is handling the games I play fine with good frame rates. Also, which way should I have the BIOS switch set, UEFI or Classic? Are there any differences between them?


----------



## joseph172g

I tried DVI but it didn't solve the problem; I still lost signal after a couple of minutes. I have also tried to flash shammy's BIOSes, but both end up with a black screen. Is it even possible to flash the MSI R9 290X Gaming 4GB (Hynix)?


----------



## joseph172g

I also OC'ed my memory to 1600MHz without adding volts, and it was perfectly fine, but then I tried 1625 and the screen went black and I couldn't boot my PC. Luckily I solved it by changing PCI-e slots.


----------



## Shweller

Made this run with the modded 390X BIOS on my reference ASUS 290X card. Unfortunately this was a black-screen run. I'm sad to see my card black-screening at what I think are low voltages for 1100MHz stable. I only saw a max temp of 45°C on the core, so I will continue to increase the voltage and see what I can get.



http://www.3dmark.com/3dm/12882807


----------



## NaXter24R

Quote:


> Originally Posted by *joseph172g*
> 
> i tryed dvi but it didnt solve a problem, still lost signal after couple of minutes. I have also tryed to flash bios from shammy but both ends up with black screen. is it even posible to flash msi r9 290x gsming 4gb(hynix)?


Use MemoryInfo to check your memory. I don't advise flashing some fancy BIOS unless you know what you're doing; you could potentially kill a VRAM chip (different speeds, timings and voltages).
The easiest way is to reduce your vcore; otherwise the hard mod is the only solution. The 0.95V rail has nothing to do with VRAM; it's a separate rail that feeds the video output. When you increase the core voltage you give stability to the core, but the other rail doesn't take it very well. That's why, after some overvolting, you can run into issues like yours.
My old 280X holds 1.3V without any problem at all, but Hawaii is different. I can't recreate the same scenario myself because I'm waiting for a backplate; right now, even watercooled, my VRMs get quite hot and I don't want to push over 1.23V even if the core temp is fine.
Quote:


> Originally Posted by *joseph172g*
> 
> I also oced my memory without adding volts to 1600mhz, it was perfectly fine but than i tryed 1625 and screen goes black and i couldnt bot up my pc. Luckily i solve it by changing pci slots


Have you checked the supported VRAM within the BIOS? As I told you, sometimes even minor changes can cause issues, and sometimes even damage, like the 7970 I got for free (http://www.overclock.net/t/1602714/help-me-fix-my-7970-update-the-soldering-is-real; that was a bad flash that turned 3GB into a fried 260-ish MB).
Quote:


> Originally Posted by *Casterina*
> 
> I don't plan to overclock the card as it is handling the games I play fine with good frames. Also, which way should I have the BIOS switch towards, UEFI or Classic? Is there any differences between them?


Unless you have some problem with the UEFI one, stick with that. Legacy is for older systems that, as the word says, don't support UEFI. Moreover, one is flashable and the other isn't, if I'm right; that's a safety measure in case of a bad flash. But since I've run into trouble with BIOS flashes in the past (on a motherboard), I highly suggest you not play with it unless it's mandatory.


----------



## Paul17041993

There's little real risk of killing the card with a BIOS flash; these cards are designed for it. However, before flashing either of them, back both up and note which one was in each position, so that if the flash fails in some way it's easy to re-flash the original. You can flip the BIOS switch at any point after Windows has booted, and this is the simplest way to recover a bad BIOS: boot up using the good BIOS, flip the switch over to the bad BIOS after Windows has booted, and re-flash it.

The only case where a BIOS can kill a card is if it has a serious voltage override that disables the safety shutoffs, and even that is a relatively low risk provided you monitor all the temperatures.

I'm not aware of any flashing limits either, as I've re-flashed my 290X and old 7970 something like 50 times each without issue. Occasionally winflash can bug out, but that rarely corrupts the active BIOS, and a reboot and retry usually works fine.
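For anyone following along, the backup-then-flash routine described above maps onto a short sequence of ATIFlash invocations. This is only a dry-run sketch: nothing is executed, the adapter index `0` and the ROM file names are placeholders, and the `-i`/`-s`/`-p` flags should be double-checked against your ATIFlash version before running anything for real.

```python
# Dry-run of the dual-BIOS backup/re-flash workflow described above.
# Nothing executes; each step is only printed so it can be reviewed first.
# Swap print() for subprocess.run() only after verifying flags and index.

steps = [
    ["atiflash", "-i"],                        # 1. list adapters, note the index
    ["atiflash", "-s", "0", "bios_pos1.rom"],  # 2. back up BIOS, switch position 1
    # 3. flip the physical BIOS switch, then back up the second chip:
    ["atiflash", "-s", "0", "bios_pos2.rom"],
    ["atiflash", "-p", "0", "modded.rom"],     # 4. program the new ROM
]

for cmd in steps:
    print("would run:", " ".join(cmd))

# Recovery, as described above: boot on the good BIOS, flip the switch
# after Windows loads, and repeat step 4 with the backup ROM.
```

If a flash goes bad, repeating step 4 with `bios_pos1.rom` restores the original, which is exactly the recovery path Paul describes.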


----------



## NaXter24R

Quote:


> Originally Posted by *Paul17041993*
> 
> There's no risk of killing the card with a BIOS flash, these cards are designed for it. However before flashing either of them, back them both up and make notes of what one was in each position, so if the flash fails in some way it's easy to re-flash the original. You can flip the BIOS switch at any point in time after windows has booted and this is the simplest way to re-flash a bad BIOS, simply boot it up using the good BIOS and flip it over to the bad BIOS after windows has booted to re-flash.
> 
> The only cases a BIOS can kill a card is if it has a serious voltage override that disables the safety shutoffs, of which is still a relatively low risk provided you monitor all the temperatures.
> 
> I'm not aware of any flashing limits either as I've re-flashed my 290X and old 7970 something like 50 times each without issue, occasionally however winflash can bug out, but that rarely corrupts the active BIOS and a re-boot and retry usually works fine.


I don't know what happened to my 7970, but the guy who had it before me flashed a wrong BIOS onto it. I checked the memory and the chips were different. The card works now, but it crashes with drivers and only recognizes 260-ish MB of VRAM, I think because of a different voltage on the memory chips. I've flashed several other BIOSes with the same result, so I'm about to solder new VRAM chips onto it.

Maybe it's just me, but I'm not a fan of BIOS flashing; I'm pretty unlucky. My first Z97 motherboard died after a BIOS flash, which turned out to be a bad BIOS chip. So I only do it when it's mandatory.


----------



## joseph172g

I am attaching a screenshot from MemoryInfo. Do I have to mod my own BIOS, or is there another one that will work?


----------



## NaXter24R

Quote:


> Originally Posted by *joseph172g*
> 
> I am adding screenshot from memory info. Do i have to mod my own bios? Or is there any other which will work?


There is no reason to change the BIOS. You could try to turn that card into a 300-series one for some minor improvement, but there's really no practical benefit in a BIOS change just for the sake of it. Usually when people change the BIOS, it's because they want to override some safety feature, like adding more room for voltage, or to make certain clock speeds the defaults. For example, if your card can run 1100/1500 you could set those as the default so you don't have to overclock the card on every boot. That said, the same thing can be done by applying the overclock at startup, as TriXX or Afterburner allow with one click. I can't see any reason to replace that BIOS with another one. But just in case, check compatibility: VRAM, the voltage controller and the PCB in general are the things to consider.


----------



## Paul17041993

Quote:


> Originally Posted by *NaXter24R*
> 
> I don't know what happened to my 7970 but the guy who had it before me flashed a wrong bios on it. I checked the memory and they were different. Now that card is fine, but crashes with drivers and ony recognize 260ish mb of vram. I think because of different voltage on memory chips. I've flashed several other bios, still the same issue, so i'm about to solder new vram chips on it.
> 
> Maybe it's just me, but i'm not a fan of bios flash, but it's me, i'm pretty unlucky. My first z97 motherboard died after a bios flash. Turned out to be a bad bios chip. Still, i do that only if it's mandatory


There are a lot of factors that could contribute to the VRAM issue on that 7970, but dead ICs would be the most likely. The key is whether the PCB accepts the new ICs happily or just remains dead; the latter would most likely be a VRM failure that prevents the core and/or ICs from going to full power.

Most of the time dead cards are just scrapped for parts; only a fraction of the components survive the process, and the rest, along with the PCB, is shredded and melted down into raw materials for new components.


----------



## NaXter24R

Quote:


> Originally Posted by *Paul17041993*
> 
> There's a lot of factors that could attribute to the VRAM issue on that 7970, but dead IC's would be the most likely. The key would be whether the PCB accepts the new IC's happily or just remains dead, of which would most likely be a VRM failure that prevents the core and/or IC's from going in to full power.
> 
> Most of the time dead cards and whatnot are just scrapped for parts and only a fraction of the components will survive the process, the rest and the PCB are shredded and melted down into raw materials for new components.


Those VRAM modules cost me 25€, so I can give them a try, especially since I got the card for free and I like experimenting on it.
The VRMs seem good though: no sign of burnt components or strange smells.


----------



## LandonAaron

This is a public service announcement to any Crossfire users who either have or are thinking of getting a steam controller or use or are considering using Steam Big Picture Mode.

The problem is that *the alternative overlay, which is activated when using a Steam Controller or Big Picture Mode, causes Crossfire to cease working*. This is likely because Crossfire only works in exclusive (true) fullscreen mode; it does not work in borderless fullscreen. I believe Big Picture Mode runs games in borderless windowed mode even when the game is set to fullscreen. That is just a hunch, but what I can say for sure is that Crossfire does not work with BPM or its alternate overlay.

Whenever you launch a game that uses Crossfire, if you monitor it with MSI AB you can see the 2nd card being utilized right up until the first Steam overlay message appears onscreen about "accessing the Steam community", at which point the 2nd card's utilization drops to 0 and its clock speeds drop to idle-state speeds.

The answer, then, is to not use Big Picture Mode. However, if you are using a Steam Controller, even if you launch a game from the Steam desktop program (not BPM), it will still use Steam's BPM overlay. You can disable the overlay altogether by right-clicking your game and selecting Properties > General and unticking "Enable the Steam Overlay while in-game"; however, without the overlay active the Steam Controller will not use the configuration you assigned to it for that game. Instead it will just continue to use your desktop controller configuration, which is less than ideal.

*The real solution is to click in the upper left corner on Steam > Settings > In-Game and untick "Use the Big Picture Overlay when using a Steam Controller from the desktop".* With this option disabled you can launch a game from the Steam desktop program, and as long as you have the regular Steam overlay enabled (which doesn't interfere with Crossfire) the game will use the controller configuration you have chosen for it. The only minor issue with this method is that you can't press the Steam button on the controller to bring up the controller configuration menu. To make changes to the controller configuration, you need to alt-tab to the desktop, right-click your game, and select "Edit Steam Controller Configuration".

This lets you play games with your Steam Controller and still enjoy Crossfire by avoiding BPM and its alternate overlay. There is no fix that I know of that will let Crossfire work with BPM itself, so this workaround applies strictly to Steam Controller users.

Hopefully they will fix Big Picture Mode in the future so this workaround isn't required, but it seems like the whole of Big Picture Mode is built around borderless windowed mode, so who knows if it's even possible to fix. It was hard to find any information on the subject, but from what I have read this issue seems to affect Nvidia SLI users as well.


----------



## serave

I'm pretty sure someone in this thread would know which one of these coolers is the better one:

Accelero Xtreme IV
Raijintek Morpheus

I think I've decided to hold on to my 290 a little longer, until next gen. The cooler my card came with (PowerColor TurboDuo, not the PCS+ one) can barely hold 1000MHz under 90°C, and that's at 95% fan speed.

I'm also considering this one, but I have my doubts that it can contain my 290 at 1150-1175MHz.


----------



## i2CY

*Single Card Setup* - Accelero Xtreme IV - wasn't too hard to install, and it's very quiet even at 12V.

*Crossfire Setup* - keep your factory coolers and use a headset, or go under water.
_A couple of factors: not enough room if you have a bottom-mounted power supply, and the top card also gets starved for air, in my case._

*Note:* I have a really cramped case with poor airflow (per reviews); I bought it without researching. I also had to take advantage of custom BIOSes to keep the temps down, and mind you, I run un-OC'd.


----------



## mus1mus

Quote:


> Originally Posted by *gordesky1*
> 
> Oh, another thing: do you think they'd be willing to send the 290X back if I pay the return shipping? Or do you think they wouldn't, since they're already sending a replacement?
> 
> Edit: just got off the phone again, asking if I could receive the old card back as well. They first said, "Didn't you already accept the 390 Gaming 8GB?" I told them yes, but I was seeing if I could get the 290X back so I could hang it on my wall (I would have tried baking it first, lol). He put me on hold, got back to me, and said cards that can't be repaired get sent to the manufacturer, I guess AMD, so it would be a no.
> 
> I guess it was worth a try, and really I can't do much now since they said I already accepted the 390. I guess it's better getting that card than $141, or even no card, lol.


It's a bit of a necro, but did you receive the card already?

I have a 390 now, and I can honestly say you will love it. Pair it with the new driver and you'll never miss the 290X.

That is, unless your Lightning can do over 1300MHz, which I doubt without watercooling.

A 1200MHz 390 can hang with a 1300MHz 290, on the same drivers. Once you go to the latest Crimson, well.....

http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club/9900_50#post_25339680


----------



## serave

Quote:


> Originally Posted by *i2CY*
> 
> *Single Card Setup*- Accelero Xtreme IV - wasn't to hard to install, quite quite quite even on 12V.
> 
> *Crossfire Setu*p -keep you factory coolers & use a head set, or go under water.
> _couple of factors - not enough room if you have a bottom mount Power suppler, top card also get starved for air, in my case_
> 
> *Note:* I have a really cramped case, with a poor air flow reviews; bought it with out researching. I had to also take advantage of costume BIOs to keep the temps down, mind you i also run un-OC'd.


I decided I'd go with the Kraken G10 + AIO setup; the price difference from the Xtreme IV is only $15.
Quote:


> Originally Posted by *mus1mus*
> 
> It's a bit of a necro, but did receive the card already?
> 
> I have a 390 now and I can honestly say, you will love it. Pair it with the new driver and you'll never miss the 290X.
> 
> Except if your lightning can do over 1300Mhz which I doubt without watercooling.
> 
> 1200MHz 390 can hang with a 1300MHz 290. On same drivers. Once you go with the latest Crimson, well.....
> 
> http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club/9900_50#post_25339680


Well, that's nice. I wonder how they managed that with the same GPU chip, though.


----------



## OneB1t

A different power limit, or different BIOSes (so the memory timings are different).


----------



## m70b1jr

Can someone help me out? http://www.overclock.net/t/1604873/bought-a-290x-with-a-broken-resistor-for-my-friend-thoughts/
Could someone here with a multimeter measure the resistor at C21719? It's on the back of the GPU, on the opposite side from the PCI-e area, on the edge of the card. I would greatly appreciate it.


----------



## kizwan

Quote:


> Originally Posted by *serave*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> It's a bit of a necro, but did receive the card already?
> 
> I have a 390 now and I can honestly say, you will love it. Pair it with the new driver and you'll never miss the 290X.
> 
> Except if your lightning can do over 1300Mhz which I doubt without watercooling.
> 
> 1200MHz 390 can hang with a 1300MHz 290. On same drivers. Once you go with the latest Crimson, well.....
> 
> http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club/9900_50#post_25339680
> 
> 
> 
> Well, that's nice i wonder how they managed to do that with the same GPU chip though
Click to expand...

By fine-tuning the VBIOS. Like race cars, you can fine-tune a GPU to get more performance. Just imagine the VRM controller, memory controller, etc. as little computers.

With the stock VBIOS, the 390 is faster than the 290 and the 390X is faster than the 290X. I believe the 390 and 290X are neck and neck. A 290 can keep up with a 390 by using a modded 390 ROM, but a 390 with a modded ROM will pull ahead.


----------



## gordesky1

Quote:


> Originally Posted by *mus1mus*
> 
> It's a bit of a necro, but did receive the card already?
> 
> I have a 390 now and I can honestly say, you will love it. Pair it with the new driver and you'll never miss the 290X.
> 
> Except if your lightning can do over 1300Mhz which I doubt without watercooling.
> 
> 1200MHz 390 can hang with a 1300MHz 290. On same drivers. Once you go with the latest Crimson, well.....
> 
> http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club/9900_50#post_25339680


Yep, I've had the 390 for weeks now.

Yeah, the max I put through my Lightning was 1200/1650 with around +200mV, then I had to keep going lower over time because everything was getting unstable. I guess the card was about to die and just took a long time to do it.

I haven't really pushed the 390 to the max yet because, yeah, temps got out of control, more so than on my Lightning. I had it at 1180 core / 1720MHz memory and everything seemed fine, then the temps got out of hand and went into the 90s even on max fan. So at the moment I've been mostly at stock ever since I saw those temps, lol.

Performance is hard to judge, but I do like that I can turn things up in games now, and use VSR, which needs a lot of VRAM, without taking a huge performance hit. I haven't done much testing on the new drivers, but I do have them installed, so we'll see. I'm going to start clocking it up again and see what the temps do.

But yeah, I really do miss the cooling the Lightning had, because even at +200mV and 1200/1600 I never saw the 90s.

But here's the thing that bothers me about this card; I'm not sure if they all do this.

If I overclock the memory even 1MHz over 1500MHz, the core voltage jumps from 1.086V to 1.234V and stays there; then if I load a game it drops below 1.234V, to around 1.172V.

If I keep it at the stock 1500MHz, the voltage stays at 1.086V, then goes to 1.172-1.180V when I load something up.

And if I raise the core clock, the voltage stays where it is until I add more myself.

I never remember my 290X doing this.


----------



## Paul17041993

Quote:


> Originally Posted by *gordesky1*
> 
> yep had the 390 for weeks now
> 
> Yea max i put threw my lighting was 1200-1650 with around 200+ than i had to keep going lower over time because everything was getting unstable i guess the card was about to die and just took longer to do it..
> 
> For the 390 i didn't really push it to the max yet cause yea temps got out of control.. more than my lightning... i had it on 1180 core -1720mhz and everything seem fine than the temps got out of hand and went in the 90s even on max fan.... So at the moment i been at mostly stock ever sense i saw those temps lol
> 
> Performance its hard to say really but i do like that i can turn stuff up now in games and using vsr that needs alot of vram with out taking a huge performance hit,
> Haven't did much testing on the new drivers tho but i do have them installed so will see. going to start clocking it up again and see what the temps do again.
> 
> But yea i do really miss the cooling the lightning had cause even with +200 and at 1200-1600 i never saw 90s
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But here's the thing that bothers me about this card not sure if they all do this or what.
> 
> But if i overclock the memory even over 1mhz over 1500mhz it will go full vcore from 1086 voltage to 1234mv..... and it stays that way than if i load a game up it will drop lower than 1234 around 1172...
> 
> If i keep it at stock 1500 voltage will stay at 1086 always than when i load something up it will go to 1172-1180.
> 
> now if i up the core the voltage will stay were its at till i put some more into it..
> 
> Never remember my 290x doing this..


Your Lightning probably did the same and you didn't notice. When the memory has to run at its high clocks (3D use and/or multi-monitor), it forces the core to its highest voltage, as the memory controller uses the same rail. However, if it sticks to the highest core voltage when both the core and memory are at their lowest idle state, then there's a bug in the software you're using (or a misconfigured BIOS).


----------



## gordesky1

Quote:


> Originally Posted by *Paul17041993*
> 
> Your lightning probably did the same and you didn't notice, when the memory has to run at its high clocks (3D use and/or multi-monitor) it forces the core to its highest as the memory controller uses the same rail. However if it sticks to the highest core voltage when both the core and memory are running at their lowest idle state then there's a bug with the software you're using (or a miss-configured BIOS).


The weird thing is that the memory always runs at 1500 because I have dual monitors, but the voltage doesn't go high until I overclock the memory, even by 1MHz. Why would 1MHz push it into high-voltage mode even though the memory is already running at full clock?


----------



## kizwan

Quote:


> Originally Posted by *gordesky1*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> It's a bit of a necro, but did receive the card already?
> 
> I have a 390 now and I can honestly say, you will love it. Pair it with the new driver and you'll never miss the 290X.
> 
> Except if your lightning can do over 1300Mhz which I doubt without watercooling.
> 
> 1200MHz 390 can hang with a 1300MHz 290. On same drivers. Once you go with the latest Crimson, well.....
> 
> http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club/9900_50#post_25339680
> 
> 
> 
> yep had the 390 for weeks now
> 
> Yea max i put threw my lighting was 1200-1650 with around 200+ than i had to keep going lower over time because everything was getting unstable i guess the card was about to die and just took longer to do it..
> 
> For the 390 i didn't really push it to the max yet cause yea temps got out of control.. more than my lightning... i had it on 1180 core -1720mhz and everything seem fine than the temps got out of hand and went in the 90s even on max fan.... So at the moment i been at mostly stock ever sense i saw those temps lol
> 
> Performance its hard to say really but i do like that i can turn stuff up now in games and using vsr that needs alot of vram with out taking a huge performance hit,
> Haven't did much testing on the new drivers tho but i do have them installed so will see. going to start clocking it up again and see what the temps do again.
> 
> But yea i do really miss the cooling the lightning had cause even with +200 and at 1200-1600 i never saw 90s
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> But here's the thing that bothers me about this card not sure if they all do this or what.
> 
> But if i overclock the memory even over 1mhz over 1500mhz it will go full vcore from 1086 voltage to 1234mv..... and it stays that way than if i load a game up it will drop lower than 1234 around 1172...
> 
> If i keep it at stock 1500 voltage will stay at 1086 always than when i load something up it will go to 1172-1180.
> 
> now if i up the core the voltage will stay were its at till i put some more into it..
> 
> Never remember my 290x doing this..
Click to expand...

This is a 290 with a 390 modded ROM, and I'm seeing the same thing. For what it's worth, the voltage for DPM7 (the highest pstate) was set to 1.287V with a +37.5mV hidden offset in the BIOS, meaning the highest voltage at the highest pstate is ~1.325V. I don't know what's going on, but the drivers probably increase the voltage automatically when they detect overclocking, even if it's only 1MHz.

Please don't ask me to test with the stock/290 ROM, because I don't want to do that.

Memory 1300MHz

Memory 1301MHz
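kizwan's voltage arithmetic can be sanity-checked in a few lines. The DPM7 voltage and the hidden offset are the values from his post; the 6.25mV VID step used for rounding is my assumption about the voltage controller's granularity, not something stated above.

```python
# Peak VDDC = DPM7 table voltage + hidden BIOS offset, snapped to the
# regulator's VID step (the 6.25 mV step is an assumed value).
DPM7_V = 1.287            # highest p-state voltage from the BIOS table (V)
HIDDEN_OFFSET_V = 0.0375  # +37.5 mV hidden offset (V)
VID_STEP_V = 0.00625      # assumed VID granularity (V)

raw = DPM7_V + HIDDEN_OFFSET_V               # 1.3245 V
peak = round(raw / VID_STEP_V) * VID_STEP_V  # snaps to 1.325 V

print(f"raw: {raw:.4f} V -> peak: {peak:.5f} V")
```

The snapped result lands on the ~1.325V kizwan quotes, so the monitoring readout is consistent with the table voltage plus the hidden offset.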


----------



## Paul17041993

Quote:


> Originally Posted by *gordesky1*
> 
> the weird thing is the memory runs always at 1500 cause i do have dual monitor but the voltage is not running high till i overclock the memory even by 1mhz.. why would 1mhz cause it to go in high voltage mode even tho the memory is running at full clock already?


It seems the BIOS and/or drivers anticipate that the core may run unstable above the stock 1500 at a low voltage, which is expected, but stable with a low voltage at or below the stock 1500 when idling.

So you'll either want to not use multi-monitor, or edit the BIOS's power profiles, or just leave it as-is if you don't mind the slightly higher idle power.


----------



## Offler

A few months ago I tried to reproduce my top score and failed:
http://www.3dmark.com/compare/fs/2873251/fs/7876016

I took some time to investigate what's wrong with my PC.

a) The CPU VRM isn't really happy.
The graphics card, as a secondary heat source, sits too close to the VRM coolers, and the result is a problem with the CPU power supply above a certain frequency (4000MHz).

b) One memory timing went wrong.
The 2nd set of tests was run with the memory set to 1600MHz 6-6-6-18 at 1.77V. These settings are OK, but for some reason one of the timings, which was previously on AUTO and was fine, changed from 34 to 40. That timing proved to be too loose, as the system would reboot without any apparent reason; setting it back to 34 fixed it.

c) Power setting
For some reason the system thought it was set to 0% and not 50%.

I'm actually considering getting an Asus Sabertooth. Yes, now, when the Phenom II is so old.

So... step by step...

1. I will restore 3900MHz on the CPU and the RAM timings to 6-6-6-18, and take the time to check whether the system is stable.

2. Then I will try the GPU OC. If there are artifacts at 1180MHz, we are still in trouble.


----------



## mus1mus

Hard to tell what's happening there. Different drivers may do that, as may the rest of the settings in your OC.

Pretty surprised by the GS1 score on the old driver, though.


----------



## Offler

So...
http://www.3dmark.com/compare/fs/9338552/fs/2873251

But there are artifacts in the first test.

Anyway, the CPU and MEM settings are now fine; at least we are getting somewhere. It also seems that the old driver was somehow better, and I didn't use any dirty tricks like running low texture quality.

In any case, nobody (including me) has beaten my old record from a single R9 290X + Phenom II 1090T with valid drivers. I'll give it a weekend.


----------



## bobloadmire

Hey guys, I haven't been on here for a few years, but I just wanted to post an update. I have a 290 flashed to a 290X running stable at 1110 core (+50mV) and 1450MHz memory. The only problem is that at these settings, while my games never crash, Windows will not resume from sleep. Has anyone found a solution for this? I did this unlock right when the mod came out a few years ago, so I'm way out of the loop. The card works perfectly at stock (1000/1350), BTW.


----------



## mus1mus

I don't see a reason why the card/OC would do that, unless sleeping the computer resets the voltage, causing the card to black-screen due to the memory OC.

That is the thing with Crimson drivers: resuming from a shutdown keeps the previous overclocks -- minus the voltages.

Try installing a pre-Crimson driver and see if the PC resumes from sleep normally. If it does,

read through the Hawaii BIOS Editing thread and look for a way to apply a hidden offset in the BIOS. That way the voltage offset is applied at startup instead of waiting for you to run MSI AB or other OC software to adjust the voltage. This works around Crimson driver issues like a black screen at startup after a Windows/OC failure.

The difference between applying an in-ROM offset and setting up a profile that runs at startup with OC software is that the driver reads the BIOS first, as soon as Windows loads -- not after Windows has fully initialized and the software OC tools have run.


----------



## Roboyto

Quote:



> Originally Posted by *bobloadmire*
> 
> Hey guys haven't been on here for a few years, but just wanted to update. I have a 290 flashed to 290x running stable at 1110 core (+50mv) and 1450mhz clock. The only problem is running at these settings, while my games never crash, windows will not resume from sleep. Has anyone found a solution for this? Again I did this unlock right when this mod was out a few years ago, so Im waaaaay out of the loop. The card works perfect under stock conditions BTW 1000/1350.


Are you running Windows 10? Resuming from sleep is a known OS issue


----------



## Echoa

Excited to be joining your ranks in the next few days. Got myself a VisionTek 290 (aka VTX3D X Edition) for $150 on Amazon the other day; it should be here by Wednesday.


----------



## bobloadmire

Interesting, thanks for the info.


----------



## bobloadmire

Yes, Win 10, but it's not a Win 10 issue; it resumes fine at stock settings.


----------



## ZealotKi11er

Do new drivers kill the OC for the 290X? I used to be able to run at least 1275. Now I can't do 1240 without a driver crash. Temps are very low, as this happens within 4-5 seconds. Using +200 mV.


----------



## mus1mus

Efficiency mode off?


----------



## ZealotKi11er

Quote:


> Originally Posted by *mus1mus*
> 
> Efficiency mode OfF?


What is this Efficiency Mode people keep telling me about?


----------



## Roboyto

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Do new drivers kill OC for 290X? I used to be able to tun at least 1275. Not I cant do 1240 without driver crash. Temps are very low as this happens within 4-5 seconds. Using +200mV.


I think they might...it is that or happenstance that downloading, installing, and running the new 3DMark Time Spy bench a few times caused something to go wonky.

My concrete solid overclock, for several years at this point, ended up giving me trouble...not even my maximum balls-to-the-wall OC, my gaming clocks.

Left my house for a couple hours immediately after Time Spy completed a bench run. Left the computer sitting with the score sitting there on the desktop. Came home to find black screen across 3 monitors and hard-locked.

Figure no big deal...hold power, reboot, and instant black screen after entering password...and kept doing this. Ultimately had to enter safe mode, uninstall drivers and problem solved.

I just re-installed them again because the performance boost to DOOM is some kind of wondrous sorcery I tell ya!


----------



## ZealotKi11er

Quote:


> Originally Posted by *Roboyto*
> 
> I think they might...it is that or happenstance that downloading, installing, and running the new 3DMark Time Spy bench a few times caused something to go wonky.
> 
> My concrete solid overclock, for several years at this point, ended up giving me trouble...not even my maximum balls-to-the-wall OC, my gaming clocks.
> 
> Left my house for a couple hours immediately after Time Spy completed a bench run. Left the computer sitting with the score sitting there on the desktop. Came home to find black screen across 3 monitors and hard-locked.
> 
> Figure no big deal...hold power, reboot, and instant black screen after entering password...and kept doing this. Ultimately had to enter safe mode, uninstall drivers and problem solved.
> 
> I just re-installed them again because the performance boost to DOOM is some kind of wondrous sorcery I tell ya!


Yeah, I crashed and then would get a BSOD; Windows would not load. I had to go to safe mode and reinstall the drivers.


----------



## Roboyto

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah, I crashed and then would get BSOD, Windows would not load. I had to go to save mode and reinstall drivers.


They have been much more timely with driver updates...and these are not WHQL...they will likely sort out the issue sooner than later.

I hadn't done a clean install either, so maybe that had something to do with it...we will see


----------



## Paul17041993

Modding your OC into one of the BIOSes is probably the ultimate way to fix all OC and driver problems; I've never had any with mine (besides an unrelated issue with the PCIe slot pins shrinking in cold weather...).


----------



## Shweller

Was wondering if anyone has had issues like this. My second 290X's core clock just drops and stays at 300 MHz at all times. I just reinstalled the drivers, using DDU to uninstall first. This screenshot was taken while running the Valley benchmark: I heard something like a driver-disconnect sound, and the core dropped to 300 MHz and never came back up. Not sure what is going on...


----------



## mus1mus

Quote:


> Originally Posted by *Shweller*
> 
> Was wondering if anyone has had issues like this. My second 290x card core clock just drops and stays at 300MHz at all times. I just reinstalled drivers used ddu to uninstall. This screenshot was while running the valley benchmark. I heard like a driver disconnect sound and the core dropped off to 300MHz and never came back up. Not sure what is going on...


Did you enable Crossfire properly?

ULPS Disabled?


----------



## Caveat

Maybe a ******ed question, but does an R9 290X 8GB work together with an R9 290X 4GB in Crossfire or does the 2nd card need to be exactly the same?


----------



## Vellinious

Quote:


> Originally Posted by *Caveat*
> 
> Maybe a ******ed question, but does an R9 290X 8GB work together with an R9 290X 4GB in Crossfire or does the 2nd card need to be exactly the same?


Nope...they'll work just fine.

I have an XFX 8GB 290x with kryographics block and passive backplate that I'm not using any more....would be willing to sell it, if you're looking.


----------



## Offler

Quote:


> Originally Posted by *mus1mus*
> 
> Hard to tell what's happening there. Different Drivers may do that as well as the rest of the settings in your OC.
> 
> Pretty surprised of the GS1 score on the old driver though.


Quote:


> Originally Posted by *Offler*
> 
> So...
> http://www.3dmark.com/compare/fs/9338552/fs/2873251
> 
> But there are artifacts on the first test.
> 
> Anyway CPU and MEM settings are now fine. At least we are getting somewhere. Anyway it seems that the old driver was somehow better. And I didnt used any kind of dirty tricks like running low texture quality.
> 
> In any case, from 1x R9-290x + Phenom II 1090t nobody (including me) beaten my old record with valid drivers. I will give a weekend to it


Well...

Tried several different settings for CPU-NB, HT, and PCIe frequency. All three actually had an impact on overall smoothness in gaming - Thief, Brutal Legend, and even ArcheAge were smoother. Not exactly higher FPS, but stutter that might otherwise occur was either present or gone (I now run NB 2800, HT 2600, PCIe 105 MHz). Of course, a higher HT frequency limited FPS on the first test even more; the best scores were at 2000 MHz.

Few notes:
a) Texture Quality: Performance
This setting made the flickering even more noticeable. It suggests that faster texture processing puts a slightly higher load on the shader units and makes them more prone to failure; the failures are then clearly thermal/voltage/performance related. The drop in scores is partly caused by failing CUs, which slow down the rendering process at some point. If I drop the frequency by about 10 MHz, I get clean performance without artifacts. Either way, performance in this test is clearly capped, and to get above it I have to find and fix the thermal problem.

On the other hand, the GPU is not indicating that it has hit any limits.

b) Catalyst 14.3 VS 16.3
Because I tightened the RAM latencies from 7-7-7-18 to 6-6-6-18, it's hard to say whether the CPU load is lower, but it's clear the newer driver is better optimized and able to generate a slightly higher load on the shader units. Even though it exposed my cooling/performance trouble, that's generally a good thing.

c) Roll the driver back, or keep the upgraded one?
That's the question, since we now have Vulkan. The increased performance output, clearly an attribute of the newer driver, may however require either a frequency decrease or a cooling adjustment.

d) Real-life game performance

Thief
Played Thief with Mantle a bit (Vsync, triple buffering, Full HD) and the overall smoothness of the game is simply great. Absolutely no hiccups, no rendering trouble; everything works really well. The Unreal engine does a really nice, smooth job. A great example of how multi-threaded rendering can work.

Brutal Legend
A console port on an unknown engine with low hardware requirements. Works just fine; very few slight hiccups, and those are hardly noticeable.

Mortal Kombat 9
Perfectly smooth, low requirement-console friendly game. Absolutely no hiccups.

Starcraft II
Blizzard's proprietary engine, much upgraded over time. Frames still drop if you place dozens of pylons.

Archeage
CryEngine 3. Supports multithreaded rendering and DX11. Several objects and effects cause massive frame drops. Absolutely unoptimized.

Witcher 2
The only noticeable frame drop is with Ubersampling on, at the beginning of the game while observing the trebuchet.


----------



## Caveat

Quote:


> Originally Posted by *Vellinious*
> 
> Nope...they'll work just fine.
> 
> I have an XFX 8GB 290x with kryographics block and passive backplate that I'm not using any more....would be willing to sell it, if you're looking.


OK, thank you. No, I'm not looking for watercooling; stock coolers are good enough for me since I'm not overclocking. But thanks for the offer.


----------



## Vellinious

Sure thing. = )


----------



## mus1mus

I'd buy his card if I were you.


----------



## Vellinious

I haven't even really advertised it yet... I was going to keep it for a while and use it while I send my 980 Tis in to EVGA for the step-up program; it should be my turn in the queue pretty soon. In that time, I'll throw an overclock on it and turn in some scores for HWBot. I've been watching eBay for another one to pair with it, but... man, people sure do love those 8GB 290Xs. My goodness.


----------



## ZealotKi11er

Man, I do not know what is going on. With the latest drivers, my driver crashes at anything above 1100 MHz on my 290X + 290 at +100 mV. Something crazy is going on.


----------



## Vellinious

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Man I do not know what is going on. With the lateste drivers my driver crash anything above 1100MHz with my 290X + 290 + 100mV. Something crazy is going on.


So, roll your driver back and see if it persists


----------



## ZealotKi11er

Quote:


> Originally Posted by *Vellinious*
> 
> So, roll your driver back and see if it persists


Changed the BIOS, and now I can do 1175 at +50 mV. After that I can't go any higher, even at +100 mV. These cards are so weird. Going to run some stress tests and settle on a 24/7 OC. I never really overclocked them outside of 3DMark.


----------



## Roboyto

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Man I do not know what is going on. With the lateste drivers my driver crash anything above 1100MHz with my 290X + 290 + 100mV. Something crazy is going on.


I haven't tested extensively, but I've got the same thing going on. My 1200/1500 +100 mV impervious OC settings are still golden for Time Spy and Doom under OpenGL/Vulkan...

But if I try to push my card like I used to, I get a black-screen crash that corrupts the drivers, requiring a safe-mode boot to uninstall them and then a reinstall.

I will say the performance boost to Doom with Vulkan and those drivers is off the ^(*%# chain though! With the right selection of ultra/high settings I get a solid 50-70 FPS on my Eyefinity setup with the one 290 @ 1200/1500.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Roboyto*
> 
> I haven't tested extensively but I've got the same thing going on. My 1200/1500 +100mV impervious OC settings are still golden for Time Spy and Doom unden OpenGL/Vulkan...
> 
> But trying to push my card like I used to and I get a blackscreen crash and it corrupts the drivers requiring a safe-mode boot to uninstall drivers and then re-installing.
> 
> I will say the performance boost to Doom with Vulkan and those drivers is off the ^(*%# chain though! With the right selection of ultra/high settings I get a solid 50-70fps on my Eyefinity setup with the one 290 @ 1200/1500


I used to get artifacts in Fire Strike at 1300 MHz, but now the driver simply crashes. I think it's time to retire them, lol.


----------



## Vellinious

Hmm...I might stay away from the new drivers at first, and see if I can get my card to run like it used to. I was hitting 1330 / 1862 on FS...1250 / 1750 all day long. I'm curious if the new drivers add performance without the additional clocks.....I'll have to do some playing. = )


----------



## DMatthewStewart

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Man I do not know what is going on. With the lateste drivers my driver crash anything above 1100MHz with my 290X + 290 + 100mV. Something crazy is going on.


Same thing happened to me. I went back to an earlier driver and it's fine now.


----------



## Shweller

Quote:


> Originally Posted by *mus1mus*
> 
> Did you enable Crossfire properly?
> 
> ULPS Disabled?


Yup, ULPS is disabled. I also installed the drivers as described in another thread:

1. Uninstall AMD drivers
2. Reboot into safe mode run DDU.
3. Install AMD drivers with the PSU connected only to the primary card.
4. Power down and connect second card to PSU.
5. Boot up and AMD settings automatically recognizes cards and enables Xfire.

I will revert to some older drivers and see if the issue persists. Are the latest drivers the best ones for the 290X, given the cards are three generations old now?


----------



## mus1mus

I always restart after the driver recognizes the second card for good measure.

The latest driver gave me a huge boost in benchmarks -- that's comparing 15.10 to 16.7.3.


----------



## Roboyto

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I used to get artifact with Fire Strike 1300MHz but now simply driver crash. I think it's time to retire them lol.


I think it's probably the drivers tbh...but after pushing cards 30% past intended clock speeds for a couple years you have to expect them to start to falter eventually.

Retirement for Vega will probably be a good idea. I just grabbed a 480 over the weekend. Clocked at 1350/2250 it trails my 290 at 1200/1500 by only 3% in Time Spy.

That's a 7%/12.5% overclock competing with 27%/20%, AND the 290 is running a modded BIOS for RAM timings. Sure, the 290 can go further, but past 1200 is above average for a Hawaii card. And it cost me $200 less than the slightly inflated $450 I paid for my 290.
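As a sanity check, the overclock percentages above can be reproduced from the clock speeds. The stock clocks used here (947/1250 MHz for a reference 290, 1266/2000 MHz for a reference RX 480) are assumed reference values, so treat the output as approximate:

```python
def oc_percent(stock_mhz, oc_mhz):
    """Overclock headroom as a percentage over the stock clock."""
    return (oc_mhz - stock_mhz) / stock_mhz * 100

# Assumed reference clocks (core, memory) in MHz:
R290_STOCK = (947, 1250)
R480_STOCK = (1266, 2000)

print(oc_percent(R290_STOCK[0], 1200))  # ~26.7% core OC on the 290
print(oc_percent(R290_STOCK[1], 1500))  # 20.0% memory OC on the 290
print(oc_percent(R480_STOCK[0], 1350))  # ~6.6% core OC on the 480
print(oc_percent(R480_STOCK[1], 2250))  # 12.5% memory OC on the 480
```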

Once these 480s have some additional power with AIB designs, they are going to be pretty awesome.

I'll have one of XSPC's new Blade blocks ordered as soon as they're available. PPCs said sometime this week most likely.


----------



## Echoa

Card came today, just in time for my computer to stop booting (hoping a new PSU fixes it)

Visiontek r9 290 (x?)






Great-looking card. I paid for a 290, but the sticker says 290X.







Hoping that's true; when the computer is working again I hope to see those 2816 shaders. Either way, a great $150.


----------



## Shweller

Quote:


> Originally Posted by *Roboyto*
> 
> I think it's probably the drivers tbh...but after pushing cards 30% past intended clock speeds for a couple years you have to expect them to start to falter eventually.
> 
> Retirement for Vega will probably be a good idea. I just grabbed a 480 over the weekend. Clocked at 1350/2250 it trails my 290 at 1200/1500 by only 3% in Time Spy.
> 
> That's 7%/12.5% overclock competing with 27%/20% comparatively AND the 290 is running a modded BIOS for RAM timings. Sure the 290 can go further, but past 1200 is above the average Hawaii card for the most part..And it cost me $200 less than when I bought my 290 for a slightly inflated $450.
> 
> Once these 480s have some additional power with AIB designs, they are going to be pretty awesome.
> 
> I'll have one of XSPC's new Blade blocks ordered as soon as they're available. PPCs said sometime this week most likely.


It's amazing how these cards were $700+, not to mention you couldn't find any because of the mining craze. How times have changed; technology advances so fast.


----------



## Echoa

Quote:


> Originally Posted by *Shweller*
> 
> Its amazing how these cards were $700+ not to mention you couldn't find any because of the mining craze. How times have changes and technology advances so fast.


When I first built my latest rig, they had just come out and were far too expensive to get. Now I can grab one on the cheap. I'm OK with fast advancement if it gets me what I want cheaper, lol.


----------



## mus1mus

Yeah, if only AMD produced GPUs as fast as NVIDIA does.


----------



## TheReciever

I've actually underclocked my 290X to 950 MHz on the core, along with undervolting it. This way I don't have to worry about temps, as even idle fan speeds (1300 RPM) keep it under 61°C.

Couldn't be happier with it.


----------



## Shweller

I'm not sure of the exact figures, but the green team put roughly 3x the money into GPU technology this last year that the red team put into CPU and GPU combined. Yet the red team still manages to kill it on price/performance against the 1060, which has no SLI support, nor a PCB to support it.


----------



## asciii

Removed.


----------



## rdr09

Quote:


> Originally Posted by *asciii*
> 
> My cards haven't been running as expected for months now and I passed it off as general wear or temperature.
> Today I opened GPU-Z to get an idea of what driver version I was running and discovered that I never enabled crossfire
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm surprised that single R9 290 kept up with the resolution (1080p 144Hz) in most games and benchmarks, but I guess it helped to have it at 1100/1300.


Are you able to push both GPUs at that resolution? What CPU do you have?

Anyway, I tested 16.7.2, and to get 18K in 3DMark we only need 1200 on the core now.

http://www.3dmark.com/3dm11/11433102

I recall needing 1300 on the core to get that graphics score.
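Assuming the graphics score scales roughly linearly with core clock (an assumption for estimation, not a measured fact), the 1300-to-1200 drop implies a per-clock gain for the newer driver of about:

```python
# Rough estimate of the new driver's per-clock gain, assuming the
# graphics score scales ~linearly with core clock.
old_core_mhz = 1300  # clock previously needed for ~18K graphics
new_core_mhz = 1200  # clock needed on 16.7.2 for the same score

per_clock_gain = old_core_mhz / new_core_mhz - 1
print(f"~{per_clock_gain:.1%} more score per MHz")  # ~8.3%
```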


----------



## asciii

Removed.


----------



## rdr09

Quote:


> Originally Posted by *asciii*
> 
> I'm using a 4690k at 4.5GHz. Probably a bit of a bottleneck but it does the job.
> 
> Could you clarify what you mean by pushing both GPUs at the resolution?


At lower resolutions the burden is on the CPU, while the GPUs can't stretch their legs. It doesn't really matter so long as you are getting the FPS needed to play the game.


----------



## PRSCU24

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Man I do not know what is going on. With the lateste drivers my driver crash anything above 1100MHz with my 290X + 290 + 100mV. Something crazy is going on.


Just got my 1150/1500 MHz OC back (games would crash over 1100/1400 MHz for whatever reason) by setting the DPM7 voltage to a flat 1.350 V instead of 1.25 V + 100 mV offset. Sounds stupid, but it worked for me.


----------



## rdr09

Just upgraded my clone drive, which had Win 7 and the 16.1 driver, to Win 10. After the install, an AMD FirePro driver was installed along with it for my 290s.

Good thing I have 16.7.2 on a flash drive.


----------



## diggiddi

Quote:


> Originally Posted by *rdr09*
> 
> Just upgraded my clone drive with Win7 and 16.1 Driver to Win 10. After the install, *AMD Firepro driver* was installed with it for my 290s.
> 
> Good thing I have 16.7.2 in a flashdrive.


Cool, what are you using it for, if you don't mind me asking? Also, are there major differences between it and a regular driver?


----------



## rdr09

Quote:


> Originally Posted by *diggiddi*
> 
> Cool, what are you using it for,if you dont mind me asking, also are there major differences b/n using it and a regular driver?


No, Win 10 decided to install the AMD FirePro driver by itself. It did not stay long, because I noticed I couldn't game with it and the visuals were a little corrupted, so I used Crimson's uninstall feature and then installed 16.7.2.

No issues so far with Win 10, except my NIC does not work. Using WiFi ATM.


----------



## diggiddi

Quote:


> Originally Posted by *rdr09*
> 
> No, Win10 decided to install AMD Firepro by itself.. Did not stay long 'cause I noticed i can't game with it plus the visuals are a little corrupted, so i used Crimson's uninstall feature, then installed 16.7.2.
> 
> No issues so far with Win10 except my nic does not work. Using wifi atm.


Ahh! the craziness of Win 10


----------



## rdr09

Quote:


> Originally Posted by *diggiddi*
> 
> Cool, what are you using it for,if you dont mind me asking, also are there major differences b/n using it and a regular driver?


It fixed itself. Loving it! I used 10 on a laptop for a good while, so I said... it's time to push 10 to the desktops. All files and apps are where they should be. Great.


----------



## fyzzz

So I recently bought an AOC U2879VF FreeSync monitor, and I am pretty impressed by how well a single 290 can handle 4K.

Battlefield 4 Settings:

All Ultra, except post-process quality (medium), ambient occlusion set to SSAO, and all anti-aliasing off.

[email protected], [email protected] CL11, R9 290 1150/1425 custom bios.

Operation Locker 64 players Conquest Large.

Frames: 37793 - Time: 598688ms - Avg: 63.126 - Min: 37 - Max: 114

Max VRAM usage was 2673 MB
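The average in that FRAPS-style summary checks out: average FPS is just total frames divided by elapsed seconds.

```python
# Recomputing the benchmark average from the logged totals.
frames = 37793        # total frames rendered
elapsed_ms = 598688   # run length in milliseconds

avg_fps = frames / (elapsed_ms / 1000)
print(round(avg_fps, 3))  # → 63.126, matching the reported average
```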


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> So I recently bought a AOC U2879VF Freesync monitor and I am pretty impressed by how well a single 290 can handle 4K.
> 
> Battlefield 4 Settings:
> 
> All Ultra, except post process quality (medium), Ambient occlusion SSAO and all antialiasing off.
> 
> [email protected], [email protected] CL11, R9 290 1150/1425 custom bios.
> 
> Operation Locker 64 players Conquest Large.
> 
> Frames: 37793 - Time: 598688ms - Avg: 63.126 - Min: 37 - Max: 114
> 
> Max VRAM usage was 2673 MB


Nice. I assume you have Crimson and can you still use Mantle?


----------



## fyzzz

Quote:


> Originally Posted by *rdr09*
> 
> Nice. I assume you have Crimson and can you still use Mantle?


Yes, latest crimson drivers. I haven't tried mantle yet.


----------



## rdr09

Quote:


> Originally Posted by *fyzzz*
> 
> Yes, latest crimson drivers. I haven't tried mantle yet.


Thanks. If you are using fraps . . . it might not work with mantle.


----------



## LandonAaron

Has AMD removed the Custom Resolutions option from the Crimson drivers? I have 16.7.3, and under Display there is no longer an "additional settings" option. There is a "more..." option in the upper right-hand corner, but all it does is display a little info about the various options on the display page. I can't believe they did away with custom resolutions; they only just added that not long ago. Hopefully I am just missing something.


----------



## fyzzz

Quote:


> Originally Posted by *LandonAaron*
> 
> Have AMD removed the Custom Resolutions option from the Crimson drivers? I have 16.7.3, and under Display there is no longer any "additional settings" option. There is a "more..." option in the upper right hand corner but all it does is display a little bit info on the various options on the display page. I can't believe they did away with Custom resolutions they just added that not too long ago. Hopefully I am just missing something.


Radeon Additional settings is under preferences now. Just navigate to Home and then Preferences.


----------



## LandonAaron

Quote:


> Originally Posted by *fyzzz*
> 
> Radeon Additional settings is under preferences now. Just navigate to Home and then Preferences.


Thanks. God, I hate these tiled menus. It's like a freaking word search.


----------



## MissHaswellE

Quote:


> Originally Posted by *LandonAaron*
> 
> Thanks. God I hate these tiled menus. Its like a freaking word search.


The new catalyst crimson whatever is awful. I don't know why they changed it.


----------



## Jaimelmiel

Hello everyone. I have an R9 290X Sapphire Vapor-X 8GB card with an EVGA SuperNOVA G2 750 W PSU. The motherboard is a Sabertooth 990FX R2.0, with a clean install of Win 10. The setup ran with only minor difficulties for a while; now it makes it past the Windows logo and cuts out. I put in an R9 380X Nitro card and the system runs flawlessly. I also took the card back to my friend's house, where I got it from, and we put it in his machine and it worked flawlessly. He has an Asus Crosshair with a 1300 W G2 SuperNOVA. The only thought I have is that maybe I need another PSU, or to send the card back.


----------



## Roboyto

Quote:


> Originally Posted by *Jaimelmiel*
> 
> Hello everyone,Ihave an R9 290x sapphire vaporx 8GB card,with a evga supernova G2 750 watt psu. Motherboard is a sabertooth 990 fx r2.0. I have a clean install of win 10. the setup ran with minor difficulties for a while.It makes past the windows logo and cuts out. I put in a r9 380x nitro
> card and the system runs flawlessly. I also took the card back to my friends house who I got it from and we put it his machine and it worked flawlessly. He has an asus crosshair with a 1300 watt G2 supernova.the only thought I have is maybe I need another psu or to send it back.


750 should be enough to run the 290X.

Try dropping the 290X in one more time...all it takes is a little piece of dust or hair to get in the PCIe slot and you can have goofy problems.

Looks like there is a driver hotfix, 16.8.1, that came out today; it couldn't hurt to do a driver scrub and give it a try.


----------



## Paul17041993

Quote:


> Originally Posted by *Jaimelmiel*
> 
> Hello everyone,Ihave an R9 290x sapphire vaporx 8GB card,with a evga supernova G2 750 watt psu. Motherboard is a sabertooth 990 fx r2.0. I have a clean install of win 10. the setup ran with minor difficulties for a while.It makes past the windows logo and cuts out. I put in a r9 380x nitro
> card and the system runs flawlessly. I also took the card back to my friends house who I got it from and we put it his machine and it worked flawlessly. He has an asus crosshair with a 1300 watt G2 supernova.the only thought I have is maybe I need another psu or to send it back.


Do you use DisplayPort at all (direct to the monitor or via an adapter)? I know of an odd BIOS issue on mine at the moment where it won't boot correctly without HDMI plugged in, but I likely just need to adjust the voltages again.


----------



## mus1mus

Clearing the mobo BIOS, if anything has been altered for overclocking, may help.

You are basically using the same chipset as your friend, so it has to come down to either a bad slot or some non-default settings in the mobo.


----------



## spyshagg

*No Man's Sky* completely destroys my otherwise 100% stable overclock. I have to drop 50 MHz, which is insane.


----------



## pshootr

Quote:


> Originally Posted by *spyshagg*
> 
> *no man's sky* game completely destroys my otherwise 100% stable overclock. I have to drop 50mhz which is insane.


GPU, or CPU?


----------



## spyshagg

Make that -100 MHz (on the GPU memory) to be stable.


----------



## Echoa

Got my PC back together last night after the RMA. Confirmed I got a 290X instead of the 290 I ordered.







It's so much faster than my 270X, it's funny. Will post validation for it later, after work.


----------



## Aretak

Quote:


> Originally Posted by *spyshagg*
> 
> *no man's sky* game completely destroys my otherwise 100% stable overclock. I have to drop 50mhz which is insane.


It sucks finding that one game which doesn't like your overclock after spending hours testing it and thinking you've got it rock solid. It's usually not something you'd expect either. World of Warcraft of all things has been the game to sniff out many an unstable overclock for me. Not only GPU-wise either, but CPU and RAM too.


----------



## Echoa

Quote:


> Originally Posted by *Aretak*
> 
> It sucks finding that one game which doesn't like your overclock after spending hours testing it and thinking you've got it rock solid. It's usually not something you'd expect either. World of Warcraft of all things has been the game to sniff out many an unstable overclock for me. Not only GPU-wise either, but CPU and RAM too.


Crysis 3 for me; guaranteed within 5 minutes.


----------



## shpeki

I have a Sapphire 290X Vapor-X, currently at [email protected] + 100 mV. How much overvoltage is safe for 24/7? The temperature is 80 degrees on the core, and similar on VRM1 and VRM2.


----------



## pshootr

Quote:


> Originally Posted by *shpeki*
> 
> I have sapphire 290x vapor-x, currently [email protected] + 100 mv, how much overvoltage is safe for 24/7. The temperature is 80 degree on core and similar on vrm1 and 2.


I would say you are at your limit; I wouldn't push it much further than 80°C.


----------



## ZealotKi11er

Quote:


> Originally Posted by *shpeki*
> 
> I have sapphire 290x vapor-x, currently [email protected] + 100 mv, how much overvoltage is safe for 24/7. The temperature is 80 degree on core and similar on vrm1 and 2.


Can't believe that is stable at 80°C.


----------



## pshootr

Mine runs at 77°C under load, and it is stable: 1124/1400 at +69 mV.


----------



## mus1mus

Mod the VRM switching frequency. It does help!


----------



## Harry604

Quote:


> Originally Posted by *mus1mus*
> 
> Mod the VRM switching Frequency. It does help!


What does that do? I run a 390X BIOS with modded memory timings on my 290X; will that give me even more of a performance boost?


----------



## buttface420

So I just got an MSI Gaming 290X that runs hot as the sun -- like 94°C within a few minutes of gaming, even with the fans at 100%. I even repasted twice just in case, but no change in temp.

I have an HG10 A1 and an H55 AIO lying around from a 290X I used to own, but I don't think they would fit at all.

The GPU works great, but that's just way too hot for a stock 290X Twin Frozr. I don't get it; even my MSI reference 290Xs ran cooler than that with fans at 70%.

Anyone know if a Kraken G10 would work with this?


----------



## mus1mus

Quote:


> Originally Posted by *Harry604*
> 
> what does that do? I run a mod memory timings 390x bios on my 290x will that give me even more of performance boost


Nope.

Lowering the VRM switching frequency reduces how many times per second the VRMs switch, so switching losses drop and the end result is cooler VRM temps.
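For anyone curious why that works, a first-order buck-converter model shows the trade-off: switching losses scale with the switching frequency, while inductor current ripple scales inversely with it. All component values below are made-up illustrative numbers, not the 290X VRM's actual specs:

```python
# First-order buck-converter trade-off: lowering the VRM switching
# frequency cools the VRMs (less switching loss) but raises ripple.

def switching_loss_w(v_in, i_out, t_transition_s, f_sw_hz):
    # Hard-switching loss per phase: P ≈ 0.5 * V * I * (t_rise + t_fall) * f_sw
    return 0.5 * v_in * i_out * t_transition_s * f_sw_hz

def ripple_current_a(v_out, v_in, l_h, f_sw_hz):
    # Inductor ripple: ΔI = V_out * (1 - V_out/V_in) / (L * f_sw)
    duty = v_out / v_in
    return v_out * (1 - duty) / (l_h * f_sw_hz)

# Hypothetical numbers: 12 V input, 1.2 V core, 30 A per phase,
# 20 ns combined switch transition time, 220 nH inductor.
for f_sw in (600e3, 300e3):  # halving the switching frequency...
    loss = switching_loss_w(12, 30, 20e-9, f_sw)
    ripple = ripple_current_a(1.2, 12, 220e-9, f_sw)
    print(f"{f_sw/1e3:.0f} kHz: switching loss {loss:.2f} W, ripple {ripple:.2f} A")
```

Halving the frequency halves the per-phase switching loss but doubles the ripple current, which is exactly the trade-off discussed in the following posts.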


----------



## Streetdragon

But wouldn't this mean that less power reaches the core? Won't an overclock become unstable? Is there a guide on how to change it? In my 390 Nitro BIOS I don't see an option for it in HawaiiBiosReader (and I didn't find a guide in the thread).


----------



## Echoa

Quote:


> Originally Posted by *Streetdragon*
> 
> but wouldnt this mean, that les power comes to the core? does a overclock wont get unstable? is there a guide for how to change it? on my 390 nitro bios i dont have a option in the hawaiibiosreader(and i didnt found a guide in the thread).


The same amount of power gets through; it increases voltage ripple, though.


----------



## mus1mus

Quote:


> Originally Posted by *Streetdragon*
> 
> But wouldn't that mean less power reaches the core? Wouldn't an overclock become unstable? Is there a guide for how to change it? On my 390 Nitro BIOS I don't see an option in HawaiiBiosReader (and I didn't find a guide in the thread).


Visit the Hawaii BIOS Editing thread.

It doesn't affect overclockability on my cards, nor does increasing it make an unstable clock stable.


----------



## pshootr

Hey guys, I'm running an R9 290 Tri-X OC, still on Win7. Is there any reason for me to install the Crimson drivers?

Currently using:

Driver Packaging Version 14.501.1003-141120a-177998C
Catalyst Version 14.12 AMD Catalyst Omega Software


----------



## catbeef

Quote:


> Originally Posted by *buttface420*
> 
> So I just got an MSI Gaming 290X that runs hot as the sun: 94°C within a few minutes of gaming, even with fans at 100%. I even repasted twice just in case, but no change in temps.
> 
> I have an HG10 A1 and an H55 AIO lying around from a 290X I used to own, but I don't think they would fit at all.
> 
> The GPU works great, but that's just way too hot for a stock Twin Frozr 290X. I don't get it; even my reference MSI 290Xs ran cooler than that with fans at 70%.
> 
> Anyone know if a Kraken G10 would work with this?


I use the G10 and an H55 on my MSI Gaming 290. I don't see why it wouldn't fit the X model. And if you already have one lying around, why not try it? You're in a better position than me to find out!


----------



## mfknjadagr8

Quote:


> Originally Posted by *buttface420*
> 
> So I just got an MSI Gaming 290X that runs hot as the sun: 94°C within a few minutes of gaming, even with fans at 100%. I even repasted twice just in case, but no change in temps.
> 
> I have an HG10 A1 and an H55 AIO lying around from a 290X I used to own, but I don't think they would fit at all.
> 
> The GPU works great, but that's just way too hot for a stock Twin Frozr 290X. I don't get it; even my reference MSI 290Xs ran cooler than that with fans at 70%.
> 
> Anyone know if a Kraken G10 would work with this?


Sounds like poor contact between the heatsink and the chip. How good was the spread when you took it back off? I would put an extremely small amount of paste on it and see if it spreads properly, just to rule that out.


----------



## Roboyto

Quote:


> Originally Posted by *buttface420*
> 
> So I just got an MSI Gaming 290X that runs hot as the sun: 94°C within a few minutes of gaming, even with fans at 100%. I even repasted twice just in case, but no change in temps.
> 
> I have an HG10 A1 and an H55 AIO lying around from a 290X I used to own, but I don't think they would fit at all.
> 
> The GPU works great, but that's just way too hot for a stock Twin Frozr 290X. I don't get it; even my reference MSI 290Xs ran cooler than that with fans at 70%.
> 
> Anyone know if a Kraken G10 would work with this?


That cooler is more than capable of keeping respectable temperatures even on these cards. I agree with @mfknjadagr8 that you should be pulling the cooler off to ensure that the heatsink is making good contact with the GPU die.

A G10 will yield better temperatures, but it can't hurt to check the contact first.

If you're going to go the G10 route, make sure you get a good heatsink for the VRMs; otherwise the mod is somewhat pointless, since they'll be your limiting factor if not cooled properly, as seen here:

http://www.overclock.net/t/1478544/the-build-formerly-known-as-the-devil-inside-a-gaming-htpc-for-my-wife-4770k-corsair-250d-powercolor-r9-290/20_20#post_22214255


----------



## Echoa

Looking to join the club with my VisionTek 290X with the stock VisionTek cooler.

Screenshot of GPU-Z with me logged into my account:

Pics of it in my case:


----------



## pshootr

Quote:


> Originally Posted by *pshootr*
> 
> Hey guys, I'm running an R9 290 Tri-X OC, still on Win7. Is there any reason for me to install the Crimson drivers?
> 
> Currently using:
> 
> Driver Packaging Version 14.501.1003-141120a-177998C
> Catalyst Version 14.12 AMD Catalyst Omega Software


No one?


----------



## Roboyto

Quote:


> Originally Posted by *pshootr*
> 
> No one?


Same reason as any driver update: potential performance boosts for various things. It can also come with side effects, like any driver update.

I would make sure you have your current driver available to reinstall, and give Crimson a shot. Download Crimson first, uninstall the current drivers with DDU (Display Driver Uninstaller) from safe mode, clean the registry with something like CCleaner, and install Crimson.

If you have problems, revert back.

Crimson is on a much more timely update schedule; they're releasing patches and hotfixes at least monthly. It runs faster, hogs fewer resources on startup, and looks better.

Worth a shot IMO.


----------



## TheLAWNOOB

Put the Windforce on another ref 290.

Hopefully this doesn't bite the dust.


----------



## ZealotKi11er

Quote:


> Originally Posted by *TheLAWNOOB*
> 
> Put the Windforce on another ref 290.
> 
> Hopefully this doesn't bite the dust.


Mining killing them?


----------



## pshootr

Quote:


> Originally Posted by *Roboyto*
> 
> Same reason as any driver update: potential performance boosts for various things. It can also come with side effects, like any driver update.
> 
> I would make sure you have your current driver available to reinstall, and give Crimson a shot. Download Crimson first, uninstall the current drivers with DDU (Display Driver Uninstaller) from safe mode, clean the registry with something like CCleaner, and install Crimson.
> 
> If you have problems, revert back.
> 
> Crimson is on a much more timely update schedule; they're releasing patches and hotfixes at least monthly. It runs faster, hogs fewer resources on startup, and looks better.
> 
> Worth a shot IMO.


Good advice. I shall try to do backups, and revert if hell rises.


----------



## bkvamme

Finally bit the bullet and got another 290X to put under water. Should receive it later this week.


----------



## rdr09

Quote:


> Originally Posted by *pshootr*
> 
> Good advice. I shall try to do backups, and revert if hell rises.


If you haven't done the update yet, run some synthetic and game benches first. With the 16.x driver I only need to OC my 290 to 1200 core to get the same Fire Strike graphics score that needed 1300 core on the 14.x driver.

I keep my cards stock for games.


----------



## mus1mus

From 15.10 to 16.7.3, I have seen as much as 10 FPS more (in key areas) in benchmarks.


----------



## pshootr

Quote:


> Originally Posted by *rdr09*
> 
> If you haven't done the update yet, run some synthetic and game benches first. With the 16.x driver I only need to OC my 290 to 1200 core to get the same Fire Strike graphics score that needed 1300 core on the 14.x driver.
> 
> I keep my cards stock for games.


Quote:


> Originally Posted by *mus1mus*
> 
> From 15.10 to 16.7.3, I have seen as much as 10 FPS (key areas) more in Benchmarks.


Good info guys ty.


----------



## diggiddi

Quote:


> Originally Posted by *rdr09*
> 
> If you haven't done the update yet, run some synthetic and game benches first. With the 16.x driver I only need to OC my 290 to 1200 core to get the same Fire Strike graphics score that needed 1300 core on the 14.x driver.
> 
> I keep my cards stock for games.


Which driver are you currently on?


----------



## melodystyle2003

Quote:


> Originally Posted by *diggiddi*
> 
> Which driver are you currently on?


I was having issues with 16.7.3, switched to 16.8.2, and I'm fine so far.


----------



## rdr09

Quote:


> Originally Posted by *diggiddi*
> 
> Which driver are you currently on?


16.7.2


----------



## ZealotKi11er

I still get crashing in Overwatch from time to time with CFX.


----------



## Echoa

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I still get crashing in Overwatch from time to time with CFX.


Thought that was a bug with the current drivers? Or was it supposed to be fixed?


----------



## cplifj

Has anyone noticed clock speeds going up to maximum 3D clocks while watching mere video?

What used to work at low power no longer works with the Crimson drivers.

Before Crimson, clock speeds stayed at a low 300 MHz core and 150 MHz memory while watching videos.

Now whenever I watch video, the 2D memory clock goes to its maximum of 1250 MHz, and depending on the video resolution the GPU clock goes to maximum as well.

Mind you, that's maximum clock speeds with no GPU load whatsoever. The result is a nice HOT card with a fan that doesn't speed up at all, because... exactly, no GPU load.

So does anyone have a clue why we now have a hot-running card while watching video, when it ran cool before Crimson?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Echoa*
> 
> Thought that was a bug with the current drivers? Or was it supposed to be fixed?


Not sure but it happened more before. I thought it was my OC until I had the same crashes at stock.


----------



## Echoa

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Not sure but it happened more before. I thought it was my OC until I had the same crashes at stock.


Bet disabling XFire will fix it, lol.


----------



## rdr09

Quote:


> Originally Posted by *cplifj*
> 
> Has anyone noticed clock speeds going up to maximum 3D clocks while watching mere video?
> 
> What used to work at low power no longer works with the Crimson drivers.
> 
> Before Crimson, clock speeds stayed at a low 300 MHz core and 150 MHz memory while watching videos.
> 
> Now whenever I watch video, the 2D memory clock goes to its maximum of 1250 MHz, and depending on the video resolution the GPU clock goes to maximum as well.
> 
> Mind you, that's maximum clock speeds with no GPU load whatsoever. The result is a nice HOT card with a fan that doesn't speed up at all, because... exactly, no GPU load.
> 
> So does anyone have a clue why we now have a hot-running card while watching video, when it ran cool before Crimson?


Video as in YT videos? At what refresh rate? Disabling hardware acceleration might help, except maybe in Chrome, I think. I use Microsoft Edge on Win10, and mine does jump a bit in 4K videos.



You can also use a custom fan curve.
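A custom fan curve, whether set in Afterburner, TriXX, or similar tools, is essentially piecewise-linear interpolation between temperature/duty breakpoints. A minimal sketch of the idea; the breakpoints below are purely illustrative, not recommended settings for any particular card:

```python
# Piecewise-linear fan curve: maps GPU temperature (°C) to fan duty (%).
# Breakpoints are illustrative assumptions, not tuned 290/290X values.
CURVE = [(40, 20), (60, 40), (75, 70), (85, 100)]

def fan_duty(temp_c, curve=CURVE):
    """Clamp below/above the curve ends, interpolate linearly between points."""
    if temp_c <= curve[0][0]:
        return curve[0][1]
    if temp_c >= curve[-1][0]:
        return curve[-1][1]
    for (t0, d0), (t1, d1) in zip(curve, curve[1:]):
        if t0 <= temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(67.5))  # halfway between the 60 °C/40% and 75 °C/70% points
```

The same shape can be drawn directly in the fan-curve editors of the usual overclocking tools; steeper segments near the top trade noise for headroom.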


----------



## vodkapl

Could someone running Ubuntu with a 290 and the amdgpu(-pro) drivers provide benchmarks? I'm most interested in The Witcher 2.


----------



## cplifj

Yes, especially in YT vids. I've noticed they moved to more HD vids, but simply put, that little bit more resolution (720p vs 480p) doesn't justify running at max clock speeds when GPU load stays at essentially 0% the whole time. There's really no need to run at max 3D clocks while watching video. I do as I always did and watch vids in IE11.

Nothing else changed on this machine, and it didn't need those high clock speeds before to do video playback (HD or not).

I find no excuse for clock speeds going up while cooling stays down. A graphics card hitting 80°C while watching mere video is total BS in my opinion.

80°C just because its clock speeds are maxed out, with no fan speed-up (it stays at 20%) because no GPU load is measured by AMD's built-in PowerTune (or whatever they call it).

What is with AMD? Do they really want to set the world on fire? And I mean that literally.

Watching YT at 720p keeps the memory clock at 1250 MHz continuously (it doesn't come down during playback), and at 1080p the GPU clock goes to around 800 MHz as well (while GPU load stays at 0%): totally unneeded. It never did that before, and I could watch any 1080p HD video without problems at the low idle clocks of 300/150 MHz and the consequent low idle temperature of 44°C.

So why, AMD? I consider this quick and lazy driver design, nowhere near optimized for a top-line card like the 290X. If AMD thinks this will push me to buy a new card, they're dreaming. It's the same problem across the tech business today: companies create new things while the old ones are still not optimized. The only thing that counts is sales, sales, sales. Product quality and customer satisfaction are treated like a dreadful disease these days.


----------



## rdr09

Quote:


> Originally Posted by *cplifj*
> 
> Yes, especially in YT vids. I've noticed they moved to more HD vids, but simply put, that little bit more resolution (720p vs 480p) doesn't justify running at max clock speeds when GPU load stays at essentially 0% the whole time. There's really no need to run at max 3D clocks while watching video. I do as I always did and watch vids in IE11.
> 
> Nothing else changed on this machine, and it didn't need those high clock speeds before to do video playback (HD or not).
> 
> I find no excuse for clock speeds going up while cooling stays down. A graphics card hitting 80°C while watching mere video is total BS in my opinion.
> 
> 80°C just because its clock speeds are maxed out, with no fan speed-up (it stays at 20%) because no GPU load is measured by AMD's built-in PowerTune (or whatever they call it)..


Try 16.7.2. As you can see, I watch 4K videos and I have no issue such as yours.

BTW, I don't use DDU.


----------



## Echoa

Quote:


> Originally Posted by *vodkapl*
> 
> Could someone running Ubuntu with 290, amdgpu(pro) drivers, provide benchmarks? I'm most interested in witcher 2.


Last I heard it runs kind of poorly, because it uses a wrapper to run on Linux. It'll work fine; it's just a quick-and-dirty port, so good hardware doesn't matter that much when the port itself is the bottleneck.


----------



## vodkapl

This NVIDIA card runs it well, so I'm curious whether the latest drivers give similar results or not.


----------



## Paul17041993

Quote:


> Originally Posted by *spyshagg*
> 
> make that -100mhz (on gpu memory) to be stable.


Probably because it might be one of the only games to make full use of GCN, so a lot more stress is applied to the texture and branch units.
Quote:


> Originally Posted by *cplifj*
> 
> Yes, especially in YT vids. I've noticed they moved to more HD vids, but simply put, that little bit more resolution (720p vs 480p) doesn't justify running at max clock speeds when GPU load stays at essentially 0% the whole time. There's really no need to run at max 3D clocks while watching video. I do as I always did and watch vids in IE11.
> 
> Nothing else changed on this machine, and it didn't need those high clock speeds before to do video playback (HD or not).
> 
> I find no excuse for clock speeds going up while cooling stays down. A graphics card hitting 80°C while watching mere video is total BS in my opinion.
> 
> 80°C just because its clock speeds are maxed out, with no fan speed-up (it stays at 20%) because no GPU load is measured by AMD's built-in PowerTune (or whatever they call it).
> 
> What is with AMD? Do they really want to set the world on fire? And I mean that literally.
> 
> Watching YT at 720p keeps the memory clock at 1250 MHz continuously (it doesn't come down during playback), and at 1080p the GPU clock goes to around 800 MHz as well (while GPU load stays at 0%): totally unneeded. It never did that before, and I could watch any 1080p HD video without problems at the low idle clocks of 300/150 MHz and the consequent low idle temperature of 44°C.
> 
> So why, AMD? I consider this quick and lazy driver design, nowhere near optimized for a top-line card like the 290X. If AMD thinks this will push me to buy a new card, they're dreaming. It's the same problem across the tech business today: companies create new things while the old ones are still not optimized. The only thing that counts is sales, sales, sales. Product quality and customer satisfaction are treated like a dreadful disease these days.


If you don't like it, turn off HW acceleration; otherwise just let the GPU do its thing. Temperatures don't define power draw, either.

The reason these cores jump to extreme clocks and voltages on moderate loads is to maintain stability and consistency, compared to the stutters and crashes you can see on [some] Intel and NVIDIA hardware.
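A rough way to see why forced 3D clocks heat the card even at near-0% "GPU load" is the CMOS dynamic-power relation P ≈ C·V²·f: power scales with frequency and the square of voltage regardless of utilization. A sketch with placeholder voltage and capacitance values (assumed for illustration, not measured Hawaii data):

```python
# CMOS dynamic power: P ≈ C_eff * V^2 * f. Utilization ("GPU load")
# doesn't appear in the formula; clocks and voltage alone set the floor.
def dynamic_power(c_eff, volts, freq_hz):
    return c_eff * volts**2 * freq_hz

# Placeholder operating points (assumptions, not real Hawaii figures):
idle     = dynamic_power(1.0, 0.90, 300e6)  # 300 MHz idle clock
playback = dynamic_power(1.0, 1.10, 800e6)  # ~800 MHz during video playback
print(f"playback draws ~{playback / idle:.1f}x idle dynamic power")
```

Even with these made-up numbers, the ratio comes out around 4x, which is why a card parked at 3D clocks with a 20% fan gets warm during playback despite the load meter reading zero.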


----------



## rdr09

Update for cplifj: I updated to the latest driver, and it does max out the memory clock when watching videos. The core goes to about 460 MHz from 300 MHz. My 290s are watercooled, so my temps stay pretty much the same. Maxing the memory, though, doesn't really heat up the GPU; the core does, if it reaches max speed.

I tested YT using IE and Firefox.


----------



## demitrisln

Hey all, I have a Sapphire R9 290X and a VisionTek CryoVenom R9 290 in CrossFire. My rig is almost complete, but I can't find a way to make the power cables look better. I used combs, but it still doesn't look great... Am I missing anything?


----------



## battleaxe

Quote:


> Originally Posted by *demitrisln*
> 
> Hey all, I have a Sapphire R9 290X and a VisionTek CryoVenom R9 290 in CrossFire. My rig is almost complete, but I can't find a way to make the power cables look better. I used combs, but it still doesn't look great... Am I missing anything?


What if you ran them between the GPUs? That way you would only see them fold over for about an inch before they disappear. Also, try to make the ones coming out of the mobo go straight into the slot, and make the loop shorter.


----------



## Roboyto

Quote:


> Originally Posted by *demitrisln*
> 
> Hey all, I have a Sapphire R9 290X and a VisionTek CryoVenom R9 290 in CrossFire. My rig is almost complete, but I can't find a way to make the power cables look better. I used combs, but it still doesn't look great... Am I missing anything?


Quote:


> Originally Posted by *battleaxe*
> 
> What if you ran them between the GPUs? That way you would only see them fold over for about an inch before they disappear. Also, try to make the ones coming out of the mobo go straight into the slot, and make the loop shorter.


^ What he said









Setup looks nice


----------



## demitrisln

So I would run the cables for the top GPU behind and on top of the bottom graphics card, correct?

Also, what do you mean by "try to make the ones coming out of the mobo go straight into the slot and make the loop shorter"?


----------



## Roboyto

Quote:


> Originally Posted by *demitrisln*
> 
> So I would run the cables for the top GPU behind and on top of the bottom graphics card, correct?
> 
> Also, what do you mean by "try to make the ones coming out of the mobo go straight into the slot and make the loop shorter"?




I'd run both sets of wires through that grommet there. Either use the combs or zip-tie them all together to get a nice tight bundle of wires.


----------



## jagdtigger

Does anyone have any idea what could cause this while overclocked? I only pushed the core, so I don't think it's the RAM. Driver: 16.7.3.

(Don't mind the dips on the graph; that's just TriXX setting the voltage and clock while the bench was running.)

I tried adding more mV but it didn't help, and the same goes for lowering it. And why is only this one doing it ([email protected], DP)?


----------



## mus1mus

That's a signal-loss issue due to too-high core voltage. It's not that the card is unstable (artifacting); it's that the 0.95 V rail isn't enough to keep the display signal going.

It can be modded. Otherwise, it's a card limit, not the GPU core.

Here is the mod mentioned: http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/1700_50#post_24870702

It can sometimes be avoided by finding the core-voltage threshold where it happens, but that also depends on your load. Catzilla might allow, say, 1.35 V where Heaven will black-screen at 1.3 V.

If you're just benchmarking, don't worry: scores are unaffected. But for gaming (though you're unlikely to game at those clocks), just clock it down and use a fair amount of voltage to keep the signal going.


----------



## jagdtigger

Quote:


> Originally Posted by *mus1mus*
> 
> That's a signal-loss issue due to too-high core voltage. It's not that the card is unstable (artifacting); it's that the 0.95 V rail isn't enough to keep the display signal going.
> 
> It can be modded. Otherwise, it's a card limit, not the GPU core.
> 
> Here is the mod mentioned: http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/1700_50#post_24870702
> 
> It can sometimes be avoided by finding the core-voltage threshold where it happens, but that also depends on your load. Catzilla might allow, say, 1.35 V where Heaven will black-screen at 1.3 V.
> 
> If you're just benchmarking, don't worry: scores are unaffected. But for gaming (though you're unlikely to game at those clocks), just clock it down and use a fair amount of voltage to keep the signal going.


Thanks for the link, I'll look into it. BTW, why do you think it's not for gaming?

/EDIT
No thanks, that hard mod is too risky. I'll have to hold out with a lower OC until I can buy a new card...


----------



## mus1mus

Who wants to game with blackscreens?


----------



## Echoa

Quote:


> Originally Posted by *mus1mus*
> 
> Who wants to game with blackscreens?


I do it for the challenge... You're not a real gamer if you can't game blind


----------



## jagdtigger

Quote:


> Originally Posted by *mus1mus*
> 
> Who wants to game with blackscreens?


No one, I guess, but you mentioned the clocks; that's why I asked.









----------



## Paul17041993

Quote:


> Originally Posted by *jagdtigger*
> 
> Does anyone have any idea what could cause this while overclocked? I only pushed the core, so I don't think it's the RAM. Driver: 16.7.3.
> 
> (Don't mind the dips on the graph; that's just TriXX setting the voltage and clock while the bench was running.)
> 
> I tried adding more mV but it didn't help, and the same goes for lowering it. And why is only this one doing it ([email protected], DP)?


That kind of thing happens with my display regardless of clocks and voltage; it's just DP being over-sensitive to pin contact. It tends to happen on very cold mornings and is fixed either by waiting for the sun to come up or by re-seating either end of the cable.

That said, the display-rail issue is still a possibility, but only if it occurs with a high core clock and a voltage increase.


----------



## mus1mus

Hmm. Nice catch!

It probably is. I haven't done benches with dual monitors, so that could be it.

I assumed it simply because he said the card is overclocked.


----------



## jagdtigger

Quote:


> Originally Posted by *mus1mus*
> 
> hmm. Nice catch!
> 
> It probably is. I haven't done benches with dual monitors so that can be it.
> 
> I assumed simply because he said, the card is overclocked.


I actually have 3 monitors and 1 TV hooked up to the 290X.









----------



## The Storm

Is anyone having problems with low performance in 3DMark and 3DMark 11 in CrossFire? I get about the same score in CrossFire as with a single GPU, and my combined score drops drastically in CrossFire mode. When gaming, like in BF4, both cards load up great and get great FPS, so I'm not sure what's going on. I have uninstalled drivers many times, and uninstalled and reinstalled both 3DMarks, still with no luck. I've had these 290Xs under water since launch and hadn't benched in over a year; I just thought I'd try a bench for giggles, and now I'm baffled.


----------



## ManofGod1000

Quote:


> Originally Posted by *The Storm*
> 
> Is anyone having problems with low performance in 3DMark and 3DMark 11 in CrossFire? I get about the same score in CrossFire as with a single GPU, and my combined score drops drastically in CrossFire mode. When gaming, like in BF4, both cards load up great and get great FPS, so I'm not sure what's going on. I have uninstalled drivers many times, and uninstalled and reinstalled both 3DMarks, still with no luck. I've had these 290Xs under water since launch and hadn't benched in over a year; I just thought I'd try a bench for giggles, and now I'm baffled.


That is just the way it is. Your graphics score will nearly double, but the combined score drops like a rock. That's the way it has always been, and it was never fixed as far as I'm aware.


----------



## The Storm

Quote:


> Originally Posted by *ManofGod1000*
> 
> That is just the way it is. Your graphics score will nearly double, but the combined score drops like a rock. That's the way it has always been, and it was never fixed as far as I'm aware.


Unfortunately my graphics score is not doubling; my FPS during the tests stays the same. I get 70 FPS in the first test single-GPU, and 68 FPS CrossFired...
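Those numbers work out to essentially zero CrossFire scaling. A quick sanity-check calculation; the "healthy" threshold in the comment is an assumption based on typical 290X CrossFire results, not a spec:

```python
# Multi-GPU scaling efficiency: measured dual-GPU fps as a
# percentage of the ideal n_gpus * single-GPU fps.
def cfx_scaling(single_fps, dual_fps, n_gpus=2):
    return dual_fps / (single_fps * n_gpus) * 100.0

# The numbers above: 70 fps single-GPU vs 68 fps in CrossFire.
print(f"{cfx_scaling(70, 68):.0f}% of ideal 2-way scaling")  # prints 49%
# A healthy 2-way result typically lands well above 80% (assumed rule of thumb).
```

Anything hovering near 50% for a 2-way setup means the second card is contributing nothing, which points at a profile, driver, or platform problem rather than the benchmark itself.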


----------



## mus1mus

Look into your XFire profile. It may need to be set to Global.


----------



## The Storm

Problem solved! I'd been working on this issue for days and could not for the life of me figure it out. Just on a hunch I removed and re-seated my 4 RAM sticks... Bam! FPS is screaming again in both 3DMark 11 and 3DMark. Time for sleep.


----------



## serave

Is it possible to brick a card by uninstalling a driver via DDU?

My 290 went from a perfectly fine card to one that won't give a signal to my monitor.

I returned the BIOS to stock in both switch positions prior to uninstalling the driver, so I doubt it's the BIOS causing the problem.

I tried updating my motherboard BIOS, resetting the CMOS, and reinstalling the OS; nothing works.

Has anyone experienced the same in the past?

I can get an RX 470/480 if the card is indeed dead (which I highly doubt).

Still, the performance jump from an R9 290 to either a 470 or 480 simply isn't worth it in my book.


----------



## TheReciever

Most of the time when I uninstall drivers, it keeps VSR enabled for some reason, so I have to alter the default resolution.


----------



## serave

Quote:


> Originally Posted by *TheReciever*
> 
> Most of the time when I uninstall drivers, it keeps VSR enabled for some reason, so I have to alter the default resolution.


I might try that in a bit; I've been scratching my head over this problem for the last 3 days.

I highly suspect it's leftover driver files, though. Still, I did a clean install of my Windows 8 and the problem won't go away, lol.

Perhaps the refresh rate plays a role as well in this problem (64 Hz for some weird reason; I can't change that either).


----------



## TheReciever

Try running Custom Resolution Utility to see what default values are in place and make sure they match what your monitor supports.

Then you can also look in Device Manager and delete any stale monitors that may be present (I think); it's been a while since I've had to do that.


----------



## pshootr

I finally moved to the latest Crimson drivers. No issues so far, and I did get a higher Fire Strike score. Some time to test is in order, but so far so good.


----------



## serave

Seems like my motherboard is the culprit; I get the same no-signal problem using my spare GPUs (GTX 560 Ti, HD 7790, HD 2400).

What a relief knowing it's not my 290 that's been acting up. Seems like it was nowhere near a driver problem, though.


----------



## pshootr

Quote:


> Originally Posted by *serave*
> 
> I might try that in a bit; I've been scratching my head over this problem for the last 3 days.
> 
> I highly suspect it's leftover driver files, though. Still, I did a clean install of my Windows 8 and the problem won't go away, lol.
> 
> Perhaps the refresh rate plays a role as well in this problem (64 Hz for some weird reason; I can't change that either).


You should never have to reinstall Windows over this. If you encounter issues due to a driver, use DDU to remove the old drivers before installing the new one.

Quote:


> Originally Posted by *serave*
> 
> Seems like my motherboard is the culprit; I get the same no-signal problem using my spare GPUs (GTX 560 Ti, HD 7790, HD 2400).
> 
> What a relief knowing it's not my 290 that's been acting up. Seems like it was nowhere near a driver problem, though.


Do you by chance have a spare monitor to test with just to rule that out?


----------



## serave

@pshootr

That's what I thought at first as well. The sad thing is, it's literally DDU that caused my 290 to lose signal in the first place.

I cleaned my old driver without restarting first, but then the 16.8.2 installer didn't show any driver to install; it shows something like this:

I thought it would go away if I opted for "clean and restart", and the damn thing just won't start again, lol.

Also no, I don't have a spare monitor to test with; I doubt it's the monitor, though. Thanks for taking the time to reply, by the way.

I can live with the iGPU for now.


----------



## David28557

GPU-Z validation link: https://www.techpowerup.com/gpuz/details/94f8g

Manufacturer: MSI Gaming R9 290x 8GB GDDR5

Cooling: Stock Cooling


----------



## mirzet1976

Quote:


> Originally Posted by *serave*
> 
> @pshootr
> 
> That's what I thought at first as well. The sad thing is, it's literally DDU that caused my 290 to lose signal in the first place.
> 
> I cleaned my old driver without restarting first, but then the 16.8.2 installer didn't show any driver to install; it shows something like this:
> 
> I thought it would go away if I opted for "clean and restart", and the damn thing just won't start again, lol.
> 
> Also no, I don't have a spare monitor to test with; I doubt it's the monitor, though. Thanks for taking the time to reply, by the way.
> 
> I can live with the iGPU for now.


Go to Device Manager and uninstall the driver from there, if it's present.


----------



## diggiddi

Quote:


> Originally Posted by *serave*
> 
> @pshootr
> 
> That's what I thought at first as well. The sad thing is, it's literally DDU that caused my 290 to lose signal in the first place.
> 
> I cleaned my old driver without restarting first, but then the 16.8.2 installer didn't show any driver to install; it shows something like this:
> 
> I thought it would go away if I opted for "clean and restart", and the damn thing just won't start again, lol.
> 
> Also no, I don't have a spare monitor to test with; I doubt it's the monitor, though. Thanks for taking the time to reply, by the way.
> 
> I can live with the iGPU for now.


Are you running a firewall while installing? If so, turn it off until the install completes and you've restarted.


----------



## Paul17041993

Quote:


> Originally Posted by *serave*
> 
> @pshootr
> 
> That's what I thought at first as well. The sad thing is, it's literally DDU that caused my 290 to lose signal in the first place.
> 
> I cleaned my old driver without restarting first, but then the 16.8.2 installer didn't show any driver to install; it shows something like this:
> 
> I thought it would go away if I opted for "clean and restart", and the damn thing just won't start again, lol.
> 
> Also no, I don't have a spare monitor to test with; I doubt it's the monitor, though. Thanks for taking the time to reply, by the way.
> 
> I can live with the iGPU for now.


Quote:


> Originally Posted by *mirzet1976*
> 
> Go to Device manager and uninstall driver from there if ist present.


^ This. What you're getting generally occurs when the drivers are corrupted, so go to Device Manager and uninstall the card there. Additionally, you should probably delete the AMD folder under C:\ so you get a fully clean start.

If problems continue, however, you'll need either a Windows reinstall or a new mobo...


----------



## mfknjadagr8

Quote:


> Originally Posted by *Paul17041993*
> 
> ^ This. What you're getting generally occurs when the drivers are corrupted, so go to Device Manager and uninstall the card there. Additionally, you should probably delete the AMD folder under C:\ so you have a fully clean start.
> 
> If problems continue, however, you'll either need a Windows re-install or a new mobo...


DDU... always use DDU in Safe Mode.


----------



## semitope

Quote:


> Originally Posted by *serave*
> 
> @pshootr
> 
> That's what I've been thinking at first aswell, sad thing is, it's literally DDU that causes my 290 to lost signal in the first place
> 
> I cleaned my old driver without restarting first, but then the 16.8.2 installer doesn't show any driver is going to be installed, it shows something like this
> 
> 
> 
> I thought it would go away if i opted for the "clean and restart" and the damn thing just wont start again, lol
> 
> also no, I don't have any spare monitor to test it out, I doubt its the monitor though, thanks for taking the time to reply by the way
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I can live with the igpu for now


I think I had something like this with my Fury. One cause might be the Windows Anniversary Update's new driver-signing requirements on a fresh install. I also had some SSD corruption that was causing issues (probably separate). Windows Update could be messing things up as well.


----------



## 113802

RAIJINTEK MORPHEUS II CORE EDITION or Corsair Hydro HG10 A1 + H80i v2 for VRM cooling on a reference AMD R9 290? The core stays around 73-75°C, but the VRM/memory get hot. Noise isn't an issue either way, since I'll be using Gentle Typhoons on whichever one I get. Which would cool the VRM/memory more efficiently?


----------



## Roboyto

Quote:


> Originally Posted by *WannaBeOCer*
> 
> RAIJINTEK MORPHEUS II CORE EDITION or Corsair Hydro HG10 A1 + H80i v2 for VRM cooling on a reference AMD R9 290? The core stays around 73-75°C, but the VRM/memory get hot. Noise isn't an issue either way, since I'll be using Gentle Typhoons on whichever one I get. Which would cool the VRM/memory more efficiently?


I don't know anything about the Raijintek, but the Hydro does a decent job of cooling the VRMs, since it uses a fairly large sink and the reference fan solely for the VRMs. The RAM you have little to worry about as far as temps are concerned, as it does not generate much heat. There's bound to be some information, either here or on another site, where someone has posted VRM temps from the HG10 setup... do a little research. My gut feeling is that the HG10 would be the better solution if you won't have an issue with placement of the radiator.

If you want to optimize the efficacy of either cooler for the VRMs, I would suggest purchasing some good thermal pads. You can see drastic temperature drops from using Fujipoly Ultra Extreme thermal pads; somewhere in the vicinity of 20% from my findings. http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures


----------



## Pillendreher

Hey guys,

I got myself a new Asus Radeon R9 290X DirectCU II OC yesterday for 180 €. I was planning on getting an RX 480 to replace my HD 6950, but I'd rather get something cheaper and wait for the next (real) performance jump than pay 260+ € for a card that's barely better than an R9 290(X) or an R9 390(X).

I'm very excited to see what kind of cooling performance I'm gonna get. Since I'm not under any pressure to keep the card (it was a 'spur of the moment' buy: I was looking for used ones and saw that a German retailer was selling the DCII, so I said 'Yeah, let's give it a try'), I don't think I'll tinker all that much. Maybe check the thermal paste, but that'll be it, I reckon.


----------



## Echoa

Quote:


> Originally Posted by *Pillendreher*
> 
> Hey guys,
> 
> I got myself a new Asus Radeon R9 290X DirectCU II OC yesterday for 180 €. I was planning on getting an RX 480 to replace my HD 6950, but I'd rather get something cheaper and wait for the next (real) performance jump than pay 260+ € for a card that's barely better than an R9 290(X) or an R9 390(X).
> 
> I'm very excited to see what kind of cooling performance I'm gonna get. Since I'm not under any pressure to keep the card (it was a 'spur of the moment' buy: I was looking for used ones and saw that a German retailer was selling the DCII, so I said 'Yeah, let's give it a try'), I don't think I'll tinker all that much. Maybe check the thermal paste, but that'll be it, I reckon.


Cooling will be good enough but nothing to write home about. You'll see a massive performance increase, though. Expect average gaming temps around 60-80°C depending on your case, and stress-test temps around 80-90°C.

It'll be similar to my VisionTek 290X temps-wise.


----------



## Pillendreher

I'm a little worried, to be honest. There are two huge threads regarding the ASUS 290X Matrix cooling (100 pages combined), which should be the same as the DCII's. The consensus was that the cooling design was basically trash. People reapplied thermal paste and removed the plastic covering in order to get better temps...


----------



## Roboyto

Quote:


> Originally Posted by *Pillendreher*
> 
> I'm a little worried, to be honest. There are two huge threads regarding the ASUS 290X Matrix cooling (100 pages combined), which should be the same as the DCII's. The consensus was that the cooling design was basically trash. People reapplied thermal paste and removed the plastic covering in order to get better temps...


That's pretty accurate. ASUS was lazy as hell and took the cooler from the GTX 780 without altering anything. The result was contact on only 2.5 of 5 heatpipes and subpar cooling. They recently did the same thing with the 1060/480... AMD getting the shaft again. Go to the first page of this thread, and in the useful info section I have a write-up called the 290(X) Need to Know. There's a little blurb in that post about the DC2.

The only good thing to do with the DC2 290(X) is water-cool it, if you ask me.


----------



## Pillendreher

Well, like I said, it was a spontaneous decision. I was looking at used cards and came upon this offer by chance. I thought 'Ah, what the heck. I'll give it a try and if the cooling really sucks, I'll return it'. The weird thing is: it's like there are two cards out there. Buyers/users complain about the cooling, yet it seems okay in some reviews, like

http://www.guru3d.com/articles_pages/asus_radeon_r9_290x_directcuii_oc_review,10.html
http://www.guru3d.com/articles_pages/asus_radeon_r9_290x_directcuii_oc_review,12.html

http://www.legitreviews.com/asus-r9-290x-directcu-ii-sapphire-r9-290x-tri-x-video-card-reviews_132158/11

http://www.pcper.com/reviews/Graphics-Cards/ASUS-Radeon-R9-290X-DirectCU-II-Graphics-Card-Review/Cooling-and-Clock-Benefi

https://www.techpowerup.com/reviews/ASUS/R9_290X_Direct_Cu_II_OC/23.html
https://www.techpowerup.com/reviews/ASUS/R9_290X_Direct_Cu_II_OC/28.html

So I don't know what's up with that. I should get it tomorrow or on Friday, and I'll give it a try. I thought I'd first try the new 290X for 180 € before getting a used one for a little bit less.


----------



## Roboyto

Quote:


> Originally Posted by *Pillendreher*
> 
> Well, like I said, it was a spontaneous decision. I was looking at used cards and came upon this offer by chance. I thought 'Ah, what the heck. I'll give it a try and if the cooling really sucks, I'll return it'. The weird thing is: it's like there are two cards out there. Buyers/users complain about the cooling, yet it seems okay in some reviews, like
> 
> http://www.guru3d.com/articles_pages/asus_radeon_r9_290x_directcuii_oc_review,10.html
> http://www.guru3d.com/articles_pages/asus_radeon_r9_290x_directcuii_oc_review,12.html
> 
> http://www.legitreviews.com/asus-r9-290x-directcu-ii-sapphire-r9-290x-tri-x-video-card-reviews_132158/11
> 
> http://www.pcper.com/reviews/Graphics-Cards/ASUS-Radeon-R9-290X-DirectCU-II-Graphics-Card-Review/Cooling-and-Clock-Benefi
> 
> https://www.techpowerup.com/reviews/ASUS/R9_290X_Direct_Cu_II_OC/23.html
> https://www.techpowerup.com/reviews/ASUS/R9_290X_Direct_Cu_II_OC/28.html
> 
> So I don't know what's up with that. I should get it tomorrow or on Friday, and I'll give it a try. I thought I'd first try the new 290X for 180 € before getting a used one for a little bit less.


I believe the difference between some reviews and user complaints may be an open test bench/case. If you're going to keep this card, or any air cooled 290/390 for that matter, in a closed case setting, then you must have excellent airflow through it.

Even if the temps are fair under normal use, attempting to push the limits with an overclock/overvolt will cause you to hit a wall quickly and be limited by temperature.

Can't hurt to test the card and see what happens though.


----------



## Echoa

Quote:


> Originally Posted by *Roboyto*
> 
> I believe the difference between some reviews and user complaints may be an open test bench/case. If you're going to keep this card, or any air cooled 290/390 for that matter, in a closed case setting, then you must have excellent airflow through it.
> 
> Even if the temps are fair under normal use, attempting to push the limits with an overclock/overvolt will cause you to hit a wall quickly and be limited by temperature.
> 
> Can't hurt to test the card and see what happens though.


This is the problem with my VisionTek 290X: raising the voltage quickly throws me into a temperature limit. I'm not too worried about it, as it still maxes everything at 1080p at 1100/1375.


----------



## Roboyto

Quote:


> Originally Posted by *Echoa*
> 
> This is the problem with my VisionTek 290X: raising the voltage quickly throws me into a temperature limit. I'm not too worried about it, as it still maxes everything at 1080p at 1100/1375.


As is the case with pretty much any air-cooled Hawaii card... just how it is.

1100/1375 are solid clocks for air cooling, though... Plus, 1200 core is about where you start seeing diminishing returns anyhow, so I wouldn't worry about it.

I'm selling my kick-ass 290 with waterblock if you're interested  All set up, ready to rock and roll at ~1300/~1700 depending on what bench/game you'd be running. I've always run it at 1200/1500 for gaming, and it has served me quite well driving 3K Eyefinity for the last couple of years.

The $379 Fury X seemed like too good of a deal to pass up with the questionable release date of the Vega cards. I figure the Fury X can at least get me through this fall/winter if Vega won't be out until Q1 2017.

http://www.overclock.net/t/1611205/xfx-black-edition-r9-290-w-xspc-razor-block-backplate


----------



## Paul17041993

Don't be worried about the core hitting 95°C, as it's designed to handle it. However, if it throttles and/or makes a lot of noise, you'll need to make sure you have plenty of case airflow. The ASUS coolers are *ok*: better than reference, but poor compared to the coolers from Gigabyte and Sapphire. If you don't mind, you can always bump the TDP level down (a forced throttle) and the card will still perform great at 1080p-1440p with little heat.

Results vary between cards, however; some can pull a lot more power (and turn it into heat) than others. Also, if the card throttles badly on moderate loads, you'll need to check the paste and make sure the heatsink sits correctly. ASUS in particular has had problems with the core screws (the 4 centre ones with springs) being too loose on some cards.


----------



## ABigGuyForYou

Argh, I messed up big time. Haven't upgraded my drivers in a long time. Was using Catalyst and when I installed it switched to Crimson, which reset all of my overclock settings to the defaults. Now I'm having some trouble running games, and can't remember how I had everything set before. Tried using a guide I found online, but it's giving me more problems than leaving things stock. Could anybody share a modern overclocking guide, preferably focused on the Crimson software, so I can pinpoint if it's me or the card that's messing up? (it's probably me)


----------



## diggiddi

Hey all, is anyone else not seeing the button to enable VSR under Additional Settings? 16.9.1
Also, could someone with CrossFire run Crysis 3 at 3200x1800, 3440x1440, or 4K @ max settings (anti-aliasing 4x, 8x) and report VRAM usage?
Will be repped up for sure.

Edit: Found where they moved the VSR controls to; unfortunately, I can't seem to get it to work.


----------



## Paul17041993

Quote:


> Originally Posted by *ABigGuyForYou*
> 
> Argh, I messed up big time. Haven't upgraded my drivers in a long time. Was using Catalyst and when I installed it switched to Crimson, which reset all of my overclock settings to the defaults. Now I'm having some trouble running games, and can't remember how I had everything set before. Tried using a guide I found online, but it's giving me more problems than leaving things stock. Could anybody share a modern overclocking guide, preferably focused on the Crimson software, so I can pinpoint if it's me or the card that's messing up? (it's probably me)


You may or may not need to run a GPU driver cleaner tool and clean-install the drivers again. I know I had to on mine, because something weird happened to the profiles (phantom profiles that could not be overwritten or removed).

Other than that, I haven't used any guides (besides BIOS modding). I just bump the clocks by 50-100 MHz, increase the voltage also by 50-100 mV until it's stable, and see how high the clocks can go before they can't be stabilised (or the card black-screens). The key thing to remember, though, is to turn off CCC/Crimson's startup entry so that it doesn't try to set the clocks on boot in case of a crash.
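The step-up approach above can be sketched as a simple search loop. This is only an illustration of the procedure, not a real tuning tool: `is_stable` is a hypothetical stand-in for whatever stress test you run at each step, and the step size and limit are assumptions.

```python
def find_max_clock(base_mhz, is_stable, step=50, limit=1300):
    """Walk the core clock up in `step` MHz increments until `is_stable`
    fails or `limit` is reached; return the last stable clock.
    `is_stable` stands in for a real stress test at each candidate clock."""
    best = base_mhz
    clock = base_mhz + step
    while clock <= limit:
        if not is_stable(clock):
            break           # crash/black screen: keep the previous clock
        best = clock
        clock += step
    return best

# Example with a fake stability test: card is stable up to 1150 MHz.
print(find_max_clock(1000, lambda mhz: mhz <= 1150))  # -> 1150
```

In practice you would do the same walk for voltage, and (as noted above) keep the overclock out of the startup profile so a crash at boot doesn't re-apply the bad clocks.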


----------



## mus1mus

Yeah, DDU is not needed. Same as 90C is fine for temps.


----------



## 222Panther222

Should I buy a used 290 for $200? It seems to have 970 performance, which still costs a lot here, and the 480s are $380. Also, my 560 Ti's fans are kaput and I'm on a tight budget...


----------



## 113802

Quote:


> Originally Posted by *222Panther222*
> 
> Should I buy a used 290 for $200? It seems to have 970 performance, which still costs a lot here, and the 480s are $380. Also, my 560 Ti's fans are kaput and I'm on a tight budget...


No; a ton of 290X/290s use Elpida RAM that causes frequent black screens. Look for an RX 480 for $250 or below. Sorry, just noticed you are in Canada. Try to get a used 390/390X if you can.


----------



## mfknjadagr8

Quote:


> Originally Posted by *mus1mus*
> 
> Yeah, DDU is not needed. Same as 90C is fine for temps.


Your sarcasm knows no bounds... I can't express how many people I've seen and heard of selling off AMD cards because they refused to do a clean install of the drivers and then had issues... just wish I'd have had extra money then, lol... DDU is needed, IMO, every time you do anything with drivers...


----------



## Roboyto

Quote:


> Originally Posted by *222Panther222*
> 
> Should I buy a used 290 for $200? It seems to have 970 performance, which still costs a lot here, and the 480s are $380. Also, my 560 Ti's fans are kaput and I'm on a tight budget...


Yes, the 290 does have roughly 970 performance for a lesser cost. It depends on how well you can overclock the 290, so it may perform slightly worse than a 970. If you are running games that will suck up VRAM, then the 290(X) is the better choice, since you can utilize all 4GB of VRAM without any penalties.

If you have questions about 290 cards in general my Need to Know post is a good place to start:

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781

Quote:


> Originally Posted by *WannaBeOCer*
> 
> No; a ton of 290X/290s use Elpida RAM that causes frequent black screens. Look for an RX 480 for $250 or below.


I must disagree and say this is bad advice. Plus, he said 'here the 480 are $380'; you should have checked where he resides, which is not in the US.

Just because a GPU has Elpida RAM doesn't mean that it is prone to black screening. Elpida RAM, generally speaking, on these GPUs did not overclock as well as their Hynix counterparts. There are plenty of people in this forum though that have had good luck with overclocking their Elpida equipped cards; they are not as common as good Hynix cards however.

These cards did have black screen problems when they were first released, a long time ago, some cards needed RMA and AMD fixed another issue promptly with a driver update.

If a Hawaii card is black screening then odds are that the problem is caused by the memory being overclocked too far. The memory OC is much less important on these cards as they already have tons of bandwidth. Not saying that you can't get extra performance from a memory OC, but it's nowhere near as effective as the core.
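The bandwidth point is easy to check with a quick back-of-the-envelope calculation. A minimal sketch (the 290's 512-bit bus at 5 Gbps effective, i.e. 1250 MHz GDDR5, is the stock spec; the OC figure is an example):

```python
def bandwidth_gbs(bus_bits, effective_gbps):
    """Peak memory bandwidth in GB/s: (bus width in bits / 8) * per-pin rate in Gbps."""
    return bus_bits / 8 * effective_gbps

# R9 290(X) stock: 512-bit bus, 5.0 Gbps effective (1250 MHz GDDR5)
stock = bandwidth_gbs(512, 5.0)   # 320.0 GB/s
# A 100 MHz memory OC (1350 MHz -> 5.4 Gbps effective) adds only ~8%
oc = bandwidth_gbs(512, 5.4)      # ~345.6 GB/s
print(round(stock, 1), round(oc, 1))
```

Even a healthy memory overclock only moves total bandwidth by a few percent, which is why core clock gains tend to matter more on Hawaii.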


----------



## rdr09

Quote:


> Originally Posted by *WannaBeOCer*
> 
> No; a ton of 290X/290s use Elpida RAM that causes frequent black screens. Look for an RX 480 for $250 or below. Sorry, just noticed you are in Canada. Try to get a used 390/390X if you can.


Both of my reference 290s use Elpida. One I bought at launch and the other used after a year. Zero black screens.

Contrary to the other posters, I don't ever use DDU.

EDIT: Wait, I did get a black screen OC'ing beyond 1300 core.


----------



## Echoa

I just had to share this, I was in the middle of benching and my 290x caught on fire lol

I've heard of this kinda thing before but it's never happened to me

Edit: one of the vrms popped and it smells awful, Im not even mad I just laughed


----------



## 222Panther222

The model is a Club 3D RoyalKing 4GB.


----------



## 113802

Quote:


> Originally Posted by *Roboyto*
> 
> Yes, the 290 does have roughly 970 performance for a lesser cost. It depends on how well you can overclock the 290, so it may perform slightly worse than a 970. If you are running games that will suck up VRAM, then the 290(X) is the better choice, since you can utilize all 4GB of VRAM without any penalties.
> 
> If you have questions about 290 cards in general my Need to Know post is a good place to start:
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781
> 
> I must disagree and say this is bad advice. Plus he said 'here the 480 are $380', you should have checked where he resides, which is not in the US.
> 
> Just because a GPU has Elpida RAM doesn't mean that it is prone to black screening. Elpida RAM, generally speaking, on these GPUs did not overclock as well as their Hynix counterparts. There are plenty of people in this forum though that have had good luck with overclocking their Elpida equipped cards; they are not as common as good Hynix cards however.
> 
> These cards did have black screen problems when they were first released, a long time ago, some cards needed RMA and AMD fixed another issue promptly with a driver update.
> 
> If a Hawaii card is black screening then odds are that the problem is caused by the memory being overclocked too far. The memory OC is much less important on these cards as they already have tons of bandwidth. Not saying that you can't get extra performance from a memory OC, but it's nowhere near as effective as the core.


I would stay away from any reference R9 290/X, especially used, because they use Elpida memory. Elpida RAM is known for not overclocking well, that is correct, but it is also famous for causing black screens on R9 290 cards. It might not affect all users, but a high percentage of users experience it, including my R9 290, which XFX wouldn't replace. All XFX-PCB R9 390/Xs use Hynix memory.

http://www.overclock.net/t/1441349/290-290x-black-screen-poll


----------



## mirzet1976

Quote:


> Originally Posted by *rdr09*
> 
> Both of my reference 290s use Elpida. One I bought at launch and the other used after a year. Zero black screens.
> 
> Contrary to the other posters, I don't ever use DDU.
> 
> EDIT: Wait, I did get a black screen OC'ing beyond 1300 core.


Same here. I have an Elpida-memory reference R9 290 and never had the driver stop working or a black screen, except in cases when I OC over 1325MHz for benching.


----------



## rdr09

Quote:


> Originally Posted by *222Panther222*
> 
> The model is a Club 3D RoyalKing 4GB.


I saw a pic of it. Looks like it has a beefy cooler and a heatsink on VRM2, which heats up a lot more than VRM1 (the one near the bracket).

I say make sure you can return it within a certain window. $200 is a no-go if in the US, but since you are from up north, I guess it is still a good buy. Just make sure to ask the seller for some kind of warranty, like with any other used item.


----------



## Roboyto

Quote:


> Originally Posted by *WannaBeOCer*
> 
> I would stay away from any reference R9 290/X, especially used, because they use Elpida memory. Elpida RAM is known for not overclocking well, that is correct, but it is also famous for causing black screens on R9 290 cards. It might not affect all users, but a high percentage of users experience it, including my R9 290, which XFX wouldn't replace. All XFX-PCB R9 390/Xs use Hynix memory.
> 
> http://www.overclock.net/t/1441349/290-290x-black-screen-poll


Not all reference boards have Elpida RAM though; at least my XFX Black Edition reference cards didn't. I also had 2 other reference cards, PowerColor and XFX, with Elpida that ran fine but only overclocked about 10% on the RAM. Never had problems like that unless I pushed the RAM too far... but maybe I'm just lucky, getting 2 cards with Hynix and 2 cards that were problem free.


----------



## rdr09

Quote:


> Originally Posted by *WannaBeOCer*
> 
> I would stay away from any reference R9 290/X, especially used, because they use Elpida memory. Elpida RAM is known for not overclocking well, that is correct, but it is also famous for causing black screens on R9 290 cards. It might not affect all users, but a high percentage of users experience it, including my R9 290, which XFX wouldn't replace. All XFX-PCB R9 390/Xs use Hynix memory.
> 
> http://www.overclock.net/t/1441349/290-290x-black-screen-poll


If you look at that poll and how old it is, the numbers have pretty much stayed the same. Some black-screened due to flashing other BIOSes, insufficient power delivery, OC'ing, etc.


----------



## fyzzz

Never had any issues with my Elpida 290s. I only get black screens/signal loss when I'm using my 4K screen or really high overclocks. Also, not all 390s have Hynix memory; I know some Sapphire cards have Elpida, and also XFX. My 390 from XFX has Elpida memory, but it's a lot better than the Elpida memory on the 290s.


----------



## 113802

Quote:


> Originally Posted by *rdr09*
> 
> If you look at that poll and how old it is, the numbers have pretty much stayed the same. Some black-screened due to flashing other BIOSes, insufficient power delivery, OC'ing, etc.


Multiple people in the thread are running a stock R9 290 like I am, and it randomly black screens. It will work fine for a few weeks and then all of a sudden it blacks out. Always running the latest drivers. Latest official XFX UEFI. It's a horrible experience and it definitely relates to the Elpida RAM. Even with a waterblock/Raijintek Morpheus it black screens at stock.

https://www.techpowerup.com/gpuz/details/br6fm


----------



## rdr09

Quote:


> Originally Posted by *WannaBeOCer*
> 
> Multiple people in the thread are running a stock R9 290 like I am, and it randomly black screens. It will work fine for a few weeks and then all of a sudden it blacks out. Always running the latest drivers. Latest official XFX UEFI. It's a horrible experience and it definitely relates to the Elpida RAM. Even with a waterblock/Raijintek Morpheus it black screens at stock.
> 
> https://www.techpowerup.com/gpuz/details/br6fm


I am not disputing the existence of black screens with Hawaii. A ton? Nah. We have a lot of members in the club not having issues at all.


----------



## 222Panther222

Quote:


> Originally Posted by *Roboyto*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> Yes, the 290 does have roughly 970 performance for a lesser cost. It depends on how well you can overclock the 290, so it may perform slightly worse than a 970. If you are running games that will suck up VRAM, then the 290(X) is the better choice, since you can utilize all 4GB of VRAM without any penalties.
> 
> If you have questions about 290 cards in general my Need to Know post is a good place to start:
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781
> 
> I must disagree and say this is bad advice. Plus he said 'here the 480 are $380', you should have checked where he resides, which is not in the US.
> 
> Just because a GPU has Elpida RAM doesn't mean that it is prone to black screening. Elpida RAM, generally speaking, on these GPUs did not overclock as well as their Hynix counterparts. There are plenty of people in this forum though that have had good luck with overclocking their Elpida equipped cards; they are not as common as good Hynix cards however.
> 
> These cards did have black screen problems when they were first released, a long time ago, some cards needed RMA and AMD fixed another issue promptly with a driver update.
> 
> If a Hawaii card is black screening then odds are that the problem is caused by the memory being overclocked too far. The memory OC is much less important on these cards as they already have tons of bandwidth. Not saying that you can't get extra performance from a memory OC, but it's nowhere near as effective as the core.


Thanks for the link. So it seems like the black screen was a problem at the driver level and was fixed in more recent ones? Has anyone had a card with no OC/flash that black-screened and then was fixed by updating the drivers? I remember reading a review of a card on Newegg that did the same thing; then with a small raise in voltage it was fine. Turns out it was unstable even at the "stock OC clock".


----------



## Pillendreher

Quote:


> Originally Posted by *Paul17041993*
> 
> Don't be worried about the core hitting 95°C, as it's designed to handle it. However, if it throttles and/or makes a lot of noise, you'll need to make sure you have plenty of case airflow. The ASUS coolers are *ok*: better than reference, but poor compared to the coolers from Gigabyte and Sapphire. If you don't mind, you can always bump the TDP level down (a forced throttle) and the card will still perform great at 1080p-1440p with little heat.
> 
> Results vary between cards, however; some can pull a lot more power (and turn it into heat) than others. Also, if the card throttles badly on moderate loads, you'll need to check the paste and make sure the heatsink sits correctly. ASUS in particular has had problems with the core screws (the 4 centre ones with springs) being too loose on some cards.


I'm currently playing at 1080p, and unless my display dies, I'm not gonna go higher anytime soon.

How do I lower the TDP level? The power limit in Afterburner?

Right now I have 3x 120mm fans in my case: one exhaust fan in the back, one intake fan in the front, and one directly in front of my 1090T to keep it cool on my ASRock board. I read that installing a side fan should help. I reckon that one should be exhaust as well, right?


----------



## Roboyto

Quote:


> Originally Posted by *222Panther222*
> 
> Thanks for the link. So it seems like the black screen was a problem at the driver level and was fixed in more recent ones? Has anyone had a card with no OC/flash that black-screened and then was fixed by updating the drivers? I remember reading a review of a card on Newegg that did the same thing; then with a small raise in voltage it was fine. Turns out it was unstable even at the "stock OC clock".


Some early cards had to be RMA'd and replaced, but there was a driver fix a long time ago to remedy this issue.

There will always be lemons for any kind of product, but from my experience you would probably be OK...always get as much info on the used item as possible.

If you can get a used 290(X), or even a 390(X), for a good price then they are capable cards. There's always a gamble with used products though...but that is the trade off for a good price.

I dunno what Fury prices are like in your neck of the woods, but currently in the US Amazon has a Sapphire 'Nitro' Fury for $310 brand new...That's a hell of a deal even if it only has 4GB of HBM. They perform enough beyond a 480 to warrant the small price difference between this card and AIB 480s.

https://www.amazon.com/Sapphire-Radeon-PCI-Express-Graphics-11247-03-40G/dp/B0196LWL3W/ref=sr_1_1?ie=UTF8&qid=1473975293&sr=8-1&keywords=sapphire+fury+nitro


----------



## mus1mus

What most people don't realize is that Elpida cards have less headroom for memory OC, but the core on them usually clocks better than on Hynix cards. I have been playing with 5 in total and only one of them fails to do 1300MHz on the core.

Black screens happen when you are overclocking past a certain threshold and pumping in a ton of voltage. That holds true even with Samsung and Hynix cards.

I have observed that even nVidia cards display these traits. I have been overclocking several GTX 780s with Hynix, Samsung, and Elpida memory. Cards with Elpida can always clock the core up to 50MHz higher than Hynix and Samsung cards.

In terms of performance, 50 or even 25 MHz on the core can always offset the 125MHz memory OC advantage that you can get from Hynix and Samsung cards.

Heck, I can throw you benchmarks where my Elpida cards smother most of the scores in this club.


----------



## 113802

Quote:


> Originally Posted by *mus1mus*
> 
> What most people don't realize is that Elpida cards have less headroom for memory OC, but the core on them usually clocks better than on Hynix cards. I have been playing with 5 in total and only one of them fails to do 1300MHz on the core.
> 
> Black screens happen when you are overclocking past a certain threshold and pumping in a ton of voltage. That holds true even with Samsung and Hynix cards.
> 
> I have observed that even nVidia cards display these traits. I have been overclocking several GTX 780s with Hynix, Samsung, and Elpida memory. Cards with Elpida can always clock the core up to 50MHz higher than Hynix and Samsung cards.
> 
> In terms of performance, 50 or even 25 MHz on the core can always offset the 125MHz memory OC advantage that you can get from Hynix and Samsung cards.
> 
> Heck, I can throw you benchmarks where my Elpida cards smother most of the scores in this club.


This was my first AMD card in 5 years, and the one I have is garbage. It only hits 1070MHz on the core. All the EVGA cards I've owned always had Samsung memory, and the memory always clocked 1GHz higher. Anyway, it's not about overclocking when the card could never even run at stock.

My current 1070 hits 2.1GHz core and 9.6GHz mem.

Edit: From my experience owning 2 780 Tis, 3 980s, and 1 980 Ti, Hynix has tighter timings and always scores higher on benchmarks than my Samsung or Elpida counterparts.


----------



## ABigGuyForYou

Quote:


> Originally Posted by *Paul17041993*
> 
> You may or may not need to run a GPU driver cleaner tool and clean-install the drivers again. I know I had to on mine, because something weird happened to the profiles (phantom profiles that could not be overwritten or removed).
> 
> Other than that, I haven't used any guides (besides BIOS modding). I just bump the clocks by 50-100 MHz, increase the voltage also by 50-100 mV until it's stable, and see how high the clocks can go before they can't be stabilised (or the card black-screens). The key thing to remember, though, is to turn off CCC/Crimson's startup entry so that it doesn't try to set the clocks on boot in case of a crash.


What cleaner do you recommend? And do you increase both settings at the same time?


----------



## Roboyto

Quote:


> Originally Posted by *mus1mus*
> 
> What most people don't realize is Elpida Cards have a lower headroom for Memory OC but the Core on them usually clocks better than Hynix Cards. I have been playing with 5 in total and only one of them fails to do 1300MHz on the Core.
> 
> Black screens happen when you are overclocking past a certain threshold and pumping in a ton of voltage. It holds true even with Samsung and Hynix cards.
> 
> I have observed that even nVidia cards display these traits. I have been overclocking several GTX 780s with Hynix, Samsung and Elpida Memory. Cards with Elpida can always clock the core up to 50MHz more than Hynix and Samsung cards.
> 
> In terms of performance, 50 or even 25 MHz on the core can always offset the 125MHz Memory OC advantage that you can get from Hynix and Samsung cards.
> 
> Heck, I can throw you benchmarks where my Elpida cards can smother most users in this club's scores.


----------



## mus1mus

Quote:


> Originally Posted by *WannaBeOCer*
> 
> This was my first AMD card in 5 years and the one I have is garbage. It only hits 1070MHz on the core. All the EVGA cards I've owned have had Samsung memory *and the memory always clocked 1GHz higher*. Anyway, it's not about overclocking when the card could never even run at stock.
> 
> My current 1070 hits 2.1GHz core and 9.6GHz memory.
> 
> Edit: from my experience owning two 780 Tis, three 980s, and one 980 Ti, Hynix has tighter timings and always scores higher in benchmarks than my Samsung or Elpida counterparts.


whut?

Apparently, nVidia owners don't have the luxury of adjusting memory timings. So yeah, you're left with what the BIOS has to offer. And I did see how Hynix scores. But hmmm. I wouldn't want a Hynix card when my Elpidas do these:


Spoiler: Prepare yourself!







290?


----------



## 113802

Quote:


> Originally Posted by *mus1mus*
> 
> whut?
> 
> Apparently, nVidia owners don't have the luxury of adjusting memory timings. So yeah, you're left with what the BIOS has to offer. And I did see how Hynix scores. But hmmm. I wouldn't want a Hynix card when my Elpidas do these:


Very nice!

(Samsung) GTX 1070 - Air - 9720MHz on memory. Loving it though! - http://www.3dmark.com/fs/9996036

(Hynix) GTX 980 Ti - Water - 7800MHz - http://www.3dmark.com/fs/6803047

(Samsung) GTX 980 - Air - 8012MHz - http://www.3dmark.com/fs/6231246

(Samsung) GTX 780 Ti KingPin Edition - Water - regret buying this card - http://www.3dmark.com/fs/5700128

(Hynix) GTX 780 - 6948MHz. This card was a beast and it was my favorite card. The BIOS I was using was a boost BIOS. Were you using a non-boost BIOS on your 780? 1450MHz is insane if it was boost.


----------



## ZealotKi11er

Have any overclocking experts here even heard of or performed a "GPU massage" to improve overclocking?


----------



## mus1mus

Quote:


> Originally Posted by *WannaBeOCer*
> 
> Very nice!
> 
> (Samsung) GTX 1070 - Air - 9720MHz on memory. Loving it though! - http://www.3dmark.com/fs/9996036
> 
> (Hynix) GTX 980 Ti - Water - 7800MHz - http://www.3dmark.com/fs/6803047
> 
> (Samsung) GTX 980 - Air - 8012MHz - http://www.3dmark.com/fs/6231246
> 
> (Samsung) GTX 780 Ti KingPin Edition - Water - regret buying this card - http://www.3dmark.com/fs/5700128
> 
> (Hynix) GTX 780 - 6948MHz. This card was a beast and it was my favorite card. The BIOS I was using was a boost BIOS. Were you using a non-boost BIOS on your 780? 1450MHz is insane if it was boost.
> 
> 
> Spoiler: Warning: Spoiler!


By Boost, you mean with GPU Boost ON?

Nope, I am not using Boost. It's imperative to disable Boost on nVidia cards when you want maximum performance.

For the 780/Ti/Titans, you need the Skynet BIOS. And learn a few tricks to OC them hard.

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Have any overclocking experts here even heard of or performed a "GPU massage" to improve overclocking?


What about it?


----------



## ZealotKi11er

Quote:


> Originally Posted by *mus1mus*
> 
> 
> By Boost, you mean with GPU Boost ON?
> 
> Nope, I am not using Boost. It's imperative to disable Boost on nVidia cards when you want maximum performance.
> 
> For the 780/Ti/Titans, you need the Skynet BIOS. And learn a few tricks to OC them hard.
> What about it?


JayzTwoCents was talking about it and how he got 1400MHz with an RX 480.


----------



## mfknjadagr8

Quote:


> Originally Posted by *WannaBeOCer*
> 
> Multiple people in the thread are running a stock R9 290 like I am and it randomly black screens. It will work fine for a few weeks and then all of a sudden it black screens. Always running the latest drivers. Latest official XFX UEFI. It's a horrible experience and it definitely relates to the Elpida RAM. Even with a waterblock/Raijintek Morpheus it black screens at stock.
> 
> https://www.techpowerup.com/gpuz/details/br6fm


Both of my 290s are XFX reference with Elpida. The only time I had a black screen was when I first got them: the paste was horribly dried up and they were running around 90°C under load... They are blocked now, but even before that, no black screenies.


----------



## Echoa

Well, just an FYI for anyone wondering: Origin PC will not refund or replace the card that caught on fire. I understand it's used and they don't really care about it, but it's just really unfortunate that they flat out won't do anything. I hadn't even owned it 2 months :/


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Have any overclocking experts here even heard of or performed a "GPU massage" to improve overclocking?


You don't really need to OC as much anymore. I used to OC my 290s to 1290 to get the same performance I get now at 1200, just for benching. It's my CPU that needs OC'ing with 2 of these, so I keep mine at stock.

http://www.3dmark.com/3dm/11398144


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> You don't really need to OC as much anymore. I used to OC my 290s to 1290 to get the same performance I get now at 1200, just for benching. It's my CPU that needs OC'ing with 2 of these, so I keep mine at stock.
> 
> http://www.3dmark.com/3dm/11398144


I am not sure if it's the drivers or Windows 10, but my 290X and 290 do not OC like they used to. I was 1200MHz stable before Crimson, but now I can't even do 1125MHz in some games without crashing. Keep in mind this is under water, so temps are not a problem.


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I am not sure if it's the drivers or Windows 10, but my 290X and 290 do not OC like they used to. I was 1200MHz stable before Crimson, but now I can't even do 1125MHz in some games without crashing. Keep in mind this is under water, so temps are not a problem.


Are you using a third-party app to OC? If you accepted Overdrive, then that could be it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> Are you using a third-party app to OC? If you accepted Overdrive, then that could be it.


I only use MSI AB and TriXX to get higher voltage. I do not get artifacts; I just get the "driver has stopped working" message.


----------



## Pillendreher

So I got the chance to spend my first couple of hours with my ASUS 290X DCII. A quick question: Why is my core clock dropping, even though I'm nowhere near 90°C for the GPU/80°C for the VRM? I just spent 45 minutes playing Dark Souls 3 (Irithyll of the Boreal Valley) and my average core clock was 914 MHz on Power Limit -15. That's not normal now, is it?

EDIT: Oh, and by the way, my average FPS was 39; somehow I couldn't get above 40 even though the 290X should be able to do that. I have everything on max at 1080p. According to a couple of 'GPU performance reviews' for Dark Souls 3, the 290X should be able to hit 60 FPS constantly. Is my FX 8320 the bottleneck?


----------



## battleaxe

Quote:


> Originally Posted by *Pillendreher*
> 
> So I got the chance to spend my first couple of hours with my ASUS 290X DCII. A quick question: Why is my core clock dropping, even though I'm nowhere near 90°C for the GPU/80°C for the VRM? I just spent 45 minutes playing Dark Souls 3 (Irithyll of the Boreal Valley) and my average core clock was 914 MHz on Power Limit -15. That's not normal now, is it?
> 
> EDIT: Oh, and by the way, my average FPS was 39; somehow I couldn't get above 40 even though the 290X should be able to do that. I have everything on max at 1080p. According to a couple of 'GPU performance reviews' for Dark Souls 3, the 290X should be able to hit 60 FPS constantly. Is my FX 8320 the bottleneck?


Turn the power limit up to +50. That's likely why it's dropping. It won't hurt anything; it just lets the card use full power if it needs to. Kinda dumb setting really, it's just there to give the appearance of saving power. But really, it's just dumb, crank 'er up.


----------



## the9quad

Quote:


> Originally Posted by *WannaBeOCer*
> 
> Multiple people in the thread are running a stock R9 290 like I am and it randomly black screens. It will work fine for a few weeks and then all of a sudden it black screens. Always running the latest drivers. Latest official XFX UEFI. It's a horrible experience and it definitely relates to the Elpida RAM. Even with a waterblock/Raijintek Morpheus it black screens at stock.
> 
> https://www.techpowerup.com/gpuz/details/br6fm


I ended up getting 5 of them on day one in order to end up with 3 that didn't black screen. I think that initial batch of reference 290Xs had a ton that were flat out bad, and even the driver "fix" didn't help them. Luckily, once I got three that worked, I haven't had a black screen since.


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I only use MSI AB and TriXX to get higher voltage. I do not get artifacts; I just get the "driver has stopped working" message.


If you accepted Overdrive, you'll notice that it follows the OC settings you set in AB or Trixx. But Overdrive, it seems, cannot set the same amount of voltage. Windows seems to follow Overdrive over the third-party apps, which makes the OC unstable (just my observation). I accidentally accepted Overdrive on Win7 when I switched to Crimson.

When I switched from Win7 to Win10, Crimson was set to default, where Overdrive needs to be accepted again. I never did accept it, and I can OC my cards as normal using Trixx. But, like I said, I only OC for benching.


----------



## Pillendreher

Quote:


> Originally Posted by *battleaxe*
> 
> Turn the power limit up to +50. That's likely why it's dropping. It won't hurt anything; it just lets the card use full power if it needs to. Kinda dumb setting really, it's just there to give the appearance of saving power. But really, it's just dumb, crank 'er up.


I lowered the Power Limit to get better temps. This post gave me hope of 'undervolting' without losing performance:

https://techreport.com/forums/viewtopic.php?t=109936


----------



## ShrewLlama

I bought a second hand R9 290 today. Unfortunately the previous owner had been using a waterblock on the card, and all of the screws that connect the card to the cooler are missing.

Does anyone happen to know the exact length and thickness of the screws? I need the sizes of both the small springy screws around the GPU and the larger screws on the PCB.

The card is a reference XFX R9 290.


----------



## 222Panther222

I didn't get any response from the 290 guy, but I found a Sapphire Nitro 470 4GB for $250 (here it's $300 + taxes). I might buy it; it was only used for 3 days, which sounds a bit shady.

Any particular problem with this card?


----------



## battleaxe

Quote:


> Originally Posted by *222Panther222*
> 
> I didn't get any response from the 290 guy, but I found a Sapphire Nitro 470 4GB for $250 (here it's $300 + taxes). I might buy it; it was only used for 3 days, which sounds a bit shady.
> 
> Any particular problem with this card?


That's too much for a used 470. No way I'd buy that.


----------



## ZealotKi11er

Quote:


> Originally Posted by *rdr09*
> 
> If you accepted Overdrive, you'll notice that it follows the OC settings you set in AB or Trixx. But Overdrive, it seems, cannot set the same amount of voltage. Windows seems to follow Overdrive over the third-party apps, which makes the OC unstable (just my observation). I accidentally accepted Overdrive on Win7 when I switched to Crimson.
> 
> When I switched from Win7 to Win10, Crimson was set to default, where Overdrive needs to be accepted again. I never did accept it, and I can OC my cards as normal using Trixx. But, like I said, I only OC for benching.


Yeah I only OC for benching too.


----------



## diggiddi

Quote:


> Originally Posted by *222Panther222*
> 
> I didn't get any response from the 290 guy, but I found a Sapphire Nitro 470 4GB for $250 (here it's $300 + taxes). I might buy it; it was only used for 3 days, which sounds a bit shady.
> 
> Any particular problem with this card?


Isn't that a step back?


----------



## 222Panther222

A bit less powerful than the 290, but I think I'll just zip-tie a fan on my 560 Ti and stick with my PS4 for current-gen games. Unless I find a crazy good deal someday.


----------



## ABigGuyForYou

I've been having trouble with my 290. If anybody could lend a hand with an overclocking problem, my thread's here. Thanks.


----------



## SPLWF

Question: I want to return my R9 290 back to the stock reference cooler. What do I use to cool the VRMs? My reference cooler still has some paste at the VRM areas from when I removed it. Should I clean it off and apply some new paste? Also, the thermal pads are still on the reference cooler; can I still use those? Thank you very much.

This is my current R9 290 (unless someone wants to buy it for cheap?)

AMD Sapphire R9 290 + NZXT Kraken G10 + Corsair H55 + GELID ICY Vision VRM Heatsinks + Zalman ZM-RHS1 VRAM Heatsinks
GPU Core Clock: [email protected] Max Core Temp [email protected] Max on VRM1
GPU Memory Clock: 1350mhz (6ghz) [email protected] Max on VRM2


----------



## BlueKnight83

Hello everyone,

I registered on this forum because I think it's probably the best one about overclocking.

I have a "strange voltage" problem with my Sapphire Tri-X R9 290 board and OCCT.

I use AIDA64 to read the voltages of the 7 DPM states of my card. The default voltage is 1.250 V.

If I run a game or the GPU-Z render test, all the monitoring programs say my VDDC is 1.189 V. OK, that's normal; I know it's actually working at 1.250 V.



To confirm a good overclock, I have always used OCCT's error-check (artifact) test. Using different monitoring programs like GPU-Z and MSI AB, the voltage read on my VDDC drops to 1.086 V. I had always thought that was normal, probably a read error by the programs due to the high workload...
While trying to increase my overclock (usually 1030 MHz @ default VDDC + 16 mV), I read on a lot of forums about people reaching 1100/1150 MHz GPU frequency with only 1.200/1.225 V on VDDC, and I started to think there was something strange with my card: what if the voltage read during OCCT was not a read error?!



Now I suppose that 1.086 V means that under OCCT my card works at around 1.150 V, and probably this is the reason why I have such low overclocking margins.

Can someone help explain that voltage drop under OCCT? Has anyone read a similar story with the R9 290? Any suggestions?

Thank you for your help!
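For what it's worth, the gap described in this post can be put in numbers. The arithmetic below uses only the readings quoted in the post (1.250 V set, 1.189 V in games, 1.086 V under OCCT); a larger drop under a heavier load is at least consistent with ordinary loadline droop under higher current, rather than a pure read error:

```python
# Droop = set VDDC minus the reported VDDC, in millivolts.
# The readings are the ones quoted in the post above.
def droop_mv(set_v, measured_v):
    return round((set_v - measured_v) * 1000)

SET_VDDC = 1.250                      # default DPM7 voltage from the post
print(droop_mv(SET_VDDC, 1.189))      # game / GPU-Z render load -> 61 mV
print(droop_mv(SET_VDDC, 1.086))      # OCCT error-check load    -> 164 mV
```

So OCCT pulling the reading down by roughly 100 mV more than a game does is exactly the pattern you'd expect if the regulator sags more under heavier current draw.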


----------



## rdr09

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yeah I only OC for benching too.


Just ran FS at a 1200/1500 OC using 16.8.2 and it ran fine. Not sure what's going on.

Quote:


> Originally Posted by *BlueKnight83*
> 
> Hello everyone,
> 
> I registered on this forum because I think it's probably the best one about overclocking.
> 
> I have a "strange voltage" problem with my Sapphire Tri-X R9 290 board and OCCT.
> 
> I use AIDA64 to read the voltages of the 7 DPM states of my card. The default voltage is 1.250 V.
> 
> If I run a game or the GPU-Z render test, all the monitoring programs say my VDDC is 1.189 V. OK, that's normal; I know it's actually working at 1.250 V.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> To confirm a good overclock, I have always used OCCT's error-check (artifact) test. Using different monitoring programs like GPU-Z and MSI AB, the voltage read on my VDDC drops to 1.086 V. I had always thought that was normal, probably a read error by the programs due to the high workload...
> While trying to increase my overclock (usually 1030 MHz @ default VDDC + 16 mV), I read on a lot of forums about people reaching 1100/1150 MHz GPU frequency with only 1.200/1.225 V on VDDC, and I started to think there was something strange with my card: what if the voltage read during OCCT was not a read error?!
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Now I suppose that 1.086 V means that under OCCT my card works at around 1.150 V, and probably this is the reason why I have such low overclocking margins.
> 
> Can someone help explain that voltage drop under OCCT? Has anyone read a similar story with the R9 290? Any suggestions?
> 
> Thank you for your help!


Use two instances of GPU-Z and use one to monitor the readings. For the voltage, I recommend setting the reading to Max to show the voltage at load, like this one . . .



I don't really OC my GPUs and only use the GPU render test to check my temps and voltages. I find the readings I get with this tool quite similar to BF4 MP.

EDIT: My idle voltage is 0.984 V. As you can see, at load it reads 1.18 V.

BTW, if you'll OC . . . make sure to set the Power Limit to max (+50). At +200 VDDC, I've seen the voltage go as high as 1.41 V benching. Not recommended for air. Also, watch the VRMs, especially VRM1. On air, they can get warmer than the core; watered, they are cooler than the core.


----------



## Echoa

Just wanted to share how my 290X died as I leave the club.







Amazon and Origin PC both refuse a refund, so I just took a peek. Sorry I couldn't get a closer/better pic; the room is dark even in daytime. The VRM popped/ruptured, damaged the PCB, and scorched some of the area around it and the heatsink.


----------



## rdr09

Quote:


> Originally Posted by *Echoa*
> 
> Just wanted to share how my 290X died as I leave the club.
> 
> 
> 
> 
> 
> 
> 
> Amazon and Origin PC both refuse a refund, so I just took a peek. Sorry I couldn't get a closer/better pic; the room is dark even in daytime. The VRM popped/ruptured, damaged the PCB, and scorched some of the area around it and the heatsink.
> 
> 
> Spoiler: Warning: Spoiler!


Looks like it did not have enough thermal pad and somehow a short occurred. Those heatsinks are conductive. Tsk tsk.


----------



## Echoa

Quote:


> Originally Posted by *rdr09*
> 
> 
> Looks like it did not have enough thermal pad and somehow a short occurred. Those heatsinks are conductive. Tsk tsk.


Possibly, though it doesn't seem to have any less than any other GPU I've owned, and the damage to the thermal pad looks like it came from under the pad. Either way, it's done and over with, and I'll be getting an RX 470 xP


----------



## Paul17041993

Quote:


> Originally Posted by *WannaBeOCer*
> 
> This was my first amd card in 5 years and the one I have is garbage. Only hits 1070mhz on core. All my EVGA cards I've owned always had Samsung memory and the memory always clock 1Ghz higher. Anyway it's not about overclocking when the card never could run at stock.
> 
> My current 1070 hits 2.1ghz core and 9.6ghz mem
> 
> Edit: from my experience owning 2 780 Tis, 3 980s 1 980 ti Hynix has tighter timings and always scores the highest on benchmarks than my Samsung or Elpida counter parts.


1GHz on what, though? Because my 290X can do +2GHz on the memory if you look at the wrong value...
Quote:


> Originally Posted by *Echoa*
> 
> Just wanted to share how my 290X died as I leave the club.
> 
> 
> 
> 
> 
> 
> 
> Amazon and Origin PC both refuse a refund, so I just took a peek. Sorry I couldn't get a closer/better pic; the room is dark even in daytime. The VRM popped/ruptured, damaged the PCB, and scorched some of the area around it and the heatsink.


Somehow I think those VRMs are only rated for <20A... But either way, they're not the official ones, that's for sure.


----------



## Echoa

Quote:


> Originally Posted by *Paul17041993*
> 
> 1GHz on what, though? Because my 290X can do +2GHz on the memory if you look at the wrong value...
> Somehow I think those VRMs are only rated for <20A... But either way, they're not the official ones, that's for sure.


No, they're supposed to be "better", and rated for 50A at 90% efficiency.

http://www.tweaktown.com/guides/6667/visiontek-radeon-r9-290-video-card-circuit-and-overclocking-guide/index3.html

That's the VisionTek R9 290/290X power circuit. I only added +75mV, and temps on the VRMs didn't go over 70°C :/

It was either defective, or somehow the sink made contact with the VRM.
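A rough sanity check on that 50A figure is simple arithmetic. The inputs below are my assumptions, not numbers from the thread: roughly 250 W of core power at 1.25 V VDDC, split evenly across a 6-phase core VRM (the linked TweakTown article describes the board's actual power circuit; check it for the real phase count):

```python
# Per-phase current if core power were shared evenly across the phases.
# 250 W, 1.25 V and 6 phases are illustrative assumptions, not thread data.
def amps_per_phase(core_watts, vddc, phases):
    return core_watts / vddc / phases

print(round(amps_per_phase(250, 1.25, 6), 1))  # -> 33.3 A, under a 50 A rating
```

Even if the average sits comfortably under the rating, uneven current sharing or a bad thermal pad (as discussed above) can push a single phase well past it.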


----------



## Paul17041993

Quote:


> Originally Posted by *Echoa*
> 
> No, they're supposed to be "better", and rated for 50A at 90% efficiency.
> 
> http://www.tweaktown.com/guides/6667/visiontek-radeon-r9-290-video-card-circuit-and-overclocking-guide/index3.html
> 
> That's the VisionTek R9 290/290X power circuit. I only added +75mV, and temps on the VRMs didn't go over 70°C :/
> 
> It was either defective, or somehow the sink made contact with the VRM.


50A, "quote unquote". I can assure you those little things wouldn't be able to sustain that for a long period of time, and possibly what happened is it started to de-solder itself. Alternatively, it may have been one of the neighbouring micro capacitors getting too hot and short-circuiting, causing that explosion of dust.

Looking at the pads again, actually, it does seem that the center pads were not making adequate contact, which would have resulted in additional heat and may not be detected by the PCB controller. Something to keep in mind next time you do more overclocking.


----------



## 113802

Quote:


> Originally Posted by *Paul17041993*
> 
> 1GHz on what, though? Because my 290X can do +2GHz on the memory if you look at the wrong value...


Not sure what you mean by looking at the wrong value. I can overclock the memory on most cards that use Samsung by 1000MHz or more. For example, my GTX 1070's stock memory is 2002MHz (8008MHz effective); I run it at 2400MHz (9600MHz). People with Micron can only overclock by 400-800MHz.

My 780 Ti also overclocked by 1GHz more, along with my 980 Ti.
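The disagreement in this exchange is usually about the GDDR5 clock scale: monitoring tools report the command clock, while the marketing number is the quad-pumped data rate. A quick sketch using the figures quoted above:

```python
# GDDR5 transfers four bits per pin per command-clock cycle, so the
# "effective" (marketing) rate is 4x the clock most tools display.
def effective_mhz(command_clock_mhz):
    return command_clock_mhz * 4

print(effective_mhz(2002))  # GTX 1070 stock  -> 8008
print(effective_mhz(2400))  # the OC above    -> 9600
```

This is also why "+2GHz if you look at the wrong value" happens: an offset quoted on the effective scale looks four times larger than the same bump on the command clock.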

Anyway, XFX agreed to ship me a 390X for my issues with the R9 290.







Wasn't expecting them to have good customer service. I am always a Sapphire guy when it comes to AMD cards. I've used more ATI cards than nVidia cards, but EVGA's customer service is the reason why I've been buying nVidia cards.


----------



## Echoa

Quote:


> Originally Posted by *Paul17041993*
> 
> 50A ~quote unquote~.


That's why I said "better" lol xD

I'm just gonna go with a brand and card that I know are good and well made.


----------



## 113802

Loving my R9 390X so far! No more black screens at stock.







The core is average and the RAM overclocks decently.

https://www.techpowerup.com/gpuz/details/hzpfn


----------



## lanofsong

Hey R9 290/290X owners,

Would you consider putting all that power to a good cause for the next 2 days? If so, come *sign up* and fold with us for our monthly Foldathons - see attached link.

September Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - needs a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong
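For anyone who prefers to script the setup instead of clicking through the installer, the same three values normally end up in FAHClient's `config.xml`. The element names below are from my recollection of the client's documented format; treat this as a sketch and verify against the documentation shipped with your installed client:

```xml
<!-- Sketch of a FAHClient config.xml for Team OCN folding; verify the
     element names against your client's documentation before using. -->
<config>
  <user value="YourOCNName"/>      <!-- your folding name -->
  <passkey value="your-passkey"/>  <!-- from the passkey page above -->
  <team value="37726"/>            <!-- Team OCN -->
  <slot id="0" type="GPU"/>        <!-- fold on the 290/290X -->
</config>
```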


----------



## chiknnwatrmln

Well, it pains me to say this but goodbye. It's been a good three year run but it's time to move on. I just bought a used 1080 from another OCN member and ordered a block for it.

As much as I wanted to stay with AMD they are just taking too damn long to release a card worth buying.

I'll be selling both my cards and blocks shortly.

Peace out


----------



## battleaxe

Quote:


> Originally Posted by *chiknnwatrmln*
> 
> Well, it pains me to say this but goodbye. It's been a good three year run but it's time to move on. I just bought a used 1080 from another OCN member and ordered a block for it.
> 
> As much as I wanted to stay with AMD they are just taking too damn long to release a card worth buying.
> 
> I'll be selling both my cards and blocks shortly.
> 
> Peace out


Sorry to see ya go.

I still own a 290, a 290X, and a pair of 390Xs, so I'm not leaving anytime soon, not until AMD or Nvidia comes out with something I can't resist. Hasn't happened for me yet.


----------



## lanofsong

Quote:


> Originally Posted by *battleaxe*
> 
> Sorry to see ya go.
> 
> I still own a 290, a 290X, and a pair of 390Xs, so I'm not leaving anytime soon, not until AMD or Nvidia comes out with something I can't resist. Hasn't happened for me yet.


Are you interested in Team Competition folding on your 390X?


----------



## battleaxe

Quote:


> Originally Posted by *lanofsong*
> 
> Are you interested in Team Competition folding on your 390X?


Ehh... not really. I've been looking into ETH mining but haven't been able to get it working right, so I just decided not to bother taxing my hardware, out of laziness. LOL

What's the cause?

I kinda figure I need a return of some sort to be hitting my hardware 24/7... so I don't fold as a result. I was folding Curecoin back when that was a thing, but that has long since been over.

They really need to make an altcoin that mates folding with a coin value, if you ask me. We need something for our efforts and the use of our hardware. That's just me though, I guess. I don't want to cook my GPUs for nothing, ya know?

Such a shame Curecoin didn't last or take off.


----------



## lanofsong

A brief explanation here as to what folding is:
http://web.stanford.edu/group/pandegroup/folding/FoldingFAQ.pdf

As for the Foldathon, it is a two-day event where we fold on whatever hardware we are willing to fold on.








Team Competition requires you to fold at least 20/7; 24/7 flat out is better. WHY??? Aside from donating folding time to science, it is also kind of fun pushing your hardware to the limits and defeating all other challengers out there; doing it as part of a team makes it even sweeter.


----------



## Offler

Hello guys.

After the release of the RX 480 I have a tough choice:

a) Keep my R9 290X and get a better cooler for it.
Since I installed it, the heat from the graphics card has been affecting the CPU VRM, causing serious trouble.

b) Replace it with an RX 480.
Almost the same performance at lower wattage and temperatures. In this case, however, I am getting into the "chip lottery". My current R9 290X is a clear win, as I was able to attain really great results while using a lapped stock cooler (ASIC above 80%).

Thanks.


----------



## mus1mus

The only real benefits of going for the RX 480 are the lower power consumption and the VRAM on the 8GB versions. Otherwise, you might even see lower performance.

I have had a chance to play with a GTX 1060, and even that can be humbled by a 290, let alone a 290X.


----------



## generaleramon

Quote:


> Originally Posted by *Offler*
> 
> Hello guys.
> 
> After the release of the RX 480 I have a tough choice:
> 
> a) Keep my R9 290X and get a better cooler for it.
> Since I installed it, the heat from the graphics card has been affecting the CPU VRM, causing serious trouble.
> 
> b) Replace it with an RX 480.
> Almost the same performance at lower wattage and temperatures. In this case, however, I am getting into the "chip lottery". My current R9 290X is a clear win, as I was able to attain really great results while using a lapped stock cooler (ASIC above 80%).
> 
> Thanks.


I'm an ex-290X owner. Keep your 290X and wait for Vega. My [email protected] is only a little better than my old [email protected]; no big difference.


----------



## Echoa

Quote:


> Originally Posted by *generaleramon*
> 
> I'm an ex-290X owner. Keep your 290X and wait for Vega. My [email protected] is only a little better than my old [email protected]; no big difference.


Agreed, keep your 290X, as the performance is pretty much the same, and just wait it out until Vega. The only reason I switched to a 470 was that my 290X died, and at 1350/1800 with aggressive memory timings I'm able to just match the 290X. The 290X is a good card; I'd run it into the ground, then replace it.


----------



## Paul17041993

Quote:


> Originally Posted by *Offler*
> 
> Hello guys.
> 
> After the release of the RX 480 I have a tough choice:
> 
> a) Keep my R9 290X and get a better cooler for it.
> Since I installed it, the heat from the graphics card has been affecting the CPU VRM, causing serious trouble.
> 
> b) Replace it with an RX 480.
> Almost the same performance at lower wattage and temperatures. In this case, however, I am getting into the "chip lottery". My current R9 290X is a clear win, as I was able to attain really great results while using a lapped stock cooler (ASIC above 80%).
> 
> Thanks.


It really depends on your budget and resolution. If you only use 1080p then the 480 is much better suited, but otherwise you could probably watercool the 290X for about the same price as a 480.


----------



## Offler

Quote:


> Originally Posted by *Paul17041993*
> 
> It really depends on your budget and resolution. If you only use 1080p then the 480 is much better suited, but otherwise you could probably watercool the 290X for about the same price as a 480.


The price of the NZXT cooler I was considering is about the same as the price of a new 480.

I play at 1080p. An upgrade to 4K was considered too, along with a new display with FreeSync support, yet my current display is superior in its class even today.

I also considered a cheaper solution with a fan mounted over the GPU in a PCI slot.

https://www.tntrade.sk/primecooler-pc-sysb-c-_d13249.html

Such a thing may help get heat out of the graphics card and prevent overheating of the CPU VRMs.


----------



## Paul17041993

Quote:


> Originally Posted by *Offler*
> 
> The price of the NZXT cooler I was considering is about the same as the price of a new 480.
> 
> I play at 1080p. An upgrade to 4K was considered too, along with a new display with FreeSync support, yet my current display is superior in its class even today.
> 
> I also considered a cheaper solution with a fan mounted over the GPU in a PCI slot.
> 
> https://www.tntrade.sk/primecooler-pc-sysb-c-_d13249.html
> 
> Such a thing may help get heat out of the graphics card and prevent overheating of the CPU VRMs.


If you want to upgrade to 4K then you might as well stick with the 290X, as the 480 simply lacks the bandwidth, even with delta compression.

That PCI fan would probably work quite well; however, the more important thing is air intake. If you have too much exhaust, the fans will just suck air back in at the wrong places. If the issue is just CPU VRM/NB temps, and you haven't already, stick one or more small fans around the motherboard heatsink, and directly behind the board blowing onto it if you can; if your case has a CPU cooler mount hole in the base, you can stick a fan there.

One note with the NZXT cooler bracket: you need to make sure you have VRM heatsinks, as keeping the VRMs bare tends not to keep them cool very well unless you have a super powerful fan.

The Corsair HG10 A1 would be a better option; it works with virtually any AIO cooler and can even be used with some air coolers...


----------



## invincible20xx

Quote:


> Originally Posted by *Paul17041993*
> 
> If you want to upgrade to 4K then you might as well stick with the 290X, as the 480 simply lacks the bandwidth, even with delta compression.
> 
> That PCI fan would probably work quite well; however, the more important thing is air intake. If you have too much exhaust, the fans will just suck air back in at the wrong places. If the issue is just CPU VRM/NB temps, and you haven't already, stick one or more small fans around the motherboard heatsink, and directly behind the board blowing onto it if you can; if your case has a CPU cooler mount hole in the base, you can stick a fan there.
> 
> One note with the NZXT cooler bracket: you need to make sure you have VRM heatsinks, as keeping the VRMs bare tends not to keep them cool very well unless you have a super powerful fan.
> 
> The Corsair HG10 A1 would be a better option; it works with virtually any AIO cooler and can even be used with some air coolers...


The 290X is not going to do 4K at any respectable frame rate, and neither is the RX 480, so it's kind of a moot point


----------



## Paul17041993

Quote:


> Originally Posted by *invincible20xx*
> 
> The 290X is not going to do 4K at any respectable frame rate, and neither is the RX 480, so it's kind of a moot point


Sorry? I game at 4k with my 290X (4GB) just fine, with or without an OC. The simple fact is that the 480 simply isn't strong enough to actually beat the 290X at 4k so there's no point side-grading to it.

Two 480's on the other hand are a different story, most games will run at least a little better at 4k than a single 290X and will particularly run well with mGPU DX12 or vulkan titles. Two 290X's are a similar story provided you can keep them cool.


----------



## the9quad

Quote:


> Originally Posted by *Paul17041993*
> 
> Sorry? I game at 4k with my 290X (4GB) just fine, with or without an OC. The simple fact is that the 480 simply isn't strong enough to actually beat the 290X at 4k so there's no point side-grading to it.
> 
> Two 480's on the other hand are a different story, most games will run at least a little better at 4k than a single 290X and will particularly run well with mGPU DX12 or vulkan titles. Two 290X's are a similar story provided you can keep them cool.


Half the time I have to go down to 1080p to get a decent experience, I can't imagine 4k being doable at all. But to each their own.


----------



## Roboyto

Quote:


> Originally Posted by *Offler*
> 
> Hello guys.
> 
> After release of RX-480 i have tough choice.
> 
> a) Keep my R9-290x and get a better cooler to it.
> Since I installed it, the heat from graphics is affecting CPU VRM causing serious trouble.
> 
> b) To replace it with RX-480
> Almost same performance at lower watts and temperatures. In this case however I am getting into "chip lottery". My current R9-290x is a clear win as I was able to attain really great results while using lapped stock cooler (ASIC above 80%)
> 
> Thanks.


As everyone else has said the 480 isn't going to do anything for you in terms of performance.

Spend $104 on these 3 items and you will have plenty of cooling for your 290X to even run a mild OC:

Gelid VRM Heatsink Kit:

http://www.newegg.com/Product/Product.aspx?Item=N82E16835426042

Kraken G10 Bracket:

http://www.newegg.com/Product/Product.aspx?Item=N82E16835146037

Corsair H55

http://www.newegg.com/Product/Product.aspx?Item=N82E16835181029

Optional RAM Heatsinks (2 sets required):

http://www.performance-pcs.com/3m-8810-thermally-conductive-adhesive-transfer-tapes-thermal-sink-8-pack.html

If you happen to have an Asetek based AIO lying around that is compatible with the Kraken G10, then you save yourself $60.



Spoiler: Asetek AIO















See performance here:

http://www.overclock.net/t/1478544/the-build-formerly-known-as-the-devil-inside-a-gaming-htpc-for-my-wife-4770k-corsair-250d-powercolor-r9-290/20_20#post_22214255


----------



## Paul17041993

Quote:


> Originally Posted by *the9quad*
> 
> Half the time I have to go down to 1080p to get a decent experience, I can't imagine 4k being doable at all. But to each their own.


Of course your mileage will vary with the particular game, but if a game can't maintain 60 FPS at 1440p without AA and AF then it's simply not optimised at all. HD textures are a different story, though; there's only so much you can fit in 4GB of VRAM...


----------



## mfknjadagr8

OK, so, quick question... I've been monitoring temps while gaming with my 290s and everything is great there, never a temp over the 40s. I'm getting DirectX errors (gaming using DSR at 3200 x 2800; so far only in CoD BO3) and random freezes. When I get the freeze it sometimes doesn't hard-lock and recovers, but the second card's clocks drop to 150 and 100 from the 300 and 150 idle they usually sit at; rebooting as if it had hard-locked fixes it. I've tried reinstalling the drivers and reinstalling Afterburner and RivaTuner (which I use to increase the power limit and monitor temps etc.) at stock clocks. Windows 7, all updates and patches. The CPU overclock is very solid, having run the full gamut of stress tests. Any suggestions?


----------



## battleaxe

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Ok so quick question...ive been monitoring temps while gaming with my 290s and everything is great there never a temp over 40s...im having direct x errors (gaming using dsr to 3200 x 2800 so far only in cod bo3) and random freezes...when i have the freeze sometimes it doesnt hard lock and recovers but the second cards clocks drop to 150 and 100 from the 300 and 150 idle they usually set at...rebooting as if it hard locked fixes it...ive tried reinstalling the drivers and reinstalling afterburner and rivatuner (which i use to increase the power limit and monitor temps etc) at stock clocks...windows 7 all updates and patches...overclock is very solid on cpu running the full gambit...any suggestions?


Did you try to do a repair on your OS?

I had this issue about a month ago. Turned out it was an unstable OC on the CPU. Weird, but when I got that fixed it went away.


----------



## Paul17041993

Quote:


> Originally Posted by *mfknjadagr8*
> 
> Ok so quick question...ive been monitoring temps while gaming with my 290s and everything is great there never a temp over 40s...im having direct x errors (gaming using dsr to 3200 x 2800 so far only in cod bo3) and random freezes...when i have the freeze sometimes it doesnt hard lock and recovers but the second cards clocks drop to 150 and 100 from the 300 and 150 idle they usually set at...rebooting as if it hard locked fixes it...ive tried reinstalling the drivers and reinstalling afterburner and rivatuner (which i use to increase the power limit and monitor temps etc) at stock clocks...windows 7 all updates and patches...overclock is very solid on cpu running the full gambit...any suggestions?


Have you run memtest and/or prime95? Do problems occur with only a single GPU in use? (unplugging the PCIe power cables can be useful here)


----------



## mfknjadagr8

Quote:


> Originally Posted by *battleaxe*
> 
> Did you try to do a repair on your OS?
> 
> I had this issue about a month ago. Turned out it was an unstable OC on the CPU. Weird, but when I got that fixed it went away.


No, I haven't done a repair on the OS. I run 40 passes of IBT AVX on Very High and 12 hours of Prime in both Blend and Small FFTs, so I'm pretty sure the overclock is stable.

Quote:


> Originally Posted by *Paul17041993*
> 
> Have you run memtest and/or prime95? Do problems occur with only a single GPU in use? (unplugging the PCIe power cables can be useful here)


I haven't run MemTest since I set everything up again, but for Prime see above. I haven't gotten around to testing with just one GPU yet, but I'm going to try that next. The last time I ran HCI MemTest it got up to 1000 percent with no errors, and the memory never left the board when I switched to the Primo...


----------



## Paul17041993

Quote:


> Originally Posted by *mfknjadagr8*
> 
> no i havent done a repair on the os...i run 40 runs of ibt avx on very high and 12 hours prime in both blend and small fts...so im pretty sure overclock is stable.
> i havent ran memtest since i set everything up again but for prime see above...i havent gotten around to playing with just one gpu yet but im going to try that next...the last time i ran hci memetest it ran up to 1000 percent no errors...the memory never left the board when i switched to the primo...


You mention an overclock, have you tried full-stock? I'm wondering if your board may not be able to handle two cards as well as a CPU overclock...

If it runs properly with two cards and the CPU at stock then you could try bumping the PCIe and NB voltages a bit in your OC, and/or under-volting the 290's.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Paul17041993*
> 
> You mention an overclock, have you tried full-stock? I'm wondering if your board may not be able to handle two cards as well as a CPU overclock...
> 
> If it runs properly with two cards and the CPU at stock then you could try bumping the PCIe and NB voltages a bit in your OC, and/or under-volting the 290's.


I'm running a Sabertooth; it's done all the time. The GPUs are run at stock settings with only the power limit increased. It ran fine before with lesser cooling and the same overclocks, at times pushing 1.6V. I'm sure the memory and CPU clocks are fine and work together beautifully; the RAM is stock with a slight overvolt for stability, and the CPU clock has been tested extremely thoroughly.


----------



## D3lBoy

I asked this in the Lightning group but got no concrete answer.

I have a 290X Lightning, an i5 6600 and an FSP Raider 650W 80+ Silver PSU... That should be enough, right? But lately I've been noticing a few "checkerboard" artifacts (core) in the new Deus Ex. It doesn't produce any artifacts in benchmarks, so there's no point RMAing it, because here (in Croatia) they just test 3DMark, Heaven, Furmark and stuff like that, and if it's stable - it's stable.

So I've noticed another thing which may be connected. My voltage under load averages around 1.12-1.13V, which is really low, because the Lightning ships with +63mV and a 1080MHz core. And I've seen that other people with Lightnings get around 1.2V at +63, so they can overclock to around 1125MHz without touching voltage... I can't even make 1100 stable with stock clocks. So I thought that may be the reason for the occasional artifact in games, because 1080MHz is probably right on the edge at 1.13V.

So obviously it's a huge Vdroop, but I want to know whether it's a GPU or a PSU problem. I don't know anyone close enough who has a good PSU to test with, but I also don't want to RMA it for no real reason. Is there anything I can do besides pumping the voltage to something like +200mV to get 1.2V? That's not normal or safe.


----------



## diggiddi

Quote:


> Originally Posted by *D3lBoy*
> 
> I asked this in Lightning group but no concrete answer.
> 
> I have 290X Lightning, i5 6600 and FSP Raider 650W 80+ Silver psu... That should be enough right. But lately i've been noticing few "checkboard" artifacts (core) in new Deus Ex. It doesn't drop any artifacts in benchmarks so no point in RMAing because here (in Croatia) they just test 3DMark, Heaven, Furmark and stuff like that and if it's stable - it's stable.
> 
> So i've noticed antoher thing which may be connected. My voltage under load averages around 1.12-1.13V which is really low, because Lightning comes with +63mV and 1080core. And i saw that other people with Lightnings have around 1.2V on +63 so they can overclock it to around 1125MHz without touching voltage... I can't even make 1100 stable with stock clocks. So i thought that may be the reason of occasional artifact in games, because 1080MHz is probably on edge of 1.13V.
> 
> So obviously it's a huge Vdroop, but i want to know if it's a GPU or PSU problem ? I don't know anyone that is close enough and have good PSU to test it, but i also don't want to RMA it for no real reason. Is there anything i can do besides pumping voltage to like +200mV to get 1.2V ? That's not normal or safe


A 650W PSU should have enough juice, but it could be going bad. If you have another PSU, jump-start it with the paper-clip test, use it to power the card, and see if the issues continue


----------



## D3lBoy

OK, I've tried my GPU in a friend's system with a Thermaltake 750W PSU... and VDDC under load is still around 1.13V, but his 12V rail drops all the way down to 11.2V. I don't even know how the PC didn't shut itself down.

So I still don't know whether his PSU and mine are both too weak for the Lightning or it's actually a GPU problem :/
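A quick way to judge that 11.2V reading: the ATX spec allows a ±5% tolerance on the +12V rail, which puts the floor at 11.40V. A minimal sketch of the check:

```python
# Sanity-check a measured +12V rail reading against the ATX tolerance (+/-5%).
NOMINAL_V = 12.0
TOLERANCE = 0.05  # ATX spec allows +/-5% on the +12V rail

def rail_ok(measured_v: float) -> bool:
    """Return True if the reading falls inside the allowed window (11.40-12.60V)."""
    return NOMINAL_V * (1 - TOLERANCE) <= measured_v <= NOMINAL_V * (1 + TOLERANCE)

print(rail_ok(11.2))  # False - 11.2V is below the 11.40V floor, i.e. out of spec
print(rail_ok(11.8))  # True - within spec
```

By that measure the friend's rail is sagging out of spec under load, which at least suggests the load is stressing the supply.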


----------



## lanofsong

Hey R9 290/290X owners,

We are having our monthly Foldathon from Monday 17th - 19th 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

October Foldathon

To get started:

1. Get a passkey (allows for a speed bonus) - needs a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## mirzet1976

Quote:


> Originally Posted by *lanofsong*
> 
> Hey R9 290/290X owners,
> 
> We are having our monthly Foldathon from Monday 17th - 19th 12noon EST.
> Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.
> 
> October Foldathon
> 
> To get started:
> 
> 1.Get a passkey (allows for speed bonus) - need a valid email address
> http://fah-web.stanford.edu/cgi-bin/getpasskey.py
> 
> 2.Download the folding program:
> http://folding.stanford.edu/
> 
> Enter your folding name (mine is the same as my OCN name)
> Enter your passkey
> Enter Team OCN number - 37726
> 
> later
> lanofsong


I'm in.


----------



## Widde

Long time since I've been in here but I gotta say I love the new look of the site ^^

Been getting a problem with my 290 as of late: at cold boot I get horizontal tearing/flickering from the Windows login screen and into Windows. It goes away as soon as I give it +25mV on the core (haven't tried less; it's what I set it at from the start).

I haven't tried leaving it be to see if the tearing goes away by itself. Is there any way to make, for example, a +25mV change to the BIOS itself so it always gets that voltage?


----------



## D3lBoy

Now I've noticed one interesting thing... The card is locked at 1.21V when I'm rendering video with the GPU, and as I said the Lightning should be at 1.2V under load. But in games it drops to 1.12V like I said, which sometimes makes it unstable.

So I'm wondering, is there any way that games throttle GPU voltage? Or some software, or the driver?

Like I said in my previous post, I've ruled out a PSU problem. Also, while I was rendering video I tried running the OCCT CPU stress test at the same time, to see if the GPU voltage would drop from 1.2V, but no, still locked... It has something to do with 3D applications, and I can't figure out what.


----------



## c0V3Ro

Seems that Win10 update unleashed a few points on 3Dmark








Crimson 16.8.2: 8100
http://www.3dmark.com/fs/10441322
Crimson 16.9.2: 8176
http://www.3dmark.com/fs/10442031
win10 update: 8292
http://www.3dmark.com/fs/10446392


----------



## Coree

The Asus 290X DC II is a POS. 94C at stock clocks with 67% fan speed. Case airflow is good: 1 intake and 1 exhaust fan. I replaced the thermal paste and am still getting 94C at 67% fan speed. Got fed up, changed to the AC III, and temps are hovering in the upper 70s while staying silent. Why is Asus failing at GPU cooler designs on AMD cards?


----------



## mfknjadagr8

On the topic I posted here about, my problem with hard locks and the occasional blue screen: I'm almost 100 percent sure it's the bottom video card. I can't figure out why, but running with just the top card there are no issues. My temps never break 50C on any component, and my drivers are always a manual clean install using DDU, with my outlined method of installing with one card and then connecting the other, blah blah.
The weird thing is, after the blue screen (which is a stop error, an infinite driver loop on the GPU obviously), on restart it will say there is no AMD driver installed or that it is malfunctioning, and the second card shows no clocks in any monitoring program and 0.97V core voltage, even though everything for the driver loads. At first I kept doing safe mode, DDU and a reinstall. The last time, I decided to try something different and unplugged the second card after shutting down; loading up Windows again, the driver is working and all is good.

Now, you guys may or may not remember I had an issue early this year where I would get blue screens and it would corrupt the driver and blue screen every time Windows tried to load it, so I ran a long time with one card, but after changing cases I decided to give it another shot. The weirdest part is that up until the crashes I get performance with almost 100 percent scaling on the GPUs in most titles.

My question is this: can a BIOS problem cause issues with a card communicating with the driver properly? And what else could cause a random issue like this? I've ensured my block has no rear leads touching the backplate, I've checked the block for leaks (and tore it down six months ago), I've reinstalled everything within Windows (all programs related to the GPUs in any way), and I've monitored voltages and temps, with temps never exceeding 50C and voltages staying fairly constant around 1.21V under load on both cards.

Is there a way to flash a BIOS on the second card while the first is still installed? I did notice my two cards have separate BIOS versions in GPU-Z...


----------



## spyshagg

I have one. It's pretty ****. Even on water cooling the VRMs could fry an egg.

Still, all tuned with BIOS mods and a 15% clock OC, it manages a 14400 graphics score

http://www.3dmark.com/fs/10263465


----------



## SultanOfWalmart

Quote:


> Originally Posted by *Widde*
> 
> Long time since I've been in here but I gotta say I love the new look of the site ^^
> 
> Been getting a problem with my 290 as of late, At cold boot I'm getting horizontal tearing/Flickering from the windows login screen and into windows. It goes away as soon as I give it +25mV on the core (Havent tried less, it was what I put it at from the start)
> 
> Have not tried leaving it be to see if the tearing goes away by itself, is there anything you can do to make for example a +25mV change to the bios itself so it always gets that voltage?


Is that at stock clocks or overclocked? Stock BIOS? Is it at idle? If it's all stock, it's strange that it would suddenly require more voltage to boot/idle.

As to your original question: yes, you can easily add +25mV in the BIOS. Export and open your BIOS in Hawaii Bios Reader and adjust DPM 0 from your current voltage to +25mV above it (this adjusts your idle voltage input). You have to do it in both the PowerPlay and Limit Tables tabs.

Here is a pic of a bios from my 290x, it's not the original bios but still illustrates the point. Some of your voltages will obviously differ.



Alternatively, you can also add a +25mV offset to the ROM itself, but that requires a tiny bit of hex editing. You can find more info HERE in the 290/290x/390/390x Bios Editing thread.
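Purely as an illustration of the bookkeeping (Hawaii Bios Reader edits binary tables, so this dict is only a stand-in, and the 993mV starting value is just the reference card's stock DPM 0): the same offset has to land on DPM 0 in every table that lists it.

```python
# Illustrative sketch: apply one voltage offset to DPM 0 in both tables,
# mirroring the "edit both the PowerPlay and Limit Tables tabs" step above.
OFFSET_MV = 25

bios_tables = {  # hypothetical table contents, in millivolts
    "PowerPlay":   {"DPM0": 993, "DPM1": 1050},
    "LimitTables": {"DPM0": 993, "DPM1": 1050},
}

for table in bios_tables.values():
    table["DPM0"] += OFFSET_MV  # bump only the idle state (DPM 0)

print(bios_tables["PowerPlay"]["DPM0"])    # 1018
print(bios_tables["LimitTables"]["DPM0"])  # 1018
```

Missing one of the two tabs is exactly the mistake the post above warns about.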


----------



## Widde

Stock BIOS, reference card at stock clocks. It's gotten worse as of late; the card has always had some random black-screen issues at stock voltage since new, but they were rare and I was too lazy to ship it back.

Will give it a bash and see if it goes away









I'll just add +25 to the stock 993? It would come to 1018 then


----------



## CALiteral

Quote:


> Stock bios, reference card at stock clocks. It's gotten worse as of late, the card always had some random black screen issues at stock voltage since new, they were rare and I was to lazy to ship it back.


I posted this earlier as a solution to the black-screen problem I had with my Asus 290X: change the memory frequency for DPM 1 and 2 to 150MHz, as it is on the stock 390X BIOSes. This ensures that when the memory clock is increased to 1250MHz, the core (and, more importantly, the memory controller) voltage is increased to the DPM 3 level. This completely stopped the black screens for me.

As a side effect of this mod, I was also able to set a much higher memory clock, since its speed is usually limited by the memory controller and not the VRAM ICs themselves.


Spoiler: Warning: Spoiler!







Edit: I just noticed you mentioned earlier about artifacts in Windows so this may not be the solution you're looking for. Still, it could be worth a shot if nothing else for possibly better memory clocks.


----------



## ZealotKi11er

Back to single 290X while waiting for Vega.


----------



## Roboyto

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Back to single 290X while waiting for Vega.


http://wccftech.com/amd-vega-10-vega-11-magnum/

http://wccftech.com/amd-polaris-revisions-performance-per-watt/


----------



## serave

So my Seasonic X-750 just hit the ****, and I came up with a kind of dumb question that has probably been asked many times over and over:

will a 430W PSU even be enough to drive a single R9 290 that's moderately underclocked (-72mV / -20% power limit @ stock clocks)?

The PSU in question is my spare unit, the XFX TS 430W, literally a rebadged Seasonic ECO unit. I'm also planning to use the 2x Molex to 8-pin connector since my spare PSU doesn't have one, lol.

Buying a new unit is out of the question, since I got moved to another workplace that's far away in the countryside (a bit of an exaggeration, but you get my point).

The rest of the system is an i5 6500, 1 SSD, a 1TB HDD and 3 case fans.

For a starting point, here's my GPU-Z screenshot while playing The Witcher 3


----------



## diggiddi

You'll be maxing out that PSU, but it should work


----------



## Roboyto

Quote:


> Originally Posted by *serave*
> 
> Soo my Seasonic X-750 just hit the **** and i came up with a kinda dumb question and probably have been asked many times over and over
> 
> will a 430W PSU even enough to drive a single R9 290 that's moderately underclocked (-72mV/ -20% power limit @ stock clocks)
> 
> the PSU in question is my spare unit which is the XFX TS 430W, literally a rebadged Seasonic ECO Unit, also planning to use the 2 Molex to 8-pin connector since my spare psu doesnt have one lol.
> 
> Buying a new unit is out the question since i got moved to another workplace that's faraway in a country side ( a bit exaggeration, but you get my point)
> 
> rest of the system are i5 6500, 1SSD, 1TB HDD and 3 case fans
> 
> for a starting point, here's my GPU-Z screenshot while playing The Witcher 3


I ran an overclocked 4790K and a 290 off a Rosewill Capstone 450W unit for some time without issue. Granted it isn't an ideal scenario, but you should be able to do it. I would definitely suggest a larger PSU as soon as possible, however.
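As a rough sanity check on the numbers (rule-of-thumb figures, not measurements - the GPU and platform estimates below are assumptions):

```python
# Back-of-the-envelope power budget for the 430W question above.
loads_w = {
    "R9 290, undervolted / -20% power limit": 230,  # assumed; a stock 290 is usually quoted around 250W
    "i5 6500": 65,                                  # Intel's rated TDP
    "board + RAM + SSD + HDD + 3 fans": 50,         # generous catch-all estimate
}

total = sum(loads_w.values())
headroom = 430 - total
print(f"estimated draw: {total}W, headroom on a 430W unit: {headroom}W")
# estimated draw: 345W, headroom on a 430W unit: 85W
```

Tight, but workable as long as the unit can actually deliver its rating on the 12V rail, which matches the "maxing it out, but it should work" advice.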


----------



## serave

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Back to single 290X while waiting for Vega.


Same here; it's kind of hard to find an "upgrade" from 290/390/X performance levels.

The 470/480/1060 simply isn't worth it in my book; also, the 1060 is like $300-400 where I live.
Quote:


> Originally Posted by *diggiddi*
> 
> You'll be maxxing that PSU but it should work


Quote:


> Originally Posted by *Roboyto*
> 
> I ran an overclocked 4790k and 290 off of a Rosewill Capstone 450W unit for some time without issue. Granted it isn't an ideal scenario, you should be able to do it. Would definitely suggest a larger PSU as soon as it is possible however.


Thank you for the answers. Now I can relax a bit; I already guessed it would work, but hearing more opinions won't hurt
Quote:


> Originally Posted by *Roboyto*
> 
> I ran an overclocked 4790k and 290 off of a Rosewill Capstone 450W unit for some time without issue. Granted it isn't an ideal scenario, you should be able to do it. .


For how long did you run that setup with that PSU, though?


----------



## Roboyto

Quote:


> Originally Posted by *serave*
> 
> For how long did you run that setup with the PSU tho


That little Rosewill powered numerous GPUs before either of the two 290s I had in my HTPC: all the way up from an HD 6790, a 270X, a GTX 670, an R9 290, a 970/980 (briefly), and another 290. A few years in total... I retired the PSU planning on CrossFire 290s, and then that just never happened. The little Rosewill now sits collecting dust most of the time until I pull it out to use as a tester PSU.


----------



## Gil80

Hi everyone.

In urgent help here, please.

Windows 10, 64bit.
2x R9 290 water cooled.
I upgraded to latest AMD driver from October. I used to have the previous one from September.
I noticed some issues where things would get stuck in games.

I used DDU in safe mode to remove AMD drivers and installed the latest october driver.
I didn't reboot immediately after install.

After a reboot I got a BSOD with the error message: *thread stuck in device driver*.
I pushed the reset button and it booted normally.

I used GPU-Z to run a quick benchmark where I can see that both GPUs operate at full capacity.

The 1st GPU shows: PCI-E 3.0 x16 @ x8 3.0
The 2nd GPU shows: PCI-E 3.0 x16 @ x2 1.1 !!!!!!!

I somehow managed to disconnect both GPUs and reconnect them.
Now the BIOS shows x8 on both GPUs; however, something is still not right with the 2nd GPU and I really need your help.

I was able to make GPU-Z use the two R9 290s. But now it shows PCI-E 3.0 x16 @ x8 1.1 instead of 3.0 on both of them.

Below is a GPU-Z screenshot that shows there's a BIOS issue with the 2nd GPU:



Can someone please tell me if there's a way to correct this?

is the reporting accurate?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gil80*
> 
> Hi everyone.
> 
> In urgent help here, please.
> 
> Windows 10, 64bit.
> 2x R9 290 water cooled.
> I upgraded to latest AMD driver from October. I used to have the previous one from September.
> I noticed some issues where things would get stuck in games.
> 
> I used DDU in safe mode to remove AMD drivers and installed the latest october driver.
> I didn't reboot immediately after install.
> 
> After a reboot I got a BSOD with error message: *thread stuck in device driver*
> I pushed the rest button and it booted normally.
> 
> I use GPU-Z to run a quick benchmark where I can see that both GPU operate at full capacity.
> 
> The 1st GPU shows: PCI-E 3.0 X 16 @ X8 3.0
> The 2nd GPU shows: PCI-E 3.0 X 16 @ X2 1.1 !!!!!!!
> 
> I somehow managed to disconnect both GPU's and reconnect them.
> Now the BIOS shows x8 on both GPU's however something is not right with the 2nd GPU and I really need your help.
> 
> I was able to make GPU-Z use the two R9 290's. But now it shows PCI-E 3.0 X 16 @ X8 1.1 instead of 3.0 on both of them.
> 
> Below is a GPU-Z screenshot that shows there's a BIOS issue with the 2nd GPU:
> 
> 
> 
> Can someone please tell me if there's a way to correct this?
> 
> is the reporting accurate?


Try to turn off ULPS.


----------



## Gil80

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Try to turn off ULPS.


I did. It looks better now, but GPU-Z still shows x8 1.1 for both GPUs while under load. The load animation looks smooth and both GPUs' clocks ramp up to max under load. Why, then, does GPU-Z report x8 1.1 instead of x8 3.0?

See here:


----------



## ZealotKi11er

Quote:


> Originally Posted by *Gil80*
> 
> I did. It looks better now, but GPU-Z still shows x8 1.1 for both GPUs while under load. The load animation looks smooth and both GPUs clocks are ramping up to max under load. Why then GPU-z reports as x8 1.1 instead of x8 3.0?
> 
> See here:


Should not be the case. Benchmark the card and see if the performance is there. 1.1 is way too slow.


----------



## Gil80

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Should not be the case. Benchmark the card and see if the performance is there. 1.1 is way too slow.


I don't have a baseline for comparison when benchmarking, so I'd appreciate your advice on this. If it matters at all, I played BF4 on my 3-monitor setup at 6000x1200, with graphic details ranging from medium to ultra, at 60fps for over an hour.
It seems that the GPU clocks are ramping up to their max under load, 1040/1250. However, GPU-Z either isn't reporting it correctly or there's an underlying issue I'm unaware of.
I've used DDU and reinstalled the drivers over and over again (the latest AMD drivers from October).


----------



## mus1mus

It should at best display PCIE 3.0 x16 @ 8X 3.0 due to your CPU's 16 PCIE Lanes.
Try to unplug the power from the second card and see if GPU-Z displays PCIE 3.0 at x16.

You might need to reinstall the driver with one card, reboot, power up the 2nd card and install again.


----------



## Gil80

Quote:


> Originally Posted by *mus1mus*
> 
> It should at best display PCIE 3.0 x16 @ 8X 3.0 due to your CPU's 16 PCIE Lanes.
> Try to unplug the power from the second card and see if GPU-Z displays PCIE 3.0 at x16.
> 
> You might need to reinstall the driver with one card, reboot, power up the 2nd card and install again.


Ok, I'll do that and report back later today.
It did show PCIE 3.0 x16 @ x8 3.0 before I had the issue.
The motherboard supports only x8 with dual GPUs and x16 with a single GPU, so I'm expecting x8 3.0, but not 1.1.
But the load animation runs smoothly, contradicting the 1.1 value. If I'm right, 1.1 is the idle state... and they are definitely not idling.
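For scale, here is roughly what those link readouts mean in bandwidth terms (approximate effective per-lane rates after encoding overhead: 8b/10b for Gen 1/2, 128b/130b for Gen 3):

```python
# Approximate one-direction PCIe link bandwidth for GPU-Z style readouts.
PER_LANE_MBPS = {"1.1": 250, "2.0": 500, "3.0": 985}  # effective MB/s per lane

def link_bandwidth(lanes: int, gen: str) -> int:
    """Bandwidth in MB/s for a link of `lanes` lanes at generation `gen`."""
    return lanes * PER_LANE_MBPS[gen]

print(link_bandwidth(8, "1.1"))  # 2000 - what "x8 1.1" implies
print(link_bandwidth(8, "3.0"))  # 7880 - the expected "x8 3.0" link
```

So a link genuinely stuck at x8 1.1 under load would have roughly a quarter of the expected bus bandwidth, which is why it's worth confirming with a benchmark rather than trusting the readout alone.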


----------



## mus1mus

Also try the motherboard BIOS if there's anything that allows you to set PCIE speeds.


----------



## Gil80

Quote:


> Originally Posted by *mus1mus*
> 
> Also try the motherboard BIOS if there's anything that allows you to set PCIE speeds.


I can only set it to Gen1, Gen2, Gen3 or Auto.
Currently, both PCI-E lanes are set to Auto and as such, are detected in BIOS as 'x8'.


----------



## Paul17041993

Quote:


> Originally Posted by *Gil80*
> 
> I can only set it to Gen1, Gen2, Gen3 or Auto.
> Currently, both PCI-E lanes are set to Auto and as such, are detected in BIOS as 'x8'.


Force G3 and see what happens.

There may not be a discernible difference between 1.1 and 3.0, however, unless you had 3x 1080p running over 120 FPS, especially in a benchmark that doesn't send many draw calls.


----------



## Gil80

Quote:


> Originally Posted by *Paul17041993*
> 
> Force G3 and see what happens.
> 
> There may not be a discernible difference between 1.1 and 3.0 however unless you had 3*1080p running over 120 FPS, especially if it's in a benchmark that doesn't send many draw calls.


You gave me an idea. I'll enable Eyefinity mode and run the GPU-Z benchmark


----------



## Gil80

Quote:


> Originally Posted by *mus1mus*
> 
> Also try the motherboard BIOS if there's anything that allows you to set PCIE speeds.


Quote:


> Originally Posted by *Paul17041993*
> 
> Force G3 and see what happens.
> 
> There may not be a discernible difference between 1.1 and 3.0 however unless you had 3*1080p running over 120 FPS, especially if it's in a benchmark that doesn't send many draw calls.


I have set both GPUs to Gen3 instead of Auto. Windows won't boot; it gets stuck on the spinning wheel.

I reverted to Auto, used DDU to remove the drivers, and set it back to Gen3 - still can't boot.

If I set the primary to Gen3 and the 2nd GPU to Auto, it will boot. GPU-Z will then report that the main GPU is x8 3.0 under load, but the 2nd GPU is not working. I wasn't able to enable CrossFire in this case.

I must mention that I have a sound card installed in one of the PCI-E slots.
I use Asus P8Z77-V-Deluxe and i7 3770K.

Am I missing something here?


----------



## Paul17041993

Quote:


> Originally Posted by *Gil80*
> 
> I have set both GPU's to GEN3 instead of Auto. Windows won't boot. It gets stuck on the spinning wheel.
> 
> I reverted to Auto, used DDU to remove the drivers and set back to GEN3, still can't boot.
> 
> If I set the primary to GEN3 and the 2nd GPU to Auto, it will boot. GPU-Z will then report that the main GPU is x8 3.0 under load, but 2nd GPU is not working. I wasn't able to enable crossfire in this case.
> 
> I must mention that I have a sound card installed on one of the PCI-E slots.
> I use Asus P8Z77-V-Deluxe and i7 3770K.
> 
> Am I missing something here?


Weeeeel, it's an Intel board, and those tend to be full of shenanigans like this...
Have you tried setting both to Gen 2? Have you also tried swapping the cards around? You should also test without extra PCIe cards, such as your sound card, as it seems like it may be restricting the second slot somehow...


----------



## Gil80

Quote:


> Originally Posted by *Paul17041993*
> 
> weeeeel it's an intel board, which tend to be full of shenanigans like this...
> Have you tried setting both to gen 2? have you also tried swapping the cards around? and you should also test without extra PCIe cards such as your soundcard as it seems like it may be restricting the second slot somehow...


It's an Asus board with Intel chipset









So I solved the problems, except one.

I decided to remove all peripherals, including my other 2 monitors, SSD and HDD, webcam and whatnot. Barebones system: SSD, mouse, keyboard and one monitor.
The system booted normally with both GPUs set to GEN 3 in BIOS, and GPU-Z finally reports x8 3.0 on both GPUs under load.

I then started to connect one device at a time to see what was causing the boot issues when both GPUs are on GEN 3. The answer is "Nothing".
It seems that the BSOD I had right after updating the AMD drivers had messed up something in the system. Removing the devices seemed to solve it.

So what are the remaining issues? They're not related to this thread anymore, but I'll mention them anyway and we can call it a day








1. Still can't make Windows "forget" about the iGPU. Before I had to use the onboard HDMI port to troubleshoot my BSOD, Windows never discovered that I had an iGPU. Now it shows in Device Manager, and GPU-Z complains about it because it's missing its drivers. I need to make Windows stop detecting it.
2. My desktop icons are loading slowly.

Thanks for trying to help!


----------



## owcraftsman

Quote:


> Originally Posted by *Gil80*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Paul17041993*
> 
> weeeeel it's an intel board, which tend to be full of shenanigans like this...
> Have you tried setting both to gen 2? have you also tried swapping the cards around? and you should also test without extra PCIe cards such as your soundcard as it seems like it may be restricting the second slot somehow...
> 
> 
> 
> It's an Asus board with Intel chipset
> 
> 
> 
> 
> 
> 
> 
> 
> 
> So I solved the problems, except one.
> 
> I decided to remove all peripherals, including my other 2 monitors, SSD and HDD, webcam and what not. Barebone system SSD, mouse, keyboard and one monitor.
> The system booted normally with both GPU's set to GEN 3 in BIOS and GPU-Z finally reports x8 3.0 on both GPU's under load.
> 
> I then started to connect one device at a time to see what causing the boot issues when both GPU's are on GEN 3. The answer is "Nothing".
> It seems that the BSOD I had right after updating the AMD drivers had messed up something in the system. Removing the devices seemed to solve it.
> 
> So what are the remaining issues? That's not related to this thread anymore, but I'll mention them anyway and we can call it a day
> 
> 
> 
> 
> 
> 
> 
> 
> 1. Still can't make windows "forget" about the iGPU. Before I had to use the onboard HDMI port to troubleshoot my BSOD, windows never discovered that I had iGPU. Now, it shows in Device Manager and GPU-Z complains about it because it's missing its drivers. I need to make windows to stop discovering it.
> 2. My desktop icons are loading slowly.
> 
> Thanks for trying to help!

Unless I'm missing something, the iGPU needs to be enabled or disabled in the BIOS. If the system sees it and is requesting drivers, it is enabled. When disabled, the iGPU won't be visible. At least that's how it works for me.


----------



## Gil80

Quote:


> Originally Posted by *owcraftsman*
> 
> Unless I'm missing something iGPU need to be enabled or disabled in the bios. If the system sees it and is requesting drivers it is enabled. When disable the iGPU won't be visible. At least that's how it works for me.


The Asus P8Z77 BIOS doesn't have an option to disable the iGPU. You can only set which GPU is primary, choosing from iGPU or PCI-E.
I'm using the advanced BIOS view.

I have a theory about this, though.
The one time I've had to use the onboard HDMI since I got this motherboard was 3 days ago. That's when it suddenly got picked up in Device Manager.
My theory is that the BIOS has a flag that is set to '1' once a cable is detected on the HDMI port, and it has been set to '1' since then.
I think that if I use the clear-CMOS button, this flag will return to '0' and the iGPU won't show up anymore.
The real issue with having it discoverable by Windows is that once it exists, it conflicts with my 2nd GPU. This means that even though Crossfire is enabled, the 2nd GPU doesn't work, even if ULPS is totally disabled.

I had to go into gpedit.msc and set a rule to prevent Windows from installing this hardware. So the iGPU is now displayed as an unknown device, 'Compatible VGA...' or something like that.
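For reference, the gpedit.msc rule being described ("Prevent installation of devices that match any of these device IDs") maps to the Device Installation Restrictions policy keys in the registry, so it can also be set with a .reg import. A sketch is below; the hardware ID is a placeholder - copy the iGPU's real hardware ID from Device Manager (Properties → Details → Hardware Ids) before using anything like this.

```reg
Windows Registry Editor Version 5.00

; Device Installation Restrictions policy (same as the gpedit.msc rule).
[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\DeviceInstall\Restrictions]
"DenyDeviceIDs"=dword:00000001
"DenyDeviceIDsRetroactive"=dword:00000001

[HKEY_LOCAL_MACHINE\SOFTWARE\Policies\Microsoft\Windows\DeviceInstall\Restrictions\DenyDeviceIDs]
; Placeholder hardware ID -- replace with the real one from Device Manager
"1"="PCI\\VEN_8086&DEV_XXXX"
```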


----------



## Paul17041993

Quote:


> Originally Posted by *owcraftsman*
> 
> Unless I'm missing something iGPU need to be enabled or disabled in the bios. If the system sees it and is requesting drivers it is enabled. When disable the iGPU won't be visible. At least that's how it works for me.


In many cases now the iGPU cannot be disabled, as it's required for compute purposes (and video decoding if the particular codec isn't present on the dGPU). This is at least the case for modern AMD APUs, due to how hard-wired the memory controller and cache are between the CPU and GPU cores, as well as providing OpenCL 2.x etc., but I can only assume it's the same for most Intel parts now.

The solution really is to either disable the adapter in device manager or just ignore its presence, with or without the iGPU drivers installed.


----------



## Gil80

Quote:


> Originally Posted by *Paul17041993*
> 
> In many cases now the iGPU cannot be disabled as it's required for compute purposes (and video decoding if the particular codec isn't present on the dGPU). This is at least the case for modern AMD APUs due to how hard-wired the memory controller and cache is between the CPU and GPU cores, as well as providing OpenCL 2.x etc, but I can only assume it's the same case for most intel parts now.
> 
> The solution really is to either disable the adapter in device manager or just ignore its presence, with or without the iGPU drivers installed.


If only Windows 10 could work well with Crossfire and the iGPU, that would be great.
My previous post describes what I'm going through now.


----------



## owcraftsman

@Gil80 I still say there is a switch in the BIOS; there has been since the Z68 chipset. Maybe you need a BIOS update. At the time, Intel was pushing software that allowed us to utilize both discrete and integrated GPUs together, and the software allowed for three different configurations, all of which depended on the switch in the BIOS. I'd poke around in the legacy BIOS and see if you can find it.


----------



## ZealotKi11er

Has anyone ever experienced the screen going black and then coming back like nothing happened? It only happens with +100mV, but only when GPU load is not high. For example, in Dota 2 the core clock is ~105X MHz and it does this. In BF1 the GPU is at full load, the core is 1200MHz, and it works fine.


----------



## Paul17041993

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Has anyone ever experience screen going black and then coming like nothing has happened? It only happens with +100mV but only when GPU load is not high. For example in Dota 2 core clock is ~ 105XMHz and it does this. When in BF1 GPU is full load and Core is 1200MHz it works fine.


You using displayport? Such behaviour can occur with a bad DP cable or connection, but otherwise you may have to adjust VRM timings and whatnot, or disable power saving completely.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Paul17041993*
> 
> You using displayport? Such behaviour can occur with a bad DP cable or connection, but otherwise you may have to adjust VRM timings and whatnot, or disable power saving completely.


Yes, I use DP. Yeah, I think it has to do with power saving. I tried a different BIOS but it did not help. How do I disable power saving?


----------



## battleaxe

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yes I use DP. Yeah I think it has to do with power saving. I tried a different BIOS but did not help. How do I disable power savings?


I had trouble with this on one DP cable. Bought new cables and the problem went away. Just FYI


----------



## Gil80

Quote:


> Originally Posted by *owcraftsman*
> 
> @Gil80
> I still say there is a switch in the bios there has been since Z68 chipset maybe you need a bios update. At the time Intel was pushing software that allowed us to utilize both discrete and integrated GPUs together and the software allowed for three different configurations. All which depended on the switch in bios. I'd poke around in the legacy bios and see if you can find it.


Mate, I wish I was wrong. There's just no way of switching the iGPU off. See the screenshot from my BIOS:


I have the latest BIOS firmware 2104.


----------



## jagdtigger

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Has anyone ever experience screen going black and then coming like nothing has happened? It only happens with +100mV but only when GPU load is not high. For example in Dota 2 core clock is ~ 105XMHz and it does this. When in BF1 GPU is full load and Core is 1200MHz it works fine.


DP doesn't like the high core voltage







See: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/43200#post_25458350


----------



## fyzzz

Same issue here, running 390+290 crossfire; anything over +18mV (max voltage spike around 1.25V) causes the on-and-off black screen thing. Running a 4K screen and a DisplayPort cable.


----------



## ZealotKi11er

Quote:


> Originally Posted by *fyzzz*
> 
> Same issue here, running 390+290 crossfire and anything over 18mv (max voltage spike around 1.25v) causes the on and off blackscreen thing. Running 4k screen and displayport cable.


It happens in Dota 2, where the voltage is ~1.28V, but in BF1 it's 1.21-1.23V. The higher the GPU load, the lower the voltage. Good thing it's stable in BF1.


----------



## pshootr

Why can't CF just work!









I mean I want to get a second R9-290, but after reading a bit I feel its a bad move.


----------



## battleaxe

Quote:


> Originally Posted by *pshootr*
> 
> Why can't CF just work!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I mean I want to get a second R9-290, but after reading a bit I feel its a bad move.


I've been using xfire for over a year. No issues for me at all. Works flawlessly with the games I play


----------



## pshootr

Quote:


> Originally Posted by *battleaxe*
> 
> I've been using xfire for over a year. No issues for me at all. Works flawlessly with the games I play


Well that's awesome. Really nice to hear of your success, and it gives me hope to finally enjoy a CF setup. I have never done it yet, and this is my third SLI/CF board!


----------



## PsYcHo29388

Hey guys, what clock speeds on the core and memory are you able to hit without having to up the voltage? I've been using 1075/1375 for awhile now but curious if I could/should be pushing it higher, thanks.


----------



## battleaxe

Quote:


> Originally Posted by *pshootr*
> 
> Well that's awesome. Really nice to hear of your success, and It gives me hope to finally enjoy a CF setup. I have never done it yet, and this is my third sli/cf Board!


I had 290Xs in Xfire before the 390Xs I have now. No issues with those either, so I think you're fine. Some games can really test your OC though, so keep that in mind. But if the hardware is up to it, it's honestly a seamless affair and shouldn't give you any problems. I don't notice tearing at all, or microstutter. It just works, and works well.

I think some of the issues some folks have is that they don't understand how to set things up. Monitoring software, for example, should have about 5000 ms of polling delay so everything has adequate timing to monitor without causing stuttering. That seems to be one of the issues that gives some people problems. So keep that in mind. Setup is crucial; it's not necessarily plug and play.

If you get issues, google how to work them out before giving up on it. That's what I've done, and I would bet the others here that have zero issues with Xfire have done the same. You just have to educate yourself when adding several powerful pieces of hardware. The weakest link will give you issues, and you have to isolate and control those issues if there are any.


----------



## Paul17041993

Quote:


> Originally Posted by *pshootr*
> 
> Why can't CF just work!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I mean I want to get a second R9-290, but after reading a bit I feel its a bad move.


Generally because it's a driver hack that many games are simply not compatible with, or they have too many buffer stages and too much communication (which reduces scaling). But from all the experiences I've heard over the years, it works at least 75% of the time.


----------



## Buckster

I've a 290 Tri-X - still going strong - but have a question about "black screens" please.

My memory will overclock to about 1440 with no issues at all (core goes to about 1100) - Hynix memory.

Above 1440 I start getting issues - it will bench and game with no artefacts etc. all the way up to about 1550.

BUT at anything over say about 1460, whilst it will work fine gaming all day and will go for days going into sleep and back out with no issues, randomly Windows will boot to a completely black screen - I've tried different BIOSes to no avail.

When this happens, you get the BIOS screen fine, you get the Windows loading screen, but as soon as the desktop opens it goes to a black screen.

Once this happens it's VERY VERY difficult to get back to working Windows; it requires a roll-back, multiple restarts etc.

It's almost as if Windows, the driver or something has cached something corrupted.

From memory I think I even did a fresh install of Windows and it still black screened - so maybe something is stored inside the GPU itself? But then why/how do the BIOS and Windows loading screens work fine?

It only ever happens on boot or coming out of sleep, never during gaming.

Any ideas please? At 1440 I never get a black screen on Windows boot.

Thanks


----------



## Streetdragon

You OC with Afterburner? When Windows starts, Afterburner applies the OC to your video card. If you have too little voltage for that memory clock, you get a black screen. Just lower the memory clock or bump the voltage.


----------



## Aaron_Henderson

Back in the club... I used to have an Asus DirectCU II OC 290X, had to sell it, and now got a Gigabyte Windforce for $150.


----------



## Enzarch

Quote:


> Originally Posted by *fyzzz*
> 
> Same issue here, running 390+290 crossfire and anything over 18mv (max voltage spike around 1.25v) causes the on and off blackscreen thing. Running 4k screen and displayport cable.


Try switching which card is your primary; this behaviour is unique to each card, but it happens at the end of the line and doesn't affect the rendering otherwise. Or, if you are comfortable with soldering, the 0.95V rail mod can be done - this fixes it entirely.


----------



## fyzzz

Quote:


> Originally Posted by *Enzarch*
> 
> Try switching which card is your primary, this behaviour is unique to each card but happens at the end of the line and doesnt affect the rendering otherwise. Or the .95V rail mod can be done, this fixes it entirely, if you are comfortable with soldering.


Yes, I know what causes this issue, but I think I'm going to upgrade the gpu soon anyways.


----------



## ratchet4234

Sapphire Tri-X R9 290.
My setup with an i7-6700K only scores 2600 in Valley with the Extreme preset.
Is that really low for a mildly overclocked R9 290?


----------



## diggiddi

Quote:


> Originally Posted by *ratchet4234*
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Sapphire Tri-X R9 290
> My Setup with a i7 6700k only scores 2600 on valley with extreme preset.
> is that really low for a mildly overclocked r9 290?


Looks decent to me; CPU clock also affects results. I was able to get 2829 on my 290X, not too sure what clocks it was at though
([email protected])


----------



## ratchet4234

I've got an i7-6700K overclocked to 4.644 GHz, with an R15 score of 1014.


----------



## LazarusIV

My XFX R9 290 DD has been a peach up until a few days ago... The fan closest to the PCI bracket is making a rhythmic rattling sound. Has anyone had this issue or know of a fix? I appreciate the help, I don't want to have to replace it yet


----------



## Paul17041993

Quote:


> Originally Posted by *LazarusIV*
> 
> My XFX R9 290 DD has been a peach up until a few days ago... The fan closest to the PCI bracket is making a rhythmic rattling sound. Has anyone had this issue or know of a fix? I appreciate the help, I don't want to have to replace it yet


Made sure nothing's making contact with the fan blades? Otherwise it sounds like the bearing's gone, in which case the only real option is to replace the fan somehow. A very simple solution is to take the shroud off the card and strap custom fans to the heatsink; however, you'll need to cut and re-wire the fan header(s) to keep fan control, assuming you actually need that.


----------



## Paopawdecarabao

I have 290Xs in crossfire. Is it still good for 5760x1080 gaming in modern games? I haven't played in a while and want to play BF1, Titanfall 2 and Overwatch on triple monitors at 5760x1080. I was playing BF4 and Titanfall before. I am planning to overclock my build, as my waterblocks are sitting gathering dust. Current CPU is an i7-3770K.


----------



## sinnedone

Quote:


> Originally Posted by *LazarusIV*
> 
> My XFX R9 290 DD has been a peach up until a few days ago... The fan closest to the PCI bracket is making a rhythmic rattling sound. Has anyone had this issue or know of a fix? I appreciate the help, I don't want to have to replace it yet


Contact XFX they might send you a fan if you ask nicely.

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> I have a 290x crossfire. Is it still good for 5760x1080p gaming for modern games? I haven't played in awhile and wanted to play BF 1,Titanfall 2 and overwatch in triple monitor 5760x1080p. I was playing BF4 and Titanfall before. I am planning to overclock my build as I have my waterblocks sitting in dust. Current CPU is i7 3770k


Crossfire is broken in BF1


----------



## LazarusIV

Quote:


> Originally Posted by *Paul17041993*
> 
> Made sure nothings making contact with the fan blades? Otherwise it sounds like the bearing's gone, of which the only real option is to replace the fan somehow. A very simple solution is to take the shroud off the card and strap custom fans to the heatsink, however you'll need to cut and re-wire the fan header/s to keep fan control, assuming you actually need that.


Quote:


> Originally Posted by *sinnedone*
> 
> Contact XFX they might send you a fan if you ask nicely.
> Crossfire is broken in BF1


Thanks guys! I'll give 'em a call when I can.


----------



## Paul17041993

Quote:


> Originally Posted by *sinnedone*
> 
> Crossfire is broken in BF1


Probably because crossfire and SLI are legacy...

BF1 should have its own mGPU system at some point if not already.


----------



## PsYcHo29388

Quote:


> Originally Posted by *LazarusIV*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Paul17041993*
> 
> Made sure nothings making contact with the fan blades? Otherwise it sounds like the bearing's gone, of which the only real option is to replace the fan somehow. A very simple solution is to take the shroud off the card and strap custom fans to the heatsink, however you'll need to cut and re-wire the fan header/s to keep fan control, assuming you actually need that.
> 
> 
> 
> Quote:
> 
> 
> 
> Originally Posted by *sinnedone*
> 
> Contact XFX they might send you a fan if you ask nicely.
> Crossfire is broken in BF1
> 
> 
> Thanks guys! I'll give 'em a call when I can.

I contacted a rep by the name of Mark on Reddit, asked him where I could get a backplate for my R9 290, and he sent me an entire 390 cooler, backplate and all, free of charge. Since your fan is actually going bad, I'm sure they would have no issue doing the same for you.


----------



## sinnedone

Quote:


> Originally Posted by *Paul17041993*
> 
> Probably because crossfire and SLI are legacy...
> 
> BF1 should have its own mGPU system at some point if not already.


If you follow that reasoning so is DX11.


----------



## serave

Anyone tried to fit the Prolimatech MK-13 on their 290/X?

How does it fit? Interested in buying one.


----------



## lanofsong

Hey AMD R9 290/290X owners,

We are having our monthly Foldathon from 12 noon EST Monday the 21st to the 23rd.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see the attached link.


November Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - needs a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## akromatic

Hey guys, I'm having issues (noise and heat) with my Gigabyte Windforce 290. These cards spew a ton of heat - I had to resort to not having a side panel - and at the same time the factory fans are very loud. Apparently the default idle fan speed is 47%, and that's too loud for my ears. I've been tweaking the fan, and 30% is about as loud as I can tolerate, but I'm dropping a lot in 3DMark scores and frames.

The stock factory overclock was 1040MHz, which I've now clocked down to 900MHz; that gives very mixed results, varying from 6k to 8k in 3DMark Fire Strike 1.1.

The question is: how hot can this card run before I get into trouble, and what is a safe temp for it? I'm obviously not after the coolest-running card, and I'm not one of those who think anything over 70C is too hot. What I'm trying to do is find a balance where the card performs comfortably at 30% fan on a 35-40C ambient day without sacrificing too much performance.

Ideally I'd like to keep its performance around the RX 470 mark if possible.
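Half the battle here is the fan curve rather than the clocks. As a sketch of the kind of custom curve tools like MSI Afterburner let you define - quiet at idle, ramping hard only as the card approaches Hawaii's ~95C throttle point - here's a piecewise-linear example (the temperature/duty points are illustrative assumptions, not tuned values for this specific card):

```python
# Hypothetical custom fan curve: hold the tolerable 30% noise floor at idle,
# ramp through the 80-85C range where throttling was observed, and max out
# just before the 95C throttle limit. All points are assumptions.
FAN_CURVE = [  # (temperature C, fan duty %)
    (40, 30),
    (75, 30),   # stay quiet up to 75C
    (85, 45),   # ramp where throttling kicks in
    (94, 100),  # full speed just under the throttle limit
]

def fan_duty(temp_c: float) -> float:
    """Linearly interpolate fan duty % for a given GPU temperature."""
    if temp_c <= FAN_CURVE[0][0]:
        return FAN_CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(FAN_CURVE, FAN_CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return FAN_CURVE[-1][1]  # above the last point: stay at max
```

The idea is that a fixed 30% or 45% slider is the wrong tool: a curve lets the card stay at the quiet setting most of the time and only get loud when it's actually about to throttle.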


----------



## sinnedone

These cards aren't supposed to throttle until they hit 95C.

If you put your PC's side panel back on, it would seem quieter. Add case fans to keep the hot air moving out of the case, and depending on how long you've had the card, try changing the thermal paste.


----------



## akromatic

Quote:


> Originally Posted by *sinnedone*
> 
> These cards aren't supposed to throttle until the hit 95c.
> 
> If you could put your pc's side panel on, it would seem quieter. Add case fans to keep the hot air moving out of the case and depending how long you've had the card, try changing thermal paste.


Well, I'm not hitting 95C, but I'm already throttling hard.

I'm averaging 80-85C on both fan profiles; if I have my fans down at 30% I hit 6k points in 3DMark, while if I set the fan to 45% I get 9k.

The side panel only makes it run hotter, as the case was not designed for such a card. I have a 120mm slim fan facing it, as that's as much clearance as I get given my case's depth. The fan is literally <1mm away from the card, though it's only spinning at 700rpm because I can't stand the noise.

The card is a factory refurb <1 year old, and it hadn't really been placed in a PC until like 3 months ago, so essentially it has <3 months of "on" time.


----------



## Paul17041993

Quote:


> Originally Posted by *sinnedone*
> 
> If you follow that reasoning so is DX11.


Is it not? 11 is so bloated now in comparison that it simply cannot use modern GPUs effectively, it should only be used as a fall-back for the 10-20% of systems that don't support DX12 or Vulkan. Projects that are fairly lightweight are an exception as they wouldn't really make use of multiple GPUs and advanced features anyway.

The majority of older DX11 games can be maxed out on new GPUs fairly easily and tend to also be CPU-bound, so even if crossfire/SLI works with them it doesn't always give satisfactory results.


----------



## sinnedone

Quote:


> Originally Posted by *akromatic*
> 
> well im not hitting 95 but im already throttling hard.
> 
> im averaging 80-85c on both fan profiles, if i have my fans down to 30% i hit the 6k points in 3dmark while if i set my fan to 45% i get 9k
> 
> side panel only makes it run hotter as the case was not designed to run such a card. i have a 120mm slim fan facing it as its as much clearance i get considering my case's depth. fans literally <1mm away from the card though its only spinning at 700rpm because i cant stand the noise
> 
> card is a factory refurb < 1 year old and hasnt really placed in a PC till like 3 months ago so essentially it has < 3 months of "on" time


Have you used gpuz to check vrm temperatures and actual gpu load/usage? Could it be a driver issue or Bios setting?

Quote:


> Originally Posted by *Paul17041993*
> 
> Is it not? 11 is so bloated now in comparison that it simply cannot use modern GPUs effectively, it should only be used as a fall-back for the 10-20% of systems that don't support DX12 or Vulkan. Projects that are fairly lightweight are an exception as they wouldn't really make use of multiple GPUs and advanced features anyway.
> 
> The majority of older DX11 games can be maxed out on new GPUs fairly easily and tend to also be CPU-bound, so even if crossfire/SLI works with them it doesn't always give satisfactory results.


10-20% of multiple-GPU systems, or just systems in general? The problem is most game engines are just being "backported" to support DX12, not built for it from the ground up. I don't think we will see true DX12 support for a couple of years yet. The only well-implemented games lately have been Doom and Ashes. (I think there were an indie game or two as well; can't remember, as I mostly follow major titles.)

As far as maxing out the majority of DX11 games, I think that might be true at 1080p, but there are a lot of 1440p and 4K systems out there that it surely doesn't apply to.


----------



## akromatic

Quote:


> Originally Posted by *sinnedone*
> 
> Have you used gpuz to check vrm temperatures and actual gpu load/usage? Could it be a driver issue or Bios setting?


I didn't exactly monitor VRM temps; I would have thought they would be fine considering the underclocking and stock cooling.

GPU load is 100% throughout 3DMark. The driver is the latest AMD driver on a fresh Windows install.

I'm sure the drop in frames is purely throttling; maybe I need to pay closer attention to GPU clocks. Either way, my goal is to reach acceptable performance with minimal noise, and I would be happy to be around the RX 470 mark.


----------



## i2CY

So I did some benching with 3DMark last night, and my old two CF R9 290s @ 947MHz / 1250MHz beat my new Asus Strix GTX 1070 OC'd at 2000MHz.









Go red!


----------



## mus1mus

Quote:


> Originally Posted by *i2CY*
> 
> So did some benching with 3DMark last night, and my old 2 CX R9 290s @ 947mhz 1250mhz are greater then my new GTX Strix 1070 Gaming Card with OC at 2000mhz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Go red!


Only the TitanXP can beat XFire Hawaiis.


----------



## TheLAWNOOB

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *i2CY*
> 
> So did some benching with 3DMark last night, and my old 2 CX R9 290s @ 947mhz 1250mhz are greater then my new GTX Strix 1070 Gaming Card with OC at 2000mhz
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Go red!
> 
> 
> 
> Only the TitanXP can beat XFire Hawaiis.

And then you play a game and realize sometimes it takes AMD ages to improve CF scaling. I'm still salty over the 60% scaling in NFS 2016.


----------



## mus1mus

Stay on topic.









If scaling is an issue with XFire, it is and will also be an issue with SLI.


----------



## Caveat

Quote:


> Originally Posted by *mus1mus*
> 
> Only the TitanXP can beat XFire Hawaiis.


But it doesn't beat the power used, right? I mean, the two 290s use way more power than a single 1070. Or am I wrong?


----------



## i2CY

Lol, just a wee bit:

Sapphire R9 290 4G D5 (947MHz) - power consumption <300W
Asus GeForce GTX 1070 Strix - 150W

I now have a ton of PSU headroom.

Sorry for the off topic.


----------



## mus1mus

Quote:


> Originally Posted by *Caveat*
> 
> But it doesn't beat the power used. Right? I mean like the 2 290's use wat more power than a single 1070. Or am i wrong?


You can't compare hawaii to anything nVidia currently offers in terms of power consumption.

If you are looking for a good comparison, I am tripping a 1250W Seasonic X benching two 290Xs at 1300+MHz with a 5930K at 4.8GHz.

On an nVidia rig, the same power supply can survive a trio of 980TIs on Max OC attainable on air.

A 1070 rig can't even OCP trip a 600W cheapo on my tests. It's running really cool (cold) too.

Also, heat produced by a 290X would leave you in awe until you experience a Titan X Maxwell.


----------



## Caveat

Quote:


> Originally Posted by *mus1mus*
> 
> You can't compare hawaii to anything nVidia currently offers in terms of power consumption.
> 
> If you are looking for a good comparison, I am tripping a 1250W Seasonic X benching two 290Xs at 1300+MHz with a 5930K at 4.8GHz.
> 
> On an nVidia rig, the same power supply can survive a trio of 980TIs on Max OC attainable on air.
> 
> A 1070 rig can't even OCP trip a 600W cheapo on my tests. It's running really cool (cold) too.
> 
> Also, heat produced by a 290X would leave you in awe until you experience a Titan X Maxwell.


I am running 2 Sapphire R9 290Xs and an FX-9590 at 4.7GHz. One guy said he can outperform my setup with an i5 and a single 1060 or 1070, with 700W less.


----------



## mus1mus

The question really is, "in what game?"
If they call you out on poorly threaded games, bring BF1 into the discussion.

A 1070 is less than a 980 Ti when both are OC'd to the max.

A 980 Ti is roughly 30%, at best, faster than a 290X in Fire Strike. Fire Strike scaling is superb for the GPU part.

Their math may be off.


----------



## Caveat

Quote:


> Originally Posted by *mus1mus*
> 
> The question really is, "in what game?"
> If they call you out on poorly traded games, bring BF1 to the discussion.
> 
> 1070 is less than a 980TI when both are OC'd to the max.
> 
> A 980TI is roughly 30%, at best, faster than a 290X in Fire Strike. Fire Strike scaling is superb for the GPU part.
> 
> Their math may be off


He didn't mention a game. He just said "Jesus Christ M8, 9590 with 290X CF @1080p? An unlocked i5 with a 1070 would outperform it while drawing 700W less"


----------



## mus1mus

The 700W part could be true though.
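As a rough sanity check of that claim, using the ballpark board-power figures mentioned in this thread plus assumed CPU numbers (~250W for an overclocked FX-9590, ~90W for a stock i5; actual draw varies a lot with load and OC):

```python
# Back-of-the-envelope system power comparison. All figures are ballpark
# assumptions: ~300 W per overclocked 290X (thread figure), ~250 W for an
# OC'd FX-9590, ~150 W for a GTX 1070 (thread figure), ~91 W for a stock i5.
cf_rig = 2 * 300 + 250   # two 290Xs + overclocked FX-9590
gtx_rig = 150 + 91       # single GTX 1070 + Skylake i5
delta = cf_rig - gtx_rig
print(delta)  # 609 -- same order of magnitude as the claimed 700 W
```

So the "700W less" figure is a bit generous with these numbers, but it is in the right ballpark.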


----------



## Caveat

Quote:


> Originally Posted by *mus1mus*
> 
> The 700W part could be true though.


Yes, that is possible. My CPU eats a lot. Two 290Xs also eat a lot. Someone posted the watts the cards eat; the 290X was high on the list.


----------



## TheLAWNOOB

If you don't unlock the BIOS of the 1070, you could probably draw less than 400W at max OC with a Skylake i5 at 5GHz.

Meanwhile, two R9 290s with EK full-cover blocks overwhelm a 480mm rad if I use 1200rpm fans.


----------



## mus1mus

Exaggeration!

The 1070 uses very low power. It won't and will never need 400 watts, even with hard mods!

Or maybe I misunderstood what you are trying to say.


----------



## TheLAWNOOB

Why do you think I'm exaggerating?



The test system had a 4.2GHz i7-4960X. A 5GHz Skylake i5 should draw about the same power. The quoted power consumption was total system consumption.


----------



## mus1mus

Yeah. Misunderstood.

I thought I read like, another 400W less.









---sleepy head


----------



## sinnedone

A 1060 does not perform as well as a 290X. A 1070 will definitely beat a single 290X, but if a game has good crossfire support, a single 1070 will be bested by 290Xs in crossfire.

If I remember correctly your AMD CPU would be on par with an i5.

Yes, your system uses more power, but if it's fine with you it doesn't really matter.

The guy was just trolling with nothing really said. Without talk of specific games at specific resolutions and cpu/gpu clocks it's a blanket statement that doesn't mean anything.


----------



## Caveat

Is this correct?


----------



## TheLAWNOOB

nVidia limits the power consumption of the entire card, so it would be pretty close to that.

As long as you don't remove the power limit, you can keep total power consumption well under 400W, provided you don't use an overclocked FX CPU.


----------



## akromatic

So how do I limit or disable boost on my Gigabyte 290? Or perhaps even cap its power? I figure half my issues are that the card keeps boosting itself and drawing more power, and thus making more heat and noise.


----------



## TheLAWNOOB

Quote:


> Originally Posted by *akromatic*
> 
> So how do I limit or disable boost on my Gigabyte 290? Or perhaps even cap its power? I figure half my issues are that the card keeps boosting itself and drawing more power, and thus making more heat and noise.


Undervolt it by 100mV. And put power limit at -20%.


----------



## asciii

Removed.


----------



## melodystyle2003

I ran the latest without issues (16.11.4 don't know if it has changed till now).


----------



## PontiacGTX

Has anyone used/owned a HIS R9 290X IceQ X2 or the Diamond R9 290X dual-fan?


----------



## Rabit

I recently bought an MSI Radeon R9 290X Gaming 4GB; it cost me £95, but temps reach 94°C with fans at 100% on a single pass of 3DMark.
It still has the original warranty sticker on the screws, so it's probably time for new thermal grease.







I have Arctic MX-2; will that be OK?
And what thermal pads do I need to replace the old ones, in case they break while changing the thermal grease?
http://www.3dmark.com/3dm/16379895

Edit:

I changed the thermal grease and cleaned the cooler.

Temps dropped from 94°C to 74°C (temps should drop a bit more over time; MX-2 reaches its best performance after a week).








Fan speed went from 100% to 54%.
I also undervolted by 37mV; I truly don't need an OC for 1080p.









I'm happy now; it's silent.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Rabit*
> 
> I recently bought an MSI Radeon R9 290X Gaming 4GB; it cost me £95, but temps reach 94°C with fans at 100% on a single pass of 3DMark.
> It still has the original warranty sticker on the screws, so it's probably time for new thermal grease.
> 
> 
> 
> 
> 
> 
> 
> I have Arctic MX-2; will that be OK?
> And what thermal pads do I need to replace the old ones, in case they break while changing the thermal grease?
> http://www.3dmark.com/3dm/16379895
> 
> Edit:
> 
> I changed the thermal grease and cleaned the cooler.
> 
> Temps dropped from 94°C to 74°C (temps should drop a bit more over time; MX-2 reaches its best performance after a week).
> 
> 
> 
> 
> 
> 
> 
> 
> Fan speed went from 100% to 54%.
> I also undervolted by 37mV; I truly don't need an OC for 1080p.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I'm happy now; it's silent.


Yeah, the paste was probably like dried clay... both of my reference XFX R9 290s were like that... I didn't see temps that low, but the reference cooler is garbage...


----------



## Rabit

Yes, very dry. Anyway, with Arctic MX-2 you apply it once and there's no need to reapply. As the package says: "8 year durability!"


----------



## pshootr

Hey guys. I am having an issue with lag in an MMO game, "PerfectWorld International". Actually it's OK for the most part, but I have noticed that when I run an instance several times, my FPS tanks to like 10 FPS after a couple of runs. Why is this? Also, when I check MSI Afterburner, the graph for GPU usage is like a roller coaster of spikes. It looks like my GPU is not being used to its full potential. My specs are in my sig.

Thanks









Edit: Oh, and while I play this game my CPU usage is low. It only uses like 20-40% CPU, and mostly closer to 20%. I just don't get this. It appears that the game is struggling, yet neither my CPU nor GPU wants to get off its ass.

I am using Crimson drivers 16.9.2


----------



## Roy360

Quote:


> Originally Posted by *pshootr*
> 
> Hey guys. I am having an issue with lag in an MMO game, "PerfectWorld International". Actually it's OK for the most part, but I have noticed that when I run an instance several times, my FPS tanks to like 10 FPS after a couple of runs. Why is this? Also, when I check MSI Afterburner, the graph for GPU usage is like a roller coaster of spikes. It looks like my GPU is not being used to its full potential. My specs are in my sig.
> 
> Thanks
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Edit: Oh, and while I play this game my CPU usage is low. It only uses like 20-40% CPU, and mostly closer to 20%. I just don't get this. It appears that the game is struggling, yet neither my CPU nor GPU wants to get off its ass.
> 
> I am using Crimson drivers 16.9.2


If you have an i7 with hyper threading enabled it won't show CPU usage correctly.


----------



## Removed1

Hello there. Since I found this forum quite useful, I decided to share some *good R9 290 porn*!









This nice beast has 79.7% ASIC quality and clocks really well under water. I modded a loop, fusing two AIO coolers to form a single loop with a reservoir.
The beast is under a bad Corsair HG10 coupled with an H100 AIO.
It is still on the stock BIOS; I flashed a PowerColor PCS+ R9 290 BIOS but kept the PT1 on the other switch.
I think I'm capped by the TDP/VDDC limit of the BIOS; I get 1.4V under load, 1.31V after vdroop.
So I think I need to switch to a modded BIOS to go further. Anyway, I'm lucky to achieve a nice 1280MHz on the core and 1652MHz on the RAM; those are stable gaming clocks, not bench ones.
About the PT1 BIOS: I didn't use it because it has no 2D clocks, so it's quite harsh to leave the GPU at those voltages at idle.
Under load I get roughly 61-65°C on the core and 65-75°C on VRM1.

http://www.hostingpics.net/viewer.php?id=7733833DM.jpg



In game this card is absolutely wonderful. Considering it's almost 3 years old, I get around 100 FPS average in BF1 at ultra quality in 1080p, and it runs every single game decently at max quality in 1080p.
The fun part is that I paid €100 for the card plus €70 in spare parts for the watercooling. I wish I'd been lucky enough to get one that could OC like this and also be unlocked to a 290X.


----------



## Domiro

My PC and Google skills failed me, so I turn to you, OCN.

I recently went from a full watercooling loop back to air, and I'm running into the issue of my Sapphire 290 Tri-X running far too hot: well over 80°C playing any sort of game, and it's winter over here, versus the high 60s it used to run at before watercooling.
So far I've tried different spring-loaded heatsink screws (shorter screw, longer/stiffer spring), reseated the PCB/heatsink, and rolled back drivers, but no luck.

I've considered the EK thermal pads being too thick (1mm for VRMs) but can't find info on the standard thickness. I could put on different paste (currently EK Ectotherm), such as Gelid GC Extreme or Cryorig CP7, but I'm at a loss otherwise.

Halp.


----------



## Paopawdecarabao

Is it too late to watercool my 290X CrossFire? I already have my EK waterblock and backplate. Not sure if I will upgrade to Vega right away. Currently playing 3440x1440 Overwatch and CS:GO; might play BF1 and Titanfall 2.


----------



## Blameless

Quote:


> Originally Posted by *Domiro*
> 
> I've considered the EK thermal pads being too thick (1mm for VRMs) but can't find info on the standard thickness. I could put on different paste (currently EK Ectotherm), such as Gelid GC Extreme or Cryorig CP7, but I'm at a loss otherwise.


0.5mm is the standard VRM pad thickness, and even these pads will be heavily compressed.

When in doubt, buy thinner pads. You'll lose less from having to stack them than from having to use ones that are too thick.
Quote:


> Originally Posted by *Paopawdecarabao*
> 
> Is it too late to watercool my 290X CrossFire? I already have my EK waterblock and backplate. Not sure if I will upgrade to Vega right away. Currently playing 3440x1440 Overwatch and CS:GO; might play BF1 and Titanfall 2.


If you've already got the block, you may as well put it to use.


----------



## Removed1

If the thickness of the pads or the thermal paste isn't the cause, it may be because your card got bent somehow near the GPU.
That would cause poor contact between the GPU and the heatsink.
With a stock cooler you can barely adjust how the GPU and heatsink screw together, something you can easily do with your custom waterblock.
It sometimes happens, after a long run with a cooler, that the card takes the shape of the cooler and bends to fit it, especially if the card runs warm; that shouldn't be the case here, since you were under water.

So take a picture of the contact between the GPU and the heatsink after you apply thermal paste; if you notice a poor contact/GPU footprint on the heatsink, you've found your problem.
Another possibility is that you somehow broke the heat pipes or the heatsink, so give it a try with another cooler if you can.


----------



## boot318

Quote:


> Originally Posted by *Wimpzilla*
> 
> Hello there, since i found this forum quite useful, i decide to share some *good R9 290 porn*!
> 
> 
> 
> 
> 
> 
> 
> http://www.hostingpics.net/viewer.php?id=7733833DM.jpg




My score with my 290X @ 1200MHz core and 1550MHz memory. I would've expected a 290 at those clocks to smash a 290X.


----------



## Domiro

Changed the 1mm pads for 0.5mm ones. Shaved off 5-10 degrees, but it's still not where it once was. Would adding O-rings underneath the spring-loaded screws help get better contact? I could add the rings, not tighten all the way down, and let them creep up over several days of usage.


----------



## Removed1

Quote:


> Originally Posted by *boot318*
> 
> My score with my 290X @ 1200MHz core and 1550MHz memory. I would've expected a 290 at those clocks to smash a 290X.


Your 290X is well clocked too; 1200MHz on the core is already nice. You've got more shaders than the 290, and at these clocks they make a big difference. I know it's more difficult to raise the clock on Hawaii XT.
Pretty happy anyway with this sample.








Quote:


> Originally Posted by *Domiro*
> 
> Changed the 1mm pads for 0.5mm ones. Shaved off 5-10 degrees, but it's still not where it once was. Would adding O-rings underneath the spring-loaded screws help get better contact? I could add the rings, not tighten all the way down, and let them creep up over several days of usage.


You could, I don't see any problem with that; just pay attention, because you're putting more force on the cooler and the GPU.
Give the thermal pads a couple of days to squeeze down a bit, then try again to tighten everything.


----------



## Domiro

Quote:


> Originally Posted by *Wimpzilla*
> 
> You could, but like I said before, it will begin to bend the card near the GPU and the VRM. Give the thermal pads a couple of days to squeeze down a bit, then try again to tighten everything.


I'll let it run for a week and try the O-ring method if it doesn't improve: a 1mm-thick O-ring, decompress the spring a little less than that, and let it creep over a couple of weeks.


----------



## Removed1

Quote:


> Originally Posted by *Domiro*
> 
> I'll let it run for a week and try the O-ring method if it doesn't improve: a 1mm-thick O-ring, decompress the spring a little less than that, and let it creep over a couple of weeks.


Did you check the GPU/heatsink thermal paste footprint?
Is the contact between the two that bad?


----------



## Domiro

It spreads out well, but much of it stays in the center rather than getting pushed outward. The thinner pads helped, though; with the old pads the paste spread evenly but came apart in "strings".


----------



## Removed1

Quote:


> Originally Posted by *Domiro*
> 
> It spreads out well, but much of it stays in the center rather than getting pushed outward. The thinner pads helped, though; with the old pads the paste spread evenly but came apart in "strings".


And now what are the temperatures of the GPU and the VRM?
I once had a similar problem with the stock reference cooler: until I repasted it, it was fine; after that, there was no way to get decent contact. The heatsink was bad, not completely flat, or simply wasn't coming down far enough to put decent pressure on the GPU.

It worked well before I changed the thermal grease, because the stock paste had become so hard and dry that it compensated for the bad contact.
Like you, I tried adding some metal rings to increase the pressure, without success.
When the contact is bad you can try to force it a bit, but you won't get decent results.


----------



## Domiro

GPU core went from 84°C to 75°C average. VRMs are 55°C and 61°C.


----------



## Removed1

Quote:


> Originally Posted by *Domiro*
> 
> GPU core went from 84°C to 75°C average. VRMs are 55°C and 61°C.


With the stock fan speed? If you increase the fan speed, do you get decent temps?
The VRMs are OK; it really surprises me that the pads would create such a mess.
It's true that the Tri-X heatsink is flat, so the GPU and the VRMs sit at the same level.
Did you lose the stock thermal pads for the VRMs? Do you have some other pads, like the ones used for the RAM? They're softer, not the best, but maybe enough to cool the VRMs and let the GPU get decent contact.


----------



## Pooms

Guys, please help me. I just put my Antec Kühler 620 on my R9 290 and now I don't know where to plug in the cooler's power connector; it's only a 3-pin female, and the GPU fan header is 4-pin male.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Paopawdecarabao*
> 
> Is it too late to watercool my 290X CrossFire? I already have my EK waterblock and backplate. Not sure if I will upgrade to Vega right away. Currently playing 3440x1440 Overwatch and CS:GO; might play BF1 and Titanfall 2.


It's never too late...











Half your temps


----------



## diggiddi

Quote:


> Originally Posted by *Pooms*
> 
> Guys, please help me. I just put my Antec Kühler 620 on my R9 290 and now I don't know where to plug in the cooler's power connector; it's only a 3-pin female, and the GPU fan header is 4-pin male.


Is there an extra fan header on your mobo? It doesn't have to be 3-pin; 4-pin works too. Look around the CPU/memory area.


----------



## Removed1

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> It's never too late...
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Half your temps












Half your temps and Boost your perf!


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Wimpzilla*
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Half your temps and Boost your perf!


Hasn't missed a beat in over 2 years









11500x2160p






Twinchilldeskputer .









I used to bench, bench a lot


----------



## bond32

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Hasn't missed a beat in over 2 years
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 11500x2160p
> 
> Twinchilldeskputer .
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I used to bench, bench a lot


You disgust me. Stop it.

Now give me your setup.

Much Jelly.


----------



## Rabit

Looks like my thermal pads got old; my VRM1 temp at idle is 74°C, and it was 51°C three days ago :/
Will these pads be OK?
https://www.amazon.co.uk/Arctic-Thermal-Pad-145X145%C2%A0MM-1-5%C2%A0mm/dp/B00UYTU6Z6
I'll probably buy the 1.5mm and 1mm versions.









And I'm starting to see some graphics RAM artifacts when it heats up, so I need to replace the thermal pads on the RAM as well.

* My GPU is the MSI Twin Frozr R9 290X


----------



## Matt-Matt

I'm back here around 3 years later!

Recently readjusted my OC! (Was 1179/1450MHz)

Somehow I had the aux voltage set to +19mV and it gave me core stability? I'm yet to test how stable it is long term, but it's enough to pass Valley (and get over 70FPS).

Currently at 1205/1500MHz with +100mV on the core and +19mV on the aux; adding more aux doesn't do anything, unfortunately.

This is with a full-cover block; temps don't surpass 60°C on the core or VRMs.

Anyone got any tips for getting more than +100mV?

Using a reference XFX 290 and Afterburner for overclocking. Trixx is ergh, it hates me.


----------



## mirzet1976

Try HIS iTurbo for OC.


----------



## Roboyto

Quote:


> Originally Posted by *Matt-Matt*
> 
> I'm back here around 3 years later!
> 
> Recently readjusted my OC! (Was 1179/1450MHz)
> 
> Somehow I had the aux voltage set to +19mV and it gave me core stability? I'm yet to test how stable it is long term, but it's enough to pass Valley (and get over 70FPS).
> 
> Currently at 1205/1500MHz with +100mV on the core and +19mV on the aux; adding more aux doesn't do anything, unfortunately.
> 
> This is with a full-cover block; temps don't surpass 60°C on the core or VRMs.
> 
> Anyone got any tips for getting more than +100mV?
> 
> Using a reference XFX 290 and Afterburner for overclocking. Trixx is ergh, it hates me.


I typically had better luck with the older version of Trixx, before the most recent UI change. Version 5.x.x, I believe.

Aux voltage is for the memory controller, if I'm not mistaken.


----------



## EpicOtis13

Well, it looks like my 290s are dying. If I OC them at all with Trixx I start getting immediate artifacting. Should I go for 390Xs since they're compatible with my waterblocks, or should I wait for Vega and go all out?


----------



## rancor

Quote:


> Originally Posted by *EpicOtis13*
> 
> Well, it looks like my 290s are dying. If I OC them at all with Trixx I start getting immediate artifacting. Should I go for 390Xs since they're compatible with my waterblocks, or should I wait for Vega and go all out?


I would wait for Vega; it's not much longer now.


----------



## EpicOtis13

Quote:


> Originally Posted by *rancor*
> 
> I would wait for Vega; it's not much longer now.


What's your guess on a time frame?


----------



## boot318

Quote:


> Originally Posted by *EpicOtis13*
> 
> What's your guess on a time frame?


You probably won't see it until the PC Gaming Show at E3 if AMD keeps up with history on its GPU releases (Fury in 2015 and Polaris in 2016). June?


----------



## boot318

Quote:


> Originally Posted by *Matt-Matt*
> 
> I'm back here around 3 years later!
> 
> Recently readjusted my OC! (Was 1179/1450MHz)
> 
> Somehow I had the aux voltage set to +19mV and it gave me core stability? I'm yet to test how stable it is long term, but it's enough to pass Valley (and get over 70FPS).
> 
> Currently at 1205/1500MHz with +100mV on the core and +19mV on the aux; adding more aux doesn't do anything, unfortunately.
> 
> This is with a full-cover block; temps don't surpass 60°C on the core or VRMs.
> 
> *Anyone got any tips for getting more than +100mV?*
> 
> Using a reference XFX 290 and *Afterburner* for overclocking. Trixx is ergh, it hates me.


1. Find your MSI AB install directory.
2. Make a shortcut of MSIAfterburner.exe on your desktop.
3. Right click, Properties, and paste "/wi6,30,8d,10" at the end of the Target field, after MSIAfterburner.exe. This will give you +100mV, hence the 10 (change the last two digits).
4. Run the edited shortcut and you should have an extra 100mV. (This will not launch MSI AB; you'll have to launch MSI AB from the original .exe or an unedited shortcut.)

"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" "/wi6,30,8d,10"
^it has to look like that for it to change the voltage for me.

"C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" "/wi6,30,8d,11"
This gives me +106mV in Afterburner.

Do not jack it straight up to 30; that will probably instantly fry your GPU. Go up slowly. "/wi6,30,8d,16" gives me +150mV in AB.
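For anyone who doesn't want to juggle hex by hand: the last field of the /wi switch is a hex byte, and per the formula given later in the thread each step is worth 6.25mV (so 0x10 = 16 steps = +100mV, 0x20 = +200mV, 0x30 = +300mV). A small helper like this can print the suffix for a target offset; it's only a sketch based on the numbers in these posts, not an official MSI tool, and the 6.25mV step size is taken on trust from the thread:

```python
# Sketch: convert a desired VDDC offset (in mV) to/from the hex byte used
# in MSI Afterburner's "/wi6,30,8d,XX" command-line switch, assuming each
# step is worth 6.25 mV as described in the thread.

STEP_MV = 6.25  # mV per step (value reported in the thread)

def offset_to_hex(offset_mv):
    """Return the two-digit hex suffix for a given voltage offset in mV."""
    steps = round(offset_mv / STEP_MV)
    if not 0 <= steps <= 0xFF:
        raise ValueError("offset out of range for a single hex byte")
    return format(steps, "02x")

def hex_to_offset(suffix):
    """Return the voltage offset in mV encoded by a hex suffix."""
    return int(suffix, 16) * STEP_MV

if __name__ == "__main__":
    for mv in (100, 200, 300):
        print(f"+{mv} mV -> /wi6,30,8d,{offset_to_hex(mv)}")
```

Note that by this formula 0x16 works out to +137.5mV, a little below the +150mV reading reported above, so treat the step size as approximate and verify the applied voltage in Afterburner before pushing further.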


----------



## Mingan

Kind of late, but I've been using the Asus Matrix Platinum 290x with stock cooler I traded for previously with UnrulyCactus! Please add me to the roster.


----------



## Removed1

Quote:


> Originally Posted by *boot318*
> 
> 1. Find your MSI AB install directory.
> 2. Make a shortcut of MSIAfterburner.exe on your desktop.
> 3. Right click, Properties, and paste "/wi6,30,8d,10" at the end of the Target field, after MSIAfterburner.exe. This will give you +100mV, hence the 10 (change the last two digits).
> 4. Run the edited shortcut and you should have an extra 100mV. (This will not launch MSI AB; you'll have to launch MSI AB from the original .exe or an unedited shortcut.)
> 
> "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" "/wi6,30,8d,10"
> ^it has to look like that for it to change the voltage for me.
> 
> "C:\Program Files (x86)\MSI Afterburner\MSIAfterburner.exe" "/wi6,30,8d,11"
> This gives me +106mV in Afterburner.
> 
> Do not jack it straight up to 30; that will probably instantly fry your GPU. Go up slowly. "/wi6,30,8d,16" gives me +150mV in AB.


I'm actually under water, and I use /wi6,30,8d,30; it gives me +300mV and it's fine: 1.4V during load, 1.31V after vdroop.
This trick in AB works like a charm. Thanks for the tip about adding the command directly to the .exe shortcut; currently I have a .bat file that I launch to apply the +300mV.


----------



## BadRobot

Hey R9 290 owners. I want to know the measurements for the screw holes of the R9 290. I'm thinking of dumping a CPU cooler on it with supports, something along the lines of a Scythe Mugen 2/3/4. It's more of a test/DIY project, and if I succeed in lowering temperatures significantly, I'll keep it that way.


----------



## lanofsong

Hey AMD R9 290/290X owners,

We are having our monthly Foldathon from Monday the 19th to the 21st, 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us; see the attached link.

December Foldathon

To get started:

1. Get a passkey (allows for the speed bonus; needs a valid email address)
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## Removed1

Quote:


> Originally Posted by *Rabit*
> 
> Looks like my thermal pads got old; my VRM1 temp at idle is 74°C, and it was 51°C three days ago :/
> Will these pads be OK?
> https://www.amazon.co.uk/Arctic-Thermal-Pad-145X145%C2%A0MM-1-5%C2%A0mm/dp/B00UYTU6Z6
> I'll probably buy the 1.5mm and 1mm versions.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> And I'm starting to see some graphics RAM artifacts when it heats up, so I need to replace the thermal pads on the RAM as well.
> 
> * My GPU is the MSI Twin Frozr R9 290X


I would rather get some 0.5mm and 1mm pads.
Remember, a guy wisely said it's better to stack two 0.5mm pads than to have one 1.5mm pad that's too thick.

Has anyone done some 3DMark13 benches lately with the latest drivers?
I pushed my 2600K to 4.7GHz; I got a nice CPU score but a crappy GPU score.
I lost almost 500 points on the GPU score: before 15k, now 14.5k?!









And then, playing BF1, I didn't notice any improvement; rather, I lost some frames, though I'm not sure, it might come from the last game patch.
Nevertheless, I noticed some improvement in frame times.

A bit disappointed!

Before ReLive!


Now, same GPU clock but CPU at 4.7GHz!


----------



## th3illusiveman

Is there any AMD card worth upgrading to right now? Last I checked the Fury X was top dog, but it's a pretty mediocre step up; hundreds of dollars for that little gain is not appealing. I'm running 2560x1440 at 144Hz right now and my 290X ain't got the juice. I like FreeSync though; that's why it needs to be AMD.


----------



## diggiddi

Two choices are the Fury non-X or the RX 480, depending on what games you play. I lean towards the former.


----------



## Matt-Matt

Quote:


> Originally Posted by *Wimpzilla*
> 
> I'm actually under water and i give /wi6,30,8d,30, it give me +300mv, it is fine, 1.4v during load, 1.31v after vdrop.
> This trick on AB work like a charm, thx for the tip to add directly the command under the .exe, actually i have a .bat file that i lunch for apply the +300mv.


Are you keen on sharing the commands for that? I'd like to just do a manual profile switch to higher clocks for more demanding games, when I need a bit more grunt.

Not sure about 300mv though aha.


----------



## gupsterg

Quote:


> Originally Posted by *th3illusiveman*
> 
> Is there any AMD card worth upgrading to right now? Last I checked the Fury X was top dog, but it's a pretty mediocre step up; hundreds of dollars for that little gain is not appealing. I'm running 2560x1440 at 144Hz right now and my 290X ain't got the juice. I like FreeSync though; that's why it needs to be AMD.


Fury/X will be better at 1440p than an RX 480. Fury performs very close to the Fury X: when I unlocked 3840SP on a Fury and clocked it the same as a genuine Fury X (which I also had), performance was on a par.

From viewing the rig in your sig, you're on an air-cooled Tri-X; Fury on air is quieter than Hawaii. I've had 3x 290 Tri-X, 1x Vapor-X 290X and an Asus DCUII 290X.

Fiji at stock benches like highly clocked Hawaii (~1200MHz), from what I saw when deciding to go Fiji.

Yeah, Fiji has 4GB RAM, but TBH I have not found it an issue. Recently I saw a YT video showing Lords of the Fallen on max in-game settings at 1080p with ~6GB VRAM usage; on the same settings at 1440p I get 3.9GB. My frame rates at 1440p are what an RX 480 got at 1080p. From some research I did, monitoring software basically shows VRAM allocation, not true usage; there is a post in the HWiNFO support thread on OCN highlighting this.

As you're in Canada, I think the bigger issue for an "upgrade" is pricing. If you find a Fury on promo, I reckon it will be more "bang for $" than even Vega when released. I reckon Vega is gonna be priced high-end at launch, just like the 290X and Fury X were. Plus, with Vega only talked about for a 1H 2017 release, which could mean up to another 6 months' wait, I reckon Fiji is still a valid purchase if you need better performance now.


----------



## Removed1

Quote:


> Originally Posted by *Matt-Matt*
> 
> Are you keen on sharing the commands for that? I'd be keen to just do a manual profile switch to higher clocks for more demanding games when I need the bit more grunt.
> 
> Not sure about 300mv though aha.


_CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /wi6,30,8d,28_

Copy this into a text file and save it as a .bat file.

You need to change the last number; the formula for the vGPU offset is 6.25mV × x. So if you want +200mV, it is 6.25mV × 32 = 200mV, and 32 converted to hex becomes 20.
So to apply +200mV the command is:

_CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /wi6,30,8d,20_


----------



## Matt-Matt

Quote:


> Originally Posted by *Wimpzilla*
> 
> _CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi6,30,8d,28_
> 
> Copy this into a text file and save it as a .bat file.
> 
> You need to change the last number; the formula for the vGPU offset is 6.25mV × x. So if you want +200mV, it is 6.25mV × 32 = 200mV, and 32 converted to hex becomes 20.
> So to apply +200mV the command is:
> 
> _CD C:\Program Files (x86)\MSI Afterburner
> MSIAfterburner.exe /wi6,30,8d,20_


Cheers, another batch file to add to the collection!


----------



## BadRobot

Quote:


> Originally Posted by *th3illusiveman*
> 
> Is there any AMD card worth upgrading to right now? Last I checked the Fury X was top dog, but it's a pretty mediocre step up; hundreds of dollars for that little gain is not appealing. I'm running 2560x1440 at 144Hz right now and my 290X ain't got the juice. I like FreeSync though; that's why it needs to be AMD.


Sure, the Fury X and RX 480 might be okay upgrades, but even for my R9 290 I feel they're more of a sidegrade, especially with how expensive they are. Unless you can find a cheap second-hand Fury X or something, I'd say save that money and see what Vega brings in 2017.


----------



## Removed1

Quote:


> Originally Posted by *Matt-Matt*
> 
> Cheers, another batch file to add to the collection!


----------



## Matt-Matt

Quote:


> Originally Posted by *Wimpzilla*


I've set it to run on boot... but it's delayed, and the PC artifacts for a bit before it applies... darn.

I wonder if I can set it to start with the OS rather than with my profile... hmm


----------



## Removed1

Quote:


> Originally Posted by *Matt-Matt*
> 
> I've set it to run on boot... but it's delayed, and the PC artifacts for a bit before it applies... darn.
> 
> I wonder if I can set it to start with the OS rather than with my profile... hmm


I think it needs MSI AB anyway to get the I2C command executed and sent to the card, so I don't know if you could launch it as a service that starts with Windows.
Other than including it as a service at boot, I really don't see how you could get the VDDC increase.
Maybe if you launch MSI AB on boot and add the command to the .exe.
I tried adding the command at the end of the target line of my MSI AB desktop shortcut: it applies the VDDC, but it does not launch MSI AB, lol!










----------



## gupsterg

Why not use either, or a combo of both:-

i) a ROM with a VDDC offset.
ii) changing the VID per DPM in the ROM.

You guys are using high VDDC offsets, which means the idle/lower DPMs are also getting a big rise in voltage. By modifying the ROM you could keep the lower DPMs at stock voltage and raise the voltage only at the higher DPMs where you're OC'ing.

For example, let's say DPM 0 (idle) is 968mV and you apply a VDDC offset of +300mV in MSI AB: it becomes 1268mV, which is crazy high for 300MHz. Say instead you mod the ROM to have a VDDC offset of +300mV and you wish to keep DPM 0 at 968mV: you just change DPM 0 to 668mV, so that including the VDDC offset it becomes 968mV.
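That arithmetic can be sketched as a tiny helper: given a ROM-wide VDDC offset and the stock VID of each DPM state, subtract the offset from every state that should stay at stock voltage. This is only an illustration of the idea from the post above; the DPM numbers and the non-idle VIDs are made-up example values (only the 968mV idle figure comes from the post):

```python
# Sketch of the ROM-offset idea: a global VDDC offset is flashed into the
# ROM, then the VIDs of DPM states that should stay at stock are lowered
# by the same amount so their effective voltage is unchanged.

ROM_OFFSET_MV = 300  # global offset baked into the ROM (example value)

# Stock VIDs per DPM state in mV. DPM 0 = idle at 968 mV is the example
# from the post; the other states are hypothetical.
stock_vids = {0: 968, 5: 1150, 7: 1250}

# DPM states that should actually receive the extra voltage for the OC.
boosted = {7}

def adjusted_vid(dpm, stock_mv):
    """VID to program so non-boosted states keep their stock voltage."""
    if dpm in boosted:
        return stock_mv                  # effective = stock + offset
    return stock_mv - ROM_OFFSET_MV      # effective = stock

for dpm, mv in sorted(stock_vids.items()):
    vid = adjusted_vid(dpm, mv)
    print(f"DPM {dpm}: program {vid} mV -> effective {vid + ROM_OFFSET_MV} mV")
```

With these numbers, DPM 0 gets programmed to 668mV so it lands back at 968mV after the +300mV offset, while DPM 7 ends up at an effective 1550mV, matching the worked example in the post.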


----------



## Bratislav

My old reference 290X, PT1 vBIOS, EK block:


----------



## mus1mus

Merry Christmas everyone!

Have fun and may you all have a happy one!


----------



## zerokool_3211

I just started having an issue where my memory clock stays at 300MHz no matter what. The funny thing is it never did this before: it was always locked at 1250, the stock clock, since with 4 monitors it would never PowerPlay the memory down. Anyone have any ideas?

PS: sorry my tags are not updated, but I do have a Sapphire 290X


----------



## zerokool_3211

Another update: I reinstalled MSI Afterburner and now it's sitting at 1250 again, but now my core clock is stuck at 300.


----------



## Removed1

Setting the frequency to dynamic and the voltage to manual for both GPU and memory in AMD WattMan seems to help the clock management somehow.
I've had this problem too since the last drivers: the memory is generally stuck at full speed even at idle.


----------



## Enzarch

Quote:


> Originally Posted by *gupsterg*
> 
> Why not use either, or a combo of both:-
> 
> i) a ROM with a VDDC offset.
> ii) changing the VID per DPM in the ROM.
> 
> You guys are using high VDDC offsets, which means the idle/lower DPMs are also getting a big rise in voltage. By modifying the ROM you could keep the lower DPMs at stock voltage and raise the voltage only at the higher DPMs where you're OC'ing.
> 
> For example, let's say DPM 0 (idle) is 968mV and you apply a VDDC offset of +300mV in MSI AB: it becomes 1268mV, which is crazy high for 300MHz. Say instead you mod the ROM to have a VDDC offset of +300mV and you wish to keep DPM 0 at 968mV: you just change DPM 0 to 668mV, so that including the VDDC offset it becomes 968mV.


This method is what I settled on for my cards, as it made for a much simpler testing/tweaking process. Chasing instability at lower loads was a headache. After finalizing, I set the offset within the ROM so I never have to touch software again.


----------



## Removed1

Quote:


> Originally Posted by *gupsterg*
> 
> Why not use either, or a combo of both:-
> 
> i) a ROM with a VDDC offset.
> ii) changing the VID per DPM in the ROM.
> 
> You guys are using high VDDC offsets, which means the idle/lower DPMs are also getting a big rise in voltage. By modifying the ROM you could keep the lower DPMs at stock voltage and raise the voltage only at the higher DPMs where you're OC'ing.
> 
> For example, let's say DPM 0 (idle) is 968mV and you apply a VDDC offset of +300mV in MSI AB: it becomes 1268mV, which is crazy high for 300MHz. Say instead you mod the ROM to have a VDDC offset of +300mV and you wish to keep DPM 0 at 968mV: you just change DPM 0 to 668mV, so that including the VDDC offset it becomes 968mV.


Quote:


> Originally Posted by *Enzarch*
> 
> This method is what I settled on for my cards, as it made for a much simpler testing/ tweaking process. Chasing instability at lower loads was a headache. After finalizing I set the offset within the ROM so i never have to touch software again.


Well, the idle voltage is not a problem; even if it is 1.266v when I apply the OC, the transistors in the die can be shut down when they are not in use, if I remember correctly.

I didn't want to add an offset to the BIOS because I don't apply the max OC all the time. For instance, in games like LoL that don't require much horsepower, the card stays at stock with +50% power.
The reason is simple: as I said before, the GPU gets a VDDC that fits the load. In BF1 my 100% stable full-load VDDC is 1.305/1.320v; in LoL with the 1280mhz OC my VDDC is 1.36/1.38v.
I think this is how vdroop works: depending on the GPU load, it feeds the GPU a higher VDDC when the GPU is not at full load and doesn't need it.
This is quite annoying, because you think you are stable at a given voltage, but if the load goes up from 95% to a real 99% you aren't stable anymore and need to add a bit more voltage again.
On the other hand, a PT3 BIOS without vdroop could kill the card if the voltage spikes suddenly. This is the behavior I get on my 290.
That's why I would keep the BIOS as close to stock as possible and let the card rest when I'm just using the PC, which is about 50% of the time.

edit: Gotcha paul17041993
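The load-dependent voltage described above is the VRM loadline (vdroop) at work: the regulated voltage sags linearly with output current, so a game that only partially loads the GPU sees a higher VDDC than a true 100% load. A toy model (all numbers are illustrative, not measured):

```python
# Toy loadline (vdroop) model: the regulated voltage sags linearly
# with load current, so a light load sees a higher VDDC than a true
# 100% load (all numbers here are illustrative).
def output_voltage_mv(v_set_mv, i_load_a, llc_slope_mohm=1.075):
    # mOhm * A = mV of droop
    return v_set_mv - i_load_a * llc_slope_mohm

v_set = 1400.0  # mV, VID + software offset
print(output_voltage_mv(v_set, 60))   # light load: ~1335.5 mV
print(output_voltage_mv(v_set, 120))  # heavy load: ~1271.0 mV
```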


----------



## Paul17041993

Quote:


> Originally Posted by *Enzarch*
> 
> Duplicate please remove


Quote:


> Originally Posted by *Wimpzilla*
> 
> cant delete double post?!


You can flag/report your own dupe posts so the moderators can see they need deleting.


----------



## andrews2547

Quote:


> Originally Posted by *Paul17041993*
> 
> You can flag/report your own dupe posts so the moderators can see they need deleting.


This. If it does happen again, just report it.


----------



## Removed1

Quote:


> Originally Posted by *andrews2547*
> 
> This. If it does happen again, just report it.


It seems that when you click Submit, the forum posts the message but leaves you in the post form with the warning "_You've already submitted this form_"; if you click Submit again, you end up with two posts.








Maybe it's a problem refreshing/redirecting the page back to the discussion thread; instead, the page gets stuck in the post form.


----------



## Tobiman

How can I get an older version of Crimson? The new ones cause conflicts with MSI AB and lock my clocks at max 24/7.


----------



## Removed1

Quote:


> Originally Posted by *Tobiman*
> 
> How can I get an older version of crimson? The new ones causes conflicts with MSI AB and is locking my clocks at max 24/7.


Go to the AMD driver page and select the latest driver for your card; on that driver's page there is a link to older drivers on the right: _Download Previous Drivers & Software_.

For Win10 64-bit, for instance (it matches my specs), here is the link: http://support.amd.com/en-us/download/desktop/previous?os=Windows%2010%20-%2064

Before installing, do a cleanup with DDU.


----------



## asciii

Removed.


----------



## Matt-Matt

Quote:


> Originally Posted by *Wimpzilla*
> 
> I think it need anyway MSI AB to get the I2C command executed/send to the card, so i dunno if u could lunch it like a service that boot with window.
> Other than including it as a service at the boot i really do no see how u could get the VCCD increase.
> Maybe if u lunch MSI AB on boot, and u add the command on the .exe.
> I tried on my MSI AB desktop shortcut, to add the command at the end of the target line, it apply the VCCD but it does not lunch MSI AB lol!
> 
> 
> 
> 
> 
> 
> 
> 
> So


I ended up going with Trixx; the latest version with an ASUS reference BIOS on my card seems to play ball.

Currently running 1250/1450 at +183mv, and it seems stable.


----------



## LandonAaron

Yeah, these new drivers are garbage. We can't overclock at all now? What's up with that?


----------



## Paul17041993

Quote:


> Originally Posted by *LandonAaron*
> 
> Yeah these new drivers are garbage. We can't overclock at all now? Whats up with that?


Considering how old Hawaii is now and how short on resources AMD is, they don't exactly have the option to keep supporting it.

What specific options were you after, though?


----------



## Removed1

_(Edited: Use this tool carefully )_

Here is some feedback on using *VRMTool* with a Hawaii R9 290 card.
As far as I have tested, everything works except _the voltage VID_ command; the tool reads and writes the IC settings correctly via the I2C bus.

It lets you modify almost everything you need to overclock, so alongside the new Wattman tool in the AMD drivers you should no longer need any 3rd-party software like MSI AB.

As far as I tested:
-_*The Voltage VID*_ is read correctly, *but* the tool *did not allow* me to change the VID, it seems.
I tried changing the value to something like 1000mv or 1200mv, but it did not work.
The same goes for writing the hex value into the register directly yourself; I tried 60 and 61, nothing.
If I press Override without a value, or with a bad one: blank screen and reboot.

-_*The Voltage Offset*_ works and lets you apply an offset to the GPU. Just write 50, 100, 300, etc. into the box and it will add the +50mv, +100mv, etc. offset to the VID, giving you _the actual voltage_ reported by the tool.

-_*The Actual Voltage*_ is read correctly as the sum of VID + offset.

-_*The Output Current Scaling*_ works; the default value is 0x60=*1x*, and you can change it to 0x20=*0.5x* or anything in between.
It reduces the current sensed by the card; in a few words, it removes the TDP limit, especially if you have already added 50% TDP in Wattman. If you change it to 0x20, the card will draw the maximum current it can.
*BEWARE WITH THIS OPTION*: there is no current protection; between 0x60 and 0x20 the VDDC jumps from 1.180v to 1.2v at stock under load, which is a lot.
I can't imagine what the 0.5x setting would do to an already overclocked card running a bench.








The current flowing through the VRM is likewise outrageous and not healthy for the components.
If I set the value to 0x50 I crash in Firestrike, so it is either stock or 0x40/0x30 for a higher TDP; beyond that, at 0x20, the card draws too much power.

-_*Phase 1/2/3 Gain*_ is the only option I didn't test. Sometimes I read the default value and sometimes the values change, meaning there is some phase modulation within the VRM. I will look at it in more detail later.

-_*LLC*_ works. *BEWARE WITH THIS OPTION*: removing the LLC makes the card run with a fixed VDDC; I get 1.25v by default at stock when I remove the LLC, up from the usual 1.18v under load.

-_*LLC Slope*_ works; it defines how strong the LLC is. By default I have something like 1.075mOhm: set a bigger resistance and the VDDC goes down at full load, a smaller resistance and the VDDC goes up at full load, compared to the reference stock VDDC.
To modify it, just write the value into the box; if the default is 1.075mOhm, write 1.05 or 1.09 to change the LLC strength.

-_*PWM Freq*_ works, I think. I tried changing the VRM switching frequency; it seems to have an impact on OC/stability, but I didn't manage to grasp how much. Just write the desired frequency into the box: 300, 400, 500, 600kHz.



Spoiler: Warning: Spoiler!



Loop 1 voltage VID: 0.99375V (reg 0x93=0x59 *RO* | 1.55V-X*0.00625V)
Loop 1 voltage offset: +0.00000V (reg 0x8D=0x00 _RW_ | X*0.00625V, 0x00=default)
Loop 1 output voltage: 0.97656V (reg 0x9A=0x7d *RO* | X*(1/128)V)
Loop 1 output current scaling: ? (reg 0x4D=0x60 _RW_ | 0x60=default, 0x20=lower reported current)
Loop 1 phase 1+2 load distrib.: ? (reg 0x1E=0x00 _RW_ | 0xdd=default=more load on PCIe, 0x00=equal)
Loop 1 phase 3+4 load distrib.: ? (reg 0x1F=0x00 _RW_ | 0xd0=default=more load on PCIe, 0x00=equal)
Loop 1 loadline slope: 1.075mOhm (reg 0x24=0x2b _RW_ | X*0.025mOhm, 0x1C=default)
Loop 1 loadline enable: 1 (reg 0x38=0x81 _RW_ | bit7, 0x81=on/default, 0x01=off)
Loop 1 PWM frequency: 500.1kHz (reg 0x22=0x60 _RW_ | (1/(X*20.83ns))kHz, 0x9E=default)
Loop 1 temperature: 22C (reg 0x9E=0x16 *RO* | X*1C)



In conclusion: VRMTool is, I think, the best tool at the moment for overclocking supported AMD cards.
Personally, I would rather change these settings on the fly than go the BIOS-mod route.
I found this tool *awesome*; I don't think I've ever had so much fun/control overclocking an AMD card via software as with this R9 290, without modifying the BIOS.
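The decode formulas VRMTool prints next to each register can be checked in a few lines; a minimal sketch using the encodings and values from the readout above (function names are mine):

```python
# Decode the IR3567B voltage registers using the formulas VRMTool
# prints next to each readout (sketch; register values taken from
# the dump above).
def decode_vid(reg_93):
    # VID: 1.55V - X*0.00625V
    return 1.55 - reg_93 * 0.00625

def decode_offset(reg_8d):
    # voltage offset: X*0.00625V
    return reg_8d * 0.00625

def decode_vout(reg_9a):
    # output voltage: X*(1/128)V
    return reg_9a / 128

print(decode_vid(0x59))     # ~0.99375 V, matches "Loop 1 voltage VID"
print(decode_vout(0x7D))    # ~0.9766 V, matches "Loop 1 output voltage"
print(decode_offset(0x31))  # ~0.30625 V, i.e. a +306mv offset byte
```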


----------



## BadRobot

Quote:


> Originally Posted by *Wimpzilla*
> 
> Here some feedback using the VRMTool on hawaii R9 290 card.
> So as far i tested, everything is working, the tool read/write correctly the IC settings via I2C bus.
> 
> It allow you to modify almost everything you need for overclock, so with the new wattman from AMD drivers, you should not need anymore any 3rd party software like MSI AB.
> 
> [......]
> 
> Concluding: VRMTool is i think the best tool atm to oc supported AMD cards.
> Imo i would prefer to change these settings on the fly instead going to bios mod.
> I found this tool *awesome*, i think i never had so much fun/control oc AMD via software like this R9 290, without modifing the bios.


Did the tool fix/bypass the 150Mhz memory clock issue a lot of people have been having with the new Wattman from 16.12.1/16.12.2?


----------



## Removed1

Quote:


> Originally Posted by *BadRobot*
> 
> Did the tool fix/bypass the 150Mhz memory clock issue a lot of people have been having with the new Wattman from 16.12.1/16.12.2?


I had this problem too, and I think it is related to Sapphire Trixx or 3rd-party OC software.
That's why I went over to AMD Wattman and completely uninstalled Trixx; I only have MSI AB at the moment, with voltage control disabled, just for the OSD monitoring.
So the 150mhz issue was related to Trixx for me, I think; I could only fix it by applying/resetting the memory clock with Trixx, as Wattman and MSI AB had no control over the memory clocks.
Bored of juggling several pieces of software to OC, I decided to dump everything and use only Wattman along with VRMTool.
I think I solved the 150mhz problem: either the clock works, or my card crashes and I need to reset the whole driver.


----------



## BadRobot

Quote:


> Originally Posted by *Wimpzilla*
> 
> I got this problem too and i think is related to sapphire trixx or 3rd party oc software. That's why i get through AMD wattman and completly uninstall trixx, i have only MSI AB with the voltage control disabled for the OSD monitoring.
> So the 150mhz issue was related to trixx for me i think, i could solve it only by applying/resetting the memory clock with trixx, wattman or MSI AB didn't had any controle on memory.
> So bored to use X softwares to oc, i decided to dump everything and use only wattman and VRMTool.
> I think i solved the 150mhz problem, either the clock works, either my card carsh and i need to reset the whole driver.


Ah, I had no 3rd-party OC software installed when I installed the new drivers. I tried to fix it by not touching Wattman and using MSI Afterburner, but that solution broke a day later, even after deleting PP_CNEscapeinput from the registry (a fix I found on reddit.com/r/amd). I'll try with VRMTool and no Afterburner this time.


----------



## Removed1

Quote:


> Originally Posted by *BadRobot*
> 
> Ah, I had no 3rd party software to oc when I installed the new drivers. I tried to fix it by not touching wattman and using MSI afterburner but that solution broke a day later, even after deleting PP_CNEscapeinput from registry (a fix i found on reddit.com/r/amd). I'll try with VRMTool and no Afterburner this time.


I was using MSI AB to change clocks and Trixx to add a +200mv offset.
Then I switched to the MSI AB command line to apply more than a +300mv offset, so I wouldn't need Sapphire Trixx anymore.
I pushed to get VRMTool working because it is the most serious tool we have had for sending commands directly to the regulator IC via the I2C bus.
Try uninstalling MSI AB, doing a clean AMD driver uninstall with DDU, clearing the registry leftovers with CCleaner, then reinstalling the latest drivers and using Wattman + VRMTool to edit voltages only.
I got this 150mhz memory problem when my drivers crashed after too much OC and did not recover.
By the way, the idle memory clock is now correct: 150mhz, rising with load. With Trixx I had 300mhz at idle whenever the memory frequency wasn't stuck at a permanent 1350mhz.
I will keep testing and give feedback on this problem.

EDIT: @gupsterg
Quote:


> Originally Posted by *gupsterg*
> 
> @deeper-blue
> 
> +rep for app creation.
> 
> I would like to share some register data which I have collected:-
> 
> i) some from testing myself.
> ii) some from ROM (VoltageObjectInfo) / i2cdump compares of various cards.
> iii) some from posts by The Stilt, Unwinder (author of MSI AB) and elmor/der8auer RX 480 ROM.
> 
> 
> 
> Do you have any registers information besides ones in your app/my image?
> 
> Cheers
> 
> 
> 
> 
> 
> 
> 
> .


If I read these registers/values, could I modify each of them with VRMTool?
Just by reading and writing the value for the different register settings?
Where can I find a list of the hex registers and their corresponding values; do you have one?


----------



## mirzet1976

Quote:


> Originally Posted by *Wimpzilla*
> 
> Here some feedback using the *VRMTool* on hawaii R9 290 card.
> So as far i tested, everything is working, the tool read/write correctly the IC settings via I2C bus.
> 
> It allow you to modify almost everything you need for overclock, so with the new wattman from AMD drivers, you should not need anymore any 3rd party software like MSI AB.
> 
> So as far i tested:
> -_*The voltage VID*_ is well read and allow you to modify it as you want.
> 
> -_*The voltage Offset*_ is working and allow you to apply an offset to the gpu.
> 
> -_*The actual Voltage*_ is well read summing up the VID + Offset.
> 
> -_*The Output Current Scaling*_ work, the default value is 0x60, you can change it to 0x20 or in between. It will reduce the power sensed by the card, in few word will remove the TDP limit.
> If you change it to 0x20 the card will draw the max current it can.
> *BEWARE WITH THIS OPTION*, there is no current control, the vccd between 0x60 and 0x20 fly from 1.180v stock to 1.2v. Same as for the current flow through the vrm. If i set the value 0x50 i get crash in firestrike, so it is either stock or 0x40/0x30 to get higher TDP, after this 0x20 the card draw too much power.
> 
> -_*Phase 1/2/3 Gain*_ is the only option i didn't tested, sometimes i get the default value, sometimes the values changes, meaning that there is some phase modulation within the VRM. I will look for more in details later.
> 
> -_*LLC*_ is working, *BEWARE WITH THIS OPTION*, removing the LLC will allow the card to run with a fixed vccd, i get 1.25v by default at stock when i remove the LLC, from the ususal 1.18v.
> 
> -_*LLC Slope*_ is working, it define how strong is the LLC, so by default i have something like 1.075mOhm, if you set a bigger resistance the vccd will go down when full load, if you set less resistance the vccd will go up when full load compared to reference stock vccd.
> 
> -_*PWM Freq*_ is working i guess, i tried to change the vrm duty cycle, it seems impact on oc/stability but i didn't manage to understand how much.
> 
> Concluding: VRMTool is i think the best tool atm to oc supported AMD cards.
> Imo i would prefer to change these settings on the fly instead going to bios mod.
> I found this tool *awesome*, i think i never had so much fun/control oc AMD via software like this R9 290, without modifing the bios.


After reading this, I have a question: is it safe to have the loadline off with a voltage VID of 1.3V? Temps are good for now, and I'm folding with my R9 290.

GPU temp max 47°C
VRM1 max 58°C
VRM2 max 43°C - after 1h with LLC off


----------



## boot318

The LLC function in VRMTool is something I really want, but I can't deal with the UI. It looks like something I would use to apply too much voltage and destroy my card.









For the past two weeks I've been running my Asus DCll 290x @1225C & 1625M at +200mV. Max temps are 63C (core) and 85C (vrm) with the "Red Mod". It would be nice if I could set voltage to 1.33v and it would stay there under load.


----------



## Removed1

Quote:


> Originally Posted by *mirzet1976*
> 
> After reading this up I have a question whether it is safe to have Load Line off with Voltage VID 1.3V temps are good for now. And I'm folding with my R9 290.
> 
> GPU temp max 47°C
> VRM1 max 58°C
> VRM2 max 43°C - after 1h with LLC off


Clocks alongside that 1.3v, 1250/1300mhz?
You just disabled the LLC and the stock VDDC went straight to 1.3v?
I had 1.25v without LLC, with a 50mv offset already applied. Since I couldn't modify the VID, I just applied an offset on top of the stock VID.
There is no real difference either way: the VDDC feeding the GPU could be either a fixed 1.3v without LLC, or 1.5v drooping to a final 1.3v with LLC, I think.
It is dangerous when someone has an OC BIOS, an offset, or an OC already applied: in that case your VDDC will jump to 1.5v if you suddenly disable the LLC.















So if you start your OC at stock clocks/VDDC, disable LLC, then raise the clock until you find your max stable clocks, there should not be any problem as long as the GPU/VRM load temperatures are decent.

The real question here is: what are your clocks under load, and do you really need 1.3v to run at stock?
I'm pretty sure the GPU doesn't even use all of that 1.3v, and it is generally not good to feed the GPU a higher voltage than it needs.
The answer, imo, is no; in this case it is better to leave LLC on and maybe play with the LLC slope if you want a slightly higher VDDC without really raising it.
LLC off is interesting when you push the card to its max stable clock for 1.3v: it may handle the load better since there is no modulation, and you might get a slightly higher OC.

Quote:


> Originally Posted by *boot318*
> 
> The LLC function in VRMtool is something I really want, but I can't deal with UI. It just looks like something I will apply too much voltage to and destroy my card.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> For the past two weeks I've been running my Asus DCll 290x @1225C & 1625M at +200mV. Max temps are 63C (core) and 85C (vrm) with the "Red Mod". It would be nice if I could set voltage to 1.33v and it would stay there under load.


63° for the core is an OK temperature, but you should work on the VRM temps; 85° is already too much if you want to keep overclocking further.









There is nothing complicated about this tool: every blank box needs to be filled with the desired value, and the scale of the value is written next to it.
I have *edited* my previous post with more explanations; I just finished testing the VID command, and that one does not seem to work.








I still need to test the phase gain, but I'm sure it is not a big deal for our 290s.

So there is nothing difficult; just *ALWAYS* start this tool with your card at *STOCK*, or at least know whether you already have an offset or OC applied.


----------



## Removed1

Here are some thoughts after playing around with *VRMTool* and Wattman only. _Beware_: these are aggressive settings applied under water, with clocks at 1280/1665mhz and beyond.

-The 150mhz RAM bug is, in my case, also related to the refresh rate of my monitor. I can OC my panel from 60Hz to 75Hz; at 75Hz I can't change the RAM clock with Wattman (the slider is stuck, and sometimes the clock stays stuck after a driver crash). If I revert to 60Hz I can magically change the RAM clock again. So I would add this bug to the problems I had before with Trixx and 3rd-party tools.

-I tried using the command to disable LLC, without success; I couldn't improve my OC and get a stable card with a constant 1.3/1.32v on the GPU. The card heats up too much: the VRM goes up to a constant 86° and the GPU up to 72°. For me these temperatures are too high when you push the card this far; the thermal properties begin to deteriorate. Changing the VRM switching frequency and current scaling does not affect stability in this case; the card's power draw is humongous and not sustainable, imo.

-My aim was to get the max from the card while keeping good thermal conditions on the GPU/VRM: less than 70° for the GPU and less than 80° for the VRM. With LLC on and a +306mv offset the card is not stable beyond 1280mhz; I could push further, but even at +350mv the core refuses to run decently at 1300Mhz+. I can bench higher, but it is too much and not viable as stable winter gaming clocks. I think I have hit the max stable clock of the silicon anyway.

-So I just set 1280/1664Mhz + 50% power in Wattman; I have highlighted the changes in VRMTool. What I think these settings do is let the card draw more power, being less strict on the LLC slope so the GPU can dig into it freely. The results are not bad: the card just goes on a rampage. I have never seen this before; I ran Valley too and got the highest score I've ever had, 3329pts. Confirmed in BF1 too.









_(Do not edit these values without knowing your card and cooling possibility.)_

_Detected IR3567B VRM controller chip
Loop 1 voltage VID: 0.99375V (reg 0x93=0x59 RO | 1.55V-X*0.00625V)
Loop 1 voltage offset: *+0.30625V* (reg 0x8D=0x31 RW | X*0.00625V, 0x00=default) *+0mv Default*
Loop 1 output voltage: 1.27344V (reg 0x9A=0xa3 RO | X*(1/128)V)
Loop 1 output current scaling: ? (reg 0x4D=0x*40* RW | 0x60=default, 0x20=lower reported current) *60 Default*
Loop 1 phase 1+2 load distrib.: ? (reg 0x1E=0x00 RW | 0xdd=default=more load on PCIe, 0x00=equal)
Loop 1 phase 3+4 load distrib.: ? (reg 0x1F=0x00 RW | 0xd0=default=more load on PCIe, 0x00=equal)
Loop 1 loadline slope: *1.025m*Ohm (reg 0x24=0x29 RW | X*0.025mOhm, 0x1C=default) *1.075mOhm Default*
Loop 1 loadline enable: 1 (reg 0x38=0x81 RW | bit7, 0x81=on/default, 0x01=off)
Loop 1 PWM frequency: *600.1kHz* (reg 0x22=0x50 RW | (1/(X*20.83ns))kHz, 0x9E=default) *500kHz Default*
Loop 1 temperature: 24C (reg 0x9E=0x18 RO | X*1C)_
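The two tweaked registers can be cross-checked against the formulas shown in the readout; a small sketch (function names are mine, values from the dump above):

```python
# Cross-check the loadline-slope and PWM-frequency encodings against
# the formulas shown in the dump (sketch; values from the readout).
def loadline_mohm(reg_24):
    # loadline slope: X * 0.025 mOhm
    return reg_24 * 0.025

def pwm_khz(reg_22):
    # PWM frequency: 1 / (X * 20.83 ns), converted to kHz
    return 1.0 / (reg_22 * 20.83e-9) / 1e3

print(loadline_mohm(0x29))      # ~1.025 mOhm (tweaked); 0x2B gives ~1.075 (stock)
print(round(pwm_khz(0x50), 1))  # 600.1 kHz (tweaked); 0x60 gives 500.1 (stock)
```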

http://www.hostingpics.net/viewer.php?id=987381Wow.jpg

15409pts Graphic score, my previous highest was 15k








Hawaii rocks hard with over 400W of TDP allowed and decent cooling!


----------



## boot318

^Are you using a modded BIOS? If so, your score should be way better than that. I got this with 1210C and 1525M, if I remember correctly. The modded BIOS does make it so 3DMark can't read my GPU name correctly, though.


----------



## Removed1

Quote:


> Originally Posted by *boot318*
> 
> ^Are you using a modded bios? Your score should be way better than that if so. I got this with 1210C and 1525M if I remember correctly. The modded bios makes it where 3dmark can't see my gpu name correctly though.


Nope, it is a stock PowerColor 290 BIOS.
I don't use a modded one because I prefer running the GPU at default values and using 3rd-party tools; I don't always run the card overclocked.
Almost all the BIOSes made here have the VRM switching frequency lowered, like all of The Stilt's BIOSes.
I don't like those BIOSes; imo they don't overclock the GPU decently due to the lower VRM switching frequency.
The only difference between the two, I think, is the edited memory table in your BIOS.

Anyway, I'm done testing with VRMTool. I finally got the card running at its full potential: in BF1, ultra settings, DX12, 1080p, it hits 130fps max and more than 100fps on average.








I have never had performance like this, especially with really low frame times and no stutter at all.
Maybe I will look at the OCP/OVP/combined offset/VDDC limit changes to see if I can grab some more performance.

Last night I ran 2h of BF1 at 1280/1700Mhz, +300/20mv, output current scaling 0x40, LLC slope at 1.050mOhm, both VRM loops 1/2 at 550kHz. The results were awesome: GPU 66° max, VRM 78° max, though unfortunately it was loud and cold.


----------



## mirzet1976

@Wimpzilla can you post pic of VRMTool with settings you use.


----------



## Removed1

Quote:


> Originally Posted by *mirzet1976*
> 
> @Wimpzilla can you post pic of VRMTool with settings you use.


Here you go!

http://www.hostingpics.net/viewer.php?id=998126Untitled.jpg

The GPU offset, the LLC slope, and the loop 1 (GPU) VRM switching frequency are edited using the blank boxes.
Here is what I needed to edit manually with the read/write byte command.

-The output current scaling: you can set values like 80/60/40/30/20%, I think; 50% is not allowed.
The default is 60%. The tool lets you go down to 20%, but you need to input 40% manually, as 20% is really too much even on water; don't even think about it on air.
_00 04 30: read register 4d: 60
00 04 30: write register 4d <- 40
00 04 30: read register 4d: 40_
-This is the register for the VRM switching frequency of loop 2, the memory VRM; the tool only lets you edit loop 1, the GPU one.
So you need to edit the VDDCI VRM switching frequency manually to match the GPU's; imo it is wise to run all the VRM power phases at the same frequency.
_00 04 30: read register 23: 60
00 04 30: write register 23 <- 57
00 04 30: read register 23: 57_
-This is the register and byte for the VDDCI offset, the one you find in MSI AB.
I run with +20mv on the aux voltage; it seems to help a bit. I get a few more points in 3DMark with +20mv at a 1700mhz memory clock; any higher and I lose points.
_00 04 30: read register 8e: 00
00 04 30: write register 8e <- 03
00 04 30: read register 8e: 03_
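Since the offset registers step in 6.25mv units, the byte to write for a desired offset is easy to compute; a sketch consistent with the 0x8e edit above (function names are mine):

```python
# Map a desired rail offset (mV) to the byte for an IR3567B offset
# register, which steps in 6.25 mV units (sketch; mirrors the 0x8e
# edit above, where 0x03 stands in for the requested "+20mv").
STEP_MV = 6.25

def offset_byte(target_mv):
    # nearest representable step
    return round(target_mv / STEP_MV)

def actual_mv(byte):
    return byte * STEP_MV

b = offset_byte(20)
print(hex(b), actual_mv(b))  # the "+20mv" request actually lands at +18.75mV
```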


----------



## mirzet1976

@Wimpzilla thank you; I was most interested in those manual commands.


----------



## scolon

What do you guys think about doing something like this to the stock cooler? I have a spare stock cooler anyway...

http://www.rage3d.com/board/showthread.php?t=33938071

Aftermarket coolers won't fit into my case and they're too expensive anyway. Non-reference coolers are extremely hard to get, only two on ebay in the last 6 months.


----------



## BadRobot

Quote:


> Originally Posted by *scolon*
> 
> What do you guys think about doing something like this to the stock cooler? I have a spare stock cooler anyway...
> 
> http://www.rage3d.com/board/showthread.php?t=33938071
> 
> Aftermarket coolers won't fit into my case and they're too expensive anyway. Non-reference coolers are extremely hard to get, only two on ebay in the last 6 months.


That's a tech-support MacGyver idea. If it works, it ain't stupid, and a 20C drop is no laughing matter. I've considered zip-tying a fat CPU cooler onto mine just for fun, but I didn't want to risk it. I asked for the screw hole measurements around the chip in this thread, but no one replied.


----------



## scolon

Biggest problem is that the cooling fins are closed on top.

http://images.anandtech.com/doci/7481/290X_HSF.jpg

I would have to grind them down a bit so the air can pass through. They might be too thin to survive that treatment.

Other option is to strap an old CPU cooler (Blue Orb 2) on with zip ties that I have lying around.

Kind of like this project:
https://i.ytimg.com/vi/1-5WEgZhCec/maxresdefault.jpg


----------



## BadRobot

Quote:


> Originally Posted by *scolon*
> 
> Other option is to strap an old CPU cooler (Blue Orb 2) on with zip ties that I have lying around.
> 
> Kind of like this project:
> https://i.ytimg.com/vi/1-5WEgZhCec/maxresdefault.jpg


Yep. Exactly like that, except with an even bigger cooler. Of course, I'll still have tiny stick-on heatsinks for the VRAM/VRMs around it. Maybe I could MacGyver an extra fan to blow air around that.


----------



## mfknjadagr8

I think you're beating a dead horse with that solution. If it doesn't help at all, you've got a cut-up shroud that no longer does what the stock cooler intended either. It's worth a try if you don't mind damaging the card for the experience and the possibility of lower load temperatures, but if I had to guess, the poster in the thread you linked probably needed new paste, as he was hitting 90 on the stock cooler with a pretty high fan curve. They usually stay within operating temperatures unless the paste or contact is bad. I would look for a solid factory cooler from someone who has gone watercooling and doesn't need theirs, like a Windforce or a Twin Frozr, something made for a reference PCB.


----------



## Removed1

Speaking of beating a dead horse, I finally took the time to properly upgrade my *R9 290 ghetto cooling* solution into something more _stable and reliable_.
So I ordered some new 0.5/1mm thermal pads.

First of all, I'm using a Corsair H100 coupled with a Corsair HG10 on my 290. My H100 is modded into a loop with the H100i on the CPU and a reservoir, like a custom loop, so the cooling performance should be nearly the same.
But I have always run 5°/10° hotter than custom loops: they see 55°/58° at full OC load where I get 60°/65°, and I'm not happy with that, since I spent a lot of time modding my AIO-based custom loop.
On the CPU the results are awesome; you could copy what I've done to get decent water cooling from cheap second-hand Corsair AIOs.

Every time I repaste and remount the Corsair HG10 on the 290, I fix something going *wrong* with this Corsair AIO adapter.
Don't get me wrong, the HG10 is really a good product compared to the Kraken G10: for the R9 290X it cools the RAM/VRM by reusing the stock blower.
Fine; the finish and design are good, the aluminum plate is thick, the VRM cooling is fine, the pads are fine, but there is one small engineering mistake that the Corsair guys should be ashamed to have released in a final product.









The mistake concerns the thermal pads: the ones used for the VRM are 1mm thick, the ones used for the RAM are 0.5mm thick.
That would not cause any problem; you would just squeeze the VRM pads when tightening the HG10, but...
When you fully tighten the HG10 onto the PCB, the gap between the *RAM/VRM* and the metal is *0.5mm*, so the RAM pads are fine but the VRM ones are not.
Corsair could have used 1mm pads for the RAM too, so you would not tighten all the way, but that would be problematic for the AIO's seating and alignment on the GPU.

The worst part, the real engineering mistake, is the VRM 2 cooling design: the metal shroud of the HG10 touches some SMD capacitors when you tighten to the default maximum, and since those are taller than the nearby RAM/VRM:
-The high side of VRM 2 makes poor contact with the HG10 pad.
-The pads and the metal shroud make only half contact with some RAM chips.
-The AIO ends up seated and aligned wrongly on the GPU, which can kill your card.

Now I understand what Corsair did, and I understand why a lot of users said it killed their card, complaining about bending the PCB..!
Corsair just changed the thickness of the VRM thermal pads, hoping it would be enough, but no.
At least they could have used 1mm pads everywhere; a poor finish when the rest is really excellent.









The ghetto R9 290; the Thermalright box is just there to tension the zip tie that squeezes the PCB/VRM and HG10 together.


The card without the first ghetto layer. For the second layer I had modded the VRM cooling, removing the stock blower mounted there and gluing bits on to help chill the VRM; it did its job decently with the 120mm fan.


Here you can clearly see the consequences: the oil mark on the top memory chips shows less than half contact.
Luckily the RAM can take high temperatures, but still; I had OCed my RAM to 1664/1700mhz for gaming, and it was worse than the last time I repasted. I can clearly see the effect of my last push with the BIOS mod + VRMTool.


The VRM: I reused the stock thermal pad; it is not bad and can be squeezed more easily.
It could keep them under 80°, but that still wasn't decent or optimal, and it starts to be dangerous if I keep pushing beyond 1.32/1.35v without limits.


----------



## Removed1

The VRM 2 cooling. You can clearly see the caps causing the problem; should the metal plate have had another design?!
They follow the same oil line as on the top RAM chips; I checked carefully whether they were really the cause, and imo yes.


A better mod of the VRM 1 section: I cut the black heatsink into small pieces, each with a new pad, for better cooling and airflow.
I also drilled a third hole in the top of the HG10 so a screw can clamp the whole thing together properly and keep the PCB straight.


The VRM 2 mod: I only had a metal saw and some basic tools, so I cut the pieces I could, slightly bending the one I couldn't really remove given the space and SMD cap layout. I wish I'd had a Dremel, but the poor/ghetto modding religion does not allow the use of these satanic tools!!















In the end, after 3h of careful work to avoid bending anything on the HG10, the contact between the VRM 2 and the pad was good. I had to put two 0.5mm pads on it; they are thinner anyway, and I recovered the contact of the HG10 across the whole PCB!










Final product! A bit disappointing; it's not ghetto anymore.










The results speak for themselves: one shot is stock after 6 Valley loops plus a final bench, the other is 45min of folding. I lost roughly 5° everywhere: GPU, VRM1/2.
I think I now get nearly basic custom water-cooling performance, and I can feel it when I run the card. And yes, that is a stock R9 290 at full load, not idle temperatures.








Room t° 15°/20°, 500/1500rpm on the CPU rad, 1500rpm push/pull on the GPU rad.


I hope my painful experience with the Corsair HG10 can help others; I'll leave it here, in the temple of the Hawaii thread!









Edit: folding is actually running!


----------



## lanofsong

Hello AMD R9 290X/290 owners,

We are having our monthly Foldathon from Monday the 16th to the 18th, starting at 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us; see the attached link.

January 2017 Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - requires a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter the Team OCN number - 37726

later
lanofsong


----------



## kmetek

Hey guys, I just bought a 2nd-hand Sapphire R9 280X Vapor-X and would like to know how to fix throttling in games.


----------



## the9quad

Quote:


> Originally Posted by *kmetek*
> 
> Hey guys, I just bought a 2nd-hand Sapphire R9 280X Vapor-X and would like to know how to fix throttling in games.


Use a custom fan curve.
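A custom fan curve (in Afterburner, Trixx, or a BIOS edit) is just a piecewise-linear map from GPU temperature to fan duty cycle. Here is a minimal sketch of the interpolation those tools perform; the breakpoints are illustrative examples, not recommended values for any specific card:

```python
def fan_speed(temp_c, curve=((40, 20), (60, 40), (75, 70), (85, 100))):
    """Piecewise-linear fan curve: GPU temperature (degC) -> fan duty (%)."""
    if temp_c <= curve[0][0]:
        return float(curve[0][1])          # floor speed below the first point
    for (t0, s0), (t1, s1) in zip(curve, curve[1:]):
        if temp_c <= t1:                   # interpolate between breakpoints
            return s0 + (s1 - s0) * (temp_c - t0) / (t1 - t0)
    return float(curve[-1][1])             # ceiling speed past the last point
```

A more aggressive curve (a steeper ramp starting at lower temperatures) is what keeps a reference Hawaii cooler from letting the core reach its throttle point.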


----------



## HOMECINEMA-PC

Best money I ever spent on these red things.



My TRI 290 setup hasn't missed a beat since late 2013 when I blocked 'em.


----------



## Removed1

I love your ghetto setup so much.








I'm planning to build a chiller too; I saw a nice idea on OCN where a mate used 2 RAM waterblocks and squeezed 2 TECs in the middle.








And yes, I admit, I do some tweaking to get the max performance out of the Hawaii GPU, but damn son, a 3-year-old GPU that pwns 15.5k points in Fire Strike when OCed is absurd!

I just checked current GPU prices: US side an RX 480 is $190/240, here in the EU it's more like 200/250e, lol.








Even more catastrophic, a 1060 is $200/250, here in the EU it's more like *250/350*e, trololol.

You can find cheap R9 29x cards for 150e or less; I paid 100e for this golden 79.7% ASIC GPU.


----------



## mirzet1976

I paid 140 euro for my R9 290 (79.2% ASIC), and 55 euro for two full-cover blocks, an Evo Supremacy CPU block, and a dual terminal. Now I'm thinking of getting another card, since I already have a block and terminal.


----------



## Removed1

Quote:


> Originally Posted by *mirzet1976*
> 
> I paid 140 euro for my R9 290 (79.2% ASIC), and 55 euro for two full-cover blocks, an Evo Supremacy CPU block, and a dual terminal. Now I'm thinking of getting another card, since I already have a block and terminal.


Cheers mate!















I paid the same price for my first R9 290 (74% ASIC); add to that 40e for a second-hand H100 and 23e for the HG10 bracket.
















So this is my overall t° after 12h of folding.
[email protected], t° on H100i 35.5°, 2*[email protected]
[email protected]/1350Mhz, 390 bios without limits, +50mv, 4*[email protected]
Room t° 15°/20°


----------



## MSIMAX

Testing with the ReLive 17.1.1 drivers.


----------



## Removed1

Same here: 17.1.1, 1250/1450MHz, +175mV, LLC slope 1.05mOhm.

Still able to break 15k on the graphics score.
I flashed the 390 VID BIOS + edited memory tables, taken from the other thread, edited without limits and with a max GPU clock of 1500MHz.
The GPU seems to run more stable with the modded 290 BIOS, but I noticed the RAM timings cap the maximum clock the GPU can reach.
The 390 BIOS runs the card smoother and more stable, and the resulting gaming experience is awesome.








On the other hand, the stock 290 BIOS allows me to freely climb up to 1290/1700+MHz, even if the card is on the edge of stability.

So my highest bench is still with the stock 290 BIOS: 1280/1705MHz, 15544pts on the graphics score!
I still can't go above 1260MHz on the GPU and 1500MHz on the RAM with the 390 BIOS; I clearly notice the impact of the modified memory tables here.
I get roughly the same performance at lower clocks, with the added reward of a smoother frame rate.


----------



## bichael

Hi all. Have just picked up a reference 290 and EK block - paid equivalent of 135 USD so pretty happy but was wondering which approach I should look into for getting the best out of the 290. Simple OC with Wattman? Afterburner? Flash a different (390?) BIOS? Mod the BIOS? VRMTool? Given the 290 has been around for a while there is obviously a huge amount on info out there, and some of it probably not so current, so just looking for a steer to avoid wasting time on a dead end. I don't mind a bit of tweaking and time if it's worth the effort but if I can get 99% of the way there with minimal effort then I would probably just go for that. Cheers.


----------



## dagget3450

Quote:


> Originally Posted by *bichael*
> 
> Hi all. Have just picked up a reference 290 and EK block - paid equivalent of 135 USD so pretty happy but was wondering which approach I should look into for getting the best out of the 290. Simple OC with Wattman? Afterburner? Flash a different (390?) BIOS? Mod the BIOS? VRMTool? Given the 290 has been around for a while there is obviously a huge amount on info out there, and some of it probably not so current, so just looking for a steer to avoid wasting time on a dead end. I don't mind a bit of tweaking and time if it's worth the effort but if I can get 99% of the way there with minimal effort then I would probably just go for that. Cheers.


It really depends on your personal usage. If you're going to game and stability is the primary concern, then I would say look into memory timings/straps and a slight core OC. Also, it wouldn't hurt to verify whether your 290 can unlock to a 290X as well.


----------



## Removed1

Quote:


> Originally Posted by *bichael*
> 
> Hi all. Have just picked up a reference 290 and EK block - paid equivalent of 135 USD so pretty happy but was wondering which approach I should look into for getting the best out of the 290. Simple OC with Wattman? Afterburner? Flash a different (390?) BIOS? Mod the BIOS? VRMTool? Given the 290 has been around for a while there is obviously a huge amount on info out there, and some of it probably not so current, so just looking for a steer to avoid wasting time on a dead end. I don't mind a bit of tweaking and time if it's worth the effort but if I can get 99% of the way there with minimal effort then I would probably just go for that. Cheers.


DDU and a clean driver install.

Change the clock/PL in WattMan; leave everything on auto, just change the GPU/RAM clocks and the PL.
To add the desired VDDC/VDDCI offsets, use VRMTool; note that the settings reset when the VRM powers down.

1/ Add offset
2/ Changes desired clock
3/ ????
4/ PROFIT!

+1 dagget3450
Checking whether the baby can unlock to a 290X absolutely wouldn't hurt.








For a low-profile OC with solid performance, if you have a good-ASIC GPU: the 390 BIOS with edited 390 VID and memory tables works like a charm!


----------



## bichael

Thanks for the tips guys, it all sounds good; looking forward to seeing how the card does!
And yeah, it's for general gaming, so just looking for something solid.
Just waiting for my new PSU now so I can start playing (I've upgraded from a 450W Bronze to a 550W Gold); thought it might arrive today, but it seems not.


----------



## Roboyto

Quote:


> Originally Posted by *bichael*
> 
> Hi all. Have just picked up a reference 290 and EK block - paid equivalent of 135 USD so pretty happy but was wondering which approach I should look into for getting the best out of the 290. Simple OC with Wattman? Afterburner? Flash a different (390?) BIOS? Mod the BIOS? VRMTool? Given the 290 has been around for a while there is obviously a huge amount on info out there, and some of it probably not so current, so just looking for a steer to avoid wasting time on a dead end. I don't mind a bit of tweaking and time if it's worth the effort but if I can get 99% of the way there with minimal effort then I would probably just go for that. Cheers.


My post is from quite some time ago, but the general information is still relevant: http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/21890#post_22208781

Would suggest putting upgraded thermal pads on the VRMs for maximum performance: http://www.overclock.net/t/1468593/r9-290-x-thermal-pad-upgrade-vrm-temperatures

The waterblock is going to solve the biggest problem with the Hawaii cards...heat...assuming you have enough radiator space to cool it properly. From my experience even a standard thickness 240mm radiator with good fans is sufficient to keep temperatures reasonable even with a fair overclock and some voltage.

Generally speaking overclocking on these cards is pretty straightforward. My technique for the numerous Hawaii cards I have owned was always as follows:


Max the power slider to +50%.
Make incremental core clock increases until you have issues like artifacting, driver crashes, or lockups. To get a 'feel' for your card, 20-25MHz increments work well initially.
Add some voltage, in incremental steps once again; 20-25mV can also work well here. If you want to use smaller increments, that is up to how detailed you want to get.
Rinse and repeat.
You may want to create a spreadsheet with clocks, voltages, and how the card acts. This way you can get a 'picture' of how the card does with varying amounts of voltage and clock speed. This will also let you do some fine tuning later, if you choose, once you see where the card behaves best.

From my experience nearly any 290 will hit 1200MHz, or very close, on the core...but it may very well take maximum power % and voltage to do so. If you use a spreadsheet, or at least take some notes, you will find a "sweet spot" somewhere with a moderate amount of voltage to get most of the performance out of the card without pushing it to the brink all the time.
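The stepping routine above is essentially a small search with bookkeeping. Here is a sketch of the spreadsheet idea, assuming you supply your own `is_stable` check (a stand-in for the manual artifact/crash testing); nothing in this snippet talks to the actual card:

```python
import csv

def oc_survey(core_steps, voltage_steps, is_stable, log_path="oc_log.csv"):
    """Log which (core MHz, offset mV) combinations pass a stability test."""
    results = []
    with open(log_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["core_mhz", "offset_mv", "stable"])
        for mv in voltage_steps:        # add voltage in increments
            for mhz in core_steps:      # raise the core clock in 20-25MHz steps
                ok = is_stable(mhz, mv)
                writer.writerow([mhz, mv, ok])
                results.append((mhz, mv, ok))
                if not ok:              # first failure at this voltage: back off
                    break
    return results
```

The resulting log is exactly the 'picture' described above: for each voltage step, the highest clock that passed, which makes the sweet spot easy to see.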

I didn't say anything about RAM speeds because they don't affect performance anywhere near as much as the core clock. Do this last, and just think of any performance boost you get from the RAM as the cherry on top. Most cards generally didn't have RAM that overclocked all that well, but they are out there.

There's more information in the first link I posted. If you have any questions, don't hesitate to ask


----------



## Streetdragon

Not every card can do 1200MHz. On water yes, but not on air.


----------



## mus1mus

Quote:


> Originally Posted by *Streetdragon*
> 
> Not every card can do 1200MHz. On water yes, but not on air.


Well, he pretty much said it that way.









I know my cards can, on air, pretty easily. Reference cards too.


----------



## kmetek

Quote:


> Originally Posted by *the9quad*
> 
> Use a custom fan curve.


How? With MSI Afterburner?


----------



## Removed1

Quote:


> Originally Posted by *mus1mus*
> 
> Well, he pretty much said it that way.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I know my cards can on air pretty easy. Reference cards too.


It just depends which ones: the first ones with the bad node and 60% ASIC, with a GPU dirty like some mercury-laden Amazon river?








Or the ones with 70%+ ASIC that came after, once AMD refined the node.









It is true that my 3 reference 290s with 70%+ ASIC were *horrible* with the stock cooler, yet capable of 1250MHz+ under extreme air or water!








Quote:


> Originally Posted by *kmetek*
> 
> How? With MSI Afterburner?


Yes!
Or you can tune the fan curve yourself directly by BIOS editing, once you've found a good one with MSI AB.


----------



## diggiddi

Quote:


> Originally Posted by *kmetek*
> 
> How? With MSI Afterburner?


Yessir


----------



## Roboyto

Quote:


> Originally Posted by *Streetdragon*
> 
> Not every card can do 1200MHz. On water yes, but not on air.


I've owned 4, maybe 5, of them and they all were able to hit 1200, *"or very close"*, and *"nearly any"* as I stated above. Very close likely meaning ~1175 at worst with maximum power/voltage...it has been a while. Even with reference blowers this was possible by cranking the fan to 100%. I've only owned reference models.

After benching with reference cooler, every card got at least a kraken if not a full cover block. The reduction in temperature and leakage would typically allow a little more on the core and/or reducing voltage for the same clock speed(s).

Of course there are exceptions on either end of the spectrum of good/bad performers. I still have my reference 290, that I purchased back in Dec of 2013 that is an absolute beast. It will crank out to approx. 1300/1700 in most benchmarks, at max power/voltage, and it still hits those RAM speeds running the 1250MHz RAM timings. It does 1200/1500 with 100mv, guaranteed, in any benchmark or game. That is about as good of a sample of a 290(X) as anyone could ask for. Unfortunately it is sitting on my desk as a glorified paper weight right now because of next gen cards.

Quote:


> Originally Posted by *mus1mus*
> 
> Well, he pretty much said it that way.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I know my cards can on air pretty easy. Reference cards too.


I sure did  If anyone can tell you about how these cards perform...this ^ guy can


----------



## mus1mus

I should walk back my previous statement a little, though.

Hawaii cards with Hynix memory are known to clock lower than those with Elpida. Still, 1200 is achievable with a thermally sound setup.

Regarding the ASIC and manufacturing improvements: higher-ASIC cards are harder to tame. If you have a waterblock, then fine. Even then, my card with the highest ASIC runs the hottest at the lowest voltage.


----------



## mus1mus

Quote:


> Originally Posted by *Roboyto*
> 
> I sure did
> 
> 
> 
> 
> 
> 
> 
> If anyone can tell you about how these cards perform...this ^ guy can


I just learned from you guys.







I'm still a noob, really.

Your thread is a Hawaii Bible.


----------



## Removed1

Quote:


> Originally Posted by *Roboyto*
> 
> *Unfortunately it is sitting on my desk as a glorified paper weight right now because of next gen cards.*
> 
> I sure did
> 
> 
> 
> 
> 
> 
> 
> If anyone can tell you about how these cards perform...this ^ guy can


Your post was perfect except for that.








Dude, my 290 at 1250/1420MHz under the 390 BIOS delivers roughly a 100fps average in BF1, DX12 Ultra 1080p!















15.2K graphics score in Fire Strike and 3200pts in Valley, stable clocks; I just played 3 hours last night instead of endlessly benching.








No need for a next-gen card atm, except for the Vega hype train.

This is the baby after 30min at full throttle under [email protected]; I get the same t° after a couple of hours of BF1.
It seems stable even though I got one bad state in 30min. Almost 500k PPD!








1250/1420MHz, 390 BIOS with memory tables, 50% PL, +163mV, TDP/PL/TDC 260/260/240.


----------



## AndreVeck

Quote:


> Originally Posted by *Wimpzilla*
> 
> [...] my 290 1250/1420Mhz under 390 bios [...]


I'm new to modding BIOSes; is there a guide for that? I have a 290 Tri-X from Sapphire.
Thanks for the help!


----------



## Removed1

Quote:


> Originally Posted by *AndreVeck*
> 
> I'm new to modding BIOSes; is there a guide for that? I have a 290 Tri-X from Sapphire.
> Thanks for the help!


Here For Bios Modding
Read the first OP post carefully!

Here For The 390 Bios suited for the 290
Same, read the first OP post carefully!

The 290=>290x Unlock Thread, The Other One

The clocks I got are under water, not under air.
So first check which version of the Tri-X you have (reference-design PCB or not), and also check whether your card can unlock to a 290X.
If you have a 290 Tri-X with the reference PCB, the best you can do is flash a 390 BIOS with modded memory tables and try a small OC like 1100/1150MHz, +50/100mV.
You will get a nice performance boost if your card doesn't run too much hotter.


----------



## Roboyto

Quote:



> Originally Posted by *Wimpzilla*
> 
> Your post was perfect except for that.
> 
> 
> 
> 
> 
> 
> 
> 
> Dude, my 290 at 1250/1420MHz under the 390 BIOS delivers roughly a 100fps average in BF1, DX12 Ultra 1080p!
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 15.2K graphics score in Fire Strike and 3200pts in Valley, stable clocks; I just played 3 hours last night instead of endlessly benching.
> 
> 
> 
> 
> 
> 
> 
> 
> No need for a next-gen card atm, except for the Vega hype train.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> This is the baby after 30min at full throttle under [email protected]; I get the same t° after a couple of hours of BF1.
> It seems stable even though I got one bad state in 30min. Almost 500k PPD!
> 
> 
> 
> 
> 
> 
> 
> 
> 1250/1420MHz, 390 BIOS with memory tables, 50% PL, +163mV, TDP/PL/TDC 260/260/240.


I'm not saying my 290 was performing poorly, because it wasn't in the least bit. However, when Fury X pricing dropped to $379 a few months back AND PPCs had the EK blocks on blowout for $70, it seemed like the opportune time to jump on that deal. While it wasn't a massive boost in performance, for my 3x1080 Eyefinity it gave me a decent bump in the right direction. I'm actually glad I did make the switch when I did, since Vega has been pushed back as far as it has. Either way I look at it, $450 for AMD's flagship card with an EK nickel-plated block is a damn good deal...and I'm not sorry I did it in the least bit.

I made a spreadsheet back in Feb of last year when I was toying with the idea of the Fury X upgrade, and my conclusion at that time was that a Fury X at stock clocks would be about 29% faster than a MSI Gaming 390X running at 1050/1500. http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/41800_20#post_24903602

My awesome 290 running at 1200/1500 may be a little faster than that 390X, but not by much so I figured this was a decent comparison. Now if you factor in my Fury X's OC capability which isn't terrible at ~1150/550, then the ~30%, or more, performance boost from 290 -> Fury X should easily hold true.
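That estimate can be sanity-checked with back-of-envelope arithmetic, assuming performance scales roughly linearly with core clock (optimistic, but fine for a rough bound). The 29% figure and the ~1150MHz overclock come from the post above; the 1050MHz stock core is the Fury X reference clock:

```python
# Rough scaling check; all inputs are the post's estimates, not measurements.
fury_stock_vs_390x = 1.29        # Fury X stock vs MSI Gaming 390X @ 1050/1500
fury_oc_core = 1150              # MHz, the "not terrible" Fury X overclock
fury_stock_core = 1050           # MHz, Fury X reference core clock

# Naive linear-with-clock scaling of the stock comparison.
fury_oc_vs_390x = fury_stock_vs_390x * (fury_oc_core / fury_stock_core)
print(f"Fury X OC vs 390X baseline: +{fury_oc_vs_390x - 1:.0%}")
```

Since the 290 at 1200/1500 is only "a little faster" than that 390X baseline, a 30%-or-more gap to an overclocked Fury X is plausible under this naive model.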

I also pulled my other blocked 290 out of my HTPC in favor of a reference RX480 8GB. While this wasn't a large gain in performance it is faster than the *mediocre* 290 it replaced on all fronts, especially when you compare 290/480 at stock clocks like I had previously: http://www.overclock.net/t/1605802/official-radeon-rx480-470-460-owners-club/40_20#post_25355009

It was also nice to have a new card to play around with since I've had so many 290s over the last couple years. It was a fruitful upgrade considering I got a sample that does 1400/2250 under stock voltage/power constraints with the reference 6-pin power connector.


----------



## AndreVeck

Hi @Wimpzilla, thanks for the help!

I managed to flash the 390 BIOS with base voltages, since the modded one booted with artifacts and weird things on screen.

My card is not unlockable to a 290X, so modding the 390 BIOS will be the next thing to do.

Now I'll do some testing with this BIOS.

Is modding a BIOS difficult? Do you recommend it?

Thanks!


----------



## catbeef

updating to 17.1.1 from 16.6 seems to have ****ered my memory oc, can only get 1250mhz now where i was at 1400mhz before - above 1250 my monitor begins flipping out.

it is a msi gaming 4g 290.. haven't touched the bios, using trixx to oc. have been happily plodding along at 1100mhz +19mv / 1400mhz memory for however long. did i miss an important psa regarding the 17.x drivers and 290 cards? anything i should be doing to get my memory back up and stable? (i guess with 17.x would be nice, but if i gotta go back to 16.x for the time being.. sure)

also~ unrelated to my current problem, but relevant to the above chatter.. i think i mentioned before in this thread, my card wont be stable above 1100, even sticking the power to +200mv. and it isn't a heat thing, as i am cooling my 290 with a corsair h55 (keeps it 30 degrees cooler than the stock cooler did lol).
*shrug* idk why, was the same with a sapphire tri-x 290 i had before. flipped it for this msi card - made £50 and wondered if this one would go above 1100... nope. maybe it is something in my setup? or i am just super unlucky and got the only two 290s that don't go above 1100 haha

ah well, i'll go revert to 16.6 for now~


----------



## diggiddi

Quote:


> Originally Posted by *catbeef*
> 
> updating to 17.1.1 from 16.6 seems to have ****ered my memory oc, can only get 1250mhz now where i was at 1400mhz before - above 1250 my monitor begins flipping out.
> 
> it is a msi gaming 4g 290.. haven't touched the bios, using trixx to oc. have been happily plodding along at 1100mhz +19mv / 1400mhz memory for however long. did i miss an important psa regarding the 17.x drivers and 290 cards? anything i should be doing to get my memory back up and stable? (i guess with 17.x would be nice, but if i gotta go back to 16.x for the time being.. sure)
> 
> also~ unrelated to my current problem, but relevant to the above chatter.. i think i mentioned before in this thread, my card wont be stable above 1100, even sticking the power to +200mv. and it isn't a heat thing, as i am cooling my 290 with a corsair h55 (keeps it 30 degrees cooler than the stock cooler did lol).
> *shrug* idk why, was the same with a sapphire tri-x 290 i had before. flipped it for this msi card - made £50 and wondered if this one would go above 1100... nope. maybe it is something in my setup? or i am just super unlucky and got the only two 290s that don't go above 1100 haha
> 
> ah well, i'll go revert to 16.6 for now~


It might be a power thing; what PSU is it and how old is it?


----------



## azcrazy

Quote:


> Originally Posted by *catbeef*
> 
> updating to 17.1.1 from 16.6 seems to have ****ered my memory oc, can only get 1250mhz now where i was at 1400mhz before - above 1250 my monitor begins flipping out.
> 
> it is a msi gaming 4g 290.. haven't touched the bios, using trixx to oc. have been happily plodding along at 1100mhz +19mv / 1400mhz memory for however long. did i miss an important psa regarding the 17.x drivers and 290 cards? anything i should be doing to get my memory back up and stable? (i guess with 17.x would be nice, but if i gotta go back to 16.x for the time being.. sure)
> 
> also~ unrelated to my current problem, but relevant to the above chatter.. i think i mentioned before in this thread, my card wont be stable above 1100, even sticking the power to +200mv. and it isn't a heat thing, as i am cooling my 290 with a corsair h55 (keeps it 30 degrees cooler than the stock cooler did lol).
> *shrug* idk why, was the same with a sapphire tri-x 290 i had before. flipped it for this msi card - made £50 and wondered if this one would go above 1100... nope. maybe it is something in my setup? or i am just super unlucky and got the only two 290s that don't go above 1100 haha
> 
> ah well, i'll go revert to 16.6 for now~


I have the same problem; I can't even control the fan speed.


----------



## catbeef

Quote:


> Originally Posted by *diggiddi*
> 
> It might could be a power thing, what Psu and how old is it?


a Seasonic OEM XFX 650w semi-modular thing, P1-650X-XXB9. almost 3 years old. downgraded to 16.12.x first and had the exact same issues with that.. 16.6.2 is working fine though.

i read a couple google results where people were having trouble with hawaii cards and wattman.. but i've not been paying much attention and i couldn't find anything called wattman at all -- all i remember of wattman is it was an ati/amd made power manager that they sneak-peaked when the 480s were first coming about, or something like that. *shrug shrug*

Quote:


> Originally Posted by *azcrazy*
> 
> I have the same problem, can't even control the fan speed


not sure whether it affects my fan control as i'm using a kraken g10 bracket, and the onboard fan controller (gpu) is powering that fan.


----------



## Satanello

Hi all, I have a little problem. In the last few days I added a second R9 290X and tested various driver versions (including the latest, 17.1.2), but Diablo 3 and The Division run on and use only one card. Crossfire mode is OK with 3DMark and Arma 3.
Is this normal?? I'm about to order a new 4K monitor, but with only one card running I think only Diablo would be playable.
Thanks for your help!

Sent from mTalk


----------



## BadRobot

Quote:


> Originally Posted by *Satanello*
> 
> Hi all, I have a little problem. In the last few days I added a second R9 290X and tested various driver versions (including the latest, 17.1.2), but Diablo 3 and The Division run on and use only one card. Crossfire mode is OK with 3DMark and Arma 3.
> Is this normal?? I'm about to order a new 4K monitor, but with only one card running I think only Diablo would be playable.
> Thanks for your help!
> 
> Sent from mTalk


IIRC, crossfire only works in fullscreen. Can you make sure the games are running fullscreen? Try making a profile in AMD CCC for both games, and for the crossfire setting select AFR Friendly.

From the ArsTechnica forum:
Quote:


> Default mode is whatever you have set globally. It is essentially an absence of Crossfire mode setting. The driver does whatever. This is AFR if nothing else is set, meaning it usually is AFR - but it will go to AMD pre-defined if nothing else. Frame pacing is active in this mode for DX9, DX10+ games.
> AFR Friendly is an AFR mode for stuff that just doesn't seem to get along with anything else. Frame pacing is disabled for DX9 games.
> Optimize 1x1 is identical to AFR Friendly, but includes additional help for games using 1x1 pixel surfaces, which can flicker or flash in AFR Friendly mode. Frame pacing is not active. Frame pacing is completely disabled.
> AMD Pre-Defined is only that. If there is no pre-defined profile, it will not fall back on AFR, it'll disable CrossfireX. This is how it differs from Default mode, which WILL fall back on AFR. Frame pacing is active in this mode for DX9, DX10+ games.


----------



## dagget3450

Quote:


> Originally Posted by *BadRobot*
> 
> iirc, crossfire only works in fullscreen. Can you make sure the games are running fullscreen? Try making a profile in AMD CCC for both games and for the crossfire settings set it to AFR Friendly.
> 
> From the ArsTechnica forum:


In DX12, crossfire works in windowed mode; Time Spy is a good example of that. It actually works better than fullscreen in DX12. Is The Division DX12?


----------



## BadRobot

Quote:


> Originally Posted by *dagget3450*
> 
> In dx12 crossfire works in window mode. Timespy is a good example for that. It actually works better than fullscreen in dx12. The division is dx12?


Oh neat! I tried crossfire back in 2013, but my "game flow" makes me alt-tab a lot, so fullscreen was never a good idea; I sold the extra card instead.

The Division has a DX12 patch, it seems. I'd still try the profile thing. If the game isn't specifically coded to support multi-GPU, then the default game profile in CCC will only use one card.


----------



## Satanello

Thanks for your help. I usually play all games in fullscreen mode, so I'll try to create new profiles for both Diablo and The Division in fullscreen and windowed mode!

P.S.: The Division was patched to DX12 (now you can select the DX12 mode from the options menu) with good performance improvements (at least on my system)!

Sent from mTalk


----------



## Gdourado

Hello, how are you?

So, i have a question...
Is it worth trading a Sapphire 290 Vapor-X for a MSI 290X Gaming?
Basically it is a trade between the Sapphire superior cooling solution for the extra stream processors on the 290X.
I use the 290 at 1100 core and 5750 memory, so my vapor is not the greatest of overclockers... Still, worth the trade?

Cheers!


----------



## dagget3450

Quote:


> Originally Posted by *Gdourado*
> 
> Hello, how are you?
> 
> So, i have a question...
> Is it worth trading a Sapphire 290 Vapor-X for a MSI 290X Gaming?
> Basically it is a trade between the Sapphire superior cooling solution for the extra stream processors on the 290X.
> I use the 290 at 1100 core and 5750 memory, so my vapor is not the greatest of overclockers... Still, worth the trade?
> 
> Cheers!


Most likely the [email protected] core matches or surpasses a 290X at stock. The difference between the cards is very small. If you're benchmarking you may see a difference, but in gaming I would imagine they will be so close you couldn't tell them apart.


----------



## azcrazy

Quote:


> Originally Posted by *catbeef*
> 
> i read a couple google results where people were having trouble with hawaii cards and wattman.. but i've not been paying much attention and i couldn't find anything called wattman at all -- all i remember of wattman is it was an ati/amd made power manager that they sneak-peaked when the 480s were first coming about, or something like that. *shrug shrug*
> not sure whether it affects my fan control as i'm using a kraken g10 bracket, and the onboard fan controller (gpu) is powering that fan.


I used MSI Afterburner to control my card, but after installing 16.9.x I can't even control the fan speed.

Where did you find the older drivers?


----------



## Satanello

Quote:


> Originally Posted by *BadRobot*
> 
> Oh neat! I tried crossfire back in 2013 and my "game flow" makes me alt tab a lot so fullscreen is never a good idea. sold the extra card instead.
> 
> The Division has a DX12 patch it seems. I'd still try the profile thing. If the game isn't specifically coded to support multi-gpu, then the default game profile in CCC will only use 1 card.










Excellent, guys! I deleted the original profile for "The Division" and created a new one pointing to the exe file. I loaded the game but CF was not running, so I switched off the DX12 renderer, restarted the game, and BOOM, both video cards fully running! At the end of the internal benchmark run I reactivated the DX12 renderer and now my CF is running!
Now I'm working on the Diablo 3 profile.


----------



## scolon

My fan died today so I decided to try a few new things. I'm going to buy a new one anyway plus have a spare stock cooler lying around.

So first did this, opening the fins with a knife and ripping off the top layer with pliers:



Then I simply taped an old 80mm fan on top of it.
Idle temps went down to 39°C from around 45°C, noise is also a bit lower than the stock fan at 20%.

Then I tried FurMark for about 30 seconds to check how quickly temperatures would rise; it almost killed the card.
The problem is that the software tools read out the wrong temperature. The card got insanely hot; it started ticking from thermal expansion. It ended with a purple picture before I cut the power. The temperature was only in the high 60s according to GPU-Z, although the card was too hot to touch.

Tomorrow I'm going to try this with 2x 80mm fans or 92mm ones if I can find some. I don't expect this to be a good solution, this is merely me goofing around.









But how do I know the real temperatures? I'm thinking the sensors maybe couldn't keep up with the rapid temperature rise.
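One plausible explanation is sensor filtering/lag: if the reported value is a smoothed version of the true die temperature, a fast FurMark ramp can run far ahead of the readout. A toy first-order lag model illustrates the effect; the smoothing constant here is a made-up illustration, not a measured property of Hawaii's sensors:

```python
def lagged_reading(true_temps, alpha=0.05, start=40.0):
    """Exponentially smoothed sensor readout trailing the true temperature."""
    reading = start
    readings = []
    for t in true_temps:
        reading += alpha * (t - reading)   # first-order lag toward true value
        readings.append(reading)
    return readings

# True temp ramps 40 -> 110 degC in 1-degree steps; the readout lags behind.
ramp = [40 + i for i in range(71)]
print(round(lagged_reading(ramp)[-1]))  # reported well below the true 110
```

With these parameters the steady-state lag on a 1-degree-per-step ramp is about (1 - alpha)/alpha, roughly 19 degrees, which would match "high 60s reported, too hot to touch" during a very fast ramp.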


----------



## scolon

So I put a 92mm and an 80mm fan on. It's borderline usable; the two fans cannot keep the card below 94°C in FurMark (just like the stock fan).
It still stays below the critical 94°C in regular gaming, though.



I believe two decent 92mm fans would be good enough to keep this card well below 94°C in FurMark, especially since mine is overclocked to 1000MHz with +25mV and the 80mm fan is crap. The noise level is better than the stock blower fan. But you need a couple of big case fans; this card is literally like putting a hair dryer on its lowest setting inside your case. Right now the card is idling at 39°C with the two fans running at 800rpm.

So this ghetto solution is OK if you're waiting for a fan replacement like me. For other video cards it might even be a viable solution if you have fans lying around and don't want to spend money; the R9 290 is just an extremely hot card.


----------



## mus1mus

Furmark huh?

I didn't know such crap was still used to gauge stability. It's utter garbage! If you want your card to survive, ditch that app!


----------



## Ipak

For stability or artifact hunting I usually use the 3DMark stability test, or just the first graphics test looped. Also the Heaven benchmark in free camera, looking at a roof next to a couple of trees.

Did a few tweaks improving my physics score and beat my old records in Fire Strike and Time Spy.








http://www.3dmark.com/fs/11622597
http://www.3dmark.com/spy/1170285

It seems my Time Spy result is the #1 score for a 4770K and a 290.


----------



## scolon

Well, I installed 3dmark and it completely freezes my pc as soon as I hit the start test button.








I'm also not paying for all these extra features...

But intensive games like Witcher 3 are ok so far.


----------



## diggiddi

Try Heaven and/or Valley instead.


----------



## FuriousPop

Quote:


> Originally Posted by *Satanello*
> 
> 
> 
> 
> 
> 
> 
> 
> Excellent guys, I deleted the original profile for "The Division" and created a new one pointing to the exe file. I loaded the game but CF was not running, so I switched off the DX12 renderer, restarted the game and BOOM, both video cards fully running! At the end of the internal bench run I reactivated the DX12 renderer and now my CF is running!
> Now I'm working on a Diablo 3 profile.


Just a note: most ported games are not multi-GPU friendly. I have 3 GPUs and have struggled to get a lot of older games working. Poor programming, or to be politically correct, lazy/lacking.

Could have sworn The Division doesn't like more than 1 GPU; I struggled heaps with that myself... and D3 has no need for more than 1 GPU. What res are you running, by the way?


----------



## Satanello

Actually I'm using a 1920x1080 res, but in 2 days I should receive a new 4K monitor. Diablo doesn't require a lot of power from the graphics card, so I think I can use only one 290X (I have problems when forcing CF enabled), but The Division needs a lot of power, so if I can't use CF I think the game will be unplayable at 4K res.


----------



## crazyxelite

Hi guys, I'm planning on building a new PC (mini-ITX); my current one, which has the R9 290X Vapor-X, I will connect to the TV. I plan on making a Ryzen build, and I saw a good deal on an MSI R9 290X Gaming 8GB for £135; should I go for it? The game I mainly play is Gears 4, and I get 90fps locked stable in multiplayer at high settings, 1440p. But I think I would be happier with 120fps. Should I wait for Vega? Or just save my money?


----------



## Dekaohtoura

Hello everyone.

I'm looking for a Club3d 290 Royalking BIOS to flash my VTX3d 290 X-Ed V2 (I'm that desperate...this card is a total nightmare oc wise).

If anyone could help, I'd be grateful.

Thnx, Petros


----------



## bichael

Quote:


> Originally Posted by *crazyxelite*
> 
> Hi guys, I'm planning on building a new PC (mini-ITX); my current one, which has the R9 290X Vapor-X, I will connect to the TV. I plan on making a Ryzen build, and I saw a good deal on an MSI R9 290X Gaming 8GB for £135; should I go for it? The game I mainly play is Gears 4, and I get 90fps locked stable in multiplayer at high settings, 1440p. But I think I would be happier with 120fps. Should I wait for Vega? Or just save my money?


I think used 290s probably offer about the best bang for buck you can get at the moment. Just be careful that it's been relatively well looked after, and it should be good for the next few years, I think. The only proviso is obviously the size and high power draw. For a new mITX build, depending on case and PSU, something like a 480 might be a better option.

Not sure when mITX Ryzen boards will be available; isn't that going to be sometime later than the main release? If you're waiting for that, it may make sense to wait a bit longer for the GPU anyway; even for a second-hand one you might get a better deal.


----------



## Mech0z

Whenever I start my PC from a cold boot it goes to a black screen in Windows; then I reboot and it loads Windows fine. This is with an R9 290 Tri-X and the latest ReLive drivers on Windows 10.

Anyone have an idea what can cause this?


----------



## diggiddi

Have you changed/installed any new software?


----------



## HOMECINEMA-PC

I get the odd DP error ( blackscreen ) , I just turn monitor off then back on .........

Don't think that will help any


----------



## Mech0z

Quote:


> Originally Posted by *diggiddi*
> 
> Have you changed/installed any new software?


Tried reinstalling all drivers and going back to the latest Crimson driver, but no dice. Might try installing Windows on another SSD and see if the problem continues.
Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I get the odd DP error ( blackscreen ) , I just turn monitor off then back on .........
> 
> Don't think that will help any


Think I tried turning my 2 monitors off and on, but I can check again.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Mech0z*
> 
> Tried reinstalling all drivers and going back to the latest Crimson driver, but no dice. Might try installing Windows on another SSD and see if the problem continues.
> Think I tried turning my 2 monitors off and on, but I can check again.


Flick over to the other bios .........


----------



## Mech0z

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I get the odd DP error ( blackscreen ) , I just turn monitor off then back on .........
> 
> Don't think that will help any


Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Flick over to the other bios .........


Are there alternative BIOSes for the Tri-X? Just wondering if it can take a 390 BIOS or similar and maybe fix this at the same time.


----------



## HOMECINEMA-PC

Assuming it's got a BIOS switch on the card ........


----------



## Mech0z

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Assuming its got a bios switch on the card ........


It does, will try when I get home


----------



## Mech0z

Switching bios worked! yay ^^

edit: but this BIOS is unstable







Set it to default in Wattman, but it still freezes once in a while in Diablo 3.


----------



## scolon

I also had black screens and freezes all the time. I tried a lot of things; there are plenty of threads out there about the same problem. I think people have narrowed it down to VDDC voltage; basically the card is a huge mess with its insane power draw.

This BIOS totally fixed the black screens for me; the card hasn't had an error or black screen ever since:
http://www.overclock.net/t/1561904/mlu-bios-builds-for-290x

But the card seems to be prone to breaking; there are always plenty of R9 290s being sold on eBay as "broken". Might be a good time to switch to an RX 470; if you're lucky, switching will cost only around 50€.

My weird setup with the 92 and 80mm fans is doing fine so far. The computer is extremely silent in 2D; the card got to 90°C in Heaven, and 81°C with an additional 120mm fan loosely placed a few centimeters below the card. I think I'll buy an additional fan and keep the card until summer.

Which fan do you think will be better, 92 or 120mm? The 120mm is obviously quieter and stronger, but some of the air will blow past the graphics card because the fan is too wide.


----------



## Mech0z

Quote:


> Originally Posted by *scolon*
> 
> I also had black screens and freezes all the time. I tried a lot of things; there are plenty of threads out there about the same problem. I think people have narrowed it down to VDDC voltage; basically the card is a huge mess with its insane power draw.
> 
> This BIOS totally fixed the black screens for me; the card hasn't had an error or black screen ever since:
> http://www.overclock.net/t/1561904/mlu-bios-builds-for-290x
> 
> But the card seems to be prone to breaking; there are always plenty of R9 290s being sold on eBay as "broken". Might be a good time to switch to an RX 470; if you're lucky, switching will cost only around 50€.
> 
> My weird setup with the 92 and 80mm fans is doing fine so far. The computer is extremely silent in 2D; the card got to 90°C in Heaven, and 81°C with an additional 120mm fan loosely placed a few centimeters below the card. I think I'll buy an additional fan and keep the card until summer.
> 
> Which fan do you think will be better, 92 or 120mm? The 120mm is obviously quieter and stronger, but some of the air will blow past the graphics card because the fan is too wide.


Is that bios for the Tri-x ?


----------



## mus1mus

I have just awakened one of my Hawaiis and tried the latest ReLive driver. And I can OC the card to this level now!

Sheet!



Just kidding!

+4FPS over the last Catalyst Driver at 1200/[email protected]+100 MSI AB.


----------



## AndreVeck

@mus1mus

What card?
Have you got a custom bios?

How do you reach that clock on memory? Mine black-screens @1450. Are you touching any voltages?

Also, what is the base Vcore to which you add +100?

What are your temps? (core and vrms)

Sorry for all the questions!


----------



## mus1mus

Quote:


> Originally Posted by *AndreVeck*
> 
> @mus1mus
> 
> What card?
> Have you got a custom bios?
> 
> How do you reach that clock on memory? mine @1450 blackscreens. You are touching some voltages?
> 
> Also, what is the base Vcore to which you add +100?
> 
> What are your temps? (core and vrms)
> 
> Sorry for all the questions!


All there buddy.









1200/1550 (Heaven has this bug when you OC past your bios' base clock)

Modded BIOS

+100mV with MSI AB

Voltage in GPU-Z monitor - 1.18V on the Core after +100mV with AB at LOAD. Idle - 1.200

Watercooled in 30C room - 45C Core / 55C VRM1 / 35C VRM2

80+% ASIC card = Power hog which explains the high VRM temps.


----------



## ZealotKi11er

Quote:


> Originally Posted by *mus1mus*
> 
> All there buddy.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 1200/1550 (Heaven has this bug when you OC past your bios' base clock)
> 
> Modded BIOS
> 
> +100mV with MSI AB
> 
> Voltage in GPU-Z monitor - 1.18V on the Core after +100mV with AB at LOAD. Idle - 1.200
> 
> Watercooled in 30C room - 45C Core / 55C VRM1 / 35C VRM2
> 
> 80+% ASIC card = Power hog which explains the high VRM temps.


My card does 1200/1575 but does not stay stable for more than 30 mins. I have found that I need to drop to 1150 to maintain 100% stability in every game and at every temperature.


----------



## mus1mus

Quote:


> Originally Posted by *ZealotKi11er*
> 
> My card does 1200/1575 but does not stay stable for more than 30 mins. I have found that I need to drop to 1150 to maintain 100% stability in every game and at every temperature.


Voltage.









I am trying out BIOSes again, as I have been away from this card for a while while it gathered dust.

That run wasn't as stable as I wanted it to be either. Had to pump in a little more voltage to keep it artifact-free.

I did make a BIOS that is pre-OCed today: 1287 set for DPM7 to let it run at 1200/1500 without MSI AB and stay error-free to boot. Voltage drops to 1.21 at load. Terrible droop! But that's preferable to using a PT-based BIOS.
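That 1.21 V figure is consistent with simple loadline math. As a rough sketch (the 84 A load current is an assumed example, not a measured value; the 1.075 mΩ slope is the IR3567B default that VRMTool reports elsewhere in the thread):

```python
# Rough loadline (droop) estimate: V_load = V_set - I * R_LL.
# The current figure used below is an assumed example, not a measurement.

def load_voltage(v_set: float, current_a: float, slope_mohm: float) -> float:
    """Voltage seen at the GPU under load, given the loadline slope in milliohms."""
    return v_set - current_a * slope_mohm / 1000.0

# A 1.30 V DPM7 set point, ~84 A of core current, 1.075 mOhm default slope:
print(round(load_voltage(1.30, 84, 1.075), 3))  # -> 1.21
```

Lowering the slope register (as Wimpzilla does further down) reduces exactly this droop term, which is why it helps stability at a given set point.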


----------



## ZealotKi11er

Quote:


> Originally Posted by *mus1mus*
> 
> Voltage.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> I am trying out BIOSes again, as I have been away from this card for a while while it gathered dust.
> 
> That run wasn't as stable as I wanted it to be either. Had to pump in a little more voltage to keep it artifact-free.
> 
> I did make a BIOS that is pre-OCed today: 1287 set for DPM7 to let it run at 1200/1500 without MSI AB and stay error-free to boot. Voltage drops to 1.21 at load. Terrible droop! But that's preferable to using a PT-based BIOS.


Mine does not have any artifacts at all; it just cuts off. Adding more voltage would help, but temps will go higher, bringing it closer to instability again.


----------



## mus1mus

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Mine does not have any artifacts at all; it just cuts off. Adding more voltage would help, but temps will go higher, bringing it closer to instability again.


Something must be heating up? A repaste maybe.


----------



## catbeef

Quote:


> Originally Posted by *azcrazy*
> 
> Where did you found the older drivers?


The AMD website, though I had to Google for them. They seem impossible to find just by navigating the website... not great.


----------



## scolon

Quote:


> Originally Posted by *Mech0z*
> 
> Is that bios for the Tri-x ?


The BIOS is meant for the stock fan design, so I don't know how it will affect a Tri-X. Might be a bit louder.

Just choose one of the versions in the thread; there's not much you can destroy with a dual BIOS. I went with 1000/1250MHz and +25mV, the most conservative one...


----------



## yoteko

Hey.

I have an issue with an AMD 290 Tri-X and MSAA. When I turn on MSAA, the screen is filled with a weird pattern. FXAA and TXAA don't do that.

I wonder if that is a software/driver issue, or if my card is slowly dying? Or I have a bad PSU (VS650; both PCIe plugs are on 1 wire).

Screens:

CMAA: https://www.upload.ee/image/6679948/Korras_CMAA.png

MSAA: https://www.upload.ee/image/6679950/Ruudud_MSAAx2.png

Unigine: https://www.upload.ee/image/6680890/UniHe_MSAA.png

Thanks!


----------



## mus1mus

Check WattMan or minimize your OC. From experience, that looks like memory-related instability.

If you have tried overclocking with the latest drivers, remember to bring the clocks back to default before a restart. Wattman resumes with OC clocks but without the OC voltage, creating those artifacts or even black screens at boot.


----------



## ZealotKi11er

Quote:


> Originally Posted by *mus1mus*
> 
> Check WattMan or minimize your OC. From experience, that looks like memory-related instability.
> 
> If you have tried overclocking with the latest drivers, remember to bring the clocks back to default before a restart. Wattman resumes with OC clocks but without the OC voltage, creating those artifacts or even black screens at boot.


I know Wattman is an overclocking tool, but sometimes I wish AMD had an option not to install it.


----------



## mus1mus

Quote:


> Originally Posted by *ZealotKi11er*
> 
> I know Wattman is an overclocking tool, but sometimes I wish AMD had an option not to install it.


Very true. The latest driver doesn't allow me to run HIS iTurbo, for one.

I know it's a niche issue but I am pushing for some serious clocks and I can't!

But, looking back, you'd be amazed how far driver improvements have brought these cards.

At least 5 FPS from 15.10 to 17.2.1 in Heaven 4.0.


----------



## AndreVeck

Hi all, does someone have a 290 reference PCB near a ruler?
I have to build a custom cooler for the VRM but I don't have the dimensions.
Thanks!


----------



## Raephen

Quote:


> Originally Posted by *AndreVeck*
> 
> Hi all, does someone have a 290 reference PCB near a ruler?
> I have to build a custom cooler for the VRM but I don't have the dimensions.
> Thanks!


Which dimensions do you need? My card is under a block, so it's kinda unavailable. But I still have the reference cooler and did a quick measure of the spacing of the two screw holes nearest to what I think was the part that covered the main VRM section (right below the blower): it's roughly 88mm from centre to centre.


----------



## Matt-Matt

Quote:


> Originally Posted by *Wimpzilla*
> 
> Here are some thoughts after playing around with *VRMTool* and Wattman only. _Beware_: these are strong settings applied under water, clocked at 1280/1665MHz and beyond.
> 
> -The 150MHz RAM bug is related in my case to the refresh rate of my monitor too. I can OC my panel from 60Hz to 75Hz; if I set 75Hz I can't change the RAM clock with Wattman (the slider is stuck, and sometimes the clock gets stuck after a driver crash). If I revert to 60Hz I can magically change the RAM clock again. So I would add this bug to the problems I had before with Trixx and 3rd-party tools.
> 
> -I tried to use the command to shut down LLC without success; I couldn't improve my OC and get a stable card with a constant 1.3/1.32V on the GPU. The card heats up too much: the VRM goes up to a constant 86° and the GPU up to 72°. For me these temperatures are too high when you push the card this far; the thermal properties begin to deteriorate. Changing the VRM duty cycle and current scaling does not impact stability in this case; the card's power draw is humongous and not sustainable imo.
> 
> -My aim was to get the max from the card while holding good thermal conditions on the GPU/VRM: less than 70° for the GPU and less than 80° for the VRM. With LLC on and a +306mV offset the card is not stable beyond 1280MHz. I could push further, but even at +350mV the core refuses to run decently at 1300MHz+. I can bench further, but it is too much and not viable as stable winter gaming clocks. I think I have hit the max stable clock of the silicon anyway.
> 
> -So I just set Wattman to 1280/1664MHz + 50% power; I've highlighted the changes in VRMTool. What I think these settings do is allow the card to draw more power, being less strict on the LLC slope and allowing the GPU to freely dig into it. The results are not bad: the card just goes on a rampage. I've never had this before; I ran Valley too and got the highest score I've ever had, 3329pts. Confirmed in BF1 too.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> _(Do not edit these values without knowing your card and its cooling capability.)_
> 
> _Detected IR3567B VRM controller chip
> Loop 1 voltage VID: 0.99375V (reg 0x93=0x59 RO | 1.55V-X*0.00625V)
> Loop 1 voltage offset: *+0.30625V* (reg 0x8D=0x31 RW | X*0.00625V, 0x00=default) *+0mv Default*
> Loop 1 output voltage: 1.27344V (reg 0x9A=0xa3 RO | X*(1/128)V)
> Loop 1 output current scaling: ? (reg 0x4D=0x*40* RW | 0x60=default, 0x20=lower reported current) *60 Default*
> Loop 1 phase 1+2 load distrib.: ? (reg 0x1E=0x00 RW | 0xdd=default=more load on PCIe, 0x00=equal)
> Loop 1 phase 3+4 load distrib.: ? (reg 0x1F=0x00 RW | 0xd0=default=more load on PCIe, 0x00=equal)
> Loop 1 loadline slope: *1.025m*Ohm (reg 0x24=0x29 RW | X*0.025mOhm, 0x1C=default) *1.075mOhm Default*
> Loop 1 loadline enable: 1 (reg 0x38=0x81 RW | bit7, 0x81=on/default, 0x01=off)
> Loop 1 PWM frequency: *600.1kHz* (reg 0x22=0x50 RW | (1/(X*20.83ns))kHz, 0x9E=default) *500kHz Default*
> Loop 1 temperatur: 24C (reg 0x9E=0x18 RO | X*1C)_
> 
> http://www.hostingpics.net/viewer.php?id=987381Wow.jpg
> 
> 15409pts Graphic score, my previous highest was 15k
> 
> 
> 
> 
> 
> 
> 
> 
> Hawaii rocks hard with over 400 Watt TDP allowed and decent cooling!


I should get onto this. I was running +186mV at 1240/1450MHz (I can't run more than that on the RAM no matter what!).

Trixx is glitchy for me: every time my PC sleeps it locks up my system, or I have to restart Trixx while looking through a mass of lines.

I'm currently at 1179/1450MHz @ +100mV with Afterburner. Want to push more... but


----------



## Newbie2009

I gave up on adding volts to my cards when I saw how much power consumption jumped from adding just 50mV.

They pull an average of 550W when gaming, 630W when highly stressed.

I added 50mV and it jumped to about 750W. For a few MHz it's just not worth the hassle.


----------



## AndreVeck

Quote:


> Originally Posted by *Raephen*
> 
> Which dimensions do you need? My card is blocked, so kinda unavailable. But I still have the reference cooler and did a quick measure up of the spacing of the two screw holes nearest to what I think was the part that covered the main VRM bit (right below the blower) and it's roughly 88mm from centre to centre.


Thanks, really helpful!


----------



## Raephen

Quote:


> Originally Posted by *AndreVeck*
> 
> Thanks, really helpful!


Prego.

I don't know what exactly you are planning, but I'm guessing it's something like this: http://gelidsolutions.com/thermal-solutions/accessories-enhancement-kit-of-rev-2-icy-vision-for-amd-r9-290-290x/ ?


----------



## ZealotKi11er

Quote:


> Originally Posted by *Newbie2009*
> 
> I gave up on adding volts to my cards when I saw how much power consumption jumped from adding just 50mV.
> 
> They pull an average of 550W when gaming, 630W when highly stressed.
> 
> I added 50mV and it jumped to about 750W. For a few MHz it's just not worth the hassle.


Yeah same here. I run +50mV for 1150/1500. After that is pointless.


----------



## Roy360

Is there a BIOS I could flash to disable one of my GPUs?

I'm trying to troubleshoot an issue with my PC, and it would be helpful to do so without removing the cards.


----------



## mus1mus

Just removing the Power Cable is easy enough.


----------



## AndreVeck

Quote:


> Originally Posted by *Raephen*
> 
> Prego.
> 
> I don't know what exactly you are planning, but I'm guessing it's something like this: http://gelidsolutions.com/thermal-solutions/accessories-enhancement-kit-of-rev-2-icy-vision-for-amd-r9-290-290x/ ?





This is it!
Probably ugly looking but temps are good.
VRMs are below 45 on stock with heaven. Core 50 degrees.


----------



## Roy360

With the new Catalyst, is it possible to designate which GPU to use?

I.e., currently I have Crossfire disabled and only GPU1 is being used; I now want to try using GPU2 only.


----------



## Newbie2009

Quote:


> Originally Posted by *Roy360*
> 
> With the new Catalyst, is it possible to designate which GPU to use?
> 
> I.e., currently I have Crossfire disabled and only GPU1 is being used; I now want to try using GPU2 only.


Don't think so. You could disconnect power from the GPU you don't want to use and see if that works.


----------



## Raephen

Quote:


> Originally Posted by *AndreVeck*
> 
> 
> 
> 
> This is it!
> Probably ugly looking but temps are good.
> VRMs are below 45 on stock with heaven. Core 50 degrees.


Very nicely done!

And it's not ugly: it has the uber functional ghetto look!


----------



## Roy360

Quote:


> Originally Posted by *Newbie2009*
> 
> Don't think so, you could disconnect power from the gpu you don't want to use and see if that works.


I'll give it a shot. Shame they got rid of that feature though. The old CCC would let you customize which cards to use.

I'm trying to diagnose a fault in my computer. So far, disabling crossfire stops my machine from crashing randomly. Unfortunately that means my second R9 290 might be faulty.


----------



## diggiddi

You might be able to disable it in the BIOS?


----------



## bichael

Finally got my new PSU, so I started to play around with my 290 (reference PowerColor model with an EK block and Fujipoly on the VRM). No unlock to a 290X, unfortunately. So far just using Wattman and stock voltage. A core of 1100 seems stable after limited testing; 1120 got through a Fire Strike run but with some artifacts.

I then tried with memory at 1400 and it completed a Fire Strike run fine. So with 1100/1400 I got a score of 11097 total, 13216 graphics, which I was pretty happy with.

However, I then restarted the computer and got a black screen halfway through boot. To reset the OC I enabled the iGPU in the BIOS and booted up with that, then switched back to the 290, which black-screened after a few seconds but then automatically reset to default clocks. I then tried 1100/1300 and it's restarting fine, so I guess there was some instability in the lower power states with the RAM overclock. My question is: how do you test for that? Also, I just applied a 13% overclock on the core, which I think boosts all the power states; is that maybe making it worse, and should I only adjust the highest?


----------



## mus1mus

Quote:


> Originally Posted by *bichael*
> 
> Finally got my new PSU, so I started to play around with my 290 (reference PowerColor model with an EK block and Fujipoly on the VRM). No unlock to a 290X, unfortunately. So far just using Wattman and stock voltage. A core of 1100 seems stable after limited testing; 1120 got through a Fire Strike run but with some artifacts.
> 
> I then tried with memory at 1400 and it completed a Fire Strike run fine. So with 1100/1400 I got a score of 11097 total, 13216 graphics, which I was pretty happy with.
> 
> However, I then restarted the computer and got a black screen halfway through boot. To reset the OC I enabled the iGPU in the BIOS and booted up with that, then switched back to the 290, which black-screened after a few seconds but then automatically reset to default clocks. I then tried 1100/1300 and it's restarting fine, so I guess there was some instability in the lower power states with the RAM overclock. My question is: how do you test for that? Also, I just applied a 13% overclock on the core, which I think boosts all the power states; is that maybe making it worse, and should I only adjust the highest?


It's the driver that resets the clocks at boot because the voltages didn't stick (they revert to default), causing instability, especially with high memory clocks (anything over 1375 in my case).

If you want, you can look at editing your BIOS and manually putting a voltage value into the card's DPM7 state, or ask someone there to help you add an offset voltage that will eliminate the issue.

http://www.overclock.net/t/1561372/hawaii-bios-editing-290-290x-295x2-390-390x/0_50

Or simply revert back to stock clocks and voltages before every reboot.


----------



## bichael

Thanks for the reply, and yeah, you're right, I think it's a driver thing. I'd ruled that out as I haven't been changing any voltages, only the clocks (as you mention, I thought the bug was mainly voltages not sticking), but it looks like something can still go wrong at boot even when only upping the memory clock. I reapplied the OC, used it for a while, and watched in GPU-Z as it hit different power states; it was completely stable, but as soon as I restart, same black screen problem.

Your last line led me to the best solution (or the easiest at least), which is to just use Wattman profiles. E.g. I have the global profile set for 1100/1250 but 3DMark set for 1100/1425. It seems to work perfectly, stepping up the clocks when something opens and dropping back down when it closes, so I'll just do the same for any games where I feel the need. That should keep me going until I have a bit more time to look into pushing further using VRMTool or maybe a 390 BIOS.


----------



## Roy360

Quote:


> Originally Posted by *diggiddi*
> 
> Might be able to disable in Bios?


It was as simple as disabling crossfire and moving all the monitors to the bottom card.

The monitors can't display the BIOS, but that's fine since I'm only using this for testing purposes.

I'm confused now though.

My system works fine when either card is set to primary. However when crossfire is enabled the system randomly crashes while gaming.

PSU is a single rail Corsair AX1200i that's less than a year old, and both gpus have full cover GPU blocks


----------



## Matt-Matt

Quote:


> Originally Posted by *Roy360*
> 
> Is there a BIOS I could flash to disable one of my GPUs?
> 
> I'm trying to troubleshoot an issue with my PC, and it would be helpful to do so without removing the cards.


You really wouldn't want to do that.

Turn Crossfire off.
If that's not enough, unplug the external PCI-E power and just remove the card.
Or if your board has a PCI-E shutoff, you can use that too.


----------



## Gdourado

Hello,
How are you?
Wondering if it's worth it to pick up an MSI 290X Lightning to replace a Sapphire 290 Vapor-X?
The 290 is not a great clocker; it runs 1100 core.
The 290X has more stream processors and runs 1080 stock.
Is it worth the trouble of selling the 290 and getting the 290X?
If I can sell the 290 at a good price, I am looking at maybe 30 extra for the 290X.

Cheers!


----------



## AndreVeck

Quote:


> Originally Posted by *Raephen*
> 
> Very nicely done!
> 
> And it's not ugly: it has the uber functional ghetto look!


Thanks for the appreciation! I like it too. Next step is fixing the loop (I want to put the inlet on the bottom of the tank) and painting the case matte black.


----------



## CALiteral

Quote:


> Originally Posted by *bichael*
> 
> Thanks for the reply, and yeah, you're right, I think it's a driver thing. I'd ruled that out as I haven't been changing any voltages, only the clocks (as you mention, I thought the bug was mainly voltages not sticking), but it looks like something can still go wrong at boot even when only upping the memory clock. I reapplied the OC, used it for a while, and watched in GPU-Z as it hit different power states; it was completely stable, but as soon as I restart, same black screen problem.
> 
> Your last line led me to the best solution (or the easiest at least), which is to just use Wattman profiles. E.g. I have the global profile set for 1100/1250 but 3DMark set for 1100/1425. It seems to work perfectly, stepping up the clocks when something opens and dropping back down when it closes, so I'll just do the same for any games where I feel the need. That should keep me going until I have a bit more time to look into pushing further using VRMTool or maybe a 390 BIOS.


This is how I was able to get my ASUS 290x memory clock from 1350 to 1500 without black screens. It involves a bios edit using Hawaii Bios Editor, but it's really easy.



Spoiler: Warning: Spoiler!







Changing the frequencies in the red box to 150 like I did above will effectively set the minimum core voltage for a memory frequency of 1500 to that of DPM 3 rather than DPM 1. Since it is the memory controller on the GPU die itself that can't handle the increased frequency, the slightly higher minimum voltage should make it stable in low load situations.

Let me know if you need any help. By doing this and the 390 memory controller mod, I got nearly identical performance to a 390x at the same clock speeds.
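The logic described above can be modeled as a simple threshold lookup: each DPM state has a maximum memory frequency, and the card runs at the lowest state that covers the current memory clock. This is only an illustrative sketch; the table layout and numbers are hypothetical, not the actual Hawaii BIOS format.

```python
# Hypothetical model of the edit: lower the memory-frequency thresholds of
# DPM 1 and 2 so a 1500 MHz memory clock forces the DPM 3 minimum voltage.

def min_dpm_state(mem_mhz, thresholds):
    """Return the lowest DPM state whose frequency limit covers mem_mhz.

    thresholds: list of (max_mem_mhz, dpm_state) in ascending state order.
    """
    for max_mhz, state in thresholds:
        if mem_mhz <= max_mhz:
            return state
    return thresholds[-1][1]  # clamp to the top state

# Stock-like table: 1500 MHz memory is allowed at DPM 1's low core voltage,
# which is where the idle/low-load black screens come from.
stock = [(150, 0), (1500, 1), (1500, 2), (1500, 3)]
# After changing the boxed frequencies to 150: only DPM 3 covers 1500 MHz.
modded = [(150, 0), (150, 1), (150, 2), (1500, 3)]

print(min_dpm_state(1500, stock))   # -> 1 (too little voltage at low load)
print(min_dpm_state(1500, modded))  # -> 3 (higher minimum voltage, stable)
```

The real edit is done in Hawaii Bios Editor as the post says; this just shows why raising the minimum state stabilizes the memory controller at low load.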


----------



## Vellinious

Quote:


> Originally Posted by *AndreVeck*
> 
> 
> 
> 
> This is it!
> Probably ugly looking but temps are good.
> VRMs are below 45 on stock with heaven. Core 50 degrees.


I think it looks pretty cool. Nice job!
Quote:


> Originally Posted by *CALiteral*
> 
> This is how I was able to get my ASUS 290x memory clock from 1350 to 1500 without black screens. It involves a bios edit using Hawaii Bios Editor, but it's really easy.
> 
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> 
> Changing the frequencies in the red box to 150 like I did above will effectively set the minimum core voltage for a memory frequency of 1500 to that of DPM 3 rather than DPM 1. Since it is the memory controller on the GPU die itself that can't handle the increased frequency, the slightly higher minimum voltage should make it stable in low load situations.
> 
> Let me know if you need any help. By doing this and the 390 memory controller mod, I got nearly identical performance to a 390x at the same clock speeds.


Are people having trouble getting over 1500 on the memory? Was the 8GB 290X different in some way? I ran mine at 1790-1840 for benchmarking, and 1650 for daily clocks.


----------



## CALiteral

Quote:


> Originally Posted by *Vellinious*
> 
> Are people having trouble getting over 1500 on the memory? Was the 8GB 290X different in some way? I ran mine at 1790-1840 for benchmarking, and 1650 for daily clocks.


That's a good question. I guess I was just assuming they were all similar. The only reason I even tried this method was because I looked at a 390X BIOS in Hawaii Bios Reader, noticed the difference in memory frequency settings, and replicated it on my own card.

I don't have the card anymore, but I know that before modifying the BIOS I wasn't able to run 1400MHz on the memory without black screens. After, I think I ran 1525 day-to-day. It went a bit higher, but I saw reduced performance.

My card really wasn't a great candidate for overclocking at all, though, since even with +50mV, anything over 1050MHz core clock caused driver crashes.


----------



## Vellinious

Quote:


> Originally Posted by *CALiteral*
> 
> That's a good question. I guess I was just assuming they were all similar. The only reason I even tried this method was because I looked at a 390x bios in Hawaii Bios Reader and noticed the difference in memory frequency settings and replicated it on my own card.
> 
> I don't have the card anymore but I know before modifying the bios, I wasn't able to run 1400Mhz on the memory without black screens. After, I think I ran 1525 day to day. It went a bit higher but I saw reduced performance.
> 
> My card really wasn't a great candidate for overclocking at all though since even with +50mV, anything over 1050Mhz core clock caused driver crashes.


The card I had was outstanding. I had it running 1200 / 1750 with +25mv for daily, and benchmarked at 1300-1330 / 1800+ with +252mv. Card was killer.....should have never sold it.


----------



## mus1mus

Yeah.


----------



## bichael

Quote:


> Originally Posted by *CALiteral*
> 
> This is how I was able to get my ASUS 290x memory clock from 1350 to 1500 without black screens. It involves a bios edit using Hawaii Bios Editor, but it's really easy.
> 
> 
> Spoiler: Warning: Spoiler!
> 
> 
> 
> 
> 
> 
> Changing the frequencies in the red box to 150 like I did above will effectively set the minimum core voltage for a memory frequency of 1500 to that of DPM 3 rather than DPM 1. Since it is the memory controller on the GPU die itself that can't handle the increased frequency, the slightly higher minimum voltage should make it stable in low load situations.
> Let me know if you need any help. By doing this and the 390 memory controller mod, I got nearly identical performance to a 390x at the same clock speeds.


Thanks. Yeah, I came across something similar, which was to set DPM1 and 2 at stock ram speed and only use higher speeds in DPM3 and up. I also noticed some oddity in the default voltages, shown below; I'm still not really sure why they are like that, or whether it was part of the problem. I found another post mentioning it in the bios thread, but it still wasn't clear why, though there was a suggestion to manually set voltages for all states, which seemed to make sense.

Based on testing with the VRM tool to boost voltage, 1180 was stable at +100mV, but I settled on +75mV and 1165. Finding stability on the ram clock is a complete pain, as it mostly meant having to uninstall the driver after each failed attempt, so I settled on 1425, though I could probably have gone slightly higher. I then checked stability at state 3 before just estimating values in between. The updated bios shown below seems completely stable so far. I could maybe optimise some of the states a bit more, but I think I'll just start enjoying gaming again for a while now. Maybe later on I'll feel the need to push it some more, as I'm sure 1200 will be possible with +150mV or so.

I must say, editing and flashing the bios with Hawaii Bios Reader and atiflash was easier than I expected. I wasted much more time trying to make a bootable USB flash disk with Windows 10 (I used Rufus in the end), and then it also took me a while to remember how to get legacy boot options back in the UEFI bios!

Stock:
DPM0: GPUClock = 300 MHz, VID = 0.99300 V, Mem = 150
DPM1: GPUClock = 483 MHz, VID = 1.18100 V, Mem = 1250
DPM2: GPUClock = 681 MHz, VID = 1.16800 V, Mem = 1250
DPM3: GPUClock = 865 MHz, VID = 1.16200 V, Mem = 1250
DPM4: GPUClock = 906 MHz, VID = 1.19300 V, Mem = 1250
DPM5: GPUClock = 940 MHz, VID = 1.22500 V, Mem = 1250
DPM6: GPUClock = 966 MHz, VID = 1.25000 V, Mem = 1250
DPM7: GPUClock = 975 MHz, VID = 1.25000 V, Mem = 1250

Updated:
DPM0: GPUClock = 300 MHz, VID = 0.99300 V, Mem = 150
DPM1: GPUClock = 483 MHz, VID = 1.16200 V, Mem = 1250
DPM2: GPUClock = 770 MHz, VID = 1.16800 V, Mem = 1250
DPM3: GPUClock = 966 MHz, VID = 1.17500 V, Mem = 1425
DPM4: GPUClock = 1020 MHz, VID = 1.20000 V, Mem = 1425
DPM5: GPUClock = 1070 MHz, VID = 1.23700 V, Mem = 1425
DPM6: GPUClock = 1120 MHz, VID = 1.28700 V, Mem = 1425
DPM7: GPUClock = 1165 MHz, VID = 1.32500 V, Mem = 1425
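The approach described above — stability-test a low state and the top state, then estimate the voltages in between — amounts to simple linear interpolation. A minimal sketch using the endpoints from the updated table (the per-state voltages actually settled on above differ slightly, since real VID/frequency curves aren't perfectly linear):

```python
# Sketch: estimate intermediate DPM voltages by linear interpolation
# between two stability-tested endpoints. Clock/VID numbers come from
# the "Updated" table above; this is an estimation aid, not what the
# card's fuses actually encode.

def interpolate_vid(clock, lo_clock, lo_vid, hi_clock, hi_vid):
    """Linearly interpolate a VID for `clock` between two tested points."""
    frac = (clock - lo_clock) / (hi_clock - lo_clock)
    return lo_vid + frac * (hi_vid - lo_vid)

# Tested endpoints: DPM3 = 966 MHz @ 1.175 V, DPM7 = 1165 MHz @ 1.325 V
for dpm, clock in [(4, 1020), (5, 1070), (6, 1120)]:
    vid = interpolate_vid(clock, 966, 1.175, 1165, 1.325)
    print(f"DPM{dpm}: {clock} MHz -> ~{vid:.3f} V")
```

The estimates land within a couple of hundredths of a volt of the values in the table above, which is about as close as a straight line can get to a real VID curve.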


----------



## mus1mus

How did you manage to edit the memory DPMs? Is that allowed by the app?


----------



## bichael

Yeah, if you see below, I think the fields highlighted green are the ones that can be edited.

I've noticed something that's bugging me though. When I was testing (using Wattman and the VRM tool) I got a Firestrike graphics score of 13900 at 1165/1425; with the modded bios I'm getting around 13500 at the same clocks. From what I've tried so far I can't figure out the difference: GPU-Z shows the clock is being maintained, HWiNFO doesn't show any memory errors, adding more voltage on top with the VRM tool doesn't change the score, and changing the power limit doesn't change it either. Kind of wish I hadn't checked again!

I just noticed when uploading the screenshot that I have the VDDCI state set at 1250 - does anyone know if I should have used 1425 there instead? Could that be causing the lower score, maybe? The example I was looking at looked a bit different, as I guess it was an older version.


----------



## chris89

Quote:


> Originally Posted by *bichael*


I can teach you how to make gains in the bios. Your post made me chuckle a little. It's okay - I was right where you are now not too long ago, completely stumped.

If you want, send me your stock GPU-Z bios dump .rom. I can make the changes and explain what everything does and how to make gains, primarily focused on thermals - only once thermals are under control can any gains be made. Check this out, tested on a very old 2007 Xeon quad, DDR2 system with PCIe 2.0.


----------



## AndreVeck

Quote:


> Originally Posted by *bichael*
> 
> Just noticed now when uploading screenshot that I have VDDCI state set as 1250, does anyone know if I should have used 1425 there instead? Could that be causing the lower score maybe? The example I was looking at looked a bit different as guess it was an older version.


Absolutely. That value should match the maximum memory frequency. Also try the VDDCI voltage at 1050.


----------



## bichael

Yeah I'm definitely new to this. It's impressive how much info and knowledge is out there on tweaking Hawaii though it can be a bit overwhelming as well. The bios editing thread is great for example but pretty technical. Thinking I might just pretend I never saw the earlier higher score and play some games instead for now!

I don't think thermals are an issue, max temps I've had are 60 on the core and 71 on the vrm when testing with +100mV which I assume should be fine.

I've now upped the value in the VDDCI to match the max memory clock so thanks for confirming. Doesn't seem to have changed anything noticeable but nice knowing it's where it should be. Will have a quick go with a higher VDDCI voltage as well.


----------



## diggiddi

Quote:


> Originally Posted by *Gdourado*
> 
> Hello,
> How are you?
> Wondering if it worth it to pick up a MSI 290X Lightning to replace a Sapphire 290 VaporX?
> The 290 is not a great clocker. It runs 1100 core.
> The 290X has more stream processors and stock runs 1080.
> Is it worth it to go trough the trouble of selling the 290 and get the 290X?
> If I can sell the 290 at a good price, I am looking at maybe 30 extra for the 290X.
> 
> Cheers!


I dunno if it's worth it, but the Lightnings should push 1200+ MHz - you definitely need a good PSU to feed them though.
I'll be selling my twin Lightnings if you're interested; highest clocks were 1230/1620 IIRC on one of them.


----------



## bichael

Aha. Think I figured out why I was seeing different scores at the same clocks.

1165/1425 in bios gives around 13500 in firestrike for graphics
1165/1250 in bios but with memory then OC to 1425 with wattman gives around 13870.

Guessing completely, but I suppose the Wattman OC could be keeping the original memory timings, whereas setting the clock in the bios uses looser timings based on the higher frequency. Not sure if that would be expected behaviour anyway?


----------



## gooshpitz

Hi peeps,
Can anyone tell me why the vram is still clocked at 1300 after a bios edit to 1250?


----------



## bichael

Does first page of gpuz show both mem clocks as 1300 or does the default show as 1250?
If default is 1250 but it's running higher then maybe there's some software OC happening somewhere that didn't get fully reset, could try a driver reinstall maybe.

Also with 0 load shouldn't it have been at 150 memory anyway for most of that plot? Or is that a multi-monitor thing?


----------



## gooshpitz

Changed back to 1300 but I'm pretty sure it was 1300 for default on first page as well. It is dual monitor. Will try again for 1200 and check.


----------



## gooshpitz

Nvm, changed it like this.

Switched the second monitor to the integrated gpu, so now the memory for the dedicated one is mostly at 1000MHz when watching vids, browsing etc.


----------



## AliNT77

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/40140#post_24575331

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/40170#post_24575695

For anyone interested in undervolting and memory tweaking.


----------



## PontiacGTX

Have any of your cards ever stopped displaying video, where you then noticed the card had some liquid on the front plate?


----------



## mus1mus

I doubt that's harmful liquid.

Pads seem to excrete some form of grease over time from what I have observed.


----------



## bichael

Couldn't resist pushing my card for a few Firestrike runs: 14427 graphics score, not too shabby. 1225/1460 with +150mV.

Did result in some very noticeable buzzing through the speakers. Have got a power amp plugged direct to the motherboard at the moment (my integrated amp with dac is in for repair). Have been amazed how good the integrated audio sounds through a good amp (when just playing music) but equally amazed how much feedback there is when the gpu is under load!


----------



## jorgp2

What manufacturer is it? Did you use Trixx or Afterburner?

Sent from my HTC 10 using Tapatalk


----------



## ryboto

Should I trust TRIXX for temp monitoring? I've got a 290x Tri-X OC 4gb, 16.11.5 drivers, 1140/1500 of +80mV set in TRIXX... Through the TRIXX hardware monitor, vrm1 reached 108C after a short round of Battlefront...

Bad news, right?


----------



## Broken-Heart

Hello Again,

I'm already a member of this club and here's a link to my previous post:

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/29460#post_22834563

I am now the proud owner of a second R9 290X from ASUS. I have been using it in Crossfire with my Sapphire R9 290X since May 2016 So I would like to add my second card to the list and I have all the necessary requirements, here they are:

1) The GPU *Manufacturer* is ASUS. The *Model* is R9 290X DirectCU II OC. The *Cooling Solution* is DirectCU II.

2)GPU-Z validation link : https://www.techpowerup.com/gpuz/details/59g3z

3) Card and Rig Pictures:


----------



## Broken-Heart

Not necessarily - the VRMs are rated to about 125C. I know this because my VRM temperatures were above 100C as well, until I forced the fans to 100% during gaming. I recommend you do the same, and use GPU-Z to verify those temperatures.


----------



## mus1mus

100C is already riding the limits IMO.

A better fan curve and less additional voltage would be safer.


----------



## Newbie2009

Quote:


> Originally Posted by *ryboto*
> 
> Should I trust TRIXX for temp monitoring? I've got a 290x Tri-X OC 4gb, 16.11.5 drivers, 1140/1500 of +80mV set in TRIXX... Through the TRIXX hardware monitor, vrm1 reached 108C after a short round of Battlefront...
> 
> Bad news, right?


Yeah, bad news.


----------



## ryboto

I wonder if the VRM is just seeing poor airflow...fans go to 100% rather quickly under load, and the temperature still climbs.


----------



## Streetdragon

Quote:


> Originally Posted by *ryboto*
> 
> I wonder if the VRM is just seeing poor airflow...fans go to 100% rather quickly under load, and the temperature still climbs.


maybe bad airflow on the gpu?


----------



## ryboto

Quote:


> Originally Posted by *Streetdragon*
> 
> maybe bad airflow on the gpu?


Well, it's a Tri-X, and I have an SG-08, the 3 fans are maybe ~0.5-0.25'' from the unfiltered vent holes on the side of the case...so, plenty of access to cool air. Having the cover off only makes a 1-2C difference. Fans are clean, vents are free of dust. The one thing I didn't check was any dust that might be on the surface of the board, impeding airflow around the VRM heatsinks...maybe?


----------



## AndreVeck

Quote:


> Originally Posted by *ryboto*
> 
> Well, it's a Tri-X, and I have an SG-08, the 3 fans are maybe ~0.5-0.25'' from the unfiltered vent holes on the side of the case...so, plenty of access to cool air. Having the cover off only makes a 1-2C difference. Fans are clean, vents are free of dust. The one thing I didn't check was any dust that might be on the surface of the board, impeding airflow around the VRM heatsinks...maybe?


I don't know if you're aware of how poorly the VRMs are cooled on this card. I have a Tri-X myself, and I opened the card to see what was cooling the VRMs:
it turned out to be an aluminium heatsink with spikes - no fins or heatpipes.





I'd like to add that with my new custom made heatsink, VRMs stay at 50 degrees.


----------



## ryboto

Quote:


> Originally Posted by *AndreVeck*
> 
> I don't know if you are aware of how poorly VRMs are cooled in this card. I have a tri-x myself and I have opened the card to see what was cooling VRMs.
> Found out that there is a alluminium heatsink with spikes. no fins or heatpipes.


Awesome! Is the Tri-X a reference design? Can I just buy a different heatsink? Was hoping to hold off until Vega before upgrading anything, honestly.


----------



## AndreVeck

Quote:


> Originally Posted by *ryboto*
> 
> Awesome! Is the Tri-X a reference design? Can I just buy a different heatsink? Was hoping to hold off until Vega before upgrading anything, honestly.


Yes, it's a reference design - you can tell because the card has the AMD logo on the PCI-e connector. The Arctic Accelero Xtreme III is compatible with our card, but I don't know how well it cools the VRMs.


----------



## ryboto

What's the most economical option for aftermarket cooling with the 290x?


----------



## AndreVeck

Quote:


> Originally Posted by *ryboto*
> 
> What's the most economical option for aftermarket cooling with the 290x?


It's probably better to stick with the reference cooler.


----------



## AliNT77

Quote:


> Originally Posted by *ryboto*
> 
> What's the most economical option for aftermarket cooling with the 290x?


Have you tried undervolting?


----------



## ryboto

Quote:


> Originally Posted by *AliNT77*
> 
> Have you tried undervolting?


I think 1125 actually didn't require much additional voltage... so yes, I'll try that. I think I'll also try replacing the thermal pads. I've honestly never owned a GPU this long - 3 years, 2 months...


----------



## AliNT77

Quote:


> Originally Posted by *ryboto*
> 
> I think 1125 actually didn't require much additional voltage...so, yes, I'll try that...I think I'll also try replacing the thermal pads. I've honestly never owned a GPU this long.. 3 years 2 months...


I bought my reference 290 roughly a year ago (second hand) and it's been under an AIO since the day after I received it. If you can find one for cheap, go for it.
I picked up a Corsair H70 for ~$45, and with +100mV running at 1125/1650 (with 1375 timings) it never exceeds 65C.

BTW, if you're interested in undervolting, be sure to take a look at my results and guide here:
http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/40140#post_24575331


----------



## ryboto

Quote:


> Originally Posted by *AliNT77*
> 
> i bought my 290 reference roughly a year ago (second hand) and its been under AIO since the day after i recieved it.. if you can find one for cheap , go for it.
> I picked up a corsair H70 for ~45$ and with +100mv running at 1125-1650 (with 1375 timings) it never exceeds 65c
> 
> BTW for if you are interested in undervolting be sure to take a look at my results and guide here:
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/40140#post_24575331


The only issue is I don't know if I could fit that cooler in my case...


----------



## chris89

Quote:


> Originally Posted by *bichael*
> 
> Yeah I'm definitely new to this. It's impressive how much info and knowledge is out there on tweaking Hawaii though it can be a bit overwhelming as well. The bios editing thread is great for example but pretty technical. Thinking I might just pretend I never saw the earlier higher score and play some games instead for now!
> 
> I don't think thermals are an issue, max temps I've had are 60 on the core and 71 on the vrm when testing with +100mV which I assume should be fine.
> 
> I've now upped the value in the VDDCI to match the max memory clock so thanks for confirming. Doesn't seem to have changed anything noticeable but nice knowing it's where it should be. Will have a quick go with a higher VDDCI voltage as well.


Yeah, if you run the VDDCI at 1250mV to match the 1250MHz memory clock, that's 25% more voltage and a lot more power and heat. Stock is about 130 memory watts under load; at 1250mV it's around 163 watts, and the VRM goes from, say, 80C to 100C, which calls for more fan. There's no need to run beyond a 1250MHz memory clock or beyond 1000mV VDDCI - even 888mV VDDCI is highly appreciated by the VRM at 1250MHz memory.

Dropping from 1000mV to 888mV is a solid reduction in power and heat: a cooler, faster, quieter card, as long as the BIOS is dialed in right. Improper settings can cause unusual behavior or excess heat, among many other things.
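For a rough sense of scale: a drop from 1000 mV to 888 mV is about an 11% cut in voltage. If power scaled linearly with voltage that would mean ~11% less power, while the usual V² approximation for dynamic power puts the saving closer to 21%. A minimal sketch comparing the two estimates (the 130 W baseline is the figure quoted above, not a measurement):

```python
# Rough estimate of power savings from dropping VDDCI at a fixed
# memory clock. The 130 W baseline is the estimate quoted above,
# not a measured value.

def linear_estimate(base_watts, base_mv, new_mv):
    # power scaling linearly with voltage (the simpler assumption)
    return base_watts * (new_mv / base_mv)

def quadratic_estimate(base_watts, base_mv, new_mv):
    # dynamic power scaling with V^2 (the usual CMOS approximation)
    return base_watts * (new_mv / base_mv) ** 2

base = 130.0  # W at 1000 mV
print(f"linear:    {linear_estimate(base, 1000, 888):.0f} W")    # 115 W
print(f"quadratic: {quadratic_estimate(base, 1000, 888):.0f} W") # 103 W
```

Either way the undervolt is worth a meaningful chunk of VRM heat; the truth for a memory rail sits somewhere between the two curves.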

Send your bios, happy to help.








Quote:


> Originally Posted by *gooshpitz*
> 
> Hi peeps,
> Can anyone tell me why is vram still clocked at 1300 after bios edit to 1250?


I see your GPU bios has some unusual settings in various positions, which I'm sure could cause weird memory issues. Send the bios on over and I'll take a look, bro.









Quote:


> Originally Posted by *gooshpitz*
> 
> Nvm, changed it like this.
> 
> Switched the second monitor to integrated gpu so now the memory for dedicated one is mostly at 1000MHz when watching vids, brwosing etc.


Nice. I notice some strange memory clock issues with DXVA etc. If you want, I can dial in both your cards and sync them up into really cool, quiet, fast-performing cards for sure.

Quote:


> Originally Posted by *ryboto*
> 
> I wonder if the VRM is just seeing poor airflow...fans go to 100% rather quickly under load, and the temperature still climbs.


Stock reference cooler? Let me know... send your original bios and I'll take a look.









Quote:


> Originally Posted by *AndreVeck*
> 
> Probably it is better to stick with the reference cooler.


I second this. The reference cooler performs better than the Arctic Accelero - I tested it. If you remove the vrm pad, smoosh it up, roll it up, and cut it into pieces for each vrm, it helps a lot: it reduced my vrm temp by 50 celsius. It also allows a 1250MHz core on the reference blower - I clocked mine to 1204MHz no problem, just 78C core at 50% fan speed.


----------



## Xuper

Hi, I recently bought an XFX 290 Double Dissipation and I have an issue with AMD WattMan (Crimson version 17.1.2):

1) I cannot manually set a fixed RPM. The fan speed always stays at a low RPM, around 300 to 400, whatever arbitrary RPM I request; only the Auto option works.

Edit: I just set 4008 RPM and GPU-Z reports: Current RPM: 1250.

2) There is no temperature control (for example, this link). Maybe because of this?

Quote:


> AMD manufactures the physical GPU chip and provides it to manufacturers like MSI, ASUS, etc. These companies design boards around the basic GPU supplied by AMD. That means they may tweak or change some features of the GPU. The Crimson ReLive drivers are based on a standard GPU implementation, and therefore if there are customizations made by the AIB, the driver does not have access to all the features.


Source

Edit : Added Screenshot of My Crimson. Here


----------



## chris89

That's a bios fan control "type" issue plus a PWM max issue. Post a GPU-Z bios dump here, unless you want to try to fix it yourself.
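For what it's worth, a toy model of how a too-low PWM ceiling in the BIOS pins the fan regardless of the requested speed: the request becomes a PWM duty cycle, the duty is clamped at the BIOS PwmMax value, and the fan tops out. All numbers below are hypothetical, chosen only to illustrate the clamping - they are not read from Xuper's actual BIOS:

```python
# Toy model: a BIOS-side PWM ceiling clamps any requested fan speed.
# The fan's max RPM and the PwmMax percentage here are hypothetical.

def actual_rpm(requested_rpm, fan_max_rpm, pwm_max_pct):
    """Map a requested RPM to a PWM duty, clamped at the BIOS ceiling."""
    duty = min(100 * requested_rpm / fan_max_rpm, pwm_max_pct)
    return fan_max_rpm * duty / 100

# With a hypothetical 4000 RPM fan and PwmMax capped at 30%,
# a 4008 RPM request saturates well below what was asked for:
print(actual_rpm(4008, 4000, 30))   # 1200.0
print(actual_rpm(1000, 4000, 30))   # 1000.0 (below the cap, honored)
```

That saturation pattern matches the symptom above: any high setting collapses to the same low reading, while Auto (which never exceeds the ceiling) behaves normally.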


----------



## Xuper

Quote:


> Originally Posted by *chris89*
> 
> Thats a bios fan control "type" issue & pwm max issue. GPUz bios dump here unless you want to try and fix it yourself.


So if I get a proper bios, can I manually change the fan speed? Where can I find a proper bios?

Edit : My Bios : 015.039.000.007.003523


----------



## chris89

A proper bios means there's no need to change it manually. Send your .rom and I'll take a look, bro.


----------



## Xuper

Here's my bios:

https://en.file-upload.net/download-12371354/XFX290Doubledissipation.rom.html


----------



## chris89




----------



## Xuper

I get the error "AJAX response unable to be parsed as valid JSON object".


----------



## chris89

.zip


----------



## Xuper

XFX290Doubledissipation.zip 100k .zip file


Bios Version :
015.039.000.007.003523


----------



## gooshpitz

Quote:


> Originally Posted by *chris89*
> 
> Nice I notice some strange memory clock issues on dxva etc. If you want i can dial in both yourcards and synced up to really nice cooler/ quieter/ faster performing cards for sure.


I unsubbed from this thread so I'm only seeing this now, but what do you mean by "strange memory clock issues"?
By the way, my bios looks like this now.

r9_290_gooshpitz_moded_bios.zip 98k .zip file


----------



## chris89

atiflash_274.zip 1214k .zip file


@gooshpitz Can you send the original bios? Sure thing - my explanations run long, so I'll try to keep it short. Basically we want 2 memory speed hops, no more. The rest comes from comparing the bioses of all the 290/290x/295x2/390/390x variants and settling on the most appropriate settings for all of them. Huge performance increase as well.

I fixed it. Try this with atiwinflash v2.74: open it as administrator, close every single app in the taskbar (maybe watch a YouTube video of the process first), begin the flash, don't touch anything, and wait until it's complete.









Gooshpitz_290.zip 99k .zip file


@Xuper Test this and let me know, so we can refine it if needed. Memory VDDCI is down from 1000mV to 888mV - a much cooler memory VRM. No power limit, so big high-end performance for 4K and whatnot, plus a fan tune.

Same procedure with atiwinflash v2.74: open as administrator, close every single app in the taskbar, begin the flash, don't touch anything, and wait until it's complete.









Xuper290.zip 99k .zip file


----------



## gooshpitz

Quote:


> Originally Posted by *chris89*
> 
> @gooshpitz Can you send original bios? Sure thing : my explanations are long so I'll try and keep it short. Basically we want 2 memory speed hops, no more. The rest is from comparing all the bios of all the variants of 290/290x/295x2/390/390x and concluding the most appropriate settings for all. HAHA Huge performance increase as well.


Not sure this is the original original; at some point I asked Sapphire tech support for a newer version, but they said what I have is the latest, which at the time was this one.

r9_290_default_bios.zip 98k .zip file


----------



## chris89

@gooshpitz I think your pcb is the more refined Grenada, so basically a 390/390x pcb. The default bios you sent probably isn't as good as the one you sent first. Try the one I sent: it's power de-limited, so no stuttering at ultra 4K, just buttery smooth performance. By the way, there's nothing to gain past 1250MHz memory, especially at 888 millivolts - that's 320GB/s on something like 75 fewer memory watts, so far more efficient and still very powerful.

Let me know how it goes. If it is in fact a Grenada PCB, it can probably do a 1133MHz core on the stock 65288 voltage setting; if it's just a Grenada-like Hawaii, then a 1066MHz core on stock voltage, as in the BIOS I sent. We can fiddle with it if you have the power available and want more speed. Usually 1133MHz at 1333mV is ideal, for 72 1/2 billion pixels per second.
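The throughput figures quoted here line up with Hawaii's standard specs (512-bit GDDR5 memory bus, 64 ROPs), which is worth a quick sanity check:

```python
# Verify the bandwidth and fill-rate figures using Hawaii's standard
# specs: 512-bit GDDR5 memory bus, 64 ROPs.

def mem_bandwidth_gbs(mem_mhz, bus_bits=512):
    # GDDR5 moves 4 bits per pin per memory clock (quad data rate)
    return mem_mhz * 4 * bus_bits / 8 / 1000

def pixel_fill_gpix(core_mhz, rops=64):
    # one pixel per ROP per core clock
    return core_mhz * rops / 1000

print(mem_bandwidth_gbs(1250))  # 320.0  -> "320GB/s" at 1250 MHz memory
print(pixel_fill_gpix(1133))    # 72.512 -> "72 1/2 billion pixels per second"
```

So both numbers in the post are straight consequences of the clocks, not separate claims.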


----------



## gooshpitz

Quote:


> Originally Posted by *chris89*
> 
> @gooshpitz I think your pcb is the more refined Grenada so basically a 390/390x pcb. The default bios you sent probably isn't as good as the one you sent first. Try the one I sent. It's power de-limited so no stuttering while at ultra 4k... Just buttery smooth performance. By the way, nothing to gain past 1250mhz especially at 888 millivolts. That's 320GB/s on like 75 less memory watts. So far more efficient and very powerful.
> 
> Let me know how it goes, if it is in fact a Grenada PCB then it can probably do 1133mhz core on stock 65288. If it's just a Grenada-like Hawaii then 1066mhz core on stock voltage just as the BIOS I sent. We can fiddle with it if you have the power available and want more speed. Usually 1133Mhz on 1333mv is ideal for 72 1/2 billion pixel's per second.


It works OK. I didn't see much difference in games or movies (granted, it was a short test), but it runs cooler, so thanks. I think I'll lower the clock a bit and set a lower manual voltage as well, for less noise / fan rpm.


----------



## AndreVeck

Does anyone know the max vcore for a 290?
I want to do a 1200MHz overclock but I'm scared of frying the card.

Also, is it normal that my memory is stable up to 1350 and no further?

Thanks, this is a great community!


----------



## Vellinious

Quote:


> Originally Posted by *AndreVeck*
> 
> Someone knows the max vcore of a 290?
> I want to do a 1200mhz overclock but I'm scared to fry the card.
> 
> Also is it normal that my memories are stable until 1350 and no further?
> 
> Thanks, this is a great community!


Mine was under water, but I'd run mine at +262mv, with the custom bios to decrease vdroop. GPUz reported 1.468v. Mine would black screen with any more voltage than that.


----------



## Streetdragon

Quote:


> Originally Posted by *AndreVeck*
> 
> Someone knows the max vcore of a 290?
> I want to do a 1200mhz overclock but I'm scared to fry the card.
> 
> Also is it normal that my memories are stable until 1350 and no further?
> 
> Thanks, this is a great community!


On water? 1.45V would be my max.
On air, 1.35V, or whatever the core and VRM1+2 temps allow:
VRM max 90° and core 80-85° (for me).

That the ram won't go higher isn't really a problem; the fps gain from it is minimal.


----------



## AndreVeck

Ok, thanks!
I have the card watercooled with a ghetto setup, but it runs cool! At about 1.35V / 1200MHz I had 40° on the core and 60° on the VRMs.
It was unstable though.
I don't know anything about vdroop - where do I find info about it?

Thanks guys!


----------



## Streetdragon

Quote:


> Originally Posted by *AndreVeck*
> 
> Ok, thanks!
> I have the card watercooled with a ghetto setup but it runs cool! At about 1.35v, 1200mhz I had 40° on core and 60 on VRMs.
> It was unstable.
> I don't know anything about vdroop, where do I find info about it?
> 
> Thanks guys!


I can't help with the vdroop; maybe somewhere in here: http://www.overclock.net/t/1564219/modded-r9-390x-bios-for-r9-290-290x-updated-02-16-2016

As for the instability: give it a bit more voltage. The temps are all fine (better than my temps xD) - maybe 10-20mV more and it's stable.


----------



## Newbie2009

I've noticed a blackscreen issue. I recently hooked up a 2nd monitor and have had problems since. I disabled ULPS, which seems to have helped, but sometimes the monitors just lose video while the audio continues, and I need a power cycle.

It seems to happen randomly and thus far hasn't happened while playing a videogame.

Anyone else experience this or are my cards dying?


----------



## Streetdragon

Quote:


> Originally Posted by *Newbie2009*
> 
> I've noticed a blackscreen issue. I recently hooked up a 2nd monitor and problems since. I disabled ULPS and seems to have helped, but sometimes the monitors just lose video, audio continues. Need a power cycle.
> 
> It seems to happen randomly and thus far hasn't happened while playing a videogame.
> 
> Anyone else experience this or are my cards dying?


I think the VDDCI voltage (not the core voltage) is a bit too low - bump it up a little. If that doesn't help, bump the core voltage a bit OR lower the memory clock by 20MHz or so.

If that doesn't help either, try other monitor cables.


----------



## Newbie2009

Quote:


> Originally Posted by *Streetdragon*
> 
> i think the vddci voltage(not the core voltage) is a bit to low. bump that a bit. if that wont help, bump the core voltage a bit OR lower the memory clock by 20 mhz or so.
> 
> if that is not helping too, try other cables for monitor


Cool, ta. Will lower clocks and see if it works.


----------



## lanofsong

Hey R9 290X/290 owners,

We are having our monthly Foldathon from Monday 20th - Wednesday 22nd - 12noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see attached link.

March 2017 Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - requires a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## zerokool_3211

I have also suddenly started having this issue. I just upgraded to the newest drivers to see if that fixes it, but it's a very troublesome issue: at times it won't happen at all, and at other times it will happen two or three times in 30 minutes.


----------



## Newbie2009

Quote:


> Originally Posted by *zerokool_3211*
> 
> i have also started all of a sudden having this issue. i just upgraded to the newest drivers to see if it fixes it, but it is a very troublesome issue that at times wont happen at all and then other times will happen two or three times in 30 minutes.


Mine wasn't driver related. The video output on one card is broken, so I had to disable it; I'm running on the 2nd card now, so no more xfire for me.


----------



## zerokool_3211

And you were getting black screens on all monitors, and a reboot would fix it?


----------



## Newbie2009

Quote:


> Originally Posted by *zerokool_3211*
> 
> and you were getting black screens on all monitors and a reboot would fix it?


Yeah. But when I investigated, it happened with just one monitor attached as well, with xfire on and off, and on different drivers. It would happen on the desktop and in games (randomly).

I eliminated all possible causes, which took a while. It's working fine now: 24 hours without a blackscreen, multi-monitor, on the 2nd graphics card with the 1st one disabled.

I haven't tried running xfire using the 2nd card as the video output yet; I don't think it's possible, but I have yet to check.


----------



## Newbie2009

Just for the record, I got the 2nd card working as my video out, with the main card (the one with the broken video out) as the slave in crossfire.


----------



## Gdourado

Anyone running a crossfire setup?
What is the current situation with regard to support and issues?

Cheers


----------



## diggiddi

It's a game-by-game situation: e.g. BF4 flickers, Project CARS has negative scaling (on the last few drivers at least), Crysis 3 works fine, etc.


----------



## Gdourado

Quote:


> Originally Posted by *diggiddi*
> 
> Its a game by game situation eg. BF4 flickers, project cars negative scaling(last few drivers at least) ,crysis 3 works fine etc, etc


How about the situation with frame pacing and micro stutter?
Is it still an issue?
Does using a freesync monitor resolve the issue?

Cheers


----------



## diggiddi

I have never had an issue with either frame pacing or micro stutter, and I don't have a freesync monitor, so I'm no help there.


----------



## Newbie2009

Quote:


> Originally Posted by *Gdourado*
> 
> How about the situation with frame pacing and micro stutter?
> Is it still an issue?
> Does using a freesync monitor resolve the issue?
> 
> Cheers


No, you shouldn't be able to tell the difference vs one card in most games.

It is hit and miss, to be honest; two games I play don't have xfire support: Homefront and Resident Evil 7.

If you go multi-GPU, you can't be the kind of person who buys games at launch - they usually take a while to get a profile, and some games never get one (although that's rare).


----------



## bichael

Think I've settled on a bios and clocks for my 290 - for now, at least.









Using The Stilt's MLU 1050/1375 rom with 0 offset on my PowerColor reference 290 (watercooled), and it seems to be working well. It just needed a small edit to change the device ID for the 290 rather than the 290x before it would boot.

I got a bit fed up playing with the memory OC on the stock rom. The 390 rom was working pretty well and got good Firestrike scores, but seemed maybe a little unstable and limited the OC. The MLU rom seems to give the best of both: good scores and stability.

I've modified it slightly via bios edit to raise the voltage of dpm7, and have overclocked dpm 6 & 7 with Wattman (with the voltage set to manual so it doesn't change anything), so it's running as below and gets around 13500 in Firestrike graphics. I may push it more if I move somewhere colder, but with 30°C ambient I think this is probably a good sweet spot (also, my psu gets loud after an extended period at very high load). Not sure if I should try to make the jumps between states smaller/smoother, but it's working, so I think I'll leave well alone.

VID / Clock
0.925 / 300
1.000 / 516
1.081 / 727
1.112 / 840
1.156 / 890
1.200 / 936
1.237 / 977 → 1050
1.25 → 1.306 / 1050 → 1140
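The ladder above can be sanity-checked as a small table; a sketch (DPM state numbering 0-7 is assumed, values taken from the post) verifying that both VID and clock rise monotonically through the states, which is what you want when smoothing the jumps:

```python
# DPM state -> (VID in volts, engine clock in MHz), per the post.
# State numbering (0..7) is the usual Hawaii layout and assumed here.
DPM = {
    0: (0.925, 300),
    1: (1.000, 516),
    2: (1.081, 727),
    3: (1.112, 840),
    4: (1.156, 890),
    5: (1.200, 936),
    6: (1.237, 1050),  # raised from stock 977
    7: (1.306, 1140),  # raised from stock 1.25 V / 1050
}

# Both VID and clock should rise strictly through the states,
# otherwise the card can bounce between states under load.
states = sorted(DPM)
monotonic = all(
    DPM[a][0] < DPM[b][0] and DPM[a][1] < DPM[b][1]
    for a, b in zip(states, states[1:])
)
print("ladder monotonic:", monotonic)  # True
```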


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Gdourado*
> 
> How about the situation with frame pacing and micro stutter?
> Is it still an issue?
> Does using a freesync monitor resolve the issue?
> 
> Cheers


FreeSync and DP fixed that single-monitor drama for me. But I run 12K surround when gaming.


----------



## diggiddi

On 3 290s?


----------



## chris89

12K? Wow, that's awesome... and I thought 4096x2160 was solid... guess not.

Might need to crank the cards up to 1.21 jiggahertz and hold true, maybe, haha.


----------



## ziddey

Is anyone else unable to change memory clocks with newer drivers (e.g. 17.3.3)? GPU-Z reports the new frequency, but no screen tearing occurs on the change, and performance is the same. HWiNFO64 correctly continues to report the stock clock. To prove the point, I can set the memory clock to the max 2000 MHz without any issues.

And with older drivers (e.g. 15.12), underclocking is not possible for either core or memory.


----------



## Streetdragon

Do you OC with Afterburner or WattMan? If you've touched WattMan, then you can't change anything with other tools.


----------



## Talon720

I've been looking at the 0.95 V rail hard mods lately, but still have questions. First, I've read you can use a 50k variable potentiometer or a 33k resistor. Even though the straight resistor would be easiest, I don't know where to measure. Maybe I won't be able to with a waterblock on.
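For anyone wondering why a trimmer vs. a fixed resistor matters here: these hard mods typically work by paralleling a resistor with one leg of the regulator's feedback divider, which shifts the rail the controller regulates to. A sketch of the arithmetic (Vref and divider values are purely illustrative, not the 290's actual feedback network):

```python
def parallel(r1, r2):
    """Equivalent resistance of two resistors in parallel."""
    return r1 * r2 / (r1 + r2)

def divider_vout(vref, r_top, r_bottom):
    """Regulator whose feedback divider sets Vout = Vref * (1 + Rtop/Rbot)."""
    return vref * (1 + r_top / r_bottom)

# Illustrative numbers only -- not the card's real feedback network.
VREF = 0.6                     # hypothetical controller reference
R_TOP, R_BOT = 10_000, 17_000  # hypothetical stock divider (~0.95 V)

stock = divider_vout(VREF, R_TOP, R_BOT)
# Paralleling a fixed 33k across the bottom resistor lowers Rbot and
# therefore raises Vout; a trimmer lets you dial the shift instead.
modded = divider_vout(VREF, R_TOP, parallel(R_BOT, 33_000))
print(f"stock {stock:.3f} V -> modded {modded:.3f} V")
```

With a fixed resistor the shift is set once by the arithmetic above; the 50k trimmer covers a range, which is why you'd want a measuring point before committing to a solder-down value.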


----------



## ziddey

Quote:


> Originally Posted by *Streetdragon*
> 
> Do you OC with Afterburner or WattMan? If you've touched WattMan, then you can't change anything with other tools.


The results are the same with either. I cleared out the registry keys so that WattMan shows the agreement screen again to confirm this.

It looks like the issue with the newer drivers is present when using multiple monitors. It holds the RAM at full speed and is unable to change it dynamically. I can set my desired speed in WattMan and reboot, and it will actually take effect. Or, if I go down to one monitor and let the RAM drop to idle clocks (150 MHz), it will then take the set speed the next time it clocks up.


----------



## PontiacGTX

Is anyone still using their reference cooler?


----------



## spyshagg

Quote:


> Originally Posted by *PontiacGTX*
> 
> Is anyone still using their reference cooler?


yeah, on my mining rigs.


----------



## PontiacGTX

do you have a kill a watt?


----------



## spyshagg

Quote:


> Originally Posted by *PontiacGTX*
> 
> do you have a kill a watt?


Yes. 7 reference cards for mining and one DCU II 290X for gaming.


----------



## PontiacGTX

do you mind if I PM?


----------



## spyshagg

Quote:


> Originally Posted by *PontiacGTX*
> 
> do you mind if I ask you about the power draw?


The reference 290s with the reference cooler are giving me between 115 and 130 W in GPU-Z while mining @ 75°C, 94% fan.

The reference 290Xs with the reference cooler are giving me between 135 and 150 W in GPU-Z while mining @ 83°C, 85% fan.

They are all clocked between 1000 and 1040 MHz with the 1.8 mod BIOS and undervolted by ~30 mV from the BIOS stock voltage (the BIOS adds +37 mV, so -67 mV in total).


----------



## Raephen

Last week Tuesday, I swapped the cooling on my reference Sapphire R9 290 from an AC Hawaii waterblock to an old Arctic Cooling Twin Turbo I still had lying around, plus a Gelid VRM kit R2 that I bought for €14.95 incl. shipping (clearance sale, I guess).




This is in preparation for a major system upgrade (Ryzen 5), and perhaps a sale to a friend who's in need of a decent GPU.

The temps, so far, have been far from disappointing. (Sub-70°C at 40% fan speed -- I can't seem to get the MSI AB custom fan curve to work.)


----------



## mAs81

Quote:


> Originally Posted by *Raephen*
> 
> can't seem to get the MSI AB custom fan curve to work


I've seen people complain about that quite often... plus it happened to me too.

I believe WattMan is to blame. Update/reinstall your drivers and DO NOT open the WattMan/AMD Overdrive settings - ever...


----------



## ZealotKi11er

Quote:


> Originally Posted by *Raephen*
> 
> Last week Tuesday, I swapped the cooling on my reference Sapphire R9 290 from an AC Hawaii waterblock to an old Arctic Cooling Twin Turbo I still had lying around, plus a Gelid VRM kit R2 that I bought for €14.95 incl. shipping (clearance sale, I guess).
> 
> This is in preparation for a major system upgrade (Ryzen 5), and perhaps a sale to a friend who's in need of a decent GPU.
> 
> The temps, so far, have been far from disappointing. (Sub-70°C at 40% fan speed -- I can't seem to get the MSI AB custom fan curve to work.)


Any reason why you removed the WB?


----------



## ILOVEPOTtery

Quote:


> Originally Posted by *Raephen*
> 
> ::snip::
> This in preparation of: a major system upgrade (RyZen 5) and perhaps the sale to a friend who's in need of a decent GPU.
> ::snip::


Quote:


> Originally Posted by *ZealotKi11er*
> 
> Any reason why you removed the WB?


Willing to bet his friend's build will be air only.


----------



## Raephen

Quote:


> Originally Posted by *mAs81*
> 
> I've seen people complain about that quite often..Plus it happened to me too..
> 
> I believe that Wattman is to blame..Update/Reinstall your drivers and DO NOT open Wattman/AMD Overdrive settings - ever....


Oddly enough, I got it to work without all that.

I've got a Sapphire card, so I thought I'd give TriXX a go. It seemed to work fine, apart from what I can only guess was some issue with the 2D clocks and voltage. When I rebooted and switched back to MSI AB, the curve I had set seemed to hold this time, so whatever it was, I'm not complaining.


----------



## Raephen

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Any reason why you removed the WB?


Quote:


> Originally Posted by *ILOVEPOTtery*
> 
> Willing to bet his friend's build will be air only.


Two reasons: 1. I honestly don't have much time for or interest in watercooling anymore, and 2. the friend I might have sold it to indeed has no watercooling setup anymore (he sold what he had and was thinking of a new system instead of the old Athlon II X2 250 he still has lying around) and no interest in it either (he's got kids, a busy job, etc. - he just wanted something that works). He bought a PS4 instead.

Honestly, while reworking my current setup around an AM4 motherboard would be doable, it'll set me back quite a bit by my first rough estimates. A good air cooler would be cheaper, and most likely a good one would be more than enough for Ryzen. They seem to hit a voltage bump on anything past 3.8/3.9 GiggleHertz.

As a result of those ideas, I was curious to see how the old Twin Turbo II would do, and since I happened on those Gelid VRM heatsinks on sale, I thought I'd give it a go. I was very pleasantly surprised by the results I got.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Raephen*
> 
> Two reasons: 1. I honestly don't have much time and interest in watercooling anymore, and 2. the friend I might have sold it to has indeed no watercooling setup anymore (he sold what he had and was thinking of a new system instead of the old Athlon II x2 250 he still has lying arround) and no interest in it anymore (he's got kids, busy job etc - just wanted something that works). He bought a PS4, now.
> 
> Honestly, while reworking my current setup arround a AM4 motherboard could be doable, it'll set me back quite a buck, in my first, raw approximations. A good aircooler would be cheaper and most likely, a good one would be more than enough for RyZen. They seem to hit a voltage bump on anything past 3.8/3.9 GiggleHertz.
> 
> As a result of those ideas, I was currious to see how the old Twin Turbo II would do and since I happened on those Gelid VRM heatsinks on sale, I thought I'd give it a go. I was very pleasantly surprised by the results I got.


The GPU is not a problem. How cool is VRM1?


----------



## Raephen

Quote:


> Originally Posted by *ZealotKi11er*
> 
> The GPU is not a problem. How cool is VRM1?


With the fans at a fixed 40% and no added voltage, the max reading GPU-Z gave me after a stint in Boston head-shotting raiders and super mutants was 71°C on VRM1.

With a more aggressive fan profile now working: 65°C.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Raephen*
> 
> With the fans at a fixed 40% and no added voltage, the max reading GPU-Z gave me after a stint in Boston head-shotting raiders and super mutants was 71°C on VRM1.
> 
> With a more aggressive fan profile now working: 65°C.


Not bad.


----------



## misho93

Hi,
I need to ask about my problem after changing the thermal paste on a Tri-X R9 290X and an ASUS R9 390.
My R9 290X only heats up the board (PCB) after I reapply thermal paste on the GPU.
When I touch the heatpipe it's cool, and the heatsink is cool too, but the board gets very hot (90°C) and the PC shuts down.
I've tried reapplying the thermal paste and cleaning many times, but it's the same thing. Now I can't use either of my two GPUs. Help please.


----------



## mAs81

Quote:


> Originally Posted by *misho93*
> 
> Hi,
> I need to ask about my problem after changing the thermal paste on a Tri-X R9 290X and an ASUS R9 390.
> My R9 290X only heats up the board (PCB) after I reapply thermal paste on the GPU.
> When I touch the heatpipe it's cool, and the heatsink is cool too, but the board gets very hot (90°C) and the PC shuts down.
> I've tried reapplying the thermal paste and cleaning many times, but it's the same thing. Now I can't use either of my two GPUs. Help please.


You must be re-assembling it wrong. If the PCB is hot but the heatsink is cool, I can only assume there's poor to no contact between the chip and the heatsink.

Go through the tutorials for your cards again and make sure you're putting the right screws in the right places.

Also, what kind of thermal paste are you using, and on which card?

Pictures would also do wonders in helping us understand.


----------



## Lernatix

I just picked up a reference 290X second-hand and am already thinking about cooling mods.
It seems my old XFX 6870 heatsink will fit, and I can use small alloy sinks on the VRMs.
However, the older 6870 uses two-pin fans vs. the 290X's 4-pin, so a two-part question:

1) Which two wires would I tap into to use the older fans?
2) If I decide to use a 120mm PWM fan, do the four pins match a typical motherboard layout?

Cheers.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Lernatix*
> 
> I just picked up a reference 290X second hand and am already thinking about cooling mods.
> Seems my old 6870 XFX heatsink will fit and I can use small alloy sinks on the VRM.
> However the older 6870 only uses two pin fans vs the 290X 4 pin, so two part question.
> 
> 1) Which two wires would I hack into to use the older fans.
> 2) If I decide to use a 120mm pwm fan, do the four pins match a typical motherboard layout?
> 
> Cheers.


4-Pin is PWM.


----------



## ILOVEPOTtery

If you want to wire up the old two-wire fan, peep the attached pic. You'll need the +12V and ground. I'd go with a PWM fan so you can control speeds for idle/gaming.
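For reference when tapping wires: standard 4-wire fan headers follow the Intel 4-wire fan spec ordering below, though GPU boards don't always use the same connector or order, so verify with a multimeter before cutting anything. A quick sketch:

```python
# Standard 4-wire PWM fan pinout (Intel 4-wire fan spec ordering).
# Pin 1 is usually marked on the header; wire colors vary by vendor.
# GPU fan headers may deviate -- confirm with a multimeter first.
FAN_PINOUT = {
    1: ("GND", "black"),
    2: ("+12V", "yellow or red"),
    3: ("tach / RPM sense", "green or yellow"),
    4: ("PWM control", "blue"),
}

# A plain two-wire fan only needs pins 1 and 2 (ground and +12V);
# pins 3 and 4 are left unconnected, so speed control is lost.
for pin, (signal, color) in sorted(FAN_PINOUT.items()):
    print(f"pin {pin}: {signal} ({color})")
```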


----------



## Caradine

I have an R9 290 Tri-X with 85.3% ASIC and 1.08 V stock voltage, which seems cool, but this seemingly low stock voltage means the +100 mV limit in Afterburner isn't really enough. Is there a way to bypass that limit?

I don't want to use TriXX because it gives me horrible flicker whenever I apply any setting at all.


----------



## ZealotKi11er

Quote:


> Originally Posted by *Caradine*
> 
> I have an r9 290 tri-x with 85.3% ASIC and 1.08V stock voltage which seems cool, but this seemingly low stock voltage means the +100mV limit in afterburner isn't really enough. Is there a way to bypass that limit?
> 
> I don't want to use trixx because it gives me horrible flicker whenever I apply any setting at all.


Yes. You can change the stock value to +50 mV in the BIOS with this tool:

https://github.com/OneB1t/HawaiiBiosReader


----------



## Caradine

Quote:


> Originally Posted by *ZealotKi11er*
> 
> Yes. You can change the stock value to +50 mV in the BIOS with this tool:
> 
> https://github.com/OneB1t/HawaiiBiosReader


Thanks


----------



## Lernatix

Thank you both.
I decided to use my 7950 Windforce cooler instead; the fans plug straight in, and it needed very little modding to make it fit.

A couple of bent fins and a small dent in the DVI shield were the worst of it.
I also added a bunch of small heatsinks for the VRMs and one of the RAM modules.

Most importantly, my ears have stopped bleeding!


----------



## ZealotKi11er

Quote:


> Originally Posted by *Lernatix*
> 
> Thank you both.
> I decided to use my 7950 Windforce cooler instead, fans plug straight in and it
> needed very little modding to make it fit.
> 
> A couple of bent fins and a small dent in the DVI shield being the worst of it.
> I also added a bunch of small heatsinks for VRMs and one of the RAM modules.
> 
> Most importantly, my ears have stopped bleeding!


You can actually make the 290X Reference usable if you know how to mod the card's BIOS. 1 GHz was simply too much for the card; the base clock is 727 MHz. You can run the card at ~850 MHz with half the power, which is crazy.
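The "half the power" figure is roughly what first-order CMOS scaling predicts, since dynamic power goes as f × V². A sketch of the arithmetic (the voltages are illustrative, not measured from a 290X):

```python
def rel_dynamic_power(f1, v1, f2, v2):
    """First-order CMOS dynamic power scaling: P ~ f * V^2.
    Returns the new operating point's power as a fraction of the old."""
    return (f2 * v2**2) / (f1 * v1**2)

# Illustrative: 1000 MHz @ 1.20 V dropped to 850 MHz @ 0.95 V.
ratio = rel_dynamic_power(1000, 1.20, 850, 0.95)
print(f"~{ratio:.0%} of stock dynamic power")
```

The clock cut alone only saves ~15%; it's the voltage reduction the lower clock permits, entering squared, that does most of the work.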


----------



## ziddey

I recently bought a damaged 290. The SMD cap shown was smashed. I don't have the specs or the means to measure the capacitance of one of the good caps; instead, I just swapped in a same-sized cap from a donor 8800 GT.

The card works now and is completely stable with the stock BIOS, as well as with much more aggressive memory timings. However, I'm simply unable to go beyond ~1325 memory stably. The stock ROM, the Hynix mod ROM, manually applying the 390 MC timings, and all other testing have proven fruitless.

The stock BIOS has the memory at 1300, and it seems that with the 390 MC timings it should be able to hit 1500.

Further, it seems that when exceeding ~2 GB of memory usage (e.g. when mining at higher intensities), errors are produced (HWiNFO shows no memory errors as long as I stay within 1325 MHz).

So I'm wondering if this capacitor is related to the memory / memory controller. Or better yet, if anyone is able to determine the actual specs so I can replace it with a proper one.

Thanks

Edit: it's a reference 290 (Sapphire Tri-X)


----------



## ZealotKi11er

Quote:


> Originally Posted by *ziddey*
> 
> 
> I recently bought a damaged 290. The smd cap shown was smashed. I don't have the specs or the means to measure the capacitance of one of the good caps. Instead, I just swapped in a same-sized cap from a donor 8800gt.
> 
> The card works now and is completely stable with the stock bios, as well as with much more aggressive memory timings. However, I'm simply unable to go beyond ~1325mem stably. Stock rom, the hynix mod rom, manually applying the 390 mc timings, or any other testing have all proven fruitless.
> 
> Stock bios has mem at 1300, and it seems that with 390 mc timings, it should be able to hit 1500.
> 
> Further, it seems when exceeding ~2gb of memory usage (e.g. when mining at higher intensities), errors are produced (hwinfo shows no memory errors as long as I stay within 1325mhz).
> 
> So I'm wondering if this capacitor is related to the memory / memory controller. Or better yet, if anyone is able to determine the actual specs so I can replace it with a proper one.
> 
> Thanks
> 
> edit: it's a reference 290 (sapphire tri-x)


They might have a different value. Also, 1300 MHz is a bad clock for the 290: you want 1250, 1375 or 1500, since those match the memory timing straps.
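The reason specific clocks like 1250/1375/1500 get recommended comes down to memory timing straps: the BIOS stores one timing set per frequency band, and a given clock runs the timings of the lowest strap at or above it, so 1300 runs the looser 1375-strap timings at only 1300 MHz. A sketch of the lookup (the strap edges here are the commonly cited ones for these cards and are an assumption):

```python
# Assumed Hawaii memory strap edges (MHz); each strap's timings apply
# to all clocks at or below its edge, down to the previous edge.
STRAPS_MHZ = [1125, 1250, 1375, 1500]

def strap_for(clock_mhz):
    """Return the strap edge whose timings a given memory clock uses."""
    for edge in STRAPS_MHZ:
        if clock_mhz <= edge:
            return edge
    return None  # beyond the last strap

print(strap_for(1300))  # 1375: looser timings with no clock to show for it
print(strap_for(1250))  # 1250: sits exactly on the tighter strap
```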


----------



## ziddey

Quote:


> Originally Posted by *ZealotKi11er*
> 
> They might have different value. 1300MHz is a bad clock for 290. You want 1250, 1375 or 1500.


Running the same timings for both the 1250 and 1375 strap.


----------



## Lernatix

Well, that was all for nothing. The card was working fine (cool and quiet) last night and this morning; this afternoon the PC shut down and won't boot with the 290X installed. ***

It looks like I blew the topmost MOSFET in the pic - it's almost black compared to the others.
(I must have had one heatsink on poorly.) Is this something I can remove/replace/bypass?

https://images.bit-tech.net/content_images/2014/03/sapphire-radeon-r9-290x-tri-x-oc-review/sap290xoc-7b.jpg


----------



## ziddey

Quote:


> Originally Posted by *Lernatix*
> 
> Well that was all for nothing, card was working fine (cool and quiet) last night and this morning, this afternoon PC shut down and won't boot with the 290X installed. ***
> 
> Looks like I blew the top most mosfet in pic. It's almost black compared to the others.
> (Must have had one heatsink on poorly) Is this something I can remove/replace/bypass?
> 
> https://images.bit-tech.net/content_images/2014/03/sapphire-radeon-r9-290x-tri-x-oc-review/sap290xoc-7b.jpg


This guy? https://www.digikey.com/product-detail/en/infineon-technologies/IRF6811STRPBF/IRF6811STRPBFCT-ND/3598437


----------



## Caradine

That guy is supposed to throttle the GPU when it goes above 110 degrees; it shouldn't be able to go boom. Is the card voltage/power modded, or is it running the PT3 BIOS?


----------



## lanofsong

Hey AMD R9 290X / 290 owners,

We are having our monthly Foldathon from Monday the 17th to Wednesday the 19th, 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see the attached link.

April 2017 Foldathon

BTW - make sure you sign up









To get started:

1. Get a passkey (allows for a speed bonus) - you need a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## punkfiveo

Wondering if anyone can help me pinpoint my problem with R9 290 crossfire:

Here's my problem: I plug one R9 290 into PCIe slot 1 and OC it; it can take 1125 core and 1450 memory @ +50 mV. I take it out and plug the other R9 290 into PCIe slot 1, and it can OC to 1125 core and 1400 memory @ +50 mV. I take it out, plug it into PCIe slot 3, and it can overclock the same amount. But when I put the other R9 290 back into PCIe slot 1 and enable crossfire, I can run 1125 core @ +50 mV but I must keep the memory stock at 1250 MHz. It won't even go up +25 MHz to 1275 MHz memory before black-screening. I have also tried setting both cards back to stock clocks (947 MHz), but I still can't get any sort of memory OC.

How can I proceed from here to pinpoint the issue? I am thinking the motherboard does not like crossfire?

*System specs:*

ASUS X99-A
i7 5930k
32gb Corsair ram
2x ASUS R9 290 (reference) with Kraken G10 and H55, with individual VRAM heatsinks and Gelid VRM heatsinks
M.2 SSD

*Here are things I have tried:*

Updated and did a clean video driver install (DDU in safe mode, etc.)
Updated the motherboard BIOS (it's the latest)
Ran another card in crossfire (1x Sapphire R9 290 reference with 1x ASUS R9 290 G10+H55 mod)... still won't take a GPU memory OC
VRM temps are 70 deg under full load after running for hours
GPU temps are 65 deg under full load after running for hours
Increasing voltage doesn't do anything to mitigate the black screen on memory OC
Upgraded the PSU from an old Thermaltake 1250W to a new EVGA 1200W - no difference in stability
Ran independent VGA power cables from the PSU to the cards, as opposed to one cable that splits into an 8-pin and a 6+2 connector... no difference
Re-seated both GPUs multiple times
Kept the CPU at stock speed
Tried different OC programs (using AB right now)
Disabled ULPS
Forced constant voltage

*Things I have not tried:*

Different motherboard (nobody will lend me their X99)
Different RAM (don't know how this would make a difference)
Have not tried SLI/crossfire with a different set of cards...say Rx480s or 970s
Have not fiddled with AUX voltage much
GPU BIOS Flash


----------



## Streetdragon

Don't go higher with the memory clock - read a bit here:
http://www.overclock.net/t/1561704/official-amd-r9-390-390x-owners-club/11220

You won't get much performance out of it.

I know this won't help with your problems; maybe someone else can help if you want higher clocks.


----------



## diggiddi

Quote:


> Originally Posted by *punkfiveo*
> 
> Wondering if anyone can help me pinpoint my problem with R9 290 crossfire:
> 
> Here's my problem: I plug in one R9 290 into PCI-e slot 1 and OC it, it can take 1125 core and 1450 memory @ +50mV. I take it out, and plug the other R9 290 into PCI-e slot 1, and it can OC to 1125 core and 1400 memory @ +50mV. I take it out, plug it into PCI-e slot 3, and it can overclock the same amount. But when I put the other R9 290 back into PCI-e 1 and enable crossfire, I can run 1125 core @+50mV but I must keep memory stock at 1250mhz. It won't even go up to +25mhz to 1275mhz memory before black screening. I have also tried setting both cards back to stock clock (947mhz), but I still can't seem to get any sort of memory OC.
> 
> How can I proceed from here to pinpoint the issue? I am thinking the motherboard does not like crossfire?
> 
> *System specs:*
> 
> ASUS X99-A
> i7 5930k
> 32gb Corsair ram
> 2x ASUS R9 290 (reference) with Kraken G10 and H55, with individual VRAM heatsinks and Gelid VRM heatsinks
> M.2 SSD
> 
> *Here are things I have tried:*
> 
> Update and do a clean install video driver DDU in safe mode etc
> Update Motherboard Bios (it's the latest)
> Run another card in Crossfire (Ran 1x Sapphire R9 290 (reference) with 1x ASUS R9 290 G10+H55 Mod.... still won't take GPU mem OC
> VRM temps are 70 deg under full load after running for hours
> GPU temps are 65 deg under full load and after running for hours
> Increasing voltage doesn't do anything to mitigate blackscreen on mem OC
> I upgraded PSU from old Thermaltake 1250W to new EVGA 1200W - no difference in stability
> Ran independent VGA power cables from PSU to cards, as opposed to the one cable that splits into an 8-pin and another 6+2 connector.. no difference
> Re-seated both GPUs multiple times
> Keep CPU at stock speed
> Tried different OC programs (Using AB right now)
> Disable ULPS
> Force constant voltage
> 
> *Things I have not tried:*
> 
> Different motherboard (nobody will lend me their X99)
> Different RAM (don't know how this would make a difference)
> Have not tried SLI/crossfire with a different set of cards...say Rx480s or 970s
> Have not fiddled with AUX voltage much
> GPU BIOS Flash


What PSU are you using and how old is it?
Never mind


----------



## artemis2307

Got this bunch for $150, all brand new.








will post temps when I got home


----------



## Newbie2009

Quote:


> Originally Posted by *artemis2307*
> 
> Got this bunch for 150$ all brand new
> 
> 
> 
> 
> 
> 
> 
> 
> will post temps when I got home


My temps are 31°C to 45°C with a waterblock, for reference.


----------



## artemis2307

1150/1500, +38 mV, +25% power limit
Ambient temp 25°C, push-pull 1800 rpm
I know it'd be wiser to get an RX 480, but a decent 480 8GB costs $250 in my country, so $150 for the same performance is pretty decent, I guess.


----------



## Newbie2009

Quote:


> Originally Posted by *artemis2307*
> 
> 1150/1500, +38 mV, +25% power limit
> Ambient temp 25°C, push-pull 1800 rpm
> I know it'd be wiser to get an RX 480, but a decent 480 8GB costs $250 in my country, so $150 for the same performance is pretty decent, I guess.


Nice! I'd be careful on the memory OC though, sometimes gains nothing/slightly loses performance.


----------



## artemis2307

Quote:


> Originally Posted by *Newbie2009*
> 
> Nice! I'd be careful on the memory OC though, sometimes gains nothing/slightly loses performance.


That thing can do 1600, though it's not gaining much;
1500 gains pretty decently over stock memory, though.


----------



## Newbie2009

Quote:


> Originally Posted by *artemis2307*
> 
> that thing can do 1600 though it's not gaining much
> 1500 gains pretty decent over stock mem though


Nice. Money well spent!


----------



## prichina

Can someone help me with raising the VRAM voltage on my R9 290X (in my signature)? Please message me.


----------



## lanofsong

Hey there R9 290/290X owners,

We are having our monthly Foldathon from Monday the 22nd to Wednesday the 24th, 12 noon EST.
Would you consider putting all that power to a good cause for those 2 days? If so, come sign up and fold with us - see the attached link.

May 2017 Foldathon

To get started:

1.Get a passkey (allows for speed bonus) - need a valid email address
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2.Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726

later
lanofsong


----------



## Mega Man

No, please stop spamming our thread every month


----------



## lanofsong

Quote:


> Originally Posted by *Mega Man*
> 
> No, please stop spamming our thread every month


Just passing along OCN event information - you need not participate if you do not wish to







.


----------



## Mega Man

OK, so make a thread for folding, with a monthly announcement, so you don't have to spam threads with off-topic info.

This thread was created for 290X/290s, not for folding, and I am sure all the threads you spam with the same post were made to talk about their products. That said, OT conversations are fine. But there is no glitter to folding. Just bam! It would be like me coming into the folding thread and starting to talk about childcare: completely irrelevant and way off topic. Make a thread, like the rest of us have to, and post there. If people want to fold, they will follow it.

I feel like I'm hit up by a panhandler every month, holding a sign on the side of the road. (Or a more accurate description: like the one years ago who wore a male thong and walked between cars with a dog, so you had to look.) Frankly, it just looks bad for you having to beg every month.


----------



## johnz31975

Hello, I have two XFX R9 290s (not unlockable) and one ASUS Matrix R9 290X in my system. In DX12, crossfire is enabled, but Time Spy in 3DMark only uses the first card (the ASUS one). If I test Tomb Raider with DX12 enabled, only the first card works; in DX11 all of them work, and in Fire Strike all of them work too. I think this is a problem with DX12 crossfire when the cards have different device IDs. I tried to write the R9 290X BIOS to my two XFX cards; ATIFlash changes the device ID from 67B1 (290) to 67B0 (290X), but in Windows GPU-Z shows the device ID unchanged, still 67B1, so no luck with the XFX cards. My second option is to change the device ID on the Matrix card instead. If this is a correct idea, could someone please mod my BIOS from device ID 67B0 to 67B1 so I can test it?
Here is my BIOS:
https://www.mediafire.com/?0vh6ucli8za7xgf
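As an illustration of what that device-ID mod involves: the PCI device ID is stored little-endian in the ROM, so a 67B0-to-67B1 change is a two-byte edit. Here's a sketch that only locates the bytes (the filename is hypothetical; real edits should go through a proper tool like HawaiiBiosReader, which also handles the BIOS checksum - flashing a hand-patched ROM with a bad checksum can brick the card):

```python
def find_device_id(rom: bytes, device_id: int):
    """Return every offset where the little-endian PCI device ID occurs."""
    needle = device_id.to_bytes(2, "little")  # 0x67B0 -> b'\xb0\x67'
    hits, i = [], rom.find(needle)
    while i != -1:
        hits.append(i)
        i = rom.find(needle, i + 1)
    return hits

# Toy buffer standing in for a ROM dump; read a real one with e.g.
# open("hawaii.rom", "rb").read() (filename hypothetical).
toy_rom = bytes.fromhex("00b067ffb06700")
print(find_device_id(toy_rom, 0x67B0))  # [1, 4]
```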


----------



## mirzet1976

I have the same problem with a 290X and 290: DX12 does not work in CF mode for Time Spy and ROTTR; Fire Strike DX11 is OK.


----------



## lum-x

I bought a Tri-X cooler for $25 and swapped it for my reference cooler. I managed to overclock my card to 1150/1400 with +35 mV. The only thing I'd like to do now is change those values in the BIOS and change the fan profile. I still want to do more testing and see if I can push the GPU a bit more, but to do 1170 I have to add a lot more voltage, and I don't think it's worth it. I think I have a decent card for overclocking.









A few benchmarks I just did:
*Unigine Heaven Benchmark 4.0*
FPS: 54.7
Score: 1379
Min FPS: 26.2
Max FPS: 118.4
*Time Spy:* 4017

I have my FX-8320 overclocked to 4447 MHz with a 233 MHz FSB. I'll try to push it further, change the FSB to 266, and see if I can clock it to 4.6 GHz without too much voltage.
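The CPU clock arithmetic here is just base clock × multiplier; a sketch (the multipliers are illustrative guesses, not read from the system):

```python
def cpu_clock(fsb_mhz, multiplier):
    """Effective CPU clock from base (FSB/reference) clock and multiplier."""
    return fsb_mhz * multiplier

# 233 MHz at ~19x lands near the 4447 MHz quoted above;
# 266 MHz would need ~17.5x to reach ~4.6 GHz.
print(cpu_clock(233, 19))    # 4427
print(cpu_clock(266, 17.5))  # 4655.0
```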


----------



## dave338

Hello everybody.

I bought a 2nd-hand Sapphire 290 reference model. It isn't unlockable, and it wears a Gelid Icy cooler for AMD.

Temps are not bad, about 80°C max with fans at 1400 rpm, but VRM1 gets into the 90s, so I don't want to increase voltage. In fact, I want to decrease it if possible.

With stock voltages (1.18 V - is that possible?) and the stock BIOS, I have pushed it to 1050/1420.
Are those good numbers for a reference card? I tried 1075/1450 and it worked in the moment, but on the next restart I got flashes and a black screen idling in Windows... so I reverted to the previous values, which worked well.

Any advice to continue improving its performance?

Regards.


----------



## boot318

https://www.amazon.com/Gelid-Solutions-Vision-Cooler-GC-VGA02-01/dp/B003U8BLJI

^ Is that the cooler you put on the reference PCB? Did you put a heatsink on the VRMs?


----------



## dave338

Hi boot









Almost - it's the other revision, this one:
https://www.amazon.com/GeLid-GC-VGA02-02-Vision-Video-Cooler/dp/B00A6OAH08/ref=sr_1_1?s=electronics&ie=UTF8&qid=1495743103&sr=1-1&keywords=gelid+icy+vision

It has a different base, special for AMD cards.

And it comes with all the little heatsinks that you can see in the last pic of your link, for the RAM and VRMs.

I bought an Arctic thermal pad (the blue one, 6 W/mK, 1.5 mm thick) to replace the VRM pads, to see if it gave better results, but they are almost the same. With low airflow (1200 rpm and very quiet), VRM1 gets to 100°C, which is too hot. At 1400 rpm - audible, but not annoying (the rest of the fans in my case are at about 900 rpm, 12 and 14 cm) - it's around 90°C.

I'd like to have better VRM cooling, but I think it's enough...


----------



## boot318

I expected better temps with that aftermarket cooler and VRM kit. I'm guessing the fans don't have enough static pressure to properly cool the heatsink and VRM heatsinks. Can you raise the fan speed? I'm seeing other people with that cooler saying core temps stay in the mid-70s and VRM temps in the low 80s. I guess case airflow could be raising your temps as well.

Those temps are safe, but I don't think you should have to live with them... since you got the aftermarket cooler set up for your 290.









P.S. I ghetto'd some heatsinks onto my 290X's VRM. They are doing a great job cooling it! I don't recommend it, but I can take some pictures of it later.


----------



## dave338

Well, those temps are the max I've seen in Rise of the Tomb Raider, which is the game that runs the card hottest from what I've seen...

In other games like Forza Apex or Overwatch, GPU and VRM temps are lower, especially the VRM ones (about 75°C).

The fans can run up to 2000 rpm, but at that speed they are annoying for my delicate ears xDDD

I'll see if I can improve the airflow in my case a bit, but it's a MasterCase Pro 5, so it's well ventilated, although all the fans are at about 900 rpm (2 front + 1 rear + 2 top with a 240 radiator).

Is a modded BIOS recommended at this point for my card? Or will I make things worse?
Are my freqs of 1050/1420 right?

Regards.


----------



## bichael

I would give the MLU BIOS a try. One of its benefits is lower VRM temps, if I remember right. It needs a very small mod of the device ID, as it's really a 290X BIOS. I can share mine later if you want, as I'm using it on a reference 290 as well.

http://www.overclock.net/t/1561904/mlu-bios-builds-for-290x


----------



## dave338

Oh! That would be great!!

I can give it a try.. is it valid for any reference card??
What are the stock speeds and voltages in this BIOS??

I know of the Infinity BIOS, which is a modded BIOS from the 390/390X, but I'm not fully confident putting that on my card, to be honest...


----------



## bichael

Yep, as far as I know it should work for all reference cards.
Sent the BIOS by PM. It's the 1050/1375 one with 0 voltage offset; the only mod is changing the device ID to a 290's. From what you mention above it should hopefully be okay with your card, as you were running similar clocks. If not, there are other versions that include various voltage offsets.
Note the memory timings are tuned, so just leave the clock at 1375.
Enjoy!


----------



## dave338

Thank you very much, bichael.

Just one question... Can I flash either of the two BIOS positions of the switch?? That way I can leave the current one safe and use the other position, which I haven't used yet

Regards


----------



## bichael

The best flashing guide I found is in the thread below - it does mention putting the switch towards the power connectors, though I'm not sure what difference it makes; I never checked when I did mine as it was hard to get to!
http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread
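For reference, the guide linked above boils down to a sequence roughly like the following with ATIFlash (adapter index and file names here are placeholders, so double-check every flag against the guide before flashing, and always keep a backup of the original BIOS):

```shell
# List adapters and note the index of the card you want to flash
atiflash -i

# Back up the BIOS currently selected by the physical switch
atiflash -s 0 backup.rom

# Program the new image to adapter 0; -f forces past the device-ID
# check, which matters here since the MLU image reports as a 290X
atiflash -p 0 mlu290.rom -f

# Power the system off fully, reboot, then verify clocks in GPU-Z
```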


----------



## dave338

Quote:


> Originally Posted by *bichael*
> 
> The best flashing guide I found is in the thread below - it does mention putting the switch towards the power connectors, though I'm not sure what difference it makes; I never checked when I did mine as it was hard to get to!
> http://www.overclock.net/t/1443242/the-r9-290-290x-unlock-thread


Hello again, I'm back xDD

I tried your BIOS, bichael, and it worked quite well, lowering temps by almost 10°C (in the VRM even more), but that's because it lowered the voltage on my core to 1.11v (from the 1.18 I had with the previous BIOS). The thing is, I saw two minor glitches in Heaven, so I started trying other versions of the MLU BIOS, doing the mod with a hex editor and Hawaii BIOS Reader to convert them to 290.

I tried the 1075/1375 ones, and the only one that worked without any artifacts was the +50mV one, and with that one temps were near my previous temps, only 2-3° lower. So I went back to the 1050/1375 BIOS, but with the +25mV offset, and this one worked perfectly, with 2-3° lower GPU temp and 5-6° lower VRM temp than the original BIOS at the same frequency (VID 1.14). The result in Heaven is almost the same as I had previously at 1050/1420: 1340 points (previously 1355).

Now I'm gonna try the Insanity BIOS with all the mods, to compare whether it gives me any benefit (although I'll have to overclock it manually, and this one comes with perfect clocks by default)

Thank you very much for the knowledge









PS: the switch selects the first or second BIOS, so you can do your testing in one position and keep a safe BIOS in the other to boot from if something goes wrong xD
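As a footnote on the device-ID mod described above: Hawaii BIOS Reader (or a hex editor) changes the PCI device ID the ROM reports - 0x67B0 for the 290X, 0x67B1 for the 290 - which is stored little-endian in the image. Purely as an illustration of the byte swap (this is not a flashing tool: a naive replace could hit coincidental byte pairs, and it ignores the ROM checksum, which the real tools fix up), the idea amounts to:

```python
# Device IDs for the Hawaii GPUs, in the little-endian form they
# appear in as raw bytes inside the ROM image.
R9_290X_ID = (0x67B0).to_bytes(2, "little")  # b'\xb0\x67'
R9_290_ID = (0x67B1).to_bytes(2, "little")   # b'\xb1\x67'

def patch_device_id(rom: bytes) -> bytes:
    """Swap every 290X device-ID byte pair for the 290 one."""
    if R9_290X_ID not in rom:
        raise ValueError("no 290X device ID found in this ROM image")
    return rom.replace(R9_290X_ID, R9_290_ID)

# Toy demonstration on a fake 4-byte "ROM" (0x55AA is the usual
# option-ROM signature; everything else here is made up):
fake_rom = b"\x55\xaa" + R9_290X_ID
patched = patch_device_id(fake_rom)
```

In practice you would just let Hawaii BIOS Reader do this, since it also recalculates the checksum the card's BIOS verification expects.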


----------



## Newbie2009

I managed to salvage my 290X, which died a while back. Not 100% sure what was wrong with it. A new BIOS seems to have done the job (and a couple of months' rest)


----------



## mAs81

Changed the thermal paste on my 290 Vapor-X; didn't change the thermal pads tho..
Used MX-4. I can only see a slight difference so far, since it's summertime here, but all in all everything went smoothly and I'm satisfied


----------



## Caradine

Well, I changed the thermal paste on my 290 Tri-X and made the VRM temps worse >.>
2C less on the GPU, 5C more on the VRM - I don't know what to think.


----------



## mAs81

Quote:


> Originally Posted by *Caradine*
> 
> Well, I changed the thermal paste on my 290 tri-x and made VRM temps worse >.>
> 2C less on the GPU, 5C more on the VRM - I don't know what to think.


I'm guessing that maybe we should have changed the thermal pads also.. Did you re-align them correctly on the card? I removed all the pads that were stuck on the heatsink, put them back in their place, and then reassembled the card..

On my card VRM2 is a bit hotter I believe, but it's still well within 10°C of the core temp, so I'm not worried..


----------



## Caradine

Well, the VRM pad stayed on the card and I watched it while reassembling. It was well aligned, at least as far as I could see. I didn't take the memory pads off to set them aside; those on the heatsink stayed on the heatsink and those on the card stayed on the card







I just nudged them back into place because some had moved slightly. The VRAM temp is one I wish I could check...

Also, yeah, we definitely should have; the pads are pretty sticky (??) and reapplying them when they're drier probably gave a small hit to temps...

Btw, the Vapor-X is literally my favorite card lol. And a separate heatsink for the mosfets = no problem


----------



## BIGARCANGEL

Thought I'd share a benchmarking video I just made with my two R9 290Xs under water on 2017 drivers. The overclock I used was my 24/7 clocks.


----------



## Ukkooh

What overclocking software that supports voltage control should I use with the new ReLive drivers? I previously used TriXX for my OC, but since ReLive it hasn't worked at all for me. I need +125mV for my 1200 core clock.


----------



## diggiddi

Quote:


> Originally Posted by *Ukkooh*
> 
> What overclocking software that supports voltage control should I use with the new ReLive drivers? I previously used TriXX for my OC, but since ReLive it hasn't worked at all for me. I need +125mV for my 1200 core clock.


You need a new BIOS then, not software.
MSI AB might be better for extending OC limits and voltage control, but I'm not sure it goes up to +125mV, hence the BIOS.


----------



## vegito

I recently got my hands on a 290X Gaming edition from MSI. After replacing the thermal paste it works much better, yet still runs close to 90°C. The thing that bothers me the most is that it works perfectly fine undervolted by about -90mV, but only under load; as soon as the load is gone it becomes unstable. How does that even make any sense?


----------



## Streetdragon

I would say it is the memory. Try lowering the speed of your RAM; 1300-1350 is more than fast enough.


----------



## Tame

Yay! Broke a 16K graphics score in Firestrike with my watercooled Gigabyte R9 290, and the 2013 hardware is still in one piece










http://www.3dmark.com/fs/13110104


----------



## ManofGod1000

Quote:


> Originally Posted by *Tame*
> 
> Yay! Broke 16K graphics score on Firestrike with my watercooled Gigabyte R9 290 and 2013 hardware still in one piece
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://www.3dmark.com/fs/13110104


Love it! This card has always had more to give than it was given out of the box. I imagine 2x watercooled R9 290s can probably put up some really good numbers and performance.


----------



## Tame

Quote:


> Originally Posted by *ManofGod1000*
> 
> Love it! This card has always had more to give than was given out of the box. I can imagine 2 x watercooled R 9 290's probably can put up some really good numbers and performance.


Thanks! Lately I've had some fun pushing them hard in benchmarks! They are very durable, provided you can cool them, though for daily use I run pretty conservative settings. Even if they are somewhat power hungry, they still pack good performance today, especially watercooled! One of the best-aged GPUs, in my opinion.


----------



## Cyber Locc

Hey guys, so I sold some of my old 290s to my step brother, and he is having a problem.

The clocks seem to be stuck at 2D clocks for him, not going over 300MHz in any bench or game or anything.

He is on a fresh Windows 10 install with the latest drivers. They were not doing this before the fresh Windows 10 install.


----------



## dagget3450

Quote:


> Originally Posted by *Cyber Locc*
> 
> Hey guys so I sold some of my old 290s to my step brother, and he is having a problem.
> 
> The clocks seem to be stuck at 2D clocks for him, not going over 300MHz in any bench or game or anything.
> 
> He is on a fresh windows 10 install with the latest drivers. They were not doing this before the fresh windows 10 install.


Can you list some of his system specs? Weird indeed. How about GPU-Z to check PCIe speeds too? You might also try testing with only 1 GPU first before trying CF - if you already did that, disregard.

I recall people having clock issues on the desktop but not in games/benches... it's been a while since I toyed with my 290Xs and I'm down to only 1 left. I seem to recall low clocks when a card was unstable, and I think it was fixed by a reboot/power off/on... but that was while overclocking, I guess... wish I could be more help.


----------



## Cyber Locc

Quote:


> Originally Posted by *dagget3450*
> 
> Can you list some of his system specs? Weird indeed. How about GPU-Z to check PCIe speeds too? You might also try testing with only 1 GPU first before trying CF - if you already did that, disregard.
> 
> I recall people having clock issues on the desktop but not in games/benches... it's been a while since I toyed with my 290Xs and I'm down to only 1 left. I seem to recall low clocks when a card was unstable, and I think it was fixed by a reboot/power off/on... but that was while overclocking, I guess... wish I could be more help.


He has a 6700K, an MSI Gaming 7, and IDK beyond that really.

So the issue he is having is that the system will not go to 3D clocks; when he runs a game or a bench it stays at 300 mem and 150 core - it's not switching to 3D clocks.

And it isn't just 1 card; we tried one at a time and they are both doing it.

He is not overclocking.

If you have an idea of how we can force 3D clocks permanently, he is fine with that. However, I cannot get any of the old ways to work.

We just tried GPU Tweak II and he doesn't even have an option to force 3D clocks; my 980 Ti does.


----------



## Tame

Quote:


> Originally Posted by *Cyber Locc*
> 
> He has a 6700K, an MSI Gaming 7, and IDK beyond that really.
> 
> So the issue he is having is that the system will not go to 3D clocks; when he runs a game or a bench it stays at 300 mem and 150 core - it's not switching to 3D clocks.
> 
> And it isn't just 1 card; we tried one at a time and they are both doing it.
> 
> He is not overclocking.
> 
> If you have an idea of how we can force 3D clocks permanently, he is fine with that. However, I cannot get any of the old ways to work.
> 
> We just tried GPU Tweak II and he doesn't even have an option to force 3D clocks; my 980 Ti does.


The latest AMD drivers stop 3rd party overclocking software from working correctly. I have had similar clock issues with MSI AB's unofficial OC mode and TriXX. I would try uninstalling all running OC/monitoring programs, downloading DDU and the AMD driver package, disabling the internet, running DDU, restarting, installing the drivers, restarting, and re-enabling the internet... Without touching anything else, does the problem still persist in games/benchmarks? Also, if the clocks really are 300/150, games should stutter horribly and barely run. If not, it might just be a reporting problem.


----------



## Cyber Locc

Quote:


> Originally Posted by *Tame*
> 
> The latest AMD drivers stop 3rd party overclocking software from working correctly. I have had similar clock issues with MSI AB's unofficial OC mode and TriXX. I would try uninstalling all running OC/monitoring programs, downloading DDU and the AMD driver package, disabling the internet, running DDU, restarting, installing the drivers, restarting, and re-enabling the internet... Without touching anything else, does the problem still persist in games/benchmarks? Also, if the clocks really are 300/150, games should stutter horribly and barely run. If not, it might just be a reporting problem.


It runs horribly even in Windows; it's even stuttering just watching movies.

Some more info: in the Crimson control panel, in that new OC section (forget the name, it's weird), the only clock listed under Overclocking is 300MHz; the rest say NA NA NA where the other states should be.

As for the stutter, yeah, he is getting a 1400 graphics score in Firestrike, so it is not just false reporting (19 max fps, 9 average, 0.3 minimum lol).

Also, he wasn't using any overclocking anything prior to my helping him figure out the issue, so it's not an incompatibility with Afterburner or anything.

We just DDU'ed and are trying older drivers to see if that helps.


----------



## Cyber Locc

Quote:


> Originally Posted by *Tame*
> 
> The latest AMD drivers stop 3rd party overclocking software from working correctly. I have had similar clock issues with MSI AB's unofficial OC mode and TriXX. I would try uninstalling all running OC/monitoring programs, downloading DDU and the AMD driver package, disabling the internet, running DDU, restarting, installing the drivers, restarting, and re-enabling the internet... Without touching anything else, does the problem still persist in games/benchmarks? Also, if the clocks really are 300/150, games should stutter horribly and barely run. If not, it might just be a reporting problem.


Got it. We DDU'ed again, installed the last WHQL driver from April, and bang, issue gone.

So that new driver is just, well, bad. My step brother doesn't overclock; he never even knew what Afterburner was lol. So it wasn't an OCing issue, it was a bad-driver-from-AMD issue.

He had thought he'd tried different drivers, and maybe he didn't DDU it right, I don't know. I video chatted him through it and we got it set.


----------



## Tame

Quote:


> Originally Posted by *Cyber Locc*
> 
> Got it. We DDU'ed again, installed the last WHQL driver from April, and bang, issue gone.
> 
> So that new driver is just, well, bad. My step brother doesn't overclock; he never even knew what Afterburner was lol. So it wasn't an OCing issue, it was a bad-driver-from-AMD issue.
> 
> He had thought he'd tried different drivers, and maybe he didn't DDU it right, I don't know. I video chatted him through it and we got it set.


Glad to hear you got it working


----------



## lum-x

If this isn't the best thread to ask in, I apologize.

What is the difference between *EDW2032BBBG* and *EDW2032BBBG_DEBUG2*? I want to flash my card's BIOS with the Sapphire Tri-X R9 290 OC one, but I can't find a BIOS that supports *EDW2032BBBG_DEBUG2* and has UEFI, besides the reference BIOS I got from an AMD engineer.


----------



## mus1mus

Nothing much. Any BIOS that supports either memory will work on any Elpida card.


----------



## lum-x

Is anyone aware of a BIOS for the Sapphire R9 290 4 GB Tri-X OC that supports *EDW2032BBBG* and has UEFI support? I looked at the TechPowerUp website but couldn't find any. The whole story of my card is that I changed the cooler from reference to Tri-X (I found a new cooler for $25), and it's annoying having fan speed issues with the reference BIOS.


----------



## georgebou

Hello guys.

Can anyone confirm that the Sapphire R9 290 reference design works fine with a Ryzen 7 1700 and a B350 Tomahawk motherboard?

I have no signal at my monitor, but the system boots to Windows - I can confirm it by the Windows welcome sound, and the card is also recognised in Device Manager as it should be.

I can confirm that the card is recognised after using my old graphics card and checking Device Manager with show hidden devices.

The card is not dead, but I get no signal even with a different monitor, although it worked in the previous PC I had it in.

Thanks in advance.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *georgebou*
> 
> Hello guys.
> 
> Can anyone confirm that the Sapphire R9 290 reference design works fine with the RYZEN 7 1700 and a B350 tomahawk motherboard?
> 
> I have no signal at my monitor and the system boots to windows and i can confirm it by the windows welcome sound and the card is also recognised on device manager as it should.
> 
> I can confirm that the card is recognised after using my old graphics card and checking Device Manager with show hidden devices.
> 
> The card is not dead, but i get no signal even with a different monitor, but it worked on the previous pc i had it.
> 
> Thanks in advance.


So are you running the latest driver ?

Also flick the switch to the 2nd bios and try that .


----------



## georgebou

I can't even see the bios.


----------



## rdr09

Quote:


> Originally Posted by *georgebou*
> 
> I can't even see the bios.


Try unplugging and replugging the cable from the monitor. If that works, you may have to do it each time... until you figure out a fix.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *georgebou*
> 
> I can't even see the bios.


There is a little black switch on the card dude


----------



## georgebou

Man, I mean the UEFI BIOS.

Not the card's alternative BIOS.

Read the post.

I see nothing on my monitor from the minute I press the power switch, but I can confirm the card is working by hearing the Windows sound.

That's why I am asking if anyone has it working with a Ryzen B350 Tomahawk motherboard.


----------



## Ukkooh

Are there any OC programs out there that work with the newest drivers (17.7.2)? Both TriXX and Afterburner just lock my GPU at 300MHz core. I can OC the core fine through the WattMan settings, but my mem stays at 1250MHz even if I change the setting there.


----------



## Ipak

The new Afterburner beta works great: http://forums.guru3d.com/showpost.php?p=5458447&postcount=629


----------



## Ukkooh

Quote:


> Originally Posted by *Ipak*
> 
> New afterburner beta works great http://forums.guru3d.com/showpost.php?p=5458447&postcount=629


Thank you! Had no issues with this version.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *georgebou*
> 
> Man i mean the UEFI Bios.
> 
> Not the cards alternative bios.
> 
> Read the post.
> 
> I see nothing on my monitor from the minute I press the power switch, but I can confirm the card is working by hearing the Windows sound.
> 
> Thats why i am asking if anyone has it working with RYZEN motherboard B350 tomahawk


Read this........

Well, it's about the 290 here, not the mobo BIOS.

Try a forum on the B350 then.


----------



## ElevenEleven

I'm a bit stuck here, trying to install AMD graphics driver on a Win 8.1 system with 2 MSI Lightning 290X cards installed. Windows boots fine without the AMD driver installed and shows the two GPUs as "Generic PnP" devices in device manager. But whether I try installing the driver by manually downloading it and running set-up, or by installing the auto-detection AMD software, or by pointing Windows from device manager to the driver folder location--in all cases immediately after driver installation the screen turns black and after automatic system restart, booting into Windows no longer works properly. I can get into safe mode and uninstall GPU drivers, then I can boot into Windows again, with Device Manager yet again showing the cards as Generic PnP adapters. I've used the AMD Clean Uninstall utility a few times already as well.

Is there a trick to get the drivers to install?.. I'd remove one of the cards and try installing with just the other one in, but there's a custom watercooling loop the two cards are a part of, and I really don't want to disassemble it just for the driver installation.

Thanks for any tips.


----------



## mAs81

Quote:


> Originally Posted by *ElevenEleven*
> 
> I'm a bit stuck here, trying to install AMD graphics driver on a Win 8.1 system with 2 MSI Lightning 290X cards installed. Windows boots fine without the AMD driver installed and shows the two GPUs as "Generic PnP" devices in device manager. But whether I try installing the driver by manually downloading it and running set-up, or by installing the auto-detection AMD software, or by pointing Windows from device manager to the driver folder location--in all cases immediately after driver installation the screen turns black and after automatic system restart, booting into Windows no longer works properly. I can get into safe mode and uninstall GPU drivers, then I can boot into Windows again, with Device Manager yet again showing the cards as Generic PnP adapters. I've used the AMD Clean Uninstall utility a few times already as well.
> 
> Is there a trick to get the drivers to install?.. I'd remove one of the cards and try installing with just the other one in, but there's a custom watercooling loop the two cards are a part of, and I really don't want to disassemble it just for the driver installation.
> 
> Thanks for any tips.


You could try removing all traces of the drivers completely using this guide and then re-installing them..
I know it seems like a lot of work, but after all the times I've done it, it's easy for me now

Black screens used to plague a lot of 290/290X owners indeed.. Supposedly if you're on Elpida memory and a UEFI BIOS, switching your BIOS to legacy mode makes it stop, so it's worth a try I guess..

More info on that, here..


----------



## georgebou

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Read this ........
> 
> Well its about the 290 here not the mobo bios .
> 
> Try a forum on the B350 then


What part of 'has anyone got the 290 on a Ryzen system' don't you understand?

The question is very clear, and the BIOS is mentioned to describe the problem with the card.

Possibly the card is defective, but I want to be 100% sure.


----------



## rdr09

Quote:


> Originally Posted by *ElevenEleven*
> 
> I'm a bit stuck here, trying to install AMD graphics driver on a Win 8.1 system with 2 MSI Lightning 290X cards installed. Windows boots fine without the AMD driver installed and shows the two GPUs as "Generic PnP" devices in device manager. But whether I try installing the driver by manually downloading it and running set-up, or by installing the auto-detection AMD software, or by pointing Windows from device manager to the driver folder location--in all cases immediately after driver installation the screen turns black and after automatic system restart, booting into Windows no longer works properly. I can get into safe mode and uninstall GPU drivers, then I can boot into Windows again, with Device Manager yet again showing the cards as Generic PnP adapters. I've used the AMD Clean Uninstall utility a few times already as well.
> 
> Is there a trick to get the drivers to install?.. I'd remove one of the cards and try installing with just the other one in, but there's a custom watercooling loop the two cards are a part of, and I really don't want to disassemble it just for the driver installation.
> 
> Thanks for any tips.


No need to pull out either card. Just unplug power from the secondary card: shut down the system, then carefully unplug the power cable from the card. After the driver install, you can shut the system down again to replug the power to the secondary card.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *georgebou*
> 
> What part of 'has anyone got the 290 on a Ryzen system' don't you understand?
> 
> The question is very clear, and the BIOS is mentioned to describe the problem with the card.
> 
> Possibly the card is defective, but I want to be 100% sure.


I remember years ago I had that issue with my 1st reference 290. I am certain it's the card's BIOS that's doing it.

My fix at the time was Shimano's modded BIOS flash and the newest mobo BIOS...................

Also, did you flick the card's BIOS switch over to the 2nd position??


----------



## Raephen

Quote:


> Originally Posted by *georgebou*
> 
> Hello guys.
> 
> Can anyone confirm that the Sapphire R9 290 reference design works fine with the RYZEN 7 1700 and a B350 tomahawk motherboard?
> 
> I have no signal at my monitor and the system boots to windows and i can confirm it by the windows welcome sound and the card is also recognised on device manager as it should.
> 
> I can confirm that the card is recognised after using my old graphics card and checking Device Manager with show hidden devices.
> 
> The card is not dead, but i get no signal even with a different monitor, but it worked on the previous pc i had it.
> 
> Thanks in advance.


My Sapphire reference R9 290 has worked flawlessly with my MSI B350M Mortar Arctic & Ryzen 5 1600X from day 1 of my Ryzen setup, on both the DVI and DisplayPort outputs. The only issue I've ever run into is the UEFI not scaling correctly sometimes, but a reboot or, in the worst case, a power down and disconnect for maybe a few minutes solves it.

Whilst not exactly the same chip and mobo, it's similar enough, I'd think.


----------



## cephelix

Quote:


> Originally Posted by *ElevenEleven*
> 
> I'm a bit stuck here, trying to install AMD graphics driver on a Win 8.1 system with 2 MSI Lightning 290X cards installed. Windows boots fine without the AMD driver installed and shows the two GPUs as "Generic PnP" devices in device manager. But whether I try installing the driver by manually downloading it and running set-up, or by installing the auto-detection AMD software, or by pointing Windows from device manager to the driver folder location--in all cases immediately after driver installation the screen turns black and after automatic system restart, booting into Windows no longer works properly. I can get into safe mode and uninstall GPU drivers, then I can boot into Windows again, with Device Manager yet again showing the cards as Generic PnP adapters. I've used the AMD Clean Uninstall utility a few times already as well.
> 
> Is there a trick to get the drivers to install?.. I'd remove one of the cards and try installing with just the other one in, but there's a custom watercooling loop the two cards are a part of, and I really don't want to disassemble it just for the driver installation.
> 
> Thanks for any tips.


Are you installing the latest drivers? Cos I had the same problem when trying to install 17.7.1. I went so far as to dismantle my system and test it piece by piece, and to reformat my SSD. I decided to roll back to 16.X.X and everything runs fine.


----------



## rdr09

Quote:


> Originally Posted by *cephelix*
> 
> Are you installing the latest drivers? Cos I had the same problem when trying to install 17.7.1. I went so far as to dismantle my system and test it piece by piece, and to reformat my SSD. I decided to roll back to 16.X.X and everything runs fine.


I recently installed the latest driver over 16.X.X - just installed it over the top, no DDU. It runs fine, except I can't use TriXX.


----------



## cephelix

Quote:


> Originally Posted by *rdr09*
> 
> I recently installed the latest driver over 16.X.X. Just installed it over. No DDU. Runs fine except i can't use Trixx.


Hmm, well, initially I couldn't even install it, and I DDU-ed the crap out of it. Halfway through installation, when the monitor flickers, it just bluescreens.


----------



## ElevenEleven

Quote:


> Originally Posted by *mAs81*
> 
> You could try removing completely all traces of the drivers using this guide and then try re-installing them..
> I know it seems like a lot of work,but after all the times I've done it , it's easy for me now
> 
> Black screens used to plague a lot of 290/290X owners indeed..Supposedly if you're on Elpida memory and UEFI bios , turning your bios on legacy mode makes it stop,so it's worth a try I guess..
> 
> More info on that , here ..


Quote:


> Originally Posted by *rdr09*
> 
> No need to pull out either card. Just unplug power from the secondary card: shut down the system, then carefully unplug the power cable from the card. After the driver install, you can shut the system down again to replug the power to the secondary card.


Quote:


> Originally Posted by *cephelix*
> 
> Are you installing the latest drivers? Cos I had the same problem when trying to install 17.7.1. I went so far as to dismantle my system and test it piece by piece, and to reformat my SSD. I decided to roll back to 16.X.X and everything runs fine.


Thank you all!

1. The BIOS switch appeared to be in the "Normal" position on both cards, not LN2. I've switched it to LN2 as a test, and no positive change

2. I've upgraded to Windows 10 (64-bit, Pro), same issue with the video driver

3. I've unplugged power from the second (bottom) card on the ASUS Rampage IV Black Edition motherboard, so only one card is recognized by the system for now as a test. It connects to one monitor via HDMI cable.

4. I can boot into Windows no problem AS LONG as the radeon driver is not installed. As soon as I attempt to install the radeon driver (or Windows 10 attempts to install it automatically with Windows Update), the screen turns black upon installation completion, system auto-reboots with all the fans at full speed and error code "6F" displayed on the motherboard. Then after a while of that, the system auto-shuts down. If I attempt to reboot, I can't get back into Windows normally (screen turns black but the monitor light is still on, right after Windows loading screen) until I go into Safe Mode and uninstall/delete Radeon driver. I can then boot into Windows normally with the GPU showing as "Microsoft Basic Display Adapter" on the Device Manager list.

5. In Safe Mode, going into Device Manager, it shows video card identified as Radeon R9 200 Series after radeon driver installation. So for some reason I can get into Safe Mode with the driver installed, but not into full Windows until I uninstall the graphics driver from Safe Mode.

6. _With_ Radeon drivers installed, sometimes after rebooting I see the VIDEO_TDR_FAILURE error on a blue screen.

7. I can always get into Safe Mode with or without Radeon driver installed. I can also always get into full Windows without Radeon driver installed. I only can't enter full Windows and see error(s) with Radeon driver installed.

I'm not sure how to proceed and why I can clearly use the card with windows UNTIL I install the graphics driver. This time the driver was for Windows 10 (previous times I tried it was for Windows 8.1 before the upgrade). I used the web install / auto-detect version just in case. Clean install.


----------



## rdr09

Quote:


> Originally Posted by *ElevenEleven*
> 
> Thank you all!
> 
> 1. The BIOS switch appeared to be in the "Normal" position on both cards, not LN2. I've switched it to LN2 as a test, and no positive change
> 
> 2. I've upgraded to Windows 10 (64-bit, Pro), same issue with the video driver
> 
> 3. I've unplugged power from the second (bottom) card on the ASUS Rampage IV Black Edition motherboard, so only one card is recognized by the system for now as a test. It connects to one monitor via HDMI cable.
> 
> 4. I can boot into Windows no problem AS LONG as the radeon driver is not installed. As soon as I attempt to install the radeon driver (or Windows 10 attempts to install it automatically with Windows Update), the screen turns black upon installation completion, system auto-reboots with all the fans at full speed and error code "6F" displayed on the motherboard. Then after a while of that, the system auto-shuts down. If I attempt to reboot, I can't get back into Windows normally (screen turns black but the monitor light is still on, right after Windows loading screen) until I go into Safe Mode and uninstall/delete Radeon driver. I can then boot into Windows normally with the GPU showing as "Microsoft Basic Display Adapter" on the Device Manager list.
> 
> 5. In Safe Mode, going into Device Manager, it shows video card identified as Radeon R9 200 Series after radeon driver installation. So for some reason I can get into Safe Mode with the driver installed, but not into full Windows until I uninstall the graphics driver from Safe Mode.
> 
> 6. _With_ Radeon drivers installed, sometimes after rebooting I see the VIDEO_TDR_FAILURE error on a blue screen.]
> 
> 7. I can always get into Safe Mode with or without Radeon driver installed. I can also always get into full Windows without Radeon driver installed. I only can't enter full Windows and see error(s) with Radeon driver installed.
> 
> I'm not sure how to proceed and why I can clearly use the card with windows UNTIL I install the graphics driver. This time the driver was for Windows 10 (previous times I tried it was for Windows 8.1 before the upgrade). I used the web install / auto-detect version just in case. Clean install.


Try the steps about the Device Driver Uninstall.

http://www.overclock.net/t/1414916/amd-display-driver-failing-to-install-solved/10

If that still fails, then my last suggestion is to clear the CMOS. But save your OC settings on the CPU first if it is OC'ed.
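For reference, here's a minimal sketch of the kind of leftover-driver cleanup that thread walks through, using Windows 10's built-in `pnputil` (an assumption on my part — the linked guide may use DDU or a different tool). It's written dry-run style, echoing the commands instead of executing them, so you can review the sequence before running it for real from Safe Mode:

```shell
# Dry-run sketch: echo each Windows command instead of executing it,
# so the sequence can be reviewed before running it for real in Safe Mode.
run() { echo "$*"; }

# List third-party driver packages; the AMD display driver shows up under
# the "Display adapters" class with a published name like oemNN.inf.
run pnputil /enum-drivers

# 'oem42.inf' is a placeholder: substitute the published name that
# /enum-drivers actually reports for the AMD display driver.
run pnputil /delete-driver oem42.inf /uninstall /force
```

The point of removing the stale driver package first is that Windows Update otherwise re-installs the same broken driver the moment you boot back into normal mode.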
Quote:


> Originally Posted by *cephelix*
> 
> Hmm, well, initially I couldn't even install. And I DDU-ed the crap out of it. Halfway through installation, when the monitor flickers, it just bluescreens.


That sucks. When I was on the 16.x drivers, I was only getting 10.5K in Graphics in Fire Strike at the stock 945MHz. With 17.x, it is at 11.4K. Sorry, I use Fire Strike as a gauge.


----------



## georgebou

Quote:


> Originally Posted by *Raephen*
> 
> My Sapphire reference R9 290 has worked flawlessly with my MSI B350M Mortar Arctic & Ryzen 5 1600X from day 1 of my Ryzen setup, both the DVI and DisplayPort outputs. The only issue I've ever run into is UEFI not scaling correctly sometimes, but a reboot or, in the worst case, a power down and disconnect for maybe a few minutes solves it.
> 
> Whilst not exactly the same chip and mobo, similar enough, I'd think.


Ok thank you.

It seems the card is defective then and i will see what i can do.

Thanks mate.


----------



## ElevenEleven

Quote:


> Originally Posted by *rdr09*
> 
> Try the steps about the Device Driver Uninstall.
> 
> http://www.overclock.net/t/1414916/amd-display-driver-failing-to-install-solved/10
> 
> If still fails, then my last suggestion is to clear the cmos. BUT, save your oc settings on the cpu first if it is oc'ed.


I've already tried what's suggested in that thread, but my problem is different. The drivers do install, it's just that they don't work properly.
. . .

Installed and ran Hawaii Info v1.2. Here's my result: Samsung memory? o.o I do see that the MSI Lightning 290X uses Samsung memory in GPU reviews. So then I should not be seeing Elpida-related black screen issues. Still stuck troubleshooting this...


----------



## rdr09

Quote:


> Originally Posted by *ElevenEleven*
> 
> I've already tried what's suggested in that thread, but my problem is different. The drivers do install, it's just that they don't work properly.
> . . .
> 
> Installed and ran Hawaii Info v1.2. Here's my result, Samsung memory? o.o I do see that MSI Lightning 290x uses Samsung memory in GPU reviews. So then I should not be seeing Elpida-related black screen issues. Still stuck troubleshooting this...


I'm sorry if I missed it. Did you try installing the driver using the other card? Power up the secondary, unplug the primary, and connect the monitor cable to the secondary card.

If you have not cleared the CMOS, try that too.


----------



## ElevenEleven

Quote:


> Originally Posted by *rdr09*
> 
> Im sorry if i missed it. Did you try installing the driver using the other card? Power up the secondary, unplug the primary and connect monitor cable to the secondary card.
> 
> If you have not cleared cmos. Try that too.


That actually did seem to help. I disconnected power from the top GPU and reconnected the bottom, then connected the monitor via HDMI to the bottom GPU. Drivers installed fine now; no more black screen.

Had a surprising display scaling issue, which I've now worked around by setting HDMI scaling to 3% in Radeon Settings. (The displayed image was shifted diagonally toward the bottom left a bit, so the desktop icons are slightly cut off on the left side and the Windows start menu button is partially hidden to the left. It wasn't like that to begin with, but the screen blinked a few times changing resolutions and then settled in this offset configuration, which I've not yet been able to correct.)

So I don't know; two options: either something is off with the top card or the motherboard settings/socket (I doubt it, since the card does project the image fine until I install the Radeon driver), OR perhaps after installing the Radeon driver, Windows somehow wants to use the bottom slot as the default image output slot, which was not properly connected. Also doubtful, as I got blue screen errors periodically when trying to boot with the top card connected.

P.S.: clearing CMOS did not help earlier.
P.P.S.: I'll try reconnecting power to the top card now.


----------



## rdr09

Quote:


> Originally Posted by *ElevenEleven*
> 
> That actually did seem to help. Disconnected power from the top GPU and reconnected the bottom. Connected the monitor via HDMI cord to the bottom GPU. Drivers installed fine now, no more black screen.
> 
> Had a surprising display scaling issue which I've now fixed by setting HDMI scaling to 3% in Radeon settings. (The displayed image on the screen was shifted diagonally to the bottom left a bit, so like the desktop icons are a bit cut off on the left side, and the Windows start menu button icon is partially hidden to the left. Wasn't like that to begin with, but the screen blinked a few times changing resolutions and then settled in this offset configuration, which I've not yet been able to correct.)
> 
> So I don't know, two options, either something off with the top card or the motherboard settings/socket (I doubt it, since the card does project the image fine until I install radeon driver), OR perhaps after installing the radeon driver, somehow Windows wants to use the bottom slot as default image output slot, which is not properly connected. Also doubtful as I got blue screen errors periodically when trying to boot with the top card connected.
> 
> P.S.: clearing CMOS did not help earlier.
> P.P.S.: I'll try reconnecting power to the top card now.


Is this the "Auto Detect" driver? If it is, then manually download the right driver and reinstall. To uninstall the old driver, run the installer like you are about to install until you get to the window where you are given a choice to install or uninstall. Uninstall it using the Express method, then install the manually chosen driver.

If you have another machine, then download it there and just save it on a flash drive. Hope you get it fixed and that no card is borked.


----------



## ElevenEleven

Quote:


> Originally Posted by *rdr09*
> 
> Is this the "Auto Detect" driver? If it is, then manually download the right driver and reinstall. To uninstall the old driver, run the installer like you are about to install until you get to the window where you are given a choice to install or uninstall. Uninstall it using the Express method, then install the manually chosen driver.


Yes the current driver is the Auto-Detect type from AMD. I did try manually linking to driver folder for two different driver versions from Device Manager before with the top card, and the behavior of the system was the same (still the black screen and periodic blue screen errors).


----------



## rdr09

Quote:


> Originally Posted by *ElevenEleven*
> 
> Yes the current driver is the Auto-Detect type from AMD. I did try manually linking to driver folder for two different driver versions from Device Manager before with the top card, and the behavior of the system was the same (still the black screen and periodic blue screen errors).


Go back to the AMD site and search the driver for your 290X manually. Make sure you pick the right OS.


----------



## ElevenEleven

Quote:


> Originally Posted by *rdr09*
> 
> Go back to the AMD site and search the driver for your 290X manually. Make sure you pick the right OS.


I did that first when I began this process. Then, because of the errors, I tried the auto-detect method. So basically I've tried it both ways.


----------



## Raephen

Quote:


> Originally Posted by *georgebou*
> 
> Ok thank you.
> 
> It seems the card is defective then and i will see what i can do.
> 
> Thanks mate.


Not necessarily. If you have a secondary PC to try your card in, or a friend with a PC who would let you test the card, that could confirm or eliminate that option. If you have a cheap PCIe GPU, you could swap that one in to check. I used to have one lying around for that same purpose, but it's found a new home in my step-dad's PC.

Whilst the GPU does look like the prime suspect, it could always be the motherboard (it is, after all, a budget solution).

Do try to test before you order an expensive GPU only to find the same issue.

Best of luck!


----------



## ElevenEleven

Quote:


> Originally Posted by *rdr09*
> 
> Go back to the AMD site and search the driver for your 290X manually. Make sure you pick the right OS.


Ok, I've reconnected power cables to the top card and booted the computer (display still connected to the bottom card). Booted fine into Windows 10, though only one GPU is seen in Device Manager or by GPU-Z. Latest Radeon driver was installed with 1 GPU powered. (EDIT: I've just again done a clean install, this time with both GPUs powered and the bottom one connected to the monitor, same thing, top one is not detected but otherwise no black screen). Seems like the top slot GPU is not being recognized by Windows for some reason.

No "Crossfire" options in the Radeon Settings menu, as shown here:
http://support.amd.com/en-us/kb-articles/Pages/How-to-Configure-AMD-CrossFire-Using-AMD-Radeon-Settings.aspx


Also some weird overscan wonkiness via HDMI at 1920x1080, the native resolution of the monitor I'm using to test this computer with. Using scaling in Radeon settings at 3% does fit the image properly, but then everything is fuzzy.


----------



## rdr09

Quote:


> Originally Posted by *ElevenEleven*
> 
> Ok, I've reconnected power cables to the top card and booted the computer (display still connected to the bottom card). Booted fine into Windows 10, though only one GPU is seen in Device Manager or by GPU-Z. Latest Radeon driver was installed with 1 GPU powered. (EDIT: I've just again done a clean install, this time with both GPUs powered and the bottom one connected to the monitor, same thing, top one is not detected but otherwise no black screen). Seems like the top slot GPU is not being recognized by Windows for some reason.
> 
> No "Crossfire" options in the Radeon Settings menu, as shown here:
> http://support.amd.com/en-us/kb-articles/Pages/How-to-Configure-AMD-CrossFire-Using-AMD-Radeon-Settings.aspx
> 
> Also some weird overscan wonkiness via HDMI cord at 1920x1080p, native resolution of the monitor I'm using to test this computer with. Using scaling in Radeon settings at 3% does fit the image properly, but then everything is fuzzy.


Have you tried reconnecting monitor cable to the top card? Shut system down first.

I'll check back tomorrow.


----------



## ElevenEleven

Quote:


> Originally Posted by *rdr09*
> 
> Have you tried reconnecting monitor cable to the top card? Shut system down first.
> 
> I'll check back tomorrow.


That was actually my bad, I did not properly power the top card (a 1x molex connector was not all the way in). After pushing all the power connectors firmly in, the power LEDs on the top card properly lit up orange.

So with BOTH cards powered: if I connect the HDMI cable from the monitor to the bottom card, the monitor never comes on. If I connect the HDMI cable to the top card, the monitor comes on as normal until it gets to the Windows loading screen. There, after the blue Windows logo and the "loading" circle finish, the screen goes black indefinitely. And by black I don't mean that it powers off as if there's no signal; it actually displays a black background.

If I only use the top card (power removed from the bottom card), same bad scenario as long as Radeon driver is installed.

If I only use the bottom card (power removed from the top card), all is well with or without Radeon drivers installed.

For reference, the motherboard is the ASUS Rampage IV Black Edition. The top card is in PCI-E slot 1, the bottom card in slot 4. The MB manual recommends using a single card in slot 1 instead of 4 if more stuff is connected in slot 6 (there's a sound card in 6, so at the moment the bottom GPU's PCI-E slot operates in x8 mode).


----------



## cephelix

@rdr09 interesting. How does it translate to normal gaming though? I haven't bothered with firestrike in years since I last set my OC. Plays modded FO4 fine for the most part but stumbles when I install more graphically demanding ones.

@ElevenEleven sorry to hear about your troubles mate. Don't know how else I could help. Hopefully everything gets fixed soon


----------



## rdr09

Quote:


> Originally Posted by *cephelix*
> 
> @rdr09 interesting. How does it translate to normal gaming though? I haven't bothered with firestrike in years since I last set my OC. Plays modded FO4 fine for the most part but stumbles when I install more graphically demanding ones.
> 
> @ElevenEleven sorry to hear about your troubles mate. Don't know how else I could help. Hopefully everything gets fixed soon


I don't see any difference in a game. Just playing Unturned at the moment; not really a very demanding game.

Single: http://www.3dmark.com/3dm/21424355?

Crossfire: http://www.3dmark.com/3dm/21424992

Quote:


> Originally Posted by *ElevenEleven*
> 
> That was actually my bad, I did not properly power the top card (a 1x molex connector was not all the way in). After pushing all the power connectors firmly in, the power LEDs on the top card properly lit up orange.
> 
> So with BOTH cards powered, if I connect the HDMI cable to the bottom card from the monitor, the monitor never comes on. If I connect the HDMI cable to the top card, the monitor comes on as normal until it gets to the Windows loading screen. There after the blue windows logo and the "loading" circle finish, screen goes black indefinitely. And by black I don't mean that it powers off as if there's no signal. More like it actually displays a black background.
> 
> If I only use the top card (power removed from the bottom card), same bad scenario as long as Radeon driver is installed.
> 
> If I only use the bottom card (power removed from the top card), all is well with or without Radeon drivers installed.
> 
> For reference, the motherboard is ASUS Rampage Black Edition. Top card is in PCI-E slot 1, bottom card is in slot 4. MB manual recommends using a single card in slot 1 instead of 4, if more stuff is connected in slot 6 (there's a sound card in 6, so at the moment the bottom GPU PCI-E slot operates in X8 mode).


x8 is fine with our cards. How's the display, though? Still screwed up?


----------



## ElevenEleven

Quote:


> Originally Posted by *rdr09*
> 
> x8 is fine with our cards. How's the display, though? Still screwed up?


Yes, still either fuzzy at 3% HDMI scaling or the projected image is out of bounds a bit.


----------



## rdr09

Quote:


> Originally Posted by *ElevenEleven*
> 
> Yes, still either fuzzy at 3% HDMI scaling or the projected image is out of bounds a bit.


If it were me, I would re-download the latest driver and install it using the secondary card with the primary card's power unplugged. You can first uninstall the existing driver using the same freshly downloaded driver, shut the system down, and install.

Or

You can just install the freshly downloaded driver over the existing one using the Express method.

Edit: Could the sound card have something to do with your issues?


----------



## mus1mus

I'd install the cards one by one. Starting from the first card having issues.

Or swap them around.

How are you @rdr09?


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> I'd install the cards one by one. Starting from the first card having issues.
> 
> Or swap them around.
> 
> How are you @rdr09?


Hi mus, back to reality after two months there. Only stayed 6 days in Mnl, in BGC. It looks like modern NYC.

Almost bought an R7, but they were asking 10K for the Krait. So I'll just wait for TR.

Eleven's GPUs are watercooled.


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> I'd install the cards one by one. Starting from the first card having issues.
> 
> Or swap them around.
> 
> How are you @rdr09?
> 
> 
> 
> Hi mus, back to reality after two months there. Only stayed 6 days. in Mnl in BGC. It looks like modern NYC.
> 
> Almost bought a R7 but they were asking 10K for the Krait. So, i'll just wait for TR.
> 
> Eleven's gpu are watercooled.

Good thing you didn't grab that Krait.

That place is nice and fairly newly developed. I live ~30 minutes away.

Threadripper is turning out to be great btw. Will have to grab one as well.


----------



## ElevenEleven

Quote:


> Originally Posted by *rdr09*
> 
> If it was me, i would re-download the latest driver and install it using the Secondary card with the primary card power unplugged. You can first uninstall existing driver using the same freshly downloaded driver, shutdown system, and install.
> 
> Or
> 
> You can just simply install the freshly downloaded driver over existing one using express method.
> 
> Edit: Could it be that the sound card has to do with your issues?


I had a thought yesterday evening that maybe something was corrupted in the top card's BIOS... Ended up looking for the latest BIOS for MSI 290X Lightning cards and learned that there's a UEFI BIOS available upon request on the MSI forums. Submitted my request with the GPU serial numbers and flashed the UEFI BIOS today (had to use ATIFLASH from Windows, as the DOS method was not working), and both cards are now operational!! Latest Radeon driver installed. I also did a fresh Windows 10 install, just in case.
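For anyone repeating this, here's a rough sketch of a typical ATIFLASH session. Hedged heavily: switch behavior varies between atiflash versions, the filenames are placeholders, and flashing the wrong ROM can brick a card, so always save a backup first. It's written dry-run style (the commands are only printed, never executed):

```shell
# Dry-run sketch of a typical ATIFLASH session (Windows command-line build,
# run from an elevated prompt). Commands are echoed, not executed.
flash() { echo "atiflash $*"; }

flash -i                  # list adapters and their current BIOS versions
flash -s 0 backup.rom     # save the existing BIOS of adapter 0 first
flash -p 0 uefi_bios.rom  # program adapter 0 with the new UEFI BIOS image
```

With a saved `backup.rom`, a bad flash can usually be recovered by booting from the other card (or integrated graphics) and programming the backup back onto the same adapter index.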


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> Good thing you didn't grab that Krait.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That place is nice and fairly newly developed. I live ~30 minutes away.
> 
> Threadripper is turning out to be great btw. Will have to grab one as well.


Fell in love with MSI boards when I had the cheap MSI X99 SLI. That was solid.

So close, I'd be hanging out there every day. Lol. But Palawan was mind-blowing. Stayed in Lagen for 2 nights.

I'll wait till the prices normalize and the bugs get sorted out on TR.

Quote:


> Originally Posted by *ElevenEleven*
> 
> I had a thought yesterday evening that maybe something was corrupted in the top card's BIOS... Ended up looking for the latest BIOS for MSI 290X Lightning cards and learned that there's a UEFI BIOS available upon request on MSI forums. Submitted my request with GPU serial numbers and flashed the UEFI BIOS today (had to use ATIFLASH from Windows, as DOS method was not working), and both cards are now operational!! Latest Radeon driver installed. I also did a fresh windows 10 install, just in case.


Great. Happy for you bud.

Edit: I doubt you are OC'ing with 2 Lightnings, but if you are, there is a latest version of Afterburner that works with 17.7.2. Have not tried crossfire, though; I'll check.

OC'ed both my 290s using AB @ 1100 core. It works, but adding voltage is limited to +100. Don't know how to go over that limit.

http://www.3dmark.com/3dm/21522850?


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Good thing you didn't grab that Krait.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> That place is nice and fairly newly developed. I live ~30 minutes away.
> 
> Threadripper is turning out to be great btw. Will have to grab one as well.
> 
> 
> 
> Fell in love with msi boards when i had the cheap msi x99 sli. That was solid.
> 
> So close, i'd hanging out there every day. Lol. But Palawan was mind blowing. Stayed in Lagen for 2 nites.
> 
> I'll wait till the prices normalize and bugs get sorted out on the TR.
> 
> Quote:
> 
> 
> 
> Originally Posted by *ElevenEleven*
> 
> I had a thought yesterday evening that maybe something was corrupted in the top card's BIOS... Ended up looking for the latest BIOS for MSI 290X Lightning cards and learned that there's a UEFI BIOS available upon request on MSI forums. Submitted my request with GPU serial numbers and flashed the UEFI BIOS today (had to use ATIFLASH from Windows, as DOS method was not working), and both cards are now operational!! Latest Radeon driver installed. I also did a fresh windows 10 install, just in case.
> 
> 
> Great. Happy for you bud.
> 
> Edit: I doubt if you are oc'ing with 2 lightnings but if you are, there is the latest version of Afterburner that works with 17.7.2. Have not tried crossfire, though, i'll check.
> 
> Oc'ed both my 290s using AB @ 1100 core. It works, but adding voltage is limited to +100. Don't know how to go over that limit.
> 
> http://www.3dmark.com/3dm/21522850?

The same MSI AB OverVoltage code works prior to the latest beta driver. But I have not been able to go past 1300/1625 lately, which is bad considering I could do more than 1300 on my 290. -- Sniped the 290's TS, FS, and 3DM11 records using an all-AMD rig, btw.

MSI's are doing great on X370 TBH. Just expensive.

Finally, I'm glad that you enjoyed your vacation here, bud. Regardless of the place you visit, relaxing and having the time to forget about the normal daily routine matter more.


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> Same MSI AB OverVoltage code works prior to the latest Beta Driver. But I have not been able to do past 1300/1625 lately. Which is bad considering, I could do more than 1300 on my 290. -- sniped 290's TS, FS, and 3DM11 records using an all-AMD-rig btw.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MSI's are doing great on X370 TBH. Just expensive.
> 
> Finally, I'm glad that you enjoyed your vacation here bud. Irregardless of the place you visit, relaxing, and having the time to forget about normal daily routine matter more.


Now I remember that about Afterburner. I got used to Trixx. My first 290 can OC as much as well; my second one, which I got after a year, can only do 1250.

I've been thinking of just settling for a 1700.

Thanks.


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Same MSI AB OverVoltage code works prior to the latest Beta Driver. But I have not been able to do past 1300/1625 lately. Which is bad considering, I could do more than 1300 on my 290. -- sniped 290's TS, FS, and 3DM11 records using an all-AMD-rig btw.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> MSI's are doing great on X370 TBH. Just expensive.
> 
> Finally, I'm glad that you enjoyed your vacation here bud. Irregardless of the place you visit, relaxing, and having the time to forget about normal daily routine matter more.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Now i remember that about Afterburner. I got used to Trixx. My first 290 can oc as much as well. My second one i got after a year can only do 1250.
> 
> I've been thinking of just settling for a 1700.
> 
> Thanks.

Trixx won't work with the current drivers. However, using it alongside MSI AB will give you the +200 voltage.

Set +200 on Trixx, close the app, and use MSI AB for the clocks.

Or simply use this trick:
https://forums.overclockers.co.uk/threads/290x-290-voltage-control-with-msi-ab-stock-bios-guide.18556274/

Some cards require wi6, some wi4. You already know this tho.

The 1700X is cheaper but has shown inferior OC results compared to 1800Xs. Still, luck plays its part. But IMO, with the release of TR, you might wanna look at the 1900X, or even the non-X versions of them, in the coming months.

The 1900X is interesting, as there is a chance the enabled cores are stronger since they can choose which ones to enable on the 2 CCX modules.

You essentially get better boards. And a monster of a setup that will last longer than the Ryzen platform. At least, that's how I see it.


----------



## Vellinious

HIS iTurbo allowed for +400mV... I preferred it for overclocking on AMD GPUs. It worked great for the 290X I had.


----------



## mus1mus

HIS iTurbo will black screen on the latest drivers.


----------



## Vellinious

Blackscreened on the old drivers, above a certain voltage point.


----------



## Tame

When MSI AB still worked, I used it + Cheat Engine to set higher voltages and clocks than would have otherwise been possible. Worked great


----------



## mus1mus

Quote:


> Originally Posted by *Vellinious*
> 
> Blackscreened on the old drivers, above a certain voltage point.


This time is different. Black screen on applying settings, IIRC. Or the GPU doing 300MHz when you apply an OC.


----------



## wolf9466

All these Windows driver woes remind me of why I use only Linux - ESPECIALLY with AMD GPUs. Not only do I have flawless control of the core voltage, I have the same for the GDDR5 (MVDDC) on my DirectCU II 290X.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> This time is different. Black screen on settings IIRC. Or GPU doing 300MHz when you apply an OC.


That sucks


----------



## mus1mus

Quote:


> Originally Posted by *wolf9466*
> 
> All these Windows driver woes remind me of why I use only Linux - ESPECIALLY with AMD GPUs. Not only do I have flawless control of the core voltage, I have the same for the GDDR5 (MVDDC) on my DirectCU II 290X.


Voltage control for memory?

Not a benching OS tho.

Quote:


> Originally Posted by *Vellinious*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> This time is different. Black screen on settings IIRC. Or GPU doing 300MHz when you apply an OC.
> 
> 
> 
> That sucks

But MSI AB can give you the same +400.

Maybe even more. Haven't tried.


----------



## wolf9466

Quote:


> Originally Posted by *mus1mus*
> 
> Quote:
> 
> 
> 
> Originally Posted by *wolf9466*
> 
> All these Windows driver woes remind me of why I use only Linux - ESPECIALLY with AMD GPUs. Not only do I have flawless control of the core voltage, I have the same for the GDDR5 (MVDDC) on my DirectCU II 290X.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Voltage control for memory?
> 
> Not a benching OS tho.
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vellinious*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> This time is different. Black screen on settings IIRC. Or GPU doing 300MHz when you apply an OC.
> 
> 
> That sucks
> 
> 
> But MSI AB can give you the same +400.
> 
> Maybe even more. Haven't tried.

Yes, indeed! There's a little uP1801 on it that I use!


----------



## ElevenEleven

Quote:


> Originally Posted by *ElevenEleven*
> 
> I had a thought yesterday evening that maybe something was corrupted in the top card's BIOS... Ended up looking for the latest BIOS for MSI 290X Lightning cards and learned that there's a UEFI BIOS available upon request on MSI forums. Submitted my request with GPU serial numbers and flashed the UEFI BIOS today (had to use ATIFLASH from Windows, as DOS method was not working), and both cards are now operational!! Latest Radeon driver installed. I also did a fresh windows 10 install, just in case.


So... everything worked perfectly, through restarts and such, even a driver update. Then I powered off the computer for a couple of days (busy, didn't work on it). Came back, powered it on, and was greeted with the black screen again right after the blue Windows loading screen, just before the user login screen is supposed to appear =/

Same thing once again: no problem with just the bottom card connected; top+bottom or just the top connected = black screen after the Windows logo. But I can still get into the BIOS or safe mode no problem with the top card.

This is so frustrating. It WORKED fine for a while, I have no idea why, then it just didn't. Looked through Windows Update for any unintended updates/driver changes, and there wasn't anything listed that I could identify...

I know that it's the Radeon driver that messes things up. As soon as the driver is applied to the top card, it ends in a black screen.


----------



## Matt-Matt

Quote:


> Originally Posted by *Tame*
> 
> When MSI AB still worked, I used it + Cheat Engine to set higher voltages and clocks than would have otherwise been possible. Worked great


It doesn't work anymore?

I'm trying to play FH3, and it's been suggested that my drivers being on 17.1.1 is causing issues.

Have just downloaded 17.7 now.. Let's see how I go; can't live without my 1179/1450 clocks..


----------



## Aussiejuggalo

So, err, question: what are the signs of a dying GPU?

My card's been running at 60-70°C for the past 2 days for no reason, and the thing just crashed when I opened MSI Afterburner...


----------



## alancsalt

Has your TIM dried out?


----------



## Aussiejuggalo

Quote:


> Originally Posted by *alancsalt*
> 
> Has your TIM dried out?


Shouldn't have; I only put it on in July. Then again, it is the Noctua stuff, so who knows. The weird thing is it's showing 60-70°C in HWiNFO but the fans aren't ramping up.

Edit: GPU-Z is showing the same temps.

As long as it lasts another couple weeks I can live with it; changing to a 1080, seeing Vega didn't impress.


----------



## Tame

Quote:


> Originally Posted by *Matt-Matt*
> 
> It doesn't work anymore?
> 
> I'm trying to play FH3 and it's been suggested that my drivers being on 17.1.1 is causing issues.
> 
> Have just downloaded 17.7 now.. Lets see how I go, can't live without 1179/1450 clocks..


Well, the 17.7.2 drivers broke the OC utilities etc., so yeah... Still works on the older drivers.
For me it would only be an issue with max-overclock benchmarking, since I use a customized BIOS for daily clocks...


----------



## mus1mus

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *alancsalt*
> 
> Has your TIM dried out?
> 
> 
> 
> Shouldn't have I only put it on in July, then again it is the Noctua stuff so who knows. The weird thing is it's showing 60 - 70° in HWInfo but the fans aren't ramping up.
> 
> Edit, GPU-Z is showing the same temps.
> 
> 
> 
> As long as it lasts another couple weeks I can live with it, changing to a 1080 seeing Vega didn't impress.

Surely a TIM or mount issue. Not a dying GPU.
Quote:


> Originally Posted by *Tame*
> 
> Quote:
> 
> 
> 
> Originally Posted by *Matt-Matt*
> 
> It doesn't work anymore?
> 
> I'm trying to play FH3 and it's been suggested that my drivers being on 17.1.1 is causing issues.
> 
> Have just downloaded 17.7 now.. Lets see how I go, can't live without 1179/1450 clocks..
> 
> 
> 
> Well 17.7.2 drivers broke the oc utilies etc, so yeah... Still works on the older drivers.
> For me it would only be an issue with max overclock benchmarking, since I use customized bios for daily clocks...

Indeed.

One thing to note though, Wattman works fine now.

Mining on one of my GPUs at 1250/1500.

Works every reboot, as the voltage sticks.


----------



## outofmyheadyo

What is it with R9 290 overclocking, or how are you actually supposed to do it? I have Witcher 3 open, also Afterburner. The card's stock clocks are 947/1250; whenever I increase the core by 5 or 253MHz, the Witcher's fps drops to about 10 and the core clock drops to 300. Can't even reset them; they are just stuck until the next restart...
Temps are fine, it's under a full-cover block, and drivers are 17.7.2


----------



## rdr09

Quote:


> Originally Posted by *outofmyheadyo*
> 
> What is it with the R290 overclocking or how are u actually supposed to do it ? I have witcher 3 open, also afterburner, cards stock clocks are 947/1250 whenever I increase the core by 5 or 253 mhz the witchers fps drops to about 10 and coreclock drops to 300 ? Cant even reset them, they are just stuck until the next restart...
> Temps are fine its under a fullcover block, and drivers are 17.7.2


Try this to oc . . .

http://forums.guru3d.com/showpost.php?p=5459908&postcount=637

Btw, I play W3 with a single stock 290 at 1440. I keep my CPU OC'ed; the game uses a lot of threads.


----------



## Aussiejuggalo

Quote:


> Originally Posted by *mus1mus*
> 
> Surely a TIM or mount issue. Not a dying GPU.


Annoying if it is; it's only been doing this for a couple of days. If I grabbed some Coollaboratory Liquid Ultra, would that be OK on the die?


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *outofmyheadyo*
> 
> What is it with the R290 overclocking or how are u actually supposed to do it ? I have witcher 3 open, also afterburner, cards stock clocks are 947/1250 whenever I increase the core by 5 or 253 mhz the witchers fps drops to about 10 and coreclock drops to 300 ? Cant even reset them, they are just stuck until the next restart...
> Temps are fine its under a fullcover block, and drivers are 17.7.2
> 
> Try this to oc . . .
> 
> http://forums.guru3d.com/showpost.php?p=5459908&postcount=637
> 
> Btw, i play W3 with a single 290 stock at 1440. I keep my cpu oc'ed. It uses a lot of threads.

That is an MSI AB issue with 17.7.2.

He'd rather use Wattman.

I'd suggest removing MSI AB.


----------



## mus1mus

Quote:


> Originally Posted by *Aussiejuggalo*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Surely a TIM or mount issue. Not a dying GPU.
> 
> 
> 
> Annoying if it is, it's only been doing this for a couple of days. If I grabbed some Coollaboratory Liquid Ultra, would that be OK on the die?

I haven't used liquid metal, so I can't comment on its effects on the die or the cooler. Though the SMD components around the GPU may pose issues with a conductive liquid metal.


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> That is an MSI AB issue with 17.7.2.
> 
> He'd rather use Wattman.
> 
> I'd suggest removing MSI AB.


I'm currently using that version of Afterburner with 17.7.2. It works, even in CrossFire.


----------



## Matt-Matt

Quote:


> Originally Posted by *mus1mus*
> 
> Surely a TIM or mount issue. Not a dying GPU.
> Indeed.
> 
> One thing to note though, Wattman works fine now.
> 
> Mining on one of my GPUs at 1250/1500.
> 
> Works every reboot as Voltage sticks.


Yeah, it's nice. A bit different to get used to compared to Afterburner, but eh, it works well.
Quote:


> Originally Posted by *mus1mus*
> 
> That is an MSI AB issue with 17.7.2.
> 
> He'd rather use Wattman.
> 
> I'd suggest removing MSI AB.


Awesome. Yeah, I've got Wattman actually running a higher overclock than Afterburner with the voltage maxed..

OK, I was running 1179 before and I'm running 1180 now.. still technically higher though.

I need to check what vRAM I have; IIRC it's rated to 1500MHz, just undervolted. I've got it at 1450MHz right now and I know that it's rock solid, so for the 50MHz I can't be bothered.

If it were 50MHz on the core, I'd be all over it.

It actually runs Forza Horizon 3 pretty well, if I'm honest.


----------



## rdr09

Quote:


> Originally Posted by *Matt-Matt*
> 
> Yeah it's nice, bit different to get used to as opposed to afterburner but eh it works well.
> Awesome, yeah got wattman actually running a higher overclock than afterburner with the voltage maxed..
> 
> OK, i was running 1179 before and I'm running 1180 now.. Still technically higher though
> 
> I need to check what vRAM I have, IIRC it's rated to 1500MHz just undervolted. Got it at 1450MHz right now though and I know that it's rock solid so for the 50MHz I can't be bothered.
> 
> If it was 50MHz core I'd be all over it.
> 
> It actually runs Forza Horizon 3 pretty well if I'm honest.


Can you oc the RAM, Matt?


----------



## outofmyheadyo

Quote:


> Originally Posted by *rdr09*
> 
> Try this to oc . . .
> 
> http://forums.guru3d.com/showpost.php?p=5459908&postcount=637
> 
> Btw, i play W3 with a single 290 stock at 1440. I keep my cpu oc'ed. It uses a lot of threads.


Thanks, I'll try it. I run Witcher 3 @1440p as well; quite surprised how well it runs on this old card. I just needed to turn shadows and foliage to low and it runs perfectly fine. This old card still has some life left, it seems. My CPU is a Ryzen 1700 @ 3.9 with 3200 RAM, which surely helps.


----------



## rdr09

Quote:


> Originally Posted by *outofmyheadyo*
> 
> Thanks I'll try, I run witcher 3 @1440p aswell quite surprised how well it ran on this old card just needed to turn the shadows low and foilage low and it runs perfectly fine, this old card still has some life left it seems. My cpu is a ryzen 1700 @ 3.9 with 3200 ram, that surely helps.


Oh yeah. The 1700 should do even better, with real cores instead of HT.


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> That is an MSI AB issue with 17.7.2.
> 
> He'd rather use Wattman.
> 
> I'd suggest removing MSI AB.
> 
> 
> 
> Im currently using that version of Afterburner with 17.7.2. It works even in crossfire.

Awesome. Need to pick up that version.

Thanks for the info.

Tho I am only mining with the 290. Good enough for 30+ MH/s at 1250/1500.

Side note: the latest drivers do pretty well for benching. I smashed 3 hardware records on the 290 recently.
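Back-of-envelope for that mining figure: Ethash on Hawaii is largely memory-bandwidth-bound, so hashrate tends to scale roughly linearly with memory clock. A toy estimate under that assumption (the 27 MH/s baseline below is illustrative, not a measured number):

```python
# Crude Ethash hashrate estimate for a memory-bound GPU: assume MH/s scales
# linearly with memory clock. Baseline numbers here are illustrative only.

def est_mhs(base_mhs: float, base_mem_mhz: float, new_mem_mhz: float) -> float:
    """Scale a measured hashrate to a new memory clock, assuming linearity."""
    return base_mhs * (new_mem_mhz / base_mem_mhz)

# Hypothetical: 27 MH/s at the stock 1250 MHz memory, pushed to 1500 MHz
print(round(est_mhs(27.0, 1250, 1500), 1))  # ~32.4 MH/s, in the "30+" ballpark
```

In practice memory timings and straps keep real scaling a bit short of linear, so treat this as a ceiling rather than a prediction.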


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> Awesome. Need to pick that version.
> 
> Thanks for the info.
> 
> Tho I am only mining with the 290.
> 
> good enough for 30+ MH/s at 1250/1500.
> 
> Side note: latest drivers do pretty good for benching. I smashed 3 hardware records on the 290 recently.


Share those records here. I saw Tame got his to 1375 core.


----------



## mus1mus

http://hwbot.org/submission/3614609_mus1mus_3dmark___time_spy_radeon_r9_290_5212_marks

http://hwbot.org/submission/3614599_mus1mus_3dmark___fire_strike_radeon_r9_290_14830_marks

http://hwbot.org/submission/3614596_mus1mus_3dmark11___performance_radeon_r9_290_22514_marks

Nothing that special. Just remarkable that these were done with an all-AMD rig.

1375 core is bauce! Can't do that on the latest driver.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> http://hwbot.org/submission/3614609_mus1mus_3dmark___time_spy_radeon_r9_290_5212_marks
> 
> http://hwbot.org/submission/3614599_mus1mus_3dmark___fire_strike_radeon_r9_290_14830_marks
> 
> http://hwbot.org/submission/3614596_mus1mus_3dmark11___performance_radeon_r9_290_22514_marks
> 
> Nothing that special. Just remarkable that these were done with an all-AMD rig.
> 
> 1375 core is bauce! Can't do that on the latest driver.


Well ain't that some sh..... HW Bot took my score down when I posted one that high with a 290X. Still pisses me off.


----------



## rdr09

Quote:


> Originally Posted by *mus1mus*
> 
> http://hwbot.org/submission/3614609_mus1mus_3dmark___time_spy_radeon_r9_290_5212_marks
> 
> http://hwbot.org/submission/3614599_mus1mus_3dmark___fire_strike_radeon_r9_290_14830_marks
> 
> http://hwbot.org/submission/3614596_mus1mus_3dmark11___performance_radeon_r9_290_22514_marks
> 
> Nothing that special. Just remarkable that these were done with an all-AMD rig.
> 
> 1375 core is bauce! Can't do that on the latest driver.


25K for 3D11. Nice. I think I got 21K with tess off when I got my first 290 at launch. It was winter at the time. Lol.

Quote:


> Originally Posted by *Vellinious*
> 
> Well ain't that some sh..... HW Bot took my score down when I posted one that high with a 290X. Still pisses me off.


I saw your benches.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *mus1mus*
> 
> http://hwbot.org/submission/3614609_mus1mus_3dmark___time_spy_radeon_r9_290_5212_marks
> 
> http://hwbot.org/submission/3614599_mus1mus_3dmark___fire_strike_radeon_r9_290_14830_marks
> 
> http://hwbot.org/submission/3614596_mus1mus_3dmark11___performance_radeon_r9_290_22514_marks
> 
> Nothing that special. Just remarkable that these were done with an all-AMD rig.
> 
> 1375 core is bauce! Can't do that on the latest driver.




I was #1 in Enthusiast 3.5 years ago:

http://hwbot.org/submission/2568247_homecinema_pc_3dmark11___performance_radeon_r9_290_19462_marks/

If I only had a 1680v2 ...........


----------



## mus1mus

Quote:


> Originally Posted by *rdr09*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> http://hwbot.org/submission/3614609_mus1mus_3dmark___time_spy_radeon_r9_290_5212_marks
> 
> http://hwbot.org/submission/3614599_mus1mus_3dmark___fire_strike_radeon_r9_290_14830_marks
> 
> http://hwbot.org/submission/3614596_mus1mus_3dmark11___performance_radeon_r9_290_22514_marks
> 
> Nothing that special. Just remarkable that these were done with an all-AMD rig.
> 
> 1375 core is bauce! Can't do that on the latest driver.
> 
> 
> 
> 25K for 3D11. Nice. I think i got 21K tess off when i got my first 290 at launch. It was winter at the time. Lol.
> 
> Quote:
> 
> 
> 
> Originally Posted by *Vellinious*
> 
> Well ain't that some sh..... HW Bot took my score down when I posted one that high with a 290X. Still pisses me off.
> 
> 
> I saw your benches.

There's more in the tank for these cards on the latest drivers.

And there's more to squeeze from the CPU for FS. LN2 tho.
Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> http://hwbot.org/submission/3614609_mus1mus_3dmark___time_spy_radeon_r9_290_5212_marks
> 
> http://hwbot.org/submission/3614599_mus1mus_3dmark___fire_strike_radeon_r9_290_14830_marks
> 
> http://hwbot.org/submission/3614596_mus1mus_3dmark11___performance_radeon_r9_290_22514_marks
> 
> Nothing that special. Just remarkable that these were done with an all-AMD rig.
> 
> 1375 core is bauce! Can't do that on the latest driver.
> 
> 
> 
> I was #1 in Enthusiast 3.5 years ago
> 
> http://hwbot.org/submission/2568247_homecinema_pc_3dmark11___performance_radeon_r9_290_19462_marks/
> 
> If I only had a *1680v2* ...........
Click to expand...

You need to get back in action, buddy.

I moved out of ambient because I can't do much at my ambient / location.

2X and 3X Xfire benches for ya.

Xeons, and you will be moved to Elite tho.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *mus1mus*
> 
> There's more in tank for these cards on the latest drivers.
> 
> And there's more to squeeze from the CPU for FS.
> 
> LN2 tho.
> You need to get back in action buddy.
> 
> I moved out of ambient coz I can't do much on my ambient / location.
> 
> 2X, and 3X Xfire benches for ya.
> 
> Xeons and you will be moved to Elite tho.


Well, looks like when I have spare loot I'm gonna have to go DDR4 somethin', somethin'.


----------



## mus1mus

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> There's more in tank for these cards on the latest drivers.
> 
> And there's more to squeeze from the CPU for FS.
> 
> LN2 tho.
> You need to get back in action buddy.
> 
> I moved out of ambient coz I can't do much on my ambient / location.
> 
> 2X, and 3X Xfire benches for ya.
> 
> Xeons and you will be moved to Elite tho.
> 
> 
> 
> Well, looks like when I have spare loot I'm gonna have to go DDR4 somethin', somethin'

Go TR. They'd do well in FS and 3DM11, as long as you choose an Asus board.


----------



## Matt-Matt

Quote:


> Originally Posted by *rdr09*
> 
> Can you oc the RAM, Matt?


Yeah, I have a reference XFX card, a "Core Edition", that runs 947/1250 stock; it's at 1180/1450 and 100% stable.

It's been at 1179 for literally *years* and never missed a beat on me.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *mus1mus*
> 
> Go TR. they would do well in FS and 3DM11 as long as you chose an *Asus board.*


Always, I learned that one with X79.

Quote:


> Originally Posted by *Matt-Matt*
> 
> Yeah I have a reference XFX Card a "Core Edition" that runs 947/1250 stock, it's at 1180/1450 and 100% stable.
> 
> Been on 1179 for literally *years* and never missed a beat on me.


Howsitgoin tasweigen!


----------



## mus1mus

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Quote:
> 
> 
> 
> Originally Posted by *mus1mus*
> 
> Go TR. they would do well in FS and 3DM11 as long as you chose an *Asus board.*
> 
> 
> 
> Always , I learned that one with X79

Only because with Ryzen, they have cheats.


----------



## Tame

Quote:


> Originally Posted by *rdr09*
> 
> Share those records here. I saw Tame got his to 1375 core.


Quote:


> Originally Posted by *mus1mus*
> 
> http://hwbot.org/submission/3614609_mus1mus_3dmark___time_spy_radeon_r9_290_5212_marks
> 
> http://hwbot.org/submission/3614599_mus1mus_3dmark___fire_strike_radeon_r9_290_14830_marks
> 
> http://hwbot.org/submission/3614596_mus1mus_3dmark11___performance_radeon_r9_290_22514_marks
> 
> Nothing that special. Just remarkable that these were done with an all-AMD rig.
> 
> 1375 core is bauce! Can't do that on the latest driver.


I could do a single-card 1375 MHz pass in FireStrike, but in TimeSpy, for example, I could only do 1350 MHz.

I crashed Superposition at 1400 MHz in ~5 seconds. I'm planning to try again when the weather gets cold: haul my rig outside and see if there's any improvement :3


----------



## Mattbag

How do you guys feel about your cards after them being out for a few years?

Currently I'm still gaming on 1440p and can max out most games at that resolution and get 45-60 frames.

No reason for me to upgrade just yet


----------



## mfknjadagr8

Quote:


> Originally Posted by *Mattbag*
> 
> How do you guys feel about your cards after them being out for a few years?
> 
> Currently I'm still gaming on 1440p and can max out most games at that resolution and get 45-60 frames.
> 
> No reason for me to upgrade just yet


Best 300 dollars I ever spent on a GPU (got two for that price).


----------



## rdr09

Quote:


> Originally Posted by *Matt-Matt*
> 
> Yeah I have a reference XFX Card a "Core Edition" that runs 947/1250 stock, it's at 1180/1450 and 100% stable.
> 
> Been on 1179 for literally *years* and never missed a beat on me.


I know. My 7950 went out, so these 290s might be next.
Quote:


> Originally Posted by *mfknjadagr8*
> 
> best 300 dollars i ever spent on a gpu (got two for that price)


Got my second 290 for $240 with a block a year later. Together they play 4K . . .

That's old. Others are boasting about their 3DMarks. Ha! I'll get a Vega 64, hopefully, since it's faster than my two 290s. I don't OC in games.


----------



## Vellinious

Quote:


> Originally Posted by *mus1mus*
> 
> There's more in tank for these cards on the latest drivers.


I shouldn't have sold it...... It was the most fun I've ever had overclocking a GPU.


----------



## Matt-Matt

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> Always , I learned that one with X79
> 
> 
> 
> 
> 
> 
> 
> 
> Howsitgoin tasweigen !


Lol! The internet is small.

Well, OCN at least.

It's good. I actually didn't realize you had 290s, haha.

I got an R9 270 for my spare PC recently; it maxes out Wattman easily, and Afterburner can't overclock it any further. Really need to figure out how to make it clock higher.


----------



## Matt-Matt

So yeaaah.

My 290 has been artefacting. I was really worried until I compared GPU-Z at stock against GPU-Z at my "overclocked" settings.

It's overclocking the core fine, but not overvolting it like it _was_, so now I'm stuck at pretty much stock (can do ~1000MHz or so) until AMD fixes it.. great.

Can't roll back, as I need the latest driver for Forza Horizon 3, which ran impeccably at 1180/1450; for my 290 at least, on the 17.2.2 drivers it worked with.

Workaround: run the latest Afterburner (4.3.0) and set +100mV there, then use Wattman to overclock the GPU, as Afterburner will only let you set the voltage.

Tested this with Valley, with GPU-Z confirming the VDDC was raised.


----------



## rdr09

Quote:


> Originally Posted by *Matt-Matt*
> 
> So yeaaah
> 
> My 290 has been artefacting, I was really worried till I watched GPU-Z at stock and GPU-Z with my "overclocked" settings.
> 
> It's overclocking the core fine, but not over-volting it like it _was_.. So now I'm stuck at pretty much stock (Can do ~1000mhz or so) until AMD fixes it.. great
> 
> Can't roll back, as I need the latest for Forza Horizon 3 which runs impeccably at 1180/1450.
> 
> For my 290 at least, and the 17.2.2 drivers it works by.
> 
> Running the latest Afterburner (4.3.0) and setting that to +100mv
> 
> Then using wattman to overclock the GPU, as afterburner will only let you set the voltage.
> 
> Tested this with Valley and with GPU-Z telling me the VDDC and it being raised.


Posted this earlier . . .

https://forums.guru3d.com/threads/rtss-6-7-0-beta-1.412822/page-32#post-5459908

Oc's my crossfire.


----------



## Matt-Matt

Quote:


> Originally Posted by *rdr09*
> 
> Posted this earlier . . .
> 
> https://forums.guru3d.com/threads/rtss-6-7-0-beta-1.412822/page-32#post-5459908
> 
> Oc's my crossfire.


Made my PC black screen.

Rebooted, and every time I loaded Windows it black screened (right after MSI AB opened, presumably).

I had to use Windows restore (which also broke Forza Horizon 3), so now I have to re-download it.


----------



## rdr09

Quote:


> Originally Posted by *Matt-Matt*
> 
> Made my PC Black screen.
> 
> Rebooted and everytime I loaded Windows it black screened (after MSI AB opening, assumably)
> 
> I had to use Windows restore (which also broke Forza Horizon 3) so now I have to re-download it.


That's a first. I've recommended it here before, and to 390(X) owners, and it worked for all of them. Hmmm. Did you uninstall the old AB first? Or did you leave your GPU OC'd, and Windows got confused?


----------



## Matt-Matt

Quote:


> Originally Posted by *rdr09*
> 
> That's a first. I recommended it here before and to 390(X) owners and it worked in all of them. Hmmm. Did you uninstall the old AB first? Or you left your gpu oc'ed. Windows got confused.


The old one was uninstalled and Wattman was at stock.

Can't be bothered testing it again for now; what I have works fine.

Will probably update to the official Afterburner release and see how it goes.


----------



## spyshagg

Anyone having huge stutters (like pausing to load stuff) in Battlefield 1 with a 290X + the latest drivers?


----------



## Aussiejuggalo

Just swapped my Noctua NT-H1 paste for Conductonaut and dropped 15°. Wish I'd changed pastes years ago. Getting a Galax 1070 in a couple of weeks.


----------



## mfknjadagr8

Quote:


> Originally Posted by *spyshagg*
> 
> Anyone having huge stutters (like pausing to load stuff) in Battlefield 1 with 290x + latest drivers ?


Nope, all good here with 2 cards, on DX11 though.


----------



## Lazat

Hi all! This is a big cry for help! I have a problem with black screens on my Sapphire 290 Tri-X which I can't get my head around at all.

Computer:
Intel [email protected] 24GB [email protected]
MSI z77A-GD65 Motherboard
256GB 840Evo ssd, 512GB MX100 ssd.
Corsair TX650W (Running separate pci-e cables to the both connectors)
Windows 10, 10.0.15063
Crimson Radeon Software Version 17.8.2
Sapphire 290 Tri-X, Elpida memory, running The Stilts MLU version for reduced VRM temp. Base voltage at DPM7 = 1250mV, Vaux=1.000.

Sequence:
- Start computer from scratch.
- Set voltage levels with mVoffset and mVauxOffset in Afterburner.
- Test stability with Heaven for hours.
- If run long enough, put computer to sleep and then start Heaven testing again.

Run Heaven looping:
1100/1315mhz, +75mVoffset/+100mVauxoffset=(1.181-1.201-1.275Vm), 270Wpl, 84Tcore/88Tvrm 219Wavg/236Wmax - @1h16min no blackscreen.
If i put the computer to sleep or hibernation and run tests after waking it up it will blackscreen on these settings after 2-5min.

1075/1250mhz, +125mVoffset/+100mVauxoffset=(1.206-1.226-1.300Vm), 270Wpl, 85Tcore/90Tvrm 227Wavg/245Wmax -
@1h42min no blackscreen.
If i put the computer to sleep or hibernation and run tests after waking it up it will blackscreen on these settings after 2-5min.

1050/1315mhz, -25mVoffset/+75mVauxoffset=(1.106-1.120-1.131) 312Wpl, 81Tcore/83Tvrm - @4h30min no blackscreen.
If i put the computer to sleep or hibernation and run tests after waking it up it will blackscreen on these settings after 2-5min.

1025mhz, -31mVoffset/+12mVauxoffset=1.120Vavg, 312Wpl, 79Tcore/81Tvrm 169Wavg - @4h30min no blackscreen.
If i put the computer to sleep or hibernation and run tests after waking it up it will sometimes run Heaven for hours, sometimes blackscreen on these settings after 2-5min.

1000mhz, -38mVoffset/+12mVauxoffset=1.120Vavg, 312W, 80Tcore/81.5Tvrm, 167Wavg - @6h10min no blackscreen.
If i put the computer to sleep or hibernation and run tests after waking it up it will sometimes run Heaven for hours, sometimes blackscreen on these settings after 2-5min.

975mhz, -25mVoffset/+12mVauxoffset, 270Wpl, 80Tcore/80Tvrm - @8h15min no blackscreen.
If i put the computer to sleep or hibernation and run tests after waking it up it will sometimes run Heaven for hours, sometimes blackscreen on these settings after 2-5min.

Why am I stable, first of all, when the computer hasn't slept? And why is it sometimes stable after sleep, but mostly not? I've spent so much time on this; it's driving me crazy. Every day I try new stuff just to reach 100% stability.

Does anyone have any suggestions?

And if you guys are wondering why I need sleep and/or hibernation: I just hate setting up all my Word documents/PDFs/desktops/MATLAB sessions again. So I tend to leave my computer in hibernation over extended periods of time until I'm done with a project; then I'll let it restart and update, etc.

Thanks so much in advance for any constructive tips and tricks!


----------



## qwertyist12

AMD R9 290X

SAPPHIRE

TRI-X


----------



## abe_joker

Hey guys, what are your OC values and voltage numbers? I got a 290x and I wanna tinker a bit but would like to know around what numbers might be safe to play with.


----------



## Caradine

R9 290 Tri-X 1020mhz 1.2V VDDC in bios with vdroop to 1.15V
and i'm getting 69 degrees max with that voltage and clock in witcher 3 with 100% gpu usage.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Lazat*
> 
> Hi all! This is a big cry for help! I have a problem with blackscreens on my Sapphire 290 Tri-X which i can't get my head around at all.
> 
> Computer:
> Intel [email protected] 24GB [email protected]
> MSI z77A-GD65 Motherboard
> 256GB 840Evo ssd, 512GB MX100 ssd.
> Corsair TX650W (Running separate pci-e cables to the both connectors)
> Windows 10, 10.0.15063
> Crimson Radeon Software Version 17.8.2
> Sapphire 290 Tri-X, Elpida memory, running The Stilts MLU version for reduced VRM temp. Base voltage at DPM7 = 1250mV, Vaux=1.000.
> 
> Sequence:
> - Start computer from scratch.
> - Set voltage levels with mVoffset and mVauxOffset in Afterburner.
> - Test stability with Heaven for hours.
> - If run long enough, put computer to sleep and then start Heaven testing again.
> 
> Run Heaven looping:
> 1100/1315mhz, +75mVoffset/+100mVauxoffset=(1.181-1.201-1.275Vm), 270Wpl, 84Tcore/88Tvrm 219Wavg/236Wmax - @1h16min no blackscreen.
> If i put the computer to sleep or hibernation and run tests after waking it up it will blackscreen on these settings after 2-5min.
> 
> 1075/1250mhz, +125mVoffset/+100mVauxoffset=(1.206-1.226-1.300Vm), 270Wpl, 85Tcore/90Tvrm 227Wavg/245Wmax -
> @1h42min no blackscreen.
> If i put the computer to sleep or hibernation and run tests after waking it up it will blackscreen on these settings after 2-5min.
> 
> 1050/1315mhz, -25mVoffset/+75mVauxoffset=(1.106-1.120-1.131) 312Wpl, 81Tcore/83Tvrm - @4h30min no blackscreen.
> If i put the computer to sleep or hibernation and run tests after waking it up it will blackscreen on these settings after 2-5min.
> 
> 1025mhz, -31mVoffset/+12mVauxoffset=1.120Vavg, 312Wpl, 79Tcore/81Tvrm 169Wavg - @4h30min no blackscreen.
> If i put the computer to sleep or hibernation and run tests after waking it up it will sometimes run Heaven for hours, sometimes blackscreen on these settings after 2-5min.
> 
> 1000mhz, -38mVoffset/+12mVauxoffset=1.120Vavg, 312W, 80Tcore/81.5Tvrm, 167Wavg - @6h10min no blackscreen.
> If i put the computer to sleep or hibernation and run tests after waking it up it will sometimes run Heaven for hours, sometimes blackscreen on these settings after 2-5min.
> 
> 975mhz, -25mVoffset/+12mVauxoffset, 270Wpl, 80Tcore/80Tvrm - @8h15min no blackscreen.
> If i put the computer to sleep or hibernation and run tests after waking it up it will sometimes run Heaven for hours, sometimes blackscreen on these settings after 2-5min.
> 
> Why is it so that i first of all are stable when the computer hasnt slept? And why is it stable sometimes after sleep but mostly not stable after sleep? Ive spent so much time on this, its driving me crazy, every day i try new stuff just to reach 100% stability.
> 
> Does anyone have any suggestions?
> 
> And if you guys are wondering why i need sleep and/or hibernation. I just hate to set up all my word-documents/pdfs/desktops/matlabsessions. So i tend to leave my computer in hibernation over extended periods of time until im done with a project. then ill let it restart and update etc etc.
> 
> Thanks so much in advance for any constructive tips and tricks!


IIRC there was an issue with overclocks not being applied correctly after sleep and/or hibernation. The only fix I saw back then was to avoid using those states. Hopefully someone who experienced it can shed some light on how to fix the issue.


----------



## Lazat

Quote:


> Originally Posted by *mfknjadagr8*
> 
> iirc there was an issue with overclock not being applied correctly after sleep and or hibernation...the only fix that i seen back then was to avoid using those states....hopefully someone who experienced it can shed some light on how to fix the issue


The weird thing is that I check voltages with HWiNFO and Afterburner after sleep, and they are the same as before sleep. I've also enabled the "restore settings after suspend mode" option in Afterburner.

Tested a bit more now:
1050/1250mhz @ Base VID1250mV/1000mVaux +25mVcore/25mVaux | 1.1438Vmin-1.1720Vavg-1.2375Vmax | 81/83C | - no sleep and no blackscreen for 12h46min.

To reach the same stability after putting computer to sleep.
1050/1250mhz @ Base VID 1250mV/1000mVaux +75mVcore/25mVaux | 1.1875Vmin-1.2115Vavg-1.2813Vmax | 83/86C | - Sleep and no blackscreen for @6h1m.

A whopping 50mV more to reach stability after sleep! What gives? There must be an error somewhere. This is also extremely time-consuming to track down, because I can be stable for 2-3 hours and then blackscreen while testing with Heaven. If I try FurMark or something, it never blackscreens. It would be nice to have a quicker way to determine stability than letting Heaven run for at least 6 hours.
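For anyone following the bookkeeping in these runs: the figures above are just the base DPM7 VID plus the Afterburner offset, and the sleep/wake gap falls out as a straight subtraction. A minimal sketch, assuming the effective request really is base + offset (the delivered voltage additionally depends on droop):

```python
# Voltage bookkeeping for the two stable points reported above.
# Assumption: requested DPM7 voltage = base VID + Afterburner core offset.

def effective_vid_mv(base_mv: int, offset_mv: int) -> int:
    """Requested DPM7 voltage in millivolts."""
    return base_mv + offset_mv

awake_stable = effective_vid_mv(1250, 25)   # stable without sleep: 1275 mV
resume_stable = effective_vid_mv(1250, 75)  # stable after resume: 1325 mV

print(resume_stable - awake_stable)  # 50 -- the extra margin sleep/wake costs
```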


----------



## Kevindewapper

My videocard likes to go up to 94/95 degrees whenever I play "high demand" games. Locking the FPS makes a huge difference:

- Idle temperature in Windows: 35 - 40 degrees (VRM I/II: 30 and 33)
- PUBG: 94 - 95 degrees ( VRM I/II: 82 and 70)
- PUBG Locked @60 FPS: 64 degrees
- PUBG Locked @30 FPS: 54 degrees
- Rocket league @250 FPS: 94- 95 degrees
- Rocket league @144 FPS: 75 degrees
- Furmark: 94- 95 degrees

I know this card heats up, so mine is modded with the Icy Vision like this: http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x. I have an NZXT S340 case with a top and rear fan, 2 intake fans (be quiet! Pure Wings 2), and the Gelid Icy Vision on my GPU.

*Is there any overclock potential?* I read a lot about this card, but I don't understand whether:

A. The card goes to 95 degrees because of the design of the card (OC potential: yes), or
B. The card goes to 95 degrees and downclocks because the temperature is too high (OC potential: no).

It is very strange to me that the GPU goes to 95 degrees at 250 FPS in the *menu* of Rocket League; the temperature in game is the same as idling in the menu… The card is in Uber mode (switch to the right: http://imgur.com/ldz4p). I tried installing a fan under, behind, and beside the GPU; the GPU still goes up to 95 degrees, but idle temperatures are 2-4 degrees lower.
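Those FPS-lock numbers are consistent with a simple duty-cycle effect: if the card could render N FPS flat out, capping below N cuts the work per second (and roughly the heat output) in proportion. A toy model, with illustrative numbers:

```python
# Toy model: GPU load under an FPS cap, assuming load (and heat output)
# scales with the fraction of the uncapped frame rate actually rendered.
# Real power draw is not perfectly linear in load, so this is only intuition.

def approx_load(fps_uncapped: float, fps_cap: float) -> float:
    """Fraction of full load when capped; 1.0 means the cap never engages."""
    return min(1.0, fps_cap / fps_uncapped)

print(approx_load(250, 144))  # Rocket League capped at 144 when it could do 250
print(approx_load(250, 60))   # capped at 60: roughly a quarter of full load
```

Which lines up with the pattern reported: 94-95°C uncapped, 75°C at 144 FPS, and much cooler again at 60 FPS.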


----------



## Roboyto

Quote:


> Originally Posted by *Kevindewapper*
> 
> My videocard likes to go up to 94/95 degrees whenever I play "high demand" games. Locking the FPS makes a huge difference:
> 
> - Idle temperature in Windows: 35 - 40 degrees (VRM I/II: 30 and 33)
> - PUBG: 94 - 95 degrees ( VRM I/II: 82 and 70)
> - PUBG Locked @60 FPS: 64 degrees
> - PUBG Locked @30 FPS: 54 degrees
> - Rocket league @250 FPS: 94- 95 degrees
> - Rocket league @144 FPS: 75 degrees
> - Furmark: 94- 95 degrees
> 
> I know this cards heats up so my card is modded with the Icy vision like this: http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x. I have an nzxt s340 case with a top and back fan, 2 intake fans (Be quiet pure wings 2) and gelid icy vision on my GPU.
> 
> *Is there any overclock potential?* I read a lot about this card but I don't understand if:
> 
> A. Card goes to 95 degrees because of the design of the card (OC potential: yes)
> B. Card goes to 95 degrees and down clocks because temperate is too high (OC potential: no)
> It is very strange to me that the GPU goes to 95 degrees @250 FPS in the menu of Rocket league? So, the temperature in game is the same as idle in the menu&#8230; Card is in uber mode (switch to the right:
> 
> 
> http://imgur.com/ldz4p
> 
> )), I tried installing a fan under, behind and from the side of the GPU but GPU always goes up to 95 degrees but Idle temperatures are 2-4 degrees lower.


Check the mount of the cooler. Pretty sure 95 is the max temp, and the card is doing whatever it can to keep from exceeding that temperature. I'm willing to bet you don't have good contact with the die, which is why it's hitting those temps.


----------



## mfknjadagr8

Quote:


> Originally Posted by *Roboyto*
> 
> Check the mount of the cooler. Pretty sure 95 is max temp and the card is doing whatever it can to keep it from exceeding that temperature. I'm willing to bet that you don't have good contact with the die which is why it's hitting those temps.


A repaste wouldn't hurt in any case... unless you aren't good at putting things back together.


----------



## dave338

It's strange that you have higher temps on the core than on the VRM... I think your paste is bad/old, or you didn't apply enough.
My 290 had an Icy Vision before going to water, and max temps with the fans at approx. 1500 RPM were 76-78 on the core and about 85 on the VRM, OC'd at 1040/1400 in heavy benches.

Check the heatsink contact and reapply thermal paste.

Br


----------



## chantruong

Hey Guys

I just picked up a reference MSI 290X for cheap and installed it today on my test bench. The latest drivers seemed to install fine. Temperatures seem normal: around 94°C on the stock fan curve and 90°C at a constant 70% fan speed. Under 3D loads, programs randomly crash. Unigine Valley seems to crash anywhere from 10 seconds to 5 minutes in; Unigine Heaven has more luck at around 15 to 30 minutes; Crysis 3 crashed around 10 minutes in. There were no artifacts while the programs ran. Note that each program crashes to the desktop; it does not freeze or blue screen. I swapped my PSU for a newer model, which seemed to help a little, but it still crashed. I have not overclocked the card.

These are my test bench specs

CPU: i7-3770 @4.1 ghz
Ram: 4x2 ddr3 1600
GPU: MSI R9 290x Reference
Motherboard: Asrock Z77 Pro 3
OS: Windows 10 Pro
PSU 1: Corsair TX650M from 2013
PSU 2: New Seasonic 620 Bronze

Attached are some screenshots from gpu-z during a 30 minute unigine valley run.




I think these are the possible causes:

1) Driver issues

2) Power delivery issues

3) Defective/dead card

4) Motherboard issues. I've had random blue screens in the past with this board. They seemed to disappear recently with other cards.

5) Tampered BIOS. How do I check if I have the stock BIOS? Could my card be a 290 that was flashed to a 290X?

Let me know what the issue may be and how, or if, I can correct it.

Thanks,


----------



## boot318

^^^^^^^^^^^^^^^^^

1. I'm using 17.7.1. I was getting crashes with 17.7.2. A lot of people have been posting about problems on drivers 17.7.2+ for older cards.

2. I can't confirm or deny this issue.

3. Used card that could've seen mining in its past? Can't rule it out.

4. Run everything stock so we can rule it out.

5. The card should have a sticker on it somewhere. Snap a picture of it so we can look it up.

I highly recommend using 17.7.1 or a lower driver to start the debugging process.


----------



## rdr09

Quote:


> Originally Posted by *chantruong*
> 
> Hey Guys
> 
> I just picked up a reference MSI 290X for cheap and installed it today on my test bench. I installed the latest drivers, which seemed to install fine. Temperatures seem normal: around 94 °C at the stock fan curve and 90 °C at a constant 70% fan speed. Under 3D loads, programs randomly crash. Unigine Valley seems to crash anywhere from 10 seconds to 5 minutes in. Unigine Heaven has more luck at around 15 to 30 minutes. Crysis 3 crashed around 10 minutes in. There seemed to be no artifacts while the programs ran. Note that each program crashes to the desktop but does not freeze or blue screen. I changed my PSU to a newer model and it seemed to help a little, but it still crashed. I have not overclocked the card.
> 
> These are my test bench specs
> 
> CPU: i7-3770 @4.1 ghz
> Ram: 4x2 ddr3 1600
> GPU: MSI R9 290x Reference
> Motherboard: Asrock Z77 Pro 3
> OS: Windows 10 Pro
> PSU 1: Corsair TX650M from 2013
> PSU 2: New Seasonic 620 Bronze
> 
> Attached are some screenshots from gpu-z during a 30 minute unigine valley run.
> 
> 
> 
> 
> I think these are the possible causes:
> 
> 1) Driver issues
> 
> 2) Power delivery issues
> 
> 3) Defective/dead card
> 
> 4) Motherboard issues. I've had random blue screens in the past with this board. They seemed to disappear recently with other cards.
> 
> 5) Tampered BIOS. How do I check if I have the stock BIOS? Could my card be a 290 that was flashed to a 290X?
> 
> Let me know what the issue may be and how, or if, I can correct it.
> 
> Thanks,


There are two BIOSes, and you can try the other one. The tiny switch is on top of the card near the bracket. Turn off the machine before switching.

Quote:


> Originally Posted by *boot318*
> 
> ^^^^^^^^^^^^^^^^^
> 
> 1. I'm using 17.7.1. I was getting crashes with 17.7.2. A lot of people have been posting about problems on drivers 17.7.2+ for older cards.
> 
> 2. I can't confirm or deny this issue.
> 
> 3. Used card that could've seen mining in its past? Can't rule it out.
> 
> 4. Run everything stock so we can rule it out.
> 
> 5. The card should have a sticker on it somewhere. Snap a picture of it so we can look it up.
> 
> I highly recommend using 17.7.1 or lower driver to start the debugging process.


I'm currently using 17.7.2 without issue on my CrossFire setup.


----------



## HOMECINEMA-PC

Turn off 2D clocks, vdroop, and the power-saving stuff, and finally add vcore and/or go to a higher power level.


----------



## RAZZTA01

Hi, I own a Sapphire R9 290 running on water with a [email protected] I tried to OC it, but could not get past 1500 (RAM) and 1170 (core). I used +50% power limit and +75 mV.

You think I have more room to play with it?
Is the result I score in the expected range?
Rgds.


----------



## chantruong

It seems like my issues are related to my system and not the actual card. I switched the card from my Z77 system to my spare H61 system and I haven't had any issues so far. I do recall my Z77 system having issues with graphics cards in the past. I will do a few more tests to make sure.

I can't wait to hybrid mod it and start OC'ing it


----------



## AliNT77

Quote:


> Originally Posted by *RAZZTA01*
> 
> Hi, I own a Sapphire R9 290 running on water with a [email protected] I tried to OC it, but could not get past 1500 (RAM) and 1170 (core). I used +50% power limit and +75 mV.
> 
> You think I have more room to play with it?
> Is the result I score in the expected range?
> Rgds.


Sky Diver? Really? Try Fire Strike instead.


----------



## Kevindewapper

Quote:


> Originally Posted by *mfknjadagr8*
> 
> repaste woildnt hurt in any case...unless you arent good at putting things back together


I tried extra fans, other fan positions, and fans at 100%, and it all worked like **** for getting temperatures down. Yesterday I applied new thermal paste and that made a HUGE difference. My temperatures under load dropped by 20 degrees on average! I bought the card from a guy on a Dutch tech site and he did this mod (http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x). It looks to me more like he did not do the mod but only replaced the fans. The inside looks strange to me compared to other cards. In the last photos (http://imgur.com/gxWY0) you see some sort of metal plate over the thermal pads. Is that normal?



Idle in Windows went from 39 to 35 degrees (VRM dropped 3 degrees). Not much difference (but it was "good" already).
Rocket League @ 250 FPS: 94 vs 72 degrees (VRM I/II: 74 and 60 degrees).
PUBG full load: 75 degrees (VRM I/II: 78 and 61 degrees).
Furmark @ 94 degrees after 2.02 minutes. But high temperatures are normal, from what I read about Furmark.

Finally I am not fighting against the 95-degree gods of the R9 290X anymore. My card is finally prepared for overclocking. Unfortunately, I do get artifacts if I run the Heaven benchmark at 1100 core clock or higher.


----------



## Lazat

Quote:


> Originally Posted by *Kevindewapper*
> 
> .....
> 
> Idle in Windows went from 39 to 35 degrees (VRM dropped 3 degrees). Not much difference (but it was "good" already).
> Rocket League @ 250 FPS: 94 vs 72 degrees (VRM I/II: 74 and 60 degrees).
> PUBG full load: 75 degrees (VRM I/II: 78 and 61 degrees).
> Furmark @ 94 degrees after 2.02 minutes. But high temperatures are normal, from what I read about Furmark.
> Finally I am not fighting against the 95-degree gods of the R9 290X anymore. My card is finally prepared for overclocking. Unfortunately, I do get artifacts if I run the Heaven benchmark at 1100 core clock or higher.


Wow! That repaste really helped ;D If you wanna get rid of artifacts, try adding some voltage offset. Also, from my findings, don't upgrade drivers past 17.7.1/2. I've been plagued with so many black screens while gaming after the computer has been sleeping that it's not even funny. I finally figured out it's the drivers. My card has always been plagued by black screens when gaming, but after months of finding what I thought were stable frequency/voltage levels, everything just went haywire again. After a week or two of endless Heaven runs after putting the computer to sleep, I finally nailed down that it was the newer drivers and not the ****ty card.


----------



## Kevindewapper

Quote:


> Originally Posted by *Lazat*
> 
> Wow! That repaste really helped ;D If you wanna get rid of artifacts, try adding some voltage offset. Also, from my findings, don't upgrade drivers past 17.7.1/2. I've been plagued with so many black screens while gaming after the computer has been sleeping that it's not even funny. I finally figured out it's the drivers. My card has always been plagued by black screens when gaming, but after months of finding what I thought were stable frequency/voltage levels, everything just went haywire again. After a week or two of endless Heaven runs after putting the computer to sleep, I finally nailed down that it was the newer drivers and not the ****ty card.


I will add some voltage soon to get rid of the artifacts. I have been using 17.7.2 and haven't run into black screens yet. Thanks for the tip; I will try older stable drivers when I run into problems.


----------



## Lazat

Quote:


> Originally Posted by *Kevindewapper*
> 
> I will add some voltage soon to get rid of the artifacts. I have been using 17.7.2 and haven't run into black screens yet. Thanks for the tip; I will try older stable drivers when I run into problems.


Also make sure you run the latest MSI Afterburner beta, 4.4.0 Beta 16 (if you use Afterburner, that is). Link to the posts from Unwinder with 4.4.0 Beta 16


----------



## mfknjadagr8

Quote:


> Originally Posted by *Kevindewapper*
> 
> I tried extra fans, other fan positions, and fans at 100%, and it all worked like **** for getting temperatures down. Yesterday I applied new thermal paste and that made a HUGE difference. My temperatures under load dropped by 20 degrees on average! I bought the card from a guy on a Dutch tech site and he did this mod (http://www.overclock.net/t/1437634/installation-guide-tips-of-rev-2-icy-vision-on-r9-290x). It looks to me more like he did not do the mod but only replaced the fans. The inside looks strange to me compared to other cards. In the last photos (http://imgur.com/gxWY0) you see some sort of metal plate over the thermal pads. Is that normal?
> 
> Idle in Windows went from 39 to 35 degrees (VRM dropped 3 degrees). Not much difference (but it was "good" already).
> Rocket League @ 250 FPS: 94 vs 72 degrees (VRM I/II: 74 and 60 degrees).
> PUBG full load: 75 degrees (VRM I/II: 78 and 61 degrees).
> Furmark @ 94 degrees after 2.02 minutes. But high temperatures are normal, from what I read about Furmark.
> 
> Finally I am not fighting against the 95-degree gods of the R9 290X anymore. My card is finally prepared for overclocking. Unfortunately, I do get artifacts if I run the Heaven benchmark at 1100 core clock or higher.


Yes, the paste was probably chalky and hard, yeah? That comes from them overpasting at the factory, and then people running the cards at high temps to avoid fan noise... Before I switched to water cooling my card needed paste horribly... I was throttling badly over 90 degrees, and it was hitting 94 within a few minutes of any 3D app even at 100 percent fan speed... After repasting I saw mid-70s at 70 percent fan speed, which, mind you, is very loud on a reference card (5000 RPM) lol... Glad you got the temps sorted... It looks like that metal is used as a heatsink of sorts... My card is reference so it doesn't have that; hopefully someone with your card and mod can let you know if that's part of it... It looks like he used a metal plate instead of aluminum heatsinks on the VRMs... The VRM temps don't look bad... Surprised it works that well, honestly.


----------



## RAZZTA01

Here it goes.

And another thing... you can see the GPU MHz does not exceed 1107 MHz while the gpu clock graph shows 1607 MHz; are those readings correct?

Is it possible to achieve a 1200 MHz GPU clock frequency like a 290X? I am having trouble passing a 1117 MHz GPU clock.


----------



## mirzet1976

Quote:


> Originally Posted by *RAZZTA01*
> 
> Here it goes.
> 
> And another thing... you can see the GPU MHz does not exceed 1107 MHz while the gpu clock graph shows 1607 MHz; are those readings correct?
> 
> 
> Is it possible to achieve a 1200 MHz GPU clock frequency like a 290X? I am having trouble passing a 1117 MHz GPU clock.


1300+ MHz is possible; this is with an R9 290.


----------



## RAZZTA01

Hi, that's good news! But I have no clue what I need to do.
What drivers can I use (I am using 17.8.2)?

In Afterburner I set a 1107 GPU core frequency, yet this graph shows a 1607 peak? Can anyone explain this difference to me?
Also, the memory is set to 1400 but the graph shows a 1107 peak?

How should I proceed to obtain a 1200-1300 GPU core? Afterburner's core max is 1235!

Do I need to flash a specific BIOS? Is there anything else I am missing?


----------



## spyshagg

I'm starting to feel the weight of time with my OC'd 290X (14300 Fire Strike graphics).

It can still push BF4/BF1/BLOPS3 above 100 fps @ 1440p (tryhard low settings), which are the recommended low-latency settings for online games, but I'm starting to miss the visual bells and whistles. It also can't run the new "hip" games such as PUBG nearly well enough, sometimes dropping below 60 fps on low settings (pretty horrible graphics on low). For the Oculus Rift the burden is even heavier, especially with the need to run supersampling to hide the jaggies.

Like many of you, I was holding out for the promise of Vega to keep making use of my FreeSync monitor. But as things stand right now, I'll either have to wait another 12 months to see what AMD brings out, or I bite the bullet.

How are your cards doing nowadays? Mine is degrading quite a bit: about 30 MHz on the GPU and 100+ MHz on the RAM. So it's about 13700 in Fire Strike graphics score.


----------



## artemis2307

Quote:


> Originally Posted by *spyshagg*
> 
> I'm starting to feel the weight of time with my OC'd 290X (14300 Fire Strike graphics).
> 
> It can still push BF4/BF1/BLOPS3 above 100 fps @ 1440p (tryhard low settings), which are the recommended low-latency settings for online games, but I'm starting to miss the visual bells and whistles. It also can't run the new "hip" games such as PUBG nearly well enough, sometimes dropping below 60 fps on low settings (pretty horrible graphics on low). For the Oculus Rift the burden is even heavier, especially with the need to run supersampling to hide the jaggies.
> 
> Like many of you, I was holding out for the promise of Vega to keep making use of my FreeSync monitor. But as things stand right now, I'll either have to wait another 12 months to see what AMD brings out, or I bite the bullet.
> 
> How are your cards doing nowadays? Mine is degrading quite a bit: about 30 MHz on the GPU and 100+ MHz on the RAM. So it's about 13700 in Fire Strike graphics score.


Hawaii was meant to be a 1440p card in 2014, not 2017.
But I play at 2560x1080, so an OC'd 290 is enough for me. Shame I had to trade it for a 970 when I went ITX with an SFX PSU.


----------



## RAZZTA01

Hi guys!
Pls, can anyone help me achieve a 1300 MHz core on my stock R9 290? I am stuck because I can't pass the +100 mV limit in Afterburner (latest beta). I think I need to add more mV to reach my 1300 MHz target. I have been reading about the hack for MSI cards, but it did not work for my Sapphire.
Any help is welcome!


----------



## Tame

Quote:


> Originally Posted by *RAZZTA01*
> 
> Hi guys!
> Pls, can anyone help me achieve a 1300 MHz core on my stock R9 290? I am stuck because I can't pass the +100 mV limit in Afterburner (latest beta). I think I need to add more mV to reach my 1300 MHz target. I have been reading about the hack for MSI cards, but it did not work for my Sapphire.
> Any help is welcome!


You can either use Cheat Engine to hack MSI AB voltage to whatever you want, or use the VRMtool http://www.overclock.net/t/1605757/vrmtool-a-simple-tool-to-read-and-write-to-i2c-vrm-controllers


----------



## RAZZTA01

Thx Tame! This is what I needed.
Before I start my OC, what is the absolute maximum voltage that won't fry my GPU? Starting from a VDDC of 0.984 V...?
Rgds.


----------



## Tame

Quote:


> Originally Posted by *RAZZTA01*
> 
> Thx Tame! This is what I needed.
> Before I start my OC, what is the absolute maximum voltage that won't fry my GPU? Starting from a VDDC of 0.984 V...?
> Rgds.


Well, I've done a VID of 1250 mV + a 330 mV offset, but the screen starts to go bonkers at that point because I have the stock 0.95 V rail. I've got a pretty beefy water-cooling system, a full-cover block, and upgraded VRM cooling pads... I don't think it's very easy to damage the core with voltage under normal conditions, because either you don't have the cooling and overheat, or the card black-screens on you before you can apply voltage that high (because of various protection mechanisms kicking in, or otherwise). If you use the PT1 BIOS or another heavily modded BIOS you need to be mindful of the voltage, but even then you will usually black-screen at really high voltages with a normal card.

I would say you can use as much as you dare (though raise it in ~50 mV increments to find your playing field), but monitor your VRM1 temps with GPU-Z, because the VRM1 MOSFETs are the first thing likely to blow up. I would back off once the VRM starts to hit 100 °C.


----------



## Kevindewapper

Quote:


> Originally Posted by *mfknjadagr8*
> 
> glad you got temps sorted...it looks like that metal is used as a heatsink of sorts...my card is reference so it doesnt have that hopefully someone with your card and mod can let you know if thats part of it....it looks like he used a metal plate instead of aluminum heatsinks on the vrms....vrm temps dont look bad...suprised jt works that well honestly


Yeah, it looks like a custom-made plate (https://i.imgur.com/yOee1hB.jpg). I couldn't get it off. The problem is that the VRM temperatures rise exponentially with voltage (+19 mV leads to 90-100 degrees on the VRM).
The max stable overclock without extra voltage is 1075/1400, with a Heaven benchmark score of 2062.

I can run my core at 1120 and my memory at 1600 with more voltage, but the VRM temperatures get too high.

I can't change the thermal pads on the VRM if I can't get the plate off. I think this is holding back my overclock for now.


----------



## Kevindewapper

Quote:


> Originally Posted by *Lazat*
> 
> Wow! That repaste really helped ;D If you wanna get rid of artifacts, try adding some voltage offset. Also, from my findings, don't upgrade drivers past 17.7.1/2. I've been plagued with so many black screens while gaming after the computer has been sleeping that it's not even funny. I finally figured out it's the drivers. My card has always been plagued by black screens when gaming, but after months of finding what I thought were stable frequency/voltage levels, everything just went haywire again. After a week or two of endless Heaven runs after putting the computer to sleep, I finally nailed down that it was the newer drivers and not the ****ty card.


I did get a lot of black screens when I overclocked the memory. To be safe, I went back to 17.1.1 and the beta MSI Afterburner. Still the same black screens with the same overclocks, though.

After installing the 17.1.1 driver I get wrong memory speeds in the first tab of GPU-Z (http://imgur.com/J6lMZ). But the memory value is shown correctly under "Sensors"?!


----------



## Firstsage

Hello everyone,
New poster, but long-time reader of the forum. I recently decided to set up a water loop and am looking into overclocking my two 290s. I have a Sapphire Tri-X and an Asus reference card, both on blocks.

After fighting a bit with drivers and 3rd-party OC software, I found that I can use the beta version of Afterburner linked above to get my clocks up, but I am unable to modify the voltage on either card.

I have also tried the latest version of Sapphire TriXX, but all I get is two cards stuck at 300 MHz.

Right now I have the 17.7.2 drivers. If I use 16.9.1, the TriXX software works fine, but I have run into some issues with games that I am hoping the updated drivers will resolve.

Any help/ideas on options would be appreciated!

Thanks,

Jon


----------



## boot318

^Does 17.7.1 still allow you to OC? 17.7.2+ has been giving Hawaii owners trouble.

17.7.1 is a beta driver, but it has been super stable for me and my friends.


----------



## rdr09

Quote:


> Originally Posted by *Firstsage*
> 
> Hello everyone,
> New poster, but long-time reader of the forum. I recently decided to set up a water loop and am looking into overclocking my two 290s. I have a Sapphire Tri-X and an Asus reference card, both on blocks.
> 
> After fighting a bit with drivers and 3rd-party OC software, I found that I can use the beta version of Afterburner linked above to get my clocks up, but I am unable to modify the voltage on either card.
> 
> I have also tried the latest version of Sapphire TriXX, but all I get is two cards stuck at 300 MHz.
> 
> Right now I have the 17.7.2 drivers. If I use 16.9.1, the TriXX software works fine, but I have run into some issues with games that I am hoping the updated drivers will resolve.
> 
> Any help/ideas on options would be appreciated!
> 
> Thanks,
> 
> Jon


See post #43819:

http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/43810


----------



## Firstsage

Quote:


> Originally Posted by *rdr09*
> 
> See post# 43819
> 
> http://www.overclock.net/t/1436497/official-amd-r9-290x-290-owners-club/43810


Perfect! Thank you! I will give both of those a shot.

Quote:


> Originally Posted by *boot318*
> 
> ^Does 17.7.1 still allow you to OC? 17.7.2+ has been giving Hawaii owners trouble.
> 
> 17.7.1 is a beta driver, but it has been super stable for me and my friends.


So far 17.7.2 has been fine. The AB beta currently has both cards at 1075/1350.


----------



## Lazat

Quote:


> Originally Posted by *Kevindewapper*
> 
> I did get a lot of black screens when I overclocked the memory. To be safe, I went back to 17.1.1 and the beta MSI Afterburner. Still the same black screens with the same overclocks, though.
> 
> After installing the 17.1.1 driver I get wrong memory speeds in the first tab of GPU-Z (http://imgur.com/J6lMZ). But the memory value is shown correctly under "Sensors"?!


Yeah, that black screen happens if you go too far with the memory OC. Try to max out the core first, then find the max memory frequency at that core clock that won't give you black screens.
I would monitor your graphics card with MSI Afterburner rather than GPU-Z, and use HWiNFO64 for the VRMs. REMEMBER to use AB 4.4.0 Beta 16.

Quote:


> Originally Posted by *Firstsage*
> 
> Hello everyone,
> New poster, but long time reader of the forum. I recently decided to setup a water loop and am looking into overclocking my 2, 290s. I have a Sapphire TRI and a Asus reference both on blocks.
> 
> After fighting a bit on drivers and 3rd party OC software, I found that I can use the beta version of Afterburner liked above to get my clocks up, but am unable to modify the voltage on either card.
> 
> I have also tried the latest version of Sapphire TRIXX, but all I get is 2 cards stuck at 300MHz.
> 
> Right now, I have the 17.7.2 drivers. If i use 16.9.1, the TRIXX software works fine, but I have ran into some issues with games that I am hoping the updated drives will resolve.
> ...


Use 17.7.2 or 17.7.1 and MSI Afterburner 4.4.0 Beta 16

You need to unlock voltage control in Afterburner settings.



Spoiler: For overvolting past 100mV with Afterburner



Overvolt past 100 mV with AB
Just use /wi4,30,8d,10 for +100 mV (the number right after /wi is the I2C bus index and can differ per card; the example below uses /wi6). The offset step is 6.25 mV and the last argument is hexadecimal, so 10 hex = 16 decimal, and 16 × 6.25 = 100 mV. For +50 mV you need 8; for +200 mV you need 20 (20 hex = 32 decimal, and 32 × 6.25 = 200 mV).

Create a txt file on the desktop. Write:
CD C:\Program Files (x86)\MSI Afterburner
MSIAfterburner.exe /wi6,30,8d,10
and then save it as a .bat file.

Start it by right-clicking and choosing "Run as administrator". Every time you start this .bat file, MSI Afterburner will start with +100 mV.

For +50 mV: 8 (hex)
For +100 mV: 10
For +125 mV: 14
For +150 mV: 18
For +175 mV: 1C
For +200 mV: 20
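The hex value in that command is just the desired offset divided by the 6.25 mV step, written in hexadecimal. A minimal sketch of that arithmetic (the function name is illustrative; the 6.25 mV step, the register values, and the table entries come from the post above):

```python
# Convert a desired voltage offset (mV) into the hexadecimal argument used in
# the "MSIAfterburner.exe /wi<bus>,30,8d,<value>" command from the post above.
# The VRM controller applies the offset in 6.25 mV steps.

STEP_MV = 6.25

def offset_to_hex(offset_mv):
    """Return the hex /wi value for a given millivolt offset."""
    units = offset_mv / STEP_MV
    if units != int(units):
        raise ValueError(f"{offset_mv} mV is not a multiple of {STEP_MV} mV")
    return format(int(units), "X")

# Reproduces the table from the post:
for mv in (50, 100, 125, 150, 175, 200):
    print(f"+{mv} mV -> {offset_to_hex(mv)}")
```

So, for example, +75 mV works out to C (12 steps), and any offset that is not a multiple of 6.25 mV cannot be expressed exactly.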


----------



## rdr09

Quote:


> Originally Posted by *Firstsage*
> 
> Perfect! Thank you! I will give both of those a shot.
> So far the 17.7.2 has been fine. The Beta of AB currently has both cards at 1075/1350.


OC the cores first and see how far you can get them. Actually, if you can find out which card OCs better than the other, do that too. Years ago I found out that at near-freezing ambient I could OC one of my 290s to a 1300 core while the second only went up to 1250, both at +200 mV using TriXX. The slower card will be the limit in CrossFire.

After the cores, OC the memory.

Not sure what PSU you have, but I used to have a 1300 W unit and +200 was easy. Now, with just an 850 W, I can only add +100 using AB.

Max the power limit in any OC.


----------



## Kevindewapper

Quote:


> Originally Posted by *Lazat*
> 
> Yeah that blackscreen happens if you go to far with OC of memory. Try to max out Core first, then find your max Memfrequency on that core clock which wont give you blackscreens.
> I would monitor your graphics card with MSI Afterburner, not GPU-Z, and HWinfo64 for the VRMs. REMEMBER to use AB 4.4.0 Beta 16
> Use 17.7.2 or 17.7.1 and MSI Afterburner 4.4.0 Beta 16


I downloaded HWiNFO64 and already have the right versions of Afterburner and the AMD drivers.
My problem is that 1120/1450 runs the Heaven benchmark just fine, but when I start a game with high FPS, temperatures rise very, very fast (I monitored 120+ on the VRM!).

Most of the black screens and freezes I see happen in games with uncapped FPS, e.g. DiRT Rally at 300 FPS gives me 100-120 degree VRM temperatures. When I lock the FPS (with Radeon settings or in-game Vsync), the temperatures drop to 65-75 degrees.

So for now I will keep the FPS locked and see if that's stable with higher clocks.


----------



## RAZZTA01

Quote:


> Originally Posted by *Tame*
> 
> You can either use Cheat Engine to hack MSI AB voltage to whatever you want, or use the VRMtool http://www.overclock.net/t/1605757/vrmtool-a-simple-tool-to-read-and-write-to-i2c-vrm-controllers


Hi, thx for the input, but just looking at VRMTool I got scared







I don't want to mess things up, since I could not find any guide for this tool. It is definitely for advanced people with BIOS knowledge.

Can you point me to any guide on setting up Cheat Engine to hack the MSI AB voltage? Maybe this:



Also, which version should I use?

Rgds.


----------



## Tame

Quote:


> Originally Posted by *RAZZTA01*
> 
> Hi, thx for input, but just looking at VRMTool I got scared
> 
> 
> 
> 
> 
> 
> 
> I don't want to mess things up, since I could not find any guide for this tool. It is definitely for advanced people with BIOS knowledge.
> 
> Can you point me to any guide on setting up Cheat Engine to hack the MSI AB voltage? Maybe this:
> 
> 
> 
> Also, which version should I use?
> 
> Rgds.


Well, yeah, at the most basic level you use Cheat Engine like that.
Get MSI Afterburner 4.4.0 Beta 16: https://forums.guru3d.com/threads/rtss-6-7-0-beta-1.412822/page-32#post-5459908

If you want, here's a ready-made table that works with MSI AB 4.4.0 Beta 16 (default red skin); unzip it to ...Documents/My Cheat Tables/ and CE can load it automatically when you select the MSIAfterburner.exe process.

MSIAfterburner.zip 0k .zip file

Note: to use the voltage offset hack, move the voltage slider a little bit in AB, type the desired value in CE, and click Apply in AB.


----------



## RAZZTA01

Thx a lot for your help! I have successfully achieved +220 mV.
What is not working is:
override core clock
override DDR limit.
Any thoughts on how to do that, or is it not possible?
Rgds
EDIT: just found the answer myself. THX A LOT, TAME


----------



## RAZZTA01

Quote:


> Originally Posted by *Tame*
> 
> Well, yeah, at the most basic level you use Cheat Engine like that.
> Get MSI Afterburner 4.4.0 Beta 16: https://forums.guru3d.com/threads/rtss-6-7-0-beta-1.412822/page-32#post-5459908
> 
> If you want, here's a ready-made table that works with MSI AB 4.4.0 Beta 16 (default red skin); unzip it to ...Documents/My Cheat Tables/ and CE can load it automatically when you select the MSIAfterburner.exe process.
> 
> MSIAfterburner.zip 0k .zip file
> 
> Note: to use the voltage offset hack, move the voltage slider a little bit in AB, type the desired value in CE, and click Apply in AB.


Is "set aux voltage offset" needed?


----------



## Tame

Quote:


> Originally Posted by *RAZZTA01*
> 
> Is "set aux voltage offset" needed?


Not really; the +100 you get in AB should be more than enough. For some tests I have used +150 mV, but I'm still not sure it helps much.







Something like +30 to +50 mV can help stability at higher memory clocks.


----------



## clamyboy74

I have an MSI 290X Gaming, and I'm planning to put some Conductonaut on it. However, I can't find out whether the cooler's contact surface for the die is made of aluminum or not, so can anyone confirm that it isn't aluminum? Here's a pic I found online.


----------



## Tame

Quote:


> Originally Posted by *clamyboy74*
> 
> I have an MSI 290X Gaming, and I'm planning to put some Conductonaut on it. However, I can't find out whether the cooler's contact surface for the die is made of aluminum or not, so can anyone confirm that it isn't aluminum? Here's a pic I found online.


I'm not completely sure, but I don't think anyone uses aluminium as the heatsink base material; it's not very suitable for that. Most silver-looking bases are nickel-plated copper...


----------



## ZealotKi11er

Quote:


> Originally Posted by *clamyboy74*
> 
> I have an MSI 290X Gaming, and I'm planning to put some Conductonaut on it. However, I can't find out whether the cooler's contact surface for the die is made of aluminum or not, so can anyone confirm that it isn't aluminum? Here's a pic I found online.


Do not use liquid metal on the GPU.


----------



## clamyboy74

Why? I've seen thousands of posts and videos of other people doing it. Not giving any reason makes me question it.


----------



## ZealotKi11er

Quote:


> Originally Posted by *clamyboy74*
> 
> Why? I've seen thousands of posts and videos of other people doing it. Not giving any reason makes me question.


It will stain/corrode the die, and the benefits are not worth it.


----------



## RAZZTA01

Quote:


> Originally Posted by *Tame*
> 
> Not really, the +100 you get in AB should be more than enough. For some tests I have used +150 mV, but I'm still not sure if it helps much
> 
> 
> 
> 
> 
> 
> 
> Something like +30~ +50 mV can help stability at higher memory clocks.


Hi! Good to know, thx.
I cannot pass 1235 MHz, not even with CE. It just doesn't let me go above that. Any ideas on how to solve this?
Rgds.


----------



## Tame

Quote:


> Originally Posted by *RAZZTA01*
> 
> Hi! Good to know, thx.
> I cannot pass 1235 MHz, not even with CE. It just doesn't let me go above that. Any ideas on how to solve this?
> Rgds.


Hmm, my cards came with pretty high 1500/2000 limits in the BIOS by default, but the driver limited them to 1300/1625, which I could bypass with CE. Now my daily BIOS is 1200/1655, and because it's over the driver limit by default, I somehow get the full range anyway...
It might be that your BIOS limit is at 1235... I would look into the Hawaii BIOS modding thread, the "How to modify OC limits" section. If you just want to do pure high-OC benchmarking, I suggest trying the PT1 overclocking BIOS, if you have a reference-design-based card.


----------



## Kevindewapper

I am experiencing a lot of Wattman crashes, even with a very low overclock (1055/1250). It also seems to reset my overclock.


----------



## Lazat

Quote:


> Originally Posted by *Kevindewapper*
> 
> I am experiencing a lot of Wattman crashes, even with a very low overclock (1055/1250). Seems to also reset my overclock.


which driver?


----------



## Kevindewapper

Quote:


> Originally Posted by *Lazat*
> 
> which driver?


17.7.1.


----------



## chantruong

1) When I try to overclock my reference 290X in Afterburner, the screen resets and the card then gets limited to a 300 core and 150 memory. My driver version is 17.7.2. What is the fix for this?

2) Can you recommend aftermarket VRM and memory cooling if I want to use an AIO on this card?

Thanks,


----------



## diggiddi

See if these work
https://www.newegg.com/Product/Product.aspx?Item=N82E16835708012


----------



## Lazat

Quote:


> Originally Posted by *chantruong*
> 
> 1) When I try to overclock my reference 290X in Afterburner, the screen resets and the card then gets limited to a 300 core and 150 memory. My driver version is 17.7.2. What is the fix for this?
> Thanks,


Make sure you're using the latest Afterburner beta, Beta 16. You can find the link in my earlier post.


----------



## chantruong

Quote:


> Originally Posted by *Lazat*
> 
> Make sure you're using afterburner latest beta 16. You can find my post earlier with the link to it


Thanks! This did the trick







I will dial down the overclocks until I get better cooling. I would like to preserve my hearing.


----------



## cheapchipower

My reference, modded R9 290, unlocked to R9 290X, just died last week. I just can't express how sad and awkward I feel right now. I should be happy for the service she gave me and look forward to a new GPU purchase. But I am sad, very very sad. I just realized that I did not save up for a new card. Honestly, I was not expecting it to die, so I was not in the market for a new one. The most awkward and sad part is... I bought another one, since that's all I can afford right now... Welcome back to Hawaii then! US$124.00 is all I paid for another trip! Did not get enough sunburn I guess... damn


----------



## Tame

Quote:


> Originally Posted by *cheapchipower*
> 
> My reference, modded, and unlocked R9-290 to R9-290X just died last week. I just can't express how sad and awkward I feel right now. I should be happy for the service she gave me and look forward to a new GPU purchase. But I am sad, very very sad. I just realized that I did not to save up for a new card. Honestly, I am not expecting it to die so was not in the market for a new one. The most awkward and sad about it is... I bought another one since that's all I can afford right now... Welcome back to Hawaii then! US$124.00 is all I payed for another trip! Did not get enough sun burn I guess... damn


Sad to hear that. How did it die however?


----------



## cheapchipower

Quote:


> Originally Posted by *Tame*
> 
> Sad to hear that. How did it die however?


From regular GPU / whole-PC maintenance. I replaced the 8-month-old thermal paste, and when I put it back, sparks all over the card. I have an HD 7850 lying around to test everything else. PSU, mobo and all are still good. I might have touched something with the excess thermal paste, or whatever I did wrong in the disassembly. I can't tell. I checked everything and the thermal paste is just right. No excess, no cracks in the die. Nothing. I am very careful with my stuff, and very experienced with this kind of regular PC maintenance. I really can't pinpoint the cause. I'll just call it mishandling.


----------



## Tame

Quote:


> Originally Posted by *cheapchipower*
> 
> From a regular GPU / Whole PC maintenance.. Replaced the 8 month old thermal paste and when I put it back, sparks all over the card. I have an HD-7850 lying around to test everything else. PSU, Mobo and all are still good. I might have touched something with the excess thermal paste or whatever I did wrong in the disassembly. I can't tell. I checked everything and thermal paste are just right. No excess, no cracks in the die. No nothing. I am very careful to my stuff, and very well experienced with those regular PC maintenance. I really can't pin point the cause. I'll just call it mishandling.


Oh okay, I see, thanks for the answer. I was curious whether it died from component degradation or whatnot. But these GPUs seem to be tanks, so no.

I've lost two small ceramic capacitors on my primary R9 290; I think they are related to Vmem ripple suppression for the memory chips. They were placed so close to a mounting hole that the small plastic washer of a mounting screw partially covered them and pushed them off when I tightened the block down. It's a design flaw of the Gigabyte PCB in my opinion... I could try soldering them back, but the card is clocking better than ever, so I didn't bother.


----------



## ZealotKi11er

Quote:


> Originally Posted by *cheapchipower*
> 
> From a regular GPU / Whole PC maintenance.. Replaced the 8 month old thermal paste and when I put it back, sparks all over the card. I have an HD-7850 lying around to test everything else. PSU, Mobo and all are still good. I might have touched something with the excess thermal paste or whatever I did wrong in the disassembly. I can't tell. I checked everything and thermal paste are just right. No excess, no cracks in the die. No nothing. I am very careful to my stuff, and very well experienced with those regular PC maintenance. I really can't pin point the cause. I'll just call it mishandling.


So you killed it, then, because 290/290X cards just do not go and die by themselves.


----------



## Paul17041993

Quote:


> Originally Posted by *cheapchipower*
> 
> From a regular GPU / Whole PC maintenance.. Replaced the 8 month old thermal paste and when I put it back, sparks all over the card. I have an HD-7850 lying around to test everything else. PSU, Mobo and all are still good. I might have touched something with the excess thermal paste or whatever I did wrong in the disassembly. I can't tell. I checked everything and thermal paste are just right. No excess, no cracks in the die. No nothing. I am very careful to my stuff, and very well experienced with those regular PC maintenance. I really can't pin point the cause. I'll just call it mishandling.


Did you use conductive paste?


----------



## cheapchipower

idk


----------



## cheapchipower

I have no idea. It's Arctic MX-4.


----------



## Mega Man

No, it's not conductive or capacitive.


----------



## ratchet4234

Funny how this GPU is still able to play today's titles and it's old af now.
I've got a Sapphire Tri-X R9 290.
Still going strong!


----------



## cheapchipower

Okay, so I owned one of these for 3 years, and now got another one to replace the one that died. I transferred everything from the old card (Asetek CLC + NZXT G10). I never benched the old card as hard as this one. Everything is fine and cool though. This second card, which I got on the second-hand market, is only 1 year old. It still has the original stickers and receipt to prove it, and an ASIC quality of 86%. But I encounter a problem when it's kicking out those FPS. It has happened 4 times now. Every time I remove my FPS cap and the card is pushing out 180+ FPS, the game will just hang and black screen, but there are still game sounds and FFB. I cannot ctrl+alt+del out of it, or esc, or any other escape route. I just switch the PSU off to restart my PC. This never happened with my old card. Anyone have an idea?


----------



## bossie2000

MSI 390X owner here. I've had my card for 2 years now. Awesome card! I will keep it for another 2 years before upgrading, I think, but my kid wants it. So maybe I'll check out Vega 11, coming soon.


----------



## Tame

I played around with modding an R9 390 BIOS for my R9 290. It's quirky and harder to run than the stock R9 290 BIOS, but oh boy, more performance. I was able to break a 17k graphics score in Fire Strike pretty easily (the CPU was just at my daily low setting). It still needs fine tuning.


----------



## Streetdragon

Quote:


> Originally Posted by *cheapchipower*
> 
> Okay, so I owned one of these for 3 years. And now got another one to replace the one that died. I transferred everything from the old card. (Asetek CLC + NZXT G10) I never benched the old card as hard as this one. Everything is fine and cool though. This second card I got in the second hand market is only 1 year old. Still have the original stickers and receipt to prove. has an ASIC Quality of 86%. But I encounter a problem when it's kicking out those FPS. It happened 4 times now. Everytime I remove my FPS cap and the card is pushing out 180+ FPS, somehow the game would just hang and black screen, but there is still game sounds and FFB. I cannot ctrl+alt+del out of it or esc or any other escape route. I just switch the PSU off to restart my PC. This never happened to my old card. Anyone have an idea?


Hmmm, multiple problems could lead to something like that...

1. Wrong BIOS? Check that the original BIOS is on the card.
2. Heat? Maybe something got too hot and the card tried to save its life.
3. Try lowering the RAM clock. 30-50 MHz should be enough.
4. Driver problem. Do a DDU and a fresh install.
5. Try shouting at it.

If none of this helps... try taking the card out to dinner. Maybe she is shy


----------



## lanofsong

Hey AMD R9 290/290X owners,

We are having our monthly Foldathon from Monday 16th - Wednesday 18th - 12noon EST.
Would you consider putting all that awesome GPU power to a good cause for those 2 days? If so, come *sign up* and fold with us - see attached link.

October 2017 Foldathon

To get started:

1. Get a passkey (allows for the speed bonus) - you need a valid email address:
http://fah-web.stanford.edu/cgi-bin/getpasskey.py

2. Download the folding program:
http://folding.stanford.edu/

Enter your folding name (mine is the same as my OCN name)
Enter your passkey
Enter Team OCN number - 37726
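For reference, the same three settings can also be set directly in the v7 client's config.xml instead of through the GUI. A minimal sketch (the file location varies by OS, and YOUR_PASSKEY is a placeholder for the key from the Stanford link above - double-check the tag names against the official client docs for your version):

```xml
<config>
  <!-- Your folder name (same as your OCN name works fine) -->
  <user value="YOUR_FOLDING_NAME"/>
  <!-- Passkey enables the quick-return bonus -->
  <passkey value="YOUR_PASSKEY"/>
  <!-- Team OCN -->
  <team value="37726"/>
</config>
```

Restart the client after editing so it picks up the new identity.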

later
lanofsong


----------



## spyshagg

Quote:


> Originally Posted by *Tame*
> 
> I played around with modding a R9 390 bios for my R9 290. It's quirky and harder to run than my R9 290, but o boy more performance. I was able to break 17k graphics score in Firestrike pretty easily (cpu was just at my daily low setting). Still needs fine tuning.


Very nice. Is it a Reference 290? I have loads of problems with reference 290's because the display port loses sync when increasing vcore. I had it happen on a reference 290x a year ago, and yesterday when trying another reference 290 it also loses sync with just +40mv! When modding the stock bios with the 1250 strap, it black screens at just 1300mhz Ram speed. The modded 390 bios brings new problems, such as the lack of UEFI. Disappointed with the results, I took it out and put in my trusty DCUII 290X and some of the problems I witnessed with the 290 also happened. It turns out the very last Relive Driver is giving me issues, so I rolled back to the previous one and it behaved well. Sadly, time ran out and I could not test the reference 290 again with the previous driver to see if it improved.

Anyway, my best 24/7 score with the modded 290X DCUII was 14300 in Fire Strike (graphics), and despite the problems I had yesterday, the 290 scored 12900. A far cry from your 17K.

Could you post your bios btw? thx


----------



## Tame

Quote:


> Originally Posted by *spyshagg*
> 
> Very nice. Is it a Reference 290? I have loads of problems with reference 290's because the display port loses sync when increasing vcore. I had it happen on a reference 290x a year ago, and yesterday when trying another reference 290 it also loses sync with just +40mv! When modding the stock bios with the 1250 strap, it black screens at just 1300mhz Ram speed. The modded 390 bios brings new problems, such as the lack of UEFI. Disappointed with the results, I took it out and put in my trusty DCUII 290X and some of the problems I witnessed with the 290 also happened. It turns out the very last Relive Driver is giving me issues, so I rolled back to the previous one and it behaved well. Sadly, time ran out and I could not test the reference 290 again with the previous driver to see if it improved.
> 
> Anyway, my best 24/7 score with the modded 290X DCUII was 14300 firestrike (graphics), and despite the problemas I had yesterday, the 290 scored 12900. A far cry from your 17K.
> 
> Could you post your bios btw? thx


My cards are Gigabyte Windforce R9 290 OC 4GB with Hynix memory (Hynix AFR). They are pretty much reference design with 5 phase VRM. They are water cooled with full cover blocks and 11W/mk thermal pads for the VRM (otherwise they probably would have blown up already with this abuse).
The 17K was done at 1250 mV + a 375 mV offset (= 1625 mV theoretical 0%-load voltage), and what you would call a higher LLC setting for the VRM Vcore, so less Vdroop under load. The BIOS I used was an R9 390X BIOS modded by me. I slapped my previous BIOS modifications into it and did some tweaks. It doesn't clock quite as well as the 290 BIOS, but still gives better performance.

Yeah, losing the display signal is a problem at high voltage. To really fix it, you would need to hard-mod the 0.95V rail voltage. For benchmarking you can alleviate the issue by reducing monitor color depth to 6 bpc, reducing display resolution and refresh rate, and by using DVI instead of DisplayPort.
For the all-out stuff I use a 1080p monitor with DVI @ 25 Hz interlaced - instead of my 1440p IPS with DisplayPort









Here are some BIOSes, but the caveat is that they are all optimized for Hynix AFR 4GB memory only, and I'm 90% sure wouldn't work with anything else. They are provided as is, use at your own risk.
The R9 390 BIOS isn't well tested, and I only used it for benchmarks. It reports ~5 C lower temps (idle and load), but that seems to be a pure calibration difference between 290 and 390.

Here are my daily optimized BIOS (what I currently use) for R9 290 Hynix AFR:


Spoiler: Warning: Spoiler!



I cooked up optimized BIOS for my R9 290s. It's derived from my high clocks BIOSes. I used the knowledge I gained while modding those. Default clocks for this Tv1 version are 1100/1500.

Main features:
- Modded from Gigabyte BIOS, meant for reference designs, memory settings are for Hynix AFR
- Optimized voltage levels, both of my cards work great with these settings, or even little lower.
- Tighter voltage regulation, Load Line Slope Trim at -40% for reduced Vdroop
- Custom memory settings to get most out of the memory, and uses my hand tuned performance timings for 3D apps.
- Memory voltage upped to 1550 mV from default 1500 mV for a tad more memory oc headroom.

Powerplay table:


With these settings I can run the memory up to 1600 MHz with crossfire (up to 1700 MHz for high OC benching).



Tv1.zip 99k .zip file


R9 290 Benchmarking BIOS with the TV1 stuff, but power and voltage limits are outta window:

M_OC.zip 99k .zip file


R9 390X benchmarking BIOS modded for R9 290 (Hynix AFR):

T39_OC.zip 99k .zip file


----------



## spyshagg

Thanks man! My 290 was also on a full-cover waterblock when I tested it, but it's Elpida. The biggest issue is the UEFI, as I need to run my PC in UEFI mode due to BitLocker, and these modded BIOSes won't show any image in that mode.
Also, crazy voltages you are using in that run! But I do need my cards to never lose sync, as I still game on them. Unless I pass them on to my mining rigs and get a Vega or something.


----------



## Tame

Quote:


> Originally Posted by *spyshagg*
> 
> Thanks man! My 290 is also on a full cover waterblock when I tested it. But its Elpida. Biggest issue is the UEFI, as I need to run my pc in uefi due to bitlocker and these modded bios wont show any image in this mode.
> Also, crazy voltages you are using in that run! But I do need my cards to never lose sync as I still game with them. Unless I passed them onto my mining rigs and get a vega or something.


I see, well no luck with these then...

I did a quick comparison between BIOSes and Fire Strike graphics score. I chose to run all tests @ 1200/1650, except the Stilt's timings, which I can only run at 1500 memory.

Default with 1625 timing strap: 14 809
Custom with Stilt strap: 15 039 (could only be run @ <1500 memory)
Default with 1250 timing strap: 15 072
Custom with my strap: 15 388
Custom 390 with my strap: 15 533

Not a huge difference, but still something
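Put in percentage terms (quick arithmetic on the scores above, with the default 1625 strap as baseline):

```python
# Fire Strike graphics scores from the strap comparison above
baseline = 14809  # default BIOS, 1625 timing strap

scores = {
    "Custom, Stilt strap": 15039,
    "Default, 1250 strap": 15072,
    "Custom, my strap":    15388,
    "Custom 390, my strap": 15533,
}

for name, score in scores.items():
    gain = (score / baseline - 1) * 100
    print(f"{name}: +{gain:.1f}%")
# prints roughly +1.6%, +1.8%, +3.9%, +4.9%
```

So the timing mods alone are worth about 4-5% graphics score, which is free performance at the same clocks.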


----------



## Tame

I opened my room's window wide, and set the 1080 rad to draw the ~10 C air from outside. I was finally able to break the 1400 MHz barrier on the core clock









Water cooled R9 290 @ 1400/1650 MHz, Superposition benchmark:


I was actually able to also do 1400/1700, but the score was a couple of points lower








It's amazing how much even this small drop in temps helps; I didn't expect it to make a measurable difference. It's not like my room is at 10 C either...


----------



## Mega Man

Quote:


> Originally Posted by *cheapchipower*
> 
> Okay, so I owned one of these for 3 years. And now got another one to replace the one that died. I transferred everything from the old card. (Asetek CLC + NZXT G10) I never benched the old card as hard as this one. Everything is fine and cool though. This second card I got in the second hand market is only 1 year old. Still have the original stickers and receipt to prove. has an ASIC Quality of 86%. But I encounter a problem when it's kicking out those FPS. It happened 4 times now. Everytime I remove my FPS cap and the card is pushing out 180+ FPS, somehow the game would just hang and black screen, but there is still game sounds and FFB. I cannot ctrl+alt+del out of it or esc or any other escape route. I just switch the PSU off to restart my PC. This never happened to my old card. Anyone have an idea?


Sounds like a memory heat issue (minimal details, so I could very well be wrong). It is one of the reasons I don't like the slap-a-CLC-on-a-card mentality.


----------



## serave

Does anyone know where to get one of those reference coolers for the 290? Any dual-slot cooler is fine also.

I really need one for my ITX build, because I have a 290 lying around yet I'm unable to use it for the build because I slapped a Prolimatech MK-26 on it


----------



## Raephen

Quote:


> Originally Posted by *serave*
> 
> anyone knows where to get one of those reference cooler for the 290? any dual slot cooler is fine also
> 
> I really need one for my itx build because i have a 290 laying around yet unable to use it for the build because i slapped Prolimatech MK-26 on it


A reference cooler is not something you should / would want. Too inefficient and loud.

My R9 290 uses an Accelero Twin Turbo II + Gelid VRM kit atm and it does the job well enough.

Though, if I remember correctly, it's slightly taller than dual-slot, so I don't know how much of an issue that would be for you.


----------



## diggiddi

Ebay for sure


----------



## VegaSecureA

Quote:


> Originally Posted by *serave*
> 
> anyone knows where to get one of those reference cooler for the 290? any dual slot cooler is fine also
> 
> I really need one for my itx build because i have a 290 laying around yet unable to use it for the build because i slapped Prolimatech MK-26 on it


Good Luck with your search!

I still rock my 4-way CrossfireX reference R9 290Xs (in Uber mode). Purchased back when they came out in 2013. Never looked back!
Still runs all the triple A titles I play, smoothly and without throttling, on a 34" 3440x1440 FreeSync IPS monitor @ 60fps.

Noise? Irrelevant. I use in-ear headphones.
Heat? Irrelevant. Does not bother me, lol.
Power Consumption? A little over 1500W. I use an EVGA SuperNOVA NEX1500 Classified in OC mode (1650W).
Power Cost? Irrelevant. The price of a cup of coffee a month.


----------



## HOMECINEMA-PC

Quote:


> Originally Posted by *Tame*
> 
> I opened my room's window wide open, and put the 1080 rad drawing the ~10 C air from outside. I was finally able to break the 1400 MHz barrier on core clock
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Water cooled R9 290 @ 1400/1650 MHz, Superposition benchmark:
> 
> 
> I was actually able to also do 1400/1700, but the score was a couple of points lower
> 
> 
> 
> 
> 
> 
> 
> 
> It's amazing how much even this small drop in temps help, I didn't expect it to have a measurable difference. It's not like my room is at 10 C too...


I would like to see a 3DMark 11 / Fire Strike screenshot running that core clock








Quote:


> Originally Posted by *serave*
> 
> anyone knows where to get one of those reference cooler for the 290? any dual slot cooler is fine also
> 
> I really need one for my itx build because i have a 290 laying around yet unable to use it for the build because i slapped Prolimatech MK-26 on it


Found this




Lol a bit big for ITX


----------



## Tame

Quote:


> Originally Posted by *HOMECINEMA-PC*
> 
> I would like to see 3d mk11 / firestrike screener running that core clock


Here's even a clip that shows the benchmark setup












And yes, I used two PSUs just in case. The AX860i is powering the system + the GPU 8-pin, and a cheapo 1000 W PSU is powering the GPU 6-pin...
I don't have the 0.95V rail modded, so the display out is losing sync and doing all sorts of wonky things. Also, at 1400 MHz artifacts are obviously a thing








I used an R9 290 BIOS modded by me... It has a bit less Vdroop, tighter memory timings, etc.
Until I really pushed the OC, I never thought 1400 MHz on the core would be even remotely possible...



https://www.3dmark.com/fs/13883476


----------



## mus1mus

WOW!

Solid Score!

How did you overcome the official limits on the new Driver?

Epic card man!

Edit, put it into the bot and you'll be first!







But no! please. I hate losing points.









http://hwbot.org/submission/3614599_mus1mus_3dmark___fire_strike_radeon_r9_290_14830_marks


----------



## spyshagg

17k graphics. Madness. The card is from 2013


----------



## dagget3450

I just sold my last 290x today :-( i no longer have any 290x now.


----------



## mus1mus

Does anyone here have a Matrix BIOS?


----------



## crazycrave

I still have my Tri-X 290X New Edition, and someday I will overclock it, as it is rated for 375 watts... but it did 13,685 at the shipping clocks.

https://www.3dmark.com/fs/14037392


----------



## pozzallo

mus1mus, I have 3 Asus R9 290X Matrix Platinum cards. If you tell me how to copy the BIOS, I will send it to you


----------



## mus1mus

Use GPU-Z and save the BIOS file.















I appreciate it mate.


----------



## cephelix

Quote:


> Originally Posted by *dagget3450*
> 
> I just sold my last 290x today :-( i no longer have any 290x now.


I feel you. Took my MSI R9 290 out on Monday and replaced it with a Vega 56. At least it's going to my close friend. So I could always take it back whenever.. lol


----------



## Tame

Quote:


> Originally Posted by *mus1mus*
> 
> WOW!
> 
> Solid Score!
> 
> How did you overcome the official limits on the new Driver?
> 
> Epic card man!
> 
> Edit, put it into the bot and you'll be first!
> 
> 
> 
> 
> 
> 
> 
> But no! please. I hate losing points.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://hwbot.org/submission/3614599_mus1mus_3dmark___fire_strike_radeon_r9_290_14830_marks


Thanks!

I have MSI AB... so I can type in and apply any values I want. One could also mod higher base clocks into the BIOS, which will increase the overclocking frequency limits, and then use VRMTool to apply as much voltage as you'd like. But I find it more straightforward to just use MSI AB with the limits lifted :3

As it turns out, my primary card looks pretty OK. It can clock the memory a bit better, and it also has higher Vcore tolerance than my second card. At 1400 MHz I'm just running the card at the max Vcore it can tolerate. If I increase Vcore even more, it will just crash faster. I'm not sure if the 5-phase VRM has anything to do with it, or if it's just the limit of the card.
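For anyone wondering how the slider limits get lifted in MSI Afterburner: it's the unofficial overclocking switch in MSIAfterburner.cfg. A sketch from memory, so double-check the exact EULA string your Afterburner version expects before relying on it:

```ini
; MSIAfterburner.cfg (in the Afterburner install folder)
[ATIADLHAL]
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 1
```

With this set, Afterburner stops clamping AMD clocks to the driver's official range; restart Afterburner after editing.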
Quote:


> Originally Posted by *spyshagg*
> 
> 17k graphics. Madness. The card is from 2013


Yeah, and it's the non-X. Almost a 48% OC and tweaked memory timings should do something


----------



## mus1mus

That is hard for me to beat. For sure. But I may look into it some day.









280Xs are taking my time at the moment. Pretty good cards too! Did 1300/1800 on air on a couple of cards.










I did buy a 290X Matrix yesterday. But I got crossed by a seller who sold me a dead card!







My mistake! So it's nothing but a paperweight!


----------



## pozzallo

mus1mus, here is the BIOS for the R9 290X Matrix, attached. Hope it works for you. You're welcome

R9290Xmatrix.zip 99k .zip file


----------



## mus1mus

Quote:


> Originally Posted by *pozzallo*
> 
> mus1mus here is bios for R9 290X MATRIX attached. Hope it works for you. Are welcome
> 
> R9290Xmatrix.zip 99k .zip file


Thanks bud. Sad to say, that card is DOA. Not allowing me to install drivers.

Best I can do maybe is to get info on how to turn that card into a Power Board for other GPUs.

But I really appreciate the help man. Thanks a lot.


----------



## pozzallo

Quote:


> Originally Posted by *mus1mus*
> 
> Thanks bud. Sad to say, that card is DOA. Not allowing me to install drivers.
> 
> Best I can do maybe is to get info on how to turn that card into a Power Board for other GPUs.
> 
> But I really appreciate the help man. Thanks a lot.


Sorry to hear you have a DOA card. Glad to help.


----------



## gapottberg

Hey everyone. What do you all think a fair price for a used Sapphire 290 Tri-X is these days? A 4GB RX 580 is similar in performance and goes for about $250. I am thinking a fair used price would be about half that, at $125, assuming it is in working order.

Thoughts? It would be a potential upgrade for my brother's or son's rigs, both running FX with an R9 270 and a GTX 560 Ti, IIRC.


----------



## Broken-Heart

Just make sure their PSUs will handle the 290X. The Tri-X is one of the best 290X models. If you want you can sell it for $150 if it's in FULL working order. If you want to know the price of a GPU, you can search for it on ebay or other selling sites.

Their next upgrade should DEFINITELY be the CPU. Check out this video comparing the FX-8370 (second-best FX chip) to the lowest-tier Ryzen 3 CPU


----------



## Grummpy

Gave my 2 290 cards away to friends.


Both very happy


----------



## Matt-Matt

Quote:


> Originally Posted by *VegaSecureA*
> 
> Good Luck with your search!
> 
> I still rock my 4-way CrossfireX reference R9 290Xs (in Uber mode). Purchased back when they came out in 2013. Never looked back!
> Still runs all the triple A titles I play, smoothly and without throttling, on a 34" 3440x1440 FreeSync IPS monitor @ 60fps.
> 
> Noise? Irrelevant. I use in-ear headphones.
> Heat? Irrelevant. Does not bother me, lol.
> Power Consumption? A little over 1500W. I use an EVGA SuperNOVA NEX1500 Classified in OC mode (1650W).
> Power Cost? Irrelevant. The price of a cup of coffee a month.


I'm still on a single R9 290 from late 2013 that's had a full-cover block most of its life.

Honestly, apart from a week of mining when I got it, it's had an overall easy life. Sitting at +100mv with 1180/1450 isn't that hard on it, and it's never gone over 60C since going under water. I probably use it less than 10 hours a week now at the most.


----------



## TwilightRavens

I actually bought a second-hand MSI R9 290X (with the Twin Frozr), but it throttled at stock clocks (an easy 95C in about 5 minutes in any game or benchmark), so I slapped a Kraken G10 + Corsair H75 on it and have had no issues, except that I don't have the Asus BIOS, the one that offers 200 mV. Even so, I got really lucky considering that mine has SK Hynix RAM and not Elpida







so I can get the VRAM clocks sky high, 1600+ without any major issues.


----------



## Noswal

I just did a clean install of Windows 10, and I'm getting flickers and black screens/freezes while benchmarking in Heaven at stock clocks. It always crashes at the same point. I didn't have these issues before, but I can't remember how I set it up 3 years ago. Can anyone give me some insight? I'm just trying to get to a stable setting before I move on.
I have two reference 290Xs (Dell) w/ full-cover waterblocks (Koolance), powered by an Antec HCP-1000.

I tried:
Clean install of Windows 10 with & without UEFI
Latest drivers on AMD site (instant blackscreening on login)
Catalyst 14.7 drivers (currently using)
Cleaning out old drivers with DDU
Turning off ULPS in registry
Maxing out power limit in CCC and Afterburner
Quote:


> Originally Posted by *serave*
> 
> anyone knows where to get one of those reference cooler for the 290? any dual slot cooler is fine also
> 
> I really need one for my itx build because i have a 290 laying around yet unable to use it for the build because i slapped Prolimatech MK-26 on it


I still have mine from the 290X. If you're still interested, I can send it to you for the cost of shipping.


----------



## Tame

Quote:


> Originally Posted by *mus1mus*
> 
> WOW!
> 
> Solid Score!
> 
> How did you overcome the official limits on the new Driver?
> 
> Epic card man!
> 
> Edit, put it into the bot and you'll be first!
> 
> 
> 
> 
> 
> 
> 
> But no! please. I hate losing points.
> 
> 
> 
> 
> 
> 
> 
> 
> 
> http://hwbot.org/submission/3614599_mus1mus_3dmark___fire_strike_radeon_r9_290_14830_marks


I finally got around to registering on HWBot, lol. I did the Fire Strike suite and got 1st place in all of them for 1x R9 290, muhahaha.
Crossfire and other benchmarks are still to-do, but I also got 1st place in GPUPI 1B somehow.
http://hwbot.org/submission/3737023_
http://hwbot.org/submission/3737035_
http://hwbot.org/submission/3737046_
http://hwbot.org/submission/3736564_


----------



## TwilightRavens

Quote:


> Originally Posted by *Tame*
> 
> I finally got around to registering on HWbot, lol. I did the Firestrike suite and got 1st place in all 1x R9 290, muhahaha.
> Crossfire and other benchmarks still to-do, but I also got the 1st place in GPUPI 1B somehow.
> http://hwbot.org/submission/3737023_
> http://hwbot.org/submission/3737035_
> http://hwbot.org/submission/3737046_
> http://hwbot.org/submission/3736564_


Dang, how did you get it stable with that overclock? Mine won't go any higher than 1150 MHz, but that's because my vbios locks me to no more than +/- 100 mV.


----------



## Tame

Quote:


> Originally Posted by *TwilightRavens*
> 
> Dang, how did you get it stable with that overclock? mine won't go any higher than 1150Mhz but thats because my vbios lock me at no more than +/- 100 mv.


In short, I just use high voltage and beefy watercooling. The card(s) are only stable enough to pass the benchmark at the highest clocks. I could probably game at 1300 MHz if I really wanted to, but the stress and heat on the components would be pretty high.


----------



## TwilightRavens

Quote:


> Originally Posted by *Tame*
> 
> In short, I just use high voltage and beefy watercooling. The card(s) are only stable enough to pass the benchmark at the highest clocks. I could probably game at 1300 MHz, if I really wanted to, but the tress and heat on the components would be pretty high.


Yeah, mine could probably handle 1200 core if I could get it stable, but that's the problem. How risky would flashing the Asus BIOS to my MSI card be? It has Hynix RAM. In the most demanding stuff the temps never go over 65C, but that's with an AIO on the Kraken G10.


----------



## Tame

Quote:


> Originally Posted by *TwilightRavens*
> 
> Yeah mine could probably handle 1200 core if i could get it stable, but thats the problem. How risky would flashing the asus bios to my msi card be? It has Hynix RAM. But in the most demanding stuff the temps never go over 65C but thats with a AIO with the Kraken G10.


I actually used to game on a 1200 MHz BIOS for a while, but the heat was a little annoying in the summer. Now I'm just using 1100 MHz with ~default voltage & optimized timings, since I'm not playing anything too demanding atm.

I'd say you can try the flash, but be sure to have the dual-BIOS switch and always keep the default BIOS in the other position as a backup.


----------



## TwilightRavens

Quote:


> Originally Posted by *Tame*
> 
> I actually used to game on a 1200 MHz bios for a while, but the heat was a little annoying in the summer. Now I'm just using 1100 MHz with ~def voltage & optimed timings, since I'm not playing anything too demanding atm.
> 
> I'd say you can try to flash, but be sure to have the dual bios switch and allways keep the default bios in the other position as backup.


Yeah my card doesn't have dual bios


----------



## supersf

Can somebody help me modify the memory timings on my MSI 290 Gaming with Hynix memory?

Thanks

MSI_290.zip 42k .zip file


----------



## TwilightRavens

Also, the other reason I wish I could hit 1200 on the core is that for some games I just need that extra oomph, if you know what I mean. 1150/1525 at +100 mV, don't get me wrong, is pretty nice, but sometimes it just isn't enough; no matter how much more aux voltage I add, it just doesn't get 100% stable above those clocks.


----------



## Tame

Quote:


> Originally Posted by *TwilightRavens*
> 
> Also the other reason I wish I could hit 1200 on the core is because for some games I just need that extra oomph If you know what I mean. At 1150/1525 +100 mv don't get me wrong is pretty nice but sometimes just isn't enough, no matter how much more Aux Voltage I add It just doesn't get 100% stable above those clocks.


Quote:


> Originally Posted by *TwilightRavens*
> 
> Yeah my card doesn't have dual bios


Aux voltage has not really done much for me. I have it set to 1030 mV, but honestly it doesn't do much. I did a test in the past at a high memory clock, observing memory errors at different aux voltage levels: up to 1070 mV there was a very small improvement, and below 1000 mV the error count starts to rise sharply. For benching I have it at ~1100 mV, but it probably does nothing. Are you sure you don't have a BIOS switch? I think every R9 290 should have one.


----------



## TwilightRavens

Quote:


> Originally Posted by *Tame*
> 
> Aux voltage has not really done much for me. I have it set to 1030 mV, but honestly it doesn't do much. I did a test in the past at a high memory clock, observing memory errors at different aux voltage levels: up to 1070 mV there was a very small improvement, and below 1000 mV the error count starts to rise sharply. For benching I have it at ~1100 mV, but it probably does nothing. Are you sure you don't have a BIOS switch? I think every R9 290 should have one.


I might, but I haven't noticed one on mine; then again, I never looked for it either. Any idea where on the card it would be?

But yeah, running the memory at 390X speeds made quite the difference, but anything higher than 1525 doesn't make a bit of difference in performance.


----------



## Tame

Quote:


> Originally Posted by *TwilightRavens*
> 
> I might, but I haven't noticed one on mine; then again, I never looked for it either. Any idea where on the card it would be?
> 
> But yeah, running the memory at 390X speeds made quite the difference, but anything higher than 1525 doesn't make a bit of difference in performance.


It should be a small toggle like this at the side of the PCB:


I have observed an improvement in most 3D benchmarks from running the memory as high as possible. With tight timings, one of my cards can do 1700 and the other 1650 most of the time. For games, 1500-1600 is probably the sweet spot for Hynix.


----------



## TwilightRavens

Quote:


> Originally Posted by *Tame*
> 
> It should be a small toggle like this at the side of the PCB
> 
> I have observed an improvement in most 3D benchmarks from running the memory as high as possible. With tight timings, one of my cards can do 1700 and the other 1650 most of the time. For games, 1500-1600 is probably the sweet spot for Hynix.


Learn something new every day; it does have dual BIOS. Any recommendations on how to go about the flashing procedure?


----------



## bustacap22

Hoping someone can chime in with any benchmarks of Assassin's Creed Origins on a 290X with the current Adrenalin drivers. Debating whether I should get this game on PC or console; I'd prefer PC so I can use my $50.00 Steam card.

Current setup: i7 3930K, dual 290X, running 1440p. Appreciate any feedback. Thanks.


----------



## TwilightRavens

AFAIK Assassin's Creed Origins is a CPU hog because of its copy protection, so I imagine it'll definitely max out at 1080p or 1440p with CrossFire; if not, it may be CPU bound.


----------



## ManofGod1000

Quote:


> Originally Posted by *bustacap22*
> 
> Hoping someone can chime in with any benchmarks of Assassin's Creed Origins on a 290X with the current Adrenalin drivers. Debating whether I should get this game on PC or console; I'd prefer PC so I can use my $50.00 Steam card.
> 
> Current setup: i7 3930K, dual 290X, running 1440p. Appreciate any feedback. Thanks.


Only if the game supports CrossFire, which I have not been able to find out yet. The Assassin's Creed games heavily favor Nvidia hardware as well, so there is that.


----------



## TwilightRavens

Quote:


> Originally Posted by *ManofGod1000*
> 
> Only if the game supports CrossFire, which I have not been able to find out yet. The Assassin's Creed games heavily favor Nvidia hardware as well, so there is that.


Check this out: https://community.amd.com/thread/221648


----------



## RatusNatus

Quote:


> Originally Posted by *TwilightRavens*
> 
> Learn something new every day, it does have dual bios. Any recommendations on how to go about with the flashing procedure?


Search for the thread about the R9 290 to 290X mod.
It's really easy if you follow the tutorial.
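If it helps, the flash itself is normally done with atiflash from an elevated prompt. Here's a minimal sketch of the usual sequence; it assumes adapter index 0 and uses hypothetical file names, and it only prints the commands for review rather than running anything:

```python
# Hypothetical sketch of the usual atiflash sequence for a dual-BIOS 290.
# Nothing here touches the card: it only builds the command lines so you
# can review them before running them yourself, with the BIOS switch set
# to the position you intend to overwrite. Adapter index 0 and the file
# names are assumptions.

def flash_commands(adapter=0, backup="stock_290.rom", new_rom="asus_290.rom"):
    return [
        ["atiflash", "-i"],                         # list adapters, confirm the index
        ["atiflash", "-s", str(adapter), backup],   # save the current BIOS first
        ["atiflash", "-p", str(adapter), new_rom],  # program the new image
    ]

for cmd in flash_commands():
    print(" ".join(cmd))
```

Keep the switch on the stock position until the new image has been verified, so the untouched BIOS is always one flick away.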


----------



## PurpleChef

Quote:


> Originally Posted by *Noswal*
> 
> I just did a clean install of Windows 10, and I'm getting flickers and blackscreens/freezes while benchmarking in Heaven on stock clocks. It always crashes at the same point. I didn't have these issues before, but I can't remember how I set it up 3 years ago. Can anyone give me some insight? I'm just trying to get to a stable setting before I move on.
> I have two 290X reference (Dell) w/ full cover waterblocks (Koolance) powered by an Antec HCP-1000.
> 
> I tried:
> Clean install of Windows 10 with & without UEFI
> Latest drivers on AMD site (instant blackscreening on login)
> Catalyst 14.7 drivers (currently using)
> Cleaning out old drivers with DDU
> Turning off ULPS in registry
> Maxing out power limit in CCC and Afterburner


Did you solve this? I'm having huge problems with W10 and drivers. I don't know if it's the OS or if it's the card giving up. Alt-tabbing from CS:GO is close to impossible: either it freezes or it takes a long time to get back into the game. With the latest drivers I sometimes get a driver crash in the middle of matches, where I get the classic Windows device plug/unplug sound and the screen blinks like I was installing a new driver.
When using DDU to uninstall the drivers, I can't install new drivers after that; I have to use Windows Update to even get some drivers for the card. Tried different drivers, same thing.


----------



## TwilightRavens

Quote:


> Originally Posted by *PurpleChef*
> 
> Did you solve this? I'm having huge problems with W10 and drivers. I don't know if it's the OS or if it's the card giving up. Alt-tabbing from CS:GO is close to impossible: either it freezes or it takes a long time to get back into the game. With the latest drivers I sometimes get a driver crash in the middle of matches, where I get the classic Windows device plug/unplug sound and the screen blinks like I was installing a new driver.
> When using DDU to uninstall the drivers, I can't install new drivers after that; I have to use Windows Update to even get some drivers for the card. Tried different drivers, same thing.


The only issue I have been having is that sometimes when I wake my computer after it's been asleep all night, it resumes to a black screen. I thought disabling ULPS would fix it, but it sometimes still happens, so I have to do a hard reset. This is on the December 17.12.whatever driver (whatever the last 17.x release is/was); I'm also not sure if the 18.x drivers are out of beta yet, as I haven't looked in a week. But if there's anything I've learned about their drivers, it's not to use the first drivers of the year.
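For anyone disabling ULPS by hand rather than through Afterburner: the EnableUlps value sits under the display-adapter class subkeys in the registry. A small sketch that only builds the paths to look at in regedit; the subkey count here is an assumption, and nothing is written:

```python
# Builds the registry paths where AMD's EnableUlps value usually lives,
# one per display-adapter subkey (0000, 0001, ...). The GUID is the
# standard display-adapter device class; setting EnableUlps = 0 under
# each matching subkey disables ULPS. This helper prints paths only.

DISPLAY_CLASS = (r"HKLM\SYSTEM\CurrentControlSet\Control\Class"
                 r"\{4d36e968-e325-11ce-bfc1-08002be10318}")

def ulps_paths(max_adapters=4):
    return [DISPLAY_CLASS + "\\" + format(i, "04d") for i in range(max_adapters)]

for path in ulps_paths():
    print(path)
```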


----------



## Noswal

Quote:


> Originally Posted by *PurpleChef*
> 
> Did you solve this? I'm having huge problems with W10 and drivers. I don't know if it's the OS or if it's the card giving up. Alt-tabbing from CS:GO is close to impossible: either it freezes or it takes a long time to get back into the game. With the latest drivers I sometimes get a driver crash in the middle of matches, where I get the classic Windows device plug/unplug sound and the screen blinks like I was installing a new driver.
> When using DDU to uninstall the drivers, I can't install new drivers after that; I have to use Windows Update to even get some drivers for the card. Tried different drivers, same thing.


I flashed to TheStilt MLU bios for both my cards. That solved the immediate crash issues. I get a white screen (with sound) after about 30 minutes of play with older games (BF2:BC) and game crashes with newer games (Destiny 2). This was the same with version 17 and 18 drivers.

I tried going back to stock on the version 18 drivers, and it seems to work if I leave it untouched for several minutes when it blackscreens; I think something is going on while the screen is black. If it crashes, the computer reboots. I just let this happen a few times and eventually it worked. However, I still got whitescreens in BF2:BC. I added 10 mV to each core DPM state in Wattman and BF2:BC stopped whitescreening. However, Destiny 2 still froze my computer after some time.

I also switched the DisplayPort cable from the first card to the second card. This may have affected it, but I'm not too sure. I just know that the cards don't finish setting up until they're plugged into a display. I learned this by switching from on-board graphics to dedicated graphics while my computer was on. Instant freeze.


----------



## chris89

Anyone want me to mod your BIOS? Send it on over here... Attach the .ROM as a .ZIP to avoid the JSON error.


----------



## TwilightRavens

chris89 said:


> Anyone want me to mod your BIOS? Send it on over here... Attach the .ROM as a .ZIP to avoid the JSON error.


Would it be possible for you to adjust the max voltage allowed on my card from, say, +100 mV to +200 mV? Or would this mainly be for clocks and timings?


----------



## Talon720

Noswal said:


> I just did a clean install of Windows 10, and I’m getting flickers and blackscreens/freezes while benchmarking in Heaven on stock clocks. It always crashes at the same point. I didn’t have these issues before, but I can’t remember how I set it up 3 years ago. Can anyone give me some insight? I’m just trying to get to a stable setting before I move on.
> I have two 290X reference (Dell) w/ full cover waterblocks (Koolance) powered by an Antec HCP-1000.
> 
> I tried:
> Clean install of Windows 10 with & without UEFI
> Latest drivers on AMD site (instant blackscreening on login)
> Catalyst 14.7 drivers (currently using)
> Cleaning out old drivers with DDU
> Turning off ULPS in registry
> Maxing out power limit in CCC and Afterburner


I have been having very similar issues to yours. I just got my 290X build back together and started having black screens on login, when first starting a game, or when something like Aquasuite opens. I tried a lot of the same things you did. The only thing that's sort of helped me was disabling CrossFire via the dip switches and trying a new BIOS. It's been super frustrating.


----------



## chris89

TwilightRavens said:


> Would it be possible for you to say adjust the max voltage allowed on my card from say +100mv to +200mv? Or would this mainly be for clocks and timings?


Sure, attach your .rom. Save the .rom in GPU-Z, then compress the .rom into a .zip; do you know how to do that?



Talon720 said:


> I have been having a very similar issues to you. I just got my 290x build back together and started having black screens on login or first starting a game or something like aquasuite opening. I tried a lot of the same things you did. Only thing that’s sorta helped me was disabling crossfire via dip switches and tried a new bios. It’s been super frustrating


Sure, attach your .rom. Save the .rom in GPU-Z, then compress the .rom into a .zip; do you know how to do that?


----------



## TwilightRavens

chris89 said:


> Sure, attach your .rom. Save the .rom in GPU-Z, then compress the .rom into a .zip; do you know how to do that?


Yeah I do


----------



## PurpleChef

So I own an XFX Radeon 290X. What's the best driver for Win7?
Is it worth overclocking at all with stock cooling? CPU is an i5 4670K @ 4.5 GHz.


----------



## ZealotKi11er

PurpleChef said:


> So I own an XFX Radeon 290X. What's the best driver for Win7?
> Is it worth overclocking at all with stock cooling? CPU is an i5 4670K @ 4.5 GHz.


Latest drivers. Stock cooling as in the reference blower? If that's the case, the best thing is to undervolt.


----------



## TwilightRavens

ZealotKi11er said:


> Latest drivers. Stock cooling as in Reference Blower? If that is the case best thing is to under-volt.


Yeah, if it's the reference cooler, try to undervolt it, or buy an Arctic Accelero; it's a really good aftermarket air cooler. Or, if you want to go all out, you can buy a Kraken G10 and a good AIO for it (I picked a Corsair H75i for mine). I can overclock the crap out of mine and it never exceeds 65C under synthetic loads; real-world temps are in the upper 40s to mid 50s most of the time, but games that hammer the GPU hard will take it up to about 62C.
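As a rough illustration of why a small undervolt tames a blower card: dynamic power scales roughly with frequency times voltage squared, so voltage cuts pay off quadratically. A back-of-envelope sketch with illustrative, not measured, baseline numbers:

```python
# Dynamic power scales roughly as P ~ f * V^2. The 1000 MHz / 1250 mV
# baseline here is illustrative only, not a measured Hawaii figure.

def relative_power(f_mhz, v_mv, f0_mhz=1000.0, v0_mv=1250.0):
    return (f_mhz / f0_mhz) * (v_mv / v0_mv) ** 2

# A -75 mV undervolt at stock clocks trims dynamic power by roughly 12%,
# which is real heat off a single-fan blower.
print(round(relative_power(1000, 1175), 3))
```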


----------



## chris89

PurpleChef said:


> So i own a xfx radeon 290x. Whats the best driver for Win7?
> Is it worth overclocking at all with stock cooling? cpu is i5 4670k @ 4.5ghz.


Attach your ROM, but you need to ZIP it to attach it here. Send your HWiNFO sensors page so I know what we're looking at thermally.

I use a reference 390X, and no, you don't need to undervolt or anything; mine runs fine even at up to a 1,250 MHz core clock, so it can handle sky-high voltage.



ZealotKi11er said:


> Latest drivers. Stock cooling as in Reference Blower? If that is the case best thing is to under-volt.


I use a reference 390X, and no, you don't need to undervolt or anything; mine runs fine even at up to a 1,250 MHz core clock, so it can handle sky-high voltage.



TwilightRavens said:


> Yeah, if it's the reference cooler, try to undervolt it, or buy an Arctic Accelero; it's a really good aftermarket air cooler. Or, if you want to go all out, you can buy a Kraken G10 and a good AIO for it (I picked a Corsair H75i for mine). I can overclock the crap out of mine and it never exceeds 65C under synthetic loads; real-world temps are in the upper 40s to mid 50s most of the time, but games that hammer the GPU hard will take it up to about 62C.


You're better off buying a Sapphire Tri-X 290, 290X, 390, or 390X; it's the best cooler, or maybe the Gigabyte Windforce.

I bought all those coolers, the Arctic Extreme IV and the Gelid Icy Vision, and they all suck; the stock reference cooler holds up thermally way better than everything except maybe the Sapphire Tri-X and the Gigabyte Windforce.


----------



## TwilightRavens

chris89 said:


> Attach your ROM, but you need to ZIP it to attach it here. Send your HWiNFO sensors page so I know what we're looking at thermally.
> 
> I use a reference 390X, and no, you don't need to undervolt or anything; mine runs fine even at up to a 1,250 MHz core clock, so it can handle sky-high voltage.
> 
> You're better off buying a Sapphire Tri-X 290, 290X, 390, or 390X; it's the best cooler, or maybe the Gigabyte Windforce.
> 
> I bought all those coolers, the Arctic Extreme IV and the Gelid Icy Vision, and they all suck; the stock reference cooler holds up thermally way better than everything except maybe the Sapphire Tri-X and the Gigabyte Windforce.


Yeah, I had an Accelero on mine for a while because the stock Twin Frozr cooler just wouldn't keep this thing under 95C. I know it's supposed to be a good cooler, but this thing wasn't even overclocked and it would throttle. With the Accelero I was able to keep it at 75C with a modest overclock. Yes, all the fans on the Twin Frozr were working, and I tried repasting; I just think the heatsink had a manufacturing defect, because it seemed slightly convex at the actual contact point.

Also, is there really any difference between the two BIOSes on the MSI R9 290X Gaming 4G (depending which way the switch is flipped) in terms of performance, or is it just factory overclock vs. stock reference clocks? Or does one have more aggressive timings than the other?


----------



## PurpleChef

ZealotKi11er said:


> Latest drivers. Stock cooling as in Reference Blower? If that is the case best thing is to under-volt.


Sorry, I don't understand; why would I undervolt it? Yes, it's the reference card, the one with only 1 fan.

Why should I upload a ROM? What's the purpose? I'm new to GPU OC, etc.


----------



## TwilightRavens

PurpleChef said:


> Sorry, I don't understand; why would I undervolt it? Yes, it's the reference card, the one with only 1 fan.
> 
> Why should I upload a ROM? What's the purpose? I'm new to GPU OC, etc.


Because if you don't undervolt it, it will most likely run at 95C the whole time and just throttle its speed to stay at 95C; the reference blower really "blows" in terms of cooling, pun intended.


Also, I think I hit the max stable clocks my card will do: 1150/1425 (Hynix VRAM; it can probably go higher, but it artifacts at 1525, and occasionally at 1475). It might also hit 1160, but I haven't tried that with 1425 memory. Will increasing aux voltage help with core clock, or does that really only affect memory? I know the regular voltage is tied to both. Temps are not an issue: running the card at +100 mV, the hottest I've seen it get in game is 62C (68C in stress testing); it usually hovers in the mid 50s, and the VRMs probably max out around 85C.


----------



## blazestalker100

Hi, help please.
I got my reference card at release, clocked it to 1155 core stable, and have run it there ever since.
The card was a good pick and gave me almost a 25% performance increase, stable all these years. That's roughly 1.22 V in my case, with an aftermarket cooler at a max of 75C.

Over the last 6 months it has started dying on me. Just wondering if that's expected since it was clocked all these years, or if this can be fixed with a BIOS edit, etc.?

It's basically crashing under load: lots of artifacts, sound hangs, screen freezes, and the AMD control center gives an error saying "device lost" or something like that.
Yes, it's running stock now, and it really doesn't make any difference whether it's stock or clocked anymore; the errors still pop up.
Some games seem to trigger the crash faster than others.

I'm monitoring my temps and they are not high.

It does seem to have something to do with Crimson's temp/downclock controls though, because it seems to happen when the GPU reaches 75C and stays there for 1 minute, then crashes.

I've been playing around with fixes for many months, but I really can't figure it out; I'm just trying the obvious and not getting into the more advanced troubleshooting.

I had an idea that maybe Crimson undervolts the card too much, causing it to crash?

Or maybe my card has reached the end of its life span.


----------



## TwilightRavens

blazestalker100 said:


> Hi, help please.
> I got my reference card at release, clocked it to 1155 core stable, and have run it there ever since.
> The card was a good pick and gave me almost a 25% performance increase, stable all these years. That's roughly 1.22 V in my case, with an aftermarket cooler at a max of 75C.
> 
> Over the last 6 months it has started dying on me. Just wondering if that's expected since it was clocked all these years, or if this can be fixed with a BIOS edit, etc.?
> 
> It's basically crashing under load: lots of artifacts, sound hangs, screen freezes, and the AMD control center gives an error saying "device lost" or something like that.
> Yes, it's running stock now, and it really doesn't make any difference whether it's stock or clocked anymore; the errors still pop up.
> Some games seem to trigger the crash faster than others.
> 
> I'm monitoring my temps and they are not high.
> 
> It does seem to have something to do with Crimson's temp/downclock controls though, because it seems to happen when the GPU reaches 75C and stays there for 1 minute, then crashes.
> 
> I've been playing around with fixes for many months, but I really can't figure it out; I'm just trying the obvious and not getting into the more advanced troubleshooting.
> 
> I had an idea that maybe Crimson undervolts the card too much, causing it to crash?
> 
> Or maybe my card has reached the end of its life span.


Try one of these things:

1. Raise the voltage
2. Lower the clocks whilst keeping the same voltage.

Then stress test it like you normally do and see if it helps. To me it sounds more like chip degradation than the card outright dying; my 290X is going through degradation too. I had to go from 1150 down to 1140 with the same +100 mV, and that pretty much made it stable as far as I can tell.

Since you have it back at stock, just raise the voltage slider by about 10 mV and go from there.
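The step-down approach can be sketched as a simple search. `is_stable` below is a stand-in for whatever stress test you actually run at each clock (a Heaven loop, your worst-crashing game, etc.):

```python
# Walk down from the old stable clock in 10 MHz steps until the stress
# test passes; returns None if nothing above the floor is stable.

def find_stable_clock(start_mhz, is_stable, step=10, floor=1000):
    clk = start_mhz
    while clk >= floor:
        if is_stable(clk):
            return clk
        clk -= step
    return None

# Example: a card that has degraded from 1150 to 1140 stable.
print(find_stable_clock(1150, lambda mhz: mhz <= 1140))
```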


----------



## joharrisderby82

*R9 290 screw size*

Hi, my waterblock has failed so I need to put my fans back on. Any idea what screws and sizes I need for the GPU and backplate, please?


----------



## TwilightRavens

joharrisderby82 said:


> Hi, my waterblock has failed so I need to put my fans back on. Any idea what screws and sizes I need for the GPU and backplate, please?


Going to need more info, like which model of 290 you have: reference or AIB? And either way, which PCB manufacturer?


----------



## TheOnlyNapster

*screw size*

Does anyone know what spring-loaded screw size the ASUS 290X uses?
I got this card with the backplate and all, but no screws to hold the GPU.

Anyone know where to get them and what size they are?

Thanks


----------



## blazestalker100

TwilightRavens said:


> Try one of these things:
> 
> 1. Raise the voltage
> 2. Lower the clocks whilst keeping the same voltage.
> 
> Then stress test it like you normally do and see if it helps. To me it sounds more like chip degradation than the card outright dying; my 290X is going through degradation too. I had to go from 1150 down to 1140 with the same +100 mV, and that pretty much made it stable as far as I can tell.
> 
> Since you have it back at stock, just raise the voltage slider by about 10 mV and go from there.


Hi, a bit late, but I tried this and it won't work.

I enabled "force constant voltage" in Afterburner, but still: my voltage lives its own life when the GPU is under load in games.
When I exit the game, the voltage goes up to the "forced" amount I set; back in the game, it lowers to around 1.18 V; exit the game and it's up to 1.25 V.

I can't seem to force voltage under load?


----------



## bichael

Hi, wondering how quiet a reference 290 cooler is with a decent undervolt and underclock?

I've moved my water-cooled 290 to a cheap bedroom build, so it's massively CPU bottlenecked anyway; hence I'm considering taking off the water cooling.


----------



## crazycrave

I need some help. My card is a Sapphire Tri-X New Edition with Samsung memory and 2x 8-pin. My issue is the +100 mV slider in Afterburner: I need more, as it's maxed but the card is still pretty stable.


15,000 gpu score https://www.3dmark.com/3dm/26714789


----------



## Tame

crazycrave said:


> I need some help. My card is a Sapphire Tri-X New Edition with Samsung memory and 2x 8-pin. My issue is the +100 mV slider in Afterburner: I need more, as it's maxed but the card is still pretty stable.
> 
> 
> 15,000 gpu score https://www.3dmark.com/3dm/26714789


You can hack Afterburner with Cheat Engine, or use the VRMTool.


----------



## crazycrave

What do you feel is safe for voltage? It's a 6-phase board rated at 375 watts. It has some kind of reset, or maybe it's my 8-year-old Corsair TX950 tripping after about 6 runs in Fire Strike; the system doesn't go down, it's just a black-screen trip. But it's nice to do it all on AMD hardware, as I still have an active 3800x2 running.


----------



## TwilightRavens

So how would one go about adjusting the max voltage the card can use via a Hawaii BIOS edit? I.e., my card currently has a max of +100 mV, but I want to make it +150 or +200 without flashing the ASUS BIOS.


----------



## Tame

crazycrave said:


> What do you feel is safe for voltage? It's a 6-phase board rated at 375 watts. It has some kind of reset, or maybe it's my 8-year-old Corsair TX950 tripping after about 6 runs in Fire Strike; the system doesn't go down, it's just a black-screen trip. But it's nice to do it all on AMD hardware, as I still have an active 3800x2 running.


Well, on the stock BIOS, +200 mV with decent cooling should not be damaging.
I have run some extreme benchmark runs at ~+400 mV (1.45-1.50 V under load), where my Corsair AX860i was not enough for the system with one card and I had to use an additional second PSU. I did this with water-cooled Gigabyte Windforce cards with upgraded VRM thermal pads, which actually improved the cooling quite a bit; they just have the stock-ish 5+1 phase VRM.
The black screen could also be the OCP of the card, or the memory if you are pushing that. If your whole system shuts down it's probably your old PSU, but with a 950 W unit you should not be anywhere near its limit with a single card and a non-extreme overclock.


----------



## crazycrave

The board is a Sapphire design with 6-phase power and 2x 8-pin, rated at 375 watts. My best guess is it may be reaching the amp limit of the supply on a single rail. It has Samsung memory, hence the 1700 MHz clocks, but it seems Fire Strike likes GPU clock scaling on Hawaii more than memory.

My goal was to show that Ryzen does not hold GPUs back, given that you can gain 1300 GPU points just by turning the GPU clocks up while the CPU runs stock clocks.

----------



## chris89

crazycrave said:


> The board is a Sapphire design with 6-phase power and 2x 8-pin, rated at 375 watts. My best guess is it may be reaching the amp limit of the supply on a single rail. It has Samsung memory, hence the 1700 MHz clocks, but it seems Fire Strike likes GPU clock scaling on Hawaii more than memory.
> 
> My goal was to show that Ryzen does not hold GPUs back, given that you can gain 1300 GPU points just by turning the GPU clocks up while the CPU runs stock clocks.


Can you attach your bios here? .zip


----------



## TwilightRavens

crazycrave said:


> The board is a Sapphire design with 6-phase power and 2x 8-pin, rated at 375 watts. My best guess is it may be reaching the amp limit of the supply on a single rail. It has Samsung memory, hence the 1700 MHz clocks, but it seems Fire Strike likes GPU clock scaling on Hawaii more than memory.
> 
> My goal was to show that Ryzen does not hold GPUs back, given that you can gain 1300 GPU points just by turning the GPU clocks up while the CPU runs stock clocks.


Yeah, the only reasons Ryzen isn't as good in gaming are: 1. It's a new architecture that games haven't been optimized for yet, and 2. The Infinity Fabric introduces a bit of latency that Intel's ring-bus design doesn't suffer from. If Ryzen were somehow a monolithic design, it would probably match or beat Intel's CPUs in gaming. Think of it like the old Core 2 Quads, which were two dual-core dies on one substrate, before Intel started making monolithic quad cores with Nehalem.


And Hawaii doesn’t really benefit from high memory clocks because of the 512 bit bus its memory resides on, it already has a ton of bandwidth, the core is the best thing to shoot for on it because it has to feed all that memory.


----------



## rdr09

crazycrave said:


> The board is a Sapphire design with 6-phase power and 2x 8-pin, rated at 375 watts. My best guess is it may be reaching the amp limit of the supply on a single rail. It has Samsung memory, hence the 1700 MHz clocks, but it seems Fire Strike likes GPU clock scaling on Hawaii more than memory.
> 
> My goal was to show that Ryzen does not hold GPUs back, given that you can gain 1300 GPU points just by turning the GPU clocks up while the CPU runs stock clocks.



A graphics score gain of 1300 points in Fire Strike with an R9 290? You'll only need an 1100 MHz core for that, with memory at 1400 MHz. That's been the case since the 16.x drivers.

I recall needing 1300 MHz to reach the same score at launch, lol.


----------



## crazycrave

chris89 said:


> Can you attach your bios here? .zip


Give me a link on how to do it.


----------



## chris89

crazycrave said:


> give me a link on how to do it,


You should be able to drag a zip in here like this; if that isn't working, try http://www.mediafire.com


----------



## BradleyW

Can anyone recommend the settings I should use for GR Wildlands @ 1440p for a stock 290X 4GB? I want to maintain above 36 fps. Thank you.


----------



## TwilightRavens

BradleyW said:


> Can anyone recommend the settings I should use for GR Wildlands @ 1440p for a stock 290X 4GB? I want to maintain above 36 fps. Thank you.


All medium with maybe a mix of low settings in shadow related settings.


----------



## BradleyW

TwilightRavens said:


> All medium with maybe a mix of low settings in shadow related settings.


That's a bit disappointing. I watched a video on YouTube of the 290X pulling 30 fps on ultra and 45+ on high, but that was at 1080p.


----------



## TwilightRavens

BradleyW said:


> That's a bit disappointing. I watched a video on YouTube of the 290X pulling 30fps on ultra and 45+ on high but this was at 1080p.


I mean you could certainly try, but an overclock would probably help a bit if you did.


----------



## chris89

Send your BIOS so we can amp up the clocks a little bit first, but boost your CPU clock too if possible...


@mikeyy233

http://www.mediafire.com/file/77e9x4099a36g04/1173.1563.1373.1000.rom


----------



## BradleyW

TwilightRavens said:


> All medium with maybe a mix of low settings in shadow related settings.


Tried the game! Glad to say I am playing on Very High with God Rays Enhanced and all other settings enabled apart from long dist. shadows. Getting 34 fps at worst, but mostly hovering around 42 fps. This is at 2560 x 1080. No stuttering, silky smooth. Lovely at 144 Hz. The higher refresh rate makes the game feel like it is running at 60 fps with Vsync.

3930K 4.5GHz, R9 290X 1.1GHz/1.35GHz, 4x4GB DDR3 2400.

Quick question, how do I force constant core and memory frequency? When I use an FPS limiter in-game, the clocks hold back because the GPU usage is reduced. I used to be able to force the speeds by disabling powerplay in MSI afterburner but I don't have this option anymore. Using latest Beta Adrenalin driver (18.6.1) and latest MSI AB (4.5.0). Chill is OFF and Wattman is not active.

Thank you.


----------



## TwilightRavens

BradleyW said:


> Tried the game! Glad to say I am playing on Very High with God Rays Enhanced and all other settings enabled apart from long dist. shadows. Getting 34 fps at worst, but mostly hovering around 42 fps. This is at 2560 x 1080. No stuttering, silky smooth. Lovely at 144 Hz. The higher refresh rate makes the game feel like it is running at 60 fps with Vsync.
> 
> 3930K 4.5GHz, R9 290X 1.1GHz/1.35GHz, 4x4GB DDR3 2400.
> 
> Quick question, how do I force constant core and memory frequency? When I use an FPS limiter in-game, the clocks hold back because the GPU usage is reduced. I used to be able to force the speeds by disabling powerplay in MSI afterburner but I don't have this option anymore. Using latest Beta Adrenalin driver (18.6.1) and latest MSI AB (4.5.0). Chill is OFF and Wattman is not active.
> 
> Thank you.


The clocks will only go as high as needed, so if you have it capped at say 30fps and it stays at 800MHz you can’t really force it to go higher because it doesn’t need to. The only real way I can think of is upping the frame rate cap, that would up the utilization and the clocks.


----------



## BradleyW

TwilightRavens said:


> The clocks will only go as high as needed, so if you have it capped at say 30fps and it stays at 800MHz you can’t really force it to go higher because it doesn’t need to. The only real way I can think of is upping the frame rate cap, that would up the utilization and the clocks.


That's useless. Before this new series of driver, or in prior MSI AB applications, you had the option to remove powerplay which would force constant clock speeds. I think it's since Wattman came out. It replaced powerplay by the looks of it!

Edit: It seems wattman has STATE options, so if I put the GPU at full speed on all STATE(s) then that should force the clocks.


----------



## chris89

BradleyW said:


> Tried the game! Glad to say I am playing on Very High with God Rays Enhanced and all other settings enabled apart from long dist. shadows. Getting 34 fps at worst, but mostly hovering around 42 fps. This is at 2560 x 1080. No stuttering, silky smooth. Lovely at 144 Hz. The higher refresh rate makes the game feel like it is running at 60 fps with Vsync.
> 
> 3930K 4.5GHz, R9 290X 1.1GHz/1.35GHz, 4x4GB DDR3 2400.
> 
> Quick question, how do I force constant core and memory frequency? When I use an FPS limiter in-game, the clocks hold back because the GPU usage is reduced. I used to be able to force the speeds by disabling powerplay in MSI afterburner but I don't have this option anymore. Using latest Beta Adrenalin driver (18.6.1) and latest MSI AB (4.5.0). Chill is OFF and Wattman is not active.
> 
> Thank you.


I can mod the BIOS so it has only low-speed 2D & high-speed 3D clocks; never tested it before, so maybe you could be the first? Send your BIOS.


----------



## crazycrave

I've been busy and lost interest for a while, sorry, but I'm coming back around. I was testing the setup of my Seiki 42UM 4K 60Hz display (firmware upgraded to reduce gaming lag; it was an HHGregg display unit for $246) with a new Club 3D DP to HDMI 2.0 adapter. It works on both my R9 280 and 290X; this one is a Sapphire Tri-X 290X New Edition at factory 1020/1350 clocks, on a Ryzen 5 1400 with 16GB 2400MHz on an MSI B350 Gaming Pro, playing World of Tanks' new core engine at 4K 60Hz on High.


I'll try World of Worlds if anyone's interested?

Also, I'll get someone to hold the damn camera while I play next time.


----------



## ratchet4234

Hey guys,

I have an issue with my R9 290 that I have noticed.
I never actually benchmarked this card to compare it, and I believe it is heavily underperforming. The core frequency stays at its stock 1GHz speed consistently during benchmarks, yet it only got 1200-something in Unigine Heaven with the benchmark competition settings at 1080p.


----------



## chris89

ratchet4234 said:


> Hey guys,
> 
> I have an issue with my R9 290 that I have noticed.
> I never actually benchmarked this card to compare it, and I believe it is heavily underperforming. The core frequency stays at its stock 1GHz speed consistently during benchmarks, yet it only got 1200-something in Unigine Heaven with the benchmark competition settings at 1080p.


Send your BIOS; attach it here as a .zip.


----------



## ratchet4234

It's 015.042.000.000.000000. It's a Sapphire Tri-X OC card.
It's on the latest drivers and is not throttling, for the most part.


----------



## BradleyW

My 290X is a strange beast. No matter how much voltage I add to the core, I can't get stable beyond 1100MHz. Temperatures aren't an issue. It's water-cooled. VRM's and Core are all below 50c.

Core 1100
Mem 1350
Vcore Stock.


----------



## chris89

ratchet4234 said:


> It's 015.042.000.000.000000. It's a Sapphire Tri-X OC card.
> It's on the latest drivers and is not throttling, for the most part.


Attach the .rom here as an attachment (saved via GPU-Z's Save BIOS button).



BradleyW said:


> My 290X is a strange beast. No matter how much voltage I add to the core, I can't get stable beyond 1100MHz. Temperatures aren't an issue. It's water-cooled. VRM's and Core are all below 50c.
> 
> Core 1100
> Mem 1350
> Vcore Stock.


Attach the .rom here as an attachment (saved via GPU-Z's Save BIOS button). I can help with that.


----------



## BradleyW

chris89 said:


> Attach the .rom via Save BIOS GPUz here as attachment.
> 
> 
> 
> Attach the .rom via Save BIOS GPUz here as attachment. I can help with that.


What changes will you make to the BIOS?

I can get 1100/1350 on stock voltage. If I increase the Vcore and/or increase the core speed beyond my current configuration, it becomes unstable. Adding voltage to the card results in instability, regardless of core speed.

My card is the reference design, which uses the Uber and Quiet mode BIOS switch. It is currently set to Uber mode.


----------



## chris89

BradleyW said:


> What changes will you make to the BIOS?
> 
> I can get 1100/1350 on stock voltage. If I increase the Vcore and/or increase the Core speed beyond my current configuration, it becomes unstable. Adding voltage to the card results in instability, regardless of Core speed


It's 1133MHz at 1333mV here, so try it and let me know; use atiwinflash or whatever to flash it and let me know.


----------



## BradleyW

chris89 said:


> It's 1133MHz at 1333mV here, so try it and let me know; use atiwinflash or whatever to flash it and let me know.


I'm not sure how adding such a high amount of voltage will help in my situation. The card doesn't respond well to voltage increases beyond +30mV. Can you explain how the BIOS will help me? Thank you very much.


----------



## chris89

BradleyW said:


> I'm not sure how adding such a high amount of voltage will help in my situation. The card doesn't respond well to voltage increases beyond +30mV. Can you explain how the BIOS will help me? Thank you very much.


It's really not that much voltage, & the voltages go like this:

1275mv 390/390x is 65288 for 1094mhz
1250mv 290/290x is 65288 for like 1075mhz

1133mhz = 1333mv no less for actual stability
1150mhz = 1350mv 
1170mhz = 1370mv
1200mhz = 1400mv
1250mhz = 1450mv

1133mhz = 72.5 gigapixel
1173mhz = ~75 gigapixel
1250mhz = 80 gigapixel
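For what it's worth, chris89's clock-to-voltage figures can be treated as a rough voltage/frequency curve and interpolated for in-between targets. A minimal sketch, assuming his numbers hold (they are one user's experience with these cards, not a guarantee for any particular sample):

```python
# Rough sketch: interpolate the clock-to-voltage points listed above
# to estimate a core voltage for an in-between clock target. Treat
# the result as a starting point only; every card's silicon differs.

VF_POINTS = [  # (core MHz, core mV), as posted
    (1133, 1333),
    (1150, 1350),
    (1170, 1370),
    (1200, 1400),
    (1250, 1450),
]

def estimate_mv(clock_mhz: int) -> float:
    """Linearly interpolate a voltage estimate between known points."""
    if clock_mhz <= VF_POINTS[0][0]:
        return VF_POINTS[0][1]
    for (c0, v0), (c1, v1) in zip(VF_POINTS, VF_POINTS[1:]):
        if clock_mhz <= c1:
            t = (clock_mhz - c0) / (c1 - c0)
            return v0 + t * (v1 - v0)
    return VF_POINTS[-1][1]  # clamp at the highest listed point

print(estimate_mv(1160))  # midway between 1150 and 1170 -> 1360.0
```

Notice the listed points rise by almost exactly 1mV per 1MHz over this range, so the interpolation is nearly a straight line.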


----------



## BradleyW

chris89 said:


> Its really not that much voltage really & voltages go like this
> 
> 1275mv 390/390x is 65288 for 1094mhz
> 1250mv 290/290x is 65288 for like 1075mhz
> 
> 1133mhz = 1333mv no less for actual stability
> 1150mhz = 1350mv
> 1170mhz = 1370mv
> 1200mhz = 1400mv
> 1250mhz = 1450mv
> 
> 1133mhz = 72.5 gigapixel
> 1173mhz = what 75 gigapixel
> 1250mhz - 80 gigapixel


I'm not sure you understand. If I increase the voltage by anything higher than 30mV above the stock voltage, my card becomes unstable, regardless of the Core frequency. I don't understand how pumping even more voltage will fix the issue.


----------



## chris89

BradleyW said:


> I'm not sure you understand. If I increase the voltage by anything higher than 30mV above the stock voltage, my card becomes unstable, regardless of the Core frequency. I don't understand how pumping even more voltage will fix the issue.


Try just using the BIOS with 1333mv & do not also add +30mv in MSI Afterburner... make sense?

If the VRM overheats, it cannot handle any extra voltage. What are your HWiNFO VRM 1 & 2 temperatures? There are 4 sensors... I need the readings from all of them & the core.


----------



## ratchet4234

Here's my GPU BIOS, mate.
I fixed the underperforming part, but +100mv in Afterburner gets me a 1140 core clock. Do you think it's got more in it?


----------



## BradleyW

chris89 said:


> Try just using the BIOS with 1333mv & do not change the MSI Afterburner to +30mv... make sense?
> 
> If the VRM over heats, it cannot handle any extra voltage. What is your HWInfo VRM 1 & 2 temperature, there are 4 sensors... I need the readings of all of them & the core.


VRM's and Core are all around 50c. Thank you.


----------



## TwilightRavens

I know that on my 290X you have to increase the core clock linearly with the memory clock. For example, at +100mv core and +100mv on AUX voltage I can get 1150/1525, but say I run the memory at a stock 1250MHz with 1150MHz on the core: it's not stable with any voltage. And 1100 will do 1425-1450 on the memory.


----------



## PurpleChef

Here is my ROM. Thanks for the help!
It's a Radeon 290X reference card. I've been told to undervolt it so it doesn't hit the max temp limit and underclock itself, right?
How can I improve the cooling for it without buying something new? Would a case fan blowing air from the side help?
I really hate the high-pitched sound of this card, grr.

(You might need to rename the .zip to .7z to open it.)


----------



## ratchet4234

Well, you can buy a new cooler for your GPU; there are some aftermarket solutions out there, but that is all you can do without compromising performance. The "free" way is to set your GPU fan to 100% in MSI Afterburner, but it will be loud.

Aftermarket air cooling for GPUs is cheap: the ARCTIC Accelero Xtreme IV Enthusiast VGA Cooler is only $60 USD, which is cheap for the results you will gain with this triple-fan cooler.

They also sell a dual-fan version that is not as big but will still be a big gain over the current stock cooler; it is shorter and cheaper, and more likely to fit if your case is already a tight fit.

There are many products like this on the market; browse the net, and also look for reviews and videos of people installing them so you know what you are getting into if you end up buying one.

You could then overclock your card and have a bit of a performance upgrade.


----------



## chris89

ratchet4234 said:


> Here's my GPU BIOS, mate.
> I fixed the underperforming part, but +100mv in Afterburner gets me a 1140 core clock. Do you think it's got more in it?


Yeah here is your modded bios


BradleyW said:


> VRM's and Core are all around 50c. Thank you.


Nice temps


PurpleChef said:


> Here is my rom. Thanks for help!
> Radeon 290x reference card. so ive been told to undervolt it so it dosnt hit max temp limit so it underclocks itself, right ?
> how can i improve the cooling for it without buying something new? case fan blowing air from the side helps?
> really hate the high sound of this card grr.
> 
> (might need to rename the zip to .7z to open it)


We can mod the fan profile or repaste too, but I recommend replacing yours with the Sapphire Tri-X cooler: buy one on eBay, pull the cooler, & enjoy cooler, quieter performance.


ratchet4234 said:


> Well you can buy a new cooler for your GPU there are some aftermarket solutions out there but that is all you can do without compromising performance. The "Free Way" is to set your fan to 100% on your gpu in MSI After Burner but it will be loud.
> 
> After market air cooling for gpus is cheap "ARCTIC Accelero Xtreme IV Enthusiast VGA Cooler" Only $60 USD is a cheap for the results you will gain with this triple fan cooler.
> 
> They also sell a double fan one that is not as big that will still be a big gain over the current stock cooler that is also less long and will more likely fit in your case if it is already a tight fit and its cheaper too.
> 
> There are many products like this on the market if you browse the net and also look for reviews and people installing these things so you know what you are getting into if you end up buying one of these things.
> 
> You could then overclock your card and you would have a bit of an upgrade performance wise.


I recommend replacing yours with the Sapphire Tri-X cooler: buy one on eBay, pull the cooler, & enjoy cooler, quieter performance.

https://www.ebay.com/itm/AS-IS-Sapp...=15&_sacat=0&_nkw=sapphire+r9&_from=R40&rt=nc


----------



## ratchet4234

Got to ask you what did you do in the bios??


----------



## chris89

ratchet4234 said:


> Got to ask you what did you do in the bios??


How's it performing? I set the TDP limit to unlimited, so it's basically null; it doesn't matter. It won't throttle no matter what, unless you're CPU-bound, which is often the case on these cards.

Not to mention I left the memory clocks stock, increased the core clock to 1133mhz, & raised the core voltage to the required 1333mv. I also set a proper, ideal fan curve to keep it all well under control, with a tolerable hysteresis level so it's quiet at idle & tolerable at load.

Let me know how you like it. :thumb:
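The "fan curve with hysteresis" idea can be sketched in a few lines: follow a temperature-to-duty curve, but only move the fan once the temperature has drifted past a dead band, so it doesn't audibly hunt up and down at idle. The curve points and the 3C band below are made-up illustration values, not what's in any modded BIOS:

```python
# Sketch of a fan curve with hysteresis. The duty cycle follows a
# temperature -> fan% curve, but is only recomputed when the
# temperature has moved more than HYSTERESIS_C since the last change.
# All numbers here are illustrative assumptions.

FAN_CURVE = [(40, 20), (60, 35), (75, 55), (85, 80), (95, 100)]  # (degC, fan %)
HYSTERESIS_C = 3.0

def curve_duty(temp_c: float) -> float:
    """Piecewise-linear interpolation over the fan curve."""
    if temp_c <= FAN_CURVE[0][0]:
        return FAN_CURVE[0][1]
    for (t0, d0), (t1, d1) in zip(FAN_CURVE, FAN_CURVE[1:]):
        if temp_c <= t1:
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)
    return FAN_CURVE[-1][1]

class FanController:
    def __init__(self):
        self.last_temp = None
        self.duty = FAN_CURVE[0][1]

    def update(self, temp_c: float) -> float:
        # Only follow the curve when the temperature has moved enough.
        if self.last_temp is None or abs(temp_c - self.last_temp) >= HYSTERESIS_C:
            self.duty = curve_duty(temp_c)
            self.last_temp = temp_c
        return self.duty

ctl = FanController()
print(ctl.update(42.0))  # sets a baseline duty from the curve
print(ctl.update(43.5))  # within the 3C band: duty unchanged
print(ctl.update(70.0))  # big jump: duty rises along the curve
```

The dead band is what makes a curve "tolerable": small idle temperature wiggles no longer translate into audible fan-speed changes.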


----------



## ratchet4234

It's exactly the same as what I achieved in MSI Afterburner. So far the card never throttled anyway; the GPU is probably bottlenecking my i7 6700K.


----------



## chris89

ratchet4234 said:


> It's exactly the same as what I achieved in MSI Afterburner. So far the card never throttled anyway; the GPU is probably bottlenecking my i7 6700K.


If you can manage to hold the clock on the GPU, then it's not a CPU bottleneck; most of the time it's a software bottleneck in the LOD distance ratios.

How's the GPU running?


----------



## ratchet4234

It seems stable and functions the same as the OC I set in Afterburner; pretty much apples to apples.


----------



## ratchet4234

Using Sapphire TriXX to increase the voltage, and WattMan to control the clocks and power limit, I was able to get to 1200 on the core @ 1.3 volts.
VRM 2 is getting to 85 degrees and the card is slowing from 1200 to 1100 or even 950mhz on the core clock, so I think I may have found the limit. Going past 1200 on the core, some instability signs begin to occur at 1.3 volts, and I don't think I can go past that voltage.


----------



## chris89

ratchet4234 said:


> Using sapphire Trixx to increase the voltage and WattMan to control the clocks and Power Limit was able to get to 1200 on the core @ 1.3 volts.
> One of the VRM 2 is getting to 85 degrees and the card is slowing down from 1200 to 1100 or even 950mhz on the core clock so I think I may have found the limit going past 1200 on the core some instability signs begin to occur at 1.3 volts and I dont think I can go past that voltage.


Nice. Yeah, you'll need better thermal material like Fujipoly Xtreme to get to max memory clock & core on water...

I use it & achieve 1,267mhz & higher, but I used the good material...

http://www.frozencpu.com/search.htm...sTcn&searchspec=fujipoly+17.0+W&go.x=0&go.y=0


----------



## sinnedone

chris89 said:


> Nice yeah you'll need better thermal material like Fujipoly Xtreme to get to max memory clock & core on water...
> 
> I use it & achieve 1,267mhz & higher but i used the good material...
> 
> http://www.frozencpu.com/search.htm...sTcn&searchspec=fujipoly+17.0+W&go.x=0&go.y=0


 @chris89

You think you can do something with this BIOS?

It's a Sapphire Tri-X 290X with Elpida memory. Last time I tried overclocking it, it wouldn't do more than an extra 20MHz (1050) on the core.


----------



## chris89

sinnedone said:


> @chris89
> 
> You think you can do something with this BIOS?
> 
> It's a Sapphire Tri-X 290X with Elpida memory. Last time I tried overclocking it, it wouldn't do more than an extra 20MHz (1050) on the core.


Try this. I love that card you have; got any pics? Please tell me how cool the core & VRM run under load?


----------



## sinnedone

chris89 said:


> Try this, i love that card u have, so got any pics? please tell me how cool the core & vrm is please under load?


I don't have any pictures right now, but I can get some during the day. Nothing special really, but I was thinking about painting the orange bits silver or red, and the visible sides of the heatsink black. ;D


Tried the 1180 BIOS and the temps were as follows: GPU 89c, VRM1 85c, VRM2 51c. Unfortunately, I get artifacts in the Valley benchmark. Any ideas? (Might just be a poor overclocker?)


----------



## chris89

sinnedone said:


> I don't have any pictures right now, but i can get some during the day. Nothing special really, but was thinking about painting the orange bits silver or red, and the visiable sides of the heatsink black. ;D
> 
> 
> Tried the 1180 Bios and the temps were as follows, gpu 89c, vrm1 85c, vrm2 51c. Unfortunately I get artifacts in Valley benchmark. Any ideas? (might be a poor overclocker?)


You need to repaste the core like this video & add better VRM pads, so the core VRM holds 1200MHz+ no problem.




Try this 1133MHz BIOS & post screenshots please, if you would? Maybe your score too on 3DMark or something; even Sky Diver would be cool.


----------



## ratchet4234

Is there a better air cooler than the Tri-X cooler that will let me cool the VRMs and core better?
And which one would that be, and what other products would be best to get with it?

I'm trying to push the OC the crap out of my current card before I eventually upgrade, once AMD releases another good graphics card that manages to beat out team green again like the R9 290 and 290X did.


----------



## chris89

ratchet4234 said:


> Is there a better air cooler than the Tri x cooler that will allow me to cool the vrms better and core?
> And what one would that be and what other products would be best to get with it?
> 
> Trying to push my OC the crap out of my current card before I eventually upgrade when AMD releases another good Graphics card that manages to beat out team green again like the r9 290 and r90x did.


The Tri-X is the best card out there; you just need to remove the cooler to apply new high-end thermal pads & thermal paste to the core.


----------



## ratchet4234

Thanks very much! Also, what sort of gains would be expected, temp-wise and for overclocking?


----------



## chris89

ratchet4234 said:


> Thanks very much also what sort of gains would be expected temp wise and overclocking?


You should see 70c max on the VRM and something like 75c on the core, but idk, I need to test the cooler myself one day.

You need to turn your VRM pads into 15 balls to cover all 15 VRMs; the VRMs run cooler this way.

*REMOVE THOSE BLACK RUBBER grommets around the CORE*


----------



## sinnedone

chris89 said:


> You need to repaste the core like this video & add better vrm pads for the core vrm to hold 1200Mhz+ no problem
> https://www.youtube.com/watch?v=OASBIoecLy0
> Try this 1133mhz bios & post screenshots please if you would? maybe of your score too on 3dmark or something, even Sky Diver would be cool


I think I might have some better Fujipoly thermal pad material somewhere; might try that quick. I haven't opened it up for a while, but I'll check the spread of paste on the GPU die.

Give me a couple of days and I'll report back.


----------



## colorfuel

Hello people,

I have a 290X Tri-X New Edition (11226-16-20G) and I have deleted my backup of the UEFI BIOS which I flashed.

Maybe someone who has the same card could get me a backup of their UEFI BIOS, or tell me where I could find one.

I've done some research and haven't found any.


Thanks in advance for your help.

EDIT: Never mind. I think I found it in the unverified BIOS collection on TPU.


----------



## TwilightRavens

So I think my 290X might be on its way out. I just got my main rig up and running again, and the usual overclock I have on it (1140/1525) almost immediately causes my monitor to lose signal when starting up the Heaven benchmark, but the same issue doesn't arise if I run it at stock. Anyone have any ideas? I thought that maybe the core or the VRMs were overheating, but that's actually not the case at all.

Just to make sure it wasn't something else, I threw in my spare R9 380 and overclocked it to see if it would have the same issue (ruling out PSU issues), and no, it was perfectly fine.

Also, to rule out further issues, I tried several driver versions (18.8.1, 18.7.1, and 18.4.1) and the issue persists across all 3 (before the computer went to the wife as a temporary machine, it was on 18.4.1 and working fine), and for the whole 4 months she had it, it ran at stock frequencies with no overvolt (1030/1250).

Edit: Took it all apart and put it back together, and somehow it works just as before.


----------



## rdr09

Hello, how's the 290 handling PUBG at 1080p? I've got two sitting in the closet, but they require water. Thinking of activating one.


----------



## ratchet4234

No, they do not require water cooling.

But if it is a reference one with a blower-style cooler, you may have some temperature issues and throttling.

If you have a reference one, you can get aftermarket air cooling; there are tons of options out there. Google around and do some research on what you're trying to achieve.


----------



## rdr09

ratchet4234 said:


> No they do not require water cooling.
> 
> But if it is reference one with a blower style cooler you may have some temperature issues and throttling.
> 
> If you have a reference one you can get aftermarket air cooling tons of options out there google around and do some research on what your trying to achieve.



Yes, mine are reference. They do have waterblocks; I just got rid of watercooling, and I have no air coolers for these.

Just want to know how they do in PUBG. I might go ahead and put one back in service.

EDIT: @ratchet4234, saw your thread about your 290. If you want to upgrade from this GPU, you need to look at the GTX 1070/Vega 56 or above. I currently have a 1060 and it performs similarly to my 290. No difference. But I was not able to use the 290 in PUBG. A 290 OC'd to 1150MHz will match a 1060 at its stock boost of 1800MHz.


----------



## sinnedone

rdr09 said:


> Yes, mine are reference. They do have waterblocks. I just got rid of watercooling and i have no aircoolers for these.
> 
> Just want to know how they do in PUBG. I might go ahead and put one back in service.


What settings are you planning on playing at?


----------



## rdr09

sinnedone said:


> What settings are you planning on playing at?



Pretty much all Medium at 1080p. Any higher and it is difficult to see the enemy.


----------



## chris89

Let me know if anyone wants advice or help with a BIOS mod. What I do: I know the voltage for a given clock, and if you want to use a 1250mhz memory clock, you can run it down from 1000mv to 875mv to save on heat & power. I also set an unlimited power limit, so the power limit is void and you get max performance. Not to mention, finally, a fan profile tune.

1100mhz is on 1.25/1.275v
1133mhz is on 1.33v
1155mhz is on 1.355v
1175mhz is on 1.375v
1200mhz is on 1.4-1.450v, but you can play with what works way up past 1200mhz; I benchmark at 1267mhz max on my 390x
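The memory undervolt described above (1000mv down to 875mv at the same 1250mhz clock) can be ballparked with the usual dynamic-power relation, P proportional to V^2 * f. A sketch; the 30W baseline for the memory subsystem is a hypothetical figure, and only the ratio comes from the formula:

```python
# Ballpark the saving from dropping a rail's voltage at a fixed clock,
# using the standard dynamic-power approximation P ~ V^2 * f. The
# absolute wattage is an assumed illustration value, not a measurement.

def scaled_power(p_old_w: float, v_old_mv: float, v_new_mv: float,
                 f_old_mhz: float = 1250.0, f_new_mhz: float = 1250.0) -> float:
    """Scale a known power draw by (V_new/V_old)^2 * (f_new/f_old)."""
    return p_old_w * (v_new_mv / v_old_mv) ** 2 * (f_new_mhz / f_old_mhz)

# Assume (hypothetically) the memory subsystem draws ~30W at 1000mV.
p_new = scaled_power(30.0, 1000.0, 875.0)
print(f"{p_new:.1f} W")                      # -> 23.0 W
print(f"{100 * (1 - 0.875 ** 2):.0f}% less")  # -> 23% less
```

So a 125mv drop on that rail is worth roughly a fifth to a quarter of its dynamic power, which is why the undervolt noticeably helps heat.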


----------



## bond32

Hello folks... I've decided to get back into the scene, as it were. Took a few years off, amazing how fast tech moves.

Going back to tri-fire, have 2 more 290x's on the way. Now I need to find one more Koolance block then decide if I want to go with a 4th card or not. I still have my quad Koolance block, but only 1 power supply now. 1300 watt G2, think it could run a quad 290x 4790k rig? Probably stock or very small overclock, I remember I always had struggles getting all 4 cards to work correctly overclocking...

Any other tri/quad members still around?

Sent from my Pixel 2 XL using Tapatalk


----------



## TwilightRavens

bond32 said:


> Hello folks... I've decided to get back into the scene, as it were. Took a few years off, amazing how fast tech moves.
> 
> Going back to tri-fire, have 2 more 290x's on the way. Now I need to find one more Koolance block then decide if I want to go with a 4th card or not. I still have my quad Koolance block, but only 1 power supply now. 1300 watt G2, think it could run a quad 290x 4790k rig? Probably stock or very small overclock, I remember I always had struggles getting all 4 cards to work correctly overclocking...
> 
> Any other tri/quad members still around?
> 
> Sent from my Pixel 2 XL using Tapatalk


Well, it's a lot of factoring, but using a PSU calculator and making a few assumptions (CPU not overclocked, 16GB RAM, 6 fans, and 4 290X's overclocked to 1100/1500), you'll need about 1700W. So you may need a separate PSU for the GPUs and another one for everything else. I would use the 1300W for the 290X quadfire, then get something like a 450-550W unit for all the peripherals (motherboard, CPU, storage drives, etc.), and you'd probably be fine; of course, I am estimating high to be on the safe side. If you do trifire, that 1300W would most likely be fine.
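TwilightRavens' estimate is easy to sanity-check with simple arithmetic. The per-component wattages below are rough assumptions for illustration (an overclocked 290X is commonly put at roughly 300W or more each), not figures from any particular calculator:

```python
# Quick sanity check of the multi-290X power estimate. All per-component
# figures are rough assumptions, not measurements.

LOAD_W = {
    "290X @ 1100/1500 (each)": 330,
    "CPU (stock)": 90,
    "motherboard + RAM": 60,
    "fans + drives + misc": 50,
}

def system_draw(num_gpus: int) -> int:
    """Total estimated load: GPUs plus everything else."""
    gpu = LOAD_W["290X @ 1100/1500 (each)"] * num_gpus
    rest = sum(w for k, w in LOAD_W.items() if "each" not in k)
    return gpu + rest

for n in (3, 4):
    total = system_draw(n)
    # ~20% headroom is a common rule of thumb for PSU sizing
    print(f"{n}x 290X: ~{total}W load, suggested PSU ~{int(total * 1.2)}W")
```

With these assumptions, trifire lands around 1190W (inside a 1300W unit's capacity, if barely, with little headroom), while quadfire lands around 1520W, which is consistent with the ~1700W suggestion once headroom is added.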


----------



## bond32

TwilightRavens said:


> Well, its a lot of factoring but using a PSU calculator and making a few assumptions (CPU not overclocked, 16GB RAM, 6 fans and, 4 290X’s overclocked to 1100/1500) you’ll need about 1700W. So you may need a separate PSU for the GPU’s, then another one for everything else. So I would use the 1300W for the 290X quadfire, then get like a 450w-550w for all the peripherals (MB, CPU, storage drives etc) and you’d probably be fine, of course I am estimating high to be on the safe side. If you do trifire that 1300W would most likely be fine.


My thoughts exactly...

I'm also leaning towards getting the Thermaltake X9 case, I think my Air 540 will be quite crowded with tri-fire.

Sent from my Pixel 2 XL using Tapatalk


----------



## rdr09

bond32 said:


> My thoughts exactly...
> 
> I'm also leaning towards getting the Thermaltake X9 case, I think my Air 540 will be quite crowded with tri-fire.
> 
> Sent from my Pixel 2 XL using Tapatalk


Tri might work, but watercooled... oh no, not enough rad space.


----------



## chris89

JE$UCRI$TX, [02.09.18 19:21]
i play with low settings, only textures, distance and antialiasing on ultra

JE$UCRI$TX, [02.09.18 19:21]
60 fps all time

JE$UCRI$TX, [02.09.18 19:22]
40-60% gpu use

JE$UCRI$TX, [02.09.18 19:22]
at 1100 mhz u can reach 60 fps almost all time in ultra settings at 1080p

JE$UCRI$TX, [02.09.18 19:22]
using high shadow and the rest on ultra, the game is stable at 60 fps


----------



## TwilightRavens

bond32 said:


> My thoughts exactly...
> 
> I'm also leaning towards getting the Thermaltake X9 case, I think my Air 540 will be quite crowded with tri-fire.
> 
> Sent from my Pixel 2 XL using Tapatalk


To give you an idea: my MSI 290X, hooked up to a Kraken G10 and Corsair H75, pulls about what a liquid-cooled Vega 64 pulls from the wall.


----------



## bond32

TwilightRavens said:


> To give you an idea, on my MSI 290X card hooked up to a Kraken G10 and Corsair H75 it pulls about what a Vega 64 liquid cooled pulls from the wall.


Thanks, I've actually had quad 290X's before. I ultimately sold most of the gear (marriage, no time, etc.). I've now got some time to get back into it, and thought tri-fire is the best place to start.

Previously, to run my original quad setup, I used two 1300-watt G2 PSUs: one for the cards, the other for the rest. I still have one of the two G2s; as stated previously, it should be good for tri-fire.

Sent from my Pixel 2 XL using Tapatalk


----------



## crazycrave

Here's a copy of my Sapphire Tri-X 290X NE BIOS if you want to play with it; the Samsung memory does 7000MHz np as-is, but I need more voltage, since stable maxed out at 1.1GHz means it's just waking up, lol.


----------



## sinnedone

bond32 said:


> Hello folks... I've decided to get back into the scene, as it were. Took a few years off, amazing how fast tech moves.
> 
> Going back to tri-fire, have 2 more 290x's on the way. Now I need to find one more Koolance block then decide if I want to go with a 4th card or not. I still have my quad Koolance block, but only 1 power supply now. 1300 watt G2, think it could run a quad 290x 4790k rig? Probably stock or very small overclock, I remember I always had struggles getting all 4 cards to work correctly overclocking...
> 
> Any other tri/quad members still around?
> 
> Sent from my Pixel 2 XL using Tapatalk


I honestly don't think the driver support is there anymore. Between AMD and game devs giving up on multi-GPU support, there's no use other than benchmarking for yourself.

When I went from tri-fire back to crossfire, a lot of games performed better.


----------



## bond32

sinnedone said:


> I honestly don't think driver support is there anymore. Between AMD and game devs giving up on multi GPU support there's no use other than benchmarking for yourself.
> 
> When I went from tri-fire back to crossfire alot of games performed better.


That's a bummer...

Well I'm committed now, going to give it a try and report back. 

Still need 1 gpu block, guess I'll just keep my eyes open.

What resolution were you running? Settings?

Sent from my Pixel 2 XL using Tapatalk


----------



## sinnedone

2560x1440p 144hz

Mostly Ultra on anything that supported trifire. Freesync works better with only two cards as well.


----------



## chris89

crazycrave said:


> Copy of my Sapphire Tri-X 290X NE if you want to play with Samsung memory as 7000Mhz np as is but I need more voltage as stable maxed out at 1.1Ghz means just waking up lol ..


Try this. I tuned the fan & raised the voltage from 1250mv to 1333mv for up to 1133mhz; play with Afterburner up to 1433mv at +100mv for 1200mhz, or maybe more than 1220mhz if you wanted.


----------



## bond32

Cards came in today. The guy sent me 290's by mistake; I thought I would play with them anyway. With a light OC on the CPU (4.7), I finally broke 10k in Firestrike Extreme. They didn't clock well...
https://www.3dmark.com/fs/16310721

I'm in the process of cleaning; I wanted to give my single 290's block a good cleaning, so I put the stock cooler back on for now, just for funsies. Haven't had a chance to bench the 3, but here are my previous scores for 3 and 4 cards:
https://www.3dmark.com/fs/3333923
https://www.3dmark.com/fs/3222466

This is weird, though: on my EVGA board, the 3 cards are at 8x/16x/16x... Not sure what happened there.


----------



## bond32

Somehow, I managed to beat my previous high scores in Firestrike on a single card. I even did this with the stock cooler... and my single 290X isn't really even a good clocker.

https://www.3dmark.com/fs/16325934 14,138 in Firestrike
https://www.3dmark.com/fs/16325998 7,213 in Firestrike Extreme
https://www.3dmark.com/fs/16326015 3,865 in Firestrike Ultra

These were done at 1200 core, 1625 memory, +100 in AB. Any higher and it shows instability.

I suppose the higher scores are due to improved drivers, or the BIOS. I've tried what felt like hundreds of BIOSes; the plain-jane stock BIOS always seems to win. I didn't even have luck with the PT1 BIOS: at the same clocks, the stock BIOS scored higher than PT1 every time.

The remaining two 290X's have shipped and are on their way. Hopefully they clock well. I also have a second Koolance block on the way (these are very hard to find now). Since I can't find any more blocks for sale, I will have to move to a universal GPU block, VRM heatsinks, etc.

Somewhat related: I did disassemble my GPU block. Anyone know what this black nonsense is? It came off with some scrubbing. I see some oxidation too; I guess this is all normal wear.


----------



## chris89

Gnarly waterblock lol


----------



## bond32

chris89 said:


> Gnarly waterblock lol


Agreed! Lol.

Luckily, all that black cleaned up. Still had oxidation, but it was pretty gross.


----------



## chris89

bond32 said:


> Agreed! Lol.
> 
> Luckily, all that black cleaned up. Still had oxidation, but it was pretty gross.


That system looks so badass though with the 3-way what 290's or 290x's? Thats wild looking with water cooling & all, whats the PSU?

We should BIOS mod them & bench them, then 1 BIOS original to flash to all 3 at once & run Firestrike.


----------



## bond32

chris89 said:


> That system looks so badass though with the 3-way what 290's or 290x's? Thats wild looking with water cooling & all, whats the PSU?
> 
> We should BIOS mod them & bench them, then 1 BIOS original to flash to all 3 at once & run Firestrike.


All in the plan my friend!

I used to have a quad 290x rig, then heavily downsized a few years ago (got married). Now that things have settled down, I'm going back to at least tri-fire, maybe quad with the prices of 290's.

PSU is an EVGA 1300 G2. Used to have 2 of these, now just 1. 

Guy sent me 290's (ordered 290x's), but I benched them anyway. Full load on cards was around 280 watts each (on air). I believe my CPU load (4790k @ 4.9) was about 170-180 watts if I recall, so we are still good with the 1300 watt psu. I think adding a fourth card would put it over its capacity though...


----------



## tivook

bond32 said:


> Somehow, I managed to beat my previous high scores in Firestrike on a single card. I even did this with the stock cooler... My single 290X isn't really even a good clocker.
> 
> https://www.3dmark.com/fs/16325934 14,138 in Firestrike
> https://www.3dmark.com/fs/16325998 7,213 in Firestrike Extreme
> https://www.3dmark.com/fs/16326015 3,865 in Firestrike Ultra
> 
> These were done at 1200 core, 1625 memory, +100 in AB. Any higher and it shows instability.
> 
> I suppose the higher scores are due to improved drivers, or the BIOS. I've tried what felt like hundreds of BIOSes; the plain-jane stock BIOS always seems to win. I didn't have luck with the PT1 BIOS either: PT1 vs stock at the same clocks, the stock BIOS scored higher every time.
> 
> The remaining two 290X's have shipped and are on their way. Hopefully they clock well. I also have a second Koolance block on the way (these are very hard to find now). Since I can't find any blocks for sale, I will have to move to a universal GPU block, VRM heatsinks, etc.
> 
> Somewhat related, I did disassemble my GPU block. Anyone know what this black nonsense is? It came off with some scrubbing. I see some oxidation too; I guess this is all normal wear.


Mixing aluminum with copper is a bad idea. It sounds like you have a conflict between metal types in your water loop, or a lack of distilled water plus a WaterWetter-type additive.


----------



## bond32

tivook said:


> Mixing aluminum with copper is a bad idea. It sounds like you have a conflict between metal types in your water loop, or a lack of distilled water plus a WaterWetter-type additive.


Must have been some trace of non-distilled water then; all my rads are Alphacool copper and both blocks are Koolance nickel-plated copper.


----------



## bond32

Pretty quiet here... That saddens me...

Anywho, got the correct cards in. I completely forgot, my board will allow for trifire in 8x8x8 with triple-slot spacing. This vastly improved things... Also improved after simply cleaning up the cards and installing MX-4. 

Also received another Koolance block, got an amazing deal on it. It looked rough, but after I cleaned it up it actually looks better than my original block...

Now the challenge... I cannot find another Koolance block. I have an EK universal GPU block, but no passive heatsinks. Looking into it, I can get passive heatsinks for around $20 per pack (Enzotech, one pack for RAM, a separate pack for VRM), so $40, and these are solid copper, plus I would still need a thermal adhesive (3M tape probably). However, the Prolimatech MK-26 can be had for $40, which includes those heatsinks and various other things (aluminum, but it will do). I'm leaning towards that, or even the version that includes the 140mm fans ($70), as this will allow for future usage. Seems like a pretty good deal; you get quite a bit more for the money. Thoughts?

Haven't really had time to bench the cards yet or test anything. I did manage to still beat my previous trifire score in Firestrike Extreme (disregard the zero score, I think I hit something by accident at the very end):
https://www.3dmark.com/fs/16349571 vs old, 13,542 https://www.3dmark.com/fs/16317205. This was at 1115/1350 I think on all 3 cards.

Looks like now I'm CPU limited; running benches, the CPU load is usually maxed while GPU load now hovers at 80-90%.


----------



## Newbie2009

I just bought a secondhand 290X (Asus DirectCU II) for a second PC on a 1080p TV, for less than €100.

Pleased with the price, I feel like I've committed a crime.


----------



## sinnedone

bond32 said:


> Pretty quiet here... That saddens me...
> 
> Anywho, got the correct cards in. I completely forgot, my board will allow for trifire in 8x8x8 with triple-slot spacing. This vastly improved things... Also improved after simply cleaning up the cards and installing MX-4.
> 
> Also received another Koolance block, got an amazing deal on it. It looked rough, but after I cleaned it up it actually looks better than my original block...
> 
> Now the challenge... I cannot find another Koolance block. I have an EK universal GPU block, but no passive heatsinks. Looking into it, I can get passive heatsinks for around $20 per pack (Enzotech, one pack for RAM, a separate pack for VRM), so $40, and these are solid copper, plus I would still need a thermal adhesive (3M tape probably). However, the Prolimatech MK-26 can be had for $40, which includes those heatsinks and various other things (aluminum, but it will do). I'm leaning towards that, or even the version that includes the 140mm fans ($70), as this will allow for future usage. Seems like a pretty good deal; you get quite a bit more for the money. Thoughts?
> 
> Haven't really had time to bench the cards yet or test anything. I did manage to still beat my previous trifire score in Firestrike Extreme (disregard the zero score, I think I hit something by accident at the very end):
> https://www.3dmark.com/fs/16349571 vs old, 13,542 https://www.3dmark.com/fs/16317205. This was at 1115/1350 I think on all 3 cards.
> 
> Looks like now I'm CPU limited; running benches, the CPU load is usually maxed while GPU load now hovers at 80-90%.


I still have an EK waterblock from when I was running tri-fire. It might not work with those other blocks you got.

If you're interested let me know.


----------



## Talon720

I've had so many problems getting my tri-fire 290X setup working right with the drivers: I get flickering, stuttering, and black screens. I didn't used to have as much of an issue, but after I rebuilt my system I can never get the drivers installed right... I've used DDU and tried without DDU, tried just uninstalling then reinstalling. I think I've tried every method out there, including hopping on one foot and praying to the AMD gods.


----------



## sinnedone

Possible BIOS setting you might have missed?


----------



## chris89

Talon720 said:


> I've had so many problems getting my tri-fire 290X setup working right with the drivers: I get flickering, stuttering, and black screens. I didn't used to have as much of an issue, but after I rebuilt my system I can never get the drivers installed right... I've used DDU and tried without DDU, tried just uninstalling then reinstalling. I think I've tried every method out there, including hopping on one foot and praying to the AMD gods.


Upload your BIOS here & I can help smooth out your gameplay & increase stability.


----------



## bond32

PM sent.


sinnedone said:


> I still have an EK waterblock from when I was running tri-fire. It might not work with those other blocks you got.
> 
> If you're interested let me know.


----------



## JMCB

Anyone still using four 290x cards? Gaming has been pretty bad on them lately.


----------



## TwilightRavens

JMCB said:


> Anyone still using four 290x cards? Gaming has been pretty bad on them lately.


I'd think about going CrossFire if the prices for secondhand 290X's weren't as stupid as they currently are. But I've also heard the scaling isn't as good as SLI in most titles.


----------



## diggiddi

TwilightRavens said:


> I'd think about going CrossFire if the prices for secondhand 290X's weren't as stupid as they currently are. But I've also heard the scaling isn't as good as SLI in most titles.


Well, here are a couple of sources that should help negate that belief.

Source 1:
https://www.techpowerup.com/reviews/AMD/RX_480_CrossFire/19.html

TPU has the RX 480 CF beating 970 SLI at two resolutions, tying at one, and losing at 1600x900. If you take out all the games that did not scale well with CF, it shreds the 970 SLI.
Guys running CrossFire are most likely not using 1600x900 monitors though, so I'd safely say it's a win for CrossFire.

and.....

"The Hardware.Info GPU Performance Score 2018 is the average frame rate in all games tested. In 4K resolution with ultra settings, CrossFire delivers an average of 37% extra output, while SLI scores 23% worse"

Source 2
https://us.hardware.info/reviews/81...nchmarks-hardwareinfo-gpu-prestatiescore-2018


----------



## bond32

diggiddi said:


> Well, here are a couple of sources that should help negate that belief.
> 
> Source 1:
> https://www.techpowerup.com/reviews/AMD/RX_480_CrossFire/19.html
> 
> TPU has the RX 480 CF beating 970 SLI at two resolutions, tying at one, and losing at 1600x900. If you take out all the games that did not scale well with CF, it shreds the 970 SLI.
> Guys running CrossFire are most likely not using 1600x900 monitors though, so I'd safely say it's a win for CrossFire.
> 
> and.....
> 
> "The Hardware.Info GPU Performance Score 2018 is the average frame rate in all games tested. In 4K resolution with ultra settings, CrossFire delivers an average of 37% extra output, while SLI scores 23% worse"
> 
> Source 2
> https://us.hardware.info/reviews/81...nchmarks-hardwareinfo-gpu-prestatiescore-2018


Impressive!

GPU prices are insanely cheap right now, especially 290's. Anyone who can run at least two should go for it.

I have 5 right now! Don't ask why...


----------



## rdr09

bond32 said:


> Impressive!
> 
> GPU prices are insanely cheap right now, especially 290's. Anyone who can run at least two should go for it.
> 
> I have 5 right now! Don't ask why...


That Vega 64 score is kinda low. I got 26K with 2 290s 4 years ago.

EDIT: WannaBeOcer Vega (Single) . . .

https://www.3dmark.com/fs/16277993

2X

https://www.3dmark.com/3dm/22278289


----------



## bond32

Still getting set up, but I have my tri-fire running. Pretty disappointed to see Fallout 4 doesn't have CrossFire optimizations; it doesn't seem to use the other cards...

Anywho, looking for some bios assistance here. @chris89 looking for your expertise!

The files are attached; the "290x.rom" is a stock BIOS I believe, from my Hynix card. The other two cards are Elpida.

Crysis 3 seems to have good crossfire support, I was able to play it at max settings at 4K, 40-70 FPS.

Doom, still unsure on that.


----------



## bond32

No replies.... Guess these forums aren't as lively as I remember.

I'm attempting to mod the stock cooler since I wasn't able to get the VRM temps in check with just a universal GPU block... I thought I was being creative, then I saw this has definitely been done before. Still need to cut the shroud.


----------



## diggiddi

Cool, keep us updated with pics. And yes, since the move to the new format this forum has died a little.


----------



## bond32

Just finished redoing a number of things. Got my 420 GTR in, and have another 420 GTX on its way, which will get installed in the top. Yeah, I'm basically filling every possible spot with a radiator, but it's a hobby... Setup is:
Koolance 380i on 4790k at 4.8ghz
2xKoolance R9 290x full cover blocks
1xEKWB Universal GPU block -> 3x290X's
Swiftech MCP35X2 pwm pumps (series) with reservoir
XSPC Photon 170 reservoir
XSPC AX120 with XSPC fans in p/p
Alphacool ST30 140mm with Phanteks fans p/p
Hardware Labs 240GTS with Swiftech fans pull (garbage fans, hate these)
Alphacool ST30 360mm with Swiftech fans pull
Alphacool UT60 360mm with GT AP-14's p/p
Hardware Labs 420GTR with Phanteks fans p/p
Hardware Labs 420GTX with Phanteks fans p/p (not in yet)

Plan to mount the photon 170 above the GPU mounts, anyone have input?

I realize it looks a little cattywampus with different tubing, fittings, etc., but I'm only using what I have available, and I'm not a fan of hard tubing (except copper).

This case is amazing, especially for the money. Since mine was damaged on arrival, some things are still bent slightly, which I'm bummed about. Also, what's with the different thumb screws? Which goes where? It looks like the majority have a pointed tip while some have a flat tip.


----------



## bond32

Whoops, posted the above post in the wrong thread...

Hit a new high for me. I thought I had a poor clocker, but two of my cards hit 1300 core / 1625 memory at 1.43 V.

Pretty darn good! PT1 bios ftw


----------



## bond32

Still struggling to cool the VRM on my third card. My last ditch effort, I'll have to make my own heatsink. Here are the measurements, in case anyone else needed.


----------



## sinnedone

There used to be these random VRM heatsinks that ran 1/4" soft tubing into a distro block. If you could find those, you'd be golden.


----------



## bond32

sinnedone said:


> There used to be these random VRM heatsinks that ran 1/4" soft tubing into a distro block. If you could find those, you'd be golden.


Believe me... I've been looking everywhere. I even reached out to both Arctic and Gelid to find out if I could get their kits... No luck... So I ordered a generic aluminum heatsink and I am going to try to cut it down to a similar size and shape as the Gelid enhancement kit.

The problem I'm having is that nothing will stay on. The small heatsinks continue to fall off. I'm using 3M tape with no luck, same with MX-4. I'm hesitant to use any sort of thermal adhesive, but I may need to try some. Any recommendations?

I also cut up the stock cooler and put the EK block on the core. It didn't sit correctly, so I cut off just the VRM section to use. It might work...


----------



## TwilightRavens

bond32 said:


> Believe me... I've been looking everywhere. I even reached out to both Arctic and Gelid to find out if I could get their kits... No luck... So I ordered a generic aluminum heatsink and I am going to try to cut it down to a similar size and shape as the Gelid enhancement kit.
> 
> The problem I'm having is that nothing will stay on. The small heatsinks continue to fall off. I'm using 3M tape with no luck, same with MX-4. I'm hesitant to use any sort of thermal adhesive, but I may need to try some. Any recommendations?
> 
> I also cut up the stock cooler and put the EK block on the core. It didn't sit correctly, so I cut off just the VRM section to use. It might work...


Really, the only VRM that gives me any temp issues is VRM1; VRM2 never goes over about 54°C (I think that's the memory one). So VRM1 is the one you may want to keep an eye on.


----------



## bond32

Well, got the aluminum heatsink in and cut it to fit. I have high hopes; I really think this should work!


----------



## PurpleChef

@chris89

Sorry for the late answer! Here are my BIOS and temp screenshots. Am I missing something?
I'm on a reference Radeon 290X card. It's loud as duck right now 
I think I'm going to buy an aftermarket cooler soon, but I want to do what I can now to tame this loud beast.


----------



## bond32

Here we go... (Again)









Sent from my Pixel 2 XL using Tapatalk


----------



## Satanello

bond32 said:


> Here we go... (Again)
> 
> 
> 
> 
> 
> 
> 
> 
> 
> Sent from my Pixel 2 XL using Tapatalk


Wow, great work!! I can't imagine the amount of power needed during gaming/benchmarks!
(I have only two 290X's, air cooled.)

Sent from my MI 8 using Tapatalk


----------



## sinnedone

Lol nice!

What motherboard are you using?


----------



## bond32

sinnedone said:


> Lol nice!
> 
> What motherboard are you using?


EVGA Z87 classified, 4790k

On power, I'm stuck with my 1300 G2 for now. It should be fine, but I won't be able to overclock them. I'll pick up a 1600 watt unit later.

I've pulled 380 watts from one card before.

Sent from my Pixel 2 XL using Tapatalk


----------



## TwilightRavens

bond32 said:


> EVGA Z87 classified, 4790k
> 
> On power, I'm stuck with my 1300 G2 for now. It should be fine, but I won't be able to overclock them. I'll pick up a 1600 watt unit later.
> 
> I've pulled 380 watts from one card before.
> 
> Sent from my Pixel 2 XL using Tapatalk


Good lord that looks cool, good job!


----------



## Arizonian

@bond32 Nice work on the quad set up. :Snorkle:


----------



## mAs81

bond32 said:


> Here we go... (Again)
> *snip*


Damn, that looks glorious. I'd love to see more pics of that setup in use.

I recently got myself a 1070 Ti, so I moved the Vapor-X 290 into my gf's rig, and damn, I'm once again taken aback by how cool it runs and how well it performs at 1080p.

If it were as cool and powerful in my 1440p rig, I wouldn't have considered the change, lol.

..but, since it's unlocked, maybe it's time to fool around and flash a different BIOS in there.
Anyway, here it is now, in all its glory:


----------



## bond32

Awesome setup! I'm always a fan of UV. Awesome card too, a very good performer. I've been playing at 1440p for years on one 290X. Settings on most games are medium to high and I get decent framerates.

I've actually been struggling since adding the 4th card. I couldn't for the life of me get the drivers installed correctly. I tried every freaking troubleshooting method I could with no luck. Finally, last night I isolated each card and found one wouldn't budge (no video), so I had to tear it all down again. I put in another 290 I have, but I think my replacement is dead too. Sucks, but thank God prices are not insane. I'll have to pick up another, I think...










Sent from my Pixel 2 XL using Tapatalk


----------



## mAs81

bond32 said:


> Awesome setup! I'm always a fan of UV. Awesome card too, a very good performer. I've been playing at 1440p for years on one 290X. Settings on most games are medium to high and I get decent framerates.


Thanks!! It's not UV though, just purple LED fans and an LED strip (gf likes purple).

And yeah, my 290 can still pretty much rock anything at high / a mix of high and ultra settings at 1080p. It was starting to show its age on 1440p, though.



bond32 said:


> I've actually been struggling since adding the 4th card. I couldn't for the life of me get the drivers installed correctly. I tried every freaking troubleshooting method I could with no luck. Finally, last night I isolated each card and found one wouldn't budge (no video), so I had to tear it all down again. I put in another 290 I have, but I think my replacement is dead too. Sucks, but thank God prices are not insane. I'll have to pick up another, I think...


That setup really looks boss. Bummer about the card(s), but the used GPU market is way better there than here, so I hope you get a good one soon!

AMD drivers really have come a long way since the release of these cards, but when I want to uninstall or upgrade them, I've used this guide, and, taxing as it may seem, it has never created problems for me.

:thumb:


----------



## bond32

mAs81 said:


> Thanks!! It's not UV though, just purple LED fans and an LED strip (gf likes purple).
> 
> And yeah, my 290 can still pretty much rock anything at high / a mix of high and ultra settings at 1080p. It was starting to show its age on 1440p, though.
> 
> 
> That setup really looks boss. Bummer about the card(s), but the used GPU market is way better there than here, so I hope you get a good one soon!
> 
> AMD drivers really have come a long way since the release of these cards, but when I want to uninstall or upgrade them, I've used this guide, and, taxing as it may seem, it has never created problems for me.
> 
> :thumb:


Thanks, yeah, I have been trying everything I possibly could. This is actually the second time I have gone to a quadfire setup; the first time (with the same board) I didn't have any of the driver problems I have now. I think one of my cards may be no good.


I replaced the card in question last night and am bleeding the loop today. I have a total of 5 cards, 2 290's and 3 290X's, but I suspected one card was bad. Now I may have two bad cards...

I also discovered that the joining blocks for the Koolance cards have only one flow path. I didn't have it plumbed 100% correctly last time, as the top two cards were basically getting only branch flow. Should be fixed now...


----------



## bond32

Still no progress. I'm also short on time most days and don't have time to properly troubleshoot.

I purchased a 6th 290X from a member here. Yes, I now have 6. My goal is to hoard all the 290/290X's on the planet!


But seriously, if I still have these driver problems, I may have to either get a new board, or bite the bullet and move to X99 (expensive, considering I'd need DDR4). If I could get my hands on an Asus Z97-WS board, that would be nice, but I can't find any.


Basically, the symptoms are:
Clean install of Windows 10 LTSB on RAID 0 drives (yes, I've tried a single drive for giggles, no luck). On first starting Windows, either letting Windows install drivers, or telling it not to install anything and then installing the latest AMD display drivers myself, results in an install attempt (screen flickering, normal stuff), then the screen goes black. After a restart I find no drivers have been installed. I have tried just about everything; the only thing I made progress with was turning off cards via the dip switches.


----------



## mAs81

bond32 said:


> Still no progress. I'm also short on time most days and don't have time to properly troubleshoot.
> 
> I purchased a 6th 290X from a member here. Yes, I now have 6. My goal is to hoard all the 290/290X's on the planet!
> 
> 
> But seriously, if I still have these driver problems, I may have to either get a new board, or bite the bullet and move to X99 (expensive, considering I'd need DDR4). If I could get my hands on an Asus Z97-WS board, that would be nice, but I can't find any.


Nice! 290's/290X's still go for 120-150 euro used here, depending on the card.



bond32 said:


> Basically, the symptoms are:
> Clean install of Windows 10 LTSB on RAID 0 drives (yes, I've tried a single drive for giggles, no luck). On first starting Windows, either letting Windows install drivers, or telling it not to install anything and then installing the latest AMD display drivers myself, results in an install attempt (screen flickering, normal stuff), then the screen goes black. After a restart I find no drivers have been installed. I have tried just about everything; the only thing I made progress with was turning off cards via the dip switches.



It might be a longshot, but have you tried disabling fast boot? It was a long-time culprit in the R9 black screen issues of the past.



In Win10, go to Settings > System > Power & sleep > Additional power settings > Choose what the power button does > Change settings that are currently unavailable. After that, you can un-check 'Turn on fast startup'.
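For reference, the 'Turn on fast startup' checkbox is backed by a registry value, so you can confirm the state directly. The key path and value name below are from memory, so double-check them on your own machine before relying on this:

```
Key:   HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Power
Value: HiberbootEnabled (REG_DWORD)
       0 = fast startup off, 1 = fast startup on
```

Handy when the Control Panel checkbox is greyed out or you want to verify a change actually stuck after a reboot.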


----------



## rdr09

bond32 said:


> Still no progress. I'm also short on time most days and don't have time to properly troubleshoot.
> 
> I purchased a 6th 290X from a member here. Yes, I now have 6. My goal is to hoard all the 290/290X's on the planet!
> 
> 
> But seriously, if I still have these driver problems, I may have to either get a new board, or bite the bullet and move to X99 (expensive, considering I'd need DDR4). If I could get my hands on an Asus Z97-WS board, that would be nice, but I can't find any.
> 
> 
> Basically, the symptoms are:
> Clean install of Windows 10 LTSB on RAID 0 drives (yes, I've tried a single drive for giggles, no luck). On first starting Windows, either letting Windows install drivers, or telling it not to install anything and then installing the latest AMD display drivers myself, results in an install attempt (screen flickering, normal stuff), then the screen goes black. After a restart I find no drivers have been installed. I have tried just about everything; the only thing I made progress with was turning off cards via the dip switches.


Do you have all cards powered up (plugged in)? If so, plug power into just the first card, then install the driver. After the driver installs, shut down the system, then plug in the rest before turning the PC back on. Crossfire should be set automatically; if not, set it manually.


----------



## bond32

rdr09 said:


> Do you have all cards powered up (plugged in)? If so, plug power into just the first card, then install the driver. After the driver installs, shut down the system, then plug in the rest before turning the PC back on. Crossfire should be set automatically; if not, set it manually.


That's basically the only progress I've made so far. I've singled out each card too, which led me to the third card; I found it doesn't produce any video.

Regardless, I've singled each out and been able to install the drivers. But when enabling all 4 again, I get a black screen.

I'm about 80 percent sure the problem was two bad cards. I'm not sure how I killed the second card; maybe I did something wrong. Either way, another is on the way. I also may have found an Asus Z97-WS board...

I've been using an EVGA Z87 Classified, and it's been a headache to say the least. However, multi-GPU boards like this aren't easy to come by.

Sent from my Pixel 2 XL using Tapatalk


----------



## jagdtigger

bond32 said:


> That's basically the only progress I've made so far. I've singled out each card too, which led me to the third card; I found it doesn't produce any video.
> 
> Regardless, I've singled each out and been able to install the drivers. But when enabling all 4 again, I get a black screen.
> 
> I'm about 80 percent sure the problem was two bad cards. I'm not sure how I killed the second card; maybe I did something wrong. Either way, another is on the way. I also may have found an Asus Z97-WS board...
> 
> I've been using an EVGA Z87 Classified, and it's been a headache to say the least. However, multi-GPU boards like this aren't easy to come by.
> 
> Sent from my Pixel 2 XL using Tapatalk


Maybe you're out of PCI-E lanes...


----------



## bond32

jagdtigger said:


> Maybe you are out of PCI-E lanes....


The board uses a PLX chip; 4-way runs at 8x8x8x8. But I did suspect it might be something with my using RAID 0, so I installed Windows fresh on a single SSD. Same result.

Sent from my Pixel 2 XL using Tapatalk


----------



## mAs81

bond32 said:


> The board uses a PLX chip; 4-way runs at 8x8x8x8. But I did suspect it might be something with my using RAID 0, so I installed Windows fresh on a single SSD. Same result.


Is fast startup enabled?


----------



## bond32

mAs81 said:


> Is fast startup enabled?


Crap, I honestly forgot to check last night.

I don't think so, but I could be wrong. Again, I've run quad 290's before, using this same board, and never had anywhere near these issues.

Sent from my Pixel 2 XL using Tapatalk


----------



## mAs81

bond32 said:


> Crap, I honestly forgot to check last night.
> 
> I don't think so, but I could be wrong. Again, I've run quad 290's before, using this same board, and never had anywhere near these issues.


Again, it might be a longshot, but let's hope it's something as simple as that.


----------



## jagdtigger

bond32 said:


> Crap, I honestly forgot to check last night.
> 
> I don't think so, but I could be wrong. Again, I've run quad 290's before, using this same board, and never had anywhere near these issues.
> 
> Sent from my Pixel 2 XL using Tapatalk


Maybe it's an OS issue; try testing the GPUs with a live Linux.


----------



## bond32

Success!

Thanks to another member selling me his old 290X, that was the problem; somehow I killed two cards. Not sure what I did wrong, but so far so good on a fresh install with (3) 290X's and (1) 290.

I haven't done any gaming or benching yet, but all 4 cards installed without a problem now.


----------



## TwilightRavens

bond32 said:


> Success!
> 
> Thanks to another member selling me his old 290X, that was the problem; somehow I killed two cards. Not sure what I did wrong, but so far so good on a fresh install with (3) 290X's and (1) 290.
> 
> I haven't done any gaming or benching yet, but all 4 cards installed without a problem now.


You might wanna look into flashing that 290 to a 290X, most of the 290’s are capable of it. That way it won’t limit your 290X’s to 290 specs.
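For anyone tempted to try, a typical flash sequence with atiflash looks roughly like the following. The adapter index, filenames, and use of the force flag are illustrative, and the Hawaii device IDs are quoted from memory, so verify everything against your own card in GPU-Z before writing anything:

```
:: check the device ID first in GPU-Z (from memory: 67B0 = Hawaii XT / 290X,
:: 67B1 = Hawaii PRO / 290); not every 290 has unlockable shaders
:: back up the card's current BIOS (run from an elevated prompt)
atiflash -s 0 backup.rom
:: write the 290X BIOS to adapter 0; -f forces past ID mismatch warnings
atiflash -p 0 290x.rom -f
```

Keep the backup somewhere safe; if the unlock fails (artifacts, black screen), flashing backup.rom back restores the card.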


----------



## bond32

TwilightRavens said:


> You might wanna look into flashing that 290 to a 290X, most of the 290’s are capable of it. That way it won’t limit your 290X’s to 290 specs.


The working 290 I have does not unlock to a 290X. It was my understanding the device ID needed to reflect a certain number (I don't recall what).

Currently on the Insanity 390/390x bios. In my own testing, that has provided slightly higher benchmark numbers than PT1, although I've hit the highest clocks on PT1. 

Sent from my Pixel 2 XL using Tapatalk


----------



## bond32

Basically had time to run a few benches last night...

FS Ultra at stock clocks on the cards 7938
https://www.3dmark.com/fs/16978405

Not sure if my power supply is the limiting factor now, but I noticed serious instability when I began to raise the clocks. I need a watt meter at the wall...


Edit: For comparison, I found my previous top quad score:

https://www.3dmark.com/fs/3239962
10,746 - that was with two 1300 G2's; now I only have one.


----------



## sinnedone

Try finding the same driver you used before. AMD has changed a lot in how it handles clocks.


----------



## revengeyo

Hello my friends

I recently bought an R9 290X with an Arctic Accelero IV cooler on it.
All temps stay around 60°C. It has extra VRAM and VRM heatsinks on it.

3DMark11 performance score: 14,500
3DMark Firestrike: 10,300
Seems all normal.

Now, when playing Battlefield 1, I only get around 40 fps out of it on LOW/MEDIUM/HIGH/ULTRA.
It makes no difference.
GPU usage is 100%
CPU usage is 60-80% (2600k @ 4.6)

I know it should do around 75 fps+ on ultra

The same for Battlefield 4: only around 50-60 fps on all settings.

In PES 2019 it does only 35 fps. With my old GTX 770 it does 100+ fps.....

I tried reinstalling the AMD drivers (using Display Driver Uninstaller). I tried different drivers, but no luck.

Am I missing something?

My old GTX 770 performed better!

I also tried my friend's GTX 1060, and it worked great at 90+ fps.

Please help!


Specs:

Corsair RM 850 PSU
i7 2600K @ 4.6 GHz
8 GB 1600 MHz Corsair Vengeance RAM
ASRock Z68 Extreme3 Gen4
SSD 500gb Samsung EVO 850


----------



## JMCB

I'm also having issues with BF1 (and BF5) on my old rig, which runs 4x R9 290X. I've used that setup for almost 5 years, and when BF1 came out, I got a great framerate and CrossFire was working. Now, when I play it with the latest drivers, I get a jittery mess of 40-50 fps with CrossFire enabled, and it looks bad. BF5 doesn't even work with CF (despite the beta working fine on ultra settings). It's a little disheartening, as I use this rig to play games when friends are over, and we can't game well on it anymore.


----------



## revengeyo

Solved it!
When I changed the PCI-E cable, it suddenly worked great again!!
It runs 80+ fps in BF1 now.
Who would have thought? The PCI-E cable....


----------



## revengeyo

Lol, I just started to overclock my card, and at +100 mV I get 1150 on the core and 1625 MHz on the memory!
The memory is Elpida, btw.....
I cannot go higher than 1625 MHz because the MSI Afterburner limit is 1625.
How can I raise this slider maximum?


----------



## JMCB

In the settings, click on Extend Overclocking Limits. Then restart your PC. Should work.
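If the checkbox won't stick, the same switch lives in MSIAfterburner.cfg in the Afterburner install folder. These key names are from memory of the older AMD-era builds, so treat them as a pointer rather than gospel, and back up the file first:

```ini
; MSIAfterburner.cfg, [Settings] section: enabling unofficial overclocking
; mode is what extends the AMD clock sliders past their official limits
[Settings]
UnofficialOverclockingEULA = I confirm that I am aware of unofficial overclocking limitations and fully understand that MSI will not provide me any support on it
UnofficialOverclockingMode = 1
```

Close Afterburner before editing, then restart it so the new limits take effect.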


----------



## Talon720

I don't want to, but ever since I bought my three 290X's back when they came out, I've had more problems than not. My backup mini-ITX build was using a 750 Ti and the system worked flawlessly. Even running just one of my 290X's is sometimes funky. I'd hate to jump off AMD since I got a FreeSync monitor, but I don't know how much more I can take of what seems like constant driver issues. To further complicate things, ever since I switched to full UEFI Windows I can't get any version of atiflash or atiwinflash to work.


----------



## revengeyo

JMCB said:


> In the settings, click on Extend Overclocking Limits. Then restart your PC. Should work.


I tried this and the option stays greyed out; I cannot enable it!
I tried reinstalling the drivers and Afterburner. It did not work.
What is going on here?


----------



## NaturalAimi

Does anyone know the thickness of the thermal pads on a reference R9 290?


----------



## BradleyW

revengeyo said:


> Solved it!
> When I changed the PCI-E cable, it suddenly worked great again!!
> It runs 80+ fps in BF1 now.
> Who would have thought? The PCI-E cable....


Resistance on the cable, normally from a loose crimp connection. Can happen over time.


----------



## BradleyW

NaturalAimi said:


> Does anyone know the thickness of the thermal pads on a reference R9 290?


1 mm VRM, 0.5 mm VRAM, I believe.


----------



## rdr09

I may have to keep my 290s longer. I used to OC the core to 1300 MHz to even get near a 14,000 graphics score. Now it's just 1100 MHz using the latest driver. Just an express install over 17.7.


https://www.3dmark.com/3dm/31324181?


----------



## ratchet4234

Recently I have found the AMD drivers are interfering with my graphics card software: I have to change the voltage in the Sapphire OC utility software and then change core speeds via WattMan in the AMD Radeon driver.

Personally, I gave up on all the hassle for a 10% gain in fps, because most of what I play today is EVE Online, Assetto Corsa, and other racing sims, and they are not demanding at 1080p. I might get a 4K monitor soon, and that would force me into a GPU upgrade.


----------



## Zypheryal

Hi y'all!!! Building a rig as a gift for my brother.

Got a second-hand Radeon R9 290 and did not expect it to be this hot and noisy.

So I'm going to remove the stock heatsink and cut the stock metal plate to fit a modern AIO, while respecting the height of the inner space of the blower's plastic shroud to keep it looking original. I'll only cut the outer middle (glass-panel side of the ATX case) to route the AIO tubes to a radiator at the top.

My question is: which AIOs available these days fit the specific R9 290 mounting holes?

Should I always measure the hole spacing diagonally? I saw 3 inches in a previous image on this forum.

Thank you!


----------



## Gilles3000

Zypheryal said:


> Hi y'all!!! Building a rig as a gift for my brother.
> 
> Got a second-hand Radeon R9 290 and did not expect it to be this hot and noisy.
> 
> So I'm going to remove the stock heatsink and cut the stock metal plate to fit a modern AIO, while respecting the height of the inner space of the blower's plastic shroud to keep it looking original. I'll only cut the outer middle (glass-panel side of the ATX case) to route the AIO tubes to a radiator at the top.
> 
> My question is: which AIOs available these days fit the specific R9 290 mounting holes?
> 
> Should I always measure the hole spacing diagonally? I saw 3 inches in a previous image on this forum.
> 
> Thank you!


Depends on how you want to mount it. If you're going to make your own brackets, anything goes.

If you want something off the shelf, there's the Corsair HG10 A1 and the NZXT G10.


----------



## Zypheryal

Hey, a warm thanks for your reply!

Opting for drilling, sawing, and filing! So yeah, I'd love to make my own bracket; I want to learn DIY rather than take the easy way. I'll keep the stock metal plate and cut it to fit the AIO.

But I am searching for an AIO whose block, whether round or square, doesn't cover the four mounting-bolt holes in the PCB.

I also want to keep the plastic blower cover and reuse the stock Radeon fan with the stock metal plate, just removing the heatsink and cutting the AIO's shape into the plate.

I just want an AIO head that is not too tall, so I can put the modded shroud back on (a rectangle cut into the stock plastic shroud for the AIO tubes) and it looks like the stock original, but with tubes and a rad.


----------



## Zypheryal

Wondering if this Rosewill model will work: https://www.amazon.com/Rosewill-Liquid-Cooler-Closed-Cooling/dp/B07CQDGT2R?th=1 (single-fan rad or double-fan rad?). Will it fit?

They list the radiator dimensions but not the GPU block mounting-hole specs.

Will keep searching the possibilities.


----------



## TwilightRavens

For best results I recommend an AIO with a 240mm radiator. I used an H75, and it cooled the card, don't get me wrong, but it just didn't seem like enough at times when I was trying to get the card below 60°C. I know for sure the H75 does fit, though.

You might also want to look into the Cooler Master MasterLiquid series; the one I have (a 240mm) cools my old Core 2 Extreme with relative ease.

The main problem with mounting an AIO to a 290X is that you need a 92mm fan blowing over the VRMs, otherwise they will climb to 120°C+ pretty rapidly and just wreck the card. So if you do make a bracket that works, don't forget to cool the VRMs/MOSFETs.


----------



## Zypheryal

Thx TwilightRavens! I will follow this video: https://www.youtube.com/watch?v=OvLjsd4kQXU

I just bought the exact same Cooler Master Seidon 120M (2012) off eBay, plus I'm reusing the stock fan, shroud, and metal plate!

Gonna be lots of DIY fun!


----------



## Gilles3000

Zypheryal said:


> Thx TwilightRavens! I will follow this video: https://www.youtube.com/watch?v=OvLjsd4kQXU
> 
> I just bought the exact same Cooler Master Seidon 120M (2012) off eBay, plus I'm reusing the stock fan, shroud, and metal plate!
> 
> Gonna be lots of DIY fun!


Looks like a pretty decent guide. The only thing I'd add is plastic/nylon washers where the nuts contact the PCB, just to avoid any damage to it.


----------



## jagdtigger

Hi all!

After long service, my 290X got replaced by a Vega 64 last year. I'd like to ask if someone here has the pad thicknesses for the reference PCB and cooler, by any chance.

Thanks in advance!


----------



## rdr09

jagdtigger said:


> Hi all!
> 
> After a long service my 290X got replaced by a Vega 64 last year. Id like to ask if someone here has the pad thickness for the reference pcb and cooler by any chance....
> 
> Thanks in advance!


It's 0.5mm for the memory and 1mm for the vrm/mosfet.


----------



## jagdtigger

rdr09 said:


> It's 0.5mm for the memory and 1mm for the vrm/mosfet.


Thanks  .


----------



## jagdtigger

Okay, managed to put it back together. The fan is a bit dodgy: when it runs at a higher RPM, it's audible that it's not stable. It oscillates up and down continuously. I really hope it only needs to run for a while so it can get its act together...


----------



## TwilightRavens

I really need to get my 290X up and running one of these days. I've been too preoccupied fixing the thermals on my MSI 1080 Ti, which reminds me of how bad the thermals were on my MSI 290X too.


----------



## rdr09

TwilightRavens said:


> I really need to get my 290X up and running one of these days, been too preoccupied fixing the thermals on my MSI 1080 ti, reminds me of how bad the thermals were on my MSI 290X too.


Still have 2 reference 290s with waterblocks on them in the closet. I decided to go back to air, and I got rid of the stock coolers. Those two never saw 60°C on any of their sensors.


----------



## TwilightRavens

rdr09 said:


> Still have 2 reference 290s with waterblocks on them in the closet. Decided to go back to air and i got rid of the stock coolers. Those two never saw 60c in any of their temps.



I might invest in a Morpheus II; I heard that's really effective on a 290X.


----------



## Blameless

Still have my original reference 290 (flashed to X) that I got during launch week for $400... the thing has something like 40k hours of mining on it and still works flawlessly, except for the rattling bearings in the fan.

I also had three Sapphire 290X Tri-Xs, all of which died with much less use, though they were mostly gaming parts, so I ran them quite a bit harder when they were in use.


----------



## mAs81

I’m really happy with my Vapor-X 290; it still performs very well at 1080p60. It is unlockable, but I still haven’t flashed it to a 290X though...


----------



## TwilightRavens

Just dug out my old coolerless MSI R9 290X Gaming card from storage and gave it a good alcohol cleaning; I'm planning to resurrect it in the next few weeks. Since the stock MSI cooler has always been inadequate for this GPU (and I lost about 90% of the screws to mount it), I decided to outfit it with an Arctic Accelero III. I saw how well that cooler cools my 1080 Ti, so I'll give it a shot on my 290X to see if it can tame the beast, because I'd like to slot it into a system down the road if I'm able to.


----------



## L N

What is the difference between the Sapphire Tri-X R9 290X OC and the Sapphire Tri-X R9 290X?

Are they the same card PCB-wise?

If so, do they support the Kraken G12 GPU cooler, and do I need a copper shim?


----------



## badheaven

Anyone with a Gigabyte R9 290 Windforce know if you have to remove the heatsink first in order to remove the fan shroud? Need to replace the fans.


----------



## DDSZ

badheaven said:


> Anyone with a Gigabyte R9 290 Windforce know if you have to remove the heatsink first in order to remove the fan shroud? Need to replace the fans.


You can unscrew the fan shroud if you remove the rear shield, as far as I remember.


----------



## ElevenEleven

I have an old R9 290X, MSI Lightning version. It came to me from a friend's computer with nothing but EK watercooling plates on both sides, original aircooling shroud removed and misplaced. I'd like to use the card for a while, but need to cool it somehow. I'm going to check how it does with just the EK plates on it and some strong case fan airflow across, but I suspect more than that will be needed.

Is there anything special about the MSI Lightning version of this card that I need to worry about with an aftermarket air cooler? Any recommendations?


----------



## Raephen

ElevenEleven said:


> I have an old R9 290X, MSI Lightning version. It came to me from a friend's computer with nothing but EK watercooling plates on both sides, original aircooling shroud removed and misplaced. I'd like to use the card for a while, but need to cool it somehow. I'm going to check how it does with just the EK plates on it and some strong case fan airflow across, but I suspect more than that will be needed.
> 
> Is there anything special about the MSI Lightning version of this card that I need to worry about with an aftermarket air cooler? Any recommendations?


Hello there,

I used an Arctic Twin Turbo II to cool my previously water-cooled R9 290 when I moved it from my old watercooled i5 3570K system to a (then) brand spanking new Ryzen 5 1600X system.


----------



## rdr09

Testing one of my watercooled reference R9 290s that has been sitting in the closet since I replaced them with the 5700s. Eight long years on the same Fujipoly pads and liquid-metal paste. Temps are still good after a few minutes of gaming. I'll be testing the second one shortly. Great cards!

FS results compared (3DMark result): www.3dmark.com


----------



## rdr09

The second R9 290 works just as well.

AMD Radeon R9 290 video card benchmark result - AMD Ryzen 7 2700, ASUSTeK COMPUTER INC. PRIME X470-PRO (3dmark.com)


----------



## dagget3450

Haha, I had two of my old 290X cards come back to me from friends. They upgraded and no longer need them. I have water blocks, but I don't think they will make it onto these; I have some 390X cards I've been meaning to put under the EK blocks.

I do have one 290X in a spare rig on a reference cooler, though. It just gets used for streaming and web browsing.

I always liked my Hawaii cards, even if they were the red-headed stepchild of GPUs back then.


----------



## rdr09

dagget3450 said:


> Haha, I had two of my old 290X cards come back to me from friends. They upgraded and no longer need them. I have water blocks, but I don't think they will make it onto these; I have some 390X cards I've been meaning to put under the EK blocks.
> 
> I do have one 290X in a spare rig on a reference cooler, though. It just gets used for streaming and web browsing.
> 
> I always liked my Hawaii cards, even if they were the red-headed stepchild of GPUs back then.


Well, they are both back in the closet. I enjoyed those reference cards. Never had issues like blackscreen and stuff. They just work. Crossfired, they worked well for 4K gaming.


----------



## chris89

Does anyone have the complete strings of the Elpida & Hynix memory timings?

I have this info here but I need the strings for Elpida I think.

290 290x 390 390x memory timings help video - YouTube

Strap end 400MHz (40 9C 00) , Range = 150-400MHz
Strap end 800MHz (80 38 01) , Range = 401-800MHz
Strap end 900MHz (90 5F 01) , Range = 801-900MHz
Strap end 1000MHz (A0 86 01) , Range = 901-1000MHz
Strap end 1125MHz (74 B7 01) , Range = 1001-1125MHz
Strap end 1250MHz (48 E8 01) , Range = 1126-1250MHz
Strap end 1375MHz (1C 19 02) , Range = 1251-1375MHz
Strap end 1500MHz (F0 49 02) , Range = 1376-1500MHz
Strap end 1625MHz (C4 7A 02) , Range = 1501-1625MHz
Strap end 1750MHZ (98 AB 02) , Range = 1626-1750MHz


----------



## mirzet1976

chris89 said:


> Does anyone have the complete strings of the Elpida & Hynix memory timings?
> 
> I have this info here but I need the strings for Elpida I think.
> 
> 290 290x 390 390x memory timings help video - YouTube
> 
> Strap end 400MHz (40 9C 00) , Range = 150-400MHz
> Strap end 800MHz (80 38 01) , Range = 401-800MHz
> Strap end 900MHz (90 5F 01) , Range = 801-900MHz
> Strap end 1000MHz (A0 86 01) , Range = 901-1000MHz
> Strap end 1125MHz (74 B7 01) , Range = 1001-1125MHz
> Strap end 1250MHz (48 E8 01) , Range = 1126-1250MHz
> Strap end 1375MHz (1C 19 02) , Range = 1251-1375MHz
> Strap end 1500MHz (F0 49 02) , Range = 1376-1500MHz
> Strap end 1625MHz (C4 7A 02) , Range = 1501-1625MHz
> Strap end 1750MHZ (98 AB 02) , Range = 1626-1750MHz


In the BIOS, Hynix has the designation 01, Elpida 02, and Samsung 00, like this:

40 9C 00 01 55 51 33 20 00 00 00 00 84 94 12 12 F0 54 0B 07 15 09 73 02 00 20 41 00 22 DD 1C 08 1B 04 14 20 9A 88 00 A0 00 00 01 20 06 05 0D 0E 27 0F 16 0E

strap 40 9C 00 + 01(Hynix) + timings 55 51 33 20 00 00 00 00 84 94 12 12 F0 54 0B 07 15 09 73 02 00 20 41 00 22 DD 1C 08 1B 04 14 20 9A 88 00 A0 00 00 01 20 06 05 0D 0E 27 0F 16 0E

40 9C 00 02 99 91 33 20 00 00 00 00 60 94 12 0F D0 54 0C 07 15 88 92 01 00 20 41 00 22 AA 1C 08 1D 03 14 20 9A 88 80 A2 00 00 00 C0 06 01 0A 0F 19 0E 16 0D

strap 40 9C 00 + 02(Elpida) + timings 99 91 33 20 00 00 00 00 60 94 12 0F D0 54 0C 07 15 88 92 01 00 20 41 00 22 AA 1C 08 1D 03 14 20 9A 88 80 A2 00 00 00 C0 06 01 0A 0F 19 0E 16 0D

Here are all the Elpida and Hynix straps.
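For reference, the strap headers can be decoded mechanically. A minimal sketch, assuming (as every strap listed in this thread suggests, e.g. 40 9C 00 = 400 MHz, 80 38 01 = 800 MHz) that the first three bytes are the strap's end frequency as a little-endian integer in units of 10 kHz, with the vendor byte following:

```python
# Decode a Hawaii vBIOS memory-strap header.
# Assumption inferred from the values in this thread: bytes 0-2 are the
# strap end frequency, little-endian, in 10 kHz units; byte 3 is the
# memory vendor (00 = Samsung, 01 = Hynix, 02 = Elpida).

VENDORS = {0x00: "Samsung", 0x01: "Hynix", 0x02: "Elpida"}

def decode_strap_header(hex_bytes: str):
    b = bytes.fromhex(hex_bytes.replace(" ", ""))
    freq_mhz = int.from_bytes(b[0:3], "little") / 100  # 10 kHz -> MHz
    vendor = VENDORS.get(b[3], "unknown")
    return freq_mhz, vendor

print(decode_strap_header("40 9C 00 01"))  # (400.0, 'Hynix')
print(decode_strap_header("98 AB 02 02"))  # (1750.0, 'Elpida')
```

The same decode applies to any strap header pulled out of the BIOS dump with a hex editor.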


----------



## chris89

@mirzet1976 Thanks bro.

What MHz straps are 20 4E 00 & A4 2C 02 in Elpida?

Do you also have all the Samsung straps, just in case I run across a Samsung card? How can I tell, just by looking at the BIOS, which memory the card is using? Or do cards only carry the straps for their designated memory type, rather than multiple strap sections in hex?

So for this video, which version (if it matters) of Hex Workshop do I need to make mine look like his? How did he statically highlight the straps like he did?






*I made this table to simplify the straps*

*ELPIDA*

20 4E 00 02 99 91 33 20 00 00 00 00 60 88 11 07 C0 54 0B 06 0F 05 C1 00 00 20 41 00 22 AA 1C 08 15 00 14 20 9A 88 40 A1 00 00 00 C0 03 01 05 07 0C 0A 10 0C

Strap end 400MHz (40 9C 00) , Range = 150-400MHz
40 9C 00 02 99 91 33 20 00 00 00 00 60 94 12 0F D0 54 0C 07 15 88 92 01 00 20 41 00 22 AA 1C 08 1D 03 14 20 9A 88 80 A2 00 00 00 C0 06 01 0A 0F 19 0E 16 0D

Strap end 800MHz (80 38 01) , Range = 401-800MHz
80 38 01 02 99 91 33 20 00 00 00 00 A5 AC 35 1F 10 55 0E 0B 21 8E 35 03 00 44 82 00 22 AA 1C 08 3D 09 14 20 2A 89 00 A5 00 00 00 C0 0C 06 14 1A 33 18 22 10

Strap end 1000MHz (A0 86 01) , Range = 901-1000MHz
A0 86 01 02 77 71 33 20 00 00 00 00 29 39 57 27 50 55 0D 0E 26 11 07 04 00 68 C2 00 22 AA 1C 08 54 0C 14 20 AA 89 00 A6 00 00 00 C0 0F 0A 19 1E 40 1E 27 12

Strap end 1250MHz (48 E8 01) , Range = 1126-1250MHz
48 E8 01 02 77 71 33 20 00 00 00 00 AD 49 59 32 70 55 0E 10 2D 15 19 05 00 68 C3 00 22 AA 1C 08 64 0F 14 20 BA 89 80 A7 00 00 00 C0 13 0E 20 25 51 24 2E 13

Strap end 1375MHz (1C 19 02) , Range = 1251-1375MHz
1C 19 02 02 77 71 33 20 00 00 00 00 EF 51 6A 37 90 55 0F 12 32 17 9A 05 00 6A E4 00 22 AA 1C 08 74 02 14 20 CA 89 C0 A8 02 00 00 C0 15 10 23 28 59 28 33 15

A4 2C 02 02 77 71 33 20 00 00 00 00 10 D6 6A 39 90 55 0F 12 34 98 CA 05 00 6A E4 00 22 AA 1C 08 74 03 14 20 CA 89 00 A9 02 00 00 C0 16 11 24 29 5C 29 35 15

Strap end 1500MHz (F0 49 02) , Range = 1376-1500MHz
F0 49 02 02 77 71 33 20 00 00 00 00 31 5A 6B 3C A0 55 0F 13 36 19 1B 06 00 6A E4 00 22 AA 1C 08 7C 04 14 20 CA 89 80 A9 02 00 00 C0 17 12 26 2B 61 2B 37 15

Strap end 1625MHz (C4 7A 02) , Range = 1501-1625MHz
C4 7A 02 02 77 71 33 20 00 00 00 00 73 62 7C 41 B0 55 10 14 3A 1B 9C 06 00 6C 06 01 22 AA 1C 08 04 06 14 20 EA 89 40 AA 03 00 00 C0 19 14 29 2E 69 2E 3B 16

Strap end 1750MHZ (98 AB 02) , Range = 1626-1750MHz
98 AB 02 02 77 71 33 20 00 00 00 00 B5 6A 7D 46 C0 55 10 15 3E 1D 1D 07 00 6C 07 01 22 AA 1C 08 0C 08 14 20 FA 89 00 AB 03 00 00 C0 1B 16 2C 31 71 31 3F 17 

*HYNIX*

Strap end 400MHz (40 9C 00) , Range = 150-400MHz 
40 9C 00 01 55 51 33 20 00 00 00 00 84 94 12 12 F0 54 0B 07 15 09 73 02 00 20 41 00 22 DD 1C 08 1B 04 14 20 9A 88 00 A0 00 00 01 20 06 05 0D 0E 27 0F 16 0E

Strap end 800MHz (80 38 01) , Range = 401-800MHz
80 38 01 01 77 71 33 20 00 00 00 00 E7 AC 35 22 10 55 0D 0A 20 8E F5 04 00 24 81 00 22 DD 1C 08 34 09 14 20 9A 88 00 A0 00 00 01 20 0C 08 17 1B 4F 17 21 10

Strap end 900MHz (90 5F 01) , Range = 801-900MHz
90 5F 01 01 77 71 33 20 00 00 00 00 29 31 46 26 20 55 0E 0B 22 0F 96 05 00 26 A2 00 22 DD 1C 08 3C 0A 14 20 AA 88 00 A0 00 00 01 20 0D 0A 1A 1D 59 19 23 11

Strap end 1000MHz (A0 86 01) , Range = 901-1000MHz
A0 86 01 01 77 71 33 20 00 00 00 00 29 B5 46 29 30 55 0E 0C 24 90 26 06 00 26 A2 00 22 DD 1C 08 44 0B 14 20 AA 88 00 A0 00 00 01 20 0E 0A 1C 20 62 1B 25 11

Strap end 1125MHz (74 B7 01) , Range = 1001-1125MHz
74 B7 01 01 77 71 33 20 00 00 00 00 6B BD 57 2F 40 55 0F 0D 28 92 F7 06 00 48 C5 00 22 FF 1C 08 4C 0D 14 20 5A 89 00 A0 00 00 01 20 10 0C 20 24 6F 1E 29 12

Strap end 1250MHz (48 E8 01) , Range = 1126-1250MHz
48 E8 01 01 77 71 33 20 00 00 00 00 8C C5 58 34 60 55 0F 0F 2C 94 B8 07 00 48 C5 00 22 FF 1C 08 5C 0F 14 20 5A 89 00 A0 00 00 01 20 12 0D 23 28 7B 22 2D 13

Strap end 1375MHz (1C 19 02) , Range = 1251-1375MHz
1C 19 02 01 77 71 33 20 00 00 00 00 CE CD 59 39 80 55 11 11 2E 15 89 08 00 48 C6 00 22 33 9D 08 6C 00 14 20 6A 89 00 A0 02 00 01 20 14 0F 26 2B 88 25 2F 15

A4 2C 02 01 77 71 33 20 00 00 00 00 CE 51 6A 3B 80 55 11 11 2F 96 D9 08 00 4A E6 00 22 33 9D 08 6C 00 14 20 6A 89 00 A0 02 00 01 20 15 0F 27 2D 8D 26 30 15

Strap end 1500MHz (F0 49 02) , Range = 1376-1500MHz
F0 49 02 01 77 71 33 20 00 00 00 00 CE 51 6A 3D 90 55 11 12 30 96 49 09 00 4A E6 00 22 33 9D 08 74 01 14 20 6A 89 00 A0 02 00 01 20 15 0F 29 2F 94 27 31 16

Strap end 1625MHz (C4 7A 02) , Range = 1501-1625MHz
C4 7A 02 01 99 91 33 20 00 00 00 00 10 DE 7B 44 80 55 13 12 37 19 4B 0A 00 4C 06 01 22 55 9D 08 75 04 14 20 6A 89 00 A0 02 00 01 20 18 11 2D 34 A4 2A 38 16


----------



## ElevenEleven

GPU-Z should show the memory type. My MSI Lightning R9 290x says "Memory Type: GDDR5 (Samsung)"


----------



## chris89

@ElevenEleven Thanks. The only reason I ask is that I do BIOSes for people quite often, and it's time-consuming to have them send me their memory type. I was just wondering whether these cards carry multiple sections of memory timings or just one section. From my experience, they do carry all the timings for the different memory types in the BIOS, just like Polaris cards do. I'm too lazy to check for myself, but if no one knows, I guess I'll just keep asking people for their memory brand.

Update - Looking over @kikorog's BIOS, I saw only Elpida & Hynix in there, so I guess I answered my own question. So @kikorog, what memory do you have? It says in GPU-Z. By the way, how is the BIOS I made for you?


----------



## chris89

PS - Here are the clocks & voltages I found on my 390X with my 1000W Kingwin power supply.

Giga = billion, so billions of pixels per second.

The highest core clock I got was 1240 MHz @ 1440 mV.

1094mhz @ 1250mv @ 70.0 Giga Pixel/s
1133mhz @ 1333mv @ 72.5 Giga Pixel/s
1157mhz @ 1357mv @ 74.0 Giga Pixel/s
1173mhz @ 1373mv @ 75.0 Giga Pixel/s
1188mhz @ 1388mv @ 76.0 Giga Pixel/s
1204mhz @ 1404mv @ 77.0 Giga Pixel/s
1219mhz @ 1419mv @ 78.0 Giga Pixel/s
1234mhz @ 1434mv @ 79.0 Giga Pixel/s
1250mhz @ 1450mv @ 80.0 Giga Pixel/s
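The fill-rate figures in the table line up with peak pixel fill rate = core clock × ROP count, assuming the 64 ROPs of a reference Hawaii/Grenada card. A quick sketch:

```python
# Peak pixel fill rate from core clock, assuming 64 ROPs (Hawaii/Grenada).
ROPS = 64

def pixel_fill_rate(core_mhz: int, rops: int = ROPS) -> float:
    """Return peak fill rate in GPixel/s: clock (MHz) * ROPs / 1000."""
    return core_mhz * rops / 1000

print(pixel_fill_rate(1250))  # 80.0, matching the last row of the table
```

The table rounds to the nearest half GPixel/s (e.g. 1094 MHz × 64 = 70.016, listed as 70.0).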


----------



## Bride

I got another MSI 290 Lightning, lovely card... trying to find the right BIOS to convert it into a 290X.


----------



## nashii74

Hi all. My GPU is a FirePro W9100 with a BIOS mod.
It's fully watercooled with an Alphacool block in an EKWB loop.


http://gpuz.techpowerup.com/21/09/01/a4x.png




http://gpuz.techpowerup.com/21/09/01/zmn.png



CPU: Intel i7 3770K @ 4GHz | Motherboard: MSI Z77A-GD80 | RAM: 32GB DDR3 (4x8GB) G.SKILL Ripjaws @ 2400MHz | GPU: AMD FirePro W9100 @ 1140MHz OC, 16GB VRAM @ 1750MHz (7GHz effective) OC


----------



## chris89

@nashii74 Impressive card, 16GB of VRAM, bro.


----------



## nashii74

Thanks @chris89 
I'm always working to find the best settings for my use. I spent a lot of time figuring out how to OC correctly in the BIOS using Hawaii Bios Reader (thanks to the creators of this very helpful tool) and HxD to edit the BIOS, with help from several videos and forum writeups.


----------



## 0x6A7232

Hi, I have a Sapphire 290X Tri-X OC (4GB), which runs 1040 MHz core and 1350 MHz RAM. I'm wondering if I could flash it with the BIOS from the Sapphire 290X Toxic 4GB, which runs 1100 MHz core and 1400 MHz RAM, but I can't find that BIOS. Is there a repository of good BIOSes to try, or a good guide for modding the 290X BIOS? I have the Hawaii BIOS Reader and I've dumped my BIOS. I tried the 390X BIOS modded for the 290X, but it crashes (probably because that's an 8GB card? I did match the memory type, Samsung)... I've attached my BIOSes: bluebios is the UEFI + legacy hybrid, and bios is the legacy-only BIOS (the two are switched with the BIOS button on the Sapphire 290X).


----------



## chris89

@0x6A7232 So which BIOS is the Sapphire 290X Tri-X OC (4GB) one? Post again with your untouched BIOS & I'll mod it for you.

Thanks


----------



## Kaltenbrunner

I used to have R9 290 Crossfire, which was around GTX 1060 performance, maybe. Then for a while I had an R9 280X as a holdover card.

I wish I still had one of them. My backup PC only has the iGPU of an i5-2400. Some day soon, for fun, I want to see how my 2000s-era games run on it.


----------



## numanair

I have an XFX R9 380 2GB (two fans). Great card for its age, even with just 2GB of memory. The fan curve on the stock card is super annoying: the fans do not spin at all until 80% or 83% PWM. After some troubleshooting, I found that the fans themselves, not the fan controller on the card, are causing this. They think they are smarter than the fan controller and basically ignore the PWM signal.
The solution is to drive them in voltage mode (3-pin) from a different source than the card. I made a simple cable to run them off the motherboard and set the header to voltage mode instead of auto. Then I used some software (Fan Control) to map the GPU temp to a fan curve. This worked perfectly, and the fans don't even reach 80% under full load. Much quieter, and they ramp up smoothly!


----------



## chris89

If your fan isn't reaching 100% at load, you're not getting 100% of the card's capable performance. Then again, maybe you're not after peak performance and are happy with the lower framerate and quiet operation. I know for one that the VRM on AMD Tonga gets extremely hot under load, just like the others, but more so on Tonga because no limiter is in place, just like AMD Hawaii / Grenada. These cards need a linear curve with a 20-35% minimum PWM, zero-fan mode disabled, and 100% at no more than 84°C core temp; lower is better, to cool the VRM in time to reach max boost clock and hold it continuously.
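A minimal sketch of such a linear curve, with the 35% floor and the 100%-at-84°C ceiling taken from the description above; the temperature at which the floor ends is an assumed placeholder, not a value from the post:

```python
# Linear GPU fan curve: constant floor, then linear ramp to 100% at 84 C.
FLOOR_PWM, FLOOR_TEMP = 35, 40   # FLOOR_TEMP is an assumed placeholder
MAX_PWM, MAX_TEMP = 100, 84      # 100% at no more than 84 C core temp

def fan_pwm(temp_c: float) -> float:
    """Map core temperature (C) to fan duty cycle (%)."""
    if temp_c <= FLOOR_TEMP:
        return FLOOR_PWM          # never below the floor (no zero-fan mode)
    if temp_c >= MAX_TEMP:
        return MAX_PWM
    frac = (temp_c - FLOOR_TEMP) / (MAX_TEMP - FLOOR_TEMP)
    return FLOOR_PWM + frac * (MAX_PWM - FLOOR_PWM)

print(fan_pwm(30))   # 35
print(fan_pwm(84))   # 100
```

Tools like Fan Control or MSI Afterburner implement the same mapping with user-drawn breakpoints.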


----------



## 0x6A7232

chris89 said:


> @0x6A7232 So which BIOS is Sapphire 290X Tri-X OC (4GB)? Post again with your untouched BIOS & I'll mod it for you.
> 
> Thanks


Nuts. I didn't see this, as I temporarily borked my card (I tried to clean it but got alcohol down where it couldn't evaporate, so I had to pull the heatsink off, at which point the old thermal pads and paste turned out to be dried out, cracked, and useless, and I had to order new pads, deep clean, dry, and reassemble).
Those are both unmodded BIOSes. One is simply the UEFI+legacy dual-mode BIOS (bluerom, as the dual-BIOS switch lights up blue when it is selected) and the other is the legacy-ONLY BIOS.

Also... I think I've got a separate problem (which had been occurring for a long time before I cleaned it, so it's not that I messed something up doing that). Others have reported that their card doesn't hold a steady 1.000 V and so black-screens under idle conditions. To test this, I was going to bump the idle voltage up a hair. What makes me think my card has this issue? Well, it black-screens, but not under heavy load (gaming or even a stress test, even with an OC applied!). However, it black-screens randomly at idle, e.g. when logging in to Windows, forcing me to hard reset.

I've already cleaned & reapplied thermal paste & thermal pads, it does not fix the problem.

HOWEVER, under Linux, when it black screens, I can just move the mouse and instantly everything comes back. 

WHEN does it happen? Well, randomly BUT ALSO, often when FIRST starting a new video on eg Netflix or YouTube -- after it comes back when I move the mouse, it will play THE ENTIRE VIDEO just fine, no problems. Load next video? 90% chance of black screen, again I move the mouse and it comes back.

WHY is Windows so touchy? Well, I had to use a modded driver since AMD stopped updating, and I think that may have broken the automatic driver recovery after a crash. Regardless, it can sometimes go days without an issue, but when it's in a funk, black screens can happen 100% of the time before everything gets loaded. This is a problem, since even if I try to apply a voltage fix in MSI Afterburner, 1) I think that only works under load? and 2) it's likely to crash before Afterburner can even load.

SO.. I'm trying for a idle voltage increase, just enough to hold steady. PROBLEM - Linux utils to show voltage seem to not work (CPU-X) and only show temps. Windows would be nice but right now it's in a funk and black screens after logging in.

I'm wondering if there's a slight voltage tune going on in the background: since I re-did the thermal paste, the card can use less voltage, but this leaves the idle voltage too low when sudden demand (starting a video) is applied, and it black-screens. Or maybe that's my imagination.

Regardless, I'm running Hawaii Bios Reader under WINE in Linux (it lets you run Windows apps on Linux, pretty handy) and trying to figure out how to safely bump my idle voltage. It's not the clearest of apps; I'm guessing that under "Limit Tables" I have to bump the DPM 0 column in all 4 headings to, say, 1050 for 1.05 V?? And what should I set it to... I have no idea what the idle voltage ACTUALLY is on the card, as I can't get Windows to load long enough to view it at this point. Hmm. I think I have a safe-mode modded Windows PE build I could boot, but the problem with those is that a lot of the time the specialized drivers etc. don't have the access required to display voltage and so on. Well, can't hurt to try... Imma give this a shot.

tl;dr I'm also trying to bump the idle voltage slightly to hopefully fix my black screen issues, which are not temp related as they happen at idle, ESPECIALLY when first starting video playback, in Linux & Windows but worse on Windows, probably because of desktop GPU acceleration & Windows driver restart not working on my build. Any help appreciated! I'll try to watch this space as well, now that my computer is working again (seriously, at this point, with Linux, it's basically the same as if the screen goes to sleep when I start watching videos -- move the mouse & I'm good until the next video).


----------



## chris89

@0x6A7232 Could you please post your BIOS of choice, either the UEFI switch position or the legacy switch position? Please send the untouched original BIOS, buddy.

Check out the picture of my 290X. I added thermal pads; check out my idle VRM and core temps, less than 30°C. You probably didn't paste the core correctly. Did you add a dot in the center and just put the heatsink back on, or did you spread the paste across the core manually with a finger so it covers the entire chip edge to edge? That's important to do on all chips, CPUs & GPUs alike.

*Notice how I rolled the thermal pads into tiny balls and put one on each VRM specifically. My memory pads weren't ideal this time; I re-did them on another occasion, though. Notice how I spread the paste (ARCTIC SILVER CERAMIQUE 2) on the core EDGE-TO-EDGE, as that alone helps prevent black screens. A black screen is an overheating-ASIC condition that needs more fan speed to cool the card. You need me to mod your BIOS to fix the issue, and I recommend sticking with the LEGACY driver on AMD's website instead of the NimeZ modded driver, unless NimeZ has an issue-free full WHQL release for your card, because that LEGACY driver on AMD's website has minimal issues even to this day.*


----------



## 0x6A7232

(Caps are for emphasis, I'm not trying to yell or anything here).

Hmm, I know overheating can cause black screens, but in my case that's not it. I can go from cold boot to black screen in 25 seconds. If I manage to get the thing booted, I can run an MSI Kombustor stress test (!!!) for HOURS. Linux will run with no problems for DAYS, except for the black screen ONLY on playback start, which is cleared by moving my mouse. The fans are not even ramping in this case.
Again, I had this problem BEFORE re-pasting. As for the thermal paste application method, I used the line method; you can see a graph with the results of different methods HERE for why I chose that:







For the thermal pads, I used a sheet for the VRMs, so they are 100% covered, and individually cut squares for the VRAM.
Again, though, this isn't a temp issue, as the card will NOT black-screen --->>under load!!<<---.

As for the VBIOS? That's what was on the card when I bought it. I suppose it's possible someone modded both already?? They match the MD5 of these posted BIOSes, which aren't verified, but they are listed as being for my specific card (Sapphire 290X 4GB):
bluebios / UEFI: Sapphire R9 290X VBIOS
bios / non-UEFI only: Sapphire R9 290X VBIOS

I guess there are verified Tri-X OC BIOSes here; should I flash one of those? Perhaps someone before me tinkered with it and screwed something up. Here's the list of BIOSes; note, however, that some are for different (non-Tri-X OC) variants:








TechPowerUp - extensive repository of graphics card BIOS image files, categorized by GPU vendor, type, and board partner variant: www.techpowerup.com





For an example of what I mean by black screen at idle load, please see these posts:


AMD R9 290/290x Issues Thread: https://www.reddit.com/r/AdvancedMicroDevices/comments/3ck5ha

Page 2 - Explain how this works (R9 290 Black Screen Fix): forums.tomshardware.com


----------



## 0x6A7232

Come to think of it, if my BIOS isn't stock, I should try that first. Is there any way to verify whether the BIOS I'm using actually comes from Sapphire?


----------



## dagget3450

Black screen was always an issue with multiple causes. Given the age of these GPUs, and the toll mining has taken, there is a high chance your card has a modded BIOS that works great for miners but causes black screens for you.

I would definitely look into getting and flashing a proper BIOS to eliminate that.

Also, be advised that overclocking can cause black screens too. This can happen due to an unstable OC, or possibly a voltage issue with the video ports.

Many moons ago I recall pops and black screens from OC over the DP output, and I think HDMI can have issues too. I believe only DVI was the safe port for extreme/high OCs.

Not sure if yours has it, but check for a BIOS switch and try the other BIOS if possible.

Good luck and keep us posted!


----------



## 0x6A7232

I think the nutters who modded the BIOS (if they did) modded both switch positions. I mean... now that I'm thinking about it, my card's spec is 1040 MHz core / 1300 MHz mem stock, and I'm actually getting 1020 MHz core and 1350 MHz mem at stock. That's with both BIOS positions. I always shrugged it off, but maybe I shouldn't have...


----------



## 0x6A7232

So I tried flashing what I could find for the Tri-X OC (4GB)... it looks like mine is a newer variant or something, as NONE of the BIOS files I could find worked. I flashed back to what I'm using now and asked Sapphire for the stock BIOS; hopefully they just fork it over instead of nattering about it being out of warranty. Seriously, why aren't those files on the website?? Anyways.


----------



## znalq

Hello, it seems the latest driver version available is 21.x. I am wondering if it would be possible to update to 22.x, because I would like to have Vulkan 1.3. Thanks.


----------



## Bun-ny

znalq said:


> Hello, it seems the latest version available of the driver is 21. I am wondering if it would be possible to update to 22, because I would like to have Vulkan 1.3. Thanks.



Hi znalq,

See my post here: "Good news for us with older cards" - www.overclock.net

You need to uninstall the old driver first.

Steve.


----------



## znalq

Bun-ny said:


> Hi znalq,
> 
> See my post here: "Good news for us with older cards" - www.overclock.net
> 
> You need to uninstall the old driver first.
> 
> Steve.


That seems to install Adrenalin 22.6.1, but the driver stays at version 21.


----------

